The following script prints the directory size of all sub-directories of the specified directory. If the argument is omitted, the script works in the current directory. The output is sorted by directory size, from biggest to smallest. It also tries to benefit (if possible) from caching the calls of a recursive function, and uses the fast `os.scandir` method (new in version 3.5).

The code (the `humanized_size` body follows the recipe 578019 approach; the second function, named `get_dir_sizes` here for illustration, returns the apparent size alongside real disk usage):

```python
from __future__ import print_function

import functools
import os
import sys

my_cache_decorator = functools.lru_cache(maxsize=4096)

start_dir = os.path.normpath(os.path.abspath(sys.argv[1])) if len(sys.argv) > 1 else '.'

def humanized_size(num, suffix='B', si=False):
    # recipe 578019: show a byte count in a human-friendly format
    divisor = 1000.0 if si else 1024.0
    for unit in ['', 'K', 'M', 'G', 'T', 'P', 'E', 'Z']:
        if abs(num) < divisor:
            return '%.1f%s%s' % (num, unit, suffix)
        num /= divisor
    return '%.1f%s%s' % (num, 'Y', suffix)

@my_cache_decorator
def get_dir_size(start_path='.'):
    total = 0
    # using fast 'os.scandir' method (new in version 3.5)
    for entry in os.scandir(start_path):
        if entry.is_dir(follow_symlinks=False):
            total += get_dir_size(entry.path)
        elif entry.is_file(follow_symlinks=False):
            total += entry.stat(follow_symlinks=False).st_size
    return total

def get_dir_sizes(start_path='.'):
    # Apparent size sums st_size; real disk usage sums st_blocks * 512.
    apparent_total_bytes = 0
    total_bytes = 0
    seen_inodes = set()
    for dirpath, dirnames, filenames in os.walk(start_path):
        apparent_total_bytes += os.lstat(dirpath).st_size
        total_bytes += os.lstat(dirpath).st_blocks * 512
        for f in filenames:
            fp = os.path.join(dirpath, f)
            st = os.lstat(fp)
            if st.st_ino in seen_inodes:
                continue  # skip hardlinks which were already counted
            seen_inodes.add(st.st_ino)
            apparent_total_bytes += st.st_size
            total_bytes += st.st_blocks * 512
        for d in dirnames:
            dp = os.path.join(dirpath, d)
            if os.path.islink(dp):
                apparent_total_bytes += os.lstat(dp).st_size
    return (apparent_total_bytes, total_bytes)

if __name__ == '__main__':
    dirs = [(get_dir_size(os.path.join(start_dir, d.name)), d.name)
            for d in os.scandir(start_dir) if d.is_dir(follow_symlinks=False)]
    for size, name in sorted(dirs, reverse=True):
        print(humanized_size(size), name)
```

PS: I've used recipe 578019 for showing directory size in a human-friendly format.
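The caching benefit is observable directly via `cache_info()`: `functools.lru_cache` memoizes the result per directory path, so a repeated query is a cache hit rather than a rescan. A minimal self-contained check on a throwaway directory (the function and file names here are illustrative, not from the original script):

```python
import functools
import os
import tempfile

@functools.lru_cache(maxsize=4096)
def cached_dir_size(path):
    # Recursive os.scandir walk; the result for each path is memoized,
    # so a repeated query is served from the cache instead of rescanning.
    total = 0
    for entry in os.scandir(path):
        if entry.is_dir(follow_symlinks=False):
            total += cached_dir_size(entry.path)
        elif entry.is_file(follow_symlinks=False):
            total += entry.stat(follow_symlinks=False).st_size
    return total

root = tempfile.mkdtemp()
with open(os.path.join(root, 'f.bin'), 'wb') as fh:
    fh.write(b'z' * 42)

first = cached_dir_size(root)
second = cached_dir_size(root)  # cache hit: no second filesystem scan
print(first, cached_dir_size.cache_info().hits)  # prints: 42 1
```

The trade-off is that cached results go stale if the tree changes, so this suits one-shot scripts rather than long-running processes.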
`os.path.getsize` gives the size in bytes. Using `os.path.getsize` is clearer than using the `os.stat().st_size` method (thanks to ghostdog74 for pointing this out!). `os.stat`'s `st_size` also gives the size in bytes, and the stat result can be used to get other file-related information as well. With `os.scandir`, summing the sizes of the files in the current directory is a one-liner:

```python
nbytes = sum(d.stat().st_size for d in os.scandir('.') if d.is_file())
```
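A quick way to see that the two calls agree is to write a small file and read its size both ways (the temp-file setup here is just for the demonstration):

```python
import os
import tempfile

# Write a small file, then read its size both ways.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'hello world')
    path = tmp.name

size_a = os.path.getsize(path)   # the clearer one-call form
size_b = os.stat(path).st_size   # same number from the full stat result
print(size_a, size_b)            # prints: 11 11
os.remove(path)
```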
This walks all sub-directories summing file sizes:

```python
import os

def get_size(start_path='.'):
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            total_size += os.path.getsize(fp)
    return total_size
```

And a one-liner for fun using `os.listdir` (does not include sub-directories):

```python
import os

sum(os.path.getsize(f) for f in os.listdir('.') if os.path.isfile(f))
```
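To sanity-check the walk-based total, you can build a tiny tree with known byte counts and compare; the `os.walk` + `os.path.getsize` pattern is the same as above, while the file names and sizes below are arbitrary, chosen only for the test:

```python
import os
import tempfile

def walk_size(start_path='.'):
    # os.walk + os.path.getsize over every file in every sub-directory
    total = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            total += os.path.getsize(os.path.join(dirpath, f))
    return total

# Build a tiny throwaway tree: 100 bytes at the top, 250 in a sub-directory.
root = tempfile.mkdtemp()
os.mkdir(os.path.join(root, 'sub'))
with open(os.path.join(root, 'a.bin'), 'wb') as fh:
    fh.write(b'x' * 100)
with open(os.path.join(root, 'sub', 'b.bin'), 'wb') as fh:
    fh.write(b'y' * 250)

total = walk_size(root)
print(total)  # prints: 350
```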