Hello community,
I'm currently running into an issue where the `du` command takes a long time on a directory containing a large number of files. I'm using it to calculate the total size of the directory.
Total number of files: 956878
Here is the command I've been using:
Code:
root@DEB-NUC11PAH-G6PA2500045B:~# time du --total /var/lib/docker/volumes/data.vol/_data/out/priority.images | tail -n 1 | cut -f1
1710732

real	0m4.255s
user	0m0.671s
sys	0m2.453s
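For context on where the time goes: `du` has to `stat()` every one of the ~956k files, so the cost is dominated by metadata lookups rather than the summing itself. One thing I've experimented with (a rough sketch, not guaranteed to beat `du`, since it does the same per-file `stat()` work, and `-printf` is GNU find only) is summing allocated blocks with `find` and `awk`:

Code:
#!/bin/sh
# Sum the allocated disk blocks of every regular file under a
# directory. find's %b prints the block count in 512-byte units,
# and awk totals them and converts to bytes.
DIR="${1:-.}"   # directory to measure; defaults to the current dir
find "$DIR" -type f -printf '%b\n' \
    | awk '{s += $1} END {printf "%d\n", s * 512}'

Note this counts regular files only (it skips the space used by the directory entries themselves, which `du` includes), so the totals will differ slightly from `du`'s output.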
I would greatly appreciate any advice or suggestions from the community on optimizing the `du` command or alternative approaches to efficiently calculate the total size of a directory with a large number of files.
Best regards,
Samadhan
Statistics: Posted by samadhan — 2024-01-10 10:45 — Replies 0 — Views 725