Listing the Largest N Files or Folders Recursively

Listing the largest N files or folders recursively is handy. Let's frame a use case. Your Ubuntu server is tanking. You're running Jenkins or something else, and your job logs have started piling up. Your UI doesn't work anymore, and the only thing you can do is SSH into your instance. What do you do?

This is more specific to non-scalable infrastructure. Because if your shit could scale, you wouldn't be here manually accessing your instance and killing files. But you are.

We're not going to cover how to SSH into your server. If you're SSHing into an EC2 instance, this post might help if you run into problems.

The Solution


  1. Back up your server!
  2. SSH into your server.
  3. Move into the directory you suspect (or go to the root) and execute the following:

du -a "$PWD" | sort -n -r | head -n 10

  4. Make sure the files the command outputs aren't necessary for typical execution. Personally, I look for log files that end in .log or that live in a log folder.
  5. Now delete stuff by using the following!

rm -rf pathToFolderOrFile
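The steps above can be sketched end to end against a scratch directory, so you can see what the pipeline returns before trying it on a real box. This is a minimal, hedged version: the Jenkins-style paths and file sizes are made up for illustration, and the human-readable `sort -h` flag assumes GNU coreutils.

```shell
# Build a scratch directory with dummy files of known sizes
# (paths and names here are invented for the example)
workdir=$(mktemp -d)
mkdir -p "$workdir/jobs/build/logs"
head -c 1048576 /dev/zero > "$workdir/jobs/build/logs/old-build.log"   # 1 MiB "log"
head -c 4096    /dev/zero > "$workdir/jobs/config.xml"                 # 4 KiB "config"

# Step 3: largest entries first, human-readable units (GNU sort -h)
du -ah "$workdir" | sort -rh | head -n 10

# Step 4: narrow the list to *.log files only, so configs stay off the kill list
find "$workdir" -type f -name '*.log' -exec du -h {} +
```

The `find` pass is the safety net: it surfaces only the `.log` files, so anything that shows up in the first listing but not the second deserves a closer look before you delete it.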

Check your server! You should be good now. If not, work your way down the list of files the du command crapped out. Remember to verify you can kill them without damaging your infra! Happy coding! Want to check out Airflow? Dig into this post.
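One caveat before leaning on rm: if Jenkins (or anything else) still has a log file open, deleting it won't free the disk space until that process releases the file handle. Truncating in place sidesteps that. A small sketch, using a temp file as a stand-in for a live log:

```shell
# Fake a 1 MiB log file that some process might be holding open
logfile=$(mktemp)
head -c 1048576 /dev/zero > "$logfile"

# Truncate in place: the inode survives, so a writer keeps appending to the
# same (now empty) file, and the space is reclaimed immediately
: > "$logfile"

wc -c < "$logfile"   # prints 0
```

Same effect, none of the "I deleted the log but df still says full" confusion.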
