Advent of Code is a cool yearly programming challenge.
Each day from 1st to 25th December a new challenge appears.
You can solve it in any programming language you like.
After deepening my Go skills last year, I will do it in Oracle (so PL/SQL and SQL) this year.
I will try to blog about my solutions, but I might not make it every day, or I might give up after a while as the challenges get more difficult and time-consuming.
You can find my solutions for this year in this repository.
I really suggest you check Advent of Code out yourself as it is a lot of fun and you will improve your problem-solving skills.
The challenges are clearly phrased with a simple example you can do in your head.
You then get a much bigger, user-specific generated input and have to solve the challenge with that.
You enter your solution in an input field and get a star for each day you solve.
There is also a second challenge every day where you have to adapt some part of your algorithm to solve it.
Today’s challenge is really interesting.
The elves gave us a (UNIX-like) device that ran out of storage.
From some shell commands, we have to build an index that keeps track of folder and file sizes.
The input contains some commands starting with a $; the other lines are their output.
The history basically traverses a file system and lists directory contents:
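To give an idea of the format, here is a tiny made-up session (not the real puzzle input): the commands are `$ cd <dir>` and `$ ls`, and the `ls` output consists of `dir <name>` and `<size> <name>` lines.

```
$ cd /
$ ls
dir a
14 b.txt
dir d
$ cd a
$ ls
29 f.txt
$ cd ..
$ cd d
$ ls
100 d.log
```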
In the end, we need to provide the total size of all directories with a total size of at most 100000 (elf-bytes?).
As we need to figure out directory sizes, we first need the absolute path of each file to be able to sum the sizes up.
For that, we need to process the directory changes.
I created an apex_t_varchar2 array where I append or remove a directory whenever we go deeper into or back up the file system:
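A minimal sketch of that idea could look like the block below. The table and column names (aoc07_input with line_no/line for the raw puzzle input, aoc07_files for the parsed result) are placeholders, not necessarily the ones from my actual solution.

```sql
declare
  l_path apex_t_varchar2 := apex_t_varchar2();  -- directories between the root and the current dir
  l_dir  varchar2(4000);
begin
  for r in (select line from aoc07_input order by line_no) loop
    if r.line like '$ cd %' then
      case substr(r.line, 6)
        when '/'  then l_path := apex_t_varchar2();         -- jump back to the root
        when '..' then l_path.trim(1);                      -- go one level up
        else apex_string.push(l_path, substr(r.line, 6));   -- go one level deeper
      end case;
    elsif r.line not like '$%' and r.line not like 'dir %' then
      -- ls output of the form "<size> <name>": store the file with its absolute path
      l_dir := '/' || apex_string.join(l_path, '/');        -- '/', '/d', '/a/e', ...
      insert into aoc07_files (dir_path, file_path, file_size)
      values (l_dir,
              rtrim(l_dir, '/') || '/' || substr(r.line, instr(r.line, ' ') + 1),
              to_number(substr(r.line, 1, instr(r.line, ' ') - 1)));
    end if;
  end loop;
end;
/
```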
Looking at the table data, hierarchical queries immediately come to mind.
After some googling and trial and error, I came up with the following query that sums up the dir sizes.
What it does is produce multiple rows for each file: each file is listed under every directory it is contained in, not only under its direct parent.
Look how /d/d.log is both in /d and /:
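A sketch of such an expansion (not necessarily the exact query I ended up with) can use the classic CONNECT BY LEVEL row-generation trick against the assumed aoc07_files table; wrapping it in a view keeps the later queries shorter:

```sql
create or replace view aoc07_dir_files as
select case level
         when 1 then '/'
         else substr(f.dir_path, 1, instr(f.dir_path || '/', '/', 1, level) - 1)
       end         as dir_path,                       -- one ancestor directory per row
       f.file_path,
       f.file_size
  from aoc07_files f
connect by level <= case when f.dir_path = '/' then 1
                         else regexp_count(f.dir_path, '/') + 1
                    end                               -- one level per ancestor directory
       and prior f.file_path = f.file_path            -- stay on the same file ...
       and prior sys_guid() is not null;              -- ... without triggering cycle detection
-- /d/d.log now shows up twice: once with dir_path '/' and once with dir_path '/d'
```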
What we need to do now is sum up the sizes, grouped by directory.
I also joined the result with the source table again to filter out the files and only keep the directories:
With that in place, we only need to sum up the sizes of the dirs that are at most 100000. You can check out the full solution here.
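Based on the assumed aoc07_dir_files view from above, a minimal sketch of that final part-1 query could look like this (in this variant the extra join is not needed, because the view only emits directory paths):

```sql
select sum(dir_size) as part1_answer
  from (select dir_path, sum(file_size) as dir_size
          from aoc07_dir_files
         group by dir_path)
 where dir_size <= 100000;   -- only directories with a total size of at most 100000
```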
In the second challenge, we need to figure out which (single) folder to delete so that we end up with at least 30M elf-bytes of free space.
We know that the drive has a capacity of 70M elf-bytes.
We also want to delete the smallest folder possible.
The answer we need to enter is the exact size of the folder that gets deleted.
Because we have a pretty good setup from part 1, we don't need to change that much. First, we need to know how much space we have to free up to get to 30M elf-bytes:
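Assuming 30M and 70M mean 30000000 and 70000000 elf-bytes, the used space is just the sum of all file sizes (which is the size of /), so a sketch of that calculation could be:

```sql
select 30000000 - (70000000 - sum(file_size)) as space_needed
  from aoc07_files;   -- 70000000 - sum(file_size) is the currently free space
```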
We can then use that value to find the smallest folder whose size is at least that big:
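A sketch that combines both steps, again based on the assumed view and table names:

```sql
with dir_sizes as (
  select dir_path, sum(file_size) as dir_size
    from aoc07_dir_files
   group by dir_path
), needed as (
  select 30000000 - (70000000 - sum(file_size)) as space_needed
    from aoc07_files
)
select min(d.dir_size) as part2_answer    -- smallest directory that frees up enough space
  from dir_sizes d
 cross join needed n
 where d.dir_size >= n.space_needed;
```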