I have a Raspberry Pi acting as a sensor that I send out to various customers. The sensor records approximately 1 GB of data every 2-3 days, so I would like a way to remove old data.
The only way I have found involves using crontab to delete the files, which requires an internet connection so the Pi knows the correct time (it has no battery-backed real-time clock). The command I have found is to run crontab -e and add the line:
0 15 * * * find PATH -mindepth 1 -mtime +30 -delete
which would delete files older than 30 days at 3 pm every day. The problem is that these sensors won't have internet access, so the clock resets whenever they are restarted. Is there an alternative that doesn't require internet?
To give a little more information: the sensors will typically be in place for about 10 days before being turned off, then turned back on a few days later. The reason I want the files deleted after some time is in case there is a problem: if they were deleted 30 days after creation, that would give the customer time to ship the unit back to me so I could take a look at it.
I already have the folders sequentially numbered, so that route may be easiest for me. The folders are labelled 1, 2, 3, etc., with each folder holding the data from one startup/shutdown cycle, and every file inside the folders is a .csv file. Is there a way I could write a cron command so that when the disk is close to running out of space, it deletes, say, folders 1 through X to clear 10 GB of space? If it's easier, I could also just tell it to delete folders 1-5 when it's running out of space.
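Something like this sketch is what I have in mind. It's untested on the real unit, and it assumes the data lives under a path like /home/pi/data with folders named 1, 2, 3, ... where the lowest number is the oldest:

```shell
#!/bin/sh
# Sketch only: delete the lowest-numbered (oldest) data folders until
# the filesystem has at least a target amount of free space.
# Assumptions (mine): folder names are plain integers, lowest = oldest.

cleanup_oldest() {
    data_dir="$1"     # e.g. /home/pi/data (adjust to the real path)
    min_free_kb="$2"  # target free space in 1K blocks; 10 GB = 10485760

    # Walk the numbered folders oldest-first, deleting until the
    # free-space target is met.
    for d in $(ls "$data_dir" | grep -E '^[0-9]+$' | sort -n); do
        # Available 1K blocks on the filesystem holding data_dir
        avail=$(df -Pk "$data_dir" | awk 'NR==2 {print $4}')
        [ "$avail" -ge "$min_free_kb" ] && break
        rm -rf "$data_dir/$d"
    done
}
```

If that script were saved as /home/pi/cleanup.sh, a crontab entry like "0 * * * * /home/pi/cleanup.sh" would run it hourly; as I understand it, interval-style cron entries still fire on schedule even when the absolute wall-clock time is wrong, which would sidestep the no-internet problem.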
I have also looked into savelog a little, and this might be what I am looking for, although I am not sure how to use it. I would want to use it on whole folders of data, since that would be easier. If it could keep only the last 5 folders at a time, that would work, since I expect each folder to be ~2-3 GB.
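From what I can tell, savelog rotates individual files rather than directories, but the "keep only the last 5" behaviour could be approximated for my numbered folders with a short script. A sketch (assuming my folder layout, and GNU head's negative-count option, which Raspberry Pi OS has):

```shell
#!/bin/sh
# Sketch: keep only the N newest numbered folders under DIR, delete the rest.
# keep_newest 5 /home/pi/data   (path is an assumption; adjust as needed)

keep_newest() {
    n="$1"
    dir="$2"
    # Sort folder names numerically (oldest first), then print all but
    # the last N lines; "head -n -N" is a GNU coreutils extension.
    ls "$dir" | grep -E '^[0-9]+$' | sort -n | head -n -"$n" |
        while read -r d; do
            rm -rf "$dir/$d"
        done
}
```

This ignores folder size, so with ~2-3 GB per folder, keeping 5 should stay around 10-15 GB of data on disk.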