• 1 Post
  • 246 Comments
Joined 1 year ago
Cake day: July 24th, 2023


  • 30p87@feddit.de to Programmer Humor@programming.dev · blahaj
    edited · 4 months ago

    Because you would need to know the key code for å in every keyboard layout, on every OS, even in a bare terminal with no way to just open an emoji picker, with or without special keys, and with no clipboard. Of course, tab completion or globs may help you, but not in all cases.

    Try to select blåhaj.txt in a dir with blåhaj.txt and blahaj.txt present. Easy: ls bl?haj.txt | grep -v blahaj.txt. Now with blåhaj.txt and bløhaj.txt. Not as easy anymore, but doable with tail -n1 or head -n1. Now do it consistently in a script. So you again need to single out the right string, or single char, and >> it into the script so you have the special char. Then you have a component that does not like certain special chars, so you need to escape them. All because someone decided to use special chars in a file name/identifier. Sticking to [a-zA-Z0-9_.:;,-]* would be so easy.
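
    A rough sketch of that selection in a script (file names and the grep anchor are just for illustration, and it assumes only one non-ASCII variant exists in the dir):

        #!/bin/sh
        # The ? glob matches any single character, so it catches both
        # blahaj.txt and blåhaj.txt; grep -v then drops the ASCII one.
        target=$(ls bl?haj.txt | grep -v '^blahaj\.txt$')

        # The special char now lives in a variable - quote it everywhere,
        # or components that dislike special chars will still choke on it.
        printf 'selected: %s\n' "$target"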

  • 30p87@feddit.de to Memes@lemmy.ml · priorities
    4 months ago

    The local backups are done hourly, and incrementally. They hold 2+ weeks of history, so I can easily roll back package versions even though the normal package cache is cleaned regularly. They also protect against losing individual files to weird behaviour of apps, or to my own mistakes.
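
    Roughly like this, if you wanted to replicate it with rsync --link-dest (the tool, paths and retention numbers here are placeholders, not necessarily what I run):

        #!/bin/sh
        # Hourly snapshot: unchanged files are hard-linked against the
        # previous run, so each snapshot only costs the changed data.
        dest=/backups
        now=$(date +%Y-%m-%d_%H)

        rsync -a --delete \
            --link-dest="$dest/latest" \
            --exclude=/backups --exclude=/proc --exclude=/sys --exclude=/dev \
            / "$dest/$now"
        ln -sfn "$dest/$now" "$dest/latest"

        # Keep ~2 weeks of hourlies (14 days * 24 = 336 snapshots).
        ls -1d "$dest"/????-??-??_?? | head -n -336 | xargs -r rm -rf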

    The backups to my workstation are also done hourly, shifted by 15 minutes per device, and also incremental. They protect against the device itself breaking, ransomware, or some rogue program rm -rf'ing /, all of which would affect the local backups too (as those are mounted in /backups); the local ones are mainly for providing a file history, as I said.
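
    The stagger itself is trivial with cron - something like this on each device (the script name is a placeholder):

        # device 1 (crontab -e)
        0 * * * *  /usr/local/bin/push-backup.sh
        # device 2
        15 * * * * /usr/local/bin/push-backup.sh
        # device 3
        30 * * * * /usr/local/bin/push-backup.sh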

    As most drives are slower than 1 Gbps Ethernet anyway, the local backups are simply more convenient to access and use than the ones on my workstation, but otherwise exactly the same.

    The .tar.xz’d backups are the actual backups, considering they are not easily accessible, need to be unpacked first, and are stored externally.
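
    Packing one up can be as simple as this (paths are assumed; -h follows the latest symlink, -J selects xz):

        tar -C /backups -chJf "/mnt/external/backup-$(date +%F).tar.xz" latest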

    I didn’t measure the speeds of a normal SSD vs the RAID - it feels faster, which is not a valid argument, of course. Either way, I want to use it as RAID 0/unraided for more storage space, so I can keep 2 weeks of backups instead of 5 days (since it always keeps space for 2 backups free, I would otherwise have under 200 GB of space instead of 700+).
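
    For reference, assembling the two drives as RAID 0 would look something like this (device names and filesystem assumed):

        mdadm --create /dev/md0 --level=0 --raid-devices=2 /dev/sda /dev/sdb
        mkfs.ext4 /dev/md0
        mount /dev/md0 /backups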

    The latest hourly backup is 1.3 GB in size, but if an application with a single big DB is in use, that can quickly shoot up to dozens of GB - relatively big for a home server hosting primarily my own stuff plus a few things for my father. Synapse's DB alone is 20 GB. On an uneventful day that adds up to about 31 GB (24 hourly backups × 1.3 GB). With several updates done, meaning dozens of new packages in the cache, it could grow to 70+ GB.