Researchers at the University of Chicago and Argonne National Lab have developed a new type of optical memory that stores data by transferring light from rare-earth element atoms embedded in a solid material to nearby quantum defects. They published their study in Physical Review Research.
Study: https://journals.aps.org/prresearch/pdf/10.1103/PhysRevResearch.6.033170
Data density vs IOPS.
Mechanical media are slow.
Well, for backups this still sounds kinda nice.
Tape backups are rather slow as well - at least as far as I know. The professional stuff was always out of my league, money-wise.
If someone has a good alternative, I’m absolutely up for it.
Currently I’m using a local server with just a RAID1 to mirror important files from my workstation, and those (incremental) backups get encrypted and uploaded to a cloud drive.
But for really large amounts of data, this isn’t practical, so I only use this route for business documents, invoices, etc.
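For what it’s worth, that pipeline (incremental archive, encrypt, push to a cloud drive) can be collapsed into one cron-able script. Here’s a minimal sketch in Python, assuming BorgBackup for the archive/encryption side and an rclone remote for the cloud drive; the repo path, source paths and remote name are placeholders, not a recommendation:

```python
#!/usr/bin/env python3
"""One-step encrypted incremental backup plus cloud upload.

A minimal sketch, not a drop-in tool: it assumes a BorgBackup repo
(initialised elsewhere, e.g. with `borg init --encryption=repokey`) and an
rclone remote named "clouddrive". Repo path, source paths and remote name
are placeholders for whatever the real setup uses.
"""
import datetime
import subprocess

REPO = "/srv/backup/borg-repo"                         # hypothetical repo on the RAID1 box
SOURCES = ["/home/me/documents", "/home/me/invoices"]  # hypothetical source dirs
REMOTE = "clouddrive:backups/borg-repo"                # hypothetical rclone remote


def main() -> None:
    stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H-%M")

    # Incremental, deduplicated, encrypted archive (the encryption mode was
    # fixed at `borg init` time).
    subprocess.run(
        ["borg", "create", "--stats", f"{REPO}::docs-{stamp}", *SOURCES],
        check=True,
    )

    # Thin out old archives so the repo doesn't grow without bound.
    subprocess.run(
        ["borg", "prune",
         "--keep-daily", "7", "--keep-weekly", "4", "--keep-monthly", "6",
         REPO],
        check=True,
    )

    # The repo contents are already ciphertext, so pushing it to the cloud
    # drive is just a plain sync.
    subprocess.run(["rclone", "sync", REPO, REMOTE], check=True)


if __name__ == "__main__":
    main()
```

Because Borg encrypts the repository itself, the cloud provider only ever sees ciphertext, which keeps the upload step a dumb sync.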
But for large data like code, I’m currently only doing local mirroring (albeit on multiple devices), so if my office burns down, I’d lose quite a lot. At the moment I’m lucky, because I can push my code changes to a customer’s git mirror, so I should be fine on that front for now.
But still, I dread the day I really need to restore from my cloud backup.
Maybe I should do some dry runs periodically to verify that my restore path works. But just like server stuff, I really don’t like to touch it that much o:-)
I’m currently using Borg (with Vorta) to back up everything locally and distribute it to my server and the cloud.
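A dry run like that is easy to automate against the Borg repo: check the repository, pull one known file out of the newest archive into a temp dir, and compare checksums against the live copy. A minimal sketch, with the repo path and sample file as placeholders:

```python
#!/usr/bin/env python3
"""Periodic restore dry run for a Borg repository.

A minimal sketch, assuming a repokey/keyfile-encrypted repo whose
passphrase is supplied via the BORG_PASSPHRASE environment variable.
The repo path and sample file below are placeholders.
"""
import hashlib
import os
import subprocess
import sys
import tempfile

REPO = "/mnt/backup/borg-repo"           # hypothetical repo path
SAMPLE = "home/me/projects/README.md"    # hypothetical file to spot-check (as stored in the archive)
ORIGINAL = "/" + SAMPLE                  # live copy to compare against


def sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def latest_archive() -> str:
    # `borg list --last 1 --short` prints only the newest archive name.
    out = subprocess.run(
        ["borg", "list", "--last", "1", "--short", REPO],
        check=True, capture_output=True, text=True,
    )
    return out.stdout.strip()


def main() -> int:
    # 1. Verify repository and archive consistency.
    subprocess.run(["borg", "check", REPO], check=True)

    # 2. Extract one known file from the newest archive into a temp dir
    #    and compare it against the live copy.
    archive = latest_archive()
    with tempfile.TemporaryDirectory() as tmp:
        subprocess.run(
            ["borg", "extract", f"{REPO}::{archive}", SAMPLE],
            check=True, cwd=tmp,
        )
        restored = os.path.join(tmp, SAMPLE)
        if sha256(restored) != sha256(ORIGINAL):
            print("restore dry run FAILED: checksum mismatch", file=sys.stderr)
            return 1
    print(f"restore dry run OK (archive {archive})")
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

One caveat: if the sample file changed since the last archive, the checksums will legitimately differ, so pick something stable or compare against a stored expected hash instead.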
If anyone has a better idea, I’d be really grateful…
Doing periodic hard disk backups and swapping them with a partner company (while I keep theirs in my safe) doesn’t really work out in the long term, as I’m often on business trips and our exchanges have become less frequent over time…
It kinda sounds elaborate? Like, how practical is that, especially with regard to becoming a standard?