I have not tried much of anything yet. I just got a cheap laptop with a BD drive, which came with Windows and VLC. I popped in a Blu-ray disc from the library and it could not handle it… something about not having an AACS decoder or something like that. I haven’t spent any time on it yet, but ultimately, in principle, I would install Debian and try to liberate the drive to read BDs.
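For reference, the usual route on Debian seems to be libaacs plus a key database; a rough sketch, assuming you can source a KEYDB.cfg yourself (libaacs does not ship one):

```
sudo apt install libbluray2 libaacs0 libbdplus0
mkdir -p ~/.config/aacs
# libaacs looks for decryption keys here; KEYDB.cfg must be obtained separately
cp KEYDB.cfg ~/.config/aacs/
```

After that, VLC should pick the keys up via libbluray, at least for discs covered by the key database.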
thanks!
Though I should mention my original motivation with makemkv was to rip Blu-ray discs, which involves complications beyond DVD. But the DVD guide will still be quite useful.
Fun suggestion… could be useful to have as a side hack if congestion becomes an issue but I doubt it would come to that. They have what seems to be a high-end switch with 20 or so ports and internal fans.
The event is ~2-3 hours. If someone needs the full Debian (80 GB!), I think over USB 2 it would not transfer in that timeframe. USB 2 sticks may be rare, but at this event there are some people with old laptops that have no USB 3 sockets. A lot of people plug into ethernet. And the switch looks somewhat more serious than a 4-port SOHO box… it has 20+ ports with fans, so I don’t get the impression ethernet congestion would be an issue.
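Back-of-envelope, assuming typical sustained USB 2 stick write speeds of roughly 10-20 MB/s (the 480 Mbit/s spec figure is never reached in practice):

```
size_gb = 80                      # full Debian image set
for mbps in (10, 20, 35):         # assumed sustained write speeds in MB/s
    hours = size_gb * 1000 / mbps / 3600
    print(f"{mbps} MB/s -> {hours:.1f} h")
# 10 MB/s -> 2.2 h, 20 MB/s -> 1.1 h, 35 MB/s -> 0.6 h
```

So at realistic stick speeds the full set is borderline-to-impossible within the event window, which is the concern.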
I think they could do the job. I’ve never admin’d an NFS server, so I’m figuring there’s a notable learning curve there. Samba, well, maybe. I’ve used it before. I’m leaning toward ProFTPD at the moment, but if that gives me any friction I guess I’ll consider Samba. Perhaps I’ll go into overachiever mode and have both Samba and ProFTPD pointing to the same directory.
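Pointing both at one directory should just be two config stanzas. A minimal read-only sketch, assuming the files live at /srv/share (share name and path are mine; anonymous Samba access may additionally need `map to guest = Bad User` in [global], and the ProFTPD anonymous block assumes an `ftp` system user exists):

```
# /etc/samba/smb.conf (excerpt)
[share]
   path = /srv/share
   read only = yes
   guest ok = yes

# /etc/proftpd/proftpd.conf (excerpt)
<Anonymous /srv/share>
   User        ftp
   Group       nogroup
   UserAlias   anonymous ftp
   <Limit WRITE>
     DenyAll
   </Limit>
</Anonymous>
```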
Two possible issues with that w.r.t. my use case:
Nonetheless, I appreciate the suggestion. It could be handy in some situations.
Oh, sorry. Indeed. I answered from the notifications page without context. Glad to know FileZilla will work for that!
I use FileZilla, but AFAIK it’s just a client, not a server.
Indeed, I noticed openssh-sftp-server was automatically installed with Debian 12. Guess I’ll look into that first. Might be interesting if people could choose between FTP or mounting with SSHFS.
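From what I can tell, that package just provides the SFTP subsystem for sshd, so guests could either use an SFTP client (FileZilla works) or mount it. A rough sketch, assuming a dedicated “guests” group and /srv/share (both names are mine; the chroot directory must be root-owned and not writable by others):

```
# /etc/ssh/sshd_config (excerpt): chroot guests into the share, SFTP only
Match Group guests
    ChrootDirectory /srv/share
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
```

```
# client side: mount over SSHFS instead of using an SFTP client
sshfs guest@server:/ /mnt/share
fusermount -u /mnt/share   # unmount when done
```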
(edit) found this guide
Thanks for mentioning it. It encouraged me to look closer at it and I believe it’s well suited for my needs.
Well, it’s still the same problem. I mean, it’s likely piracy to copy the public library’s disc to begin with, even if just for a moment. From there, if I want to share it with others, I still need to be able to exit the library with the data before they close. So it’d still be a matter of transcoding as a distinctly separate step.
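I.e. the fast lossless rip happens at the library and the slow compression happens wherever/whenever. Something like this, assuming MakeMKV’s CLI and disc 0 (the output paths are mine; `title_t00.mkv` follows MakeMKV’s usual naming):

```
# step 1 (at the library): straight remux, no transcoding; I/O bound, minutes per title
makemkvcon mkv disc:0 all /media/usb/rips

# step 2 (later, offline): the slow CPU-bound compression
HandBrakeCLI -i /media/usb/rips/title_t00.mkv -o movie.mkv -e x265 -q 22
```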
What’s the point of spending a day compressing something that I only need to watch once?
If I pop into the public library and start a ripping process using HandBrake, the library will close for the day before the job completes for a single title. I could check out the media, but there are trade-offs:
Wow, thanks for the research and effort! I will be taking your approach for sure.
I’ll have a brief look, but I doubt ffmpeg would know about DVD CSS encryption.
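(As far as I know, ffmpeg indeed has no CSS support itself; the usual workaround is to let libdvdcss-aware tooling decrypt first. A sketch, assuming dvdbackup built against libdvdread with libdvdcss, and my own output path:)

```
# mirror the disc, with decryption handled by libdvdread/libdvdcss
dvdbackup -M -i /dev/sr0 -o ~/rip

# then ffmpeg can work on the plain VOBs
ffmpeg -i ~/rip/*/VIDEO_TS/VTS_01_1.VOB -c:v libx264 -crf 22 -c:a aac out.mkv
```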
The design approach would also help serve people in impoverished areas, where you might imagine they have no internet access at home but can still participate through some community access point.
Indeed, I really meant tools that have some cloud interaction but give us asynchronous autonomy from the cloud.
Of course there are also scenarios that normally use the cloud but can be made fully offline. E.g. Argos Translate. If you use a web-based translator like Google Translate or Yandex Translate, you are not only exposed to the dependency of having a WAN when you need to translate, but you also give up privacy. Argos Translate empowers you to translate text without cloud dependency while also getting a sensible level of privacy. Or in the case of Google Maps vs. OsmAnd, you have the privacy of not sharing your location and also the robustness of not being dependent on a functioning uplink.
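For example, after downloading a language pair once, Argos Translate runs entirely locally. A minimal sketch with its Python API (en→es chosen arbitrarily):

```
import argostranslate.package
import argostranslate.translate

# one-time, online: fetch and install the en->es model
argostranslate.package.update_package_index()
pkg = next(p for p in argostranslate.package.get_available_packages()
           if p.from_code == "en" and p.to_code == "es")
argostranslate.package.install_from_path(pkg.download())

# from here on, fully offline and private
print(argostranslate.translate.translate("Where is the library?", "en", "es"))
```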
Both scenarios (fully offline apps and periodic syncing of msgs) are about power and control. If all your content is sitting on someone else’s server, you are disempowered: they can boot you at any moment, alter your content, or pull the plug on their server without warning (this has happened to me many times). They can limit your searching capability too. A natural artifact of offline consumption is that you have your own copy of the data.
if it ain’t broke don’t fix it
It’s broke from where I’m sitting. Many times, Mastodon and Lemmy servers went offline out of the blue and my msgs were mostly gone, apart from what got cached on other hosts, which is tedious and non-trivial to track down. It’s technically broken security in the form of data loss / loss of availability.
I have nothing for these use cases, off the top of my head:
You just wrote your response using an app that’s dysfunctional offline. You had to be online.
Perhaps before your time, Usenet was the way to do forums. Gnus (an emacs mode) was good for this. Gnus would fetch everything to my specification and store a local copy. It served as an offline newsreader. I could search my local archive of messages and the search was not constrained to a specific tool (e.g. grep would work, but gnus was better). I could configure it to grab all headers for new msgs in a particular newsgroup, or full payloads. Then when disconnected it was possible to read posts. I never tested replies because I had other complexities in play (mixmaster), but it was likely possible to compose a reply and sync/upload it later when online. The UX was similar to how mailing lists work.
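That pull-and-store pattern is trivial to script against NNTP even today. A toy sketch with Python’s nntplib (stdlib through 3.12, removed in 3.13; the server name is a placeholder):

```
from nntplib import NNTP

with NNTP("news.example.org") as srv:
    resp, count, first, last, name = srv.group("comp.lang.python")
    # headers only for the newest 100 articles; bodies can be fetched the same way
    resp, overviews = srv.over((max(first, last - 100), last))
    for num, over in overviews:
        print(num, over["subject"])   # store these locally and grep offline
```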
None of that is possible with Lemmy. It’s theoretically possible given the API, but the tools don’t exist for that.
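To illustrate, the pieces are there: a few lines against the documented v3 endpoint would pull posts into local JSON files that grep can search offline (the instance and community below are placeholders):

```
import json, pathlib, urllib.request

url = ("https://lemmy.example.org/api/v3/post/list"
       "?community_name=selfhosted&limit=50")
with urllib.request.urlopen(url) as resp:
    posts = json.load(resp)["posts"]

archive = pathlib.Path("lemmy-archive")
archive.mkdir(exist_ok=True)
for view in posts:
    # one file per post; plain files mean grep/jq work with no WAN
    path = archive / f"{view['post']['id']}.json"
    path.write_text(json.dumps(view, indent=2))
```

But nobody has wrapped that into an offline reader the way Gnus wrapped NNTP.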
Offline workflows were designed to accommodate WAN access interruptions, but an unforeseen benefit was control. Having your own copy naturally gives you a bit of control and censorship resilience.
(update) It makes no sense that I have to be online to read something I previously wrote. I sometimes post some useful bit of information, but there are only so many notes I can keep organised. Then I later need to recall it (e.g. what was that legal statute I cited for situation X?). If I wrote it in a Lemmy post, I have to be online to find it again. The search tool might be too limited to search the way I need to… and that assumes the host I wrote it on is even still online.
Worth noting that some countries adjust to the reduced demand for postal service by reducing the number of delivery days. Belgium did this, where they only deliver standard class letters a couple times per week. Priority class gets better treatment.
In the US, Trump is trying to fuck with USPS. He wants to privatize it. Funny thing is, it’s the (Trump supporting) rural areas that would be fucked over the most by privatization. Since he prioritizes his voters above ethical people, he’s struggling with cognitive dissonance.
We need reform and a robust way to interact digitally with the government: pay taxes, send messages, etc.
I think that’s nearly impossible. Some people use the Tor network and govs tend to block it. For me, “robust” means being strong enough to handle Tor traffic, but I don’t think anti-Tor ignorance could ever be flushed out.
Some people (like myself) also use very OLD devices and refuse to contribute e-waste to landfills. That crowd is also hard to cater for. For me, “robust” also means working with the lynx browser, but I don’t think the chase-the-shiny incompetence of only supporting new devices could ever be flushed out.
So I must ultimately disagree, because if the gov were to achieve what they believe is robust, it would be a recipe for ending the analog transactions that everyone excluded from their digital systems relies on. They should strive for robustness, but never call it robust. They should recognise that digital tech always excludes some people, and so analog systems are still needed.
By the way: if your emails frequently land in spam folders, you should check whether your mail server’s IP is on some spam filter list.
That is exactly the problem. My mail server runs on a residential IP – deliberately so. My comment stands: it’s naive to make a sender responsible for email landing in a spam folder when the sender has no control over, or even transparency into, the operation of the recipient’s mail server.
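(For completeness, the listing check itself is easy enough to script. A sketch of the standard reversed-octet DNSBL query against Spamhaus’s zen zone; note Spamhaus may refuse queries arriving via large public resolvers:)

```
import socket

def dnsbl_listed(ip: str, zone: str = "zen.spamhaus.org") -> bool:
    # DNSBL convention: reverse the IPv4 octets and append the list's zone
    query = ".".join(reversed(ip.split("."))) + "." + zone
    try:
        socket.gethostbyname(query)   # any A record back means "listed"
        return True
    except socket.gaierror:
        return False

print(dnsbl_listed("203.0.113.7"))    # RFC 5737 example address
```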
I would not even announce centralized instances like piefed.ca. It’s behind the Cloudflare giant.