Did you know most coyotes are illiterate?

Lemmy.ca flavor

  • 0 Posts
  • 20 Comments
Joined 6 months ago
Cake day: June 7th, 2025





  • Absolutely not trusting this. Uninstalling until we know more, and ideally just getting a different solution entirely. A new account tried to impersonate Catfriend1 directly at first, and then they switched to researchxxl when someone called it out (both are new accounts). Meanwhile the original Catfriend1 has provided no information about this, and we only have the new person’s word as to what’s going on. There are way too many red flags here.



  • I just want to note that Jellyfin MPV Shim exists and can do most of this MPV stuff while still getting the benefits of Jellyfin. You’re putting a lot of emphasis on Plex-specific limitations (which Jellyfin obviously doesn’t have) and transcoding (which is a FEATURE to stopgap an improper media player setup, not a limitation of Jellyfin).

    Pretty much every single “Pro” applies just as well to Jellyfin MPV Shim as to pure MPV, which mainly leaves you with the cons. Also, as another commenter said, I set my Jellyfin up so that my friends and family can use it, and that’s its primary value to me. I feel like a lot of this post should be re-oriented towards MPV as a great media player, not against Jellyfin as a media platform.



  • If you’re only at 10 Mbps upload you’ll have to be very careful about selecting microsized 1080p (~4-9 Mbps) or quality 720p (~6-9 Mbps) encodes, and even then I really wouldn’t bother. If you’re not able to get any more upload speed from your plan then you’ll either have to cancel the idea or host everything from a VPS.

    You can go with a VPS and maybe make people chip in for the storage space, but in that case I’d still lean towards either microsized 1080p encodes or 1080p WEB-DL (which are inherently efficient for the size) if you want to have a big content base without breaking the bank. E.g., these prices look pretty doable if you’ve got people that can chip in: https://hostingby.design/app-hosting/. I’m not very familiar with what VPS options are available or reputable so you’ll have to shop around. Anything with a big hard drive should pretty much work, though I’d probably recommend at least a few gigs of RAM just for Jellyfin (my long-running local instance is taking 1.3GB at the moment; no idea what the usual range might be). Also, you likely won’t be able to transcode video, so you’ll have to be a little careful about what everyone’s playback devices support.

    Edit: Also, if you’re not familiar with microsized encodes, look for groups like BHDStudio, NAN0, hallowed, TAoE, QxR, HONE, PxHD, and such. I know at least BHDStudio, NAN0, and hallowed are well-regarded, but intentionally microsizing for streaming is a relatively new concept, and it’s hard to sleuth out who’s doing a good job and who’s just crushing the hell out of the source and making a mess - especially because a lot of these groups don’t even post source<->encode comparisons (I can guess why). You can find a lot of them on TL, ATH, and HUNO, if those acronyms mean anything to you. Otherwise, a lot of these groups post completely publicly as well, since most private trackers do not allow microsizing.




  • Screen-sharing is part of chat apps nowadays. You’re fully within your rights to stay on IRC and pretend that featureful chat is not the norm these days, but that doesn’t mean society is going to move to IRC with you. Like it or not, encrypted chat apps have to become even more usable for the average person for adoption to go up. This reminds me of how all the old Linux-heads insisted that gaming was for children and that Linux didn’t need gaming. Suddenly now that Linux has gaming, adoption is going way up - what a coincidence.

    Edit: Also for the record, I have a tech-savvy friend who refuses to move to Signal until there are custom emoji reactions, of all things. You can definitely direct your ire towards these people, but the reality is some people have a certain comfort target, and convincing them to settle for less is often harder than improving the app itself.


  • Yeah, h264 is the base codec (also known as AVC), and x264 is the dominant encoder that encodes in that codec. So the base BDs are just plain h264, and remuxes will take that h264 and put it into an mkv container. Colloquially, people tag WEB-DLs and BDs/remuxes as “h264” as they’re raw/untampered-with, and anything that’s been encoded by a person as “x264”. Same thing for h265/HEVC and x265, and same for h266/VVC and x266.


  • As an idea, I use an SSD as the “Default Download Directory” within qBittorrent itself, and then qB automatically moves the finished download to an HDD. I do this because I want the write to be sequential going into my ZFS pool, since ZFS has no defragmentation capabilities.
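In qBittorrent this maps to the incomplete/complete directory options. A sketch of what that looks like in qBittorrent.conf (paths are examples, and exact key names can vary between qBittorrent versions):

```ini
[BitTorrent]
; in-progress downloads land on the SSD first...
Session\TempPathEnabled=true
Session\TempPath=/mnt/ssd/incomplete/
; ...and get moved to the HDD pool once complete
Session\DefaultSavePath=/tank/media/downloads/
```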

    Hardlinks are only important if you want to continue seeding the media in its original form and also have a cleaned-up/renamed copy in your Jellyfin library. If you’re going to continue to seed from the HDD, it doesn’t matter that the initial download is done on the SSD. The *arr stack will make the hardlink only after the download is finished.
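The hardlink behavior can be sketched in a shell session (filenames are made up for the demo; note hardlinks also require both paths to be on the same filesystem, which is why the *arr link happens on the HDD after the move):

```shell
# Demo: a hardlink is a second name for the same inode, so the
# "copy" in the library costs no extra disk space.
set -e
dir=$(mktemp -d) && cd "$dir"
echo "movie data" > Download.Original.mkv     # the seeding copy
mkdir -p library
ln Download.Original.mkv library/Movie.mkv    # hardlink, not a copy
stat -c %i Download.Original.mkv              # same inode number...
stat -c %i library/Movie.mkv                  # ...for both names
rm Download.Original.mkv                      # stop seeding / delete torrent
cat library/Movie.mkv                         # library copy is still intact
```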


  • Yep, fully agree. At least Blu-rays still exist for now. Building a beefy NAS and collecting full Blu-ray discs lets us brute-force the picture quality through sheer bitrate at least. There are a number of other problems to think about as well before we even get to the encoder stage, such as many (most?) 4k movies/TV shows being mastered in 2k (aka 1080p) and then upscaled to 4k. Not to mention a lot of 2k Blu-rays are upscaled from 720p! It just goes on and on. As a whole, we’re barely using the capabilities of true 4k in our current day. Most of this UHD/4k “quality” craze is being driven by HDR, which also has its own share of design/cultural problems. The more you dig into all this stuff the worse it gets. 4k is billed as “the last resolution we’ll ever need”, which IMO is probably true, but they don’t tell you that the 4k discs they’re selling you aren’t really 4k.


  • The nice thing is that Linux is always improving and Windows is always in retrograde. The more users Linux has, the faster it will improve. If the current state of Linux is acceptable enough for you as a user, then it should be possible to get your foot in the door and ride the wave upwards. If not, wait for the wave to reach your comfort level. People always say <CURRENT_YEAR> is the year of the Linux desktop but IMO the real year of the Linux desktop was like 4 or 5 years ago now, and hopefully that captured momentum will keep going until critical mass is achieved (optimistically, I think we’re basically already there).


  • CoyoteFacts@piefed.ca to Linux@lemmy.ml · AOMedia To Release AV2 Video Codec At Year's End · edited 3 months ago

    To be fair, it’s also basically impossible to have extremely high quality AV1 video, which is what a lot of P2P groups strive for. A lot of effort has gone into trying to do so and results weren’t good enough compared to x264, so it’s been ignored. AV1 is great at compression efficiency, but it can’t make fully transparent encodes (i.e., indistinguishable from the source). It might be different with AV2, though again even if it’s possible it may be ignored because of compatibility instead; groups still use DTS-HD MA over the objectively superior FLAC codec for surround sound because of hardware compatibility to this day (for 1.0/2.0 channels they use FLAC, since players usually support that).

    As for HEVC/x265, it too is not as good as x264 at very high quality encoding, so it’s also ignored when possible. Basically the breakdown is that 4k encoding uses x265 in order to store HDR and because the big block efficiency of x265 is good enough to compress further than the source material. x264 wouldn’t be used for 4k encoding even if it could store HDR because its compression efficiency is so bad at higher resolutions that to have any sort of quality encode it would end up bigger than the source material. Many people don’t even bother with 4k x265 encodes and just collect the full disc/remuxes instead, because they dislike x265’s encoder quality and don’t deem the size efficiency worth its picture quality impact (pretty picky people here, and I’m not really in that camp).

    For 1080p, x265 is only used when you want to have HDR in a 1080p package, because again x265’s picture quality can’t match x264, but most people deem HDR a bigger advantage. x264 is still the tool of choice for non-HDR 1080p encodes, and that’s not a culture thing, that’s just a quality thing. When you get down into public P2P or random encoding groups it’s anything goes, and x265 1080p encodes get a lot more common because x265 efficiency is pretty great compared to x264, but the very top-end quality just can’t match x264 in the hands of an experienced encoder, so those encoding groups only use x265 when they have to.

    Edit: All that to say, we can’t entirely blame old-head culture or hardware compatibility for the unpopularity of newer formats. I think the home media collector usecase is actually a complete outlier in terms of what these formats are actually being developed for. WEB-DL content favors HEVC and AV1 because it’s very efficient and displays a “good enough” quality picture for their viewers. Physical Blu-Rays don’t have to worry about HDD space or bandwidth and just pump the bitrate insane on HEVC so that the picture quality looks great. For the record, VVC/x266 is already on the shortlist for being junk for the usecases described above (x266 is too new to fully judge), so I wouldn’t hold my breath for AV2 either. If you’re okay with non-transparency, I’d just stick with HEVC WEB-DLs or try to find good encoding groups that target a more opinionated quality:size ratio (some do actually use AV1!). Rules of thumb for WEB-DL quality are here, though it will always vary on a title-by-title basis.
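To make the x264-vs-x265 split above concrete, here’s a hedged sketch of the two typical invocations (CRF values and presets are plausible placeholders, not any group’s actual settings, and “remux.mkv” is a made-up input name):

```shell
# Non-HDR 1080p: x264 at a low CRF, slow preset, for very high quality
ffmpeg -i remux.mkv -c:v libx264 -crf 16 -preset veryslow -c:a copy out.x264.mkv
# HDR or efficiency-focused: x265 (passing HDR metadata through needs extra flags)
ffmpeg -i remux.mkv -c:v libx265 -crf 18 -preset slow -c:a copy out.x265.mkv
```

Lower CRF means higher quality and bigger files, which is the knob the quality:size trade-off above is really about.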



  • I’m not a security expert by any means, but here are a few things I know as a regular user:

    Always keep your system up-to-date and only download and execute software from the official Arch repos if you can help it. Malware often takes advantage of outdated systems that don’t have the latest security patches, so by staying as up-to-date as possible you’re making yourself a very difficult target. The AUR is a user-based repository and is not inherently trusted/maintained like the official Arch repos, so be careful and always read PKGBUILDs before you use AUR software. Don’t use AUR auto-updaters unless you’re reading the PKGBUILD changes every time. Ideally try not to use the AUR at all if you can help it; official Arch Linux is usually quite stable, but AUR software is often responsible for a lot of the “breakages” people tend to get with Arch.

    If you have to run sketchy software, use a virtual machine for it, as a 0-day VM escape is almost certainly not going to happen with any sort of malware you’d run into. ClamAV or VirusTotal may also help you scan specific files that you’re wary of, but I wouldn’t trust that a file is clean just because it passes an AV check. Also, never run anything as root unless you have a very specific reason, and even then try to use sudo instead of elevating to a full root shell.
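The PKGBUILD-review habit looks roughly like this (“some-package” is a placeholder AUR package name):

```shell
# Fetch the AUR recipe without building anything yet
git clone https://aur.archlinux.org/some-package.git
cd some-package
# Read the whole build script (and any .install hooks) before running it
less PKGBUILD
# Quick scan for fetches from unexpected places
grep -nE 'curl|wget|https?://' PKGBUILD
# Only after you're satisfied with what it does:
makepkg -si
```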

    Don’t open up any network ports on your system unless you absolutely have to, and if you’re opening an SSH port, make sure that it: isn’t the default port number, requires a keyfile for login, root cannot be logged into directly, and authentication attempts are limited to a low number. If you’re opening ports for other services, try to use Docker/Podman containers with minimal access to your system resources and not running in root mode. Also consider using something like CrowdSec or fail2ban for blocking bots crawling ports.
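Those SSH points translate to sshd_config roughly like this (the port number is an arbitrary example):

```
Port 2222                      # non-default port
PasswordAuthentication no      # keyfile-only login
PubkeyAuthentication yes
PermitRootLogin no             # root cannot be logged into directly
MaxAuthTries 3                 # limit authentication attempts
```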

    As far as finding out if you’re infected, I’m not sure if there’s a great way to know unless they immediately encrypt all your stuff and demand crypto. Malware could also come in the form of silent keyloggers (which you’d only find out about after you start getting your accounts hacked) or cryptocurrency miners/botnets (which probably attempt to hide their CPU/GPU usage while you’re actively using your computer). At the very least, you’re not likely to be hit by a sophisticated 0-day, so whatever malware you get on your computer probably wants something direct and uncomplicated from you.

    Setting up a backup solution to a NAS running e.g. ZFS can help keep malware from pwning your important data, since a filesystem like ZFS can roll back to a snapshot taken before the infection and recover your files (even if the malware encrypts them directly on the NAS). 2FA’ing your accounts (especially important ones like email) is a good way to prevent keyloggers from being able to replay your username+password into a service and get access. Setting up a resource monitoring daemon can probably help you find out if you’re leaking resources to some kind of crypto miner, though I don’t have specific recommendations as I haven’t done this before.

    In the case of what to do once you’re pwned, IMO the only real solution is to salvage and verify your data, wipe everything down, and reinstall. There’s no guarantee that the malware isn’t continually hiding itself somewhere, so trying to remove it yourself is probably not going to solve anything. If you follow all the above precautions and still get pwned, I’m fairly sure the malware will be news somewhere, and security experts may already be studying the malware’s behavior and giving tips on what to do as a resolution.


  • It’s important to use services with a workflow that works for you; not every popular service is going to be a good fit for everyone. Find your balance between exhaustive categorization and a meaningless pile of data, and make sure you’re getting more out than you’re putting in. If you do decide that an extensive amount of effort is worth it, make sure that the service in question is able to export your data in a data-rich format so that you won’t have to do it all again if you decide to move to a different tool.