• 0 Posts
  • 34 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • If you’re using the built in speakers on any device, you deserve the bad audio quality lol.

    It’s possible to make good built-in speakers. The MacBook Pros sound great, and even the new iPads sound way better than you’d ever expect from such a thin device. My 13” M4 iPad Pro even has decent bass; it’s ridiculous.

    Is it as good as a standalone amplifier with two tower speakers? No, of course not. But I’m not bringing those along with me either.


  • Lots of movies sold on 4k bluray are upscales/‘remasters’ of the 2k version. Some are re-scans from the original 35mm film; those can be pretty good, depending on the source material. There is a huge variety in the quality of 4k movies.

    If you can get your hands on it, try the LOTR 4k extended editions. If you get them from an ‘alternative source’, make sure you get one with untouched video (e.g. a remux). They are huge, about 100-120GB per movie, but they look amazing. Wonder Woman 1984 also looks really good in 4k HDR, especially the opening scene.

    4k content on streaming services varies a lot in quality, but is generally not as good as 4k bluray. Amazon Prime Video looks quite bad, terrible compression with lots of artifacts. Out of the streaming services, Apple TV+ has the best 4k video quality by far.

    The LOTR 4k bluray is in my opinion one of the best showcases, especially if you compare it to the HD version. The HD bluray looks good, don’t get me wrong, but that’s all it is: just a good movie with nice pictures. In 4k HDR with Dolby Atmos it’s something completely different. It’s like magic, almost impossible to look away from the screen. You end up starting it up just to see what the quality looks like, and before you know it you’ve unintentionally watched the entire extended edition.





  • I have a 4k TV and don’t get it either. Watched the odd video in 4k and the colors are maybe a bit crisper, but that’s about it. I’d have to compare movies side by side to actually spot the difference.

    The point of 4k is that you can have a TV twice as large as your 1080p TV before it without losing sharpness.

    I can definitely tell the difference on my 77” OLED.
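
    The “twice as large without losing sharpness” claim checks out with a quick pixel-density calculation (a rough sketch; the panel sizes here are just illustrative examples, not anyone’s actual TVs):

    ```python
    # Pixel density (PPI) of a 16:9 panel: diagonal resolution in pixels
    # divided by diagonal size in inches. Doubling both the resolution
    # (1080p -> 4K) and the diagonal keeps the density identical.
    import math

    def ppi(diagonal_inches: float, width_px: int, height_px: int) -> float:
        """Pixels per inch along the diagonal of a panel."""
        diagonal_px = math.hypot(width_px, height_px)
        return diagonal_px / diagonal_inches

    ppi_1080p = ppi(38.5, 1920, 1080)   # hypothetical 38.5" 1080p TV
    ppi_4k = ppi(77, 3840, 2160)        # 77" 4K TV, double the diagonal

    # Both come out to about 57 PPI, i.e. the same sharpness:
    print(round(ppi_1080p, 1), round(ppi_4k, 1))  # 57.2 57.2
    ```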




  • I’m an IT person professionally, and I use Fedora as my daily driver.

    Ah, Fedora, that brings back memories. We used to call it RootHat back in the day when it was still RedHat. It was what all the first-time Linux users used before they graduated to Debian or Slackware. They would use root as their day-to-day account, hence the name.

    Haven’t used it in forever. Is it still as big a pile of shit as it was in the ’90s?





  • majority of phone users who are or are not tech savvy mostly care about charging and the fact that they can use just about any USB C cable

    But that’s the problem: you can’t just use any cable. Use a standard 5W cable with a laptop that needs 100W and it will either not charge at all or charge so slowly that a full battery takes most of a day, and if the laptop is powered on and drawing more than 5W it will never finish charging.
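
    A back-of-envelope estimate makes the point (the battery capacity and idle-draw figures below are made-up assumptions, not specs for any real laptop):

    ```python
    # Rough charge-time estimate: battery capacity divided by the net
    # wattage left over after the laptop's own power draw, ignoring
    # conversion losses.
    def charge_hours(battery_wh: float, charger_w: float,
                     idle_draw_w: float = 0.0) -> float:
        """Hours to fill the battery; infinity if input can't outpace draw."""
        net_w = charger_w - idle_draw_w
        if net_w <= 0:
            return float("inf")
        return battery_wh / net_w

    print(charge_hours(100, 100))               # 100W charger: 1.0 hour
    print(charge_hours(100, 5))                 # 5W cable, laptop off: 20.0 hours
    print(charge_hours(100, 5, idle_draw_w=8))  # 5W cable, in use: inf (never)
    ```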


  • im fine with using any cable, be it 30 pin, lighting, usb c, etc. etc. As long as everyone uses the same cable. Keep it simple, convenient, and reduce extra waste.

    But that’s not what we have with USB-C. Now, the situation is even more complicated than it was before. We still have a whole bunch of different cables, but now they all look the same and use the same connector. You can no longer easily tell them apart and there is no easy way to tell from the port on a device what features it supports and what cable it needs.

    If I see a USB-C port on a device it tells me exactly nothing. Is it a USB host or not? Can the port be used to charge the device? At what wattage? How big a charger do I need? What USB data transfer speed does it support? 12Mbit, 480Mbit, 10Gbit? Does the port support Thunderbolt? DisplayPort alt. mode? HDMI? Analog audio? MHL? VirtualLink? What cable do I need? A 5W, 10W, 30W, 60W, or 100W one?

    A 40 Gbit 100W Thunderbolt 4 cable looks exactly the same as a 5W 480Mbit USB 2.0 cable. A cable that can carry a displayport signal looks exactly the same as one that can’t.
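
    A toy model of the problem (the wattage and speed figures follow the comment above; the cable names and the requirements in the example are made up for illustration):

    ```python
    # Two cables with identical connectors but wildly different capabilities.
    from dataclasses import dataclass

    @dataclass
    class UsbCCable:
        name: str
        max_watts: int      # supported power delivery
        max_gbps: float     # supported data rate
        displayport: bool   # carries DisplayPort alt-mode video?

    cables = [
        UsbCCable("bundled phone cable", 5, 0.48, False),    # USB 2.0, 5W
        UsbCCable("Thunderbolt 4 cable", 100, 40.0, True),   # 40Gbit, 100W
    ]

    def can_drive(cable: UsbCCable, watts: int, gbps: float,
                  video: bool) -> bool:
        """Check whether a cable meets a device's requirements."""
        return (cable.max_watts >= watts
                and cable.max_gbps >= gbps
                and (cable.displayport or not video))

    # Same connector, opposite answers for a hypothetical 100W laptop dock:
    print([can_drive(c, 100, 10.0, True) for c in cables])  # [False, True]
    ```

    Nothing on the outside of either cable tells you which list entry you’re holding.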

    And shit is even more confusing than that. The USB-C spec defines an HDMI alt. mode, and cables with a USB-C connector on one side and an HDMI connector on the other exist. You’d think that to use such a cable your device would need to support HDMI alt. mode. Nope. HDMI alt. mode isn’t actually used, not even in USB-C to HDMI cables. Instead, all such cables require DisplayPort alt. mode, because they all contain a DisplayPort-to-HDMI converter chip.

    So simple and convenient that we now have this USB-C standard.


  • The 30-pin dock connector had line-level audio output, as well as serial data lines for remote control. Back in the day I could plug my iPod or iPhone into my car, browse my music on a display on my dashboard, and play back the audio over my car stereo. The dock connector also carried analog video (both composite and S-Video), line-level audio input, and FireWire, and it could power accessories (3.3V) as well as charge the iPhone/iPod.

    There was nothing at all at the time that could do all this using a single connector.