
  • kevincox@lemmy.ml to Technology@lemmy.world · *Permanently Deleted* · 2 months ago

    Just looking at the numbers, they are spending $5G and losing $1G, which implies about $4G in revenue. Their subscriptions are growing, so if they grow another 25% they break even and start making money. (This ignores infrastructure costs, which are most likely a tiny fraction of per-user revenue.) They also just launched an Android app, so I think their story is looking pretty good. And that's before considering that the service raises the value of Apple TV hardware and their other devices, and gives them more customer lock-in in general. It seems like a great investment.
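
    A quick back-of-the-envelope check of that arithmetic (the $5G spend and $1G loss are the figures above; the variable names are just for illustration):

    ```python
    # Back-of-the-envelope: revenue implied by the spend and loss figures,
    # and the subscription growth needed to break even (all figures in $G).
    spend = 5.0  # annual content spend, from the comment
    loss = 1.0   # annual loss, from the comment

    revenue = spend - loss                      # 4.0
    growth_to_break_even = spend / revenue - 1  # 0.25

    print(f"Implied revenue: ${revenue:.0f}G")
    print(f"Growth needed to break even: {growth_to_break_even:.0%}")
    ```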


  • kevincox@lemmy.ml to Open Source@lemmy.ml · GIMP 3.0 Released · 2 months ago

    Actually I would pick GIMP.

    1. Says what it is, an image editor.
    2. No popups and random interruptions.
    3. Not just AI editing examples, which would make me think the tool is AI-only.
    4. An overview of the variety of major features it has rather than just AI editing.
    5. Links to helpful documentation rather than endless marketing pages that say nothing.

    Really the only thing I would like to see added is some screenshots and examples of using the tool, rather than just info on what it does. But the Photoshop page barely has this either, just a few examples of the AI tools.


  • there will be scaling with all of its negative consequences on perceived quality

    In theory this is true. If you had a nice high-bitrate 1080p video, it may look better on a 1080p display than any quality of 1440p video would, due to losses from scaling. But in almost all cases, selecting a higher resolution will provide better perceived quality because of the higher bitrate, even if it isn't an integer multiple of the display size.

    It would also be more bandwidth-efficient to target the output size directly. But streaming services want to keep the number of different versions small; often this will already be >4 resolutions and 2-3 codecs. If they also wanted low/medium/high variants for each resolution, that would be a significant cost (the encoding itself, storage, and a reduction in cache hits). So they sort of squish resolution and quality together into one scale: 1080p isn't just 1080p, it also serves as a general “medium” quality. If you want “high” you need to go to 1440p or 2160p even if your output is only 1080p.
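
    To make that squishing concrete, here is a minimal sketch of what such a ladder might look like. The rungs and bitrates are made-up placeholders, not any real service's numbers:

    ```python
    # Illustrative encoding ladder: resolution doubles as the quality knob.
    # All bitrates are invented placeholders, not any real service's values.
    LADDER = [
        # (label, height, video bitrate in kbit/s)
        ("480p",   480,  1_000),
        ("720p",   720,  2_500),
        ("1080p", 1080,  5_000),
        ("1440p", 1440,  9_000),  # doubles as "high" for a 1080p display
        ("2160p", 2160, 16_000),
    ]

    def pick_rendition(display_height: int, want_high_quality: bool):
        """Pick the rung matching the display; step one rung up for 'high'."""
        matches = [i for i, (_, h, _) in enumerate(LADDER) if h >= display_height]
        i = matches[0] if matches else len(LADDER) - 1
        if want_high_quality and i + 1 < len(LADDER):
            i += 1  # a higher resolution stands in for a higher-quality tier
        return LADDER[i]

    print(pick_rendition(1080, want_high_quality=False))  # ('1080p', 1080, 5000)
    print(pick_rendition(1080, want_high_quality=True))   # ('1440p', 1440, 9000)
    ```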


  • the reason no one posts the bitrates is because it’s not exactly interesting information for the general population.

    But they post resolutions, which are arguably less interesting. The “general public” has been taught to use resolution as a proxy for quality. For TVs and other screens this is mostly true, but for video it isn’t the best metric (lossless video aside).

    Bitrate is probably a better metric, but even then it isn’t great. Different codecs and encoding settings can produce very different quality at the same bitrate. But I think in most cases it correlates with quality better than resolution does.

    The ideal metric would probably be some sort of perceptual quality score (PSNR, SSIM, VMAF and the like), but none of those are perfect either. Maybe we should just go back to Low/Med/High for quality descriptions.
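
    For what it’s worth, average bitrate is also trivial to derive yourself when a service won’t publish it. A minimal sketch; the file size and duration are made-up example numbers:

    ```python
    # Average bitrate: size in bits divided by duration in seconds.
    def average_bitrate_kbps(file_size_bytes: int, duration_seconds: float) -> float:
        return file_size_bytes * 8 / duration_seconds / 1000

    # e.g. a hypothetical 1.8 GB, 40-minute episode:
    print(f"{average_bitrate_kbps(1_800_000_000, 40 * 60):.0f} kbit/s")  # ~6000
    ```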