• shortrounddev@lemmy.world · 1 day ago

    Do monitors keep a stable set of features from one generation to the next? I mean, the only real reason to upgrade a monitor is for new features, not because it has incrementally improved on the features it already offered, or maybe size. What would be the basis for calling something a “porkchop” vs a “lizard milkshake”?

    I guess you could have like 3 tiers of features, going from cheapest to most expensive (e.g., lower end is 60 Hz, higher end 120+ Hz), and then each generation you’d know which monitor is “better”.

  • kattfisk@lemmy.dbzer0.com · 7 hours ago

      This is exactly what the companies try to do. For example, ASUS has (in order of increasing fanciness) TUF, ROG Strix and ROG Swift, while MSI has G, MAG, MPG and MEG.

      For each step up you can assume that it will be more cutting-edge, have more extras and carry a higher price. But why would you care? You want to know if the image is good, if it has the features you want and what it costs. You likely don’t care what price segment it was originally intended for.

      As time goes by, what were once expensive premium features become mainstream. So an older top-of-the-line display might be similar in price and performance to a new budget display. Which is better? Well, you’ll have to read some reviews and ideally look at it in person to figure that out. And then you need to know the exact model number of the ones you are comparing. Good thing there’s a compact alphanumeric string that uniquely identifies each model ;)