• 5 Posts
  • 2.36K Comments
Joined 2 years ago
Cake day: June 10th, 2023

  • Trainguyrom@reddthat.com to Memes@sopuli.xyz · gigachad · 1 day ago

    Lately I’ve been replaying the campaign in Railroad Tycoon 2 (available for just a few bucks on GOG) and I’ve had just as much enjoyment as playing a full-priced modern title. Granted, it’s got pretty timeless 2D(ish) graphics and a studio-recorded bluegrass soundtrack at a time when most games just opted for MIDI soundtracks. It’s even got a complete stock market simulation where every action you take affects your company’s stock value, and you as an individual can manipulate the market to benefit your company and vice versa. I’ve had particular enjoyment personally selling a bunch of stock to artificially depress the value of another company right before attempting a merger (rough numbers on that trick below). That, and trying to survive margin calls without being forced to sell my 90% stake in every single railroad company, which inevitably tanks the value of every railroad in the game.

    Pretty dang good for a nearly 30-year-old game!
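
    For a sense of how that dump-then-merge trick pays off, here’s a toy sketch in Python. It is not Railroad Tycoon 2’s actual market model; the linear price-impact rule, the share counts, and the prices are all made up purely for illustration.

    ```python
    # Toy sketch of the "dump shares to depress the price, then merge" trick.
    # NOT Railroad Tycoon 2's real market model -- just an assumed linear
    # price-impact rule to show why the tactic makes the buyout cheaper.

    def price_after_sale(price, shares_sold, shares_outstanding, impact=0.5):
        """Assume selling each 1% of the float knocks `impact`% off the share price."""
        pct_of_float = shares_sold / shares_outstanding
        return price * (1 - impact * pct_of_float)

    shares_outstanding = 100_000
    target_price = 40.00        # target railroad's share price before the dump
    my_personal_stake = 20_000  # shares I hold personally

    # Dump my personal stake to depress the price...
    new_price = price_after_sale(target_price, my_personal_stake, shares_outstanding)

    # ...then have my company buy the remaining float at the depressed price.
    remaining = shares_outstanding - my_personal_stake
    cost_at_old_price = remaining * target_price
    cost_at_new_price = remaining * new_price

    print(f"Price after dump: ${new_price:.2f}")
    print(f"Merger cost without the dump: ${cost_at_old_price:,.0f}")
    print(f"Merger cost after the dump:   ${cost_at_new_price:,.0f}")
    print(f"Saved: ${cost_at_old_price - cost_at_new_price:,.0f}")
    ```

    With these made-up numbers the dump shaves the price from $40 to $36 and the buyout cost drops by about $320,000, at the expense of whatever you lost selling your own shares into the decline.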







  • So with datacenter GPUs (accelerators is the more accurate term, honestly), historically they were the exact same architecture as Nvidia’s gaming GPUs (usually about half to a full generation behind). But in the last 5 years or so they’ve moved to their own dedicated architectures.

    But more to your question, the actual silicon that got etched and burned into these datacenter GPUs could’ve been used for anything. It could’ve become cellular modems, networking ASICs, SDR controllers, mobile SoCs, and so on. More importantly, these high-dollar datacenter GPUs are usually produced on the newest, most expensive process nodes, so the only hardware that capacity would otherwise produce would be similarly high-dollar, not the basic logic controllers used in dollar-store junk.






  • It’s super easy to forget, but Ubuntu tried to do it back in the day with Convergence as well, and amusingly this article also compares it to Microsoft’s solution on Windows Phone. It’s a brilliant idea, but apparently no corporation with the ecosystem to make it actually happen has the will to risk actually changing the world, despite every company talking about wanting an “iPhone moment”.

    Apple would be in a prime position: their entire ecosystem is now ARM-based and they have chips with enough power. But it’s not their style to do something cool that threatens their bottom line. Why sell one phone when you can sell a phone, laptop, tablet, and desktop separately?

    Let’s be real, Apple’s biggest risk would be losing the entire student and young-professional market by demonstrating that they don’t need a MacBook Pro to use the same 5 web apps that would work just as well on a decent Chromebook (if such a thing existed).