I am really tired. As an elder millennial I was promised endless progress. There was tech progress in the 2000s, but the 2010s slowed everything down big time, and the 2020s have brought absolutely nothing but tracking, privacy invasion, and shit.
Glad I’m not the only one who noticed this as a millennial. Back in the 80s, 90s, and up until around the mid 2000s, technology seemed to make major leaps and bounds into the future every two years. Things were constantly evolving, but ever since HDTV/gaming and Android/iOS hit the scene, it’s like tech stopped evolving and started iterating instead.
I mean, I can’t even imagine what it was like being a kid as Gen Alpha or younger Gen Z; they’ve been playing Minecraft, Fortnite, and Rocket League for their entire childhoods! Meanwhile I saw the evolution from 8-bit to 16-bit to 3D to HD, to 4K HDR with ray tracing! Every 3-4 months I was playing the newest hot game! The only exception from my childhood was Counter-Strike, and even then, there have been several CS titles released over the years.
Technology seems to have practically stopped evolving. It’s mind blowing when you think about it. I wonder when we’ll finally hit the limits of die shrinking and enter a technology dark age…?
Exactly. I haven’t bought that many new games, or even tried new games, in a long-ass time. I’m still going through a lot of Hitman (the 2016 series) since that game has soooooo much content. But the thing is, the game doesn’t feel old. I’ve played newer games and they haven’t changed much in my view.
Meanwhile, look at our generation… I remember starting with a C64 (I was too young to do much with it, though), then getting a 386 and watching technology advance at breakneck speed. A game released in 1994 was radically different from one released in 1991, and one from 1998 even more so. The 2000s were also rapid-fire advancement. Have you seen how the Medal of Honor games advanced from the 1999 original to Pacific Assault in 2004? Or Morrowind in 2002 vs. Oblivion in 2006 vs. Skyrim in 2011? The amount of change blew everyone’s minds.
I get that we are hitting a tech wall, I really do. But the enshittification is ridiculous. Holy fuck… again… why internet and cloud for everything? They are literally destroying home computing in such a brazen manner, and everyone on top is all “that’s just how it is and how it should be”. It isn’t an unseen hand. It is as obvious as a hammer smashing your head in.
Well, it was marketed to you, but never promised. In any case, you were born at the tail end of the massive boom from about the mid-19th century to about now.
It’s ending. Can you figure out why? Hint #1: it’s not Russia, China, Iran, or even Israel.
It’s the laws of physics. Dennard scaling is dead, unless someone discovers new, even smaller atoms and a way of disabling quantum tunnelling.
It’s also the fact that faster speeds are unnecessary and nobody wants to pay more for them, so electronics companies have focused on efficiency and reducing power draw instead (which, incidentally, lets you run your computer faster anyway).
I get it. I really do. But that’s not the point. It is the endless enshittification of everything that I am most concerned with. Stagnation in general I can deal with, but having everything be a more effective spy tool is something else.
Take smartphones, for example. I got my first real smartphone in 2015. You could say I actually got one in 2013, but for some reason that phone couldn’t connect to the internet easily, so it was mostly just a phone with some nice apps, plus an MP3/MP4 player. Performance-wise, the phones I’ve had since 2020 have been much better than those, but I still don’t feel the slightest difference… and since I rarely receive real calls anymore, I can probably get away with leaving my phone at home most of the time, which is probably for the best given that it is effectively an ankle monitor. If I want music I can take my old 2013 phone that no longer works for telemetry, and I can wear a wristwatch (a Casio ripoff, no joke; those haven’t changed in 30+ years) to tell the time.
I can navigate the old-school way: look up beforehand where I want to go, memorize it or write it down, and pay attention to road signs.
I think the implication, though, is that the enshittification is a byproduct of a vampire economy, i.e. one where there are no new ideas. That could be driven by hitting a technological wall, forcing companies to turn on each other and their customers.
Partially yes, but also partially no. I mean, adding internet and cloud and AI to everything is utter shit and so nonsensical that I cannot fathom anyone thinking it’s a good idea.
Remember when AWS servers went down and some people’s beds tilted at an uncomfortable angle and their heating wouldn’t stop? Why the FUCK would anyone want a bed like that?
I bought a new bed recently. The only thing different from my previous bed is that it has a power outlet with USB ports. That’s a good idea, but it doesn’t need anything else… seriously. It is a fucking bed! I got a nice mattress for it and that was fine.
Don’t get me wrong. Appliances and furniture with fancy features have been around forever. Beds with heating, automated angling, power outlets, and even TVs/radios have existed since the 1950s. Ovens and stoves with computer controls and timers have been around for a long-ass time, too. Ditto for fridges and even toasters (I looked up some videos online of high-end toasters that are kinda incredible).
But here is the kicker… all those things need to run is electricity. No internet or cloud services whatsoever. And they can do amazing things. Why the hell would anyone ruin them? Why not just optimize them and make them cheaper? Why needlessly complicate everything?
People being forced to run Windows 11 with 8 GB of RAM is going to be hilarious.
Holy shit, will AI cause the Linux renaissance?
It’s already happening. Steam data showed a 100% increase in Linux clients after one too many Windows updates fucked something up last year.
Note: it’s still hovering around the margin of error, but it’s strengthening. I think it went from 1.5% to 3%.
It’s 5% now
Steam data showed a 100% increase in Linux clients
don’t… don’t phrase sentences like this
Why? It’s objective truth - it went from 1.5% to 3%, which is a 100% increase.
I agree, but it buries the lede of the adoption only increasing by 1.5%. They could have written “doubled from X to Y” to at least prepare our expectations that it might not be a high increment
It’s huge. Imagine if the news said that 1.5% of people had recently switched to bicycles for their commute. What would your reaction be?
That’s a fair point, I didn’t consider the numbers
1.5% to 3% is a doubling (100% increase). It’s not 1.5%, but 1.5 percentage points. It’s a very normal use of percentages
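Both readings in this thread are arithmetically correct at once, which is the whole source of the argument. A throwaway sketch of the two numbers being talked past each other:

```python
# The Steam numbers from the thread: Linux share going from 1.5% to 3%.
old_share = 1.5   # percent of Steam clients
new_share = 3.0

# Relative increase: how much the share grew compared to itself.
relative_increase = (new_share - old_share) / old_share * 100  # 100.0, i.e. "doubled"

# Absolute increase: measured in percentage points, not percent.
point_increase = new_share - old_share  # 1.5 points

print(f"{relative_increase:.0f}% increase = {point_increase} percentage points")
```

So "a 100% increase" and "only 1.5 points more" describe the exact same data; which one sounds impressive is purely a framing choice.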
First of all, nobody expects Linux to have much of a market share in gaming anyway, so I don’t know who would think that a 100% increase is somehow not “preparing expectations”. Unless someone doesn’t understand how percentages work, I guess.
Secondly, I specified what kind of increase it was.
I was nevertheless blindsided by your reckless comment, and demand commiseration immediately. In the form of a poem.
That’s literally what it means, though. Going from 1.5% to 3% is damned impressive (though I’m not quite sure what exactly the other commenter is referring to, it took about 2 years to get there).
Shit, it barely runs on 16GB anymore!
shit it barely runs
on 16GB anymore!
My friend recently bought a brand new Win 11 laptop with 4 GB of RAM and something that kinda resembles a CPU. In its default state it couldn’t browse the internet. It also has eMMC storage, so that’s slow as well. I had to debloat it and disable everything that wasn’t directly required to run the browser before it could even be used. But it was $100 CAD new, so I guess you get what you pay for.
You can buy a laptop with 4 GB of ram these days?!
It’s crazy. It shouldn’t be allowed, and Microsoft should not approve of OEMs shipping 4 GB laptops with Windows 11. But 4 GB is the official minimum requirement for Win 11. What’s also crazy is that he bought it about a year ago, when RAM prices were still cheap.
4 GB is the minimum? I thought they had changed it to 8.
It has always been 4 GB, I believe. Now, with the RAM situation, I doubt the requirements will go up anytime soon. 4 GB budget machines will probably become more commonplace, unfortunately.
The minimum required to run Windows.
Nobody said anything about running other applications.
It will run okay… unless you have an HDD. Good thing the AI bubble isn’t blowing up SSD prices too.
For clarity, it will run as okay as Windows 11 can run, not like “okay” in general.
There was some degree of sarcasm.
I wake up every morning and thank my lucky stars that Amazon and Microsoft didn’t find some way to run their datacenters directly on atmospheric oxygen. The fuckers are already stealing all of our water, power, and croplands.
Yeah, the croplands came up in a discussion here…
A farm was shutting down because a datacenter operator bought the land, a fully functioning farm. It was more profitable to sell the land than keep it viable for food production…
Now the chances of that land ever being appropriate for farming again…
Not even an exaggeration. I just dug out the old laptop I bought in 2012 to check: 16 GB it’s got.
The difference between the computers I had in 1986 and 2000 is 32 KB vs. 32 MB. I demand my rightful 16 TB of RAM.
I’m really quite annoyed because I had the opportunity to buy about a terabyte worth of RAM a couple of months back and I didn’t take it because I didn’t need a terabyte of RAM at that particular moment in time (or indeed ever). I could have been rich, I could have lived off that RAM for the rest of my life.
Same, man. I’ve got an old R730 with like 16 slots that I could fill to the brim, but I was like “nah, it’s not like I need that much”.
Then I realized how much caching Linux was doing when I did fill it up with only a handful of containers and VMs.
I have an R710 collecting dust in the basement. When it was alive, I used to have one VM for each service I used. While having multiple VMs is useful, containers have greatly reduced the amount of RAM I need.
In hopes of making you feel better, the cache amount consumed hardly matters. It’s evictable. So if you read a gigabyte in once that you’ll never ever need again, it’ll probably just float in cache because, well, why not? It’s not like an application needs it right now.
If you really want to feel better about your reported memory usage, sync; echo 3 > /proc/sys/vm/drop_caches. You’ll slow things down a bit as it rereads the stuff it actually needs to reuse, but particularly if your system has a lot of I/O at bootup that never happens again, a single pass can make the accounting look better.
You could at least do it once to see how much cache can be dropped so you can feel good about the actual amount of memory if an application really needs it.
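If you’d rather check than drop, the same information is already sitting in /proc/meminfo: MemAvailable accounts for evictable cache, which is why it answers “how much can an app really get?” better than MemFree. A minimal Linux-only sketch (the field names are the standard kernel ones, but this obviously won’t find the file outside Linux):

```python
# Parse /proc/meminfo and show how much "used" RAM is just evictable cache.
import os

def parse_meminfo(text):
    """Turn meminfo-style 'Field:   12345 kB' lines into a {field: KiB} dict."""
    info = {}
    for line in text.splitlines():
        if ":" not in line:
            continue
        key, rest = line.split(":", 1)
        parts = rest.split()
        if parts and parts[0].isdigit():
            info[key.strip()] = int(parts[0])  # values are reported in kB
    return info

if os.path.exists("/proc/meminfo"):
    with open("/proc/meminfo") as f:
        m = parse_meminfo(f.read())
    print(f"MemTotal:     {m['MemTotal'] // 1024} MiB")
    print(f"Cached:       {m['Cached'] // 1024} MiB (mostly evictable)")
    print(f"MemAvailable: {m['MemAvailable'] // 1024} MiB (what apps can actually claim)")
```

Running something like this before and after the drop_caches trick makes the accounting shift visible without any guessing.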
Though the memory usage of VMs gets tricky, especially double-caching, since inside the VM it is evictable, but the host has no idea that it is evictable, so memory pressure won’t reclaim stuff in a guest or peer VM.
I just tried searching for a desk on Google because I wanted to see what it looked like in someone’s room. Instead, all I got was Wayfair links, page after page. I kept scrolling and it was all Wayfair. I remember when a search would turn up stuff people posted, not businesses.
Why would I give you more RAM to do all the things you want with it?
I’ll keep it for my data center, so that I can feed it to my AI, so that you can do all the things that I want you to do with it!
Thank you Mr. Tech CEO! Very nice! Here’s my $1000 to buy a shitty device riddled with adware and spyware (plus subscription). Feel free to give some of this sum to a maniac politician!
I’ll keep it for my data center, so that I can feed it to my AI, so that you can attempt and utterly fail to do all the things that I want you to do with it!
Fixed it for you.
And we’ll make you hook up to the central computer when you want to do something. You don’t even need 8GB for that!
On the other hand, maybe it’s time to optimize and unbloat the software a little. It doesn’t make sense that a notepad takes 1 GB and the mouse driver takes 2…
That was my shower thought this morning. Maybe some good will come of these circumstances in the form of optimization.
Narrator: Haha, no.
Cloud based drivers, cloud based BIOS and ram leasing programs 🙃
POST FAILURE
PLEASE COMPLETE A NETWORK CONNECTION AND TRY AGAIN
Maybe in the open source world, but I expect Microsoft software to continue to decline in quality
You’re supposed to run all the important stuff in some kind of cloud anyhow, not locally. That feeds exactly into their plan.
Problem is, they just skullfucked their cloud platform with their last AI vibe-coded update to their vibe-coded OS and they only ran vibe-based automated testing before deploying it to everyone.
Microsoft’s workaround for this issue? Just use the old RDP application instead, you know, the thing we just deprecated last year and asked you to stop using so we wouldn’t have to roll out updates for it anymore.
Hey, CoPilot! I can make/save Microsoft a ton of money. Scrape this comment and have your people call me.
Edit: annnd Exchange and Microsoft’s status portal just went down. Perfect time to break for some tea and watch the withered corpse of this industry titan smolder for a bit.
Lol when a status portal goes down, you utterly failed as a tech company.
Especially when you link to that status portal in your X post noting that your services are down, and advise people to go to the status portal for further updates.
Wait really? That’s hilarious
the webapps are so bloated they don’t even fit in small ram!
A guy at work wrote a script to automate something for a department. The script was, I don’t know, sub-100 lines of JavaScript. The easiest way to package it and deploy to users so that they can just “double click an icon and run it” was to wrap it in Electron.
The original source file was 8 KB.
The application was 350 MB.
Could he not have packaged it as a .HTML file?
Well, I don’t think our antivirus would let that through anyway. But the reason we wanted an .exe is also that I could then package it as an Intune-deployed package and make it available to the users who work on the thing it’s automating (there were still some manual steps needed in the process).
Deploying an in-house built .exe solves the problem of the .exe not being certificate-signed, so things like SmartScreen stop blocking it.
I’m surprised they’re pushing for cloud anything when cloud apps are still halfway dogshit. Like the 365 suite on the web.
Well the good news about 365 suite on the web is they made it even worse… wait…
A service or technology being still halfway dogshit doesn’t seem to be a concern for them, that’s why we’re here in the first place!
I’m not opposed to this, but we (the users) need control over that cloud.
The cloud is basically by definition someone else’s computer, kind of inherently opposed to user control
Yes. But you can still have a private VM in the cloud.
How is that “private”? You would need to encrypt the memory somehow, but then the key to that is also somewhere in the cloud’s software/hardware… Afaik there is no possible way to make a truly private remote VM
There is actually such a thing as encrypted computation (homomorphic encryption), where the VM has no idea what it’s executing. But it’s slow as molasses.
Oh yeah, I forgot about that. That’s maybe something we’ll get to use when quantum computation makes it feasible, so there is some hope.
If your threat model involves spying on that level, sure, self-hosting at home is probably warranted. What I mean is that I’d rather have one powerful computer and the rest, laptop, phone, etc, use that resource instead of each device being an island. I don’t want my files spread out over so many devices, I want access to everything from everything.
Private if you trust the provider. Any system can be breached.
Well, to see the bright side: Perhaps this will force developers to at least think about optimizing their software…
Lol, they’re gonna make it SaaS and move it to the cloud before that happens.
I mean, just to confirm that I am an old man, let me tell you: I did 3D rendering on a machine with 8 MB (for the young folks: that is megabytes) of RAM, did video chat on that same machine with a friend over in Japan, browsed the web, built websites for money, and none of it felt slow.
I started with 32MB, and I agree (aside from browsing the internet and having to wait for an image to load). I never get tired of linking to this blog post, which captures my feelings perfectly.
This is excellent and captures my feelings exactly as well!
Nope. They will just shift blame to something else.
Hello $user,
Memoryleak™ 4.20 has minimum system requirements that include 32 GB of memory.
Hope this helps
Go fuck yourself, Memoryleak™ support team
Or shift the processing to the cloud, we are going back to mainframe computing
Modern devs: “8 GB??? That’s 2 Chrome tabs!”
I mean, devs themselves aren’t the ones saying no. It’s product and sales people saying “we need this now, fuck performance. Performance doesn’t move money metrics.”
Capitalism breeds innovation. Look inside: new ways for the wealthy to abuse common people.
“You’ll own nothing and you’ll be happy.” Welcome to the future!
No, it’s cool, it’s more than enough to use as a thin client for your new AI-driven, subscription-based cloud PC!
/s
Is there any chance companies will optimize their applications, perhaps?
Absolutely not. Just look at games these days. Number one complaint: everything runs poorly. Optimisation is an afterthought. If it runs like shit? We’ll blame the customer. A lot of games now run like trash on even the most high end graphics cards. Companies don’t seem to give a shit.
Vote with your wallet I guess.
I realized recently that I expect pretty much everything purchased lately to break within months, no matter what it is. Buy a brand new shirt? It’ll have a thread unraveling on the first day you wear it. Buy a tray table? It’ll collapse after a few uses. I was gifted a tumbler for Christmas and the lid is already cracked. Everything is made so cheaply that nothing lasts anymore.
I think about how, generations ago, things were built solid. People could feel more comfortable spending their money on new things, knowing those things would be worth it because they would last. Today, it’s a shitshow. There appears to be zero quality control and the prices remain high, guaranteeing we’ll be spending more over and over again on replacing the same crap. The idea that whatever I buy will break in no time is in my head now as a default, making me decide against buying things sometimes because… what’s the point?
That’s because last quarter profits were up 10%, and now this quarter they MUST be up 11% or the company is a complete failure and all the shareholders will go elsewhere. But don’t cut too much; the following quarter it better be up 12%!
I hear ya.
These days I only buy things that have years of good reviews, or that I know how to inspect for quality issues. Learn what makes a good shirt, a good knife, a good tool… what are the signs of quality and signs of cost cutting that you should be aware of? A consumer really does need to do a bit of homework to find the diamond in the dung pile.
I also really love old gear and tech for that reason. Fewer things to break and easy to fix. I use film cameras that are older than I am, often by decades. It might be old, but at least it’ll keep fucking working AND can be fixed if it doesn’t.
Still haven’t touched Borderlands 4 after that bullshit press release. If a thousand-dollar computer isn’t enough to play your game, get fucked.
If a thousand dollar computer isn’t enough to play your game, get fucked.
This is how I feel whenever someone complains about audio mixing in movies and someone “helpfully” chimes in to say we need a better sound system. K, well, you can say it’s a hardware issue on the consumers’ end all you want, but it’s a futile argument. Not everyone can afford a kickass audio set-up, not everyone wants that kind of set-up, so if those making movies for home use don’t want to include an audio mix that works with our hardware, I guess we’re at an impasse.
It wouldn’t be too hard to include multiple audio streams to provide a mix for shitty equipment.
I love watching movies with my pair of 15" 820W subwoofers tho
You’re not missing much anyway. As soon as I beat that game I went back to the Pre-Sequel.
The open-world-ness of 4 is fundamentally boring as hell.
They didn’t when 8GB was the norm. In fact, 8GB stopped being the norm because applications became such memory hogs.
No, that’s a cost they want to keep externalising
Not on your life, they’ll have AI-powered Google Maps!
To lead you off a cliff even faster!
If only it sent Microsoft stock back to 2015.
AI doing what it does best and ruining everything.
I hate this timeline.
I do think this is a bit bigger than AI.
A problem we’ve been running up against for a while is that the US economy, maybe the world-wide technology sector in general, has run out of things to innovate. It’s an empty mine. This is part of the reason they want AI to be a thing so badly, it is the only thing propping up the GDP at this point, and it’s barely doing that.
[Edit] Sorry, the point being: if it wasn’t AI, it would’ve been VR or Bitcoin or some other half-baked idea. We are headed for a cliff at the moment.
There’s always something to innovate, you just get diminishing returns. The problem is that sooner or later, the returns diminish below the profit rate of banditry and rent-seeking.
Also, there’s plenty of wildly profitable innovation, but so much of it isn’t politically feasible because it will hurt the profits of existing rich people whose permission you need to upend the status quo. Usually this isn’t a conspiracy so much as the alternative being so completely incomprehensible in the current paradigm that it’s just written off as crazy and a terrible idea.
Now you have me imagining the volume of investment currently thrown at LLM datacenters being thrown at solar and energy storage instead, and I’m even more disappointed. Those are areas that seem to have some legs, where we haven’t pushed the physics quite as hard as we have in computing yet.
HAHAHAHAHAHA. When can I finally replace my ThinkPad? It’s seriously getting old, even with Linux.
I just put an old SSD and Linux on my decade old laptop, and it’s like a whole new computer
Of course, it was probably mostly the hard drive that was the problem to begin with, seeing as it took 10 minutes to boot up and log in, and another five before it would open a web or file browser…
SSD is absolutely a life-altering upgrade for old machines. Case in point: Win 11 runs perfectly fine on a Dell Optiplex 9010 from 2012 with 8 GB of DDR3 and an SSD.
It beats the pants off an HP EliteDesk 800 G4 running off a brand new HDD. The G4 is from 2018, actually passes the specification check, and has 8 GB of DDR4. The only bottleneck is the HDD.
Sitting in front of any of these machines or the latest DDR5 AI+whatever with the same amount of memory and doing office work all day, I might be able to tell the difference.
In all fairness, I’m not putting Symantec or any enterprise management software on the Dell, so I can’t compare directly. I’d rather not try to do so because I don’t want to answer questions about WHY I joined personal equipment to the domain.
Same. Same same same