Is there a better algorithm to generate code? Hmm, maybe theoretically but it certainly doesn’t exist now.
I don’t know if there is a better algorithm, but there is definitely a lot of prior work in this field:
https://en.wikipedia.org/wiki/Comparison_of_code_generation_tools
Yes, smart, precise helper tools and code analysis are what we want, not some mumbo jumbo.
Yes, that’s… the complaint. Umpteen hojillion dollars have been spent turning LLMs into a dancing bear, with nobody seeming to ask for a better text-to-text approach.
Meanwhile image and video shit keeps going crazy, just by removing noise.
I’ve noticed a slowdown in the adoption of functional programming concepts (monads, pattern matching, option/result types, immutability, pipes) in favour of shitty vibe-coded Python.
I’m honestly not that certain there’s a better algo for NLP. LLMs do suck for most of the things we try to make them do, but the NLP part has been consistently great.
People should look at something like Inform 7 and everything that has gone into making software understand natural speech. Even a basic LambdaMOO parser takes a while to set up.
Now it’s free to map basic language to functions. My 4b models use less RAM/CPU than watching a 720p music video. Sucks that people overhype AI, because transformers are really cool.
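For contrast, the kind of hand-built parser the LambdaMOO comment is talking about can be sketched in a few lines. This is a hypothetical toy, not any real MUD API: a verb table maps command words to handler functions, which is exactly the part LLMs now do "for free" from natural language.

```python
# Toy LambdaMOO-style verb parser: maps "verb object" phrases to
# handler functions via a lookup table. Hypothetical example only.

def take(obj):
    return f"You pick up the {obj}."

def drop(obj):
    return f"You drop the {obj}."

VERBS = {"take": take, "get": take, "drop": drop}

def parse(command):
    words = command.lower().split()
    if not words:
        return "Say what?"
    verb, *rest = words
    handler = VERBS.get(verb)
    if handler is None:
        return f"I don't understand '{verb}'."
    return handler(" ".join(rest) or "nothing")

print(parse("take brass lantern"))  # You pick up the brass lantern.
```

Every synonym, every phrasing, every edge case has to be enumerated by hand here, which is why those parsers take a while to set up.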
The bubble is already starting to burst. Or at the very least starting to deflate. I’m currently in a technical conference with one of the main topics being AI. About half the sentiment of the talks is shortcomings, challenges and alternatives to LLMs in pretty much all areas. Devs and Execs alike notice that the advancement of LLMs has stalled and promised capabilities won’t be achieved anytime soon.
Out of curiosity what were they trying to do with LLMs that they’re now trying to do with something else?
Context engines for search engines and voice assistants were vastly superior to the LLM AI counterparts that have taken their place. I fucking hate that they broke plenty of things that were on the verge of perfection for something so inferior.
LLMs/AI can do anything. We just need to keep training them. It’s not like their effectiveness has started to plateau or the feedback loop of AI slop is poisoning the training data. You just don’t understand. AI can’t fail. We need to light more money on fire. Don’t worry about the smoke; the electricity required to run the LLMs makes that carbon little more than a drop in the bucket. Plus, AI can solve the climate crisis - we just need more of your data to train the model. It can do anything. It’ll pay off in the end - you’ll see. When it does we can make boatloads of money by firing all of the workers that ultimately make up our customer base and buy our products. Don’t you understand? It can’t fail. More money. More training data. It has to work.
🙄 corpos not talking about real tech doesn’t mean people aren’t developing real tech
they just aren’t the five people who own english speaking newspapers
Bro just reinvented opportunity cost
For “boilerplate code generation”, you can just write a script, use a language with metaprogramming capabilities (even the C preprocessor is capable of it to a limited degree), etc.
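A minimal sketch of the “just write a script” option: a few lines of Python that stamp out a C struct and accessor prototypes from a field list. The names and fields are made up for illustration.

```python
# Boilerplate generator sketch: given a list of (type, name) fields,
# emit a C struct definition plus getter prototypes. Illustrative
# example only -- field list and naming scheme are invented.

FIELDS = [("int", "id"), ("double", "balance"), ("char *", "name")]

def gen_struct(name, fields):
    lines = ["typedef struct {"]
    for ctype, fname in fields:
        lines.append(f"    {ctype} {fname};")
    lines.append(f"}} {name};")
    # One getter prototype per field
    for ctype, fname in fields:
        lines.append(f"{ctype} {name.lower()}_get_{fname}({name} *self);")
    return "\n".join(lines)

print(gen_struct("Account", FIELDS))
```

Deterministic, auditable, and it produces the same output every run, which is the whole point of the comparison.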
She’s describing literally every tech bubble we have had
This could be applied one way or another to virtualisation, blockchain, Web 3.0, AR, and all those other tech buzzwords of the year.
A few years from now there will be a new set of shiny keys dangling and people will instantly forget what AI was
3 years ago, literally everything was about blockchains. It would change the way we live, it’s a revolution! We invested billions. Now? Blockwhat? I only know AI!
It’s a guarantee that two years from now nobody will be talking about this AI shit, and maybe, maybe, a few projects and/or side tools worth a few million dollars in total will have survived. The rest went up in flames, as per usual.
I’m fucking tired of the tech world, especially the tech managers and ESPECIALLY the techbros, who all have to die in a fire; it would make the world measurably better.
LLMs are spell checkers on steroids. Mathy-maths aren’t much better.
LLMs are like Excel. They’re not great at most things but can handle most things you throw at them.
Excel doesn’t hallucinate.
Well, they’ve added copilot to it now right? So I reckon it probably does
Excel is fundamentally different to LLMs because it doesn’t generate anything. It just calculates stuff in a table, every calculation is traceable and you can backtrack to the source of each input.
LLMs calculate the next most likely token based on what they were trained on and it’s impossible to tell exactly where it got the info from. LLMs don’t even know that themselves, you can ask for a source but 90% of the time they just generate a non-working link in return. It has to be that way or else LLMs could only reproduce tokens that were in their training set verbatim, like a giant database search engine.
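The “next most likely token” mechanism described above can be illustrated with a toy model. This is a deliberately crude bigram sketch, not how a real transformer works, but it shows the traceability point: once the counts are aggregated, there is no record of which source sentence any individual prediction came from.

```python
# Toy next-token predictor: count word bigrams in a tiny corpus, then
# always emit the most frequent continuation. After aggregation, the
# counts carry no provenance -- you can't ask it "which sentence did
# that come from?", which is the untraceability point made above.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def next_token(word):
    if word not in bigrams:
        return "<unk>"
    return bigrams[word].most_common(1)[0][0]

print(next_token("the"))  # "cat": the most frequent continuation of "the"
```

A real LLM replaces the count table with billions of learned weights, which makes the provenance even more thoroughly blended away.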
The Excel function “flash fill” is an example of excel “guessing” the user intent and automatically creating output based on a few examples.
It is probably still “traceable”, though.
It does now! Because what everybody needed was unreliable calculations in their math software!
Excel is why Office rules the world. It’s the finest spreadsheet program out there, best in class, and here’s the thing, it never changes. You can trust Excel with your business data. Your employees know how to use it and what to expect. Power users can write their own code.
Been using M$ since 3.11. Hell, even tried 2.0 a few times. I can bag on about anything they’ve done, but not Excel.
Well there is the automatic date guessing, and that time the NHS lost months of covid data to a row maximum…
But both of those are trivial to avoid if you know you need to
I remember reading that that’s no longer the case either, as Excel has added some new AI features that somehow make it just as unreliable. Don’t ask for a source, I remember reading it like 2 weeks ago
Read that too, but there’s no way in hell MS won’t allow that to be turned off in Excel, and it’s likely opt-in. They are not about to kill their cash cow no matter how hard they’re pushing AI.
Eeehh, have you been paying attention? MS has been doing an ungodly amount of stupid things lately, like pushing Win11, which requires new hardware, which pushes people to Linux.
It wouldn’t surprise me if this, too, gets pushed down everyone’s throats
Excel is like a hammer. As long as you use it correctly, it will reliably pound nails, every time.
LLMs are like a golden retriever. It might bring back the ball most of the time, but sometimes it might just take a shit in front of you.