Guy who buys programmers and sells AI thinks he can sell more AI and stop buying programmers.
This is up there with Uber pretending self driving cars will make them rich.
I mean… self driving cars probably will. Just not as soon as they think. My guess, at least another decade.
Not until a self driving car can safely handle all manner of edge cases thrown at it, and I don’t see that happening any time soon. The cars would need to be able to recognize situations that may not be explicitly programmed into them, and figure out a safe way to deal with them.
As someone said on this thread: as soon as they can convince legislators, even if they are murder machines, capital will go for it.
Borrowing from my favorite movie: “it’s just a glitch”.
I doubt it. The liability would be far too great. Ambulance chasing lawyers would salivate at the chance to represent the families of pedestrians struck and killed by buggy self driving cars. Those capitalists don’t want endless years of class action cases tying up their profits.
When was the last time a corporation got anything other than a slap on the wrist and a small donation to the government just so they could keep doing what they’re doing?
Like Boeing. As much as I hate people saying dumb shit about a company they don’t know much of anything about, Boeing is the epitome of what you said: a company getting a small slap on the wrist for gross negligence in the name of profit. Especially because of all the goodies they develop for the US Federal Government. And since they are a worldwide company, our government isn’t the only one. They know they reside in a place of power because they fill a hole in an industry that basically has to be filled. And people want to try to bankrupt them with some weird ideas about voting with their dollar. But that’s nonsense.
People don’t understand how they build planes not to sell but to lease. These types of leases keep their customers paying out the nose for an asset they don’t own, and responsible for the maintenance of that asset until it’s time to upgrade. They cornered the market on enshittification long before the likes of Microsoft and Google, and they have mastered the art of it.
Tesla or Uber or whoever wish they could do what Boeing has been doing for decades. People have this rose-tinted view of what Boeing “used to be” when it was “run by engineers”, etc. That’s hilarious to me. Back in the day they hedged their bets in a race to the bottom to develop a two-engined plane that wouldn’t catastrophically fail and fall out of the sky if it lost an engine, so they could skirt international regulations that effectively required planes to have more than two engines. Those extra engines added to upkeep and fuel costs, which made air travel incredibly expensive. Their engineers managed it, so they played the long game: it let them develop planes that were more fuel efficient and cheaper to maintain, meaning their customers could afford to buy more of them and offer air travel to more people.
You know what we got from that? Shittier seating arrangements, poorly manufactured planes, and baggage fees out the wazoo, in addition to ever rising ticket prices for air travel.
Alternatively, measures could be put in place to eliminate certain edge cases. You can see similar concepts in places with separate infrastructure for things like buses or HOV lanes: places where you could still ostensibly allow “regular” vehicles to travel but limit/eliminate pedestrians or merging.
Abolish snowfall
We’re working on it, although for some parts of the world we will need to go through periods of increased snowfall to get there.
Tbf, human-operated cars are also murder machines; we’re just more amenable to tolerating it.
there will be a massive building in like india with many thousands of atrociously paid workers donning VR goggles who spend their long hours constantly Quantum Leaping into traumatizing last-second emergency situations that the AI gives up on. Instantly they slam on the brakes as hard as they can. They drink tea. there’s suicide netting everywhere. they were the lowest bidder this quarter.
I wish I could give this comment more than a simple upvote. I want to mail you a freshly baked cinnamon bun.
Their accident rate continues to decrease and things like quorum sensing and platooning are going to push them to be better than humans. You’re never going to have a perfect system that never has accidents, but if you’re substantially better than humans in accidents per mile driven and you’re dramatically improving throughput and reducing traffic through V2X, it’s going to make sense to fully transition.
I imagine some east Asian countries will be the first to transition and then the rest of the world will begrudgingly accept it once the advantages become clear and the traditional car driving zealots die off.
The robot taxi from Total Recall came to mind while reading your reply. Our future is almost assuredly dystopian.
Plus, as soon as the cars can drive themselves people will stop needing Uber in many cases.
No parking? Just tell your car to go park on a street 10 blocks away.
Drunk? Car drives itself while you sleep.
Going to the airport? Car drops you off and returns home. Car also picks you up when you are back.
This is combined with the fact that people will do more disgusting things in an Uber without the driver there. If you have ever driven for Uber, you know that 10% of people are trying to eat or drink in the car. They are going to spill and it’s going to end up like the back of a bus.
Not sure if we’re agreeing and saying exactly the same thing here, but Uber’s business model is to get suckers who are bad at math to own the cars. Uber’s business model does not work if they have to own their own cars. Self-driving Uber doesn’t work because Uber would have to own the cars and therefore has to cover vehicle insurance, vehicle depreciation, and so on out of its own margin.
“handle” is doing a lot of heavy lifting there. The signs are already there that all of these edge cases will just be programmed as “safely pull over and stop until conditions change or a human takes control”. Which isn’t a small task in itself, but it’s a lot easier than figuring out how to continue (e.g.) on ice.
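A minimal sketch of what that fallback looks like (my own illustration, not any vendor’s actual logic; the state names and confidence threshold are made up):

```typescript
// Toy "minimal risk maneuver" fallback: when the planner isn't confident it
// can handle the situation, it stops trying to drive through it and degrades
// to pulling over until conditions change or a human takes control.
type DrivingMode = "NORMAL" | "PULL_OVER_AND_STOP" | "HUMAN_CONTROL";

const CONFIDENCE_THRESHOLD = 0.8; // invented number, purely for illustration

function nextMode(plannerConfidence: number, humanTookControl: boolean): DrivingMode {
  if (humanTookControl) return "HUMAN_CONTROL";
  if (plannerConfidence < CONFIDENCE_THRESHOLD) return "PULL_OVER_AND_STOP";
  return "NORMAL";
}

// e.g. black ice ahead, planner confidence collapses:
console.log(nextMode(0.3, false)); // "PULL_OVER_AND_STOP"
```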
Just like all humans can do right now, right?
I never see any humans on the road staring at their phone and driving like shit.
The problem with self-driving cars isn’t that it’s worse than human drivers on average, it’s that it’s SO INCREDIBLY BAD when it’s wrong that no company would ever assume the liability for the worst of its mistakes.
But if the average is better, then we will clearly win by using it. I’m not following the logic of tracking the worst case scenarios as opposed to the average.
Average is better means fewer incidents overall. But when there are incidents, the damages for those incidents tend to be much worse. This means the victims are more likely to lawyer up and go after the company responsible for the AI that was driving, and that means that the company who makes the self-driving software better be prepared to pay for those worst case scenarios, which will now be 100% their fault.
Uber can avoid liability for crashes caused by their human drivers. They won’t be able to do the same when their fleet is AI. And when that happens, AI sensibilities will be measured by human metrics, because courts are run by humans. The mistakes they make will be VERY expensive ones, because a minor glitch can turn an autonomous vehicle from the safest driving experience possible into a rogue machine with zero sense of self-preservation. That liability is not worth the cost savings of getting rid of human drivers yet, and it won’t be for a very long time.
Those self-driving cars are called trains. They can already be self-driving, in a situation where the computational complexity and required precision are somewhat controlled; that is, on train tracks.
Maybe, or maybe like harnessing fusion it will always be “just a few more years away!”
Self driving taxis are definitely happening, but the people getting rich in a gold rush are the people selling shovels.
Uber has no structural advantage because their unique value proposition is the army of cheap drivers.
We’re a century away from self-driving cars that can handle snowfall
Just this year, farmers with self-driving tractors got screwed when a solar flare made GPS inaccurate: the tractors went wild because they were programmed on the assumption that GPS is 100% reliable and accurate, with no way to override.
I’m right there with you, but I also remember hearing that this time last decade.
Way longer. Roads will have to be designed and maintained with them in mind.
I hope this helps people understand that you don’t get to be CEO by being smart or working hard. It’s all influence and gossip all the way up.
In fact, being stupid is probably a benefit.
Yep, if I had that kind of money and were surrounded by like-minded people, I’d agree. Unfortunately I’m cursed with a rational mind 🙃🙃🙃
“Coding” was never the source of value, and people shouldn’t get overly attached to it. Problem solving is the core skill. The discipline and precision demanded by traditional programming will remain valuable transferable attributes, but they won’t be a barrier to entry. - John Carmack
This right here.
The problem is not coding. Anybody can learn that with a couple of well-focused courses.
I’d love to see an AI find the cause of a catastrophic crash of a machine that isn’t caused by a software bug.
Catching up on what Carmack’s been up to for the last decade has revived the fan in me. I love that 2 years after leaving Oculus to focus on AGI, this is all the hype he’s willing to put out there.
Fucking lol.
Keen Technologies
Perfect name for a Carmack company!
Oh my god, I just got that, haha.
Agreed! Problem solving is core to any sort of success. Whether you’re moving up or on for more pay, growing tomatoes or nurturing a relationship, you’re problem solving. But I can see AI putting the screws to those of us in tech.
Haven’t used it much so far (last job didn’t afford much coding opportunity), but I wrote a Google Apps Script to populate my calendar given changes to an Excel sheet. Pretty neat!
With zero Apps Script experience, I tried going the usual way, searching web pages. Got it half-ass working, then got stuck. Asked ChatGPT to write it and boom, solved with an hour’s additional work.
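For anyone curious, a script like that is roughly this shape (a minimal sketch, assuming the Apps Script runtime where SpreadsheetApp and CalendarApp exist as global services; the sheet name and column layout are invented for illustration, not the commenter’s actual script):

```typescript
// Minimal Apps Script sketch: read rows from a sheet and create calendar
// events. Assumes a sheet named "Schedule" with columns: title, start, end.
function syncSheetToCalendar(): void {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName("Schedule");
  if (!sheet) throw new Error("Sheet 'Schedule' not found");

  const calendar = CalendarApp.getDefaultCalendar();
  const rows = sheet.getDataRange().getValues();

  // Skip the header row, create one event per data row.
  for (const [title, start, end] of rows.slice(1)) {
    calendar.createEvent(String(title), new Date(start), new Date(end));
  }
}
```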
You could say, “Yeah, but you at least had a clue as to general scripting and still had to problem solve. Plus, you came up with the idea in the first place, not the AI!” Yes! But the point being, AI made the task shockingly easier. That was at a software outfit, so I had the opportunity to chat with my dev friends and see what they were up to. They were properly skeptical/realistic as to what AI can do, but they still used it to great effect.
Another example: I struggled like hell to teach myself database scripting, so ignorant I didn’t know the words to search, and the solutions I found were more advanced than my beginner work required (or understood!). First script was 8 short lines, took 8 hours. Had AI been available to jump start me, I could have done that in an hour, maybe two. That’s a wild productivity boost. So while AI will never make programmers obsolete, we’ll surely need fewer of them.
“Guy who was fed a pay-to-win degree at a nepotism practicing school with a silver spoon shares fantasy, to his fan base that own large publications, about replacing hard working and intelligent employees with machines he is unable to comprehend the most basic features of”
You did a great summary honestly
They’ve been saying this kind of bullshit since the early 90s. Employers hate programmers because they are expensive employees with ideas of their own. The half-dozen elite lizard people running the world really don’t like that kind of thing.
Unfortunately, I don’t think any job is truly safe forever. For myriad reasons. Of course there will always be a need for programmers, engineers, designers, testers, and many other human-performed jobs. However, that will be a rapidly changing landscape and the number of positions will be reduced as much as the owning class can get away with. We currently have large teams of people creating digital content, websites, apps, etc. Those teams will get smaller and smaller as AI can do more and more of the tedious / repetitive / well-solved stuff.
And by that time, processors and open source AI are good enough that any noob can ask his phone to generate a new app from scratch. You’d only need big corpo for cloud storage and then only when distributed systems written by AI don’t work.
I’m relaxed. IMHO this is just another trend.
In all my career I haven’t seen a single customer who was able to tell me out of the box what they need. A big part of my job is to talk to all entities to get the big picture: gather information about software and hardware interfaces, visit places to see PHYSICAL things like sub-processes or machines.
My focus may be shifted to less coding in an IDE and more of generating code with prompts to use AI as what it is: a TOOL.
I’m annoyed by this mentality of getting rich quick, earning a lot of money with no work, developing software without earning the skills and experience. It’s like using libraries for every little problem you have to solve. Worst case, you land in dependency/debug hell and waste much more time debugging stuff other people wrote than you would have spent coding it yourself and understanding how things work under the hood.
the number of positions will be reduced as much as the owning class can get away with
Well, after all, you don’t hire people to do nothing. It’s simply a late-stage capitalism thing. Hopefully one day we can take the benefits of that extra productivity and share the wealth. The younger generations seem like they might move us that way in the coming decades.
I really hope so. Sometimes I think the kids are alright. Like the 12 year old owning the My Pillow idiot. Then I hear the horror stories from my school teacher friends.
when will ai replace ceos?
Mark Zuckerberg is not a robot?
good point
Is he fully functional? I have some standards.
Lizardman. Easy to confuse the two as they’re both cold to the touch by default.
What kind of weird conspiracy theory is this?
Lizardmen are not robots.
No!!! They’re useful because uhhmm uuhhhh uhmm uhhhbbh dndusfjduehrhrh
Wouldn’t even need AI, a coin flip would do just as well.
This. ⬆️ 😆
It’s worth noting that the new CEO is one of the few people at Amazon to have worked their way up from PM and sales to CEO.
With that in mind, while it’s a hilariously stupid comment to make, he’s in the business of selling AWS and its role in AI. Take it with the same level of credibility as that crypto scammer you know telling you that Bitcoin is the future of banking.
PM and sales, eh?
So you’re saying his lack of respect for programmers isn’t new, but has spanned his whole career?
Devaluing another person’s labour is what being an executive is all about.
As a wage slave with no bitcoin or crypto: the technology has been hijacked by these types, when it could otherwise have been useful.
I’m not entirely sold on the technology, especially since immutable ledgers were around long before the blockchain, but also due to potential attack vectors and the natural push towards centralisation for many applications - but I’m just one man, and if people find uses for it then good for them.
What other solutions to double spending were there in financial cryptography before?
No idea, I don’t work in fintech, but was it a fundamental problem that required a solution?
I’ve worked with blockchain in the past, and the uses where it excelled were in immutable bidding contracts for shared resources between specific owners (e.g. who uses this cable at x time).
Fully decentralized P2P cryptocurrency transactions without double spending, secured by proof of work (an improvement upon Hashcash), were first done with Bitcoin. The term fintech did not exist at the time. EDIT: looked it up, apparently the first use as Fin-Tech was 1967 https://en.wikipedia.org/wiki/Fintech – it’s not the current use of the term though.
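The core trick is tiny: make it expensive to propose a block and cheap for everyone else to verify it, which is what lets a network with no central authority agree on a single transaction history and reject double spends. A toy Hashcash-style sketch (illustrative only; Bitcoin’s real scheme hashes a block header with double SHA-256 against a numeric difficulty target, not a zero-prefix count):

```typescript
import { createHash } from "crypto";

// Toy proof of work: find a nonce such that sha256(data + nonce) starts with
// `difficulty` zero hex digits. Finding the nonce takes brute force; checking
// it takes a single hash.
function mine(data: string, difficulty: number): { nonce: number; hash: string } {
  const target = "0".repeat(difficulty);
  for (let nonce = 0; ; nonce++) {
    const hash = createHash("sha256").update(data + nonce).digest("hex");
    if (hash.startsWith(target)) return { nonce, hash };
  }
}

const block = "prev: 000abc..., tx: Alice -> Bob 1 BTC";
console.log(mine(block, 4)); // e.g. { nonce: ..., hash: "0000..." }
```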
I guess an additional bonus for crypto would be not burning the planet, and actually having real value, not an imagined one.
Yeah hows that goin’?
It can write really buggy Python code, so… Yeah, seems promising
It frequently does a shitty job of writing docstrings for simple functions, too!
Almost like dealing with real engineers…
It does great with common code patterns that it can just drop in place. 99.9% Artificial, 0.1% Intelligence.
When I last tried to let some AI write actual code, it didn’t even compile 🙂 And another time when it actually compiled it was trash anyway and I had to spend as much time fixing it, as I would have spent writing it myself in the first place.
So far I can only use AI as a glorified search engine 😅
Lol sure, and AI made human staff at grocery stores a thing of the…oops, oh yeah…y’all tried that for a while and it failed horribly…
So tired of the bullshit “AI” hype train. I can’t wait for the market to crash hard once everybody realizes it’s a bubble and AI won’t magically make programmers obsolete.
Remember when everything was using machine learning and blockchain technology? Pepperidge Farm remembers…
It’s the pinnacle of MBA evolution.
In their worldview, engineers are just material, and all that matters in the world is knowing how to do business. So it just makes sense to them that one can guide and use and direct engineers to replace themselves.
They don’t think of fundamentals, they really believe it’s some magic that happens all by itself, you just have to direct energy and something will come out of it.
Lysenko vibes.
This wouldn’t happen if the C-suite weren’t mostly comprised of bean counters. They really think they are to engineers what officers are to soldiers. The issue is, an officer must know perfectly everything a soldier knows plus their own specialty, and also bears responsibility. Bean counters in general have less education, experience, and intelligence than the engineers they direct, and they also avoid responsibility all the time.
So, putting themselves as some superior caste, they really think they can “direct progress” to replace everyone else the way factories with machines replaced artisans.
It’s literally a whole layer of people who know how to get power, but not how to create it, and imagine weird magical stuff about things they don’t know.
I am a black box
Yeah, that’s what I mean. Black boxes are a concept to accelerate development, but we can’t blackbox ourselves through civilization. They are also mostly useful for horizontal, not vertical relationships, which people misunderstand all the time (leaky abstractions).
This actually should make us optimistic. If hierarchical blackboxing were efficient, it would be certain that human societies would become more and more fascist and hierarchical over time while not slowing down in development. But it’s not.
Yeah, I member.
This person memegens
Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn’t exist.
A few weeks ago, ChatGPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways they could not be used.
The funny thing was, it knew and could explain why those functions couldn’t be used when I corrected it. But it wasn’t able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.
Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it’s talking about to the average person.
Basically, AI is currently functioning at the same level as the average tech CEO.
Spoken like someone who manages programmers instead of working as one.
The job of CEO seems far easier to replace with AI. A fairly basic algorithm with weighted goals and parameters (chosen by the board) + an LLM + a character avatar would probably perform better than most CEOs. Leave out the LLM if you want it to spout nonsense like this Amazon Cloud CEO.
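Tongue in cheek, but the “weighted goals and parameters” part really is that basic; a toy sketch (the goal names, weights, and options are all invented for the joke):

```typescript
// Toy "CEO as a weighted scoring function": the board picks the weights,
// the "AI" just multiplies them out and picks the highest-scoring option.
type Option = { name: string; scores: Record<string, number> };

const boardWeights: Record<string, number> = {
  shortTermProfit: 0.6,
  longTermViability: 0.2,
  employeeWellbeing: 0.1,
  publicImage: 0.1,
};

const score = (o: Option): number =>
  Object.entries(boardWeights).reduce((sum, [goal, w]) => sum + w * (o.scores[goal] ?? 0), 0);

const decide = (options: Option[]): Option =>
  options.reduce((best, o) => (score(o) > score(best) ? o : best));

console.log(
  decide([
    { name: "Replace programmers with AI", scores: { shortTermProfit: 9, longTermViability: 2, employeeWellbeing: 1, publicImage: 3 } },
    { name: "Invest in the existing team", scores: { shortTermProfit: 5, longTermViability: 8, employeeWellbeing: 9, publicImage: 6 } },
  ]).name, // with these weights it happily picks "Replace programmers with AI"
);
```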
And no asinine private jet commute required for the AI CEO…
But plenty of electricity still needed.
Probably cheap at the price compared to burning Jet A by the tens or hundreds of gallons.
Not that I am unconcerned about the resource usage. Lesser of two evils.
Good point.
Cheaper too I bet.
Also lol for the AI coder 😁 good luck with that 😂
Worst case scenario the ai fucking loses it and decides to do some wacky but weirdly effective shit. Like spamming out 1 width units en masse in Hearts of Iron 4.
I just want to remind everyone that capital won’t wait until AI is “as good” as humans, just when it’s minimally viable.
They didn’t wait for self-checkout to be as good as a cashier; They didn’t wait for chat-bots to be as good as human support; and they won’t wait for AI to be as good as programmers.
And then we should all charge outrageous hourly rates to fix the AI generated code.
They’ll try the opposite. It’s what the movie producers did to the writers. They gave them AI generated junk and told them to fix it. It was basically rewriting the whole thing but because now it was “just touching up an existing script” it was half price.
They can try. But cleaning up a mess takes a while, and there’s no magic wand to make it go faster.
Yeah they’ll try. Surely that can’t cascade into a snowball of issues. Good luck for them 😎
A strike of tech workers would be something else. Curious what would happen if the ones maintaining the servers for entertainment, the stock market, or factories just walked out. On the other hand, tech doesn’t have unions.
They won’t, and they’ll suffer because of it and want to immediately hire back programmers (who can actually do problem solving for difficult issues). We’ve already seen this happen with customer service reps - some companies have resumed hiring customer service reps because they realized AI isn’t able to do their jobs.
And because of all the theft and malfunctions, the nearby supermarkets replaced the self-checkouts with normal cashiers again.
If it’s AI doing all the work, the responsibility goes to the remaining humans. There’ll be interesting lawsuits once there’s the inevitable bug that the AI itself can’t figure out.
We saw this happen in Amazon’s cashier-less stores. They were actively trying to use a computer-vision-based AI system, but it didn’t work without thousands of man hours from real humans, which is why those stores are going away. Companies will try this repeatedly until they get something that does work or run out of money. The problem is, some companies have cash to burn.
I doubt the vast majority of tech workers will be replaced by AI any time soon. But they’ll probably keep trying because they really really don’t want to pay human beings a liveable wage.
Unexpected item in bagging area? I think you meant free item in bagging area.
Already happening. Cisco just smoked another 4,000 employees. And anecdotally, my tech job hunt is, for the first time, not going so hot.
It’s really funny how AI “will perform X job in the near future”, but you barely, if ever, see articles saying that AI will replace CEOs in the near future.
Somewhere there is a dev team secretly programming an AI to take over bureaucratic and managerial jobs, but disguising it as a code-writing AI to their CTO and CEO.
C-suites are like Russian elites.
The latter are some thieves who’ve inherited a state from the Soviet leadership. They have a layman’s idea of what a state and a country is, what history itself is, plus whatever a taxi driver would say. For the last 20 years they’ve been trying to apply that weird idea to reality, as if playing Hearts of Iron, because they want to be great and to be in charge of everything that happens.
The former have heard in school that there were industrial revolutions and such, and they too want to be great, so they believe every stupid hype about someone being replaced by great new technology, and of course they want to be in charge of that process.
While in actuality, with today’s P2P technologies, CEOs are the ones most likely to be replaced, if we use our common sense. But without “AI”, of course: just by decentralized systems allowing much bigger, more powerful, and more competitive cooperatives than before, ones that form and disband very easily.
Here’s one. And their profits went up when they replaced their CEO https://www.forbes.com/sites/sherzododilov/2024/01/11/can-ai-become-your-next-ceo/
‘Soon’ is a questionable claim from a CEO who sells AI services and GPU instances. A single faulty update caused worldwide downtime recently. Now imagine all infrastructure being written with today’s LLMs - which sometimes hallucinate so badly they claim the ‘C’ in CRC-32C stands for ‘Cool’.
I wish we could also add a “Do not hallucinate” prompt to some CEOs.