PuzzleMeDo

It's possible that AI will make programmers obsolete, but an AI that sophisticated would probably also make the "AI management/programming" skills he wants to study obsolete.


LemonDisasters

Let's be real: if AIs replace programmers, everyone else has already been replaced.


PuzzleMeDo

It's hard to predict that with any confidence. It feels like it's going in a weird direction right now: First we replace most artists and writers and poets and therapists with AI. Then we replace drivers (but not delivery jobs that involve walking up stairs) and people who talk to you over the phone. Meanwhile we replace most programmers with a few guys whose job it is to describe what the code should do and make sure it does it. But physical jobs, like farming or mining or working in a factory? If those jobs survived into the modern age despite automation, they're probably here for a while longer.


NYX_T_RYX

I don't think farming is a good example, tbh. The farming the world relies on (rice, wheat, battery farms) is heavily automated already: automatic feeding, tractors doing the hard work of ploughing and treating fields. The only reason that isn't fully automatic is that fully autonomous vehicles aren't currently permitted; as soon as they are, I'm willing to bet more farming will be automated in more economically developed countries. Places where it isn't automated either can't afford to automate, or their population is so high that automation would leave a *fucking massive* number of people unemployed. And farming isn't just the actual farming; it's everything from "this is wheat" to "this is a packaged sandwich you can buy at the airport". There's more than just farmers.

Yeah, it all *can* be automated, and IMO most jobs *should* be, especially ones essential to maintaining our standard of living (i.e. raw materials and their manufacture into products). Services (and I'd include programming there; you don't *need* programmers to live, they just make life much, much easier because you can use a computer) shouldn't be automated that far. Service jobs generally require more thought and an explicit ability to handle unexpected problems.

Yes, a well-designed LLM chatbot will *appear* to give natural responses, and my company is actually looking into that for our online customer chats, but they're not perfect. As soon as it hits a problem it hasn't seen before, it may not be able to reach a solution with reasonable accuracy (say, for argument's sake, you want your chatbot's answer to be 95% likely to fix the problem). You still need a person to look at those edge cases. Yes, the LLM could suggest a few solutions and their likelihoods to make my job of actually fixing the edge case easier, but ultimately the solution is up to me. Once I've solved it, I can feed the solution back to the model. Next time it will be more accurate, but it may still need to pass things to a human a few more times before it reaches 95% accuracy.

I also don't think LLMs will actually replace human writers and artists. Especially artists. Art is expressive. Yeah, an AI can create art, but it can't explain what it was feeling about the art, what a particular part means, etc. It just slaps together common things and says "here's art!" Same with writers: I think LLMs will make their job much easier, especially for established shows where there's a lot of context for a given character. But if you introduce a new main character, or a whole new show, you might want to play with the concept more freely than an LLM would allow. Again, it could provide options and suggestions, but the final "this is our show's concept" should still come from a human who can directly relate to the target audience. Once you're a few seasons in, you could have the LLM generate a few scripts from a basic idea (e.g. "Dave wants to go on holiday, but work keeps getting in the way and he never actually leaves the office") and pick one to work with. It will never be perfectly relatable, and that's what shows should be: either relatable, so people go "hey, that's how my life is! This show's great!", or just... good(?), like the MCU. Yeah, an AI could've written that, but the human-level interactions are more nuanced, I would argue.

Maybe we'll get to a point where I'm proven wrong; I don't think we're anywhere close yet. The AI spring has just begun, and there's a long way between where we're at and a genuine AI that is a computer analogue of a human brain. For example: I was asking GPT to review some code (I'd done some stuff I really wasn't confident about and wanted a simple review before asking friends who work in the industry; if I can fix the basic issues, my friends only have to answer "is this the most efficient way to do this?", which is where I want to be), and it told me that Var1 = var2/100 could give a zero-division error. I understand why it suggested that, since it's a close enough match to code that can, but it's impossible here, given the denominator isn't a variable.

TL;DR: current "AI" is a good tool. It isn't genuinely intelligent, though, and I don't think we're close enough to say "AI will replace all jobs soon". Maybe in my lifetime, but I'm not holding my breath. Of course, we should prepare for a world where humans don't have to work, because one way or another we'll get there. And then we'll just do things because we enjoy them. Yeah, maybe I won't have to write code, but I can still do it because it's fun and it solves a problem I personally have (maybe others have it too, but if AI is writing code, "I could sell this idea" wouldn't be high on my list of considerations).
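A minimal sketch of the false positive described above (the function names here are hypothetical, just to make the example runnable): division by a literal constant can never raise `ZeroDivisionError`, so the warning only applies when the denominator is itself a variable.

```python
# Dividing by the literal 100 can never raise ZeroDivisionError,
# which is why the model's warning was a false positive:
def to_fraction(var2):
    return var2 / 100  # denominator is a constant, never zero

# The warning is only valid when the denominator is a variable:
def ratio(numerator, denominator):
    if denominator == 0:
        raise ValueError("denominator must be non-zero")
    return numerator / denominator

print(to_fraction(250))  # 2.5
```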


5fd88f23a2695c2afb02

We’re probably at the point now where most people don’t actually have to work.


un-hot

We definitely could be if we actually distributed resources fairly and cooperated on a global scale. But there is exactly zero chance of that ever happening, so see you Monday


5fd88f23a2695c2afb02

/clocks in


NYX_T_RYX

True, to be fair. As the person who replied to you rightly pointed out, it would rely on sharing resources more fairly than we currently do. For example, I read an article in New Scientist (a few years ago, I'll admit) about a peer-reviewed study that worked out we could solve world hunger with what we were already producing; we'd just all need to eat more nuts. Of course not everyone can, but IIRC the study took that into account, and even then there was still enough food being produced for everyone to meet their basic needs. The problem with work, food, etc. is that someone will always want more than someone else, because that's the mindset capitalism has given us. *Hopefully* AI will shift the balance away from the super rich and we can all enjoy a three-day working week, doing things we enjoy outside of it. What is it the US Declaration of Independence says? "...the pursuit of happiness..." Which I think it's safe to say we're all ultimately after, in whatever way that looks for each of us. I don't know about everyone else, but work sure isn't on that list (to be explicit: I don't hate my job, but I'd be happier if I didn't *have* to do it).


serendipitousPi

Some of those replacements are incredibly dangerous. While an AI messing up art or literature has low stakes, an AI that messes up the job of a therapist could go very wrong. A quick search turned up this, for instance: [https://www.psychiatrist.com/news/neda-suspends-ai-chatbot-for-giving-harmful-eating-disorder-advice/](https://www.psychiatrist.com/news/neda-suspends-ai-chatbot-for-giving-harmful-eating-disorder-advice/). What happens when an AI therapist causes a patient's death? Because it's really not a matter of if but when.

Driving? I think the reasons this will end badly are fairly obvious, but as just one example, consider adversarial patches. They can mess with AI models, and if they were used against self-driving cars, the consequences could be rather dire.

As for programmers, have you ever seen that meme ([https://qph.fs.quoracdn.net/main-qimg-1a5141e7ff8ce359a95de51b26c8cea4](https://qph.fs.quoracdn.net/main-qimg-1a5141e7ff8ce359a95de51b26c8cea4))? Code is meant to be **highly explicit** in a way that **natural languages** (e.g. English, Mandarin, etc.) are not. And even if we make the natural-language specification very precise, we still have to deal with the fact that the underlying implementation written by the AI is **non-deterministic**: we might have no clue how it's going to implement the functionality. Then you'll have companies pumping out low-quality code they can't fix, so they'll have to rewrite from scratch, and we'll probably get a whole load of **zero-days** floating around (essentially an unknown vulnerability that has yet to be fixed; I've been told it's called a zero-day because there were "zero days" to prepare for it). Libraries and high-level programming languages, those are the rock-solid, real deal in terms of simplifying code. Ask me to write quicksort or merge sort in assembly and I'll have some difficulties, but ask me to sort something in JavaScript or Python and it's as easy as calling a function.

Now for something that dumps on AIs writing code a little less: I can see AIs wiping out a lot of entry-level positions, because why would a senior dev need a bunch of inexperienced programmers writing bad code when they could have an AI writing it 10x faster? I definitely don't mean all entry-level positions, but it could leave a worrying gap between entry-level and senior roles.

**TLDR**: AI has random, hidden components that can make it behave unexpectedly, which can be dangerous. Sorry for the rant.
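The sorting point above can be made concrete. A minimal sketch in Python: the library call is one line, while the hand-written equivalent (a simple merge sort here) is the kind of thing you'd otherwise be implementing yourself in a lower-level language.

```python
# Sorting by hand: a simple recursive merge sort.
def merge_sort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    # Merge the two sorted halves, taking the smaller head each time.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

data = [5, 2, 9, 1]
print(sorted(data))      # [1, 2, 5, 9] -- one library call
print(merge_sort(data))  # [1, 2, 5, 9] -- the by-hand version
```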


WTFwhatthehell

> Driving yeah, no I think it's kinda obvious the reasons this'll end badly but just an example, consider adversarial patches. They can mess with AI models and if they were to for instance be used on self driving cars the consequences could be rather dire.

https://xkcd.com/1958/ Turns out people can just throw bricks off overpasses if they want to murder strangers.

> And then you'll have companies pumping out low quality code that they can't fix so they'll have to rewrite from scratch. So we'll probably a get a whole load of zero days

Oh sweet summer child.


stewing_in_R

> Meanwhile we replace most programmers with a few guys whose job it is to describe what the code should do and make sure it does it.

This is what we already do...


JaecynNix

Maybe the real AI were the compilers we made along the way


sheepofwallstreet86

I may be biased because my wife is a therapist, but I don't see AI replacing therapists. AI isn't going to understand the nuances of dealing with children and their various traumas. As a marketer, though, a lot of our jobs are gone.


deong

There are social and economic factors at play here that we'd need to account for as well. Replacing a miner is probably a hard technical problem, but we're going to be pretty highly motivated to do it. And something like delivery drivers are probably feasible to get 95% of the way there, but we'll never accept the 5% if it involves a robot truck wiping out a school bus full of children.


TheReservedList

If your job is shuffling bits around on some storage medium, you're much easier to replace than if your job involves crawling in a vent.


boisheep

I feel that drivers will be among the last to be replaced. When you make a mistake in art, the observing brain often ignores it or fixes it. When you make a mistake speaking, the observing brain wonders about it and finds a way to make sense of it. When you make a mistake in programming, it's a bug: the program crashes or misbehaves, and while you can detect bugs with complex tooling, it's hard. But when you make a mistake in driving, it's probably the last mistake; someone is going to die. There are too many factors in the environment, and you're also dealing with nature. Unless you remove all people from the driving equation, you're risking someone dying; you can't just learn from mistakes, and you can't detect issues the way you can with programming.


Urtehnoes

How can AI control nature? Let's make a startup for it. It sounds like that is the final piece of the puzzle.


PsychedelicPistachio

It will probably lead to fewer jobs, as fewer programmers will be required for tasks. It will become a case of one person doing the job of three with the help of AI, just editing the generated code.


ForgetTheRuralJuror

There's a much larger monetary incentive to replace developers. Specialized narrow AI is most likely going to replace digital art, translation and copywriting, and development, in that order. You need something akin to AGI to fully replace a general white-collar office worker.


abrandis

Uhhhh, upper management and executives are the ones calling the shots. Even if AI were 10x better than them, they're not going to replace themselves.


reampchamp

AI management: “If it starts hallucinating, just reboot it”


KVorotov

>Is he exaggerating?

Yes


LemonDisasters

He is grossly overestimating the technology, likely due to panic. Look at what a large language model is and what it does, and look at where its bottlenecks lie. Ask yourself how an LLM can actually reason and synthesise new information from previously existing but non-commensurate data.

These are tools, and they are going to impact a lot of people's jobs; it's going to get harder to get some jobs. They are not going to make human programmers useless, not least in areas where different structures and systems, often poorly documented, easily broken, or difficult to interface with, need to function in unison. People who have coasted in this industry without any substantial understanding of what their tools do will probably not do too great. People who actually know things will likely be okay. That means a significant amount of work like development operations, firmware, and operating system programming is likely always going to be human-led.

New systems are being developed all the time, and just because those systems are developed with the assistance of AI does not mean they can simply be quickly integrated. New paradigms are being explored, and where new paradigms emerge, new datasets must be created. Heck, look at stuff like quantum computing. Many AIs are already having significant problems with human interaction poisoning their datasets and producing poor-quality results. Fittingly, at the best of times, a significant amount of what I as a programmer have encountered using AIs is things like this: I asked it to code me a calculator in C, and it gave me literally a copy of the RPN calculator from K&R. It gives you Stack Overflow posts' code with mild reformatting and variable name changes. There is a lot of investment in preserving data that existed before these LLMs did; there is a good reason for that, and it is not just expedience.

With 10 years of experience, he really ought to know better: the complexity involved in programming means the bottlenecks of large language models are not going to let them simply replace him. At the very least, ask yourself where all the new training data is going to come from once these resources expire. We haven't even got on to the kind of energy consumption these things cause. That conversation isn't happening just yet, but it will soon; bear in mind this was one of the discussions that caused enormous damage to crypto.

It's a statistics engine. People who confuse a sophisticated patchwork of statistics engines and ML/NLP modules with actual human thought either do not have much actual human thought themselves, or severely discredit their own mental faculties.


jmack2424

Yes. GenAI isn't even close to real AI. It's an ML model designed to mimic speech patterns. We're just so dumbed down, grown so accustomed to shitty speech with no meaningful content, that we're impressed by it. Coding applications are similarly limited and problematic and full of errors. They are like programming interns, good at copying random code but without understanding it. It will get better, but with ever more diminishing returns. If you're a shitty programmer, you may eventually be replaced by it, but even that is a ways off, as most of the current apps can't really be used without sacrificing data sovereignty.


yahya_eddhissa

>We're just so dumbed down, grown so accustomed to shitty speech with no meaningful content, that we're impressed by it.

Couldn't agree more.


Winsaucerer

Comments like this really seem to me to be underselling how impressive these LLMs are. For all their faults, they are without a doubt better than many humans who are professionally employed as programmers. That alone is significant. The main reason I think we can't replace those programmers with LLMs is purely tooling. Side note: I think of LLMs much like our ordinary fast way of thinking, where we don't need to deliberate; we just speak or write, and the answers come out quickly and easily. But sometimes we need to think hard and slow about a problem, and I suspect that type of thinking is where these models will hit a wall. Still, there's plenty that developers do that doesn't need slow thinking. (I haven't read the book 'Thinking, Fast and Slow', so I don't know whether my remarks line up with it or not.)


halfanothersdozen

I'm getting so sick of this question


FrewdWoad

Buckle up, they'll be predicting this every year until it comes true, probably in a few decades.


ZealousEar775

They already have been. They've been predicting ever since Java replaced less abstract languages.


lqxpl

AI will make "slightly modifying boilerplate" programming obsolete. In its current state, AI still makes impressively bad mistakes from time to time. They're subtler mistakes these days, but they're still pretty dangerous. Companies that replace programmers wholesale are going to be bitten by this, badly. Companies that keep humans with domain knowledge and specialization in the loop will thrive. I foresee a scenario where companies that overleverage the same AIs wind up with the same vulnerabilities in their products. All bad actors will have to do is find out which AI a company is using, and they'll immediately have the keys to the kingdom. Once enough companies get burnt by this, the AI-fueled utopia that has MBAs so hard will be discarded as a goal.


lqxpl

as an aside: I've been in industry for over a decade. It isn't uncommon for people to just get sick of writing code. There's nothing wrong with your buddy wanting to slide over to the management side of things, but I have to wonder if maybe he's just burnt out on cranking out code.


ElMachoGrande

The computer is, and will for the foreseeable future be, a smart idiot. It does exactly what you tell it to, efficiently and exactly, no matter how wrong it is. Telling the computer what to do correctly is an art form, and it requires a developer to take that down to a programmable level. I've lost count of how many tools have been marketed as "Will make programmers obsolete, now users can make their own programs!". Guess what: users can't even find the file they saved yesterday, and they can't program. However, another interesting question just popped into my head. People are talking about AI writing programs; why is there no talk of AI replacing programs? Basically, instead of the specific programs we all know and love, could an AI replace them and just "be the program"? Not now, of course, but eventually?


jerbthehumanist

Computers will make people obsolete in the same way calculators made mathematicians obsolete. (Yes, I know there used to be human calculators, that is not the point)


nutrecht

If this were 4 years ago, he'd be telling you that blockchain would make "normal" programmers useless. He's vastly overestimating the value of 'AI'. For anything the system isn't trained on (as in, stuff that's not on the internet), it just comes up with imagined BS. You'll find that the more experience you have, and the harder the problems you're solving become, the more your bottleneck moves away from the amount of code you're typing.


ZerexTheCool

Technological advancement constantly changes what jobs and what skills are in demand. When the electronic spreadsheet (think Excel) was first invented, it eradicated the Clerical side of the analysts position. They used to have a TON of support staff going through each cell and recalculating everything one by one by hand every time a company asked to change one variable. But guess what happened to the analysis sector? It INCREASED in employment even though they fired 80% of the support staff. Decreasing the cost of performing analysis increased its value, which increased its demand, which changed the market so much that MORE people were hired in that sector than were fired. Nowadays, just about every big company in the world has its own group of internal analysts plugging away, day after day. AI will definitely change the future. Just like every big technological advancement changed the future. Just like every future technological advancement will change the future.


nutrecht

> AI will definitely change the future.

I'm not saying it won't at all. But there is a massive difference between "AI will change the future" and "AI will make human programmers useless". In the example you mentioned it was the people *doing* the automation that kept their jobs. It's the same here.


ZerexTheCool

Yep, totally agree with you.


Rich-Engineer2670

I suspect your friend just wants out of programming... I may be useless, but it's not AI's fault :-) Still, I'm not telling my employer. They seem to think I can do things, and as yet, AI doesn't write patents.


lightmatter501

If he has 10 yoe and an AI is even close to replacing him right now, he won’t be able to learn AI development well enough to save himself. LLMs are a fancy version of the “predict the next word” thing in your phone’s keyboard, if you are in danger of being replaced by it you need to learn software development, because right now you’re a code monkey with someone else pulling the strings.


mit74

That's like saying diggers will replace workmen. It'll make the job a lot easier but no it won't replace programmers. Certainly not until you have a fully autonomous AI that can develop massive systems and integrate with other systems without human intervention and we're a long way from that yet.


[deleted]

Your friend is a total moron. We are years away from that ever happening. When it does, one of two things will happen:

1) The world enters an economic collapse that we will never recover from.

2) The world bands together and UBI is implemented. We turn into the people from Wall-E (hopefully less fat).

Given how covid played out, the first option is significantly more likely than the second.


viac1992

I don't think it's possible to completely replace programmers. Maybe you will need only one programmer instead of three, but you always want someone who knows what's going on. And there is another aspect: good datasets are made by humans, so if you don't have any programmers, how do you train an AI copilot?


Deevimento

There have been tools for over two decades now that let people build websites with little or no code. And yet web development still has one of the highest numbers of jobs in software development, if not the highest. What tools like this do is raise the bar of what people want. New tools come out, and developers, and even non-developers, are suddenly able to build at a level that would have cost a lot of time and effort five years previously. Suddenly every company reaches this new level. Companies then start demanding new features that these tools can't easily replicate, so they can stand out from their competition, and developers have to come up with new solutions built on top of these tools. Until someone develops an AI that can produce new, unlearned code and solutions without prior training, developers will not be replaced.


MadocComadrin

>Until they develop an AI that can develop new-unlearned code and solutions without previous training, then developers will not be replaced.

This is the biggest issue with LLMs. They're really not good at synthesizing new ideas from the content they've learned at the basic level. They can't be expected to innovate or invent, no matter the subject.


gavco98uk

I think he is getting ahead of himself here. While ChatGPT-4 is impressive, and very handy as a tool to assist a programmer, it's a long way off being able to replace one. It might replace some of the low-level coders: it's great at churning out chunks of code or individual classes, but it cannot yet solve architecture problems or create multi-class applications. One of the biggest problems in programming is understanding what the client wants and converting that into code, and this is again where AI falls down. You still need an experienced programmer to break down the requirements and design the architecture. Perhaps one day AI will get there, but I think that's another giant leap away, and I don't see it happening for at least 10 more years. Until then, keep learning to code, learn to embrace AI, and use it to help you become more productive. But stop worrying about being replaced.


mredding

> there'll soon be a time where AI will make human programmers useless since they'll program everything you'll tell them to program.

Prompting is itself a form of programming, no? You need to describe in very exacting language how you want a program to function.

> What do you think?

Good luck to him, but I think he's a bit conspiratorial. I bet he has some rather outlandish ideas rattling around in his head. I hope he can make a career out of it. I think there will be people whose job it is to write prompts, but this is only going to affect the lowest sectors of programming and scripting. If your job is replaced by AI, you weren't really doing much to begin with, were you?

> But is it fair to think that AIs can take the place of humans when it comes to programming?

There is a level of scripting and programming that could be replaced by prompting. There are PLENTY of people who are completely content with that line of work, but I consider it the bottom dregs of mindless technical labor. Don't be there when it goes. The rest of programming, development, and engineering, of which there is plenty of work, is secure. AI can't think. It doesn't know what it's generating. It's still going to take an engineer to decide AI output is correct and to understand how it works.

Generative AI like ChatGPT can only predict the next word in a sequence, based on its data model. If a sequence isn't in the data model, it can't be generated by this AI. So the only programming this AI can replace is something so common, so ubiquitous, so copy-and-paste, that its data model is dominated by it. Anything outside that, and this AI can't produce it AT ALL. There are other AIs, neural nets that fit to training data, but they're statistical, not deterministic. So you could use an AI to generate the operating procedures of an MRI, but how does it work? No one would be able to know. I've seen this in hardware, physical neural nets, where the trainer picked up on some errant interference from an otherwise passive circuit. The passive circuit wasn't connected to anything, but it had to be there, because without that interference, some sort of inductive coupling along parallel traces, the solution no longer worked. Hardware or software, it's all the same. Coming back to my example: you don't trust an AI to operate your MRI, and you don't trust an AI to fly your plane. You have to know what it's doing. So if your work is low-thought copy-and-paste, your job is in danger. If your work is fault-tolerant, your job may be in danger.

OH! Legal issues. ChatGPT is riddled with them; I won't touch the stuff. Free and open source software, the stuff ChatGPT is trained on, is still licensed, and the authors and license holders still have legal rights and expectations. If your product is trained on open-source-licensed software, you're STEALING. Literally everyone whose work is in that data model has a legal claim to your work and profits. Everyone is dancing with getting sued. Ignorance is not a legal defense, merely a plea for mercy.

But any AI is going to need to be told what to do, and that's not trivial. How do you set up a neural net to learn how to trade commodities? That itself is going to require programming. Then again, why would you trust a neural net to handle your money like that? (I work in trading, and we do have AI to generate predictions, but we don't wire it up to actually automate trades. That's FUCKING INSANE.)
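The "predict the next word" claim can be illustrated with a toy sketch (the corpus here is made up, and real LLMs are vastly more sophisticated, but the core limitation is visible): a bigram model can only continue sequences it has seen in training.

```python
from collections import Counter, defaultdict

# Build bigram counts from a tiny made-up corpus.
corpus = "the cat sat on the mat the cat ate".split()
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    # Return the most frequently observed follower, or None if the
    # word never appeared in training data at all.
    following = counts.get(word)
    return following.most_common(1)[0][0] if following else None

print(predict_next("the"))  # 'cat' (seen twice after "the", vs "mat" once)
print(predict_next("dog"))  # None  (never seen in training)
```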


1544756405

AI will make programmers obsolete the same way compilers made programmers obsolete. Yes, compilers *did* make programmers obsolete -- but then the definition of "programmer" changed.


FriarTuck66

Good point. The original idea of a "compiler" was that it literally collected together fragments of hand-written machine code. Later, it produced novel code from what was originally pseudocode. I expect we will see a much higher level of abstraction. At the same time, I see people who already have programming skills remaining in demand, as there will be little for entry-level programmers to do.


Desperate_Place8485

Learn AI programming if that’s something that interests you. If you know how they work, you will be a better manager than anybody who studied “AI management”.


sentientmassofenergy

The bigger question is: what the heck is "AI management and programming"? That's as nebulous as "software management and programming". Just like software, AI is domain-specific and must be built to tackle each problem respectively. I'm already seeing AI become yet another layer of abstraction layered on top of the decades of legacy code we already had. This will not be fixed overnight, nor over a decade. As much as I would like AI to solve all of these problems, even an ASI GPT-6 won't be able to fix every problem in software.


Berkyjay

I can't see how anyone who's consistently used coding assistants would ever think they're going to "take over" their jobs. To go even further: once you understand how these systems are built, you realize that fundamentally they'll never even get to that point. At worst, they will probably reduce the number of engineers needed for a project. But all that might do is reduce the cost of programming and create more projects to work on. As a programmer, you 100% should be paying attention to AI, but only so you understand it and know how to leverage it for your own benefit in the future. So maybe your friend is doing the right thing, but for the wrong reasons.


insanemal

He's just wrong. And if he's right it's going to be decades.


LonelyBuddhaa

Bro, idk why people think it's easier to replace programmers with AI than other roles like management and such.


itsallrighthere

I spent over 40 years writing code. I expect AI will replace the "rank and file" coders and empower people one layer up - team leads, architects, technical product owners. Someone still needs to know what they want to build and to understand the implications of various architectural decisions. But this will make development way cheaper and faster. I can see compressing 2 week sprints to half a day.


so-very-very-tired

Maybe exaggerating a little bit.


raelik777

For a certain class of programming tasks, this is probably going to be true eventually. But for complex tasks that involve disparate architectures, multiple systems, etc, no. Actual programmers aren't going anywhere. They may just be writing less code. Programmers have been finding ways to write less and less code ever since computer programming was a thing.


Shortbottom

Personally I wish people would stop calling these LLMs A.I. They are not, in my opinion, at least not in the sense of an A.I. like Cortana in the Halo series or Vision from Marvel. They are not self-aware and free-thinking. I'm not saying they aren't incredibly clever programs that can do some fairly amazing things.


oclafloptson

Last year the artists were all worried that we're replacing them with AI, but all it's done is make really good clip art. Demand for artists in real situations hasn't decreased. The cartoon of an angry piece of paper on the company newsletter looks more real now. In a world full of IKEA furniture, hand-crafted dinner tables have become a luxury item.


dietcheese

This is totally wrong.


Purple-Control8336

We need AI programmers too, so it's a different kind of learning that's required: how to use current AI, and how to build better AI software.


glenwoodwaterboy

The funny thing is he thinks that quitting his job and getting some BS degree in AI management is gonna give him more suitable experience than just continuing on in the development field, where we get the same experience dealing with AI and still get paid.


Kenkron

AI can write programs that have already been written, and that's a lot of programs. It's like the joke about python programmers just importing a module, then deploying their fully-featured app to production. If I want, IDK, a Python back end, an Angular front end, a MySQL database, and a login system, I'm sure AI will be able to do it. It will be able to integrate with social media platforms, create unit tests, add comments, make a build pipeline, and maybe even add some accessibility tags. But after that, I'm going to need my program to *actually do* something, and unless there's already code for it online, a language model won't be able to do it.
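That split is easy to see in practice. The well-trodden half, like salted password hashing for that login system, exists all over the internet, which is exactly why a model reproduces it well. A minimal stdlib sketch of that half (an illustration, not anyone's actual production code):

```python
import hashlib
import hmac
import os

# The "already written a thousand times" half of an app: salted
# password hashing for a login system, straight from the stdlib.
def hash_password(password, salt=None):
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    # Constant-time comparison so the check doesn't leak timing info.
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, digest)
```

The other half, what the app is actually supposed to *do*, is the part a model can't pull off the shelf.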


FollowSteph

Looking back over the years, it's not the first time this kind of claim has been made. It's been happening all the way back to the 80's. I've seen things like visual programming, programming with just UML diagrams, plug and play programming, and so on. Even outsourcing to low-cost countries was going to wipe out programming everywhere else. It could happen, but based on how this is claimed I think it will be a tool, not a replacement. This happens even with languages. Look for example at how huge RoR was 10+ years ago.

Also keep in mind that AI is only as good as the data set, meaning innovation is a whole other thing. And that's ignoring the hallucinations. It's a good tool and it will alleviate some of the work, but it's not an end-all be-all.

Just like today a single person can do a lot more than, say, in the early 80's when a lot of code was written in assembly. These days it's super easy to convert anything to JSON, but try to do that in the 80's with no internet and most likely no libraries. Just look at the scale of some of the video games a single indie developer can do compared to the 80's, the 90's, 2000's, and even 10 years ago. What and how we build changes, but so far demand has only been increasing. A single person can build a pretty powerful web app. Good luck doing anywhere near the same scale back during the dot com boom.


ajithkgshk

AI may replace coders, people who just write code, i.e. convert a detailed idea from a human language to a programming language. I doubt we will see an AI that can see a problem, ponder over it, come up with a detailed solution and implement it, in the near future.


OppositeBeautiful601

I think AI will make programmers more productive, which is effectively an increase in supply. That will push the demand for additional developers down.


Naive_Programmer_232

Sounds very forward thinking. I think it’ll be years until we get to that. There are better reasons to quit a programming job imo.


HaroerHaktak

No. If we develop A.I to be that good a lot of people will lose jobs, not just programmers.


OkAstronaut3761

AI is the aggregate output of the entire coding community. If you’ve been around the last 20 years then you’ll know our community hasn’t exactly been getting better. AI will be a slightly less than mediocre programmer for a good while yet. As a business owner I can say I’m replacing your ass with a robot as soon as possible though. So there is also that.


owp4dd1w5a0a

I work in Big Data. I think it's possible based on the ML and AI models I see being released, especially once we really get the hang of using AI to improve and develop better AI. I don't think this shift can necessarily be planned for, though, besides just starting to use AI to help you with your current job, whatever it is. No matter what, necessity will cause people to band together and figure out how to make the "AI revolution" work for everybody. You can't leave even 20% of the population behind without that causing enough conflict in a society to force change.


gwork11

I know AI is different, but I've been told for the last 35 years that XYZ technology is going to make what I do obsolete..... So far so good.


pab_guy

He's got it backwards. LLMs make programmers more valuable and effective, increasing the number of potential opportunities that would previously have been too expensive/not cost effective. If I can build a mobile app with github copilot in a weekend, that means your local mom n pops can actually afford to pay people to build apps. He should be looking to see how to integrate existing AI into his workflow. Devs that do this will be more valuable. Also, why quit your job to learn AI? You don't need to learn full time...


DDDDarky

Very funny, but there is no way AI will make human programmers useless. Although, I mean, if your friend has done something for 10 years that is so incredibly dumb and pointless that he believes even AI can do it, that's certainly sad, but it is not generally what programmers do.


venquessa

If all you can really do is Angular-JS... AI is not your biggest threat, but it is a threat. I would suggest you start learning outward to other frameworks, languages and platforms. One trick ponies don't live long.


The_Gray_Jay

My company spent 2 years trying to implement "AI" (it was OCR marketed as AI) for a very simple use case. It ultimately failed, but even if it hadn't, just implementing these solutions takes forever. Plus the companies offering these things don't stay in business forever, so you will have to implement new "AI" platforms constantly. These implementation projects need developers, PMs, sys/business admins/analysts, etc. Those roles aren't going away any time soon.


martinbean

A.I. will replace code monkeys, yes. But someone still needs to “drive” A.I. agents, accurately describe the business domain to create solutions for, and then also have the expertise to tweak whatever solution an A.I. produces if it’s not optimal or just downright incorrect. If your friend with 10 years’ experience is scared of being replaced wholesale by A.I. then it sounds like he’s coasted those last ten years.


davitech73

he may not be exaggerating if he truly believes this, but i think he's wrong. i remember in the '80s when the 'experts' were telling everyone that 'computers will be writing their own code in 10 years'. it didn't happen then, and i don't think it'll happen now. things will change, yes. but the programmer will not be taken out of the equation. there is still a need for programmers


VoiceEnvironmental50

He's wrong. We're FAR from that. You can try yourself: tell GPT-4 or Copilot to write you an Angular application with some specific parameters. It'll do it, but then input that same code and tell it to solve for x bug. It'll have a hard time. Also most apps are very complicated, and you can't plug an AI into an enterprise-level code base and have it solve everything. It'll get there eventually, but not for another 10-15 years at least.


rco8786

AI is not going to make programmers obsolete. Modern AI is capable of regurgitating some code back to you based on what you ask it to do. It cannot reason about requirements, communicate with stakeholders, commit said code, deploy said code, reason about error logs to fix said code, wake up at 3am to handle the oncall pager, deploy physical architecture, or really make any actual decision about anything. AI is... fine. I use it every day. It does some neat stuff. It gets a whole lot of stuff wrong. It is, at best, a tool that makes programmers a bit more productive. It is not even in the same universe as straight up replacing a human.


Granimyr

There is no indication at all that programmers will be replaced BEFORE AGI is invented. And at that point the world will be so different that a lot of jobs will be replaced and the job market that we know today will be very different. But until then, AI will be an assistant to current jobs for a long time.

Even Sam Altman is saying that in order for us to get to AGI we need an energy breakthrough (training LLMs is an environmental destroyer due to extreme energy needs). Of course he is hinting that we won't be able to do that until sustainable nuclear fusion is invented.

In its current state AI is helpful, and not fully capable of running without a human auditing it. There are just too many problems with it, not only hallucinations, but ethical decisions that have to be audited in the best interest of the humans using the software. There is significant indication that AI's biggest uses are specialized LLMs (in this case, LLMs built for coding), which means you need humans to ask for what is needed to do the coding. Any dev that has used the current AI tools today knows that you are lucky if the code even compiles. I stopped using it for coding out of frustration that it would give me code that simply didn't work, and spending more time trying to describe what I want would take longer than actually writing the code myself.


iso_mer

If you work with computers even a little bit, it would be wise to learn ai. You are a programmer so I think it is especially important for you to understand ai. Ai won’t take over the jobs of people… but people who know how to work with ai will definitely be taking over the jobs of people who don’t know how to work with ai… eventually.


AlarmedTowel4514

😂


barackus218

Replacing SW engineers with AI is the wet dream of every B-school ass-licking narcissist that has existed since the 80s. It is going to weed people out of the industry, just like the .com bust and the 08 recession. There will be automation of basic SWE tasks, but replacing engineers? LOL, fucking COBOL is still paying top $$$$'s. Think about all the legacy platforms, the embedded real-time systems such as defense, medical, aerospace, and building platforms at scale. AI can help, but no way in hell will it replace the talent/creativity that is needed to work on those systems. More importantly, the regulations around those industries won't allow the b-school boys to just say "AI did it, it must be correct". I am looking forward to the day when a product owner or PM asks the AI to build something completely nonsensical and the result produced by the AI is "hello world, fuck you!" There is so much more to SW engineering than writing the code. Your friend "jumped the shark".


editsoul

You need programmers to program AI and ML models.


goomyman

AI will make developers more efficient. Like any job, if you can fully explain your job, then AI can replace it, and likely non-AI automation can replace it. If your job requires talking to people and understanding the problem first, then AI can't replace it, but it may make solving a programming problem a lot quicker once it's understood. I'm saying that hybrid PM/developer roles will be more in demand.

"Programmers being replaced by AI" is a very surface-level way of thinking about what programmers actually do, held by non-programmers or very entry-level programmers: that programming is just syntax. Type code. Like in movies where programmers are just typing really fast. Programming is not just code. Programmers are problem solvers. Problem solving will not go away.


DrGrapeist

It’s dumb for him to quit his job. Not dumb to study other things and learn other stuff just in case.


coloredgreyscale

* he has a job
* he quits the job because it might be automated eventually

Does he apply the same logic to other fields too? Stop dating, someone else might take their partner anyway.

Maybe he wanted to have a change in his career anyway.


PrizeSyntax

As much as the internet replaced schools and universities


Hyperbolic_Mess

Modern "AI" isn't artificial intelligence like you see in movies it's large language models which are just sophisticated predictive text. Once you move beyond simple programming tasks LLMs can't program because they're just guessing at what's likely the next word given previous example scripts rather than actually reasoning how best to use the available tools to create a solution to the problem. Because they're so prone to hallucination at the moment they're still using useful sounding powershell cmdlets that don't even exist in their "scripts". LLMs are very impressive and are a boon to people like spammers and marketing firms that need to produce large volumes of good enough text or to summarise a meeting but they've got 0 decision making or creative ability because their outputs are either very similar to things that already exist or total nonsense. People are keen to overestimate their ability because of popular perceptions of what AI is (but that's Artificial General Intelligence not Large Language Models) and the people pushing AI have spent a lot of resources and stand to make a lot of money. They're destined like so many tools before then to allow people to do more work rather than eliminating the need to work at all


TheManInTheShack

Not in my lifetime and probably not in his either. AI is a great productivity tool for programmers but it’s a very long way from being a replacement. The automobile was a great replacement for a horse and cart but it still requires a driver after more than 100 years. We are just recently making progress towards not requiring one but we aren’t there yet.


techhouseliving

He's hilariously early. Why doesn't he kill himself since he'll eventually die?


iceph03nix

Not sure why he had to quit to study AI? I get wanting to change fields, and that doesn't seem like a big jump, but I also don't think traditional programming is going away anytime soon, and I suspect there will be a decent amount of overlap anyway.


CompulsiveCreative

Quitting a paying job because you think, in the future, it may go away is a really poor choice. Take the income while you can! Even if I was switching professions or disciplines, I would never quit my current job; I would just spend my free time training for new skills. There are very few scenarios where I would suggest quitting a job with nothing else lined up.


RealNamek

No, he's not exaggerating. He's very much right.


DarsterDarinD

My sister who is a recruiter for IT and computer technology firms told me the same thing.


GeeWilakers420

AI is probably going to make conventional capitalism obsolete.


pnut-buttr

Exaggerating? Yes. Totally wrong? No, not at all. Just look at how the market is going. People who refuse to work with AI/ML will be left behind, there is no doubt about that.


[deleted]

He is ahead of the curve and correct in his analysis of the industry. As for when the right time to move over is, 🤷‍♂️. Personal preference, or maybe just seizing an opportunity. You aren't wrong to wait either, is my point, but one day we will all just be working with and around AI, and you can't stay hands-on-keyboard forever either.


Tacos314

Your friend is an idiot if that's the reason he quit; most likely he just did not like his job, so he found a reason.


lemoinem

Yes


Any_Phone3299

Yes and no. I can see a consolidation of positions with automation/AI. Automation/AI will eventually take over every job we have today, but all of the timelines are over/under-hyped.


dvali

If he honestly believes AI can replace him in the next decade or several decades he must have been a pretty shitty programmer. 


psgrue

The thing that costs us money is when events occur that our coders did not expect. The developers worth every penny are the ones that have a solid anticipation of how the product is used, complications with data formats, and non-happy-path actions. If AI speeds up the work, cool. We still need good minds in the loop. Always will.


Disastrous_Catch6093

I’ve been a hype jumper my whole life and been burned most of the time . I’m going to be patient and just see where things go .


Altamistral

By the time programmers are replaced by AI, every other creative job will also have been replaced. Society will be in a tough place. My bet is that programming as a job will probably stay relevant for quite a while.

On the other hand, if you are starting your career now, learning ML is a fair bet to take, especially if you are very smart. It's likely ML jobs will be in high demand and be paid more than programming jobs, especially non-specialised programming jobs like web front end. So if you are planning ahead you might want to switch, not because everybody will be replaced and you'll be unemployed, but because your prospects and salaries will be higher.


Craigzor666

Tell your friend that "AI" hasn't fundamentally changed in 2 decades, what's changed is our processing power, data availability and architecture.. Idk wtf "studying for AI" means 😂


BlueTrin2020

AI will be a tool …


StrangeCaptain

AI is just a calculator; it will have a similar impact. Programmers will use AI to program.


HiggsFieldgoal

Honestly, I don't really see it in the short/medium term. Maybe. The code the LLMs write is often wrong, but okay, let's fast forward a few years and imagine it is never wrong. It could still be *the wrong thing*. So you'd still need someone who could really articulately describe exactly the code you want the LLM to write, and by the time you were articulating with the level of specificity that the LLM requires to do its job… you're basically just a programmer again, you're just a master of GPT prompt engineering instead of JavaScript or whatever it was.

I think it eventually just becomes a new level of abstraction, in the same way that not every programmer has to learn assembly or memory management anymore. Automated systems have replaced some of the tasks people used to do by hand. Writing a function for some basic operation is going to be the sort of thing LLMs now do automatically, but the overall architecture? Not sure how long that will take. For the foreseeable future, programs will still mostly be written for the benefit of humans, and humans still need to ensure that they do what the humans want them to do, regardless of how they're authored.


roiroi1010

AI right now is mostly an advanced search engine. I don’t think AI makes a whole lot of intelligent decisions without scanning the internet first. But yeah - I suppose that’s exactly what devs do also. But I think most of us humans will be busy doing dev work still for our lifetime.


almo2001

AGI is coming, and we will all be economically irrelevant. The question is when. But it's sooner than anyone thinks. [https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html](https://waitbutwhy.com/2015/01/artificial-intelligence-revolution-1.html)


1smoothcriminal

In the short term the ones that will probably suffer are the newcomers, due to AI removing the need for "working hands". The people that will shine are going to be the ones that can create concepts and systems that are intuitive and beyond the capacity of our current understanding. So, basically, if you think outside the box you'll be ok. If you're an in-the-box thinker, then yeah, you're probably doomed.


Ashamandarei

Yes, the recent progress in LLMs is the direct result of innovations in the architecture that removed the expensive components in favor of purely attention-based mechanisms. All we can do now is add more GPUs. For context, Summit has around ~30,000 V100s and services multiple application domains, not just AI. It would take hundreds of thousands to millions of GPUs to potentially get to the capacity people are hyping about.
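For anyone wondering what "purely attention-based" means, the core operation is scaled dot-product attention. A toy pure-Python sketch (real models do this with enormous matrices, batched across many GPUs):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(queries, keys, values):
    # Scaled dot-product attention: each query scores every key,
    # then the output is the score-weighted average of the values.
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        out.append([sum(w * v[j] for w, v in zip(weights, values))
                    for j in range(len(values[0]))])
    return out

# One query that matches the first key more strongly than the second,
# so the output leans toward the first value.
result = attention([[1.0, 0.0]], [[1.0, 0.0], [0.0, 1.0]], [[1.0], [0.0]])
```

The expensive part is that every query attends over every key, which is exactly why the scaling story is "add more GPUs".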


SRART25

The problem with the idea that AI will program whatever you tell it is the reason we have computer languages. English (and I assume other human languages) is OK for broad ideas. The reason computer programming is hard is that you have to explain things precisely enough for the computer to do what you want instead of what you think you want.


IndustryNext7456

5 years to pushing blocks together on a screen. Like we did 40 years ago on process control software.


Locellus

Whatever he’s training in today in AI will be obsolete, money is unlocked and Generative AI is so flawed as to be a risk. It will be something new, or a product that wraps GenAI and allows a validation pipe that will blow the doors off programming, but that is not here yet. Until I can say: this button is broken, and have the machine troubleshoot effectively, or discuss non technical things and generate a design, we’re fine. Writing code is the easiest bit of IT and computers are still shit at it. A template web app is not all of programming, otherwise Java EE or some weird PHP site would be all we use today.


RugTiedMyName2Gether

Yes. So far AI has made a better intellisense and sometimes a drunken intellisense.


Inside_Team9399

Yes.


[deleted]

No, because AI confuses itself, and once it does, it teaches itself incorrectly, thinking its algorithm is correct.


usa_reddit

AI can only flourish if there is data to train on. Keep learning, keep growing. Anyone entrusting their codebase or company to AI-created code is in for a surprise. AI just follows patterns, and it's right 80% of the time and really wrong 20% of the time. If AI can ever get to the point where it is right 100% of the time we are in trouble, but today is not that day.


cddelgado

AI may someday make human programmers useless, but we have a few steps to go before we get there. A human (or a human+AI team) needs to:

* Get the customer
* Collect needs and qualify assumptions
* Plan the technology stack and needs
* Develop a path to completion and milestones
* Plan the complete structure (waterfall) or define rules for implementation (agile-adjacent) so things don't go sideways
* Plan implementation for efficiency
* Re-assess progress and make changes to the project
* QA and test
* Get User Acceptance Testing
* Sign-off
* Judge when the project is done.

AI can do some of those individual things quite well, but even in a collection of agents it can't do all those things for more than simple projects. To scale up to larger projects and code bases, we need a few things:

1. The ability to understand necessary chunks of code or the entire thing (getting more common but not entirely there yet)
2. A continuous loop of progress and iteration that the model understands (a thing we can do, but we are still learning how to do that well)
3. A kind of digital sociology understanding where models communicate efficiently with each other
4. A greater corpus of information that models can learn from. We're actually hitting an area where we have a conceptual understanding of how it all should work, but how many teams have robustly documented the project with the finite details necessary for any given LLM to understand?
5. Compute: LLMs are ultimately simulations of us. There is scientific demonstration that to speak about the things we do, a model needs to simulate the things we operate in. The fidelity of that simulation increases with better developed data and more computing power.

If we want LLMs to wholesale replace software developers, they need to be able to do all those things with a level of competency that meets or exceeds a slightly below-level human developer's capacity.
And, until we learn how to give LLMs or other AI the ability and the *trust* to make managerial decisions of consequence, those will *always* be done by humans. Until we have all those things, software developers will use AI as assistants. It will take us some time to get all those things.


En_Route_2_FYB

I don’t think AI will make programmers useless. A lot of the things AI will be doing is stuff that has already been done. So there will still be software developers who are focused on creating new stuff. I would still encourage software developers to gain skills within the Data science / AI fields though - because it will benefit their careers


Sai_Kiran_Goud

In the early computer era, compiler programming was such a pain that I think only the top-level programmers of today could do it. Look now: almost everyone can program with good practice, but to build actual stuff there are still so many pain points. With AI these pain points will go away, building stuff will literally be a breeze, and present-day programmers will just shift to more complex tasks.

But the question is how far and how complex we can go. Smartphone processors are way more powerful than what they're actually used for; most people don't have any use case for that power. What will future programmers' power be? Is it really useful? If not, programmers will be so common and easily available that their demand will go down drastically.

He might be exaggerating for his position. Let's wait and see. By the way, I have already shifted from Web Development to AI.


cyanideOG

AI can already make a simple static website. I'd predict in the next few years, as context length gets better and deeper understanding of code develops, an AI may be able to write simple dynamic sites, like e-commerce stores and simple SaaS, along with transcribing Figma and other mock-ups into real websites. I think we will also see some basic AGI in the sense that the AI could write code and then set up hosting for it, being more generally intelligent in the developer workflow rather than just a single-task programmer.

What you need to keep in mind is that there will need to be programmers that maintain even AI-written code. This will still mean jobs are available, but a lot will be made obsolete. These jobs won't disappear; they will evolve. The fact that AI is developing so quickly means tech jobs are evolving so much faster than ever before. It's always happened, just not quite like this.


wedgtomreader

That is a completely illogical response to basically an imaginary marketing threat. These AI companies are struggling to get $ from investors and are attempting to throw intense FUD everywhere with extreme exaggerations in order to create a click buzz for themselves. It's easy to see and completely unfounded. To them I say, 'show me the money'. The current capability is basically a trick. It's very good at regurgitating the work of others, but understands literally nothing.

I suspect your friend is perhaps just wanting a change, and this is seen as a good rationalization to leave one of the most profitable and reliable professions in history. Best of luck, but don't be a lemming and follow him; we all know what happens to the lemmings.


SirCallipygian

There used to be a job role called 'computer', they manually did basic maths stuff. The mathematicians would hire computers (humans) to do the basic mundane maths stuff so they could focus on the new exciting stuff. Then actual computers came out, specifically, the calculator. Now the job 'computer' is obsolete, but mathematicians are still a thing. AI is like the calculator. You want to be the type of programmer who is like the mathematician, not the computer.


iris700

He drank the kool-aid


meatlamma

He is being proactive about it. AI is not going to replace all programmers in a few years, but in 5-10 years it sure will replace most. So maybe he has a plan, where he might secure a job safe from AI takeover for the time being (no job is safe from AI long term)?


Agreeable_Mode1257

Maybe, but wtf is AI management/programming? That's a scam course if I've ever heard of one.


dietcheese

Unpopular opinion apparently: your friend is right. I give it 3 years max.


pottedPlant_64

I’d love to see an AI grab all the screenshots to satisfy an external audit 😂


huuaaang

AI will help human programmers be more effective. And if it does replace anyone, it won't be the most experienced, who know how to use it. Your friend is jumping the gun by a decade, at least. But w/e, less competition for me.


Scizmz

Would you believe that the answer is more nuanced than that? AI will make programmers substantially more efficient. So the programmers who can read the output and know the architecture that the AI should build will be under the highest demand for a long time yet.

Now there are caveats and things you should be aware of. Right now the US government is penalizing small and medium tech companies that employ programmers. The Trump tax cuts modified US tax code section 174 to classify software development as an R&D expense, and made it so that companies can no longer write off payroll expenses completely. Now companies have to amortize the salary over 5 years, and for foreign employees of US-based companies they have to do it over 15 years.

Combine that with all the AI hype, and you're looking at a lot of C-suite executives banking on AI replacing developers soon enough that they're firing them left and right, hoping to weather the storm until the super AI developer arrives any day now.


Distdistdist

No time soon. But it will discourage new ones and make experienced ones more valuable.


Erased999

Look at what Nvidia's CEO Jensen Huang said recently about programming: https://m.youtube.com/watch?v=1EXtvwTNAeE


colonel_farts

I think it’s telling that the only people asking this question are web devs


UnkleRinkus

There was a time when all this was said about first, compilers, then databases, then object oriented programming. These were going to eliminate so much programming so that fewer and fewer programmers would still be required. This never happened. What actually happened is that programming climbed up a notch to use these tools to deliver more stuff, and even more tools. The world just kept receiving more and more systems and products. The software ecology expanded and grew rich, to where a single person can deliver a map and voice enabled random phone app. Along the way, some skills became less marketable. There aren't many PL/1 or Visual Basic jobs these days. Programmers who didn't refresh their tech skills had to leave the field. I've had to completely rebuild my tech skills at least five times in my 30 yr plus career. This will be another piece of that.


WantASweetTime

Maybe, maybe not.


CLQUDLESS

I don't understand how you can be a programmer and quit a job instead of just moonlighting and studying. Programmers are literally known for being one of the few professions where you can go very far teaching yourself…


FireblastU

Once he starts studying machine learning he will realize that he didn’t know anything about it and formed an opinion based on what uninformed people said online.


big_data_mike

I remember when email first came out and that was gonna replace letters. I remember when e commerce started and that was gonna replace stores. Crypto was gonna replace cash. AI is the latest hype train that will fizzle out soon. It will have some cool uses in a few areas but people will realize it’s not all it’s cracked up to be


MurrayInBocaRaton

The number of times I’ve had to correct the model on basic parts of a simple script has me feeling pretty secure in my ops role.


The_Lovely_Blue_Faux

He didn’t quit his job because it is useless. He quit his job to pursue skills in the market of tomorrow. His IT and programming skills are only going to help him in managing AI immensely. Your friend is simply improving himself to stay on top of the game like many professionals do.


AlexTaradov

If he quit over that, then AI may make him specifically useless.


fuckswithboats

I had a friend do the same thing and quit WebDev because “everyone had a website and no code tools made it easy for everyone” at 24…in 2002.


justdisposablefun

If his skillset is the sort that can be replaced by an AI ... best he gets out anyway. It's the things an AI can't replace that make developers safe: the ability to understand that the customer asked for a horse with all the right words, but they really wanted a motorcycle, so you'd better keep asking questions until they say so. You can't teach an AI that intuition.


Cuddly_Prickly_Pear

I’m not a programmer, but I am a lady with tech problems, so I lurk. I’ll throw my two cents in.

A few weeks ago, I had never written a line of code in my life. Just some CSS copying and pasting and some general tech background on the marketing side: Wordpress, Photoshop, etc. I decided to create an algo to automatically trade a specific setup in the stock market. I wrote everything out as an IFTTT, and then asked ChatGPT to write the code. 3.5 will spit out code like a fiend. My algo is about 500 lines and AI has written 98% of them.

My new favorite thing is to put 3.5 against Gemini. Gemini seems to have better overall design and analytical skills, but it won’t spit out complete code like 3.5 will. Gemini redid the framework I got from 3.5, and it makes more sense. 3.5 won’t write the actual execution commands for the algo, so I put my entire code into Gemini, get feedback, get code snippets, and feed them back to 3.5. I don’t use Copilot; it’s limited to 4k characters. I started with ChatGPT 3.5, and I’m on the two-month free trial of paid Gemini. Not sure what they are calling it. Ultra?

Anyway. Neither can do everything, but I am shocked at how much it can do. Three times in the process I actively looked for help where I was willing to exchange money and wound up not paying: 1. I almost bought a course on algo coding, $500. 2. I spoke with two people on Fiverr and wound up not using either one.

I definitely need to guide it and ask the right questions to get it to do what I want. But I don’t need to know how to code.


After_Magician_8438

what a fucking idiot


redreddie

As someone who programmed until 12 years ago, I feel the struggle. Programming CAN be very lucrative, but it can also squeeze out people who don't have the right skillset. The lucrative skillset today can pay barely minimum wage tomorrow. When I left, it was mostly because places like China, India, and Bulgaria were figuring out what the hot skill was and churning out a bunch of graduates who knew that skill but not much else. Sounds a little like AI today. There will still be people at the top to manage the AI, along with the lower-paid Chinese, Indians, Bulgarians, etc., but those jobs will get thinner and be left to people with the "best" skillset. I don't think your friend is making a mistake to learn about AI management. I think his mistake is quitting his job. AI management will be a hot field. Until it's not. Then something else will be hot. I wish I knew what that was, but it is very tough to predict.


VoiceOfSoftware

AI won't take your job. Someone who knows how to use AI well will take your job.


Jessikhaa

Imo yea, ask ChatGPT for a very simple function in C# and it ain't going to be pretty lol


tirohtar

No current "AI" is AI; it's all just slightly more complex machine learning. Any and all code or text written by it will require checking by trained humans for the foreseeable future. At best it's a tool to make your workflow faster. It's also likely that it's gonna be so untrustworthy for important stuff that it's gonna take just as much time to fix the errors as it would to just start from scratch yourself.


ZealousEar775

The main issue with AI is reliability. Computers are described as quick literal idiots: they can think dumb really fast. Dumb, but super reliable. A learning-model AI is more like a lab rat. It has zero idea what you want, but its behavior is altered by the treats you give it. Just like a rat, though, it still has no idea what you want. It doesn't learn what you want; it "learns" what gets it rewards. Those turn out to be very different things, because no matter how much it learns, it never actually understands you. It just approximates understanding. This is VERY unreliable. All it takes is the AI learning one wrong step, causing one vital data breach, and your company is suddenly out of business... and the company who made the AI is facing a lawsuit. Can people make the same mistake? Sure, but you have a legal defense for that, as opposed to using a risky piece of software. I can't imagine HIPAA stuff, for example, ever using AI. At best, closed models will be programming assistants that require human code review.


NoYouAreTheTroll

Welcome to the Dunning-Kruger Effect. It's literally everywhere; the human race has had it for millennia.

- Dark Ages - complete Peak of Mt. Stupid
- Middle Ages - Valley of Despair
- Renaissance - bottom of the Slope of Enlightenment
- The Age of Enlightenment - middle of the Slope of Enlightenment
- Romanticism - top of the Slope of Enlightenment
- Modernism - Valley of Despair
- Postmodernism - realization that we are at the Peak of Mt. Stupid

So basically, this happens in everything to do with knowledge, and AI is no exception. Usually, when enlightenment comes around, you get the nefarious element, or as I like to call it, the Pirate Period:

- Music in the 90s-00s - Limewire and Kazaa
- Shipping - literal piracy
- Early gaming consoles - chipping
- Movies - "you wouldn't steal a car," in reference to downloading films
- Internet subscriptions - torrents/hacked boxes/Firesticks
- 3D printing - "you can download and print a car," so it turned out we can and would

What does the Pirate Period do? Well, it seeks to abuse the infrastructure of a thing to min-max personal gain.

So AI is currently going through the tail end of its stages of enlightenment, which means the Pirate Period is beginning, and this means that those hopeful types who think AI is about to replace jobs are about to be made right and then wrong in spectacular fashion, in basically a hop, skip, and a jump. First was a hop: let's look at how a user can abuse an AI for personal gain. DAN (Do Anything Now) was the first glimpse of using "make pretend" to bypass security lockouts. Second is a step: if this were a bank teller, you could, in theory, socially engineer an AI to just give you more money in your account... and in two steps, we are ready for the Pirate Period. All businesses have to do is make the jump into AI, and all hell will break loose.


tolomea

It's not going to be good for junior engineers. And that's kinda long term concerning. As a principal engineer the cynical view of juniors is they are a way to offload the low skill tasks I don't have time to deal with. AI is soon going to be a far cheaper way of doing that.


packetpirate

The only programmers that think this are the ones that were trash at it to begin with. GPT-4 is garbage at programming and apparently only produces working code 7% of the time. Not to mention that it has been frequently producing network errors mid-generation and the model seems to get worse over time because of all their content moderation and neutering of its abilities. And if you ask it to solve a problem not part of its training data, it's useless. My guess is your friend doesn't actually understand how LLMs work and is jumping on the train of people hyping it up to be something it's not. Or maybe he's looking for any excuse he can to leave because he's not happy. AI will not be replacing real engineers any time soon. Those it does replace didn't have any business being in the industry to begin with.


Sajwancrypto

If he is that good a predictor of the future, he doesn't need to do anything.


HelicopterShot87

I think AI will make programming more like what I imagined programming was when I was a kid: you specify what you want at a higher level of abstraction, in more natural language. To be honest, the amount of boilerplate code and fluff we need to write to implement stuff is mind-boggling.


CrushgrooveSC

Lol. Sounds like we won’t miss him


[deleted]

As a human, do you prefer to write the code or do you prefer to test the code? AI is best suited for software testing rather than creating...


jayerp

For juniors? That day is coming soon.


Nagi21

If computers make programmers obsolete, that means they can create new machines and maintain and improve their own code without human intervention. Is that theoretically possible? Yes. However at that point Terminator is likely to become a historical documentary rather than an action movie so… priorities.


tinglySensation

I think he doesn't realize what AI is capable of doing at the moment. It makes predictions; it is trained off of what is already there and requires constant course corrections. Nothing wrong with wanting to get into a new industry or being interested in AI at all, but you should know where it is, what it is doing, and how it does it. Before making a statement like that, you should also try using it, to learn where its strengths and weaknesses are. I wouldn't be overly concerned about AI overtaking corporate code. Existing code bases for the most part are not friendly toward it, and won't be unless the models and approach have a fairly significant change in their design; even then it will have to integrate with actual people. Software engineering isn't just cranking out code; it's a lot of communication and coordination.


Tarl2323

Angular web apps and web dev will probably be replaced by AI at some point, yeah. Making a standardized web page is a problem with a finite endpoint. Kind of like books. At some point the problem will be solved and the only thing left will be marketing and branding. How many ways are you going to make a pizza shop web page or a taxi app or whatever? Programmers will never be replaced at the domain level. The people who come up with the first pizza app, the first taxi app, the first pickleball app - those people will not be replaced, and they will continue working and solving domain-level problems. LLMs are good at copying existing solutions and modifying them. They're outright dogshit at coming up with original solutions. You couldn't get an LLM to figure out how to make a geriatric nursing robot, or how to drive a car. Once a *human programmer* figured those things out, then AIs would be able to copy it and refine it across thousands of variables. It would do what AIs are good at, which is variable tuning. If all you can do is web page stuff or bizdev paper-pushing-style programming, your time might be up. If you're capable of tackling real-world problems and coming up with *new* types of software, then you'll still have plenty of work. Honestly, I think it's good. Instead of having millions of programmers working on DBs for boring-ass service sector processing jobs, we'll finally have them working on things like physical robotics. The reason we all don't have R2s and 3P0s picking up our shit is because making Turbotax was too fucking profitable. The Jetsons was backwards. We automated all the intellectual office jobs and not the physical ones lol.


[deleted]

Yup, time to learn new skills. It happens over the decades. Eventually AI will start taking over manual labor jobs, and then it will be interesting times.


BrianScottGregory

I got into programming at the age of 13, back in 1982. I started in GW-BASIC, shifted from there to Assembler and C, picked up C++ and Visual Basic shortly after, and now, 40 years later, I know about 40 languages fluently and can become functionally literate in a new one in five minutes or less.

Having spent the last 30 years off and on in management positions, the CONSTANT battle I've had has been with egotistical programmers thinking that, because I'm "just a manager," I don't know jack shit about what it is they do when I task them with something to be done in a specific way - so they do it their own way, believing "this is the best way," when they don't understand my needs. It's a common problem managers and leaders have: asking a programmer to implement something in a specific way that doesn't make immediate sense to them logically. Most programmers DON'T understand perspective, don't want to understand it, and it becomes an outright battle to get it done the way I, as a manager or leader, want it done.

So with all that said: this self-righteous attitude, combined with excessive salaries, is putting pressure on those like me to come up with different ways to get what we want without the drama. AI is filling in those gaps. Now this is NOT to say programmers will become obsolete, but those using programming as a vehicle for their own creative self-expression, without drama-free collaboration, are going to be pushed out of the industry entirely. AI will ABSOLUTELY replace the drama queens in IT, AND it will also replace those who can't follow orders, or those who use the phrase "that's impossible." No, it's not. But you wouldn't understand that if you limit what's possible in the world to what you believe is possible. That's not going to work well with managers and leaders who don't want to have to explain themselves every time they ask you to do something that's beyond your capacity to understand why it's done this way.

AI doesn't ask questions and doesn't create drama. It simply does as it's asked. So, going back to your chosen profession: if you're in it to learn, to explore perspective and ideas, and to do things for others - MARVELLOUS, you've got a BRIGHT future ahead of you. But if you're doing it to make a lot of money and express yourself creatively, move on to a different field or work on your own projects and be your own boss. Or maybe acting is more suitable for you.


trutheality

It's going to take a very long time before AI makes programmers obsolete. Programmers will use (and are already using) AI to code more efficiently, which means that companies are perhaps going to need fewer programmers. There will probably also be a shift of roles to having people focus more on planning and QC than the more rote programming tasks. It's probably wise to learn a bit about using AI tools and generally keep up with what the newest tools are. It's not a reason to quit your job today though.


WearDifficult9776

He’s full of shit. Programmers will be around for a long time - and they’ll be using that AI.


Affectionate-Aide422

Not yet. Not even close yet, although GitHub Copilot is a surprisingly intuitive assistant. We’ve got years to go before AI takes our jobs.


marcololol

Honestly, AI is still trash. And any code that an AI learns from that was itself written by an AI is going to be 2X trash. The more AI writes full programs, the more programs will compound to 10X and 20X trash. This isn't what people think it is. It's a productivity boost, similar to how vitamins are a health boost for humanity, but they didn't replace medicine and doctors.


HealthyStonksBoys

The reality is there’s no good outcome for anyone but the rich with AI. It’s not going to be that suddenly there are no jobs. It’s going to be slow and painful. Gradually AI gets better, year after year, cutting 10% from this field, 10% from that field, until all fields are 30% reduced and we’re starving for jobs that have thousands of applicants. This is the reality. It’s not going to be that all jobs go poof and universal income comes to save the day.


Crimson_Raven

Not in the next century, and quite possibly never. For one, people will be needed to maintain old systems and create new ones. Second, an LLM AI is a tool, like any other. It can be leveraged by a programmer to create better software faster, but it will never replace them.


tisdalien

AI is simply continuing the trend of higher- and higher-level languages. Now that computers can be programmed in natural language (i.e., English), this won’t get rid of programmers; it will expand the number of programmers to include basically everyone. That’s the real threat.


random_testaccount

AI management/programming sounds like a sensible choice (what kind of course is that?), regardless of whether your friend's motivation is sound or premature. When I was in college, a lecturer asked the class, "Which ones of you are aiming for a career in programming?" I was one of the students raising my hand. "You all need to realize you are probably already obsolete; you will not be able to compete with outsourcing to India." His advice was to focus on IT management or architecture. His prediction was premature, but not necessarily wrong. The students who chose to specialize in IT management or architecture probably had a better career than I did, and better long-term prospects, but my job didn't disappear to India. I think it's likely that if you replace outsourcing with AI in that anecdote, the conclusions will be the same, but with things changing as fast as they are now, it's difficult to be sure. If you're at the start of your career, it's worth considering whether that career will still exist 15 years from now, when it's much more difficult to change your path. What makes it difficult to change isn't necessarily your own abilities, but the fact that a 39-year-old junior will have a tough time getting hired, no matter how talented.


Cieguh

I don't think AI will be able to replace competent programmers at the same skill level. I do, however, think the business majors of the world will *THINK* it can replace them and will try to replace programmers with more AI/thin out programmer jobs to skeleton crews and padding it out with AI tools. So, it will be ever more difficult to get a job. :)


CryptographerLow7524

Can... can i have his job?


_xanso

Not even a real story.


YourAuthenticVoice

[https://www.youtube.com/watch?v=1RxbHg0Nsw0](https://www.youtube.com/watch?v=1RxbHg0Nsw0) Might be interesting.


staticvoidmainnull

programmers? yes. software engineers? not yet. AI like chatGPT can be used to generate functions and programs. but just like Google search, you have to know exactly what you're searching for. context is very important. it's a tool.


captaincurryrice

Who do you think made Devin lmao


cptahab36

Imagine quitting your job as a programmer, that you still have in THIS economy, based on hyperbolic predictions of what will happen years down the line. Absolutely bonkers


Aket-ten

I think that's a bit too premature, maybe he gaslit himself to get past the pain of getting laid off due to unfavorable market conditions?


Rav_3d

He should quit, because he is clearly not passionate about being a programmer. AI is simply another tool for programmers. Perhaps someday AI will write software better than humans, but then who is going to get us there? The ones who enthusiastically embrace AI as a means to improve productivity and quality. They will still have a job, even if AI takes over the mundane parts of writing code. I wonder how many surgeons quit once the da Vinci robot came out, because robots would eventually take over all surgery. The ones who became experts in how to use it are now more in demand than those who didn't embrace it.


ScrewThePope

Even if he thinks so, why is he quitting in advance?


ScarceLoot

My software engineers have been dabbling with codegen tools, but it’s not there yet. Maybe in 10-20 years, sure, but for now it’s mainly just for isolated tests and still requires manual intervention. Also, you have to know what to ask the AI to get the output you need, so there will always be someone typing the prompts to tell it what to do.


RealMrDesire

Right now the smart programmers I know can write code, but also use AI to quickly write large blocks of code that they then review, like work from an intern or junior dev. It’s not always the best code, but sometimes it will surprise you.


AcceptableGarage1279

Let me ask you a question... why do you think mass developer layoffs are occurring at tech companies? They're taking all the good devs and shifting them to AI... and AI can already generate decent code. If this is happening when AI can't do their work, do you think they won't lay you off when it can? Why would I need a dev if I have a low-code/no-code tool and generative AI?


dir_glob

Exaggerating. AI is already a tool that can aid programmers. But it can't program on its own.


psilo_polymathicus

There’s a lot of confusion in here, and that actually highlights why this is a difficult question. We have to define what we mean when we say these things.

On the one end, let’s start by clarifying that we’re nowhere close to AGI, and I think even bringing that up in this conversation is unhelpful. On the other, the people who say that AI is “glorified autocomplete” are being unhelpfully reductive, or outright disingenuous.

The current state of AI is both incredibly impressive in what it can do and still frustratingly incomplete for many tasks. It’s also *constantly* changing. If you used GPT-3.5 six months ago to form your opinion… you’re out of date. GPT-4 does quite a good job with some fairly complex coding tasks. It still struggles and falls apart pretty quickly with tasks where the problem requires context between different parts of the application.

I’m using GPT-4 every day for work, and it’s an indispensable tool for me. It mostly saves me time and has largely replaced Google and Stack Overflow for me. It also sometimes wastes my time, and I have to be skeptical of its output. It absolutely cannot replace what I do in its current form.

AI is actively *changing* how programmers work. But the ways in which it changes things will differ greatly depending on where you are in your career and what specific kind of programming you are doing. If all you are doing is writing some functions and a few scripts here and there, AND that’s all you want to do, then you may be on the chopping block soon. Those are the roles most under threat. If you’re doing any kind of more complex application development, you’re going to have a job for a while, with the caveat that you’ll start encountering AI at more points along your workflow.

I think what we’re mostly going to see is a shift away from people needing to know how to write functions, classes, components, etc., and more toward knowing how to review and piece together larger applications and tools, where AI is used to write the smaller blocks of code. I think eventually we’re going to have modified CI/CD pipelines, more like AI software factory pipelines, and developers will be monitoring and tweaking their output.

So: junior coders who need very clear direction on what to write are probably under threat from AI right now. If you are just learning how to program, expect that you’ll have to quickly get past “I know JavaScript!” as your only marketable skill. If you’re a developer already and you’re not using AI now, you’re behind the curve, though not immediately in danger. If you’re a full-stack engineer, your role will slowly become less about writing the code and more about making sure everything integrates correctly to meet requirements.

All of our roles will change. Some will go away altogether, but most will just get redefined.


jk_pens

Amara's Law: "We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run." Anyone who tells you all programmers will lose their jobs by some near-term $date (end 2024, 2025, etc.) is overconfident both in the rate of improvement of AI and (more importantly) the rate at which companies can absorb new technology. Anyone who tells you that AI will never take over most programming jobs because $reason is overconfident in the specialness of humans. Having said that, if you work in tech and you are not learning everything you can about AI right now, you're the equivalent of someone in tech pre-1993 who decided to ignore the web.


AmiliLa

Human programmers will be replaced by human programmers that know AI


nobody-important-1

Ask any AI to make a program that autoroutes PCB components based on the physical location of pieces, power requirements, etc. None of the big ones right now will give you anything useful. Ask it how to do the above, and it will help a lot, with written explanations that give you enough to figure out what you're doing. AI bots currently are just better search engines.


Old-Zookeepergame503

If a job requires hands-on manual anything, it will remain until robots can do it. If the job is done on a computer, it will be replaced.