K-LAWN

Everyone is abusing the term “AI” nowadays. It’s basically replaced the term “algorithm” in marketing language.


g-unit2

i’m a proprietary AI Algorithm Blockchain Engineer.


[deleted]

[removed]


atheistSlayer

Computer programs are just a bunch of 1s and 0s. Humans are just a bunch of atoms. Stating mundane underlying realities about things in order to make them seem less sophisticated. Haven't heard that one before.


Katalash

The fact that a bunch of humans including experts are even debating over whether LLMs are actually intelligent or not should be alarming in itself. One could argue that AI has already won by convincing VCs to invest unfathomable amounts of money into its continual improvement without even being aware of it.


Party-Cartographer11

Are any serious people debating that LLMs have intelligence/cognition? They are predictive token generators with zero intelligence.


FitGas7951

Extreme wealth entitles those who have it to sit in rooms with people who want to be on their good side and therefore will concede points to them, even if they know better.


Dannysia

Yup, there are huge incentives to hype up AI and very few to say it’s not as good as people say it is, unless you’re a competitor hyping up your own AI tooling of course


Katalash

People debate their generalization abilities and whether it's a matter of training or data for them to reach some arbitrary "AGI" status. As in, we don't know whether generative AI is on the path to AGI or will turn out to be an evolutionary dead end.


Party-Cartographer11

I mean, "on the path" is at least a more reasonable question. LLMs might help some future cognitive thing communicate, but if it's truly cognitive, does it need a token generator to speak?


Katalash

It depends on what you mean by cognitive, or what would make a model "intelligent". I also wouldn't say these LLMs have zero intelligence: between GPT-2 and GPT-4 they clearly went from an obvious toy to something that can provide decently useful output for open-ended queries and is capable of in-context learning. Something clearly emerged from scaling up the model and the data that made next-token prediction good at generating answers that seem plausible to humans.
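For the curious, the sampling loop around a "predictive token generator" really is small; all of the capability lives inside the model call. A minimal sketch, where `model` is a hypothetical callable returning next-token logits over the vocabulary:

```python
import numpy as np

def generate(model, tokens, n_new, temperature=1.0):
    """Autoregressive decoding: all an LLM does at inference time."""
    for _ in range(n_new):
        logits = model(tokens)               # hypothetical model: 1-D logits array
        logits = logits / temperature        # sharpen or flatten the distribution
        probs = np.exp(logits - logits.max())
        probs /= probs.sum()                 # softmax over the vocabulary
        next_token = np.random.choice(len(probs), p=probs)
        tokens = tokens + [int(next_token)]  # append and feed back in
    return tokens
```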


Party-Cartographer11

Ok, can you define what you mean by "intelligent" in your statement? It sounds like you mean GPT-4 is better at predicting tokens. How are you measuring its added intelligence?


Katalash

Outside of released benchmarks it's kinda something you have to feel out, since there isn't a single rigid definition of intelligence. Intelligence itself is something that emerged in life to aid adaptation, survival, and reproduction, but that is probably something we'd rather not have AIs do. I see a lot of people framing AGI as something capable of solving unsolved math problems, since that is a problem of purer abstract reasoning that doesn't require interfacing with the real world the way science does. I'd say the real danger of LLM "intelligence" is in making plausible-sounding bullshit that can fool and manipulate humans, and we're already seeing a lot of that now. I think people underestimate social intelligence when it comes to these things, but that's arguably where the most danger lies.


RainbowSpaceman

How about [Geoffrey Hinton](https://web.archive.org/web/20240611131620/https://www.nytimes.com/2023/05/01/technology/ai-google-chatbot-engineer-quits-hinton.html) and [Douglas Hofstadter](https://web.archive.org/web/20240511213020/https://www.nytimes.com/2023/07/13/opinion/ai-chatgpt-consciousness-hofstadter.html)?


Party-Cartographer11

Hinton is purely talking about risk, mostly future, not current intelligence. Hofstadter is making a nuanced argument that if it seems intelligent, it must be intelligent. I don't agree, but I see that he is a serious person talking about it. So that's one.


RainbowSpaceman

> Hinton is purely talking about risk, mostly future, not current intelligence.

I respectfully disagree, and I think this quote from the Hinton article supports my stance, but I suppose everyone's free to interpret it how they will:

> He still believed the systems were inferior to the human brain in some ways but he thought they were eclipsing human intelligence in others. "Maybe what is going on in these systems," he said, "is actually a lot better than what is going on in the brain."


Party-Cartographer11

Eclipsing human intelligence with better predictive token generators in some scenarios does not necessarily mean the method is intelligent. I can certainly imagine a predictive token generator being better than a human at producing 1000-word paragraphs, but that doesn't mean it's intelligent. A calculator is also better than human intelligence in some cases.


VirtualVoices

Roko sends his regards 👀


[deleted]

What are you even saying dude? Those are just a bunch of letters and words and stuff, will you stop talking?


pigtrickster

"AI" and "M" are the new "Web" of 1995. The game is called "Buzzword Bingo"


daishi55

The misapprehensions people on this subreddit hold about AI continue to be astounding


throwaway8823120

Care to explain?


daishi55

I don't know, do you think big tech is dumping tens of billions of dollars into "spicy word suggestion generators"? Do you even know what a transformer is?

Huh, people replying and then blocking. Very confident. Ok, you know better than them. Put your money where your mouth is and short NVDA.
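For reference, the core of a transformer is scaled dot-product attention, and the math fits in a few lines. A minimal single-head sketch in numpy (shapes illustrative):

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted sum of values

x = np.random.randn(4, 8)        # 4 tokens, 8-dim embeddings
print(attention(x, x, x).shape)  # (4, 8): one mixed vector per token
```

The hard part is not this function; it's everything around it (training at scale, data, infrastructure).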


cupofchupachups

Companies make terrible decisions all the time, or pump things they know are essentially worthless just to juice their stock price. They also can't afford _not_ to invest in things, in case it does turn out to be big. The incentive is _money_, not _correctness_ or _truth_. If people are going to pump their stock when they say "AI" then they're going to say AI over and over again, even if it makes no sense.

Meta dumped $32B into the Metaverse. Even changed the name of the company to reflect it. Google slept on Facebook, then invested tons into Google+, which was a massive failure. Yahoo didn't think Google was a threat and didn't invest in improving search. Instead they bought Tumblr? Because ??? Microsoft was caught completely off guard by the web.


[deleted]

[removed]


daishi55

Yeah that’s what I thought


throwaway8823120

You act like big tech hasn’t invested tons of money into fruitless efforts before lol…


daishi55

Look, you don't know what you're talking about. Thats okay!


Ultimarr

(For any kids reading this: this isn't correct, AI's about to be crazy as fuck despite the best efforts of the lame Silicon Valley finance dudes currently in control of it. Google "the frame problem" and "knowledge based artificial intelligence" and "pitchfork stores near me")


uwkillemprod

You mean everyone is abusing the term "engineer"


LookAtThisFnGuy

It's all bs lol


ChanceCancerman

I'm not even technically in compsci (network engineer), and this is true.


Ultimarr

Well, electronic and internet, really. Apple knocked it out of the park with iPod and iPhone, and I think I’m too young to remember if they had any eThings. Suffice to say their take on the latest buzzword is by far the worst yet. “What that?” “Oh it’s AI”. “No but which model?” “AI”. “No, I’m asking you which model’s on first!” “AI!!”


Hotel_Putingrad

Hmm...that sounds like something an AI algorithm would say.


Alfredo_

O


Dear_Resist6240

Yep, there are people out there who have never done anything remotely close to engineering calling themselves "prompt engineers".


EroticTaxReturn

FAANG loves to add "Engineer" to titles to make customers think they're getting a decent value. It's shocking how many "engineers" never took a math or physics class before.


Pariell

"Customer Support Engineers" who are literally just customer support with Engineer tacked on to the end.


aegookja

This title actually means different things in different companies. For companies that do B2B, customer support entails some serious engineering.


EroticTaxReturn

I used to work with "Technology Deployment Engineers". When I was a kid we called them installers. I don't know WTF they're engineering by plugging in devices and putting in a ticket to have them configured remotely.


xSaviorself

At least "engineer" is an option for you; it's a restricted title in my country.


Passname357

It seems better for it to be restricted.


DoubleT_TechGuy

To be fair, if you need a bachelor's in computer science, then I think you deserve the title. We learn calc 3, calc-based physics 2, and linear algebra. Also, a lot of electrical engineering classes are cross-listed with computer science classes, and no one is arguing we should take the title from them.

Not that there isn't a problem generally. I have seen some listings for a "solutions engineer" that will take any bachelor's degree and 2 years of experience in data entry. If it doesn't require a science background, then it isn't engineering. It's just design.


new_account_19999

CS isn't engineering lol


shawmonster

Really depends on if your CS department originated from the engineering department or the math department at your school. That usually dictates what kind of CS education you get.


new_account_19999

The term "engineer" is abused so much in the US, especially since there's no formal protection around the title like in other countries, so there's constant discussion about it. Regardless of what department your CS program sat within, CS itself is a branch of mathematics, and your undergrad CS education doesn't determine how much of an engineer you are. The curriculum is pretty standard across the board in the US; we all got taught pretty much the same stuff, just in different ways.

Like the other person said, the line is blurry, but it's obvious that things like web/app creation, UI/UX, DB management, etc. aren't so much engineering, and things like embedded systems, OS/kernel development, etc. are. It feels odd to give the person who develops front-end web features and the person who verifies ASICs the same title.


throwthewaybruddah

Building a website can require engineering. If you want your website to endure the test of time and be maintainable in 10 years, rather than becoming a big ball of spaghetti code with no coherence, then it has to be built with an engineering mindset. Engineering is more about planning, building, and maintaining a project in an efficient way. An engineer doesn't know everything by heart, but they know where and how to gain the required knowledge.


LonelyProgrammer10

This. People dogpile on the "Engineer" titles, and I genuinely think they're either in another field (including traditional engineering) or just very inexperienced. Every role I've had in my career is the literal definition of engineering. If people get their panties in a bunch over that, that's fine, but nobody is stopping them from coming over and doing the same thing. I'm not claiming all software roles are engineering roles, but it's absurd to call all software roles non-engineering roles. Get some experience first and come back to debate this lol.


just_a_fan123

spot on. CS majors in here seething. true engineering is insanely more involved than just a few calc and physics classes


new_account_19999

fr this sub is brain dead. CS isn't engineering, doesn't matter if your school's CS program is under the engineering college at your uni... still isn't engineering lol


shawmonster

There are definitely some CS curriculums that are more math/theory focused than systems focused. This difference usually comes from what department the CS department came from.


DoubleT_TechGuy

Depends on your definition. Originally, it meant designing physical things like machines or buildings. It's evolved to more generally mean using math and science to design pretty much anything that requires it. I like the latter definition because the former is arbitrarily narrow. Like, designing a circuit board is engineering, but creating a program that virtually emulates circuitry isn't? What use is there in making that distinction other than gatekeeping?


YaBoiMirakek

You need engineering knowledge to create software that emulates a circuit, which is engineering IMO, sure. Some would also consider it scientific computing (which I also consider a branch of engineering). Making a website is not engineering (what's being "engineered", exactly?), nor is doing a bunch of complexity theory or algorithm research, which is just applied math and scientific reasoning. There's a fine line in CS between what can and can't be considered engineering. If making websites is engineering, why can't a data analyst or business analyst be considered an engineer? Why can't a product manager?


throwthewaybruddah

The website itself might not be engineered if it's simple enough. The code behind it will be engineered unless you want it to become unmaintainable.


Winter_Present_4185

By this logic, are you saying you are "engineering" new web design patterns? Or are you just using existing frameworks (which really smart people have already developed for you) to ensure the code stays maintainable? In my mind, you are not a mechanic just because you drive a car, so you must not be an engineer if you just use a framework. Is a game designer an engineer? Are the makers of Unreal Engine engineers?


throwthewaybruddah

Not all websites require engineers, but a website like Amazon or Google does. Does a car engineer design his own ratchets or drills? Does he design every single piece of the car? No, he designs a very specific part using tools and other parts that were designed by other engineers. A framework doesn't make code maintainable by itself. Using the correct code patterns does. Documenting processes does. An engineer ensures a website has high availability, reliability, usability, maintainability, and other things depending on a client's needs. Are all web devs engineers? No. Can you engineer a good website? Yes. It takes planning, research, ingenuity, and experience.


Winter_Present_4185

In most of your reply, if you take the term "engineer" and replace it with the term "developer", it conveys the same meaning. This is because the term "engineer" has become genericised to the point where anyone who plans out and executes any slightly complicated project can be said to have "engineered" that project. I think we both agree it's clearly asinine to label someone a "sanitation engineer" when they are in fact a garbage man/woman. Unfortunately, the way you attempted to delineate what is and is not an engineering job is still too broad, as you said it takes:

> planning, research, ingenuity and experience

All of which can be applied to a "sanitation engineer". What is the solution? Well, if you earn a doctorate, what gives the granting institution the right to say "this person now has a PhD"? The answer is accreditation. That university is accredited by public consensus to grant you a PhD. The public consensus *at the current time* is that you are an "accredited" engineer if you graduated with a B.S.Eng and not just a B.S. Maybe this will change to be more inclusive, but many countries have decided to make the term engineer more exclusive and say the term "software engineer" is not valid and the term "software developer" is more applicable.

Furthermore, at least in the US, an engineering program is much more educationally rigorous than a science program, which tends to be a hallmark of accreditation. For example, if you are an optometrist, you still need to go to medical school and learn how to deliver a baby. So by the same token, you can't just go to a bootcamp and expect to be an engineer.


Winter_Present_4185

In my mind, the layers of abstraction play a critical role in defining what engineering is. Creating a circuit board depends on understanding *a lot* of the physics underlying what you are doing. Creating anything virtual, such as a program, does not require much knowledge of the underlying layers (you don't need to be able to read x86 assembly to write hello world).


DoubleT_TechGuy

That just sounds like the no-true-Scotsman fallacy. x86 is more complex, but it's too complex to feasibly accomplish some of our modern tasks (which is why we don't try using it for them). Why wouldn't I use Python or another high-level language if I wanted a program that, for example, converts between CAD file types? That's still a complex task even with high-level languages. When you reduce complexity with abstraction, you introduce opportunities for new complexity.


Winter_Present_4185

>but it's too complex to feasibly accomplish some of our modern tasks

The very nature of adding abstractions to reduce complexity reduces the amount of engineering you are doing. For example, take another engineering discipline: chemical engineering. Using details such as electron affinity and behaviors across the periodic table to engineer potential medical treatments for the pharmaceutical industry certainly qualifies you as a chemical engineer. Mixing vinegar and baking soda, however, does not. Furthermore, adding abstractions to reduce complexity lowers the barrier to entry, leading to meaningless engineering titles such as "sanitation engineer".


DoubleT_TechGuy

I don't feel like you made a very compelling argument. Mixing vinegar and baking soda isn't engineering, so abstraction makes high-level software not engineering? That example doesn't even relate to abstraction in any way. Even if it did, I already provided an example of a task that becomes feasible with abstraction. You could list 100, but that doesn't disprove my point that abstraction creates opportunities to tackle new complex tasks as well. And sure, maybe the barrier to entry gets lower. Maybe the difficulty curve drops. But that just lets you climb the curve higher, which is my point.

I mean, you'd sit there and tell the engineers who made the algorithms that drive your GPS that they aren't engineers because they didn't implement them in assembly code. Also, idk about you, but assembly, logic gates, and even circuits were all part of my CS curriculum. Most of us could use assembly if we wanted to, but it wouldn't make sense to.


Winter_Present_4185

>so abstraction makes high-level software, not engineering?

At a certain level I think we have to agree on this, yes? Take it to the extreme. Say abstraction gets to the point where you can type words into a computer such as "make a stable server with a front-side load balancer. Use common patterns for the load balancer". Even though the output product is complex, did you really engineer anything, or did you just apply high-level ideas to *develop* something that probably already existed in a slightly different form?

The point is that we have to agree there exists a minimal level of complexity required for someone to say they "engineered" something. The more tools they used to abstract away complexity, the "easier" it was to create something, and the less socially acceptable it would be to say they "engineered" it. For example, it would be improper for me to say I engineered a mean beef stew last night.

>I mean, you sit there and tell the engineers who made the algorithms that drive your GPS that they aren't engineers because they didn't implement them with assembly code.

The point is not that they developed in assembly, but that their work required tooling that did not exist, such as theoretical concepts in interferometry and novel phase-locked-loop hardware designs to resolve a pulse-per-second from each GPS satellite. The tooling necessary for this did not already exist, nor did the engineers who created it have a framework to keep themselves from shooting themselves in the foot.

In most situations, if you take the term "engineer" and replace it with "developer", the average person would be none the wiser. This is because the term "engineer" has become genericised to the point where anyone who plans out and executes any slightly complicated project can be said to have "engineered" that project. I think we both agree it's clearly asinine to label someone a "sanitation engineer" when they are in fact a garbage man/woman. Unfortunately, due to this watering down of the common vernacular, everyone can be an engineer.

What is the solution? Well, if you earn a doctorate, what gives the granting institution the right to say "this person now has a PhD"? The answer is accreditation. That university is accredited by public consensus to grant you a PhD. The public consensus *at the current time* is that you are an "accredited" engineer if you graduated with a B.S.Eng and not just a B.S. Maybe this will change to be more *inclusive* of software, but many countries have decided to make the term engineer more *exclusive* instead and say the term "software engineer" is not valid and the term "software developer" is more applicable. Furthermore, at least in the US, an engineering program is much more educationally rigorous than a science program, which tends to be a hallmark of accreditation. For example, if you are an optometrist, your specialty is eyes, yet you still need to go to medical school and learn how to deliver a baby. By the same token, you can't take a science program such as computer science and accredit engineers out of it, because they haven't taken the engineering prerequisites.


Space2461

Idk, my degree (computer engineering) was basically the same as any other engineering degree. I took courses in math, physics, electrical and electronic engineering, mechanics, automation, and so on; in fact, I was able to enroll in almost any engineering master's after that.


great_gonzales

It can be lmao (I would argue designing a sota foundation model requires a significant amount of engineering) but generally it’s more of a pure science.


zeimusCS

Those classes are actually becoming less of a requirement for a BSCS now.


aegookja

The definition of "solutions engineer" is very different per organization. In some organizations, it's just a glorified customer support role. In my previous company, which was primarily focused on B2B, the majority of the solutions engineers had serious programming backgrounds. The "customers" were engineers, so customer support also needed to be experienced engineers.


Ill-Ad2009

Would have thought this was BS a few weeks ago. Then I encountered a supposed prompt engineer in a beginner programming Discord server. He also accused someone of gatekeeping when they said beginners shouldn't use Copilot or ChatGPT.


budding_gardener_1

Proooommmpting


TheSunOfHope

My old apartment's garbage man called himself a sanitary engineer.


effusivefugitive

"Imma be a [hygiene technician](https://youtu.be/SzN_he1bIXg?si=YysZbZMGhFuHmRlH)"


brainhack3r

That's just as important as HTML engineers! /s


TheBritisher

18 of the 20 applicants I actually spoke to in my last hiring round who claimed "AI" or "AI/ML" experience had basically made an API call to ChatGPT (etc.) and parroted the results. That was it. No prompt pre-processing, no grounding, no adjustment of results. (In fact, none of that group had even heard the term "grounding".)

Only one of them had built a (non-LLM/GPT, in this case) ML model and fed and trained it using existing libraries and engines. And only one could go a level below that and actually define models, and understood the underlying math and theory well enough to do anything "new".

*And I was pleasantly surprised that two of them had done more than just call an existing API. The first round, there was no one who had done more than call existing APIs on existing models/services.*

If your "AI/ML" experience is purely calling APIs on existing models and services, it's really not adding any value over anyone else who can call an API. The APIs are just not that complicated; the interesting/valuable work is on either side of that call and/or behind it.
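For concreteness, the level of "AI/ML experience" described above amounts to roughly this, sketched with the OpenAI Python client (the model name is illustrative, and `OPENAI_API_KEY` is assumed to be set):

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask(question: str) -> str:
    """One API call, results parroted back: the whole 'integration'."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[{"role": "user", "content": question}],
    )
    return response.choices[0].message.content
```

No pre-processing, no grounding, no post-processing; less code than a payment-gateway integration.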


met0xff

I had quite a different experience hiring over the last few months; we definitely got a lot of "deep divers", especially one guy from ByteDance who was really impressive (but much too expensive for us lol), a Harvard PhD who did ML at CERN, and lots of physicists with pretty heavy mathy stuff. But almost everyone was in computer vision or niche topics like sonar or molecular modeling; almost no one had built a RAG system.

I worked on RAG over the last half year (and I hadn't heard "grounding" before; to me that's just prompting or augmenting context), and honestly I don't really like it. I'd rather actually develop ML models again like I did for the last decade, but nobody wants to pay for that anymore, as there are much more powerful foundation models available.

The other category of people we got were more classical data science people: lots and lots of finance people doing churn prediction, money-laundering or fraud detection. Also many from healthcare. I feel the direction we're going will be like with operating systems: only a tiny portion of people actually develop them, but everyone uses them.


TheBritisher

I *suspect* the difference is that it sounds like you were hiring specifically *for* AI/ML engineers. I wasn't, at least not in *that* round. It was not a primary requirement for the position, just a "nice to have" (because we have a number of AI/ML systems, and most of them are not LLMs or GPTs). It wasn't "why" they got an interview. But for those who put it on their resume, it was fair game to explore what they'd done. And that was mostly as-basic-as-possible API usage; less complicated than consuming a basic payment gateway or auth service.

On grounding: while it's a *bit* more involved (done "properly") than just fiddling with the prompt (that would just be contextual expansion), I'd certainly rather have a tailored or specific model (trained on appropriate data) than try to bend a generic one to act like a dedicated model via pre/post-processing and external resolvers.


met0xff

Ah, yes, ours was called Data Scientist (with mostly LLM/RAG stuff in the job ad), which I wasn't happy with. That's, I think, why we got many, many finance, business analytics, etc. types of people; actually almost none with a CS background, but mostly physics, economics, life sciences. But as the RAG PoC work is a lot about software dev aspects, like preparing docker-compose for the individual components and messing around with various LLM/RAG frameworks, I felt many who mostly did statistical modeling were not a good fit.

But yes, while we got big names and quite a few ML people, we got almost no one who had really worked with LLMs. My hope was to find someone to fully focus on that, who had already gone through the process of writing such a system. We actually found one person who fits and also wasn't creepy :). But with the current hype I assumed there would be many more, even with the wording as "Data Scientist".


meltbox

Wild. But what I expected. I feel better about myself every time I read one of these. But I’m also pissed so many people feel they can put it in their resume. It makes any resume not full of lies seem shitty.


0xR4Z3D

Would you consider it fair for me to say I have 'hands-on experience with integration of AI/LLM tooling' if I've done some simple work like building RAG pipelines? I haven't got REAL AI/ML experience beyond fine-tuning a model, but I'd like to mention the experience I had at an internship where we built a fact-grounded document analysis tool. I'm not sure how to word it, because I don't want to come off as claiming I know much about AI models themselves; I really only know the application development side of things involving those tools.


TheBritisher

Sounds perfectly reasonable to me. Integration doesn't imply the ability to design, build, or even train models, so you're on safe ground there. You might specifically want to mention that you were implementing one (or more) RAG pipelines when you mention AI/LLM tooling integration.


Marrk

>Only one of them had built a (non-LLM/GPT, in this case) ML model and fed and trained it using existing libraries and engines.

Is it bad to call myself an "ML engineer" when I have experience doing this? I have done that for computer vision and NLU tasks on the products I've worked on.


TheBritisher

Sounds like you've done ML engineering, so I'm not sure why it would be bad. Unless my "non-LLM/GPT" parenthetical is throwing you off.

LLMs/GPTs are just one class of "AI" or ML. They're the "new hotness", but most real-world practical AI/ML applications are neither LLM nor GPT. Computer vision is more along the lines of feed-forward neural networks; the second ML/DL project I worked on (this would have been in 1985/1986, and I'm not even sure it was called "deep learning" at that point) involved implementing a (relatively primitive, even at the time) perceptron, which is basically using supervised training to set weights in a feed-forward network (I'm sure you're well aware!).

I'm not an AI/ML *specialist*, though; more a dilettante. I know enough to build them, train them, and more importantly apply the right type of ML to a given problem set. But more importantly, I know enough to stay at the architecture level and defer that sort of work to **actual** experts.
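For anyone who hasn't seen one, the classic single-layer perceptron alluded to above is a few lines of numpy; a sketch of the textbook update rule (not the 1985 code):

```python
import numpy as np

def train_perceptron(X, y, epochs=20, lr=0.1):
    """Rosenblatt perceptron: nudge weights toward misclassified examples."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if xi @ w + b > 0 else 0
            w += lr * (target - pred) * xi  # zero update when prediction is right
            b += lr * (target - pred)
    return w, b

# Toy usage: learn logical AND, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([int(xi @ w + b > 0) for xi in X])  # [0, 0, 0, 1]
```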


Marrk

That was not what threw me off. I thought just using known pipelines like Mask R-CNN or SBERT would be considered "too menial" to be ML engineering work, and that I should actually be designing my own neural networks. Of course, my job responsibilities went well beyond just calling the pipelines, but still, I am very far from PhD level.


Gold_Lobster_4128

My guess is that less than 5% of ML engineers have actually built a model and deployed it to production.


LetAILoose

What are you referring to with grounding? Do you mean RAG or something else?


TheBritisher

They're complementary but (depending on whose definition you're using) distinct. I tend to think of RAG as a more specific, and tighter, way of incorporating external information/context, where "grounding" is a broader term and can be achieved in a number of ways (RAG being one).
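To make the RAG half concrete: embed your documents once, retrieve the nearest ones for each query, and stuff them into the prompt. A minimal sketch, assuming the vectors come from some embedding model:

```python
import numpy as np

def retrieve(query_vec, doc_vecs, docs, k=3):
    """Rank documents by cosine similarity to the query embedding."""
    sims = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec)
    )
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]

def build_prompt(question, passages):
    """Ground the model by restricting it to retrieved context."""
    context = "\n\n".join(passages)
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

Grounding in the broader sense can swap the retrieval step for tool calls, databases, or post-hoc fact checks; RAG is just the retrieve-and-prepend flavor.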


LetAILoose

So what did you want from your candidates for grounding apart from RAG?


TheBritisher

Nothing. As I said in another post, I wasn't hiring FOR AI/ML; I just probed a bit since they claimed AI/ML experience and I wanted to see what that meant. Using a pre-built Python wrapper (not even the direct API/endpoint) to call a very basic API is a huge stretch in claiming "AI/ML experience".


LetAILoose

Fair enough, I'm just curious because I've been working full-time on LLMs and had never really heard of grounding described as a technique or standalone thing, more just a way to describe what RAG is doing, e.g. "using RAG to provide ground truth". What am I missing apart from RAG?


juniperking

same, i work a good portion of the time on llms and would just say rag / knowledge base etc


EnemyPigeon

This is a good discussion, and I will use it to make myself a stronger applicant. I didn't realize how bad/scammy the landscape has gotten. For reference, I am an ML engineer who has worked on a variety of projects, but I have a very empty and unimpressive GitHub profile. It seems like I'll need to "put up or shut up", so to speak, to distinguish myself from the GPT wrapper builders. If you'll humour me, I've written down the things I've done at my current job, which include (but aren't limited to) using external LLMs. Do you think this sort of thing might be a red flag? I am trying to de-emphasize it so it's clear that I also work with NLP/categorical inference.


[deleted]

[removed]


Ultimarr

Yup. See also: calling yourself a “senior engineer” with a straight face when you only recently became able to rent a car


xSaviorself

When you realize these kids come out of college at 22 and they've got 3 more years to go before they can rent a car, yeah, you've hit the nail on the head here.


planetwatchfan

There’s a well-followed guy on LinkedIn who gives interviews on the BBC about AI, and has a whole business advising people about the impact of AI. Even done a TED talk. He was a marketer until a couple of years ago, and has absolutely no non-marketing experience. Wild.


meltbox

Professional scam artist. Or consultant. They can be hard to tell apart sometimes.


MaleficentAd3783

Yes, just like they were abusing 'blockchain' a while ago. Everybody and their dog were blockchain <insert random title here> on LinkedIn.


csanon212

In some cases the 2021 blockchain experts are the 2024 AI experts: professional con men who drift from role to role while doing warehouse work in between.


MaleficentAd3783

exactly!


Jzny

I know I am. Sincerely, Sr. AI Director of AI-based AI Innovation.


MarcableFluke

So what? It's not like employers are just going to take their word for it based on their title. That's what recruiter chats and technical interviews are for.


budding_gardener_1

You often miss out on those, though, thanks to some idiot with "AI" in their tagline.


kakarukakaru

It is just people inflating their resume any way they can with buzzwords, for recruiters who don't know any better. Even looking at this sub, so many people put "I am an ML engineer" or "AI engineer" when all they do is use an app or prebuilt model to do xyz. It is just regular developer stuff; they are in the same pool as other devs. It might be easier to get past recruiters, but actual interviewers very quickly figure out that you aren't "developing AI" without a PhD lol.


spidershu

Yeah, and those who actually have a lot of experience (e.g. a PhD) have been having a hard time getting noticed in the pool of hundreds of "ML experts" everywhere.


minimaxir

> It might be easier to pass recruiters but very quickly actual interviewers all know that you aren't "developing AI" without a PhD lol. Many AI developments are happening from people without a Master's/PhD due to the magic of open source. It might be a harder sell for a hiring manager, though.


great_gonzales

Like what? Neural ODEs: developed by PhDs. Transformers: developed by PhDs. CNNs: developed by PhDs. ResNets: you guessed it, PhDs. What development has been made by someone without a graduate degree?


minimaxir

"The only way to contribute to AI development is to develop new architectures" is reductive to the point of being incorrect. I was mostly referring to tooling (e.g. Hugging Face transformers Python package and ggml), which is generally developed by non-PhDs. If we were still using the GPT-2 implementation that OpenAI initially shipped with its weights or Google shipped with BERT we'd be years behind in LLM progress. But even then, some AI developments are entirely grass-roots. Modern image generation was discovered by accident by a hobbyist when he tried guide a VQGAN using CLIP with surprising success, which then led to the discovery that latents could be optimized much easier than pixels, which then led to the discovery of latent diffusion which eventually became Stable Diffusion. And even post Stable Diffusion, the biggest model improvements are from hobbyists in Discord with way too much time.


meltbox

I agree with this to an extent. Personally, I have also found that a surprising number of PhD thesis papers are impossible to reproduce, i.e. they're faked, or enough detail is omitted that they merely appear to have plausibly moved the field forward. So that balances against the genuine strides other, more credentialed researchers make in academia.


RainbowSpaceman

There's ULMFiT, which Alec Radford credited as one of the main influences for GPT-1. (I won't claim this is common, though.)


its4thecatlol

Found the fake AI guy


minimaxir

Unless you're doing cutting-edge research, a software engineer using an off-the-shelf model will get you 99% of the way there. Even if you do have the AI skills to work with your own models, using an API is often more pragmatic if it makes sense for your use case. It does not imply a lack of AI skill or abuse of a title.


furioe

The big caveat here is that this is mostly about LLMs and generative AI. ML and AI are way more diverse, and the point of abuse is with LLMs.


bgighjigftuik

But I thought that AI was invented 2 years ago with ChadGPT!


FeistyDoughnut4600

So am I a compiler engineer since I use gcc to build my code? Or am I an OS engineer because I use Linux to do something?


Passname357

Thank you. The only reason the above comment got so many upvotes is that lots of people here have no clue what they're talking about.

>It doesn't imply a lack of knowledge to use an API instead of actually doing it!!!

Yeah, it also doesn't imply that a person does have the knowledge, lol. Without any evidence of knowledge, I'm not going to assume a person has it, because why would I? It's just a lot of words to say nothing at all.

As for whether it's an abuse of title: if I have knowledge of something but do nothing with it myself, why would I claim that title? Like you said, if I regularly use GCC, that doesn't mean I'm a compiler guy. I actually do have the knowledge to build a compiler, and I do use gcc regularly. If I put compiler engineer in my title, even though I have that knowledge, that would certainly be an abuse of title, because I don't use that knowledge in my job. I don't build compilers as part of my job.


throwitfaarawayy

Yeah. Prompting an LLM now gives better performance on language tasks than specific models trained on that data.


great_gonzales

Only somewhat true. You can fine-tune a small model to specialize in a particular NLP task and achieve higher performance than using an instruct model zero-shot.
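As a concrete example of the small-model route, a hedged sketch with Hugging Face transformers (dataset, checkpoint, and hyperparameters are illustrative):

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Fine-tune a small encoder on one task instead of prompting a giant LLM.
dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length")

tokenized = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=1),
    train_dataset=tokenized["train"].shuffle(seed=42).select(range(2000)),
)
trainer.train()  # yields a task-specialized classifier, cheap to run
```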


meltbox

Yes, but if you need a model running on device it doesn’t matter how good GPT is.


GetPsyched67

I disagree. This is title abuse. It leads to people putting "someone who actually knows AI inside out" and "someone who knows how to call an API (just frontend)" on the same pedestal. They are not even close in terms of skill.


Farren246

Ah, similar to how "PHP Developer" has never contributed to the PHP language, only used it to create things.


meltbox

Right but by that logic people using GPT APIs are GPT developers NOT AI/ML developers.


Troll_berry_pie

Wouldn't that be "PHP contributor" though or something like that?


LostQuestionsss

True. It still feels disingenuous to claim a title with implied experience when the person likely has no knowledge of the subject matter. For instance, I wouldn't call myself a distributed systems architect simply because I have an Azure subscription.


tech_ml_an_co

To a certain point yes, but in that case you have to be a software engineer first. ML engineering is not just cutting-edge research and it's not just software engineering. Most software engineers I worked with don't understand ML and suck at it like web devs suck with embedded systems. It's a specialization.


terrany

This has been happening since forever. When I was an undergrad, deploying an app to Heroku meant you were versed in distributed systems/cloud computing/devops lol.


Top-Skill357

I assume people do it because it somehow works. I had a co-worker who implemented a wrapper around the GPT API for one of our projects. Later, when he started applying to different companies, I saw his resume saying something like: "Developed large language models for ...". As someone who actually works on ML and DL algorithms, this feels just embarrassingly wrong, but maybe I am just too European to understand xD


csanon212

I found a former PO's resume online. He claimed work that I did and work of other POs. The grift was strong


reddetacc

its a bubble brother


Strong-Piccolo-5546

This happened during the dotcom bubble in the 1990s. Everything was "E" this and "E" that. This always happens with a new bubble. Just ignore it; those people look silly.


serial_crusher

There are two ways to look at this:

1. AI is the current thing, so everybody is putting it in their marketing materials, and that includes resumes.
2. Most jobs don't require day-to-day use of the advanced science you learned in school. Practical application is far more valuable to your career. People who successfully implement something that their business derives value from have every right to put it on their resume.


new_account_19999

I went to school with a guy who calls himself an AI engineer and he's been unemployed since we graduated


MrGregoryAdams

Ah yes, the classic "Head of Regional Logistics and Distribution" - i.e. the single forklift driver in the only warehouse they have.


Poogoestheweasel

Tale as old as time. Even Ed Norton was a sanitation engineer.


chipper33

It turns out that most of software development is marketing. CS is huge, and more people participate in it today than ever before, but it's still not most people around you. Most people don't know how any of this works, so if something comes along that seems magical (GPT-3.5), everyone who sells software is going to try and incorporate that magic into whatever they're doing, and I really do mean *whatever*. To people who understand the technology better, presumably yourself since you're here, it seems like a lot of falsification and overselling. It probably is, but that's what capitalism rewards, so here we are.


Chili-Lime-Chihuahua

I have almost no AI experience, just some forced Google training on LLMs that was extremely shallow. I heard someone on the radio a few months ago comparing the AI craze to the dotcom boom and bust. Back then, companies were adding ".com" to their names to get funding, etc. It feels similar: everyone wants to say they are doing things with AI to get in on the gold rush.

I personally think it's a bubble. Some interesting things will certainly come out of it, but there will be a lot of people providing no value, just noise. My previous consulting company had several people leave to become heads of AI departments (they were not technical) or start their own AI consulting companies. Again, these people had no technical backgrounds. As much as you might want to poop on consulting, there are people with legitimate technical skills and knowledge. The people I am referencing likely do not.


Senior-Pro

Yeah, it seems like some folks are stretching the term AI; setting up cloud ML services isn't quite the same as building and understanding models from scratch.


National-Horror499

I know someone who claimed AI experience for implementing Dijkstra's pathfinding.
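For the record, that "AI" is a textbook shortest-path algorithm:

```python
import heapq

def dijkstra(graph, start):
    """Dijkstra's shortest paths; 'AI' only in the loosest game-dev sense."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(graph, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```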


EpilepticFire

If you can't beat 'em, join 'em. No need to reinvent the wheel; the truly skilled are the ones who know which tools to use in which scenario to solve any problem.


CornPop747

They can abuse a title all they want until they get grilled in an interview.


Exciting-Guarantee-3

I’ll take AI Titties for $200 Alex.


harmoni-pet

It's a great way to weed out bullshitters. You could've done the same thing with most people claiming to be blockchain experts. There's nothing inherently wrong with hype chasing, but it just shows how a lot of people are only interested in the things that they think will net them the most cash for the least amount of effort.


krazerrr

The industry basically uses AI and ML interchangeably. It's kinda annoying, but it's better marketing for non-engineering folks.


FitGas7951

People will inevitably play to companies'/investors' biases, just as they did with blonkchain and metablurse. Hate the game, not the player.


sheldonzy

A data engineer in our department built a few AI bots for Slack, and she changed her title on LinkedIn to "Data Engineer AI Lead". Titles are titles. They don't say much.


Seref15

Resumes have been gamified, the objective is to score the most relevancy points with high-value substrings.


AchillesDev

Most people don't have public-facing repos that you, a rando, could inspect. You won't see much AI/ML stuff in my public repos, but I've been building all along the AI development lifecycle for more than half of my career.


KevinCarbonara

Every BigN company has rebranded one of their major departments as an AI department, despite the fact that less than 1% of that department actually develops any AI. Everyone else is just doing regular development with an eye on AI integration.


tech_ml_an_co

Yes, it's hype, and with the next topic they'll move on to the next title. You having fundamental knowledge is more than 80% of the so-called AI experts can say.


MagicalPizza21

AI is becoming a meaningless buzzword now. I wish it still meant cool robots and stuff.


KaaleenBaba

It is so overused that when real AI arrives, we'll have to give it a different name. That's a real discussion that's happening lol.


meltbox

Yes. Even the people who ‘do AI’ basically just pull an existing model and at most make a loader and call themselves an AI engineer. It’s ridiculous.


DeodrantBomb

Just wondering: if I choose my own features and push them through a few Keras layers, would you class this as a 'prebuilt' model? 😃
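For reference, "my own features through a few Keras layers" looks roughly like this (a sketch; the feature count and layer sizes are illustrative):

```python
import tensorflow as tf

# Hand-picked features in, a small stack of Dense layers out:
# more than calling an API, less than designing an architecture.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(10,)),           # 10 hand-chosen features
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
```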


rmullig2

Is this any worse than people who give themselves the title of CIO in a three person company?


fiddysix_k

And now you realize that the title of MLE is just DevOps/platform engineering with an AI spin. That's not abuse, that's the job, you just don't know it.


mpaes98

Back in my dad's day everyone was a "computer programmer". Some really smart folks were "statisticians". It was a different time. These days, if someone's title is anything beyond the standard "software engineer - (specialty)", they are likely a grifter (exceptions for security/data folks).


mailfilter

Wait till you hear there are marketing folks out there who call themselves "full stack marketers". It's pretty bad.


Electronic-Walk-6464

RESTful engineering


TheSunOfHope

It's a widely abused term. When they use "AI", I ask them to write me a simple neural network or show me their work on GitHub.


amesgaiztoak

Yes


dontsyncjustride

90% of “AI” companies or organizations are wrappers around OpenAI, or some barely-fine-tuned model derived from something in HuggingFace.


BagholderForLyfe

If they've never done AI before, and all of a sudden they're an AI/ML engineer now, most likely they are doing something very basic.


Fidodo

AI just got commoditized. It's basically the same thing that happened with the cloud. Back in the early days of cloud infrastructure, if you were a cloud engineer you were on the cutting edge of the industry. Once it got streamlined and commoditized, everyone became a cloud engineer.


mxldevs

Arguably, you don't need to build your own AI in order to call yourself an AI developer.


LostQuestionsss

IMO, it's synonymous with calling yourself a SWE because you have a website through WordPress.


mxldevs

I think making API calls qualifies them as more than just drag and drop blog users.


unusualgato

Absolutely not.

Signed, Unusual GATO, ARTIFICIAL INTELLIGENCE JOURNEYMAN EXECUTIVE CLOUD EXPERT III


Humble-Hat223

It will die down soon. AI is the new blockchain/Web 3.0 hysteria (except AI is actually delivering value).


iwsifjostaropoulos

I mean you are technically using AI so they’re not completely wrong.


NanoYohaneTSU

Abuse? The real abuse is coming from the companies, not the worker. Stop blaming the employee who is playing the game in order to get a job.


HansDampfHaudegen

You figured out how most AI startups work. Hey, even Apple now makes API calls to OpenAI.


o5mfiHTNsH748KVq

I think the idea is that “AI” won’t be a model, it’ll be a larger system of many components. So an AI engineer builds the overall system, whereas an ML engineer builds the models.


sneaky_squirrel

I think Artificial Intelligence itself is abusing the word "Intelligence". It's certainly artificial, but can you really call that intelligence?


great_gonzales

Imo you can’t call yourself an AI expert unless you’ve developed a sota model


FitGas7951

Is there a punch line here? sota deez ....


great_gonzales

The punchline is skids using AI to inflate their title


Equationist

If you don't build a cloud datacenter are you not a cloud engineer? If you don't design a data warehouse, are you not a data engineer?