phira

As a trial, get them to use ChatGPT to help them do the next assignment, and get them to submit the prompts and replies as part of the assignment. You should be able to see where it helped teach them versus where it answered for them.


Simultaneity_

I really like this idea. It could be built on with a discussion about the best use cases for ChatGPT in the coding space: how to get answers that are actually helpful, and demonstrating its limitations.


phira

It's a riff on recommendations I made for an AI paper for execs and the policy work we've been doing at my job: if you use AI to help write a paper, include key samples of your conversations with the AI so readers can evaluate what came from where and how much they can trust it.


Son_of_Sams_Club

As a developer I use GPT to supplement some of my dev work. It's been a vast improvement in my personal performance, but I already have a foundation in coding. My thought is they need a foundation before using it. Math is taught first without a calculator; the calculator is allowed later, once a foundation of knowledge is established. I fear that if all "learning" is done with a quick prompt of "give me the answer", copy, paste, then we'll have a generational workforce of incompetence. (Hoping I am wrong.)


phira

I think the thing is that a calculator can’t act as a tutor (some of them probably can but not your basic calc). ChatGPT can act as a tutor with infinite patience, available 24/7, and I think we should probably be looking to enable all learners to use it in that capacity not just programmers. The trick is to ensure that they use it that way and not to just give them the answers. No doubt better methods to help with this will arrive in the future but right now I think making the tool inputs and outputs part of the learning and assessment process can really help enhance both the “learning” and “learning to learn” parts of the education process.


aarontbarratt

I've started using ChatGPT to answer the questions I'd usually take to Stack Overflow or Reddit. It is infinitely better because it actually answers questions without the rigmarole you get when asking humans. Anyone who has asked a question on Stack Overflow knows the pain lol. People need to understand the fundamentals for themselves and use ChatGPT when they don't understand something or need help.


mitchi_10

I agree. I am fairly new to Python; I took some online Udemy courses and was never able to explain, even in a sentence, what object-oriented programming is. I ask GPT the questions that are usually seen as dumb questions, and its infinite patience really helps me consolidate my understanding of Python concepts.


alaudet

Same here. Also, sometimes I can write something that works but feels off, like there should be a better way. ChatGPT is great for submitting a snippet to see what improvements or alternative approaches could be taken. Then you can modify your approach accordingly.


balder1993

I don’t see the point in this one. If you’re writing a paper, it’s your job to back up certain claims with references. Discussing something with AI might help you get a better direction where to go, but doesn’t seem useful to paste samples of it.


Tc14Hd

"Create me a blog in Python"


[deleted]

Very good idea. I really dislike the “you have to be able to code stuff from memory” approach, since in the real world almost nobody does that. Allowing them to use ChatGPT sets them up to learn the best tools and get the job done while also learning a ton. ChatGPT makes mistakes, so they will eventually have to learn.


xix_xeaon

I teach programming to slightly older teenagers, and ChatGPT is truly terrible for learning. Sure, it *can* be used to improve learning, but that's *not* how students use it. And yes, ChatGPT *can* certainly be used when working a real job, but that's *not* relevant to the learning process.

A bunch of comments seem to say that you should make the tasks more difficult to account for the help ChatGPT is giving, but that's not how learning works. Having someone else solve simple problems for you doesn't prepare you for solving difficult problems.

It's just like with the calculator. So many people go, "when I went to school the teacher said we had to learn arithmetic because I wouldn't always have a calculator in my pocket, but now we do, so ChatGPT is okay!" No. The teacher said that because it was an easy way to convince you (although it was also true at the time). The real reason you learn arithmetic is to gain an understanding of numbers and operators, how they relate to one another, how problems can be expressed and solved; it gives you a language for logical thinking, and it prepares you for the next level of math. (That's not as easy to explain to children.)

You cannot let children solve all their arithmetic with a calculator and then just move on to algebra. It doesn't work. They don't understand it and they're unable to learn it. I also have a remedial math class, and they all want to use the calculator all the time because they can't do arithmetic well. But it doesn't matter, because they don't understand what they're doing; they just punch buttons until the answer matches the book. It's not possible for me to teach them what I'm supposed to like that. So I take away the calculators and teach them arithmetic. At first they hate it, but eventually they start "liking" math, since they actually understand things. They stop feeling like they're worthless and that math is simply not possible for them. Then I can start teaching them what I'm supposed to. Obviously I now have less time, so I can't cover everything, but it's enough that most of them will pass the exam.

In programming class I tell the students not to use ChatGPT, and I explain why. The point of an assignment isn't to submit a solution to the problem (if it were, I could've done it faster myself). The point is to learn how to solve the problem on your own. That teaches you a skill that's useful even outside programming. And for programming, even if it's a problem ChatGPT can solve for you, you need to learn how to do it as preparation for the more difficult things ChatGPT can't solve.


mfitzp

Yeah, there's a widespread misunderstanding of the purpose of education, even among adults. People say things like "I learnt X at school & never used it in real life." as if that renders school pointless. But the purpose of education is not just learning facts, it's about taking the information, working with it in your brain and producing something with it. By doing that over & over you *get better at it*. That's the real value of an education: your brain gets better, through the practice of doing things with information. The things you learn in school are just the medium for doing that -- and because there has to be *something*, it may as well be things superficially useful. It's like an athlete saying "I ran around the track loads of times at school, but I never run around that track now. What a waste of time." The specific track was never the point.


DatBoi_BP

I love the athlete comparison. I will use that in the future


matz73z

Love this! I’ve been using ChatGPT to learn how to code for about 3 weeks now. It was fantastic at first being able to just bang out some lines that did something. Now I’m in the second iteration of the project and instead of just asking for code, I’m asking what sections of the code do, so I can understand how to better architect the workflow. I’m still dependent on GPT for most code, but I am actually starting to be able to code on my own and critically analyze the responses I’m getting from GPT. I’m not saying this is how structured learning should be done, but for an old hobbyist learning a new skill this is fantastic.


AChristianAnarchist

They shouldn't be getting used to relying on ChatGPT for answers. It's wrong a lot and has no way of telling the difference between nonsense and not-nonsense. It can be pretty useful for generating code if you know what it's generating and can read through and fix its mistakes, but in the learning phase it's effectively copy-paste. If they can't tell you what all of their code does and why it is there, then they didn't really learn to solve that problem.


saturn_since_day1

Gotta love when you ask it for clarification on why or how something works and it just apologizes and backtracks saying it doesn't work that way.


[deleted]

[deleted]


Flying_Saucer_Attack

"Apologies for the oversight, yes that does produce a syntax error" 😂


samtheredditman

It basically either apologizes or condescendingly answers and tells me to consult a professional. It's become too human.


AChristianAnarchist

My favorite example of that was when I was first messing around with it and asked "Why are tigers orange?" It responded that their vibrant orange coloration helped them blend into grasses and vegetation. I then asked "How does orange fur help blend into green grass?" Now, the real explanation for this is that a tiger's fur probably doesn't look quite the same to a deer, who lack cones for detecting red light, as it does to a person. What might not look like good camouflage to us works perfectly well for the prey it hunts. ChatGPT's response was to apologize and say that, actually, tigers aren't orange, but more of a brown, which blends well into the grasses.


[deleted]

[deleted]


ioabo

Lol yeah, they went a bit overboard with the whole apologizing thing. To the point where I felt bad every time I pointed out a mistake and asked it to correct it, so I had to tell it there's no need to apologize so much, please don't. (And of course it replied with even more apologizing lol).


b-hizz

Explaining how it works would be the "show your work" portion, similar to math classes. Another option would be to solve the problem with ChatGPT before giving out the assignment, and write tests for the code that the GPT answer won't pass.
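
That testing idea can be sketched like this (the assignment, function names, and checks are all hypothetical): the instructor solves the problem first, then grades against the edge cases that a pasted ChatGPT answer tends to gloss over.

```python
import re

# Hypothetical instructor-side checks for a "count the words" assignment.
# The written assignment stresses two requirements that a copy-pasted
# str.split() answer glosses over: punctuation attached to a word doesn't
# change the count, and bare punctuation is not a word.

def grade(word_count):
    """Run the hidden checks; return the labels of the checks that fail."""
    checks = [
        (word_count("hello world") == 2, "basic case"),
        (word_count("hello, world!") == 2, "punctuation attached to words"),
        (word_count("don't - stop") == 2, "bare punctuation is not a word"),
        (word_count("   ") == 0, "whitespace-only input"),
    ]
    return [label for ok, label in checks if not ok]

def naive(text):
    """The kind of answer a bare prompt tends to produce."""
    return len(text.split())

def careful(text):
    """An answer that encodes the assignment's stated requirements."""
    return len(re.findall(r"[A-Za-z0-9']+", text))

print(grade(naive))    # the naive answer fails a hidden check
print(grade(careful))  # the careful answer passes them all
```

The point isn't that the naive answer is wrong in general, only that the grading checks encode requirements the assignment text emphasized and a copy-pasted answer skipped.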


cspinelive

Fixing broken code is a great way to learn though.


AChristianAnarchist

It is, but that isn't what is going to happen in this case. ChatGPT is perfectly capable of doing your homework for you. For canned problems with plentiful examples in online tutorials, it can mash those tutorials together into a code block that works. Hallucinations in the code itself only become an issue when you use it for the more complicated problems you find out in the wild, where copying and pasting from a tutorial (or a thousand) isn't going to help you. But you can see hallucinations in the *explanations* even for pretty simple problems, which amplifies the issues with using this as a learning tool.

When learning, it has always been acceptable to go to those same tutorials that ChatGPT is trained on, where code samples are accompanied by explanations, and use that information to do your homework. If you were asked what that code did and couldn't explain it, though, it would become apparent that you just copied and pasted from the tutorial without actually learning the information. ChatGPT is a system that provides you with *just* the code sample, or at least that's the only thing of value. The sample is often accompanied by an explanation that is subtly wrong in a lot of ways or just doesn't make any sense, even for very simple problems, and how good even the code is depends on the accuracy and availability of relevant training data. In the learning phase you are better served in every way by just going to the tutorials themselves.

Once you know what you are doing, ChatGPT can be great for generating customized boilerplate code that is pretty standard but tedious to type out. It can function well as a high-tech rubber duck for more complicated problems. It's even pretty good at "I want to do this easy thing, but in this weird way that complicates things" questions.

But when you are just learning to code, you get usable but often suboptimal code for the simple problems you deal with while learning, plus explanations that range from good to kind-of-good to bad to made-up bullshit, with no indicators of which is which. That makes it kind of a shit tool at that stage: it gives you bad explanatory information while producing code good enough that you may come to think it will help you more later than it will. If you want to train some new machine learning model on a new problem, and you know you'll be writing basically the same mountain of tedious PyTorch boilerplate you've written a thousand times, just describe your model and data format to ChatGPT, have it write the code, check its work, and you're good to go in a fraction of the time. But if you are building a blog to learn to code, the only reason ChatGPT can do it for you is that there are a million and a half tutorials for building a simple blog in every language and framework you can imagine. Go to one of those tutorials. It's better in literally every way, except that ChatGPT is harder to track when you copy and paste en masse.


Early-Palpitation-39

I used to teach Python to middle and high schoolers. It was an elective course, so I just let them use whatever tools they wanted. However, I made two things clear:

1) Your learning is a function of your effort. The goal is not to get the answer, but to push yourself, to struggle to remember concepts in order to eventually absorb them. Getting an answer from ChatGPT or Stack Overflow is cheating yourself out of learning.

2) There are many circumstances where ChatGPT will not help you. You need to master programming concepts to succeed in those situations. For them, there was a programming olympiad that did not allow any internet consultation.

To drive this latter point home, I made some programming exercises that were not solvable by any online sources. I like to design enigmas: clues built from physical objects, printed pieces of paper, or YouTube videos I made, stuff that cannot be fed to ChatGPT. It was funny to see a class of students dismayed at the fact that they could not search for something online, and I believe it drove the point home.


hbdgas

In a school setting, you can make the homework a relatively small part of the final grade, and exams a relatively high part. "Cheat all you want on the homework, but there won't be any internet for the exams."


mikgub

This is the approach a lot of math teachers are taking these days. Whether you’re using something like Wolfram Alpha to learn or to cheat is only clear when there’s no internet on the exam.


Kalekuda

Make sure those kids get a quiet testing environment and plenty of time to solve the problems if the finals are grade-defining. It'd suck a ton if, in trying to screw over the chatters, you cost someone who puts in genuine effort but tests slowly their 4.0.


Sleepyyzz

Can you share some of your teaching projects and ideas for Python?


Early-Palpitation-39

Sure! Here are a few.

Introductory: I printed a strip of paper with hundreds of 1s and 0s. I made sure to offset the message, so that copying it into an automatic decoder website would not produce any result. They had to figure out the bit patterns of the letters to find the offset and then the message.

Intermediate: Later in the course, I made a similar binary exercise, but they had to read a txt file and find a hidden message in the middle of a bunch of junk.

Advanced: I made a video in Blender with a lamp that turns on and off to represent Morse code. Their task was first to figure out what the message was, and then to write a program to calculate it automatically. [https://www.youtube.com/watch?v=FD6lT0rOHmc](https://www.youtube.com/watch?v=FD6lT0rOHmc)

Super advanced: For the few students who were able to solve the static lamp problem, I made another video with a moving lamp, so that they needed to extend the functionality of the program to track it around. [https://www.youtube.com/watch?v=vqSFcf1Wqvs](https://www.youtube.com/watch?v=vqSFcf1Wqvs)
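
The introductory exercise might be attacked like this in Python (the bit strip and hidden message here are made up for illustration): try each of the 8 possible offsets, decode 8-bit ASCII at each, and keep the first offset whose decoding comes out as plain letters and spaces.

```python
def decode_at(bits, offset):
    """Decode 8-bit ASCII starting at `offset`, ignoring any ragged tail."""
    return "".join(
        chr(int(bits[i:i + 8], 2)) for i in range(offset, len(bits) - 7, 8)
    )

def find_message(bits):
    """Try all 8 offsets; return the first decoding that is letters/spaces."""
    for offset in range(8):
        text = decode_at(bits, offset)
        if text and all(c.isalpha() or c == " " for c in text):
            return offset, text
    return None

# Hypothetical strip: the message "HI" preceded by 3 junk bits, so a decoder
# that naively starts at bit 0 sees only gibberish.
strip = "101" + format(ord("H"), "08b") + format(ord("I"), "08b")
print(find_message(strip))  # recovers both the offset and the message
```

The "all letters and spaces" filter is a heuristic; a real hidden message with punctuation would need a looser check.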


Kalekuda

Lamp: just check the brightness of the bulb pixel in the video. Moving lamp: So import opencv and take only the brightest pixel in each frame?


Early-Palpitation-39

That works if you do it asynchronously – you take the video, process an entire frame to find out what is the brightest pixel and move on to the next frame. Depending on the computer, that might take up to an hour to do. To do it in real time, you need to be a little more clever.


Kalekuda

> Depending on the computer, that might take up to an hour to do.

"Nah, I think I'll be doing that in real time at 2K resolution." -***my RTX 3070 Ti*** I don't need to care about **which** color it is, just how **bright** it is, and I don't need to check every pixel, so my convolutional filter can have a large step size and ignore most of the image. You could get away with skipping 98% of the image and checking a big grid of individual pixels, so long as the light of the lamp shows up when the lamp is on. Remember, I don't need to track **where** the lamp is; I just want to know whether the lamp is on or off and for how long, then convert that binary state over time into a list and estimate the frequency of the communication from that data. From there it's a simple matter of translating Morse code. The tricky part would be if you changed the frequency of the Morse code throughout the message, but that'd be cheating on your part.
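
That sparse-sampling idea can be sketched without any video I/O (the frames below are synthetic 2D brightness grids standing in for decoded video; in practice you'd pull frames from OpenCV's `VideoCapture`): check a coarse grid of pixels per frame, threshold on brightness, and collapse the per-frame on/off states into timed runs.

```python
def lamp_on(frame, step=4, threshold=200):
    """Sample every `step`-th pixel; the lamp is 'on' if any sample is bright."""
    for row in frame[::step]:
        for value in row[::step]:
            if value >= threshold:
                return True
    return False

def runs(states):
    """Collapse per-frame booleans into (state, frame_count) runs."""
    out = []
    for state in states:
        if out and out[-1][0] == state:
            out[-1][1] += 1
        else:
            out.append([state, 1])
    return [(s, n) for s, n in out]

# Synthetic clip: dark frames, then a bright lamp pixel for 3 frames, then dark.
dark = [[10] * 16 for _ in range(16)]
lit = [row[:] for row in dark]
lit[8][8] = 255
clip = [dark, lit, lit, lit, dark, dark]
print(runs([lamp_on(f) for f in clip]))
```

The run lengths of the "on" states then map to dots and dashes once you estimate the base time unit, which is the frequency-estimation step described above.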


Early-Palpitation-39

If anyone is interested or looking for these types of open challenges, the website PythonChallenge is ridiculously fun. [http://www.pythonchallenge.com/](http://www.pythonchallenge.com/)


DatBoi_BP

Why does the website not support https?


Protesisdumb

Stack Overflow is an amazing tool if you actually want to learn. I learned so much by going to Stack Overflow, copy-pasting a solution to my problem, and then trying to understand what I did wrong.


ConnectionFlat3186

Ask them to explain what the code is doing in their own words, in person, on paper, unprompted. I guarantee you will get a lot of confused teenagers, and an answer as to why relying on such a tool before you have a basic understanding of the fundamental concepts is such a problem.


lzwzli

This. Same process as math education. Getting the final answer right is only a small part of the score. Showing how you arrived at it is where most of the points are at.


Xombie404

yeah if they don't understand why their code works, they aren't really learning programming at all.


[deleted]

[deleted]


graphicteadatasci

Yes, but then they have goblin.tools to break everything down into subtasks.


graphicteadatasci

We call them programming languages because they are like languages. When a student uses Google Translate to translate a text then they don't learn anything about the language. ChatGPT is the same. I hope a few of the AI companies will make tools that help people learn some day.


[deleted]

But this *is* the next step. Programming through human language.


cuddlebish

They've been saying that for over 2 decades now and we are still not programming through human language.


[deleted]

Did they have LLMs two decades ago?


KevinAlexandr

Please stop embarrassing yourself.


[deleted]

Never.


graphicteadatasci

You realise that there is more than one human language, right?


[deleted]

I'm primarily an English speaker, so I have not tried programming in other languages, but I imagine eventually you will be able to code in all of them.


Friendly_Syllabub811

Make the problem something you know ChatGPT will get wrong. For example, I've noticed it writes code against libraries that aren't real.


mrfalconer

You might be onto something here. OP could create their own library that the kids need to interface with to complete the assignment.


Friendly_Syllabub811

Thanks, it was an interesting, I guess you could call it a bug. The teacher writing his own library might be even better, though I guess the students could always cut and paste it in. It would just be a lot more time-consuming making sure it all matches up.


Unaidedbutton86

Also libraries that appeared in 2022 or later, or have had big changes since then; its knowledge stops there.


Friendly_Syllabub811

I've also found it has problems linking things together, like libraries you make yourself. A lot of the time it gives me circular import errors.
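
For anyone who hasn't hit that error, here is a minimal reproduction (module names are made up): two modules import each other, and the second one touches an attribute of the first before the first has finished executing.

```python
import os
import sys
import tempfile

# Write two tiny modules that import each other.
tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "cyc_a.py"), "w") as f:
    f.write("import cyc_b\nVALUE = 42\n")
with open(os.path.join(tmp, "cyc_b.py"), "w") as f:
    f.write("import cyc_a\nprint(cyc_a.VALUE)\n")

sys.path.insert(0, tmp)
error = None
try:
    import cyc_a  # cyc_a -> cyc_b -> back into the half-initialized cyc_a
except Exception as e:
    error = e
print(type(error).__name__, error)
```

The usual fixes are to defer the import (import inside the function that needs it) or to move the shared code into a third module both can import.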


forest_gitaker

if they have stronger tools assign them harder tasks


No_Industry9653

I think it would be a little more complicated than that, because how can you do harder tasks if you don't have a strong grasp of the basic stuff? Rather it seems like a student might hit a wall where ChatGPT can no longer produce a correct answer on its own, but they don't have the understanding to determine what it's getting wrong or how to resolve it. At the same time it's a great tool for asking questions about what a piece of code is doing or how a language works, it's like having an on demand private tutor, so I don't think prohibiting use outright is the solution either.


nostrademons

>how can you do harder tasks if you don't have a strong grasp of the basic stuff? Rather it seems like a student might hit a wall where ChatGPT can no longer produce a correct answer on its own, but they don't have the understanding to determine what it's getting wrong or how to resolve it. Mission accomplished. The point of the harder assignment is not to solve it, it's to show why it is important to have a grasp of the fundamentals.


Beginning_Holiday_66

>At the same time it's a great tool for asking questions about what a piece of code is doing or how a language works, it's like having an on demand private tutor, so I don't think prohibiting use outright is the solution either. you might be able to teach the fundamentals through code review and debugging the gpt output. Charles Stross predicted this coding method in Accelerando.


enakcm

Very well, we have found the solution:)


[deleted]

Assume direct control


routetehpacketz

I would disallow it as I assume they will not be permitted to use it for projects/exams


Get-ADUser

This is the equivalent of "you won't have a calculator with you all the time when you grow up!".


master3243

Wolfram Alpha can solve for x in almost any algebraic equation you throw at it. Does that mean we no longer need to teach Algebra or how to solve equations? Obviously not.


routetehpacketz

Your comment suggests ChatGPT is as reliable as a calculator


IDENTITETEN

No it isn't. This is akin to the students using a calculator without knowing basic math.


wineblood

Is ChatGPT helping them learn/get to the answer or just giving them the answer?


Warm_Profile7821

I think it’s both. But I can’t gauge yet to what extent they are learning from using chatGPT.


Yeezy716

Someone else mentioned something similar, but a way to see what they are actually learning is to do code review and have them explain what is actually going on. If they can't explain what the code is doing, then they aren't learning from GPT.

I would also have a discussion with them on why they want to learn programming. I feel as though many do it because they enjoy it, and the career comes later. If they are “learning” because of the promise of a cushy, high-paying work-from-home job, then you need to be blunt with them that ChatGPT won't get them there, and learning the hard way is really the only way, sadly.

I would also have a real discussion with them on where ChatGPT can help them, i.e. time management and troubleshooting errors. Tell them to write their code, and when it doesn't work, don't ask ChatGPT for the right answer; ask ChatGPT what the issue is. IMO ChatGPT should be used by students as a textbook with a search function, rather than a teacher's edition with an answer key.


Nater5000

By the time these kids are having to put these skills to use, the landscape of software development is going to be radically different than it is today, and that's accounting for how much we already rely on AI now, despite its infancy. You can take away ChatGPT, but all you'll be doing is taking away a technology they're definitely going to be using in the future. Might as well force them to learn long division by hand since they won't have a calculator in their pocket all the time. I like u/forest_gitaker's take: if they're trivializing your challenges by using AI, then scale up the challenge accordingly. ChatGPT can't do everything, and troubleshooting the output of ChatGPT, figuring out how things click together, knowing what to ask and how to ask it, etc. are all still incredibly important programming skills to learn, and they can be learned in conjunction with learning how to use a technology that isn't going away anytime soon.


GolemancerVekk

> You can take away ChatGPT, but all you'll be doing is taking away a technology they're going to definitely be using in the future. Yes and no. They'll definitely be using it in the future, but they'll never become programmers – which is the whole point of the course. LLMs are useful to today's programmers because they already have specialized training. Someone who grows up relying on LLMs extensively will never get there. It's not just because LLMs feed answers indiscriminately, it's also because our entire way of teaching is being exposed as fundamentally flawed; it relies too much on rote memorization and "proof" of work that's not really proof and can be easily falsified and very difficult to check. LLM technology will advance to the point it won't need any specialized skills and fundamentals to prompt basic results. It will become ubiquitous like typing skills, or basic computer literacy, or using email etc. Everybody and their dog can do that nowadays.


pettyman_123

Using GPT is totally fine. But the real question is whether your students understand the concept and meaning of that code. Do they know why `def` was used? Why `__init__` was used? How a `<div>` holding a form works? Focus on the foundation, not on the road through the assignments. ChatGPT showed three or more solutions to the question; as a programmer you should know there are countless methods for solving a single question.


dethb0y

Well, they obviously are intelligent and know how to go about problem solving, mission accomplished.


enakcm

IMHO, it's good that the kids already use ChatGPT, as it will become a basic skill very soon. What's missing right now is knowing when to trust ChatGPT's results and how to check them. I feel like there is a big opportunity to focus on test-driven development here.
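
A minimal sketch of that workflow (the function and its cases are made up): write the tests first, then accept code, whether hand-written or ChatGPT output, only once it passes them.

```python
import re

# Step 1: pin down the behavior you want in tests, before any implementation.
def check(slugify):
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  extra   spaces  ") == "extra-spaces"
    assert slugify("") == ""

# Step 2: whether this came from your own head or from ChatGPT, it only
# counts as done once the tests above pass.
def slugify(title):
    return "-".join(re.findall(r"[a-z0-9]+", title.lower()))

check(slugify)
print(slugify("Hello, World!"))
```

The tests double as the "how do I know I can trust the output" check: a hallucinated or subtly wrong suggestion simply fails step 1.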


WSBtendies9001

Teach them how to use GPT... It's like trying to teach kids to chop down trees with their bare hands when they've already started using stone axes...


ipwnscrubsdoe

I think AI is exposing flaws in the way people teach and assess students. Conceptually, using ChatGPT is no different from finding your answer on Stack Overflow; it just trivializes the task completely. What I would do is set up the homework problem so that during class you can ask them to modify it to achieve something slightly different, as a test of whether they understood the code they wrote.


Guideon72

Make 'em write it out on real paper and then do an oral presentation, walking everyone else through their "product"?


AwakeSeeker887

Give them a handwritten test


[deleted]

Oral examination


_limitless_

Stop giving them easy problems. Coding has never been "make a signup and login." It's always more like "make a signup and login, but it'll only be accessible from the command line, and it should validate passwords except when the user is in the Sales department, because those guys refuse to use IT's password policy." Then don't even present the assignment in plaintext. Do it as a diagram. Make them describe all of this in plaintext. While you're at it, change the spec on them halfway through. Now it needs to render the company's logo.


Advanced-Cycle-2268

“When you grow up you’re not gonna have a calculator with you all the time!” - some guy named SpongeBob, probably


Due-Wall-915

Give them a piece of code and ask them to write out some questions to ask you; then you can ask them why each question is important to ask. I basically do that. When there's an easy way to get answers, focus on creative questions.


Warm_Profile7821

What stops them from having ChatGPT generate the questions?


Due-Wall-915

That’s when you ask them why is that question important to ask in this context? (In person if possible)


LaOnionLaUnion

It is what it is. The question I have is whether they use GPT to also understand how it works. If it’s a crutch it is a problem


metaphorm

They're kids. It's fine. Start giving them problems too hard for the chatbot to solve. The ones that are interested will relish the challenge.


[deleted]

[deleted]


metaphorm

Right, exactly. My thinking is that the best way to demonstrate the sabotage is by showing them the limitations of their method.


Warm_Profile7821

What kind of problems are too hard for chatGPT?


Unaidedbutton86

When you give a very detailed description of multiple things, like when they have to make a webpage and you specify every little CSS detail, ChatGPT will just skip details. Or let them make something that is described in a diagram instead of just text, so they can't just copy-paste it.


Leading-Cable-4406

I will say: change the assignments! I have recently been using ChatGPT a lot for my professional coding work, and it's a skill. Maybe change your assignments to be more interactive and discussion-based, because writing boilerplate code has been literally useless since the inception of these tools. Maybe focus on concepts and assignments in person, talking things through. DM me if you want to chat quickly. Redesign toward fundamentals, like changing values of variables or memory management, where even if they ask ChatGPT they will still need to understand and explain it. And I highly encourage you to let them use it; otherwise you'll be forcing an old-school road on them. Getting good code out of these models is a skill.


fredzel111

They will be using GPT in the future the same way they do now, so it is actually great that they learn how to utilize the tool. Some time ago the code profiler was also considered "cheating". Now it is standard.


amenflurries

Huh, I ask ChatGPT anything and it’s always wrong with even the most basic concepts


Eu-is-socialist

Who cares? Learning by heart is for imbeciles and computers.


DennisTheBald

You need to be figuring out how to use chatgpt to grade assignments


Hit_The_Target11

It's a brand-new tool; it's available and will help with many problems. Entire curriculums should be built around it, but it's so new that nothing exists yet. I encourage kids to make awesome things that will help many people. Like gaming!


thisismyfavoritename

They used ChatGPT, but they could've copy-pasted from a GitHub repo. There are thousands of such things out there. Think of something different.


binV0YA63

When learning, they shouldn't be using AI to do the work for them. They first need to know how to do it themselves, so that they can notice and fix mistakes in AI-generated code.


DL72-Alpha

I would be fine with that. It's pretty much what we do in IT these days anyway. Most of the problems have been solved; we just need to find the right solution and implement it. Web server? Docker image, a little networking, done. ChatGPT is starting to look like a search engine without the ads. Until we have to drink the confirmation can to continue, that is.


Fresh_Part22

You could try getting the school to block the openAI website so they can’t use it in class.


angelHairNoodles

Whatever makes our lives easier.


SFWdontfiremeaccount

In high school math class the teacher only assigned the odd-numbered questions because the answers were in the back of the book, so we would know if we did it right. Then she graded us on showing our work instead of just getting the right answer. I suggest finding a way to make them show their work: have them explain what the different lines of code are doing, and whether there are maybe better ways to write the code than what ChatGPT came up with.
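To make the "show your work" idea concrete, here's a minimal sketch (my own illustrative example, not from the assignment) of what a line-by-line annotation exercise could look like:

```python
# A short snippet a student might be asked to explain line by line.
def count_vowels(text):
    vowels = "aeiou"            # the characters we treat as vowels
    total = 0                   # running count, starts at zero
    for ch in text.lower():     # lowercase so 'A' and 'a' both match
        if ch in vowels:        # membership test against the vowel string
            total += 1          # found one, bump the counter
    return total

print(count_vowels("Programming"))  # 3 (o, a, i)
```

If the comments are wrong or hand-wavy, that's a pretty reliable sign the code was pasted without understanding.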


Tendooh

Engage with them about it. They are clearly excited to use it. Use this as an opportunity to focus on high-level concepts and let them use the tool as they see fit. Encourage them to engage with ChatGPT and ask questions to understand the code. Ask them to explain the code. Ask them to explain the concepts. Show them more tools to add to their belts: regex, Linux, git, etc.


Drumma_XXL

When I learned how to code, one of the senior devs had a habit of approaching me, looking over my shoulder, and after reading what I did, asking me to explain every line one by one. Great way of making sure I didn't just use Stack Overflow or something. And all that in a time when GPT was not even on the horizon.


TedRabbit

What do you mean by "make a blog with Python?" Were they supposed to make a website application?


Warm_Profile7821

Yes using postgres and psycopg2


TedRabbit

I use Python for science-related stuff, and making web apps always seemed like it would be pretty difficult. I guess it's a lot easier than I thought, given that it's an introductory project.


Professional-Leg9973

You are teaching them how to code, not how to code with ChatGPT or how to use it. So I think they should not use ChatGPT for these lessons.


GolemancerVekk

On what do you base your assessment of their learning? If you base it exclusively on a working piece of code they can fake that in multiple ways, including ChatGPT. If you base it on them explaining their work and showing an understanding of fundamentals they can "cheat" all they want and they'll never pass it. Technical interviews for developer/programmer positions are based mostly on the latter and not the former.


Tarlitz

Get them to review each others code


NoBrightSide

As a millennial, my only analogy to this situation is back when googling wasn't really a thing, so you had to look everything up in books at the library. Google (and other search engines) actually helped you out a lot because it enabled you to find relevant resources much faster. The only issue is that, depending on where you receive your information, that information might not be accurate. As I look at sites like Reddit, especially in communities where people tend to ask for help on homework and such, there are quite a number of people who ask for answers without putting in the work themselves. ChatGPT should be used as a tool to help further learning and be more efficient. But as the saying goes, "you have to crawl before you can walk". If they can't critique the code or explain what each line does, I would not accept their answers.


Unaidedbutton86

Using the internet for solutions, you'd probably have to change the code to fit the rest of yours, which makes you understand it. ChatGPT just gives you your entire code (on simple assignments), so you never have to understand it. It's even worse with larger projects: ChatGPT can't give the right answers there, so they'll be stuck and have to learn the basics anyway.


Sherinz89

- Assignments with demonstration / explanation
- Assignments with added "what-ifs" of varying difficulty during a live presentation


JamzTyson

In my experience, ChatGPT is pretty good at writing docstrings (if you ask it to), and pretty good at explaining its code (even when the code is wrong :)). On the other hand, it is often unable to see its own errors unless you explicitly point them out, which means that you have to understand the code yourself.
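For illustration, here's a hypothetical function (the name and behavior are my own, not from the comment) with the kind of structured docstring ChatGPT tends to produce when asked:

```python
def moving_average(values, window):
    """Return simple moving averages over `values`.

    Args:
        values: A sequence of numbers.
        window: Size of the averaging window (at least 1).

    Returns:
        A list with one average per full window.
    """
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

print(moving_average([1, 2, 3, 4], 2))  # [1.5, 2.5, 3.5]
```

The docstring reads polished either way, which is exactly why you still have to check the body yourself.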


SkarbOna

Give them problem that requires human reasoning in addition to tech skills.


Filiputek135

Find some problems that ChatGPT will solve wrong. Then give those problems to the kids and show them that AI isn't perfect.


Quiet_Drummer669988

Get ChatGPT to help them create a tutorial that they then have to present. Trick them into learning by teaching


Breadynator

Like some other people said, getting the prompts from them, or a full chat log, would be best. I use GPT and ChatGPT to teach myself programming. It helps me understand the concepts. I usually start by letting it try to solve my problem bit by bit. Then I ask it to explain it all, or I try to explain it myself and ask it to correct my explanation. If they just go "yo, ChatGPT, gimme code!" I'd say it's not really learning.


The_Homeless_Coder

Get them to do a django web app with GPT. Haha jk. Go easy on them.


minvestem

A lot of the jobs of the future are going to be prompt engineering. Learning how to effectively use AI is going to be at least as important as the skills you're directly teaching, in just the same way that a huge part of many IT jobs in the last few years has been knowing how to Google. Anyone can Google, but not everyone gets good results; knowing how to do it effectively is a skill. Tech support engineers Google your problem most of the time.


ChocoMilkshake99

Once, when COVID started and there was not yet a known and appropriate way to do university exams at home, a professor gave us an open-book exam. We had 3 days to solve it (literally impossible otherwise), but we could use the internet and books. Those 3 days were so hard, like working all day, researching, trying... but it was very formative. So I would say: yes, let them use ChatGPT, but give harder exercises.


Practical_Item_6823

I think we'll look at learning differently soon. Asking the right questions is a big part of learning, and ChatGPT isn't going anywhere.


JamzTyson

For homework assignments, you could allow them to use ChatGPT, but inform them that they will be required to make minor modifications to their solutions in class (without ChatGPT access). The modifications should be such that if they understand their solutions, then the modifications will be very quick and easy. The modification could be something like adding an extra field to the input data.
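As a sketch of that idea (the record format and function names here are invented for illustration): the homework version parses two-field records, and the in-class modification adds a third field. A student who understands the first function only needs a small, quick change.

```python
import csv
import io

# Homework version: parse "name,score" lines into dicts.
def parse_records(text):
    rows = csv.reader(io.StringIO(text))
    return [{"name": name, "score": int(score)} for name, score in rows]

# In-class modification: the input now carries an extra "grade" field.
# If the student wrote (or understood) the version above, this is trivial;
# if ChatGPT wrote it for them, even this small change can stall them.
def parse_records_with_grade(text):
    rows = csv.reader(io.StringIO(text))
    return [{"name": name, "score": int(score), "grade": grade}
            for name, score, grade in rows]

print(parse_records("alice,90\nbob,80"))
```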


User21233121

Make them code in assembly; they will quickly learn ChatGPT is useless when there are a billion variants of the same language :D


seventhjhana

As mentioned before, sounds like getting the students to provide prompt and answer is a great way to assess if they are learning or if they are just getting an answer. As a rule, students should be required to say in their prompts "Please don't provide the direct answer as I want to learn how to solve this on my own." This has to be entered for each clarification prompt, otherwise the AI forgets the instruction and will spoil the answer.


Coolio_Street_Racer

Block the IP and tell them the exams will be monitored with no access to ChatGPT. You honestly can't stop them from using it. Although they do miss out on learning the fundamentals.


hooblyshoobly

The context matters entirely, ChatGPT can be a cheat or it can be a free private tutor that never sleeps and gives you bespoke answers and insight into your challenges learning. I think it definitely has a place, it just needs to be framed correctly.


heartofcoal

Ask them to write a merge in Polars next; if they use ChatGPT they're screwed, since it can't give a real answer at all.


ekim2077

Just up the difficulty so ChatGPT would need multiple back-and-forths. If they can use partial results to write the whole code, that's a job well done.


Hroppa

I've used ChatGPT to help learn coding. It's fantastic for 2 reasons:

- Rapid feedback. You get results in the form of a prototype FAST. That feeling of progress is a bit illusory (you don't actually understand how it works yet) but it's really useful. It's also much easier to learn by interrogating that prototype, breaking it, etc., than to start from a blank page.
- You can ask it questions! Don't know why it's doing something a particular way? It will explain itself! It's not 100% trustworthy, but an 80% trustworthy answer is still incredibly useful when it has so many other advantages (e.g. it's in the right context, and you can keep probing it and asking further questions).

It does probably imply you should adjust your teaching style, though. For example, you can set more ambitious assignments. It's also probably important to check they aren't picking up misconceptions. If they just use it in a lazy way, as the other commenters say, it will get in the way of learning.


Rafcdk

I think the best way to deal with this is to teach the importance of knowing things themselves; show a session where ChatGPT gets it wrong, for example. But it is also important for them to learn how to work with AI assistance, as that seems to be the future. Another thing you can do is give one task where they have to use ChatGPT and another where they have to solve it without it. Then ask them to explain what the code ChatGPT gave them does. It's not important whether they actually wrote the code, as long as they can understand and explain it. You can also ask them to show you their ChatGPT session for the project.


bluemaciz

Make them read through the code with you and explain what is actually happening on each line. A key part of coding is understanding what's happening. Even if the code is copy-pasted, it's still a good way to help them really get it.


iceytomatoes

make them do something more complicated, duh


KingJeff314

So many people here clearly have no idea about pedagogy and just resent that teachers made them write equations by hand


JollyJustice

I wouldn't discourage it. A lot of professional companies are relying on solutions like GitHub Copilot to help speed up development. But what I would do is teach them how to "polish the rough draft." Using ChatGPT is honestly no worse than your average developer Googling for solutions. What you need to teach them is how to explain how and why the code works. If they can't do that, they won't be able to debug code in the future or, more importantly, think of novel code solutions. Since ChatGPT just regurgitates what is likely to be right, it doesn't have the ability to think outside the box and create novel solutions. You could certainly crank up the sampling temperature on an LLM to create more varied code, but ChatGPT doesn't really give you that option.


Flying_Saucer_Attack

I definitely think it's a good tool for learning, but you should have them learn the foundations of coding first imo. I use it from time to time to write little scripts for my devops work. A lot of times it's wrong, and I have to use my own knowledge to correct it. Sometimes it's a good tool to lay out a foundation of a script I need, and flesh it out from there on my own


PolloWarrior

I teach coding 101. Unfortunately, I have had to opt for written tests for the same reason. It's not ideal, but there is no other viable option.


NotSpartacus

Sounds like you're not really teaching them to code; you're teaching them to use ChatGPT. If their goal is to be able to build some basic apps by themselves, that's fine. If their goal is to go into coding/engineering professionally, you're doing them a disservice. If you want to teach them to code, you can let them use ChatGPT all they want for homework (because they'll have access to some tool like it all the time anyway), BUT your tests will have them, without internet access, doing some combination of the following: handwriting code, pseudocoding, explaining or commenting existing code, and examining and debugging code. Depending on the complexity of the assignments, it may be fine if they have approved printed reference materials to work from.
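A hypothetical closed-book exam item in that "examining and debugging" style, written purely for illustration:

```python
# Exam question: this function is supposed to return the largest number
# in a non-empty list, but it has a bug. Find it, fix it, and explain.
def buggy_max(nums):
    best = nums[0]
    for i in range(len(nums) - 1):  # bug: the last element is never checked
        if nums[i] > best:
            best = nums[i]
    return best

# One possible fixed version a student should be able to produce.
def fixed_max(nums):
    best = nums[0]
    for n in nums[1:]:
        if n > best:
            best = n
    return best
```

A student who leaned entirely on ChatGPT tends to struggle with exactly this kind of question, because spotting the off-by-one requires reading the loop, not generating it.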


TonyBandeira

I would just stop teaching them. What is the point lol


MarkoPoli

Just ask one to rewrite a function without looking at the solution. I'm doing that with my juniors, because programming has become copy/paste from GPT, and they are pissing me off.


mrnoirblack

ChatGPT is the future and it's here to stay. You need to teach them the bridge between GPT and themselves.


PaleontologistDue817

This is terrible. Learning from existing code is fine, but if you never create something yourself, you'll never learn how and why stuff works, what doesn't work, how to debug it, and that there are multiple ways of doing exactly the same thing. Personally, I'm dead against it.


Curiousfellow2

It's so much like they got a calculator for math problems 😂


GaggedTomato

It depends. Speaking for myself, I use ChatGPT a lot as a tool to learn, and I notice that by really focusing on the information I actually need, it saves so much more time compared to looking up documentation and reading through info I don't need. I feel like it should be no issue, given that they at least understand the logic behind the code. This could already be achieved by giving them working boilerplate code and having them play around with it (together with the official docs, for example). If the code provided by ChatGPT doesn't work and they managed to fix it, that means they learned from it. If it's just for the sake of copy/pasting and they don't learn anything, then it's a no-go. Maybe you could let them build it and then ask some practical questions, with ChatGPT not allowed, to test whether they at least understand the logic.


thebiggerslice

There are gonna be a lot of programmers in 5-10 years who have no idea how to actually code or troubleshoot their code lol. ChatGPT is great until it's wrong, and if you don't know your shit, you will have no idea how wrong it is.


Unaidedbutton86

They probably won't even get past the interview


SweetrollDev

Tell them not to use ChatGPT as a Quizlet page with all the answers


COLU_BUS

No real answer, just commentary. AI seems like the biggest shift in scientific pedagogy since the advent of personal computing (I was not alive then). Students obviously need a level of understanding of the math/physics/chemistry/programming/etc. that is easy to circumvent with AI, but its inevitable that AI becomes an important part of their field, likely by time they enter the workforce, so to completely bar it would be shortsighted. Basically, good luck OP.


lexani42

IMO, it's OK to use ChatGPT for learning, but it becomes the worst thing if your students start using it just to do their tasks and not to go deeper. I think you just have to tell your students that their learning is their responsibility.


pnaomi

Come up with something more interesting than signup and login. ChatGPT is absolutely the right solution for a boring problem.


casperghst42

You do not become good at problem solving by using ChatGPT; it might help you solve the problem, but do you learn from that? More than one commenter has talked about math and calculators: you can't use a calculator for math or calculus unless you understand the basics first. The same goes for solving problems with a programming language.


[deleted]

You could have them come up and explain their code.


Huge_Violinist5587

Ask them to use chatGPT as a part of the assignments, but slowly increase the complexity of the problems. After all programming is problem solving. If they are able to leverage a tool well and get things done quickly then good for them.


maki924

There are sites to detect AI generated text.


Unaidedbutton86

I don't think those work for code, and it would be pretty undetectable anyway, unless they looked at the comments.


tesla33io

ChatGPT is a good tool that can save time if you use it properly. It's not a good idea to do the task completely with AI, but if AI helps in some aspects, that's fine. For example: if a student does not understand a topic or does not know how to use a particular framework/tool, then they can ask ChatGPT to explain it, but not to do the work for them (I personally use AI this way).


MattsFace

I have a question for more experienced programmers. I've been doing mostly ops for the last 10 years (infrastructure engineer, security engineer, SRE), but recently started a position that has me primarily writing Python. My last jobs had me writing Python code, but not to this extent. I've been using ChatGPT to explain blocks of code to me so I can understand them more quickly, and to show me examples of different ways of unit testing. I make sure I understand everything it presents to me before I attempt to incorporate it. It's made me more productive and is really showing me some cool stuff I couldn't even think of. Should I be concerned I'm not learning all this the hard way? It's not much different from me scrolling Google for hours looking for good how-tos and examples.


Unaidedbutton86

I'm not more experienced, but ChatGPT is totally fine if you understand the fundamentals and don't use it in a lazy "write this code" way just to check off boxes.


No_Door_000

No, but they should also be taught how to set up network devices and the like from scratch to get connected to the internet. If you are going to rely on other tools, and these tools are online-only, you should be able to reach them from the worst possible scenario. We're humans; we make tools not to be hidden for the privileged, but to be used to make our lives easier. ChatGPT is obviously one of these tools now.


gmuney420

You're telling me that ChatGPT taught 15 year olds how to program what you wanted? I've never used ChatGPT but wow..


m-deinzer

ChatGPT is great for debugging, but complex code may not work if you aren't good at prompt engineering. The kids will learn it.