
Can AI Code Assistants Really Teach Junior Developers to Code? Ask a Redditor


Anyone paying attention to the marketing language around AI code assistants will recognize the push to position these tools as not merely developer-led (in the vein of “Trust, but Verify”), but also as experts able to lead developers. Take, for example, GitHub Copilot. Earlier this month, during the opening keynote for GitHub Universe, Colin Merkel, a Software Engineer at GitHub, explained that:

with Copilot, you now have an expert available anytime to answer questions about your code base right from your browser or IDE.

Of course, as I have already written, mid- and senior-level developers don’t accept this line of argument at all. Most tend to view these tools as occupying the subservient role of “an endlessly enthusiastic savant intern.” I have yet to encounter a single experienced practitioner who believes AI code assistants can do without their oversight and direction. Nonetheless, vendors have been eager to suggest that their AI code assistants are increasingly able to stand in as authorities.

It makes good business sense for GitHub and others marketing AI code assistant products to suggest that these tools no longer fill a supporting role, but have instead apotheosized into full-fledged experts. AI code assistants able to program at the level of senior engineers would be an obvious boon for businesses with software engineering departments. Companies want to see tangible productivity gains before investing in these tools, and team happiness has so far been a less compelling selling point than increased throughput. Aspirations and intentions aside, the AI code assistant expert remains in the realm of science fiction. What may happen in the next year, five years, or decade remains to be seen. Perhaps with products personalized to a company’s codebase like GitHub Copilot Enterprise even the notoriously skeptical professional developers that congregate on Hacker News will acknowledge the possibility of an AI code expert.

But today there is a demographic of developers that already treats AI code assistants as experts: junior developers.

The needs and experiences of early career developers around AI code assistants have been woefully understudied. This is a shame, as nowhere is the issue of whether AI code assistants are experts or interns more important or contentious.

I argue that beginning programmers can benefit from AI code assistants. New developers should exercise caution until (if ever) these tools can actually serve the role of experts, but there is no sense in gatekeeping this tech. AI is for tinkerers: beginning, expert, and every ability level in-between. New learners should be encouraged to experiment with these tools early and often in order to discover some of the unique and surprising ways that AI has already augmented the education of many new developers. My goal in this post is to shine a light onto how this segment of the developer community—a group which is as enthusiastic and curious about the potential of these AI code assistants as their more seasoned counterparts—is using this tech. Pedagogical use cases for these tools are only beginning to be theorized, and what group is better situated to discover them than newbies themselves?

 

Is This A Junior Developer?

Before hopping into my findings, I want to outline several of the unique challenges that hindered this inquiry. First, because developers are always learning, it is necessary to differentiate between the experiences of junior developers learning how to code, and mid- and senior-level developers who can already code but are using AI code assistants to learn a new language, tool, library, or part of the stack. Both of these developer types use AI code assistants, but for this article I am only concerned with the former.

Another challenge is the imprecision and negative connotations attached to the junior developer rank. The term “junior” promotes ageist assumptions equating relative youth with inexperience or, worse, incompetence. What is more, the term “junior” is unreliable and prone to ambiguity. Experience level is generally self-reported and subjective, particularly online. Ability often has to be inferred when researching the experiences of new learners. External hiring practices within the discipline offer little help. After my own stint as an intern for an engineering department, I jumped straight to the rank of “Frontend Engineer” not owing to my virtuosic skill but because my company didn’t have juniors.

It’s also worth mentioning the sources and methods I employ, and why I ultimately chose to focus on Reddit. There is a lot of promoted content about AI targeted at early career developers which I do not consider useful to my inquiry. FreeCodeCamp has extensive content tagged #AI, such as “How to Integrate AI into Your Serverless App With Amazon Bedrock” by Sam Williams, an AWS Consultant and Trainer. Similarly, the Code Newbie Community feed includes a post by Ariel Weinberger entitled “Every Developer Should Know How To Add AI To His Toolkit!.” Although these types of posts reveal the ubiquity of AI in the early career developer community, they are less interesting to me for two reasons. First, this is top-down content fed to junior developers, not grassroots examples highlighting their unique voices and perspectives. Second, I saw no promoted posts in this type of forum that urged new learners to use AI code assistants as tutors. This was curious to me, as the use case of the AI tutor is touted so frequently. Khan Academy’s AI tutor Khanmigo, for instance, promises to automate computer science instruction:

Learning to code has never been more accessible. Khanmigo’s interactive experiences and real-time feedback will help learners hone their computer science skills.

Khanmigo, which is still in beta, is intended to democratize access to 1-on-1 instruction while offering significant support to overburdened educators. However, as we near the end of 2023, Khan Academy’s AI tutor has not caught on in the developer community as a means for reskilling or upskilling.

Reddit is hands-down the most vibrant community hosting conversations on the subject of how junior developers use AI code assistants. Unlike StackOverflow or Hacker News, which have historically been unwelcoming to new coders, Reddit has whole subreddits devoted to learning how to program that act as safe spaces for these newbies to discuss their needs, struggles, and experiences. The remainder of this post dives deeply into these important conversations. It is my hope that by offering qualitative insight into the AI code assistant’s pedagogical function we in the tech industry can expand the conversation around this transformative technology, both in terms of personas and use cases.

 

“I just copy and pasted the code to see if it worked (it did)”

The subject of how AI code assistants succeed as instructional aids is taken up with great enthusiasm across the span of Reddit’s programming communities. Many host questions and conversations relating to whether GPT should be used to reskill in the discipline of software engineering.

Below is one illustrative example from the r/learnprogramming subreddit posted by user prog_22:

Do any of you use chat-gpt to learn (should I use it)?

I created a chess game with vanilla javascript

  • Phase 1: just got it working.
  • Phase 2: Refactored it into different modules (with vite).
  • Phase 3: convert it into TypeScript (as I’m not yet 100% confident with ts, I need to mess around with just js first).
  • Phase 4: gave chat my validatePawn module and asked it questions.*
    • review it for me = not all that useful but if I ever go on an interview and if I talk about my code like chat talked about it, it’s better than nothing
    • if I want to refactor it using design patterns, which ones apply? and it told me:
      • Chain of Responsibility, Template Method, Visitor, State
    • Then I asked it to rewrite the method using each pattern and it did

I just copy and pasted the code to see if it worked (it did).

I could then learn how those patterns work and ask it more questions: “I have this Chain of Responsibility pattern for validatePawn, I know nothing about Chain of Responsibility pattern, please explain”

Or is this total bull and it will lead me to nowhere?

Before considering the responses to prog_22’s question, it’s worth pausing to tease out how this junior developer uses ChatGPT as a teaching aid. Before turning to AI, prog_22 has already learned enough JavaScript to create a game, likely through online video tutorials such as Ania Kubów’s “Code CHESS in JavaScript (Super simple!).” These online guides typically include boilerplate code snippets that are either shown on the screen or else packaged in a repository for easy download (here’s one). In my experience, tutorials like Kubów’s are exercises in copypasta or, if the student is feeling ambitious, typing out code blocks character-by-character as they appear on the screen. Regardless of which strategy prog_22 adopted, they refactored their chess code to make it work in alternative frameworks and libraries. Only then did prog_22 use ChatGPT to further tinker with their project using design patterns and by requesting a generic code review.
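To make the design-pattern step concrete, below is a minimal sketch of the kind of Chain of Responsibility rewrite a chat assistant might hand back for a pawn validator. Prog_22’s actual validatePawn module is never shown in the thread, so every name here (the Move shape, the rule classes, the board orientation) is a hypothetical stand-in, written in the TypeScript prog_22 was working toward:

// A minimal, hypothetical sketch of a pawn validator refactored with the
// Chain of Responsibility pattern. prog_22's real validatePawn module is not
// shown in the thread, so the Move shape, rule names, and board orientation
// (row 0 at the top, white pawns moving toward row 0) are all assumptions.

interface Move {
  fromRow: number;
  fromCol: number;
  toRow: number;
  toCol: number;
  isWhite: boolean;
  targetOccupied: boolean; // is there an enemy piece on the destination square?
}

// Each rule checks one concern, then hands the move to the next link.
abstract class PawnRule {
  private next?: PawnRule;

  setNext(rule: PawnRule): PawnRule {
    this.next = rule;
    return rule; // returning the new link allows fluent chaining
  }

  validate(move: Move): boolean {
    if (!this.check(move)) return false;
    return this.next ? this.next.validate(move) : true;
  }

  protected abstract check(move: Move): boolean;
}

// Pawns may only move toward the opposing side.
class DirectionRule extends PawnRule {
  protected check(move: Move): boolean {
    const delta = move.toRow - move.fromRow;
    return move.isWhite ? delta < 0 : delta > 0;
  }
}

// A straight push must stay in its file, land on an empty square, and only
// cover two squares from the starting rank. (Blocked double pushes and
// en passant are left out of this sketch.)
class PushRule extends PawnRule {
  protected check(move: Move): boolean {
    if (move.fromCol !== move.toCol) return true; // not a push; defer to CaptureRule
    const distance = Math.abs(move.toRow - move.fromRow);
    const startRow = move.isWhite ? 6 : 1;
    const maxDistance = move.fromRow === startRow ? 2 : 1;
    return distance <= maxDistance && !move.targetOccupied;
  }
}

// A diagonal step is only legal when capturing an occupied square.
class CaptureRule extends PawnRule {
  protected check(move: Move): boolean {
    if (move.fromCol === move.toCol) return true; // not a capture; handled by PushRule
    const oneFileOver = Math.abs(move.toCol - move.fromCol) === 1;
    const oneRankForward = Math.abs(move.toRow - move.fromRow) === 1;
    return oneFileOver && oneRankForward && move.targetOccupied;
  }
}

// Wire the chain once and reuse it wherever validatePawn was called before.
const pawnChain = new DirectionRule();
pawnChain.setNext(new PushRule()).setNext(new CaptureRule());

console.log(pawnChain.validate({
  fromRow: 6, fromCol: 4, toRow: 4, toCol: 4, isWhite: true, targetOccupied: false,
})); // true: a white pawn's two-square opening push

The appeal for a learner is that each rule isolates a single idea (direction, pushes, captures) that can be asked about, tested, and swapped independently, which is exactly the kind of follow-up questioning prog_22 describes.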

It is interesting that prog_22 does not use the numerous AI code assistants that can be added to the text editor, and instead confines themself to OpenAI’s ChatGPT. As the examples I discuss later demonstrate, this is not uncommon, and I think it has much to do with friction and familiarity. ChatGPT supports a copy and paste workflow that feels natural for developers wary of learning new tools and UIs. In fact, there is tremendous overlap between the typical pattern of copy and paste used in tutorials and supported by forums like StackOverflow, and what prog_22 describes doing with GPT: “I just copy and pasted the code to see if it worked (it did).” In this way, AI code assistants are just the newest permutation of copypasta up/reskilling.

But, of course, AI code assistants extend the copy-and-paste tutorial experience so familiar to self-trained developers. For this reason it is telling that prog_22 expresses anxiety about whether GPT is a useful pedagogical tool, or whether it is even safe. In fact, the purpose of their post is to ask questions: “should I use it? … Or is this total bull and it will lead me to nowhere?”

 

Hey, who’s driving this thing?

Without a deep understanding of development best practices or a structured education in computer science, junior developers who use AI-generated code snippets have effectively let AI take the wheel and steer. In the hands of new developers, ChatGPT, Copilot, Duo, CodeWhisperer, Cody, and others are no longer code assistants, but code drivers.

Prog_22 reveals their own uncertainty about whether copying and pasting AI-generated and imperfectly understood code is acceptable. However, the risk seems worth it to this Redditor because by asking ChatGPT questions about the code it generates they are discovering more about what it does. Of course, there is no guarantee that ChatGPT’s explanations are correct. Prog_22 is learning from a tutor that must be constantly fact-checked. By all measures, this unreliable instructor introduces a topsy-turvy reality. Tutors are supposed to fact-check their students, not the other way round. And yet a reversal of roles is typical for online education. Our post-truth reality has democratized access to wide swaths of human knowledge, while at the same time obscuring facts within algorithms tuned to maximize views rather than advance truth. Today it is more important to know how to track down information and then determine that source’s reliability than to know everything outright.

Prog_22’s question about whether to use ChatGPT is not out of character for Reddit’s programmer community; permutations of this basic question appear in numerous other developer subreddits.

Clearly there is significant demand among junior developers occupying every level of the stack (frontend, backend) and multiple verticals (game developers, web dev) for AI code assistants to serve as tutors. However, these self-described beginners worry that they err in using these tools.

Let’s now turn to a representative sampling of the responses that questions like prog_22’s attract. Over and over, the answer to queries about whether AI code assistants should be used to learn programming is a resounding No. According to the bulk of Redditors participating in developer subreddits, early career developers should avoid these tools. As Redditor Slypenslyde elaborates in the C# community:

The problem with trying to learn at this level is so far nobody’s made an AI that focuses on being correct because that is difficult. So asking ChatGPT a question is about the same thing as asking Google and picking the “average” of each tutorial. Considering some example code is dead wrong and other example code is wrong in subtle ways, there’s a chance ChatGPT will show you things that seem to work but fail subtly. A newbie is the person worst-equipped to deal with untrustworthy code, and an AI is something most well-equipped to deliver it. (For example, I’ve seen people get weird answers from ChatGPT and when they ask it, “Can you tell me why you think that?” it happily sends them to whackadoodle conspiracy theory sites to back up its claims.)

Many offer the caveat that once the junior developer reaches a certain level of ability they can begin using AI code assistants to speed up and support development, but for newbies these tools are unreliable. According to BigYoSpeck:

An absolute beginner should stay well away from it just like a child learning arithmetic should leave a calculator well alone. But once you’re into more advanced work it’s a good brainstorming tool.

Fair enough. The sages of Reddit emphatically counsel junior developers to avoid AI code assistants, and some target new programmers explicitly because this demographic is overly dazzled by it. According to House13Games:

Get a book and learn from a proper source. Chatgpt writes shit, buggy code.

I see beginners getting impressed all the time though. Probably because they can’t see what’s wrong with it.

It’s worth pausing here to state that although the notion that AI code assistants write bad code is not universal across Reddit or the dev community more broadly, suspicions about this tech are not unwarranted. The problem of hallucinations remains unresolved—an issue that general AI chatbots like ChatGPT exacerbate by not being trained on a particular knowledge base. Dedicated AI code assistants, and those utilizing the latest LLMs like the recently released GPT-3.5 Turbo model or even GPT-4, are bound to do better. Consider that many Hacker News users laud GPT-4’s abilities, and vendors are sharing impressive statistics regarding the amount of code written by their commercial AI code assistants. Tabnine’s homepage, for instance, claims that its assistant generates “approximately 30% of code for millions of users.”

 

Will the Real Junior Developers Please Stand Up?

Most of the commenters on posts asking whether or not to use AI code assistants to learn are not new coders themselves, and their answers don’t reflect how early career developers actually act and feel. The anti-ChatGPT-for-learning camp is weighted toward mid- and senior-level developers. Focusing too much on these responses and reply guys is therefore unwise: it not only fails to reflect the lived experiences of developers in the process of reskilling themselves, it actually obscures their distinctive perspective.

I went looking for the voices of junior developers to see if what they have witnessed bears out the warnings of more senior devs. Unsurprisingly, many do. One Redditor draws from his own disastrous experience to caution against copying and pasting imperfectly understood GPT-written code snippets:

I recently wasted 30 minutes trying to get it to do something with mpv that wasn’t actually supported by mpv (I forget all the contexts). GPT just flat out lied to me: “Yes, MPV does support X thing, here is how to set it up…”

After enough coding and scratching my head, I finally checked the documentation myself and caught onto what was actually happening, I told GPT “That feature isn’t supported” and it replied “That’s correct, MPV does not support X feature. Here are some alternatives.”

This developer would have been better off running a command+F search in the MPV documentation instead of asking ChatGPT. But the ability to bypass technical documentation is precisely the use case so many developers of all ability levels want. As anki_steve explains, ChatGPT:

removes the tedium of referring to the manual for simple questions with simple answers, making the process much more enjoyable for those of us without photographic memories.

 

Education Beyond Copypasta

That's his name: Henry Jones, Junior.

So now I am finally arriving at the upside of AI code assistants for teaching junior developers.

A remarkable number of junior developers on Reddit report having positive experiences with AI code assistants, and GPT specifically. Despite warnings from more seasoned engineers, what junior developers are finding is that AI code assistants provide what Redditor Jerricoda characterizes as:

a balance between mindless copying and pasting youtube tutorial code, and doing everything on your own but ripping your hair out because nothing makes sense.

Five years ago, self-trained engineers relied on free tutorials, often YouTube videos, to learn programming. Today, these tutorials can be supplemented with AI code assistants.

Vivid-Hat3134 shares their successful ChatGPT-enhanced reskilling journey in the r/ChatGPTCoding subreddit:

started coding about 2 1/2 years ago with a Raspberry Pi model B. Despite working full-time as a locksmith and being 33 years old, I dedicated my free time to learning programming. While I didn’t take formal classes or workshops, I made progress and developed a basic understanding of computer programming. Discovering ChatGPT and image generation AI humbled me, realizing how much I still had to learn. In the past couple of months, using ChatGPT, I’ve learned more than I did on my own over the years. I don’t rely on GPT to give me code directly; instead, I ask for examples and explanations to understand and utilize the information better. Understanding my work is crucial to me, as working without comprehension could lead to disastrous results. No matter the field. My encounter with ChatGPT had been nothing short of transformative. What began as a mere curiosity to understand AI and coding as a whole, turned into an adventure of self-discovery and continuous learning. With ChatGPT as my tutor and study partner I realized that the possibilities of coding were limitless, and my passion for technology grew stronger with each passing day.

Long story short, I made leaps and bounds in my skills thanks to ChatGPT, at a higher rate than I had expected or even hoped for. Maybe I was already becoming less ignorant or GPT really did give an educational edge, either way I wanted to share my thoughts. Sorry if this isn’t within the rules etc, just needed to share.

Now, for good or ill, I have known developers who attained mid-level status in their careers after two years, so categorizing this Redditor as a junior developer is possibly contentious. However, I contend that they fulfill the remit of my research précis. Moreover, this positive reskilling experience has profound implications for those looking to use AI code assistants as tutors.

Vivid-Hat3134 sketches out a map to AI-enhanced self-training. Instead of treating GPT as a source of code blocks to copy, they “ask for examples and explanations to understand and utilize the information better.” In this way, Vivid-Hat3134 uses GPT to accelerate their learning process and gain a better understanding of programming concepts. Importantly, this Redditor’s experience does not contradict the notion that it is unwise for true beginner-level engineers to use AI code assistants owing to the frequency of hallucinations. At its best, the code assistant becomes a “study partner” for junior developers, rather than an expert.

 

Conclusion

New programmers can benefit from AI code assistants. If approached in the proper spirit, these tools are boons to learners. Despite the difficulties I have outlined, this is a positive outcome: the train has left the station when it comes to keeping newbies away from AI code assistants.

To understand how junior developers are using AI to up/reskill you can’t do better than listen to Redditors. If you seek out the opinions of the self-described early career developers on this platform it quickly becomes clear that AI code assistants fascinate folks in the process of learning to program, and may even support them on their journey. While no one recommends blindly copying and pasting AI-generated code into a production environment, these assistants have proven themselves to be excellent, if far from expert, study partners.

Disclaimer: Microsoft (GitHub), GitLab, and AWS are RedMonk clients.
