Coding assistants have improved productivity and saved millions, but they come with the risk of bias and emerging copyright concerns. ET looks at the rise of coding assistants, and their pitfalls, as adoption increases.

Six hours. That was all it took Kiran (name changed), founder of a climate tech startup, to design a static mobile application for his product using the latest AI coding tools in the market. As a solo developer, he estimates it will take another couple of months to get the minimum viable product ready, in time for his pre-seed round six months later.
A year ago, the whole process would have taken half a year, a team of three to four developers and a few thousand dollars. Thanks to AI-powered coding tools like Cursor AI, Copilot and others, it is now possible with just a few hundred dollars and in half the time, according to Kiran.
Improvements in reducing hallucination, better large language models and prompt engineering, where prompts are crafted to give the model the right context so it generates relevant output, have helped drive wider adoption. It is hardly a wonder, then, that AI coding assistants have become a mainstay in developers' lives.
These tools will have a significant impact on India, which has one of the largest developer bases in the world at 15.4 million, with large Indian enterprises already adopting them.
Srikanth Nadhamuni, chairman of Khosla Labs and cofounder of Trustt, a fintech platform, told ET that when he met OpenAI founder Sam Altman a couple of years ago, Altman had identified coding as one of the areas where GenAI could have a significant impact.
At Trustt, Nadhamuni said, the company has introduced coding assistants and has seen substantial benefits: what used to take two hours now takes just five minutes with the right prompt.
Since the launch of ChatGPT in late 2022, several AI-powered coding tools have flooded the market, offering everything from code suggestions to generating entire blocks of code based on prompts.
Enterprises are lapping it up. Take GitHub's Copilot: to date, about 77,000 businesses have begun using it. Shuyin Zhao, VP of product management at GitHub, said Copilot helps developers stay in the flow and conserve mental energy during repetitive tasks, which drain and derail focus. Abhiram R, a Bengaluru-based Python developer, said that over the last six months, repetitive work such as creating files, which used to take him six hours a week, has come down to three.
This is a significant saving in developers' time, which can be used to create more software and innovate rapidly, say experts.
Krish Ramineni, cofounder of Fireflies.ai, an AI note-taking tool, said, "AI helps our engineers become more productive and write more code in less time and helps with code reviews. We built our AI internally, an agent that can answer 30 to 40% of CS (support) tickets. So this has helped free up a lot of the stress." He added that certain tasks can now be completed almost 20-30% faster, which over time will save an engineer more than 10 hours a month on average. "The cost savings then adds up."
These tools can also be used to fast-track learning. Paras Chopra, founder of Turing's Dream, a Bengaluru-based AI residency, said these tools can make a good developer better and help those who are starting out leapfrog their experience level. Experts, however, point out that they are a double-edged sword, given concerns around bias and copyright.
Developer analytics tool GitClear analysed 153 million changed lines of code authored between January 2020 and December 2023 and found that code quality has taken a hit since the use of AI tools picked up. Developer security platform Snyk has highlighted that coding assistants have a limited understanding of software and can make systems vulnerable. All of this makes human oversight more important than ever. Bias is another big issue.
James Landay, co-director and cofounder of the Stanford Institute for Human-Centered Artificial Intelligence, earlier told ET that bias is one of the biggest issues with LLMs. With no clarity on the kind of data used to train these models, developers need to be aware of the pitfalls of relying on them. GitHub's Copilot and OpenAI are currently facing a class-action suit from developers for allegedly using licensed code without attribution, in violation of copyright law. The outcome of this case will have a huge impact on generative AI systems.