The AI Paradigm Shift Changing Software Engineering

By Shon Shampain, Director of Mobile Engineering

It’s fair to say that the introduction of the tractor in 1892 absolutely transformed farming, increasing its output exponentially. In a virtual instant, manual plowing was relegated to history.

Something very similar is happening in software development with the advent of AI: the days of manual coding are waning. Engineers are now being asked to work in a different way, one that supports a much higher output.

A great deal is being said about what AI means for software engineering, and it’s important to note that much of it is misinformation.

The Current State of AI Development

On the one hand, there are companies building software without any AI involvement. In my opinion, they can’t be expected to survive long term unless they serve incredibly niche markets, much like boutique farming these days.

Then, there are companies doing the lion’s share of their development with AI. These early adopters have figured out the procedures necessary to support their development chain.

Left in the middle are the vast majority of companies that are still figuring things out, and introducing AI as best they can. Regardless of where they are in their journey, any company utilizing AI has realized that software engineers are critical in the process.

Up to a point, there will always be a developer calling the shots. Developers are the driving force behind the technology, so it’s critical to understand that using AI effectively means creating leverage: a single developer producing multiples of their previous output. This kind of leverage doesn’t really have a cap at the moment, and it’s why you hear that some companies are no longer hiring junior developers. The seniors have found the leverage.

The senior staff that remain will have evolved to a new way of operating. As philosopher Ken Wilber puts it, you “transcend and include.” You transcend to a higher level, but this higher level includes the core of your being that got you here in the first place. It has to be this way. Without core engineering skills, there is nothing to evolve around.

A New Chapter in Software Engineering

The new paradigm is one where most of the work falls into one of three main aspects of AI usage: selecting the correct engine, managing the context, and working the prompt. Gone are the long hours of carefully crafting meticulous code. Sure, some small amount of coding will still be required, but it is less and less every week because the work is one step removed.

Previously, code was generated directly. Now, code is generated from prompts that are generated directly. In a very real sense it’s similar to going from a coding role to an architecture role.

We learned quickly at Tala that sharing information regularly provides the best return on investment with regard to getting everyone up to speed. Engineers tend to like to solve problems by themselves and present work when it is known to be correct, but what’s interesting about AI is that the real speed comes when it’s collaborative.

To advance our knowledge on prompt engineering, we first had to set up the right infrastructure. We created a repository for prompts that are indexed to feature stories, a documentation page detailing meta prompting techniques, and a rich company culture that celebrates sharing both our victories in our AI journey, but more importantly the struggles.
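A prompt repository indexed to feature stories can be very simple in practice. The sketch below, in Python, shows one minimal way such a store might work; the `prompts/` directory layout, the story IDs, and the field names are all hypothetical illustrations, not a description of Tala’s actual system.

```python
import json
from pathlib import Path

# Hypothetical layout: one JSON file per feature story, e.g. prompts/MOB-1234.json,
# holding the prompt text plus notes on the engine and context that worked.
PROMPT_DIR = Path("prompts")

def save_prompt(story_id: str, prompt: str, notes: str = "") -> None:
    """Store a working prompt under its feature-story ID so teammates can reuse it."""
    PROMPT_DIR.mkdir(exist_ok=True)
    record = {"story": story_id, "prompt": prompt, "notes": notes}
    (PROMPT_DIR / f"{story_id}.json").write_text(json.dumps(record, indent=2))

def load_prompt(story_id: str) -> dict:
    """Fetch the prompt a teammate already refined for this story."""
    return json.loads((PROMPT_DIR / f"{story_id}.json").read_text())
```

Keeping the store in a plain repository means prompts get code review, history, and searchability for free, the same way source files do.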

In our sessions where we share experiences, it’s inevitable that two or more people have hit the same struggle, found the same answer, or decided to pursue different angles. Improvements to our process often surface in these unexpected moments, and they tend to cluster together.

A great example of this came from taking Figma designs directly to Kotlin code. Our group was hand coding UI up to a particular point in time, before some team members discovered tools that could automate the process while maintaining our coding standards and design guidelines. Almost overnight, we cut the time it takes to complete a high quality UI story, once a major task, by 50-75%.

The Quest for the Whole Codebase & the Mental Hurdle of Prompt Management

Working in a corporate environment generally involves a static set of AI engines to choose from. This means that selecting the proper engine is usually a one time effort, or at least a task that doesn’t happen very often. The real challenges for the AI engineer have to do with managing the context and working the prompt.

The context, and how to manage it, is an aspect of the game that has to be continually monitored. A public facing LLM vendor has an incentive to keep the context as small as possible, both to reduce the computation required for any query and to mitigate data privacy issues. This is in direct conflict with most users, who want the AI to have as much context as possible.

The holy grail is getting the whole codebase into the context, as well as all relevant business documents, from Slack conversations to design documents and feature requirements. Needless to say, most companies are still a long way off.

Our efforts to get our whole codebase into the context have been limited by the significant discrepancy between current context limitations (typically a few thousand tokens) and the size of our codebase (often hundreds of thousands of lines of code).

Some vendors advertise complete project awareness, but the numbers suggest they must be cutting corners by swapping things in and out of context space. While this might work in some instances, we’ve seen subpar results in others.

In response, we’ve pursued a more programmatic approach where we control the loading of the context as we navigate the codebase. It’s been a game changer for tasks like refactoring, sweeping for specific coding patterns, or upgrading coding standards across all files, but loading files and information into the context is still a time consuming, manual process.

Similarly, prompts are an ongoing concern that developers have to manage. The end goal is always to stay with a prompt (repeating the question, modified, over and over) until you get the exact output you want, then share it with your team so it can be reused. This is one of the biggest challenges companies face.
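That stay-with-the-prompt loop can be expressed as a small harness. This Python sketch is purely illustrative: `generate`, `validate`, and `revise` are hypothetical callables the caller supplies, standing in for the engine call, the check that the output is correct (compiles, matches spec), and the prompt modification step.

```python
def refine(generate, validate, revise, prompt, max_rounds=5):
    """Stay with the prompt: generate, check, revise, and repeat until the output passes.

    generate(prompt) calls the engine; validate(output) decides whether the result
    is what we wanted; revise(prompt, output) modifies the prompt based on what
    came back. All three are supplied by the caller.
    """
    for _ in range(max_rounds):
        output = generate(prompt)
        if validate(output):
            # This prompt works: this is the version worth committing to the
            # shared prompt repository for reuse.
            return prompt, output
        prompt = revise(prompt, output)
    raise RuntimeError("Prompt did not converge; rethink the context or approach.")
```

The point of the harness is the return value: not just the output, but the final prompt, which is the artifact worth sharing with the team.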

Fortunately, it is more of a mental hurdle than a technological limitation. To overcome it, developers need to be encouraged to fully make the change away from analog coding and jump aboard AI development. Every time a developer stops working with a prompt and fills in the details of some code by hand, the process stagnates.

The Goals Ahead

AI is also helping the Mobile group in other ways that are not specifically related to the direct coding experience. For example, we are hooking up actions on GitHub to have AI perform code reviews after our developers share their thoughts. This is both to double check that we haven’t missed anything, and to ensure that our coding standards and architectural patterns are adhered to.

Our ultimate goal in the Mobile group is to set up an expert system where we can link documentation, Slack threads, feature stories and the codebase all together. Due to the limitations on loading context with current AI engines, this likely involves us creating a custom indexing mechanism among the data that we consider relevant to the question at hand, and then interfacing with AI properly through an API call. It’s ambitious, but being able to link all aspects of business data together would be a game changing development.
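A custom indexing mechanism over heterogeneous business data can start very small. The Python sketch below is a toy illustration of the idea, not our planned implementation: a tiny inverted index over documents from several sources, plus a retrieval step that picks the most relevant entries, which would then be packed into the context of an API call.

```python
from collections import defaultdict

def build_index(corpus):
    """Build a tiny inverted index: word -> set of document positions in corpus.

    Each corpus entry is (source, identifier, text); in practice the entries
    would be drawn from Slack threads, design docs, feature stories, and code.
    """
    index = defaultdict(set)
    for doc_id, (_source, _ident, text) in enumerate(corpus):
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

def retrieve(corpus, index, question, top_k=3):
    """Rank documents by word overlap with the question and return the best few.

    The returned texts are what would be packed into an API call's context,
    sidestepping the window limit by sending only what is relevant.
    """
    scores = defaultdict(int)
    for word in question.lower().split():
        for doc_id in index.get(word, ()):
            scores[doc_id] += 1
    ranked = sorted(scores, key=scores.get, reverse=True)[:top_k]
    return [corpus[i] for i in ranked]
```

A production system would use embeddings and proper chunking rather than word overlap, but the shape is the same: index everything once, retrieve per question, and hand the AI only the slice it needs.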
