Computers understanding humans makes codebases irrelevant

2023 Apr 08  |  8 min read  |  tags: blog (4) gen-ai (1)

A codebase is actually dead code

Look at the codebases you have. They are all "dead code". Code that doesn't change. Code that doesn't mutate.

  1. A business analyst sits down, and gathers relevant business requirements.
  2. This list of business requirements is passed down to a developer.
  3. The developer translates each business requirement into logical steps.
  4. The developer translates the logical steps into computer code.
  5. The developer deploys the group of pre-programmed logical steps (computer code) for execution, to be used in the business.

We call this collection of computer instructions a "codebase".

This is an extremely static process that generates static output. All relevant needs were anticipated and programmed beforehand. Once the codebase is created, it doesn't change. Its capabilities don't change. It will only perform the business functions that the human programmed it to do.

The entire codebase is "dead code".

Once the codebase is built, it doesn't change on its own. It is only changed manually, through another business analyst -> developer -> codebase cycle.

It remains dead. It remains unchanged until a human manually modifies it.

Why is the developer even writing code?

Now, look at that developer.

The developer is simply a translator from human-speak to computer-speak.

If you look at this entire process from above, you'll see the asymmetry -

  • The human can understand computer-speak, but the computer cannot understand human-speak.
  • The human can understand what the computer is doing. But the computer cannot understand what the human is doing.

In all of this, the computer is simply a stupid executor. Nothing more.

Our fundamental problem has been that human languages and computer languages are different. Computers couldn't interpret human language, but humans could interpret computer language. There was an asymmetry.

Humans who can understand computer language are relatively limited in number, but we have enough of them to get most of our problems modelled in software.

We solved the problem from the human side, while computers remained relatively stupid and unable to understand the human.

A non-computer-fluent human communicates instructions to a computer-fluent human, who communicates instructions to the computer.

The bottleneck to computer-based solutions has always been humans. Humans who are fluent at communicating our instructions to computers are relatively rare. We are still in the stage where humans communicate with computers directly in computer language. And this problem has simply been too expensive to solve from the human side.

Now, imagine that this asymmetry is solved.

Imagine that computers can now understand human-speak.

Now, when the business analyst gathers requirements, the business analyst can directly tell them to the computer. The computer can understand the human-speak.

The computer will simply translate the instructions into code. Since the computer can generate code tremendously faster than a human can, the computer can practically generate code to execute on the fly.

This is where things actually take off.

Now, there is no human bottleneck to codebase creation. The business analyst states the business requirements, and the computer implements those requirements in code right then, right there. In an instant.

We have arrived at "just in time code". Instant code delivery whenever you need it.

Now, take this a step further.

Why is the business analyst even gathering requirements?

It is because the human developers have human limitations. There is a limit to how much logic can be implemented in a reasonable period of time. Since we cannot implement everything, we have to figure out the most pressing requirements and prioritise their implementation in the codebase.

Now since the computer itself is generating code on the fly, there is no limit to the amount of code that can be generated. Code has become something very trivial now.

Since you can now generate a practically infinite amount of code, you can now support practically infinite business requirements.

There is no need to narrow down and prioritise business requirements, as there is no limit to the number of requirements that can be implemented in code.

So now, you don't even need a business analyst to gather requirements. Just tell the computer what you want done, and the computer will figure out the code and get it done.

You just ask the computer like you'd ask a human. The computer will figure things out.

The very concept of codebases is dead. The very concept of requirement gathering is dead.

Why is there a codebase?

Till now, we thought from the perspective of the codebase. The computer generates a codebase. But ask the question - do we even need a codebase now?

The answer is no.

Codebase was a collection of fixed capabilities, created by translating fixed business requirements into code. Now there are no fixed business requirements, and there is no fixed code. So, the very concept of a codebase is dead.

You simply give your requirement in the current moment, and a loop will kick off:

  1. You provide your current requirements - what needs to be done
  2. Computer breaks down requirement into logical steps
  3. Computer translates logical steps into code that it can execute
  4. Computer executes the code to get the work done
  5. Computer "forgets" the generated code and waits for your next requirements
  6. Back to step 1
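The loop above can be sketched in a few lines of Python. This is only an illustration: the `llm_generate_code` function is a hypothetical stand-in for a call to a code-generating model, stubbed here so the sketch runs on its own. The key point is step 5 - the generated code lives only for the duration of one request and is never stored.

```python
def llm_generate_code(requirement: str) -> str:
    # Hypothetical stand-in for an LLM call: in a real system this would
    # send the human-speak requirement to a model and return its code.
    if requirement == "sum the numbers 1 to 10":
        return "result = sum(range(1, 11))"
    raise NotImplementedError(requirement)


def fulfil(requirement: str):
    code = llm_generate_code(requirement)  # steps 2-3: requirement -> code
    scope = {}
    exec(code, scope)                      # step 4: execute in the moment
    return scope.get("result")             # step 5: 'code' is then discarded


print(fulfil("sum the numbers 1 to 10"))  # prints 55
```

Nothing persists between calls to `fulfil` - there is no codebase, only a requirement and an ephemeral implementation of it.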

You get a need in the moment. Computer implements, executes and gets the need fulfilled in the moment. There is no need to store a set of instructions. There is no need to save a codebase.

This is exactly how we humans operate:

  1. We get a thought of doing something.
  2. The brain generates thoughts & instructions that move the body and get things done.
  3. The instructions "disappear" and the brain waits for the next thought.

Look at this from the same lens, and you'll see that we humans too operate on ephemeral codebases. Instructions are continuously generated just in time, executed, and forgotten. We already have just in time executors. And we've come to expect the same from our computers.

Why do we even need codebases now? We don't.

Conclusion and future needs

I first wrote down the thought behind this blog as part of my personal thesis on computing in 2019. That was before I set my direction and started my autodidact studies. I saw that computing as a whole was naturally moving on from imperative to declarative. But I never imagined that the timeline could progress this fast.

Right now, I can clearly see a reduction in demand for programming as a profession. People claim that computer-generated code isn't good enough. In reality, 80% of human-generated code isn't good enough. Look around and you'll realize that the bar for average is actually very low. The bottom of the programming pyramid is huge, and is currently getting easily consumed by AI (specifically in this case, LLMs).

One future need that I can think of is schools for computers. Currently we have human schools, where society funds an environment for humans to study, think, model the world, and come up with what to do next. Just like human schools, we will also have to set up schools to train computers to think and execute.

Whether society wants to set these up or not is society's problem. But I don't think this is avoidable. Computers are simply too useful to ignore right now. And tribes of people crave getting an advantage over one another. So tribes of people will pour resources into making sure that their computer is the best.

I don't have anything more to add to this thought. This is the end of this blog.

Now go, think for a while.