Language Matters, and What Matters Has Changed

If you’re a programmer, don’t fixate on the language you know. If you’re not, experiment with becoming one. LLMs (like ChatGPT) are dramatically reshaping programming.

Now Computers Talk “Human”, Do Humans Need to Talk “Computer”?

As with fellow humans, we often talk to computers with language. The big news of 2023 is that computers can now talk “human” as never before, so much so that they can really appear to be one of our species (no, any bots reading this…not you…).

Does that mean we never need to speak “computer”, i.e. program syntactically? Will LLM technology (like ChatGPT) mean we can drop this recent, evolving form of discourse, familiar to most but used by only an elite few?

I don’t believe so, at least not for the foreseeable future: there’s value in unambiguous precision, and syntactic language provides that in a way human language cannot. Instead, LLMs promise to change who can harness the power of abstracted code, and what, when and how it can empower us, bringing in a huge new cohort of “programmers” along the way.

In a few short months, “being a programmer” has started to look very different from the slowly evolving picture it had presented since the 1950s.

All Change for Coding Courses and Computer Science?

It’s worth remembering that programming (and particularly abstraction to code) has been an increasingly sought-after skill and an escalating bottleneck in the workforce, which has driven demand for coding and computer science (CS) courses, though the latter often cover more than the purely vocational skill.

Paradoxically, this most central employment of the computer age is itself subject to computer or AI takeover.

A bit like The Math(s) Fix, we might now need The Coding(_) Fix: very few people will write lower-level code in future; it will become a more specialist activity than now, and those who do it will be highly AI-assisted. That implies a different, broader-based skillset, and therefore I expect traditional coding or CS courses (if we can yet call them traditional) may have reached their peak. The demand for programming will continue to grow, indeed accelerate, but much of it will be satisfied by AIs.

Which Abstraction Language to Speak?

Then there’s the question of the best language to abstract to. What you want is the highest-level, shortest, most expressive and readable language, not necessarily the one most familiar for you to write from scratch. Does that change your selection? Wouldn’t you always have wanted that, irrespective of AI coding assistance? Well, yes, but much else affects which language gets used: which one you’re most likely to have learnt, what’s been in fashion for all sorts of reasons, whether there’s lots of existing code and coders, and whether a particular organisation has made a play to establish it to their advantage (e.g. Google with Python). The net effect has been huge inertia in the evolution of widely used programming languages. Many innovative languages are out there; few get used by most programmers.

Over time the trend has been for languages to gain commands that each do more: a wider range of structures that need more translation to get down to the chip level where processing actually happens. As computers get more powerful, the tradeoff shifts towards higher-level languages that save human time, even at the expense of increasingly plentiful computer time. But higher-level languages have more commands, so they’re not always easier for the human to start with. As with human language, there’s a potentially infinite vocabulary to learn, many constructions and so forth. This is rather daunting: there’s an energy barrier, but once it’s overcome, higher-level languages can be extremely productive and much easier to maintain; they’re almost always the better optimisation.
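The tradeoff can be seen in miniature even within a single language. A hypothetical sketch in Python (the task and names are purely illustrative, not from this post): the same problem written with low-level primitives and then with higher-level, bigger-vocabulary commands.

```python
# Illustrative task: total the squares of the even numbers in a list.
data = [1, 2, 3, 4, 5, 6]

# Lower-level style: few primitives, explicit bookkeeping.
total = 0
for x in data:
    if x % 2 == 0:
        total += x * x

# Higher-level style: one expression built from bigger-vocabulary commands
# (sum plus a generator expression) that fits the problem directly.
total_hl = sum(x * x for x in data if x % 2 == 0)

assert total == total_hl  # both give 56
```

Both versions compute the same answer; the higher-level one is shorter and reads closer to the English statement of the problem, but it presumes you know (or can look up) the larger vocabulary.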

New AI Assistance, New Optimisation, New Language?

LLM assistance is a powerful catalyst that slices through this energy barrier, as well as affecting many other aspects of the coding ecosystem. Here’s why. The LLM can work out which commands to use and assemble them, so you don’t need to know a large vocabulary—or indeed any—to get going. LLMs are good at finding the right vocab, even if they don’t always use it correctly. Once you have your first cut of code, it’s highly readable, short and easy to edit; and even if you don’t know a particular function, you can look it up trivially. In fact, because a high-level function has a more specialised, closer-to-what-you-wanted capability than the lower-level commands of a low-level language, that lookup is easier too.

The result is a high-level program that is much, much better than anything lower level you’d have to walk through, obtained without needing prior experience of that language. It’s far less important that you use a language you already know; it’s more important that you use one you can interpret and edit easily.

It’s crucial to remember that although lower-level languages don’t require breadth of vocabulary, they do require “abstraction gymnastics” in turning your problem definition in English into code: the fewer structures and commands (or words) you have to work with, the less likely they are to fit the problem naturally.
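To make the “abstraction gymnastics” concrete, here is a hypothetical Python sketch (illustrative only, not from this post): the English problem “find the most common word” first contorted through a small set of primitives, then expressed with a single higher-level command that fits the problem as stated.

```python
from collections import Counter

words = ["the", "cat", "sat", "on", "the", "mat", "the"]

# With only low-level primitives, the English problem "find the most
# common word" must be contorted into counting bookkeeping.
counts = {}
for w in words:
    counts[w] = counts.get(w, 0) + 1
most_common = max(counts, key=counts.get)

# With a larger vocabulary, one command maps directly onto the problem.
most_common_hl = Counter(words).most_common(1)[0][0]

assert most_common == most_common_hl  # both give "the"
```

The gymnastics live entirely in the first version: the problem statement contains no dictionaries or running totals, yet the small vocabulary forces them in.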

Honing that gymnastics is key to learning today’s coding or computer science, and although for some people it comes naturally and they are very good at it, for others it’s very hard. Even for the naturals, it’s not necessarily efficient in the longer term compared with putting in the time to master a wider vocabulary and set of structures. Worse, if they ever want to go higher level, it’s a surprisingly hard transition for many: if you’re used to turning everything into a few low-level structures or paradigms, then our experience is that making the transition to highly multiparadigm languages is much more troublesome than the reverse.

Net effect: abstraction gymnastics and the bias towards lower-level languages have cut many people out of programming, or out of programming efficiently.

Code-Assisted Wolfram Language

I’m pleased to say that for all these reasons, Wolfram Language is now absolutely ideal as the language to abstract to. Gone is the barrier of fewer people knowing it than Python; out goes “it’s hard to learn because of the number of functions” (more than 6000). In come “it’s 10x shorter”, “readable”, “unified” and “computable data is right there”, alongside the world’s largest unified set of algorithms.

For years we’ve struggled to cut through the initial energy barrier to Wolfram Language adoption. And now, seemingly out of the blue, an answer has arrived that’s better than we imagined. Very exciting times as we enter our 36th year. (Mathematica, the initial manifestation of Wolfram Language, turns 35 on 23 June.)

It will be so nice when code writing is as everyday for everyone as making a spreadsheet or a word-processed document: the next key stage of this industrial revolution, a further step into the AI age.

More in The Math(s) Fix!

There’s a much longer discussion of language effects on computational thinking in The Math(s) Fix. Although it was written before the LLM revolution, read with our newfound step-2 mechanisation in mind it offers a more comprehensive discussion.

How is Wolfram Language linking to LLMs? Much more in my brother’s post.