Learning to Code before & during the AI Revolution - Part 2 - From Craft to Career

As computing moved from hobby to profession, learning shifted again. Universities, commercial software, and the early web reshaped how developers were trained and how careers were built.

In part 1 of this series, I looked at how many of us joined the quiet PC nerd revolution of the 1980s and taught ourselves how to code and use computers. This part takes the journey further, into uni and beyond.

For the next five or six years I programmed in BASIC on and off. Then I went to university and studied Computer Science and Management. If I were doing it again, I would choose straight Computer Science. The management component was largely theoretical and detached from reality.

We learned Modula-2 on VAX minicomputers, a language used almost exclusively in academia. C and UNIX would have been more relevant, but in hindsight the specific language mattered less than the ideas behind it.

One thing I have learned from teaching people of all ages is this: concepts remain. Languages change.

This distinction becomes clearer with time. Syntax is temporary. Mental models are more enduring!

Totally Insane Fact: In 1988, whilst doing my first university assignment - coding a perceptron, I think - we used DEC VAX minicomputers. As is the way, everyone on the course left it till the last minute and crammed into the computer labs to get their assignments done, and the poor VAX CPU had to cope with the load. The upshot was that it took an hour to compile and link your code, which meant you could make perhaps 10 changes in one working day!

Compiling our Assignments at University in the 1980s

This was the reality of the late 1980s and early 1990s. You had to be very careful when poring over the printout of your code, and do a dry run to make sure any bug fix would actually work, because you only had about 10 chances in the day…and that submission deadline was looming.

The web arrives

The World Wide Web was invented by Tim Berners-Lee in 1990, but most of us encountered it properly around 1994 with Mosaic, a browser that could display images inline with text. That may not sound revolutionary now, but at the time it was electric. Who remembers 14.4kbps modems?

Speed then: 14,400 bits per second travelling down the wire to your computer.

Speed today: 1,000,000,000 bits per second (roughly 69,000 times faster).
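To put that difference into perspective, here is a rough back-of-the-envelope sketch in Python. The 2 MB "hi-res image" size is just an illustrative assumption, not a real measurement from the time:

```python
# Back-of-the-envelope: 14.4 kbps dial-up vs a 1 Gbps connection.
# The 2 MB "hi-res image" is an illustrative assumption.
MODEM_BPS = 14_400
FIBRE_BPS = 1_000_000_000

image_bits = 2 * 1024 * 1024 * 8  # a hypothetical 2 MB image, in bits

print(f"Speed-up: {FIBRE_BPS / MODEM_BPS:,.0f}x")                        # ~69,444x
print(f"Image over 14.4k modem: {image_bits / MODEM_BPS / 60:.1f} min")  # ~19 minutes
print(f"Image over 1 Gbps:      {image_bits / FIBRE_BPS * 1000:.0f} ms") # ~17 ms
```

Roughly nineteen minutes for one photo - assuming the connection didn't drop halfway through.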

We began seeing strange and wonderful sites that resembled Homer Simpson’s Mister X page. It was chaotic, amateurish, and full of personality. A new world, if you like.

The web triggered an explosion of computer books. I bought many of them. I read about half. O’Reilly books were the best. Thick, full of useful detail, and practical.

By the late 1990s I was coding commercially, first in Visual Basic and later in Java. Learning still came from books and documentation. Online coding resources were sparse. YouTube did not exist.

If you wanted to progress, you had to read. A lot.

Waiting for a hi-res image to download. It was actually like this!

We were at the mercy of the slowest component - the modem! Not so today.

Learning as a requirement

Alongside programming languages, we learned the fundamentals: HTML, JavaScript, ASP, PHP, networking, and how the internet actually worked. TCP/IP was not optional background knowledge. It was part of the job.
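As a flavour of what "how the internet actually worked" meant in practice, here is a minimal sketch - in Python rather than the C or Perl of the era, and using example.com purely as a placeholder host - of fetching a web page by speaking HTTP by hand over a raw TCP socket, the plumbing underneath every HTML page and ASP script:

```python
# Minimal sketch: an HTTP request written by hand over a TCP socket.
# Illustrative only; example.com is just a placeholder host.
import socket

HOST = "example.com"

with socket.create_connection((HOST, 80), timeout=10) as sock:
    request = (
        "GET / HTTP/1.1\r\n"
        f"Host: {HOST}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    sock.sendall(request.encode("ascii"))

    # Read until the server closes the connection.
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:
            break
        chunks.append(data)

print(b"".join(chunks).decode("latin-1", errors="replace")[:300])
```

Knowing that a browser request boils down to bytes on a TCP connection was exactly the kind of fundamental that was simply part of the job.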

If you wanted a career in IT, learning was non-negotiable. You had to enjoy it.

This is still true, although there are now so many paths and so many choices that navigating the IT world is arguably harder for a newcomer than it was for us.

If you do not like learning, IT is probably the wrong career choice for you. For myself, I think I’ve always felt like I don’t know as much as I should, so I try to learn something every day - but not just that - I have several projects (software & hardware) going on at any one time. The challenge is to finish them!

Professional life

University was the most common route into the industry, but it was never the only one. Starting at the bottom and working your way up was always viable.

Some of the smartest people I have worked with never went to university. A degree has never been a totally reliable proxy for intelligence or capability.

In the mid-1990s, graduates struggled to find work. After a couple of years of experience, salaries rose quickly. Within a few more years, contracting became possible.

It was suddenly possible to earn more in a single year than the value of your house. (That will probably never happen again for most of us).

The industry rewarded skill, curiosity, and momentum. It felt open, and there were always plenty of opportunities around.

In many ways it is still the same in 2026 - the one major difference is AI. I no longer code in a corporate environment, so although I know AI is being used and ‘junior’ jobs are disappearing in various places, I don’t know what a typical developer’s day looks like. Do they spend all day prompting ChatGPT?

The learning continues

As the web matured, learning shifted again. Websites emerged to fill the gaps: W3Schools, MDN, FreeCodeCamp, and eventually Stack Overflow. Social media was not yet mainstream.

In 2004 I worked on a Java front-end project using JSF 1.x, an objectively terrible framework. Back then, obscure forums were often the only places I could find usable answers (Stack Overflow did not arrive until 2008). It was not convenient, but it worked.

In 2010 I worked on SongCat - my iPad app for musicians. Objective-C was fun - not! I spent a year of my life on Stack Overflow.

Today, very few people need Stack Overflow in the same way. That change alone tells you how much the landscape has shifted and could be a bit of a concern.

It was a different time, but the through-line remained the same: if you wanted to stay relevant, you had to keep learning.

Next time: tools, the emerging web and the iPhone

Peter Januarius

Founder of Nexgen STEM School
