
Bill Bennett


Tag: software

COBOL back from the dead… again

Many US systems that process unemployment claims still run on COBOL, a 60-year-old programming language that relatively few coders understand.

IBM plans to offer free COBOL training to address overloaded US unemployment systems:

“IBM is releasing a free training course next week to teach the 60-year-old programming language COBOL to coders. It is also launching a forum where those with knowledge of the language can be matched with companies in need of help maintaining their critical systems.

“The moves come in response to desperate pleas by state governors for anyone with knowledge of COBOL to volunteer their time to help keep unemployment systems functioning, a critical need as the coronavirus has resulted in an unprecedented surge in people being laid off and having to claim unemployment benefits.”

Younger readers may be mystified by this. COBOL is a programming language from the early 1960s. It was mainly used to write business applications1 on mainframe computers.

It wouldn’t be accurate to say COBOL is like Basic, but there are some similarities. If you’ve used Basic, especially an early version, you’d recognise some of the concepts.

Printing numbers, text

One big feature back in the day was the ability to carefully place printed numbers and text in the right places on forms… so that computers could, among other things, write paper cheques.
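COBOL handled this kind of layout declaratively, with PICTURE clauses describing each field. As a rough illustration only, here is the same idea sketched in Python; the function names and field widths are mine, not COBOL's:

```python
# Rough sketch of COBOL-style fixed-position output, in the spirit of a
# PICTURE clause such as PIC $Z,ZZ9.99. Field widths here are invented
# for illustration, not taken from any real cheque layout.

def money_field(amount: float, width: int = 12) -> str:
    """Format an amount with commas, right-aligned in a fixed-width field."""
    return f"${amount:,.2f}".rjust(width)

def cheque_line(payee: str, amount: float) -> str:
    # Payee in a fixed 30-character column, amount in a 12-character
    # column, so each value always lands in the same place on the form.
    return payee.ljust(30) + money_field(amount)

print(cheque_line("ACME SUPPLIES LTD", 1234.5))
```

The point is that every line comes out the same length with each value at a fixed offset, which is exactly what you need when the output is going onto a pre-printed paper form.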

This was a strength but, in a sense, it turned out to be a weakness, as we shall see later.

Another big feature of COBOL, something newer programming languages might learn from, is that it is self-documenting. That is, it is easy to write code in a way that other programmers can follow. This was important as it meant others could pick up the pieces at any point.

In 1975 I studied Computer Science A Level at a technology college in the UK. We mainly worked in Fortran, another early language, and Assembler, which is a kind of halfway house between machine code and a formal language. There was some Basic and some COBOL.

COBOL obsolete in the 1970s… sort of

At that time a computer science lecturer introduced us to COBOL. I remember him saying that the language was still important; if we were going to end up as programmers, it would probably be what we used at first. However, he went on to say COBOL probably wouldn’t be around much longer.

That was 45 years ago. It is still going strong.

The headline mentions that COBOL is back from the dead again. That’s because it had a comeback in the late 1990s. Much of the problem known as the millennium bug, or Y2K, was to do with COBOL code.

In the late 1990s, many important programs, especially those used for billing systems and for government, used COBOL. The printing features had standard ways of handling numbers and, in particular, dates.

From memory, and correct me if this is wrong, COBOL could handle the year number in more than one format, but almost every programmer stuck to using just two digits to show the year. So they represented 1975 as 75. When the year 2000 rolled around, most COBOL programs interpreted 00 as 1900. The error would then cause problems elsewhere in a program.
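The failure mode is easy to reproduce in any language. A minimal Python sketch of the idea (the helper name is mine, and this is an illustration of two-digit-year logic in general, not of any particular COBOL program):

```python
# Demonstrates the classic Y2K failure: a year stored as two digits,
# naively expanded by assuming the century is always 1900.

def naive_expand(two_digit_year: int) -> int:
    # What much pre-Y2K code effectively did.
    return 1900 + two_digit_year

print(naive_expand(75))  # 1975, as intended
print(naive_expand(0))   # 1900, not 2000: the millennium bug

# Any arithmetic across the boundary then goes haywire,
# for example an age or interest-period calculation:
birth, today = naive_expand(75), naive_expand(0)
print(today - birth)     # -75 instead of 25
```

The bug itself is trivial; the problem was that it was baked into millions of lines of code and decades of stored data.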

COBOL’s second coming

So in the late 1990s, industry dragged COBOL programmers out of retirement to fix up the software some of them wrote in the first place. I fielded calls from recruiters offering me huge money to work on fixing COBOL code.

Keep in mind that I have never worked as a programmer. I did one year of university Computer Science before realising this was not the career for me. And my only brush with COBOL was a few lectures during the mid 1970s. This tells you just how desperate businesses were to get their code fixed in the late 1990s.

Now COBOL is back in the spotlight as many US unemployment systems still use the language. They are busy right now with US unemployment rates going through the roof. IBM is training people to fix COBOL code.

Undead software

Here’s the weird thing about COBOL. It wasn’t even remotely cutting edge technology by 1980, yet a lot of mainframe programs depended on it. Companies rarely reworked programs with more modern programming languages. Instead they employed teams to keep maintaining and tweaking ancient code. Over time they dropped many old programs for newer technology, but not at the pace you might imagine.

Because everyone knew COBOL was past its sell-by date, people didn’t worry about the looming Y2K problem until the late 1990s. They assumed the old code would be replaced by then. But it wasn’t. Those programs kept churning on.

I found this reference at the Six Colors website:

Back in 2015 my pal Glenn Fleishman found the oldest computer program still in use for MIT Technology Review, and guess what:

In 1958, the United States Department of Defense launched a computerized contract-management system that it dubbed Mechanization of Contract Administration Services, or MOCAS (pronounced “MOH-cass”). The system was designed to use the latest in computation and output technology to track contracts in progress and payments to vendors.

Fifty-seven years later, it’s still going…. MOCAS is written in COBOL.

On and on

That same attitude applied in the late 1990s. Programmers patched code on the assumption that the clunky old COBOL programs that had made it to 2000 were not going to stay in use much longer. This led to another problem.
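One common late-1990s patch was date “windowing”: rather than widening stored dates to four digits, code kept two-digit years and picked the century from a pivot value, which only postponed the bug by a couple of decades. A minimal Python sketch, with an assumed pivot of 20 (real systems chose various pivots; this one illustrates why some fixes expired around 2020):

```python
# Date windowing: a common Y2K patch that kept two-digit years but
# chose the century based on a pivot. The pivot of 20 is an assumed
# example for illustration; different systems picked different values.

PIVOT = 20

def windowed_expand(two_digit_year: int) -> int:
    if two_digit_year < PIVOT:
        return 2000 + two_digit_year   # 00-19 -> 2000-2019
    return 1900 + two_digit_year       # 20-99 -> 1920-1999

print(windowed_expand(5))   # 2005
print(windowed_expand(99))  # 1999
print(windowed_expand(20))  # 1920 -- the window runs out in 2020
```

It was a cheap fix for code expected to be retired soon. When the code wasn’t retired, the window simply closed on a later date.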

It’s not only COBOL that carried on this way. Microsoft Windows retained many original lines of code from the 1980s well into this millennium. I’m not sure if it is still true today, but for decades Microsoft Word included segments of ancient code. In many cases the app didn’t actually use the code, but developers were often unwilling to remove it until the next big clean up.


  1. In those days people rarely talked of applications. If they did, it was more often about the application of the computer to a task than a synonym for program. Likewise, the people who wrote programs were always programmers; I don’t recall hearing the term developer until well into the PC era. Again, if you remember otherwise, feel free to comment. I’m not infallible. Nor is my memory. ↩︎

Skeuomorphism: A user interface Catch 22

If I didn’t promise to help you out in the next sentence, you’d probably have to look up skeuomorphism in a dictionary.

In simple terms the word means something that resembles whatever it was that used to do the job.1

The word may be unfamiliar. The idea is not.

Take the old Macintosh Address Book app. Before Apple modernised its software, the Address Book app looked like a paper address book.

You might also remember when computer operating system desktops had waste paper bin or trash can icons to tell you this is where you throw things away.

Skeuomorph central

The smartphone is skeuomorph central. Every iPhone has icons showing a torch, a telephone handset, a camera and so on. What each of these does is obvious. The envelope icon isn’t quite so apparent, yet you don’t need a PhD to figure out it is for email. Android phones have similar skeuomorphs.

Skeuomorphs don’t have to be software. Houses might have cladding where manufacturers made the building material resemble wooden boards or brick.

Soon electric vehicles in Europe will have to make noises so that pedestrians and others get an audio cue to take care.

Understanding

The idea behind skeuomorphism is that it helps you to better understand what you are looking at. It’s a visual clue telling you the purpose of the object. You see something familiar and, bingo, you know what that thing is going to do.

There’s a special breed of skeuomorph where the visual cue lives on long after the original item has disappeared from use.

Mr flippy floppy

Perhaps the best known is the floppy disk icon you sometimes see used to indicate the save function.

It’s getting on for 20 years since computers had built-in floppy disk drives. An entire generation has entered the workforce without ever having seen a floppy disk in action. And yet, everyone knows what that image is supposed to mean.

No doubt you have heard stories of young people encountering a real floppy disk for the first time. While they may not know what the item is or how it is used, they often recognise it from the icon.

Time to put skeuomorphism to bed

While the thinking behind skeuomorphism makes sense, as far as software and operating systems go, its best days are in the past. Skeuomorphic designs are often fussy and ugly. They clutter things up. The images are often meaningless and what is represented is not always clear cut.

Yet there’s a Catch 22 here. I prefer minimalist design. It’s easier to focus on the job in hand when the software stays out of the way. I was about to say that when I’m writing, I prefer to start with a blank sheet of paper. Which is, of course, itself a skeuomorph.


  1. My Mac’s dictionary says: An object or feature which imitates the design of a similar artefact made from another material. ↩︎

Microsoft Word stickiness

Microsoft Word is my fourth favourite writing tool1.

I rarely use Word to write stories or blog posts. Yet, I never hesitate to renew my Office 365 subscription.

It sounds contradictory, but that NZ$165 Office 365 subscription is good value. That’s true even though I don’t use Word to write and I almost never open Excel. I go out of my way to not let PowerPoint into my life.

At this point you might think this is throwing money away. Open source fans reading this will be aghast.

But there is a method in my madness. Writing is my work. A typical year’s work is 250,000 words. My writing output was even higher for a few years. After more than 40 years in the business, I’ve written, and publishers have paid me for, at least 10 million words.

Most, not all, of the time, I’m paid by the word. Which means my ability to produce quality writing puts food on my table and a roof over my head.

Writing is talking to people, researching, checking, then putting it all into words. Sometimes it is about revising my own work or dealing with words others have sent to me.

Microsoft Word is not optional

Like it or not, Microsoft Word is the lingua franca of digital writing. Almost everyone in the business uses it. It’s been more than two decades since an editor expected copy in anything other than Word format.

At this point, people who dislike Word might be thinking: “Yes, but everything else can save in a Word format. So it isn’t necessary to buy a subscription”.

They have a point. Except that sooner or later, something doesn’t convert between Word and another format.

The most troublesome issue is with edits marked using Microsoft’s Track Changes feature.

Yes, many non-Word writing applications can understand and deal with Track Changes markups. But this is not always straightforward.

The cost to me of failing to deal fast with one edit incident can be greater than the subscription price. It’s rare, but over 250,000 words, it happens a few times every year.

The sums look even better when you take into account that the subscription includes licences for other family members. In effect my personal subscription costs 25 per cent of $165: $40 plus change.

Don’t go there

Even a quick dive down the troubleshooting rabbit hole costs more.

Multiply this by the two or more incidents a year and you can see that paying the subscription leaves me ahead. It’s a solid investment.

Open source fans tell me this attitude is wrong and that I’m paying a tax or even a ransom to Microsoft to be able to work. You could see it this way.

Yet it isn’t Microsoft that is holding me to ransom, it is the editors and publishers who commit to Word. If everyone accepted plain text2 I wouldn’t need to pay the fee.

It might be better to frame the fee as paying for membership of the hireable professional writers club. Either way, it’s a bargain.


  1. In my world it ranks behind IA Writer, Byword and Pages. ↩︎
  2. Text was fine for a long time. That changed about 20 years ago. ↩︎

The Y2K bug makes a comeback | RNZ

Technology commentator Bill Bennett looks at how the millennium bug is back – because it never exactly went away. In trying to solve the problem, programmers pushed it back 20 years. And time’s up. He’ll also look at how Volvo is experimenting with adding noise to near-silent EVs, after research showed pedestrians were twice as likely to be involved in an accident with EVs than with vehicles with traditional engines. And is working remotely back in fashion in response to coronavirus?

Source: The Y2K bug makes a comeback | RNZ

I’m on RNZ Nine-to-Noon talking about technology.

You can read more about the return of the millennium bug in Y2K bug has a 2020 echo elsewhere on this site.

One of the nice things about electric cars is they are less noisy than petrol cars. This can be a problem, so Volvo is working to add noise.

Research shows pedestrians are twice as likely to have accidents with near-silent EVs than vehicles with traditional engines.

(U.S. Department of Transportation, National Highway Traffic Safety Administration)

This also applies to scooters and electric bikes. 

Being able to hear vehicles coming is as important as seeing them. 

New Zealand currently has no mandatory requirement for the vehicles to make noise to alert others on the roads.

Teleworking, remote working, telecommuting – they are all the same thing and they are back in fashion thanks to the Covid-19 pandemic. Teleworking has been a perennial technology story for well over a generation. But bosses are not always keen to have employees working from where they can’t be watched over. 

New Zealand’s game developers export photons not atoms

New Zealand interactive game developers earned $203.4 million during the 2019 financial year – double the $99.9m earned only two years earlier in 2017. The success comes from targeting audiences around the world, and 96% of the industry’s earnings came from exports.

Source: Interactive Game Exports Double in Two Years to $200m – NZGDA

Technology lets us export photons instead of atoms. The idea was a common theme in my writing 25 years ago when the internet took off. It took time for the reality of this to creep up on us. Now it is happening in a big way thanks to New Zealand’s game developers.

One hundred years ago farmers would load sheep carcasses onto the then-latest technology: refrigerated ships. These would belch smoke as they steamed to the other side of the world. It meant exporters earned foreign currency. It kick-started New Zealand on the path to becoming, fifty years later, one of the world’s richest countries.

Sheep carcasses, milk powder, crayfish, apples and all those other exports were made of atoms. They weighed kilograms and they needed to be physically shifted. The products would often take weeks to reach their destination by ship. There were physical risks.

Game developers sell light particles

Today, when, say, Grinding Gear Games makes a game sale on the other side of the world, photons, tiny particles of light, race to their new home in a fraction of a second.

There’s nothing wrong with physical exports, that’s been what we’ve done for as long as anyone can remember. Yet tomorrow’s rivers of gold are going to come from exporting photons. We need to start thinking of games exports in the same way we once thought of meat or dairy exports.

If the game industry grows at the same pace for the next five years, it could be worth a billion dollars a year by 2025. That’s still less than, say, wine or kiwifruit, but with much better margins.