
Many US systems that process unemployment claims still run on COBOL, a 60-year-old programming language that relatively few coders understand.

IBM plans to offer free COBOL training to address overloaded US unemployment systems:

“IBM is releasing a free training course next week to teach the 60-year-old programming language COBOL to coders. It is also launching a forum where those with knowledge of the language can be matched with companies in need of help maintaining their critical systems.

“The moves come in response to desperate pleas by state governors for anyone with knowledge of COBOL to volunteer their time to help keep unemployment systems functioning, a critical need as the coronavirus has resulted in an unprecedented surge in people being laid off and having to claim unemployment benefits.”

Younger readers may be mystified by this. COBOL is a programming language from the early 1960s. It was mainly used to write business applications1 on mainframe computers.

It wouldn’t be accurate to say COBOL is like Basic, but there are some similarities. If you’ve used Basic, especially an early version, you’d recognise some of the concepts.

Printing numbers and text

One big feature back in the day was the ability to carefully place printed numbers and text in the right places on forms… so that computers could, among other things, write paper cheques.
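
The feature lives in COBOL's PICTURE clause, which spells out exactly how a value should look on the page. Here is a minimal sketch, with made-up field names, of how an amount might be edited for printing on a cheque:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. CHEQUE-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * The raw amount: five digits with two implied decimal places.
       01  WS-AMOUNT        PIC 9(5)V99   VALUE 1234.50.
      * An edited field: floating dollar sign, comma, two decimals.
       01  WS-PRINT-AMOUNT  PIC $$$,$$9.99.
       PROCEDURE DIVISION.
      * Moving the raw value into the edited field formats it,
      * ready to place on a printed cheque, e.g. $1,234.50.
           MOVE WS-AMOUNT TO WS-PRINT-AMOUNT
           DISPLAY WS-PRINT-AMOUNT
           STOP RUN.
```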

This was a strength but, in a sense, it turned out to be a weakness, as we shall see later.

Another big feature of COBOL, something newer programming languages might learn from, is that it is self-documenting. That is, it is easy to write code in a way that other programmers can follow. This was important as it meant others could pick up the pieces at any point.
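
As an illustration of what self-documenting means, here is a short, made-up fragment; the data names are invented, but the verbose English-like verbs are real COBOL:

```cobol
      * Even a non-programmer can follow the intent of this code.
           ADD CLAIM-AMOUNT TO WEEKLY-BENEFIT-TOTAL
           IF WEEKLY-BENEFIT-TOTAL GREATER THAN MAXIMUM-WEEKLY-BENEFIT
               MOVE MAXIMUM-WEEKLY-BENEFIT TO WEEKLY-BENEFIT-TOTAL
           END-IF.
```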

In 1975 I studied Computer Science A Level at a technology college in the UK. We mainly worked in Fortran, another early language, and Assembler, which is a kind of halfway house between machine code and a formal language. There was some Basic and some COBOL.

COBOL obsolete in the 1970s… sort of

At that time a computer science lecturer introduced us to COBOL. I remember him saying that the language was still important: if we were going to end up as programmers, it would probably be what we used first. However, he went on to say COBOL probably wouldn’t be around much longer.

That was 45 years ago. It is still going strong.

The headline mentions that COBOL is back from the dead again. That’s because it had a comeback in the late 1990s. Much of the problem known as the millennium bug, or Y2K, was to do with COBOL code.

In the late 1990s, many important programs, especially those used for billing systems and for government, used COBOL. The printing features had standard ways of handling numbers and, in particular, dates.

From memory, correct me if this is wrong, COBOL could handle the year number in more than one format, but almost every programmer stuck to using just two digits to show the year. So they represented 1975 as 75. When the year 2000 rolled around, most COBOL programs interpreted the stored 00 as 1900. The error would then cause problems elsewhere in a program.
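
To make that concrete, here is a sketch of the kind of record layout involved; the data names are hypothetical, but the two-digit PIC 99 year field is the classic Y2K culprit:

```cobol
      * A date packed into six digits as YYMMDD; the century is implied.
       01  CLAIM-DATE.
           05  CLAIM-YEAR   PIC 99.
           05  CLAIM-MONTH  PIC 99.
           05  CLAIM-DAY    PIC 99.
      * Any comparison that assumes the century is 19 breaks in 2000:
      * a claim dated in year 00 now sorts as 99 years older than
      * one dated in year 99.
```

The usual remediation, known as date windowing, treated two-digit years below a chosen cut-off as 20xx and the rest as 19xx: a patch rather than a rewrite.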

COBOL’s second coming

So in the late 1990s, industry dragged COBOL programmers out of retirement to fix up the software some of them had written in the first place. I fielded calls from recruiters offering me huge money to work on fixing COBOL code.

Keep in mind that I have never worked as a programmer. I did one year of university Computer Science before realising this was not the career for me. And my only brush with COBOL was a few lectures during the mid-1970s. This tells you just how desperate businesses were to get their code fixed in the late 1990s.

Now COBOL is back in the spotlight as many US unemployment systems still use the language. They are busy right now with US unemployment rates going through the roof. IBM is training people to fix COBOL code.

Undead software

Here’s the weird thing about COBOL. It wasn’t even remotely cutting-edge technology by 1980, yet a lot of mainframe programs depended on it. Companies rarely reworked programs with more modern programming languages. Instead they employed teams to keep maintaining and tweaking ancient code. Over time they dropped many old programs for newer technology, but not at the pace you might imagine.

Because everyone knew COBOL was past its sell-by date, people didn’t worry about the looming Y2K problem until the late 1990s. They assumed the old code would be replaced by then. But it wasn’t. Those programs kept churning on.

I found this reference at the Six Colors website:

Back in 2015 my pal Glenn Fleishman found the oldest computer program still in use for MIT Technology Review, and guess what:

In 1958, the United States Department of Defense launched a computerized contract-management system that it dubbed Mechanization of Contract Administration Services, or MOCAS (pronounced “MOH-cass”). The system was designed to use the latest in computation and output technology to track contracts in progress and payments to vendors.

Fifty-seven years later, it’s still going…. MOCAS is written in COBOL.

On and on

That same attitude applied in the late 1990s. Programmers patched code on the assumption that the clunky old COBOL programs that had made it to 2000 were not going to stay in use much longer. This led to another problem.

It’s not only COBOL that carried on this way. Microsoft Windows retained many original lines of code from the 1980s well into this millennium. I’m not sure if it is still true today, but for decades Microsoft Word included segments of ancient code. In many cases the app didn’t actually use the code, but developers were often unwilling to remove it until the next big clean-up.


  1. In those days people rarely talked of applications. If they did, it was more often about the application of the computer to a task than a synonym for program. Likewise, the people who wrote programs were always programmers; I don’t recall hearing the term developer until well into the PC era. Again, if you remember otherwise, feel free to comment. I’m not infallible. Nor is my memory. ↩︎

In our coronavirus-tainted world, we’re realising that we depend a lot on our laptop webcams… and they’re not good.

At the Wall Street Journal, Joanna Stern takes a critical look at laptop webcams: Laptop Webcam Showdown: MacBook Air? Dell XPS? They’re Pretty Bad.

Part of the problem comes down to laptops having thin lids, too thin to include great webcams. Mind you, thin hasn’t stopped phone makers from putting a lot of time and energy into making better cameras.

To a degree, none of this would have been much of an issue before half the world started working from home on their laptops. For most people, video conferencing was something of a nice-to-have afterthought until now.

Suddenly we all notice the poor picture quality. What makes this worse is that we now have much more bandwidth, so the internet connection is no longer the limiting factor. We also tend to have much higher-resolution screens, so camera flaws are more noticeable.

Opportunity for better webcams

There is a huge opportunity for the first laptop maker to get this right. Apple is the most likely candidate here. It’s noticeable how much better the front-facing camera is on an iPad Pro when compared with, say, the MacBook Air.

The 2020 12.9-inch iPad Pro has a seven-megapixel front-facing camera with all the trimmings. It handles 1080p video at up to 60 frames per second. In contrast, the 2020 MacBook Air camera is only 720p.

No doubt there is room for improvement now that the laptop camera specification matters in ways it didn’t before.

The most curious thing about Stern’s video story is that Apple put a better camera on MacBooks ten years ago. Of course they weren’t as thin then.

There is a trade-off between thin design and camera performance. Laptop lids are thinner than phones or iPads. Apple’s obsession with thinness meant laptop keyboard problems until recently. Now it has to rethink where cameras fit into this.

The Guardian reviewer Samuel Gibbs gives Apple’s 2020 MacBook Air five stars.

In form and function the MacBook Air is just a few shades short of the perfect traditional laptop. If you don’t want a more modern convertible, you’ll struggle to find a better consumer machine than this.

The keyboard is finally as great as the trackpad, the battery lasts long enough for a work day, it’s light but strong and the screen is beautiful, while the little things such as Touch ID work great. You also get two Thunderbolt 3 ports and a long support life.

Sure, the screen could have smaller bezels and the webcam could be better – why Apple hasn’t put its excellent Face ID into its laptops I have no idea. You can’t upgrade the RAM or storage after purchase, there’s no wifi 6 support, nor SD card slot or USB-A port, but by now most will have enough USB-C cables and accessories, and if not, now is the time to buy them.

As the headline suggests, this is a positive review. The pluses are big, the niggles minor. It squares with my long-term MacBook Air experience.

I’ve used MacBook Airs for the past six years. I’m on my second one. Every member of my family now has one. They are by far the best laptop for writing and other light computing tasks. There is more than enough power for everyday users.

Apple MacBook Air 2020

Price and productivity

When I write about Apple products1, there are always readers who complain about the price.

That misses a lot of context. You’re not just buying the hardware; you are buying into a different way of using computers.

Macs come with a suite of productivity software that costs extra if you buy a Windows computer.

They also come with a complete, fully realised ecosystem, a word I hesitate to use.

Apple’s world is not necessarily better or worse than what you’d get with Windows or for that matter with Android or a Chromebook. But if it suits the way you work and think, the relatively small margin you pay for an Apple will pay off immediately in terms of improved productivity.

My freelance writing business took off when I switched back to Macs from Windows.

You may experience something similar. You may also experience the same kind of improvement moving from macOS to Windows, or from anything to Linux or any other operating system. This is not a one-size-fits-all world.

What my experience says does not work is attempting to do things on the cheap. Skimping saves you dozens or maybe hundreds of dollars. Being unable to work productively will cost you thousands.


  1. It’s not only Apple; people say similar things about any premium hardware product. ↩︎

If I weren’t going to help you out in the next sentence, you’d probably have to look up skeuomorphism in a dictionary.

In simple terms, the word means something that resembles whatever it was that used to do the job.1

The word may be unfamiliar. The idea is not.

Take the old Macintosh Address Book app. Before Apple modernised its software, the Address Book app looked like a paper address book.

You might also remember when computer operating system desktops had waste paper bin or trash can icons to tell you this is where you throw things away.

Skeuomorph central

The smartphone is skeuomorph central. Every iPhone has icons showing a torch, a telephone handset, a camera and so on. What each of these does is obvious. The envelope icon isn’t quite so apparent, yet you don’t need a PhD to figure out it is for email. Android phones have similar skeuomorphs.

Skeuomorphs don’t have to be software. Houses might have cladding made to resemble wooden boards or brick.

Soon electric vehicles in Europe will have to make noises so that pedestrians and others get an audio cue to take care.

Understanding

The idea behind skeuomorphism is that it helps you to better understand what you are looking at. It’s a visual cue telling you the purpose of the object. You see something familiar and, bingo, you know what that thing is going to do.

There’s a special breed of skeuomorph where the visual cue lives on long after the original item has disappeared from use.

Mr flippy floppy

Perhaps the best known is the floppy disk icon you sometimes see used to indicate the save function.

It’s getting on for 20 years since computers had built-in floppy disk drives. An entire generation has entered the workforce without ever having seen a floppy disk in action. And yet, everyone knows what that image is supposed to mean.

No doubt you have heard stories of young people encountering a real floppy disk for the first time. While they may not know what the item is or how it is used, they often recognise it from the icon.

Time to put skeuomorphism to bed

While the thinking behind skeuomorphism makes sense, as far as software and operating systems go, its best days are in the past. Skeuomorphic designs are often fussy and ugly. They clutter things up. The images are often meaningless and what is represented is not always clear-cut.

Yet there’s a Catch-22 here. I prefer minimalist design. It’s easier to focus on the job in hand when the software stays out of the way. I was about to say that when I’m writing, I prefer to start with a blank sheet of paper. Which is, of course, itself a skeuomorph.


  1. My Mac’s dictionary says: An object or feature which imitates the design of a similar artefact made from another material. ↩︎

Microsoft Word is my fourth favourite writing tool1.

I rarely use Word to write stories or blog posts. Yet, I never hesitate to renew my Office 365 subscription.

It sounds contradictory, but that NZ$165.00 Office 365 subscription is good value. That’s true even though I don’t use Word to write and I almost never open Excel. I go out of my way not to let PowerPoint into my life.

At this point you might think this is throwing money away. Open source fans reading this will be aghast.

But there is a method in my madness. Writing is my work. A typical year’s work is 250,000 words. My writing output was even higher for a few years. After more than 40 years in the business, I’ve written, and publishers have paid me for, at least 10 million words.

Most, not all, of the time, I’m paid by the word. Which means my ability to produce quality writing puts food on my table and a roof over my head.

Writing is talking to people, researching, checking, then putting it all into words. Sometimes it is about revising my own work or dealing with words others have sent to me.

Microsoft Word is not optional

Like it or not, Microsoft Word is the lingua franca of digital writing. Almost everyone in the business uses it. It’s been more than two decades since an editor expected copy in anything other than Word format.

At this point, people who dislike Word might be thinking: “Yes, but everything else can save in a Word format. So it isn’t necessary to buy a subscription”.

They have a point. Except that sooner or later, something doesn’t convert between Word and another format.

The most troublesome issue is with edits marked using Microsoft’s Track Changes feature.

Yes, many non-Word writing applications can understand and deal with Track Changes markups. But this is not always straightforward.

The cost to me of failing to deal quickly with one edit incident can be greater than the subscription price. It’s rare but, over 250,000 words, it happens a few times every year.

The value is even better when you take into account that the subscription includes licences for other family members. In effect, my personal subscription costs 25 percent of $165: $40 plus change.

Don’t go there

Even a quick dive down the troubleshooting rabbit hole costs more than that.

Multiply this by the two or more incidents a year and you can see that paying the subscription leaves me ahead. It’s a solid investment.

Open source fans tell me this attitude is wrong and that I’m paying a tax or even a ransom to Microsoft to be able to work. You could see it this way.

Yet it isn’t Microsoft that is holding me to ransom, it is the editors and publishers who commit to Word. If everyone accepted plain text2 I wouldn’t need to pay the fee.

It might be better to frame the fee as paying for membership of the hireable professional writers club. Either way, it’s a bargain.


  1. In my world it ranks behind iA Writer, Byword and Pages. ↩︎
  2. Text was fine for a long time. That changed about 20 years ago. ↩︎