Many US systems that process unemployment claims still run on COBOL, a 60-year-old programming language that relatively few coders understand.

IBM plans to offer free COBOL training to address overloaded US unemployment systems:

“IBM is releasing a free training course next week to teach the 60-year-old programming language COBOL to coders. It is also launching a forum where those with knowledge of the language can be matched with companies in need of help maintaining their critical systems.

“The moves come in response to desperate pleas by state governors for anyone with knowledge of COBOL to volunteer their time to help keep unemployment systems functioning, a critical need as the coronavirus has resulted in an unprecedented surge in people being laid off and having to claim unemployment benefits.”

Younger readers may be mystified by this. COBOL is a programming language from the early 1960s. It was mainly used to write business applications¹ on mainframe computers.

It wouldn’t be accurate to say COBOL is like Basic, but there are some similarities. If you’ve used Basic, especially one of the earliest forms, you’d recognise some of the concepts.

Printing numbers, text

One of its big features back in the day was the ability to carefully place printed numbers and text in the right places on forms… so that computers could, among other things, write paper cheques.
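COBOL did this with PICTURE clauses, which pin each number or string to fixed columns on the printed page. As a rough analogue, and purely for illustration rather than real COBOL, here is a Python sketch of the idea behind an edited currency picture:

```python
def format_amount(cents: int, width: int = 12) -> str:
    """Roughly what a COBOL edited PICTURE such as $$$,$$9.99 does:
    right-justify a currency amount in a fixed-width field so it
    lands in the same print columns on every line of a cheque run."""
    dollars, rem = divmod(cents, 100)
    text = f"${dollars:,}.{rem:02d}"
    return text.rjust(width)

# Every amount occupies the same columns, however large or small:
for cents in (1234, 98765432):
    print(format_amount(cents))
```

In real COBOL the layout lives in the data declaration itself, not in procedural code, which is part of why the language suited forms and cheques so well.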

This was a strength but, in a sense, it turned out to be a weakness, as we shall see later.

Another big feature of COBOL, one newer programming languages might learn from, is that it is self-documenting. That is, it is easy to write code in a way other programmers can follow. This was important because it meant others could pick up the pieces at any point.

In 1975 I studied Computer Science A Level at a technology college in the UK. We mainly worked in Fortran, another early language, and Assembler, a kind of halfway house between machine code and a formal language. There was some Basic and some COBOL.

COBOL obsolete in the 1970s… sort of

At that time one of the computer science lecturers introduced us to COBOL. I remember him saying that the language was still important: if we were going to end up as programmers, it would probably be the first one we used. However, he went on to say COBOL probably wouldn’t be around much longer.

That was 45 years ago. It is still going strong.

The headline mentions that COBOL is back from the dead again. That’s because it had a comeback in the late 1990s. Much of the problem known as the millennium bug, or Y2K, was to do with COBOL code.

In the late 1990s, many important programs, especially those used for billing systems and for government, used COBOL. The printing features had standard ways of handling numbers and, in particular, dates.

From memory, correct me if this is wrong, COBOL could handle the year number in more than one format, but almost every programmer stuck to using just two digits to show the year. So they represented 1975 as 75. When the year 2000 rolled around, most COBOL programs interpreted 00 as 1900. The error would then cause problems elsewhere in a program.
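As an illustration (a Python sketch of the logic, not actual COBOL), here is the kind of failure a stored two-digit year produces once the calendar rolls over:

```python
def parse_two_digit_year(yy: str) -> int:
    """Mimic the old convention: assume every two-digit year is 19xx."""
    return 1900 + int(yy)

# A system computing years of service from a stored "75" start year:
start = parse_two_digit_year("75")   # 1975 — correct
today = parse_two_digit_year("00")   # meant to be 2000, comes out as 1900

years_of_service = today - start
print(years_of_service)  # -75 instead of the expected 25
```

The subtraction itself is fine; the corruption happens at parse time, which is why the fixes involved hunting down every place a date was read or stored.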

COBOL’s second coming

So in the late 1990s, industry dragged COBOL programmers out of retirement to fix up the software some of them had written in the first place. I fielded calls from recruiters offering me huge money to work on fixing COBOL code.

Keep in mind that I have never worked as a programmer. I did one year of university Computer Science before realising this was not the career for me. And my only brush with COBOL was a few lectures during the mid 1970s. This tells you just how desperate businesses were to get their code fixed in the late 1990s.

Now COBOL is back in the spotlight because many US unemployment systems still use the language. They are busy right now with US unemployment rates going through the roof. IBM is training people to fix COBOL code.

Undead software

Here’s the weird thing about COBOL. It wasn’t even remotely cutting edge technology by 1980, yet a lot of mainframe programs depended on it. Companies rarely reworked programs with more modern programming languages. Instead they employed teams to keep maintaining and tweaking ancient code. Over time they dropped many old programs for newer technology, but not at the pace you might imagine.

Because everyone knew COBOL was past its sell-by date, people didn’t worry about the looming Y2K problem until the late 1990s. They assumed the old code would be replaced by then. But it wasn’t. Those programs kept churning on.

I found this reference at the Six Colors website:

Back in 2015 my pal Glenn Fleishman found the oldest computer program still in use for MIT Technology Review, and guess what:

In 1958, the United States Department of Defense launched a computerized contract-management system that it dubbed Mechanization of Contract Administration Services, or MOCAS (pronounced “MOH-cass”). The system was designed to use the latest in computation and output technology to track contracts in progress and payments to vendors.

Fifty-seven years later, it’s still going…. MOCAS is written in COBOL.

On and on

That same attitude applied in the late 1990s. Programmers patched code on the assumption that the clunky old COBOL programs that had made it to 2000 were not going to stay in use much longer. This led to another problem.

It’s not only COBOL that carried on this way. Microsoft Windows retained many original lines of code from the 1980s well into this millennium. I’m not sure if it is still true today, but for decades Microsoft Word included segments of ancient code. In many cases the app didn’t actually use the code, but developers were often unwilling to remove it until the next big clean up.


  1. In those days people rarely talked of applications. If they did, it was more often about the application of the computer to a task than a synonym for program. Likewise, the people who wrote programs were always programmers; I don’t recall hearing the term developer until well into the PC era. Again, if you remember otherwise, feel free to comment. I’m not infallible. Nor is my memory. ↩︎

Earlier this year IBM told remote employees they must return to the office or leave the company.

It’s a turnaround. IBM pioneered allowing employees to work from home. At times more than a third of the firm’s staff worked away from company offices.

The company often lectures others on the merits of remote work. Company marketing describes telework as the future. Moreover, IBM sells products enabling its customers to offer remote work to their employees.

IBM’s remote work policy was popular with staff. Many talented people either opted to join the company or decided to stay put because they could work from home. It’s powerful for working women with families and just as good for dads who like to see their children more often.

Productivity or IBM’s staff costs?

The official reason for the change is that working together in one place helps productivity, teamwork and morale.

There’s something in this. Collaboration is easier when co-workers sit across the aisle. Video conference calls are productive, but so are well organised face-to-face sessions. Chance meetings at the coffee station can spark fresh thinking.

Yet you can’t help wondering if IBM’s move is about cutting staff numbers. Many remote workers may decide it is too hard to move house to keep their job; some of the office assignments mean relocating long distances.

There’s research, some sponsored by IBM, showing teleworkers are more productive than office-bound workers. Which argument are we supposed to believe? Can we trust anything the company says on the subject?

Ominous

Yahoo made a similar back-to-the-office move. It was unpopular. Many talented staff members quit. We all know how well that story ended.

There’s a practical problem for IBM workers in places like New Zealand. Some specialist roles are shared with Australia. Some ANZ managers are in New Zealand, others across the Tasman. They shuttle between locations and make a lot of conference calls. What happens to them under the new rules? The fear is they will be under pressure to move closer to the regional HQ in Sydney. That will not go down well with New Zealand customers.

Remote working became popular with large companies about a decade ago as suburban broadband improved. Video conferencing went from being difficult to practical.

Senior managers across the technology and other industries loved the idea of remote work as they thought it would save costs. In theory, offices needed less real estate and fewer support services when workers were elsewhere.

Things didn’t work out that way. Few savings materialised.

The other part of this equation is that management went through a stage of being output-focused. That is, they were more concerned with what employees produced than with keeping close tabs on them all day long. If someone produced a report in their pyjamas or sitting next to the pool, that wasn’t a problem so long as the work was good. It seems the pendulum has swung back to command and control.

IBM says Macs cheaper than PCs

The jamf blog covers a presentation by Fletcher Previn, VP of Workplace as a Service at IBM:

In 2015, IBM let their employees decide – Windows or Mac. “The goal was to deliver a great employee choice program and strive to achieve the best Mac program,” Previn said. An emerging favorite meant the deployment of 30,000 Macs over the course of the year. But that number has grown. With more employees choosing Mac than ever before, the company now has 90,000 deployed (with only five admins supporting them), making it the largest Mac deployment on earth.

But isn’t it expensive, and doesn’t it overload IT? No. IBM found that not only do PCs drive twice the amount of support calls, they’re also three times more expensive. That’s right, depending on the model, IBM is saving anywhere from $273 – $543 per Mac compared to a PC, over a four-year lifespan.

IBM is now the biggest Mac user, so the business technology giant’s experience is important. By any standard 90,000 users is a significant sample size. The total cost of ownership matters when you measure users in tens of thousands.

And we’re talking here about the company that started the PC ball rolling 35 years ago. That must count for something too.


IBM’s reinvention as a software and services business still serves as an object lesson in turning troubled technology companies around. It switched from dependence on mainframes and servers to selling software, services and outsourcing.

At the time, the turnaround seemed miraculous. Now IBM needs to pull another rabbit from the hat.

The technology market is going through yet another transition. IBM has placed plenty of bets in the brave new world. It even has an acronym for these markets: CAMSS (cloud, analytics, mobile, security and social). They are all fast-growing areas, but IBM’s efforts are not growing fast enough to offset declines elsewhere in the company’s portfolio.

Worse, these new lines of business have lower margins than existing lines, which in turn have lower margins than IBM’s mainframe-era businesses.

This IBM reinvention is late to market

IBM was late to all these exciting new markets. Take cloud: in the latest financial report IBM says cloud revenues climbed 50 percent year on year. We need to be careful with these numbers as IBM has a habit of bundling hardware and services sales into its cloud revenue numbers.

To put IBM’s figure in perspective, Amazon recently announced 80 percent year on year growth in cloud services.

Despite billion-dollar investments and the SoftLayer acquisition (a cloud business), IBM trails a long way behind Amazon and Microsoft in market share[1]. Gartner also ranks IBM behind Google and Rackspace on “ability to execute”.

To be fair, IBM isn’t a plain vanilla cloud company. It often wraps cloud with other sophisticated services. That sounds good, but it could be a problem. IBM doesn’t have the culture needed to run the commodity cloud infrastructure customers demand.

The cloud market looks set to shake out, with a handful of global giants emerging alongside specialist and geographic niche players. IBM doesn’t rate in the top five cloud companies. Its best hope is to be a specialist cloud niche player.

The other new markets are similar. IBM isn’t doing well in analytics and, apart from a deal with Apple, is nowhere in mobile.

IBM has yet to explain what the social part of CAMSS means, which leaves security. Now there’s no question security is an important and growing market. There will be fortunes made in this area, but security alone is not enough to sustain IBM. At least not the company we know.

Low margin business

One odd aspect of IBM’s strategy is the company is throwing money at risky, low-margin areas that don’t suit its culture. The cash might be better spent leveraging strengths.

Sure, the mainframe market is in long-term decline, but as ZDNet reports, revenue from IBM’s latest mainframe was up 9 percent year on year.

IBM’s strategic problem is that it has no answers for the changes taking place in the industry. The people able to make the right decisions are the kind of people who don’t rise to the top of the company’s conservative culture.

The company’s stock response when a market turns into a low-margin commodity business is to sell it off and breathe a sigh of relief that it doesn’t have to get dirty down there. It’s done that with printers, PCs and, more recently, servers. Then, every so often, it embarks on another session of masochistic cost reduction, which means sacking workers and leaving those who remain less motivated and even more risk averse.

This approach avoids difficult questions like “how can we adjust our business model to deal with the new realities?” It’s a question IBM can’t put off any longer.


  1. Although market share can be misleading in technology discussions, it is important in cloud computing because of economies of scale.  ↩