
Computerworld reports we added 281 exabytes of data to the global information total in 2007.

An exabyte is a billion gigabytes, so that works out at roughly 47GB of data for each of the world’s 6 billion people. If 800MB of text is roughly a 30 metre high stack of books, each person’s share would be a stack well over a kilometre tall.
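
For anyone who wants to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. The 281 exabyte total, the billion-gigabyte definition and the 6 billion population figure are all taken from this article, nothing more:

# Rough per-person share of the data added in 2007, using the article's figures.
GB_PER_EXABYTE = 1_000_000_000     # an exabyte is a billion gigabytes
total_gb = 281 * GB_PER_EXABYTE    # 281 exabytes added in 2007
population = 6_000_000_000         # roughly 6 billion people

per_person_gb = total_gb / population
print(round(per_person_gb))        # prints 47, i.e. roughly 47GB per person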

It’s a lot of information.

Or maybe not. Storage experts believe that anywhere from 80 to 90 percent of stored data is anything but valuable.

Worthless storage, junk information

In 2002 I spoke to Rob Nieboer, who at the time was StorageTek’s Australian and New Zealand storage strategist. He revealed the vast bulk of data stored on company systems is worthless.

He said, “I haven’t met one person in the last three years who routinely deletes data. However, as much as 90 percent of their stored data hasn’t been accessed in months or years. According to the findings of a company called Strategic Research, when data isn’t accessed in the 30 days after it is first stored there’s only a two percent chance it will get used later.”

At the same time, companies often store the same data files repeatedly in a single file system. Nieboer said it’s not unusual for one system to hold as many as 15 separate copies of the same file.
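
As a purely illustrative sketch, and not a tool Nieboer or StorageTek recommended, a short script like the one below shows how you could get a rough count of duplicate copies on a single system: it walks a directory tree, hashes each file’s contents and groups files whose contents are identical.

import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    # Group files under root by the SHA-256 hash of their contents.
    groups = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue  # skip files we can't read
            groups[digest].append(path)
    # Keep only the hashes that turned up more than once, i.e. duplicate copies.
    return {d: paths for d, paths in groups.items() if len(paths) > 1}

if __name__ == "__main__":
    for digest, paths in find_duplicates(".").items():
        print(len(paths), "copies:", paths)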

Storage Parkinson’s Law

According to Rosemary Stark (also interviewed in 2002 when she was Dimension Data’s national business manager for data centre solutions), storage obeys a version of Parkinson’s Law.

She said, “It’s a case of if you build it, they will come. Put together a system with 2GB of storage and pretty quickly it will fill up with data. Buy a system with 200GB of storage and that will also fill up before too long.”

Like Nieboer, Stark said there’s a huge problem with multiple copies of the same information, though she estimated the volume of unused archive material at closer to 80 percent. Even so, she said that 80 percent isn’t all junk: “It’s like the paper you keep on your desk. You don’t want it all, there may be a lot you can safely throw away, but sometimes there are things you need to keep just in case you need them again later.”

Needles and haystacks

Although many companies focus on the economic cost of storing vast amounts of junk information, there’s a tendency to overlook the performance overhead imposed by unnecessary data. In simple terms, computer systems burn resources ploughing through haystacks of trash to find valuable needles of real information.

There are other inefficiencies. Stark said she had seen applications, databases for example, that use, say, 300 terabytes of storage even though the actual data might only be 50 terabytes. This happens when systems managers set aside capacity for anticipated needs. The situation is a little like a mother buying her child outsize clothes on the grounds that the youngster will eventually grow into them.

Nieboer said there are inherent inefficiencies in various systems.

Mainframe disks are typically only 50 percent full. With Unix systems, disks might be only 40 percent full; with Windows, the figure falls to 30 percent.

Last night many of my neighbours switched off their electricity for Earth Hour. It wasn’t an unusual event.

That’s because there have been at least four major power cuts in my North Shore suburb since I moved there at the end of 2004.

Over the last five years, the power has been off for roughly one day in every year. I’ve also seen two disruptive power cuts while working in Auckland’s central business district. In at least one of those cases the city, and my newspaper, lost an entire day’s production.

Earlier this year the upmarket suburbs of Newmarket and Parnell suffered two more power cuts. The New Zealand Herald covered the story on February 19 in “Power restored to Auckland suburbs”, saying shopkeepers’ confidence in Vector (the local power company) was at an all-time low.

As we shall see, that is saying something.

Not good enough

To be frank, it just isn’t good enough. Admittedly things in Auckland aren’t as bad as, say, Manila or Jakarta, but for a first world country like New Zealand, frequent power cuts are not a good look.

Ten years ago, when US President Bill Clinton visited Auckland for the APEC meeting, he brought a portable electricity generator. It sat outside his chic city centre hotel on the back of a large, dark, unmarked truck that some locals dubbed the ‘stealth generator’.

In any other major western city, Clinton’s precaution might have seemed a little over the top and would have insulted his hosts.

But Auckland is different. A year and a half before the leader of the free world flew into New Zealand’s largest city, locals were checking their shopping change by candlelight after Auckland’s lights went out and stayed out for almost six weeks.

Wikipedia entry on 1998 Auckland power crisis.

Even now, more than ten years after that blackout, many Auckland residents fear their city’s power supply is still not secure. With good reason: at 8:30 on 12 June 2006 the power went off again. Half of Auckland, including the entire CBD, was without power for over four hours. Some parts of the city suffered a longer outage.

Wikipedia entry on 2006 Auckland power cut.

The problems are partly a result of overzealous free-market reforms. The greed and arrogance of power company managers also play a part.

There’s an obvious parallel here with the global financial crisis.

And then there are New Zealand’s ridiculous planning laws. These mean generating companies have built no new power stations to meet booming demand. Thankfully this looks set to change, with the new Kaipara gas-fired power station finally getting the green light. But that’s only the start. We need more.

Electricity hampered by bureaucracy

The robust networks needed to move power from where it is generated into the city are still often held up by endless bureaucracy and over-the-top planning processes.

At the time of the 1998 crisis, Auckland’s power supply depended on four cables: two gas-filled and two oil-filled. On 22 January 1998, one of the gas-filled cables failed. Power wasn’t disrupted, as the remaining three cables took up the load.

On 9 February, the second gas-filled cable failed. The power went off and Mercury Energy Limited, which operated the cables, asked its customers to voluntarily cut power consumption by 10 percent.

On 19 February, the emergency started in earnest when a fault in one of the oil-filled cables caused both remaining cables to shut down, cutting off all power to the city.

Mercury repaired one of the cables quickly, but it could only supply a reduced load. Power rationing was introduced to the city centre, which saw rolling two-hour power cuts.

At 5.30pm on Friday 20 February, the last cable failed. Mercury issued a statement that it “could no longer supply the Central Business District with electricity”. A limited standby cable managed to provide power to the district’s two hospitals and other emergency services.

For more than five weeks the nation’s largest city was plunged into chaos. Many businesses had little choice but to close down temporarily. Others relocated staff to other New Zealand cities or even flew them to Australia.

50,000 workers affected

At least 50,000 workers and 8,500 businesses were affected. Estimated costs were around NZ$400 million, though that figure does not include losses to tourism, international trade and financial services. In a small nation like New Zealand, the shutdown was serious enough to tip an already fragile economy into recession.

Who knows how much investment hasn’t happened because of the flaky infrastructure?

At the time of the blackout, Jim Macaulay, chairman of Mercury Energy, the company that supplied Auckland’s electricity, told the media it was “the most incredible, freakish bad luck you could ever imagine.” However, the government inquiry into the blackout found that there were many warnings that such an event could occur.

Well, it could happen again. Earth Hour should have acted as a reminder to people living in a city where electricity and light can’t entirely be taken for granted.

Open source software is free.

Anyone can download an open source program. You can run it, copy it and pass it on to friends and colleagues. You can look at the code and see how the developers made the program. None of this means paying a licence fee. It doesn’t break any laws. You have permission to do all these things.

Money, or cost, is not the most important point. Advocates think of the word free as in ‘free speech’, not as in ‘no payment’.

Freedom means that users can change the programs to suit their own needs. That would be illegal with most other forms of software.

Open Source freedom means responsibility

There is one restriction: when you pass an altered open source program on to others, you must pass the same set of freedoms on with it.

This approach decentralises control. In turn, that means developers continually improve the software.

At the same time, having large numbers of people looking at and improving on programs means that bugs are quickly eliminated. That improves quality control.

A lot of important programs and applications are based on free software. It runs the internet and underpins some of the most popular operating systems.

Technology companies talk up their products and technologies. Let’s not mince words: they are hype merchants.

They hire professional public relations consultants and advertising agencies to whip up excitement on their behalf.

Sometimes they convince people in the media to follow suit and enthuse about their new gizmos or ideas.

Occasionally the media’s constant search for hot news and interesting headlines leads to overenthusiastic praise or a journalist swallowing a trumped-up storyline.

Hype cycle

None of this will be news to anyone working in the business. What you may not know is that the IT industry’s shameless self-promotion has now been recognised and enshrined in Gartner’s Hype Cycle.

Gartner Hype Cycle diagram

Gartner analysts noticed a pattern in the way the world (and the media) views new technologies: a huge initial burst of excitement, rapidly followed by a sigh of disillusion and, eventually, a more balanced approach.

This observation evolved into the Hype Cycle represented graphically in the diagram. The horizontal axis shows time, while the vertical axis represents visibility.

Five phases:

In the first phase, which Gartner calls the “technology trigger”, a product launch, engineering breakthrough or some other event generates enormous publicity.

At first only a narrow audience is in on the news. They may hear about it through the specialist press and start thinking about its possibilities.

Things snowball. Before long the idea reaches a wider audience and the mainstream media pays attention.

This interest gets out of control until things reach the second phase, which Gartner calls “the peak of inflated expectations”. At this point the mainstream media becomes obsessed – you can expect to see muddle-headed but enthusiastic TV segments about the technology.

You know things have peaked for sure when current affairs TV shows and radio presenters pay attention.

At this point people typically start to have unrealistic expectations. While there may be successful applications of the technology, there are often many more failures behind the scenes.

Trough of disillusionment

Once these disappointments become public, the Hype Cycle shifts into what Gartner poetically calls the “trough of disillusionment”. The mainstream press turns its back on the story; others become critical. Sales may drop. The idea quickly falls out of favour and starts to look unfashionable.

Occasionally ideas and technologies sink beneath the waves at this point, but more often they re-emerge on the “slope of enlightenment”. This is where companies and users who persisted through the bad times come to a better understanding of the benefits on offer. By now most of the media has lost interest and may ignore the technology altogether; the good work happens quietly in the background.

Finally, the cycle reaches the “plateau of productivity”, the point at which the benefits of the idea or technology are widely understood and accepted.
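
To make the shape of the curve concrete, here is a purely illustrative Python sketch that draws a stylised hype cycle: a sharp early peak, a trough, then a slow climb to a plateau. The formula and the numbers are invented for the drawing; they are not Gartner’s data.

import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 10, 500)                 # time along the horizontal axis
peak = np.exp(-((t - 2.0) ** 2) / 0.5)      # peak of inflated expectations
plateau = 0.6 / (1 + np.exp(-(t - 6.5)))    # slope of enlightenment into the plateau
visibility = peak + plateau                 # visibility on the vertical axis

plt.plot(t, visibility)
plt.xlabel("Time")
plt.ylabel("Visibility")
plt.title("Stylised hype cycle (illustrative only)")
plt.show()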