
Mark Shead at Productivity 501 writes about the Hawthorne effect:

The Hawthorne effect refers to some studies that were done on how training impacts employees’ productivity at work. The studies found that sending someone to training produces employees that work harder. The funny part about it is that you still get the productivity increase even if the training doesn’t teach them how to be better at their jobs. Sending someone to training helps them feel like they are important, like the company is investing in them and they are valuable. Because of this, they work harder.

An explanatory note at the bottom of Shead’s post points out the original tests were to do with changing light levels. You can read Shead’s original story at Hawthorne Effect: Productivity501.

It’s also worth reading the Wikipedia entry on the Hawthorne effect. There’s a good definition at Donald Clark’s site: The Hawthorne effect. Clark writes:

The Hawthorne effect – an increase in worker productivity produced by the psychological stimulus of being singled out and made to feel important.

Clark links the effect to work done by Frederick Taylor, who gave birth to the idea of industrial psychology.

My own common sense experience as a manager says you should pay attention to workers as a matter of course. Sadly this isn’t obvious to everyone. It certainly wasn’t back in the 1920s and 1930s when these ideas were fresh and new. If the effect shows up clearly among knowledge workers at your workplace, it’s a sign they weren’t getting enough attention in the first place. In other words, you aren’t managing people correctly.

See also: Taylor’s scientific management doesn’t apply to knowledge work

Computerworld reports we added 281 exabytes of data to the global information total in 2007.

An exabyte is a billion gigabytes. Divide 281 billion gigabytes among the world’s 6 billion people and it works out at roughly 47GB of data each. Printed and bound, that’s as much data as a stack of books well over a kilometre high.
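
If you want to check the arithmetic, here it is as a quick Python sketch using the same round numbers:

```python
# Back-of-the-envelope arithmetic for the per-person figure.
total_gigabytes = 281 * 1_000_000_000  # 281 exabytes at a billion GB each
people = 6_000_000_000                 # rough world population

per_person_gb = total_gigabytes / people
print(f"about {per_person_gb:.0f}GB per person")  # about 47GB per person
```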

It’s a lot of information.

Or maybe not. Storage experts believe that anywhere from 80 to 90 percent of stored data is anything but valuable.

Worthless storage, junk information

In 2002 I spoke to Rob Nieboer, who at the time was StorageTek’s Australian and New Zealand storage strategist. He revealed the vast bulk of data stored on company systems is worthless.

He said, “I haven’t met one person in the last three years who routinely deletes data. However, as much as 90 percent of their stored data hasn’t been accessed in months or years. According to the findings of a company called Strategic Research, when data isn’t accessed in the 30 days after it is first stored there’s only a two percent chance it will get used later.”
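
Strategic Research didn’t publish its method here, but spotting data nobody touches is straightforward. Here’s a minimal Python sketch that lists files not accessed in the last 30 days. The /data path is a placeholder, and the sketch relies on filesystem access times, which many systems disable (the noatime mount option), so treat its results with care:

```python
import os
import time

STALE_AFTER_DAYS = 30  # the threshold from the Strategic Research finding

def stale_files(root, days=STALE_AFTER_DAYS):
    """Yield file paths under root not accessed in the last `days` days."""
    cutoff = time.time() - days * 86400
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    yield path
            except OSError:
                continue  # unreadable or vanished file; skip it

# "/data" is a placeholder; point this at a real directory.
for path in stale_files("/data"):
    print(path)
```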

At the same time, companies often store the same data files repeatedly in a single file system. Nieboer said it’s not unusual for one system to hold as many as 15 separate copies of the same file.
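
How would you find those 15 copies? The obvious approach, sketched below in Python, is to group files by a hash of their contents. This isn’t how StorageTek measured the problem, just an illustration; it reads each file whole, whereas real deduplication tools compare file sizes first to avoid hashing everything:

```python
import hashlib
import os
from collections import defaultdict

def find_duplicates(root):
    """Group files under root by a SHA-256 hash of their contents."""
    by_hash = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    digest = hashlib.sha256(f.read()).hexdigest()
            except OSError:
                continue  # skip files we can't read
            by_hash[digest].append(path)
    # Keep only hashes that appear more than once.
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}

# "/data" is a placeholder path.
for digest, copies in find_duplicates("/data").items():
    print(f"{len(copies)} copies: {', '.join(copies)}")
```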

Storage Parkinson’s Law

According to Rosemary Stark (also interviewed in 2002 when she was Dimension Data’s national business manager for data centre solutions), storage obeys a version of Parkinson’s Law.

She said, “It’s a case of if you build it, they will come. Put together a system with 2GB of storage and pretty quickly it will fill up with data. Buy a system with 200GB of storage and that will also fill up before too long.”

Like Nieboer, Stark said there’s a huge problem with multiple copies of the same information, though she estimated the volume of unused archive material to be closer to 80 percent. And that 80 percent isn’t all junk: “It’s like the paper you keep on your desk. You don’t want it all, there may be a lot you can safely throw away, but sometimes there are things you need to keep just in case you need them again later.”

Needles and haystacks

Although many companies focus on the economic cost of storing vast amounts of junk information, there’s a tendency to overlook the performance overhead imposed by unnecessary data. In simple terms, computer systems burn resources ploughing through haystacks of trash to find valuable needles of real information.

There are other inefficiencies. Stark said she has seen applications, databases for example, that use, say, 300 terabytes of storage even though the actual data might only be 50 terabytes. This happens when systems managers set aside capacity for anticipated needs. The situation is a little like a mother buying a child outsize clothes on the grounds that the youngster will eventually grow into them.

Nieboer said there are inherent inefficiencies in various systems.

Mainframe disks are typically only 50 percent full. With Unix systems, disks might be only 40 percent full; with Windows the figure falls to 30 percent.
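
Checking figures like these on your own systems is easy enough. Here’s a small Python sketch that reports how full each filesystem is; the mount points listed are placeholders:

```python
import shutil

MOUNT_POINTS = ["/", "/home"]  # placeholder examples; use your own

for path in MOUNT_POINTS:
    usage = shutil.disk_usage(path)
    percent_full = usage.used / usage.total * 100
    print(f"{path}: {percent_full:.0f}% full "
          f"({usage.used / 1e9:.0f}GB of {usage.total / 1e9:.0f}GB)")
```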

Drawing parallels between the music industry and the newspaper business is not new.

Both industries are in free fall. Both are leaving skilled career-committed professionals struggling to find ways to carry on doing what they are good at while putting food on the table.

Although the newspaper industry may now be collapsing faster than the music business, the record companies started their decline earlier, which means musicians have had longer to work out ways of coping.

And some of them are coping quite nicely, thank you.

David Byrne’s Survival Strategies for Emerging Artists suggests journalists can adapt some lessons already learnt by professional musicians.

Byrne’s article starts with a description of what happened to the business that is, from a musician’s point of view, optimistic:

What is called the music business today, however, is not the business of producing music. At some point it became the business of selling CDs in plastic cases, and that business will soon be over. But that’s not bad news for music, and it is certainly not bad news for musicians. Indeed, with all the ways to reach an audience, there have never been more opportunities for artists.

You could make an equally uplifting argument about the opportunities for journalists. There is one huge difference between the two industries, though: no-one is going to pay to see journalists perform live.

Later Byrne looks at possible distribution models – most of which have analogies in the newspaper world.

David Byrne’s Survival Strategies for Emerging Artists

Abraham Maslow’s hierarchy of needs first appeared in 1954. The world has changed over the past 55 years. We’re not the same, and critics have challenged Maslow.

You can read more about Maslow and his hierarchy of needs in Motivation and the hierarchy of needs. Some criticism is covered in Challenging Maslow’s Hierarchy of Needs.

Maslow’s hierarchy is often shown as a pyramid, with the implication that people move up the pyramid as their lives improve.

Take a knowledge worker. The person will gain skills, win responsibility and in turn earn extra income. This takes care of the lower levels of the hierarchy.

Self-actualisation

According to Maslow this makes it possible to move up to self-actualisation. Think of this as a kind of western nirvana.

Today’s global financial crisis means many workers are moving in the opposite direction.

Being laid off is traumatic. In some cases people can be at the pinnacle of the hierarchy one day and slide all the way to the bottom the moment the pink slip appears. The same thing can happen when disaster strikes. Finding food, shelter and warmth is once more the most important item on the agenda.

Many redundant workers pick themselves up and climb back up Maslow’s pyramid. The journey is easier the second time around. Knowing the route and recognising the landmarks along the way helps.

Maslow’s theory works well enough for the bottom four stages. You only have to look around to see people at each level. And occasionally you’ll notice people moving up or down.

You don’t see many self-actualised pyramid toppers.

Even in the good times before the economy nose-dived, such Brahmins were thin on the ground. That was true even in the upper reaches of the economy, which is exactly where the pyramid suggests you should find them.

What does this tell us?

Maslow’s hierarchy is a useful theory, but it’s not a pyramid. It is a four-step ladder. At each step up the ladder there’s a slide that could take you back down again. In other words, it’s a game of snakes and ladders.

Last night many of my neighbours switched off their electricity for Earth Hour. It wasn’t an unusual event.

That’s because there have been at least four major power cuts in my North Shore suburb since I moved there at the end of 2004.

Over the last five years the power has been off for roughly one day in every year. I’ve also seen two disruptive power cuts while working in Auckland’s central business district. In at least one of those cases the city, and my newspaper, lost an entire day’s production.

Earlier this year the upmarket suburbs of Newmarket and Parnell suffered two more power cuts. The New Zealand Herald covered the story on February 19 in Power restored to Auckland suburbs, saying shopkeepers’ confidence in Vector (the local power company) was at an all-time low.

As we shall see, that is saying something.

Not good enough

To be frank, it just isn’t good enough. Admittedly things in Auckland aren’t as bad as, say, Manila or Jakarta, but for a first world country like New Zealand, frequent power cuts are not a good look.

Ten years ago, when US President Bill Clinton visited Auckland for the APEC meeting, he brought a portable electricity generator. It sat outside his chic city centre hotel on the back of a large, dark, unmarked truck that some locals dubbed the ‘stealth generator’.

In any other major western city, Clinton’s precaution might have seemed a little over the top. It might even have insulted his hosts.

But Auckland is different. A year and a half before the leader of the free world flew into New Zealand’s largest city, Auckland’s lights went out and stayed out for almost six weeks, leaving the locals checking their shopping change by candlelight.

Wikipedia entry on 1998 Auckland power crisis.

Even now, more than ten years after that blackout, many Auckland residents fear their city’s power supply is still not secure. With good reason: at 8:30am on 12 June 2006 the power went off again. Half of Auckland, including the entire CBD, was without power for over four hours. Some parts of the city suffered a longer outage.

Wikipedia entry on 2006 Auckland power cut

The problems are partly a result of overzealous free-market reforms. The greed and arrogance of power company managers also played a part.

There’s an obvious parallel here with the global financial crisis.

And then there are New Zealand’s ridiculous planning laws, which mean generators have built no new power stations to meet booming demand. Thankfully this looks set to change, with the new Kaipara gas-fired power station finally getting the green light. But that’s only the start. We need more.

Electricity hampered by bureaucracy

The robust networks needed to move power from where it is generated into the city are still often held up by endless bureaucracy and over-the-top planning processes.

At the time of the 1998 crisis, Auckland’s power supply depended on four cables: two gas-filled and two oil-filled. On 22 January 1998 one of the gas-filled cables failed. Power wasn’t disrupted, as the remaining three cables took up the load.

On 9 February the second gas-filled cable was cut. The power went off and Mercury Energy Limited, which operated the cables, asked its customers to voluntarily cut power consumption by 10 percent.

On 19 February, the emergency started in earnest when a fault in one of the oil-filled cables caused both remaining cables to shut down, cutting off all power to the city.

Mercury repaired one of the cables quickly, but it could only supply a reduced load. Power rationing was introduced to the city centre, which saw rolling two-hour power cuts.

At 5.30pm on Friday 20 February, the last cable failed. Mercury issued a statement that it “could no longer supply the Central Business District with electricity”. A limited standby cable managed to provide power to the district’s two hospitals and other emergency services.

For more than five weeks the nation’s largest city was plunged into chaos. Many businesses had little choice but to close down temporarily. Others relocated staff to other New Zealand cities or even flew them to Australia.

50,000 workers affected

At least 50,000 workers and 8,500 businesses were affected. Estimated costs were around NZ$400 million, though that figure does not include tourism, international trade and financial services. In a small nation like New Zealand, the shutdown was serious enough to tip an already fragile economy into recession.

Who knows how much investment hasn’t happened because of the flaky infrastructure?

At the time of the blackout, Jim Macaulay, chairman of Mercury Energy, the company that supplied Auckland’s electricity, told the media it was “the most incredible, freakish bad luck you could ever imagine.” However, the government inquiry into the blackout found that there were many warnings that such an event could occur.

Well, it could happen again. Earth Hour should have acted as a reminder to people living in a city where electricity and light can’t entirely be taken for granted.