
When software giant Oracle set out to build its internal eCRM system, the implementation team quickly ran into problems. Like any multinational, Oracle comprises many smaller business units, each with its own agenda — and not every division was keen to work with a centralised eCRM system.

James Finlay, the CRM marketing manager for Oracle Australia, says, “Our implementation team suffered a large amount of pain”. He says the pockets of resistance didn’t crumble until Chairman Larry Ellison took control of the project and cracked a few heads. “In order to get it implemented around the world, it had to be pushed personally by Larry. Along the way a couple of senior people in the organisation had to be dealt with”.

Oracle’s experience underlines the single most important challenge facing organisations wishing to implement Customer Relationship Management systems. Although CRM is often sold by technology companies and regarded as a technical solution, its scope goes way beyond the nuts and bolts of hardware, networks and software. All these components play a role, but at its core, CRM is a business strategy or even a philosophy.

Given the complex technology embedded in CRM systems, it’s easy to lose sight of why an organisation might want to install such a system in the first place.

This story first appeared in Fairfax’s Business Online magazine

Overcoming the main CRM challenges

  • Find a high profile, top-level sponsor – preferably your CEO or Managing Director.
  • Maintain a focus on the strategic goals of eCRM; don’t get bogged down with technical considerations.
  • Show users the practical benefits – address their fears and deal with any aspect that threatens them.
  • Develop the right management infrastructure – make sure it meshes with your strategic CRM goals rather than operational issues.
  • Create a customer-focused culture so that CRM technology doesn’t exist in a vacuum.
  • Build an integrated technical infrastructure so that your CRM components work seamlessly with each other.

Technology trap

Terry Gatward, e-business manager of Fuji Xerox Australia, admits that in the earlier stages his company fell into the trap of viewing CRM from a technology perspective. He says, “It became too much of an IT project and not enough of a business strategy. Things got back on track when we got the sales managers and our managing director behind it.”

This involved showing sales managers and sales staff the practical benefits of CRM. “Frankly, most of them were frightened that CRM was going to be used as a stick to beat them when they stepped out of line. We also found sales people were happy to use it to extract information, but they wouldn’t put any data in.”

Fuji Xerox identified sales commission as the key issue. Employees were concerned that by sharing information they would be passing their commission over to other sales offices. Gatward says the company got around this problem by developing a set of rules to determine who gets commission for various sales. “More importantly, we got the sales managers to enforce the rules.”

Management structure

Putting the right management infrastructure in place is crucial. At Fuji Xerox the CRM application support team is now part of the marketing department – which reflects its strategic role within the organisation. The infrastructure is still supported by the IS department.

London-based Michael Barnes, a senior program director with the Meta Group, agrees that dealing with issues of corporate culture and internal politics is vital. “Mitigating the cultural and political challenges of implementing an enterprise CRM strategy is as important as dealing with technical issues. Our research reveals roughly half of the Global 2000 organizations have or plan to establish a CRM Program Management Office (PMO) this year.”

Barnes says that companies that already have a unified global view of CRM are twice as likely to establish a PMO as companies that take a tactical approach. He regards this structure as an important indicator of eventual success. Establishing a PMO is “clearly a realization of the business complexity of getting CRM right.”

Experience gained from working in the banking industry gives Ralph Tomerlin a perspective on the relationship between company culture and CRM. Tomerlin, who is now managing director of eCRM vendor Delano Asia Pacific, says that historically many organisations, including banks, offered customers a fixed set of products and had a take-it-or-leave-it attitude.

Knowing customers

He says, “In the past your customers would know far more about you than you knew about them. Nowadays you have to compete and your mentality has to change. You can put in a system which lets you know the value of your customers – but it’s what you do with this information once you have it that really matters.”

Tomerlin argues that to make CRM effective companies need to re-educate their staff to be more aware of customer needs and organisational goals. “You need to get into the mind of your employee how to sell, how to upsell and to constantly seek ways of increasing the value of each customer”.

The rationale behind this is simple arithmetic. In a bank or a brokerage firm it typically costs $400 to acquire a new customer, and the relationship can take years to pay off. Because CRM generally allows companies to focus on selling more to existing customers by meeting their needs, rather than acquiring new ones, it can have a dramatic impact on the bottom line – but for that to happen employees have to be looking at shifting customers to more profitable lines of business.

However, Oracle’s Finlay says it is important not to confuse being customer focused with having an eCRM system. “The two are linked but they are not the same – you can’t have one without the other.” Finlay also believes that a company-wide customer-focused culture has to come from the top. “It’s determined by the CEO, no-one else.”


Gatward says Fuji Xerox’s cultural experience is that some sales people take to CRM immediately, while others are far more cautious. He says, “We’ve got some sales people who love it. They exploit it, using it all the time, and they like to talk to you about how it improves their productivity. This is good; we’ve found it really helps to have CRM champions.”

Putting the right technical infrastructure in place is important. Michael Barnes says most companies fail to create a CRM technology ecosystem that includes operational, collaborative and analytical CRM components. “Instead they are purchasing disparate eCommerce and CRM products, services and solutions that do not play well together. For example, e-mail response and Web click-through data collection systems are not integrating into analytical marketing applications”.

Barnes says eCRM remains essentially a misnomer. “Except for a minority of regular Web users and the more significant business-to-business segment, the e-customer is missing in action. To date, most companies have found that the Internet remains most useful for generating sales leads that can be pursued via more traditional channels. This is changing – but slowly.”

Computerworld reports we added 281 exabytes of data to the global information total in 2007.

An exabyte is a billion gigabytes, so 281 exabytes works out at roughly 47GB of data for each of the world’s 6 billion people.
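That per-person figure is simple arithmetic to check. A minimal sketch, assuming the article’s own conventions (decimal units, with one exabyte equal to a billion gigabytes, and a world population of 6 billion):

```python
# Per-person share of 281 exabytes, using the article's figures:
# 1 exabyte = 1 billion gigabytes; world population taken as 6 billion.
total_exabytes = 281
population = 6_000_000_000

total_gigabytes = total_exabytes * 1_000_000_000
per_person_gb = total_gigabytes / population

print(f"Roughly {per_person_gb:.0f}GB per person")  # prints: Roughly 47GB per person
```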

It’s a lot of information.

Or maybe not. Storage experts believe that anywhere from 80 to 90 percent of stored data is anything but valuable.

Worthless storage, junk information

In 2002 I spoke to Rob Nieboer, who at the time was StorageTek’s Australian and New Zealand storage strategist. He revealed the vast bulk of data stored on company systems is worthless.

He said, “I haven’t met one person in the last three years who routinely deletes data. However, as much as 90 percent of their stored data hasn’t been accessed in months or years. According to the findings of a company called Strategic Research, when data isn’t accessed in the 30 days after it is first stored there’s only a two percent chance it will get used later.”

At the same time companies often store the same data files repeatedly in the same file system. Nieboer said it’s not unusual for a single system to hold as many as 15 separate copies of the same file.
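Finding those duplicate copies is straightforward in principle: group files by a hash of their contents and flag any hash with more than one member. A minimal sketch of the idea (the function name and approach are illustrative, not anything Nieboer describes):

```python
# Group files under a directory tree by a SHA-256 hash of their contents,
# then keep only the groups that contain more than one copy.
import hashlib
from collections import defaultdict
from pathlib import Path


def duplicate_counts(root: str) -> dict[str, list[Path]]:
    """Return {digest: [paths]} for every file content stored more than once."""
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    return {d: paths for d, paths in groups.items() if len(paths) > 1}
```

Hashing every file is expensive on large systems; real storage-management tools typically compare file sizes first and only hash candidates that match, but the principle is the same.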

Storage Parkinson’s Law

According to Rosemary Stark (also interviewed in 2002 when she was Dimension Data’s national business manager for data centre solutions), storage obeys a version of Parkinson’s Law.

She said, “It’s a case of if you build it, they will come. Put together a system with 2GB of storage and pretty quickly it will fill up with data. Buy a system with 200GB of storage and that will also fill up before too long.”

Like Nieboer, Stark said there’s a huge problem with multiple copies of the same information, but she estimated the volume of unused archive material to be closer to 80 percent. And she said that 80 percent isn’t all junk: “It’s like the paper you keep on your desk. You don’t want it all, there may be a lot you can safely throw away but sometimes there are things you need to keep just in case you need them again later.”

Needles and haystacks

Although many companies focus on the economic cost of storing vast amounts of junk information, there’s a tendency to overlook the performance overhead imposed by unnecessary data. In simple terms, computer systems burn resources ploughing through haystacks of trash to find valuable needles of real information.

There are other inefficiencies. Stark said she has seen applications, such as databases, that use, say, 300 terabytes of storage even though the actual data might only be 50 terabytes. This happens when systems managers set aside capacity for anticipated needs. The situation is a little like a child’s mother buying outsize clothes on the grounds that the youngster will eventually grow into them.
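Stark’s example translates into a very low utilisation figure. A quick back-of-the-envelope calculation, using the hypothetical numbers from her example:

```python
# Over-allocation arithmetic from Stark's example: 300TB allocated,
# but only 50TB of actual data stored.
allocated_tb = 300
used_tb = 50

utilisation = used_tb / allocated_tb
idle_tb = allocated_tb - used_tb

print(f"Utilisation: {utilisation:.0%}, idle capacity: {idle_tb}TB")
# prints: Utilisation: 17%, idle capacity: 250TB
```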

Nieboer said there are inherent inefficiencies in various systems: mainframe disks are typically only 50 percent full, Unix system disks might be just 40 percent full, and with Windows this falls to 30 percent.

Last night many of my neighbours switched off their electricity for Earth Hour. It wasn’t an unusual event.

That’s because there have been at least four major power cuts in my North Shore suburb since I moved there at the end of 2004.

Over the last five years, the power has been off for roughly one day in every year. I’ve also seen two disruptive power cuts while working in Auckland’s central business district. In at least one of those cases, the city, and my newspaper, lost an entire day’s production.

Earlier this year the upmarket suburbs of Newmarket and Parnell suffered two more power cuts. The New Zealand Herald covered the story on February 19 in Power restored to Auckland suburbs saying shopkeepers’ confidence in Vector (the local power company) was at an all-time low.

As we shall see, that is saying something.

Not good enough

To be frank, it just isn’t good enough. Admittedly things in Auckland aren’t as bad as, say, Manila or Jakarta, but for a first world country like New Zealand, frequent power cuts are not a good look.

Ten years ago, when US President Bill Clinton visited Auckland for the APEC meeting, he brought a portable electricity generator. It was parked outside his chic city centre hotel on the back of a large, dark, unmarked truck that some locals dubbed the ‘stealth generator’.

In any other major western city, Clinton’s precaution might have seemed a little over the top, and it would have insulted his hosts.

But Auckland is different. A year and a half before the leader of the free world flew into New Zealand’s largest city, the locals were checking their shopping change by candlelight when Auckland’s lights went out and stayed out for almost six weeks.

Wikipedia entry on 1998 Auckland power crisis.

Even now, more than ten years after that blackout, many Auckland residents fear their city’s power supply is still not secure. With good reason: at 8:30 on 12 June 2006 the power went off again. Half of Auckland, including the entire CBD, was without power for over four hours. Some parts of the city suffered a longer outage.

Wikipedia entry on 2006 Auckland power cut

The problems are partly a result of overzealous free-market reforms, and partly the greed and arrogance of power company managers.

There’s an obvious parallel here with the global financial crisis.

And then there are New Zealand’s ridiculous planning laws. These mean generators have built no new power stations to meet booming demand. Thankfully this looks set to change with the new Kaipara gas-fired power station finally getting the green light. But that’s only the start. We need more.

Electricity hampered by bureaucracy

The robust networks needed to move power from where it is generated into the city are still often held up by endless bureaucracy and over-the-top planning processes.

At the time of earlier crises, Auckland’s power supply depended on four cables: two gas-filled cables and two oil-filled cables. On 22 January 1998, one of the gas-filled cables failed. Power wasn’t disrupted as the remaining three cables took up the load.

On 9 February, the second gas-filled cable was cut. The power went off and Mercury Energy Limited, which operated the cables, asked its customers to voluntarily cut power consumption by 10 percent.

On 19 February, the emergency started in earnest when a fault in one of the oil-filled cables caused both remaining cables to shut down, cutting off all power to the city.

Mercury repaired one of the cables quickly, but it could only supply a reduced load. Power rationing was introduced to the city centre, which saw rolling two-hour power cuts.

At 5.30pm on Friday 20 February, the last cable failed. Mercury issued a statement that it “could no longer supply the Central Business District with electricity”. A limited standby cable managed to provide power to the district’s two hospitals and other emergency services.

For more than five weeks the nation’s largest city was plunged into chaos. Many businesses had little choice but to close down temporarily. Others relocated staff to other New Zealand cities or even flew them to Australia.

50,000 workers affected

At least 50,000 workers and 8,500 businesses were affected. Estimated costs were around $NZ400 million, though that figure does not include tourism, international trade and financial services. In a small nation like New Zealand, the shut-down was serious enough to tip an already fragile economy into recession.

Who knows how much investment hasn’t happened because of the flaky infrastructure?

At the time of the blackout, Jim Macaulay, chairman of Mercury Energy, the company that supplied Auckland’s electricity, told the media it was “the most incredible, freakish bad luck you could ever imagine.” However, the government inquiry into the blackout found that there were many warnings that such an event could occur.

Well, it could happen again. Earth Hour should have acted as a reminder to people living in a city where electricity and light can’t entirely be taken for granted.

Open source software is free.

Anyone can download an open source program. You can run it, copy it and pass it on to friends and colleagues. You can look at the code and see how the developers made the program. None of this means paying a license fee. It doesn’t break any laws. You have permission to do all these things.

Money, or cost, is not the most important point. Advocates think of the word free as in ‘free speech’, not ‘no payment’.

Freedom means that users can change the programs to suit their own needs. That would be illegal with most other forms of software.

Open Source freedom means responsibility

There is one restriction: you must, in turn, pass the same set of freedoms on to everyone else. Altered open source programs must be made available to everyone.

This approach decentralises control. In turn, that means developers continually improve the software.

At the same time, having large numbers of people looking at and improving on programs means that bugs are quickly eliminated. That improves quality control.

A lot of important programs and applications are based on free software. It runs the internet and underpins some of the most popular operating systems.

Technology companies talk up their products and technologies. Let’s not mince words: they are hype merchants.

They hire professional public relations consultants and advertising agencies to whip up excitement on their behalf.

Sometimes they convince people in the media to follow suit and enthuse about their new gizmos or ideas.

Occasionally the media’s constant search for hot news and interesting headlines leads to overenthusiastic praise or a journalist swallowing a trumped-up storyline.

Hype cycle

None of this will be news to anyone working in the business. What you may not know is that the IT industry’s shameless self-promotion has now been recognised and enshrined in Gartner’s Hype Cycle.

Gartner Hype Cycle


Gartner analysts noticed a pattern in the way the world (and the media) viewed new technologies: a huge initial burst of excitement, rapidly followed by a sigh of disillusion and, eventually, a more balanced approach.

This observation evolved into the Hype Cycle represented graphically in the diagram. The horizontal axis shows time, while the vertical axis represents visibility.

Five phases:

In the first phase, which Gartner calls the “technology trigger”, a product launch, engineering breakthrough or some other event generates enormous publicity.

At first only a narrow audience is in on the news. They may hear about it through the specialist press and start thinking about its possibilities.

Things snowball. Before long the idea reaches a wider audience and the mainstream media pays attention.

This interest gets out of control until things reach the second phase, which Gartner calls “the peak of inflated expectations”. At this point the mainstream media becomes obsessed – you can expect to see muddle-headed but enthusiastic TV segments about the technology.

You know things have peaked for sure when current affairs TV shows and radio presenters pay attention.

At this point people typically start to have unrealistic expectations. While there may be successful applications of the technology, there are often many more failures behind the scenes.

Trough of disillusionment

Once these disappointments become public, the Hype Cycle shifts into what Gartner poetically calls the “trough of disillusionment”. The mainstream press will turn its back on the story, others will be critical. Sales may drop. The idea quickly falls out of favour and seems unfashionable.

Occasionally ideas and technologies sink beneath the waves at this point, but more often they re-emerge in the “slope of enlightenment”. This is where companies and users who persisted through the bad times come to a better understanding of the benefits on offer. By this stage most of the media has lost interest and may even ignore the technology; the good stuff happens quietly in the background.

Finally, the cycle reaches the “plateau of productivity”. This occurs when the benefits of the idea or technology are now widely understood and accepted.