
There’s a widespread feeling that remote work is now mainstream. An assumption that we will never go back to the old ways of working.

If that’s true, it’s been a long time coming. For years experts and pundits have predicted that many more people will work from home.

There’s evidence that the number of people working from home, at least for some of the time, was already rising before the Covid-19 pandemic accelerated the trend.

Remote work’s slow, steady rise

The rise has been slow, but steady.

It exploded when New Zealand and most of the world went into lockdown. Anyone whose work could be done remotely logged on from home. Data volumes on broadband networks soared and hitherto esoteric applications like Zoom became part of everyday life for white-collar workers.

For some, home will be cemented in as their main future workplace. It’s worth remembering this only applies to certain types of work: a surgeon can’t operate via Zoom, nor can a supermarket shelf stacker.

The National Business Review closed down its editorial office, apparently for good. The paper gave staff an allowance to cover home working costs.

Lonely, alienating?

Remote work need not be too lonely or alienating for journalists. The job often involves attending functions or meeting people for interviews or information gathering.1 Still, it could be hard going for some.

Yet, hundreds of companies are working through similar plans.

One key thing that changed with the early 2020 lockdown was that managers and individuals alike realised that mass remote working is possible and practical. Until then there was scepticism, especially among more anally retentive managers.

Productivity questions

There are still questions over remote work’s desirability in every case. Some voices say it is more productive. Some managers hate the idea of not being able to look out from the corner suite and see rows of heads down as people beaver away.

For what it’s worth, my experience over the years is that it can be more productive at times, but work quickly eats into the rest of your life. I certainly do more than a forty-hour week and can count the number of non-working weekends over the last 15 years on my fingers.

Either way, my gut tells me that while we are going to see more home or remote working than before lockdown, there will also be a drift back to the office.

Mix and match remote work

Maybe people will work from home two or three days a week and commute on the other days. Or it could be people will work one way when they need to focus on their own, and another way when close collaboration is needed.

It’s not all about me, but let’s go back to my experience in this department. I find if I only work from home, my productivity is OK, but not great.

Likewise, if I only work from an office, it’s not great either. But if I mix things up, productivity shoots up. If I have the freedom to work from home as and when the mood or my energy levels dictate, I get the best result of all.

Your experience might be different. It certainly will if you have a young family or if there’s not a lot of space at home. In those cases getting out is wise. This goes some way to explain the popularity of co-working spaces.

Which brings me to the other key point. It’s likely that many future workplaces will look and feel a lot more like co-working spaces.


  1. Well, that’s my experience: I’ve been working this way for almost 15 years now, and sporadically for periods before that. ↩︎

Writing in the Guardian, Douglas Rushkoff says our technology is now an entire environment. We live there. We’ve spent the decade letting our tech define us. It’s out of control.

He says:

“We may come to remember this decade as the one when human beings finally realized we are up against something. We’re just not quite sure what it is.

“More of us have come to understand that our digital technologies are not always bringing out our best natures. People woke up to the fact that our digital platforms are being coded by people who don’t have our best interests at heart. This is the decade when, finally, the “tech backlash” began.

“But it’s a little late.”

It is a long essay and not easy reading, especially at a time of year when most New Zealanders and Australians have switched off their work brains.

Yet, if you have the time, it is worth reading in full.

Rushkoff knows his stuff and offers some powerful insights. In the essay he runs through the key issues.

Issues are not new

To cut it short, he starts out by saying surveillance capitalism and manipulation are not new. They have long been part of our online activity and our apps. It’s been going on for 20 years now.

He says that while these ideas are getting all the attention today, things have moved on. Surveillance capitalism and manipulation may no longer be the most relevant concerns.

Rushkoff argues we now spend most of our waking hours bathing in the waters of Facebook, Twitter, Apple and Google. In other words: “We have been shaped into who the data says we are”.

Join the party

Until now, the common response has been about joining in. There is pressure on young people to learn to code. I’m all for motivated, interested youngsters learning to code; it remains a good career choice.

Yet we don’t have enough people tackling these issues from a social science or arts point of view. (Rushkoff talks about the liberal arts.)

Writers, journalists, movie makers, artists and others have an important role to play. We can communicate an understanding of what is going on from a non-engineering, non-financial perspective.

It’s a complex, deep essay. You may find it too much to absorb in a single reading. I’ve come back to it a few times.

A disappointing omission is that Rushkoff fails to make a connection between this and evidence that our digital lives make us less happy.

Take back control

One thing we can do to mitigate the problems is take back control of our online experience and spend less time bathing in what is, if not a toxic soup, certainly something less than ideal.

How to fight back? First, do all the obvious hygiene things. Quit Facebook, choose apps and operating systems where there is room for privacy. Use alternatives to Google.

Embrace openness in all its aspects, not only open source software. Be wary of products like Android, which are surveillance tools with a little usefulness thrown in.

Be especially wary of ‘free’ services. The price you pay may be far higher than you think.

You don’t have to learn to code. Indeed, unless you have an aptitude or an urge to do so, I recommend you don’t. Read more printed books instead. And when you do, write and talk about your experiences and ideas.

Declare independence

Try to develop an independent online presence. One that isn’t part of a commercial data collection operation.

Learn how to use WordPress. Write a blog instead of posting articles on Facebook or LinkedIn. Share things. Investigate ideas like the IndieWeb or microblogging; both are refreshing. Build links with humans, not corporations or bots.

Rushkoff’s optimistic finishing points echo those broad ideas, even if he dresses them in different language. The key here is to seize back as much control as you can.

You’ll be happier.

As education minister, Hekia Parata upset New Zealand’s tech sector by not elevating digital technology teaching to the level they asked for.

The debate hasn’t stopped.

It may never stop. The technology industry is wealthy and powerful. It knows how to lobby. It is a master of using the media. Its voice will be heard.

It has a good point.

Digital curriculum

There’s a strong case for giving digital technology a greater share of the curriculum.

Digital technology doesn’t belong in a vocational ghetto alongside woodwork and other non-academic subjects.

While there is a case for non-academic digital education, technology also needs to be taught to a higher standard.

But let’s not get carried away. Whether you call it digital literacy or digital technology, it should not be on a par with language or maths teaching.

They are fundamental.

Although you might argue the same about technology’s role in the modern world, that’s not quite true.

Literacy and digital: different aspects of the same thing

You can’t do digital well without being able to read and communicate.

Without reading skills a young person’s digital experience can’t advance far beyond taking selfies, playing games and watching streamed video.

Most digital devices are, one way or another, communications tools.

Even if the tools evolve to the point where an ability to write or type is no longer essential, people still need basic communications skills.

If the goal is to encourage more young New Zealanders into technology careers, they need to be articulate and numerate to cope with the work.

At a pinch you can train a literate, articulate adult to work in almost any tech industry role. While it is not impossible to find meaningful work for those without those skills, it is much harder.

Education’s bigger picture

Education has to be about more than preparing people for the workplace. Digital skills are important for every other aspect of life.

Which brings us to the most important aspect of technology education: the digital divide.

We often think the digital divide is only about access to devices, tools and networks. It is usually framed as something that affects poorer or more remote New Zealanders.

Yet there’s another divide that’s just as bad.

People who feel unable to perform even basic digital tasks because they lack the skills are as disadvantaged as those who can’t get online.

The same goes for people who can’t read, write or otherwise communicate. We don’t call that a digital divide, but it amounts to something similar. Let’s call it the literacy divide.

It’s great that we devote money, time and energy to helping people get across the digital divide. More power to those working in this area.

Yet we also need to use the same vigour to deal with the basic literacy divide because those people are in the same dark place.

So, by all means ramp up digital education, but not at the expense of something that’s more fundamental.

I’m not an early adopter.

Early adopters must own the latest devices. They run ahead of the pack. They upgrade devices and software before everyone else.

Early adopters use the latest phones. They buy cars with weird features. They queue up in the wee small hours for iPhones, iPads or games consoles. Back in the day they’d go to midnight store openings to get the newest version of Microsoft Windows a few hours earlier.

Their computers never work because they are awash in beta and alpha versions of software screwing things up.

And some of their kit is, well, unfinished.

Computer makers depend on early adopters. They use them as guinea pigs.

Early adopters first to benefit

Marketing types will tell you early adopters buy a product first to steal a march on the rest of humanity. They claim they will be the first to reap the benefits of the new product, that it will make them more productive or help them live more enjoyable lives.

This can be true. Yet early adopters often face the trauma of getting unfinished, unpolished products to work, often before manufacturer support teams have learnt the wrinkles of their new products.

There’s another reason computer makers love early adopters: they pay more.

New products usually hit the market at a premium price. Once a product matures, the bugs are eliminated and competition appears, profit margins become slimmer.

Companies use high-paying early adopters to fund their product development.

Being an early adopter is fine if you enjoy playing with digital toys, if productivity isn’t as important to you as being cool, and if you have the time and money to waste making them work.

I don’t. I prefer to let others try things first. Let computer makers and software developers iron out the wrinkles while the product proves its worth. Then I’ll turn up with my money.

In technology the early bird pays the bill.