The processor upgrade in the 2020 12.9-inch iPad Pro is incremental. Apple has done much more with the rear camera. Or, to be more accurate, cameras. The cluster includes two cameras and a lidar sensor.
For the last five years, phone upgrades have centred on beefed-up cameras. This follows that path.
The iPad’s main wide-angle camera is similar to the 12-megapixel, 28mm-equivalent camera on the earlier 2018 12.9-inch iPad Pro. If there’s a performance difference, I can’t see it.
By tablet standards it’s still a great camera, although it’s not as good as the camera on the iPhone 11. You wouldn’t expect it to be.
Second rear camera
There’s also a 10-megapixel ultra-wide-angle camera. It’s the first time the iPad has had a second rear camera. For the most part it helps the iPad Pro take better pictures in poor light.
In practice the two cameras work together much of the time.
Taking pictures with a 12.9-inch iPad Pro is unwieldy compared with an iPhone. The on-screen controls also get in the way. And it feels a bit odd standing there shooting images with a magazine-sized device.
The wide-angle lens makes this even harder when focusing on near objects. A lever effect comes into play: a small movement of the tablet shifts the camera’s target a fair distance.
No doubt there are enthusiasts who swear by the iPad Pro camera and do amazing things with it. For me, it is for opportunistic snapshots. I also use the iPad camera as a replacement for a scanner. It does a great job of capturing documents sitting on my desk.
On the front of the 12.9-inch iPad Pro is a seven megapixel camera for selfies and video conferencing. It’s limited when compared to the rear cameras, but is great for FaceTime or Skype calls.
Video conferencing camera better than a laptop’s
I’ve been using it while working from home. The picture quality is way better than you get on a MacBook or, for that matter, most Windows laptops.1
The only issue with the front camera is that it sits at the top of the display when you hold the iPad in portrait mode. This is the same as the camera on an iPhone.
It makes sense when you are using the tablet as a tablet. Yet when it is sitting on your desk, perhaps with an attached keyboard, the camera is off to the left-hand side.
The software is clever enough to adjust the image so that when you look face-on at the iPad in landscape mode, it centres your image. If you want to look people in the eye, you need to remember to stare at the left-hand edge of the display.
When Apple told me there was a lidar sensor, my first thought was that it would gauge depth, which would help with photography. Although that’s possible and, in theory, could come in a future software update, it’s not why Apple included the sensor.
For Apple, lidar is all about improving the augmented reality experience. You can use it to accurately measure the space around you.
The iPad has a new iPadOS measuring app that uses this. More often lidar is used in conjunction with augmented reality apps and games. You might, for example, have AR game characters running around your living room.
Lidar technology is used by autonomous and semi-autonomous cars to map the immediate world around them. The iPad version works up to 5 metres, which is more than enough inside most homes but less useful out of doors.
There’s no question this technology is clever, but I consider it a nice-to-have feature. It is far from essential as things stand right now. That could all change with the arrival of new applications that make use of it. Nothing springs to mind, but if it did I’d be a wealthy software entrepreneur, not a journalist.
1. The iPad Pro video calling experience is vastly better than calling on any laptop. This alone could justify the expense of buying a 2020 iPad Pro.