At its launch back in 2010, the iPad was heavily criticized for being a big iPhone. iOS 11 and the iPad Pro proved that wasn’t the case. Things further diverged with the introduction of the iPhone X, which has led to some confusion for anyone who regularly uses an iPad. I’ve been using an iPhone X and iPad Pro together for nearly six months now, and I often feel lost when moving back and forth between the devices – one with a physical home button, the other with webOS-like gestures. The result is a vastly different user experience, even though they run the same version of iOS on large rectangles of glass.
I also use both an iPhone X and an iPad Pro 12.9″, and I actually don’t see this as a problem at all. The two devices are vastly different, and I use them in completely different ways – one as a smartphone, the other as a laptop – so it only makes sense to use them differently. Forcing the iPad into the same gestures and UI as the iPhone only leaves it hamstrung; it restricts the iPad into being an oversized iPhone, while what I want is for the iPad to gain more and more features from classic operating systems like macOS and Windows.
I couldn’t agree more, Thom. I use both regularly as well, regularly being every single day, and I don’t get in the least confused between them. They have different use cases, and I wouldn’t expect every single thing about them to be identical. Of course, if Apple did bring the iPad into line with the iPhone X, we’d have another article just like this one claiming that the iPad and iPhone 8 are too different. I think people just want to complain sometimes.
I also have an iPhone X and an iPad Pro 12.9″, and I couldn’t agree more that the devices NEED different gestures from each other. One is a tiny screen and the other is the same size screen as a 13″ Apple laptop. Who in their right mind (or left mind) would think they should have the same gestures?
If the iPad Pro’s use case is basically to replace a MacBook or laptop, wouldn’t it be better to develop touch interfaces for macOS rather than make iOS more like macOS?
Both GNOME Shell and Windows 10 work on a touch display (to varying degrees of success), so I don’t know why Apple doesn’t do the same with macOS, unless they’re planning on killing it soon, which is a decent possibility…
The answer to that is obvious: applications. I haven’t tried GNOME on touch yet, but have you actually used Windows 10 and all your necessary applications via a touch screen? No trackpad, no mouse, just touch and keyboard? If you haven’t, try it for a few minutes and I think the answer will become evident. It’s not whether the operating system can be adapted (macOS already supports touch pretty well with an external monitor), it’s whether the user-facing applications ever would be. It makes far more sense to capitalize on an already huge base of touch-enabled applications than to try to push macOS into that role and hope app developers will properly code their applications for touch. It hasn’t worked out that well for Microsoft, after all, which can’t even be bothered to do it for its own apps, let alone get anyone else to put in the effort.