
Apple and Google pressed in antitrust hearing on whether app stores share data with product development teams

In today’s antitrust hearing in the U.S. Senate, Apple and Google representatives were questioned on whether they have a “strict firewall” or other internal policies in place that prevent them from leveraging the data from third-party businesses operating on their app stores to inform the development of their own competitive products. Apple, in particular, was called out for the practice of copying other apps by Senator Richard Blumenthal (D-CT), who said the practice had become so common that it earned a nickname within Apple’s developer community: “sherlocking.”

The term comes from Sherlock, Apple’s search tool from the early 2000s, which has its own Wikipedia entry under software. A third-party developer, Karelia Software, created an alternative tool called Watson. Following the success of Karelia’s product, Apple added Watson’s functionality to its own search tool, and Watson was effectively put out of business. “Sherlocking” later became shorthand for any time Apple copies an idea from a third-party developer in a way that threatens to destroy, or even destroys, their business.

Over the years, developers have claimed Apple “sherlocked” a number of apps, including Konfabulator (desktop widgets), iPodderX (podcast manager), Sandvox (app for building websites), Growl (a notification system for Mac OS X) and, in more recent years, F.lux (blue light reduction tool for screens), Duet and Luna (apps that make the iPad a secondary display), as well as various screen time management tools. Now Tile claims Apple has also unfairly entered its market with AirTag.

During his questioning, Blumenthal asked Apple and Google’s representatives at the hearing — Mr. Kyle Andeer, Apple’s Chief Compliance Officer, and Mr. Wilson White, Google’s Senior Director of Public Policy & Government Relations, respectively — if they employed any sort of “firewall” between their app stores and their business strategy.

Andeer somewhat dodged the question, saying, “Senator, if I understand the question correctly, we have separate teams that manage the App Store and that are engaged in product development strategy here at Apple.”

Blumenthal then clarified what he meant by “firewall.” He explained that it doesn’t mean whether or not there are separate teams in place, but whether there’s an internal prohibition on sharing data between the App Store and the people who run Apple’s other businesses.

Andeer then answered, “Senator, we have controls in place.”

He went on to note that over the past twelve years, Apple has only introduced “a handful of applications and services,” and in every instance, there are “dozens of alternatives” on the App Store. And, sometimes, the alternatives are more popular than Apple’s own product, he noted.

“We don’t copy. We don’t kill. What we do is offer up a new choice and a new innovation,” Andeer stated.

His argument may hold true when there are strong rivalries, like Spotify versus Apple Music, or Netflix versus Apple TV+, or Kindle versus Apple Books. But it’s harder to stretch it to areas where Apple makes smaller enhancements — like when Apple introduced Sidecar, a feature that allowed users to make their iPad a secondary display. Sidecar ended the need for a third-party app, after apps like Duet and Luna first proved the market.

Another example was when Apple built screen time controls into its iOS software, but didn’t provide the makers of third-party screen time apps with an API so consumers could use their preferred apps to configure Apple’s Screen Time settings via the third-party’s specialized interface or take advantage of other unique features.

Blumenthal said he interpreted Andeer’s response as to whether Apple has a “data firewall” as a “no.”

Posed the same question, Google’s representative, Mr. White said his understanding was that Google had “data access controls in place that govern how data from our third-party services are used.”

Blumenthal pressed him to clarify if this was a “firewall,” meaning, he clarified again, “do you have a prohibition against access?”

“We have a prohibition against using our third-party services to compete directly with our first-party services,” Mr. White said, adding that Google has “internal policies that govern that.”

The Senator said he would follow up on this matter with written questions, as his time expired.

Yeah, Apple’s M1 MacBook Pro is powerful, but it’s the battery life that will blow you away

Survival and strategy games are often played in stages. You have the early game where you’re learning the ropes, understanding systems. Then you have mid-game where you’re executing and gathering resources. The most fun part, for me, has always been the late mid-game where you’re in full control of your powers and skills and you’ve got resources to burn — where you execute on your master plan before the endgame gets hairy.

This is where Apple is in the game of power being played by the chip industry. And it’s about to be endgame for Intel. 

Apple has introduced three machines that use its new M1 system on a chip, based on over a decade’s worth of work designing its own processing units based on the ARM instruction set. These machines are capable, assured and powerful, but their greatest advancements come in the performance-per-watt category.

I personally put the 13” M1 MacBook Pro through extensive testing, and it’s clear that this machine eclipses some of the most powerful Mac portables ever made in performance while simultaneously delivering 2x-3x the battery life at a minimum.

These results are astounding, but they’re the product of that long early game Apple has played with the A-series processors. Beginning in earnest in 2008 with the acquisition of P.A. Semi, Apple has been working its way toward untangling the features and capabilities of its devices from the product roadmaps of processor manufacturers.

The M1 MacBook Pro runs smoothly, launching apps so quickly that they’re often open before your cursor leaves the dock.

Video editing and rendering are super performant, only falling behind older machines when the work leverages the GPU heavily. And even then, only against powerful dedicated cards like the 5500M or Vega II.

Compiling projects like WebKit produces better build times than nearly any machine (hell, the M1 Mac mini beats the Mac Pro by a few seconds). And it does it while using a fraction of the power.

This thing works like an iPad. That’s the best way I can describe it succinctly. One illustration I have been using to describe what this will feel like to a user of current MacBooks is that of chronic pain. If you’ve ever dealt with ongoing pain from a condition or injury, and then had it be alleviated by medication, therapy or surgery, you know how the sudden relief feels. You’ve been carrying the load so long you didn’t know how heavy it was. That’s what moving to this M1 MacBook feels like after using other Macs. 

Every click is more responsive. Every interaction is immediate. It feels like an iOS device in all the best ways. 

At the chip level, it also is an iOS device. Which brings us to…

iOS on M1

The iOS experience on the M1 machines is…present. That’s the kindest thing I can say about it. Apps install from the App Store and run smoothly, without incident. Benchmarks run on iOS apps show that they perform natively with no overhead. I even ran an iOS-based graphics benchmark, which performed just fine.

That, however, is where the compliments end. The current iOS app experience on an M1 machine running Big Sur is almost comical; it’s so silly. There is no default tool-tip that explains how to replicate common iOS interactions like swipe-from-edge — instead a badly formatted cheat sheet is buried in a menu. The apps launch and run in windows only. Yes, that’s right, no full-screen iOS apps at all. It’s super cool for a second to have instant native support for iOS on the Mac, but at the end of the day this is a marketing win, not a consumer experience win. 

Apple gets to say that the Mac now supports millions of iOS apps, but the fact is that the experience of using those apps on the M1 is subpar. It will get better, I have no doubt. But the app experience on the M1 is pretty firmly in this order right now: native M1 app > Rosetta 2 app > Catalyst app > iOS app. Provided that the Catalyst ports can be bothered to build in Mac-centric behaviors and interactions, of course. But iOS, though present, is clearly not where it needs to be on M1.

Rosetta 2

There is both a lot to say and not a lot to say about Rosetta 2. I’m sure we’ll get more detailed breakdowns of how Apple achieved what it has with this new emulation layer that makes x86 applications run fine on the M1 architecture. But the real nut of it is that Apple has managed to make a chip so powerful that it can take the roughly 26% hit (see the following charts) in raw power to translate apps and still run them just as fast, if not faster, than MacBooks with Intel processors.

It’s pretty astounding. Apple would like us to forget the original Rosetta from the PowerPC transition as much as we would all like to forget it. And I’m happy to say that this is pretty easy to do, because I was unable to detect any real performance hit when comparing it to older, even ‘more powerful on paper’ Macs like the 16” MacBook Pro.

It’s just simply not a factor in most instances. And companies like Adobe and Microsoft are already hard at work bringing native M1 apps to the Mac, so the most needed productivity or creativity apps will essentially get a free performance bump of around 30% when they go native. But even now they’re just as fast. It’s a win-win situation. 

Methodology

My methodology for testing was pretty straightforward. I ran a battery of tests designed to push these laptops in ways that reflected both real-world tasks and synthetic benchmarks. I ran the benchmarks with the machines plugged in and then again on battery power, to gauge how consistent performance stayed as well as to estimate performance per watt. All tests were run multiple times with cooldown periods in between in order to establish a solid baseline.
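The run/cooldown cycle described above can be sketched as a small shell harness. To be clear, this is an illustrative sketch, not the author's actual tooling; the function name, run count and cooldown defaults are assumptions.

```shell
# Sketch of a repeat-with-cooldown benchmark loop (illustrative only).
run_benchmark_battery() {
    cmd="$1"              # benchmark command to run, passed as a string
    runs="${2:-3}"        # repeat each test to establish a solid baseline
    cooldown="${3:-120}"  # seconds to let the machine cool between runs
    i=1
    while [ "$i" -le "$runs" ]; do
        start=$(date +%s)
        sh -c "$cmd"
        end=$(date +%s)
        echo "run $i: $((end - start))s"
        sleep "$cooldown"
        i=$((i + 1))
    done
}
```

Running the same loop once on AC power and again on battery, then comparing the per-run times, approximates the plugged-in versus battery comparison described above.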

Here are the machines I used for testing:

  • 2020 13” M1 MacBook Pro 8-core 16GB
  • 2019 16” MacBook Pro 8-core 2.4GHz 32GB w/5500M
  • 2019 13” MacBook Pro 4-core 2.8GHz 16GB
  • 2019 Mac Pro 12-Core 3.3GHz 48GB w/AMD Radeon Pro Vega II 32GB

Many of these benchmarks also include numbers from the M1 Mac mini, reviewed by Matt Burns, and the M1 MacBook Air, tested by Brian Heater, which you can check out here.

Compiling WebKit

Right up top, I’m going to start off with the real ‘oh shit’ chart of this piece. I checked WebKit out from GitHub and ran a build on all of the machines with no parameters. This is the one deviation from the specs I mentioned above, as my 13” had issues I couldn’t figure out, so I had some internet friends help me. Also thanks to Paul Haddad of Tapbots for guidance here.
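For anyone wanting to reproduce the test, the checkout-and-build step looks roughly like the following. This is a sketch based on WebKit's public GitHub layout and bundled build script, not a procedure spelled out in the piece; paths and flags may change over time.

```shell
# Sketch of the WebKit build test: clone the repo and run a stock
# release build. The clone URL and script path follow WebKit's public
# GitHub layout; a full build takes tens of minutes and many gigabytes.
build_webkit() {
    git clone --depth 1 https://github.com/WebKit/WebKit.git &&
    cd WebKit &&
    Tools/Scripts/build-webkit --release
}
```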

As you can see, the M1 performs admirably across all models, with the MacBook Pro and Mac mini edging out the MacBook Air. This is a pretty straightforward way to visualize the difference in performance that can result in heavy tasks lasting over 20 minutes, where the MacBook Air’s lack of active fan cooling throttles back the M1 a bit. Even with that throttling, the MacBook Air still beats everything here except for the very beefy MacBook Pro.

But the big deal here is really this second chart. After a single build of WebKit, the M1 MacBook Pro had a massive 91% of its battery left. I tried multiple tests here, and I could have easily run a full build of WebKit 8-9 times on one charge of the M1 MacBook’s battery. In comparison, I could have gotten through about 3 on the 16”, and the 13” 2020 model only had one go in it.

This insane performance per watt is the M1’s secret weapon. The battery performance is simply off the chart, even with processor-bound tasks. To give you an idea, throughout this build of WebKit the P-cluster (the performance cores) hit peak pretty much every cycle, while the E-cluster (the efficiency cores) maintained a steady 2GHz. These things are going at it, but they’re super power efficient.

Battery Life

In addition to charting battery performance in some real-world tests, I also ran a couple of dedicated battery tests. In some cases they ran so long I thought I had left the machine plugged in by mistake; it’s that good.

I ran a mixed web browsing and web video playback script that hit a series of pages, waited for 30 seconds and then moved on to simulate browsing. The results return a pretty common sight in our tests, with the M1 outperforming the other MacBooks by just over 25%.
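The script itself wasn't published, but the shape of such a loop can be sketched like this. The page list, function name and the echo standing in for actually driving a browser are all my assumptions, purely for illustration.

```shell
# Rough sketch of a mixed-browsing battery loop: visit each page,
# linger to simulate reading, then move on. Driving the real browser
# is stubbed out with an echo here.
browse_loop() {
    dwell="${1:-30}"   # seconds to 'read' each page
    shift
    for url in "$@"; do
        echo "loading $url"   # stand-in for opening the URL in a browser
        sleep "$dwell"
    done
}
```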

In fullscreen 4k/60 video playback, the M1 fares even better, clocking an easy 20 hours with fixed 50% brightness. On an earlier test, I left the auto-adjust on and it crossed the 24 hour mark easily. Yeah, a full day. That’s an iOS-like milestone.

The M1 MacBook Air does very well also, but its smaller battery means less playback time, at 16 hours. Both of them absolutely decimated the earlier models.

Xcode Unzip

This was another developer-centric test that was requested. Once again CPU-bound, and the M1 blew away every other system in my test group: faster than the 8-core 16” MacBook Pro, wildly faster than the 13” MacBook Pro and, yes, 2x as fast as the 2019 Mac Pro with its 3.3GHz Xeons.
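For context, the test boils down to timing a single CPU-bound command: expanding Xcode's .xip archive, which decompresses and cryptographically verifies tens of gigabytes. The archive filename below is a placeholder; `xip` ships with macOS.

```shell
# The Xcode unzip test, roughly: time the expansion of the .xip archive
# (decompression plus signature verification, heavily CPU-bound).
# The filename is a placeholder for whatever Xcode archive you have.
time_xcode_unzip() {
    time xip --expand Xcode.xip
}
```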

Image Credits: TechCrunch

For a look at the power curve (and to show that there is no throttling of the MacBook Pro over this period; I never found any throttling over longer periods, by the way), here’s the usage curve.

Unified Memory and Disk Speed

Much ado has been made of Apple including only 16GB of memory on these first M1 machines. The fact of it, however, is that I have been unable to push them hard enough yet to feel any effect of this, thanks to Apple’s move to a unified memory architecture. Moving RAM onto the SoC means no upgradeability — you’re stuck on 16GB forever. But it also means massively faster memory access.

If I were a betting man, I’d say that this was an intermediate step to eliminating RAM altogether. It’s possible that a future (far future; this is the play for now) version of Apple’s M-series chips could end up supplying memory to each of the various chips from a vast pool that also serves as permanent storage. For now, though, what you’ve got is a finite, but blazing fast, pool of memory shared between the CPU cores, GPU and other SoC denizens like the Secure Enclave and Neural Engine.

While running many applications simultaneously, the M1 performed extremely well. Because this new architecture is so close, with memory being a short hop away next door rather than out over a PCIe bus, swapping between applications was a non-issue. Even while beefy, data-heavy tasks ran in the background, the rest of the system kept flowing.

Even when the memory pressure tab of Activity Monitor showed that macOS was using swap space, as it did from time to time, I noticed no slowdown in performance.

Though I wasn’t able to trip it up I would guess that you would have to throw a single, extremely large file at this thing to get it to show any amount of struggle. 

The SSD in the M1 MacBook Pro is running on a PCIe 3.0 bus, and its read and write speeds indicate that.

 

Thunderbolt

The M1 MacBook Pro has two Thunderbolt controllers, one for each port. This means that you’re going to get full PCIe 4.0 speeds out of each, and it seems very likely that Apple could include up to four ports in the future without much change in architecture.

This configuration also means that you can easily power an Apple Pro Display XDR and another monitor besides. I was unable to test two Apple Pro Display XDR monitors side-by-side.

Cooling and throttling

No matter how long the tests I ran were, I was never able to detect any throttling of the CPU on the M1 MacBook Pro. From our testing it was evident that in longer operations (20-40 minutes on up) it was possible to see the MacBook Air pulling back a bit over time. Not so with the MacBook Pro.

Apple says that it has designed a new ‘cooling system’ in the M1 MacBook Pro, which holds up. There is a single fan, but it is noticeably quieter than the fans in the other machines. In fact, I was never able to get the M1 much hotter than ‘warm’, and the fan ran at speeds much more similar to those of a water-cooled rig than the turbo-engine situation in the other MacBooks.

Even running a long, intense Cinebench R23 session could not make the M1 MacBook get loud. Over the course of the benchmark run, the high-performance cores regularly hit 3GHz, with the efficiency cores hitting 2GHz. Despite that, it continued to run very cool and very quiet in comparison to other MacBooks. It’s the stealth bomber at the Harrier party.

In that Cinebench test, you can see that it doubles the multi-core performance of last year’s 13” MacBook Pro and even beats out the single-core performance of the 16” MacBook Pro.

I ran a couple of Final Cut Pro tests with my test suite. First was a 5-minute 4k60 timeline shot with an iPhone 12 Pro, using audio, transitions, titles and color grading. The M1 MacBook performed fantastically, slightly beating out the 16” MacBook Pro.

 

 

With an 8K timeline of the same duration, the 16” MacBook Pro with its Radeon 5500M was able to really shine with FCP’s GPU acceleration. The M1 held its own though, showing 3x faster speeds than the 13” MacBook Pro with its integrated graphics. 

 

And, most impressively, the M1 MacBook Pro used extremely little power to do so. Just 17% of the battery to output an 81GB 8k render. The 13” MacBook Pro could not even finish this render on one battery charge. 

As you can see in these GFXBench charts, while the M1 MacBook Pro isn’t a powerhouse gaming laptop, we still got some very surprising and impressive results from the rack of Metal tests run on the GPU. The 16″ MBP still has more raw power, but rendering games at Retina resolution is still very possible here.

The M1 is the future of CPU design

All too often over the years we’ve seen Mac releases hamstrung by the capabilities of the chips and chipsets that were being offered by Intel. Even as recently as the 16” MacBook Pro, Apple was stuck a generation or more behind. The writing was basically on the wall once the iPhone became such a massive hit that Apple began producing more chips than the entire rest of the computing industry combined. 

Apple has now shipped over 2 billion chips, a scale that makes Intel’s desktop business look like a luxury manufacturer. I think it was politic of Apple not to mention Intel by name during last week’s announcement, but it’s also clear that Intel’s days are numbered on the Mac, and that the only saving grace for the rest of the industry is that Apple is incredibly unlikely to make chips for anyone else.

Years ago I wrote an article about the iPhone’s biggest flaw being that its performance per watt limited the new experiences that it was capable of delivering. People hated that piece but I was right. Apple has spent the last decade “fixing” its battery problem by continuing to carve out massive performance gains via its A-series chips all while maintaining essentially the same (or slightly better) battery life across the iPhone lineup. No miracle battery technology has appeared, so Apple went in the opposite direction, grinding away at the chip end of the stick.

What we’re seeing today is the result of Apple flipping the switch to bring all of that power efficiency to the Mac, a device with 5x the raw battery to work with. And those results are spectacular.

How Apple reinvented the cursor for iPad

Even though Apple did not invent the mouse pointer, history has cemented its place in dragging it out of obscurity and into mainstream use. Its everyday utility, pioneered at Xerox PARC and later combined with a bit of iconic work from Susan Kare at Apple, has made the pointer our avatar in digital space for nearly 40 years.

The arrow waits on the screen. Slightly angled, with a straight edge and a 45-degree slope leading to a sharp pixel-by-pixel point. It’s an instrument of precision, of tiny click targets on a screen feet away. The original cursor was a dot, then a line pointing straight upward. It was demonstrated in the ‘Mother of All Demos’ — a presentation roughly an hour and a half long that contained not only the world’s first look at the mouse but also hyperlinking, document collaboration, video conferencing and more.

The star of the show, though, was the small line of pixels that made up the mouse cursor. It was hominem ex machina — humanity in the machine. Unlike the text entry models of before, which placed character after character in a facsimile of a typewriter, this was a tether that connected us, embryonic, to the aleph. For the first time we saw ourselves awkwardly in a screen.

We don’t know exactly why the original ‘straight up arrow’ envisioned by Doug Engelbart took on the precise angled stance we know today. There are many self-assured conjectures about the change, but few actual answers — all we know for sure is that, like a ready athlete, the arrow pointer has been there for decades, waiting to leap toward our goal. But for the past few years, thanks to touch devices, we’ve had a new, fleshier sprinter: our finger.

The iPhone, and later the iPad, didn’t immediately reinvent the cursor. Instead, they removed it entirely, replacing your digital ghost in the machine with your physical meatspace fingertip. Touch interactions brought with them “stickiness” — the 1:1 mating of intent and action. If you touched a thing, it did something. If you dragged your finger, the content came with it. This, finally, was human-centric computing.

Then, a few weeks ago, Apple dropped a new kind of pointer — a hybrid between these two worlds of pixels and pushes. The iPad’s cursor, I think, deserves closer examination. It’s a seminal bit of remixing from one of the most closely watched idea factories on the planet.

In order to dive a bit deeper on the brand-new cursor and its interaction models, I spoke to Apple SVP Craig Federighi about its development and some of the choices made by the teams at Apple that built it. First, let’s talk about some of the things that make the cursor so different from what came before…and yet strangely familiar.

—————————

The iPad cursor takes on the shape of a small circle, a normalized version of the way that the screen’s touch sensors read the tip of your finger. Already, this is different. It brings that idea of placing you inside the machine to the next level, blending the physical nature of touch with the one-step-removed trackpad experience.

Its size and shape are also a nod to the nature of the iPad’s user interface. It was designed from the ground up as a touch-first experience. So much so that when an app is not properly optimized for that modality it feels awkward, clumsy. The cursor as your finger’s avatar has the same impact wherever it lands.

Honestly, the thinking could have stopped there and that would have been perfectly adequate. A rough finger facsimile as pointer. But the concept is pushed further. As you approach an interactive element, the circle reaches out, smoothly touching then embracing and encapsulating the button.

The idea of variable cursor velocity is pushed further here too. When you’re close to an object on the screen, it changes its rate of travel to get where you want to go quicker, but it does it contextually, rather than linearly, the way that macOS or Windows does.

Predictive math is applied to get you to where you’re going without you having to land precisely there, then a bit of inertia is applied to keep you where you need to be without overshooting. Once you’re on the icon, small movements of your finger jiggle the icon so you know you’re still there.

The cursor even disappears when you stop moving it, much as the pressure of your finger disappears when you remove it from the screen. And in some cases the cursor possesses the element itself, becoming the button and casting a light ethereal glow around it.

This stir fry of path prediction, animation, physics and fun seasoning is all cooked into a dish that does its best to replicate the feel of something we do without thinking: reaching out and touching something directly.

These are, in design parlance, affordances. They take an operation that is at its base level much harder to do with a touchpad than it is with your finger, and make it feel just as easy. All you have to do to crystallize this point is watch a kid who uses an iPad all day try to use a mouse to accomplish the same task.

The idea that a cursor could change fluidly as needed in context isn’t exactly new. The I-beam (the cursor type that appears when you hover over editable text) is a good example of this. There were also early experiments at Xerox PARC — the birthplace of the mouse — that made use of a transforming cursor. They even tried color changes, but never quite got to the concept of on-screen elements as interactive objects — choosing instead to emulate functions of the keyboard.

But there has never been a cursor like this one. Designed to emulate your finger, but also to spread and squish and blob and rush and rest. It’s a unique addition to the landscape.

—————————

Given how highly scrutinized Apple’s every release is, the iPad cursor not being spoiled is a minor miracle. When it was released as a software update for existing iPads — and future ones — people began testing it immediately, discovering the dramatically different ways it behaved compared to its pre-cursors.

Inside Apple, the team enjoyed watching the external speculation that Apple was going to pursue a relatively standard path — displaying a pointer on screen on the iPad — and used it as motivation to deliver something richer, a solution to be paired with the Magic Keyboard. The scuttlebutt was that Apple was going to add cursor support to iPad OS, but even down to the last minute the assumption was that we would see a traditional pointer that brought the iPad as close as possible to ‘laptop’ behavior.

Since the 2018 iPad Pro debuted with the Smart Connector, those of us who use the iPad Pro daily have been waiting for Apple to ship a ‘real’ keyboard for the device. I went over my experiences with the Smart Keyboard Folio in my review of the new iPad Pro here, and the Magic Keyboard here, but suffice it to say that the new design is incredible for heavy typists. And, of course, it brings along a world-class trackpad for the ride.

When the team set out to develop the new cursor, the spec called for something that felt like a real pointer experience, but that melded philosophically with the nature of iPad.

A couple of truths to guide the process:

  • The iPad is touch first.
  • iPad is the most versatile computer that Apple makes.

In some ways, the work on the new iPad OS cursor began with the Apple TV’s refreshed interface back in 2015. If you’ve noticed some similarities between the way that the cursor behaves on iPad OS and the way it works on Apple TV, you’re not alone. There is the familiar ‘jumping’ from one point of interest to another, for instance, and the slight sheen of a button as you move your finger while ‘hovering’ on it.

“There was a process to figure out exactly how various elements would work together,” Federighi says. “We knew we wanted a very touch-centric cursor that was not conveying an unnecessary level of precision. We knew we had a focus experience similar to Apple TV that we could take advantage of in a delightful way. We knew that when dealing with text we wanted to provide a greater sense of feedback.”

“Part of what I love so much about what’s happened with iPadOS is the way that we’ve drawn from so many sources. The experience draws from our work on tvOS, from years of work on the Mac, and from the origins of iPhone X and early iPad, creating something new that feels really natural for iPad.”

And the Apple TV interface didn’t just ‘inspire’ the cursor — the core design team responsible works across groups, including the Apple TV, iPad OS and other products.

—————————

But to understand the process, you have to get a wider view of the options a user has when interacting with an Apple device.

Apple’s input modalities include:

  • Mouse (Mac)
  • Touchpad (Mac, MacBook, iPad)
  • Touch (iPhone, iPad)
  • AR (iPhone, iPad, still nascent)

Each of these modalities has situational advantages or disadvantages. The finger, of course, is an imprecise instrument. The team knew that they would have to telegraph the imprecise nature of a finger to the user, but also honor contexts in which precision was needed.

(Image: Jared Sinclair/Black Pixel)

Apple approached the experience going in clean. The team knew they had the raw elements to make it happen: they needed a touch-sensitive cursor, they knew the Apple TV cursor showed promise, and they knew that more interactive feedback was important when it came to text.

Where and how to apply which element was the big hurdle.

“When we were first thinking about the cursor, we needed it to reflect the natural and easy experience of using your finger when high precision isn’t necessary, like when accessing an icon on the home screen, but it also needed to scale very naturally into high precision tasks like editing text,” says Federighi.

“So we came up with a circle that elegantly transforms to accomplish the task at hand. For example, it morphs to become the focus around a button, or to hop over to another button, or it morphs into something more precise when that makes sense, like the I-beam for text selection.”

The predictive nature of the cursor is the answer that they came up with for “How do you scale a touch analogue into high precision?”

But the team needed to figure out what situations demanded precision. Interacting with one element over another one close by, for example. That’s where the inertia and snapping came in. The iPad, specifically, is a multipurpose computer, so it’s way more complex than any single-input device. There are multiple modalities to service with any cursor implementation on the platform. And they have to be honored without tearing down all of the learning that you’ve put millions of users through with a primary touch interface.

“We set out to design the cursor in a way that retains the touch-first experience without fundamentally changing the UI,” Federighi says. “So customers who may never use a trackpad with their iPad won’t have to learn something new, while making it great for those who may switch back and forth between touch and trackpad.”

The team knew that it needed to imbue the cursor with the same sense of fluidity that has become a pillar of the way that iOS works. So they animated it, from dot to I-beam to blob. If you slow down the animation you can see it sprout a bezier curve and flow into its new appearance. This serves the purpose of ‘delighting’ the user — it’s just fun — but it also tells a story about where the cursor is going. This keeps the user in sync with the actions of the blob, which is always a danger any time you introduce even a small amount of autonomy in a user avatar.

Once on the icon, the cursor moves the icon in a small parallax, but this icon shift is simulated — there are not layers here like on Apple TV, but they would be fun to have.

Text editing gets an upgrade as well, with the I-Beam conforming to the size of the text you’re editing, to make it abundantly clear where the cursor will insert and what size of text it will produce when you begin typing.

The web presented its own challenges. The open standard means that many sites have their own hover elements and behaviors. The question that the team had to come to grips with was how far to push conformity to the “rules” of iPad OS and the cursor. The answer was not a one-size application of the above elements. It had to honor the integral elements of the web.

Simply, they knew that people were not going to re-write the web for Apple.

“Perfecting exactly where to apply these elements was an interesting journey. For instance, websites do all manner of things – sometimes they have their own hover experiences, sometimes the clickable area of an element does not match what the user would think of as a selectable area,” he says. “So we looked carefully at where to push what kind of feedback to achieve a really high level of compatibility out the gates with the web as well as with third party apps.”

Any third-party apps that have used the standard iPad OS elements get all of this work for free, of course. It just works. And the implementation for apps that use custom elements is pretty straightforward. Not flick-a-switch simple, but not a heavy lift either.

The response to the cursor support has been hugely positive so far, and that enthusiasm creates momentum. If there’s a major suite of productivity tools that has a solid user base on iPad Pro, you can bet it will get an update. Microsoft, for instance, is working on iPad cursor support that’s expected to ship in Office for iPad this fall.

—————————

System gestures also feel fresh and responsive even on the distanced touchpad. In some ways, the flicking and swiping actually feel more effective and useful on the horizontal than they do on the screen itself. I can tell you from personal experience that context switching back and forth from the screen to the keyboard to switch between workspaces introduces a lot of cognitive wear and tear. Even the act of continuously holding your arm up and out to swipe back and forth between workspaces a foot off the table introduces a longer term fatigue issue.

When the gestures are on the trackpad, they’re more immediate, smaller in overall physical space and less tiring to execute.

“Many iPad gestures on the trackpad are analogous to those on the Mac, so you don’t have to think about them or relearn anything. However, they respond in a different, more immediate way on iPad, making everything feel connected and effortless,” says Federighi.

Remember that the first iPad multitasking gestures felt like a weird offshoot. An experiment that appeared useless at worst but an interesting curiosity at best. Now, on the home button free iPad Pro, the work done by the team that built the iPhone X shines brightly. It’s pretty remarkable that they built a system so usable that it even works on trackpad — one element removed from immediate touch.

Federighi says that they thought about rethinking three-finger gestures altogether, but discovered that they work just fine as is. For anything that goes off the edge, you hit a limit, then push past it again to confirm, and you get the same result.

There are still gaps in the iPad’s cursor paradigms. There is no support for cursor lock on iPad, making it a non-starter for relative mouse movement in 3D apps like first-person games. There’s more to come, no doubt, but Apple had no comment when I asked about it.

The new iPad cursor is a product of what came before, but it’s blending, rather than layering, that makes it successful in practice. The blending of the product team’s learnings across Apple TV, Mac and iPad. The blending of touch, mouse and touchpad modalities. And, of course, the blending of a desire to make something new and creative with the constraint that it also had to feel familiar and useful right out of the box. It’s a specialty that Apple, when it is at its best, continues to hold central to its development philosophy.