Fact Finder - Technology and Inventions
Steve Jobs and the Multi-Touch Interface
You might not realize that the multi-touch technology Steve Jobs called Apple's own was actually built on decades of work by researchers he never publicly credited. E.A. Johnson patented the first finger-driven touchscreen back in 1969. Apple also acquired FingerWorks, whose co-founders built the core gesture recognition technology powering the iPhone. Jobs famously rejected the stylus, calling it outdated — and that conviction permanently reshaped how humans interact with screens. There's much more to this story.
Key Takeaways
- Steve Jobs famously dismissed styluses, comparing them to failed technologies like floppy disks, believing finger-based interaction was more natural.
- Jobs' obsession with multi-touch details included the rubber band scrolling effect, discovered by UI designer Bas Ording.
- Apple's '949 multi-touch patent became a cornerstone of major legal battles, reflecting how transformative Jobs considered the technology.
- Jobs championed acquiring FingerWorks, whose gesture recognition technology proved instrumental in building the iPhone's multi-touch system.
- Jobs' conviction in multi-touch permanently reshaped interaction design, establishing pinch-to-zoom and physics-based scrolling as universal standards.
Who Actually Invented Multi-Touch Before Apple?
While Apple's iPhone made multi-touch famous, the technology's roots stretch back decades. The multi-touch pioneers working before 1998 made groundbreaking contributions long before Steve Jobs entered the picture.
In 1982, Nimish Mehta developed the first human-controlled multi-touch device at the University of Toronto, using a video camera to detect multiple contact points simultaneously. Then in 1983, Myron Krueger built the Videoplace optical system, tracking hand movements for gestural interaction. By 1984, Bob Boie invented the first transparent multi-touch screen overlay at Bell Labs, enabling fingertip manipulation of graphics.
These are the key innovators before Apple whose work laid the essential foundation that companies like FingerWorks and eventually Apple would later build upon. FingerWorks, co-founded by University of Delaware researchers John Elias and Wayne Westerman in 1998, produced influential gesture-based devices including the iGesture Pad and TouchStream Keyboard. Even earlier, E.A. Johnson was granted a patent in 1969 for the very first finger-driven touchscreen, marking the true origin point of an invention that would eventually change how the world interacts with technology.
The Real Pioneers: E.A. Johnson, Sam Hurst, and Jeff Han's TED Demo
Before Mehta, Krueger, and Boie began pushing multi-touch forward in the 1980s, two earlier inventors had already cracked open the touchscreen field itself.
E.A. Johnson built the first finger-driven capacitive touchscreen in 1965, while Sam Hurst developed resistive technology in 1970. Understanding capacitive vs resistive touchscreen differences helps you appreciate each contribution:
- Johnson's capacitive system detected touch through electrical capacitance changes near copper circuits
- Hurst's resistive approach required a conductive cover sheet to function
- CERN's early touchscreen deployments began in 1973, making it among the first institutions using touch operationally
- Elographics introduced its first transparent touch panel in 1974
Both inventors established foundational principles that made every multi-touch breakthrough possible, long before Steve Jobs ever stepped onto a stage. Hurst's work in particular extended well beyond his initial discovery, as his resistive touchscreen development spanned from 1971 to 1977 and laid critical groundwork for the commercial touchscreen industry. Today, researchers at the University of Cambridge are building on these foundational principles by developing predictive touch technology that uses AI and sensor data to anticipate a user's intended target without any physical contact.
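The two sensing principles above can be sketched in a few lines of code. The following is an illustrative model only — the thresholds, signal values, and function names are invented for this example, not taken from any real driver:

```python
# Illustrative model of the two sensing principles; thresholds,
# signal values, and function names are invented for this example.

def capacitive_touch(baseline_pf: float, measured_pf: float,
                     threshold_pf: float = 0.5) -> bool:
    """Johnson-style sensing: a fingertip near the electrode adds
    capacitance, so a rise above the baseline reading means touch."""
    return (measured_pf - baseline_pf) > threshold_pf

def resistive_touch_position(v_measured: float, v_supply: float,
                             length_mm: float) -> float:
    """Hurst-style sensing: pressing the conductive cover sheet onto
    the lower layer forms a voltage divider, so the measured voltage
    is proportional to the touch position along one axis."""
    return length_mm * v_measured / v_supply
```

For example, a reading of 1.65 V against a 3.3 V supply on a 100 mm panel places the touch at the 50 mm midpoint — which is why resistive screens need the cover sheet to make contact, while capacitive screens respond to mere proximity of a conductive finger.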
How Did Steve Jobs First Encounter the Power of Multi-Touch?
How did a tablet prototype accidentally spark the smartphone revolution? In the early 2000s, Steve Jobs tasked a team with multi-touch prototyping — building a glass screen display you could type on without a physical keyboard. Within about six months, they'd created something remarkable.
The power of multi-touch hit home when UI designer Bas Ording demonstrated inertial scrolling with a rubber band effect at list ends. Jobs reportedly found the experience transformative, describing it as a glimpse into the future of human-computer interaction. Apple's acquisition of FingerWorks, a multi-touch startup, brought in expertise that proved instrumental in shaping the technology behind these early breakthroughs.
That single demo reportedly ignited full iPhone development momentum. Recognizing a ripe opportunity to disrupt the phone market, Jobs shelved the tablet entirely. He redirected all team energy toward building what would eventually become the iPhone, publicly presented at Macworld on January 9, 2007. After the iPhone's success, Apple revisited the original tablet concept, leveraging lessons learned to launch the iPad in 2010.
The Secret Team Behind Apple's Multi-Touch Breakthrough
Behind Apple's multi-touch revolution stood a small group of engineers whose work most people have never heard of. Understanding the importance of the FingerWorks acquisition helps explain how Apple gained technology competitors couldn't replicate.
Wayne Westerman and John Elias, FingerWorks' co-founders, built the foundation through:
- Capacitive sensing and touch-tracking systems
- Gesture recognition technology
- Multi-touch patents spanning Apple's product lines
- The TouchStream keyboard that shaped the iPhone's interface
The role of Project Purple leadership fell to Scott Forstall, who directed hardware, software, and UX teams toward a unified vision. He ran weekly keyboard review sessions and secured Jobs' backing to redirect all available resources.
Ken Kocienda and Bas Ording completed the inner circle, delivering the virtual keyboard and inertial scrolling that made everything work. Westerman's inspiration for multi-touch stemmed from his personal struggle with tendonitis, which led him to seek a more graceful and expressive alternative to traditional keyboard input.
Westerman and Elias originally developed their touch tracking and gesture recognition technology at FingerWorks, which was later chronicled in Brian Merchant's book The One Device, a detailed account of the secret history behind the iPhone's creation.
Why Jobs Rejected the Stylus and Bet Everything on Multi-Touch
When Steve Jobs took the stage at the 2007 iPhone keynote, he didn't just introduce a new product—he buried a technology. "Who wants a stylus?" he asked the crowd. "You have to get it and put it away...Yuck." He compared styluses to failed tech like floppy disks, dismissing them as unnatural and outdated.
Jobs bet everything on multi-touch instead, positioning fingers as the ideal input method. Understanding the multi-touch development timeline helps you appreciate how bold that wager was — decades of research existed, yet no one had translated it into a mainstream product. His conviction reshaped multi-touch's commercial impact permanently. Apple's multi-touch innovations were so significant that the '949 multi-touch patent became a cornerstone of its legal battles against numerous competitors. Notably, Jeff Han of Perceptive Pixel had publicly demonstrated multi-touch technology before Apple, underscoring just how competitive and historically rich the landscape was that Jobs was stepping into.
The Rubber Band Effect and Other Multi-Touch Details Jobs Obsessed Over
Few details escaped Jobs' obsession during iPhone development, and the rubber band effect stands as proof. Bas Ording discovered it while prototyping the effect with a 200-name scrolling list. A hard stop at the list's end looked like a crash — one of many touch device development obstacles he had to solve.
Ording's fixes transformed the experience:
- Added space at the list's end moved slower than your finger, creating elasticity
- A snap-back animation bounced the list like a real rubber band
- The effect signaled list-end without suggesting a malfunction
- Jobs recognized it made multitouch feel responsive and fun
Jobs immediately pivoted from tablet to iPhone development after seeing it. Patent '381 protected inertial scrolling fiercely, and Jobs personally warned Samsung against copying it. Forstall later testified that inertial scrolling was one of the key things for the fluidity of the iPhone and all of iOS. Ording also contributed the Dock magnification effect and Exposé, cementing his role as one of Apple's most influential interface designers.
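The behavior Ording built can be approximated by two small pieces: a mapping that makes overscroll lag the finger, and a damped spring that snaps the list back on release. The following is a hedged sketch with invented coefficients and function names, not Apple's actual implementation:

```python
def rubber_band_offset(finger_overdrag: float, limit: float = 150.0,
                       resistance: float = 0.55) -> float:
    """Map how far the finger has dragged past the list's end to a
    smaller on-screen displacement, so content lags the finger and
    feels elastic.  Coefficients are invented for illustration."""
    # Asymptotic curve: output grows slower than input and never
    # exceeds `limit`, no matter how far the finger drags.
    return (1.0 - 1.0 / (finger_overdrag * resistance / limit + 1.0)) * limit

def snap_back(offset: float, velocity: float, dt: float = 1 / 60,
              stiffness: float = 180.0, damping: float = 24.0):
    """One 60 Hz step of the damped spring that bounces the list
    back to rest after the finger lifts."""
    accel = -stiffness * offset - damping * velocity
    velocity += accel * dt
    offset += velocity * dt
    return offset, velocity
```

Dragging 100 px past the end moves the content roughly 40 px under these coefficients, and iterating `snap_back` from any released offset decays it toward zero — the "bounce" that signals the list's end without suggesting a malfunction.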
What Did Steve Jobs Get Right (and Wrong) at the 2007 Keynote?
Jobs' obsession with details like the rubber band effect shaped how he presented the iPhone to the world — and that 2007 Macworld keynote remains one of tech history's most studied performances. Jobs' focus on multi-touch drove him to correctly predict that physical buttons were obsolete and that fingers would replace the stylus.
Jobs' vision for iPhone's UI — positioning multi-touch as the third great interface revolution after the mouse and click wheel — proved remarkably accurate. He nailed the "one device" concept and the promise of desktop-class software. Where he stumbled, though, was overstating how far ahead the software truly was, since early iPhone limitations like no third-party apps and EDGE-only connectivity quickly exposed gaps between the keynote's bold claims and real-world delivery. At that same keynote, Jobs framed the iPhone as a combination of three devices — a widescreen iPod, a revolutionary phone, and a breakthrough Internet communicator — a framing that proved both memorable and strategically sound.
The Patents Apple Filed to Lock Down Its Multi-Touch Claims
Apple didn't just build the iPhone — it built a legal fortress around it. Its multi-touch patent portfolio targeted specific implementations, not the broad concept itself. Here's what you should know about Apple's multi-touch licensing strategy:
- Patent 7,479,949 covered multitouch interfaces responding to two or more simultaneous inputs
- A capacitive touchscreen patent filed three years before its 2011 grant targeted portable multifunction devices
- Apple asserted five patents against Samsung, covering slide-to-unlock, autocomplete, data tapping, and unified search
- Courts denied permanent injunctive relief against Samsung twice
Apple never claimed to own multi-touch entirely — it owned the incremental improvements. Acquiring FingerWorks in 2005 gave Apple critical gesture expertise, turning those innovations into enforceable claims competitors couldn't easily design around. The '949 patent's claims focus specifically on heuristics for scrolling and navigation rather than addressing multi-touch techniques broadly. Capacitive touchscreen technology relies on the conductivity of the human finger to register input, using a sheet of glass that responds to touch through the body's natural electrical properties.
How Apple's Multi-Touch Rollout Forced Every Competitor to Rethink the Screen
When the iPhone launched in 2007, it didn't just introduce a new device — it invalidated every assumption competitors had made about how a screen should work. The industry-wide impact of multi-touch forced manufacturers to abandon resistive technology entirely and adopt capacitive touchscreens as the new standard. You can trace that shift directly to Apple's rollout.
The counterintuitive decisions by Apple — like eliminating physical keyboards and betting everything on finger-based gestures — initially looked risky. Instead, they redefined what users expected from every screen.
Samsung, HTC, and Nokia scrambled to develop competing implementations, engineering workarounds to avoid patent claims while still delivering comparable functionality. Multi-touch didn't stay mobile, either. Laptops and desktops eventually adopted it too, making touch interaction table stakes across the entire PC industry. Apple's multi-touch patent specifically covers methods of translating web content using single and two-finger gestures, giving competitors a narrow but critical target to design around.
Apple's patent, number 7,966,578, was filed in 2007 and specifically governs the manipulation of content within a frame on a web page, making it a precise but consequential legal instrument in the broader smartphone patent wars.
From iPhone to Every Screen: How Multi-Touch Became the Default Human Interface
The iPhone's debut didn't just launch a product — it set off a chain reaction that rewired how every screen-based device would be built and used. Built on early engineering breakthroughs from institutions like the University of Toronto and Bell Labs, Apple's implementation proved that natural gestural interfaces could replace buttons entirely. Suddenly, every manufacturer had to follow.
What the iPhone normalized across all screens:
- Pinch-to-zoom as a standard navigation tool
- Physics-based scrolling with momentum and bounce
- Finger-first design replacing stylus dependency
- Direct content manipulation as the default interaction model
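Pinch-to-zoom, the first convention above, reduces to a single ratio: the current distance between the two fingers divided by their distance when the gesture began. A minimal sketch, with function and parameter names invented for illustration:

```python
import math

def pinch_scale(start_a, start_b, cur_a, cur_b):
    """Zoom factor implied by two touch points moving from their
    starting positions to their current ones.  Each point is an
    (x, y) tuple; a result > 1 zooms in, < 1 zooms out."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return dist(cur_a, cur_b) / dist(start_a, start_b)

# Fingers that start 100 px apart and spread to 200 px double the zoom:
# pinch_scale((0, 0), (100, 0), (0, 0), (200, 0)) -> 2.0
```

Real gesture recognizers layer hysteresis and anchoring on top of this ratio, but the core idea — track two contact points and compare distances — is exactly what FingerWorks' hardware made practical.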
You now touch ATMs, kiosks, laptops, and tablets the same way you touch your phone. That consistency didn't happen by accident — it happened because the iPhone made anything less feel broken. CERN developed one of the earliest multi-touch screens in the 1970s, decades before the technology would reach the hands of everyday consumers.
FingerWorks pioneered gesture recognition technology in 1998, laying critical groundwork for the fluid, multi-finger interactions that would later define the iPhone's revolutionary interface.