Silky touch
The keyboard covering half of the device did not meet this need. The best option, logically, was touch control, which would move the controls directly onto the display and thus maximize the available screen space. Labels and hints for the controls could be shown on the screen and changed as needed, which was impossible with physical keys. It was a good idea, and one that had been tried before; the problem was that no one had managed to see it through.

At that time there were several technologies for reading the point of pressure on a display, but in practice only one of them was used: resistive technology. Resistive displays consisted of several layers, and a press was registered when the layers were pushed together. These displays were not expensive, but neither were they particularly reliable; the layers tended to wear out over time and stopped reporting the touch point accurately. In addition, the display had to be touched with a thin tip that clearly defined the touch point. The need to use a pen with such a tip, known as a stylus, was very inconvenient. A place had to be found to store the stylus, and the device had to be held in one hand while the stylus was held in the other. Operating it with two fingers was impossible, let alone marking multiple points on the screen at once. Add to that the fact that the touch layers above the screen degraded the image and color quality, and it is no surprise that resistive displays failed to win users over. At the time, resistive displays were used in a number of devices from Palm as well as in Windows Mobile devices, and although these devices found fans among enthusiasts, the displays were not suited to everyday use. It was necessary to look elsewhere.
Apple engineers set out to find a solution and eventually came across FingerWorks, a company that developed touch-based input devices for controlling computers and other equipment. FingerWorks’ portfolio included the iGesture touchpads, which recognized touches in multiple places at once and could interpret them as different commands. Best known was probably the TouchStream keyboard, which let users type without pressing keys, simply by touching them. It was used as a keyboard for medical devices, but also by people who were particular about keyboard ergonomics or suffered from carpal tunnel syndrome.
FingerWorks not only had experience with touch-controlled designs, but had also developed a whole set of gesture-based controls, i.e., movements that triggered an action. And that was exactly what Apple needed. So, in early 2005, it quietly bought the entire FingerWorks company, including its founders John Elias and Wayne Westerman, as well as all its patents, without its competitors realizing why it was interested in a company operating in such a narrow and unconventional market segment. They would soon find out.
This allowed Apple to quickly complete the design of a different type of display capable of recognizing touch. A capacitive display registers a touch as a change in the capacitance of its conductive layer, so it is enough to rest a finger on it; no pressure is needed for the touch to register. No stylus was required, though on the other hand gloves had to be removed in winter, or replaced with gloves whose fingertips had a conductive layer simulating a finger’s touch. Capacitive displays had all the advantages that resistive displays lacked: they retained true colors, were structurally more reliable, could be operated directly with the fingers, and could register multiple touches at once. But no one was manufacturing them. Apple knew how to deal with this, however: it once again drew on its free cash reserves and paid a factory a billion dollars to put the display it had developed into production exclusively for Apple. This decision had far-reaching consequences, as no competitor could offer a comparable product for a long time. It took almost two years after the launch of the iPhone before competitors were able to secure their own sources of capacitive displays, and two years was an enormous lead, a gap that could not easily be closed later.
Let’s return to the touchscreen controls for a moment. Typing on a virtual keyboard displayed on a touchscreen is not as easy as it seems at first glance. Until then, Windows Mobile had used a stylus either to select letters on an on-screen keyboard or for Graffiti, a method of writing simplified characters with the stylus directly on the display. Apple had extensive experience with this kind of stylus-based handwriting recognition thanks to the Newton, but Jobs hated the stylus and quite rightly considered it a limitation that would alienate ordinary users.
However, selecting letters on a virtual keyboard with a finger was not as easy as with a stylus. A finger is thicker than a stylus tip and typically covers three letters on the keyboard at once. The people at FingerWorks solved the problem by taking into account not only the center of the touch area but also how the touch unfolded, that is, the phases in which the finger landed on and lifted away from the display. This made it possible to determine fairly accurately which key the user had pressed. Further tests, however, showed that users became nervous when they could not see the letter they had pressed, a problem that did not occur with a stylus, and that not seeing it made it difficult to correct the entered letter by sliding the finger without lifting it from the display. And so the keyboard gained a small pop-up showing the pressed letter above the finger, which increased typing accuracy once again.
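Apple has never published its keyboard code, but the principle can be sketched in a few lines. The following Swift fragment is an illustrative assumption rather than the real algorithm: it blends the touch-down and lift-off positions (the equal weighting and the key layout are made up for the example), snaps the result to the nearest key center, and returns the key whose letter would be shown in the pop-up above the finger.

```swift
struct Point { var x: Double; var y: Double }

struct Key {
    let letter: Character
    let center: Point
}

func distanceSquared(_ a: Point, _ b: Point) -> Double {
    let dx = a.x - b.x, dy = a.y - b.y
    return dx * dx + dy * dy
}

/// Blend where the finger landed and where it lifted off.
/// The 50/50 weighting is an assumption for illustration only.
func estimatedTouchPoint(down: Point, up: Point, weight: Double = 0.5) -> Point {
    Point(x: down.x * weight + up.x * (1 - weight),
          y: down.y * weight + up.y * (1 - weight))
}

/// Pick the key whose center lies closest to the estimated touch point;
/// its letter is what the keyboard would display above the finger.
func pressedKey(down: Point, up: Point, on keyboard: [Key]) -> Key? {
    let point = estimatedTouchPoint(down: down, up: up)
    return keyboard.min { distanceSquared($0.center, point) < distanceSquared($1.center, point) }
}
```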
Further testing showed that individual users tend to place their finger slightly to the left or right of the key they actually want to press, depending on which hand holds the iPhone and which hand types. The keyboard software learned to recognize this habit, and later, with subsequent system updates, it also learned to distinguish users by the way they type, so that it would not be confused when the iPhone owner’s children typed differently than the owner.
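A hypothetical sketch of that calibration, under the simplifying assumption that the habit is a plain horizontal offset: keep a running average of how far each confirmed tap landed from the intended key, and shift future touches by that amount. None of this is Apple’s actual code.

```swift
/// Learns a user's horizontal tapping bias and compensates for it.
final class TouchBiasCalibrator {
    private var averageOffsetX = 0.0
    private var samples = 0.0

    /// Called after a keystroke is confirmed (e.g. not deleted or corrected),
    /// with the horizontal distance between the tap and the intended key's center.
    func record(errorX: Double) {
        samples += 1
        averageOffsetX += (errorX - averageOffsetX) / samples   // incremental mean
    }

    /// Shift an incoming touch coordinate to counter the learned habit.
    func corrected(x: Double) -> Double {
        x - averageOffsetX
    }
}
```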
And finally came brute force: dictionary-based corrections, with which the iPhone could automatically fix typos on its own. Automatic corrections later became the target of much hatred (much like those in Word), because at first they could not be turned off (on Jobs’ direct orders). Only later revisions of the system made this possible, once it turned out that the ability to disable automatic corrections was the most common reason for “jailbreaking,” that is, breaking the iPhone’s factory security to allow the installation of programs and settings outside the Apple ecosystem.
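The real correction engine is far more sophisticated, but a toy version of the dictionary idea looks roughly like this; the matching rule here, a single substituted character, is a deliberate simplification.

```swift
/// Return a suggested replacement for `word`, or nil if it is already in the
/// dictionary or no close match exists.
func suggestion(for word: String, dictionary: Set<String>) -> String? {
    let typed = word.lowercased()
    if dictionary.contains(typed) { return nil }   // already a valid word
    // Deliberately naive: accept a candidate of the same length that differs
    // from the typed word by exactly one character.
    return dictionary.first { candidate in
        candidate.count == typed.count &&
        zip(candidate, typed).filter { $0.0 != $0.1 }.count == 1
    }
}

// Example: suggestion(for: "helli", dictionary: ["hello", "world"]) returns "hello"
```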
Why am I mentioning all these features when they are so commonplace today that we cannot imagine working with a mobile phone without them? Precisely because of that. Apple may not have invented the virtual keyboard, but it was Apple that made it usable for typing with a finger. All of its now-obvious features, such as displaying the pressed letter (and its variants, for example with diacritics) above the finger, correcting a letter by sliding the finger, and even the less obvious ones, such as calculating the finger’s position, are Apple’s inventions and significant contributions. Other systems simply copied them and brought no improvements of their own, with the exception of Swype typing, where the finger slides across the keyboard without having to be lifted.
The result of this work was a reliable touchscreen that, for the first time in the history of mobile phones and computers, could be controlled with the fingers and that displayed interface controls depending on what you wanted or could do. A numeric keypad appeared when a number was expected, an alphabetic keyboard was offered for writing messages, an at sign was added to the alphabetic keyboard when an email address was expected, and so on.
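The context-sensitive keyboard boils down to a mapping from the kind of field being edited to a layout. A minimal sketch with made-up field and layout names, not the real iPhone API:

```swift
enum FieldKind { case phoneNumber, message, emailAddress }

enum KeyboardLayout { case numericPad, alphabetic, alphabeticWithAtSign }

/// Choose which keyboard to show based on what the current field expects.
func keyboard(for field: FieldKind) -> KeyboardLayout {
    switch field {
    case .phoneNumber:  return .numericPad            // digits when a number is expected
    case .message:      return .alphabetic            // letters for writing messages
    case .emailAddress: return .alphabeticWithAtSign  // "@" added for email entry
    }
}
```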
In connection with the display, however, Jobs came up with another change to the established paradigm. Until then, transparent plastic sheets had been used to cover displays. They were flexible and did not break in a pocket, but they scratched easily. A glass cover would make a much better impression. But where to find one with the necessary properties? In this case, too, a suitable supplier was found: Corning had a glass it called Gorilla Glass and had no use for it. This was exactly the glass Jobs needed, so he again used Apple’s free resources and arranged with Corning to produce it in sufficient quantity. In 2007, the iPhone became the first phone to use glass instead of plastic as a display cover, which significantly improved its appearance. This happened at the last minute; at its initial unveiling, the iPhone still had a plastic display cover, and only later did Apple announce that it would ship with a glass cover instead.
As work progressed on the prototype of what would soon be named the iPhone, more and more pitfalls arising from its unusual design gradually emerged. For example, the touchscreen caused confusion when the phone was held to the ear during a call. The touch layer interpreted the touch of a face or ear as a key press, so a call would often end up on “hold,” and it was not immediately obvious what had happened. Engineers solved the problem by building a proximity sensor into the phone that recognized when the face came close to the display. In that case the display immediately turned off, showed nothing, and did not respond to touch. This allowed the iPhone to save energy, spare your ear the warmth of the display, and avoid registering false input.
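In simplified Swift, the behaviour described above amounts to one rule: while the proximity sensor reports something close, blank the screen and drop touch events. This is an illustrative sketch, not the actual firmware logic.

```swift
/// Toy model of proximity handling during a call.
final class ProximityHandler {
    private(set) var screenOn = true
    private(set) var touchEnabled = true

    /// Called whenever the proximity sensor changes state.
    func proximityChanged(objectIsNear: Bool) {
        screenOn = !objectIsNear      // blank the display to save power near the ear
        touchEnabled = !objectIsNear  // the cheek can no longer "press" keys
    }

    /// Touches arriving while the face is close are simply discarded.
    func handleTouch() {
        guard touchEnabled else { return }
        // ... normal touch processing would go here ...
    }
}
```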