The context of this story
First important component: USB (and also FireWire)
The design team believed that the average user would not want to open up the computer to perform upgrades; rather, they would appreciate a machine that worked right out of the box, without major tinkering or cable-connecting. This was the first assumption behind the all-in-one design. It also led to the introduction of an interface then unheard of in computers for ordinary users: USB. The USB standard was very new; its final 1.0 specification, the first usable version, allowing transfer speeds of up to 12 Mbit/s, had appeared only at the beginning of 1996. It was soon followed by version 1.1, the first version to become widespread and the one Apple would adopt. Apple decided to replace its existing mix of buses, including SCSI and the company’s own ADB, with the new interface. The iMac became the first computer to rely exclusively on USB, contributing significantly to the standard’s rise and profiting from it in turn.
Why was USB a fortunate choice? The interfaces Apple had used until then were either proprietary (such as ADB, the Apple Desktop Bus) or professional and expensive, such as SCSI, and only a few companies manufactured peripherals for Apple computers. By choosing an interface that the Windows and PC world would also use, Apple significantly increased the potential for interoperability; all that was needed was to write the drivers. Users also no longer had to throw away their existing peripherals when switching from one platform to the other. This, however, was also one of the arguments against USB: many Apple executives feared that opening up to the world in this way would significantly cut the company’s peripheral sales. Jobs had a simple answer: he drastically reduced the number of peripherals Apple manufactured, including the StyleWriter series of printers, which were in fact just modified versions of HP’s DeskJet. This, too, was part of the company’s return to its roots. The decisive factor was that a number of manufacturers agreed to launch a range of USB peripherals at the same time as the new Apple computer, so there would be plenty of peripherals available from day one, which mattered to the end user.
Along with cutting excess interfaces, Jon Rubinstein, head of the hardware division, and Jobs came up with the idea of removing the floppy disk drive, at that time an integral part of every computer. The 3.5-inch floppy disks of the day had a standard capacity of 1.44 MB, which was no longer enough. The days of the first Macs, when the operating system and the entire computer could run from floppy disks, were long gone. For perspective, Windows 95 was delivered on thirteen 3.5-inch floppy disks, while its successor, Windows 98, was no longer commonly sold on floppies at all: it shipped on CD, and only on special request could a set of eighty installation disks be created. Software distribution on floppy disks was coming to an end in the mid-1990s due to the growing size of software, the falling price of CD drives, and, not least, the fact that installation CDs were simpler to produce than floppy sets. Installation from CD was also significantly more convenient for users and eliminated the problem of losing one disk from a large installation set.
Although it was clear in the second half of the 1990s that floppy drives no longer mattered in modern computers, PC vendors kept including them in their configurations for fear of user backlash. Floppies were still used for transferring user files and documents, simply because there was no affordable alternative: it was not until 1996 that the price of writable CD-R drives fell below the magic threshold of $1,000, they remained expensive for ordinary users, and the now-ubiquitous USB memory sticks were virtually non-existent at the time. Yet a single floppy disk could hold only three to five photos, even from the low-quality digital cameras of the era, and was unsuitable for transferring larger amounts of data. A number of alternative storage media, such as Iomega’s Zip, were developed at the time, but the computer industry never adopted them widely. They were clearly only a transitional technology to bridge the gap until CD writers became cheap enough for common use; CD readers were already a standard feature of computers, so reading a burned CD was usually not a problem.
Rubinstein and Jobs decided to make a radical cut and remove the floppy disk drive for good, a symbolic step: the Macintosh had been the computer that popularized the 3.5-inch floppy disk drive, and with it, the floppy disk was to disappear. Instead of a floppy drive, the new iMac came with a built-in 56 kbit/s modem as standard; the expectation was that users would transfer smaller files over the Internet, as they were increasingly accustomed to doing, and those who really needed a floppy drive could purchase an external USB model. When the iMac was introduced, many commentators and reviewers criticized the move as too radical, unable to accept that the new machine offered no physical medium onto which files could be written and carried to another computer; in practice, however, users did indeed turn to the internet to transfer everyday files. To accommodate this method of file transfer, Apple introduced the iTools suite of online services in 2000, which included, among other things, the iDisk network drive, letting users store and share files that could then be accessed over the internet from different computers. iTools later evolved into .Mac (a response to Microsoft’s .NET), then MobileMe, and finally iCloud, introduced in 2011. iDisk, also available for Mac OS 9, the predecessor of today’s Mac OS X, offered 20 MB of storage, with the option to purchase up to 400 MB for an additional $1 per MB per year.
Steve Jobs unveiled the new iMac on May 6, 1998, at the Flint Center at De Anza College in Cupertino, the same place where the original Macintosh had been unveiled. Company legend has it that the iMac met with a fabulous reception, and this was largely true: it received very positive reviews, and fans were thrilled that Apple had finally found its footing and that Jobs was back. In the first six weeks after its August launch, 278,000 units were sold, and by the end of the year the figure had reached 800,000, making the iMac the fastest-selling computer in Apple’s history. For the record, the first iMac is usually referred to by its color, Bondi Blue, named after the color of the water at Sydney’s most popular beach, Bondi.
Image: apple-imac-g3—bondi-blue
Description: The first iMac with a 233 MHz PowerPC 750 G3 processor, 32 MB RAM, 4 GB HDD, and a tray-load CD-ROM drive. The first Mac with USB ports. Introduced on May 6, 1998, discontinued on January 5, 1999. Price starting at $1,299.
A number of other decisions that would prove important for the future were made during the design of the first iMac. The first concerned the CD-ROM drive. Jobs wanted a slot-loading model, where the disc is simply pushed into a slot, but such drives were not yet available in sufficient quantities or with sufficient reliability, so the first iMac had to make do with a classic 24x tray-loading drive, which Jobs hated. When he saw the first completed iMac prototype with its tray-loading drive, he could barely stomach it and yelled at his designers. But there was no way back short of delaying the launch by many months, so Jobs made an exception and agreed, stipulating that the next iMac would get a slot-loading drive as soon as possible.
As soon as slot-loading CD-ROM drives became available, Jobs insisted on using them, and so in October 1999, when the updated iMac G3, known as Blueberry for its color, was introduced, it was equipped with a slot-loading 24x CD-ROM drive. In this case, Jobs clearly prioritized appearance over function, and his engineers had warned him of the consequences: slot-loading mechanisms developed more slowly than tray-loading ones, so slot-loading burners would reach the market later, and Apple was not yet in a position to dictate terms to drive manufacturers. Jobs believed at the time that this would not be a problem, but the future would show otherwise. In the end, it became a challenge that forced Apple to take a completely different approach to music.
The third important decision was the adoption of FireWire. Admittedly, the first models did not have it; FireWire first appeared in October 1999 in the models labeled DV, for Digital Video, aimed at users working with video. These iMacs had a built-in DVD-ROM drive and a pair of FireWire ports with a transfer speed of 400 Mbit/s, impressive at the time, alongside the existing pair of USB 1.1 ports with their maximum of 12 Mbit/s. This version also cost $1,299. Jobs imagined that people would want to use the new iMacs to edit their home videos. Camcorder prices were falling dramatically, and digital video cameras were arriving at prices, and with capabilities, accessible even to amateurs. In 1999, Sony launched Digital8, and a year later a whole range of camcorders using the DV standard came onto the market. One thing applied to all of them: transferring footage to a computer required a FireWire port, sometimes also referred to as IEEE 1394 or, in Sony’s case, i.LINK, although we will not go into the differences behind the various names. What mattered was that all new camcorders were FireWire-compatible, and Apple was the first and, for a long time, the only company to ship the FireWire interface as standard in computers aimed at everyday users; for PCs, expansion cards could be purchased at a price ranging from a third to half the cost of the computer. Especially in the wealthier American market, where the penetration of amateur family video was traditionally high, Jobs expected Apple to become the logical choice for households that wanted to process their video in some way. This was the purpose of the iMac DV series with better digital video support, which, in addition to the DVD-ROM drive and FireWire, also included a better graphics card and even a VGA output.
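The bandwidth gap explains why FireWire, not USB, was the gateway for digital video. A DV camcorder streams video at roughly 25 Mbit/s, which is a well-known property of the DV format rather than a figure from this text; a quick back-of-the-envelope sketch shows that USB 1.1’s 12 Mbit/s ceiling could not carry such a stream in real time, while FireWire 400 had ample headroom:

```python
# Illustrative comparison (figures: USB 1.1 and FireWire rates from the text,
# ~25 Mbit/s DV stream rate is a general property of the DV format).
DV_STREAM_MBIT_S = 25        # approximate DV video bitrate
USB_1_1_MBIT_S = 12          # USB 1.1 maximum signalling rate
FIREWIRE_400_MBIT_S = 400    # FireWire (IEEE 1394) S400 signalling rate

def can_capture_realtime(bus_mbit_s: float,
                         stream_mbit_s: float = DV_STREAM_MBIT_S) -> bool:
    """A bus can capture a stream in real time only if its rate exceeds the stream's."""
    return bus_mbit_s > stream_mbit_s

print(can_capture_realtime(USB_1_1_MBIT_S))       # prints False: USB 1.1 is too slow for DV
print(can_capture_realtime(FIREWIRE_400_MBIT_S))  # prints True: FireWire 400 has headroom
```

In other words, even a hypothetical lossless USB 1.1 connection would fall behind a DV camera’s output, which is why every DV and Digital8 camcorder shipped with FireWire instead.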
Finally, Apple decided to develop a basic software package for the Mac on its own. The reason was pragmatic. When Jobs decided in 1999 to aim the iMac at amateur video editing, he wanted Adobe to prepare a package of video-editing tools for Mac OS, but was turned down; this was one ingredient of the later rift with Adobe that, a decade on, contributed to the collapse of Adobe’s ambitions to bring Flash to mobile phones. Jobs felt justifiably offended: without Apple, Adobe would not be where it was with its software packages. But that counted for nothing, so in 1999 Apple began developing its own software to turn the iMac into a home digital hub. First came iMovie, reportedly the first standalone application in ten years that Apple had written entirely on its own. iMovie was created as a secret project that most Apple employees knew nothing about; Jobs managed to create a “startup environment” within the company, which ultimately produced iMovie as a simple tool for amateur video editing. iMovie was meant to complement the more powerful Final Cut Pro editing software, which had originated at Macromedia, then a major competitor of Adobe (Adobe would acquire Macromedia itself in 2005). iMovie was released in October 1999 alongside the iMac DV and was later offered to other users as a free download. The success of iMovie encouraged Apple to keep developing its own basic software for the Mac. For the moment, though, Jobs still believed that users wanted to edit digital video and would buy an iMac for that reason.
But Jobs was wrong. Users didn’t want to process digital video.