TECHNOLOGY AT THE TIPS OF OUR FINGERS: Smartphone Touchscreen Innovation

Abstract

In today's world, touchscreen technology has embedded itself into our daily lives and changed the way we perceive and interact with technology. Nowadays, it is hard to imagine a world without having everything you need at the tips of your fingers (literally). Looking more specifically at the smartphones in our pockets, touchscreen electronics have revolutionized innovation by turning users into the true conductors of their experience and relationship with technology.

Key Words

Touchscreen, Technology, Smartphone, Mobile Phone, Phone, UI, Software, Electronics, Electrical Engineering, Computer Science

Prepared by Kalyn S. Nakano

Author Biography

Kalyn is a junior studying Computer Science at USC's Viterbi School of Engineering. She likes technology and design.

Contact Information

Email: knakano@usc.edu
Phone: (650) 400-8240

Paper Submitted December 7, 2012

Prepared for Marc Aubertin, Writing 340 Professor, USC Viterbi School of Engineering

Introduction

Touchscreen technology has permeated innovation to the point where it is no longer perceived as a phenomenon, but rather held as an expectation. With touchscreen devices becoming more and more commonplace in our society, the technology behind them is often overlooked. When Apple Inc. brought the touchscreen mobile phone into the mainstream marketplace in 2007 [1], technology was no longer just about what the human eye could see. It became more about what we as humans could physically touch, hold, and intuitively instruct. You no longer needed to punch keys to operate your phone or navigate by means of a single designated button. By way of the touchscreen, we now tap, type, swipe, scroll, and zoom on-screen out of natural intuition.

Figure 1: Before the introduction of touchscreen technology to mobile devices, the only input methods available were dial keys and navigation buttons. (img via TechWench)

We are beginning to connect to technology more than ever before because technology is becoming more responsive to us. Improvements and innovations in mobile software development, alongside touchscreen electronics, have brought us one step closer to physically connecting people with technology around the clock. Most importantly, the touchscreen has transformed how we innovate and expanded the ways in which we can. To be considered "innovative" today, phones can no longer just be responsive to pressed analog buttons; more intimately, mobile devices must now be programmed to respond to the human touch. By removing the extra physical barrier between users and software, touchscreen innovation has seamlessly bridged user-interface design, hardware, and software development to connect us to technology like never before.

What's making smartphones "smart"

With touchscreen technology so commonplace in the mobile phone market today, we often overlook what makes our smartphones "smarter" than the other everyday objects we interact with. When you touch a table, the table doesn't really "know" you touched it. Sure, it can break if you put enough weight on it, or flip onto its side if enough force is applied; but your standard table only reacts according to the natural laws of physics. There is no software instructing it how to react or telling it what it should even react to.
It cannot consistently or purposefully react to the human touch, not only because it is not connected to any data, but because it has no way of sensing that you touched it in the first place. When you touch the screen of your smartphone, however, your smartphone not only knows you touched it, but can pinpoint to the pixel exactly where you touched it, when you touched it, and even the size of the area your finger covered. This is because most smartphones today use capacitive systems to become "smart" and sensitive to the human touch [2]. At just the tap of a finger, you can discover and access anything from today's weather to your favorite song, simply because your smartphone knows you touched it, knows where you touched it, and, most importantly, can presume the intent you had while doing so.

The electric bond between us

Today's capacitive touchscreens, like the ones on our smartphones and tablets, utilize the human body's electrostatic field, and more specifically the conductive property of our fingers, to sense a finger touching the screen [3]. This is made possible through the use of a capacitor, an electrical component designed to hold an electric charge. Electrodes carrying direct current (DC) and alternating current (AC) form a grid at the surface of the touchscreen [4]. When your finger makes contact with the screen, it makes a capacitive contact: AC current generated by the phone induces a corresponding current within the user's body [5]. Because the human body has an electrostatic field, its presence changes the capacitance of the conductive material that coats the inside of the touchscreen. Your finger becomes the conductor and closes the electrical circuit, and the resulting decrease in charge on the capacitive layer is registered as a "touch" by the controller [6]. This decrease is measured by circuits located at each corner of the screen, allowing the controller to determine, with speed and precision, the size, shape, and location of the affected area. The processor, paired with gesture-interpretation software, can then take this data and interpret your physical contact and on-screen movements as commands to the software. Stepping back to rediscover the technology behind today's smartphones, we can see how far we have come from pushing buttons that each had only a single function. Now, with an electric relationship between our fingers and our phones, a single touch of the screen lets us access anything from a favorite song to personal email.
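To make the controller's role more concrete, the sketch below illustrates, in simplified form, the two steps just described: estimating the location of a touch from the charge drawn at the four corner circuits, and letting gesture-interpretation software turn a short sequence of touch points into a command. It is written in Python purely for illustration; the screen dimensions, thresholds, and function names are invented for this example, and an actual phone performs this work in dedicated touch-controller firmware with far more filtering and calibration.

```python
# Illustrative sketch only -- real phones implement this in touch-controller
# firmware, with extensive filtering, debouncing, and calibration.

SCREEN_W, SCREEN_H = 1080, 1920   # hypothetical display resolution in pixels

def locate_touch(top_left, top_right, bottom_left, bottom_right, noise_floor=0.05):
    """Estimate the (x, y) touch position from the charge drawn at each corner.

    Each argument is the measured charge delta at that corner circuit; the
    finger draws more charge from the corners it is closest to, so the ratios
    of the four readings give an approximate position.
    """
    total = top_left + top_right + bottom_left + bottom_right
    if total < noise_floor:
        return None                                       # nothing touching the screen
    x = (top_right + bottom_right) / total * SCREEN_W     # share of charge drawn on the right
    y = (bottom_left + bottom_right) / total * SCREEN_H   # share of charge drawn at the bottom
    return (x, y)

def interpret_gesture(samples, tap_radius=15, tap_time=0.25):
    """Turn a recorded sequence of (time, x, y) samples into a simple command.

    A touch that barely moves and lifts quickly is a "tap"; a longer drag is a
    "swipe" in whichever direction the finger travelled farther.
    """
    (t0, x0, y0), (t1, x1, y1) = samples[0], samples[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < tap_radius and abs(dy) < tap_radius and (t1 - t0) < tap_time:
        return "tap"
    if abs(dx) >= abs(dy):
        return "swipe right" if dx > 0 else "swipe left"
    return "swipe down" if dy > 0 else "swipe up"

# Example: most of the charge is drawn near the top-left corner, and the
# finger barely moves before lifting, so the touch is reported as a tap.
print(locate_touch(0.4, 0.3, 0.2, 0.1))                         # roughly upper-left of center
print(interpret_gesture([(0.00, 500, 900), (0.10, 503, 902)]))  # -> "tap"
```

Simple as this sketch is, it captures the essence of the electric relationship described above: analog measurements at the edges of the glass become a coordinate, and a coordinate tracked over time becomes intent.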
Through The Looking-Glass, and What Is Found There

Understanding how electrically sensitive smartphones are to human touch, it becomes clear that there is much more behind the screen than just glass. Although most touchscreens today are designed to appear simple, the makeup of a touchscreen is far more complex than the outer shell of glass we generally perceive a screen to be. Most smartphones today have touchscreens made up of five basic components. The bottom layer is a typical LCD display that presents the GUI for the interface. Right above it lies a glass substrate for physical protection. The most important component, however, is the layer of capacitive material above that: a grid of nearly transparent electrodes (most commonly indium tin oxide, or ITO) that is responsible for sensing human touch [7]. Separated by a required insulating gap are a layer of X sensor electrodes and a layer of Y sensor electrodes. Wherever the two overlap, a capacitor is formed that can be "sensed" and interpreted as an (x, y) coordinate [8]. Finally, completing the touchscreen is another layer of glass or plastic that acts as the "cover lens" for the capacitive system. A stronger material such as Gorilla Glass [9] is usually used to protect the touchscreen sensors from everyday wear and use.

Figure 2: The layers that comprise today's touchscreen smartphones. [10]
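The grid of X and Y sensor electrodes can be pictured the same way. The sketch below, again in Python with invented grid dimensions, thresholds, and function names rather than any manufacturer's actual firmware, scans every X/Y intersection for a drop in capacitance and converts the strongest reading into an (x, y) screen coordinate.

```python
# Illustrative sketch only -- grid size, threshold, and resolution are made up.

NUM_X, NUM_Y = 16, 28             # hypothetical count of X and Y sensor electrodes
SCREEN_W, SCREEN_H = 1080, 1920   # hypothetical display resolution in pixels
TOUCH_THRESHOLD = 0.2             # minimum capacitance drop that counts as a touch

def scan_grid(read_capacitance_drop):
    """Measure the capacitance drop at every X/Y electrode intersection.

    `read_capacitance_drop(i, j)` stands in for the analog front end that
    drives X electrode i and senses Y electrode j.
    """
    return [[read_capacitance_drop(i, j) for j in range(NUM_Y)] for i in range(NUM_X)]

def find_touch(grid):
    """Report the (x, y) pixel of the strongest touch, or None if nothing is pressed."""
    best_i, best_j = max(
        ((i, j) for i in range(NUM_X) for j in range(NUM_Y)),
        key=lambda ij: grid[ij[0]][ij[1]],
    )
    if grid[best_i][best_j] < TOUCH_THRESHOLD:
        return None
    # Map the electrode indices to screen pixels (centre of each grid cell).
    x = (best_i + 0.5) / NUM_X * SCREEN_W
    y = (best_j + 0.5) / NUM_Y * SCREEN_H
    return (x, y)

# Example with a fake panel whose only touch is at electrode intersection (5, 10).
fake_panel = lambda i, j: 0.9 if (i, j) == (5, 10) else 0.01
print(find_touch(scan_grid(fake_panel)))   # -> roughly (371.25, 720.0)
```

Real controllers typically refine this by interpolating across neighboring intersections, which is why the reported position can be far finer than the electrode spacing itself.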
How technology touched the hardware

Before the touchscreen, mobile devices were severely limited in terms of their hardware design and layout. To be considered "mobile" in the 21st century, the design had to fit within the surface area of your average jean pocket. A legible display and a clear input method for navigating and operating the phone were both essential; optimizing one, however, generally meant compromising the other. With input methods kept physically separate from the display, mobile devices needed new technological innovation in order to improve upon and outgrow simpler design strategies like flip-phones and slide-out keyboards [11].

Figure 3: Flip-phones and slide-out keyboards on mobile phones before touchscreen. (img via Samsung)

When touchscreen technology was finally introduced to the mobile arena, industrial designers and product engineers were given more than twice the real estate and could optimize the display and the input hardware simultaneously rather than trading one off against the other. As the door opened for more innovation and creativity in the physical hardware design of mobile devices, we soon saw lighter, slimmer, and sleeker phones. Most importantly, we saw increased efficiency: efficiency in surface area, in material used, and in function. By transitioning user input to take place on the screen itself, the display effectively doubled in size, as the number of physical buttons on some devices decreased from 40 to 1 [12]. With this extra space, displays became more legible and could present more information on-screen at once. While greater screen size created the opportunity to present more information in terms of quantity, it more importantly created the opportunity to display information with more quality. Previously confined to screens no larger than two square inches, most mobile devices could not effectively display anything complex, interactive, or meaningful to the user, like websites or video games [13]. When touchscreen technology transformed what that information could mean to the user, a whole new door opened for software innovation to take advantage of the opportunity.

Apps, apps, and more apps

Increasing the quantity and quality of presentable information effectively means more relevant data for users to access on their phones. More relevant data means more useful applications of that data, and thus more potential consumer-desirable features and functions for the mobile phone. Before touchscreens, most phones could perform only a limited range of operations and could effectively display only a limited amount of information. Essentially, old mobile devices had a standard set of applications regardless of who you were as a user: a dial pad for making phone calls, an inbox for SMS text messaging, a standard calculator for simple math, a notepad for simple text, and occasionally a game center for playing Tetris or solitaire. Accessing the web on small, hard-to-read screens was not only a long, frustrating, and inconvenient process, but also required the user to possess specific qualities, like patience and 20/20 vision.

Larger and clearer displays, in addition to the increased user control afforded by touchscreen instruction, have revolutionized mobile software, as well as what it can mean for different users. Since browsing the web on your phone no longer requires the precise control of up, down, and side-arrow buttons to navigate a single web page, mobile devices can now effectively present any information you can access via the Internet. With over 644 million active websites on the Internet, that can mean a lot of information [14]. When this new data became readily accessible to mobile device users, the one-size-fits-all standard for mobile applications soon disappeared. Today, there are over 650,000 active mobile apps in the Apple App Store alone and over 175,000 active publishers in the United States App Store [15]. Now, at the tap of your phone's touchscreen, you can download almost any application you could imagine, anywhere and anytime you want. With thousands of apps readily available for download, new information and tools that are personally relevant to each user's individual needs have become readily accessible. And by increasing the number and range of mobile applications available, user experiences can now be customized. What does this mean? A 15-year-old teenager no longer has to have a phone that functions the same way as their 70-year-old grandparent's does.

Evolving our relationship with technology

Opening the door for hardware improvements, software development, and intuitive interface designs, touchscreen electronics have created an entirely new platform for innovation. Although often overlooked, touchscreen technology has revolutionized not only the way we engineer and design hardware and software, but more importantly the overall experience and relationship we are able to have with technology. By bringing touchscreen electronics to mobile devices in particular, technology is now something we find ourselves interfacing with around the clock, everywhere we go. And as personally pertinent data and tools become more easily accessible to a wider range of users, there is incredible potential for new user experiences that transform technology and mobile devices into ever more intimate extensions of their users.

References:

[1] L. Zhang, D. Liu, X. Qiao, Research into the Interactive Design of Touch-Screen Mobile Phones' Interface Based on the Usability, Switzerland: Trans Tech Publications, 2011, pp. 1, 56-60.

[2] L. Zhang, D. Liu, X. Qiao, Research into the Interactive Design of Touch-Screen Mobile Phones' Interface Based on the Usability, Switzerland: Trans Tech Publications, 2011, p. 1.

[3] M. Lee, "The art of capacitive touch sensing," EE Times, March 1, 2006. http://www.eetimes.com/design/analog-design/4009622/The-art-of-capacitive-touchsensing. [Accessed October 6, 2012]

[4] S. Saini, "How do touch-sensitive screens work?," MIT Engineering Ask an Engineer, June 7, 2011. http://engineering.mit.edu/live/news/1439-how-do-touchsensitive-screenswork. [Accessed October 5, 2012]
[5] D. Braue, "Inside touch: How touchscreen technologies work," APC Magazine, September 14, 2011. http://apcmag.com/inside-touch-how-touchscreen-technologieswork.htm. [Accessed October 7, 2012]

[6] D. Morley, C. Parker, Understanding Computers: Today and Tomorrow, Comprehensive, 5th ed., Boston, MA: Cengage Learning, 2009, pp. 147-148.

[7] S. Kolokowsky, T. Davis, "Touchscreens 101: Understanding Touchscreen Technology and Design," Cypress Semiconductor Corp., p. 2, June 2009. http://www.cypress.com/?docID=17212. [Accessed October 6, 2012]

[8] T. Vu, A. Baid, S. Gao, "Distinguishing Users with Capacitive Touch Communication," Proceedings of the 18th Annual International Conference on Mobile Computing and Networking, New York, NY: ACM, 2012, pp. 197-208.

[9] S. Hanna, "Demystifying Touchscreen Technology," The Verge, August 27, 2012. http://www.theverge.com/2012/8/27/3271818/demystifying-touchscreen-technology. [Accessed October 7, 2012]

[10] S. Kolokowsky, T. Davis, "Touchscreens 101: Understanding Touchscreen Technology and Design," Cypress Semiconductor Corp., p. 3, June 2009. http://www.cypress.com/?docID=17212. [Accessed October 7, 2012]

[11] M. Fidelman, "How Mobile is Rapidly Evolving the World," Forbes, pp. 1-3, May 16, 2012. http://www.forbes.com/sites/markfidelman/2012/05/16/how-mobile-israpidly-evolving-the-world/. [Accessed October 8, 2012]

[12] M. Sjölund, A. Larsson, E. Berglund, "Smartphone Views: Building Multi-device Distributed User Interfaces," Mobile Human-Computer Interaction – MobileHCI 2004, Berlin, Germany: Springer, 2004, pp. 127-140.

[13] N. Henze, E. Rukzio, S. Boll, "100,000,000 taps: analysis and improvement of touch performance in the large," Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, 2011, pp. 133-142.

[14] J. Bort, "Demystifying Touchscreen Technology," Business Insider, March 8, 2012. http://articles.businessinsider.com/2012-0308/tech/31135231_1_websites-domain-internet. [Accessed October 8, 2012]

[15] J. Wilcox, "Apple App Store reaches 650,000 available, 30 billion downloads," BetaNews, June 11, 2012. http://betanews.com/2012/06/11/apple-app-store-reaches-650000/. [Accessed October 8, 2012]