Finding Genius
The technologies being employed at the time were crude. Members of the “Quantified Self” would hack together readily available computing components as a makeshift solution to translate aspects of their lives into digital form. Writing for the Financial Times, April Dembosky observed an early practitioner of the movement, a “quiet middle-aged man,” who hacked together his own tracking solution, “a pulse monitor clipped to his earlobe, a blood pressure cuff on his arm and a heart rate monitor strapped around his chest, all feeding a stream of data to his walkie-talkie computer.”
The accuracy and utility of these projects are questionable, but what Wolf and Kelly foresaw was the inevitable ubiquity of wearable computers that would track our lives — long before Fitbit, Apple, and Samsung entered the market with the devices that most consumers know today.
At the time, the individuals who were part of this Quantified Self movement were a relatively small subset of the population who believed that new digital technologies could lead them to a deeper understanding of themselves. “A new culture of personal data was taking shape,” Kelly wrote. “We don’t have a slogan, but if we did it would probably be ‘Self-knowledge through numbers.’”
The movement has since propagated to nearly every corner of the world. San Francisco plays host to a collective of approximately 5,000 people who meet to explore “Consciousness Hacking.” The community describes itself as exploring technology to catalyze psychological, emotional, and spiritual flourishing. About 500 people in Stockholm meet to discuss the very same topic. Nearly 1,000 people meet in Brooklyn to discuss “biohacking,” which focuses on exploring applications of novel biological technologies such as affordable genetic sequencing. In fact, one search on “Meetup” for the term “Quantified Self” surfaces 90,900+ members and 228 Meetups in over 115 cities and 25 countries.
The concept of knowing oneself for the purpose of self-development is not new. One of the earliest known examples of this behavior comes from the self-recorded experiments of the seventeenth-century physician Santorio Santorio. In one famous experiment, Santorio used a self-weighing chair to meticulously weigh himself, along with everything he ate and drank and everything he excreted, for 30 years. The crux of these experiments was the idea that you cannot change what you don't understand, and that through numbers one can build an understanding of the self.
The intellectual underpinnings of the Quantified Self stretch far back. Philosophical giants from Michel Foucault to Martin Heidegger gave heavy consideration to the "care of the self," emphasizing the importance of self-knowledge. When philosophers used discourse to examine their own thoughts and conduct, it could be viewed as an activity for gaining true knowledge that could drive meaningful personal growth. To this end, members of the Quantified Self movement, as Dembosky writes, "are fond of referencing Benjamin Franklin, who kept a list of 13 virtues and put a check mark next to each when he violated it. The accumulated data motivated him to refine his moral compass."
Athletes have long measured time and resistance as a means to quantify fitness. Individuals with chronic conditions have long measured aspects of their behaviors and physiology (among other variables) to manage their body states. In many ways, the most recent boom in quantifying ourselves is an elaboration of an age-old concept.
So what was happening in the late 2000s that gave way to our current landscape?
Cheaper, Faster, Smaller
"Moore's Law" is a commonly used term that describes the doubling of the number of transistors on integrated circuits roughly every two years. The term "law" connotes physical inevitability, but in reality it describes an industry expectation that engineers have used as a roadmap to drive progress across a variety of technical fields. Moore's Law is likely the most famous technological observation of the modern era, so famous that it is often used loosely to mean exponential technological change in general.
Exponential change is a simple concept that is nonetheless remarkably difficult to intuit. A famous mathematical puzzle known as the "wheat and chessboard problem" illustrates this well, as posed below:
"If a chessboard were to have wheat placed upon each square such that one grain were placed on the first square, two on the second, four on the third, and so on (doubling the number of grains on each subsequent square), how many grains of wheat would be on the chessboard at the finish?"
This problem has gained notoriety over hundreds of years, and often appears in stories about the inventor of chess. Sessa, an ancient Indian minister who is said to have invented the game, is said to have requested wheat — in accordance with the wheat and chessboard problem — as a prize for inventing it. The king, not realizing the hidden implications of the request, willingly agrees, only to have his court tell him that the request has bankrupted the kingdom's supply of grain. Opinions still vary as to whether the king rewards Sessa with a position in his court for his clever request or executes him for making a fool of him.
The general lesson is that exponential progression accelerates so quickly that its velocity defies intuitive logic. This factor is one of the driving forces behind the advances in the Quantified Self movement.
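The chessboard arithmetic is easy to verify directly. A minimal sketch in Python (the figures are exact; the 64-square board is the standard one from the puzzle):

```python
# Grains of wheat on an 8x8 chessboard, doubling each square: 1, 2, 4, ...
SQUARES = 64

# Sum of 2^0 + 2^1 + ... + 2^63, which equals 2^64 - 1.
total = sum(2 ** i for i in range(SQUARES))

print(f"Grains on the last square: {2 ** (SQUARES - 1):,}")
print(f"Total grains on the board: {total:,}")
# The total, 18,446,744,073,709,551,615, is over 18 quintillion grains --
# the doubling that starts with a single grain ends in an absurd quantity.
```

The first half of the board stays deceptively modest; nearly all of the total accumulates on the final few squares, which is exactly why the king's intuition failed him.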
Figure 1 - Exponentially increasing computational capacity over time (computations per second) — Koomey, Berard, Sanchez, and Wong (2011)
Examining the progression of personal computers in the 2000s provides a tangible, recent record of technological progression in action. At the beginning of the decade, the price of these PCs was extraordinarily high by today's standards. A notable example is the Gateway Performance 1500, which carried the brand-new Pentium 4 processor and sold for $4,272 (accounting for inflation). A few years later, in 2003, Apple released the Power Mac G5, the first personal computer to utilize a new 64-bit processing architecture. The G5 cost $2,587 in inflation-adjusted terms. Near the end of the decade, in 2009, HP released its 2140 Mini-Note, which sold for $554 in inflation-adjusted terms. In the span of a decade, personal computers moved from a price point that only a few families could afford to one that opened the market to a significant portion of the country. Indeed, in 2000, consumer PC penetration per capita in North America was approximately 26%; by 2009 it had reached 51%.
Computers were not just getting cheaper; they were also becoming more powerful. The benefit of being able to place more transistors on a chip is that the computer can conduct more mathematical operations in the same amount of time. Furthermore, as computing power increases, a machine has more capacity to execute instructions in parallel, retrieve data from memory, and (more generally) give ingenious engineers room to make it run faster. Every time you turn on a computer, decades of computer science innovation power the task at hand.
The magnitude of this improvement cannot be overstated. From approximately 1975 to 2009, the computational capacity of computers doubled every 1.5 years. Said another way, computers effectively doubled in power 23 times over those 34 years. To put that in perspective, if you had $1 that went through 23 cycles of doubling, you would have $8,388,608.
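That doubling arithmetic can be checked in a few lines, assuming the 1.5-year doubling cadence cited above:

```python
# Computational capacity doubling every 1.5 years, 1975-2009.
years = 2009 - 1975                          # 34 years
doubling_period = 1.5                        # years per doubling
doublings = round(years / doubling_period)   # ~23 doublings

# Total growth factor over the period.
growth_factor = 2 ** doublings

print(f"Doublings: {doublings}")             # 23
print(f"Growth factor: {growth_factor:,}")   # 8,388,608
```

A machine at the end of the period is roughly eight million times more capable than one at the start, which is the scale of change that intuition consistently fails to grasp.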
Size is another factor at play. An implication of Moore’s Law is that while the total potential power of a computing system increases dramatically, the form factor of moderately powerful devices decreases. Engineers have pushed the size of transistors down to 10nm, which is nearing the physical limits of what is possible. Practically, that means that smaller computers can now have the same power, if not more, than the much larger ones did even a decade ago.
The exponential change in computing price, power, and form factor is one of the most extraordinary, ongoing technological achievements in human history. It has fundamentally transformed all aspects of our society: how we socialize with one another, educate each other, and conduct business.
This ongoing innovation is also the underlying force that enabled useful and affordable mobile computers, which became the central player in the Quantified Self story.
The Mobile Shift
In 2005, Steve Jobs recruited a group of Apple engineers to work on a project, code-named "Project Purple 2," that would fundamentally transform the world's relationship with computers. The project's goal was to build a computing device in the form factor of a mobile phone, employing a multi-touch screen that obviated the need for a keyboard and mouse. It is reported that approximately $150 million was invested in this project over a 30-month period.
Steve Jobs officially announced the project in his keynote address at the January 9, 2007 Macworld Conference. This was more than a new product announcement — it marked a platform shift from a computing paradigm that was primarily based on PCs to one that was (and continues to be) based on mobile smartphones.
Technology investors often refer to "platform shifts" as moments when a new group of technologies (typically a combination of hardware and software) becomes a new paradigm for the development of computing applications. A variety of driving factors, which are hard to isolate, play into these shifts, but the economic implications are significant. An explosion of new business opportunities often arrives alongside a new computing platform, opening the floodgates for early-stage startups and large companies to release innovative products and services. Frequently, early-stage startups are able to profit from these platform shifts more quickly than large organizations.
Ownership of smartphones has exploded since the iPhone's first release, when virtually no one owned such a device. In the span of a decade, four out of every five mobile phone users in the US came to own a smartphone. This number is even more striking for people between the ages of 18 and 29 — approximately 94% own a smartphone.
Mobile phones have effectively become the center of the computing universe, displacing the desktop as the central hub. Put more simply, we use mobile phones for everything. Nearly every indicator of usage supports this claim. For instance, 2016 was the year when combined traffic for mobile and tablet devices first outpaced desktop traffic worldwide.1 In 2015, the number of "mobile-only" adult Internet users exceeded the number of desktop-only Internet users.2 In that same year, more Google searches took place on mobile devices than on computers in over 10 countries, including the US.3
However, it’s not only the centrality of mobile devices that is so transformative. As smartphones are always internet-enabled, we are now a society that is pervasively connected to each other. 26% of American adults now report that they go online “almost constantly,” and three out of four Americans go online on a daily basis.4 Pervasive broadband has enabled a plethora of innovative services — such as ride sharing, music streaming, and dating-by-swipes — that have transformed the way consumers interact with businesses and with each other.
The centrality of mobile smartphones and the persistent connection to the Internet laid the foundation for a new class of computers — wearable devices — that would create the conditions for the first truly mainstream Quantified Self use case.
The Emergence of Modern Wearable Computers
As consumer demand for smartphones continued to rise, the miniaturization and accessibility of computing components created an opportunity for companies to commercialize a new class of computing devices: affordable, wearable computers that were beginning to show signs of product-market fit. "Can a $99 rubber wristband inspire owners to move more, sleep better, and eat smarter?," wrote Thomas Ricker in a review of the Jawbone Up Fitness Band published on November 6, 2011.5 Jawbone's band was "very much post-PC," according to Ricker, meaning it used the smartphone as its central hub rather than a desktop computer.
While a variety of companies in previous decades had tried to commercialize wearable computers, the early 2010s saw a tidal wave of new products specifically targeting the fitness market. The first two years of the decade saw the introduction of modestly successful devices that primed the market for the bracelet form factor. Notably, in September 2010, Apple released the 6th-generation iPod Nano, which came with a wristband attachment converting it into a wearable wristwatch computer. Jawbone was one of many companies that were part of the race to capitalize on the application of computing technology to fitness.
Fitbit is probably the most well-known company in this market today. Originally called "Healthy Metrics," the company was founded in 2007 with a particular focus on building accessible activity trackers. At the beginning of the decade, the company generated approximately $5 million in revenue. Demonstrating the growth in popular demand for fitness-focused wearables, it went on to generate $1.9 billion in revenue in 2015, the year it filed for an IPO targeted to raise $358 million in new funding.
The popularity of these new devices translated to one clear message: fitness was the killer app that would drive mass adoption of wearable computers. To a certain extent, the Quantified Self had now become a popular phenomenon with a significant portion of the population interested in tracking at least part of their daily lives.
Yet fitness is not the only application of wearable computing that was being developed for the mass market in the early 2010s. Wearable computers were becoming a new platform for developers to build new applications.
Eric Migicovsky is a notable trailblazer in this early market. After having successfully completed Y Combinator, the world's premier startup incubator, Migicovsky was unable to raise capital from traditional venture capital funds. On April 11, 2012, Migicovsky's company, Pebble, launched a Kickstarter campaign with an initial target of $100,000. Early participants paid $99 for a Pebble, and within two hours of going live the project had met its target. Moreover, the campaign went on to raise $10.26 million in funding from 68,929 people — at the time, the world record for the most money raised via a Kickstarter campaign.
What was so notable about the Pebble was not just that it offered fitness applications in a wearable form factor, but that it also offered a variety of functions that served as an early template for the modern smartwatch. The Pebble was originally slated to ship with pre-installed apps, including a cycling tracker and a golf rangefinder, in addition to displaying notifications and messages from the connected smartphone. The beauty of the device, however, was that it enabled a plethora of applications to be built for it, with later generations adding applications including games (e.g., Pixel Miner) and transportation (e.g., Uber).
Gadi Amit, the founder of New Deal Design and the person responsible for the design of the Fitbit Activity Tracker, made an apt comment on the future of wearable computers: “You could imagine five years from now each one of us will probably have about 10 of these. Two or three will be customized to some physical or medical needs we have. Two or three will be recreational. Two or three will be data oriented, identity, authentication, and so on.”6
Indeed, the emergence of wearable computers had just begun.
Moving Beyond Fitness
In 2017, worldwide shipments of wearable computing devices were approximately 121 million across five categories of devices: smartwatches, smart clothing, ear-worn devices, wristbands, and sports watches.7 This number is set to grow dramatically over the coming years. In 2019, the number of total devices shipped is expected to increase approximately 27% over 2018 to approximately 190 million devices. By 2022, the total is expected to reach approximately 373 million devices. Accompanying this expected growth in device shipments is an expanding set of use cases.
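The shipment forecasts above imply a steep compound growth rate. A rough sketch of the arithmetic, assuming even annual growth over the three-year window from 2019 to 2022:

```python
# Implied compound annual growth rate (CAGR) from the shipment forecasts.
shipments_2019 = 190_000_000   # forecast devices shipped in 2019
shipments_2022 = 373_000_000   # forecast devices shipped in 2022
years = 3                      # 2019 -> 2022

# CAGR: the constant yearly growth rate that turns the 2019 figure
# into the 2022 figure after `years` compounding periods.
cagr = (shipments_2022 / shipments_2019) ** (1 / years) - 1

print(f"Implied CAGR: {cagr:.1%}")  # roughly 25% per year
```

A market nearly doubling every three years is rare outside of platform shifts, which is part of why commentators see wearables as the next one.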
Healthcare is one market that has industry commentators excited. Health providers are seeking technological solutions to help better manage their patient populations and, subsequently, the overall cost of delivering healthcare. There are more clinical trials incorporating digital health solutions than ever before. This will only increase as the FDA rolls out programs, such as the digital health software certification pilot program, that make it easier for digital health solutions to be approved. Insurance companies, recognizing the potential value of wearables in managing health, are starting to offset or fully reimburse the cost of these devices. As of 2017, less than 5% of large health systems had deployed full-scale digital health pilot programs. Thanks to many positive tailwinds, this number is expected to increase to 40% by the end of 2020.8
These medical wearable devices will look significantly different than the bracelet form factor most fitness bands have adopted. Sensor-embedded textiles have become a market that many believe will be a significant part of the health management solution. For example, Owlet is a company that has created a smart sock for babies, which allows parents to have a constant and accurate read of a baby’s heart rate and oxygen level with the help of a pulse oximetry (pulsox) device positioned within the sock.
Likewise, Nanowear is a trailblazer in this market building medical-grade smart textiles focused on healthcare applications across a broad array of medical conditions and chronic disease states. The company’s technology provides healthcare providers with continuous diagnostic data through a wearable that is effectively worn by the patient as a piece of clothing. The brilliance of these new, emerging solutions is that they move away from simple, descriptive quantifications of our behavior to prescriptive information that can diagnose targeted issues.
Beyond just measuring data exhaust from the individual, companies are also finding ways to measure aspects of the immediately surrounding environment. Deloitte Wearables is a company in Canada building a sensor that can be attached to a miner's helmet. The product is able to detect a range of environmental risk factors, including hazardous gases, radiation, temperature, and humidity. The company is using this data to provide alerts in emergency situations and allow for more effective communication. Rather than trying to surface descriptive statistics related to the individual, the company is focused on surfacing data related to an individual's surroundings and placing that information in context.