If you aren’t paying for something, you’re the product. Or your data is.
On September 3rd Michael Jordan walked to the center of the field at the start of the University of Michigan’s first football game of the 2016 season. The eye-pleasing ceremony surrounding (arguably) the world’s best basketball player overshadowed any discussion of the contract details behind his honorary captain’s position. The team donned the Jumpman logo on its uniforms as part of the $170 million deal between the University of Michigan and Nike.
Nike sponsors many collegiate teams and sports, as well as national and international teams, so news of yet another college-and-Nike partnership hardly seems worth paying attention to. But in this case the $170 million figure brought with it the opening of a Pandora’s box of access to personal information (and I mean really personal, as in biometric) and performance data.
The Nike/University of Michigan contract (penned in March of 2016, activated in August of the same year) gives Nike broad rights to utilize the information it collects through ‘wearable devices’ the athletes use. It does so through a clause defining ‘Activity Based Information’ as:
“performance and/or activity information/data digitally detected from the Teams or Team members during competition, training or other Covered Program Activities including, but not limited to, speed, distance, vertical leap height, maximum time aloft, shot attempts, ball possession, heart rate, running route, etc.”
To most people working in digital or data, the idea of collecting this sort of information is impressive, amazing even. Being able to analyze individual information is second only to the promise of correlating it across team members, team performance, inter-player dynamics, and even practice factors leading up to competition; performance metrics become a powerful tool for coaches and players alike.
But the contract isn’t intended to deliver that information to the coaches and players. The data will reside with Nike, which the contract lists not as a manufacturer but as a ‘sports and fitness company’.
In fact (and here I relied on Bill Wilson, a recognized authority on this subject), a standard contract between a player and the college doesn’t specify any level of privacy for the player. The rights of college athletes are a vibrant subject, but not the subject of this discussion; the idea of user-generated data is.
User-generated information isn’t new. As users of products and services of every kind (electricity, gas, cars, mobiles, laptops, websites, apps, credit cards), we have had our behaviors and transactions recorded in various forms. We often don’t read the fine print, where the policies are written and the types of data collected are defined, though how that data will be used rarely is.
I have a colleague who laughingly claims he would give Jeff Bezos his Social Security number if it guaranteed the Amazon experience would continue to surprise and delight him.
I wonder would he offer up his heart rate? His pulse? His mile run time? Perhaps. But my colleague is a fellow strategist, not an athlete.
Wearables are permeating not just athletics but our everyday lives; that much is common knowledge at this point. Early adopters once wore them to showcase just that, early adoption, but they are quickly becoming the norm. In 2014, PricewaterhouseCoopers reported that 7.6 million wearable devices shipped within the United States, a 200% increase over the previous year, and that one in five Americans owns a wearable device.
I sat in a session at SXSW two years ago where fabric manufacturers discussed how they were (literally) weaving haptic thread into garments to both read data from and deliver data to the wearer.
In late 2015 Netflix published a DIY article on how to make socks that sense when you fall asleep and pause the show you are watching (ummmm, yes please). But what data do they need to collect in order to make that happen? Could the same sensory information be used to correlate physiological responses to comedies and thrillers, the better to advise you on which other shows and movies will evoke similar responses? Is that terrible? What if that data were shared or sold to another brand? Your employer? Your internet service provider? What would you be comfortable with?
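For readers curious about the mechanics, a sketch of how such a sock might decide you’ve nodded off is surprisingly simple. This is an illustrative assumption on my part, not Netflix’s actual DIY design: the thresholds, window size, and function name below are all hypothetical, but the idea — watch a motion sensor and treat sustained stillness as sleep — is the gist of it.

```python
# Hypothetical sketch of sleep detection from a wearable motion sensor.
# Threshold, window size, and names are illustrative assumptions,
# not Netflix's published design.

def is_asleep(motion_samples, threshold=0.02, window=10):
    """Return True if the last `window` motion readings all fall below
    `threshold`, i.e. the wearer has been still long enough to be
    presumed asleep (and the show should pause)."""
    if len(motion_samples) < window:
        return False  # not enough data to decide yet
    recent = motion_samples[-window:]
    return all(abs(m) < threshold for m in recent)

# A wearer fidgeting at first, then going completely still:
readings = [0.5, 0.3, 0.1] + [0.01] * 10
print(is_asleep(readings))  # ten consecutive near-zero readings -> True
```

The point of the sketch is that even this trivial logic requires a continuous stream of body data leaving the sock, and nothing in the code constrains where that stream goes afterward.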
How about using wearable-delivered information in the workplace? In many athletic programs, athletes wear sensors that inform team doctors and physiologists of metrics used to pre-empt injury or modify workout or therapy programs.
If we issue wearables in corporate environments who does that data belong to? Should we be able to look up our colleague’s whereabouts if they are late for a meeting?
“Oh, Bob is in the elevator; he should be here in a minute.” Or what about more personal information, like one’s heart rate or body position? We can think through all sorts of efficiency arguments for companies knowing more about their workforce, ideally to create a better environment, manage workload, and personalize work experiences. But are our employers, our brands, our government prepared to handle this level of sensitive information? Do we trust them?
In a 2015 Georgetown Law Journal publication, Matthew Langley wrote extensively about this subject from the legal perspective. He noted that no federal law prohibits software companies that integrate with wearables from collecting health information. (In fact, integrated software/hardware companies like Fitbit actively need it to operate.) The most poignant question in my mind is that of intent. By simply wearing a device and using its software, am I intending to share the data? Are we nearing the point where I won’t be creeped out by Amazon pushing water to me on my Apple Watch when I am perspiring?
At the center of the wearables conversation is the use and security of the information I provide. In the case of Nike and the University of Michigan, health attorney Tatiana Melnik points out in a New York Times article that the federal Health Insurance Portability and Accountability Act applies to medical information and records, but not to biometric information. She notes the slippery slope, asking, “How does a player know you’re not going to turn around and share this information with the N.F.L.?”
But this question applies to us all, not just highly motivated professionals who use their bodies as their brand and primary asset. In an article published on the Missouri Bar website we learn that:
“A Federal Trade Commission (FTC) study released in May 2014 revealed that 12 mobile health applications and devices transmitted information to 76 different third parties, and some of the data could be linked back to specific users. In addition, 18 third parties received device-specific identifiers, 14 received consumer-specific identifiers, and 22 received other key health information.”
We know that technology is outpacing society’s ability to adopt, monitor, or regulate it. With wearables we are entering the most dangerous and promising period of digital lifestyle in our generation. It is a time when, as brands and stewards of digital, we must err on the conservative side with personal data. As consumers, we must use the very technologies that benefit us to challenge and scrutinize their practices. Allowing pomp and circumstance, or even eloquent marketing stories, to overshadow the risk to our privacy, and now our bodies, isn’t a trend I hope continues. Our involvement is critical, as the government organizations we rely on at these tricky intersections are well behind the journey technology has us on.