There's a new report on the collaborative economy, saying "sharing is the new buying." The basic idea of this collaborative, or sharing, economy is that if you have something of value -- a ride, an apartment, a couch, an office space, an appointment, almost anything, really -- a cloud-based platform can coordinate the use, rental or purchase of that item.
Trust drives economics, and that is even truer in the sharing economy. Social networks are key components of this sharing economy, establishing reputation and trust among the participants of online networks. You can rate your driver on Uber, for example. Social networks become the key arbiters of quality, driving traffic and customers toward high-quality items as people follow the advice of those they trust. As one commenter put it, "The whole world has become a network, and we are all network marketers unknowingly."
The author of the report, Jeremiah Owyang, said, "The big trend here is that the crowd is empowered to get the physical world from each other rather than buying it from brands." In other words, we used to trust brands; now we increasingly find trust in one another.
The strength of this new sector of the economy is undeniable: it already includes several billion-dollar companies, with Etsy -- which allows craft-makers to sell their wares -- and Uber -- a transportation service -- among those quickly becoming household names. According to the report, there are 80 million "sharers" of one form or another in the U.S., and the size of the sharing economy doubled between 2013 and 2014.
The sharing economy is happening in health care in different flavors. ZocDoc is one early example of the sharing economy in action in health care. ZocDoc -- which allows physician practices to share appointment times that become available -- is now used by five million patients per month, according to the firm. PatientsLikeMe is another example: patients share their health data and stories to the benefit of the community and, ultimately, themselves.
These examples are part of a trend toward collecting, sharing and possibly selling health data, and its growth is quickly becoming exponential. Just in the last couple of weeks:
- NIH announced the largest and most diverse genetics database ever created;
- Merck announced the M2i2 collaboration group, with 14 collaborators, including PatientsLikeMe, Practice Fusion, Allscripts and Boston Children's Hospital;
- Craig Venter of human genome and artificial life fame has raised $70 million for a new longevity venture that "will use ... technology to map 40,000 human genomes in a push to build the world's largest database of human genetic variation"; and
- The United Kingdom's National Health Service has run into a string of bad press and problems around data security, selling information and indiscriminate sharing. The British science journal Nature called it a "slapdash" introduction of a national health database.
Taken together, these stories frame the issues in health care pretty well. Some are saying this is bigger than big data -- huge data -- and while there's great possibility, there's also great risk. There's a land grab going on for personal data, including personal health and genomics data. It's like a new frontier with few rules for how data are used, and the law is notoriously slow to adapt. A common theme among these stories is whether consent has been or will be granted by patients and whether there is an opportunity to opt out.
Yet huge data will keep getting even more huge and more useful. Genetic sequencing costs are plummeting, while wearable sensors continue to evolve and sell. Apple is poised to enter health care with the iWatch, which looks to include a wide array of sensors, and the company has recruited several key biometrics executives to help build it. Some reports say the iWatch will be able to predict heart attacks.
Rather than letting third parties build predictive models to harness that data for their own business models, Apple seems to understand that we may simply need to put that power in the hands of the people who need it most and who make the majority of health decisions: patients. After all, since we have all of our skin in the game, and then some, it would be great if we understood our own risks.
Apple's entry into health care signals a couple of things:
- The entry of a company with a history of success building great devices for users first, not for third parties.
- The mainstream entry of a true "Internet of Things" in health care, with our bodies among those things and with the potential for ourselves and others to track massive amounts of data about our health.
One of the things we will be making with all of these sensors on our watches, phones and game consoles is a pretty robust virtual representation of ourselves. Call this a new kind of maker movement: making representations of ourselves -- something new that can potentially be shared and that is far more valuable when combined with other data. Whom would you rather trust with this representation, Apple or Facebook? Both already have parts of it and will soon have more.
The point is that there will soon be an economy of health care data (something I had hoped for with the passing of the Affordable Care Act) and of representations of different parts of our health information, some of it shared. But, like any economy, it needs safety precautions in place so that people feel comfortable sharing and making transactions.
And herein lies the fundamental point: In all the examples cited above from the sharing economy, there are rights attached to what is being sold or shared -- property rights. Nobody else can sell an object that I've made without my permission; that's stealing. There are protections, and these protections are exactly what enable the sharing economy. If I buy or sell something on eBay, I can be reasonably assured it won't be tampered with in transit.
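That "won't be tampered with in transit" assurance has a direct digital analogue for health data. As an illustrative sketch only -- the key, record format and function names below are hypothetical, and a real system would use public-key signatures with proper key management -- a sender can attach a keyed hash to a record, and any modification in transit invalidates it:

```python
import hashlib
import hmac

# Hypothetical shared secret between sender and receiver (illustration only).
SECRET_KEY = b"example-shared-secret"

def sign_record(record: bytes) -> str:
    """Compute an HMAC-SHA256 tag the receiver can use to detect tampering."""
    return hmac.new(SECRET_KEY, record, hashlib.sha256).hexdigest()

def verify_record(record: bytes, tag: str) -> bool:
    """Return True only if the record is unmodified since it was signed."""
    return hmac.compare_digest(sign_record(record), tag)

record = b'{"patient": "anon-123", "glucose_mg_dl": 95}'
tag = sign_record(record)

assert verify_record(record, tag)                     # intact record passes
assert not verify_record(record + b"x", tag)          # any change is detected
```

The point of the sketch is that integrity in transit is a solved technical problem; the hard part, as the rest of this piece argues, is the rules and enforcement around misuse.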
We need to build support for enabling a sharing economy in health care while answering some fundamental questions for ourselves, our government and the private enterprises that wish to leverage that data. I'll paraphrase some of these core questions from a leading consultant on the Internet of Things, Chris Rezendes of INEX Advisors:
- What do I want to measure?
- What sensors or tests can I use?
- What data do I want from it?
- Who owns the data or parts of the data?
- Who can access the data?
- Who determines access rights?
- What are the minimum communication (and fundamental unit) and security measures for transmitting the data?
- And finally (and for me, most importantly), what happens when someone violates a rule?
My colleague Nathan DiNiro likes to compare effective data transmission to the U.S. postal system. It wouldn't be too hard for someone to tamper with the mail, but it isn't a big problem because very strict rules against misuse are in place. As we move toward a new health data economy, we need to be clear about the rules and rights around health data, and about the consequences for, and enforcement of, misuse. My take is that enforcement should enable, not inhibit, sharing and transmission for the benefit of all.
As John Wilbanks of Sage Bionetworks has correctly asserted (h/t David Maisenberg of @biologypartners), we could come up with all kinds of encryption methods to protect data from getting into the wrong hands while still doing research. But at the end of the day, as the cost of collecting data independently continues to plummet, it makes a lot more sense to increase the cost of misuse.
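One flavor of the protective techniques Wilbanks alludes to can be sketched in a few lines: keyed pseudonyms let researchers link a patient's records across datasets without ever seeing the underlying identity. The key, field names and identifiers below are hypothetical, for illustration only:

```python
import hashlib
import hmac

# Hypothetical institutional key: with it, the same patient always maps to the
# same pseudonym, so longitudinal research works without exposing identities.
PSEUDONYM_KEY = b"example-institution-key"

def pseudonymize(patient_id: str) -> str:
    """Derive a stable, non-reversible pseudonym from a patient identifier."""
    return hmac.new(PSEUDONYM_KEY, patient_id.encode(), hashlib.sha256).hexdigest()[:16]

records = [
    {"patient_id": "jane.doe@example.com", "hba1c": 5.4},
    {"patient_id": "jane.doe@example.com", "hba1c": 5.6},
    {"patient_id": "john.roe@example.com", "hba1c": 6.1},
]

# The research dataset carries pseudonyms instead of identities.
research_set = [
    {"patient": pseudonymize(r["patient_id"]), "hba1c": r["hba1c"]}
    for r in records
]

# Same patient -> same pseudonym; different patients stay distinct.
assert research_set[0]["patient"] == research_set[1]["patient"]
assert research_set[0]["patient"] != research_set[2]["patient"]
```

Even so, Wilbanks's deeper point stands: techniques like this reduce exposure, but the stronger lever is making misuse expensive.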
We don't think twice about dropping a letter in the mail. We should feel the same security when creating and sharing health data. If we don't answer these questions about access and ownership for ourselves and our representatives, others will continue to answer them for us. We can't necessarily predict what misuse will happen, but we know that without clear rules, rights and enforcement, this data will be misused, and the sharing economy will suffer.