Government regulators are only beginning to draw lines of privacy around data accumulated by the Internet of Things, the emerging collection of installed and wearable networked devices that featured so prominently at CES 2014. But could these same devices end up being part of the privacy solution?
In September 2013, the FTC took its first enforcement action related to IoT-collected information. TRENDnet, a company that markets video cameras designed to allow consumers to monitor their homes remotely, settled FTC charges that its lax security practices exposed the private lives of hundreds of consumers to public viewing online. According to the FTC, TRENDnet marketed its numerous products as being “secure” when, in fact, the cameras had faulty software that left them open to online interception. The complaint further alleged that, in January 2012, a hacker exploited this flaw and made it public, and, eventually, hackers posted links to the live feeds of nearly 700 of the cameras. The feeds displayed babies asleep in their cribs, young children playing, and adults going about their daily lives. Once TRENDnet learned of this flaw, it uploaded a software patch to its website and sought to alert its customers of the need to visit the website to update their cameras.
“The Internet of Things holds great promise for innovative consumer products and services. But consumer privacy and security must remain a priority as companies develop more devices that connect to the Internet,” said FTC Chairwoman Edith Ramirez.
Geolocation data is a perennial issue with any wearable device. The IoT will also implicate subject-specific privacy laws. Without question, IoT advancements will allow a greater range of devices to store personal health information or send messages intended to be private. When they do, new questions will arise about applying existing, subject-specific privacy laws such as HIPAA and the Stored Communications Act.
A new approach to protecting privacy will need to be found. Here, augmented reality (especially AR-capable digital eyewear) raises new questions of its own, but it also offers potential solutions.
Because augmented display technologies will allow us to see large displays of virtual data floating in mid-air, rather than relying on size-constrained physical monitors, privacy warnings and dialogs can be made easier to notice. They can also be made easier to understand if they are displayed in physical proximity to the device in question, rather than buried in a remote, two-dimensional privacy document. So, for example, if the manufacturer of my refrigerator wishes to warn me that it will remember all of the food items I place inside the fridge, it can be programmed to display in my AR eyewear a large, red box containing this warning, floating in mid-air in front of the refrigerator door. By gesturing a hand (which, by then, will likely also be equipped with location-aware transmitters for just such a purpose) through the dialog box, I can indicate my assent to this data collection and go about my business.
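The refrigerator scenario above can be sketched in code. This is a purely hypothetical illustration, not any real eyewear API: the `Disclosure` and `ConsentLedger` names, and the idea of a "gesture through the box" recording assent, are all assumptions made for the sake of the sketch.

```python
from dataclasses import dataclass, field

@dataclass
class Disclosure:
    """A data-collection notice broadcast by an IoT device (hypothetical)."""
    device_id: str
    message: str

@dataclass
class ConsentLedger:
    """Records which device disclosures the wearer has assented to."""
    assents: dict = field(default_factory=dict)

    def render_warning(self, d: Disclosure) -> str:
        # Real eyewear would draw a red box anchored to the device's
        # physical location; here we simply format the warning text.
        return f"[WARNING] {d.device_id}: {d.message}"

    def gesture_through(self, d: Disclosure) -> None:
        # A hand swipe through the floating box records assent.
        self.assents[d.device_id] = True

    def has_assented(self, device_id: str) -> bool:
        return self.assents.get(device_id, False)

# The refrigerator example from the text, with an invented device ID.
fridge = Disclosure("fridge-01", "This refrigerator records every item placed inside it.")
ledger = ConsentLedger()
print(ledger.render_warning(fridge))   # warning appears before any assent
assert not ledger.has_assented("fridge-01")
ledger.gesture_through(fridge)         # the wearer swipes through the box
assert ledger.has_assented("fridge-01")
```

The point of the sketch is the ordering: the disclosure is rendered, and interaction proceeds only after an affirmative gesture, which is the reverse of today's buried click-through agreements.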
Indeed, AR apps and eyewear could be programmed to recognize virtually any IoT-connected device and display the types of information that it collects. In that way, AR could democratize data collection by empowering everyone to peek behind the veil and “see” the data residing in and traveling through otherwise-invisible sensors.
Similarly, as I walk down the sidewalk, my AR eyewear could be programmed to display the geographic boundary lines around each store’s Bluetooth Low Energy (BLE) sensor network. These could be highlighted in predetermined colors, or annotated with the appropriate warning language, to indicate that by stepping over the line, the store’s network will register my physical presence there and be permitted to digitally interact with me. In both examples, the consumer would be able to make a decision that is orders of magnitude more informed than anything allowed by present-day digital privacy practices.
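The sidewalk scenario boils down to a geofencing test: has the wearer crossed into the store's sensor zone? A minimal sketch, assuming the zone can be represented as a polygon in map coordinates (the `store_zone` coordinates and the color annotations are invented for illustration):

```python
# Hypothetical sketch: deciding whether the wearer has stepped inside a
# store's BLE sensor boundary, so the eyewear can color the line accordingly.
def inside_boundary(point, polygon):
    """Standard ray-casting point-in-polygon test; polygon is a list of (x, y)."""
    x, y = point
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Count edge crossings of a horizontal ray cast from the point.
        if (yi > y) != (yj > y) and x < (xj - xi) * (y - yi) / (yj - yi) + xi:
            inside = not inside
        j = i
    return inside

# An invented store sensor zone, as a simple square in map coordinates.
store_zone = [(0, 0), (10, 0), (10, 10), (0, 10)]

def annotate(point):
    # Inside the zone: the boundary turns red and carries the warning
    # that the network may now interact with the wearer.
    if inside_boundary(point, store_zone):
        return "red: this network registers your presence and may interact with you"
    return "amber: sensor boundary ahead"

print(annotate((5, 5)))   # wearer has crossed the line
print(annotate((15, 5)))  # wearer is still outside
```

In a real system, the polygon would come from the store's published beacon coverage rather than hard-coded coordinates, but the consent logic is the same: visualize the line before the wearer crosses it.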
As technology redefines our ideas of what is possible, the countermeasures we take to protect privacy should be equally outside the box.