Utopia: How Design Can Protect Privacy in the Connected World

18th of January, 2016

Every day we move forward with new technology, new promises, and new sets of responsibilities. We are told, and sold, on modern technologies that keep us connected and bring a sense of ease to a highly connected world. We have Uber to get us where we need to go, Siri to suggest a restaurant, and Fitbit to track our weight-loss progress. Facebook keeps us in contact with our friends, and Rogers Smart Home Monitoring protects our homes, connecting us while we’re out of town.


The perversion lies in a connectedness that leaves us exposed as consumers. With the gadgets we carry, we leave behind a digital trail of our location, identity, habits, likes and dislikes, and more: a trail to be harvested and sold without our knowledge or consent.


A 2015 Carnegie Mellon University study reveals alarming statistics: “Your location has been shared 5,398 times with Facebook, Groupon […] and seven other apps in the last 14 days.” MIT’s Technology Review reports that this “often takes place without the [gadget] owner being aware.” Every day we accept digital interactions that we would find intolerable in real life: clicking forward, or scrolling to the bottom of the screen and accepting “Terms & Conditions” unread. Would you sign a printed legal document without reading it? Would you hand personal and financial information to strangers? We need more transparency into what goes on behind the scenes in the connected world.


The Federal Trade Commission’s report “Internet of Things (IoT): Privacy and Security in a Connected World” details how security should be approached, highlighting best practices. The first suggests that companies build security into their devices at the outset, rather than as an afterthought. As part of this security-by-design process, companies should consider:

  1. Conducting a privacy or security risk assessment;

  2. Minimizing the data they collect and retain; and

  3. Testing their security measures before launching their products.


With respect to personnel practices, the report recommends that companies train all employees in good security and ensure that security issues are handled at the appropriate level of responsibility within the organization. Companies should also retain service providers capable of maintaining reasonable security, and provide reasonable oversight of those providers. When companies identify significant risks within their systems, they should adopt a defense-in-depth approach, applying security measures at several levels. They should likewise implement reasonable access-control measures to limit an unauthorized person’s ability to reach a consumer’s device, data, or even the consumer’s network. Finally, companies should continue to monitor products throughout their life cycle and, to the extent feasible, patch known vulnerabilities.

4 Designs That Can Protect Privacy

1. Symmetry

Rebalance the asymmetry between consumers, who benefit a little from our information, and providers of goods and services, who benefit enormously. Facebook, for instance, should agree to your terms and conditions, reciprocating your agreement to theirs. We need a consumer-side list of “Terms & Conditions” that benefits and protects the user.

2. Intelligence

Apply artificial, personal, and social intelligence to privacy. Siri should be able to ask a supercomputer like Watson to analyze both sets of Terms & Conditions mentioned above—in microseconds—and counsel you, offering alternatives and crowd-sourced rankings of an application’s reliability and safety, and whether using the product or service will actually benefit you.


3. Revocability

Provide options to examine, retrieve, and delete—with confirmation—any personal, financial, and usage information from any app or service that originally requested, used, or recorded it. Europeans call this the “right to be forgotten”. There needs to be a way to guarantee that what is “forgotten” is actually permanently deleted, not merely hidden while the underlying data persists.
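As a sketch of what revocability could look like in practice, here is a minimal, hypothetical service-side interface exposing the three operations above: examine (what is held), retrieve (a full copy), and delete (only with explicit confirmation). All class and method names are illustrative, not any real service’s API.

```python
class PersonalDataStore:
    """Hypothetical store exposing examine/retrieve/delete for a user's data."""

    def __init__(self):
        self._records = {}  # user_id -> {key: value}

    def record(self, user_id, key, value):
        """What the service does today: silently accumulate data."""
        self._records.setdefault(user_id, {})[key] = value

    def examine(self, user_id):
        """Show the user what categories of data are held about them."""
        return sorted(self._records.get(user_id, {}))

    def retrieve(self, user_id):
        """Hand the user a full copy of their own data."""
        return dict(self._records.get(user_id, {}))

    def delete(self, user_id, confirmed=False):
        """Permanently remove a user's data; refuses without confirmation."""
        if not confirmed:
            raise PermissionError("deletion requires explicit confirmation")
        self._records.pop(user_id, None)
        return user_id not in self._records  # confirmation back to the user


store = PersonalDataStore()
store.record("alice", "location", "Toronto")
store.record("alice", "email", "alice@example.com")
print(store.examine("alice"))                 # ['email', 'location']
print(store.delete("alice", confirmed=True))  # True
print(store.examine("alice"))                 # []
```

The point of the sketch is the contract, not the implementation: deletion is a first-class operation that answers back with confirmation, rather than a buried settings page.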

4. Containment

Develop data encapsulation and encryption protocols that enforce ownership, permission, and expiration. If you try an app or service for a few days and then never return, or never pay a subscription fee, why should the provider keep your data forever? If you used a product for 30 days, you should have the option to delete every associated record and user profile.
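A rough sketch of such a container, with the three properties named above (ownership, permission, expiration), might look like the following. Everything here is hypothetical, and the encryption is elided: in a real protocol the payload would be ciphertext whose key is destroyed at expiry (sometimes called crypto-shredding), so the data is genuinely unrecoverable.

```python
import time


class EncapsulatedRecord:
    """Hypothetical container binding data to an owner, a permission
    list, and an expiration time, after which the data is destroyed."""

    def __init__(self, owner, payload, permitted=(), ttl_seconds=30 * 24 * 3600):
        self.owner = owner
        self._payload = payload          # would be ciphertext in practice
        self.permitted = frozenset(permitted)
        self.expires_at = time.time() + ttl_seconds

    def read(self, requester, now=None):
        """Return the payload only to permitted parties, only before expiry."""
        now = time.time() if now is None else now
        if now >= self.expires_at:
            self._payload = None         # expired: gone for everyone
            raise ValueError("record expired")
        if requester != self.owner and requester not in self.permitted:
            raise PermissionError("requester is not permitted to read")
        return self._payload

    def revoke(self):
        """Owner-initiated early deletion, regardless of expiry."""
        self._payload = None
        self.expires_at = 0.0


rec = EncapsulatedRecord("alice", "steps: 8,421", permitted={"fitness-app"})
print(rec.read("fitness-app"))  # 'steps: 8,421'
```

The design choice is that expiry and revocation live inside the data container itself, rather than depending on the goodwill of whoever holds a copy.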


Tim Kobe, founder and CEO of Eight Inc., puts it well when he argues, “Our work must delight and reward the user first.” User-centric means we work on behalf of the user: the citizen-consumer. Users should never be placed in situations where desire, convenience, or necessity pushes them to forgo their own best interest. Ultimately, design decisions should focus on the best interests of the end user; in the long run, that is also the best way for companies to serve their users. Designers must do their part to ensure that our connected world combines fairness, safety, and privacy. If not, George Orwell’s 1984 dystopia may arrive more terrible, and much sooner, than we realize.

About Jitesh Chauhan


A student of life with a passion for people, communication, and privacy. 
