

01 Mar 17 22:15

I had the pleasure of addressing a group today on the many dimensions of becoming a pro-privacy brand that consumers will truly trust.  Using wearables as a case study, we discussed issues ranging from the legal context of privacy laws to the Privacy by Design approach and the identifiability of data.  I'd love to share a bit on the three key messages we covered:

1. Context is crucial -- When it comes to brand differentiation and consumer trust, it's helpful to understand the historical, cultural and legal context, but it's crucial to realize that privacy ultimately boils down to the individual level.  Privacy is a developing field of law, not a static one; in fact, new laws such as those regulating the use of biometrics and genetic tests are often adopted by legislators in reaction to outcry from constituents.  So, let's think about how you would feel about your beloved fitness tracker doing the following with your health data:
* sharing it with your fitness coach or trainer
* sharing it with the company's community of users
* analyzing it to develop products and services
* analyzing it for marketing purposes
* providing it to your healthcare providers
* providing it to other health-related companies (including big pharma)
* providing it to your employer
* providing it to the government
* combining it with data from people who live in the same region
* combining it with data from people of your ethnicity
* using it to make predictions about your family members
* saving it forever
* posting it on your social networks

In an informal poll that I conducted during the webinar, I learned that my audience considered sharing their personal wellness data with their social networks just as unacceptable as sharing it with the government.  I also learned that only about 20% thought it was acceptable to combine their data with that of other people in the same cohort, be it geographic, socio-economic, ethnic, or familial.  These are the barriers we face when we want to use big data sources from wearables for underwriting or other legitimate business purposes.

2. Privacy by Design -- A proactive way to face these issues of customer confidence and data sharing head-on is to build a genuinely trustworthy brand.  This requires a multi-factor approach involving your business, legal/compliance, technology/analytics, and marketing teams.  Because wearables and big data are mostly unregulated to date, it's not sufficient to say that you comply with the laws.  That should only be a starting point.  The fair information practice principles exhort us to think about concepts such as accountability and notice and consent -- ideas that I like to sum up as "Say what you do and do what you say."  In saying what we do, it's important to be user-centric and use language that is straightforward, clear and user-friendly.  Do you ever read the privacy policies for websites that you visit?  More often than not, they are written in complicated, legalistic terms.  And they are long!  One study estimated that if American Internet users read online privacy policies word-for-word each time they visited a new website, it would take a person 76 work days per year.  Who would do this, especially when click-throughs are designed to make it easy for you to skip the actual reading of a privacy policy or terms of use?  This is where IT designers and the business can collaborate to provide genuine and practical notice to consumers.

3. Identifiability is a moving target -- It's not so simple to say that data has been "de-identified" and is therefore safe to use.  Many studies show how supposedly de-identified datasets can be re-identified, often by cross-referencing auxiliary databases:
* Simple Demographics Often Identify People Uniquely - Check out this website (http://aboutmyinfo.org/) where you can plug in your DOB, gender and ZIP to see just how easy it is to identify you to a tee.
* Re-identification, Personal Genome Project
* De-anonymization of Netflix Prize Dataset
* De-anonymizing Web Browsing Data with Social Networks
* On Taxis and Rainbows: Lessons from NYC's Improperly Anonymized Taxi Logs
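The first study on that list can be made concrete with a tiny sketch of k-anonymity, the property these re-identification attacks exploit when it fails: group records by their quasi-identifiers (ZIP, DOB, gender) and look at the smallest group.  The records and field names below are hypothetical, purely for illustration.

```python
from collections import Counter

def k_anonymity(records, quasi_ids):
    """Smallest equivalence-class size over the quasi-identifier columns.
    A dataset is k-anonymous if every combination of quasi-identifier
    values is shared by at least k records."""
    groups = Counter(tuple(r[q] for q in quasi_ids) for r in records)
    return min(groups.values())

# Hypothetical sample records (illustrative only)
people = [
    {"zip": "02139", "dob": "1975-01-02", "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "dob": "1975-01-02", "sex": "F", "diagnosis": "cold"},
    {"zip": "02139", "dob": "1980-07-30", "sex": "M", "diagnosis": "asthma"},
]

print(k_anonymity(people, ["zip", "dob", "sex"]))  # 1 -> the third record is unique
```

A k of 1 means at least one person is pinned down exactly by demographics alone, which is precisely what the aboutmyinfo.org demo shows for most Americans.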
So please remember that identifiability is incremental.  Steer clear of terms like "de-identified" and "anonymous," which imply that re-identification is categorically impossible.  In reality, as our datasets grow bigger and finer-grained (especially with wearables and sensors), and as analytic capabilities develop apace, it becomes easier to re-identify individuals.  Instead, we can look into statistical methods such as differential privacy, which Apple is now touting, alongside a holistic privacy program like the one I describe above.  We can be forthright with consumers that we are using personal data, but in a trustworthy and above-board way that actually results in "return on information" (my personal spin on "ROI") for them.
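For readers curious what differential privacy looks like in practice, here is a minimal sketch (not Apple's implementation) of the classic Laplace mechanism applied to a counting query over hypothetical wearable step counts; the function and variable names are my own.

```python
import random

def laplace_noise(scale):
    # Laplace(0, scale) sampled as the difference of two exponentials
    return random.expovariate(1 / scale) - random.expovariate(1 / scale)

def dp_count(values, predicate, epsilon):
    """Differentially private count: the true count plus Laplace(1/epsilon) noise.
    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so the noise scale is 1/epsilon."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical daily step counts from a wearables cohort (illustrative only)
steps = [4200, 11800, 9400, 15100, 7300, 12900]

# How many users exceeded 10,000 steps?  The true answer is 3, but only a
# noisy answer is released, so no single user's presence can be confidently inferred.
noisy_answer = dp_count(steps, lambda s: s > 10000, epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the trade-off between accuracy and protection is explicit and tunable, which is exactly the kind of forthright design choice a pro-privacy brand can explain to its customers.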

Finally, here are some useful articles for your perusing pleasure:
* Interactive data breach infographic: World's Biggest Data Breaches Infographic
* National Cybersecurity Institute: How Does a Data Breach Affect Your Business Reputation
* Privacy by Design: Why You Should Bet Big on Privacy
* Privacy for Your Insureds: How to Help Insureds Manage Customer Privacy Risk
* UK Information Commissioner: Summary of the Forum on the Use of Data in Retail General Insurance


Category: Other


1 Comment

Waldemar Razik - 6 Apr 2017, 10:29 a.m.

Thank you for promoting privacy. This is very important for all of us.

What's more, there can be a lucrative (re-)insurance business in providing comprehensive solutions to developers, providers and operators of e-services.

I can envision a situation where:

1. Technical underwriters of a re-insurance company supervise the design and development of new e-services from day one, as paid consultants (profit for the insurer even before underwriting any contract!). In this way, best practices in the Secure Software Development Lifecycle can be monitored and, if required, imposed on the service developer.

2. The re-insurance company can provide commercial assistance services (a SOC, or security operations center) to a customer who operates the newly developed asset. Because the asset is known to the underwriter, the risk can be priced and mitigated more effectively.

3. An additional portfolio of insurance policies related to the asset can be sold to end users (private persons and SMEs), with further cross-selling of general privacy insurance coverage for other areas of the digital world.

Really, privacy is important and can be a source of stable revenue for the insurance industry.

