By 2020, less than three years from now, the number of connected “things” will have risen to 25 billion, more than half of them in the consumer sector, according to Gartner. Connected products such as home systems and health and fitness trackers will be able to gather and transmit detailed information.

With the European General Data Protection Regulation (the GDPR) coming into force in May 2018, companies not only need to establish practices for protecting customers’ data, but also need to start leading an open discussion on how we can design for transparency and control in the use of digital products and services.

This is not only relevant to the EU: it also affects any Australian businesses that either offer goods and services in the EU or monitor the behaviour of individuals in the EU. Without taking a more stringent approach to personal data, companies could face regulatory penalties and, worse, risk losing customer trust.

So what do we know about user trust and personal data? According to a study by product strategy and design firm Frog, published in Harvard Business Review, only 25% of people knew that data such as their location was being captured, yet 97% of respondents were concerned about businesses and government misusing their information.

Identity theft ranked as the top concern, followed by privacy issues around personal information such as credit card details, digital communications and, in certain countries, health history. Privacy concerns vary in weight according to cultural norms and whether a society is more collectivist or individualistic.

In early December, I was one of the designers who took part in a design jam session organised by Facebook. Together with academic participants from the University of Southampton, as well as designers from UK digital agency Normally, we collaborated in diverse teams to come up with ideas to improve how startups can design transparency and control into the personal data of their users.

We were given an insightful anecdote by Chris Downs, director of Normally Studio and one of the pioneers of service design, on his experiments with compiling and selling his own data. Thinking about the scale of big data can often make us lose sight of the individuals affected by data-driven decisions. Chris’ story humanised the data collection process and made it concrete and material. He concluded that data transparency, i.e. knowing what data is being collected and for what purpose, is imperative to maintaining a trusting relationship with customers. We can only design more positive interactions around data consent once we first understand the value of our own data.

Trust is key when asking for information

Numerous consumer studies suggest that we are more willing to share our data with companies we trust. Interestingly, the type of company can determine the level and likelihood of that trust. Primary care doctors rank highest, followed by payment or credit card companies and then e-commerce firms. Next come banks, tech firms and internet giants, then government. Ranking last are the media and entertainment industries and social media firms.

It also seems apparent that the higher the consumer value of the service, the greater our trust in the company and our willingness to share information with it. For online businesses, this means that getting users to give you their name and email may be straightforward, but asking for their payment details requires more effort to show that their data won’t be misused.

Several of the startups at Facebook’s design jam were providing a valuable solution to their users while also using data to enhance the experience. The following are some of my insights from the collaborative design session:

Data minimisation at sign-up

Where possible, minimise the amount of data required at the sign-up stage. For example, Netflix offers its users recommendations of shows, but during sign-up only asks you to pick three preferred shows, learning more later as you use the service. The same can be applied to sign-up and contact forms. I always recommend that clients request as few form fields as possible, since asking for too much information upfront can deter users from wanting to connect with you.

Netflix minimises data collection at sign-up stage
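In code terms, data minimisation can be as simple as separating what a product truly needs at sign-up from what it can learn later. The sketch below is illustrative only; the field names and interfaces are hypothetical, not taken from any real product.

```typescript
// Hypothetical sketch of a minimal sign-up model.
// Only the essentials are collected upfront.
interface SignUpRequest {
  email: string;
  password: string;
}

// Richer profile data is gathered gradually, after sign-up,
// and every field is optional.
interface Profile {
  preferredShows?: string[];
  dateOfBirth?: string;
}

// A simple check a team could run against its own forms:
// does the sign-up form ask for anything beyond the essentials?
function isMinimalSignUp(requestedFields: string[]): boolean {
  const essential = new Set(["email", "password"]);
  return requestedFields.every((field) => essential.has(field));
}
```

A check like this could sit in a design-system lint step or a form-component test, flagging forms that creep beyond the essential fields.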

Consenting to policy notices is our choice and it’s okay to disagree

We are so accustomed to clicking “agree” on policy statements that we don’t even think twice about what we’re consenting to. Digital businesses should still give users the option to continue using the product or service even if they do not agree to the policy statement, for instance by allowing users to browse anonymously without requiring them to sign up or log in. Further, making anonymity the default option would be the ultimate show of respect for your users’ privacy, and can even help establish trustworthiness if you are a new business.
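An anonymous-by-default approach can be expressed directly in how sessions are created: declining the policy never blocks access, it simply keeps the session anonymous. This is a hypothetical sketch, not any particular product’s implementation; the type and function names are my own.

```typescript
// Hypothetical consent model: anonymity is the default state.
type ConsentLevel = "anonymous" | "personalised";

interface Session {
  consent: ConsentLevel;
  userId?: string; // only present once the user explicitly opts in
}

// Declining the policy statement still yields a working session.
// Personalisation is a layer the user opts into, not a gate.
function createSession(agreedToPolicy: boolean, userId?: string): Session {
  if (agreedToPolicy && userId !== undefined) {
    return { consent: "personalised", userId };
  }
  return { consent: "anonymous" };
}
```

The design choice here is that the product’s core path never branches on consent; only the personalisation features check for the opted-in state.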

We respond better to businesses that present privacy in a way that is friendly, in layman’s terms and even fun. 

Texts like privacy statements and policy notices were originally written by lawyers, but it has increasingly become the role of designers and copywriters to translate and present them in a way that is understandable and easy to follow. Firms like Facebook and Pandora are using storytelling in their sign-up processes to teach users about their rights and the use of their data, giving them a greater sense of control.

Facebook's user friendly privacy checkup

I found an excellent guide written by Rick Hennessy over at Frog’s blog on Nine Principles for Designing Great Data Products. To keep this article on the positive side, I left out the stories where privacy breaches turned grave and harmful: for instance, Facebook Beacon, which met with backlash and a lawsuit but ultimately paved the way for the more acceptable Facebook Connect for logins. Or, with more serious consequences, when Skype was forced to admit and apologise for a privacy breach in China that involved the censoring and tapping of logged messages, exposing the personal information of users who mentioned topics that were politically sensitive to the communist rule. These instances show that once a company’s privacy breach comes to public light, it can take a long and painstaking effort before customers trust the business again.

The near future will see a rise in data protection and security, which makes this an important time for designers to use our methods for positive impact. In our data-driven world, it is the businesses that openly address these issues that will gain the trust of users and a strong competitive advantage.