Day after day, phenomenal amounts of new data are generated. In 2017, IBM calculated that 90% of all the data in the world had been created in the previous two years. Data shapes people’s knowledge, decisions and daily lives: it is synonymous with power. And thanks to the relentless miniaturization of technological devices, data is collected every minute of every day.
At the same time, the processing power of technologies continues to grow and gain in intelligence to help everyone in their day-to-day life. Programs, platforms and other applications analyze every piece of data provided by billions of users: data about their comings and goings, the photos they upload, the stores they visit, what they purchase, their keystrokes and their behavior.
For those who create these technologies, there is virtually no limit to the level of detail these computational processes can achieve, resulting in algorithms built into applications that attempt to predict consumer desires. The greater the amount of data provided, the more accurate the algorithmic predictions, sometimes going so far as to anticipate wishes or needs even before individuals are aware of them. Often, people use these technologies with no control over the processing activities carried out or over the data they share with or entrust to service providers.
Legislation struggling to keep pace
The last few years have been marked by changes in laws and regulations governing data protection around the world. Both authorities and users recognize the need to regulate the technology industry’s data processing practices while ensuring greater transparency and allowing individuals to control what organizations do with their data.
These measures represent necessary progress, but are they sufficient? Do the data protection policies and other privacy statements presented on websites and service offerings provide the transparency and clarity that users demand to maintain control over their data?
Who reads these data protection policies?
Today, few people read the full terms and conditions or data protection policies when subscribing to new online services. They simply don’t have the time or inclination to force themselves to read legal jargon that often generates more doubts and questions than they initially had.
Although individuals have multiple rights under certain data protection regulations (such as the right to access the data organizations hold about them, to restrict certain processing activities, or to have personal information deleted), these rights alone are not enough, and further action is needed wherever no other laws or regulations apply.
Data protection by design
Data protection laws such as the General Data Protection Regulation (GDPR) require data protection principles to be incorporated throughout the life cycle of a product or service. We speak of data protection by design when this protection is built into technologies and systems by default. The concept was developed by Ann Cavoukian in the mid-1990s as a systems engineering approach, so it has been around for a while, yet it has never received the attention it deserves.
Regulations such as the GDPR require organizations not only to react to data and privacy breaches, but also to act upstream, designing and incorporating protection requirements into products in order to prevent possible incidents. Organizations should take steps to integrate data protection principles and requirements by design throughout the product and service development process, and they must question the personal data collection models they put in front of users.
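In code, privacy by default often comes down to something very simple: every data-sharing option starts disabled, and sharing begins only after an explicit opt-in. The sketch below illustrates the idea; all names (`PrivacySettings`, the individual options, the `enable` method) are illustrative assumptions, not taken from any specific product or regulation.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    # Every data-sharing option starts disabled ("privacy by default");
    # the user must opt in explicitly, never opt out of pre-enabled sharing.
    share_location: bool = False
    share_purchase_history: bool = False
    personalized_ads: bool = False
    analytics_tracking: bool = False

    def enable(self, option: str) -> None:
        """Record an explicit, informed opt-in for a single option."""
        if not hasattr(self, option):
            raise ValueError(f"Unknown privacy option: {option}")
        setattr(self, option, True)

# A new account shares nothing until the user acts.
settings = PrivacySettings()
assert not any(vars(settings).values())

# Sharing starts only after a deliberate user action.
settings.enable("share_location")
```

The design choice worth noting is that the protective state is the zero-effort state: a developer who forgets to configure anything still ships the most private behavior.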
Artificial intelligence and data protection
Until recently, organizations collected and processed all the data they could for the sake of convenience.
They operated on the principle that even if not all the data processed was usable right away, it might prove useful later. In machine learning, the argument for excessive collection was that the more data you have, the more accurate your models will be. One of the principles of data protection by design requires product developers and service providers to avoid excessive collection and to limit what they gather: organizations should collect and process only the data strictly necessary for the purpose at hand, because the more data they hold, the more risk they take on.
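One minimal way to apply data minimization in practice is an allow-list: before a payload is stored, drop every field not strictly required for the stated purpose, rather than keeping data "just in case". The field names and the `REQUIRED_FIELDS` set below are illustrative assumptions for a hypothetical signup flow.

```python
# Fields genuinely needed to create an account (illustrative assumption).
REQUIRED_FIELDS = {"email", "display_name"}

def minimize(payload: dict) -> dict:
    """Keep only allow-listed fields instead of collecting everything."""
    return {k: v for k, v in payload.items() if k in REQUIRED_FIELDS}

submitted = {
    "email": "user@example.com",
    "display_name": "Alex",
    "birthdate": "1990-01-01",   # not needed for this purpose -> discarded
    "location": "Paris",         # not needed for this purpose -> discarded
}

stored = minimize(submitted)
# stored == {"email": "user@example.com", "display_name": "Alex"}
```

The allow-list direction matters: a block-list lets every newly invented field through by default, whereas an allow-list makes extra collection an explicit decision that can be reviewed.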
Do we really have the means to act?
How many times has a user struggled to find the privacy settings in an app? Or been unpleasantly surprised to discover that some settings were enabled by default and were already actively sharing their data? Or had the impression of being pushed to approve a feature without really understanding what they were consenting to? Using dark patterns (design elements that deliberately deceive users or manipulate their behavior) to improve conversion is unacceptable. The ease of use of an app’s settings is just as important as the rest of its design: a product’s security and privacy features should be easy for users to find and understand, so that they can exercise their data protection rights and controls. Developers, designers and product managers have a responsibility to educate and empower their users.
Users often express concerns about protecting their data and respecting their privacy, but their online behavior does not reflect these concerns. This dilemma is known as the “privacy paradox”. So why are users’ actions not always in line with their stated demands? Perhaps because humans have a limited capacity to gather and analyze information before choosing how an application will share their personal information. Furthermore, it is impossible to predict with certainty how that information will be handled in the future and what impact it will have on users. Individuals are therefore at the mercy of unwanted effects, and while regulations attempt to protect and empower them, there is still work to be done.
The need to anticipate and design for all possible uses
The Ethics Centre in Australia has published a list of principles for ethical technology that anyone involved in product development should read. The list includes the following principles:
Merits. Just because you can do something doesn’t necessarily mean it’s good to do.
No to instrumentalisation. Never conceive of a technology in which people are just a cog in the machine.
Self-determination. Give as much freedom as possible to all those who will be affected by what is designed.
Responsibility. Anticipate and design for all possible uses.
Net benefit. Maximize the benefits, minimize the harm.
Equity. Treat similar cases similarly, and different cases differently.
Accessibility. Design with the objective of including the most vulnerable users.
Purpose. Design with a goal that is honest, clear and adequate.
These principles emphasize the need to anticipate and account for all possible uses upstream, at design time. This is, of course, very difficult advice to follow: the creators of the social media platforms now used by billions of people had probably not imagined or predicted the use cases and impacts their platforms would have a few years later. However, given how digital products, and their purposes, evolve over time, it is essential to regularly review the foreseeable uses and the associated risks, while keeping the principles of data protection by design in mind.
The industry must step up its data protection efforts
Technology and data can bring huge benefits, but they also come with risks. It is important to remember that products and services are shaped by the values of their creators. Those creators must understand that it is their responsibility to build the world they want to live in; it is up to them to invent it. If they don’t, someone else with a different vision will do it for them and create a world in their image.