On 28 January 2021, the data privacy communities of the world celebrated Data Protection Day, an international effort to create awareness about the importance of respecting privacy, safeguarding data and enabling trust.
The world is witnessing the rise of a new economy, the data economy, where data is the raw material. From driverless cars to humanoid robots and from intelligent personal assistants to smart home devices, the world is undergoing a fundamental change, transforming the way we live, work and play. During the coronavirus pandemic, new digital solutions are being used to help detect and track the virus, such as tracing apps or apps collecting health data such as body temperature.
Data has become critical to all aspects of human life. It is forecast that by 2025, the global datasphere (i.e. the sum of all data created, captured and replicated on our planet) will grow to 163 zettabytes (a zettabyte being a trillion gigabytes). In just 10 years, society has witnessed the transition from analog to digital. What the next decade will bring through the power of data is virtually limitless.
Imagine being awakened by a virtual personal assistant, being served coffee and toast made by a robot, being delivered a hot croissant in record time by a drone, or being advised on what clothing to pick from your wardrobe to best suit the weather report and your schedule for the day. This scenario is not science fiction, but reality in our rapidly growing digital existence. This digital existence is only possible when data is transformed to unleash its immense potential. The question, however, is: who is the winner in all this? Is it the individual, who supplies his personal data?
Not at all. The winners, by far, are the tech giants, which are earning untold riches out of data. The data economy is dominated by five big tech platforms which are collectively worth USD 5.6trn. This corporate inequality is largely the result of network effects. A firm that can collect a lot of data, for instance, can make better use of artificial intelligence and attract more users, who in turn supply more data. Such firms can also afford the best data scientists and have the means to buy the best AI startups. It is a perfect example of size begetting size.
This situation, however, needs to change. Individuals need to feel protected and to be able to trust that their data will not be misused. They need more control over their data, and they need to be made fully aware of how their data will be used before they consent to the processing. For example, if an individual visits a website which contains the Facebook icon, the individual needs to be told in no uncertain terms that Facebook can use information collected by that website to send adverts for the website's products to his Facebook newsfeed. In this scenario, Facebook gets paid for showing the website's adverts to the user, but the latter gets nothing out of it, apart from being shown adverts he did not ask for in the first place. Was proper consent, satisfying all the conditions (freely given, specific, informed and unambiguous), obtained for this highly intrusive behavior? Most probably not.
This type of behavior can be very far-reaching and can represent high risks to the rights and freedoms of data subjects. The EU General Data Protection Regulation (the "GDPR"), which inspired our own Data Protection Act 2017, has set new international standards in matters of data privacy and is a leap towards giving individuals more control over their personal data and imposing more restrictions on controllers (i.e. those processing personal data). Controllers need to process data lawfully, fairly and transparently, and they have a duty of accountability to the individuals whose data they are processing. Controllers also need to have in place policies and processes so that the personal data of data subjects, be they employees, clients, service providers, directors or shareholders, are processed in accordance with data privacy laws. Privacy challenges cannot be underplayed, as data breaches can put companies out of business, not only because of the sanctions involved (a fine of Rs 200,000 and five years' imprisonment under the Data Protection Act 2017; under the GDPR, fines of up to €10 million or 2% of worldwide annual turnover, or up to €20 million or 4%, whichever is higher) but because of the reputational damage they may create, which may be irreversible.
We may have proper laws and proper enforcement mechanisms, but if consumers themselves act recklessly by trading off their personal data, and consequently their privacy, against what they believe to be "free" online services, they will continue to be easy prey for unscrupulous controllers. Individuals should be cautious at all times, especially when they are asked to give their consent or to agree to terms and conditions. Clicking on the "consent" or "agree" button, though tempting, may open the floodgates to an erosion of their privacy.
According to DLA Piper's latest annual GDPR Data Breach Survey 2020, last year was notable for a significant increase in regulatory enforcement action. Data protection authorities, not only in Europe but increasingly around the world, have become much more active and are starting to use their powers far more assertively to issue material fines.
Even in Mauritius, as people become more aware of their rights over their personal data, the number of breach notifications to the Data Protection Office is increasing. Last year, the Commissioner issued five decisions concerning alleged unlawful disclosure or use of personal data or unlawful use of CCTV cameras. We foresee that the Data Protection Office will, in the near future, follow international trends and become more active in its enforcement role.
This article was published in the Business Magazine Mauritius on 03 February 2021.