In this post, Dr Phoebe Moore, Associate Professor of Political Economy and Technology in ULSB, discusses the implications of the recently introduced General Data Protection Regulation (GDPR) for work and workers.
By 25th May 2018, all companies with over 250 employees across Europe were required to comply with the General Data Protection Regulation (GDPR), a redrafting of the 1995 Data Protection Directive (DPD), Directive 95/46/EC. Here, I look at what this means for workers’ rights around privacy and consent to the data collected about us and its uses, as algorithms, wearable technologies and precise people analytics set out to monitor, predict, prescribe and assist our performance in workplaces.
In the case of the DPD, rules pivoted around consumer rights and ‘information relating to an identified or identifiable natural person’. Your street address, phone number and name were considered personal data. Your eye colour, height and the model of your car, within this remit, were not. EU terminology was designed to accommodate new technologies, however, and by 2012, email addresses, IP addresses and, in some cases, a photograph could be classified as data relating to identifiable people.
However, it soon became clear that personal data is not static: a collection of data points drawn from public records can be traced back to an individual and used to construct a picture of that person. Indeed, previously anonymized medical records have been excavated to gain information about people; school transcripts and church congregation data resurrected. Furthermore, over time, new technologies have permitted a swathe of new data collection possibilities, with accumulation at increasingly granular and even intimate levels, as well as new possibilities for storage, access and sharing. Not to mention the raft of data being collected by such behemoths as Facebook, which introduces a new range of possibilities for the mix of personal and public identification, the uses of this data and the legalities therein.
The GDPR is mandated for all organisations offering goods and services to customers in the EU, whether they are located inside the EU or not, and whether they own the personal data processed or not. The 261-page GDPR document contains significant technical detail, so there are many items for organisations to work through. As Ian Kilpatrick, executive VP for cybersecurity at Nuvias, points out, the document provides more information about ‘what is to be achieved’ than ‘what is to be done’. In this context, companies and organisations across Europe have been auditing and preparing for some potentially very disruptive requirements for their business practices and operations, hoping to achieve compliance and avoid the heavy fines that can be imposed.
Algorithms and people analytics for workplace decision-making eliminated?
Unprecedented concerns surrounding the accumulation and lifetime of data, and people’s rights to that data and to privacy (questions inspired and forced by a range of new technologies and new methods of data accumulation), have led to the introduction of this new European Regulation. Indeed, in the first two pages of the GDPR text, where the foundations for the new Regulation are set out, it is made clear that technological development is a key reason for the repeal and reconsideration of Directive 95/46/EC, indicating that:
(6) Rapid technological developments and globalisation have brought new challenges for the protection of personal data. The scale of the collection and sharing of personal data has increased significantly.
Technology has certainly impacted workplaces and work over time. In recent years, the rise of ‘gig work’, where the work made available is determined by algorithm; threats of automation in the manufacturing industry and in routine as well as non-routine jobs elsewhere; and big data based on new ‘people analytics’ techniques and corporate wellness initiatives in office environments involving biometric, health and sensory data acquisition have created significant new avenues for data collection and, now, significant regulation.
Data subjects will have the ‘right not to be subject to a decision based solely on automated processing, including profiling’
Key concerns for workers in these new technologically informed and driven worlds of work, as I told a Spanish journalist from Univision on 6th November 2017, are: what information is being collected by my manager/client/employer, how and why? How is this data being stored and for how long? Who has access to this data, and why? Workers and worker associations such as trade unions should be vociferously asking, can I get hold of my data? Will it be used in appraisals about my work, and how?
The last question could apply, for example, to judgements made about warehouse work. In fact, I communicated with one warehouse operative in 2016, who indicated that she and her colleagues were told armband data had been used to make decisions about firing and retention. However, workers were not given access to the data informing these decisions.
The practice of making decisions from purely automated data, as in this warehouse case, should be eliminated under the new Regulation. Indeed, Section 4, ‘Right to object and automated individual decision-making’, Article 22, ‘Automated individual decision-making, including profiling’, indicates that:
22(1): The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.
The foundations for the Regulation, listed in the first sections of the document, make it abundantly clear that:
(71): The data subject has the right not to be subject to a decision, which may include a measure, evaluating personal aspects relating to him or her which is based solely on automated processing and which produces legal effects concerning him or her or similarly significantly affects him or her, such as… e-recruiting practices without any human intervention. Such processing includes profiling that consists of any form of automated processing of personal data evaluating the personal aspects of a natural person, in particular to analyse or predict aspects concerning the data subject’s performance at work… reliability or behaviour, location or movements, where it produces legal effects concerning him or her or similarly significantly affects him or her.
These restrictions will put significant pressure on any company making decisions solely by algorithm, which potentially fully disrupts the Uber business model and operational practices. Uber taxi drivers gain work through an app that assigns customers purely by algorithm; their movements are entirely tracked and judgements about working practices made accordingly; and, worse, drivers can presently be deactivated if their client rankings are not high enough or they have not accepted enough rides. It is difficult to see how these practices will not be fully overhauled in the wake of the GDPR coming into force.
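The Article 22(1) test has two prongs: the decision must be based *solely* on automated processing, and it must produce legal or similarly significant effects. A minimal sketch may make the logic concrete; the class, field names and examples below are hypothetical illustrations, not drawn from any real compliance system:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    subject_id: str
    outcome: str              # e.g. "deactivate", "retain"
    automated_only: bool      # True if no human was involved
    significant_effect: bool  # legal or similarly significant effect, e.g. loss of work

def requires_human_review(decision: Decision) -> bool:
    """Flag decisions that Article 22(1) would prohibit if left solely
    to automated processing: both prongs of the test must be met."""
    return decision.automated_only and decision.significant_effect

# A purely algorithmic driver deactivation would need a human in the loop:
deactivation = Decision("driver-42", "deactivate",
                        automated_only=True, significant_effect=True)
assert requires_human_review(deactivation)

# A low-stakes automated suggestion does not trigger the rule:
nudge = Decision("driver-42", "suggest-busy-area",
                 automated_only=True, significant_effect=False)
assert not requires_human_review(nudge)
```

The point of the sketch is that removing *either* prong (adding genuine human intervention, or limiting the effect) takes a decision outside Article 22(1), which is why the authenticity of ‘human intervention’ matters so much below.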
The new Regulation also calls into question data-driven ‘people analytics’, under extensive experimentation by human resources professionals. This practice involves using ‘digital tools and data to measure, report and understand employee performance, aspects of workforce planning, talent management and operational management’. The tools allow ‘organizations to conduct real-time analytics at the point of need in the business process… [which] allows for a deeper understanding of issues and actionable insights for the business’ (Collins et al 2017). Miriam A Cherry (2016) indicates that people analytics is a nascent field that helps human resources make decisions through ‘(1) the search for new pools of quantitative data that are correlated with business and employment success, and (2) the use of such data to make workplace decisions and to replace subjective decision-making by managers’. In February 2017, Deloitte reported that 71 per cent of companies internationally see people analytics as a high priority for their organizations, but progress, it reported, has been slow.
However, the GDPR may slow people analytics even further, particularly where big data is being used to recruit workers, evaluate performance, provide better leadership, make hires and promotions, influence and improve job design, make decisions on compensation, and improve collaboration without the ‘human intervention’ the GDPR requires. Teach for America is applauded for its extensive use of people data to make hiring decisions as well as predictions about performance before hires are made, where ‘human intervention’ would be difficult to achieve given the lack of interaction with data subjects at the earliest stages. A little late in the game, Google advises in its People Analytics guidance that a data-driven approach provides the best way to ‘inform your people practices, programs and processes… reporting and metrics to predictive analytics [to help you] uncover new insights, solve people problems [italics added] and direct your HR actions’. ‘People problems’ could, of course, mean ‘who to fire’.
‘People problems’ in people analytics could, of course, mean ‘who to fire’.
Further to these practices, which will be much challenged in the wake of the GDPR, wearable devices in factories and warehouses where Industrie 4.0 is being introduced track workers’ movements and store extensive data about their performance, toilet breaks and minutes spent on consoles (see the example of the warehouse operative above). In professional workplaces, devices with heat sensors are being used to store information about how long workers spend at desks, as in the case of OccupEye, briefly used at the Telegraph, and to record workers’ tones of voice and gestures, as Humanyze has experimented with. Under the GDPR, these activities will come under great scrutiny, as companies will need to make clear what data is being used and why, demonstrate the authenticity and reality of ‘human intervention’ if decisions are made on the basis of data analytics, and, of course, gain explicit consent from data subjects in the first instance.
In the British Academy/Leverhulme project I have just completed, the company I studied, which carried out the Quantified Workplace experiment involving Fitbits, RescueTime and daily lifelogs, was queried by the Dutch Personal Data Protection Agency. While employees had consented to participation in the project, the Agency asked the company: ‘can there ever be a consenting relationship between an employee and employer?’
Indeed, one of the key areas the GDPR develops in data protection is consent. The definition of ‘consent’ in the DPD was ‘any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed’. The GDPR definition adds detail regarding how consent is given and states in Art 4(11) that consent is: ‘any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her’.
At present, consumers are required to agree to often very detailed and rather opaque terms and conditions by ticking a box, after entering significant amounts of personal data into an online form, before gaining access to most online services. The ICO has published Draft Guidance documents, including guidance on consent, which indicates that the ‘GDPR sets a high standard for consent’. Most of the advice on consent compliance centres on consumers, indicating for example that ‘consent requires a positive opt-in. Don’t use pre-ticked boxes or any other method of consent by default’. Further, the Guidance recommends making it easy to withdraw consent and to ‘tell them how’. Even more interestingly, companies should ‘avoid making consent a precondition of service’. This is quite a significant departure from the widespread use of the tick box required for online services and apps, or even to buy theatre and circus tickets!
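The ICO’s three requirements (positive opt-in, easy withdrawal, no consent by default) can be sketched as a simple consent record. This is an illustrative model only; the class and method names are my own, not from the Guidance or any real library:

```python
from datetime import datetime, timezone

class ConsentRecord:
    """Hypothetical consent record reflecting the ICO draft guidance:
    no consent by default, and withdrawal as easy as granting."""

    def __init__(self, subject_id: str, purpose: str):
        self.subject_id = subject_id
        self.purpose = purpose
        self.granted_at = None    # no pre-ticked box: starts without consent
        self.withdrawn_at = None

    def grant(self) -> None:
        """Record an explicit, affirmative opt-in, with a timestamp."""
        self.granted_at = datetime.now(timezone.utc)
        self.withdrawn_at = None

    def withdraw(self) -> None:
        """Withdrawal is a single call, mirroring the ease of granting."""
        self.withdrawn_at = datetime.now(timezone.utc)

    def is_valid(self) -> bool:
        return self.granted_at is not None and self.withdrawn_at is None

record = ConsentRecord("worker-7", "wellness programme analytics")
assert not record.is_valid()  # nothing is assumed by default
record.grant()
assert record.is_valid()
record.withdraw()
assert not record.is_valid()
```

Note what the sketch deliberately lacks: there is no way to construct a record that starts in the consented state, which is the software analogue of banning the pre-ticked box.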
Explicit reference to the implications of the GDPR for workers is far less detailed than for consumers, but the ICO’s Draft Guidance states explicitly that ‘public authorities and employers will find using consent difficult’ and that ‘employers and other organisations in a position of power are likely to find it more difficult to get valid consent’. However, ‘consent’ is not the only way to ensure compliance with the GDPR in processing personal data, as Information Commissioner Elizabeth Denham pointed out in a blog post in August 2017. Local authorities process council tax information, banks share data for fraud protection and insurance companies process claims information, for example, and for each of these a different lawful basis, not ‘consent’, is used to process personal information.
Public authorities and employers will find using consent difficult. In any case, to process data, a company or organisation must ‘identify a lawful basis before you start’. The GDPR provides further ways to process data that could be ‘more appropriate than consent’, Ms Denham indicates. Looking through these other ways, there are a number of provisions that are relevant, for example, to public authorities and highly regulated sectors where the ‘public interest’ is upheld:
6(1)(a) – Consent of the data subject
6(1)(b) – Processing is necessary for the performance of a contract with the data subject or to take steps to enter into a contract
6(1)(c) – Processing is necessary for compliance with a legal obligation
6(1)(d) – Processing is necessary to protect the vital interests of a data subject or another person
6(1)(e) – Processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller
6(1)(f) – Necessary for the purposes of legitimate interests pursued by the controller or a third party, except where such interests are overridden by the interests, rights or freedoms of the data subject
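Ms Denham’s point that organisations must ‘identify a lawful basis before you start’ translates naturally into a gate at the front of any processing pipeline. The following is a minimal sketch under my own naming assumptions (`start_processing` and its messages are hypothetical), showing the six Article 6(1) bases as the only ways through:

```python
from typing import Optional

# The six lawful bases of Article 6(1), keyed by shorthand labels of my own:
LAWFUL_BASES = {
    "consent",               # 6(1)(a)
    "contract",              # 6(1)(b)
    "legal_obligation",      # 6(1)(c)
    "vital_interests",       # 6(1)(d)
    "public_task",           # 6(1)(e)
    "legitimate_interests",  # 6(1)(f)
}

def start_processing(purpose: str, lawful_basis: Optional[str]) -> str:
    """Refuse to begin processing unless a lawful basis was identified first."""
    if lawful_basis not in LAWFUL_BASES:
        raise ValueError(
            f"No lawful basis identified for '{purpose}': processing may not begin."
        )
    return f"processing '{purpose}' under basis '{lawful_basis}'"
```

A bank sharing data for fraud protection might pass `"legal_obligation"`; an employer running payroll might pass `"contract"`; calling `start_processing("fitness tracking", None)` raises an error, reflecting that for employers ‘consent’ will often be unavailable and some other basis must be found before any data is touched.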
Article 9 of the GDPR, however, stresses the following:
Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited.
This prohibition on data collection is not unlike that of the previous DPD, but the GDPR document then lists conditions under which paragraph 1 does not apply. The sections with relevance to workers are 9(2)(b) and 9(2)(h):
9(2)(b) – Processing is necessary for carrying out obligations under employment, social security or social protection law, or a collective agreement
9(2)(h) – Processing is necessary for the purposes of preventative or occupational medicine, for assessing the working capacity of the employee, medical diagnosis, the provision of health or social care or treatment or management of health or social care systems and services on the basis of Union or Member State law or a contract with a health professional
Decisions on this basis will be made without worker consultation in some or even most cases, and the Regulation puts a lot of responsibility for ethical practices on human resource departments and employers themselves. For example, the ‘working capacity’ of the employee is not defined in any detail, potentially putting a lot of onus on companies, or even the government, to provide useful and good guidance on fitness to work, a concept with very negative connotations in recent UK history.
Nonetheless, employees, as data subjects, should gain significant rights under the GDPR. In general, employees’ rights gained with the introduction of this Regulation should give people:
- The right to be informed, which encompasses the obligation on employers to provide transparency as to how personal data will be used;
- The right of access, similar to those rights under the DPA and encompassing the ever-popular subject access request;
- The right to rectification of data that is inaccurate or incomplete (again similar to the DPA);
- The right to be forgotten under certain circumstances;
- The right to block or suppress processing of personal data (similar to the DPA); and
- The new right to data portability which allows employees to obtain and reuse their personal data for their own purposes across different services under certain circumstances. (Creed 2017)
So, the rolling out of the GDPR under the banner of the UK’s Data Protection Bill has significant implications for employers and it will be interesting to see how these new requirements change the employment relationship, perhaps forever.
Unions and in conclusion
The Trades Union Congress (TUC) recently published Shaping our Digital Future (04/09/2017) which recommends the following to the UK government:
- Set a mission for the UK to be a top five digital economy by 2030
- Establish a commission on the future of work, engaging unions, business and civil society in how technology should be introduced
- Ensure that workers have a say in the introduction of technology at company and sector level, with new sectoral institutions to bring unions and business together.
- Diversify the tech workforce, with a target to double the proportion of female STEM graduates in ten years.
The United Kingdom, despite leaving the European Union, will still have to follow suit if it intends to continue doing business with European partners. Employment law will presumably also still apply, but there are significant unknowns in that regard.
In conclusion, the regulation of the introduction of any technologies in workplaces and any digitalized management methods which involve new methods to gather data about people and possibilities for usage, which the GDPR promises, should involve consultation with unions and worker councils across Europe. These entities should be taking note of the new rights workers should have with the introduction of the GDPR, inform memberships, and press the European Parliament to ensure protections are explicit and enforceable.
A version of this blog was published on 7th November 2017 on Phoebe’s personal blog site: https://phoebevmoore.wordpress.com/2017/11/07/the-gdpr-algorithms-and-people-analytics/