The Liability of Internet Intermediaries

Contents

1 Overview of the data protection regime 10.04
  1.1 Background to the Directive 10.06
  1.2 Basic concepts 10.11
  1.3 Territorial scope of the Directive 10.39
  1.4 Material scope of the Directive 10.53
2 Duties of data controllers 10.88
  2.1 The data protection principles 10.91
  2.2 Notification 10.123
  2.3 Electronic communications 10.141
3 Rights of data subjects 10.159
  3.1 The right of access 10.160
  3.2 The right of rectification, erasure, or blocking 10.164
  3.3 The right to object 10.183
  3.4 The Google Spain decision 10.199
  3.5 Practical consequences 10.213
4 Remedies for breach 10.247
  4.1 Compensation 10.247
  4.2 Injunctive relief 10.252
  4.3 Enforcement by the Commissioner 10.254
5 The Data Protection Regulation 10.258

10.01 This chapter examines the liability of internet intermediaries for contraventions of the data protection regime. Data protection duties, like those upholding rights of privacy and confidentiality, can impose significant burdens upon internet intermediaries. This is because much of the information in which these services deal will contain ‘personal data’, and in some cases sensitive personal data, while almost all of the activities undertaken by them will involve some form of ‘processing’ of those data.

10.02 Some data protection duties are fault-based while others are strict. This distinction is important for two reasons. First, liability for breach of these duties may fall outside the safe harbour regime established by the E-Commerce Directive. Second, these duties may require service providers to act in ways that are at odds with their normal business or which require alterations to their technology. The so-called ‘right to be forgotten’, considered in sections 3.2–3.5 of this chapter, is one such example.

10.03  Nature of liability. It should be appreciated that breach of these duties creates primary, not secondary, liability. In Google Spain SL v Agencia Española de Protección de Datos,1 the Advocate General considered that where a search engine unlawfully processes data in search results this involves ‘secondary liability’. However, this is not an accurate description, at least under English conceptions of secondary liability. In that case, it was common ground that the primary publication of data on the third party website was lawful. There was therefore no primary wrongdoing and, in its absence, there could be no secondary wrongdoing. Instead, data protection duties are better understood as examples of primary liability: duties which are imposed upon search engines and other service providers as data controllers. Nevertheless, certain analogies can be drawn with secondary liability cases, since the acts giving rise to unlawful processing are partly those of third parties, such as website operators and users who carry out searches.

10.04  Legislative framework. The Data Protection Act 1998 transposes Directive 95/46/EC.2 Together they regulate how data controllers, including many service providers, may store and process personal data in the United Kingdom and elsewhere in the EU. These instruments impose several important duties upon internet services in relation to data concerning specific individuals. Equivalent obligations apply in other member states and in many other jurisdictions, whose national privacy statutes share many of the same themes and embody similar data protection principles.3

10.05  Interpretation of the 1998 Act. Because the 1998 Act essentially gives effect to the Directive, which is intended to harmonise national data protection law throughout the EU, the Directive is the proper starting point for questions of interpretation arising under the Act. As the Court of Justice has explained, approximation of national data protection laws under the Directive is ‘generally complete’.4 Consistent with the practice adopted in trade mark and copyright cases, it is therefore preferable to refer directly to provisions of the Directive and that is the approach taken in this chapter.5 Where the Directive permits member states to specify the conditions for exercising rights or to derogate from protection, relevant provisions of the 1998 Act will be considered.

10.06  Treaty basis. The Directive is founded upon article 16 of the Treaty on the Functioning of the European Union, which provides express recognition for the right to protection of personal data concerning individuals. Article 16(2) requires the European Parliament and Council to lay down rules concerning the free movement of personal data, as well as processing carried out by member states and Union institutions.6

10.07  Purposes. The Directive has three main purposes. Its first and principal aim is to ensure the free movement of personal data by establishing uniform rules for cross-border data flows within the EU and by harmonising national law, subject to a ‘margin for manoeuvre’ in certain areas.7 Second, it aims to safeguard the fundamental rights and freedoms of natural persons under the European Convention, in particular article 8, and to ensure a high level of protection for those rights throughout the EU, subject to certain justified interferences.8

Third, the Directive recognises and foresees rapid advances in information technology which increase the ease and frequency of processing and exchanging personal data, particularly by means of ‘new telecommunications networks’, and aims to accommodate future developments.9 These aims clearly overlap to some degree.

10.08  Interpretation of the Directive. The Directive must be interpreted to achieve these aims and, in particular, so as to be compatible with fundamental rights. This interpretative obligation follows from the Charter and Treaties, and has been accepted by the Court of Justice in relation to the Directive.10 Article 7 of the Charter is in nearly identical terms to article 8 of the Convention and has the same meaning;11 article 8 of the Charter goes further. Entitled ‘Protection of personal data’, it provides that:

1. Everyone has the right to the protection of personal data concerning him or her.

2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.

3. Compliance with these rules shall be subject to control by an independent authority.

10.09 Article 8(2) gives express recognition to rights of access and rectification, which are discussed later in this chapter. This indicates that an expansive approach is likely to be taken to these rights, which are provided for in the Directive.

10.10  Extrinsic materials. When interpreting the Directive, reference is often made to the opinions and recommendations of the Article 29 Working Party. This is an advisory group established (as its name suggests) under article 29(1) of the Directive. Its role is, among other things, to give opinions, advice, and recommendations to the Commission concerning the Directive and the PEC Directive, particularly in relation to the protection of data protection rights and freedoms in electronic communications.12

10.11  Terminology. The vocabulary of the data protection regime can be somewhat alien to a common lawyer, so the basic concepts are introduced in the following paragraphs. Sections 1.3 and 1.4 then consider the territorial and material scope of the data protection regime, respectively.

10.12  Definition. The basic unit of protection established by the Directive and the 1998 Act is ‘personal data’, which is defined in article 2(a) of the Directive to mean:

any information relating to an identified or identifiable natural person (‘data subject’); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity...

10.13  The purposive approach. In Durant v Financial Services Authority, the Court of Appeal construed ‘personal data’ more narrowly in the context of a subject access request. The Court distinguished between ‘mere mention’ of a data subject, which is not necessarily personal data, and information which ‘affects his privacy, whether in his personal or family life, business or professional capacity’.13 Information might fall within the second category if it is biographical, in the sense of going beyond recording the individual’s involvement in some transaction or ‘life event’ that has no personal connotations. Additionally, the information should have the data subject as its ‘focus’ rather than as a peripheral feature. As the Court recognised in Durant, this narrow approach relies on a purposive construction of section 7 of the 1998 Act, and does not reflect the preferred approach of the Court of Justice in relation to ‘personal data’ elsewhere in the Directive.14

10.14  Identification of an individual. The broader approach is that ‘personal data’ is information that identifies, or can be used to identify, an individual human being. In determining whether a person is identifiable, it is relevant to consider all means which are reasonably likely to be used either by the data controller or any third party to identify the individual. This definition excludes data rendered anonymous in such a way that the data subject is no longer identifiable.15 In determining whether identification is possible, the likelihood of the data subject being re-identified in the future must also be considered.

10.15  Correlation with an individual. Internet services frequently deal with IP address data.16 Such data may (and, in practice, will often) be personal data because they can be used, in conjunction with subscriber or log information, to identify a living individual. For example, ISPs can correlate an IP address with historical subscriber records to determine the individual to whom an IP address has been allocated. This may be because IP addresses are allocated on a ‘static’ basis, in which case there is a one-to-one correlation between the address and a specific subscriber account, or because they are allocated on a ‘dynamic’ basis to different subscribers in succession, in which case there are likely to be records showing which IP address was allocated to a subscriber at specific points in time.17
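The correlation exercise described above can be illustrated with a short, purely hypothetical sketch (the log entries, account identifiers, and function are invented for illustration): a dynamic allocation log records which subscriber held an address during each period, and a timestamped lookup resolves the address to an account.

```python
from datetime import datetime

# Hypothetical allocation log: each entry records which subscriber account
# held an IP address during a given period (dynamic allocation).
allocation_log = [
    {"ip": "203.0.113.7", "subscriber": "ACC-1001",
     "start": datetime(2024, 1, 1, 9, 0), "end": datetime(2024, 1, 1, 17, 0)},
    {"ip": "203.0.113.7", "subscriber": "ACC-2002",
     "start": datetime(2024, 1, 1, 17, 0), "end": datetime(2024, 1, 2, 9, 0)},
]

def subscriber_for(ip, at):
    """Return the subscriber account to which `ip` was allocated at time `at`."""
    for entry in allocation_log:
        if entry["ip"] == ip and entry["start"] <= at < entry["end"]:
            return entry["subscriber"]
    return None  # no record: without correlation the address may not be personal data

print(subscriber_for("203.0.113.7", datetime(2024, 1, 1, 12, 0)))  # → ACC-1001
```

If no allocation record survives (as with the internet cafe example in paragraph 10.21 below), the lookup fails and the address cannot be tied to an individual.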

10.16  Subscribers of ISPs. In Productores de Musica de España v Telefónica de España SAU, the Court of Justice concluded that information being sought by a collecting society relating to the names and addresses of users of the KaZaA file-sharing service involved the making available of personal data.18 It was common ground that the defendant ISP, Telefónica, stored and processed personal data. As the Advocate General also explained, at the point when the claimant had obtained the users’ IP addresses, that also involved the processing of personal data and was subject to the Directive.19

10.17 Similarly, in R (British Telecommunications plc) v Secretary of State, it was common ground that an ISP’s obligations under the Digital Economy Act 2010 require it to process personal data in the form of IP addresses.20 There was some dispute about the extent to which an IP address, when associated with a particular record of download activity, might reveal special categories of personal data within article 8 of the Directive, such as information relating to racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, or relating to the health or sex life of an individual.

10.18  Secondary characteristics. In BT, the applicant pointed out that an identifiable person may be linked through an IP address to the accessing of material which reveals these kinds of secondary characteristics of the person. The example was given of data which ‘tend to show unusual sexual proclivities’ (as in the case of downloaded pornographic images). It was not necessarily the case that a particular subscriber had actually accessed the material or possessed such proclivities but, as the Court dryly observed, the inference would not be that such a subscriber was downloading such material ‘for the purposes of scholarly research into the efficacy (or lack thereof) of the Obscene Publications Act 1959’.21 The Court accordingly held that IP address data would be personal data, and some of it would be sensitive personal data.

10.19  Video surveillance footage. Indirect support for this conclusion may be found in Ryneš, where the Court of Justice held that the meaning of personal data encompassed video surveillance footage which contained images of a person.22 Where processing of those data covered, in part, a public space (namely the street outside the defendant’s home), it was not activity of a purely ‘personal or household’ character and so was not exempt from the requirements of the Directive.

10.20  Ability to process stored data. The defendant in Ryneš had no direct means of accessing the recorded footage, though it was stored on a hard disk for a period of time and ultimately handed over to police in order to identify two suspects. However, although the defendant may not have been able to identify the data subjects from the surveillance data, it was possible for third parties, such as the police, to do so. This suggests that IP addresses may amount to personal data even where the immediate data controller does not itself hold the necessary means of associating the data with a specific data subject, provided that a third party may do so using ‘all the means likely reasonably to be used...by any other person to identify the said person’.23

10.21  IP addresses that are not personal data. Occasionally, IP addresses may not permit identification of an individual because there is no available or likely means of correlation with a specific account holder. For example, the operator of an internet cafe or wireless access point may not record or retain logs of particular individuals who used the service. Such IP addresses are accordingly unlikely to be personal data in those circumstances.

10.22  Combining access logs and IP addresses. A host, search engine, or website operator will often be in a position to correlate the IP address of a visitor to a website with the details of a registered account holder or member who has accessed the intermediary’s facilities at a particular time.24 Hosts commonly store logs recording the IP address of those who access a web server; although these logs tend to be recycled or overwritten regularly, they are likely to contain personal data. Whether a website operator is able to use an IP address to identify an otherwise anonymous visitor will depend on the availability of other data, such as ISP records, which could be used indirectly to identify an individual. As the Working Party concluded, ‘to be on the safe side’ these IP addresses will generally need to be treated as personal data.25

10.23  Query records. Search histories may also amount to personal data. In opinion number WP 148, the Article 29 Working Party concluded that the record of an individual’s search queries is personal data if the individual is identifiable from them (eg through ‘vanity searches’ for their own name, by inference from other search activity, or from their registered account details or IP address). The Working Party further recognised that a search history ‘contains a footprint of that person’s interest, relations and intentions’ and therefore falls within article 8 of the Convention.26 This material was referred to by Tugendhat J in Vidal-Hall v Google Inc to conclude that there was a good arguable case that search history data was personal data within the meaning of the Directive and the 1998 Act,27 a conclusion upheld on appeal.28

10.24  Tracking cookies. For similar reasons, tracking cookies that collate browsing activity and uniquely identify website visitors are likely to constitute personal data. Examples are the Google ‘PREF’ cookie, the Facebook ‘OpenGraph’ cookie, the DoubleClick ‘ID’ cookie, and other persistent third party cookies. These records are created during normal web browsing activity and are designed to recognise a particular web browser whenever the user visits a suitably configured website. By doing so, it is possible to obtain information about what websites a user has visited, the date and time of such access, the duration of a visit, and the IP address of the visitor. This information may also be correlated with other access logs or user accounts (such as a Google or Facebook account). Taken together, these records create at least a reasonable likelihood of identifying the individual user and could, for similar reasons to those in BT, also involve sensitive personal data.
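A minimal sketch may clarify the collation described above (the cookie identifiers, URLs, and account table are all invented for this example): page views sharing one persistent cookie identifier can be assembled into a browsing profile and, where a login record exists, tied to a named account.

```python
# Hypothetical page-view records of the kind a persistent third-party
# cookie produces: each entry is keyed by a cookie ID that uniquely
# identifies a particular web browser.
page_views = [
    {"cookie_id": "abc123", "url": "https://example.org/news",
     "when": "2024-01-01T10:00", "ip": "198.51.100.4"},
    {"cookie_id": "abc123", "url": "https://example.org/health/clinic",
     "when": "2024-01-01T10:05", "ip": "198.51.100.4"},
    {"cookie_id": "def456", "url": "https://example.org/sport",
     "when": "2024-01-01T11:00", "ip": "198.51.100.9"},
]

# A separate, hypothetical table linking cookie IDs to logged-in accounts.
accounts = {"abc123": "user@example.com"}

def browsing_history(cookie_id):
    """Collate all page views sharing one cookie ID; attach an account if known."""
    urls = [v["url"] for v in page_views if v["cookie_id"] == cookie_id]
    return {"account": accounts.get(cookie_id), "urls": urls}
```

Even where no account record exists, the collated URLs themselves may reveal the secondary characteristics discussed in paragraph 10.18.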

10.25 A ‘data subject’ is a natural person to whom personal data relate. Every other person (besides the data controller and processor) is referred to as a ‘third party’.

10.26  Definition of ‘processing’. The main regulated activity is ‘processing of personal data’, also referred to as ‘processing’. Article 2(b) of the Directive defines processing to mean:

any operation or set of operations which is performed upon personal data, whether or not by automatic means, such as collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction...

This is an extremely broad definition. It also includes ‘publication’ in both print and electronic form.29 All internet intermediaries will engage in at least some ‘processing’ of personal data.

10.27  Examples of processing. In the Google Spain case (discussed further in section 3.4) the Court described the following activities carried out by Google as examples:

in exploring the Internet automatically, constantly and systematically in search of the information which is published there, the operator of a search engine ‘collects’ such data which it subsequently ‘retrieves’, ‘records’ and ‘organises’ within the framework of its indexing programmes, ‘stores’ on its servers and, as the case may be, ‘discloses’ and ‘makes available’ to its users in the form of lists of search results. As those operations are referred to expressly and unconditionally in article 2(b)...they must be classified as ‘processing’ within the meaning of that provision.30

10.28  Undifferentiated processing. Processing can occur whether or not personal data are distinguished from other data. Thus, a host processes personal data when it (1) ‘collects’ personal data that is uploaded by users as part of a wider set of material and (2) ‘stores’ such data on its servers. A social network also ‘collects’ such data, which it then ‘organises’ in a database, ‘stores’ on its servers, and ‘retrieves’ and ‘makes available’ to users in feeds and on profile pages, and so on. Virtually any dealing with user-supplied data will involve processing personal data at some point.

10.29  Definition of ‘controller’. Article 2(d) defines a ‘controller’ to mean a body31 that determines32 how and for what purposes personal data may be processed.33 This includes any person who, under the direct authority of the controller, is authorised to process the data. The determination may be that of the body acting alone or jointly with others.

10.30  Elements of the definition. The concept of a ‘controller’ is autonomous and must be interpreted according to EU law. The definition contains three elements: first, personality—the data controller can be a natural or legal person, public authority, agency, or other body; second, contribution—the data controller need only contribute to a determination about personal data jointly with another controller; and third, functional control—the data controller must be able to exercise responsibility for how and why personal data are processed.34

10.31  Scope of the definition. The concept of a data controller is central to the data protection regime because it determines the scope of the data protection principles and the duties owed to data subjects, as well as the scope of liability to compensate data subjects for breaches of those duties. The definition in the Directive was intended to ensure effective protection of personal data in light of new and unforeseen technological developments. As such, it is a technology-neutral definition which applies regardless of the medium and manner of control. A teleological construction therefore suggests that the definition should be construed widely, notwithstanding that many modern internet services did not exist at the time when it was drafted and adopted. Nevertheless, it is clear that the essential element of the definition is that the data controller must, in fact, exercise control over the personal data in question.

10.32  The focus of the inquiry. In Google Spain, the CJEU held that the definition of ‘controller’ requires the relevant activity to be examined at a very high level of abstraction, focussing on the overall service that involves processing personal data. If a person determines the purposes of and means used in carrying out a business activity, and then proceeds to process some personal data within the ‘framework’ of that activity, then the person will probably be a ‘controller’ in respect of that processing.35 In other words, if a person is in a position to control how it deals with data generally during the course of its business activities, it will be a data controller with respect to any personal data that it happens to process while carrying out those activities.

10.33  Criticism of the Google Spain approach. Such a wide definition is vulnerable to serious criticism. First, it might be considered important, based on a natural reading of the words used in article 2(d), to require that a data controller be someone who determines how to process ‘personal data’ (emphasis added), as distinct from other kinds of data or data generally. In Google Spain, the Advocate General preferred a narrow construction of ‘data controller’ which excluded services that merely determine how to process data ‘in a haphazard, indiscriminate and random manner’ without singling out personal data for different treatment. In the Advocate General’s view, the essence of the concept of a data controller is someone who has taken responsibility for the processing of personal data, which entails an ‘awareness’ of the existence of such data as a defined category of information. The consequence of this approach would have been that a service provider could not be a data controller unless it was aware of personal data and had dealt with it ‘in some semantically relevant way and not [as] mere computer code’.36

10.34 In Google Spain, the Advocate General was of the opinion that general purpose search engines such as Google should not be considered ‘data controllers’ under the Directive but instead as ‘merely supplying an information location tool’. Such a service provider:

does not exercise control over personal data included on third-party web pages. The service provider is not ‘aware’ of the existence of personal data in any other sense than as a statistical fact web pages are likely to include personal data. In the course of processing of the source web pages for the purposes of crawling, analysing and indexing, personal data does not manifest itself as such in any particular way.37

This ‘entirely passive’ and ‘intermediary’ function of search engines must be distinguished from situations where they exercise ‘real control’ over personal data (eg in relation to personal data collected from their own users and advertisers).

10.35  Further difficulties. The Advocate General’s approach carries certain difficulties of its own. First, it imports a mental state into a definition which does not expressly refer to knowledge or intent. Second, it would seriously restrict the operation of the Directive, since algorithmic processing of data per se would not be covered unless the responsible party was specifically ‘aware’ of personal data. This appears to be contrary to the definition of ‘processing’ in article 2(b), which includes processing ‘whether or not by automatic means’ and as part of wider processing of undifferentiated data. Third, it could lead to arbitrary results; for example, a service provider might become a data controller once notified of personal data, while another who carried out identical processing of the same data would not. Like the anti-moderation paradox discussed in chapter 8, this would perversely discourage intermediaries from identifying and protecting personal data, leading to self-induced data agnosticism.

10.36  Control over personal data. A better distinction is that to be a data controller requires a reasonable degree of control over personal data—in particular, how such data are processed. Determining how personal data are processed entails an ability to identify and deal with those data as distinct data structures. If, for example, a service uses algorithmic or other automated means to separate personal data from other data (eg by using natural language processing to extract a proper noun or identification number and cross-referencing that against a database of living natural persons), then it could be said to exercise control over those data. If, however, this is not possible, or the data are not structured in this way, then the service cannot be said to determine how personal data are processed, except in the most general sense that it remains free not to process any data at all, or to process all data in a different and undifferentiated way. Whether it is possible to distinguish between data would, of course, be a matter for expert evidence, and may depend on future advances in probabilistic natural language processing technology.38
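The kind of automated differentiation contemplated here might, as a rough sketch, look like the following (the pattern, the register of known data subjects, and the function are hypothetical; real systems would rely on far more sophisticated probabilistic methods):

```python
import re

# A crude, purely illustrative differentiation step: flag text fragments
# containing a pattern resembling a UK National Insurance number, or a
# name appearing in a (hypothetical) register of known data subjects.
NI_PATTERN = re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b")
known_subjects = {"Jane Doe"}

def contains_personal_data(fragment):
    """Return True if the fragment appears to contain personal data."""
    if NI_PATTERN.search(fragment):
        return True
    return any(name in fragment for name in known_subjects)
```

A service able to run a filter of this kind could be said to ‘single out’ personal data for distinct treatment; one that processes all data as undifferentiated code could not.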

10.37 The reason why control matters is obvious: if a service provider or other entity cannot control personal data, it is in no position to guarantee compliance with its data protection duties by ensuring that such data are processed lawfully, are accurate, are stored securely, and so on. It would make no sense to subject an internet intermediary to data protection duties which it had no hope of being able to discharge, let alone at proportionate cost. Understood in this way, Google Spain rests on the fact that search engines control the structure and design of their indices, the methodology of search result rankings, and the design of their web crawling robots. However, it also rests on an unstated factual premise (control over personal data contained in cached and indexed websites) which is open to question.

10.38 A ‘processor’ is a body that processes personal data on behalf of a data controller. In many cases, these may be the same person. This includes any person who, under the direct authority of the processor, is authorised to process the data.39

10.39  Acts of processing within the EU. The Directive fully harmonises the territorial scope of national data protection law. The starting point is that it applies to ‘any processing of personal data in the Community’.40 Article 4 lays down a mandatory choice of law rule that applies national data protection law in any of three circumstances:

(a) Where a data controller is established within the territory of one or more member states and processing is carried out ‘in the context of’ those activities, it is bound by national law in each relevant member state where it is established.41

(b) Where a data controller is not established within the territory of a member state but national law is otherwise applicable by reason of a rule of public international law, then that law applies.42

(c) Where a data controller is established outside the EU and uses equipment situated in the territory of one or more member states to process personal data there (and not merely for transit to another place), national law applies and the controller must designate a local representative who will also be liable in any proceedings.43

10.40  Definition of ‘establishment’. The concept of establishment is widely cast. Under the Directive, the test is whether the person exercises a real and effective activity using stable arrangements in a particular place.44 Whether that occurs through the person or through a related subsidiary with separate legal personality is not determinative. Thus, a single data controller may be established in several member states through different subsidiaries.

10.41  Transposition. In the United Kingdom, article 4 is transposed in section 5(3) of the 1998 Act, which defines establishment in terms of residence, incorporation, formation, or the maintenance of an office, branch, or agency through which the person carries on any activity.

10.42  The facts in Google Spain. The first question is normally whether the defendant has an establishment within the EU. In Google Spain, the issue was whether the Directive applied to Google Inc as a search engine service provider. The facts are discussed more fully in paragraphs 10.200 and 10.201. For territoriality purposes, what mattered was that the acts of data processing were performed by Google Inc, a Californian firm, using servers located in various jurisdictions, including data centres in Belgium and Finland. It had numerous EU subsidiaries, including Google Spain SL, a company with its seat in Madrid that was responsible for sales of search keyword advertising to Spanish customers. Only Google Inc was responsible for operating the Google.com and Google.es search engines, and the associated data processing. Google Spain was designated as a data controller with the national regulator for advertising sales data (but not search activity).45

10.43  Reasoning of the CJEU. The Court held that Google’s Spanish subsidiary was a relevant ‘establishment’ within the meaning of article 4(1)(a). This was because the processing of personal data in search results by Google Inc was carried out ‘in the context of’ the activities of Google Spain, which sold advertising space on the same results pages. The Court’s reasoning can be broken down into four steps:

(a) First, Google Spain was an effective and real exercise of activity through stable arrangements and was therefore an ‘establishment’ under article 4(1)(a).46

(b) Second, it was common ground that Google Spain targeted and ‘oriented’ its activities towards the inhabitants of Spain. Its purpose was the promotion and sale of keyword advertising, which ‘constitutes the bulk of the Google group’s commercial activity’.47 Its activities therefore took place in Spain.

(c) Third, article 4(1)(a) does not require data processing to be carried out by the EU establishment or even within the EU, but only that it be carried out ‘in the context of the activities’ of the establishment.

(d) Fourth, even though the processing of personal data was carried out exclusively by Google Inc, Google Spain’s activities were what made the search service profitable and those activities were, in turn, enabled by the search service. The advertising was therefore ‘inextricably linked’ to the processing in that they occurred together.

10.44  Underlying policy. Such a broad interpretation was said to be justified by the need to ensure effective and complete protection of the right to privacy. One important limitation on this approach is that the EU subsidiary must still ‘target’ its activities towards a member state; for example, by selling advertisements to consumers there.48

10.45  Subsequent treatment of Google Spain. In Richardson v Facebook UK Ltd, Warby J rejected the misconceived submission that Google Spain decided that any European subsidiary may be liable for data processing activities undertaken in Europe by its US parent company.49 What the CJEU actually held was that a US data controller, Google Inc, may be subject to the Directive (and specifically the transposing provisions of Spanish data protection laws) when it had an establishment in Spain that carried out local activities in the context of the same data processing. However, the subsidiary was not itself held to be a data controller. Accordingly, the claimant in Richardson could derive no assistance from ‘an analogy between the role of Google Spain within the Google corporate structure and that of [Facebook UK] within Facebook’.50 At best, the argument would be that Facebook could be subject to English data protection law for activities taking place in the same context as its UK establishment (and not the converse).

10.46  Location of the ‘means used’. Second, the Directive and 1998 Act may apply to a data controller even if it is only established outside the EU: it is, according to recital (20) of the Directive, sufficient that the ‘means used’ for data processing are located in a member state and not used solely for the purposes of transit. Section 5(1)(b) of the 1998 Act, like the English version of article 4, uses the expression ‘equipment’; however, this is properly construed as the wider concept ‘means’.51 What means are used, and their actual location, are questions of fact. By analogy with the approach taken to article 4(1)(a), it seems likely that the means must either be used to carry out the relevant processing or be inextricably linked to it.

10.47  Absence of local establishment. Article 4(1)(c) appears to require as a pre-condition that the service provider is not established in any member state before equipment situated in the EU can be taken into account. As the Advocate General noted in Google Spain,52 where a foreign service provider has a subsidiary established in one or more member states, this prevents article 4(1)(c) from applying on a literal interpretation. Section 5(1)(b) of the 1998 Act contains a similar pre-condition.

10.48  Foreign services with local activities. What this means for other internet intermediaries headquartered outside the EU is that if an EU subsidiary makes sales which contribute to the profitability of the service and are delivered using the same platform as the one in which relevant data processing is carried out, then the processing is very likely to be carried out ‘in the context of the activities’ of that EU subsidiary, and will arguably be ‘inextricably linked’ to services of the parent company that are made profitable by the subsidiary’s activities.

10.49  Consequences of an inextricable link. Where a local EU establishment is ‘inextricably linked’ in some way to the data processing service of the parent company, that will normally be enough to extend the application of the Directive to the processing of the business as a whole. This is so regardless of who actually operates the service or carries out the processing. Similarly, the precise location of the technical equipment carrying out processing does not matter. It also appears from the decision in Google Spain that the focus should be on the service overall, rather than specific acts of ‘processing’, but this point remains unresolved.

10.50  Threshold needed for an ‘inextricable link’. The vital question is therefore what constitutes an ‘inextricable’ link. In Google Spain, the advertising was displayed on the same webpages as the data processing. This may therefore be regarded as a sufficient link. It is unclear whether it is also necessary. For example, an EU consulting or analytics service which relies upon data gathered and processed by a US parent company may also be inextricably linked, even if the results of the consulting service are not ‘displayed’ per se to users. What matters, it is suggested, is whether the activity of the EU establishment (1) occurs in the same commercial context as the parent company’s data processing, and (2) depends upon or supports the parent company’s data processing, for example by using the same technical means.

10.51  Jurisdiction under the Brussels I Regulation. Conversely, in eDate Advertising GmbH v X, Advocate General Cruz Villalón put forward a ‘centre of gravity of the dispute’ connecting factor to determine the scope of jurisdiction in disputes involving internet publication under article 5(3) of the Brussels I Regulation.53 The Court accepted that a claimant who is the victim of internet wrongdoing may bring an action in the courts of the member state in which ‘the centre of his interests’ is based. This will normally be his habitual residence, though it may be another state where other factors, such as pursuit of a professional activity, establish a ‘particularly close link’.54

10.52  Consideration in Google Spain. This approach was rejected by the Advocate General in Google Spain.55 The Court did not consider the issue in light of its conclusion that the processing fell within article 4(1)(a) of the Directive. It is far from clear that such a connecting factor would find any support in the text of the Directive or offer any useful criterion for delimiting its scope in circumstances where an internet intermediary processes data using means located in numerous (and in many cases arbitrary) territories.

10.53  Breadth of application. Data are now created and processed in almost every aspect of modern human life. Much of this information will include personal data. Plainly, however, the Directive was not intended to apply to every possible dealing with electronic or structured material, since this would impose a serious burden on individuals and businesses. Nevertheless, the Directive is drafted in broad terms and subject only to limited exclusions from its scope ratione materiae, which are considered in the following paragraphs.

10.54  Everyday dealings with personal data. As the Advocate General observed in Google Spain, the Directive was proposed and adopted when the internet was at a very early stage of development and modern web applications did not exist. The consequence of this breadth is strikingly illustrated by the following example:

Let us think of a European law professor who has downloaded, from the court’s website, the essential case law of the court to his laptop computer. In terms of the Directive, the professor could be considered to be a ‘controller’ of personal data originating from a third party. The professor has files containing personal data that are processed automatically for search and consultation within the context of activities that are not purely personal or household related. In fact, anyone today reading a newspaper on a tablet computer or following social media on a smartphone appears to be engaged in processing of personal data with automatic means, and could potentially fall within the scope of application of the Directive to the extent this takes place outside his purely private capacity.56

10.55  Construing the purpose of the Directive. On the other hand, as is noted in paragraph 10.07, the Directive was intended to apply to all forms of technology and to regulate processing on new and unforeseen telecommunications services. With those services, and the ease of distribution and ubiquity of access that they enable, comes new potential for privacy harms. As the Court of Justice remarked in eDate, internet publications ‘may be consulted instantly by an unlimited number of Internet users throughout the world, irrespective of any intention on the part of the person who placed it...and outside of that person’s control’.57 There is obvious tension between the difficulty of controlling such dissemination and the tendency for ever-wider classes of persons to be considered data ‘controllers’. Ultimately, as Google Spain demonstrates, this latter tendency has prevailed in the EU authorities.

10.56  Advances in information technology. Electronic data processing has rapidly grown in significance and now probably far exceeds anything the drafters of the Directive could have envisaged. In L v L, Tugendhat J described the development of information privacy laws partly as a by-product of advances in information technology:

In the last 20 years or so the legal protection of information has been greatly increased. This has in large measure been in response to the development of computers and their use for word processing and sending of electronic messages. The amount of information that can be stored on a laptop is vast, and techniques for copying are quick and simple for experts.58

10.57  Medium of processing. It is clear that the Directive applies to personal data regardless of the medium in which it is obtained and processed. This is confirmed by recital (14), which provides that the Directive shall extend to sound and image data processed electronically:

given the importance of the developments under way, in the framework of the information society, of the techniques used to capture, transmit, manipulate, record, store or communicate sound and image data relating to natural persons, this Directive should be applicable to processing involving such data . . .

10.58  Technological neutrality. The Directive is drafted in a technology-neutral way. This is consistent with its aim to adapt to changes in methods of communicating and disseminating information. The approach of the CJEU has been to interpret the provisions of the Directive to apply irrespective of the medium which is used to transmit and store data. For example, in Satakunnan, the Court held that article 9 could apply regardless of whether transmission is ‘classic in nature, such as paper or radio waves, or electronic, such as the internet’.59

10.59  Method of processing. Similarly, the Directive applies to processing regardless of the specific ‘techniques’ or technologies being used by the data controller. However, by article 3(1) protection extends only to data processing systems which are wholly or partly ‘automated’, or which are ‘structured’ manual systems. This requires some form of filing system that is structured according to specific criteria relating to individuals in a manner that allows ‘easy access to the personal data’. Such systems may be centralised or decentralised.60 This is reflected in the definition of ‘data’ contained in the 1998 Act.61

10.60  Specific exclusions from harmonisation. Data processing for the purposes of public security, defence, national security, criminal law, and other non-harmonised fields is excluded from the operation of the Directive.62 Other fields may be excluded by construing these examples ejusdem generis, so that further fields may be excluded from the data protection regime only if they are of the same kind as the examples given.63 The processing of data relating to legal persons is likewise excluded,64 as is processing carried out by a natural person in the course of a purely personal or household activity.65

10.61  Personal or household activities. The exclusion for domestic activities is narrow. It does not, for example, exempt publications made by a blogger or other voluntary website operator which are publicly accessible, even if the subject matter of those publications is of a personal or charitable nature. In Lindqvist, the publication was a personal homepage operated by a volunteer catechist, which contained personal data about her colleagues and family that were published without their consent. She was initially convicted of offences under Swedish data protection legislation. The Court of Justice held that article 3(2) did not apply to her publication. Article 3(2) (and with it, section 36 of the 1998 Act) is limited to processing carried out ‘in the exercise of activities which are exclusively personal or domestic, correspondence and the holding of records of addresses’.66

10.62 Section 36 of the 1998 Act is in slightly broader terms: it exempts processing by an individual which is only for that individual’s ‘personal, family or household affairs (including recreational purposes)’. In light of Lindqvist, it seems unlikely that section 36 would exempt personal blogging activity or publications on social media, since processing for that purpose would not be purely personal: it would be a publication to an unspecified audience. The situation may be different if the audience is restricted in some way; for example, to friends or followers on social media. It is suggested that such a publication would be essentially for the purpose of that individual’s personal or recreational affairs and would, in today’s society, be properly considered to have an exclusively personal or domestic character.

10.63  Scope of the exemption. The Directive has a limited application to data processing for journalistic, artistic, or literary purposes.67 Under section 32 of the 1998 Act, three conditions must be met before processing will be partially exempt.

10.64  Sole purpose of processing. First, the processing must be undertaken solely for the relevant purpose, and with a view to publishing such material. Both the Directive and the 1998 Act make clear that processing for a dominant or partial purpose will not be sufficient.

10.65  Reasonable belief as to public interest. Second, the data controller must reasonably believe that publication of such material would be in the public interest.

10.66  Incompatibility with purpose. Third, the data controller must reasonably believe that an applicable data protection obligation is incompatible with the relevant journalistic, artistic, or literary purpose. The second and third conditions are hybrid tests that compare the controller’s actual subjective belief to an objective standard of reasonableness, taking account of any relevant codes of practice.68 This prevents obtuse or foolish data controllers from relying upon opportunistic or dishonest beliefs.

10.67  Effect of exemption. If section 32 applies, then the rights of data subjects do not apply as against the data controller to the extent that processing satisfies the three conditions. The data controller is also exempt from the data protection principles, other than the duty to take appropriate security measures against unauthorised processing and against accidental loss or destruction of the data.69 However, the exemption is not total: notification and other obligations still apply.

10.68  Interpretation of the exemption. Section 32 must be construed compatibly with article 9, which is the proviso that allows member states to lay down exemptions and derogations from protection for the purpose of upholding the freedom of information and the right to receive and impart information:

Member States shall provide for exemptions or derogations . . . for the processing of personal data carried out solely for journalistic purposes or the purpose of artistic or literary expression only if they are necessary to reconcile the right to privacy with the rules governing freedom of expression.

10.69  Secondary acts of processing. Exemptions and derogations permitted under article 9 apply only to primary publishers and not service providers who deal in data that was originally processed pursuant to article 9.70 This, presumably, is because processing by a service provider is not ‘solely’ for one of the relevant purposes, but rather for at least the purpose of operating the internet service. This may lead to the surprising result that a primary publication of data is compatible with the Directive and exempted by section 32, but subsequent automated dealings with the data by internet services are not. This raises particular concerns in relation to automated news aggregators such as Google News, or publishing platforms for artistic works such as Flickr and 500px, which may process personal data in ways which would be exempt if undertaken by the primary publisher.71

10.70  Authors of websites. Where material is published by a person to a webpage or platform, that person will normally be a data controller in relation to any personal data contained in the uploaded material. In Lindqvist, the Court of Justice held that the Directive likewise applies to website operators where they personally upload personal data. In that case, Ms Lindqvist’s act of uploading the HTML documents containing personal data onto her webpage was an act of processing:

[Article 2(b)] gives several examples of [processing] operations, including disclosure by transmission, dissemination or otherwise making data available. It follows that the operation of loading personal data on an internet page must be considered to be such processing.... [P]lacing information on an Internet page entails, under current technical and computer procedures, the operation of loading [ie uploading] that page onto a server and the operations necessary to make that page accessible to people who are connected to the Internet. Such operations are performed, at least in part, automatically.72

10.71  Other uploaders of material. It follows from Lindqvist that any person who uploads personal data to the internet will engage in processing of personal data and is likely to be a data controller for the purposes of the Directive and the 1998 Act.

10.72  Aggregators. Personal data do not cease being personal merely because they have become known to the general public73 or disclosed lawfully in another medium.74 It follows that an internet intermediary will still process personal data even if the data are obtained from public third party sources. This poses a particular risk for aggregators of data, such as search engines, portals, and data feeds. For example, in Volker und Markus Schecke GbR v Land Hessen, the Court held that an EU internet platform which published personal data relating to the beneficiaries of agricultural subsidies had interfered with the privacy rights of those individuals, despite the underlying data being aggregated from public registers.75 Conversely, if data are created and supplied directly by data subjects, it is much more likely that any personal data are processed with consent or otherwise published lawfully.

10.73  Applicability of the Directive. On their face, the Directive and 1998 Act apply to web hosts of internet materials just as they apply to any other intermediary who deals in personal data. Hosts receive, store, and transmit data (which may include personal data) in the course of operating a web server. Accordingly, it might be said, any hosts with a relevant territorial nexus (such as EU establishments or equipment) should be considered data controllers who process any personal data that are uploaded.

10.74  Identifying the ‘controller’. It is submitted that, where personal data are supplied and administered exclusively by a third party (such that the host has neither knowledge of nor control over those data), the host should not be treated as a ‘controller’ of the data except insofar as it processes additional data when operating the service (eg billing or account data for the subscriber). This approach is supported by recital (47) of the Directive, which provides that electronic mail services are only controllers with respect to personal data they process which is additional to the content of messages sent by users.76

10.75  Control over data processing. The need for a circumscribed approach arises because such a host would not be able to exercise practical control over the data that are uploaded by the client. While many hosts may have the contractual power to suspend or terminate an account for breach of the applicable terms of use, this theoretical power is unlikely to confer the functional control that is the essence of a data controller under the Directive. In this respect, a host is unlike a search engine or website operator, which may carry out further processing on data which they obtain from third parties: the host, in the ordinary course of its activities, carries out no further processing on stored data beyond transmitting exactly what has been uploaded by the subscriber at their instruction.

10.76  Applicability of the Directive. Search engines index, store, and make available in order of relevance information published to the internet by third parties. Insofar as that information contains personal data, it is clear from the decision in Google Spain that the Directive applies to these activities. The facts and reasoning of that case are discussed in greater detail in section 3.4. In summary, one of the questions referred to the CJEU asked, in essence, whether these activities of search engines amount to ‘processing of personal data’ within the meaning of article 2(b) of the Directive and, if so, whether search engines are ‘controllers’ in respect of that processing within the meaning of article 2(d).

10.77  The approach in Google Spain. The CJEU answered both questions in the affirmative, as noted at paragraph 10.32. Several aspects of search engines’ activities were important features in the decision. Guidance can be taken from these features as to the probable approach to other internet intermediaries.

10.78  Ranking as an act of ‘processing’. First, the Court emphasised that when search engines index and rank personal data contained in third parties’ webpages, they are engaging in additional processing that goes beyond that carried out by the original publisher. Search engines inherently operate by processing data. Their index is created by recursively visiting, analysing, and processing all webpages within the indexed data set, following all links on each page; this involves copying and storing the source data, transforming the source data so as to render it searchable (eg by performing ‘stemming’ and keyword expansion on page text, by excluding or down-weighting duplicate page elements, such as navigation bars and footers, and so on),77 and then transmitting excerpts from the source data in ranked order in response to queries. Other than the design of the underlying indexing and ranking algorithms, all this occurs automatically.
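
The indexing pipeline described above can be sketched in simplified form. The following is a toy illustration only, not a description of any actual search engine: the ‘stemmer’ is a naive suffix-strip, the ranking is a bare occurrence count, and the page URLs and text are invented for the example.

```python
# Toy sketch of an indexing pipeline: copy and store source text,
# normalise it (naive 'stemming'), build an inverted index, and return
# results in ranked order for a query. All names are illustrative.
from collections import defaultdict

def stem(word: str) -> str:
    """Naive stemmer: lower-case and strip a few common suffixes."""
    word = word.lower()
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def build_index(pages: dict) -> dict:
    """Map each stemmed term to {url: occurrence count}."""
    index = defaultdict(lambda: defaultdict(int))
    for url, text in pages.items():
        for word in text.split():
            index[stem(word)][url] += 1
    return index

def search(index, query: str) -> list:
    """Return URLs ranked by total matches for the query terms."""
    scores = defaultdict(int)
    for word in query.split():
        for url, count in index.get(stem(word), {}).items():
            scores[url] += count
    return sorted(scores, key=scores.get, reverse=True)

pages = {
    "example.org/a": "reports naming a person repeatedly naming names",
    "example.org/b": "an unrelated page about gardening",
}
index = build_index(pages)
print(search(index, "named"))  # the page mentioning 'naming' ranks first
```

Even this toy version shows why the Court regarded indexing as ‘additional’ processing: the source data are copied, transformed into a new structure, and re-disseminated in an order determined by the search engine rather than the original publisher.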

10.79  Causing greater dissemination. Second, the Court emphasised the ‘decisive role’ of search engines in the overall dissemination of personal data. Search engines make such data available to users who would not otherwise have found them. This consideration superficially resembles the ‘new public’ test that was proposed (and later partly abandoned) in the context of the communication to the public of copyright works.78 It amounts to saying that, although the purpose of a search engine is not to create new content (other than intermediate data structures for the purpose of storing and ranking content), in rendering accessible content which may otherwise be consigned to the dusty microfiche archives of a library or municipal authority, they are intervening so as to open up that content to a new audience. The same is presumably true of many other services.

10.80  Joint determinations to process data. Third, although search engines follow the robots exclusion protocol by convention,79 the Court held that this does not affect their responsibility for processing personal data where no choice has been made by the website operator. In any case, for a search engine to determine the processing of personal data jointly with a website operator based on its robots.txt file would still fall within article 2(d) (which encompasses joint determinations).80
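
The robots exclusion convention referred to here operates as follows: a crawler fetches the site’s robots.txt file and voluntarily honours its Disallow rules. The sketch below is a deliberately incomplete toy parser — it handles only rules addressed to all user agents (‘*’) and ignores wildcards and Allow directives — and the paths are invented for illustration.

```python
# Toy parser for the robots exclusion convention: collect Disallow
# prefixes addressed to all user agents, then test paths against them.
# Compliance is voluntary -- nothing technically prevents crawling.
def parse_robots(text: str) -> list:
    """Collect Disallow path prefixes that apply to all user agents."""
    disallowed, applies = [], False
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # strip comments
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            applies = value == "*"
        elif field == "disallow" and applies and value:
            disallowed.append(value)
    return disallowed

def may_crawl(path: str, disallowed: list) -> bool:
    """A path may be crawled unless it matches a disallowed prefix."""
    return not any(path.startswith(prefix) for prefix in disallowed)

rules = parse_robots("User-agent: *\nDisallow: /private/\n")
print(may_crawl("/private/profile.html", rules))  # False
print(may_crawl("/news/", rules))                 # True
```

The purely conventional character of the protocol is apparent from the code: the website operator merely publishes a preference, and it is the search engine that decides whether to honour it — which is consistent with the Court’s view that responsibility for the processing remains with the search engine.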

10.81  Controllers of indexed data. The effect of the Google Spain decision appears to be that search engines are controllers with respect to all personal data contained on indexed webpages. The Advocate General notably regarded such a conclusion as ‘absurd’.81 It is also difficult to reconcile with the approach taken by the Court in Lindqvist, which recognised (in the context of data transfers) that:

Given, first, the state of development of the internet at the time [the Directive] was drawn up and, secondly, the absence...of criteria applicable to use of the internet, one cannot presume that the Community legislature intended the expression ‘transfer [of data] to a third country’ to cover the loading...of data onto an internet page, even if those data are thereby made accessible...

Conversely, the decision in Google Spain assumes that the Directive was intended to cover the loading of data from an internet page into a search engine, where those data are thereby made accessible. This approach means that if a webpage includes sensitive personal data that are, in turn, included in search results, the search engine would be required to determine whether processing is lawful and cease any unlawful processing. As is discussed further in chapter 12, EU safe harbours may not apply to questions of liability arising from contraventions of the Directive.82

10.82  Other processing. In addition to the processing carried out by search engines when indexing third party content, search engines control the processing of personal data in a number of other ways when supplying search and advertising services to their users. First, the details of search engine users will often involve personal data, such as the contact details of the user (if registered), their search history, IP address, and tracking cookies.83 Second, advertisers will supply personal data in the course of reserving and paying for keyword advertisements.84 It is clear that the Directive also applies to these activities.

10.83  Data supplied by users to services. Where a ‘telecommunications or electronic mail service’ receives material from a third party such as a user, that material may include personal data. In such a case, the approach taken in recital (47) of the Directive is to treat the person from whom the message originates as the controller of personal data contained in the message, rather than the service provider. The service provider is treated as a data controller only ‘in respect of the processing of the additional personal data necessary for the operation of the service’.85 However, transmission of such messages must be the sole purpose of the service.

10.84  Scope of recital (47). The meaning of this proviso is unclear. Normally, a service provider will process the entirety of a message in order to transmit it, so that operation of the service will necessarily also involve processing all of the personal data it contains. However, the better interpretation is that the service is only a data controller with respect to ‘additional’ data—that is, data not already contained within the message—such as the details of the sender and recipient, user account information, and any identifying metadata, which are stored and processed in the course of transmission, or for the purpose of providing the service. Otherwise, this proviso would be deprived of any meaningful operation.
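
The distinction drawn by this interpretation — between the content of a message, whose controller remains the originator, and the ‘additional’ data processed to operate the service — can be illustrated schematically. The field names below are hypothetical and the sketch is purely conceptual, not a description of any real mail service.

```python
# Conceptual illustration of the recital (47) distinction: the service
# is treated as controller only of the 'additional' data it needs to
# operate (sender, recipient, metadata), not of the message body.
# All field names are hypothetical.
from dataclasses import dataclass

@dataclass
class Message:
    sender: str       # additional data: service treated as controller
    recipient: str    # additional data: service treated as controller
    timestamp: str    # additional data: service treated as controller
    body: str         # content: the originator remains the controller

def service_controlled_fields(msg: Message) -> dict:
    """Return only the data the service controls on this reading."""
    return {
        "sender": msg.sender,
        "recipient": msg.recipient,
        "timestamp": msg.timestamp,
    }

msg = Message("a@example.org", "b@example.org", "2014-05-13T10:00Z",
              "Message content, possibly containing personal data...")
print(sorted(service_controlled_fields(msg)))  # body is excluded
```

On this reading, the proviso retains meaningful operation: although the whole message passes through the service’s systems, the service’s responsibilities as controller attach only to the envelope and account data it generates or stores for its own purposes.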

10.85  Relevant communications services. The Directive refers to messages transmitted ‘by means of a telecommunications or electronic mail service’ whose sole purpose is message transmission. It appears arguable that a similar approach should apply to non-email services such as Facebook Messenger, WhatsApp, and Snapchat, on the basis that such services are either forms of ‘electronic mail’ or are otherwise telecommunications services. Such an argument would face two difficulties. First, those applications sit uncomfortably with the definition of a classical ‘electronic mail service’. It is suggested that, consistent with the aim of the Directive to adapt to evolutions in technology, the concept of an electronic mail service should include any electronic messaging service, whether using the Simple Mail Transfer Protocol or some other IP-based protocol.86

10.86 Second, messaging applications often do more than offer simple text-based messaging; for example, they tend to support voice telephony and sending photographs, videos, and other files, or even form part of a larger platform or social network (eg Facebook or Instagram). One answer is that all these activities should now be regarded as falling within the concept of a mail service (which, even at the time the Directive was adopted, would have permitted attachments).87 If they do not, the proviso should apply to the extent the service does consist of a mail service. Insofar as the application provides other services, it may be considered a data controller in respect of further processing relating to those other services.

10.87  Extension to other application-layer intermediaries. Arguably, the same approach should apply to personal data published by users to other application-layer content services, such as social networks, media platforms, and comments to blogs. In all cases, the operator of the service is carrying out activities which include, in part, the provision of a service which consists solely of functionality to receive and transmit messages from users. However, in light of the Google Spain decision it appears that many application-layer platforms will be considered data controllers in respect of any additional processing that they carry out.

10.88  Nature of the obligations. The duties of data controllers are specified at a high level of abstraction in article 6 of the Directive and given effect in section 4(4) of the 1998 Act. That provision creates a statutory duty to comply with the data protection principles set out in schedules 1–4 of the Act:

it shall be the duty of a data controller to comply with the data protection principles in relation to all personal data with respect to which he is the data controller.

10.89  Scope of data protection duties. Data protection duties attach to data controllers.88 In general, a data controller will only be liable for its own processing. However, if a service provider receives and stores data from third parties which include personal data, it will need to ensure that its additional processing satisfies the requirements of the Directive. As the Court of Justice explained in Google Spain, where a service provider’s activities involve processing which can be ‘distinguished from and is additional to’ processing carried out by third parties, the service provider:

must ensure, within the framework of its responsibilities, powers and capabilities, that that [ie the additional] processing meets the requirements of [the Directive], in order that the guarantees laid down by the Directive may have full effect.89

10.90 Although formally this obligation applies only to any additional processing carried out by the data controller, in practice this will amount to a duty to ensure compliance with respect to all personal data stored or utilised by the service. The following sections outline the main duties of data controllers. They do not aim to provide a comprehensive treatment but instead consider a selection of issues likely to arise in relation to internet intermediaries.

10.91  Lawful processing. First, personal data must be processed fairly and lawfully.90 The circumstances in which processing is ‘lawful’ are specified in article 7. In the case of ordinary personal data, data processing is lawful if at least one of the legitimacy conditions is satisfied, namely, where the processing:

(a)

has been unambiguously consented to by the data subject;

(b)

is necessary to perform or conclude a contract to which the data subject is or will be a party;

(c)

is necessary for compliance with a legal obligation of the controller (other than an obligation ‘imposed by contract’);91

(d)

is necessary to protect the vital interests of the data subject;

(e)

is necessary for the performance of certain public interest or official activities;92 or

(f)

is necessary for legitimate interests pursued by the controller or third party recipients of the data, unless those interests are overridden in any particular case by the data subject’s rights and freedoms.93

10.92  Transposition. By article 5, the exact circumstances in which data processing is lawful are left to member states. In the 1998 Act, these circumstances are specified in schedule 2. In general, data processing will not be fair and lawful unless at least one of the conditions mentioned in paragraph 10.91 is satisfied.

10.93  Meaning of consent. Consent is given a specific meaning in the Directive: it requires a ‘freely given specific and informed indication’ of the data subject’s wishes and which signifies an agreement to the relevant processing.94 Consent may be implicit or explicit, but must be explicit in the case of sensitive personal data.95

10.94  Processing in aid of legitimate interests. Even in the absence of consent, the controller is free to process personal data in certain circumstances. The broadest of these is to pursue the ‘legitimate interests’ of the controller or third parties, subject to a balancing exercise. In Google Spain, the Advocate General considered that this would justify activities by search engines in making information more easily accessible to internet users, effectively disseminating internet content, and enabling ancillary businesses, each of which corresponded to a protected Charter right. However, the Court rejected this view and ultimately sided with the data subject in its balancing exercise.96

10.95  Unauthorised access to IT equipment. In Imerman v Tchenguiz, the Court of Appeal considered that there was a realistic prospect of establishing that processing was not lawful or fair where the defendants had surreptitiously accessed personal data in computer files stored on a server and IT systems.97 The defendants were owners of the equipment and had the ability to access and control those systems; it was therefore ‘strongly arguable’ they were data controllers. There was said to be force in the argument that their processing of the material was unlikely to be lawful, fair, and in accordance with schedule 2 of the 1998 Act. This was because the data were arguably obtained in breach of the Computer Misuse Act 1990 (meaning processing was not lawful), and because the processing was not ‘fair’ since it was done in secret and indiscriminately. Although these comments were obiter (the appeal was decided on the basis of breach of confidence), they indicate the considerations relevant to determining whether processing is lawful and fair.98

10.96  Exhaustive nature of grounds. Article 7 sets out an ‘exhaustive and restrictive’ list of the grounds on which processing may be lawful. As such, member states are not free to add new grounds or alter the scope of the six principles set out in article 7. This reflects the limited margin of discretion given to member states to derogate from the uniform level of protection established under the Directive. This is to be contrasted with national provisions that are a ‘mere clarification’ of article 7, or ‘guidelines’ concerning how to conduct the balancing exercise required to strike a ‘fair balance’ between the rights of data subjects, data controllers, and third parties.99

10.97  Purposes of processing. Second, personal data must be collected for specified, explicit, and legitimate purposes and not further processed incompatibly with those purposes.100 The usual way in which purposes are specified by internet services is in a privacy notice (commonly referred to as a privacy ‘policy’) published to users of the service.

10.98  Importance of privacy notices. Privacy notices are important documents because they are afforded a special status under paragraph 5(a) of schedule 1 to the 1998 Act: namely, they specify the purposes for which data are collected and, in turn, determine the purposes for which processing may occur and for which data will be adequate, relevant, and necessary. Care must be taken when drafting a privacy notice to ensure that it:

(a)

correctly describes the activities of the service;

(b)

comprehensively deals with all personal data collected and processed by the service;

(c)

accurately describes how and why the service collects data, and where that occurs;

(d)

explains what data are stored, why, and for how long;

(e)

explains any situations in which the data may be disclosed to others; and

(f)

addresses the future possibility that the data subject will want data removed, rectified, or blocked.

10.99 Unfortunately, very few privacy notices are written in a clear and concise manner. This is regrettable, because it means that few users will bother to read them, and even fewer will comprehend them. Empirical research suggests that privacy notices are rarely understood, even by experienced readers.101

10.100  Legitimacy of purposes. The reference to ‘legitimate’ purposes suggests that privacy notices will be subject to some degree of external supervisory control. Although no authority exists on the point, it is suggested that a privacy notice which permitted collection and processing for ‘any purpose’ would not be legitimate (additionally, it would arguably not be explicit and specified with sufficient clarity and precision to allow data subjects to give their consent). Similarly, a privacy notice which permitted processing for an obviously illegitimate purpose (such as sale of personal data to criminals) would not be legitimate even if clearly specified. If incorporated into a contract or notice communicated by a trader to a consumer, such a notice may also fall foul of unfair terms legislation insofar as it affects users who are dealing as consumers.102

10.101  Quality of data processing. Third, data must be ‘adequate’, ‘relevant’, and ‘not excessive’ in relation to the purposes for which they are collected or may be further processed.103 Adequacy, relevance, and necessity must ordinarily be assessed based on the purposes of the data controller and not those of any third party.104 What must be assessed is whether the personal data collected, stored, and processed by the controller are adequate for the purposes of the controller (as specified in the controller’s privacy notice), relevant to the controller’s activities, and whether they go beyond what is necessary for those activities.105

10.102  Content of the duty. The Court explained this requirement in Google Spain as a duty upon the controller to ‘take every reasonable step to ensure that data which do not meet the requirements of [article 6] are erased or rectified.’ Interpreted as a negligence-based standard rather than an absolute duty, this means that a reasonable step is one that a reasonable economic operator in the position of the controller would take with respect to the data concerned. Other relevant factors will include the nature of the non-compliance with article 6, the cost of taking remedial steps, and the consequences to the data subject of not taking action.

10.103  Examples. In the context of internet intermediaries, this requires consideration of what data are collected by the service, the reasons why they are stored, and what processing is or may be carried out by the service. Several examples follow:

(a)

A social network stores data supplied by its users about themselves and other users in the form of image, text, and video postings. The storage enables users to view the postings, which may be displayed or searched for at any time by users who are permitted to see the information. The storage is likely to be ‘adequate’ and ‘relevant’, provided the data are only displayed in accordance with users’ instructions, and deleted when no longer needed.

(b)

A search engine stores lists of URLs for the purpose of enabling access to websites relevant to specified keywords. Such data may be considered ‘adequate’ and ‘relevant’ where the keywords appear on the page, provided they are removed once the source page is no longer available or has been modified to remove the keywords. The Court in Google Spain held that the age of the source material is also a relevant consideration.

(c)

A website displays comments from users which include data identifying individuals. The website processes and stores all comment data for the legitimate purpose of enabling comments to be viewed and disseminated. Provided the website stores comment data in the form provided, and removes them if the original comment is deleted by the user, such data seem likely to be ‘adequate’ and ‘relevant’.

10.104  Conducting the assessment. Difficult questions arise when considering the benchmark against which to assess adequacy, relevance, and excessiveness. On a literal reading of the Directive and 1998 Act, the purpose against which data should be assessed is the purpose for which the data are processed by the data controller. Accordingly, adequacy must be assessed against that purpose, and not the purpose for which the data were originally created (which may be different or unknown). Similarly, relevance must be considered in light of the ongoing purpose of the data controller, and not only the original purpose of the author. It might have been argued that if the purpose of processing is to make available a repository of third parties’ publications, then personal data contained in the archive remain relevant to that purpose notwithstanding that the original publications were authored decades ago. However, Google Spain appears to have rejected such an approach by concluding that processing for the purpose of making the original publication re-accessible may be unlawful.

10.105  Accuracy of data. Fourth, data must be accurate and updated as necessary. All reasonable steps must be taken to erase or rectify data that are inaccurate or incomplete, having regard to the purposes for which they were collected or for which they may be further processed.106

10.106  Standard of care. This is not an absolute duty but a duty to take reasonable care. As schedule 1 of the 1998 Act makes clear, data controllers are not required to avoid any inaccuracy at all in personal data—only to take reasonable steps aimed at ensuring that the data being processed accurately record any information obtained by the data controller from the data subject or from a third party. If the data subject notifies the controller of his or her view that the data are inaccurate, the controller must update the data to reflect that (or rectify the data).107

10.107  Corrupted data. Understood in this way, the duty of accuracy is relatively limited in the context of internet intermediaries that receive and process data automatically from data subjects. One possible breach could be an error or inaccuracy introduced during transmission or storage (eg by data corruption or a software bug) and which results from a failure to take reasonable care.

10.108  Notification of inaccuracies. Another possibility is a failure to act on a notification that, in the data subject’s view, the data are inaccurate. Sub-paragraph 7(b) of schedule 1 to the 1998 Act appears to impose an absolute duty to update personal data so as to ‘indicate that fact’. In light of the fault-based standard of liability established by article 6(1)(d), the better view is likely to be that the data controller must only take such steps to update the data as are reasonable in the circumstances.

10.109  Duration of processing. Fifth, personal data must be kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the data were collected or for which they may be further processed.108 Again, this is assessed by reference to the purposes of the data controller, as specified in any applicable privacy notice.

10.110  The relevant rights. Sixth, data controllers are obliged to process personal data in accordance with the rights of data subjects under the Directive and 1998 Act. These rights include rights of access, objection to further processing, and erasure, rectification, or blocking. They are examined in section 3.

10.111  Breach of the duty. A data controller contravenes this duty only if it fails to process data in accordance with these rights of access, objection, and removal. For example, a website or search engine that continues to process personal data notwithstanding receipt of a well-founded objection to further processing will not have processed those data in accordance with the rights of the relevant data subject, to the extent that the notice is justified.109 So too for a service that fails to accommodate any other relevant request from a data subject.110 However, until all applicable pre-conditions to the exercise of a data subject’s rights (such as payment of a fee or a sufficiently clear description of the relevant data) have been fulfilled by the data subject, the controller is under no duty to act.

10.112  Technical and organisational measures. Seventh, data controllers must adopt appropriate measures (both technical and organisational) in order to maintain an appropriate level of security for personal data.111 These measures must be designed to prevent unlawful processing and guard against accidental loss or destruction of, or damage to, personal data. These measures must relate both to the design of a processing system and the processing of data using that system.112 Article 17 of the Directive refers in particular to processing that involves ‘the transmission of data over a network’ as requiring data security.

10.113  Selection of data processor. Where processing is carried out on behalf of a data controller (eg, outsourced cloud services), the data controller owes a duty to select a data processor with sufficient guarantees of technical and organisational security measures and to ensure ongoing compliance with those measures. Additionally, reasonable steps must be taken to ensure the reliability of employees who have access to personal data.113 The relationship between controller and processor must be spelled out in a written (or equivalent) contract which includes certain required terms, in particular that:

(a)

the processor shall act only on instructions from the controller; and

(b)

the processor shall be bound by data security obligations equivalent to those binding the controller.114

10.114  Appropriate level of data security. What constitutes an ‘appropriate’ level of data security depends on several factors, including: the state of the art; the cost involved; risks inherent in the processing; the harm that might result from unlawful processing, loss or damage; and the nature of the data to be protected.115 It is submitted that the requirement of an ‘appropriate’ level of security imposes a duty to take reasonable care, and creates duties that are essentially coterminous with liability in negligence. The corollary is that many data controllers are likely to owe statutory duties of care with respect to data integrity and security.

10.115  Transfers of personal data. Eighth, service providers who store or process users’ data outside the EU must comply with the provisions relating to cross-border data flow. In general, such transfers are permissible where any one of three conditions is satisfied:116 first, the destination country must ensure an ‘adequate’ level of protection for personal data in all the circumstances of the case; second, a relevant exception must apply (eg the data subject must either have given his consent, or the transfer must be ‘necessary’ for performance or conclusion of a contract, for legal proceedings or for a substantial public interest); or third, the transfer must be relevantly authorised and effected with ‘appropriate safeguards’.
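The alternative character of these three conditions can be sketched in code. The following is an illustrative simplification only, not a statutory test; all names and the reduction of each condition to a single boolean are the author's assumptions.

```python
# Illustrative sketch: the three alternative bases on which a
# cross-border transfer may proceed, as summarised in the text above.
from dataclasses import dataclass

@dataclass
class Transfer:
    destination_adequate: bool        # adequate protection in the destination country
    exception_applies: bool           # eg consent, contractual necessity, legal proceedings
    authorised_with_safeguards: bool  # transfer authorised and effected with safeguards

def transfer_permissible(t: Transfer) -> bool:
    # Any one of the three bases suffices; they are alternatives, not cumulative.
    return (t.destination_adequate
            or t.exception_applies
            or t.authorised_with_safeguards)

# A transfer resting solely on the data subject's consent:
print(transfer_permissible(Transfer(False, True, False)))  # True
```

The point of the sketch is simply that the bases are disjunctive: a transfer to a country without an adequacy finding may still proceed under an exception or with appropriate safeguards.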

10.116  Accessibility of websites abroad. Merely uploading personal data to a website which is accessible to persons outside the EU will not create liability, assuming that the processing is otherwise lawful. In Lindqvist, it appears that the personal data were uploaded to a web server located within the EU which was operated by a host established either in Sweden or another member state. The Court concluded that it would not be a sensible result to treat data as ‘transferred’ to a third country merely because, once uploaded, they become accessible to users in that country with the technical means to access them:

If article 25 were interpreted to mean that there is ‘transfer [of data] to a third country’ every time that personal data are loaded onto an internet page, that transfer would necessarily be a transfer to all the third countries where there are the technical means needed to access the internet. The special regime [for data transfers] would thus necessarily become a regime of general application, as regards operations on the internet. Thus, if the Commission found, pursuant to article 25(4) of Directive 95/46, that even one third country did not ensure adequate protection, the member states would be obliged to prevent any personal data being placed on the internet.117

10.117  Servers located in third states. If personal data are transmitted to a server physically located outside the EU, this is likely to be a transfer. If the data belong to a user of the service, then that is unlikely to pose a problem, since the user may be required to agree to the transfer as part of the terms of use. Alternatively, the transfer may be necessary to perform the contract under which the service is provided, either directly to that user or to a third party and in the interests of that user.118 Nevertheless, care should be taken to identify any relevant transfers to hosting infrastructure located outside the EU, particularly where decentralised cloud architecture is used.

10.118  Annulment of the data transfer safe harbours. As a result of the decision in Schrems v Data Protection Commissioner, the so-called ‘safe harbour’ arrangement by which transfers of personal data were permitted from the EU to the United States has been annulled.119 That arrangement was formerly based on a decision of the European Commission which deemed that the United States provided an ‘adequate level of protection’ for the purposes of article 25(2) of the Directive if certain ‘Safe Harbour Privacy Principles’ were complied with.120 The principles required United States organisations receiving personal data from the EU to self-certify to an American government department and provide various information about their privacy policies and programmes.

10.119 In Schrems, the claimant was a user of Facebook who objected to the transfer of his personal data to the United States on account of the surveillance activities of the National Security Agency, which had been disclosed by Edward Snowden. The Irish Data Protection Commissioner rejected his complaint, and he sought review of the decision in the Irish High Court. On a reference from that court, the CJEU held that it would be contrary to the Directive for a decision of the Commission to prevent the national supervisory authorities from evaluating whether an adequate level of protection is provided by a third state: the authority ‘must be able to examine, with complete independence, whether the transfer of that data complies with the requirements laid down by the directive.’121

10.120  The meaning of ‘adequate level of protection’. The term ‘adequate’ means that the overall standard of protection provided by the third state must be sufficient ‘to ensure, by reason of its domestic law or its international commitments, a level of protection of fundamental rights and freedoms that is essentially equivalent to that guaranteed within the European Union by virtue of Directive 95/46 read in the light of the Charter.’122 However, the level of protection need not be identical to that guaranteed by the EU legal order.

10.121  Assessing the adequacy of protection. According to the CJEU in Schrems, to assess whether a country provides adequate protection requires the decision-maker to ask ‘whether that decision is compatible with the protection of the privacy and of the fundamental rights and freedoms of individuals’.123 This assessment must be conducted ‘with all due diligence’.124

10.122  Assessment of United States data protection. In light of the Snowden disclosures, the Court’s decision in Digital Rights Ireland,125 and the Commission’s own assessment of United States surveillance practices,126 the Court concluded that the United States failed to provide an adequate level of protection for personal data. In particular, personal data transferred from the EU could be accessed and stored for purposes ‘beyond what was strictly necessary and proportionate to the protection of national security’, and there were inadequate means of redress.127 Relying upon Digital Rights Ireland, the Court further observed that legislation which permitted the American authorities to store and have general access to the content of electronic communications ‘must be regarded as compromising the essence of the fundamental right to respect for private life’.128 Because the Commission did not consider whether United States law actually ensured an adequate level of protection when it adopted the safe harbours, that decision was invalid in its entirety.

10.123  The notification requirement. Data controllers established in a member state are obliged to notify the national authority that they intend to process personal data. Notification is a mandatory requirement and must normally be completed before any processing of data takes place. In the United Kingdom, it is a criminal offence to fail to notify, or to fail to notify changes to the current practices or intentions of the data controller, where that failure is negligent.129

10.124  Notification by intermediaries. Almost all, if not all, service providers with an EU establishment will need to notify the relevant supervisory authority before processing personal data. Fortunately, in the United Kingdom the procedure is cheap and relatively simple: notifications can be sent to the Information Commissioner’s Office (‘ICO’) using an online form.130

10.125  Nature of the problem. It is an unfortunate reality of modern digital storage that attempts to compromise the security of internet-connected systems are both inevitable and commonplace. With regrettable frequency these attempts succeed, resulting in unauthorised disclosure of data (much of them personal data) to malicious attackers. The consequences of data intrusions range from the mundane to the very serious: affected individuals may have to reset their passwords, receive unwanted communications, lose important data, have to deal with unauthorised transactions, or face a heightened risk of identity theft. Where the compromised data include passwords, the consequences may extend far beyond the data controller’s own platform because many individuals reuse login credentials on multiple platforms for convenience.

10.126  Causes of data breaches. Data breaches can occur in many ways. For analogue systems, the most common breach is simply sending data to the incorrect recipient. Sometimes physical equipment is lost or stolen. For internet data, the most common breaches are analogous: incorrectly addressed emails, and loss or theft of an unencrypted device. The type of breach often regarded as archetypal—malicious intrusion or hacking—is in fact relatively infrequent. Less commonly, information is accidentally uploaded to a public-facing webpage by the data controller.131

10.127  Rationale for notifying affected users. Some of the harms associated with data breaches can be averted if affected individuals receive timely notification so that they may reset their passwords or adopt other security countermeasures, such as two-factor authentication.132 For this reason, national data protection rules increasingly provide for mandatory data breach notifications. In the United States, almost all states provide for some form of notification procedure.133 In Australia, a proposal has been made for mandatory notification to the data protection authority and affected individuals of any data breach that poses a ‘real risk of serious harm’ to the individuals.134

10.128  EU service providers’ duty to notify. In the European Union, providers of publicly available electronic communications services must inform their subscribers if there is a ‘particular risk of a breach of security of the network’.135 This appears to include a security breach that is in progress or has already occurred, since that may create a particular risk of further breaches of security. In cases where the risk lies outside the scope of measures able to be taken by the service provider, subscribers must also be informed of any possible steps they can take and the costs of taking them. This information must be provided to the subscriber without separate charge.

10.129  Transposition. Data breach notification is implemented in the United Kingdom by regulation 5 of the Privacy and Electronic Communications (EC Directive) Regulations 2003, which requires a provider of a ‘public electronic communications service’ to take appropriate technical and organisational measures to safeguard the security of that service. Regulation 5(3) provides:

Where, notwithstanding the taking of [appropriate technical and organisational] measures as required by paragraph (1), there remains a significant risk to the security of the public electronic communications service, the service provider shall inform the subscribers concerned of—

(a)

the nature of that risk;

(b)

any appropriate measures that the subscriber may take to safeguard against that risk; and

(c)

the likely costs to the subscriber involved in the taking of such measures.

A measure is ‘appropriate’ if it is proportionate to the risk, having regard to the state of technological development and the cost of implementing the measure.136

10.130  Who must notify. The obligation to notify is limited to operators of a ‘public electronic communications service’. This concept is governed by a number of cascading definitions in the Communications Act 2003 (‘2003 Act’). In summary, there are four elements:

(a)

Public access. The service allows members of the public to send messages over an electronic communications network.

(b)

Electronic transmission. The service principally transmits signals electronically.

(c)

Not content. The service is not purely a ‘content service’ involving the exercise of editorial control or the provision of material to subscribers.

(d)

Control. The service is under the direction and control of the service provider.
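The four elements listed above are cumulative, which can be expressed as a simple conjunction. This is a hedged sketch of the summary in the text, not of the 2003 Act's cascading definitions themselves; parameter names are the author's own.

```python
# Illustrative sketch: the four cumulative elements of a 'public
# electronic communications service' as summarised above.
def is_public_electronic_communications_service(
    public_access: bool,            # members of the public may send messages
    electronic_transmission: bool,  # principally transmits signals electronically
    pure_content_service: bool,     # purely editorial control / provision of material
    provider_control: bool,         # under the provider's direction and control
) -> bool:
    # All four elements must be satisfied; a pure content service is excluded.
    return (public_access
            and electronic_transmission
            and not pure_content_service
            and provider_control)

# An employee-only messaging system fails the public-access element:
print(is_public_electronic_communications_service(False, True, False, True))  # False
```

A webmail or messaging platform open to the public would satisfy all four elements; a purely editorial website would fail the third.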

10.131  Public service providers. The starting point is section 151 of the 2003 Act, which requires there to be an ‘electronic communications service that is provided so as to be available for use by members of the public’, in the sense that the public may be customers of the service provider.137 Thus, a service that was restricted to employees of a company would not fall within the definition, unless the service was also available to its customers.

10.132  Meaning of ‘electronic communications service’. An ‘electronic communications service’ is, in turn, defined to mean a service whose principal or only feature is the conveyance of ‘signals’ by means of an electronic communications network, but does not include a ‘content service’.138 Essentially, a service provider who exercises editorial control over material or provides material for transmission will supply a content service to that extent.139 To the extent a platform allows third parties to provide material for transmission, it will not be a content service and so may be an electronic communications service. This means that service providers who are subject to regulation 5(3) are not limited to operators of network-layer infrastructure: they may include internet services which allow electronic messages to be sent. Most commonly, data breach notifications are given by ISPs and application-layer platforms.

10.133  The triggering event. A notification obligation is triggered when there is a ‘personal data breach’. This is defined in Directive 2009/136/EC to mean:

a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed in connection with the provision of a publicly available electronic communications service in the Community.140

It follows that notification is limited to breaches involving a service established within a member state of the European Union. It stands to reason that the territorial scope of this provision is equivalent to the Data Protection Directive, and so would extend to a service supplied to persons in a member state in the context of activities involving data storage or transmission infrastructure in third states.

10.134  Who must be notified. Article 4(3) of the PEC Directive (as amended) defines the circumstances in which a breach notification will be required. The starting point is that the service provider is always required to notify the competent national authority (in the United Kingdom, the ICO). Additionally, where the breach ‘is likely to adversely affect the personal data or privacy of a subscriber or individual’, the service provider is required to notify that subscriber or individual.141 Subscriber notifications must be sent in parallel with notification to the national authority. A failure to notify may lead to a £1,000 penalty.

10.135  Likelihood of adverse effect. Most but not all data breaches will be likely to affect subscribers or individuals adversely. In large part this will depend on the nature of the data that have been accessed, and the person or persons who have had access to the information. A risk of harm is particularly acute where the data include sensitive personal data, or data relating to financial information, browsing histories, email content, call histories, search histories, and location data. Relevant types of harm are not purely financial; they may also be emotional, psychological, or reputational.142 Even if the service provider considers that there is no likely harm, the national authority may require notification.

10.136  Notifications to affected users. The need for subscriber notifications may be displaced or deferred in three circumstances. First, by agreement with the national authority, the service provider may delay the notification until the breach has been investigated fully, if early notification may prejudice the investigation.143 Second, no notification is required if the service provider can demonstrate to the satisfaction of the national authority that it has implemented ‘appropriate technological protection measures’ with respect to the data involved in the security breach, sufficient to render the data unintelligible to the intruder.144 This includes both standardised encryption and hashing techniques. Third, if the service provider cannot identify all affected end users within the relevant time period, it may be required instead to notify individuals by means of advertisements placed in national or regional media.145
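The second exception turns on the data being unintelligible to the intruder. As a purely illustrative example of one such ‘appropriate technological protection measure’, credentials may be stored as salted, iterated hashes rather than in cleartext, so that a breached datastore discloses no usable passwords. The algorithm, iteration count, and salt length below are the author's illustrative choices, not requirements of the Directive.

```python
# Minimal sketch: salted PBKDF2 hashing so that breached records are
# unintelligible to an intruder. Parameters are illustrative only.
import hashlib
import os

ITERATIONS = 100_000  # assumed work factor

def hash_password(password, salt=None):
    """Return (salt, digest) for storage; the password itself is never stored."""
    salt = salt if salt is not None else os.urandom(16)  # unique salt per record
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the digest from a login attempt and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return candidate == digest

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Because each record carries its own random salt, identical passwords produce different stored digests, defeating precomputed lookup attacks on a breached database.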

10.137 In all cases, notification must occur ‘without undue delay’. However, as a result of article 2(2) of the Notification Regulation, service providers are now required to notify the national authority within 24 hours of the detection of the breach ‘where feasible’.146 Detection requires actual knowledge that a security incident has occurred that led to personal data being compromised.

10.138  Notifications to a national authority. To be valid, a notification must supply certain information to the national authority, including:

(a)

details of the service provider;

(b)

details of the incident (when it occurred, how it was detected, the general circumstances);

(c)

details of the personal data affected by the incident; and

(d)

technical and organisational measures applied (or to be applied) by the service provider to the affected personal data.147

Because full information about a security incident may not be available until more than 24 hours after its detection, provision is made for the supply of further details within three days of an initial data breach notification if they are not initially available to the service provider.148 The ICO has published an online form which allows both initial and follow-up notifications to be communicated securely and immediately.149
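The timing rules described above (initial notification within 24 hours of detection ‘where feasible’, with further details within three days of the initial notification) can be sketched as a simple deadline calculation. The function and field names are the author's assumptions, offered only to make the two distinct clocks explicit.

```python
# Hedged sketch: the two deadlines run from different events — the
# 24-hour clock from detection of the breach, the three-day clock from
# the initial notification itself.
from datetime import datetime, timedelta

def notification_deadlines(detected_at, initial_notified_at):
    return {
        "initial_due": detected_at + timedelta(hours=24),
        "follow_up_due": initial_notified_at + timedelta(days=3),
    }

deadlines = notification_deadlines(
    detected_at=datetime(2015, 6, 1, 9, 0),
    initial_notified_at=datetime(2015, 6, 1, 20, 0),
)
print(deadlines["initial_due"])    # 2015-06-02 09:00:00
print(deadlines["follow_up_due"])  # 2015-06-04 20:00:00
```

Note that ‘detection’ requires actual knowledge of a compromise, so the 24-hour clock does not start merely upon the occurrence of the underlying incident.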

10.139  Notifications to users. The information which must be supplied to a subscriber or individual is different and must be expressed in clear and intelligible language. In addition to the information required to be given to the national authority, the service provider must explain:

(a)

the likely adverse consequences of the data breach for the individual (eg identity theft, fraud, harm, distress);

(b)

what measures are being taken by the service provider to address the breach; and

(c)

what measures the individual could take to mitigate the adverse consequences.

10.140  Incident record-keeping. Service providers must also maintain an inventory of personal data breaches which sets out the facts surrounding each breach, their effects, and what remedial action was taken.150

10.141 Internet intermediaries, whether or not data controllers, must comply with the requirements of Directive 2002/58/EC (‘PEC Directive’) in relation to cookies. Following its amendment in 2009, article 5(3) provides:

Member States shall ensure that the storing of information, or the gaining of access to information already stored, in the terminal equipment of a subscriber or user is only allowed on condition that the subscriber or user concerned has given his or her consent, having been provided with clear and comprehensive information, in accordance with Directive 95/46/EC, inter alia about the purposes of the processing.

10.142  Scope of stored information. Article 5(3) applies to two classes of activity: (1) the storing of information; and (2) the accessing of information already stored. In simple terms, article 5(3) requires users to be told about activities involving data stored on their equipment and given the opportunity to consent to or refuse those activities. The most common case in which article 5(3) will apply is persistent and session cookies stored on a user’s computer, but it clearly has wider application. For example, it may also apply to other means of storage and monitoring, including spyware, malware, viruses, keyloggers, file synchronisation, remote access tools, operating systems, and perhaps virtualised software instances.

10.143  Transposition. The PEC Directive is implemented in the United Kingdom in the Privacy and Electronic Communications (EC Directive) Regulations 2003. Regulation 6 lays down two requirements and prohibits the use of an electronic communications network to store or gain access to information stored in the terminal equipment of a subscriber or user unless those requirements are met. Before information can be stored or accessed on a user’s ‘terminal equipment’ (such as a computer or phone), users must: (1) be provided with ‘clear and comprehensive information’ about the purpose of that activity; and (2) be given the opportunity to refuse consent. Users may express consent using the appropriate settings of a web browser or other application, and only need to consent once.151

10.144  Exceptions. Two exceptions are recognised where the storage of, or access to, information is ‘technical’. First, the requirements do not apply if the storage or access is for the sole purpose of transmitting a communication electronically. Second, they do not apply if the relevant activity is strictly necessary to provide the internet service requested by the subscriber or user.152 Thus, although consent must normally be given before storing or accessing information, in a typical case (where a cookie needs to be set automatically upon visiting a website) that initial storage may be protected by the second exemption if the cookie is necessary to access and display the webpage correctly.
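The combined effect of regulation 6 and the two 'technical' exceptions can be sketched as a simple decision rule. This is an illustrative simplification only, and the function and parameter names are hypothetical:

```python
def storage_permitted(sole_purpose_transmission: bool,
                      strictly_necessary_for_requested_service: bool,
                      clear_information_given: bool,
                      consent_given: bool) -> bool:
    """Hypothetical sketch of the regulation 6 decision rule: storing or
    accessing information on terminal equipment is allowed if either
    'technical' exception applies, or if the user has been given clear
    and comprehensive information and has consented."""
    if sole_purpose_transmission:
        return True   # first exception: sole purpose of transmission
    if strictly_necessary_for_requested_service:
        return True   # second exception: strictly necessary for the service
    return clear_information_given and consent_given
```

On this sketch, a cookie strictly necessary to display the requested webpage may be set before consent is obtained, whereas a non-essential (eg analytics) cookie requires both clear information and consent.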

10.145  Right to opt out. Article 14(1)(b) of the Directive requires the conferral of an absolute right to opt out of processing for the purposes of direct marketing.

10.146  Prohibition of direct marketing. Additionally, article 13 of the PEC Directive prohibits the use of, among other things, electronic mail for the purposes of direct marketing, subject to two exceptions. First, direct marketing is allowed if the relevant subscribers or users have given their prior consent. Second, it is allowed where a customer’s email address is obtained ‘in the context of the sale of a product or a service’ and the direct marketing concerns ‘its own similar products or services’. In this case, customers must be given the opportunity to opt out of such marketing freely and ‘in an easy manner’.153 However, as Lewison J noted in Microsoft Corp v McDonald (t/a Bizads), the policy underlying the PEC Directive was to protect both subscribers and intermediaries whose networks would otherwise need to deal with large volumes of spam email.154 As such, the exceptions should be construed narrowly.

10.147  Absolute prohibitions. Additionally, and notwithstanding these exceptions, member states are obliged to prohibit three kinds of direct marketing by email:

(a)

first, emails that disguise or conceal the identity of the person on whose behalf they have been sent;

(b)

second, emails that contravene article 6 of the E-Commerce Directive, or which encourage recipients to visit a website that contravenes that article; or

(c)

third, emails without a valid address to which the user may send an opt-out request.155

10.148  Mandatory statements in communications. It will be recalled that, in order to comply with article 6 of the E-Commerce Directive (or regulation 7 of the 2002 Regulations in the United Kingdom), a commercial communication must clearly identify: (1) its commercial nature; (2) its sender; (3) any offers and their conditions; and (4) any competitions and their conditions.
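Drawing the threads of the preceding paragraphs together, the lawfulness of a direct marketing email might be summarised as the following checklist. Every predicate is a hypothetical simplification of the statutory tests, offered only to make their cumulative structure visible:

```python
def marketing_email_lawful(prior_consent: bool,
                           existing_customer_similar_products: bool,
                           easy_opt_out_offered: bool,
                           identity_concealed: bool,
                           valid_opt_out_address: bool,
                           complies_with_article_6: bool) -> bool:
    """Hypothetical checklist combining the two article 13 exceptions
    with the three absolute prohibitions described above."""
    # The absolute prohibitions apply notwithstanding any exception.
    if identity_concealed:
        return False  # sender's identity disguised or concealed
    if not valid_opt_out_address:
        return False  # no valid address for opt-out requests
    if not complies_with_article_6:
        return False  # contravenes article 6 of the E-Commerce Directive
    # Otherwise, one of the two exceptions must apply.
    if prior_consent:
        return True   # first exception: prior consent
    # Second exception: existing customer, similar products, easy opt-out.
    return existing_customer_similar_products and easy_opt_out_offered
```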

10.149  Definition of ‘commercial’ communications. A communication is ‘commercial’ if it is:

designed to promote, directly or indirectly, the goods, services or image of any person pursuing a commercial, industrial or craft activity or exercising a regulated profession, other than a communication...consisting only of information allowing direct access to the activity of that person...or...[where] the communication has been prepared independently of the person making it...

10.150 The key question is therefore the purpose of the person making the communication. If that purpose is to promote goods, services, or ‘image’, it will be commercial. The exceptions are narrow: information allowing direct access to the activity is generally limited to contact information, such as a link to a website, and must consist only of that information. Additionally, the service provider responsible for providing the information society service which makes the communication will need to comply with regulation 8 of the 2002 Regulations, which requires the clear and unambiguous identification of unsolicited communications as such.

10.151  Contractual exclusions. An enterprising service provider may seek to displace or restrict the operation of these provisions by providing otherwise in an agreement with the subscriber or user. This is not possible for two reasons. First, regulation 27 of the PEC Regulations provides that such a term in a contract with a subscriber shall be void insofar as it is inconsistent with a requirement of those Regulations. Second, there are obvious difficulties standing in the way of concluding a contract between a service provider and a user who is not a subscriber or member of a service (eg by means of terms of use on a website the user has never visited or in an email not yet received).

10.152  National security. The PEC Regulations contain several general exceptions in addition to the specific defences already discussed. First, there is a wide exception for national security in regulation 28. This appears to be broad enough to accommodate storage of the kind that occurs pursuant to the data retention regimes discussed in chapter 17, as well as access to stored information for that purpose.

10.153  Compliance with law. Second, regulation 29 creates exceptions to compliance where that would be inconsistent with duties under legislation or court orders, or would be likely to prejudice the prevention, detection, and prosecution of crime.

10.154  Enforcement of legal rights. Third, regulation 29 creates three important exceptions in relation to legal rights. These permit a service provider to do or refrain from doing anything that would otherwise be required by the PEC Regulations, including the processing of data, if that is necessary for the purposes of, or in connection with, any legal proceedings (including prospective proceedings), for obtaining legal advice, or for establishing, exercising, or defending legal rights. This is cast in identical terms to section 35 of the 1998 Act and should be construed identically.

10.155  Private proceedings. Enforcement is by two routes: private rights of action and administrative decision. First, article 13(6) of the PEC Directive directly confers upon any natural or legal person harmed by an infringement of the national rules on unsolicited mail, and having a legitimate interest in ceasing future infringements, a horizontal right to bring proceedings.156 The standing requirement under the PEC Regulations is simpler: the claimant need only have suffered damage. It is a defence for the defendant to show that he had taken ‘such care as in all the circumstances was reasonably required to comply with the relevant requirement’.157

10.156  Standing to sue. Both individual users and internet intermediaries who are harmed by a breach of the PEC Regulations possess standing to bring an action under regulation 30. This reflects the policy of the PEC Directive of protecting both consumers and networks, and the fact that many services suffer loss from spam. For example, in Microsoft Corp v McDonald (t/a Bizads), the claimant webmail provider obtained summary judgment against an individual who operated a website which sold lists of email addresses to ‘spammers’.158 The lists included email accounts operated by Microsoft’s Hotmail webmail service. The defendant claimed that all users had opted in to marketing emails, but decoy accounts set up by Microsoft also received these emails and had not consented.

10.157  Remedies. In McDonald, the Court granted an injunction restraining further breaches of the PEC Regulations and prohibiting the defendant from transmitting or instigating the transmission of spam emails to Hotmail accounts. The defendant had contravened the Regulations by ‘instigating’ the unsolicited communications, in the sense of urging or inciting purchasers of the lists to send the emails. Microsoft was said to suffer two kinds of loss: first, the cost of investing in additional servers to cope with the volume of spam email; and second, the damage to its goodwill that occurs when subscribers receive unwanted mail.159

10.158  Administrative enforcement. Additionally, the Commissioner may apply its usual enforcement powers under the 1998 Act to contraventions of the PEC Regulations, potentially in parallel with a private action.160 The applicable principles are considered in section 4.3.

10.159  Overview. The Directive requires all individuals to be guaranteed a number of rights with respect to personal data concerning them. These are, in summary, a right of access, a right of objection, and a right of rectification, erasure, or blocking. These rights are separate but overlapping, and there are distinct preconditions for exercising them. They are not absolute rights but instead require a balancing exercise to be undertaken so as to strike a fair balance between the fundamental rights of the data subject and relevant interests of the data controller and third parties. The rights are considered in turn in the following sections.

10.160  What may be obtained. Data subjects have the right to access data relating to them which are being processed by a data controller. This includes a ‘right to know the logic involved in the automatic processing of data’, subject to exceptions for the protection of trade secrets and intellectual property such as copyright in software.161 Article 12(a) provides:

Member States shall guarantee every data subject the right to obtain from the controller: (a) without constraint at reasonable intervals and without excessive delay or expense:

confirmation as to whether or not data relating to him are being processed and information at least as to the purposes of the processing, the categories of data concerned, and the recipients or categories of recipients to whom the data are disclosed,

communication to him in an intelligible form of the data undergoing processing and of any available information as to their source,

knowledge of the logic involved in any automatic processing of data concerning him at least in the case of the automated decisions referred to in Article 15 (1)...

10.161  Conditions for exercise of the right. The right of access is subject to several conditions: first, it is limited to personal data relating to the person making the request; second, requests are restricted to ‘reasonable intervals’; third, there may be a delay or charge involved for such access, provided it is not ‘excessive’ and does not otherwise amount to a ‘constraint’ on access; and fourth, the request is confined to specific information, which relates broadly to: the data being processed (if any), the source of those data (if available), the purposes of processing, and so on.

10.162  Protections for the rights of others. The right of access may also be restricted so as to protect the data subject or the rights and freedoms of others.162 For example, where the data undergoing processing include information identifying other data subjects, access could be limited or, if that information is inseparable from the whole, refused. This is consistent with the duty to protect data against unauthorised access by third parties.163 Similarly, under the 1998 Act the right does not afford access to privileged materials.164 It is suggested that, in any given case, a balancing exercise must be undertaken to arrive at a ‘fair balance’ between the right of access enjoyed by the data subject, and the relevant rights and freedoms of the service provider and other users. This exercise is essentially the same as the one applicable to articles 12(b) and 14 of the Directive.

10.163  Specific exemptions. In the United Kingdom, the 1998 Act also creates specific exemptions for professional references, the armed forces, judicial and counsel appointments, Crown employment, corporate planning and financial documents, academic results, and the privilege against self-incrimination.165 Those are not considered in detail by this work.

10.164  Terminology. European privacy law confers upon data subjects a limited right to removal or alteration of their personal data. This protection is sometimes erroneously referred to as a ‘right to be forgotten’, but there can of course be no right to compel others to ‘forget’. The reality is that data subjects have a more limited right to request the alteration, removal, or restriction of access (as appropriate) to specific data concerning them which are processed in contravention of data protection law. That right is exercisable subject to various limitations and conditions which are discussed in the following paragraphs by reference to the more neutral heading ‘rights of removal’.

10.165  Underlying policy. Rights of removal are alien to English law, which generally permits the publication of truthful material that is lawful, public, and does not infringe the property rights of another. The concept embodied in the Directive derives from the French criminal doctrine of le droit à l’oubli (the ‘right of oblivion’),166 which among other things permits rehabilitated offenders to object to continued publication of their spent convictions.167

10.166  Legislative basis. The right is set out in article 12(b) of the Directive, which provides:

Member states shall guarantee every data subject the right to obtain from the controller...(b) as appropriate the rectification, erasure or blocking of data the processing of which does not comply with the provisions of this Directive, in particular because of the incomplete or inaccurate nature of the data;...

10.167  Relationship to other rights. This right of removal is separate to the rights of access (see paragraph 10.160) and objection (see paragraph 10.183). In practice, an objection to processing is also likely to be accompanied by a request for erasure, since both are directed towards essentially the same objective.

10.168  Conditions for exercise of the right. Before a data controller will be obligated to give effect to a removal request, several requirements must be satisfied by the data subject:

(a)

Personal data. The request must relate to data which identify, or are reasonably likely to lead to the identification of, the individual making the request. Logically, the scope of the right is limited to such data and the data subject must specify those data with sufficient clarity and precision for the service provider to act.

(b)

Data subject. The request must be made by the data subject to whom the personal data relate.

(c)

Data controller. The recipient of the request must be a data controller with respect to the personal data that are the subject of the request.

(d)

Unlawful processing. At the time of the request, processing of the data must not be permitted by the Directive. This may be because:

(i)

the data do not comply with article 6 (eg because data are not adequate, relevant, necessary, accurate, up-to-date, or are excessive);

(ii)

the processing is not legitimate because it does not satisfy the article 7 conditions (eg because the legitimate interests pursued by the controller do not override the rights of the data subject); or

(iii)

the data fall into one or more of the special categories of data in article 8 (such as data revealing racial or ethnic origin, political opinions, philosophical beliefs, health, and so on) and none of the exemptions apply.

(e)

Appropriate response. As between rectification, erasure, and blocking, only the appropriate action need be taken. If erasure is sought, that must be appropriate in the circumstances.

10.169  The requirement of unlawful processing. Before a right arises under article 12(b), continued processing of the personal data must be such as not to comply with the Directive. If processing is compatible with the provisions of articles 6, 7, and 8, then there is no right to removal. The Directive gives two examples of why processing may be unlawful: incompleteness and inaccuracy of the data. Google Spain confirms that these examples are non-exhaustive.168 However, the processing must be ‘non-compliant’ in some way, by ‘non-observance of the other conditions of lawfulness’.169 As a result, there are potentially numerous ways in which processing might be unlawful, but exhaustively specified ways in which it might be demonstrated to be lawful.

10.170  Transposition. In the United Kingdom, rights of removal are dealt with in section 14 of the 1998 Act, which creates two forms of mandatory injunction.

10.171  Inaccurate data. The first, under section 14(1), arises where a court is satisfied that personal data are ‘inaccurate’. The Court has discretion to order the data controller to rectify, block, erase, or destroy those data, together with any other personal data which ‘contain an expression of opinion which appears to the court to be based on the inaccurate data’. Accuracy is judged objectively by reference to the true state of affairs, rather than against the data supplied by the data subject or a third party. However, if the data controller has nevertheless complied with its duty of accuracy (eg because it accurately recorded what it was given), then the Court may order a corrective statement to be published instead of removal.

10.172  Contraventions of the 1998 Act. The second right of removal, under section 14(4), arises where a court is satisfied that (1) there has been a contravention of the 1998 Act by a data controller; (2) the applicant data subject has suffered damage as a result; and (3) there is a substantial risk of further contraventions in respect of the same personal data. Where those requirements are satisfied, the Court has discretion to order rectification, blocking, erasure, or destruction of the data.170 Importantly, it appears that this remedy is not restricted to the situation where the contravention of the Act is committed by the respondent. This may go further than article 12(b) strictly requires.

10.173  Procedure. In both cases, the right arises only upon application by the data subject to a court. Both the High Court and the County Court have jurisdiction.171

10.174  The need for a judicial procedure. Section 14 does not appear to require a data controller to act unless and until the data subject’s rights have been adjudicated by the relevant judicial authority. This is arguably inconsistent with article 12, which confers a right to obtain ‘from the controller’ the removal sought without application to a court or administrative authority. Additionally, the remedies conferred by section 14 are narrower: section 14(1) applies only to inaccurate data and expressions of opinion based upon them, while section 14(4) requires both harm (which the Court of Justice has explained is not required for the right to arise) and a substantial risk of future contraventions.

10.175  Notification of third party recipients. Similarly, both rights of removal can be accompanied by an order to notify third parties to whom the data have previously been disclosed that the data have been rectified, erased, blocked, or destroyed, as the case may be.172 However, notification must be reasonably practicable, bearing in mind the number of persons who would have to be notified. Thus, in the case of most internet intermediaries, notification is unlikely to be practicable where more than a trivial number of identifiable individuals have accessed the material.

10.176  Form of order. The appropriate order will depend on the nature of the contravention. In cases where the right arises because data are not in fact accurate but have been correctly recorded by the controller, the appropriate course is likely to be to rectify the data or publish an appropriate notice detailing the data subject’s opinion, if this is possible, rather than to remove the material. Similarly, where the right arises because data have not been kept up to date, article 6(d) of the Directive specifies that the response shall be to take every reasonable step to ensure that the data ‘are erased or rectified’. This suggests that the data controller (or court) has discretion as between those two alternatives.

10.177  Claims for de-indexing of personal data. In Mosley v Google Inc, the High Court refused Google’s application for summary dismissal of claims for (inter alia) an order against Google under section 14.173 The claimant had previously had his privacy infringed by the publication of images and footage of private sexual activity,174 as a result of which certain of those materials had become indexed by Google’s search engine. The claimant’s solicitors had repeatedly requested the de-indexing of specific images and URLs, but this became ‘a Sisyphean task’ as new images were published and numerous others remained.175 In 2011 and 2014, the claimant accordingly gave written notices to Google that it was required to cease processing the images under section 10 of the 1998 Act. At the time, Google refused on the basis that it was not a data controller and the notices did not identify the personal data concerned.

10.178 In light of the intervening decision in Google Spain,176 it was common ground that Google was a data controller with respect to those materials, which were displayed in response to queries for the claimant’s name. Google argued that the safe harbours exempted it from compliance with an order under the 1998 Act. The Court rejected this defence, on the basis either that the data protection regime fell outside the safe harbours or, in any event, that injunctions remained available consistently with them.177 Accordingly, Mitting J concluded that the claimant had raised a viable claim which should proceed to trial. It appears that the claim was subsequently compromised out of court.

10.179  Blocking access to unlawfully processed data. Both section 14 and article 12(b) refer to the ‘blocking’ of unlawfully processed personal data as an available remedy. The question arises whether a court could make an order against an internet intermediary, such as an ISP or social network, to require it to block all access to a specified internet location operated by a third party on which personal data were being unlawfully processed. For example, a data subject in the United Kingdom might well wish to be able to compel major ISPs (or other network-layer intermediaries) to block access to a website on which sensitive information about her is being published.

10.180  Potential statutory basis. Section 14 appears to permit a blocking remedy in appropriate cases, though to date such a remedy has never been sought in the United Kingdom. The claimant will, of course, need to establish that the relevant service provider is a data controller with respect to the personal data that are sought to be blocked. This may be difficult in the case of an ISP which, although processing data in the sense of transmitting them, can hardly be said to determine the purposes for which they are processed, other than in the very general sense of being at liberty to block the request as a whole. On the other hand, the breadth of the approach taken in Google Spain suggests that this is at least arguable. More promisingly, a social network that processes and displays links to third party material may be said to carry out additional processing on that material in a way that makes it a data controller with respect to the links.

10.181  Proportionality. Data protection blocking orders are statutory creations and do not rest on any inherent power of the Court. However, both forms of injunction under section 14 are discretionary and do not follow as of right from a contravention of the 1998 Act. Like any discretionary injunctive remedy, it is suggested that proportionality is a highly material consideration. In this regard, similar considerations are likely to apply as to website blocking orders made in the context of copyright and trade marks.178 The most important factors in this context are likely to be the seriousness of the contravention, its consequences for the data subject, the likely effectiveness of the order, and the cost and disruption associated with implementing it. If, for example, the personal data are already widely disseminated via other sources, blocking may be futile and it would be inappropriate to grant relief. So too if the service provider did not have any technically suitable means at its disposal with which to block access at proportionate cost, if a more effective and cheaper alternative has not been explored, or if blocking would carry a risk of unintended harm to the intermediary’s network, systems, or other lawful material, then such an order would not be appropriate.

10.182  Appropriate cases. Blocking is most likely to be appropriate where the data controller does not have the ability to alter the data, but is a necessary link in the chain of dissemination. There is every reason to suppose that it may be a useful tool for halting the local dissemination of extremely harmful but otherwise irremovable material, such as cyber-harassment, hate speech, or ‘revenge porn’, all of which are likely to involve sensitive personal data. Claimants should, however, be alive to the risk of the ‘Streisand effect’ in the event that a blocking order is made.179

10.183 Additionally, article 14(1)(a) of the Directive requires member states to confer upon data subjects a right to object at any time to the processing of personal data about them. There must be ‘compelling legitimate grounds’ for such an objection. If upheld, the controller must cease processing the data.

10.184  Legislative basis. The right to object is framed in article 14 as follows:

Member states shall grant the data subject the right: (a) at least in the cases referred to in article 7(e)–(f), to object at any time on compelling legitimate grounds relating to his particular situation to the processing of data relating to him, save where otherwise provided by national legislation. Where there is a justified objection, the processing instigated by the controller may no longer involve those data...

10.185  Relationship to other rights. The relationship between article 12(b) and article 14 is additive:180 the data subject may rely upon either or both to found a request. They specify different conditions and have different limitations. However, the balancing exercise which must be undertaken to determine whether an objection under article 14 is justified is the same as the balancing exercise that must be undertaken under articles 7(e) and 7(f) to determine whether processing is legitimate under article 12(b). All the circumstances of the data subject’s request and his or her own situation must be taken into account in both cases.

10.186  Conditions for exercise of the right. Under the Directive, before a data controller will be obligated to give effect to an objection to processing, several requirements must be satisfied by the data subject:

(a)

Personal data. The request must relate to data which identify, or are reasonably likely to lead to the identification of, the individual making the request. Logically, the scope of the right is limited to such data. The data subject must specify those data with sufficient clarity and precision for the controller to act.

(b)

Data subject. The request must be made by the data subject to whom the personal data relate.

(c)

Justification. The data subject must at the time of his request show compelling legitimate grounds, as permitted by national law (in the United Kingdom, subject to the restrictions discussed at paragraph 10.191), based on his fundamental rights and freedoms under articles 7 and 8 of the Charter.

(d)

Balancing exercise. The controller (or tribunal) must carry out a balancing exercise to determine whether the fundamental rights and freedoms of the data subject should override the legitimate interests of the controller and third parties: see article 7(e) and 7(f) of the Directive.

(e)

Public interest. There must not be any other reason supported by the preponderant interest of the public for processing to continue.

10.187  Instigation of further processing. If the objection is justified, then future processing ‘instigated by the controller’ may no longer involve the data forming part of the request. The concept of instigation is not defined, but it appears that this would extend to any processing caused by the controller, including by giving a direction to a third party data processor.

10.188  Justification. In Google Spain, the Court held that processing of data by a search engine ‘is capable of being covered by the ground in article 7(f)’.181 In light of this decision, there is no reason in principle why any internet intermediary could not seek to justify processing in a given case—for example, where it is necessary for a public interest or for the legitimate interests of the intermediary or third parties. However, this is a fact-specific inquiry based on characteristics of the data subject and the specific data being processed. This will offer little solace to services that deal with millions or billions of users’ data each day.

10.189  The balancing exercise. Determining whether an objection is ‘compelling’ and based on legitimate grounds requires the tribunal to undertake a balancing exercise to weigh up the competing rights of the data subject, the data controller, and third parties. It is suggested that the familiar Campbell approach should apply, as this is the approach developed by the courts under national law to balance the competing rights of private citizens under articles 8 and 10 of the Convention and, by analogy, under articles 7, 8, and 11 of the Charter.

10.190 The nature of the article 7(f) balancing exercise was explained by the Court in Google Spain as ‘a balancing of the opposing rights and interests concerned’ having regard to the ‘significance’ of the data subject’s interests under articles 7 and 8 of the Charter.182 This exercise is intrinsically similar to the inquiry undertaken by courts in the context of claims for misuse of private information. In conducting the balancing exercise, one relevant factor is the extent to which the personal data in question are already available to the public.183

10.191  The need for substantial damage or distress. Article 14(1)(a) preserves the freedom of member states to restrict the circumstances that will be regarded as ‘compelling legitimate grounds’ for objection. In the United Kingdom, a much more limited right exists in the form of section 10 of the 1998 Act: an individual is only entitled to object to processing if its purpose or manner ‘is causing or is likely to cause substantial damage or substantial distress’, and such damage or distress is ‘unwarranted’.184 Additionally, the right to object does not apply where any of four lawfulness conditions is met: consent; performance or formation of a contract with the data subject; compliance with a legal obligation; or protection of the vital interests of the data subject.185

10.192  Determining whether damage or distress is ‘unwarranted’. For the purposes of section 10, it is submitted that the same Campbell balancing exercise should be used to determine whether the damage or distress would be ‘unwarranted’. This is consistent with the approach in Google Spain and reflects the requirement of article 12(b) to determine whether the objection is ‘compelling’.

10.193  Request procedure. In order to make a valid request within section 10, the data subject must do several things:

(a) First, the notice must be in writing and sent to the data controller. This could include an email message or online form.

(b) Second, the notice must contain the required particulars; namely, it must specify the personal data, specify the purpose or manner of processing to which objection is made, and specify valid reasons. For a host or search engine, it is probably a sufficient specification of the data to give a URL. The purpose or manner of processing need not be detailed, and could simply summarise the internet intermediary’s activity. Reference may usefully be made to its privacy notice, if any.

(c) Third, the data subject will need to explain why the processing is causing damage or distress and give some indication of why that is unwarranted.

10.194  Sources of guidance. According to ICO guidance, substantial distress will only be present where the data subject can demonstrate ‘a level of upset or emotional or mental pain that goes beyond annoyance and irritation, strong dislike or a feeling that the processing is morally abhorrent’.186 It is submitted that this is a hybrid mental state, which requires two components: the data subject must actually be distressed and so must hold that subjective state of mind; additionally, the level of perturbation must be ‘substantial’ when assessed against the standard of mental and emotional fortitude expected of a reasonable member of the public. Otherwise, a particularly frail data subject, or one suffering from serious mental illness, might legitimately be able to make requests which, to all other eyes, are spurious.

10.195  Response procedure. Upon receipt of a valid notification, a data controller has 21 days to respond. The response must state whether the controller accepts the request or not, and indicate to what extent it has complied or intends to comply with the notice. If the notice is materially unclear, a prudent course is to write to the data subject for clarification. This may provide some measure of costs protection in the event that the data subject proceeds to make an application to court prematurely. Where the recipient of a notice does not consider itself to be a data controller with respect to the data, it should refuse the request and explain that fact. However, where compliance would only require proportionate requests to be made of a third party data processor, those requests should be made, since the controller bears ultimate responsibility for ensuring compliance with the data protection principles (including respect for data subject rights).

10.196  Refusal or inability to comply. Section 10 does not explain what a data controller should do if it considers that the request is justified, but for whatever reason it cannot or will not comply. For example, compliance may require the internet intermediary to make major changes to its network infrastructure, may involve identifying and isolating the personal data relating to the data subject in a way that is simply not possible, or may require disproportionate cost or time to carry out. It is clear that the right to object is not an absolute right, but must be balanced against both the interests of the data controller and those of third parties. The cost and consequences of compliance are clearly relevant factors. As a result, a data controller in this position will need to consider carefully whether its interests would prevail in the balancing exercise and respond accordingly.

10.197  Judicial remedies. If a data controller refuses to comply with a notice to any extent, the data subject may apply to a court for an order that the data controller take such steps to comply with the notice (or any relevant part) as the court thinks fit.187 Before making such an order, the court must be satisfied of three things: first, that the notice was valid; second, that the controller failed to comply; and third, that it is appropriate to order the controller to comply.

10.198  Costs. Section 10(4) does not deal with the question of costs arising from a claim for an injunction against a data controller. It is suggested that the costs should follow the event, since if a data controller wrongly fails to comply with a notice, it has breached its data protection duties and is properly considered a wrongdoer.188 However, the costs of making and considering the notice of objection would appear to lie where they fall.

10.199  Overview. The rights of objection and rectification, erasure, or blocking were first considered by the Court of Justice in the context of an internet service in Google Spain SL v Agencia Española de Protección de Datos.189 That decision has already been discussed in paragraphs 10.42 and 10.76 in relation to the territorial and material scope of the Directive. However, in view of the importance of the decision for internet intermediaries, it may be useful to analyse the findings and reasoning of the Court more fully. In short, the Court held that search engines may be obliged to remove links to web pages containing personal data about a person from search results based on that person’s name, where the conditions of article 12(b) or article 14 are satisfied.190

10.200  National proceedings. The facts of the Google Spain case can be stated shortly. The reference was made by the Spanish Audiencia Nacional during proceedings between Google Inc (‘Google’) and its local Spanish subsidiary, Google Spain SL, and the Spanish data protection authority (‘AEPD’) and a complainant, C. C complained about newspaper articles published in the newspaper La Vanguardia in January and March 1998. Those articles concerned a real estate auction of C’s property in attachment proceedings arising from his unpaid social security debts. One of the articles may still be accessed online.191

10.201 The articles were initially available in printed form and were later republished online in the newspaper’s archives, which were indexed by Google and thereby made accessible to users of the search engine who searched for C’s full name. C later requested removal of the material from both the archive and Google. Both refused. The AEPD rejected C’s request against the newspaper, holding such publication to be lawful, but upheld his complaint against Google and Google Spain. They appealed to the national court, which referred several questions to the Court of Justice.

10.202  Questions referred. The Court interpreted the relevant question as being whether the rights under articles 12(b) and 14 of the Directive require a search engine to remove from the list of search results for a person’s name links to websites lawfully published by third parties containing information relating to that person. Google argued that the principle of proportionality required any request for removal to be addressed to the website operator in question, since only they can assess the lawfulness of the publication and have the most effective and least intrusive means of making the information inaccessible. C, supported by several governments, argued that a request should be addressable directly to a search engine operator, irrespective of whether the material is lawful, on the basis of his subjective preference for removal.

10.203 The Advocate General considered the ‘particularly complex and difficult constellation of fundamental rights’ at stake,192 and concluded that no general ‘right to be forgotten’ was to be found in the Directive or required by the Charter or Convention. The rights of users to receive information, and the freedom of search engines to carry on business and express information, should prevail. Moreover, in the Advocate General’s view, it would not be feasible to require search engines to undertake a balancing exercise in individual cases. Inevitably, this would result in ‘automatic withdrawal’ of material upon receipt of a request, or create an impractically large number of requests,193 many of which would relate to lawful material.

10.204  The role of search engines. Unusually, the Court rejected the Advocate General’s opinion and found that rights under articles 12(b) and 14 of the Directive could have the effect of requiring a search engine to remove hyperlinks to personal data. In the Court’s view, search engines enable internet users to develop a ‘detailed profile’ of a person which includes ‘a vast number of aspects of his private life’. Without a search engine, such information ‘could not have been interconnected or could have been only with great difficulty’. Search results were ‘ubiquitous’ and, as a result of this serious interference with privacy, cannot be justified by search engines’ economic interest in data processing.194 Although the ratio of the decision does not appear to be limited to search engines, this may afford a basis for distinguishing other internet intermediaries which do not make personal data available to a new public, or which do so only to a much more limited extent than the Google search engine.

10.205  The requirement of ‘fair balance’. The Court accepted that internet users have a legitimate interest in having access to information about a person. Accordingly, a ‘fair balance’ is required between the interests of the public and the fundamental rights of the data subject under articles 7 and 8 of the Charter. The Court described the required balance as follows:

Whilst it is true that the data subject’s rights protected by those articles also override, as a general rule, that interest of Internet users, that balance may however depend, in specific cases, on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life.195

10.206 On its face, this statement comes close to departing from the traditional view that neither interest automatically prevails over the other.196 What the Court appears to be saying is that the starting point is that the rights of a data subject mentioned in search results prevail over those of other internet users, subject to exceptions in specific cases. The better view, it is respectfully suggested, is that the inquiry should be concerned with ‘specific cases’ and must therefore depend on all the circumstances, including the nature of the information, the identity and conduct of the data subject, any legitimate interests in accessing and disseminating that information, and all the other highly nuanced factors considered in relation to privacy claims.197

10.207  Interests of the service provider. The balancing exercise also takes into account interests specific to the service provider, such as the purposes of processing and the consequences of that specific processing for the data subject. As the Court pointed out, the purposes and consequences are not necessarily the same, as between a search engine and a website operator. Accordingly, different intermediaries may stand in different positions in any balancing of rights.198 Extrapolating from Google Spain, relevant factors are likely to include: how widely the intermediary causes dissemination of the data; whether this results in the data being available to a new audience; and the nature of the service.

10.208  Evidence of loss or damage. The Directive does not expressly require a data subject to suffer any loss or damage from processing to which they object. As such, the objecting data subject does not need to show that the unlawful processing complained of would necessarily cause her any harm. On the facts of Google Spain, there was certainly no evidence that C was suffering any harm by reason of the processing.199 This being so, it is difficult to understand the relevance of a new audience without the existence of damage which is liable to be amplified by wider dissemination. Further, on the facts, the original article was published in order to attract bidders to the auction of C’s assets and intended to be disseminated as widely as possible—indeed, to do so was in C’s interests since it could have increased the sum recovered at auction.

10.209  Data processing over time. The Court appears to have accepted the submission that rights of objection and removal arise only where data processing is incompatible with the Directive. However, the Court noted that initially lawful processing may become unlawful over time. This may be because the data have become ‘inadequate, irrelevant or excessive in relation to the purposes of the processing’, have not been kept up to date, or have been kept ‘for longer than is necessary’ for their purpose of collection (unless required for historical, statistical, or scientific purposes). Processing may also be unlawful if data are or have become inaccurate, or if an earlier consent under article 7 is later withdrawn or does not cover the entire period of processing.

10.210  Assessing the lawfulness of processing. In the context of an intermediary which provides access to personal data, relevant factors are likely to include: the amount of time that has elapsed between collection of the personal data and the processing which results in access to it; the original purpose of collection; and the current status and attributes of the data. Ultimately, what must be shown is that processing is ‘incompatible with article 6(1)(c) to (e) of the Directive because that information appears, having regard to all the circumstances of the case, to be inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes of the processing at issue carried out by the operator of the search engine’.200 If that is the case, then the right to removal will arise.

10.211  Assessment of countervailing interests. Once a prima facie right to removal has been established, the balancing exercise described above must be carried out to determine whether the rights of the data subject should be preferred to those of the data controller and third parties. In Google Spain, the Court indicated that, in general, a person whose personal data was included in search results for his name would be entitled to removal, subject to ‘particular reasons’ why the interference with his rights may be justified by the ‘preponderant interest of the general public’ in having access to the information. An example given is ‘the role played by the data subject in public life’.201 To this one may add: prior conduct of the data subject; the nature of the information; the potential harm to the public if the information is not processed; and the potential benefits if it is.

10.212 On the facts of Google Spain, the Court indicated its view that C was likely to be entitled to removal of hyperlinks to the articles from search results for his name. The announcements contained information of ‘sensitivity’ to his private life, and were initially published over 16 years beforehand. Although strictly a matter for the national court to determine, there appeared to be no countervailing public interest arising from C’s position in public life. In those circumstances, C had a right to removal of the links to the articles under article 12(b) and a right to object to further processing. Assuming that the Audiencia Nacional reaches this conclusion on the facts, it will uphold the AEPD decision requiring Google to take all necessary steps to remove links to the two articles concerning C from its index.

10.213  Volume of requests: Google. The Google Spain decision has led to a substantial number of removal requests being made to Google,202 with 90,000 being received within 3 months of the decision and around 1,000 requests per day thereafter.203 According to material published by Google, many of the requests originating from the United Kingdom relate to fraudulent websites, convictions and arrests, and government and police data.

10.214  Removal statistics: Google. As at 22 November 2015, Google reported that it had received 346,403 requests relating to 1,226,971 URLs. Of the notified URLs, Google had removed 42 per cent.204 The proportion of requests acceded to in the United Kingdom was slightly lower, at 38 per cent. The most common source of de-indexed URLs was facebook.com, closely followed by groups.google.com (a newsgroup index), youtube.com, and twitter.com. The top 10 domain names accounted for 9 per cent of total de-indexing.

10.215  Volume of requests: other intermediaries. It appears that fewer requests are being made to other internet intermediaries. According to a report published by Wikipedia, it received 90 requests for the alteration or removal of data between January and June 2014, most of which were from Germany, the United Kingdom, and the United States.205 Wikipedia boasts that it has complied with none of these requests. There may be a number of reasons for the lower rate of requests to other intermediaries: first, data subjects may be unaware of the applicability of Google Spain to other forms of processing by non-search engine intermediaries; second, those intermediaries may not have developed or promoted their systems for delisting; and third, requests can be made directly to Google.

10.216  Volume of complaints. According to a publication by the ICO,206 472 complaints relating to search results had been received from data subjects in the United Kingdom as at 13 August 2015. The ICO required removal of the search results in 20 per cent of cases, and decided that the complaint was either ineligible or did not require de-indexing in the remaining cases. This illustrates that many complainants will not disclose valid requests for removal. However, in approximately one third of cases, the ICO has disagreed with Google’s first instance decision, usually on the basis of relevance.

10.217  Enforcement proceedings. In Hegglin v Persons Unknown & Google Inc, the claimant sought injunctive relief under sections 10 and 14 of the 1998 Act. The claimant was a businessman and investor who complained of abuse and defamatory allegations on various websites and discussion fora. The material appeared in Google search results for his name. Google had removed certain of the content from Google-hosted websites and blocked other URLs from search results. Bean J granted permission to serve outside the jurisdiction, concluding that there was an arguable case that Google was under an enforceable obligation to comply with the 1998 Act when processing the claimant’s personal data. The Court had been referred to Google Spain and cited it as authority for the proposition that Google is a data controller and subject to the Directive.

10.218  ICO enforcement notices. In August 2015, the ICO issued an enforcement notice to Google requiring it to de-index 9 search results from searches for another complainant’s name.207 The results contained information relating to a spent conviction for a minor criminal offence committed by the complainant 10 years earlier. Google had previously removed other links to the conviction. However, after that removal became a news story in its own right, Google refused to de-index further links to news articles which repeated the conviction in the context of reporting on the removal.