The Liability of Internet Intermediaries

Contents

1 Self-regulation of internet content 11.03
  1.1 Platform acceptable usage policies 11.05
  1.2 Search engine policies 11.09
  1.3 Carrier terms of service 11.10
2 Statutory regulation of internet content 11.11
  2.1 Malicious communications 11.11
  2.2 Obscenity 11.30
  2.3 Unclassified video recordings 11.35
  2.4 Terrorist publications 11.46
  2.5 Classified official secrets 11.56
  2.6 Child sexual abuse material 11.62
  2.7 Equality 11.67
  2.8 Gambling services 11.68
  2.9 Ticket resale 11.74
  2.10 Tobacco products 11.78
  2.11 Private vehicle hire services 11.88
  2.12 Self-harm 11.96
3 Contempt of court 11.97
  3.1 Primary liability for contempt 11.97
  3.2 Accessory liability of secondary publishers 11.108
  3.3 Assisting a contempt of court 11.110

11.01 This chapter addresses the liability of internet intermediaries for publishing and transmitting ‘unlawful material’. That phrase is used as a broad catch-all to describe information which is considered illegal, whose dissemination is normally prohibited under the criminal law of the United Kingdom, or in which dealings may be unlawful if the prescribed regulatory requirements are not satisfied.

11.02 The reasons why material can be unlawful are as varied as the topics that such information can concern. Some material is prohibited because of attributes intrinsic to its nature or purpose, so that any dealings with such material are inherently odious. Other provisions reflect concern to regulate communications about certain goods or services, or to prevent harm to national security. Although it cannot possibly address every example of regulated content, this chapter deals with twelve categories of material by way of example. Most of these restrictions apply regardless of medium, but some apply specifically to internet publications such as websites.

11.03  Importance of self-regulation. Before turning to the myriad ways in which internet content is criminalised by statute, attention should be paid to the self-regulation of content by internet intermediaries and their users. Self-regulatory schemes are both more widespread and far-reaching than their statutory counterparts, and it is normally quicker and more cost-effective to report problematic content by means of such a scheme than by invoking a judicial dispute-resolution process or launching a criminal investigation. This is especially true where the content complained of has been published to a large application-layer intermediary, such as Facebook, Google, or Twitter, which tend to enforce their stated content and usage policies promptly and effectively, and to offer simple tools for reporting content that violates those policies. For this reason, such services offer complainants a useful starting point for securing the removal of harmful or offensive content at its source.

11.04  Intermediaries as de facto regulators. Conversely, the content policies of internet intermediaries are themselves largely unregulated. They often go beyond the boundaries of what would be considered unlawful under English law, to cover ‘offensive’, ‘inappropriate’, ‘objectionable’, and other vaguely defined categories of material. These policies give ultimate discretion to internet intermediaries both to define what is and is not permissible on their services, and to determine when to enforce the policy against users. This gives rise to a degree of variance between platforms, often between platforms and national law, and sometimes between complainants. The regulatory power of service provider policies reflects their emergence as de facto transnational standards for internet content and conduct.

11.05  Prohibition of unlawful conduct. All of the major application-layer platforms require their users to agree to policies that set out what may be published and how their services may be used. Typically, these policies contain broad prohibitions on content which is illegal, tortious, or otherwise contrary to the laws of the relevant country (usually that specified in the policy’s choice of law clause). For example, the Facebook Statement of Rights and Responsibilities prevents users from using Facebook ‘to do anything unlawful, misleading, malicious, or discriminatory’,1 while the Twitter Terms of Service require users to use Twitter ‘only in compliance with these Terms and all applicable local, state, national, and international laws, rules and regulations’.2

11.06  Prohibited content. Additionally, platform policies define specific categories of material whose publication is prohibited. These tend to include unsolicited commercial messages (spam), hate speech, harassment, threats of violence, content which infringes intellectual property rights, malicious code (such as viruses), pornography or (in the case of Facebook and Google Plus) nudity, content depicting graphic or gratuitous violence, content which interferes with the operation of the platform, and content ‘that infringes or violates someone else’s rights or otherwise violates the law’.3

11.07  Regulated trades. Businesses wishing to promote goods and services that are regulated under national law will need to take special care to comply with platform policies. Examples include pharmaceuticals, alcohol, gambling, tobacco, fireworks, weapons, and medical devices. Some platforms prohibit these promotions outright; others, such as Google Plus, simply clarify that the promoter bears responsibility for placing any relevant age and geographical restrictions upon access, and preserve the platform operator’s right to remove non-compliant promotions.4

11.08  Examples of other policies. Examples of application-layer platforms that operate acceptable usage policies include YouTube,5 Google Plus,6 Blogger,7 LinkedIn,8 and Tumblr.9 The Google Plus policy additionally prohibits materials which ‘manipulate ranking or relevancy using techniques like repetitive or misleading keywords or metadata’.10 Summaries of the most common policies are available here.11

11.09  Prohibition of unlawful conduct. Search engines such as Google and Bing also require their users to agree to terms of use which set out how those indices may be used. The Google Terms of Service provide that its services may be used ‘only as permitted by law’.12 The Microsoft Services Agreement (which applies to the Bing search engine) sets out content and actions which are not permitted, including use of its services ‘to do anything illegal’, to infringe the rights of others, and to engage in activity which is false or misleading or which harms the services or harms others.13 Those policies also deal with, among other things, the circumstances in which the search engine may at its discretion remove unlawful content; these are discussed elsewhere in relation to de-indexing.14

11.10 Similar restrictions may be found in the terms of network-layer services, such as ISPs, mobile carriers, and network operators. For example, all major United Kingdom mobile and fixed-line carriers prohibit accessing and transmitting unlawful material. Their terms of service are too numerous to address comprehensively in this work, and the reader is directed to the carriers’ websites for further information.

11.11 Malicious statements conveyed with the prescribed intent are rendered unlawful by a number of statutory provisions.

11.12  Offensive and obscene messages. First, section 127(1) of the Communications Act 2003 makes it an offence to send by means of a public electronic communications network a message or other matter that is ‘grossly offensive’ or ‘of an indecent, obscene or menacing character’, or to cause such a message or matter to be so sent.

11.13  Harassing messages. Second, section 127(2) of the 2003 Act makes it an offence to send by the same means a message that is known to be false and is ‘for the purpose of causing annoyance, inconvenience or needless anxiety to another’, or to cause any such message or matter to be so sent, or persistently to make use of a relevant network for that purpose.

11.14  Definition of ‘public electronic communications network’. The phrase ‘public electronic communications network’ is defined in sections 32 and 151(1) of the 2003 Act and plainly includes the internet. In summary, it refers to a network provided wholly or mainly for the purpose of making electronic communications services available to members of the public. Such services, in turn, refer to services whose principal feature is the conveyance by electrical means of ‘signals of any description’.

11.15  Application to social networks. In Chambers v Director of Public Prosecutions, Twitter was held to fall within the ambit of this definition.15 By analogy, virtually any other internet service is likely to constitute an electronic communications service. Similarly, in Rhodes v OPO (by his litigation friend BHM), Lord Neuberger explained that the ambit of section 127 is ‘limited to electronic communications and appears to give rise to no civil liability.’16

11.16  Hoaxes and abuse. Section 127 has been primarily invoked in prosecutions against individual members of the public who have sent threats, hoaxes, and abusive messages by means of internet services such as Twitter and Facebook. To date more than 1,200 prosecutions have been brought.17 Of these, Chambers (popularly referred to as the ‘Twitter joke case’) is the best known. In response to criticism, the Crown Prosecution Service issued guidance restricting the circumstances in which prosecutions would be brought against social media users.18 Readers are directed to a specialist criminal law text for further discussion of the ambit of these provisions against individuals.

11.17  Availability of injunctions against intermediaries. Two examples from Northern Ireland demonstrate how social networks can face non-monetary liability to remove materials which constitute unlawful harassment or otherwise interfere with an individual’s article 3 and article 8 rights. In XY v Facebook Ireland Ltd, the High Court held that a Facebook page entitled ‘Keeping Our Kids Safe from Predators’ created a real risk of infringing the claimant sex offender’s rights to freedom from inhuman and degrading treatment and to respect for private and family life. The Court ordered Facebook to remove the page in question (designated by its URL), but refused to order it to monitor the site to prevent similar material from being uploaded in the future. Such an injunction would have imposed a disproportionate burden and required excessive judicial supervision.19

11.18 In another Northern Irish case, AB Ltd v Facebook Ireland Ltd, the same judge commented that the criminal law was proving inadequate to address the problem of malicious use of social networking platforms:

The misuse of social networking sites and the abuse of the right to freedom of expression march together. Recent and pending litigation in Northern Ireland confirms that, in this sphere, an increasingly grave mischief confronts society. ... The solution to this mischief is far from clear and lies well beyond the powers of this Court. Self-regulation and/or statutory regulation may well be necessary. In the meantime, this unmistakably pernicious evil is repeatedly manifest. Recourse to the courts for appropriate protection and remedies is an ever expanding phenomenon.20

11.19  Threshold of seriousness. It should not be forgotten that, before any message can fall within the section 127(1) offence, it must involve a credible or serious statement: as Lord Judge explained in Chambers, a message which does not create fear or apprehension in readers would lack menace and fail to satisfy the actus reus of the offence.21 In this regard, context is important. The comments of Eady J in Smith v ADVFN plc are apt, where (in the context of defamation) he described internet bulletin board postings as:

like contributions to a casual conversation (the analogy sometimes being drawn with people chatting in a bar) which people simply note before moving on; they are often uninhibited, casual and ill thought out; those who participate know this and expect a certain amount of repartee or ‘give and take’.22

To similar effect, Baroness Hale commented in Majrowski v Guy’s and St Thomas’ NHS Trust that ‘[a] great deal is left to the wisdom of the courts to draw sensible lines between the ordinary banter and badinage of life and genuinely offensive and unacceptable behaviour’.23 Similarly, Lord Nicholls described ‘the boundary between conduct which is unattractive, even unreasonable, and conduct which is oppressive and unacceptable’.24 Although these comments were made in the context of a civil claim for harassment, they apply with equal force to section 127.

11.20  Offences by intermediaries. It will be rare indeed that an internet intermediary falls within the first, ‘sending’ limb of the section 127 offences. Of potentially greater interest are the ‘causing’ offences. However, section 127(1)(b) and (2) require the defendant to act with a specified purpose, which entails a specific intent (to send a message of a menacing character, and to cause annoyance, inconvenience, or anxiety, respectively).25 This contrasts with section 127(1)(a), which is an offence of basic intent. Accordingly, what must be shown is that the defendant had, at the time the message was sent, actual knowledge of its contents and either intended to cause the proscribed result, or was reckless as to whether the message was likely to produce that result in a reasonable member of the public who read or saw it.

11.21 As a result, it seems doubtful, certainly in the vast majority of cases, that an internet intermediary could be said to know the contents of a message or other matter, let alone to harbour any intent one way or the other about the purpose of sending it. Mere conveyance of a message will not be enough, for the reasons Lord Bingham explained in Director of Public Prosecutions v Collins:

Parliament cannot have intended to criminalise the conduct of a person using language which is, for reasons unknown to him, grossly offensive to those to whom it relates...On the other hand, a culpable state of mind will ordinarily be found where...facts known to the sender of the message about an intended recipient render the message peculiarly offensive to that recipient, or likely to be so, whether or not the message in fact reaches the recipient.26

Consequently, it remains an open question whether an internet intermediary that is notified of the facts which render a message offensive or menacing will be reckless as to the future effects of that message when it is retransmitted after notification. Another possibility is that injunctive relief may be sought pursuant to an equitable duty not to facilitate criminal wrongdoing by third parties.27

11.22  Hate speech. Third, section 1 of the Malicious Communications Act 1988 makes it an offence to send with the prescribed intent an electronic communication which is, in whole or part, ‘of an indecent or grossly offensive nature’, or which conveys such a message, a threat, or false information. Originally designed to address paper-based hate mail, this provision has since been amended to apply to internet communications.

11.23  Definition of ‘electronic communication’. Section 1(2A) defines ‘electronic communication’ to include any communication, however sent, that is in electronic form. This is a broad definition that clearly encompasses TCP/IP transmissions sent by means of the internet. To ‘send’ is similarly broadly defined: by section 1(3), it includes transmitting, and causing to be sent or transmitted. Accordingly, the ‘sender’ of a message could be argued to include an internet intermediary which transmits messages provided by its users. However, like section 127 of the 2003 Act, section 1 has mostly been invoked against individuals who have made hateful or abusive postings to Twitter and other social networks. Invariably, these cases have involved charges against those who authored and uploaded the relevant communications, rather than the services that hosted or transmitted them.

11.24  Required mental state. Although the drafting of section 1 is unhelpfully vague, the required mens rea is narrow. What must be shown is that at least one of the defendant’s purposes in sending the message was to cause distress or anxiety to its intended recipient. Again, this requires specific intent. The statutory language seems to preclude liability where the defendant is reckless as to whether or not the message would have that effect. It is inherently unlikely that an internet intermediary which transmits a message on behalf of a third party sender could be said to intend a particular effect upon its recipient. As such, section 1 would rarely apply to network- or application-layer services that do not themselves author messages that are transmitted.

11.25  Calculated or deliberate facilitation. A service which was designed expressly for the purpose of enabling offensive or anonymous communications to be sent, but which did not itself author or pre-moderate any of those communications, may fall closer to the line. It is suggested that such a service would still lack the required intent and purpose in relation to the actual message that was sent by the primary wrongdoer to the recipient.

11.26  Identifying a course of conduct. Fourth, communications which target a specific individual or group can amount to harassment or stalking by the sender of the communication under section 2 of the Protection from Harassment Act 1997. To do so, the communications would, taken together, need to form a relevant course of conduct falling within that provision, which normally requires something ‘oppressive and unacceptable’.28 Harassment also creates a civil cause of action against the harasser: section 3(1). Section 3(3)(a) permits the Court to grant an injunction ‘for the purpose of restraining the defendant’ from pursuing harassing conduct, but does not appear to contemplate injunctive relief against third parties.

11.27  Harassment on social media. In Brand v Berki,29 a well-known comedian and actor obtained an anti-harassment order against a masseuse under the 1997 Act. The defendant had sent ‘a concerted campaign of emails’ to the claimants and journalists, and had posted various allegations to Twitter and other websites. The order prevented her from communicating with the claimants or from publishing allegations about the claimants to third parties. Although that order only binds the defendant in personam, it could, if served on Twitter or another service provider, be used to compel the removal of material published by the defendant in breach of the order.30 Such intermediaries may well choose to comply with a valid court order relating to content stored or transmitted using their services, even if it is not directly addressed to them.

11.28  Threats to kill. If a message contains a threat to kill a person it could fall within section 16 of the Offences against the Person Act 1861.31 Again, this provision is unlikely to impose primary liability upon any internet intermediary absent exceptional circumstances.

11.29 Because many malicious communications are published anonymously, it will frequently be necessary to identify the user of an internet service for the purpose either of prosecuting that individual or bringing a civil cause of action (such as defamation or malicious falsehood). Chapter 17 discusses the powers available to law enforcement and investigatory agencies; Chapter 4 discusses the private processes which may be used to unmask such a wrongdoer.

11.30  Definition of ‘obscene’. Section 2 of the Obscene Publications Act 1959 makes it an offence to publish an ‘obscene article’. Obscenity is defined in section 1(1) to mean an article whose effect, taken as a whole, ‘is such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it.’ By way of indicative guidance, the Crown Prosecution Service cites as examples realistic depictions of torture, rape, or dismemberment.32

Readers are referred to a specialist work for further discussion of the scope of primary liability.33

11.31  Scope of publication. A person ‘publishes’ an article in the circumstances specified in section 1(3), which include distributing, circulating, and transmitting data stored electronically. This broad definition goes beyond the scope of common law publication, since it includes electronic transmissions which, at least since Bunt v Tilley, would not amount to publishing material for the purposes of defamation. It is clear, for example, that a mere transmission of data constitutes publication, whether or not it is seen by any person. Thus, there is publication both when obscene material is uploaded and when it is downloaded,34 copied, indexed, backed up, or otherwise dealt with electronically.

11.32  Website access restrictions. Because the meaning of obscenity depends in part upon the identity of persons by whom material is likely to be encountered, it is important to consider any technical or other restrictions placed on access to material. For example, many social networks require visitors to confirm their age upon registration, which may mean that material hosted by them is less likely to be seen by children below the relevant age. Purveyors of adult material such as pornography may implement more sophisticated age verification mechanisms involving credit card checks. A failure to restrict access to adult material, or to do so effectively, could result in a finding of obscenity if that failure results in children being likely to access the material, even if the material would not otherwise tend to deprave or corrupt the public at large.35

11.33  Defences. Of greatest interest to internet intermediaries is section 2(5), which creates a defence where a person proves that he or she had not examined the article in question and had no reasonable cause to suspect that it was obscene. This seems likely to exonerate many intermediaries, including ISPs and hosts, at least until the point of being notified of obscene material that is being stored or transmitted by means of their service.

11.34  Content hosted abroad. Difficult jurisdictional questions arise where obscene material is hosted on a server located outside the United Kingdom. Applying the reasoning in R v Perrin,36 jurisdiction seems likely to exist if material is either uploaded or downloaded within the United Kingdom regardless of where it is stored. In that case, although not entirely clear, the material appears to have been hosted on an American server. If this reasoning continues to be followed, then foreign internet material could be the subject of an offence under the 1959 Act if it has been accessed in or posted from the United Kingdom.

11.35  Overview. Section 9(1) of the Video Recordings Act 1984 makes it an offence to supply or offer to supply a video recording containing a video work that has not been issued a classification certificate by the British Board of Film Classification, unless the supply or the work is specifically exempted. This offence is not restricted by medium and therefore applies equally to internet intermediaries. The section 9 offence is indictable.37

11.36  Supply via marketplaces and platforms. Many internet intermediaries cause video works to be supplied or permit third parties to supply them (eg in a marketplace or on a social video-sharing platform). Although no prosecutions have been brought against such services for contraventions of section 9, there remains the (largely theoretical) possibility that such marketplaces and platforms could be said to engage in a ‘supply’ of any unclassified work that is sold by them. ‘Supply’ is defined in section 1(4) to mean ‘supply in any manner, whether or not for reward, and, therefore, includes supply by way of sale, letting on hire, exchange or loan’. This was intended to be a ‘wide-ranging’ definition, designed to apply to both retailers and those involved at all levels in the chain of production and distribution of recordings.38

11.37  Scope of ‘supply’. The concept of a ‘supply’ is a fundamental feature of the regulatory scheme created by the 1984 Act. It is referred to in many places throughout the Act and is the basis for its six main offences. There is a substantial body of case law interpreting the word ‘supply’ in other legislation. These authorities tend towards the view that supply requires the supplier to have had, and to part with, physical possession or control of the goods to the transferee for his or her own purposes. For example, in R v Delgado the accused had possession of prohibited drugs which he intended to give back to two third parties. To do so was held to be a ‘supply’ under section 5(3) of the Misuse of Drugs Act 1971 because it would involve ‘a transfer of physical control of [goods] from one person to another’ for the benefit of the transferee.39 While this did not depend on any analysis of ownership or legal possession, it did require possession of the drugs.

11.38  The need for physical possession. In R v Maginnis, a majority of the House of Lords endorsed the general approach of Delgado that possession of goods is a necessary but insufficient condition of their supply. As Lord Keith explained:

The word ‘supply’, in its ordinary natural meaning, conveys the idea of furnishing or providing to another something which is wanted or required in order to meet the wants or requirements of that other. It connotes more than the mere transfer of physical control of some chattel or object from one person to another.40

11.39  Offers to supply by means of a third party. The Court of Appeal reviewed these authorities in Interfact Ltd v Liverpool City Council in the context of joined appeals against convictions under section 12 of the 1984 Act. It concluded that ‘“supply” has a wide meaning’, not being limited to contractual principles of offer and acceptance. On this basis, distribution of a printed catalogue amounted to an offer to supply by the person who would deliver the video recording, regardless of the ‘underlying transaction’.41 The Court ultimately adopted a purposive interpretation of the Act, noting that its purpose was ‘to prevent children viewing unsuitable material’ and so the scope of the Act’s restrictions ‘should be interpreted with a view to that purpose being achieved’.42 The purpose of section 9 is analogous to, and arguably even stricter than, that of section 12.

11.40  Application to internet intermediaries. In light of this broad approach, there is a considerable prospect that it could be applied to application-layer intermediaries who operate services for supplying video works. Chief among the factors to be considered would be whether the service itself supplies the goods, or merely facilitates their supply by a third party (eg by listing them and acting as a payment intermediary). If the marketplace never takes possession of the chattels embodying the video works, then it might be said that they cannot supply them in the sense explained by Lord Keith in Maginnis.

11.41 However, it is conceivable that a marketplace may involve itself in the terms of a transaction sufficiently closely that it can be considered party to an ‘offer to supply’ the goods. For example, it might offer to accept payment from the buyer on behalf of the seller, and may offer to guarantee the quality or delivery of the goods in some way. In this regard, it remains an open question whether platform-supplied insurance or money-back guarantees would be sufficient to involve those services in any transactions undertaken by users with marketplace sellers.43

11.42  Defences. Section 14A creates a defence of reasonable care to the section 9 ‘supply’ offence. This recognises the impracticability of secondary suppliers verifying the contents of all video recordings that they supply. Section 14A exonerates non-primary parties who can prove:

(a) that the commission of the offence was due to the act or default of a person other than the accused, and

(b) that the accused took all reasonable precautions and exercised all due diligence to avoid the commission of the offence by any person under his control.

11.43 In creating a negligence-based standard, section 14A requires internet intermediaries to act reasonably but no more. Hansard records the purpose of section 14A:

It is unreasonable to expect retailers to check the contents of every cassette. That would be especially onerous for proprietors of video libraries who would not have the resources to check every cassette on its return to the library to ensure that the borrower had not altered the programme . . .44

11.44  No requirement of monitoring. In Bilon v WH Smith Trading Ltd, Latham LJ rejected a suggestion that a retailer was required to make inquiries of third party publishers or make random checks of publications they carried for retail sale in order to rely on section 14A. In that case, the relevant publisher was ‘a reputable publishing company’ with whom the retailer had traded for 20 years. Random checks would not ‘provide any significant protection’ and checks of every item were ‘unrealistic’. Accordingly, the retailer had acted reasonably and could rely upon the defence.45

11.45  Reasonable care by intermediaries. Similar considerations are likely to apply to internet intermediaries, at least up until the point at which they are notified of unclassified material. The situation is possibly analogous to innocent dissemination of defamatory material: once a secondary publisher is put on notice that they are publishing defamatory statements, it would not be reasonable to continue publishing them—so too for supplying unclassified video recordings.46 The same result is likely to be achieved by the storage safe harbour.47 Any more stringent duty may well fall foul of the prohibition on general monitoring set out in article 15(1) of the E-Commerce Directive.48

11.46  Actus reus for publication. Section 1 of the Terrorism Act 2006 makes it an offence to publish or cause another to publish a statement that is likely to be understood by some or all of the relevant public as an encouragement or other inducement of certain acts of terrorism. Publication of a statement is defined in section 20(4) to mean publishing it in any manner to the public, including by providing electronically any service by means of which the public have access to the statement, or using such a service to enable or facilitate access to the statement. Accordingly, section 1 applies to internet intermediaries.

11.47  Mens rea for publication. The required mens rea is intention or recklessness as to encouraging or inducing members of the public to commit, prepare, or instigate the relevant acts: section 1(2)(b). This is narrow and appears to limit the scope of the offence to those with some degree of editorial responsibility for the statement. However, as for obscene materials, it is an open question whether an internet service on notice of a terrorist publication can eventually be said to become reckless about the relevant effect on the public.

11.48  Dissemination offence. Section 2 of the 2006 Act creates a separate dissemination offence. Like section 1, it requires the defendant to intend or be reckless as to the prescribed consequences and is therefore of limited application to neutral internet services. However, it is broader than section 1 in that the mens rea includes the intentional or reckless provision of assistance in relation to acts of terrorism: section 2(1)(b)–(c). It is also broader in that the statutory definition of publication does not apply. Instead, dissemination is stated to include (among other things) distributing or circulating a publication (namely an article or record of any description containing relevant matter), providing a service enabling others to obtain or consume it, and transmitting its contents electronically. Again, this appears to encompass many acts by internet intermediaries at the network and application layers.

11.49  Enforcement notices. Sections 3 and 4 establish a notification framework for internet intermediaries involved in publishing or disseminating terrorism-related material. In broad terms, these provisions allow a member of the police to put an internet (or other electronic) service on notice of unlawful material, and to require its removal or alteration within 2 working days, failing which the service provider will be deemed to give future publications of the material his endorsement.

11.50  Comparison to notice-and-takedown. This notification framework establishes a regime of strict liability and goes beyond voluntary notice-and-takedown in civil matters in several ways. First, the framework applies to any ‘service provided electronically’. This would include all internet intermediaries (and information society services). Relevant publications under sections 1 and 2 of the 2006 Act are brought within the ambit of section 3 when they occur ‘in the course of, or in connection with, the provision or use of’ such a service.49

11.51  What must be notified. Second, a notice must satisfy certain mandatory requirements, including a warning of the consequences of not complying with the notice. Under the framework, notices may be sent only by a constable, and only by a method prescribed by section 4: hand delivery to the person or the secretary or equivalent officer of a body corporate, or by recorded post to an individual’s last known address or a corporation’s registered office.

11.52  Consequence of failure to act. Third, the effect of non-compliance is automatic. Where a notice has been validly given to a service provider who has failed, without reasonable excuse, to comply with the notice within 2 working days, the relevant statement, article, or record to which the notified conduct relates ‘is to be regarded as having the endorsement of’ the relevant person.50 The consequence is that the service provider will, as a matter of law, be prevented from relying upon the non-endorsement defence in sections 1 and 2.

11.53  Repeat publications. Fourth, notification activates an obligation which approaches ‘notice-and-stay-down’. Where a valid notice has been received and complied with, but the service later publishes a repeat statement (one ‘which is, or is for all practical purposes, the same or to the same effect as’ the notified statement), then the service provider will equally be deemed to give its endorsement to the repeat statement. This is subject to a defence where the intermediary can show that it had, before the time of the repeat statement, ‘taken every step [it] reasonably could to prevent a repeat statement from becoming available to the public and to ascertain whether it does’, was not aware of the statement at the time of publication, and took all reasonable steps to remove or alter the statement once it became aware of it.
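The operation of this notification framework can be modelled schematically. The following Python sketch is purely illustrative: the function and parameter names are invented, and the two-working-day period, deemed endorsement, and repeat-statement defence are simplifications of the statutory language summarised above.

    def deemed_endorsement(notice_valid: bool,
                           complied_within_two_working_days: bool,
                           reasonable_excuse: bool) -> bool:
        # Sections 3-4 (as summarised above): a provider who receives a
        # valid notice and fails, without reasonable excuse, to comply
        # within two working days is deemed to endorse the statement.
        if not notice_valid:
            return False
        if complied_within_two_working_days or reasonable_excuse:
            return False
        return True

    def repeat_statement_endorsed(is_repeat: bool,
                                  took_every_reasonable_preventive_step: bool,
                                  aware_at_time_of_publication: bool,
                                  removed_or_altered_once_aware: bool) -> bool:
        # A statement which is 'for all practical purposes' the same as
        # the notified statement is also deemed endorsed, unless the
        # defence summarised in 11.53 is made out.
        if not is_repeat:
            return False
        defence = (took_every_reasonable_preventive_step
                   and not aware_at_time_of_publication
                   and removed_or_altered_once_aware)
        return not defence

On this model, the practical effect of a valid notice is to shift the service provider from a position of presumptive innocence to one in which the continued availability of the material, or of materially identical repeats, must be actively justified.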

11.54  Potential criminal liability. Non-compliance with a notice under section 3 does not of itself create criminal liability. It would still need to be shown that the addressee of the notice satisfied the required mens rea for an offence under sections 1 or 2, and that all the other elements of the relevant offence were satisfied. However, based on knowledge of the material and failure to remove it, a charge of recklessness may become more likely, and (apart from the mere conduit safe harbour and prohibition on general monitoring) other limitations upon liability would be removed. The risk of recklessness would depend on all the circumstances and, in particular, upon the nature of the published material.

11.55 Schedule 8A of the Terrorism Act 2000 restricts the publication of information about members of the military and intelligence services. It contains provisions which extend primary liability to information society service providers, subject to safe harbours in similar terms to the 2002 Regulations. It is unclear why this duplication is necessary, given the otherwise horizontal nature of the safe harbours.51 This same criticism applies to the other subject-specific extensions and limitations of liability discussed elsewhere in this chapter.

11.56  Publications threatening national security. A court may grant an injunction to prevent a threatened or continuing breach of the Official Secrets Act 1989. This type of injunctive relief is sometimes described as the exercise of a specific ‘power to deal with publication which threatens national security’,52 though no such power seems to be expressly conferred by the Act. The better view is that these injunctions are an incident of the Court’s inherent jurisdiction to restrain illegal activity.

11.57  Contra mundum injunctions. In Attorney-General v Punch Ltd, Lord Phillips MR commented obiter that ‘where national security is at risk it should be open to the Crown to obtain an injunction binding on all the world’.53 However, referring to the ‘well established liability’ of third parties who assist in the breach of an injunction so as to interfere with the course of justice, his Lordship considered that more targeted injunctive relief was appropriate in that case:

If a third party assists the confidant to breach the injunction by publishing information supplied by the confidant, whether directly or indirectly, the third party will be in contempt of court for aiding and abetting the breach of the court order by the confidant. It will not be often that a third party comes into possession of information that has emanated from the confidant and has not yet entered the public domain, but where publication is not one to which the confidant is party.54

11.58  Telegraphy. The imposition of confidentiality duties on intermediaries in control of new technologies is a common feature of measures designed to ensure national security. For example, under section 4(1) of the Official Secrets Act 1920 (UK) (now repealed), the Secretary of State had the power to compel any operator of a telegraphic apparatus to produce any message sent to or from a foreign source. An early example of a telecommunications interception power, the provision reflected a centralised approach to monitoring the dissemination of materials at a time when encryption standards were relatively undeveloped.

11.59  Telecommunications licensing. A second example is section 5 of the Telecommunications Act 1984 (UK) (also repealed), which compelled any operator of a ‘telecommunication system’ to apply for a licence, and made it an offence to run an unauthorised system. Today, the enforcement of such a provision would be impossible (indeed, it was not repealed until 2003), but it illustrates the tendency to use intermediaries as proxies for control over governmental information.

11.60  Solicitation of national security documents. Before a document has been examined, there is unlikely to be any basis for believing that the information it contains is protected, wrongfully obtained, or likely to be damaging. Conversely, ‘non-neutral’ platforms which actively solicit classified material, such as WikiLeaks, might be fixed with knowledge earlier. In Lord Advocate v Scotsman Publications Ltd, Lord Templeman commented that third parties who ‘instigate or encourage or facilitate’ breach of a primary duty might face liability, as opposed to innocent recipients.55

11.61  The example of WikiLeaks. WikiLeaks-style platforms actively encourage people to leak protected documents in circumstances that would undoubtedly breach the Act, while facilitating the anonymous storage and transmission of those documents. In these circumstances, it is probably reasonable to conclude that their operators would have at least ‘reasonable cause to believe’ that some documents uploaded by members would fall foul of the Act. This is especially so if the documents are marked as ‘secret’, ‘top secret’, or ‘confidential’,56 which was the case in relation to many of the diplomatic cables disclosed by WikiLeaks.57 Moreover, it would be unlikely that many cables are ‘inaccurate or unenlightening or insignificant’ such as to fall outside section 3(2).58 To the contrary, the controversy and diplomatic fallout surrounding their disclosure amply illustrate the obvious potential of such documents to cause damage.59

11.62  Primary liability. The distribution and possession of indecent and abusive images of children is unlawful. In some circumstances, such primary liability can extend to the operators of internet intermediaries which are implicated in that activity. These provisions are discussed below. Additionally, the voluntary blocking of child abuse images by ISPs is discussed in chapter 14.

11.63  Distribution offences. Section 1 of the Protection of Children Act 1978 prohibits dealings with ‘indecent’ images (which can include electronic data). What must be shown is that the defendant deliberately or knowingly distributed or showed such images or advertisements for them, or possessed such images with a view to distributing them. The offence also applies to acts by companies.60

11.64  Incitement offences. It seems unlikely that an internet intermediary (such as an ISP or web host) whose service is used to disseminate or store prohibited images would be at risk of an incitement charge under section 1(1)(b) of the 1978 Act, since such processes are ordinarily automated functions of hardware and software that are not intended to result in recipients of the service being persuaded or encouraged to engage in conduct falling within section 1.

11.65  Possession offences. Similarly, section 160 of the Criminal Justice Act 1988 criminalises possession of indecent images of children per se. However, the authorities suggest that in order to ‘possess’ an image it must be within the custody or control of the defendant.61 For this reason, it is arguable that a neutral service provider would not possess an image merely by reason of it being inadvertently stored on equipment that is made available to its customers. In any case, a defence applies if the image was sent without any prior request and was not kept for an unreasonable time.62 What constitutes an ‘unreasonable’ time is unclear, but the statutory language suggests an objective test. What is reasonable in the circumstances will naturally depend upon what was known to the service provider; the time which elapses before a service provider becomes aware of the image in question seems unlikely to be characterised as unreasonable.

11.66  Extension of primary liability. Schedule 13 of the Coroners and Justice Act 2009 extends primary criminal liability to English service providers for possession of ‘prohibited images of children’, including in the course of providing information society services. It is unclear whether these provisions are intended to enlarge the concept of ‘possession’, or merely confirm the existing authorities. In any case, paragraphs 3–5 set out equivalent safe harbours to those found in the 2002 Regulations so intermediaries’ criminal liability is unlikely to have greatly increased in scope. Similar provisions may be found in schedule 14 of the Criminal Justice and Immigration Act 2008, which relates to ‘extreme pornographic images’.

11.67  Availability of injunctions. Schedule 25 of the Equality Act 2010 makes provision for orders against information society service providers who transmit and store material which contravenes the provisions of that Act. Paragraph 6 of schedule 25 adds the further limitation that an injunction under the Equality Act may not impose on such a service provider any liability which would ‘contravene’ articles 12–14 of the E-Commerce Directive, or amount to a general obligation within the meaning of article 15.63 This goes further than the safe harbours, since it suggests that no injunctive liability can attach in respect of activities protected by a safe harbour (even though the safe harbours do not prohibit injunctions as such).

11.68 The provision and advertising of gambling is regulated by the Gambling Act 2005. Several provisions are relevant to internet intermediaries.

11.69  Gambling software. Section 41 makes it an offence to manufacture, supply, install, or adapt gambling software (ie computer software for use with remote gambling) in the course of a business without a licence. Although on its face this might encompass certain acts by service providers, section 41(3) clarifies that a person does not ‘supply’ or ‘install’ software merely because he or she makes available communication facilities to another person which are used to do so.

11.70  Gambling advertising. Section 330 makes it an offence to advertise unlawful (ie unlicensed) gambling by means of remote or non-remote communication. Advertising can be by way of ‘remote communication’, which includes the internet: section 4(2)(a). For the necessary territorial nexus to be established for internet advertising, two requirements must be met. First, the advertising must relate to gambling which will either take place in Great Britain or have at least one piece of remote gambling equipment located there. Second, the advertising must satisfy a targeting test, whereby the data, communication, or information constituting the advertisement is either intended to come to the attention of people there or is likely to be accessed by people there.
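The two-limb structure of this territorial test lends itself to schematic expression. The following Python sketch is purely illustrative; the function and parameter names are invented and do not reproduce the statutory wording.

    def uk_nexus_for_remote_gambling_ad(gambling_takes_place_in_gb: bool,
                                        remote_equipment_in_gb: bool,
                                        intended_for_gb_audience: bool,
                                        likely_to_be_accessed_in_gb: bool) -> bool:
        # Limb 1: the advertised gambling will take place in Great
        # Britain, or at least one piece of remote gambling equipment
        # is located there.
        limb_1 = gambling_takes_place_in_gb or remote_equipment_in_gb
        # Limb 2: the advertisement is intended to come to the attention
        # of people in Great Britain, or is likely to be accessed by them.
        limb_2 = intended_for_gb_audience or likely_to_be_accessed_in_gb
        # Both limbs must be satisfied for the territorial nexus to exist.
        return limb_1 and limb_2

Both limbs must be satisfied; an advertisement for wholly foreign gambling, or one neither directed at nor likely to reach a British audience, falls outside the territorial scope of section 330.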

11.71  Definition of ‘advertising’. By section 327, advertising is defined to mean: (1) doing anything to encourage people to take advantage of facilities for gambling; (2) bringing those facilities or information about them to people’s attention; or (3) participating in or facilitating an activity knowing or believing it is designed for either of those purposes. This definition conflates subjective and objective elements: the status of an act as ‘advertising’ depends not just upon its content, but upon the subjective purpose of the person engaging in the relevant activity. Facilitation is not defined but an analogy may be drawn with a service that is ‘mixed up’ in wrongdoing within the Norwich Pharmacal jurisdiction.64

11.72  Application to internet intermediaries. The second and third limbs are most relevant. For example, many forms of search keyword advertising involve manual review by a search engine for the purpose of bringing advertised material to the attention of the public. However, the definition would be unlikely to include organic search results that are the result of user-generated queries, which would only be accompanied by the more general belief that information is being made available that is related to the user’s query.

11.73  Mens rea for advertising. Where a person advertises unlawful gambling by conduct falling within limb (3), an offence under section 330 will only be committed if the person knows or should know that the advertised gambling is unlawful. In practice, this means that most internet intermediaries will not fall within the offence, since they are unlikely to have any particular reason to know that services they facilitate are unlawful. However, where a professional intermediary has assisted a gambling operator to advertise its services (eg by creating or optimising the text of an advertisement) while knowing that the relevant equipment or service has a link to the United Kingdom, there is a reasonable argument that such a person ought to know that the advertisement is unlawful.

11.74  Regulation of football ticket resale. Marketplaces such as Viagogo, eBay and Gumtree permit the resale, in certain circumstances, of tickets previously purchased by members of the public. Transactions involving resale are subject to section 166 of the Criminal Justice and Public Order Act 1994, which makes it an offence for an unauthorised person to sell or otherwise dispose of a ticket for a designated football match.65 The provision only applies if a sale is ‘unauthorised’ by the organisers of the match, which calls for close examination of the terms under which the ticket was originally sold. The provision is limited to football matches designated by the Secretary of State, and any other designated sporting event or category of events (which may only be designated if more than 6000 tickets are issued for sale): section 166(6).66

11.75  Definition of ‘selling a ticket’. The phrase ‘selling a ticket’ is defined in section 166(2)(aa) in very wide terms. It includes offering or exposing a ticket for sale, making a ticket available for sale by another, advertising that a ticket is available for purchase, and offering to sell a ticket. On its face, this appears to include advertisements published by internet marketplaces for the purchase of a relevant ticket. For example, a service which accepts an advertisement authored by one of its users will, at the point of publishing the advertisement, arguably have contravened section 166.

11.76  Territorial scope of the offence. The offence in section 166 is not territorially circumscribed, but it is suggested that publication would need to occur in the United Kingdom, in the sense of being targeted or directed there, or in an EEA state by a local service provider. If the ticket relates to an event taking place in another territory and is advertised using a service provider established outside the United Kingdom, such conduct seems unlikely to fall within section 166.67

11.77  Application to internet intermediaries. Section 166A qualifies the operation of section 166 in relation to service providers in two ways. First, it exempts service providers established outside of the United Kingdom entirely: section 166A(1). Second, it provides a limited defence for acts done in the course of transmitting or storing user-created content: section 166A(3)–(4).68 However, this defence is more limited than the safe harbours, because it will not apply (even to acts of transmission) if the service provider had actual knowledge that the relevant content contravened section 166 when it was provided, or failed expeditiously to remove or disable access to the content upon obtaining such knowledge. This appears to contemplate notice-and-takedown obligations even in respect of mere conduits.
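The structure of section 166A can likewise be expressed schematically. The following Python sketch is illustrative only; the parameter names are invented and compress the statutory conditions summarised above.

    def s166A_defence_available(established_outside_uk: bool,
                                transmitting_or_storing_user_content: bool,
                                knew_when_provided: bool,
                                removed_expeditiously_once_aware: bool) -> bool:
        # Section 166A(1): providers established outside the United
        # Kingdom are exempt entirely.
        if established_outside_uk:
            return True
        # Section 166A(3)-(4): a limited defence for acts done in the
        # course of transmitting or storing user-created content...
        if not transmitting_or_storing_user_content:
            return False
        # ...which is lost if the provider had actual knowledge when the
        # content was provided, or later failed to remove or disable
        # access to it expeditiously upon obtaining such knowledge
        # (treat the final flag as True where knowledge never arose).
        if knew_when_provided:
            return False
        return removed_expeditiously_once_aware

As the sketch makes plain, even a mere conduit loses the protection of section 166A once it obtains actual knowledge and fails to act expeditiously, which is what distinguishes this defence from the safe harbours.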

11.78  Restrictions on advertising. The advertising of tobacco products is now heavily regulated. In the United Kingdom, the Tobacco Advertising and Promotion Act 2002 imposes various restrictions on their marketing and display. Section 2 makes it an offence to publish or cause to be published in the United Kingdom a tobacco advertisement in the course of a business, or to distribute such an advertisement.69 Distribution includes ‘transmitting it in electronic form, participating in doing so, and providing the means of transmission’: section 2(3). As section 21 makes clear, section 2 encompasses publishing by any means, including ‘any electronic means, for example by means of the internet’.

11.79  Scope of the ‘means of transmission’. Section 2 therefore applies to internet intermediaries who provide means of transmitting tobacco advertisements or who in fact transmit them. The act of transmission is not specifically demarcated. On a broad construction, it could encompass all acts from the accessing of stored data by a host to the reception of the data by its ultimate recipient. The narrower and, it is submitted, better view is that transmission should be construed co-terminously with the scope of regulation 17 of the 2002 Regulations (the mere conduit safe harbour). This avoids the absurdity that may otherwise result if, for example, a mismatch between the two definitions resulted in liability being imposed upon the operator of an intermediate server or wireless access point, who might otherwise be said to provide the means of transmission for advertisements sent to users accessing downstream networks.

11.80  Safe harbours. Schedule 1 of the 2002 Act articulates safe harbours which mirror those found in the 2002 Regulations and are subject to essentially the same limitations and conditions. These provisions apply to all offences created by the Act.

11.81  Application to intermediaries in the EEA. By regulation 2(2) of the Tobacco Advertising and Promotion Act 2002 etc (Amendment) Regulations 2006, section 2(4) of the 2002 Act was amended to provide that it is an offence for a service provider established in the United Kingdom to do anything in the course of providing information society services in another EEA state (a member state, Norway, Iceland, or Liechtenstein) which would constitute an offence under section 2. Previously, section 2(4) had contained a broad exclusion of liability for extraterritorial service providers who caused relevant advertisements to be published by means of a website accessed in the United Kingdom.

11.82 As a result of this amendment, section 2 now applies with equal force to advertisements transmitted from another EEA state, provided that two nexus requirements are met: first, the transmission must still cause publication or distribution in the United Kingdom; and second, the service provider must be established in the United Kingdom.

11.83  Locating the intermediary. The concept of establishment is defined in section 21 of the 2002 Act. The test is whether the service provider effectively pursues an economic activity from a place for an indefinite period. However, the mere presence or use of technical equipment in that place is not sufficient without more. Accordingly, evidence of sales, advertising, or other commercial activity in the United Kingdom will be required. Further, where a service provider has multiple establishments, its establishment for the purposes of the Act will be the place which is the centre of its activities relating to the service. This may have the effect of removing many well-known application-layer intermediaries headquartered in the United States from the operation of the 2002 Act.

11.84  Publication offences. Section 3A makes it an offence to publish a tobacco advertisement in the United Kingdom or another EEA state by means of an information society service established there. Such publication entails an offence by both the proprietor of the service and any editor of the information contained in the service, as well as by any person who directly or indirectly procured the inclusion of the tobacco advertisement in the information. This is a wide-ranging provision that applies, on its face, to all internet intermediaries that supply means for publishing information, whether or not the service provider is responsible for the content of the information.

11.85  Application to internet intermediaries. Although prima facie liability under section 3A is subject to the same territorial restrictions and safe harbours as other provisions, it is conceivable that a network-layer intermediary could be put on notice of the transmission of advertisements and be unable, in practical terms, to do anything to prevent their ongoing transmission. Similarly, a search engine or marketplace established in the United Kingdom would be required to disable access to notified material which contravened the 2002 Act or risk liability under section 3A.

11.86  Display offences. Section 7D of the 2002 Act permits the Secretary of State to regulate the display of tobacco products and their prices on a website in England, Wales or Northern Ireland where tobacco products are offered for sale. Displaying or causing such products or prices to be displayed on a relevant website in breach of the regulations is an offence, except where this occurs in the course of providing information society services by a person established outside the United Kingdom: section 7D(4). The meaning of ‘display’ is not defined and, unlike the concept of distribution in section 2, providing the means of display is not expressly included. Nevertheless, section 7D appears to extend prima facie liability at least to the local operators of websites or marketplaces on which tobacco products or their prices appear.

11.87  Overall risk of liability. Accordingly, subject to the operation of safe harbours, it appears that application- and network-layer intermediaries such as search engines, marketplaces, social networks, ISPs, and hosts who are established in the United Kingdom will face prima facie liability under the 2002 Act. They will need to consider whether to develop appropriate procedures to ensure that they do not engage in conduct that contravenes the relevant prohibitions and act expeditiously to remove prohibited advertisements.

11.88  Overview. Internet services that supply or advertise vehicular transportation services (eg by means of mobile apps and websites) will need to comply with applicable statutory requirements governing the licensing and promotion of such services. Advertisements for private vehicle hire are regulated, particularly in London and other major metropolitan areas. In London, private hire vehicles are regulated by Transport for London. Elsewhere in the United Kingdom, regulation is by local authorities.70

11.89  Regulation of advertising. Section 31 of the Private Hire Vehicles (London) Act 1998 makes it an offence for any person to issue or cause to be issued an advertisement that vehicles described as ‘taxis’, ‘cabs’, or similar can be hired on application to a specified address in London, if the vehicles are not in fact London cabs. An advertisement may be issued in any form and medium, including via the internet.

11.90  Regulation of electronic hire. In the United Kingdom, only licensed hackney carriages may ‘ply for hire’, and an offence is committed by both the proprietor and the driver of an unlicensed vehicle that does so.71 Traditionally, plying for hire was confined to offline conduct within sight of the vehicle in question, which expressly or impliedly invited the public to use the vehicle and indicated that members of the public would be able to do so if they wished.72 Although a question of fact and degree, the key criterion was whether a reasonable member of the public would regard the vehicle as immediately available for hire.

11.91  Regulation of taximeters. Section 11 of the 1998 Act prohibits a licensed private hire vehicle from being equipped with a ‘taximeter’, which is relevantly defined as a device for calculating and displaying the fare to be paid for a trip based on its distance, duration, or both.73 (This is unlike a hackney carriage or taxi, which must be fitted with such a meter.)

11.92  Application to online transport apps. With the advent of internet booking and ride-sharing services such as Uber, Lyft, GetTaxi, and Kabbee which provide real-time or near-immediate hailing of cabs and private vehicles, significant controversy surrounds the question of whether such services are required to be licensed. In Transport for London v Uber London Ltd,74 the High Court held that neither Uber nor its drivers contravened the 1998 Act because private hire vehicles in the Uber driver network were not equipped with ‘taximeters’. Ouseley J reasoned that, although fares were calculated on the basis of (inter alia) distance and time, the calculation did not take place in vehicles themselves. Instead, the fare was calculated by servers located in the United States on the basis of a fare structure algorithm to which the trip GPS signals and timing data were input.75

11.93 As a result, the smartphone in Uber vehicles was not a device for calculating fares; its purpose was merely to record and transmit ‘some or all of the inputs to a calculation made elsewhere’ and then to receive the output of the calculation.76 In any event, it was the driver who was equipped with the smartphone, rather than his vehicle.77 Consequently, vehicles were not ‘equipped’ with a taximeter and fell outside the scope of section 11.
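To illustrate the architectural separation on which this reasoning turned, the following sketch (in Python, using invented names and rates that are purely illustrative and are not Uber’s actual fare structure) places the fare-calculating logic entirely on a remote server, with the in-vehicle device confined to recording and transmitting the trip inputs:

```python
from dataclasses import dataclass


@dataclass
class TripRecord:
    """Inputs recorded by the driver's smartphone: distance and timing only."""
    distance_km: float
    duration_min: float


def calculate_fare(trip: TripRecord) -> float:
    """Hypothetical server-side fare-structure algorithm.

    Runs on the operator's remote servers, not on any device in the
    vehicle; all rates are illustrative assumptions.
    """
    base_fare = 2.50      # assumed flag-fall
    rate_per_km = 1.25    # assumed distance component
    rate_per_min = 0.15   # assumed time component
    return (base_fare
            + trip.distance_km * rate_per_km
            + trip.duration_min * rate_per_min)


if __name__ == "__main__":
    # The smartphone's role ends at recording and transmitting these
    # inputs; the calculation itself is 'made elsewhere'.
    trip = TripRecord(distance_km=8.4, duration_min=22.0)
    print(f"Server-computed fare: £{calculate_fare(trip):.2f}")
```

On this division of labour, no fare-calculating logic resides in the vehicle at all, which is the feature the Court treated as taking the arrangement outside the statutory concept of a ‘taximeter’.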

11.94  Approach to statutory interpretation. In TfL v Uber, the Court rejected attempts by the claimant to expand the definition of ‘taximeter’ to meet changes in technology. The claimant relied upon the reasoning of Lord Bingham in R (Quintavalle) v Secretary of State for Health, where the central issue was whether a 1990 Act referring to a ‘live human embryo’ conferred authority to regulate research on embryos created in a laboratory by cloning, without fertilisation. The Court held that Parliament could not have intended to distinguish between cloned and fertilised embryos, since it was unaware of the scientific possibility of cloning when the legislation was enacted.78 Lord Steyn similarly commented that Acts should be construed as ‘always speaking’ unless they were concerned only with a particular problem.79

11.95 However, although Ouseley J endorsed this approach in principle, it did not assist the claimant, because the statutory provision in the 1998 Act was not specific to any particular form of technology. The interpretation sought by the claimant was ‘not one to deal with unanticipated new technology but with an unanticipated new way of operating licensed mini-cabs, using new technology at booking, for the journey and in fare calculation and payment’. That new way of operating would fall outside the provision unless it was construed so as fundamentally to alter the concepts of equipping and calculating. As such, whether and how to regulate the new technology was not capable of being resolved by an ‘updating interpretation’ of section 11, and should properly be debated by Parliament.80

11.96  Encouragement offences. Schedule 12 of the Coroners and Justice Act 2009 creates offences relating to acts by information society service providers who intentionally encourage or assist the suicide or attempted suicide of another person.81 These offences are expressly extraterritorial in that they apply to acts done elsewhere in the EEA, provided that the service provider is established domestically. Paragraphs 4–6 provide that a service provider is not capable of being guilty of an offence in respect of anything done in the course of providing mere conduit, caching, or storage services (which are defined in terms that duplicate, rather than cross-refer to, the 2002 Regulations).82

11.97  Overview. Liability for contempt of court can attach in various ways, and a full treatment of primary liability is beyond the scope of this work. However, several heads of liability are summarised below. In each case, it appears that an internet intermediary that publishes proscribed material may face criminal liability as a contemnor, but it will be necessary to show both that the material falls within the relevant offence and that the service provider intended to publish the material.

11.98  Restrictions on publication. First, section 12 of the Administration of Justice Act 1960 makes it a contempt of court to publish information relating to certain specified proceedings before courts sitting in private. Publication is not defined and the actus reus is not specific to any medium.

11.99  Reports of proceedings. Second, section 8 of the Magistrates’ Courts Act 1980 makes it unlawful to publish in Great Britain a written report about committal proceedings other than certain specified information. Liability attaches both to the person who publishes the report and to the proprietor, editor, or publisher of any newspaper or periodical in which it appears. Again, this appears to encompass any medium, though the publication would need to be targeted or directed to the relevant territory, or otherwise have a link to Great Britain.

11.100  Identification of protected parties. Third, section 1 of the Sexual Offences (Amendment) Act 1992 prohibits the publication in England and Wales of any matter likely to reveal the identity of a complainant of rape or various other serious sexual offences, while section 5 imposes criminal liability upon ‘the person publishing the matter’. Again, publication would need to be relevantly targeted. Section 5(5) creates a defence where, at the time of the publication, the defendant was not aware and neither suspected nor had reason to suspect that the publication included the proscribed matter.

11.101  Publication of certain offences. Fourth, schedule 11B of the Education Act 2002 and schedule 4 of the Education Act 2011 contain reporting restrictions on the publication of material relating to certain offences committed by teachers. Those provisions extend primary liability to information society service providers, subject to safe harbours in similar terms to the 2002 Regulations.83

11.102  Strict liability for contempt. Fifth, the Contempt of Court Act 1981 recognises various forms of statutory contempt. By sections 1 and 2, strict liability applies to conduct (comprising publications and other communications ‘in whatever form’, which are addressed to the public or any section of it: section 2(1)) which creates a substantial risk that the course of justice in active proceedings will be seriously impeded or prejudiced. Whether such a risk exists is a question to be determined in all the circumstances, and will depend on the nature of the publication and the period between publication and the time for trial.

11.103  Defence of reasonable care. Section 3(2) of the 1981 Act creates a defence of reasonable care which could be invoked by a secondary distributor of a publication containing matter which contravenes the strict liability rule. To avail itself of the defence, a distributor must show that at the time of distribution it had taken all reasonable care, and did not know or have reason to suspect that the publication contained the proscribed matter.

11.104  Jury deliberations. Section 8 of the 1981 Act makes it a contempt of court to obtain, disclose, or solicit any particulars of statements made, opinions expressed, arguments advanced, or votes cast by members of a jury in the course of their deliberations in any legal proceedings. The provision is widely cast and an internet intermediary that knowingly or deliberately publishes such material could contravene it.

11.105  Recordings of proceedings. Finally, section 9 of the 1981 Act makes it a contempt of court to publish a recording of legal proceedings made by or from an unauthorised sound recording device, by playing it in public or disposing of the recording with that intent.

11.106  Application to internet intermediaries. Courts approach the question of publication by analogy with defamatory material. This suggests that intermediaries will not normally face substantive liability for publications authored by their users, at least until a reasonable period after notification, though they might be the subject of an injunction. In R v Harwood, for example, the Court applied Tamiz to enjoin the Mail Online from publishing an article about a criminal trial.84 The Law Commission has recommended against altering the definition of publication.85

11.107  Facilitation of juror misconduct. Service provider liability continues to be discussed as a possible approach to the problem of juror contempt. The United Kingdom government has frequently expressed concerns about jurors who access search engines and social media to conduct research while empanelled in a criminal trial.86 A number of solutions have been debated, including service provider liability and website blocking, but no firm legislative proposals have been put forward.

11.108  Injunctions to restrain contempt. In most cases, internet intermediaries will not themselves engage in a relevant publication but will instead facilitate or enable the relevant primary conduct by the contemnor. Once they are put on notice of such conduct, the question arises as to what, if anything, is their responsibility to remove the material and to prevent further publication. One possibility is that an application could be made to the Court for an order that the service, a non-party, take steps to remove or disable access to the material. Such an order could be made either as an incident of the Court’s inherent jurisdiction to regulate its procedures, or under the equitable protective jurisdiction to compel the service provider to cease facilitating wrongdoing.87

11.109  Criminal liability as an accessory. In an extreme case, where an internet intermediary has intentionally conspired with a contemnor to solicit or disseminate materials whose publication amounts to a contempt of court, it may be possible to argue that the service is criminally liable as an accessory. An overview of the relevant principles is given in chapter 5,88 but for more detailed treatment readers are referred to a specialist work.

11.110  Service of injunctions. Internet intermediaries may be liable for contempt of court where, with actual knowledge, they host or transmit material that is the subject of a court order. In Spycatcher, the Court of Appeal held that where a third party is on notice of an injunction protecting confidential material (normally by being served with it), it is a contempt of court for that party to publish the material even though it is not directly subject to the injunction.89 To do so would undermine the purpose of the injunction and render it ineffective since, the material’s secrecy being destroyed, any further relief would become futile. However, different considerations may apply to other categories of wrongful information, such as material infringing copyright.

11.111  Liability for knowing assistance. Bloomsbury Publishing plc v News Group Newspapers Ltd provides further explanation of the basis for this form of accessory liability:

an injunction would be effective not just against identified defendants, but also against anybody else who is informed of the terms of the injunction and who assists or tries to assist (directly or indirectly) the addressee to breach its terms.90

11.112 In that case, the defendant had obtained secret information relating to the contents of a forthcoming Harry Potter novel, publication of which the claimants sought to restrain. The person who supplied the leaked copy of the book remained anonymous, but the Court considered that it had power to grant a ‘John Doe’ order against the unknown third party that would bind him in personam.91

11.113  Liability of non-parties. The relief in Bloomsbury was also expressed to be effective against those who ‘assist’ in attempts to sell the stolen book. By analogy, an intermediary who wrongfully disseminates confidential or unlawful material could be bound by an injunction restraining disclosure, even if not named in the proceedings, provided it was put on notice of the injunction, on the basis that it assists the wrong. Given the coercive force of an injunction requiring the blocking or removal of information, this comes close to recognising a more limited form of liability for assistance, especially since safe harbours do not apply to injunctive relief.92

11.114 Similarly, in Acrow (Automation) Ltd v Rex Chainbelt Inc,93 the Court held that an injunction equally binds third parties who knowingly assist a defendant to breach it. Because actual knowledge is required, most internet intermediaries are unlikely to be affected unless and until notified. However, it is conceivable that an application-layer intermediary might cause material to be published which breaches the terms of an injunction of which the operator is aware. Consider, for example, an international social network whose members include jurors who publish details of their deliberations. In such a case, the intermediary might be restrained (or held in contempt) if it failed to remove the material within a reasonable period of receiving notice of the terms of the injunction and the offending material.

Notes
1. Facebook Inc, Statement of Rights and Responsibilities (15 November 2013) cl 3.10 <https://facebook.com/terms>.
2. Twitter Inc, Terms of Service (25 June 2012) cl 1 <https://twitter.com/tos>.
3. Facebook Inc, n 1, cl 5.1.
4. Google Inc, User Content and Conduct Policy, cl 13 <https://google.com/intl/en/+/policy/content.html>.
5. YouTube LLC, Terms of Service (9 June 2010) cl 7.5 <https://youtube.com/static?template=terms>; YouTube Community Guidelines <https://youtube.com/t/community_guidelines>.
6. Google Inc, n 4.
7. Google Inc, Blogger Content Policy (2014) <http://blogger.com/content.g>.
8. LinkedIn Corp, User Agreement (26 March 2014) <http://linkedin.com/legal/user-agreement>.
9. Tumblr Inc, Terms of Service (27 January 2014) cl 6 <http://tumblr.com/policy/en/terms-of-service>; Community Guidelines (27 January 2014) <http://tumblr.com/policy/en/community>.
10. Google Inc, n 4, cl 8.
11. Various contributors, Terms of Service; Didn’t Read (7 December 2012) <https://tosdr.org/>.
12. Google Inc, Google Terms of Service (14 April 2014) <http://google.com/intl/en/policies/terms/>.
13. Microsoft Corp, Microsoft Services Agreement (11 June 2014) cl 3.5 <http://windows.microsoft.com/en-gb/windows/microsoft-services-agreement>.
14. See chapter 16, section 1.1.
15. Chambers v Director of Public Prosecutions [2013] 1 WLR 1833, [25] (Lord Judge LCJ) (‘Chambers’).
16. [2015] UKSC 32, [108] (Lord Neuberger PSC).
17. See Owen Bowcott and Katy Roberts, ‘Twitter Racism: How the Law is Taking on the “Twacists”’ (The Guardian, 27 March 2012) <http://theguardian.com/technology/2012/mar/27/twitter-racism-taking-on-twacists>.
18. Crown Prosecution Service, Guidelines on Prosecuting Cases involving Communications Sent via Social Media (2013) <http://www.cps.gov.uk/legal/a_to_c/communications_sent_via_social_media/>.
19. [2012] NIQB 96, [16]–[20] (McCloskey J).
20. [2013] NIQB 14, [13]–[14] (McCloskey J).
21. Chambers, [30] (Lord Judge LCJ).
22. [2008] EWHC 1797 (QB), [14] (Eady J).
23. [2007] 1 AC 224, [66] (Baroness Hale) (‘Majrowski’).
24. Majrowski, [30] (Lord Nicholls).
25. Chambers, [36] (Lord Judge LCJ).
26. [2006] 1 WLR 2223, [11] (Lord Bingham).
27. See chapter 16, section 1.2.
28. Majrowski v Guy’s and St Thomas’ NHS Trust [2007] 1 AC 224, [30]; Ferguson v British Gas Trading Ltd [2010] 1 WLR 785, [18]; Veakins v Kier [2009] EWCA Civ 1288, [11].
29. [2014] EWHC 2979 (QB), [41]–[43], [50] (Carr J).
30. See the principles set out in section 3.3.
31. See also Offensive Behaviour at Football and Threatening Communications (Scotland) Act 2012 (Scot) s 6.
32. Crown Prosecution Service, Obscene Publications (2011) <http://www.cps.gov.uk/legal/l_to_o/obscene_publications/>.
33. See, eg, Archbold, [31-63]–[31-72].
34. R v Waddon (Unreported, Court of Appeal (Criminal Division), 6 April 2000).
35. R v Perrin [2002] EWCA Crim 747. See also Children and Young Persons (Harmful Publications) Act 1955.
36. [2002] EWCA Crim 747, [18] (Kennedy LJ).
37. Video Recordings Act 1984 s 9(3)(a).
38. Hansard, Video Recordings Bill, House of Commons (11 November 1983, Mr Graham Bright) vol 48, col 525.
39. [1984] 1 WLR 89, 92 (Skinner J).
40. [1987] 1 AC 303 (‘Maginnis’).
41. [2005] 1 WLR 3118, 3131 (Kay LJ) (‘Interfact’).
42. Interfact, 3127 (Kay LJ).
43. See, eg, Amazon Services Europe SarL, ‘A-to-z Claim Conditions’ (2014) <http://amazon.co.uk/gp/help/customer/display.html?nodeId=201460300>.
44. Hansard, Video Recordings Bill, House of Commons (11 November 1983, Mr Graham Bright) vol 48, col 529.
45. [2001] EWHC Admin 469, [11]–[14] (Latham LJ) (Forbes J agreeing).
46. See chapter 8, section 3.1.
47. See chapter 12, section 5.
48. See chapter 13, section 1.7.
49. Terrorism Act 2006 s 3(1).
50. Terrorism Act 2006 s 3(2).
51. See chapter 12, section 1.2.
52. Attorney-General v Jonathan Cape Ltd [1976] QB 752, 769 (Lord Widgery CJ); Attorney-General v Punch Ltd [2001] QB 1028, 1062 (Lord Phillips MR) (‘Punch’).
53. Punch, 1062 (Lord Phillips MR).
54. Punch, 1062–3.
55. Lord Advocate v Scotsman Publications Ltd [1990] 1 AC 812, 826 (Lord Templeman).
56. Secretary of State for Defence v Guardian Newspapers Ltd [1985] AC 339, 354–5 (Lord Diplock).
57. See, eg, US Embassy (London), ‘UK Open to Financial Measures against Iran [Secret]’ (WikiLeaks, 16 June 2006) <http://www.wikileaks.ch/cable/2006/06/06LONDON4338.html>.
58. Lord Advocate v Scotsman Publications Ltd [1990] 1 AC 812, 825 (Lord Templeman).
59. See, eg, Bernard Gwertzman, ‘The Legal Case against WikiLeaks’ (Council on Foreign Relations, 13 December 2010) <http://www.cfr.org/media-and-foreign-policy/legal-case-against-wikileaks/p23618>; Harold Hongju Koh, Letter to Julian Assange and Jennifer Robinson (27 November 2010, Washington DC) <http://www.reuters.com/article/2010/11/28/us-wikileaks-usa-letter-idUSTRE6AR1E420101128>; Daniel Dombey and George Parker, ‘US Tries to Limit WikiLeaks Damage’ (Financial Times, 29 November 2010) <http://www.ft.com/cms/s/0/d2bc69f8-fb2c-11df-b576-00144feab49a.html>.
60. Protection of Children Act 1978 s 3.
61. R v Porter [2006] EWCA Crim 560.
62. Criminal Justice Act 1988 s 160(2)(c).
63. It is suggested that ‘contravene’ should be understood in the sense of ‘fall within the scope of’, since injunctive relief would not normally be contrary to these provisions: see chapter 12, sections 3–5; chapter 13, section 1.7.
64. See chapter 4, section 1.
65. The Glasgow Commonwealth Games Act 2008 (Ticket Touting Offence) (England and Wales and Northern Ireland) Order 2012/1852 made similar provision for touting of 2014 Commonwealth Games tickets, but its operation is now essentially spent.
66. The Ticket Touting (Designation of Football Matches) Order 2007/790 designated various matches inside and outside England and Wales involving national, UEFA, and FIFA member clubs: reg 2.
67. Criminal Justice and Public Order Act 1994 s 166A(1).
68. Although s 166A(4) is drafted so as to require the transmitted or stored information both to be provided by a recipient of the service and to be stored ‘solely for the purpose’ of more efficient onward transmission (ie caching), the conjunctive condition appears to be an error. To avoid absurdity, information should fall within sub-s (4) if it satisfies either condition.
69. Certain trade communications are excluded: see s 4 of the 2002 Act.
70. See Local Government (Miscellaneous Provisions) Act 1976 s 71.
71. Town Police Clauses Act 1847 ss 38, 45; Metropolitan Public Carriage Act 1869 ss 4, 7.
72. Cogley v Sherwood [1959] 2 QB 311, 325 (Parker LCJ).
73. See Measuring Instruments (Taximeters) Regulations 2006 (SI 2006/2304).
74. [2015] EWHC 2918 (Admin) (‘TfL v Uber’).
75. TfL v Uber, [15] (Ouseley J).
76. TfL v Uber, [20] (Ouseley J).
77. TfL v Uber, [45]–[46] (Ouseley J).
78. [2003] 2 AC 687, [14] (Lord Bingham) (‘Quintavalle’).
79. Quintavalle, [23] (Lord Steyn). See also Royal College of Nursing v Department for Health and Social Security [1981] AC 800, 822 (Lord Wilberforce).
80. TfL v Uber, [39] (Ouseley J).
81. See also Suicide Act 1961 s 2.
82. See chapter 12.
83. See chapter 12.
84. [2012] EW Misc 27 (CC), [26].
85. Law Commission, Contempt of Court (1): Juror Misconduct and Internet Publications (Law Com No 340, 9 December 2013) [2.29].
86. See, eg, BBC, ‘Jurors Jailed for Contempt of Court over Internet Use’ (29 July 2013) <http://www.bbc.co.uk/news/uk-23495785>.
87. See chapter 16, section 1.1.
88. See chapter 5, section 4.
89. [1988] Ch 333, 375.
90. [2003] EWHC 1087, [9] (Laddie J) (‘Bloomsbury’).
91. Bloomsbury, [10], [21], [25] (Laddie J).
92. See chapter 12, section 1.2.
93. [1971] 3 All ER 1175.
