Search Results
Showing 1 - 10 of 14 items for Author or Editor: Tsachi Keren-Paz
Should digital platforms be responsible for intimate images posted without the subject’s consent? Could the viewers of such images be liable simply for viewing them?
This book answers these questions in the affirmative, while considering the social, legal and technological features of the unauthorized dissemination of intimate images, or ‘revenge porn’. In doing so, it asks fundamental socio-legal questions about responsibility, causation and apportionment, and conceptualizes private information as a form of property.
With a focus on private law theory, the book defines the appropriate scope of liability of platforms and viewers, while critiquing both the EU’s and US’ solutions to the problem. Through its analysis, the book develops a new theory of egalitarian digital privacy.
In this chapter I debunk claims that filtering NCII involves too high a cost in terms of either freedom of expression or finances, as well as the related claim that such filtering is not technologically feasible. I first focus on Facebook’s filtering practice as reflected in its (untransparent) transparency report. I then evaluate this practice to highlight its shortcomings and to delineate the contours of an acceptable and practicable NCII filtering regime, backed by (a more controversial) strict liability for harm from remaining NCII. I discuss penumbral definitional issues, including intimacy beyond nudity and cultural differences, and the scope of liability for harms from these images. My approach diverges from the Law Commission’s proposals in its 2022 final report, affording better protection to cultural minorities and taking lessons from medical ethics and law. I also discuss economies of scale and their potential relevance to smaller intermediaries, with a critique of the weight given in recent policy discussions to a means-based test as a limit on intermediaries’ potential duties to filter content.
This chapter proposes to understand child pornography as an extreme and complicated instance of NCII. It (1) establishes viewers’ liability for viewing child pornography under privacy law (as distinct from bespoke statutory provisions); (2) examines whether viewing child pornography could be considered ‘acting in concert’ and hence lead to each viewer’s full liability for the victim’s entire damage from the viewing of their abuse (and possibly also from the initial abuse); and (3) argues that the US Supreme Court’s holding in Paroline v United States is compatible with demand-based liability: a viewer could and should be liable for the victim’s injury from the initial abuse, as long as the production of the child pornography was (also) motivated by the prospect of distribution. However, the viewer should not be liable to victims whose images he did not view, whether for harm from the initial abuse or from the circulation of the images.
Intentional viewing of non-consensual intimate images should lead to liability for breach of privacy, whether the victim is a child or an adult. The chapter defends this view with reference to case studies in which viewers look for NCII, while also commenting on the budding policy discussion of whether to criminalize the intentional possession of NCII. Viewers who took a risk that they might come across NCII should be liable, at least where that risk is significant, as is the case when one visits commercial porn websites.
The analysis draws on anecdotal data about the ways NCII are shared and viewed. While comprehensive data on this issue is missing, civil litigation for the most part does not require it; rather, the individual behaviour of the defendant-viewer, which will often be verifiable to courts, will suffice to decide whether liability could be imposed, and on what theory of liability or cause of action.
This chapter explains why viewers’ liability could and should be strict, rather than merely fault-based. Building on the discussion in Chapters 4 and 5, it further explains how a property-based understanding of privacy helps justify both strict liability and its proper limits, thus avoiding excessive liability. Hence, this chapter makes three contributions: doctrinal, normative and conceptual. Doctrinally, it explains (1) that the misuse of private information tort (Privacy) is already understood as a stricter form of liability; (2) that liability under Privacy might be stricter than under breach of confidentiality (from which Privacy sprang), and the justifications for this difference; and (3) that Privacy can therefore accommodate viewers’ strict liability for viewing NCII (Part 2). Normatively, it explains how the concepts of possession, passive behaviour and reliance render viewers’ strict liability justified rather than excessive (Part 3). Conceptually, it inquires how understanding information as property justifies strict, rather than fault-based, liability for viewing, despite the fact that the act of viewing both misappropriates and destroys value (Part 4); it thus complements the analysis offered in Chapters 4 and 5.
This chapter addresses three foundational questions about the scope of liability for breach of privacy, relevant to determining whether the typical harms suffered by NCII victims are capable of being compensated in a misuse of private information (Privacy) claim, as well as a fourth question unique to viewers’ liability. (1) Privacy ought to compensate both for the mere diminution of control and for distress (and other consequential losses). This has bearing on debates in tort law and the law of remedies about the extent to which injury to autonomy ought to be actionable in its own right. (2) Privacy (and not only defamation) ought to compensate for reputation loss. This relates to broader debates about the division of labour between different areas of law and the claimant’s election; coherence; the relevance of dated authorities; and law’s expressive potential. (3) The test for remoteness in Privacy is clarified with reference to some types of consequential losses, such as loss of employment, loss of dependency due to the breakdown of a marriage, and losses from follow-up physical attacks or from suicide. (4) The fourth question concerns apportionment of liability: each viewer should be liable for a significant portion of the loss, but not for the entire loss.
This chapter addresses two hurdles claimants need to overcome if civil claims against viewers are to become viable: evidence and costs. I first discuss the evidentiary hurdle of identifying viewers, which is unique to claims against viewers. I refer to two types of difficulty: identifying those who viewed the claimant’s images; and establishing whether a given viewer of a porn website hosting many intimate images viewed the claimant’s images in particular. The contours of a disclosure order against internet service providers (in the UK, a Norwich Pharmacal Order) for the identity of users breaching the claimant’s privacy will be examined. I defend the claim that, as a matter of ‘poetic justice’, where the alleged wrongdoing is manifested in a serious undermining of the claimant’s privacy, relatively little weight should be given to the user’s privacy interest militating against disclosing his identity. I then deal with the issue of financial costs, summarizing the findings relevant to claims against uploaders and explaining the potential differences from claims against viewers.
The chapter summarizes the main policy recommendations of the book. The harm from non-consensual intimate images (NCII) is caused also by platforms and viewers, both of whom should be liable for breach of the claimant’s privacy. The harm from NCII is exceptional, so liability should not necessarily be imposed for harm from other user-generated content. In reality, hosts and viewers sell and buy the claimant’s images without the claimant’s consent, so they should be strictly liable for the ensuing harm, just as those who sell and buy stolen chattels are liable to the original owner. Platforms should actively filter out NCII, regardless of their size. Similar to the defences to criminal responsibility for child pornography, promptly deleting unsolicited NCII should prevent civil liability. Each viewer’s liability should be for a significant portion of the loss, but should not extend to the entire loss.
The chapter defines ‘non-consensual intimate images’ and highlights the potential contribution of platforms and viewers to the breach of privacy of those depicted in such images. It explains the theoretical framework informing the project; the project’s genealogy; and the book’s contributions in terms of policy, private law theory and tort doctrine.
The chapter overviews the two main regulatory regimes governing platforms’ civil liability for user-generated content, of which non-consensual intimate images are just one example: complete immunity, and post-notice liability for failure to remove the content expeditiously. It also examines the recent trajectory towards increased platform responsibility.
It then examines how the principles of control and fairness feature in courts’ decisions about platform liability for UGC, drawing on US Section 230 decisions and European decisions of the CJEU, the ECtHR and other courts.
Finally, it situates the Chapter 3 argument, namely that principles of control, fairness and the right to an effective remedy should lead to rejecting post-notice immunity, within the existing literature, explaining support for such immunity as the result of a (mistaken) framing of internet intermediaries as passive hosts rather than as active, for-profit participants in breaching the claimant’s privacy.