Research

 

Here you will find our complete range of monographs, multi-authored and edited works, comprising peer-reviewed, original scholarly research across the social sciences and aligned disciplines. We publish long- and short-form research, and you can browse the complete Bristol University Press and Policy Press archive.

Policy Press also publishes policy reviews and polemical works that aim to challenge policy and practice in particular fields. These books are written with practitioners in mind: practical and accessible in style, as well as academically sound and referenced.
 

Books: Research

You are looking at 11 - 20 of 20 items for :

  • Technology Law

The chapter adds a non-obvious private law perspective on the proper scope of internet intermediaries’ liability to victims of non-consensual intimate images. It conceptualizes the right to privacy in intimate images as property; defends the applicability of the framework governing conflicts over title to chattels to the unauthorized dissemination of intimate images; and draws the relevant conclusion: whether or not an innocent buyer acquires better title to the chattel than the original owner, the merchant is always strictly liable to the original owner. Internet intermediaries, like any other merchant selling ‘goods’ with defective title, ought to be strictly liable to the claimant. This conclusion is bolstered by the priority of the image subject’s interest over viewers’ interests under both a traditional Articles 8/10 ECHR balancing exercise (on the view that privacy is not property) and an examination of the justifications for a market overt rule (on the view that privacy is property). Alternatively, it could rest on the policy considerations explaining why the merchant is always liable in conversion to the original owner, even in legal systems recognizing market overt and even where the thief is identifiable and solvent.

Restricted access

This chapter addresses three foundational questions about the scope of liability for breach of privacy, relevant to determining whether the typical harms suffered by victims of NCII can be compensated through a misuse of private information (Privacy) claim: (1) Privacy ought to compensate both for the mere diminution of control and for distress (and other consequential losses). This bears on debates in tort law and the law of remedies about the extent to which injury to autonomy ought to be actionable in its own right. (2) Privacy (and not only defamation) ought to compensate for loss of reputation. This relates to broader debates about the division of labour between different areas of law and the claimant’s election; coherence; the relevance of dated authorities; and law’s expressive potential. (3) The test for remoteness in Privacy is clarified with reference to certain types of consequential losses, such as loss of employment, loss of dependency due to the breakdown of a marriage, and losses from follow-up physical attacks or from suicide. (4) A fourth question is unique to viewers’ liability and concerns the apportionment of liability: each viewer should be liable for a significant portion of the loss, but not for the entire loss.

Restricted access

The chapter provides an overview of the two main regulatory regimes governing platforms’ civil liability for user-generated content, of which non-consensual intimate images are just one example: complete immunity, and post-notice liability for failure to remove the content expeditiously. It also examines a recent trajectory of increased platform responsibility.

It then examines how the principles of control and fairness feature in courts’ decisions on platform liability for UGC, drawing on US Section 230 decisions and European decisions of the CJEU and the ECtHR, among others.

Finally, it situates the argument of Chapter 3 – that the principles of control, fairness and the right to an effective remedy should lead to rejecting post-notice immunity – within the existing literature, explaining support for such immunity as the product of a (mistaken) framing of internet intermediaries as passive hosts, rather than as active, for-profit participants in breaching the claimant’s privacy.

Restricted access

Intentional viewing of non-consensual intimate images should lead to liability for breach of privacy, whether the victim is a child or an adult. The chapter defends this view with reference to case studies in which viewers look for NCII, while also commenting on the budding policy discussion of whether to criminalize the intentional possession of NCII. Viewers who took a risk that they might come across NCII should be liable, at least where that risk is significant, as is the case when one visits commercial porn websites.

The analysis draws on anecdotal data about the ways NCIIs are shared and viewed. While comprehensive data on this issue are missing, civil litigation, in the main, does not require such data; rather, the individual behaviour of the defendant-viewer will be sufficient to decide whether liability could be imposed, and on what theory of liability or cause of action. Such individual behaviour will often be verifiable in court.

Restricted access

Richard Bertinet is a chef who has lived in the UK since 1988.1 He runs a well-known and popular cookery school in Bath and has penned several award-winning recipe books. A significant portion of the UK’s population is made up of people like Richard – people who migrated from EU Member States and made the UK their home. There is still no exact, official count of how many EU citizens are resident in the UK by virtue of free movement rights, but we now know it to be more than four million.2 That group is embedded within communities across all walks of life. Some have been in the UK for decades, while others arrived more recently. Following the leave vote at the June 2016 Brexit referendum, the status of this group quickly became uncertain. Quite apart from negotiating the rules that would apply, there was the immense challenge of how the new rules would be administered fairly and effectively at the speed required by the Brexit process. In response to this challenge, the Home Office adopted a novel process, known as the EU Settlement Scheme, which included a combination of online applications, partially automated decision making, and cross-departmental data-sharing arrangements. For people like Richard, it was, in the words of then Home Secretary Amber Rudd MP, meant to be ‘as easy as setting up an online account at LK Bennett’.3 Many applications were processed quickly and successfully.

Restricted access

Robtel Neajai Pailey is a Liberian academic, activist, and author, currently based at the London School of Economics and Political Science.1 Since 2006, she has applied for and obtained a range of visas for the UK, including as a tourist, a student, and a skilled worker. Pailey made several of her applications from the US, where she is a permanent resident. The application process was costly and a bit intrusive, but on the whole she felt the experience was ‘relatively smooth’. When Pailey applied for a visa from Ghana in 2018, however, she bore significant additional costs and delay. Between the Home Office, the British High Commission in Ghana, and the local visa application centre, no one seemed to know the status of her application or the location of her passport. The delay forced her to cancel a different trip at substantial personal cost, and her request for a refund of the application fees was refused. She described the experience as, simply, ‘the absolute worst’.

Restricted access

In recent years, the United Kingdom's Home Office has started using automated systems to make immigration decisions. These systems promise faster, more accurate, and cheaper decision-making, but in practice they have exposed people to distress, disruption, and even deportation.

This book identifies a pattern of risky experimentation with automated systems in the Home Office. It analyses three recent case studies: a voice recognition system used to detect fraud in English-language testing; an algorithm for identifying ‘risky’ visa applications; and automated decision-making in the EU Settlement Scheme.

The book argues that a precautionary approach is essential to ensure that society benefits from government automation without exposing individuals to unacceptable risks.

Restricted access

The Home Office – the main UK public authority responsible for immigration – is keenly interested in identifying ‘sham’ marriages which are designed to game the immigration system.1 Since at least 2015, the department has used an automated system to determine whether to investigate a proposed marriage.2 Marriage registrars across the country transmit details of proposed marriages to the system via ‘data feeds’. The system applies eight ‘risk factors’ to assess the risk that a couple’s marriage is a sham. These risk factors include the couple’s interactions before the registrar, ‘shared travel events’, and age difference. The system allocates couples either a ‘green’ rating, indicating that no investigation is warranted, or a ‘red’ rating, indicating that an investigation is warranted to identify possible ‘sham activity’. This algorithm processes a large number of marriages each year. In a 12-month period across 2019 and 2020, the Home Office received 16,600 notifications of marriages involving a non-European national, of which 1,299 were subsequently investigated.

Restricted access

The three systems we have explored in this book barely scratch the surface of automation in government immigration systems. They are systems which have, for various reasons and through various means, come into public view. But automated systems are being developed and deployed in many more corners of the immigration bureaucracy. The current trajectory, both in the UK and around the world, is toward increasingly automated immigration systems.

From the transitional and experimental phase that we are currently in, it is clear that automated immigration systems can bring benefits. For example, automation has allowed millions of people to obtain their status under the EU Settlement Scheme more quickly than would otherwise have been possible, reducing delay and associated anxiety. These systems also appear to have had some success in reducing decision-making costs. However, automated systems also pose clear and real risks of failure. These failures can occur, and have already occurred, at both the individual and the systemic level, with disastrous effects for individuals and their families, as well as for wider society and the economy. The resultant harms must be taken seriously – certainly more seriously than the Home Office appears to have taken them to this point.

Restricted access

At dawn on 30 June 2014, Raja Noman Hussain awoke to find about 15 immigration and police officers raiding his house.1 Raja, a 22-year-old Pakistani man, had arrived in the UK several years earlier to study. Now he was being accused of cheating in an English language proficiency test approved by the Home Office, which he had sat in 2012 to meet a condition of his visa. After confirming his ID, the officers told him to grab some clothes, handcuffed him, and took him into immigration detention. Raja spent the next four months in detention, during which time he estimates he met over 100 other international students who had also been detained on the same basis. What followed was six years of legal battles over the cheating allegation, which disrupted his studies, estranged him from his family, and cost him around £30,000. Finally, in early 2021, Raja succeeded in clearing his name and confirming his right to be in the UK.

Raja was one of the tens of thousands of students whose visas were revoked or curtailed – and studies disrupted or ended – after the Home Office accused them of cheating in a government-approved English language test. This scandal eventually hit the headlines. The ensuing appeals and judicial reviews – which became known as the ‘ETS cases’ – have cost the government millions of pounds.2 What is less appreciated about this debacle is that much of it centred on a failed automated system: a voice recognition algorithm which the government used to identify suspected cheats. This chapter explores that side of the story.

Restricted access