The UK Cyber Security and Resilience Bill


Background

The UK government has recently announced that it plans to introduce a Cyber Security and Resilience Bill (Bill). The Bill seeks to update the Network and Information Systems Regulations 2018, which implemented the European Union (EU) NIS 1 Directive when the UK was a member of the EU.

A key driver behind the UK government’s plans is a desire to stay broadly aligned with evolving EU legislation, particularly with the significant expansion in scope of the new EU NIS 2 Directive. Once presented to Parliament, the Bill could become law by early 2026.


Countries Poised to Adopt New Cybersecurity Measures After UN Adopts Major Cybercrime Convention


On August 7, 2024, after three years of negotiation, the United Nations’ Ad Hoc Committee to Elaborate a Comprehensive International Convention on Countering the Use of Information and Communications Technologies for Criminal Purposes unanimously adopted the Convention Against Cybercrime. The Convention now goes to the General Assembly, where it is expected to be adopted. If ratified by 40 member states, the Convention will enter into force.


UK Supreme Court Rules That AI Cannot Be an ‘Inventor’ Under UK Patent Law


In Thaler v Comptroller-General of Patents, Designs and Trade Marks [2023] UKSC 49, the UK Supreme Court ruled that AI cannot be an ‘inventor’ for the purposes of UK patent law. The ruling concludes a series of appeals from Dr Stephen Thaler and his collaborators, who argued that an AI system called ‘DABUS’ should be named as the inventor of two inventions it had generated autonomously, relating to food and beverage packaging and to light beacons. This was part of a series of test cases, which have had limited success globally, seeking to establish that AI systems can make inventions and that the owners of such systems can apply for and secure the grant of patents for those inventions. The judgment noted that the broader questions of whether an invention generated autonomously by AI ought to be patentable, or whether the meaning of the term ‘inventor’ should be expanded to include machines powered by AI, were matters of policy that would need to be addressed by legislation.

The UK Supreme Court made three main findings.

  1. DABUS is not an ‘inventor’ under the Patents Act 1977 (“Patents Act”)
     - An ‘inventor’ within the meaning of the Patents Act must be a natural person (a human being). Since DABUS is a machine, not a natural person, it cannot be an ‘inventor.’
     - It was not Dr Thaler’s case that he was the inventor and had simply used DABUS as a highly sophisticated tool. Had Dr Thaler made that case and named himself as the inventor, the Court noted that its decision might have been different, but it was not the Court’s place to determine that question.
  2. Dr Thaler was not entitled to apply for and obtain a patent simply by virtue of his ownership of DABUS
     - Dr Thaler sought to rely on the doctrine of accession, whereby the owner of existing property also owns new property generated by that existing property (in the same way that a farmer owns both the cow and the calf). The Court held that this doctrine applies only to tangible property, not to intangible inventions, so title to an invention cannot pass as a matter of law from the machine that generated it to the owner of that machine. The argument also assumed that DABUS could itself be an inventor within the meaning of the Patents Act, which, as the Court had already established, it cannot.
  3. By failing to satisfy the requirements of the Patents Act, the two patent applications must be taken to have been withdrawn
     - Because Dr Thaler had failed to name an inventor and had failed to state a valid right to apply for and obtain the patents, the UK Intellectual Property Office had been correct to find that his two patent applications would be taken to be withdrawn at the expiry of the 16-month period prescribed by UK patent law for this purpose.

Commentary

Dr Thaler’s UK patent applications were part of a project involving parallel applications to patent offices around the world. The UK Supreme Court’s ruling is unsurprising and follows similar decisions in the United States and Europe.

The ruling raises significant issues for the AI industry, but it is important to focus on what it confirms: that inventors must be natural persons for the purposes of UK patent law. The judgment does not affect the patentability of AI-generated inventions, as it does not necessarily preclude a person from securing a patent, provided that a human being is named as the inventor.

Bletchley Park AI Safety Summit 2023


On 1 and 2 November 2023, world leaders, politicians, computer scientists and tech executives attended the global AI Safety Summit at Bletchley Park in the UK. Key political attendees included US Vice President Kamala Harris, European Commission President Ursula von der Leyen, UN Secretary-General António Guterres, and UK Prime Minister Rishi Sunak. Industry leaders also attended, including Elon Musk, Google DeepMind CEO Demis Hassabis, OpenAI CEO Sam Altman, Amazon Web Services CEO Adam Selipsky, and Microsoft President Brad Smith.

Day 1: The Bletchley Declaration

On the first day of the summit, 28 countries and the EU signed the Bletchley Declaration (“Declaration”). The Declaration establishes an internationally shared understanding of the risks and opportunities of AI and the need for sustainable technological development to protect human rights and to foster public trust and confidence in AI systems. In addition to the EU, signatories include the UK, the US and, significantly, China. Nevertheless, there are notable absences, most obviously Russia.


The UK’s Online Safety Bill – Implications for US and International Businesses


On 19 September 2023, the UK Parliament passed the Online Safety Bill (“OSB”). The OSB aims to protect individuals from illegal online content and focuses on the protection of children by requiring the removal of content that is legal but harmful to children. For example, social media platforms will be required to act rapidly to prevent children from viewing illegal material or content that is harmful to them, such as pornography, online bullying, and the promotion of suicide, self-harm or eating disorders. The definition of illegal content covers content that is already unlawful under existing legislation, such as terrorism, hate speech and child sexual exploitation, and the OSB introduces new offences relating to more recent online phenomena such as revenge pornography and ‘upskirting’ and ‘downblousing’ images.

The OSB is one of the most significant pieces of UK legislation post-Brexit and reflects a distinctly UK approach to online harms, with which businesses operating globally will need to comply. It will need to be reviewed in parallel with the EU Digital Services Act, which has similar goals of making Europe a safe online environment.

A date for Royal Assent (when the OSB will become law) is expected shortly. The OSB’s wide scope makes implementation problems likely, along with potential challenges arising from its impact on freedom of expression and personal privacy. The OSB’s underlying principles are very different from those of US law and its constitutional protections for free speech. The risks of non-compliance will be significant, with extremely high potential fines of up to 10% of a company’s global revenue.


Court of Justice of the European Union Recognizes Inferred Special Categories of Personal Data


On August 1, 2022, the Court of Justice of the European Union (CJEU) issued a judgment in a Lithuanian data protection case that may signal a broader interpretation of the definition of sensitive personal data under the EU’s General Data Protection Regulation (GDPR). Specifically, the CJEU found that data indirectly disclosing sexual orientation constitutes sensitive personal data.

At issue was a Lithuanian law that requires the Chief Official Ethics Commission of Lithuania to publish information about the private interests of public officials in an effort to combat corruption. In the facts underlying the case, a Lithuanian official objected to the Chief Official Ethics Commission’s online publication of his private interest information, which included his spouse’s name. The CJEU concluded that the publication of such information was prohibited by the GDPR because it was “liable to disclose indirectly the sexual orientation of a natural person” and therefore fell within the special categories of personal data whose processing is generally prohibited under GDPR Article 9 (processing of special categories of personal data) unless certain additional conditions are satisfied, such as the data subject’s explicit consent or processing that is necessary for reasons of substantial public interest.

