In this update, we detail the key legislative developments in the second quarter of 2020 related to artificial intelligence (“AI”), the Internet of Things (“IoT”), cybersecurity as it relates to AI and IoT, and connected and automated vehicles (“CAVs”). The volume of legislation on these topics has slowed but not ceased, as lawmakers increasingly focus on the pandemic and the upcoming national election. As Congress works through appropriations bills, it continues to look for ways to support and fund these technologies. We will continue to update you on meaningful developments between these quarterly updates across our blogs.
On 16 July 2020, the European Commission (“Commission”) announced that it has launched an antitrust sector inquiry into “consumer-related products and services that are connected to a network and can be controlled at a distance, for example via a voice assistant or mobile device.”
Commission Executive Vice President and Competition Commissioner Vestager said that “[t]he sector inquiry will cover products such as wearable devices (e.g. smart watches or fitness trackers) and connected consumer devices used in the smart home context, such as fridges, washing machines, smart TVs, smart speakers and lighting systems. The sector inquiry will also collect information about the services available via smart devices, such as music and video streaming services and about the voice assistants used to access them.” Connected cars are outside the scope of the inquiry.
Senators Lindsey Graham (R-S.C.), Tom Cotton (R-Ark.) and Marsha Blackburn (R-Tenn.) have introduced the Lawful Access to Encrypted Data Act, a bill that would require tech companies to assist law enforcement in executing search warrants that seek encrypted data. The bill would apply to law enforcement efforts to obtain data at rest as well as data in motion. It would also apply to both criminal and national security legal process. This proposal comes in the wake of the Senate Judiciary Committee’s December 2019 hearing on encryption and lawful access to data. According to its sponsors, the purpose of the bill is to “end the use of ‘warrant-proof’ encrypted technology . . . to conceal illicit behavior.”
Today, the Supreme Court issued its decision in Barr v. American Association of Political Consultants, which addressed the constitutionality of the Telephone Consumer Protection Act (TCPA). Although the Court splintered in its reasoning—producing four separate opinions—the justices nevertheless coalesced around two core conclusions: (1) the TCPA’s exception for government debt collection calls is unconstitutional, and (2) the exception can be severed from the rest of the TCPA. Six justices determined that the TCPA’s government-debt exception violates the First Amendment, and seven justices concluded that the exception is severable from the rest of the statute. The end result is that the government-debt exception is invalid but the rest of the TCPA—including its general prohibition on automated calls and text messages to mobile numbers—remains intact. The narrow scope of this ruling suggests that it may have limited practical effect for most parties.
On June 4, 2020, Representatives Anna Eshoo (D-CA-18), Anthony Gonzalez (R-OH-16), and Mikie Sherrill (D-NJ-11) introduced the National AI Research Resource Task Force Act. This bipartisan bill would create a task force to propose a roadmap for developing and sustaining a national research cloud for AI. The cloud would help provide researchers with access to computational resources and large-scale datasets to foster the growth of AI.
“AI is shaping our lives in so many ways, but the true potential of it to improve society is still being discovered by researchers,” explained Rep. Eshoo. “I’m proud to introduce legislation that reimagines how AI research will be conducted by pooling data, compute power, and educational resources for researchers around our country. This legislation ensures that our country will retain our global lead in AI.”
Earlier this week, the Federal Communications Commission’s (FCC’s) Consumer and Governmental Affairs Bureau released a Declaratory Ruling clarifying the agency’s interpretation of the “Automatic Telephone Dialing System” (an “autodialer” or “ATDS”) definition in the Telephone Consumer Protection Act (TCPA). The Ruling clarified that, in the context of a call or text message platform, the definition does not turn on whether the platform is used by others to transmit a large volume of calls or text messages; instead, the relevant inquiry is whether the platform is capable of transmitting calls or text messages without a user manually dialing each such call or text message.
Earlier this month, the Federal Communications Commission (“FCC”) asked for comment on a Petition for Rulemaking filed by the Consumer Technology Association (“CTA”) that proposes to modify the FCC’s device authorization rules to allow the importation and conditional, preauthorization marketing and sales of radiofrequency (“RF”) devices that have not yet been approved under the FCC’s rules. The deadline for filing comments supporting or opposing the petition is July 9, 2020.
The COVID-19 pandemic is accelerating the digital transition and the adoption of artificial intelligence (“AI”) tools and Internet of Things (“IoT”) devices in many areas of society. While there has been significant focus on leveraging this technology to fight the pandemic, the technology also will have broader and longer-term benefits. As the New York Times has explained, “social-distancing directives, which are likely to continue in some form after the crisis subsides, could prompt more industries to accelerate their use of automation.”
For businesses proceeding with reopenings over the coming weeks and months, and for sectors that have continued to operate, AI and IoT technologies can greatly improve the way they manage their operations, safely engage with customers, and protect employees during the COVID-19 crisis and beyond. But businesses also should take steps to ensure that their use of AI and IoT technologies complies with the evolving legal requirements that can vary based on several factors, including the industry sector where the technology is deployed and the jurisdiction where it is used. Businesses also will want to have mechanisms in place to help ensure that the technology is used appropriately, including appropriate oversight and workforce training and other measures.
On June 2, 2020, the French Supervisory Authority (“CNIL”) published a paper on algorithmic discrimination prepared by the French independent administrative authority known as the “Défenseur des droits”. The paper is divided into two parts: the first part discusses how algorithms can lead to discriminatory outcomes, and the second part includes recommendations on how to identify and minimize algorithmic biases. This paper follows a 2017 paper published by the CNIL on “Ethical Issues of Algorithms and Artificial Intelligence”.
Senators Maria Cantwell (D-WA) and Bill Cassidy (R-LA) introduced bipartisan legislation this week to address privacy issues in the COVID-19 era. The proposal, entitled the “Exposure Notification Privacy Act,” would regulate “automated exposure notification services” developed to respond to COVID-19. This bipartisan legislation comes on the heels of dueling privacy proposals from both political parties: we previously analyzed the Republican “COVID-19 Consumer Data Protection Act” introduced by Senate Commerce Chairman Roger Wicker (R-MS), as well as the Democratic “Public Health Emergency Privacy Act,” on this blog. Below are descriptions of the notable provisions in the Exposure Notification Privacy Act:
- In contrast to the Wicker proposal and the proposal introduced by House and Senate Democrats, both of which would cover symptom tracking and other apps, this new bipartisan proposal would be narrower, regulating only operators of so-called “automated exposure notification services.” The term is defined as any website or mobile application designed or marketed to digitally notify “an individual who may have become exposed to an infectious disease.” Operators can be both for-profit and non-profit entities.
- However, the definition of covered personal data is broader than some earlier proposals that only covered certain categories of health and location data. The new proposal covers all data linked or reasonably linkable to any individual or device that is “collected, processed, or transferred in connection with an automated exposure notification service.” This definition is broader than the Republican proposal, which defined covered data to include health information, geolocation data, and proximity data. It is also broader than the Democratic proposal, which included the same data elements as the Republican proposal while also covering certain medical testing data and contact information.
- Under the bipartisan bill, operators may not enroll individuals in automated exposure notification services without their affirmative express consent, a requirement consistent with both the Democratic and Republican proposals.
- However, the new proposal would restrict the ability of such technologies to collect, process, or share an actual, potential, or presumptive positive diagnosis of an infectious disease except when the diagnosis is confirmed by a public health authority or a licensed health provider.
- The proposal requires operators to “collaborate with a public health authority in the operation” of their notification service.
- The bill includes certain transfer restrictions. Covered data may only be transferred for certain enumerated purposes, such as to notify enrolled individuals of potential exposure to an infectious disease, or to public health authorities or contracted service providers.
- The bill obligates operators to delete all covered data upon request of the individual, as well as within 30 days of the receipt of such data, either on a rolling basis or “at such times as is consistent with a standard published by a public health authority within an applicable jurisdiction.” These deletion requirements do not apply to data retained for public health research purposes.
- The bill distinguishes between operators and service providers, and only a subset of obligations—such as data deletion requirements—apply to service providers. Service providers with “actual knowledge” that an operator has failed to adhere to certain standards required under the proposal would be obligated to notify the operator of the potential violation.
- Similar to the Democratic proposal, this bill makes it unlawful for “any person or entity” to discriminate on the basis of “covered data collected or processed through an automated exposure notification service” or their choice “to use or not use” such a service.
- While the Democratic and Republican proposals imposed public reporting obligations on covered entities, this bipartisan proposal would impose such an obligation on the federal Privacy and Civil Liberties Oversight Board. Under the proposal, the Board would be required to issue a report within one year after enactment that assesses “the impact on privacy and civil liberties of Government activities in response to the public health emergency related to” COVID-19 and makes recommendations for the future.
As with both the Republican and Democratic proposals, the Exposure Notification Privacy Act’s enforcement provisions name both the Federal Trade Commission and state Attorneys General. Notably, the Act preserves the right of individuals to bring claims arising under various state laws, including consumer protection laws, health privacy or infectious disease laws, civil rights laws, state privacy and data breach notification laws, and contract or tort law.