Last week, Federal Communications Commission (“FCC”) Chairwoman Jessica Rosenworcel announced plans to reorganize the agency’s International Bureau by creating a new Space Bureau and a standalone Office of International Affairs. The announcement, which marks the latest in a string of space-focused actions over the last several months, is a further indication of the FCC’s commitment to leadership in the growing space economy.
U.S. AI, IoT, CAV, and Privacy Legislative Update – Third Quarter 2022
This quarterly update summarizes key legislative and regulatory developments in the third quarter of 2022 related to Artificial Intelligence (“AI”), the Internet of Things (“IoT”), connected and autonomous vehicles (“CAVs”), and data privacy and cybersecurity.
This quarter, Congress has continued to focus on the American Data Privacy and Protection Act (“ADPPA”) (H.R. 8152), which would regulate the collection and use of personal information and includes specific requirements for AI systems. Disagreements over the legislation’s preemption of state laws and creation of a private right of action continue to stall its progress. Separately, the Federal Trade Commission (“FTC”) announced an Advance Notice of Proposed Rulemaking to solicit input on questions related to privacy and automated decision-making systems. The notice cites the FTC’s prior guidance related to IoT devices.
Artificial Intelligence
Regulators and the White House have expressed increased interest in setting forth requirements and best practice expectations around the operation of AI systems. For example, the FTC announced an Advance Notice of Proposed Rulemaking in August that asks for comments on a number of topics related to automated decision-making systems. In particular, the FTC is requesting comments on the prevalence of error in automated decision-making systems, discrimination based on protected categories facilitated by algorithmic decision-making systems (and whether the FTC should consider recognizing additional categories of protected classes), and how the FTC should address algorithmic discrimination that occurs through the use of proxies.
In early October, the White House also released its Blueprint for an AI Bill of Rights. Discussed in further detail here, the Blueprint outlines recommended best practices for entities using AI, which include measures to provide a safe and effective system, protections against algorithmic discrimination, attention to data privacy, notice and explanation, and the provision of human alternatives and consideration.
Congress continues to weigh in on the regulation of AI systems. The latest version of the ADPPA would require a covered entity or service provider that “knowingly develops” a covered algorithm that processes covered data “in furtherance of a consequential decision” to evaluate the design, structure, and inputs of the covered algorithm. In addition, entities of a certain size, which the bill calls “large data holders,” must conduct an impact assessment that includes a description of the design process and methodologies of the covered algorithm, an assessment of the necessity and proportionality of the algorithm in relation to its stated purpose, and the steps the entity will take to mitigate the risk of harm.
Internet of Things
This quarter, federal lawmakers introduced and advanced several bills related to the Internet of Things (“IoT”), including two bills imposing requirements on manufacturers of devices with cameras or microphones. One of these bills is the Earning Approval of Voice External Sound Databasing Retained on People (“EAVESDROP”) Act (H.R. 8543), introduced by Representative Steve Scalise (R-LA) in July. The bill would require manufacturers of connected devices with microphones to provide notices to consumers regarding the devices’ collection of certain consumer information. Manufacturers would also have to provide an easy way for consumers to deactivate the ability of the device to collect information. The EAVESDROP Act exempts devices solely marketed as microphones and provides a safe harbor for manufacturers that comply with a set of self-regulatory guidelines to be developed by the FTC. In contrast, the Informing Consumers about Smart Devices Act (H.R. 4081) would require manufacturers of connected devices equipped with a camera or microphone to disclose to consumers that a camera or microphone is part of the device, and would not apply to mobile phones, laptops, or other devices that consumers would reasonably expect to include a camera or microphone. The Informing Consumers about Smart Devices Act is sponsored by Reps. John R. Curtis (R-UT) and Seth Moulton (D-MA) and was approved by the House of Representatives on September 29, 2022.
Additionally, on September 28, 2022, the Senate approved the Small Business Broadband and Emerging Information Technology Enhancement Act of 2022 (S. 3906). As we noted in our Second Quarterly Legislative and Regulatory Update, this bipartisan bill, sponsored by Senators Jeanne Shaheen (D-NH) and John Kennedy (R-LA), aims to bolster IoT competencies at the Small Business Administration (“SBA”), including through the designation of a coordinator for emerging information technology (which includes IoT technology).
Federal regulatory efforts related to IoT this quarter largely centered on cybersecurity and consumer protections. For instance, the National Institute of Standards and Technology (“NIST”) published the final version of its Profile of the IoT Core Baseline for Consumer IoT Products (NIST IR 8425), building on work undertaken pursuant to E.O. 14028. The publication, which follows a public draft released in June 2022, describes NIST’s cybersecurity expectations for IoT products for home and personal use. As we noted in our previous quarterly update, the NIST guidance is not legally binding, but it signals a best practice that may later be incorporated by lawmakers in legislation.
NIST also published a report summarizing key takeaways from its June 2022 IoT Cybersecurity workshop (NIST IR 8431), and a report with guidance for first responders on minimizing security vulnerabilities when using mobile and wearable devices (NIST IR 8235). Other agency activities impacting IoT technology include the FTC’s publication of a business guidance blog post that focuses on the marketplace for sensitive consumer location and health information collected by connected devices and highlights FTC enforcement against misuse of consumer data and deceptive claims about data anonymization. These developments signal a continued focus by federal regulators on IoT cybersecurity and the protection of consumer data collected by connected devices.
Connected and Autonomous Vehicles
On August 8, 2022, Reps. Debbie Dingell (D-MI) and Bob Latta (R-OH) launched the bipartisan Congressional Autonomous Vehicle Caucus. The first of its kind, the caucus aims to educate Congressional Members and staff on autonomous vehicle technology that can improve the safety and accessibility of roadways. Rep. Dingell stated that the caucus will help the United States stay at the “forefront of innovation, manufacturing, and safety” while “engaging all stakeholders, making bold investments, and working across the aisle to get the necessary policies right to support the safe deployment of autonomous vehicles.” Industry should watch for developments here, as policy proposals and opportunities for engagement could be on the horizon.
Federal regulators remain active in this space, signaling an interest in funding and advancing the deployment of CAV technologies. A recent stated priority for the Strengthening Mobility and Revolutionizing Transportation (“SMART”) Grants Program is to improve the integration of systems and promote connectivity of infrastructure, connected vehicles, pedestrians, and bicyclists, and the Department of Transportation (“DOT”) authorized and appropriated $100M for projects in this space for FY2022. Additionally, the Federal Transit Administration (“FTA”) and DOT issued a Notice of Funding Opportunity to apply for funding for projects exploring the use of Advanced Driver Assistance Systems (“ADAS”) for transit buses to demonstrate transit bus automation technologies in real-world settings. Finally, DOT issued a Request for Information seeking comments on the possibility of adapting existing and emerging automation technologies to accelerate the development of real-time roadway intersection safety and warning systems for drivers and vulnerable road users.
This quarter, the National Highway Traffic Safety Administration (“NHTSA”) also released a final version of the Cybersecurity Best Practices for the Safety of Modern Vehicles guidance, an update to its 2016 edition. While the edits were largely cosmetic, a few key changes potentially relevant to CAVs and in-vehicle software are summarized below:
- The final version clarifies that both suppliers and manufacturers should maintain a database of software components so that when vulnerabilities are identified in software, affected systems can be easily identified.
- The final version adds a new best practice stating that manufacturers should employ measures to limit firmware version rollback attacks (i.e., when an attacker uses the software update mechanisms to place older, more vulnerable software on a targeted device); a simplified sketch of a downgrade check appears after this list.
- The final version adds a new best practice stating that industry should collaborate to address “future risks” as they emerge.
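To make the rollback-prevention practice above concrete, the sketch below shows one simple way a device could refuse firmware downgrades during an update check. This is an illustrative assumption, not language or a mechanism prescribed by the NHTSA guidance; the version format and update flow are hypothetical.

```python
# Illustrative sketch only: a simple anti-rollback check that refuses to install
# firmware older than the version currently running. The "major.minor.patch" scheme
# and update flow are hypothetical, not drawn from the NHTSA guidance.
def parse_version(version: str) -> tuple:
    """Convert '2.4.1' into (2, 4, 1) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

def can_install(current: str, candidate: str) -> bool:
    """Allow an update only if it is not older than the installed firmware."""
    if parse_version(candidate) < parse_version(current):
        print(f"Rejected: {candidate} would roll back installed firmware {current}")
        return False
    print(f"Accepted: installing firmware {candidate}")
    return True

can_install("2.4.1", "2.3.0")  # rejected: a rollback to older, potentially vulnerable code
can_install("2.4.1", "2.5.0")  # accepted
```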
Privacy and Cybersecurity
As described in further detail in our second quarterly update for 2022 and here, the ADPPA continues to be the prevailing data privacy framework in Congress. The bill sets forth broad requirements around data collection and disclosures, though the likelihood of passage this Congress continues to decrease as lawmakers remain stalled over issues around preemption and a private right of action. California’s principal privacy regulator – the California Privacy Protection Agency – convened a special meeting on July 28, 2022 to discuss the ADPPA and to express the Agency’s strong disagreement with the ADPPA’s preemption provision.
The FTC is also exploring privacy regulation, including through its Advance Notice of Proposed Rulemaking, released in August. Specifically, the notice broadly asks whether the agency “should implement new trade regulation rules or other regulatory alternatives concerning the ways in which companies (1) collect, aggregate, protect, use, analyze, and retain consumer data, as well as (2) transfer, share, sell, or otherwise monetize that data in ways that are unfair or deceptive.” Notably, the FTC recently extended the deadline to receive comments on the notice to November 21, 2022. Additionally, the FTC released its agenda for a workshop on children’s advertising that will be held on October 19, 2022, which will focus on whether children can distinguish ads from entertainment in digital media.
We will continue to update you on meaningful developments in these quarterly updates and across our blogs.
Artificial Intelligence & NYC Employers: New York City Seeks Public Comment on Proposed Rules That Would Regulate the Use of AI Tools in the Employment Context
Many employers and employment agencies have turned to artificial intelligence (“AI”) tools to assist them in making better and faster employment decisions, including in the hiring and promotion processes. The use of AI for these purposes has been scrutinized and will now be regulated in New York City. The New York City Department of Consumer and Worker Protection (“DCWP”) recently issued a Notice of Public Hearing and Opportunity to Comment on Proposed Rules relating to the implementation of New York City’s law regulating the use of automated employment decision tools (“AEDT”) by NYC employers and employment agencies. As detailed further below, the comment period is open until October 24, 2022.
NYC’s Local Law 144, which takes effect on January 1, 2023, prohibits employers and employment agencies from using certain AI tools in the hiring or promotion process unless the tool has been subject to a bias audit within one year prior to its use, the results of the audit are publicly available, and notice requirements to employees or job candidates are satisfied. The DCWP, the New York City agency responsible for administering this law, proposed the new rules to clarify the responsibilities of employers and employment agencies once the statute goes into effect.
What tools are impacted? What are “Automated Employment Decision Tools”?
The law governs “AEDTs,” which are defined as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” The proposed rules outline which tools fall within the scope of the law by defining “to substantially assist or replace discretionary decision making” as:
- relying solely on the tool’s output (score, tag, classification, ranking, etc.) without considering other factors;
- using the tool’s output as one of a set of criteria where the output is weighted more than any other criterion in the set; or
- using the tool’s output to overrule or modify conclusions derived from other factors (a simplified illustration of these criteria follows this list).
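As a rough illustration of how these three criteria operate, the sketch below applies them to a hypothetical description of how a tool’s output is used in a hiring workflow. The field names, weights, and structure are assumptions for illustration only; they are not terms defined in the proposed rules.

```python
# Hypothetical illustration of the three criteria above; field names and weights
# are assumptions, not language from the proposed rules.
from dataclasses import dataclass, field

@dataclass
class ToolUse:
    relies_solely_on_output: bool = False         # criterion 1: no other factors considered
    criterion_weights: dict = field(default_factory=dict)  # e.g. {"aedt_score": 0.5, "interview": 0.3}
    output_overrules_other_factors: bool = False  # criterion 3: output overrides other conclusions

def substantially_assists_or_replaces(use: ToolUse, aedt_key: str = "aedt_score") -> bool:
    """Return True if any of the three conditions described in the proposed rules appears to be met."""
    if use.relies_solely_on_output or use.output_overrules_other_factors:
        return True
    weights = use.criterion_weights
    if aedt_key in weights:
        other_weights = [w for name, w in weights.items() if name != aedt_key]
        # Criterion 2: the tool's output is weighted more than any other criterion in the set.
        if other_weights and weights[aedt_key] > max(other_weights):
            return True
    return False

# Example: the AEDT score counts for half of the decision, more than any single other factor.
example = ToolUse(criterion_weights={"aedt_score": 0.5, "interview": 0.3, "references": 0.2})
print(substantially_assists_or_replaces(example))  # True under this simplified reading
```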
When does the statute apply? Who are “candidates for employment”?
The new law applies when employers and employment agencies use an AEDT to screen either “candidates for employment” or “employees for promotion” within New York City. The proposed rules define “candidates for employment” to mean persons who have applied for a specific employment position by submitting the necessary information and/or items in the format required by the employer or employment agency.
What is the “bias audit”?
Local Law 144 prohibits the use of an AEDT unless it has been the subject of a bias audit within one year prior to its use. Under the proposed rules, the structure and requirements for the bias audit change based on how the AEDT is used.
- Where an AEDT selects individuals to move forward in the hiring process or classifies individuals into groups, the bias audit must: (i) calculate the selection rate for each category/classification and (ii) calculate the impact ratio for each category/classification. Categories are the Component 1 categories (race, ethnicity, and gender) as designated on the federal EEO-1 report.
- Where the AEDT only scores individuals rather than selecting them, the proposed rules require the bias audit to: (i) calculate the average score for individuals in each category and (ii) calculate the impact ratio for each category. (A worked example of these calculations appears after this list.)
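For concreteness, here is a small worked example of the selection-rate and impact-ratio calculations described above, using entirely made-up counts. The category labels and numbers are hypothetical and do not reflect any required reporting format; a parallel calculation applies to the average-score version described in the second bullet.

```python
# Worked example with made-up numbers; categories and counts are hypothetical.
applicants = {"Category A": 200, "Category B": 150, "Category C": 50}
selected = {"Category A": 60, "Category B": 30, "Category C": 5}

# Selection rate: share of applicants in each category selected to move forward.
selection_rates = {c: selected[c] / applicants[c] for c in applicants}

# Impact ratio: each category's selection rate divided by the highest selection rate.
highest_rate = max(selection_rates.values())
impact_ratios = {c: rate / highest_rate for c, rate in selection_rates.items()}

for c in applicants:
    print(f"{c}: selection rate {selection_rates[c]:.2f}, impact ratio {impact_ratios[c]:.2f}")
# Category A: selection rate 0.30, impact ratio 1.00
# Category B: selection rate 0.20, impact ratio 0.67
# Category C: selection rate 0.10, impact ratio 0.33
```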
An “independent auditor” must perform bias audits. The proposed rules define “independent auditor” as “a person or group that is not involved in using or developing an AEDT that is responsible for conducting a bias audit of such AEDT.”
What happens with the audit results?
The proposed rules clarify Local Law 144’s requirement that the results of a bias audit must be “made publicly available on the website of the employer or employment agency” prior to use by stating that the information must be posted “on the careers or jobs section of their website in a clear and conspicuous manner.” Additionally, the proposed rule would require the information to remain posted for at least six months after the AEDT was last used to make an employment decision.
What about the notice requirements?
Local Law 144 requires any employer or employment agency that uses an AEDT to screen an employee or a candidate who has applied for a position for an employment decision to notify individuals who reside in New York City that the AEDT will be used in connection with their assessment or evaluation, as well as of the job qualifications and characteristics that the AEDT will consider. Notice must be provided at least 10 business days before use of an AEDT and must include instructions for how to request an alternative selection process or accommodation.
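As a small practical aid, the sketch below estimates the latest date on which notice could go out for a planned first use of an AEDT, assuming a Monday-through-Friday business-day calendar with no holidays. The counting convention shown is an illustrative assumption, not an interpretation of how Local Law 144 measures the period.

```python
# Estimate the latest notice date for a planned first use of an AEDT, assuming a
# Monday-Friday business-day calendar with no holidays. The counting convention here
# is an illustrative assumption, not a reading of Local Law 144.
import numpy as np

def latest_notice_date(planned_first_use: str) -> np.datetime64:
    """Return the date 10 business days before the planned first-use date."""
    return np.busday_offset(np.datetime64(planned_first_use), -10, roll="backward")

print(latest_notice_date("2023-02-15"))  # 2023-02-01 under these assumptions
```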
The proposed rules provide guidance to employers and employment agencies on how to satisfy the law’s notice requirements.
- For candidates for employment, the rules allow notification to impacted individuals through the following means:
(i) on the careers or jobs section of its website in a clear and conspicuous manner,
(ii) in the job posting, or
(iii) via U.S. mail or e-mail.
- For existing employees, the law’s notice requirements may be satisfied through:
(i) written policies or procedures,
(ii) the job posting, or
(iii) written notice in person, via U.S. mail, or via e-mail.
How do I comment on the proposed rules?
Anyone can comment on the proposed rules by:
- Submitting Written Comments: Written comments on the proposed rules must be submitted on or before Monday, October 24, 2022 and may be submitted via email at Rulecomments@dcwp.nyc.gov or through the city’s rules website at http://rules.cityofnewyork.us.
- Attending the Public Hearing: Interested parties may attend the public hearing on the proposed rules, which is scheduled to take place on Monday, October 24, 2022 at 11:00 AM. Additional details on how to access the public hearing via phone or videoconference are available here.
Supreme Court Grants Certiorari in Gonzalez v. Google, Marking First Time Court Will Review Section 230
This morning, the Supreme Court granted certiorari in Gonzalez v. Google LLC, 2 F.4th 871 (9th Cir. 2021) on the following question presented: “Does section 230(c)(1) immunize interactive computer services when they make targeted recommendations of information provided by another information content provider, or only limit the liability of interactive computer services when they engage in traditional editorial functions (such as deciding whether to display or withdraw) with regard to such information?” This is the first opportunity the Court has taken to interpret 47 U.S.C. § 230 (“Section 230”) since the law was enacted in 1996.
Fifth Circuit Upholds Texas Law Restricting Online “Censorship”
On September 16, the Fifth Circuit issued its decision in NetChoice L.L.C. v. Paxton, upholding Texas HB 20, a law that limits the ability of large social media platforms to moderate content and imposes various disclosure and appeal requirements on them. The Fifth Circuit vacated the district court’s preliminary injunction, which previously blocked the Texas Attorney General from enforcing the law. NetChoice is likely to ask the U.S. Supreme Court to review the Fifth Circuit’s decision.
HB 20 prohibits “social media platforms” with “more than 50 million active users” from “censor[ing] a user, a user’s expression, or a user’s ability to receive the expression of another person” based on the “viewpoint” of the user or another person, or the user’s location. HB 20 also includes various transparency requirements for covered entities, for example, requiring them to publish information about their algorithms for displaying content, to publish an “acceptable use policy” with information about their content restrictions, and to provide users an explanation for each decision to remove their content, as well as a right to appeal the decision.
CISA Requests Public Comment on Implementing Regulations for the Cyber Incident Reporting for Critical Infrastructure Act
On September 12, 2022, the U.S. Cybersecurity and Infrastructure Security Agency (“CISA”) published a Request for Information, seeking public comment on how to structure implementing regulations for reporting requirements under the Cyber Incident Reporting for Critical Infrastructure Act of 2022 (“CIRCIA”). Written comments are requested on or before November 14, 2022 and may be submitted through the Federal eRulemaking Portal: http://www.regulations.gov.
Federal Circuit Rules That Under The Patent Act An Inventor Must Be Human: So What Can Be Done To Patent AI Inventions?
In its August 5, 2022 affirmance of the district court’s grant of summary judgment, the Federal Circuit in Thaler v. Vidal ruled that the Patent Act unambiguously and directly answers the question of whether an AI software system can be listed as the inventor on a patent application. Since an inventor must be a human being, AI cannot be.
Judge Stark’s first authored precedential opinion since confirmation to the Federal Circuit aligns the U.S. position on whether AI can be listed as an inventor on a patent application with that of other major jurisdictions. Left for another day are questions such as the rights, if any, of AI systems, and whether AI systems can contribute to the conception of an invention.
PTO and Litigation Background of the DABUS Patent Applications
In July 2019, two patent applications were filed in the United States Patent and Trademark Office (PTO) that identified an AI system called DABUS (Device for the Autonomous Bootstrapping of Unified Sentience) as the sole inventor and Stephen L. Thaler as the Applicant and Assignee. DABUS, which was characterized as “a particular type of connectionist artificial intelligence” known as a “Creativity Machine” during prosecution and as “a collection of source code or programming and a software program” before the U.S. District Court for the Eastern District of Virginia, allegedly generated the subject matter of the two patent applications.
The filed patent applications specifically stated that the inventions were conceived by DABUS, and that DABUS should accordingly be named as the inventor. The PTO subsequently issued Notices stating that the applications did not identify each inventor by his or her legal name. In response to filed Petitions requesting that the PTO vacate the issued Notices, the PTO issued Petition Decisions refusing to vacate, explaining that a machine does not qualify as an inventor under the patent laws, and providing additional time to identify inventors by their legal name to avoid abandonment of the applications.
Thaler then sought judicial review under the Administrative Procedure Act in the Eastern District of Virginia, requesting an order compelling the PTO to reinstate the DABUS patent applications, and a declaration that a patent application for an AI-generated invention should not be rejected on the basis that no natural person is identified as an inventor. After briefing and oral argument, the district court issued an order denying Thaler’s requested relief and granting the PTO’s motion for summary judgment, recognizing the Federal Circuit’s consistent holdings under current patent law requiring inventors to be natural persons.
Biden Administration Announces Priorities for the Implementation of the CHIPS Act of 2022
On August 25, 2022, President Biden announced a new Executive Order (“EO”) addressing the Implementation of the CHIPS Act of 2022 (“CHIPS Act”). The CHIPS Act was signed by President Biden on August 9, 2022, and, among other things, authorizes $39 billion in funding for new projects to establish semiconductor production facilities within the United States. The new EO identifies the Administration’s implementation priorities for this CHIPS Act funding and creates the CHIPS Implementation Steering Council to aid with the rollout of administrative guidance. In connection with the EO, the Department of Commerce launched CHIPS.gov, which is intended to be a centralized resource for potential applicants of CHIPS funding. The EO and new website reflect the Administration’s intent to swiftly implement the CHIPS Act and increase the domestic production of semiconductors.
Artificial Intelligence and Algorithms in the Next Congress
Policymakers and candidates of both parties have increased their focus on how technology is changing society, including by blaming platforms and other participants in the tech ecosystem for a range of social ills even while recognizing them as significant contributors to U.S. economic success globally. Republicans and Democrats have significant interparty—and intraparty—differences in how they frame their grievances and on many of the remedial measures to combat the purported harms. Nonetheless, the growing inclination to do more on tech has apparently driven one key congressional committee to compromise on previously intractable issues involving data privacy. Rules around the use of algorithms and artificial intelligence, which have attracted numerous legislative proposals in recent years, may be the next area of convergence.
FTC Proposes Motor Vehicle Dealers Trade Regulation Rule

On July 13, the Federal Trade Commission published a notice of proposed rulemaking regarding the Motor Vehicle Dealers Trade Regulation Rule, which is aimed at combating certain unfair and deceptive trade practices by dealers and promoting pricing transparency. Comments on the proposed rule are due on or before September 12, 2022.
The proposed rule:
- Prohibits dealers from making certain misrepresentations in the sales process, enumerated in proposed § 463.3. The list of prohibited misrepresentations includes misrepresentations regarding the “costs or terms of purchasing, financing, or leasing a vehicle” or “any costs, limitation, benefit, or any other Material aspect of an Add-on Product or Service.”
- Includes new disclosure requirements regarding pricing, financing and add-on products and services. Notably, the proposed rule would obligate dealers to disclose the offering price in many advertisements and communications with consumers.
- Prohibits charges for add-on products and services that confer no benefit to the consumer and prohibits charges for items without “Express, Informed Consent” from the consumer (which, notably, as defined, excludes any “signed or initialed document, by itself”). The proposed rule outlines a specific process for presenting charges for add-on products and services to the consumer, which obligates the dealer to disclose and offer to close the transaction for the “Cash Price without Optional Add-Ons” and obtain confirmation in writing that the consumer has rejected that price.
- Imposes additional record-keeping requirements on the dealer, in order to demonstrate compliance with the rule. The record-keeping requirements apply for a period of 24 months from the date the applicable record is created.
The proposed rulemaking focuses only on “Dealers” at a time when Tesla sells direct-to-consumer, Ford has announced its own designs to launch an e-commerce platform, and companies such as BMW have begun to unbundle services from vehicle sales and create new standalone offerings (see this recent article on subscription seat warmers). Under the proposed rule, to meet the definition of a “Dealer,” a person/entity must be “predominantly engaged in the sale and servicing of motor vehicles, the leasing and servicing of motor vehicles, or both” (emphasis added).
Gesturing at some of the developments in automotive sales models, Commissioner Christine Wilson dissented, expressing her concern that despite the “best of intentions”, a complex regulatory scheme could “stifle innovation”. She requested comment on (among other items) “Anticipated changes in the automobile marketplace with respect to technology, marketing, and sales, and whether it is possible to future-proof the proposed Rule so that it avoids inhibiting beneficial changes in these areas.”