General

ZwillGen’s Recommended New Year’s Resolutions for 2025

Published: Jan. 15, 2025

As we usher in the new year, we know people aren’t the only ones who are setting goals for improvement.  Businesses, which continue to face a rapidly evolving landscape of legal and regulatory changes, also want to know what their legal compliance focus should be in the weeks and months ahead. ZwillGen is pleased to provide our recommended New Year’s resolutions for businesses to consider in 2025, from adapting to new AI regulations to strengthening data security and privacy practices.

Our Resolutions:

Artificial Intelligence: If your 2025 roadmap includes AI, adopt a strategy to adapt to evolving legal obligations and regulatory scrutiny, as well as customer requirements.

Auto-Renewal: Evaluate your consent and cancellation practices for auto-renewing subscriptions.

Children’s Privacy: Assess whether your current practices implicate new, broader requirements for users under 18.

Data Security: Minimize the risk of credential theft by enabling Multi-Factor Authentication.

Gaming: If you will be providing goods or services of any kind to a company operating in the gaming industry, seek counsel on potential licensing requirements before beginning to operate.

Privacy Law Compliance & Enforcement: Ensure your privacy compliance program is tailored to meet the requirements of the five state laws taking effect in January 2025 and the three new state laws coming later in the year.

Sensitive Data: Evaluate your processing of sensitive data, which requires consent under many state laws, and is the focus of state AG and FTC enforcement.

Vendor Contracting: Ensure appropriate vendor diligence and contracting processes are in place and that such processes are tailored to the types of data disclosed as well as the types of technologies vendors are using to process such data.


Artificial Intelligence

By Alexei Klestoff

As AI continues to evolve, 2025 promises significant legal developments that will reshape how organizations develop, deploy, and manage AI systems.  If your 2025 roadmap includes AI, you need a legal strategy to adapt to these evolving obligations and respond to customer requirements.

In the U.S., state-level regulation will intensify, with at least one more state expected to pass an AI safety law modeled after the EU AI Act and Colorado AI Act.  These safety laws focus on high-risk AI, which is a small subset of AI use cases.  This will leave a large regulatory hole that will likely be filled by the California Privacy Protection Agency’s automated decision-making technology (ADMT) regulations, which will be much broader and will likely become the standard due to California’s market influence. 

Risk assessments, bias testing, and red teaming will become more common as organizations work to address these and other states’ ADMT requirements, as well as requirements specific to employment, insurance, and financial services.

Enforcement patterns will shift notably. The new FTC will back off its aggressive AI push and state AGs will take the lead, leveraging existing privacy and consumer protection frameworks to require greater transparency and user control. In the EU, data protection authorities will continue expanding their interpretation of GDPR to address AI issues, particularly controllers’ lawful bases for model development and deployment and how to handle data deletion.

Contractual practices will also evolve. We started to see more AI-specific addenda in commercial contracts last year and we expect this trend to continue.

Due to increased enforcement and litigation risks, more companies will offer an opt-out of model training and, in response to the EDPB’s AI Opinion, at least a few EU model developers will provide an opt-out of scraping as well.  Some companies will pledge never to use user data for model training as a market differentiator.

Data pipelines will continue to expand.  The trend of sites licensing corpora of their existing data will continue, despite the CPPA’s expanded definition of “data broker” in data broker registration regulations.  More business models where people voluntarily provide their data will also emerge.

Child safety laws will come for AI in 2025, either through dedicated AI legislation or regulatory investigations. Additionally, the ongoing erosion of CDA 230 protections may lead to court decisions excluding generative AI from its scope, requiring more robust content moderation systems and guardrails.

The dozens of copyright cases will continue to wend their way through the courts but will provide little clarity until trial or a settlement that will set the market for other pending cases.

To remain on top of the ever-evolving AI landscape, ZwillGen recently announced the acquisition of Luminos.Law, an innovative multidisciplinary firm that blends legal advisory services with the skills of senior data scientists, to help clients better understand and effectively deploy transformative artificial intelligence technology. This acquisition launches ZwillGen’s Artificial Intelligence (AI) Division, focusing on red team testing and audits for AI models and systems, alongside the firm’s long-standing legal and policy guidance around AI technology.


Auto-Renewal Laws

By Daniel Levin and Zach Lerner

In 2025, a significant number of auto-renewal laws will take effect, many of which prescribe businesses’ practices related to obtaining consent and allowing consumers to cancel their auto-renewing subscriptions. In terms of consent, the FTC’s Negative Option Rule (which, as discussed in our previous blog, will explicitly cover both B2C and B2B offerings) will require businesses to obtain the customer’s separate, affirmative consent to the Negative Option Feature (e.g., through an unchecked checkbox distinct from the call to action) and include certain novel disclosures at the point of sale. The Rule will also require businesses to provide an online cancellation mechanism that is easy to find and use, and will impose additional obligations if subscribers enroll by telephone or mail.

State laws taking effect in 2025 also include noteworthy cancellation provisions—e.g., Minnesota’s auto-renewal law (effective Jan. 1, 2025) will prohibit sellers from making save offers within the cancellation flow without the consumer’s permission; and an amendment to California’s auto-renewal law (effective July 1, 2025) clarifies the existing enhanced cancellation restriction in a business-friendly way. 

These laws and other forthcoming changes to the auto-renewal landscape include additional notable provisions that will impact various subscription touchpoints, including marketing collateral, landing pages, and several subscriber communications (such as end-of-promotional-period renewal notices, anniversary renewal notices, and material change notices).


Children’s Privacy

By Ben MacLean and Kandi Parsons

In 2024, the children’s privacy protection landscape evolved significantly as lawmakers prioritized strengthening online safeguards for minors under 18 (not just children under 13). The year saw the introduction of new state and global regulations aimed at age-appropriate design and at establishing parameters for how children’s personal data is handled, as well as regulations seeking to cover social media-like services and addictive features.

While federal protections for children’s privacy via the Kids Online Safety and Privacy Act (which incorporated KOSA and COPPA 2.0) and the FTC’s proposed changes to the COPPA Rule were contemplated, nothing was finalized. In the meantime, several states proposed, enacted, or amended laws that emphasize greater parental control, stricter data use restrictions, and stronger enforcement mechanisms. Many of these laws contain an expanded definition of “child,” require teen consent to targeted advertising (in some cases prohibiting targeted ads to minors completely), mandate affirmative “verifiable” parental authorization for various activities, and set forth age verification requirements. State attorneys general, such as those in California and Texas, have sought to enforce children’s privacy rights through both new and existing legislation. Many bills have already been introduced in 2025 that would create further compliance requirements for organizations handling children’s data.

As regulators and the public continue to scrutinize the role and obligations of technology with respect to children, organizations should account for developing state regulations and partner with legal teams that are actively monitoring this area. In particular, organizations that have not previously considered children’s privacy requirements may now fall within the scope of new laws.


Data Security

By Paul Rice

Credential theft and reuse are among the most common online attacks, often stemming from phishing, brute force, credential reuse, credential-stealing malware, or database breaches.  Single-factor authentication based on credentials, such as a username and password, remains particularly vulnerable, highlighting the urgent need for a more secure solution.  Multi-factor authentication (MFA) adds an extra layer of security by requiring multiple forms of verification (i.e., a username and password plus an additional form of authentication) so that even if credentials are compromised, attackers still cannot access protected systems.  Sophisticated attackers may still compromise MFA, but effective use decreases the likelihood.

Whenever possible, avoid MFA systems that rely on older technologies, such as SMS-based authentication, due to SIM-swapping risks, and move to passcodes or authenticator applications. We also recommend training personnel on MFA and password security to help prevent them from falling victim to attacks that seek to intercept second factors, such as MFA push notification fatigue attacks and social engineering.
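For organizations that build and maintain their own login flows, supporting authenticator-app MFA typically means implementing time-based one-time passwords (TOTP). The sketch below is a minimal, illustrative example in Python using the open-source pyotp library; the function names, the “ExampleCo” issuer, and the surrounding account handling are hypothetical and would need to be adapted to your own systems.

```python
# Minimal TOTP (authenticator-app) sketch using the open-source pyotp library.
# Install with: pip install pyotp
# Function names, the issuer, and the flow below are illustrative only.
import pyotp


def enroll_user(username: str) -> tuple[str, str]:
    """Generate a per-user TOTP secret and a provisioning URI.

    The URI is typically rendered as a QR code for the user to scan
    with an authenticator application.
    """
    secret = pyotp.random_base32()  # store server-side, encrypted at rest
    uri = pyotp.TOTP(secret).provisioning_uri(name=username, issuer_name="ExampleCo")
    return secret, uri


def verify_second_factor(secret: str, submitted_code: str) -> bool:
    """Check the one-time code entered after the password step.

    valid_window=1 tolerates one 30-second step of clock drift.
    """
    return pyotp.TOTP(secret).verify(submitted_code, valid_window=1)


if __name__ == "__main__":
    secret, uri = enroll_user("alice@example.com")
    print("Scan with an authenticator app:", uri)
    # In production, the code comes from the user's login form; generating it
    # here simply demonstrates a successful verification.
    code = pyotp.TOTP(secret).now()
    print("Second factor accepted:", verify_second_factor(secret, code))
```

Whatever library is used, the per-user secret should be treated like a password: stored encrypted, never logged, and rotated if exposure is suspected.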


Gaming

By Whitney Fore

The rapid expansion of U.S. mobile sports wagering and, to a lesser (but increasing) extent, internet gaming (iGaming) has been a staple of news coverage since the 2018 Murphy decision allowing states to legalize sports wagering. Generally, reporting and discussions surrounding expanded gaming focus on the public-facing operators of these gaming platforms – companies like FanDuel, DraftKings, and BetMGM. However, these household names in online gaming do not operate in a vacuum. Gaming operators rely on a lineup of partners with whom they contract to supply the technology necessary for offering their product within a highly regulated industry. Some examples of these partnerships include vendors who provide the essential components of patron geolocation and ID verification, payment processing, odds, and risk management.

While the regulatory barriers to entry as a vendor in the gaming industry are not prohibitive, they are a significant factor to consider. It is essential to balance the opportunity of operating in a potentially lucrative, expanding industry against the costs of complying with its extensive regulatory requirements. To streamline market entry and operations in this heavily regulated space, partnering with legal experts familiar with state-specific gaming regulations is advisable. Look for legal teams that not only understand the regulatory framework but also have established relationships with state regulators. Their expertise can help frame your company’s products and services in the best light, ensuring smoother approval and compliance processes.


Privacy Law Compliance & Enforcement

By Ahmed Eissa

2025 kicks off with several new state comprehensive privacy laws taking effect. On January 1, 2025, privacy laws in Delaware, Iowa, New Hampshire, and Nebraska came into force, and New Jersey will follow two weeks later on January 15, 2025. Later in the year, three more state comprehensive privacy laws will come online (Tennessee – July 1, 2025; Minnesota – July 31, 2025; and Maryland – October 1, 2025).

As the roster of states with comprehensive privacy laws grows, companies should carefully evaluate the applicability and thresholds of each law to determine whether they need to comply with a given state’s privacy regime. And while it may be tempting to sort the states into compliance groups because they share many commonalities, note that the laws are not one-size-fits-all; some states break the mold and impose novel requirements not found in other laws. Privacy law compliance in 2025 will therefore require evaluating whether and when to apply a tailored approach to responding to data subject rights requests, publishing privacy policies and notices, handling sensitive data, and satisfying other major substantive requirements of applicable laws.

Don’t forget that obligations don’t always end with the text of the statute. California, Colorado, and New Jersey all provide for subsequent rulemaking. Colorado’s final rules were filed in 2023; California has adopted a number of regulations but is still weighing additional draft regulations (e.g., relating to insurance, cybersecurity audits, risk assessments, and automated decision-making technology (ADMT)); and New Jersey’s rulemaking has not yet begun.

Finally, companies should keep tabs on the 2025 state legislative season. Fourteen states passed comprehensive legislation in 2023 and 2024, and there is no indication that 2025 will slow down; some states had already advanced privacy bills in lame-duck sessions before 2024 ended. Knowing which state laws are coming down the pike—and what they require—will go a long way in developing an efficient compliance program.


Sensitive Data

By Kandi Parsons

Nearly all the states with comprehensive privacy legislation require consent to the processing of sensitive personal data. Sensitive data includes, among other things, race, ethnicity, sexual orientation, religion, precise location, biometric/genetic information, children’s data, and health conditions. See, e.g., New Jersey and Delaware. Certain states have imposed even more onerous requirements for such data. Maryland restricts processing sensitive data unless it is “strictly necessary to provide or maintain a specific product or service requested by the consumer to whom the personal data pertains” and prohibits the sale of sensitive data (without an apparent option to obtain the consumer’s consent). States like Washington, Nevada, Colorado, and Texas have adopted specific notice and consent requirements and processing limitations for particular types of data, such as biometric or health data. For example, the Washington state My Health, My Data Act contains many obligations protecting “consumer health data,” such as heightened consent for collection and sharing, data subject rights, additional privacy disclosures, and more. Colorado’s amendments to the state Privacy Act also broaden the scope of protected data, imposing new requirements on entities processing biometric data. Sensitive data requirements have been a key area of focus in enforcement efforts by state AGs, such as those in Colorado, Connecticut, and Texas.

At the federal level, the FTC has settled several cases alleging entities’ failures to provide sufficient notice and/or obtain adequate consent to the processing of sensitive data. In particular, such failures involving advertising and browsing data have been alleged to be unfair or deceptive practices under the FTC Act. In June 2024, the FTC finalized its Order with Avast after alleging that the company unfairly collected consumer browsing data and sold it without proper consent or notice. The FTC also previously banned BetterHelp from sharing health data for advertising purposes and required the company to obtain affirmative express consent before disclosing personal data to third parties for any purpose. In 2025, companies should assess whether they have proper compliance mechanisms in place and determine the appropriate approach for different types of sensitive data.


Vendor Contracting

By Melissa Maalouf

In 2024, we continued to see new privacy laws with specific vendor contracting requirements become enforceable in the United States, as well as increased regulatory scrutiny regarding the disclosure of personal data to vendors for processing.

Of note, in October, the European Data Protection Board (EDPB) adopted Opinion 22/2024 regarding reliance on processors and subprocessors.  In this guidance, the EDPB concludes that processors must provide controllers with the details of every subprocessor down the chain, along with information about the processing purposes and scope.  Further, the EDPB emphasizes that the controller is responsible for confirming that all subprocessors can comply with the GDPR, including with respect to cross-border transfer rules.  While responsibility for data processing down the chain ultimately rests with the controller, a processor’s inability to provide such information to controllers upon request could cause the processor to lose potential customers.  Collecting downstream processing information can be time-consuming for processors, but we are already seeing EU- and UK-based data controllers request this detailed information.

While we have not yet seen any enforcement actions or guidance in the U.S. regarding diligence and contracting requirements for downstream vendors, given that a number of U.S. privacy laws require companies to perform such diligence, it would not be surprising to see regulatory activity in this context over the next year.  Such enforcement activity or guidance is particularly likely in the AI context, as companies rely more heavily on AI vendors to assist with processing activities and regulators continue to scrutinize AI technologies.  To get ahead of these growing risks, companies should ensure that they have appropriate vendor diligence and contracting processes in place to vet vendors before providing them with personal data, and that such processes are tailored to the types of data disclosed as well as the types of technologies vendors are using to process such data.