April 1, 2022
Are “Privacy-First” Clean Rooms Safe From Regulators?
Privacy & Data Security Group Chair Daniel M. Goldberg was quoted in the article, “Are ‘Privacy-First’ Clean Rooms Safe From Regulators?” published by Cybersecurity Law Report. The article discusses “data clean rooms,” which enable companies to share consumer data with one another for targeted marketing while supposedly protecting privacy. Daniel is quoted as saying, “Does this really move the industry forward? Or are we solving the issue of the disappearing third-party cookie, but bringing companies another bucket of privacy concerns?” He adds, “Not all data clean rooms rely on the same technology or security measures. They definitely could be subject to scrutiny, especially by regulators who don’t understand the technical nuances in the technology.”
Daniel characterizes the claimed protections of a clean room this way: “you can use your first-party data to facilitate the types of advertising that you wanted to do previously. Don’t worry about the fact that third-party cookies are going away. We will still allow you to achieve what you need to be achieving.” A company’s interest in clean rooms is itself a positive sign, Daniel noted, because it shows the company is “already thinking about the sophistication and importance of deidentifying data and the security around it,” he said.
Daniel says, “From a regulatory perspective, using a clean room sounds great in theory” because clean rooms have been promoted as privacy-safe. However, Daniel notes, “clean rooms each have their secret sauce, with proprietors citing ‘fragmented architectures’ and other obscure approaches to protect PII.” One recently acquired start-up, DataFleets, noted that its clean-room technology “is very cutting edge and uses privacy-preserving algorithms coming out of academia and other corporate research programs.”
Regulators have already batted at adtech magic, Daniel said. They may look past the de-identified means of data sharing to focus on its ends: companies are using clean rooms to effectively target consumers on a one-to-one basis. He says, “A regulator’s interpretation may be that there should not be any individually targeted ads in the ecosystem at all, period.”
Given the potential for regulatory scrutiny of clean rooms, Daniel says, “Companies should investigate how the provider strips out identifiers and stops reversals of the process. Companies must confirm the clean room’s deidentification approach sufficiently ensures that one is not able to re-identify people in any context.” Daniel advises, “Companies should check the security of the systems to ensure that access restrictions are operating and rely on two-factor authentication or stronger measures.”
Daniel cautions that the promises a marketing or sales team hears from a clean room vendor may not match the contract language: “A marketing team or sales team is talking to the sales representative of the clean room, who will make all these promises around how it works. When you look at the contractual language, it doesn’t necessarily impose all those obligations or expressly state or incorporate those,” he said.
“Get on the line with the technical team, those who actually are responsible for the product, so they can explain to you how the product works,” Daniel says. He suggests asking what data goes in, how it is joined with other data, and what type of analytics each party can perform on it. Companies considering using a clean room should confirm that the contract specifies the implementation measures, Daniel adds. The contract also should show what type of data can leave the room. “One reason that these clean rooms exist and are effective is because they compile the data of many, many different customers,” Daniel concludes.
Read the full article here. (Behind paywall)
Other Quoted
Challenges in Opt-Out Design and Children’s Privacy Highlighted by Sling TV’s Settlement With California AG
Cybersecurity Law Report quoted Daniel Goldberg regarding California AG Rob Bonta's $530,000 settlement with Sling TV for CCPA violations related to opt-out processes and children's privacy protections. Goldberg noted this is "the first CCPA settlement involving a connected TV" and indicates that connected TV is now a priority for privacy enforcement. Goldberg predicted future settlements are more likely to reach seven or eight figures as investigations progress. He explained that companies struggle to operationalize opt-outs across different platforms because many rely heavily on consent management platform vendors that often cover only cookie-based activity and don't connect to the backend systems that actually drive targeted advertising.
On children's privacy, Goldberg highlighted the CA AG's aggressive stance, noting that although Sling TV didn't collect age data, the AG concluded they still had knowledge of minors through child-directed channels, notifications from programmers, demographic inferences purchased from data brokers, and ad-targeting segments that included children. He observed this "arguably expands the notion of 'actual knowledge' under the CCPA." Goldberg advises companies to map out data flows and opt-out signals across every environment and audit vendor configurations to ensure the tools actually work. Read the full article on children’s privacy protections here. (Behind a paywall)
December 1, 2025
Game companies must be flexible to comply with changing laws
Emma Smizer was recently featured as a panelist at GamesBeat Next 2025 and quoted in a GamesBeat article discussing global regulatory compliance and its impact on the gaming industry. The panel examined how evolving policy frameworks create new opportunities for developers and platforms navigating global markets.
Smizer addressed compliance challenges under emerging laws, specifically citing the Texas App Store Accountability Act. She noted that this kind of legislation changes how developers and platforms interact with users: “App stores have to do this age verification, but so do software and hardware developers. Global compliance is complicated, even just across the states… We’re moving toward a world where you can’t just be willfully ignorant about the age of your users.”
Her analysis emphasizes a growing trend: age verification and child-safety requirements are not only regulatory hurdles but can also create opportunities for growth across businesses and sectors. Read the full summary of the panel here.
November 25, 2025
Copyright Guide or Policy Change? Project Divides IP Attys
Law360 quoted Jacqueline Charlesworth on the controversy surrounding the American Law Institute’s copyright restatement project. Ms. Charlesworth criticized the initiative as advancing a “revisionist theory” that could weaken copyright protections. She was among nearly two dozen advisers who resigned from the project, signaling deep concerns about its direction.
The article highlights a broader debate within the IP community: whether the restatement simply clarifies existing law or attempts to reshape policy in favor of users. Ms. Charlesworth’s perspective emphasizes the stakes for rights holders as courts and practitioners consider how much influence the restatement may carry. Read the Law360 article about the copyright restatement project here.
November 19, 2025
