Legal Archive
Four nonprofit groups seeking to protect kids’ privacy online asked the Federal Trade Commission (FTC) to investigate YouTube today, after back-to-back reports allegedly showed that YouTube is still targeting personalized ads at videos “made for kids”. It has now become urgent for the FTC to probe YouTube’s data and advertising practices, the groups’ letter said, and potentially intervene. Otherwise, YouTube could continue to harvest data on millions of kids, in apparent violation of the Children’s Online Privacy Protection Act (COPPA) and the FTC Act. Targeted online advertising already oozes sleaziness, but targeting children is on a whole different level. There’s a reason you should keep a close eye on what your kids are watching on YouTube, and the various content rabbit holes YouTube’s algorithm can trap people in aren’t the only reason to do so. I’m not one of those extremists who believe YouTube is universally bad for kids – it all depends on what you watch, not that you watch – but that doesn’t mean I’m about to hand the remote control to my kids and leave the room.
Debate continues to rage over the federal Kids Online Safety Act (KOSA), which seeks to hold platforms liable for feeding harmful content to minors. KOSA is lawmakers’ answer to whistleblower Frances Haugen’s shocking revelations to Congress. In 2021, Haugen leaked documents and provided testimony alleging that Facebook knew that its platform was addictive and was harming teens—but blinded by its pursuit of profits, it chose to ignore the harms. But when Senator Richard Blumenthal introduced KOSA last year, the bill faced immediate and massive blowback from more than 90 organizations—including tech groups, digital rights advocates, legal experts, child safety organizations, and civil rights groups. These critics warned lawmakers of KOSA’s many flaws, but they were most concerned that the bill imposed a vague “duty of care” on platforms that was “effectively an instruction to employ broad content filtering to limit minors’ access to certain online content.” The fear was that the duty of care provision would likely lead platforms to over-moderate and imprecisely filter content deemed controversial—things like information on LGBTQ+ issues, drug addiction, eating disorders, mental health issues, or escape from abusive situations. Since then, Ars Technica reports in this detailed article, the bill does seem to have been amended in positive, constructive ways – but not nearly far enough to make it workable and not prone to massive abuse. Sadly, it seems the bill is poised to pass, so we’ll have to see what the eventual, final version will look like.
In a well-intentioned yet dangerous move to fight online fraud, France is on the verge of forcing browsers to create a dystopian technical capability. Article 6 (para II and III) of the SREN Bill would force browser providers to create the means to mandatorily block websites present on a government-provided list. Such a move would overturn decades of established content moderation norms and provide a playbook for authoritarian governments, one that would easily negate censorship circumvention tools. France seems determined to outdo everyone else in the race for the worst tech policy ideas in history.
The Dolphin project has broken the silence regarding their legal tussle with Nintendo and Valve, giving a far more detailed elaboration of what, exactly, happened. First things first – Nintendo did not send Valve or Dolphin a Digital Millennium Copyright Act (DMCA) section 512(c) notice (commonly known as a DMCA Takedown Notice) against our Steam page. Nintendo has not taken any legal action against Dolphin Emulator or Valve. What actually happened was that Valve’s legal department contacted Nintendo to inquire about the announced release of Dolphin Emulator on Steam. In reply to this, a lawyer representing Nintendo of America requested Valve prevent Dolphin from releasing on the Steam store, citing the DMCA as justification. Valve then forwarded us the statement from Nintendo’s lawyers, and told us that we had to come to an agreement with Nintendo in order to release on Steam. Considering the strong legal wording at the start of the document and the citation of DMCA law, we took the letter very seriously. We wanted to take some time and formulate a response, however after being flooded with questions, we wrote a fairly frantic statement on the situation as we understood it at the time, which turned out to only fuel the fires of speculation. So, after a long stay of silence, we have a difficult announcement to make. We are abandoning our efforts to release Dolphin on Steam. Valve ultimately runs the store and can set any condition they wish for software to appear on it. But given Nintendo’s long-held stance on emulation, we find Valve’s requirement for us to get approval from Nintendo for a Steam release to be impossible. Unfortunately, that’s that. The post also goes into greater detail about the Wii Common Key that’s been part of Dolphin’s codebase for 15 years. This key was originally extracted from the Wii hardware itself, and a lot of people online claimed that Dolphin should just remove this key and all would be well.
After consulting with their lawyers, Dolphin has come to the conclusion that including the key poses no legal risk for the project, and even if it somehow did, the various other parts of the Dolphin codebase that make emulation of original games possible would pose a much bigger legal threat anyway. So, the team will keep on including the key, and the only outcome here is that Dolphin will not be available on Steam.
Ars Technica: Antitrust enforcers released a draft update outlining new rules today that officials say will make it easier to crack down on mergers and acquisitions that could substantially lessen competition in the US. Now the public has 60 days to review the draft guidelines and submit comments to the Federal Trade Commission (FTC) and the Department of Justice (DOJ) before the agencies’ September 18 deadline. A fierce debate has already started between those in support and those who oppose the draft guidelines. Any corporation should be serving the democratically elected government of a country – not the other way around. If a merger or acquisition is deemed harmful to the competitive landscape, and thus to consumers, a government should be able to just stop it. The same applies to corporations that grow too large, too rich, too powerful – if a company’s actions start to dictate significant parts of the market or even the economy, they are a threat to the stability and functioning of the society they claim to be a part of, and as such, they should be split up or their actions otherwise remedied to protect society. In other words, any steps the US FTC and DOJ take to rein in runaway corporations are positive.
Together with the open source software community, GitHub has been working to support EU policymakers to craft the Cyber Resilience Act (CRA). The CRA seeks to improve the cybersecurity of digital products (including the 96 percent that contain open source) in the EU by imposing strict requirements for vendors supplying products in the single market, backed by fines of up to €15 million or 2.5% of global revenue. This goal is welcome: security is too often an afterthought when shipping a product. But as written it threatens open source without bolstering resilience. Even though the CRA, as part of a long-standing line of EU ‘open’ strategy, has an exemption for open source software developed or supplied outside the course of a commercial activity, challenges in defining the scope have been the focus of considerable community activity. Three serious problems remain with the Parliament text set for the industry (‘ITRE’) committee vote on July 19. These three problems are set out below. Absent dissent, this may become the final position without further deliberation or a full Parliament plenary vote. We encourage you to share your thoughts with your elected officials today. These problems are substantial for open source projects. For one, if an open source project receives donations and/or has corporate developers working on it, it would be regulated by the CRA and thus face a huge amount of new administrative rules and regulations to follow – no doubt far too big a burden, especially for smaller projects and individual developers. On top of that, the CRA, as it currently stands, also intends to mess with the disclosure process for vulnerabilities in ways that don’t seem to actually help. These problems are big, and could have far-reaching consequences for open source.
The Interactive Advertising Bureau, one of the biggest names in online advertising, held some sort of corporate event or whatever in January of this year, and the IAB CEO, David Cohen, gave a speech there to rally the troops. Apparently, those of us who are fighting back against the online advertising industry? We’re “extremists”. Extremists are winning the battle for hearts and minds in Washington D.C. and beyond. We cannot let that happen. These extremists are political opportunists who’ve made it their mission to cripple the advertising industry and eliminate it from the American economy and culture. This guy, who uses double spaces after a period and hence is already on my shitlist, just gave us an amazing creed.
As you may have noticed, I used the word copyrighted for the title of this story. And it’s not without reason. I think this story could have been fairly decent even without the copyright part, so before we get to the nitty-gritty stuff – I can 100% confirm that Brave lets you ingest copyrighted material through their Brave Search API, to which they also assign you “rights”. Time and time again, Brave gets caught doing slimy things. Just don’t use Brave. There are far, far better and more ethical alternatives.
Today, the European Commission adopted its adequacy decision for the EU-U.S. Data Privacy Framework. The decision concludes that the United States ensures an adequate level of protection – comparable to that of the European Union – for personal data transferred from the EU to US companies under the new framework. On the basis of the new adequacy decision, personal data can flow safely from the EU to US companies participating in the Framework, without having to put in place additional data protection safeguards. In 2020, European Union courts struck down the previous agreement between the EU and the US, the Privacy Shield, as the court stated it did not sufficiently protect EU user data from US government surveillance. This was obviously a big problem for companies like Facebook and Google, and ever since, the two blocs have been trying to come up with a replacement that would allow these companies to continue to operate relatively unscathed. In the meantime, though, several European countries handed out large fines to Amazon and Facebook for not taking proper care of EU user data. So, what makes this new agreement stricter than the previous one? The EU-U.S. Data Privacy Framework introduces new binding safeguards to address all the concerns raised by the European Court of Justice, including limiting access to EU data by US intelligence services to what is necessary and proportionate, and establishing a Data Protection Review Court (DPRC), to which EU individuals will have access. The new framework introduces significant improvements compared to the mechanism that existed under the Privacy Shield. For example, if the DPRC finds that data was collected in violation of the new safeguards, it will be able to order the deletion of the data. The new safeguards in the area of government access to data will complement the obligations that US companies importing data from the EU will have to subscribe to.
I’m obviously no legal expert so take this with a grain of salt, but this kind of feels like yes, there are additional protections and safeguards, but if (let’s be real here: when) companies like Facebook violate these, don’t worry, EU citizen! You can undertake costly, complex, and long legal proceedings in murky business courts so Facebook or whatever can get fined for an amount that Zuckerberg spends on his interior decorator every week. The courts struck down the Safe Harbor agreement in 2015, and the aforementioned Privacy Shield in 2020, so we’ll see if this new agreement stands the test of the courts.
Ars Technica: This weekend saw an exception to that rule, though, as Nintendo’s lawyers formally asked Valve to cut off the planned Steam release of Wii and GameCube emulator Dolphin. In a letter addressed to the Valve Legal Department (a copy of which was provided to Ars by the Dolphin Team), an attorney representing Nintendo of America requests that Valve take down Dolphin’s “coming soon” Steam store page (which originally went up in March) and “ensure the emulator does not release on the Steam store moving forward.” The letter exerts the company’s “rights under the Digital Millennium Copyright Act (DMCA)’s Anti-Circumvention and Anti-Trafficking provisions,” even though it doesn’t take the form of a formal DMCA takedown request. In fighting a decision like this, an emulator maker would usually be able to point to some robust legal precedents that protect emulation software as a general concept. But legal experts who spoke to Ars said that Nintendo’s argument here might actually get around those precedents and present some legitimate legal problems for the Dolphin Team. This silly cat-and-mouse game between Nintendo and emulators is childish. The only people getting rich off this are lawyers.
With United States v. Smith (S.D.N.Y. May 11, 2023), a district court judge in New York made history by being the first court to rule that a warrant is required for a cell phone search at the border, “absent exigent circumstances” (although other district courts have suggested they would be willing to do so). EFF is thrilled about this decision, given that we have been advocating for a warrant for border searches of electronic devices in the courts and Congress for nearly a decade. If the case is appealed to the Second Circuit, we urge the appellate court to affirm this landmark decision. Of course, a decision like this can go through quite a few more courts, but it’s a good precedent.
Apple Inc. failed to fully revive a long-running copyright lawsuit against cybersecurity firm Corellium Inc. over its software that simulates the iPhone’s iOS operating systems, letting security researchers identify flaws in the software. The US Court of Appeals for the Eleventh Circuit on Monday ruled that Corellium’s CORSEC simulator is protected by copyright law’s fair use doctrine, which allows the duplication of copyrighted work under certain circumstances. CORSEC “furthers scientific progress by allowing security research into important operating systems,” a three-judge panel for the appeals court said, adding that iOS “is functional operating software that falls outside copyright’s core.” Good.
Before you read this article – note that Codeium offers a competitor to GitHub Copilot. This means they have something to sell, and something to gain by making Copilot look bad. That being said – their findings are things we already kind of knew, and further illustrate that Copilot is quite possibly one of the largest, if not the largest, GPL violations in history. To prove that GitHub Copilot trains on non-permissive licenses, we just disable any post-generation filters and see what GPL code we can generate with minimal context. We can very quickly generate the GPL license for a popular GPL-protected repo, such as ffmpeg, from a couple of lines of a header comment. Codeium claims it does not use GPL code for its training data, but the fact it uses code licensed more permissively still raises questions. While the BSD and MIT-like licenses are more permissive and lack copyleft, they still require the inclusion of the terms of the license and a copyright notice whenever the covered code is used. I’m not entirely sure if using just permissively licensed code as training data is any better, since unless you’re adding the licensing terms and copyright notice with every autocompleted piece of code, you’re still violating the license. If Microsoft or whoever else wants to train a coding “AI” or whatever, they should be using code they own the copyright to, get explicit permission from the rightsholders for “AI” training use (difficult for code from larger projects), or properly comply with the terms of the licenses and automatically add the terms and copyright notices during autocomplete and/or properly apply copyleft to the newly generated code. Anything else is a massive copyright violation and a direct assault on open source. Let me put it this way – the code to various versions of Windows has leaked numerous times. What if we train an “AI” on that leaked code and let everyone use it?
Do you honestly think Microsoft would not sue you into the stone age?
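To make the compliance point above concrete: here is a minimal sketch of what attaching the required attribution to an autocompleted snippet could look like. Everything here is invented for illustration – the `attach_notice` helper, the placeholder notice text, and the sample snippet – and none of it reflects how Copilot, Codeium, or any real tool actually works.

```python
# Hypothetical sketch: permissive (MIT/BSD-style) licenses require the
# copyright notice and license terms to accompany reused code, so a
# compliant autocomplete would have to prepend them to each suggestion.

# Placeholder notice; a real tool would carry the actual notice and
# full license text of the source the snippet was derived from.
EXAMPLE_NOTICE = (
    "Copyright (c) 2023 Example Author\n"
    "Licensed under the MIT License (placeholder for the full terms)"
)

def attach_notice(snippet: str, notice: str = EXAMPLE_NOTICE) -> str:
    """Return the snippet with the required copyright notice and
    license terms prefixed as comment lines."""
    header = "\n".join(f"# {line}" for line in notice.splitlines())
    return f"{header}\n{snippet}"

completed = attach_notice(
    "def clamp(x, lo, hi):\n    return max(lo, min(x, hi))"
)
print(completed)
```

Copyleft licenses like the GPL go further, of course – they would also constrain the license of the code the suggestion lands in, which no notice header can fix.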
While ChatGPT has become what seems like a household name, the AI model’s method of data collection is somewhat concerning and has some clear negative implications. With that being the case, Italy is moving forward with legal action to stop ChatGPT from operating for the time being. Good. These corporate, for-pay tools are built upon the backs of untold numbers of writers and other artists who have not been asked if they want their works to be used. For instance, Microsoft will stomp any misuse of its code or trademarks into the ground, but at the same time, it’s building entire profit streams on the backs of others. This is wrong.
A federal judge has ruled against the Internet Archive in Hachette v. Internet Archive, a lawsuit brought against it by four book publishers, deciding that the website does not have the right to scan books and lend them out like a library. Judge John G. Koeltl decided that the Internet Archive had done nothing more than create “derivative works,” and so would have needed authorization from the books’ copyright holders — the publishers — before lending them out through its National Emergency Library program. As much as we all want the Internet Archive to be right – and morally, they are – copyright law, as outdated, dumb, and counterproductive as it is, was pretty clear in this case. Sadly.
Google has lost its latest battle with European Union regulators. This morning, the EU General Court upheld Google’s record fine for bundling Google Search and Chrome with Android. The initial ruling was reached in July 2018 with a 4.34 billion euro fine attached, and while that number has been knocked down to 4.125 billion euro ($4.13 billion), it’s still the EU’s biggest fine ever. The EU takes issue with the way Google licenses Android and associated Google apps like the Play Store to manufacturers. The Play Store and Google Play Services are needed to build a competitive smartphone, but getting them from Google requires signing a number of contracts that the EU says stifles competition. Google breakin’ rocks in the hot sun.
The most notable proposed fix (listed in Annex II) is for phone makers and sellers to make “professional repairers” available for five years after the date a phone is removed from the market. Those repairers would have access to parts including the battery, display, cameras, charging ports, mechanical buttons, microphones, speakers, and hinge assemblies (including for folding phones and tablets). Phone companies also get a choice: either make replacement batteries and back-covers available to phone owners or design batteries that meet minimum standards. Those standards include batteries still having 83 percent of their rated capacity after 500 full charging cycles, and 80 percent after 1,000 full charging cycles. Apple, for example, currently claims that its iPhones are designed to retain 80 percent capacity after 500 charge cycles. Good. I’ve been saying it for years: if the automotive industry can be legally obligated to provide spare parts, repair information, and more to third parties, so can the technology industry.
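The quoted thresholds are easy to express as a quick check. A minimal sketch, with capacities given as fractions of rated capacity; the function name and interface are my own invention, not anything from the Annex II text:

```python
def meets_battery_standard(capacity_after_500: float,
                           capacity_after_1000: float) -> bool:
    """Check the proposed EU minimums: at least 83% of rated capacity
    after 500 full charging cycles, and 80% after 1,000 cycles."""
    return capacity_after_500 >= 0.83 and capacity_after_1000 >= 0.80

# Apple's claimed iPhone design target (80% after 500 cycles) would
# fall short of the proposed 83%-at-500-cycles threshold:
print(meets_battery_standard(0.80, 0.78))  # False
print(meets_battery_standard(0.85, 0.81))  # True
```

In other words, if the proposal passes as written, a phone maker whose batteries only hit Apple's current stated target would have to offer user-replaceable batteries instead.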
CNet decided to ask makers of home security cameras about their policies when it comes to dealing with requests from United States law enforcement: Ring, the Amazon-owned video doorbell and home security company, came under renewed criticism from privacy activists this month after disclosing it gave video footage to police in more than 10 cases without users’ consent thus far in 2022 in what it described as “emergency situations.” That includes instances where the police didn’t have a warrant. While Ring stands alone for its extensive history of police partnerships, it isn’t the only name I found with a carve-out clause for sharing user footage with police during emergencies. Google, which makes and sells smart home cameras and video doorbells under the Nest brand, makes as much clear in its terms of service. Other manufacturers of home security cameras, such as Wyze and Arlo, only provide footage after a valid warrant, while devices that use Apple’s HomeKit Secure Video are end-to-end encrypted, so footage cannot be shared at all. In other words, if you live in the United States, it’s best to avoid Amazon’s and Google’s offerings – especially if you’re a member of a minority or are a woman seeking essential healthcare – and stick to Apple’s offerings instead.
The Dutch Ministry of Education has decided to impose some restrictions on the use of the Chrome OS and Chrome web browser until August 2023 over concerns about data privacy. The officials worry that Google services collect student data and make it available to large advertising networks, who use it for purposes beyond helping education. Since the national watchdog doesn’t know where or how the students’ personal data is stored and processed, there are concerns about violating the European Union’s GDPR (General Data Protection Regulation). It always irritates me to no end when people claim all the GDPR ever did was create cookie prompts (it didn’t – those prompts aren’t even GDPR compliant), when in fact, it’s been leading to things like this, where governments and advocacy groups now have the legal means to fight companies that violate the privacy rights of those of us in the EU. In this particular case, Google is being forced to change its privacy systems for the better. It’s a sign of things to come now that the DMA has been fully passed.
The Council today gave its final approval on new rules for a fair and competitive digital sector through the Digital Markets Act (DMA). The DMA ensures a digital level playing field that establishes clear rights and rules for large online platforms (‘gatekeepers’) and makes sure that none of them abuses their position. Regulating the digital market at EU level will create a fair and competitive digital environment, allowing companies and consumers to benefit from digital opportunities. This final approval was a formality, but you never know with corporations.