AI news
June 18, 2024

Stability AI's SD3 Gets Banned on Civitai

Stability AI's Stable Diffusion 3 has absurd licensing.

Jim Clyde Monge

Stability AI’s highly anticipated Stable Diffusion 3 (SD3) model has been temporarily banned on the popular AI art platform Civitai due to concerns over the restrictive nature of its license agreement.

According to Civitai, the SD3 license grants Stability AI excessive control not only over models fine-tuned on SD3, but also over any model that includes SD3-generated images in its training data.

Here’s what the Civitai team had to say:

Unfortunately, due to a lack of clarity in the license associated with Stable Diffusion 3, we are temporarily banning:

  • All SD3-based models
  • All models or LoRAs trained on content created with outputs from SD3-based models. This includes utilities such as control.

Currently, existing SD3 models will be archived.

Overview of Stable Diffusion 3 License

The Creator’s License

Stability AI has introduced a “Creator’s License” for the Stable Diffusion 3 model. This license applies to creators and developers who have less than $1 million in annual revenue, less than $1 million in institutional funding, and fewer than 1 million monthly active users.

License Fees and Image Generation Limits

The Creator’s License comes with a $20 monthly fee, even for those running the models locally on their computers. Additionally, the license reportedly limits image generation to just 6,000 images per month, which many argue is an unreasonably low limit.

User Experience

Having tried SD3 Medium myself, I can attest that it’s challenging to get a usable result on the first try. Worse, the cap applies per license, not per user: a company with, say, 500,000 active users would have to share a total of 6,000 images per month across all of them, roughly one image per 83 users. This essentially makes it impossible to offer the service without frustrating everyone involved.

Derivative Works and Copyright Issues

The license defines “derivative works” as any modifications to the core model, including models created based on or derived from the core model or its output. This raises concerns about copyright implications and the potential impact on the broader AI community.

Legal Ambiguities and Their Impact

Ambiguity in legal language is common, and it has real consequences for compliance. In criminal law, the rule of lenity holds that ambiguous statutes should be interpreted in the way most favorable to the defendant, but no comparable principle protects the parties to a commercial license.

Ambiguity aversion also shapes behavior: research suggests that as uncertainty about enforcement increases, people tend to cut back on potentially non-compliant activity altogether. That is essentially what Civitai has done, banning SD3 content outright rather than risking a license violation.

The effects of ambiguity interact with the perceived certainty of consequences in complex ways, and the optimal level of specificity in a legal text depends on its context: overly detailed terms can have unintended effects on behavior just as vague ones can.

Psycholinguistic data can help resolve syntactic ambiguities in legal language, such as whether a prepositional phrase attaches to one noun or to a whole coordinated phrase (in “cars and trucks with permits,” do only trucks need permits, or do cars as well?). Linguistic analyses of this kind, applied to laws such as the HITECH Act, have provided useful guidance for domain experts like software engineers.

Community Reactions and Concerns

The AI art community has expressed concerns about the implications of SD3’s censored nature and restrictive license. Many fear that pre-programmed limitations in the model could stifle artistic expression and lead to homogenized, uninspired outputs.

The Civitai community, which values unrestricted exploration and experimentation, advocates for user control through prompts and filters rather than top-down censorship. There are also worries that the SD3 license grants Stability AI excessive power over derivative works, potentially allowing them or future owners of the rights to demand takedowns or hefty fees from creators.

Some have called for continued experimentation with SD3 while being mindful of the license terms, and for exploring alternative models without such restrictions. The debate highlights the challenges of balancing safety and creative freedom in open-source AI art.

Here are some of the most interesting comments from the discussion on the temporary ban on Stable Diffusion 3:

“Civitai falls under a commercial license because we can buy buzz here. And it turns out that SAI is prohibiting from selling images generated in SD3 unless you buy a lifetime subscription from them. It is important to keep Civitai and all authors safe.”
“Civitai are a business, so that’s the issue, while those training don’t make money, Civitai do, so they can’t host SD3 to my understanding unless they disable the use of buzz for using it or simply host it as a free download but not have it as part of their image generator.”
“The license that SD3 has on it is a death sentence to that project in my view. Frankly, it was foolish that they would publish that license, and I think it’ll earn exactly what it deserves — public abandonment.”
“Stability AI betrayed the community by releasing a poisoned model with a toxic license. It’s better if the community moves on to alternative models.”
“That’s the non-commercial license, and the commercial license is very vague and broad in its statements on what counts as commercial use. It certainly needs clarification.”

The Absurd Terms of the Stable Diffusion 3 License

The license for Stable Diffusion 3 introduces several highly contentious terms that have sparked outrage within the AI community. One particularly absurd clause holds users liable for any actions or omissions by their customers or users in connection with the software product: a creator is answerable not just for their own conduct but for that of everyone downstream, an unreasonable and overly burdensome level of liability.

Another major concern is the potential for future fee increases. There is widespread anxiety that Stability AI might hike the fees, effectively holding users and their businesses hostage, especially those that have built their operations around Stable Diffusion 3 and its derivative works. This fear is compounded by the current monthly fee of $20 for the Creator’s License, which already imposes a severe limitation of 6,000 images per month, a threshold many deem unreasonably low.

In light of these issues, many in the community argue that Stability AI should honor the contributions of small artists and designers who have significantly improved the models. Proposals include exempting these individuals from some of the more restrictive terms or offering them more favorable conditions. Additionally, there is a call to set higher revenue thresholds for licensing fees, such as $500,000 or more, to ensure that smaller entities can continue contributing without facing prohibitive financial barriers. Overall, the community urges Stability AI to rethink and revise its license agreement to align with the open-source principles that have been fundamental to the development of these AI models.
