December 9, 2023


Generative AI, especially text-to-image AI, is attracting as many lawsuits as it does venture dollars.

The two companies behind popular AI art tools, Midjourney and Stability AI, are embroiled in a legal case alleging they violated the rights of millions of artists by training their tools on web-scraped images. Separately, stock image supplier Getty Images took Stability AI to court for allegedly using images from its site without permission to train Stable Diffusion, an art-producing AI.


The flaws of generative AI, namely the data it is trained on and, relatedly, the makeup of that training data, tend to put it in the legal crosshairs. But a new startup, Bria, claims to reduce the risk by training image-making, and soon video-making, AI in an “ethical” way.

“Our goal is to empower both developers and creators, while ensuring that our platform is legally and ethically sound,” Bria co-founder Yair Adato told TechCrunch in an email interview. “We combined the best of visual generative AI technology and responsible AI practices to create a sustainable model that prioritizes these ideas.”


Image credit: Bria


Adato co-founded Bria when the pandemic struck in 2020, and the company’s other co-founder, Asa Elder, joined in 2022. While studying for his Ph.D. in computer science at Ben-Gurion University of the Negev, Adato says, he developed a passion for computer vision and its potential to “improve” communication through generative AI.

“I realized there is a real commercial use case for this,” Adato said. “The process of creating visuals is complex, manual and often requires specialized skills. Bria was created to address this challenge, providing a visual generative AI platform tailored to enterprises that digitizes and automates the entire process.”

Thanks to recent advances in the field of AI, on both the commercial and research side (open source models, the falling cost of compute and so on), there is no shortage of platforms offering text-to-image AI art tools (Midjourney, DeviantArt and so forth). But Adato claims that Bria is different in that it (1) focuses exclusively on enterprises and (2) was built with ethical considerations in mind from the start.


Bria’s platform lets businesses create visuals for social media posts, ads and e-commerce listings using its image-generating AI. Through a web app (an API is on the way) and Nvidia’s Picasso cloud AI service, customers can generate, modify or upload visuals and optionally switch on a “brand guardian” feature, which strives to ensure generated visuals adhere to their brand guidelines.

The AI in question is trained on “authorized” datasets containing content that Bria licenses from partners, including individual photographers and artists as well as media companies and stock image repositories, which receive a share of the startup’s revenue.

Bria isn’t the only company exploring a revenue-sharing business model for generative AI. Shutterstock’s recently launched Contributors Fund reimburses creators whose work is used to train AI art models, while OpenAI has licensed a portion of Shutterstock’s library to train its image-generating tool DALL-E 2. Meanwhile, Adobe says it’s developing a compensation model for contributors to Adobe Stock, its stock content library, that will allow them to “monetize their talent” and benefit from any revenue that Firefly, its generative AI technology, brings in.

But Bria’s approach is more comprehensive, Adato told me. The company’s revenue sharing model rewards data owners based on the impact of their contributions, allowing artists to set prices on a per-AI-training-run basis.

Adato explains: “Every time an image is generated using Bria’s generative platform, we find the visuals in the training set that contributed the most to [the generated art], and we use our technology to allocate revenue among creators. This approach allows us to have many licensed sources in our training set, including artists, and avoid any issues related to copyright infringement.”
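
Bria hasn’t published the details of that allocation mechanism, but the basic idea is straightforward to sketch: score how much each licensed training image influenced a given output, then split that generation’s revenue pool among creators in proportion to those scores. The short Python sketch below is a hypothetical illustration; the attribution scores, identifiers and revenue figures are made-up inputs for the example, not Bria’s actual implementation.

# Hypothetical sketch of attribution-weighted revenue sharing. How the
# attribution scores are computed is the hard (and unpublished) part;
# here they are simply given as inputs.
from collections import defaultdict

def allocate_revenue(attribution_scores, creator_of, revenue_pool):
    """Split one generation's revenue pool among creators in proportion to
    how strongly each of their licensed images influenced the output."""
    total = sum(attribution_scores.values())
    if total == 0:
        return {}
    payouts = defaultdict(float)
    for image_id, score in attribution_scores.items():
        payouts[creator_of[image_id]] += (score / total) * revenue_pool
    return dict(payouts)

# Example: a generation influenced mostly by two of one photographer's images.
scores = {"img_001": 0.60, "img_002": 0.25, "img_003": 0.15}
creators = {"img_001": "photographer_a", "img_002": "photographer_a", "img_003": "stock_b"}
print(allocate_revenue(scores, creators, revenue_pool=1.00))
# -> roughly {'photographer_a': 0.85, 'stock_b': 0.15}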

Image credit: Bria

Bria clearly marks all generated images on its platform with watermarks, and it offers free access, or so it claims at least, to nonprofits and academics working to “democratize creativity, prevent deepfakes or promote diversity.”


In the coming months, Bria plans to go a step further, offering an open source generative AI art model with a built-in attribution mechanism. There have been attempts at this, such as Have I Been Trained? and Stable Attribution, sites that make a best effort to identify which artworks contributed to a particular AI-generated visual. But Bria’s model will allow other generative platforms to set up similar revenue-sharing arrangements with creators, Adato says.

It’s hard to put too much stock in Bria’s technology given how young the generative AI industry is. It’s unclear, for example, how Bria traces a generated image back to the training set and uses that data to split revenue. How will Bria handle complaints from creators who allege they are being unfairly underpaid? Will some creators get paid more because of a glitch in the system? Only time will tell.

Despite the unknowns, Adato exudes the kind of confidence you’d expect from a founder, arguing that each contributor to the training dataset behind Bria’s platform gets their fair share based on usage and “real impact.”

“We believe that the most effective way to solve [the challenges around generative AI] is at the training set level, using a high-quality, enterprise-grade, balanced and safe training set,” Adato said. “When it comes to the adoption of generative AI, there are ethical and legal implications that companies need to consider to ensure the technology is used in a responsible and safe way. However, by working with Bria, companies can rest assured that these concerns will be taken care of.”

Whether that’s true is an open question. And it’s not the only one.

What if a creator wants to opt out of Bria’s platform? Can they? Adato assured me that they can. But Bria uses its own opt-out mechanism, as opposed to a more general standard like DeviantArt’s or artist advocacy group Spawning’s, which offers a website where artists can have their art removed from one of the more popular generative art training datasets.


This increases the burden on content creators, who now have to worry about taking steps to remove their art from yet another generative AI platform (unless they use a “cloaking” tool like Glaze, which prevents models from training on their art). Adato doesn’t see it that way.

“We have made it a priority to focus on secure and quality enterprise data collection in building our training sets to avoid biased or toxic data and copyright infringement,” he added. “Overall, our commitment to ethical and responsible training of AI models sets us apart from our competitors.”

Those rivals include OpenAI, Midjourney and Stability AI, as well as Jasper, whose generative art tool, Jasper Art, also targets enterprise customers. The formidable competition and the open ethical questions don’t seem to be deterring investors, however: Bria has raised $10 million in venture capital from Entrée Capital, IN Venture, Getty Images and a group of Israeli angel investors.


Image credit: Bria

Adato said Bria is currently serving a “range” of customers, including marketing agencies, visual stock repositories, and tech and marketing firms. “We are committed to continuously growing our customer base and providing them with innovative solutions for their visual communication needs,” he added.

Should Bria be successful, part of me wonders whether it will spawn a new crop of generative AI companies that are more limited in scope than today’s big players, and thus less vulnerable to legal challenges. As funding for generative AI begins to cool, partly due to heightened competition and questions of liability, more “narrow” generative AI startups may have a chance to cut through the noise and avoid lawsuits in the process.

We’ll have to wait and see.

