
What is Cara, the Instagram alternative that gained 600k users in a week?

Key points:

  • Cara’s Rapid Growth: The app gained 600,000 users in a week
  • Artists Leaving Instagram: The controversy around Instagram using images to train AI led many artists to seek an alternative
  • Cara’s Features: The app is designed specifically for artists and offers a ‘Portfolio’ feature. Users can tag fields, mediums, project types, categories, and software used to create their work
  • Scale: While Cara has grown quickly, it is still tiny compared to Instagram’s massive user base of two billion
  • Glaze Integration: Cara is working on integrating Glaze directly into the app to give users an easy way to protect their work from being used by AI

more about: blog.cara.app/blog/cara-glaze-about

thorbot ,

Who. The fuck. Cares

jadelord ,

This will be the headline a month later:

Cara’s monthly active users down to a few thousand. Here’s why.

rolling_resistance ,

Cara has no passwords: you log in via Google or Apple

uhuh, no thanks

KuroiKaze ,

You have a problem with OAuth?

WanderingVentra ,

A lot of people are trying to de-google.

Rudgrcom ,

you can use your email

yamanii ,

So much bad faith, I logged in just fine with a regular e-mail.

rolling_resistance ,

It’s just a quote from the article, but good to know.

vodkasolution ,

I’m no federated-nazi and I welcome projects like Cara, but at the beginning there are always lots of subscriptions

doodledup ,

I don’t understand how this Glaze thing is supposed to stop AI being trained on the art.

Etterra ,

It pollutes the data pool. The rule of GIGO (garbage in, garbage out) applies: feed the AI garbage and its results turn to garbage.

Basically, it puts some imperceptible stuff in the image file’s data (somebody else should explain how because I don’t know) so that what the AI sees and the human looking at the picture sees are rather different. So you try and train it to draw a photorealistic car and instead it creates a lumpy weird face or something. Then the AI uses that defective nonsense to learn what “photorealistic car” means and reproduce it - badly.

If you feed a bunch of this trash into an AI and tell it that this is how to paint like, say, Rembrandt, and then somebody uses it to try to paint a picture like Rembrandt, they’ll end up getting something that looks like it was scrawled by a 10-year-old, or the dogs playing poker went through a teleporter malfunction, or whatever nonsense data was fed into the AI instead.

If you tell an AI that 2+2=🥔, that pi=9, or that the speed of light is Kevin, then nobody can use that AI to do math.

If you trained ChatGPT to explain history by feeding it descriptions of Civ6 games, then nobody could use it to cheat on their history term paper. The AI would go on about how Gandhi attacked Mansa Musa in 1686 with all-out nuclear war. It’s the same thing here, but with pictures.
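
The label-poisoning idea fits in a few lines of Python. This is a toy sketch (a nearest-neighbour "model" on made-up 2-D points, nothing to do with Glaze's actual code), just to show that a model trained on flipped labels confidently gives the wrong answer:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated clusters of "images" (here just 2-D points).
cars = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
faces = rng.normal(loc=[5.0, 5.0], scale=0.5, size=(50, 2))
X = np.vstack([cars, faces])

def nearest_label(x, X, y):
    """1-nearest-neighbour 'model': copy the label of the closest training point."""
    return y[np.argmin(np.linalg.norm(X - x, axis=1))]

clean_labels = np.array(["car"] * 50 + ["face"] * 50)
poisoned_labels = np.array(["face"] * 50 + ["car"] * 50)  # every label flipped

query = np.array([0.1, -0.2])  # clearly sits in the "car" cluster

print(nearest_label(query, X, clean_labels))     # → car
print(nearest_label(query, X, poisoned_labels))  # → face
```

Same data, same "model", same query; only the labels were poisoned, and the answer flips.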

egeres ,

Right, but AFAIK Glaze targets the CLIP model inside diffusion models, which means any new version of CLIP would remove the effect of the protection

General_Effort ,

It’s not. It’s supposed to target certain open source AIs (Stable Diffusion specifically).

Latent diffusion models work on compressed images. That takes less resources. The compression is handled by a type of neural network called a VAE (variational autoencoder). For this attack to work, you must have access to the specific VAE that you are targeting.

The image is subtly altered so that the compressed image looks completely different from the original. You can only do that if you know what the compression AI does. Stable Diffusion is a necessary part of the Glaze software. It is ineffective against any closed source image generators that have trained their own VAE (or equivalent).
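
A toy numpy sketch of that white-box idea (this is not Glaze's actual algorithm; the random linear "encoder" here just stands in for the VAE): because you know the encoder exactly, you can search for a perturbation that is tiny per pixel but moves the compressed representation a lot.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the VAE encoder: a fixed linear map from "pixels" to a latent
# code. Like the real attack, this assumes white-box access to the exact encoder.
W = rng.normal(size=(8, 64))

def encode(x):
    return W @ x

image = rng.uniform(0, 1, size=64)  # a toy 64-"pixel" image
eps = 0.03                          # per-pixel budget: visually imperceptible

# Projected gradient ascent on ||encode(x + d) - encode(x)||^2 under an
# L-infinity budget on d. For a linear encoder the gap is exactly W @ d.
delta = rng.normal(scale=1e-3, size=64)  # tiny random start (zero has zero gradient)
for _ in range(200):
    grad = 2 * W.T @ (W @ delta)         # gradient of the latent gap w.r.t. delta
    delta = np.clip(delta + 0.01 * grad, -eps, eps)

latent_gap = np.linalg.norm(encode(image + delta) - encode(image))
pixel_gap = np.abs(delta).max()
print(pixel_gap, latent_gap)  # tiny per-pixel change, much larger latent shift
```

The catch is right there in the first comment: the search only works because `W` is known. Swap in a different encoder and the optimized `delta` is just noise.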

This kind of attack is notoriously fickle and thwarted by even small changes. It’s probably not even very effective against the intended target.

If you’re all about intellectual property, it kinda makes sense that freely shared AI is your main enemy.

Flaky ,

Yeah, this is why I get very skeptical or even dismissive of Glaze/Nightshade working. It’s an interesting concept, using machine learning/generative AI to “poison” the models, but I’ll have to believe it when I see it. I see people defending it by saying it has an academic paper - that doesn’t mean anything on its own. It still needs to be independently evaluated.

Saw a post on Bluesky from someone in tech saying that eventually, if it’s human-viewable it’ll also be computer-viewable, and there’s simply no working around that. Wonder if you agree with that or not.

go_go_gadget ,

if it’s human-viewable it’ll also be computer-viewable

Sort of. If you raise a person to look at thousands of pictures of random pixels and say “that’s a fox” or “that’s not a fox”, eventually they’ll make up a pattern to decide whether the random pixels are a fox or not. Meanwhile someone raised normally will take one look and go “that’s just random pixels, it’s not a picture of anything”. AI is still in that impressionable stage. So you feed it garbage and it doesn’t know it’s garbage.

General_Effort ,

I’m sure it works fine in the lab. But it really only targets one specific AI model; that one specific Stable Diffusion VAE. I know that there are variants of that VAE around, which may or may not be enough to make it moot. The “Glaze” on an image may not survive common transformations, such as rescaling the image. It certainly will not survive intentional efforts to remove it, such as appropriate smoothing.
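
You can see why smoothing is so effective with a toy 1-D example (made-up numbers, not a test of Glaze itself): the perturbation has to stay imperceptible, so it lives in the high frequencies, and a simple box blur attenuates exactly those while barely touching the image content.

```python
import numpy as np

# A smooth "image" signal plus a high-frequency, low-amplitude perturbation.
t = np.linspace(0, 1, 256)
image = np.sin(2 * np.pi * t)                      # low-frequency content
perturbation = 0.05 * np.cos(2 * np.pi * 100 * t)  # imperceptible pixel wiggle

def box_blur(x, k=5):
    """Simple moving average: a crude stand-in for rescaling or filtering."""
    kernel = np.ones(k) / k
    return np.convolve(x, kernel, mode="same")

blurred = box_blur(image + perturbation)

# Blurring is linear, so what survives of the perturbation is just its blur.
residual = np.linalg.norm(blurred - box_blur(image))
original = np.linalg.norm(perturbation)
print(residual / original)  # well below 1: the blur wiped out most of the wiggle
```

The low-frequency image passes through a blur almost unchanged; the high-frequency protection layer does not. Any pipeline that rescales uploads gets this effect for free.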

In my opinion, there is no point in bothering in the first place. There are literally billions of images on the net. One locks up gems because they are rare. This is like locking up pebbles on the beach. It doesn’t matter if the lock is bad.

if it’s human-viewable it’ll also be computer-viewable

Sort of. The VAE, the compression, means that image generation takes less compute, i.e. cheaper hardware and less energy. You can have an image generator that works directly on the same pixels humans see. Actually, that’s simpler and existed earlier.

By Moore’s law, it would be many years, even decades, before that efficiency gain is something we can do without. But I think, maybe, this becomes moot once special accelerator chips for neural nets are designed.

What makes it obsolete is the proliferation of open models. E.g. today Stable Diffusion 3 became available for download. This attack targets one specific model and may work on variants of it. But as more and more rather different models become available, the whole thing becomes increasingly pointless. Maybe you could target more than one, but it would be more and more effort for less and less effect.

themoonisacheese ,

Not only is this kind of attack notoriously unstable, but finding out which images have been glazed is also a fantastic indicator for finding the high-quality art you actually want to train on.

General_Effort ,

I doubt that. Having a very proprietary attitude towards one’s images and making good images are not related at all.

Besides, good training data is to a large extent about the labels.

boatsnhos931 ,

Nice try feds

stufkes ,

Isn’t there already artstation.com? Just had a look at Cara and it looks very similar

johannesvanderwhales ,

It’s Instagram mashed up with ArtStation

nasi_goreng ,

There was an anti-AI art campaign a few months back by ArtStation users due to ArtStation allowing AI art.

johannesvanderwhales ,

They actually seem quite a bit different. The one for Cara isn’t perfectly round and seems to suggest a person in the middle.

samus12345 ,

Yeah, they’re different, but “white circular C on a black background” just made me think of the CN one.

Hadriscus ,

Thanks for the link. This is pretty much what I expected.

gedaliyah ,

The crowdfunding/patronage of this platform only helps them build their proprietary empire. It’s like giving money to your neighbor who wants to build a swimming pool on their property because they promise you’ll be able to swim in it.

PopOfAfrica ,

So what happens when this app needs to pay server costs for 600,000 people?

Hadriscus ,

Do you mind telling me what this says? It seems Firefox doesn’t load Twitter anymore. Or maybe you need an account? I’m not sure, but it says “error”

Dicska ,

“Jingna Zhang @ cara.app/zemotion @zemotion So freaking speechless right now. Seen many @vercel functions stories but first time experiencing such discrepancy vs request logs like, this is cannot be real??”

https://lemmy.world/pictrs/image/a0e22148-c796-4b8c-950c-dc86861c5135.jpeg

Omniraptor ,

I’ve heard from many people that Vercel is pretty nasty that way, and to only use them for learning and toy projects.

dan ,

Twitter no longer loads newer tweets if you’re logged out. Instead of showing a proper message, it either fails to load or redirects to the login page. They did that to prevent scraping.

jol ,

So what happens now? I doubt they have figured out monetization yet, right?

Evotech ,

They basically have to just move off Vercel. There are a lot of much cheaper alternatives though

jol ,

OK, but they have a 90k bill now?

Evotech ,

Can probably be massaged a bit

jol ,

Massaging costs extra though?

UnderpantsWeevil ,

Re: the hosting company

Your account does not appear to have spend management enabled, which would allow you to pause your project entirely if you hit a certain level of spend.

So, this is something of a devil’s bargain. Either shut down your website just as it’s catching fire and gaining traction. Or get billed a year’s server budget in a matter of days because of exploding costs.

In a saner world, this might be used as an argument for treating the Internet as a public utility and not a for-profit rent. Perhaps more companies could grow and sustain large pools of customers if they weren’t kneecapped by their own momentum.

Instead, I’m sure we’re going to see more exotic insurance and finance services designed to siphon money out of websites as a hedge against unexpected growth.

TheFeatureCreature ,

I’ll be watching this curiously from a safe distance for now. I am interested in a new platform without AI, but this stinks of early-stage enshittification.

pentagrammar ,

They have huge bills to pay already. It can totally lead to enshittification.

TheImpressiveX ,

Out of the frying pan, into the fire.

grrgyle ,

Sounds like another pan

afraid_of_zombies ,

🤷 all we have to do is keep moving faster than our waste stream.

Platform has cool ideas, gets users, gets greedy, gets infected with bots and scammers, users leave for new platform with cool ideas…

Accept the idea that you are not going to have a thirty-year-old Yahoo Answers account, and even if you did you wouldn’t be using it. Make peace with it.

grrgyle , (edited )

This exactly. And also, the more splintered similar user bases are, the better.

More competition, and it’s harder to enshittify a “captured” user base

Prandom_returns ,

Yet another centralised social network. That pinky-promises they’ll never go bad.

Join now! Bring your friends! No ads! Everything’s free! We’re indie!..

Moments later… enshittification ensues.

Sabata11792 ,

Solves the problem for a few years until Meta buys their users and data back.

Murdoc ,

Assuming they don’t own them already as a sort of pressure valve. Yeah I’m getting that cynical.

General_Effort ,

Does it seem odd… This is a crowd that is all about “hands off muh property”. And yet they see nothing suspicious about someone giving them a free service.

interdimensionalmeme ,

Yep, this is just Instagram again with a little anti-AI image filter on top. And a portfolio, not a photo album!

If it’s not as interoperable as email, it belongs in the trash

gedaliyah ,

Pixelfed looks like they are doing a huge push to get up to speed. It has been an immature app/platform for a long time and slow to get the features that people need from a photo-sharing social network.

According to their Mastodon, they are working on better AI-management features and launching an app that will make it a genuinely positive experience.

HootinNHollerin , (edited )

The official app is available in beta. I’m very impressed with it

ams ,

I really want Pixelfed to take off and this really could have been a moment, but after using it for more than a year now, I just can’t see it. Development is very slow - it feels like a one-man show (it might not be). We do need an alternative to Instagram, but yeah…
