Not if it isn’t connected to a network. 😈
Yeah, but not all consumption is equal. And when Musk is an accelerationist… that’s probably worse.
This is why I just set up a media server at home.
It’s mine, you can’t pump it full of ads. All the media is mine and those companies can go fuck themselves.
Sail those seas, folks.
If what he says is true, men have no purpose.
The brain-dead take is that companies should explore new technology without any qualifiers (I assume there aren’t any, because you didn’t add any and you applied it to AI). That’s how we’ve ended up with such a huge amount of waste, pollution, and theft from small independents.
Even if we narrow it to the field of AI, the waste and environmental damage from this kind of tech alone is absurd.
Add to that the downsizing AI causes, the pathetic service disruptions, and the inevitable decline of a company’s reputation from using such a thing, and it’s nothing but a waste.
Wild for a company that’s never made a profit
Nor should what they produce be copyrightable in any form. Even if it’s the base upon which an artist builds.
Also, it should all be free.
Yeah. The Dems needed to take him to task years ago. The fact they waited until the election is a political move and decision, and a poorly thought out one.
The fundamental problem is that all those results are from people with abnormal brain function, because of the corpus callosotomy.
It can’t be assumed things work that way in a normal brain.
People do often make things up in regard to themselves, especially in cases of dissonance. But that’s in relation to themselves, not the things they know. Most people, if you asked what OP did, will either admit they don’t know or tell you to look it up. The more specific the question, the less likely they are to make something up.
Funny thing is that the part of the brain used for talking makes things up on the fly as well 😁 There’s a great video from Joe about this topic, where he shows experiments done on people whose two brain hemispheres were split.
Having watched the video, I can confidently say you’re wrong about this, and so is Joe. If you want an explanation, though, let me know.
Or the words “I don’t know” would work.
You’re right, it’s not. It needs to know what things look like, which, once again, it’s not going to without knowing what those things look like. Sorry dude, either CSAM is in the training data and it can do this, or it’s not. But I’m pretty tired of this. Later, fool.
Once again you’re showing the limits of AI. A dragon exists in fiction; it exists in the mind of someone drawing it. In AI there is no mind, so the concept cannot independently exist.
Generative AI, just like a human, doesn’t rely on having seen an exact example of every possible image or concept.
If a human has never seen a dog before, they don’t know what it is or what it looks like.
If it’s the same as a human, it won’t be able to draw one.
I wasn’t the one attempting to prove that. Though I think it’s definitive.
You were attempting to prove it could generate things not in its data set, and I have disproved your theory.
To me, the takeaway here is that you can take a shitty two-minute Photoshop doodle and, by feeding it through AI, improve its quality by orders of magnitude.
To me, the takeaway is that you know less about AI than you claim. Much less. Because we have actual instances, and many of them, where CSAM is in the training data. Don’t believe me?
But you do know, because corn dogs as depicted in the picture do not exist, so there couldn’t have been photos of them in the training data, yet it was still able to create one when asked.
Yeah, except Photoshop and artists exist. And a quick Google image search will find them. 🙄
Then if your question is “how many photographs of a hybrid creature that is a cross between corn and a dog were in the training data?”
I’d honestly say: I don’t know.
And if you’re honest, you’ll say the same.
Also, if you’d like to see how the corn dog comment is absurd and wrong, go look up my comment.
It didn’t generate what we know and expect a corn dog to be.
Hence it missed, because it doesn’t know what a “corn dog” is.
You have proven the point that it couldn’t generate CSAM without some being present in the training data.
More like sophisticated tools being used for terrorist attacks.