That was their point lol
This is arguably the tamest thing Nintendo has done recently, so it’s a weird final straw
“pirate community” lmfao
Lol the first link mostly squabbles over definitions, and argues for natural gas.
The second link relies on simulations and required specific blends of several renewable sources to get rid of the need for a baseload source, which is not a broadly applicable solution. Not every location can have 50% wind supplying power (fucking lol) and they STILL required natural gas to ramp up supply.
What the fuck do you mean by “base load myth”? Lol
“Fellas, does mentioning the common refrains of someone you’re talking with make you MAD??”
to be fair, i didn’t have to respond, i did to bust your cocky egos
Lmfao do you actually think you sound like the cool vanguard of the people here?
Lol of course not, so I’ll repeat myself and say it’s funny how this never comes up in the “death to America” and “such and such is the West’s fault” threads of the other hexbear posts you comment in. I know you’re being a contrarian teenager right now, but that’s the kind of stuff that makes hexbear posters look dumb.
That’s hilarious, do you also say that with your “death to America” hexbear buddies?
Does even Hamas state they want a two-state solution in their charter?
Steam stans refuse to look outside the platform and aren’t always the brightest bunch
Yeah, Markov chains are truly the worst thing our species has produced!
Glad you have your priorities straight. It’s been fun talking to a chatbot instead of having a discussion like normal people. You can respond to this comment with whatever responses you want, just know that according to the things I’ve pretended you said, I’ve won the argument.
Are you in high school? You’re making up things I never said and putting a sexual element into your responses for no reason.
Stay in school and learn how to have discussions before arguing about language and technology lol
Goodnight.
What on Earth is this in response to?? Did I say it was a hard riddle?
I concede. AI has a superintelligent brain and I’m just so jealous.
Point to any part of my comment that implied any of this.
I only gave more info on how LLMs work since what you were describing were Markov chains. I wasn’t saying you were wrong about the thrust of your comment, just the details of how they work. If they were exactly as effective as Markov chains we wouldn’t be having these discussions; that’s why they can be misused.
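To make the distinction concrete, here’s a toy word-level Markov chain generator. This is a sketch for illustration only; the corpus, function names, and order are made-up examples, not anything from this thread. The key limitation is that the next word depends on just the last `order` words, nothing earlier.

```python
# Minimal word-level Markov chain: the next word depends ONLY on the
# last `order` words (a fixed window), not on the whole text so far.
import random
from collections import defaultdict

def build_chain(words, order=1):
    # Map each state (the last `order` words) to the words seen after it.
    chain = defaultdict(list)
    for i in range(len(words) - order):
        state = tuple(words[i:i + order])
        chain[state].append(words[i + order])
    return chain

def generate(chain, start, length=10, seed=0):
    rng = random.Random(seed)
    out = list(start)
    for _ in range(length):
        state = tuple(out[-len(start):])
        nexts = chain.get(state)
        if not nexts:
            break  # dead end: this state was never followed by anything
        out.append(rng.choice(nexts))
    return " ".join(out)

corpus = "the fox sees the chicken and the chicken sees the grain".split()
chain = build_chain(corpus, order=1)
print(generate(chain, start=("the",), length=8))
```

With `order=1` the model forgets everything but the previous word, which is why Markov output drifts into incoherence so quickly compared to an LLM conditioning on its whole context.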
Feel free to discuss the actual words I’m using instead of this LLM word salad.
There’s already an explanation for the Mandela effect: our memories are extremely fallible and shaped far more by our worldview and environment than by facts, more so than most people believe.
I think you’re seeing coherence where there is none.
Ask it to solve the riddle about the fox, the chicken, and the grain.
I think it getting tripped up on riddles that people often fail, or not getting factual things correct, isn’t as important for “believability,” which is probably a word closer to what I meant than “coherence.”
No one was worried about misinformation coming from r/SubredditSimulator, for example, because Markov chains have much, much less believability. “Just guessing words” is a bit of an over-simplification for neural nets, which are a powerful technology even if the utility of turning them toward language is debatable.
And if LLMs weren’t so believable, we wouldn’t be having so many discussions about the misinformation or misuse they could cause. I don’t think we’re disagreeing; I’m just trying to add more detail to your “each word is generated independently” quote, which is patently wrong and detracts from your overall point.
I don’t disagree, I was just pointing out that “each word is generated independently of each other” isn’t strictly accurate for LLMs.
It’s part of the reason they’re so convincing to some people: they can hold threads semi-coherently through entire essay-length passages without obvious internal lapses of logic.
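The point about each word NOT being independent can be sketched as a toy autoregressive loop. Everything here is a made-up stand-in: `next_token_dist` is a hypothetical placeholder for a real neural net, and the tiny vocabulary is arbitrary. What matters is that each new token is sampled conditioned on the entire prefix, not a fixed window and certainly not independently.

```python
# Toy autoregressive loop: every new token is chosen using the FULL
# sequence generated so far, unlike a Markov chain's fixed window.
import random

def next_token_dist(context):
    # Hypothetical stand-in for a model: a real LLM runs a neural net
    # over the whole context. Here we just demonstrate that the entire
    # prefix is available by trivially down-weighting tokens already seen.
    vocab = ["the", "fox", "chicken", "grain", "."]
    weights = [1 + (tok not in context) for tok in vocab]
    return vocab, weights

def generate(prompt, n_tokens=5, seed=0):
    rng = random.Random(seed)
    out = list(prompt)
    for _ in range(n_tokens):
        vocab, weights = next_token_dist(out)  # whole context, not a window
        out.append(rng.choices(vocab, weights=weights)[0])
    return out

print(generate(["the", "fox"]))
```

That conditioning on the whole prefix is what lets LLMs stay semi-coherent over long stretches; “each word is generated independently” describes neither this loop nor even a Markov chain.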
Generative AI and LLMs start by predicting the next word in a sequence. The words are generated independently of each other
Is this true? I know that’s how Markov chains work, but I thought neural nets worked differently with larger tokens.
I feel like I clearly stated my issue with them, but throw out some more trite insults since you don’t have a response to it
Source?