President Trump strongly criticized Ukrainian President Volodymyr Zelensky after a tense argument in the Oval Office on Friday. Trump accused Zelensky of showing disrespect toward the United States…
I find it really interesting that almost all of the recent comments on the YouTube video are 95% the same, praising “how great all this transparency is” and completely drowning out all other comments. They’re also worded very, very similarly.
There is a similar phenomenon on TikTok. The accounts often have similar name structures and either no picture or some USA-flag-type pic, but what gets me is that they all have around the same number of followers.
Then people will make entire videos to engage with the bots… Drives my blood pressure up when I see it.
Reddit has those types of bots too, especially on the conservative sub; half of it is run from Russia. And Reddit keeps touting how it’s getting rid of so many spammers, yet it did very little to stop the political ones, instead going after all the OF ones, and the general users too.
I read a while ago about someone doing forensic analysis using the heuristic of analyzing follower count numbers. I forget the exact mechanism, but certain patterns indicate that the follower count was statistically anomalous and therefore likely couldn’t be trusted.
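The comment above doesn’t say which mechanism was used, so this is only a guess at the general idea: a minimal sketch of one well-known heuristic of this kind, a Benford’s-law first-digit check on follower counts (the function name and sample numbers are hypothetical, and a real analysis would use a proper statistical test on a much larger sample).

```python
import math
from collections import Counter

def benford_deviation(follower_counts):
    """Total absolute deviation of leading-digit frequencies from Benford's law.

    Organically grown counts tend to roughly follow Benford's distribution,
    while purchased or script-generated followers often cluster unnaturally.
    """
    digits = [int(str(n)[0]) for n in follower_counts if n > 0]
    if not digits:
        return 0.0
    total = len(digits)
    observed = Counter(digits)
    deviation = 0.0
    for d in range(1, 10):
        expected = math.log10(1 + 1 / d)      # Benford's expected share for digit d
        actual = observed.get(d, 0) / total   # observed share in this sample
        deviation += abs(actual - expected)
    return deviation  # larger value = more anomalous distribution

# Toy usage: counts clustered around the same value score as more anomalous
print(benford_deviation([1203, 1198, 1210, 1205, 1199, 1201, 1204, 1202]))  # high deviation
print(benford_deviation([87, 312, 1450, 23, 9, 5402, 118, 764]))            # lower deviation
```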
Looking at their profiles, they don’t look like bots to me. Maybe it’s just that MAGAs sound as stupid as a bot swarm.
There’s like a stupidly large number of bots in YouTube comments.
Apparently they encourage engagement. 🤷
True, the likes might be manufactured. Don’t count on Google to moderate them
Yeah, I noticed that too on a lot of sci-fi channels, especially Star Trek and Doctor Who videos. Almost always around wokeness.
Actually regardless of any type of video subject, really.
Any new video I’ve seen, across many different subjects, has robotic comments right away, within the first minute or three of being uploaded.
I guess the YouTube folks are thinking that if it looks like there’s a bunch of people “in the room” other people will want to jump in and join the conversation.
Then after that, you’ll get robotic responses to your comments, usually antagonistic in nature, to try to bring you back to the YouTube app to reply and keep interacting with it.
AI has rapidly been employed to create social media profiles. And even before AI, there were entire teams dedicated to managing their own “teams” of social media profiles.
You can rest assured, like 70% of “users” on social media sites are just puppet accounts. You, as a private citizen, can rent some of these pools for things like PR campaigns.
There are still teams doing that, in addition to the AI. Reddit is pretty weird around AI: they use AI for their moderation but will ban most AI used by users (they won’t do anything about Russia-troll AI accounts, yet will happily ban others).
Join my new social network, SSNBook
Everyone is national-identity-number verified through a third party overseen by PricewaterhouseCoopers, and we do regular video calls to vibe-check you and try to suss out whether you’re posting your own thoughts or just renting out your account.
Naturally only 40% of our “users” are bots
[ 👆 I think about this all the time ] Also for reviews, etc.
How could you tell anything by their profiles? Most had no content or other information. It’s not like Reddit where you can view their comments.
I checked the accounts behind the super-generic comments that seemed AI-generated. Some of them had content like videos or decent descriptions. Others had profiles as old as 10 years; if they were bots, I would expect them to have been created in the last few years. There are a few that are suspicious, but it was maybe 2 in 15 accounts. They could also be hacked accounts.
I only saw one that had videos and one that had a playlist of some videos from a right-wing influencer. I wouldn’t take account age as a sign of not-bot, both because of compromised accounts, as you mentioned, and because YouTube botting has been going on for a long time now. YouTube is 20 years old at this point, and video monetization was introduced pretty soon after its creation.
Here’s their anti-bot policy page from 2014: https://web.archive.org/web/20140209083324/https://support.google.com/youtube/answer/3399767?hl=en
Plus 5% of how this would be worse under Harris or Biden.