Fake4000@lemmy.world to Technology@lemmy.world, English · 10 months ago
Reddit started doing what they always wanted to do, sell user content to AI. (www.reuters.com)
206 comments · cross-posted to: technology@lemmy.world, technology@beehaw.org
Appoxo@lemmy.dbzer0.com · 10 months ago
Afaik the OpenAI bot may choose to ignore it? At least that’s what another user claimed it does.
JohnEdwa@sopuli.xyz · 10 months ago
Robots.txt has always been ignored by some bots. It’s just a guideline, originally meant to prevent excessive bandwidth usage by search-indexing bots, and honoring it is entirely voluntary. The Archive.org bot, for example, has completely ignored it since 2017.
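For anyone wondering what “voluntary” means in practice: a crawler only respects robots.txt if it actively fetches and checks the file itself before requesting a page; nothing on the server side enforces it. A minimal sketch in Python using the standard-library robotparser (the site URL and bot name below are just placeholders, not any real crawler):

```python
# Sketch of a well-behaved crawler consulting robots.txt before fetching.
# The whole check happens client-side -- a bot that doesn't care simply
# skips this step, which is why robots.txt can't actually block anything.
import urllib.robotparser

robots = urllib.robotparser.RobotFileParser()
robots.set_url("https://www.example.com/robots.txt")  # placeholder site
robots.read()  # download and parse the site's robots.txt

user_agent = "ExampleBot"  # hypothetical crawler name
target = "https://www.example.com/some/page"

if robots.can_fetch(user_agent, target):
    print("robots.txt allows fetching this page")
else:
    # A polite bot stops here; an impolite one never ran this check at all.
    print("robots.txt disallows fetching this page")
```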