• 1 Post
  • 444 Comments
Joined 1 year ago
Cake day: July 5th, 2023

  • frog 🐸@beehaw.org to Chat@beehaw.org: how's your week going, Beehaw (edited · 4 months ago)

    I’m feeling the need to do a social media detox, including Beehaw. Pro-AI techbros are getting me down.

    Shockingly, keeping Instagram active. My feed there is nothing but frogs, greyhounds, and art from local artists, and detoxing from stuff that is improving my mood rather than making it worse seems unnecessary.


  • But this is the point: the AIs will always need input from some source or another. Consider using AI to generate search results. Those will need to be updated with new information and knowledge, because an AI that can only answer questions about things known before 2023 will very quickly become obsolete. So it must be updated. But AIs do not know what is going on in the world. They have no sensory capacity of their own, so their inputs require data that is ultimately, at some point in the process, created by a human who does have the sensory capacity to observe what is happening in the world and write it down. And if the AI simply takes that writing without compensating the human, then the human will stop writing, because they will have had to get a different job to pay for food, rent, etc.

    No amount of “we can train AIs on AI-generated content” is going to fix the fundamental problem that the world is not static and AIs don’t have the capacity to observe what is changing. They will always be reliant on humans. Taking human input without paying for it disincentivises humans from producing content, and this will eventually create problems for the AI.






  • I honestly don’t get why so many people are so reckless and impatient on the roads. I’ve seen some people being really fucking stupid around cyclists and motorcyclists. One incident haunts me, because I know someone would have been severely injured, maybe killed, if I hadn’t been quick enough to get out of the way of an impatient person overtaking in a stupid place.

    And it’s just like… why? Just leave home a few minutes earlier!


  • frog 🐸@beehaw.org to Chat@beehaw.org: How to Deal With Cyclists (4 months ago)

    I did not know the exact wording of this guidance, but this is basically the strategy I use. I’ve always figured that because I prepare for my journeys, I am never in such a rush that I need to put someone else’s life at risk in order to pass them quicker - it’s not like it’s going to make a difference to my day if I arrive at my destination 2 minutes later, but it’ll make a huge difference to someone else’s day if I rush past a cyclist when it’s not safe.


  • Not even then. I think the thing that’s easy to forget about shareholders is they’re not doing this because they’re evil and get off on watching people suffer. They’re doing it because their own personal inadequacies are so vast that the only way they can cope with life is by trying to fill that enormous emotional hole with money. As a result, even when every other person on the planet has been crushed and ground into paste, and just one person with this mindset finally owns everything… it still won’t be enough for them. They will still be left with that unfillable emotional hole. They will still be empty inside.



  • I’ve definitely thought about modding Freelancer, but haven’t quite found the right ones yet. I tried Discovery (I think it was), and felt that the changes to the enemy AI and equipment (such as constantly using shield batteries and nanobots) just made gameplay more frustrating than enjoyable, because it made every single battle challenging - no more just chilling out while hauling random stuff through trade lanes. I’d really love a mod that adds new systems, planets, locations, ships, etc without dramatically changing the gameplay to be exclusively about the combat.



  • I still have a soft spot for Freelancer, despite all the years that have gone by (and aside from some minor UI issues, it plays perfectly on a modern PC), and it still looks remarkably nice for its age, too. The story is pretty linear, and the characters aren’t hugely memorable (despite some voice acting from George Takei, John Rhys-Davies, and Jennifer Hale), but it’s just fun to play. It can be challenging if you want to venture into areas less travelled, but because progress through the game is largely dependent on the money you earn (in-game), if you just want a chill evening, you can just trade goods.

    And like… this is a game I’ve been playing on and off for 20 years, and occasionally I still find something new. I played it a couple of months ago, committing to docking with every planet and station… and discovered a new trade route that was both shorter and more profitable than the one I had been using. It probably only cut 10 minutes off my three-stage trade run around the entire map, but it was still kind of exciting to go “oooh, I never realised this was an option!” All because I visited a station I don’t usually visit.


  • Yeah, I think you could be right there, actually. My instinct on this from the start is that it would prevent the grieving process from completing properly. There’s a thing called the Gestalt cycle of experience: a normal, natural mechanism for a person going through a new experience, whether it’s good or bad. A lot of unhealthy behaviour patterns stem from a part of that cycle being interrupted - you need to go through the cycle for everything that happens in your life, reaching closure so that you’re ready for the next experience to begin (most basic explanation), and when that doesn’t happen properly, it creates unhealthy patterns that influence everything that happens after that.

    Now I suppose, theoretically, there’s a possibility that being able to talk to an AI replication of a loved one might give someone a chance to say things they couldn’t say before the person died, which could aid in gaining closure… but we already have methods for doing that, like talking to a photo of them or to their grave, or writing them a letter, etc. Because the AI still creates the sense of the person still being “there”, it seems more likely to prevent closure - because that concrete ending is blurred.

    Also, your username seems really fitting for this conversation. :)




  • Given the husband is likely going to die in a few weeks, and the wife is likely already grieving for the man she is shortly going to lose, I think that still places both of them into the “vulnerable” category, and the owner of this technology approached them while they were in this vulnerable state. So yes, I have concerns, and the fact that the owner is allegedly a friend of the family (which just means they were the first vulnerable couple he had easy access to, in order to experiment on) doesn’t change the fact that there are valid concerns about the exploitation of grief.

    With the way AI techbros have been behaving so far, I’m not willing to give any of them the benefit of the doubt about claims of wanting to help rather than make money - such as using a vulnerable couple to experiment on while making a “proof of concept” that can be used to sell this to other vulnerable people.



  • Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.

    An AI isn’t going to magically know these things, because these aren’t AIs based on brain scans preserving the person’s entire mind and memories. They can only learn from the data they’re given. And fortunately, there’s a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write it down, or record a video. No AI needed.


  • Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will cause them harm rather than help - and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, but it actually prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don’t think AI companies should be free to prey on the vulnerable, which is a very, very realistic outcome of this technology. Because that is what companies do.

    So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death… but whether you’re comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?