Ask HN: How do you cope with the existential dread of AGI?
6 by NumberWangMan | 5 comments on Hacker News.
I've recently opened my eyes to the catastrophe we're headed toward with unaligned AI. I know plenty of people here aren't worried, and are actually excited about it. I was too, but I just hadn't really thought very deeply about it before. I was imagining the most rose-tinted sci-fi. Now I'm trying to figure out how this doesn't end extremely poorly for us all, and I can't.

I am trying not to panic, not to sink into a deep despair, but it seems like even if AI doesn't actually kill everyone (which apparently over half the people working on AI think has a good chance of happening!) it's going to screw things up much worse than the weak AIs we've already used to create the never-ending attention economy. It seems like AI is a fire that has already started to burn and consume us, and we just keep feeding it instead of fighting it. Maybe we create AIs to cure our diseases, then someone uses one to take out the power grid, and modern society collapses and hundreds of millions starve. Maybe even if a super-intelligent AI doesn't take over, we rely on them more and more until every important decision is made by AI instead of people, and when bad things start happening, it's too late to undo it. We're going so, so fast, and there's nobody at the helm.

I have a little boy, and I don't want this world for him. He's so naive, so excited about technology, and I cry for him when he's not looking. I wasn't doing too badly up until today; I tried to have a little hope and maybe start the grieving process ahead of time, but it's just so terrifying. I'm trying so hard to shout "stop! stop!" but the world is so noisy.

And to the people working on making better, smarter AIs -- do you think this is going to be a good thing for humanity, to develop and release these things to everyone, so quickly? Are you not worried that you're bringing about the end of the world, or at least going to cause massive amounts of suffering?
I have heard that a lot of people working in AI are privately very concerned, but don't feel like it'll make much of a difference if they quit or not.