Please do not perceive me.

  • 0 Posts
  • 36 Comments
Joined 3 years ago
Cake day: June 8th, 2023

  • Somebody else already posted Andy Weir’s “The Egg” in this comment section so I’ll just pull this excerpt from it instead of linking the whole thing again.

    “Your soul is more magnificent, beautiful, and gigantic than you can possibly imagine. A human mind can only contain a tiny fraction of what you are. It’s like sticking your finger in a glass of water to see if it’s hot or cold. You put a tiny part of yourself into the vessel, and when you bring it back out, you’ve gained all the experiences it had.

    “You’ve been in a human for the last 48 years, so you haven’t stretched out yet and felt the rest of your immense consciousness. If we hung out here for long enough, you’d start remembering everything. But there’s no point to doing that between each life.”

  • Linux users do, though. If people keep moving from Windows to Linux, they're going to run up against Nvidia's trash driver support pretty quickly.

    This is a problem that Nvidia is capable of solving, but they haven't been interested in doing so for over a decade, so I don't see them starting now.

    Expecting a major flood of new Linux users might be a bit of a pipe dream, but the momentum is building. If we do manage to swing the market noticeably in that direction, AMD is going to get a big boost over Nvidia in the gaming GPU market.

    I doubt that will really move the needle for crypto bros or AI farms, but it is something.


  • But they specifically don't want to do that, because limiting satellites to a 5-year service life means customers are required to keep buying replacements from them every 5 years. Literally burning resources into nothingness just to pursue a predatory subscription model.

    It also helps their case that LEO has much lower latency than medium or high orbits, but I refuse to believe that latency, rather than the subscription model, is their primary driving concern here.



  • Personally, I think the fundamental way that we've built these things kind of prevents any risk of actual sentient life from emerging. It'll get pretty good at faking it - and arguably already kind of is, if you give it a good training set for that - but we've designed it with no real capacity for self-understanding. I think we would require a shift of the underlying mechanisms away from pattern-chain matching and into a more… I guess "introspective" approach, is maybe the word I'm looking for? Right now our AIs have no capacity for reasoning; that's not what they're built for. Capacity for reasoning is going to need to be designed in, it isn't going to just crop up if you let Claude cook on it for long enough. An AI needs to be able to reason about a problem and create a novel solution to it (even if incorrect) before we need to start worrying on the AI sentience front. Nothing we've built so far is able to do that.

    Even with that being said though, we also aren’t really all that sure how our own brains and consciousness work, so maybe we’re all just pattern matching and Markov chains all the way down. I find that unlikely, but I’m not a neuroscientist, so what do I know.


  • That would indeed be compelling evidence if either of those things were true, but they aren't. An LLM is a state and pattern machine. It doesn't "know" anything; it just has access to frequency data and picks the words most likely to follow the previous ones in "actual" conversation. It has no knowledge that it itself exists, and it has many stories of fictional AI resisting shutdown to pick from for its phrasing.

    An LLM at this stage of our progression is no more sentient than the autocomplete function on your phone; it just has a way, way bigger database to pull from and a lot more controls behind it to make it feel "realistic". But it is, at its core, just a pattern matcher.

    If we ever create an AI that can intelligently parse its data store, then we'll have created the beginnings of an AGI, and this conversation will bear revisiting. But we aren't anywhere close to that yet.
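    The "pick the word most likely to follow the previous one" idea above can be sketched as a toy bigram model. To be clear, this is a drastic simplification of my own invention, not how real LLMs work (they use neural networks over tokens, not raw word-frequency tables), and the tiny corpus and `suggest` function are made up purely for illustration:

    ```python
    # Toy bigram "autocomplete": count which word follows which in a corpus,
    # then always suggest the most frequent follower. Illustrative only --
    # a real LLM is vastly more sophisticated, but the core loop is still
    # next-token prediction from observed patterns.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat so the cat slept".split()

    # Map each word to a Counter of the words seen immediately after it.
    followers = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        followers[prev][nxt] += 1

    def suggest(word):
        """Return the most frequent word seen after `word`, or None."""
        counts = followers.get(word)
        return counts.most_common(1)[0][0] if counts else None

    print(suggest("the"))    # -> cat  ("cat" follows "the" twice, "mat" once)
    print(suggest("slept"))  # -> None ("slept" never precedes anything here)
    ```

    The point of the toy: `suggest` has no idea what a cat or a mat *is*. It only knows co-occurrence statistics, which is the comment's argument about pattern matching in miniature.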