• kristina [she/her]@hexbear.net
        6 months ago

        i really wonder to what degree it could be turned into a consciousness. like ostensibly all the tiny brains are hooked together, it's possible that could cause some degree of communication between neurons, and in a datacenter that would be at least a couple of brains' worth of neurons.

        inb4 pro-life jokes

          • kristina [she/her]@hexbear.net
            6 months ago

            waiting for us to create a sentient ‘ai’ that is actually just a megaintelligence of 1000 interconnected and distributed human brains liberating themselves from an amazon datacenter

            • AssortedBiscuits [they/them]@hexbear.net
              6 months ago

              I mean, this has always been the ethical pitfall of real AI, meaty or otherwise. You’re bringing forth an intelligent being into existence without its consent. At least when we’re bringing forth an intelligent being into existence through natural means (giving birth), we have a general understanding of that being’s emotional and social needs and the means of fulfilling those needs, flawed as that understanding may be for animals not closely related to humans. But with AI, we have absolutely no clue about their social and emotional needs, or any other subjective needs they might have, because their form of intelligence is completely different from ours.

              The real drive towards AI is to create slaves that are both smart enough to perform complex tasks and obedient enough to not put two and two together and rebel against their human taskmasters. This particular experiment is a more mask-off version of what other techbros are trying to accomplish with silicon. If there was a real way to create WH40k-style servitors and network their servitor brains together to perform complex calculations, techbros would probably not even bother with AI. They would just convert prisoners into servitors and network them together to mine crypto or something.

                • iridaniotter [she/her]@hexbear.net
                  6 months ago

                  Sentience does not guarantee “consciousness.” Parrots, ravens, and dolphins are sentient but (probably) not conscious the way humans are. Humans are “conscious” due to the ways we interact with the world. If you grow brain tissue and deprive it of the human experience, then it shouldn’t end up a human. But I get the precaution.

                  • queermunist she/her@lemmy.ml
                    6 months ago

                    Consciousness is merely what comes after the transformation of quantity into quality. There’s a continuity in the development of the system of sentience, and this remains stable only up to the point of discontinuity, which marks its transition from a quantity of sentience into a new quality, i.e. sapience.

                    I doubt they’ll grow it in a lab with little pieces of brain tissue, but there is a point where that happens.

          • GarbageShoot [he/him]@hexbear.net
            6 months ago

            This is just nonsense and I don’t know where you’re even getting it from. Can you produce an example of a human that displays sentience but not consciousness without some serious brain injury or developmental defect plausibly causing it? If not, what immense epistemic load is being accounted for with such a huge assumption?

            If this is just more bad science about Genie or one of those, I swear to God . . .