
SmoothIsFast,

We have literally created small organism brains with neural networks that behaved nearly identically, using ML and neural nets. Idk, but I would call that pretty damn good empirical evidence. We do not know the specific chemical mechanism by which a brain generates its weights, so to speak, and computes, but we understand that at its simplest form it is a neuron with a weight, and depending on that weight/sensitivity (whatever you want to call it) it produces an output pretty damn consistently.

The brain is multiple networks working simultaneously with the ability to self-learn. This architecture is what is missing in our ML models now if you want general artificial intelligence, but we are also missing foundational algorithms for choosing weights, instead of randomly assigning them and hoping for the best, to facilitate memory and cleaner network integration. You need specialized networks for each critical function (motor control, emotional regulation, etc.), then you need a system that can interpret or create weights in a way that lets you imprint an "image," for lack of a better term, to create memories. Consciousness would then just be the network that facilitates interpretation of each network's output and decides which systems need to be engaged next, or whether an end state was reached. Which, imo, is clearly demonstrated by split-brain individuals.
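The "neuron with a weight that produces an output pretty damn consistently" idea can be sketched in a few lines. This is a deliberately minimal illustration of a single artificial neuron (weighted sum plus a threshold), not a model of a biological neuron; all names here are made up for the example.

```python
def neuron(inputs, weights, bias=0.0):
    """Weighted sum of inputs followed by a simple step activation."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0  # fires (1) or stays silent (0)

# For fixed weights, the same inputs always give the same output --
# the consistency described above. Only the weights change behavior.
print(neuron([1.0, 0.5], [0.8, -0.2]))   # -> 1 (0.8 - 0.1 = 0.7 > 0)
print(neuron([1.0, 0.5], [-0.8, -0.2]))  # -> 0 (-0.8 - 0.1 < 0)
```

Everything interesting (learning, memory, the specialized subnetworks mentioned above) is then a question of how the weights get set, which is exactly the part the comment argues we lack good algorithms for.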

If we had a theory of mind that was complete, it would simply be a matter of counting up the number of transistors required to approximate varying degrees of intelligence.

I think this is where we fundamentally lack the ability to interpret how our brain calculates and uses chemical weights, so to speak, to vary output. If we can't judge that efficiency, we can't just count all the transistors and say "it's this smart," because the model could literally be trained to just output the letter s for everything even if it's the size of ChatGPT. I think we very well could state the capacity and limits of our brains by counting the number of neurons, but whether a brain reaches its potential depends on how efficiently it was trained, and that is where approximating intelligence becomes insanely difficult.
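The point that parameter count alone says nothing about capability can be shown with a toy example: two "networks" with the exact same number of parameters, one trained into something useful and one collapsed into a degenerate constant. The functions and weight values below are purely illustrative.

```python
def predict(weights, x):
    # A tiny linear "network": output = w0 * x + w1.
    return weights[0] * x + weights[1]

useful = [2.0, 1.0]     # approximates y = 2x + 1
collapsed = [0.0, 0.0]  # same parameter count, outputs 0 for everything

print(len(useful) == len(collapsed))  # -> True: identical "size"
print(predict(useful, 3))             # -> 7.0
print(predict(collapsed, 3))          # -> 0.0, regardless of x
```

Counting parameters (or transistors, or neurons) bounds what the system *could* compute, but the weights it actually ends up with determine what it *does* compute.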
