Connor Leahy - E/Acc, AGI and the Future. By Machine Learning Street Talk

Connor Leahy discusses e/acc, AGI, and the future in this conversation with Machine Learning Street Talk. The full discussion is available on YouTube.

Highlights

the phrase vox populi, vox dei, because it’s kind of true, because it’s sensible to think of the population as expressing certain wills or certain gods or certain concepts and so on, at different times.

Democracy was one of the craziest inventions of the 19th and 20th centuries because it allowed for new forms of social organization and new gods, so to speak, that could both do more complicated things and involve more people, often for the better.

So Dawkins, I think, was talking about how interesting it is that we have this DNA, but the DNA almost has agency: the genes have a life of their own, which supervene on the DNA. And our culture is very much like that. Even though our DNA hasn’t really evolved very much over the last 100,000 years, our culture has exploded because it has this kind of symbiosis. It’s an organism; it has its own agency and so on. And inside that you have this profusion of memetic activity, and memes don’t necessarily get selected for what’s moral, as you say. They get selected for what helps them survive, which is quite an amoral process.

this externalized view of cognition. So we think of cognition as being a kind of profusion of activity, which is spread out in a very complex way.

a lot of my ideas, most of my ideas, don’t come from myself. They come from reading, say, Andy Clark on robotics and cognition

structure my external environment to be easier for me to extend my cognition into

So I’m like simultaneously constructing my environment to allow me to extend my cognition, while my extended cognition gives me the capabilities of doing that sort of external environment construction

there’s this kind of cyclical information exchange between us as agents and our environment

So we use the notepad to extend our mind. But I think where he wants you to go is more broadly than that, which is to think of absolutely everything that is physically embedded in our world as extending our mind.

So to a certain extent, we’re a bit like ChatGPT: we’d be really rubbish humans on our own, because most of our cognition is embedded in this sort of society.

I fully expect early AGIs to be quite embedded.

But I argue that AI systems today have a creativity gap and an autonomy gap. And, trying not to get too philosophical, what I mean by that is that they’re parasitic on the actual humans that use the system, and also on the kind of culture and society we have. So when you use ChatGPT, you’re entropy smuggling.

And I expect that these things that we are building are going to be very distributed intelligences, in a very similar way to humans.

Our culture is where most of our cognition happens, from a collective intelligence point of view. And I believe that the richness of that emerging thing is because of the sophistication of our interface, our physical interface with the world: our hands and arms and the way that we use tools. So it kind of feels to me that we as physical agents are the key that opens the door to the complexity.

But it’s not inconceivable that machines, even though it’s all digital, could similarly get to know each other in quite a complex way, which could give rise to another form of memetic society, unrecognizable maybe, but still one nonetheless.

As we were saying before, the reason AI has no autonomy is because it smuggles it from us. We’re the autonomy-producing thing in the system.

the first AGI that truly emerges will be messy. It’s not going to be one beautiful, crisp piece of code that encompasses everything. Neither will it be a massive neural network. It will be a weird cyborg admixture of large neural networks, code, formal logic, people, institutions, laws. All of this together will form the entity
