Discussion about this post

Josh Holder

Great post, and I have similar intuitions to the ones you lay out in the final section about the nature of intelligence. Some problems are indeed just *really hard*, no matter what level of "intelligence" you have. I don't expect a fast takeoff with killer nanobots in our near future.

Gwern has a great piece arguing against this intuition (https://gwern.net/complexity), which is an interesting read if you want to challenge your assumptions. But I remain unconvinced.

One example he gives is that "a logistics/shipping company which could shave the remaining 1-2% of inefficiency off its planning algorithms would have a major advantage over its rivals." Fundamentally, that's still describing an advantage in the context of our institutions, markets, and manufacturing processes. The situation being described is less "AI reaches the singularity and immediately ascends, discarding our institutions and releasing paperclip-manufacturing nanobots," and more, "AI slowly outcompetes us in more and more domains until there is nothing left for humans to do."
