James Fodor is a science podcaster, an essayist, and currently a PhD candidate in computational neuroscience and computational linguistics at the University of Melbourne (Australia). His intellectual and research interests span such diverse areas as cognitive science, computer science, philosophy, theology, and economics.
In this conversation, KMO and James discuss:
02:07 – A brief history of "The Science of Everything"
10:08 – How neural networks learn vs how humans learn
12:52 – The uncomfortable question of back-propagation
16:00 – Acquiring language and concepts
19:52 – Why and when the way machines learn matters
23:22 – Illogical language models and the Internet lurking behind them
25:52 – Conversation starters and GPTs talking to one another
27:45 – Modeling the mind vs the "just a neural network" cop-out
33:08 – A space of possible minds and complementing human intelligence