Lex Fridman Podcast - #76 – John Hopfield: Physics View of the Mind and Neurobiology

John Hopfield is a professor at Princeton whose life’s work has woven beautifully through biology, chemistry, neuroscience, and physics. Most crucially, he saw the messy world of biology through the piercing eyes of a physicist. He is perhaps best known for his work on associative neural networks, now known as Hopfield networks, which were one of the early ideas that catalyzed the development of the modern field of deep learning.
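
For listeners who want a concrete feel for the associative-memory idea, here is a minimal sketch of a Hopfield network in Python. This is an illustration of the standard textbook construction (Hebbian storage plus asynchronous updates), not code from the episode, and the patterns are arbitrary examples:

```python
import numpy as np

# Hebbian storage: sum of outer products of the +/-1 patterns, no self-connections.
def train(patterns):
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / len(patterns)

# Asynchronous updates: each neuron aligns with its local field, which only ever
# lowers the network's energy, so the state settles into a stored pattern.
def recall(W, state, sweeps=10):
    state = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

patterns = np.array([[1,  1, 1,  1, -1, -1, -1, -1],
                     [1, -1, 1, -1,  1, -1,  1, -1]])
W = train(patterns)
probe = np.array([-1, 1, 1, 1, -1, -1, -1, -1])  # first pattern with one bit flipped
print(recall(W, probe))                          # recovers the first stored pattern
```

The point of the toy example is the associative-memory property discussed in the episode: a noisy or partial cue falls into the basin of attraction of the nearest stored memory.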

EPISODE LINKS:
Now What? article: http://bit.ly/3843LeU
John wikipedia: https://en.wikipedia.org/wiki/John_Hopfield
Books mentioned:
– Einstein’s Dreams: https://amzn.to/2PBa96X
– Mind is Flat: https://amzn.to/2I3YB84

This conversation is part of the Artificial Intelligence podcast. If you would like to get more information about this podcast go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube where you can watch the video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts, follow on Spotify, or support it on Patreon.

This episode is presented by Cash App. Download it (App Store, Google Play), use code “LexPodcast”. 

Here’s the outline of the episode. On some podcast players you should be able to click the timestamp to jump to that time.

OUTLINE:
00:00 – Introduction
02:35 – Difference between biological and artificial neural networks
08:49 – Adaptation
13:45 – Physics view of the mind
23:03 – Hopfield networks and associative memory
35:22 – Boltzmann machines
37:29 – Learning
39:53 – Consciousness
48:45 – Attractor networks and dynamical systems
53:14 – How do we build intelligent systems?
57:11 – Deep thinking as the way to arrive at breakthroughs
59:12 – Brain-computer interfaces
1:06:10 – Mortality
1:08:12 – Meaning of life

PHPUgly - 180: Hardware BUGS 🐛

This week Eric, John, and Thomas talk Laracon Online, BUGS, Pis, and more Laravel.

PHPUgly - 179: Just Don’t Close The Browser

Lex Fridman Podcast - #75 – Marcus Hutter: Universal Artificial Intelligence, AIXI, and AGI

Marcus Hutter is a senior research scientist at DeepMind and a professor at Australian National University. Throughout his research career, including work with Jürgen Schmidhuber and Shane Legg, he has proposed many interesting ideas in and around the field of artificial general intelligence, including the development of the AIXI model, a mathematical approach to AGI that incorporates ideas from Kolmogorov complexity, Solomonoff induction, and reinforcement learning.
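
For reference, two of the ideas named above have compact standard definitions; the notation below follows the usual literature and is not quoted from the episode:

```latex
% Kolmogorov complexity: the length of the shortest program p that makes a fixed
% universal Turing machine U output the string x.
K_U(x) = \min_{p} \{\, \ell(p) : U(p) = x \,\}

% Solomonoff's universal prior: every program whose output starts with x
% contributes weight 2^{-length}, so shorter explanations dominate (Occam's razor).
M(x) = \sum_{p \,:\, U(p) = x*} 2^{-\ell(p)}
```

Roughly speaking, AIXI combines this universal prior with reinforcement-learning-style planning, acting to maximize expected future reward over all computable environments, which is why the exact model is incomputable and the episode also covers approximations to it.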

EPISODE LINKS:
Hutter Prize: http://prize.hutter1.net
Marcus web: http://www.hutter1.net
Books mentioned:
– Universal AI: https://amzn.to/2waIAuw
– AI: A Modern Approach: https://amzn.to/3camxnY
– Reinforcement Learning: https://amzn.to/2PoANj9
– Theory of Knowledge: https://amzn.to/3a6Vp7x

This conversation is part of the Artificial Intelligence podcast. If you would like to get more information about this podcast go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube where you can watch the video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts, follow on Spotify, or support it on Patreon.

This episode is presented by Cash App. Download it (App Store, Google Play), use code “LexPodcast”. 

Here’s the outline of the episode. On some podcast players you should be able to click the timestamp to jump to that time.

OUTLINE:
00:00 – Introduction
03:32 – Universe as a computer
05:48 – Occam’s razor
09:26 – Solomonoff induction
15:05 – Kolmogorov complexity
20:06 – Cellular automata
26:03 – What is intelligence?
35:26 – AIXI – Universal Artificial Intelligence
1:05:24 – Where do rewards come from?
1:12:14 – Reward function for human existence
1:13:32 – Bounded rationality
1:16:07 – Approximation in AIXI
1:18:01 – Gödel machines
1:21:51 – Consciousness
1:27:15 – AGI community
1:32:36 – Book recommendations
1:36:07 – Two moments to relive (past and future)

African Tech Roundup - Vietnamese-style Africa-focused Fintech Innovation With Quan Le of Binkabi

Andile Masuku catches up with Binkabi CEO Quan Le to learn how his company is working to lower the world's reliance on the US dollar for imports and exports. Listen in to hear how Quan and his team are cutting out middlemen by turning agricultural commodities into tradable assets and automatically matching inbound and outbound trades, enabling farmers to participate directly in global trade networks and retain more of the profits from their harvests.

Binkabi is a London-headquartered cross-border physical commodity trading platform that operates primarily in developing countries. The team leverages blockchain technology to address the complex frictions that characterise international agriculture supply chains. Quan is a Vietnamese finance professional who previously worked at PwC London as an auditor and, later, in mergers and acquisitions advisory. During his 16-year-plus tenure at the firm, he worked with leading financial institutions in emerging markets in both Asia and Africa.

Image Credit: no_one_cares (unsplash.com)

Code Story: Insights from Startup Tech Leaders - S2 E3: Wil Schroter, Startups.com

Startup veteran Wil Schroter is a family man and now an amateur carpenter. He spends a lot of his spare time covered in sawdust, enjoying a balance of analog activities away from digital life. He has spent 25 years as a startup CEO, and during that time he learned that what he was best at was teaching people how to go through the startup process. For his 9th startup, he built Startups.com – a place that provides education and tools to help founders through the entire startup process – and this solution was catalyzed by the creation of a funding platform.


Today’s sponsors:

Podcorn (https://podcorn.com/podcasters)


Links


Leave us a review on Apple Podcasts!


Amazing tools we use:

  • If you want the best publishing platform for your podcast, with amazing support & people – use Transistor.fm.
  • Want to record your remote interviews with class? Then, you need to use Squadcast.
  • Code Story uses the 1-click product ClipGain – sign up now to get 3 hrs of podcast processing time FREE!


Credits: Code Story is hosted and produced by Noah Labhart, and co-produced and edited by Bradley Denham. Be sure to subscribe on Apple Podcasts, Spotify, Pocket Casts, Google Play, Breaker, YouTube, or the podcasting app of your choice.



Our Sponsors:
* Check out Vanta: https://vanta.com/CODESTORY


Support this podcast at — https://redcircle.com/code-story/donations

Advertising Inquiries: https://redcircle.com/brands

Privacy & Opt-Out: https://redcircle.com/privacy

The Stack Overflow Podcast - A Dash of Anil, a Pinch of Glimmer, a splash of Glitch

Glitch, a platform that makes it easy for anyone to create or remix a web app, has seen over five million apps created by users. You can read more about how it works here. If you want to learn a little about how it works with Docker, check out this piece here.

If you want to know more about the shared history of Stack and Glitch, you can read up on it here. TL;DR: Glitch was born out of Fog Creek Software and counts Joel Spolsky and Michael Pryor as founders.

Glimmer is a new web magazine from the folks at Glitch. It focuses on creators and makers, with a special emphasis on unearthing the human stories of people building today's software.

While you're here, don't forget to take 15-20 minutes and share your opinions in our 2020 Developer Survey. Whether Stack Overflow helped you during your journey as a programmer or not, we want to hear from everyone who codes. 

Some fun background for younger listeners: 

GeoCities - a popular platform for building and hosting a personal website and linking it with others that share similar themes.

BetaBeat - a website launched by The NY Observer that covered the Silicon Alley tech scene. It was how Ben first met Anil, Joel, and many others.

Heroku

Docker

If you have comments, questions, or suggestions, please send us an email at podcast@stackoverflow.com

Today’s episode is brought to you by Refinitiv. Unlock new possibilities with consistent, high-value market data from Refinitiv. Try the Refinitiv Eikon Data API for the largest breadth and depth of data and community tools with native Python support. Check out refinitiv.com/stackpodcast to try the Eikon Data API today. Refinitiv. Data is just the beginning.

Lex Fridman Podcast - #74 – Michael I. Jordan: Machine Learning, Recommender Systems, and the Future of AI

Michael I. Jordan is a professor at Berkeley, and one of the most influential people in the history of machine learning, statistics, and artificial intelligence. He has been cited over 170,000 times and has mentored many of the world-class researchers defining the field of AI today, including Andrew Ng, Zoubin Ghahramani, Ben Taskar, and Yoshua Bengio.

EPISODE LINKS:
(Blog post) Artificial Intelligence—The Revolution Hasn’t Happened Yet

This conversation is part of the Artificial Intelligence podcast. If you would like to get more information about this podcast go to https://lexfridman.com/ai or connect with @lexfridman on Twitter, LinkedIn, Facebook, Medium, or YouTube where you can watch the video versions of these conversations. If you enjoy the podcast, please rate it 5 stars on Apple Podcasts, follow on Spotify, or support it on Patreon.

This episode is presented by Cash App. Download it (App Store, Google Play), use code “LexPodcast”. 

Here’s the outline of the episode. On some podcast players you should be able to click the timestamp to jump to that time.

OUTLINE:
00:00 – Introduction
03:02 – How far are we in development of AI?
08:25 – Neuralink and brain-computer interfaces
14:49 – The term “artificial intelligence”
19:00 – Does science progress by ideas or personalities?
19:55 – Disagreement with Yann LeCun
23:53 – Recommender systems and distributed decision-making at scale
43:34 – Facebook, privacy, and trust
1:01:11 – Are human beings fundamentally good?
1:02:32 – Can a human life and society be modeled as an optimization problem?
1:04:27 – Is the world deterministic?
1:04:59 – Role of optimization in multi-agent systems
1:09:52 – Optimization of neural networks
1:16:08 – Beautiful idea in optimization: Nesterov acceleration
1:19:02 – What is statistics?
1:29:21 – What is intelligence?
1:37:01 – Advice for students
1:39:57 – Which language is more beautiful: English or French?