The Stack Overflow Podcast - Craft and quality beat speed and scale, with or without agents

Ryan welcomes Tom Moor, head of engineering at Linear, to discuss AI agents’ mixed results for productivity in the development lifecycle, the importance of context for maximizing agents’ effectiveness, and the role that junior developers need to play in a world increasingly driven by AI.

Episode notes:

Linear is a tool for planning and building products that streamlines issues, projects, and product roadmaps.

Connect with Tom on Twitter

This episode’s shoutout goes to user ozz, who won a Populist badge for their answer to Column width not working in DataTables bootstrap.

TRANSCRIPT

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Big Technology Podcast - Google Research Head Yossi Matias: AI For Cancer Research, Quantum’s Progress, Researchers’ Future

Yossi Matias is the head of Google Research. He joins Big Technology Podcast to discuss the company's research efforts in areas like cancer treatment and quantum computing, and the relationship between research and product. Tune in to hear how Google used LLMs to generate a cancer hypothesis validated in living cells, what a “13,000×” quantum result really means, and how the research-product loop turns papers into products. We also cover whether AI can automate a researcher's job. This conversation was recorded in front of a live audience at Google's Mountain View headquarters.

---

Enjoying Big Technology Podcast? Please rate us five stars ⭐⭐⭐⭐⭐ in your podcast app of choice.

Want a discount for Big Technology on Substack + Discord? Here’s 25% off for the first year: https://www.bigtechnology.com/subscribe?coupon=0843016b

Questions? Feedback? Write to: bigtechnologypodcast@gmail.com

Talk Python To Me - #525: NiceGUI Goes 3.0

Building a UI in Python usually means choosing between "quick and limited" or "powerful and painful." What if you could write modern, component-based web apps in pure Python and still keep full control? NiceGUI, pronounced "Nice Guy," sits on FastAPI with a Vue/Quasar front end, gives you real components and live updates over websockets, and is running in production at Zauberzeug, a German robotics company. On this episode, I’m talking with NiceGUI’s creators, Rodja Trappe and Falko Schindler, about how it works, where it shines, and what’s coming next. With version 3.0 landing around the same time this episode comes out, we spend the end of the episode celebrating the release.
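
For a sense of what "pure Python" UI code looks like here, a minimal sketch of a tiny NiceGUI page (the label and button text are just placeholders):

    from nicegui import ui

    ui.label("Hello from pure Python!")
    ui.button("Click me", on_click=lambda: ui.notify("Button pressed"))

    ui.run()  # starts the underlying FastAPI/uvicorn server and serves the page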

Episode sponsors

Posit
Agntcy
Talk Python Courses

Rodja Trappe: github.com
Falko Schindler: github.com

NiceGUI 3.0.0 release: github.com
Full LLM/Agentic AI docs instructions for NiceGUI: github.com

Zauberzeug: zauberzeug.com
NiceGUI: nicegui.io
NiceGUI GitHub Repository: github.com
NiceGUI Authentication Examples: github.com
NiceGUI v3.0.0rc1 Release: github.com
Valkey: valkey.io
Caddy Web Server: caddyserver.com
JustPy: justpy.io
Tailwind CSS: tailwindcss.com
Quasar ECharts v5 Demo: quasar-echarts-v5.netlify.app
AG Grid: ag-grid.com
Quasar Framework: quasar.dev
NiceGUI Interactive Image Documentation: nicegui.io
NiceGUI 3D Scene Documentation: nicegui.io

Watch this episode on YouTube: youtube.com
Episode #525 deep-dive: talkpython.fm/525
Episode transcripts: talkpython.fm

Theme Song: Developer Rap
🥁 Served in a Flask 🎸: talkpython.fm/flasksong

---== Don't be a stranger ==---
YouTube: youtube.com/@talkpython

Bluesky: @talkpython.fm
Mastodon: @talkpython@fosstodon.org
X.com: @talkpython

Michael on Bluesky: @mkennedy.codes
Michael on Mastodon: @mkennedy@fosstodon.org
Michael on X.com: @mkennedy

Python Bytes - #455 Gilded Python and Beyond

Topics covered in this episode:
Watch on YouTube

About the show

Sponsored by us! Support our work through:

Connect with the hosts

Join us on YouTube at pythonbytes.fm/live to be part of the audience. Usually Monday at 10am PT. Older video versions available there too.

Finally, if you want an artisanal, hand-crafted digest of every week of the show notes in email form, add your name and email to our friends of the show list. We'll never share it.

Michael #1: Cyclopts: A CLI library
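
  Cyclopts builds a CLI from plain, type-annotated functions, broadly similar in spirit to Typer. A minimal sketch of basic usage (the command name and parameters are purely illustrative):

    from cyclopts import App

    app = App()

    @app.command
    def hello(name: str, count: int = 1):
        """Greet NAME, optionally more than once."""
        for _ in range(count):
            print(f"Hello, {name}!")

    if __name__ == "__main__":
        app()  # parses sys.argv, e.g. `python cli.py hello Ada --count 2`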

Brian #2: The future of Python web services looks GIL-free

  • Giovanni Barillari
  • “Python 3.14 was released at the beginning of the month. This release was particularly interesting to me because of the improvements on the "free-threaded" variant of the interpreter.

    Specifically, the two major changes when compared to the free-threaded variant of Python 3.13 are:

    • Free-threaded support now reached phase II, meaning it's no longer considered experimental
    • The implementation is now completed, meaning that the workarounds introduced in Python 3.13 to make code sound without the GIL are now gone, and the free-threaded implementation now uses the adaptive interpreter as the GIL enabled variant. These facts, plus additional optimizations make the performance penalty now way better, moving from a 35% penalty to a 5-10% difference.”
  • Lots of benchmark data, both ASGI and WSGI (a quick snippet for checking which interpreter build you're on follows this list)
  • Lots of great thoughts in the “Final Thoughts” section, including
    • “On asynchronous protocols like ASGI, despite the fact the concurrency model doesn't change that much – we shift from one event loop per process, to one event loop per thread – just the fact we no longer need to scale memory allocations just to use more CPU is a massive improvement. ”
    • “… for everybody out there coding a web application in Python: simplifying the concurrency paradigms and the deployment process of such applications is a good thing.”
    • “… to me the future of Python web services looks GIL-free.”
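
  As a companion to the article, here's one way to check which build you're running before comparing benchmarks yourself (a minimal sketch; it uses the private sys._is_gil_enabled(), available on Python 3.13+):

    import sys
    import sysconfig

    # 1 on free-threaded ("t") builds, 0 or None on standard builds.
    print("built without the GIL:", sysconfig.get_config_var("Py_GIL_DISABLED"))

    # Reports whether the GIL is active right now; even on a free-threaded
    # build it can be re-enabled at runtime, e.g. by an extension module
    # that doesn't declare free-threading support.
    if hasattr(sys, "_is_gil_enabled"):
        print("GIL currently enabled:", sys._is_gil_enabled())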

Michael #3: Free-threaded GC

  • The free-threaded build of Python uses a different garbage collector implementation than the default GIL-enabled build.
  • The Default GC: In the standard CPython build, every object that supports garbage collection (like lists or dictionaries) is part of a per-interpreter, doubly-linked list. The list pointers are contained in a PyGC_Head structure.
  • The Free-Threaded GC: Takes a different approach. It scraps the PyGC_Head structure and the linked list entirely. Instead, it allocates these objects from a special memory heap managed by the "mimalloc" library. This allows the GC to find and iterate over all collectible objects using mimalloc's data structures, without needing to link them together manually.
  • The free-threaded GC does NOT support “generations”
  • By marking all objects reachable from known GC roots, the collector can identify a large set of objects that are definitely alive and exclude them from the more expensive cycle-finding part of the GC process (a tiny demo of cycle collection follows this list).
  • Overall, the free-threaded GC collection is between 2 and 12 times faster than the 3.13 version.
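
  The internals differ between builds, but the Python-level behavior being sped up, finding and freeing reference cycles, is easy to see in either build. A tiny illustrative demo:

    import gc

    class Node:
        def __init__(self):
            self.other = None

    # Build a reference cycle that pure reference counting can never reclaim.
    a, b = Node(), Node()
    a.other, b.other = b, a
    del a, b

    # The cycle-finding pass reclaims these objects; the free-threaded build
    # just locates candidates via mimalloc's heaps instead of a linked list.
    print("unreachable objects found:", gc.collect())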

Brian #4: Polite lazy imports for Python package maintainers

  • Will McGugan commented on a LinkedIn post by Bob Belderbos regarding lazy importing
  • “I'm excited about this PEP.

    I wrote a lazy loading mechanism for Textual's widgets. Without it, the entire widget library would be imported even if you needed just one widget. Having this as a core language feature would make me very happy.”

    https://github.com/Textualize/textual/blob/main/src/textual/widgets/__init__.py

  • Well, I was excited about Will’s example of how to, essentially, let users of your package import only the part they need, when they need it.

  • So I wrote up my thoughts and an explainer for how this works (a condensed sketch of the pattern follows this list).
  • Special thanks to Trey Hunner’s Every dunder method in Python, which I referenced to understand the difference between __getattr__() and __getattribute__().
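
  A condensed sketch of that pattern in a package __init__.py, loosely modeled on the Textual widgets file linked above; the names (Button, Label, _button, _label) are placeholders, not the real layout:

    import importlib

    __all__ = ["Button", "Label"]

    # Map public names to the private submodules that actually define them.
    _LAZY = {
        "Button": "._button",
        "Label": "._label",
    }

    def __getattr__(name: str):
        # Called only when `name` isn't found normally (PEP 562), so each
        # submodule is imported on first access, not at package import time.
        if name in _LAZY:
            module = importlib.import_module(_LAZY[name], __name__)
            return getattr(module, name)
        raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

    def __dir__():
        return sorted(__all__)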

Extras

Brian:

  • Started writing a book on Test Driven Development.
    • Should have an announcement in a week or so.
    • I want to give folks access while I’m writing it, so I’ll be opening it up for early access as soon as I have 2-3 chapters ready to review. Sign up for the pythontest newsletter if you’d like to be informed right away when it’s ready. Or stay tuned here.

Michael:

Joke: You're absolutely right

Big Technology Podcast - OpenAI’s Risky Browser Bet, Amazon’s Mass Automation Plan, Clippy’s Back

Ranjan Roy from Margins is back for our weekly discussion of the latest tech news. We cover: 1) OpenAI's Atlas browser is here 2) Atlas plays 2048 3) The danger of AI browser prompt injection 4) Will Atlas be around in five years? 5) Why Dave's Hot Chicken is the world's top app 6) Amazon has plans to automate hundreds of thousands of jobs 7) OpenAI is paying investment bankers to train its models 8) If we automate all the work, who will be left to buy stuff? 9) Meta cuts 100 AI jobs 10) Reddit fools AI crawlers and shows theft 11) Clippy returns!

---

Enjoying Big Technology Podcast? Please rate us five stars ⭐⭐⭐⭐⭐ in your podcast app of choice.

Want a discount for Big Technology on Substack + Discord? Here’s 25% off for the first year: https://www.bigtechnology.com/subscribe?coupon=0843016b

Questions? Feedback? Write to: bigtechnologypodcast@gmail.com

The Stack Overflow Podcast - Your runbooks are obsolete in the age of agents

Ryan is joined by Spiros Xanthos, CEO and founder of Resolve AI, to talk about the future of AI agents in incident management and troubleshooting, the challenges of maintaining complex software systems with traditional runbooks, and the changing role of developers in an AI-driven world.

Episode notes:

Resolve AI is building agents to help you troubleshoot alerts, manage incidents, and run your production systems. 

Connect with Spiros on LinkedIn or email him at spiros@resolve.ai.

Congrats to user larsks for winning a Stellar Answer badge for their answer to How do I get into a Docker container's shell?.

TRANSCRIPT

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Code Story: Insights from Startup Tech Leaders - S11 Bonus: Tanmai Gopal, PromptQL

Tanmai Gopal is a repeat guest on the podcast. Back in Season 7, he came on to tell the creation story of Hasura, which is a universal data access layer for next-generation apps. He talked through his and his colleagues' frustration with building API after API, and the steps they took so that people wouldn't have to do that work anymore.

As Hasura started to take off, Tanmai began asking what the right way was for developers, and in particular their applications, to access data. With the advent of AI, he and his team dug into which problems were the right ones to solve, and identified that the main problem with this type of tech was accuracy and trust.

This is the creation story of PromptQL.

Sponsors

Links




Support this podcast at — https://redcircle.com/code-story-insights-from-startup-tech-leaders/donations

Advertising Inquiries: https://redcircle.com/brands

Privacy & Opt-Out: https://redcircle.com/privacy

The Stack Overflow Podcast - What leaders need to know from the 2025 Stack Overflow Developer Survey

In this episode of Leaders of Code, Eira May, B2B Editor at Stack Overflow, and Natalie Rotnov, Senior Product Marketing Manager for the Enterprise Product Suite at Stack Overflow, unpack the key takeaways from the 2025 Developer Survey for tech and business leaders. The discussion focuses on the evolving developer relationship with AI, the continued struggle with tool sprawl, and actionable recommendations for leaders looking to deliver value and improve developer experience.

The discussion covers critical findings for tech leaders:

  • The decline in developer trust in AI is linked to two main frustrations: solutions that are "almost right, but not quite" and the time wasted debugging AI-generated code.
  • Human connection and community validation remain vital: 80% of developers still visit Stack Overflow regularly, and the number of "advanced questions" on the public platform has doubled since 2023, underscoring AI’s limitations when it comes to complex, context-dependent questions.
  • Tool sprawl continues, as most developers use 6–10 tools, suggesting that AI tends to complicate rather than simplify workflows.

Notes:

  • Explore key insights from the 2025 Stack Overflow Developer Survey, consolidated into an executive-ready summary. 
  • Connect with Natalie Rotnov on LinkedIn.


See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Big Technology Podcast - Amazon’s Panos Panay: The Reality of Building Alexa Plus and AI Assistants

Panos Panay is Amazon’s head of Devices & Services. Panay returns to Big Technology Podcast to discuss Alexa Plus's delayed rollout, when the assistant is releasing to everyone, and the challenge of building these products. Tune in for specifics on compatibility, usage spikes, and what “day one” means when you have hundreds of millions of customers. We also cover the future of computing, from phones to wearables and home devices. Hit play for a grounded look at what’s real now—and what’s coming next.

---

Enjoying Big Technology Podcast? Please rate us five stars ⭐⭐⭐⭐⭐ in your podcast app of choice.

Want a discount for Big Technology on Substack + Discord? Here’s 25% off for the first year: https://www.bigtechnology.com/subscribe?coupon=0843016b

Questions? Feedback? Write to: bigtechnologypodcast@gmail.com

Code Story: Insights from Startup Tech Leaders - S11 E22: Ryan Wang, Assembled

Ryan Wang has had a winding set of paths to get to where he is today. He studied economics and statistics, with the intent of going to grad school and becoming a professor. After talking with his boss at the time, Steven Levitt (also one of the authors of Freakonomics), he was convinced that was not the best path. Eventually, he joined Stripe via nepotism, and became a software developer by way of data science. Outside of tech, he loves to read about different topics. Right now, he is reading about owls, and also loves to read fiction and poetry. In fact, he drops poetry occasionally at his current venture.

While at Stripe, back when it was an 80-person company, Ryan noticed people handling support tickets on their own. After he had spent some time there, he and his now co-founder started tinkering with machine learning for support. As he made progress, a leader pointed out that the real problem was around workforce management.

This is the creation story of Assembled.

Sponsors

Links




Support this podcast at — https://redcircle.com/code-story-insights-from-startup-tech-leaders/donations

Advertising Inquiries: https://redcircle.com/brands

Privacy & Opt-Out: https://redcircle.com/privacy