Eye On A.I.

By: Craig S. Smith
  • Summary

  • Eye on A.I. is a biweekly podcast, hosted by longtime New York Times correspondent Craig S. Smith. In each episode, Craig will talk to people making a difference in artificial intelligence. The podcast aims to put incremental advances into a broader context and consider the global implications of the developing technology. AI is about to change your world, so pay attention.
Episodes
  • #229 Mitesh Agrawal: Why Lambda Labs’ AI Cloud Is a Game-Changer for Developers
    Jan 8 2025

    This episode is sponsored by Netsuite by Oracle, the number one cloud financial system, streamlining accounting, financial management, inventory, HR, and more.

    NetSuite is offering a one-of-a-kind flexible financing program. Head to https://netsuite.com/EYEONAI to learn more.



    In this episode of the Eye on AI podcast, we dive into the transformative world of AI compute infrastructure with Mitesh Agrawal, Head of Cloud/COO at Lambda Labs.

    Mitesh takes us on a journey from Lambda Labs' early days as a style transfer app to its rise as a leader in providing scalable, deep learning infrastructure. Learn how Lambda Labs is reshaping AI compute by delivering cutting-edge GPU solutions and accessible cloud platforms tailored for developers, researchers, and enterprises alike.

    Throughout the episode, Mitesh unpacks Lambda Labs’ unique approach to optimizing AI infrastructure—from reducing costs with transparent pricing to tackling the global GPU shortage through innovative supply chain strategies. He explains how the company supports deep learning workloads, including training and inference, and why their AI cloud is a game-changer for scaling next-gen applications.

    We also explore the broader landscape of AI, touching on the future of AI compute, the role of reasoning and video models, and the potential for localized data centers to meet the growing demand for low-latency solutions. Mitesh shares his vision for a world where AI applications, powered by Lambda Labs, drive innovation across industries.

    Tune in to discover how Lambda Labs is democratizing access to deep learning compute and paving the way for the future of AI infrastructure.

    Don’t forget to like, subscribe, and hit the notification bell to stay updated on the latest in AI, deep learning, and transformative tech!



    Stay Updated:

    Craig Smith Twitter: https://twitter.com/craigss

    Eye on A.I. Twitter: https://twitter.com/EyeOn_AI



    (00:00) Introduction and Lambda Labs' Mission

    (01:37) Origins: From DreamScope to AI Compute Infrastructure

    (04:10) Pivoting to Deep Learning Infrastructure

    (06:23) Building Lambda Cloud: An AI-Focused Cloud Platform

    (09:16) Transparent Pricing vs. Hyperscalers

    (12:52) Managing GPU Supply and Demand

    (16:34) Evolution of AI Workloads: Training vs. Inference

    (20:02) Why Lambda Labs Sticks with NVIDIA GPUs

    (24:21) The Future of AI Compute: Localized Data Centers

    (28:30) Global Accessibility and Regulatory Challenges

    (32:13) China’s AI Development and GPU Restrictions

    (39:50) Scaling Lambda Labs: Data Centers and Growth

    (45:22) Advancing AI Models and Video Generation

    (50:24) Optimism for AI's Future

    (53:48) How to Access Lambda Cloud

    56 mins
  • #228 Rodrigo Liang: How SambaNova Systems Is Disrupting AI Inference
    Jan 1 2025

    This episode is sponsored by RapidSOS. Close the safety gap and transform your emergency response with RapidSOS.

    Visit https://rapidsos.com/eyeonai/ today to learn how AI-powered safety can protect your people and boost your bottom line.



    In this episode of the Eye on AI podcast, we explore the world of AI inference technology with Rodrigo Liang, co-founder and CEO of SambaNova Systems.

    Rodrigo shares his journey from high-performance chip design to building SambaNova, a company revolutionizing how enterprises leverage AI through scalable, power-efficient solutions. We dive into SambaNova’s groundbreaking achievements, including their record-breaking inference performance on the Llama 405B and 70B models, delivering unparalleled speed and accuracy—all on a single rack consuming less than 10 kilowatts of power.

    Throughout the conversation, Rodrigo highlights the seismic shift from AI training to inference, explaining why production AI is now about speed, efficiency, and real-time applications. He details SambaNova’s approach to open-source models, modular deployment, and multi-tenancy, enabling enterprises to scale AI without costly infrastructure overhauls.

    We also discuss the competitive landscape of AI hardware, the challenges of NVIDIA’s dominance, and how SambaNova is paving the way for a new era of AI innovation. Rodrigo explains the critical importance of power efficiency and how SambaNova’s technology is unlocking opportunities for enterprises to deploy private, secure AI systems on-premises and in the cloud.

    Discover how SambaNova is redefining AI for enterprise adoption, enabling real-time AI, and setting new standards in efficiency and scalability.

    Don’t forget to like, subscribe, and hit the notification bell to stay updated on the latest breakthroughs in AI, technology, and enterprise innovation!



    Stay Updated:

    Craig Smith Twitter: https://twitter.com/craigss

    Eye on A.I. Twitter: https://twitter.com/EyeOn_AI

    24 mins
  • #227 Sedarius Tekara Perrotta: The Importance of Data Quality in AI Systems
    Dec 29 2024

    In this episode of the Eye on AI podcast, we dive into the critical issue of data quality for AI systems with Sedarius Perrotta, co-founder of Shelf.

    Sedarius takes us on a journey through his experience in knowledge management and how Shelf was built to solve one of AI’s most pressing challenges—unstructured data chaos. He shares how Shelf’s innovative solutions enhance retrieval-augmented generation (RAG) and ensure tools like Microsoft Copilot can perform at their best by tackling inaccuracies, duplications, and outdated information in real-time.

    Throughout the episode, we explore how unstructured data acts as the "fuel" for AI systems and why its quality determines success. Sedarius explains Shelf's approach to data observability, transparency, and proactive monitoring to help organizations fix "garbage in, garbage out" issues, ensuring scalable and trusted AI initiatives.

    We also discuss the accelerating adoption of generative AI, the future of data management, and why building a strategy for clean and trusted data is vital for 2025 and beyond. Learn how Shelf enables businesses to unlock the full potential of their unstructured data for AI-driven productivity and innovation.

    Don’t forget to like, subscribe, and hit the notification bell to stay updated on the latest advancements in AI, data management, and next-gen automation!


    Stay Updated:
    Craig Smith Twitter: https://twitter.com/craigss
    Eye on A.I. Twitter: https://twitter.com/EyeOn_AI


    (00:00) Introduction and Shelf's Mission
    (03:01) Understanding SharePoint and Data Challenges
    (05:29) Tackling Data Entropy in AI Systems
    (08:13) Using AI to Solve Data Quality Issues
    (12:30) Fixing AI Hallucinations with Trusted Data
    (21:01) Gen AI Adoption Insights and Trends
    (28:44) Benefits of Curated Data for AI Training
    (37:38) Future of Unstructured Data Management

    45 mins
