tcp.fm

By: Justin Brodley, Jonathan Baker, Ryan Lucas, and Matthew Kohn
  • Summary

  • The Cloud Pod is your one-stop shop for all things public, hybrid, multi-cloud, and private cloud. Cloud providers continue to accelerate with new features, capabilities, and changes to their APIs. Let Justin, Jonathan, Ryan, and Peter help you navigate this changing cloud landscape via our weekly podcast.
    © 2020 The Cloud Pod
Episodes
  • 278: Azure is on a Bender: Bite my Shiny Metal FXv2-series VMs
    Oct 16 2024

    Welcome to episode 278 of The Cloud Pod, where the forecast is always cloudy! When Justin’s away, the guys will… maybe get a show recorded? This week, we’re talking OpenAI, another service scheduled for the grave over at AWS, saying goodbye to pesky IPv4 fees, Azure FXv2 VMs, Valkey 8.0 and so much more! Thanks for joining us, here in the cloud!

    Titles we almost went with this week:
    • Another One Bites the Dust
    • Peak AI reached: OpenAI Now Puts Print Statements in Code to Help You Debug
    A big thanks to this week’s sponsor: Archera

    There are a lot of cloud cost management tools out there. But only Archera provides cloud commitment insurance. It sounds fancy but it’s really simple. Archera gives you the cost savings of a 1 or 3 year AWS Savings Plan with a commitment as short as 30 days. If you don’t use all the cloud resources you’ve committed to, they will literally put money back in your bank account to cover the difference. Other cost management tools may say they offer “commitment insurance”, but remember to ask: will you actually give me my money back? Archera will. Click this link to check them out

    AI Is Going Great – Or How ML Makes All Its Money

    00:59 Introducing vision to the fine-tuning API.

    • OpenAI has announced the integration of vision capabilities into its fine-tuning API, allowing developers to enhance the GPT-4o model to analyze and interpret images alongside text and audio inputs.
    • This update broadens the scope of applications for AI, enabling more multimodal interactions.
    • The fine-tuning API now supports image inputs, which means developers can train models to understand and generate content based on visual data in conjunction with text and audio.
    • After October 31, 2024, training for fine-tuning will cost $25 per 1 million tokens, with inference priced at $3.75 per 1 million input tokens and $15 per 1 million output tokens.
    • Images are tokenized based on size before pricing. The introduction of prompt caching and other efficiency measures could lower the operational costs for businesses deploying AI solutions.
    • The API is also being enhanced to include features like epoch-based checkpoint creation, a comparative playground for model evaluation, and integration with third-party platforms like Weights and Biases for detailed fine-tuning data management.
    • What does it mean? Admit it – you’re dying to know.
    • Developers can now create applications that not only process text or voice but also interpret and generate responses based on visual cues, fine-tuned for domain-specific applications. This update could lead to more intuitive user interfaces, where users interact with services using images as naturally as they do with text or speech, potentially expanding the user base to those who are less tech-savvy or who work in fields where visual data is crucial.
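    To make the bullets above concrete, here is a minimal sketch of what a vision fine-tuning training record might look like in the chat-message JSONL format, plus a back-of-the-envelope cost estimate at the per-token prices quoted above. The bolt-inspection prompt, image URL, and token counts are all made-up illustrations, not details from the episode or OpenAI's docs.

    ```python
    import json

    # Hypothetical training record in the chat-style JSONL format, with an
    # image_url content part alongside text (prompt and URL are illustrative).
    record = {
        "messages": [
            {"role": "system", "content": "Classify the bolt in the image as good or defective."},
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": "Inspect this part."},
                    {"type": "image_url", "image_url": {"url": "https://example.com/bolt-001.jpg"}},
                ],
            },
            {"role": "assistant", "content": "defective: thread damage on the upper third"},
        ]
    }
    jsonl_line = json.dumps(record)  # one record per line in the training file

    # Rough cost estimate at the prices quoted above:
    # $25 per 1M training tokens, $3.75 per 1M input tokens, $15 per 1M output tokens.
    def estimate_cost(training_tokens: int, input_tokens: int, output_tokens: int) -> float:
        """Return an estimated USD cost for one fine-tuning run plus inference."""
        return (
            training_tokens / 1e6 * 25.00
            + input_tokens / 1e6 * 3.75
            + output_tokens / 1e6 * 15.00
        )

    # e.g. 2M training tokens, then 1M input and 1M output tokens at inference
    print(estimate_cost(2_000_000, 1_000_000, 1_000_000))  # 68.75
    ```

    Remember that images are tokenized based on size before pricing, so the image side of a record can dominate the token count.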

    03:53 Jonathan – “I mean, I think it’s useful for things like quality assurance in manufacturing, for example. You know, could, you could tune it on what your nuts and bolts are supposed to look like and what a good bolt looks like and what a bad bolt looks like coming out of the factory. You just stream the video directly to, to an AI, AI like this and have it kick out all the bad ones. It’s kind of, kind of neat.”

    04:41 Introducing the Realtime API

    • OpenAI has launched its Realtime API in public beta, d...
    47 mins
  • 277: Class E IPs, so now you can procrastinate IPv6 even longer
    Oct 10 2024

    Welcome to episode 277 of The Cloud Pod, where the forecast is always cloudy! Justin, Ryan, and Matthew are your hosts this week for a news-packed show. This week we dive into the latest in cloud computing, with announcements from Google’s new AI search tools, Meta’s open-sourced AI models, and Microsoft Copilot’s expanded capabilities. We’ve also got Oracle releases, some non-liquid Java on the agenda (but also the liquid kind, too), and Class E IP addresses. Plus, be sure to stay tuned for the aftershow!

    Titles we almost went with this week:

    • Which cloud provider does not have Llama 3.2?
    • VMware says we will happily help you support your old Microsoft OSes for $$$$
    • Class E is the best kind of IP Space
    • Microsoft says trust AI, and so does Skynet
    • 3.2 Llamas walked into an AI bar…
    • Google gets cranky about MS Licensing, join the club
    • Write Your Prompts, Optimize them with Vertex Prompts Analyzer, rinse, repeat into a vortex of optimization
    • Oracle releases Java 23, Cloud Pod Uses Amazon Corretto 23 instead
    • Oracle releases Java 23, Cloud Pod still says run! MK

    A big thanks to this week’s sponsor: Archera

    There are a lot of cloud cost management tools out there. But only Archera provides cloud commitment insurance. It sounds fancy but it’s really simple. Archera gives you the cost savings of a 1 or 3 year AWS Savings Plan with a commitment as short as 30 days. If you don’t use all the cloud resources you’ve committed to, they will literally put money back in your bank account to cover the difference. Other cost management tools may say they offer “commitment insurance”, but remember to ask: will you actually give me my money back? Archera will. Click this link to check them out

    AI Is Going Great – Or How ML Makes All Its Money

    01:06 OpenAI CTO Mira Murati, 2 other execs announce they’re leaving

    • Listener Note: paywall article
    • OpenAI Chief Technology Officer Mira Murati is leaving, and within hours, two more OpenAI executives joined the list of high-profile departures.
    • Mira Murati spent 6.5 years at the company, and was named CEO temporarily when the board ousted co-founder Sam Altman.
    • “It’s hard to overstate how much Mira has meant to OpenAI, our mission, and to us all personally,” Altman wrote. “I feel tremendous gratitude towards her for what she has helped us build and accomplish, but most of all, I feel personal gratitude towards her for her support and love during all the hard times. I am excited for what she’ll do next.”
    • Mira oversaw the development of ChatGPT and image generator Dall-E. She was also a pretty public face for the company, appearing in its videos and interviewing journalists.
    • The other two departures were Barret Zoph, the company’s Vice President of Research, and Chief Research Officer Bob McGrew.

    02:26 Ryan – “Her reason for leaving is, you know, to take some time and space to explore and, you know, be more creative. I’m like, yeah, okay. they’re starting copy. Yeah. Yeah. Leaving for health reasons. You got fired.”

    - Copywriter Note: this is 100% copywriter-speak for “you either got fired, or will be soon and decided to step down.”

    03:38

    1 hr and 9 mins
  • 276: New from AWS - Elastic Commute - Flex Your Way to an Empty Office
    Oct 1 2024

    Welcome to episode 276 of The Cloud Pod, where the forecast is always cloudy! This week, our hosts Justin, Matthew, and Jonathan do a speedrun of OpenWorld news, talk about energy needs and the totally-not-controversial decision to reopen Three Mile Island, a “managed” exodus from the cloud, and Kubernetes news, as well as Amazon’s RTO mandate, which we’re calling “Elastic Commute.” All this and more, right now on The Cloud Pod.

    Titles we almost went with this week:
    • The Cloud Pod Hosts don’t own enough pants for five days a week
    • IBM thinks it can contain the cost of K8s
    • Microsoft loves nuclear energy
    • The Cloudpod tries to give Oracle some love and still does not care
    • The cloud pod goes nuclear on k8s costs
    • Can IBM contain the costs of Kubernetes and Nuclear Power?
    • Google takes on takeovers while Microsoft takes on nuclear
    • AWS Launches ‘Managed Exodus’: Streamline Your Talent Drain
    • Introducing Amazon WorkForce Alienation: Scale Your Employee Discontent to the Cloud
    • Amazon SageMaker Studio Lab: Now with Real-Time Resignation Prediction
    A big thanks to this week’s sponsor: We’re sponsorless! Want to get your brand, company, or service in front of a very enthusiastic group of cloud news seekers? You’ve come to the right place! Send us an email or hit us up on our Slack channel for more info.

    General News

    01:08 IBM acquires Kubernetes cost optimization startup Kubecost

    • IBM is quickly becoming the place where cloud cost companies go to assimilate? Or die? Or maybe be reborn? Either way, it’s not a great place to end up.
    • On Tuesday, IBM announced the acquisition of Kubecost, a FinOps startup that helps teams monitor and optimize their Kubernetes clusters, with a focus on efficiency – and ultimately cost.
    • This acquisition follows the acquisitions of Apptio, Turbonomic, and Instana over the years.
    • Kubecost is the company behind OpenCost, a vendor-neutral open source project that forms part of the core Kubecost commercial offering.
      • OpenCost is part of the Cloud Native Computing Foundation’s cohort of sandbox projects.
    • Kubecost is expected to be integrated into IBM’s FinOps Suite, which combines Cloudability and Turbonomic.
      • There is also speculation that it might make its way to OpenShift, too.

    02:26 Justin – “…so Kubecost lives inside of Kubernetes, and basically has the ability to see how much CPU, how much memory they’re using, then calculate basically the price of the EC2 broken down into the different pods and services.”
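    The allocation Justin describes can be sketched as a simple proportional split of a node’s price across its pods. This is a toy model, not Kubecost’s actual algorithm (which also weighs memory, GPU, network, and idle capacity); the hourly price and pod names below are made up.

    ```python
    # Toy cost-allocation sketch: split an instance's hourly price across pods
    # in proportion to their CPU requests. Real tools like Kubecost blend CPU,
    # memory, and other resources; this only uses CPU for illustration.
    def allocate_node_cost(node_price_per_hour: float, pod_cpu_requests: dict) -> dict:
        """Return each pod's hourly cost share, proportional to CPU requested."""
        total_cpu = sum(pod_cpu_requests.values())
        return {
            pod: node_price_per_hour * cpu / total_cpu
            for pod, cpu in pod_cpu_requests.items()
        }

    # A hypothetical 4-vCPU node at $0.192/hr running three pods
    shares = allocate_node_cost(0.192, {"api": 2.0, "worker": 1.0, "cron": 1.0})
    print(shares)  # api gets half the node cost, the others a quarter each
    ```

    The key design point is that the pod-level numbers always sum back to the node’s real bill, so the per-service costs stay honest against the cloud invoice.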

    AI Is Going Great – Or How ML Makes All Its Money

    05:03 Introducing OpenAI o1-preview

    • Reasoning LLMs have arrived this week. Dun Dun Dun…
    • The idea behind reasoning models is to take more time to “think” before they respond to you.
    • This allows them to reason through complex tasks and solve harder problems than previous mod...
    1 hr and 10 mins
