Big News from OpenAI: Two New Models!
OpenAI just announced two new models with better performance at a much lower cost. This video breaks down everything you need to know: What the new models do, how they […]
Jeffrey Spyropoulos: Making Analytics Count at JCP (Jim Griffin)
Tapan Khopkar: A ‘MasterClass’ in Marketing Mix (Jim Griffin)
Aida Farahani: From 2D to 3D in Seconds (Jim Griffin)
Nikhil Patel: Inside Sally Beauty’s Data Strategy (Jim Griffin)
Victor Perrine: From Bananas to $Billions (Jim Griffin)
Ray Pettit: New Models for AI Literacy? (Jim Griffin)
Ivan Pinto: A Year of AI Testing in Software Dev (Jim Griffin)
Sam Marks: Big Data, Big Bad Bruins (Jim Griffin)
Stable Code 3B is the latest entrant in the field of text-to-code, and it has been hailed in the technology press as a new leader.
This video compares the mostly positive coverage from bloggers and journalists with the mostly skeptical reactions from the programmer community on forums like Y Combinator’s Hacker News.
In fact, it’s not clear that any 3B model can really be classified as a leader in text-to-code today, and I explain why.
Along the way, I uncovered a thread where Emad Mostaque, the founder and CEO of Stability AI, was personally defending his company against a small group of naysayers.
As you’ll see, the conversation took an unexpected turn when Emad justified his company’s leadership position, in part, with his claim that “stable diffusion turbo does like 100 cats with hats per second.”
After that, the back-and-forth on Hacker News was quite amusing, as you will see, and maybe I added a bit of fuel to the fire as well with the closing sequence of this video.