Links mentioned: Details automatically generated.
summary

Speaker is checking out the limited public beta of GitHub Models. It takes you from a playground to coding with the models in Codespaces and VS Code to production deployment via Azure, and Speaker wants to sign up. The blog post explains how the development of AI applications is changing the way people create software. GitHub is launching GitHub Models, which will allow developers to build with industry-leading AI models. Speaker is interested in how GitHub is incorporating models like Meta's Llama 3.1 and OpenAI's GPT-4o.

The playground is free and provides a glide path to bring the models into the developer environment in Codespaces and VS Code, then to production in the cloud with Azure AI. Speaker likes the look of the limited public beta of GitHub Models and wants to check it out. Speaker pauses the announcement video to walk through the code sample it shows.

The code sample uses an SDK client and its chat completions call to talk to the model.

The stream opens with the "Welcome Back" theme song. Speaker is starting some laundry. They bought a better version of a product that had been recalled, hoping to get some cool shots when they travel, and watched some videos on YouTube for inspiration. Speaker shares a link to GitHub's blog post introducing GitHub Models, a new generation of AI engineers building on GitHub. They have been messing around with some models for a side project and want to be able to do things in the browser.

Speaker is excited about GitHub Models. GitHub is enabling the rise of the AI engineer with GitHub Models, bringing the power of industry-leading large and small language models (LLMs and SLMs) to more than 100 million users directly on GitHub. Speaker reads through what GitHub has to say. Speaker has had this topic on their list for a while.

The sample asks the model "What is the capital of France?" with the role of a user. Behind the scenes, the SDK is making a fetch call. Speaker would like to see some documentation showing how the responses come back.

Speaker wants to know if the request format is the same for all the models across the board. In the new interactive model playground on GitHub, students, hobbyists, and startups can explore popular private and open models from Meta, Mistral, the Azure OpenAI Service, Microsoft, and others with just a few clicks and keystrokes.

GitHub lets people experiment, compare, test, and deploy AI applications right where they manage their source code. No prompts or outputs in GitHub Models are shared with model providers, nor are they used to train or improve the models.

Speaker is wondering how GitHub can offer the GPT-4o model, guessing it may act as a proxy for OpenAI and charge on the back end.

The demo asks the model to help plan a lesson around a word-based question: Sarah has 15 apples and gives seven apples to her friend.

The model returns a lesson plan for teaching addition and subtraction using the word-based question. Professor David J. Malan will be putting GitHub Models to the test in Harvard CS50 this fall. This is the first wave of GitHub Models; more language, vision, and other models will be added to the platform. GitHub Models have launched on the GitHub Marketplace. The demo tries out one of the newest models, GPT-4o, then switches to Phi-3 Mini Instruct to compare how a smaller model handles the same scenario. By clicking the code button, you get access to some getting-started instructions.

The demo starts experimenting in code. It updates the streaming sample to use Phi-3 Mini 4K Instruct and updates the prompt. Speaker notices that the messages differ between the GPT-4o sample and the Phi-3 Mini 4K one, so it is on you to keep them straight when you switch models. The demo then uses AI models to create questions for first-year computer science students and explains how GitHub Models helps minimize the friction as you explore and experiment with AI models.

The demo summarizes the last 10 commits of the repository using Phi-3 Mini Instruct from the GitHub CLI, and Speaker wonders whether it could also write the commit message for a PR. It shows how to run prompt evals in GitHub Actions with a series of JSON files piped into GitHub Models, and how to go to production with Azure AI by replacing the GitHub personal access token with an Azure subscription and credential. Speaker wants to see more examples. The next break session is the final one. Speaker sends everyone off with some traveling music, a track called Thank You by Nujabes featuring Apani B, and will be back on stage in 15 minutes at the hour. Speaker thanks everyone for hanging out.

topics
  • blog post
  • github models
  • language models
  • instruct
  • code spaces
  • full stack
  • linear path
  • interactive model playground
  • github marketplace
  • system prompt
  • code button
transcript

Welcome back.

Your dreams were your ticket out.

Welcome back to that same old place that you laughed about.

Well, the names have all changed since you hung around.

But those dreams have remained and they have turned around.

Who'd have thought they'd lead you.

Who'd have thought they'd lead you.

Yeah, back here where we need you.

Back here where we need you.

Yeah, we tease him a lot because we got him on the spot.

Welcome back.

Welcome back.

Welcome back.

All right.

Get stretch.

Okay.

That's good.

Oh, that's awesome.

All right, welcome back.

Hopefully that last break session, last work session was a good one for you.

I am starting some laundry.

So I have just bought something.

It's kind of to replace a thing that I purchased before that I had a lot of fun with, but then it got like recalled, I guess.

So I am like, I am not going to use this anymore.

And so there's a newer version, better version by a different manufacturer.

And I think it like will pack up easier.

So when I go out.

To do like events or whatever when I travel, I will be able to get some cool shots.

And yeah, so I was kind of like watching videos on it.

And, so I kind of went down like a rabbit hole.

Not too productive, but maybe in the future it will help me get some cool shots or whatever.

So yeah, neither here nor there.

So I wanted to talk about something that I have had in like the in the list of things to talk about for a minute.

But I just had other things I wanted to talk about before that.

So I figured now is as good a time as any to check it out.

And I put a link to it.

And so you'd be able to check it out for yourself next to the recording of the video on the website.

But what I am talking about, and it has to do with like AI models and things.

So, Let us check this out.

So GitHub.

So now this blog post is introducing GitHub models, a new generation of AI engineers building on GitHub.

Cool.

And, so I guess they're really getting into the AI engineer stuff.

So kind of interesting to see what they have planned here.

I kind of looked at it before, but I didn't really dive too deep into it.

I have been messing around with some models.

Previous, some AI, like LLMs and all these things.

But a lot of it is for a side project, but I want to be able to do things in the browser.

So I spoke before about Transformers.js.

And it's kind of limited on the different models that it can do in the browser.

So I have been kind of playing around with that mostly.

I haven't really done too much other AI stuff because the stuff I build doesn't really need AI.

Like I showed the video to GIF converter.

That didn't need any AI in it.

You know what I am saying?

So, so yeah, so we will see what they have here.

You know what I am saying?

It's kind of interesting to see.

So, Here we go.

All right, so we are enabling the rise of the AI engineer, all right, with GitHub models, bringing the power of industry leading large and small language models, LLMs like I said before.

I never heard about small language models, SLMs, hmm.

To our more than 100 million users directly on GitHub.

Wow.

Okay.

So introducing this here.

And so they have some of the models that I guess they have that they, that you be able to kind of work with.

So there's the Meta Llama 3.1, OpenAI GPT-4o.

This one, yeah, from Microsoft, yeah, the Microsoft logo, yep.

The Phi-3, Phi, I guess I would say it.

Medium Instruct, okay.

All right.

Some of these I have never seen before is kind of cool.

Mistral I have seen before.

That's cool.

All right.

So, let us see what they say here.

Shout to Thomas.

And so, yeah, so again, this is, I have had this for a while.

It's like September 1st now.

So, yeah.

All right.

Cool.

So here we go.

So we believe every developer can be an AI engineer with the right tools and training from playground to coding with the model in code spaces.

That's the online, in-browser, basically VS Code that they have. If you're in a GitHub repository, and I don't know if you have to hook anything up, I don't think I did anything to hook it up, I can press the period key and it will open up the code in like an in-browser thing, and I can make the changes and all. It's pretty dope. So yeah, to production deployment via Azure. All right, GitHub Models show you how simple it can be. All right.

Sign up for the limited public beta here.

So if you want to sign up, you can.

So from the early days of the home computer, they're taking it back.

The dominant mode of creation for developers has long been building, customizing, deploying software with code.

True.

Today, in the age of AI, a secondary and equally important mode of creation is rapidly emerging, the ability to leverage machine learning models.

Increasingly, developers are building generative AI applications where the full stack contains backend and front-end code plus one or more models.

But a vast segment of developers still lack easy access to open and closed models.

This changes today.

All right, dope, dope, dope.

So now we are launching GitHub models, enabling our more than 100 million developers to become AI engineers and build with industry leading AI models.

All right, so I guess they're going to be kind of like the in-between where you be able to mess around with the models.

Kind of like how I guess I showed a few weeks ago.

Whenever that was, that Cloudflare, they had like the little playground where you could like literally just click and drag like little boxes and then connect them together and kind of play around with a different model.

So I am interested to see how they are incorporating it.

I will get up.

I want to say they, GitHub is incorporating it.

All right, so from Llama 3.1, that we said, the Meta one, to GPT-4o, OpenAI.

Oh, there's a mini, GPT-4o mini, to the Microsoft one, to the Mistral, all right.

You can access.

Each model via a built-in playground.

Okay, that lets you test different prompts and model parameters for free, right in GitHub.

So I guess you had to have to sign up first, and then I guess it's free.

Okay.

And if you like what you're seeing on the playground, we have created a glide path?

Glide path.

Okay.

I have .

Normally it's like a happy path.

Like, glad.

I have never heard of that path.

Okay, cool.

To bring the models to your developer environment in Codespaces and VS Code.

All right.

So like online or like on your computer.

That's dope.

And once you're ready to go to production, okay, Azure AI offers built-in responsible AI, enterprise-grade security and data privacy, and global availability with provisioned throughput and availability in over 25 Azure regions for some models.

It's never been easier to develop and run your AI application.

Yeah, I guess it's kind of important that you need to have the models in the region to make it quicker, I guess.

And then I guess some places you can't have like data, I guess, going out of the region.

So it's probably a good idea that they have it within there.

So that's cool.

And here's a video.

So let us check this out.

So limited public beta of the GitHub models.

Let me see all this stuff here again.

Let us check it out.

Anytime there's some code, I got to go check this out.

Hold on.

What is it saying?

What are you saying here?

What does that pop up?

All right, so we got a response.

We're going to wait.

Okay, so it's like an SDK.

So the client, the chat component of it, the completions create.

So you're going to create something, all right?

And you're using the model GPT-4o, the content.

So you're kind of like talking to the AI and setting it up.

So, "What is the capital of France?" and then the role of a user. And so then you're awaiting that?

So behind the scenes, I am guessing is making like a fetch call, like an API call.

So this way you don't have to do that and all the, I guess, the authentication and things.

So you don't have to worry about all that.

You just, I am assuming you probably have to initialize the client and then authorize it with like some sort of credentials, and that allows you to do all this.

And then what comes back?

So there's a few, I guess, a few choices that could come back.

So you can get the first one, the message and the content of that message.

Okay.

I hope they have like some sort of like documentation to show like the breakdown of how the responses come back in like what form.

I am assuming so because trying to guess that would be terrible.

All right, so yeah, so that seems pretty straightforward.

And I guess you would just change this model here.

All right.

I wonder if all the models have the same, as in you pass in a role and you pass in a content?

I wonder if, behind the scenes, they're kind of like making that the same across the board, maybe.

All right.

So, okay, cool.
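Just to picture it for myself, here is a rough sketch of what a call like that probably looks like, assuming the OpenAI JavaScript SDK pointed at a GitHub Models inference endpoint with my GitHub token standing in as the API key. The endpoint URL, environment variable name, and model ID here are my assumptions, not something shown in the video.

```ts
// Minimal sketch, assuming the OpenAI JS SDK can be pointed at the GitHub Models
// inference endpoint and that a GitHub personal access token works as the API key.
// The endpoint URL, env var name, and model id below are assumptions, not from the video.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://models.inference.ai.azure.com", // assumed GitHub Models endpoint
  apiKey: process.env.GITHUB_TOKEN,                 // assumed: GitHub token stands in for an API key
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is the capital of France?" },
  ],
});

// Like the sample shows: the reply comes back in choices[0].message.content.
console.log(response.choices[0].message.content);
```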

So let us check out the rest of the video.

All right, that looks pretty cool.

So yeah, so it looks like you still have, you do have the sign up.

All right, cool.

All right, so the joy begins in the model playground on GitHub.

For most of us, learning to be a developer didn't happen on a linear path in the classroom.

It took practicing, playing around, and learning through experimentation.

Okay, that's period.

The same is true today for AI models. In the new interactive model playground, students, hobbyists, startups, and more can explore the most popular private and open models from Meta, Mistral, the Azure OpenAI Service.

Okay, so yes, because they're going through, I guess, Azure Open AI service.

Okay.

Microsoft and others with just a few clicks and keystrokes.

Experiment, compare, test, and deploy AI applications right where you manage your source code?

That's cool.

So then I noticed Google's not here.

Interesting.

Well, I mean, maybe they're under and others.

I don't think, I don't remember seeing Google in here.

Okay, interesting.

All right.

Cool.

In alignment with GitHub and Microsoft's continued commitment to privacy and security, no prompts or outputs in GitHub Models will be shared with model providers.

Interesting.

Nor used to train or improve the models.

Okay, so meta is not going to get your prompts.

Mistral is not.

Interesting.

So then how does it?

So, because I know people pay for.

OpenAI, right?

The, what's the name of the model again?

The GPT-4o.

I know people pay for that.

So, hmm, because, so I guess you're not putting in your API keys for OpenAI.

You'd just be using, so maybe it's a proxy through GitHub, and then maybe they will charge you on the back end for making those calls.

I wonder how that works.

Hmm.

Interesting.

Because like the prompts or outputs in GitHub models will not be shared with model providers.

So they're, I guess they're hosting those things, but what if they're closed source?

I don't know how that works.

Interesting.

All right, then we have this here, which is kind of small.

Okay, so it's a thing that says, please help me plan the lesson around this word-based question.

All right, word, yeah, word.

Sarah has 15 apples.

She gives seven apples to her friend.

How many apples does Sarah have?

So we're looking for a lesson plan around that.

Cool.

All right, GPT-4o says, sure, here's a simple lesson plan for teaching addition and subtraction using the word-based question.

Okay.

And lesson plan addition and subtraction, objective students will be able to solve word problems involving addition and subtraction.

Okay, the things you will need: whiteboard and markers, pencil and paper, paper and pencils.

Counters or physical objects, for example, like the apples or blocks, okay, worksheet with similar word problems.

Okay, and they have like a warm-up activity.

Okay, and they kind of break it down by time.

That's interesting.

So, Professor David J. Malan.

We will be putting GitHub models to the test in Harvard CS50 this fall to enable students to experiment with AI all the more easily.

Cool.

So now test and compare different models.

Okay, so every piece of software is unique.

Yeah, yeah, yeah, yeah.

Okay, so Mistral offers low latency while GPT-4o is excellent at building multimodal applications that might demand audio, vision, and text in real time.

Okay.

That's good to know.

So with the suite of models, developers will have all the options they need to stay in the flow, experiment more, and learn faster than ever before.

All right, bars.

Let us see.

And this is just the first wave.

In the months ahead, as we approach the general availability of GitHub models, we will continue to add more language, vision, and other models to our platform.

Okay, so spin up Codespaces.

So let us see if this will allow you to, check this video.

Maybe this is them actually using it.

Here we go.

We believe that every developer can be an AI developer with the right tools and training.

That's why we have launched GitHub models on the GitHub marketplace.

A place for you to.

Okay, so they have a marketplace.

Interesting.

Interesting.

Okay, so it's kind of like hugging face, I guess, a little bit.

Okay.

All right, cool, cool, cool, okay.

Explore and experiment with a handpicked collection of top models with entitlements. So let us try out one of the newest models, GPT-4o.

We can pass in an initial user prompt to interact with the model, and we can use the playground to adjust the parameters.

For example, we can change the system prompt and experiment with the temperature for more or less randomness.

Now that we have adjusted the parameters, we can send user prompts to interact with the model simulating its use for our application.

Now, GPT-4o looks great, but I want to test how a different and smaller model handles this scenario.

So let us switch to Phi-3 Mini Instruct and evaluate its response.

Looks good! Now, we can also navigate to the details page to find out more information about the model through the ReadMe, Evaluation and Transparency tabs.

That's cool.

Now, I have made my decision, and I want to start using the models. By clicking the code button, we get access to some getting started instructions.

We can even use a GitHub code space to create a pre-configured development environment, ready, with SDKs.

I don't know, that was fast.

Hold on now.

Come on now.

You just click that.

Hold on.

Is this in the browser, in Codespaces?

Hold on.

Hold on here.

Configured to.

Wait, wait, what?

Here you go.

Here you go.

Okay, let us see.

We can even use a GitHub code space to create a pre-configured development environment, ready, with SDKs and some samples.

Now the model.

I wonder if that's online, because code spaces, I thought, was like an online in-browser, like, VS code?

Hmm.

The API calls use the entitlements. And because the Codespace knows who I am, I don't have to add an API key or sign up to any other services to start experimenting in code.

That's cool.

So yeah, so since you're already signed in with the GitHub, then it already knows everything.

So that's cool.

That's good.

I was kind of wondering about that.

Okay.

Talking about experimenting, let us update the streaming sample to use Phi-3 Mini 4K Instruct. We will also update the prompt and continue with our.

See, okay, so that was different from before because the GPT-4o, let me see here.

Yeah, GPT-4o.

So you have these messages here, a role system and role user.

But then we switched over to the Phi-3, the messages like changed.

So I guess it is on you.

To make sure that it's different, interesting.

Okay.

Talking about experimenting, let us update the streaming sample to use Phi-3 Mini 4K Instruct.

We will also update the prompt and continue with our experiments.
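And here is my rough guess at what that streaming sample boils down to, with the same assumptions as before about the endpoint and token; the Phi-3 model identifier is also a guess on my part, not read off the screen.

```ts
// Rough sketch of a streaming call, assuming the same OpenAI JS SDK setup as above.
// "Phi-3-mini-4k-instruct" is my guess at the model identifier, not confirmed from the video.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://models.inference.ai.azure.com", // assumed GitHub Models endpoint
  apiKey: process.env.GITHUB_TOKEN,                 // assumed GitHub token as the key
});

const stream = await client.chat.completions.create({
  model: "Phi-3-mini-4k-instruct",
  messages: [
    { role: "user", content: "Write three quiz questions about Git for first-year CS students." },
  ],
  stream: true,
});

// Print tokens as they arrive instead of waiting for the full completion.
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```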

Now, those entitlements we mentioned earlier follow you wherever you're logged in, including the GitHub CLI.

Here, we're calling AI models to create questions for first-year computer science students.

Okay, that's pretty dope, so you can do this.

Okay, sure.

So, okay, that's pretty dope, so you can do it from the CLI.

All right, all right, that's pretty cool.

Okay.

It's probably good if you had like a quick question or something.

Or maybe you want to do something if you're already in the CLI.

That's pretty dope.

That can check their knowledge of Git.

But we can even combine other CLI commands with the GitHub CLI.

For example, let us summarize the last 10 commits of this repository using Phi-3 Mini Instruct.

GitHub Models helps minimize the friction as you explore and experiment with AI models.

Wait, what did I do?

So, this thing got in the way. For example, let us summarize the last 10 commits of this repository using Phi-3 Mini Instruct.

Well, that's pretty cool, so you're able to, okay, and then I am wondering if you can have like, hey, write me the PR message for this.

That's cool.

Like the commit message for this PR.

All right, that's pretty dope.
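The demo does that straight from the gh CLI; I am not sure of the exact CLI syntax, so here is the same idea sketched with the SDK instead, under the same endpoint and token assumptions as before, with a guessed model identifier.

```ts
// The demo pipes git output into a model from the gh CLI; this is the same idea via the SDK,
// since the exact CLI subcommand isn't shown. Endpoint, env var, and model id are assumptions.
import { execSync } from "node:child_process";
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://models.inference.ai.azure.com", // assumed GitHub Models endpoint
  apiKey: process.env.GITHUB_TOKEN,                 // assumed GitHub token as the key
});

// Grab the last 10 commits of the current repository.
const commits = execSync("git log -10 --oneline", { encoding: "utf8" });

const response = await client.chat.completions.create({
  model: "Phi-3-mini-4k-instruct",
  messages: [{ role: "user", content: `Summarize these commits:\n${commits}` }],
});

console.log(response.choices[0].message.content);
```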

All right, so then once you're ready, it's a breeze to get things up and running, use the knowledge you have gained from the playground and code spaces and set up a prototype or proof of concept within your own applications.

Run prompt evals in GitHub Actions with a series of JSON files that you just pipe into GitHub Models.

Okay.

So there's a Copilot extension.

Interesting.

And finally, go to production with Azure AI by replacing your GitHub personal access token with an Azure subscription and credential.

Okay.

So he's not going to do that for you.

Okay.

Cool.
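If I had to guess what that production swap looks like in code, it is roughly the same call with the endpoint and credential changed. This is just a sketch, assuming the Azure AI endpoint speaks the same chat completions API; the endpoint and key environment variable names are hypothetical placeholders, not from the video.

```ts
// My guess at the "go to production" swap: same chat completions call, but pointed at an
// Azure AI endpoint with an Azure key instead of a GitHub token. The env var names below
// are hypothetical placeholders; check the Azure AI docs for the real configuration.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: process.env.AZURE_AI_ENDPOINT, // assumed: your deployed model's Azure endpoint
  apiKey: process.env.AZURE_AI_KEY,       // Azure credential instead of GITHUB_TOKEN
});

const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "What is the capital of France?" }],
});

console.log(response.choices[0].message.content);
```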

Hmm.

Keep reading.

Cool.

All right.

Dope.

Okay.

Very nice.

So I may, uh, hmm.

Interesting.

May have to check that out.

I will probably look around for some more, like, some other videos on it and see what's up.

Come on.

I want to see more examples of it.

All right.

Cool.

So let me get y'all back to work.

Stop screen share, yes.

So the next break session is the final break session.

So we will take a look at an artist's website, listen to some tracks, maybe watch some videos, and then maybe you will become a fan.

So until then I am going to send you all off with some traveling music.

This is Nujabes, featuring Apani B.

The track is called Thank You.

Thanks for hanging out.

See you 15 minutes at the hour.

Till then, I wish you much productivity.

All right.

Y'all be easy.

And peace.

Thanks.

Let me start from the heart.

I respond to state.

I thank you all the show will luck to me.

I appreciate this unique opportunity.

I am going to rock so hard you never stop supporting me.

I am going to show you how music is more meaning to life.

I thank you all this show will love to me.

I appreciate this unique opportunity.

I am going to rock so hard you never stop supporting me.

I am gonna show you how music is more meaning to life.

But let me start from the heart, I respond to stay.

I thank you all the show of luck to me.

I appreciate this unique opportunity.

I am gonna rock so hard you never stop supporting me.

I am gonna show you how music is more.