Get started with xAI and Grok
An introduction to xAI and the Grok foundational models, with sample code.
Grok is xAI's new foundational AI model. And as of today, you can get $25 in credits every month to build with it.
In this post, I'll show you how to build a simple app with Grok and the Vercel AI SDK. We'll build a full-stack application using Astro, and deploy it to Cloudflare Pages. If you're interested in seeing the full source code for what we build, check it out on GitHub.
Create a project
First, we'll create a new project using the npm create cloudflare CLI, selecting the "Astro" framework:
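Here's roughly what that looks like (the prompts can vary slightly between create-cloudflare versions):

```sh
# Scaffold a new project named grok-starter; choose the Astro framework when prompted
npm create cloudflare@latest grok-starter
```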
This will create a new project in the grok-starter directory.
Add Vercel AI SDK
Next, we'll add the Vercel AI SDK and the xAI provider to our project.
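Assuming the official xAI provider package for the AI SDK, the install step is something like:

```sh
npm install ai @ai-sdk/xai
```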
Let's define an API endpoint that we can use to call the AI model:
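Here's a rough sketch of what that endpoint could look like. The file path, the XAI_API_KEY variable name, and the grok-beta model id are my assumptions, not taken from the project's code:

```ts
// src/pages/api/generate.ts (sketch: the path and names here are assumptions)
import { createXai } from "@ai-sdk/xai";
import { generateText } from "ai";

export const POST = async ({ request, locals }: any) => {
  const { prompt } = await request.json();

  // With the Cloudflare adapter, values from .dev.vars (locally) and Pages
  // secrets (in production) are available on locals.runtime.env
  const xai = createXai({ apiKey: locals.runtime.env.XAI_API_KEY });

  const { text } = await generateText({
    model: xai("grok-beta"),
    prompt,
  });

  return new Response(JSON.stringify({ text }), {
    headers: { "Content-Type": "application/json" },
  });
};
```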
Note: this function is typed as any to simplify the tutorial. In the full code, the function is properly typed.
Get an API key
To use the Grok model, you'll need an API key. You can get one by signing up for an account at console.x.ai. After you've confirmed your email address, you can generate an API key.
Add the API key to .dev.vars in the root of your project:
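Something like this, where XAI_API_KEY is the variable name assumed by the sketches in this post:

```sh
XAI_API_KEY="your-xai-api-key"
```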
Run locally
Now we can run the app locally:
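With the scripts the template generates, that's simply:

```sh
npm run dev
```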
To generate our first text from Grok, we can send a POST request to the /api/generate endpoint.
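For example, with the dev server on Astro's default port, a request could look like this (the {"prompt": ...} body shape matches the sketch endpoint above and is an assumption):

```sh
curl -X POST http://localhost:4321/api/generate \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Write a haiku about the moon"}'
```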
Create a frontend interface
Now we can create a simple frontend interface to interact with our API. In src/pages/index.astro, we'll replace the content generated by Astro with basic styling, and add a form with an input and a button:
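A minimal sketch of that markup; the element ids are placeholders I've chosen, not taken from the project:

```html
<form id="prompt-form">
  <input id="prompt" type="text" placeholder="Ask Grok something..." />
  <button type="submit">Generate</button>
</form>
<p id="response"></p>
```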
This form will render, but submitting it won't do anything. Let's write a basic function that will take the text from the input and send it to our API endpoint:
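A sketch of that handler, wired to the placeholder ids above and the /api/generate endpoint:

```html
<script>
  const form = document.querySelector("#prompt-form");
  const input = document.querySelector<HTMLInputElement>("#prompt");
  const output = document.querySelector("#response");

  form?.addEventListener("submit", async (event) => {
    event.preventDefault();

    // Send the prompt to our API endpoint and render the generated text
    const res = await fetch("/api/generate", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ prompt: input?.value }),
    });
    const { text } = await res.json();
    if (output) output.textContent = text;
  });
</script>
```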
The complete implementation combines the markup and the script above; you can see it in full in the project's source on GitHub.
The final UI can accept prompts and generate responses.
Deployment
Now that the UI and endpoint work correctly, we can deploy the application. To do this, we'll use Cloudflare Pages. This is a great way to deploy a static site, and it's free for small projects.
If you haven't created a Cloudflare account already, you can do so here; once you have one, you'll be able to create a new project. To authenticate with the CLI, run:
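Using the Wrangler CLI that ships with the project:

```sh
npx wrangler login
```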
Finally, we can deploy our application by running:
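Depending on the scripts the template generated, this is roughly the following; the ./dist output directory is Astro's default and an assumption here:

```sh
npm run build
npx wrangler pages deploy ./dist
```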
This will build the site, and deploy it to Cloudflare Pages.
Before we can use it in production, we need to set up the xAI API key that we generated earlier. To do this, run the following command:
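With Wrangler, that's a Pages secret; the XAI_API_KEY name matches the earlier sketches and is an assumption:

```sh
npx wrangler pages secret put XAI_API_KEY
```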
After setting the secret, you should deploy one more time, to ensure that the secret is available to your application:
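That's the same deploy step as before:

```sh
npm run build
npx wrangler pages deploy ./dist
```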
After doing this, you can visit the site and test it out!
Securing the AI endpoint with AI Gateway
We have one more trick up our sleeve. We can use Cloudflare AI Gateway to help protect our application from abuse. Since we're essentially deploying an unprotected API endpoint that would allow anyone to use xAI/Grok via our application, it would be easy for a malicious actor to use up all our credits!
AI Gateway allows you to protect your API endpoints from abuse. It works by rate-limiting requests, caching responses, and adding useful logging/analytics to your AI endpoints.
Here's how to integrate it into your application:
- Visit AI Gateway in the Dashboard and enable it.
- Create a new AI Gateway by clicking the "Create Gateway" button.
- Select the "API" button, and find the "Grok" endpoint option.
- Copy this value.
This AI Gateway endpoint is what we will proxy our AI requests through.
In the src/pages/api/openai.ts file, we'll update the URL to use this endpoint:
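With the AI SDK, that amounts to pointing the provider's baseURL at the gateway. This is a sketch: take the exact URL (account id, gateway id, and path) from the value you copied in the dashboard:

```ts
// Sketch: proxy requests through AI Gateway instead of calling api.x.ai directly.
// Replace ACCOUNT_ID and GATEWAY_ID with the values from your AI Gateway dashboard.
const xai = createXai({
  apiKey: locals.runtime.env.XAI_API_KEY,
  baseURL: "https://gateway.ai.cloudflare.com/v1/ACCOUNT_ID/GATEWAY_ID/grok/v1",
});
```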
Redeploy your application, and try issuing a request from the UI to your AI endpoint. You should begin seeing logs generated in the AI Gateway console.
Finally, we can turn on a few settings to make AI Gateway securely protect our API endpoint:
- "Cache Responses". This will cache any response from Grok that matches a previous prompt.
- "Rate Limit Requests". This will limit the number of requests that can be made to the API endpoint from any given IP address. You can tweak this to be, for instance, 10 requests per minute.
Conclusion
I'm impressed with xAI and the Grok model! It's pretty smart, and it's easy to integrate into applications. The $25 free monthly credit they've announced is awesome, and I'm excited to keep building with it. In the blog post, they mention support for tool calling and system prompts. This will probably get integrated into the Vercel AI SDK soon, so it will be another great model to have in the toolbelt while building AI apps.
If you're interested in seeing the full source code for this project, check it out on GitHub!