Transcript
Hello there. I’m Mike Amundsen, and this talk is "ALPS + AI = API." I know that’s a terrible title. We’ll talk about that in just a second. But first of all, this is me. This is how you can find me on LinkedIn, GitHub, X, Mastodon, Bluesky, and any place where there’s lots of social activity. I’d love to connect with you and learn from you. I open all my talks the same way: most of the material I’m showing you today, I learned from someone else. I would love to now learn from you and be able to share that as well, so please connect with me and tell me what’s interesting to you, what’s challenging, what you’ve learned, and how maybe I can learn from you. So that’s how you can find me.
So now let’s get to the point. This was my real title, but it’s too long, "Let’s create a functional API prototype using only an API story document and some GenAI prompts." So this is going to be lots and lots about GenAI and a handful of prompting advice. But that was kind of long, so I shortened it. I appreciate you watching the video anyway. Hopefully, this will be interesting.
So this whole idea of API story, what is an API story and why are we going to use it? I’m going to use the API story to generate a description document called ALPS, Application-Level Profile Semantics. It’s similar to OpenAPI, but it’s a bit more generic, and we’ll talk about why that is in just a bit. And then we’ll use that description output to create a working NodeJS API, a minimalist proof of concept, somewhat naive kind of API. Stories, description, and API. Here we go.
Now, before we get into all of the prompts and all the other fun stuff, we’ve got one more thing to deal with, and that is this: Application-Level Profile Semantics, or the ALPS specification. The ALPS specification was created more than 10 years ago as an experiment in how to create a fully descriptive single document that could work with any RESTful format as well as others, such as AsyncAPI, RPC, and so forth. It’s literally a format for clarifying the application-level meaning and structure of interfaces. It’s a very generic, general-purpose interface description language.
As I said, it was first used as an experiment in 2011 at an event called RESTFest in the U.S., and it’s actually based on the XHTML Meta Data Profiles (XMDP) format from 2003. So the whole idea of this and where it comes from is basically about 20 years old, but it really applies well, especially in the age of bots. Application description documents are very handy for chatbots.
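To make that concrete, here is a minimal sketch of what an ALPS document looks like. The overall shape (a top-level `alps` object holding `descriptor` entries with `id`, `type`, `rt`, and `doc`) follows the ALPS specification, but the particular descriptor ids and doc text below are illustrative, not taken from the demo in this talk:

```json
{
  "alps": {
    "version": "1.0",
    "doc": { "value": "A simple task management profile (illustrative)" },
    "descriptor": [
      { "id": "title", "type": "semantic",
        "doc": { "value": "Title of a task record" } },
      { "id": "taskList", "type": "semantic",
        "doc": { "value": "A collection of task records" } },
      { "id": "goTaskList", "type": "safe", "rt": "#taskList",
        "doc": { "value": "Retrieve the list of tasks" } },
      { "id": "doCreateTask", "type": "unsafe", "rt": "#taskList",
        "doc": { "value": "Create a new task record" } }
    ]
  }
}
```

Notice there is nothing protocol-specific here: `safe` and `unsafe` describe the nature of an action, not an HTTP method, which is what lets the same profile drive REST, AsyncAPI, or RPC outputs.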
Now, it turns out, there’s lots of tooling around this ALPS space, but not a lot of people know about it. There’s an online editor that looks a little bit like SwaggerHub. There’s a command line utility for authoring and validating them. There’s actually a very extensive large language model prompt library that you can access. And OpenAI even has an ALPS assistant. Now, that’s a lot of stuff. That’s the diagramming, the actual editor system, the command line app, this prompt library, which we’ll meet up with in just a second, and the ALPS assistant, all very handy in working with this description language.
Now, it turns out, all of these great tools come from the mind of the same person, Akihito Koriyama in Japan. He and his organization use the ALPS format and these ALPS tools extensively in designing and building their applications. So, who is Akihito Koriyama? He’s the creator of a PHP framework called BEAR.Sunday, and he does lots of API architecture and REST work, specifically through this notion of using ALPS. Now, I interviewed him and asked him what his goal is, what he’s working on, and why it’s interesting to him, and he had a really interesting reply. He basically said that business requirements have always been interpreted and formalized by engineers, and he wants to close that gap. And he uses these tools to close it. I think that’s really fascinating.
All right. We’ve got sort of the basics. We’ll see more of ALPS as we go along. We’re not going to get too deep into it. You can find some information about ALPS online if you like.
Let’s get right to this notion of writing API stories. So, what do I mean by API stories? Well, every API starts with a story. We need a certain feature. Our customers request some functionality. You’ve got an idea. These are all stories, and stories are shared understanding. Our brains are wired for stories, not data. People tell stories to each other. Stories are accessible and repeatable. You get on an elevator, somebody asks you what you’re working on, and you can tell a single short story in one line. "We’re building a person API so that it’s easier to keep track of people who are really important to us." "We’re updating the customer service." And so on, and so forth. API stories are the bread and butter of every single design.
Now, this is what an API story looks like when I build one. It’s a little bit like an extended user story. That first line, "We need to track Task records in order to improve the timeliness and accuracy of our customer follow-up," reads kind of like a user story, but it’s a little bit more terse. And then there’s more information, all of the data elements we’re going to need for this task management service, all the possible actions, List and Create and Update and MarkComplete, and so on and so forth, all captured on this single-page document. And notice there’s nothing here about methods or HTTP or URLs or anything like that. It’s simply plain text. And that means anybody can contribute. You can share this with domain experts, engineers, the CEO, the person who just arrived brand new to the company today, and they can interact with you on this document.
Now, here’s a more distilled version of that. This looks a lot more like a user story. As a product manager, I need a task tracking system. These tasks have the following data elements. Users should be able to do the following things. And there’s a little bit extra about filtering services. So that’s a pretty good API story right there.
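Pulling those pieces together, a single-page API story might look something like this. This is a reconstruction of the kind of document described above; the specific data element names are my assumptions, since the slide itself isn’t reproduced here:

```text
As a product manager, I need a task tracking system in order to improve
the timeliness and accuracy of our customer follow-up.

Data elements:
  id, title, description, dueDate, status

Actions:
  List, Create, Update, MarkComplete

Notes:
  Users should be able to filter the task list (for example, by status).
```

Again, note what is missing: no HTTP methods, no URLs, no payload schemas. That is what keeps the document open to non-engineers.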
Stories are shared understanding. So now let’s take a look at creating one of these stories, and I’m going to move over here to another application. Give me just a second here. Here we go.
So I talked to you earlier about this idea of a library and how this prompt library works. Let’s get started over here, Prompts. Okay. So we want to create an ALPS document from a user story. I’m actually going to use that task management app that we were just looking at earlier, that story, and generate an ALPS document from it. Now, this actually sets up all the prompts I need to create that document. Now, what I’m going to do is I’m actually going to paste this in and get this started while we talk a little bit about what’s going on. And I’ll show you what’s in this prompt. So I’m literally pasting this prompt in this ALPS profile creation prompt and giving it all sorts of instructions to work with. I’m going to set this off and running. And while I do that, I’m going to go over to another window and actually show you what that prompt looks like.
So this is the story that we did earlier. And this is the prompt that we just supplied to the chatbot. So these are basically step-by-step instructions on creating an ALPS document. If you were going to teach somebody how to create an ALPS document, this is what you would do. You would hand them this kind of information. And now, we can see how we’re doing over here. Okay. So it is actually now generating a complete ALPS document for us right now. So this is the first step in the process. This can also be converted, by the way, into an OpenAPI document. If you want it to go directly to a RESTful format rather than a more generic format, you can do that as well. Now, this takes a little bit of time. I’m going to go ahead and move on. We’ll use this in just a minute when we continue on with our project. All right. Let’s keep going.
So we’re building that ALPS description document now. We’re converting that user story into a prompt. We pass that prompt to generate that document. And that’s halfway through the story. We’re also going to apply a review to make sure that the process is going correctly. And then we’ll even load that document into our editor to make sure that it’s working properly. So let me go ahead and show you what that looks like here, too, in just a second. Let’s see if we’ve got… Is it done yet? Yes, it’s done. So let’s go ahead and move on to the next step.
This is our editor. This is how we’re actually going to validate our output here. I’m going to copy that output from the LLM into the visual editor, and it’s going to show us that it is a valid document. So it knows how to go from a list to tasks, all these particular actions, and how to manage users. And it’s generating all the basic information about what a task object looks like and what the task list can do. It’s got objects and actions. All of this material here is set and ready to go. So we have validated that we’ve got the proper user story done.
So we went through the process of creating this user story. We then created a prompt to get us started on creating our ALPS document. Then we went to the ALPS assistant and pasted it in. We can also do a validation; because of time, I’m going to skip this validation step, but it’s included in the repo that I’m going to share with you at the end. It’s always a good idea to validate with the assistant. And then we actually got the document we were looking for: not an OpenAPI document, but the ALPS document, which looks a lot like OpenAPI. And that makes sure that everything is up and running.
Now, let’s move on to the next step. We’ve described the design, but we have yet to actually create the implementation. And that’s the next step in our process here. So let’s create that Node API. So what we want to do is we want to get the GenAI to read the ALPS document and then create a simple NodeJS application. So the API design description, that ALPS document, becomes our map, our little navigational process for creating the working API. And now, we’re going to focus on a proof-of-concept API. We’re not working on production code. This is just for sharing and testing. Remember, what stories let us do is say we want to describe something, and we can quickly turn that into some working code just to see if it works, if it’s a good idea, if we like the way it behaves. This is not going to be production-level at all, but it’s going to be a great way to get a prototype up and running.
Now, to do that, I’m going to need to give it a couple of things. Remember, there were some notes that we gave the LLM about how to describe an ALPS document. I need to give that same information, along with our output, in order to describe how to do an API. So I’m going to do the same thing here and then move to our chat window. And here’s what I’m going to do. I’m actually going to give it a couple of things. I’m going to give it two documents, a prompts document and a template document. And then I’m going to give it just a little bit of advice. This is always a good way to interact with your generator and say, "Hey, this is what we’re going to do. It’s time to generate the document." So let’s give this a kick, and let’s make sure it starts running. While this is working, I’ll spend a little bit of time talking about what I just fed into the system. So let’s go back over here.
This first element is a step-by-step for parsing the profile and turning it into an ExpressJS service. And again, this is the same set of instructions you might give a new worker who just came into the office today. You can think of these chatbots as somebody who’s well-meaning, eager, talented, but not really fully informed. So they get informed by using tools like this. Now, I add some additional considerations, a sort of advice document, to make sure that things work okay. Let me see. It looks like it’s building here. So, yeah, here we go, it’s building code now. I think this is actually one of the smartest things they did with these chatbots: they make you watch the code. They don’t just produce the document for you. They make you watch it character by character, which is kind of entertaining. So it’s generating a working application right now based on the prompts I gave it.
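The heart of that step-by-step prompt is a mapping from ALPS descriptors to ExpressJS routes. Here is a rough sketch of that mapping in plain Node. The descriptor ids and the safe/unsafe/idempotent-to-HTTP-method convention below are my assumptions about one reasonable set of rules, not the exact contents of the prompt library:

```javascript
// A tiny slice of an ALPS profile: the 'type' field comes from the ALPS
// spec (safe / unsafe / idempotent); these particular ids are illustrative.
const descriptors = [
  { id: 'goTaskList',   type: 'safe',       rt: 'taskList' },
  { id: 'doCreateTask', type: 'unsafe',     rt: 'taskList' },
  { id: 'doUpdateTask', type: 'idempotent', rt: 'task' }
];

// One plausible convention: safe → GET, unsafe → POST, idempotent → PUT,
// with the return-type ('rt') deciding collection vs. item paths.
function toRoute(d) {
  const method = { safe: 'GET', unsafe: 'POST', idempotent: 'PUT' }[d.type];
  const path = d.rt === 'task' ? '/tasks/:id' : '/tasks';
  return { method, path, handler: d.id };
}

const routes = descriptors.map(toRoute);
// routes[0] → { method: 'GET', path: '/tasks', handler: 'goTaskList' }
console.log(routes);
```

In the real generated app, each `handler` name would become an Express callback (`app.get('/tasks', goTaskList)` and so on), but the translation table above is the part the LLM has to get right.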
Now, the reason I have those prompts and the reason I have them written down is that the next time I go back to this GenAI bot, it’s not going to remember anything about what we did before. It may say it does or it may recognize a few words or phrases, but it doesn’t understand the rules deeply from one minute to the next. There’s just not enough room to remember all that stuff. So I always, when I start a prompt, I always bring over those prompt materials, that guidance document, as well as whatever the ALPS document is going to be. And that’s really important. That’s turning that well-meaning, eager individual into somebody that’s also well-informed.
So this actually gives us a working API. And how do I know it’s working? Because I can then go to the URL and run it, and it actually is functional. So if I go to record 101 here, it’s going to give me that record, all the links, all the actions, such as update and complete, and go back. And I can also do… I think I can do users. Can I do users? Yes, I can do users as well. So we’ve got an API that’s up and running that somebody can look at and interact with. We have basically built our own live mock server.
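A response for that record might look something like the sketch below: a task body plus hypermedia links for the update, complete, and "go back" actions mentioned above. The field names and URL shapes are hypothetical; the talk only shows the response on screen:

```json
{
  "id": "101",
  "title": "Follow up with customer",
  "status": "active",
  "links": [
    { "rel": "update",     "href": "/tasks/101/update" },
    { "rel": "complete",   "href": "/tasks/101/complete" },
    { "rel": "collection", "href": "/tasks" }
  ]
}
```

Those links are what make the prototype explorable: a tester (or a bot) can follow them from record to record without reading any separate documentation.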
So let’s go back to the slides and review quickly. So we created this API, we checked it out, and we ran it, and we made sure that it actually works. We’ve got working code and a rough consensus about what that API could look like. Now, I could go back and make some changes to that original story, generate the ALPS, and then generate the API again in a matter of minutes. So now, I can do lots and lots of prototyping and tests. So that’s really pretty handy.
So let’s summarize here. We talked about this notion of writing stories. Stories are shared understanding between individuals, and everybody can participate in the story. It’s not a technical document. It’s really a domain-topic document. We also talked about a process of using GenAI to build a design document to describe the design. It describes all of the actions, all of the states or resources, all of the data points, and all the connections between them. And we saw that rendered visually in the ALPS editor, and we saw it in the actual ALPS document that was output. Finally, we took that description document, along with some instructions about how to convert it into NodeJS, and we created rough working API code.
Now, again, it wasn’t production-ready, but we can at least see how it works. It’s a different way to think about how we might mock something, and we can go back and do that over and over again. So, in just a matter of 10 minutes, while we were here, we went from this description to this actual working code. That’s pretty amazing, and that’s a pretty good use, in my opinion, of the GenAI products.
Now, there’s something important to keep in mind here, and you’ve probably heard this phrase, "vibe coding." What we just did is not really vibe coding. It’s close, but vibe coding, as Andrej Karpathy described it, is simply accepting whatever the machine gives you and just moving on, going forward, sort of like having a good old time. It isn’t really creating anything that’s production-ready. And how do we know this? Because Andrej, in February, after coining the vibe coding phrase, actually worked on a project to use GenAI to create a production application, called menu generator. And this was his quote after his experience trying to get GenAI to create production-ready code: basically, "it’s messy and not a good idea for anything of actual importance." And that’s a good thing to keep in mind. What we’re doing here is creating prototypes. We’re not trying to create production applications.
So all the material that I showed you is at this repo, and you can see walkthrough instructions on how to use this. And this can give you some ideas about how you can create your own customized generator process to turn stories or text documents into descriptions like ALPS or OpenAPI or AsyncAPI and then eventually into working code. So I hope this has been interesting. I thank you very much for your time. And I hope to see you again sometime soon.