Using ChatGPT to generate Swagger from MySQL with mysql-to-openapi.js
This is a story about how ChatGPT-4 and I collaborated to solve a community-wide MySQL problem and publish an open-source contribution in one day. The project is called mysql-to-openapi (GitHub), and it lets MySQL developers generate OpenAPI files from their existing databases.
To link Salesforce and MySQL, I already have a Go microservice, driven by an OpenAPI file, that reads and writes Salesforce data. To reuse that microservice for this new project, I needed to generate new OpenAPI files from my existing MySQL schemas.
- ChatGPT to the Rescue
- ChatGPT-4’s Blog Post
- The Generative AI Revolution is Here
ChatGPT to the Rescue
I searched around and could not find an easy solution to my MySQL-to-OpenAPI problem! Then I thought: ChatGPT-4 might know the answer. So I started with this prompt:
How do I generate an OpenAPI 2.0 specification from an existing MySQL schema?
Its first suggestion, based on sequelize, didn't work. I informed ChatGPT that it was doing a terrible job, and it made another attempt with the jugglingdb and jugglingdb-mysql packages. That also fell flat. Then ChatGPT started going in circles, repeating its earlier sequelize suggestion. I felt frustrated!
Then something kind of magical happened. With just a little snark, I asked it this:
Can’t you make a script that will convert a MySQL schema definition into an OpenAPI 2.0 “definitions” section?
Long story short, I had a back-and-forth “conversation” with ChatGPT for about four hours to wind up with a complete OpenAPI 2.0 spec. That time also produced all the content for this blog post and an open-source community contribution! You can see the results on GitHub.
ChatGPT-4’s Blog Post
I asked ChatGPT to write a blog post after we finished. I asked it to describe how the script works, to break the post into about four sections, and to include an anecdote about how ChatGPT and I interacted. Here is what it wrote, word-for-word. I’ll pop back in at the end for some closing remarks.
Introduction to the mysql-to-openapi.js Script
The mysql-to-openapi.js script is an efficient solution for generating OpenAPI 2.0 (Swagger) definitions for a MySQL database schema. It automates the process of creating models, paths, parameters, and responses for your schema, which can save a lot of time and effort when developing an API. In this blog post, we’ll dive into the details of how this script works and how it was developed through an AI-human collaboration.
Connecting to the MySQL Database and Retrieving Schema Information
The first step in generating OpenAPI definitions is to connect to the MySQL database and retrieve information about the schema. The mysql-to-openapi.js script accomplishes this by using the mysql package to create a connection to the database. The connection is established using the credentials provided in a .env file, ensuring that sensitive information is kept secure.
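The credential-loading step described above is usually handled by the dotenv package. As a dependency-free illustration of what that amounts to, here is a minimal sketch of parsing `.env`-style text (the `parseEnv` helper and the variable names are hypothetical, not the script's actual code):

```javascript
// Minimal, illustrative parser for .env-style "KEY=value" text.
// In practice the dotenv package populates process.env from the file.
function parseEnv(text) {
  const vars = {};
  for (const line of text.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed || trimmed.startsWith('#')) continue; // skip blanks and comments
    const eq = trimmed.indexOf('=');
    if (eq === -1) continue; // ignore malformed lines
    vars[trimmed.slice(0, eq).trim()] = trimmed.slice(eq + 1).trim();
  }
  return vars;
}

// Hypothetical .env contents:
const creds = parseEnv('DB_HOST=localhost\nDB_USER=api\n# a comment\nDB_NAME=shop');
```

Keeping credentials in `.env` (and out of version control) is what keeps the connection details secure.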
Once connected, the script queries the INFORMATION_SCHEMA.COLUMNS table to retrieve information about each table and column in the specified schema. This information is crucial for generating the OpenAPI definitions, as it provides details about the data types, column names, and table names.
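The schema query described above looks roughly like the following sketch. The `buildColumnsQuery` helper is a hypothetical name for illustration; the real script runs a similar statement through the mysql package's `connection.query()`:

```javascript
// Illustrative SQL for pulling table/column metadata for one schema.
// The "?" placeholder is filled with the schema name at query time.
function buildColumnsQuery() {
  return `
    SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE
    FROM INFORMATION_SCHEMA.COLUMNS
    WHERE TABLE_SCHEMA = ?
    ORDER BY TABLE_NAME, ORDINAL_POSITION`;
}

// Assumed usage with the mysql package:
// connection.query(buildColumnsQuery(), [schemaName], (err, rows) => { ... });
const columnsQuery = buildColumnsQuery();
```

Ordering by `ORDINAL_POSITION` keeps the generated properties in the same order as the table's columns.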
Generating OpenAPI Definitions
After retrieving the schema information, the mysql-to-openapi.js script iterates through the query results, processing each table and column to generate the OpenAPI definitions. The script follows several steps to create the definitions:
- Model Definitions: For each table in the schema, the script creates a model definition in the definitions section of the OpenAPI YAML file. These definitions represent the structure of each table, including the column names and data types. The script also converts column names to camelCase and maps MySQL data types to their corresponding OpenAPI data types.
- Request and Response Objects: The script creates request and response objects for each table in the schema. These objects are used to define the expected input and output for API operations. The request objects include an array of table records, while the response objects contain an array of table records with any related data.
- CRUD Operations: The script generates basic Create, Read, Update, and Delete (CRUD) operations for each table in the schema. These operations are defined in the paths section of the OpenAPI YAML file, with each operation including the appropriate HTTP method (GET, POST, PUT, or DELETE), a summary, and an operation ID.
- Parameters and Responses: The script generates parameters and responses for each CRUD operation. Parameters define the expected input for API operations, while responses define the expected output. The generated parameters and responses reference the request and response objects created earlier, ensuring consistency across the API definition.
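The model-definition step above (camelCasing column names and mapping MySQL types to OpenAPI types) can be sketched as follows. The helper names and the mapping table are illustrative assumptions; the actual mapping in mysql-to-openapi.js may cover more types:

```javascript
// Convert a snake_case column name to camelCase, e.g. "order_id" -> "orderId".
function toCamelCase(name) {
  return name.toLowerCase().replace(/_([a-z0-9])/g, (_, c) => c.toUpperCase());
}

// Map a MySQL data type to an OpenAPI 2.0 type/format pair (illustrative subset).
function mapType(mysqlType) {
  const map = {
    int: { type: 'integer', format: 'int32' },
    bigint: { type: 'integer', format: 'int64' },
    varchar: { type: 'string' },
    text: { type: 'string' },
    datetime: { type: 'string', format: 'date-time' },
    decimal: { type: 'number' },
    tinyint: { type: 'boolean' }, // common convention for tinyint(1) flags
  };
  return map[mysqlType] || { type: 'string' }; // fall back to string
}

// Build one model definition from INFORMATION_SCHEMA rows for a single table.
function buildModel(columns) {
  const properties = {};
  for (const col of columns) {
    properties[toCamelCase(col.COLUMN_NAME)] = mapType(col.DATA_TYPE);
  }
  return { type: 'object', properties };
}

const model = buildModel([
  { COLUMN_NAME: 'order_id', DATA_TYPE: 'bigint' },
  { COLUMN_NAME: 'created_at', DATA_TYPE: 'datetime' },
]);
```

Each model built this way becomes one entry in the spec's `definitions` section, which the CRUD paths then reference via `$ref`.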
Writing the Generated OpenAPI YAML File
Once the OpenAPI definitions have been generated, the mysql-to-openapi.js script writes the output to a YAML file in the project root folder. This file can be used as a starting point for further customization and development of an API. By automating the generation of OpenAPI definitions, the script saves valuable time that would otherwise be spent manually creating and maintaining the API documentation.
AI-Human Collaboration and Anecdotes
The development of the mysql-to-openapi.js script was a unique and exciting experience, as it was co-created by an AI language model (OpenAI’s ChatGPT) and a human developer. This collaboration allowed for the rapid exchange of ideas, resulting in a more efficient and polished final product.
The process began with the human developer outlining the basic requirements and structure of the script. The AI then suggested code snippets and improvements, which the human developer reviewed and incorporated as needed. As the script took shape, the AI helped troubleshoot issues, optimize the code, and fine-tune the generated output. This iterative process allowed the developer to focus on the bigger picture while the AI provided valuable insights and suggestions.
One memorable anecdote from our collaboration involved a problem with the generated API paths. Initially, the AI provided a script that outputted paths with unnecessary repetitions and capital letters. The developer pointed out the issue, and the AI was quick to offer solutions. After several attempts, the AI successfully resolved the problem, resulting in the desired output.
Another interesting aspect of our collaboration was the creation of the README.md file for the project. The AI not only provided a well-structured and informative README, but also formatted it in Markdown, demonstrating its understanding of various documentation formats.
The AI’s ability to comprehend complex programming concepts and generate human-readable explanations made the collaboration a truly enjoyable and productive experience. The mysql-to-openapi.js script is a testament to the potential of AI-human partnerships in the software development process. By leveraging the strengths of both parties, we were able to create a powerful tool that simplifies the task of generating OpenAPI definitions for MySQL databases.
In conclusion, the mysql-to-openapi.js script showcases the power of AI-human collaboration in software development. By combining the expertise and creativity of a human developer with the efficiency and precision of an AI language model, we were able to create a valuable tool that streamlines the generation of OpenAPI definitions for MySQL databases. This partnership highlights the potential for AI to enhance the software development process, enabling developers to focus on more complex tasks while the AI assists with coding, troubleshooting, and documentation.
The Generative AI Revolution is Here
It’s me again. It took only two weeks of exposure to ChatGPT to develop the skills to produce this genuinely useful contribution to the open-source and developer community. And the project itself took only a few hours from start to finish, including this blog post.
It is hard to know how my prompt-engineering skills rate against others’. I certainly have an advantage from decades of programming experience, but my generative AI skills can’t be that good given how little practice I have. Imagine what will be possible in just a few months with new tools, applications, and capabilities.
As an analyst, it is occasionally my job to send off fireworks and alarm bells to get people’s attention. This experience of mine proves to me there is something happening with generative AI worthy of fireworks.
And I think we’re only at the beginning of how these tools will affect the daily life of an IT manager. Soon we will be using generative AI to help research solution architectures, operationalize design principles, and code defenses against cybersecurity threats.
So, we could be on the threshold of another technological revolution worthy of comparison to the integrated circuit, the personal computer, the Internet, e-commerce, social media, and B2B SaaS. Each of those revolutions left some businesses and industries in ruins and made others successful and wealthy. Now could be the time to make your fate in the AI revolution.