Systemprompt.io
With Systemprompt.io modules in Make, you can embed prompts with validated structured data as the output. This enables the use of chained prompts in scenarios, unlocking agentic flows and automations.
An easy-to-use graphical interface lets you create and manage your prompts. Prompts can be defined in detail using state-of-the-art (SOTA) prompting techniques, static content can be saved and referenced, and the platform gives you control over the output data and visibility into usage.
Systemprompt.io provides a comprehensive and growing library of prompts and scenarios to boost your automations with reliable AI.
You can find the Systemprompt.io prompt library here.
Detailed setup and how-to guides are listed here.
Systemprompt.io is committed to its Make integration, enabling easy, no-code, AI-powered agentic workflows.
Use of this module and the provided prompts and scenarios requires a paid subscription on the Execution plan (a 14-day free trial is available).
Connect Systemprompt.io to Make
To establish the connection, you must:
Obtain your API key from Systemprompt.io
To obtain your API key from your Systemprompt.io account:
Log in to your Systemprompt.io account.
If you are not a paid user, select the plan that best suits your needs.
To use Systemprompt.io in Make, you need the Execution plan.
There is a 14-day free trial that you can cancel at any time.
Navigate to the dashboard.
Click on API SETTINGS and copy the API key value shown.
You will use this value in the API Key field in Make.
Establish the connection in Make
Log in to your Make account, add a Systemprompt.io module to your scenario, and click Create a connection.
Optional: In the Connection name field, enter a name for the connection.
In the API Key field, enter the API key copied above.
Click Save.
You have successfully established the connection. You can now edit your scenario and add Systemprompt.io modules. If your connection requires reauthorization at any point, follow the connection renewal steps here.
Build Systemprompt.io Scenarios
After connecting the app, you can perform the following actions:
Core
Execute Prompt
Executes a call to an LLM with structured request and response data.
Create Prompt
Creates a Systemprompt.io prompt for embedding via API.
Convert URL to a blog
Reads a URL and converts the content into a Systemprompt.io blog.
Universal
Make an API Call
Performs an arbitrary authorized API call.
Examples of usage
In this section, you will learn how to use the Create Prompt and Execute Prompt modules.
Create Prompt
Create Prompt lets you provide a title and a description of what you want your prompt to do; Systemprompt.io then autogenerates the complete prompt in its system.
The module returns a prompt ID that you can then use with Execute Prompt.
```json
[
  {
    "name": "title",
    "label": "Title",
    "type": "text",
    "help": "The title of your prompt; the Systemprompt.io AI generation platform will do the rest.",
    "required": true
  },
  {
    "name": "description",
    "label": "Description",
    "type": "text",
    "help": "The description of your prompt; the Systemprompt.io AI generation platform will do the rest.",
    "required": true
  }
]
```
Your prompt is now available in your Systemprompt.io dashboard for you to edit and test for more complex use cases.
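If you prefer to call the API directly rather than use the module, the request body mirrors the two fields above. The sketch below is illustrative only: the base URL, endpoint path, and header name are assumptions, so take the real values from your API SETTINGS page and the Systemprompt.io API documentation.

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # copied from API SETTINGS in the dashboard
BASE_URL = "https://api.systemprompt.io"  # assumed base URL

# The same two fields the Create Prompt module exposes.
payload = {
    "title": "Birthday present curator",
    "description": "Turn a product list into five curated gift ideas.",
}

request = urllib.request.Request(
    f"{BASE_URL}/prompts",  # hypothetical endpoint path
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "api-key": API_KEY},
    method="POST",
)
# urllib.request.urlopen(request) would send it; the response is expected
# to include the prompt ID you pass to Execute Prompt.
```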
For new users, we recommend starting by importing a prompt from the Systemprompt.io prompt library.
If you have a common use case that you can’t see in the Systemprompt.io library, reach out on the Systemprompt.io Discord channel and we may set up the prompt on your behalf.
Execute Prompt (recommended for most Make use cases)
After you have created or imported a prompt, you can now execute it in your scenarios.
Executing prompts enables you to directly work with the result of the prompt after it has been processed by an LLM. In the Make scenario, you will receive the output of the prompt after it has been processed, with the structured data guaranteed to match the definition in your prompt entity.
For example, you might want to convert a list of products into a curated list of birthday presents for a target profile. You would define the output schema as follows:
```json
{
  "type": "object",
  "$schema": "http://json-schema.org/draft-07/schema#",
  "required": ["products"],
  "properties": {
    "products": {
      "type": "array",
      "minItems": 5,
      "maxItems": 5,
      "items": {
        "type": "object",
        "required": ["productLink", "title", "description", "price", "choiceMessage"],
        "properties": {
          "price": { "type": "number", "description": "The price of the product." },
          "title": { "type": "string", "description": "The title of the product." },
          "description": { "type": "string", "description": "A brief description of the product." },
          "productLink": { "type": "string", "format": "uri", "description": "A URL link to the product page." },
          "choiceMessage": { "type": "string", "description": "A message explaining why this product was chosen." }
        }
      }
    }
  }
}
```
This means that you are guaranteed to receive a response that conforms to the defined schema and can be used in your scenarios.
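To illustrate what that guarantee buys you, here is a minimal Python sketch with a made-up response (the product data is hypothetical): because the schema pins the item count and required keys, a downstream step can read fields directly instead of defensively checking for them.

```python
# A hypothetical Execute Prompt result; real output comes from the module.
response = {
    "products": [
        {
            "productLink": f"https://example.com/product/{i}",
            "title": f"Gift idea {i}",
            "description": "A placeholder product description.",
            "price": 10.0 + i,
            "choiceMessage": "Chosen to match the target profile.",
        }
        for i in range(5)
    ]
}

# The schema enforces minItems/maxItems of 5 and the required keys,
# so these invariants hold for every validated response.
REQUIRED = {"productLink", "title", "description", "price", "choiceMessage"}
assert len(response["products"]) == 5
assert all(REQUIRED <= product.keys() for product in response["products"])

# Safe to map over fields without existence checks.
summary = ", ".join(p["title"] for p in response["products"])
print(summary)
```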
Don’t worry if you aren’t comfortable creating JSON Schemas; as with everything in Systemprompt.io, “there is a prompt for that”, and the platform has automations and flows that will do the technical work on your behalf.
This can be done in the Systemprompt.io dashboard, or you can get help on the Systemprompt.io Discord channel.
Universal
For advanced users, Systemprompt.io publishes an OpenAPI-compliant Swagger schema for its API, opening up a world of use cases on the Systemprompt.io platform.
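An OpenAPI document lists every available path and operation, which is how you would discover what the Make an API Call module (or a raw HTTP client) can reach. The snippet below enumerates operations from a tiny stand-in document; the paths and summaries shown are invented for illustration, so consult the real published schema for the actual API surface.

```python
# A small stand-in for a fetched OpenAPI document (the real one is
# published by Systemprompt.io); the paths below are hypothetical.
openapi_doc = {
    "openapi": "3.0.0",
    "paths": {
        "/prompts": {
            "get": {"summary": "List prompts"},
            "post": {"summary": "Create a prompt"},
        },
        "/prompts/{id}/execute": {
            "post": {"summary": "Execute a prompt"},
        },
    },
}

# Flatten the document into (METHOD, path, summary) tuples.
operations = [
    (method.upper(), path, op["summary"])
    for path, methods in openapi_doc["paths"].items()
    for method, op in methods.items()
]

for method, path, summary in operations:
    print(f"{method} {path} - {summary}")
```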