GraphQL Service API
The Service API is a middleware service that connects your frontend to your backend stack. It includes examples for handling Crystallize events via webhooks, integrating with payment providers, user authentication services, and more.
What is GraphQL?
GraphQL is a query language for APIs and a server-side runtime for fulfilling those queries with your existing data. It is designed to let developers construct requests that pull data from multiple data sources in a single API call.
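As a rough illustration of that idea, a single GraphQL request can ask for several unrelated pieces of data at once. The field names below are purely hypothetical and not tied to the Service API's actual schema:

```graphql
# Illustrative only: "product" and "currentUser" are example fields,
# not fields guaranteed to exist in your Service API schema.
query {
  product(path: "/shop/chair") {
    name
    price
  }
  currentUser {
    email
  }
}
```

The server resolves each field independently, so the product data and the user data can come from entirely different backends while the client makes one call.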
To get started with your own Service API, run the following command:
npx @crystallize/cli <your-project-name>
Follow the steps to select the Service API boilerplate and your preferred deployment platform. The following platforms are currently supported:
- Vercel (Serverless)
- AWS Lambda (Serverless)
You will be prompted to input your own tenant’s identifier or select the default furniture tenant. You would typically use the same tenant identifier you use in your frontend boilerplates (e.g. the React + Next.js boilerplate). The selected tenant can be changed at any time by editing the CRYSTALLIZE_TENANT_IDENTIFIER variable in the .env file in the source code.
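For example, switching to a different tenant is a one-line change in the .env file (the identifier shown here is illustrative):

```
# .env — replace with your own tenant identifier
CRYSTALLIZE_TENANT_IDENTIFIER=my-tenant
```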
Running the project
Running the project in development is straightforward. The following command will start up the development server:
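Assuming the boilerplate uses standard npm scripts (check the `scripts` section of its package.json to confirm), this would be:

```
npm run dev
```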
The server runs on port 3001 by default so that you can run your boilerplates, which by default run on port 3000, at the same time.
Accessing the GraphQL playground
After running the project, you can access the service on your local machine at http://localhost:3001/api/graphql.
Linking a boilerplate to the Service API
The Service API can be linked to multiple different frontends, whether they are one of the open source boilerplates or any other service. Depending on whether your instance of the Service API is running in development or production, you may want to configure a different access URL in your applications.
To link a boilerplate to the Service API, you need to provide this environment variable to your boilerplate:
Its value should point to:
- http://localhost:3001/api/graphql (for development)
- your_preview_or_production_url (for preview or production)
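In a boilerplate's env file this might look like the following sketch. The variable name SERVICE_API_URL is a placeholder here — use the exact name your boilerplate's env example file specifies:

```
# .env.local in your frontend — variable name is hypothetical
SERVICE_API_URL=http://localhost:3001/api/graphql
```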
Environment variables & tokens
The Service API is the part of the architecture responsible for connecting and syncing different services together: validating your basket, creating orders, handling user authentication, email services, order management systems, and more. As a result, the Service API is responsible for holding the API keys and tokens for these services, instead of them being exposed in your frontend(s).
A pair of access tokens must be included as environment variables to allow the Service API to connect to Crystallize’s Order API. Make sure you have access to the tenant you want to connect to and have gone through the process of creating a pair of access tokens.
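The token pair typically goes into the same .env file. The variable names below follow Crystallize's usual convention, but verify them against the env example file in your generated project:

```
# .env — values are placeholders, never commit real tokens
CRYSTALLIZE_ACCESS_TOKEN_ID=your-token-id
CRYSTALLIZE_ACCESS_TOKEN_SECRET=your-token-secret
```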
Enabling additional services (e.g. Stripe Checkout, SendGrid) means you need to provide the set of tokens for each service. You can follow the .env.local.example file included in the source code, pick your preferred list of services, and fill out this information in the .env file.
Depending on your selected deployment method (Vercel, AWS, etc.), there are different ways of providing tokens. You can follow these guides on how to add environment variables in Vercel or how to add environment variables on AWS Lambda.
Editing the logic
The Service API is completely open source, and you are in full control of the logic you would like to implement. The examples covered in the boilerplate live under /src/services and /src/webhooks and include examples of handling the different Crystallize webhooks, as well as other service integrations (e.g. voucher codes).
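To sketch what such a webhook handler might look like, here is a minimal, self-contained example. The payload shape and function name are assumptions for illustration, not the boilerplate's actual API — consult the handlers under /src/webhooks for the real signatures:

```typescript
// Hypothetical shape of an order-created webhook payload.
interface OrderWebhookPayload {
  order: {
    get: {
      id: string;
      customer?: { identifier?: string };
    };
  };
}

// Handles an incoming order-created event and returns a small
// acknowledgement the route handler could send back to Crystallize.
export function handleOrderCreated(
  payload: OrderWebhookPayload
): { ok: boolean; orderId: string } {
  const orderId = payload.order?.get?.id;
  if (!orderId) {
    // Reject malformed payloads early instead of syncing bad data.
    throw new Error("Webhook payload is missing an order id");
  }
  // This is where you would sync the order to an external system,
  // e.g. an email service or an order management system.
  return { ok: true, orderId };
}
```

The important point is that this logic runs server-side in the Service API, so any tokens needed to talk to downstream services never reach the browser.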
Deploying your project
Deploying your Service API project again depends on your selected hosting option. For AWS, you may want to follow this guide on deploying an AWS service using the Serverless Framework, while for Vercel, you can follow this guide on Vercel deployments.