AWS Public Sector Blog

Implement a secure, serverless GraphQL architecture in AWS GovCloud (US) to optimize API flexibility and efficiency

Many Amazon Web Services (AWS) customers want to leverage the flexibility and efficiency of GraphQL in their workloads. GraphQL is a query language and server-side runtime system for application programming interfaces (APIs) that prioritizes giving clients exactly the information they request and no more. Building architectures that incorporate GraphQL can help developers evolve their APIs over time and solve shortcomings and inefficiencies associated with REST APIs. GraphQL can help public sector customers focus on their data and provide ways to explore the data in their APIs.

In this blog post, learn about an AWS reference architecture using serverless technologies that you can use as a basis for building GraphQL-enabled solutions in the AWS GovCloud (US) Regions to unify data access in real time and simplify operations.

GraphQL implementation options on AWS

There are two ways to run GraphQL implementations on AWS. The first way is using AWS AppSync, a fully managed, serverless GraphQL API service. AWS AppSync is available in most AWS Regions. The second way is with an open-source, GraphQL spec-compliant server such as Apollo running on AWS. For the purpose of this blog post, we focus on the second approach.

Solution overview: Create a secure, serverless GraphQL architecture in AWS GovCloud (US)

Most real-world public sector workloads involve managing data, and that data is typically handled by various personas, including end users and administrators. In this walkthrough, we use the term “missions” for such workloads.

The sample solution in this blog post uses AWS Lambda to run a fully functional GraphQL API, allowing interactions with Amazon DynamoDB, a fully managed, serverless, key-value NoSQL database designed to run high-performance applications at any scale. This enables features like creating and reading “mission items” by different personas. These mission items can represent various projects, goals, programs, or tasks that contribute to the overall mission or objectives of the organization.

You can create a GraphQL API that interfaces with DynamoDB, allowing clients to query, mutate, and interact with your DynamoDB data using GraphQL queries and mutations. Additionally, you can use foundation models deployed in Amazon SageMaker to build a generative artificial intelligence (AI) application that generates text content to incorporate into mission items. This content can include mission details and is generated by analyzing user prompts and patterns learned from existing data.

The solution builds upon workflows for two different personas. The first is that of users with mission administrator-level capabilities who support mission needs. They can perform privileged actions like generating mission details and creating missions with those details. The second persona consists of mission personnel who support the mission by accessing those missions and acting on them.

The solution uses AWS Lambda to host a fully operational GraphQL layer, with Apollo Server as the GraphQL server implementation. This offers flexibility in fine-tuning the operational characteristics of the APIs for both administrators and mission personnel.

Lambda is also used to invoke a text generation model hosted on Amazon SageMaker. Amazon API Gateway acts as the gateway to these APIs because it has native integration with Lambda. AWS WAF, a web application firewall, protects the API Gateway endpoints from web exploits such as SQL injection and cross-site scripting attacks. An Amazon Cognito user pool acts as a user directory for the users of the two personas. Further, the solution uses dedicated Amazon Cognito app clients for each persona to authorize access to the APIs as necessary. AWS Lambda hosts the Apollo server, and Amazon DynamoDB is the back-end data store. The AWS Lambda functions hosting the GraphQL server interact with the DynamoDB table (named ‘Missions’ in Figure 1) to support the APIs that perform create and read operations.

Figure 1. Architecture diagram of the solution, explained in more detail in the following section.

Figure 1 illustrates the solution workflow:

1. This flow represents a user with the mission administrator persona authenticating to the Amazon Cognito app client dedicated to administration functions. Once successfully authenticated, the user is authorized to invoke the APIs applicable to them.

1.1. The solution provides an API to invoke an Amazon SageMaker-hosted model which can generate text. The user can invoke this API if authorized.

1.2. Once the API is invoked, the native integration with AWS Lambda executes the corresponding Lambda function.

1.3. AWS Lambda invokes the Amazon SageMaker endpoint, which uses a trained generative AI model to generate text that can be used for mission details.

1.4. The user can create missions by executing the API in this flow.

1.5. Once the API is invoked, the AWS Lambda function hosting the GraphQL server is executed.

1.6. This creates an item in the Missions table representing a mission.

2. This flow represents a user with the mission personnel persona authenticating to the Amazon Cognito app client dedicated to personnel functions. Once successfully authenticated, the user is authorized to invoke the APIs applicable to them. In this case, mission personnel are not allowed to perform any create operations in the Amazon DynamoDB table; this separation is enforced through the app clients.

2.1. The solution provides an Amazon API Gateway endpoint (named ‘Personnel Endpoint’ in this solution) to retrieve missions by executing the API in this flow. The user can only invoke this API if authorized to do so.

2.2. Once the API is invoked, the AWS Lambda function hosting the GraphQL server is executed.

2.3. This retrieves an item representing a mission from the Missions table in DynamoDB.

Prerequisites

To create the sample solution described in this blog post, you need:

1. An AWS GovCloud (US) account. Create a GovCloud account if you do not already have one and log in. Note: An AWS GovCloud (US) account is recommended to follow along with the sample solution in this blog post.

2. The AWS Command Line Interface (AWS CLI) configured. You need the AWS CLI configured with appropriate permissions to build and deploy with the AWS Cloud Development Kit (AWS CDK).

3. Node.js 14.x installed.

4. AWS CDK v2 installed, with a minimum version of 2.84. You can verify these prerequisites with the version checks shown after this list.
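
As a quick check, you can confirm the tooling from your terminal. These are standard CLI version commands and are not specific to this solution; the exact versions you see depend on your environment.

aws --version                    # AWS CLI version
aws sts get-caller-identity      # confirms the AWS CLI is configured with valid credentials
node --version                   # Node.js version (14.x for this solution)
cdk --version                    # AWS CDK version (2.84 or later)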

Deploying the solution

1. Clone the aws-govcloud-graphql GitHub repository in your terminal.

git clone https://github.com/aws-samples/aws-govcloud-graphql

2. Follow the instructions in the README provided on GitHub to build and deploy the solution.
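
The authoritative build and deploy steps are in the repository README. As a rough sketch, a typical AWS CDK v2 workflow looks like the following; the exact npm scripts and stack names may differ, so defer to the README where they do.

cd aws-govcloud-graphql
npm install          # install project dependencies
cdk bootstrap        # one-time bootstrap of the target account and Region, if not already done
cdk deploy --all     # deploy the stacks defined in the project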

Testing the deployment

Now that you’ve deployed the solution, you can create users of both personas in the Amazon Cognito user pool provisioned as part of the deployment.

Figure 2 and Figure 3 feature examples of users created in the Amazon Cognito user pool, representing the mission administrator and mission personnel personas respectively.

Figure 2. The Amazon Cognito user attributes screen. The ‘sub’ value is the username for the Admin user.

Figure 3. Amazon Cognito user attributes screen. The ‘sub’ value is the username for the Personnel user.

You can use these users to test the Administrator and Personnel flows. Note the two Amazon Cognito app clients that are specific to each persona in the App clients and analytics section within the Amazon Cognito dashboard.

Figure 4. In the Amazon Cognito app clients and analytics screen, find the Amazon Cognito app client names for each persona.

Testing the Admin flow

Use a tool like Postman to get the OAuth 2.0 access token for the Administrator user. Run the OAuth 2.0 authorization flow in Amazon Cognito and use the app client intended for mission administrator users. Figure 5 features a sample custom OAuth 2.0 scope attached to admin users:

Figure 5. A sample OAuth 2.0 scope attached to an admin user, which features 1) the app client ID, 2) the custom scope, and 3) the username for the Admin user.

You can now use the token to invoke the APIs for these users. In this case, the generative AI stack is set up following the steps in the GitHub repository. Then, these users can generate a detailed description of the mission by invoking the provisioned API as follows:

In the tool that you are using to invoke the API, set the AUTH_HEADER value with export AUTH_HEADER="Authorization: Bearer <Access Token>". This format is commonly used to include an access token in an Authorization header when authenticating API requests.

Replace the <Access Token> portion with the actual access token, such as the one retrieved for the admin user shown in Figure 5.

To make requests to the Amazon API Gateway endpoint, we use curl for demonstration purposes in this blog post. However, in a real-world scenario, customers would build the API execution logic within their workloads.
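
For example, a curl invocation of the text generation API could look like the following sketch. The endpoint path (/generate) and the request body shape are illustrative assumptions rather than the solution's actual contract; substitute the endpoint URL and payload format documented in the repository README.

export AUTH_HEADER="Authorization: Bearer <Access Token>"

# Hypothetical example: invoke the admin endpoint that fronts the SageMaker-backed Lambda function
curl -X POST "https://<api-id>.execute-api.us-gov-west-1.amazonaws.com/prod/generate" \
  -H "$AUTH_HEADER" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Generate a detailed description for a new mission"}'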

Figure 6. Sample output from the generative AI model invocation via the endpoint described in the architecture.

Admin users can create missions, or items in the Amazon DynamoDB table, by invoking the GraphQL-backed API using an HTTP POST request method.

When invoked, this API uses a GraphQL mutation to create a mission item in the DynamoDB table. Include a payload with the mission’s name and description in the request body of the POST call. If the operation is successful, the API adds a mission item to the DynamoDB table and returns a response with the “mission id.”
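
As a rough sketch, the request could look like the following curl call. The GraphQL path, operation name, and field names (createMission, name, description, id) are assumptions inferred from the figures; the schema and README in the repository define the actual operation.

# Hypothetical example: create a mission through the admin GraphQL endpoint
curl -X POST "https://<api-id>.execute-api.us-gov-west-1.amazonaws.com/prod/graphql" \
  -H "$AUTH_HEADER" \
  -H "Content-Type: application/json" \
  -d '{
        "query": "mutation CreateMission($name: String!, $description: String!) { createMission(name: $name, description: $description) { id } }",
        "variables": { "name": "Sample mission", "description": "Mission details generated in the previous step" }
      }'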

Figure 7. The ‘createMission’ operation for Admin users.

As described in Figure 1, the API creates a “mission” as an item in the corresponding DynamoDB table. Figure 8 features a sample name and description for the generated mission item.

Figure 8. The resulting item in the DynamoDB table once the createMission operation is completed.

Testing the Personnel flow

Testing the Personnel flow is similar to the testing procedure in the previous Admin flow section. Figure 9 illustrates a sample custom OAuth 2.0 scope attached to personnel users.

Figure 9. A sample OAuth 2.0 scope attached to a personnel user, which features 1) the app client ID, 2) the custom scope, and 3) the username for the personnel user.

You can now use the token to invoke the APIs for these users. Personnel users can retrieve missions, or items from the DynamoDB table, by invoking the GraphQL-backed API using an HTTP POST request method.

When invoked, this API uses a GraphQL query to read the requested data from the DynamoDB table. Include a payload containing the “mission id” in the request body when making the POST call. If the operation is successful, the API reads the mission item from the DynamoDB table and returns the whole record as the response.
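
A sketch of this request with curl is shown below, using an access token obtained for the personnel user. As before, the GraphQL path and the getMission operation and field names are assumptions; rely on the schema and README in the repository for the actual operation.

# Hypothetical example: retrieve a mission through the personnel GraphQL endpoint
curl -X POST "https://<api-id>.execute-api.us-gov-west-1.amazonaws.com/prod/graphql" \
  -H "$AUTH_HEADER" \
  -H "Content-Type: application/json" \
  -d '{
        "query": "query GetMission($id: ID!) { getMission(id: $id) { id name description } }",
        "variables": { "id": "<mission id>" }
      }'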

Figure 10. An example of the GetMission operation for a personnel user.

Clean up

To clean up the solution, follow the cleanup steps in the README.

Conclusion

AWS federal customers and those who support them can run GraphQL implementations on AWS to interact with DynamoDB in the AWS GovCloud (US) Regions using an open-source, GraphQL spec-compliant server and serverless services from AWS. This can help organizations achieve unified data access in real time and simplify operations. Further, this solution includes an approach for leveraging generative AI on AWS, which can be integrated with GraphQL implementations to generate recommendations based on user prompts. The streamlined data-fetching process, along with the ability to request precisely the information needed via GraphQL, can optimize network communication and enhance the overall user experience.

Subscribe to the AWS Public Sector Blog newsletter to get the latest in AWS tools, solutions, and innovations from the public sector delivered to your inbox, or contact us.

Please take a few minutes to share insights regarding your experience with the AWS Public Sector Blog in this survey, and we’ll use feedback from the survey to create more content aligned with the preferences of our readers.

Saptarshi Banerjee

Saptarshi Banerjee serves as a partner solutions architect at Amazon Web Services (AWS), collaborating closely with AWS Partners to design and architect mission-critical solutions. With a specialization in serverless architecture and cloud-native solutions, Saptarshi is dedicated to enhancing performance, scalability, and cost-efficiency for AWS Partners within the cloud ecosystem.

Rajarshi Das

Rajarshi Das is a senior solutions architect at Amazon Web Services (AWS). He focuses on helping public sector customers accelerate their security and compliance certifications and authorizations by architecting secure and scalable solutions. Rajarshi holds four AWS certifications, including AWS Certified Solutions Architect – Professional and AWS Certified Security – Specialty.