Stack Name - AWS Lambda


This is the base reusable implementation for AWS Lambda tech stacks, indirectly using Hibernate for relational database integration. The stack includes:
  • Spark Micro Framework:
    • A lightweight alternative to heavier Java frameworks such as Spring or Struts 2, well suited to microservice implementations
    • Provides a remote replica Restful API that maintains a persistent relational database connection, ensuring fast execution with minimal latency for Lambda functions that must read from or write to a relational database
  • Hibernate:
    • An object–relational mapping tool for the Java programming language. It provides a framework for mapping an object-oriented domain model to a relational database

Design Consideration

Problem: To date, using Lambda functions in scenarios that require direct connectivity to a relational database has been frowned upon. This is due to the unacceptable latency of establishing a database connection on every call: a serverless function is stateless by definition, so a database connection cannot be cached. A relational database also limits the number of concurrent connections.

Solution: Harbormaster overcomes this well-known Lambda-to-relational-database problem by generating a replica Restful API. Each Lambda function delegates to its associated Restful API to perform reads and writes against the database. This design preserves the value of a serverless API while providing the integrity and low latency demanded by consumers. Importantly, other applications can take advantage of the Restful API as well.
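The delegation pattern described above can be sketched in plain Java. All class, endpoint, and field names below are hypothetical, and a stub JDK HttpServer stands in for the generated Restful API so the sketch is self-contained; the real generated layer runs Spark with Hibernate-backed persistence.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Hypothetical illustration of the delegation pattern: the Lambda handler
// keeps no database connection of its own; it forwards each read/write to
// the generated Restful API, which owns the pooled database connectivity.
public class RestDelegate {

    // What a Lambda handler body might do: delegate a read to the API.
    public static String fetchEntity(String serviceUrl, String id) throws IOException {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(serviceUrl + "/" + id).openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            return in.readLine();
        }
    }

    // Stand-in for the generated Restful API layer (hypothetical endpoint).
    public static HttpServer stubApi() throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/customers", exchange -> {
            byte[] body = "{\"id\":\"42\"}".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }
}
```

Because connection pooling lives behind the Restful API, each Lambda invocation pays only the cost of a single HTTP hop instead of a fresh database handshake.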


Access: Public
Derived From:
Long Name: AWS Lambda
Short Name: Lambda
Language(s): Java 8+, NodeJS, YAML, XML, Velocity Macros
Git Url:
Example Project YAML:


In addition to what is supported by the parent tech stacks, the following capabilities are overridden or added:
  • Build:
    • Overloaded CI vendor-specific macros to apply settings specific to this tech stack, such as AWS and Hibernate configuration
  • Spark Micro Web Framework:
    • All templates and Velocity macros used to create a complete Restful API covering all CRUD capabilities, with persistence via Hibernate
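As an illustration of the CRUD contract such a generated API exposes, here is a minimal sketch. The generated layer persists through Hibernate behind Spark routes, but an in-memory map stands in here so the example is self-contained; all names are hypothetical.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sketch of the four CRUD operations the generated Restful API
// exposes per entity. The real generated layer delegates each operation to a
// Hibernate session; a ConcurrentHashMap stands in for the database here.
public class CrudDao<T> {
    private final Map<String, T> store = new ConcurrentHashMap<>();

    public T create(String id, T entity) { store.put(id, entity); return entity; }
    public T read(String id)             { return store.get(id); }
    public T update(String id, T entity) { return store.replace(id, entity); }
    public T delete(String id)           { return store.remove(id); }
}
```

A generated Spark route layer would typically map the POST, GET, PUT, and DELETE HTTP verbs onto these four operations.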

Important AWS Considerations

Using Terraform

If Harbormaster is enabled to generate Terraform file(s) for AWS, it will create a secure EC2 instance running a MongoDB database instance. Each Lambda function is assigned an environment variable named mongoDbServerAddresses; its value is the URL(s) of the MongoDB instance(s) to be used by each Lambda function for reading and writing.
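A Lambda function might consume that variable along these lines. The comma-separated multi-address format and the helper names are assumptions for illustration; the actual MongoDB client wiring is omitted.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

// Hypothetical helper for reading the mongoDbServerAddresses environment
// variable. Assumes multiple addresses arrive comma-separated.
public class MongoAddresses {

    // Split a raw value such as "mongodb://h1:27017,mongodb://h2:27017".
    public static List<String> parse(String raw) {
        if (raw == null || raw.trim().isEmpty()) {
            return Collections.emptyList();
        }
        List<String> addresses = new ArrayList<>();
        for (String part : raw.split(",")) {
            addresses.add(part.trim());
        }
        return addresses;
    }

    // Inside a Lambda function, the value comes from the environment.
    public static List<String> fromEnv() {
        return parse(System.getenv("mongoDbServerAddresses"));
    }
}
```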


For any CI platform, be sure to assign the environment variables USERAWSACCESSKEY and USERAWSSECRETKEY to the respective values assigned by AWS. Refer to your CI platform's documentation for how to assign environment variables.


See the available options below.


Option Name             | Description                                              | Type    | Values
service                 |                                                          | input   |
buildAndCopyResultLayer | build and push DAO layer as an executable jar file       | boolean | default: true
securityRemoteUrl       | security account with URL to copy DAO Restful API to     | input   |
fileLocation            | file location on remote service to copy DAO Restful API  | select  | react, angular
privateKeyFileLocation  | private SSH key file location                            |         |
passphrase              | passphrase for key file, leave empty if none             | input   |


See the usage instructions below.

Deploy the Generated Restful API Layer

In the root directory of the generated project is a generated Maven file, pom-restful-api.xml, containing the declarations and dependencies needed to run the Restful API Layer within the Spark container.
mvn clean install -f pom-restful-api.xml
By default, Spark listens on port 4567, so the running API is accessible through the browser at http://localhost:4567/.