Serverless Architecture Implementation – Top Use Cases
1. Building Web Applications
You can build serverless web applications and backends using AWS Lambda, Amazon API Gateway, Amazon S3, and Amazon DynamoDB to handle web, mobile, Internet of Things (IoT), and chatbot requests. The call flow for a serverless application starts with the application hosted on S3, unlike an EC2 instance, which must be provisioned and managed. On an event trigger, the app calls a REST API endpoint and the corresponding Lambda function is invoked. Lambda executes the function to fetch details from DynamoDB and returns the data to the user. DynamoDB, being a non-relational database, fetches data quickly even when the data size runs into terabytes.
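The call flow above can be sketched as a minimal Lambda handler sitting behind an API Gateway proxy integration. The table name, key, and event shape here are illustrative assumptions, and the DynamoDB lookup is stubbed out (the real boto3 call is shown in a comment) so the sketch stays self-contained.

```python
import json

# In a real deployment the handler would read from DynamoDB, e.g.:
#   import boto3
#   table = boto3.resource("dynamodb").Table("Orders")  # table name is an assumption
#   item = table.get_item(Key={"orderId": order_id}).get("Item")
# Here the lookup is stubbed so the sketch stays self-contained.
FAKE_TABLE = {"42": {"orderId": "42", "status": "shipped"}}

def handler(event, context):
    """Handle an API Gateway proxy event such as GET /orders/{orderId}."""
    order_id = event.get("pathParameters", {}).get("orderId")
    item = FAKE_TABLE.get(order_id)
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(item)}
```

Calling `handler({"pathParameters": {"orderId": "42"}}, None)` returns a 200 response whose body is the stored item; an unknown id yields a 404.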
2. Real-time Stream Processing
You can track application activity using Lambda and Kinesis to process real-time streaming data. As depicted in the figure below, Kinesis is well suited to analyzing real-time data. For example, if certain hashtags are being tracked in a social network feed, Kinesis can ingest millions of raw records, trigger Lambda, and generate trending information, which is stored in DynamoDB and can be used for reporting with Amazon's own BI tool, QuickSight.
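The hashtag-trending example can be sketched as a Lambda handler consuming a batch of Kinesis records. The event shape follows the standard Kinesis-to-Lambda integration (base64-encoded payloads under `Records[].kinesis.data`); the assumption that each payload is a JSON post with a `text` field is illustrative, and the final write to DynamoDB is left as a comment.

```python
import base64
import json
import re
from collections import Counter

def handler(event, context):
    """Count hashtags across a batch of Kinesis records."""
    counts = Counter()
    for record in event["Records"]:
        # Kinesis delivers each record's payload base64-encoded.
        payload = base64.b64decode(record["kinesis"]["data"]).decode("utf-8")
        post = json.loads(payload)  # assumes JSON posts with a "text" field
        counts.update(re.findall(r"#\w+", post.get("text", "").lower()))
    # In production the counts would be written to DynamoDB,
    # from which QuickSight can report trending hashtags.
    return dict(counts)
```

Because hashtags are lowercased before counting, `#AWS` and `#aws` aggregate into a single trend entry.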
3. Extract, Transform, Load
You can use AWS Lambda to perform data validation, filtering, sorting, or other transformations for every data change in a DynamoDB table and load the transformed data to another data store.
AWS Lambda can be used for ETL as well. Triggers from a web or mobile app can invoke Lambda, which refines the data and moves it to a data warehouse such as Amazon Redshift, from which QuickSight or a Tableau-based reporting tool can surface trends. Lambda handles the transformation from the operational DynamoDB store to the Redshift data warehouse seamlessly.
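A minimal sketch of the transform step, assuming the Lambda is wired to a DynamoDB Stream: each record carries a typed `NewImage` (e.g. `{"S": ...}`, `{"N": ...}`), which is flattened into plain rows ready for a warehouse load. The staging to S3 and the Redshift `COPY` that would follow are omitted here.

```python
def _plain(attr):
    """Collapse a DynamoDB-typed attribute ({"S": ...}, {"N": ...}) to a plain value."""
    if "S" in attr:
        return attr["S"]
    if "N" in attr:
        return float(attr["N"])
    if "BOOL" in attr:
        return attr["BOOL"]
    return None  # other DynamoDB types omitted in this sketch

def handler(event, context):
    """Flatten DynamoDB Stream INSERT/MODIFY records into warehouse-ready rows."""
    rows = []
    for record in event["Records"]:
        if record["eventName"] not in ("INSERT", "MODIFY"):
            continue
        image = record["dynamodb"]["NewImage"]
        rows.append({k: _plain(v) for k, v in image.items()})
    return rows
```

REMOVE events are skipped, so deletions in the operational table do not produce warehouse rows in this sketch.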
4. Building Mobile Backends
Developers can build mobile backends using Lambda and API Gateway to authenticate and process API requests. Lambda eases the process of creating rich, personalized app experiences. Other use cases include powering chatbot logic and voice-enabled apps using Amazon Alexa.
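The authentication step can be sketched as a Lambda (custom) authorizer: API Gateway invokes it with the caller's token and expects back an IAM policy document allowing or denying the request. The token table here is a placeholder assumption; a real backend would validate a JWT or delegate to Amazon Cognito instead.

```python
# Placeholder token store; a real authorizer would verify a JWT or query Cognito.
VALID_TOKENS = {"secret-token": "user-123"}

def authorizer(event, context):
    """Lambda authorizer: map the incoming token to an Allow/Deny IAM policy."""
    token = event.get("authorizationToken", "")
    principal = VALID_TOKENS.get(token)
    effect = "Allow" if principal else "Deny"
    # API Gateway expects a policy document in this shape.
    return {
        "principalId": principal or "anonymous",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event.get("methodArn", "*"),
            }],
        },
    }
```

API Gateway caches the returned policy, so subsequent calls with the same token skip the authorizer invocation.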
An API Call Flow with Serverless Architecture
Now that we have seen the top use cases of a serverless architecture, let's take a look at how API management works in a serverless application. Amazon API Gateway enables developers to build scalable APIs that run on the AWS serverless Lambda platform (which executes code directly), on Elastic Compute Cloud (EC2), or on services hosted outside AWS. The API gateway routes all incoming requests from the web or mobile app to EC2 or any other publicly available endpoints, which can take the form of multiple microservices, and aggregates the results back to the user. API gateways not only add an additional layer of security to the microservices, but also avoid exposing service discovery details or versions to the client.
Amazon’s content delivery network, CloudFront, can cache fetched objects at an edge location closer to the user, making the application faster. Any downtime or API failures can be monitored using Amazon CloudWatch, with the relevant teams notified via email. In a serverless architecture, the event trigger is initiated by an application residing on S3, from which the API gateway routes the request to the respective microservice and returns a response to the user.
Publishing Applications to the Repository
You can publish applications in the AWS Serverless Application Repository to share solutions with other developers or to help your customers quickly understand the value of the products and services you sell and support. Anyone with an AWS account can publish a serverless application or application component to the AWS Serverless Application Repository. You can share your published applications within your team, across your organization, or with the community at large. Publicly shared applications must include a link to the application’s source code so others can view what the application does and how it works. All you need to do is package, publish, and share.
Overcoming Security Concerns in Serverless Architecture
One of the major concerns of going serverless is sacrificing security for convenience, along with vendor lock-in. Some of the security concerns can be addressed as follows:
- Event injection – This can be solved with input validation and pre-defined database layer logic, such as an ORM or stored procedures.
- Broken authentication – This can be solved with built-in authentication/authorization solutions and avoiding vulnerable deployment settings.
- Insecure deployment settings – This can be avoided by never using publicly readable access control lists and keeping files encrypted.
- Misuse of permissions and roles – This can be evaded by applying the age-old principle of least privilege.
- Insufficient logging – Third-party tools such as Dashbird or CloudWatch can help resolve this concern.
- Insecure storing of app secrets – AWS KMS can be used to encrypt your application secrets.
- DoS attacks – Attacks can be avoided by writing efficient code, using timeouts and throttling.
- Improper exception handling – Console-based logging of stack traces, or log files, help to address this concern. You can also hide stack traces from the end user.
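The input-validation defence against event injection from the first bullet can be sketched as a whitelist validator that runs before any payload reaches the database layer. The field names and rules below are illustrative assumptions.

```python
import re

# Whitelist rules: only these fields, only these shapes, may pass through.
# Field names and patterns are illustrative assumptions.
RULES = {
    "username": re.compile(r"^[A-Za-z0-9_]{3,32}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
}

def validate(payload):
    """Return only the validated fields, raising ValueError on anything unexpected."""
    clean = {}
    for field, pattern in RULES.items():
        value = payload.get(field, "")
        if not isinstance(value, str) or not pattern.match(value):
            raise ValueError(f"invalid value for {field}")
        clean[field] = value
    extra = set(payload) - set(RULES)
    if extra:
        raise ValueError(f"unexpected fields: {sorted(extra)}")
    return clean
```

Rejecting unexpected fields outright, rather than silently dropping them, keeps injected event attributes from ever reaching downstream logic.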
While there are vendor lock-in concerns in going serverless, the ease of use and the elimination of server management tilt the scale in favor of AWS. Developers perceive security concerns in using AWS serverless architecture, but most of these can be addressed by following coding best practices. AWS Lambda, a new way to run event-driven applications as a service, is well worth trying.