How to Use AWS Auto Scaling With Logi Analytics Applications

Managing web traffic is a critical part of any web application, and load balancing is a common and efficient solution. Load balancing distributes workloads across resources, aiming to maximize throughput, minimize response time, and avoid overloading any single resource. Combining auto scaling with load balancing allows your system to grow and shrink its distributed resources as needed, providing a seamless experience to the end user. The scalability and elasticity features of AWS can be readily used by any web application built on AWS.

Considerations for Setting Up a Logi App on AWS

Applications built with Logi have shared components such as Cached Data Files and Bookmark Files. To provide a seamless experience to the end user, Logi must be configured to share these common files across multiple instances as the system scales its resources.

To support these shared resources, a shared file system is required that every instance can access as resources are allocated and deallocated. Amazon Elastic File System (EFS) is the shared file system service on AWS. Currently, EFS can only be mounted on Linux-based instances and is not supported on Windows.
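As a rough illustration, such a shared file system can be provisioned with the AWS SDK. The sketch below uses Python and boto3; the region, subnet, and security group IDs are placeholders that would need to be replaced with values from your own VPC.

```python
import time

import boto3

# Minimal sketch, assuming placeholder region/subnet/security group values.
efs = boto3.client("efs", region_name="us-east-1")

# Create the shared file system that will hold the Logi common files.
fs = efs.create_file_system(
    CreationToken="logi-shared-files",    # any unique string (idempotency token)
    PerformanceMode="generalPurpose",
)
fs_id = fs["FileSystemId"]

# Wait until the file system is available before adding mount targets.
while efs.describe_file_systems(FileSystemId=fs_id)["FileSystems"][0]["LifeCycleState"] != "available":
    time.sleep(5)

# Create a mount target in the subnet where the Linux instances run so they
# can mount the file system over NFS (TCP 2049 must be open in the security group).
efs.create_mount_target(
    FileSystemId=fs_id,
    SubnetId="subnet-0123456789abcdef0",        # placeholder
    SecurityGroups=["sg-0123456789abcdef0"],    # placeholder
)

print(f"EFS file system {fs_id} is ready")
```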

To support auto scaling, the shared file locations need to be defined in the application settings so that new instances come pre-configured when the auto scaling group allocates them. Any server added to the auto scaling group must already contain your Logi application and the correct connections to the shared resources, as sketched below.
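One way to bake in that pre-configuration is through the launch template the auto scaling group uses. The following boto3 sketch is only an illustration: the AMI ID, instance type, template name, and the user data script (here mapping a hypothetical SMB share, as discussed in the next section) are all assumptions, not Logi-specific values.

```python
import base64

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Hypothetical bootstrap script for a Windows instance: map the shared file
# location at boot so the Logi application can find its common files.
# The share path is a placeholder for your own file server or gateway.
user_data = r"""<powershell>
net use Z: \\fileserver.example.internal\logi-shared /persistent:yes
</powershell>"""

# Launch template used by the auto scaling group; the AMI is assumed to
# already contain the Logi application.
ec2.create_launch_template(
    LaunchTemplateName="logi-web-app",              # assumed name
    LaunchTemplateData={
        "ImageId": "ami-0123456789abcdef0",         # placeholder
        "InstanceType": "t3.large",                 # assumed size
        "UserData": base64.b64encode(user_data.encode()).decode(),
    },
)
```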

Recommendations for Logi Apps on AWS

The solution to the EFS limitation for Windows-based Logi apps is to add a middle layer of Linux-based EC2 instances to the architecture. The EFS volumes are mounted on the Linux instances and then made available to the Windows servers over the SMB protocol. By adding these Linux instances, along with an associated load balancer, it becomes possible to use an EFS volume even though the Windows operating system cannot mount it directly.
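The bootstrap for those Linux instances can be scripted. The sketch below is one possible approach, run as root on each Linux instance at launch: it mounts the EFS volume over NFS and re-exports it as an SMB share with Samba. The file system ID, region, and share name are placeholders, and the Samba service name varies by distribution.

```python
import subprocess

EFS_DNS = "fs-0123456789abcdef0.efs.us-east-1.amazonaws.com"  # placeholder EFS ID/region
MOUNT_POINT = "/mnt/efs"
SHARE_NAME = "logi-shared"                                     # assumed share name

def run(cmd):
    """Run a shell command and fail loudly if it does not succeed."""
    subprocess.run(cmd, shell=True, check=True)

# Mount the EFS volume over NFSv4.1 with the mount options recommended for EFS.
run(f"mkdir -p {MOUNT_POINT}")
run(
    "mount -t nfs4 -o nfsvers=4.1,rsize=1048576,wsize=1048576,hard,timeo=600,retrans=2 "
    f"{EFS_DNS}:/ {MOUNT_POINT}"
)

# Append a Samba share definition so the Windows servers can reach the same
# files over SMB.
share_config = f"""
[{SHARE_NAME}]
   path = {MOUNT_POINT}
   browseable = yes
   read only = no
"""
with open("/etc/samba/smb.conf", "a") as conf:
    conf.write(share_config)

# Restart Samba to pick up the new share (service is 'smb' on Amazon Linux,
# 'smbd' on Debian/Ubuntu).
run("systemctl restart smb")
```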

Amazon Machine Images (AMIs) allow the developer to capture a single instance containing the Logi web app, the server and user settings for accessing EFS, and their specific Logi application. This AMI can then be used by the auto scaling group to launch and terminate instances. With the settings and report definitions saved in the AMI, and the shared elements such as Bookmarks and the data cache stored on EFS, it becomes possible to use AWS load balancers to implement a distributed and scalable Logi web application.
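Putting the pieces together, the configured instance can be captured as an AMI and handed to an auto scaling group behind the load balancer. A minimal boto3 sketch follows; the instance ID, launch template name, subnets, and target group ARN are placeholders, and the launch template is the one assumed in the earlier sketch.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
autoscaling = boto3.client("autoscaling", region_name="us-east-1")

# Capture the fully configured Logi server as an AMI.
image = ec2.create_image(
    InstanceId="i-0123456789abcdef0",   # placeholder: the configured Logi instance
    Name="logi-web-app-v1",             # assumed image name
)
ec2.get_waiter("image_available").wait(ImageIds=[image["ImageId"]])

# Point the (assumed) launch template from the earlier sketch at the new AMI.
ec2.create_launch_template_version(
    LaunchTemplateName="logi-web-app",
    SourceVersion="1",
    LaunchTemplateData={"ImageId": image["ImageId"]},
)

# Auto scaling group that launches and terminates instances from that template
# and registers them with the load balancer's target group.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="logi-web-asg",                       # assumed name
    LaunchTemplate={"LaunchTemplateName": "logi-web-app", "Version": "$Latest"},
    MinSize=2,
    MaxSize=6,
    DesiredCapacity=2,
    VPCZoneIdentifier="subnet-0123456789abcdef0,subnet-0fedcba9876543210",  # placeholder subnets
    TargetGroupARNs=[
        "arn:aws:elasticloadbalancing:us-east-1:123456789012:targetgroup/logi-web/0123456789abcdef"  # placeholder
    ],
)
```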

A step-by-step guide covering all the technical details of configuring a Logi application to harness the scalability and elasticity features of AWS can be found here.
