Recently we were looking at a few robust and cost-effective ways of replicating the data that resides in our production MongoDB to a PostgreSQL database for data warehousing and business intelligence.

We set ourselves the following criteria for the optimal tool that would do this job:

- The data replication must be near real-time, yet it should NOT impact the production database
- The data replication must be horizontally scalable (based on the load), asynchronous & crash-resilient

Based on the above criteria, we selected the following tools to perform the end-to-end data replication:

We chose MongoDB Stitch for picking up the changes in the source database. It is the serverless platform from MongoDB. One of the services offered by MongoDB Stitch is Stitch Triggers. Using Stitch Triggers, you can execute a serverless function (in Node.js) in real time in response to changes in the database. When there are a lot of database changes, Stitch automatically "feeds forward" these changes through an asynchronous queue.

We chose Amazon SQS as the pipe / message backbone for communicating the changes from MongoDB to our own replication service. Interestingly enough, MongoDB Stitch offers integration with AWS services. In the Node.js function, we wrote minimal functionality to communicate the database changes (insert / update / delete / replace) to Amazon SQS.

Next we wrote a minimal micro-service in Python to listen to the message events on SQS, pick up the data payload & mirror the DB changes onto the target data warehouse. We implemented source-data-to-target-data translation by modelling target table structures through SQLAlchemy. We deployed this micro-service as an AWS Lambda with Zappa. With Zappa, deploying your services as event-driven & horizontally scalable Lambda services is dumb-easy.
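Wiring such a Lambda to the queue takes only a few lines of Zappa configuration. A hedged sketch — the module/function names, region, and queue ARN below are placeholders, not the original project's settings:

```json
{
    "production": {
        "app_function": "replicator.app",
        "aws_region": "us-east-1",
        "events": [{
            "function": "replicator.handle_change",
            "event_source": {
                "arn": "arn:aws:sqs:us-east-1:123456789012:mongo-changes",
                "batch_size": 10,
                "enabled": true
            }
        }]
    }
}
```

With an event source like this, AWS invokes the function with batches of queue messages, and Lambda scales the number of concurrent invocations with the queue depth — the horizontal scalability called out in the criteria above.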
Back in 2014, I was given an opportunity to re-architect the SmartZip Analytics platform and its flagship product: SmartTargeting. This is a SaaS software helping real estate professionals keep up with their prospects and leads in a given neighborhood/territory, finding out (thanks to predictive analytics) who's the most likely to list/sell their home, and running cross-channel marketing automation against them: direct mail, online ads, email. The company also provides Data APIs to Enterprise customers.

I had inherited years and years of technical debt and I knew things had to change radically. The first enabler to this was to make use of the cloud and go with AWS, so we would stop re-inventing the wheel and build around managed/scalable services.

For the SaaS product, we kept on working with Rails as this was what my team had the most knowledge in. We've however broken up the monolith and decoupled the front-end application from the backend thanks to the use of Rails API, so we'd get independently scalable micro-services from now on.

Our various applications could now be deployed using AWS Elastic Beanstalk, so we wouldn't waste any more effort writing time-consuming Capistrano deployment scripts, for instance. Combined with Docker, our applications would run within their own containers, independently from the underlying host configuration.

Storage-wise, we went with Amazon S3 and ditched any pre-existing local or network storage people used to deal with in our legacy systems.

On the database side: Amazon RDS / MySQL initially, ultimately migrated to Amazon RDS for Aurora / MySQL when it got released. Once again, here you need a managed service your cloud provider handles for you.

Future improvements / technology decisions included:

- Search: Elasticsearch / Amazon Elasticsearch Service / Algolia

As our usage grew, patterns changed, and our business needs evolved, my role as Engineering Manager then Director of Engineering was also to ensure my team kept on learning and innovating, while delivering on business value. One of these innovations was to get ourselves into serverless: adopting AWS Lambda was a big step forward. At the time it was only available for Node.js (not Ruby), but it was a great way to handle cost efficiency, unpredictable traffic, and sudden bursts of traffic. Ultimately you want the whole chain of services involved in a call to be serverless, and that's when we started leveraging Amazon DynamoDB on these projects so they'd be fully scalable.
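The Elastic Beanstalk + Docker combination described above essentially boils down to shipping a container definition alongside each application. A minimal single-container sketch — the image name and port are invented for illustration, not taken from the actual project:

```json
{
  "AWSEBDockerrunVersion": "1",
  "Image": {
    "Name": "example-org/rails-api:latest",
    "Update": "true"
  },
  "Ports": [
    { "ContainerPort": "3000" }
  ]
}
```

Beanstalk reads a `Dockerrun.aws.json` like this, pulls the image, and manages the EC2 instances, load balancing, and rolling deployments itself — replacing the hand-written Capistrano scripts entirely.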