Who is the Company

A multinational corporation.

The Challenge

Managing warehouse inventory requires constant communication with many internal and external systems, including transportation and fulfillment systems, and this communication often involves exchanging numerous messages. The company needed a platform to integrate its warehouse management system with any other system and exchange data accurately in real time, regardless of differences in message format or protocol.

In brief, the company was looking for a way to:

  • Interconnect diverse systems: They needed a uniform, scalable way to get their many systems talking to one another.
  • Produce accurate and verifiable results: Just getting different systems talking wasn’t enough. The company also needed to ensure that accurate, verified data reached each destination in the correct format.
  • Improve communication speed: Accurate intercommunication between systems was taking too long in the existing setup.
  • Future-proof the intercommunication system: New systems had to be quick and easy to plug into the mix as they were added down the road.

The Solution

To meet the company’s needs, our Application Development and Quality Engineering teams worked together to develop a comprehensive node integration framework. This solution provides seamless connectivity to sources such as Kafka, Google Cloud Platform (GCP) topics, and S3 buckets, while also exposing endpoints over HTTP. It can consume messages across these different protocols and then process them.
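The protocol-agnostic consumption described above can be sketched as a single source abstraction that every protocol adapter implements. This is a minimal illustrative sketch only; the names (MessageSource, InMemorySource, drain) are our own, not the company's actual classes.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Hypothetical uniform abstraction: Kafka, GCP, S3, and HTTP adapters would
// each implement poll(), so the rest of the framework stays protocol-agnostic.
interface MessageSource {
    List<String> poll();
}

// A trivial in-memory adapter standing in for a real protocol adapter.
class InMemorySource implements MessageSource {
    private final List<String> pending = new ArrayList<>();

    void publish(String msg) { pending.add(msg); }

    public List<String> poll() {
        List<String> out = new ArrayList<>(pending);
        pending.clear();
        return out;
    }
}

class IntegrationFramework {
    // Dispatch every polled message to one handler, regardless of source protocol.
    static int drain(MessageSource source, Consumer<String> handler) {
        List<String> batch = source.poll();
        batch.forEach(handler);
        return batch.size();
    }
}
```

Adding a new protocol then means writing one adapter class, which is what makes the design future-proof.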

We also implemented an automation testing framework. The solution allows a single consumed message to be published to separate destinations, and it enables parallel transformations (applying different types of changes to the same message) as well as data filtering. On startup, each service connects to DynamoDB, retrieves its configuration, establishes connectivity, and begins consuming messages.
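The parallel-transformation idea can be illustrated with a small fan-out sketch: one message, several independent transformations, each result routed to its own destination, with a null result meaning the message was filtered out. The class and route names here are hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.UnaryOperator;

// Illustrative fan-out: the same input message is transformed independently
// for each destination; returning null filters the message for that route.
class FanOut {
    private final Map<String, UnaryOperator<String>> routes = new LinkedHashMap<>();

    FanOut route(String destination, UnaryOperator<String> transform) {
        routes.put(destination, transform);
        return this;
    }

    // Apply every transformation to the same message and collect what survives.
    Map<String, String> dispatch(String message) {
        Map<String, String> delivered = new LinkedHashMap<>();
        routes.forEach((dest, transform) -> {
            String payload = transform.apply(message);
            if (payload != null) delivered.put(dest, payload);
        });
        return delivered;
    }
}
```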

Processing encompasses transforming the received data into a new payload, enriching it with supplementary information, and publishing it to a specified destination. The destination can be a Kafka topic, an alternative HTTP endpoint, or a GCP topic, with Amazon DynamoDB providing secure cloud-based data storage.

Here are the key components of our solution:

• Source: The new system currently handles four protocols: Kafka, GCP, HTTP, and S3.
• Destination: We utilize four protocols here: NSP, GCP, HTTP, and Amazon DynamoDB.
• Receiver Service: This component ingests messages from the Amazon S3 bucket and forwards them to Simple Queue Service (SQS) queues.
• Processor Service: This service extracts data from SQS queues. Depending on the message type, it transforms the message, appends additional information, and then publishes it to the outbound queue.
• Delivery Service: This service retrieves messages from the outbound queue and publishes them to multiple destinations.
• DynamoDB: The storage component that holds all the configurations.
• Status: This feature tracks the state of each message and supports retrying failed payloads.
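The receiver/processor/delivery chain above can be sketched end to end in memory, with ArrayDeque queues standing in for SQS, a map standing in for DynamoDB-backed status tracking, and a toy enrichment step. This is an assumption-laden sketch of the flow, not the actual services.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

// Minimal in-memory model of the pipeline: receiver -> inbound queue ->
// processor (transform + enrich) -> outbound queue -> delivery.
class Pipeline {
    final Deque<String> inbound = new ArrayDeque<>();   // stand-in for the SQS inbound queue
    final Deque<String> outbound = new ArrayDeque<>();  // stand-in for the outbound queue
    final Map<String, String> status = new HashMap<>(); // message -> state, used for retries

    // Receiver Service: ingest a raw message (e.g. an S3 object body).
    void receive(String raw) {
        inbound.add(raw);
        status.put(raw, "RECEIVED");
    }

    // Processor Service: transform the message, append extra information,
    // and publish it to the outbound queue.
    void process() {
        String msg;
        while ((msg = inbound.poll()) != null) {
            String enriched = msg.toUpperCase() + "|enriched"; // toy transformation
            outbound.add(enriched);
            status.put(msg, "PROCESSED");
        }
    }

    // Delivery Service: drain the outbound queue toward its destinations.
    int deliver() {
        int delivered = 0;
        while (outbound.poll() != null) delivered++;
        return delivered;
    }
}
```

Because every message's state is recorded, a failed payload can be located by status and replayed, which is the role the Status feature plays in the real system.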

Business Impact

• Uniform system interconnectivity improves inventory handling: With the new system, the company’s various systems can exchange information quickly and accurately. This has significantly enhanced profitability and reduced waste, because the company can now react promptly to changing market conditions in its different regions.
• 5x performance improvement: The old system required around five minutes to functionally test a single event, whereas the new system, with automation, takes only one minute. A fivefold speedup allows the company to operate much more efficiently.
• Increased transparency allows for a proactive response: The solution provides a robust error-handling mechanism with visibility into both raw and transformed messages through a user interface. This transparency enables the company to deal with error conditions promptly and proactively.

Technologies Used

Development tools: Apache Camel, Spring Boot, Amazon Web Services (AWS), Kafka
Testing tools: Cucumber, Spring Framework, REST Assured, AWS Java SDK, Atlassian Jira

Related Capabilities

Utilize Actionable Insights from Multiple Data Hubs to Gain More Customers and Boost Sales

Unlock the power of the data insights buried deep within your diverse systems across the organization. We empower businesses to effectively collect, beautifully visualize, critically analyze, and intelligently interpret data to support organizational goals. Our team ensures good returns on big data technology investments through effective use of the latest data and analytics tools.
