Connecting AI microservices with a back-end API running on Azure
For one of our enterprise clients, we recently built a system that integrates an AI microservice with their back-end API running on Azure, connecting the new AI capabilities directly to the existing software stack.
At the core of the architecture, Azure Service Bus acts as a task manager, enabling the AI engine to pick up jobs from the back-end API. The engine processes each job in an iterative loop, sending follow-up messages back to the bus until the job is complete. Azure AI Document Intelligence handles document preprocessing, streamlining inputs before they reach the AI engine. To provide full visibility, we monitor LLM calls with Langfuse, use Azure Blob Storage for data handling, and expose a custom API for debugging and health checks.
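To make the message flow a bit more concrete, here is a minimal sketch of what such a worker loop could look like with the azure-servicebus Python SDK. The queue names, the connection-string variable, and process_job() are illustrative placeholders, not the client's actual implementation.

```python
import json
import os

from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Assumed configuration: env var and queue names are placeholders.
CONN_STR = os.environ["SERVICE_BUS_CONNECTION_STRING"]
TASK_QUEUE = "ai-tasks"          # jobs enqueued by the back-end API
FOLLOWUP_QUEUE = "ai-followups"  # follow-up messages until a job completes


def process_job(payload: dict) -> dict:
    """Placeholder for one iteration of the AI engine's work.

    In the real system this step would call the LLM (traced via Langfuse)
    and read/write intermediate artifacts in Azure Blob Storage.
    """
    return {"job_id": payload.get("job_id"), "done": True}


def run_worker() -> None:
    client = ServiceBusClient.from_connection_string(CONN_STR)
    with client:
        receiver = client.get_queue_receiver(queue_name=TASK_QUEUE)
        sender = client.get_queue_sender(queue_name=FOLLOWUP_QUEUE)
        with receiver, sender:
            for msg in receiver:
                payload = json.loads(str(msg))
                result = process_job(payload)
                receiver.complete_message(msg)  # remove the job from the queue
                if not result["done"]:
                    # Not finished yet: send a follow-up message back to the bus
                    sender.send_messages(ServiceBusMessage(json.dumps(result)))


if __name__ == "__main__":
    run_worker()
```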
#softwarestudio #azure #langfuse #AI
Let's build. Together!
Are you looking for an entrepreneurial product development partner? Don't hesitate to schedule a virtual coffee.

