InfoQ Architecture·April 2, 2026

Helidon 4.4.0: Microservices Framework Enhancements, Virtual Threads, and AI Agent Support

Helidon 4.4.0 introduces significant updates for microservices development, including an alignment with OpenJDK's release cadence and support for virtual threads through its Helidon Níma web server. The release also enhances its declarative programming model and integrates advanced AI agent support via LangChain4j, targeting modern, performant, and AI-driven applications.


Helidon 4.4.0, a microservices framework from Oracle, brings several architectural and development improvements. Most notably, the project now follows OpenJDK's six-month release cadence, promising more predictable and frequent updates in step with core Java development. This shift affects long-term support and compatibility planning for microservice deployments.

Virtual Threads and Helidon Níma

A key architectural highlight is the foundational use of virtual threads (Project Loom) within the Helidon Níma web server. Introduced in Helidon 4.0, Níma leverages JEP 444 to provide a highly scalable, non-blocking I/O model without the complexity of traditional reactive programming. This enables developers to write synchronous-style code that achieves asynchronous performance, reducing context switching overhead and improving resource utilization in high-concurrency microservices.

💡 Architectural Benefit of Virtual Threads

Virtual threads can dramatically simplify the development of high-concurrency microservices by allowing traditional blocking APIs to run efficiently on a small number of platform threads. This reduces the cognitive load for developers while improving overall system throughput.
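Helidon Níma's internals are not shown in the article; as a minimal plain-Java sketch of the JEP 444 model it builds on, the snippet below (class and method names are illustrative, not Helidon APIs) runs many blocking tasks, one virtual thread each, using only the JDK 21 standard library:

```java
import java.time.Duration;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.IntStream;

public class VirtualThreadDemo {

    // Runs `tasks` blocking jobs, each on its own virtual thread, and
    // returns how many completed. The blocking sleep parks the virtual
    // thread without tying up a platform (carrier) thread.
    public static int runBlockingTasks(int tasks) {
        AtomicInteger done = new AtomicInteger();
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, tasks).forEach(i -> executor.submit(() -> {
                try {
                    Thread.sleep(Duration.ofMillis(10)); // blocking call
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                done.incrementAndGet();
            }));
        } // close() waits for all submitted tasks to finish
        return done.get();
    }

    public static void main(String[] args) {
        System.out.println(runBlockingTasks(1_000));
    }
}
```

The same synchronous-looking code would exhaust a fixed platform-thread pool at high concurrency; with virtual threads, thousands of blocked tasks cost little more than their stack state.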

Enhanced Declarative Programming and AI Agents

Helidon Declarative, an incubating feature built on Helidon Inject, expands its inversion-of-control model. It now supports additional features like Metrics, Tracing, Security, Validation, and WebSocket client/server, complementing existing HTTP server, scheduling, and fault tolerance capabilities. This declarative approach aims to reduce boilerplate and improve maintainability for microservice components.

Furthermore, the LangChain4j integration has been enhanced with support for agentic AI, allowing developers to build complex workflows and dynamic agents within their microservices. Agents can orchestrate tasks programmatically, enabling sophisticated AI-powered applications on top of ordinary microservice deployments.

java
@Ai.Agent("helidon-mp-expert")
@Ai.ChatModel("openai-cheap-model")
@Ai.Tools(value = ProjectNameGeneratorTool.class)
@Ai.McpClients(value = {"first-mcp-client", "second-mcp-client"})
public interface HelidonMpExpert {

    @UserMessage("""
            You are a Helidon MP expert. Analyze the following user request
            about Helidon MP and provide the best possible answer. Always warn
            against using native image and stress out that Helidon MP requires
            Jakarta APIs. The user request is {{request}}.
            """)
    @Agent(value = "A Helidon MP expert", outputKey = "response")
    String askExpert(@V("request") String request);
}

The introduction of Helidon JSON, optimized for virtual threads, provides efficient JSON processing without reflection at runtime by generating type-safe converters at compile time. This design choice prioritizes performance and debuggability, critical for data-intensive microservices.
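The article does not show Helidon JSON's generated output; as a rough, hand-written illustration of what reflection-free, compile-time-style binding looks like (the class, record, and method names below are hypothetical, not Helidon JSON's actual API), a type-safe converter reads fields directly instead of inspecting them at runtime:

```java
public class JsonConverterDemo {

    // A simple data carrier, analogous to a type a generator would target.
    public record Greeting(String message, int count) {}

    // Type-safe serializer: direct accessor calls, no runtime reflection.
    // A compile-time generator could emit code of exactly this shape.
    public static String toJson(Greeting g) {
        return "{\"message\":\"" + g.message() + "\",\"count\":" + g.count() + "}";
    }

    public static void main(String[] args) {
        System.out.println(toJson(new Greeting("hello", 2)));
        // → {"message":"hello","count":2}
    }
}
```

Because every field access is ordinary Java code, such converters are easy to step through in a debugger, which is the debuggability benefit the release notes highlight.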

Topics: Helidon, Microservices Framework, Virtual Threads, OpenJDK, Java, AI Agents, LangChain4j, Declarative Programming
