Genesis Keynote by Stephan Janssen

Learn how Java is evolving to embrace AI and LLMs through new tools, frameworks, and local inference capabilities while maintaining enterprise-grade stability and security.

Key takeaways
  • AI and large language models are becoming increasingly prominent in Java development, with many new tools and frameworks emerging

  • Java developers can now run LLM inference locally using projects like Jlama, without relying on cloud services or Python (see the first sketch after this list)

  • The Java ecosystem is evolving to support AI workloads while maintaining compatibility and stability, aided by projects like GraalVM and its native-image compilation

  • Developer productivity has increased significantly with AI assistance, though human intelligence and oversight remain essential

  • New tools like DevoxxGenie and LangChain4j enable Java developers to integrate AI capabilities directly into their workflows (see the second sketch after this list)

  • Privacy and data security concerns are driving development of local/private LLM deployment options for enterprise use

  • The Java community is actively working on bridging gaps between Java and AI technologies through various open source projects

  • Modern Java development involves using AI as a “thought partner” while still maintaining control over code quality and architecture

  • The six-month Java release cadence has helped the platform evolve and adapt to support new AI/ML workloads

  • There’s a growing need for Java libraries and tools that make AI capabilities more accessible to developers while preserving Java’s strengths
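
The local-inference takeaway can be illustrated with a minimal sketch. It assumes the langchain4j-jlama integration module rather than Jlama's lower-level API; the model name, builder options, and method names are illustrative and may differ between library versions, and none of this code comes from the keynote itself.

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.jlama.JlamaChatModel;

public class LocalInferenceSketch {
    public static void main(String[] args) {
        // Jlama runs a quantized model entirely inside the JVM:
        // no external inference server, GPU cloud, or Python runtime.
        ChatLanguageModel model = JlamaChatModel.builder()
                .modelName("tjake/TinyLlama-1.1B-Chat-v1.0-Jlama-Q4") // illustrative model id
                .temperature(0.3f)
                .build();

        String answer = model.generate("Summarize Java's six-month release cadence in one sentence.");
        System.out.println(answer);
    }
}
```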
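For the LangChain4j takeaway, a common integration pattern is the AiServices facade, where a plain Java interface is backed by an LLM at runtime. The interface, model choice, and environment variable below are assumptions for illustration, not details from the keynote, and class names vary across LangChain4j versions.

```java
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.openai.OpenAiChatModel;
import dev.langchain4j.service.AiServices;

public class AssistantSketch {

    // Plain Java interface; LangChain4j supplies the implementation at runtime
    interface Assistant {
        String chat(String userMessage);
    }

    public static void main(String[] args) {
        // Illustrative model setup; any ChatLanguageModel implementation could be used here
        ChatLanguageModel model = OpenAiChatModel.builder()
                .apiKey(System.getenv("OPENAI_API_KEY"))
                .modelName("gpt-4o-mini")
                .build();

        Assistant assistant = AiServices.create(Assistant.class, model);
        System.out.println(assistant.chat("Explain Java records in one sentence."));
    }
}
```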