The Era of AAP: AI Augmented Programming using only Java by Stephan Janssen
Discover how Java is becoming AI-ready with local LLM inference, integrated development tools, and frameworks that enable private, efficient AI-augmented programming.
- Java is becoming a first-class citizen in AI programming, with new frameworks such as JLama and Llama3.java enabling local LLM inference
- AAP (AI Augmented Programming) tools can now run large language models locally with context windows of up to 131,000 tokens, eliminating the need for cloud services (see the local-inference sketch after this list)
- The DevoxxGenie plugin for IntelliJ provides open-source AI coding assistance while keeping code private and running models locally
- GPU support through projects such as Babylon and TornadoVM will enable faster matrix multiplications and LLM inference directly in Java
- Model sharding allows distributing large models across multiple machines, making it feasible to run billion-parameter models on consumer hardware
- Token caching and streaming responses (around 20 tokens/second) improve the user experience when working with LLMs
- RAG (Retrieval Augmented Generation) capabilities are evolving from basic implementations to advanced ones built on semantic search and graph databases (a basic variant is sketched after this list)
- Panama Vector API improvements enable faster matrix multiplications natively in Java, especially on ARM processors (see the dot-product kernel after this list)
- Local LLM deployment eliminates privacy concerns about sending proprietary code to cloud services
- The combination of local models, Java frameworks, and IDE integration creates a complete AI-assisted development environment that can increase developer productivity
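The sketch below illustrates the local-first idea from the points above: a plain JDK `java.net.http` client sends a prompt to a model running on the developer's own machine and streams the tokens back as they are generated, so no code ever leaves the workstation. This is a minimal sketch, not JLama's or Llama3.java's actual API; the `localhost:11434` endpoint, the `llama3` model name, and the JSON shape assume an Ollama-style local server and are illustrative only.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LocalLlmDemo {

    public static void main(String[] args) throws Exception {
        // Prompt sent to a model running entirely on the local machine:
        // nothing is forwarded to a cloud service.
        String body = """
                {"model": "llama3", "prompt": "Explain Java records in one paragraph.", "stream": true}
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate")) // assumed local endpoint
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // Stream the response line by line: each line is a small JSON object
        // carrying the next generated token(s), so output appears incrementally.
        HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofLines())
                .body()
                .map(LocalLlmDemo::extractToken)
                .forEach(System.out::print);
    }

    // Crude extraction of the "response" field from a streamed JSON line;
    // a real client would use a JSON library instead.
    private static String extractToken(String jsonLine) {
        int start = jsonLine.indexOf("\"response\":\"");
        if (start < 0) return "";
        start += "\"response\":\"".length();
        int end = jsonLine.indexOf('"', start);
        return end > start ? jsonLine.substring(start, end) : "";
    }
}
```

Because the response is consumed line by line, tokens appear as they are produced, which is what makes a rate of roughly 20 tokens/second feel responsive inside an IDE.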
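On the RAG point, the following is a deliberately basic, self-contained sketch of the idea: embed the available snippets, rank them by cosine similarity against the question (a naive form of semantic search), and prepend the best matches to the prompt before it goes to the local model. The `embed` method here is a hypothetical stand-in; a real pipeline would use an embedding model plus a vector or graph database, which is the advanced end of the spectrum described above.

```java
import java.util.Comparator;
import java.util.List;

public class NaiveRagDemo {

    record Chunk(String text, float[] embedding) {}

    public static void main(String[] args) {
        // In a real pipeline these chunks would come from the project's own code and docs.
        List<Chunk> store = List.of(
                new Chunk("DevoxxGenie keeps code private by running models locally.", embed("local private models")),
                new Chunk("JLama and Llama3.java enable LLM inference in pure Java.", embed("java inference")),
                new Chunk("The Vector API speeds up matrix multiplication on the CPU.", embed("vector api"))
        );

        String question = "How can I run an LLM without sending code to the cloud?";
        float[] queryEmbedding = embed(question);

        // Semantic search: keep the chunks most similar to the question.
        String context = store.stream()
                .sorted(Comparator.comparingDouble(
                        (Chunk c) -> cosine(queryEmbedding, c.embedding())).reversed())
                .limit(2)
                .map(Chunk::text)
                .reduce("", (a, b) -> a + "\n" + b);

        // Augment the prompt with the retrieved context before calling the local model.
        String prompt = "Answer using only this context:" + context + "\n\nQuestion: " + question;
        System.out.println(prompt);
    }

    // Stand-in embedding: hashes characters into a small vector.
    // A real RAG setup uses an embedding model (ideally served locally) instead.
    static float[] embed(String text) {
        float[] v = new float[8];
        for (char ch : text.toLowerCase().toCharArray()) {
            v[ch % v.length] += 1f;
        }
        return v;
    }

    static double cosine(float[] a, float[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return dot / (Math.sqrt(na) * Math.sqrt(nb) + 1e-9);
    }
}
```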
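Finally, the Vector API point in practice: the dot product below is the inner loop of the matrix multiplications that dominate LLM inference, written against the incubating `jdk.incubator.vector` API (run with `--add-modules jdk.incubator.vector`). It processes a full hardware lane's worth of floats per iteration using fused multiply-add and falls back to scalar code for the tail.

```java
import jdk.incubator.vector.FloatVector;
import jdk.incubator.vector.VectorOperators;
import jdk.incubator.vector.VectorSpecies;

public class VectorDotProduct {

    private static final VectorSpecies<Float> SPECIES = FloatVector.SPECIES_PREFERRED;

    // Dot product: the hot inner loop of LLM matrix multiplication.
    static float dot(float[] a, float[] b) {
        var acc = FloatVector.zero(SPECIES);
        int i = 0;
        int upperBound = SPECIES.loopBound(a.length);
        for (; i < upperBound; i += SPECIES.length()) {
            var va = FloatVector.fromArray(SPECIES, a, i);
            var vb = FloatVector.fromArray(SPECIES, b, i);
            acc = va.fma(vb, acc); // fused multiply-add across all lanes
        }
        float sum = acc.reduceLanes(VectorOperators.ADD);
        for (; i < a.length; i++) { // scalar tail for lengths not divisible by the lane count
            sum += a[i] * b[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        float[] a = {1f, 2f, 3f, 4f, 5f, 6f, 7f, 8f, 9f};
        float[] b = {9f, 8f, 7f, 6f, 5f, 4f, 3f, 2f, 1f};
        System.out.println(dot(a, b)); // prints 165.0
    }
}
```

`SPECIES_PREFERRED` picks the widest vector width the hardware supports, which is why the same Java code benefits from wider SIMD units, including those on recent ARM processors.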