During the day-two keynote at DockerCon, Docker, in collaboration with partners Neo4j, LangChain, and Ollama, introduced the GenAI Stack.
The platform is designed to let developers get generative AI applications running within minutes, removing much of the complexity of integrating the underlying technologies themselves.
The GenAI Stack offers a seamless solution by providing pre-configured, ready-to-code, and secure components. These include large language models (LLMs) from Ollama, vector and graph databases from Neo4j, and the LangChain framework.
Docker, a leading name in containerisation technology, also revealed its first AI-powered product, Docker AI, promising a new era in app development.
Scott Johnston, CEO of Docker, said: “Developers are excited by the possibilities of GenAI, but the rate of change, number of vendors, and wide variation in technology stacks makes it challenging to know where and how to start.
Today’s announcement eliminates this dilemma by enabling developers to get started quickly and safely using the Docker tools, content, and services they already know and love together with partner technologies on the cutting edge of GenAI app development.”
The GenAI Stack, available today on Docker Desktop’s Learning Center and GitHub repository, utilises trusted open-source content on Docker Hub to address popular GenAI use cases.
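To picture how such a stack fits together, here is a hypothetical compose-style fragment wiring an LLM runner, a graph database, and an application container. The images, ports, and environment values are illustrative assumptions, not the stack's actual configuration files:

```yaml
# Hypothetical compose fragment sketching the shape of the stack.
# Images, ports, and variables are placeholders, not the official files.
services:
  llm:
    image: ollama/ollama          # local LLM runner
    ports:
      - "11434:11434"             # Ollama's default API port
  database:
    image: neo4j:5                # graph + vector database
    ports:
      - "7687:7687"               # Bolt protocol
    environment:
      - NEO4J_AUTH=neo4j/password # placeholder credentials
  app:
    build: .                      # the LangChain-based application
    depends_on:
      - llm
      - database
```

In practice a `docker compose up` against files like these is what turns several separately-installed tools into a single ready-to-code environment.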
Components of the GenAI Stack include pre-configured open-source LLMs such as Llama 2, Code Llama, and Mistral, as well as proprietary models like OpenAI's GPT-3.5 and GPT-4.
Neo4j serves as the default database, enhancing AI/ML models’ speed and accuracy by uncovering explicit and implicit patterns in data.
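One way to picture the vector-search side of such a database: documents are stored as embedding vectors, and a query retrieves the most similar ones. A deliberately toy sketch with hand-made three-dimensional vectors (real systems use learned embeddings and a proper index, not this brute-force scan):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" standing in for learned vectors stored in the database.
docs = {
    "docker docs":  [0.9, 0.1, 0.0],
    "graph theory": [0.1, 0.8, 0.3],
    "llm guide":    [0.2, 0.2, 0.9],
}

def nearest(query_vec, store):
    """Return the stored document most similar to the query vector."""
    return max(store, key=lambda name: cosine(query_vec, store[name]))

print(nearest([0.85, 0.15, 0.05], docs))  # -> docker docs
```

The graph side adds what pure vector search lacks: explicit relationships between records, which is what the "explicit and implicit patterns" framing refers to.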
Emil Eifrem, Co-Founder and CEO of Neo4j, commented: “The driving force uniting our collective efforts was the shared mission to empower developers, making it very easy for them to build GenAI-backed applications and add GenAI features to existing applications.”
LangChain orchestrates the integration between the LLMs, applications, and databases, offering a framework for developing context-aware reasoning applications.
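The orchestration pattern such a framework provides can be sketched in deliberately library-free Python: format a prompt from a template and retrieved context, call a model, and parse the result. The function names here are illustrative, not LangChain's API, and the model call is a stub standing in for a server such as Ollama:

```python
# Library-free sketch of the prompt -> model -> parser pipeline that
# frameworks like LangChain formalise. All names are illustrative.

def format_prompt(template: str, **kwargs) -> str:
    """Fill a prompt template with retrieved context and the user question."""
    return template.format(**kwargs)

def fake_llm(prompt: str) -> str:
    """Stand-in for a call to a model server such as Ollama."""
    return f"ANSWER: stub response to {len(prompt)} chars of prompt"

def parse_output(raw: str) -> str:
    """Strip the model's framing to recover the usable answer."""
    return raw.removeprefix("ANSWER: ").strip()

def run_chain(question: str, context: str) -> str:
    """Chain the three steps: template -> model -> parser."""
    template = "Use this context:\n{context}\n\nQuestion: {question}"
    prompt = format_prompt(template, context=context, question=question)
    return parse_output(fake_llm(prompt))

print(run_chain("What is the GenAI Stack?", "Docker keynote notes"))
```

Swapping the stub for a real model client, and the hard-coded context for a database lookup, is essentially the integration work the framework handles.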
Harrison Chase, Co-Founder and CEO of LangChain, said: “We’re all here to help teams close the gap between the magical user experience GenAI enables and the work it requires to actually get there. This is a fantastic step in that direction.”
This initiative has received praise from industry experts.
James Governor, Principal Analyst and Co-Founder of RedMonk, commented: “Everything changed this year, as AI went from being a field for specialists to something that many of us use every day. The tooling landscape is, however, really fragmented, and great packaging is going to be needed before general broad-based adoption by developers for building AI-driven apps really takes off.
“The GenAI Stack that Docker, Neo4j, LangChain, and Ollama are collaborating to offer provides the kind of consistent unified experience that makes developers productive with new tools and methods so that we’ll see mainstream developers not just using AI, but also building new apps with it.”
The GenAI Stack announced this week promises to make generative AI more accessible and user-friendly for developers worldwide.
(Photo by Andy Hermawan on Unsplash)