Hello everyone! After sharing my first forays into the fascinating world of Docker Model Runner and how it allowed me to tinker with AI models on my MacBook Pro M1, I bring you some exciting news: I've created a new Drupal module, ai_provider_docker!
As you know, my curiosity about Artificial Intelligence keeps growing. The ability to experiment with models locally, thanks to Docker Model Runner, without worrying about cloud costs or latency, seemed like a golden opportunity. So, after some time exploring and "playing" around, I realized there was no direct way to connect this local model-running capability with Drupal. And, as a keen developer, I couldn't resist the urge to create that connection myself!
An Early Step on a Long Road
This module, ai_provider_docker, represents my initial venture into a deeper understanding of the AI ecosystem as applied to websites. It might become the "definitive module" for Docker Model Runner in Drupal, or it might just be an initial experiment – only time will tell! What I do know is that it's already helping me continue exploring and understanding how AI integrates with our beloved web platforms.
For now, in this early version (you can find it at https://www.drupal.org/project/ai_provider_docker), the main functionality it covers is chat. This means we can use the chat operations of the core Drupal AI module to talk to the models we have available locally through Docker Model Runner, served by its llama.cpp inference engine. That's already a good starting point!
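To give you a taste, here is a minimal sketch of what a chat call looks like through the Drupal AI provider API. Fair warning on my assumptions: I'm using 'docker' as the provider plugin ID and ai/smollm2 as the locally pulled model, so adjust both to whatever your installation actually exposes.

```php
<?php

use Drupal\ai\OperationType\Chat\ChatInput;
use Drupal\ai\OperationType\Chat\ChatMessage;

// Assumption: 'docker' is the plugin ID registered by ai_provider_docker.
$provider = \Drupal::service('ai.provider')->createInstance('docker');

// Build the normalized chat input defined by the Drupal AI module.
$input = new ChatInput([
  new ChatMessage('user', 'Summarize what Docker Model Runner does.'),
]);

// Assumption: 'ai/smollm2' has already been pulled in Docker Model Runner.
$response = $provider->chat($input, 'ai/smollm2')->getNormalized();

// The normalized output is a ChatMessage; print its text.
echo $response->getText();
```

You can try something like this from a small script run via drush php:script. Notice it's the same calling pattern you'd use with any other provider, which is exactly the point of the Drupal AI abstraction.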
For it to work, of course, you need Docker Desktop 4.40 or higher (the version that introduced Model Runner, which is what I mentioned in the previous article; as always, it's best to run the latest compatible release) with the Model Runner functionality enabled. And, naturally, the main dependency is the official Drupal AI module.
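If you haven't enabled it yet, this can all be done from the terminal with the Docker CLI; a quick sketch (the model is just an example, pick any you like from Docker Hub's ai/ namespace):

```bash
# Enable Model Runner and expose it on host TCP port 12434.
docker desktop enable model-runner --tcp 12434

# Pull a small model and give it a quick test from the command line.
docker model pull ai/smollm2
docker model run ai/smollm2 "Hello from Drupal land!"
```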
My Next Challenges: Embeddings and RAG
My next big goal is to integrate the concept of Embeddings into this new provider. It's something I need to understand very well and apply practically. Embeddings are numerical vector representations of text, and they are key to more advanced features like semantic search or RAG (Retrieval-Augmented Generation) architectures, areas I'm passionate about and want to explore thoroughly.
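To make that goal concrete, this is roughly the shape such a call would take through the Drupal AI API once embeddings support lands. To be clear: this is a sketch of where I want to get to, not something the module does today, and the model ID is only an example.

```php
<?php

use Drupal\ai\OperationType\Embeddings\EmbeddingsInput;

// Assumption: same 'docker' plugin ID; embeddings are not implemented yet,
// this only illustrates the Drupal AI embeddings operation type.
$provider = \Drupal::service('ai.provider')->createInstance('docker');

// Assumption: an embedding model such as ai/mxbai-embed-large pulled locally.
$input = new EmbeddingsInput('How do I create a content type in Drupal?');
$vector = $provider->embeddings($input, 'ai/mxbai-embed-large')->getNormalized();

// $vector is an array of floats: store it in a vector database and compare
// by cosine similarity for semantic search or RAG retrieval.
var_dump(count($vector));
```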
Creating this module also led me to dive into the code of other AI providers for Drupal, such as the modules for Gemini, OpenAI, Grok, and many others. It was an incredible learning process, seeing how others have approached these integrations and how the provider architecture works within Drupal AI.
I hope to continue making ai_provider_docker more robust, adding the necessary features as I keep learning. And, of course, my idea is to promote and share this new contribution with my colleagues in the Drupal community. It will be great to see whether others find it useful and whether we can build on it together!
For me, every line of code in this module is another step in my own understanding of AI and its relationship with web development. I'll keep you updated on the progress!