
Local AI (NVidia CUDA 12)
Local AI is an open-source, self-hostable, OpenAI API compatible service with a WebUI for chat, image generation, and more. It works best with the DataCrunch provider and GPU instances.
Requirements
4096 MB RAM / 2 vCPU / 60 GB storage
Source code
Deploy time (approx)
~20 minutes
About Local AI (NVidia CUDA 12)
Local AI is an open-source, self-hostable, OpenAI API compatible service with a chatbot UI. This template requires a server with an NVidia GPU and CUDA 12 configured. We tested it with the DataCrunch provider in DollarDeploy, and it works well with GPU instances of type A6000 or better (1 €/hr).
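Because the service exposes an OpenAI-compatible API, any standard OpenAI client or plain HTTP request can talk to it once deployed. Below is a minimal sketch that builds a chat-completion request against a self-hosted instance; the host, port (`localhost:8080`), and model name (`llama-3`) are assumptions and depend on your deployment and the models you have loaded.

```python
import json
from urllib import request

# Assumed endpoint of a self-hosted Local AI instance; adjust host/port
# to match your deployment.
BASE_URL = "http://localhost:8080"

def build_chat_request(model: str, user_message: str) -> request.Request:
    """Build an OpenAI-style /v1/chat/completions request (not sent here)."""
    payload = {
        "model": model,  # model name depends on what you loaded into Local AI
        "messages": [{"role": "user", "content": user_message}],
        "temperature": 0.7,
    }
    return request.Request(
        url=f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama-3", "Hello!")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (or pointing the official `openai` client at the same base URL) returns a standard chat-completion JSON response.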
About DollarDeploy
DollarDeploy allows you to easily deploy and manage apps on your own VPS without the need for SSH access. Deploy Local AI (NVidia CUDA 12) with just a few clicks and start building your solution today!