
Introduction
Contact us to get the latest Docker image.
All data stays local and is accessible only to the user. LitenAI does not collect any data from users.
Installation Steps
Prepare Your Environment
- Ensure Docker is installed and running.
- Confirm your user is part of the Docker group. For assistance, follow the Docker post-install guide.
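The two checks above can be scripted. A minimal sketch, assuming the default group name docker:

```shell
# Sketch: verify Docker is installed and the current user is in the docker group.
if command -v docker >/dev/null 2>&1; then
    echo "docker found"
else
    echo "docker not found; install Docker first"
fi
if id -nG | grep -qw docker; then
    echo "current user is in the docker group"
else
    echo "not in the docker group; run: sudo usermod -aG docker \$USER (then log out and back in)"
fi
```

If the group check fails, the Docker post-install guide covers adding your user to the group.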
Load the Docker Image
- Download the Docker image from the LitenAI Store.
- Load the image using the command below, replacing <version> with the release number (e.g., 89 for release 0.0.89, giving litenai.v0.0.89.tar.gz).
docker load < litenai.v0.0.<version>.tar.gz
- Verify that the image loaded using the command below. Note the image tag; you will need it to start the container.
docker image ls
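If several images are loaded, the tag can be pulled out of the listing programmatically. A sketch, assuming the repository name litenai/litenai; sample listing output is piped in here for illustration:

```shell
# Sketch: extract the tag column for the litenai/litenai repository.
# In practice, pipe the real listing instead:
#   docker image ls | awk '$1 == "litenai/litenai" { print $2 }'
printf 'REPOSITORY          TAG     IMAGE ID\nlitenai/litenai     0.0.89  abc123def456\n' \
  | awk '$1 == "litenai/litenai" { print $2 }'
```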
Set Environment Variables
- Set your API key to a valid Azure OpenAI, OpenAI, or local API key.
export LITENAI_API_KEY="api_key"
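A quick guard before launching the container catches a missing key early. A sketch, with a placeholder key value:

```shell
# Sketch: abort with a clear message if LITENAI_API_KEY is unset or empty.
export LITENAI_API_KEY="api_key"    # replace with your real key
: "${LITENAI_API_KEY:?LITENAI_API_KEY must be set before starting the container}"
echo "LITENAI_API_KEY is set"
```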
Start the Docker Container
- Replace <tag> with the tag noted from the 'docker image ls' command above.
docker run -d --name litenai_container -p 8210:8210 -p 8221:8221 -e LITENAI_API_KEY=${LITENAI_API_KEY} litenai/litenai:<tag>
Access the Chat Interface in your local browser by visiting the URL below.
http://localhost:8210
Stop the Container
docker stop litenai_container
Sample Chat Sessions
Access the Chat Interface in your local browser by visiting the URL below.
http://localhost:8210
Click on the Lake tab. The Docker environment comes pre-loaded with two lakes: logreason and techassist.
To explore the logreason lake, select it from the Lake tab. You can then follow the sample chat session described in the blog to understand how to analyze customer sessions and identify the root causes of performance issues.
To explore the techassist lake, select it from the Lake tab. After the lake loads, you can follow the sample chat session described in the blog to understand how technicians maintain and repair medical devices.
Configuration for Self-hosted LLM
LitenAI supports locally hosted LLMs. To configure one, provide the URL of the LLM endpoint and specify the model being used. An example configuration is shown below.
Set Environment Variables
export LITENAI_SERVE_URL="http://localhost:8000/v1"
export LITENAI_LLM_MODEL="meta-llama/Llama-3.2-1B-Instruct"
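Before starting the container, a quick shape-check of these values can catch typos. A minimal sketch, reusing the example values above:

```shell
# Sketch: sanity-check the self-hosted LLM settings before launching.
export LITENAI_SERVE_URL="http://localhost:8000/v1"
export LITENAI_LLM_MODEL="meta-llama/Llama-3.2-1B-Instruct"
case "$LITENAI_SERVE_URL" in
  http://*|https://*) echo "serve URL looks well-formed" ;;
  *) echo "serve URL should start with http:// or https://" >&2; exit 1 ;;
esac
[ -n "$LITENAI_LLM_MODEL" ] && echo "model name is set"
```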
Run the Docker command below
docker run -d --name litenai_container -p 8210:8210 -p 8221:8221 -e LITENAI_API_KEY=${LITENAI_API_KEY} -e LITENAI_SERVE_URL=${LITENAI_SERVE_URL} -e LITENAI_LLM_MODEL=${LITENAI_LLM_MODEL} -e LITENAI_AZURE_API_VERSION="" litenai/litenai:<tag>
Access the Chat Interface in your local browser by visiting the URL below.
http://localhost:8210
Configuration for OpenAI Service
LitenAI is configured to use the Azure OpenAI service by default. However, it can also integrate with the OpenAI service.
Set Environment Variables
You will need to obtain an OpenAI API key from your OpenAI account.
export LITENAI_API_KEY="<OpenAI-API-Key>"
Run the Docker command below
Set the serve URL and Azure API version to empty values.
docker run -d --name litenai_container -p 8210:8210 -p 8221:8221 -e LITENAI_API_KEY=${LITENAI_API_KEY} -e LITENAI_SERVE_URL="" -e LITENAI_AZURE_API_VERSION="" litenai/litenai:<tag>
Access the Chat Interface in your local browser by visiting the URL below.
http://localhost:8210