A server to run and interact with LLM models, optimized for Rockchip RK3588(S) and RK3576 platforms. The difference from other software of this type, like Ollama or Llama.cpp, is that RKLLama runs models on the Rockchip NPU.
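As a rough illustration of interacting with such a server, the sketch below posts a chat request to a locally running instance. The host, port, endpoint path, and model name are assumptions, not taken from the RKLLama documentation; adjust them to match your actual setup.

```python
# Minimal sketch: send a chat request to a locally running RKLLama server.
# The port (8080), endpoint path, and model name are assumptions -- check
# the project's documentation for the real values on your installation.
import requests

response = requests.post(
    "http://localhost:8080/api/chat",   # assumed Ollama-style chat endpoint
    json={
        "model": "qwen2.5-3b",          # hypothetical model name
        "messages": [
            {"role": "user", "content": "Hello from the RK3588 NPU!"}
        ],
        "stream": False,
    },
    timeout=120,
)
response.raise_for_status()
print(response.json())
```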
A transparent proxy service that allows applications to use both the Ollama and OpenAI API formats seamlessly with OpenAI-compatible LLM servers such as OpenAI, vLLM, LiteLLM, OpenRouter, Ollama, and any other OpenAI-compatible backend.
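To make the "both formats against one endpoint" idea concrete, here is a hedged sketch of a client sending an OpenAI-style and an Ollama-style request to the same proxy. The base URL, paths, and model name are placeholders, not the actual configuration of any specific proxy.

```python
# Minimal sketch: one proxy address accepting both request shapes.
# The proxy URL, routes, and model name below are assumptions.
import requests

PROXY = "http://localhost:11434"  # hypothetical proxy address

# OpenAI-style chat completion request
openai_resp = requests.post(
    f"{PROXY}/v1/chat/completions",
    json={
        "model": "my-model",  # whatever model the upstream server exposes
        "messages": [{"role": "user", "content": "Hi via the OpenAI format"}],
    },
    timeout=60,
)
print(openai_resp.json())

# Ollama-style chat request against the same proxy
ollama_resp = requests.post(
    f"{PROXY}/api/chat",
    json={
        "model": "my-model",
        "messages": [{"role": "user", "content": "Hi via the Ollama format"}],
        "stream": False,
    },
    timeout=60,
)
print(ollama_resp.json())
```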