LlamaEdge
LlamaEdge allows you to chat with LLMs in GGUF format, both locally and via a chat service.
- `LlamaEdgeChatService` provides developers with an OpenAI-API-compatible service to chat with LLMs via HTTP requests (see the sketch after this list).
- `LlamaEdgeChatLocal` enables developers to chat with LLMs locally (coming soon).
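
Below is a minimal sketch of chatting through `LlamaEdgeChatService` from Python. The import path (`langchain_community.chat_models.llama_edge`), the `service_url` parameter, and the localhost URL are assumptions about the integration rather than details stated on this page; adjust them to match your deployment.

```python
# Minimal sketch: chat with a running LlamaEdge API server through
# LlamaEdgeChatService. Import path, parameter name, and URL are assumptions.
from langchain_community.chat_models.llama_edge import LlamaEdgeChatService
from langchain_core.messages import HumanMessage, SystemMessage

# Point the client at a running LlamaEdge chat service (hypothetical URL).
chat = LlamaEdgeChatService(service_url="http://localhost:8080")

messages = [
    SystemMessage(content="You are a helpful assistant."),
    HumanMessage(content="What is the capital of France?"),
]

# Send the conversation over HTTP and print the model's reply.
response = chat.invoke(messages)
print(response.content)
```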
Both `LlamaEdgeChatService` and `LlamaEdgeChatLocal` run on infrastructure powered by the WasmEdge Runtime, which provides a lightweight, portable WebAssembly container environment for LLM inference tasks.
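
Because the chat service is OpenAI-API compatible, you can also call it without any client library by POSTing to the standard `/v1/chat/completions` endpoint. The base URL, port, and model name below are placeholders, not defaults documented here; substitute the values of your own deployment.

```python
# Hedged sketch: raw HTTP request to an OpenAI-compatible LlamaEdge chat service.
# The base URL, port, and model name are placeholders.
import requests

url = "http://localhost:8080/v1/chat/completions"  # assumed local deployment
payload = {
    "model": "llama-2-7b-chat",  # hypothetical model name served by the API
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
}

resp = requests.post(url, json=payload, timeout=60)
resp.raise_for_status()
# The response follows the OpenAI chat-completions schema.
print(resp.json()["choices"][0]["message"]["content"])
```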