llama-cpp-capacitor

A native Capacitor plugin that embeds llama.cpp directly into mobile apps, enabling offline AI inference with a chat-first API design. It supports both simple text generation and advanced chat conversations with system prompts, as well as multimodal processing, TTS, and LoRA adapters.

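To make the "chat-first" description concrete, here is a minimal usage sketch. The import path, `initLlama`, `completion`, and the option names below are assumptions for illustration (modeled on the llama.rn-style API that llama.cpp bindings commonly expose), not the plugin's documented interface; check the package README for the real signatures.

```ts
// Hypothetical usage sketch -- function and option names are assumptions,
// not the plugin's confirmed API.
import { initLlama } from 'llama-cpp-capacitor'; // assumed export name

async function runOfflineChat(): Promise<string> {
  // Load a local GGUF model bundled with (or downloaded by) the app.
  const context = await initLlama({
    model: 'models/llama-3.2-1b-instruct-q4_k_m.gguf', // hypothetical path
    n_ctx: 2048,      // context window size
    n_gpu_layers: 99, // offload layers to GPU where supported
  });

  // Chat-first call: pass a message list with a system prompt,
  // the same shape as OpenAI-style chat APIs.
  const result = await context.completion({
    messages: [
      { role: 'system', content: 'You are a concise assistant.' },
      { role: 'user', content: 'Summarize what a Capacitor plugin is.' },
    ],
    n_predict: 128, // cap on generated tokens
  });

  await context.release(); // free native resources when done
  return result.text;
}
```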