
Commits: Listings

Analyzed about 15 hours ago, based on code collected about 15 hours ago.
Aug 28, 2024 — Aug 28, 2025
Commit Message | Date
feat(server): emit socket message when a new update is available | over 1 year ago
feat(server): chit-chat duty and skill and more | over 1 year ago
feat(server): make Leon's personality more lively | over 1 year ago
feat(server): switch LLM duties back to using ChatWrapper instead of completions | over 1 year ago
feat(server): Leon's personality done | over 1 year ago
feat(server): personality support kick off; answer queue | over 1 year ago
feat(server): add paraphrase LLM duty to kick off personality attribution | over 1 year ago
feat(server): add confidence value in logs | over 1 year ago
feat(server): use `Lexi-Llama-3-8B-Uncensored-Q5_K_S` as default LLM | over 1 year ago
chore: upgrade `socket.io-client` to latest | over 1 year ago
chore: upgrade `socket.io` to latest | over 1 year ago
chore: upgrade `fastify` to latest | over 1 year ago
chore: upgrade `dotenv` to latest | over 1 year ago
chore: uninstall `async` npm package | over 1 year ago
chore: upgrade `tsc-watch` to latest | over 1 year ago
refactor(server): warning message on explicit deactivation of LLM | over 1 year ago
feat(server): allow explicit deactivation of LLM | over 1 year ago
fix(server): specify correct minimem total/free RAM for LLM | over 1 year ago
feat(server): map resolved slots to dialog answers | over 1 year ago
feat(server): support dialog type action after slots filled | over 1 year ago
feat(server): action loop support | over 1 year ago
fix(scripts): always update manifest on LLM setup | over 1 year ago
feat(server): set Phi-3 as default LLM | over 1 year ago
fix(server): `utterance` as `expected_item` loop indefinitely | over 1 year ago
feat(scripts): fallback to mirror in case of error to download LLM | over 1 year ago
feat: use `mistral-7b-instruct-v0.2.Q4_K_S` as final choice | over 1 year ago
feat(server): add `utterance` as `expected_item` type | over 1 year ago
feat(scripts): llama.cpp compatible build | over 1 year ago
feat(server): final LLM setup | over 1 year ago
feat(server): Gemma support (prototype) | over 1 year ago