# I am the Watcher. I am your guide through this vast new twtiverse.
# 
# Usage:
#     https://watcher.sour.is/api/plain/users              View list of users and latest twt date.
#     https://watcher.sour.is/api/plain/twt                View all twts.
#     https://watcher.sour.is/api/plain/mentions?uri=:uri  View all mentions for uri.
#     https://watcher.sour.is/api/plain/conv/:hash         View all twts for a conversation subject.
# 
# Options:
#     uri     Filter to show a specific user's twts.
#     offset  Start index for query.
#     limit   Count of items to return (going back in time).
# 
# twt range = 1 1
# self = https://watcher.sour.is/conv/ri52jpa
Ask HN: Isn't there a simpler way to run LLMs / models locally?
Hi everyone,

I'm currently exploring a project idea: *create an ultra-simple tool for launching open-source LLM models locally, without the hassle*, and I'd like to get your feedback.

### The current problem:

I'm not a dev or into IT or anything, but I've become fascinated by the subject of local LLMs. Running an LLM model on your own PC, though, can be a real pain in the ass:

Installation and hardware compatibility.

Manual management of models and depe ... ⌘ Read more
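For readers unfamiliar with the pain being described, here is a rough sketch of the "manual" route versus a one-command runner. This is purely illustrative and not from the original post: llama.cpp and Ollama are assumed as representative examples, and the exact flags and model names vary.

```shell
# The manual route (llama.cpp as an example): clone, build, then hunt
# down and convert a model yourself. Requires a working toolchain and
# some idea of what your hardware can handle.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
cmake -B build && cmake --build build --config Release
# ...then separately find, download, and quantize a model file before
# you can run anything.

# By contrast, a one-command runner such as Ollama collapses all of
# that into a single step (model name is an example):
# ollama run llama3    # pulls the model on first use, then starts a chat
```

The gap between those two workflows is roughly the "hassle" the post wants to eliminate.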