SillyTavern is an open-source LLM front-end for power-users with focus on AI characters & role-play. In order to actually use it, you need to connect it to either a locally running model, or to one of the supported services.
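Connecting a backend is the one setup step the reviews below keep mentioning. As a rough sketch, this is the general shape of a request to a locally running, KoboldAI-compatible backend; the endpoint path and field names follow the KoboldAI generate API, but the host, port, and all values here are illustrative assumptions, not SillyTavern's actual internals.

```python
import json

# Hypothetical sketch of the JSON body a KoboldAI-compatible backend
# expects at its /api/v1/generate endpoint. Field names follow the
# KoboldAI API; the prompt and sampling values are made-up examples.
def build_generate_request(prompt: str, max_length: int = 120) -> dict:
    """Assemble a minimal /api/v1/generate request body."""
    return {
        "prompt": prompt,
        "max_length": max_length,  # number of tokens to generate
        "temperature": 0.7,        # sampling temperature
    }

body = build_generate_request("You are a tavern keeper. Greet the traveler.")
print(json.dumps(body))
```

In practice you would point SillyTavern's API settings at the backend's URL rather than issue requests by hand; the front-end handles prompt formatting and streaming for you.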
A great way to chat with characters or role-play without being locked into a particular service.
Offers a lot of features
The most powerful AI site right now
KoboldAI Horde my beloved
Setting aside the installation process, which is not quite user-friendly, this is the best way to use AI.
Lots of control. It's the difficult option for the average user, but the best one in the long run. Pay no attention to the negative votes (they're probably businesspeople who want your data and see their business model under threat).
Nothing to say. It's the best
It's good for RP but not for story writing.
Gold standard, the "PC Master Race" of AI waifus.
Sooo many features! You can tailor your AI experience to your liking, and they update often with even more features. SillyTavern is just the UI, though, so you will need a backend to talk to the AI; otherwise, use Kobold Horde.
SillyTavern is the gold standard for a reason. I do wish it had an easier to work with 'novel mode', though. It is pretty focused on chat-style RP, rather than collaborative fiction.
Offers the best options and versatility. It might be overwhelming at first, but once learned, you can never go back to other web services.
I think it’s fair to say SillyTavern is the standard for text GUIs for RP. It’s rather fantastic, has a ton of plugins, and automates a lot of things to make prompting, usage, character cards, etc., about as easy as you can get considering how complex these models can be. If I had to complain, it would be that it sometimes feels a bit bloated with how it soaks up tokens, but that’s a small niggle.
To use its full power and potential, it requires climbing a steep learning curve, and the UI might be a bit overwhelming for a new user who is used to something like CAI. Those are the only "complaints" I have. Luckily, their documentation is very good, so if you have patience, you can learn.
And if you're lucky, the Horde might have 10+ workers on a 30B to 70B model with 4K context, and if you're really lucky, 8K context. Response times will be abysmal, but that's a small price to pay for 70B outputs. A lot of the time there will be around 10 workers hosting 13B or 8x7B models, all with 4K and sometimes 8K context. I've had higher-quality AI experiences on ST + Horde than with anything else.