Used by the LLaMA 1/2 model family: pick these settings if you are running a LLaMA 1 or LLaMA 2 based model. This note covers why SillyTavern responses get cut off and how to get better replies out of the AI.
I'm running Pygmalion 13B via KoboldAI and SillyTavern locally. A new "allow {{user}}:" setting has been added. Context size controls how many tokens of the chat history are kept in the prompt.
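To make the context-size idea concrete, here is a minimal sketch of how a frontend might trim chat history to fit a context window. This is an illustration only, not SillyTavern's actual code; the whitespace split is a crude stand-in for a real tokenizer, and the function name is hypothetical.

```python
# Hypothetical sketch: keep only the newest messages that fit in the
# context window, after reserving room for the model's reply.
# Token counts here are naive word counts, not real tokenizer output.

def fit_history(messages, context_size, response_reserve):
    budget = context_size - response_reserve
    kept = []
    used = 0
    for msg in reversed(messages):       # walk newest-first
        tokens = len(msg.split())        # crude token estimate
        if used + tokens > budget:
            break                        # older messages fall out of context
        kept.append(msg)
        used += tokens
    return list(reversed(kept))          # restore chronological order

chat = ["Hello there!", "Hi, how are you today?", "I am fine, thanks for asking."]
print(fit_history(chat, context_size=12, response_reserve=4))
```

Note how the oldest messages are the first to be dropped; this is why characters "forget" early parts of long chats once the context fills up.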
Try this if your prompts get cut off at high context lengths: the AI can have trouble completing its dialogue (reported with Pygmalion 7B). Before filing a report, check whether a similar bug already exists.
The SillyTavern AI not working or not responding can have several causes. As a last resort, you can try turning on Multigen (in the User Settings panel), but this will make responses come out slower, because it makes the AI produce its reply as a series of small chunks. Response length controls how much text is generated per message.
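The two settings above map onto the request a frontend sends to a local KoboldAI backend. Below is a sketch of such a payload, assuming KoboldAI's `/api/v1/generate` endpoint with its `max_context_length` and `max_length` fields; the specific values are illustrative examples, not recommended settings.

```python
# Sketch of a generate request body for a local KoboldAI backend.
# Assumption: the /api/v1/generate endpoint with max_context_length
# and max_length fields. Values are examples only.

payload = {
    "prompt": "You: Hello!\nBot:",
    "max_context_length": 2048,  # context size: tokens of chat kept in the prompt
    "max_length": 180,           # response length: tokens generated per message
    "temperature": 0.7,
}

# The per-message response budget must fit inside the context window,
# or the backend will truncate the prompt or the reply.
assert payload["max_length"] < payload["max_context_length"]
print("payload ok")
```

If replies keep stopping mid-sentence, raising the response length (`max_length` here) is usually the first thing to try, before resorting to Multigen.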
Natural order tries to simulate the flow of a real human conversation; this approach is used by KoboldAI Lite. You can make your bot's responses better with the Roleplay preset.
If you have good settings for SillyTavern, please share them! Describe the bug: dialogue with the character has broken grammar, mostly missing small words such as "the" and "a".