Some fiction has it. In Vampire: The Requiem, low-level vampires can survive on animal blood, but more powerful ones need human or even vampire blood.
Where can I get a sub-400 AMD card with 26 GB of VRAM?
Have you tried Matrix?
LLMs are statistical word association machines (token association, more accurately). So if you tell one not to make mistakes, it'll likely weight the output toward having validation, checks, etc. It might still produce silly output claiming no mistakes were made despite having bugs or logic errors. But LLMs are just a tool! So use them for what they're good at and can actually do, not what they themselves claim they can do lol.
OpenWebUI connected to tabbyAPI's OpenAI endpoint. I will try reducing temperature and seeing if that makes it more accurate.
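For anyone curious why lowering temperature helps, here's the intuition as a self-contained sketch. The logits are made up for illustration and aren't from tabbyAPI or any real model; the point is just how temperature scaling reshapes the token distribution:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw logits to probabilities, scaled by temperature.
    Lower temperature sharpens the distribution toward the top token."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for three candidate tokens.
logits = [2.0, 1.0, 0.1]

for t in (1.0, 0.5):
    probs = softmax_with_temperature(logits, t)
    print(t, [round(p, 3) for p in probs])
```

At temperature 1.0 the top token gets roughly 0.66 of the probability mass; at 0.5 it rises to roughly 0.86, so sampling wanders off into low-probability tokens less often.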
Context was set to anywhere between 8k and 16k. It was responding in English properly, and then about halfway to three-quarters of the way through a response, it would start outputting tokens in another language (Russian/Chinese in the case of Qwen 2.5) or things that don't make sense (random code snippets, improperly formatted text). Sometimes the text was repeating as well. But I thought that might have been a template problem, because it seemed to be answering the question twice.
Otherwise, all settings are the defaults.
I tried it with both Qwen 14B and Llama 3.1. Both were exl2 quants produced by bartowski.
Perplexica works. It supports Ollama and custom OpenAI-compatible providers.
Super useful guide. However, after playing around with TabbyAPI, the responses from models quickly turn to gibberish, usually halfway through or toward the end. I'm using exl2 models off of HuggingFace, with Q4, Q6, and FP16 cache. Any tips? Also, how do I control context length on a per-model basis? max_seq_len in config.json?
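For what it's worth, tabbyAPI reads its settings from a YAML config rather than a config.json; a sketch of what the relevant block looks like (key names from memory, so double-check them against the sample config shipped with the project):

```yaml
# Hedged sketch of a tabbyAPI config.yml model block; verify key names
# against the project's bundled sample config before relying on this.
model:
  model_name: Qwen2.5-14B-exl2   # folder name under the models directory
  max_seq_len: 16384             # per-model context length
  cache_mode: Q6                 # KV cache quantization, e.g. FP16, Q8, Q6, Q4
```

Swapping models through the API would mean changing this block (or passing overrides per request, if the server supports it), so context length is effectively controlled per model here.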
Seems to be the only necessary thing in my case! Thanks.
Yeah I definitely have the default GTK chooser. Guess I have some config playing to do later.
Can you explain a bit more about this and how to configure it? When I use FF on GNOME, the save dialog just looks like other dialogs?
Doesn't GNOME already have this?
I use a Misskey fork for microblogging, and I can't even get Lemmy posts to load. The profiles of communities do, but that's it.
Ah right. What I really meant to ask was if it can do protocols other than http.
Which I don’t think it can…
Are you able to tunnel ports other than 80 and 443 through Cloudflare?
Word can in fact open .odt files; support was added quite a long time ago. I don't know how good the compatibility is, though.
Well, VTR is a roleplaying game. It's similar to Vampire: The Masquerade, but with a different setting and somewhat different mechanics. I guess it's best explained as "nutrients": animal blood and blood from e.g. blood bags give less Vitae (the magic blood-point resource) than blood harvested from living humans. And as the character becomes more powerful, eventually that "lesser" blood can't give them any Vitae at all.
The vampiric curse in VTR is explicitly stated to be supernatural, though, so it doesn't require a scientific explanation. The curse imparts the Beast, which is the predator within every vampire.