#ai #skynet #mixtral #opensource
https://huggingface.co/TheBloke/Mixtral-8x7B-v0.1-GGUF/tree/main
Mixtral is a Mixture of Experts (MoE) model with 8 experts per MLP layer. It has roughly 47B parameters in total, but because only 2 experts are active for each token, inference requires about the same compute as a 13B dense model.
At the time of writing it is the best open-source model for reasoning, and in some respects it outperforms ChatGPT 3.5.
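For anyone who wants to try the GGUF build linked above, here is a minimal sketch using llama-cpp-python. It assumes the library is installed with a llama.cpp build recent enough to support Mixtral's MoE architecture, and that one of the quantized files from the repo has already been downloaded; the file name and settings below are examples, not recommendations.

# Minimal sketch: run a local completion with a quantized Mixtral GGUF.
from llama_cpp import Llama

llm = Llama(
    model_path="mixtral-8x7b-v0.1.Q4_K_M.gguf",  # example quantization file from the repo above
    n_ctx=4096,       # context window to allocate
    n_gpu_layers=0,   # raise this to offload layers to a GPU if one is available
)

# Mixtral-8x7B-v0.1 is a base (non-instruct) model, so plain text completion
# is the natural way to prompt it.
out = llm(
    "The Mixture of Experts architecture works by",
    max_tokens=128,
    temperature=0.7,
)
print(out["choices"][0]["text"])

Lower quantizations (Q3, Q2) from the same repo trade some quality for a smaller memory footprint, which matters because even the 4-bit files need on the order of 30 GB of RAM or VRAM to load.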