Running in the 90s again
Could even run the instance from your phone or whatever device you use to look at Lemmy
That’s evil
For browsing I have converted to Lemmy. For getting answers from a Google search I still click the Reddit result. Lemmy doesn’t show up in Google results, and other forums are useless for information, Stack Overflow excepted.
You can’t pirate their models, and even if they leaked, running them would need an expensive machine.
There are lots of open source models. They can get close but are limited by your hardware.
If you want something close to GPT, there is the Falcon 40B model. You’ll need something with more than 24 GB VRAM, or heavy CPU offload with 128 GB of RAM, I think, maybe 64.
With 24 GB VRAM you can do a 30B model and so on…
For reference, the GPT-3 class models are around 175B. So A100 NVLink territory.
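The rule of thumb behind these numbers: weight memory is roughly parameter count times bytes per weight, plus some overhead for activations and the KV cache. A minimal sketch, assuming 4-bit quantization for local use and a ~20% overhead factor (both are assumptions, real usage varies):

```python
def vram_gb(params_billions, bits_per_weight=4, overhead=1.2):
    """Rough VRAM estimate (GB) for loading a model's weights.

    params_billions: parameter count in billions
    bits_per_weight: 4 for a quantized model, 16 for fp16
    overhead: fudge factor for activations / KV cache (assumed 20%)
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billions * bytes_per_weight * overhead

# 30B model, 4-bit: ~18 GB, which is why it fits on a 24 GB card
print(round(vram_gb(30), 1))

# 40B model in fp16: ~96 GB, far past any single consumer GPU,
# hence the CPU-offload-with-lots-of-RAM route
print(round(vram_gb(40, bits_per_weight=16), 1))
```

This is only the memory to hold the model; offloading layers to system RAM trades that VRAM requirement for much slower generation.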