

It’s awesome, but that 128k context window is a throwback to Llama 3 days
I bet the closed $ource model has like a 2M-token context
Ditto. But I am taking my time. I am on the free tier; I cost him money. I’ll move off eventually, probably to Tutanota, unless something better shows up
bash is also well supported in Windows via WSL
We have already done that. It’s called Dreddit, and not even major governments can stop it.
Agreed.
Really, if someone wants to use an LLM, the right place to run it is in a sandbox locally on your own computer
Anything else is just a stupid architecture. You don’t run your Second Brain on Someone Else’s Computer
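In concrete terms, that local-first setup is easy to sketch. Below is a minimal example, assuming llama-cpp-python is installed and you have downloaded some GGUF weights file; the model path is hypothetical.

```python
# Minimal local-inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path below is a placeholder: any locally downloaded GGUF file works.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local file
    n_ctx=4096,  # context window, bounded by the model's limit
)

# Everything here runs on your own hardware: no network calls, no telemetry.
out = llm("Q: Why run a model locally? A:", max_tokens=128, stop=["Q:"])
print(out["choices"][0]["text"])
```

Wrap that process in a container or firejail and your Second Brain never leaves your machine.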
It’s like the Internet in 1998
Pets.com hasn’t gone bust yet, but it will.
The bubble will burst. BUT … the entire world will run on this new technology, nobody will imagine living without it, and multibillion-dollar companies will profit from it and be created around it
Yes, several are fully open source. I like Mistral
What do you think an LLM is? Once you’ve opened the weights, IMO it’s pretty open. Once they open the training data, that’s pretty damn open. What do you want, a Gitian reproducible build?
Do you use an IDS? If not, why not? Have you taken care of automating encryption and backup to the cloud? There’s a new open source shared media server; are you interested in configuring, securing, and testing it?
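On the encrypt-before-backup point: one common pattern is client-side encryption, so plaintext never reaches the provider. A minimal sketch, assuming the Python cryptography package; the file names are placeholders, and the actual upload step is left to rclone/boto3/whatever you use.

```python
# Encrypt-before-upload sketch using the cryptography package (pip install cryptography).
# File names are placeholders; the point is that plaintext never leaves the machine.
from pathlib import Path
from cryptography.fernet import Fernet

key_file = Path("backup.key")
if key_file.exists():
    key = key_file.read_bytes()
else:
    key = Fernet.generate_key()  # keep this key off the cloud provider
    key_file.write_bytes(key)

fernet = Fernet(key)
plaintext = Path("notes.tar").read_bytes()  # hypothetical archive to back up
Path("notes.tar.enc").write_bytes(fernet.encrypt(plaintext))
# Ship notes.tar.enc to any cloud storage; only ciphertext leaves the box.
```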
It’s mostly set and forget, Earth is mostly harmless, etc
Have you checked out Mistral? Open weights and training set. What more do you want?
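For anyone who wants to kick the tires on the “open weights” part, here is a minimal sketch pulling the published Mistral weights with Hugging Face transformers; the model ID is the public release, and you supply the hardware.

```python
# Sketch: download and run the openly published Mistral weights locally,
# via Hugging Face transformers (pip install transformers torch).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"  # publicly released weights
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tok("Open weights mean you can run the model yourself:", return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=40)
print(tok.decode(out[0], skip_special_tokens=True))
```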
I haven’t tested this, but TBH, as someone who has run Linux at home for 25 years, I love the idea of an always-alert sysadmin keeping my machine maintained and configured to my specs. Keeping my IDS up to date. And so on.
Two requirements:
1. Be an open source local model with no telemetry.
2. Let me review proposed changes to my system, with an explanation of why they should be made (a sketch of such a review gate follows below).
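Requirement 2 is essentially a human-in-the-loop gate: the agent proposes, the human disposes. A minimal sketch of what that gate could look like; all names here are hypothetical, not any particular tool’s API.

```python
# Human-in-the-loop gate sketch: the agent may only *propose* a change; nothing
# is written until the user has seen the diff and the rationale and approved.
import difflib

def review_and_apply(path: str, old: str, new: str, rationale: str) -> bool:
    diff = "".join(difflib.unified_diff(
        old.splitlines(keepends=True),
        new.splitlines(keepends=True),
        fromfile=f"{path} (current)",
        tofile=f"{path} (proposed)",
    ))
    print(f"Proposed change to {path}\nWhy: {rationale}\n{diff}")
    if input("Apply this change? [y/N] ").strip().lower() != "y":
        print("Rejected; nothing was changed.")
        return False
    with open(path, "w") as f:  # the only write path, and it sits behind the gate
        f.write(new)
    return True
```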
I am not a linguist, but the deafening silence from Chomsky and his defenders really does demand to be called out.
Syntactic models of language have been completely crushed by statistics-at-scale via neural nets, yet linguists have not abandoned the broken model.
The same thing happened with protein folding: researchers who spent the last 25 years building complex quantum-mechanical/electrostatic models of protein structure suddenly saw AlphaFold completely crush prior methods. The difference is that bioinformatics researchers have already done a complete about-face and are taking the new AI tools and running with them.
believing that a large language model has any kind of awareness or actual intelligence is absurd
I (as a person who works professionally in the area and tries to keep up with current academic publications) happen to agree with you. But my credence is somewhat reduced after considering the points Hinton raises.
I think it is worth considering that there are a handful of academically active models of consciousness; some well-respected ones, like the CTM, are not at all inconsistent with Hinton’s statements
IMO PeerTube would be much larger, but grifter sites like Rumble and Odysee/LBRY are sucking a lot of the wind out of the YT-alternative ecosystem
Long term, IMO PeerTube is the only sensible architecture: federated.
This Nobel Prize winner and subject matter expert takes the opposite view
Finally I can play Solitaire on my HP-UX box
Some people like being unpaid OnlyFans models whose intimate details go to corporations instead of pervy guys