I think if there was something happening, people would be protesting. Probably happening all over the world if it was bad enough. I think a terrorist leader died recently so surely if they were involved, then they were the good guys.
Right?
They didn’t spend a fortune on licensing for nothing.
Think of the lawsuits.
-Sony probably
How did we get here?
Money!
The fact that it’s an option that even remotely works is my point. They sell hardware. They don’t support software. The community does that. There is something to be gained from having a uniform platform for learning self-hosting responsibly.
A Raspberry Pi isn’t particularly great at any one thing. Its greatest strength is bundling everything you need in a box at an affordable price. Once you know where your pain points are, you can build/design a system that overcomes those shortcomings.
Having a starter kit would be an easy way to get more people into the space. Would it cost $35? Of course not. Level1Techs made their KVM to meet their own requirements, and then the community benefited. To me, this project has that kind of energy, or at least the potential for it.
Raspberry Pi was able to do it at $35.
That’s what I said. The person I replied to said that all messages are encrypted*, with the asterisk being “only if you specifically enable it.” I clarified that it doesn’t apply to group chats, though. I don’t use Telegram, so the loss of functionality is actually a bigger deal to me than the argument around E2EE. Can you explain what features are lost when you enable it? It’s a messaging app, so I’m curious what you sacrifice for E2EE.
Recent events have taught me that only individual chats are encrypted*. Group chats don’t have that feature.
Are you suggesting a Bitcoin exchange also dabbles in selling Magic cards?
Creating material that is copyright infringement is not a desired output
Agreed.
the purpose of guns is to kill (when used).
“Guns” is a term with varied definitions, and not all of them describe weapons intended to kill. There are rubber-bullet, airsoft, small-caliber, and even paintball guns. These MAY be lethal, but they were made with other goals in mind.
Nvidia, on the other hand, made GPUs for applications that revolve around video; the G literally stands for graphics. Some people found out that they are also efficient at other tasks, so Nvidia made a new line of products for that workload because it was more lucrative. Gamers usually buy only one graphics card per machine; a few years ago some would even buy up to three. In contrast, AI researchers/architects/programmers buy as many as they can afford and constantly buy more. This has made Nvidia change their product stack to cater to the more lucrative customer.
AI manufacturers depend on copyrighted material to “train” the AI
With all that said, these AI creators CHOOSE what to feed into these new tools. They can choose to input public-domain works, or even paid, licensed content, but instead using copyrighted and pirated content has become the norm. That is because this is a new field, and we are collectively still learning where the boundaries are and what is considered acceptable and legal.
Reddit recently signed a deal to license its data (user-generated content like posts and comments) for AI training. Other companies are using internal data to tailor their AIs to solve field-specific problems. The problem is that AI, just like guns, is a broad term.
the method of creation makes it more likely to infringe.
Nvidia has given us the tools, but until we define what is considered acceptable, these kinds of things are inevitable. I do believe the authors had their copyrights infringed, but they are also going after the wrong people. There have been reports of AI spitting out full books on command, clearly proving that those works were used for training. The authors should be going after the creators of those specific AIs, not Nvidia.
There is a long and bumpy road ahead.
This feels like suing gun manufacturers over murder. They made the tool but they’re not the ones responsible for the crime.
The future of net neutrality is already decided, and has been for a while. The only question is when: does NN end now or later? We already let the wolves in.