If you can still use it after you stole it, as opposed to not being able to use it at all… then it does give you an incentive.
It wouldn’t be. It would still work. It just wouldn’t be exclusively available to the group that created it, so any competitive advantage is lost.
But all of this ignores the real issue: you’re not really punishing the use of unauthorized data. The people who owned that data are still harmed either way.
Making it open source doesn’t change how it works. It doesn’t need the training data after it’s been trained. Most of these AIs have just learned patterns to look for in the new data they come across.
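To make that concrete, here’s a minimal sketch (assuming scikit-learn; the dataset and model choice are purely illustrative) of how a trained model is self-contained once training is done: the data is only touched during fitting, and inference runs from the saved parameters alone.

```python
# Minimal sketch of why a trained model doesn't need its training data.
# Assumes scikit-learn is installed; the dataset and model are illustrative,
# not anyone's actual pipeline.
import pickle
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# "Training" phase: the data is only needed here.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Serialize just the learned parameters; the training data is not part of it.
blob = pickle.dumps(model)
del X, y, model  # the original data can disappear entirely

# "Inference" phase: the restored model classifies new inputs on its own.
restored = pickle.loads(blob)
print(restored.predict([[5.1, 3.5, 1.4, 0.2]]))  # patterns kept, data gone
```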
Mastodon is confusing as shit though. They could have made it less confusing, but this is what happens when you get backend-only developers designing the front end of a product.
Ugh, Google+ was so much better than Facebook. The whole circles concept was a game changer for social media that no one else has really adopted in a meaningful way. Half the reason millennials began to leave Facebook was not wanting their parents to see what they were posting, so being able to decide which group can see a particular post was an awesome idea.
Sadly, it just never got the adoption.
This was about Bluesky/Twitter-type social media: platforms with reshares and follows of specific users, where someone you follow arguing with someone you don’t will expose you to the person you don’t follow.
Those still aren’t bots. Bot farms are literally a bunch of servers running computer programs. That’s not the same thing as some online sweatshop pushing disinformation manually.
It decreases the spread. Cutting the engagement means fewer people who aren’t already subscribed to that content will see it, since there are fewer people arguing with it. Which means those who are susceptible to falling for it have less chance to even encounter it, so fewer fall into it.
Even if the incentive to create the trolls has changed, the counter to letting them spread hasn’t.
Most AI systems are not built to answer questions. They’re designed to act as a detection/filter heuristic that identifies specific features of an input and maps them to a desired output.
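As a toy illustration of that detection/filter framing (assuming scikit-learn; the data, function names, and threshold are made up for the example), the model doesn’t converse at all: it just scores an input and a threshold gates what happens to it.

```python
# Toy illustration: the model doesn't answer questions, it scores inputs,
# and a threshold decides what gets flagged. Names here are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny labeled set: 1 = flag (e.g. spam-like), 0 = let through.
texts = ["buy now limited offer", "meeting moved to 3pm",
         "you won a prize click here", "lunch tomorrow?"]
labels = [1, 0, 1, 0]

detector = make_pipeline(TfidfVectorizer(), LogisticRegression())
detector.fit(texts, labels)

def should_filter(message: str, threshold: float = 0.5) -> bool:
    """Return True if the detector scores the input above the flag threshold."""
    return detector.predict_proba([message])[0][1] >= threshold

print(should_filter("click here for a free prize"))  # likely True
print(should_filter("see you at the meeting"))       # likely False
```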