I would be so happy if AI burst like the dotcom boom on steroids.
Not only is AI a new market, but it's the only market where someone with way less investment can leapfrog the shit out of you, and that group can just release their model for free just to stab you in the dick after your billions of investment. OpenAI only has a hope if it can land massive state-level contracts to fund it, likely by offering some type of surveillance service, which they aren't even specced for but which models like these are great at. Just an all-round bad investment decision to buy into them unless you know what they're planning.
The luddites were right tho
If you're gonna use a new technology to churn out cheaper goods, great. If you're going to charge me the same for those goods and keep all the profits while still mistreating labor, fuck that.
This is pretty much Glaze 2. It intentionally poisons the dataset with specific targets so the model is more thoroughly fucked. Originally it was just noise being added, and ultimately an image that had been glazed would just get tossed from the training set. With this, the image will actually fuck up the resulting model if there is enough poisoned data included.
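As a loose sketch of the difference (hypothetical toy code, not the actual Glaze or Nightshade implementation, which perturb in a model's feature space): untargeted cloaking just adds small noise, while targeted poisoning nudges every image toward a specific decoy concept so mislabeled examples accumulate in training.

```python
import numpy as np

rng = np.random.default_rng(0)

def untargeted_cloak(img, eps=0.03):
    # Glaze-style idea: add a small perturbation so the image no longer
    # matches the original style. Toy version: plain bounded noise, so a
    # trainer can often just discard the image and lose nothing.
    noise = rng.uniform(-eps, eps, img.shape)
    return np.clip(img + noise, 0.0, 1.0)

def targeted_poison(img, decoy, eps=0.03):
    # Nightshade-style idea: push the image a bounded step toward a
    # specific decoy concept (e.g. "dog" images drifting toward "cat"),
    # so enough poisoned samples actively corrupt the learned concept.
    delta = np.clip(decoy - img, -eps, eps)
    return np.clip(img + delta, 0.0, 1.0)

# toy 8x8 grayscale "images" with values in [0, 1]
img = rng.random((8, 8))
decoy = rng.random((8, 8))

cloaked = untargeted_cloak(img)
poisoned = targeted_poison(img, decoy)

# both stay visually close to the original...
assert np.abs(cloaked - img).max() <= 0.03 + 1e-9
# ...but only the poisoned one has moved specifically toward the decoy
assert np.abs(poisoned - decoy).sum() < np.abs(img - decoy).sum()
```

The key point the toy captures: the perturbation budget `eps` is the same in both cases, but the targeted version spends it consistently in one direction, which is why it survives averaging over many training samples instead of washing out like random noise.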
Probably, I’m not an expert obviously.
Isn't this bound to happen without built-in automated tools for flagging and moderation? Not quite sure how the federation handles this sort of thing besides community modding: say something if you see something.
I don't think LLMs and other models will ever go away; I just want them to not be shoehorned into absolutely everything. They're actively useful for certain analyses and tasks, they just aren't a panacea.