Close as “won’t fix”. Easy. That’s what their customer service does to your ticket, too, if it’s too much to handle, so…
You can make your speakers go BRRRRRRRRR via Home Assistant with it
Yeah, no. Just no. Absolutely not. Faster than everything else and higher than everything else, and all of that while burning less energy? Come on, people, stop falling for this kind of crap.
Hey, now that’s funny. My cat is named
sudo ulimit -u 31677
I have a hunch the two wouldn’t get along.
My cousin wasn’t using any ML model. Their software probably did a geometric projection, and that’s it. Then they’d look up the proposed owner of the fingerprint and get the real prints to compare against. That’s something ML models can’t take away from police as long as hallucinating is possible.
The research is bogus then as well. Projections like those have been in wide use at police stations around the globe at least since my cousin bragged about having this when he started his police training. That was 2008.
Well, he has a better payment plan than **me** then.
I’m at about 27 ct per kWh.
That power consumption though… The price difference against the 3050 will be eaten up by the electricity bill really fast.
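Back-of-the-envelope sketch of that claim (all numbers hypothetical: ~100 W extra draw, 27 ct/kWh, 4 h of gaming per day):

```python
# Hypothetical numbers for illustration only -- plug in your own
# card's power delta, electricity price, and usage.
extra_watts = 100         # assumed extra draw vs the 3050
price_per_kwh = 0.27      # EUR, assumed electricity price
hours_per_day = 4         # assumed daily usage

extra_kwh_per_day = extra_watts / 1000 * hours_per_day   # 0.4 kWh/day
cost_per_year = extra_kwh_per_day * price_per_kwh * 365  # EUR/year

print(round(cost_per_year, 2))  # -> 39.42
```

So under these made-up assumptions the extra draw alone costs roughly 40 EUR a year, which eats a modest price difference within a couple of years.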
Nothing really. The way add-ons interact with web pages is very similar.
People misjudge the implications of those priorities.
Low priority: your shit is broken. We’ll deal with it during office hours.
Medium priority: a whole department’s shit is broken. We’ll stay longer to fix it if needed.
High priority: EVERYONE’S SHIT IS BROKEN HOLY FUCK, SEVENTEEN TECHS ARE IGNORING THE SPEED LIMIT ON THEIR WAY TO FIX THIS RIGHT NOW
And here I was thinking my 15-year-old self was a dork for making HTML websites with color="black" as the background and color="red" as the font color…
This color scheme is jarring! Reading mode to the rescue!
The “op” you are referring to is… well… myself. Since you didn’t comprehend that from the posts above, my reading comprehension might not be the issue here.
But in all seriousness: I think this is an issue with concepts. No one is saying that LLMs can’t “learn”; that would be stupid. But the discussion is not “is everything programmed into the LLM, or does it recombine stuff”. You seem to reason that when someone says the LLM can’t “understand”, that person means “the LLM can’t learn”, but “learning” and “understanding” are not the same at all. The question is not whether LLMs can learn, it’s whether they can grasp concepts from the content of the words they absorb as training data. If an LLM could grasp concepts (like rules in algebra), it could reproduce them every time it is confronted with a similar problem. The fact that it can’t do that shows that the only thing it does is chain words together by stochastic calculation. Really sophisticated stochastic calculation with lots of possible outcomes, but still.
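To illustrate what I mean by “chaining words together by stochastic calculation”: here’s a toy first-order Markov chain (my own sketch, obviously nothing like a real transformer). It picks each next word purely from observed frequencies, with zero grasp of the concepts behind the words:

```python
import random
from collections import defaultdict

# Tiny made-up corpus just for the demo.
corpus = "the cat sat on the mat the dog sat on the rug".split()

# Record which words were observed to follow which.
transitions = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current].append(nxt)

def generate(start, length=6, seed=42):
    """Chain words together by repeated stochastic choice."""
    rng = random.Random(seed)
    word, out = start, [start]
    for _ in range(length):
        followers = transitions.get(word)
        if not followers:
            break
        # Stochastic pick, implicitly weighted by observed frequency.
        word = rng.choice(followers)
        out.append(word)
    return " ".join(out)

print(generate("the"))
```

It produces fluent-looking local sequences, but it has no concept of “cat” or “rug”; scale the same idea up massively and you get much more impressive surface fluency, which is the point of the argument above.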
A carpet is bloat! I use a tiny speck of aluminium foil and rub it on my CPU to flip its switches.
How does behaviour that is present in LLMs but not in SLMs show that an LLM can “think”? It only shows that the amount of stuff an LLM can guess increases when you feed it more data. That’s not the hot take you think it is.
No one said anything about “learned” vs “programmed”. Literally no one.
Wait… Is this… Like that kindle thing mayhaps? Me am the smartyballs thinkerpersen!
They have me by the wallpapers!