Your data is worth about $5-$10 a month, at least for Facebook. A month.
Discord isn’t social media. What is with everyone just referring to every tech company product as “social media”!?
It’s not enshittification because it literally doesn’t follow the second part of your own definition. Needing to change your offerings because your internal costs increase is normal business. Enshittification is when a company offers things to entice users, then realizes it has nothing left to offer businesses, so it removes features in order to sell them to businesses, or piles on more ads.
No, it really is unique to Python. Most other languages have one or two package managers, not 15 (and 15 is not an exaggeration). Ruby has one. Rust has one. Java has two (Maven and Gradle). Elixir has one. Swift has one.
Python programmers think it’s normal when it most definitely is not. Even your IntelliJ example isn’t correct, because IntelliJ will literally install and set up the JDK for you; PyCharm is completely unable to do that, and it’s not because JetBrains hasn’t tried. Python tooling is just really, really, really bad.
Try finding an indentation-related bug in a 15-year-old Python codebase written by the worst programmers on the planet. You won’t think there are no issues with it after that. In any other language you literally just reformat and you’ll see the bug; that’s not the case in Python, because the indentation is the syntax, so a formatter has nothing to check it against.
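To make that concrete, here’s a minimal sketch of the kind of bug I mean (the function name and values are made up for illustration; nothing beyond stock Python is assumed):

```python
def total_paid(amounts):
    total = 0
    for amount in amounts:
        total += amount
        return total  # bug: returns on the first iteration; dedent one level to fix

# Both indentation levels for that return are syntactically valid Python,
# so a formatter like black leaves the line exactly where it is:
# there are no braces to reconcile the indentation against.
print(total_paid([10, 20, 30]))  # prints 10, not 60
```

In a brace language, an auto-formatter re-indents every statement to match the braces, so a line that merely looks like it’s inside the loop gets visibly pushed out and the bug jumps off the screen. Python gives you no such signal.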
What I mean by that is that Python tooling is terrible. There are five different ways to do everything, which you have to decide between, and in the end they all have weird limitations (which is probably why the other four exist).
There are actually at least 15 different ways (the fifteenth is called Rye, and it’s where I got that article from). And yes, your entire post is super accurate. The PyCharm thing is ridiculous too, because RubyMine is excellent in comparison: you just pull in a library with Ruby’s excellent (singular) package manager, and RubyMine autocompletes it pretty much perfectly. PyCharm can’t even figure out that you added a new dependency with whatever flavor-of-the-week package manager you’re using this time.
Yep, and if OpenAI goes under, the whole market will likely crash; people will dump the GPUs they’ve been using to train models, and boom, you’ve got a bunch of GPUs available.
You might be too young. They’re referring to a service called StumbleUpon that did almost exactly what you posted here.
Everyone who doesn’t have access to those is using GPUs, though.
Really weird that the article doesn’t mention that the campus is 4 times the size of Apple’s and Microsoft’s campuses combined. It just mentions it being larger. But 4 times is crazy.
What in the world are you talking about? Using contractors is not “paying people less than they’re worth”.
I’m still looking for the glasses that show OP is a professional.
Also the difficulty is in the production line and custom swappable components, not the case design.
Nohello.net or whatever the URL is
Just use Kagi. Statistically better.
I informed my SecOps team and they reached out to Slack. Slack posted an update:
We’ve released the following response on X/Twitter/LinkedIn:
To clarify, Slack has platform-level machine-learning models for things like channel and emoji recommendations and search results. And yes, customers can exclude their data from helping train those (non-generative) ML models. Customer data belongs to the customer. We do not build or train these models in such a way that they could learn, memorize, or be able to reproduce some part of customer data. Our privacy principles applicable to search, learning, and AI are available here: https://slack.com/trust/data-management/privacy-principles
Slack AI – which is our generative AI experience natively built in Slack – is a separately purchased add-on that uses Large Language Models (LLMs) but does not train those LLMs on customer data. Because Slack AI hosts the models on its own infrastructure, your data remains in your control and exclusively for your organization’s use. It never leaves Slack’s trust boundary and no third parties, including the model vendor, will have access to it. You can read more about how we’ve built Slack AI to be secure and private here: https://slack.engineering/how-we-built-slack-ai-to-be-secure-and-private/
None of that applies if you’re a paying customer like me, and I see all the same BS. So no, it really is just bad design; it’s not trying to do any of the stuff you mentioned.
That’s what I was saying in the second part of my comment.
When he got kicked out by the board I was quite happy, literally “omg, they’re actually going to follow their principles,” but then, nope. Apparently nobody in the company could see it for what it was, and people outside didn’t want their “chatbot to go away!”