• 0 Posts
  • 29 Comments
Joined 1 year ago
Cake day: June 23rd, 2023


  • 520@kbin.social to Programmer Humor@lemmy.ml · codeStyle
    7 months ago

    So this looks like it’s based on Java code.

    Marking something public in Java means that any other bit of Java code, including code injected by an attacker, can see and mess with it directly.

    Marking it private, in contrast, means that other code can only interact with it through the methods the class chooses to expose.

    In theory this is supposed to help with the security of the data. In practice, if an attacker is already in a position to run their own code inside your app, you’ve got much bigger issues.
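
    To make that concrete, here’s a minimal sketch of my own (not from the original post), assuming the usual field-level use of the keywords:

        // Any code that can see a Wallet can read or overwrite looseChange directly.
        // savings, by contrast, is only reachable through the methods Wallet exposes.
        public class Wallet {
            public int looseChange = 100;
            private int savings = 1000;

            public void deposit(int amount) {
                if (amount > 0) {
                    savings += amount;   // the class controls how the private field changes
                }
            }

            public int getSavings() {
                return savings;
            }
        }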



  • You may want to read through what I actually wrote again.

    Someone takes an image, runs it through image-to-image AI with 90% strength (meaning ‘make minor changes at most’), and then claims it as their own.

    That last part there is what makes it stealing. It’s not the theft of the picture. It’s the theft of credit and of social media impressions. The latter sounds stupid at first, until you realise that it is an important, even essential part of marketing for many businesses, including small ones.

    Now sure, technically someone who likes and comments under the fake can also do the same underneath the real one, but in reality first-exposure benefits matter here: the content will have its biggest impact, and therefore push viewers to engage in some way, including at a business level, the first time they see it. When something incredibly similar pops up later, they’re far more likely to go “I’ve already seen this. Next.”

    Human attention is very much a finite thing. If someone is using your content to divert people away from your brand, be it personal or professional, it has a very real cost associated with it. It is theft of opportunity, pure and simple.




  • 520@kbin.social to Programmer Humor@lemmy.ml · SPAs were a mistake
    10 months ago

    Basically it means that the API calls won’t work from JavaScript in a browser and would only realistically work from things like Python scripts.

    If API calls are being handled by JavaScript in the browser, they’re going to run into issues, because the HttpOnly flag means the JavaScript code can’t read the auth token.

    Things like Python scripts have no such limitations though, so this can be used in cases where you aren’t expecting an actual browser.
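
    As a rough sketch of that non-browser case (in Java rather than Python, with a made-up endpoint and token), a plain HTTP client simply attaches the auth cookie itself, so the HttpOnly flag never comes into play:

        import java.net.URI;
        import java.net.http.HttpClient;
        import java.net.http.HttpRequest;
        import java.net.http.HttpResponse;

        public class ApiCall {
            public static void main(String[] args) throws Exception {
                // Hypothetical token and endpoint, purely for illustration.
                String authToken = "example-session-token";

                HttpRequest request = HttpRequest.newBuilder(URI.create("https://example.com/api/items"))
                        // Outside a browser we set the Cookie header ourselves; HttpOnly only
                        // stops in-page JavaScript from reading it, not a client like this one.
                        .header("Cookie", "auth=" + authToken)
                        .GET()
                        .build();

                HttpResponse<String> response = HttpClient.newHttpClient()
                        .send(request, HttpResponse.BodyHandlers.ofString());

                System.out.println(response.statusCode());
                System.out.println(response.body());
            }
        }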



  • 520@kbin.social to Technology@lemmy.ml · ethinically ambigaus
    10 months ago

    > They are not talking about the training process

    They literally say they do this “to combat the racial bias in its training data”

    > to combat racial bias on the training process, they insert words on the prompt, like for example “racially ambiguous”.

    And like I said, this makes no fucking sense.

    If your training process, specifically your training data, has biases, inserting keywords does not fix that issue. It literally does nothing to actually combat it. It might hide the problem if the data model has had sufficient training to do the job with the inserted keywords, but that is not a fix, nor is it combating the issue. It is a cheap hack that does not address the underlying training issues.
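
    For what it’s worth, the “cheap hack” described above amounts to something like this (my own sketch; generateImage() is a hypothetical stand-in for whatever the real model call is):

        public class PromptPatcher {
            // The keyword gets bolted onto the user's prompt; the bias baked into the
            // model's weights by the training data is left completely untouched.
            static String patchPrompt(String userPrompt) {
                return userPrompt + ", racially ambiguous";
            }

            public static void main(String[] args) {
                String patched = patchPrompt("a portrait of a doctor");
                System.out.println(patched);    // what actually gets sent to the model
                // generateImage(patched);      // hypothetical model call, not a real API
            }
        }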


  • 520@kbin.social to Technology@lemmy.ml · ethinically ambigaus
    10 months ago

    That explanation makes no fucking sense and makes them look like they know fuck all about AI training.

    The keywords inserted into the prompt have nothing to do with the training data. If the model in use has fuck all BME training data, it will struggle to draw a BME person regardless of what keywords are used.

    And any AI person training their algorithms on AI-generated data is liable to get fired. That is a big no-no. Not only does it provide no new information, it also amplifies the mistakes the AI already makes.





  • It’s not that he sided against Israel. It’s that he sided against Jews. The Israeli government and its supporters like to conflate the two a lot, but they are very different things.

    So when someone writes “Jewish communities have been pushing the exact kind of dialectical hatred against whites that they claim to want people to stop using against them.” and Musk responds with “You have spoken the actual truth.”, that is not an anti-Israel position, that is an antisemitic position, one that couldn’t be made any clearer even if he had posted it with a picture of Auschwitz.


  • > I disagree - i think only relying on green electricity is good, but it won’t solve the devastating status we are currently that is the climate catastrophe.

    True, but it would mean that talking about AI in this context is completely irrelevant.

    > Consuming less power makes it much easier to use green electricity for this power.

    AI isn’t even close to being the biggest offender when it comes to industrial power consumption. To talk about AI in this context is practically a distraction, with how insignificant it is. You might as well bring gaming into the conversation while you’re at it. It’s even less relevant to the conversation than the likes of bitcoin mining.

    It’s not like the latter, where a) you need insane levels of power in all cases just to get anywhere and b) that power is the only barrier. You can train an AI on a gaming laptop. Bigger data sets might require, I dunno, a server, but that has always been true of heavier workloads of any kind. And once you have your data model, you can just keep using it like you would any other program, without a power draw any more notable than any other program’s.

    Also, you cannot just go ‘I want an AI’ and have that magically make you money and shit. You need as much direction, programming expertise and planning as for any other computer program just to even get started. That alone will already weed out the many people who have no idea what they’re doing and just want free money.

    No, the real conversation needs to be had about industrial machinery, especially older iterations that treat power as something in infinite supply.



  • There is one thing you have left out though: machine learning lowers barriers. It takes years to become a good statistical analyst, data engineer, or what have you. Now I can train an AI to analyse the data, and sure, when done right the results will be 90% as good as if I had studied statistics for 10 years and done it myself, and yes, there are pitfalls to be wary of. But it means a lot of people now have access to tools that were previously out of reach.