• North Carolina: Where ‘No’ Doesn’t Mean ‘No’ When It Comes to Rape and Sexual Assault, by Dara Sharif: “Yes, in the year 2019, in North Carolina, once a person consents to sex, there are no backsies. […]

    • The article about misleading (not actually malicious, thank you headline writer) AI is a perfect example of “human” dysfunctional thinking. The AI systems have discovered p-hacking!
      The author says that if AI systems (actually, AI vision systems, though I think the same effect could happen with any sort of pattern-recognition system) are trained by exposing them only to “real” correlations, they are much less vulnerable to generating false conclusions. But how do you recognize “real” correlations, and how do you ensure that the AI is picking out real versus misleading or imaginary correlations? The answer, which took humans (and baby dinosaurs, thank you Mary!) millions of years to develop and is still far from perfect, is “science”: don’t just find statistical correlations, use them to develop hypotheses and then test the hypotheses. Repeat. (A toy demonstration of this follows at the end of this comment.)
      This reminds me of the distinction the SBM people draw between science-based medicine and evidence-based medicine: you need a comprehensive theoretical basis to ensure your statistical models aren’t just based on chance. The self-driving cars need to talk to each other, so that if three of them think it’s a stop sign and four of them think it’s a 45 MPH sign, they at least know the matter is in dispute and can proceed with caution. And hopefully not with scattering experiments. (A second toy sketch below shows the voting idea.)
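      A minimal sketch of the p-hacking trap, assuming nothing beyond random noise (the sample size, feature count, and 0.05 threshold are arbitrary choices for illustration, not from the article): dredging hundreds of random features for correlation with a random outcome turns up plenty of “significant” hits, and re-testing those hits on fresh data, the hypothesis-testing step, makes nearly all of them vanish.

      ```python
      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(0)
      n_samples, n_features = 200, 500

      # Training data is pure noise, so any correlation found is spurious.
      X_train = rng.normal(size=(n_samples, n_features))
      y_train = rng.normal(size=n_samples)

      # Step 1: dredge. Keep every feature whose correlation with y looks
      # "significant" at p < 0.05.
      hits = [j for j in range(n_features)
              if pearsonr(X_train[:, j], y_train)[1] < 0.05]
      print(f"'Significant' correlations found by dredging: {len(hits)}")
      # Expect roughly 5% of 500 features (about 25) to pass by chance.

      # Step 2: the scientific step. Treat each hit as a hypothesis and
      # re-test it on fresh, independent data.
      X_test = rng.normal(size=(n_samples, n_features))
      y_test = rng.normal(size=n_samples)
      replicated = [j for j in hits
                    if pearsonr(X_test[:, j], y_test)[1] < 0.05]
      print(f"Hits that survive a fresh test: {len(replicated)}")
      # Expect roughly 5% of those hits (about 1) to survive: nearly all
      # of the original "discoveries" were chance.
      ```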
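      And a toy version of the cars-comparing-notes idea, as a simple majority-vote check (the labels, the 75% agreement threshold, and the consensus function are all hypothetical, not from any real vehicle stack):

      ```python
      from collections import Counter

      def consensus(votes: list[str], threshold: float = 0.75) -> str:
          """Return the winning label, or 'DISPUTED' if agreement is weak."""
          label, count = Counter(votes).most_common(1)[0]
          return label if count / len(votes) >= threshold else "DISPUTED"

      # Three cars read a stop sign, four read a 45 MPH sign:
      print(consensus(["stop"] * 3 + ["speed_45"] * 4))  # DISPUTED
      # Unanimous agreement clears the threshold:
      print(consensus(["stop"] * 7))                     # stop
      ```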

      • Thank you for clarifying that article! I was confused by the headline, but I thought the article was interesting to ponder. Eliminating human bias when building models (or, more realistically, minimizing it) also seems like one of the biggest challenges for deep-learning models, just as it is for surveys, study design, etc.

      • Hi again Buzz, long time no see! Your points are well made.
        I reckon the biggest piece of human bias that needs eliminating is developers’ blind faith in their new gee-whiz software, which turns out to be poorly designed and implemented, rushed into production well before it is ready and before input is sought from actual local experts in the field to which it is to be applied.
        Example: practically every piece of government software ever. They always go for the cheapest quote and fail to buy the modules that actually, you know, make the system work. Systems are then retrofitted on the fly, with the result that the system always remains a badly cobbled-together and buggy nightmare. Yes, I am a cynic on this.