This is the first part of a 2-part series on the growing importance of teaching Data and AI literacy to our students.  This content will be included in a module I am teaching at Menlo College, but I wanted to share it as a blog to help validate the content before presenting it to my students.

Wow, what an interesting dilemma. Apple plans to introduce new iPhone software that uses artificial intelligence (AI) to churn through the vast collection of photos that people have taken with their iPhones to detect and report child sexual abuse.  See the Wall Street Journal article “Apple Plans to Have iPhones Detect Child Pornography, Fueling Priva…” for more details on Apple’s plan.

Apple has a strong history of working to protect its customers’ privacy.  Its iPhone is basically uncrackable, which has put the company at odds with the US government. For example, the US Attorney General asked Apple to crack its encrypted phones after a December 2019 attack by a Saudi aviation student that killed three people at a Florida Navy base. In 2016, the Justice Department pushed Apple to create a software update that would break the privacy protections of the iPhone to gain access to a phone linked to a dead gunman responsible for a 2015 terrorist attack in San Bernardino, Calif. Time and again, Apple has refused to build tools that break its iPhone’s encryption, saying such software would undermine user privacy.

In fact, Apple has a new commercial in which it promotes its focus on consumer privacy (Figure 1).

Figure 1:  Apple iPhone Privacy Commercial

Now, stopping child pornography is certainly a top societal priority, but at what cost to privacy? This is one of those topics where the answer is not black or white. A number of questions arise, including:

  • How much personal privacy is one willing to give up in trying to halt this abhorrent behavior?
  • How much do we trust the organization (Apple in this case) to use the data only to stop child pornography?
  • How much do we trust that the results of the analysis won’t get into unethical players’ hands and be used for nefarious purposes?

And let’s be sure that we have thoroughly vetted the costs associated with the AI model’s False Positives (accusing an innocent person of child pornography) and False Negatives (missing people who are guilty of child pornography), a topic that I’ll cover in more detail in Part 2.
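To make that trade-off concrete ahead of Part 2, here is a minimal Python sketch of how one might weigh False Positive and False Negative costs against each other. The prevalence, error rates, and cost weights below are entirely hypothetical numbers of my own choosing, not Apple’s actual model or data.

```python
# A minimal sketch (hypothetical numbers) of how False Positives and False
# Negatives carry very different costs for a detection model like the one
# described above.

def expected_cost(num_scanned, prevalence, true_positive_rate,
                  false_positive_rate, cost_fp, cost_fn):
    """Estimate the expected cost of a detector over a population of photos.

    prevalence          - fraction of scanned items that are truly abusive
    true_positive_rate  - fraction of truly abusive items the model flags
    false_positive_rate - fraction of innocent items the model wrongly flags
    cost_fp             - cost assigned to falsely accusing an innocent user
    cost_fn             - cost assigned to missing a guilty one
    """
    positives = num_scanned * prevalence
    negatives = num_scanned - positives

    false_negatives = positives * (1 - true_positive_rate)  # guilty, missed
    false_positives = negatives * false_positive_rate       # innocent, flagged

    return false_positives * cost_fp + false_negatives * cost_fn


# Illustrative run: a tiny false-positive rate applied to a billion innocent
# photos can still produce an enormous number of false accusations.
print(expected_cost(num_scanned=1_000_000_000, prevalence=1e-6,
                    true_positive_rate=0.95, false_positive_rate=1e-4,
                    cost_fp=100, cost_fn=1_000))
```

Even under these made-up assumptions, the sheer volume of innocent photos being scanned means the False Positives can dominate the total cost, which is exactly why those costs need to be thoroughly vetted before deployment.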

Data literacy starts by understanding what data is.  Data is defined as the facts and statistics collected to describe an…

Continue reading: http://www.datasciencecentral.com/xn/detail/6448529:BlogPost:1063830

Source: www.datasciencecentral.com