FEAST-L Archives

April 2020

FEAST-L@LISTSERV.JMU.EDU

From: Youjin Kong <[log in to unmask]>
Date: Wed, 29 Apr 2020 16:37:01 -0700
I also teach courses on ethical and philosophical issues in AI, computer
science, and technology. One of my key course topics is Bias in
Data/Algorithms and Machine Learning Fairness: how AI can reproduce and
reinforce gender and racial oppression. There's a lot of discussion going on
about this topic, and I've found many good resources online! Here are some of
the learning materials I'm using this semester:

   - Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner (2016),
   Machine Bias
   <https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing>,
   *ProPublica*
      - “There’s software used across the country to predict future
      criminals. And it’s biased against blacks.”
      - When An Algorithm Helps Send You to Prison
      <https://www.nytimes.com/2017/10/26/opinion/algorithm-compas-sentencing-bias.html>,
      The New York Times, 2017
   - Joy Buolamwini (2016), How I’m Fighting Bias in Algorithms
   <https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms>,
   TED talk
      - "MIT grad student Joy Buolamwini was working with facial analysis
      software when she noticed a problem: the software didn't detect her
      face -- because the people who coded the algorithm hadn't taught it to
      identify a broad range of skin tones and facial structures."
      - Gender Shades <http://gendershades.org/overview.html>
   - Google Has a History of Bias Against Black Girls
   <http://time.com/5209144/google-search-engine-algorithm-bias-racism/>,
   Time, 2018
      - This is an excerpt from Safiya Umoja Noble's 2018 book *Algorithms
      of Oppression: How Search Engines Reinforce Racism*
      <https://books.google.com/books?id=-ThDDwAAQBAJ&printsec=frontcover#v=onepage&q&f=false>.
   - Cathy O'Neil (2017), The Era of Blind Faith in Big Data Must End
   <https://www.ted.com/talks/cathy_o_neil_the_era_of_blind_faith_in_big_data_must_end>,
   TED talk
      - “Algorithms decide who gets a loan, who gets a job interview, who
      gets insurance and much more -- but they don't automatically make
      things fair.”


...And other case discussions:

   - Machines Taught By Photos Learn a Sexist View of Women
   <https://www.wired.com/story/machines-taught-by-photos-learn-a-sexist-view-of-women/>,
   Wired, 2017
      - “Algorithms showed a tendency to associate women with shopping and
      men with shooting.”
   - Supposedly ‘Fair’ Algorithms Can Perpetuate Bias
   <https://www.wired.com/story/ideas-joi-ito-insurance-algorithms/>,
   Wired, 2019
      - “How the use of AI runs the risk of re-creating the insurance
      industry's inequities of the previous century.”
   - Rise of the Racist Robots: How AI Is Learning All Our Worst Impulses
   <https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses>,
   The Guardian, 2017
      - “When we feed machines data that reflects our prejudices, they
      mimic them.”
   - Amazon Scraps Secret AI Recruiting Tool that Showed Bias Against Women
   <https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G>,
   Reuters, 2018
      - “Amazon.com Inc’s machine-learning specialists uncovered a big
      problem: their new recruiting engine did not like women.”


Best,
Youjin

On Wed, Apr 29, 2020 at 2:49 PM Eric Palmer <[log in to unmask]> wrote:

> Hi Shay and all,
> For political economy (big picture, applied philosophy):
> Shoshana Zuboff's book, In the Age of the Smart Machine. (Or the swift
> read: https://journals.sagepub.com/doi/10.1057/jit.2015.5, or a good pop
> digest version:
> https://www.faz.net/aktuell/feuilleton/debatten/the-digital-debate/shoshana-zuboff-secrets-of-surveillance-capitalism-14103616.html)
>
> And this is a political economy adjunct that is not philosophical, but it
> ties in to out-of-control AI. It’s written in a shock-journalism style, but
> it is really very different and as disturbing as the style suggests (to me,
> at least):
>
> https://medium.com/@jamesbridle/something-is-wrong-on-the-internet-c39c471271d2
> James Bridle repackaged this material into a book, New Dark Age, which I
> haven’t seen.
> Best
> Eric
>
>
> On Apr 29, 2020, at 4:45 PM, Amandine Catala <[log in to unmask]>
> wrote:
>
> Hi Shay,
>
> In case you haven’t come across this reference yet, there’s also the book
> Data Feminism, by Catherine D’Ignazio and Lauren Klein (MIT Press, 2020).
>
> There’s currently an online reading group on the book with the two authors
> on Fridays at 12 pm EST, and all the recordings are available here:
> http://datafeminism.io/blog/book/data-feminism-reading-group/
>
> Best wishes,
> Amandine
>
> Amandine Catala
>
> Professeure agrégée
> Chaire de recherche du Canada sur l’injustice et l’agentivité épistémiques
> Département de philosophie
> Université du Québec à Montréal
> CP 8888, Succ. Centre-Ville
> Montréal H3C 3P8, Québec, Canada
> FR: https://crc-iae.com
>
> Associate Professor
> Canada Research Chair on Epistemic Injustice and Agency
> Department of Philosophy
> University of Quebec at Montreal
> PO Box 8888, Downtown
> Montreal H3C 3P8, Quebec, Canada
> EN: https://crc-iae.com/?lang=en
>
> On Wed, Apr 29, 2020 at 1:07 PM Shay Welch <[log in to unmask]> wrote:
>
>> Hi folks,
>> Like many of you, our school is putting together summer internships for
>> the students since they don't have other options. They have tagged me to
>> do one / I volunteered. I need to do it on Philosophy and Data Science. I
>> was wondering if you had any reading recommendations. I definitely would
>> be interested in the epistemological and ethical aspects.
>>
>> --
>> All my best,
>> Shay Welch
>> Associate Professor of Philosophy
>> Spelman College
>>
>>
>> ############################
>>
>> To unsubscribe from the FEAST-L list: write to:
>> mailto:[log in to unmask] or click the following
>> link: http://listserv.jmu.edu/cgi-bin/wa?SUBED1=FEAST-L&A=1
>>


-- 
Youjin Kong, Ph.D.
Visiting Assistant Professor, Philosophy
Oregon State University
[log in to unmask]

CV: http://www.youjinkong.com
Pronouns: she/her/hers


