Tour Dates – “We Like to Move It, Move It (Just Another Immigration Variety Show)”

We are very excited to announce tour dates for “We Like to Move It, Move It: Just Another Immigration Variety Show”, created by playwright Amy Ng and Olivier Award-winning director Donnacadh O’Briain in collaboration with our very own Dr Sarah Fine, and supported by Arts Council England and King’s College London. For more info and to book tickets, please visit ice&fire’s website: https://iceandfire.co.uk/project/wltmimi/

More about the show!

How do you solve a problem like immigration? Karaoke, moral philosophy and immigration controls come together in an all-immigrant variety show, serving up jokes, songs and plenty to chew over. There’s something for everyone! (But should there be?)

What is behind our collective acceptance of immigration control? What does it say about us and what do those who have come to the UK from somewhere else want to say about it? ice&fire theatre and Matthew Schmolle productions in collaboration with the Philosophy Department at King’s College London invite you to join the (very jolly) conversation. 

Co-produced with Matthew Schmolle Productions

Written by Amy Ng and Donnacadh O’Briain

Produced by ice&fire theatre and Matthew Schmolle Productions

The Company – Jahmila Heath, Tomoko Komura, Gaël Le Cornec and Sergio Maggiolo

Director – Donnacadh O’Briain

Set and Costume Designer – Elizabeth Rose

Sound Designer – Tingying Dong

Stage Manager – Kayleigh Atkinson

A special event this Thursday…


The Yeoh Tiong Lay Centre for Politics, Philosophy & Law, the London Medical Imaging & AI Centre for Value Based Healthcare and the Sowerby Philosophy & Medicine Project are very pleased to announce our jointly organized Special Legal-Themed Panel Discussion on Stereotyping & Medical AI, which will form the 5th instalment of the Philosophy & Medicine Project’s Stereotyping & Medical AI online summer colloquium series!

UPDATE: We are delighted to announce that chairing our talk will be Robin Carpenter, who is the Senior Research Data Governance Manager at the London Medical Imaging & AI Centre for Value Based Healthcare!

Special Legal-Themed Panel Discussion on Stereotyping and Medical AI

Jointly Organized by the Yeoh Tiong Lay Centre for Politics, Philosophy & Law, The London Medical Imaging & AI Centre for Value Based Healthcare & the Philosophy & Medicine Project

Panellists:

Dr. Jonathan Gingerich (KCL)

Lecturer in the Philosophy of Law at the Yeoh Tiong Lay Centre for Politics, Philosophy & Law

Dr. Reuben Binns (Oxford)

Associate Professor of Human Centred Computing

Prof. Georgi Gardiner (Tennessee)

Associate Professor of Philosophy

Prof. David Papineau (KCL)

Professor of Philosophy of Science

Chair:

Robin Carpenter (The London Medical Imaging & AI Centre for Value Based Healthcare)

Senior Research Data Governance Manager

When: Thursday 29th of July, 5pm BST

REGISTER and find out more about the event here

If we understand “statistical stereotyping” as forming beliefs about individuals on the basis of statistical generalizations about the groups to which those individuals belong, can it be legally problematic to statistically stereotype patients in medicine, whether these beliefs are formed by medical AI/artificial agents or by medical professionals? In this Special Legal-Themed Panel Discussion, we’ll hear from experts in law, computer science, and philosophy on this and related questions about the legal aspects of stereotyping in medicine, by both human and artificial agents.

* For those unable to attend these colloquia, please feel free to register for our events in order to be notified once recordings of previous colloquia become available! You can also subscribe to the Philosophy & Medicine Project’s newsletter here, or follow us on Twitter or Facebook. Follow the YTL Centre at King’s on Twitter here and the London Medical Imaging & AI Centre for Value Based Healthcare here. Previous colloquia will also be posted to the Philosophy & Medicine Project’s YouTube channel.

About the Stereotyping and Medical AI Summer Colloquium Series

The aim of this fortnightly colloquium series on Stereotyping and Medical AI is to explore philosophical and in particular ethical and epistemological issues around stereotyping in medicine, with a specific focus on the use of artificial intelligence in health contexts. We are particularly interested in whether medical AI that uses statistical data to generate predictions about individual patients can be said to “stereotype” patients, and whether we should draw the same ethical and epistemic conclusions about stereotyping by artificial agents as we do about stereotyping by human agents, i.e., medical professionals.  

Other questions we are interested in exploring as part of this series include but are not limited to the following: 

  • How should we understand “stereotyping” in medical contexts? 
  • What is the relationship between stereotyping and bias, including algorithmic bias (and how should we understand “bias” in different contexts?)? 
  • Why does stereotyping in medicine often seem less morally or epistemically problematic than stereotyping in other domains, such as in legal, criminal, financial, educational, etc., domains? Might beliefs about biological racial realism in the medical context explain this asymmetry? 
  • When and why might it be wrong for medical professionals to stereotype their patients? And when and why might it be wrong for medical AI, i.e. artificial agents, to stereotype patients? 
  • How do (medical) AI beliefs relate to the beliefs of human agents, particularly with respect to agents’ moral responsibility for their beliefs? 
  • Can non-evidential or non-truth-related considerations be relevant with respect to what beliefs medical professionals or medical AI ought to hold? Is there moral or pragmatic encroachment on AI beliefs or on the beliefs of medical professionals? 
  • What are potential consequences of either patients or doctors being stereotyped by doctors or by medical AI in medicine? Can, for example, patients be doxastically wronged by doctors or AI in virtue of being stereotyped by them? 

We will be tackling these topics through a series of online colloquia hosted by the Sowerby Philosophy and Medicine Project at King’s College London. The colloquium series will feature a variety of contributors from across the disciplinary spectrum. We hope to ensure a discursive format with time set aside for discussion and Q&A by the audience. This event is open to the public and all are very welcome.

Our working line-up for the remainder of this summer series is as follows, with a few additional speakers and details to be confirmed:

June 17            Professor Erin Beeghly (Utah), “Stereotyping and Prejudice: The Problem of Statistical Stereotyping” 

July 1               Dr. Kathleen Creel (HAI, EIS, Stanford), “Let’s Ask the Patient: Stereotypes, Personalization, and Risk in Medical AI” (recording linked)

July 15             Dr. Annette Zimmermann (York, Harvard), “Structural Injustice, Doxastic Negligence, and Medical AI” 

July 22             Dr. William McNeill (Southampton), “Neural Networks and Explanatory Opacity” (recording linked)

July 29             Special Legal-Themed Panel Discussion: Dr. Jonathan Gingerich (KCL), Dr. Reuben Binns (Oxford), Prof. Georgi Gardiner (Tennessee), Prof. David Papineau (KCL), Chair: Robin Carpenter (The London Medical Imaging & AI Centre for Value Based Healthcare) (link to register)

August 12        Professor Zoë Johnson King (USC) & Professor Boris Babic (Toronto), “Algorithmic Fairness and Resentment”

August 26        Speakers TBC

September 2    Dr. Geoff Keeling (HAI, LCFI, Google)

September 9    Professor Rima Basu (Claremont McKenna)   

All best wishes, and we very much hope you can join us! 

The Organizers (Dr. Jonathan Gingerich, Robin Carpenter, Professor Elselijn Kingma, Dr. Winnie Ma, and Eveliina Ilola)

Not your stereotypical summer? Try this…


Stereotyping and Medical AI
 
Online Summer Colloquium Series

by the Sowerby Philosophy & Medicine Project


To find out more about this series, please visit the Philosophy & Medicine Project’s website: https://www.philosophyandmedicine.org/summer-series. Our next colloquium in the series will be a Special Legal-Themed Panel Discussion chaired by a member of the London Medical Imaging & AI Centre for Value Based Healthcare, and featuring our very own Professor David Papineau and Dr. Jonathan Gingerich (which you can register for here)!


To be notified about upcoming colloquia in the series and other Project events, you can subscribe to the Philosophy & Medicine Project’s newsletter here, or follow us on Twitter or Facebook. Previous colloquia will also be posted to the Philosophy & Medicine Project’s website and YouTube channel. (And for those unable to attend these colloquia, please feel free to register for our events in order to be notified once recordings of previous colloquia become available!)

Final Call! Lecturer in Philosophy

The Philosophy Department at King’s College London is seeking an excellent philosopher with outstanding research expertise and teaching experience in one or more of the areas where it currently has teaching needs: Political Philosophy, Epistemology and Logic. This is a fixed-term one-year contract.

Closing date for applications: 3rd Aug 2021

Further details here

Start date: 1st September 2021

Fake News? So what?

Dr Eliot Michaelson has recently been published in Public Ethics with a piece titled ‘What Fake News Is and Why that Matters’. As Michaelson puts it, there is a difference between a false story and fake news. The former can arise from accidents or sloppy preparation. The latter, however, has a pernicious moral tang that we would do well to articulate and be wary of. What do you think?

Read the full and very engaging article here. We think it’s so clear and bright that you can even read it in the sunshine.

Listen to Dr Eleanor Knox on the BBC’s Inside Science

Epidemiologist Julian Peto is advocating mass testing as the key part of a plan to stop the virus spreading. Studies where everyone has been tested have picked up asymptomatic cases. With the addition of isolation and contact tracing, this method of testing has been able to massively reduce the spread of the virus. The hope is that such a coordinated scheme, implemented nationally, could help bring the numbers down. There’s a question over which type of test is best to use for mass testing. At the moment many of us do lateral flow tests at home. Although they give instant results, their accuracy has been shown to be strongly linked to how well the tests are conducted – hence the need to back up any positive findings with the more accurate PCR test.

PCR takes longer and needs sophisticated lab equipment. However, a compromise could be to use RT-LAMP tests: they are accurate, give results in around 20 minutes, and require only a very basic lab without the expensive equipment of PCR. A number of RT-LAMP tests have now been developed for SARS-CoV-2. Kevin Fong has been to see the developers of one of them, the OxLAMP test.

And with the lifting of restrictions how are you going to judge your own personal risk from Covid?

It’s a question that interests philosopher of science Eleanor Knox. She says government mandates on mask wearing and social distancing have allowed us to avoid tricky questions around our own potential risk from the virus and risks our own behaviour might pose to loved ones. Now there’s a lot more to think about in terms of balancing our desires to return to some semblance of normality while levels of Covid infection continue to rise.

Listen to this broadcast from BBC Radio 4’s Inside Science, here: https://www.bbc.co.uk/sounds/play/m000xtb6

Tonight: The YTL Centre Annual Lecture in Politics, Philosophy and Law, “The Dignity of Old Age” by Jeremy Waldron (NYU).

Join us tonight for the Annual Lecture of the YTL Centre.

Tickets are here: https://www.eventbrite.co.uk/e/the-ytl-centre-annual-lecture-the-dignity-of-old-age-tickets-156900279961

This year the lecture will be given by Jeremy Waldron (NYU), with replies from Stephen Darwall (Yale), Frances Kamm (Rutgers), Rae Langton and Richard Holton (Cambridge).

The lecture will take place on Teams on 8 July 2021, 16:00 – 18:00 BST.

Please join us by registering on Eventbrite.

Hiring: Two Lectureships – Political Philosophy, and Ethics or Epistemology (both indefinite contracts)

https://jobs.kcl.ac.uk/gb/en/job/025965/Lectureship-in-Political-Philosophy

https://jobs.kcl.ac.uk/gb/en/job/025928/Lectureship-in-Ethics-or-Epistemology

Political Philosophy:

The Philosophy Department at King’s College London is seeking an outstanding philosopher with research expertise and teaching experience in political philosophy. Research specialization, competence and ability to teach and supervise students at all levels in political philosophy are required.

Research or teaching expertise or competence in areas that will help widen or consolidate our curriculum are desirable. These areas include, but are not limited to, non-Western philosophy, logic, and philosophical issues concerning race and gender.

This post will be offered on an indefinite contract. This is a full-time post – 100% full time equivalent. Closing date: 3rd August.

Ethics or Epistemology:

The Philosophy Department at King’s College London is seeking an outstanding philosopher with research expertise and teaching experience in ethics or epistemology, broadly construed. Research specialization, competence and ability to teach at all levels and supervise postgraduate students in one of those areas are required. 

Research, teaching expertise or competence in areas that will help widen or consolidate our curriculum are desirable. These areas include, but are not limited to, non-Western philosophy, logic, and philosophical issues concerning race and gender. 

This post will be offered on an indefinite contract. This is a full-time post – 100% full time equivalent. Closing date: 3rd August.

King’s Philosophy Department is one of the largest and most distinguished departments in the UK. We have particular research strengths in the history of philosophy, philosophy of mind and psychology, philosophy of language and logic, metaphysics, epistemology and philosophy of science, and moral and political philosophy. 

Further information 

Applicants should include the following with their application:  

(1)    CV, with a list of publications  

(2)    a personal statement (around 500-1,000 words) 

(3)    the names and contact details of two referees 

(4)    two recent pieces of research on a topic relevant to the post of no more than 8,000 words each (these may be indicated portions of a larger piece of work).  

The Department will request references for longlisted candidates.  Presentations and interviews of shortlisted candidates will take place online. Start date: as early as possible during the academic year 2021-22. 

We welcome applications from all and encourage applications especially from members of groups underrepresented in UK academic Philosophy and from people marginalised on any of the grounds enumerated under the UK Equality Act 2010. 

Do you want to write a review?

Would you like to write a short review (around 200 words) for our upcoming ‘Sound Pictures’ conference (pre-watch available now, live keynote and Q&A on 10th July)? Choose from a selection of ‘watch ahead’ talks: for example, Professor Derek Matravers’s video on mixed perceptual modalities, a novel philosophical argument about songwriting (complete with musical performances) from NYU’s Jenny Judge, or a fresh and critical podcast from our very own Colette Olive (KCL), as well as several other academic contributions. There are also recorded musical performances and interviews with BAFTA-nominated film composer Anne Chmelewsky, and never-before-seen performances from multi-award-winning violinist and composer Anna Phoebe and Tate artist Nicola Durvasula. It’s a philosophy conference – just done a little bit differently – and open to anyone who has ever wondered about the nature of the connection between sound and image.

Interested to find out more? Here’s the topic overview film. If it intrigues and inspires you, register for all the pre-watch here, and get in touch with us at philosophyandvisualarts@gmail.com about writing a review.

The conference is aimed at a broad audience, so we hope there is something here to engage with philosophically for artists, musicians, undergraduate students from a broad variety of disciplines, and of course, for researchers working on the topic. The introduction film and interviews are aimed primarily at those less familiar with what is distinctive about this question philosophically, or with a particular speaker’s work, or who are newly interested in the kinds of questions we have posed.

This conference is generously sponsored by a small grant from the British Society of Aesthetics.

CFA: Sound Pictures - Music & Philosophy