www.oii.ox.ac.uk (163.1.201.43)
Submitted URL: http://xlpkz.mjt.lu/lnk/AUsAAAm3XB8AAcub50wAAR6kFhMAAYAyHtEAnNDsAA6KKgBjySB_sNUWwu91QWuBB8z6ZFYoVwAOYZc/6/mNzAOlZur5...
Effective URL: https://www.oii.ox.ac.uk/news-events/news/ai-creates-unintuitive-and-unconventional-groups-to-make-life-changing-decision...
Submission: On January 30 via manual from GB — Scanned from GB
PRESS RELEASE: CURRENT DISCRIMINATION LAWS FAILING TO PROTECT PEOPLE FROM AI-GENERATED UNFAIR OUTCOMES

Published on 26 May 2022. Written by Sandra Wachter.

A new paper from an Oxford academic calls for changes to current laws to protect the public from AI-generated unfair outcomes.

AI creates unintuitive and unconventional groups to make life-changing decisions, yet current laws do not protect members of online algorithmic groups from AI-generated unfair outcomes, says a new paper from a leading Oxford academic.

A paper from Professor Sandra Wachter at the Oxford Internet Institute, published today, reveals that the public is increasingly the unwitting subject of new, worrying forms of discrimination due to the growing use of Artificial Intelligence (AI). For example, using a certain type of web browser, such as Internet Explorer or Safari, can result in a job applicant being less successful when applying online. Candidates in online interviews may be assessed by facial recognition software that tracks facial expressions, eye movements, respiration or sweat.
The paper argues there is an urgent need to amend current laws to protect the public from this emergent discrimination arising from the increased use of Artificial Intelligence. In 'The Theory of Artificial Immutability: Protecting Algorithmic Groups under Anti-Discrimination Law', author Professor Sandra Wachter highlights that AI is creating new digital groups in society – algorithmic groups – whose members are at risk of being discriminated against. These individuals should be protected by reinterpreting existing non-discrimination law, she argues, and she outlines how this could be achieved.

AI-related discrimination can occur in very ordinary, everyday activities, with individuals having little awareness of it. In addition to job applications, other scenarios include applying for a financial loan, where an applicant is more likely to be rejected if they use only lower-case letters when completing their digital application – or if they scroll too quickly through the application pages.

The paper highlights that these new forms of discrimination often do not fit the traditional norms of what is currently considered discrimination and prejudice. AI challenges our assumptions about legal discrimination: it identifies and categorises individuals based on criteria that are not currently protected under the law. Familiar categories such as race, gender, sexual orientation and ability are replaced by groups like dog owners, video gamers, Safari users or "fast scrollers" when AI makes hiring, loan, or insurance decisions.

Professor Sandra Wachter explains why this is important: "Increasingly, decisions being made by AI programmes can prevent equal and fair access to basic goods and services such as education, healthcare, housing, or employment. AI systems are now widely used to profile people and make key decisions that impact their lives.
Traditional norms and ideas of defining discrimination in law are no longer fit for purpose in the case of AI, and I am calling for changes to bring AI within the scope of the law."

Professor Wachter's new theory is based on the concept of 'artificial immutability'. She has identified five features of 'artificial immutability' – opacity, vagueness, instability, involuntariness and invisibility – that contribute towards discrimination. Reconceptualising the law's envisioned harms is required to assess whether new algorithmic groups offer a normatively and ethically acceptable basis for important decisions. To do so, greater emphasis needs to be placed on whether people have control over decision criteria and whether they are able to achieve important goals and steer their path in life.

Read the full paper, 'The Theory of Artificial Immutability: Protecting Algorithmic Groups under Anti-Discrimination Law', by Professor Sandra Wachter.

For more information call +44 (0)1865 287 210 or contact press@oii.ox.ac.uk.

ENDS

Notes for Editors

About the OII

The Oxford Internet Institute (OII) is a multidisciplinary research and teaching department of the University of Oxford, dedicated to the social science of the Internet. Drawing from many different disciplines, the OII works to understand how individual and collective behaviour online shapes our social, economic and political world. Since its founding in 2001, research from the OII has had a significant impact on policy debate, formulation and implementation around the globe, as well as a secondary impact on people's wellbeing, safety and understanding. The OII takes a combined approach to tackling society's big questions, with the aim of positively shaping the development of the digital world.
AUTHOR

Professor Sandra Wachter, Professor of Technology and Regulation

Professor Sandra Wachter is Professor of Technology and Regulation at the Oxford Internet Institute, University of Oxford, focusing on the law and ethics of AI, Big Data, and robotics, as well as Internet regulation.

RELATED PROJECT

Programme on the Governance of Emerging Technologies (active): this OII research programme investigates legal, ethical, and social aspects of AI, machine learning, and other emerging information technologies.

RELATED TOPICS

Artificial Intelligence; Governance of Technology