The National Institute of Standards and Technology (NIST) has published guidance for identifying and managing bias in artificial intelligence. The National Fair Housing Alliance, which works to ensure that communities of color and other underserved communities have fair access to mortgages and other services, has been weighing that guidance.
- Michael Akinwumi, chief tech equity officer at the National Fair Housing Alliance, said artificial intelligence can exacerbate discrimination and bias when the data used to develop algorithms underrepresents certain groups.
- Akinwumi said people can also use artificial intelligence to find and remove bias.
- Akinwumi said NIST did a good job of identifying three stages in the development of an artificial intelligence solution where bias can arise (pre-design, design, and deployment), but argued there should be a fourth stage: monitoring.