Shaun Moore | How Bias Impacts AI | Rise of AI Conference 2023

Understand how bias impacts AI, particularly in facial recognition, and learn strategies for designing, training, and deploying fair and accurate technology.

Key takeaways
  • Bias is inherent in facial recognition algorithms, impacting accuracy, fairness, and trust.
  • Facial recognition algorithms can prioritize features associated with gender and race, not just the features that make an individual face unique.
  • Companies can insert bias when designing, training, and deploying facial recognition technology.
  • Labeling bias occurs when humans label the images used for facial recognition, since the labels can be shaped by the labelers' personal biases.
  • Unrepresentative data, measurement bias, and omitted variable bias can also introduce bias (see the audit sketch after this list).
  • Understanding and mitigating bias is crucial for ethical AI development.
  • Explainability is critical for building trust in AI systems.
  • Data protection, regulation, and transparency are essential for ensuring accountability and fairness.
  • Companies must prioritize humanity and ethics when developing facial recognition technology.
  • The industry lacks a universally accepted definition of bias in AI.
  • Companies must develop internal definitions and guidelines for bias identification and mitigation.
  • Aggregating data from multiple sources without normalization can introduce bias (see the normalization sketch after this list).
  • Hyperlocal and hyperspecific data can be biased towards a particular group or population.
  • Synthetic data can be used to train facial recognition algorithms, but it may not accurately reflect real-world variability.
  • Camera placement, lighting, and environmental conditions can affect facial recognition accuracy.
  • Companies must work directly with customers to understand their use cases and ensure fairness and transparency.
  • The public must be educated about the limitations and potential biases of facial recognition technology.
  • Debate and discussion are necessary to move towards a clearer understanding of bias in AI and to develop effective solutions.
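
The talk presents no code, but the accuracy disparities it describes can be made concrete with a per-group error audit. The sketch below is illustrative only: `MatchRecord` and `audit_by_group` are hypothetical names, and it assumes a matcher that outputs a binary match decision at a fixed threshold plus dataset metadata carrying a demographic group label. It reports the false match rate (FMR) and false non-match rate (FNMR) separately per group, which is how skews from unrepresentative training data typically surface.

```python
# Illustrative sketch only; MatchRecord and audit_by_group are hypothetical,
# not from the talk. Assumes binary match decisions at a fixed threshold
# and demographic group labels from dataset metadata.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class MatchRecord:
    group: str             # demographic group label from dataset metadata
    is_same_person: bool   # ground truth: do the two images show one person?
    predicted_match: bool  # the matcher's decision at a fixed threshold

def audit_by_group(records):
    """Print false match rate (FMR) and false non-match rate (FNMR) per group."""
    tallies = defaultdict(lambda: {"fm": 0, "imp": 0, "fnm": 0, "gen": 0})
    for r in records:
        t = tallies[r.group]
        if r.is_same_person:
            t["gen"] += 1                      # genuine pair
            t["fnm"] += not r.predicted_match  # missed a true match
        else:
            t["imp"] += 1                      # impostor pair
            t["fm"] += r.predicted_match       # accepted a false match
    for group, t in sorted(tallies.items()):
        fmr = t["fm"] / t["imp"] if t["imp"] else float("nan")
        fnmr = t["fnm"] / t["gen"] if t["gen"] else float("nan")
        print(f"{group}: FMR={fmr:.4f}  FNMR={fnmr:.4f}")

if __name__ == "__main__":
    audit_by_group([
        MatchRecord("group_a", True, True),
        MatchRecord("group_a", False, True),   # false match
        MatchRecord("group_b", True, False),   # false non-match
        MatchRecord("group_b", False, False),
    ])
```

Large gaps in FMR or FNMR between groups are the measurable face of the bias the talk describes; a single aggregate accuracy number would hide them.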
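The aggregation point can be sketched the same way. Assuming, hypothetically, that each source (a camera, a vendor dataset) produces similarity scores on its own scale, pooling raw scores lets one source dominate; a per-source z-score normalization puts them on a comparable scale first. `zscore_by_source` is an invented name for illustration, not a method from the talk.

```python
# Illustrative sketch only; zscore_by_source is a hypothetical helper,
# not from the talk. Assumes each source yields raw similarity scores
# on its own, possibly incompatible, scale.
import statistics

def zscore_by_source(scores_by_source):
    """Z-score each source's scores before pooling them into one list."""
    pooled = []
    for source, scores in scores_by_source.items():
        mu = statistics.mean(scores)
        # Guard degenerate cases: a single score or zero spread.
        sigma = statistics.stdev(scores) if len(scores) > 1 else 1.0
        pooled.extend((s - mu) / (sigma or 1.0) for s in scores)
    return pooled

# Example: two cameras with very different score distributions.
pooled = zscore_by_source({
    "camera_1": [0.80, 0.82, 0.79],  # tight, high-scoring source
    "camera_2": [0.30, 0.55, 0.40],  # wide, low-scoring source
})
```

Without a step like this, a threshold tuned on one source silently over- or under-matches on another, which is one way aggregation bias enters a deployed system.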