Bias (Machine Learning)
Systematic errors or unfairness in AI models that lead to skewed results. This can happen when training data isn't representative (like training a hiring AI only on resumes from men) or when an algorithm builds in flawed assumptions. Bias can perpetuate real-world inequalities, so identifying and reducing it is crucial for building fair AI systems that work well for everyone.
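
As a rough illustration of what "identifying bias" can look like in practice, the sketch below compares a model's positive-prediction rates across two groups, a simple demographic-parity check. All names and numbers here are hypothetical; real audits use real data and a broader set of fairness metrics.

```python
# Minimal sketch of one common bias check: comparing a model's positive
# prediction ("selection") rates across groups. Data is made up for illustration.

from collections import defaultdict

def selection_rates(predictions, groups):
    """Return the fraction of positive predictions for each group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred)
    return {g: positives[g] / totals[g] for g in totals}

# Hypothetical outputs from a hiring model (1 = advance to interview)
# and the applicant group each prediction belongs to.
preds  = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

rates = selection_rates(preds, groups)
print(rates)  # {'A': 0.8, 'B': 0.4}

# Demographic parity difference: the gap between the highest and lowest
# selection rates. A large gap is one signal that the model may be biased.
gap = max(rates.values()) - min(rates.values())
print(f"selection-rate gap: {gap:.2f}")  # 0.40
```

A gap of zero doesn't guarantee fairness, and a nonzero gap isn't proof of unfairness on its own, but checks like this are a common first step in spotting skewed model behavior.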