Designing human experiences in the age of AI.
Links curated by Peter Polgar.
For a weekly newsletter of hand-selected links, sign up here.
Posted on November 6, 2017 by polgarp · Tagged in Data bias
Original link: http://www.slate.com/articles/technology/technology/2017/10/what_happens_when_the_data_used_to_train_a_i_is_biased_and_old.html
It’s important to realize that most publicly available data sets, and a large portion of non-public, non-trivial ones, contain significant bias that distorts the results of the algorithms trained on them. The simple fact is that earlier datasets were not collected according to today’s standards on bias, and were not created with avoiding bias in mind. Since data is needed to design (maybe even just to prototype) and release any AI-based product, this should be a concern for most product teams. I’m not sure this type of bias can be fully uncovered by quantitative analysis. User research should help by building an understanding of the human context around the data.
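To make the quantitative-analysis point concrete, here is a minimal sketch of one common statistical bias check: comparing positive-label rates across subgroups of a dataset (a demographic parity gap). The records, group names, and labels below are entirely hypothetical, for illustration only; a check like this can flag skew in the numbers, but it cannot surface the human context the author argues user research is needed for.

```python
# Hypothetical example records: each has a subgroup marker and a binary label.
records = [
    {"group": "A", "label": 1},
    {"group": "A", "label": 1},
    {"group": "A", "label": 0},
    {"group": "B", "label": 1},
    {"group": "B", "label": 0},
    {"group": "B", "label": 0},
]

def positive_rate(records, group):
    """Share of records in the given subgroup that carry a positive label."""
    labels = [r["label"] for r in records if r["group"] == group]
    return sum(labels) / len(labels)

rate_a = positive_rate(records, "A")
rate_b = positive_rate(records, "B")
# A large gap between subgroup rates is one crude signal of dataset skew.
gap = abs(rate_a - rate_b)
print(f"group A: {rate_a:.2f}, group B: {rate_b:.2f}, gap: {gap:.2f}")
```

A gap like this only tells you the labels are distributed unevenly; whether that unevenness reflects reality, a flawed collection process, or historical discrimination is exactly the question numbers alone cannot answer.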