WARNING! Big data could be biased and may also remove humanity from the equation
Eugenia Anastassiou
11/8/2021 · 1 min read
We all like to think of technology as a neutrally logical version of Star Trek's Mr Spock, who finds humans illogical and their "foolish emotions a constant irritant." However, that thinking is "highly illogical", because machine-learning programs are still very much open to human prejudices, emotions, thoughts and behaviours.
Like it or not, bias is a natural component of human thought processes, and this 'prejudice' can be exacerbated in two ways: first, in the way computers build simulated logical patterns from an input of data in order to form conclusions; and second, because the actual data mined from people may itself be flawed, reflecting their bias.
This 'Catch-Tech 22' scenario makes it incredibly difficult to retain accuracy while gathering and adjusting data to remove bias; more fundamentally, determining what counts as bias is highly subjective.
By that logic, who is neutral enough to judge bias: people or machines?
We have been made painfully aware of the serious repercussions biased data can have on our society – its effects on information, media, politics, education, healthcare and the organisations that govern every aspect of our lives. If our technology is feeding 'prejudices' back into our consciousness, however subtly, that 'biased' message could be subconsciously reinforced to all sorts of ends.
Even in our messy, illogical humanity, people still have the capacity for judgement, and some of us will call out what we perceive as ethically and morally 'wrong' or prejudiced.
As the ever neutrally logical Mr Spock said, "Logic is the beginning of wisdom ... not the end."