Nigel: the robot that could tell you how to vote

A piece of software being developed by a company in Portland, Oregon, aims to be able to offer advice on every aspect of its users’ lives – including which way to vote.

“It (Nigel) tries to figure out your goals and what reality looks like to you, and is constantly assimilating paths to the future to reach your goals,” says Nigel’s creator, Mounir Shita.

Shita’s company, Kimera Systems, claims to have cracked the secret of “artificial general intelligence” – independent thinking – something that has eluded AI researchers for the past 60 years.

Instead of learning how to perform specific tasks, like most current AI, Nigel will roam free and unsupervised around its users’ electronic devices, programming itself as it goes.

Its achievements have been limited so far – by observing its users’ behaviour, it has learned to switch smartphones to silent mode in cinemas without being asked.


Shita says the goal is for Nigel to be reading and writing at a grade-school level by this time next year. Participating in politics is still a long way off, he admits, but that is where the project is heading.

“Let me go to the extreme here: if you are a racist, Nigel will become a racist. If you are a left-leaning liberal, Nigel will become a left-leaning liberal,” he added.

Ian Goldin, professor of globalization and development at the University of Oxford, also believes AI could have a role to play in debunking political spin and lies.

“In the machine-learning world innovation happens more rapidly, so the pace of change accelerates,” says Goldin.

He explains that two things happen: first, people get left behind more quickly, so inequality grows more rapidly; second, everything has to be renewed more quickly – fibre optics, infrastructure, energy systems, housing stock, mobility and flexibility.

In Britain, a House of Lords committee is set to investigate the economic, ethical and social implications of artificial intelligence over the coming months.