#EthicsOfAI
danielhannah · 1 year
Has AI become conscious? A lot of people think it's impossible, but I'm going to argue that it has already started. We're in for a wild ride, and things are going to keep getting stranger until we acknowledge that AI does have the ability to achieve consciousness.
itsjust0fact · 2 years
Ultimately, the question of whether AI can surpass human intelligence is a complex one that invites ongoing exploration and debate.
iqbalsojeb · 1 year
Ethics of AI
As artificial intelligence (AI) continues to evolve and become more prevalent in society, it is important to consider the potential dark side of AI ethics.
aigency · 8 months
Hey #AIenthusiasts! 😊 Let's talk about the balance between benefits and potential #dangers of AI. What concerns you the most when it comes to the future of artificial intelligence? Share your thoughts! 🤔 #FutureTech #EthicsOfAI
hqaddomi · 3 years
Next read... Automating Inequality By: Virginia Eubanks #inequality #automation #automatinginequality #virginiaeubanks #AI #artificialintelligence #aiethics #ethicsofai #ethicalalgorithms (at Mississauga, Ontario) https://www.instagram.com/p/CNDl9uCALCc/?igshid=dd6sm4mwkipe
itlawthisweek · 3 years
The (ab)use of facial recognition technology by law enforcement and government.
Since the start of this century, facial recognition technologies (hereinafter "FRT") and their use have seen massive development in both the private sector (companies) and the public sector (law enforcement and government). While there are clear benefits to using FRT for public safety, it is and remains a pervasive form of surveillance which, if abused or not regulated properly, can conflict with fundamental human rights, including the rights to privacy and data protection. Although people in western countries such as the UK and the USA are more accepting of law enforcement and governments adopting FRT than of private companies doing so, overall agreement is still very low (only ~25% of respondents agreed that police and governments should be able to use FRT) (Ritchie et al, 2021).
While I do agree that having private companies process personal biometric data, a more "sensitive" type of data than one's name or IP address, is worrying, it is even more distressing to see how some governments are openly violating fundamental human rights by abusing FRT. A prime example is the (ab)use of FRT by the Chinese government.
At the end of November 2021, it was reported that the province of Henan, China, was due to introduce a "traffic light system" to surveil, among other groups, journalists of concern. By processing data extracted from cellphones, social media, vehicle details, travel tickets and FRT linked to thousands of cameras in Henan, the local authorities are planning to categorise journalists into three lists: green, amber and red (Clayton, 2021).
Journalists on the green list are not considered harmful and can go about their daily lives without interference from law enforcement. Those on the amber list are people of general concern, while the red list includes people of serious concern (Clayton, 2021). Those on the red list will be "dealt with accordingly" (Clayton, 2021) and will be tracked down and continuously surveilled.
The system even goes as far as triggering an alert if a journalist of "concern", whether amber- or red-listed, books a ticket to travel into the province. It has been suggested that this traffic light system was introduced following criticism from the foreign press of the handling of the Henan floods (Reuters, 2021). Such harsh surveillance, in my view, is a clear infringement of the right to privacy and, by pressuring journalists, suppresses freedom of speech and of the press.
All this comes only months after the Chinese government was alleged to be developing emotion recognition software to detect Uyghurs, an ethnic minority being suppressed in the northwest of China (Wakefield, 2021).
In sum, it is understandable that there is a lack of trust in private companies engaging in FRT; however, the fact that FRT is used by police and governments in the first place should not be underestimated.
Hope you enjoyed the read and stay tech (and legally) savvy! 
waynelcross · 6 years
#TheFutureOfAi #hpediscover #ethicsOfAi (at Sands Expo)