Call of Duty: Modern Warfare III to Combat Toxicity with AI

Activision Introduces 'ToxMod' to Filter Toxic Voice Chats in Upcoming Release

In a significant move towards fostering a healthier gaming environment, Activision, publisher of the Call of Duty franchise, has unveiled plans to use artificial intelligence for voice chat moderation in its highly anticipated title, Call of Duty: Modern Warfare III, scheduled for release on November 10, 2023.

[Image: AI to filter toxic voice chats]

'ToxMod': The AI-Powered Solution

Detecting Toxicity in Voice Chats

Activision has partnered with Modulate to build a voice chat moderation tool known as 'ToxMod.' The tool uses machine learning to identify and act on several forms of toxicity common in online gaming, including:

- Hate speech: swift detection of offensive language and hate-driven conversations.
- Harassment: measures against in-game harassment.
- Bullying: countering bullying to keep matches a safe space.
- Sexism: recognizing and addressing sexist remarks.
- Discriminatory language: identifying and addressing discriminatory remarks.

A rough, illustrative sketch of how a voice moderation pipeline of this general kind can be structured appears after the FAQ below.

An Arsenal Against Toxicity

Reinforcing Anti-Toxicity Measures

'ToxMod' will complement Call of Duty's existing anti-toxicity toolkit, which already includes text-based filtering across 14 languages for in-game chat as well as a player reporting system.

Stay tuned for a more inclusive and respectful gaming experience as Activision takes a bold step towards curbing toxicity in Call of Duty: Modern Warfare III.

FAQs About AI to Filter Toxic Voice Chat

How does 'ToxMod' work in identifying toxic voice chats?
'ToxMod' uses machine learning to recognize and flag toxic speech, including hate speech, harassment, bullying, sexism, and discriminatory language.

What are Call of Duty's existing anti-toxicity measures?
Apart from 'ToxMod,' Call of Duty employs text-based filtering across 14 languages for in-game chat and a player reporting system.
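Modulate has not published ToxMod's internals, so the sketch below is only a minimal, hypothetical illustration of how an ML voice moderation pipeline of this general shape can be put together: transcribe a clip, score the transcript against the toxicity categories listed above, and flag anything that crosses a review threshold. Every name in it (transcribe, score_toxicity, moderate_clip) is invented for the example, and the placeholder functions stand in for real speech-to-text and classifier models.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical sketch of a voice-moderation pipeline (not ToxMod's actual design):
# 1) speech-to-text on a voice chat clip, 2) per-category toxicity scoring,
# 3) flag clips whose score crosses a review threshold.

CATEGORIES = ["hate_speech", "harassment", "bullying", "sexism", "discriminatory_language"]

@dataclass
class ModerationResult:
    transcript: str
    scores: Dict[str, float]   # category -> score in [0, 1]
    flagged: bool

def transcribe(audio_clip: bytes) -> str:
    """Placeholder for a real speech-to-text model."""
    return "example transcript of the voice clip"

def score_toxicity(transcript: str) -> Dict[str, float]:
    """Placeholder for a trained toxicity classifier."""
    return {category: 0.0 for category in CATEGORIES}

def moderate_clip(audio_clip: bytes, threshold: float = 0.8) -> ModerationResult:
    transcript = transcribe(audio_clip)
    scores = score_toxicity(transcript)
    flagged = any(score >= threshold for score in scores.values())
    return ModerationResult(transcript, scores, flagged)

# Usage: feed a raw clip through the pipeline and inspect whether it was flagged.
result = moderate_clip(b"raw audio bytes")
```

A production system would also have to handle streaming audio, speaker attribution, and context beyond the transcript (tone, who is speaking to whom), which is part of why the vendors describe this as an ML problem rather than simple keyword filtering.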
Activision's ToxMod: AI-Powered Voice Chat Moderation in Call of Duty: Modern Warfare III

In a significant move to combat toxicity in gaming, Activision, publisher of the Call of Duty franchise, is introducing an AI-powered voice chat moderation tool called 'ToxMod.' The solution aims to identify and address toxic speech in the highly anticipated Call of Duty: Modern Warfare III, set to launch on November 10, 2023.

ToxMod: Tackling Toxicity with AI

Revolutionizing Gaming

Activision has partnered with Modulate to develop 'ToxMod,' a global voice chat moderation tool that uses machine learning to detect various forms of in-game toxicity, including hate speech, harassment, bullying, sexism, and discriminatory language.

[Image: Activision's ToxMod]

Comprehensive Anti-Toxicity Measures

Combining Forces

ToxMod complements Call of Duty's existing anti-toxicity toolkit, which already includes text-based filtering across 14 languages for in-game chat as well as a player reporting system.

The Beta Test and Human Oversight

Testing the Waters

Activision has begun a beta test of the voice chat technology in North America, adding it to existing titles such as Call of Duty: Modern Warfare II and Call of Duty: Warzone. Recognizing the potential for false positives, especially in languages other than English, the AI-based moderation system submits reports of toxic behavior for human review rather than acting on them automatically (see the sketch after the FAQ below).

The Fight Against Toxicity

Not Exclusive to Call of Duty

Toxicity is not unique to the Call of Duty franchise, but given the series' massive player base, Activision is turning to machine learning to automate and scale its moderation.

Measuring Impact

Effective Measures

Activision reports that its previous anti-toxicity efforts have flagged text and voice chats on more than 1 million accounts, and that 20 percent of players who received a warning did not engage in toxic behavior again.

FAQs for Activision's ToxMod

Q1: How does ToxMod detect toxic speech?
A1: ToxMod uses machine learning to identify various forms of in-game toxicity, including hate speech, harassment, and discriminatory language.

Q2: Is ToxMod only available for Call of Duty: Modern Warfare III?
A2: ToxMod is rolling out in Call of Duty titles, starting with a North American beta in Modern Warfare II and Warzone ahead of the full launch in Modern Warfare III; its application may expand in the future.

Q3: How effective have Activision's anti-toxicity measures been so far?
A3: Activision's previous efforts have flagged text and voice chats on more than 1 million accounts, with 20 percent of warned players not repeating the toxic behavior.
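The flag-then-review flow described under The Beta Test and Human Oversight can be pictured with a short sketch. This is a minimal, hypothetical illustration under assumed names (FlaggedReport, ReviewQueue, submit, resolve), not Activision's or Modulate's actual implementation: automated detections are queued for a human moderator instead of triggering enforcement directly, which is how false positives get absorbed.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch (not Activision's or Modulate's actual code):
# automated flags wait for a human decision rather than triggering
# enforcement directly, which absorbs false positives, e.g. in
# languages the model handles less reliably.

@dataclass
class FlaggedReport:
    account_id: str
    category: str       # e.g. "harassment", "hate_speech"
    confidence: float   # model score in [0, 1]
    language: str

@dataclass
class ReviewQueue:
    pending: List[FlaggedReport] = field(default_factory=list)

    def submit(self, report: FlaggedReport) -> None:
        # Every automated flag is queued for review; nothing is enforced here.
        self.pending.append(report)

    def resolve(self, report: FlaggedReport, confirmed: bool) -> str:
        # A human moderator confirms or dismisses each automated flag.
        self.pending.remove(report)
        return "apply_enforcement" if confirmed else "dismiss_false_positive"

# Usage: the voice moderation model submits a flag; a moderator resolves it later.
queue = ReviewQueue()
report = FlaggedReport("player_123", "harassment", confidence=0.91, language="en")
queue.submit(report)
```

Keeping enforcement behind a human decision is the design choice the post highlights: the machine learning model only proposes, and the moderation team disposes.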