I love those scenes where the character has clearly beaten the crap out of someone but is framing it in a completely different way
Current thoughts: a substitute Human teacher doing their best to stay calm and dignified around the parents, but getting sick of the xenophobic ones who keep encouraging their kids to be dicks
A, with some suspicious spots of green on them and a bloody nose: Your father and I simply had a constructive conversation on how hate is very much an emotion, one that can lead to other destructive things like anger and violence. I gave him a firsthand look at some evidence supporting my stance, and he had to agree with my point. So of course, as your father, he wouldn't want to encourage hatred against Humans in you.
A: So I really think it best if you apologize to the other child. Run along now; your father will be with you momentarily, after he concludes his…meeting with our school nurse
#y’all aren’t getting the full picture so this character probably sounds more like mirror verse Amanda
#but they’re v different
#substitute teacher is inaccurate but I didn’t wanna get into full details in the body of the post
#they’re an academy student helping out an experimental class as coursework
#the experimental class is taking a diverse group of kids and getting them adjusted to what their lives would look like in space
#so that it’ll be an easier adjustment when their families move to space bases
#they’re specifically diverse to help eliminate any xenophobia before they’re surrounded by an even more diverse population
#anyways they’re only supposed to be doing a little bit of work for this class
#teaching for like an hour or so a couple days a week
#because they’re still a student this is just a coursework thing
#but they have some really annoying coworkers who keep trying to dump their own coursework onto them
#(not all their coworkers just some)
#so currently they’re really tired really overworked really annoyed
#and every time they make headway with the kids their parents undo all their hard work
#which led to the above situation
#annoying xenophobic Vulcan parent
#because I’m thinking about Stonn rn
#made up dialogue by yours truly
#usually it’s fun thoughts with this character
#like wrangling the kids to have a field trip at an aquarium
#or thinking of situations where one of the kids sees them outside of the classroom and now they gotta switch to ‘professional mode’
#so they still look dignified to the kid
#while their classmates are like ‘who tf are you and where did you hide our friend’
#because they’re not used to their friend trying to act almost emotionless
#(they don’t fully try to act like a Vulcan but it helps the kids not be overstimulated if they’re much calmer and stuff than they normally are)
#tumblr is being weird I hope all my tags show
#not sure how to tag this since I’m talking about a trope?
#I guess no fandom for now
ramialkarmi · 7 years
A startup claims to have finally figured out how to get rid of bias in hiring with brain games and artificial intelligence
An AI startup called Pymetrics creates neuroscience-based games that replace the first step of the corporate hiring process: resume screening.
The games surface each applicant's inherent traits, like having a good memory or an aversion to risk.
Though Pymetrics says its software can help rid hiring of bias, the technology is still new and experimental.
Think about your place of employment right now. Your family's background and your identity likely helped you get there.
You might have been lucky enough to have grown up in a good school district and gone to a university with a robust alumni network that led to job connections. You might've also had parents who could pay for a semester abroad or housing during an unpaid internship — things that look great on a resume.
These advantages give people a leg up in their careers regardless of individual work ethic or talent, which may be one reason a large body of research finds the hiring process is biased.
A tech startup called Pymetrics uses brain games and artificial intelligence in an attempt to rid the hiring process of unconscious biases, including classism, racism, sexism, and ageism. CEO Frida Polli told Business Insider that Pymetrics' algorithms do not account for the name of a candidate's school, employee referrals, gender, or ethnicity. Instead, they measure 70 inherent cognitive and emotional traits, including attention to detail, ability to focus, risk-taking, and memory.
In 2013, Pymetrics launched software that automates the first step of the recruiting process: scanning resumes. On September 20, the company announced it had raised $8 million, bringing its total funding to $17 million. 
In the fall, with a grant from The Rockefeller Foundation, Pymetrics will launch a program to match disadvantaged young adults, ages 18 to 24, with companies nationwide.
How corporations use Pymetrics
Right now, Pymetrics works with 40 to 50 companies, including big names like Unilever and Accenture. Most of its clients are large companies, because the software needs a lot of employee data to generate an accurate algorithm.
To create an algorithm, between 100 and 150 of a company's top performers play a series of neuroscience-based games. The game that measures risk aversion, for instance, gives users three minutes to collect as much "money" as possible: each click of "pump" inflates a balloon and adds 5 cents to its value, and at any point the user can click "collect money" to bank that amount. If the balloon pops first, the user receives nothing for it. New balloons are presented until the timer runs out.
A cautious user who takes a small amount of money from each balloon is neither better nor worse than an adventurous user who takes each balloon to its limit. They just receive different types of scores.
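The balloon game's mechanics can be sketched as a small simulation. The 5-cent increment comes from the article; the hidden burst point, its range, and the fixed-strategy player are illustrative assumptions.

```python
import random

def play_balloon(pumps_attempted, pop_at, cents_per_pump=5):
    """Simulate one balloon: the player pumps a fixed number of times, then banks.

    pumps_attempted: how many pumps the player tries before clicking "collect money".
    pop_at: hidden number of pumps at which the balloon bursts (an assumption;
            the article does not describe how the burst point is chosen).
    Returns cents earned (0 if the balloon popped before the player banked).
    """
    for pump in range(1, pumps_attempted + 1):
        if pump >= pop_at:
            return 0  # balloon popped: all money on this balloon is lost
    return pumps_attempted * cents_per_pump

def session(strategy_pumps, n_balloons=10, seed=0):
    """Play several balloons with a fixed pumping strategy and total the earnings."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_balloons):
        pop_at = rng.randint(1, 20)  # assumed burst range, not stated in the article
        total += play_balloon(strategy_pumps, pop_at)
    return total
```

In this sketch, a cautious player banks small, reliable amounts while an aggressive player earns more per balloon but forfeits everything whenever one bursts; the number of pumps a player typically risks is the kind of signal a risk-aversion score could be built from.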
After top performers finish all 12 games, the company then creates a custom algorithm that reveals a trait profile for the ideal candidate.
When a candidate applies for a job, they are asked to play the same series of games. Recruiters can then see a candidate's results compared with benchmarks from the company's top-performing employees.
Those who receive scores closest to the ideal trait profile move on to the next round, which is usually an interview.
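The matching step — comparing a candidate's trait scores against the benchmark profile built from top performers — can be sketched as a nearest-profile ranking. The trait names and the choice of Euclidean distance here are illustrative assumptions, not Pymetrics' actual model.

```python
import math

def trait_distance(candidate, ideal):
    """Euclidean distance between a candidate's trait scores and the ideal profile.

    Both arguments are dicts mapping trait name -> normalized score in [0, 1].
    """
    return math.sqrt(sum((candidate[t] - ideal[t]) ** 2 for t in ideal))

def shortlist(candidates, ideal, top_n=3):
    """Return the top_n candidate ids whose trait profiles sit closest to the ideal."""
    ranked = sorted(candidates, key=lambda cid: trait_distance(candidates[cid], ideal))
    return ranked[:top_n]

# Hypothetical two-trait example (the real system measures about 70 traits).
ideal_profile = {"memory": 0.8, "risk_tolerance": 0.4}
applicants = {
    "applicant_a": {"memory": 0.8, "risk_tolerance": 0.4},
    "applicant_b": {"memory": 0.1, "risk_tolerance": 0.9},
}
```

Here `shortlist(applicants, ideal_profile, top_n=1)` would surface `applicant_a`, whose scores sit closest to the benchmark; candidates near the ideal profile advance to an interview.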
"What does the resume tell a company that's really that relevant?"
Polli said the goal for Pymetrics is to replace the act of looking at resumes, not human recruiters. 
"In an entry-level role, as a freshly graduated college kid, what does the resume tell a company that's really that relevant? I was an English major, and I became a neuroscientist. There's no direct line there," she said.
She added that the software reduces the chances of ethnic and gender discrimination, at least in the first round. Research has shown that white men have an advantage in the hiring process, especially for jobs in male-dominated fields.
These kinds of industries, including tech, law, and finance, also have a diversity problem. A 2014 analysis from USA Today, for example, found that black and Hispanic college students are graduating with computer engineering and science degrees at twice the rate at which they're being hired.
Polli admits that computers are just as likely to have gender and ethnic biases as humans, since humans are the ones who program them.
"Let's take Fortune 500 CEOs. Less than 5% are women, and it's the same for ethnic representation. There are more guys named John than female [names] in this group. If you were to use that sample to predict who makes a good CEO, the name John would be really predictive," she said. "That's how bias gets introduced. Variables associated with a particular demographic group get picked up by the algorithms. And if you're not actively checking for that, you're going to perpetuate it."
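Polli's "John" example is a proxy-variable problem: a feature that merely correlates with a skewed historical sample looks predictive to a naive model. A minimal sketch, using an entirely made-up miniature dataset:

```python
def positive_rate(rows, feature, value, label):
    """Fraction of rows with rows[feature] == value for which the label is 1."""
    subset = [r for r in rows if r[feature] == value]
    return sum(r[label] for r in subset) / len(subset)

# Hypothetical past-CEO sample skewed the way Polli describes: most label-1
# rows happen to be people named John, so the name looks "predictive".
history = [
    {"is_named_john": 1, "is_ceo": 1},
    {"is_named_john": 1, "is_ceo": 1},
    {"is_named_john": 1, "is_ceo": 1},
    {"is_named_john": 1, "is_ceo": 0},
    {"is_named_john": 0, "is_ceo": 1},
    {"is_named_john": 0, "is_ceo": 0},
    {"is_named_john": 0, "is_ceo": 0},
    {"is_named_john": 0, "is_ceo": 0},
    {"is_named_john": 0, "is_ceo": 0},
    {"is_named_john": 0, "is_ceo": 0},
]

john_rate = positive_rate(history, "is_named_john", 1, "is_ceo")
other_rate = positive_rate(history, "is_named_john", 0, "is_ceo")
```

In this toy sample, `john_rate` far exceeds `other_rate`, so an algorithm trained on it would latch onto the name — which is exactly the kind of demographic proxy that has to be actively checked for.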
To limit that kind of bias, Pymetrics adjusts its algorithm for each company. The startup creates a reference group of 10,000 people that have used Pymetrics. Unlike the new applicants, the company knows the genders and ethnicities of the reference group. If the team notices, for example, that men are receiving higher scores than women on a given trait, it will de-weight that trait in the software's model.
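The de-weighting check described above can be sketched as follows. The gap threshold and the penalty factor are invented for illustration; the article does not disclose how Pymetrics actually adjusts its models.

```python
from statistics import mean

def deweight_biased_traits(scores_by_group, weights, gap_threshold=0.1, penalty=0.5):
    """Reduce the weight of any trait whose mean score differs between demographic
    groups by more than gap_threshold.

    scores_by_group: {trait: {group_name: [scores, ...]}} from a reference group
                     whose demographics are known.
    weights: {trait: weight} used by the matching model.
    Returns a new weights dict; traits with a large between-group gap are penalized.
    """
    adjusted = dict(weights)
    for trait, groups in scores_by_group.items():
        group_means = [mean(scores) for scores in groups.values()]
        if max(group_means) - min(group_means) > gap_threshold:
            adjusted[trait] = weights[trait] * penalty  # down-weight the skewed trait
    return adjusted

# Hypothetical reference-group data: men outscore women on one trait.
reference_scores = {
    "risk_tolerance": {"men": [0.9, 0.8], "women": [0.4, 0.5]},
    "memory": {"men": [0.7, 0.7], "women": [0.7, 0.7]},
}
trait_weights = {"risk_tolerance": 1.0, "memory": 1.0}
adjusted_weights = deweight_biased_traits(reference_scores, trait_weights)
```

Running this on the sample data halves the weight of `risk_tolerance` while leaving `memory` untouched, mirroring the article's description of de-weighting a trait on which one group systematically outscores another.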
When Unilever began a hiring overhaul last year, it used Pymetrics and HireVue (which uses facial analysis to evaluate recorded interview answers) for 250,000 applicants. Unilever told Business Insider it hired its "most diverse class to date" in North America from July 2016 to June 2017. The company said there was a "significant" increase in non-white hires, though it wouldn't disclose specific statistics. Unilever also hired a nearly equal number of men and women.
Does it work?
As others have noted, there are dangers in relying too much on data analytics in hiring. Cathy O'Neil, a mathematician, wrote an entire book on the subject, "Weapons of Math Destruction." If a company's top-performing employees are mostly white, male, and young, she wrote, basing an algorithm on their profiles will likely bias it toward candidates who resemble those employees.
Polli said that's why it's important to continually correct the algorithms, which are designed by humans with their own biases, to keep that from happening.
As Mic notes, this kind of technology is still new and experimental. It's also only used at the first stage of the recruiting process. Even if a candidate makes it to an interview, a recruiter's unconscious bias still could affect their chances of getting the job.
Several other startups in the HR space, like HireVue, Mya Systems, and Talent Sonar, have objectives similar to Pymetrics', relying on everything from games to chatbots to facial analysis.
Polli is optimistic that this technology could give less-privileged job candidates a more equal shot.
"Economics [are] a huge barrier to getting a good job, because you don't have the right school or the right internship. That shouldn't get in the way," she said. "We're trying to bring back the American Dream, in that everyone should have the opportunity to good jobs. It doesn't matter what your race or gender or socioeconomic background. We think that all those factors should become irrelevant."