
AI is everywhere, and now it's helping parents decide whether a potential babysitter is the right fit for their toddler. Predictim is an online service that uses AI to estimate the risk attached to hiring a given babysitter. By scanning a sitter's social media profiles with language-processing algorithms, it produces an overall risk score along with a detailed breakdown of the sitter's online behavior.

Predictim's algorithms analyze "billions" of data points reaching back years in a person's online profile. Within minutes, it delivers an evaluation of a babysitter's predicted traits, behaviors, and areas of compatibility based on their digital history. It uses language-processing algorithms and computer vision to assess babysitters' Facebook, Twitter, and Instagram posts for clues about their offline life.

Predictim rates babysitters across four risk categories: bullying/harassment, bad attitude, explicit content, and drug abuse. This is what makes the service appealing to parents, as a standard background check cannot surface these kinds of details about a potential babysitter.
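Predictim has not published how its scoring actually works. Purely to illustrate the idea of mapping social media posts onto per-category risk flags and an overall score, here is a deliberately naive keyword-based sketch; every keyword list, function name, and the 1–5 scoring scheme below is invented for illustration, and the real service reportedly uses language-processing models rather than keyword matching:

```python
# Toy illustration only — NOT Predictim's actual method.
# Maps a list of posts to counts per risk category and a crude 1-5 score.

RISK_KEYWORDS = {
    "bullying/harassment": {"loser", "pathetic", "shut up"},
    "bad attitude": {"hate my job", "can't stand"},
    "explicit content": {"nsfw"},
    "drug abuse": {"weed", "high af"},
}

def score_posts(posts):
    """Count posts flagged per category and derive an overall 1-5 risk score."""
    counts = {category: 0 for category in RISK_KEYWORDS}
    for post in posts:
        text = post.lower()
        for category, keywords in RISK_KEYWORDS.items():
            if any(keyword in text for keyword in keywords):
                counts[category] += 1
    total_flags = sum(counts.values())
    overall = min(5, 1 + total_flags)  # invented mapping: each flag raises the score
    return counts, overall

posts = ["Great day at the park with the kids!", "Ugh, I hate my job sometimes"]
counts, overall = score_posts(posts)
print(counts["bad attitude"], overall)  # prints: 1 2
```

Even this toy version shows why critics worry: a sarcastic joke or a song lyric containing a flagged phrase would raise the score just as much as genuine misconduct.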

"The current background checks parents generally use don't uncover everything that is available about a person. Interviews can't give a complete picture. A seemingly competent and loving caregiver with a 'clean' background could still be abusive, aggressive, a bully, or worse. That's where Predictim's solution comes in," said Sal Parsa, co-founder of Predictim.

Criticism towards Predictim  

Now, although such services are radically transforming how companies approach hiring and reviewing workers, they also pose significant risks. As Washington Post reporter Drew Harwell writes, Predictim relies on black-box algorithms that are not only prone to biases about how an ideal babysitter should behave, look, or share online, but whose personality-scan results are also not always accurate. The software can misread a person's personality based on how they use social media.

One example Harwell presents is a babysitter who was flagged for possible bullying behavior. The mother who had hired her said she could not tell whether the software had based that assessment on an old movie quote or song lyric, or whether it had actually found instances of bullying language. The service gives parents no phrases, links, or other details to substantiate its judgment against a babysitter. Harwell also points out that hiring and recruiting algorithms have been "shown to hide the kinds of subtle biases that could derail a person's career." He cites Amazon, which scrapped its AI recruiting tool last month after it was found to unfairly penalize female candidates.

Kate Crawford, co-founder of the AI Now Institute, tweeted out against Predictim, calling it a "bollocks AI system."

But the Predictim team is set on expanding its capabilities. It is preparing for nationwide expansion: Sittercity, a popular online babysitter marketplace, plans to launch a pilot program next year that adds Predictim's automated ratings to the site's sitter screenings and background checks. The team is also looking into extracting psychometric data from babysitters' social media profiles to dig even deeper into their private lives. This has raised many privacy concerns on behalf of babysitters, since it could indirectly force a sitter to hand a parent personal details of her life that she might not otherwise be comfortable sharing, just to get a job.

However, some people think differently and are more than comfortable asking babysitters for their personal data. One example Harwell gives is a mother of two who believes that "babysitters should be willing to share their personal information to help with parents' peace of mind. A background check is nice, but Predictim goes into depth, really dissecting a person — their social and mental status. 100 percent of the parents are going to want to use this. We all want the perfect babysitter."

Now, despite parents wanting the "perfect babysitter," the truth of the matter is that Predictim's AI algorithms are not "perfect" themselves: they need to become far more accurate so that they do not project unfair biases onto babysitters. Predictim needs to ensure that its service works not just for the benefit of parents but also takes the needs of babysitters into account.
