Prison is the New Siri and Alexa

CBx Vibe: “Watching Me” Mike WiLL Made-It feat. Rae Sremmurd & Kodak Black

By Justin Moore

  • In 2016, privately-held prison technology firms are estimated to have made $1.2B from phone systems alone
  • The artificial intelligence used to store and analyze voices is subject to racial bias

Prisons across the United States, in an attempt to bolster security and crack down on fraud, are quietly compiling the “voice prints” of prisoners. Voice print technology identifies a person based on his or her unique vocal pattern. Once a voice print is collected, it can be used to identify that person in future calls and in previously recorded ones. Prisoner rights and privacy advocates have criticized many such programs for enrolling prisoners without their consent, or under the threat of severely reduced phone privileges if consent is not provided.
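
To make the mechanism concrete, here is a minimal, purely illustrative Python sketch of how voice-print matching works in principle: a recording is reduced to a fixed-length numeric “print” at enrollment, and later calls are scored against it. The band-energy features, function names, and threshold below are assumptions for illustration only; the systems deployed in prisons rely on learned speaker embeddings and far more sophisticated scoring.

```python
import numpy as np

def voice_print(samples: np.ndarray, n_bands: int = 16) -> np.ndarray:
    """Reduce an audio signal to a fixed-length spectral 'print'.

    Toy feature: average energy in coarse frequency bands, unit-normalized.
    Real speaker-recognition systems use learned embeddings instead.
    """
    spectrum = np.abs(np.fft.rfft(samples))               # magnitude spectrum
    bands = np.array_split(spectrum, n_bands)              # coarse frequency bands
    energies = np.array([band.mean() for band in bands])   # one value per band
    return energies / np.linalg.norm(energies)             # unit-length vector

def same_speaker(enrolled: np.ndarray, candidate: np.ndarray,
                 threshold: float = 0.95) -> bool:
    """Score a call against an enrolled print by cosine similarity."""
    return float(np.dot(enrolled, candidate)) >= threshold

# Hypothetical usage: enroll a print once, then compare subsequent calls to it.
rng = np.random.default_rng(0)
enrolled_print = voice_print(rng.normal(size=8000))   # stand-in for enrollment audio
later_call = voice_print(rng.normal(size=8000))       # stand-in for a later call
print(same_speaker(enrolled_print, later_call))
```

The same comparison can also be run against archived recordings, which is what allows a system to retroactively identify who was on a line once a print has been collected.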

Why This Matters: Prison in the U.S. is big business, and in 2016 privately-held prison technology firms are estimated to have made $1.2 billion from phone systems alone. In an effort to expand the bottom line, some prison tech firms have pushed to replace in-prison family visitation rooms with “video-visitation” terminals, at visitors’ expense. In addition to concerns about involuntarily enrolling inmates, facilities in New York and Arizona have confirmed their voice recognition systems can identify and catalogue the voices of outside callers.

Even if a person has never committed a crime, they could have their voice print stored without their consent. Because the field is largely unregulated, a common theme when the law lags behind technology, this data could be shared with other government agencies, such as law enforcement, and even with third-party companies.

Situational Awareness: It has been well documented that the kind of artificial intelligence used to record voices, create voice prints and subsequently recognize them is subject to racial bias. Studies have shown, for example, that facial recognition software can be less reliable at recognizing non-white faces because it was trained primarily on images of white people.

CBx Vibe: “Watching Me” Mike WiLL Made-It feat. Rae Sremmurd & Kodak Black

Welcome to CultureBanx, where we bring you fresh business news curated for hip hop culture!