Automated background checks are deciding who’s fit for a home

But advocates say algorithms can’t capture the complexity of criminal records

Mikhail Arroyo had made it out of the coma, but he was still frail when his mother, Carmen, tried to move him in with her. The months had been taxing: Mikhail was severely injured in a devastating fall in 2015. He had spent time in the hospital, and by 2016 was in a nursing home where his mother visited him daily, waiting until they could live together again. Carmen planned to move him to a new apartment with her in the Connecticut residential complex where she was staying. The only task left was the paperwork.

At a meeting, she says, the management company broke the news. Mikhail wouldn’t be allowed into the apartment. Carmen was shocked.

“On what basis?” she asked.

The woman she spoke to couldn’t tell her. Although Carmen couldn’t have known all of the details, Mikhail had been flagged by a criminal screening tool run by a company called CoreLogic. She says she was only shown a single sheet of paper — one she couldn’t take with her, and that scarcely helped — and given a CoreLogic phone number to call.

She got nowhere. She waited on hold, was transferred back and forth. Carmen was Mikhail’s designated conservator, giving her legal decision-making power, but the company needed documentation for her and Mikhail — a passport, a driver’s license — and she wasn’t getting through. Carmen was “livid.” At one point, she says, she was told to get in touch with the legal department, but they weren’t available. Was this over a credit check? And didn’t the circumstances count for anything? She started to worry, wondering whether the number was somehow fraudulent and she’d handed over all of her personal information.

The process dragged on for months. “They just kept giving me the runaround,” Carmen says. Meanwhile, she looked for another place to live, and found one spot, even if it was in a worse neighborhood. But it had stairs, and Mikhail couldn’t walk at the time — so he wouldn’t be able to live there. He wasn’t able to fully speak yet, either, but when it was time to bring him back to the nursing home, he’d point to tell her he wanted to go back to the apartment. Some days, he’d cry. “It was heartbreaking for him,” she says, “because he wanted to go home.”

As landlords decide who to rent to, CoreLogic offers an array of screening tools. They might use a product called ScorePLUS, which the company describes as a “statistical lease screening model” that calculates “a single score” to determine the potential risk of someone signing a lease. A landlord might also turn to a product called CrimCHECK, which conducts a database search for criminal records. The breadth of records the company advertises is impressive: an arrest records database of more than 80 million booking and incarceration records from approximately 2,000 facilities, updated every 15 minutes. CrimSAFE, the system that flagged Mikhail, is described by the company as an “automated tool [that] processes and interprets criminal records and notifies leasing staff when criminal records are found that do not meet the criteria you establish for your community.”

The ability to outsource decisions is a key pitch to prospective screeners. “Whatever decision or information service you use, you’ll find the same simple data entry process, rapid turnaround and clear concise results that eliminate the need for judgment calls by your leasing professionals,” the company tells visitors to its website. Should there be a problem, and a landlord must send an “adverse action” letter, the company advertises an automated system for that as well.

But for some housing advocates, the rise of automation and elimination of human “judgment calls” is increasingly the problem, not the solution. When screening tools bypass more human forms of decision-making, they say, those decisions are more likely to collapse complex matters into simple, algorithmically generated pass-fail mechanisms — leaving behind people looking for a home.

Eric Dunn, the director of litigation at the National Housing Law Project, has seen how the tools used by landlords have evolved in recent years. Over time, he’s watched more people move away from what he calls “old guard screening,” where research was done by humans and landlords were provided extensive documentation. Instead, systems are more likely now to map “identifiers,” selecting options like certain types of crime, against massive databases of obscure origin.

Fair housing proponents like Dunn say nuance is lost when landlords rely on automated screening tools that turn a personal history — one person’s story, which a landlord might have to grapple with, weighing actual risk — into a list of variables to be machine-verified. If there are extenuating circumstances around a record, they can’t always be captured in the rigid framework of the system, advocates argue. No matter how expedient it might be for landlords and background check companies to use those tools, they say, it will lead to blocking people who otherwise would have been accepted by a more personalized form of screening.
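To make that criticism concrete, here is a minimal sketch of the kind of criteria-based pass-fail screen advocates describe. It is purely illustrative: the field names, offense categories, lookback window, and thresholds are hypothetical, not CoreLogic’s actual parameters or logic.

```python
# Illustrative sketch of a criteria-based "pass/fail" record screen.
# All fields, categories, and the lookback window are hypothetical.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class RecordHit:
    category: str      # e.g. "theft", "property damage"
    disposition: str   # e.g. "convicted", "withdrawn", "pending"
    offense_date: date

def screen(hits, blocked_categories, lookback_years=7, as_of=None):
    """Return 'FAIL' if any hit matches the landlord-chosen criteria."""
    as_of = as_of or date.today()
    cutoff = as_of - timedelta(days=365 * lookback_years)
    for hit in hits:
        if hit.category in blocked_categories and hit.offense_date >= cutoff:
            # Severity, outcome, and personal context are never weighed here.
            return "FAIL"
    return "PASS"

# A minor citation that was later withdrawn and a serious conviction
# collapse into the same one-word result.
print(screen([RecordHit("theft", "withdrawn", date(2014, 6, 1))],
             blocked_categories={"theft"},
             as_of=date(2016, 1, 1)))   # -> FAIL
```

The point of the sketch is the shape of the output: whatever the underlying record actually says, the leasing office sees a single pass or fail.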

The companies that offer these tools frame them as recommendations for landlords, which they can override. But Dunn questions that line of thinking. If the machine calculates a failing decision, he argues, there’s little other basis for a landlord to come to a different conclusion, especially if the landlord isn’t provided the complete history.

Federal law prohibits landlords from selecting their tenants based on protected characteristics — an applicant’s race, sex, or religion can’t be used to determine whether they’re offered a place to live. But criminal records are more complex. If a record involves property damage, for example, a landlord might be within their legal right to decline an application, on the basis that it indicates a potential problem in the future.

In the past few years, some of the boundaries of those protections have been extended. The Supreme Court, in a major 5–4 decision in 2015, ruled that the law extends to decisions that disproportionately affect certain groups, even when the effect is indirect. If a policy affects a black neighborhood more than a nearby white one, for example, the policy could be unlawful. The legal theory is known as the “disparate impact” standard.

Noting the disproportionate effect of criminal records on minority groups, the Department of Housing and Urban Development issued new guidance for real estate transactions in 2016. Under the guidelines, a landlord, or someone else determining whether to offer someone a home, might be able to use a criminal record to make a decision about a tenant. But that decision, according to the guidance, requires a close look at individual cases. If the screening policy fails to consider the severity or relevance of the record, or how long ago the incident happened, it likely wouldn’t pass HUD’s test. The Arroyos are currently involved in a suit with CoreLogic over these protections.

Monica Welby, deputy director of litigation at the Legal Action Center, says commercial checks are also “notoriously” inaccurate. Dunn sees similar issues. “I’ve looked at more criminal records reports than I could count, and I would say that well over half the ones I’ve looked at had some kind of inaccuracy,” he says.

A record might, for example, include information from someone with a similar name, leading to a denial. “This happens all the time,” Dunn says. A similarly named relative — or maybe someone completely unrelated, who happens to share a name with a rental applicant — can derail a tenant’s application. The problems, Dunn says, might often have been caught by a more individualized screening system.
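The name-confusion failure Dunn describes is easy to reproduce in miniature. The sketch below uses an invented applicant, an invented records entry, and a crude string-similarity cutoff; none of it reflects how CoreLogic or any particular vendor actually matches names, which the companies do not disclose.

```python
# Illustrative only: loose name matching against a records database
# can attach a stranger's record to an applicant.
from difflib import SequenceMatcher

def similar(a, b, threshold=0.9):
    """Crude fuzzy match on lowercased names (invented stand-in logic)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Invented database entry and applicant who merely share a similar name.
records = [{"name": "John A. Smith", "offense": "felony conviction"}]
applicant = "Jon A. Smith"   # a different person entirely

flagged = [r for r in records if similar(applicant, r["name"])]
print(flagged)   # the applicant inherits the stranger's record
```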

CoreLogic has faced lawsuits over such errors. Under the Fair Credit Reporting Act, companies are required to take reasonable steps to ensure the accuracy of the information they report, but some have questioned the depth of that commitment. In 2015, a South Carolina man said in a lawsuit that he and his wife were seeking a new place to stay after flooding damaged their home. When he applied for a new apartment, though, he was flagged by a CoreLogic tool as a registered sex offender — apparently due to someone with a similar name. In court documents, the man said he was eventually able to reach someone to correct the discrepancy, but the process for removing the information would take two weeks, and the apartment slipped away in the meantime.

A dispute can be hard to fight, as Carmen discovered. She eventually found legal representation from a local nonprofit, the Connecticut Fair Housing Center. The group filed an administrative complaint with the management company.

The CoreLogic system that flagged Mikhail, according to court documents, allows landlords to select certain options about criminal history to screen against. This means the decision largely remains in the landlord’s hands, the company argues, since the landlord chooses the parameters. CoreLogic has said its system only checks what it is told to check, and that it complies with housing law. (CoreLogic declined to answer questions about its screening process, citing pending legal disputes, several of which it has faced in federal court. The company would not provide more information on precisely what landlords can screen for, or how it ensures accuracy in its results.)

If the company flags your application, and you believe it’s relying on inaccurate information, CoreLogic offers a helpline to call. The company says it will conduct a reinvestigation that will be completed within 30 days, and if any errors are found, will fix the issues. Still, some argue that even if the errors are corrected, whatever home an applicant has applied for will likely be gone after a month.

The Connecticut Fair Housing Center tried to get the management company to overlook the background check. Whatever caused Mikhail to be flagged, they argued, it was clearly moot. If a criminal screening is predicated on the theory that it could predict future behavior, Mikhail was hardly likely to commit a crime in the future — he was disabled now, reliant on others for help. There was no basis to think he was somehow a danger to people or property.

The argument, according to Salmun Kazerounian, a staff attorney at the center, didn’t sway the management. “They responded, essentially, ‘how can we agree to overlook a criminal record if we don’t know what it is?’” he says.

CoreLogic’s documentation was a sparse source of clues. The company provided a “result” that said there was a “disqualifying record,” but not enough to deduce what the problem was. The report generated a “jurisdiction” entry that was seemingly nonsensical: “000000033501.PA.”

At first, they had no idea how to find out what the record could mean. It took more digging to determine the circumstances, but eventually the story could be pieced together. Before his accident, Mikhail faced a retail theft charge in Pennsylvania. The charge, according to the center, was for a “summary offense” — a charge below a misdemeanor that’s also called a “non-traffic citation.” The level of the charge suggested the incident involved less than $150 and was Mikhail’s first offense. He was 20 years old at the time. “It was as minor as they come,” Kazerounian says. Last year, the charge was withdrawn. (Mikhail was also arrested following a burglary in 2013, according to statements from local authorities; Kazerounian stressed that, regardless, the Pennsylvania charge was the only item on his record.)

While it’s hard to determine the exact rate of disputes like the Arroyos’, experts say there are broader issues around accuracy in background screening. “Disputing information with consumer reporting agencies can be extremely challenging for individuals,” Welby says.

In one recently settled case involving CoreLogic, a man named Abdullah James George Wilson was sent to prison after a 1992 robbery, but years later, after his counsel was found to be ineffective, Wilson was granted a favorable ruling on appeal. His record was sealed.

But in 2014, Wilson, looking for a place to live, found that his application was rejected anyway. The problem: a CoreLogic system flagged the sealed New York correctional record. Wilson was barred from the apartment.

“In this age of technology and widespread use of criminal background checks, it is more important than ever that background check companies get it right,” Wilson, who reached a settlement in a lawsuit, said in a statement to The Verge provided by the Legal Action Center. “They must take the proper steps to ensure that the criminal record information they report is accurate. The stakes are high for people — it can be the difference between having a place to call home or not.”

Mikhail was finally allowed to move in with Carmen in June 2017, after the charge was withdrawn. Had the screening been done effectively, the Connecticut Fair Housing Center alleges in a lawsuit against CoreLogic, they could have been reunited a year earlier, saving the Arroyos time, money, and emotional energy. “I’m bringing it because I think it was wrong, what they’ve done,” Carmen says.

The suit is ongoing. It argues that the screening tool disproportionately affects black and Latinx tenants, and fails to properly take into account mitigating circumstances of a criminal record, allegedly a violation of the Fair Housing Act as outlined in HUD’s 2016 guidance. The complaint seeks damages for the Arroyos, to be determined later, and asks for a judgment that would require CoreLogic to take steps to prevent situations like the Arroyos’ in the future.

The company has responded that, as a background check business, it is not subject to the Act; only the people using its tools are, it argues. The HUD rules, it says, support that claim.

When Mikhail got home, about two years after the accident, Carmen says she was “a bag of great emotions.” He remains in a wheelchair, and has physical therapy twice a week, but can say phrases like, “Hi, mom.” He cried when he got home.

When he made it back, he had moments of concern about whether he’d have to leave again. “I would always reassure him,” Carmen says, “no, you’re home, this is it.”

Written by Colin Lecher
This article first appeared on https://www.theverge.com/2019/2/1/18205174/automation-background-check-criminal-records-corelogic under the title “Automated background checks are deciding who’s fit for a home.” Bolchha Nepal is not responsible for, and is not affiliated with, the opinions expressed in this article.