The People vs Artificial Intelligence

In the fertile Arkansas Delta lies a small town called Cash. Its name is an Anglicized version of a nearby river, Cache, which in French means “hidden.” Cache is also a computing term: a hidden store of data kept close at hand.

It’s therefore ironic yet appropriate that one of Cash’s aldermen, Bradley Ledgerwood, helped discover a hidden artificial intelligence (AI) program that has devastated people living with physical disabilities in the state of Arkansas.

You might be asking, why does this matter to me?

I’ll let Dorothy, a surveillance technology test subject, answer this: “You should pay attention to what happens to us. You’re next.”

“The Computer Made Me Do It”

The afternoon sun peeks over the roof of a seed company warehouse in Cash, Arkansas, where the Ledgerwood family lives.

Brad Ledgerwood, who has cerebral palsy, encountered a drastic drop in his hours of home health care in early 2016. He went from 56 hours of care per week to 32. Brad’s mother, Ann, had resigned from a well-paying job to take care of him and relied on the eight hours a day of relief.

I met Brad and Ann one cloudy afternoon at their modest brick home in Cash, where Brad serves on the city council. David, Brad’s father, was at work, supporting the family. When you’re doing this job, you never know what you’re going to walk into, but the Ledgerwoods are delightful.

Ann has dedicated her life to Brad. Every day she helps him perform the tasks he can’t, such as cooking, bathing, going to the restroom, pulling up his covers when he gets cold, and so many things you wouldn’t think of unless you lived her life. With the help of Ann, Brad is able to be a force of nature in Arkansas politics. As well as serving as an alderman, he participates in several online and local political groups. On Tuesday, he and Ann are going to be poll workers.

I asked Brad what he did when he found out his hours had been reduced by Arkansas’ Department of Human Services (DHS).

He told me he had asked the nurses why his hours were cut, and “they said the computer was tabulating how many hours you’re going to get.”

Ann didn’t suspect there was anything amiss going on at first. “I thought, surely this is a mistake,” she said.

Brad is in a Medicaid program overseen by DHS called ARChoices in Homecare. This taxpayer-funded Medicaid program provides home and community-based services for adults with disabilities and people over 65. Home health care helps people like Brad stay out of nursing facilities, which are much more expensive for the state and would not provide him the quality of life that staying at home has. For years, healthcare decisions had been made by nurses, not by computers.

Brad decided to call Kevin De Liban, an attorney with Legal Aid of Arkansas. Brad was not the only one to call Legal Aid about the cut in hours. In fact, of the 11,000 people enrolled in ARChoices, about 4,000 had their hours cut.

De Liban filed a federal lawsuit against DHS on behalf of Brad and another plaintiff, arguing that the reduction of services violated two acts and the Constitution. De Liban told me that during the lawsuit's discovery process they learned the "computer" was in fact an algorithm, and Legal Aid obtained a copy of it.

The problem all along was that DHS had secretly adopted the AI without being able to explain how it worked. Basically, nurses would visit clients' homes and ask them roughly 280 assessment questions. The answers were entered into a computer, and the algorithm sorted clients into tiers of allotted care, called resource utilization groups (RUGs). The code, written by Brant Fries, PhD, failed to account for a diagnosis of cerebral palsy for nearly two years.
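The actual RUGs code is not public, so as a rough illustration only, here is a hypothetical sketch (the names, weights, and thresholds are all invented) of how a tier-based allotment like this can silently slash someone's hours when a diagnosis is missing from the inputs:

```python
# Hypothetical sketch of tier-based care allocation -- NOT the actual
# RUGs algorithm. All question names, weights, and cutoffs are invented.

def assign_tier(answers: dict) -> str:
    """Sum weighted assessment answers and map the total to a tier."""
    score = sum(answers.values())
    if score >= 40:
        return "high"
    if score >= 20:
        return "medium"
    return "low"

# Weekly home-care hours allotted per tier (invented mapping).
WEEKLY_HOURS = {"high": 56, "medium": 32, "low": 16}

# If a diagnosis (say, cerebral palsy) is simply absent from the
# inputs, the score drops and so do the hours -- with no explanation.
with_diagnosis = {"mobility": 15, "self_care": 15, "cerebral_palsy": 12}
without_diagnosis = {"mobility": 15, "self_care": 15}

print(WEEKLY_HOURS[assign_tier(with_diagnosis)])     # 56
print(WEEKLY_HOURS[assign_tier(without_diagnosis)])  # 32
```

Nothing in a system like this explains itself: the client only ever sees the final number of hours.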

This tangle of healthcare technocracy has been in litigation for over two years. DHS loses, DHS appeals, and on and on.

In the meantime, Brad has been busy advocating for himself. “I have every senator’s phone number,” he told me. He also has the number of the governor of Arkansas, Asa Hutchinson. Brad has called the governor, but the governor has not spoken to him about his cut in home healthcare hours.

Asa Hutchinson

Hutchinson just happens to be one of the characters in my ongoing research into the Mena Airport scandal. That’s the one where CIA & company were trading arms for drugs, and they invaded a small town in Arkansas to use as a clandestine base. Asa was a young US attorney at the time, and I’m not sure how involved he was with the money laundering part of it, but that is another story.

Asa’s running for governor again, so it wasn’t very hard to find him. I met him at a political club luncheon. After his stump speech, during which he said he told Vice President Pence to end the trade war tariffs with China (a lot of talk about China amongst the Political Animals), I was the first one to throw up my hand for a question.

“From what I understand,” I said, “state-funded care for the elderly and people with disabilities has been cut drastically for some due to the use of artificial intelligence. What is your vision for AI in the state of Arkansas in the future?”

The reaction from the room was remarkable. Asa froze. The room went dead quiet. Somebody behind me told me to turn my camera off. “Okay,” I said.

Only then would the governor speak. “What do you mean by AI?”

And that, dear readers, is the moment I realized I had spoken the phrase which should not be spoken. We like to think of artificial intelligence as being sci-fi, but the truth is we live with it every day. I searched my mind for a word he would accept.


“Oh, yes.” He then regained his politician flair and told the room about how there was a need for independent assessment, and the government has too many service departments. He also said that his administration had helped shorten the waiting list for ARChoices. In an odd way, this is true.

Because although Kevin De Liban, Brad, and the rest of the plaintiffs won their lawsuit on the merits, helping possibly thousands within the program, DHS ended up with a backlog of applicants for the ARChoices program. From the outside, it appears to me that DHS figuratively held some needy Arkansans hostage to keep its AI.

The judge in the case, Wendell Griffen, has ruled favorably toward the plaintiffs, at one point even holding DHS in contempt of court for not addressing the AI problem. But the process wrapped up October 1st: DHS was allowed to continue using the RUGs, and people were allowed to enroll in the program again. So yes, the ARChoices waiting list was shortened.

The next thing Asa Hutchinson said probably made my jaw drop, but I was too busy taking notes to notice.

“We need to keep people from taking advantage of the system.”

I immediately thought of the Ledgerwoods, fighting to survive financially, and all the other people who have suffered because of this AI.

“The human impact is devastating,” De Liban told me. “Some of them have had to lie in their own waste, to go without food, to endure pressure sores. People have suffered as a result of the state’s use of the RUGs algorithm.”

Hutchinson didn’t answer my question about AI.

Artificial Intelligence

Don’t get me wrong — AI can be helpful in many ways. Technology aids people like Brad and Ann. When Brad was in school, Ann had to read everything out loud to him. Imagine the number of hours. Now Siri does it.

However, the RUGs algorithm is a problem, and it is artificial intelligence, no matter what anybody says. According to Oxford Reference, artificial intelligence is “the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as … decision-making.” When a computer program makes decisions about your level of health care, that is artificial intelligence by definition.

This is an entirely different picture than you get when you see videos like Pepper the robot addressing the UK Parliament in a childlike voice, saying its design is “aimed at assisting and caring for the elderly!”

Imagine Pepper babytalking to you: “Septicemia is not part of my programming!”

And then rolling out the door of your nursing home room, never to be seen again.

A computer does not have empathy. It does what it’s programmed to do. It’s like a high-functioning sociopath.

Michael Morton

Remember when I said that home health care saves the state a whole lot of money? DHS claims it pays on average $18,000 for people with home health care compared to $50,000 for those in nursing homes. Why would they cut back on a program that saves them money when the consequences could be so severe, not only to patients, but to taxpayers?
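Taking DHS's own stated averages at face value, the per-person arithmetic is simple:

```python
# DHS's own claimed averages (annual cost per person).
home_care_cost = 18_000     # home health care through ARChoices
nursing_home_cost = 50_000  # nursing facility

# Extra annual cost to the state for each person who loses enough
# home care to end up in a nursing facility instead.
extra_cost = nursing_home_cost - home_care_cost
print(extra_cost)  # 32000
```

By the agency's own numbers, every person pushed out of home care and into a facility costs taxpayers roughly $32,000 more per year.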

Well, at the same time the algorithm scandal is going on in Arkansas, so is a nursing home scandal.

Nursing home magnate Michael Morton bribed a circuit judge, Michael Maggio, to reduce the judgment in a negligence suit against one of Morton’s facilities from $5.2 million to $1 million. Maggio is now in prison, but Morton is still in business.

For this 2018 election cycle, Morton donated to Asa Hutchinson's campaign, but Hutchinson turned the donation down. However, I can't find anything showing that Morton's business partner, David Norsworthy, had his donation to Asa's campaign turned down. Norsworthy bribed Senator Jake Files, who's in prison now, too; Norsworthy isn't.

Seems like there’s some funny business going on in the healthcare system in Arkansas. Surprise, surprise. And I hate to say it, but I’m not shocked they are picking on some of the most vulnerable members of society first.

I asked Kevin De Liban why he thought they were using the AI. He said, “The algorithms seem to offer so-called objective cover for what are really budget cuts.”

That seems pretty obvious to me. But how deep does this scandal go? I’ll be investigating, so stay tuned.

Weaponized AI

In my research, I’ve come across many instances of AI being used against the vulnerable. It’s almost like government bureaucrats look at these people as easy targets. It’s hard to fight back when you’re dealing with a serious health issue and financially strapped.

Are they testing out the legal and social obstacles they'll face as they deny help to the needy?

And it’s not just healthcare or food stamps. AI is everywhere now. From your social media accounts to your plane flights, artificial intelligence is there, looming secretly in the shadows.

Google, that corporation we all know and trust, has pinky promised that it will not turn into Skynet from the Terminator movies. Let’s hope they’re telling the truth now or soon, but I’m not holding my breath because AI might not recognize hypoxia.

Let’s face it: most people just let this happen without even speaking out about it. They’re too busy watching the Puppy Bowl.

But there is a family in Cash who’s speaking up for us. One of them is in a wheelchair and can barely move, but he is doing more than anybody else I know. Good on him.

I asked Brad how he felt about the court case.

He said, “Overall, I feel like we’ve won battles, but we haven’t won the war.”

Fight on, Ledgerwoods and Legal Aid, fight on.