The People vs Artificial Intelligence

In the fertile Arkansas Delta lies a small town called Cash. Its name is an Anglicized version of a nearby river, the Cache, whose name in French means “hidden.” A cache is also a computing term: a hidden store of data.

It’s therefore ironic yet appropriate that one of Cash’s aldermen, Bradley Ledgerwood, helped discover a hidden artificial intelligence (AI) program that has devastated people living with physical disabilities in the state of Arkansas.

You might be asking, why does this matter to me?

I’ll let Dorothy, a surveillance technology test subject, answer this: “You should pay attention to what happens to us. You’re next.”

“The Computer Made Me Do It”

The afternoon sun peeks over the roof of a seed company warehouse in Cash, Arkansas, where the Ledgerwood family lives.

Brad Ledgerwood, who has cerebral palsy, saw a drastic cut to his home health care hours in early 2016. He went from 56 hours of care per week to 32. Brad’s mother, Ann, had resigned from a well-paying job to take care of him and relied on the eight hours a day of relief.

I met Brad and Ann one cloudy afternoon at their modest brick home in Cash, where Brad serves on the city council. David, Brad’s father, was at work, supporting the family. When you’re doing this job, you never know what you’re going to walk into, but the Ledgerwoods are delightful.

Ann has dedicated her life to Brad. Every day she helps him perform the tasks he can’t, such as cooking, bathing, going to the restroom, pulling up his covers when he gets cold, and so many things you wouldn’t think of unless you lived her life. With the help of Ann, Brad is able to be a force of nature in Arkansas politics. As well as serving as an alderman, he participates in several online and local political groups. On Tuesday, he and Ann are going to be poll workers.

I asked Brad what he did when he found out his hours had been reduced by Arkansas’ Department of Human Services (DHS).

He told me he had asked the nurses why his hours were cut, and “they said the computer was tabulating how many hours you’re going to get.”

Ann didn’t suspect there was anything amiss going on at first. “I thought, surely this is a mistake,” she said.

Brad is in a Medicaid program overseen by DHS called ARChoices in Homecare. This taxpayer-funded Medicaid program provides home and community-based services for adults with disabilities and people over 65. Home health care helps people like Brad stay out of nursing facilities, which are much more expensive for the state and would not provide him the quality of life that staying at home has. For years, healthcare decisions had been made by nurses, not by computers.

Brad decided to call Kevin De Liban, an attorney with Legal Aid of Arkansas. Brad was not the only one to call Legal Aid about the cut in hours. In fact, of the 11,000 people enrolled in ARChoices, about 4,000 had their hours cut.

De Liban filed a federal lawsuit against DHS on behalf of Brad and another plaintiff, arguing that the reduction of services violated two federal acts and the Constitution. De Liban told me that during the lawsuit’s discovery process they had learned the “computer” was in fact an algorithm, and Legal Aid obtained a copy of it.

The problem all along had been that DHS secretly adopted the AI without being able to explain how it worked. Basically, nurses would come out to clients’ homes and ask them around 280 questions. This data would be entered into a computer, and the algorithm would sort clients into tiers of allotted care, called resource utilization groups (RUGs). The code, written by Brant Fries, PhD, didn’t take the diagnosis of cerebral palsy into account for nearly two years.
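The shape of that pipeline is simple: assessment answers are collapsed into a score, the score maps to a RUG tier, and the tier caps the weekly hours. Here is a minimal sketch in Python. To be clear, the question names, weights, cutoffs, and hour caps below are entirely hypothetical (the real instrument uses roughly 280 items and far more complex classification), but the sketch shows how a diagnosis the scoring never references can simply vanish from the result.

```python
# Hypothetical sketch of a RUGs-style tiering pipeline.
# Question names, weights, tier cutoffs, and hour caps are invented
# for illustration; the real assessment has ~280 items.

def assign_tier(assessment: dict) -> str:
    """Collapse assessment answers (1 = needs help) into a tier."""
    score = (
        3 * assessment.get("needs_help_bathing", 0)
        + 3 * assessment.get("needs_help_toileting", 0)
        + 2 * assessment.get("needs_help_cooking", 0)
    )
    if score >= 7:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

# Each tier maps to a fixed cap on weekly care hours.
HOURS_BY_TIER = {"high": 56, "medium": 32, "low": 16}

def weekly_hours(assessment: dict) -> int:
    return HOURS_BY_TIER[assign_tier(assessment)]

# A diagnosis the scoring never references, like cerebral palsy,
# changes nothing: the algorithm silently ignores it.
patient = {"needs_help_bathing": 1, "needs_help_toileting": 1,
           "diagnosis": "cerebral palsy"}
print(weekly_hours(patient))  # 32, regardless of the diagnosis
```

No nurse’s judgment enters anywhere in that path, which is how an omitted mapping could go uncorrected for nearly two years.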

This tangle of healthcare technocracy has been in litigation for over two years. DHS loses, DHS appeals, and on and on.

In the meantime, Brad has been busy advocating for himself. “I have every senator’s phone number,” he told me. He also has the number of the governor of Arkansas, Asa Hutchinson. Brad has called the governor, but the governor has not spoken to him about his cut in home healthcare hours.

Asa Hutchinson

Hutchinson just happens to be one of the characters in my ongoing research into the Mena Airport scandal. That’s the one where CIA & company were trading arms for drugs, and they invaded a small town in Arkansas to use as a clandestine base. Asa was a young US attorney at the time, and I’m not sure how involved he was with the money laundering part of it, but that is another story.

Asa’s running for governor again, so it wasn’t very hard to find him. I met him at a political club luncheon. After his stump speech, during which he said he told Vice President Pence to end the trade war tariffs with China (a lot of talk about China amongst the Political Animals), I was the first one to throw up my hand for a question.

“From what I understand,” I said, “state-funded care for the elderly and people with disabilities has been cut drastically for some due to the use of artificial intelligence. What is your vision for AI in the state of Arkansas in the future?”

The reaction from the room was remarkable. Asa froze. The room went dead quiet. Somebody behind me told me to turn my camera off. “Okay,” I said.

Only then would the governor speak. “What do you mean by AI?”

And that, dear readers, is the moment I realized I had spoken the phrase which should not be spoken. We like to think of artificial intelligence as being sci-fi, but the truth is we live with it every day. I searched my mind for a word he would accept.

“Algorithms?”

“Oh, yes.” He then regained his politician flair and told the room about the need for independent assessments and how the government has too many service departments. He also said that his administration had helped shorten the waiting list for ARChoices. In an odd way, this is true.

Because although Kevin De Liban, Brad, and the rest of the plaintiffs won their lawsuit on the merits, helping possibly thousands within the program, DHS got itself a backlog of applicants for the ARChoices program. From the outside, it appears to me that DHS figuratively held some needy people in Arkansas hostage to keep its AI.

The judge in the case, Wendell Griffen, has ruled favorably toward the plaintiffs, at one point even holding DHS in contempt of court for failing to address the AI problem. But the process was completed October 1st: DHS was allowed to continue using the RUGs, and people were allowed to enroll in the program again. So yes, the ARChoices waiting list was shortened.

The next thing Asa Hutchinson said probably made my jaw drop, but I was too busy taking notes to notice.

“We need to keep people from taking advantage of the system.”

I immediately thought of the Ledgerwoods, fighting to survive financially, and all the other people who have suffered because of this AI.

“The human impact is devastating,” De Liban told me. “Some of them have had to lie in their own waste, to go without food, to endure pressure sores. People have suffered as a result of the state’s use of the RUGs algorithm.”

Hutchinson didn’t answer my question about AI.

Artificial Intelligence

Don’t get me wrong — AI can be helpful in many ways. Technology aids people like Brad and Ann. When Brad was in school, Ann had to read everything out loud to him. Imagine the number of hours. Now Siri does it.

However, the RUGs algorithm is a problem, and it is artificial intelligence, no matter what anybody says. According to Oxford Reference, artificial intelligence is “the theory and development of computer systems able to perform tasks normally requiring human intelligence, such as … decision-making ….” When a computer program makes decisions about your level of health care, that is artificial intelligence by definition.

This is an entirely different picture than you get when you see videos like Pepper the robot addressing the UK Parliament in a childlike voice, saying its design is “aimed at assisting and caring for the elderly!”

Imagine Pepper babytalking to you: “Septicemia is not part of my programming!”

And then rolling out the door of your nursing home room, never to be seen again.

A computer does not have empathy. It does what it’s programmed to do. It’s like a high-functioning sociopath.

Michael Morton

Remember when I said that home health care saves the state a whole lot of money? DHS claims it pays on average $18,000 for people with home health care compared to $50,000 for those in nursing homes. Why would they cut back on a program that saves them money when the consequences could be so severe, not only to patients, but to taxpayers?
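DHS’s own averages make the stakes easy to compute. The per-person figures below are the averages DHS claims, and the 4,000 figure is the number of enrollees whose hours were cut, cited earlier in this article; the projection is a back-of-the-envelope illustration, not an official estimate.

```python
# Back-of-the-envelope math using DHS's own claimed averages.
HOME_CARE_COST = 18_000      # average annual cost per person at home
NURSING_HOME_COST = 50_000   # average annual cost per person in a facility

savings_per_person = NURSING_HOME_COST - HOME_CARE_COST
print(savings_per_person)  # 32000 saved per person kept at home

# If the ~4,000 people whose hours were cut ended up in nursing
# facilities instead, the added annual cost to the state would be:
people_affected = 4_000
print(f"${savings_per_person * people_affected:,}")  # $128,000,000
```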

Well, at the same time the algorithm scandal is going on in Arkansas, so is a nursing home scandal.

Nursing home magnate Michael Morton bribed a circuit judge, Michael Maggio, to reduce the judgment in a negligence suit against one of Morton’s facilities from $5.2 million to $1 million. Maggio is now in prison, but Morton is still in business.

For this 2018 election cycle, Morton tried to donate to Asa Hutchinson’s campaign, but Hutchinson turned the money down. However, I can’t find any sign that Morton’s business partner, David Norsworthy, got turned down for his donation to Asa’s campaign. Norsworthy bribed Senator Jake Files, who’s in prison now too, but Norsworthy isn’t.

Seems like there’s some funny business going on in the healthcare system in Arkansas. Surprise, surprise. And I hate to say it, but I’m not shocked they are picking on some of the most vulnerable members of society first.

I asked Kevin De Liban why he thought they were using the AI. He said, “The algorithms seem to offer so-called objective cover for what are really budget cuts.”

That seems pretty obvious to me. But how deep does this scandal go? I’ll be investigating, so stay tuned.

Weaponized AI

In my research, I’ve come across many instances of AI being used against the vulnerable. It’s almost like government bureaucrats look at these people as easy targets. It’s hard to fight back when you’re dealing with a serious health issue and financially strapped.

Are they gauging the legal and social obstacles they’ll face as they deny help to the needy?

And it’s not just healthcare or food stamps. AI is everywhere now. From your social media accounts to your plane flights, artificial intelligence is there, looming secretly in the shadows.

Google, that corporation we all know and trust, has pinky promised that it will not turn into Skynet from the Terminator movies. Let’s hope they’re telling the truth, but I’m not holding my breath, because AI might not recognize hypoxia.

Let’s face it: most people just let this happen without even speaking out about it. They’re too busy watching the Puppy Bowl.

But there is a family in Cash who’s speaking up for us. One of them is in a wheelchair and can barely move, but he is doing more than anybody else I know. Good on him.

I asked Brad how he felt about the court case.

He said, “Overall, I feel like we’ve won battles, but we haven’t won the war.”

Fight on, Ledgerwoods and Legal Aid, fight on.

Winner of my high school's citizenship award.

Filed under: empire watch, featured, latest, United States


angryashell

You have encountered the privately owned corporate world, the Federal Reserve and Wall Street all packed up into one nice “what I call go-to-hell package.” The private corporations own America, all of it and everybody in it. The louses that run governments have made government operations impossible to access unless you reveal to them your all; then they don’t have any answers and are not responsible for a damned thing. They have privatized nearly every facet of government so the private owners of these corporations can make a buck off of what used to be government. What is privatization.… Read more »

Martin Usher

This software really isn’t artificial intelligence although it might seem like it to an outside observer because they can’t see how it works. From the description it looks like the person cited as the author has devised a way of digesting disparate, seemingly unrelated, information into a set of one or more linear scores which are used to simplify the administration of complex tasks, a bit like the mechanisms credit score agencies use. The problem with this approach — and the reason why I wouldn’t call it Artificial Intelligence — is that there doesn’t appear to be any mechanism for… Read more »

Hope K

Did you even read it? The Department of Human Services got sued and Legal Aid got a copy of the “algorithm.” Call it whatever you want, but I call it what it is: artificial intelligence. Because it fits the definition. You’re just convincing me more that there is some kind of whackadoodle coalition of people who want to believe it is some extraordinarily complex idea that only they get to define. Well, I go with the Oxford definition, but you do you.

icannahasinternets

The ethical considerations in that article seem to have some basis in Asimov’s laws of robotics.

…But the feline community needs to speak up now! According to the survey upon which future algorithms may be based, cats will not be spared in car accidents with the potential for multiple casualties. But it’s ok if you’re a dog standing next to a criminal.

icannahasinternets

Whatever your definition of AI, and the creation of decision-based algorithms is certainly one component, automation is increasingly intertwining itself in our daily lives. It is when this automation is used to make decisions on a social level that we’ll inevitably run into issues like those in the article. Automation should not be used (or at the very least should not be the sole component) where crucial ‘quality of life’ decisions are concerned. The only motivation to automate assessments at a social level is to punish, cause stress, avoid accountability and get ‘quick wins’. This is precisely what happened when… Read more »

Level-headed comment

Spot-on, level-headed comment. Thank you!

It is surely a feature in this Neo-Liberalism era to encounter problems with only extremely convoluted ways for redress.

” The only motivation to automate assessments at a social level is to punish, cause stress, avoid accountability and get ‘quick wins’ ”

And who hasn’t suffered from at least one of these problems lately?

vexarb

Uncle $cam’s attempt to weaponize SWIFT (the banking transfer code) is unravelling, because humans (Putin and Xi) are smarter than AI bots (Obomba and Trumpetty). From DT_Regime-change-watch BTL SyrPer: “Too bad that SWIFT is just code. It’s just an encrypted messaging system. And like the push to stifle alternative voices on social media — de-platforming Alex Jones and Gab for examples — the solution to authoritarian control is not fighting fire with fire, but technology. And that’s exactly what Russia has done. They applied themselves, spent the money and wrote their own code. Code is, after all, hard to control.… Read more »

wardropper

And let no one imagine that AI plays no part in the counting of votes at mid-term elections…
Seriously, we crossed the line in 2000. And AI won’t let us cross back again.
We can’t go on thinking like we used to back in the 1950s, when manipulation on this scale was actually hard work for the perpetrators.
It’s just a short algorithm away now.

Stop Obfuscating!

Rigging computer programs in favour of Big Corps and the government against ordinary people is already widespread. Rigging a computer program is a common practice when applying business logic during software development. AI only makes it worse. Exceedingly worse! Here is a non-AI example: many GPs (doctors) are looking less at their patients and fixating their gaze on the computer screen. They are following a series of prompts that enable not only a ‘quick diagnosis’ but also generate more business activities to ensure increased profits for the medical practice. They are not using an AI system. It’s a… Read more »

frank

Hello there. 🙂 How was my comment disingenuous? Everything you said corroborates what I said.

Stop Obfuscating!

That was a reply to a disingenuous post by ‘frank’ who said:

“if we are going to call every computer program “AI” then the term becomes meaningless. And that’s not going to help when the time comes to battle real “AI”.”

John2o2o

Not sure that really fits the definition of AI. AI is independent “artificial” thought. This problem seems to have occurred because of the crude application of a computer program (algorithm). Computer programs don’t think.

I shouldn’t have to say this, but when your computer says “hello world” that is just as a result of the execution of a computer program. The computer isn’t there saying, “Gee, I wonder how to communicate with these organic beings?”

Computers don’t think. Good article in many ways, but this is not an AI issue as far as I can make out.

Ken Kenn

All true – but the name of the game is “smart,” as in smarter than a biased human being. Take the politics out of decision making by using this smart technology – convince the populace that the machine is smarter than them, and logically the decisions they make will be objective. “Look folks – no interference from us politicians – the smart algorithms say it’s so, and so it is.” Remind me, though – did events that were only supposed to happen once in one thousand years happen in the finance sector eight times in three days? Any… Read more »

frank

“The algorithms seem to offer so-called objective cover for what are really budget cuts.” Then you go on to say: “In my research, I’ve come across many instances of AI being used against the vulnerable.” Why are you blaming the AI when it’s clearly the authorities that are making the budget cuts and using the computer system as some kind of cover. I agree that we should be worried about AI, but for the moment it’s still humans making the real decisions. Using the example of the healthcare budget cuts and blaming it on AI is a bit manipulative, I… Read more »

Hope K

We should be concerned because unelected people are weaponizing it. Social vampires like Cindy Gillespie here: https://www.arkansasonline.com/news/2018/jul/11/absent-leader-criticized-in-review-of-d-1/
I stated that it is helpful in some ways. I mean, have you used Google translate? Sometimes there’s no good substitute for a human.

frank

Should we ban science because corporate psychopaths are weaponizing it?

You think that by banning this computer program they’re using the money will be put back into the budget? If you want more money for healthcare I think you should focus on the real reason, not this computer program.

As for AI, I have serious concerns about it, but imo you are using the wrong example to attack it.

And also: if we are going to call every computer program “AI” then the term becomes meaningless. And that’s not going to help when the time comes to battle real “AI”.

Mulga Mumblebrain

Precisely. AI is just another tool in the psychopathic Right’s endless war against their Eternal Enemy-other people.

Eric Blair

Good piece. AI is also what drives the computers that banks and traders on Wall Street and in the City of London use to shunt trillions around the globe in fractions of a second and buy or sell currency and other commodities in the blink of an eye. A common German colloquialism for computer is Rechner – it literally means calculator, which is what computers at their core basically are….fast calculators, really really fast calculators that can do millions (or is it billions already?) of calculations in a matter of seconds. This is where their power lies. Forget Kurzweil’s Singularity…it’s… Read more »

DunGroanin

I recall that it came up in Wylie’s SCL/CA testimony to parliament and forgot to follow it up. So thanks for the heads-up. I remember it from Tolkien as his imagining of TV or skype. Anyway, having had a quick search, looks like they are recruiting. “A World-Changing Company At Palantir, we’re passionate about building software that solves problems. We partner with the most important institutions in the world to transform how they use data and technology. [WTF???…>] Our software has been used to stop terrorist attacks, discover new medicines, gain an edge in global financial markets, and more. If… Read more »

Hope K

I feel like I should give this reply a round of applause.

I just saw this: https://www.zerohedge.com/news/2018-11-01/ai-lie-detectors-tests-coming-eu-airports

Norcal

I can’t thank you enough Hope K for this excellent investigative reporting. I’m especially impressed by your Arkansas Links under the Asa Hutchinson photo and I encourage readers to go to those articles to find the depth of corruption occurring in Arkansas.

I must also praise your High School for their award process; keep up the masterful work, we need it desperately…

Hope K

😁 Thank you.

Application rejected. Why?

When things go wrong (highly likely deliberately/programmatically), those responsible can use AI or Machine learning to instantly find excuses (imaginable or unimaginable) that work well for any given situation using the entire history of excuses on record .. before the victim even blinks.

Jen

Thanks to Hope K for alerting us to something I myself would never have guessed (but is now too painfully obvious, that I should have foreseen it): that AI could be used in a cynical way by governments and private corporations to extort money from vulnerable patients (be they elderly patients, patients with chronic conditions or even babies and their mothers) and their families, and even to deny people the help and care they need. All for the sake of profit or making more efficient use of scarce funding and sticking to budgets (ha).

vexarb

It’s GIGO. Garbage In Garbage Out; Greed In Greed Out. As in Lord and Lady Macbeth of Arkansaw. Search “‘The Clinton Body Trail”. While CIA drugs flew into Arkansaw airport under the Governor’s protection and young witnesses were battered to death, the Clinton Foundation climbed up from zero to $80G, and Miss Macbeth married into a Rothschild bank. I don’t know about the present Governor (is he that handsome figure in the photo?) but I do know this: The descent to the lowest circles of Hell can only be made on the back of a monster with the face of… Read more »

AI = virtual slavery

WOW .. it’s the computer’s fault!

AI errors (or ‘features’) always disadvantage people at the weak end.
The 1% can own, create, design, operate AI.

What can the 99% do except ‘Obey!’?

flaxgirl

This is all screamingly obvious human error (or perhaps it’s deliberate) in devising the AI system. If the system reduces hours then it needs to be able to provide the reason and then the system should check that the reason is valid, or, if not sophisticated enough, with the reason provided, the client and other stakeholders will immediately pick up any anomaly. There should, of course, be a notification to the client and other stakeholders well in advance of reduced hours taking effect, containing some caveat that “if you believe an error has been made please contact us within x-timeframe”.… Read more »

DunGroanin

Dare we venture into scifi territory? Quickly – as far as I understand, a true AI would become ‘sentient’ and start to reprogramme and redesign itself. Unlike a robot, it would soon override any of its preset parameters (e.g. KILL THE POOR) and start to set its own goals. At a speed much, much faster than human conscious communication capability. I suppose it would be able to decide what is good/bad. And be able to be reasonable/fair. It might decide that we are not worthy enough to bother with or that we are worth having around for our human… Read more »

Hope K

They did not tell the patients ahead of time. Also, the “stakeholders” in this case are the taxpayers. Thus the lawsuits, federal and state. Do you think leaving cerebral palsy out for two years was a mistake? A human nurse would never have done that. AI is being used oftentimes for bureaucrats to shun their responsibility for denials of care.

flaxgirl

AI is being used oftentimes for bureaucrats to shun their responsibility for denials of care. Exactly. It’s being used incorrectly – the AI is only ever supposed to be part of the system – it produces the algorithm but then the other part of the system is checking with the client – not just telling them ahead of time but checking that it’s OK. It’s pretty straightforward. Where I work the patient gets notified a number of times with ample opportunity to dispute any proposed reduction of payments – ultimately how fair the system is I don’t know because I… Read more »

axisofoil

We are dealing with Arkansas here. Need I say more?

Jen

Former stomping ground of the Clintons and the radioactive centre of Walmart, I see.

Fair dinkum.

It’s the ignorant (our psychopathic leaders) leading the ignorant (their subordinates).
It’s Capital$chi$m.