That’s not correct thinking. Unless you start from the position that AI is 100% correct, it’s going to take just as much time to review a decision written by AI and then compare it with the actual evidence of record. It might actually take more time. It’s a lot easier to review the evidence and write a decision than it is to review somebody else’s decision and then go back through the record to find the evidence that supports it.
Agreed. People seem to think AI will just be correct. That’s not true at all. AI has already been tested in the courts: attorneys relied on AI to write briefs, and the results have been disastrous. AI just makes up case names and fake citations. AI has no quality control; it just takes an input and gives an output.
There’s no need to review every single AI-drafted decision; just review samples. However, it will probably take a minimum of 10 years for AI to reach that level, and BVA will still employ many attorneys in the meantime.
Maybe I’m missing something. You want to use AI to draft decisions, but you don’t actually want to review them?
You think the Court will accept credibility determinations made by a computer program?
"AI" is just a generic term; what people usually mean is a tool built on a very large language model, and there are countless of them. Some, like ChatGPT, don't have any parameters that restrict hallucinating. Others, like Lexis AI, are confined to real law and real cases and do not hallucinate. AI can also be instructed to give citations and authority for any position it takes, and to cite the record for any factual assertion, just like a law journal does for virtually all propositions.

ChatGPT, for example, can be given a set of your judge's cases, told to rely only on those along with the CFR and USC, and then asked questions and told to cite relevant authority for its positions. I know because I do just that. I update the dataset every few months. I can grab draft language out of it 10x faster than with snippets. I can also give it fact patterns and it will tell me how my judge has decided in the past and give me those cases. The searches are so much easier and faster than what we have on SharePoint. If I put the effort into it, I could easily have it write decisions. Could it do the entire job? No, not as I have it implemented, but does it simplify my work? Oh, very much so.

In terms of what the rest of the legal world is doing for doc review, it wouldn't be hard to have AI doing far more of the work. Take 5 minutes and look at what platforms like Everlaw are doing. BVA is in the stone age with Caseflow.
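For anyone curious what a setup like that looks like in code, here is a minimal sketch of the restricted-source prompting described above, assuming the OpenAI Python client. The model name, file name, and prompt wording are illustrative placeholders, not the poster's actual configuration.

```python
# Minimal sketch: answer questions using ONLY a supplied set of decisions plus
# the CFR/USC, with citations required. Assumes the OpenAI Python client;
# model, file path, and prompt text are illustrative.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A local collection of the judge's decisions, previously downloaded
# (e.g., from the BVA decision search) and concatenated into one text file.
with open("judge_decisions.txt", encoding="utf-8") as f:
    decisions = f.read()

SYSTEM_PROMPT = (
    "You are a drafting assistant for a BVA attorney. Rely ONLY on the decisions "
    "provided below, plus 38 C.F.R. and 38 U.S.C. Do not use secondary sources. "
    "Cite a specific decision (by citation number) or a specific CFR/USC section "
    "for every proposition. If you cannot find support in these materials, say so "
    "and ask whether to expand the search instead of guessing.\n\n"
    "DECISIONS:\n" + decisions
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative; any chat model with a large context window
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": "How has this judge handled delayed-onset tinnitus claims?"},
    ],
)
print(response.choices[0].message.content)
```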
You are missing the point. AI is a tool; it is not the lawyer. AI would review and gather the relevant evidence, present it, and reach a suggested conclusion, and an attorney/VLJ would review it. It is just a tool. Many people in here have clearly never used AI, and the resistance to technology that could vastly improve service to veterans is a perfect example of why the general public is so quick to celebrate the termination of bureaucrats and the disruptions to our lives. The government is a bloated dinosaur that is too protective of the way things were and are done, in order to protect the bloat and the jobs.
It will take years before AI is capable of making credibility determinations. But it will get there eventually.
Credibility is entirely subjective and not suited for AI analysis. The criteria for a yea or nay on credibility would have to be reduced to a set of rules that could be programmed into the AI, and those rules would have to be established in law first.
Oh, for Pete's sake, this discussion is annoying. I bet if you asked most people at the Board, 'Would you like an AI tool to help you review lengthy case files, highlight relevant evidence, and assist you in drafting?', they would say yes. If you asked people, 'Do you think AI can do your job?', they would say no. And if you couple that question with an insult and a threat to everyone's livelihood, the response is going to be even more vehement. The so-called technology resistance some folks are complaining about is more a response to the inflammatory way AI is being discussed on this board than anything else, imo.

There are parts of our job that I think can absolutely be done better by AI: reviewing thousands of pages of medical records, for example. I would LOVE for an AI program to just pull out the relevant evidence. AI can also help us draft more quickly. If/when AI is introduced, some attorney jobs might shift away from drafting decisions and toward teaching the AI; after all, the law is constantly changing, so it's not like you can set up the AI and just let it run. In the future, I think AI-assisted attorneys could turn out exponentially more cases than we do now.

But at the end of the day, I think a human needs to make the call on whether it's going to be a grant, a denial, or a remand. A human needs to direct the AI, not just do a quality check every 10 cases. Even if we get to the point where there's no technological limitation preventing AI from generating a case entirely independently, there are ethical limitations, and I suspect there would be legal limitations too. If an AI independently denies a claim, people are going to start working on a lawsuit about the way AI is being used, and I doubt the outcome will be 'yeah, this is fine, we don't need humans at all.' Any predictions that don't take that into account are just blowing smoke.
Immediate BVA concerns: RIF, RTO, squashing union protections, whatever the hell “anti-Christian bias” is
Not an immediate concern: AI
That’s a complete misconception. The US population grew at a staggering rate, increasing 50% over 50 years, while the government grew by only 4%. We simply don’t have enough people to do the jobs that are required. The government isn’t bloated; it’s just Republicans telling you that, and you seem to be buying into it without any actual thought.
You are correct that a human has to make the call, mostly because it will be impossible for an attorney to explain how the "Board" arrived at a conclusion if the conclusion was not actually made by someone at the Board. Will the OGC attorney just tell the Court that the conclusions in the decision were based on algorithms? That goes against the entire principle of an attorney at BVA doing de novo review. AI would work fine if the decisions did not have to be defended in Court.
For anyone interested in training their own chatbot, you can grab decisions from here: https://search.usa.gov/search?query=tinnitus&affiliate=bvadecisions&start_date=2024-01-01&end_date=2025-01-23.

That example search pulls tinnitus decisions published between 1/1/24 and 1/23/25. You can edit the search for your judge, disability, time frame, etc. Drop those decisions into your AI tool of choice and instruct it to use them as its resource, to cite the citation number when responding, and to use the CFR and USC rather than secondary authority like other websites.

If you're using GPT, you can reduce its creativity (hallucinations) by telling it to say it cannot find what you're looking for, instead of trying to please you by inventing answers or synthesizing close-but-wrong ones; tell it instead to prompt you with the option to expand your search criteria. You'll also need to tell it to search the entire CFR. Otherwise, when you ask it a question about something like effective dates, it will stop at the first answer instead of continuing to read and getting to the section on notice of intent to file. Think of AI like a lazy 1L.

For the upload, GPT worked fine when I combined the .txt files into a single PDF with a bookmark for each file (done automatically by my PDF software) and uploaded it. Keep track of the subjects and dates the decisions are from, and then you can make adjustments when new controlling case law comes out or when your judge decides to switch up their templates. This is so much faster than Research Tools, and you don't need to use terms and connectors.
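Since that search URL is just plain query-string parameters, the per-judge, per-issue, and per-date variations can be scripted rather than edited by hand. Here is a minimal sketch, assuming Python's requests library; the query terms and dates are illustrative, and parsing or downloading individual decisions from the results page is left out because the page structure isn't described above.

```python
# Minimal sketch: build the BVA decision-search URL shown above with different
# query/date parameters and fetch the results page. Only the query string is
# varied; extracting individual decisions is left as an exercise.
from urllib.parse import urlencode

import requests

BASE = "https://search.usa.gov/search"

def bva_search_url(query: str, start_date: str, end_date: str) -> str:
    """Build a search URL like the tinnitus example in the post."""
    params = {
        "query": query,               # e.g. "tinnitus" or a judge's name
        "affiliate": "bvadecisions",  # the BVA decisions collection
        "start_date": start_date,     # YYYY-MM-DD
        "end_date": end_date,
    }
    return f"{BASE}?{urlencode(params)}"

if __name__ == "__main__":
    url = bva_search_url("tinnitus", "2024-01-01", "2025-01-23")
    print(url)
    page = requests.get(url, timeout=30)  # returns the HTML results page
    print(page.status_code, len(page.text), "characters of HTML")
```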
And where were computers 50 years ago? 50 years ago, VA was using paper files and typewriters. VA is a dinosaur when it comes to the use of technology.