[House Hearing, 110 Congress]
[From the U.S. Government Publishing Office]
DISABILITY CLAIMS RATINGS AND BENEFITS
DISPARITIES WITHIN THE VETERANS
BENEFITS ADMINISTRATION
=======================================================================
HEARING
before the
SUBCOMMITTEE ON OVERSIGHT AND INVESTIGATIONS
of the
COMMITTEE ON VETERANS' AFFAIRS
U.S. HOUSE OF REPRESENTATIVES
ONE HUNDRED TENTH CONGRESS
FIRST SESSION
__________
OCTOBER 16, 2007
__________
Serial No. 110-53
__________
Printed for the use of the Committee on Veterans' Affairs
U.S. GOVERNMENT PRINTING OFFICE
39-462 PDF WASHINGTON DC: 2008
---------------------------------------------------------------------
For sale by the Superintendent of Documents, U.S. Government Printing
Office Internet: bookstore.gpo.gov Phone: toll free (866)512-1800
DC area (202)512-1800 Fax: (202) 512-2250 Mail Stop SSOP,
Washington, DC 20402-0001
COMMITTEE ON VETERANS' AFFAIRS
BOB FILNER, California, Chairman
CORRINE BROWN, Florida                     STEVE BUYER, Indiana, Ranking
VIC SNYDER, Arkansas                       CLIFF STEARNS, Florida
MICHAEL H. MICHAUD, Maine                  JERRY MORAN, Kansas
STEPHANIE HERSETH SANDLIN, South Dakota    RICHARD H. BAKER, Louisiana
HARRY E. MITCHELL, Arizona                 HENRY E. BROWN, Jr., South Carolina
JOHN J. HALL, New York                     JEFF MILLER, Florida
PHIL HARE, Illinois                        JOHN BOOZMAN, Arkansas
MICHAEL F. DOYLE, Pennsylvania             GINNY BROWN-WAITE, Florida
SHELLEY BERKLEY, Nevada                    MICHAEL R. TURNER, Ohio
JOHN T. SALAZAR, Colorado                  BRIAN P. BILBRAY, California
CIRO D. RODRIGUEZ, Texas                   DOUG LAMBORN, Colorado
JOE DONNELLY, Indiana                      GUS M. BILIRAKIS, Florida
JERRY McNERNEY, California                 VERN BUCHANAN, Florida
ZACHARY T. SPACE, Ohio
TIMOTHY J. WALZ, Minnesota
Malcom A. Shorter, Staff Director
______
SUBCOMMITTEE ON OVERSIGHT AND INVESTIGATIONS
HARRY E. MITCHELL, Arizona, Chairman
ZACHARY T. SPACE, Ohio                     GINNY BROWN-WAITE, Florida, Ranking
TIMOTHY J. WALZ, Minnesota                 CLIFF STEARNS, Florida
CIRO D. RODRIGUEZ, Texas                   BRIAN P. BILBRAY, California
Pursuant to clause 2(e)(4) of Rule XI of the Rules of the House, public
hearing records of the Committee on Veterans' Affairs are also
published in electronic form. The printed hearing record remains the
official version. Because electronic submissions are used to prepare
both printed and electronic versions of the hearing record, the process
of converting between various electronic formats may introduce
unintentional errors or omissions. Such occurrences are inherent in the
current publication process and should diminish as the process is
further refined.
C O N T E N T S
__________
October 16, 2007
Page
Disability Claims Ratings and Benefits Disparities within the
Veterans Benefits Administration............................... 1
OPENING STATEMENTS
Chairman Harry E. Mitchell....................................... 1
Prepared statement of Chairman Mitchell...................... 36
Hon. Ginny Brown-Waite, Ranking Republican Member................ 2
Prepared statement of Congresswoman Brown-Waite.............. 36
Hon. Zachary T. Space............................................ 3
Hon. Cliff Stearns............................................... 4
Hon. Timothy J. Walz............................................. 5
WITNESSES
U.S. Department of Veterans Affairs:
Jon A. Wooditch, Deputy Inspector General, Office of Inspector
General...................................................... 16
Prepared statement of Mr. Wooditch........................... 42
Ronald R. Aument, Deputy Under Secretary for Benefits, Veterans
Benefits Administration...................................... 22
Prepared statement of Mr. Aument............................. 44
______
American Veterans (AMVETS), Ray Pryor, USN (Ret.), Chillicothe,
OH............................................................. 9
Prepared statement of Mr. Pryor.............................. 39
Kenney, John J. ``J.J.'', USMC (Ret.), Homosassa, FL, Veteran
Service Officer, Citrus County, FL............................. 7
Prepared statement of Mr. Kenney............................. 37
Institute for Defense Analyses, David E. Hunter, Ph.D., Research
Staff Member, Cost Analysis and Research Division.............. 14
Prepared statement of Dr. Hunter............................. 40
SUBMISSIONS FOR THE RECORD
American Legion, Steve Smithson, Deputy Director, Veterans
Affairs and Rehabilitation Commission, statement............... 46
American Legion, Department of Ohio, Donald R. Lanthorn,
Department Service Director, statement......................... 50
Wilson, Hon. Charles A., a Representative in Congress from the
State of Ohio, statement....................................... 53
MATERIAL SUBMITTED FOR THE RECORD
Post Hearing Questions and Responses for the Record:
Hon. Harry E. Mitchell, Chairman, and Hon. Ginny Brown-Waite,
Ranking Republican Member, Subcommittee on Oversight and
Investigations, Committee on Veterans' Affairs, to Hon.
Gordon H. Mansfield, Acting Secretary, U.S. Department of
Veterans Affairs, letter dated November 2, 2007, and VA
responses.................................................... 54
DISABILITY CLAIMS RATINGS AND BENEFITS
DISPARITIES WITHIN THE VETERANS
BENEFITS ADMINISTRATION
----------
TUESDAY, OCTOBER 16, 2007
U.S. House of Representatives,
Committee on Veterans' Affairs,
Subcommittee on Oversight and Investigations,
Washington, DC.
The Subcommittee met, pursuant to notice, at 2:04 p.m., in
Room 334, Cannon House Office Building, Hon. Harry E. Mitchell
[Chairman of the Subcommittee on Oversight and Investigations]
presiding.
Present: Representatives Mitchell, Space, Walz, Brown-
Waite, and Stearns.
OPENING STATEMENT OF CHAIRMAN MITCHELL
Mr. Mitchell. This hearing will come to order. This is the
Subcommittee on Oversight and Investigations hearing on
Disability Claims Ratings and Benefit Disparities within the
Veterans Benefits Administration (VBA). I want to thank
everyone for coming this afternoon.
For years the Veterans Benefits Administration has
experienced problems maintaining adequate accuracy and
consistency data within its rating system. The purpose of this
hearing is to evaluate what the VA is doing to fix these
problems. Its ability to keep accurate records is essential
to ensure the quality of veteran disability ratings now and
into the future.
Let me first thank Congressman Space who has quickly become
a leader in working to address this issue. He and Ranking
Member Ms. Brown-Waite took the lead in assembling the first
panel. The disability ratings system has been an issue of
serious concern since 2002, following an eye-opening U.S.
Government Accountability Office (GAO) report. In January
2003, the GAO designated the U.S. Department of Veterans
Affairs' (VA's) Disability Program as high risk. This
designation resulted from concerns about consistency of
decision making and accuracy of records.
This Subcommittee is aware of the Department's efforts to
correct these issues, but more has to be done. I am concerned
about the wide variations in average compensation per veteran
and grant rates that persist between states. After years of
recommendations by the GAO and the VA Office of Inspector
General (OIG), the VA has failed to collect and maintain an
accurate database. This must change because our Nation's
veterans cannot be forced to wait any longer. According to
VBA's Systematic Technical Accuracy Review, or STAR, the accuracy of
regional office (RO) decisions varies from 76 percent in Boston
to 96 percent at the Fort Harrison regional office.
This variation is troubling. More troubling is that STAR
only looks at accuracy and completely ignores consistency of
decisions. The VA has implemented a new data system called
Rating Board Automation 2000. This system collects more
information but it continues to set roadblocks for analyzing
claim denials for disabilities like Post Traumatic Stress
Disorder (PTSD) and Traumatic Brain Injury (TBI). PTSD and TBI
are complicated and often misdiagnosed disabilities. Because of
their nature, rating a veteran with these disabilities is
somewhat subjective.
We understand there are variances between States in claims
decisions, and that is to be expected. But the subjective nature
of the ratings process does not do our veterans justice. We are
sending the wrong message to our Nation's veterans. We are
saying that even though you served courageously for your
country, you better live in the right State and hire
professionals when filing for disability benefits. This is
unacceptable.
Just last week we heard from the Veterans Disability
Commission on the necessity to provide equitable treatment for
all veterans, but this is not the case today. Aside from
maintaining accurate records, we need to make sure that claims
officers nationwide receive the same training. This training
must be focused on the intricacies of each disability imposed on
any veteran, young and old. I know that we can work together in
a bipartisan way with the VA to ensure that our veterans get
the best and most fair benefits available.
Before I recognize the Ranking Republican Member for her
remarks, I would like to swear in our witnesses. Will all
witnesses from all three panels please rise. And would you
raise your right hand.
Do you solemnly swear to tell the truth, the whole truth,
and nothing but the truth?
Thank you. I would now like to recognize Ms. Brown-Waite
for opening remarks.
[The prepared statement of Chairman Mitchell appears on p.
36.]
OPENING STATEMENT OF HON. GINNY BROWN-WAITE
Ms. Brown-Waite. Thank you very much, Mr. Chairman. The
Institute for Defense Analyses (IDA) recently issued its
final report in March of this year on the analysis of the
differences in disability compensation in the Department of
Veterans Affairs. This report was completed at the VA's request
to identify and collect data on compensation of recipients.
According to this study, the VA must do three things. One,
put forth a national effort of consistency of claims
processing; two, make certain that the raters receive
consistent training on a national basis; and three, collect and
maintain valid data to analyze national statistics and trends.
I am very interested in hearing from the three panels, but
especially I want to hear how VBA actually plans to implement
those recommendations. It is apparent that VBA must take steps
to improve training and to modernize its rating system.
Whether a veteran's claim is rated at the St. Petersburg VA
regional office or the Phoenix, Arizona, VA regional office,
the same standard should be applied when making a rating
decision on the claim.
I would like to mention a bill that I have cosponsored with
my colleague, Mr. Lamborn, H.R. 3047, the ``Veterans Claims
Processing Innovation Act.'' This legislation would improve the
veterans' claims process at the VA by changing the work credit system
for VA. To do this, the measure establishes a fully electronic
system pilot to streamline the claims process.
That bill also requires the VA to have an independent
organization certify the effectiveness of VBA's training
programs and allow family members of veterans who have passed
away to continue the original claim instead of forcing the
dependents to start the claim all over again.
I hope that this legislation will pass the Committee before
the end of this Congress and be considered on the full House
Floor. I look forward to hearing more from our witnesses today.
And with that, I yield back the balance of my time, Mr.
Chairman. Thank you.
[The prepared statement of Congresswoman Brown-Waite
appears on p. 36.]
Mr. Mitchell. Thank you. Mr. Space?
OPENING STATEMENT OF HON. ZACHARY T. SPACE
Mr. Space. Thank you, Mr. Chairman, and thank you Ranking
Member Brown-Waite for holding today's Subcommittee hearing.
I requested this hearing because of my concern for the
existence of inequities in veterans disability payments. More
specifically, I requested this hearing because of my home
State's dismal ranking in average disability payments. Ohio
ranked dead last among States with an average of under $8,000.
The national average according to the Institute for Defense
Analyses Report based on 2005 data was $8,890. And the highest
ranking State was New Mexico with an average of over $12,000.
I am concerned that veterans in Ohio, who have served just
as honorably as veterans in other States, may not be getting a
fair deal by virtue of where they reside. In my district, one
is more likely to live below the poverty line than to have a
college education. That said, it is a struggle for many of my
constituents to meet the demands of the cost of living in Ohio.
Poor veterans in Ohio need every disability dollar they
have earned. I hope this hearing is a step in addressing that.
I understand that some of the State-by-State inequity may be a
result of factors beyond the control of the VA. However, I also
believe there is much that can be corrected. There is a need
for processes to increase consistency in the training given to
claims raters. And furthermore, there is a need for oversight
over the regional cultures that we will hear about today.
I want to know that every possible step toward addressing
what is fixable about this situation is being taken. I am
privileged to use my membership on the Subcommittee to shed
some light on this grave problem. Congressman Charlie Wilson, a
good friend and colleague, wanted very much to be here today.
However, he is recovering from recent surgery and on his behalf
and on behalf of his constituents as well as mine, I look
forward to hearing today's testimony to determine how Congress
can best work to address the disability claims disparity that
exists, and that is quite frankly negatively impacting the
brave heroes of the great State of Ohio.
And I should also add that I am very grateful to have on
hand today one of my constituents, Mr. Ray Pryor of
Chillicothe, Ohio, who will present testimony on behalf of
AMVETS.
I would like to request permission to submit the written
statement of Mr. Donald Lanthorn, Department Service Director
for the Ohio American Legion, for the record, pending review by
the appropriate authorities.
Thank you, Mr. Chairman.
[The statements of Donald Lanthorn and Congressman Charles
A. Wilson appear on pages 50 and 53.]
Mr. Mitchell. Thank you. Mr. Stearns?
OPENING STATEMENT OF HON. CLIFF STEARNS
Mr. Stearns. Yes. Thank you, Mr. Chairman. I guess we all
know there have been many organizations that have reviewed the
inconsistencies within the Veterans Benefits Administration. And
they have recommended that the VA start gathering data and
formulating metrics in order to better monitor any disparities.
The most recent investigations into inconsistencies between
the VA regional offices and VBA compensation benefits consist
of a Government Accountability Office report that was issued in
2002. Another one was issued in 2003 and a third was issued in
2004. And they were followed by the Office of Inspector
General investigation in 2005 and the Institute for Defense
Analyses report, which the Chairlady mentioned earlier, in
2007.
My colleagues, in 2002, the GAO found that the VA did not
systematically assess decision making inconsistencies to
determine the degree of valuation variation that occurs for
specific impairments, and recommended that the VA begin to gather
useful data that would allow it to determine if there were
problems with inconsistencies. Following the GAO's 2003
investigation, the VA began to better monitor accuracy. But it
appears they still did not address the inconsistencies.
When the GAO returned to the issue in 2004, it determined
the VA had not yet acted. They had yet to act on the 2002
recommendations. So, Mr. Chairman, we have these reports and
the VA is not consistently acting on them. And as we go down
this line of reports, we find there is less action than we
would expect. GAO also noted that data in the VA benefits
delivery network system did not, ``provide a reliable basis for
identifying indications of possible decision making
inconsistencies among regional offices.''
So the question is, when you have all this information over
many years, why aren't they acting? In 2005, the VA OIG issued
its own report now highlighting various inconsistencies. So we
have all these reports and you have the VA's own OIG report.
There are disparities in claims ratings and payments within the
VBA, some of the most significant being 100 percent disability
ratings, and most specifically, as mentioned, for Post Traumatic
Stress Disorder and individual unemployability ratings.
Veterans with either a 100 percent disabled or individual
unemployability (IU) rating received 58 percent of the total
payment made by VA throughout the country, yet they make up
only 17 percent of the total veterans population.
So both the IU ratings and the PTSD are extremely
subjective. And I think hopefully our witnesses can give us a
better understanding of this. This is a critical issue that
must be resolved, especially in light of the reiterated
findings of these various agencies indicating to us this is
a problem. And there has been no one acting for 4 or 5 years,
and yet we come back and talk about it.
So I am interested in hearing from our witnesses about the
possibility of perhaps modifying the current data compilation
program to collect more information on claims ratings to better
monitor possible disparities.
So, Mr. Chairman, I appreciate your hearing on this, and
hopefully the witnesses will be able to help us. Thank you.
Mr. Mitchell. Thank you. Mr. Walz?
OPENING STATEMENT OF HON. TIMOTHY J. WALZ
Mr. Walz. Thank you, Mr. Chairman. And thank you to every
one who is here today. We truly appreciate it and please know
that everybody in this room is committed to our veterans. I
would like to say, Chief Kenney, that I find no greater friend to
the veterans than our County Veterans Service Officers (CVSOs), so
I thank you for your work there. Veterans understand who they go to
and who their advocate is to get things done. And with our
Veterans Service Organizations (VSOs) that are speaking for
veterans and understanding them, we see ourselves and our role on
this Oversight and Investigations Subcommittee as helping
facilitate any changes that are necessary to help our veterans.
I have said it dozens of times and I will continue to say
it. We know that the people and those great civil servants that
work in the VA want to deliver that service, but as my
colleague Mr. Stearns so clearly and sufficiently pointed out,
there have been plenty of suggestions to make changes, to make
this better that have not happened. And I think it is incumbent
upon us to make sure that this Committee is doing that.
So this is an issue that is on our veterans' minds. It is
on my constituents' minds. The disability claims system is
something in which they feel a real injustice is being done to
our veterans. And this is just one more of those
issues. But I do believe that there is absolutely no reason to
believe we cannot get this fixed. There are some good hard data
out there and I think there are some things that we can put
into place. And I hope, as my colleagues have said, to hear
from you on how we can do this.
And with that, I yield back, Mr. Chairman.
Mr. Mitchell. Thank you. I ask unanimous consent that all
Members have 5 legislative days to submit a statement for the
record.
Seeing no objections, so ordered.
I would now like to call on Ms. Brown-Waite to make her
introductions.
Ms. Brown-Waite. Thank you, Mr. Chairman. Thank you for
giving me the opportunity to introduce one of my constituents
and one of my favorite VSOs just because of the number of
veterans that he deals with every single day, and yet he does
it in a very cheerful manner. And that is John ``J.J.'' Kenney,
as everyone knows him, who is testifying before us
today.
There is a very strong sense of service to the country that
runs in the Kenney family. J.J. Kenney is the son of a World
War II combat-wounded veteran. He, along with three of his
other four siblings, served during the Vietnam War. His older
sister retired as a Chief Navy Corp with 20 years in the U.S.
Navy.
J.J. himself served in the United States Marine Corps from
November 1963 until his retirement in September 1986 with over
20 years of service in the Marines.
As a training officer at the Navy Parachute Rigger School,
he completed a total of 34 parachute jumps. After retirement
from his civilian positions, he moved to Florida and, like many
people, got a little bored and re-entered the workforce as
Citrus County's Assistant Veterans Service Officer.
After just 18 months, the County Commission recognized his
talents and he was selected as the County Veterans Service
Officer.
In 2002, his office was selected as the best large service
office based on population by the Veterans of Foreign Wars
(VFW) Department of Florida. J.J. is an accredited service
officer holding accreditations for National Association of
County Veterans Service Officers, Florida Department of
Veterans Affairs, the American Legion, Disabled American
Veterans (DAV), and the Veterans of Foreign Wars.
J.J. Kenney and his beloved wife of 42 years, Mary Ann,
reside in a beautiful part of my district, Homosassa, Florida.
They have three children and ten grandchildren. I am very
pleased that he is here today to share his testimony. And we
need to listen to the disparities that he will bring forth.
Thank you, Mr. Chairman and thank you J.J. for being here.
[Applause.]
Mr. Mitchell. Thank you. Mr. Space?
Mr. Space. Thank you, Mr. Chairman. Ray Pryor served in the
United States Navy on active duty from June 1973 to May 1975,
making four tours aboard ships in the South Pacific. He then
served 6 additional years in the Naval Reserve. Following Mr.
Pryor's military service, he was an employee of Ohio's Job and
Family Services for 25 years, retiring in June of 2005.
In addition, Mr. Pryor served as a veterans employment
State representative for 20 years with the last 5 years as the
veterans licensing and certification coordinator for veterans
programs. Mr. Pryor currently serves on the Ross County
Veterans Service Commission as a county employee, and along
with four commissioners, oversees the operations of the County
Veterans Service Office. He sits on the South Central Ohio
Homeless Veterans Committee; Ross County Veterans Council;
Veterans in Transition, Inc.; and belongs to AMVETS, the
American Legion, the DAV, Vietnam Veterans of America, and the
VFW.
As a resident of Chillicothe, Mr. Pryor is, as I mentioned,
a constituent of mine and a member of my Veterans Advisory
Board. He is accompanied by Raymond Kelley, the Legislative
Director for AMVETS. And I welcome Mr. Pryor and Mr. Kelley and
thank them for taking time to be here today.
Mr. Mitchell. We will begin with Mr. Kenney. You have 5
minutes.
STATEMENTS OF JOHN J. ``J.J.'' KENNEY, USMC (RET.), HOMOSASSA,
FL, VETERAN SERVICE OFFICER, CITRUS COUNTY, FL; AND RAY PRYOR,
USN (RET.), CHILLICOTHE, OH, ON BEHALF OF AMERICAN VETERANS
(AMVETS); ACCOMPANIED BY RAYMOND C. KELLEY, NATIONAL
LEGISLATIVE DIRECTOR, AMERICAN VETERANS (AMVETS)
STATEMENT OF JOHN J. ``J.J.'' KENNEY
Mr. Kenney. Good afternoon, Mr. Chairman, Members of the
Committee. I would like to thank the Committee for this
invitation to speak this afternoon about some of the
disparities in the awards of benefits from State to State.
I also would like to thank, in front of her peers,
Congresswoman Brown-Waite for her efforts on behalf of the
veterans of Citrus County, Florida. Thank you, Congresswoman.
I would like the Committee to know that I am not here today
to knock the VA. We, in the State of Florida, enjoy an
exceptional relationship with our one and only regional office
in St. Petersburg. Many of my fellow service officers in other
States only wish they had the relationship with their RO that
we do. If I have a problem, I can pick up the telephone and
talk directly with the service center manager and any of the
department heads and get the answers I need when I need them.
When I call them, they call me back.
There does, however, continue to be a disparity in the
awarding of benefits from State to State. And one wonders how
this could be possible since all 50 plus regional offices are
guided by the same regulations: title 38 of the Code of Federal
Regulations (38 CFR) and the M21 Manual.
One, the 38 CFR provides the necessary information with
regard to the ethical conduct in the adjudication of veterans
claims along with how and when the information about veterans
should be handled. Additionally, the 38 CFR provides the
various information required with regard to diagnostic codes
for the different illnesses and injuries along with the
percentage to be awarded for severity of the disability.
The M21 Manual is basically a Standard Operating Procedure.
What do I do to get from point ``A,'' the receipt of the claim,
to point ``B,'' the decision? It would appear a relatively
simple task of reviewing the evidence supplied by the veterans;
reviewing the service medical records for an in-service
occurrence; verifying character of service; determining from
the medical evidence if the condition is chronic in nature or
if the disease or illness is presumptive. Presumptive means
that the veteran has filed a claim within 1 year of separation,
or has a disability as a result of exposure to an environmental
hazard, i.e., Agent Orange or radiation, or was a prisoner of war.
There are several elements that are not considered, and
they include the human element, the veteran population, and the
inventory of the various regional offices. The human element is
in every decision the VA renders. However, it differs from
State to State. I know the training received by VA is superb
and to the best of my knowledge standardized. So why the
disparity in the awards?
I would like to provide the Committee with a couple of
examples. In the first example, the veteran, who I will call Mr.
Smith, resides in California. He entered the Armed Forces in
the mid 1960s. At boot camp, the veteran received his
inoculations with the air gun. In the late 1990s, early 2000,
he was diagnosed with hepatitis C. He had not used drugs, had
no tattoos, and had not engaged in any improper conduct.
He applied for service-connected compensation based on the
use of the air guns providing medical evidence that supported
his claim. He was awarded service connection.
Veteran number two, we will call him Mr. Jones, resides in
Florida and entered the service approximately the same time as
Mr. Smith. He too received inoculations with the air gun.
Around the same time as Mr. Smith, Mr. Jones was diagnosed with
hepatitis C. He initially thought it may have been the result
of surgery he had undergone at the VA. Thinking he had received
blood during the surgery, he applied for compensation thinking
the blood may have been tainted.
Upon receipt of the claim, the VA located the surgical
notes that indicated Mr. Jones had not received any blood
products and denied his claim. In discussion with the veteran,
again ruling out drugs, inappropriate behavior or tattoos it
came down again to the air gun.
The veteran again applied for compensation based on the air
gun, providing some of the same evidence as Mr. Smith did in his
claim. Additionally, he found a medic who was administering
shots at the same time Mr. Jones was at boot camp. The medic
verified the method by which the air gun was used, and this was
supported by medical evidence that was----
Mr. Mitchell. Can you wrap it up a little?
Mr. Kenney. Yes, sir. Basically, both individuals, same
disability, one granted, one not granted.
And the same thing applies. There was no difference between
the two of them. And the second example I had for you was with
reference to hearing. Two veterans, same problem, hearing loss.
Same type of service, same type of exposure. Veteran from New
Jersey was approved, veteran from Florida was denied. That case
is now on appeal.
It is apparent to me that the VSR is--the human element
played a significant role in all these claims. How to remove
that factor from the process, I don't know. Continued training
is probably the best bet in reducing this factor in the claims
process.
We look at the populations of Texas, Florida, and
California. You can see the populations run from three million
down to one million, with Florida having the second largest
number of veterans and the oldest veteran population, but we
only have one regional office. California has three, Texas has
two. That is another problem.
I submit that the VA should look at it
basically like they did with the Capital Asset Realignment for
Enhanced Services (CARES) Commission. Look at the States, think
of possible realignment, additional regional offices, and
standardize the training if it is not standardized.
Mr. Chairman, thank you for your time. And again I
appreciate the opportunity to come here before this Committee
and your patient listening.
[The prepared statement of Mr. Kenney appears on p. 37.]
Mr. Mitchell. Thank you. Mr. Pryor, you have 5 minutes.
STATEMENT OF RAY PRYOR
Mr. Pryor. Thank you, Mr. Chairman and Members of the
Subcommittee. And a special thank you to Congressman Space for
inviting me over. Thank you for providing AMVETS the
opportunity to testify regarding the issue of disability claims
ratings and benefits disparities within the Veterans
Benefits Administration. This hearing is very important, as
it addresses an issue that continues to plague our Veterans
Benefits Administration and leaves veterans frustrated and
suspicious of the system that is in place to support them after
their service to our great Nation.
In examining the factors that have led to the disparities
in claims ratings, two large overlying conditions are present
that have allowed the gaps in ratings to exist, and several
circumstances have occurred which have exacerbated the problem.
First and foremost, we are working with a system based
on humans making decisions. Their perceptions, understanding of
conditions, and occasional mistakes are going to play a role in
disparities. If this were the only issue, then the disparities
would not be regionally based; they would be proportionately
distributed throughout VBA.
However, there is evidence that displays disparities
between regional offices. AMVETS believes these disparities are
caused by two separate, but related, groups within the claims
process: the Veterans Service Representatives, the Rating
Veterans Service Representatives, and the Decision Review
Officers (DROs) on the rating side, and the compensation and
pension (C&P) doctors whose evaluation of a veteran is used
by the regional office to decide a claim.
The reason these two groups have a great influence on the
outcome of veterans' claims, and why there are regional
disparities, is the personalities of the doctors, the raters,
and the review officers, and the personality of the regional
office in general. These regional personalities develop because
new raters and DROs are trained by the regional office, and they
develop the regional personality in the styles of common terms
and language used by the raters when filing a claim.
Terminology such as ``full range of motion'' compared to
``essentially full range of motion'' could change a rating by 10
percent.
Likewise, physicians' perceptions and similar language usage
can alter a claim. Veterans Service Officers will state they
routinely see compensation and pension exams which describe
the patient with cookie-cutter language, leaving room
for subjective interpretation.
In addition to these personalities, which determine
compensation on similar if not identical claims with a broad
range of outcomes, there is the backlog of the claims themselves
within VBA and the performance credit system that monitors the
number of claims filed by the raters and DROs.
Currently there is no oversight of the quality of work the
DROs perform. As identified by the AMVETS-sponsored ``National
Symposium for the Needs of Young Veterans,'' DROs are evaluated
on the number of claims they submit, but not necessarily on
whether the claims they submit are good claims that have awards
given to them or are denied or lowered.
The backlog has increased the challenge posed by the number of
claims that are overturned and remanded. When they are
overturned and remanded, they come right back into the system
through appeals. AMVETS suggests three recommendations which
will assist in narrowing the disparities in claims and reducing
the backlog.
First, a centralized training facility that will be tasked
with teaching new raters and DROs a standardized, outlined
process for filing and reviewing claims. This will remove much
of the regional personality that affects the disparity in the
claims as they are.
Secondly, there is a need to improve the oversight of both
the raters and reviewers and the C&P doctors. In regard to the
C&P doctors, oversight should be put in place to ensure the
examiner's guide is being utilized. This could be done through
a whistleblower program, which would allow a veteran to make an
appeal or make a report on a C&P exam that went wrong.
A system needs to be developed that will not
only ensure claims are being filed, but that the claims are being
filed properly and completely. H.R. 3047 makes efforts to
improve the work credit system under which the DROs and rating
veterans service representatives (RVSRs) currently work. This
system, or a system that monitors the ratio of cases remanded or
overturned to the total number of cases referred, is essential
to improving the claims process.
Lastly, understanding this is a 2- or 3-year process,
hiring more staff to reduce the burden of the backlog is
critical. There is no single, simple solution to the disparity
problem, but identifying the roots of the problem and tasking
VA with finding solutions to these problems is critical if
improvements are going to be realized in the claims system.
Mr. Chairman, this concludes my testimony.
[The prepared statement of Mr. Pryor appears on p. 39.]
Mr. Mitchell. Thank you. We will now open up for questions.
And I have a question, first of all, to Mr. Pryor.
Mr. Pryor. Yes.
Mr. Mitchell. You stated that the current disability rating
disparities leave veterans frustrated and suspicious?
Mr. Pryor. Absolutely.
Mr. Mitchell. I think this is perfectly understandable and
justified.
Mr. Pryor. Right.
Mr. Mitchell. In your opinion, and maybe you gave it in
your last three recommendations, what can we do in Congress,
short of a complete overhaul, to restore confidence in
disability ratings?
Mr. Pryor. We need to give the VA system the support at the
Congressional level: full funding, money to hire new staff
people, staff people to help decrease the backlog, bringing extra
people in to work on those claims. And do exactly as we talked
about: support the initiative to make a standardized training
system throughout the VA system where all are trained the same
nationwide to support the veterans that are out there.
Mr. Mitchell. And this question is to either Mr. Kenney or
Mr. Pryor. We are all aware that the disability claims backlog
is embarrassingly long. This is due in large part to inadequate
data systems. Pressure is being placed on decision review
officers to meet quota standards in order to address this
backlog. It seems to me that this pressure is pushing
complicated cases to the back burner when they should be
receiving extra attention. What should be done in Congress or
the VA to ensure that we put an end to this practice?
Mr. Pryor. Well, again, we need to definitely make sure that
the raters and adjudicators and the people reviewing those
claims are fully trained and have a standardized manual or
standardized process that they are using to make the decisions
on ratings.
Secondly, when they make poor decisions and they make a low
rating or a non-rating that goes back to the veteran, that is
going to cause the backlog. If they are making those types of
decisions when they should be rating a claim, that decision is
going to go back to the veteran, the veteran is going to file an
appeal, it is going to go back into the system, and it is going
to continually bog the system down even more.
So maybe we should have a review of claims that are denied
before they ever go back to the veteran; you know, that might be
an idea. But we need to have a standard process and
everybody rating from the same process.
Mr. Kenney. Mr. Chairman, if I may? I know in our regional
office one of the top priorities is the young men and the
young women that are catastrophically injured as a result of
our current conflict. And that has, with some of our older
veterans, given them the perception that they are being placed
on the back burner. And we assure them, you know, the VA went
about and established a Tiger Team in Cleveland to handle
the backlog of those veterans over 70. But until VA gets the
funding that they need to fully staff, it is just going to
continue.
And it is, I think, going to get worse because, I would say
and I am going to guess, about 50 percent of the
staff in the VA are about my age. About 2 years, 3 years from
now, sir, they are going to be thinking very seriously about
that cabana on the beach. And the VA is going to get hit with a
large loss of personnel. And I think now is the time the VA has
got to start thinking about those 3, 4 years down the road when
those people are going to be leaving.
And I would suggest something similar to the BOOTS to Teachers
Program. Why should we not have a BOOTS to VA program? Why not
reimplement Project Transition? Take military personnel that are
planning on leaving the service, either due to the expiration of
their enlistment or their retirement. Six years prior to that,
put them into a transitional program and put them in a VA
office. Have the VA offer them employment. Put them into a
project transition. And then at their retirement or their
release from active duty, these individuals will be 6 months
ahead of everybody else. And they are coming off the line. They
know what these troops have been going through because they are
the troops. And I think that would greatly improve things.
Mr. Mitchell. Thank you. Ms. Brown-Waite?
Ms. Brown-Waite. Thank you, Mr. Chairman. That is an
excellent suggestion, J.J. And as you were talking about, you
know, people looking forward to the cabana on the beach, it
made me think that you know even though right now there is a
slow down in the housing market, people are not going to retire
to, with all due respect, Alaska. They are going to be coming
to Florida. And Florida will have even more veterans than what
they----
Mr. Mitchell. And Arizona.
Ms. Brown-Waite [continuing]. And Arizona. Even more
veterans than what they do now. Right now we exceed Texas in
the number of veterans that Florida is caring for and yet Texas
has two regional offices. Tell me, if another regional
office were to be placed in Florida, what do you think the
outcome would be? Would it be more timely decisions? Tell me
what your expectation would be if another regional office could
be placed in Florida.
Mr. Kenney. If we had another regional office in the State
of Florida, I have no doubt that the claims process would be
expedited. I think the last thing I saw was we had like 25,
26,000 claims in the inventory at St. Petersburg. So you split
that, you have 13,000 at each regional office. If we staff up the
second regional office with experienced raters, plus a
contingency of newly assigned or new raters and DROs, I think it
can't help but improve the system.
Not only do we have the State of Florida, they are
handling Puerto Rico----
Ms. Brown-Waite. And Georgia.
Mr. Kenney [continuing]. And the U.S. Virgin Islands. So
they are definitely, our regional office is overwhelmed.
Ms. Brown-Waite. Mr. Pryor, let me ask you a question. The
IDA report that we will be hearing about in a later panel
focuses on six recommendations for consideration by the VA.
They are to standardize initial and ongoing training for
rating specialists, to standardize the medical evaluation
reporting process, to increase oversight and review of
rating decisions, to consolidate rating activities into a
central location, to develop and implement metrics to
monitor consistency in adjudication results, and to expand and
improve data collection and retention.
I know that you assist veterans in their claims processing.
If only three of these recommendations could be implemented,
which three do you think should be at the top of the list?
Mr. Pryor. Again, standardizing the training and the
process for all of the adjudicators and the people that are
reviewing the claims at the regional level, I think, definitely
should take place. I think that should be our number one
priority.
The claims themselves, and the development of the claims
process once a claim reaches the VA system, the regional office,
should be, I think, looked at very heavily. I would, you know,
possibly suggest setting up a pre-screening of a claim before it
ever goes to an adjudicator to make sure everything is there, so
that when it does go to the adjudicator, the person reviewing
that claim can make good decisions.
So standardizing and maybe reorganizing or revamping that
claims process and what is happening with that claim once it
gets to the regional office would be the second issue. I really
believe that.
The third issue to me, which is very important, is the staffing
issue, and to AMVETS it is very important. I don't think you
can do any of those things unless you staff appropriately and
get that backlog out of there, taking care of that backlog.
Ms. Brown-Waite. As you know, there are 1,500 that I
believe were in last year's appropriations. Obviously,
there is a training process that takes place there. There are
many who believe we certainly could at least double, maybe
triple, that number to work on that backlog.
I appreciate your comments, sir. Thank you very much.
Mr. Pryor. Thank you.
Ms. Brown-Waite. J.J., while the yellow light is on, did
you want to add anything?
Mr. Kenney. I think he pretty well covered it. We--I know
the VA has the Veterans Claims Assistance Act. They have a duty
to assist now. We in the field that sit across the desk from
the veteran, it is our responsibility. And in my office I try
not to forward any claim that is not ready to rate.
Ms. Brown-Waite. Thank you very much. And I yield back, Mr.
Chairman.
Mr. Mitchell. Thank you. Mr. Space?
Mr. Space. Thank you, Mr. Chairman. Mr. Pryor, I wanted to
just inquire, if we could, in maybe more real-world terms about
the issue of personalities, whether it be of the physicians who
are doing compensation and pension exams, or whether it be of
the raters that are making their determinations based upon, in
large part anyway, those exams.
Can you give us an example, perhaps, of how that term
``personality'' that you referred to applies in real-world
terms to what we are talking about here?
Mr. Pryor. Well, you mentioned New Mexico, and just last week
I was working with a veteran there in southern Ohio who had a
claim for PTSD. And the veteran was awarded, I think, 20 percent
service connection on PTSD and was in the process of filing an
appeal. And I don't know who he talked with at the VA system,
but, you know, he was told, ``If you want 100 percent,
go to New Mexico,'' because there was a C&P doctor down there
that was a wartime veteran that reviews claims. And anybody
that was in combat and saw battle was automatically recommended
100 percent disabled.
That is where the personalities, past experiences, a
person's life--that human factor gets involved. And we are
never, I guess, we will never get totally away from that, but
if we try to standardize and provide that person with a standard
formula that he has to go by, or they have to go by, then I
think we are going to have more standard awards across the
Nation, State by State.
Mr. Space. Right. And I mean are you aware of any
reputation that any particular facilities in your region
perhaps may have from a personality standpoint that may affect
the amount of awards that are rated?
Mr. Pryor. I am, you know--I think each facility--first of
all, I want to say that at every VA facility that I have ever
worked with, the people have been great people. But if a VA
facility is short staffed and does not have the staffing level
to give good in-depth service, and the people, the doctors, are
spread so thin that they are dealing with thousands
and thousands of people, then that is going to have an effect
on their decision making and how much care they are going to
take on a claim, how much care they are going to take with one
person. And you may have one hospital, for instance the
hospital there in Chillicothe, which is a very fine hospital,
but they may be short staffed in the psychoanalysis
area, and that is an area that is going to suffer in that
hospital.
Mr. Space. Thank you, Mr. Pryor. Again, thank you for
coming to Washington for this hearing. And I yield back.
Mr. Mitchell. Thank you.
Mr. Walz.
Mr. Walz. I have no further questions, Mr. Chairman. I
yield back.
Mr. Mitchell. Thank you. Any further questions? Well thank
you very much. We appreciate you coming today.
Before we get to the second panel, let me just say that we
are about to take some votes and the votes will be about 30
minutes. So if the next panel would come up we can get started
anyway. And once the buzzer rings we will recess until we have
the votes.
And I welcome panel two to the witness table. Dr. David
Hunter is a Research Staff Member at the Institute for Defense
Analyses and the Project Leader for IDA's recent report on
disparities.
Mr. Jon A. Wooditch is the Deputy Inspector General in the
VA's Office of Inspector General and an original author of the
OIG's report from 2005.
Their insight and experience on this issue are welcomed. Mr.
Hunter, you have 5 minutes to make your presentation.
STATEMENTS OF DAVID E. HUNTER, PH.D., RESEARCH STAFF MEMBER,
COST ANALYSIS AND RESEARCH DIVISION, INSTITUTE FOR DEFENSE
ANALYSES; AND JON A. WOODITCH, DEPUTY INSPECTOR GENERAL, OFFICE
OF INSPECTOR GENERAL, U.S. DEPARTMENT OF VETERANS AFFAIRS;
ACCOMPANIED BY JOSEPH M. VALLOWE, DEPUTY ASSISTANT INSPECTOR
GENERAL, MANAGEMENT AND ADMINISTRATION, OFFICE OF INSPECTOR
GENERAL, U.S. DEPARTMENT OF VETERANS AFFAIRS
STATEMENT OF DAVID E. HUNTER, PH.D.
Mr. Hunter. Mr. Chairman and Members of the Subcommittee, I
am pleased to come before you today to discuss IDA's work on
disability compensation conducted for the Department of Veterans
Affairs.
In May 2005, the VA asked the Institute for Defense
Analyses to conduct a study of the major sources of the
observed variations across States in, one, the average
payments to veterans receiving disability compensation, and,
two, the percent of veterans receiving disability compensation.
My testimony today will be based on the results of that
study which have been documented in IDA paper P4175. For the
first question, the variation in average payments across
States, we found that the average award in a State is almost
entirely driven by the proportion of recipients who are
receiving maximum awards. Among the maximum awards, we found that
awards of individual unemployability or IU exhibited the
greatest variability across States.
Our study quantified the amount of variation attributable
to States having veteran populations with different
characteristics. We found that State-to-State differences in
compensation recipients account for 50 to 70 percent of the
observed variation across States in average awards.
The major factors we identified that contribute to the
observed variation across States are Post Traumatic Stress
Disorder or PTSD, power of attorney representation, and period
of service of the veteran.
For the second question, the variation in the percent of
veterans receiving compensation, we found that application
rates appear to be the key driver of the variation. In
addition, we found that military retirees are over four times
as likely to be receiving compensation as non-retirees. And
this alone accounts for over 40 percent of the variation across
States.
Based on our findings and observations we made six
recommendations for consideration by the VA. I should mention
we examined the process by which VA adjudicates claims and
found that the process has potential for producing persistent
regional differences in rating results.
Our recommendations were, one, to standardize initial and
ongoing training for rating specialists. Two, to standardize the
medical evaluation reporting process. Three, to increase
oversight and review of rating decisions. Four, to consolidate
rating activities to a central location or to fewer locations.
Five, to develop and implement metrics to monitor consistency in
adjudication results. And, six, to improve and expand data
collection and retention.
Now these recommendations aim to improve the consistency of
the adjudication process.
Mr. Chairman and Members of the Subcommittee, that
concludes my remarks. I have provided a more extensive
statement that I ask be included in the record. And I am
available for questions.
[The prepared statement of Dr. Hunter appears on p. 40.]
Mr. Mitchell. Thank you, Dr. Hunter, and we do have that
statement. It will be included.
Mr. Hunter. Thank you, sir.
Mr. Mitchell. Mr. Wooditch?
STATEMENT OF JON A. WOODITCH
Mr. Wooditch. Thank you, Mr. Chairman and Members of the
Subcommittee.
I am pleased to be here today to discuss the OIG's report
on State Variances in VA Disability Compensation Payments
issued in May 2005. With me is Joe Vallowe, Deputy Assistant
Inspector General for Management and Administration, who is
responsible for tracking implementation of OIG report
recommendations.
Variances in average annual disability compensation
payments have existed for decades. Our report indicated that
the variance between the high and low State in fiscal year 2004
was $5,043. To understand why this variance existed we
identified and assessed more than 20 possible factors.
We discovered that some factors contributing to the
variance were not within VBA's control. As such, we concluded
that some level of variance is expected. We also discovered
that some factors are within VBA's control, especially
disability rating decisions, where much of the information
needed to make these decisions is subject to varying degrees of
interpretation and judgment.
This occurs with both veterans when providing information
about their medical condition and VBA claims adjudicators when
assessing it for rating purposes.
Rating decisions can also be influenced by how medical
examination results are presented, by the amount of training
and rater experience, and by the Rating Schedule itself.
This subjectivity results in inconsistent ratings for
similar conditions, which can influence variances in payments
among States. As a result, the issue is not whether a variance
exists, but whether the magnitude of the variance is
acceptable.
Our 2005 report includes eight recommendations aimed at
improving consistency of ratings. In particular, we recommended
that VBA conduct a study of compensation payments in order to
develop data and metrics for monitoring and managing variances.
The December 2006 Institute for Defense Analyses report
conducted as a result of this recommendation confirmed our
findings and made meaningful recommendations to assist VBA in
understanding and reducing unacceptable variances.
In preparation for this hearing, we obtained compensation
payment data by State for fiscal years 2005 and 2006. Our
review of this data revealed that while the national variance
continues to increase, it is doing so at a much lower rate than
in previous years.
We also discovered that one reason for this decline can be
attributed to more consistent ratings for new claims. In fact,
the national variance for new claims has declined from $6,054
in 2004 to $4,477 in 2006.
While some progress has been made, VBA remains challenged
to improve consistency of ratings. To accomplish this, we
believe further efforts are needed in monitoring and measuring
variations in ratings by State and VBA regional offices.
Unacceptable variations should be thoroughly evaluated to
include in-depth reviews of individual claims that deviate from
expected norms. Information obtained from these reviews should
be used to improve rating consistency nationwide.
This approach is consistent with IDA's recommendations and
with VBA's own Consistency Analysis Study Group, which provided
a plan to analyze and rectify inconsistencies in disability
evaluations by looking at individual claims.
In response to our 2007 Major Management Challenges, VBA
stated that it plans to begin quarterly monitoring of rating
decisions by diagnostic code and expand its quality assurance
program, known as STAR, to accomplish these reviews during
fiscal year 2008.
In closing, we believe implementation of the Study Group
plan and IDA's recommendations will greatly assist VA in
improving the consistency of rating decisions. We also believe
that expansion of the responsibilities and staff of the STAR
Program will be very important to achieving this goal.
Mr. Chairman, that concludes my remarks. I thank you once
again for the opportunity to discuss this very important issue.
Mr. Vallowe and I will be pleased to answer any questions.
[The prepared statement of Mr. Wooditch appears on p.
42.]
Mr. Mitchell. Thank you. We hate to inconvenience you, but
we will be back in about a half hour. Thank you.
At this time, the meeting is recessed for about 30 minutes.
[Recess.]
Mr. Mitchell. We will reconvene now with this hearing.
Because of the little break, I hope the questions I ask both of
you will not be redundant from what you said in your statement.
I want you to know that both your full statements have been
included in the record.
First, Mr. Wooditch, you know the VA Inspector General has
weighed in on this issue in the past. And the Department
responded with minimal action. Having seen the report and
recommendations by the Institute for Defense Analyses, do you
think the VA has the resources to react more aggressively now?
And if not, what would be immediately necessary to remedy
this problem?
Mr. Wooditch. As I said in my statement, I really think it
is important to do individual case reviews of claims that show
wide variances in ratings nationwide. I think VBA's STAR
program, their quality assurance program, is the ideal
mechanism to help make that happen.
I believe that program currently is underfunded in terms of
resources, and I think that more quality assurance folks need
to be put into it. But given the magnitude of the problem, VBA
has a very, very difficult challenge. They process something
like 1.7 million claims a year. They have a backlog that
everybody is aware of. They have a very difficult balancing act
on determining do we put resources into processing claims or do
we put resources into quality assurance?
So I think they need more resources in both areas to make
it happen.
Mr. Mitchell. Your analysis used various data sources and
advanced statistical procedures to reach your conclusions. I do
not think we should have to commission such an in-depth audit
every time we want some information on improving veterans
disability claims.
What improvements, and I think you did suggest those, would
you make to the current system to ensure that Congress and the
VA always have the best disability claims data at the ready?
Mr. Hunter. Sir, I think we recommended in our report that
one of the pressing issues is that VA needs to improve its data
collection and retention. One of the struggles we had in our
study was getting the data in a proper form for us to do our
analysis. Doing something quicker, where you could have access
to metrics to examine variations more quickly, could only be
done, I think, if the data that is now being collected by VA
were available for analysis.
So VA has this data in hand, but typically uses it to pay
veterans and does not keep it for analysis. And that would need
to be done, I believe.
Mr. Mitchell. Mr. Space?
Mr. Space. Thanks, Mr. Chairman. Mr. Hunter, I listened to
your testimony and I have not had a chance to review your
report or your statement. But I just want to clarify a couple
of things.
Your testimony discussed some issues that I believe you
indicated were part of the reason anyway that we see variations
that were not attributable to actions or inaction of the VA. Is
that correct?
Mr. Hunter. Yes, sir. That is correct.
Mr. Space. I want to make clear, however, that you are in
fact acknowledging that there are variations that perhaps are
systemic within the VA that may be contributing to the extent
of those variances?
Mr. Hunter. Yes. Our findings were that between 50 and 70
percent could be attributable to different characteristics in
the veteran and recipient population across States. But the
current process, as it has been set up, has the potential for
producing regional differences. And the 30 to 50 percent
remaining that could not be explained could be due to these
regional differences in adjudication results.
Mr. Space. And so, 30 to 50 percent of those--we all
understand there are going to be variations. If you had a
perfect system there would be variations. But your feeling is
that 30 to 50 percent are attributable to deficiencies within
the VA?
Mr. Hunter. Potential inconsistencies. I mean, we couldn't
find any other explanation in the data we looked at.
Mr. Space. All right. And do you--I believe your statement
references attorney representation as a variable. I want to
talk a little bit about that with the limited time that I have.
First of all, with respect to attorney representation, why
would that be listed as a variable?
Mr. Hunter. Well, it turned out that if you looked at
veterans with power of attorney representation, they received,
I think, about three times the average award, a little over
11,000, versus veterans without power of attorney
representation.
Mr. Space. All right. And----
Mr. Hunter. There was a huge disparity between those two
groups.
Mr. Space. Okay. And I understand that. I am curious as to
why that would be listed as a variable in a study as to the
reasons for discrepancies on a State-by-State basis.
Mr. Hunter. I mean, one of the things we tried to get at in
our study was whether that was the reason for the differences,
so that we would know what corrective actions to recommend, if
the differences were just that certain States had more access
to power of attorney than others.
What we found, however, was that veterans with power of
attorney had done substantially better than veterans without
power of attorney, but that across States it didn't explain as
much of the variation. We quantified that only 15 percent of
the variation was due to power of attorney differences alone
across States.
Mr. Space. So there are significant discrepancies between
States in terms of the number of veterans who seek legal
counsel?
Mr. Hunter. Of the percent of recipients who have a power
of attorney listed on their claim. Correct. There is variation
across States.
Mr. Space. Now is there a difference between having a power
of attorney listed and seeking legal counsel, or is it
essentially the same thing?
Mr. Hunter. Yes. I think powers of attorney aren't
necessarily lawyers. I mean, they could be CVSOs or VSOs. I
mean, there could be different levels of training. But we
categorized whether you had a sponsor on your claim, whether it
was from AMVETS, DAV, or legal counsel, versus whether you
submitted your claim yourself.
Mr. Space. All right. Now do you have any ideas as to the
reasons for that discrepancy between those who have power of
attorney and those who don't?
Mr. Hunter. Yes. We looked into that a little bit and we
found three reasons. I mean, the first one was that if you had
a power of attorney you had slightly more issues per claim. So
a power of attorney would advise the veteran to submit not only
the claim they came in for, but other things they may qualify
for of which the veteran may not have been aware.
Mr. Space. Uh huh.
Mr. Hunter. The second one is that the veteran with the
power of attorney had a slightly higher average degree of
disability than a veteran without a power of attorney. So the
hypothesis there, which proved to be borne out in the data, was
that the power of attorney would know what forms to fill out
and be able to more adequately explain the veteran's injury.
But the number one reason by far for the improvement was
that a veteran with a power of attorney was far more likely to
qualify for individual unemployability. Twelve percent of
veterans with a power of attorney wound up with IU; without a
power of attorney, only 1.7 percent wound up with IU.
Mr. Space. Is it safe to assume from that that a veteran is
more likely to be able to obtain the assistance of counsel when
he has got a stronger case?
I mean, I am trying to figure out the reasons for this. It
seems to me that as far as legal counsel goes, the likelihood
of obtaining or being able to retain counsel improves with the
quality of one's argument for disability.
As a former lawyer, I can tell you that is true. I also--
Mr. Chairman, may I have an additional 2 minutes?
Mr. Mitchell. Yes.
Mr. Space. Thank you. I am also concerned that the presence
of legal counsel in and of itself may have an impact on hearing
officers or raters. Did your studies determine whether or not
there was any influence that the mere presence of counsel may
have had on the outcomes?
Mr. Hunter. Well, I mean, I guess all we could look at was
the data at the end. And we certainly found veterans with a
power of attorney had a higher average award than veterans
without a power of attorney. But the hypotheses we got from
raters were that the claims were better developed, or that the
power of attorney assisted the veteran in filling out all the
necessary paperwork for an IU claim or advised them of their
legal rights.
All the raters we interviewed suggested that the mere
presence of a power of attorney did not sway their opinion one
way or the other.
Mr. Space. Okay. And one final question. In response to
your findings, it seems to me that we have one of two courses
we can pursue. One is to encourage veterans to obtain counsel;
the second is to streamline the system and make it more
navigable for those who aren't trained to deal with it.
Have you considered recommendations in terms of either one
or both of those?
Mr. Hunter. I would say the recommendations that we
presented addressed more what can be done to improve
consistency across all States.
We didn't necessarily address the benefits of streamlining.
It seems like it would be an excellent idea for many of the
other problems that VA faces, but I think it was something a
little out of the scope of our analysis, which was just to
address consistency in adjudicating claims.
Mr. Space. Okay. Thank you, Mr. Hunter.
Mr. Mitchell. Excuse me just a second. I would like to
follow up on a question.
You mentioned that with an attorney or a power of attorney,
many times they would file more than just one claim, I mean,
all the potential claims. For example, in the first panel
Mr. Kenney pointed out that two people had hepatitis C. One
said it was from an air gun, and he was granted disability. The
other thought it was from a blood transfusion, and it was
denied. When he went back and filed again, it was the same type
of air gun exposure that caused it.
So you are saying probably with a power of attorney they
would take into account all the potential reasons for the
disability----
Mr. Hunter. Yes.
Mr. Mitchell [continuing]. At one time.
Mr. Hunter. That certainly is true, sir. And they would
recommend other disabilities that you may be presumptive for
and ask if you also have any symptoms for those conditions as
well. Where a veteran might just come in for his knee or his
back, which was his primary injury and wouldn't know about
ringing in the ear or other presumptive conditions that he may
qualify for.
Mr. Mitchell. Thank you. Ms. Brown-Waite.
Ms. Brown-Waite. Thank you. I don't know how we take the
human factor out of it, with the prejudices that people bring
to a job, but in your testimony you discuss how the systematic
technical accuracy review that is used by the VA doesn't track
consistency. If we can't take prejudices out of it, shouldn't
we at least be tracking consistency?
And I say prejudices; I mean not only prejudices but also
sloppiness, and just those things that occur in the workplace
that certainly shouldn't be occurring when you are dealing with
veterans benefits, but it is out there.
If we started over to create a performance and quality
assurance program that would include consistency and accuracy,
what should this program look like?
Mr. Hunter. I think our recommendation was to create
metrics to improve consistency, sort of a two-step approach.
The first is that, due to the large volume of claims the VA
processes, it is unrealistic to do a large sampling just from
the bottom up, picking them off the pile. So we suggested
metrics where you could track the data to see if there are any
red flags that pop up on some of your more variable issues,
such as post traumatic stress disorder awards, awards for
individual unemployability, or grants or denials of service
connection. And that will point you to areas of possible
discrepancies, at which point you can do some detailed reviews
of the claim files to make sure that the claims are being
adjudicated consistently across States and correctly according
to VA guidelines. Neither of which is really being done or has
been done previously.
Ms. Brown-Waite. Do you believe that having an independent
agency review and certify the VBA training procedure would
improve standardization?
Mr. Hunter. I don't know if we addressed in our report
anything about having an independent assessment or what the
best way to do it would be. But we certainly stressed the
importance of making sure that everyone did receive the same
training.
The mechanism by which they received the training we really
didn't look too much into, but when raters in one regional
office are getting different training from raters in a
different regional office, you certainly have the potential for
inconsistencies.
Ms. Brown-Waite. Is it an echo process, then, of ``this is
the way we did it,'' so that is what gets fed back to the
raters?
In other words, the trainers say that is the way that they
did it. They are in a region, and so they echo this back to the
people doing the rating.
Mr. Hunter. Yeah. And certainly a lot of the raters we
talked to said that on-the-job training from the more
experienced people was their number one way of learning. And so
they would typically pick up the rating style and be judged
correct if they rated a case the same way as whoever was the
second signature or the more experienced raters at the regional
office.
Ms. Brown-Waite. Did you at all look at best practices that
perhaps could be emulated elsewhere?
Mr. Hunter. Yeah. I don't think we compared, for instance,
the VA practice to claims processing or other similar
activities outside of the Department of Veterans Affairs, if
that is what you are asking.
Ms. Brown-Waite. No. I mean in the Department of Veterans
Affairs, a really consistent, good regional office versus maybe
one that isn't. I mean, you pointed out the disparity in the
awards that are given. Did you also find several that really
did a superb job where the error rate was very low?
Mr. Hunter. Yeah. No. I think we have consciously avoided
any declarations of good regional offices versus bad regional
offices or correct versus incorrect. We made no judgment if one
was too high or too low. Our tasking was to find why they were
different. And so we tried to identify why one regional office
was giving different results from another regional office.
Ms. Brown-Waite. Okay. I yield back the balance of my time.
Mr. Mitchell. Thank you. Are there any other questions of
the panel?
Thank you very much. And thank you for sticking around
during the vote. I appreciate that.
And as the third panel comes up, I just want you to know
that in about 45 minutes to an hour we will be called for
another one. So hopefully we can conclude this hearing by then.
I would like to welcome to the table Mr. Ron Aument. He is
Deputy Under Secretary for Benefits for the VA and the most
senior civil servant at the VBA. I appreciate you coming, Mr.
Aument, and also want to thank you for your commitment to
helping our Nation's veterans.
Would you please also introduce your team with you?
STATEMENT OF RONALD R. AUMENT, DEPUTY UNDER SECRETARY FOR
BENEFITS, VETERANS BENEFITS ADMINISTRATION, U.S. DEPARTMENT OF
VETERANS AFFAIRS; ACCOMPANIED BY BRADLEY G. MAYES, DIRECTOR,
COMPENSATION AND PENSION SERVICE, VETERANS BENEFITS
ADMINISTRATION, U.S. DEPARTMENT OF VETERANS AFFAIRS
Mr. Aument. I certainly will, Mr. Chairman. I am pleased
today to be accompanied by Mr. Bradley Mayes who is the
Director of our Compensation and Pension Service, which is the
Program Office responsible for the Disability Compensation
Program and developing policy and procedures that are to be
applied uniformly throughout the system.
Mr. Mitchell. Thank you. And you have 5 minutes. You can
submit your full statement to the panel afterward. Thank you.
Mr. Aument. Mr. Chairman and Members of the Subcommittee, I
am pleased to be here today to discuss the Veterans Benefits
Administration's response to the Institute for Defense Analyses
report on the analysis of differences in disability compensation
in the Department of Veterans Affairs.
Today I will discuss the various initiatives underway
within VBA that support the recommendations put forth by IDA to
improve the quality and consistency of disability claims
processing.
I will respond to each recommendation in turn and discuss
how VBA is working to achieve the intended outcome of that
recommendation.
First, standardize the initial and ongoing training for
rating specialists. VBA continues to enhance and expand
training investments to ensure accurate and consistent decision
making. New hires receive comprehensive training and a
consistent foundation in claims processing principles through a
national centralized training program and a standardized
training curriculum used by all regional offices. Standardized
computer-based tools have been developed for use by all
decisionmakers.
We have established a program of advanced development
training for post traumatic stress disorder claims, and a
mandatory cycle of training has been implemented for all
employees involved in claims processing. VBA already has in
place a skills certification process for veteran service
representatives and we are developing a skills certification
process for rating specialists. Additionally, we have increased
our systematic technical accuracy review, STAR, staff and
tasked it with more oversight visits to our regional offices
and greater responsibilities for training our decision makers.
We have also made significant progress in our efforts to
standardize the medical evaluation process. VA's Compensation
and Pension Examination Program, CPEP, has been very successful
in improving the examination process through the use of
templates, quality reports, and examiner certification. Our
CPEP initiatives are instrumental to achievement of our quality
goals. VBA and the Veterans Health Administration continue to
work together to develop and refine tools that will ensure
greater consistency.
VBA has established an aggressive and comprehensive program
of quality assurance and oversight to assess compliance with
claims processing policy and procedures and assure their
consistent application. We are increasing our STAR Program staffing and
the sample size of their reviews. We have enhanced the STAR
database to better use the information collected in reporting
reviews. And we are also increasing on-site training, site
visit participation, and use of results from STAR reports to
clarify procedures and better focus training.
The consolidation of specialized processing operations for
certain types of claims has been implemented to provide better
and more consistent decisions. Some of our efforts include the
establishment of three Pension Maintenance Centers, the Tiger
Team, the Appeals Management Center, and the Casualty
Assistance Unit. VBA also established two Development Centers
in Phoenix and Roanoke and centralized the processing of all
radiation claims to the Jackson regional office.
The Benefits Delivery at Discharge Program provides
servicemembers with briefings on VA benefits, assistance in
completing applications, and a disability examination before
leaving military service. The goal of this program is to
deliver benefits within 60 days following discharge. VBA has
consolidated the rating aspects of this program to two rating
sites, which is bringing greater consistency of decisions on
claims filed by newly separating veterans.
We continue to look for ways to achieve additional
organizational efficiencies through consolidation of other
aspects of our claims processing, including death benefits,
fiduciary activities, and telephone services. In addition to
conducting quality reviews, C&P's STAR staff are beginning to
conduct analyses to identify unusual patterns of variance in
claims adjudication by diagnostic code, and then review
selected disabilities to assess the level of decision
consistency among and between regional offices.
These studies are used to identify where additional
guidance and training are needed to improve consistency and
accuracy as well as to drive procedural or regulatory changes.
The VBA's data management systems have been substantially
improved in recent years with such programs as the VETSNET
suite of applications and the establishment of our data
warehouse.
VETSNET and the analytical tools in our data warehouse
provide our employees and managers with more robust data which
better support information management and analyses.
Mr. Chairman, this concludes my testimony. And I will be
pleased to answer any questions you or other Members of the
Subcommittee may have.
[The prepared statement of Mr. Aument appears on p. 44.]
Mr. Mitchell. Thank you. I do have one. You know this has
been going on for a while. In 2002, there were inconsistencies
brought up by the Government Accountability Office. And they
issued another report in 2003 and a third report in 2004. And
then it was followed by an Inspector General investigation in
2005. And from what we have heard over the course of this
session of Congress, the lines and the wait periods are also
getting longer, not shorter.
I want to know, how long is it going to take? And what are
you doing to address these particular reports? It has been
2002, 2003, 2005 and now the most current one. What is stopping
the VA from implementing the systems that have been
recommended? And how long do you think it is going to take
before we get this under control?
Mr. Aument. Well I think there is more than a single
question there.
Mr. Mitchell. Right.
Mr. Aument [continuing]. Mr. Chairman. If I can start with
the GAO reports on consistency. One of the challenges, I
believe that Dr. Hunter mentioned, had to do with our data on
this. And the lack of robust and adequate data to help us----
Mr. Mitchell. And let me ask you, who compiles this data?
Mr. Aument. We do. It is collected as a subsidiary of the
claims adjudication process.
Mr. Mitchell. And evidently that is part of the problem----
Mr. Aument. Indeed it is, sir.
Mr. Mitchell [continuing]. Your data that you collected and
that you have.
Mr. Aument. That is correct, sir. And, as Dr. Hunter
mentioned, one of the things that they found challenging in the
course of conducting their analysis was that our payment
system, our old legacy payment system, is just that. It is a
payment system. And it did not collect and retain as much
administrative data as was needed to conduct very thorough
analyses.
Much of that has changed starting in 2005. Looking forward
from 2005 we have a much more robust data set available to us
to conduct those very analyses that will lead us toward areas
we should be examining more closely for inconsistencies.
Up until then, our STAR Program had always focused on the
accuracy of decisions. And we had been tied philosophically to
the notion that if we became more accurate in our decision
making, consistency would indeed follow. But as GAO and many
others have told you, that is not necessarily true.
So we need to follow up on these analyses to take us where
the data leads us.
Mr. Mitchell. And what did you do after the 2002, 2003,
2005 reports? Did you do anything with these reports at all?
Mr. Aument. There were many things included in those
reports about what we should be doing to become more efficient,
as well as to introduce qualitative improvements. There are a
number of suggestions that they made about our STAR system for
measuring accuracy. Many of those recommendations have been
acted upon. But as for the consistency analysis aspect of it,
very little was done with those recommendations due in part to
the problems I just mentioned to you as far as having
sufficient administrative data to actually get our arms around
that.
Mr. Mitchell. And do you think by the next time we have a
report like this that these problems will be taken care of? How
long do you think it is going to take to follow up on these
recommendations?
Mr. Aument. I believe that we will have in place, before
this fiscal year is out, a much more robust quality assurance
program that includes consistency review capacity.
The STAR staff answers to Mr. Mayes and the Quality
Assurance Program is under his direct control. We are adding
staff to that program and we have armed them with some tools so
that they have already begun some of the preliminary steps in
conducting some of these consistency reviews.
For example, we have taken a look at some station outliers,
from our perspective, in the PTSD Program, trying to take a
look at what those findings will tell us about that program. He
is going to be looking at more and more of our diagnostic areas
and disability areas to try and find out if we have stations
that are outliers either with excessively high or excessively
low levels of service-connection ratings and what is going on
at those stations that is different from the Nation as a whole.
Mr. Mitchell. And the last follow up. Then there is nothing
stopping the VA from implementing the system? You got
everything in place and it should all be implemented?
Mr. Aument. That is correct, sir.
Mr. Mitchell. Thank you. Ms. Brown-Waite.
Ms. Brown-Waite. Mr. Aument, the system that you were
talking about for obtaining and having available all the data,
is that the BDN System?
Mr. Aument. The system that the Institute for Defense
Analyses had to turn to for the data used in their analysis was
the BDN System.
The system that I am speaking of today that has the more
robust data is part of the VETSNET suite of applications and it
is called RBA 2000. It also contains information and retains
information about those claims where we have denied benefits as
well as those where we have granted benefits.
Ms. Brown-Waite. Okay. Before we get to the benefit denial
or granting, there is a problem that I actually was involved in
with one of my constituents. He sent the necessary data in. He
sent it; they never received it. He sent it in a second time
and had a certified receipt requested. He got the certified
receipt back. Any time he called or my office called, the
response was the same: that they did not have the information.
I get involved in some of the more difficult constituent
issues. And I called and I got a very, very helpful man who
probably, if I give his name, will be fired. He told me,
``No, ma'am.'' I identified myself and I said, ``This is my
constituent.'' And I said, ``You have a privacy form there.''
He said, ``No, we don't have the information.'' He said, ``But
wait a minute, let me check another program. Let me go to
another screen for another program.'' He went to the second
screen. He said, ``No, ma'am, it is not there.'' He said, ``But
there is one more.'' There was a third program, and he finally
found the information.
Now if the veteran is calling in saying, ``Do you have my
information that I sent you? The documentation that you
needed?'' and he or she is getting a ``No, no, no'' answer
because the person answering the phone only goes to one of the
screens, then there is an initial problem there. And when I
went back, because when I told staff about this they asked me
when it was, I actually had the constituent's step file through
IQ printed out last night, and it was in the fall of 2005. It
was 2 years ago.
Has that problem been remedied?
Mr. Aument. There may be more than one problem there,
Congresswoman. One of the problems it sounds like you are
speaking about is an employee deficiency that I cannot
guarantee----
Ms. Brown-Waite. Are there, sir----
Mr. Aument. There is only one system in which that data
should be entered. That is correct. One----
Ms. Brown-Waite. Well, obviously, there are more than one
system----
Mr. Aument. Right.
Ms. Brown-Waite [continuing]. That somebody somewhere has
been using. And this man had the key to unlock it.
Mr. Aument. Yes.
Ms. Brown-Waite. Now the veteran was an elderly man who
kind of was losing his patience and his belief in our
government. He sent it in twice. We sent it in once. And the
answer he consistently was getting was no it wasn't there.
So you are telling me that there is now only one system----
Mr. Aument. That is correct.
Ms. Brown-Waite [continuing]. That this information would
be entered into?
Mr. Aument. That is correct.
Ms. Brown-Waite. And the name of that system is the RBA
2000?
Mr. Aument. No, it is not. It is called MAP-D, which is
part of the tools used by our veteran service representatives
in the process of developing claims. It should be entered into
that system.
Ms. Brown-Waite. Are there still legacy systems out there
that some people refuse to give up in the VA?
Mr. Aument. There is--they don't really have the option to
do that, Congresswoman. It is not discretionary on their part
as to whether or not they wish to retain an old system or a new
system.
Ms. Brown-Waite. Are there simultaneously different
programs in use? Is information also being captured in them, or
is it all in one?
Mr. Aument. The legacy system is still in place. Not
everything has been moved to the replacement system. So in some
cases not all offices are working off of the same system. Not
all of the functionality is in place; pension is still being
processed in the legacy systems.
Ms. Brown-Waite. So what you are telling me is that there
still may exist some additional system components out there
that this information is in?
Mr. Aument. If you are going back to your original
question, no there is not. There are not two systems in which
the information you described would be entered, there is one.
Ms. Brown-Waite. Well there were three. So you are now
saying there is only one?
Mr. Aument. That is what I am saying, Congresswoman, yes.
Ms. Brown-Waite. All right. I yield back.
Mr. Mitchell. Thank you. Mr. Space?
Mr. Space. Thanks, Mr. Chairman. Mr. Aument, this--the VA
contracted with IDA to do this analysis. And I would be
interested in knowing, if you know, the circumstances
surrounding that contract. Why was IDA chosen? Was it bid out?
What were the--what was the impetus to hire IDA as opposed to
someone else?
Mr. Aument. Well, first of all, the impetus was to respond
initially to one of the recommendations that the OIG made in
their May 2005 report----
Mr. Space. Right.
Mr. Aument [continuing]. Which was saying that we should
bring in some outside----
Mr. Space. Right.
Mr. Aument [continuing]. Entities.
Mr. Space. Right.
Mr. Aument. Number two was that VBA was not the contracting
party on that.
Mr. Space. Who was?
Mr. Aument. The VA's Office of Policy and Planning.
Mr. Space. Okay.
Mr. Aument. We co-funded that study with them, but we
deferred to them as to the selection of the appropriate outside
entity to perform this analysis.
Mr. Space. Do you know what went into that selection
process?
Mr. Aument. No, I do not, sir.
Mr. Space. Overall what is your sense of the IDA
recommendations?
Mr. Aument. I think that they are all very good
recommendations. We believe that, for the most part, they help
validate and support many initiatives that we already had
underway. And they point us to areas where we need to do
better, as well as where we need to have additional investment.
As a result, the quality assurance component of that is
based, in part, upon their recommendation that we provide
substantially greater resources to the Quality Assurance
Program, because we realized we probably have under-resourced
that in the past.
Mr. Space. What is your impression of the recommendations
concerning standardized training?
Mr. Aument. We think it is right on target. My boss,
Admiral Cooper, comes from a Navy background. When he came into
this position in 2001, one of the first changes he wanted to
see introduced into the system was to have a greater degree of
standardized training so as to avoid perpetuating some of the
regional proclivities toward training in a particular
direction.
So we have invested substantially greater amounts every
year in centralized training.
Mr. Space. Okay. Just so I understand, you agree that
investing in standardized training is a good thing. The concern
I have is that I have looked at your statement and I have heard
your testimony. And, I fear that I may be drawing the wrong
impression here, but the sense I have is that you believe that
the VBA is already undertaking the necessary steps to satisfy
the concerns raised by IDA in its analysis.
And the concern I have is that perhaps you haven't. And
again it almost appears to me as though you are brushing that
off and saying, ``We agree. We are already taking steps to do
that. What is the next recommendation?''
And the question I have is what efforts, if any, is the VBA
going to undertake in response to this analysis over and above
what it has already undertaken in the past as it pertains to
standardized training?
Mr. Aument. As it pertains to standardized training, we are
insisting upon, first of all, the training plans coming in from
each and every one of our regional offices. They are required
to submit to the Under Secretary, before the end of this month,
standardized training programs. In the compensation and pension
area I believe we are asking for 80 hours?
Mr. Mayes. Eighty hours of mandatory training.
Mr. Aument. Eighty hours of mandatory training for every
employee within their service centers at the regional offices.
Mr. Space. Now this is new in response to the analysis?
Mr. Aument. Well this is an increase in the standardized
training requirements of the past. We have increased that. We
are saying we need to have more mandatory training. We are
increasing the dollar investments in our Centralized Training
Program Development Process. In 2006, we spent around $5
million on developing standardized training products for the
compensation----
Mr. Space. Okay. But that appears to be something that was
undertaken prior to the receipt of this report. And the concern
I have is----
Mr. Aument. Pardon?
Mr. Space [continuing]. I think the VBA needs to step it
up. I think the analysis verifies that. And I am optimistic
that will happen.
Mr. Aument. Yes, sir.
Mr. Space. Now my other question, following up on what
our Chairman asked, it would seem to me from your testimony
that you believe the solution to these problems is entirely
administrative. That apart from perhaps some additional
funding, these are matters that can be handled administratively
without the need for additional legislation. Is that a correct
summary of your impression regarding the need for this Congress
to act?
Mr. Aument. I believe that certainly the problems that were
brought to our attention by the Institute for Defense Analyses
that are actionable on our part lend themselves to
administrative solutions.
I believe you had mentioned to one of the earlier panels
about process simplification, whether or not we should be
looking at that. Certainly from an ease of administration
perspective, process simplification is a very attractive idea.
But so often that occurs at the cost of a compromise in
existing due process protections for veterans. We would never
advocate that. Certainly not for the sake of making our job
easier.
Mr. Space. Thank you, Mr. Aument.
Mr. Mitchell. Just to follow up a little bit with
Congressman Space. You said that there already has been, not
new, but there has been training to standardize. If there has
been training already, then what has been the problem? Not
enough training?
Mr. Aument. Mr. Chairman, I regret if you took my comments
to mean that we have solved the entire problem of standardized
training, that there is nothing further to do. Indeed there is.
We constantly must be developing new and additional
standardized training and improving the standardized training
that we have already developed.
We have a long way to go. It is a never ending process. We
are nowhere near the end of the road yet, sir.
Mr. Mitchell. Is there anything that you need from Congress
to make sure that we can ensure that there is national
consistency? To follow up again on Mr. Space, who asked if it
was just administrative: is there anything you need from us to
bring about consistency?
Mr. Aument. There are two areas I would like to speak to. I
have seen some legislative proposals that are requesting
reports to Congress. I think that exercising your role as an
oversight body, holding us to task in these types of
situations, helps keep us honest.
Number two is the continuing support through funding. Right
now we are able to have Mr. Mayes go out and essentially double
the staff that he has devoted to his Quality Assurance Program
and the STAR Program. And we have also told him if that is not
enough come back to us and ask for more and we will provide
that.
I think the continuing support from the Congress,
recognizing the additional resources that need to go into this
particular program, if we can continue to seek that support
from you, that is a very important contribution, sir.
Mr. Mitchell. Plus more oversight, looking over your
shoulder.
Mr. Aument. Yes, sir. Never hurts us.
Mr. Mitchell. Thank you. Ms. Brown-Waite.
Ms. Brown-Waite. Thank you very much, Mr. Chairman. How
many people do you have in the Quality Service Program?
Mr. Aument. I will ask Mr. Mayes, but I believe the number
today is 18. And I believe we are in the process of hiring 16
additional.
Ms. Brown-Waite. So that is the doubling you were talking
about?
Mr. Aument. Yes.
Ms. Brown-Waite. Did I--is that enough?
Mr. Aument. That is a question I will ask Mr. Mayes.
Mr. Mayes. I think that----
Ms. Brown-Waite. You might want to turn your microphone on,
sir.
Mr. Mayes. First of all, when we talk about our Quality
Assurance Program in VBA, we have 18 STAR quality reviewers
right now. But we have over 40 employees, or will have over 40
employees, involved in quality assurance with respect to the
Compensation and Pension Program.
If I could just point out, the one thing that I think we
really learned from the IDA report is that, while we were
looking at quality through the STAR Program, where we have 18
employees reviewing the final product, the decision that
encumbers the government to the veteran, what we weren't
looking at is the variation in that final product across
States.
So what we are doing is adding a fourth element to our
Quality Assurance Program and that is consistency reviews.
Previously we had three elements. They included STAR. They
included site surveys. We actually go out and conduct site
surveys to check that regional offices are in compliance with
our policy. Then we conduct unique special reviews at the
request of the Under Secretary or as a result of some unusual
situation that requires reviews.
But we weren't calling cases in. We weren't systematically
looking at variation and then calling cases in and reviewing
those cases to look for the root cause of the variation.
Ms. Brown-Waite. So did you not know that there were
discrepancies out there?
Mr. Mayes. Well, as Mr. Aument pointed out, we were under
the impression that if we were assured that our final product
was accurate, that in fact would take care of inconsistency.
And I think that is the lesson that we have learned. We are not
just looking at whether we dot all the ``I's'' and cross all
the ``T's'' in that rating decision. But is that rating
decision, are those rules, implemented consistently across
jurisdictions?
Ms. Brown-Waite. If you double the number of those in the
Quality Assurance Program from the 18 it is now to 34, or 36,
this is for 800,000 claims. Is that even going to be
sufficient?
Mr. Aument. He has already gone through the next steps of
increasing the sample size that we have per station.
Ms. Brown-Waite. All right.
Mr. Aument. I have had many conversations with Brad on this
issue about what the right sample size is. If you want to parse
that sample in more than a single way, how large should that
sample be?
Frankly, I am of the view right now that we probably need
to have additional staff devoted to this in order to give us
the critical mass of sample that we really need. So it is my
expectation that number is going to grow.
Ms. Brown-Waite. Did I understand you correctly, Mr.
Aument, and I may not have, that you are getting from the RO's
their training plan? Is that what you said?
Mr. Aument. We have asked them to produce training plans
saying, ``Show us how you will provide for the training needs
of your staff.'' Not every RO has training needs that are going
to be identical. They are going to be using the same training
products, but we may find some office has a much more senior
group of staff that needs refresher training in a particular
area, whereas other regional offices have many more new hires
and are going to have to focus on more basic types of training,
more introductory products.
Ms. Brown-Waite. But is the training centralized? Or is the
training left up to the RO's in which case you are still going
to have inconsistencies?
Mr. Aument. The training plans submitted by each RO are
going to be subject to the Under Secretary's approval.
Generally speaking, we are going to find that most of the
courses we are using for those in training are going to be
centrally developed. It will not necessarily be centrally
administered. A lot of it is computer delivered so that the
employees will be taking the training at their desktop.
But we may find that there are some employees on site at
those regional offices that may have to go to some source other
than VBA's own centralized technical training. In some cases,
there is leadership, management training, and coach training.
For some of those types of products, there is a greater variety
of course offerings.
Ms. Brown-Waite. So is it or is it not centralized? In
other words, is the training in Arizona the same as the
training requirement and course outlines in Florida?
Mr. Aument. The basic training that every employee in
Arizona takes is the same as the basic training that every
employee in St. Petersburg takes.
Ms. Brown-Waite. And there is a course outline?
Mr. Aument. And there is a course outline.
Ms. Brown-Waite. Okay. Thank you. I yield back.
Mr. Mitchell. Mr. Space.
Mr. Space. I have no further questions, Mr. Chairman.
Mr. Mitchell. Let me ask this. This course outline that is
the same, how long has that been in place?
Mr. Aument. Excuse me, Mr. Chairman?
Mr. Mitchell. How long has that course outline been in
place?
Mr. Aument. Oh, it is a dynamic outline. As we add courses
to the curriculum it is going to be revised every single year.
Mr. Mitchell. And everybody--how long have they been using
this course? The same course?
Mr. Aument. Brad.
Mr. Mayes. Well, I want to make sure we are talking----
Mr. Mitchell. We are talking about the training. The
training these people are getting.
Mr. Mayes. Yes, sir.
Mr. Mitchell. You said that they are getting the same
training----
Mr. Mayes. The TPSS----
Mr. Mitchell [continuing]. In St. Petersburg as well as
Phoenix. If they are getting the same training, I want to know
how long they have been getting this training.
Mr. Mayes. The development of the training modules, I
believe, was initiated back in 2002. These would be the tools
that are used.
Mr. Mitchell. And then why are there discrepancies?
Mr. Aument. Discrepancies arise from more than just
training differences, Mr. Chairman.
Mr. Mitchell. But I think we were told that national
training was one of the most important parts.
Mr. Aument. Absolutely, Mr. Chairman.
Mr. Mitchell. And you say you have already gotten the
training. It hasn't been working.
Mr. Aument. Well, there are differences in performance.
There are many performance variables across the system, not all
of which can be attributed to training. There are many, many
issues: good supervision, good management, good leadership.
Mr. Mitchell. How are you going to handle those things
then?
Mr. Aument. Pardon?
Mr. Mitchell. How are you going to handle good supervision
and performance?
Mr. Aument. I think the way that you would do that in any
sort of an operation, sir.
Mr. Mitchell. But it hasn't been working.
Mr. Aument. You put out good performance standards and you
try to make sure that people adhere to those.
Mr. Mitchell. But it hasn't worked.
Mr. Aument. Well, I don't know. I think that, for the most
part, it has worked. We have had regional office directors who
have been removed.
Mr. Mitchell. Then why are there discrepancies that are so
wide?
Mr. Aument. It is more than just that, sir. There are many
other reasons. You heard the Institute for Defense Analyses say
it is not all something within our control. I mean there are
differences in veteran populations. We can't control that.
Mr. Mitchell. So we shouldn't expect anything different
from what has been going on?
Mr. Aument. I think you should expect a difference. You
should see a narrowing band of variation on the new work coming
into the system. But to the extent that you are going to be
calling us up every year and taking a look at everybody on the
rolls and saying, ``what is the average annual compensation,''
that is not going to change measurably from year to year. The
total population of veterans receiving compensation probably
only changes by 5 percent each year.
Mr. Mitchell. So we can expect Ohio to still be at the
bottom?
Mr. Aument. Well, if I look at where Ohio is for the work
that we do----
Mr. Mitchell. And New Mexico at the top.
Mr. Aument [continuing]. In 2007 you will find that Ohio
was number 37 for the work completed and the veterans added
during 2007. But that is not going to change their position in
the aggregate average that you are pointing to.
Mr. Mitchell. Mr. Space?
Ms. Brown-Waite. May I----
Mr. Mitchell. Excuse me. Go ahead.
Ms. Brown-Waite. One of the training components is the
training for PTSD. And I understand it is a 30-hour training
course. Is that correct?
Mr. Aument. Yes. The second module is a 30- to 33-hour
course. That is the time, including the testing, that it would
take the average rating specialist to go through the course.
Ms. Brown-Waite. So how many raters have taken this 30-hour
mandatory PTSD training?
Mr. Aument. Very few to date. We just rolled this out in
July of this year. We just completed the field testing. In the
training plans that are coming in for the fiscal year that
began October 1, we are going to require every rating
specialist to complete that. That is going to be without
exception. Every rating specialist will complete that in 2008.
Ms. Brown-Waite. Do I understand you correctly that,
although we have been dealing with PTSD for this long, you just
now have a training module, or is this a new one?
Mr. Aument. This is a new one. There had been a more basic
training module that was in place before that was really an
introduction for new raters. This new product is developed to
apply to all rating specialists whether they be new or
experienced rating specialists.
Ms. Brown-Waite. When can we expect to have all of the
raters trained on this 30-hour course?
Mr. Aument. By the end of this fiscal year.
Ms. Brown-Waite. And after they take the course, is there a
test that is given?
Mr. Aument. Yes. During the course of the package, I
believe there are four modules of testing built into it,
correct?
Mr. Mayes. Yes. There are sample cases that are basically
cases with fact patterns that an RVSR would see in the field.
And so they go through these cases and apply the learning that
they just had, going through the 33-hour module, and then they
are tested on those fact patterns to see if they arrive at a
consistent decision.
Ms. Brown-Waite. So is there a right and a wrong answer?
Sir, I don't think this is funny.
Mr. Aument. No. I----
Ms. Brown-Waite. I have too many veterans who have been
screwed over by the VA for you to sit there. I was going to
comment before about your laughing.
Mr. Aument. I am sorry.
Ms. Brown-Waite. It is very inappropriate.
Mr. Aument. I duly apologize.
Ms. Brown-Waite. And you owe every veteran in this great
country of ours an apology.
Mr. Aument. Let me answer that question. And I do
apologize, ma'am. You are sensing my own frustration with a
system that allows more than a single answer to that.
First of all, there is a right and a wrong determination on
the notion of service connection. That is a yes/no
determination. And there is an absolutely right answer and an
absolutely wrong answer.
Now as to the rating, the evaluation that is applied to the
case. Are they going to be rated zero percent, 10, 30, 50, 70,
or 100? There can be more than one right answer to that. Two
different raters may rate that case and one may rate it at 50
percent and one may rate that at 70 percent. And there is not
going to be an absolute answer to that question.
That is one of the frustrations. And if you sensed my
reaction to your question, that was my frustration,
Congresswoman. Because to me, it is one of the shortcomings of
the system that it permits more than a single answer on that.
Ms. Brown-Waite. May I just ask a follow up question? Did
you know this when you bought this module?
Mr. Aument. Pardon?
Ms. Brown-Waite. Did you know that----
Mr. Aument. Yes.
Ms. Brown-Waite. Did you know this when you bought the
module?
Mr. Aument. That is correct, yes.
Ms. Brown-Waite. So we--how much did we spend for this?
Mr. Aument. I don't know how much on this module, but the
training would not have changed that particular outcome,
Congresswoman.
Ms. Brown-Waite. So we are still going to have
inconsistencies even after the training for PTSD?
Mr. Aument. That is correct.
Ms. Brown-Waite. And is there a reason why it took so long?
It is not like PTSD is something new that the VA is having to
deal with. You have so many who returned from Vietnam who have
PTSD. It just seems like it is almost too little too late.
Mr. Aument. I don't know if it is--I think that you could
build a strong case for that, Congresswoman. But I think that
we have to tackle it sometime. If we have not tackled that
sufficiently in the past, we have to remedy that problem.
Mr. Mitchell. Congressman Space?
Mr. Space. Thank you, Mr. Chairman. You know, it seems to
me that I mean you are right in a sense that much of the rating
determination boils down to a subjective judgment call. It is
not entirely objective and it never will be. But it still
troubles me that IDA, an organization contracted with by the
VA, has clearly stated that up to 50 percent of the variation
is attributable to something systemic within the system or the
process.
And I mean it simply doesn't cut it to say it is a
subjective issue and we are doing everything we can. We have
heard testimony from more than one source that most of these
raters' educational process, training process occurs while on
the job. You can give them 30 hours of training, you can give
them 120 hours of training, you can give them 3 years of
training. That is always going to be the case.
We have heard testimony that these various regional offices
have developed personalities of their own. We heard one
gentleman testify that it is common knowledge. You can apply
for PTSD disability rating in Ohio and expect 20 percent, and
you can go to New Mexico and expect 100 percent.
And it seems to me that there should be focus on attacking
that deviation in personality. Figuring out a way to overcome
it. I don't see these courses as doing that. I don't see the
work that the VA is doing now as properly addressing the issue
of culture and personalities that vary from regional office
to regional office.
And my question to you, Mr. Aument, are there any efforts
underway, either now or preceding this analysis, that would
address specifically the problems associated with this--I mean
it is generational. It is just that if Cleveland has got a bad
reputation today, they had a bad reputation among veterans 20
years ago, and they are going to have a bad reputation 20 years
from now because the raters who work now learned on the job
from those who preceded them and they are going to be teaching
the raters who are going to be rating in 20 years.
Is there anything that is going to be done or that can be
done to address that generational culture of personalities that
is in practice and in reality affecting this variation?
Mr. Aument. I think there are some things that we can do,
Congressman. I believe that we have to set the tone, first of
all, out of Washington philosophically as to what our
expectations are and the approach that raters and anyone
working in the regional office is going to be taking toward
serving veterans.
We have to make sure that, to the extent there are pockets
among any of the offices that have built-in biases, we do
everything that we can to stamp that out. But one of the things
that I would suggest, and it is not necessarily a universally
popular answer to that question, one of the recommendations
that IDA had mentioned was that it is going to be inherently
difficult to ensure consistency when we are rating these cases
in 57 different locations.
If we want to become more consistent, one of the basic
answers is to rate these cases in fewer locations.
Mr. Space. I am not sure that is practical or feasible.
Mr. Aument. Correct.
Mr. Space. Do we contract out responsibility for rating?
Mr. Aument. No, we do not. That is an inherently
governmental function that cannot be contracted out.
Mr. Space. And it is not your position that it should be,
or that it would serve as a possible solution?
Mr. Aument. No. I can give you a parallel. I know in other
government programs some of the front end work can be
contracted out, some of the development activities. I know the
State Department, for example, in doing some of their work in
the visa program does some contracting of the development
activities. But ultimately the decision that binds the country
to this continuing liability and responsibility has to be made
by a government employee.
Mr. Space. Nothing further. Thank you, Mr. Aument. Thank
you, Mr. Chairman.
Mr. Mitchell. Are there any other questions?
Thank you.
Mr. Aument. Yes, sir.
Mr. Mitchell. Thank you. I appreciate it. And this hearing
is adjourned.
[Whereupon, at 4:41 p.m., the Subcommittee was adjourned.]
A P P E N D I X
----------
Prepared Statement of Hon. Harry E. Mitchell, Chairman,
Subcommittee on Oversight and Investigations
Thank you all for coming today.
For years, the Veterans' Benefits Administration has experienced
problems maintaining adequate accuracy and consistency data within its
ratings system. The purpose of this hearing is to evaluate what the VA
is doing to fix these problems. Their ability to keep accurate records
is essential to ensure the quality of veteran disability ratings, now
and into the future.
Let me first thank Congressman Space, who has quickly become a
leader in working to address this issue. He and Ranking Member Brown-
Waite took the lead in assembling the first panel.
The disability rating system has been an issue of serious concern
since 2002, following an eye-opening GAO Report. In January of 2003,
the GAO designated the VA's disability program as high risk. This
designation resulted from concerns about consistency of decision making
and accuracy of records.
This Subcommittee is aware of the department's efforts to correct
these issues, but more has to be done. I am concerned about the wide
variations in average compensation per veteran and grant rates that
persist between States.
After years of recommendations by the GAO and the VA Inspector
General, the VA has failed to collect and maintain an accurate
database. That must change because our Nation's veterans cannot be
forced to wait any longer.
According to the VBA's Systematic Technical Accuracy Review, or
STAR, accuracy of regional office decisions varies from 76 percent in
Boston to 96 percent at the Fort Harrison regional office. This
variation is troubling. More troubling is that STAR only looks at
accuracy, and completely ignores consistency of decisions.
The VA has implemented a new data system called the Rating Board
Automation 2000. This system collects more information, but it
continues to set road blocks for analyzing claim denials for
disabilities like Post Traumatic Stress Disorder and Traumatic Brain
Injury.
PTSD and TBI are complicated and often misdiagnosed disabilities.
Because of their nature, rating a veteran with these disabilities is
somewhat subjective.
We understand there are variances between States in claims
decisions, and it is to be expected. But the subjective nature of the
ratings process does not do our veterans justice.
We are sending the wrong message to our Nation's veterans. We are
saying that even though you served courageously for your country, you
better live in the right State and hire a professional when filing for
disability benefits.
This is unacceptable. Just last week we heard from the Veterans'
Disability Commission on the necessity to provide equitable treatment
for all veterans. But this is not the case today.
Aside from maintaining accurate records, we need to make sure that
claims officers nationwide receive the same training. This training
must be focused on the intricacies of each disability imposed on any
veteran, young and old.
I know that we can work together in a bipartisan way with the VA to
ensure that our veterans get the best and most fair benefits available.
Prepared Statement of Hon. Ginny Brown-Waite, Ranking Republican
Member, Subcommittee on Oversight and Investigations
Thank you Mr. Chairman.
The Institute for Defense Analyses (IDA) recently issued their
final report in March 2007 on their analysis of differences in
Disability Compensation in the Department of Veterans Affairs (VA).
This report was completed at the VA's request to identify and
collect data on compensation recipients.
According to this study, the VA must do three things:
1. put forth a national effort of consistency of claims
processing,
2. make certain that raters receive consistent training on a
national basis, and,
3. collect and maintain valid data to analyze national statistics
and trends.
I am interested in hearing from Mr. Aument on how the VBA plans to
implement these recommendations.
It is apparent that VBA must take steps to improve training and
modernize its ratings system.
Whether a veteran's claim is rated at the St. Petersburg VA
Regional Office, or the Phoenix VA Regional Office, the same standard
must be applied when making a rating decision on the claim.
I would like to bring to your attention a bill I have cosponsored
with my colleague, Mr. Lamborn, H.R. 3047, the Veterans Claims
Processing Innovation Act of 2007.
This legislation would improve the veterans' claims processing
system at VA by changing the work credit system for VA.
To do this, the measure establishes a fully electronic system pilot
to streamline the claims process.
H.R. 3047 also requires the VA to have an independent organization
certify the effectiveness of VBA's training programs, and allow family
members of veterans who have passed away to continue the original claim
instead of forcing the dependents to start the claims filing process over.
I hope that this legislation will pass the Committee before the end
of this Congress, and will be considered on the House floor.
I look forward to hearing more from our witnesses today, and yield
back the balance of my time.
Prepared Statement of John J. ``JJ'' Kenney, USMC (Ret.), Homosassa,
FL,
Veteran Service Officer, Citrus County, FL
Good afternoon Mr. Chairman and members of the Committee. I'd like
to thank the Committee for the invitation to speak this afternoon about
some of the disparities in the awarding of benefits from state to
state. I would also like to express, in front of her peers, my sincere
appreciation to Congresswoman Ginny Brown-Waite for her efforts on
behalf of the veterans of Citrus County. Thank you, Congresswoman.
I would like the Committee to know that I am not here today to
knock the VA. We in the State of Florida enjoy a close working
relationship with our one (1) and only VA Regional Office in St.
Petersburg. Many of my fellow service officers in other states only
wish they had that kind of working relationship with their ROs. If I
have a problem, I can pick up the phone and talk directly with the
Service Center Manager and, if necessary, the heads of any of the
departments at the RO. And when they say they will get back to you,
they do!
There has been, and continues to be, a disparity in the awarding of
benefits from state to state. One wonders how this could be possible,
since all fifty (50) plus regional offices are guided by the same
regulations: 38 CFR and the M21 Manual. One, 38 CFR, provides the
necessary information with regard to ethical conduct in the
adjudication of veterans' claims, along with how and when information
about veterans should be handled. Additionally, 38 CFR provides the
information required with regard to diagnostic codes for different
illnesses and injuries, along with the percentages to be awarded for
the severity of the disability. The M21 Manual is basically a Standard
Operating Procedure: what to do to get from point A, the receipt of a
claim, to point B, the decision. It would appear a relatively simple
task of reviewing the evidence supplied by the veteran, reviewing
Service Medical Records for in-service occurrence, verifying character
of service, and determining from the medical evidence whether the
condition is chronic in nature or whether the disease or illness is
presumptive. Presumptive means that the veteran has filed within one
(1) year of separation, or that the disability is the result of
exposure to an environmental hazard, e.g., Agent Orange or radiation,
or that the veteran was a Prisoner of War.
There are several elements that are not being considered, including
the human element, the veteran population, and the inventory of the
various VA Regional Offices.
The human element is present in every decision the VA renders;
however, it differs from state to state. I know that the training
received by VA personnel is superb and, to the best of my knowledge,
standardized. So why the disparity in awards? I'd like to provide the
Committee with a couple of examples.
Example 1--The veteran, we'll call him Mr. Smith, resides in
California. He entered the Armed Forces in the mid-1960s. At boot camp
the veteran received inoculations with the air guns. In the late 1990s
or early 2000s he was diagnosed with Hepatitis C. He had not used
drugs, had no tattoos, and had not engaged in any improper conduct. He
applied for service connection based on the use of the air guns,
providing medical evidence that supported his claim. He was awarded
service connection. Veteran number 2, we'll call him Mr. Jones, resides
in Florida and entered the service at approximately the same time as
Mr. Smith. He too received inoculations with the air gun. Again, around
the same time as Mr. Smith, Mr. Jones was diagnosed with Hepatitis C.
He initially thought it may have been the result of a surgery he'd
undergone at the VA. Believing he had received blood during the
surgery, he applied for compensation on the theory that the blood was
tainted. Upon receipt of the claim, the VA located the surgical notes,
which indicated Mr. Jones had not received any blood products, and
denied his claim. In discussion with the veteran, again ruling out
drugs, improper behavior, or tattoos, it came down to the air gun. The
veteran again applied for compensation based on the air gun, providing
some of the very same information Mr. Smith did in his claim.
Additionally, he found a medic who was administering shots at the same
time Mr. Jones was at boot camp. The medic verified the method by which
the air gun was used, and this supported the medical evidence submitted
by both Mr. Smith and Mr. Jones. Mr. Jones' claim was again denied and
is being appealed. Mr. Jones will die before his appeal is complete.
Example 2--The veteran, we'll call him Mr. Toms, resides in New
Jersey. He spent over twenty years in aviation. Almost twenty years
after retirement, he applied to the VA for service connection for a
hearing loss and tinnitus. He provided medical evidence of his hearing
loss and listed the types of acoustical trauma he was exposed to, which
included several tours in Vietnam as a door gunner. His claim moved
through the system and he was subsequently granted service connection.
Our next veteran, we'll call him Mr. Wilson, resides in Florida. He too
spent over twenty years in aviation. Fourteen years after his
retirement, he applied for service-connected disability for several
conditions, including hearing loss and tinnitus. He provided the VA
with medical evidence of the hearing loss, and his service medical
records at retirement supported a hearing loss. He too provided
information on the types of acoustical trauma he was exposed to,
including several tours in Vietnam, also serving as a door gunner. The
claim was denied and is on appeal.
It is apparent to me that the VSR, that human element, played a
significant role in all these claims. How to remove this factor in the
claims process is, in my opinion, almost impossible. Continued training
is the best bet in reducing this factor in the claims process.
In discussing the state veteran population and regional office
inventory, one has only to look at three (3) states to see where the
problem is. California has the largest veteran population, 2,310,968,
and has three (3) regional offices. Florida has a veteran population of
1,788,496 and has one (1) regional office. Texas has a veteran
population of 1,681,748 and has two (2) regional offices. Looking at
the numbers, is it any wonder there is a disparity in decisions? The
key word in rating decisions is production. It's sad but true that
VSRs are graded on their production, so it's no wonder, given the size
of the inventory and the number of regional offices, that there will be
disparities in decisions. I submit to the Committee that the VA should
conduct a study similar to the CARES Commission to accurately identify,
by state, either additional regional office requirements and/or the
reallocation of regional office areas of responsibility.
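To put the workload arithmetic in rough terms, the short calculation
below is offered only as an illustration; it uses the population and
office counts cited above and assumes, purely for simplicity, that work
is split evenly among a state's regional offices, which of course it is
not.

    # Illustrative sketch (Python): approximate veterans per regional
    # office, using the state populations and office counts cited above.
    # The even split across offices is an assumption for illustration only.
    states = {
        "California": (2_310_968, 3),
        "Florida":    (1_788_496, 1),
        "Texas":      (1_681_748, 2),
    }
    for name, (veterans, offices) in states.items():
        print(f"{name}: about {veterans / offices:,.0f} veterans per office")
    # Florida's single office serves roughly twice the population per
    # office of the offices in California or Texas.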
One last item before I close that will affect the claims process is
the age of our VSRs. A significant number are about my age and looking
at retirement in the next couple of years. Now is the time for the VA
to establish a plan for recruiting replacements for these VSRs. If we
don't plan for it now, I can assure you that the disparities in the
claims process will escalate.
Again I'd like to thank the committee for the invitation to speak
and also your efforts on behalf of our Nation's veterans.
Respectfully submitted
J.J. Kenney
Prepared Statement of Ray Pryor, USN (Ret.), Chillicothe, OH,
on behalf of American Veterans (AMVETS)
Mr. Chairman and Members of the Subcommittee:
Thank you for providing AMVETS (American Veterans) the opportunity
to testify regarding the issue of disability claims ratings and
benefits disparities within the Veterans' Benefits Administration.
This hearing is very important, as it addresses an issue that
continues to plague the Veterans Benefits Administration (VBA) and
leaves veterans frustrated and suspicious of the system that is in
place to support them after their service to our Nation. In examining
the factors that have led to the disparities in claims ratings, two
large overarching conditions are present that have allowed the gaps in
ratings to exist, and several circumstances have occurred which have
exacerbated the problem.
First and foremost, we are working with a system that is based on
humans making decisions. Their perceptions, understandings of
conditions, and occasional mistakes are going to play a role in
disparities. If this were the only issue, then the disparities would
not be regionally based; they would be proportionally distributed
throughout VBA. However, there is evidence of disparities between
Regional Offices. AMVETS believes these disparities are caused by two
separate but related groups within the claims process: (a) the Veteran
Service Representative (VSR), the Rating Veteran Service Representative
(RVSR), and the Decision Review Officer (DRO) on the rating side; and
(b) the Compensation and Pension (C&P) doctors whose evaluations of a
veteran are used by the regional offices to decide a claim.
The reason these two groups have such a great influence on the
outcome of veterans' claims, and why there are regional disparities, is
the personalities of the doctors, the raters and review officers, and
the personalities of the Regional Offices as a whole. These regional
personalities develop because new raters and DROs are trained by the
region, and particular styles, common terms, and language are used by
the raters when deciding a claim. Terminology such as ``full range of
motion'' compared to ``essentially full range of motion'' could change
a rating by 10 percent. Likewise, physicians' perceptions and similar
language usage can alter a claim. Veteran Service Officers (VSOs) will
state that they routinely see Compensation and Pension exams that
describe the patient with cookie-cutter language, leaving room for
subjective interpretation.
In addition to these personalities, which produce a broad range of
outcomes on similar if not identical claims, there is the backlog of
claims within VBA and the performance credit system that monitors the
number of claims processed by the raters and DROs. Currently, there is
no oversight of the quality of work the DROs perform. As identified by
the AMVETS-sponsored ``National Symposium for the Needs of Young
Veterans,'' DROs are evaluated on the number of claims they submit, but
there is no distinction between positives and negatives in the
performance evaluation. There is only a requirement to process a
certain number of claims, and they receive credit for all claims they
move forward, regardless of the number that are overturned or remanded.
The backlog has increased the pressure to push more claims through,
but because of the need to push them through, incomplete and poorly
written claims are routinely submitted and remanded, cycling the claim
through the system a second or third time and exacerbating the
system's backlog.
AMVETS offers three recommendations that will assist in narrowing
the disparities in claims and reducing the backlog. First, a
centralized training facility should be tasked with teaching new raters
and DROs a standardized process for deciding and reviewing claims. This
will remove much of the regional personality that affects the disparity
in claims at the rater/reviewer level. Second, there needs to be
improved oversight of both the rater/reviewer and the C&P doctors. With
regard to the C&P doctors, oversight should be in place to ensure the
examiner's guide is being utilized. This could be done through a
``whistle blower'' program that will allow veterans to feel safe in
identifying C&P examiners who are misdiagnosing claimants, or any other
mechanism that could track the validity of physical exams. Oversight
could also be improved in the rating and review of claims. A system
needs to be developed that will ensure not only that claims are being
processed, but that they are being processed properly and completely.
H.R. 3047 makes efforts to improve the work credit system under which
the DROs and RVSRs currently operate. This legislation would not credit
a regional office for a claim until the expiration of the appellate
period. This system, or a system that monitors the ratio of cases
remanded or overturned to the total number of cases referred, is
essential to improving the claims process. Lastly, understanding this
is a two- to three-year process, hiring more staff to reduce the burden
of the backlog is critical. There is no single, simple solution to the
disparity problem, but identifying the roots of the problem and tasking
VA with finding solutions to these problems is critical if improvements
are going to be recognized in the claims system.
Mr. Chairman, this concludes my testimony.
Prepared Statement of David E. Hunter, Ph.D., Research Staff Member,
Cost Analysis and Research Division, Institute for Defense Analyses
Institute for Defense Analyses Study on Analysis of Differences in
Disability Compensation in the Department of Veterans Affairs
Mr. Chairman and Members of the Subcommittee, I am pleased to come
before you today to discuss IDA's work on disability compensation
conducted for the Department of Veterans Affairs (VA). Let me begin
with some background on the study and then I will summarize findings
and recommendations.
I. Introduction
A total of 2.6 million veterans were receiving disability
compensation as of September 2005. The average yearly award for the
entire United States was $8,890, and the average varied across states
from more than $12,000 in New Mexico to less than $8,000 in Ohio.
In addition, the percentage of veterans receiving compensation
differed from state to state. Nationwide, 10.8 percent of veterans were
receiving compensation, and this varied from nearly 18 percent in
Alaska to about 7 percent in Illinois.
In May 2005, the VA asked the Institute for Defense Analyses (IDA)
to conduct a study of the major sources of the observed variation
across states in:
1. The average payments to veterans receiving disability
compensation; and
2. The percentage of veterans receiving disability compensation.
My testimony today will be based on the results of that study,
which have been documented in IDA Paper P-4175.
There are two potential reasons for the observed state-to-state
variations in average awards. First, there may be systematic
differences across states in the claim adjudication process. Second,
the variation may reflect differences across states in the
characteristics of the veteran populations.
Our study quantified the amount of variation attributable to states
having veteran populations with different characteristics. To do this,
we identified and collected relevant data on disability compensation
recipients and the veteran population and used these data to test a
wide variety of hypotheses. We used data as of September 2005 as the
baseline for our analysis. To identify historical trends, we also
examined available historical data.
II. Impact of Maximum Awards
Payments to veterans are based on overall disability level, from 0
percent to 100 percent in increments of 10 percent. In addition,
veterans may receive an award of Individual Unemployability (IU), which
pays them the equivalent of 100 percent disability.
We found that the percentage of recipients receiving a maximum
award (100 percent or IU) explains the vast majority of the observed
state-to-state variation in average compensation. We calculated that 94
percent of the variation was explained solely by differences across
states in the percentage of compensation recipients receiving a maximum
award.
This result reflects two underlying facts. First, although veterans
receiving maximum awards make up a small percentage (17 percent) of all
compensation recipients, they receive the majority (58 percent) of the
total compensation dollars. Second, there is variability across states
in the percentage of compensation recipients receiving maximum awards,
ranging from a low of 10 percent in Alaska to a high of 30 percent in
New Mexico.
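To make concrete what ``percentage of variation explained'' means in
this context, a minimal sketch follows. The numbers in it are
hypothetical, not the study data; it simply shows the form of the
calculation: fit a state-level regression of average award on the share
of recipients receiving a maximum award and report the resulting
R-squared.

    # Minimal sketch (Python) with hypothetical state-level data, not
    # IDA's data: regress average award on the share of recipients with
    # a maximum award and report R-squared, the fraction of variation
    # explained.
    import numpy as np

    max_award_share = np.array([0.10, 0.14, 0.17, 0.22, 0.30])    # hypothetical
    average_award   = np.array([7600, 8300, 8900, 10100, 12200])  # hypothetical

    slope, intercept = np.polyfit(max_award_share, average_award, 1)
    predicted = slope * max_award_share + intercept
    ss_res = np.sum((average_award - predicted) ** 2)
    ss_tot = np.sum((average_award - average_award.mean()) ** 2)
    print(f"Share of variation explained: {1 - ss_res / ss_tot:.0%}")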
For the maximum awards, we found the IU awards exhibited the
greatest variability across states and alone accounted for 75 percent
of the observed variation in average awards. The percentage of
compensation recipients receiving IU per state ranges from a low of 3
percent in Maryland to a high of nearly 20 percent in New Mexico.
Given these findings, the key issue our study had to address was:
To what extent do the state-by-state variations in maximum awards
reflect different treatment of similar veterans and to what extent can
they be explained by differences across states in the veteran
populations?
III. Demographic and Claim-Specific Factors
We tested a wide variety of demographic and claim-specific factors
to identify those that influence the award outcomes. We identified
three major factors that contribute to the observed variation across
states in average disability compensation awards.
1. Post Traumatic Stress Disorder (PTSD). We found that all states
have high average awards for veterans with PTSD. However, there are
large differences across states in the proportion of compensation
recipients with a PTSD award. This difference in the percentage of
recipients with a PTSD award accounts for 40 percent of the observed
variation in average awards across states.
2. Power of Attorney (POA) representation. Nationwide, veterans
with POA representation receive an average annual award of over twice
that of veterans with no POA representation. We found that differences
across states in the percentage of claims with POA account for 16
percent of the variation in average award across states.
3. Period of service. The average award for Vietnam veterans is
$11,670--the highest for any period of service. As a single predictive
factor, differences across states in the period of service of
recipients accounts for 8 percent of the observed variation in average
awards.
We calculated the combined effect of the three main factors that we
identified: PTSD, power of attorney, and period of service. Note that
these factors are correlated, and we could not simply add the
percentage of variation explained by each single factor to calculate
their combined explanatory power. Taking account of the correlations,
we found that 50 percent of the variation across states is explained by
these three factors.
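The caution about correlated factors can be illustrated with a small
numerical sketch, again using made-up data rather than the study's: the
R-squared values from single-factor regressions overlap and do not
simply add, so the combined share of variation explained has to come
from a joint regression.

    # Small sketch (Python, made-up data): single-factor R-squared values
    # of correlated predictors overlap; the combined effect requires a
    # joint fit.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    factor_a = rng.normal(size=n)                          # e.g., PTSD share
    factor_b = 0.7 * factor_a + 0.3 * rng.normal(size=n)   # correlated factor
    award = 2.0 * factor_a + 1.0 * factor_b + rng.normal(size=n)

    def r_squared(predictors, y):
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return 1 - (y - X @ beta).var() / y.var()

    r_a = r_squared([factor_a], award)
    r_b = r_squared([factor_b], award)
    r_joint = r_squared([factor_a, factor_b], award)
    print(f"A alone: {r_a:.0%}, B alone: {r_b:.0%}, "
          f"sum: {r_a + r_b:.0%}, joint: {r_joint:.0%}")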
Using a more detailed model that included several demographic factors
related to the veteran's county of residence, which proved to correlate
with average awards, we found that as much as 70 percent of the
variation across states is due to differences in the recipient
populations. While these observed correlations are of interest, it is
important to be careful in interpreting them; they almost certainly do
not reflect direct causal relationships.
IV. Variation in the Percentage of Veterans Receiving Compensation
Our second area of study was the sources of differences in the
percentage of veterans receiving compensation.
Two top-level factors influence the percentage of veterans
receiving compensation. These factors are application rates and
adjudication results. Of these two factors, we found application rates
to be more important than adjudication results in explaining variation
across states. Using available data over the past 10 years, we
calculated that differences in application rates explained over 70
percent of the variation in the percentage of veterans receiving
compensation.
We also tested a wide variety of demographic factors to identify
those that influence the percentage of veterans receiving compensation.
We found that military retirees are over four times as likely to
receive compensation as non-retirees. This alone accounts for over 40
percent of the variation across states. The percentage of veterans
receiving compensation also varies by period of service. We calculated
that differences in state veteran populations by period of service
account for 12 percent of the variation across states. Unfortunately,
available veteran population data and demographic information on all
applicants are insufficient to quantify the total variation accounted
for by the combination of these demographic factors.
V. The Adjudication Process
As noted above, we found that state-to-state differences in
compensation recipients explain 50 percent to 70 percent of the
variation in average awards. This implies that as much as 30 percent to
50 percent of the variation in average awards could be due to
differences across states in the adjudication process. We examined the
VA's adjudication process and found that most rating decisions are made
locally and often call for subjective judgments. We also found that
initial and ongoing rater training varies by regional office and has
changed over time. On-the-job training and mentoring, an important
source of rater education, promotes uniformity within a regional
office. The current national quality review program (STAR) focuses on
accuracy of individual claims and does not attempt to promote
consistency. There is no program to monitor trends in ratings across
regional offices aimed at improving understanding of regional
differences. For these reasons, the current adjudication process has
the potential for allowing regional differences to develop and persist.
VI. Recommendations
Based on our findings and observations, the IDA report presented
six recommendations for consideration by the VA.
1. Standardize initial and ongoing training for rating specialists.
The VA should consider preparing a set of test cases as part of
ongoing training procedures.
2. Standardize the medical evaluation reporting process.
Many raters identified variation in quality of medical reports as a
possible cause of variation in awards and stated that poor quality
reporting hinders their ability to make an accurate rating decision.
3. Increase oversight and review of rating decisions.
The VA could strategically select a more significant fraction of
rating decisions for review. This selection process should target
claims with high leverage and evaluate each on service connection,
degree of disability, and IU status determination.
4. Consolidate rating activities to a central location.
Consolidation would remove many of the underlying differences
across regional offices that contribute to potential inconsistencies in
decisions. Realizing that this may not be feasible, we note that
consolidation to fewer regional offices or having regional offices
specialize for certain claim types would also improve consistency.
5. Develop and implement metrics to monitor consistency in
adjudication results.
These metrics would target the key factors that impact the
variations in average awards and the percentage of veterans receiving
compensation.
6. Improve and expand data collection and retention.
The ability to monitor variances is currently limited by lack of
available data. Most notably, the VA has not historically tracked data
on denied claims. Such data are needed to further understand the
underlying reasons for differences across states in the composition of
claim recipients. For instance, data do not exist to show how much the
denied claims contribute to differences across states in the mix of
compensation recipients.
Mr. Chairman and Members of the Subcommittee, that concludes my
statement, and I am available for questions.
Prepared Statement of Jon A. Wooditch, Deputy Inspector General,
Office of Inspector General, U.S. Department of Veterans Affairs
INTRODUCTION
Mr. Chairman and Members of the Subcommittee, I am pleased to be
here to address the Office of Inspector General's (OIG) report, Review
of State Variances in VA Disability Compensation Payments, issued May
19, 2005. Today, I will summarize the report and our subsequent
activity relating to the report, and provide observations on the
remaining actions needed to reduce unacceptable variances in average
annual disability compensation payments. With me is Joseph Vallowe,
Deputy Assistant Inspector General for Management and Administration,
who can answer questions about implementation of OIG recommendations
and our work since the report was issued.
THE OIG REPORT
Our review confirmed that variances in average annual disability
compensation payments by state have existed for decades. In trying to
understand why these variances exist, we identified and assessed more
than 20 possible factors. Based on our assessment, we discovered that
some of the factors contributing to differences in average payments by
state, such as the veteran's period and branch of service, number of
dependents, and disabling conditions, are not within the Veterans
Benefits Administration's (VBA) control. Since these factors are not
within VBA's control and all veterans are not identical, we concluded
that some level of variance across states is expected.
On the other hand, we also discovered that some of the factors that
impact average payments are within VBA's control, such as disability
rating decisions. To better understand the impact of rating decisions
on the variance, we analyzed claims data for fiscal year (FY) 2004, and
concluded that much of the information needed to make these decisions
is subject to varying degrees of interpretation and judgment, by both
veterans when providing information on their medical condition and VBA
claims adjudicators when assessing this information for rating
purposes. We also determined that the degree of rater subjectivity can
be influenced by differences in the way medical examination results are
presented, by vague criteria set forth in the Rating Schedule for some
disabling conditions, and by the amount of training and rater
experience. In short, subjectivity can lead to inconsistencies in
rating decisions, which can influence variances in average annual
disability compensation payments nationwide. As such, the issue is not
whether a variance exists but whether the magnitude of the variance is
acceptable.
Our report included eight recommendations aimed at improving
consistency in rating decisions in order to reduce unacceptable
variances. VBA has taken acceptable action to implement those
recommendations. In particular, our report recommended that VBA conduct
a scientifically sound study of the major influences on compensation
payments in order to develop data and metrics for monitoring and
managing variances. The December 2006 Institute for Defense Analyses
(IDA) report conducted as a result of this recommendation confirmed our
review findings and made meaningful recommendations to assist VBA in
understanding and reducing unacceptable variances.
Other key actions taken by VBA in response to our recommendations
include:
Coordinating with the Veterans' Disability Benefits
Commission to discuss issues pertaining to revising and clarifying the
Rating Schedule.
Forming the Consistency Analysis Study Group, which
provided a plan to identify, analyze, and rectify inconsistencies in
disability evaluations.
Deploying 57 standardized medical examination templates
that are used to submit examination results to VBA for rating
decisions.
Hiring 1,100 additional benefits processing staff and
providing additional standardized training for rating decision makers.
Enhancing outreach efforts by mailing 325,000 letters to
veterans in the six states with the lowest average disability
compensation payment in FY 2004, advising them of steps to follow if
they want to reopen their disability claim.
OIG ANALYSIS OF CURRENT STATUS AND REMAINING ACTIONS
In preparation for this hearing, we obtained updated information on
average annual disability compensation payments, reviewed the IDA
report, and updated our information on VBA activities since our report
was issued with the purpose of identifying what remains to be done to
improve rating consistency and reduce unacceptable variances.
In our 2005 report, we indicated that the variance in average
annual disability compensation payments between the highest and lowest
states was $5,043 in FY 2004. We recently obtained compensation payment
data by state for FYs 2005 and 2006. Because VBA is in the process of
migrating disability benefit claims data from the Benefits Delivery
Network system to the VETSNET system, we were unable to obtain complete
data for FY 2007. The variance was $5,061 for FY 2005 and $5,105 for FY
2006. While variances continue to increase, they are doing so at a much
lower rate than in the previous 5 years, when the increase averaged $332
a year. We also discovered that one reason for this slowdown is more
consistent ratings for new claims. In fact, the
national variance in new claims declined from $6,054 in FY 2004 to
$4,477 in FY 2006. This was directly attributed to an increase in
average payment by the lowest state and a decrease in average payment
by the highest state.
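As a simple check on the figures above, the year-to-year changes can
be computed directly; the short calculation below uses only the
variance figures cited in this statement.

    # Simple arithmetic (Python) on the variance figures cited above.
    variance = {"FY 2004": 5043, "FY 2005": 5061, "FY 2006": 5105}
    years = list(variance)
    for prev, curr in zip(years, years[1:]):
        print(f"{prev} -> {curr}: variance grew by ${variance[curr] - variance[prev]}")
    # Increases of $18 and $44, well below the roughly $332 average
    # annual increase reported for the preceding 5 years.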
While some progress has been made, VBA remains challenged to
improve the consistency of rating decisions. To achieve this, we
believe further efforts are needed in monitoring and measuring
variations in rating decisions by state and VBA regional offices. In
particular, we recommend that VBA review claims folders for particular
diagnostic codes or body systems where ratings fall outside the
expected variance range to determine whether the rating is justified or
explained by unacceptable causes, such as incorrect or subjective
application of the standards. VBA should incorporate what it learns
from these reviews to improve rating consistency nationwide. This
approach is consistent with the plan submitted by the Consistency
Analysis Study Group and with IDA's recommendations.
In response to our 2007 Major Management Challenges, VBA stated
that it conducted a pilot project to monitor the consistency of
decision making for rating-related claims and conducted a consistency
review focusing on evaluations of Post Traumatic Stress Disorder (PTSD)
claims from a regional office identified as a statistical outlier. VBA
also developed a plan to expand its Systematic Technical Accuracy
Review (STAR) quality assurance program to enable increased sampling,
expanded rating data analysis, and focused disability decision reviews.
During FY 2008, VBA plans to begin quarterly monitoring of rating
decisions by diagnostic code, complete the 2007 pilot by conducting
consistency reviews focused on Individual Unemployability claims from a
statistical outlier regional office, and increase staff to accomplish
additional STAR reviews.
Our report also identified the Rating Schedule as a contributing
factor to the subjectivity associated with the disability rating
process. The Veterans' Disability Benefits Commission was charged with
evaluating the Rating Schedule and making recommendations for changing
or updating it. We defer to the Commission's recommendations, but would
like to point out that the inconsistency in disability ratings cannot be
fully resolved until the subjectivity inherent in the Rating Schedule is
addressed.
CONCLUSION
In closing, we strongly encourage VBA to continue its efforts
toward identifying and reducing unacceptable variances. Implementation
of VBA's Consistency Analysis Study Group plan and IDA's
recommendations will assist VBA in improving the consistency of ratings
decisions. While VBA has made some progress, further efforts are needed
to monitor and measure variations in award decisions by state.
Unacceptable variations should be thoroughly evaluated to include in-
depth reviews of individual claims that deviate from expected norms.
Information obtained from these reviews should be used to improve
consistency in rating decisions nationwide. Expansion of the
responsibilities and staff of the STAR quality assurance program will
also be important to achieving greater consistency in rating decisions.
Mr. Chairman, that concludes my remarks and thank you once again
for the opportunity to discuss this important issue. Mr. Vallowe and I
would be pleased to answer any questions.
Prepared Statement of Ronald R. Aument,
Deputy Under Secretary for Benefits, Veterans Benefits Administration,
U.S. Department of Veterans Affairs
Mr. Chairman and members of the Subcommittee, it is my pleasure to
be here to discuss the Veterans Benefits Administration's (VBA)
response to the Institute for Defense Analyses' (IDA) Analysis of
Differences in Disability Compensation in the Department of Veterans
Affairs. I am pleased to be accompanied by Mr. Bradley G. Mayes, VBA's
Director of the Compensation and Pension Service. Today I will discuss
the various initiatives underway within VBA that support the
recommendations put forth by IDA to improve the quality and consistency
of disability claims processing.
Background
In December 2004, media reports identified differences in average
disability compensation payments across states. In response, the
Secretary of VA requested the Office of Inspector General (OIG) to
conduct a review of disability payments. OIG examined benefit payment
data for the six states with the lowest average payments and the six
with the highest average payments to determine the factors that
contributed to the differences. OIG's report concluded that the
factors, including demographics, were complicated and intertwined, and
recommended that VA pursue a scientific study to
further understand the influences on disability compensation payments.
In May 2005, the Department of Veterans Affairs contracted with the
Institute for Defense Analyses to better understand the potential
causes of the differences in disability payments. The IDA study was
structured to determine whether one or more variables could be
identified that correlate significantly with, and contribute to, the
variance.
Findings from IDA Study
IDA identified several major factors that individually contribute
to the observed variation in average compensation. These factors
include:
Distribution of veterans with ratings of 100%;
Types of disabilities (including PTSD and other mental
disabilities);
County of residence;
Median family income;
Percent of the population with physical or mental
disability;
Population density;
Representation by power of attorney; and
Period of service.
Other key drivers include application rates, which influence the
percentage of the veteran population receiving disability benefits, and
the percentage of beneficiaries that are military retirees.
It is important to understand that the average payments being
compared in the IDA study cover all veterans currently receiving VA
disability compensation benefits, and that the decisions that awarded
these benefits have been made over a period of more than fifty years.
The average payment for compensation recipients is therefore not
necessarily reflective of the experience of veterans currently applying
for disability compensation benefits. In order to assess differences in
VA benefits currently being awarded to recently separated veterans, VA
also looks at average payments to veterans who are added to VA's
disability compensation rolls during the year.
Based on the study results, IDA made six recommendations aimed at
critical aspects of the adjudication process they found most likely to
affect the consistency of claims determinations. The recommendations
are:
Standardize initial and on-going training for rating specialists
Standardize the hospital evaluation reporting process
Increase oversight and review of rating decisions
Consider consolidating all or selected parts of the rating process into one location
Develop and implement metrics to monitor consistency in adjudication results
Improve and expand data capture and retention
VBA Response to IDA Recommendations
I will respond to each recommendation in turn and discuss how VBA
is working to achieve the intended outcomes of that recommendation.
Standardize initial and on-going training for rating specialists
Critical to improving claims accuracy and consistency is ensuring
that our employees receive the essential guidance, materials, and tools
to meet the ever-changing and increasingly complex demands of their
decision-making responsibilities. To that end, VBA has deployed new
training tools and centralized training programs that support accurate
and consistent decision-making.
New hires receive comprehensive training and a consistent
foundation in claims processing principles through a national
centralized training program called ``Challenge.'' After the initial
centralized training, employees follow a national standardized training
curriculum (full lesson plans, handouts, student guides, instructor
guides, and slides for classroom instruction) available to all regional
offices. Standardized computer-based tools have been developed for
training decision-makers (71 courses completed and an additional 5 in
development). Training letters and satellite broadcasts on the proper
approach to rating complex issues are provided to the field stations.
In addition, a mandatory cycle of training for all Veterans Service
Center employees has been developed consisting of an 80-hour annual
curriculum.
VBA already has in place a skills-certification process for veteran
service representatives, and we are developing a skills-certification
process for rating specialists. Additionally, we have increased our
Systematic Technical Accuracy Review (STAR) staff and tasked it with
more oversight visits of our regional offices and greater
responsibilities for training our decisionmakers.
Standardize the hospital evaluation reporting process
VA has made significant progress in our efforts to standardize the
medical evaluation process. VA's Compensation and Pension Examination
Program (CPEP) continues to improve the examination process through the
use of templates, quality reports, and examiner certification.
To date CPEP has developed 58 computerized examination templates
based on associated worksheets that cover a variety of body systems and
disabilities. The templates guide the examiner through specific
examination types to ensure pertinent information is obtained and
included in the examination report. The templates have been deployed to
all VA medical care sites where Compensation and Pension Service (C&P)
examinations are conducted.
A critical component of the C&P examination process is the
examination request generated by VBA and submitted to the Veterans
Health Administration (VHA). Examination requests must properly
identify the specific examinations to be conducted and provide accurate
explanations for any medical opinions that are required. To ensure the
quality of these requests, CPEP staff review a sampling of examination
requests from all regional offices on a monthly basis.
The compensation and pension disability examination is often a key
component of the VBA disability determination process. To ensure the
quality of these reports, CPEP conducts a monthly review of a sampling
of completed exams generated by the Veterans Health Administration
(VHA) medical facilities. VHA instituted a performance measure on the
quality of C&P examinations in 2004. CPEP quality reviews are used to
calculate this performance metric. Contract examinations are subject to
internal quality reviews that parallel the CPEP process.
In FY 2008, through CPEP, VHA will implement an examiner
certification program for all examiners performing compensation and
pension disability examinations. The examiners themselves are expected
to undergo specified computerized training modules relevant to C&P
examinations and be certified to perform these disability examinations.
Our CPEP initiatives are instrumental to achieving our quality
goals. VBA and VHA continue to work together to develop and refine
tools that will ensure even greater consistency in the hospital
disability evaluation reporting process.
Increase oversight and review of rating decisions
To ensure accurate benefit decisions, VBA has established an
aggressive and comprehensive program of quality assurance and oversight
to assess compliance with VBA claims processing policy and procedures
and assure consistent application.
The Systematic Technical Accuracy Review (STAR) program includes
review of work in three areas: rating accuracy, authorization accuracy,
and fiduciary program accuracy. Overall station accuracy averages for
these three areas are included in each regional office director's
performance standards and the station's performance measures. STAR
results are readily available to facilitate analysis and to allow for
the delivery of targeted training at the regional office level. C&P
Service conducts satellite broadcast training sessions based on an
analysis of national STAR error trends. Over the last 4 years, our
quality has risen significantly from 81 percent to 89 percent.
Site surveys of regional offices address compliance with
procedures, both from a management perspective in the operation of the
service center and from a program administration perspective, with
particular emphasis on current consistency issues. Training is
provided, when appropriate, to address gaps identified as part of the
site survey.
Consider consolidating all or part of the rating process into one
location
The consolidation of specialized processing operations for certain
types of claims has been implemented to provide better and more
consistent decisions. Three Pension Maintenance Centers were
established to consolidate the complex and labor-intensive work
involved in ensuring the continued eligibility and appropriateness of
benefit amounts for pension recipients. We are exploring centralization
of all pension adjudications in these Centers.
In November 2001, a Tiger Team was established at the Cleveland
Regional Office to adjudicate the claims of veterans age 70 and older.
VBA also established an Appeals Management Center to consolidate
expertise in processing remands from the Board of Veterans' Appeals. In
a similar manner, a centralized Casualty Assistance Unit was
established to process all in-service death claims. VBA also
established two Development Centers in Phoenix and Roanoke to assist
regional offices in obtaining the required evidence and preparing cases
for decision, and centralized the processing of all radiation claims to
the Jackson Regional Office.
The Benefits Delivery at Discharge (BDD) Program provides
servicemembers with briefings on VA benefits, assistance with
completing applications, and a disability examination before leaving
service. The goal of this program is to deliver benefits within 60 days
following discharge. VBA has consolidated the rating aspects of our BDD
program to two rating sites, which will bring greater consistency of
decisions on claims filed by newly separated veterans.
We continue to look for ways to achieve additional organizational
efficiencies through the consolidation of other aspects of our claims
processing, including death benefits, fiduciary activities, and
telephone service.
Develop and implement metrics to monitor consistency in adjudication
results
In addition to conducting quality reviews, C&P Service's STAR staff
are beginning to conduct analyses to identify unusual patterns of
variance in claims adjudication by diagnostic code, and then review
selected disabilities to assess the level of decision consistency among
and between regional offices. These studies are used to identify where
additional guidance and training are needed to improve consistency and
accuracy, as well as to drive procedural or regulatory changes.
Improve and expand data collection and retention
VBA's data management systems have been substantially improved in
recent years with such programs as the VETSNET suite of applications
and the establishment of our data warehouse. VETSNET and the analytical
tools in our data warehouse provide our employees and managers with
more robust data, which better support information management and
analysis.
Mr. Chairman, this concludes my testimony. I would be pleased to
answer any questions you or other members of the Committee may have.
Statement of Steve Smithson, Deputy Director,
Veterans Affairs and Rehabilitation Commission, American Legion
Mr. Chairman and Members of the Subcommittee:
Thank you for this opportunity to present The American Legion's
views on disability claims ratings and benefits disparities within the
Department of Veterans Affairs (VA) Veterans Benefits Administration
(VBA). The American Legion commends the Subcommittee for holding a
hearing to discuss this important and timely issue.
May 2005 VA Office of the Inspector General Report
In response to a December 2004 Chicago Sun-Times article revealing
disparities in VA disability compensation payments on a state-by-state
basis, the Secretary of VA ordered the VA Office of the Inspector
General (VAOIG) to investigate the matter. On May 19, 2005, the VAOIG
issued a report addressing the reasons for differences in average
monthly VA disability compensation made to veterans living in different
states.
The VAOIG noted that for fiscal year (FY) 2004, average annual
payments by state ranged from $6,961 to $12,000, a difference of over
$5,000. According to the VAOIG the highest paying states were: New
Mexico (the highest), Maine, Arkansas, West Virginia, Oklahoma, and
Oregon. The lowest paying states were: Indiana, Michigan, Connecticut,
Ohio, New Jersey, and Illinois. The VAOIG concluded that no single
variable factor was responsible for the discrepancies in compensation
payments.
The VAOIG found that there were sixteen possible factors that could
cause compensation payment disparities. In its analysis, the VAOIG
concluded that there were ten factors that the VA could not control and
there were six factors over which the VA could exert some control.
According to the VAOIG, the factors that the VA cannot control are:
power of attorney representation, enlisted versus officer, military
retirees versus non-military retirees, participation of veterans
receiving benefits, period of service, branch of service, dependents,
special monthly compensation, age, and the average number of
disabilities. The six factors that the VAOIG indicated the VA has some
control over are: pending claims, brokered claims, appeal rates,
transferred cases, grant rates, and rater experience.
Finally, the VAOIG stated that some disabilities are inherently
more susceptible to variations in rating determinations. The VAOIG
indicated that the Rating Schedule (38 C.F.R. Part 4), because it is a
60-year-old model, may also cause some inconsistencies. The VAOIG
identified post traumatic stress disorder (PTSD) evaluations, total
disability based on PTSD (including individual unemployability or IU),
and all veterans rated with IU as rating decisions susceptible to
variations.
The VAOIG focused on mental disabilities for several reasons:
mental disabilities have a high variable rate (compared to the
other parts of the body systems evaluated by the Rating Schedule);
mental disabilities have the highest average evaluation (58 percent);
and PTSD, which is a mental disability, is one of the fastest growing
service-connected disabilities.
The VAOIG reviewed 2,100 PTSD cases at seven regional offices (RO).
They found that the ROs approach stressor verification requirements
differently from state to state. In particular, there were differences
in how the ROs verified veterans' allegations about traumatic events in
service. The VAOIG also found that, in general, once veterans with PTSD
obtained a 100 percent evaluation, their receipt of mental health
treatment declined.
The VAOIG noted that there were several instances of benefits fraud
in the past few years. It was stressed that based on an income match,
8,486 veterans in receipt of IU benefits reported earned income to the
Internal Revenue Service (IRS). The VAOIG indicated that some or all of
the 8,486 veterans in receipt of both IU benefits and earned income may
not be entitled to IU benefits.
The VAOIG also surveyed 1,992 rating specialists and Decision
Review Officers (DROs) and 1,349 responded. The relevant results
indicate:
65 percent stated they did not have enough time to
provide timely and quality service;
57 percent indicated that they had difficulty meeting
production standards if they took time to adequately develop claims and
thoroughly reviewed the evidence before making a decision;
41 percent declared that 30 percent or more of the claims
they decided were not ready to rate when presented for rating;
20 percent estimated that of the claims not ready to rate
more than 10 percent were actually rated without all the needed
information; and
52 percent responded that they could assign two or more
different ratings for the same medical condition.
The May 2005 VAOIG report contained the following recommendations:
1. Conduct a study to detect and correct unacceptable payment
patterns.
2. Work with the Veterans' Disability Benefits Commission to
clarify and revise the rating schedule.
3. Conduct a review of rating practices for certain disabilities
such as PTSD and IU.
4. Expand national VA quality review to include review of PTSD
evaluations for consistency, and to determine if the stressor was fully
documented.
5. Coordinate with the Veterans Health Administration to improve
the quality of medical examinations.
6. Ensure that VA regional offices are adequately staffed and
equipped.
7. Consider establishing a lump-sum payment option in lieu of
recurring monthly payments for veterans with disability evaluations of
20 percent or less.
8. Analyze differences in claim submission patterns to determine
if certain veteran sub-populations, such as World War II veterans or
veterans living in certain areas, have been underserved and perform
outreach based on the results of the analysis.
For years, The American Legion and other veterans service
organizations (VSOs) have stated that the driving force behind most VA
adjudications is the need for VA to process as many claims as possible
in the fastest possible time. This emphasis on quantity and speed of
adjudication results in premature adjudications, improper denials of
benefits, and of course, inconsistent decisions.
The VAOIG report confirms much of what we have been saying about
the VA claims adjudication process. Essentially, the VAOIG acknowledges
that because the VA often does not take the time to obtain all relevant
evidence and information, there is a good chance that these claims are
not properly adjudicated. The VAOIG, to its credit, quoted raters and
DROs who indicated that VA management is much more concerned with
quantity than quality. Some VA adjudicators stated that awards and
bonuses are centered around production. The report, however, did not
mention that in most claims where the VA does not obtain all relevant
information, the claim is denied or under-evaluated.
The overall tone of the VAOIG report was disappointing. It implied
that where the VA fails to develop claims properly, there are only
improper grants of benefits. The VAOIG ignored the fact that many
deserving veterans have their claims denied or under-evaluated because
the VA, in a rush to claim work credit, failed to, or refused to,
comply with the duties to assist and notify. Although the VAOIG
conceded that VA often makes errors, it failed to consider or discuss
whether these errors could result in the unlawful denial of benefits or
the under-evaluation of service-connected disabilities.
This negative tone exists throughout the VAOIG report. For example,
when discussing the differences between adjudications in New Mexico and
Illinois, the VAOIG noted that New Mexico had the highest average
monthly VA disability compensation payments at $11,206. The VAOIG
indicated that the high New Mexico payments ``may be a cause for
concern.'' The VAOIG, however, did not express any concern about the
low paying ROs. Apparently, the possibility that some veterans may be
underpaid or unfairly denied did not alarm the VAOIG.
The VAOIG also attacked the current rating schedule as ``a 1945
model that does not reflect modern concepts of disability'' even though
most of the major body systems have been updated in the last 20 years.
Also, it did not define the term ``modern concepts of disability'' and
did not explain why the current rating schedule would cause
inconsistent payments.
According to the VAOIG, whether a veteran was represented by a VSO
was the single most important factor in determining the amount of
compensation payments made to that veteran. The VAOIG reported that on
the average, veterans who are represented by a VSO, receive $6,225 more
per year than those veterans without representatives. This is a telling
statistic. VA operates a disability benefits program that is required
to be non-adversarial and ex parte. (See 38 C.F.R. Sec. 3.103(a).) The
huge disparity between non-represented veterans and represented
veterans supports the conclusion that VA's claims adjudication system
is more adversarial than VA cares to admit.
Additionally, the VAOIG report appears to assume that the states
with high levels of compensation payments are doing something wrong.
The VAOIG apparently did not consider that the states paying a high
level of benefits are making correct legal decisions--doing a better
job than the states with low levels of payments. The American Legion
asserts that it is quite possible that some, if not all, ROs are
incorrectly denying a considerable number of claims for compensation
and under-evaluating some service-connected conditions. We believe
there are more veterans being unfairly denied benefits and underpaid
benefits than there are veterans who are being unfairly granted
benefits and/or overpaid benefits.
This conclusion is based on the following fact. In the past few
years The American Legion has jointly reviewed the quality of
adjudications in approximately 40 ROs. Our quality review team has
found errors in all of the VA offices reviewed, including the regional
office in New Mexico. For example, the review of the VA regional office
in New Mexico generated the following comments.
Some of the New Mexico rating decisions reviewed by The American
Legion team exhibited lack of knowledge or carelessness. For example:
In some instances the RO incorrectly denied service
connection for a congenital disease because the RO misinterpreted 38
C.F.R. Sec. 4.9.
In some instances the Global Assessment of Functioning
(GAF) score was ignored.
The effective dates assigned for individual
unemployability (IU) created problems. According to an RO official, the
RO assigned an effective date from the receipt of the VAF 21-8940--
instead of the date of the informal claim for IU. The official stated
this was a recurrent problem in this RO.
Some VA examinations were inadequate.
Some ratings concerning claims for increase should have,
but did not, consider 38 C.F.R. Sec. 3.400(o)(2).
Some inferred issues were either missed or ignored.
The rules concerning new and material evidence were not
correctly applied. In some instances, special monthly pension (SMP) was
not correctly considered or improperly rejected.
In some cases, the RO issued confusing and misleading
development and notice letters.
In some instances the RO failed to clarify the appellate
process to veterans who clearly were confused.
Many of the types of errors identified in New Mexico were similar
to the errors that we found in low paying ROs like Chicago. If the New
Mexico RO, the highest paying office according to the VAOIG, exhibited
these underpayment and improper denial problems, it is possible that
all VA ROs under-compensate some claimants to various degrees. The
VAOIG never considered this possibility. In fact, all ROs reviewed by
The American Legion's quality review team exhibited patterns of
improper denial and underpayment. Of course, some ROs exhibited much
better quality than other ROs.
Also, in FY 2007 the Board of Veterans' Appeals (BVA or Board)
remanded or reversed 56 percent of the appeals it reviewed. It is very
unlikely that any of those remands or reversals involved overpayments
of benefits or the improper grant of service connection. The BVA
reversal/remand rate reveals that ROs commit many errors adverse to
veterans.
In spite of the inescapable fact that there is a serious quality
problem within the ROs that unfairly deprives many deserving veterans
of VA benefits, the VAOIG did not mention or even allude to this
situation. This omission is a disservice to veterans and casts doubt on
most of the VAOIG conclusions.
Institute for Defense Analyses (IDA) Report
In response to the VAOIG's recommendation, VA contracted IDA to
conduct a study in order to gain a better understanding of the
potential causes of the variances in disability payments.
The IDA offered six recommendations for improving the consistency
of VBA's claims adjudication process:
Standardize initial and on-going training for rating
specialists.
Standardize the hospital evaluation reporting process.
Increase oversight and review of rating decisions.
Consider consolidating all or selected parts of the
rating process into one location.
Develop and implement metrics to monitor consistency in
adjudication results.
Improve and expand data capture and retention.
The American Legion agrees with IDA's recommendation to increase
VBA oversight and review of RO rating decisions. We also note that this
recommendation specifically stated that denied claims should also be
reviewed, something the VAOIG did not consider in its investigation and
subsequent report.
Regarding its training recommendation, IDA noted that although VBA
provides centralized training modules, many regional offices supplement
this training with locally developed material. IDA also noted that many
rating specialists interviewed stated that they received ``on-the-job''
training from senior raters and identified these individuals as the
biggest influence on their rating styles. IDA suggested that a
``stronger mechanism'' would reduce the potential for persistent
differences among regional offices in ratings and ensure that raters
VA-wide receive the same training. IDA
further recommended that raters be given standardized test cases,
reflecting the most likely areas of variation, as part of an ongoing
training process.
The American Legion is appreciative of the importance the Under
Secretary for Benefits has placed on training of VBA personnel. We are
also aware of the centralized training program that has been
implemented; however, a national training standard/requirement, in
addition to the centralized training conducted by Compensation and
Pension Service (C&P), for regional office personnel is also needed.
Consistent and standardized training at each regional office must take
place for all personnel--experienced and new hires alike. The American
Legion believes it is crucial that such a program be implemented and
closely monitored for compliance by the Under Secretary for Benefits.
Management in stations not in compliance with such training
requirements must be held accountable; otherwise any national or
centralized training effort will not be successful.
Additionally, The American Legion believes it is essential to
proper training that information (reasons for remand or reversal) from
BVA decisions, Court of Appeals for Veteran Claims decisions, DRO
decisions and errors noted in the National Systematic Technical
Accuracy Review (STAR) be tracked and examined for patterns. This
information should then be analyzed by VBA and provided to ROs in
mandatory formal training to ensure that common errors and other
discrepancies occurring in regional office rating decisions are not
repeated. This information should also be used for remedial training
purposes when patterns of errors are identified for specific
individuals. Although such data is currently being collected and
disseminated to the ROs, it appears that consistent utilization of this
data in regular formalized and specific training has been lacking.
Unless ROs (both managers and individual adjudicators) learn from their
mistakes and take corrective action, there will continue to be a high
rate of improperly adjudicated claims, resulting in a consistently high
appeals rate and subsequent high BVA remand/reversal rate of RO
decisions.
In addition to our training-related concerns discussed above, we
also have concerns regarding VA's skill certification testing program
to ensure competency and proficiency. C&P conducted an open book
(pilot) job skill certification test for veterans service
representatives (VSR) several years ago in which the pass rate was
extremely low (approximately 23 percent). Even more alarming than the
low test scores was the fact that those who took the test had several
years of experience in the position and were considered to be
proficient.
C&P subsequently finalized its VSR proficiency test and conducted
tests in May and August 2006. Employees participating in the testing
underwent 20 hours of training prior to taking the test. Although the
pass rate (about 42 percent) for these tests was much higher than the
pilot test, it is still very low and can hardly be considered
acceptable. C&P did not conduct any tests in FY 2007.
The American Legion applauds the new testing program as a step in
the right direction, but we still have concerns. Although successful
completion of the test will be required for promotion or assignment to
a rating board, it is not mandatory as a condition of employment in
that position and is completely optional. C&P is in the process of
developing a test for rating veterans service representatives (RVSR)
and DROs, but a timeline for completion or implementation has not yet
been determined. Unfortunately, like the VSR test, the test for RVSRs
and DROs will not be mandatory as a condition of employment.
The ultimate goal of proficiency or competency testing should be to
ensure that an individual in any given position is competent,
proficient, and otherwise qualified to perform the duties required of
that position. This goal will not be achieved if testing is not
mandatory, is not provided for all levels and positions, or is not
backed by required remedial training or other corrective action for
those who do not pass the test. Although this concept may not be
embraced by some, the ultimate goal is to have qualified and competent
staff who will provide the best service possible for America's
veterans.
Lastly, The American Legion opposes IDA's recommendation supporting
rating consolidation. It is likely that some VA managers also like the
idea of consolidation because of the economic advantage to the VA. It
is cheaper to have 10 or 16 offices than to pay for 57 regional
offices. However, in our experience, many of the bigger VA offices have
more quality problems than the smaller ROs. The American Legion's
quality reviews show that placing raters and DROs under the same roof
does not mean they will all rate claims consistently. Also,
consolidation, especially consolidation in low cost of living rural
areas, would hamper access to the VA regional offices for many
veterans, especially low income and minority veterans. Obviously, that
is not a good thing.
Closing
In closing, The American Legion recommends increased oversight by
VBA as well as more frequent transferring of RO service center managers
in order to create a ``national'' culture to avoid regional differences
and biases. We also recommend the establishment of an independent
quality review program with accountability built in for managers and
adjudicators. Additionally, until substantive changes are made in the
work measurement system, a piecemeal ``band-aid'' approach will not
make a major difference. The creation of a work measurement system that
rewards prompt, but fair and complete adjudications would improve
consistency and quality. Such changes would be the fastest, least
expensive way to make the biggest positive impact on the VA's claims
adjudication system.
Mr. Chairman, that concludes my statement. The American Legion
welcomes the opportunity to work closely with you and your colleagues
on this and any other issue that concerns this nation's veterans.
Statement of Donald R. Lanthorn, Department Service Director,
American Legion, Department of Ohio
Mr. Chairman, members of the Committee. My name is Donald R.
Lanthorn. I am the Service Director of The Ohio American Legion, a
position I have held for 30 years. I appreciate the opportunity to
provide my personal perspective as to why Ohio is last among the fifty
states in VA benefit dollars received per compensated claimant.
The issue, in my opinion, is multi-faceted and quite complex, with
origins back to World War II in some areas. If the purpose is to lay
blame, there is plenty to go around. I will address fault on the part
of the Department of Veterans Affairs, both VBA and VHA; the State of
Ohio; County Veterans Service Officers, their Commissioners, and the
State Associations of both; and Veterans Service Organizations, which
are not without culpability.
However, fault may not be as much of an issue as one may surmise,
and in the May 19, 2005 ``Review of State Variances in VA Disability
Compensation Payments'' report of the Department of Veterans Affairs,
Office of the Inspector General, it is noted, in referring to the dollar
averages of the clusters of the six highest and lowest ranked states,
that ``Preliminarily, this suggests that the high cluster may be more
problematic than the lower ranked states.''
There are several factors in determining compensation received by
Ohio Veterans that are the fault of no one.
The IG Report analyzed states by high and low clusters of six
states each. Ohio is in the low cluster with Indiana, Michigan,
Connecticut, New Jersey, and Illinois. New Mexico, Maine, Arkansas,
West Virginia, Oklahoma, and Oregon comprised the high cluster states.
It was noted in the DVA IG Report of May 19, 2005 that demographics
play a part in the disparity.
Average VA compensation for military officers is less than that for
enlisted personnel; hence states with more veterans who served as
officers would likely have a lower average VA compensation. (High
cluster states have 63.4% enlisted personnel receiving VA compensation,
compared to an average of 44.4% in the low cluster states.) Military
retirees receive more compensation than their peers who did not retire
from the military. (High cluster states have eleven percentage points
more retirees receiving compensation, 27.6% versus 16.6%.)
Period of Service is a factor in computing average compensation.
Vietnam service veterans receive higher amounts, followed by Korean
War, World War II and Peacetime veterans. Gulf War veterans receive
less VA compensation on average than other periods of service. The
numbers of veterans from each state and percentage of veteran
population make this a no fault demographic statistic factor. (High
cluster, 13% WWII; low cluster averages 23%.)
Further analysis by Branch of Service indicates that Marine Corps
veterans receive the highest average amount of VA compensation.
Veterans with dependents receive a higher average amount of VA
compensation per year than their peers without dependents. (High
cluster averaged 43.8% to low cluster of 30.3% of veterans with
dependents.)
Age of recipients is a factor. The average age of the high cluster
states recipient was 58 compared to 61 in the low cluster. This
suggests younger veterans receive more compensation, but may more
closely relate to periods of service, indicating fewer WWII or higher
numbers of Vietnam veterans, by percentage, among VA compensation
recipients in the high cluster states.
The more service-connected disabilities a veteran has, the higher the
VA combined rating for compensation. The high and low clusters averaged
3.0 and 2.4 disabilities, respectively, a 25 percent difference.
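[Editor's note: Because VA combines multiple ratings on an efficiency basis under 38 C.F.R. Sec. 4.25 rather than simply adding them, each additional service-connected disability raises the combined rating, though with diminishing returns. The following is a rough illustrative sketch only; it approximates the combined ratings table by rounding intermediate values to the nearest whole percent, and rounds the final value to the nearest degree divisible by 10.]

    def combined_rating(ratings):
        """Approximate a VA combined rating (38 C.F.R. Sec. 4.25).

        Ratings are combined in descending order against the remaining
        (non-disabled) efficiency; the final value is rounded to the
        nearest degree divisible by 10. Illustrative approximation only,
        not the official combined ratings table.
        """
        combined = 0
        for r in sorted(ratings, reverse=True):
            # Each new rating applies only to the remaining efficiency.
            combined = round(combined + (100 - combined) * r / 100)
        return int(round(combined / 10.0) * 10)

    # A veteran with three modest disabilities generally combines higher
    # than one with two.
    print(combined_rating([30, 20]))      # 44 -> 40
    print(combined_rating([30, 20, 10]))  # 50 -> 50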
This difference is especially significant if one accepts the premise
that veterans who file their own claims file for fewer disabilities
than those who file with a veterans service organization (VSO) or have
advocacy representation. It is generally accepted that VSOs recognize
secondary conditions the veteran may not, and they review the service
medical records, which provide a more accurate list of possible
service-connected conditions than the veteran's recollection. Hence,
this supports the IG report's finding that veterans who receive legal
help or aid from advocacy groups receive on average $11,162, compared
with $4,728 for those who go it alone. Nationally, about two-thirds of
claimants receive VSO assistance; however, reportedly forty percent of
Ohioans file their own claims. This is a factor that can and should be
addressed, and addressing it will result in increased federal dollars
for Ohio claimants, on average.
The Institute for Defense Analyses (IDA) final report is a
scientific study of state-by-state and VA Regional Office variation in
disability compensation claims, ratings and benefits. We certainly
concur with their findings that 100 percent ratings and Individual
Unemployability (IU) are the most significant factors affecting total
payments. Although these veterans represent only 17% of compensation
recipients, they account for 58% of total compensation payments. IU,
and often a 100% disability rating, can be subjective. These
differences alone reportedly were found to explain the vast majority of
the variation in average awards across states.
It is also our opinion that ``new'' or less experienced
adjudication personnel would be less likely to make subjective
decisions awarding the highest levels of compensation benefits. We will
address this further later in this testimony.
In another area of the IDA study it was noted that military
retirees are over four times as likely to receive compensation as non-
retirees.
Ohio has only one major military installation, and a notable lack of
state incentives for retaining military retirees in Ohio may account
for a considerable loss of benefit dollars, as the IDA study attributes
over 40 percent of the variation in the percentage of veterans
receiving compensation to military retirees alone.
The IDA study identifies the ``key driver'' in the variation across
states of veterans receiving compensation as the ``application rates.''
Several studies, including the IDA report, addressed consistency
across VA Regional Offices and the potential for inconsistencies. We
concur with the VA position that if VA addresses accuracy in the
decision making process, consistency will take care of itself. We do
support the recommendation of standardized initial and ongoing training
for rating specialists.
The IDA report recommends standardized hospital evaluation
reporting. For several years The Ohio American Legion would return
files to the adjudication officers at Cleveland VARO as ``insufficient
examination'' for PTSD exams where the Global Assessment of
Functioning (GAF) score did not match the doctor's list of
symptomology. Rating specialists would use the lesser of the two if we
did not, resulting in less VA compensation. We often wondered what
happened to those claims without advocates to which we did not have
access.
In recent years we send back far fewer for new exams. Have the
exams gotten better? Are the doctors more thorough? We doubt it.
We suspect that when a claimant has representation that may find an
exam suspect, the adjudicator gives the examiner an opportunity to ``fix
it.'' Few doctors would remember the patient well enough to add
symptomology to their first report. It is our belief that they adjust
the GAF score to be consistent with the earlier reported symptomology,
which was insufficient to justify the assigned score. This would result
in lower compensation ratings.
We earlier alluded that Ohio's rating specialists are ``new'' or
``less experienced.'' As a historical perspective, in 1945 VA ``geared
up'' to handle the wave of incoming World War II claims from returning
veterans newly offered education benefits, home loan guarantees, and
disability benefits, much of it a result of the GI Bill.
The class of '45 was born.
Thirty years later in 1975, as these employees were completing
their federal service, they were replaced with another wave of
personnel, many of whom were Vietnam veterans themselves. Again, VA
``geared up'' to address the needs of this group of returning veterans.
What is different in Ohio in 2005 as the ``Class of '75'' finished
their 30 years of federal service?
In 2005 VA was in the midst of a 2004-2006 hiring freeze. Key
adjudicative positions went unfilled in some instances, filled with
lower level employees in others. VA was also the victim of its own
failed hiring practices of earlier years, when efforts were made to
hire attorneys and nurses whom it was unable to retain.
In Cleveland, Ohio VA created a ``Tiger Team,'' a force of senior
adjudicators formed by Central Office directive to address the older
claims of aged veterans. They developed processing Memorandums of
Understanding with other government agencies and excel at handling the
claims of our World War II veterans and those claims over a year old
from around the country.
As beneficial to the Nation as it is, the Tiger Team represents a
significant brain drain in Cleveland's adjudication ranks.
If the number of claims filed is the driving factor in determining
state rank, Ohio lacks a single, consistent message to veterans
regarding the claims process. Each County Veterans Service Commission
and Veterans Service Organization operates independently and within
their own budget constraints. Few counties do any outreach, and since
their funding comes from the inside millage, there is little incentive
for county officials to urge greater expenditures in promoting their
offices, as one of their services is financial assistance to veterans
and their dependents and survivors.
County Veterans Service Officers and County Veteran Service
Commissioners now receive their training from the Governor's Office of
Veterans Affairs, which utilizes VA personnel at no cost. Several years
ago VSOs provided the training, which emphasized advocacy tips. VA
training may be fine for most areas of service, but we liken it to
learning how to duck hunt from a duck. It should not be the lone source
of trainers.
Many CVSOs recognize the need for other sources of training in
their desire for professional excellence and belong to the National
Association of County Veteran Service Officers. However, their training
sessions are often held in resort areas and participation is restricted by
their employers, members of the Ohio State Association of County
Veterans Service Commissioners. VSCs receive all of their required
training in Ohio and unfairly expect the same of their CVSO employees.
Veterans Service Organizations (VSOs) have long been a source of
outreach to veterans through local Posts, State Service Officers, house
organs at the local and state levels, and the distribution of pamphlets
and benefit information.
This changed in Ohio as the state's appropriations to VSOs grew
slowly, then flatlined for several recent years. VSO appropriations are
given the misnomer of ``subsidy,'' leading one to believe that the
State of Ohio is subsidizing VSO operations, when in fact VSOs are
subsidizing a state function in Ohio, where we have no state Department
of Veterans Affairs to file claims and provide claimants with
representation.
The flatlined revenue from the State of Ohio came at a most
inopportune time for The American Legion, as World War II veteran
deaths were on the rise, followed by membership declines and subsequent
lost revenue. Publications were cut back or curtailed, employees in our
Service Division were eliminated by attrition and wages and benefits
suffered for those remaining. The American Legion Service Division,
once 15 full-time employees, is now 10.5 Full-Time Employee Equivalents
(FTEE).
Clerical personnel have been replaced by claims representatives
using computers to do their own letters, reports, and ``status
updates'' to inquiring claimants. An increasing VA backlog causes
increasing status inquiries, and the spiral goes on. Time spent filing
claims and providing advocacy representation is often now directed to
other matters. Outreach is no longer a goal, as increasing the workload
is not an objective of an over burdened, underpaid staff. Meeting
deadlines has become the area of emphasis.
In conclusion, Ohio's woes can be addressed quite simply. Although
the variances in demographics may never put Ohio at the top of the list
of benefits by state, our problem areas can be resolved by the infusion
of federal and state dollars.
VBA needs to increase its adjudication staff and attract some
experienced adjudicators to Cleveland who can be effective now, rather
than only after extensive training. Making the ``Tiger Team'' an advancement
desirable to adjudicators around the country in salary, benefits and
workload would go a long way in attracting bidding on vacancies from
outside of Cleveland.
VHA exams need to be thorough and complete, and re-done if they are
not. VA's work measurement system of ``End Products'' rewards ROs for
work reported, not for accuracy or correctness. This is a situation
that needs to be addressed, but it is not unique to Ohio.
The State of Ohio needs to centralize its veterans programs in one
department, an Ohio Department of Veterans Affairs (ODVA). Governor
Strickland, by Executive Order, has created a Veterans Study Council to
investigate this and report to him by year's end.
The Veterans Study Council is comparing the benefits available in
other states with those available in Ohio. As noted in the IDA report,
attracting military retirees back to their roots in Ohio, or retaining
those separating from military service at a last duty station there,
would raise the compensation average significantly.
Increased appropriations to VSOs will go a long way in serving
veterans. The marketing and outreach by CVSOs or an ODVA would be a
wasted effort if VSOs were not prepared at their link in the chain to
provide needed services.
The Ohio State Associations of County Veterans Service Officers and
Commissioners (OSACVSO & OSACVSC) receive a state appropriation for
training. It can be well spent on trainers from beyond VA ranks or
sending CVSOs to VSO training programs or the NACVSO schools.
VSOs need to prepare for increased workloads. The Ohio American
Legion is addressing salaries and staffing levels as well as we can
with available resources. Post Service Officers continue to train to
identify potential beneficiaries of VA benefits and get them to claims
filing professionals, most often their CVSO.
Ohio has the infrastructure for excellent service to veterans, but
its loose knit organization has not served it well during trying
economic times.
Piecemeal legislative efforts by well-meaning legislators need to
be coordinated under a Department of Veterans Affairs and directed into
one omnibus legislative bill to correct Ohio's problem areas.
Thank you for this opportunity to present my perspective on why Ohio
trails other states in average compensation benefits per veteran and
what is needed to address it.
Statement of Hon. Charles A. Wilson,
a Representative in Congress from the State of Ohio
Chairman Mitchell, thank you for providing me the opportunity to
participate in today's hearing on this important topic. While I was
prevented from attending the hearing in person because of a recent
surgery, I am very grateful for the committee's attention to the
discrepancies among states in average benefits paid to disabled
veterans.
Like many members of this subcommittee, I was disturbed to learn
that the level of benefits paid to a disabled veteran seems to depend
in part on the state in which that veteran resides. While some
variation may be expected, the discrepancy seems too large to be
explained fully by natural or demographic factors.
I am convinced that the federal employees responsible for
determining a veteran's level of disability are dedicated public
servants who keep at heart the interests of the veterans they serve.
Despite this, it seems likely that different Veterans Administration
Regional Offices have developed unique cultures that have an effect on
the level of benefits that they award. I believe that this is an
unacceptable state of affairs, and is not fair to veterans who have the
right to expect that their claims will be decided impartially and
according to statute.
As a representative from Ohio, I was dismayed to learn that my
state ranked dead last in the average benefit paid to its disabled
veterans. Ohio veterans, who have made the same sacrifices as veterans
from every other state, may feel that the system is slanted against
them. I do not believe that the Veterans Administration can afford to
allow this situation to breed cynicism among the veterans who have
sacrificed so much for this nation.
While the Veterans Administration has taken some steps to correct
this situation, I believe that more aggressive action should be taken.
I commend Chairman Mitchell and Ranking Member Brown-Waite for calling
this hearing to bring some much-needed attention and oversight to
efforts to level the playing field for veterans in every state. I would
also like to thank Congressman Space for his active leadership on this
issue. I thank the distinguished witnesses for their testimony, and
look forward to working to solve this problem as quickly as possible.
Committee on Veterans' Affairs
Subcommittee on Oversight and Investigations
Washington, DC.
November 2, 2007
Honorable Gordon H. Mansfield
Acting Secretary
U.S. Department of Veterans Affairs
810 Vermont Avenue, NW
Washington, DC 20420
Dear Secretary Mansfield:
On Tuesday, October 16, 2007, the Subcommittee on Oversight and
Investigations of the House Committee on Veterans' Affairs held a
hearing entitled Disability Claims Ratings and Benefits Disparities
within the Veterans Benefits Administration.
During the hearing, the Subcommittee heard testimony from Ronald R.
Aument, Deputy Under Secretary for Benefits, Veterans Benefits
Administration, U.S. Department of Veterans Affairs. He was accompanied
by Mr. Bradley G. Mayes, Director, Compensation and Pension Service,
Veterans Benefits Administration, U.S. Department of Veterans Affairs.
As a follow-up to that hearing, the Subcommittee is requesting that the
following questions be answered for the record:
1. Please explain the Institute for Defense Analyses' (IDA) finding
regarding attorney representation. The Subcommittee is concerned that
the findings indicate that if veterans hire attorneys, veterans will
derive a more favorable outcome for their claims. Does the VA agree
with this impression? If not, please give your reasons.
2. Has the VA ever outsourced claims for purposes of adjudication?
If so, please state when this occurred, the number of claims so
outsourced, the reason for the outsourcing, and the oversight controls
VA implemented to assure the consistency and accuracy of the outsourced
adjudications.
3. Please describe how VA is implementing IDA's training
recommendations. Please address specifically how VA's training efforts
differ from those in the past and provide details about that training;
for example, title and brief description of training courses; whether
the training is mandatory or not; personnel required to take a
particular training module; whether the training includes a testing
requirement to ensure that trainees have assimilated the materials.
4. Are VBA personnel who adjudicate claims required to have
professional or other certification? If not, please explain why
certification is not required and whether VBA plans to require
certification in the future. If certification is required, please
describe the required certification, how VBA ensures that its personnel
have the necessary certification(s), and the consequences to VBA
personnel who do not obtain required certifications.
5. With respect to the STAR reviews that are conducted each year,
how many STAR reviews are taking place, and what are the outcomes of
each of the reviews?
6. The IG, GAO, and IDA have all noted that VBA has tested for
accuracy of claims adjudication but not for consistency across offices.
What is VBA doing to remedy this defect?
7. If the Veterans Benefits Administration is unable to get
information relating to a service member's in-theatre service directly
from the Department of Defense to verify stressors contributing to
PTSD, what alternate sources are being used to verify stressors when
validating a claim in the ratings process?
We request you provide responses to the Subcommittee no later than
close of business on Friday, November 30, 2007.
If you have any questions concerning these questions, please
contact Subcommittee on Oversight and Investigations Staff Director,
Geoffrey Bestor, Esq., at (202) 225-3569 or the Subcommittee Republican
Staff Director, Arthur Wu, at (202) 225-3527.
Sincerely,
HARRY E. MITCHELL
Chairman
GINNY BROWN-WAITE
Ranking Republican Member
Questions for the Record
Hon. Harry E. Mitchell, Chairman
Ginny Brown-Waite, Ranking Republican Member
Subcommittee on Oversight and Investigations
House Committee on Veterans' Affairs
October 16, 2007
``Disability Claims Ratings and Benefits Disparities within the
Veterans Benefits Administration''
Question 1: Please explain the Institute for Defense Analyses' (IDA)
findings regarding attorney representation. The Subcommittee is
concerned that the findings indicate that if veterans hire attorneys,
veterans will derive a more favorable outcome for their claims. Does
the VA agree with this impression? If not, please give your reasons.
Response: The IDA study contained no findings specific to
disability claims outcomes for veterans represented by attorneys. What
it did find was that claimants who were represented by attorneys,
veterans service organizations, and claims agents received, on average,
higher compensation payments than those without representation. IDA's
comparison was between veterans with any representation (i.e. national
and state veterans service organizations, attorneys, and agents) and
claimants without such assistance. The Department of Veterans Affairs
(VA) does not agree with the position that veterans with attorney
representation will have greater prospects for a successful claim. The
overwhelming majority of beneficiaries are capably represented by
national and State veterans service organizations that perform their
services without charge. Currently, paid attorneys and agents represent
a very small percent of claimants, although we expect that percent to
rise based on the legislation enacted last year to allow attorney
representation at the notice-of-disagreement stage in the adjudicative
process.
The claims process can be complex. We believe that claimants may
find it helpful to seek the assistance of a national or State service
organization, which provides its services free of charge, or of
individuals recognized by VA to provide such assistance. We routinely
provide claimants with information about representation. We believe
that the free services of national and State veterans service
organizations provide the level of counsel needed in virtually all
cases.
Question 2: Has the VA ever outsourced claims for purposes of
adjudication? If so, please state when this occurred, the number of
claims so outsourced, the reason for the outsourcing, and the oversight
controls VA implemented to assure the consistency and accuracy of the
outsourced adjudications.
Response: The Veterans Benefits Administration (VBA) has never
contracted with any non-government entity to adjudicate claims for VA
disability compensation benefits.
Question 3: Please describe how VA is implementing IDA's training
recommendations. Please address specifically how VA's training efforts
differ from those in the past and provide details about that training;
for example, title and brief description of training courses; whether
the training is mandatory or not; personnel required to take a
particular training module; whether the training includes a testing
requirement to ensure that trainees have assimilated the materials.
Response: IDA recommended VA standardize initial and ongoing
training for rating specialists. VBA has a standardized training
curriculum for all rating veterans service representatives (RVSRs). All
new RVSRs are required to attend 3 weeks of national, centralized
training. VBA provides regularly recurring centralized training
sessions for newly appointed RVSRs. Topics covered during centralized
training include general rating policies as well as specific rating
policies related to the different body systems. Before and after
attending centralized training, new RVSRs follow a prescribed
standardized training schedule to include the use of computer-based
training and performance support system (TPSS) modules. TPSS modules
include tests to ensure that students assimilated the materials.
Each year all RVSRs are required to complete at least 80 hours of
training. Training topics are derived from a standardized RVSR
curriculum that is available on the Compensation & Pension Web site
(copy of curriculum is enclosed). VBA implemented the annual 80-hour
requirement in 2006. For fiscal year (FY) 2008, all RVSRs are required
to complete the new TPSS module on rating post-traumatic stress
disorder (PTSD) claims.
Question 4: Are VBA personnel who adjudicate claims required to
have professional or other certification? If not, please explain why
certification is not required and whether VBA plans to require
certification in the future. If certification is required, please
describe the required certification, how VBA ensures that its personnel
have the necessary certification(s), and the consequences to VBA
personnel who do not obtain required certifications.
Response: VBA personnel who adjudicate claims are not required to
have professional or other certification. VBA has developed an
instrument and process for skills certification for the veterans
service representative (VSR) position. VSRs who elect to take the
certification test and pass are promoted to the GS-11 level in the
career ladder. Skills certification has been developed as a secure
assessment instrument that enables VSRs to demonstrate that they have
attained the level of skills required to provide quality service and
decisions to veterans. Six hundred VSRs have passed certification. To
date, certification has been a voluntary process.
VBA is currently developing a skills certification instrument for
rating VSRs and is in the final stages of validity testing. We are
currently engaged in our collective bargaining obligations with the
American Federation of Government Employees (AFGE) regarding full
implementation of the rating VSR skills certification process.
It is VBA's goal to make skills certification a requirement for
advancement to the journey level for our key decision making positions.
Question 5: With respect to the STAR reviews that are conducted
each year, how many STAR reviews are taking place, and what are the
outcomes of each of the reviews?
Response: Currently, 120 rating cases and 120 authorization cases
for each of our 57 regional offices are sampled for accuracy review
each year. The rating sample is doubled (240 cases) for the four
largest stations and the six stations with the lowest overall accuracy.
Regional office and national accuracy statistics are reported on a 12-
month rolling cumulative basis. VBA recently approved a significant
expansion of the number of claims sampled through the STAR program. In
2008, VBA is increasing the number of annual reviews to 246 rating and
246 authorization cases for each of the regional offices as well as 246
cases for each of the three pension maintenance centers. Hiring
authority for 16 additional quality reviewers was granted to support
this sampling increase and the addition of a national rating
consistency review. The Compensation and Pension Service is currently
recruiting for additional reviewers and obtaining additional space to
support this expansion.
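[Editor's note: To put the sampling figures above in perspective, the following is a rough illustrative calculation of the annual STAR sample sizes implied by the numbers stated in this response. The station counts and per-station figures come from the text; the totals are simple arithmetic, not official VBA statistics, and the response does not say whether the doubled rating samples carry over to the expanded 2008 design, so the expanded total assumes a flat 246 cases per office.]

    # Illustrative arithmetic only, using the figures stated above.
    REGIONAL_OFFICES = 57
    DOUBLED_STATIONS = 10          # 4 largest + 6 lowest-accuracy stations
    PENSION_MAINTENANCE_CENTERS = 3

    # Current design: 120 rating + 120 authorization cases per RO, with
    # the rating sample doubled (240) at ten stations.
    current_rating = 120 * REGIONAL_OFFICES + 120 * DOUBLED_STATIONS
    current_auth = 120 * REGIONAL_OFFICES
    print(current_rating, current_auth, current_rating + current_auth)
    # 8040 6840 14880

    # 2008 expansion: 246 rating + 246 authorization cases per RO, plus
    # 246 cases for each pension maintenance center.
    expanded = 246 * REGIONAL_OFFICES * 2 + 246 * PENSION_MAINTENANCE_CENTERS
    print(expanded)
    # 28782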
The STAR program assesses the accuracy of claims processing
decisions across regional offices through a comprehensive review and
analysis of all elements of processing associated with a specific
claim. The STAR system includes review of work in three areas: (1)
claims that usually require a disability rating decision, (2) claims
that generally do not require a disability decision, and (3) fiduciary
work. Reviews are conducted after completion of all required processing
actions on a claim. The program was designed to be outcome-based, but
outcome was not limited to the decision reached. The definition of
outcome includes addressing all issues, fulfilling duty-to-notify and
duty-to-assist obligations, making the correct decision, and
establishing the correct payment from the correct date. These outcome
areas are identified under the ``benefit entitlement'' category. When
an error in the benefit entitlement category is identified, the case is
considered ``in error.'' Other review categories include ``decision
documentation/notification'' and ``administrative.'' A structured
quality review check sheet is used to promote consistency of reviews.
STAR accuracy review results are used to assess station accuracy for
quality improvement purposes and to facilitate local training efforts.
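[Editor's note: The following is a minimal sketch of the review logic described above, namely that a failure on any ``benefit entitlement'' item marks the case in error while other categories are tracked separately. The field names are assumptions for illustration, not the actual STAR check sheet items.]

    from dataclasses import dataclass, field

    @dataclass
    class StarReview:
        # "Benefit entitlement" outcome items (illustrative names).
        all_issues_addressed: bool = True
        duty_to_notify_met: bool = True
        duty_to_assist_met: bool = True
        decision_correct: bool = True
        payment_correct_from_correct_date: bool = True
        # Other review categories, tracked but not counted as "in error".
        documentation_notification_errors: list = field(default_factory=list)
        administrative_errors: list = field(default_factory=list)

        def benefit_entitlement_error(self) -> bool:
            """A failure on any benefit-entitlement item puts the case in error."""
            return not all([
                self.all_issues_addressed,
                self.duty_to_notify_met,
                self.duty_to_assist_met,
                self.decision_correct,
                self.payment_correct_from_correct_date,
            ])

    # Example: a correct decision with a missed duty-to-assist step is
    # still counted as "in error" for accuracy purposes.
    case = StarReview(duty_to_assist_met=False)
    print(case.benefit_entitlement_error())  # True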
Upon return of the claims folder or guardianship file to the
regional office, station management ensures that deficiencies noted are
corrected. Corrective action can include re-adjudication or
notification, as well as employee training and feedback. Regional
offices are required to provide quarterly notification of corrective
action taken on STAR benefit entitlement and decision documentation/
notification errors identified during that quarter. The quarterly
corrective action reports are validated during routine oversight
compliance visits conducted by the Compensation and Pension Service.
Question 6: The IG, GAO, and IDA have all noted that VBA has tested
for accuracy of claims adjudication but not for consistency across
offices. What is VBA doing to remedy this defect?
Response: As part of VBA's continued commitment to quality
improvement, the Compensation and Pension Service Quality Assurance
Staff is being reorganized and expanded to add a consistency tier to
our national quality assurance program.
VBA developed and implemented a rating consistency review program
to assess both the frequency of assignment or denial of service
connection (grant/denial rate) and the most frequently assigned
evaluation (mode) across regional offices for selected diagnostic
codes. Results are plotted per diagnostic code to identify stations
falling outside of two standard deviations from the mean. Business
rules are applied to the data analyses to determine the diagnostic
codes warranting focused case reviews.
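[Editor's note: As a rough sketch of the two-standard-deviation screen described above, an outlier check for one diagnostic code might look like the following. The station names and grant rates are made-up placeholders, not VBA data.]

    import statistics

    # Hypothetical grant rates for one diagnostic code, by regional office.
    grant_rates = {
        "RO-A": 0.48, "RO-B": 0.50, "RO-C": 0.52, "RO-D": 0.49, "RO-E": 0.51,
        "RO-F": 0.47, "RO-G": 0.53, "RO-H": 0.50, "RO-I": 0.46, "RO-J": 0.80,
    }

    mean = statistics.mean(grant_rates.values())
    sd = statistics.pstdev(grant_rates.values())

    # Flag stations whose rate falls outside two standard deviations of
    # the mean; in the program described above, such stations would be
    # candidates for focused case reviews.
    outliers = {ro: rate for ro, rate in grant_rates.items()
                if abs(rate - mean) > 2 * sd}
    print(outliers)  # {'RO-J': 0.8}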
This methodology was successfully tested in a consistency review
pilot project that was completed in August 2007. Post-traumatic stress
disorder (PTSD) was identified as one of the most frequently rated
conditions during the period October 2004 through September 2006. The
grant/denial rate of PTSD across all regional offices was plotted to
identify stations falling outside of two standard deviations from the
mean.
Data on ratings involving individual unemployability (IU) decisions
from the October 2004 through September 2006 period was also analyzed.
The grant/denial rate of IU across all regional offices was plotted to
identify stations falling outside of two standard deviations from the
mean.
A Data Analysis Staff was created within the Quality Assurance
Staff to perform ongoing monitoring of rating consistency. Using
approved statistical methodology, this staff works with VBA's Office of
Performance Analysis & Integrity to extract, analyze, and identify
statistical outliers. Focused rating consistency case reviews will be
conducted by the quality review staff based on the results of the
statistical analysis. The results of the current data analysis of
rating decisions from the period October 2005 through September 2007
will be analyzed to determine the diagnostic codes warranting focused
case reviews for the remainder of the fiscal year.
Question 7: If the Veterans Benefits Administration is unable to
get information relating to a service member's in-theatre service
directly from the Department of Defense to verify stressors
contributing to PTSD, what alternate sources are being used to verify
stressors when validating a claim in the ratings process?
Response: VA uses the following sources to verify claimed in-
service stressors without requesting verification from the Department
of Defense (DoD), or when VA is unable to obtain verification from DoD:
National Archives and Records Administration (NARA) and Records
Management Center (RMC)
VA primarily uses records held by NARA and the RMC to verify
claimed in-service stressors. NARA maintains a registry of most
individual medical and personnel records in its custody, while the RMC
houses records received from DoD and the Coast Guard. Examples of
common sources of evidence VA uses to corroborate claimed in-service
stressors from NARA/RMC include:
military occupational specialty (MOS) or individual award
evidence
personnel folder or service medical records
morning reports
medical evidence from civilian/private hospitals,
clinics, and physicians where or by whom a veteran was treated, either
during service or shortly after separation
On-line Reference Material
To reduce the time involved in verifying a claimed in-service
stressor, VA uses VBA-sanctioned Web sites and authorized reference
material for research on corroborating stressors. Authorized sources
are available through a VBA Web page that provides links to 30 Web
sites and cites reference material relevant to PTSD stressor research.
A few examples of these Web sites include:
The Vietnam Casualty Search Page, ``No Quarter''--This
Web site contains a database of Vietnam casualty information. A search
may be conducted by name, province of casualty, hometown or state of
the veteran.
DoD Gulflink--This site has information on the 1990-1991
Gulf War with declassified documents on records from all the Armed
Forces.
Iraqi Coalition Casualty Count--Contains a list of
coalition casualties and the circumstances of death for Operation
Enduring Freedom and Operation Iraqi Freedom.
Marine Corps unit records from the Korean Conflict and Vietnam Era
are maintained on VBA's imaging management system, Virtual VA. A Web-
based application has been developed to enable employees to research
these records. The Marine Corps Archives and Special Collections
(MCASC) Office maintains custodianship of the records. If VA cannot
verify a claimed stressor, or requires unit records dated after the
Vietnam Era, MCASC is contacted to identify the document or provide
confirmation that the claimed stressor cannot be corroborated.
VA considers other sources of evidence that may be used to help
corroborate in-service stressors. Such sources include buddy statements
or affidavits, letters written during service, photographs taken during
service, State or local accident and police reports, or newspaper
archives. In the case of combat veterans, the corroborating statements
of comrades who have personal knowledge of the stressful event are
sufficient.
In PTSD claims involving sexual or personal assault, VA also
develops the claim for indicators of the assault, such as sudden
declines in performance, sexually transmitted disease testing, and
requests for reassignment, to validate the event.