The audit: an annual physical for the laboratory

Is the laboratory productive? Are we truly holding down costs? Do
we make the best possible use of our personnel? Are our methods for
test and instrument evaluation valid? Laboratory managers ask
themselves these difficult questions all the time. Often, they’re
satisfied that the lab is moving in the right direction, but now
administration is asking, “Can you prove it?”



Last year, on the eve of national introduction of prospective
payment, we decided it was time for serious introspection. A full-scale
departmental audit would tell us what kind of shape the laboratory was
in. Such an analysis is routine in big business, but it is fairly
uncharted territory for hospital departments.




It also can be time-consuming and nerve-racking, yet it’s
worth the struggle. You end up with a very good handle on laboratory
operations–how much of your resources are devoted to different
activities–and guidelines for improvements. The exercise makes it
easier to prepare annual budgets, possibly including changes in staffing
and instrumentation as well as new charges for tests. Similarly, it
provides a solid basis for long-range planning. Your proposals can be
amply justified to administrators with data developed in the audit.



We examined the laboratory from every angle: staffing,
productivity, and workload; test order patterns, department work flow,
and patient demographics; cost per test, single-test versus batch cost,
and instrumentation; and budget figures.


The audit had two facets. The first involved an analysis of the
lab as a unit to look at overall productivity, efficiency, costs, and
revenue. We planned to use external monitors–Ohio Hospital Management
Services (OHMS), Monitrend, and CAP workload recording–to rate our
performance internally, regionally, and nationally.



Ohio Hospital Management Services is an independent monitoring
group that works with the Ohio Hospital Association. As part of a
voluntary hospitalwide program, OHMS compares the laboratory’s
monthly workload with that of other hospital departments and with the
workload in labs at hospitals of similar bed size. Our 406-bed hospital
also participates in Monitrend, a computerized reporting system
developed by the American Hospital Association to monitor the activities
of different departments. In this program, we’re compared with
laboratories in institutions of similar size and case mix nationwide,
with hospital laboratories of similar size in the state, and with
hospital labs in our area.



The second facet of the audit was a concurrent analysis of the
individual lab sections. By working with the section supervisors, I
hoped to glean better information about workload and staffing and
ultimately develop new standards for measuring efficiency.



Figure I outlines our basic audit goals. Generally, these were to
assess lab management, determine the exact workload, rate our
cost-effectiveness, and evaluate our sections’ ability to meet
service demands.



The project seemed fairly straightforward. As audit coordinator, I
naively thought we could wrap things up in four weeks. Despite
countless hours volunteered by a dedicated staff, the time frame
eventually stretched to four months. Indeed, I spent an entire month
just researching the feasibility of various kinds of studies, finding out
what information was available within the hospital, and along with other
personnel, writing the programs for our CompuPro 816A minicomputer.
Once the groundwork was laid, I charted what we hoped to accomplish and
listed tentative completion dates.



Everyone in the laboratory helped gather data. Phlebotomists, for
example, timed how long routine and Stat collections took. Technologists
carried out time studies on the tests they performed. Supervisors
determined direct costs for each test and reviewed many other aspects of
section operations. In addition to the clerical staff, technologists
and supervisors input data as time permitted and kept me informed so
that I could keep track of who was doing what and what remained to be
done.


The minicomputer performed more than 400,000 statistical
combinations, and the final report totaled 300 pages, including graphs
and tables. The number of Stats chemistry ran, broken down by hour and
shift, would count as a single combination.



In examining laboratory activity by type of patient, to take one
example, we pulled requisition slips for a high-volume month, a
low-volume month, and an “average” month during 1983. Data
were entered on patient type (inpatient, outpatient, clinic, etc.);
routine, ASAP, or Stat testing; number of tests requested; section
performing the test; day of the week; time of day; turnaround time; and
other factors. This single exercise generated 50,246 computer records.
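
To make the record structure concrete, here is a rough sketch of the kind of requisition record we keyed in and one cross-tabulation it supports; the field names and sample values are illustrative, not the actual codes we used.

```python
# Illustrative sketch of a requisition record and one cross-tabulation it
# supports. Field names and sample values are hypothetical.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Requisition:
    patient_type: str    # "inpatient", "outpatient", "clinic", ...
    priority: str        # "routine", "ASAP", "Stat", "timed", "pre-op"
    section: str         # "chemistry", "hematology", ...
    tests_requested: int
    day_of_week: str
    hour_of_day: int     # 0-23, hour the order was received
    turnaround_min: int  # minutes from order to reported result

def stat_volume_by_section_and_day(records):
    """Count Stat requisitions per (section, day of week)."""
    tally = Counter()
    for r in records:
        if r.priority == "Stat":
            tally[(r.section, r.day_of_week)] += 1
    return tally

# Example with two hypothetical slips:
records = [
    Requisition("inpatient", "Stat", "chemistry", 3, "Thursday", 14, 45),
    Requisition("outpatient", "routine", "hematology", 2, "Tuesday", 9, 180),
]
print(stat_volume_by_section_and_day(records))
# Counter({('chemistry', 'Thursday'): 1})
```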



We initially had hoped to look at each laboratory section as a
separate functional cost center. The plan was to use standard cost
accounting methods to allocate indirect costs to each section based on
activity, space utilization, use of support services, and employee
hours. But the information needed for this type of audit was not
readily available; most hospitals don’t use such parameters to
allocate indirect costs. The method we settled on was to take the
lab’s percentage of hospital revenues and apply that to hospital
expenses to determine our share.
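
In other words, the laboratory’s share of indirect expenses is simply its revenue percentage applied to the hospital’s expense pool. A minimal sketch, with hypothetical dollar figures:

```python
# Minimal sketch of the allocation method described above: the lab absorbs
# hospital indirect expenses in proportion to its share of hospital revenue.
# All dollar figures are hypothetical.
def lab_indirect_share(lab_revenue, hospital_revenue, hospital_indirect_expenses):
    """Allocate hospital indirect expenses to the lab by revenue percentage."""
    revenue_fraction = lab_revenue / hospital_revenue
    return revenue_fraction * hospital_indirect_expenses

# e.g., a lab producing 12% of hospital revenue absorbs 12% of indirect expenses
print(lab_indirect_share(lab_revenue=3_000_000,
                         hospital_revenue=25_000_000,
                         hospital_indirect_expenses=8_000_000))  # -> 960000.0
```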



We analyzed all test procedures in terms of direct and indirect
costs, gross margin, and net margin. Our general goal was simply to
improve administration’s awareness of the lab’s financial
status. To accomplish this, we felt we needed to determine the
contribution of the total laboratory operation (gross revenue minus
total expenses) and the full cost per work unit generated within each
lab section (total section expenses divided by total section work
units). Direct costs were defined as salary expenses, supplies,
reagents, consumables, equipment, and other costs generated solely from
operating the lab.
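
Those two summary figures reduce to straightforward arithmetic; the sketch below uses hypothetical numbers and treats work units as CAP workload units.

```python
# The two summary figures defined above, sketched as simple functions.
# Inputs are hypothetical; "work units" here means CAP workload units.
def lab_contribution(gross_revenue, total_expenses):
    """Total laboratory contribution: gross revenue minus total expenses."""
    return gross_revenue - total_expenses

def cost_per_work_unit(section_total_expenses, section_work_units):
    """Full cost per work unit for a section."""
    return section_total_expenses / section_work_units

print(lab_contribution(3_000_000, 2_450_000))             # -> 550000
print(round(cost_per_work_unit(480_000, 1_200_000), 3))   # -> 0.4 dollars per unit
```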



All tests were evaluated singly and, where appropriate, as batch
procedures. If more than one instrument was used for a specific test,
each was evaluated. Backup methods and instruments used more than 10
per cent of the time were also evaluated. In this manner, we learned
what the cost differences were in performing the same test on different
analyzers. If technologists are aware of these differences, they can
employ instruments more economically.



Over a three-week period, each of the more than 100 time studies
was performed by at least five technologists to cover the various shifts
and days of the week. On multichannel instruments, a single study would
cover several tests. The resulting times, checked against CAP
standards, generally were slightly better than the norm. To adjust for
statistical variance on single tests, we added 4 per cent to the
performance times. That figure was recommended to us by OHMS time
management engineers. It might not apply to other labs.
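
The adjustment itself is trivial arithmetic, sketched below with a hypothetical observed time; the 4 per cent allowance was specific to our situation.

```python
# Sketch of the adjustment applied to single-test time studies: observed times
# were increased by 4 per cent to allow for statistical variance. The 4 per
# cent figure came from OHMS engineers and may not apply elsewhere.
VARIANCE_ALLOWANCE = 0.04  # 4 per cent, per OHMS recommendation

def adjusted_test_time(observed_minutes):
    """Add the variance allowance to an observed single-test time."""
    return observed_minutes * (1 + VARIANCE_ALLOWANCE)

print(round(adjusted_test_time(6.5), 2))  # 6.5 observed minutes -> 6.76 adjusted
```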



After evaluating the test times, we used established laboratory
test cost analysis methods to summarize the per-test cost of all
supplies, reagents, and quality control. Collection time and collection
supplies were handled separately to evaluate their costs in relation to
batch and Stat procedures. The hospital’s computer-generated
department reports provided test volume and revenue figures.
Technologist and phlebotomist costs per minute were based on the
laboratory’s average wage rates for each position.
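
Put together, the per-test direct cost is a roll-up of consumables, quality control, and labor charged by the minute at average wage rates. The sketch below uses hypothetical figures and parameter names of my own choosing:

```python
# Hedged sketch of the per-test direct-cost roll-up described above:
# consumables plus quality control plus labor charged at average wage rates
# per minute. All numbers are hypothetical examples, not our actual costs.
def per_test_direct_cost(supplies, reagents, qc_share,
                         tech_minutes, tech_rate_per_min,
                         collection_minutes=0.0, phleb_rate_per_min=0.0,
                         collection_supplies=0.0):
    """Sum consumable and labor costs for one reported test."""
    labor = (tech_minutes * tech_rate_per_min
             + collection_minutes * phleb_rate_per_min)
    return supplies + reagents + qc_share + collection_supplies + labor

# Example: a chemistry test with 6.76 adjusted technologist minutes
print(round(per_test_direct_cost(supplies=0.40, reagents=0.85, qc_share=0.25,
                                 tech_minutes=6.76, tech_rate_per_min=0.22,
                                 collection_minutes=5.0, phleb_rate_per_min=0.12,
                                 collection_supplies=0.30), 2))  # -> 3.89
```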



Following several weeks of data gathering, we began feeding the
numbers into the laboratory’s computer, which was programmed to
summarize direct costs, calculate indirect costs, gross margin, and net
margin, and analyze patient types and work flow. The entire lab staff
spent spare time over a two-month period entering all the data. It took
18 computer hours to run the statistical analysis that generated the
400,000 different data combinations mentioned earlier.



Meanwhile, supervisors reviewed their sections for ways to cut
costs, speed up turnaround time, and improve efficiency. Using one of
our programs, they calculated direct and total costs for all available
tests by every method employed. This covered about 250 procedures in
differing forms–single tests, batches, profiles, and panels.



Our new centrifugal analyzer proved to be a major cost-cutter. We
had estimated it would yield $40,000 in annual savings. According to the audit, this goal was reached in eight months; for the full year,
savings amounted to $60,000. The audit also confirmed that a new
automated blood culture instrument had increased productivity to
previously projected levels.



High-volume tests were further examined for direct cost/revenue and
total cost/revenue ratios. We compared the cost of doing a single Stat
test versus batching and then analyzed batching patterns and send-outs
to see if specimens were handled as expediently as possible. All
high-volume procedures were flow-charted from the time the physician
wrote the order until the test result reached the nursing station.
Turnaround times were, in fact, good. As for current send-outs, we
determined that the volume and the instrumentation needed did not
justify performing any of the tests in-house.
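
The Stat-versus-batch comparison comes down to how the fixed cost of a run (calibration, controls, start-up) is spread. A rough sketch with hypothetical costs:

```python
# Illustrative sketch of the single-Stat-versus-batch comparison: run setup
# costs are absorbed by one specimen in a Stat run but spread across the whole
# batch otherwise. Setup and per-specimen figures are hypothetical.
def cost_per_test(setup_cost, per_specimen_cost, batch_size):
    """Per-test cost when setup is spread over `batch_size` specimens."""
    return setup_cost / batch_size + per_specimen_cost

stat_cost = cost_per_test(setup_cost=4.50, per_specimen_cost=1.10, batch_size=1)
batch_cost = cost_per_test(setup_cost=4.50, per_specimen_cost=1.10, batch_size=18)
print(round(stat_cost, 2), round(batch_cost, 2))  # -> 5.6 vs 1.35
```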



Workload was the next item on the audit agenda. We reviewed five
years’ worth of CAP workload statistics for each section and
projected future growth rates. By tracing each section’s changing
annual workload as a percentage of the laboratory’s total output,
we were able to pinpoint shifts and trends. With the help of the
computer, I could break down any section’s patient
population–percentage of inpatients, routine outpatients, clinic
patients, preadmission testing patients, and emergency room patients.
Figure II shows simplified inpatient/outpatient workload ratios for all
of the lab sections.



Even more interesting were the percentages of routine, Stat, timed,
pre-op, and ASAP requests received by each section (Figure III). We
also plotted the volume of such requests in two-hour increments during
the day (Figure IV). Another graph depicted total laboratory workload
by day of the week. These kinds of analyses helped us compare normal
staffing patterns with actual needs. Again, we found that for the most
part our staffing closely matched work volume on all shifts in all
areas. By now, it may sound as if the audit didn’t lead to any
changes, but read on.
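
The two-hour tally behind Figure IV is easy to reproduce; the sketch below bins hypothetical order times into two-hour blocks so staffing can be laid alongside demand.

```python
# Small sketch of the two-hour-increment tally used to compare staffing
# patterns against actual demand. The request hours are hypothetical.
from collections import Counter

def two_hour_block(hour_of_day):
    """Map an hour (0-23) to the start of its two-hour block (0, 2, ..., 22)."""
    return (hour_of_day // 2) * 2

def volume_by_block(request_hours):
    """Count requests falling in each two-hour block of the day."""
    return Counter(two_hour_block(h) for h in request_hours)

print(volume_by_block([6, 7, 7, 10, 14, 15, 23]))
# Counter({6: 3, 14: 2, 10: 1, 22: 1})
```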



The supervisors’ final task was to review instrumentation.
This went beyond the kind of review conducted for annual budget
preparation. We wanted a list of all instruments, including purchase
price and date of acquisition, maintenance and replacement costs,
service contracts, depreciation–and volume and revenue accounted for by
the instruments. This status report would help us plan for future
capital outlays. The volume and revenue figures might tell us whether
the purchase was justified and how well we were evaluating new products.



Assembling all this information gave us a clearer picture of each
section. Many of our management procedures and standards were upheld.
For example, the audit proved that our method for establishing the cost
of new tests was valid and that criteria for batching generally
hadn’t changed.



The statistics further demonstrated that our utilization of
personnel was good and that our productivity was exceptional.
Flow-charting high-volume procedures showed that our test processing
system works well, while the evaluation of instrumentation indicated
that it is well maintained and often exceeds the estimated useful life
expectancy.



Much of the data would please administration. Calculations based
on Monitrend formulas revealed that our direct expenses per adjusted
patient day were 22 per cent below the national average and 20 per cent
less than those of state and regional comparison groups. The
lab’s direct expenses and salary expenses relative to workload were
about 16 per cent under the national average and 11 per cent below those
of area comparison groups.



Although the audit results were largely favorable, we did pinpoint
several areas that merited further study. Here are some of the changes
that resulted during the past year:



* Test charges. Often in reviewing rates, the immediate
inclination is to raise charges for high-volume tests as a means of
maximizing revenue. But the audit disclosed that we weren’t
covering indirect costs on other tests, particularly longstanding assays
in chemistry as well as some of the more esoteric procedures. So we
first made sure all tests were at breakeven or better before raising any
of them further.
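
The review itself was a simple screen: flag any test whose charge fails to cover its full cost before touching charges elsewhere. A sketch with made-up tests and figures:

```python
# Hedged sketch of the breakeven review described above: flag any test whose
# charge fails to cover its full (direct plus indirect) cost. Test names and
# dollar figures are made up for illustration.
def below_breakeven(tests):
    """Return tests whose charge is less than direct + indirect cost."""
    return [name for name, (charge, direct, indirect) in tests.items()
            if charge < direct + indirect]

tests = {
    "glucose":                 (6.00, 1.40, 1.10),
    "protein electrophoresis": (18.00, 9.75, 9.50),  # charge below full cost
}
print(below_breakeven(tests))  # -> ['protein electrophoresis']
```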



* Staffing adjustments. The audit established the
laboratory’s productivity at 57.7 units per hour worked. That was
a 96 per cent efficiency rate, compared with the 80s range that the CAP
recommends. Indeed, we were too high. The only way we could attain
such efficiency was by having supervisors working at the bench and
consistently putting in more than 50 hours per week. Our data persuaded
administration to approve the addition of three technical FTEs to the
laboratory staff.
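
The efficiency figure follows from the usual CAP convention that one workload unit represents one minute of productive time, so 60 units per worked hour corresponds to 100 per cent; a quick check:

```python
# Sketch of the efficiency figure quoted above, assuming the usual CAP
# convention that one workload unit equals one minute of productive time,
# so 60 units per worked hour corresponds to 100 per cent.
def efficiency_pct(units_per_worked_hour, units_at_100pct=60):
    return 100 * units_per_worked_hour / units_at_100pct

print(round(efficiency_pct(57.7), 1))  # -> 96.2
```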



* Scheduling adjustments. Assumptions about the lab’s busiest
and slackest periods were corrected somewhat by the audit results. Like
many other labs, ours earmarked the Thursday before or the Friday after
weekend duty as a compensatory day off. These weekdays, however, turned
out to be peak workload periods. Tuesday is now the compensatory day.



The data also indicated that when we scheduled the same employee
into microbiology at 6:30 a.m. for an entire week, the early start
proved to be wearing, and reports got out more slowly after several
days. It was better to split the duty between two technologists or use
part-timers.



Scheduling will have to change with the shift in ordering patterns
under prospective payment. Traditionally, the day shift has done the
majority of the work while other shifts covered the lab for Stat work.
With DRGs, physicians are having their Medicare patients enter the
hospital later in the day to trim part of the length of stay. They want
the lab work started in the afternoon or evening and posted on the
chart. This puts more pressure on the later shifts.



* Clerical services. Although the hospital has expanded
considerably, we had done little to upgrade our clerical capabilities.
A mushrooming outpatient load further strained the staff and the
system. The audit identified the lab office as one bottleneck for test
reporting. We plan to streamline the filing system to speed up storage
and retrieval, and we are reevaluating job descriptions, priorities, and
staffing to bring the clerical service in line with current needs. We
also have remodeled the front office.



Transportation of test requests and the distribution and charting of
laboratory reports remain inadequate and require further investigation. The
nursing staff’s confidence in the pneumatic tube system must be
bolstered, or it should be replaced with another transport system. A
computerized order entry system would eliminate several problems in this
area.



* Patient processing. The audit made it clear that the
laboratory’s percentage of outpatient work was much higher than the
norm and that we weren’t processing these patients as efficiently
as we could. Physicians’ offices had to make separate outpatient
appointments for laboratory work, x-rays, and ECGs. Now, with a
centralized scheduling system, one phone call books a patient for all
required hospital services. We also have expanded seating in the
waiting room and eliminated overflow of outpatients.



On admissions for diagnostic workups, a new approach channels
patients through the laboratory and other ancillary services before
sending them to the floor. This minimizes the late-afternoon rush that
always seemed to hit just as the day staff was leaving. Test requests
are no longer held overnight, duplicate orders are down, and the
laboratory is spared numerous follow-up telephone calls and phlebotomy trips to the floor.



* Outpatient marketing. We could do even more outpatient work. Our
study found that the laboratory received outpatients from only 40 per
cent of physicians with active staff privileges at the hospital. To
gain a larger share of outpatient testing, the laboratory would need
more competitive test charges, billing systems, and reporting practices;
a professional courier service; and a marketing-oriented account
representative.



We also learned lessons about auditing. Trial and error taught us
that some of the statistics so painstakingly collected were irrelevant
and superfluous. For example, in determining our indirect costs, we
spent a lot of time working out depreciation rates for our instruments,
but we made little use of these figures. It is important to keep your
goals in mind.



Also recognize that there are usually several sources for the same
kind of information and that their perspectives and answers may vary
markedly. We spent a good deal of time trying to reconcile data and
make certain that we were indeed comparing apples with apples and not
with oranges.



It is equally important to keep the audit within the
hospital’s financial system and to work closely with its financial
consultants. If the hospital uses zero base budgeting, that’s the
only methodology you can use. If you try to introduce statistics from
another system, they won’t mesh with hospital figures.



Set a realistic time frame. Future reviews–annual updates of key
data and biennial full-scale audits–won’t take four months now
that we have baseline figures and a good idea of how to proceed. But
the initial audit is time-consuming.



An independent test of our efforts came from an outside consulting
group studying ways to cut hospital and medical expenses in our
community. As part of their comprehensive study, the consultants
presented the laboratory with a massive questionnaire, which we were
easily able to complete, thanks to the availability of our audit
results. After a review of our responses and the audit itself, the
consultants concluded that they couldn’t make a single
recommendation for laboratory improvements.



We weren’t surprised.
