Langfitt Technical Appendix

TECHNICAL APPENDIX for Langfitt et al. “Health Care Costs Decline After Successful Epilepsy Surgery: A US Multi-center Study”

Identifying Providers of Care – Enrollment and consent in the cost study were carried out centrally at the University of Rochester after the parent study had been enrolling patients for about a year. At the time of enrollment in the cost study, subjects underwent a structured phone interview. They were asked to identify all medical and non-medical persons and institutions from whom they had received health care in the 2 years prior to their evaluation. They were specifically prompted for information about primary care providers, neurologists, surgeons, mental health providers, hospitals, emergency rooms and any provider seen for problems with their lungs, heart, stomach, skin or eyes. They were encouraged to consult family and friends to assist their recall. They provided audiotaped, verbal permission to obtain records from each provider. They were told that they could withhold permission to obtain a particular record out of privacy concerns.

At the end of the study period, subjects were re-contacted by letter and asked to provide the names of any new providers that they had seen since enrollment, along with written permission to obtain records. Similar recall prompts were given in writing. Non-respondents received up to 2 reminder letters. Providers were reimbursed $30 per record. Non-responsive providers received up to three follow-up letters and phone calls. Records were reviewed for references to providers that the subject had failed to name during the initial interview or on the follow-up form. Written permission to obtain these new providers’ records was also obtained. If these records revealed other providers not previously identified, permission to obtain those providers’ records was obtained. The cycle of record retrieval, identification of new providers and retrieval of new provider records was repeated no more than twice for any original record.

The 68 subjects in the final analysis sample received care from a total of 704 providers (mean = 10.4 per subject), of whom 458 (65%) were key providers: neurologists (22%), hospitals or EDs (18%), primary care doctors (including obstetricians/gynecologists) (16%), psychologists, neuropsychologists or psychiatrists (13%), neurosurgeons (9%), and dentists (7%). Cardiologists, ophthalmologists, orthopedists and dermatologists accounted for 6%. The remaining 10% were other specialists (surgeon, gastroenterologist, otolaryngologist, plastic surgeon, infectious disease doctor, social worker, urologist, oncologist, osteopath, chiropractor, counselor, endocrinologist, radiologist, hematologist, nurse practitioner, physical rehabilitation specialist, pulmonologist).

For the final sample, we received an average of 87% of all provider records and 92% of key provider records per subject. This included 96% of all inpatient records, 89% of all ED records and 90% of all UB-92s. Eighty-two percent of providers were identified at the time of enrollment and 12% were found in other records. Twenty-seven of the 68 subjects responded to the 2-year follow-up request to identify new providers. These subjects identified a total of 12 key and 22 non-key providers, representing 11% of all providers and 7% of key providers for this group. We imputed mean costs attributed to new providers to all subjects who did not respond to follow-up, irrespective of group membership. The limited response to follow-up and our imputation strategy could underestimate costs in the post-operative period if those who did not respond had used more services in this period than those who did respond, but we had no independent way to test this. There was no significant difference between subject groups in the proportion of subjects responding to the follow-up (SF = 54%, PS = 30%, NS = 30%; chi-square = 3.8, p = .15). Therefore any underestimate should not have biased group-level analyses.

Determining Use from Medical Records - For each subject, records of use from all provider records and UB-92s were collated and arranged chronologically. Duplicate records were separated from the collated record. Dates of procedures and tests found in records were cross-referenced with inpatient admission and discharge dates to ensure that their costs would not be double counted. Admission dates for UB-92s were cross-referenced with dates from the MSES database and with the inpatient medical records. Any missing UB-92s were re-requested. We only counted an episode of service if we found its specific record in the provider’s record, or if we found a mention or record of it in another provider’s record that was sufficiently specific regarding the date and setting that we could unambiguously distinguish it from similar records (to avoid double counting). We did not count tests or consultations ordered by a physician for which there was no unambiguous evidence in a record that it had been performed.

Costing Resource Use - ‘Medicare costs’ are unlikely to reflect the true opportunity costs of the range of resources consumed across the range of health care systems serving the study subjects. For example, a hospital’s internal cost-accounting may provide a more accurate estimate of the opportunity cost to that hospital of providing a specific service in its community. Nevertheless, the Medicare costing approach is the most appropriate approach for cost studies aimed at informing policy-level decisions. It provides the most consistent, reliable, valid and generalizable estimates of the opportunity costs of direct medical care in a patient sample served by many different health care systems spread across a number of states. It applies a standard relative valuation metric to a wide range of health care activities that occurred across multiple provider sites over a multi-year observation period. It relies on a legally mandated and standard method of allocating fixed costs of production (e.g., overhead, malpractice) to marginal units of resource use across all sites of care. The relative value of resources consumed has been derived and codified so that it may be applied across a wide range of settings. Finally, it provides the most readily comparable information on costs, as it is among the most commonly used costing methodologies in cost-of-illness or cost-effectiveness studies of other diseases and treatments in the United States.

Outpatient Costs: Outpatient episodes (outpatient visits to all medical and non-medical providers, ambulatory diagnostic and laboratory tests and procedures, and emergency department (ED) visits that did not result in a hospital admission) were coded according to the American Medical Association’s Current Procedural Terminology (CPT) 2002 manual. The cost of the professional component of each episode of care was calculated as the fully implemented, non-facility, resource-based relative value unit (RBRVU) associated with the AMA CPT code (Federal Register, November 2, 1998), multiplied by the year 2000 conversion factor ($36.6137) (Federal Register, November 2, 1999). Visits to neurologists and neurosurgeons were sufficiently well documented to be classified as low, moderate or high complexity initial and follow-up visits (CPT codes 99213, 99214 and 99215), using an algorithm developed specifically for this study by an expert panel consisting of the first author, a general neurologist, a neurosurgeon and 2 epileptologists. Inter-rater reliability for coding a random sample of 67 neurologist and neurosurgeon visits was good (weighted kappa = .87, 95% CI .77–.97). Documentation of visits to other providers was too variable for reliable and consistent coding of complexity levels. As recommended by the expert panel, initial visits to a provider were coded as high complexity (CPT code 99205 for primary care providers (including PCP, GYN) and 99245 for specialists). Subsequent visits were coded as 99213, 99214, or 99215 based on the modal code for that provider’s specialty. Modal CPT codes for each specialty encountered were taken from the 2001 E/M Bell Curve Data Book (Decision Health, Rockville, MD, 2001). For ambulatory procedures and diagnostic tests, the technical component cost was calculated as the 2000 Ambulatory Payment Classification (APC) reimbursement value associated with that CPT code.
If multiple similar ambulatory surgical codes were used on the same day, the costs were bundled per Medicare reimbursement procedures and calculated accordingly. The cost of dental visits was calculated as the mean New York State Medicaid reimbursement for initial and follow-up visits, since Medicare does not reimburse for dental care. The cost of a laboratory test (e.g., AED blood level) was assigned the 2000 Medicare reimbursement per the 2000 Clinical Laboratory Fee Schedule (2000, American Medical Association). Any professional component costs for laboratory tests were calculated using the RBRVU procedure described above.
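The professional-component calculation above reduces to a single multiplication. A minimal sketch, using the year-2000 conversion factor from the text; the RVU value in the example is hypothetical, not a published value for any specific CPT code:

```python
# Year-2000 Medicare conversion factor, in dollars per relative value unit
# (from the text; Federal Register, November 2, 1999).
CONVERSION_FACTOR_2000 = 36.6137

def professional_cost(non_facility_rvu: float) -> float:
    """Professional-component cost of an outpatient episode:
    fully implemented non-facility RVU times the conversion factor."""
    return non_facility_rvu * CONVERSION_FACTOR_2000

# A visit whose CPT code carries 1.39 RVUs (illustrative value only)
cost = professional_cost(1.39)
```

The technical component for ambulatory procedures would be looked up from the APC schedule rather than computed this way.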

Inpatient Hospital Costs: Hospital costs for admissions to study site hospitals were calculated from a standardized hospital bill (UB-92), using a ratio-of-costs-to-charges (RCC) method developed to study inpatient costs of stroke (Holloway RG et al., Neurology 1996;46:854-860). This approach assumes that the true (opportunity) costs of services provided by the same department within a hospital are directly proportional to what the hospital charges for those services. While this assumption may often be violated in specific cases depending on market conditions, the approach offers a convenient, internally consistent and standardized method for estimating the relative opportunity costs of different services within and across hospitals.

Total charges on the UB-92 attributed to each hospital department (e.g., laboratory, pharmacy, operating room) were multiplied by a ratio of costs-to-charges (RCCs) that was specific to the hospital, the department within each hospital and the year in which discharge occurred. RCCs were calculated from Medicare Cost Report data obtained from Health Markets Insights, a health care consulting firm. Each RCC was calculated by first combining related UB-92 revenue categories into aggregate revenue categories (ARC). Each ARC was mapped to a single cost center from the Medicare Cost report on the basis of shared activities. The total costs within each cost center for each hospital and each year then were divided by the total charges for each ARC for the corresponding site hospital and year. All costs were adjusted to year 2000 dollars using the Consumer Price Index for medical care.
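The department-level RCC computation can be sketched as follows. The department names, charges, and RCC values here are hypothetical, and the mapping of UB-92 revenue categories to Medicare Cost Report cost centers is omitted for brevity:

```python
def inpatient_cost(dept_charges: dict, rcc_by_dept: dict) -> float:
    """Multiply each department's total UB-92 charges by that
    hospital/department/year-specific ratio of costs to charges,
    then sum across departments to get the admission's cost."""
    return sum(charges * rcc_by_dept[dept]
               for dept, charges in dept_charges.items())

# Illustrative values only: charges from one admission's UB-92,
# and RCCs for that hospital and discharge year.
charges = {"laboratory": 1200.0, "pharmacy": 800.0, "operating_room": 5000.0}
rccs = {"laboratory": 0.45, "pharmacy": 0.30, "operating_room": 0.60}
total = inpatient_cost(charges, rccs)
```

In the study, the result would then be adjusted to year-2000 dollars using the medical-care CPI.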

For admissions to hospitals other than the study site, hospital costs were calculated as the DRG weight for the year in which the admission occurred, multiplied by the national capital standard federal payment rate for the Year 2000 ($377.03) (Federal Register, July 30, 1999).

Inpatient Professional Services: Costs of inpatient professional services were calculated in the manner described above for outpatient services, except that the facility RVU was used. For admissions for long-term video-EEG monitoring, intracranial monitoring, IAP or surgery at a study site hospital, CPT code 99255 (high complexity initial consult) was assigned to the first day of admission and CPT code 99232 (the modal code used for subsequent neurology consults of moderate complexity) for each additional inpatient day, based on consensus of the expert panel. CPT codes for non-neurology consults were the modal codes reported nationally for that specialty (2001 E/M Bell Curve Data Book, Decision Health, Rockville, MD, 2001).

Emergency Costs: Emergency room visits that did not result in a hospital admission were assigned a CPT code based on severity and resource use. The severity index was extracted from the medical record whenever possible and translated to one of five CPT codes (99281, 99282, 99283, 99284 or 99285), reflecting low to high complexity visits, respectively. These codes captured both facility and professional costs. Costs were calculated per the RBRVU procedure described above. Emergency room labs and procedures were costed separately, using the same RBRVU procedure.

AED costs: AED costs were the 2002 average wholesale price of the prescribed dosage and formulation of AEDs taken from the 2002 Drug Topics Red Book (Thomson Medical Economics, Montvale, NJ). Where information about timing or formulation was missing, the cost was assumed to be the price of the least expensive formulation for that AED at that daily dose. All costs were adjusted to year 2000 costs, using the Consumer Price Index for Medical Care.
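The CPI adjustment used throughout (here and for hospital costs) is a simple deflation by the ratio of index levels. A sketch, with illustrative index values rather than the published medical-care CPI series:

```python
# Hypothetical medical-care CPI levels; the study would use the
# published Consumer Price Index for Medical Care for each year.
CPI_MEDICAL = {2000: 260.8, 2002: 285.6}

def to_year_2000_dollars(cost: float, year: int) -> float:
    """Deflate a nominal cost incurred in `year` to year-2000 dollars
    by the ratio of the year-2000 index to that year's index."""
    return cost * CPI_MEDICAL[2000] / CPI_MEDICAL[year]

# A $100 AED price quoted in 2002 dollars, restated in 2000 dollars
adjusted = to_year_2000_dollars(100.0, 2002)
```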

Imputation of Missing Provider Data - Frequency of use and associated costs from unresponsive providers were imputed in the following way. If we had retrieved at least one record for a specific provider type (e.g., primary care physician) for a subject, we multiplied frequency and cost of each observed episode of use for that subject and provider type by the ratio of a) the total number of identified providers of that type for that subject to b) the number of providers of that type for that subject whose records we received. If we had not received any records of a specific provider type for a subject, we took frequency and cost for that provider type for that subject to be the provider-and-period-specific, adjusted sample mean. For example, if we retrieved 3 out of 4 neurologists’ records for a subject, we multiplied the frequency and cost of all episodes attributable to the 3 neurologists whose records we had retrieved by 4/3, to adjust for the assumed use in the one missing record. If we had retrieved none of the 4 records, we took that subject’s neurologist-specific frequency and cost for each period to be the mean of the entire sample’s neurologist-specific use for the same period. Finally, for subjects who responded to the follow-up request to identify new providers, we calculated the average frequency and cost attributable to these new providers by provider type and period. This average frequency and cost was added to the observed frequency and cost in those subjects who did not respond to the follow-up request, in order to estimate frequency and cost in providers that they may have seen, but about whom we had no information. This may underestimate use if non-responders were more likely than responders to have seen new providers. One subject was excluded due to an error in determining the follow-up period that resulted in our retrieving records that covered only the first 6 months of follow-up. 
For the same reason, two subjects had records that did not cover the period from 24 to 18 months prior to evaluation. We imputed frequency and cost during this time for these subjects to be equal to their use in the period 18 to 12 months prior to evaluation.
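The scaling rule for unresponsive providers can be sketched as follows; the function and variable names are ours, chosen for illustration, and the numeric values are hypothetical:

```python
def impute_cost(observed_cost: float, n_identified: int,
                n_received: int, sample_mean: float) -> float:
    """Scale a subject's observed cost for one provider type by the
    ratio of identified to received providers of that type; fall back
    to the provider-and-period-specific sample mean when no records
    of that type were received."""
    if n_received == 0:
        return sample_mean
    return observed_cost * n_identified / n_received

# 3 of 4 neurologists' records retrieved: scale observed cost by 4/3
adjusted = impute_cost(observed_cost=900.0, n_identified=4,
                       n_received=3, sample_mean=1100.0)
```

The same rule applies to frequency of use; costs attributable to new providers named at follow-up are handled by the separate averaging step described above.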

For inpatient care at the study site, admission dates on all UB-92s were cross-referenced to the dates of evaluation-related admissions recorded in the inpatient record or in the MSES database. Cost for a missing UB-92 was imputed as the site- and admission-specific average per diem cost, multiplied by the length of stay obtained from the inpatient record. If the inpatient record was also missing, cost and length of stay was imputed as the mean cost for that admission type across all subjects at that site.

We did not need to impute missing data on AEDs, since information on AED dosages was readily available across most provider records of varying types. We were thus able to construct an accurate chronology of changes in medications and dosages for all subjects and all periods.

Classifying Use - The variable reliability and quality of documentation of outpatient care precluded classification on an episode-by-episode basis. An algorithm was therefore used to classify care based on the type of care and the type of provider. The following episodes were considered ‘epilepsy-related’: laboratory assays of AEDs; CT/MRI/PET/SPECT scans of the head; EEGs; all outpatient visits to a neurologist, neurosurgeon, or neuropsychologist; and any other tests, laboratories or procedures ordered or performed by these specialists. AEDs were also considered epilepsy-related.

Outpatient visits to the following providers were considered ‘ambiguous’ because they could plausibly have been prompted by epilepsy symptoms but could just as likely have been prompted by other problems or symptoms: allergist, counselor, dentist, dermatologist, endocrinologist, family doctor, gynecologist/obstetrician, internal medicine, nurse, nurse practitioner, orthopedist, osteopath, pathologist, pediatrician, plastic surgeon, podiatrist, psychiatrist, psychologist, social worker, radiologist. Any tests or laboratories ordered or performed by these providers (except for the tests and procedures previously classified as ‘epilepsy-related’) were also considered ‘ambiguous’. All outpatient visits to, and tests or laboratories ordered or performed by, specialists other than these providers and other than a neurologist, neurosurgeon, or neuropsychologist were considered ‘unrelated to epilepsy’.
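The outpatient classification rule can be sketched as a small decision function. The provider and test lists below are abbreviated from the full lists given above, and the string labels are ours:

```python
# Abbreviated lists; the text above gives the complete enumerations.
EPILEPSY_PROVIDERS = {"neurologist", "neurosurgeon", "neuropsychologist"}
AMBIGUOUS_PROVIDERS = {"family doctor", "psychiatrist", "psychologist",
                       "dentist", "gynecologist/obstetrician"}
EPILEPSY_TESTS = {"AED assay", "EEG", "head CT", "head MRI",
                  "head PET", "head SPECT"}

def classify_outpatient_episode(provider: str, service: str) -> str:
    """Classify an outpatient episode as epilepsy-related, ambiguous,
    or unrelated, from the provider type and service type alone."""
    if service in EPILEPSY_TESTS or provider in EPILEPSY_PROVIDERS:
        return "epilepsy-related"
    if provider in AMBIGUOUS_PROVIDERS:
        return "ambiguous"
    return "unrelated"
```

For example, a neurologist office visit would be classified as epilepsy-related, a psychiatrist visit as ambiguous, and a cardiologist visit as unrelated. Inpatient and ED admissions were classified from record review instead, as described below.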

UB-92s and all inpatient and ED records were reviewed to determine the relatedness of the presenting problem to the patient’s epilepsy and to the surgical evaluation process. Inpatient and ED admissions were considered ‘epilepsy related’ if the purpose of the admission was to treat symptoms or injuries unambiguously caused by seizures (e.g., status epilepticus, AED toxicity, near-drowning during a witnessed seizure), to clarify seizure diagnosis (e.g., new-onset epilepsy, or new seizure type) or for post-operative complications (e.g., wound infections). All other admissions were considered ‘unrelated’. Admissions were considered related to the evaluation if results of routine pre-surgical tests performed during that admission were recorded in the MSES database or if the goal stated in the discharge summary was to determine appropriateness for surgery or to perform surgery. Post-operative admissions for treatment of a post-surgical complication or for re-operation were considered related to the evaluation. Post-operative admissions prompted solely by seizure recurrence were considered unrelated to the evaluation.