AMERICAN STATISTICAL ASSOCIATION (ASA)

MEETING OF THE COMMITTEE ON ENERGY STATISTICS

WITH THE ENERGY INFORMATION ADMINISTRATION (EIA)

Washington, D.C.

Friday, April 29, 2005

COMMITTEE MEMBERS:

NICOLAS HENGARTNER, Chair

Los Alamos National Laboratory

MARK BERNSTEIN

RAND Corporation

CUTLER CLEVELAND

Center for Energy and Environmental Studies

JAE EDMONDS

Pacific Northwest National Laboratory

MOSHE FEDER

Research Triangle Institute

BARBARA FORSYTH

Westat

WALTER HILL

St. Mary's College of Maryland

NEHA KHANNA

Binghamton University

NAGARAJ K. NEERCHAL

University of Maryland, Baltimore County

SUSAN M. SEREIKA

University of Pittsburgh

DARIUS SINGPURWALLA

LECG

RANDY R. SITTER

Simon Fraser University

ALSO PRESENT:

MARGOT ANDERSON

Energy Information Administration

HENRY BROOKS

Energy Information Administration

BETH CAMPBELL

Energy Information Administration

GUY CARUSO

Energy Information Administration

BRENDA COX

Energy Information Administration Contractor

JOHN PAUL DELEY

Energy Information Administration

HOWARD BRADSHER-FREDRICK

Energy Information Administration

CAROL FRENCH

Energy Information Administration

LYNN GEISERT

Z, Inc.

HOWARD GRUENSPECHT

Energy Information Administration

NANCY KIRKENDALL

Energy Information Administration

LAURIE KRAUSS

Office of Management and Budget

RUEY PYNG LU

Energy Information Administration

RENEE MILLER

Energy Information Administration

KARA NORMAN

Energy Information Administration

IRENE OLSON

Energy Information Administration

JANICE POLING

Energy Information Administration

ERIK RASMUSSEN

Energy Information Administration

KEN VAGTS

Energy Information Administration

BILL WEINIG

Energy Information Administration

MARGIT WHITAKER

Government Accountability Office

JOHN WOOD

Energy Information Administration

BARRY YAFFE

Energy Information Administration

* * * * *

C O N T E N T S

AGENDA SESSION:

Assessments: Presentations and a Panel Discussion

External Review of Survey Programs: A Progress Report

External Review of EIA Program

Progress on EIA's 914: Response Rate and Kinds of Challenges

Committee Discussants on EIA's 914, EIA's 826

* * * * *

P R O C E E D I N G S

DR. HENGARTNER: Ladies and gentlemen, welcome back to the second session of the ASA Committee on Energy Statistics with the EIA. It's a pleasure to be back. Nancy will talk to us this morning about assessments, presentations, and panel discussions.

MS. KIRKENDALL: This is an update session, but we've made a lot of progress since last time too. Just to remind you, last fall we had our first PART assessment. That's the Program Assessment Rating Tool, an Office of Management and Budget program, and they rated us poorly in two areas. The questions were: are regular independent evaluations used to support program improvement and to assess effectiveness and relevance, and they said no, we didn't have any evidence of that. And the second one was: do we have independent evaluations that indicate the program is effective and achieving results, and they said we had a limited amount of that.

And so we have done some work to try to get started on doing more with external assessment, and this will be to tell you where we are. There are two words in here that occurred to me when I was looking at them this morning that we need to pay attention to. What you'll hear about today is that we've been looking at evaluations to support program improvement and look at our relevance, but I'm not sure that we have them looking at effectiveness. So that's something for the team who are sitting back there to think about. I think we need to take another look at these words and see where we're going with it.

So the last meeting was very similar to this meeting. We had Brenda Cox, who told us about our plans to look at a program assessment of a family of surveys. At that point she told you about the template that she was using for evaluating a single survey.

Well, now she's developed a template for evaluating a program of surveys, and she tried it out on the petroleum marketing package. So she's going to give you an update of where she is with that. Last time Doug Hill led a discussion about external reviews that was supposed to look at modeling and forecasting, but you came back with the idea for a high-powered review team, and so we have taken that and are running with it, and John Paul Deley is going to tell you about our progress with that.

For this session we only have about an hour and five minutes, and so the way we thought we'd set it up is that Brenda will talk for 10 minutes. We'll have discussion among the committee about her talk; that's the evaluation for a survey program. Then John Paul will talk for 10 minutes telling you where we are with the external study team, we'll have another discussion for 10 minutes, and then we'll try to assemble the pieces.

So this is just open discussion, really any ideas you have about external program evaluations. But the other word in the items that we didn't do so well on with PART was that we have this on a regular basis, and so then the thought is how do we put this all together in a sensible program. Maybe the high-powered review team is every five years, something that might feed into strategic planning, the overview of EIA, maybe combined with individual program-specific evaluations during the interim.

Anyway, that would be an open discussion. Any idea is welcome; throw things out. Well, those are my questions, what I just said, so why don't we just start with Brenda and go on from there?

MS. COX: You're queuing up my talk?

MS. KIRKENDALL: I'm trying.

MS. COX: Okay, I'll start. Well, the last time we spoke about this external review of survey programs, I should tell the newcomers that the idea behind this project was to develop the procedures. The process of an external review of a family of surveys, that was our goal. But we decided that there's no way you can develop a final process without trying it out, seeing how it works, and evaluating it, so that is what I'm speaking on today.

This sort of research, by the way, is being done by Battelle under a subcontract we have from Z, Inc., and they've been very helpful. Henry Brooks of Z, Inc., has also assisted me on this project.

When I was here before, I actually presented the survey template, which was a guide to reviewing an individual survey. Since then we've developed the program evaluation template. We've also tested the survey evaluation template on six surveys, and we've tested the program evaluation template. Both of these were done for the petroleum marketing family of surveys. The next step is to produce these penultimate templates, using the experience we've gained to say what we think these survey and program templates should be like, and then to document the findings.

Now, the new thing for this committee is the program evaluation template, and that was included with the materials that were given to the committee. The template has an overall description of the program, the program's objectives, target populations, sampling frames, and program design; for an individual survey it'd be a sample design, but for a program it's the overall design, the conceptual design, of the whole program. Then it covers data collection and processing, data analysis, products and documentation, and a summary of findings.

We've actually tested both templates now. Only publicly available documentation was used for the evaluation. That would include the OMB package, explanatory notes and reports, the actual forms that the respondents complete, and anything posted on EIA's website.

We had limited interactions with program staff at EIA, so it was pretty much done independently. And then for both the surveys and the program we produced a summary of findings, which was shared with program staff to obtain their comments and corrections.

For the survey template we actually evaluated six petroleum marketing surveys. The survey methods were summarized and critiqued for each item on the template. Recommendations or endorsements were provided.

Endorsements are for when the survey is doing something right; you need to say so. In other words, you need to talk about what's good as well as make recommendations for improvement.

And then the results were documented for EIA staff review. These summaries for the surveys were from seven to ten pages, so they're fairly long, detailed descriptions of what the survey is doing and what it might be doing.

The result, which I was very pleased with, is that the whole survey evaluation process worked. We did discover that originally, in the template I presented to the committee last time, we had the recommendations at the end of the survey. That didn't work because it led to repetition: you're discussing a problem and you wait until the end to talk about what to do about it. So we felt it was better to have the recommendations presented and labeled at the time of the discussion. So if you have a recommendation for the target population, it'd be presented in the target population section.

We found that design attributes and flaws were repeated across surveys, so that suggests in some ways that there will be a synergy possible through doing these evaluations, because not only these surveys but perhaps other EIA surveys can benefit from the comments.

Then we found what I would say is a serious lack of survey methods reports, which did impede the evaluation. I would say EIA is not unique in this regard; as federal agencies go, it's probably more like the norm, but I think this is something that needs to be worked on. The typical findings for an individual survey were that the survey objectives need to be more completely defined. Typically we took the survey objectives out of the OMB package.

The target population definitions needed to be made more specific. Sometimes I couldn't tell what was being included geographically in the target population, for instance. Some include the territories and possessions, and for some it wasn't clear whether they were or were not being included. We found that coverage and frame updating procedures need to be specified, and sometimes they weren't.

We found that insufficient information was provided to evaluate the editing procedures. We found that the mailout package tends to be very well designed; it might need a little tweaking in minor ways, so this was not a major criticism. We were just finding little things that could be tweaked about the mailout package, but it has a nice design overall.

For the program itself, which was done after evaluating six of the surveys and reviewing the rest, we found there was no overall program documentation at that point in time. We found the survey evaluations were a more important input than we originally thought. When we started this process, we thought the survey evaluations would be done just to get the reviewer, and that would be me, accustomed to what the whole family was like before turning to the program.

Well, we found those were just more important than we had conceived. We found quite a bit of variation across surveys that was both intentional and due to happenstance, and that in part is because the program wasn't designed from scratch, is the best way to put it.

MR. FEDER: Brenda, a variation in what sense?

MS. COX: For instance, the definition of a firm would vary across surveys. Some surveys were extremely precise, and I liked what they did in the way they defined the survey. Others were not as precise. In fact, it looked like they just let the company decide who should report for that company or whether multiple reports should be done.

What other examples would I give? The target population does differ across surveys. It starts out including the territories and possessions. At some point it changes and it's only the 50 states and DC, so that's an example of intentional variation. It's not really presented so that you can clearly see that a change is occurring.

The variation due to happenstance comes about just because these were derived over such a long period of time. In fact, many or most of these surveys were inherited by EIA from other agencies. What I'm saying here is that this program was not designed as a program with the surveys then evolving from the program. Rather, the surveys came about and the program has just been the compilation of the surveys, in some respect.

For the program findings, we found that a program-level conceptual design was needed that didn't exist, at least not documented as a public document. I should say, EIA being the way EIA is, there's already a draft of that started since I did this review. I found the population definitions and reporting conventions need to be standardized across the surveys. Because of the way the surveys were created, this is something that's needed: often the same companies are responding to these surveys, and it could even be the same person, so you need a certain commonality across surveys. And methods documentation was needed for the program and the surveys.

Now, for this contract the remaining activities are to use the information that we've gained to revise the two templates, to create what will be the penultimate templates based upon the results of this actual experiment, and then to document the templates and their use.

I have these questions for the committee. You've never seen the program templates, so I asked: are there additional items that you think should be included in the program template, do some program template items need to be modified or expanded, and are there program template items that should be deleted? So basically I thought that you might want to comment on the template itself.

That's it. Oh, and I should say the committee was also given a copy of the template as well as the actual executed version for the petroleum marketing family.

DR. HENGARTNER: Comments, anyone?

MR. FEDER: More a question than a comment. Nancy, when you were, as you said, rated not so great on that particular issue, was it more at the program level or the survey level?

MS. KIRKENDALL: It was program level, and that brings up the question of what's a program. I mean, one view is that, well, EIA is a program, we are an energy information program, and so that's the topic. Getting a review at that level is what John Paul will talk about. But we also have other programs, and I think right now we bundle our OMB clearance packages and they actually do define programs, so we have a petroleum marketing program.

The surveys do fit together in a sensible way and can be evaluated by themselves. Do we do a good job covering the petroleum marketing area, the prices and things that we're supposed to collect in that area? Petroleum supply might be another area, or maybe those are related and should be looked at together, but you can break down what we do in terms of programs to see if the answer to the OMB questions could be provided for those too.

MS. COX: And I would suspect, just as we found that you have to look at the surveys to talk about the petroleum marketing program, you probably have to look at the individual programs to talk about the larger EIA program. So it's like looking at the building blocks as you go up, to talk about what you're doing, why you're doing it, and whether it is being done in the best way it could be done.

DR. HENGARTNER: It occurs to me that this overall review is a very good thing, but it might miss maybe a third of the work you're doing, namely, every special report that's mandated by Congress. How are you going to evaluate these, and how are you going to show Congress or the OMB that this is valuable and well done, given that they're one-time shocks?

MS. KIRKENDALL: That's a good question. One of the things we've tried to do for this external review team is to prepare documentation saying what EIA is. As Brenda said, if you take our little programs and you put them together, that's EIA. But when you look from the top down at what we do, that should be the overall EIA program. And one reason why we started with Brenda looking at surveys is that, at least for me, it's easier to think about a program of surveys and what they build up to try to measure.

But we also have other programs. We have our forecasting programs. You've heard about STEO. Margot has lots of different programs under her jurisdiction. She has the short-term forecasting. She has the country analysis group. I'm not sure what you'd say her program is because it's so diverse. We have the NEMS forecasting. Is that one program? It's easy to say that in one breath, so is that one program? And then there are the service reports, and the people in the forecasting office do a lot of those. So you're right. We need to have those included too somehow.

MR. SINGPURWALLA: How many surveys were included in your initial setup review?

MS. KIRKENDALL: I think Brenda looked at six, but the petroleum marketing package has, I think, eleven.

MS. COX: Eleven.

MS. KIRKENDALL: So hopefully we'll have a little more funding and can actually finish off the whole package.

DR. HENGARTNER: So, going back to your answer, how many programs does EIA have?

MS. KIRKENDALL: Well, we have to figure out how to count them.

DR. HENGARTNER: I mean, it's very interesting what you're showing us, Brenda, but as a statistician I want to know what the population is. And it might be helpful if we knew what the population is, just by numbers, for example, to be able to identify whether we can help you.

MS. KIRKENDALL: If you're talking about survey programs, in oil and gas you've got petroleum marketing. And this is how we're organized, too, which is why I say it this way, because it's easy to break it down. Petroleum marketing, petroleum supply, natural gas, and then reserves and production are separate divisions, and those are separate forms, clearance packages, that go to OMB. In coal you have the electric power surveys, the coal surveys, and then some miscellaneous surveys like renewables and alternative fuel vehicles. Margot has the financial reporting system and the consumption surveys.