Using Attorney Surveys and Systematic Reviews of Competitors in CLE Curriculum Design

by Pat Nester

President, Nester Consulting Group

I. Surveys.

It will be assumed for purposes of this discussion that no CLE provider is in the position of designing a curriculum completely afresh, that is, with no courses already in mind that are thought worthy of doing. Most of us have mission statements or other manifestations of organizational expectation that suggest at least areas of programs, if not specific titles. That being the case, the discussion here will be narrowed to how one might use surveys to supplement or refine one's curriculum rather than how to design it from scratch.

A. Eliciting Customer Preferences. The usual focus in survey design in CLE work is on eliciting preferences. We want to know what our prospective customers want, whether they like A more than B, whether they like the program idea well enough to attend the program, whether they feel strongly or only so-so.

Response Rate. Anything that reduces the response rate is to be avoided in survey research. Getting a high percentage of responses to a random sample survey is essential to generating statistically reliable and valid data. Many methodological issues like sample size, response rate, response bias, and margin of error can only be mentioned here, but there are many sources to which one can refer for help, as well as a number of qualified experts with whom one can consult.
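To make the sample-size arithmetic concrete, here is a minimal sketch in Python using Cochran's standard sample-size formula with a finite-population correction. The list size, response rate, and margin of error are illustrative assumptions, not recommendations.

```python
import math

def required_sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Cochran's formula with a finite-population correction.

    population -- size of the attorney list being sampled (assumed figure)
    margin     -- acceptable margin of error (0.05 = plus or minus 5 points)
    z          -- z-score for the confidence level (1.96 = 95% confidence)
    p          -- assumed proportion; 0.5 is the most conservative choice
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2           # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # corrected for list size

# Illustration: a section list of 8,000 lawyers, 95% confidence, +/- 5 points.
completed = required_sample_size(8000)   # about 367 completed surveys needed
# If past response rates run near 25%, the mailing must be four times larger:
mailed = math.ceil(completed / 0.25)     # about 1,468 questionnaires
print(completed, mailed)
```

The point is the one made above: a depressed response rate multiplies the required mailing (and the budget) directly, quite apart from any response bias it may introduce.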

Open-ended Questions. The first rule of survey design for us amateurs is to use open-ended responses sparingly. As tempting as it is, one can't just ask, "What are the hot topics we should do next year?" This puts too much pressure on the survey respondents. They have to generate responses from scratch. More often than not, the strain of such an effort persuades the respondent not to fill out the survey or to put it aside to work on it later. For respondents with schedules as dense as the typical lawyer, later rarely comes.

Typically opportunities for open-ended response appear as an “other—please specify” option at the end of a list of specific alternatives that have been presented for review. That is the right place for them.

Bear in mind that a response to an open-ended question will be statistically meaningless since it has not been reviewed by all respondents. If it has value, it is because it suggests something one hasn’t thought about before (that might be tested later on another survey) or because it has come up surprisingly often.

Closed-ended Questions. Essential as they are, it may not be easy to come up with a good list of closed-ended alternatives for review by respondents. But one must try, because the quality of the testable data put into the survey limits the quality of the inferences one can draw from it later. Often the list can be generated at brainstorming sessions of professional staff or of a CLE committee or, perhaps, in a more elaborate research effort, as the work product of a focus group or from structured interviews with knowledgeable attorneys representing a cross-section of likely practice areas that one’s curriculum might address.

When the list of alternatives to be evaluated by respondents is drafted, one should address several questions:

(1) If there is a strong response to an item, is it phrased in sufficient detail that one can design a program to address it? If one asked, for example, what are the hot topics on which we should do programs next year and one gave “real estate” as an option to be evaluated, a strong response to that option would not be enough to help one design a program. Few CLE providers attempt any more to market programs at that level of generality. It has been found more effective to drill down to the level of “environmental issues affecting real estate financing.” Coming up with response options at this level of detail is difficult but extremely important in turning the survey results into workable components of the curriculum.

(2) Is the list of options organized by practice area, so that a lawyer who doesn’t do any intellectual property work can easily skip over the intellectual property options and find and evaluate the family law options? Not requiring every respondent to evaluate every possible response on the questionnaire improves the probability that a lawyer will fill out as much of it as is relevant to that lawyer’s practice and turn it in. Better still, send separate questionnaires to lawyers in different practice areas, if they can be identified and randomly sorted.

(3) Has one made sure that somewhere on the survey instrument (typically at the end) one collects demographic data about respondents, including their practice areas, size of firm, location, length of practice, income, etc., so that one can identify the responses of those with the strongest connection to or knowledge about the item in question?

(4) Has one given the respondent explicit instructions welcoming comments on each item and a convenient blank space in which to write the comments?

(5) Does the scale that respondents will use to evaluate each option make sense? Often a five-point scale is used where the end points are something like “very desirable” and “not desirable at all.” Asking respondents to write down actual numbers representing the points of the scale can sometimes be problematic. If “5” is supposed to represent the “very desirable” end of the scale, an unknown few respondents will miss that message and believe that “1” should be the most positive response; being “number one,” in our culture, is a good thing. By asking the respondent to circle the desired response category or by using letters to identify the options, one can avoid the number ambiguity.

B. Identifying Behaviors. Research consultant (and ACLEA member) Jim Moffat has observed at past ACLEA meetings, citing research findings in consumer product marketing, that there can be considerable slippage between what customers say they like and what they will actually do. Perhaps this is most true in the CLE context when dealing with high-minded issues like legal ethics and professionalism. One almost feels duty bound to rate such matters highly when given a chance in writing to do so, but whether one would actually take the time out of the office to attend a program on such topics is not so certain.

Instead of asking lawyers their opinion about what they like, perhaps one should seek to understand what they do in their professional lives. As Emerson put it, “What you are stands over you the while, and thunders so that I cannot hear what you say to the contrary.” An alternative approach to surveying for preferences, then, is to survey for behaviors. Knowing what lawyers actually do all day, one may try to infer what CLE topics would be of most interest.

One great listing of lawyer behaviors is the “MacCrate Report,” Legal Education and Professional Development, published by the ABA in 1992 and named for Robert MacCrate, who chaired the Section of Legal Education and Admissions to the Bar’s task force on law schools and the profession. One goal of that study was to identify skills and values required by modern law practice. Although many of the skills identified are stated generically, they could be applied to the specific tasks required in the narrowest areas of law practice.

At the State Bar of Texas several years ago, we explored a methodology that attempted to identify and weight the importance of lawyer behaviors in an area of practice. Consultant (and ACLEA member) Jim Carder of the Business Professional's Network, with the advice of Dr. Cynthia Spanhel, the Director of Research and Analysis for the state bar, devised a "needs assessment" form, a copy of which is attached. The goal was to use experts from a self-identified practice area (in this case, a group of lawyers active in the real estate section of the bar) to identify the key behaviors that most contribute to success or failure in that practice. Then we asked respondents (a much larger group than the consulting experts) to (1) rate their own present level of proficiency at those tasks, (2) rate their desired level of proficiency, and (3) rate the importance of the task at issue.

The goal was to reveal demand for course titles or topics by homing in on tasks rated as highly important for which there was a large discrepancy between the respondent's present assessment of proficiency and that same respondent's desired level of proficiency. If there was a large discrepancy across a wide range of respondents, we interpreted that to mean that there would be demand for CLE on that topic.
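The article describes the scoring only in general terms, so what follows is a hypothetical reconstruction in Python: it assumes five-point ratings and defines a task's need score as the average proficiency gap weighted by average importance. The task names and ratings are invented for illustration.

```python
from statistics import mean

# Each response is (present proficiency, desired proficiency, importance),
# all on the 1-5 scales used on the needs-assessment form.
responses = {
    "Drafting environmental indemnity clauses": [(2, 5, 5), (3, 5, 4), (2, 4, 5)],
    "Reviewing title commitments":              [(4, 5, 3), (4, 4, 4), (5, 5, 3)],
}

def need_score(ratings):
    # Average proficiency gap, weighted by average importance; a larger
    # score suggests stronger latent demand for CLE on that task.
    gap = mean(desired - present for present, desired, _ in ratings)
    importance = mean(imp for _, _, imp in ratings)
    return gap * importance

# Rank tasks by need score, highest (strongest demand signal) first.
for task, ratings in sorted(responses.items(), key=lambda kv: -need_score(kv[1])):
    print(f"{need_score(ratings):5.2f}  {task}")
```

On this scoring, the drafting task surfaces first because respondents rate it both important and well beyond their present proficiency, which is exactly the discrepancy pattern the study treated as latent demand.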

Surprisingly, the hardest thing about this study was getting the group of expert insiders from the section to identify sufficiently specific and distinct behaviors for the respondents to evaluate. As one can see, the options to be evaluated on the instrument often represent several tasks, which impedes clarity of analysis in reviewing the results. But it was a start. The needs assessment was never distributed in quantities that would support a scientific analysis, and using it as a sample survey would have required much shortening and tightening. Nevertheless, the results were interesting enough to be used by planning committees on real estate topics for some years. It is submitted here as an approach that could be adapted to any practice area. In our situation, it also served to improve the relationship between Texas Bar CLE and the real estate section: we undertook to find out what the section's members thought would be the most valuable topics for future CLE events, an effort the section perceived to be of great value.

C. Commentary. Tightly designed surveys can be an aid in refining one's curriculum, but one is best advised to restrict them to particular practice areas rather than trying to design an all-purpose instrument. Both the creation of the questionnaire and the interpretation of the results require close collaboration with subject matter experts; such groups are as important to identify and organize as the sample of respondents. The opportunity presented in using surveys in curriculum design is in testing the market power of topic ideas. It is important to understand that such ideas are generally not the product of the survey itself, although they can be. The survey generally tests ideas that one comes up with from other sources: brainstorming, focus groups, interviews of stakeholders, professional staff, competitors, or perhaps responses to open-ended questions on other surveys. Effective sample surveys can be expensive to implement, ranging from a few thousand dollars to perhaps twenty thousand for a comprehensive instrument, but some CLE offerings cost hundreds of thousands of dollars; measured against the prospect of avoiding a losing program or finding a golden nugget, the cost of a survey may be small. Moreover, a well-designed survey is likely to assist with decisions about a range of courses in one's curriculum, not just a single course.

II. Reviews of Competitors' Programs. In the 1960s and '70s, Harris Morgan of Greenville, Texas, was one of the pioneers of the movement to bring business principles and technology to law practice. In delivering CLE talks all over the nation, Harris made one point repeatedly: if a document comes across one's desk that can be appropriated in one's practice, particularly a form of some kind, then one should have a whisky box handy under the desk into which one can throw it. (Whisky boxes happen to be the right size and may have other advantages.) Later one can go through the box and integrate all that liberated intelligence into one's own office processes. Harris' disciples all over the world recognize one another by their whisky boxes. Mine is an Austin Nichols Wild Turkey box.

Since I don't run a law office, what I throw into the Wild Turkey box are CLE brochures. It is important to get onto as many CLE mailing lists as possible so that one can see what's going on far and wide. In designing a curriculum, it is particularly important to review the work product of one's competitors. With the advent of Internet CLE, one's list of competitors has become much longer. I print out the interesting email ads I get from CLE providers and throw them into the whisky box.

One can search through the box for program ideas to try (as well as marketing, design, and pricing ideas, which are subjects for another article) and for programs not to try. If a topic is too well covered in one's market area, one may decide to avoid it.

On the other hand, one may suspect that there is excess demand that is not being met. The one percent hit rate for the typical CLE brochure is testament to an important fact: there are many lawyers out there who might be interested in a program but who were conflicted out of attending the first time because of an inconvenient date or place. Consequently, many providers have simply repeated successful programs, even in the same location, to take advantage of this fact. From a business point of view, then, the fact that someone else has done a successful program may be a reason to actively consider a similar offering rather than the reverse. The old "we need time to let the market recharge" apothegm may not bear close scrutiny.
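The arithmetic behind that judgment is easy to sketch. The mailing size and the share of conflicted-out lawyers below are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope arithmetic for repeating a successful program.
# All figures are illustrative assumptions, not industry data.
mailed = 20_000                       # brochures mailed for the first run
hit_rate = 0.01                       # the typical 1% brochure hit rate
first_run = int(mailed * hit_rate)    # 200 registrants

# Suppose just 2% of the non-registrants were interested but conflicted
# out by the first run's date or place:
conflicted_out = int((mailed - first_run) * 0.02)   # 396 lawyers

print(first_run, conflicted_out)
```

Even at these modest rates, the residual pool of interested lawyers can be nearly twice the size of the first audience, which is the business case for a repeat offering.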

Sifting through CLE brochures must be done with some care. Sometimes one is surprised that a competitor is trying out some old chestnut of a program long since past its prime. Sometimes there are individual topics identified that could be converted into full programs. One never knows. If something catches one's eye, it is a good idea to run it by lawyer experts in the field who have served on one's faculties. If it is a big enough or exciting enough idea, consider putting it on a curriculum survey to get it evaluated.