Injury Surveillance Workgroup Conference Call (Workgroup 2, Meeting #6)

September 28, 2012

Call Attendees:

  • Organizers/administrators:
      • Marguerite Carroll (Falmouth Institute, Project Manager)
      • Basla Andolsun (Falmouth Institute, Curriculum Development)
      • Holly Billie (Injury Prevention Specialist, CDC, co-lead of project)
  • Workgroup members:
      • Ward Jones
      • Jon Peabody
      • Jaylene Wheeler
      • Siona Willie

Agenda:

  1. Opening Comments/Thoughts
  2. The IHS program/product is focused on getting information to the public directly; a surveillance system is not the same thing, in that its audience is tribal health people and service unit management staff. We have not marketed it to the public on a wider scale.
  3. Content on marketing is overdone. One way to market a surveillance system is to use it, e.g., make posters of what aggregate 5-year deaths look like, and it will market itself (relevant to grants).
  4. L2 IS geared toward this. Not sure we need to spend a whole lot of time on marketing; rather, we should spend time on communicating our results to the stakeholders. Marketing is to a narrow audience (the folks at the table). The important thing is communicating results to partners.
  5. Linguistic competency: think about people who are bilingual, don't speak English, or have disabilities when disseminating results, and use terminology folks will understand.
  6. Objective #7
  7. Page 4
  8. Title of objective – make the distinction that this is about evaluating the surveillance system, not prevention activities.
  9. General data does not prove cause and effect – the need is to evaluate the system and to track injury data in general.
  10. CDC language is too "high" here – simplify to plainer English.
  11. CDC examples had to do with police chiefs, etc., not the people in the trenches; that focus is on convincing higher-ups of the need. The IHS objective is to teach people who want a system how to do it. People have already bought in; now they need to know how to do it.
  12. Page 5
  13. This needs to be stated, but it's idealistic. Folks like the numbers and don't care how we arrived at them.
  14. Add AI/AN examples.
  15. Page 5, 1.2 – Don't get into evaluation of the data; just ask whether the process works.
  16. Page 9
  17. Q: Is this process applicable to people taking the class?
  18. A: Page 6, Formative Evaluation, is for the planning stage. Go back at a later time and do a process evaluation: is it doing what it was intended to do?
  19. Q: Do slides 6-8 apply here?
  20. A: They may be worth reviewing, but we should be looking to the future: does the system work, and if not, how can we make it work? Process evaluation is what we should be doing here, not formative evaluation.
  21. Take out slides 6-8.
  22. In terms of surveillance, we should be doing internal quality assurance, making sure that the data are appropriate. Quality assurance should be part of the surveillance system as well. See if you're missing anything in your system.
  23. A process of doing surveillance is having one person look at data as it comes in, then another person look at the coding and summary of the data to make sure it's coded correctly. Cleaning the data ensures appropriate categories and reduces errors. Need clean data.
  24. Quality assurance of the process (the evaluation process) is also important and separate from QA of (evaluation of) the data.
  25. Figure 1 needs more applicable examples.
  26. Box 1 is ok.
  27. Box 2 – remove “operations team” (no replacement). Just “Train key persons in the process.”
  28. Box 3 – change to “Collect data obtained from your sources.”
  29. Box 4 is the same as 3 and redundant.
  30. Box 5 – change to “Review quality of information.”
  31. Box 6 – change to “Complete the data entry.”
  32. Box 7 – change to “Conduct data analysis and disseminate findings to stakeholders.”
  33. Figure seems to be describing the system and how to build one, NOT how to evaluate process.
  34. Re-title figure as “Flow of Activity for a Surveillance System.” The next slide is on how to evaluate this.
  35. Page 10
  36. 1.3 – Different people may want to evaluate their systems differently: a process evaluation could cover every step, while quality assurance of the data would cover just the last two or three steps.
  37. 1.4 – Not sure what they're talking about here. It should just be "Does it work the way we wanted it to work?"
  38. Maybe consider looking at different ways to evaluate. Is the system giving me the information I need? For anything beyond that, steer people to the epi center for assistance.
  39. Say, "Contact your tribal epi center for assistance," or, "See your area I.P. advisor."
  40. Don't need to spend as much time on evaluation of the surveillance system; the important thing is building it.
  41. Pages 11-12 are beyond scope of course.
  42. Page 13
  43. Timeliness is important, but the rest can be left out. Timeliness is one evaluation factor to use, comparing yourself to others.
  44. Page 14
  45. i. Stability – Yes, include a sentence or two about this (continuity, longitudinal data).
  46. 1.5 – Gather your thoughts and talk to the folks involved: "Here are the problems we found in the evaluation; are we going to fix them or not?"
  47. Add the importance of keeping notes in a file about what you've done, changes to the system, and whether the changes have affected the data.
  48. Holly has a term for this and will let us know what it is.
  49. Give an example, such as suicide plus drugs. This may be important when you have a new focus, new people, or a switch from ICD-9 to ICD-10. Better to note when the switch is made (the switch will happen in 2014).
  50. 2 – All statistical content has been covered in Objective 5. Leave pages 14.2 to 20 out.
  51. Page 21
  52. As long as it is stated that you’re just tracking data and not proving cause and effect, you’re ok.
  53. The material in the bullet points here is iffy. Presenters could discuss the bullet points as informal evaluation, but they should note awareness of other contributing factors that may have influenced outcomes, or whether a change in the system or in investments is the reason for a difference.
  54. Pages 22-25 – Leave out.
  55. Page 26
  56. Remove second bullet point.
  57. Third bullet point is iffy. The instructor could discuss good ways data are used to track injuries and upward and downward trends, which may or may not relate to prevention activities. This is not cause and effect; a study would be needed to try to show a causal relationship. Also, prioritize by knowing where you can make a difference.
  58. Closing Remarks
  59. After the materials are in final form, a pilot course will be run, tweaked, and handed over to IHS to administer.
  60. Suggestion: Have workgroup members, e.g., Jon or Siona, serve as instructors or provide first-hand evaluation of the pilot course.
  61. Outlines will be sent out and next meeting announced shortly.

To Do:

Holly: Please give Marguerite the term for the process of identifying the problems found through an evaluation and deciding if the problems should be fixed or not.

Marguerite: Create an outline and send it to all.

All: Next meeting TBA (later in the fall)
