A Message to Educators: Lessons Learned from the Corporate Trenches

Funneling the Data into Intelligence

Data capture processes are mission-critical when it comes to turning data into intelligence. Today’s databases have interfaces that allow tight control over what data is captured and who is allowed to capture, edit, and view it. System controls should be supported by formal policies that clearly establish the rights and responsibilities of everyone involved. This is particularly true for those involved in analysis and interpretation, processes that demand a high level of integrity and expertise (Bernhardt, 2003, pp. 26-30). Educators who are responsible for analyzing and interpreting data will probably need training in new data analysis techniques (Williams, 2003, pp. 4-10).
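The kind of policy described above can be sketched in code. The following is a minimal, hypothetical illustration of role-based permissions for a student-data system; the role names, actions, and function are invented for this example and are not drawn from any product discussed in this article.

```python
# Hypothetical sketch: role-based control over who may capture, edit,
# and view data. Roles and actions are illustrative assumptions only.

PERMISSIONS = {
    "teacher":       {"capture", "edit", "view"},
    "counselor":     {"view"},
    "administrator": {"capture", "edit", "view", "export"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True if policy permits the given role to perform the action."""
    return action in PERMISSIONS.get(role, set())

# Every capture or edit attempt is checked against policy before
# the data enters the system.
assert is_allowed("teacher", "edit")
assert not is_allowed("counselor", "edit")
```

The point of the sketch is that the policy lives in one explicit table, so the rights and responsibilities the formal policy establishes can be audited directly rather than scattered through the application.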

A significant mistake made by corporate decision-makers was failing to plan for how captured data would be analyzed and interpreted. Marzano (2003) pointed out that many educators have fallen into the same trap. To be used effectively for decision-making and as a catalyst for improvement, the data gathered must be analyzed and interpreted in terms of potential causal factors identified through research (Slavin, 2002; Thomas, 2002). Hiebert, Gallimore and Stigler (2002) described this as “…an old problem revealed in a new light. Teachers rarely draw from a shared knowledge base to improve their practice.” To help change this practice, Marzano summarized the factors identified in the research as affecting learning, including school-level, teacher-level, and student-level factors. Notably, Marzano includes product-related factors at the teacher level. This fits well with the proposed data flow framework, since the framework is based on a thorough understanding of the people and products that directly affect goal achievement.
 
Of course, understanding causal factors that may be impeding goal achievement is not enough. This intelligence must be strategically fed back through the funnel, accompanied by a summary of the conditions and patterns that require corrective action, follow-up, or further explanation. As Alwin (2002) pointed out: “…data and their purposes are not just a means to an end. Rather they are both a means and an end.” Parsons (2003) observed that improvement plans tend to lose momentum when no follow-up mechanisms are in place. Comparing the approaches of two schools, Parsons illustrated this problem by describing how one school established action and evaluative inquiry teams that enabled it to turn data into intelligence that informed action and further evaluation.
 
Brimijoin, Marquissee and Tomlinson (2003) described an interesting application of the data flow framework by a 5th grade teacher who captures data from her students through ongoing extemporaneous self-assessments. Using a three-dimensional framework that includes pre-assessment, challenge moderation, and standard test results, the teacher captures, analyzes, and interprets the data throughout the course to target and address learner needs. While much of the data is informally collected, this approach allows the teacher to differentiate instruction precisely when it is needed most. Software developers appear to be positioning themselves to support this type of extemporaneous assessment, as is the case with the Pinnacle System offered by Excelsior. The system comes precoded with state and local benchmarks, which are correlated to actual assessments entered by teachers. Users at a variety of institutional levels can compile the data as needed, and parents can be given access through a Web-based interface (Hardy, 2003).
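The correlation step such a system performs can be illustrated with a brief sketch. The benchmark codes, student records, and scores below are invented for the example and do not describe the actual Pinnacle data model; the sketch only shows the general idea of tagging assessment entries with benchmarks and rolling them up.

```python
# Hypothetical sketch: rolling teacher-entered assessment results up to
# precoded benchmarks. All codes, names, and scores are invented.
from collections import defaultdict

benchmarks = {
    "M.5.1": "Multiply multi-digit whole numbers",
    "M.5.2": "Interpret fractions as division",
}

# Each assessment entry is tagged with the benchmark it measures.
results = [
    {"student": "A", "benchmark": "M.5.1", "score": 0.9},
    {"student": "A", "benchmark": "M.5.2", "score": 0.4},
    {"student": "B", "benchmark": "M.5.1", "score": 0.7},
]

def mastery_by_benchmark(rows):
    """Average score per benchmark across all entered assessments."""
    totals = defaultdict(list)
    for row in rows:
        totals[row["benchmark"]].append(row["score"])
    return {code: sum(scores) / len(scores) for code, scores in totals.items()}

summary = mastery_by_benchmark(results)
# Low averages flag benchmarks that may need reteaching.
```

Compiled this way, the same tagged records can be summarized at the class, school, or district level simply by widening the set of rows fed into the roll-up, which is the flexibility the article attributes to users at different institutional levels.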