Learning Progressions Challenge

In recent weeks the nation’s premier journal for science and engineering, Science, has published an Editorial and an Educational Forum on the National Research Council’s [http://aalhe.org/sites/default/files/NGSS%20Framework.pdf Next Generation Science Standards (NGSS)]. Addressing these standards will not only transform science education, but define two methodological challenges for assessment: the "easy practices challenge" and the "converging dimensions challenge." The future of the assessment enterprise will depend on how we address these challenges.

The purpose of the new standards is to reorganize science education away from the factoid approach amenable to multiple choice testing and toward a new framework (downloadable from the link) that is as radically different from the way science is often taught in schools as it is radically faithful to the way science is done:

"The framework is designed to help realize a vision for education in the sciences and engineering in which students, over multiple years of school, actively engage in scientific and engineering practices and apply crosscutting concepts to deepen their understanding of the core ideas in these fields. The learning experiences provided for students should engage them with fundamental questions about the world and with how scientists have investigated and found answers to those questions. Throughout grades K-12, students should have the opportunity to carry out scientific investigations and engineering design projects related to the disciplinary core ideas."

(quoted from page 23 of the framework, downloaded 2/14/2013).

That the framework need not be restricted to science and engineering can readily be seen by removing the restriction to those fields from the three principles used to accomplish this vision. The first principle of the framework is the developmental concept of the learning progression, in which learners continually build on and revise their knowledge and abilities, starting from their curiosity and initial conceptions. The second is a focus on a limited number of core ideas both within and across the disciplines. The third is to illustrate how knowledge and practice must be intertwined in designing learning experiences. These three principles can readily be seen in the developmental progressions for design expertise defined in the Savannah College of Art and Design study of Trillions of Ways to Design. That study revealed 21 progressions of design expertise, drawn from interviews with 60 design faculty members in fields ranging from Architecture to Service Design.

The NGSS committee sketched possible developmental progressions for each practice or concept. This is where both the excitement and the assessment challenges begin. The excitement came from seeing a national body advocate developmentally based education, a topic I began writing about nearly 40 years ago. Duncan and Rivet (Science, 339, 2013, pp. 396-397) identified the challenges:

1. Understandings on the LP [Learning Progression] path… can be substantially different from accepted scientific concepts.
2. Little is known about how existing LPs interact within and across disciplines.

Examples of the first challenge included teaching weight before mass. Learners encounter the former in their daily experience with visible objects long before the latter and do not need much convincing to begin measuring the weight of invisible gases. Measuring the mass of “weightless” objects in orbit is technically sounder physics, but learning it depends on much less accessible experience. Duncan and Rivet urge teachers not to “shy away from such inaccuracies in the development of standards, assessments, and curricula.” From the viewpoint of the integrated assessment theory presented in these postings, such inaccuracies actually stem from easy practices. An important research question for assessment is whether easy practices are necessary to move learners from beginning practices (the first things they try) to practical practices (how they need to perform in a job). In developmental interviews, easy learning is defined as learning that grows fast, does not compete with other practices in the same dimension, overshoots the resources that the activity depends on, and causes the activity to collapse unless it is replaced early by a slower growing but more competitive activity. Drawing with stickmen and geometric shapes is an example: continued use in upper elementary school gets a child labeled a poor drawer, a label that ends many a student’s interest in drawing. Since easy practices have such dire consequences, it is important to know their contribution to the acquisition of more advanced practices and, if they do contribute, how to tell when they need to be replaced before they produce collapse. This, then, is the "easy practices challenge" for assessment.
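One way to make that definition concrete is to treat an easy practice as a fast-growing activity that consumes a limited resource. The sketch below is only an illustration of that reading: the growth rates, the resource pool, and the replacement timing are invented for the example and are not parameters drawn from the developmental interviews.

 # Toy simulation (illustrative only): an "easy practice" grows fast, drains the
 # resource it depends on, and collapses unless a slower, more competitive
 # practice replaces it early.  All numbers here are invented for the example.
 def simulate(replace_at=None, steps=30):
     easy, advanced = 1.0, 0.0     # usage levels of the two practices
     resource = 100.0              # whatever the easy practice consumes (novelty, approval, ...)
     for t in range(steps):
         if replace_at is not None and t >= replace_at:
             advanced = advanced * 1.15 + 0.5       # slow, steady growth once introduced
         if resource > 0:
             easy *= 1.6                            # fast growth while the resource lasts
         else:
             easy *= 0.5                            # collapse once the resource is overshot
         easy = max(easy - 0.3 * advanced, 0.0)     # loses out to the more competitive practice
         resource = max(resource - easy, 0.0)       # growth drains the resource
     return round(easy, 1), round(advanced, 1), round(resource, 1)

 print("no replacement:      ", simulate())              # easy practice overshoots, then collapses
 print("replaced early (t=5):", simulate(replace_at=5))  # slower practice takes over before collapse

The point of the sketch is only the shape of the trajectories: fast growth, resource overshoot, and collapse unless something slower but more competitive takes over in time.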

Challenge number 2 is even more important. Duncan and Rivet referred simply to the way crosscutting concepts like energy emerge in different disciplines. When multiple developmental progressions are assessed across programs in a learning outcomes network, it becomes possible to study the conditions under which advancement in one dimension affects advancement in another or, conversely, the conditions under which practicing developmental errors in one dimension can hold back development in another. Do people need to learn how to collaborate before they can learn how to write from two points of view, or vice versa? Does the inability to value the input of others on a topic prevent one from writing effectively? The great historian of psychology, Kurt Danziger, concluded his insightful analysis in Constructing the Subject with the observation that “progress in explicating psychological reality… is likely to involve the mutual confrontation of divergent empirical domains and the different investigative practices that constitute them.” The second assessment challenge of the learning progressions framework extends such insight into the entire realm of knowledge acquisition. This is the "converging dimensions challenge" for assessment.
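As a loose illustration of what probing converging dimensions might look like, the sketch below asks whether earlier standing on one progression predicts later standing on another more strongly than the reverse. The dimension names, the scores, and the simple cross-lagged correlation are all assumptions made for the example; they are not a method specified by the NGSS framework or by Duncan and Rivet.

 # Illustrative only: fabricated scores for eight learners on two progressions
 # ("collaboration" and "writing from multiple points of view") at two times.
 def pearson(xs, ys):
     n = len(xs)
     mx, my = sum(xs) / n, sum(ys) / n
     cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
     sx = sum((x - mx) ** 2 for x in xs) ** 0.5
     sy = sum((y - my) ** 2 for y in ys) ** 0.5
     return cov / (sx * sy)

 collab_t1, collab_t2   = [1, 2, 2, 3, 1, 4, 3, 2], [2, 3, 2, 4, 2, 4, 4, 3]
 writing_t1, writing_t2 = [1, 1, 2, 2, 1, 3, 2, 2], [2, 2, 2, 3, 1, 4, 3, 2]

 # Cross-lagged comparison: does early collaboration predict later writing
 # more strongly than early writing predicts later collaboration?
 print("collaboration t1 -> writing t2:", round(pearson(collab_t1, writing_t2), 2))
 print("writing t1 -> collaboration t2:", round(pearson(writing_t1, collab_t2), 2))

With real assessment data, the same comparison repeated across many pairs of dimensions is one way to begin mapping where progressions converge and where errors in one hold back another.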

Because assessment professionals must work in cross-disciplinary social contexts, we are uniquely positioned to provide investigative practices that address these two challenges.
