Published on March 11, 2014
National Council on Measurement in Education
Sunday, April 28, 10:00, Grand Ballroom A, 3rd Floor
John Behrens (Pearson, Center for Digital Data, Analytics, & Adaptive Learning): Framing comments
Panel 1: Beyond the Construct: New Forms of Measurement
• Marcia Linn (UC Berkeley): Interpreting student progress with embedded assessments
• John Byrnes (SRI International): Text Analytics for Big Data
• Kristen DiCerbo (Pearson): Invisible assessments in the digital ocean
– Questions/discussion
Panel 2: The Test is Just the Beginning: Assessments Meet System Context
• Gerald Tindal (U of Oregon): Curriculum-based Measurement and State Data
• Lindsay Page (Harvard University): The Strategic Data Project
• Jack Buckley (NCES): Federal data efforts
– Questions/discussion
Panel 3: Connecting the Dots: Research Agendas to Integrate Different Worlds
• Andrea Conklin Bueschel (Spencer Foundation)
• Ed Dieterle (Bill and Melinda Gates Foundation)
• Edith Gummer (National Science Foundation)
– Questions/discussion
BIG DATA AMERICAN STYLE: TECHNOLOGY, INNOVATION, AND THE PUBLIC INTEREST
Monday, April 29, 10:35am–12:05pm, Parc 55 / Divisadero
• Ryan Baker (Teachers College; President, International Educational Data Mining Society): Educational Data Mining: Potentials and Possibilities
• John T. Behrens (Pearson): Harnessing the Currents of the Digital Ocean
• Aimee Rogstad Guidera (Data Quality Campaign): The 4 Ts of State Data Systems (Turf, Trust, Technology, and Time): A Policy Perspective on Empowering Education Stakeholders with Data
• Kathleen Styles (Chief Privacy Officer, Department of Education): Hold Your Horses! Addressing Privacy and Governance for Big Data & Analytics
• Phil Piety, John T. Behrens, Roy Pea: Educational Decision Sciences and Interpretive Skills
• Barbara Schneider (Michigan State, AERA President 2013–2014): Discussant
• What is “big data”… really?
• How does “big data” relate to education?
• How does “big data” impact the field of measurement?
• How much of “big data” is hype, and how much is real change?
“Big data exceeds the reach of commonly used hardware environments and software tools to capture, manage, and process it within a tolerable elapsed time for its user population.” – Teradata Magazine, 2011
“Big data refers to data sets whose size is beyond the ability of typical database software tools to capture, store, manage and analyze.” – The McKinsey Global Institute, 2011
From “Steamrolled by Big Data” by Gary Marcus, The New Yorker, April 3, 2013
Copyright 2012, Cognizant. http://www.cognizant.com/InsightsWhitepapers/Big-Datas-Impact-on-the-Data-Supply-Chain.pdf
Tavo De León: BigDataArchitecture.com http://bigdataarchitecture.com/wp-content/uploads/2012/02/Big-Data-New-Frontiers-for-IT-Management-AITP.pdf
Mark Gahegan Centre for eResearch & Computer Science University of Auckland
Tableau Software: http://www.tableausoftware.com/solutions/supply-chain-analysis
Which one is Education?
• Natural evolution with parallels to other fields
• Education faces data differences:
  – Error
  – Comparability
  – Human factors
• Infrastructure challenges
• Forward movement is inevitable: BIG DATA is coming
INTERPRETING STUDENT PROGRESS FROM EMBEDDED ASSESSMENTS: EXPANDING ITEM TYPES FOR ASSESSING INQUIRY
• Marcia C. Linn, University of California, Berkeley
• Ou Lydia Liu, Educational Testing Service
• Kihyun (Kelly) Ryoo, University of North Carolina, Chapel Hill
• Vanessa Svihla, University of New Mexico
• Elissa Sato, University of California, Berkeley
Invisible Assessment in the Digital Ocean Kristen DiCerbo, Ph.D. @kdicerbo April 28, 2013
The Digital Ocean
Copyright © 2011 Pearson Education, Inc. or its affiliates. All rights reserved.
Invisible Assessment
The ability to capture data from everyday events should fundamentally change how we think about assessment.
Micro-level
Macro-level (chart spanning September to June)
Evidence-Centered Assessment Design
• What complex of knowledge, skills, or other attributes should be assessed? (Student Model)
• What behaviors or performances should reveal those constructs? (Evidence Models: measurement model and scoring model)
• What tasks or situations should elicit those behaviors? (Task Models)
[Diagram: student model linked through evidence models to a family of task models]
Mislevy, Steinberg, & Almond (2003)
We Don’t Know it All…
Technical Issues
• How do we capture, store, and extract huge event log files?
Measurement Issues
• How do we model changing proficiency?
• How do we make sense of stream data?
• How do we eliminate experience and interface effects?
Design Issues
• How do we balance rich environments with the need to isolate skills?
• How do we allow student control while observing what we need?
• How do we communicate results?
Implementation Issues
• Will teachers and parents trust the scores?
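Of the measurement issues above, "modeling changing proficiency" has at least one well-known starting point: Bayesian Knowledge Tracing. The slide does not prescribe a method, so this is a minimal sketch with illustrative (not fitted) parameter values:

```python
# Bayesian Knowledge Tracing: track P(skill is known) as responses arrive.
# Parameter values are illustrative, not estimated from real data.
P_INIT = 0.2    # prior probability the student already knows the skill
P_LEARN = 0.15  # probability of learning on each practice opportunity
P_GUESS = 0.25  # probability of answering correctly without the skill
P_SLIP = 0.10   # probability of answering incorrectly despite the skill

def bkt_update(p_known, correct):
    """One step: condition on the response, then apply the learning transition."""
    if correct:
        numer = p_known * (1 - P_SLIP)
        posterior = numer / (numer + (1 - p_known) * P_GUESS)
    else:
        numer = p_known * P_SLIP
        posterior = numer / (numer + (1 - p_known) * (1 - P_GUESS))
    return posterior + (1 - posterior) * P_LEARN

def trace(responses):
    """Return the estimate before any response and after each one."""
    history = [P_INIT]
    for correct in responses:
        history.append(bkt_update(history[-1], correct))
    return history

# Estimate climbs with correct answers and dips after the error:
print([round(p, 3) for p in trace([True, True, False, True])])
```

The appeal for stream data is that the update is constant-time per event, so the estimate can be maintained live as log events arrive.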
A Change in Thinking
• Item paradigm to activity paradigm
• Individual view to social ecosystem view
• Assessment isolation to educational unification
Thank you
Kristen.DiCerbo@pearson.com
http://researchnetwork.pearson.com
Text Analytics for Big Data Big Data: New Opportunities for Measurement and Data Analysis National Council on Measurement in Education 2013 Meeting John Byrnes Computer Scientist SRI International 29 April 2013
Automatic organization and identification of text
• Collection analysis for review of National Science Foundation programs
• Analysis of clinician notes for an expert advisor for the National Institutes of Health
• Massive data analysis for the US Intelligence Community
• Information extraction of names of:
  – persons, locations, organizations
  – ships, cargo, ports
  – scientific entities
  from text sources:
  – web forums, blogs
  – scientific journal articles
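The extraction task in the list above can be illustrated in miniature with hand-written patterns, though SRI's systems use statistical models rather than regexes; the text, titles, and patterns below are invented for illustration:

```python
# Information extraction in miniature: pull candidate person and organization
# mentions out of free text with hand-written regex patterns.
import re

text = ("Dr. Jane Smith of SRI International presented results, "
        "and Prof. Alan Jones of Example University responded.")

# A title followed by two capitalized words is treated as a person mention.
person = re.compile(r"(?:Dr|Prof|Mr|Ms)\.\s+[A-Z][a-z]+\s+[A-Z][a-z]+")
# A run of capitalized words after "of"/"at" is treated as an organization.
org = re.compile(r"(?:of|at)\s+([A-Z][A-Za-z]*(?:\s+[A-Z][A-Za-z]*)+)")

people = person.findall(text)
orgs = org.findall(text)
print(people)  # ['Dr. Jane Smith', 'Prof. Alan Jones']
print(orgs)    # ['SRI International', 'Example University']
```

Pattern-based extraction breaks down quickly on real web forums and blogs (no titles, inconsistent capitalization), which is why large-scale systems learn extractors statistically instead.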
Distributional Semantics
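Distributional semantics rests on the idea that words occurring in similar contexts have similar meanings. A toy sketch of the core mechanism, with an invented corpus and an arbitrary window size (real systems use far larger corpora and dimensionality reduction):

```python
# Distributional semantics in miniature: words that occur in similar contexts
# get similar co-occurrence vectors, compared here with cosine similarity.
from collections import Counter
import math

corpus = [
    "the student solved the physics problem",
    "the student answered the physics question",
    "the teacher graded the physics problem",
    "the cat sat on the mat",
]

def cooccurrence(sentences, window=2):
    """Count, for each word, the words appearing within +/- window positions."""
    vectors = {}
    for sentence in sentences:
        tokens = sentence.split()
        for i, word in enumerate(tokens):
            ctx = vectors.setdefault(word, Counter())
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    ctx[tokens[j]] += 1
    return vectors

def cosine(u, v):
    dot = sum(u[w] * v[w] for w in set(u) & set(v))
    norm = math.sqrt(sum(c * c for c in u.values())) * \
           math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

vecs = cooccurrence(corpus)
# "problem" and "question" share contexts ("the physics ..."), so they come
# out more similar to each other than either is to "mat".
print(cosine(vecs["problem"], vecs["question"]))
print(cosine(vecs["problem"], vecs["mat"]))
```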
Automated Front End
• Real-Time Concept Recognition
  – Custom hardware
  – Fiber-optic rate (2.4 Gbps)
• Real-Time Language Identification
  – Separate platform
  – Web data without pre-processing
Data as Subject-Matter Expert
• Hypothesis generation for understanding premature birth
• Medical diagnostics for pediatric kidney injury
• User behavior modeling
• Data fusion and integration
Headquarters: Silicon Valley SRI International 333 Ravenswood Avenue Menlo Park, CA 94025-3493 650.859.2000 Washington, D.C. SRI International 1100 Wilson Blvd., Suite 2800 Arlington, VA 22209-3915 703.524.2053 Princeton, New Jersey SRI International Sarnoff 201 Washington Road Princeton, NJ 08540 609.734.2553 Additional U.S. and international locations www.sri.com Thank You
Data Management, Data Mining, and Data Utilization with Curriculum-Based Measurement Systems Gerald Tindal and Julie Alonzo Behavioral Research and Teaching (BRT) – College of Education, University of Oregon
Center for Education Policy Research at Harvard University | April 28, 2013 The Strategic Data Project: Annual Meeting of the National Council on Measurement in Education www.gse.harvard.edu/sdp
MISSION: Transform the use of data in education to improve student achievement.
The SDP Family
Core Strategies
1. Fellows: Place and support data strategists in agencies who will influence policy at the local, state, and national levels.
2. Diagnostic Analyses: Create policy- and management-relevant standardized analyses for districts and states.
3. Scale: Improve the way data is used in the education sector. Achieve broad impact through wide dissemination of analytic tools, methods, and best practices.
Diagnostic: Product + Process
Standard Analyses
• Human capital, college-going
• ~35 analyses each
• 10 college-going analyses to be on the Schoolzilla platform by year end
Customized Analyses
• Key issues identified by partner
• Denver: course grades analysis
• LA: on-track for A-G requirements
Data Work
• Collect, clean, connect
• Often this is a huge lift
• Much discovery happens (laying the groundwork for better data collection and management strategies in the future)
• Example: course data, teacher hiring data
Teaching
• Set up, manage, and support working groups
• Connect diagnostic to policy implications
• Change management
• Methods training
• Publishing findings; distribution
What the diagnostics are not…
• A set of specific recommendations about actions agencies should take to improve performance
• A comprehensive collection of all that can be done with existing data
• Root-cause analyses for specific issues
• A ranking of agencies
The SDP Human-Capital Diagnostic Pathway
Illustrative Guiding Questions
• Recruitment: When are teachers hired? How does teacher effectiveness vary with hire date?
• Placement: Which students are assigned to new teachers? How does this compare to those assigned to veteran teachers?
• Development: How do teachers develop in their level of effectiveness over time?
• Evaluation: How much variation exists among teachers based on effectiveness measures from the agency’s traditional teacher evaluation system? Based on a value-added measure of teacher effectiveness?
• Retention: What share of novice teachers remain in the same school and/or in the same district after five years?
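The value-added question can be made concrete with the simplest possible version: predict each student's current score from the prior-year score and average the residuals by teacher. This toy sketch (invented data, not SDP's methodology, which adds controls and shrinkage) shows only the core idea:

```python
# A deliberately minimal value-added sketch: fit current ~ prior by least
# squares, then average each teacher's residuals. Data are made up.
students = [
    # (teacher, prior_score, current_score)
    ("A", 50, 58), ("A", 60, 67), ("A", 70, 77),
    ("B", 50, 52), ("B", 60, 61), ("B", 70, 71),
]

n = len(students)
mean_p = sum(p for _, p, _ in students) / n
mean_c = sum(c for _, _, c in students) / n
slope_num = sum((p - mean_p) * (c - mean_c) for _, p, c in students)
slope_den = sum((p - mean_p) ** 2 for _, p, _ in students)
slope = slope_num / slope_den
intercept = mean_c - slope * mean_p

# Residual = how far each student lands above/below the prediction.
effects = {}
for teacher, p, c in students:
    residual = c - (intercept + slope * p)
    effects.setdefault(teacher, []).append(residual)
value_added = {t: sum(r) / len(r) for t, r in effects.items()}
print(value_added)  # teacher A's students beat prediction; B's fall short
```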
The SDP College-Going Diagnostic Pathway
Illustrative Guiding Questions
• 9th-to-10th transition: What share of students are on track to graduate at the end of the first year of high school? Of those who are off track, what share is able to get back on track?
• High school graduation: To what extent do graduation rates vary across high schools when comparing students with similar incoming achievement?
• College enrollment: To what extent do highly college-qualified students fail to matriculate in college?
• College persistence: To what extent does college persistence vary across post-secondary institutions?
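The 9th-to-10th transition question reduces to a simple computation once "on-track" is defined. The threshold below (6 credits, at most one core-course failure) and the data are hypothetical; agencies set their own definitions:

```python
# Computing an "on-track" share for a 9th-grade cohort.
# Threshold values are a hypothetical definition, not a standard.
CREDIT_THRESHOLD = 6.0
MAX_CORE_FAILURES = 1

ninth_graders = [
    # (student_id, credits_earned, core_course_failures)
    (1, 7.0, 0), (2, 6.5, 1), (3, 4.0, 2), (4, 6.0, 0), (5, 3.5, 3),
]

def on_track(credits, failures):
    return credits >= CREDIT_THRESHOLD and failures <= MAX_CORE_FAILURES

flags = [on_track(c, f) for _, c, f in ninth_graders]
share = sum(flags) / len(flags)
print(f"{share:.0%} of the cohort is on track after 9th grade")
```

The hard part in practice is the data work behind this one line of arithmetic: linking course, credit, and enrollment records cleanly across years.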
Illustrative Diagnostic Analysis
Fulton County Schools Impact
Korynn Schooley | Chris Matthews
Summer PACE:
• College-Going Diagnostic revealed 22% of “college-intending” high school graduates were not matriculating to college
• Worked with faculty and staff to design a summer counseling intervention
• Used a randomized controlled trial to rigorously assess the impact of the intervention
Summer PACE Quick Facts
• 7 weeks (June 6 – July 22, 2011)
• 6 schools participated, selected based on 2010 estimated summer melt rates and geographic location: the 3 South county and 3 North county schools with the highest estimated rates
• Randomized controlled trial
• 2 counselors per school, with a caseload of 40 students each
• $115/student
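With random assignment, the headline impact estimate for a program like Summer PACE is the treatment-control difference in matriculation rates, with a standard error for the difference of two proportions. The counts below are invented for illustration and are not the study's results:

```python
# Difference-in-proportions impact estimate for a two-arm RCT.
# All counts are made up for illustration.
import math

treat_n, treat_enrolled = 240, 168      # treatment: 70% matriculate
control_n, control_enrolled = 240, 150  # control: 62.5% matriculate

p_t = treat_enrolled / treat_n
p_c = control_enrolled / control_n
impact = p_t - p_c
# Standard error of the difference of two independent proportions.
se = math.sqrt(p_t * (1 - p_t) / treat_n + p_c * (1 - p_c) / control_n)
z = impact / se
print(f"impact = {impact:.3f} (SE {se:.3f}, z = {z:.2f})")
```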
Thank you
Contact information:
Lindsay Page: email@example.com
Will Marinell: firstname.lastname@example.org
Federal Perspectives on Big Data
Jack Buckley, Commissioner, National Center for Education Statistics
Big Data: New Opportunities for Measurement & Data Analysis – NSF Perspectives Edith Gummer Program Officer Division of Research on Learning Directorate of Education and Human Resources National Science Foundation
NSF Investments: Data in STEM Education
• Mathematics and Physical Sciences
  – Fundamental and statistical research in the field of computational and data-enabled science and engineering
• Social, Behavioral and Economic Sciences
  – Science of Learning Centers – multiple projects
  – Digging into Data Challenge
  – Methodology, Measurement, and Statistics
NSF Investments- Data in STEM Education • Directorate for Computer & Information Science and Engineering (CISE) – Computing Research Infrastructure program – data repositories and visualization capabilities – Supercomputers whose mission also includes reserving capacity for education research users
NSF Investments- Data in STEM Education • CISE Cyberlearning – a crosscutting program that studies learning in technology-enabled environments • Education and Human Resources – Research on Education and Learning (REAL) – Discovery Research K-12 (DRK-12) – Advancing Informal STEM Learning (AISL) – Promoting Research and Innovation in Methodologies in Evaluation (PRIME) • SBE/EHR – Building Community Capacity for Data Intensive Research
Success and Challenge
• Success: an expanding diversity of learning environments in which a variety of theoretical, methodological, and research-to-practice perspectives inform the R&D field
• Challenge: generating insights from data that inform learning, classroom practices, and pathways through education
Future Directions
• Expanded view of what it means to “know and be able to do”
  – Models of achievement
    • Common Core Standards in Mathematics and Next Generation Science Standards – connecting disciplinary knowledge and practice
    • NRC, Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century
  – Models of individual performance in group settings
• Opportunity to learn connected to achievement
  – NRC, Monitoring Progress Toward Successful K-12 STEM Education: A Nation Advancing?
• Developing instructional systems databases that track not only achievement but what a student has experienced
NSF Funding Sources
• EHR Core Research (ECR), NSF 13-555
  – Target date July 12, 2013
  – 4 areas of research: Learning; Learning Environments; Workforce Development; Broadening Participation
• SBE/EHR Building Community Capacity
• EHR Ideas Lab to foster transformative approaches to teaching and learning
Perspectives from the Spencer Foundation Andrea Conklin-Bueschel Senior Program Officer
Ed Dieterle, Ed.D. Senior Program Officer for Research, Measurement, and Evaluation US Program New Opportunities for Measurement & Data Analysis to Personalize Learning For every complex question there is a simple answer – and it’s wrong. - H.L. Mencken
2013-04-28 | AERA 2013 | San Francisco, California © 2013 Bill & Melinda Gates Foundation
Personalized Learning at Scale
A means to achieve our U.S. Education strategy goal: 80% of the class of 2025 graduating high school college ready
• 55M students in the pipeline; 4.2M entering the pipeline
• Goal: Accelerate learning
• Goal: Use 1 million in-school minutes wisely
A confluence of breakthroughs is moving us closer to the personalization of learning for all learners:
• Common Core Standards
• Measures of Effective Teaching
• Science of How People Learn
• Personalized Blended Learning Models
• Digitally Born Learning Innovations
• New Measures of Learning
• Advanced Learning Analytics
• inBloom
Multiple Funders, One Workgroup, Multiple Sectors
• Funders: Bill & Melinda Gates Foundation, MacArthur Foundation
• Sectors: Academy, Industry, Government/Philanthropy, Practice
There are urgent and growing global needs for the development of human capital, research tools and strategies, and professional infrastructure in the field of learning analytics.
Learning Analytics Workgroup
Roy Pea | Stanford University
Group of 30 experts from academy, government, industry, practice, and philanthropy
• Provide a conceptual framework and define critical questions for understanding
• Articulate and prioritize new tools, approaches, policies, markets, and programs of study
• Determine resources needed to address priorities
• Map how to implement the strategy and how to evaluate progress
Measures of Learning
Cognitive, interpersonal, and intrapersonal factors associated with learning.
Without reliable, valid, fair, and efficient measures collected from multiple sources, and analyzed by trained researchers applying methods and techniques appropriately, the entire value of a research study or a program evaluation is questionable, even with otherwise rigorous research designs and large sample sizes.
All tools aren’t born equal: analog, digitally reborn, and digitally born
Differences stem from the activities they support, the outputs they generate, and what one can do with those outputs.
Note: “Digitally Born” vs. “Digitally Reborn” was first articulated by Bernard Frischer, Professor of Art History and Classics at the University of Virginia.
Newton’s Playground
Valerie Shute | Florida State University
Measures three competencies unobtrusively through play in Newton’s Playground:
a) conceptual physics: understanding Newton’s laws of motion
b) persistence: continuing to work hard despite challenging conditions
c) creativity: the ability to create novel solutions to various problems
Shute, V. J., & Ventura, M. (2013). Stealth assessment: Measuring and supporting learning in video games. Cambridge, MA: MIT Press.
Data Analytics Studies of Engagement
Ryan Baker | Columbia University
Application of educational data mining and field observations to develop sensors that detect:
• Engaged/disengaged behaviors: off-task; gaming the system; on-task solitary; on-task conversation
• Relevant affect: engaged concentration; boredom; frustration; confusion; delight
Platforms: ASSISTments (Worcester Polytechnic Institute); EcoMUVE (Harvard University); Newton’s Playground (Florida State University); Reasoning Mind
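Detectors of this kind are classifiers over features computed from windows of log events. A toy rule-based sketch of a "gaming the system" flag, with features and thresholds invented for illustration (Baker's actual detectors are machine-learned from field-observation labels):

```python
# A toy behavior detector in the spirit of log-based engagement sensors:
# flag a window of actions as possible "gaming the system" when the student
# makes several rapid attempts with repeated errors. Thresholds are invented.
def gaming_flag(window):
    """window: list of (action, seconds_since_previous_action, correct)."""
    attempts = [w for w in window if w[0] == "attempt"]
    if len(attempts) < 3:
        return False
    fast = sum(1 for _, dt, _ in attempts if dt < 2.0)   # rapid-fire attempts
    wrong = sum(1 for _, _, ok in attempts if not ok)    # repeated errors
    return fast >= 3 and wrong >= 3

gaming_window = [
    ("attempt", 1.0, False), ("attempt", 0.8, False),
    ("attempt", 1.2, False), ("attempt", 0.9, True),
]
thoughtful_window = [
    ("hint", 5.0, True), ("attempt", 12.0, False), ("attempt", 20.0, True),
]
print(gaming_flag(gaming_window))      # True
print(gaming_flag(thoughtful_window))  # False
```

The field-observation step matters because hand-picked thresholds like these encode the designer's guesses; trained detectors replace them with coefficients validated against human coding.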
Mindfulness and Prosocial Games
Richard Davidson | University of Wisconsin–Madison
• Mindfulness game, Tenacity: by monitoring and controlling breathing, players grow flowers and learn to regulate their attention.
• Prosocial game, Krystals of Kaydor: players assess facial expressions to perceive the emotional state of the inhabitants of an alien planet and engage in prosocial behavior appropriate to the setting where the emotion is encountered.
Bavelier, D., & Davidson, R. J. (2013). Brain training: Games to do you good. Nature, 494(7438), 425–426.
Davidson, R. J., & Begley, S. (2012). The emotional life of your brain: How its unique patterns affect the way you think, feel, and live, and how you can change them. New York, NY: Hudson Street Press.
Mindfulness and Prosocial Games (continued)
Measures
• Mind/brain measures: functional magnetic resonance imaging (fMRI), electroencephalography (EEG), galvanic skin response (GSR)
• Best-in-class self-report measures from psychology
• Logfiles generated from activity with each game
Goals
• Change brain function in specific attention and social-behavior circuits in beneficial ways
• Improve performance on cognitive tasks of attention and working memory, and on measures of the perception of social cues and the propensity to share and behave altruistically
Ed Dieterle, Ed.D. Senior Program Officer for Research, Measurement, and Evaluation US Program New Opportunities for Measurement & Data Analysis to Personalize Learning If you're not failing every now and again, it's a sign you're not doing anything very innovative. - Woody Allen