• 5 questions for author Susan Riedel on teaching electric circuits

    by Susan Riedel, Professor, Marquette University & Yvonne Vannatta, Product Marketing Manager, Pearson


Yvonne Vannatta, Product Marketing Manager at Pearson, recently sat down with Susan Riedel, author and Marquette University professor, to talk about the challenges instructors face when teaching Electric Circuits and the best practices Susan uses to tackle them.

    Yvonne – What is the biggest challenge instructors face when teaching Circuits?

    Susan – Mastering the many different circuit analysis techniques presented in Electric Circuits requires most students to solve a lot of problems.  It is often hard to convince students that they cannot simply read through a worked example problem, or watch an instructor solve a problem at the board – they need to actively solve problems themselves in order to learn the circuit analysis techniques.

    While I typically assign 10 – 12 problems each week for homework, the students would benefit from working at least twice that number of problems every week.  So I have to find ways to get students to solve lots more problems than I assign for homework.

    Yvonne – What strategies do you use to engage students in more problem solving?

    Susan – I use a combination of Learning Catalytics questions, pre-lab questions, and old exams to present and ask students to solve more problems.

    Learning Catalytics

    I use Learning Catalytics to pose questions to my students throughout my lectures.  They get a small amount of extra credit for attempting to answer the questions, even when they answer incorrectly.  I usually start the lecture with a Learning Catalytics question focused on the material we covered in the previous day’s lecture, as a way to review the material and remind them what we are working on.

    Then throughout the lecture I pose Learning Catalytics questions that may ask them to complete a problem I started to solve for them on the board, find a way to verify that the problem’s solution is correct, or discover some interesting property of the circuit we are analyzing.

    The students are solving additional problems, not just watching me solve them, and I am getting real-time feedback that tells me whether or not the topic I’m covering is being understood by the students.

    About once every two weeks, I pick a lecture day and turn it into a group problem-solving challenge, again using Learning Catalytics.  The students work together in small self-selected teams to solve several circuit problems.

    I wander around the classroom, look over their shoulders, answer questions they ask, and encourage them.  Even though I don’t present this as a competition, they like to compete and see how their team stacks up against the other teams in the class.

    They are actively solving problems that are not assigned as homework, and I can observe what material they may be struggling with, so I can adjust my next lecture accordingly.

    Pre-lab questions

The Electric Circuits class I teach has an embedded lab. There are 11 labs during the 16-week semester. Each lab requires students to complete a pre-lab assignment that they turn in to me for grading two days before the lab. I return their graded pre-labs within 24 hours so they can correct any errors they made before building the circuits in the lab.

Every pre-lab has two parts – an analysis of one or more circuits, and a MultiSim simulation of those same circuits to verify the analytical results. So again, they are solving additional circuit problems that are not assigned for homework, then simulating those same circuits, and eventually building the circuits and acquiring and analyzing data.

    Old exams

    Students take an in-class exam every 4 weeks.  I make all of my old exams available to them so they can solve the exam problems as a way to study for the upcoming exam.  I never provide my solutions, to encourage them to solve the problems themselves and not merely study problems and their solutions.

    They can check their solutions during my office hours and during an evening Study Group I hold the night before the exam.  Again, they are willingly solving lots of additional circuit problems that are not formally assigned in order to prepare to take the exam.

    Using the combination of Learning Catalytics, pre-lab assignments, and old exams, I usually get close to my goal of having students solve 20 – 25 circuit problems every week, even though I formally assign about half that number as homework.

    Yvonne – What is the biggest challenge students face when taking Circuits?

Susan – Many students struggle with the initial step in solving a circuit – where do I start?  Consider that a simple circuit with a dc source and a few resistors must be described by six or eight independent equations derived from Ohm’s law and Kirchhoff’s laws.

    This often overwhelms a student seeing circuit analysis for the first time.  Most of my students would be discouraged by the prospect of entering six or eight equations into their calculator correctly to solve for the circuit’s voltages and currents.

    So when students finally discover a tool like the node-voltage method, they realize that six or eight equations are not necessary to describe simple circuits.  But many students still need some guidance to use the general-purpose circuit analysis tools.
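To make that contrast concrete, here is a minimal sketch using a hypothetical circuit of our own (an illustration, not one of the textbook’s problems): a 1 A dc current source driving three resistors. Described element by element, this circuit needs six simultaneous equations; the node-voltage method needs just two, one KCL equation per non-reference node, and a few lines of Python can set them up and solve them.

```python
# Hypothetical circuit (assumed values, not a textbook problem):
# a 1 A dc current source feeds node 1; R1 connects node 1 to ground,
# R2 connects nodes 1 and 2, R3 connects node 2 to ground.
import numpy as np

R1, R2, R3 = 2.0, 3.0, 4.0   # resistances in ohms
Is = 1.0                     # source current in amperes, into node 1

# Node-voltage method: one KCL equation per non-reference node,
# written in conductance form as G @ v = i.
G = np.array([[1/R1 + 1/R2, -1/R2],
              [-1/R2,        1/R2 + 1/R3]])
i = np.array([Is, 0.0])

v1, v2 = np.linalg.solve(G, i)
print(f"v1 = {v1:.3f} V, v2 = {v2:.3f} V")   # v1 ≈ 1.556 V, v2 ≈ 0.889 V
```

Two equations in two unknowns replace the six (three from Ohm’s law, two from KCL, one from KVL) that a brute-force, element-by-element description of the same circuit would require.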

    Yvonne – How do you prepare students to find that starting point?

    Susan – To help students first learning to use the general-purpose circuit analysis tools like the node-voltage and mesh-current methods, I have always constructed a step-by-step procedure for them to follow.

    The step-by-step procedure tells them what kinds of equations to write (KCL or KVL, for example), how many of these equations to write, where to write those equations in the circuit, and how to check their solutions to those equations by balancing the power in the circuit.
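The power-balance check at the end of that procedure is just as mechanical. Continuing the hypothetical three-resistor circuit sketched above, the power delivered by the source must equal the power absorbed by the resistors; a nonzero residual signals an error somewhere in the node equations or their solution.

```python
# Power-balance check for the hypothetical circuit above (assumed values).
R1, R2, R3 = 2.0, 3.0, 4.0
Is = 1.0
v1, v2 = 14/9, 8/9   # node voltages found by the node-voltage sketch above

p_delivered = v1 * Is                                  # source power, watts
p_absorbed = v1**2/R1 + (v1 - v2)**2/R2 + v2**2/R3     # resistor power, watts

assert abs(p_delivered - p_absorbed) < 1e-9, "power does not balance"
print(f"delivered {p_delivered:.4f} W, absorbed {p_absorbed:.4f} W")
```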

    We have now formalized these step-by-step procedures in the 11th Edition of Electric Circuits, where they are called “Analysis Methods.”  The Analysis Methods give students the confidence they need to solve circuit problems because they know how to start the problem and what procedure to follow to reach a solution.

Initially, students rely heavily on the Analysis Methods, but over time they need a step-by-step procedure less often and begin to prefer a more intuitive approach.

    For most students, following an Analysis Method initially allows them to grasp the circuit analysis concepts faster than students who are not given a step-by-step procedure to follow.  Students using Analysis Methods spend less time trying to decide how to solve a problem because they follow a set of steps.  They finish their assignments faster and endure much less frustration along the way.

    Yvonne – What advice would you give to instructors new to teaching Circuits?

    Susan – There are so many resources available to instructors teaching Circuits, and a lot of thought and hard work have gone into the design and implementation of these resources.  Instructors should take advantage of as many resources as time allows.

    Learning Catalytics is a terrific resource for active learning in the classroom, supplying real-time feedback to instructors that enables them to identify material their students are struggling with.

    Mastering Engineering has tutorials that guide students through important material using intelligent feedback to assist their learning, video solutions for many different problems, automated grading for assigned homework, and many other useful features.

    Software simulators allow students to study a circuit with changing component values, plot circuit variables of interest, and use many different types of analysis including dc, transient, and ac steady-state. Many students benefit from the virtual laboratory experience that a simulator provides, even if an actual laboratory experience is not available to them.

    The more resources an instructor can bring to bear on the Circuits material, the more likely it is that the instructor will align with the various learning styles of all students in the classroom, leading to the success of every student.

Hear directly from Professor Riedel on how you can engage more students in team-based problem solving in our webinar: Using Learning Catalytics Inside and Outside the Circuits Classroom.

     

  • Games-based learning from "content" to "creation" (Episode 8)

    by Dr. Kristen DiCerbo, Vice President of Education Research, Pearson


    This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia. Get caught up with episodes 1-7.  

    What initiatives are supporting teachers and students to co-create games together? In this episode of our Future Tech for Education podcast series, hear from educators, gaming companies, and researchers on the evolution of games-based learning from “content” to “creation”.

Subscribe to the Future Tech for Education podcast on iTunes.

     

  • Student, software and teacher in "personalized learning" (Episode 7)

    by Dr. Kristen DiCerbo, Vice President of Education Research, Pearson


    This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia. Get caught up with episodes 1-6.  

    In episode 7 of our Future Tech for Education podcast series, we explore: What is personalized learning? What is it not? Is there an evidence base yet for personalized learning and what does the research evidence show us about the contexts where personalized learning works best? What is the role of student, software and teacher in a personalized learning context? What questions should we be asking?

Subscribe to the Future Tech for Education podcast on iTunes.

     

  • Analysis: Why school districts need a 'Consumer Reports' for ed tech

    by Bart Epstein, CEO, Jefferson Education Accelerator


This is the sixth in a series of essays surrounding the EdTech Efficacy Research Symposium, a gathering of 275 researchers, teachers, entrepreneurs, professors, administrators, and philanthropists to discuss the role efficacy research should play in guiding the development and implementation of education technologies. This series was produced in partnership with Pearson, a co-sponsor of the symposium co-hosted by the University of Virginia’s Curry School of Education, Digital Promise, and the Jefferson Education Accelerator. Click through to read the first, second, third, fourth, and fifth pieces.

    Economists define a collective action problem as one in which a collection of people (or organizations) each have an interest in seeing an action happen, but the cost of any one of them independently taking the action is so high that no action is taken — and the problem persists.

    The world of education swirls with collective action problems. But when it comes to understanding the efficacy of education technology products and services, it’s a problem that costs schools and districts billions of dollars, countless hours, and (sadly) missed opportunities to improve outcomes for students.

    Collectively, our nation’s K-12 schools and institutions of higher education spend more than $13 billion annually on education technology. And yet we have a dearth of data to inform our understanding of which products (or categories of products) are most likely to “work” within a particular school or classroom. As a result, we purchase products that often turn out to be a poor match for the needs of our schools or students. Badly matched and improperly implemented, too many fall short of their promise of enabling better teaching — and learning.

    It’s not that the field is devoid of research. Quantifying the efficacy of ed tech is a favorite topic for a growing cadre of education researchers and academics. Most major publishers and dozens of educational technology companies conduct research in the form of case studies and, in some cases, randomized control trials that showcase the potential outcomes for their products. The What Works Clearinghouse, now entering its 15th year, sets a gold standard for educational research but provides very little context about why the same product “works” in some places but not others. And efficacy is a topic that has now come to the forefront of our policy discourse, as debates at the state and local level center on the proper interpretation of ESSA’s mercurial “evidence” requirements. Set too high a bar, and we’ll artificially contract a market laden with potential. Miss the mark, and we’ll continue to let weak outcomes serve as evidence.

    The problem is that most research only addresses a tiny part of the ed tech efficacy equation. Variability among and between school cultures, priorities, preferences, professional development, and technical factors tend to affect the outcomes associated with education technology. A district leader once put it to me this way: “a bad intervention implemented well can produce far better outcomes than a good intervention implemented poorly.”

    After all, a reading intervention might work well in a lab or school — but if teachers in your school aren’t involved in the decision-making or procurement process, they may very well reject the strategy (sometimes with good reason). The Rubik’s Cube of master scheduling can also create variability in efficacy outcomes: Do your teachers have time to devote to high-quality implementation and troubleshooting, and then to make good use of the data for instructional purposes? At its best, ed tech is about more than tech-driven instruction. It’s about the shift toward the use of more real-time data to inform instructional strategy. In some ways, matching an ed tech product with the unique environment and needs of a school or district is a lot like matching a diet to a person’s habits, lifestyle, and preferences: Implementation rules. Matching matters. We know what “works.” But we know far less about what works where, when, and why.

    Thoughtful efforts are underway to help school and district leaders understand the variables likely to shape the impact of their ed tech investments and strategies. Organizations like LEAP Innovations are doing pioneering work to better understand and document the implementation environment, creating a platform for sharing experiences, matching schools with products, and establishing a common framework to inform practice — with or without technology. Not only are they on the front lines of addressing the ed tech implementation problem, but they are also on the leading edge of a new discipline of “implementation research.”

Implementation research is rooted in the capture of detailed descriptions of the myriad variables that undergird your school’s success — or failure — with a particular product or approach. It’s about understanding school cultures and user personas. It’s about respecting and valuing the insights and perspectives of educators. And it’s about presenting insights in ways that enable your peers to know whether they should expect similar results in their school.

    Building a body of implementation research will involve hard work on an important problem. And it’s work that no one institution — or even a small group of institutions — can do alone. The good news is that solving this rather serious problem doesn’t require a grand political compromise or major new legislation. We can address it by engaging in collective action to formalize, standardize, and share information that hundreds of thousands of educators are already collecting in informal and non-standard ways.

    The first step in understanding and documenting a multiplicity of variables across a range of implementation environments is creating a common language to describe our schools and classrooms in terms that are relevant to the implementation of education technology. We’ll need to identify the factors that may explain why the same ed tech product can thrive in your school but flop in my school. That doesn’t mean that every educator in the country needs to document their ed tech implementations and impact. It doesn’t require the development of a scary database of student or educator data. We can start small, honing our list of variables and learning, over time, what sorts of factors enable or impede expected outcomes.

    The next step is translating those variables into metadata, and creating a common, interoperable language for incorporating the insights and experiences of individuals and organizations already doing similar work. We know that there is demand for information and insights rooted in the implementation experiences and lessons of peers. If we build an accessible and consistently organized system for understanding, collecting, and sharing information, we can chip away at the collective action problem by making it easier and less expensive to capture — and share — perspectives from across the field.
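As a purely illustrative sketch (every field name below is invented for illustration, not drawn from the essay or from any existing standard), one such interoperable implementation record might look like this:

```python
# Hypothetical schema for one shareable ed tech implementation record.
# All field names are invented; the essay argues only for *some* common,
# interoperable vocabulary of this general shape.
from dataclasses import dataclass, asdict
import json

@dataclass
class ImplementationRecord:
    product: str              # which ed tech product was implemented
    grade_band: str           # e.g., "K-5", "6-8", "undergraduate"
    teacher_pd_hours: float   # professional development invested
    teacher_involvement: str  # teachers' role in the procurement decision
    schedule_fit: str         # time the master schedule allowed for it
    outcome_summary: str      # what happened, in the implementer's words

record = ImplementationRecord(
    product="Example Reading Intervention",
    grade_band="K-5",
    teacher_pd_hours=6.0,
    teacher_involvement="consulted before purchase",
    schedule_fit="30 min/day within the existing literacy block",
    outcome_summary="strong adoption; gains concentrated in grade 2",
)

# Serialized consistently, records from many schools can be pooled and compared.
print(json.dumps(asdict(record), indent=2))
```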

    The final step is addressing accessibility to shared insights, facilitating a community of connected decision makers who work together both to call upon the system for information and to continue to make contributions to it. Think of it as a Consumer Reports for ed tech. We’ll use the data we’ve collected to hone a shared understanding of the implementation factors that matter — but we’ll also continue to rely upon lived experiences of users to inform and grow the data set. Over time, we can achieve a shared way of thinking about a complex problem that has the potential to bring decision-making out of the dark and into a well-informed, community-supported environment.

    My work with colleagues at the first-ever EdTech Efficacy Research Symposium found that a growing number of providers, organizations, and associations are already working with educators to crowdsource efficacy data. And educators across the country are already doing this work in informal but valuable ways. Bringing these efforts together and creating a more standard approach to their collection and dissemination is a critical step toward improving decision-making. My observation from both research and discussion with the field is that the effort is not only deeply needed — it also already enjoys great support. If we take collective action, we can develop a democratic approach to improving the fit between ed tech tools and the educators who use them.

    This series is produced in partnership with Pearson. The 74 originally published this article on January 2nd, 2018 and it was re-posted here with permission.

     

  • Imagine (a world of assessment without tests) (Episode 6)

    by Dr. Kristen DiCerbo, Vice President of Education Research, Pearson


This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia. Get caught up with episodes 1-5.

    How do we get beyond the tick-box or bubble filling exercise of exams and tests, whilst also measuring ‘progress’? In episode 6, we review ideas around ‘invisible assessment’ and question who benefits from ‘traditional’ and re-imagined forms of assessment, including games-based assessment. Can ‘tests’ be fun and should they be? How do we measure collaboration?

Subscribe to the Future Tech for Education podcast on iTunes.

     

  • What can VR, AR & Simulation offer teaching & learning? Plus, strategies to avoid the technopanic (Episode 5)

    by Denis Hurley, Director of Future Technologies, Pearson


This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia. Get caught up here with episode 1, episode 2, episode 3, and episode 4.

In the latest episode of our Future Tech for Education podcast series, we dip into the world of VR and mixed reality to uncover what high-cost, high-risk learning opportunities are being made more accessible to all by this technology.

    Plus, we wrap our co-curated mini series with practical suggestions for educators: be mindfully skeptical, resist fear, understand that you can start small and grow, and avoid technology for technology’s sake. This last one is harder than it sounds. Many new technologies wow us but do not have useful application to education. Learn how to make the most of technology.

Subscribe to the Future Tech for Education podcast on iTunes.

     

  • Professors: 3 things you might be spending more time on than you need to

    by Pearson


    Being a full-time educator takes commitment, organization and time — lots and lots of time. It’s rare to find an educator at any level who finishes his or her day once class is dismissed. With limited time to focus on the many aspects of quality course instruction, educators need the best tools to maximize their time.

Ideally, these tools should automate the most common tasks to which educators devote the majority of their time. Find out how leveraging the right digital learning platform can help with creating personalized lesson plans, engaging students, and monitoring student progress.

Creating personalized lesson plans

Developing a lesson plan is one of the most important tasks for educators. Lesson plans set the tone for the entire course from the outset. Creating a lesson plan personalized for each course and each group of students is immensely time-consuming. Educators are expected to create new and engaging plans for each day, often with very little feedback with which to work.

    Engaging with students

Keeping students engaged – in class and out of class – is vital for receiving feedback on teaching materials and assessing the concepts students retain and those they struggle to understand. Traditional methods of engagement, such as fostering group discussions and question-and-answer periods, are particularly difficult in larger classrooms. Students get distracted more easily and educators struggle to create a rapport with each individual.

With digital learning, educators can now utilize the devices students already bring into the classroom (think smartphones, tablets, and laptops) to engage them in more sophisticated tasks that help develop critical thinking skills. MyLab creates a platform where students submit answers on a web-enabled device and receive immediate feedback from their instructors.

Revel assignments completed prior to class allow instructors to use classroom time more efficiently for group work and discussion. Increased dialogue and feedback between students and educators can make even large classes seem more personal.

    Monitoring student progress

    Keeping track of student progress allows an educator to know whether students are learning on pace with the lesson plan and completing all assignments. Traditional methods used to monitor progress – homework assignments, quizzes and exams – take time to develop on the front end and time to review on the back end.

    In larger classes especially, it may take several days or even weeks before students receive grades from previous assignments and exams. Delayed feedback is outdated and can be difficult for students to apply to future work.

    Monitoring student achievement is easier than ever before with Revel, a platform that saves hours of time by tracking assignment completion and automating analytics. A trending column, for example, demonstrates whether students’ grades are improving or declining, making it easy to identify students who need extra attention.

    Additionally, students have the opportunity to increase their own accountability by viewing real-time progress reports. With faster feedback, students can keep up with the pace of the course and address areas of difficulty as soon as they arise.


     
  • The best way to increase student engagement in your classroom

    by Pearson


We’ve all had it happen. You spend countless hours preparing for a lecture only to watch students lose focus and disengage from class. From cellphones to that one student who manages to derail class (likely for a full 20 minutes after alerting the class to the first snowfall out the window), it’s almost impossible to teach a class without some type of distraction.

As instructors, we’re tasked with a lot. Maximizing comprehension, improving information retention, and raising test scores are just a few of the challenges we face, in addition to maintaining student attention.

    If you’re ready to take back your class time and refocus attention on course material, you’ve come to the right place. Keep reading to find out how you can leverage digital learning in your classroom to fight these distractions and foster student engagement.

Teaching your classroom with a one-size-fits-all mindset

    In any classroom, there are students who learn at a different pace than the planned syllabus. Some students grasp concepts quickly, and may become bored by too much classroom time spent on a topic, while others struggle to keep up.

There are countless reasons why a student may fall behind – whether it’s an overloaded schedule or something happening in their personal life. Regardless of the reason, a student who’s struggling to keep up is increasingly likely to disengage from class and runs the risk of falling even further behind.

    When students can master basic subject level concepts away from the classroom, professors are able to refocus class time on engaging students by expanding on core concepts.

    Drowning in a sea of outdated class resources

    Let’s face it. No student wants an instructor who bogs them down with dozens of different paper handouts and online portals that may or may not have been constructed during the dawn of the internet.

    For many students, keeping track of materials for all their classes, including textbooks and paper handouts, can be a struggle. And a student who forgets one of the 80 “essential” materials for class that day may be unable to participate.

    Traditional materials like textbooks are a stark contrast to other media that students today are more familiar with. Today’s students are used to the internet, where simple keyword searches produce immediate results and relevant information on any internet-connected device.

Consolidating all classroom materials in an online learning management system simplifies organization by placing all class and student materials in one place. With the necessary materials easily accessible, students are free to focus on learning and staying engaged in the classroom (unless someone breaks out a fidget spinner, at which point we can’t help you).

    Lecture format classes

    Keeping students engaged can be particularly difficult in a large lecture setting. With dozens, or even hundreds of students in just a single class, it’s no surprise to find professors standing at the front of the room talking for the entire period and hoping that some small fraction of their wisdom is being absorbed.

Obstacles like poor acoustics for students in the back, or students who take advantage of the class setup to escape onto social media, are just a few of the challenges instructors face.

    If this scenario sounds familiar to you, trust us when we say you’re not alone. One of the best ways to foster greater engagement in a lecture-style class is through interactive question-and-answer sessions and peer discussions supplemented by an online learning platform.

    With a solution like this, professors can break a large class into groups quickly and easily, while receiving instant feedback to tailor lessons to student preferences.

    Avoiding new technology

With the prevalence of social media and smartphones, it’s no surprise that today’s students expect to be constantly connected. Because they interact with the world through their smartphones and tablets, it’s quite common for a disconnect to occur when professors use outdated technology.

    With news apps and social networking platforms enabling information to spread like wildfire, today’s students are used to information in real time. When the internet provides them the information that they need instantly, it’s common for them to lose patience with textbooks written years before their time.

    Instead, professors can leverage the devices with which students are already familiar and which they bring to class, to provide a more interactive learning environment. An online learning platform makes it easy for professors to pose questions and receive immediate feedback from each student in the classroom (rather than one or two), and adjust their instructional strategies in real time.

     

  • Educators: Are you leveraging digital learning in your classroom?

    by Pearson


    Students today use technology more than ever — whether for research, studying or chatting every second of the day with friends. It’s no surprise that leveraging the ubiquity of digital communication can help produce countless benefits in the classroom for students and educators alike.

    Online assessments have the power to give students rapid feedback, while digital tools allow instructors to provide multimedia learning experiences. Video explanations, games, online note-taking and other features all work to help keep students engaged as they read and study. With the power of digital, educators can analyze test scores and tailor instruction to suit students’ strengths and weaknesses.

    Expand learning opportunities

When teaching a subject like geology or art, it’s hard to fully convey the power of a volcano or the expansiveness of a work of art with photos alone. By incorporating videos and other digital assets, course instructors can fully engage students. With digital examples in geology, for instance, instructors won’t just tell students how landslides happen; they can show them.

    Video demonstrations allow students to take virtual field trips whenever they want, at their own pace and on their preferred devices. This video tour of the Pantheon leaves a much more lasting impression than any descriptive words ever could. Tour options take them to places they could never explore in person — at least not as part of a classroom.

In addition to learning through experiences, students also need concrete skills for success. Critical thinking is an important skill that applies to almost any field, and writing can be one of the best ways to master it.

  • Language learning as the test-bunny for educational future tech (Episode 4)

    by Denis Hurley, Director of Future Technologies, Pearson


This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia. Watch episode 1, episode 2, and episode 3.

Technological change is exponential, which means it will impact our lives more and more quickly. Among the aspects of our lives undergoing change, language use is one of those being altered most drastically. New technologies also create new opportunities for learning. How must we adjust, and what can we take advantage of?

Subscribe to the Future Tech for Education podcast on iTunes.

     

  • How higher education is innovating instruction (and why it needs to continue to do so)

by Dr. VonBank


Digital learning and technology have a short and turbulent history of creating cultural, social, generational, and socio-economic divides. The swiftness of change in society due to technological advances has disrupted just about everything we do, but in education, the disruption is perhaps the most important to consider.

    There is a discontinuity in how education is evolving compared to the realities of career and society. Higher education attempts to be responsive to these changes, but the course corrections are often slow and/or don’t align well with the actual trajectory of the modern world. The solution is not clear-cut, but there are many ways higher education is trying to keep pace.

    Here are 5 trends that are helping higher education to align better with the actual needs of students:

1. Online and hybrid classes have become a very popular part of the landscape at many institutions of higher education. They combine flexibility with an infusion of technology: video-conferencing software, cloud-based office suites such as Google’s G Suite or Microsoft 365, and learning management systems such as Blackboard or Desire2Learn. While the technology serves the purpose of adding flexibility and leveraging resources, the experiences students gain from working and learning in this environment align closely with the modern workplace.
    2. Digital Delivery of learning materials is the obvious evolution for higher education, and one that has been painfully slow. While the ability to deliver what we used to think of as a “textbook” as a digital resource has long been possible, many programs still rely heavily on student and faculty use of printed media. It doesn’t have to be this way, and some schools are beginning to take a hard look at the way materials are used in courses. In many cases, the switch can be easy. For instance, Pearson Education is one of the leaders in providing access to digitally delivered learning materials. The digital catalogs available for students and faculty are massive and growing every day. At this point, any move toward digital delivery is a positive one. This transition would modernize the higher ed experience and probably save students some money.
    3. Internships and outside experiential learning built into degree programs have continued to be a popular route due to the development of personal and social skills, but internships have a secondary yet powerful consequence: they also help instructors and program chairpeople stay current. There is a lot to be said for programs where internships, programming, and instruction are woven together in ways that a more traditional, sanitized, classroom experience cannot replicate.
    4. Student voice and choice is changing the landscape of post-secondary education. There is a great power in programs willing to allow for a variety of student voice and choice in the learning experience, not just for the capstone, but throughout the learning journey of the students. This seems to be far more accepted in vocational and advanced degree programs, and I’d like to see it sweep through the undergraduate experience as well.
    5. Embracing the learner, not the system, is really the key to the survival of many post-secondary programs. While the integration of learning technology, internships, diverse media delivery and student voice make for an increasingly intimate and individualized experience, it can’t survive in a vacuum. The evolution to embrace learner needs, especially when those needs run afoul of traditional practice, needs to be valued. Whether differentiated by time, place, pace, or method of delivery, individualized instruction can happen now in ways that would have been impossible or impractical even ten years ago. Not only can professors use their LMS platforms to deliver multimedia-rich learning options, but there are many options for curricula and review material already assembled and ready to use, such as Pearson’s Revel and MyLab/Mastering products.

    Disruption is the constant today, and post-secondary programs will need to continue to find ways to attend to the gap between what they deliver and what students actually need. They need to be nimble and responsive to the world they are preparing students for.

While the familiar may hold a certain nostalgia for some professors and instructors, these disruptions represent the best potential for the future growth of programs, institutions, and individuals. Unlike any other time in history, higher education faces a shift from the tried and true to constant reinvention to meet the fluid demands of both the working world and an ever-changing student body.

    This article was originally published on Dr. VonBank’s LinkedIn Pulse page and has been reposted here with permission.

  • Developing responsible and calm digital citizenship (Episode 3)

    by Denis Hurley, Director of Future Technologies, Pearson


    This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia.

    Technology is a part of almost every aspect of our lives: buildings can be 3D printed, cars can drive themselves, and algorithms can direct our education.

In the third episode of this series (catch episode 1 and episode 2), we explore how we react to, interact with, and create with the tools of technology. It’s essential that we understand how these tools function and what their implications are.

    We also look into the changing world of work and how we can best prepare.

    View on YouTube

    For more information, check out the Pearson Future Skills report.

Subscribe to the Future Tech for Education podcast on iTunes.

     

  • What is AI & what has it got to do with me and my students? (Episode 2)

    by Denis Hurley, Director of Future Technologies, Pearson


This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia. Subscribe to the Future Tech for Education podcast on iTunes here.

    Smarter digital tools, such as artificial intelligence (AI), offer up the promise of learning that is more personalized, inclusive and flexible. Many see the benefits of AI, some are skeptical – but it’s crucial we understand what these tools can do and how they work.

In the first episode of this series, we talked about how to navigate the challenges and opportunities tech brings to the future of education. In episode two, we explore: What is AI and what is it not? What’s the difference between narrow AI, general AI, and super-intelligence? What type of AI is used now in education? What type do people fear? What questions might teachers want to ask when thinking about AI in education?

    View on YouTube

    For more information, check out the report, Intelligence Unleashed: An argument for AI in Education.

     

  • What does future tech for education look like? (Episode 1)

    by Denis Hurley, Director of Future Technologies, Pearson


This series, produced with The Edtech Podcast, explores the implications of and questions around future tech for education. Listen for insights from experts — including contrarians — from across industry, research, and academia. Subscribe to the Future Tech for Education podcast on iTunes.

In our first episode of the Future Tech for Education podcast series, we put “future-forecasting” in perspective through a few useful but simple models. We talk about the history of the future and mindful skepticism, and we delve into the four foci of future edtech — mixed reality, data science (AI), biosyncing, and human-machine relations — and their effect on education, teaching, and learning.

    View on YouTube 

Employ mindful skepticism. This means not accepting a new technology as inherently good or evil, but trying to understand the possibilities: What can it be used for? How can I make the most of this technology?

  • Generation Z: Get to know your new students

    by Pearson


Gen Zers are the latest generation to embark on their journey in higher education. They are present on your campus and in your classes, with many more enrolling every year. How well do you know them? Do you have the tools to shape these newcomers into successful and productive adults after just a few short years of schooling?

Born between 1997 and 2015, Generation Z accounts for 26% of the total United States population, according to a Nielsen report. They’re currently the largest living generation and have the potential to reshape how we use technology and view the workplace, so getting to know them is well worth your time.

Understanding what drives this generation can help you better tailor your coursework around tangible and transferable skills, so students can better understand how it relates to their future. Barnes & Noble College conducted a survey of 1,300 Gen Zers, and more than 89% of respondents acknowledged that a college education is valuable.

    For them, college is seen as the pathway to a good job. The study also states that Gen Z’s top criterion in selecting a college is how it will prepare them for their chosen careers, followed by interesting coursework and professors who care about student success.

    Learning how to engage with this generation is just as important as learning what tools to use to engage them. Their comfort and trust in the online space will greatly determine how they interact with their educators. In fact, Gen Zers often prefer video content—with 85% of surveyed students reporting that they watched an online video to learn a new skill in the past week, according to The Center for Generational Kinetics.

    And they have high hopes for their post-collegiate future, too. In fact, 88% of surveyed Gen Zers reported that they were optimistic about their own personal future—more than any other generation, according to a report by Vision Critical.

    But that optimism is balanced by realistic expectations about their careers. When asked what matters most in their ideal jobs, in the same survey, they favored salary more and work-life balance less than their millennial counterparts.

    Here’s just some of what you can expect to learn more about:

    • Up-to-the-minute analysis of what’s happening in higher education
    • Illuminating insights from multigenerational surveys about Gen Z behaviors and attitudes about education
    • Eye-opening interviews and surveys about the individual experiences of hundreds of Gen Z students from Jean Twenge, author of iGen: Why Today’s Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy—and Completely Unprepared for Adulthood

    In the meantime, dive deeper into the Gen-Z psyche, and read about their learning habits in the infographic, “Engage from A to Gen Z.” Learn more about this generation’s make-up, goals, and what makes them tick.

     

  • 5 chats you don't want to miss from Educause

    by Caroline Leary, Manager, Pearson


    This year at Educause, Erick Jenkins, East Carolina University student and Pearson Campus Ambassador, and Jenn Rosenthal, community manager at Pearson, went behind the scenes to learn about what was top of mind for contributors to the best thinking in higher education IT.

    Erick and Jenn spoke with digital learning advocates about the latest and greatest in digital learning and what exactly that means for students, educators, and institutions.

    Together, they demystified Inclusive Access, discussed the importance of 21st century skills, engaged with cognitive tutor extraordinaire – IBM Watson, and dove into the world of AR and mixed reality.

    Catch their interviews below and let us know what roles you see technology playing in the future (near or far) of education in the comments section.


Erick and Jenn talk with Jeff Ehrlich, Director of Special Projects at Park University, about what exactly Inclusive Access is (hint: it’s more than eText) and the benefits it brings to students, educators, and institutions.

    What is Direct Digital Access?

    We are sitting down to chat with Jeff Ehrlich, Park University Director of Special Projects, about Direct Digital Access. #edu17


     

    Jenn chats with Leah Jewell, Pearson’s Head of Career Development and Employability, about the Career Success Program and the importance of developing strong personal and social capabilities.

    Preparing Now: Career Success

    Chatting with Leah Jewell, Pearson's Head of Employability, about the Career Success Program.


     

    Erick gets a taste of how artificial intelligence can help students power through to success. Pearson’s Kaitlyn Banaszynski and Amy Wetzel introduce Erick to Watson – the cognitive tutor.

    Student Perspective: Watson

    East Carolina University student & Pearson intern, Erick Jenkins, is chatting with Pearson colleagues & IBM Watson experts, Kaitlyn & Amy.


     

    Jenn and Erick examine virtual patient Dave through HoloPatient using Microsoft HoloLens and chat with Mark Christian, Pearson’s Global Director of Immersive Learning about how Pearson is using AR/VR to enhance learning.

    Hololens & Immersive Learning Innovations

    We are so excited to try out the HoloLens – an example of Pearson immersive technology – and chat with Pearson's Global Director of Immersive Learning, Mark Christian.


     

    Erick sits down with Jenn and talks about how technology has played a role in his college experience.

    Student Perspective: Educational Technology

    We are live at EDUCAUSE 2017 with Pearson intern and East Carolina University student, Erick, talking about how technology has played a role in his college experience! #EDU17


     

  • How to engage tech-savvy students

    by Pearson


    From textbooks to laptops and white boards to smartboards, digital technologies continue to propel higher education forward. Instant access to information and various types of media and course materials create a more dynamic and collaborative learning experience.

Today’s tech-savvy learners are accustomed to instructors utilizing technology to bolster curriculum and coursework. In fact, a majority of surveyed students (84%) understand that digital materials help solve issues facing higher education, according to “Digital appetite vs. what’s on the table,” a recent report that surveyed student attitudes on digital course materials. And many (57%) also expect the onus to fall on the institution to shift from print to digital learning tools.

    Many higher education institutions are looking for new ways to integrate technology into their coursework. Recently, Maryville University, a private institution in St. Louis, MO, developed a digital learning program that provided iPads to their students—with great results.

    94% of faculty have integrated iPads into their courses, and 87% of students agree that technology has been instrumental in their success at the school. What’s more, enrollment increased by 17.7% over two years, in part due to the Digital Learning Program, reports Inside Higher Ed.

    Learn more about how digital learning can strengthen higher education institutions with this infographic, “Digital Learning: Your best teacher’s assistant.”

  • Teaching collaboration skills from cradle to career

by Emily Lai, Ph.D., Kristen DiCerbo, Ph.D., and Peter Foltz, Ph.D.


    We’ve heard from Emily Lai, Ph.D., twice before. Last year, she shared the story of her work in Jordan to improve learning opportunities for the children of Syrian refugees. More recently, she offered her tips for parents and teachers on helping students improve their information literacy.

    The Components of Collaboration

    “Most of us know what collaboration is, at least in its most basic sense,” says Emily Lai, Ph.D.

    “It means working with others to achieve a common goal.”

    Emily is Director of Formative Assessment and Feedback for Pearson. Her work is focused on improving the ways we assess learners’ knowledge and skills, and ensuring results support further learning and development.

    “We’ve been reviewing the research, trying to figure out what we know about collaboration and how to support it. For example, we know that collaboration skills have an impact on how successful somebody is in all kinds of group situations—at school, on the job, and even working with others within a community to address social issues.”

    Teaching Collaboration in the Classroom

    Teaching collaboration skills in the classroom can be harder than expected, Emily says.

    “When a teacher assigns a group project, oftentimes students will divide up the task into smaller pieces, work independently, and then just shove their parts together at the very end.”

    “In that case, the teacher likely had good intentions to help develop collaboration skills in students. But it didn’t happen.”

    Checking all the Boxes

    “Tasks that are truly supportive of collaboration are not easy to create,” Emily says.

    Digging deeper, Emily says there are three sub-components of successful collaboration:

    Interpersonal communication – how you communicate verbally and non-verbally with your teammates.

    Conflict resolution – your ability to acknowledge and resolve disagreements in a manner consistent with the best interest of the team.

    Task management – your ability to set goals, organize tasks, track team progress against goals, and adjust the process along the way as needed.

    Emily says she understands how difficult it can be for educators to check all three boxes.

    Before beginning an assignment, Emily suggests teachers talk to students explicitly about collaboration: what makes a good team member versus what makes a difficult one, as well as strategies for working with others, sharing the load responsibly, and overcoming disagreements.

    During group work, she says, observe students’ verbal and non-verbal behavior carefully and provide real-time feedback.

    “Talk with them about how they’re making decisions as a group, sharing responsibility, and dealing with obstacles,” Emily says.

    “In the classroom, it’s all about the combination of teaching collaboration skills explicitly, giving students opportunities to practice those skills, and providing feedback along the way so those skills continue to develop.”

    “The research shows that students who develop strong collaboration skills get more out of those cooperative learning situations at school.”

    Teaching Collaboration at Home

    Emily is a mother of two daughters, 4 and 8.

    At home, she says, there’s one part of collaboration that is especially valuable: conflict resolution.

    “Most often, it comes in handy on movie nights.”

    “The 8-year-old tends to gravitate towards movies that are a little too scary for the 4-year-old, and the 4-year-old tends to gravitate towards movies that are a little too babyish for the 8-year-old.”

    “It would be easy to intervene and just pick a movie for them, but my husband and I do our best to stay out of it,” Emily says.

    “We’ve established the procedure that they have to negotiate with each other and agree on a movie, and now they have a collaborative routine in place.”

    “They know they get to watch a movie, and we know they’re learning along the way.”

    “Taking turns in conversation is another big one for the four-year-old,” Emily says.

    “She doesn’t like to yield the floor, but it’s something we’re working on.”

    “I know from the research that if my daughters learn these collaboration skills, they are more likely to be successful in their future careers.”

    Sharing the Latest Research

    This week, Emily and two of her colleagues are releasing a research paper entitled “Skills for Today: What We Know about Teaching and Assessing Collaboration.”

    The paper will be jointly released by Pearson and The Partnership for 21st Century Learning (P21), a Washington, DC-based coalition that includes leaders from the business, education, and government sectors.

    “We teamed up on this paper because we both believe collaboration is too important for college, career, and life to leave to chance,” Emily says.

    It is the first in a four-part series on what is known about teaching and assessing “the Four Cs”: collaboration, critical thinking, creativity, and communication.

    “P21 is the perfect partner for this effort,” Emily says.

    “Our partnership signifies a joint commitment to helping stakeholders—educators, parents, policy-makers, and employers—understand what skills are needed to be successful today, and how to teach them effectively at any age.”


    To download the full version of “Skills for Today: What We Know about Teaching and Assessing Collaboration,” click here.

    Three executive summaries of the paper are also available:

    Pearson LearnEd originally published this article on April 24th, 2017, and it was re-posted here with permission.

     
  • 90%+ first-call resolution, and powerful support for GGU's teaching mission

    by Golden Gate University-San Francisco, CA


    SUCCESS STORY

    World-class support for 5,000+ busy adult learners

    To make higher education work for its students, many of whom are working professionals, Golden Gate University (GGU) offers flexible programs both online and at four campuses. Even its in-person courses are extensively enhanced with robust web components, and some have evolved towards flipped learning models.

    Both GGU’s students and its instructors are deeply reliant on the university’s online LMS and other systems. However, they have diverse expertise, and equally diverse hardware, ranging from old laptops to the newest smartphones.

    Students with full-time jobs often set aside nights and weekends for schoolwork. Most GGU faculty work professionally in the fields where they teach, bringing a wealth of experience and enthusiasm. Both students and teachers often need help desk support, especially as GGU has integrated more robust web functionality into courses—and neither group has time to wait for answers.

    As Doug Geier, GGU’s Director of eLearning and Instructional Design, puts it, “We provide really good support for our instructors and students, but we rely on the help desk to fill a critical need.”

    GGU’s small internal help desk responds during weekday business hours, focusing not only on technical help, but also calls requiring involvement from administrative offices. To fill the gaps, GGU chose Pearson, which seamlessly extends GGU’s own help desk, presenting its services as part of GGU. Through this close partnership, the help desk delivers 24x7x365 support for virtually any technical problem, regardless of location or device.

GGU chooses to pay on a per-inquiry basis, smoothly ramping up whenever it needs more help—for example, at the start of each trimester, when new students must quickly solve login or compatibility issues.

    Pearson’s reporting helps both partners identify emerging trends in support calls and escalations, flag individuals who need more training, find opportunities to improve, uncover student or faculty retention issues, and improve course quality to support GGU’s teaching mission.

    GGU’s Pearson help desk consistently exceeds 90% first-call resolution, so students and faculty can quickly move forward with their work. GGU’s Geier notes that some calls the help desk can’t resolve are due to issues it can’t control. “When that happens, Pearson can take the calls, offer some assurance as to when it’ll be fixed, and make sure our students and faculty don’t feel like they’re all alone. And sometimes Pearson’s help desk is first to know of a problem, and [they] tell us so we can follow up more rapidly.”

    Working together for more than six years, Pearson and GGU have built a trusted collaborative partnership with multiple benefits. “We reached out to Pearson as we integrated Turnitin to improve student writing and prevent plagiarism, and when we recently deployed a new video platform,” says Geier. “Pearson’s wide higher education support capabilities are becoming ever more critical as we continually expand the utility of our LMS and online course environment.”

    “Pearson’s help desk is incredibly responsive,” Geier concludes. “Their service is top-notch, it’s customizable, and it’s helped us come a long way in how we work with students and faculty. Pearson does more than just provide services: this is a true partnership.”


    To learn more about Golden Gate University’s help desk services, read the full success story.


  • Tapping into G-R-I-T to enhance students' 'burn to learn'

    by Paul G. Stoltz, Ph.D., Author

    blog image alt text

    Helping students effectively harness their GRIT comes down to the difference between telling them about it and equipping them with the tools to acquire and grow it. I recently experienced the stark contrast between mere advising and actual “equipping” when I failed my own godson at a critical time.

    How? Well, instead of helping him tap into his GRIT in substantive and productive ways, I fell into the “sympathetic (if meaningless) advice trap.” Let my failure illuminate our path.

    As a first-term, out-of-state freshman at a challenging four-year university with a rigorous major, my godson has plenty on his plate and no shortage of distractions. But when the deadliest fires in California’s history surrounded his hometown of Napa, being away from home took on new meaning for him.

    Even though his family and pets were safe and their most precious possessions secured, summoning the drive and the discipline to slog through calculus homework seemed overwhelming and unimportant to him. He simply stopped doing it, and even when he tried to apply himself to it, his commitment soon waned. This was understandable given the circumstances, but not ideal.

    So, what did I do? I checked in with him, offered some moldy clichés and bland old platitudes like, “Thank goodness they’re safe”; “Don’t hesitate to call me anytime”; and “It’s always good to remember: It could be so much worse.” Nice? Yes. Heartfelt? Definitely. But I could have done so much better by him. I missed my moment.

    What I didn’t do was serve up the harder truth. I didn’t take this critical opportunity to help him realize that “stuff happens,” adversity strikes, and moments like these—when it feels like life is grabbing you and strenuously pulling you away from your educational goals—are both the key tests of your GRIT and the opportunities to significantly grow and apply it to things that matter.

    Every student experiences some combination of rigorous academics, relational breakups, family issues, health concerns, roommate dramas, bureaucratic headaches, personal injustices, scheduling conflicts, emotional hardships, financial stress, external pressures, and existential angst while pursuing a college degree. This is a long list, but a worthy path is strewn with struggles!

    My godson didn’t need my warm but vague advice as much as he needed the essential, practical tools to truly own—to dig deeper and better in order to unwaveringly pursue—his learning and his goals in the midst of his struggle. How could I have helped? I should have pointed him to the GRIT questions.

    Each and every component of GRIT—Growth, Resilience, Instinct and Tenacity—is critical, and individuals must fully engage with them to truly own and achieve worthwhile educational goals.

    Consider these four facets of GRIT and the questions I, a teacher, a counselor, or anyone can ask about each one to help students own their learning, their goals, and their lives in good times and bad.

    G–Growth

    The propensity to seek out fresh ideas, perspectives, input, and advice to accelerate and enhance one’s progress toward one’s long-term, difficult goals.

    Growth is about going after one’s goals and finding out what one needs to know in order to get there better and faster. It shifts a student from being a victim or a passenger to being the driver at the helm of his journey. This dimension of GRIT accelerates growth, learning, and momentum, while reducing the kind of frustration and exasperation that lead many to fall short or quit.

    • What new resources might you tap into to get some clarity and support around your goal?
    • Who could you talk to, both inside and outside of school, who could offer you the best, freshest wisdom on this issue or concern?
    • As you keep working toward your goal, do you notice the effort making you stronger and helping you imagine new strategies for getting where you want to go?

    R–Resilience

    One’s capacity to not just overcome or cope with, but to make constructive use of adversity.

    One of the big wake-up calls in education is this: adversity is on the rise everywhere, and resilience truly matters. Support and resources are external. Resilience is internal. Resilience is not about bouncing back. That’s not good enough.

    It’s about harnessing adversity, using it as fuel to end up better off because of the increased strength and knowledge that comes from working through and overcoming a difficult obstacle. There is no better place for a student to learn and master this distinction than in higher education.

    • While you perhaps can’t control this situation, what facets of it can you at least potentially influence? Of those, which one(s) matter most to you?
    • How can you step up to make the most immediate, positive difference in this situation?
    • How can you use your experience of struggling against this adversity to actually fuel your next attempt to reach your goal?

    I–Instinct

    One’s propensity to pursue the best goals in the most effective ways.

    Arguably one of the most consistent and potent contributors to student failure, dropouts, or underperformance is a lack of Instinct. The vast majority of students waste tremendous energy, time, and effort pursuing less than ideal goals in less than optimal ways. That’s why so many lose their way or quit. That’s why it’s important to ask:

    • What adjustment(s) can you make to your goal to have it be even more compelling and clear for you?
    • What specific tweaks or shifts can you make to how you are pursuing your goal to best accelerate and/or enhance your chances of achieving it?
    • As you think about your goal (e.g. graduation), in what ways might you be wasting your precious time, energy, and/or effort?  If you could do less of one thing and more of another to most dramatically enhance your chances of success, what would that look like?

    T–Tenacity

    The sheer relentlessness with which one pursues one’s most important, long-term, difficult goals.

    This is the classic, traditional definition of basic grit. But as the world of education wakes up to the hard reality that more tenacity is not always a good thing, we have an opportunity to infuse the qualitative aspects of GRIT. These include two continua: Good versus Bad GRIT, and Effective versus Ineffective GRIT.

    Pretty much every student has expended considerable Tenacity on the wrong stuff, or in less than optimal ways. The more students master how to funnel the right kind of Tenacity and overall GRIT toward their most worthy goals, the more likely they are to thrive and succeed.

    • If you utterly refused to quit, and were to give this goal your best-ever effort, how would you attack it even better this time?
    • How can you re-engage toward and go after your goal in a way that is most beneficial, even elevating, to those around you?
    • If your life depended on you sticking to and achieving this goal, what steps would you take now, that you’ve not yet taken?

    How do we equip students to stay on path, no matter what occurs—from natural disasters to simple, everyday adversity?  Growth, Resilience, Instinct, and Tenacity spell more than GRIT. They spell ownership. And they transcend plain old advice (even the god-fatherly kind).

    While each of these dimensions is powerful on its own, when we weave them together they become the four actionable facets of GRIT that not only fortify students, but can also permanently instill in them a lifelong sense of ownership for learning, for making important decisions, and for contributing something of value to their own lives and their society.

     
    read more
  • 3 steps to upgrade your GRIT in education

    by Paul G. Stoltz, Ph.D., Author

    blog image alt text

    Grit is a powerful tool to help you achieve your goals, but as we know, it can sometimes fall short. Worse yet, using it the wrong way can backfire and even lead to real trouble. Consider this “fall short” and “backfire” conversation I overheard just last week.

    “What’s your grit and resilience strategy?” the Provost at a premier regional college asked his cross-town colleague at a college fundraising dinner I recently attended. The question instantly caught my ear and my eye. I was struck by both the ease with which this clearly loaded question fell from his lips, as well as the relaxed assumptiveness with which it was received.

    “Ah, well, you know, there’s so much talk and information about grit out there now, but honestly, we’re not sure what we think about it yet. Of course we’ve had our people watch the videos, read the books, start talking to each other about it more…at least the basics, you know? But frankly, results seem mixed, at best.

    Get this! We had one student repeatedly camp on the doorstep of the Registrar’s Office, apparently in an effort to get his grade changed, because he thought he could get what he wanted just by refusing to take no (or a bad grade) for an answer. When it was explained to him repeatedly that this wasn’t the best strategy and his grade was actually determined by his professor, the student somewhat deafly responded, ‘Got too much grit to quit!'”

    “That’s an amazing story,” the Provost replied. “Good to know. Honestly, you’re way ahead of us. We’re still exploring all the options on what we might pursue with grit, but your example will definitely help.”

    So what’s your grit and resilience strategy for your institution? And how do you avoid the dreaded and increasingly common “mixed results” or backfire conundrum? How do you minimize the potential downside of students misusing their grit and maximize the vital upside that will make them successful and productive? Here are three simple steps to Upgrade Your GRIT™ in Education.

    Step One: Shatter the “More is Better” Grit Myth

    Arguably one of the most dangerous assumptions when it comes to grit is the burgeoning belief that “more is better, more is more.” It’s nearly everywhere. “We just gotta show more grit!” Dabo Swinney, Clemson University’s football coach, declared after a heartbreaking loss.

    In another instance, I was asked by a faculty member at a Texas university, “Dr. Stoltz, how do we help our students grow and show more grit?” This is not an uncommon question. One I hear more and more.

    However, if just having more grit is so desirable, consider this simple provocation. First, think of the most dangerous person you’ve ever heard of or known. Second, ask yourself how much grit—determination, passion, and effort—they showed in pursuit of their nefarious goals. Next, ask yourself, is grit always and necessarily a good thing? For everyone? In all situations?

    The truth is that helping our students build higher and higher levels of grit guarantees next to nothing. Worse yet, it can lead to disaster. In fact, many students have plenty of grit. That’s not the issue. Their quantity of grit is not what’s getting in their way. It’s the quality of their GRIT that may be hobbling their efforts, progress, and success.

    To free yourself from the “more is better” myth, ask yourself and/or your team a simple question: What matters more, the quantity or the quality of your students’ grit? When it comes to the kind of students we want to grow, the kind of lives we’d like them to live, and the contributions we’d like them to make in the world, do we want them to use their growth mindset, resilience, instinct, and tenacity not merely to achieve their goals but also to show consideration for other people, for their environment, and for the general good?

    Ready for a bizarre, if not impossible, statistic? I’ve asked this exact question of more than 500,000 people across six continents, and one hundred percent respond resoundingly with “Quality!” 100 percent. That’s stunning. And each time I test it, I get the same result. When it comes to GRIT, remember: Quantity is what we require, but Quality takes us higher.

    Step Two: Foster Smart GRIT

    “But I worked really hard on this!” How many times have students said that to defend work or a test that wasn’t as good as it should have been? Don’t forget its anemic sibling, “I stayed up all night (or ‘spent all weekend’) studying for this test!” “Doesn’t my effort count?” they complain.

    What I sometimes call “Smart” and “Dumb” GRIT can be re-labeled “Effective” and “Ineffective” GRIT. Does urging our students to just try harder, to pour more effort and energy into the task always lead to the best results? More importantly, does it best serve our students as they try to make progress in an occasionally puzzling world? What if, instead, we taught them how to use ever-more thoughtful, intelligent, effective GRIT—the kind that accelerates and enhances their success—especially for the most daunting, long-term, challenging assignments, projects, and tasks?

    Shifting students’ focus from a concern with “how much or how hard can I try” to asking the questions “How else can I achieve my goal?” and “How can I do this even better?” can lead to profound revelations for them. By encouraging them to consider rational, creative, or more efficient alternatives when they get stuck, or new ways to solve problems that might yield an even greater result, we begin to equip our students for the adversity-rich, highly demanding world of work, where they will be rewarded mainly for how well they achieve their goals, not for how much sheer effort or drive they expend in the pursuit.

    Step Three: Grow Good GRIT

    Ever see that high-achieving student whose classmates find him hard to be around or to work with? What about the ones whose higher marks only lower their classmates’ desire to hear their comments or to join their group project?

    We’ve all experienced the boss, colleague, or student who has plenty of GRIT but goes after goals in ways that hinder, even hurt others. Consider the powerful difference between Bad and Good GRIT. Bad GRIT happens when a person goes after goals in ways that are intentionally or unintentionally detrimental to others. Good GRIT is of course the opposite: its hallmark is pursuing goals in ways that take other people and their goals into consideration or working in teams in ways that allow all participants to benefit. Pretty much everyone I know, me included, has demonstrated Bad GRIT, despite the best of intentions. That’s pretty humbling.

    Good GRIT happens when we go after our goals in ways that are ultimately beneficial, and ideally elevating, to those around us. This attitude is captured by the line rock star Bruce Springsteen uses to end his concerts: “Nobody wins unless everybody wins.”

    Teaching students the difference between Good and Bad GRIT is arguably one of the most potent and important lessons we can impart. Awakening them to the power and potential of Good GRIT is elemental to us graduating not just decent students, but good citizens.

    Long after they return their caps and gowns, it is the quality of our students’ GRIT that determines how they will navigate life’s ups and downs and what kind of mark they will make in their community, their workplace, and their world.

     

    read more
  • The Networked University

    by Denis Hurley, Director of Future Technologies, Pearson

    blog image alt text

    From tomorrow through Friday (31 Oct-3 Nov), you can visit Pearson’s booth (#401) at Educause to learn about how the student of the future may navigate her learning experiences through networked universities with the assistance of Pearson’s digital products and services.

    This scenario is based on The Networked University: Building Alliances for Innovation in Higher Education, written by Jeff Selingo, which imagines institutions of higher education strengthening their own offerings and improving learner outcomes through greater collaboration rather than competition.

    Pearson’s partnership with IBM Watson, our mixed reality applications created for HoloLens, and our digital badging platform Acclaim are just a few of the ways we are empowering students to make the most of emerging technologies.

    Since its inception, the Future Technologies program at Pearson has explored many of these technologies while considering how our education systems can evolve. We continue to scan the horizon for new opportunities, and we are always learning.

    If you are unable to attend Educause, check out the video below and follow Olivia’s journey from discovery and enrollment through lifelong learning:

    read more
  • Chirons will lead us out of the AI Technopanic (and you can be a chiron)

    by Denis Hurley, Director of Future Technologies, Pearson

    blog image alt text

    Now more than ever, faster than ever, technology is driving change. The future is an unknown, and that scares us. However, we can overcome these fears and utilize these new technologies to better equip ourselves and steer us in a positive direction.

    Language evolves, and understanding these changes is crucial to learning how to communicate effectively. As with almost all change, it’s best to embrace it rather than try in vain to resist it.

    For example, it appears as though I’m on the losing side in the popular definition of the term “mixed reality.” Sorry, Mr. Milgram — I’ve given in.

    Technopanic

    A technopanic is extreme fear of new technology and the changes it may bring. Consider the Luddites, who destroyed machinery in the early 19th century. The only constant is change, so they had little success slowing down the Industrial Revolution. In recent history, think of Y2K. This was a little different because we feared that new technology had been embraced without our full understanding of the consequences. Of course, we proceeded into the new millennium without our computer systems plunging civilization back into the Dark Ages.

    Last year, the BBC compiled a list of some of history’s greatest technopanics. One of my favorites was the fear that telephone lines would be used by evil spirits as a means of entry into unsuspecting humans who were just trying to walk grandma through how to use her new light bulbs.

    Our current technopanic is about artificial intelligence and robotics. I am not saying this fear is unreasonable. We don’t know how this will play out, and it appears as though many jobs will no longer be necessary in the near future. However, expending too much energy on fear is not productive, and the most dire outcomes are unlikely. The Guardian produced this clever and amusing short about artificial intelligence:

    Working with New Technology

    The Replacements

    Narrow artificial intelligence is now prevalent, meaning programs that outperform humans at specific, well-defined tasks. Perhaps the most famous example is IBM’s Deep Blue defeating Garry Kasparov, the world champion of chess at the time — in 1997. Today, complex algorithms outperform humans at driving and analyzing lab results, among many other things.

    Robots, which are stronger and larger (or smaller) and which do not get bored, get sick, or go on strike, have been replacing humans for hundreds of years. They can fly and work through the night for days on end or longer.

    Can Humans Compete?

    Spending too much energy on searching for an answer to this question is a waste of time. We should not see progress as a competitor or as an enemy. These are tools we can use.

    Augmenting Ourselves

    Cyborgs: For many people, this is the word that comes to mind when reading the heading above. While the word makes us think of science fiction, we have been implanting devices in our bodies for decades. But we can now control artificial limbs directly from our brains, bypassing the spinal cord.

    More “extreme” cyborgs do exist, such as Neil Harbisson, who can hear colors via an antenna implanted in his skull. Transhumanists aim to overcome human limitations through science and technology.

    Becoming a cyborg is not practical, desirable, or even feasible for many of you. It’s also not necessary.

    Cobots: A cobot is a robot designed to work interactively with a human in a shared workspace. Lately, some people have been using the word to refer to the human who works with robots or to the unified entity itself.

    I don’t think this new definition is useful; the word has practical value when it refers to a specific type of robot.

    Centaurs: After Kasparov lost to Deep Blue, he understood the potential of humans working with machines. He created a new form of chess called “centaur chess” or “freestyle chess.” Teams can consist of all humans, all algorithms, or a combination (a centaur). The champion has almost always been a centaur. Kasparov saw the value of combining what humans do best with what machines do best.

    We Should Strive to Be Chirons

    In Greek mythology, centaurs tended to be unruly, amoral, and violent. When considering a blend of human abilities and machine abilities, a potential outcome is losing our sense of humanity.

    Chiron was a sensitive and refined centaur in Greek mythology. He taught and nurtured youth, most notably, Achilles.

    In the context of maintaining sanity through this technopanic and, more generally, coping with technological change, Chiron embodies the centaur we should aspire to.

    Of the three stages of managing technology-induced fear (reaction, interaction, and creative acceptance), this is the third. We all need to strive to be chirons. For our own sake, this is critical to lifelong learning. For the sake of our youth, we need to be able to demonstrate constructive and responsible use of technology.

    At Educause 2017, we will explore how new technologies can impact the future of higher education and student success. Discover opportunities to engage with Pearson at the conference and drive these critical conversations.

     

    read more
  • Is ed tech really working? 5 core tenets to rethink how we buy, use, and measure new tools

    by Todd Bloom, David Deschryver, Pam Moran, Chrisandra Richardson, Joseph South, Katrina Stevens

    blog image alt text

    This is the fifth in a series of essays surrounding the EdTech Efficacy Research Symposium, a gathering of 275 researchers, teachers, entrepreneurs, professors, administrators, and philanthropists to discuss the role efficacy research should play in guiding the development and implementation of education technologies. This series was produced in partnership with Pearson, a co-sponsor of the symposium co-hosted by the University of Virginia’s Curry School of Education, Digital Promise, and the Jefferson Education Accelerator. Click through to read the first, second, third, and fourth pieces.

    Education technology plays an essential role in our schools today. Whether the technology supports instructional intervention, personalized learning, or school administration, the successful application of that technology can dramatically improve productivity and student learning.

    That said, too many school leaders lack the support they need to ensure that educational technology investment and related activities, strategies, or interventions are evidence-based and effective. This gap between opportunity and capacity is undermining the ability of school leaders to move the needle on educational equity and to execute on the goals of today’s K-16 policies. The education community needs to clearly understand this gap and take some immediate steps to close it.

    The time is ripe

    The new federal K-12 law, the Every Student Succeeds Act, elevates the importance of evidence-based practices in school purchasing and implementation. The use of the state’s allocation for school support and improvement illustrates the point. Schools that receive these funds must invest only in activities, strategies, or interventions that demonstrate a statistically significant effect on improving student outcomes or other relevant outcomes.

    That determination must rely on research that is well designed and well implemented, as defined in the law. And once implementation begins, the U.S. Department of Education asks schools to focus on continuous improvement by collecting information about the implementation and making necessary changes to advance the goals of equity and educational opportunity for at-risk students. The law, in short, links compliance with evidence-based procurement and implementation that is guided by continuous improvement.

    New instructional models in higher education will likewise need evidence-based practices if they are to take root. School leaders are under intense pressure to find ways to make programs more affordable, student-centered, and valuable to a rapidly changing labor market. Competency-based education (the unbundling of certificates and degrees into discrete skills and competencies) is one of the better-known responses to the challenge, but the model will likely stay experimental until there is more evidence of success.

    “We are still just beginning to understand CBE,” Southern New Hampshire University President Paul LeBlanc said. “Project-based learning, authentic learning, well-done assessment rubrics — those are all good efforts, but do we have the evidence to pass muster with a real assessment expert? Almost none of higher ed would.”

    It is easy to forget that the abundance of educational technology is a relatively new thing for schools and higher ed institutions. Back in the early 2000s, the question was how to make new educational technologies viable instructional and management tools. Education data was largely just a lagging measure used for school accountability and reporting.

    Today, the data can provide strong, real-time signals that advance productivity through, for example, predictive analytics, personalized learning, curriculum curating and delivery, and enabling the direct investigation into educational practices that work in specific contexts. The challenge is how to control and channel the deluge of bytes and information streaming from the estimated $25.4 billion K-16 education technology industry.

    “It’s [now] too easy to go to a conference and load up at the buffet of innovations. That’s something we try hard not to do,” said Chad Ratliff, director of instructional programs for Virginia’s Albemarle County Schools. The information has to be filtered and vetted, which takes time and expertise.

    Improving educational equity is the focus of ESSA, the Higher Education Act, and a key reason many school leaders chose to work in education. Moving the needle increasingly relies on evidence-based practices. As the Aspen Institute and Council of Chief State School Officers point out in a recent report, equity means — at the very least — that “every student has access to the resources and educational rigor they need at the right moment in their education despite race, gender, ethnicity, language, disability, family background, or family income.”

    Embedded in this is the presumption that the activities, strategies, or interventions actually work for the populations they intend to benefit.

    Educators cannot afford to invest in ineffective activities. At the federal K-12 level, President Donald Trump is proposing that, next year, Congress cut spending for the Education Department and eliminate many programs, including $2.3 billion for professional development programs, $1.2 billion for after-school funds, and the new Title IV grant that explicitly supports evidence-based and effective technology practices in our schools.

    Higher education is also in a tight spot. The president seeks to cut spending in half for Federal Work-Study programs, eliminate Supplemental Educational Opportunity grants, and take nearly $4 billion from the Pell Grant surplus for other government spending. At the same time, Education Secretary Betsy DeVos is reviewing all programs to explore which can be eliminated, reduced, consolidated, or privatized.

    These proposed cuts and reductions increase the urgency for school leaders to tell better stories about the ways they use the funds to improve educational opportunities and learning outcomes. And these stories are more compelling (and protected from budget politics) when they are built upon evidence.

    Too few resources

    While this is a critical time for evidence-based and effective program practices, here is the rub: The education sector is just beginning to build out this body of knowledge, so school leaders are often forging ahead without the kind of guidance and research they need to succeed.

    The challenges are significant and evident throughout the education technology life cycle. For example, it is clear that evidence should influence procurement standards, but that is rarely the case. The issue of “procurement standards” is linked to cost thresholds and related competitive and transparent bidding requirements. It is seldom connected with measures of prior success and research related to implementation and program efficacy. Those types of standards are foreign to most state and local educational agencies, left to “innovative” educational agencies and organizations, like Digital Promise’s League of Innovative Schools, to explore.

    Once the trials of implementation begin, school leaders and their vendors typically act without clear models of success and in isolation. There just are not good data on efficacy for most products and implementation practices, which means that leaders cannot avail themselves of models of success and networks of practical experience. Some schools and institutions with the financial wherewithal, like Virginia’s Albemarle and Fairfax County Public Schools, have created their own research process to produce their own evidence.

    In Albemarle, for example, learning technology staff test-bed solutions to instructional and enterprise needs. Staff spend time observing students and staff using new devices and cloud-based services. They seek feedback and performance data from both teachers and students in response to questions about the efficacy of the solution. They will begin with questions like “If a service is designed to support literacy development, what variable are we attempting to affect? What information do we need to validate significant impact?” Yet, like the “innovators” of procurement standards, these are the exceptions to the rule.

    And as schools make headway and immerse themselves in new technologies and services, the bytes of data and useful information multiply, but the time and capacity necessary to make them useful remain scarce. Most schools are not like Fairfax and Albemarle counties. They do not have the staff and experts required to parse the data and uncover meaningful insights into what’s working and what’s not. That kind of work and expertise isn’t something that can be simply layered onto existing responsibilities without overloading and possibly burning out staff.

    “Many schools will have clear goals, a well-defined action plan that includes professional learning opportunities, mentoring, and a monitoring timeline,” said Chrisandra Richardson, a former associate superintendent for Montgomery County Public Schools in Maryland. “But too few schools know how to exercise a continuous improvement mindset, how to continuously ask: ‘Are we doing what we said we would do — and how do we course-correct if we are not?’ ”

    Immediate next steps

    So what needs to be done? Here are five specific issues that the education community (philanthropies, universities, vendors, and agencies) should rally around.

    • Set common standards for procurement. If every leader must reinvent the wheel when it comes to identifying key elements of the technology evaluation rubric, we will ensure we make little progress — and do so slowly. The sector should collectively secure consensus on the baseline procurement standards for evidence-based and research practices and provide them to leaders through free or open-source evaluative rubrics or “look fors” they can easily access and employ.
    • Make evidence-based practice a core skill for school leadership. Every few years, leaders in the field try to pin down exactly what core competencies every school leader should possess (or endeavor to develop). If we are to achieve a field in which leaders know what evidence-based decision-making looks like, we must incorporate it into professional standards and include it among our evaluative criteria.
    • Find and elevate exemplars. As Charles Duhigg points out in his recent best seller Smarter Faster Better, productive and effective people do their work with clear and frequently rehearsed mental models of how something should work. Without them, decision-making can become unmoored, wasteful, and sometimes even dangerous. Our school leaders need to know what successful evidence-based practices look like. We cannot anticipate that leader or educator training will incorporate good decision-making strategies around education technologies in the immediate future, so we should find alternative ways of showcasing these models.
    • Define “best practice” in technology evaluation and adoption. Rather than force every school leader to develop and struggle to find funds to support their own processes, we can develop models that alleviate the need for schools to build and invest in their own research and evidence departments. Not all school districts enjoy the resources to investigate their own tools, and different contexts demand differing considerations. Best practices help leaders navigate variation within the confines of their resources. The Ed Tech RCE Coach is one example of a set of free, open-source tools available to help schools embed best practices in their decision-making.
    • Promote continuous evaluation and improvement. Decisions, even the best ones, have a shelf life. They may seem appropriate until evidence proves otherwise. But without a process to gather information and assess decision-making efficacy, it’s difficult to learn from any decisions (good or bad). Together, we should promote school practices that embrace continuous research and improvement practices within and across financial and program divisions to increase the likelihood of finding and keeping the best technologies.

    The urgency to learn about and apply evidence to buying, using, and measuring success with ed tech is pressing, but the resources and protocols school leaders need to make it happen are scarce. These are conditions that position our school leaders for failure — unless the education community and its stakeholders get together to take some immediate actions.

    This series is produced in partnership with Pearson. The 74 originally published this article on September 11th, 2017, and it was re-posted here with permission.

    read more
  • Communicate often and better: How to make education research more meaningful

    by Jay Lynch, PhD and Nathan Martin, Pearson

    blog image alt text

    Question: What do we learn from a study that shows a technique or technology likely has affected an educational outcome?

    Answer: Not nearly enough.

    Despite widespread criticism, the field of education research continues to emphasize statistical significance—rejecting the conclusion that chance is a plausible explanation for an observed effect—while largely neglecting questions of precision and practical importance. Sure, a study may show that an intervention likely has an effect on learning, but so what? Even researchers’ recent efforts to estimate the size of an effect don’t answer key questions. What is the real-world impact on learners? How precisely is the effect estimated? Is the effect credible and reliable?

    Yet it’s the practical significance of research findings that educators, administrators, parents and students really care about when it comes to evaluating educational interventions. This has led to what Russ Whitehurst has called a “mismatch between what education decision makers want from the education research and what the education research community is providing.”

    Unfortunately, education researchers are not expected to interpret the practical significance of their findings or acknowledge the often embarrassingly large degree of uncertainty associated with their observations. So, education research literature is filled with results that are almost always statistically significant but rarely informative.

    Early evidence suggests that many edtech companies are following the same path. But we believe that they have the opportunity to change course and adopt more meaningful ways of interpreting and communicating research that will provide education decision makers with the information they need to help learners succeed.

    Admitting What You Don’t Know

    For educational research to be more meaningful, researchers will have to acknowledge its limits. Although published research often projects a sense of objectivity and certainty about study findings, accepting subjectivity and uncertainty is a critical element of the scientific process.

    On the positive side, some researchers have begun to report what are known as standardized effect sizes, calculations that help compare outcomes in different groups on a common scale. But researchers rarely interpret the meaning of these figures. And the figures can be confusing. A ‘large’ effect may actually be quite small when compared to available alternatives or when factoring in the length of treatment, and a ‘small’ effect may be highly impactful because it is simple to implement or cumulative in nature.

    Confused? Imagine the plight of a teacher trying to decide what products to use, based on evidence—an issue of increased importance since the Every Student Succeeds Act (ESSA) promotes the use of federal funds for certain programs, based upon evidence of effectiveness. The newly-launched Evidence for ESSA admirably tries to help support that process, complementing the What Works Clearinghouse and pointing to programs that have been deemed “effective.” But when that teacher starts comparing products, say Math in Focus (effect size: +0.18) and Pirate Math (effect size: +0.37), the best choice isn’t readily apparent.

    It’s also important to note that every intervention’s observed “effect” is associated with a quantifiable degree of uncertainty. By glossing over this fact, researchers risk promoting a false sense of precision and making it harder to craft useful data-driven solutions. While acknowledging uncertainty is likely to temper excitement about many research findings, in the end it will support more honest evaluations of an intervention’s likely effectiveness.
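
    To make the uncertainty point concrete, here is a minimal sketch, in Python and with invented summary statistics (not data from any program named above), of how a standardized effect size such as Cohen’s d is computed together with an approximate 95% confidence interval, using the pooled standard deviation and a common large-sample approximation for the standard error:

    ```python
    import math

    def cohens_d_with_ci(mean_t, mean_c, sd_t, sd_c, n_t, n_c, z=1.96):
        """Cohen's d (standardized mean difference) with an approximate 95% CI.

        Uses the pooled standard deviation and the common large-sample
        approximation for the standard error of d.
        """
        pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                              / (n_t + n_c - 2))
        d = (mean_t - mean_c) / pooled_sd
        # Large-sample standard error of d.
        se = math.sqrt((n_t + n_c) / (n_t * n_c) + d ** 2 / (2 * (n_t + n_c)))
        return d, (d - z * se, d + z * se)

    # Invented treatment vs. control summary statistics on the same test.
    d, (low, high) = cohens_d_with_ci(mean_t=78, mean_c=74, sd_t=15, sd_c=16,
                                      n_t=120, n_c=115)
    print(f"d = {d:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
    # -> d = 0.26, 95% CI [0.00, 0.51]
    ```

    On these invented numbers the sketch reports d of about 0.26 with an interval running from roughly zero to 0.51, wide enough that two programs with superficially different point estimates may be statistically indistinguishable. That is exactly the uncertainty the reporting conventions described above tend to gloss over.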

    Communicate Better, Not Just More

    In addition to faithfully describing the practical significance and uncertainty around a finding, there also is a need to clearly communicate information regarding research quality, in ways that are accessible to non-specialists. There has been a notable unwillingness in the broader educational research community to tackle the challenge of discriminating between high quality research and quackery for educators and other non-specialists. As such, there is a long overdue need for educational researchers to be forthcoming about the quality and reliability of interventions in ways that educational practitioners can understand and trust.

    Trust is the key. Whatever issues might surround the reporting of research results, educators are suspicious of people who have never been in the classroom. If a result or debunked academic fad (e.g. learning styles) doesn’t match their experience, they will be tempted to dismiss it. As education research becomes more rigorous, relevant, and understandable, we hope that trust will grow. Even simply categorizing research as either “replicated” or “unchallenged” would be a powerful initial filtering technique given the paucity of replication research in education. The alternative is to leave educators and policy-makers intellectually adrift, susceptible to whatever educational fad is popular at the moment.

    At the same time, we have to improve our understanding of how consumers of education research understand research claims. For instance, surveys reveal that even academic researchers commonly misinterpret the meaning of common concepts like statistical significance and confidence intervals. As a result, there is a pressing need to understand how those involved in education interpret (rightly or wrongly) common statistical ideas and decipher research claims.

    A Blueprint For Change

    So, how can the education technology community help address these issues?

    Despite the money and time spent conducting efficacy studies on their products, surveys reveal that research often plays a minor role in edtech consumer purchasing decisions. The opaqueness and perceived irrelevance of edtech research studies, which mirror the reporting conventions typically found in academia, no doubt contribute to this unfortunate fact. Educators and administrators rarely possess the research and statistical literacy to interpret the meaning and implications of research focused on claims of statistical significance and measuring indirect proxies for learning. This might help explain why even well-meaning educators fall victim to “learning myths.”

    And when nearly every edtech company is amassing troves of research studies, all ostensibly supporting the efficacy of their products (with the quality and reliability of this research varying widely), it is understandable that edtech consumers treat them all with equal incredulity.

    So, if the current edtech emphasis on efficacy is going to amount to more than a passing fad and avoid devolving into a costly marketing scheme, edtech companies might start by taking the following actions:

    • Edtech researchers should interpret the practical significance and uncertainty associated with their study findings. The researchers conducting an experiment are best qualified to answer interpretive questions around the real-world value of study findings, and we should expect them to make an effort to do so.
    • As an industry, edtech needs to work toward adopting standardized ways to communicate the quality and strength of evidence as it relates to efficacy research. The What Works Clearinghouse has made important steps, but it is critical that relevant information is brought to the point of decision for educators. This work could resemble something like food labels for edtech products; a sketch of what such a label might contain appears after this list.
    • Researchers should increasingly use data visualizations to make complex findings more intuitive while making additional efforts to understand how non-specialists interpret and understand frequently reported statistical ideas.
    • Finally, researchers should employ direct measures of learning whenever possible rather than relying on misleading proxies (e.g., grades or student perceptions of learning) to ensure that the findings reflect what educators really care about. This also includes using validated assessments and focusing on long-term learning gains rather than short-term performance improvement.
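
    As a thought experiment, the “food labels” idea from the second recommendation above can be sketched as a small data structure. Everything below is a hypothetical shape, not an existing standard or API; the fields simply gather in one place the quantities this article argues decision makers need, including the “replicated” versus “unchallenged” distinction raised earlier:

    ```python
    from dataclasses import dataclass

    @dataclass
    class EfficacyLabel:
        """Hypothetical 'food label' summarizing the evidence behind a product."""
        product: str
        outcome_measured: str            # a direct measure of learning, not a proxy
        effect_size_d: float             # standardized mean difference
        ci_95_low: float                 # uncertainty stated, not glossed over
        ci_95_high: float
        n_students: int                  # scale of the evidence
        study_design: str                # e.g., "randomized controlled trial"
        independently_replicated: bool   # "replicated" vs. "unchallenged"

    # An invented example of what might sit at the point of purchase.
    label = EfficacyLabel(
        product="Example Math Program",
        outcome_measured="state algebra assessment",
        effect_size_d=0.18,
        ci_95_low=0.02,
        ci_95_high=0.34,
        n_students=2300,
        study_design="randomized controlled trial",
        independently_replicated=False,
    )
    print(label)
    ```

    A label like this would not settle the interpretive questions, but it would put effect size, uncertainty, scale, design, and replication status side by side where educators actually make decisions.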

    This series is produced in partnership with Pearson. EdSurge originally published this article on April 1, 2017, and it was re-posted here with permission.

     

    read more
  • Technical & human problems with anthropomorphism & technopomorphism

    by Denis Hurley, Director of Future Technologies, Pearson

    blog image alt text

    Anthropomorphism is the attribution of human traits, emotions, and intentions to non-human entities (OED). It has been used in storytelling from Aesop to Zootopia, and people debate its impact on how we view gods in religion and animals in the wild, but that debate is out of scope for this short piece.

    When it comes to technology, anthropomorphism is certainly more problematic than it is useful. Here are three examples:

    1. Consider how artificial intelligence is often described as working like a human brain, which is not how AI works. This results in people misunderstanding its potential uses, attempting to apply it in inappropriate ways, and failing to consider applications where it could provide more value. Ines Montani has written an excellent summary on AI’s PR problem.
    2. More importantly, anthropomorphism contributes to our fear of progress, which often leads to full-blown technopanics. We are currently in a technopanic brought about by the explosion of development in automation and data science. Physically, these machines are often depicted as bipedal killing machines, even though bipedalism is not the most effective form of mobility for a killing machine. Regarding intent, superintelligent machines are thought of as a threat not just to employment but to our survival as a species. This assumes that these machines will treat Homo sapiens much as Homo sapiens has treated other species on this planet.
    3. Pearson colleague Paul Del Signore asked via Twitter, “Would you say making AI speak more human-like is a successful form of anthropomorphism?” This brings to mind a third major problem with anthropomorphism: the uncanny valley. While adding humanlike interactions can contribute to good UX, too much (but not quite enough) similarity to a human can result in frustration, discomfort, and even revulsion.

    Historically, we have used technology to achieve both selfish and altruistic goals. Overwhelmingly, however, technology has helped us reach a point in human civilization in which we are the most peaceful and healthy in history. In order to continue on this path, we must design machines to function in ways that utilize their best machine-like abilities.

    Technopomorphism is the attribution of technological characteristics to human traits, emotions, intentions, or biological functions. Think of how a thought process may be described as cogs turning in a machine, or someone’s capacity for work described as bandwidth.

    A Google search for the term “technopomorphism” only returns 40 results, and it is not listed in any online dictionary. However, I think the term is useful because it helps us to be mindful of our difference from machines.

    It’s natural for humans to use imagery that we do understand to try to describe things we don’t yet understand, like consciousness. Combined with our innate fear of dying, we imagine ways of deconstructing and reconstructing ourselves as immortal or as one with technology (singularity). This is problematic for at least two reasons:

    1. It restricts the ways in which we may understand new discoveries about ourselves to very limited forms.
    2. It often leads to teaching and training humans to function as machines, which is not the best use of our potential as humans.

    It is increasingly important that we understand how humans can best work with technology for the sake of learning. In the age of exponential technologies, that which makes us most human will be most highly valued for employment and is often what we draw on for personal enrichment.

    There may be some similarities, but we’re not machines. At least, not yet. In the meantime, I advocate for “centaur mentality.”

     

    read more
  • Can Edtech support - and even save - educational research?

    by Jay Lynch, PhD and Nathan Martin, Pearson

    blog image alt text

    There is a crisis engulfing the social sciences. What was thought to be known about psychology—based on published results and research—is being called into question by new findings and the efforts of individual groups like the Reproducibility Project. What we know is under question, and so is how we come to know. Long-institutionalized practices of scientific inquiry in the social sciences are being actively questioned, with proposals put forth for needed reforms.

    While the fields of academia burn with this discussion, education has remained largely untouched. But education is not immune to problems endemic in fields like psychology and medicine. In fact, there’s a strong case that the problems emerging in other fields are even worse in educational research. External or internal critical scrutiny has been lacking. A recent review of the top 100 education journals found that only 0.13% of published articles were replication studies. Education waits for its own crusading Brian Nosek to disrupt the canon of findings. Winter is coming.

    This should not be breaking news. Education research has long been criticized for its inability to generate a reliable and impactful evidence base. It has been derided for problematic statistical and methodological practices that hinder knowledge accumulation and encourage the adoption of unproven interventions. For its failure to communicate the uncertainty and relevance associated with research findings, like Value-Added Measures for teachers, in ways that practitioners can understand. And for struggling to impact educational habits (at least in the US) and how we develop, buy, and learn from (see Mike Petrilli’s summation) the best practices and tools.

    Unfortunately, decades of withering criticism have done little to change the methods and incentives of educational research in ways necessary to improve the reliability and usefulness of findings. The research community appears to be in no rush to alter its well-trodden path—even if the path is one of continued irrelevance. Something must change if educational research is to meaningfully impact teaching and learning. Yet history suggests the impetus for this change is unlikely to originate from within academia.

    Can edtech improve the quality and usefulness of educational research? We may be biased (as colleagues at a large and scrutinized edtech company), but we aren’t naïve. We know it might sound farcical to suggest technology companies may play a critical role in improving the quality of education research, given almost weekly revelations about corporations engaging in concerted efforts to distort and shape research results to fit their interests. It’s shocking to read about efforts to warp public perception of the effects of sugar on heart disease or the effectiveness of antidepressants. It would be foolish not to view research conducted or paid for by corporations with a healthy degree of skepticism.

    Yet we believe there are signs of promise. The last few years have seen a movement of companies seeking to research and report on the efficacy of educational products. The movement has benefited from the leadership of the Office of Educational Technology, the Gates Foundation, the Learning Assembly, Digital Promise, and countless others. Our own company has been on this road since 2013. (It’s not been easy!)

    These efforts represent opportunities to foment long-needed improvements in the practice of education research. A chance to redress education research’s most glaring weakness: its historical inability to appreciably impact the everyday activities of learning and teaching.

    Incentives for edtech companies to adopt better research practices already exist and there is early evidence of openness to change. Edtech companies possess a number of crucial advantages when it comes to conducting the types of research education desperately needs, including:

    • access to growing troves of digital learning data;
    • close partnerships with institutions, faculty, and students;
    • the resources necessary to conduct large and representative intervention studies;
    • in-house expertise in the diverse specialties (e.g., computer scientists, statisticians, research methodologists, educational psychologists, UX researchers, instructional designers, ed policy experts, etc.) that must increasingly collaborate to carry out more informative research;
    • a research audience consisting primarily of educators, students, and other non-specialists.

    The real worry with edtech companies’ nascent efforts to conduct efficacy research is not that they will fail to conduct research with the same quality and objectivity typical of most educational research, but that they will fall into the same traps that currently plague such efforts. Rather than looking for what would be best for teachers and learners, entrepreneurs may focus on the wrong measures (p-values, for instance) that confuse people rather than enlighten them.

    If this growing edtech movement repeats the follies of the current paradigm of educational research, it will fail to seize the moment to adopt reforms that can significantly aid our efforts to understand how best to help people teach and learn. And we will miss an important opportunity to enact systemic changes in research practice across the edtech industry with the hope that academia follows suit.

    Our goal over the next three articles is to hold a mirror up, highlighting several crucial shortcomings of educational research. These institutionalized practices significantly limit its impact and informativeness.

    We argue that edtech is uniquely incentivized and positioned to realize long-needed research improvements through its efficacy efforts.

    Independent education research is a critical part of the learning world, but it needs improvement. It needs a new role model, its own George Washington Carver, a figure willing to test theories in the field, learn from them, and then communicate them back to practitioners. In particular, we will be focusing on three key ideas:

    Why ‘What Works’ Doesn’t: Education research needs to move beyond simply evaluating whether or not an effect exists; that is, whether an educational intervention ‘works’. The ubiquitous use of null hypothesis significance testing in educational research is an epistemic dead end. Instead, education researchers need to adopt more creative and flexible methods of data analysis, focus on identifying and explaining important variations hidden under mean scores, and devote themselves to developing robust theories capable of generating testable predictions that are refined and improved over time.

    Desperately Seeking Relevance: Education researchers are rarely expected to interpret the practical significance of their findings or report results in ways that are understandable to non-specialists making decisions based on their work. Although there has been progress in encouraging researchers to report standardized mean differences and correlation coefficients (i.e., effect sizes), this is not enough. In addition, researchers need to clearly communicate the importance of study findings within the context of alternative options and in relation to concrete benchmarks, openly acknowledge uncertainty and variation in their results, and refuse to be content measuring misleading proxies for what really matters.

    Embracing the Milieu: For research to meaningfully impact teaching and learning, it will need to expand beyond an emphasis on controlled intervention studies and prioritize the messy, real-life conditions facing teachers and students. More energy must be devoted to the creative and problem-solving work of translating research into useful and practical tools for practitioners, an intermediary function explicitly focused on inventing, exploring, and implementing research-based solutions that are responsive to the needs and constraints of everyday teaching.

    Ultimately, education research is about more than just publication. It’s about improving the lives of students and teachers. We don’t claim to have the complete answers but, as we expand on these key principles over the coming weeks, we want to offer steps edtech companies can take to improve the quality and value of educational research. These are things we’ve learned and things we are still learning.

    This series is produced in partnership with Pearson. EdSurge originally published this article on January 6, 2017, and it was re-posted here with permission.

     

  • Learning through both physical and virtual discovery

    by Denis Hurley, Director of Future Technologies, Pearson


    This morning, I read Bill McKibben’s “Pause! We Can Go Back!,” a review of David Sax’s The Revenge of Analog: Real Things and Why They Matter. My friend and mentor of twenty years, the filmmaker Jill Godmilow, emailed it to me. I immediately thought of Delicate Steve’s interview with Bob Boilen on “All Songs Considered,” and then I mentally time-traveled to 2011…

    I was in Austin in 2011 for SXSW, learning from other startups, networking, and promoting my own digital products. The interactive component of the conference ended with a “surprise” performance at the enormous Stubb’s BBQ concert venue. I reluctantly waited in line with hundreds of others, hopeful to hear something like LCD Soundsystem, who had appeared in a previous year. Once we were all inside, The Foo Fighters took the stage. Considered by many to be “the last great American rock band,” they’re just not my thing. A traveling companion saw the boredom on my face and asked, “Do you want to hear something different?”

    6th Street was dead for the first time all week (nearly all the conference attendees were at Stubb’s), and we popped into a small bar where about ten other patrons huddled near a wiry young man on a small stage. Delicate Steve began to play The Ballad of Speck and Pebble. My brain lit up. It was one of the most inspiring live performances I’ve ever heard.

    In my kitchen, six years later, while I was making applesauce with my earbuds in, Slate’s “Political Gabfest” ended, and Mr. Boilen’s voice came on to introduce Steve Marion, aka Delicate Steve, on “All Songs Considered.” Marion talked about being a “Napster kid” as well as how he was inspired to play music after his grandmother gave him a toy guitar.

    He dove into the rabbit holes of discovery that were available via the Internet to a kid living in northwestern New Jersey. Driven by curiosity and play, using the physical and virtual tools available to him, he began to create. Last year, he played slide guitar on Paul Simon’s new album, and next week, he’ll be at The Bowery Ballroom in New York City.

    In McKibben’s review in The New York Review of Books, he comments, “Spotify’s playlists show people picking the same tunes over and over.” I believe the same was true when analog music dominated. Virgin Megastore promoted the latest big release from one of the giant record labels.

    The difference now is that more tools — virtual and physical — are available to us. How we use them is up to us. We need to ensure that everyone, especially young people, is aware of them all and knows how to use them properly for discovery. Dig deep into that artist’s archive on Spotify. Flip through those old records on Bleecker Street.

    In the late 1990s, Jill Godmilow taught me how to edit film and sound by hand while I was a student at The University of Notre Dame. I used an 8-plate Steenbeck. It was a lot of work to cut a film like that, but it helped me understand the value of a frame: 1/24 of a second.

    Now I have a child, and I try to help her understand how things work by making mechanical objects available to her. She’ll pick up the hand-made kaleidoscope I brought back from London, or crank the Kikkerland music box to hear “Waltzing Matilda.” Together, we play both Minecraft and Clue. Her favorite Christmas present last month was a record player. She chooses to put on the Taylor Swift record “Red” over and over and over again. She also explores Minecraft videos made by other kids all over the world.

    Some of these interactions blend the virtual and the physical, like using the Osmo pizza game to learn math while playing, or programming Dash to wheel around the apartment to practice problem-solving.

    We can foster creativity and encourage exploration using whatever tools we have available to us. I am not advocating a constant barrage of entertainment or toys — there is also value in escaping into a book or a tent in the woods — but new, digital tools are not necessarily a bad thing, and to many, they offer ways to learn and build, expanding their minds and enriching our culture.

    Explore, be weird, enjoy what you do, learn through what you enjoy. But do be careful not to lose yourself entirely in the virtual world. The physical world offers a nearly limitless range of new experiences and adventures. These are thrilling to us because of our human nature, and even as we learn how to embrace the digital to a greater extent, we should do so to enrich our lives, not in an attempt to replace something that doesn’t need replacing.

    I will always be grateful to Jill Godmilow for showing me how to analyze the finest moving parts of a completed whole, which I often have to do in a purely digital format, where the individual elements are not so apparent. I appreciate the music of Delicate Steve, meticulously constructed with his mind and fingers through a medley of neuron-firings, Google searches, and guitar riffs.

    I am thankful that my daughter wonders at our Remington typewriter and miniature carousel, watches the interlocking pieces, and reconstructs some of these relationships with blocks on her iPad, with dominos on the table, and with her friends in the schoolyard.

     

  • Why 'what works' doesn't: False positives in education research

    by Jay Lynch, PhD and Nathan Martin, Pearson


    If edtech is to help improve education research it will need to kick a bad habit—focusing on whether or not an educational intervention ‘works’.

    Answering that question through null hypothesis significance testing (NHST), which explores whether an intervention or product has an effect on the average outcome, undermines the ability to make sustained progress in helping students learn. It provides little useful information and fails miserably as a method for accumulating knowledge about learning and teaching. For the sake of efficiency and learning gains, edtech companies need to understand the limits of this practice and adopt a more progressive research agenda that yields actionable data on which to build useful products.

    How does NHST look in action? A typical research question in education might be whether average test scores differ between students who use a new math game and students who don’t. Applying NHST, a researcher would assess whether a positive—i.e. non-zero—difference in scores is significant enough to conclude that the game has had an impact, or, in other words, that it ‘works’. Left unanswered is why and for whom.
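    To make that concrete, here is a minimal sketch in Python of the analysis NHST implies for the math game example; the scores, group sizes, and effect are simulated stand-ins, not data from any real study.

        # A minimal NHST sketch: simulate two groups of test scores and ask
        # only whether their mean difference could plausibly be zero.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(seed=1)
        game = rng.normal(loc=72, scale=10, size=100)     # hypothetical scores, used the game
        control = rng.normal(loc=70, scale=10, size=100)  # hypothetical scores, no game

        t_stat, p_value = stats.ttest_ind(game, control)
        print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
        # If p < .05, the researcher declares the game 'works', learning
        # nothing about why, for whom, or by how much.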

    This approach pervades education research. It is reflected in the U.S. government-supported initiative to aggregate and evaluate educational research, aptly named the What Works Clearinghouse, and frequently serves as a litmus test for publication worthiness in education journals. Yet it has been subjected to scathing criticism almost since its inception, criticism that centers on two issues.

    False Positives And Other Pitfalls

    First, obtaining statistical evidence of an effect is shockingly easy in experimental research. One of the emerging realizations from the current crisis in psychology is that rather than serving as a responsible gatekeeper ensuring the trustworthiness of published findings, reliance on statistical significance has had the opposite effect of creating a literature filled with false positives, overestimated effect sizes, and grossly underpowered research designs.

    If a proposed intervention involves students doing virtually anything more cognitively challenging than passively listening to lecturing-as-usual (the typical straw man control in education research), then a researcher is very likely to find a positive difference as long as the sample size is large enough. Showing that an educational intervention has a positive effect is quite a feeble hurdle to overcome. It isn’t at all shocking, therefore, that in education almost everything seems to work.
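    A quick simulation shows how feeble that hurdle is. Assume an educationally trivial true effect of five hundredths of a standard deviation (a number chosen purely for illustration); statistical significance arrives once the sample grows large enough.

        # Even a trivial effect (d = 0.05) clears p < .05 at scale.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(seed=2)
        for n in (50, 500, 5_000, 50_000):
            treated = rng.normal(loc=0.05, scale=1.0, size=n)  # tiny true effect
            control = rng.normal(loc=0.0, scale=1.0, size=n)
            result = stats.ttest_ind(treated, control)
            print(f"n = {n:>6}: p = {result.pvalue:.4f}")
        # Exact p-values vary run to run, but at n = 50,000 per group the
        # trivial effect is all but guaranteed to come out 'significant'.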

    But even if these methodological concerns with NHST were addressed, there is a second serious flaw undermining the NHST framework upon which most experimental educational research rests.

    Null hypothesis significance testing is an epistemic dead end. It obviates the need for researchers to put forward testable models or theories to predict and explain the effects that interventions have. In fact, the only hypothesis evaluated within the framework of NHST is a caricature, a hypothesis the researcher doesn’t believe: that an intervention has zero effect. A researcher’s own hypothesis is never directly tested. And yet, with remarkable consistency, education researchers falsely conclude that a rejection of the null hypothesis counts as strong evidence in favor of their preferred theory.

    As a result, NHST encourages and preserves hypotheses so vague, so lacking in predictive power and theoretical content, as to be nearly useless. As researchers in psychology are realizing, even well-regarded theories, ostensibly supported by hundreds of randomized controlled experiments, can start to evaporate under scrutiny because reliance on null hypothesis significance testing means a theory is never really tested at all. As long as educational research continues to rely on testing the null hypothesis of no difference as a universal foil for establishing whether an intervention or product ‘works,’ it will fail to improve our understanding of how to help students learn.

    As analysts Michael Horn and Julia Freeland have noted, this dominant paradigm of educational research is woefully incomplete and must change if we are going to make progress in our understanding of how to help students learn:

    “An effective research agenda moves beyond merely identifying correlations of what works on average to articulate and test theories about how and why certain educational interventions work in different circumstances for different students.”

    Yet for academic researchers concerned primarily with producing publishable evidence of interventions that ‘work,’ the vapid nature of NHST has not been recognized as a serious issue. And because the NHST approach to educational research is relatively straightforward and safe to conduct (researchers have an excellent chance of getting the answer they want), a quick perusal of the efficacy pages at leading edtech companies shows that it remains the dominant paradigm in edtech.

    Are there, however, reasons to think edtech companies might be incentivized to abandon the current NHST paradigm? We think there are.

    What About The Data You’re Not Capturing?

    Consider a product owner at an edtech company. Although evidence that an educational product has a positive effect is great for producing compelling marketing brochures, it provides little information regarding why a product works, how well it works in different circumstances, or really any guidance for how to make it more effective.

    • Are some product features useful and others not? Are some features actually detrimental to learners but masked by more effective elements?
    • Is the product more or less effective for different types of learners or levels of prior expertise?
    • What elements should be added, left alone or removed in future versions of the product?

    Testing whether a product works doesn’t provide answers to these questions. In fact, despite all the time, money, and resources spent conducting experimental research, a company actually learns very little about its product’s efficacy when evaluated using NHST. There is minimal ability to build on research of this sort. So product research becomes a game of efficacy roulette, with the company just hoping that findings show a positive effect each time it spins the NHST wheel. Companies truly committed to innovation and improving the effectiveness of their products should find this a very bitter pill to swallow.

    A Blueprint For Change

    We suggest edtech companies can vastly improve both their own product research and our collective understanding of how to help students learn by modifying their approach to research in several ways.

    • Recognize the limited information NHST can provide. Used as the primary statistical framework for moving our understanding of learning and teaching forward, it is misapplied: it ultimately tells us nothing that we actually want to know. Furthermore, it contributes to the proliferation of spurious findings in education by encouraging questionable research practices and the reporting of overestimated intervention effects.
    • Instead of relying on NHST, edtech researchers should focus on putting forward theoretically informed predictions and then designing experiments to test them against meaningful alternatives. Rather than rejecting the uninteresting hypothesis of “no-difference,” the primary goal of edtech research should be to improve our understanding of the impact that interventions have, and the best way to do this is to compare models that compete to describe observations that arise from experimentation.
    • Rather than dichotomous judgments about whether an intervention works on average, greater evaluative emphasis should be devoted to exploring the impact of interventions across subsets of students and conditions. No intervention works equally well for every student and it’s the creative and imaginative work of trying to understand why and where an intervention fails or succeeds that is most valuable.

    Returning to our original example, rather than relying on NHST to evaluate a math game, a company will learn more by trying to improve its estimates and measurements of important variables, looking beneath group mean differences to explore why the game worked better or worse for sub-groups of students, and directly testing competing theoretical mechanisms proposed to explain the game’s influence on learner achievement. It is in this way that practical, problem-solving tools will develop and evolve to improve the lives of all learners.
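    The sketch below illustrates one way that comparison might look in code. The data, effect sizes, and column names are all hypothetical; the point is that fitting and comparing competing models (a constant game effect versus an effect that varies with prior ability) yields the ‘why and for whom’ answers a lone p-value cannot.

        # Compare competing explanatory models instead of testing a null.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(seed=3)
        n = 400
        prior = rng.normal(size=n)             # standardized prior-ability score
        game = rng.integers(0, 2, size=n)      # 1 = used the math game
        # Simulated truth: the game helps weaker students most.
        score = 70 + 3 * game + 5 * prior - 2 * game * prior + rng.normal(scale=8, size=n)
        df = pd.DataFrame({"score": score, "game": game, "prior": prior})

        constant = smf.ols("score ~ game + prior", data=df).fit()   # one effect for everyone
        moderated = smf.ols("score ~ game * prior", data=df).fit()  # effect varies with prior ability
        print(f"constant-effect AIC:  {constant.aic:.1f}")  # lower AIC = better model
        print(f"moderated-effect AIC: {moderated.aic:.1f}")
        print(moderated.params)  # estimates of who benefits, and by how much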

    This series is produced in partnership with Pearson. EdSurge originally published this article on February 12, 2017, and it was re-posted here with permission.

     

  • Analysis: For ed tech that actually works, embrace the science of learning

    by Kristen DiCerbo, Aubrey Francisco, Bror Saxberg, Melina Uncapher


    This is the second in a series of essays surrounding the EdTech Efficacy Research Symposium, a gathering of 275 researchers, teachers, entrepreneurs, professors, administrators, and philanthropists to discuss the role efficacy research should play in guiding the development and implementation of education technologies. This series was produced in partnership with Pearson, a co-sponsor of the symposium co-hosted by the University of Virginia’s Curry School of Education, Digital Promise, and the Jefferson Education Accelerator. Read the first piece here.

    As education technology gains an increasing presence in American schools, the big question being asked is, “Does it work?”

    But as curricula and learning tools are prepared for rigorous evaluation, we should think about how existing research on teaching and learning has informed their design. Building a movement around research and impact must include advocating for products based on learning research. Otherwise, we are essentially taking a “wait and hope” strategy to development: wait until we have something built and hope it works.

    When we make a meal, we want to at least have a theory about what each ingredient we include will contribute to the overall meal. How much salt do we put in to flavor it perfectly? When do we add it in? Similarly, when creating a curriculum or technology tool, we should be thinking about how each element impacts and optimizes overall learning. For example, how much and when do we add in a review of already-learned material to ensure memory retention? For this, we can turn to learning science as a guide.

    We know a lot about how people learn. Our understanding comes from fields as varied as cognitive and educational psychology, motivational psychology, neuroscience, behavioral economics, and computer science. There are research findings that have been replicated repeatedly across dozens of studies. If we want to create educational technology tools that ultimately demonstrate efficacy, these learning science findings should serve as the foundation, integrating the insights from decades of research into how people learn and how teachers teach into product design from the beginning.

    Existing research on learning

    So what do we know about how people learn? You could turn to foundational texts like Clark and Mayer’s e-Learning and the Science of Instruction, Dan Schwartz’s The ABCs of How We Learn, and Hattie and Yates’s Visible Learning for detail. Or you could look to the excellent summaries compiled by Deans for Impact, LearningScientists.org, and Digital Promise Global.

    Here are a few examples:

    Spaced practice: We know that extending practice over time is better than cramming all practice into the few days before an exam. Spaced practice strengthens information retention and keeps it fresh over time, interrupting the “forgetting curve.” Implementing spaced practice could be as simple as planning out review time. Technology can help implement spaced practice in at least two ways: 1) prompting students to make their own study calendars and 2) proactively presenting already-learned information for periodic review.
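    As a sketch of the second approach, the scheduler below uses a simple expanding-interval rule to generate review dates between first study and an exam; the doubling is illustrative, not any particular product’s algorithm.

        # Expanding-interval review: frequent at first, then spaced out.
        from datetime import date, timedelta

        def review_dates(first_study: date, exam: date, first_gap_days: int = 1):
            """Yield review dates whose spacing doubles after each review."""
            gap = timedelta(days=first_gap_days)
            upcoming = first_study + gap
            while upcoming < exam:
                yield upcoming
                gap *= 2
                upcoming = upcoming + gap

        for d in review_dates(date(2024, 3, 1), date(2024, 3, 31)):
            print(d)  # 2024-03-02, -04, -08, -16: review, then widen the gap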

    Retrieval practice: What should that practice look like? Rather than rereading or reading and highlighting, we know it is better for students to actually retrieve the information from memory because retrieving the information actually changes the nature of the memory for the information. It strengthens and solidifies the learning, as well as provides more paths to access the learning when you need it. Learners creating flashcards have known about this strategy for a long time. RetrievalPractice.org offers useful information and helpful applications building on this important principle. There is a potential danger point here for designers not familiar with learning literature. Since multiple-choice activities are easier to score with technology, it is tempting to create these kinds of easy questions for retrieval practice. However, learning will be stronger if students practice freely recalling the information rather than simply recognizing the answer from choices.
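    The difference matters for implementation. A free-recall prompt can be as simple as the loop sketched below (the cards are made up, and a real product would grade answers far more flexibly); the essential property is that the learner must generate the answer before seeing it.

        # Free recall: generate the answer from memory, then check it.
        cards = {
            "spaced practice": "short study sessions spaced out over time",
            "retrieval practice": "pulling information from memory rather than rereading it",
        }

        for prompt, answer in cards.items():
            attempt = input(f"Define '{prompt}': ")  # no options to recognize
            if attempt.strip().lower() == answer.lower():
                print("Correct!")
            else:
                print(f"Answer: {answer}")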

    Elaboration: Taking new information and expanding on it, linking it to other known information and personal experience, is another way to improve memory for new concepts. Linking new information to information that is already known can make it easy to recall later. In addition, simply expanding on information and explaining it in different ways can make retrieval easier. One way to practice this is to take main ideas and ask how they work and why. Another method is to have students draw or fill in concept maps, visually linking ideas and experiences together. There are a number of online tools that have been developed for creating concept maps, and current research is focusing on how to provide automated feedback on them.

    So how many educational technology products actually incorporate these known practices? How do they encourage students to engage in these activities in a systematic way?

    Existing research on instructional use of technology

    There is also significant research about how technology supports teaching practices, which should inform how a product is designed to be used in the classroom.

    For example, there is a solid research base on how to design activities that introduce new material prior to formal instruction. It suggests that students should initially be given a relatively difficult, open-ended problem that they are asked to solve. Students, of course, tend to struggle with this activity, with almost no students able to generate the “correct” approach. However, the effort students spend on this activity has been shown to lay a better foundation for future instruction to build on, as students come to it with a better understanding of the problem to be solved (e.g., Wiedmann, Leach, Rummel & Wiley, 2012; Belenky & Nokes-Malach, 2012). It is clearly important that this type of activity be presented to students as a chance to explore and that failure is accepted, expected, and encouraged. In contrast, an activity meant to be part of practice following direct instruction would likely include more step-by-step feedback and hints. So, if someone wants to design activities to be used prior to instruction, they might 1) select a fundamental idea from a lesson, 2) create multiple cases for which students must find an all-encompassing rule, and 3) situate those cases in an engaging scenario.

    Schwartz of Stanford University tested this idea with students learning about ratios — without telling them they were learning about ratios. Three cases with different ratios were created based on the number of objects in a space. This was translated into the number of clowns in different-sized vehicles, and students were asked to develop a “crowded clowns index” to measure how crowded the clowns are in the vehicles. Students are not specifically told about ratios, but must uncover that concept themselves.

    Product developers should consider research like this when designing their ed tech tools, as well as when they’re devising professional development programs for educators who will use those technologies in the classroom.

    Product makers must consider these questions when designing ed tech: Will the activity the technology facilitates be done before direct instruction? Will it be core instruction? Will it be used to review? How much professional development needs to be provided to teachers to ensure the fidelity of implementation at scale?

    Too often, designers think there is a singular answer to this series of questions: “Yes.” But in trying to be everything, we are likely to end up being nothing. Existing research on instructional uses of technology can help developers choose the best approach and design for effective implementation.

    Going forward

    With this research as foundation, though, we still have to cook the dish and taste it. Ultimately, applying learning science at scale to real-world learning situations is an engineering activity. It may require repeated iterations and ongoing measurement to get the mix of ingredients “just right” for a given audience, or a given challenging learning outcome. We need to make sure to carefully understand and tweak our learning environments, using good piloting techniques to find out both whether our learners and teachers can actually execute what we intend as we intended it (Is the learning intervention usable? Are teachers and students able to implement it as intended?), and whether the intervention gives us the learning benefits we hoped for (effectiveness).

    The key is that research should be informing development from the very beginning of an idea for a product, and an evidence-based “learning engineering” orientation should continue to be used to monitor and iterate changes to optimize impact. If we are building from a foundation of research, we are greatly increasing the probability that, when we get to those iterated and controlled trials after the product is created, we will in fact see improvements over time in learning outcomes.

    Follow the conversation on social media with the hashtag #ShowTheEvidence.

    Authors:

    • Kristen DiCerbo, Vice President, Education Research, Pearson
    • Aubrey Francisco, Chief Research Officer, Digital Promise
    • Bror Saxberg, Chief Learning Officer, Kaplan
    • Melina Uncapher, Assistant Professor, Department of Neurology, UC San Francisco

    This series is produced in partnership with Pearson. The 74 originally published this article on June 5, 2017, and it was re-posted here with permission.

  • #ShowTheEvidence: Building a movement around research, impact in ed tech

    by Aubrey Francisco, Bart Epstein, Gunnar Counselman, Katrina Stevens, Luyen Chou, Mahnaz Charania, Mark Grovic, Rahim Rajan, Robert Pianta, Rebecca Griffiths


    This is the first in a series of essays surrounding the EdTech Efficacy Research Symposium, a gathering of 275 researchers, teachers, entrepreneurs, professors, administrators, and philanthropists to discuss the role efficacy research should play in guiding the development and implementation of education technologies. This series was produced in partnership with Pearson, a co-sponsor of the symposium co-hosted by the University of Virginia’s Curry School of Education, Digital Promise, and the Jefferson Education Accelerator.

    To improve education in America, we must improve how we develop and use education technology.

    Teachers and students are increasingly using digital tools and platforms to support learning inside and outside the classroom every day. There are 3.6 million teachers using ed tech, and approximately one in four college students take online courses — four times as many as a decade earlier. Technology will impact the 74 million children currently under the age of 18 as they progress through the pre-K–12 education system. The key question is: What can we do to make sure that the education technology being developed and deployed today fits the needs of 21st-century learners?

    Our teachers and students deserve high-quality tools that provide evidence of student learning, and that provide the right kind of evidence — evidence that can tell us whether the tool is influencing the intended learning outcomes.

    Evidence and efficacy can no longer be someone else’s problem to be solved at some uncertain point in the future. The stakes are too high. We all have a role to play in ensuring that the money spent in ed tech (estimated at $13.2 billion in 2016 for K-12) lives up to the promise of enabling more educators, schools, and colleges to genuinely improve outcomes for students and help close persistent equity gaps.

    Still, education is complex. Regardless of the quality of a learning tool, there will be no singular, foolproof ed tech solution that will work for every student and teacher across the nation. Context matters. Implementation matters. Technology will always only be one element of an instructional intervention, which will also include instructor practices, student experiences, and multiple other contextual factors.

    Figuring out what actually works and why it works requires intentional planning, dedicated professional development, thoughtful implementation, and appropriate evaluation. This all occurs within a context of inconsistent and shifting incentives and, in the U.S., involves a particularly complex ecosystem of stakeholders. And unfortunately, despite a deep and vested interest in improving the system, the current ecosystem is often better at supporting the status quo than at introducing a potentially better-suited learning tool.

    That’s the challenge to be taken up by the EdTech Efficacy Research Symposium in Washington, D.C., this week, and the work underway as part of the initiative convened by the University of Virginia’s Curry School of Education, Digital Promise, and the Jefferson Education Accelerator. People like us rarely have the opportunity to collaborate, but this issue is too important to go it alone.

    Over the past six months, 10 working groups consisting of approximately 150 people spent valuable hours together learning about the challenges associated with improving efficacy and exploring opportunities to address these challenges. We’ve looked at issues such as how ed tech decisions are made in K-12 and higher education, what philanthropy can do to encourage more evidence-based decision-making, as well as what will be necessary to make the focus on efficacy and transparency of outcomes core to how ed tech companies operate.

    Over the next six weeks, we’ll explore these themes here, sharing findings and recommendations from the working groups. Our hope is to stimulate not just discussion but also practical action and concrete progress.

    Action and progress might look like new ways to use research in decision-making such as informational site Evidence for ESSA or tools that make it easier for education researchers to connect with teachers, districts, and ed tech companies, like the forthcoming National Education Researcher Database. Collaboration is critical to improving how we use research in ed tech, but it’s not easy. Building a common framework takes time. Acting on that framework is harder.

    So, as a starting point, here are three broader issues that we’ve learned about efficacy and evidence from our work so far.

    Everyone wants research and implementation analysis done, but nobody wants to pay more for it

    We know it’s not realistic to expect that the adoption of each ed tech product or curricular innovation will be backed up by a randomized controlled trial.

    Investors are reluctant to fund these studies, while schools and developers rarely want to pick up the tab for expensive studies. When Richard Culatta and Katrina Stevens were still at the U.S. Department of Education’s Office of Educational Technology, they pointed out that “it wouldn’t be economically feasible for most app creators (or schools) to spend $250k (a low price tag for traditional educational research) to evaluate the effectiveness of an app that only cost a total of $50k to build.”

    We could spend more efficiently, turning the 15,000 tiny pilots and decisions already underway into new work and new insights without spending more money. This could look like a few well-designed initiatives to gather and share relevant information about implementations and efficacy. Critically, we’ll need to find a sustainability model for that type of rigorous evaluation to ensure it becomes a key feature in how adoption decisions are made.

    We need to recognize that evidence exists on a continuum

    Different types of evidence can support different purposes. What is important is that each decision is supported by an appropriate level of evidence. This guide by Mathematica provides a useful reference for educators on different evidence types and how they should be viewed. For educators, it would be wise to look at the scale and cost of the decision and determine the appropriate type of evidence.

    Tools like the Ed Tech Rapid Cycle Evaluation Coach, LearnPlatform, and Edustar can provide useful support in making decisions and evaluating the use of technology.

    It’s important to remember that researchers and philanthropists may use education research for different purposes than would a college, university system, or district. Academic researchers may be looking to identify causal connections, learning gains, or retention rates, while a district is often focused on a specific context and implementation (what works for schools similar to mine).

    When possible, traditional randomized controlled trials provide useful information, but they’re often not affordable, feasible, or even necessarily appropriate. For example, many districts, schools, or colleges are not accustomed to or well versed in undertaking this type of research themselves.

    It’s easy to blame other actors for the current lack of evidence-driven decisions in education

    Everyone we spoke to agrees that decisions about ed tech should be made on the basis of merit and fit, not marketing or spin. But nearly everyone thinks that this problem is caused by other actors in the ecosystem, and this means that progress here will require hard work and coordination.

    For example, investors often don’t screen their investments for efficacy, nor do they push their portfolio companies to undertake sufficient research. Not surprisingly, this tends to be because such research is costly and doesn’t necessarily drive market growth. It’s also because market demand is not driven by evidence: selection choices for tools or technologies are rarely driven primarily by learning impact or efficacy research. That may be shifting slowly, but much more needs to be done.

    Entrepreneurs and organizations whose products are of the highest quality are frustrated that schools are too often swayed by their competitors’ flashy sales tactics. Researchers feel that their work is underappreciated and underutilized. Educators feel overwhelmed by volume and claims, and are frustrated by a lack of independent information and professional support. We have multiple moving pieces that must be brought together in order to improve our system.

    Ensuring that ed tech investments truly help close achievement gaps and expand student opportunity will require engagement and commitments from a disparate group of stakeholders to help invent a new normal so that our collective progress is directional and meaningful. To make progress on this, we must bring the conversation of efficacy and the use of evidence to center stage.

    That’s what we’re hoping to help continue with this symposium. We’ve learned much, but we know that the journey is just beginning. We can’t do it alone. Feel free to follow and join the conversation on Twitter with #ShowTheEvidence.


    Authors:

    • Aubrey Francisco, Chief Research Officer, Digital Promise
    • Bart Epstein, Founding CEO, Jefferson Education Accelerator
    • Gunnar Counselman, Chief Executive Officer, Fidelis Education
    • Katrina Stevens, former Deputy Director, Office of Educational Technology, U.S. Department of Education
    • Luyen Chou, Chief Product Officer, Pearson
    • Mahnaz Charania, Director, Strategic Planning and Evaluation, Fulton County Schools, Georgia
    • Mark Grovic, Co-Founder and General Partner, New Markets Venture Partners
    • Rahim Rajan, Senior Program Officer, Bill & Melinda Gates Foundation
    • Robert Pianta, Dean, University of Virginia Curry School of Education
    • Rebecca Griffiths, Senior Researcher, Center for Technology in Learning, SRI International

    This series is produced in partnership with Pearson. The 74 originally published this article on May 1, 2017, and it was re-posted here with permission.

  • 3 simple research-based ways to ace a test

    by John Sadauskas, PhD, Learning Capabilities Design Manager, Pearson


    On top of the traditional challenges of balancing their classwork, part-/full-time jobs, extracurricular activities, and social lives, today’s higher education students also face the challenge of the ever-present information firehose that is the Internet. Every day, they receive a constant stream of emails, push notifications, instant messages, social media comments, and other digital content — all of which they can carry in their pockets, and more importantly, can interrupt whatever they’re doing at a moment’s notice.

    As a result, one major challenge for today’s students is to manage the ever-growing amount of information, communication, and priorities competing for their time and attention — especially when they need to study.

    We’ve been hearing from many students that when they do make time to sit down and study, they find it difficult to manage that time efficiently — particularly making decisions on what to study, when to study, how often to study it, and how long to keep studying to feel confident preparing for multiple upcoming exams.

    Fortunately, researchers have been investigating this problem for decades and have identified multiple methods for getting the most out of study sessions. Accordingly, here are some research-based best practices that students (or anyone else, for that matter) can use to boost their memorization skills.

    Memorization takes practice

    Every time you recall a piece of information (your mother’s birthday, a favorite meal at a restaurant, a key term’s definition for an exam) you retrieve it from the vast trove of knowledge that is your long-term memory. However, you’ve probably found that some pieces of information are easier to remember than others.

    You’re likely to recall your home address easily because you constantly need it when filling out online forms and ensuring Amazon knows where to ship your limited edition Chewbacca mask. On the other hand, it may not be as easy to recall a friend’s phone number because it’s stored in your contacts and you rarely need to actually dial the numbers.

    Unsurprisingly, researchers have found similar results to these — the more often people “practice” retrieving a certain piece of information, the easier it is for them to remember it. More importantly, scientists have demonstrated that getting yourself on a regular studying schedule can take advantage of this using what is called “spaced practice” — studying in short sessions spaced out over long periods of time. Essentially, spaced practice involves quizzing yourself and giving yourself many opportunities to practice pulling information out of your long-term memory — and doing it often over an extended period of time.

    Want to give spaced practice a try? Here are some key guidelines to ensure you’re getting the most out of it.

    Study early and daily

    One of the most important things to remember when using spaced practice is to give yourself enough lead time before an exam. Research has shown that, in general, the earlier students start studying and the more consistently they keep studying until an exam, the higher their scores.

    For example, if you have an exam in two weeks, you could begin studying for 20 minutes every day for those two weeks. That way, you’ll have many opportunities to practice retrieving the information, increasing the likelihood that you’ll remember it the day of the exam.

    In contrast, if you start studying only a few days before the exam, you’ll have fewer opportunities to practice retrieving the material, and are less likely to remember it. So while there isn’t a magic recipe for the exact moment to start studying based on the amount of material you need to remember, it’s clear that the earlier you begin your daily studying, the better.

    Short and sweet beats long and grueling

    Another key component to spaced practice is the length of the study session. While it is common for students to embark upon marathon, multi-hour study sessions, researchers have found that when using spaced practice, long study sessions are not necessarily more effective than short study sessions. In other words, committing to studying certain material every day for 30 minutes is likely just as effective as studying that same material for an hour every day.

    Now, this doesn’t mean we should all keep our study sessions as short as humanly possible and expect amazing results. Instead, it reinforces the concept of spaced practice. For instance, let’s say your goal is to memorize 15 definitions for a quiz, and you’re committed to practicing every day until that quiz. You sit down to practice each definition twice, which takes 30 minutes. (Remember, the aim of spaced practice is to retrieve a memory, and then leave a “space” of time before you retrieve it again.)

    Because your brain has already retrieved each definition twice in that sitting, you may not benefit much more from studying the same words for an additional 30 minutes and reviewing each definition a total of four times. In short, once you’ve started studying early and daily, make sure to practice each concept, definition or item a few times per session — but more than that in a single sitting is likely overkill.

    Don’t break the chain

    I’ve emphasized the importance of practicing daily quite a bit here, and there is also a scientific reason behind that. A solid spaced practice routine means we’re continually retrieving certain information and keeping it fresh in our minds. However, if we stop practicing before something is committed to our long-term memories, we’ll eventually forget it. Scientists have charted this phenomenon in what is referred to as “The Forgetting Curve.”

    The Forgetting Curve

    Source: https://www.cambridge.org/core/journals/cns-spectrums/article/play-it-again-the-master-psychopharmacology-program-as-an-example-of-interval-learning-in-bite-sized-portions/E279E18C8133549F94CDEE74C4AF9310#

    In the same way that continual practice with short spaces between each session helps us to remember information, scientists have found that our ability to remember something decreases over time if we don’t practice or use the information — which is what the steep downward slope of the Forgetting Curve is meant to illustrate. When we learn new information and are immediately asked to recall it, we’re likely to remember it (the very left side of the graph).

    However, from that moment on, the likelihood that we’ll remember decreases quickly and drastically unless we recall or use the memory again. If we do, then we can keep resetting or “recharging” that Forgetting Curve and keep remembering the information over time with daily practice.

    Hermann Ebbinghaus and the forgetting curve

    Source: http://www.wranx.com/ebbinghaus-and-the-forgetting-curve/

    For example, if you took a foreign language in high school, it’s likely that being in class five days a week, doing homework and studying for the exams kept the language’s vocabulary words fresh in your mind. However, unless you have continual opportunities to practice speaking that language after high school, it’s likely that you won’t be able to recall words, phrases, and verb conjugations over time — unless you start practicing again.

    With this all in mind, if your goal is to remember something, the Forgetting Curve suggests that daily practice is key. Essentially, it’s “use it or lose it.”
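    One common textbook approximation of the curve is exponential decay of recall probability in the time since the last retrieval; the functional form and parameter values in this sketch are illustrative only, not a fitted model.

        # Forgetting curve approximated as R = exp(-t / s), where t is days
        # since the last retrieval and s is a stability parameter.
        import math

        def recall_probability(days_since_review: float, stability: float = 2.0) -> float:
            return math.exp(-days_since_review / stability)

        # Without review, recall collapses within a week...
        print([round(recall_probability(t), 2) for t in range(8)])
        # ...but each successful retrieval resets t to zero, and in richer
        # models also increases the stability s, flattening the next decay.
        print(round(recall_probability(3, stability=6.0), 2))  # after several reviews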

    Start early, finish quickly, practice daily

    Although memorizing material for an exam (or multiple exams) can be intimidating, research on learning has given us a few key guidelines that have consistently demonstrated results:

    1. Start early. The earlier you begin studying daily for the exam, the better.
    2. Finish quickly. Cover all of the material you need to remember in your daily session, but keep it short and sweet.
    3. Practice daily. Don’t break the daily studying chain.

    While today’s students may struggle with numerous competing priorities, incorporating these habits into their routines when they do sit down to study is sure to make their sessions much more efficient.

     

    References

    Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132(3), 354–380.

    Ebbinghaus, H. (1964). Memory: A contribution to experimental psychology (H. A. Ruger, C. E. Bussenius, & E. R. Hilgard, Trans.). New York: Dover Publications. (Original work published 1885)

    Nathan, M. J., & Sawyer, R. K. (2014). Foundations of the Learning Sciences. In R. K. Sawyer (Ed.) Cambridge Handbook of The Learning Sciences. New York: Cambridge University Press.

    Pavlik, P. I., & Anderson, J. R. (2005). Practice and forgetting effects on vocabulary memory: An activation-based model of the spacing effect. Cognitive Science, 29(4), 559-586.

    Rohrer, D., Taylor, K., Pashler, H., Wixted, J. T., & Cepeda, N. J. (2005). The effect of overlearning on long-term retention. Applied Cognitive Psychology, 19(3), 361–374.

    Stahl, S. M., Davis, R. L., Kim, D. H., Lowe, N. G., Carlson, R. E., Fountain, K., & Grady, M. M. (2010). Play it Again: The Master Psychopharmacology Program as an Example of Interval Learning in Bite-Sized Portions. CNS Spectrums, 15(8), 491–504.

     

  • Designing for Learning: Pearson's Learning Design Principles

    by David Porcaro


    As I write this blogpost, I occasionally stare out my window to the Rocky Mountains looming above me. It’s amazing to think that these 4200m/14,000 ft mountains were formed by a series of small tremors and occasional larger seismic movements. Likewise, the tectonic plates of education are shifting, and seismic ripples are apparent globally.

    Many are small movements—tremors really. But, as education places greater focus on the learning sciences and our understanding about how people learn expands, such tremors may occur more frequently. In fact, bigger shifts occur when such insights are increasingly applied to learning tools and experiences used by millions of learners worldwide.

    As a part of this movement, Pearson is pioneering the application of learning sciences to education products at scale. For decades, many education research projects focused on basic or evaluative research, leading to discoveries shown to impact learning, but failing to do so at scale. On the other hand, many educational technology products have been built on solid user experience and market research, but have failed to impact learning. In the learning experience design team, we’re implementing a principle-based design process in which we apply design-based research methods to a variety of Pearson products across disciplines, supporting the outcomes of millions of learners globally (building on such efforts as Clark & Mayer, 2002; Gee, 2007; Koedinger, Corbett & Perfetti, 2012; and Oliver, 2000).

    Using both the design thinking methods of user experience (Kelley & Kelley, 2013) and the design-based research methods of the learning sciences traditions (McKenney & Reeves, 2013), we’re building, applying, and refining a set of forty-five Learning Design Principles. By doing so, we’re working at the nexus of education research (i.e., products based on research) and product efficacy (i.e., research-based products that evidence impact on outcomes). In that messy and exciting space of innovation, we’ve established a design function and process that allows us to build products that meet user needs, are delightful and usable, and, most importantly, impact learning.

    Pearson’s Learning Design Principles

    The Learning Design Principles (LDPs) are research-based syntheses of targeted topics within the learning sciences that serve as quick, reliable reference for learning designers when working with all key stakeholders in the product development process. The LDPs provide us with a research-based point-of-view to inform how learning and teaching theory are integrated into Pearson products and features. Drawing on the work of leading researchers in education, including some of our own authors and customers, we are building a research base that we can continually reference in making design decisions.

    The LDPs are not official policy or “how students learn” type documents. Rather, they are focused on application, and are the first point of discussion on learning design, paving the way for further Pearson-led education research.

    Surrounding this work, we’ve developed design tools and guidance documents that we use to apply the learning sciences in new product design. From this larger corpus, we’ve created LDP cards that summarize key concepts and applications of each of the learning design principles into a succinct and portable form. In addition to providing quick reference on applications, impacts, and possible capabilities, each card includes a self-assessment instrument that we use to better align our products to what consistently improves learning. Internally, we use these cards to set a common language and understanding of learning sciences research for everyone involved in the design and development process, which helps us link back all design decisions to a research-based “why.” They also provide preliminary measures of learning impact in our early stage product design, and act as the glue to the logic models of more robust efficacy and impact evaluations.

    We’ve been using the principles in a number of ways that combine the speed of design thinking methodology with the rigor of design-based research. For instance, in one recent design sprint workshop with a university partner in our online program management work, we used the LDPs (the Scaffolding, Motivation Design, Feedback, Self-Regulated Learning, and 21st Century Skills cards, among others) in a card-sorting activity that allowed all participants (students, instructors, administrators, and designers) to quickly prioritize the key learning elements that would need to be part of program innovations.

    The Learning Design Principles were also at the heart of recent product updates for Pearson Writer, a digital writing support tool. After analyzing the existing product using the LDP self-assessments, we identified a number of opportunities for features that would enable better learning. In creating a plug-in for Microsoft Word, learning designers identified LDPs best aligned to guide the efforts, including Cognitive Load and Multimedia, Scaffolding, Online Information Literacy, and Writing to Learn. Through several rounds of co-design with students, the application of these principles to feature requirements has been refined, measured, and iterated on. We’re continuing to refine the product through principle-based analysis and co-design, and measuring the impact of these developments to learning.

    Sharing the Learning Design Principles

    We are pleased to announce the public release of our summary cards for an initial forty-five LDPs in an attempt to initiate a larger conversation around how to better design learning experiences. (My sincerest appreciation to Dan Shapera and his team who tirelessly shepherded these to completion!)

    We invite you to download these cards, which are being released under a Creative Commons Attribution-ShareAlike 4.0 International license that allows you to embrace and extend our work. By sharing these cards with you, we’re building with you a common vocabulary around how we’re making our products. You can use the cards to discuss with us what aspects of learning are important to you and your students, and how we can better support you in implementing these principles in your learning environments. Additionally, you can use these cards to begin your own principle-based design program, springboarding discussions around the kinds of capabilities, design implementations, and impacts you want to see in your own learning experiences.

    Please join me in an extended dialogue about these principles. Over the coming weeks, we plan to have further conversations about the principles and their application. I’d love to hear your thoughts on what’s missing, how we might further refine our understanding of learning, and how you are using them in your own user-centered design processes. Together, we can cause seismic shifts in how people learn worldwide.

    Find me on Twitter, @DavidPorcaro

    Access additional resources on Pearson’s Learning Design Principles

     

    References

    Clark, R.C. and Mayer, R.E. (2002). E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning. San Francisco: Jossey-Bass Pfeiffer.

    Gee, J. (2007). What Video Games Have to Teach Us about Learning and Literacy. New York: St. Martin’s Griffin.

    Kelley, T., & Kelley, D. (2013). Creative confidence: Unleashing the creative potential within us all. Crown Business.

    Koedinger, K., Corbett, A., & Perfetti, C. (2012). The Knowledge-Learning-Instruction Framework: Bridging the Science-Practice Chasm to Enhance Robust Student Learning. Cognitive Science, 36, 757-798.

    McKenney, S., & Reeves, T. C. (2013). Conducting educational design research. Routledge.

    Oliver, R. (2000). When teaching meets learning: Design principles and strategies for web-based learning environments that support knowledge construction. In ASCILITE (pp. 17-28).

  • University increases student access to course materials

    by


    SUCCESS STORY

    A university saves students $7 million while increasing student access to course materials

    University of California, Davis

    “New students come to campus prepared for everything,” explained Jason Lorgan, executive director of Campus Recreation, the Memorial Union, and Stores at the University of California, Davis (UC Davis). “They have a bus pass and a gym pass. All their classes and their dorm room are assigned. Yet the default is that they have no access to their course materials. Something that is core to their education is not automatic.”

    So Lorgan began investigating ways to increase student access to course materials. “As more adaptive learning digital content such as MyLab™ & Mastering™ came out, we started thinking that they could be adapted to a licensing model similar to the one our design students use for Adobe® Photoshop®, versus the textbook model where the default is that you start without access to the content.”
