Teaching and Learning blog

Explore insights, trends, and research that impact teaching, learning, and leading.


  • Improve learning by adding video
    By Pearson

    Video is everywhere. With more than a billion hours of video viewed on YouTube every day,1 it is a medium most students are familiar and comfortable with. The question is not whether to use videos in higher education, but how to use them to improve learner outcomes. There is plenty of research that touches on the role of video in learning, and there are even some studies that specifically examine the different ways of using video in university or college courses.

    After reviewing and analyzing this research, we’re confident that most higher education courses could improve learner outcomes by supplementing instruction and other learning content with relevant educational videos.

    Here are three reasons why.

    1. Students want to learn from videos

    Video is part of higher education even when it’s not officially part of the learning experience. Some higher education students prefer videos to written sources and many will seek out subject-related videos on YouTube, even when they’re not assigned.

    In a survey of hundreds of business students:

    • 71% said they used YouTube as part of their academic learning
    • 70.5% believed they could learn a lot about a subject by watching related videos instead of reading a book2

    In a 2020 study, a group of higher education students was given 30 minutes of online research time to learn enough about a topic to write a brief summary. On average, the students spent 8.5 of their 30 minutes watching videos. Only 15.7% of the students watched no videos at all.3

    Studies also seem to show that the appeal of video is not limited to particular subjects or learning preferences.4 Whatever the course, and whatever the makeup of the student body, including videos can engage students in learning.

    2. Supplemental videos improve learning

    Videos clearly appeal to students, but do they actually help them learn? When combined with other learning methods, there is evidence that they can.

    A 2021 study looked at different ways of using videos in higher education courses. The researchers found that pivoting the course to video — dropping existing teaching methods and having students watch videos instead — did improve learning somewhat.

    But the biggest improvements came when video was added to the existing course content, rather than replacing it.5

    This may be because adding video gives students more ways to understand the content. If the learning didn’t take hold from a lecture or a written text, maybe it would from a video. When video replaced other methods, however, a student who didn’t grasp the content from the video had no alternative way in.

    3. Videos can directly affect learning

    Does including videos improve learning by making the course more engaging, or do the videos themselves help improve learning? Understanding this helps determine the best types of video to include in higher education courses.

    A 2014 study experimented with integrating different types of videos into lectures. When the videos were mainly entertaining, students’ motivation and engagement improved. Higher motivation and engagement are associated with better learning outcomes.

    But when the videos were mainly educational and directly relevant to the lecture topic, students performed better on post-lecture quizzes than those who attended a lecture without videos.6

    This shows that while videos can affect learning by engaging students, they can also have a direct effect on students’ knowledge.

    Improving learning for students at all experience levels

    To summarize, based on a range of studies:

    • higher education courses should include videos
    • videos should supplement, not replace, existing course content and instruction
    • videos should be educational in nature and directly relevant to the subject

    When videos are integrated into higher education courses in this way, students — whatever their previous academic history — are more likely to outperform their predicted grades.7

  • How unlimited information actually limits learning
    By Pearson

    Once, students looking to supplement their knowledge of a topic had to rely on the limited selection of books in their college library. Today, college students have nearly unlimited information at their fingertips. But does more information always equal better learning?

    A number of recent research studies suggest that, in fact, providing students with a more limited set of high-quality resources chosen specifically for the course can lead to better outcomes than leaving students to supplement their knowledge using the internet.

    It’s true that there is a large amount of high quality information available online, on nearly every topic imaginable. It’s also true that searching, assessing, filtering, and making use of online resources are valuable 21st-century skills. So it’s understandable when higher education courses call for students to look online for sources to cite, or to supplement their knowledge of the course subject.

    But that’s just the thing: finding information online and judging its reliability are skills in themselves. This complicates learning, because:

    • not all students in the course will have those skills to the same degree
    • they’re not usually the skills the course is teaching (or assessing)

    Reliable, or just familiar?

    As you may expect from a group of people who have largely grown up with the internet, higher education students know that not everything they find online is reliable. They do think about the origins of the information they find and judge whether those sources are credible.

    However, students don’t always know how to make these kinds of judgments accurately.

    In one 2020 study, higher education students were provided with several items from different sources and prompted to write about the items’ perspectives. More than 2 in 5 of the students (41%) assumed that certain items were credible because they recognized the source.1 They thought they were judging the reliability of the information, but were really rating the familiarity of the sources.

    Another study, also published in 2020, asked economics students to use a search engine to investigate the truth of several claims. Again, these students ended up relying heavily on sites they were familiar with, rather than truly valid or reliable sources. Perhaps unsurprisingly, one of their most cited sources was Wikipedia.2

    Of course, not all students make the same mistakes. For example, a 2017 study found that students who score higher for reading comprehension are also more likely to find relevant, valid results when using search engines.3 Students with previous experience of searching for academic sources may also be more accurate judges of the information they find.

    But this presents another challenge to learning. It means that in courses that ask students to supplement their subject knowledge by searching the internet, those with lower reading comprehension and less academic experience are at an unfair disadvantage.

    Best use of effort

    Even with sophisticated search engines, sifting the vast quantities of information on the internet for relevant sources takes time and effort. So does assessing the reliability of each source.

    These activities also add to students’ cognitive load: the amount of brainpower needed to complete a task.

    Students’ time, effort, and cognitive load are all finite resources. What they expend on finding and assessing sources, they aren’t using to actually increase their knowledge.

    All of this means that providing students with a hand-picked suite of high quality resources, chosen specifically for the course, is better for learning than leaving them to find their own online.

    Providing learning resources as part of the course levels the playing field. Students with different levels of reading comprehension and academic experience will all have equally valid, reliable materials to learn from.

    And because students tend to trust material provided as part of the course, they won’t use up time, effort, or cognitive load gauging whether the material is reliable.

    All in one

    This isn’t a call to send students back to the college library. Even if the world wide web isn’t the best environment for learning, there are still clear benefits to digital learning.

    In fact, digital platforms allow us to free up even more of students’ cognitive load for learning: by providing suites of reliable resources under the same roof as learning and assessment.


    Sources

    1 Banerjee, Zlatkin-Troitschanskaia, & Roeper, 2020

    2 Nagel et al., 2020

    3 Hahnel et al., 2017

  • Designed to Deliver Value: The University of North Dakota Introduces Certificates to its Cyber Security Program
    By Pearson

    Brought to you by Pearson’s Online Program Management team

    How do you deliver value to learners and employers alike? In the hot field of cyber security, the University of North Dakota has cracked the code with the design of its recently launched online program.

    The University of North Dakota is a public research university in Grand Forks, N.D. It offers more than 120 online degree and certificate programs, encouraging students from around the world to explore more than 225 fields of study every year. UND is dedicated to its mission to provide transformative learning, discovery and community engagement opportunities for developing tomorrow's leaders.

    Designing transformative online learning experiences

    In consultation with Pearson Online Learning Services, Vice Provost for Online Education and Strategic Planning Jeff Holm chose to align the cyber security curriculum with highly sought-after and industry-recognized certifications. Advancing skills in cyber security can mean better job security, higher pay and more leadership opportunities for learners — program features that align with the university’s mission.

    To create a program that appealed to a broad audience while meeting UND’s high pedagogical standards, UND and Pearson established a collaborative working relationship. The teams partnered on course development, tailoring courses to 14 weeks each. Both partners agreed that this gave learners the right amount of time with the material and addressed their needs for convenient, short courses that deliver work-ready skills.

    The university also relied on the partnership for market research and insights, as well as marketing and enrollment support, to widen its reach. The strategy was to give more learners valuable career preparation by including certificates in the degree program. With the addition of cyber certificates to the online program, learners can gain industry-recognized credentials as they move toward earning a full degree — making them more valuable to employers sooner.

    “UND offers a variety of options so learners can tailor their M.S. in Cyber Security to fit specific interests and career goals,” Holm says. “The cyber security master’s program offers four tracks (or) stackable academic certificate options.” One certificate is mandatory. Learners can select two of three other certificate options and graduate with a master’s and three academic certificates. The tracks and certificates include:

    • Cyber Security Analyst track aligned with the EC-Council Certified Threat Intelligence Analyst (CTIA) certification
    • Ethical Hacking track aligned with the EC-Council Certified Ethical Hacker (CEH) certification
    • Computer Forensics track aligned with the EC-Council Computer Hacking Forensics Investigator (CHFI) certification
    • Secure Networks track aligned with the Certified Information Systems Security Professional (CISSP) certification
  • Taking a Proactive and Positive Approach with Students about Academic Dishonesty
    By Jessica Bernards and Wendy Fresh

    As educators, one of the biggest issues we have recently had to tackle in our classrooms is the increase in academic misconduct. At our college, there was a 703% increase in academic misconduct reports from Winter 2020 to Winter 2021. Additionally, a tremendous number of ed tech companies flourished during the pandemic. We feel like every time we look in the app store, a new “math solver” app appears. As educators, we can’t even keep up!

    In a presentation with Pearson Senior Learning Designer Dr. Elaine W. Tan, we discussed specific strategies for being proactive with students about academic integrity. One of those strategies was to introduce academic integrity at the beginning of the term. This proactive approach from day one has really made a difference in our classes. In this post, we will go into more specifics.

    Define academic misconduct in your syllabus

    It’s important to define different forms of cheating and why they’re problematic. It’s equally important to state the value of academic integrity for learning. Many students might not see a given behavior as cheating until you tell them. In fact, in a College Pulse study1, students were asked how acceptable or unacceptable it is to Google homework questions to find the answers and to use study websites to find answers to test or homework questions. Over 50% of respondents said it was acceptable to Google homework questions, and 44% said it was acceptable to use study websites to find those answers.

    A syllabus statement about academic integrity, including a link to your institution’s student code of conduct, is an important first step to making sure your students are all on the same page. See the wording that we include in our syllabus.

    Discuss academic integrity early

    Dr. Tan’s research2 found that most students don’t find cheating a problem, with only 15% saying they are very or extremely concerned about contract cheating. This may be because instructors aren’t talking about it: only 1 in 5 students had instructors who discussed why cheating is problematic. Those are alarming statistics, and a good reason why it’s so important to begin the conversation early.

    One way to begin that conversation is by setting aside time in the first two weeks of class to show them a video covering academic integrity. Presented in an engaging way, a video like this gets the students’ attention and is more effective than lecturing them. You can also find a math-specific academic integrity video in the MyLab® Math shell for our textbooks Precalculus: A Right Triangle Approach, 5th Edition & Precalculus: A Unit Circle Approach, 4th Edition.

    Build connections with students

    More findings from Dr. Tan’s research show that one of the reasons students turn to academic dishonesty is because they feel a lack of personal connection, or a sense that instructors don’t know or care about them. This can be especially true with online learning and the isolation brought on by disruptions to learning over the last few years. We can address this proactively by creating a connection within the first days of class.

    Something we started doing this past year is having a required 10-minute one-on-one meeting with each student within the first two weeks of the term. Within that meeting, we communicate to them that we are invested in their success and how the course material can help them achieve their real-life goals. We also talk about academic integrity with them. Get the template email we send out to our classes.

    Set clear, specific instructions

    Have clear and specific rules and instructions for assignments and exams so students know what is ok to use and what is not. This even comes down to stating “you cannot use the solve feature on the calculator to get the answer.”

    One of the things we do is use an exam policy checklist that students have to complete before they’re able to take their test. This checklist states which resources are allowed and which are not, links to the student code of conduct, and clearly lays out the consequences for an academic misconduct violation. View our exam policy checklist.

    By bringing in these strategies at the beginning of the term, we have found that the number of academic misconduct issues in our courses has decreased dramatically. Although academic dishonesty may never fully go away, it is important to talk about it and to give students the education they need to improve their actions.

    Dive deeper

    Watch the full presentation, Proactive and Positive Ways to Engage Students about Academic Integrity.

    Get sample documents for communicating with your students about academic dishonesty.


    Sources

    1. Academic Integrity. (2021). College Pulse.

    2. Bakken, S., Tan, E. W. & Wood, A. (2021). A Research Review on Student Cheating. Pearson Learning & Research Design.

  • Use Online Learning to Drive Change, Create Opportunity & Thrive Amidst Disorder
    By Sasha Thackaberry

    Sasha Thackaberry, Ph.D. recently joined the executive team at Pearson Online Learning Services (POLS) as Vice President of Student and Partner Services. Previously, she led Louisiana State University’s online program organization, where in just four years, her team grew from supporting 800 students in 9 programs to over 12,000 in 120+ programs, while keeping a strong focus on quality. Her online learning experience has been honed throughout a career at LSU, SNHU, and other innovators. See how her experiences shape her current work at Pearson to help learners and institutions thrive.

    Sasha, tell us something we should know about you.

    I get really geeked out about what’s next, and how to drive change – both in education, and in my own teams. I’m interested in building teams that get addicted to evolving, and to making the next big thing happen.

    Even today, change is underrated. Disruption is going to occur continually, and I’m passionate about how we move things forward towards a more effective fusion of education and technology.

    “High-tech, high-touch” isn’t a new concept, but in higher education, historically, we haven’t done it all that well. Now, though, there’s a lot of insight we can draw on to do better. For instance, we can use more of what’s been learned by behavioral economists. The techniques so often used to sell us stuff can also be used to remove barriers to learning and encourage people to continually engage in it.

    You’ve said institutions can go beyond resilience to become truly “anti-fragile”: able to thrive amidst disorder and chaos. How?

    It starts with creating and building a foundation that enables you to be proactive and flexible, no matter what. Then, there’s a reactive piece: when you see something coming down the pike, always getting ready, seeing what works and what doesn’t, pivoting quickly. You can build in “space” in your systems and processes, and keep things as simple as possible.

    Two issues are key. First, institutions must invest heavily in their technology infrastructures. Valuable data is everywhere, but you can’t react if you don’t know what’s going on.

    Second, there’s culture: committing to pivot on a dime and be super creative. One of the best ways is to be very upfront about failures because they teach us how to change. Obviously, there are exceptions, but in higher ed environments, failure is too often viewed as a lack of competence. Instead, we need to embrace smart risk, and then be ready to pivot fast if it doesn’t work. You need leaders who can approach “Black Swan” events as opportunities to do really great things, as some institutions did during the pandemic.

    COVID changed things forever, but what are we learning about the new higher ed environment that’s emerging?

    We now have a marketplace of many different sizes, types, and forms of learning – and our audience looks radically different. A generation ago, few expected the post-traditional audience to become the only part of higher ed that was growing. Twenty years from now, people will look back and ask each other, “Do you remember when they based everything on the degree?”

    We see young people who aren’t all headed straight to college. They’re doing other things first. I don’t think that’s a bad thing. I just think we must accommodate their needs as learners.

    Then, there’s “education as a benefit” from employers. Our infrastructures need to accommodate that, and many other flexible options – not just paying by credit card, but also subscriptions. More of what we do needs to be time-variable. People are voting with their enrollments, and they’re saying: I want shorter, faster, more applicable.

    You were a pioneer in stackables. What advice would you offer to those who worry about learner outcomes and building viable programs that don’t just cannibalize current programs?

    To begin, you can’t overly focus on cannibalization of revenue. If an early automaker thought, “If I build cars, I’m gonna cannibalize my base of horse customers,” they missed the point. It’s about what people want. It’s not about what we want to create for them. If you don’t disrupt your own business, someone else will.

    But it’s not just about defense. You can start a virtuous cycle of creating stackables by yourself, partnering with content providers to build them, and ingesting them from other places.

    Colleges and universities have amazing resources for learning in their faculty and their content knowledge. Many times, those same faculty and that same content can be used to create short-form credentials that open the door to a wider set of learners. It’s not only about the degree or a single point-in-time credential. All of us will need to continually learn and collect new credentials throughout our careers. Stackables empower institutions to set up lifelong partnerships with their students – from a traditional experience through a fully online experience, from a degree to a single hour-long, just-in-time learning session.

    Some folks worry about whether microcredentials will really have the value they promise. But institutions can develop a lot more information about what is being learned. And as we get better at intervening with post-traditional learners, we can get better at moving them to the appropriate classes or paths.

    You do, however, need to remain focused on your institution’s actual mission, to avoid mission creep. Not every institution needs to be everything for every learner. Each institution has its own unique strength, lens, and approach to learning. In the online space, it’s no different.

    You led LSU’s initiatives in non-degree and degree online learning. How did you bring faculty aboard?

    There are always champions: people who’ve discovered ways to get innovative things done. Find them. Then support them with all the expertise and political capital you can. If you make early adopters successful, others will come on board. I’ve never been in an environment where you didn’t have innovative faculty. It’s always a question of critical mass and political will.

    LSU was proud of building its own internal online learning organization without an external OPM. Now you’re at the company that pioneered the OPM model. Can you reflect on the decision to partner or go it alone?

    I had a very unusual situation at LSU. I had Board support, strategic focus from the President, and the best boss I’ve ever had – a Provost who promised to block and tackle for me, and came through every time, whether it involved changing policies or getting mainframes reprogrammed. She was willing to be unpopular – and that included fighting to protect our budgets.

    When you work with an OPM partner, there’s a contract in place, and dollars for things like marketing and recruitment are protected through that contract. Many institutions really don't know the true cost of learner acquisition, marketing, and recruitment. They may not know what it means to do digital campaigns, or the differences between a website and landing pages, and the implications for marketing spend. That requires specific talent, and it can be hard to get.

    At LSU, I was empowered to build a team from the ground up, where we had to be super-creative, use super-modern techniques, and be super-efficient. And it worked. But when an average institution has a strategic communications budget of, say, $200,000, and you propose dropping $6,000,000 on marketing for an online program that has 5,000 students this year, that budget line tends not to get preserved. You might start out with the commitment, but it gradually turns out that you can’t afford to market the program to reach the scale needed to sustain it.

    Even just the technology behind online programs can be challenging. You need a CRM, autodialers, texting, chat boxes, web development. Universities are not historically excellent at all that. If you can’t build that, you must get it externally.

    Not everything is either-or, and when we build service packages for new partners at Pearson, they’re differentiated and customized to each institution’s needs. But I can 100% say that if you don’t have certain ingredients to scale, it’s better to go with a partner.

    You’ve been a thought leader at institutions like LSU and SNHU, but also in organizations like Quality Matters. Based on all you’ve seen, can you share any final reflections?

    I’ve had the incredibly good fortune of meeting many great people who’ve been eager to have candid conversations about online learning. It seems strange to say this, though: this is still a relatively small and new field. The opportunities are wide open. We really are still at the very beginning of online education.

  • Outcomes-based assessment. The key to teaching critical thinking.
    By Dr. Shelley Gaskin

    What is critical thinking?

    Teaching students to think critically and solve problems is a widely pursued goal in higher education. Definitions of critical thinking vary but basically come down to having students examine an ill-defined or messy problem and carefully apply the knowledge and skills they have acquired during the instructional process to analyze the problem and suggest a solution.

    To learn to think critically and solve problems, students must take an active role in constructing and defending their knowledge. The best way for students to demonstrate critical thinking is to attempt one or more outcomes-based assessments.

    Outcomes-based learning and assessments

    What will my students do “out there”—in the real world of family, work, and community—as a result of what we do “in here”—in my classroom? 1

    This question presents a useful way to think of student learning outcomes and enables us to envision a broad, overall view of course content and goals. Sometimes this is termed “backward design” or “designing down,” referring to a process of defining intended outcomes first, and only then defining the content and designing the instruction.

    Defining the desired outcomes first is relatively easy for information technology courses. It is not difficult to envision what IT students will do “out there,” because of the discipline’s relatedness to employability skills. We have always been comfortable with helping students bridge classroom and real-life experiences.

    Once intended outcomes for a course or unit of instruction are determined, a student engages in observation, practice, and other learning activities to demonstrate mastery of the knowledge and skills associated with the course content. The student is then prepared to attempt an outcomes-based assessment, which actively engages them in learning to think critically and solve problems by making judgments and taking appropriate actions that are observed and evaluated, simulating what occurs in real life.

    Skill development

    Discussions of critical thinking, and the evidence of it gathered from outcomes-based assessments, typically center on the difference between novice and expert work. Authors such as Willingham and Riener2 and McTighe and Ferrara3 look at students’ progress in terms of moving from novice to expert. Think of student performance as a continuum of competency; for example, the varying belt colors one can earn while studying martial arts.

    Students develop critical-thinking skills by observing excellent work and engaging in learning activities to obtain the knowledge and skills necessary to achieve the outcome. Skills that required careful thought at the beginning of their learning eventually become routine.

    For example, you can relate the learning process to how individuals learn to play a video game. They watch a friend, the expert, play the game and then gradually, with practice, learn the game mechanics. Additional practice moves them from novice toward mastery, until they can strategize the gameplay and play like an expert.

    Assessment of critical thinking

    An outcomes-based assessment is also referred to as an authentic assessment in the sense that the assessment is realistic. An authentic assessment engages students to apply the knowledge and skills they have acquired during the instruction in ways that reflect problems they would encounter in the real world. Such an assessment has no right or wrong answer but rather reflects the thinking of an expert in the field.

    The outcomes-based assessment provides the opportunity for students to apply their knowledge and skills to ill-defined problems like those in real life. Doing so requires integrating the discipline-based knowledge and skills they have acquired with completing various learning activities. Students learn to think critically by attempting an outcomes-based assessment that is representative of a current problem an expert in the discipline would encounter.

    Both the student and the instructor apply an analytic rubric to the result, discuss the results, and based on the instructor’s feedback, the student may attempt the outcomes-based assessment again until the work is of professional quality as determined by the rubric. When students attempt outcomes-based assessments, they are likely to be more effective as professionals.

    How to develop an assessment for critical thinking

    As instructors, we know that developing and grading an assessment that has no right or wrong answers can be time-consuming. Fortunately, there is abundant research on how to do this, along with many examples, and many Pearson textbooks include outcomes-based assessments. For example, the GO! Series for Microsoft Office 365 uses an outcomes-based framework, and each unit of instruction includes numerous critical-thinking assessments and accompanying rubrics. Each instructional project includes a critical-thinking quiz so the student can immediately review the project and identify the purpose and benefit of creating the information.

    To develop an assessment for critical thinking, one useful device is the GRASPS model developed by Grant Wiggins and Jay McTighe and detailed in Designing Authentic Performance Tasks and Projects.4 The acronym GRASPS stands for:

    G—a realistic goal

    R—the role of the student

    A—the audience

    S—the real-world situation

    P—the product or performance the student will demonstrate

    S—the criteria for judging success

    For example, in a class where you are teaching Microsoft Excel, you could use the GRASPS model to develop an assessment for critical thinking as follows:

    You are an assistant in the Supply Chain and Logistics department of an online vitamin company (role). Your manager asks you to create an inventory status report (real-world situation) to present to the Chief Financial Officer (audience) of the company so the company can estimate warehouse costs for new products (goal). Based on inventory data, you develop an Excel workbook (product) that presents the inventory information in a way that makes it easy for the Chief Financial Officer to visualize warehouse needs (criteria for success).

    How to grade an assessment

    As noted above, students learn to think critically by attempting outcomes-based assessments that represent current problems an expert in the discipline would encounter. Multiple exposures to such assessments give students repeated opportunities to apply their knowledge and skills to ill-defined problems like those in real life, which requires integrating the discipline-based knowledge and skills they acquired during the instructional process.

    An analytic rubric distinguishes novice work from expert work. On completion of an outcomes-based assessment, both the student and the instructor apply an analytic rubric to the result and discuss the results. Based on the instructor’s feedback, the student attempts the outcomes-based assessment again until the work is of professional quality as determined by the rubric.

    An analytic rubric divides a product or performance into distinct traits or dimensions. As the instructor, you can judge and score each trait separately. The rubric is known ahead of time by both the student and the instructor. The analytic rubric gathers evidence of the student’s performance against a set of pre-determined standards. By applying the rubric, both you and the student can place the performance on a qualitative continuum.

    For teaching productivity software, here is an example of an analytic rubric that can be applied to any critical-thinking assessment such as the GRASPS example above:

  • Pearson’s learning tech wins awards
    By Pearson

    We’re proud to announce our learning solutions have won the following awards.

    MyLab® Math and MyLab Statistics won the CODiE Award for “Best Higher Education Mathematics Instructional Solution,” which recognizes the best instructional solution that offers mathematics curriculum and content for students in higher education math subjects.

    Revel for Political Science/History/Sociology/Psychology also won the CODiE Award for “Best Social Sciences/Studies Instructional Solution,” which recognizes the best instructional solution for social sciences/social studies curricula and content for students in the higher education or PK-12 markets.

    In addition, we were a finalist for the following award.

    NCCERConnect was a finalist for the CODiE Award for “Best College & Career Readiness Solution,” which recognizes the best digital product or service that develops 21st Century workforce skills and knowledge for students.

    The CODiE Awards were established so that pioneers of the budding software industry could evaluate and honor each other’s work. Today, the Awards continue to honor excellence in leading technology products and services. At Pearson, we've been creating innovative learning experiences since the Awards began in 1986, and our latest award-winning instructional solutions are evidence that we’re never satisfied with the status quo. Keep reading to learn more about what makes them unique.

    What is MyLab?

    MyLab Math and MyLab Statistics use data-driven guidance to improve results for students, with engaging, interactive content from expert authors that helps them absorb and understand difficult concepts, from developmental math to differential equations.

    MyLab gives instructors a comprehensive gradebook with enhanced reporting functionality that makes it easier for instructors to understand which students are struggling, and which topics they struggle with most.

  • End of term: Tweaking your course for next term
    By Dr. Terri Moore

    Many of you may be experiencing those end-of-term emotions ranging from relief to exhaustion. On top of all the final grades and last-minute faculty duties, it’s time to think about the next term’s classes, whether that’s a short summer session or getting a jump on Fall class designs.

    Course review

    If you’re a Revel® user, I suggest you examine your aggregate class data from the easy-to-access dashboard view before automatically copying the current course into the next term’s course shells. The dashboard view gives you a wealth of actionable data.

    The Revel dashboard is unlike any analytics tool I have used before in providing numbers that reflect what was working and what needed improvement. The data helped inform my decisions about the efficacy of the current course and suggested changes I could make to the design to increase students’ engagement and content proficiency next term. Let’s walk through what I found most helpful.

    Educator Dashboard insights

    The Revel Educator Dashboard provides a great deal of information in the following areas:

    • aggregated class data for a view of overall performance
    • score details to see class performance on each type of assessment assigned
    • struggling and low performing student gauges for quick identification and communication
    • assignments with due dates as well as additional details, including challenging items
    • settings tab showing assessment types and ways to improve the course design

    Assessment data

    When reviewing the assessment data, I ask myself if there are any settings or scoring policies that I might change to increase both engagement and comprehension.

    The view score details section provides aggregate scores for students on each type of assessment assigned, allowing me to note assessment types that received low scores. This can indicate a lack of understanding or a lack of participation. By drilling into the details of some assessment types within the assignment view of the grades section, I might see that the issue is a lack of participation rather than low scores. That could indicate I should assign greater weight to these types of assessments if I feel they are sound activities for helping students become proficient with the content.

    Increasing the weight of certain assessments might incentivize students to complete them. Alternatively, allowing fewer attempts on the Revel module or chapter quizzes may make students less likely to take the quizzes without first fully understanding the concepts in the assigned reading.

    You might choose to exclude certain types of assessments next term if you feel their value doesn’t justify the energy and time students spend completing them. In that way, you might increase compliance on the assessments you feel are more robust in helping students acquire the knowledge needed to become proficient in your course’s required outcomes.

    I acted weekly on the struggling and low-activity student gauges by sending a brief email to those students, and it made a dramatic difference in my classes, both face-to-face and online. For three years I conducted my own efficacy study by examining the effect of using this intervention strategy with my low-performing students. I opened the dashboard view early Monday mornings after the Sunday due dates and dropped each student an email stating that I had noticed they were having some issues completing their work in Revel the previous week. I would tell them to contact me if I could be of assistance with anything.

    This simple, very quick intervention was especially telling during COVID-19, when students would email me back and share things like having little connectivity at home with four siblings using the same Wi-Fi, or having lost their homes and being in the process of moving. These were issues I had no ability to resolve, yet they tugged at my heart. However, I could put skin on the computer by letting my students know I cared and connected with their struggles. Even if a student was simply slacking, they knew I was an active presence in the online classroom. We know from research on distance learning that human connections between students and teachers, and between peers, are often the variable that increases persistence to completion.

    Over our three years of COVID-19, I have seen retention increase by slightly more than 25% in my online classes and by 13% in my face-to-face classes. Apparently, being engaged with the content outside of class was just as important as in-class presence.

    Deeper course analysis

    The next question I pose for myself relates to what I can change or renew for even greater success next term.

    When you scroll beneath the dashboard to the assignments and see challenging items, it means there are quiz questions that many of your students did not answer correctly on their first attempt. This could indicate that the concept is difficult to grasp by simply reading the material.

    When you dig deeper, you can see the exact question and concept where students struggled. This information has prompted me to add some of my own content to the Revel material to increase students’ understanding. For instance, in psychology, operant and classical conditioning are concepts that often confuse intro psych students. I have added material in my LMS, in class, or in Revel, using the highlighting and note-sharing features, to increase students’ understanding of those difficult concepts.

    I also like to look at overall trends in the term by scanning the dates, the scores, and the participation. This can inform me about seasonal changes in students’ performance such as midterm slump, spring break fever, or those times in any of our terms where students’ performance historically declines.

    Student engagement tactics

    Interventions to increase student engagement might include reducing the number of assessments or using more active forms of assessment, such as asking students to present or to work collaboratively, which engages them more fully.

    If you go to the resources tab and open your book, you can select the section where you found challenging items. Highlight that section, then add a note or even a URL to create an active link in your students’ notes. You could add a TED talk or, as I did with my psychology students, a YouTube link to The Big Bang Theory, where the characters use operant and classical conditioning to train their significant others. When you share notes like this, the information appears in your students’ notebooks, and they can use your notes as study guides.

    Revel offers the right amount of actionable data for me to understand my students’ progress, their engagement, and where they encounter challenging concepts. The platform helps me improve my delivery and increase student success, and it helps students become proficient in the learning outcomes.

  • 8 Strategies for Effective Online Teaching: Lessons from the Past 2 Years
    By Dr. Terri Moore

    My biggest challenge these past couple of years has been to realistically manage and readjust my expectations as a learning designer, an instructor, and a human being. Our initial rush to digital, planned as a temporary solution for the teaching and learning world, has since extended well into 2022, two years later.

    In the spirit of pause, reflect, and adjust accordingly, we decided to look back at this blog post (9 Strategies for effective online learning, March 2020) and reevaluate the tips, taking into consideration what we have learned these past two years.