Teaching and Learning blog

Explore insights, trends, and research that impact teaching, learning, and leading.


• Pearson’s Maestro of Marketing Brings a Human Touch to a Customer-Centric Strategy

    By Michael Collins

    Brought to you by Pearson’s Online Program Management team.

    Since Michael Collins joined Pearson Online Learning Services as senior vice president of marketing and learner acquisition, he’s been working to harmonize and humanize everything we do to engage and enroll learners in our partners’ online programs.

    Collins brings a background in journalism, marketing, public relations, corporate communications, and — not least — music. In this interview, he shares insights that reflect where he’s been, what he’s seen, and where we can make the greatest impact for partners by building lifelong relationships that keep learners coming back.

    You studied music in college. What did you learn from that experience?

Sometimes you can be the lead in a musical or in a play, right? But many times, you’ll be part of the ensemble. In marketing, I’ve learned it’s much the same. Sometimes you’re the lead; sometimes you’re part of the ensemble, and you have to switch between supporting roles. I may be leading marketing and learner acquisition, but I’m also part of a leadership team working to achieve shared outcomes. Even where I’m the lead within my own team, sometimes another member of the team has the stage.

    Beyond that, when we work with our partners, we’re also part of their team. So, knowing how to make all these teams work together well at the same time is one of the most important things I can do.

    You come to Pearson from the CFA Institute, the leading global provider of investment management education. But you’ve also played key marketing roles in other industries. What lessons do you see as especially relevant for your work here — especially your work with institutions?

There’s a note that runs through my career in terms of working in marketing, whether it’s been in retail, manufacturing, distribution, technology, or tech-enabled service companies. And that’s about creating affinity that makes customers want to keep buying from you.

    I ran global marketing at Iomega, which made external storage drives: Maybe you remember the Zip drive. We went from $140 million to $2 billion in revenue in under 36 months. We sold through retail channels like Best Buy, as well as through distributors who sold to retail. And I learned the power of channels and partnering.

    It’s one thing to sell your product or service, but how will you help partners be successful, so they want to keep partnering with you? That’s our challenge, too. We’ve built a business model where, when Pearson’s partners are successful, we’re successful. And our partners in turn succeed when their learners succeed.

    And it’s never one-and-done. In our student success and retention work, and in everything else we do, we need to be relentlessly focused on making both learners and partners more successful continually.

  • Guarding the Online Learning Galaxy

    By Jaime Mordue

Brought to you by Pearson’s Online Program Management team.

When a college professor tells me they never imagined their in-person course could be so engaging in an online format, or a student interacts with learning more online than they would have in a traditional classroom, I know that my incredibly talented team and I are fulfilling our mission.

We are Pearson's Learning Design Solutions (LDS) team, and our job is to reimagine traditional higher education courses for the online environment. Last year alone, we supported over 1,400 courses across 35+ programs for over a dozen of our university partners. We developed courses in disciplines such as Law, Social Work, Public Health, Education, Nursing, and Business, among others. It's a responsibility we take very seriously: not only to deliver amazing online learning, but to help safeguard the integrity and validity of the entire online education “galaxy.” It's no secret that online learning has had its naysayers, so when we prove them wrong by delivering, time and again, for students and academic partners, we fulfill our mission.

With decades of expertise in online education and course development operations, LDS brings science and insight to ensure our college and university partners' online courses are designed and developed to meet the highest expectations of quality and efficacy. Our tenets are straightforward:

1. Pedagogy: LDS brings data and science into designing courses to ensure they meet the appropriate rigor, engagement levels, and measurable outcomes. All instructional designers in LDS engage in regular professional development in the industry and hold various levels of certification in Quality Matters (QM). The team is currently supporting several partners in aligning courses to QM standards, including Regis College’s Nursing and Health Sciences programs.
2. Equity: By designing through a lens of historically informed compassion and empathy, LDS consults with partners to design courses with equity top of mind. LDS seeks ongoing team training opportunities in a commitment to raise diversity, equity, and inclusion (DEI) standards for online learning. LDS recently supported a university partner interested in auditing courses to identify ways to improve inclusivity in course content. All course components produced by LDS meet current WCAG 2.1 AA standards and Pearson’s Global Content and Editorial Policy.
3. Research: By participating in, and helping to conduct, ongoing research in online learning, LDS helps partners refine practices, innovate learning solutions, and keep up with generation after generation of digital learners. LDS is currently engaged in collaborative research with multiple partners focused on developing learning analytics dashboards to advance data-driven learning design insight and practice.

It's especially meaningful when faculty recognize that designing together with Pearson’s Learning Design Solutions team positively influences their course beyond project boundaries and into their regular teaching practices. A recent Brookings article, Online college classes can be better than in-person ones, reaffirms that online learning is gaining recognition and continuing to thrive beyond the pandemic. This is a goal for us—to use our education (super)powers for the good of all learners, no matter the model or method.

Learn more, and explore Pearson's online learning offerings and OPM services.

    Originally published by the Pearson Insights blog.

• Taking a Proactive and Positive Approach with Students about Academic Dishonesty

    By Jessica Bernards and Wendy Fresh

As educators, one of the biggest issues we have recently had to tackle in our classrooms is the increase in academic misconduct. At our college, there was a 703% increase in academic misconduct reports from Winter 2020 to Winter 2021. At the same time, there has been a tremendous rise in the number of ed tech companies, many of which flourished during the pandemic. We feel like every time we look in the app store, a new “math solver” app appears. As educators, we can’t even keep up!

In a presentation with Pearson Senior Learning Designer Dr. Elaine W. Tan, we discussed specific strategies for being proactive with students about academic integrity. One of those strategies was to introduce academic integrity at the beginning of the term. This proactive approach from day one has really made a difference in our classes. In this post, we will go into more specifics.

    Define academic misconduct in your syllabus

It’s important to define different forms of cheating and explain why they’re problematic. It’s equally important to state the value of academic integrity for learning. Many students might not see a given behavior as cheating until you tell them. In fact, in a College Pulse study1, students were asked how acceptable or unacceptable it is to Google homework questions to find the answers, and to use study websites to find answers to test or homework questions. Over 50% of respondents said it was acceptable to Google homework questions, and 44% said it was acceptable to use study websites to find answers.

    A syllabus statement about academic integrity, including a link to your institution’s student code of conduct, is an important first step to making sure your students are all on the same page. See the wording that we include in our syllabus.

    Discuss academic integrity early

Dr. Tan’s research2 found that most students don’t consider cheating a problem, with only 15% saying they are very or extremely concerned about contract cheating. This may be because instructors aren’t talking about it: only 1 in 5 students had instructors who discussed why cheating is problematic. Those are alarming statistics, and a good reason why it’s so important to begin the conversation early.

One way to begin that conversation is to set aside time in the first two weeks of class to show students a video covering academic integrity. Presented in an engaging way, a video like this gets students’ attention and is more effective than lecturing them. You can also find a math-specific academic integrity video in the MyLab® Math shell for our textbooks Precalculus: A Right Triangle Approach, 5th Edition & Precalculus: A Unit Circle Approach, 4th Edition.

    Build connections with students

    More findings from Dr. Tan’s research show that one of the reasons students turn to academic dishonesty is because they feel a lack of personal connection, or a sense that instructors don’t know or care about them. This can be especially true with online learning and the isolation brought on by disruptions to learning over the last few years. We can address this proactively by creating a connection within the first days of class.

Something we started doing this past year is having a required 10-minute one-on-one meeting with each student within the first two weeks of the term. In that meeting, we communicate that we are invested in their success and explain how the course material can help them achieve their real-life goals. We also talk about academic integrity. Get the template email we send out to our classes.

    Set clear, specific instructions

Have clear, specific rules and instructions for assignments and exams so students know what is and is not okay to use. This even comes down to stating, “you cannot use the solve feature on the calculator to get the answer.”

    One of the things we do is use an exam policy checklist that students have to complete before they’re able to take their test. This checklist states which resources are allowed and which are not, links to the student code of conduct, and clearly lays out the consequences for an academic misconduct violation. View our exam policy checklist.

By bringing in these strategies at the beginning of the term, we have found that the number of academic misconduct issues in our courses has decreased dramatically. Although academic dishonesty may never fully go away, it is important to talk about it and to give students the education they need to make better choices.

    Dive deeper

    Watch the full presentation, Proactive and Positive Ways to Engage Students about Academic Integrity.

Get sample documents for communicating with your students about academic dishonesty.


    Sources

    1. Academic Integrity. (2021). College Pulse.

    2. Bakken, S., Tan, E. W. & Wood, A. (2021). A Research Review on Student Cheating. Pearson Learning & Research Design.

  • Use Online Learning to Drive Change, Create Opportunity & Thrive Amidst Disorder

    By Sasha Thackaberry

    Sasha Thackaberry, Ph.D. recently joined the executive team at Pearson Online Learning Services (POLS) as Vice President of Student and Partner Services. Previously, she led Louisiana State University’s online program organization, where in just four years, her team grew from supporting 800 students in 9 programs to over 12,000 in 120+ programs, while keeping a strong focus on quality. Her online learning experience has been honed throughout a career at LSU, SNHU, and other innovators. See how her experiences shape her current work at Pearson to help learners and institutions thrive.

    Sasha, tell us something we should know about you.

    I get really geeked out about what’s next, and how to drive change – both in education, and in my own teams. I’m interested in building teams that get addicted to evolving, and to making the next big thing happen.

    Even today, change is underrated. Disruption is going to occur continually, and I’m passionate about how we move things forward towards a more effective fusion of education and technology.

    “High-tech, high-touch” isn’t a new concept, but in higher education, historically, we haven’t done it all that well. Now, though, there’s a lot of insight we can draw on to do better. For instance, we can use more of what’s been learned by behavioral economists. The techniques so often used to sell us stuff can also be used to remove barriers to learning and encourage people to continually engage in it.

    You’ve said institutions can go beyond resilience to become truly “anti-fragile”: able to thrive amidst disorder and chaos. How?

It starts with creating and building a foundation that enables you to be proactive and flexible, no matter what. Then there’s a reactive piece: when you see something coming down the pike, you’re always getting ready, seeing what works and what doesn’t, and pivoting quickly. You can build “space” into your systems and processes, and keep things as simple as possible.

    Two issues are key. First, institutions must invest heavily in their technology infrastructures. Valuable data is everywhere, but you can’t react if you don’t know what’s going on.

Second, there’s culture: committing to pivot on a dime and be super creative. One of the best ways is to be very upfront about failures because they teach us how to change. Obviously, there are exceptions, but in higher ed environments, failure is too often viewed as a lack of competence. Instead, we need to embrace smart risk, and then be ready to pivot fast if it doesn’t work. You need leaders who can approach “Black Swan” events as opportunities to do really great things, as some institutions did during the pandemic.

    COVID changed things forever, but what are we learning about the new higher ed environment that’s emerging?

    We now have a marketplace of many different sizes, types, and forms of learning – and our audience looks radically different. A generation ago, few expected the post-traditional audience to become the only part of higher ed that was growing. Twenty years from now, people will look back and ask each other, “Do you remember when they based everything on the degree?”

    We see young people who aren’t all headed straight to college. They’re doing other things first. I don’t think that’s a bad thing. I just think we must accommodate their needs as learners.

    Then, there’s “education as a benefit” from employers. Our infrastructures need to accommodate that, and many other flexible options – not just paying by credit card, but also subscriptions. More of what we do needs to be time-variable. People are voting with their enrollments, and they’re saying: I want shorter, faster, more applicable.

    You were a pioneer in stackables. What advice would you offer to those who worry about learner outcomes and building viable programs that don’t just cannibalize current programs?

    To begin, you can’t overly focus on cannibalization of revenue. If an early automaker thought, “If I build cars, I’m gonna cannibalize my base of horse customers,” they missed the point. It’s about what people want. It’s not about what we want to create for them. If you don’t disrupt your own business, someone else will.

    But it’s not just about defense. You can start a virtuous cycle of creating stackables by yourself, partnering with content providers to build them, and ingesting them from other places.

    Colleges and universities have amazing resources for learning in their faculty and their content knowledge. Many times, those same faculty and that same content can be used to create short-form credentials that open the door to a wider set of learners. It’s not only about the degree or a single point-in-time credential. All of us will need to continually learn and collect new credentials throughout our careers. Stackables empower institutions to set up lifelong partnerships with their students – from a traditional experience through a fully online experience, from a degree to a single hour-long, just-in-time learning session.

    Some folks worry about whether microcredentials will really have the value they promise. But institutions can develop a lot more information about what is being learned. And as we get better at intervening with post-traditional learners, we can get better at moving them to the appropriate classes or paths.

    You do, however, need to remain focused on your institution’s actual mission, to avoid mission creep. Not every institution needs to be everything for every learner. Each institution has its own unique strength, lens, and approach to learning. In the online space, it’s no different.

    You led LSU’s initiatives in non-degree and degree online learning. How did you bring faculty aboard?

    There are always champions: people who’ve discovered ways to get innovative things done. Find them. Then support them with all the expertise and political capital you can. If you make early adopters successful, others will come on board. I’ve never been in an environment where you didn’t have innovative faculty. It’s always a question of critical mass and political will.

    LSU was proud of building its own internal online learning organization without an external OPM. Now you’re at the company that pioneered the OPM model. Can you reflect on the decision to partner or go it alone?

    I had a very unusual situation at LSU. I had Board support, strategic focus from the President, and the best boss I’ve ever had – a Provost who promised to block and tackle for me, and came through every time, whether it involved changing policies or getting mainframes reprogrammed. She was willing to be unpopular – and that included fighting to protect our budgets.

    When you work with an OPM partner, there’s a contract in place, and dollars for things like marketing and recruitment are protected through that contract. Many institutions really don't know the true cost of learner acquisition, marketing, and recruitment. They may not know what it means to do digital campaigns, or the differences between a website and landing pages, and the implications for marketing spend. That requires specific talent, and it can be hard to get.

    At LSU, I was empowered to build a team from the ground up, where we had to be super-creative, use super-modern techniques, and be super-efficient. And it worked. But when an average institution has a strategic communications budget of, say, $200,000, and you propose dropping $6,000,000 on marketing for an online program that has 5,000 students this year, that budget line tends not to get preserved. You might start out with the commitment, but it gradually turns out that you can’t afford to market the program to reach the scale needed to sustain it.

    Even just the technology behind online programs can be challenging. You need a CRM, autodialers, texting, chat boxes, web development. Universities are not historically excellent at all that. If you can’t build that, you must get it externally.

    Not everything is either-or, and when we build service packages for new partners at Pearson, they’re differentiated and customized to each institution’s needs. But I can 100% say that if you don’t have certain ingredients to scale, it’s better to go with a partner.

    You’ve been a thought leader at institutions like LSU and SNHU, but also in organizations like Quality Matters. Based on all you’ve seen, can you share any final reflections?

    I’ve had the incredibly good fortune of meeting many great people who’ve been eager to have candid conversations about online learning. It seems strange to say this, though: this is still a relatively small and new field. The opportunities are wide open. We really are still at the very beginning of online education.

• Consumer Behavior and its Impact on the Non-Degree Online Higher Education Market

    By Joe Morgan, Vice President, University Partnership Development, Pearson

    #2 in a series

In my first post, The student as consumer, and the burden of choice, I suggested that when learners face a high-stakes purchase (the full degree) and information overload, they often narrow their decision to either known institutions or those that rank on page 1 of search results. The endless aisle sounds great until you have to walk down it. The learners’ simplification strategy, and their conscious or unconscious bias, excludes lesser-known institutions from the consideration set, even if they may be the “best fit.”

In this post, I examine consumer behavior and its impact on the online higher education non-degree market. Those who have worked in direct-to-consumer businesses (either digital or brick-and-mortar) will recognize the language of consumer behavior: trial, offer, purchase, add-on purchase. For learners as consumers, this language is profoundly relevant, and it is vital to the institution’s strategy for non-degree online learning programs.

    How consumer-learners reduce perceived risk

    In their study, Behavioral Changes in the Trial of New Products, Shoemaker and Shoaf found that consumers respond to the perceived risk of trying a new product by reducing the consequences: they buy a smaller quantity (trial). The growth of the non-degree market (certificates) is that very behavior in action.

For the new traditional learner (older, more diverse, navigating career and family obligations, concerned with increasing debt, and having spent years away from any formal academic setting), entering a full degree program raises the stakes. Their anxiety is palpable. The resulting behavior is predictable.

    If a learner is uncertain about moving forward, or unsure they can succeed, they update their beliefs through a consumption experience (the trial). What better consumption experience than to begin with a “smaller-quantity,” affordable, low-stakes online learning program that provides an immediate, career-enhancing credential and a powerful signaling opportunity to the learner’s social and professional networks?

    Non-degree online program development and a wider view of student acquisition cost

I hear many question the economic value of certificates to the institution, relative to the cost of acquiring each student. In isolation, and barring substantial scale, one would be hard-pressed to show a meaningful economic return on a modestly priced certificate. But that misses the bigger point. Viewed as a student acquisition strategy, certificates allow the university to generate exposure, awareness, and trial by delivering short-form, employment-relevant content.

With an appropriately constructed “offer” (freemium, credit bearing, pathway to degree admissions, university credential, digital badges) for these certificates, the institution creates affinity. Upon completion, and with a student’s newfound confidence, some of those learners will enter a degree program at that same institution (the “purchase”). When reskilling and upskilling become necessary, the student returns to what is now familiar (the add-on purchase). Coursera calls this “the flywheel effect.”

• Outcomes-Based Assessment: The Key to Teaching Critical Thinking

    By Dr. Shelley Gaskin

    What is critical thinking?

    Teaching students to think critically and solve problems is a widely pursued goal in higher education. Definitions of critical thinking vary but basically come down to having students examine an ill-defined or messy problem and carefully apply the knowledge and skills they have acquired during the instructional process to analyze the problem and suggest a solution.

    To learn to think critically and solve problems, students must take an active role in constructing and defending their knowledge. The best way for students to demonstrate critical thinking is to attempt one or more outcomes-based assessments.

    Outcomes-based learning and assessments

What will my students do “out there”—in the real world of family, work, and community—as a result of what we do “in here”—in my classroom?1

    This question presents a useful way to think of student learning outcomes and enables us to envision a broad, overall view of course content and goals. Sometimes this is termed “backward design” or “designing down,” referring to a process of defining intended outcomes first, and only then defining the content and designing the instruction.

Defining the desired outcomes first is relatively easy for information technology courses. It is not difficult to envision what IT students will do “out there,” because the discipline is so closely tied to employability skills. We have always been comfortable helping students bridge classroom and real-life experiences.

Once the intended outcomes for a course or unit of instruction are determined, a student engages in observation, practice, and other learning activities to build mastery of the knowledge and skills associated with the course content. The student is then prepared to attempt an outcomes-based assessment, which actively engages them in learning to think critically and solve problems by making judgments and taking appropriate actions that are observed and evaluated, simulating what occurs in real life.

    Skill development

Discussions of critical thinking, and the evidence gathered from outcomes-based assessments, typically center on differentiating novice work from expert work. Authors such as Willingham and Riener2 and McTighe and Ferrara3 look at student progress in terms of moving from novice to expert. Think of student performance as a continuum of competency, like the varying belt colors one can earn while studying martial arts.

Students develop critical-thinking skills by observing excellent work and engaging in learning activities to obtain the knowledge and skills necessary to achieve the outcome. Eventually, skills that required careful thought at the beginning of their learning become routine.

For example, you can relate the learning process to how individuals learn to play a video game. They watch a friend, the expert, play the game and then gradually, with practice, learn the game mechanics. Additional practice moves them from novice toward mastery, until eventually they can strategize the gameplay themselves and become the expert.

    Assessment of critical thinking

    An outcomes-based assessment is also referred to as an authentic assessment in the sense that the assessment is realistic. An authentic assessment engages students to apply the knowledge and skills they have acquired during the instruction in ways that reflect problems they would encounter in the real world. Such an assessment has no right or wrong answer but rather reflects the thinking of an expert in the field.

The outcomes-based assessment provides the opportunity for students to apply their knowledge and skills to ill-defined problems like those in real life. Doing so requires them to integrate the discipline-based knowledge and skills they acquired while completing various learning activities. Students learn to think critically by attempting an outcomes-based assessment that is representative of a current problem an expert in the discipline would encounter.

Both the student and the instructor apply an analytic rubric to the result and discuss it; based on the instructor’s feedback, the student may attempt the outcomes-based assessment again until the work is of professional quality as determined by the rubric. When students attempt outcomes-based assessments, they are likely to become more effective professionals.

    How to develop an assessment for critical thinking

As instructors, we know that developing and grading an assessment that has no right or wrong answers can be time-consuming. Fortunately, abundant research and many examples show how to do this, and many Pearson textbooks include outcomes-based assessments. For example, the GO! Series for Microsoft Office 365 uses an outcomes-based framework, and each unit of instruction includes numerous critical-thinking assessments and accompanying rubrics. Each instructional project includes a critical-thinking quiz so the student can immediately review the project and identify the purpose and benefit of creating the information.

    To develop an assessment for critical thinking, one useful device is the GRASPS model developed by Grant Wiggins and Jay McTighe and detailed in Designing Authentic Performance Tasks and Projects.4 The acronym GRASPS stands for:

    G—a realistic goal

    R—the role of the student

    A—the audience

    S—the real-world situation

    P—the product or performance the student will demonstrate

    S—the criteria for judging success

    For example, in a class where you are teaching Microsoft Excel, you could use the GRASPS model to develop an assessment for critical thinking as follows:

    You are an assistant in the Supply Chain and Logistics department of an online vitamin company (role). Your manager asks you to create an inventory status report (real-world situation) to present to the Chief Financial Officer (audience) of the company so the company can estimate warehouse costs for new products (goal). Based on inventory data, you develop an Excel workbook (product) that presents the inventory information in a way that makes it easy for the Chief Financial Officer to visualize warehouse needs (criteria for success).

    How to grade an assessment

Students learn to think critically by attempting outcomes-based assessments that are representative of problems an expert in the discipline would encounter. Multiple exposures to such assessments give students repeated opportunities to apply their knowledge and skills to ill-defined, real-life problems and to integrate the discipline-based knowledge and skills they acquired during the instructional process.

An analytic rubric distinguishes novice work from expert work. On completion of an outcomes-based assessment, both the student and the instructor apply the rubric to the result and discuss it. Based on the instructor’s feedback, the student may attempt the assessment again until the work is of professional quality as determined by the rubric.

    An analytic rubric divides a product or performance into distinct traits or dimensions. As the instructor, you can judge and score each trait separately. The rubric is known ahead of time by both the student and the instructor. The analytic rubric gathers evidence of the student’s performance against a set of pre-determined standards. By applying the rubric, both you and the student can place the performance on a qualitative continuum.

    For teaching productivity software, here is an example of an analytic rubric that can be applied to any critical-thinking assessment such as the GRASPS example above: