Year Levels 5–10
Curriculum ACT, NSW, NT, QLD, SA, TAS, VIC, WA
Resource Formats Digital
What it is
Pearson Diagnostic is a heavily researched diagnostic tool giving you easy-to-use quizzes so you can go beyond measuring learner achievement and quickly diagnose and address student misconceptions and gaps in knowledge. Targeted Activities are provided, based on each diagnosis, to give learners a pathway forward.
Pearson Diagnostic was built from three decades of research that led to the development of the ‘SMART::tests’ – diagnostic assessment quizzes whose name comes from “Specific Mathematics Assessments that Reveal Thinking”. These quizzes were developed by an academic research team from the University of Melbourne led by Professor Kaye Stacey.
How does it work?
Who said diagnosing misconceptions had to be difficult? We’ve put a ton of research into giving Diagnostic a powerful yet easy-to-use system. Read on to learn more, or watch our demo video below.
What it includes
Pearson Diagnostic is a comprehensive ecosystem of easy-to-use quizzes on key mathematical concepts introduced in years 5 to 9.
Digital quiz library
65 pairs of quizzes are available that can be assigned to individual students or whole classes through digital codes or links. These cover:
- Number and Algebra
- Measurement and Geometry
- Statistics and Probability
Educators receive instant results based on the pattern of responses from the quiz analytics, giving them a diagnosis of common misconceptions or gaps in understanding at both the class and individual levels.
Following a diagnosis, educators receive simple and engaging targeted intervention activities to address the gap in knowledge or understanding.
Track individual, class and school progress easily through an intuitive, secure analytics/progress dashboard that can be shared amongst stakeholders.
An EPAA Award winning resource
At the 2021 Educational Publishing Awards Australia, Pearson Diagnostic was named Winner in the category Secondary – Teaching Resource – Mathematics.
The judges described Pearson Diagnostic as "an innovative formative assessment tool with high-quality learning activities and quizzes that make it a valuable resource for teachers of all levels."
Tablets and phones can be used; however, for the best experience we recommend using a laptop or desktop computer.
You may need to add Pearson Diagnostic as a whitelisted URL. Check the system requirements and whitelisting details in the knowledge base for more information: pearsonplaces.com.au/knowledgebase
Paid access enables use of all premium functions of Diagnostic. It allows you to diagnose, address and check students’ understanding with access to all 65 pairs of pre- and post-quizzes and their supporting Targeted Activities, serving students from years 5–10. Free access enables use of the starter functions of Diagnostic. It gives you the ability to diagnose and address using 7 of the 65 premium pre-quizzes to assess your students, along with the associated Targeted Activities to address their specific learning needs.
Pearson Diagnostic is an affordable resource designed for schools and educational institutions to use. To see how much it will cost for your school or institution, complete the expression of interest form. A Pearson Education Consultant will then reach out to discuss your needs and provide a quote.
Pearson Diagnostic is designed for use within a school or educational institution. Please contact Customer Care if you have any further queries. Email email@example.com or call on 1300 473 277.
SMART::tests was run and maintained by a company working for the University of Melbourne (which researched and developed the quizzes). The rights to the intellectual property were sold to Pearson in 2019. Pearson has since added over 500 additional ready-to-use upskilling activities to support teachers and students.
All premium account holders can share pre- and post-quizzes. Once you have shared a pre-quiz, navigate to the results page and click ‘stage 3’ to share a post-quiz with all students who have completed the pre-quiz.
The concepts assessed by the Pearson Diagnostic quizzes are curriculum-agnostic, and the results are not reported in terms of any specific curriculum. The levels of understanding and learning are not linked across topics, nor are they reported against any state curriculum levels. For example, a typical good student might reasonably be at level 1 in one quiz, level 2 in another and level 3 in a third, depending on when the topic is taught and to what depth. Each level is described in words and gives advice on how to move students from one level to the next. The Targeted Activities also support this.
The research used to develop the quizzes determined the minimum number of questions needed to validate a student's understanding so that students don't need to spend unnecessary time responding to questions.
The quizzes are designed to establish 'readiness' for an upcoming unit of work or topic. They assess the foundational understanding that may be needed before commencing a topic. Addressing these diagnoses leads to improved understanding and upskilling of students.
Some of the quizzes can be used as a Topic and Year level assessment, but this is not their purpose.
There are 7 quizzes available in the free subscription and 65 quizzes available in the paid subscription. Speak to your local Pearson Education Consultant about a Premium subscription, and to see the mapping we have created from the quizzes to Australian mathematics textbook chapters, indicating which quizzes are best used when.
There are currently no quizzes specifically designed for subject areas other than mathematics. However, many of the current Diagnostic mathematics quizzes can be used to assess numeracy understanding applicable across subjects such as Commerce, Economics, General Science, Biology, Chemistry, Physics, Technology and Business Management.
If you are looking for specific diagnostic assessment for other key learning areas or subjects, please contact customer care to let us know what you are interested in.
If a student who usually performs well has attempted most of the questions and has received a low level, the diagnosis is worth reviewing. Even very capable students sometimes have gaps in their understanding, and good procedural skills can disguise a lack of fundamental understanding. Ask the student to explain their thinking on some of the easier questions to check the report. Working through the Targeted Activities can be useful for this.
Although you can see student responses to determine the number of questions answered correctly and incorrectly, the diagnosis reports on patterns in the answers given, not just whether they are right or wrong.
No, the quizzes are not adaptive. Instead, in most quizzes Pearson Diagnostic looks for patterns of student responses that correspond to different levels of understanding.
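To make pattern-based diagnosis concrete, here is a minimal, hypothetical sketch in Python. The item names, answer options, rules and level labels are invented for illustration and are not the actual Pearson Diagnostic algorithm; the toy example mimics the well-documented "longer-is-larger" decimal misconception, in which students judge the decimal with more digits to be the larger number.

```python
# Illustrative sketch only: a toy pattern-based diagnosis, not the real
# Pearson Diagnostic algorithm. All items, answers and rules are invented.

def diagnose(responses):
    """Classify a pattern of answers, not just a score.

    `responses` maps item ids to the answer a student chose. Two students
    with the same number of correct answers can receive different
    diagnoses if their wrong answers follow different patterns.
    """
    # Hypothetical "which decimal is larger?" items.
    correct = {"q1": "0.5", "q2": "0.3", "q3": "0.2"}
    # A 'longer-is-larger' learner consistently picks the longer decimal.
    longer_is_larger = {"q1": "0.35", "q2": "0.125", "q3": "0.1234"}

    n_correct = sum(responses.get(q) == a for q, a in correct.items())
    n_pattern = sum(responses.get(q) == a for q, a in longer_is_larger.items())

    if n_correct == len(correct):
        return "level 3: sound decimal comparison"
    if n_pattern >= 2:  # consistent wrong answers reveal a misconception
        return "longer-is-larger misconception"
    return "level 1: no consistent pattern detected"

# Two students who both score 0/3 receive different diagnoses,
# because only the first student's errors form a consistent pattern.
print(diagnose({"q1": "0.35", "q2": "0.125", "q3": "0.1234"}))
print(diagnose({"q1": "0.35", "q2": "0.2", "q3": "0.5"}))
```

The point of the sketch is the final two calls: a pure accuracy score would treat both students identically, while a pattern-based diagnosis distinguishes a systematic misconception from scattered errors.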
The research behind Pearson Diagnostic
The set of quizzes was developed as a means to provide diagnostic information to mathematics teachers on their students’ thinking about and understanding of mathematical concepts. In addition to the quizzes, research-backed teaching advice, including topic-specific targeted pedagogy, was provided to the teachers. The intention of SMART::tests was to provide information to support teacher planning and to improve teachers’ mathematical pedagogical content knowledge. The difference between a grade or a percentage of correct responses offered by many assessment tasks and the diagnostic information provided by SMART::tests was that teachers were able to use the generated teaching advice to prepare appropriately to meet specific individual students’ needs. Having pairs of tests enables teachers to conduct post-assessment to monitor students’ learning.
The design of the SMART system was based on the principles that the quizzes needed to be short, diagnostic, and easy to deliver online, providing information about individuals and the class overall. By taking advantage of the available digital tools, responses were marked automatically, making diagnoses accessible to teachers as soon as students completed a test. The diagnoses were generated from response patterns rather than accuracy alone. An advantage of the digital technology was that it could provide a diagnosis without consuming the teacher time that manually analysing responses would require.
“Our plan is to hand the formative diagnosis over to a machine, so that human teachers can concentrate on using the information to improve the learning of each of their individual students” (Stacey et al., 2009, p. 10).
The initial tests were designed to assess conceptual understanding for specific concepts that are important for learning progress in mathematics. However, by the middle of 2009, two additional test types had been added in response to requests from teacher users of SMART::tests. The requests were for tests that assess prerequisite knowledge and students’ knowledge of basic number facts.
It is important to note that the developmental stages (now called levels in Pearson Diagnostic) included in the diagnoses are not linked to curriculum content descriptors or related achievement standards, but to the specific concept being assessed, and hence are not limited to a particular educational jurisdiction. The intention was that SMART tests could be used by teachers anywhere, not just in Victoria, Australia, where they were developed.
In 2019 Pearson acquired rights to the SMART::tests system which includes the 65 pairs of quizzes, the algorithms that enable the diagnoses to be automatically generated and the corresponding teaching support covering topics on Number, Measurement, Geometry, Algebra, and Statistics and Probability.
The SMART::tests academic team, along with other researchers, used SMART::tests in studies from 2009. A key finding from these studies is that teachers who used SMART::tests reported that the diagnostic information was valuable to their professional learning, indicating a positive impact on teachers’ pedagogical content knowledge by making them more aware of how students learn and the difficulties students may encounter when learning specific concepts. Details of the research can be accessed in the ‘Related research’ section.
Stacey K., Steinle V., Price B., & Gvozdenko E. (2018). Specific Mathematics Assessments that Reveal Thinking: An Online Tool to Build Teachers’ Diagnostic Competence and Support Teaching. In T. Leuders, K. Philipp, & J. Leuders (Eds.), Diagnostic Competence of Mathematics Teachers. Mathematics Teacher Education, Vol. 11, (pp. 241–261). Springer, Cham. https://minerva-access.unimelb.edu.au/handle/11343/247768
Akhtar, Z., & Steinle, V. (2017). The Prevalence of the 'letter as object' misconception in junior secondary students. In A. Downton, S. Livy & J. Hall (Eds.), 40 years on: We are still learning! (Proceedings of the 40th Annual Conference of the Mathematics Education Research Group of Australasia, pp.77–84). Melbourne: MERGA. https://merga.net.au/Public/Publications/Annual_Conference_Proceedings/2017_MERGA_annual_conference_proceedings.aspx
Barzel, B., & Holzapfel, L. (2017). Strukturen als Basis der Algebra. Mathematik Lehren, 202, 2–8.
McKee, S. J. (2017). Using teacher capacity to measure improvement in key elements of teachers’ mathematical pedagogical content knowledge. [Doctoral thesis, University of Melbourne] https://minerva-access.unimelb.edu.au/handle/11343/123566
Stacey, K., Steinle, V., Price, B., & Gvozdenko, E. (2017). Fit in Algebra? Mach den smart-test. MatheWelt 202, pp. 1–16. Seelze, Germany: Friedrich Verlag GmbH. (Translators Judith Blomberg & Maike Abshagen).
Guzman, M. A. (2014). The smart test system: teachers’ views about this formative assessment for mathematics. [Master’s thesis, University of Melbourne]. https://minerva-access.unimelb.edu.au/handle/11343/44090
Price, B., Stacey, K., Steinle, V., & Gvozdenko, E. (2014). Using percentages to describe and calculate change. In J. Anderson, M. Cavanagh, & A. Prescott (Eds.), Curriculum in focus: Research guided practice (Proceedings of the 37th annual conference of the Mathematics Education Research Group of Australasia pp. 517–524). Sydney: MERGA. https://www.merga.net.au/Public/Public/Publications/Annual_Conference_Proceedings/2014_MERGA_CP.aspx
Quenette, J. (2014). Diagnostic testing and changes to teaching practice in Year 9 mathematics classes. [Master’s thesis, University of Melbourne]. https://minerva-access.unimelb.edu.au/handle/11343/43027
Akhtar, Z., & Steinle, V. (2013). Probing students' numerical misconceptions in school algebra. In V. Steinle, L. Ball & C. Bardini (Eds.), Mathematics education: Yesterday, today and tomorrow (Proceedings of the 36th annual conference of the Mathematics Education Research Group of Australasia, pp. 36–43). Melbourne: MERGA. https://www.merga.net.au/Public/Public/Publications/Annual_Conference_Proceedings/2013_MERGA_CP.aspx
Stacey, K. (2013). Bringing research on students' understanding into the classroom through formative assessment. In V. Steinle, L. Ball & C. Bardini (Eds.), Mathematics education: Yesterday, today and tomorrow (Proceedings of the 36th annual conference of the Mathematics Education Research Group of Australasia, pp. 13–20). Melbourne: MERGA. https://www.merga.net.au/Public/Public/Publications/Annual_Conference_Proceedings/2013_MERGA_CP.aspx
Stacey, K., Steinle, V., Gvozdenko, E., & Price, B. (2013). SMART online formative assessments for teaching mathematics, Curriculum & Leadership Journal, vol 11, issue 20. http://www.curriculum.edu.au/leader/smart_online_formative_assessments_for_teaching_ma,36822.html?issueID=12826
Stacey, K., Price, B., & Steinle, V. (2012). Identifying stages in a learning hierarchy for use in formative assessment – the example of line graphs. In J. Dindyal, L. P. Cheng & S. F. Ng (Eds.), Mathematics education: Expanding horizons (Proceedings of the 35th annual conference of Mathematics Education Group of Australasia, pp. 393–400). Adelaide: MERGA. https://merga.net.au/Public/Publications/Annual_Conference_Proceedings/2012_MERGA_CP.aspx
Steinle, V. & Stacey, K. (2012). Teachers' Views of using an on-line, formative assessment system for Mathematics. Pre-proceedings. 12th International Congress on Mathematical Education Topic Study Group 33, July 2012, (pp. 6721–6730). COEX, Seoul, Korea. https://minerva-access.unimelb.edu.au/handle/11343/247767
Baratta, W., Price, E., Stacey, K., Steinle, V., & Gvozdenko, E. (2010). Percentages: The effect of problem structure, number complexity and calculation format. In L. Sparrow, B. Kissane & C. Hurst (Eds.), Shaping the future of mathematics education (Proceedings of 33rd annual conference of the Mathematics Education Research Group of Australasia, pp. 61–68). Fremantle: MERGA. https://merga.net.au/Public/Publications/Annual_Conference_Proceedings/2010_MERGA_CP.aspx
Stacey, K., Steinle, V., Wu, M., Pierce, R., & Giri, J. (2010, 29 Nov-2 Dec). Evaluating automated processes for revealing students' mathematical thinking. Unpublished symposium presentation. Annual Conference of Australian Association of Research in Education, University of Melbourne, Melbourne. https://minerva-access.unimelb.edu.au/handle/11343/247766
Price, B., Stacey, K., Steinle, V., Chick, H., & Gvozdenko, E. (2009). Getting SMART about Assessment for Learning. In D. Martin, T. Fitzpatrick, R. Hunting, D. Itter, C. Lenard, T. Mills & L. Milne (Eds.), Proceedings of 2009 Annual Conference of the Mathematical Association of Victoria. (pp. 174–181). MAV: Melbourne. https://minerva-access.unimelb.edu.au/handle/11343/247761
Rule, V. (2017). A case study of the influence of diagnostic information on a teacher’s planning for a Year 8 algebra lesson. [Master’s thesis, University of Melbourne]. https://minerva-access.unimelb.edu.au/handle/11343/213997
Stacey, K., Price, B., Steinle, V., Chick, H., & Gvozdenko, E. (2009, Sept 28–Oct 1). SMART Assessment for Learning. [Conference presentation]. International Society for Design and Development in Education, Cairns, Australia. https://minerva-access.unimelb.edu.au/handle/11343/247762
Steinle, V., Gvozdenko, E., Price, B., Stacey, K., & Pierce, R. (2009). Investigating students’ numerical misconceptions in algebra. In R. Hunter, B. Bicknell & T. Burgess (Eds.), Crossing divides (Proceedings of the 32nd annual conference of the Mathematics Education Research Group of Australasia, pp. 491–498). Wellington: MERGA. https://merga.net.au/Public/Publications/Annual_Conference_Proceedings/2009_MERGA_CP.aspx