Published on February 4, 2022

Skill assessments: The future of personalization in learning

Dr. Julian Rasch, Data Scientist

Category: Research [R&D]

Reading time: 10 minutes
3D Illustration of skill assessments with Microsoft Excel

The future of personalization in professional online learning lies in detailed, but fast and effective skill assessments. Here’s how we do it at edyoucated.

Building the future of effective skill assessments (Part 1)

Welcome to the future of personalized online learning and assessments!

In this series of blog posts, we’ll gently introduce the concept and challenges of personalized learning and give you a backstage glimpse at edyoucated’s approach to the topic. We’ll start easy and explain why we need personalization in online learning and how skill assessments help us achieve it. In the upcoming posts, we’ll focus on the data science behind knowledge assessments and even outline how we can create mathematical models that are able to predict your personal skills and knowledge.

Interested in the future of learning content personalization?

Let’s go!

Why we need personalization in online learning

The personalization and customization of learning is a hot topic and surely deserves a whole blog post series of its own, so let’s keep it short here. It essentially boils down to something rather obvious:

Everybody is different!

We all bring different personalities, cultures, previous knowledge and experience, wishes, goals, expectations and much more to the table, and this influences the way we live, work and, most importantly for us, how we learn. Unfortunately, this heterogeneity is not well reflected in most online learning opportunities, which instead try to provide learning content for the broadest possible range of learners.

The unfortunate thing about it?

Most of these trainings and courses follow a one-size-fits-all strategy and do not take into account the different qualifications, needs and learning goals of their participants.

Individual needs as a mission for edyoucated

Our mission at edyoucated is to integrate the individual needs of our learners into their learning processes along a multitude of dimensions (some of which we are researching in our KAMAELEON project). Here are a few examples:

  • Learning interests,
  • motivation and self-regulation, or
  • learning goals and strategies.

In this post, we’ll focus on personalization with respect to previous knowledge.

Why?

Our main goal is to start the learning process at exactly the right place for each individual learner. Everybody enters the learning process with a different skill set, and we want to recommend the individually best next skills to learn. We call the general information about which skills a learner has already mastered the learner’s knowledge state.
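To make the idea concrete, here is a minimal sketch of what a knowledge state could look like in code. The skill names and the data model are purely illustrative, not edyoucated’s actual implementation:

```python
# A minimal sketch of a learner's knowledge state: a mapping from
# skill names to whether the learner has already mastered them.
# Skill names are made up for illustration.
knowledge_state = {
    "Opening and Saving Workbooks": True,
    "Using the Fill Handle": True,
    "Writing VLOOKUP Formulas": False,
    "Building Pivot Tables": False,
}

def next_skills_to_learn(state):
    """Return the skills the learner has not yet mastered."""
    return [skill for skill, mastered in state.items() if not mastered]

print(next_skills_to_learn(knowledge_state))
# → ['Writing VLOOKUP Formulas', 'Building Pivot Tables']
```

Once such a state is known, recommending the next learning step reduces to picking from the not-yet-mastered skills.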

But how can we actually figure out the knowledge state of our learners? The key: assessments!

Table showing the skill levels of different people
Everybody enters the learning process with a different skill set, and we need to make sure this is reflected in the learning as well.

Skill assessments: Figuring out your knowledge

You may already have experienced knowledge assessments as part of a job application process, where they are used to screen you and compare you to other applicants, determining the skills you have already mastered and will bring to the new company. But the real strength of assessments usually lies elsewhere:

It lies in identifying the areas where you still lack knowledge or experience!

Simply put, assessments help identify your needs, which can then be used to steer your learning activities and personal development. And they come in very different shapes.

The ones you might be most familiar with from school are formative and summative assessments. The former are taken during your learning process to decide on upcoming learning activities. The latter are evaluations of your competency at the end of a learning unit to check your mastery of the topic.

People not knowing about their level of skill in Excel
The personalization of content requires us to figure out your current knowledge about the topic in the first place.

Self-assessments: Do you know your own skills?

While the assessment types discussed above come in the form of tests or exams, self-assessments require you to reflect on your knowledge and skills on your own. As a quick check:

How would you rate your self-reflection abilities on a scale from 1 to 10? 🙂

Hard to be precise? This is often the case for self-assessments, even if the questions are more focused. It is difficult and highly subjective to give ratings on a scale, isn’t it?

Yet, self-assessments play a large role when it comes to personal development and the choice of learning activities. Whenever you decide on the next thing to learn, apart from interest, you (sub-)consciously ask yourself: In which areas do I need to improve my skills? Where am I already at a higher level?

Woman finding out about her skill level
Skill assessments help us to determine the exact state of your current knowledge, so that we can give the best possible recommendations.

The problem with self-assessments

Many learning recommendation systems adopt the kind of self-assessment described above, which leads to two major problems for the learner. On the one hand, the assessment is subjective and difficult for the learner to do. On the other hand, the granularity (think: How skilled are you with Microsoft Excel?) only allows for rather imprecise recommendations; most platforms recommend entire courses, just like Netflix offering entire series to watch.

But do they actually fit?

Or do you already know the entire beginning and need to skip through it to find the first relevant material? As a remedy, we use what we call atomic assessments.

Atomic skill assessments: The finer, the better

At edyoucated, we break down learning topics into the most elementary bricks possible, opening up the possibility to recommend and learn skills at a very fine granularity.

What do we call them? Skill atoms, of course! What could be smaller?

To give you an example, an atomic skill for Microsoft Excel would not be “Excel Fundamentals”, but instead “Opening and Saving Workbooks” or “Using the Fill Handle”. At this granularity, topics are broken down not into just a few different bits, but into up to a few hundred. This gives us the opportunity to offer the best next skill for you to learn at a much finer granularity and with much more precision.

Your benefit as a learner? You get exactly what you need!

Atomic assessments also solve another problem: The resulting self-assessment questions, for example, “Do you know how to open and save workbooks in Excel?”, can be answered simply with “yes” or “no” instead of requiring a difficult assessment on a scale. And, just like that, the self-assessment is much less subjective and easier to handle.
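A yes/no self-assessment over atomic skills can be sketched in a few lines. The skill list, question wording, and the `answer` callback are all hypothetical, just to show the shape of the idea:

```python
# Sketch of an atomic self-assessment: one simple yes/no question
# per atomic skill, instead of a subjective 1-to-10 rating.
# Skill names and question wording are made up for illustration.
atomic_skills = [
    "Opening and Saving Workbooks",
    "Using the Fill Handle",
    "Writing VLOOKUP Formulas",
]

def run_self_assessment(skills, answer):
    """Ask one yes/no question per atomic skill.

    `answer` is a callable mapping a question string to True/False;
    in a real application this would be a UI prompt.
    """
    return {
        skill: answer(f"Do you know how to do this: {skill}?")
        for skill in skills
    }

# Simulated learner who only knows the first two skills:
known = {"Opening and Saving Workbooks", "Using the Fill Handle"}
state = run_self_assessment(atomic_skills, lambda q: any(s in q for s in known))
```

The result is directly a knowledge state over atomic skills, with no fuzzy rating scales involved.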

Graph showing the storage of atomic skills at edyoucated
At edyoucated, we store our atomic skills in large, connected graphs (here illustrated in Neo4j). You can see the progression from a coarse ontology with only 8 “coarse skills” to an atomic skill ontology with more than 120 atomic skills.
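While edyoucated stores these ontologies in Neo4j, the underlying idea can be illustrated with plain Python: atomic skills form a graph whose edges encode prerequisites. The skills and edges below are invented for the sketch:

```python
# A toy atomic skill ontology as a prerequisite graph (adjacency lists).
# Each skill maps to the skills that should be learned before it.
# Skills and edges are made up; real ontologies hold hundreds of atoms.
prerequisites = {
    "Opening and Saving Workbooks": [],
    "Entering Data": ["Opening and Saving Workbooks"],
    "Using the Fill Handle": ["Entering Data"],
    "Writing Basic Formulas": ["Entering Data"],
    "Writing VLOOKUP Formulas": ["Writing Basic Formulas"],
}

def learning_path(skill, prereqs, seen=None):
    """Return a learning order for `skill` that respects prerequisites
    (a depth-first post-order walk of the prerequisite graph)."""
    seen = set() if seen is None else seen
    path = []
    for p in prereqs[skill]:
        if p not in seen:
            seen.add(p)
            path += learning_path(p, prereqs, seen)
    path.append(skill)
    return path
```

Walking the graph this way yields, for any target skill, the chain of atomic skills leading up to it, which is exactly what fine-grained recommendations need.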

Problems with atomic skill assessments

Still problems? None that we can’t solve.

Nevertheless, while atomic assessments provide information about your skills at a fine granularity and enable very precise recommendations, they come at a price. Quite obviously, we need to gather the information about your full current knowledge state first.

And that can turn into a tedious task!

In the case of a self-assessment, you would be forced to answer a large number of questions (worst case: one question per atomic skill!) before being able to start with the actual learning process. In most practical learning situations, this is almost impossible or at least a major limitation.

Difference between Coarse and Atomic Skills
Number of skill assessment questions we need to ask (in the worst case) for a full assessment for the above atomic ontology for Microsoft Excel. We need to employ some data science to reduce this number!

Research at edyoucated

To solve this issue, our research department is working on data-driven methods that are able to predict the most probable knowledge states of learners throughout the assessment process. Combined with an intelligent assessment strategy (the order in which the single atomic skills are assessed), we can reliably and significantly speed up the assessment process, making atomic skill recommendations viable and effective in practice.
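One simple flavor of this idea can be sketched without any machine learning (this is an illustration only, not edyoucated’s actual method): if a learner confirms mastery of a skill, we can infer that its prerequisites are mastered too and skip those questions. Skills and edges below are invented:

```python
# Toy prerequisite graph (keys are listed in a valid learning order).
prerequisites = {
    "Opening and Saving Workbooks": [],
    "Entering Data": ["Opening and Saving Workbooks"],
    "Writing Basic Formulas": ["Entering Data"],
    "Writing VLOOKUP Formulas": ["Writing Basic Formulas"],
}

def assess(skills, prereqs, knows):
    """Ask about advanced skills first; skip skills whose mastery was
    already inferred from a confirmed, more advanced skill."""
    state, asked = {}, 0
    for skill in reversed(list(skills)):  # most advanced skills first
        if skill in state:
            continue  # already inferred, no question needed
        asked += 1
        state[skill] = knows(skill)
        if state[skill]:
            # Propagate mastery to all (transitive) prerequisites.
            stack = list(prereqs[skill])
            while stack:
                p = stack.pop()
                if state.get(p) is not True:
                    state[p] = True
                    stack += prereqs[p]
    return state, asked

# A learner who knows everything up to "Writing Basic Formulas":
known = {"Opening and Saving Workbooks", "Entering Data", "Writing Basic Formulas"}
state, asked = assess(prerequisites, prerequisites, lambda s: s in known)
```

In this toy run, two questions recover the full four-skill knowledge state; probabilistic models push this reduction much further by also handling uncertain and noisy answers.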

How do we do that? Stay tuned for Part 2 of our assessment blog series!



edyoucated is funded by leading institutions such as the Federal Ministry of Education and Research (BMBF), the Federal Institute for Vocational Education and Training (BIBB), and the Federal Ministry for Economic Affairs and Climate Action (BMWK).