Mossé Cyber Security Institute: Part III

In previous articles (Mossé Cyber Security Institute: Part I & Mossé Cyber Security Institute: Part II) I discussed training methods offered by the Mossé Institute that support learning based on principles of cognitive science. In this final article in the series, I discuss how experts can benefit from Mossé training and offer some suggestions to further support their learning.

Highlights

  • Experts learn best by engaging in deliberate practice to fine-tune existing skills or address skill deficiencies.

  • Hard problems, including real case studies, can help identify skill deficiencies.

  • Training programs could provide customized learning based on those deficiencies.

  • Experts may sometimes need novice learning methods, such as step-by-step instruction or direct guidance.

  • Providing worked examples using similar problems is beneficial for novices and experts when given in the appropriate context.

Introduction

Cybersecurity experts have learned a specific skill or multiple skillsets and have fine-tuned them by engaging in activities that further their knowledge and practice. For this article, an expert is someone with advanced knowledge and technical skill in a specific field. However, there are aspects of cybersecurity they may know little or nothing about. How do experts learn and further their knowledge? Where can they further their knowledge in a field where they are already experts? What type of training can help develop their technical skills?

In cognitive load theory, there is a phenomenon called the expertise reversal effect. It suggests that instructional design practices can hinder learning if they are not appropriate for the learner's skill level. For example, in the last article, one suggestion I discussed is that more guidance is needed in the introductory cybersecurity course offered by the Mossé Institute. The suggestions included helping students better understand the context and reasons for learning a skill, to increase their motivation to complete each lab. The Mossé instructors have updated some of their lessons to reflect this suggestion.

However, step-by-step guidance may not work well for an expert learning a skill. Novices generally need step-by-step instructions and close guidance to help them conceptualize abstract topics; the conceptual difficulty results from not having schemas, or previous knowledge in cybersecurity, to connect with the new knowledge presented to them. Experts, with their experience and well-developed schemas, can generally conceptualize new information in their field without guidance. Experts can certainly benefit from refreshing previously learned knowledge, because schemas start to fade when the information goes unused for an extended period. Moreover, just because someone has advanced knowledge doesn't always mean it is used to its full potential or can be accessed when needed.

One method of training experts is to engage them in "deliberate practice": engaging in learning activities specifically designed to improve performance. This makes advanced training programs challenging to create. The skills taught may not be what the expert needs to focus on, or only one or two modules of a training fulfill their needs. Deliberate practice suggests that experts prefer to hone a specific skill that will be of value in their daily practice, and advanced training programs should assume everyone attending has learned the introductory and intermediate material.

The best type of training for experts presents them with hard problems. Hard problems put an expert's skills to the test to see how they approach a given situation; they help the expert better understand which skills they need to develop and provide time to practice those skills. Another advanced training method is to provide learners with case studies of real cybersecurity problems. For example, provide them the datasets from a breach to see how they analyze the data and what conclusions they draw, then provide a debriefing on what other experts found. That offers an opportunity for them to learn which skill sets they need to improve. This would be an excellent way to provide customized training to experts.

I mentioned in the last article how Mossé instructors shine when it comes to feedback. I want to refine that: they shine when providing feedback and "advice." Feedback is a summative response to a given task. Advice, in contrast, is also a summative response, but it includes actionable methods for improvement. Advice of this type is followed up with further reviews by a teacher or mentor to help ensure the learner improves from it. This part is really important. Experts, just like novices, may recognize they are weak in a given topic and choose not to continue with the learning if they don't have a support system. Likewise, if they are given a problem and have no solution to learn from when they reach the end of their knowledge, the learning experience may become futile, and they may be more likely to abandon the training. An effective expert training program will introduce learners to hard problems, identify where they are weak, and help them improve those skills.

Here again, the Mossé Institute shines at presenting hard problems. The learner is presented with exercises that help them examine their current knowledge and skills and identify where they need to improve. There is a community forum where questions can be asked, and instructors or other students can respond. There is also a new feature in each exercise called "Quick Question," where students can ask a question directly of the instructors.

Deliberate practice works well when the expert has a mentor who can determine where they are struggling, provide other lessons to develop a skill, and then return them to the original problem. The mentor or support system needs to allow for failure and error correction, and provide advice that enhances the expert's ability to solve a hard problem.

One challenge I am encountering right now involves disassembling malware. One of the requirements for the exercise is to disassemble malware, find an API call, and determine what it is doing. I have not disassembled malware in over 10 years. It is a skill I've completely lost because I didn't think I'd be disassembling malware anymore - those schemas have faded. I have to put the current exercise away and relearn some assembly and how to use a disassembler. In a previous article, I mentioned how having access to other training is beneficial to support my learning with the Mossé Institute. All the Humble Bundle collections I've amassed are being put to use so I can relearn disassembling malware and the basics of assembly programming. When I submit my assignment, if I don't do well, Mossé instructors are there to provide advice, and I can try again.
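To give a flavor of the task, here is a minimal sketch of scripted disassembly using the Capstone library in Python. The byte string is a contrived stub I made up for illustration, not real malware; in practice the bytes would come from a sample's code section, and this is not the specific exercise from the Mossé course.

```python
# Minimal sketch: disassembling a few x86-64 bytes with Capstone
# (pip install capstone). The byte string is a contrived stub,
# not taken from real malware.
from capstone import Cs, CS_ARCH_X86, CS_MODE_64

CODE = b"\x48\x31\xc0\xff\xd3\xc3"  # xor rax, rax; call rbx; ret

md = Cs(CS_ARCH_X86, CS_MODE_64)
for insn in md.disasm(CODE, 0x401000):
    note = ""
    if insn.mnemonic == "call":
        # An indirect call like this is where you would start tracing
        # which API the code is invoking.
        note = "  <-- possible API call"
    print(f"0x{insn.address:x}:\t{insn.mnemonic}\t{insn.op_str}{note}")
```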

However, an expert may simply be unable to complete an exercise due to a lack of the skills and knowledge to do so. Mossé doesn't provide a 'solution' for each exercise. That is a bit of a gamble, because someone may get stuck on one exercise, become discouraged, and believe they cannot complete the others. I imagine many educators have seen students do the same on their homework assignments or exams.

One suggestion is to keep a collection of exemplary student solutions. A solution doesn't have to show how the work was done, only the results. For experts, that may be all that's needed to spark a possible solution. Incidentally, seeing a final product can also reduce anxiety for novices, because they begin to learn the scope of what needs to be done. In cognitive load theory, the worked example effect suggests that showing someone how a solution is completed, then providing a similar problem for them to work through, improves students' learning and understanding of the topic under study. Worked examples are very beneficial for novices but generally not for experts. If there is a new type of SQL injection attack, a novice would need more help understanding it and may need to learn about databases and how SQL works. An expert pen-tester could conceptualize the attack without seeing an example, due to their advanced skills in performing SQL attacks. For experts, worked examples tend to be helpful only when the expert needs to develop a new skill to solve a problem. As I previously mentioned, I am currently using worked examples to relearn how to use a disassembler and the basics of assembly.
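As a concrete illustration of the kind of worked example that helps a novice, here is a minimal sketch of the classic SQL injection idea in Python with sqlite3. The table, data, and input string are hypothetical, chosen only to make the mechanics visible.

```python
# Worked example sketch: why string concatenation in SQL is dangerous.
# The users table and input are hypothetical, for illustration only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice-secret')")

malicious = "nobody' OR '1'='1"

# Vulnerable: the attacker-controlled input is concatenated into the
# query, so the injected OR clause makes the WHERE condition always true.
rows = conn.execute(
    f"SELECT * FROM users WHERE name = '{malicious}'"
).fetchall()
print("vulnerable query returned:", rows)  # leaks every row

# Safer: a parameterized query treats the input as a literal value.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (malicious,)
).fetchall()
print("parameterized query returned:", rows)  # no match, returns []
```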

Something hard to balance in training is appealing to the range of cognitive development and skills of learners - presenting information for an expert versus a novice. The Mossé Institute doesn't provide pre-skill assessments that would let students know what they need to brush up on before engaging in the learning. It does provide some scaffolding that slowly builds skills as students work through a module, but some modules need more transitioning before the practical application of the skills. For example, the Pandas labs on threat hunting begin with learning how to create parquet files and convert from CSV to parquet and vice versa, then jump directly into applying Pandas to threat hunting and performing statistical analysis. One or two exercises performing statistics with Pandas would be a good addition, and those exercises would be most impactful if made relevant to the practical threat hunting datasets (a sketch of what that might look like follows below).
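Here is a minimal sketch of the parquet conversion step plus the kind of statistics exercise suggested above. The file names and columns (src_ip, bytes_sent) are hypothetical, not taken from the Mossé datasets, and the snippet assumes pandas with a parquet engine such as pyarrow is installed.

```python
# Sketch: CSV-to-parquet conversion plus a simple threat-hunting
# statistic. File names and columns are hypothetical; requires
# pandas and a parquet engine (pip install pandas pyarrow).
import pandas as pd

# Convert CSV to parquet and read it back.
df = pd.read_csv("netflow.csv")
df.to_parquet("netflow.parquet")
df = pd.read_parquet("netflow.parquet")

# A statistics exercise with a hunting flavor: flag hosts whose total
# outbound bytes sit more than three standard deviations above the
# mean - a crude screen for possible data exfiltration.
per_host = df.groupby("src_ip")["bytes_sent"].sum()
threshold = per_host.mean() + 3 * per_host.std()
print(per_host[per_host > threshold].sort_values(ascending=False))
```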

Sometimes people have to learn a whole new skill before completing some Mossé tasks (for me, that is part of the appeal of their training - continuous learning). For example, I went through a Pandas course on YouTube to complete the assignments that use Pandas for threat hunting. Mossé does provide references to the Pandas documentation, but documentation for some products is not always intelligible to its intended audience. Despite having to learn Pandas, I saw its value once I started applying it to threat hunting with their datasets.

In summary, training experts requires providing them with hard problems. However, it is difficult to determine what counts as a hard problem for every expert. Presenting real-world case studies could surface skill deficits and then help experts develop those skills.

It may be beneficial for Mossé exercises to have a 'solution' or template for students in cases where the instructors' advice is not furthering a student's skill development. That could help maintain a student's motivation to complete further exercises, and it could reduce the instructors' workload by providing students with a worked example. If students are actively trying to learn from the advice and still not getting it, a worked example could be very beneficial for them to continue the training. Moreover, since instructors can see the student is trying, it shouldn't be a problem to let them see how someone else solved the exercise. Even better, since the student has to complete the assignment and upload the video and/or code, they can learn from performing the same steps as one of their peers.

Vygotsky's zone of proximal development suggests people learn best from peers, provided those peers can help support their knowledge gaps. Instructors can develop "blind spots" in how they teach someone, so a peer may provide another method of explanation that helps the student learn. The worked examples do not have to be solutions to the exercises, only similar to them. That would be an excellent job for an intern at the Mossé Institute - find out which exercises students struggle with the most and create a similar exercise with a solution. In that case, a worked example would help both novice and expert.

The Mossé Institute provides a wide range of problems in its courses, which allows an expert to determine what skills they need to develop for their day-to-day practice or to brush up on skills they haven't used in some time. It is always good practice to ensure students have a support system if they need assistance with their learning - something the Mossé Institute provides and could improve on.

References

Ericsson, K. A. (2020). Towards a science of the acquisition of expert performance in sports: Clarifying the differences between deliberate practice and other types of practice. Journal of Sports Sciences, 38(2), 159-176.

Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363.

Kicken, W., Brand-Gruwel, S., Van Merriënboer, J. J., & Slot, W. (2009). The effects of portfolio-based advice on the development of self-directed learning skills in secondary vocational education. Educational Technology Research and Development, 57(4), 439-460.

Rikers, R. M., Van Gerven, P. W., & Schmidt, H. G. (2004). Cognitive load theory as a tool for expertise development. Instructional Science, 32(1), 173-182.

Sweller, J. (2011). Cognitive load theory. Psychology of Learning and Motivation, 55, 37-76.

Sweller, J., & Chandler, P. (1991). Evidence for cognitive load theory. Cognition and Instruction, 8(4), 351-362.

Van Gog, T., Ericsson, K. A., Rikers, R. M., & Paas, F. (2005). Instructional design for advanced learners: Establishing connections between the theoretical frameworks of cognitive load and deliberate practice. Educational Technology Research and Development, 53(3), 73-81.

Author: Duane Dunston, College Cybersecurity Professor

Link to original article