In late November 2016, a group of high school students gathered at New York University during the school’s Cyber Security Awareness Week to solve a mystery surrounding the murder of a fictional presidential candidate. The participants used digital evidence of security hacks to identify the killer, who turned out to be the individual who had hacked the candidate’s email and other accounts.
The students who participated in this event stepped outside their traditional computer science and coding curricula to gain critical cybersecurity experience. Few schools offer cybersecurity coursework, even though cybersecurity is a growing concern for businesses and governmental entities alike. This suggests that computer science curriculum designers should look at events like NYU’s Cyber Security Awareness Week and ask: what can we teach our children about cybersecurity? Answering that question requires understanding why cybersecurity has traditionally been left out of computer science programs.
– Current computer science core classes are a poor fit for cybersecurity.
These classes teach networking, programming, and operating systems, but they are not the best environment for teaching the kinds of activities that constitute hacking. Schools should develop both core and advanced classes that teach hacking skills, giving students the tools they need to understand and counter the kinds of attacks they will likely see in the real world. Critics might object to professional programs that explicitly teach students how to hack, but fighting fire with fire is a sound approach to cybersecurity training.
– A focus on cybersecurity is perceived as too narrow.
Students worry that a cybersecurity degree will limit their career options, favoring instead a more comprehensive computer science degree. Cybersecurity is also perceived as an advanced specialization that builds on basic networking expertise. Incorporating cybersecurity classwork as a core competency in computer science education would address both concerns.
– Cybersecurity is taught in seminars, continuing education, and other non-traditional arenas.
Cybersecurity training has developed as a “do-it-yourself” seminar industry that attracts computer science professionals looking to expand their skills. These seminars may teach useful skills and techniques, but they are not subject to the same basic standards and oversight that would apply to a university cybersecurity major.
On the positive side, thinking among educators is moving toward teaching cybersecurity not just at the university level but even in high school programming courses. High school students have grown up with technology that requires no special training to use, yet many of them have a natural curiosity to delve deeper into how that technology works. Catalyzing an interest in cybersecurity while students are young will produce a larger pool of trained professionals, better able to handle cybersecurity issues than generally trained programmers who have picked up their skills in a less formal manner. A related but equally important benefit of offering cybersecurity education to high school students is that it can help them better protect themselves against hacking and other risks associated with their social networks and other online presences.
The failure to expand and formalize cybersecurity education can have disastrous consequences as hackers sharpen their skills and computer engineers fail to keep pace. Until that education expands, and even after computer engineers are as creative and adept as hackers, no business can assume that its defenses will protect it from every cybersecurity attack. Cybersecurity insurance can offer a strong measure of protection, helping businesses recoup financial losses they might suffer as a result of an attack. Improved cybersecurity education coupled with insurance is the best defense against the growing risk of hacking attacks.