Six common misconceptions about cybersecurity in the enterprise

In some companies, dealing with cybersecurity is a rather unpopular task. In many cases, IT administrators already have a pretty good idea of what's wrong with their company and how their own IT could be put to the test so that security gaps can be identified and mitigated or even eliminated. However, this does not mean that the administration team's suggestions will be accepted by management.

The importance of cybersecurity is now undisputed. But there are still too many misconceptions circulating. (Image: Pixabay.com)

Cybersecurity costs money. As long as IT systems and infrastructure are functioning, it is often difficult to justify investing the resources that would be needed to reduce risks and ensure smooth operation in the future as well - in other words, to establish cyber resilience. When organizations systematically underestimate their cyber risk, this usually has to do with several misconceptions. In the following, we look at six of the most common.

Assumption 1: It only affects the others anyway

"Our company is not interesting enough for a cyberattack." This assessment is anything but rare. Unfortunately, the reality is completely different. Statistics show that as many as 99 percent of all cases of cyber damage are the result of attacks that had no specific target at all. In other words, the vast majority of attacks are spray-and-pray: cybercriminals launch a broad, untargeted attack attempt and then simply wait to see with which companies or organizations, for example, the mail containing the phishing link is successful. Unfortunately, at many companies the bar for an initial compromise of their IT is low enough that these attacks get through. This plays into the attackers' hands, especially if they have primarily financial interests and want to blackmail the company, for example by encrypting its data with a crypto-Trojan, i.e. ransomware. Here, the spray-and-pray approach is usually the most profitable for cybercriminals - which in turn means that every company is a potential victim.

Politically motivated attacks are clearly distinct from this: here, success is ultimately only a question of available manpower, because in an ideologically motivated attack, monetary cost-benefit considerations play a completely subordinate role. In such cases, zero-day attacks, which exploit security vulnerabilities in software that are not yet publicly known, are also used more frequently. With a zero-day exploit, the attacker plays a trump card, so to speak: once the new attack method becomes public through its use, the attack vector is burned, because software manufacturers then roll out corresponding security updates.

Assumption 2: Attacks from the supply chain do not play a major role

In fact, supply chain attacks are on the rise. In this class of cyberattacks, the attack vectors are software solutions, devices, or machines that are supplied to a company and that it uses to conduct its business. The Log4j vulnerability disclosed in December 2021, for example, was a zero-day vulnerability in a Java logging library. Log4j is used to create and store logging information from software, applications, and hardware appliances. Because Log4j is often deeply embedded in many different solutions, in thousands of instances, a simple vulnerability scan is hardly sufficient to identify all vulnerable instances.
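Why is a simple scan not enough? Because Log4j is frequently shaded inside other archives ("fat JARs"), so a scanner has to look inside archives nested within archives. The following is a minimal sketch of such a recursive scan; the function names `find_log4j` and `scan_archive` are our own illustration, not a real tool, and the class path checked is the JNDI lookup class at the core of the vulnerability.

```python
import io
import os
import zipfile

# The class that made CVE-2021-44228 exploitable in log4j-core 2.x.
TARGET = "org/apache/logging/log4j/core/lookup/JndiLookup.class"

def scan_archive(data: bytes, name: str, hits: list) -> None:
    """Recursively inspect a JAR (zip) byte blob, including nested archives."""
    try:
        with zipfile.ZipFile(io.BytesIO(data)) as zf:
            for entry in zf.namelist():
                if entry == TARGET:
                    hits.append(name)
                elif entry.endswith((".jar", ".war", ".ear", ".zip")):
                    # Log4j is often shaded inside other archives ("fat JARs"),
                    # so descend into every embedded archive as well.
                    scan_archive(zf.read(entry), f"{name}!{entry}", hits)
    except zipfile.BadZipFile:
        pass  # not a readable archive; skip it

def find_log4j(root: str) -> list:
    """Walk a directory tree and report archives embedding JndiLookup.class."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for fname in files:
            if fname.endswith((".jar", ".war", ".ear")):
                path = os.path.join(dirpath, fname)
                with open(path, "rb") as fh:
                    scan_archive(fh.read(), path, hits)
    return hits
```

Even a sketch like this only covers the filesystem it can see; appliances and vendor black boxes remain invisible to it, which is exactly the supply-chain problem described above.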

In general, even open source software is not immune to security vulnerabilities. A professor at the University of Minnesota, for example, succeeded in introducing vulnerabilities into the Linux kernel as part of a study. To do this, he and one of his students pretended to provide bug fixes to the Linux community. The aim of the controversial action was to demonstrate how vulnerable even open source projects can be. A security vulnerability in the Linux kernel is potentially so serious because Linux is so widely used: it can be found in servers and smartphones, as well as in a wide variety of embedded devices - from cars to smart homes to industrial machines.

With the increasing digitalization of our economy and our lives, networked devices can also become a gateway for cybercriminals. In one case, for example, a supermarket chain was hacked via the smart refrigerated shelves in its stores, which the attackers chose as their attack vector. The same risk exists for networked devices in the smart-home sector: they too represent potential points of attack - a serious reputational risk for the device manufacturer or distributor. In both the private and commercial sectors, a much more conscious approach to installed software and purchased devices is therefore required. In the manufacturing industry, for example, where a machine can have a life cycle of several decades, sooner or later only mitigating measures are usually available to reduce security risks, because the manufacturer may no longer exist or may stop supplying security patches after a few years. Sometimes the only option is to seal the machine off from the rest of the network and accept the residual risk. As a general rule, it would be negligent for a company to shift responsibility for its cybersecurity entirely to its suppliers. Threats from within the supply chain are real and commonplace today. Companies therefore need not only appropriate risk awareness, but also experts who can support them in establishing effective cyber resilience.

Assumption 3: Our employees already have sufficient security awareness

All too often, employees' careless behavior is still a convenient gateway for cybercriminals to enter the company. Creating and maintaining appropriate risk awareness is a building block for cybersecurity, the importance of which a company should never underestimate. Only if they are aware of the danger will employees consistently avoid passing on passwords over the phone, for example, or carelessly clicking on a dubious link in an e-mail. Sometimes the potential danger is also a direct consequence of daily work. Employees in the HR department, for example, open applications almost every day without knowing whether or not the digital resume contains malicious code. The same applies to invoice PDFs in the mail inbox of the accounting department. That's why companies need technical measures to protect themselves against such attacks.
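One such technical measure is to check whether an attachment's actual content matches its claimed file type, since attackers like to disguise executables as invoices or resumes. The following is a minimal, illustrative sketch of a magic-byte check; the signature table and the function `looks_suspicious` are our own assumptions, not a production filter, and real mail gateways go far beyond this.

```python
# Illustrative file signatures ("magic bytes") and the extensions they
# legitimately belong to. Deliberately incomplete - a sketch, not a product.
SIGNATURES = {
    b"%PDF": {".pdf"},
    b"MZ": {".exe", ".dll", ".scr"},           # Windows PE executables
    b"PK\x03\x04": {".zip", ".docx", ".xlsx"}, # ZIP-based formats
}

def looks_suspicious(filename: str, data: bytes) -> bool:
    """True if the content's signature contradicts the file extension."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    for magic, extensions in SIGNATURES.items():
        if data.startswith(magic):
            return ext not in extensions
    return False  # unknown signature: leave the decision to other checks
```

For example, a file named invoice.pdf whose bytes begin with the Windows executable header "MZ" would be flagged, while a genuine PDF would pass. Such checks reduce, but never eliminate, the risk - which is why awareness training remains essential.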

But it is equally important to reduce the likelihood of successful phishing attempts by creating awareness of the dangers of social engineering attacks more generally. Social engineering means that attackers use deception to gain unauthorized data or access. Methods from human psychology are misused to manipulate employees into transmitting information or performing certain actions - such as the fatal click on the link in a phishing e-mail, or telling the password to supposed support staff on the phone.

Assumption 4: The scope of this security check will surely be sufficient

Putting corporate cybersecurity to the test with penetration tests is an important building block in building cyber resilience. However, if the scope of the pentest is too small, little is gained - it merely creates a false sense of security. A typical example is the exclusion of certain systems, such as those at the end of their life cycle that will soon be shut down or replaced anyway. As long as they are still running, however, it is precisely these legacy systems that often offer the most tempting attack vector. Another example: the server running a web application to be checked also runs an FTP service that allows the server to be completely compromised - but all services except the web application are excluded from the check. Similarly, it happens that a financial institution, for example, chooses the scope of its audit to be only as large as regulators officially require. Again, the result would be deceptive sham security.

If pentests are to be truly meaningful, they must not just focus on a section of the company's IT. Rather, they must be holistic in nature. After all, the goal of a penetration test is not just to make management feel good about cybersecurity - it is to identify real vulnerabilities and potential attack vectors so that they can be fixed before they are exploited by criminal attackers.

Assumption 5: Penetration testing can be done by the IT department on the side

In most companies, pentesting cannot be an in-house task at all. After all, IT administrators have one job above all else: to ensure that the company's systems run reliably. As a rule, the administration team is already working at 100, if not 120 percent capacity with its operational tasks. In addition, penetration testing requires highly specialized and constantly updated expertise, which the IT department usually does not have at its disposal. It is important that management understands that a pentest is not something that can simply be done on the side. At the same time, internal IT staff must realize that a security audit is never about discrediting their own cybersecurity work, but about strengthening it. A meaningful penetration test would not even be feasible with in-house resources, because both the know-how and the time are lacking. This is only different if the company is large enough to afford its own dedicated Red Team - the attackers - for more or less continuous pentesting. This Red Team is then countered by a dedicated Blue Team - the defenders. But even a dedicated Red Team can benefit greatly from external support by ethical hackers.

Assumption 6: Our backups save us in case of emergency

A little more than five years ago, this statement might have been true. Today it no longer is, at least not in every case. It is important to remember that the quality of malware has increased significantly. Crypto-Trojans that encrypt corporate data for extortion purposes no longer necessarily do so immediately. There is now ransomware that first nests in a company's backups and gradually destroys them. Only months later, when the backups have become unusable, does the crypto-Trojan set about encrypting the company's data - and the actual extortion begins.

That is why it is important today, firstly, to secure backups against malware with suitable protection concepts and, secondly, to check them regularly. Only a backup that can actually be restored can be relied on in an emergency. Companies should therefore regularly test, practice, and rehearse their disaster recovery. And if a company encrypts its backups for security reasons, the backup key itself becomes a potential point of attack: cybercriminals can of course also encrypt the company's backup key. The backup would then, in turn, be unusable, and the extortion through the encryption of the company's data could begin. That is why companies should keep their backup crypto keys offline and also document their disaster recovery procedures offline.

Conclusion: From cybersecurity to cyber resilience

The threat of cyberattacks has not diminished - on the contrary. Concluding from an incident-free past that a company will remain safe from cybercrime in the future would perhaps be the most serious misconception of all. Operational reliability in IT can only be established if a company builds, maintains, and further develops its cyber resilience with suitable, holistic concepts and measures. In any case, the effort is worthwhile, because the financial damage in an emergency far outweighs the foresighted investment in cybersecurity. As in medicine, prevention is better than cure when it comes to cybersecurity.

Authors:
Michael Niewöhner and Daniel Querzola are both managers and penetration testers at Ventum Consulting, Munich  

