Information Network Protection

New threats are prompting new views on how information networks should be protected.

Jack Danahy, Director of the Institute for Advanced Security Technology at IBM Security Division, and Dr. Matthias Kaiserswerth, Vice President and Director of the IBM Zurich Research Laboratory, have shared new views on information security and on how long-standing assumptions and approaches to implementing information protection might finally be left behind.

For decades, information security has been shaped by approaches resting on the tacit assumption that systems and networks are inherently imperfect and vulnerable, and that protection is needed because of the growing number of connected devices rather than because of vulnerabilities in the architecture of the devices themselves. Recent incidents of unauthorized access to systems and injection of malicious code, as well as publicly acknowledged compromises of data and services, are a constant reminder that, despite enormous investment, traditional models of information security are no longer sufficient (if they ever were).

The destruction of the perimeter

Generally accepted security measures are very often built around a perimeter drawn about the information infrastructure. Whether it is the separation of internal networks from external ones or the isolated execution of files on a shared system, the idea that a secure perimeter can be maintained became a comforting illusion once multi-user systems and basic internetworking appeared. Warnings about the inadequacy of this protection have been raised all along, from demonstrations of how easily viruses and worms cross these imaginary boundaries to constant reminders that insiders pose a significant threat. These warnings went unheeded, because dividing networks into "internal/external" and "protected/unprotected" created a false impression that corporate and private networks were secure. Because the focus was on strong perimeter defenses, the internal systems themselves remained weak and poorly protected, and responsibility for information security shifted from system developers to the operators of the environment.

Over the past five years, the widespread adoption of devices built on mobile and cloud technologies and the expansion of access to corporate resources through social networks have destroyed forever even the illusion of a protected perimeter. More and more data is stored and processed outside the internal network, and more and more unknown and not always vetted organizations are involved in handling it.

Coming to terms with the situation

The disappearance of the perimeter has clearly increased the exposure of businesses to IT risks. Add to this limited resources and the need to comply with international information security standards, and it becomes plain that information security is no longer the province of specialists alone but also the concern of company management. New people, from chief executives to lawyers, from entry-level programmers to members of the board of directors, are now involved in tasks that were previously purely technical and highly specialized security matters. The financial health and welfare of companies increasingly depends on how well they manage their security, so security commitments and decisions are becoming ever more strategic.

Success awaits those leaders in the security industry who can explain to the market the need for real change in a language understood not only by specialists. According to a recent analytical report by IBM's Center for Applied Insights, approximately 25% of IT security directors and security industry leaders have already faced the situation described here, recognizing the need for strategic consultation and for action aligned with directions and priorities agreed with business management.

The difficulties created by complex reality

Systems have become too complex and too unique for traditional operational testing techniques to apply to them. Every day, systems are built and extended that simply cannot be tested against all the practical purposes for which they are intended. A new approach to building systems is needed, one that recognizes the importance of security. It should include new initiatives in traditionally untouched areas such as gathering security requirements, testing system components, securing the supply chain, and secure software architecture.

These changes rest on a re-examination of the principles on which a secure infrastructure is built. Security must move from quantity to quality: systems must be designed and deployed so that security requirements are considered alongside performance and functionality requirements. It follows that a system which cannot be tested for compliance with security requirements should not be deployed. An architecture that adds functionality at the expense of practical assessability should be reviewed so that the value provided by each architectural component is weighed against the risks it may introduce. The problem of untested and/or untestable threats grows almost daily, with every new function, every new system, and every new user, especially against the background of the growing use of employee-owned devices at work (Bring Your Own Device, BYOD).

BYOD and secure desktop

With the growing number of consumer IT devices and the appearance of personal devices in the workplace, organizations and their employees are forced to look for ways to solve new enterprise security problems. In addition, an IBM study of CIOs showed that two out of three plan to integrate mobile solutions into their information infrastructure and to virtualize their business in order to remain competitive.

To meet these challenges, researchers at IBM Zurich, also known for the secure JCOP operating system now used on millions of smart cards, have developed a "secure desktop" for organizations practicing BYOD. A device plugged into the USB port of any computer running 64-bit Windows or Linux streams the core elements of the user's desktop software from the corporate cloud, using an OS-streaming technology based on a virtual machine, also known as a hypervisor. Once the streaming hypervisor starts, a typical Windows or Linux environment is launched from the cloud, emulating the user's computer, including applications and files. Any changes are immediately and safely backed up. Even if the hard drive of the host computer is infected with malware, the tight integration of the client hardware and the hypervisor prevents the malware from interfering, rendering it harmless.
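The exact mechanism is IBM's own, but the underlying idea of streaming a verified desktop image rather than trusting the host machine can be sketched roughly. The following minimal Python sketch assumes a hypothetical corporate endpoint and a signed chunk manifest; the URL, manifest format, and function names are illustrative and are not IBM's actual implementation.

```python
import hashlib
import json
import urllib.request

# Hypothetical corporate cloud endpoint; not IBM's actual service.
CLOUD_BASE = "https://desktop.example-corp.internal"

def fetch(path: str) -> bytes:
    """Download a resource from the (hypothetical) desktop-streaming service."""
    with urllib.request.urlopen(f"{CLOUD_BASE}/{path}") as resp:
        return resp.read()

def load_manifest() -> dict:
    """The manifest lists every image chunk and its expected SHA-256 digest."""
    return json.loads(fetch("manifest.json"))

def stream_desktop_image(manifest: dict) -> bytes:
    """Fetch the desktop image chunk by chunk, verifying each digest.

    Only verified chunks are handed on, so a tampered chunk is rejected
    before it can reach the virtual machine.
    """
    image = bytearray()
    for chunk in manifest["chunks"]:
        data = fetch(chunk["path"])
        digest = hashlib.sha256(data).hexdigest()
        if digest != chunk["sha256"]:
            raise ValueError(f"integrity check failed for {chunk['path']}")
        image.extend(data)
    return bytes(image)

if __name__ == "__main__":
    manifest = load_manifest()
    image = stream_desktop_image(manifest)
    print(f"verified desktop image: {len(image)} bytes, ready for the hypervisor")
```

The point of the sketch is the design choice: every chunk is checked before it ever reaches the virtual machine, so a compromised host disk or network path cannot silently alter the environment the employee works in.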

This is one example of the changes that can be made now in order to reduce security risks over time.

What to do today?

Any change of this kind takes time. As with the development of any critical system, updating all systems or security practices is a gradual process. In the meantime, while businesses develop the necessary will and capability, investment is needed to manage and balance the risks inherent in today's highly vulnerable systems. Here, new data analytics capabilities, which provide analysis of deviations and trends, create opportunities for coordination and rapid response.
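As a rough illustration of the kind of deviation-and-trend analysis meant here, the following minimal Python sketch flags anomalous spikes in daily security-event counts against a simple rolling baseline; the data, window, and threshold are invented for illustration, and real analytics platforms use far richer models and data sources.

```python
import statistics

def flag_anomalies(daily_counts, window=7, threshold=3.0):
    """Flag days whose event count deviates sharply from the recent baseline.

    A day is anomalous when it lies more than `threshold` standard deviations
    above the mean of the previous `window` days.
    """
    anomalies = []
    for i in range(window, len(daily_counts)):
        baseline = daily_counts[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.pstdev(baseline) or 1.0  # avoid division by zero
        score = (daily_counts[i] - mean) / stdev
        if score > threshold:
            anomalies.append((i, daily_counts[i], round(score, 1)))
    return anomalies

if __name__ == "__main__":
    # Invented example: counts of failed logins per day.
    counts = [110, 95, 102, 99, 120, 105, 98, 101, 97, 430, 108, 99]
    for day, count, score in flag_anomalies(counts):
        print(f"day {day}: {count} events (z-score {score}) looks anomalous")
```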

Whether the vision for improving these environments is built on reducing the risks of today's insecure environments or on collecting data, such analytical platforms will become the strategic cornerstones for identifying and preventing risks over the next 15–20 years of this process.

What should information security directors do?

It is unlikely that many of them will change their minds quickly, but given how critical the changes are and how quickly organizations keep putting their security at risk, there are certainly measures that security directors should take as soon as possible.

First of all, you need to determine your exposure to risk. Only 15 years ago it came as a surprise that many organizations had "blind spots" in their internal structure: poorly documented or poorly understood components of the corporate infrastructure, such as points of network interconnection, service delivery, or data storage and access, which were not as familiar then as they are now. The rapid introduction of new technologies over the last decade has made it virtually axiomatic that a fairly large share of risk comes from systems that remain institutionally independent and are not subject to any management or control.

From the very first day an organization begins to take a more serious and, finally, holistic approach to its security, it should examine and understand the full picture of the environments it intends to protect. Typically this means tools for monitoring the network and discovering applications, as well as monitoring network communications to identify endpoints. Once unexpected components are identified, it is important to assign responsibility to their owners and operators. In environments where deployment complexity outstrips discovery resources, or where ownership is disclaimed, some organizations even shut down or block systems and wait for the calls from interested parties that start a constructive dialogue.
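As one concrete illustration of the discovery step described above, the following minimal Python sketch probes a small internal subnet for a handful of common service ports and builds a crude inventory of responding hosts; the address range and port list are illustrative placeholders, and real asset-discovery tools combine network monitoring, application fingerprinting, and many more data sources.

```python
import ipaddress
import socket

# Illustrative values; substitute the organization's own ranges and ports.
SUBNET = ipaddress.ip_network("192.168.1.0/28")
COMMON_PORTS = {22: "ssh", 80: "http", 443: "https", 3389: "rdp"}

def probe(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

def discover_assets():
    """Build a simple inventory of responding hosts and their open services."""
    inventory = {}
    for addr in SUBNET.hosts():
        host = str(addr)
        open_services = [name for port, name in COMMON_PORTS.items() if probe(host, port)]
        if open_services:
            inventory[host] = open_services
    return inventory

if __name__ == "__main__":
    for host, services in discover_assets().items():
        print(f"{host}: {', '.join(services)}  <- who owns this system?")
```

Every host that turns up in such an inventory without a documented owner is exactly the kind of "blind spot" described above, and the natural next step is to assign responsibility for it.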

By carrying out such an audit, and by making sure promptly and strictly that it is repeated regularly, you can at least form a picture of the full set of equipment whose risks must be monitored and which must be protected, while the organization decides what to strengthen, modernize, or replace.

The short-sighted and decidedly uncreative response to the new security concept is to keep assuming that each vulnerable system must be either isolated or upgraded in order to preserve its functionality and reduce its vulnerability. Judge for yourself: once the organization has laid out its full set of security problems, this piecemeal strategy looks like cleaning the carpets in a house slated for demolition. A systematic view of the aggregate threat, in which the recognized need for security leads to a broader reassessment and rethinking of all risk characteristics, may well show that other options are more effective and more logical.

After the inventory and risk assessment, there are many options for improving security in a broader sense. In some cases, the right first step is to remove a particular vulnerable application or to replace large segments whose functionality is exposed to threats. Sometimes moving to a new data delivery platform, for example hosting or cloud services, makes it possible to avoid vulnerable equipment altogether.

If time is set aside and a strategic approach is taken to choosing the right risk-reduction strategy, then effort, mistakes, and repeated rework can be minimized. Such an approach can also draw more senior, less narrowly technical partners into the discussion, which will raise awareness of the problems and reduce the likelihood of their recurring in the future.

Understand responsibility

Questions of profitability, investment risk, and performance are usually handled at the highest levels of leadership in most organizations. As security concerns force changes, these same leaders must take part in a balanced distribution of responsibility. In fact, some senior executives have today stepped back from the processes of vetting the systems and the people that will interact with sensitive or confidential data. As a result, they are ideally placed to support the implementation of a new vision of security.

The most effective accountability strategy is one that entrusts staff with the most powerful controls and with the best opportunities to justify and plan change for the business. Each new data source, each new network, each new application must be created, tested, deployed, and operated according to the needs of the organization, as those needs are defined by its owner. The person responsible for a particular component must be involved in this process, as must the person responsible for the initiative the component supports. Each of these individuals owns part of the decision plan, and it is through their concentrated joint effort that the necessary measures can be understood and taken.

Change is coming

The effects of the migration of responsibility examined here will be reflected in products, in the strategies of businesses, and in their expectations and results. Organizations that understand this intrinsic need for security will develop new relationships and a new common language of security, and this will ultimately lead to their prosperity. Organizations that apply yesterday's solutions, hoping to bury the new reality beneath them, will face rising costs, and their security will remain at risk.

This article is based on materials from PLUS magazine.
https://www.plusworld.ru/journal/section_1168/section_153900/art153876/