Network Deployment
by Mustafa Ali | October 23, 2017
Wondering, “What is micro segmentation?” Microsegmentation is the practice of using software-defined policies, rather than hardware network configurations, to make network security more flexible. It only works, however, when implemented with the right tools and forethought.
There are four architectural models used in microsegmentation. They are:
When considering microsegmentation solutions, even for a single virtual machine, avoid thinking about technical solutions only. The solution should cover technology, process, and people. So do not simply go for the security model you assume is best to implement; select the architectural model that best secures the operation of your modern data center.
Microsegmentation allows security policies to be defined by workload, application, VM, OS, or other characteristics. Source: VMware
Now that you’re no longer asking yourself, “What is micro segmentation?”, the next step is understanding its benefits. This will help you see why it matters in data centers and other IT/ICT platforms. The following are the main benefits of microsegmentation:
Security administrators depend on microsegmentation to adapt to new and unfolding scenarios. Threat topologies in data centers are constantly changing: old vulnerabilities become inconsequential as new ones are exposed, and user behavior remains a variable that surprises any security administrator. Because new security scenarios keep emerging, administrators need a way to extend their capabilities, and microsegmentation gives them one.
For example, a security administrator may start with an effective distributed firewall in the data center, then add IPS and stateful firewalling for deeper traffic visibility, or improve server security with agentless anti-malware.
Even so, administrators need all of these functions to cooperate for security to be effective. Microsegmentation makes this possible by enabling intelligence sharing between security functions; the end result is a security infrastructure working in concert to shape responses to different situations.
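One common pattern for this kind of intelligence sharing is tagging: when one security function detects a problem, it tags the workload, and other functions react to the tag. The Python sketch below illustrates the idea; the workload names, tags, and policy structure are hypothetical and not any specific vendor's API.

workloads = {
    "web-01": {"tags": {"tier:web"}},
    "db-01": {"tags": {"tier:db"}},
}

# Firewall policies are matched on tags rather than addresses.
policies = [
    {"match_tag": "threat:quarantine", "action": "isolate"},
    {"match_tag": "tier:web", "action": "allow-http"},
]

def report_malware(workload_name):
    # The anti-malware engine shares its finding by tagging the workload.
    workloads[workload_name]["tags"].add("threat:quarantine")

def effective_action(workload_name):
    # The distributed firewall re-evaluates policy against the workload's tags.
    tags = workloads[workload_name]["tags"]
    for policy in policies:
        if policy["match_tag"] in tags:
            return policy["action"]
    return "default-deny"

report_malware("web-01")
print(effective_action("web-01"))  # isolate: the quarantine tag overrides allow-http
print(effective_action("db-01"))   # default-deny: no matching tag

Because the firewall keys on the tag rather than on an address, the anti-malware finding changes the enforced policy without anyone rewriting firewall rules.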
Traditional data center architecture concentrates security on important workloads, usually at the cost of minimal security for lower-priority systems. Deploying and managing traditional security in virtual networking is costly, and that cost forces data center administrators to ration security. Intruders take advantage of the weak protection on low-priority systems to infiltrate the data center.
A sufficient level of defense requires a high level of security on every system in the data center. Microsegmentation makes this possible by embedding security functions into the infrastructure itself, allowing administrators to apply those functions to every workload in the data center.
A security administrator also needs to be sure that security enforcement persists even when the network environment changes. This is crucial because data center topologies change constantly: workloads are moved, server pools are expanded, networks are renumbered, and so on. The constants amid all this change are the workload and its need for security.
In a changing environment, the security policies applied when a workload was first deployed can become unenforceable after a short while. This is most common where the policy relied on loose associations with the workload, such as protocol, port, and IP address. The challenge of maintaining persistent security is aggravated when workloads move to the hybrid cloud, or even to other data centers.
Microsegmentation gives administrators more useful ways to describe a workload. Instead of depending on IP addresses, they can describe its inherent characteristics and tie that information back to the security policy. The policy can then answer questions such as: what kind of data will this workload handle (personally identifiable information, financial, or low-sensitivity)? What will the workload be used for (production, staging, or development)? Administrators can also combine these characteristics so that policy attributes are inherited; for instance, a production workload handling financial data may get a higher level of security than a development workload handling the same data.
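As a rough illustration of characteristic-based policy, the Python sketch below derives a security level from a workload's environment and data classification rather than from its IP address. The attribute names and tiers are assumptions made for the example, not a product schema.

def security_level(environment, data_class):
    # Combine environment and data classification into a policy tier.
    if environment == "production" and data_class in {"financial", "pii"}:
        return "strict"     # e.g. IPS, stateful firewalling, anti-malware
    if data_class in {"financial", "pii"}:
        return "elevated"   # sensitive data outside production
    if environment == "production":
        return "standard"
    return "baseline"       # low-sensitivity data in development or staging

# The same financial data is treated differently depending on environment.
print(security_level("production", "financial"))   # strict
print(security_level("development", "financial"))  # elevated

Because the level is derived from the workload's characteristics, the policy follows the workload even if it moves and its addresses change.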
As with conventional virtualization, there are many ways to implement network microsegmentation. In most scenarios, existing protection mechanisms and legacy infrastructure are systematically augmented with new technologies, including virtual firewalls and software-defined networking. When adopting microsegmentation technology, there are three major considerations.
Visibility is the first. Potential adopters need to thoroughly understand communication patterns and network traffic flows within, from, and to the data center. Next, implement security policies and rules with a zero-trust approach: start from a complete lockdown of communications and keep zero-trust policies in force throughout the microsegmentation deployment. Across the network, communication should be allowed only carefully, based on the results of the earlier traffic analysis. This is the best practice for anyone who wants to ensure both application security and connectivity.
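A minimal sketch of this zero-trust stance in Python, assuming the flows discovered during the visibility phase have been reviewed and recorded as (source, destination, port) tuples: anything not explicitly allowed is denied.

# Flows observed and approved during the visibility/analysis phase (placeholder data).
observed_flows = {
    ("web", "app", 8443),
    ("app", "db", 5432),
}

def is_allowed(src, dst, port):
    # Allow only traffic that matches a reviewed, observed flow.
    return (src, dst, port) in observed_flows

print(is_allowed("web", "app", 8443))  # True: explicitly allowed
print(is_allowed("web", "db", 5432))   # False: denied by default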
The third consideration is that the process must be repeated regularly. Analyzing traffic and distilling rules is not a one-time deployment effort; it has to be a continuous activity, so that changes in workloads and policies are caught and current analytical results, whether from new applications or shifting traffic patterns, can be used to tune microsegmentation rules effectively. All of these considerations put a premium on the choice of tools and hypervisor used to facilitate microsegmentation.
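The ongoing tuning loop can be thought of as a diff between current traffic and the allowed rules. The sketch below uses placeholder data; a real deployment would pull flows from logs or a monitoring tool.

allowed_rules = {
    ("web", "app", 8443),
    ("app", "db", 5432),
}

current_traffic = {
    ("web", "app", 8443),
    ("app", "cache", 6379),   # a new application dependency has appeared
}

unmatched_traffic = current_traffic - allowed_rules  # candidate rules to review and allow
stale_rules = allowed_rules - current_traffic        # rules with no recent traffic to reconsider

print("Review and possibly allow:", unmatched_traffic)
print("Review and possibly retire:", stale_rules)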