In a changing environment, the security policies put in place when a workload was first deployed soon become unenforceable. This is most common where the policy relied on loose associations with the workload, such as its protocol, port, or IP address. The challenge of maintaining persistent security is aggravated when workloads migrate to the hybrid cloud, or even to other data centers.

Network micro-segmentation gives administrators more useful ways to describe a workload. Instead of depending on IP addresses, they can describe a workload's inherent characteristics and tie that information back to the security policy. The policy can then answer questions such as: what kind of data will this workload handle (personally identifiable information, financial, or low-sensitivity)? What will the workload be used for (production, staging, or development)? Administrators can also combine these characteristics into inherited policy attributes. For instance, a production workload handling financial data may warrant a higher level of security than a development workload handling the same data.
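The idea of combining workload characteristics into an inherited policy attribute can be sketched as follows. This is a minimal illustration, not any vendor's API; the label names (`data-class`, `environment`) and tier names are hypothetical.

```python
# Minimal sketch (hypothetical label and tier names): deriving a security
# tier from a workload's inherent characteristics instead of its IP address.

def policy_tier(labels: dict) -> str:
    """Combine workload characteristics into an inherited security tier."""
    data = labels.get("data-class")    # e.g. "pii", "financial", "low"
    env = labels.get("environment")    # e.g. "production", "staging", "development"
    if env == "production" and data in ("pii", "financial"):
        return "high"
    if data in ("pii", "financial"):
        return "medium"
    return "baseline"

# A production workload handling financial data inherits a stricter tier
# than the same workload running in development.
print(policy_tier({"data-class": "financial", "environment": "production"}))   # high
print(policy_tier({"data-class": "financial", "environment": "development"}))  # medium
```

Because the tier is derived from labels rather than addresses, the policy follows the workload even when its IP address changes, which is the property the loose-association approach lacks.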

The process must be repeated regularly. Distilling rules and analyzing traffic is not a one-time deployment effort. It needs to be a continuous activity, so that changes to policies and workloads are caught as they happen and current analytical results, such as shifts in traffic patterns or newly deployed applications, can be used to effectively tune the micro-segmentation rules. All of these considerations put an emphasis on the choice of tools and hypervisor used to facilitate micro-segmentation.
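The continuous tuning loop described above amounts to repeatedly comparing observed traffic against the current rule set and flagging anything new for review. The sketch below assumes flows and rules are reduced to simple (source, destination) pairs; real tooling would work with richer flow records.

```python
# Minimal sketch (hypothetical data model): surfacing observed flows that
# are not covered by the current micro-segmentation rule set, so the rules
# can be tuned as traffic patterns change or new applications appear.

def flows_needing_review(observed_flows, allowed_flows):
    """Return observed (source, destination) pairs not matched by any rule."""
    return sorted(set(observed_flows) - set(allowed_flows))

allowed = [("web", "app"), ("app", "db")]
observed = [("web", "app"), ("app", "db"), ("app", "reporting")]  # new traffic pattern

print(flows_needing_review(observed, allowed))  # [('app', 'reporting')]
```

Running this comparison on a schedule, rather than once at deployment, is what keeps the rule set aligned with what the workloads are actually doing.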


© 2020   Created by Brenda Day.
