CarahCast: Podcasts on Technology in the Public Sector

Zero Trust and Protecting Data as a Strategic Asset with McAfee

Episode Summary

Carahsoft & McAfee Enterprise have invited two guest panelists to discuss existing Federal government frameworks for Zero Trust, the importance of placing data at the center of a Zero Trust Architecture, and best practices for meeting the requirements defined by Executive Order 14028.

Episode Transcription

Speaker 1: Hi everyone. Thanks for joining our executive perspective series featuring the White House Executive Order and more. Today's session is called "Zero Trust and Protecting Data as a Strategic Asset." Our speakers for today are Kent Landfield, Chief Standards and Technology Product Strategist for McAfee Enterprise and Sadik Al-Abdulla, Vice President of Product Management for McAfee Enterprise. At this time, I'd like to pass it off to them.

Jason White: Kent, Sadik, thank you very much for joining me today. Today we're going to be spending some time talking about section three of Executive Order 14028, modernizing the nation's cybersecurity. This section of the EO includes requirements for accelerating momentum to secure cloud services, gaining a better understanding of sensitive unclassified data, the implementation of encryption and multi-factor authentication, and, lastly, moving toward a Zero Trust architecture. Our emphasis today is going to be on data and why data protection should be viewed as the subtext and the crux of the entirety of section three. After all, data is arguably the most important government resource other than its people, so improving security should focus on improved data protection. Kent, my first question is for you. As I mentioned in the intro, section three of the Executive Order appears to be putting a strong emphasis on better data protection by requiring that agencies have a better understanding of their data by identifying both the type and sensitivity of unclassified data. The only specific data control requirement is encryption. Now, do you think the data requirements go far enough in this Executive Order? Why do you think the White House stops short of requiring the implementation of a full data protection platform?

Kent Landfield: Well, thanks for having me today. I guess the real answer here is I think the White House is being realistic. You can't jump to the end without going through the processes needed to get the basics in place, and a lot of those basics are actually missing today. Things like multi-factor authentication for approving access to that data really need to be put in place; passwords are just not very successful at keeping people out, with brute-force attacks and credentials being stolen. The EO is requiring agencies to evaluate the types and sensitivity of their unclassified data. This reevaluation is really about understanding which data in their agencies is the biggest target and under the greatest threat. That will help them focus their controls on what really needs to be enhanced. This is not something that has occurred in the past. This is a concerted, directed effort by the EO to get agencies to do this and report on their findings. So this is actually something that I think is a major benefit of the EO, as organizations really try to get a solid understanding of their data protection needs. For unclassified data, encryption is absolutely critical for data at rest and in transit. The EO makes it clear that agencies really need to implement that so they have those types of baseline protections in place. Once we're all on some sort of consistent baseline, it'll be much easier to implement data-level object security and a full data protection platform.
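[Editor's note: as an illustration of the multi-factor authentication point above, the one-time codes generated by authenticator apps are simply an HMAC over a shared secret and a counter. A minimal sketch of the standard HOTP/TOTP algorithms (RFC 4226 / RFC 6238), not any specific agency or vendor implementation:]

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over an 8-byte big-endian counter."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                  # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, step: int = 30) -> str:
    """RFC 6238 TOTP: HOTP keyed to the current 30-second time window."""
    return hotp(secret, int(time.time() // step))

# RFC 4226 Appendix D test vector: secret "12345678901234567890", counter 0
print(hotp(b"12345678901234567890", 0))  # → 755224
```

Because the code is derived from a secret the attacker does not hold and a counter that moves, a stolen password alone is no longer enough, which is exactly the brute-force and credential-theft gap described above.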

Jason White: Yeah, sure. I mean, you can't actually start protecting your data until you have some sense of what your data actually is.

Kent Landfield: Correct.

Jason White: Makes a lot of sense. So Sadik in your role, you've engaged with McAfee customers globally and evaluated industry trends and challenges. The EO outlines a need for accelerating the adoption of cloud services. How has that increased the demand for better data protection? What are the biggest challenges customers are facing as they attempt to get a better handle on securing their data in a hybrid environment? 

Sadik Al-Abdulla: Jason, I'd say that the adoption of cloud services introduces a level of velocity, flexibility and power that can dramatically assist agencies in their mission. However, all of those fundamental shifts introduce more risk and more vectors. So while the different cloud services, in their preparation for federal environments, may not inherently present a risk at an infrastructure or platform level, the functionality of those services, how collaboration works and how the enhanced business processes work, does open up new vectors. And so you asked, how does it increase the demand for better data protection? Well, as you increase velocity, and as you increase collaboration, you inherently increase risk, and so you want the controls to keep pace alongside. Now, the second part of your question was, what are the biggest challenges people are facing as they attempt to get a better handle? It probably starts with uncovered vectors, right? Traditional approaches covered traditional architectures, and the ability to enforce controls frequently comes back to the endpoint or the network or appliances. Not all of them, and in most cases not any of them, translate well into native cloud fabrics. And so as you introduce the new fabrics, you introduce new vectors that also need native controls.

Jason White: Yeah, that makes sense. I mean, I would imagine that when you start accelerating the utilization of data, it becomes a much bigger challenge to actually quantify that data. What guidance or best practices can you offer the government? Because, as you've heard, one of the Executive Order requirements is that they need to start identifying both the type and sensitivity of their unclassified data. Are there best practices or specific guidance you might offer up as they try to tackle that challenge?

Sadik Al-Abdulla: Well, I'd say the first and probably the most significant piece of guidance I'd give is: don't look at it as a monolithic project. The worst possible outcome is chartering a massive endeavor, going through classification, and, in the midst of a multi-year project, having a breach. I would suggest approaching these kinds of initiatives with the same model of agile development that we use in the software industry and the cloud industry. So iterate quickly. Start with what you absolutely know: start with a certain type of high-risk data that has a certain restricted population, and both map that and enforce controls for it while you continue to iterate and do the remainder of the identification, classification, definition of permissions, et cetera. I can't tell you how often we've encountered customers that have written a beautiful project plan, two years in duration, and unfortunately had an incident partway through, before they implemented controls.

Jason White: Yeah, yeah. I mean, obviously, you want to take your time and do it right, but you want to start with the low-hanging fruit, if you will. And what would you consider low-hanging fruit? Typically, would it be regulated data? Would it be specific project data? What would be the easiest data to address first?

Sadik Al-Abdulla: Well, on the unclass side, I think it would be the highest-risk data, which is that which is easily monetized. So large amounts of either personal or financial information associated with the agency's mission.
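[Editor's note: a first iteration on the "start with easily monetized data" advice can be as simple as pattern scanning. A minimal, hypothetical sketch in Python; the pattern set and names are illustrative assumptions, not a McAfee product API:]

```python
import re

# First-pass patterns for easily monetized, high-risk data.
# Real classifiers add validation (e.g. Luhn checks) and more types.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text: str) -> set:
    """Return the set of high-risk data types detected in the text."""
    return {name for name, pattern in PATTERNS.items() if pattern.search(text)}

print(classify("applicant SSN: 123-45-6789"))  # → {'ssn'}
```

The point is the iteration model: a crude scanner like this maps and gates the riskiest data on day one, while the broader, multi-year classification effort proceeds in parallel.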

Jason White: Makes sense, appreciate that. So, Kent, one of the defined requirements in section three is that agencies begin adopting a Zero Trust architecture. There's a great deal of focus and movement around Zero Trust, not just within the government, but in the industry at large. What do you see as the problems or challenges confronting agencies when it comes to effectively migrating networks to a Zero Trust architecture?

Kent Landfield: Well, first, education. There's a real educational component needed. While some agencies have been moving in this direction for a while, others are just now seriously looking at it. The NIST SP 800-207 Zero Trust Architecture document is a year old this month, and the DOD Zero Trust Reference Architecture document was published last February. There is a good deal of hype around Zero Trust, and it needs to be understood that Zero Trust is a security model, a set of design principles combined with a coordinated cybersecurity and system management strategy, all based on the assumption that your network has already been compromised, that the breach has occurred. Second, planning. There is a great deal of planning that will be needed up front to assure the success of the network's migration to a Zero Trust architecture, and it's important to understand what needs to be done in order to have a successful migration. For example: What is the critical data? What are the required workflows? What's the desired outcome? What analytics are needed for better visibility? How is the granularity of access going to be managed? How do you architect to leverage the infrastructure to allow for rapid response to suspected events? There are a lot of additional questions that need to be addressed by the organization as part of the planning, and planning up front is critical to the organization's success. Third, cost. The EO makes it clear that the administration wants all agencies to drive toward the use of Zero Trust, and from a modernization perspective, Zero Trust is the right way to go, but it will not come cheaply. Reengineering an agency to support a Zero Trust architecture will be expensive and extremely time consuming. And then one of the bigger pieces that I think is most important, but sometimes hard for large organizations to consistently succeed at, is the follow-through.
Zero Trust implementations have to be followed through on. You cannot adopt pieces and parts of Zero Trust principles and expect to get any real value out of the Zero Trust architecture. Zero Trust concepts and principles must be ubiquitous across the entire network. Additionally, the organization itself needs to commit fully, from executive leadership to operations, in order to be successful and see the value that was intended by implementing a Zero Trust architecture and network. So let's be clear: the EO has set agencies on a five-year journey to adopt Zero Trust in order to better secure critical data, while providing more resilience within federal hybrid architectures, seamlessly integrating both on-prem and cloud-based capabilities.

Jason White: Yeah, I mean, obviously, in giving them five years, they understand that it's going to be no small feat, and hopefully the appropriate funding and resources are aligned to aid them as they move forward in that journey. Absolutely. So as you know, CISA was tasked by the EO with developing a federal cloud security strategy to steer agency adoption of Zero Trust. Do you see this as an opportunity for CISA to revisit NIST 800-207 and potentially blend that document with the more recent DOD reference architecture that was released earlier this year? Or do you think they're more likely to design that cloud security strategy using 800-207 migration principles? And what role should data play in that strategy?

Kent Landfield: Well, first off, data should be a central part of that strategy. Interestingly, this is occurring at a time when CISA is losing visibility into what is occurring on federal networks, as federal computing moves away from the traditional on-prem model and more and more to a cloud-based approach. To address that, CISA, working with OMB and FedRAMP, is developing the Cloud Security Technical Reference Architecture, due August 12, which, as you mentioned, is expected to address cloud security migration considerations as well as data protection requirements for reporting and visibility. Protecting data is critical, and it needs to be a foundational capability in this cloud reference architecture. We expect to see Zero Trust concepts addressed in the architecture's documentation. Time will tell to what extent the existing architecture guidance from DOD and NIST will be incorporated, and how. Don't expect the CISA reference architecture to be done and final when it's released. This architecture documentation has actually been in process for a while; it started late last year. But it's doubtful that what is released on August 12 will be a static document. It's expected that this will be just the first release, with subsequent updates as the federal cloud security reference architecture needs to evolve and more lessons are learned. The reality is the EO is needed, but highly reactive. One of the things that we found with the EO was that it had a purpose. Hopefully the items outlined in the EO will be followed up on in subsequent administrations, and we can be reassured the steps being taken today will allow agencies to get into a proactive stance and, finally, off of their heels.

Jason White: Yeah, obviously, that would be ideal, right? I think we've seen with recent attacks, especially as you look at how they've targeted the cloud, how they've been able to expose weaknesses in some of the cloud design that aided their efforts in exploiting the customer. It certainly bodes well if we put the right amount of effort and follow-up into this to make sure we're being more proactive with how we're protecting our customers and, ultimately, our enterprises. So, Sadik, it seems the industry focus for Zero Trust has in large part traditionally been on network access for remote users, applying contextual access to user and device connections prior to allowing access to applications or network segments, much in the way traditional NAC solutions function. What appears to be missing from the traditional ZTNA approach, however, is putting data at the center. Last week, McAfee Enterprise announced they will be entering the ZTNA space with MVISION Private Access. Can you explain why MVISION Private Access will be a game changer in the ZTNA market?

Sadik Al-Abdulla: Sure, Jason. I would say you're spot on about the traditional ZTNA approaches. And in fact, you can look at the why and say that there was a significant opportunity to iterate on the traditional concepts of VPN. Access itself was a massive problem that needed to be solved, particularly in this cloud and COVID world. It presented a tremendous opportunity for the vendors that were in the space, and it was also a tremendous pot of gold, so it became all about access and optimization. When the whole world was struggling with throughput and capacity, and even performance and optimization, the ability to publish application access became transformative, because it optimized all of the other direct-to-cloud connections for those users, while redirecting capacity or infrastructure spend to deliver greater cloud security. As a result of that, it's no wonder that the vendors focused first on access, which, as you think about the subtext of your own question, is the inherent fallacy, right? The reason that we're taking on this more sophisticated approach is because of those Zero Trust principles: that we must always verify, not just the user, not just the device, not just the posture, not just the source, not just the context, and then bring that into continuous authentication. Why are we doing that in the first place? We're doing it for data access. So why will MVISION Private Access be a game changer? Fundamentally, because it is not only built with an extremely mature and sophisticated data protection engine in the fabric of the solution from day zero, but that engine is merged and unified with the rest of our Unified Cloud Edge: the same engines, the same policies, the same objects that apply to data protection concepts in Private Access and ZTNA apply to general web browsing and to posting, upload and download behaviors inside of cloud services. The entire framework of data protection becomes multi-vector.
So agencies won't need to implement the same controls in different systems: once in their web security for Internet access, once in their cloud security for the behavior of cloud services, once in their ZTNA for the behavior of private application access. Across those assets, you know, front end, middleware, back end, the policy objects and the policies themselves become fully unified. And when you think about it, data protection is a user and a data problem; the vector is almost irrelevant. And so when you have a feature gap between what different vectors can solve, or you have a reporting gap between what the different vectors are telling you, you introduce a systemic-level challenge. So MVISION Private Access is fundamentally differentiated because of that unification from day zero.
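[Editor's note: the "define once, enforce everywhere" idea described above can be sketched as a single policy object whose decision logic is identical for every vector. This is an illustrative sketch only; the class and field names are assumptions, not the MVISION Private Access API:]

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DlpPolicy:
    """One data protection policy shared by every enforcement vector."""
    name: str
    blocked_classifications: frozenset

    def evaluate(self, vector: str, classification: str) -> str:
        # The verdict depends only on the data classification; the
        # vector is recorded for telemetry, never used to weaken it.
        verdict = "block" if classification in self.blocked_classifications else "allow"
        return f"{vector}:{verdict}"

# The same object governs web, cloud-service, and private-app traffic.
policy = DlpPolicy("pii-exfil", frozenset({"ssn", "credit_card"}))
print(policy.evaluate("web", "ssn"))      # → web:block
print(policy.evaluate("ztna", "public"))  # → ztna:allow
```

With one policy object, there is by construction no feature gap or reporting gap between vectors, which is the systemic challenge disparate systems introduce.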

Jason White: Yeah, that's great. I know one of the biggest challenges that I've seen when I've dealt with customers in the past is actually quantifying their data, which is what we talked about earlier in this discussion. But the other is: okay, now that I've actually identified what it is, how do I effectively make sure that I've got my bases covered across multiple vectors? And if you've got to do that across multiple solutions, I can imagine the administrative burden and just the dysfunction that might create as you're trying to effectively control and respond to incidents in your environment.

Sadik Al-Abdulla: Think about it, Jason. Whether it's an accidental issue from an employee's behavior, or whether it's a malicious actor trying to exfiltrate: if emailing it doesn't work, they'll go upload it; if uploading it doesn't work, they'll share it from a cloud service; if sharing it from a cloud service doesn't work, they'll try to tunnel it out in other ways. If you've got disparate systems with disparate controls, you may not even have visibility into all of the vectors, much less a unified set of telemetry.

Jason White: Sure, absolutely. So, you know, Zero Trust is obviously more than a single product requirement, right? It is an architecture, and, you kind of hit the point there a second ago, it's really a broader telemetry architecture that is providing persistent evaluation of both user and entity trust. How will MVISION Private Access integrate with the rest of the McAfee portfolio to drive Zero Trust outcomes?

Sadik Al-Abdulla: So the short answer is that it falls into the MVISION family of products as a unified product with a unified set of outcomes. So whether an agency consumes only MVISION Private Access, where they can take advantage of the strengths on data protection, or they start to look at the rest of the architecture, they inherit the synergies. So how will it integrate? I would say "integrate" is probably the wrong word, Jason. Integrate implies that it's something that has to be set up and maintained, and typically when software vendors talk about integration, they're talking about work that their customers have to perform. I prefer the word convergence. The product is simply unified. I'll say "out of the box," although that's a terrible analogy for a cloud service.

Jason White: No, I appreciate that. So really, thank you both for your time today; that's really all the questions that I had. I feel like it was a really great conversation, both from a capabilities understanding as well as, you know, what's really driving a lot of the initiatives behind this Executive Order for our customers. So, for those of you who've been listening to this, if you found it valuable, I'd encourage you to go view the other entries that we've produced in this particular series. Our next release is going to focus on the release of CISA's federal cloud security strategy, so stay tuned for more information on that, as I believe that's supposed to be released on August 12, and we're recording this on August 11. Other than that, everyone stay safe, and thank you all very much for listening. And thank you both again, Kent and Sadik, for your time today.

Kent Landfield: Quite welcome. 

Sadik Al-Abdulla: Thank you.

Speaker 1: I'd like to take the time to thank our speakers for joining us today. If you'd like to learn more, please visit www.mcafee.com/publicsector. If anyone has any follow up questions, feel free to reach out to McAfeeMarketing@carahsoft.com. Thank you for listening in and have a great day.