CarahCast: Podcasts on Technology in the Public Sector

Accelerating CI/CD Cycles with Programmable Data Infrastructure with Delphix

Episode Summary

In this podcast, Aaron Jensen, Senior Solutions Engineer at Delphix, and Arif Hajee, Principal Solutions Engineer at Delphix, explain how a programmable data infrastructure ensures test data can keep pace with high-velocity development and improves development productivity.

Episode Transcription

Speaker 1: On behalf of Delphix and Carahsoft, we would like to welcome you to today's podcast, focused on accelerating CI/CD cycles with programmable data infrastructure, where Aaron Jensen, senior solutions engineer at Delphix, and Arif Hajee, principal solutions engineer at Delphix, will discuss how a programmable data infrastructure ensures that test data can keep pace with high-velocity development and improves development productivity.

Aaron Jensen: Hello, everyone. Welcome to our webinar on accelerating CI/CD cycles with programmable data infrastructure. I am Aaron Jensen, senior solutions engineering manager at Delphix. I've been with Delphix for a couple of years, and before that I was a customer: I implemented Delphix and saw a pretty transformational change to our organization when I put it in place. So I decided to join Delphix and help other customers and companies see the value, and I am grateful to be here today. I'm joined by Arif. Arif, do you want to introduce yourself?

Arif Hajee: Thanks, Aaron. Arif Hajee, principal solutions engineer here at Delphix. I come from the application development and data space, going back 20 years, so I've gone through the cycles from waterfall to agile to CI/CD. Looking forward to today.

Aaron Jensen: So here's our agenda for today, what we want to dive through and cover. We'll start out by talking about CI/CD benefits and some of the challenges that companies face making the move to CI/CD. More specifically, we'll dive into data bottlenecks and how those bottlenecks are holding back your CI/CD pipelines and your development speed. We'll talk about how Delphix addresses those challenges and how we can help you accelerate your development speed and quality using Delphix. We'll talk about how we add data agility to app dev teams, give you a little more on the complete benefits for DevOps teams and what you get by implementing Delphix. We'll take Delphix under the hood and give a little bit of an overview of the screens and how it works. And then we'll close out with some customer use cases and examples of how customers have leveraged Delphix in the wild. With that, I will dive in. You have probably seen this before. This is from the State of DevOps report, and it's what I call the value-of-DevOps slide, which explains the difference between companies that have made the investment, gone through the learning curve, and are mature in their DevOps processes, compared to those that are not. The real benefits here: mature organizations deploy 208 times more frequently, they are 106 times faster in their lead time from commit to deploy, they recover from incidents 2,604 times faster, and they have a seven times lower change failure rate; changes are one-seventh as likely to fail when they're using DevOps. And again, it's all about breaking down those dev cycles and the coding into more finite deliverables.
And those finite deliverables, those finite deployments, are much more likely to succeed because there's not a whole bunch of interdependency bundled up. It's not the case that the entire IT department is waiting for this one massive deployment to happen twice a year, with all the potential problems that brings. It's really about the cycle time of feeding through independent changes and managing change more rapidly. One of the big challenges to doing this is the data, the test beds, and the cycle. We have companies that are in a one-year release cycle, where there are these big development pushes, then huge testing pushes that can go on for weeks and weeks, and then deployment testing, the lockdown of everything, and the deployment. Over the years we've matured to agile, which says, hey, let's do these two-week or one-month or two-month sprints and make our changes smaller. But that requires more test beds and more agility in how you work those releases through your pipeline. And then with CI/CD, we're really talking about accelerating that to almost a release per backlog item, or a release per change, with your teams much more finite in each of those releases. Now, the key to that is you've got to be able to automate regression testing; you've got to be able to automate just about everything you can through that pipeline so you can make it happen. And what we've discovered is that one of the biggest challenges to automating all of this is data.
This slide, I think, sums it up well. When it comes to building pipelines and automating your code development all the way through the pipeline, we see lots of tooling and support around infrastructure automation, around managing your environments and your configurations and pushing all that through. We see lots of tooling and support around code automation: managing branching, merging, and testing all of that code. And then when it comes to data and databases, we see an old squeaky caboose on the end of this train. Most of our customers that are in this CI/CD transformation process tell us that data is a big weight slowing them down. They're refreshing databases manually; they're opening tickets and waiting for DBAs to do restores and refreshes. It's literally slowing down the entire process, and that's really what we want to tackle today. So these data bottlenecks are one of the big things holding back your pipelines and holding back the speed of development; there's a ton of weight and waste in it. And it's because data environments are complicated. We all know there are multiple databases that have to be synchronized; our applications are talking across different databases all across the environment, or the enterprise. You can't just go make a copy, do a restore, and let everything go; there's a lot of interconnectedness. Because of this, data for test environments is generally not refreshed, it's slow, and the quality is poor. And data is sensitive, right? So you've got the security team saying, hey, we've got to have a bunch of process on this. We want to make sure we're not taking our sensitive data and letting it get released into non-prod zones, where developers are importing code from who knows where and opening up who knows what kind of problems.
And so there's weight and gravity to the data, and the ability to deploy that data is impacting us. The result, because data has this weight and complexity, is that most teams are using shared test environments, which means they're wasting time on coordinating efforts. My favorite is the email that goes out: hey, everyone get out of QA, we've got to refresh the environment. Or the developer who makes a mistake and blows up a development database, and now there are 10 developers yelling about why you did that, and everyone has to wait for a restore. These shared environments are waste, cause slowness, and add that gravity to our systems: waiting for environments to be provisioned and refreshed. If the refresh time is hours and hours, then we're refreshing less frequently, and while we're refreshing, we're waiting for that data. When I was at Deloitte, as an example, if a developer needed a database refreshed, eight hours was a good day. That's eight hours of a developer waiting, being non-productive, waiting for a database to be refreshed if they needed that to happen in cycle. So we don't refresh as often, and it's painful. We have sensitive data leaking into the non-prod environment; that's a common issue. The data takes forever to get restored through the regular processes, so people open up backdoors: a business owner will come to a developer and say, hey, can you make a tweak to this report? Let me just give you an export from prod so you can work on it. And so we have sensitive data leaking. And the time spent manually performing DBA operations is waste and wait; there's a lot of time DBAs spend restoring non-prod environments, and that's time that's just wasted. And then there's creating fake data, because these data environments are expensive and take time to provision.
A lot of times it's, well, I'll just make some fake data, and then I'm not testing all the use cases. All of this results in bugs being caught late in the cycle. If you've got a lot of bugs being found in integration and performance testing, code bugs and data bugs, it's probably because the development and test teams don't have fresh data, and they're doing the best they can with the data they've got. So let's introduce Delphix and what we do to solve and address these challenges. I want to start with this quote from Rudy Gonzales, the managing director of Unisys, who said, "Data is a big hole in CI/CD pipelines without Delphix. Delphix is like the magic sauce for data." I want to give an overview of why that is and what Delphix does to solve this. Delphix has a unique paradigm that changes thinking in the delivery of data. Without Delphix, generally speaking, if you want to create a pipeline with dev, test, and staging, and have that pipeline feeding your code into prod, you are taking a production database and restoring three copies of it. Let's say this is a 10-terabyte production database. Without Delphix, you're making a 10-terabyte copy for dev, a 10-terabyte copy for test, and a 10-terabyte copy for stage. That backup-and-restore process takes, at best, hours, maybe longer; there's a lot of extra time and resource needed to do this. Delphix changes that paradigm and says we don't need to create full copies of these databases. We can ingest a copy of the database into Delphix, and while it's in Delphix, we can profile and mask that data, which means we can change the data in a way that preserves the dev and test value of the data but removes the sensitivity.
So we're not randomly scrambling data. We're replacing names with names, social security numbers with numbers that look like social security numbers, even embedding business logic into that process: changing dates so that dates preserve their value but can be changed. We make the data so that you have a nice full data set that matches the production weight, has the production variability, and doesn't have the sensitivity of production, so that if someone were to steal that data, it has zero street value. Once we have that data in Delphix, masked and ready to go, we can create virtual copies of these databases, and these virtual copies provision in minutes and take up virtually no storage. So I can give each developer their own copy of this database, and I don't need 10 terabytes for every developer. I only need 10 terabytes once, to bring it into Delphix, and I can make 10, 20, 30 copies of that data. The database will act and behave just like a physical database, but it's a virtual database: it provisions in minutes, and it gives the dev and test teams the ability to have that data rapidly and work with it. So the benefits are: each developer can have their own copy to work with, no more stepping on each other's toes, no more "everybody get out of the environment." This can facilitate ephemeral test environments: the ability to actually have ephemeral environments with your data, where a developer comes into work, does their work during the day, and at the end of the day pushes a button and the database is destroyed. The next morning, they come in, push a button, and the database is created; everything is there in the state they left it the night before.
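[Editor's note] Aaron's point about masking, replacing names with names and social security numbers with realistic-looking numbers, can be sketched in a few lines. This is not Delphix's actual masking engine, just a minimal illustration of deterministic, format-preserving substitution; the functions, seed string, and substitution list are all hypothetical:

```python
import hashlib
import random

SEED = "masking-key"  # hypothetical secret; real masking tools manage this

def mask_ssn(ssn: str) -> str:
    """Replace an SSN with a deterministic fake that keeps the
    XXX-XX-XXXX shape, so downstream code still parses it while the
    real number is gone."""
    rng = random.Random(hashlib.sha256((SEED + ssn).encode()).hexdigest())
    digits = "".join(str(rng.randint(0, 9)) for _ in range(9))
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

def mask_name(name: str) -> str:
    """Swap a real name for one from a substitution list, chosen
    deterministically so the same input always maps to the same fake."""
    fakes = ["Alice Rivera", "Bo Chen", "Dana Okafor"]  # illustrative list
    rng = random.Random(hashlib.sha256((SEED + name).encode()).hexdigest())
    return rng.choice(fakes)

print(mask_ssn("123-45-6789"))   # same shape, sensitive digits replaced
print(mask_name("John Smith"))   # a plausible fake, not the real name
```

Because the substitution is deterministic, the same input always maps to the same fake, which keeps joins and foreign keys consistent across masked tables, one reason masked data keeps its dev and test value.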
It allows you to go parallel. You can use APIs to provision the databases in minutes and provision all the data environments in minutes. This leads to a tremendous shift left with production-quality data: using production-quality data means developers have the full data set, and when they test their code, they're testing against a good data set that has all the variability of production, not some fabricated data set that's missing use cases and doesn't have the things that went in last night. You can re-baseline your databases the same way you re-baseline your code. So again, a quick API call and I can re-baseline my database. Now, as a developer, I'm doing trial prod deployments as my unit test, and any time I refresh the database, I'm getting the schema changes, the objects, everything that went into prod last night; I now have it in my environment as I'm doing my coding. So I can test my code out, make sure it works, refresh my database in a matter of minutes, and test it again. And finally, we can identify all that sensitive data, remove it, and provide these good data sets that are secure and easy to use in non-prod. So with that, I'm going to turn it over to Arif to dive more into the details of Delphix and how we add value to your DevOps pipelines.
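[Editor's note] The "quick API call to re-baseline" idea looks roughly like this from a script's point of view. The endpoint path and payload fields below are invented for illustration; the real Delphix API differs, so treat this as a sketch of the pattern rather than the product's interface:

```python
import json

def build_refresh_request(engine_host: str, vdb_name: str) -> dict:
    """Assemble the HTTP call a pipeline step would make to re-baseline
    a virtual database to the latest masked production snapshot.
    The path and payload fields are illustrative placeholders."""
    return {
        "method": "POST",
        "url": f"https://{engine_host}/api/vdb/{vdb_name}/refresh",
        "body": json.dumps({"point": "LATEST_SNAPSHOT"}),
    }

req = build_refresh_request("delphix-engine.example.com", "dev-copy-01")
# A CI step would now send this with any HTTP client and poll the job.
print(req["method"], req["url"])
```

The point is that a refresh collapses to one scriptable call, which is what lets a developer re-baseline between unit-test runs in minutes instead of filing a DBA ticket.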

Arif Hajee: Thanks, Aaron. So, as Aaron just mentioned, from a high-level architecture standpoint of how Delphix really works, the key is helping drive that agility from a data perspective. A lot of organizations have been moving down this agile development path over the last few years, trying to transition, like Aaron mentioned on the previous slide, from waterfall to agile, and from agile to more continuous integration and continuous delivery. But with the data being that bottleneck, making the transition to true full agility is difficult and costly. On teams I've been on before, we've done exactly what we're depicting here. We'd have a single development environment, so the concept of unit testing was, all right, it compiles, let's move it up to test, because we didn't have a full set of data, or we had a subset of data. So finding bugs early on didn't really happen. Or, in other scenarios, we would have environments where each team had their own dev database with full copies of data, and the challenge there was that the infrastructure itself was very costly. Like Aaron mentioned, every time you want to take that 10-terabyte database and make a copy in dev, and test, and stage, or in these other environments, there's infrastructure behind it that needs to be spun up to support that: not only the underlying infrastructure cost, but also the operational cost of DBAs doing their cycles and things like that. So, with Delphix.
Again, the way we drive a little bit more agility from a data standpoint allows you to start working things in parallel. Because we have the ability to virtualize individual databases and give teams or individuals access to these virtual databases, you're removing those bottlenecks of having to synchronize specific steps within your cycles, and you allow developers, testers, or anybody on the application team to work in parallel. It also gives them the ability to start swapping data sets in and out, which is where test data management comes in. Delphix allows developers and testers to treat data like code. One of the challenges we ran into early on from an application development standpoint was, okay, our data is stale. With Delphix, because we have this production stream of data always coming in, we can always refresh our virtual copy at any time. Whether that's a ticket to a DBA saying, hey, can you refresh my data, and they click one button in Delphix and do it; or I do it from a self-service standpoint, going into an interface and saying, let me refresh my data sets; or I tie it into my automation tools and automatically have my data refreshed on whatever scheduled basis: all of those things are possible with the same platform. Other things that come into play are the ability to bookmark, rewind, and branch data sets. Think of the context of integration testing: I want to take all of my data sets and keep them in sync. So I take a snapshot, a bookmark, at 9 a.m. this morning, run through a series of tests, and then have the ability to rewind to that bookmark point in time across not just one data set, but all my data sets at the exact same point in time.
And then, from a branching perspective, just like you would start a new branch from a code perspective, you could take that same exact version of data at that point in time, branch it off, and start another parallel path as well. What that eventually leads to is greater efficiency, essentially accelerating these application development cycles. As far as how Delphix works as a platform: on the left-hand side you've got all your different data sources, from your traditional relational sources to NoSQL sources, even file-based systems. We have the ability to virtualize all of these different systems and bring them into our engine. From a data preparation standpoint, Aaron touched on the ability to mask and secure this data, so that as we move this data downstream into our lower environments, you're ensuring you're always getting secure, compliant data without the risk of exposing any sensitive information down there. You also have the ability at this point to add in synthetic data. There are a lot of third parties out there, and we partner with some of them, that generate data. So if you need data that doesn't exist yet, generated to support some new application development effort, you could bring that in as a data set too. The ultimate goal is that you're building a library of test data sets, and what that allows you to do, as you go through your development and testing cycles, is develop and test against a wider variety of use cases, and do all of this in an automated fashion. Imagine the use case where, as a QA engineer, you're not only testing a series of use cases, but testing those exact same use cases across a variety of different sets of data.
So from full production data, to masked data, to synthetic data, you could run all of those in an automated fashion, and, like Aaron mentioned, all in an ephemeral fashion. As we spin up an environment, we can run our tests, spin it down, and spin up another one; we can do it synchronously or in parallel, with a wide variety of options there. So again, the ultimate goal, like we've been talking about, is bringing agility and efficiency to your processes. From a testing perspective, whether you're a little more mature from a CI/CD standpoint and actually doing true automated testing, or you're still doing some manual effort, the key behind all of it is the data piece. Without Delphix, if you were to take a data set and run a series of tests on it, and those tests did anything to the data, altering or changing it in any way, you'd have to put in a ticket to a DBA. They'd go through that restore process, and once the data was reset back to a certain point, then you could run the next test. This takes time and effort, and there's a lot of time wasted waiting on different components within the organization for that to happen. With Delphix, using our bookmarking and rewind capabilities, as an end user you can easily do that manually, or, as we push down the automated path, you have the ability to script all of it: you run through X number of automated tests, bookmarking and rewinding all along the way, or spinning up and spinning down ephemeral environments to accomplish those tests, all in a matter of minutes.
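[Editor's note] The bookmark-run-rewind loop Arif describes can be mimicked with a toy in-memory data set. Real Delphix does this with copy-on-write storage rather than deep copies, so the class below is purely illustrative of the harness pattern, not the product:

```python
import copy

class DataSet:
    """Toy stand-in for a virtual database supporting bookmark/rewind."""
    def __init__(self, rows):
        self.rows = rows
        self._bookmarks = {}

    def bookmark(self, name):
        # Capture the current state under a named bookmark.
        self._bookmarks[name] = copy.deepcopy(self.rows)

    def rewind(self, name):
        # Restore the state captured at that bookmark.
        self.rows = copy.deepcopy(self._bookmarks[name])

def destructive_test(ds):
    ds.rows.clear()           # the test wipes the data...
    return len(ds.rows) == 0  # ...and verifies the wipe worked

ds = DataSet(rows=[{"id": 1}, {"id": 2}])
ds.bookmark("9am")
for run in range(3):          # run the same destructive test three times
    assert destructive_test(ds)
    ds.rewind("9am")          # data restored between runs
print(len(ds.rows))  # 2: back to the bookmarked state
```

The same destructive test runs repeatedly against identical data, which is exactly what manual DBA restores make slow and what a scripted rewind makes nearly free.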
So, to highlight what we've touched on about how Delphix can bring agility to your application development process: obviously, the creation and refreshing of environments is key. Instead of waiting hours to days to weeks to get an environment spun up, with Delphix you can do it in a matter of minutes. We talked about integration testing: the ability to synchronously bookmark across multiple different data sets and file systems all at once. So as we run true integration testing, where an application is touching not just one data source but many data sources downstream, we're automatically bookmarking and have the ability to rewind all of them back to the same exact point in time and continuously run through that process. The same goes for destructive testing. As you run a series of tests that modify, destroy, or do something to certain records within specific data sets, you can always rewind, run through another series of tests, rewind, and run through another series again. Based on the ephemeral concept, you can do these all in parallel, too, and shrink the amount of time it actually takes to test a specific piece of code or a specific function or feature within your application. And then the last piece, from a production issue standpoint: simply by ingesting that production data into Delphix, you now have some added benefits on the production operations side of the house, like point-in-time environments. Typically, as an issue is detected within your production systems, someone notifies the DBAs, and the DBAs take a current snapshot of the data environments and spin it up into another area.
So developers, testers, people have an area to start detecting and troubleshooting what the issue is. With Delphix, DBAs can either come in and say, hey, we already have that data, let me automatically spin that up; or, in an automated fashion, we have the ability to hook into APM-based tools like AppDynamics and things like that, so that as issues are detected, we can automatically spin up these point-in-time environments and start the issue detection and resolution process much faster, essentially shrinking the amount of time that you're down in production. With that, as we talk about agility from an app dev standpoint: a lot of organizations' data environments are growing. We have organizations whose data footprints are growing 10, 20, even 100 percent on an annual basis. To support that on premises means additional infrastructure cost: servers, environments, all that kind of stuff behind the scenes. So a lot of customers are migrating toward some type of hybrid cloud strategy to enable development and testing. They'll still keep their production environments on premises, but: how do we become more agile? How do we spin up environments as needed in different cloud-based sources? Delphix has done this for quite a few customers and actually helps you accelerate it as well. Like Aaron mentioned, once we ingest the data from your production systems and mask that data, you can simply spin up a Delphix engine in any of the supported cloud environments, and Delphix will automatically replicate only the changed data between the two Delphix instances, so you're not pushing entire data sets over the network every single time you need to do a refresh. Once that data is sitting there in the Delphix environment.
Now the same concepts apply that we've been talking about today: spinning up new data sets in AWS, or Azure, or GCP, or wherever those may be. You could do that with a couple of button clicks or, again, add it into your tool chains today to automatically say, hey, let me provision not only a database but, as we start getting into the infrastructure management side, configure the entire stack and have Delphix be the data component of that as well. And lastly, when we talk about multi-cloud environments, having Delphix set up as a replication engine in one or many clouds allows you to (a) reduce vendor lock-in and (b) test and develop across multiple different platforms as well. Depending on what your internal strategy is, today it may be AWS, tomorrow it may be Google, or it may be split across multiple cloud vendors. What Delphix allows you to do is reduce the amount of time it takes to move data from one site to another. And then, surrounding the whole thing, is secure, compliant data as well.

Aaron Jensen: Hey, thanks, Arif. So now I want to shift gears a little bit and talk about some of the real, tangible financial benefits of deploying Delphix and accelerating your pipelines. The next few slides are from a study that IDC did. They went out and analyzed a number of our customers, looking at before Delphix and after Delphix and trying to understand our customers, and they produced a study that gives us some insight into the value of Delphix, the value of automating delivery of your data, and where that value comes from. The first one here is really about shift left. Everyone talks about how we've got to shift left, and there's a pretty dramatic impact on shift left with Delphix. The reason is that you're enabling those dev and test teams to have re-baselined data in minutes, and therefore developers and testers code faster and find bugs faster. They're running unit tests against production-like data, and they're able to wring out those bugs much sooner in the pipeline. That, of course, has significant impact: every downstream step a bug passes before it's discovered exponentially increases the cost of remediating it. And nothing is worse for developers than working on something and having the code they pushed two weeks ago come back, having to figure out, okay, what was I thinking when I did this, context-switch, dive back in, and work out what went wrong, only to find out it was because the data they tested on didn't match production, or someone put a break fix into production last week and it's colliding with their change. They're not able to be efficient, and they have to do this context switching back and forth.
So shifting all of this left, giving developers the ability to find bugs and issues early, is a huge benefit to organizations, and it goes without saying that it's a big benefit on costs. In this case, IDC found that for customers that have Delphix, there's a 93% decrease in the amount of time it takes to create development environments and a 91% decrease in the amount of time it takes to refresh those environments. There's a 4% decrease in the amount of time it takes to code, because developers can code faster, not worrying that if they blow this database up, they'll bring it down for five people; they can try, fail, and retry. There's a 39% decrease in testing time, a 26% decrease in integration time, and a 24% decrease in deployment time, coming to a net 40% decrease in the time to get code through the pipeline. That's significant. It translates to 40% more throughput from your development teams if you're using Delphix: 40% more code. Or think about it as adding 40% more developers to your staff without adding salary. And finally, the production benefits for application development teams: a 30% decrease in development time per new application, a 40% decrease in development time per new feature or upgrade, and a 33% increase in the average number of releases per application per year. So you're getting a 33% increase in the number of releases, and a 26% increase in the number of new features they're able to get into those releases: more frequent releases, with more features in them. Again, this is like going to the business and saying, hey, we can get you 26% more features and do it 33% faster, using Delphix. So now we'll switch gears again, and Arif is going to give us a little bit of an under-the-hood look at Delphix.

Arif Hajee: Thanks, Aaron. Yeah, so up until now we've gone through what Delphix is, how it's being used, why you should use it from a use case perspective, and some of the numbers behind it. What we thought we'd do here is give you a glimpse into the actual product itself. We offer a couple of different interfaces based on roles, and everything we're going to talk about here is API-driven as well. As a data administrator logging into Delphix, you'd come into a single interface that gives you the health of the system: how much data is being used, what the I/O looks like, all those different kinds of things, and then the current state of all your virtual databases themselves. When was the last snapshot taken? Are there any pending actions going on? Any action that can be taken here, you can also expose from a self-service standpoint to your developers and testers. Using the same set of underlying APIs, we have what we call our self-service interface, a little more simplified, which allows your developers, testers, and QA engineers to visualize the state their current data containers are in. You'll see across the top the ability to bookmark and branch; they can share those bookmarks as well, and at any point, if they need to refresh or restore data back to a specific point in time, they have all those self-service features available to them. And like I said, everything we're doing here is API-driven, so to fully automate all of these different tasks, taking those APIs and dropping them into, for example, a Jenkins pipeline is very simple.
Here's an example of a simple Jenkins pipeline going in and creating a virtual database, provisioning a self-service template, and then delivering that out to a specific environment. Everything we're doing there is driven through the same exact APIs that our interfaces are built on top of. Great, thanks.
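[Editor's note] A sketch of what such a pipeline stage does, provision an ephemeral virtual database, run the tests against it, and tear it down, can be outlined in Python with a stub client standing in for the real API calls. The class, method names, and template name are invented for illustration:

```python
class FakeDelphixClient:
    """Stub in place of real API calls, so the pipeline shape is
    runnable here; the operations it mimics are illustrative only."""
    def __init__(self):
        self.live = set()

    def provision(self, name, template):
        self.live.add(name)      # would POST a provision request
        return name

    def destroy(self, name):
        self.live.discard(name)  # would DELETE the virtual database

def run_stage(client, suite):
    """One CI stage: private disposable database per test run."""
    vdb = client.provision("ci-vdb-1234", template="masked-gold-copy")
    try:
        return suite(vdb)        # tests get their own data environment
    finally:
        client.destroy(vdb)      # environment is ephemeral, always cleaned up

client = FakeDelphixClient()
result = run_stage(client, suite=lambda vdb: f"tests passed on {vdb}")
print(result)
print(len(client.live))  # 0 (nothing left behind)
```

The try/finally mirrors what a Jenkins post-stage cleanup step does: the virtual database is destroyed whether the tests pass or fail, which is what makes parallel ephemeral environments cheap to run.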

Aaron Jensen: So I'm going to dive into three use cases here, customer scenarios where they've actually deployed Delphix, with specific examples across these three scenarios. First, GAIN Capital. They deployed Delphix and saw significant increases in their throughput and decreases in their cycle times: a 75% decrease in cycle times and a 20% increase in their output, which they directly attributed to having rapid data provisioned by Delphix. Provisioning databases in three minutes instead of hours has a significant impact on the business, and in this case it helped them significantly increase their development speed. Delta Dental, another example, saw a significant increase in revenue because they were able to accelerate their releases using the self-service interface of Delphix for 200 developers and testers in their AWS environments. Developers and testers can now self-service their data: they can refresh it, they can bookmark it, and they can do all of this without waiting for DBAs and those processes. It resulted in a significant increase that they directly attributed to their ability to capture more revenue, because they're getting features out ahead of the competition and being more agile with their business features and what they need to do. And finally, StubHub; this is one of our best use cases, a direct increase in CI/CD speed. StubHub was doing five releases a year before Delphix, and most of the reason for that was that the data was just complex. Speeding up the recycle time would have required many, many parallel environments, and the cost was just overwhelming. With Delphix, they were able to go parallel, and it really helped them make the shift to releasing each backlog item as an individual release.
And now they are in the range of 10,000 releases a year, which they again credit with a significant increase in their revenue, from 1 billion to 4 billion over three years, saying, look, we're able to stay ahead of the competition and release business value faster, and that is resulting in a significant increase of value to the business. So that's the overview of Delphix. We hope it explains how you can go faster and how you can get your data to stop being the weight that's slowing you down and slowing down your cycles.

Speaker 1: Thanks for listening. If you'd like more information on how Carahsoft or Delphix can assist your institution, please visit www.carahsoft.com/delphix or email us at delphix@carahsoft.com. Thanks again for listening and have a great day.