Enterprises following a cloud-first approach build in the cloud before turning to other options. This works well because a vast array of cloud services is available, and connecting those services is typically quick and painless. The approach can also be rolled out use case by use case, since all the individual components are readily available.
This is especially effective, and a recommended DevOps approach, when the development teams are cross-functional.
This approach also enables quick testing, since the assets are all available. Each time a component is hooked in, it can be tested as a small unit and then built into the larger solution. There is no need to involve a larger team, since the individual cloud services all expose tested interfaces.
But what if some of the needed services come from core or legacy systems that are not conveniently accessible from the cloud? For example, suppose you need to read or modify data that lives on a mainframe.
First, check whether that system is scheduled to be migrated to the cloud and, if so, whether the timing meets your needs. If it does, work with the migration team to understand what is being built and how clean interfaces will come together.
If the migration plan doesn't meet your needs, you need a way to access the assets directly. The key, however, is not to abandon the cloud-first strategy. The same digital team needs direct and seamless access through cloud-native services. The trick is to automate all of this and make those core assets available in a repository, as cloud-native assets the team can use. To learn more, please take a look at our whitepaper on the topic of DevOps with Legacy.
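One way to picture this pattern is a thin cloud-native facade that the digital team calls, while the legacy access itself is hidden behind a gateway. The sketch below is a minimal illustration of that idea; the names (`LegacyGateway`, `AssetRepository`) and the stubbed fetch are hypothetical placeholders, not a specific product API.

```python
# Minimal sketch: expose legacy assets through a cloud-native facade.
# LegacyGateway and AssetRepository are hypothetical names for illustration.

from dataclasses import dataclass


@dataclass
class Asset:
    name: str
    payload: str


class LegacyGateway:
    """Stands in for an API layer in front of the mainframe."""

    def fetch(self, name: str) -> Asset:
        # In practice this would be an authenticated call to a service
        # that reads or modifies the mainframe data; stubbed here.
        return Asset(name=name, payload=f"record for {name}")


class AssetRepository:
    """Cloud-native repository: callers never touch the legacy system directly."""

    def __init__(self, gateway: LegacyGateway):
        self._gateway = gateway
        self._cache: dict[str, Asset] = {}

    def get(self, name: str) -> Asset:
        # Fetch each asset once from the legacy side, then serve it
        # from the repository like any other cloud-native asset.
        if name not in self._cache:
            self._cache[name] = self._gateway.fetch(name)
        return self._cache[name]


repo = AssetRepository(LegacyGateway())
asset = repo.get("customer-balances")
print(asset.payload)
```

The design choice the sketch illustrates is separation of concerns: the team codes against the repository interface, so the gateway behind it can later be swapped for the migrated cloud service without changing the callers.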