Modernizing Legacy Data Access Without Rewrites
Data Layer Decoupling separates database access from legacy apps so mainframe and IBM i
programs can use modern cloud data platforms without rewriting business logic. Modernization
Planner defines targets, and Integration Studio generates the APIs and proxies.
Why Data Layer Decoupling Matters
Modern enterprises sit on mountains of critical data locked in legacy databases—often DB2 or IMS accessed via thousands of
embedded SQL statements inside COBOL or RPG programs. Those databases are rock-solid, but they make change slow, risky,
and expensive: every schema change or migration proposal triggers fears of breaking business logic everywhere.
A classic “rip-and-replace” approach forces a painful choice:
- Rewrite millions of lines of code to point at new data stores, or
- Leave data where it is and delay cloud, analytics, and AI initiatives indefinitely.
Data Layer Decoupling offers another path.
By separating data access from business logic, you can:
Gradual Data Migration
Migrate data gradually instead of all at once to reduce risk and make the process more manageable.
Hybrid Data Architecture
Support hybrid data architectures (some data on-prem, some in cloud databases).
Data for Analytics & AI
Unlock legacy data for analytics, AI, and new applications without waiting years for a full rewrite.
If Internal Decoupling builds a bridge between modules, and External Decoupling builds a bridge at
the gateway edge, Data Layer Decoupling builds a bridge to modern data—allowing legacy apps and
cloud databases to coexist safely and evolve together.
What Is the Data Layer Decoupling Pattern?
Data Layer Decoupling replaces embedded SQL and direct database calls in legacy
applications with remote API-based data services, enabling connections to modern databases
without rewriting business logic.
In traditional mainframe and IBM i systems:
- COBOL/RPG programs contain hard-coded SQL that directly targets DB2 or IMS.
- Schema changes are dangerous because they touch many programs.
- Moving data to cloud databases like PostgreSQL, Aurora, or NoSQL feels impossible.
With Data Layer Decoupling using OpenLegacy Hub:
- Embedded SQL is extracted and replaced with calls to data services (REST/gRPC APIs), as sketched after this list.
- A generated Data Proxy translates these API calls into legacy SQL for DB2/IMS and/or modern SQL for cloud databases.
- Applications keep their existing business logic flow—but data access is now routed through a governed, observable interface.
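As a rough illustration of the swap (the host, route, and JSON shape below are hypothetical, not OpenLegacy's generated API, and Python stands in for the refactored program), a read that once ran as embedded SQL becomes a data-service call:

```python
"""A minimal sketch of the pattern: a read that was once
     EXEC SQL SELECT BALANCE FROM ACCOUNTS WHERE ACCT_ID = :id END-EXEC
is replaced by a call to a generated data service. The endpoint,
route, and payload shape are invented for illustration."""
import json
import urllib.request

DATA_SERVICE = "https://data-proxy.example.com/api/accounts"  # hypothetical URL

def get_account_balance(acct_id: str) -> float:
    # The proxy decides whether this resolves against DB2 on the mainframe
    # or a cloud database; the calling program no longer knows or cares.
    with urllib.request.urlopen(f"{DATA_SERVICE}/{acct_id}/balance") as resp:
        return json.load(resp)["balance"]
```

The business logic around the call stays exactly as it was; only the data access line changes.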
From the enterprise perspective:
- Legacy apps don’t need a deep rewrite; you swap SQL blocks for service calls.
- Data lives where it makes the most sense (on-prem, cloud, or both at once).
- Data modernization can run at its own pace, independently of application modernization.
When to use it
- You want to migrate DB2/IMS data to PostgreSQL, Aurora, MongoDB, or other cloud stores.
- You need to feed analytics platforms and AI/ML models from legacy data without disrupting core apps.
- You plan to support long-term hybrid data (some on mainframe, some in cloud) while keeping existing programs running.
From Embedded SQL to Hybrid Data Access
AS-IS State
Legacy programs are tightly coupled to DB2/IMS through embedded SQL
Planning
Modernization Planner identifies which data access to redirect through modern data services
The Gap
Migrating the database alone breaks embedded SQL everywhere; a bridge is needed
The Bridge
OpenLegacy generates data services and proxies that translate and reroute data access
End-State
Legacy apps remain online while the data layer modernizes seamlessly.
How Data Layer Decoupling Works with OpenLegacy Hub
Like the other patterns, you’re not hand-designing this. Planner and Integration
Studio operationalize Data Layer Decoupling end-to-end.
Discover & Analyze Embedded SQL
- Modernization Planner scans COBOL/RPG and related artifacts to detect embedded SQL, cursors, and data access patterns.
- It maps which programs access which tables, views, and schemas, giving clear visibility into data dependencies (see the sketch below).
A clear picture of where data access is tightly coupled to code.
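Planner's analysis goes much deeper than this, but as a toy sketch of the idea, a first discovery pass resembles scanning source for EXEC SQL blocks and recording which tables each program touches:

```python
"""Toy illustration of embedded-SQL discovery: find EXEC SQL ... END-EXEC
blocks in COBOL source and record the tables each program references.
This only conveys the idea; it is not Modernization Planner's analyzer."""
import re
from collections import defaultdict
from pathlib import Path

SQL_BLOCK = re.compile(r"EXEC\s+SQL(.*?)END-EXEC", re.IGNORECASE | re.DOTALL)
TABLE_REF = re.compile(r"\b(?:FROM|INTO|UPDATE|JOIN)\s+([A-Z0-9_.]+)", re.IGNORECASE)

def map_data_dependencies(src_dir: str) -> dict[str, set[str]]:
    deps: dict[str, set[str]] = defaultdict(set)
    for program in Path(src_dir).glob("*.cbl"):
        source = program.read_text(errors="ignore")
        for block in SQL_BLOCK.finditer(source):
            for table in TABLE_REF.findall(block.group(1)):
                deps[program.name].add(table.upper())
    return dict(deps)  # e.g. {"ACCTUPD.cbl": {"ACCOUNTS", "LEDGER"}}
```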
Propose Data Decoupling
- Planner groups programs and tables into domains (e.g., Customers, Accounts, Claims, Inventory).
- It recommends which data sets and programs to decouple first, based on modernization priorities and risk.
A phased, domain-based data decoupling plan.
Generate Data Services & Proxies
- Integration Studio creates Integration Factory assets, including REST/gRPC SQL services and data proxies for legacy DB2/IMS.
- Policies control naming, security, and deployment, ensuring every service is consistent and fully governed.
Concrete data services and proxies that you can wire into legacy apps without bespoke plumbing; a minimal sketch of the idea follows.
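To make the proxy's role concrete, here is a self-contained sketch of what a data proxy does conceptually: accept an API call and translate it into parameterized SQL against whichever backend it is configured for. sqlite3 stands in for DB2 or PostgreSQL purely so the example runs; the route and schema are invented:

```python
"""Conceptual data proxy: translate a REST call into parameterized SQL.
Real proxies target DB2/IMS or cloud databases and enforce security and
naming policies; sqlite3 is used here only to keep the sketch runnable."""
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE accounts (acct_id TEXT PRIMARY KEY, balance REAL)")
db.execute("INSERT INTO accounts VALUES ('A100', 2500.0)")

class DataProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        # Map GET /accounts/<id>/balance onto a parameterized SELECT.
        parts = self.path.strip("/").split("/")
        if len(parts) == 3 and parts[0] == "accounts" and parts[2] == "balance":
            row = db.execute(
                "SELECT balance FROM accounts WHERE acct_id = ?", (parts[1],)
            ).fetchone()
            self.send_response(200 if row else 404)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(json.dumps({"balance": row[0]}).encode() if row else b"{}")
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), DataProxy).serve_forever()
```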
Refactor Data Access in Apps
- Embedded SQL is replaced with calls to the generated APIs (often via simple wrappers or call routines).
- Optional stubs allow calling the legacy DB path directly as a fallback, as sketched below.
Apps now call a data service, not a specific DB implementation.
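One common shape for the refactored call site is a thin wrapper that prefers the data service and falls back to the legacy path. Every name here (endpoint, fallback routine) is illustrative, and Python again stands in for the refactored program:

```python
"""Illustrative wrapper for refactored data access: try the generated
data service first; if it is unreachable, fall back to the legacy DB path.
The endpoint and the fallback stub are hypothetical."""
import json
import urllib.error
import urllib.request

DATA_SERVICE = "https://data-proxy.example.com/api"  # hypothetical URL

def legacy_db_read(acct_id: str) -> dict:
    # Stub for the original DB2/IMS access path, kept as a safety net
    # during coexistence; in a real system this invokes the old code path.
    raise NotImplementedError

def read_account(acct_id: str) -> dict:
    try:
        url = f"{DATA_SERVICE}/accounts/{acct_id}"
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.load(resp)
    except urllib.error.URLError:
        return legacy_db_read(acct_id)  # structured fallback, not a crash
```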
Deploy in Phased Coexistence
- Data proxies run on z/OS USS, Linux, or Kubernetes (on-prem or cloud).
- You can point them at DB2/IMS, cloud databases, or both, supporting hybrid read/write strategies like the one sketched below.
Legacy apps and cloud databases coexist behind a stable data access interface.
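One reasonable hybrid read/write strategy (an assumption for illustration, not OpenLegacy's internals) is a router that serves reads from whichever store is currently authoritative for a table and dual-writes to both until the legacy copy is retired:

```python
"""Sketch of a hybrid read/write strategy: reads go to the store currently
marked authoritative; writes go to both stores during coexistence.
Backends are abstract callables so the sketch stays self-contained."""
from typing import Any, Callable

Backend = Callable[[str, str, tuple], Any]

class HybridRouter:
    def __init__(self, legacy: Backend, cloud: Backend, read_from_cloud: bool = False):
        self.legacy, self.cloud = legacy, cloud
        self.read_from_cloud = read_from_cloud  # flipped per table, per phase

    def read(self, query: str, params: tuple = ()) -> Any:
        backend = self.cloud if self.read_from_cloud else self.legacy
        return backend("read", query, params)

    def write(self, stmt: str, params: tuple = ()) -> None:
        # Dual-write keeps both stores consistent during coexistence;
        # drop the legacy leg once the table is fully migrated.
        self.legacy("write", stmt, params)
        self.cloud("write", stmt, params)
```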
Observe & Optimize
- Built-in observability (metrics, logs, traces) helps monitor performance, errors, and usage.
- You gradually shift read and write traffic to modern databases (a toy sketch follows); when certain tables are no longer needed on the legacy side, they can be retired.
Data modernization becomes a controlled, measurable process—not a blind leap.
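Conceptually, the gradual shift is a dial plus a metric: route a configurable share of reads to the modern database, watch error rates, then raise the dial. A toy version (metric names invented):

```python
"""Toy traffic-shift dial: send a configurable percentage of reads to the
modern database and count where traffic lands. In practice the decision
feeds off real observability data, not a random draw."""
import random
from collections import Counter

metrics = Counter()

def route_read(cloud_share: float) -> str:
    target = "cloud" if random.random() < cloud_share else "legacy"
    metrics[f"reads.{target}"] += 1
    return target

# Start small (5%) and raise toward 100% as observed error rates stay flat.
for _ in range(1000):
    route_read(cloud_share=0.05)
print(metrics)  # e.g. Counter({'reads.legacy': 949, 'reads.cloud': 51})
```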
Data Layer Decoupling: Before & After
Transform your legacy data architecture into a flexible, cloud-ready infrastructure without disrupting business operations.
Before (Tightly Coupled Data):
- Business logic and data access live together in COBOL/RPG programs.
- Thousands of embedded SQL statements are hard-wired to DB2/IMS.
- Any schema change or database migration is high-risk and slow.
After Data Layer Decoupling (Bridge in Place):
- Applications call data services exposed via the OpenLegacy Data Proxy.
- Data services handle routing, transformation, and security across DB2/IMS and cloud databases.
- You can change databases, move tables, or add cloud stores without changing business logic.
How OpenLegacy Removes the Risk in Data Modernization
OpenLegacy eliminates data modernization risks by decoupling the data layer with
proven patterns and automated tooling.
No big-bang rewrite
Migrate data incrementally, domain by domain, on your schedule.
Automation-first
AI-driven tools generate SQL APIs, proxies, and mappings in minutes, with no manual re-coding.
Structured rollback
If an issue arises, you can route data calls back to DB2/IMS without breaking applications.
Security & Compliance baked‑in
TLS, OAuth2, RBAC, audit logs, and field-level encryption are available out of the box
Seamless hybrid operation
Legacy and modern data stores coexist; your apps don’t need to know where the data physically lives.
Typical Industry Use Cases for Data Layer Decoupling
Banking
Migrate core banking data from DB2 to PostgreSQL to support open banking APIs, advanced analytics, and real-time customer insights—while mainframe applications keep running unchanged.
Insurance
Move policy, claims, and customer data to Aurora or other cloud databases to enable digital portals, AI-driven underwriting, and compliance reporting—without rewriting COBOL adjudication logic.
Retail
Offload transactional and historical sales data from IMS to NoSQL platforms to power personalized marketing, dynamic pricing, and omnichannel experiences, all while batch programs continue accessing legacy data.
Manufacturing
Migrate inventory and production data from DB2 into cloud databases to support IoT-driven analytics, predictive maintenance, and real-time supply chain visibility—without disrupting mission-critical legacy apps.
Cross-Industry
Enable hybrid data access for any industry where legacy applications integrate with cloud analytics, AI/ML, or new channels. Data Layer Decoupling unlocks legacy data for modern initiatives without the risk of a large-scale rewrite.
Key Benefits at a Glance
65% faster migration
Migrate data access faster than rewriting embedded SQL and COBOL logic.
Zero Unplanned Downtime
Transition data layers without interrupting applications or business operations.
40% lower TCO
Cut costs by gradually retiring legacy databases and storage layers.
Cloud-Ready Data
Make data instantly available for analytics, AI, and modern applications.
Net result:
You don’t need to complete a full legacy migration before leveraging modern data capabilities—your decoupled data layer is ready for AI, analytics, and new cloud-native applications from day one.
How Data Layer Decoupling Fits with
Other Decoupling Patterns
Data Layer Decoupling is one of three core modernization patterns in OpenLegacy Hub:
External Decoupling
Modernize access gateways (CTG, MQ, IMS, DB endpoints) without changing channels or UIs.
Internal Decoupling
Modernize core modules one at a time while keeping internal calls working via generated proxies.
Together they provide:
- Internal Decoupling keeps modules talking.
- Data Layer Decoupling keeps data accessible and governable.
- External Decoupling keeps channels stable.
Data Layer Decoupling FAQs
What is Data Layer Decoupling?
Data Layer Decoupling is a modernization pattern that removes embedded SQL and direct database calls from legacy applications and replaces them with API-based data services. This lets legacy apps access modern cloud databases and analytics platforms without rewriting core business logic.
Ready to Free Legacy Data Without Rewriting Your Apps?
Data Layer Decoupling with OpenLegacy Hub turns locked-in legacy data into a strategic asset—available to modern databases, analytics platforms, and AI/ML services—without forcing you to rewrite core applications or risk big-bang migrations.