


2-Step Solution for a Solid Data Ecosystem

Teradata® Data Integration Optimization Services can help companies boost analytic performance while lowering costs.

From the outside looking in, data integration (DI) seems easy, right? You just combine data residing in different systems to provide users with a unified view of the individual data elements in a timely manner. All you need is a database, a few SQL statements, some scripts and voila, perfection! However, integrating data, including big data, across an organization’s information systems is anything but simple.

DI runtimes can account for 70% to 80% of the overall data ecosystem workload. And as the need for business analytics grows, system resources that were previously available for data integration are increasingly claimed by those analytics requests. Organizations then have to look at tuning, redesigning and expanding the DI function within their analytics ecosystem.

Teradata® Consulting Services offers Data Integration Optimization Services that can assist with the evaluation of all options for improving DI, including deciding whether to modify extract, transform and load (ETL) code, re-architect ETL processes or extend the ecosystem with options such as Apache™ Hadoop® or a Teradata system.

Offload or Optimize

As new technologies, business requirements, data sources and other factors are introduced, a rebalancing or data integration optimization (DIO) effort is often required. Rebalancing entails changing the architectural model and offloading non-analytic processes like ETL to another platform to free the data warehouse to concentrate on analytics.

However, offloading ETL processes is not easy, since many dependencies need to be considered and weighed first. Plus, most organizations have written their ETL processes over a period of time, and even if they applied the best practices of the day, those practices are still yesterday’s approaches. Those processes must therefore be evaluated to find ways to lower costs now while keeping an eye toward the future.

Another approach is to optimize the DI in place. Yet optimizing existing implementations can be challenging, since doing so requires:

  • Committing scarce resources
  • Spending money to “fix” a solution that has already been purchased
  • Working within the limits of existing tools
  • Absorbing re-implementation expenses
  • Accommodating new business drivers that result from changing priorities
  • Balancing potential gains against expenses

With such a wide variety of trade-offs available, Teradata Consulting Services recommends assessing the DI environment and then systematically tuning, redesigning and expanding or extending the ecosystem.

2-Step Solution

An organization’s DI practices are often not optimal, and the data loading processes and environment may have eroded over time. Although a company often assumes that offloading is the answer, Teradata experts have found, based on real-world implementations, that re-architecting DI processes returns valuable CPU cycles to the data warehouse for business analytics use. Teradata Consulting Services offers a two-step approach to DIO:

Step 1: DIO Assessment
Critical components of an assessment are a goal and performance metrics. Thanks to query banding technology from Teradata, it is easy to determine how specific DI functionalities relate to utilization on the Teradata Database. (An example of a Teradata query banding implementation with Informatica Connections can be found on the Teradata Developer Exchange.)
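As a hedged sketch of what such an analysis might look like, the short Python script below parses query-band strings, which Teradata stores as semicolon-delimited name=value pairs, and totals CPU seconds per DI job. The `JobName` key, the `cpu_by_job` helper and the sample rows are illustrative assumptions for this sketch, not part of any Teradata API; in practice the rows would come from a query log extract.

```python
# Hedged sketch: attribute CPU seconds to DI jobs by parsing Teradata
# query bands, which are semicolon-delimited name=value pairs, e.g.
# "ApplicationName=Informatica;JobName=daily_load;".
# The sample rows below stand in for a query log extract (made-up data).

def parse_query_band(band: str) -> dict:
    """Split a query-band string into a dict of name/value pairs."""
    pairs = (p for p in band.split(";") if "=" in p)
    return dict(p.split("=", 1) for p in pairs)

def cpu_by_job(rows) -> dict:
    """Sum CPU seconds per JobName pulled from each row's query band."""
    totals = {}
    for band, cpu_secs in rows:
        job = parse_query_band(band).get("JobName", "unknown")
        totals[job] = totals.get(job, 0.0) + cpu_secs
    return totals

# Example usage with hypothetical rows of (query band, AMP CPU seconds):
sample = [
    ("ApplicationName=Informatica;JobName=daily_load;", 420.0),
    ("ApplicationName=Informatica;JobName=daily_load;", 180.0),
    ("ApplicationName=BI;JobName=sales_report;", 95.5),
]
print(cpu_by_job(sample))
# prints: {'daily_load': 600.0, 'sales_report': 95.5}
```

Grouping utilization this way makes it straightforward to see which DI jobs consume the most CPU and are therefore the best candidates for tuning or offloading.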

Defining a measurable goal for the optimization, for instance recovering 50% of the analytic database’s CPU for business users, is critical to success. The metrics should also include costs, which can guide the value proposition of the recommendations.
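A minimal sketch of how such a goal might be tracked, assuming baseline and current DI CPU figures from an assessment (the function name and the numbers below are illustrative assumptions):

```python
# Hedged sketch of tracking a DIO goal such as "recover 50% of DI CPU
# for analytics". Baseline and current figures are made-up examples.

def cpu_recovered_pct(baseline_di_cpu: float, current_di_cpu: float) -> float:
    """Percent of baseline DI CPU returned to the warehouse."""
    return 100.0 * (baseline_di_cpu - current_di_cpu) / baseline_di_cpu

baseline, current = 8000.0, 3600.0   # CPU seconds per day, illustrative
pct = cpu_recovered_pct(baseline, current)
print(f"Recovered {pct:.1f}% of DI CPU; goal met: {pct >= 50.0}")
# prints: Recovered 55.0% of DI CPU; goal met: True
```

Expressing the goal as a single number like this lets the same calculation be rerun after each tuning or offloading change to show progress against the target.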

With metrics and a goal in hand, Teradata Consulting Services can provide a recommended approach. Based on the timeline, risk, cost, roadmaps, service level agreements (SLAs), critical execution windows and other factors, the services specialists will usually recommend a combination of options, each adding complexity and cost as it moves up the spectrum: tuning sits at the low end, and extending the ecosystem at the top. (See table.)

Step 2: Engineer the Solution
Armed with Teradata Consulting Services recommendations, organizations can begin the implementation, testing and deployment. Throughout this process, baseline metrics from the DI optimization assessment should be measured against the established goals.

Once the DIO is complete, organizations can expect to see improved SLAs and CPU cycle recovery. The DI strategy and architecture, once determined, can establish consistent patterns to enable DI delivery that increases productivity and decreases costs.

Leverage Proven Techniques

The answer to benefiting from new solutions is as clear today as it has been across many generations of capabilities and platforms: use the available platforms for their strengths, limit complexity and focus on delivering value rather than simply chasing the latest and most hyped technology.

That’s why the Teradata approach to DI is to assess, plan and execute an engineered solution using proven techniques from leaders in the field. Through experience, expertise and thought leadership, Teradata DIO experts leverage multiple vendors across various domains to provide a balanced solution for a solid ecosystem.

Brian Richards is a Partner with Teradata Professional Services. He has more than 25 years of IT experience and leads the Enterprise Data Management Center of Experience (COE) within Teradata.

David R. Schiller, CCP, has nearly 30 years of IT experience. He manages Teradata Professional Services marketing programs.

