Rising data-loading speeds may be leaving most BI users behind

Started by dhilipkumar, Apr 21, 2009, 12:39 AM


dhilipkumar

Rising data-loading speeds may be leaving most BI users behind

One of the more prosaic parts of data warehousing is getting information into a warehouse in the first place. Vendors that sell data-loading tools have long operated in the shadows of the business intelligence market, with little pizazz or glory.

Even in the isolated world of extract, transform and load (ETL) software, the focus has traditionally been more on the problem of cleansing and modifying data to prepare it for analytical uses. Data loading seemed to be an afterthought -- a piece of cake -- in comparison.

But that's changing as BI and analytics become more of a near-real-time affair at many companies. In addition, the biggest BI users now operate data warehouses larger than a petabyte in size and need to import huge amounts of data into them. For instance, decision-support database vendor Teradata Corp. says that eBay Inc. loads 50TB of online auction and purchase data on a daily basis.

Over the past few months, several start-ups and relatively unknown vendors have tried to take advantage of such needs by touting screaming-fast data-loading speeds that they claim to have achieved either in the lab or with users in the field.

Database start-up Greenplum Inc. said customer Fox Interactive Media Inc. routinely loads 2TB of Web usage data into its data warehouse in half an hour. Meanwhile, rival Aster Data Systems Inc. claimed that its nCluster technology supports load speeds of up to 3.6TB per hour.

Not to be outdone, Expressor Software Corp., an ETL start-up offering so-called semantic data integration tools, said in-house tests show that its data processing engine can scale to a rate of nearly 11TB per hour.

Even Syncsort Inc., a 41-year-old company that began as a mainframe software vendor, has gotten into the act. Syncsort said that in lab tests, its data integration software loaded 5.4TB of data into a warehouse built around Vertica Systems Inc.'s column-based database in less than an hour.

If Syncsort and the other vendors are actually achieving those kinds of load rates, that's "really impressive," said James Kobielus, an analyst at Forrester Research Inc. "Anything above a terabyte per hour is good."

And what about more-established vendors? Two years ago, SAS Institute Inc. and Sun Microsystems Inc. demonstrated a SAS data warehouse running on Sun hardware and StorageTek disk arrays that could push through 1.7TB of data in 17 minutes -- the equivalent of just under 6TB per hour.

But other big-name vendors have posted performance benchmarks that fall short of the load rates claimed by the upstarts. Last fall, for instance, Oracle Corp. and Hewlett-Packard Co. said their joint BI-oriented HP Oracle Database Machine could load up to 1TB per hour. And Microsoft Corp. said early last year that the data integration software built into SQL Server 2008 had loaded at a rate of 2.36TB per hour.
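The claims above mix units and time windows (TB per half hour, TB per 17 minutes, TB per hour), which makes them hard to compare at a glance. A minimal sketch that normalizes each reported figure to TB per hour -- this is just the unit conversion on the article's numbers, not a benchmark:

```python
def tb_per_hour(terabytes: float, minutes: float) -> float:
    """Convert a reported load (TB moved in N minutes) to a TB/hour rate."""
    return terabytes * 60.0 / minutes

# Figures as reported in the article, normalized to a common rate.
claims = {
    "Fox Interactive Media (Greenplum)": tb_per_hour(2.0, 30),   # 2TB in half an hour
    "Aster Data nCluster":               3.6,                    # stated as TB/hour
    "Expressor (in-house test)":         11.0,                   # "nearly 11TB per hour"
    "SAS/Sun demo":                      tb_per_hour(1.7, 17),   # 1.7TB in 17 minutes
    "HP Oracle Database Machine":        1.0,                    # "up to 1TB per hour"
    "Microsoft SQL Server 2008":         2.36,                   # stated as TB/hour
}

for name, rate in sorted(claims.items(), key=lambda kv: -kv[1]):
    print(f"{name:35s} {rate:5.1f} TB/hour")
```

(The Syncsort/Vertica figure is omitted because "less than an hour" only bounds the rate at 5.4TB/hour or better.)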

computerworld

dhilipkumar

Re: Rising data-loading speeds may be leaving most BI users behind

But do customers really need the ultrafast loading speeds that vendors have been touting recently?

Not surprisingly, John Russell, Expressor's chief scientist, contends that many do. "Every financial firm we talk with says they want . . . something close to 1TB per day," he said. "For clickstream data [from Web sites], those figures could be as high as 200 billion clicks, or nearly 24TB a day."
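Russell's clickstream figures imply an average record size, which a quick back-of-the-envelope check makes concrete. This sketch assumes decimal terabytes (1TB = 10^12 bytes); the implied 120 bytes per click is only the average his numbers work out to, not a stated record format:

```python
clicks_per_day = 200e9   # "200 billion clicks"
tb_per_day = 24.0        # "nearly 24TB a day"

# Average bytes carried per click record, assuming 1TB = 10^12 bytes.
bytes_per_click = tb_per_day * 1e12 / clicks_per_day
print(f"{bytes_per_click:.0f} bytes per click record")

# Sustained load rate such a feed would require around the clock.
sustained_rate = tb_per_day / 24
print(f"{sustained_rate:.1f} TB/hour sustained")
```

Notably, even that extreme clickstream case averages out to about 1TB per hour sustained -- the loading pressure comes from bursts and batch windows rather than the daily average.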

A longtime data warehouse architect for Fortune 100 companies, Russell said he co-founded Expressor partly "out of the frustration I felt when dealing with the performance limitations and bottlenecks of those high-end [data integration] tools."

Kobielus said load rates of multiple terabytes per hour "are becoming the norm" for warehouses that store large volumes of event-based data, such as Web clickstream info or the call-detail records generated by telecommunications systems.

They can also be useful when companies need to populate new warehouses or data marts with historical information for quick-turnaround data-mining projects, Kobielus said.

But such uses are outside the mainstream of enterprise data warehousing, he added. According to Kobielus, most warehouses still store less than 10TB of data and only need "gigabytes per hour" load rates.

Independent database analyst Curt Monash made a similar point last December in a blog post about the performance claim made by Syncsort and Vertica. Monash acknowledged that data loading "is an increasingly nontrivial subject." But in general, he contended, commercial databases "will provide most users with much more load speed than they actually need."

computerworld

dhilipkumar

Re: Rising data-loading speeds may be leaving most BI users behind

Peter Schmidt is director of business intelligence at Centro LLC, a 100-employee online advertising services firm in Chicago. Centro has modest data-loading needs, but Schmidt has worked in BI for more than 20 years and has held jobs at companies with much greater storage needs, such as United Air Lines Inc. and OfficeMax Inc.

So, could Schmidt ever imagine needing an ETL tool that could load 4TB or more per hour? "No, not to that level," he said. And, he noted, "the performance tests out there are never apples-to-apples [with real-world applications]. So what if you can bring in 11TB per hour but aren't transforming anything, just moving data from Point A to Point B?"

Even Teradata is skeptical about how broad the demand is for high-end loading tools. "Extreme data-load rates are irrelevant to most customer environments," said Randy Lea, the vendor's vice president of products and services.

Most data warehousing systems, including Teradata's, can be configured to load multiple terabytes of data per hour, Lea said. But he cautioned that such systems are at risk of becoming unbalanced and performing badly in other areas, such as data reads and queries.

In addition, he said, "the current crop of 'gee whiz' data-loading boasts have little value because there are no benchmark standards."

The latter issue is now being addressed by the Transaction Processing Performance Council, a benchmark development group known as the TPC. It formed an ETL benchmark subcommittee last November, and an initial meeting is scheduled for this month. That may at least enable BI users to see how vendors really stack up against each other when it comes to pressing the load pedal to the floor.

computerworld

