There was an interesting article in BeyeNETWORK the other day about the term advanced analytics. In the article, Claudia Imhoff says that use of the phrase is growing in vendor marketing, and she attempts to shed light on it in order “to hopefully start some constructive dialog leading to consensus.” After examining related definitions and categories, she describes the types of vendors and technologies that typically fall into the advanced analytics bucket:
Analytical processing solutions continue to evolve and change as new technologies and new types of data are introduced into organizations. The result is that companies are at different levels of maturity in their use of analytics. Some may be using basic dashboards of performance indicators, whereas others may have expanded into using more sophisticated techniques such as data mining and predictive analytics. The term, advanced analytics, is often associated with the more sophisticated capabilities.
This led us to reflect on the term, too, and on what it means (or what it should mean). We thought we would make the following modest proposal: instead of associating advanced analytics with the “usual suspects” (basically dense technology for complex needs), why not use “advanced” to describe the underpinnings that actually simplify BI for the user? In our view, the advanced part should hide the complexity that otherwise pervades the BI and analytics stacks and processes, and make it easy for users to perform a range of analyses, from simple queries to extraordinarily sophisticated exploration, directly on raw data.
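To make the contrast concrete, here is a minimal sketch of what we have in mind, written in Python with pandas purely for illustration (the file name and columns are hypothetical, not drawn from any actual product or deployment). The point is that the same raw data should serve both a simple query and a much more sophisticated exploration, with nothing for the user to pre-build in between.

```python
import pandas as pd

# Load raw transaction data directly; no cubes, ETL staging, or
# pre-aggregation layer in between (file and columns are hypothetical).
sales = pd.read_csv("raw_sales.csv", parse_dates=["date"])

# A simple query: total sales by region.
by_region = sales.groupby("region")["amount"].sum()

# A more sophisticated exploration on the very same raw rows:
# weekly sales per store, week-over-week growth, then the ten
# fastest-growing stores on average.
weekly = (sales.set_index("date")
               .groupby("store_id")["amount"]
               .resample("W").sum())
growth = weekly.groupby(level="store_id").pct_change()
top_movers = growth.groupby(level="store_id").mean().nlargest(10)

print(by_region, top_movers, sep="\n\n")
```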
My readers know about my frequent calls for increased transparency in the mortgage securities industry, and I’m pleased to say that today we announced a partnership that brings that goal a step closer.
S&P Valuation and Risk Strategies, an independent and analytically separate business unit within Standard & Poor’s, announced an agreement to offer new investor analytics for U.S. residential mortgage-backed securities (RMBS) via 1010data. Together we bring loan-level performance data on subprime, alternative-A, prime jumbo and other collateral types to our joint customers. S&P’s Jonathan Reeve said that working with 1010data makes it easier for them to deliver a comprehensive risk management solution, allowing investors to quickly dig down into the foundational components of their portfolios in a way that was never before possible.
I was glad to see John Hintze’s article, “Goldman CDO: The Loan Data Was There for All to See,” in Securities Industry News. It was extremely refreshing to see a journalist make the point that many in the industry have been waiting to see published since the news broke about the Abacus deal.
According to the article:
…detailed and updated data about these assets – mortgage loans populating 90 mortgage-backed securities that made up the $2 billion CDO called ABACUS 2007-AC1 – was there for all to analyze…
The fact that the data was out there for investors to analyze is extremely important. It means buyers of the Goldman bonds had enough information to determine that the deal could suffer from extreme losses. Some of the controversy around this CDO revolves around the parties involved in selecting the bonds in the deal. Leaving this particular issue aside, the mere fact that the data was there for all investors to analyze is a huge point. If the investors had been doing their jobs properly, they could have looked right at the collateral and determined the quality of the bonds themselves.
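To make that concrete, here is a minimal, hypothetical sketch in Python of the kind of collateral check an investor could have run against loan-level data. The file, column names, status categories, and measures are all invented for illustration; real loan-level RMBS analysis involves many more fields and far more care.

```python
import pandas as pd

# Hypothetical loan-level file for the collateral backing one deal;
# column names and status categories are invented for illustration.
loans = pd.read_csv("deal_loans.csv")

# Flag seriously delinquent loans (60+ days past due, in foreclosure, or REO).
loans["serious_dq"] = loans["status"].isin(["60+", "foreclosure", "REO"])

# Balance-weighted average of a column, so large loans count proportionally.
def weighted(df, col):
    return (df[col] * df["balance"]).sum() / df["balance"].sum()

# Balance-weighted summary of collateral quality per underlying pool.
summary = loans.groupby("pool_id").apply(
    lambda g: pd.Series({
        "wavg_fico": weighted(g, "fico"),
        "wavg_ltv": weighted(g, "ltv"),
        "pct_serious_dq": weighted(g, "serious_dq"),
        "pct_low_doc": weighted(g, "low_doc"),
    })
)

# Pools with weak collateral on even these simple measures stand out at once.
print(summary.sort_values("pct_serious_dq", ascending=False).head(10))
```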
There was also a related story in the Wall Street Journal yesterday: “Toxic CDOs Beset FDIC as Banks Fail.” The article reports:
The Federal Deposit Insurance Corp., and by extension the U.S. taxpayer, owns more than 250 collateralized debt obligations that were purchased by small institutions that later failed. Although the bonds have a book value of more than $400 million, they are a headache for the agency as it grapples with the toxic assets flowing from many banks around the country…The FDIC’s focus on CDOs comes at a time when the financial instruments are being scrutinized by regulators and prosecutors. Several Wall Street firms, including Goldman Sachs Group Inc. and Morgan Stanley, have attracted particular attention in recent weeks for what they told investors about the nature of the CDOs when they initially sold them.
Here too, the data is there for all to see.
This blog is about education, and we intentionally steer clear of self-promotion. Just the same, we feel it is important to point out this third-party validation of our approach, and we are pretty excited about the story. There is a lot of misinformation out there about the limitations of cloud-based analytics (see last Tuesday’s post on this blog, for example, in which we point out erroneous claims), and we believe that highlighting the experiences of customers is one way to counter it and show the results that can be achieved, especially in data-intensive fields like CPG and retail.
Writer Joe Skorupa very eloquently summarizes the effort and results achieved in the opening paragraphs:
A massive cloud over Europe obscured views and disrupted air travel this spring thanks to airborne dust from an active volcano, while another large cloud within Dollar General was having the opposite effect. It was producing new levels of clarity by doing large-scale, high-speed analysis on billions of rows of stored data and delivering actionable insight to marketing and merchandising executives who were effectively putting it to use.
Insight from this cloud-based, data warehouse and reporting solution was uncovering previously hidden opportunities for key executives to make more informed decisions to increase same-store sales, which is essential to help drive one of the fastest growing retailers in the country.
Results from this system and other strategic steps taken by Dollar General in the past two years have produced impressive financial gains. At its most recent quarterly filing, on March 31, Dollar General reported a net income increase of 121 percent to $172.9 million along with 9.5 percent growth in same-store sales in 2009 on top of nine percent growth in 2008.
Regarding concerns over cloud scalability, the article states:
The Dollar General deployment goes a long way toward allaying these fears, in part because it is a $12 billion enterprise with 8,800 stores. Its cloud-based data warehouse system currently handles 45 terabytes with 70 billion rows of data. If a retailer the size of Dollar General can use a cloud-based tool to handle core business operations, then most retailers also can use it and the technology may be more bullet proof than many retail CIOs think.
Elsewhere, the article quotes Dollar General’s chairman and CEO:
“We accomplished these objectives while investing for future growth, a balance that positions us well for the long-term,” said Rick Dreiling, chairman and CEO. “We are confident that we have the right strategy in place to continue building on our track record of profitable growth as we enter 2010.”
Please visit the following link to read the full story.
A number of very interesting articles have appeared recently. This post is a roundup that offers links and observations for each.
ITWeb just published an article titled “BI’s Next Frontier: Social Computing and the Cloud.” It predicts that the unstructured data that typifies social media and corporate content will increasingly be factored into BI projects, which have historically worked with structured data. On the topic of cloud-based BI, the article quotes an Oracle executive, Du Toit: “… there is no reason why most – if not all – BI functions can’t be done ‘in the cloud’.”
Unfortunately, it repeats a common misperception: “However, there are some constraints to the cloud, which could make BI functions near impossible. For example, moving large volumes of data into the cloud might be problematic due to latency and data storage fees.” Please see our post on cloud-based analytics and scalability, which should put to rest any questions about handling large data in the cloud.
The Informatica blog had a nice post featuring an interview with Tom Davenport and Jeanne Harris, authors of a new book called Analytics at Work: Smarter Decisions, Better Results. The book is a follow-up to their earlier text, Competing on Analytics: The New Science of Winning, which was an important contribution to the field of data management. This latest entry is a hands-on guide with advice on how companies can become more data-driven.
Finally, analyst Dana Gardner had a post on Friday featuring a roundtable discussion with Forrester analyst James Kobielus; Stan Swete of SaaS ERP provider Workday; and Seth Grimes, Alta Plana consultant and Intelligent Enterprise contributor. The forum asked and answered the question: Can software-as-a-service (SaaS) applications actually accelerate the use and power of business analytics?
Dana starts out by invoking Marshall McLuhan, inviting readers to “learn more about how the application and interfaces are the analytics.”
I found this to be a great read and discussion.