Today I attended the SAP BI 4.0 launch event at the Royal College of Physicians in Regent’s Park, London. As the first major launch event for Business Objects since the company came firmly under the SAP banner, it carried a degree of expectation amongst the BO user community. The day had a slightly unusual structure, interleaving the “new features” sessions with “the future of BI” and “other interesting stuff” talks, so the remainder of this post covers the main themes of the day and the “big picture” topics, whilst I’ve broken out the new features into a “What’s New in Business Objects XI Release 4 / SAP BI 4.0?” post.
The morning’s main talk came from SAP’s Technology Evangelist, Timo Elliott (for those of you from the SQL Server world, think of a cross between Brent Ozar and Andrew Fryer), who delivered a punchy and informative overview of the ‘big ticket’ enhancements coming in 4.0. The main themes of Timo’s talk, and of the day in general, were the forthcoming enhancements in the Enterprise Analytics space, including the recently acquired column-oriented data store Sybase IQ and SAP’s latest iteration of in-memory analysis, HANA (High-performance ANalytical Appliance). HANA will run on hardware from vendors such as IBM and HP with upwards of 1TB of RAM, sitting between SAP BW and other large data sources to provide lightning-fast query performance (up to 350x faster in SAP’s tests). Later iterations of HANA will all but replace the current storage engine behind SAP BW (planned for late 2011) and will ultimately replace the entire data storage infrastructure behind SAP’s ERP systems, and potentially other third-party applications.
Timo’s enthusiasm for these new technologies clearly showed, and given that he’s been in the industry (and the company) for 20 years, it’s worth noting that he described the advent of large-scale in-memory analytics as a “once in a decade” leap in capability. For Enterprise-class organisations I’m quite sure it will be, but having worked in much smaller companies I’m somewhat sceptical about how much of an impact it will make at the lower end of the market.
Another major theme for the day was the advent of Analytic Applications: essentially packaged BI and Data Warehouse products pre-built for specific industries (e.g. Healthcare, Retail, Manufacturing) or for departmental purposes (e.g. Finance, HR). Demoed by Jeff Veis and Andy Hirst, these applications are presented as a series of dashboards, but since much of the underlying KPI definition and data architecture is already built, they can reduce implementation time to as little as 12 weeks vs. 6-9 months for a ‘from scratch’ implementation. It’s easy to be sceptical about this, as we all tend to believe that our problems are unique, but each application is focused so closely on a particular industry or department that even if they’re only able to meet 70% of the core requirements out of the box, the simplicity and reduced timescales ought to be well worth the sacrifice, especially since they’re customisable after the initial setup.
The third major theme of the day was Data Quality; in fact, in addition to the session on Information Steward by Barry Dodds and Dave Pugh, four other speakers made a point of telling the audience that everybody in the room had data quality problems. It’s probably true, but I couldn’t help feeling a little nagged by the end of it all! The tool itself seemed very capable, and for a DQ application it was remarkably visual, including dashboard-style elements: (to paraphrase Barry) “using analytics to improve analytics”, which despite being a cool soundbite is actually a very sensible approach to take.
Also announced was the new Complex Event Processing engine, Event Insight. Essentially, CEP engines (like Microsoft’s StreamInsight) take an incoming stream of events in real-time from operational systems and provide monitoring and alerting capabilities, as well as processing for more traditional reporting and dashboarding. Additional products mentioned but not thoroughly explored were a collaboration tool, sapstreamwork.com, and a new unstructured text processing engine that is able to parse free text such as Twitter feeds and provide “sentiment analysis”, as well as tagging various context indicators including geography.
Roadmap-wise we were told to expect more along the lines of Pervasive BI, Big Data, Social/Collaboration and more in the Mobile BI space. On the latter we should expect enhancements to the existing Business Objects Explorer mobile app as well as a native WebI application; mobile platforms mentioned included Blackberry, Symbian, Windows Mobile, iPhone, iPad and even the RIM Playbook, but oddly no mention of Android. I’m not sure if it was left off the slide by accident or whether there are legitimately no Android plans; I’d assume the former, since Android is almost certain to become the market leader in terms of widespread adoption.
In addition to the Business Objects staff there were also a couple of external speakers both of whom gave interesting talks…
Tony Harper of Capgemini spoke on the general topic of Mobile BI, highlighting the increased user expectations created by high-quality, consumer-oriented smartphone and tablet apps as a particular challenge. The talk was thought-provoking, and in particular Tony’s statement that Mobile BI projects will be “sending information farther from the walls of the data centre than ever before” really underscored one of his main themes: providing so many people in so many disparate locations live access to your data will significantly stretch both performance and data quality, and these expectations should be factored into Mobile BI projects from the beginning.
Following Tony was Alys Woodward from the research firm IDC, who gave a good talk on the factors influencing BI uptake within organisations, listing the most important contributing factors as Degree of Training (including training on KPIs as well as the tools), Design Quality (of architecture and processes), Non-Executive Involvement (i.e. get the business users involved), Importance of Governance and Use of a Performance Management Methodology (the last two being important drivers in organisations where they are relevant).
Don’t forget to check out my “What’s New in Business Objects XI Release 4 / SAP BI 4.0?” post too for more detail on the core Business Objects product stack.
Having been raised on the good old-fashioned ZX Spectrum and introduced to PCs via MS-DOS, I have something of a nostalgic fascination with command-line interfaces; there’s something beautifully simple about using a good command line, like you’re talking to the machine directly. I’m not just talking about the pseudo DOS-shell that comes with NT or the ubiquitous Unix/Linux command line; it’s the more exotic examples that pique my interest, which is why I’m quite excited about the new Google command-line tool.
Essentially, GoogleCL is a Python application that can be executed at the command line to make calls to various Google APIs. It currently offers limited support for Blogger, Calendar, Contacts, Docs, Picasa and YouTube, but I’m certain that Google will deliver more features in the future. In terms of security, there’s a one-time authentication process for each application, whereby the command-line tool launches a page in your default web browser to grant access for the GoogleCL tool. To me, the most interesting examples that Google provide are those allowing content creation…
- google blogger post blogpost.txt
- google calendar add "Dinner party with George today at 6pm"
- google contacts add "J. Random Hacker, firstname.lastname@example.org"
- google picasa create --title "Vermont Test" --tags Vermont vermont.jpg
- google youtube post --category Education --devtags GoogleCL killer_robots.avi
There may not be many obvious ties to the world of Business Intelligence here, as GoogleCL is still in its infancy, but for now at least you could perhaps drive scheduling through Google Calendar, maintain distribution lists in Google Contacts or automatically upload reports to Google Docs. I’m quite sure the possibilities will expand over time, especially since some major Google products are currently not included (e.g. Search, Gmail) – I, for one, will be watching with great expectations.
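As a sketch of that kind of automation, here’s a minimal Python wrapper around the GoogleCL binary that a scheduled job could use to push a report into Google Docs. The helper names are mine, and it assumes GoogleCL is installed and has already been authorised via the one-time browser step described above:

```python
import subprocess

def googlecl_cmd(service, task, *args, **options):
    """Build a GoogleCL command line, e.g. ['google', 'docs', 'upload', 'report.pdf']."""
    cmd = ["google", service, task]
    for key, value in sorted(options.items()):
        # Keyword arguments become long options, e.g. title="Sales" -> --title Sales
        cmd += ["--" + key.replace("_", "-"), str(value)]
    cmd += list(args)
    return cmd

def run(service, task, *args, **options):
    """Invoke the GoogleCL binary; requires the tool to be installed and authorised."""
    return subprocess.call(googlecl_cmd(service, task, *args, **options))

# Hypothetical example: upload last night's report to Google Docs
# run("docs", "upload", "daily_sales.pdf", title="Daily Sales")
```

Because the command is built separately from the call, the same helper covers all of the content-creation examples listed earlier.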
If you’ve not heard the buzz already, Google have released a command-line tool called GoogleCL. You can install it on Windows by following Isaac Truett’s guide, “Setup GoogleCL on WinXP”, but if you’re using a Mac and you’d like to install it and have a play, here are a few simple instructions…
- Enable your root login (instructions from Apple in KB article HT1528).
- Log in as Administrator (bear in mind your normal user shouldn’t have Admin rights).
- Download and install Xcode.
- Download and install MacPorts.
- Open up Terminal.
- Edit your ‘paths’ file: sudo vi /etc/paths
- Add a new line containing the MacPorts binary path, /opt/local/bin (press ‘i’ then scroll to the bottom first).
- Save the file (press ESC, then type “:wq!”).
- Close Terminal and re-open.
- Type sudo port install googlecl and press Enter (this takes a while).
- Log off as the Administrator.
- Log back in as yourself and test (see examples).
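For reference, once the edit in the steps above is saved, /etc/paths should end with the MacPorts binary directory (this assumes the default MacPorts prefix of /opt/local and the stock Mac OS X paths file):

```
/usr/bin
/bin
/usr/sbin
/sbin
/usr/local/bin
/opt/local/bin
```

If the google command still isn’t found after logging back in, check that /opt/local/bin appears in the output of echo $PATH in a fresh Terminal session.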
I enjoy going to SQL Server community events; I usually find they provide a refreshing look at what other people are doing and offer inspiration and ideas of what I could be doing myself. Vendor-run events are different, so I attended Microsoft’s SQL Server 2008 R2 Tech Days event with mixed expectations, not sure if it was going to be overly marketing-heavy or whether it really would be worth taking a day out of the office.
Thankfully I was in luck: Microsoft did a great job of treading the line between promotion and information, and whilst the intro and the first 5-10 minutes of each talk were quite marketing-oriented, the majority of the content was realistic and provided honest demonstrations of the product. Throughout the talks, presenters were also offering to answer questions via SMS or via the Twitter hash-tag #uktechdays; this was a great touch, and even though there wasn’t time to answer all of the questions it really added to the interactivity of the event.
First up was PowerPivot. As a product it looks to be immensely powerful and provides lightning-fast analytical capabilities, though I imagine it needs a decent amount of RAM and an up-to-date processor to achieve it – the most amazing part is that it’s a free add-in for Excel 2010! Essentially, PowerPivot allows you to extract up to a million records from a database and perform in-memory analysis with that set of data, including combining it with other data sets, combining it with data in your spreadsheet, performing calculations, making summaries, etc. It’s well worth taking a look at the demos. PowerPivot is a massive leap forward in Excel’s capabilities, but to me it seems like a step backwards in terms of the centralised BI ‘single version of the truth’ concept – allowing users to rip a million rows out of the Data Warehouse, mix them up with other data sources and then send them around via email or even publish them via SharePoint. As it goes, the SharePoint integration was also pretty remarkable, allowing other users to use published reports not only for viewing but also as a data source on which to build new reports – pretty ground-breaking stuff, but I’d hate to be the guy debugging a report based on a report based on a report based on… (you get where I’m going). Overall I’d give PowerPivot a 5* rating for innovation, but it seems that Microsoft is using a common tactic from Formula One – trying to get ahead of the competition by taking a contrary strategy. Will it turn out like Jenson Button in Shanghai (he won) or like Lewis Hamilton in Australia (he didn’t)?
After a relatively dry talk on virtualisation and Hyper-V Live Migration (impressive stuff, but I’ve seen it before), the next talk was about Report Builder 3. Having never been a user of Reporting Services, I thought I was just going to sit through it and twiddle my thumbs – I was wrong. Having been knocking about in the BI world for about 8 years, I can honestly say that this release of Report Builder cements Microsoft’s position in Business Intelligence. It’s still not very slick from a usability standpoint, but the visualisations they’ve added are stunning, and having been a long-time user of Business Objects the talk actually did make me think “how hard would it be to switch?” – since I have a mature installation the answer is very hard, but it still made me think. The most impressive visual elements were the Spark Lines, Data Bars and Indicators, but the maps were also pretty good, especially given that you can use ESRI shape files.
The next talk was “Maximising your existing hardware: CPU, memory and disk” by Ramesh Meyyappan. I’ve seen Ramesh before at SQLBits and he’s always very good, very detailed and straight to the point. It was a great talk, taking place mainly in Management Studio rather than PowerPoint, and if you get the chance to see one of Ramesh’s talks in the future you should definitely go (but have a cup of coffee first). Following Ramesh’s rollercoaster of a talk was a much more relaxing run-through of Microsoft’s ‘database in the cloud’ offering, SQL Azure, a product I find extremely interesting but don’t have an immediate use for, though I expect that in time, as the feature-set converges with SQL Server, I will be changing my mind. Next up was StreamInsight, R2’s Complex Event Processing (CEP) solution for analysing large data streams (10k rows/sec+) on the fly without touching the relational engine – it looks interesting, but I don’t have those sorts of requirements at the moment so I don’t have much of a reaction. The day was rounded off by a presentation by Andrew Fryer about Master Data Services, a difficult topic to present in a jazzy way, but it looks very interesting: if it can integrate with the spaghetti-junction of systems floating around in most organisations, it could do a lot to help us keep our data warehouses in line with corporate naming conventions. That sounds like a lot of fuss over a little issue, but if you’ve ever actually tried to solve the problem yourself in a company with more than a couple of source systems, you’ll understand how hard it can be.
All in all a good day. I’ll give a shout out to the staff at Jumbucks in Shepherd’s Bush, where I had breakfast and bought a bagful of Australian confectionery, and to the vegetarian Chinese buffet over the road for providing me with much needed sustenance.