Having set up a linked server in Management Studio talking to a PostgreSQL 8 database, I encountered the following error when attempting to run any valid query:
> Msg 7399, Level 16, State 1, Line 1
> The OLE DB provider "MSDASQL" for linked server "PG_SERVER" reported an error. The provider reported an unexpected catastrophic failure.
> Msg 7350, Level 16, State 2, Line 1
> Cannot get the column information from OLE DB provider "MSDASQL" for linked server "PG_SERVER".
After some digging I came across a handy article on Microsoft Connect describing the same issue. With thanks to Nenea Nelu, here’s the solution…
- Expand Server Objects > Linked Servers > Providers.
- Right-click on MSDASQL and select Properties…
- In the Properties dialog, untick “Allow inprocess” as follows…
- Click OK and re-run your query.
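If you prefer to script the change, the same provider option can be flipped in T-SQL using the `sp_MSset_oledb_prop` procedure in `master` (undocumented but widely used); the provider name below matches the MSDASQL provider from the steps above.

```sql
-- Disable "Allow inprocess" for the MSDASQL provider
-- (equivalent to clearing the checkbox in the provider's Properties dialog)
EXEC master.dbo.sp_MSset_oledb_prop N'MSDASQL', N'AllowInProcess', 0;
```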
Hopefully that should solve your problem. Please note that this will affect all linked servers using that provider; however, as the Connect article points out, this is considered best practice for linked servers anyway.
Last week saw the release of Windows Server 2008 R2 Service Pack 1 and with it one of the features I’ve been waiting for: Hyper-V Dynamic Memory. Until now, if you were running a Hyper-V host with (for example) 16GB of RAM, your guests could never exceed that amount in total, e.g. you could have 4 x 4GB, or 2 x 4GB + 1 x 8GB, but never more than 16GB combined.
With the addition of Dynamic Memory you can finally over-commit RAM, enabling you to make better use of available resources. As with CPU usage, you still need to balance your workloads carefully: it only really makes sense to combine workloads that have high memory pressure at different times, otherwise you could end up with poor performance or system failures when memory is unavailable.
Please note that Dynamic Memory is configured on a guest-by-guest basis, so nothing will change until you follow the process below. Microsoft have a really helpful TechNet page explaining how to configure Dynamic Memory, but in a nutshell you should follow these steps…
Update both the host and the guest to Windows Server 2008 R2 SP1, then use Hyper-V Manager to connect to the guest and choose Action > Insert Integration Services Setup Disk to reinstall the integration components.
Shut down the guest, then in Hyper-V Manager right-click on it and pick Settings. In the Memory panel choose Dynamic, then set the Startup RAM and Maximum RAM. There’s also a configurable buffer percentage (Hyper-V reserves this extra amount but will give it up under pressure); I’d leave it at the default 20% unless you’ve got a good reason not to.
Set a priority for this guest (e.g. you could set this higher for servers that could fail with too little memory).
Restart the guest and check in Hyper-V Manager…
Here you can see that I’ve exceeded my startup memory of 2GB but only have a current demand of 1795MB, and since there’s no memory pressure on the host the status shows as OK. If the host is unable to reserve the entire buffer amount (in my case 20%) the status will show as “Low”, and if the host is unable to allocate any buffer it will show “Warning”.
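For reference, the settings above can also be applied from PowerShell, though only via the Hyper-V module that ships in-box from Windows Server 2012 onwards (on 2008 R2 you’d use the GUI steps above or a third-party module). The VM name and values below are hypothetical examples.

```powershell
# Sketch: configure Dynamic Memory for a guest using the Hyper-V module.
# VM name, sizes, buffer and priority are example values only.
Set-VMMemory -VMName "WEB01" `
    -DynamicMemoryEnabled $true `
    -StartupBytes 2GB `
    -MaximumBytes 8GB `
    -Buffer 20 `
    -Priority 50
```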
One of my primary data sources for Business Objects is a replicated pair of MySQL servers, where the DBAs ask me to report against the slave. However, during maintenance replication can fall behind, and reports that require up-to-date data will be incomplete. Since we don’t live in an ideal world we can’t always plan our maintenance windows, so I wrote a small VBScript routine that detects the replication delay and, if it exceeds a threshold, changes the ODBC source to point to the master.
If you’ve caught my earlier article on 32-bit ODBC Drivers in Windows Server 2008 R2 you’ll know that there’s plenty of fun to be had since my ODBC drivers are 32-bit. This means that I need to run the VBScript using the 32-bit version of CScript and then schedule it using the 32-bit Task Scheduler; once again the solution is to use the 32-bit tools provided in the SysWOW64 directory…
Beyond that you shouldn’t have too much trouble but if you do please leave a comment below with details and I’ll get back to you if I can help.
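For reference, the core of such a script might look like the sketch below. The DSN name, server name, registry path and threshold are all hypothetical; the lag check uses MySQL’s `SHOW SLAVE STATUS` and reads the `Seconds_Behind_Master` column.

```vbscript
' Sketch: fail over an ODBC DSN to the master if slave lag exceeds a threshold.
' DSN name, master hostname and threshold below are example values.
Const THRESHOLD = 300 ' seconds

Dim conn, rs, lag, shell
Set conn = CreateObject("ADODB.Connection")
conn.Open "DSN=BO_MySQL;" ' 32-bit DSN currently pointing at the slave

Set rs = conn.Execute("SHOW SLAVE STATUS")
If Not rs.EOF Then
    lag = rs.Fields("Seconds_Behind_Master").Value ' Null if replication stopped
End If
rs.Close
conn.Close

If Not IsNull(lag) And lag > THRESHOLD Then
    ' Repoint the System DSN at the master by rewriting its SERVER value.
    ' Note the Wow6432Node path because the driver (and DSN) are 32-bit.
    Set shell = CreateObject("WScript.Shell")
    shell.RegWrite "HKLM\SOFTWARE\Wow6432Node\ODBC\ODBC.INI\BO_MySQL\SERVER", _
        "mysql-master.example.com", "REG_SZ"
End If
```

Run it under the 32-bit script host (e.g. `C:\Windows\SysWOW64\cscript.exe check-replication.vbs`) and schedule it with the Task Scheduler launched from the same SysWOW64 directory.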
As more and more functionality is built into products like SQL Server it’s always worthwhile reviewing third-party tools and utilities when you’re considering an upgrade to see (a) if they’re still required and (b) if the tools themselves need to be upgraded. With the introduction of Backup Compression in SQL Server 2008 R2 Standard Edition you could begin to think that the future is grim for Quest’s backup compression software LiteSpeed so I thought I’d do some testing to see exactly how it stacks up against the native compression.
I’ve been using LiteSpeed on and off for a few years now and it has always been a great tool, but I’ve always found it a bit of a drag to have to use the GUI to administer and set up jobs. However, in January 2010 Quest launched the LiteSpeed Engine for SQL Server, which allows you to administer jobs using the native SQL Server tools. The LiteSpeed Engine acts as a driver, and the configuration tool lets you define a variety of configuration profiles based on file extension; from that point onwards you can use Management Studio to set up backup jobs, maintenance plans, etc., and all you have to do is specify the file extension of the profile you wish to use.
The configuration tool allows you to specify the compression level from 1 to 8 and the encryption level, including various bit-length versions of RC2, RC4, 3DES and AES, though as you’ll see later the overhead of adding the highest level (256-bit AES) isn’t that great, so I’d always shoot for the maximum.
The test is relatively unscientific since I used only one database, but it was carried out systematically. The data comes from a transactional billing system, which I chose as it has a mix of structured tables and raw transactions and comes in at about 6.5GB, so it wouldn’t take too long to test. I used the following configurations…
On my test database the baseline SQL Server native compression reduced the 6.2GB database to 765MB (12.2% of the original size) and took less than half the time (43%). To achieve the same level of compression using LiteSpeed I had to use Level 2, which gave me 12.2% of the original size and 40% of the original duration.
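For the baseline, the native run was just a standard backup using the compression option added to Standard Edition in 2008 R2; something along these lines, where the database name and path are placeholders:

```sql
-- Baseline: SQL Server 2008 R2 native backup compression
BACKUP DATABASE [Billing]
TO DISK = N'D:\Backups\Billing_native.bak'
WITH COMPRESSION, INIT, STATS = 10;
```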
At first this doesn’t look great for the third-party tool, but the benefit of using a mature backup compression engine is the flexibility: LiteSpeed’s configurations allow you to tweak the performance to solve whatever problem you have in your environment, whether that be the absolute size of the backup, the backup window, or a mixture of the two.
If it’s size you’re after then Level 8 really did seem to work wonders on my test DB, bringing the size down to 5.6% of the original at only 352MB, though it did take 2.6 times the original duration. If it’s the backup window you’re looking to reduce then the basic Level 1 did manage to improve on the native compression, taking only 37% of the original duration whilst still compressing to 13% of the original size. If, like most people, you’re looking to have your cake and eat it (i.e. reducing both size and backup window), I’d suggest that Level 3 is the best compromise, giving 10.9% of the original size at 77% of the original duration so you get some benefit in both areas, though Level 4 takes compression a bit further and still gave a slight time reduction.
Clearly, the real answer is testing, and since I’m at the beginning of a data warehousing project I’m not in a position to make any firm decisions. But I think that even if you don’t run out and purchase it now, LiteSpeed is a very valuable tool to have in your mental arsenal, so that if you come up against backup size/window issues, or you’re faced with older versions of SQL Server, you’ve already got a solution in mind. Quest have an odd policy of keeping pricing quite opaque, but I believe that the full Enterprise version (including the LiteSpeed Engine) retails for around £1,800 ($2,800), which isn’t too bad if you need that level of flexibility.
SQL Server Native Compression

| Compression | Size (MB) | Time (s) | Size (%) | Time (%) |
| --- | --- | --- | --- | --- |

LiteSpeed Compression (No Encryption)

| Compression | Size (MB) | Time (s) | Size (%) | Time (%) |
| --- | --- | --- | --- | --- |

LiteSpeed Compression (With Encryption)

| Compression | Size (MB) | Time (s) | Size (%) | Time (%) |
| --- | --- | --- | --- | --- |
It’s the time of year when magazine editors can’t resist the urge to fill their glossy wares full of ‘thing of the year’ articles, the print equivalent of the mid-season “clip show” that has plagued many a TV series. Well, if it’s good enough for them it’s good enough for me so here’s my rather unstructured and unscientific take on Business Intelligence and Data Warehousing in the year that was – 2010…
To start, I’ve taken a series of snapshots from the excellent Google Trends showing global search volumes for each of the Big Four offerings to measure the level of interest. It’s reasonably clear from the graph below that interest in OBIEE shows small but steady growth whilst Reporting Services shows a marked decline, and the other two offerings remain roughly static (maybe a small decline?). This surprised me given that, with the release of 2008 R2, I think Reporting Services is really getting to the point where it offers a legitimate choice in the BI marketplace. Perhaps the issue is that Microsoft have fragmented their BI offering into a mixture of terms, with Excel, PowerPivot, SharePoint, Analysis Services and Reporting Services all making up the BI stack, and nobody really knows what to call it?
| Cognos | OBIEE | Business Objects | Reporting Services |
| --- | --- | --- | --- |
This year has also brought an increased emphasis on Mobile BI, with the iPad and iPhone fast becoming common executive playthings, and Business Objects making its Explorer and Xcelsius products available on Android in addition to the iPhone (Explorer only). MicroStrategy took the mobile emphasis a step further (perhaps to help stick their head above the crowd) by announcing a strong focus on the mobile BI market and offering a free 25-seat licence for their Mobile Suite. Despite the strong marketing focus on Mobile BI, I’m still not convinced that any of the vendors have really hit the nail on the head: whilst many offer pretty visualisations and slick interfaces, most seem to lack the kind of simplicity that helps to present information quickly and succinctly. Even the frankly beautiful independent product RoamBI just feels a little overdone when it comes to actually using it.
Major Product Releases
It’s been quite a year in the BI & Database world with the launch of Microsoft SQL Server 2008 R2, Oracle Business Intelligence Enterprise Edition (OBIEE) 11g and IBM’s Cognos 10…
Microsoft’s launch is effectively a moderate evolution of SQL Server 2008 in most areas, with little change to the database engine, its ETL tool Integration Services, or its OLAP engine Analysis Services. That said, R2 did bring some handy incremental features which will be especially welcomed by the budget-conscious, with an increase in the maximum database size of the free Express Edition from 4GB to 10GB and the addition of Backup Compression to Standard Edition. There were some interesting additions in PowerPivot, Master Data Services and StreamInsight, though I’m not sure any of them will find a natural home for a good year or so as busy DBAs and developers struggle to find the time to try these new features out.
Despite the major jump in the version number, Oracle’s release too seems to be mainly an evolution, and as a great fan of the product I’m considerably relieved, since Oracle could quite easily have been over-zealous in integrating their ‘own’ tools like Discoverer and Warehouse Builder with bought-in technologies like Siebel Analytics (which became the bedrock of OBIEE), Hyperion’s Essbase and Sunopsis (now Oracle Data Integrator). One of the less exciting but fundamentally important additions is that the semantic layer employed in OBIEE will be directly and immediately compatible with future releases of other Oracle products in the CRM, ERP and Finance application spaces.
I’m not as familiar with Cognos as with the other two tools, having only experimented with Cognos 8 for a couple of weeks, but from everything I’ve read it seems that Cognos 10 was certainly a major milestone in the product’s lifecycle. Aside from the shiny-sounding features such as Social Networking and iPad support (actually a very serviceable-looking mobile BI app), there are some very cutting-edge additions to the product, including a statistical engine drawn from SPSS, and Active Reports, which allows users to explore and analyse data offline, including via interactive email reports.
No good review and roundup article ends without a nod to the future and whilst I’m not keen on making absolute predictions there are a few developments I’ll be keeping my eye on for 2011 and beyond.
The main event I’m anticipating is the release of Business Objects XI Release 4. I’ve not seen too many concrete details about functionality, but over the last few years Business Objects have been distracted by the Crystal acquisition (including the shoe-horning of their core product into Crystal Enterprise) and in turn their own acquisition by SAP. As a regular and long-term user of Business Objects I’m really hoping that they’ll blow away some of the cobwebs and deliver some new functionality, as well as rounding off some of the edges that in previous versions feel a little unfinished. It would be great too if they finally included the key functionality from the legacy desktop client (which many long-term customers still rely on) in their core Web Intelligence product (Freehand-SQL and Grouping, I’m looking at you).
Another area to watch in Business Intelligence and Data Warehousing, as well as the wider enterprise market, is cloud computing. Informatica’s ETL in the Cloud offering has seen improvements and adoption throughout 2010, and it’s almost a given that Microsoft will be adding some degree of ETL capability to their SQL Azure platform. I’d expect an announcement, if not a release, along these lines in the coming year, though it’s possible that ETL comes behind providing cloud-based analytics (something SSIS guru Jamie Thomson suggests).
In a broader sense I’m expecting to see a little more interest and pick-up in the open source BI market. I’ve been saying this for a while (“this time next year, Rodders…“) and I might be wrong for some time to come, but I always keep an eye on companies using an open source model, such as the ETL vendor Talend (who recently acquired Sopera, a middleware and SOA vendor), BI vendor Jaspersoft, and all-rounder Pentaho. With the global economy still suffering a hangover from the sub-prime mortgage crisis and banking collapse, people have been looking for cheaper alternatives, and open source companies provide a great way to achieve that, though some of Talend’s high-end offerings are almost comparable in price with other commercial products.
Another possible area to watch is Personal Intelligence: essentially Business Intelligence for the individual. A colleague and I spoke about this back in 2008, and we could both see that as people become increasingly data-aware they’ll start to look inwards and aim to measure things about themselves. One obvious starting point is fitness, and we already have sites to log and chart your weight and calorie intake, as well as the brilliant Nike+ product that measures your pace, time and distance during a run using either a sensor in your shoe or GPS (iPhone app); see the sidebar of this blog or below (one of my runs on the Nike+ site) for examples of the output.
If you need to run an application using the credentials of a user other than yourself (or the one you’re logged in as) in Windows 7 or Windows Server 2008 R2 (this may work in other versions too), all you need to do is hold Shift as you right-click on the application.
For example, in this case I would like to launch Windows Explorer as a different user…
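The same thing can be done from a command prompt with the built-in `runas` utility; the domain, username and program below are placeholders (Explorer itself can be awkward to launch under another account, so any other application is a safer demonstration):

```bat
rem Launch Notepad under another user's credentials; you'll be prompted
rem for that user's password. CONTOSO\jbloggs is an example account.
runas /user:CONTOSO\jbloggs "notepad.exe"
```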