Last week saw the release of Windows Server 2008 R2 Service Pack 1 and with it one of the features I've been waiting for – Hyper-V Dynamic Memory. Until now, if you were running a Hyper-V host with (for example) 16GB of RAM, your guests could never exceed that amount in total – e.g. you could have 4 x 4GB, or 2 x 4GB + 1 x 8GB, but never more than 16GB.
With the addition of Dynamic Memory you can finally over-commit RAM, enabling you to make better use of available resources. As with CPU usage, you still need to balance your workloads carefully: it only really makes sense to combine workloads that have high memory pressure at different times, otherwise you could end up with poor performance or experience system failures when memory is unavailable.
Please note that Dynamic Memory is configured on a guest-by-guest basis, so nothing will change until you follow the process below. Microsoft have a really helpful TechNet page explaining how to configure Dynamic Memory, but in a nutshell you should follow these steps…
Update both the host and the guest to Windows Server 2008 R2 SP1, then use Hyper-V Manager to connect to the guest, choose Action >> Insert Integration Services Setup Disk and reinstall the integration components.
Shut down the guest, then in Hyper-V Manager right-click on it and pick Settings. In the Memory panel choose Dynamic, then set the Startup RAM and Maximum RAM. There's also a configurable buffer percentage (Hyper-V reserves this extra amount but will give it up under pressure); I'd leave it at the default 20% unless you've got a good reason not to.
Set a priority for this guest (e.g. you could set this higher for servers that could fail with too little memory).
Restart the guest and check in Hyper-V Manager…
Here you can see that I've exceeded my Startup memory of 2GB but only have a current demand of 1795MB, and since there's no memory pressure on the host the status shows as OK. If the host is unable to reserve the entire buffer amount (in my case 20%) the status will show as "Low", and if the host is unable to allocate any buffer it will show "Warning".
Many people only attend the free 'Community Day' of SQLBits, and I can understand why given the cost (£125) of the Friday sessions, but if SQL Server is how you make your living I really do think it's worth the money. It's not even that the Friday sessions are significantly different in content – it's really just more of the same high level of quality you get on Saturday – but when it comes to SQLBits, more is definitely better.
It's always a tough choice picking which sessions to attend, so it's often best to go with speakers you know will be good. Despite having spent the entire previous day with Maciej Pilecki in the SQLBits Training Day, I made my first session Maciej's SQL Server Statistics talk. A few initial technical gremlins aside, the talk went well and gave a few insights into how statistics are used by the query optimiser. The key takeaways: always keep both AUTO_CREATE_STATISTICS and AUTO_UPDATE_STATISTICS turned on, consider turning on AUTO_UPDATE_STATISTICS_ASYNC (queries don't wait for stats to be updated, but subsequent queries benefit), and run sp_updatestats after any major updates or reindex your tables periodically.
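Those takeaways translate directly into T-SQL. Here's a minimal sketch – the database name is hypothetical, so substitute your own:

```sql
-- Keep automatic statistics creation and updates on (these are the defaults).
ALTER DATABASE [MyWarehouse] SET AUTO_CREATE_STATISTICS ON;
ALTER DATABASE [MyWarehouse] SET AUTO_UPDATE_STATISTICS ON;

-- Optionally let statistics update in the background, so queries don't
-- wait for the rebuild; subsequent queries pick up the fresh stats.
ALTER DATABASE [MyWarehouse] SET AUTO_UPDATE_STATISTICS_ASYNC ON;

-- After a major data load, refresh statistics across the whole database.
USE [MyWarehouse];
EXEC sp_updatestats;
```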
My next session was Brent Ozar's Virtualisation and SAN talk. This gave me a whole load of questions to go back to my SAN administrator with, as well as a whole load of tests I intend to perform before I deploy my next Data Warehouse on a Hyper-V guest. One concept that was completely new to me was the balloon driver that hypervisors use to encourage Windows to free up RAM; since SQL Server is a good citizen it can end up flushing the entire Buffer Pool and wrecking your performance – the solution is to ensure that Dynamic Memory is disabled in Hyper-V Manager. Some great related resources can be found at…
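If you're worried the balloon driver is already squeezing one of your guests, one quick check from inside SQL Server is the sys.dm_os_sys_memory DMV (available from SQL Server 2008 onwards). This is just a sketch – the numbers worth alerting on will depend on your environment:

```sql
-- What does Windows think the memory situation is right now?
-- When the host reclaims memory from the guest, available physical
-- memory shrinks and the memory state moves away from
-- 'Available physical memory is high'.
SELECT total_physical_memory_kb / 1024     AS total_physical_mb,
       available_physical_memory_kb / 1024 AS available_physical_mb,
       system_memory_state_desc
FROM   sys.dm_os_sys_memory;
```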
The lunchtime sponsor talk I chose was the one from Quest covering IT Horror Stories. It was a brilliant session with plenty of audience interaction that steered clear of pimping any specific Quest products, instead just showing that the people who work there are experienced, pragmatic and generally just nice guys. I think this approach is far better than the extended product demos that many software companies tend to give as their lunchtime sessions: they'll only be of interest if you're genuinely considering the product, and if you're not, they'll do little to increase brand awareness with a room full of bored people on Twitter or Facebook.
After lunch I went for Buck Woody's talk on Business Continuity, which provided a few simple paths and the crucial tasks to help get people started on a business-relevant disaster recovery strategy. I was particularly impressed with one of the central themes of the talk, which was (reading between the lines a little) that even if you think it's 'not your job' to put a DR plan in place, as the company's 'Data Professional' people will still look to you in times of failure. If you've already done all of the planning you'll be the guy with a calm head solving the problem – and if you're not that guy, start getting your CV ready. Despite having heard the name and having read a few of his blog posts over the years I'd never heard Buck speak; he's great, so if you get the chance to see him you definitely should.
Well, that wraps up the day nicely. I'll be posting Saturday's round-up soon after I've written it!
Categories: DBA, Events, Microsoft SQL Server Tags: Brent Ozar, Buck Woody, Business Continuity, Disaster Recovery, DR, Hyper-V, Microsoft SQL Server, SAN, SQL Server, SQLBits, Virtualisation, Virtualization
I enjoy going to SQL Server community events; I usually find they provide a refreshing look at what other people are doing, along with inspiration and ideas for what I could be doing myself. Vendor-run events are different, so I attended Microsoft's SQL Server 2008 R2 Tech Days event with mixed expectations, not sure if it was going to be overly marketing-heavy or whether it really would be worth taking a day out of the office.
Thankfully I was in luck: Microsoft did a great job of treading the line between promotion and information, and whilst the intro and the first 5-10 minutes of each talk were quite marketing-oriented, the majority of the content was realistic and provided honest demonstrations of the product. Throughout the talks presenters were also offering to answer questions via SMS or via the Twitter hash-tag #uktechdays; this was a great touch, and even though there wasn't time to answer all of the questions it really added to the interactivity of the event.
First up was PowerPivot. As a product it looks to be immensely powerful and provides lightning-fast analytical capabilities, though I imagine it needs a decent amount of RAM and an up-to-date processor to achieve it – the most amazing part is that it's a free add-in for Excel 2010! Essentially PowerPivot allows you to extract up to a million records from a database and perform in-memory analysis on that set of data: combining it with other data sets, combining it with data in your spreadsheet, performing calculations, making summaries, etc. It's well worth taking a look at the demos. PowerPivot is a massive leap forward in Excel's capabilities, but to me it seems like a step backwards in terms of the centralised BI 'single version of the truth' concept – allowing users to rip a million rows out of the Data Warehouse, mix them up with other data sources and then send them around via email or even publish them via SharePoint. As it goes, the SharePoint integration was also pretty remarkable, allowing other users to use published reports not only for viewing but also as a data source on which to build new reports – pretty ground-breaking stuff, but I'd hate to be the guy debugging a report based on a report based on a report based on… (you get where I'm going). Overall I'd give PowerPivot a 5* rating for innovation, but it seems that Microsoft is using a common tactic from Formula One – trying to get ahead of the competition by taking a contrary strategy. Will it turn out like Jenson Button in Shanghai (he won) or like Lewis Hamilton in Australia (he didn't)?
After a relatively dry talk on virtualisation and Hyper-V Live Migration (impressive stuff, but I've seen it before) the next talk was about Report Builder 3, and having never been a user of Reporting Services I thought I was just going to sit through it and twiddle my thumbs – I was wrong. Having been knocking about in the BI world for about 8 years I can honestly say that this release of Report Builder cements Microsoft's position in Business Intelligence. It's still not very slick from a usability standpoint, but the visualisations they've added are stunning, and as a long-time user of Business Objects the talk actually did make me think "how hard would it be to switch?" – since I have a mature installation the answer is very hard, but it still made me think. The most impressive visual elements were the sparklines, data bars and indicators, but the maps were also pretty good, especially given that you can use ESRI shape files.
The next talk was "Maximising your existing hardware CPU, memory and disk" by Ramesh Meyyappan. I've seen Ramesh before at SQLBits and he's always very good, very detailed and straight to the point. It was a great talk, taking place mainly in Management Studio rather than PowerPoint, and if you get the chance to see one of Ramesh's talks in the future you should definitely go (but have a cup of coffee first). Following Ramesh's rollercoaster of a talk was a much more relaxing run-through of Microsoft's 'database in the cloud' offering, SQL Azure – a product I find extremely interesting but don't have an immediate use for, though I expect that in time, as the feature set converges with SQL Server, I will change my mind. Next up was StreamInsight, R2's Complex Event Processing (CEP) solution for analysing large data streams (10k rows/sec+) on the fly without touching the relational engine – it looks interesting, but I don't have those sorts of requirements at the moment so I don't have much of a reaction. The day was rounded off by a presentation from Andrew Fryer about Master Data Services, a difficult topic to present in a jazzy way, but it looks very interesting: if it can integrate with the spaghetti-junction of systems floating around in most organisations it could do a lot to help us keep our data warehouses in line with corporate naming conventions. It sounds like a lot of fuss over a little issue, but if you've ever actually tried to solve the problem yourself in a company with more than a couple of source systems you'll understand how hard it can be.
All in all a good day. I'll give a shout-out to the staff at Jumbucks in Shepherd's Bush, where I had breakfast and bought a bagful of Australian confectionery, and to the vegetarian Chinese buffet over the road for providing me with much-needed sustenance.
Categories: Business Intelligence, Microsoft Excel, Microsoft SQL Server, Reporting Services, The Cloud Tags: 2008 R2, excel, F1, Formula 1, Hyper-V, Jenson Button, Lewis Hamilton, Master Data, Microsoft, Microsoft SQL Server, powerpivot, Report Builder, Report Builder 3, Reporting Services, SharePoint, sql, SQL 2008 R2, SQL Azure, SQL Server, SQL Server 2008 R2, SQLBits, StreamInsight, Tech Days, Twitter, Virtualisation
I’ve never been much of a server admin but in order to install a fresh copy of SQL Server 2008 R2 (November CTP) I decided to install a fresh copy of Windows Server 2008 R2. I downloaded the install from Microsoft’s site and because I’ve been primarily running on Windows Server 2003 I ran through one of their e-Learning sessions to fill in the blanks of what’s new in both R2 and Server 2008.
The main versions are:
- Foundation (up to 8GB RAM, 1 socket, no VMs)
- Standard (up to 32GB RAM, 4 sockets, host + 1 VM)
- Web Server (up to 32GB RAM, 4 sockets, no VMs)
- Enterprise (up to 2TB RAM, 8 sockets, host + 4 VMs)
- Datacenter (up to 2TB RAM, 64 sockets, unlimited VMs)
As always there are lots of new features on the list, but the biggies seem to be Hyper-V, Remote Desktop Services (RDS) and Virtual Desktop Infrastructure (VDI). The starkest break from the past is that 2008 R2 will only run on 64-bit processors; existing users of 32-bit Server 2008 installs on 64-bit processors will not be able to perform an in-place upgrade and will have to do a clean install.
Hyper-V is Microsoft's new virtualisation technology, which on paper seems like a good challenger to VMware ESX, and it comes with R2 as standard, although you'll need Enterprise or Datacenter to make the most of it. By far the coolest feature of Hyper-V is Live Migration (similar to VMware's vMotion), which allows you to move a guest system from one host server to another without any interruption to the users of the guest – that's it, zero downtime.
RDS and VDI represent an enhancement of Terminal Services: along with Hyper-V you can now host virtual desktops on a virtual host and permit access from approved devices over the web or via the network, and Remote Desktop now supports multiple monitors and Aero Glass.
The session mentioned a number of other features; most noteworthy were BranchCache (WAN optimisation), DirectAccess (a seamless, intervention-free VPN replacement) and PowerShell 2.0 (command-line server admin), and there was also an incremental 7.5 release of IIS.
Well, I hope you got something from this post, more details and a link to the e-Learning session can be found on Microsoft’s Windows Server 2008 R2 microsite.