My normal Data Warehouse load process runs daily in the early hours of the morning, so I’m not used to sitting watching it. This morning, however, through lack of sleep I happened to be there, and I noticed something odd: there seemed to be a queue.
What seemed to be happening was that two jobs would run simultaneously whilst other jobs sat showing a status of “Waiting for Worker Thread”. After digging around I found that…
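A query along these lines (sys.configurations is one route; EXEC sp_configure 'max worker threads' reports the same values)…

```sql
-- A value of 0 means SQL Server sizes the worker thread pool automatically
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = N'max worker threads';
```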
… showed that the server was configured to automatically manage the max_worker_threads setting (value of 0). Additionally…
SELECT max_workers_count FROM sys.dm_os_sys_info;
… showed that the value being used (based on the 32/64 bit status and number of cores) was indeed 512. A quick check of…
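For reference, the automatic value follows a documented formula: on 64-bit builds it is 512 for up to four logical CPUs, plus 16 per additional CPU (32-bit uses a base of 256 plus 8 per extra CPU). You can compare the calculated value against the live one like so:

```sql
-- Default max worker threads (64-bit SQL Server 2008/2008 R2):
--   cpu_count <= 4 : 512
--   cpu_count >  4 : 512 + (cpu_count - 4) * 16
SELECT cpu_count,
       CASE WHEN cpu_count <= 4 THEN 512
            ELSE 512 + (cpu_count - 4) * 16
       END AS calculated_max_workers,
       max_workers_count -- the value the server is actually using
FROM sys.dm_os_sys_info;
```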
SELECT COUNT(*) FROM sys.dm_os_threads;
SELECT COUNT(*) FROM sys.dm_os_workers;
… showed that the actual running values (54 & 46 respectively) were within expectations and nowhere near the 512 thread limit, so what was happening?
Well, it turned out that whilst my jobs are a mix of OS admin tasks, legacy DTS jobs and SSIS packages, most of them involve some form of PowerShell script to initialise them (usually moving files around, setting permissions, etc.). The problem came down to an obscure setting that limits max_worker_threads to just 2 for SQL Server Agent’s PowerShell subsystem; the setting can be seen here:
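The per-subsystem limits live in msdb and can be inspected directly (on a default 2008 R2 install the PowerShell subsystem sits at 2):

```sql
-- SQL Server Agent's per-subsystem worker thread limits
SELECT subsystem, max_worker_threads
FROM msdb.dbo.syssubsystems
ORDER BY subsystem;
```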
In my case I just upped the running value to 40 (the same value set for CmdExec) and restarted SQL Server Agent:
UPDATE msdb.dbo.syssubsystems
SET max_worker_threads = 40
WHERE subsystem = N'PowerShell';
I’ve yet to spot whether this has any other long-term effects, but things seem to be behaving fine. I will say that as part of my research I found other people saying that this setting did not persist through a restart on certain versions – I’m running 2008 R2 and that appears to be fine, but it’s worth double-checking if you try this.
Starting my third day at SQLBits with a hat-trick of talks on technologies I’m unfamiliar with was a bit of a head-bender, but an enjoyable one nonetheless. The first talk I chose was a great overview of how to use completely free tools (SQL Server Express 2008 R2, among others) with completely free spatial data (Ordnance Survey’s Open Data) to create spatial reports in Reporting Services. Since it’s not an area I’m working in at the moment I didn’t get any major take-aways, but I do have an understanding of what’s possible and how to go about it – if you’re in the same boat you should check out FWTools, Shape2SQL and Grid InQuest.
The second talk was Matt Whitfield’s ‘CLR Demystified’ and, not being a developer, I’ll admit that much of it went over my head, but I got the basic principles and most importantly I know what’s possible and where I might make use of CLR. The most interesting avenues for me are the ability to write custom aggregate functions and define custom data types – I was also impressed with the opportunity to increase performance in certain text processing / forward log parsing situations.
The next talk I attended was a gentle introduction to PowerShell, and James Boother did a good job of showing where it might be useful for admin tasks – in particular, the example of purging old backup/log files based on age lit my eyes up. There was also a demo of PowerShell authenticating with Twitter and posting tweets as admin alerts; alas the demo failed, but that could well have been a timeout on the Uni’s WiFi network or just plain old demo-gremlins. I must say, though, that I still can’t view PowerShell without some sense of disappointment, because for all its flexibility and power it’s so damned wordy, and many of the tasks it performs could be achieved with less code than a good old Bash script.
As with Friday I attended Quest’s lunchtime session run by Kevin Kline, Ian Kick, Brent Ozar and Buck Woody – they’re some of the most experienced guys in the SQL community, and when you get them together they’re funny as hell too, so I was both entertained and informed by their myth-busting quiz.
My first afternoon talk was Gary Short’s session on NoSQL, which predictably sparked a few polite but irate rebuttals from argumentative DBAs, but the session itself was an excellent whistle-stop tour of the predominant NoSQL technologies and use cases. I was interested to hear from someone experienced in the field that nobody has quite put together all the pieces to hook up BI tools (which traditionally expect relational/dimensional models or OLAP sources) to the NoSQL back-ends. It’s a shame, since I might need to do so pretty soon – I guess I’m going to have to get my hands dirty then!
The final talk was from Kevin Kline of Quest, who covered SQL injection. It was an informative talk that gave me pause for thought about a couple of ‘best practices’ that I probably ought to harden a little. Kevin recommended a few tools that I’ll definitely be checking out at some point, notably:
- HP Scrawlr
- Source Code Analyser for SQL Injection
- Assessment and Planning Tool
- Discovery Wizard
And a few handy sites/articles:
Once more it was a great conference and the free day was every bit as good as the paid day, I can honestly say that I walked out of the event already looking forward to the next one.
I’ve never been much of a server admin but in order to install a fresh copy of SQL Server 2008 R2 (November CTP) I decided to install a fresh copy of Windows Server 2008 R2. I downloaded the install from Microsoft’s site and because I’ve been primarily running on Windows Server 2003 I ran through one of their e-Learning sessions to fill in the blanks of what’s new in both R2 and Server 2008.
The main versions are:
- Foundation (up to 8GB RAM, 1 Socket, no VMs)
- Standard (up to 32GB RAM, 4 Sockets, Host + 1VM)
- Web Server (up to 32GB RAM, 4 Sockets, no VMs)
- Enterprise (up to 2TB RAM, 8 Sockets, Host + 4VMs)
- Data Centre (up to 2TB RAM, 64 Sockets, unlimited VMs)
As always there are lots of new features on the list but the biggies seem to be Hyper-V, Remote Desktop Services (RDS) and Virtual Desktop Infrastructure (VDI). The most stark break from the past here is that 2008 R2 will only run on 64-bit processors, existing users of Server 2008 32-bit installs on 64-bit processors will not be able to perform an upgrade and will have to do a clean install.
Hyper-V is Microsoft’s new virtualisation technology, which on paper seems like a good challenger to VMware ESX, and it comes with R2 as standard, although you’ll need Enterprise or Datacenter to make the most of it. By far the coolest feature of Hyper-V is Live Migration (similar to VMware’s VMotion): this allows you to move a guest system from one host server to another without any interruption to the users of the guest – that’s it, zero downtime.
RDS and VDI represent an enhancement of Terminal Services; combined with Hyper-V you can now host virtual desktops on a virtual host and permit access from approved devices over the web or via the network. Remote Desktop now supports multiple monitors and Aero Glass.
The session mentioned a number of other features, most noteworthy were BranchCache (WAN optimisation), DirectAccess (a seamless, interventionless VPN replacement) and PowerShell 2.0 (command-line server admin), and there was an incremental 7.5 release of IIS.
Well, I hope you got something from this post, more details and a link to the e-Learning session can be found on Microsoft’s Windows Server 2008 R2 microsite.