The patterns and practices team has released the definitive work on performance and scalability for .NET.
I can’t help but mention that if you look …
really hard …
near the bottom of the front page …
you’ll see I'm listed as an external reviewer.
I had a blast reviewing chapters – this guide is dense with tips and tricks on everything from ADO.NET to XML.
A question came in today: how do you get the valid values for a report parameter via the web service API when a query defines those values?
To do this you can use the GetReportParameters web method. First, be wary of a small bug in the documentation: the order of the parameters is incorrect. Not a big problem with IntelliSense, though.
Example:
ReportingService rService = new ReportingService();
rService.Credentials = System.Net.CredentialCache.DefaultCredentials;

string historyID = null;
bool forRendering = true;
ParameterValue[] values = null;
DataSourceCredentials[] credentials = null;

ReportParameter[] parameters = rService.GetReportParameters(
    "/Northwind Reports/Report2",
    historyID,
    forRendering,
    values,
    credentials
);
This will give you back an array of ReportParameter objects. If a ReportParameter has a set of valid values defined by a query, Reporting Services will run the query and populate the ValidValues property with ValidValue objects (only when forRendering == true). If we wanted to dump the valid values for parameters[0]:
foreach (ValidValue v in parameters[0].ValidValues)
{
    Console.WriteLine(v.Label + " " + v.Value);
}
Note: if you have hierarchical parameters, where, for example, the set of valid values for parameters[1] depends on the value selected for parameters[0], you'll need to populate the ParameterValue array (the values argument above) to get the valid values for parameters[1]. If you are doing all of this from scratch, that may mean multiple calls to GetReportParameters.
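A rough sketch of the second call for the hierarchical case might look like the following. This assumes the variables from the earlier example are still in scope, and that picking the first valid value of the first parameter is just for illustration:

// Hypothetical: parameters[1]'s valid values depend on what was
// chosen for parameters[0], so select a value for parameters[0] first.
ParameterValue[] selected = new ParameterValue[1];
selected[0] = new ParameterValue();
selected[0].Name = parameters[0].Name;
selected[0].Value = parameters[0].ValidValues[0].Value;

// Second call: pass the selected value so the server can run the
// dependent query and fill in ValidValues for parameters[1].
parameters = rService.GetReportParameters(
    "/Northwind Reports/Report2",
    historyID,
    forRendering,
    selected,
    credentials
);

With deeply nested parameters you would repeat this, adding one more ParameterValue per level, until every parameter's valid values are resolved.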
I’ve been in and around quite a number of Hospital IT departments over the last 18 months. I’ve selected one practice that separates the good hospital IT department from the bad: the good ones occasionally hire people from outside the healthcare industry, the bad ones never do.
Scoble suggests a visit from the PAG group for one hospital IT department. This is exactly what some hospitals could use, which is why it will never happen at the places that need it most.
Some industries deem themselves significantly different from every other industry, so they only hire from the pool of workers with experience in their industry, or only work with companies and individuals who specialize in it. I know this isn't just hospitals; financial institutions and others follow the same practice. I've just never seen the problem as pronounced as I have in healthcare. The effects of inbreeding in IT are the same as in biological organisms: more defects, lower quality. You have to put some new genes in the pool to grow new ideas and move forward.
With my mini-rant over, I'd like to present something I find somewhat comical (albeit dangerous) about hospitals. As we know, the M in MRI stands for magnetic. The magnetic field is extremely powerful, with many of today's machines reaching 3 to 4 tesla in strength (roughly 60,000 to 80,000 times the Earth's field of about 50 microtesla). An MRI machine can turn a pen into a projectile, stop a quartz watch, and suck in chairs, oxygen tanks, and industrial floor polishers. There are pictures to prove it.
I took the SQL Server 2000 Best Practices Analyzer Beta for a spin this evening in a Virtual PC image. The download and install worked without problems. The installation needs to create a database to store rules and report results; the default name is sqlbpa.
After registering a SQL Server to analyze I went to the Best Practices screen, where I selected "Create Best Practice" and began to look through the available rules, which fall into a number of categories.
Each category contains a set of rules, many of which you can parameterize. For example, you can check whether databases have been successfully backed up in the last X days (where X defaults to 30, but you can enter your own value).
Here are some of the interesting rules you can run:
Index fragmentation: you can specify the maximum value for fragmentation, and the minimum number of pages an index must have to be included in the scan.
File placement: ensures data and log files are not on the same drive.
Unexpected shutdowns: looks in the NT event log and flags any unexpected shutdowns.
Object Prefixes / Object Suffixes: enforces naming standards. You can select a prefix / suffix for each object type, although it is not clear to me how to set up more than one type of object to scan. Perhaps this is a beta issue.
NULL comparisons: scans stored procedures, views, triggers, and functions to find equality and inequality comparisons with NULL constants (which always evaluate to unknown; IS NULL is the correct test).
Temp table usage: looks for opportunities to replace temp tables with table variables.
Once I had a set of best practices defined I could move them into the execution area and scan one or more SQL Servers. Reports are saved into the database for viewing through the UI.
This looks like an extremely productive tool for watching databases both in development and production. There are managed code assemblies in the bin directory, and possibly room for extensions, as a quick peek in the BPA database shows assembly names. Perhaps custom rules can be put into a custom assembly and registered in the database also.
Definitely going to add this tool to the toolkit.
I never realized you bypass the network protocol stack when connecting to SQL Server on the same physical machine. At least, you CAN bypass the protocol stack if you use the correct settings. I learned this from Ken Henderson's SQL Server Connection Basics on MSDN. Excerpt:
You can indicate that the shared memory Net-Library should be used by specifying either a period or (local) as your machine name when connecting. You can also prefix your machine\instance name with lpc: when connecting to indicate that you want to use the shared memory Net-Library.
I regularly connect to a SQL instance on my desktop at work by using SALLEN\dbs1, meaning I am using the protocol stack instead of shared memory. Shared memory is generally quicker, but not always. As Ken says, make sure to test first for your specific environment.
Of course, for development, I'm sure it won't make a noticeable difference at all, but this is a good tip to remember. I always assumed SQL would not optimize for a local connection, but then my assumptions always come back to bite me.
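To make the tip concrete, here is a sketch of the three server-name forms in ADO.NET connection strings. SALLEN\dbs1 is my machine and instance from above; the Northwind database name is just an assumption for the example:

using System.Data.SqlClient;

// Goes through the network protocol stack (TCP/IP or named pipes):
string overStack = "Server=SALLEN\\dbs1;Database=Northwind;Integrated Security=SSPI;";

// Uses the shared memory Net-Library: "(local)" or "." as the machine name:
string localShared = "Server=(local)\\dbs1;Database=Northwind;Integrated Security=SSPI;";

// Or request shared memory explicitly with the lpc: prefix:
string lpcShared = "Server=lpc:SALLEN\\dbs1;Database=Northwind;Integrated Security=SSPI;";

using (SqlConnection conn = new SqlConnection(localShared))
{
    conn.Open();
    // ... run commands as usual; only the transport differs.
}

As Ken says, benchmark in your own environment before assuming shared memory is faster.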
Yesterday I was assuming I was in the correct departure gate for a flight out of Toronto. To make a long story short, I discovered with 5 minutes to go before boarding that I was in the wrong area. The conversation went something like this:
Wondering Woman : Is this the flight to Pittsburgh?
Me: I think this is the gate for a flight to Baltimore.
Man in another seat: I’m here for a flight to Indianapolis.
I knew two of us had to be wrong, and one of those two was probably me. For some reason, airports do their best to confound me. I had to move from gate T to gate E in YYZ's Terminal 2, which, as one airport employee described to me with a grin, is "one heck of a walk". Yep. I felt like I was trying to break the four-minute mile with two carry-on bags in tow. It also involved a shuttle bus ride. The kind of shuttle bus ride where you want to pound on the glass separator and tell the driver to floor it but are restrained just enough by the thought of being arrested as an airport lunatic. Fortunately, I made the plane and was on enough of a post-panic comedown to actually nap for a bit.