
New Longhorn Bits

Thursday, May 6, 2004 by scott

Robert McLaws has compiled a list of “things to do” while downloading the latest Longhorn bits, and I may make it through most of them as I still have 5 hours left (not counting the SDK transfer of 380 MB). The transfer rate has been falling steadily from ~70 KB/sec to ~20 KB/sec as the evening progresses. I don’t think I will be seeing the installation and setup screen until after a night of sleep. Update: it appears I have downloaded the DDK, not the SDK, as the SDK has yet to appear on subscriber downloads.

Chris Sells points out there will not be any Visual Studio bits to put on this build (M7.2 Longhorn). I might have to grab vi and see if the muscle memory in my fingers can still play :w and yy like the days of old.

Wesner Moise tells us that MSDN has already updated the online Longhorn SDK to reflect the latest build.

Finally, Scoble addressed all the hardware requirement speculators who believe Longhorn will require a creation from the Los Alamos labs to run. I know I’ve been running the PDC bits on what some would consider ridiculously modest hardware – a 1 GHz P3 with 1 GB of RAM. Quit laughing! It runs pretty well!

ASP.NET Validation : False Sense of Security?

Monday, May 3, 2004 by scott

A subtle and dangerous bug appears regularly in newsgroup postings, and some have even spotted the problem in sample code from articles and books.

Take the following ASPX snippet:

<asp:textbox id="txtPassword" runat="server"/>
<asp:button id="btnSubmit" runat="server" Text="Submit"/>
<asp:requiredfieldvalidator id="valReqPassword" 
            runat="server" ErrorMessage="Password required" 
            ControlToValidate="txtPassword"/>
And the following code-behind logic:

private void btnSubmit_Click(object sender, 
                             System.EventArgs e)
{
   Response.Write("Password set!");
}

If you enter a blank password and click submit on this form (in a DHTML capable browser), the validation control prevents the post back and displays an error message next to the TextBox control. Testing complete, validation works, continue to the next form.

I’m sure many of you have spotted the problem, but judging from newsgroup postings this isn’t so easy for newcomers to catch. One can expose the bug by setting the EnableClientScript property of the validation control to false. Now if the user enters a blank password and clicks submit the validation error message still appears, but in addition all of the code inside the click event handler executes. Unless there is a database constraint in place, chances are the user just set their password to an empty string.

Even with client-side scripting enabled, we know it would be easy to feed the software un-validated input with the System.Net.WebRequest class. Client-side validation works so well in the browser, however, that this vulnerability is hard to see.
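As a sketch, here is how little it takes to bypass the client script. The URL and form field names are placeholders matching the hypothetical page above; a real postback would also need a __VIEWSTATE value captured from a prior GET of the page.

```csharp
// Sketch: nothing forces a client to run the validation script.
// Any HTTP client can post the form fields directly to the server.
string postData = "txtPassword=&btnSubmit=Submit";
byte[] bytes = System.Text.Encoding.ASCII.GetBytes(postData);

System.Net.WebRequest request =
    System.Net.WebRequest.Create("http://localhost/WebForm1.aspx");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = bytes.Length;

using (System.IO.Stream stream = request.GetRequestStream())
{
    stream.Write(bytes, 0, bytes.Length);
}
```

If the server never re-checks the input, the empty password sails straight into the click handler.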

The crux of the misunderstanding is how the client side validation behavior is entirely different from server side behavior. On the client side, if validation fails, the flow of execution effectively stops. On the server side, you have to check Page.IsValid and alter the flow yourself.
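A minimal sketch of the server-side fix, using the page and control names from the snippet above:

```csharp
private void btnSubmit_Click(object sender, System.EventArgs e)
{
    // Server-side validation runs during the postback, but it only
    // sets Page.IsValid - it does not stop execution by itself.
    if (!Page.IsValid)
    {
        return; // bail out before doing any work with invalid input
    }
    Response.Write("Password set!");
}
```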

Darren Neimke posted today about the difficulty in achieving elegance when writing functionality spanning client and server sides. I agree, at times it still seems much harder than it should be to get things right (with pretty code), particularly since the two sides behave so differently. Perhaps failed validation should trigger an exception on the server side - but I realize this would break many existing applications. Still, something to think about ... and watch for Page.IsValid in code reviews.

Performance and Scalability Guide Released

Thursday, April 29, 2004 by scott

The patterns and practices team has released the definitive work on performance and scalability for .NET.

I can’t help but mention that if you look …

really hard …

near the bottom of the front page …

you’ll see I'm listed as an external reviewer.

I had a blast reviewing chapters – this guide is dense with tips and tricks on everything from ADO.NET to XML.

Using Reporting Services GetReportParameters web method

Wednesday, April 28, 2004 by scott

A question today on how to get the valid values for a report parameter via the web service API when a query specifies the valid values. (Whew - clunky sentence).

To do this you can use the GetReportParameters web method. First, be wary of a small bug in the documentation where the order of parameters is incorrect. Not a big problem with IntelliSense, though.


ReportingService rService = new ReportingService();
// standard pattern: authenticate with the caller's Windows credentials
rService.Credentials = System.Net.CredentialCache.DefaultCredentials;

string historyID = null;
bool forRendering = true;
ParameterValue[] values = null;                                  
DataSourceCredentials[] credentials = null;

ReportParameter[] parameters;
parameters = rService.GetReportParameters(
                        "/Northwind Reports/Report2",
                        historyID, forRendering, 
                        values, credentials);

This will give you back an array of ReportParameter objects. If a ReportParameter object has a set of valid values defined by a query, Reporting Services will run the query and populate the ValidValues property with ValidValue objects (only when forRendering == true). If we wanted to dump the valid values for parameters[0]:

foreach(ValidValue v in parameters[0].ValidValues)
    Console.WriteLine(v.Label + " " + v.Value);

Note, if you have hierarchical parameters, where, for example, the set of valid values for parameters[1] depends on the value selected for parameters[0], you’ll need to initialize the ParameterValue array to get the valid values for parameters[1]. If you are doing this all from scratch this may mean multiple calls to GetReportParameters.
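A sketch of that second call, assuming hypothetical names — a first parameter called "Category" with a selected value of "Beverages":

```csharp
// Supply a value for the first parameter so the server can run
// the query behind the second parameter's valid values.
ParameterValue pv = new ParameterValue();
pv.Name = "Category";     // assumed parameter name
pv.Value = "Beverages";   // assumed selected value
ParameterValue[] values = new ParameterValue[] { pv };

ReportParameter[] parameters = rService.GetReportParameters(
    "/Northwind Reports/Report2",
    null,      // historyID
    true,      // forRendering - populates ValidValues
    values,    // selections made so far
    null);     // credentials

// parameters[1].ValidValues now reflects the choice made above
```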

Healthcare IT: Self inflicted wounds

Tuesday, April 27, 2004 by scott

I’ve been in and around quite a number of Hospital IT departments over the last 18 months. I’ve selected one practice that separates the good hospital IT department from the bad: the good ones occasionally hire people from outside the healthcare industry, the bad ones never do.

Scoble suggests a visit from the PAG group for one Hospital IT department. This is exactly what some hospitals could use, which is exactly why it will never happen at the places that need it most.

Some industries deem themselves significantly different from every other industry, and so they only hire from the pool of workers with experience in their industry, or only work with companies and individuals who specialize in their industry. I know this is not limited to hospitals – financial institutions and others have the same practice – it is just that I’ve never seen the problem as pronounced as I have in healthcare. The effects of inbreeding in IT are the same as in biological organisms: more defects, lower quality. You have to put some new genes in the pool to grow new ideas and move forward.

With my mini-rant over, I’d like to present something I find somewhat comical (albeit dangerous) about hospitals. As we know, the M in MRI stands for magnetic. The magnetic field is extremely powerful, with many of today’s machines reaching 3 – 4 Tesla in strength (30,000 to 40,000 times more powerful than the earth’s field). An MRI machine can turn a pen into a projectile, stop a quartz watch, and suck in chairs, oxygen tanks, and industrial floor polishers. There are pictures to prove it.

SQL Server Best Practice Analyzer

Monday, April 26, 2004 by scott

I took the SQL Server 2000 Best Practices Analyzer Beta for a spin this evening in a virtual PC. The download and install worked without problems. The installation needs to create a database to store rules and report results – the default name is sqlbpa.

After registering a SQL Server to analyze I went to the Best Practices screen, where I selected “Create Best Practice” and began to look through the available rules. Rules fall into the following categories:

  • Backup and Recovery
  • Configuration Options
  • Database Design
  • Database Administration
  • Deprecation
  • Full-Text
  • General Administration
  • Generic
  • T-SQL
  • Yukon Readiness

Each category contains a set of rules, many of which you can parameterize. For example, you can check to see if databases have successfully backed up in the last X days (where X defaults to 30, but you can enter your own value).

Here are some of the interesting rules you can run:

Index fragmentation: you can specify the maximum value for fragmentation, and the minimum number of pages an index must have to be included in the scan.

File placement: ensures data and log files are not on the same drive.

Unexpected shutdowns: looks in the NT event log and flags any unexpected shutdowns.

Object Prefixes / Object Suffixes: enforces naming standards. You can select a prefix / suffix for each object type, although it is not clear to me how to set up more than one type of object to scan. Perhaps this is a beta issue.

NULL comparisons: scans stored procedures, views, triggers, and functions to find equality and inequality comparison with NULL constants.

Temp table usage: looks for opportunities to replace temp tables with table variables.

Once I had a set of best practices defined I could move them into the execution area and scan one or more SQL Servers. Reports are saved into the database for viewing through the UI.

This looks like an extremely productive tool for watching databases both in development and production. There are managed code assemblies in the bin directory, and possibly room for extensions, as a quick peek in the BPA database shows assembly names. Perhaps custom rules can be put into a custom assembly and registered in the database also.

Definitely going to add this tool to the toolkit.


Sunday, April 25, 2004 by scott
I know using FindControl in ASP.NET is not always as easy as it would seem on the surface, particularly when you start looking for controls inside of a DataGrid or a Repeater. This article should help out.
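The wrinkle is that FindControl only searches a single naming container, so you search each row rather than the grid itself. A minimal sketch, assuming a grid named DataGrid1 whose ItemTemplate holds a Label named lblName (both hypothetical):

```csharp
// FindControl does not recurse - calling DataGrid1.FindControl("lblName")
// returns null because each row is its own naming container.
foreach (DataGridItem item in DataGrid1.Items)
{
    Label name = item.FindControl("lblName") as Label;
    if (name != null)
    {
        Response.Write(name.Text);
    }
}
```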