Double Check Locking In The News Again

Thursday, May 13, 2004 by scott

Once upon a time, Chris Brumme posted about shortcomings in the memory model of the ECMA specification for the CLR. Not necessarily shortcomings from a runtime performance point of view, but shortcomings from a programmer productivity point of view. In the post he discussed why double check locking requires some attention to detail. Specifically, the following code snippet may not be as thread safe as it first appears.

// Looks thread safe, but without a memory barrier another thread
// may see a non-null 'a' before the writes inside A's constructor
// become visible.
if (a == null)
{
  lock(obj)
  {
    if (a == null) a = new A();
  }
}

There are interesting comments in response to the post, and eventually Jon Skeet devoted a page to singleton construction. Jon avoids the double check locking issue altogether by using a static field initializer in a nested type. The approach Jon promotes works very well except in cases where you do not know the singleton type to construct at compile time. For example, the type of singleton to construct may be an object derived from an abstract base class in a provider / pluggable architecture and the application reads the type to construct from a config file.
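
Here is roughly the shape of the nested-type approach Jon describes (the class names are mine):

public sealed class Singleton
{
    Singleton() { }

    public static Singleton Instance
    {
        get { return Nested.Instance; }
    }

    class Nested
    {
        // An explicit static constructor tells the C# compiler not to
        // mark the type beforefieldinit, so the instance is not created
        // until the first reference to Nested.Instance.
        static Nested() { }

        internal static readonly Singleton Instance = new Singleton();
    }
}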

If you can’t use a static field initializer, but still want safe, lazy instantiation, then it seems to me that Brad Abrams’ post about using the static MemoryBarrier method of the System.Threading.Thread class is the direction to go, for a couple of reasons.
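
A sketch of the pattern (the field and property names are mine):

private static A a;
private static readonly object sync = new object();

public static A Instance
{
    get
    {
        if (a == null)
        {
            lock (sync)
            {
                if (a == null)
                {
                    A temp = new A();
                    // Ensure the writes that initialize the new object
                    // complete before the reference becomes visible to
                    // other threads.
                    System.Threading.Thread.MemoryBarrier();
                    a = temp;
                }
            }
        }
        return a;
    }
}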

To me, the volatile keyword carries specific overtones. I still think of programming with memory-mapped IO when I see the volatile keyword. Volatile variables are completely unsafe to cache. Imagine having a byte in memory hooked up to a thermometer lying on your desk. Even a single CPU machine never knows when the memory location will update with a new temperature value – you have to read it from main memory every time. Volatile has an unfortunate connotation for a singleton reference, which isn’t going to change after construction.

Secondly, the use of Thread.MemoryBarrier explicitly calls out what needs to happen for the code to be thread safe. People who stumble across the code in the future will not need to reason about the side effects of a volatile variable when Thread.MemoryBarrier is in place.

Not only do we have maintainable code showing programmer intent, there is a performance bonus too: after construction, reads of the reference no longer pay the cost of volatile semantics. That being said, if this code were not part of a singleton, and other methods were involved, I'd prefer volatile.

Server Software For The Desktop

Saturday, May 8, 2004 by scott

For the past four years I’ve always run a server version of Windows on my development machine. I do this for a couple of reasons. On those rare occasions when I find myself in a server room around a production machine, I feel comfortable knowing where all the buttons and settings are. It’s hard to feel comfortable when you have only a vague memory of where you saw a particular configuration dialog while pointy-haired people stand behind you spouting “Is it online yet?”.

The other reason is that I want to feel like I am getting my money’s worth from my MSDN subscription. Some people keep their subscription discs in numerical order. I like to keep mine ordered by license fees. That way, if there is something expensive I have not yet installed, I can throw it on a virtual PC and tinker around.

Since switching to a DVD subscription two years ago, it’s been much harder to compute the license fee value per disc. It requires a calculator. Then again, the CD subscription was driving me insane years ago. I’m convinced Microsoft implemented the CD numbering and coloring scheme using a stochastic process. First, you received about 2400 CDs each year. If you organized the CDs numerically, it was impossible to find any specific product inside without an up-to-date annotated index, and you never knew when any particular CD was obsolete. The numbering sequence often left large gaps, but invariably a CD would show up with a number in between two other CDs, and all the discs had to be manually bubble-sorted throughout the CD binder. I’m certain the process has driven some percentage of developers to drink. One company I worked at budgeted 40 intern hours a month to organizing MSDN subscription binders for developers.

But getting back to my previous topic, which is running server software on what is essentially a desktop machine. Windows 2003 is a different beast and requires some tweaking to offer a pleasant desktop experience. Kevin Moore offers a tip on getting rid of the Shutdown Event Tracker. MSFN has some other tips to enable themes, video acceleration, audio acceleration, and more. By the time you get to the end of the guide, you’ll be able to watch those MSDN webcasts in a nicely themed Windows Media Player at full frame rate.

The only drawback to running 2003 as a desktop OS is that some software refuses to install, saying it requires Windows XP. Also, some utility software, like good anti-virus software, has tiered pricing for server-class machines. On the other hand, it is the only OS where you can install some of the new, expensive stuff.

Oh, and at least some of those obsolete MSDN CDs have found a good home.

What Goes On At the ASP.NET Website?

Thursday, May 6, 2004 by scott

First, I think ASP.NET is a great web site, and featuring articles from all over the community helps build a diverse and informative resource for developers.

But…

There are problems which, in my opinion, devalue the site. I used to think there was some forethought and human intelligence behind the scenes taking the steps necessary for the site to keep the polished veneer you’d expect from a site with Microsoft’s name attached, but the ‘man behind the curtain’ appears to be 100% silicon.

Take today’s new article description. It is obviously meant to reach whoever is responsible for the ASP.NET daily article content, not to appear on the front page. I’m sorry to say, John, even if you used the official contact email of aspnetw3@microsoft.com, you won’t get a response, at least in my experience.

(UPDATE: The article has changed as of 3:20 PM EST. The description used to begin with "Dear Editor, Thank you for accepting my article,,,," Thank you ASP.NET!)

In the past there have been articles that have nothing to do with web development. That doesn’t bother me too much, but when it happens I start to wonder what sort of standards the site maintains. What I do find troubling is that at least two articles were duplicated in the space of 10 days this year, which indicates to me that nobody is paying attention to what is going on. The front page content is just a FIFO queue in a database. My guess is someone could post a link to Michael Jackson's legal documents and it would show up on the front page of ASP.NET.

I can appreciate that filtering content to feature a daily article (indeed, even having a daily article) can be a tough job. Hopefully, someone can step up and address the issue. Given the site’s domain name, and the site’s owner, there are certain expectations to meet. Don’t devalue the site and the work of the authors who contribute to this resource.

New Longhorn Bits

Thursday, May 6, 2004 by scott

Robert McLaws has compiled a list of “things to do” while downloading the latest Longhorn bits, and I may make it through most of them, as I still have 5 hours left (not counting the 380 MB SDK transfer). The transfer rate has been steadily falling from ~70 KB/sec to ~20 KB/sec as the evening progresses. I don’t think I will be seeing the installation and setup screen until after a night of sleep. Update: it appears I have downloaded the DDK, not the SDK, as the SDK has yet to appear on subscriber downloads.

Chris Sells points out there will not be any Visual Studio bits to put on this build (M7.2 Longhorn). I might have to grab vi and see if the muscle memory in my fingers can still play :w and yy like the days of old.

Wesner Moise tells us that MSDN has already updated the online Longhorn SDK to reflect the latest build.

Finally, Scoble addressed all the hardware requirement speculators who believe Longhorn will require a creation from the Los Alamos labs to run. I know I’ve been running the PDC bits on what some would consider ridiculously modest hardware – a 1GHz P3 with 1GB of RAM. Quit laughing! It runs pretty well!

ASP.NET Validation : False Sense of Security?

Monday, May 3, 2004 by scott

A subtle and dangerous bug appears regularly in newsgroup postings, and some have even spotted the problem in sample code from articles and books.

Take the following ASPX snippet:

<asp:textbox id="txtPassword" runat="server"/>
<asp:button id="btnSubmit" runat="server" Text="Submit"/>
<asp:requiredfieldvalidator id="valReqPassword" 
            runat="server" ErrorMessage="Password required" 
            ControlToValidate="txtPassword"/>

And the following code-behind logic:

private void btnSubmit_Click(object sender, 
                             System.EventArgs e)
{
   SetUserPassword(txtPassword.Text);
   Response.Write("Password set!");
}

If you enter a blank password and click submit on this form (in a DHTML-capable browser), the validation control prevents the postback and displays an error message next to the TextBox control. Testing complete, validation works, continue to the next form.

I’m sure many of you have spotted the problem, but judging from newsgroup postings this isn’t so easy for newcomers to catch. One can expose the bug by setting the EnableClientScript property of the validation control to false. Now if the user enters a blank password and clicks submit, the validation error message still appears, but all of the code inside the click event handler executes too. Unless there is a database constraint in place, chances are the user just set their password to an empty string.
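
In markup, that means declaring the validator from the earlier snippet as:

<asp:requiredfieldvalidator id="valReqPassword" 
            runat="server" ErrorMessage="Password required" 
            ControlToValidate="txtPassword"
            EnableClientScript="false"/>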

Even with client side scripting enabled, we know it would be easy to hand the software un-validated input with the System.Net.WebRequest class. Client side validation works so well in the browser, however, that the vulnerability is hard to see.
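
A minimal sketch of the idea (the URL is hypothetical, and a real WebForms post would also need to echo the __VIEWSTATE field from a prior GET):

WebRequest request = WebRequest.Create("http://localhost/SetPassword.aspx");
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";

// An empty password posted straight to the server - no client
// side validator script runs here.
byte[] data = System.Text.Encoding.ASCII.GetBytes(
    "txtPassword=&btnSubmit=Submit");
request.ContentLength = data.Length;

using (System.IO.Stream stream = request.GetRequestStream())
{
    stream.Write(data, 0, data.Length);
}
request.GetResponse().Close();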

The crux of the misunderstanding is how the client side validation behavior is entirely different from server side behavior. On the client side, if validation fails, the flow of execution effectively stops. On the server side, you have to check Page.IsValid and alter the flow yourself.
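
Which means the fix for the handler above is a single check:

private void btnSubmit_Click(object sender, 
                             System.EventArgs e)
{
   // The server side validators have already run by the time the
   // click event fires - but it is up to us to check the result.
   if (!Page.IsValid)
   {
      return;
   }

   SetUserPassword(txtPassword.Text);
   Response.Write("Password set!");
}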

Darren Neimke posted today about the difficulty of achieving elegance when writing functionality that spans the client and server sides. I agree; at times it still seems much harder than it should be to get things right (with pretty code), particularly since the two sides behave so differently. Perhaps failed validation should trigger an exception on the server side - but I realize this would break many existing applications. Still, something to think about ... and watch for Page.IsValid in code reviews.

Performance and Scalability Guide Released

Thursday, April 29, 2004 by scott

The patterns and practices team has released the definitive work on performance and scalability for .NET.

I can’t help but mention that if you look …

really hard …

near the bottom of the front page …

you’ll see I'm listed as an external reviewer.

I had a blast reviewing chapters – this guide is dense with tips and tricks on everything from ADO.NET to XML.

Using Reporting Services GetReportParameters web method

Wednesday, April 28, 2004 by scott

A question came up today: when a query defines the valid values for a report parameter, how do you retrieve those values through the web service API?

To do this you can use the GetReportParameters web method. First, be wary of a small bug in the documentation, where the order of the parameters is incorrect. Not a big problem with IntelliSense, though.

Example:

ReportingService rService = new ReportingService();
rService.Credentials = 
            System.Net.CredentialCache.DefaultCredentials;

string historyID = null;
bool forRendering = true;
ParameterValue[] values = null;
DataSourceCredentials[] credentials = null;
ReportParameter[] parameters;

parameters = rService.GetReportParameters(
            "/Northwind Reports/Report2",
            historyID,
            forRendering,
            values,
            credentials);

This will give you back an array of ReportParameter objects. If a ReportParameter object has a set of valid values defined by a query, reporting services will run the query and populate the ValidValues property with ValidValue objects (only when the forRendering argument is true). If we wanted to dump the valid values for parameters[0]:

foreach(ValidValue v in parameters[0].ValidValues)
{
    Console.WriteLine(v.Label + " " + v.Value);
}

Note, if you have hierarchical parameters, where, for example, the set of valid values for parameters[1] depends on the value selected for parameters[0], you’ll need to initialize the ParameterValue array to get the valid values for parameters[1]. If you are doing this all from scratch this may mean multiple calls to GetReportParameters.
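
For example (the parameter name "Country" here is hypothetical), a second call that passes along the value chosen for the first parameter:

ParameterValue country = new ParameterValue();
country.Name = "Country";
country.Value = "USA";

ParameterValue[] selected = new ParameterValue[] { country };

// With a value supplied for parameters[0], reporting services can
// run the dependent query and fill in parameters[1].ValidValues.
parameters = rService.GetReportParameters(
            "/Northwind Reports/Report2",
            historyID,
            forRendering,
            selected,
            credentials);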