Someone should delete the article sitting in purgatory at codeproject.com. The article is full of misinformation, but look at the view count.
The article attempts to restrict a Windows application to a single running instance, and it tries to do this using the Process class from the System.Diagnostics namespace. The code invokes Process.GetProcessesByName(processName) to see if there are any existing processes running with the same name, and exits if another is found. If you do some searching, you’ll find other code snippets using the same technique.
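Here is a rough sketch of the technique, just to make the discussion concrete (the class name and structure are mine, not the article’s exact code):

using System;
using System.Diagnostics;
using System.Windows.Forms;

class Program
{
    [STAThread]
    static void Main()
    {
        // The flawed approach: look for another process with the same name.
        // Length > 1 because GetProcessesByName includes the current process.
        string processName = Process.GetCurrentProcess().ProcessName;
        if (Process.GetProcessesByName(processName).Length > 1)
        {
            MessageBox.Show("Instance already running");
            return;
        }

        Application.Run(new Form1());
    }
}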
There are at least three problems with this approach:
1) It doesn’t account for race conditions. Two instances of the application could launch at nearly the same time, see each other, and both shut down.
2) It doesn’t work in terminal services, at least not if I want the application to run an instance in each login session.
3) It doesn’t account for the possibility that someone else might have a process with the same name.
These problems might be considered ‘edge conditions’, except there is an easy, foolproof way to check for a running instance of an application. A named mutex allows multiple processes to use the same mutex object for interprocess synchronization. The author asserts a mutex is not safe on a multiprocessor machine. If this were true it would be the end of civilization as we know it.
The name of the mutex should preferably be a unique identifier to reduce the chances of another application using the same name. One could choose to use the full name of the executing assembly, or a GUID. If the application can acquire the named mutex with WaitOne, then it is the first instance running. If the application calls WaitOne with a timeout value and WaitOne returns false, another instance of the application is already running and this one needs to exit.
When using this approach in .NET there is one ‘gotcha’. The following code has a small problem:
[STAThread]
static void Main()
{
    Mutex mutex = new Mutex(false, appGuid);
    if(!mutex.WaitOne(0, false))
    {
        MessageBox.Show("Instance already running");
        return;
    }
    Application.Run(new Form1());
}

private static string appGuid = "c0a76b5a-12ab-45c5-b9d9-d693faa6e7b9";
The problem is easy to reproduce if you run the following code in a release build:
[STAThread]
static void Main()
{
    Mutex mutex = new Mutex(false, appGuid);
    if(!mutex.WaitOne(0, false))
    {
        MessageBox.Show("Instance already running");
        return;
    }
    GC.Collect();
    Application.Run(new Form1());
}

private static string appGuid = "c0a76b5a-12ab-45c5-b9d9-d693faa6e7b9";
Since the mutex goes unused when the Form starts running, the compiler and garbage collector are free to conspire together to collect the mutex out of existence. After the first garbage collector run, one might be able to launch multiple instances of the application again. The following code will keep the mutex alive. (The call to GC.Collect is still here just for testing).
[STAThread]
static void Main()
{
    Mutex mutex = new Mutex(false, appGuid);
    if(!mutex.WaitOne(0, false))
    {
        MessageBox.Show("Instance already running");
        return;
    }
    GC.Collect();
    Application.Run(new Form1());
    GC.KeepAlive(mutex);
}

private static string appGuid = "c0a76b5a-12ab-45c5-b9d9-d693faa6e7b9";
There is still an imperfection in the code. Mutex derives from WaitHandle, and WaitHandle implements IDisposable. Here is one more example that keeps the mutex alive and properly disposes the mutex when finished.
[STAThread]
static void Main()
{
    using(Mutex mutex = new Mutex(false, appGuid))
    {
        if(!mutex.WaitOne(0, false))
        {
            MessageBox.Show("Instance already running");
            return;
        }
        GC.Collect();
        Application.Run(new Form1());
    }
}
With the above code I can run the application from the console, and also log into the machine with terminal services and run the application in a different session. Terminal services provides a unique namespace for each client session (so does fast user switching on Windows XP). When I create a named mutex, the mutex lives inside the namespace for the session I am running in. Like .NET namespaces, terminal services uses namespaces to prevent naming collisions.
If I want to have only one instance of the application running across all sessions on the machine, I can put the named mutex into the global namespace with the prefix “Global\”.
[STAThread]
static void Main()
{
    using(Mutex mutex = new Mutex(false, @"Global\" + appGuid))
    {
        if(!mutex.WaitOne(0, false))
        {
            MessageBox.Show("Instance already running");
            return;
        }
        GC.Collect();
        Application.Run(new Form1());
    }
}
After all this you may find you need to adjust permissions on the mutex in order to access the mutex from another process running with different credentials than the first. This requires some PInvoke work and deserves a post unto itself.
This was all particularly impressive considering Dan had moved into the pointy-haired legions of ‘management’.
Back in the day before Outlook had a desktop alert, Dan decided to write his own alert as an Outlook plug-in. Every time a new message arrived, the plug-in used the MS text to speech engine to read the subject line aloud.
When Dan's machine said "Error in production", Dan knew something important was up. When Dan's machine said "Halloween Dress Up Day", Dan knew he could continue working on something important and leave the email alone.
This all worked pretty well until the childish developers in the company realized the power of speech. Dan would be having a staff meeting with 4 other people in the office when his computer would blurt out:
“Beer. Beer. Beer. Beer. Beer. Beer. Beer. Beer. Beer. Beer. Beer.“
Anyway, when we were an up and coming company we had to plan for our destiny. It was very important, the VCs would tell us, to be prepared for the massive incoming rush of future customers. The absolute worst scenario for an Internet company is not to have the infrastructure required to meet customer demand. IBM television commercials would ridicule any company caught unprepared.
Like many companies of the late 90s, we promptly leased enough networking gear to wire the planet over twice.
In 2001 cash became an issue. Someone decided that if we could just get out of all the leases and replace high end Cisco and Sun hardware with stuff from WalMart, then we would buy ourselves enough time for the massive incoming rush of customers to arrive.
Dan’s job was to crunch all the numbers and present a plan to the CFO. Dan knew the first answer is never acceptable in these scenarios, so he came up with a complete financial model in the form of a spreadsheet. If they didn’t like the first answer, they could just change a few numbers around and the entire spreadsheet would recalculate.
About 6 months later Dan told me what happened when he turned the spreadsheet over to the CFO. The CFO was excited and called Dan into his office.
CFO: Look, I can change a number here, and the number down here changes!
Dan: Yes, I thought I’d put this in a spreadsheet to try a few different things and see what works out best.
CFO: Yes, but look! I can change a number here, and the number down here changes!
Dan: Well, yeah, it’s a spreadsheet…..
CFO: But this is fantastic! I can change a number here, and the number down here changes!
I heard this story as I was packing the books at my desk into cardboard boxes. At this point nothing surprised me. In fact, being as how we were only days away from closing the door for good, maybe it was all starting to make sense.
Note: this story is not my fondest memory of the CFO…..
In order for SQL Server Reporting Services to deliver a report to the destination of your choosing, you only need to create an assembly with class types implementing 3 simple interfaces: IExtension, IDeliveryExtension, and ISubscriptionBaseUIUserControl.
Then the fun really begins. I have some tips to share for anyone else who tries. Refer to the last post for the source code.
My delivery extension needs to plug into both the user interface of the Report Manager (for the user to set delivery parameters), and into the ReportServerService (where all the heavy lifting and rendering takes place). After every build I had to get the assembly into the bin directory of both the Report Manager (the web application), and the ReportServerService. All this takes is a little batch file (like the following) and a command prompt with admin privileges.
net stop reportserver
xcopy /y (BuildPath) (SSRSHome)\ReportServer\bin
xcopy /y (BuildPath) (SSRSHome)\ReportManager\bin
net start reportserver
There is no need to shut down the Report Manager web application. ASP.NET shadow copies the extension assembly and leaves the original unlocked in the bin directory. The runtime will recognize when the assembly has changed and load the new version immediately.
The next step was to have SSRS accept my new delivery extension with loving arms. There were 4 total configuration files to modify. The RSWebApplication.config and RSReportServer.config files are easy to figure out. Just provide SSRS with the type name and the assembly name at the appropriate location in the XML:
<Extensions>
  <Delivery>
    <Extension Name="Report Server Blog"
               Type="OdeToCode.BlogDeliveryExtension.BlogDeliveryProvider, BlogDeliveryExtension">
      <MaxRetries>3</MaxRetries>
      <SecondsBeforeRetry>900</SecondsBeforeRetry>
    </Extension>
  </Delivery>
</Extensions>
The rssrvpolicy.config and rsmgrpolicy.config policy files are a different story. These files manage the security policies of SSRS. After nearly blacking out from reading the documentation on code groups for the 15th time and not having any success, I found Bryan Keller’s post with a hint on where to place extension code groups (right after the CodeGen membership group).
<CodeGroup class="UnionCodeGroup"
           version="1"
           PermissionSetName="FullTrust"
           Name="Report Server Blog"
           Description="Code group for OdeToCode Blog Extension">
  <IMembershipCondition class="UrlMembershipCondition"
                        version="1"
                        Url="C:\Program Files\Microsoft SQL Server\MSSQL\Reporting Services\ReportManager\Bin\BlogDeliveryExtension.dll" />
</CodeGroup>
Typos or incorrect element placement can lead to a wide variety of interesting exception messages and log file entries. Any exception with “security”, “permission”, or “cannot load type” in the description is a possible user malfunction in editing the code groups.
To debug the UI behavior I’d attach to the ASP.NET worker process. To debug the actual delivery, I’d attach to the ReportServerService, or use System.Diagnostics.Debugger.Launch() to bring up a debugger as soon as execution hit a particular line of code.
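For example, a call like the following at the top of the delivery code pops up the just-in-time debugger prompt when the ReportServerService calls into the extension (the method name here is only illustrative, not the real interface signature):

using System.Diagnostics;

public class BlogDeliveryProvider
{
    // Illustrative method name only - stands in for the delivery entry point.
    public void DeliverReport()
    {
        // If no debugger is attached yet, ask Windows for a just-in-time debugger and break here.
        if (!Debugger.IsAttached)
        {
            Debugger.Launch();
        }

        // ... the real delivery work goes here ...
    }
}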
To execute the delivery I first needed to set up a subscription to a report. I set up a subscription to run a report every Monday morning at 8 am. I’ve got the debugger ready and just need to wait a few days now for the breakpoint to hit.
Just kidding.
There is a SQL Agent job for each schedule in SSRS. By executing the Agent job, you can trigger the subscription to fire off a delivery. Finding the right job can be problematic if you have several report subscriptions set, but scanning the results of the following query might help.
SELECT RS.ScheduleID, C.Name, U.UserName, S.Description
FROM ReportSchedule RS
  INNER JOIN Subscriptions S ON RS.SubscriptionID = S.SubscriptionID
  INNER JOIN Users U ON S.OwnerID = U.UserID
  INNER JOIN [Catalog] C ON RS.ReportID = C.ItemID
ScheduleID | Name | UserName | Description |
B20C9057-EE51-41E2-B3B5-7450AC73FFCB | Customers Report | REPORTING\bitmask | Post report to http://ibm600xp/dottext/scott/services/simpleblogservice.asmx |
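Once the query points you at the right ScheduleID, you can start the matching Agent job by hand, or from a throwaway console program along these lines. This sketch assumes the Agent job name matches the ScheduleID from the query above (worth double-checking against the job list on your own install); the connection string and GUID are placeholders:

using System;
using System.Data;
using System.Data.SqlClient;

class TriggerDelivery
{
    static void Main()
    {
        // Assumption: the SQL Agent job is named after the ScheduleID returned by the query.
        string scheduleId = "B20C9057-EE51-41E2-B3B5-7450AC73FFCB";

        using (SqlConnection connection = new SqlConnection(
            "Server=(local);Database=msdb;Integrated Security=SSPI"))
        {
            SqlCommand command = new SqlCommand("dbo.sp_start_job", connection);
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.Add("@job_name", SqlDbType.NVarChar, 128).Value = scheduleId;

            connection.Open();
            command.ExecuteNonQuery();   // kicks off the subscription's delivery
        }
    }
}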
That concludes the tips for now. Happy 100th post to me.
SQL Server Reporting Services ships with two delivery extensions: one to deliver reports through email, and one to deliver reports to a shared network drive. A third extension to deliver reports to a printer exists in the SSRS samples directory.
One day I was setting up an email subscription and a thought occurred. Delivering reports to a blog instead of to a company email alias would be ideal. Instead of sitting in a slew of inboxes, a new delivery extension could post these reports with the blog’s web service API and intranet users could easily comment on and link to the report. Anyone needing the report subscribes to the blog with an aggregator and knows when a new report is ready.
I learned quite a bit and have quite a number of tips to share about the experience, but I’ll have to save those for future posts. Implementing, deploying, testing, and debugging a reporting service extension involves a little more work than I initially suspected. It looks easy in theory (just implement these 3 simple interfaces!).
In the meantime, you can look at the source if you dare. It has no warranty, no guarantees, contains liberal amounts of TODO comments, and comes with no installation instructions (yet).
BlogDeliveryExtension.zip (C#)
On the plus side, it does work on the simple reports I've tested so far. I have not tried reports with images; I suspect these are going to pose a problem. I just finished adding DPAPI calls to keep the blog user's password encrypted in the ReportServerDB, and the next step is to look at the fancier reports.
Feedback and criticism welcomed.
If you find reporting to be a boring subject, then I'm sure you won't download the code, and won't offer any feedback, because you've already stopped reading this.
I was looking at some web forms with Trace enabled in ASP.NET 2.0, looked at the ViewState size, and remembered reading about some view state enhancements in 2.0. So I did a little experiment.
In ASP.NET 1.1 I rendered ‘SELECT * FROM pubs..employee’ in a DataGrid control using all the default settings (auto-generated columns, ViewState enabled). The resulting page used 26,241 bytes.
In ASP.NET 2.0 I rendered ‘SELECT * FROM pubs..employee’ in a DataGrid control, again using all the defaults. The resulting page used 16,089 bytes. That’s a difference of 10,152 bytes, which is quite a bit if you insist on enabling ViewState on a DataGrid. The same experiment using the new GridView control, with sorting enabled, ran 17,996 bytes.
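For anyone who wants to reproduce the numbers, the setup is nothing fancy; a code-behind along these lines will do (the connection string and control name are my placeholders, not the exact test code):

using System;
using System.Data;
using System.Data.SqlClient;
using System.Web.UI.WebControls;

public class EmployeeGridPage : System.Web.UI.Page
{
    // Declared in the .aspx with runat="server"; ViewState left at the default (enabled).
    protected DataGrid EmployeeGrid;

    protected override void OnLoad(EventArgs e)
    {
        base.OnLoad(e);

        if (!IsPostBack)
        {
            using (SqlConnection connection = new SqlConnection(
                "Server=(local);Database=pubs;Integrated Security=SSPI"))
            {
                SqlDataAdapter adapter =
                    new SqlDataAdapter("SELECT * FROM employee", connection);
                DataTable employees = new DataTable();
                adapter.Fill(employees);

                // Auto-generated columns, everything else at the defaults.
                EmployeeGrid.DataSource = employees;
                EmployeeGrid.DataBind();
            }
        }
    }
}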
For anyone who has not seen the GridView in action, Dino Esposito has an article: Move Over DataGrid, There's a New Grid in Town!
I noticed in the trace there is a new PreInit event in the page life cycle. This makes me wonder if we might need a PrePreInit event in ASP.NET 3.0? Perhaps a BeforePreInit?
It turns out PreInit is the event to grab in order to dynamically change the personalization, theme, or master page settings (it was also time to get in touch with my kinder, gentler VB side, but don’t tell anyone):
Private Sub Page_PreInit(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.PreInit
    MasterPageFile = "~/FunkyDoodleLayout.master"
End Sub
Setting the MasterPageFile property after the PreInit event will only earn you an exception. Paul Wilson has a good introduction to the Master Page feature: Standardize Your Site Fast With Master Pages.
Somehow, in the middle of all this, I began surfing and read about Boston’s Great Molasses Flood of 1919. I’d never heard of it and first thought it was an urban myth, but it’s not. A 30-foot-high wall of molasses moving at 35 miles per hour.
The code expansion in C# is quite good. I type the ‘using’ keyword, hit the TAB key, and the IDE inserts a code block for me:
using (resource) { }
The word resource appears in yellow and is selected, so I just need to type in the expression. If you expand a for statement, then the initializer is highlighted. As soon as you change the variable name in the initializer, the IDE changes the variable name in the expression and iterator to match, saving quite a few keystrokes.
What is interesting is the C# code snippets center around simple block statements (lock, using, for, do). The VB snippets seem to include entire algorithms. You choose to “Insert Snippet…”, then choose “Processing Drives, Folders, and Files”, then “Parse Column Data In a Text File”, and the IDE spits out the following code, where the file name and delimiters array are replaceable by tabbing through the snippet and typing (this is hard to describe, you just have to try it):
' This example parses a file with this structure.
' Line1Column1, Line1Column2, Line1Column3
' Line2Column1, Line2Column2, Line2Column3
' Line3Column1, Line3Column2, Line3Column3
' Line4Column1, Line4Column2, Line4Column3
Dim parser As TextFieldParser
parser = My.Computer.FileSystem.OpenTextFieldParser("C:\TextFile.txt")
parser.Delimiters = New String() {","}
Dim fields() As String
While Not parser.EndOfData
    Try
        ' ReadFields reads one line of data from the file.
        ' Array 'fields' contains one string element for each column.
        fields = parser.ReadFields
    Catch ex As MalformedLineException
        MsgBox("Error on line: " & ex.LineNumber)
        Throw ex
    End Try
End While
parser.Close()
As I’ve said before, someday I’m going to come home and find my cat has written a Tetris clone by sleeping on my keyboard. In VB of course.
I work in a government technology “incubator” building, meaning there are a dozen startup companies around here and we share a common copier room, common break room, and common conference rooms. The key word here is ‘government’. I have complete faith the county maintenance people feel my discomfort and will rush to fix the air conditioning problems just in time for the first major snowfall of the year.
In other news, I’ve been under a lot of peer pressure lately. Test driven development is a great way to write quality code, but sometimes good old fashioned peer pressure works just as well. Why just recently I opened the latest build notes to look at the list of file diffs. Imagine my surprise when I see:
File | Check-in Date | Version | Check-in By | Comment |
VisitSearch.ascx.cs | 8/8/2004 11:42:36 AM | 2 | Plall | Fix Scott’s sloppy code. |
For anyone facing authentication or authorization errors in a Reporting Services environment, or just looking for some introductory material, I've posted two new articles to OdeToCode:
Introduction To Role-Based Security In SQL Server Reporting Services
Authentication, Role-based Security, and Reporting Services Web Services
If you give them a read, please let me know whether you think they are easy to understand, and whether they contain any errors.