
ASP.NET and Windows Workflow Foundation

Sunday, February 11, 2007
Programming Windows Workflow Foundation by Scott Allen. If you enjoyed this article, you'll enjoy the book even more! Order now from Packt Publishing and save 10%!

Combining ASP.NET 2.0 and Windows Workflow Foundation (WF) will provide us with all the essential tools for building workflow-enabled web applications. Both ASP.NET and WF, however, are significant pieces of technology. Joining the two into a long-term relationship requires some careful thought and planning. In this article, we will look at an order processing application built for the web, and walk through the design. The application includes practical abstractions that shield the web application from the details of the workflow runtime. The application also addresses some real world scenarios. For instance, we'll keep our application state consistent with our workflow state by enlisting our database work inside the same transactions used by the WF tracking and persistence services.

Windows Workflow and ASP.NET In Harmony

Full source code to this sample is available for download. After extracting the solution files, see the readme.htm file for installation tips. This article doesn't go into great detail on how to use Windows Workflow Foundation, or how to use ASP.NET, but does highlight important decisions to make when combining the two, and tries to point to other resources for additional information.

wfOrders

The requirements have arrived from the product manager and we're building an order tracking application. What sort of orders? Well, these could be book orders, or bicycle orders – we don't care about the details. What we do care about is the requirement to track the order from creation to completion, with a number of steps in-between. We also need to record the processing history of each order, and provide our non-technical business experts some insight into the order-processing pipeline. Finally, our product manager wants us to provide an intelligent user interface that can, for example, prevent a user from accidentally marking an order as shipped when the warehouse has yet to process the order.

These requirements play to the strengths of Windows Workflow Foundation. Microsoft designed WF to manage long-running processes, and to provide transparency into both the design and the execution of those processes. An out-of-the-box tracking service can record the processing history for each order, and the state machine workflow type provided by WF can prevent a user from accidentally short-circuiting the process.

Enough of the marketing spiel, though. Let's build something.

The State Machine

For this application we are going to use a state machine workflow, and this workflow will be the first piece of the sample we will build (see OrderStateMachine.xoml in the project wfOrderWorkflows). Before authoring a new workflow, one has to choose between a sequential workflow and a state machine workflow. Both models have their relative strengths and weaknesses. My article "State Machines In Windows Workflow" highlights the differences and offers an in-depth look at the design and execution of state machines. We won't cover those basics in this article.

Let's say we've discussed the life of an order with the business people and we have a clear picture of the business process. From the discussion we've determined an order is always in one of three states: open, processed, or completed. An order is "open" when a customer first creates the order, and moves to a "processed" state when it is ready to ship. If the customer changes the order in this state, the order will return to the "open" state. When we ship a processed order (or the customer cancels the order), the order moves to a completed state and the workflow is finished. We can start to model this process by dragging the state activities we need from the workflow toolbox window into a new state machine workflow.

A first cut at the state machine

At this point we only have empty states – we haven't created events and transitions between states. Notice we actually have four states in the workflow. We've decided to add an "Initial" state that represents the stage of the process just before an order is created, so that the creation step itself is part of the workflow. We'll see the advantages of this approach later.

The Contract

A state machine workflow needs the outside world to raise events and drive the workflow to completion. These events represent real actions that can happen to an order. In talking with the business people, we’ve decided there are 5 distinct actions that can occur to an order during its lifetime. An order can be created, updated, processed, canceled, and shipped.

To deliver these events to the workflow, we'll need to define a communications contract. In .NET, an interface type defines a software contract. The following interface is decorated with the ExternalDataExchange attribute. This attribute allows WF to recognize the interface as a contract for delivering workflow-related events.

[ExternalDataExchange]
public interface IOrderService
{
    event EventHandler<OrderEventArgs> OrderCreated;
    event EventHandler<OrderEventArgs> OrderShipped;
    event EventHandler<OrderEventArgs> OrderUpdated;
    event EventHandler<OrderEventArgs> OrderProcessed;
    event EventHandler<OrderEventArgs> OrderCanceled;
}

The OrderEventArgs class we are using in the interface represents the data we are delivering to WF. In some scenarios, simply raising an event to a workflow provides enough information for a state machine to act. However, we want to pass in a "business object" or "domain object" that represents the order being acted on by the user. We'll find out later that WF will use this object to update an order record in the database. By passing in the order object we could also perform validations and run business rules inside the workflow before allowing the order to progress to a new state. Passing the order gives us greater flexibility and power inside the WF code. We will define our event arguments class with the following code.

[Serializable]
public class OrderEventArgs : ExternalDataEventArgs
{
    public OrderEventArgs(Guid instanceId, Order order)
        : base(instanceId)
    {
        Check.ArgumentIsNotNull(instanceId, "instanceId");
        Check.ArgumentIsNotNull(order, "order");

        _order = order;
    }

    private Order _order;

    public Order Order
    {
        get { return _order; }
        set { _order = value; }
    }
}

The Order class itself is defined in the wfOrderCommon project. You can assume it is a typical "business" object with properties that ultimately map to columns in one or more database tables.

Note: it is important for the event arguments to be 100% serializable. Since the event arguments contain an instance of the Order class, the Order class will also need to be serializable. In this sample it is enough to decorate both with the Serializable attribute.
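For reference, here is a minimal sketch of what such an Order class could look like. This is not the actual wfOrderCommon code, just an illustration built from the properties the rest of this article relies on (Id, Title, State, and WorkflowId).

// Illustrative sketch only - the real Order class ships in the wfOrderCommon project.
[Serializable]
public class Order
{
    private int _id;            // primary key (0 until the order is inserted)
    private string _title;      // order description shown in the UI
    private string _state;      // current workflow state name ("Open", "Processed", "Completed")
    private Guid _workflowId;   // the workflow instance tracking this order

    public int Id
    {
        get { return _id; }
        set { _id = value; }
    }

    public string Title
    {
        get { return _title; }
        set { _title = value; }
    }

    public string State
    {
        get { return _state; }
        set { _state = value; }
    }

    public Guid WorkflowId
    {
        get { return _workflowId; }
        set { _workflowId = value; }
    }
}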

At some point, we will need to provide a concrete implementation of this interface, but for now we can finish our state machine.

The Activities

Again, if you are not familiar with the design and execution of state machine workflows, I encourage you to go back and read up on the basics, as this section is going to move fast.

With an external data exchange interface in place, we can start to add EventDriven activities to our workflow. These activities will listen for events to arrive from the outside world. The outside world could be an ASP.NET application, or could be a Windows Forms smart client – the workflow doesn't (and shouldn't) care. It is at this point that we will decide which events are legal events in each state, and where those events will lead. When we are done adding all the activities, our workflow will look like the following.

A fleshed out state machine

We can see that when the order is in the "Open" state, it will accept the OrderUpdated and the OrderProcessed events, but not the OrderShipped event (we set the Name property on each EventDriven activity to match the name of the event it will listen for - a highly recommended practice). If we click on an EventDriven activity, we can add additional activities that will execute when the associated event arrives. The screen shot below shows a completed EventDriven activity for the OrderProcessed event.

 EventDriven activity details

Inside we've placed three activities. The first activity is a HandleExternalEvent (HEC) activity. The HEC is configured to wait for an OrderProcessed event to arrive from a service implementing the IOrderService interface we defined earlier. If that event arrives, control will continue to the next activity, a SetState activity. The SetState activity tells the state machine to transition to the Processed state (this activity has focus, and you can see the configuration in the Properties window of the screen shot).

You might be wondering why the SetState activity is not the last activity in the sequence, as it would seem logical that the last thing we want to do is transition to the next state. It turns out that WF will transition to the state targeted by the last SetState activity that executes, even if that SetState activity is not the last activity to execute. In this case, our last activity (a custom activity we will discuss soon), actually binds to some meta properties of the SetState activity, and it works better to have our custom activity appear after SetState (to avoid compiler warnings about binding to an activity that executes later in the sequence).

Our last activity is special: it is a custom activity that encapsulates the updating and saving of an order to the database. In addition to saving all the order details, like the order description, this custom activity will also update a State property on the business object representing an order. This State property will ultimately be saved into a column in our database table along with the rest of the order data. By keeping the current status of an order in the same table as the order itself, we've made it easy for the user interface and other components to quickly inspect the current status of all orders.

Let me expand on the above paragraph a bit. Our UI will ultimately want to present a grid view of orders in the system, like the one seen at the beginning of the article. We'll want to display the current state for each order, and this state should reflect the current state of the workflow associated with the order. There are actually many techniques we could use to retrieve (or record) the status of any given order. We could use a WF tracking service, as Matt Milner shows in his blog post "Tracking the current state in a state machine workflow". Another approach is to use a StateMachineWorkflowInstance object, but this doesn't scale well if we need to display multiple orders. We will use the StateMachineWorkflowInstance class for another purpose later in this article.

Our approach is to save the state of an Order into the Order table of the database. Anyone issuing a "SELECT * FROM Orders" query will see the current status and all other information for all orders. How will our custom activity know what state the Order will be in? We will bind the State property on our custom activity to the TargetStateName property of the previous SetState activity. See "Using Dependency Properties" for more information on activity binding.

custom activity property binding
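To make that binding possible, the custom activity exposes State as a workflow dependency property. The following is a sketch of the typical pattern – the activity class name used here (UpdateOrderActivity) is an assumption for illustration, not necessarily the name used in the download.

// Sketch: exposing a bindable State property on the custom activity.
// "UpdateOrderActivity" is an assumed name for illustration.
public partial class UpdateOrderActivity : Activity
{
    public static readonly DependencyProperty StateProperty =
        DependencyProperty.Register("State", typeof(string), typeof(UpdateOrderActivity));

    // Bound in the designer to the TargetStateName property of the preceding SetState activity
    public string State
    {
        get { return (string)GetValue(StateProperty); }
        set { SetValue(StateProperty, value); }
    }
}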

Before we show some more details of our custom activity, let's talk about how we will get this information into the database in a reliable manner.

The Transaction

We'll be making use of the WF SQL Persistence Service. Why? Well, it may take days or weeks for an order to move from the "Open" state to the "Completed" state. We can't rely on the web server staying up 100% of the time for weeks on end, or having enough memory to keep millions of order workflows in RAM. We need a place to save our workflows, and this is the job of a workflow persistence service. If you want more details on the SQL persistence service, see my article on "Hosting Windows Workflow".

In short, when our workflow reaches a point where it has nothing to do but wait for an event to arrive, the workflow runtime will see the workflow is idle and ask the persistence service (if one is configured) to take action. The SQL persistence service will serialize the workflow and shove the resulting bits into a SQL Server table. The service can then unload the workflow instance from memory. Weeks later, when a new event arrives (like the order finally shipped), the WF runtime will ask the persistence service to rehydrate the workflow. The runtime can then deliver the new event to the workflow and the workflow can resume execution.
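How the persistence service reaches the runtime is a hosting decision. As a hedged sketch (the connection string, method name, and parameter values below are placeholders, not the sample's actual configuration), adding the SQL persistence service when creating the runtime might look like this. The service could just as easily be registered through the WorkflowRuntime section of web.config.

// Sketch: adding the SQL persistence service when the host creates the runtime.
// Requires the System.Workflow.Runtime and System.Workflow.Runtime.Hosting namespaces.
private static WorkflowRuntime CreateRuntime()
{
    WorkflowRuntime runtime = new WorkflowRuntime();

    string connectionString =
        "Data Source=.;Initial Catalog=WorkflowStore;Integrated Security=SSPI;";

    SqlWorkflowPersistenceService persistence =
        new SqlWorkflowPersistenceService(
            connectionString,
            true,                        // unload workflows whenever they go idle
            TimeSpan.FromMinutes(5),     // instance ownership duration
            TimeSpan.FromSeconds(30));   // polling interval for expired timers

    runtime.AddService(persistence);
    return runtime;
}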

You might wonder what will happen if we save our updated Order object to the database, but the persistence service fails to connect to SQL Server and throws an exception. We'd have inconsistent data in our database, which is always bad for business! It would be best if our custom activity could work inside the same database transaction that the persistence service will use (and the tracking service, too). That way we can ensure that our Order table and the workflow will both agree on the state of all orders, regardless of catastrophic database or network errors. Working with transactions in WF is possible using a variety of techniques. See "Workflows and Transactions" and "Windows Workflow Runtime Service: The Transaction Service" for more details from WF team members.

Our approach will be to interact with a transaction service. This means our workflow doesn't explicitly model a transaction in the workflow with a TransactionScope activity. Instead, our custom activity will shield a workflow developer from the intricacies of transactions by implicitly working with a transaction service. For details on building custom activities, see Matt Milner's article "Build Custom Activities To Extend The Reach Of Your Workflows". Let's look at the Execute method of our custom activity.

protected override ActivityExecutionStatus Execute(
    ActivityExecutionContext executionContext)
{
    if (Order != null)
    {
        Order.State = State;
        Order.WorkflowId = WorkflowInstanceId;

        IOrderTransactionService txnService =
            executionContext.GetService<IOrderTransactionService>();
        txnService.UpdateOrder(Order);
    }

    return ActivityExecutionStatus.Closed;
}

Our activity doesn't work with the database directly, but asks for a service implementing the IOrderTransactionService interface. The custom activity asks this transaction service to do the work. A transaction service in WF has to implement the IPendingWork interface. First, we define an interface for our transaction service.

public interface IOrderTransactionService : IPendingWork
{
    void UpdateOrder(Order order);
}

We implement this interface with the following concrete class.

class OrderTransactionService : IOrderTransactionService
{
    #region IOrderTransactionService Members

    public void UpdateOrder(Order order)
    {
        Check.ArgumentIsNotNull(order, "order");

        WorkflowEnvironment.WorkBatch.Add(this, order);
    }

    #endregion

    #region IPendingWork Members

    public void Commit(Transaction transaction, ICollection items)
    {
        Check.ArgumentIsNotNull(transaction, "transaction");
        Check.ArgumentIsNotNull(items, "items");

        OrderGateway gateway = new OrderGateway();

        foreach (object o in items)
        {
            Order order = o as Order;
            Check.IsNotNull(order,
                "Unexpected object in items work batch collection");

            if (order.Id == 0)
            {
                gateway.InsertOrder(order);
            }
            else
            {
                gateway.UpdateOrder(order);
            }
        }
    }

    public void Complete(bool succeeded, ICollection items)
    {
        // nothing to cleanup
    }

    public bool MustCommit(System.Collections.ICollection items)
    {
        return items != null && items.Count > 0;
    }

    #endregion
}

The first method, the UpdateOrder method that our custom activity invokes, doesn't deal with the database either. Instead, it adds the order to the workflow's current work batch. Windows Workflow will collect batch items from our service, the persistence service, and the tracking service, and when the time is right will ask them all to commit after it has started an ambient transaction (a System.Transactions.Transaction). WF will call our service's Commit method, and our data access code will automatically enlist in this transaction (assuming we are using SQL Server). We've hidden the actual data access code behind an OrderGateway class, but you could easily use ADO.NET classes or other persistence approaches inside of Commit.

At this point we have just about everything our workflow needs. Now it's time to turn our attention to higher layers in the application and begin to think about how ASP.NET will interact with WF. We also need to build the service that will implement IOrderService and raise events to the workflow on behalf of ASP.NET.

The Mediator

Chances are that we don't want to put code that interacts directly with Windows Workflow inside our web forms. As we will see, it's not trivial to run a workflow and immediately fetch the results. We are going to build a mediator class that will take care of the workflow goo and make life easy for the UI developer. Before we look at our mediator class, I want to talk about one special consideration when using WF in ASP.NET.

Manual Scheduling

A scheduling service in Windows Workflow is responsible for acquiring a thread for the WF runtime to execute a workflow. For details on scheduling services, see my article on "Hosting Windows Workflow".

If you know about the scheduling services, then you know the default scheduling service will acquire threads from the CLR background thread pool. This default approach works well for smart client applications where threads are plentiful. In a server environment, however, using more background threads can starve the CLR thread pool. Besides, we don't really need to tie up two threads for every HTTP request when all we want to do is block the first thread until a workflow completes. Thus, we want to skip the default scheduler and configure the WF runtime to use the ManualWorkflowSchedulerService for our ASP.NET apps.

The manual scheduling service does not actively acquire a thread. Instead, the manual service waits for the host to donate a thread, and the WF runtime will use the donated thread to execute a workflow. With the manual scheduler, we can use a single thread both to process the HTTP request and to run the workflow synchronously.

One catch that might not be obvious is that we need to explicitly run a workflow by donating a thread whenever we deliver an event to a workflow. If you are familiar with Windows programming, this is similar to "pumping messages" with a thread to keep a window responsive. Keep this in mind as we progress.
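As a rough sketch (the workflowRuntime and instanceId variables below are placeholders), wiring up the manual scheduler and later pumping a workflow looks something like this:

// Sketch: register the manual scheduler with the runtime (done once at startup)...
ManualWorkflowSchedulerService scheduler = new ManualWorkflowSchedulerService();
workflowRuntime.AddService(scheduler);

// ...then, on the ASP.NET request thread, after raising an event to a workflow,
// donate the current thread so the workflow can process that event.
bool ran = scheduler.RunWorkflow(instanceId);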

Running Workflows

Let's now take a look at a few of the WorkflowMediator methods. Remember, this is a class we are building to hide workflow complexities. In most applications the mediator will be a singleton that is globally available. The mediator is responsible for creating a WorkflowRuntime instance, but doesn't need to expose the WF runtime to the outside world. Applying your favorite dependency injection / inversion of control pattern with the mediator means you can unit test other components that use workflows without involving the workflow runtime.

public WorkflowResults RunWorkflow(Type workflowType)
{
    Check.ArgumentIsNotNull(workflowType, "workflowType");

    WorkflowInstance instance =
        _workflowRuntime.CreateWorkflow(workflowType);
    instance.Start();

    bool result = WorkflowScheduler.RunWorkflow(instance.InstanceId);
    Check.IsTrue(result, "Could not run workflow "
                         + instance.InstanceId);

    return CurrentResultsInContext;
}

public WorkflowResults RunWorkflow(Guid instanceId)
{
    Check.IsNotTrue(instanceId == Guid.Empty, "Invalid Workflow Instance ID");

    WorkflowInstance instance = _workflowRuntime.GetWorkflow(instanceId);
    Check.IsNotNull(instance, "Could not retrieve workflow");

    bool result = WorkflowScheduler.RunWorkflow(instance.InstanceId);
    Check.IsTrue(result, "Could not run workflow "
                         + instance.InstanceId);

    return CurrentResultsInContext;
}

public ManualWorkflowSchedulerService WorkflowScheduler
{
    get
    {
        return _workflowRuntime.GetService<ManualWorkflowSchedulerService>();
    }
}

The WorkflowMediator class exposes two methods to run a workflow. The first method takes a Type parameter; the second version takes a Guid parameter. The first method assumes the workflow hasn't been created and will ask the WF runtime to create a new workflow instance. After calling Start on the workflow instance (which puts the workflow in a runnable state), the method uses the RunWorkflow method of the manual scheduler to execute the workflow. The second method uses the incoming GUID parameter and RunWorkflow to restart an existing workflow that is probably idle and waiting to process an event.

When we call RunWorkflow, we hand our thread over to the WF runtime. The WF runtime will use this thread to execute the workflow until the workflow completes, terminates, aborts, or goes idle. Once one of these actions occurs, the method call returns and we have control of our thread again. But how do we know what exactly happened to our workflow? Did it complete successfully? Did it terminate with an exception? The RunWorkflow method only returns a boolean value. A value of true means the workflow started running. This value doesn't tell us why the workflow ended. We need to look at the workflow lifetime events published by the workflow runtime.

Workflow Results

To understand what happens during the execution of a workflow we need to subscribe to events published by the WorkflowRuntime. These events include WorkflowCompleted, WorkflowTerminated, and WorkflowIdled, among many others. We have to be careful when subscribing to these events. Since most server-side applications will use a single WorkflowRuntime instance to run multiple workflows, we can create memory leaks and introduce race conditions if we don't handle the events properly. See my posts on managing workflow events in ASP.NET Part I and Part II for more details.
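A reasonable approach (sketched below) is to make the subscriptions exactly once, when the runtime is created, rather than on every request. The Completed and Terminated handler names match the handlers shown in a moment; the Idled and Aborted handlers are assumptions for illustration.

// Sketch: subscribe to the runtime's lifetime events once, at runtime creation,
// not per request - repeated subscriptions from multiple requests leak memory.
_workflowRuntime.WorkflowCompleted  += _workflowRuntime_WorkflowCompleted;
_workflowRuntime.WorkflowTerminated += _workflowRuntime_WorkflowTerminated;
_workflowRuntime.WorkflowAborted    += _workflowRuntime_WorkflowAborted;
_workflowRuntime.WorkflowIdled      += _workflowRuntime_WorkflowIdled;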

Since the goal of our WorkflowMediator is to hide the complexities of workflow from the rest of the application, the mediator takes care of subscribing to these events and translating the events into return values from its own RunWorkflow method. Let's take a look at a couple of the event handlers.

void _workflowRuntime_WorkflowTerminated(
    object sender, WorkflowTerminatedEventArgs e)
{
    CurrentResultsInContext =
        WorkflowResults.CreateTerminatedWorkflowResults(e);
}

void _workflowRuntime_WorkflowCompleted(
    object sender, WorkflowCompletedEventArgs e)
{
    CurrentResultsInContext =
        WorkflowResults.CreateCompletedWorkflowResults(e);
}

These event handlers take event arguments from the workflow runtime and create a WorkflowResults object. WorkflowResults is another class we've written in the wfOrderServices project. It has properties that can hold information for any kind of workflow event. Static factory methods call private constructors that look like the following.

public static WorkflowResults CreateCompletedWorkflowResults
                                (WorkflowCompletedEventArgs args)
{
    WorkflowResults results = new WorkflowResults(args);
    results._status = WorkflowStatus.Completed;
    return results;
}

public static WorkflowResults CreateTerminatedWorkflowResults
                                (WorkflowTerminatedEventArgs args)
{
    WorkflowResults results = new WorkflowResults(args);
    results._status = WorkflowStatus.Terminated;
    return results;
}

private WorkflowResults(WorkflowCompletedEventArgs args)
{
    Check.ArgumentIsNotNull(args, "args");
    _outputs = args.OutputParameters;
    _instanceId = args.WorkflowInstance.InstanceId;
    _definition = args.WorkflowDefinition;
}

private WorkflowResults(WorkflowTerminatedEventArgs args)
{
    Check.ArgumentIsNotNull(args, "args");
    _instanceId = args.WorkflowInstance.InstanceId;
    _exception = args.Exception;
}

We also have a Status property on our WorkflowResults class. We've defined a WorkflowStatus enum that can be one of the following values: Completed, Terminated, Aborted, or Running. We can infer the current status of a workflow based on the type of event we receive. A WorkflowCompleted event means the workflow is completed – likewise for the Terminated and Aborted events. If we receive a WorkflowIdled event, we can assume the workflow is still running, but it has just gone idle with no work to do until a new event arrives.
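The enum itself is nothing fancy – something along these lines (a sketch; see the download for the exact declaration):

// Status values the mediator infers from the runtime's lifetime events.
public enum WorkflowStatus
{
    Completed,
    Terminated,
    Aborted,
    Running
}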

The next challenge is to deliver these results to someone who cares. To do this we'll need a bit of context.

The Context Service

When working with the manual workflow scheduler, it is safe to assume that whoever called RunWorkflow donated the thread that executed the workflow and raised an event. If we place our WorkflowResults into a "context" for the current thread, then the caller can pull these results out of the context when it regains control of the thread.

In an ASP.NET application, we can use the current HttpContext Items collection as the context for our current thread. (One caveat here: if you use Delay activities and configure active timers for the manual scheduling service, those timer events will happen on a background thread that is not associated with an HTTP request.) We are not going to use HttpContext directly, however; we will hide it behind a service abstraction. After the WorkflowMediator receives a workflow event, it places a new WorkflowResults object into a property called CurrentResultsInContext. This property is implemented with the following code.

public WorkflowResults CurrentResultsInContext
{
    get { return _context.Items[_contextKey] as WorkflowResults; }
    set
    {
        if (_context.Items.Contains(_contextKey))
        {
            _context.Items[_contextKey] = value;
        }
        else
        {
            _context.Items.Add(_contextKey, value);
        }
    }
}

The property is using a context service that is stored in a field named _context. This service implements the following interface.

public interface IContextService
{
    IDictionary Items
    {
        get;
    }
}

If we are running under ASP.NET, we can provide a concrete implementation of the context service that looks like the following.

public class HttpContextWrapper : IContextService
{
    public IDictionary Items
    {
        get
        {
            return HttpContext.Current.Items;
        }
    }
}

Back in the WorkflowMediator, it's now easy to pull back the results of running a workflow and return these results to the caller.

public WorkflowResults RunWorkflow(Type workflowType)
{
    // ...
    return CurrentResultsInContext;
}

We now have one additional piece of infrastructure to work with. Although the WorkflowMediator may look like a great deal of work, it contains relatively few lines of code. The code provides a good abstraction layer on top of the underlying workflow engine, and decouples other services from relying on knowledge of the WF runtime. We are now ready to move to the next layer.

The Order Service

Early in this article we defined an IOrderService interface. This interface provides the communication contract between our workflow and the host. We are now ready to provide a concrete implementation of this service. There are five events the interface forces us to implement, but we'll also add some additional methods that will make this service easy to use from ASP.NET. The OrderService, defined in wfOrderServices, is the only "workflow" API we will be using from our ASP.NET web form.
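Remember that the workflow's HandleExternalEvent activities can only find this service if the host registers it with an ExternalDataExchangeService. A hedged sketch of that registration follows – the workflowRuntime variable is a placeholder, and exactly where and how the sample performs this registration may differ.

// Sketch: plug the order service into the runtime's external data exchange service
// so HandleExternalEvent activities can locate IOrderService.
ExternalDataExchangeService dataExchange = new ExternalDataExchangeService();
workflowRuntime.AddService(dataExchange);
dataExchange.AddService(OrderService.Instance);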

We'll design our OrderService with methods that correspond to actions that can happen to an Order. We'll provide CreateOrder, ShipOrder, UpdateOrder, and CancelOrder methods. Below is our implementation for ShipOrder.

public bool ShipOrder(Order order)
{
    Check.ArgumentIsNotNull(order, "order");

    bool eventResult = RaiseEvent(OrderShipped, order, order.WorkflowId);
    if (eventResult == true)
    {
        WorkflowResults workflowResults =
            WorkflowMediator.Instance.RunWorkflow(order.WorkflowId);
        Check.IsNotNull(workflowResults, "Could not harvest workflow results");

        VerifyResults(workflowResults, WorkflowStatus.Completed);
    }
    return eventResult;
}

Most of the methods in our service follow a basic pattern. First, the method will raise an event that corresponds to the order action. In this case, we'll raise the OrderShipped event using a helper method we've defined called RaiseEvent – more on that method in a minute. This event should eventually reach our workflow. The method then uses the WorkflowMediator to run the workflow, and the ensuing WorkflowResults object is inspected to ensure the workflow is doing what we expect. For instance, after the OrderShipped event is processed by our workflow, we expect the workflow to stop execution (remember the workflow results contain the status of the workflow itself, not the state of our order, although the state of our order is also complete).

Notice the OrderService doesn't have to deal directly with the WorkflowRuntime or its associated events, but it does have some knowledge of how the workflow should proceed for an order. This is a comfortable abstraction, as we've separated concerns a bit. The WorkflowMediator concentrates on managing the WF runtime and its events, while the OrderService concentrates on managing the "big picture" workflow for an order. After each call into the WorkflowMediator, the OrderService verifies the state of the workflow using a VerifyResults method. This ensures the workflow didn't terminate with an exception. For ShipOrder we expect a completed workflow; for all of our other methods, we ensure the workflow is still running. We pass VerifyResults the actual results received and the expected status of the workflow (Running or Completed).

private void VerifyResults(WorkflowResults workflowResults, WorkflowStatus status)
{
    if (workflowResults.Status != status)
    {
        if (workflowResults.Exception != null)
        {
            throw workflowResults.Exception;
        }
        else
        {
            string expected = status.ToString();
            string actual = workflowResults.Status.ToString();
            throw new InvalidOperationException(
                String.Format("Workflow {0} expected status {1} actual status {2}",
                              workflowResults.InstanceId, expected, actual));
        }
    }
}

If an exception occurred during the workflow processing, we'll throw the exception. If the workflow isn't in the state we expected, we'll also throw an exception.

Now, let's return to the RaiseEvent helper method. This method is shown below.

private bool RaiseEvent(EventHandler<OrderEventArgs> ev,
                        Order order, Guid instanceId)
{
    bool result = true;
    try
    {
        if (ev != null)
        {
            OrderEventArgs e = new OrderEventArgs(instanceId, order);
            ev(null, e);
        }
    }
    catch (EventDeliveryFailedException)
    {
        result = false;
    }

    return result;
}

What jumps out immediately is that we are eating the EventDeliveryFailedException. This is a judgment call. Let me give you a common scenario for the EventDeliveryFailedException and you can decide for your own application the best approach to take.

Let's say I'm looking at Order #5 and know that the warehouse just processed order #5 and the order is ready to ship. I press the "Process Order" button in the UI to process the order. Unfortunately, my co-worker who is sitting in another room with the application open also knows that Order #5 is ready to ship. She clicked the button 5 seconds before I did and processed the order. When my event arrives, the state machine will throw an exception. As far as the state machine is concerned, it's not legal to "process" an order in the "processed" state. The exception is an EventDeliveryFailedException exception. In a web application with concurrent activity this isn't an entirely exceptional circumstance. In this sample I decided to return false, and let the UI layer ultimately deal with the failure to process (perhaps by just refreshing the screen to show a more recent order status). You'll have to decide on the best approach for your application in this scenario.

With all our services in place, we are finally ready to move to the presentation layer.

our little helpers

Finally, ASP.NET

For an article with "ASP.NET" in the title, we've sure done a great deal of work without seeing a single button click event handler. This is good news, actually, because we now have all the abstractions needed for an ASP.NET developer to use Windows Workflow with little grief. In fact, the ASP.NET developer doesn't even need to know that Windows Workflow is inside the application, silently persisting and tracking order workflows as they execute. The ASP.NET application is in the wfOrderWeb project in the accompanying download.

The following piece of code is the event handler for the "Create Order" button.

protected void createOrderButton_Click(object sender, EventArgs e)
{
    Order order = Order.CreateOrder(); // just creates an object instance - no DB activity
    order.Title = titleTextBox.Text;
    titleTextBox.Text = "";

    OrderService.Instance.CreateOrder(order);

    DataBindGridView();
}

The ASP.NET page tells the OrderService to create a new order, and then refreshes the GridView that appears on the screen. We've simplified life in the code-behind file tremendously, and the ASP.NET developer can concentrate on providing a good user experience.

Windows Workflow and ASP.NET In Harmony

One feature of our ASP.NET application is that the code will enable and disable buttons on the form. The available buttons are based on the available transitions in our workflow model. For instance, when an order is shipped, the bottom row of buttons will all be disabled. When an order is in the open state, the "Ship Order" button will be disabled. We can do this by calling into the OrderService and asking what transitions are available for any given order.

public OrderTransitions GetAvailableOrderTransitions(int orderId)
{
    Order order = GetOrderById(orderId);
    Check.IsNotNull(order, "Could not fetch order " + orderId);

    OrderTransitions result = OrderTransitions.None;
    ReadOnlyCollection<string> transitions =
        WorkflowMediator.Instance.GetStateMachineTransitions(
            order.WorkflowId);

    if (transitions.Count > 0)
    {
        result |= OrderTransitions.CanCancel;
    }

    foreach (string stateName in transitions)
    {
        if (stateName == "Open")
            result |= OrderTransitions.CanOpen;
        if (stateName == "Processed")
            result |= OrderTransitions.CanProcess;
        if (stateName == "Completed")
            result |= OrderTransitions.CanComplete;
    }

    return result;
}

The OrderService will ask the WorkflowMediator to find out the available transitions for an order, and we set the flags in an OrderTransitions enumeration to match these transitions. The WorkflowMediator queries the state machine using the StateMachineWorkflowInstance class, which you can read more about in my "State Machines In Windows Workflow" article.
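A sketch of how that mediator method might look follows – the body below is an assumption based on the StateMachineWorkflowInstance API, not necessarily the code in the download.

// Sketch: query the possible transitions for an idle, persisted state machine.
public ReadOnlyCollection<string> GetStateMachineTransitions(Guid instanceId)
{
    StateMachineWorkflowInstance stateMachine =
        new StateMachineWorkflowInstance(_workflowRuntime, instanceId);

    // Names of the states reachable from the workflow's current state
    return stateMachine.PossibleStateTransitions;
}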

Conclusions

Joining Windows Workflow and ASP.NET into a testable, flexible, maintainable application requires a bit of work, but the same could be said for almost any technology. What have we gained with WF? We've gained transparency, in the sense that any developer, or even business person, can look at our workflow model in the workflow designer and see how an order moves from Open to Completed. We have a UI that is driven by this model, and that will prevent users from accidentally shipping an order the warehouse has yet to process. Although we haven't talked about the WF tracking service, we can instantly record a history of execution for each order by simply configuring this service into the runtime. We have automatic support for our long-running order processing by virtue of using the WF SQL persistence service. There are also a host of WF features we haven't taken advantage of, like using a WF Policy activity inside our state machine to execute a set of declarative business rules to validate an order.

There are an infinite number of ways to use Windows Workflow Foundation in ASP.NET, and this sample looks at just one approach. Hopefully this article and accompanying download will give you some ideas and code to build upon. If you have questions or comments, you can reach me through my blog.

 

by K. Scott Allen