
Tablets, Ink, and Irony

Friday, June 11, 2004 by scott

There is a company selling automation solutions to the medical industry using the Tablet PC (via MedicalTabletPC.com). I imagine they are using the Ink API, even though the company's name is NoInk Communications. I can picture the help wanted ads now:

“Ink Developer Needed For NoInk Software”

In print, you might be able to noodle it out. Recruiters would need to be more careful in face-to-face conversation:

“I see you’ve completed a commercial application with Ink”

“Yes, I love developing for Tablet PCs with Ink. I think in Ink.”

“Would you be interested in a challenging project with NoInk?”

“No”

Though really, it's not such a bad name. I wouldn't put it on this 2002 list of the 50 worst company names, which included Molex Inc. and McData Corp.

Compound Keys: The Ally Of Evil

Thursday, June 10, 2004 by scott

I’m having a little fun with Brett’s post (Surrogate Keys … The Devil’s Spawn). The fact is, I like surrogate keys, but I’m picking a different battle. There is a vendor in the healthcare industry using compound keys in their data warehouse product. (The fact that they use surrogate identifiers to compose the compound key and other pleasantries puts the entire design squarely in the 5th circle of hell, but I’ll stick to the topic at hand).

While compound keys aren’t quite as bad as Satan’s spawn, there are issues to be aware of:

1) A bigger key means more work for the server: more bytes to store in every index and more bytes to compare on every join and lookup.

2) I’m looking at a design where the first column of the compound key has no selectivity: there is only one distinct value across 33 million rows. All of the selectivity is in the second column. Unfortunately, the statistics histogram SQL Server builds for the optimizer only covers the first column of a compound key/index, so queries with non-equality predicates (<, >, BETWEEN) on the other columns suffer from poor cardinality estimates.

3) You can reduce the number of joins Analysis Services performs when processing an OLAP cube by running the “Optimize Schema” tool, but it only works on dimensions where a single column joins the fact table to the dimension table. This is potentially the kind of optimization that turns a two-hour processing run into a two-minute one.

With the gauntlet cast, I hereby challenge readers to devise the following posts:

Primary Keys: The Fallen Angel

Foreign Keys: Agents Of Deceit

Healthcare IT Vendors: Taking Customers To Hell In A Handbasket

SharePoint Resource Kit

Thursday, June 10, 2004 by otcnews
The SharePoint Products and Technologies Resource Kit offers guidance and information to design, deploy, customize, and troubleshoot Microsoft Office SharePoint Portal Server 2003 and Microsoft Windows SharePoint Services.

DEVT Sessions From May

Thursday, June 10, 2004 by otcnews
Online seminars feature training presentations, tutorials, and demos on using Application Blocks, InfoPath, and Starter Kits.

HTTP Connection Limits

Wednesday, June 9, 2004 by scott

Gavin blogs about how to circumvent the two-connection limit in IE. RFC 2068, the HTTP 1.1 specification, is where the limit comes from: it says a client should not maintain more than two connections to any single server or proxy.

If you are calling web services in .NET, or using the WebRequest or WebClient classes, you'll also run into this connection limit, which can really throttle applications that reach an application server through web services. Fortunately, you can change the connection limit in a config file:

<configuration>
  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="2" />
      <add address="65.53.32.230" maxconnection="12" />
    </connectionManagement>
  </system.net>
</configuration>

For more information, see one of the following:

PRB: Contention, Poor Performance, and Deadlocks When You Make Web Service Requests from ASP.NET Applications

At Your Service: Performance Considerations for Making Web Service Calls from ASPX Pages

Duff’s Device

Tuesday, June 8, 2004 by scott

I don’t recall when I first heard of Tom Duff’s amazing device, but I’m sure it was from a USENET posting. Tom Duff invented his device while optimizing a program with loop unrolling. Loop unrolling duplicates the block of code inside a loop so the body executes several times per pass, reducing the number of conditional jumps and the testing and incrementing of the loop counter. If this sounds like a lot of work for little payoff, well, most of us leave it to the compilers these days.
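For context, here is a minimal sketch of conventional loop unrolling (my own illustrative code, not Duff’s): the copy is unrolled eight times, with a separate cleanup loop for the leftover elements. Like Duff’s version below, the destination pointer is never incremented because it stands in for a memory mapped output register.

void send_unrolled(short *to, short *from, int count)
{
    int n = count / 8;            // number of complete groups of 8
    while (n-- > 0)               // one test and decrement per 8 copies
    {
        *to = *from++;  *to = *from++;
        *to = *from++;  *to = *from++;
        *to = *from++;  *to = *from++;
        *to = *from++;  *to = *from++;
    }
    int rest = count % 8;         // leftover elements (0 to 7)
    while (rest-- > 0)
        *to = *from++;
}

Duff’s trick is that it folds the separate cleanup loop away entirely by using the switch to jump into the middle of the unrolled body.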

When I first saw the code for Duff’s device, I did a triple-take. I’m not sure if I then laughed, or cried, or just curled up into a little ball. The code was naughty in such a breathtaking way. Here is a version that compiles and works with the C++ compiler in the May CTP of VS 2005:

void send(short *to, short *from, int count)
{
    int n=(count+7)/8;
    switch(count%8){
      case 0: do{      *to = *from++;
      case 7:             *to = *from++;
      case 6:             *to = *from++;
      case 5:             *to = *from++;
      case 4:             *to = *from++;
      case 3:             *to = *from++;
      case 2:             *to = *from++;
      case 1:             *to = *from++;
              }while(--n>0);
    }
}

At first glance, the code appears to be a car wreck involving a switch statement and a do/while loop, as they seem smashed together in a way that makes you slow down to look. The code does work and copies an array of shorts to a memory location (in Duff’s case, a memory mapped IO register). Perhaps the code is fodder for The Daily WTF blog, except it has appeared in a Bjarne Stroustrup book.
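Here is a minimal sketch of calling it, assuming the send function above is in scope (the names are mine, and an ordinary variable stands in for the memory mapped register):

#include <cstdio>

int main()
{
    short buffer[11] = { 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11 };
    short pseudoRegister = 0;    // stand-in for a memory mapped IO register

    // count % 8 == 3, so the switch jumps to case 3 and copies three
    // values, then the do/while makes one more full pass of eight
    send(&pseudoRegister, buffer, 11);

    printf("%d\n", pseudoRegister);   // prints 11, the last value written
    return 0;
}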

If you are interested in seeing the original unveiling of the device, you can read the original posting here.

Reminisce

Monday, June 7, 2004 by scott

As a kid I remember spending one summer with a TI-99/4A hooked up to the kitchen TV. While I would be typing in code in front of the air conditioner, mom would be making lunch. Some days the code would work and we would eat tuna salad and potato chips – those were good days. Some days the code wouldn’t work and we’d eat tomato soup and grilled cheese sandwiches. Those were good days too.

I also remember how “the community” around the TI (and later I had an Atari ST) would wring every last drop of value out of the software and the hardware. Every few months someone discovered a little hardware quirk or invented a clever hack to make animations move faster or shave a few bytes off a routine.

No, I don’t want to go back to the days of trying to optimize one more machine instruction out of a loop. It’s just that I spent most of the afternoon catching up on flagged items from RSS feeds, which included demos of the Visual Studio 2005 Team System, a whitepaper on XML support in SQL 2005, code samples for programming Outlook with C#, and Channel 9 videos of XP SP2 features. (Scoble: I hope the whining about lighting, focus, and other amateurish video-making problems does not deter you in the least; the videos are great, keep the 'film' rolling.)

By the time I finished, I was just amazed. Look at all this new stuff we will have to work with. Let’s not even talk about what Longhorn will bring. How far will we get with this stuff? In the days of the TI-99 we never knew what was coming down the road. Necessity was the mother of invention, and we had to make do with what we had.

So what would happen if all the developers from Microsoft and all the engineers from Intel and AMD made the following announcement:

Dearest community,

Effective tomorrow we will be out of the office on a manned mission to Mars. See you in 5 years.

Best regards,

Just how far could we take this technology? How far could we push the limits? Where are we on the efficiency curve today with the current tools and current hardware? With all of the extensibility hooks in VS.NET and Office, I’m certain there is plenty of innovation we could pull off. It’s just that I now cannot imagine service-oriented architecture taking off without Whitehorse.