TRAP progress

by Damon 27. February 2005 06:00
Before moving on to some client-related work I did some more work on TRAP this weekend. It's actually going about as fast as I expected so far. As the design fleshes itself out, I can start to see some of the challenges involved in making this kind of tool. As is typically the case when getting involved in a project, some things turn out to be far easier than anticipated and some things are more difficult, or just annoying. For example, supporting multiple databases with the same engine will be easy, since that is all encapsulated behind the provider interfaces.

I see performance as the biggest challenge on this project. It's plain to me now that there's no reason why I can't support all the features I want to support and still have a robust tool. There are two performance issues that I need to address, not necessarily right now, but they need to stay in the back of my mind so that the design doesn't stand in the way of these changes later. First, the engine is heavily reflection-based right now. In a data-intensive application the performance hits incurred from inspecting classes at runtime could start to add up. The idea I have in mind to address this down the line is to borrow a cue from the core .NET framework and use Reflection.Emit to build some helper assemblies on the fly that implement the same functionality in non-reflective code. The second performance challenge is, of course, generating efficient SQL.
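To make that Reflection.Emit idea a little more concrete, here is a minimal sketch of emitting a helper on the fly. Nothing here is TRAP's real design; the IPropertyAccessor interface, the factory, and the emitted type names are assumptions I'm using just to show the technique:

using System;
using System.Reflection;
using System.Reflection.Emit;

// Hypothetical interface the emitted helper would implement.
public interface IPropertyAccessor
{
    object GetValue(object target);
}

public class AccessorFactory
{
    // Emits a small in-memory assembly containing a class whose GetValue method
    // calls the property getter directly, so no reflection happens at read time.
    public static IPropertyAccessor CreateAccessor(Type targetType, string propertyName)
    {
        PropertyInfo prop = targetType.GetProperty(propertyName);

        AssemblyName asmName = new AssemblyName();
        asmName.Name = "TrapAccessors";
        AssemblyBuilder asmBuilder = AppDomain.CurrentDomain.DefineDynamicAssembly(
            asmName, AssemblyBuilderAccess.Run);
        ModuleBuilder modBuilder = asmBuilder.DefineDynamicModule("TrapAccessorsModule");

        TypeBuilder typeBuilder = modBuilder.DefineType(
            targetType.Name + "_" + propertyName + "_Accessor",
            TypeAttributes.Public, typeof(object), new Type[] { typeof(IPropertyAccessor) });

        MethodBuilder method = typeBuilder.DefineMethod(
            "GetValue", MethodAttributes.Public | MethodAttributes.Virtual,
            typeof(object), new Type[] { typeof(object) });

        ILGenerator il = method.GetILGenerator();
        il.Emit(OpCodes.Ldarg_1);                        // load the target object
        il.Emit(OpCodes.Castclass, targetType);          // cast to the concrete type
        il.Emit(OpCodes.Callvirt, prop.GetGetMethod());  // call the real getter
        if (prop.PropertyType.IsValueType)
            il.Emit(OpCodes.Box, prop.PropertyType);     // box value types for the object return
        il.Emit(OpCodes.Ret);

        typeBuilder.DefineMethodOverride(method, typeof(IPropertyAccessor).GetMethod("GetValue"));

        return (IPropertyAccessor)Activator.CreateInstance(typeBuilder.CreateType());
    }
}

An accessor like this would be built once per mapped property and cached, so the reflection cost is paid a single time instead of on every row.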

When Type Manager has a one-to-many relationship with Type Employee, the tool has to decide how to load that. By turning on the maximum debug levels for OR tools I've used in the past, I often found that when loading relationships the engine would:
  1. Issue one statement to load the Manager data meeting the criteria.
  2. Then issue one statement per instance of Manager to read its list of employees.
Obviously this is much simpler than writing code that can do the table join to get all the information in one call. Writing code that can handle the Manager-to-Employee situation is fairly easy, but what about when Employee has related types, and those types have related types, and so forth? Writing an engine that can generate all of this may be complex, or simply not possible. The "one statement per parent" method and its performance hit were infamously known in the EJB CMP community as the "n+1" problem, meaning essentially it is an O(n) + 1 operation. I've been thinking quite a bit about the three relationship load options I plan to offer:
  • Lazy Load - related types are not loaded until the property is referenced. A proxy object is created that knows how to load its data once it is dereferenced. This would likely require some Reflection.Emit code to extend a Type on the fly with a proxy, if used for 1:1 relationships. For 1:n and m:n relationships, a proxy class that extends a built-in type is easy enough to create (see the sketch after this list).
  • Eager Load - The engine will populate the entire type tree with one call. In most cases the number of calls could be drastically reduced by using a single subselect for each level of relationships.
  • Semi-Eager-Threaded Load (I need a better name for this one) - execute the main call, with the "n+1" logic being handled by an asynchronous delegate, giving the user the illusion of faster performance. To keep using the same example, the main thread returns as soon as Managers are loaded, but a threadpool thread immediately begins the work of loading Employees, so the collection of Employees is likely already populated before it is referenced. This would still beat up the database pretty good.
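For the 1:n lazy case, the proxy collection really can be that simple. Here is a minimal sketch, assuming a delegate the engine would hand to the collection to fetch the related rows on demand (the names are mine, not TRAP's):

using System.Collections;

// Hypothetical callback the engine would supply to fetch the related rows.
public delegate IList RelatedLoader();

// A lazy 1:n collection: extends a built-in collection type and only invokes
// the loader the first time the data is actually touched.
public class LazyList : ArrayList
{
    private RelatedLoader loader;
    private bool loaded;

    public LazyList(RelatedLoader loader)
    {
        this.loader = loader;
    }

    private void EnsureLoaded()
    {
        if (!loaded)
        {
            loaded = true;
            AddRange(loader());  // the extra statement is issued only on demand
        }
    }

    public override int Count
    {
        get { EnsureLoaded(); return base.Count; }
    }

    public override object this[int index]
    {
        get { EnsureLoaded(); return base[index]; }
        set { EnsureLoaded(); base[index] = value; }
    }

    public override IEnumerator GetEnumerator()
    {
        EnsureLoaded();
        return base.GetEnumerator();
    }
}

The engine would hand each Manager a LazyList whose loader knows how to fetch that Manager's Employees, so no Employee statement runs unless the collection is actually used.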

I have quite a bit of the design and 2,000 lines of code done for TRAP right now. The next step is to finish the Types assembly design, specifically to determine exactly how I want the mapping schema to look. The Core assembly and the Types assembly comprise the interface that projects using TRAP will program against. At this point I can share a little bit of what that looks like right now:
So, essentially to find some instances of a type you would get a UnitOfWork, which represents both a connection to a data store and a transaction. You would pass a Criteria object and a System.Type to find instances of objects, and then possibly update these objects via the same unit of work.
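As a very rough usage sketch, with the caveat that every name below is a placeholder; only the UnitOfWork, Criteria, and System.Type concepts described above are real parts of the design:

// Hypothetical method and type names throughout this snippet.
UnitOfWork work = UnitOfWork.Create("MyProject");    // connection + transaction

Criteria criteria = new Criteria();
criteria.Add("LastName", "Payne");                    // assumed criteria syntax

IList managers = work.Find(typeof(Manager), criteria);

Manager boss = (Manager)managers[0];
boss.Department = "Engineering";

work.Update(boss);   // persist changes through the same unit of work
work.Commit();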

I'm also showing a preview of the main mapper form from the UI. Hopefully after looking at this you will think, "Oh yeah, a designer like that would make me very likely to use this tool to get an application up and running quickly." The essential idea is to use reflection to display your types, use the connection information from your Project to display the database schema, and display the Mapping Types in a property grid when an item on either side is selected. Everything I show from the UI is working code, soup to nuts, so this is likely to be in an "Alpha" stage in another week or two.
I have two websites to finish, but next up I will show the design for the Types assembly. After that come Provider and Engine; these two are by far the most complex and the most closely associated. Stay tuned.


PInvoke keybd_event

by Damon 25. February 2005 06:00
Today's tip for the Compact Framework is a simple one. Suppose you are working on a PPC device that only has two or three buttons, most likely the built-in "calendar" and "task" type buttons, and your users would rather use the screen/stylus as little as possible. You still have to build a user interface with tabbing, space bar, enter, etc. For example, suppose you use MessageBox to indicate an error; with no space bar or enter key, you can catch a hardware button press by using OpenNetCF's windows message filter. Then you can PInvoke keybd_event to send whatever other key you want to your application, for example the enter key (key code 13) to hit "OK" on your message box.

// keybd_event lives in coredll.dll on Windows CE / Pocket PC devices.
[DllImport("coredll.dll")]
public static extern void keybd_event(byte vKey, byte bScan, uint dwFlags, uint dwExtraInfo);
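As a quick usage sketch (the constant names are mine; the 0x0002 value is the Win32 KEYEVENTF_KEYUP flag), pressing and releasing Enter looks like this:

private const byte VK_RETURN = 0x0D;         // virtual key code 13, the Enter key
private const uint KEYEVENTF_KEYUP = 0x0002; // key-up flag for keybd_event

// Simulate pressing and releasing Enter, e.g. to dismiss a MessageBox.
public static void SendEnterKey()
{
    keybd_event(VK_RETURN, 0, 0, 0);                // key down
    keybd_event(VK_RETURN, 0, KEYEVENTF_KEYUP, 0);  // key up
}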

That's all for today. Hopefully my other furnace doesn't blow up this weekend and I get some work done on TRAP.


Will code for food

by Damon 21. February 2005 06:00
I typically avoid personal things here and leave them for my other site. However, just to reach a wider audience: this weekend my furnace stopped working. The timing couldn't be worse, either. (Not that there's ever a good time.)

So, just letting the general population know that I am available for work on the side. I am pretty much always working on some small system, but given the $$ I just spent keeping my house warm I will probably take on another project right now. You can establish contact here for starters.


Microsoft Tech Ed 2005

by Damon 18. February 2005 06:00
Well, I got verbal confirmation from my new employer that I'll be attending Microsoft Tech Ed this summer. This year's event takes place in Orlando, Florida. Nice location for a geek vacation.

This is kind of a big deal to me because throughout my career things like training and conferences haven't really worked out. The only trip I've taken on someone else's dime was a two-day Scrum Master certification in Denver (which was cool, by the way). Going to a conference is, I've heard, a rejuvenating experience. If you're like me and typically spend more time implementing stuff than reading about up-and-coming new tools, this seems like a fantastic way to shuffle off the burnout and get excited about life in the world of software all over again.

I believe my family is going to come with me for at least part of the week; I think we'll be taking the munchkin to see The Ocean and SeaWorld and such the weekend before. As far as the Tech Ed content, I plan to be primarily in the Architecture and Mobile tracks, assuming I can weave a schedule back and forth between those two.

If any other Wisconsin folks are going and need someone to hang out with, contact me.


OR Mapping

by Damon 16. February 2005 06:00
I decided this week to scratch an itch I've had on and off for about 5 years. I was writing some data access code for a small web project I'm doing for a bank and started to get the Object/Relational mapper itch again. I have some history with the OR mapper itch; it replaced the Object Database itch when I realized no one was going to bite on object databases in my lifetime. The first OR tool I ever used was TOPLink, and in my opinion it remains the best by far. I've used many OR tools for many languages since then, and all of them have fallen short of solving some part of my data access picture on the applications I've tried them on. To be fair, many of those shortcomings were due to horrible database design.

At any rate, even in this great day and age I still find myself writing very similar data access code over and over again for web and Windows Forms projects. I could, if I had no self respect, just throw SqlDataAdapters all over the place, but I usually insist on writing three-tier apps with custom types as the means of exchanging data between tiers and layers.

My itch, then, has been to write an OR mapper from the ground up. I want to do this primarily so I'll have one that fits my needs, but also just for the experience of designing and building it. I typically work on several things at once, and have some software products of my own to be developed this year, so the tool will have to justify the time it takes to write very quickly. I have some goals for writing the tool that will hopefully make it very interesting to other people:
  1. Support for stored procedures: the MSFT world is very stored-proc crazy and very much against dynamic SQL, which every OR mapper I know of requires. Just having something that can execute a stored proc and map the results back to my application's types would be a big time saver (see the sketch after this list).
  2. Type-to-type mapping: as Philippe DeMilde pointed out, one drawback of doing Service Oriented Architecture the right way is that you end up mapping Message types to the data structures your application uses. This is tedious and prone to human error. Adding support in this engine to map from instances of one Type to another could be a time saver in many applications.
  3. A nice user interface: This is one thing TOPLink had that I have yet to see anywhere else. The UI should make it so that you configure the tool visually and easily: Connect to a data source, load a Types assembly, highlight a Type/property/table/column/storedproc and display its mapping configuration in a PropertyGrid.
  4. Support Lazy and Eager loading of related Types and Relationships: Lazy loading is really the only way to go, so I may force lazy loading for Collections.
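To give a feel for goal number 1, here is a rough sketch of running a stored procedure and mapping each row onto a custom type by matching column names to property names via reflection. The class, method, and parameter names are illustrative only; TRAP's real API will look different:

using System;
using System.Collections;
using System.Data;
using System.Data.SqlClient;
using System.Reflection;

public class ProcMapper
{
    // Execute a stored procedure and return one instance of targetType per row.
    public static IList MapStoredProc(string connectionString, string procName, Type targetType)
    {
        IList results = new ArrayList();

        SqlConnection connection = new SqlConnection(connectionString);
        SqlCommand command = new SqlCommand(procName, connection);
        command.CommandType = CommandType.StoredProcedure;

        connection.Open();
        try
        {
            SqlDataReader reader = command.ExecuteReader();
            while (reader.Read())
            {
                object instance = Activator.CreateInstance(targetType);
                for (int i = 0; i < reader.FieldCount; i++)
                {
                    // Match each column to a property with the same name.
                    PropertyInfo prop = targetType.GetProperty(reader.GetName(i));
                    if (prop != null && !reader.IsDBNull(i))
                    {
                        prop.SetValue(instance, reader.GetValue(i), null);
                    }
                }
                results.Add(instance);
            }
            reader.Close();
        }
        finally
        {
            connection.Close();
        }
        return results;
    }
}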
As I flesh the design out, I'm sure many more things will make themselves known. After giving up on thinking of a clever name for this project I decided to just call it TRAP, meaning "Types, Relationships, and Persistence". After this epiphany (which occurred Monday afternoon) I went wild and did a component diagram and coded a surprisingly good start to the user interface. The road to completing TRAP will probably be the subject of many of my blog posts for a while.

Behold, a component diagram

It's not much right now, but there is mucho more on the way. The responsibilities for each component should be straightforward.

Types will just represent the mapping structure and data source information and have no intelligence. The Core assembly serves as the interface between consumers of the tool and the internals of the tool. The Engine does the actual work of managing transactions and caching and such. The Provider assembly will implement a couple of default supported persistence targets, like Type-to-SQLServer, Type-to-OleDB and maybe Type-to-Access and Type-to-Type. The idea is that anything that can vary (like SQL dialect or connection string format) will be encapsulated in the provider classes. The Provider, then, is really a plug-in to the engine. I will probably make an identity map/caching mechanism part of the provider model as well.
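To make the plug-in idea a bit more concrete, a provider contract might look something like the sketch below. These member names are hypothetical; the real Provider interfaces are exactly what I still need to design:

using System.Data;

public interface IPersistenceProvider
{
    // Anything that can vary per persistence target lives behind this interface.
    IDbConnection CreateConnection(string connectionString);

    // Each provider knows its own SQL dialect (or how to avoid SQL entirely).
    string BuildSelect(string tableName, string[] columns, string whereClause);

    // e.g. "@" for SQL Server, "?" for OleDb.
    string ParameterPrefix { get; }
}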

I'm pretty eager to scratch this itch, especially since I view it as part of the bigger goal of offering two products for sale by the end of the year. I imagine by the end of this weekend (depending greatly on my other commitments) I will have a simple "select *" test working round trip: designing it in the UI and running the OR mapper from unit tests. That reminds me: I should add a Tests assembly out there and create a test database with all of the items I want to support. Stay tuned.


Product keys

by Damon 13. February 2005 06:00
I have had the pleasure of working on a couple of smaller shrink-wrapped products, and I know many people interested in the shrink-wrapped product business. One of the small challenges involved in selling people software is managing your licensing. Back in the 90s almost all consumer software had a CD key, but it was a joke since all you had to do was dial in to your favorite BBS or call up a friend and ask for their key. Piracy laws existed, but if you were willing to ignore them the key itself was no obstacle.


In today's connected world there are some more complicated schemes. I thought Windows XP product activation was the ultimate protection: it was actually tied to your specific hardware.

So, with that in mind, I set out to try to create something similar in managed code. I wanted to support the following features:
  • Ability to tie an installation to specific hardware
  • Ability to automate the registration process
  • Mitigate any security risks that could arise from reverse engineering the code

Step 1, tying an installation to a piece of hardware, was easy enough. Using the System.Management namespace and some cryptic WMI object queries, you can obtain all kinds of hardware information. There are ways to PInvoke for even more information, but this was good enough to start. Once some hardware information is obtained, a trap-door (one-way) hash algorithm is applied to obfuscate the specific data used. We'll call this part a hardware key.
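Here is a sketch of that first step. The specific WMI properties queried below are just examples; a real scheme would pick a handful of values that are stable across reboots and unlikely to all change at once:

using System;
using System.Management;
using System.Security.Cryptography;
using System.Text;

public class HardwareKey
{
    // Gather a few hardware identifiers via WMI and run them through a one-way hash.
    public static string Create()
    {
        StringBuilder raw = new StringBuilder();

        ManagementObjectSearcher searcher =
            new ManagementObjectSearcher("SELECT ProcessorId FROM Win32_Processor");
        foreach (ManagementObject mo in searcher.Get())
        {
            raw.Append(mo["ProcessorId"]);
        }

        searcher = new ManagementObjectSearcher("SELECT SerialNumber FROM Win32_BIOS");
        foreach (ManagementObject mo in searcher.Get())
        {
            raw.Append(mo["SerialNumber"]);
        }

        // The trap-door part: hash the raw values so the original data is not exposed.
        SHA1 sha = SHA1.Create();
        byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(raw.ToString()));
        return Convert.ToBase64String(hash);
    }
}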



Step 2, automate the process. Assume now that some kind of product key has been given to a client and the hardware key has been generated by the software. An online system could easily be created to associate a hardware key with a product key, and for some that might be enough. However, the software should ideally be able to determine whether it should allow itself to run. Even more ideally, the software should be able to make this determination without connecting to an online system every time it starts up. By introducing some more complexity this can be accomplished.

We could create RSA key data, or a complete digital certificate, that represents the company licensing the software for use. Thanks to the properties of an RSA public/private key pair, the company could safely keep the RSA private key data and use it to sign a "license file". If the license file contains the hardware key, the RSA public parameters, and a signature created from the hardware key, the client can use this license file to determine whether it's OK to run or not. Consider this scenario (a verification sketch follows the list):
  1. New installation creates a hardware key
  2. New installation sends its product key and hardware key to a web service hosted by the product company
  3. The web service stores the product key and hardware key, in order to deny permission to other hardware trying to run with this key
  4. The web service creates a private key signature of the hardware key + product key, and sends this information back to the client along with the RSA public key data
  5. The client, whenever it attempts to run, can re-calculate the hardware key, and use the RSA data to verify the signature and allow itself to run
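Step 5 is the interesting part on the client, so here is a sketch of it. The method shape is mine; the only real requirement is that the client holds nothing but the RSA public parameters:

using System.Security.Cryptography;
using System.Text;

public class LicenseCheck
{
    // Recompute the hardware key at startup, then verify the server's signature
    // over (hardware key + product key) using only the RSA public parameters.
    public static bool IsLicenseValid(string hardwareKey, string productKey,
                                      string publicKeyXml, byte[] signature)
    {
        byte[] data = Encoding.UTF8.GetBytes(hardwareKey + productKey);

        RSACryptoServiceProvider rsa = new RSACryptoServiceProvider();
        rsa.FromXmlString(publicKeyXml);   // public key only; the private key never leaves the server

        return rsa.VerifyData(data, new SHA1CryptoServiceProvider(), signature);
    }
}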


Step 3: code obfuscation is the best way to mitigate the risk of this strategy being compromised. However, thanks to the digital signature piece, knowledge of how the system works does not by itself break the whole system, and that is the mark of good security. Contact me for code if you are interested. Maybe I should just post the code? I suppose I will refactor the system to work within the .NET LicenseManager API, which I found after I got this working the first time.


Slacking

by Damon 11. February 2005 06:00
Well, so much for my goal of trying to blog once per week.

With trying to wrap everything up with my last job, starting this new job, and having a grand total of four entities asking for my time right now, I'm somewhat behind on my contributions to the .NET community. (That would be this blog)

I promise this weekend I'll get to at least one of the things I've either promised or thought about posting:
  • SSL-like key exchange algorithm in managed code
  • Creating a verifiable CD-key scheme using managed code and crypto.
  • Creating Designer surfaces in .NET
  • WYSIWYG printing for the compact framework
  • Litmus test rants for enums, services, reference data and more
  • Creating RSS using XSLT and custom types
All in all, I have a lot on my mind and posting about it helps organize those thoughts. I suppose if anyone out there wants to see one of these items before the others, send me an email and I'll do that one first.

Post Script: My "IT Slavery" article wasn't meant as an attack on anyone except the specific individuals I had those conversations with. You know who you are. I really didn't mean to discourage anyone from the rewarding field of consulting, either. I actually got an email from a student saying:
  • I am a student studying C# and I found your blog through google. I don't think I want to be a consultant now.
Obviously, despite the issues, consulting is still the life for me; just be aware of what you're getting yourself into, kids.


About the author

Damon Payne is a Microsoft MVP specializing in Smart Client solution architecture.
