September 2007 - Posts

I was dealing with a "Failed to enable constraints" error.  I had just been added to the project, and I couldn't figure out how to determine exactly where the problem was.  Then I read this article:

http://weblogs.asp.net/rosherove/archive/2004/10/03/DataSet-hell-_2D00_-_2200_Failed-to-enable-constraints.-One-or-more-rows--contain-values_2E002E002E002E002200_.aspx

Following its approach, I created a custom class that generates a more detailed error message using the DataTable.GetErrors() and DataRow.GetColumnsInError() methods.  This was very handy for identifying the problem rows (I even added the primary key values to the message, which really helped me determine that there were duplicate keys and some null fields).
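Here's a minimal sketch of that kind of diagnostic helper; the class and method names below are mine, not the article's:

    using System;
    using System.Data;
    using System.Text;

    // A sketch of the diagnostic described above; names are my own.
    public static class ConstraintErrorReporter
    {
        public static string GetDetailedErrorMessage(DataSet dataSet)
        {
            StringBuilder message = new StringBuilder();

            foreach (DataTable table in dataSet.Tables)
            {
                // GetErrors() returns only the rows that failed validation.
                foreach (DataRow row in table.GetErrors())
                {
                    // Include the primary key values, which makes duplicate
                    // keys easy to spot.
                    StringBuilder key = new StringBuilder();
                    foreach (DataColumn keyColumn in table.PrimaryKey)
                    {
                        if (key.Length > 0) key.Append(", ");
                        key.AppendFormat("{0}={1}", keyColumn.ColumnName, row[keyColumn]);
                    }

                    message.AppendFormat("Table '{0}', key [{1}]: {2}",
                        table.TableName, key, row.RowError);
                    message.AppendLine();

                    // List each offending column with its specific error.
                    foreach (DataColumn column in row.GetColumnsInError())
                    {
                        message.AppendFormat("  Column '{0}': {1}",
                            column.ColumnName, row.GetColumnError(column));
                        message.AppendLine();
                    }
                }
            }

            return message.ToString();
        }
    }

To use it, set EnforceConstraints to false on the dataset, run the fill, and call the method when DataSet.HasErrors is true; the message pinpoints the offending rows and columns instead of the vague exception text.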

Posted by bmains | with no comments

I've been using a tool called TestMatrix, which has a lot of features that test-driven developers will love.  It features a test explorer: a tree view listing all of the unit tests in a test project.  Simply select one of the projects in the drop-down, and all of its unit tests load in the sidebar.  You don't need to run everything in the sidebar; you can filter the run by project, namespace, single file, or specific method.  The last two are available from a right-click menu option in the test explorer or, my personal favorite, from within the unit test itself.

So, say you just created a test and finished the code.  Right-click within the test and you can run just that single test; right-click anywhere else and you can run the whole test file.  When you do, everything runs in the background on threads (which works really well; even on my 512 MB machine it performs nicely), and the tests run right inside Visual Studio itself.  There's no need to run the project or open it up in NUnit.
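For reference, the tests it runs are ordinary NUnit-style fixtures; a trivial example like the one below (the Calculator class is made up) is what appears in the test explorer and can be run individually:

    using NUnit.Framework;

    // A made-up class under test, just to have something to assert on.
    public class Calculator
    {
        public int Add(int a, int b) { return a + b; }
    }

    [TestFixture]
    public class CalculatorTests
    {
        [Test]
        public void Add_ReturnsTheSum()
        {
            Assert.AreEqual(5, new Calculator().Add(2, 3));
        }
    }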

Upon running the tests, the results load in the test explorer and in a test results window at the bottom of the screen.  But, even better, the results also appear in the gray margin to the left of the line numbers, showing performance information (in seconds/milliseconds), code coverage (a little red bar marking code that was not hit), and other details.

This is a very beneficial tool that makes running unit tests very convenient.  I really enjoy working with it, and I thought I would mention it to others.

Posted by bmains | 2 comment(s)

When ASP.NET 2.0 came out, it featured table adapters, the new replacement for hand-wiring typed datasets to data adapters and connections.  The approach is cleaner, but it works in a very similar way to classic typed datasets.  Furthermore, table adapters have a nice designer interface for adding queries and seeing the structure and relations of the data.

But are they worth using as part of the architecture?  They are very handy: simply add a new query with some additional constraints, and you have a useful variation.  You can define as many queries as you like, and you can also create scalar queries and insert/update/delete operations.

However, one of the drawbacks is that every query has to support all of the fields defined in the initial setup, or there can be problems.  To circumvent that, you can set a field to allow nulls, so a query that doesn't return that field causes no trouble.  But queries can become complex enough that multiple table adapters, all representing the same overall data, end up being created.  For instance, one table adapter may be added to supply additional calculated fields, and another to allow an inner join with another table.
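Here is a contrived sketch of that situation using a plain DataTable (the table and column names are made up; a typed dataset behaves the same way under the covers):

    using System;
    using System.Data;

    class AllowNullsDemo
    {
        static void Main()
        {
            DataTable orders = new DataTable("Orders");
            orders.Columns.Add("OrderId", typeof(int));
            orders.Columns.Add("CustomerName", typeof(string));
            DataColumn total = orders.Columns.Add("Total", typeof(decimal));
            total.AllowDBNull = false; // mirrors the initial setup

            // A narrower query (say, SELECT OrderId, CustomerName ...) never
            // supplies Total, so filling fails. The same thing happens here
            // when a row is added without a Total value:
            // orders.Rows.Add(1, "Contoso");  // throws NoNullAllowedException

            // The workaround: relax the column so the missing value simply
            // comes back as DBNull.
            total.AllowDBNull = true;
            orders.Rows.Add(1, "Contoso");
            Console.WriteLine(orders.Rows[0]["Total"] == DBNull.Value); // True
        }
    }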

In addition, a single dataset file (the .xsd that holds the table adapter structures and queries) does not typically hold everything.  A project may contain multiple datasets, and because of that, a table may be duplicated across them.  Just because it's the same table doesn't mean the type is the same; rather, the dataset name becomes the namespace, so each table object lives in its own namespace, separate from the others.  Because some code may need to work with the same table through different namespaces, this needs to be handled in some way (one possibility is sketched below).
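One lightweight way to bridge the duplicated types is to copy values by column name; both generated row types ultimately derive from DataRow, so a helper like this (the name is mine) works no matter which dataset namespace each row comes from:

    using System.Data;

    public static class DataRowMapper
    {
        // Copies every column that exists (by name) on both rows. The
        // schemas are assumed to describe the same underlying table.
        public static void CopyMatchingColumns(DataRow source, DataRow target)
        {
            foreach (DataColumn column in source.Table.Columns)
            {
                if (target.Table.Columns.Contains(column.ColumnName))
                {
                    target[column.ColumnName] = source[column.ColumnName];
                }
            }
        }
    }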

Still, you can't beat the convenience they bring: right-click, enter the query and the where clause, click Finish, and you've got yourself a new query.  It's not quite as simple with hand-written data access code, though tools like CodeSmith do simplify the overall process.

In conclusion, typed datasets and table adapters both have their advantages, but they have to fit the architecture and application style, and the choice has to be evaluated against the project's scope.  They also make applications harder to unit test; object models are better in that respect.

Posted by bmains | with no comments