+N Consulting, Inc.

Code better – measure your code with NDepend

If you ask the average developer what might be done to improve code, they would probably come up with “use design patterns” or “do code reviews” or even “write unit tests”. While all of these are valid and useful, it is rare to hear “measure it”. It’s odd, when you think about it, because most of us consider ourselves scientists of sorts. Some of us obtained a degree in computer science, and we view the coding practice as a deterministic endeavor. Why is it, then, that we don’t measure our work using standard methodologies, objective tools, and evidence?

For one, some of us are blissfully unaware that such methods exist. Indeed, the science of measuring code quality has lived in university halls more than it has been practiced in the “real” world. Six Sigma and CMMI are probably the most familiar endeavors prescribing some sort of measure/improve cycle into the coding practice, but both say precious little about measuring code itself. Rather, they focus on the results of the software endeavor, not on the “internal quality” of the code.

Another reason for the low adoption of code quality measurement is a lack of tools. We have a wealth of guidance instruments, but fewer that focus on code quality itself. For example, FxCop and the addition of Code Analysis to VSTS have contributed hugely to code reviewing and to uniform coding across teams. But let’s face it: with so much guidance, it’s all too easy to either dismiss the whole process as “too picky” or to focus on one aspect of coding style rather than on the underlying runtime binary. Which is to say, it is quite possible that what would be considered “good style” may not yield a good runtime, and vice versa.

For a professional tool which enables you to view, understand, explore, analyze and improve your code, look no further than NDepend (www.ndepend.com). The tool is extensive and robust, and has matured in its presentation, exploration and integration capabilities, becoming a great value for those of us interested in digging deeper than the “my code seems to work” crowd.

The installation is fairly straightforward: you pretty much unpack the download and place your license file in the installation directory. Upon running the tool, you can choose to install integration into VS2005, VS2008 and Reflector (now a Red Gate property, by the way).

Before using the tool for the first time, you can watch a few basic screencasts available from NDepend. The videos have no narration, so I found myself using the pause button when the text balloons flew by a bit quickly, but that’s no big deal in a 3-5 minute video. Once you get comfortable with the basics, you can almost immediately reap the benefits. Through a very relevant set of canned queries and screens you can quickly get a feel for how your code measures up. A graphic “size-gram” presents methods, types, classes, namespaces or assemblies in varying sizes according to measures like lines of code (LOC - either the source itself or the resultant IL), cyclomatic complexity, and other very useful views of code cohesiveness and complexity. This visual lets you quickly identify or drill into the “biggest offender”.

Once you choose a target for exploration, the view in the assembly-method tree, the graphic size-gram and the dependency matrix all work in tandem: you choose an element in one, and the focal point shifts or drills down in the other two. There is also a pane, acting much like a context menu, which displays the metric numbers for the selected method, field, assembly and so on. This lets you get a summary very quickly at any point in your exploration.

When you use the dependency matrix, methods or types and their dependents are easily correlated. One measure of code quality is how tightly different types are coupled to, or dependent on, each other. The theory is that if a dependency tree is too deep or too vast, a change in one type will ripple through a lot of code, whereas a shallow or narrow dependency tree is less dramatically affected by change. So it’s a great thing to have a measure of the dependency relationships among your classes and assemblies. This measure tends to affect code most in the maintenance phase, but it is just as useful during initial prototype/refactor cycles before release.

Another great feature is the dependency graph, which produces a visual map of dependencies among the analyzed assemblies. I have found it very useful when “cold reading” legacy code I was charged with maintaining. Using the visualization, I could determine what’s going on and understand how pieces of code work together much more quickly than by painstakingly following the code with bookmarks and a debugger.

As for the metrics themselves, you will probably set your own policy regarding the measures and their relevance. For one, the numbers are great as a relative comparison of various pieces of code. You may find that some dependencies are “very deep” - which in theory is “bad” - but that the indication points to a well-designed base class which serves as the base for everything. For an extreme example, most of us would agree that the “deep dependency” on System.String is well justified and doesn’t merit change. It is important to understand and digest the metrics in context, and draw the appropriate conclusions.

The tool is built on an underlying query technology called CQL. Once a project is analyzed, the database of findings is exposed through built-in queries. These queries can be modified, and new queries can be built to correlate the factors important to you. Quite honestly, I have not yet needed any customization; the existing presentations are very rich and useful out of the box. One instance where you might want to produce custom queries would be to exclude known “violations” by adding a where clause, thereby preventing code you already analyzed and mitigated from appearing and skewing the view of the rest of your code.

In summary, I found NDepend very useful in examining both legacy and new code. It gave me insights beyond empirical, style-oriented rules. It is much more informative to me to have a complexity measure or an IL-LOC count than a rule like “methods should not span more than two screens-full”. Microsoft does include code metrics in VS2010, and code analysis in VSTS and the testing editions. If those are not within your budget, you can have NDepend today and gain valuable insight right away. I would advise taking it slow in the beginning, because there is a slight learning curve to the tool’s usage and navigation, and ascribing the relevant weight to the findings takes time. But once you get the hang of it, it becomes indispensable.

Code generator for Visual Studio - Denum

Announcing a newly added CodePlex project: the “Denum” code generator.

Denum is a class / pattern for representing fairly static metadata from a database as an in-memory structure.

The structure behaves much like an Enum, but contains a static member for each data member, so that compile-time type checking helps the transparency and coherency of your application logic, and keeps the build itself honest against the database version.
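
To make the idea concrete, here is a minimal hand-written sketch of the shape such a class takes - the OrderStatus table and its values are hypothetical, for illustration only, not the generator’s actual output:

// Hypothetical metadata table rendered as a Denum-style class: each row
// becomes a typed static member, so references are checked at compile time.
public sealed class OrderStatus
{
    public static readonly OrderStatus Pending = new OrderStatus(1, "Pending");
    public static readonly OrderStatus Shipped = new OrderStatus(2, "Shipped");

    public int Id { get; private set; }
    public string Name { get; private set; }

    private OrderStatus(int id, string name)
    {
        Id = id;
        Name = name;
    }
}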

WCF Serialization – new in .NET framework 3.5 sp1

You may be surprised to find that classes are serialized by WCF without any [DataContract] attribute attached to them. I certainly was!

When WCF came out, there was much fanfare about the new, improved, superior WCF serializer (justified, IMHO). The main policy sizzle was that unlike [Serializable], which marks an entire POCO object and relies on the [NonSerialized] attribute to exclude specific fields (opt-out), WCF would use “opt-in”: only properties specifically decorated would be serialized.

That serialization policy was clearly expressed, and yet the side effect of it all is that I suddenly had a bunch of projects failing some unit tests. Upon digging, I found that SP1 introduced a new kink: if you don’t decorate your class at all (that is, omit the [DataContract] attribute), the object becomes “fully” serializable by WCF, and all public properties are automatically included.

This may seem like a huge step back to those who relied on the opt-in behavior to hide certain members from the serializer. It may also be a huge sigh of relief to those who cringed at the effort of (re)decorating all their message and value objects for WCF.

Note that with SP1, if you do decorate a class with [DataContract], the old rules apply - only properties marked with [DataMember] will be serialized. So to be selective, you can still decorate the class with [DataContract] and then decorate only the properties you want.
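
A minimal sketch of the two behaviors side by side (the types are mine, for illustration, not from any real project):

using System.Runtime.Serialization;

// Not decorated: under .NET 3.5 SP1, WCF serializes all public properties.
public class Undecorated
{
    public string Name { get; set; }
    public string Secret { get; set; }  // serialized too - beware!
}

// Decorated: the original opt-in rules still apply.
[DataContract]
public class Decorated
{
    [DataMember]
    public string Name { get; set; }    // serialized

    public string Secret { get; set; }  // no [DataMember] - not serialized
}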

I don’t know what led to this exact decision, and the syntax nuance definitely walks a fine line. One could argue that classes without decoration now serialize without effort, while classes marked specifically for WCF still behave as previously advertised.

All in all: two hours of head scratching in disbelief, one hour to change some unit test expectations - not too awful. Long live the refactoring tools in Visual Studio!

Unit testing value tests - Automate repeated tests

It is generally considered a good thing to use unit tests these days. Often it is necessary to test a method which takes some complex type, so in the unit test one has to painstakingly manufacture such an object and pass it in.

Before doing so, you would (should!) ensure the complex type itself produces an identity - that is to say, if you create an instance of type MyClass and assign or construct it with proper values, you should “get back” what you gave it. This is especially true for objects that get serialized and de-serialized.

What I often do is use some helper code.

The first snippet allows for testing an object’s serialization using WCF, ensuring that “round trip” serialization and de-serialization works.
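
A minimal sketch of such a round-trip helper, assuming DataContractSerializer over a MemoryStream (my reconstruction; the original snippet is not reproduced in this post):

using System.IO;
using System.Runtime.Serialization;

public static class WcfRoundTrip
{
    // Serializes the instance the way WCF would, then deserializes it back,
    // so the result can be compared to the original for value identity.
    public static T Clone<T>(T instance)
    {
        DataContractSerializer serializer = new DataContractSerializer(typeof(T));
        using (MemoryStream stream = new MemoryStream())
        {
            serializer.WriteObject(stream, instance);
            stream.Position = 0; // rewind before reading back
            return (T)serializer.ReadObject(stream);
        }
    }
}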

The second snippet uses reflection to ensure that the object you put through the mill came back with values identical to the ones initially assigned. This saves a LOT of Assert.AreEqual(expected.PropA, actual.PropA) and so on.

Since the object is actually a reference type, other equality checks (such as ReferenceEquals and the like) would not do at the root level.

Structs or nested structs are handled via the ensureFieldsMatch() method. Note that complex types may not be handled correctly - generics have not been addressed specifically here.

Future enhancements may include passing in an exclusion list of properties to skip, or an inclusion list of properties to match exclusively. I’m on the fence about these, because the whole idea was to say “object A matches B if every property and public field matches in value”; if one has to explicitly provide all the property names, one could just as well Assert.AreEqual(a.x, b.x) them.
Updated 2008-11-07: Error in comparison fixed. (Thank you Rich for pointing it out!)

using System;
using System.Linq; // needed for the Contains() extension method on arrays
using System.Reflection;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace Nuri.Test.Helpers
{
    public static class Equality
    {
        /// <summary>
        /// Some properties are instance specific, and can be excluded for value matching (unlike ref equivalence)
        /// </summary>
        private static readonly string[] _ReservedProperties = { "SyncRoot" };

        public static void EnsureMatchByProperties(this object expected, object actual)
        {
            ensureNotNull(expected, actual);
            Type expectedType = expected.GetType();
            Type actualType = actual.GetType();
            Assert.AreEqual(expectedType, actualType);

            if (expectedType.IsArray)
            {
                Array expectedArray = expected as System.Array;
                Array actualArray = actual as System.Array;
                Console.WriteLine(">>>*** digging into array " + expectedType.Name);
                for (int i = 0; i < expectedArray.Length; i++)
                {
                    Console.WriteLine(" --- --- ---");
                    EnsureMatchByProperties(expectedArray.GetValue(i), actualArray.GetValue(i));
                }
                Console.WriteLine("<<<*** digging out from array " + expectedType.Name);
            }
            else
            {
                ensurePropertiesMatch(expected, actual, expectedType, actualType);
            }
        }

        public static void EnsureMatchByFields(this object expected, object actual, params string[] exclusionList)
        {
            ensureNotNull(expected, actual);
            Type expectedType = expected.GetType();
            Type actualType = actual.GetType();
            Assert.AreEqual(expectedType, actualType);

            if (expectedType.IsArray)
            {
                Array expectedArray = expected as System.Array;
                Array actualArray = actual as System.Array;
                Console.WriteLine(">>>*** digging into array " + expectedType.Name);
                for (int i = 0; i < expectedArray.Length; i++)
                {
                    Console.WriteLine(" --- --- ---");
                    expectedArray.GetValue(i).EnsureMatchByFields(actualArray.GetValue(i)); // recursion
                }
                Console.WriteLine("<<<*** digging out from array " + expectedType.Name);
            }
            else
            {
                ensureFieldsMatch(expected, actual, exclusionList);
            }
        }

        private static void ensurePropertiesMatch(object expected, object actual, Type expectedType, Type actualType)
        {
            BindingFlags propertyExtractionOptions = BindingFlags.Public
                                                     | BindingFlags.NonPublic
                                                     | BindingFlags.Instance
                                                     | BindingFlags.Static
                                                     | BindingFlags.GetProperty;
            foreach (PropertyInfo expectedProp in expectedType.GetProperties())
            {
                if (expectedProp.CanRead && !_ReservedProperties.Contains(expectedProp.Name))
                {
                    if (expectedProp.PropertyType.IsValueType || expectedProp.PropertyType == typeof(String))
                    {
                        object expectedValue = expectedType.InvokeMember(expectedProp.Name,
                                                                         propertyExtractionOptions,
                                                                         null, expected, null);
                        object actualValue = actualType.InvokeMember(expectedProp.Name,
                                                                     propertyExtractionOptions,
                                                                     null, actual, null);
                        if (expectedValue == null && actualValue == null)
                        {
                            // both null - ok
                            Console.WriteLine("{0}: null == null", expectedProp.Name);
                            continue;
                        }
                        if (expectedValue == null || actualValue == null)
                        {
                            // one null the other not. Failure
                            Assert.Fail(expectedProp.Name + ": Expected Or Actual is null! (but not both)");
                            break;
                        }
                        Console.Write("{0}: {1} == {2} ?", expectedProp.Name, expectedValue.ToString(),
                                      actualValue.ToString());
                        Assert.AreEqual(expectedValue, actualValue,
                                        "Value of property doesn't match in " + expectedProp.Name);
                        Console.WriteLine(" true.");
                    }
                    else if (expectedProp.PropertyType.IsClass)
                    {
                        object expectedObject = expectedType.InvokeMember(expectedProp.Name,
                                                                          propertyExtractionOptions,
                                                                          null, expected, null);
                        object actualObject = actualType.InvokeMember(expectedProp.Name,
                                                                      propertyExtractionOptions,
                                                                      null, actual, null);
                        if (expectedObject != null && actualObject != null)
                        {
                            Console.WriteLine(">>>>>>>> digging into " + expectedProp.Name);
                            EnsureMatchByProperties(expectedObject, actualObject);
                            Console.WriteLine("<<<<<<<< back from dig of " + expectedProp.Name);
                        }
                    }
                }
            }
        }

        private static void ensureFieldsMatch(object expected, object actual, params string[] exclusionList)
        {
            Type expectedType = expected.GetType();
            Type actualType = actual.GetType();
            BindingFlags fieldExtractionOptions = BindingFlags.GetField |
                                                  BindingFlags.NonPublic |
                                                  BindingFlags.Public |
                                                  BindingFlags.Instance;
            foreach (FieldInfo expectedField in expectedType.GetFields(fieldExtractionOptions))
            {
                if (!exclusionList.Contains(expectedField.Name))
                {
                    if (expectedField.FieldType.IsValueType || expectedField.FieldType == typeof(String))
                    {
                        object expectedValue = expectedType.InvokeMember(expectedField.Name,
                                                                         fieldExtractionOptions,
                                                                         null, expected, null);
                        object actualValue = actualType.InvokeMember(expectedField.Name,
                                                                     fieldExtractionOptions,
                                                                     null, actual, null);
                        if (expectedValue == null && actualValue == null) // fixed: compare actualValue, not actual
                        {
                            // both null - ok
                            Console.WriteLine("{0}: null == null", expectedField.Name);
                            continue;
                        }
                        if (expectedValue == null || actualValue == null)
                        {
                            // one null the other not. Failure
                            Assert.Fail(expectedField.Name + ": Expected Or Actual is null! (but not both)");
                            break;
                        }
                        Console.Write("{0}: {1} == {2} ?", expectedField.Name, expectedValue.ToString(), actualValue.ToString());
                        Assert.AreEqual(expectedValue, actualValue, "Value of field doesn't match in " + expectedField.Name);
                        Console.WriteLine(" true.");
                    }
                    else if (expectedField.FieldType.IsClass)
                    {
                        object expectedObject = expectedType.InvokeMember(expectedField.Name,
                                                                          fieldExtractionOptions,
                                                                          null, expected, null);
                        object actualObject = actualType.InvokeMember(expectedField.Name,
                                                                      fieldExtractionOptions,
                                                                      null, actual, null);
                        if (expectedObject != null && actualObject != null)
                        {
                            Console.WriteLine(">>>>>>>> digging into " + expectedField.Name);
                            expectedObject.EnsureMatchByFields(actualObject);
                            Console.WriteLine("<<<<<<<< back from dig of " + expectedField.Name);
                        }
                    }
                }
            }
        }

        /// <summary>
        /// Ensures none of the values is null.
        /// </summary>
        /// <param name="parameters">The parameters to check for null.</param>
        private static void ensureNotNull(params object[] parameters)
        {
            foreach (object obj in parameters)
            {
                if (obj == null)
                {
                    throw new ArgumentNullException("parameters", "at least one parameter is null");
                }
            }
        }
    }
}
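
Putting the two helpers together, a typical value-identity test might look like the following sketch (MyClass and its members are hypothetical, and the WcfRoundTrip helper is the one sketched above):

[TestClass]
public class MyClassRoundTripTests
{
    // Hypothetical value object under test.
    public class MyClass
    {
        public string Name { get; set; }
        public int Count { get; set; }
    }

    [TestMethod]
    public void MyClass_RoundTripsThroughWcfSerialization()
    {
        MyClass expected = new MyClass { Name = "abc", Count = 3 };
        MyClass actual = WcfRoundTrip.Clone(expected);  // round-trip through the WCF serializer

        expected.EnsureMatchByProperties(actual);  // property-by-property value identity
        expected.EnsureMatchByFields(actual);      // field-by-field value identity
    }
}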

VS2008 and web application - hostname

Recently I ran into an issue where Visual Studio was refusing to load an otherwise previously perfectly good web application in the solution.

The Visual Studio 2008 solution included a Web Application csproj, referenced by URL on IIS (that is, not a local file-based or Cassini web project). Attempting to load the solution brought up an error:

The local IIS URL http://localhost/{YourAppName} specified for Web Project {YourAppName} has not been configured. In order to open this project the Virtual Directory needs to be configured. Would you like to create the Virtual Directory now?

First, I checked IIS Manager and ensured the application pool and web site were up and operational. All seemed well.

Since my web application was in fact a WCF web service, I next tried to access the WSDL. It worked: a WSDL was returned, so I knew the web application was operational, or at the very least reachable.

Finally, I looked more closely at IIS. Clicking the “Web Sites” folder itself brings up a grid on the right, showing each web site’s name, port, IP and state. Sorting by port, I noticed several web sites running on 80. The way that works is that one of them has no host header at all, while each of the others carries a host header, distinguishing it from the rest of the web sites on the same port.

Normally, I do this when working on several web applications and wanting to mimic production as closely as possible without needing multiple IPs on the dev box. At some point, I had installed MOSS and used the dev box’s WINS name (the name of the machine in AD) as the host header name.

The confusion became clear when looking more closely at the WSDL returned. Although the URL I requested was http://localhost/{MyAppName}, the WSDL link rendered was http://MyMachineName/{MyAppName}?WSDL.

This clued me in on the answer: the IIS setup for SharePoint.

I created an entry in c:\windows\system32\drivers\etc\hosts mapping 127.0.0.1 to SomeName, changed IIS so that the offending site (which had previously used the machine name as its host header) used the new entry as its host header, and things were back to normal.

Unit Testing as part of the build process in VS2005

We have had a practice that includes unit testing for a while now, but it entailed NUnit and MbUnit with various build frameworks such as CruiseControl and VisualBuilder.

We are running VS2005, and wanted to include running the tests on developer boxes as part of a build, so that failed tests would flunk the build.

This is easily done using a post-build event. In the Post Build Events, enter:

CD $(TargetDir)    

"$(DevEnvDir)MSTEST.exe" /testcontainer:$(TargetFileName)

The quotes around the full MSTEST path ensure that the space in “Program Files” is resolved correctly.

Changing directory to the target directory is easier than spelling out explicit paths everywhere.

This could also have been achieved by running a continuous integration server on each machine, but that raises the setup complexity.

If MSTEST returns anything less than a success code, the build will fail.

If you further want to speed up the compile/run cycle during development, you can create a new configuration:

Configuration Manager -> Active Solution Configuration (drop down) -> “New” -> name a new debug configuration “Debug No Test”

In the new configuration, check each project in the solution except the test project.

Upon Reflection - Localization / Internationalization (I18N) gotcha

The other day I stumbled over a non-intuitive feature of localization. It turns out that any key you are going to use in a locale-specific file must be included in the culture-neutral file as well. Otherwise, declarative control localization using the meta:resourcekey="myKey" attribute syntax will not work as expected.

The setup:

Create a file named Default.aspx

Create an asp:HyperLink tag, and set some properties:

<asp:HyperLink ID="Greetings" runat="server" 
Text="Hello World"
NavigateUrl="http://www.ReallyDoesntMatterWhere.com/"
meta:resourcekey="HelloWorld"/>

<%=CultureInfo.CurrentCulture.NativeName%>

<%=CultureInfo.CurrentUICulture.NativeName%>

Now create an App_LocalResources folder, and in it create a resx file, Default.aspx.resx, containing the single string:

Key Value
“HelloWorld.Text” “Hello World!”

(Note that the resource value has the exclamation point. This proves that at runtime the app is using the resource value rather than the Text property in the tag.)

Run the application. Nothing fancy; as expected you get a link, and it says what it says in the culture-neutral resource file (Default.aspx.resx).

Now copy the file Default.aspx.resx to a file named Default.aspx.en-AU.resx. Open the new file, and edit the value to say “G’day Mate!”. In the web.config, add a globalization node:


<system.web>
  <globalization
    fileEncoding="utf-8"
    requestEncoding="utf-8"
    responseEncoding="utf-8"
    culture="en-AU"
    uiCulture="en-AU"/>
</system.web>

Run again and you get what you expect: the resource value from Default.aspx.en-AU.resx.

At this point, add a key to the localized resource file to give the link a ToolTip:

Key Value
“HelloWorld.ToolTip” “Go Joey”

Run again - but no tooltip is displayed, hover over the link as you may. Debug through and inspect the expression GetLocalResourceObject("HelloWorld.ToolTip") - it does return “Go Joey”, but nothing renders. That’s peculiar!
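
For reference, the same probe works from code-behind; here is an illustrative sketch, assuming it sits in the page’s code-behind class (GetLocalResourceObject is a member of TemplateControl, which the page derives from):

// Hypothetical code-behind probe: the lookup itself succeeds even though
// the declarative meta:resourcekey binding never sets the ToolTip.
protected void Page_Load(object sender, EventArgs e)
{
    object toolTip = GetLocalResourceObject("HelloWorld.ToolTip");
    System.Diagnostics.Debug.WriteLine(toolTip); // "Go Joey" under en-AU
}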

Now go back to the Default.aspx.resx file and add the ToolTip key there as well, with any value you want.

Run it again and - drum roll please - you get the tool tip.

Curious about this, I surmised that somehow the compiler emits properties or methods based on the culture-neutral resource file, and ILDASM confirms it. If you open the generated assembly (under c:\windows\Microsoft.net\framework\v2.0.50727\Temporary ASP.NET Files\…) you can see that when you include the key HelloWorld.ToolTip in Default.aspx.resx, the page compiler actually emits a method __BuildControlGreetings(). Opening the IL shows that in that method the ToolTip resource value is fetched and the property of the Greetings control is set to that value:


IL_003c: ldstr      "HelloWorld.ToolTip"
IL_0041: call       instance object [System.Web]System.Web.UI.TemplateControl::GetLocalResourceObject(string)
IL_0046: call       class [mscorlib]System.Globalization.CultureInfo [mscorlib]System.Globalization.CultureInfo::get_CurrentCulture()
IL_004b: call       string [mscorlib]System.Convert::ToString(object, class [mscorlib]System.IFormatProvider)
IL_0050: callvirt   instance void [System.Web]System.Web.UI.WebControls.WebControl::set_ToolTip(string)

But if you remove the resource key and value from the culture-neutral file, this IL is not emitted. That’s a pretty interesting discovery: the resource key gets compiled into inline code which makes the appropriate resource-fetching call and populates the control’s property. This is a pretty efficient implementation, because the property is set explicitly and no reflection is used at runtime. The overhead of GetLocalResourceObject() depends on the internals of the provider, but in general it boils down to a hash lookup. Recall that GetLocalResourceObject() during debugging did indeed return the desired string? That proved the specific satellite assembly was correctly compiled and present - but to no avail, because the generated __BuildControlGreetings() simply doesn’t include the call.

In conclusion, make a sticky note to always have the culture-neutral file include a superset of all the keys you may want to localize. The default mechanism provides a fallback: if a locale-specific key is not found, lookup falls back to the less specific or culture-neutral file. But there is no “fall forward” - every key must be present in the culture-neutral file, or the code that fetches the localized resource will simply be missing!

Upon Reflection - C# yield statement enumeration helper

The new yield statement sounds very convenient. In the past, one had to write a significant amount of code to implement the IEnumerator interface and expose an enumerator, with considerations of concurrency, a loop variable bound to the instance, or other means of maintaining the current value during enumeration.

Fret no more; a new syntax is in town - the yield statement.

With the yield statement, an IEnumerator implementation folds down to a scant few lines:

public class MyCollection : IEnumerable
{
    public IEnumerator GetEnumerator()
    {
        foreach (string s in new string[] { "Larry", "Moe", "Curley" })
        {
            yield return s + " is a stooge";
        }
    }
}
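
Consuming the collection is plain foreach; the compiler-generated enumerator does the rest:

MyCollection collection = new MyCollection();
foreach (string line in collection)
{
    Console.WriteLine(line); // "Larry is a stooge", "Moe is a stooge", ...
}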

You can also provide an enumerator returning static, hard-coded values:

public class VerySimple : IEnumerable
{
    public IEnumerator GetEnumerator()
    {
        yield return 1;
        yield return 7;
        yield return 11;
    }
}

So that sounds great! No pesky Reset(), MoveNext() and so on, no private index to hold on to, and there are even options for fancier things, like exposing only some of your items to enumeration:

public class Person
{
    public string Name;
    public bool IsPublic;

    public Person(string name, bool isPublic)
    {
        this.Name = name;
        this.IsPublic = isPublic;
    }
}

public class People : IEnumerable
{
    private Person[] _Peeps = new Person[] {
        new Person("James Brown", true),
        new Person("John Lennon", true),
        new Person("Johnny Doe", false)
    };

    public IEnumerator GetEnumerator()
    {
        foreach (Person dude in _Peeps)
        {
            if (dude.IsPublic)
            {
                yield return dude.Name + " is a well known";
            }
        }
    }
}

That was easy, and pretty useful too. You get an easy syntax for emitting each value, and exact control over which items are exposed, without implementing a whole subclass just for the enumeration.
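
For example, enumerating the People collection above yields only the public persons; Johnny Doe never appears:

People people = new People();
foreach (string description in people)
{
    // Prints only the two public persons:
    //   James Brown is a well known
    //   John Lennon is a well known
    Console.WriteLine(description);
}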

Looking at this keyword and the simplicity of exposing an enumerator, one might be tempted to think there is some magical new framework for enumerating a collection, with hooks and generic loops or some such. To find out, I looked at the IL generated for the MyCollection class we just created.

As expected, we find the class has a method named GetEnumerator(). Its implementation is seemingly simple: instantiate some cryptically named class and return it.

public IEnumerator GetEnumerator()
{
    <GetEnumerator>d__0 d__ = new <GetEnumerator>d__0(0);
    d__.<>4__this = this;
    return d__;
}

When you look at the implementation of the enumerator class itself, you get quite a few lines of code:

private sealed class <GetEnumerator>d__0 : IEnumerator<object>, IEnumerator, IDisposable
{
    // Fields
    private int <>1__state;
    private object <>2__current;
    public MyCollection <>4__this;
    public string[] <>7__wrap2;
    public int <>7__wrap3;
    public string <s>5__1;

    // Methods
    public <GetEnumerator>d__0(int <>1__state)
    {
        this.<>1__state = <>1__state;
    }

    private bool MoveNext()
    {
        try
        {
            switch (this.<>1__state)
            {
                case 0:
                    this.<>1__state = -1;
                    this.<>1__state = 1;
                    this.<>7__wrap2 = new string[] { "Larry", "Moe", "Curley" };
                    this.<>7__wrap3 = 0;
                    while (this.<>7__wrap3 < this.<>7__wrap2.Length)
                    {
                        this.<s>5__1 = this.<>7__wrap2[this.<>7__wrap3];
                        this.<>2__current = this.<s>5__1 + " is a stooge";
                        this.<>1__state = 2;
                        return true;
                    Label_0098:
                        this.<>1__state = 1;
                        this.<>7__wrap3++;
                    }
                    this.<>1__state = -1;
                    break;

                case 2:
                    goto Label_0098;
            }
            return false;
        }
        fault
        {
            this.Dispose();
        }
    }

    void IEnumerator.Reset()
    {
        throw new NotSupportedException();
    }

    void IDisposable.Dispose()
    {
        switch (this.<>1__state)
        {
            case 1:
            case 2:
                this.<>1__state = -1;
                break;
        }
    }

    // Properties
    object IEnumerator<object>.Current
    {
        get
        {
            return this.<>2__current;
        }
    }

    object IEnumerator.Current
    {
        get
        {
            return this.<>2__current;
        }
    }
}

So what is really going on here is that when you type out yield return x; the compiler transforms your method into a stub, implants your loop logic in the MoveNext() method of a shiny new enumerator class, and provides the standard requisite members of the IEnumerator interface that support the foreach statement.

Is this good or bad? It certainly serves well in many instances. For most of your daily uses of an enumerator it should work quite well. It’s strongly typed to the list item, and it references your class’s values directly.

What can be sub-optimal about this? Multithreaded applications need to implement locking at the class level. Some collections in .NET maintain an internal version number, such that if the collection changes during enumeration, an exception is thrown to the enumerating thread. Not so here; if you want that behavior, you have to implement it yourself.
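
If you do want that fail-fast behavior, you can bake a version check into the iterator yourself. A minimal sketch, assuming a hypothetical List-backed collection (the type and members are mine, for illustration):

using System;
using System.Collections;
using System.Collections.Generic;

public class GuardedCollection : IEnumerable
{
    private readonly List<string> _items = new List<string>();
    private int _version; // incremented on every mutation

    public void Add(string item)
    {
        _items.Add(item);
        _version++;
    }

    public IEnumerator GetEnumerator()
    {
        int version = _version; // snapshot when enumeration starts
        foreach (string item in _items)
        {
            if (version != _version)
            {
                throw new InvalidOperationException("Collection was modified during enumeration.");
            }
            yield return item;
        }
    }
}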

Note that the loop itself and any of your conditions get transformed by the compiler. The transformation, I trust, is functionally equivalent. The result will vary slightly based on the collection being iterated, or whether you use a static chain of yield statements. In the case of hard-coded yielded values no concurrency issues should arise, but in my humble experience that case is fairly rare.

Besides that, I think it’s pretty cool. You get to write less code, and the compiler takes care of the code generation.

On a side note, when decompiling your code, don’t get too caught up in Reflector’s code rendering. For one, decompiling IL back to your language of choice is not a symmetric operation. For that reason, and due to compiler optimizations and inlining, certain constructs may come back reflected as a GOTO label even though they were not coded that way in the higher-level language.
