+N Consulting, Inc.


Code generator for Visual Studio - Denum

Announcing a newly added Codeplex project: the “Denum” code generator.

Denum is a class / pattern for representing fairly static metadata from a database in an in-memory structure.

The structure behaves much like an Enum, but contains a static member for each data row, so that compile-time type checking helps keep your application logic, and the build itself, transparent and coherent against the database version.
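
The generator's actual output isn't shown in this post, but the pattern it targets looks roughly like this hand-written sketch (the OrderStatus table and its members are made-up examples):

public sealed class OrderStatus
{
    // Each row of a fairly static lookup table becomes a typed static member,
    // so application code writes OrderStatus.Shipped and gets compile-time
    // checking instead of magic numbers or strings.
    public int Id { get; private set; }
    public string Name { get; private set; }

    private OrderStatus(int id, string name)
    {
        Id = id;
        Name = name;
    }

    // One static member per database row:
    public static readonly OrderStatus Pending = new OrderStatus(1, "Pending");
    public static readonly OrderStatus Shipped = new OrderStatus(2, "Shipped");
    public static readonly OrderStatus Returned = new OrderStatus(3, "Returned");
}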

WCF Serialization – new in .NET framework 3.5 sp1

You may be surprised to find that classes are serialized by WCF without any [DataContract] attribute attached to them. I certainly was!

When WCF came out, there was much fanfare about the new, improved, superior WCF serializer (justified, IMHO). The main policy sizzle was that unlike [Serializable], which marks a whole POCO object and then relies on [NonSerialized] to exclude specific fields (opt-out), WCF would use “opt-in”: only properties specifically decorated will be serialized.

This serialization policy was clearly expressed, and the side effect of it all is that suddenly I have a bunch of projects that fail some unit tests. Upon digging, I found that SP1 introduced a new kink: if you don’t decorate your class at all (omit the [DataContract] attribute) then the object becomes “fully” serializable by WCF. All public properties will be automatically included.

This may seem like a huge step back to those who relied on the opt-in feature to hide certain classes from the serializer. It may also be a huge sigh of relief to those who cringed at the effort of (re)decorating all their message and value objects for WCF.

Note that now with SP1, if you do decorate an object with [DataContract] then the old rules apply - only properties with [DataMember] will be serialized. So to be selective, you may still decorate with [DataContract] and then selectively decorate only the properties you want.
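
To illustrate the difference, here is a minimal pair of classes (these types are my own examples, not from the original post):

using System.Runtime.Serialization;

// Under .NET 3.5 SP1, an undecorated class like this is serialized by WCF
// as if every public property were a [DataMember] - there is no way to opt
// a member out without adding attributes.
public class UndecoratedMessage
{
    public int Id { get; set; }
    public string Secret { get; set; } // serialized too
}

// Once [DataContract] is applied, the original opt-in rules are back:
// only members marked with [DataMember] are serialized.
[DataContract]
public class DecoratedMessage
{
    [DataMember]
    public int Id { get; set; }

    public string Secret { get; set; } // not serialized - no [DataMember]
}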

I don’t know what led to the exact decision, and the syntax nuance definitely walks a fine line. One could argue that classes without decoration now serialize without effort, but ones marked for WCF specifically still behave as previously advertised.

All in all, 2 hours of head scratching in disbelief, 1 hour to change some unit test expectations, not too awful. Long live refactoring tools in Visual Studio!

Unit testing value tests - Automate repeated tests

It is generally considered a good thing to use unit tests these days. Often it is necessary to test a method which takes some complex type, so in the unit test one has to painstakingly manufacture such an object and pass it in.

Before doing so, you would (should!) ensure the complex type itself produces an identity - that is to say that if you create an instance of type MyClass and assign / construct it with proper values, you should “get back” what you gave it. This is especially true for objects that get serialized and de-serialized.

What I often do is use some helper code.

The first snippet allows for testing an object’s serialization using WCF, ensuring “round trip” serialization and de-serialization works.
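
That first snippet is not reproduced below, but a minimal sketch of the idea, using DataContractSerializer over a MemoryStream, might look like this (the SerializationHelper and RoundTrip names are mine):

using System.IO;
using System.Runtime.Serialization;

public static class SerializationHelper
{
    /// <summary>
    /// Serializes the object with the WCF DataContractSerializer and immediately
    /// de-serializes it, returning the "round tripped" copy for comparison.
    /// </summary>
    public static T RoundTrip<T>(T original)
    {
        DataContractSerializer serializer = new DataContractSerializer(typeof(T));
        using (MemoryStream stream = new MemoryStream())
        {
            serializer.WriteObject(stream, original);
            stream.Position = 0;
            return (T)serializer.ReadObject(stream);
        }
    }
}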

The second snippet uses reflection to ensure that the object you put through the mill came back with values identical to the ones initially assigned. This saves a LOT of Assert.AreEqual(expected.PropA, actual.PropA) etc.

Since the object is actually a reference type, other equality checks (such as ReferenceEquals and the like) would not do at the root level.

Structs or nested structs are handled via the ensureFieldsMatch() method. Note that complex types may not be handled correctly - generics have not been addressed specifically here.

Future enhancements may include passing in an exclusion list of properties to skip or an inclusion list of properties to match exclusively. I’m on the fence about these, because the whole idea was to say “an object A matches B if every property and public field matches in value”, and if one has to explicitly provide all property names, one could just as well Assert.AreEqual(a.x, b.x) them.
Updated 2008-11-07: Error in comparison fixed. (Thank you Rich for pointing it out!)

using System;
using System.Linq;
using System.Reflection;
using Microsoft.VisualStudio.TestTools.UnitTesting;

namespace Nuri.Test.Helpers
{
    public static class Equality
    {
        /// <summary>
        /// Some properties are instance specific, and can be excluded for value matching (unlike ref equivalence)
        /// </summary>
        private static readonly string[] _ReservedProperties = { "SyncRoot" };

        public static void EnsureMatchByProperties(this object expected, object actual)
        {
            ensureNotNull(expected, actual);
            Type expectedType = expected.GetType();
            Type actualType = actual.GetType();
            Assert.AreEqual(expectedType, actualType);

            if (expectedType.IsArray)
            {
                Array expectedArray = expected as System.Array;
                Array actualArray = actual as System.Array;
                Console.WriteLine(">>>*** digging into array " + expectedType.Name);
                for (int i = 0; i < expectedArray.Length; i++)
                {
                    Console.WriteLine(" --- --- ---");
                    EnsureMatchByProperties(expectedArray.GetValue(i), actualArray.GetValue(i));
                }
                Console.WriteLine("<<<*** digging out from array " + expectedType.Name);
            }
            else
            {
                ensurePropertiesMatch(expected, actual, expectedType, actualType);
            }
        }

        public static void EnsureMatchByFields(this object expected, object actual, params string[] exclusionList)
        {
            ensureNotNull(expected, actual);
            Type expectedType = expected.GetType();
            Type actualType = actual.GetType();
            Assert.AreEqual(expectedType, actualType);

            if (expectedType.IsArray)
            {
                Array expectedArray = expected as System.Array;
                Array actualArray = actual as System.Array;
                Console.WriteLine(">>>*** digging into array " + expectedType.Name);
                for (int i = 0; i < expectedArray.Length; i++)
                {
                    Console.WriteLine(" --- --- ---");
                    expectedArray.GetValue(i).EnsureMatchByFields(actualArray.GetValue(i)); // recursion
                }
                Console.WriteLine("<<<*** digging out from array " + expectedType.Name);
            }
            else
            {
                ensureFieldsMatch(expected, actual, exclusionList);
            }
        }

        private static void ensurePropertiesMatch(object expected, object actual, Type expectedType, Type actualType)
        {
            BindingFlags propertyExtractionOptions = BindingFlags.Public
                                                     | BindingFlags.NonPublic
                                                     | BindingFlags.Instance
                                                     | BindingFlags.Static
                                                     | BindingFlags.GetProperty;
            foreach (PropertyInfo expectedProp in expectedType.GetProperties())
            {
                if (expectedProp.CanRead && !_ReservedProperties.Contains(expectedProp.Name))
                {
                    if (expectedProp.PropertyType.IsValueType || expectedProp.PropertyType == typeof(String))
                    {
                        object expectedValue = expectedType.InvokeMember(expectedProp.Name,
                                                                         propertyExtractionOptions,
                                                                         null, expected, null);
                        object actualValue = actualType.InvokeMember(expectedProp.Name,
                                                                     propertyExtractionOptions,
                                                                     null, actual, null);
                        if (expectedValue == null && actualValue == null)
                        {
                            // both null - ok
                            Console.WriteLine("{0}: null == null", expectedProp.Name);
                            continue;
                        }
                        if (expectedValue == null || actualValue == null)
                        {
                            // one null, the other not - failure
                            Assert.Fail(expectedProp.Name + ": Expected or Actual is null! (but not both)");
                            break;
                        }
                        Console.Write("{0}: {1} == {2} ?", expectedProp.Name, expectedValue.ToString(),
                                      actualValue.ToString());
                        Assert.AreEqual(expectedValue, actualValue,
                                        "Value of property doesn't match in " + expectedProp.Name);
                        Console.WriteLine(" true.");
                    }
                    else if (expectedProp.PropertyType.IsClass)
                    {
                        object expectedObject = expectedType.InvokeMember(expectedProp.Name,
                                                                          propertyExtractionOptions,
                                                                          null, expected, null);
                        object actualObject = actualType.InvokeMember(expectedProp.Name,
                                                                      propertyExtractionOptions,
                                                                      null, actual, null);
                        if (expectedObject != null && actualObject != null)
                        {
                            Console.WriteLine(">>>>>>>> digging into " + expectedProp.Name);
                            EnsureMatchByProperties(expectedObject, actualObject);
                            Console.WriteLine("<<<<<<<< back from dig of " + expectedProp.Name);
                        }
                    }
                }
            }
        }

        private static void ensureFieldsMatch(object expected, object actual, params string[] exclusionList)
        {
            Type expectedType = expected.GetType();
            Type actualType = actual.GetType();
            BindingFlags fieldExtractionOptions = BindingFlags.GetField |
                                                  BindingFlags.NonPublic |
                                                  BindingFlags.Public |
                                                  BindingFlags.Instance;
            foreach (FieldInfo expectedField in expectedType.GetFields(fieldExtractionOptions))
            {
                if (!exclusionList.Contains(expectedField.Name))
                {
                    if (expectedField.FieldType.IsValueType || expectedField.FieldType == typeof(String))
                    {
                        object expectedValue = expectedType.InvokeMember(expectedField.Name,
                                                                         fieldExtractionOptions,
                                                                         null, expected, null);
                        object actualValue = actualType.InvokeMember(expectedField.Name,
                                                                     fieldExtractionOptions,
                                                                     null, actual, null);
                        if (expectedValue == null && actualValue == null)
                        {
                            // both null - ok
                            Console.WriteLine("{0}: null == null", expectedField.Name);
                            continue;
                        }
                        if (expectedValue == null || actualValue == null)
                        {
                            // one null, the other not - failure
                            Assert.Fail(expectedField.Name + ": Expected or Actual is null! (but not both)");
                            break;
                        }
                        Console.Write("{0}: {1} == {2} ?", expectedField.Name, expectedValue.ToString(), actualValue.ToString());
                        Assert.AreEqual(expectedValue, actualValue, "Value of field doesn't match in " + expectedField.Name);
                        Console.WriteLine(" true.");
                    }
                    else if (expectedField.FieldType.IsClass)
                    {
                        object expectedObject = expectedType.InvokeMember(expectedField.Name,
                                                                          fieldExtractionOptions,
                                                                          null, expected, null);
                        object actualObject = actualType.InvokeMember(expectedField.Name,
                                                                      fieldExtractionOptions,
                                                                      null, actual, null);
                        if (expectedObject != null && actualObject != null)
                        {
                            Console.WriteLine(">>>>>>>> digging into " + expectedField.Name);
                            expectedObject.EnsureMatchByFields(actualObject);
                            Console.WriteLine("<<<<<<<< back from dig of " + expectedField.Name);
                        }
                    }
                }
            }
        }

        /// <summary>
        /// Ensures none of the values is null.
        /// </summary>
        /// <param name="parameters">The parameters to check for null.</param>
        private static void ensureNotNull(params object[] parameters)
        {
            foreach (object obj in parameters)
            {
                if (obj == null)
                {
                    throw new ArgumentNullException("at least one parameter is null");
                }
            }
        }
    }
}
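
A test can then combine the two: round-trip an instance through the serializer and assert property-by-property equality in one call. MyMessage below is a hypothetical stand-in for your own type, and RoundTrip refers to the sketch shown earlier:

using Microsoft.VisualStudio.TestTools.UnitTesting;
using Nuri.Test.Helpers;

public class MyMessage
{
    public int Id { get; set; }
    public string Name { get; set; }
}

[TestClass]
public class MyMessageTests
{
    [TestMethod]
    public void MyMessage_survives_wcf_round_trip()
    {
        MyMessage expected = new MyMessage { Id = 42, Name = "Answer" };

        // Serialize and de-serialize, then verify every readable public
        // property of the copy matches the original.
        MyMessage actual = SerializationHelper.RoundTrip(expected);
        expected.EnsureMatchByProperties(actual);
    }
}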

VS2008 and web application - hostname

Recently I ran into an issue where IIS was refusing to load an otherwise perfectly good web application in the solution.

The Visual Studio 2008 solution included a Web Application csproj, and was referencing it by URL on IIS (that is, not a local file based or Cassini web project). Attempting to load the solution brought up an error:

The local IIS URL http://localhost/{YourAppName} specified for Web Project {YourAppName} has not been configured. In order to open this project the Virtual Directory needs to be configured. Would you like to create the Virtual Directory now?

First, I checked IIS Manager and ensured the application pool and web site were up and operational. All seemed well.

Since my web application was in fact a WCF web service, I next tried to access the WSDL: it worked. A WSDL was returned, so I knew the web application was operational, or at the very least reachable.

Finally, I looked more closely at IIS. Clicking the “Web Sites” folder itself brings up a grid on the right showing each web site’s name, port, IP and state. Sorting by port, I noticed several web sites were running on 80. One of them had no host header at all, and each of the others had a host header distinguishing it from the rest of the web sites on the same port.

Normally, I do this when working on several web applications and wanting to mimic production as closely as possible without the need for multiple IPs on the dev box. At some point, I installed MOSS and used the dev box’s WINS name (the name of the machine in AD) as the host header name.

The confusion became clear when looking more closely at the WSDL returned. Although the URL I used was http://localhost/{MyAppName}, the WSDL link rendered was http://MyMachineName/{MyAppName}?WSDL.

This clued me in on the answer: the IIS setup for SharePoint.

I created a c:\windows\system32\drivers\etc\hosts entry (127.0.0.1 SomeName) for the IIS web site which previously was using the machine name as its host header, changed IIS so that the host header for the offending site used the new entry, and things were back to normal.

Unit Testing as part of the build process in VS2005

We have had a practice that includes unit testing for a while now, but it entailed NUnit and MbUnit with various build frameworks such as CruiseControl and VisualBuilder.

We are running VS2005, and wanted to include running the tests on developer boxes as part of a build, so that failed tests flunk the build.

Using a post build event, this is easily done. In the Post Build Events, enter:

CD $(TargetDir)
"$(DevEnvDir)MSTEST.exe" /testcontainer:$(TargetFileName)

The quotes around the MSTEST full path ensure that the space in the name “Program Files” is resolved correctly.

Changing directory to the target directory is easier than setting all explicit paths etc.

This could also have been achieved by running a continuous integration server on each machine, but that raises the setup complexity.

If MSTEST returns anything other than a success exit code, the build will fail.

If you further want to speed up the development compile / run cycle, you can create a new configuration:

Configuration Manager -> Active Solution Configuration (drop down) -> “New” -> name a new debug configuration “Debug No Test”

In the new configuration, check the Build box for each project in the solution except the test project.

Upon Reflection - Localization / Internationalization (I18N) gotcha

The other day I stumbled over a non-intuitive feature of localization. It turns out that you need to include any key you are going to use in a locale-specific resource file in the culture-neutral file as well. Otherwise, declarative control localization using the meta:resourcekey="myKey" property syntax will not work as expected.

The setup:

Create a file named Default.aspx

Create an asp:HyperLink tag, and set some properties:

<asp:HyperLink ID="Greetings" runat="server"
Text="Hello World"
NavigateUrl="http://www.ReallyDoesntMatterWhere.com/"
meta:resourcekey="HelloWorld"/>
<%=CultureInfo.CurrentCulture.NativeName%>
<%=CultureInfo.CurrentUICulture.NativeName%>

Now create an App_LocalResources folder, create a resx file named Default.aspx.resx in it, and give it a single string:

Key Value
“HelloWorld.Text” “Hello World!”

(Note that the resource has the exclamation point. This proves that the app is using the resource value rather than the Text property in the tag at runtime.)

Run the application. Nothing fancy; as expected you get a link, and it says what the culture-neutral resource file (Default.aspx.resx) says.

Now copy the file Default.aspx.resx to make a file named Default.aspx.en-AU.resx. Open the new file, and edit the value to say “G’day Mate!”. In the web.config add a node:

<system.web>
<globalization
fileEncoding="utf-8"
requestEncoding="utf-8"
responseEncoding="utf-8"
culture="en-AU"
uiCulture="en-AU"/>

Run again and you get what you expect: the resource value from Default.aspx.en-AU.resx.

At this point, add a key to the localized resource file to have a ToolTip:

Key Value
“HelloWorld.ToolTip” “Go Joey”

Run again, but no tool tip is displayed, hover over the link as you may. Debug through and inspect the expression GetLocalResourceObject("HelloWorld.ToolTip") - it does return “Go Joey”, but it doesn’t render. That’s peculiar!

Now go back to the Default.aspx.resx file and add the ToolTip key and any value you want as well.

Run it again and - drum roll please - you get the tool tip.

Curious about this, I surmised that somehow the compiler emits properties or methods based on the culture-neutral resource file. ILDASM confirms it. If you open the generated assembly (c:\windows\Microsoft.net\framework\v2.0.50727\Temporary Internet Files…) you see that when you include the key HelloWorld.ToolTip in Default.aspx.resx, the compiler actually emits a method ___BuildControlGreetings(). Opening the IL shows that in that method the ToolTip resource value is fetched and the ToolTip property of the Greetings control is set to that value:

IL_003c: ldstr "HelloWorld.ToolTip"
IL_0041: call instance object [System.Web]System.Web.UI.TemplateControl::GetLocalResourceObject(string)
IL_0046: call class [mscorlib]System.Globalization.CultureInfo [mscorlib] System.Globalization.CultureInfo::get_CurrentCulture()
IL_004b: call string [mscorlib]System.Convert::ToString(object,
class [mscorlib]System.IFormatProvider)
IL_0050: callvirt instance void [System.Web]System.Web.UI.WebControls.WebControl::set_ToolTip(string)

But if you remove the resource key and value from the culture-neutral file, that IL is not emitted. That’s a pretty interesting discovery. The resource key gets compiled into inline code which makes the appropriate resource-fetching call and populates the control’s property. This is a pretty efficient implementation: the property is set explicitly, and no reflection is used at runtime. The overhead of GetLocalResourceObject() depends on the internals of the provider, but in general it boils down to a hash lookup. Recall that GetLocalResourceObject() during debugging did indeed return the desired string? That proved that the specific satellite assembly was correctly compiled and present. But it was of no use, because the generated ___BuildControlGreetings() simply doesn’t include the call.

In conclusion, make a sticky note to always have the culture-neutral file include a superset of all keys you may want to localize. The default mechanism provides a fallback: if a key is not specified in a locale-specific file, the less specific or culture-neutral file is used. But there is no “fall forward” - all keys must be specified in the culture-neutral file, or the code to fetch the localized resource will simply never be emitted!
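
If a key really must live only in a locale-specific file, the declarative path won’t emit the call, but you can still set the property yourself in code-behind. A sketch of that workaround (my own, not what the post recommends as a practice):

using System;
using System.Web.UI;

// Hypothetical code-behind for Default.aspx.
public partial class _Default : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // GetLocalResourceObject does find "HelloWorld.ToolTip" in the en-AU
        // satellite assembly even when the culture-neutral file lacks the key,
        // as observed while debugging above.
        Greetings.ToolTip = (string)GetLocalResourceObject("HelloWorld.ToolTip");
    }
}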

Upon Reflection - C# yield statement enumeration helper

The new yield statement sounds very convenient. In the past, one had to write a significant amount of code to implement the IEnumerator interface and expose an enumerator. That included considerations of concurrency, a loop variable bound to the instance, or some other way to maintain the current position during enumeration.

Fret no more, a new syntax is in town - the yield statement.

With the yield statement, the IEnumerator implementation folds down to a simple loop:

public class MyCollection : IEnumerable
{
    public IEnumerator GetEnumerator()
    {
        foreach (string s in new string[] { "Larry", "Moe", "Curley" })
        {
            yield return s + " is a stooge";
        }
    }
}

You can also provide an enumerator returning static values, or a “hard coded” set of values:

public class VerySimple : IEnumerable
{
    public List<DateTime> _Things;

    public IEnumerator GetEnumerator()
    {
        yield return 1;
        yield return 7;
        yield return 11;
    }
}

So that sounds great! No pesky Reset(), MoveNext() etc., no private index to hold on to, and even options to do more fancy things, like exposing only some of your items to enumeration:

public class Person
{
    public string Name;
    public bool IsPublic;

    public Person(string name, bool isPublic)
    {
        this.Name = name;
        this.IsPublic = isPublic;
    }
}

public class People : IEnumerable
{
    private Person[] _Peeps = new Person[] {
        new Person("James Brown", true),
        new Person("John Lennon", true),
        new Person("Johnny Doe", false)
    };

    public IEnumerator GetEnumerator()
    {
        foreach (Person dude in _Peeps)
        {
            if (dude.IsPublic)
            {
                yield return dude.Name + " is well known";
            }
        }
    }
}

That was easy, and pretty useful too. You get to have an easy syntax for emitting each value, and you get exact control over which item is exposed without implementing a whole sub class just for the enumeration.

Looking at this keyword and the simplicity of exposing an enumerator, one might be tempted to think there is some magic new framework for enumerating a collection with hooks and generic loops or something. To find out, I looked at the IL generated for the MyCollection class we just created.

As expected, we find the class has a method named GetEnumerator(). Its implementation is seemingly simple: instantiate some cryptically named class and return it.

public IEnumerator GetEnumerator()
{
    <GetEnumerator>d__0 d__ = new <GetEnumerator>d__0(0);
    d__.<>4__this = this;
    return d__;
}

When you look at the implementation of the enumerator class itself, you get quite a few lines of code:

private sealed class <GetEnumerator>d__0 : IEnumerator<object>, IEnumerator, IDisposable
{
    // Fields
    private int <>1__state;
    private object <>2__current;
    public MyCollection <>4__this;
    public string[] <>7__wrap2;
    public int <>7__wrap3;
    public string <s>5__1;

    // Methods
    public <GetEnumerator>d__0(int <>1__state)
    {
        this.<>1__state = <>1__state;
    }

    private bool MoveNext()
    {
        try
        {
            switch (this.<>1__state)
            {
                case 0:
                    this.<>1__state = -1;
                    this.<>1__state = 1;
                    this.<>7__wrap2 = new string[] { "Larry", "Moe", "Curley" };
                    this.<>7__wrap3 = 0;
                    while (this.<>7__wrap3 < this.<>7__wrap2.Length)
                    {
                        this.<s>5__1 = this.<>7__wrap2[this.<>7__wrap3];
                        this.<>2__current = this.<s>5__1 + " is a stooge";
                        this.<>1__state = 2;
                        return true;
                    Label_0098:
                        this.<>1__state = 1;
                        this.<>7__wrap3++;
                    }
                    this.<>1__state = -1;
                    break;

                case 2:
                    goto Label_0098;
            }
            return false;
        }
        fault
        {
            this.Dispose();
        }
    }

    void IEnumerator.Reset()
    {
        throw new NotSupportedException();
    }

    void IDisposable.Dispose()
    {
        switch (this.<>1__state)
        {
            case 1:
            case 2:
                this.<>1__state = -1;
                break;
        }
    }

    // Properties
    object IEnumerator<object>.Current
    {
        get
        {
            return this.<>2__current;
        }
    }

    object IEnumerator.Current
    {
        get
        {
            return this.<>2__current;
        }
    }
}

So what is really going on here is that when you type out yield return x; the compiler transforms this into a method stub, implants your loop logic in the MoveNext() method of a shiny new enumerator class, and provides the standard requisite members of the IEnumerator interface that support the foreach statement.

Is this good or bad? Certainly it serves well in many instances. For most of your daily uses for an enumerator this should work quite well. It’s strongly typed to the list item and uses your class’s values referenced directly.

What can be sub-optimal about this? Multithreaded applications need to implement locking at the class level. Some collections in .NET implement an internal version number, such that if the collection changes during enumeration an exception gets thrown to the enumerating thread. Not so here. If you want that behavior, you’d have to implement it yourself.
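
If you do want fail-fast behavior, one approach (a sketch of my own, reusing the Person class from above) is to keep a version counter in the collection, capture it when enumeration starts, and compare it on every MoveNext():

using System;
using System.Collections;
using System.Collections.Generic;

public class GuardedPeople : IEnumerable
{
    private readonly List<Person> _peeps = new List<Person>();
    private int _version; // incremented on every mutation

    public void Add(Person p)
    {
        _peeps.Add(p);
        _version++;
    }

    public IEnumerator GetEnumerator()
    {
        int version = _version; // snapshot taken when enumeration begins
        for (int i = 0; i < _peeps.Count; i++)
        {
            if (version != _version)
            {
                // same idea List<T> uses internally: fail fast on concurrent modification
                throw new InvalidOperationException("Collection was modified during enumeration.");
            }
            yield return _peeps[i].Name;
        }
    }
}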

You should note that the loop itself and any of your conditions get transformed by the compiler. The transformation, I trust, is functionally equivalent. The result will vary slightly based on the collection being iterated, or whether you are using a static chain of yield statements. In the case of hard-coded yielded values, no concurrency issues should arise, but that is fairly rare in my humble experience.

Besides that, I think it’s pretty cool. You get to write less code, and the compiler takes care of the code generation.

On a side note, when decompiling your code, don’t get too caught up in Reflector’s code rendering. For one, decompiling IL to your language of choice is not a symmetric operation. For that reason, and due to compiler optimizations and inlining, certain language constructs may come back as a goto and a label even though they were not coded that way in the higher-level language.

Reign in your web parameters

As web developers we are tempted to be general about our use of the Request object and submitted parameters. The temptation to access Request["someKey"] is high because it frees us from wondering whether MyPage.aspx was posted to using POST or GET, and we might also convince ourselves that it’s more flexible because both POST and GET would work. Well, it would do something, that’s for sure. But do we always get consistent results? Consider:

string val = Context.Request["MyKey"];

Where does val come from? The HttpRequest class scans the QueryString, Form, Cookies and ServerVariables collections of the request object, in that order. The first match gets you an answer. This also means that if a query string param named “SID” exists and a form field named “SID” exists, you will get the query string value (GET). The implementation is essentially:

// implementation detail
public string this[string key]
{
    get
    {
        string returnValue = this.QueryString[key];
        if (returnValue != null)
        {
            return returnValue;
        }
        returnValue = this.Form[key];
        if (returnValue != null)
        {
            return returnValue;
        }
        HttpCookie cookie1 = this.Cookies[key];
        if (cookie1 != null)
        {
            return cookie1.Value;
        }
        returnValue = this.ServerVariables[key];
        if (returnValue != null)
        {
            return returnValue;
        }
        return null;
    }
}

Note here that as a last resort, the key is looked up against the ServerVariables collection. That makes me a bit uneasy. Not that I ever wanted a form variable named “USER_AGENT”, but now that I know it scans for it, I’ll be careful about my variable naming. Moving on, consider: string val = Context.Request.Params["MyKey"];

Where does val come from? You wouldn’t necessarily know. Params is built on first access, and reused throughout the lifetime of the HttpRequest object. Building it is done by creating a new collection which includes all 4 request sources, as listed below. Since it is a System.Collections.Specialized.NameValueCollection, if a key exists in more than one of the 4 sources, the values are appended as a comma-separated list. So if “PID” was both a query string GET parameter (say value 123) and a form POST variable (say value 456), then (Request.Params["PID"] == "123,456") == true;

private void FillInParamsCollection()
{
    // _params is the underlying collection supporting the Params property
    this._params.Add(this.QueryString);
    this._params.Add(this.Form);
    this._params.Add(this.Cookies);
    this._params.Add(this.ServerVariables);
}
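
To see the comma-joining behavior in isolation, here is a small stand-alone sketch that mimics what FillInParamsCollection does, using NameValueCollection directly outside of ASP.NET:

using System;
using System.Collections.Specialized;

public class ParamsJoinDemo
{
    public static void Main()
    {
        NameValueCollection queryString = new NameValueCollection();
        queryString.Add("PID", "123"); // pretend GET parameter

        NameValueCollection form = new NameValueCollection();
        form.Add("PID", "456"); // pretend POST variable

        // Add(NameValueCollection) appends values for duplicate keys
        // rather than overwriting them - exactly what Params relies on.
        NameValueCollection combined = new NameValueCollection();
        combined.Add(queryString);
        combined.Add(form);

        Console.WriteLine(combined["PID"]);                             // 123,456
        Console.WriteLine(string.Join("|", combined.GetValues("PID"))); // 123|456
    }
}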

What can we conclude from these observations?

  1. If you called Request.Params, even once, you now created a new collection, allocating enough memory to hold ALL parameters from the various sources. If you know what the parameter’s source is, you would be more efficient using that source collection directly. If you or your buddy working on the same code tree called Request.Params, however, the hit is taken, and subsequent references would not re-allocate collections.

  2. Precedence of parameters applies when you call Request[key], but not when you call Request.Params[key] .

  3. The effect of the Params collection sporting a comma-separated list is a double whammy:

    1. Upon insert (Request.Params[“myKey”] = “myvalue”), an ArrayList is created and appended to (a repetitive allocation for each value).

    2. Upon reading from Request.Params:

      1. If the string[] GetValues() method is used, the ArrayList gets converted to a string[] just for you each time (it’s not cached or preserved as an internal variable).

      2. If the string Get() method is used, a private static string GetAsOneString(ArrayList list) is invoked and a StringBuilder object is created just for you to concatenate the values and return them.

Although both methods are coded as efficiently as possible, neither gets you a reference to an existing object; both have to create another object and copy data into it on the fly each time. For that reason, it’s less efficient than if you knew exactly the source of your parameter and used that collection directly.

This issue can manifest if your application combines things like Flash movies, remote static HTML forms submitting to your site (think affiliate marketing programs), and various hard-coded links created against a specific form from menus, site maps and other corners of your application. When you work with forms designed in Visual Studio and use ASP form controls, you generally don’t encounter this, because you have access to strongly typed properties. More often than not I have run across a hybrid of ASP form controls and hand-coded links from CMS parts of the app or other rogue text links, so it’s worth knowing about and paying attention.

To rein in the parameters and ensure no rogue parameters ever exist, I’d recommend creating a utility class that wraps the Request object and uses only a set of well-known parameters encompassing all usable parameter names.

In conclusion, for both efficiency and unambiguity you would benefit from using precise parameter sources such as Request.QueryString["name"] and Request.Form["name"], and not relying on the “catchall” of Params or Request["name"] as shorthand. If you do find the need, inspect and ensure that parameter names do not collide so that you don’t end up with a peculiar value.

/// ParamMarshal is a sample utility which wraps the Request object.
/// It provides an easy way to eliminate ambiguity and define the
/// source of POST and GET parameters for large web projects so that
/// no parameter collision happens. Search your solution for any
/// reference to the "Request[" or "Request." and replace it with a
/// call to ParamMarshal.GetValue()
///
using System;
using System.Web;

namespace NH.Web.Utilities
{
    public class ParamMarshal
    {
        public static string GetValue(WellKnownParam param)
        {
            string result = string.Empty;
            switch (param)
            {
                // the first 3 are POST variables. No ambiguity.
                case WellKnownParam.FirstName:
                case WellKnownParam.LastName:
                case WellKnownParam.Password:
                    result = HttpContext.Current.Request.Form[param.ToString()];
                    break;
                // the following 2 are GET variables. No ambiguity either.
                case WellKnownParam.AffiliateID:
                case WellKnownParam.Keywords:
                    result = HttpContext.Current.Request.QueryString[param.ToString()];
                    break;
                default:
                    // this should never happen, because you should take care of
                    // every member of WellKnownParam. Specifically, do NOT put
                    // return HttpContext.Current.Request[param.ToString()] here.
                    throw new Exception(
                        "Programmer forgot to handle " + param.ToString());
            }
            return result;
        }

        public enum WellKnownParam
        {
            FirstName,
            LastName,
            Password,
            AffiliateID,
            Keywords
        }
    }
}