
A couple of weeks ago, I started getting a VerificationException on a newly created unit test project that already had a bunch of tests in it. The full message was:

method [TEST_METHOD_NAME] threw exception. System.Security.VerificationException: Operation could destabilize the runtime.

For context, my work project currently uses MSTest and VS2010 RC1, and targets .NET 4.

Heading straight to MSDN for more details, I got the following information:

"The exception that is thrown when the security policy requires code to be type safe and the verification process is unable to verify that the code is type safe".

I don't know about you, but this didn't give me much of a clue about who or what exactly was causing the error. It was rather puzzling, as thus far the tests had been running perfectly well on my local dev environment and on the build server. What I noticed, however, was that switching code coverage on seemed to make the error crop up. Since code coverage is a must-have, I had to find a way of resolving this.

Another MSDN entry, Troubleshooting Exceptions: System.Security.VerificationException, looked promising. The only thing it suggested, though, was to ensure that the “application is not loading two conflicting versions of a class library”. I couldn't really imagine this being the reason the VerificationException was thrown, as without code coverage everything worked fine on exactly the same code base and environments. No code coverage, no problem; code coverage on, tests aborted!

So my next step was to identify the project and the assembly triggering the exception. After a while I managed to track it down to one of the solution's projects, which was reusing some libraries written against previous versions of the .NET Framework. Those libraries in turn referenced the System.Configuration v2 assembly. My first thought was to add a binding redirect in the configuration file, as shown below:

<configuration>
...
   <runtime>
      <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
         <dependentAssembly>
            <assemblyIdentity name="System.Configuration" publicKeyToken="B03F5F7F11D50A3A" culture="neutral"/>
            <bindingRedirect oldVersion="0.0.0.0-4.0.0.0" newVersion="4.0.0.0"/>
         </dependentAssembly>
      </assemblyBinding>
   </runtime>
...
</configuration> 

Unfortunately MSTest would still refuse to run the tests under code coverage. I then recalled some 'trust' issues I had in the past with System.Configuration on a .NET 3.5 application. More specifically, this assembly requires full trust from the calling assembly; otherwise a security exception is thrown. Here is an excerpt from an MSDN article that mattered to me:

There is no programmatic way for partially trusted code to call a library that does not have the AllowPartiallyTrustedCallersAttribute. If an application does not receive full trust by default, an administrator must choose to modify security policy and grant the application full trust before it can call such a library.

Only a few of the .NET Framework libraries installed in the GAC allow partially trusted callers, and System.Configuration is not one of them! So marking the assembly with the AllowPartiallyTrustedCallers attribute (APTCA), as shown below, would not work (*).

[assembly: AllowPartiallyTrustedCallers(PartialTrustVisibilityLevel = PartialTrustVisibilityLevel.VisibleToAllHosts)] 
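
For illustration, here is a rough sketch (the class and key names are hypothetical) of how that full-trust requirement can be reproduced outside MSTest: a call into System.Configuration is executed from a partially trusted sandbox AppDomain, where it should fail with a security-related exception:

using System;
using System.Configuration;
using System.Security;
using System.Security.Policy;

// Hypothetical repro: call into System.Configuration from a partially
// trusted AppDomain. Names are made up for illustration only.
public class PartialTrustCaller : MarshalByRefObject
{
    public void ReadSetting()
    {
        // System.Configuration does not allow partially trusted callers,
        // so this call is expected to fail when run in the sandbox below.
        Console.WriteLine(ConfigurationManager.AppSettings["SomeKey"]);
    }
}

public static class SandboxDemo
{
    public static void Main()
    {
        // Build an Internet-zone (partial trust) permission set.
        var evidence = new Evidence();
        evidence.AddHostEvidence(new Zone(SecurityZone.Internet));
        var permissions = SecurityManager.GetStandardSandbox(evidence);

        var setup = new AppDomainSetup
        {
            ApplicationBase = AppDomain.CurrentDomain.BaseDirectory
        };

        // Create the sandboxed AppDomain and run the call inside it.
        var sandbox = AppDomain.CreateDomain("Sandbox", evidence, setup, permissions);
        var caller = (PartialTrustCaller)sandbox.CreateInstanceAndUnwrap(
            typeof(PartialTrustCaller).Assembly.FullName,
            typeof(PartialTrustCaller).FullName);

        caller.ReadSetting();
    }
}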

I could only conclude from the above that MSTest code coverage (presumably through the vshost.exe process?) was not, for reasons I still don't know, fully trusted and therefore did not have the right set of permissions to instrument the code. In the 'old' days of CAS (Code Access Security), I might have been able to circumvent that with an assembly-level attribute in the AssemblyInfo.cs file requesting the SkipVerification permission, which tells the CLR to skip the type-safety verification that happens during JIT compilation:

[assembly: SecurityPermission(SecurityAction.RequestMinimum, UnmanagedCode = true, SkipVerification = true, Execution = true)] 

But .NET 4 has changed all of that (Security Changes in the .NET Framework 4). Security policy decisions are now handed over to the host, and managed desktop applications automatically run with full trust, leaving system administrators responsible for managing permissions at the operating system level. More importantly, CAS policy is now disabled by default and is no longer the 'preferred security supplier'. This also means that SecurityAction.RequestMinimum and the other request actions are now obsolete. Visual Studio displays a friendly warning when you add one:

[Visual Studio warning: 'SecurityAction.RequestMinimum' is obsolete]

Note that you can re-enable CAS in your configuration file as shown below:

<configuration>
...
   <runtime>
      <NetFx40_LegacySecurityPolicy enabled="true" />
   </runtime>
...
</configuration> 

However, since I've always found CAS confusing and rather difficult to grok, it's not an avenue I even wanted to explore.

After some further reading (**) on the new security model in .NET 4, and in particular a short post on the .NET Security Blog, Transparency Models: A tale of two levels, it dawned on me that what was required was to apply the Level 1 rule set to my .NET 4 assembly calling the .NET 2 code. Level 1 corresponds to the security transparency model baked into the CLR v2 and, when applied under .NET 4, effectively provides backward compatibility for code written against earlier versions of the framework. Level 2 is the model that comes with the .NET 4 CLR.
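
As an aside, you can check at run time which rule set and trust level an assembly actually ended up with. The snippet below is just a small diagnostic sketch using the Assembly.SecurityRuleSet and Assembly.IsFullyTrusted properties introduced in .NET 4:

using System;
using System.Reflection;

// Print the transparency rule set and trust level of the current assembly
// (Level1 = CLR v2 model, Level2 = .NET 4 model).
class SecurityDiagnostics
{
    static void Main()
    {
        Assembly current = typeof(SecurityDiagnostics).Assembly;

        Console.WriteLine("Rule set:      {0}", current.SecurityRuleSet);
        Console.WriteLine("Fully trusted: {0}", current.IsFullyTrusted);
    }
}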

The follow-up post Differences Between the Security Rule Sets describes the effect of the APTCA attribute as follows:

Allows partial trusted callers access to the assembly by removing an implicit link demand for full trust on all code in signed assemblies.

I therefore added the explicit attribute below in my AssemblyInfo.cs file:

using System.Security; 

[assembly: SecurityRules(SecurityRuleSet.Level1)] 

Although the application I am working on is for internal use only and I probably could have left it at that, I decided to wrap the SecurityRules attribute in a 'COVERAGE' conditional compilation block, as I still want to be warned of any other potential issues.

using System.Security; 

#if COVERAGE
[assembly: SecurityRules(SecurityRuleSet.Level1)] 
#endif
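
The COVERAGE symbol obviously has to be defined somewhere. One option (just a sketch; the configuration name is arbitrary) is to create a dedicated 'Coverage' build configuration and define the symbol for it in the project file:

<!-- In the .csproj: define COVERAGE only for a dedicated 'Coverage'
     build configuration (the name is arbitrary). -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Coverage|AnyCPU' ">
   <DefineConstants>DEBUG;TRACE;COVERAGE</DefineConstants>
   <OutputPath>bin\Coverage\</OutputPath>
</PropertyGroup>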

After that all my tests were passing with code coverage enabled. Result!

I don't know for sure whether my diagnosis is even close to being 50% correct, as I honestly can't say I fully understand .NET security. Not as much as I should, anyway. Like a majority of devs, I suspect, I develop most of the time in a fully trusted environment with admin privileges, and I have only had to worry about CAS on a few rare occasions. I know, I can hear some say “develop with a user account instead”, but let's face it, there is a reason most programmers don't: it's hard and really a pain in the @~!#$. That could also be part of the reason CAS is being superseded.

At least this experience gave me an opportunity to start learning about .NET 4 security and to become more aware of the potential issues when developing a mixed solution with assemblies compiled against different versions of the framework. Hopefully it will also help some other people facing the same problem.

For now I need to return to my tests and get that code coverage back up to an acceptable level. Until next time,

Happy Programming!

(*)  You can find more details on Allowing Partially Trusted Callers.
(**) For a good introductory article on the new security model, check out Exploring the .NET Framework 4 Security Model.

