Tutorial: How to determine if a .NET assembly was built for x86 or x64?



Question:

I've got an arbitrary list of .NET assemblies.

I need to programmatically check if each DLL was built for x86 (as opposed to x64 or Any CPU). Is this possible?


Solution:1

Look at System.Reflection.AssemblyName.GetAssemblyName(string assemblyFile)

You can examine assembly metadata from the returned AssemblyName instance:

Using PowerShell:

    [36] C:\> [reflection.assemblyname]::GetAssemblyName("${pwd}\Microsoft.GLEE.dll") | fl

    Name                  : Microsoft.GLEE
    Version               : 1.0.0.0
    CultureInfo           :
    CodeBase              : file:///C:/projects/powershell/BuildAnalyzer/...
    EscapedCodeBase       : file:///C:/projects/powershell/BuildAnalyzer/...
    ProcessorArchitecture : MSIL
    Flags                 : PublicKey
    HashAlgorithm         : SHA1
    VersionCompatibility  : SameMachine
    KeyPair               :
    FullName              : Microsoft.GLEE, Version=1.0.0.0, Culture=neut...

Here, ProcessorArchitecture identifies the target platform.

I'm using PowerShell in this example to call the method.
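
The same check can be done from C#; a minimal sketch (the path below is just a placeholder) would be:

    using System;
    using System.Reflection;

    class Program
    {
        static void Main()
        {
            // Placeholder path - point this at the assembly you want to inspect.
            AssemblyName name = AssemblyName.GetAssemblyName(@"C:\temp\SomeAssembly.dll");

            // MSIL  -> Any CPU
            // X86   -> 32-bit only
            // Amd64 -> 64-bit only
            Console.WriteLine(name.ProcessorArchitecture);
        }
    }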


Solution:2

You can use the CorFlags CLI tool (for instance, C:\Program Files\Microsoft SDKs\Windows\v7.0\Bin\CorFlags.exe) to determine the status of an assembly. Based on its output, and by opening an assembly as a binary asset, you should be able to determine where you need to seek to check whether the 32BIT flag is set to 1 (x86) or 0 (Any CPU or x64, depending on PE):

    Option    | PE    | 32BIT
    ----------|-------|------
    x86       | PE32  | 1
    Any CPU   | PE32  | 0
    x64       | PE32+ | 0

The blog post x64 Development with .NET has some information about corflags.

Even better, you can use Module.GetPEKind to determine whether an assembly has the PortableExecutableKinds value PE32Plus (64-bit), Required32Bit (32-bit and WOW), or ILOnly (Any CPU), along with other attributes.
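
A minimal C# sketch of that approach (the path is a placeholder; note that Assembly.LoadFile actually loads the file into the calling process, which may not be desirable for arbitrary assemblies):

    using System;
    using System.Reflection;

    class PEKindExample
    {
        static void Main()
        {
            // Placeholder path to the assembly being inspected.
            Assembly assembly = Assembly.LoadFile(@"C:\temp\SomeAssembly.dll");

            assembly.ManifestModule.GetPEKind(
                out PortableExecutableKinds peKind,
                out ImageFileMachine machine);

            if ((peKind & PortableExecutableKinds.PE32Plus) != 0)
                Console.WriteLine("x64 (PE32+)");
            else if ((peKind & PortableExecutableKinds.Required32Bit) != 0)
                Console.WriteLine("x86 (32-bit required)");
            else if ((peKind & PortableExecutableKinds.ILOnly) != 0)
                Console.WriteLine("Any CPU (IL only)");

            Console.WriteLine($"Machine: {machine}");
        }
    }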


Solution:3

Just for clarification, CorFlags.exe is part of the .NET Framework SDK. I have the development tools on my machine, and the simplest way for me to determine whether a DLL is 32-bit only is to:

  1. Open the Visual Studio Command Prompt (In Windows: menu Start/Programs/Microsoft Visual Studio/Visual Studio Tools/Visual Studio 2008 Command Prompt)

  2. CD to the directory containing the DLL in question

  3. Run corflags like this: corflags MyAssembly.dll

You will get output something like this:

    Microsoft (R) .NET Framework CorFlags Conversion Tool.  Version  3.5.21022.8
    Copyright (c) Microsoft Corporation.  All rights reserved.

    Version   : v2.0.50727
    CLR Header: 2.5
    PE        : PE32
    CorFlags  : 3
    ILONLY    : 1
    32BIT     : 1
    Signed    : 0

As per the comments, the flags above are to be read as follows (a scripted version of this check is sketched after the list):

  • Any CPU: PE = PE32 and 32BIT = 0
  • x86: PE = PE32 and 32BIT = 1
  • 64-bit: PE = PE32+ and 32BIT = 0
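
If you would rather script that interpretation than read the output by hand, a rough C# sketch (assuming corflags.exe is on the PATH, e.g. in a Developer Command Prompt; the column spacing matches the sample output above and may differ between versions) could look like this:

    using System;
    using System.Diagnostics;

    class CorFlagsCheck
    {
        static void Main(string[] args)
        {
            // Assumes corflags.exe can be found on the PATH.
            var psi = new ProcessStartInfo("corflags", $"/nologo \"{args[0]}\"")
            {
                RedirectStandardOutput = true,
                UseShellExecute = false
            };

            using (Process proc = Process.Start(psi))
            {
                string output = proc.StandardOutput.ReadToEnd();
                proc.WaitForExit();

                bool pe32Plus = output.Contains("PE32+");
                // Older corflags versions print "32BIT", newer ones "32BITREQ".
                bool requires32Bit = output.Contains("32BIT     : 1") ||
                                     output.Contains("32BITREQ  : 1");

                if (pe32Plus) Console.WriteLine("x64");
                else if (requires32Bit) Console.WriteLine("x86");
                else Console.WriteLine("Any CPU");
            }
        }
    }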


Solution:4

How about you just write your own? The core of the PE architecture hasn't been seriously changed since its implementation in Windows 95. Here's a C# example:

    public static ushort GetPEArchitecture(string pFilePath)
    {
        ushort architecture = 0;
        try
        {
            using (System.IO.FileStream fStream = new System.IO.FileStream(pFilePath, System.IO.FileMode.Open, System.IO.FileAccess.Read))
            {
                using (System.IO.BinaryReader bReader = new System.IO.BinaryReader(fStream))
                {
                    if (bReader.ReadUInt16() == 23117) //check the MZ signature
                    {
                        fStream.Seek(0x3A, System.IO.SeekOrigin.Current); //seek to e_lfanew.
                        fStream.Seek(bReader.ReadUInt32(), System.IO.SeekOrigin.Begin); //seek to the start of the NT header.
                        if (bReader.ReadUInt32() == 17744) //check the PE\0\0 signature.
                        {
                            fStream.Seek(20, System.IO.SeekOrigin.Current); //seek past the file header,
                            architecture = bReader.ReadUInt16(); //read the magic number of the optional header.
                        }
                    }
                }
            }
        }
        catch (Exception) { /* TODO: Any exception handling you want to do, personally I just take 0 as a sign of failure */ }
        //if architecture returns 0, there has been an error.
        return architecture;
    }

Now the current constants are:

    0x10B - PE32  format.
    0x20B - PE32+ format.

But this method allows for the possibility of new constants; just validate the return value as you see fit.
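
For example, the return value can be mapped back to those constants roughly like this (the path is just a placeholder):

    ushort magic = GetPEArchitecture(@"C:\temp\SomeAssembly.dll"); // placeholder path

    switch (magic)
    {
        case 0x10B: Console.WriteLine("PE32 (32-bit image)"); break;
        case 0x20B: Console.WriteLine("PE32+ (64-bit image)"); break;
        case 0:     Console.WriteLine("Error reading the file"); break;
        default:    Console.WriteLine($"Unknown magic: 0x{magic:X}"); break;
    }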


Solution:5

Try to use CorFlagsReader from this project at CodePlex. It has no references to other assemblies and it can be used as is.


Solution:6

    [TestMethod]
    public void EnsureKWLLibrariesAreAll64Bit()
    {
        var assemblies = Assembly.GetExecutingAssembly().GetReferencedAssemblies().Where(x => x.FullName.StartsWith("YourCommonProjectName")).ToArray();
        foreach (var assembly in assemblies)
        {
            var myAssemblyName = AssemblyName.GetAssemblyName(assembly.FullName.Split(',')[0] + ".dll");
            Assert.AreEqual(ProcessorArchitecture.MSIL, myAssemblyName.ProcessorArchitecture);
        }
    }


Solution:7

dotPeek from JetBrains provides a quick and easy way to see the target platform: MSIL (Any CPU), x86, or x64.


Solution:8

Below is a batch file that will run corflags.exe against all dlls and exes in the current working directory and all sub-directories, parse the results and display the target architecture of each.

Depending on the version of corflags.exe that is used, the line items in the output will include either 32BIT or 32BITREQ (and 32BITPREF). Whichever of these two is included in the output is the critical line item that must be checked to differentiate between Any CPU and x86. If you are using an older version of corflags.exe (pre-Windows SDK v8.0A), then only the 32BIT line item will be present in the output, as others have indicated in earlier answers. Otherwise 32BITREQ and 32BITPREF replace it.

This assumes corflags.exe is in the %PATH%. The simplest way to ensure this is to use a Developer Command Prompt. Alternatively, you could copy it from its default location.

If the batch file below is run against an unmanaged dll or exe, it will incorrectly display it as x86, since the actual output from Corflags.exe will be an error message similar to:

corflags : error CF008 : The specified file does not have a valid managed header

    @echo off

    echo.
    echo Target architecture for all exes and dlls:
    echo.

    REM For each exe and dll in this directory and all subdirectories...
    for %%a in (.exe, .dll) do forfiles /s /m *%%a /c "cmd /c echo @relpath" > testfiles.txt

    for /f %%b in (testfiles.txt) do (
        REM Dump corflags results to a text file
        corflags /nologo %%b > corflagsdeets.txt

        REM Parse the corflags results to look for key markers
        findstr /C:"PE32+">nul .\corflagsdeets.txt && (
            REM `PE32+` indicates x64
            echo %%~b = x64
        ) || (
            REM pre-v8 Windows SDK listed only "32BIT" line item,
            REM newer versions list "32BITREQ" and "32BITPREF" line items
            findstr /C:"32BITREQ  : 0">nul /C:"32BIT     : 0" .\corflagsdeets.txt && (
                REM `PE32` and NOT 32bit required indicates Any CPU
                echo %%~b = Any CPU
            ) || (
                REM `PE32` and 32bit required indicates x86
                echo %%~b = x86
            )
        )

        del corflagsdeets.txt
    )

    del testfiles.txt
    echo.


Solution:9

Another way to check the target platform of a .NET assembly is inspecting the assembly with .NET Reflector...

@#~#€~! I've just realized that the new version is not free! So, correction: if you have a free version of .NET Reflector, you can use it to check the target platform.


Solution:10

cfeduke notes the possibility of calling GetPEKind. It's potentially interesting to do this from PowerShell.

Here, for example, is code for a cmdlet that could be used: https://stackoverflow.com/a/16181743/64257

Alternatively, at https://stackoverflow.com/a/4719567/64257 it is noted that "there's also the Get-PEHeader cmdlet in the PowerShell Community Extensions that can be used to test for executable images."


Solution:11

A more advanced application for this can be found here: CodePlex - ApiChange

Examples:

    C:\Downloads\ApiChange>ApiChange.exe -CorFlags c:\Windows\winhlp32.exe
    File Name; Type; Size; Processor; IL Only; Signed
    winhlp32.exe; Unmanaged; 296960; X86

    C:\Downloads\ApiChange>ApiChange.exe -CorFlags c:\Windows\HelpPane.exe
    File Name; Type; Size; Processor; IL Only; Signed
    HelpPane.exe; Unmanaged; 733696; Amd64


Solution:12

One more way would be to use dumpbin from the Visual Studio tools on the DLL and look for the appropriate output:

    dumpbin.exe /HEADERS <your dll path>

    FILE HEADER VALUES
                 14C machine (x86)
                   4 number of sections
            5885AC36 time date stamp Mon Jan 23 12:39:42 2017
                   0 file pointer to symbol table
                   0 number of symbols
                  E0 size of optional header
                2102 characteristics
                       Executable
                       32 bit word machine
                       DLL

Note: The above output is for a 32-bit DLL.

One more useful option with dumpbin.exe is /EXPORTS; it will show you the functions exposed by the DLL:

dumpbin.exe /EXPORTS <PATH OF THE DLL>  


Solution:13

A more generic way is to use the file structure to determine bitness and image type:

    public static CompilationMode GetCompilationMode(this FileInfo info)
    {
        if (!info.Exists) throw new ArgumentException($"{info.FullName} does not exist");

        var intPtr = IntPtr.Zero;
        try
        {
            uint unmanagedBufferSize = 4096;
            intPtr = Marshal.AllocHGlobal((int)unmanagedBufferSize);

            using (var stream = File.Open(info.FullName, FileMode.Open, FileAccess.Read))
            {
                var bytes = new byte[unmanagedBufferSize];
                stream.Read(bytes, 0, bytes.Length);
                Marshal.Copy(bytes, 0, intPtr, bytes.Length);
            }

            //Check DOS header magic number
            if (Marshal.ReadInt16(intPtr) != 0x5a4d) return CompilationMode.Invalid;

            // This will get the address for the WinNT header
            var ntHeaderAddressOffset = Marshal.ReadInt32(intPtr + 60);

            // Check WinNT header signature
            var signature = Marshal.ReadInt32(intPtr + ntHeaderAddressOffset);
            if (signature != 0x4550) return CompilationMode.Invalid;

            //Determine file bitness by reading magic from IMAGE_OPTIONAL_HEADER
            var magic = Marshal.ReadInt16(intPtr + ntHeaderAddressOffset + 24);

            var result = CompilationMode.Invalid;
            uint clrHeaderSize;
            if (magic == 0x10b)
            {
                clrHeaderSize = (uint)Marshal.ReadInt32(intPtr + ntHeaderAddressOffset + 24 + 208 + 4);
                result |= CompilationMode.Bit32;
            }
            else if (magic == 0x20b)
            {
                clrHeaderSize = (uint)Marshal.ReadInt32(intPtr + ntHeaderAddressOffset + 24 + 224 + 4);
                result |= CompilationMode.Bit64;
            }
            else return CompilationMode.Invalid;

            result |= clrHeaderSize != 0
                ? CompilationMode.CLR
                : CompilationMode.Native;

            return result;
        }
        finally
        {
            if (intPtr != IntPtr.Zero) Marshal.FreeHGlobal(intPtr);
        }
    }

Compilation mode enumeration

    [Flags]
    public enum CompilationMode
    {
        Invalid = 0,
        Native = 0x1,
        CLR = Native << 1,
        Bit32 = CLR << 1,
        Bit64 = Bit32 << 1
    }
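
A possible usage sketch, assuming the extension method and enum above are in scope (the path is a placeholder):

    var mode = new FileInfo(@"C:\temp\SomeAssembly.dll").GetCompilationMode();

    // Prints e.g. "CLR, Bit32" for a 32-bit managed image
    // or "Native, Bit64" for a 64-bit unmanaged DLL.
    Console.WriteLine(mode);

    bool isManaged32Bit = mode.HasFlag(CompilationMode.CLR) &&
                          mode.HasFlag(CompilationMode.Bit32);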

Source code with explanation at GitHub

