
MSBuild 4 Detailed Build Summary


Introduction

When we were developing the current version of MSBuild, we spent a lot of time analyzing builds to determine where our performance issues lay. The standard logging, even on diagnostic verbosity and with the performance summary enabled (/clp:PerformanceSummary=true on the MSBuild command line), didn't give us the kind of information we wanted. What we were looking for was a quick way to visualize which projects depended on which other projects, how much time they were taking, and how MSBuild was allocating the work to its worker nodes. Enter the Detailed Build Summary.

A Word About Nodes

Before looking at these graphs, it is important to understand how MSBuild performs its work. When a project is being built, all of its tasks are executed on a worker node. There is always at least one node, present in the MSBuild executable launched from the command line or inside the Visual Studio IDE; this is referred to as the in-proc node. There may be additional nodes created out-of-proc, which are called the out-of-proc or multi-proc nodes. When MSBuild has more projects to build than currently existing nodes, it will create new worker nodes up to the limit specified by the /m switch.

Each node may have any number of projects assigned to it, but only one project at a time will be executing tasks (though there is an exception with the Yield mechanism, which I won't get into here).

Gimme Some Numbers!

You can enable the Detailed Build Summary for any build by passing the /ds or /detailedsummary switch to MSBuild. This will cause MSBuild to compile and log this information at the end of the log file. For this blog I whipped up a little build tree, built it using four MSBuild worker nodes (/m:4), and redirected the output to a file:

 

F:\Bugs\MultiProc>msbuild /tv:4.0 cir1.proj /m:4 /ds /t:ParallelBuild > foo.txt

The end of foo.txt after the build…

         Detailed Build Summary 
         ====================== 
         ============================== Build Hierarchy (IDs represent configurations) ===================================================== 
         Id                  : Exclusive Time   Total Time   Path (Targets) 
         ----------------------------------------------------------------------------------------------------------------------------------- 
         0                   : 0.036s           0.295s       F:\Bugs\MultiProc\cir1.proj (ParallelBuild) 
         | 1                 : 0.047s           0.047s       F:\Bugs\MultiProc\cir2.proj () 
         | 3                 : 0.209s           0.209s       F:\Bugs\MultiProc\cir4.proj () 
         | 2                 : 0.210s           0.210s       F:\Bugs\MultiProc\cir3.proj () 
         | 4                 : 0.212s           0.217s       F:\Bugs\MultiProc\cir5.proj () 
         | | 10              : 0.005s           0.005s       F:\Bugs\MultiProc\cir8.proj () 
         | . 11              : 0.003s           0.003s       F:\Bugs\MultiProc\cir9.proj () 
         | 5                 : 0.001s           0.001s       F:\Bugs\MultiProc\cir6.proj () 
         . 6                 : 0.003s           0.015s       F:\Bugs\MultiProc\cir7.proj () 
         | . 7               : 0.004s           0.012s       F:\Bugs\MultiProc\cir5.proj () 
         | | | 8             : 0.007s           0.007s       F:\Bugs\MultiProc\cir8.proj () 
         | | . 9             : 0.001s           0.001s       F:\Bugs\MultiProc\cir9.proj () 
         ============================== Node Utilization (IDs represent configurations) ==================================================== 
         Timestamp:            1       2       3       4        Duration   Cumulative 
         ----------------------------------------------------------------------------------------------------------------------------------- 
         634032540333349050:   0       x       x       x        0.035s     0.035s 
         634032540333699050:   1       x       x       x        0.041s     0.076s 
         634032540334109050:   |       4       2       3        0.006s     0.082s 
         634032540334169050:   5       |       |       |        0.001s     0.083s 
         634032540334179050:   6       |       |       |        0.002s     0.085s 
         634032540334199050:   7       |       |       |        0.002s     0.087s 
         634032540334219050:   8       |       |       |        0.007s     0.094s 
         634032540334289050:   9       |       |       |        0.001s     0.095s 
         634032540334299050:   7       |       |       |        0.002s     0.097s 
         634032540334319050:   6       |       |       |        0.001s     0.098s 
         634032540334329050:   x       |       |       |        0.188s     0.286s ### 
         634032540336209050:   x       x       10      11       0.003s     0.289s 
         634032540336239050:   x       x       |       x        0.002s     0.291s 
         634032540336259050:   x       4       x       x        0.003s     0.294s 
         634032540336289050:   0       x       x       x        0.002s     0.296s 
         ----------------------------------------------------------------------------------------------------------------------------------- 
         Utilization:          33.8    96.8    97.7    96.8     Average Utilization: 81.3

Build succeeded. 
    0 Warning(s) 
    0 Error(s)

Time Elapsed 00:00:00.32

 

The data is split up into two sections – the Build Hierarchy and the Node Utilization. I’ll explain them in order.

Build Hierarchy

The hierarchy section shows all of the projects that were built. Each of the columns contains the following information:

  • ID – The request ID which was built. (This is a bug in the text output; it's not a configuration.) A request is any request to build a target on a project. This can come from the command line or through an MSBuild task. Note that if the same project file is invoked with the same target using the same global properties and tools version multiple times during the build, it may show up multiple times in the graph in different places. This is ok. However, sometimes it will be invoked multiple times and NOT show up. This is due to the way MSBuild works internally: it can shortcut some work, and we don't capture that in this graph.
  • Exclusive Time – The amount of time MSBuild spent actually executing the tasks and targets in that request. This does NOT include the time spent waiting for an MSBuild task to build the projects it depends on.
  • Total Time – This is the time spent executing tasks and targets plus the time spent waiting for other dependency requests to build.
  • Path – Displays the path to the project file which was invoked for that request.
  • (Targets) – In parentheses after the path is the list of targets which were specified. If this is empty, the default targets were executed instead. This list will never show the default or initial targets, only those explicitly specified on the command-line or in the Targets parameter of the MSBuild task.

To the left and below each ID number is the tree layout. Each pipe ‘|’ symbol means that the request to the right of it is a dependency (that is, the request above depends on the request to the right directly.) The more pipes, the deeper the dependency tree. A period ‘.’ symbol means that the ID to the left is the last dependency request for the parent (which is the ID above the symbol in that column) which was actually built. So in the above, 0 is the root request and it depends on 1, 3, 2, 4, 5 and 6 directly. Request 6 depends on 7, which depends on 8 and 9. 6 can be said to indirectly depend on 8 and 9 through request 7.

Node Utilization

The utilization section shows how MSBuild has allocated requests to build using the nodes available. The columns have the following meanings:

  • Timestamp – This is the wall-clock time for the current event. We generate a new event any time something has changed about how work is distributed. The time between events may vary significantly because of this.
  • (numbered columns) – This is the ‘current work’ display for each node. The following symbols may appear:
    • (number) – This represents a request and corresponds to the requests in the Build Hierarchy. It specifically means that the specified request has either started or resumed on the node.
    • x – The node is idle and doing no work.
    • | - The node is still working on the current request. The request number is the one at the top of the | symbols.
  • Duration – This is the amount of time the system spent in this state.
  • Cumulative – This is the total amount of time which has elapsed from the beginning of the build until this state ends.
  • (hash ‘#’ marks) – These display the duration in units of 0.05 seconds. Useful to quickly find events which take a long time to process (though less useful than a bar graph scaled relative to the entire build time.)

Using this information, we can see that request 0 is first assigned to node 1. It then immediately cedes control to request 1 (one of its dependencies). If we look at request 0, we can see it depends on many other requests. But they are not scheduled at this event because MSBuild creates nodes dynamically, so the other nodes weren’t available yet. The next event we see requests 2, 3 and 4 are all scheduled because those nodes have become available. During this period request 1 continues to execute on node 1. This proceeds for a while until we get down toward the bottom. We can see eventually we run out of work to schedule on node 1, leaving 2, 3 and 4 to keep executing. Request 4 depends on requests 10 and 11, and we can see the point at which node 2 suspends executing request 4 – this is where request 4 must wait on its dependencies. Once those have finished (building on nodes 3 and 4 in this case), request 4 resumes on node 2. By that point it is the last request which request 0 was waiting on. Once it is finished, request 0 resumes on node 1, and then finishes.

Analyzing the Data

At the end, we display some utilization numbers, which tell how well we loaded the worker nodes. In a perfectly parallelizable build, we would like to see 100% for all of those utilizations. In practice this rarely occurs because builds tend to have places where they are more serialized (if you have a common library which takes a long time to build, you will see this.) If you look at your graph and you see one node doing work while no others are, and the total duration of that period is long, then that is an indication you have serialization in your build and it may be worth looking at whether that request really should take that long – can it be split up into smaller chunks and have other requests refer to it piecemeal? Can the project itself be made to build faster using better tools? Is the request doing something unexpected?

Another thing you can experiment with when trying to tune your builds is changing the multi-proc node count limit. For instance, sometimes setting the /m number to one more or one less than the number of actual cores you have will enhance scheduling. If you are performing C++ builds, you may also play with the /MP setting on the compiler which enables it to parallelize the processing of C++ files directly.
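For example (the solution name here is just a placeholder), you can do a few clean runs at different node counts and compare the utilization summaries at the end of each log:

  msbuild BigSolution.sln /t:Rebuild /m:3 /ds > m3.log
  msbuild BigSolution.sln /t:Rebuild /m:4 /ds > m4.log
  msbuild BigSolution.sln /t:Rebuild /m:5 /ds > m5.log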

MSBuild currently uses some heuristics to determine when projects should be scheduled to build. This is especially important when there are more outstanding build requests to schedule than there are nodes to work on them. So even if you do manage to eliminate all serialization, the theoretical minimal build time might not be achieved because we lack the information to make the right decisions. This is an area we are currently working on, and we hope to bring you even more improved build times in the future.

Conclusion

Here in MSBuild we are very motivated to provide improved build analysis tools and intrinsically smarter build systems. The Detailed Build Summary diagnostic output in MSBuild 4 can provide some useful information about how MSBuild sees and builds your projects. Using it you can determine the actual project dependencies and relative build times of all of your projects. This information can then be used to better organize your projects for the purposes of more efficient builds. This functionality only scratches the surface of what is necessary for analyzing more complex builds, but rest assured we are hard at work bringing you those tools.

If you find this information useful, and especially if you decide to actually parse the output of this functionality, let us know. We certainly intend to improve the mechanism, but if it ends up being useful as-is we will want to try to keep it stable so that future releases don’t break your code.

Cliff Hudson
Visual Studio Platform
MSBuild Developer


Tuning C++ build parallelism in VS2010


A great way to get fast builds on a multiprocessor computer is to take advantage of as much parallelism in your build as possible. If you have C++ projects, there are two different kinds of parallelism you can configure.

What are the dials I can set?

Project-level parallel build, which is controlled by MSBuild, is set at the solution level in Visual Studio. (Visual Studio actually stores the value per-user per-computer, which may not always be what you want – you may want different values for different solutions, and the UI doesn't allow you to do that.) By default Visual Studio picks the number of processors on your machine. Do some experiments with slightly higher and lower numbers to see what gives the best speed for your particular code. Some people like to dial it down a little so that they can do other work while a build goes on.

[screenshot]

This dial is just the same as VS2008, although under the covers MSBuild is taking over some of the work from Visual Studio now.

If you're building C++ or C++/CLI, there's another place you can get build parallelism. The CL compiler supports the /MP switch, which tells it to build subsets of its inputs concurrently with separate instances of itself. The default number of buckets, again, is the number of CPUs, but you can also specify a number, like /MP5. Again, this was available before, so I'm going to just remind you where the value is and what it looks like in the MSBuild format project file.

Go to your project’s property pages, and to the C/C++, General page. For now I suggest that you select All Configurations and All Platforms. You can be more selective later if you want.

[screenshot]

As usual you can see what’s in the project file by unloading it, right clicking on the node in the Solution Explorer, and choosing Edit:

[screenshot]

Here's what it looks like in the project file. Yes, it's inside a configuration- and platform-specific block, but Visual Studio put the same value in all of them.

[screenshot]
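In case the screenshot doesn't come through, here is a rough sketch of that block (the configuration and platform names are just examples):

  <ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
    <ClCompile>
      <MultiProcessorCompilation>true</MultiProcessorCompilation>
    </ClCompile>
  </ItemDefinitionGroup>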

 

 

 

 

Notice that it’s in an “ItemDefinitionGroup”. That MSBuild tag simply indicates it defines a  “template” for items of a particular type. In this case, all items of type “ClCompile” will automatically have metadata MultiProcessorCompilation with value true unless they explicitly choose a different value.

By the way, MSBuild Items, in case you’re wondering, are just files, usually. Their subelements, if any, are the metadata. Here’s what some look like. Notice they’re in an “ItemGroup”:

[screenshot]
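A sketch of what such items might look like (the file names are made up):

  <ItemGroup>
    <ClCompile Include="main.cpp" />
    <ClCompile Include="widget.cpp" />
    <ClInclude Include="widget.h" />
  </ItemGroup>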

Because this is metadata, at an extreme, I could actually set this down to a per-file basis. In that case, MSBuild would bucket together all the inputs that have a common value. You would need to disable /MP for particular files that use #import, for example, because that's not supported with /MP. (Other features not supported with /MP are /Gm, which is incremental compilation, and a few other switches documented here)

Note it’s under the “ItemGroup” because these are actual items:

[screenshot]
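A sketch of opting a single file out of /MP, assuming a file named typelib_importer.cpp that uses #import:

  <ItemGroup>
    <ClCompile Include="typelib_importer.cpp">
      <MultiProcessorCompilation>false</MultiProcessorCompilation>
    </ClCompile>
  </ItemGroup>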

Back to multiprocessor CL. If you want to tell CL explicitly how many parallel compiles to do, Visual Studio lets you do this – as for /MP, it's exposed as a global setting:

[screenshot]

Under the covers, VS passes this on by setting a property (a global property – it's not persisted) named CL_MPCount. That means it won't have any effect when building outside of VS.

If you want to choose a value at a finer-grained level, you can't use the UI, as it's not exposed in the property pages or the command-line preview. You have to go into the project file editor and type it. It's a different piece of metadata on the ClCompile items, named "ProcessorNumber". It can be any number from 1 upwards, and its value is appended to /MP. If you don't have <MultiProcessorCompilation> it will be ignored.

[screenshot]

The squiggle here is a minor bug – ignore it.
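For reference, a sketch of what that metadata looks like alongside MultiProcessorCompilation (the count of 4 is just an example):

  <ItemDefinitionGroup>
    <ClCompile>
      <MultiProcessorCompilation>true</MultiProcessorCompilation>
      <ProcessorNumber>4</ProcessorNumber>
    </ClCompile>
  </ItemDefinitionGroup>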

What about building on the command line?

The /MP settings come from the project files, so they work exactly the same on the command line. That's part of the whole point of MSBuild, right, the same build on the command line as in Visual Studio? But the global parallelism setting that you set in Tools, Options does not affect the command line. You must pass it yourself to the msbuild.exe command with the /m switch. Again, the value is optional, and if you don't supply a value it uses the number of CPUs. However, unlike Visual Studio, out of the box, without /m supplied, it uses 1 CPU. That might change in the future.

[screenshot]

To choose the number used for any /MP value, you can set an environment variable, or pass a property, named CL_MPCount, just like Visual Studio does.
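For example (the solution name is a placeholder), either of these should have the same effect:

  set CL_MPCount=2
  msbuild MySolution.sln /m

  msbuild MySolution.sln /m /p:CL_MPCount=2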

 

 

 

Setting /MP on every project is tiresome, what are my options?

Probably you'll want to use /MP on more than one of your projects, and you don't want to edit each individually. The Visual Studio solution to this kind of problem is property sheets. They don't have any special connection to multiprocessor build, but it's an opportunity for me to give a quick refresher using this as an example. First open the "Property Manager" from the View menu. Its exact location will vary depending on the settings you're using; here's where it is if you have C++ settings:

[screenshot]

Right click on a project and choose “Add New Property Sheet”:

[screenshot]

I gave mine the name "MultiprocCpp.props". You'll see it gets added to all configurations of this project. Right click on it, and you'll see the same property pages that the project has, but this time you're editing the property sheet. Again, set "Multi-processor Compilation" to "Yes". Close the property pages, select the property sheet in the Property Manager, and hit Save.

Now I can open up that new MultiprocCpp.props file in the editor, and I see this:

[screenshot]

(Again, ignore the squiggle.)
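If the screenshot isn't visible: the property sheet is just a small standalone MSBuild file containing the same kind of ItemDefinitionGroup shown earlier, roughly like this sketch:

  <?xml version="1.0" encoding="utf-8"?>
  <Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
    <ItemDefinitionGroup>
      <ClCompile>
        <MultiProcessorCompilation>true</MultiProcessorCompilation>
      </ClCompile>
    </ItemDefinitionGroup>
  </Project>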

Looking in the project file, you can see the property sheet pulled in to each configuration, using an “Import” tag. Think of that just like a #include in C++:

[screenshot]
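A sketch of what that import looks like (C++ projects typically group property sheets under an ImportGroup per configuration, so the exact surroundings in your project may differ):

  <ImportGroup Label="PropertySheets" Condition="'$(Configuration)|$(Platform)'=='Debug|Win32'">
    <Import Project="MultiprocCpp.props" />
  </ImportGroup>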

So now we have the definition we put in the project file before, but in a reusable form. Given that, I can put it into all the projects I want in one shot, by multi-selecting in the Property Manager and choosing Add Existing Property Sheet:

[screenshot]

Now all your projects compile with /MP !

In some circumstances, you might want to go beyond what you can easily do in the Property Manager. For example you might want to bulk-remove a property sheet, or put a property sheet in each project once outside of all the configurations. Fortunately MSBuild 4.0 has a powerful and complete object model over its files that you can use to do this kind of work in a few lines of code. More on that in a future blog post, but for now, if you want to take a look, point the Object Browser at Microsoft.Build.dll.

Before I leave property sheets, it's worth mentioning that you can do this kind of common importing in your own ways, if you don't mind losing some of the UI support. For example, in the build of VS itself, we pull in a common set of properties at the top of every project, like this example from the project that builds msenv.dll (which contains much of the VS shell).

[screenshot]

Within that we define all kinds of global settings, and import yet others. I’ll talk about this kind of structure in a future blog post about the organization of large build trees.

Too much of a good thing

Usually the problem is getting enough parallelism to exploit all your machine’s cores. But the reverse problem is possible, and although it’s a nice problem to have, it needs fixing because it will cause your machine to thrash. Here’s what task manager might look like when this is happening:

[screenshot]

In this case, on a box with 8 CPUs, I enabled /MP on all my projects in the solution, and then built it with msbuild.exe /m (I didn't need to use the command line to have this problem; the same could happen in Visual Studio). If dependencies don't prevent it, MSBuild will kick off 8 projects at once, and in each of those CL will run 8 instances of itself at once, so we could have up to 64 copies of CL all fighting over my cores and my disk. Not a recipe for performance.

You can expect that one day the system will auto-tune itself here, but for now if you have this problem you will need to do some manual adjustment. Here are some ideas:

Dial down the values globally

Reduce /m:4 to /m:3, for example, or use a property sheet to change /MP to /MP2, say. Easy, but a blunt instrument: if there are points elsewhere in your build where there is a lot of project parallelism but not much CL parallelism, or vice versa, you probably just slowed them down.

Tune /MP for each project and configuration

A project that compiles at a relatively parallelized point in the build is not such a good candidate for /MP, for example. You might adjust by configuration as well. Retail configuration can be much slower to build because the compiler’s optimizing more: that might make it interesting to enable /MP for Retail and not Debug.

Get super custom

In your team, you might have a range of hardware. Perhaps your developers have 2-CPU machines, but your nightly build is on an 8-CPU beast. Yet they both need to build the same set of sources, and you don't want any box to be either slow or thrashing. In this case, you could use environment variables, and Conditions on the MSBuild tags. Almost all MSBuild tags can have Conditions.

Here’s an example below. When a property “MultiprocCLCount” (which I just invented) has a value, and it’s greater than 0, /MP is enabled with that value.

[screenshot]
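A sketch of the idea (simplified to check only that MultiprocCLCount is non-empty; the condition described above also checks that it is greater than 0):

  <ItemDefinitionGroup Condition="'$(MultiprocCLCount)' != ''">
    <ClCompile>
      <MultiProcessorCompilation>true</MultiProcessorCompilation>
      <ProcessorNumber>$(MultiprocCLCount)</ProcessorNumber>
    </ClCompile>
  </ItemDefinitionGroup>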

MSBuild pulls in all environment variables as its initial properties when it starts up. So on my fast machine, I set an environment variable MultiprocCLCount=8, and on my developer boxes, I set MultiprocCLCount=2.

The build machine’s script could also parameterize the /m switch going to MSBuild.exe, like /m:%MultiprocMSBuildCount%

Two other properties that might be useful in exotic conditions: $(Number_Of_Processors) is the number of logical cores on the box – this just comes from the environment variable. $(MSBuildNodeCount) is the value that was passed to /m on msbuild.exe, or within VS, the value from Tools > Options for project parallelism.

That’s it. I hope while walking you through /m and /MP I’ve also given you an overview of some MSBuild features and how much flexibility they give you to configure your build process.

Optimizing your build speed is a huge topic so look for more blogging on this subject from me.

Dan Moseley
Developer Lead - MSBuild

Displaying Target Output Items Using The Console Logger


In previous versions of MSBuild, users could see the initial item and property values of projects in the build; this was useful for diagnosing what values certain properties and items were set to. A requested addition was the ability to view the values of the output items on a target. This is useful because it allows the build author to view the state of the items being passed to the rest of the build from the target which has just completed. One of the features added in MSBuild 4.0 was the ability to retrieve the output items of a target when the target completes and display this information in a logger.

 

This post is intended for two audiences. First, if you simply wish to enable the display of target output items in the default MSBuild console logger, the only action required is to set the environment variable MSBUILDTARGETOUTPUTLOGGING to true.

 

Let us consider the following simple project which has a target that sets some outputs.

 

Note: In both cases we are using detailed logging

 

msbuild OutputLogging.proj /v:d 

 

OutputLogging.proj

 

<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" DefaultTargets="Build" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="Build" Outputs="@(TargetOutputs);$(AnotherOutput)">
    <ItemGroup>
      <TargetOutputs Include="TargetOutPut1"/>
      <TargetOutputs Include="TargetOutPut2"/>
      <TargetOutputs Include="TargetOutPut3">
        <Metadata1>Metadata1</Metadata1>
        <Metadata2>Metadata2</Metadata2>
        <Metadata3>Metadata3</Metadata3>
      </TargetOutputs>
    </ItemGroup>
    <PropertyGroup>
      <AnotherOutput>AnotherOutput</AnotherOutput>
    </PropertyGroup>
  </Target>
</Project>

 

Here is what the log looks like when target output logging is not enabled.

 

Project "C:\OutputLogging\OutputLogging.proj" on node 1 (default targets).
Building with tools version "4.0".
Target "Build" in project "C:\OutputLogging\OutputLogging.proj" (entry point):
Done building target "Build" in project "OutputLogging.proj".
Done Building Project "C:\OutputLogging\OutputLogging.proj" (default targets).

 

Here is the output with the target output logging enabled using the msbuild console logger.

 

Project "C:\OutputLogging\OutputLogging.proj" on node 1 (default targets).
Building with tools version "4.0".
Target "Build" in project "C:\OutputLogging\OutputLogging.proj" (entry point):
Target output items:
    TargetOutPut1
    TargetOutPut2
    TargetOutPut3
        Metadata1 = Metadata1
        Metadata2 = Metadata2
        Metadata3 = Metadata3
    AnotherOutput
Done building target "Build" in project "OutputLogging.proj".
Done Building Project "C:\OutputLogging\OutputLogging.proj" (default targets).

 

As you can see, the output items now appear when the target-finished event is logged.

 

The second intended audience is authors of custom loggers. To get access to the target outputs in one's custom logger, a new property called TargetOutputs has been added to TargetFinishedEventArgs. This property is normally not set to anything; we chose not to enable the logging of the TargetOutputs by default because it causes a performance hit due to additional object serialization during a multi-process build. When the environment variable MSBUILDTARGETOUTPUTLOGGING is set to true, this field will be populated with the set of ITaskItems which are the output items for the target.

 

Custom loggers can get access to this event and use it as they see fit.

 

Though not the exact code in the logger, the algorithm to use the target outputs is as follows:

 

In the event handler which is registered to receive the TargetFinishedEventArgs in the custom logger:

 

System.Collections.IEnumerable targetOutputs = targetFinishedEvent.TargetOutputs;

// TargetOutputs can be null (e.g. for all but the last batch of a batching target – see below).
if (targetOutputs != null)
{
    foreach (ITaskItem item in targetOutputs)
    {
        Console.Out.WriteLine(item.ItemSpec);
    }
}

 

One thing to note about this feature: when a target batches, only the last target-finished event for a given target will have the target outputs. For example, if you have a target that batches three times, then you will get two target-finished events where TargetOutputs is null, and the last one will have the set of output items for ALL batches. The reason we only have the outputs on the last batch is the way the target outputs are gathered by the MSBuild engine: they are only gathered after all of the batches are completed. For this reason, during the batch we do not have access to the final target outputs.

 

MSBuild Property Functions


Have you ever wanted to do something simple in a build, like get a substring of a property value, and found that MSBuild didn't have syntax for it? You then had to write a task for it, which was tiresome to do for such a simple operation. What's more, if you wanted to do this during evaluation – outside of a target – you couldn't run a task there anyway.

In MSBuild 4.0 we addressed this by adding "property functions" which allow you to execute a variety of regular .NET API calls during evaluation or execution.

Here's an example. For the default VB or C# project, both the intermediate and final output directories are by default below the project's directory. Instead, I'm going to move the final outputs to c:\outputs\<some guid>\ followed by the usual path. You can see below how I did this. I removed the <OutputPath> property and replaced it with an expression that generated a guid for this project.

[screenshot]
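If the screenshot doesn't reproduce, the replaced property was along these lines (a sketch of the idea; the trailing bin\Debug\ is just the usual relative output path):

  <OutputPath>c:\outputs\$([System.Guid]::NewGuid())\bin\Debug\</OutputPath>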

Now I reopen the project and hit build to show it worked:

[screenshot]

Syntax

There are two syntaxes, as follows. They're intended to be fairly close to the existing Powershell syntax for calling .NET types. The first is for calling static members:

$([Namespace.Type]::Method(..parameters…))

$([Namespace.Type]::Property)

$([Namespace.Type]::set_Property(value))

The second is for instance members on the String class. You write it as if the property itself is a string.

$(property.Method(..parameters...))

$(property.Property)

$(property.set_Property(value))

Notice that when setting a property, you must use CLR syntax for properties ("set_XXX(value)").

The neat part is that these can all be nested – be sure to match your parentheses correctly of course. We attempt to coerce parameters as far as possible in order to find a method or overload that will work.

If you want to pass strings, quote with back-ticks.

When you pass the result of one expression to another, the types are maintained along the chain. This helps the binder find the member you are trying to call. Only when the final result of the expression needs to go into the build do we coerce it to a string.

Some examples may help:

Examples

[screenshot]
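The screenshot of examples doesn't reproduce here, so here are a few illustrative expressions in the same spirit (the property names BuildDate, ObjRoot, ShortId, UpperConfig, BasePath and LongId are made up; the types used come from the allowed list under Limitations below):

  <BuildDate>$([System.DateTime]::Now.ToString(`yyyy.MM.dd`))</BuildDate>
  <ObjRoot>$([System.IO.Path]::Combine($(BasePath), `obj`))</ObjRoot>
  <ShortId>$(LongId.Substring(0, 8))</ShortId>
  <UpperConfig>$(Configuration.ToUpper())</UpperConfig>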

Limitations

* You can't run instance methods on raw strings. For example $("c:\foo".get_Length()). They must go into a property first.

* Out parameters won't work – there are no intermediate values except for the return value. No delegates or generics either.

* If we coerce to the wrong overload, you may be able to use a Convert method to force the correct one.

* By default, you can only call certain members on certain types – selected to be free of side-effects. Here's the full list:

(1) All members on the following types:

System.Byte 
System.Char 
System.Convert 
System.DateTime 
System.Decimal 
System.Double 
System.Enum 
System.Guid 
System.Int16 
System.Int32 
System.Int64 
System.IO.Path 
System.Math 
System.UInt16 
System.UInt32 
System.UInt64 
System.SByte 
System.Single 
System.String 
System.StringComparer 
System.TimeSpan 
System.Text.RegularExpressions.Regex 
System.Version 
MSBuild  (see below) 
Microsoft.Build.Utilities.ToolLocationHelper

(2) Selected members on certain other types:

System.Environment::CommandLine 
System.Environment::ExpandEnvironmentVariables 
System.Environment::GetEnvironmentVariable 
System.Environment::GetEnvironmentVariables 
System.Environment::GetFolderPath 
System.Environment::GetLogicalDrives 
System.IO.Directory::GetDirectories 
System.IO.Directory::GetFiles 
System.IO.Directory::GetLastAccessTime 
System.IO.Directory::GetLastWriteTime 
System.IO.Directory::GetParent 
System.IO.File::Exists 
System.IO.File::GetCreationTime 
System.IO.File::GetAttributes 
System.IO.File::GetLastAccessTime 
System.IO.File::GetLastWriteTime 
System.IO.File::ReadAllText

But I want to use other types and custom types ..

The reason we prevent this is to make it safer to load Visual Studio projects. Otherwise, someone could give you a project that formatted your hard disk during evaluation. Visual Studio load-time safety is actually more complicated than that – some targets will run and do arbitrary things – but we didn't want to make new opportunities for badness. We could have made this limitation apply only to Visual Studio, but then it would be possible to have your build work differently on the command line. I'd like to hear your feedback on this – is the list too constraining?

You can decide whether we made the correct call here. Meanwhile there is an unsupported way to call members on arbitrary types: set the environment variable MSBUILDENABLEALLPROPERTYFUNCTIONS=1. You can now use any type in any assembly. Of course, MSBuild has to know what assembly it is in (it knows them for the list above), and the CLR binder still has to be able to find it to load it.

To figure out the assembly, it tries to work up the name. So for this example (assuming the environment variable is set)

$([Microsoft.VisualBasic.FileIO.FileSystem]::CurrentDirectory)

it will look for Microsoft.VisualBasic.FileIO.dll, then Microsoft.VisualBasic.dll (which it will find and load from the GAC) and you will get the value of the current directory.

If that's not going to work for your assembly, it is possible to pass in a strong name. For example, the above could equivalently be written like this:

$([Microsoft.VisualBasic.FileIO.FileSystem, Microsoft.VisualBasic, Version=10.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a]::CurrentDirectory)

This means you can write your own functions for MSBuild to call – just put the assembly somewhere that the CLR can find it. By doing that, you can (if you set the environment variable) cause your build to do absolutely anything during property evaluation.

Here's a screenshot of these examples:

[screenshot]

Have fun, and let me know what you think. I'd love to get suggestions on how we can improve this.

 

[Update] I'll post about this separately later, but here's one other property function that will be useful to some people:

$([MSBuild]::GetDirectoryNameOfFileAbove(directory, filename))

Looks in the designated directory, then progressively in the parent directories, until it finds the file provided or hits the root. It then returns the path of the directory where it found the file (or an empty string if it never does). What would you need such an odd function for? It's very useful if you have a tree of projects in source control, and want them all to share a single imported file. You can check it in at the root, but how do they find it to import it? They could all specify the relative path, but that's cumbersome as it's different depending on where they are. Or, you could set an environment variable pointing to the root, but you might not want to use environment variables. That's where this function comes in handy – you can write something like this, and all projects will be able to find and import it:

  <Import Project="$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), EnlistmentInfo.props))\EnlistmentInfo.props" Condition=" '$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), EnlistmentInfo.props))' != '' " />

Dan Moseley
Developer Lead - MSBuild

A brief MSBuild Blog Note


Following extensive feedback on the decision to move the MSBuild blog to The Visual Studio Blog, a decision has been made to cross-post all MSBuild blog posts to this blog.

All current MSBuild posts to The Visual Studio Blog have now been posted here. Apologies if you have received RSS notifications on content you may already have known about.

Mike Fourie
Microsoft Visual Studio ALM MVP

MSBuild Property Functions (2)


Some more information about this 4.0 feature. (I’ve also updated the first post with this, so everything’s in one place for your reference.)

Built-in MSBuild functions

The full list of built-in [MSBuild] functions, like the one above, is in the MSDN topic. They include arithmetic functions (useful, for example, for modifying version numbers) and functions to convert to and from the MSBuild escaping format (on rare occasions, that is useful). Here's another example:

$([MSBuild]::Add($(VersionNumber), 1))

And here’s one other property function that will be useful to some people:

$([MSBuild]::GetDirectoryNameOfFileAbove(directory, filename))

Looks in the designated directory, then progressively in the parent directories, until it finds the file provided or hits the root. It then returns the path of the directory where it found the file (or an empty string if it never does). What would you need such an odd function for? It's very useful if you have a tree of projects in source control, and want them all to share a single imported file. You can check it in at the root, but how do they find it to import it? They could all specify the relative path, but that's cumbersome as it's different depending on where they are. Or, you could set an environment variable pointing to the root, but you might not want to use environment variables. That's where this function comes in handy – you can write something like this, and all projects will be able to find and import it:

  <Import Project="$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), EnlistmentInfo.props))\EnlistmentInfo.props" Condition=" '$([MSBuild]::GetDirectoryNameOfFileAbove($(MSBuildThisFileDirectory), EnlistmentInfo.props))' != '' " />

Error handling

The functions parser is pretty robust but not necessarily that helpful when it doesn't work. Errors you can get include:

(1) It doesn’t evaluate but just comes out as a string. Your syntax isn’t recognized as an attempt at a function, most likely you’ve missed a closing parenthesis somewhere. That’s easy to do when there’s lots of nesting.

(2) error MSB4184: The expression “…” cannot be evaluated. It treated it as a function, but probably it couldn’t parse it.

(3) error MSB4184: The expression “…” cannot be evaluated. Method ‘…’ not found. It could parse it, but not find a member it could coerce to, or it was considered ambiguous by the binder. Verify you weren’t calling a static member using instance member syntax. Try to make the call less ambiguous between overloads, either by picking another overload (that perhaps has a unique number of parameters) or using the Convert class to force one of the parameters explicitly to the type the method wants. One common case where this happens is where one overload takes an integer, and the other an enumeration.

(4) error MSB4184: The expression "[System.Text.RegularExpressions.Regex]::Replace(d:\bar\libs;;c:\Foo\libs;, \lib\x86, ``)" cannot be evaluated. parsing "\lib\x86" – Unrecognized escape sequence \l.  Here's an example where it bound the method, but the method threw an exception ("unrecognized escape sequence") because the parameter values weren't valid.

(5) error MSB4186: Invalid static method invocation syntax: “….”. Method ‘System.Text.RegularExpressions.Regex.Replace’ not found. Static method invocation should be of the form: $([FullTypeName]::Method()), e.g. $([System.IO.Path]::Combine(`a`, `b`)).. Hopefully this is self explanatory, but more often than a syntax mistake, you called an instance member using static member syntax.

Arrays

Arrays are tricky as the C# style syntax “new Foo[]” does not work, and Array.CreateInstance needs a Type object. To get an array, you either need a method or property that returns one, or you use a special case where we can force a string into an array. Here’s an example of the latter case:

$(LibraryPath.Split(`;`))

In this case, the string.Split overload wants a string array, and we’re converting the string into an array with one element.

Regex Example

Here I’m replacing a string in the property “LibraryPath”, case insensitively.

<LibraryPath>$([System.Text.RegularExpressions.Regex]::Replace($(LibraryPath), `$(DXSDK_DIR)\\lib\\x86`, ``, System.Text.RegularExpressions.RegexOptions.IgnoreCase))</LibraryPath>

Here’s how to do the same with string manipulation, less pretty.

<LibraryPath>$(LibraryPath.Remove($(LibraryPath.IndexOf(`$(DXSDK_DIR)\lib\x86`, 0, $(IncludePath.Length), System.StringComparison.OrdinalIgnoreCase)), $([MSBuild]::Add($(DXSDK_DIR.Length), 8))))</LibraryPath>

Future Thoughts

So far in my own work I’ve found this feature really useful, and far, far, better than creating a task. It can make some simple tasks that were impossible possible, and often, easy. But as you can see from the examples above, it often has rough edges and sometimes it can be horrible to read and write. Here’s some ways we can make it better in future:

  1. A “language service” would make writing these expressions much easier to get right. What that means is a better XML editing experience inside Visual Studio for MSBuild format files, that understands this syntax, gives you intellisense, and squiggles errors. (Especially missed closing parentheses!)
  2. A smarter binder. Right now we’re using the regular CLR binder, with some customizations. Powershell has a much more heavily customized binder, and I believe there is now one for the DLR. If we switch to that, it would be much easier to get the method you want, with appropriate type conversion done for you.
  3. Some more methods in the [MSBuild] namespace for common tasks. For example, a method like $([MSBuild]::ReplaceInsensitive(`$(DXSDK_DIR)\\lib\\x86`, ``)) would be easier than the long regular expression example above.
  4. Enable more types and members in the .NET Framework that are safe, and useful.
  5. Make it possible to expose your own functions, that you can use with this syntax, but write in inline code like MSBuild 4.0 allows you to do for tasks. You’d write once, and use many.
  6. Offer some similar powers for items and metadata.

What do you think?

Dan Moseley
Developer Lead – MSBuild 

Building on Cross targeting scenarios and 64-bit MSBuild


During the Visual Studio 2010 development cycle there was a push to make the build experience better in cross-compilation scenarios, as well as to make sure a build using 32-bit MSBuild was identical (in outputs) to a build using 64-bit MSBuild.

In most cases, 64-bit and 32-bit MSBuild will indeed produce the same output. However there are some cases, generally cross compilation scenarios, where this is not the case.

Note that since Visual Studio is a 32-bit application, if you build from Visual Studio, it is equivalent to running the 32-bit MSBuild.

 

ResolveAssemblyReference: Reference resolution ignores Processor Architecture except when resolving from the Global Assembly Cache

Description:

If you have two assemblies whose identities differ only by the processor architecture, i.e.

myTypes, Version=1.0.1234.0, Culture=en-US, PublicKeyToken=b77a5c561934e089c, ProcessorArchitecture=msil 
myTypes, Version=1.0.1234.0, Culture=en-US, PublicKeyToken=b77a5c561934e089c, ProcessorArchitecture=x86

And you try to reference one of them specifically:

<Reference Include="myTypes, Version=1.0.1234.0, Culture=en-US, PublicKeyToken=b77a5c561934e089c, ProcessorArchitecture=x86"/>

You will notice that the first reference found will be picked up.

It will also cause the CopyLocal property to be set to false.

Affected scenarios:

Building with MSBuild 32-bit or 64-bit.

Workaround:

Add your affected references to the Global Assembly Cache. See KB315682 on how to do that.

 

64-bit MSBuild is not able to find VCBuild.exe while building a VC++ 3.5 solution

Description:

You keep facing the following error:

Build FAILED.

"mysolution.sln" (Rebuild target) (1) -> (mcpplib1:Rebuild target) ->
  MSBUILD : error MSB3411: Could not load the Visual C++ component "VCBuild.exe". If the component is not installed, either 1) install the Microsoft Windows SDK for Windows Server 2008 and .NET Framework 3.5, or 2) install Microsoft Visual Studio 2008.

    0 Warning(s)
    1 Error(s)

Affected scenarios:

Building with MSBuild 64-bit only.

 

Workaround:

In order to properly build solutions containing 3.5 and earlier VC++ project files (*.vcproj) with MSBuild, the PATH environment variable should contain the path to VCBuild.exe, which happens to be a 32-bit-only executable. To build with 64-bit MSBuild, you should point to the location under the "Program Files (x86)" path.

 

LC.exe causes build failures while building AMD64 configurations inside Visual Studio

Description:

When building the AMD64 configuration of a solution, LC.exe is being picked up from %Program Files (x86)%\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\LC.exe instead of from %Program Files (x86)%\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\x64\LC.exe. Or, if you are using Visual Studio 2008, from the directory %Program Files%\Microsoft SDKs\Windows\v6.0A\bin.

This will make you face an error like this:

LC : error LC0000: ‘Could not load file or assembly ‘MyAssembly, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null’ or one of its dependencies. An attempt was made to load a program with an incorrect format.’

This issue is caused by the fact that LC.exe is not able to satisfy the Cross-Compilation scenarios because of some specific requirements on how it needs to load the referenced dynamic libraries.

NOTE: your build will succeed if you use 64-bit MSBuild on the command line.

Affected scenarios:

Cross compilation scenarios. Building x64 platforms with 32-bit MSBuild or the x86 platform with 64-bit MSBuild.

Workaround:

Your build will succeed if you use 64-bit MSBuild on the command line; however, if you still want to build inside the Visual Studio IDE, you can add the following to your project file (by manually editing it):

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|x64' ">
  <LCToolPath>C:\Program Files\Microsoft SDKs\Windows\v6.0A\bin\x64</LCToolPath>
</PropertyGroup>

RegisterAssembly task fails on Cross targeting scenarios

Description:

If you have a project in your solution targeted to build an x64 platform and you:

  • set it to be registered for COM Interop (on the Project Build properties)
  • or set the RegisterForComInterop property in the project file to true

You will face the following issue while building it:

Error 1 File “MyDll.dll” is not a valid assembly. C:\Windows\Microsoft.NET\Framework\v4.0.20904\Microsoft.Common.targets 3257 9 ClassLibrary3

MSBuild cannot register a library for COM Interop if its architecture does not match the architecture of the MSBuild.exe (or DevEnv.exe) hosting the build process.

So on a 64-bit OS, the following scenarios will not work when the RegisterAssembly task is invoked as part of the build:

  • In the IDE the user changes the platform of the project to x64 and builds
  • In the command line, 32-bit MSBuild will fail to build a project targeting a x64 platform
  • In the command line, 64-bit MSBuild will fail to build a project targeting the x86 platform
Affected scenarios:

Cross compilation scenarios. Building x64 platforms with 32-bit MSBuild or the x86 platform with 64-bit MSBuild.

Workaround:
  1. You need to match the architecture of MSBuild.exe with the platform you are attempting to build, or
  2. Instead of setting the "RegisterForComInterop" property to true, add a custom step to your build that runs RegAsm.exe to register your COM library. It must run the version of RegAsm.exe that matches the architecture of your library. For details on how to add a custom build step, see here. Or follow these steps:
        1. In the project properties, select “Build Events…” from the compile page.
        2. Add the following post-build command line: "%Windir%\Microsoft.NET\Framework[64]\v4.0.xxxxx\regasm" "$(TargetPath)"
          • Be careful to select the Framework directory that matches the architecture you are targeting

COM references are not resolved on cross targeting scenarios

Description:

If you have a COM object registered by using regsvr32.exe, consider that there are 32-bit and 64-bit versions of regsvr32.exe. If you used the 32-bit regsvr32.exe to register your COM object and you are attempting to build a project targeting the x86 platform using 64-bit MSBuild, the build will fail. This issue is caused by the fact that the library was registered with a pure 32-bit regsvr32.exe and thus the component is only registered under the WOW registry section, which is invisible to 64-bit processes that do not attempt an explicit lookup on the WOW nodes.

The opposite is also true, using the 64-bit regsvr32.exe to register the library and attempting to build a project targeting a x64 platform with 32-bit MSBuild. This process has no way to access the 64-bit part of the registry.

One manifestation of this issue would be if your build is failing with an AxImp error when building a project that consumes a registered PIA of an ActiveX control:

Build FAILED. 
“ActiveXWithPiaConsumer.csproj” (default target) (1) –> (ResolveComReferences target) ->  
  C:\Windows\Microsoft.NET\Framework64\v4.0.21112\Microsoft.Common.targets(1543,9): warning MSB3283: Cannot find wrapper assembly for type library “AxActiveXControlLib”. [ActiveXWithPiaConsumer.csproj] 
“ActiveXWithPiaConsumer.csproj” (default target) (1) –> (ResolveComReferences target) ->  
  AXIMP : AxImp error : Did not find a registered ActiveX control in ‘ActiveXWithPia\ActiveXControl.dll’. [ActiveXWithPiaConsumer.csproj]

 

Affected scenarios:

Mismatches between the architecture of the regsvr32.exe used to register the library and the architecture of MSBuild used to build:

  1. 32-bit regsvr32.exe and 64-bit MSBuild.exe
  2. 64-bit regsvr32.exe and 32-bit MSBuild.exe

NOTE: if you are using a 32-bit-only COM object while trying to build an x64 platform, the Interop assembly cannot be generated; the same applies if you are using a 64-bit-only COM object and you are trying to build the x86 platform.

Workaround:

Build using a matching MSBuild architecture with the architecture of regsvr32 and the platform to build:

  1. You want to build the x86 platform, use 32-bit MSBuild + 32-bit regsvr32.exe.
  2. You want to build a x64 platform, use 64-bit MSBuild + 64-bit regsvr32.exe.

Cannot build Silverlight project targeting a x64 platform or using 64-bit MSBuild

Description:

You have a Silverlight project and you change the platform to x64. You might face one of the following errors:

The “ValidateXaml” task failed unexpectedly. 
System.BadImageFormatException: Could not load file or assembly ‘obj\x64\Debug\SilverlightApplication1.dll’ or one of its dependencies. An attempt was made to load a program with an incorrect format.

or if you are building using 64-bit MSBuild:

“SilverlightApplication1.csproj” (GetXapOutputFile target) (2:2) -> 
  C:\Program Files (x86)\MSBuild\Microsoft\Silverlight\v3.0\Microsoft.Silverlight.Common.targets(101,9): error : The Silverlight 3 SDK is not installed. [SilverlightApplication1.csproj]

Affected scenarios:
  1. Attempts to build a Silverlight project targeting a x64 platform with either 32-bit or 64-bit MSBuild.
  2. Attempts to build a Silverlight project with 64-bit MSBuild.
Workaround:

No workaround. Silverlight DOES NOT support x64 platforms. And Silverlight projects cannot be built by 64-bit MSBuild. You must use the 32-bit MSBuild and target x86 or AnyCPU platforms to build your Silverlight projects.

If you are using Team Build select x86 for the MSBuild platform setting.

Interop assemblies are not generated correctly if the project targets the default platform

Description:

If you have a class library project (for example) and you haven’t changed the platform of the project, it will be targeting the AnyCPU platform. However if you add a reference to a COM object you will find out that the generated Interop assembly is specifically targeting the x86 platform.

This is because Interop assemblies always have an explicit target platform, and in the absence of an explicit platform from the project consuming the Interop assembly, this target platform defaults to the value of an environment variable named "PROCESSOR_ARCHITECTURE", which inside the Visual Studio IDE evaluates to the x86 platform.

The effect of this is that if your application (targeting the AnyCPU platform) is run on a 64-bit Operating System, it will run as a 64-bit process and will fail to load the Interop assembly.

Note that applications built as AnyCPU will always run as 64-bit under a 64-bit Operating System, no matter if you launch them from a 64-bit or 32-bit command window.

Affected scenarios:

Projects targeting the default platform and consuming Interop assemblies. This will happen either with 32-bit and 64-bit MSBuild.

Workaround:

Explicitly set the platform on your project, or manually add it to the project by defining the PlatformTarget property in your configuration block in the project file:

  <PlatformTarget>AnyCPU</PlatformTarget>

 

An error occurs when compiling a .resx file with MSBuild

Description:

On a 64-bit OS you have a project targeting the x86 platform and .NET Framework 3.5 or below. Your project has a reference to a 32-bit-only assembly, and when you build you get the following error:

ResourceFrm.resx(1436,5): error RG0000: Could not load file or assembly ’32bitOnlyAssembly.dll’ or one of its dependencies. An attempt was made to load a program with an incorrect format. Line 1436, position 5.

This issue is caused by the fact that in the .NET 3.5 tools, resgen.exe in both the x86 and x64 bin directories is marked as IL (architecture agnostic), causing it to run on a 64-bit Operating System as a 64-bit executable no matter what. As a 64-bit process, resgen.exe is unable to load the 32-bit-only library.

Also, if you are targeting .NET 4.0, MSBuild will fail in the same way if you are referencing a 32-bit-only assembly while using 64-bit MSBuild, and vice versa.

Affected scenarios:

Cross targeting scenarios while building projects which contain resource files with MSBuild:

  1. On a 64-bit Operating System, if you are targeting .NET 3.5 and the project references a 32-bit assembly, with either 32-bit or 64-bit MSBuild.
  2. The project references a 32-bit assembly and you are using 64-bit MSBuild.
  3. The project references a 64-bit assembly and you are using 32-bit MSBuild.
Workaround:

Make the library referred to in the error target the AnyCPU platform.

Cannot build SQL Server project using 64-bit MSBuild

Description:

You have a SQL Server project and you attempt to build it using 64-bit MSBuild; the following error is displayed:

  SqlServerProject1.vbproj(149,3): error MSB4019: The imported project “C:\Windows\Microsoft.NET\Framework64\v4.0.xxxxx\SqlServer.targets” was not found. Confirm that the path in the <Import> declaration is correct, and that the file exists on disk.

Workaround:

Copy C:\Windows\Microsoft.NET\Framework64\v4.0.xxxxx\SqlServer.targets to C:\Windows\Microsoft.NET\Framework\v4.0.xxxxx\SqlServer.targets

Daniel Estrada

Software Development Engineer in Test, MSBuild Team

Assembly Resolution in MSBuild and Visual Studio Series Introduction


Assembly references are an integral part of the build process. When the assembly references passed to the compiler are correct, everything works, but when they are not, projects stop building. When this happens it can be frustrating to try and figure out why a reference was resolved from one location rather than another, thereby causing the problem. In this series we will be detailing the steps taken to take a reference from the project file and turn it into the path on disk that is passed to the compilers.

This series will be focusing on the MSBuild task ResolveAssemblyReference. This task does the work of taking references declared in project files and turning them into paths on disk.

The reason we discuss this task is because this same task is used by both MSBuild on the command line and Visual Studio to find the references. Internally Visual Studio uses MSBuild as its build engine, so even though this series focuses on the behavior in MSBuild, it behaves exactly the same way in Visual Studio.

Outline of the assembly resolution series.

Part 1

In part one we will discuss how references are represented inside of the project file. This will give a basic understanding when looking at a project file of the different forms a reference can take (e.g. file path, simple name, or fusion name) and the additional attributes that can be set on those references.

Part 2

In part two we will discuss some of the basic inputs to the ResolveAssemblyReference task. Why these inputs are important and how they affect how references are found is outlined. This post also goes into detail about how a reference is resolved and the different kinds of algorithms involved in turning what is represented in the project file into a path on disk.

Part 3

In part three we discuss the AssemblyFoldersEx registry location. This is just one of the places where references can be resolved from, however it is one of the most complicated locations due to how the location is searched. This section will discuss the layout of the registry keys and the algorithms used to find assemblies which are declared in this location.

Part 4

In part four we will discuss how conflicts between dependent assemblies arise and how they are dealt with. This section explains why conflict warnings occur and how they can be prevented or disabled. This part also discusses how we determine whether an assembly should be copied to the output directory; that determination partly depends on the resolution of conflicts between dependent assemblies, which is why it lives in the same section.

Part 5

In part five we will discuss how the target framework moniker represented in the project file is used to generate a list of directories which represent the framework that the project is targeting. We also discuss the new multi-targeting rules that were introduced in MSBuild 4.0 to prevent users from referencing framework assemblies which are not part of the framework their project is targeting.
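
For instance, in a VS2010 C# project the moniker is derived from properties like these (a sketch; exactly which of these appear depends on the project):

<PropertyGroup>
    <!-- Together these yield the moniker ".NETFramework,Version=v4.0,Profile=Client" -->
    <TargetFrameworkVersion>v4.0</TargetFrameworkVersion>
    <TargetFrameworkProfile>Client</TargetFrameworkProfile>
</PropertyGroup>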

Part 6

In part six we will discuss the ResolveAssemblyReference task logging output. It logs a large amount of information about why it resolves references a certain way. This information is very useful when trying to determine why a reference was not resolved when it was expected to resolve or why one reference was picked from a certain location when it was expected to come from another.

Chris Mann – Developer, MSBuild 


Better Parallelism in MSBuild 4 with YieldDuringToolExecution

Introduction

In MSBuild 4 we introduced several performance improvements, particularly for large interdependent builds.  By and large they are automatic and you receive their benefit without making any changes to the way your build process is authored.  However, there are still some cases where we are unable to make the best decision.  One such case is when a particular external tool is invoked as part of the build but takes a significant amount of time.  An example of such a tool is cl.exe, the C++ compiler.  This article discusses how to use the new yield mechanism for external tools to improve the performance of your builds.

Tool Tasks

There are a few ways MSBuild can be made to execute external, command-line tools:

  1. Write a task which derives from ToolTask.
  2. Use the Exec task to call your command.
  3. Use the XamlTaskFactory.

All of these methods ultimately use the ToolTask class in Microsoft.Build.Utilities.v4.0.dll to handle executing a command-line task and deal with the output in the MSBuild way.  Like all tasks, however, they block any other work from happening in MSBuild while they are executing.  In cases where the task is very short, such as touching a log file or copying a file from one place to another this is perfectly acceptable.  But in the original example of invoking the C++ compiler, the amount of time MSBuild itself sits idle can be lengthy and in some cases it may be a significant impediment to good parallelization of your build.

The problem has to do with the way MSBuild utilizes its worker nodes.  Whenever a project is scheduled to be built, it is assigned to one of the worker nodes.  This node will then execute that project from start to finish, and will not accept more work until the project is either finished or the project makes an MSBuild call (for instance to satisfy a project-to-project reference.)  This is in large part because a node can only execute one task at a time, as tasks must be guaranteed their environment and current directory will not be modified during execution.

However, command-line tools do not execute in-process, and therefore their environment cannot be polluted by the running of additional tasks in parallel on the same node.  We can take advantage of this behavior to let the MSBuild node execute tasks in other projects while our long-running tool completes its work.  This is done using the YieldDuringToolExecution parameter.

YieldDuringToolExecution

Allowing MSBuild to continue building other projects while a command-line tool in one project is running is simple.  Just set the YieldDuringToolExecution parameter to ‘True’ on your long-running command-line tool.  This is a boolean parameter, so any valid MSBuild expression which resolves to a boolean value will work.  Here’s an example:

<PropertyGroup>
    <YieldDuringToolExecution>true</YieldDuringToolExecution>
</PropertyGroup>
<Exec Command="Sleep 10000" YieldDuringToolExecution="$(YieldDuringToolExecution)"/>

When the Exec task executes, normally it would sleep for 10000 seconds, during which no other work on the node could proceed.  However, with yielding enabled, the Sleep command will still run but the MSBuild node will be free to do other work.  Once the Sleep command finishes, the node will resume building the project which launched it as soon as it is free to do so.

Whether or not you should enable yielding for your ToolTasks depends on what they do.  Generally speaking, if the task runs for less than one second it is probably not worth enabling this, since there is a small cost to giving up the MSBuild node.  However, for longer-running tools you may see some wins, and the wins will likely be larger the more complex your build is and the more long-running tasks you have in it.  Again, large interdependent C++ builds are a great example of this, and they benefit tremendously from yielding being applied to the compiler.  You can investigate your build’s performance using the Detailed Summary feature of MSBuild 4.

Yielding also interacts well with the /m switch in MSBuild.  For instance, if you have specified /m:4 to enable parallelization, MSBuild will ensure that no more than four parallel things are going on at once, whether they are regularly building projects or yielding tools.  So enabling yielding will not cause your machine to become more overloaded.  Instead, your builds are likely to improve their parallelization and make better use of available CPU and I/O cycles than they would otherwise.

We have already enabled yield semantics for several tool tasks.  These include:

  • CL, the C++ compiler
  • MIDL, the IDL compiler
  • Link, the native linker – Only when the LinkTimeCodeGeneration metadata is set to UseLinkTimeCodeGeneration

It could also be enabled for the Vbc and Csc tasks, since they are ToolTasks as well, but this support is not in the Microsoft.CSharp.targets and Microsoft.VisualBasic.targets shipped with .NET 4.0.  You could easily add it yourself if you wished.  More generally, if you include Microsoft.Common.targets, the YieldDuringToolExecution property will be set to true unless it is overridden by passing /p:YieldDuringToolExecution=false to MSBuild.  We will continue to use this property as the basis for selecting the tool parameter value of the same name.
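
For a custom tool task of your own, the wiring looks the same as the Exec example above.  Here’s a minimal sketch (MyCodeGenTool, MyBuildTasks.dll, and the Sources parameter are hypothetical names; any task deriving from ToolTask exposes the YieldDuringToolExecution parameter in MSBuild 4):

<UsingTask TaskName="MyCodeGenTool" AssemblyFile="MyBuildTasks.dll" />
<Target Name="GenerateCode">
    <!-- Let other projects proceed on this node while the external tool runs -->
    <MyCodeGenTool Sources="@(CodeGenInputs)" YieldDuringToolExecution="$(YieldDuringToolExecution)" />
</Target>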

Why isn’t it automatic?

Unfortunately for MSBuild 4 we didn’t get the opportunity to make this system as automatic as we would like.  In future versions we would like to automatically yield when ToolTasks are executing if they look like they will last longer than a certain threshold.  This will also work together with additional automatic improvements in build analysis and scheduling we have planned.

Cliff Hudson – MSBuild Developer

Debugging MSBuild script with Visual Studio

Back when we started 4.0 development, I polled readers of the MSBuild blog to find out what features were most important to them. Debugging was #1, which was very surprising to us. Thinking about it more, it makes sense: on our team we’ve become so proficient at reading the XML and making sense of logs that it’s easy to forget how difficult it is – especially for someone new. John Robbins, debugging guru, also requested a Visual-Studio-integrated debugger.

Fast forward to the 4.0 release earlier this year, and we addressed 7 of the 16 requests by my count. We had to balance the requests against what Visual Studio itself needed from MSBuild. It had two major requirements: to enable VC++ to move onto MSBuild (request #5), and to enable more powerful and fine-grained multi-targeting.

It turned out that these two in turn required many other features, most of which were happily also popular requests in that blog poll. We added the ability to define a task with inline code (#7 – see the PowerShell example), a new, comprehensive object model (#14; in three parts: one, two, three), improved performance and scalability in many cases (#8 – and here), property and item functions (#9 – albeit not currently extensible), and accurate automatic dependency checking by performing file system interception (#11), plus some small syntax additions (label, import group, import by wildcard), a more configurable build engine (e.g. see here, here, and here), easier build extensibility, and some performance diagnostics.

We didn’t have time, unfortunately, to address converting the solution file to MSBuild (#3) – which we would dearly love to do – nor to add a Visual Studio integrated debugger (#1).

At least, not a supported one!

Mike Stall approached us to demonstrate an ingenious reflection-emit idea which made it considerably more feasible to create an MSBuild-specific debugger with many of the features of the real managed-code debugger. While on leave I wrote and checked in the code to do it. Unfortunately we couldn’t complete it in time to make the 4.0 schedule.

For that reason, it’s in the product, but disabled by default. It does work; it’s just not supported or documented, and it has a few limitations and bugs: it may be slow, it’s not always pretty, and in at least one case it’s a little inaccurate. This blog post is “unofficial” documentation of how to use it, in the hope it will be useful. Although it’s not supported, we will welcome Connect feedback, but it will likely be moved to our backlog rather than fixed immediately. It would also be a great idea to add any bug reports and feedback to the comments on this blog post.

Debugging Walkthrough

I’m going to walk through each debugging scenario in turn.

Before you start, open Visual Studio briefly and make sure that “Just My Code” is enabled. It’s essential for this to work properly:

image

There are a lot of screenshots here, but this blog is rather narrow, so some of them are distorted – you can click on them to see the full-size version.

Scenario 1 – Command Line only

First, enable the undocumented "/debug" switch on MSBuild.exe by setting debuggerenabled=true under the HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\MSBuild\4.0 key, as I’ve done here with reg.exe in an elevated Visual Studio prompt:

image
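
The command in that screenshot amounts to roughly the following sketch (run it from an elevated prompt; the value name and data are as described above):

reg.exe add "HKLM\SOFTWARE\Microsoft\MSBuild\4.0" /v debuggerenabled /d true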

You should now have these keys, assuming C: is your system drive.

image

Run MSBuild /? and you’ll see the new switch has appeared.

image

We are now ready to debug.

Normally you’d be debugging some build process you’ve customized or authored, but for illustrative purposes I’m going to debug a brand new C# Windows Forms project. I’m going to build it with the /debug switch, and it will immediately stop:

image

In my case I get a prompt to elevate, and hit Yes:

image

Then I get the standard JIT debugging prompt. Make sure you check “Manually choose the debugging engines”.

image

That causes a dialog to appear to choose the debugging engine: you want Managed only. (Mixed will work, but it’s more clunky.)

image

And you are now debugging!

image

The first thing to notice is that we are right at the top of the first project file, the very first line MSBuild is evaluating. You are breaking in automatically at the very start, as if you started debugging a regular application with “F11”.  Well, almost the very start: MSBuild already read in the environment and its other initial settings:

image

Now hit F10 and you will step line by line:

image

As you step over properties, you’ll see the locals window is updating:

image

image

As you probably know, MSBuild evaluates in passes. The first pass evaluates just properties, pulling in any imports as they’re encountered. Try to set a breakpoint (F9) on an item tag right now – you can’t! MSBuild is unaware of them at this point.

Set a breakpoint on the <Import> tag at the bottom and run to it (F5):

image

Now step in (F11). You’ll enter the file that’s being imported, which in this case is Microsoft.CSharp.targets.

image

The Callstack window shows that jump like a function call, including the location in the file:

image

Of course, <Import> does not have the semantics of a function call at all. Like an #include in C++, it simply logically inserts the content of another file. But I chose to make it work this way so that you can see the chain of imports in the Callstack window and figure out your context.

By setting some more breakpoints on Imports and doing step-into, I can go deeper to illustrate:

image

To get past the property pass, given that I can’t set a breakpoint on items yet, I’ll use a trick: I’ll Step Out repeatedly (Shift-F11) until we get to the project file again, then step to get to the next pass, which is Item Definitions. The C++ build process uses item definitions a great deal, but they’re not very interesting for C# – in fact, there’s only one:

image

Use the same trick to get to the Item pass, and we’ll get to the first item. I’ve then set a breakpoint to illustrate that I can do that now.

image

Conditional breakpoints work too, by the way, as I believe do tracepoints.

Stepping a bit further, I can see items in the locals window, and also their metadata. A small bug here — ignore the red message, and go into “Non-public members” to see the names and values:

image

image

Sometimes you’ll want to figure out what a condition evaluates to at the current moment. To do that, in the Immediate window, pass the condition to the function EvaluateCondition:

image

It’s much the same if you want to evaluate (expand) an expression, but the function is named EvaluateExpression:

image

This is also a convenient way to see what a property value is, or what’s in an item list, without navigating through the locals window. Be sure to escape any slashes, as I’ve done here.
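
For instance, Immediate-window entries look roughly like this (a hedged sketch; the property and item names are just examples, and backslashes in any paths need escaping):

EvaluateCondition("'$(Configuration)' == 'Debug'")
EvaluateExpression("$(OutDir)")
EvaluateExpression("@(Compile)")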

The Autos window doesn’t work, but Watch does:

image

In the Immediate window you can change almost any project state during the build, using the new object model. For example, I’ll modify this property while I’m stopped here:

image

You can do a lot through the new object model, so it’s very useful to be able to call it here.

My Watch window updated to match:

image

That’s the end of what I’m going to show for debugging MSBuild evaluation.

How it works

What’s happening at the high level (you can find out more from Mike’s blog) is that MSBuild is pretending the script is actually VB or C#. It’s doing this by emitting IL on the fly that’s semantically equivalent to what it’s really doing as it goes through the XML. The code of MSBuild itself is of course optimized, so Just My Code hides it, but conveniently the IL isn’t optimized, so it shows up. Inside the IL MSBuild emits line directives that point to the right place in the project file, completing the trick. As for the “locals”, they’re actually parameters passed to functions in the IL so that they appear. EvaluateCondition and EvaluateExpression are just delegates passed the same way.

As such, a large part of the basic features you get with the regular VB/C# debugger just work. Some that don’t: hovering over an expression doesn’t give you the result; you can’t just use the “?” syntax in the Immediate window; the Threads and Processes windows don’t make sense; and I doubt IntelliTrace works. Plus, some of our internals leak out in the windows here and there. But by using this trick, it was vastly less work to get the basics of an integrated debugger. I believe I spent a day or two tidying up Mike’s sample code, and another three days wiring it straightforwardly into MSBuild. Creating a real debugger engine would be much more costly, and something comparable to what you get for C# would be fantastically costly, so I expect that long term, this will be the MSBuild debugging story. I hope you’ll agree it’s a lot better than staring at XML and logs or adding <Message> tags.

In my next post I’m going to cover:
  • Debugging during the build – ie., debugging what happens inside targets, and project references;
  • Debugging a multiprocessor build;
  • Debugging the build of projects loaded into Visual Studio

See you then!

Dan

Visual Studio Project & Build Dev Lead

Debugging MSBuild script with Visual Studio (2)

In my previous post, I showed how to enable the hidden Visual Studio debugger for MSBuild script, and demonstrated it by stepping through the evaluation of a C# project. In this post, I’ll keep debugging into the actual build of that project.

Note that this blog is rather narrow, so some of the screenshots may be hard to see – you can click on them to see the full size version.

Starting from where we left off last post, I’ll set a breakpoint in Microsoft.CSharp.targets on the <Csc> task tag. That’s where the compiler will run. Then hit F5 to run to it.

image 

Ideally, I’d be able to set a breakpoint on the enclosing <Target> tag, but unfortunately there’s a bug: you can’t. As a workaround, you can inspect the values of a target’s attributes when you get to the first tag in the body of that target. If the target’s condition is false, or if its inputs and outputs are up to date so that it skips, it’s not so simple: you’d have to work around this by stepping up to the target before.

I’d like to be able to evaluate the condition on the task there, but because it’s batchable it could have multiple results: the EvaluateCondition delegate I used before won’t accept it. If this was a parameter on the task, I’d probably step into the task’s code itself to see the value. Since it’s a condition, I’d probably look through the metadata on the item list directly, or query the object model in some way.

Something like the value of Sources is easy to evaluate here, though:

image

Now I want to step into the task implementation.

You might think that at this point, you can simply Step In (F11). However, you can’t – it happens that MSBuild will run this task on a different thread. To know to jump threads properly here, the debugger has to support what they call “causality” for this kind of debugging, and since it doesn’t know anything about MSBuild, it doesn’t.

It’s easy to get the job done though – set a breakpoint in the task and run until it’s hit.

I have the source code for the Csc task, so I set a breakpoint here on the setter of the WarningLevel property, and did Continue (F5). I can see the task is getting “4” as the value of that property here. I can debug this code just like any other C# code, stepping through methods and so forth.

image

To get out to the XML, I’ll set a breakpoint in it and run – the same trick I used to get into the C#, but in reverse. Here I’m at the next tag after the task:

image

I used Csc as an example here, but you’ll generally be debugging a custom task (or possibly, logger). Just make sure the assembly is not optimized: since you have Just-My-Code on, it won’t be debuggable otherwise. If you only have access to an optimized one, you can switch off Just-My-Code temporarily.

There’s a CallTarget tag here: you can step into those, and like imports they’ll look like a new frame on the callstack – although unlike imports, that’s correct for their semantics.

Probably the biggest limitation (bug) with the debugger right now is that you can’t see item and property changes made inside a target. For example, at this point @(_CoreCompileResourceInputs) should be empty because of the line above, but the immediate window tells me it isn’t:

image

When you get past that target, you can see the changes.

Scenario 2: Debugging a build with multiple projects

Typically there’s more than one project in a build. I’ve added a project reference from this project to another. I’ve put a breakpoint at the top of that project, and run to it:

image

The bottom of the callstack is the point in Microsoft.Common.targets where, early in the build of WindowsFormsApplication1, it invoked WindowsFormsApplication2.

In my next posts I’m going to cover:
  • Debugging a multiprocessor build
  • Debugging the build of projects loaded into Visual Studio

See you then!

Dan

Visual Studio Project & Build Dev Lead

Debugging MSBuild script with Visual Studio (3)

In my last two posts (here and here) I showed how to enable the unsupported MSBuild debugger to debug a build started on the command line with MSBuild.exe. In this final post, I’ll mention some other variations.

Note that this blog is rather narrow, so some of the screenshots may be hard to see – you can click on them to see the full size version.

Scenario 3: Debug a Solution file on the command line

In the previous example, I launched msbuild.exe against a project file. You well might want to start with a solution file instead. If you do, you may get an error that looks like this:

Microsoft.Build.Shared.InternalErrorException: MSB0001: Internal MSBuild Error: Mismatched leave was C:\Users\danmose\Documents\visual studio 2010\Projects\WindowsFormsApplication1\WindowsFormsApplication1.sln.metaproj expected C:\Users\danmose\Documents\visual studio 2010\Projects\WindowsFormsApplication1\WindowsFormsApplication1.csproj (2,1)

The workaround is as follows. First, set an environment variable MSBUILDEMITSOLUTION=1. Then do a build of the solution in the regular way; if you want, you can cancel it after it starts building projects. Next to the solution file, you should see a file with the extension “.sln.metaproj”. This is essentially the solution file translated into MSBuild format. Now follow the usual debugging procedure, but this time launch msbuild.exe against this .sln.metaproj file instead of the original solution file.
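
Put together, the workaround looks roughly like this (a sketch using the solution name from this walkthrough; yours will differ):

rem Generate the .sln.metaproj next to the solution (you can cancel once projects start building)
set MSBUILDEMITSOLUTION=1
msbuild WindowsFormsApplication1.sln

rem Then debug against the generated metaproj instead of the .sln
msbuild /debug WindowsFormsApplication1.sln.metaproj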

Scenario 4: Multiprocessor build

When you’re debugging your build process or tasks, it’s much easier to follow if only one project is building at a time, and you’ll probably do it that way whenever you can. What’s more, debugging slows down the build so much that it may not help you diagnose a timing problem anyway.

If however for some reason you do need to debug a multiprocessor build, it’s possible.

As you probably know, MSBuild launches child processes to build more than one project at once, and they persist for a while to be ready for another build. The /debug switch doesn’t propagate to them. To get this to work, first terminate any msbuild.exe processes that are still alive. Then in a command window set an environment variable MSBUILDDEBUGGING=1. That environment variable is equivalent to the /debug switch, but unlike the switch it will get propagated to any child processes.

From this command window now start msbuild.exe with the /debug switch as usual. Everything will work much the same as before, but you’ll get a new JIT prompt as each child process starts to do its work, and you’ll have to use a new instance of Visual Studio for each one of them.
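
The command-window setup amounts to roughly this sketch (the metaproj name is the one from the earlier scenario; /m:2 keeps the number of child processes, and therefore JIT prompts, manageable):

rem Kill any lingering nodes so they pick up the environment variable
taskkill /f /im msbuild.exe
set MSBUILDDEBUGGING=1
msbuild /debug /m:2 WindowsFormsApplication1.sln.metaproj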

For example, I’m building a solution here, using the .metaproj workaround I mentioned above. Attaching at the first JIT prompt, I can see I am starting in the .metaproj itself:

image

If I hit Continue (F5) I’ll get to the top of the next project, and stop there as usual. In the callstack window, I can see that the solution has invoked that project, and I can double click on the lower frame in the callstack to see where it did that – it’s an MSBuild task, of course:

image

While I’m stopped here, I get a JIT prompt again. This time it’s for the other project in my solution, which has already started to load in parallel in another msbuild.exe process. I’ll go through the JIT prompts as I did before, and select to start a new instance of Visual Studio. That will in turn break in at the top of that other project.

Here’s what I see now:

image

Visual Studio supports debugging more than one process at the same time, which would be more convenient, but I don’t think the JIT launcher will let you do that. If you have a lot of child processes, it could get rather cumbersome. Remember that the “/m” switch defaults to the same number of processes as you have CPUs, so you may want to cut it down with “/m:2”.

Scenario 5: Debugging projects while they are loaded in Visual Studio

When I wrote the prototype, it was also possible to debug the evaluation and building of projects loaded in Visual Studio.

Unfortunately in walking through this scenario to write this blog post, I was sorry to find that it’s somewhat broken in the release version of Visual Studio 2010. In my experiments debugging works through the first evaluation, as the projects are loaded, but then it hits a bug and terminates. Since it’s an untested feature there’s always the chance something can break without detection and that’s apparently what happened here.

Given that, you’ll most likely have to stick with msbuild.exe. However, I’ll go ahead and explain a little about how one would do this in Visual Studio, in case you have better luck than I do.

Start off by setting the registry key as described at the start of my first post and kill any lingering instances of MSBuild.exe as you did in the last scenario.

Make sure the environment variable MSBUILDDEBUGGING is not set and open your first Visual Studio to act as your debugger. You can do this from the start menu. As before make sure Just-My-Code is switched on in this instance.

Open a command prompt and set the environment variable MSBUILDDEBUGGING=1 again. Then launch Visual Studio from that command prompt — most likely you’d start it with a command like “%ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\devenv.exe”. This second instance is your debuggee, which you will load your projects into.

Go back to the debugger process, and attach to that debuggee, choosing the “Managed (v4.0)” engine as always. In this debugger process, open up the project or targets file you want to start debugging through. Make sure you open it with “Open File” into the XML editor, rather than loading it as an actual project.

Debugging may not start automatically on the first line this time, so set a breakpoint where you want to begin. I opened the “WindowsFormsApplication1.csproj” file I used before into the XML editor, and set a breakpoint on the first line to get hit when the project is first loaded.

Open up your project or projects in the normal way in the debuggee now and you should see your breakpoint is hit. Step a little further, and for me, Visual Studio terminates.

End of the Walkthrough

Now I’ve shown you all the features of the MSBuild debugger, I hope you’ll try it out.

If you have feedback, do post it here. In particular, how important is it to be able to debug projects loaded inside Visual Studio? Also, as you’ve seen, MSBuild must be in “debugging mode” from launch — it’s not possible to walk up to a regular build that’s in progress and attach to it. Is it important to be able to do that?

Most of all, how useful is this to you, and how problematic are the bugs and limitations you run into?

Thanks for reading

Dan

Visual Studio Project & Build Dev Lead

MSBuild Known Issues

Since the release of Visual Studio 2010 we have received a few reports of crashing behavior which can be traced back to issues with MSBuild.  We’ve analyzed all of these, and there are several particular cases where a crash can occur.  We’ve also added a notification to Windows Error Reporting to help guide those who hit these errors.  You can determine whether your error is one of these either by matching the problem description below, or by looking in the Event Viewer as follows:

  1. Open the Event Viewer
  2. Search for Information events with ID = 1001 and Source = Windows Error Reporting.  Look for those with the time that approximately matches when you saw the crash.
  3. At the top of the details pane for the event is text that would look like the below.  If the bucket number is not 1055654512, then this post may not apply to you.

Fault bucket 1055654512, type 1

Event Name: APPCRASH

Response: xxxxxxx

Cab Id: 0

Crash when debugging using F5

Problem: This can occur if the build process is missing a required target.  This is normally due to an improperly customized build process.  If you are using the .NET MicroFramework 4, which is not supported in Visual Studio 2010, you may also see this issue.

Solution: Provide the missing target.  Try building the project/solution on the command-line.  If MSBuild logs an error that a target is missing, that could be the problem. 

Crash when registering COM component

Problem: COM registration requires that the user have permission to certain registry keys; lacking that permission, the RegAsm task crashes.

Solution: Ensure the registry keys needed to perform the registration are accessible to the account doing the registration.  Look under HKCR\Record for the GUID matching the type (or types) associated with the COM components you are registering.

Invalid Project Exception when building with .NET MicroFramework 4

Problem: When building projects with Visual Studio 2010 and the .NET MicroFramework 4, the build would fail with an InvalidProjectFileException.  The .NET MicroFramework 4 is not compatible with Visual Studio 2010.

Workaround: There is unfortunately no workaround for this.  According to the .NET MicroFramework 4 team, the next version of the MicroFramework will support VS2010.  Check out http://www.microsoft.com/netmf/default.mspx for updated information.

MSBuild throws an error that it cannot find msbuild.exe

Problem: Building on the command line or from Visual Studio displays an error that MSBuild could not find MSBuild.exe during a C++ or multiproc build.  If your username is exactly 20 characters long excluding your domain (the maximum allowed under Windows), a bug in the .NET Framework will prevent us from authenticating the other MSBuild nodes.

Workaround: Build under an account with a name that is less than 20 characters.  The bug should be fixed in the next version of the .NET Framework.

MSBuild throws an OutOfMemory exception

Problem: Your build in Visual Studio aborts with an OutOfMemory exception.  It may or may not also do this when built from the command-line.  This occurs typically when there are a very large number of projects with a lot of interdependencies, or when projects have an extremely large number of source files.

Workaround: Build your solution on the command line, so that your process does not have Visual Studio’s initial memory footprint; or build on a 64-bit machine under a 64-bit command window so we can take advantage of the additional virtual memory space; or split your solution into smaller chunks which can be built individually; or create a solution configuration in which only a subset of your projects build.

Visual Studio crashes when building a solution containing C++ and WiX projects

Problem: When building a solution which contains C++ and WiX projects, Visual Studio may crash.

Solution: This problem has been traced to a bug in the WiX project system.  Please contact the WiX project for a newer version with the correct bits.

Other issues

For some crashing issues we have set up Windows Error Reporting so that it will automatically ask you to send us additional information and to contact us directly.  If you see such a request, please consider providing the requested additional information so we can either determine that your issue is already known or address it in the next version of the product.

Incorrect solution build ordering when using MSBuild.exe

We’ve had a few reports of cases where Visual Studio, and previous versions of MSBuild, will build the projects in the solution in the correct order, but the 4.0 version of MSBuild.exe gets the order wrong.

Here’s a full description of what’s going on, why it began in 4.0, and the fix we recommend to your projects to solve the problem. If you’re not interested in the “why”, skip ahead to the workaround.

Archetypical case exhibiting the problem

dep1b

This diagram shows a solution file containing three projects, A, B, and C. Let’s say they are C# projects.

A has a regular project reference to B, so it will invoke B, then when B comes back done, it will build itself. At the same time in the solution file, there is a manually specified dependency: B depends on C.

I’ve shown regular project references with solid lines, and the information in the solution with dotted lines.

This manually specified dependency was set up in the solution by right clicking on the solution and choosing Project Dependencies…, then checking a box. Below I’ve shown the context menu and what you see in the dialog when you have this setup.

The build ordering you expect here is C, then B, then A, and Visual Studio shows that correctly in the Build Order tab, as you see below.

Of course a real case would have more projects in it, but it would boil down to this case.

image

image image

Why does this happen (skip ahead if you just want the “fix”)

Essentially the problem is that MSBuild doesn’t know anything about the project files until it starts to build them.

Solution files, as you know, are not in MSBuild format (yet). On the command line, MSBuild.exe is on its own, so it parses them and generates one or more in-memory MSBuild format files that are essentially a translation. If you want to see these ugly files, set an environment variable MSBUILDEMITSOLUTION=1 then build the solution. You’ll see a .sln.metaproj file emitted next to your solution file, and possibly one or more files with an extension like .csproj.metaproj next to some of your projects.

The .metaproj generated for B.csproj is how MSBuild makes sure that the solution dependency is respected — at least, it’s created so that the solution itself does not invoke B until C is built. It does this by invoking the B metaproj instead of B directly, and in the B metaproj, it builds C before B. This is exactly equivalent to someone going into B and adding a project reference to C, instead of a solution dependency, but it means that we don’t have to edit the B project directly.

Here’s what it looks like after this translation:

dep2

Why doesn’t that work in this case? In short, the problem is the project reference from A to B. Here’s what happens: the solution invokes A.csproj, B.metaproj, and C.csproj concurrently (or at least, in undefined order), which would normally be fine. B.metaproj invokes C.csproj, and waits, then invokes B.csproj. However in the meantime, A.csproj was invoked, and because it has a project reference to B.csproj, it invokes B.csproj — it “goes around the back”. C.csproj hasn’t necessarily built yet, so the build breaks.

Why did this work in previous versions of MSBuild?

In previous versions, we loaded and scanned every project file listed in the solution, and any they referenced, in order to draw a complete graph. Then we used the graph to create the MSBuild format equivalent of the solution file. The reason we did all this scanning was not actually to address this problem, it was to make interop with old-style non-MSBuild VC projects (“.vcproj”) work correctly. It was also slow, especially for large solutions.

In VS2010, VC projects were converted to MSBuild format, so in 4.0 we took out this complex interop code. After making .metaproj’s to express any dependencies stored in the solution file, we could now simply invoke all the projects in the solution and the build would order itself. That was potentially much faster, because we didn’t need to load any projects (potentially hundreds) to scan them before building anything. (Of course, when MSBuild 4.0 is fed a VS2005 or VS2008 solution file, it still calls into the old code in the old assembly to do it the old way, since those solutions may contain .vcproj’s. So they don’t have this problem.) The oversight was this case — where a project reference “goes behind the back” of a solution-expressed dependency.

To fix this we would have to revert to loading and scanning, which slows things down — the correct approach is to use project references instead of solution dependencies, as I explain below.

How to fix this

Follow this principle: do not use dependencies expressed in the solution file at all! Better to express dependencies in the file that has the dependency: put a project reference in the project, instead. In our example, that would be a project reference from B to C.

You may not have done that before because you didn’t want to reference the target of the project reference, but merely order the build. However, in 4.0 you can create a project reference that only orders the build without adding a reference. It would look like this – note the metadata element, and all this is inside an <ItemGroup> tag of course:

<ProjectReference Include="… foo.csproj">
    <ReferenceOutputAssembly>false</ReferenceOutputAssembly>
</ProjectReference>

Note that you have to add the child element with a text editor — Visual Studio can add a project reference, but doesn’t expose UI for this metadata.

I can tidy up by removing the dependency in the solution file as well – removing now-unnecessary lines like this — your GUID will be different, but use the VS dialog and it will do the job.

    ProjectSection(ProjectDependencies) = postProject 
        {B79CE0B0-565B-4BC5-8D28-8463A05F0EDC} = {B79CE0B0-565B-4BC5-8D28-8463A05F0EDC}

    EndProjectSection

If you’re using C++ projects, you are less likely to have this problem, because the upgrade process that converts .vcproj’s to .vcxproj’s moves any solution dependencies relating to them to project references for you. However, if you do, there’s a similar fix. For C++/CLI project references to other managed projects, use project references like the one above. For the equivalent situation with a project reference to a static lib, where you want a project reference without linking in the referenced lib, the metadata you want is:

<ProjectReference Include="… lib.vcxproj">
    <LinkLibraryDependencies>false</LinkLibraryDependencies>
</ProjectReference>

Summary

Although it’s tiresome to have to edit your projects in this way to make the bug go away, it is a best practice to use project references and to treat the solution file as merely a “view” – and you’ll end up with projects that can, if you want, be built without a solution file.

Post Script

I know of one other, more obscure and completely different case, where MSBuild 4.0 does not order correctly but Visual Studio does. This can happen if you have web application projects AND you build with 64-bit MSBuild (which is the default in Team Build 2010). I won’t go into the tedious details, but the fix is to do one of these things:

  1. Set a property or environment variable named MSBuildExtensionsPath to C:\program files (x86)\msbuild (see the sketch below), or
  2. Build with 32-bit MSBuild, which I recommend in general for other reasons, or
  3. Copy the web application targets files under the 64-bit Program Files MSBuild folder to the equivalent location in the 32-bit Program Files MSBuild folder.
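
For option (1), passing the property on the command line looks roughly like this (a sketch; the solution name is illustrative):

msbuild MySolution.sln /p:MSBuildExtensionsPath="C:\Program Files (x86)\MSBuild"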

If you find any other case where the solution is not ordering correctly yet this workaround does not work, that’s interesting. Please make a minimal repro like the one in this bug, and send it to Dan at msbuild@microsoft.com.

Dan

Update (12/25)

Luc C points out below that sometimes removing a project reference in favor of a new solution dependency is an alternative solution. That assumes you’re only using the project reference for ordering, or that you replace it with a file reference. Still, I do recommend project references in general, on the principle of “express the dependency in the place it applies” – you can look at the project and see that the dependency is correct, and you can include the projects in more than one solution easily.

Luc also points out the case of a managed project depending on a non-CLR native project (presumably for PInvoke). In my experiments, VS will let you add a project reference, albeit with an ugly bang, and it will do the ordering correctly, as will msbuild.exe.

Second edition of the MSBuild and Team Foundation Build book released

Not many books are reviewed like this on Amazon:

clip_image001

Now the heavily augmented second edition has just come out, written by several people at Microsoft and reviewed by the product team.

  • New extensive coverage of building C++ with MSBuild
  • Complete rewrite of the Team Build sections to cover the new Windows Workflow Foundation build orchestration
  • Detail on new MSBuild 4.0 features like inline tasks, and item and property functions.

image

If you need a reference to Team Build or MSBuild, you should get it.

— Dan, VS Solution/Project/Build dev lead

Update: Here’s a link to the book on Amazon

