Monday, November 7, 2016

Delivering files using NuGet 2 so they're copied to the output directory

I recently had a situation where a NuGet package I was creating had to deliver some supporting DLLs which needed to be copied to the output directory (actually into a sub-directory named /x86). We're using NuGet 2 at work because our internal NuGet feed is managed via ProGet, which doesn't yet support NuGet 3.x. While NuGet 3.3+ gives the copyToOutput option to have content files delivered to the output directory, there's no such support in NuGet 2. In researching how to do this I came across two very helpful posts on StackOverflow which gave me the direction I needed. The posts were http://stackoverflow.com/a/30386836/123147 and http://stackoverflow.com/a/30316946/123147.

The approach I took was to have the files I needed to distribute be included in the NuGet package in the /build folder, as opposed to /content. In fact, I put them in a sub-folder named /x86 which was the desired target location in the output directory. I then used an MSBuild targets file to mark all the files in that sub-folder as always needing to be copied to the output directory. I had not worked with MSBuild targets (or properties) files before - so this approach was totally new to me.

This blog post seeks to expand on using an MSBuild targets file in a NuGet package. The NuGet documentation talks about this technique - but I thought a more step-by-step set of instructions might help others.

The sample code I'm providing in this post is a little different from what I did at work in that there's no sub-folder created in the target directory. I am using a sub-folder in the NuGet package's /build folder, though, so I don't have to act on each file individually.

The sample scenario

The contrived example for this post is a package, NeededFiles, which delivers two files (an image and a readme) to the output directory of any project that installs it.

What does an MSBuild targets file do?

If you were to open a Visual Studio project file (e.g. csproj or vbproj) in a text editor you would see it's an XML format which defines properties and actions (i.e. "targets"). I found it pretty similar to a NAnt script. What a targets file lets us do is define a target that can be imported into the project file later. The NuGet documentation explains that when a package includes a targets file with the same name as the package in its /build folder, the NuGet installation adds an Import element for it to the consuming project file.
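Concretely, after installing a package named NeededFiles you would expect to see something like the following near the bottom of the consuming .csproj (this is an illustrative example; the exact path depends on the package version and the location of the packages folder):

<Import Project="..\packages\NeededFiles.1.0.0\build\NeededFiles.targets" Condition="Exists('..\packages\NeededFiles.1.0.0\build\NeededFiles.targets')" />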

What's in the NeededFiles package?

The NeededFiles.nuspec file looks like this:
1. <?xml version="1.0"?>
2. <package >
3.   <metadata>
4.     <id>NeededFiles</id>
5.     <version>1.0.0</version>
6.     <authors>@Rob_Hale_VT</authors>
7.     <requireLicenseAcceptance>false</requireLicenseAcceptance>
8.     <description>
9.   An example of delivering files to the consuming project's output via build targets
10.  </description>
11.     <releaseNotes>Created</releaseNotes>
12.     <copyright>Copyright 2016</copyright>
13.     <tags>Demo</tags>
14.   </metadata>
15.   <files>
16.  <!-- In this instance, the file(s) to be delivered are in the sub-folder named "howdy" -->
17.  <file src="howdy\*.*" target="build\howdy" />
18. 
19.  <!-- The targets file must have the same name as this package -->
20.  <!-- More info: https://docs.nuget.org/ndocs/create-packages/creating-a-package#including-msbuild-props-and-targets-in-a-package -->
21.  <file src="NeededFiles.targets" target="build" />
22.   </files>
23. </package>

The key lines for what I'm doing are 15 - 22, where the files element is defined. On line 17 I'm adding all the contents of a folder called "howdy" to the /build/howdy target folder. This folder contains the image and readme file I want delivered.

Additionally, line 21 adds the NeededFiles.targets file to the /build target folder.
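With the nuspec and the targets file in place, building the package is the usual pack command, run from the folder containing the nuspec (assuming NuGet.exe is on your path):

nuget pack NeededFiles.nuspec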

What does the NeededFiles.targets file do?

Now that the NeededFiles package includes the /build/NeededFiles.targets file we can take a look at what it does.
1. <Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
2.   <ItemGroup>
3.     <NativeLibs Include="$(MSBuildThisFileDirectory)howdy\*.*" />
4.     <None Include="@(NativeLibs)">
5.       <Link>%(RecursiveDir)%(FileName)%(Extension)</Link>
6.       <CopyToOutputDirectory>Always</CopyToOutputDirectory>
7.     </None>
8.   </ItemGroup>
9. </Project>

This file is imported by the consuming project, so when that project is built, line 3 declares an item named NativeLibs made up of the contents of the howdy folder (delivered as part of the NeededFiles package). Lines 4 - 7 act like a loop, instructing MSBuild to mark each file in NativeLibs so it is always copied to the output directory.

If I wanted the files to all appear in a sub-folder of the output directory I could modify line 5 to be:
<Link>%(RecursiveDir)howdy/%(FileName)%(Extension)</Link>

Summary

At this point the package will deliver the contents of the howdy folder to the output directory of any project that installs the NeededFiles package. Unlike files marked as content, the files in the build target folder don't even appear in the Solution Explorer in Visual Studio. "It just works."

The source for my contrived example, NeededFiles, can be found at https://github.com/robhalevt/NeededFilesExample

Thursday, March 31, 2016

Seven year navel gazing...

The first post of this blog documented that I was preparing to start a new job. That was December of 2008. Seven years seems like a long time ago.

On Monday I start another new job.

So I feel like this is a good opportunity to reflect on what these past seven years have meant to me and my career. A chance to thank the universe for what I've experienced, to thank the folks I've met, and to recognize where I think I'm going.

The month after that initial post I started working at MyWebGrocer. I had been hired to work on the eCommerce product and introduce some improvements to the development processes. I had played a similar role at my prior employer by implementing an automated build system and a few other bits of tooling. At MyWebGrocer, however, I was able to expand on that prior work. I led the introduction of a continuous integration server, static code analysis, automated unit testing, some scrum practices, and more. The team I was on kicked ass; our product was pretty stable, we were delivering new functionality, refactoring to reduce complexity and lines of code, and enjoying each other's company. After a few years I took a management position. I eventually resumed my technical career, but for those two years I had the privilege of managing the best team with whom I've had the pleasure of working. When I decided to leave management the MWG executive team was supportive, allowing me to resume my technical career without leaving the company. Not every employer would have done that; I'm appreciative.

My tenure at MWG coincided with my increased participation in the development community around Burlington, VT. I got more involved with the VT .NET user group, which led to co-founding the VT Code Camp. When MyWebGrocer founded the HackVT event they invited me to be on the planning team, and I've continued to serve on the organizing team for HackVT. I've met amazing members of our technical community, too: organizers of other local user groups, students, presenters, sponsors, employers, journalists, writers, etc. My participation in this community will continue regardless of where I work, and I'm confident that MWG's support of the local development community will continue, too. They've been reliable sponsors for VT Code Camp, HackVT, and other local development events.

As I transition into my next role, MWG is six times larger than it was on my first day. I look at how the development team works and I see my fingerprints on a lot of it. It's not been a perfect experience. I've made mistakes, and there are elements of what I would have liked to contribute to the MWG experience which remain unfulfilled. This is the real world, after all. But these past seven years have (so far) been the highlight of my professional career.

Come Monday, when I start at Renewable NRG Systems, I will be a significantly better, and more well-rounded, developer than I was seven years ago. I'm eager to dive into a different industry for an organization with a different corporate culture. New people, new knowledge - it's going to be great!

So thank you to the MWG team for helping me grow professionally. And for your friendships. Thank you to RNRG for the opportunity to challenge myself in an industry I feel is important.

Thanks universe.

Wednesday, December 9, 2015

What HTTP Status Code to return

A colleague shared a link to http://racksburg.com/choosing-an-http-status-code/ with me today. It contains a nice rundown of when to have your API return certain status code values.

Tuesday, November 10, 2015

Interesting link to keep

Not really a blog post here. Just don't want to lose the URL for a calendar that has a round up of Vermont based events of interest to entrepreneurs and development folks: http://www.vermontventurenetwork.org/?view=calendar&month=November-2015

Wednesday, July 8, 2015

End to End testing takedown

Google's Testing Blog has a post, Just Say No to More End-to-End Tests, which does a nice job of articulating my experience with automated testing. Specifically, I've found unit testing to be more beneficial than integration testing, and integration testing more beneficial than end-to-end (E2E) testing.

The immediate feedback provided by unit tests, which run significantly faster than the other two, helps me as a developer. Using a TDD approach and frequently invoking my unit tests, I refine the design and identify failing code quickly. At my current employer we've tied the unit tests into the continuous integration process so that if code coverage drops by 1% or more the build fails. That's a nice defense against having the code atrophy. I also like the closing statement from Martin Fowler's post, Test Pyramid:
In particular I always argue that high-level tests are there as a second line of test defense. If you get a failure in a high level test, not just do you have a bug in your functional code, you also have a missing unit test. Thus whenever you fix a failing end-to-end test, you should be adding unit tests too.

Integration and E2E tests have value, and I'm not arguing they be skipped, but I've found the greatest ROI comes with unit tests.

Monday, May 18, 2015

Bookmarking future reading

A co-worker was passing around a link to a Mozilla document that provides A re-introduction to JavaScript and I don't want to lose the link. It looks to be a thorough article - which requires more time than I can currently dedicate to it, hence the bookmark post.

Nothing to read here, citizen. Move along.

Friday, April 3, 2015

Adventures (and frustrations) in NuGet package creation

I'm frustrated.

And disappointed.

Frustrated and disappointed. But more knowledgeable.

I spent a good part of today trying to get our CI build to create our NuGet packages so the resulting packages list their dependencies without developers having to manually edit the .nuspec file. Unfortunately, I failed, but I now know a little more about how to interact with our TeamCity installation and the nuget pack command.

What I learned was that our build process invokes the nuget pack command which creates the package. Reviewing the documentation for that command I read about the IncludeReferencedProjects option which is described as doing the following:
Include referenced projects either as dependencies or as part of the package. If a referenced project has a corresponding nuspec file that has the same name as the project, then that referenced project is added as a dependency. Otherwise, the referenced project is added as part of the package.
"This is great!" I thought, "The solution I'm currently working on has project references that met that description. I won't have to explicitly declare the dependency."

What I was trying to do

Maybe a brief note about the solution is in order. It's for a WebAPI and has several projects in it, not all of which are published as NuGet packages. The projects are:
  • MyApp 
  • MyApp.Application
  • MyApp.Api
  • MyApp.Api.WebHost
  • MyApp.Repository
With the exception of MyApp.Api, each of these results in a NuGet package. That's mainly because, even though we're moving to have all calls go through MyApp.Api.WebHost (the package that's automatically deployed to the web servers), there are some older projects that continue to need access to MyApp, MyApp.Application, and MyApp.Repository.

As you would guess, there are project references between some of the projects. For example, MyApp.Api.WebHost references MyApp, MyApp.Application and MyApp.Repository. What we've been doing is modifying the web host project's .nuspec file to declare the dependencies on MyApp, MyApp.Application and MyApp.Repository.
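That manual edit is just the standard dependencies element in the .nuspec, something along these lines (the version numbers here are placeholders):

<dependencies>
  <dependency id="MyApp" version="1.0.0" />
  <dependency id="MyApp.Application" version="1.0.0" />
  <dependency id="MyApp.Repository" version="1.0.0" />
</dependencies>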

What I tried

Given this configuration I thought the IncludeReferencedProjects option would allow us to only declare package dependencies that existed outside the solution.

So I sat with a co-worker who is very knowledgeable about the TeamCity templates we have. He showed me how to include that option on the build step which creates the NuGet package. We copied the template to a temporary one to test with, added the option, associated the MyApp build to that template, modified the .nuspec file to remove the explicit package dependency declarations, committed the code, and sat back to see the results.

It didn't work.

The resulting nupkg file didn't list the dependencies. We then modified the way nuget was run to turn up the verbosity (-Verbosity detailed) and ran it again to see if we learned anything useful. All we saw in our build log was [pack] Dependencies: None. Then my co-worker noticed in the pack command examples that, in addition to running nuget pack against the .nuspec file, you can also run it against the .csproj file. So we tried that. That change resulted in a number of failing builds because the .csproj files still referenced Visual Studio 2010 (v10.0) and we're using VS2013. That mismatch doesn't matter to MSBuild, which knows what version it is, and it didn't matter to nuget pack when we targeted the .nuspec file (presumably because the project file wasn't needed), but when we targeted the .csproj file it broke. So we changed the project files so:
<VisualStudioVersion Condition="'$(VisualStudioVersion)' == ''">10.0</VisualStudioVersion>
became
<VisualStudioVersion Condition="'$(VisualStudioVersion)' == ''">12.0</VisualStudioVersion>

and
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" Condition="false" />
became
<Import Project="$(MSBuildExtensionsPath32)\Microsoft\VisualStudio\v12.0\WebApplications\Microsoft.WebApplication.targets" Condition="false" />

Neither of these changes fixed the problem. We were continuing to see an error that read:
[pack] Microsoft.Build.Exceptions.InvalidProjectFileException: The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the declaration is correct, and that the file exists on disk.  [the path to my .csproj file]
[pack]    at Microsoft.Build.Shared.ProjectErrorUtilities.ThrowInvalidProject(String errorSubCategoryResourceName, IElementLocation elementLocation, String resourceName, Object[] args)
[pack]    at Microsoft.Build.Shared.ProjectErrorUtilities.ThrowInvalidProject(IElementLocation elementLocation, String resourceName, Object arg0)
[pack]    at Microsoft.Build.Evaluation.Evaluator`4.ExpandAndLoadImports(String directoryOfImportingFile, String importExpressionEscaped, ProjectImportElement importElement)
[pack]    at Microsoft.Build.Evaluation.Evaluator`4.EvaluateImportElement(String directoryOfImportingFile, ProjectImportElement importElement)
[pack]    at Microsoft.Build.Evaluation.Evaluator`4.PerformDepthFirstPass(ProjectRootElement currentProjectOrImport)
[pack]    at Microsoft.Build.Evaluation.Evaluator`4.Evaluate()
[pack]    at Microsoft.Build.Evaluation.Evaluator`4.Evaluate(IEvaluatorData`4 data, ProjectRootElement root, ProjectLoadSettings loadSettings, Int32 maxNodeCount, PropertyDictionary`1 environmentProperties, ILoggingService loggingService, IItemFactory`2 itemFactory, IToolsetProvider toolsetProvider, ProjectRootElementCache projectRootElementCache, BuildEventContext buildEventContext, ProjectInstance projectInstanceIfAnyForDebuggerOnly)
[pack]    at Microsoft.Build.Evaluation.Project.ReevaluateIfNecessary(ILoggingService loggingServiceForEvaluation)
[pack]    at Microsoft.Build.Evaluation.Project.Initialize(IDictionary`2 globalProperties, String toolsVersion, String subToolsetVersion, ProjectLoadSettings loadSettings)
[pack]    at Microsoft.Build.Evaluation.Project..ctor(String projectFile, IDictionary`2 globalProperties, String toolsVersion, String subToolsetVersion, ProjectCollection projectCollection, ProjectLoadSettings loadSettings)
[pack]    at NuGet.Commands.PackCommand.BuildFromProjectFile(String path)
[pack]    at NuGet.Commands.PackCommand.BuildPackage(String path)
[pack]    at NuGet.Commands.PackCommand.ExecuteCommand()
[pack]    at NuGet.Commands.Command.Execute()
[pack]    at NuGet.Program.Main(String[] args)
[pack] Process exited with code 1


For some reason neither of us understood, the project file was still looking for version 10, even though the file on the build server had version 12 specified. We decided to change the way nuget pack was being called to include the property assignment of the VisualStudioVersion (-Properties Configuration=Release;VisualStudioVersion=12.0).

This got us a successful build and the resulting nupkg file started listing our dependencies as expected.

So the IncludeReferencedProjects option only appears to work when nuget pack targets the project file.
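For reference, the invocation that worked is roughly equivalent to this command line (the project name is just the example from above; TeamCity assembles the actual call):

nuget pack MyApp.Api.WebHost.csproj -IncludeReferencedProjects -Verbosity detailed -Properties Configuration=Release;VisualStudioVersion=12.0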

Short lived victory lap

After high fives all around we set about changing the real template for our CI builds. Once that was done we kicked off builds for the projects which used that template. What we found was that even with the changes we made related to the VisualStudioVersion, in both the properties passed from the nuget pack command and the project files, some of the builds kept failing because the version being sought was v10 not v12.

Observations before reverting

What we noticed was that the problem seemed to occur when project A.1 was being packed, A.1 referenced project A.2, and the import line causing the exception was in project A.2. For example, during the pack of project MyApp.Account.Api.WebHost the exception line was
[pack] Microsoft.Build.Exceptions.InvalidProjectFileException: The imported project "C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v10.0\WebApplications\Microsoft.WebApplication.targets" was not found. Confirm that the path in the declaration is correct, and that the file exists on disk.  F:\Agent\Work\3b114de58491d1e6\src\MyApp.Account.Api\MyApp.Account.Api.csproj

At this point we were stumped because all the project files had the hard-coded v12.0 value. Somehow the import was being given the wrong path. We couldn't find the cause, and after half a day we had to move on.

Appeal for advice

If anyone knows how we might successfully revisit this and work past the problems, either by telling us why the IncludeReferencedProjects option wasn't working from the .nuspec file, or by explaining what we have to do related to the VisualStudioVersion (or something else we're not seeing that you know of), it would be appreciated. Tweet me.