Update on experiences of using Git


I’ve been using Git now for a few months, almost exclusively. When I last blogged on it I’d been using it for just a few weeks. Since then I’ve come to the conclusion that it should be the de facto choice for source control. This might be obvious to those of you who use Git on a regular basis, but there are obviously many people out there using other source control systems in the .NET world – I would guess primarily TFS and Subversion.

There are, to me, two main benefits to Git. Firstly, it’s incredibly lightweight to start using. There’s no requirement for a source control server, so you can easily start using it for local projects. It’s also extremely quick – check-ins are virtually instant, and you can seamlessly switch between different revisions of code in an instant, without any network connectivity.
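
As a very rough illustration (the folder and file names here are made up purely for the example), starting a brand new, purely local repository is just a handful of commands from e.g. Git Bash:

    mkdir MyProject && cd MyProject
    git init                          # a brand new local repository - no server involved
    echo hello > readme.txt           # something to check in
    git add .                         # stage it
    git commit -m "First check-in"    # checked in locally, near-instantly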

It’s also very powerful, with excellent branching and merging features; I’ve started using branching for features much more now, thanks to the ease of use in Git and the aforementioned ability to quickly jump around the source tree. In TFS, getting an “older version” of code means checking out that version using a rarely-used dialog; in SVN it means finding the particular version first, downloading it to a separate folder, and so on. In Git, with e.g. Git Extensions, all I do is visually look down the tree, see the version I want to look at, and open that revision. It’s no surprise that the overall TFS suite now supports Git for source control, and that VS has started to get some support for it as well.
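
For comparison, this is roughly what “open an older revision” looks like from the command line (the commit hash below is just a placeholder):

    git log --oneline                 # list recent commits with their short hashes
    git checkout a1b2c3d              # point the working folder at that revision
    git checkout master               # and jump straight back again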

Online source control repositories such as GitHub (free for public projects) and BitBucket (free for small projects) mean you can get up and running with a central repository to collaborate with others almost instantly.
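
Hooking an existing local repository up to one of these services is, roughly, just a case of adding a remote and pushing – the URL below is a made-up example:

    git remote add origin https://github.com/yourname/yourproject.git
    git push -u origin master         # publish the local history to the central repository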

Whilst on the subject of tooling – I’ve been using Git Extensions more and more and have found it powerful and much easier to use than faffing with the command line.

There is indeed a learning curve for doing more than just basic check-in / get latest version of code – primarily around the ability to instantly move from one revision of code to another in the same physical folder – but once you “get” git, you’ll not look back.

In conclusion, if you’re using TFS or Subversion, I would seriously recommend looking at moving to Git for new projects – the productivity gain you can get from improving your release process (e.g. release branches, hotfixes) will make it worthwhile immediately, in addition to the day-to-day developer gains.
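
As a sketch of the sort of release process I mean (the branch names are just one common convention, not a prescription):

    git checkout -b release/1.0                  # cut a release branch from the current code
    git checkout -b hotfix/1.0.1 release/1.0     # an urgent fix, branched from the release
    # ...fix, commit and ship the hotfix...
    git checkout master
    git merge hotfix/1.0.1                       # flow the fix back into ongoing development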

Better git integration for VS 2012


Git seems to be everywhere these days, doesn’t it? Everyone is using it and, looking for any excuse to blog today, I wanted to share both my early experiences with it and the new Microsoft git plugin for VS2012.

Initial thoughts on git

Pretty much like everyone else’s, I imagine. Branching is much easier to do than with other source control systems that I’ve used – I’m talking TFS, SVN (and I suppose SourceSafe as well). There are many reasons for this, I think – the biggest one is the fact that when you pull down code, you pull down the entire repository, i.e. the full history of the repository. This means that you can do things like rollbacks, check-ins, branches and merges all locally – when you’re happy with your changes as an entire piece of work, you can “push” the lot in one go to your remote repository (remote repository = the TFS / SVN-style central source control server, although with git it doesn’t quite work like that).
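
In practice that means a single clone gives you everything, and history commands then work with no network connection at all (the URL is a placeholder):

    git clone https://github.com/someone/someproject.git   # pulls the entire history
    cd someproject
    git log                                                 # browse the full history, completely offline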

It also encourages more frequent check-ins since you’re doing things locally, so part of the “mental block” of checking in centrally is removed. You can do things like perform multiple local check-ins and then, before pushing to the remote repository, convert those many check-ins into one check-in.
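
That “many check-ins into one” trick is typically done with an interactive rebase just before pushing; a minimal sketch (the commit count is arbitrary):

    git rebase -i HEAD~3              # open the last three local commits for editing
    # in the editor, mark the later commits as "squash" so they fold into the first
    git push origin master            # push the single, combined check-in to the remote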

Because branches are much easier to control in git, you may find yourself doing things like feature branches more often. Oh, and there is no support for a “single check out” mode like you have with TFS – hopefully those days are behind all of us!

git also performs quickly – check-ins happen pretty much instantly as they are local, and you can switch back to an older revision of code with a single command – no complicated rolling back and so on. In fact, it’s so easy to do that you might be surprised by it at first – you roll back or switch branches, and VS immediately says that files have been changed and updates.

This is all very nice, although I have also struggled with some aspects of git – firstly it blends concepts like merging and check-ins, so there’s a slight learning curve there, as well as introducing the idea of “rebasing”, which essentially replays the commits of one branch on top of another so that they appear as though they were a single set of ordered check-ins. Secondly, I’ve had one or two issues where I’ve somehow completely trashed my local repository, forcing me to completely clean it out and “start over”. Once I lost a couple of hours’ work – not much fun.
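
For what it’s worth, a typical rebase is just a couple of commands, and git reflog is worth knowing about if you do get into a mess (the branch names here are illustrative):

    git checkout my-feature
    git rebase master                 # replay my-feature's commits on top of the latest master
    git reflog                        # shows where HEAD has been - handy for finding "lost" commits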

Overall, having used git, I must say that I do like the features it offers and the possibilities for helping teams of developers improve their day-to-day processes. It’s lightweight to set up, powerful, and fully embraces the “offline” mode of working rather than the “if-your-network-connection-is-slow-then-VS-will-run-like-a-dog” way that TFS operates, which I nowadays find very frustrating.

Tooling

This is where things go a little awry. On the Windows platform, git has several options for managing your code: –

  • Command Line. This is how you talk to git under the bonnet. Many developers use this for their source control; personally I prefer something a little more accessible and easy to learn, but you can obviously do anything from here. There are several different shells you can run it from – the standard command prompt, PowerShell etc. – but to me they are all basically the same thing.
  • GitHub For Windows. This is GitHub’s version of a git source control front end. It’s a fine tool to use for basic operations, i.e. push and pull from remote repositories, check in, rollback etc. It also offers branching and merging, but if there are any conflicts you’ll just get a “something went wrong!” sort of message. It worked fine for me for single-person projects, but for anything more, you might struggle.
  • Git Extensions. This is a suite of tools including some Visual Studio integration points, as well as a GUI front-end over the command line like GFW, except this actually supports merge conflicts (via diff tools like KDiff etc.). It has some decent docs and support, so is well worth checking out.
  • Git Source Control Provider. This is a free, third-party VS source control provider that integrates pretty well with VS. It doesn’t support branching etc. (at least, I couldn’t find that) so you’ll need another tool to do that – but it does have context menu options in solution explorer to help you out.
  • Visual Studio Tools for git. This is still in preview, but is another VS source control provider, so it integrates with solution explorer etc. It also allows branching, integrates with the in-built VS2012 merge tool, and has decent support for viewing history etc. Somewhat annoyingly, it won’t automatically include newly added files in a check-in – you have to do that explicitly.

There are simply too many options here for a complete newbie to know which one does what and when to use one or the other. Only the last one comes with a built-in diff tool (although Git Extensions does offer to install KDiff I believe). What you want is a one-stop shop for git really, or at most a couple of installs – one for the core git libraries etc. and another for the UI plugin.

Having used all of these over the past few weeks, I’m still struggling to find the “sweet spot” tool. I think any VS dev using git on a daily basis will want VS Tools for git, as it makes 80% of what you will do a doddle, i.e. pulling latest changes, checking in locally, pushing to remotes, branching and merging. You can do all that directly in VS. However, you’ll still probably want Git Extensions for other, less commonly used tasks. And underneath all of that sit the command line tools.

Conclusion

In practical terms, I struggled initially to do some fairly basic operations like resolving merge conflicts, simply because I couldn’t figure out how to wire up a diff tool. Eventually, after faffing with Git Extensions and installing a couple of diff tools, I did manage it. Thankfully, VS Tools for git now makes that easier.
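
For reference, wiring up a merge tool by hand comes down to a couple of git config entries; a rough sketch using KDiff3, assuming it is installed in its default location:

    git config --global merge.tool kdiff3
    git config --global mergetool.kdiff3.path "C:/Program Files/KDiff3/kdiff3.exe"
    git mergetool                     # launches KDiff3 for any conflicted files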

I still think part of the work will be for devs who are experienced in TFS and SVN to come around to a different way of doing source control, but in order to do that, the tools need to be more streamlined and accessible. Those two source control systems have mature UIs – git just needs a bit more work on this front to lower the barrier to entry even more.

Unity Automapper 1.3 released


A new version of the Unity Automapper (1.3) is now available on NuGet. It now supports a fluent-style provider API in addition to the attribute-based model for overriding default settings etc.

I’ve also decided to open-source the Unity Automapper at this time and push it onto GitHub. There are several reasons for this. Primarily, I’ve been using the public, web-facing version of TFS for a while, but wanted to try GitHub out to see the differences, both from a website perspective and as a source control mechanism.

I plan on blogging shortly about my initial experiences of GitHub (generally quite positive) but for the purposes of the Unity Automapper, all documentation and guidance can now be found here. I still plan on publicising updates via this blog, but it will no longer be the repository for documentation and low-level details of each release.

TFS check-in policies over multiple branches


The project I’m working on currently has two branches; a main trunk for hot-fixes and a branch for a new module which will be released at some stage in the future.

We’re getting a few issues with check-in policies and builds as a result of branching, however. For example: –

  • Continuous integration. We’ve had to turn off the CI build in TFS, as whenever anyone checks in anything to the “new module” branch, the main trunk CI build kicks off. Is there a way to change this so that it only happens when a check-in occurs for that branch of code?
  • Our unit testing check-in policy is playing up now. It complains that certain unit tests haven’t been run on check-in. I can only assume that this is because the policy points to the main trunk’s VSMDI file, whereas when you run unit tests on the branch, that runs the branch’s version of the VSMDI file. As we’ve removed some unit tests from the branch version of the application (because they are no longer appropriate), I suspect that TFS thinks that we’ve not run all the unit tests, when in fact we have. I can’t figure out a way around this, though… any ideas welcome.

Windows 7 Virtual PC & configuring TFS 2010 in a test environment


I recently installed TFS 2010 on my local machine for testing, but after playing around with it, decided that I wanted to try the new Labs feature of TFS plus some of the other new features. I also didn’t want to keep TFS (and SQL) running all the time just in case I decided to do some development.

So I downloaded Windows Virtual PC (WVPC) for Windows 7 and installed a version of Windows 7 on it. I then installed TFS 2010 along with SQL Express 2008 and hey presto, working TFS in its own environment.

WVPC is pretty good; the only criticism I have of it is that it only emulates a 32-bit CPU (even though I have a 64-bit one). This is a little annoying as I can’t run Windows Server 2008 R2 in a virtual PC (as it is 64-bit only). However, aside from that, the whole process did not take that long: –

  • Downloading the ISO for Windows 7 off MSDN: 15-20 minutes.
  • Installing on the VPC: about the same.
  • Installing and configuring TFS: 10 minutes.

The only thing that did take a while was installing SQL Express 2008 Management Studio and then exposing the SQL service to my host machine via Windows Firewall.
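
For anyone doing the same, the firewall part boils down to opening the SQL port on the guest; a rough sketch assuming the default instance listening on TCP port 1433 (a named SQL Express instance may well be on a dynamic port instead):

    netsh advfirewall firewall add rule name="SQL Server" dir=in action=allow protocol=TCP localport=1433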

But it’s all up and running now. I want to try TFS Labs but we’ll see how that goes…

Scrum TFS Templates?


I’m planning on using a “proper” Scrum TFS template on the next project we do. My understanding is that there are two main TFS templates available for Scrum: –

There’s also what looks like a handy dashboard TFS add-in which works with both templates.

I’d really like to use one of these, but I’d like to know people’s thoughts on either of the templates – what did they like about them? What did they not like? What are the reports like that come with the template (if any)? Was it a pain to install the template? etc. etc.

Any thoughts / comments greatly appreciated! Once I choose one and we start to use it, I’ll blog on how we’re finding it ourselves.

TFS “GetLatest” and “GetSpecific” version of code – the definitive guide


At my office we were discussing the differences between these two ways of getting source code from TFS, and I wanted to find out once and for all. I have found that some people generally use Get Specific because Get Latest “doesn’t work” / is buggy / is unreliable / doesn’t actually get the latest code (?) or something similar.

  • What are the real differences between them?
  • When should we use one and not the other?
  • How do Workspaces fit in with these two features?

Workspaces

OK. To start with, I went through Workspaces and how they work – and, importantly, how they make TFS act differently from SourceSafe (VSS) by default. This is how I understand it: –

Workspaces are a local cache of the code in TFS. When you pull code out of TFS – whether it’s by Get Latest or Get Specific – the code is stored locally in some folder that you never know the name of (or care about, really). The files in this workspace can be thought of as an offline version of TFS against which all your changesets are managed.

Checking out a file

When you edit a file for the first time after getting code, the TFS plug-in does the following: –

  1. If you are using single check-outs, the plug-in checks with TFS to see if the file is locked. If multiple check-outs are permitted, this is ignored.
  2. If the file is unlocked, the plug-in removes the read-only flag on your local file and marks the file as checked out in TFS.
  3. Assuming (2) was successful, if you are using single check-outs, it also places a lock on the file to prevent anyone else checking it out at the same time.

At no point in this process does TFS check whether the version of code in your workspace is the same as what is centrally stored in TFS.

Checking in a file

Imagine the following scenario (assume that we are using single checkouts for simplicity): –

  1. On Monday Jim does a Get Latest (or Get Specific) of all files in a TFS project. This code is copied into his local machine’s workspace, and also to wherever he has mapped the workspace to, e.g. C:\Dev\MySolution.
  2. On Tuesday Sally starts to work on File A and checks it out exclusively. At this point, TFS knows that she has a check-out on the file, but that’s all. It does not know what changes she is making to that file.
  3. On Wednesday she checks in her changes of File A. TFS now has the latest version of code. Jim’s workspace version of the file is unchanged.
  4. On Thursday Jim decides to modify File A. The plug-in checks out the file exclusively to him and places a lock on the file in TFS as already described. The version of the file he modifies is the original version he got on Monday. He does NOT get the version which was checked into TFS yesterday by Sally.
  5. On Friday Jim is finished with his work and checks it in. The TFS plug-in pops up with a message asking him to merge his changes in with TFS’s version of the file. This is to resolve the changes he has made and the ones that Sally made earlier in the week.

This shows that you can get the merge dialog even in a single check-out environment, not just with multiple check-outs.

The old VSS behaviour would have been that, on editing the file (step 4 above), the plug-in automatically pulled down the latest version of File A. The reason TFS does not do this is to prevent you accidentally breaking your local build, i.e. you might get the latest version of File A, which references a File B that Sally checked in at the same time on Wednesday. You don’t have File B, so your build will break, forcing you to do a full Get Latest of the code.

You can get the TFS plug-in for VS to behave in this way: –

  1. Open up Team Explorer.
  2. Go to the root node of your project.
  3. Right click and select Team Project Settings –> Source Control… (you will need Project Admin rights to get this option)
  4. Select “Enable get latest on check-out”.

That’s it – you’re back to the original VSS behaviour.

Get Latest vs Get Specific version

I’ve searched everywhere for a definitive answer to this one. I’ve been using Get Latest since we started using TFS and don’t get any problems with it, ever! What I have read is this: –

  • Get Latest will synchronise your local workspace (and the mapped development folder) with all the files checked into TFS. This means it will: –
    • Update files already in your workspace that differ from TFS (bringing up the Merge dialog if necessary)
    • Add any new files into your workspace
    • Remove any deleted files from your workspace
  • Get Specific (provided that you tick a few of the boxes in the dialog) will get EVERY file from TFS under the node you’ve requested, even if it exactly matches the copy of the file in your workspace (see the command-line sketch after this list).
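
Roughly speaking, the tf.exe equivalents look like this – flags quoted from memory, so check the documentation for tf get:

    tf get /recursive                            (Get Latest: only items that differ from the workspace)
    tf get /version:C1234 /recursive             (a specific changeset; 1234 is a placeholder)
    tf get /all /overwrite /recursive            (force every file down, regardless of what the workspace thinks it has)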

When is Get Specific “better” than Get Latest?

Under what circumstances would Get Latest not be enough for you? I can’t think of any for day-to-day development. I’ve found a few blogs on this issue – the overriding situations where Get Latest will not get you the same set of files on your machine are if you have: –

  • Removed files outside of Visual Studio, in which case the TFS plug-in will not know that you have removed the file and will not get it from TFS when you do a Get Latest because it still exists in your workspace. The solution is to either delete it through Solution Explorer in VS or through Team Explorer. Conversely, if you delete a file in e.g. Windows Explorer, when you check-in your changes, I don’t believe that TFS will know that the file has been deleted (for the same reason), so won’t delete it from TFS. If someone can confirm this that would be nice 🙂
  • Removed the read-only flag on a file manually, i.e. right-clicked to get properties and removed the Read-Only flag. In this case, when you do a Get Latest on the file, it will not do any change tracking, because again the workspace does not know that you’ve been editing the file – the plug-in never had to remove the flag for you in the first place.

When is Get Latest “better” than Get Specific?

There’s nothing “better” in terms of the accuracy of files that you will get using Get Latest. However: –

  • It’s much quicker, as it only pulls down files that are different between your workspace and TFS. In a slow network environment this might be very important!
  • You don’t get VS going bananas about reloading projects.

Designer-generated files

There is one unique scenario where things can go belly-up when resolving your check-ins – designer files, e.g. Forms, Entity models, DataSets etc., which have .designer.cs / .dbml / .xsd / .xss files that are generated by Visual Studio when you make changes on a designer surface.

When you check in your e.g. Form, if someone else has made changes to the same form (the visual part, not the “code behind”) in the meantime, VS will fail to resolve the differences automatically and ask you to resolve the changes manually. If you fancy wading through a designer-generated file, this is of course not a problem 🙂 But I’d sooner undo my checkout on that file, get latest, and make my changes again. In a single check-out environment, the only way to guard against this is to force the “Get Latest on Check Out” feature on developers.

However, in a multiple check out scenario, even using both Get Specific Version and the Get Latest on Check Out feature of Visual Studio will not save you from this situation! As far as I can tell, there is simply no way to prevent it happening. So, if you are using multiple checkouts, be careful when working on designer-generated files – maintain short check-outs on them so that the risk of someone else working on them in the meantime is reduced.

Comments welcome…
