Tuesday, 23 September 2014

Hitting the limit on a Local Workspace

Everybody remembers the introduction of local workspaces in 2012, which enabled offline scenarios with Team Foundation Server. But did you know they have a limit on the number of items before performance starts to degrade?

That limit is 100,000 items.

TF401190: The local workspace temp_WS;User has 248536 items in it,
which exceeds the recommended limit of 100000 items. To improve
performance, either reduce the number of items in the workspace,
or convert the workspace to a server workspace.

There it is. It's not a bug; it's a design choice by the Team Foundation Server team.

Local workspaces work by leveraging the content of the hidden $tf folder, which tracks all the changes to a file (deltas) from check-out to check-in. That's how you get features like Candidate Changes. The side effect is that, even though the baseline copy is compressed, it is still a copy, so the workspace is physically bigger on disk.
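
If you are curious how much that extra copy weighs on disk, a quick comparison between the hidden $tf folder and the rest of the working folder tells you. Below is a minimal Python sketch; the workspace root path is a hypothetical example, adjust it to your own mapping.

import os

def folder_size(path):
    """Total size in bytes of all files under path."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            full = os.path.join(dirpath, name)
            if os.path.isfile(full):
                total += os.path.getsize(full)
    return total

# Hypothetical workspace root; adjust to your own mapping.
root = r"C:\src\MyWorkspace"
tf_metadata = os.path.join(root, "$tf")  # hidden metadata folder used by local workspaces

metadata_size = folder_size(tf_metadata)
source_size = folder_size(root) - metadata_size
print("Source files: %.1f MB" % (source_size / 1024.0 ** 2))
print("$tf metadata: %.1f MB" % (metadata_size / 1024.0 ** 2))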

The workarounds in this case are to convert to a server workspace (easy) or to split the huge, monolithic workspace into several smaller workspaces so you stay under the limit. Splitting can be harder than just switching to a server workspace, but with a bit of planning it is absolutely feasible.
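
If you go the splitting route, it helps to know how many items each candidate working folder would hold compared to the 100,000-item threshold from the message above. Here is a minimal Python sketch; the candidate paths are hypothetical and the count is only an approximation of what TFS actually tracks.

import os

RECOMMENDED_LIMIT = 100000  # threshold reported by TF401190

def count_items(root):
    """Approximate count of files and folders, skipping the $tf metadata folder."""
    count = 0
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames[:] = [d for d in dirnames if d.lower() != "$tf"]  # don't count metadata
        count += len(dirnames) + len(filenames)
    return count

# Hypothetical roots for the smaller workspaces; adjust to your own layout.
for candidate in [r"C:\src\Frontend", r"C:\src\Services", r"C:\src\Database"]:
    items = count_items(candidate)
    print("%s: %d items (%.0f%% of the recommended limit)" % (candidate, items, 100.0 * items / RECOMMENDED_LIMIT))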

This post by Philip Kelley is extremely enlightening: it is a deep comparison between local and server workspaces, where he explains the differences and how they are implemented (the PendChange permission, the +R bit, etc.).

9 comments:

  1. You can say that again!

  2. How about making TFS work better instead of giving us another useless message?

    Replies
    1. Yep exactly. They really should have implemented local workspaces in a manner that is scalable rather than useless for large projects.

    2. I totally agree; I see the message every day. I don't notice any performance issues, but I do notice the subtle annoyance of seeing the message every time I open Visual Studio.

  3. Exactly. In my history with SVN I never had performance problems or hit any limit. My bad that TFS is squeezed by my current development?

  4. SourceSafe was so simple... never had such issues. Now with TFS I sometimes have to get "the latest" a few times because it doesn't get it; if that doesn't work I have to get a specific version and check two buttons. Now they came up with workspaces (include/exclude) and the list goes on...

    New kids who join us think we are dinosaurs for not understanding this; they never worked with SourceSafe's simplicity and reliability!

    The only thing TFS does better is auto-merging... that's it!

    Replies
    1. The issue in this post is more about TFS-VC; TFS itself does not impose this limit. Using TFS-Git has no such issue, for example.

      TFS-VC is simply bad design; Git is miles better.
