The TFS team has released a security update for TFS 2015 and 2017 that fixes cross-site scripting issues.
When you have a TFS collection that has been running for a long time, as I do, you end up with a lot of shelvesets.
The drawbacks of this are, of course, (minor) storage consumption, but it also affects upgrade time.
But instead of deleting all the old, obsolete shelvesets manually, PowerShell comes to the rescue.
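A minimal sketch of the cleanup idea: filter out shelvesets older than a cutoff, then delete each one with tf.exe. The `$Shelvesets` input here is hypothetical; in practice you would build it by parsing the output of `tf shelvesets /owner:* /format:detailed` (or by querying the TFVC REST API, if your TFS version exposes it).

```powershell
# Select shelvesets older than $MaxAgeDays from a list of objects with
# Name, Owner and CreatedDate properties (hypothetical input shape).
function Select-OldShelvesets {
    param(
        [object[]]$Shelvesets,
        [int]$MaxAgeDays = 365
    )
    $cutoff = (Get-Date).AddDays(-$MaxAgeDays)
    $Shelvesets | Where-Object { $_.CreatedDate -lt $cutoff }
}

# Destructive part, hence commented out -- deletes each old shelveset:
# Select-OldShelvesets -Shelvesets $all -MaxAgeDays 365 | ForEach-Object {
#     tf vc shelve /delete "$($_.Name);$($_.Owner)" /noprompt
# }
```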
The previous sample scripts on this blog have been about how to use PowerShell through the REST API on an on-premises TFS. I've got some questions about how to connect to Visual Studio Online (VSTS) instead.
All the samples can be modified to connect to VSTS.
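The main difference is authentication: instead of `-UseDefaultCredentials` (Windows auth against an on-premises TFS), you send a Basic auth header built from a personal access token (PAT). A minimal sketch, where the account name and PAT are placeholders you substitute with your own:

```powershell
# Build the Authorization header VSTS expects from a personal access token.
# The PAT goes in the password position, so the user part before ':' is empty.
function New-VstsAuthHeader {
    param([string]$PersonalAccessToken)
    $encoded = [Convert]::ToBase64String(
        [Text.Encoding]::ASCII.GetBytes(":$PersonalAccessToken"))
    @{ Authorization = "Basic $encoded" }
}

# Usage (account name and PAT are placeholders):
# $headers = New-VstsAuthHeader -PersonalAccessToken $pat
# Invoke-RestMethod -Uri "https://myaccount.visualstudio.com/DefaultCollection/_apis/projects?api-version=1.0" -Headers $headers
```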
When doing release management, one of the first steps is to get the artifacts from your build job copied to a remote machine. TFS already has some built-in tasks for this, like Windows Machine File Copy or the Azure File Copy task.
The Azure File Copy task uses the Azure scripts to copy files to Blob storage or to the VM itself (it actually copies the files through Azure Blob storage).
The Windows Machine File Copy task uses robocopy, so it is meant for copying over the SMB protocol. However, you might be blocked by a firewall that won't allow SMB.
Then PowerShell comes to the rescue. (As always)
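A sketch of the idea: copy the artifacts over PowerShell remoting (WinRM) instead of SMB. `Copy-Item -ToSession` requires PowerShell 5.0 on both machines; the computer name, credentials and paths are placeholders.

```powershell
# Copy a local folder to a remote machine over a WinRM session instead of SMB.
function Copy-ArtifactsToRemote {
    param(
        [string]$ComputerName,
        [pscredential]$Credential,
        [string]$SourcePath,
        [string]$DestinationPath
    )
    $session = New-PSSession -ComputerName $ComputerName -Credential $Credential
    try {
        # -ToSession streams the files through the remoting channel (PS 5.0+)
        Copy-Item -Path $SourcePath -Destination $DestinationPath `
            -ToSession $session -Recurse -Force
    }
    finally {
        Remove-PSSession $session
    }
}

# Usage (placeholders):
# Copy-ArtifactsToRemote -ComputerName 'webserver01' -Credential (Get-Credential) `
#     -SourcePath 'C:\drop\MyApp' -DestinationPath 'C:\deploy\MyApp'
```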
Sometimes you need to get the latest successful build and the download path for its artifacts, e.g. for some third-party tools or as input for other build jobs.
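A sketch against the TFS 2015 build REST API (api-version 2.0): ask for the newest completed, succeeded build of a definition, then list its artifacts. Collection URL, project and definition id are placeholders.

```powershell
# Return the artifact download URLs of the latest successful build
# of one build definition.
function Get-LatestArtifactDownloadUrls {
    param(
        [string]$CollectionUrl,   # e.g. http://tfs:8080/tfs/DefaultCollection
        [string]$Project,
        [int]$DefinitionId
    )
    $buildsUri = "$CollectionUrl/$Project/_apis/build/builds" +
        "?definitions=$DefinitionId&statusFilter=completed&resultFilter=succeeded" +
        "&`$top=1&api-version=2.0"
    $build = (Invoke-RestMethod -Uri $buildsUri -UseDefaultCredentials).value |
        Select-Object -First 1
    if (-not $build) { return }

    $artifactsUri = "$CollectionUrl/$Project/_apis/build/builds/$($build.id)" +
        "/artifacts?api-version=2.0"
    (Invoke-RestMethod -Uri $artifactsUri -UseDefaultCredentials).value |
        ForEach-Object { $_.resource.downloadUrl }
}
```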
You should always have a realistic test environment where you can test updates and breaking changes. To make it as realistic as possible, you need to refresh it regularly with data from the production environment. So why not use your normal SQL backup?
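A minimal sketch of the restore step, assuming the SQLPS/SqlServer module (which provides `Invoke-Sqlcmd`) is installed on the machine running the script; server instance, database name and backup file path are placeholders. `WITH REPLACE` overwrites the existing test database.

```powershell
# Restore a production backup over an existing test database.
function Restore-TestDatabase {
    param(
        [string]$ServerInstance,
        [string]$Database,
        [string]$BackupFile
    )
    $sql = @"
ALTER DATABASE [$Database] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
RESTORE DATABASE [$Database] FROM DISK = N'$BackupFile' WITH REPLACE;
ALTER DATABASE [$Database] SET MULTI_USER;
"@
    # -QueryTimeout 0 = no timeout; restores of large databases take a while
    Invoke-Sqlcmd -ServerInstance $ServerInstance -Query $sql -QueryTimeout 0
}

# Usage (placeholders):
# Restore-TestDatabase -ServerInstance 'TESTSQL01' -Database 'MyApp' `
#     -BackupFile '\\backupshare\MyApp_prod.bak'
```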
Sometimes you need to get the work items from TFS directly into your favorite tool, PowerShell.
When the work items are available in PowerShell, you have a lot of possibilities, like building custom reports, release notes, or just plain lists.
But instead of creating your own query in PowerShell, why not use the queries already defined in TFS?
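A sketch using the TFS work item tracking REST API (api-version 1.0): look up a stored query by its path, run it to get the matching work item ids, then fetch the work items themselves. Collection URL, project and query path are placeholders.

```powershell
# Run a stored TFS query (e.g. 'Shared Queries/My Bugs') and return its work items.
function Get-WorkItemsFromQuery {
    param(
        [string]$CollectionUrl,
        [string]$Project,
        [string]$QueryPath
    )
    $escapedPath = [uri]::EscapeUriString($QueryPath)
    $query = Invoke-RestMethod -UseDefaultCredentials `
        -Uri "$CollectionUrl/$Project/_apis/wit/queries/$($escapedPath)?api-version=1.0"

    # Executing the stored query returns only work item references (ids)
    $result = Invoke-RestMethod -UseDefaultCredentials `
        -Uri "$CollectionUrl/$Project/_apis/wit/wiql/$($query.id)?api-version=1.0"

    # The workitems endpoint takes at most 200 ids per request
    $ids = ($result.workItems | Select-Object -First 200 |
        ForEach-Object { $_.id }) -join ','
    if ($ids) {
        (Invoke-RestMethod -UseDefaultCredentials `
            -Uri "$CollectionUrl/_apis/wit/workitems?ids=$ids&api-version=1.0").value
    }
}
```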
Execution of test runs, either manual or automated, creates a bunch of diagnostic test data that can cause the database to grow quite rapidly.
For vNext build/test jobs you define a retention policy that tells TFS how many days to keep the data (the default is 90 days). This cleanup job runs automatically every night.
But when using XAML-based builds the retention policy is only triggered on each build, so if you stop using a build definition, e.g. when you have released and cloned the definition, old builds and test data will be kept forever, until you trigger a build again.
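One way around this is to clean up the old test runs directly. A sketch, assuming the test runs REST endpoint (api-version 1.0) is available on your TFS version; collection URL and project are placeholders, and the delete is destructive, so test it with a read-only dry run first.

```powershell
# Delete test runs that completed more than $MaxAgeDays ago.
function Remove-OldTestRuns {
    param(
        [string]$CollectionUrl,
        [string]$Project,
        [int]$MaxAgeDays = 90
    )
    $cutoff = (Get-Date).AddDays(-$MaxAgeDays)
    $runs = (Invoke-RestMethod -UseDefaultCredentials `
        -Uri "$CollectionUrl/$Project/_apis/test/runs?api-version=1.0").value

    $runs |
        Where-Object { $_.completedDate -and [datetime]$_.completedDate -lt $cutoff } |
        ForEach-Object {
            Invoke-RestMethod -Method Delete -UseDefaultCredentials `
                -Uri "$CollectionUrl/$Project/_apis/test/runs/$($_.id)?api-version=1.0"
        }
}
```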
Currently there is no out-of-the-box way of making sequential builds in TFS, so I took a look at the REST API that TFS provides.
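The basic building block can be sketched like this: queue a build definition through the REST API (api-version 2.0), then poll until it completes before queuing the next one. Collection URL, project and definition ids are placeholders.

```powershell
# Queue one build definition and block until the build has completed,
# returning its result (succeeded/failed/...).
function Invoke-BuildAndWait {
    param(
        [string]$CollectionUrl,
        [string]$Project,
        [int]$DefinitionId
    )
    $body = @{ definition = @{ id = $DefinitionId } } | ConvertTo-Json
    $build = Invoke-RestMethod -Method Post -UseDefaultCredentials `
        -Uri "$CollectionUrl/$Project/_apis/build/builds?api-version=2.0" `
        -Body $body -ContentType 'application/json'

    do {
        Start-Sleep -Seconds 15
        $build = Invoke-RestMethod -UseDefaultCredentials `
            -Uri "$CollectionUrl/$Project/_apis/build/builds/$($build.id)?api-version=2.0"
    } while ($build.status -ne 'completed')

    $build.result
}

# Sequential usage (hypothetical definition ids):
# foreach ($id in 42, 43, 44) {
#     Invoke-BuildAndWait -CollectionUrl $url -Project $proj -DefinitionId $id
# }
```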