Team Foundation Server

Posted by bronco on 14-Jan-2011 06:23

Hi,

Does anyone have experience with MS Team Foundation Server 2010 in conjunction with OEA & Team Explorer Everywhere plugin?

If so, what are your experiences?

I'm trying to evaluate this combination (because we have quite a few .NET guys here who already use TFS), but so far it seems a bit awkward compared to SVN/Subclipse.

Thanks,

Bronco

All Replies

Posted by Stefan Drissen on 05-Jul-2011 14:07

Our source control is migrating to Team Foundation Server next month, and we will be looking into using Team Explorer Everywhere. Any experiences / pitfalls you can share?

Thanks,

Stefan

Posted by bronco on 22-Jul-2011 14:02

Hi Stefan,

I'm afraid not. We decided to go the Mercurial way for several reasons, including the license costs for both TFS and TEE. Apart from that, I think the combination of Mercurial, Ant & Hudson was a better one with OE Architect in mind.

Bronco

Posted by Stefan Drissen on 22-Jul-2011 14:47

Thanks. TFS is being implemented for all corporate development teams, and nearly all of the other teams are very much Microsoft-minded.

We uploaded all our sources to TFS this week, only to find out that the file date has no meaning in TFS: whenever you get 'latest version' locally, all files are timestamped with today. Since we use the file date to determine what is compiled into a patch, this was something of a showstopper.

We are now looking into a modification that will at least use the check-in date when getting the latest version locally.

I had a quick look at the trial version of Team Explorer Everywhere once all sources were in TFS, and I liked the integration with Eclipse: simply double-click a file, start editing, and the file is automatically pending check-in.

Posted by Admin on 22-Jul-2011 15:21

Since we use the file date to determine what is compiled into a patch, this was something of a showstopper.

Relying on the time stamp sounds counterproductive when using a powerful SCM tool. TFS certainly has better ways of finding out what has changed since day X. Look for a scripting / command-line interface.

However, with Perforce (our SCM tool) we can decide whether the time stamp gets set to the date of synchronization or keeps the original modification date. Check for a similar setting in the TFS configuration.

Posted by bronco on 22-Jul-2011 15:33

Well, I guess that relying on the modification date leaves something to be desired as well. What about dependencies on includes? You'd need a good xref parser which records all the dependencies. I would bet on the MD5 of the r-code: compare the version you delivered with the current version, based on the MD5 value, and determine that way which .r's have changed.
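
Roughly like this (just a sketch - file names made up, and it assumes everything is compiled with GENERATE-MD5 so the r-code actually carries an MD5):

DEFINE VARIABLE cOldMd5 AS CHARACTER NO-UNDO.
DEFINE VARIABLE cNewMd5 AS CHARACTER NO-UNDO.

/* MD5 of the r-code delivered last time */
RCODE-INFO:FILE-NAME = "deployed/debtor.r".
cOldMd5 = STRING(RCODE-INFO:MD5-VALUE).

/* MD5 of the freshly compiled r-code */
RCODE-INFO:FILE-NAME = "build/debtor.r".
cNewMd5 = STRING(RCODE-INFO:MD5-VALUE).

IF cOldMd5 <> cNewMd5 THEN
    MESSAGE "debtor.r changed - include it in the patch".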

just my 2c,

Bronco

Posted by Admin on 22-Jul-2011 15:47

Relying on the MD5 hash of the R-Code (RCODE-INFO Handle) is certainly the most reliable way of selecting patch files.

Especially as it also takes changes to the DB schema (CRC check) into account (I guess that was fixed in 10.0B).

Posted by Stefan Drissen on 22-Jul-2011 15:50

mikefe wrote:

Since we use the file date to determine what is compiled into a patch, this was something of a showstopper.

Relying on the time stamp sounds counterproductive when using a powerful SCM tool. TFS certainly has better ways of finding out what has changed since day X. Look for a scripting / command-line interface.

However, with Perforce (our SCM tool) we can decide whether the time stamp gets set to the date of synchronization or keeps the original modification date. Check for a similar setting in the TFS configuration.

We've used the file date for over 10 years, so it's hard to just drop it like that. I'm happy to look into other ways, but we need to do this one step at a time without killing all our current deployment tools.

TFS has no setting for the date (I searched for this during the week and confirmed it with Google).

Posted by Stefan Drissen on 22-Jul-2011 15:53

Yes, we create an xref build each night; the results are then poured into a database with a WebSpeed front end for querying them, one such query being which files include which include files. It is up to the programmers to make use of this.
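
In essence the nightly pass does something like this per compile unit (paths made up here); it's the INCLUDE lines in the .xrf output that record which file includes which:

/* write the cross-reference for one compile unit */
COMPILE src/debtor.p XREF "xref/debtor.xrf".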

We change the database once a year (or so), and that marks the batch compile (all sources compiled). All changes (by file date) on the same database model are patched.

Posted by Admin on 22-Jul-2011 15:56

but we need to do this one step at a time without killing all our current deployment tools.

Understandable.

TFS has no setting for the date (I searched for this during the week and confirmed it with Google).

OK, if you need the file's last modification date, maybe there is a report in TFS returning that?

The "tf properties" command looks like it returns that: http://msdn.microsoft.com/en-us/library/tzy14b58.aspx

So it's probably just a small change to where you get the last-modified date from, rather than to the whole deployment scheme.
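
Something along these lines could feed your existing tooling (a sketch - the server path is made up, and the "Last modified" label is an assumption; check what your tf version actually prints):

DEFINE VARIABLE cLine         AS CHARACTER NO-UNDO.
DEFINE VARIABLE cLastModified AS CHARACTER NO-UNDO.

/* scrape the server-side modification date from the tf properties output */
INPUT THROUGH VALUE('tf properties "$/Project/src/debtor.p"').
REPEAT:
    IMPORT UNFORMATTED cLine.
    IF cLine BEGINS "Last modified" THEN
        cLastModified = TRIM(SUBSTRING(cLine, INDEX(cLine, ":") + 1)).
END.
INPUT CLOSE.

MESSAGE "Server last modified:" cLastModified.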

Posted by Stefan Drissen on 22-Jul-2011 16:07

mikefe wrote:

OK, if you need the file's last modification date, maybe there is a report in TFS returning that?

The "tf properties" command looks like it returns that: http://msdn.microsoft.com/en-us/library/tzy14b58.aspx

So it's probably just a small change to where you get the last-modified date from, rather than to the whole deployment scheme.

This is what our corporate TFS specialist is looking at: using these properties when getting files (http://blog.coryfoy.com/2007/12/fixing-the-timestamps-on-files-from-team-foundation-server/).

Posted by Admin on 22-Jul-2011 16:11

This is what our corporate TFS specialist is looking at: using these properties when getting files (http://blog.coryfoy.com/2007/12/fixing-the-timestamps-on-files-from-team-foundation-server/).

Cool C# code! It's trivial to translate into ABL.
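
The core of it in ABL via the .NET bridge would be something like this (a sketch - the path is made up, and the check-in date would come from the TFS API):

DEFINE VARIABLE dtCheckin AS DATETIME NO-UNDO.

dtCheckin = NOW. /* replace with the check-in date returned by TFS */

/* stamp the local file with the server-side date */
System.IO.File:SetLastWriteTime("c:\work\src\debtor.p", dtCheckin).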

Posted by Stefan Drissen on 22-Jul-2011 16:16

mikefe wrote:

Cool C# code! It's trivial to translate into ABL.


Are you suggesting we get sources directly from TFS with ABL? I hadn't thought about it like this before. The only downside is that it costs me time instead of corporate.

editor put some junk in the quote...

Posted by Admin on 22-Jul-2011 16:31

Are you suggesting we get sources directly from TFS with ABL? I hadn't thought about it like this before.

I know too little about your whole environment to suggest anything in that direction. But if the code that "collects" patch files is ABL, querying TFS directly from ABL might be a good way forward. I wouldn't actually change the file date/time on disk; I'd change the way you query it, from the file system to the TFS .NET calls.
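
As a sketch of what that could look like (assuming the TFS 2010 client assemblies are in your assembly references; class and property names as in the Microsoft.TeamFoundation.VersionControl.Client API, server URL and path made up):

USING Microsoft.TeamFoundation.Client.* FROM ASSEMBLY.
USING Microsoft.TeamFoundation.VersionControl.Client.* FROM ASSEMBLY.

DEFINE VARIABLE oTpc  AS TfsTeamProjectCollection NO-UNDO.
DEFINE VARIABLE oVcs  AS VersionControlServer     NO-UNDO.
DEFINE VARIABLE oItem AS Item                     NO-UNDO.

/* connect to the project collection */
oTpc = TfsTeamProjectCollectionFactory:GetTeamProjectCollection(
           NEW System.Uri("http://tfs:8080/tfs/DefaultCollection")).

/* get the version control service (non-generic GetService overload) */
oVcs = CAST(oTpc:GetService(System.Type:GetType(
           "Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer, Microsoft.TeamFoundation.VersionControl.Client")),
           VersionControlServer).

/* the server-side item carries the check-in date */
oItem = oVcs:GetItem("$/Project/src/debtor.p").

MESSAGE "Checked in:" oItem:CheckinDate.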

Perforce has a .NET API as well, and I've used that a couple of times from ABL. One example is a tool to create delta DF files from the checked-in files: I always just create a full DF file, and have an ABL/P4.NET program that gets the previous version as well, loads both into empty databases, creates a delta DF file and checks that in. I like being able to have a simple ABL API for my SCM tool (and full integration into the IDE).

The only downside is that it costs me time instead of corporate.

Don't get me into your politics.

Posted by Stefan Drissen on 22-Jul-2011 16:45

Thanks for the idea! Our compiler, which currently grabs all files from a network share, could be adjusted to grab files from TFS instead.

Any thoughts on how test environments (with AppServers and WebSpeed agents) should remain up to date? Currently they simply have propaths which point to the network shares. 

Posted by Admin on 22-Jul-2011 16:56

Any thoughts on how test environments (with AppServers and WebSpeed agents) should remain up to date? Currently they simply have propaths which point to the network shares.

I've never really used TFS.

But for me it's just a "p4 sync" on that machine; there is probably a tf command-line equivalent.
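
With tf it would presumably be something like this in a scheduled job (sketch - server path made up, and the workspace must already be mapped on that machine):

/* refresh the test environment's sources from TFS */
OS-COMMAND VALUE("tf get $/Project /recursive /noprompt").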

Posted by Stefan Drissen on 23-Jul-2011 16:11

The GENERATE-MD5 idea intrigued me. A quick test with it unfortunately shows that any change (even irrelevant changes) to the source file will result in a different MD5. We have a central include file which contains preprocessor definitions for all table and field ids. This file is included by all business logic sources. When a new business logic source is added, extra defines are added for the new tables / fields. This change results in a different MD5 and thus, if MD5 were used to determine redeployment, would result in nearly all sources being marked as needing to be redeployed.

I thought I might be able to MD5-DIGEST the r-code, but the digest changes even across identical compile runs.

Any other ideas?

Posted by Thomas Mercer-Hursh on 23-Jul-2011 16:23

One of the conspicuous downsides of widely used include files is that any change to the include file is going to impact the r-code of any compile unit which includes it.  Sort of goes with the territory ... since the r-code is, in fact, different.  And, as you have noticed, the r-code has things like date and time information in it which also makes it an unreliable source.  I'm not sure why, though, the MD5 of an unchanged source would change if the only change was in an include.  The r-code yes.  The LISTING yes.  But, why unmodified source?  Or, is that not what you are saying?

Posted by Stefan Drissen on 23-Jul-2011 16:32

We have a mapping of all our tables / fields to numbers (which are used in multiple system functions, among others to determine which fields are shown on a search screen, entry screen or report), for example:

&GLOBAL-DEFINE TABLE-DEBTOR   900001
&GLOBAL-DEFINE TABLE-CREDITOR 900002
&GLOBAL-DEFINE FIELD-DEBTOR   1234
&GLOBAL-DEFINE FIELD-CREDITOR 1235
&GLOBAL-DEFINE FIELD-NAME     1236

This include is then used by both debtor and creditor business logic. Debtor business logic only uses FIELD-DEBTOR and FIELD-NAME, whereas creditor business logic only uses FIELD-CREDITOR and FIELD-NAME.

If we then decide to add item business logic, additional defines would be added:

&GLOBAL-DEFINE TABLE-ITEM 900003
&GLOBAL-DEFINE FIELD-ITEM 1237

This change to the include, although not used by either debtor or creditor business logic, would result in a different MD5-VALUE for both debtor and creditor business logic. This makes MD5-VALUE unusable for us for detecting a real change in r-code.

An alternative could be to preprocess the file before compiling, strip out all comments and white space, MD5-DIGEST the result and compare it with the previous compile's. It would have been nice if MD5-VALUE had been a bit smarter.
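
Per source that would look something like this (a sketch - file names made up, and the actual stripping step is left out):

DEFINE VARIABLE lcPre    AS LONGCHAR NO-UNDO.
DEFINE VARIABLE rThisRun AS RAW      NO-UNDO.
DEFINE VARIABLE rLastRun AS RAW      NO-UNDO.

/* write the fully preprocessed source (includes expanded, defines resolved) */
COMPILE src/debtor.p PREPROCESS "tmp/debtor.pre".
COPY-LOB FROM FILE "tmp/debtor.pre" TO lcPre.

/* strip comments and white space from lcPre here, then hash */
rThisRun = MD5-DIGEST(lcPre).

/* rLastRun would be read from wherever the previous run's digests are stored */
IF rThisRun <> rLastRun THEN
    MESSAGE "debtor.p needs a recompile / redeploy".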

Posted by Thomas Mercer-Hursh on 23-Jul-2011 16:41

Suffice it to say that I would do what you have done ... but, be that as it may, I still don't get why, if you have debtprocessabc.p which includes mycodes.i and you make a change to mycodes.i but no change to the source in debtprocessabc.p, the MD5 of debtprocessabc.p is going to change. The r-code, sure ... it isn't usable anyway. The LISTING, sure, because the include is there too. But why the unmodified source file? Or do I get you wrong?

Posted by Stefan Drissen on 23-Jul-2011 16:52

Aha, I see the confusion. The idea was to use MD5-VALUE to determine if a certain r-code needs to be redeployed (without needing any xref checking and without relying on file dates). I had hoped that this would be a magic bullet which could simply indicate that the resulting r-code really is different.

Our current setup uses the file date (and our human / xref checks, which will touch the file date if required) to determine what to recompile. The whole file date thing came under fire with the advent of TFS, which has no notion of file date when getting all latest sources (all files get today as their file date).

But... I think you are raising the option to MD5-DIGEST the source file? Which is what I was also suggesting, but with the added bonus of preprocessing (and stripping) the source file, so that any real change to the source code (or its include tree) would trigger a redeploy. I had only hoped that MD5-VALUE (on the RCODE-INFO handle) would already do this.

Posted by Thomas Mercer-Hursh on 23-Jul-2011 17:13

To really tell if there was a meaningful change, one could do an MD5 on the export of the AST from Proparse!

BTW, believe me that I understand the problem.  Something like a million lines or more of my former ERP came from a program generation tool called Specification-Driven Development.   Any change, even cosmetic, to one of the template files was likely to impact thousands of source code files, even though the specifications for that file had not changed.  Of course, often the change was not cosmetic, and one really did want to deploy all those files because one had actually made a functional change.  But, it made any usual form of version control into nonsense since line by line comparisons were showing real differences, but ones that came from the template, not any actual change to the specifications or hand-written code for that function.

Posted by Stefan Drissen on 23-Jul-2011 17:25

Hmm... interesting idea. I have looked at Proparse a few times in the past, with source formatting as the main goal, and was somewhat hindered by includes being included in the result tree, but in this case that could be really useful.

I think I will, however, skip the overhead of creating the result tree (and any other issues) and simply preprocess and strip. The stripped result can then be used for compiling, reducing compile-time overhead.

Pfff... just need to find a nice place to store the MD5-DIGESTS.

Posted by Thomas Mercer-Hursh on 23-Jul-2011 17:32

I plan on doing some work with Proparse scripting soonish ... seem to have a hard time getting it to the top of the pile.  One of the possible projects would be formatting.

But, you are right that Proparse works on the full compile unit, so one would have to do something to exclude the includes for something like this. It might also not be what you want if you make a change to a preprocessor definition which doesn't currently impact how a given source file compiles, but which is, nevertheless, a change in the source.

Somehow, it seems like the source code control should be giving you what you want without having to resort to this external testing.

Posted by Admin on 23-Jul-2011 23:56

The GENERATE-MD5 idea intrigued me. A quick test with it unfortunately shows that any change (even irrelevant changes) to the source file will result in a different MD5.

Sure - the compiler cannot understand what's irrelevant and what's not.

I thought I might be able to MD5-DIGEST the r-code, but the digest changes even across identical compile runs.

That doesn't surprise me much. The compiler's GENERATE-MD5 option skips things like the r-code header, which an external MD5 routine hashing the whole .r file cannot distinguish. So I'm not surprised that that returns a lot of false positives.

Any other ideas?

For the short term: stay on track with the time stamps (returned from tf properties or the .NET API) and exclude this one include, like you probably did in the past.

Posted by Admin on 24-Jul-2011 00:01

To really tell if there was a meaningful change, one could do an MD5 on the export of the AST from Proparse!

Might work, as the AST from Proparse does not contain any preprocessor code.

Posted by Stefan Drissen on 24-Jul-2011 09:09

mikefe wrote:

The GENERATE-MD5 idea intrigued me. A quick test with it unfortunately shows that any change (even irrelevant changes) to the source file will result in a different MD5.

Sure - the compiler cannot understand what's irrelevant and what's not.

It should be able to, depending on how you define irrelevant. In my case irrelevant meant adding unused global defines to an include file. Even simply adding a space to a comment changes the MD5-VALUE. The compiler really should be able to understand that this is irrelevant; instead it's doing a quick-and-dirty MD5 hash on all files included in the compile.

But no matter, I'll compare the MD5-DIGEST of the preprocessed, stripped source with the previous run's.

Posted by Admin on 24-Jul-2011 09:41

On a general note, other things are also important for r-code generation:

DB schema, ldbname

-NL

font settings, frame-layout-dependent parameters

...

That's why the compiler's MD5 value may report more changes than the source code alone would suggest (as I said, it's a generic approach, without your assumption that the DB changes only once a year).

But you are right, adding or removing spaces in a comment is not a relevant change. Maybe the r-code also contains timestamps of the source?

Posted by Thomas Mercer-Hursh on 24-Jul-2011 11:56

While it is hard to see that a space in a comment would be relevant, I can also see that different is different, and that once one goes down the road of reporting different as same, the results may not be as intended. In the case of CRC, we have come a long way in relaxing the requirements, but that is a context in which there is a very specific functional requirement, i.e., is it safe to run. In this context, I don't know that there is universal agreement about what is relevant.

This thread is closed