Chase Goldman Posted January 17, 2018
This change really bugs me. It seems like we now have a setting that can really screw up performance if it is not set properly, and we really have not been provided with the tools to determine the optimal setting. It's kind of like being told to manually set ground balance based on the height of a bar graph without any units shown. Is there any scientific method for setting this parameter consistently and optimally?
Chase Goldman Posted January 17, 2018
Now it's a guessing game for us. It would be better if you could enable or disable this feature (so the unit worked as before, or you could turn it on and tweak and play with it). I think I am going to stay away from 4.1 for a while until you and others get a handle on how to set it. I'll be watching your posts to see what you come up with in the comparison; in the meantime it's 4.0 for me.
Steve Herschbach Posted January 17, 2018
For those who are not sure what the discussion is about, version 4.1 adds a new function - Ground Sensitivity. Page 35 of the new 4.1 Owner's Manual:
Chase Goldman Posted January 22, 2018
Based on user testing, other reading, and quotes from XP folks like Gary Blackwell on other forums, I think I have pieced together what this Ground Sensitivity thing is doing. First of all, we need to take a look back at what XP was trying to accomplish with GB tracking. This is speculation, because XP does not discuss much about its secret sauce, but I believe that prior to the version 4 software, XP executed GB tracking as a time-based feature. In other words, while in tracking, the Deus simply measured the ground phase reading periodically (say every 1.0 seconds) and adjusted accordingly. This had the unfortunate drawback of causing the ground phase reading to change abruptly when you were swinging over a ferrous target that had a pronounced effect on the localized ground phase reading. That resulted in the dreaded disappearing target trick, wherein the target would simply disappear if you swung the coil over it enough times while in tracking mode, provided the target affected the local ground phase reading (e.g., a large iron target or hot rock).
To solve this problem, I believe that in version 4 XP decided to use the ground mineralization index as a trigger for initiating a ground phase measurement. Changes in subsequent mineralization index readings above a certain threshold would initiate a new ground phase reading. The problem with this approach was that in highly mineralized soils you could get large swings in the mineralization index, which could result in unstable ground phase readings.
In an effort to fix this issue, I believe, XP in version 4.1 implemented a user-adjustable Ground Sensitivity setting that determines how much of a relative change in the mineralization index needs to occur before the Deus takes a new ground phase reading for tracking purposes. If the sensitivity is set low (e.g., towards 1), then a really significant change in the mineralization index needs to occur before a ground phase reading is taken for tracking purposes (think of this as a large spike on the mineralization bar graph meter). If the sensitivity is set high (e.g., towards 10), then only small changes in the mineralization index need to occur before a ground phase reading is taken (think of only a small blip on the mineralization bar graph). So a 10 setting would be good if you wanted ground tracking on dry white sugar sand beaches with little mineralization and little change in mineralization, and a 1 setting might be good in Culpeper, which has high mineralization and large swings in the index.
What is unknown is what threshold change in mineralization is required for each sensitivity setting, whether the effect is linear, whether the magnitude of the mineralization index (and not just the change in the index) has any bearing on how you should set Ground Sensitivity (i.e., is it a percentage change from the baseline reading or an absolute change), and, finally, whether positive AND negative changes in the mineralization index are tracked, or only positive/increasing (or negative/decreasing) changes. We do know that if a change in the mineralization index above the threshold is not sensed for 7 seconds, the ground phase defaults to 88 (this might happen, too, if the coil is stationary for more than 7 seconds). From reading information from a variety of sites and forums, it seems, according to Gary Blackwell, that XP had the version 4 software set to an equivalent Ground Sensitivity setting of 10 (!).
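To make the speculated mechanism concrete, here is a minimal sketch of how a threshold-triggered tracker might behave. The function names, the threshold numbers, the linear sensitivity-to-threshold mapping, and the timeout handling are all my own illustrative assumptions for the sake of the example; none of this is from XP's actual firmware.

```python
# Hypothetical sketch of the speculated v4.1 tracking trigger logic.
# All names, thresholds, and the linear mapping are illustrative assumptions.

GROUND_PHASE_DEFAULT = 88   # reported fallback value after ~7 s with no trigger
TIMEOUT_SECONDS = 7.0

def threshold_for(ground_sensitivity: int) -> float:
    """Map the 1-10 Ground Sensitivity setting to the change in mineralization
    index required before a new ground phase reading is taken.
    A linear mapping is assumed purely for illustration; the real curve and
    units are unknown."""
    # Sensitivity 10 -> a tiny change triggers an update; sensitivity 1 -> a large change is needed.
    return 0.5 + (10 - ground_sensitivity) * 1.0

def track_ground(samples, ground_sensitivity=6):
    """samples: iterable of (time_s, mineralization_index, measured_phase) tuples.
    Returns the sequence of ground phase values the tracker would report."""
    threshold = threshold_for(ground_sensitivity)
    reported = []
    last_index = None
    last_update_time = None
    current_phase = GROUND_PHASE_DEFAULT

    for t, mineral_index, measured_phase in samples:
        if last_index is None or abs(mineral_index - last_index) >= threshold:
            # Change in mineralization index exceeded the threshold:
            # take a fresh ground phase reading for tracking.
            current_phase = measured_phase
            last_index = mineral_index
            last_update_time = t
        elif last_update_time is not None and (t - last_update_time) > TIMEOUT_SECONDS:
            # No qualifying change for ~7 s: fall back to the default phase.
            current_phase = GROUND_PHASE_DEFAULT
        reported.append(current_phase)
    return reported
```

In this sketch, a setting of 10 maps to a very small required change (so almost any wobble in the mineralization index triggers a fresh ground phase reading), while a setting of 1 requires a large jump before the tracker updates, which lines up with the sugar-sand-versus-Culpeper behavior described above.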
Also, Tn's testing has shown that, for some reason, the effect of the Ground Sensitivity setting seems to be less pronounced when using the HF coils. Strange... (Note that the HF coils are not "updated" when the version 4.1 update is applied to the Deus.) Bottom line: if you want to play it "safe" for most circumstances and soil mineralizations, the default sensitivity of 6 is a good start (anywhere between 5 and 7 should work).
stumpr Posted February 6, 2018
This is great info! I relic hunt in relatively non-mineralized ground (except for Culpeper), so playing it safe with the default setting of 6 seems like the way to go.
martygene Posted February 12, 2018
Here is Gary Blackwell explaining Ground Sensitivity in v4.1: