
.NET and destructors!!!!!!!!!!!!!


    #21
    Originally posted by AtW
    MSCOREE is in system32 dir - looks like .NET 2.0 replaces previous version.
    Absolutely correct... and I feel this is the flaw. Although backward compatibility is promised, we all know how this can lead to all sorts of bugs. It seems odd as the rest of the side-by-side versioning paradigm is designed to explicitly circumvent this very situation.

    Originally posted by AtW
    There is no reason why GC should change its behavior due to ThreadPool having an increased number of threads that it can use, yet I traced the issue with 100% replicability to usage of that thread unlocking code.
    It's not the increase in thread pool size, it's the unmanaged code (a COM dll) that is used to do it... I'm presuming you don't see this when using the new function call in .NET 2?


    I have classes that contain collections as class variables that contain references to other objects - I am pretty heavily object oriented nowadays.

    The truth is that in some cases GC.Collect() will clear lots of memory, and if you are not using it then you will face more memory being used
    Seems to back up the theory about nested references in instance reference types - the GC will eventually get round to deallocating memory when the BASE reference is released.
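
    To illustrate (a contrived sketch - the classes are hypothetical, just to show the pattern):

    using System;
    using System.Collections;

    class Node
    {
        public ArrayList Children = new ArrayList();    // nested references to other objects
        public byte[] Payload = new byte[1024 * 1024];  // something worth reclaiming
    }

    class Demo
    {
        static void Main()
        {
            Node root = new Node();
            for (int i = 0; i < 100; i++)
                root.Children.Add(new Node());  // the whole graph is reachable via 'root'

            GC.Collect();   // reclaims nothing - the BASE reference is still held

            root = null;    // release the BASE reference...
            GC.Collect();   // ...and now the entire graph is collectable
        }
    }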

    Originally posted by AtW
    the guy who says don't use GC.Collect() at all is seriously wrong - there are cases when usage of GC.Collect() is very beneficial and since he is supposed to know such cases better than me, he should have documented that.
    To be fair he does say that there are occasions where it is applicable, but it seems that we are all trying to find out exactly what is happening 'under the hood', and in what real life circumstances certain behaviour will be exhibited...


    Originally posted by AtW
    And certainly they should have made sure that the programmer has complete control over GC - if I say collect all garbage then it should do just that - collect it.
    It seems it will behave that way as long as it cannot find a reason not to... e.g. a reference still held somewhere in the object graph that blocks collection. It seems a tricky one, as bypassing that means organising classes in a specific way that does not follow some commonly used OO practices... and in your case Alexei you have hit the proverbial 'sound barrier' where what works OK under normal circumstances falls apart when a certain threshold is reached.
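
    A contrived example of such a held reference (hypothetical classes - an event list is one of those commonly used OO practices that quietly blocks collection):

    using System;

    class Publisher
    {
        public event EventHandler Tick;   // the event list holds a reference to every subscriber
    }

    class Subscriber
    {
        public Subscriber(Publisher p) { p.Tick += OnTick; }
        void OnTick(object sender, EventArgs e) { }
    }

    class Demo
    {
        static Publisher publisher = new Publisher();   // long-lived root

        static void Main()
        {
            Subscriber s = new Subscriber(publisher);
            s = null;       // our reference is gone...
            GC.Collect();   // ...but the publisher's event list still holds one,
                            // so the Subscriber is NOT collected
        }
    }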
    Vieze Oude Man



      #22
      Originally posted by mcquiggd
      Absolutely correct... and I feel this is the flaw. Although backward compatibility is promised, we all know how this can lead to all sorts of bugs.
      What I think happened is that .NET 2.0 became stricter about classes with destructors - even though I was running 1.1 code, it seems to me that this COM call to a DLL that actually belongs to .NET 2.0 somehow triggered the behaviour of the 2.0 GC. It was a really bizarre situation, and looking on the net I could not find many people experiencing it - perhaps because they use .NET for toy programs; mine is a very long running process crunching lots of data, so memory issues are important.
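
      The mechanics, as I understand it, is that a destructor always costs an extra collection cycle - a minimal sketch (the class is made up for illustration):

      using System;

      class Cruncher : IDisposable
      {
          // A destructor means the object survives its first collection: it is
          // parked on the finalization queue, the finalizer thread runs
          // ~Cruncher(), and only a LATER collection reclaims the memory.
          ~Cruncher() { Release(); }

          public void Dispose()
          {
              Release();
              GC.SuppressFinalize(this);   // skip the queue - dies on the first collection
          }

          void Release() { /* free whatever the destructor was guarding */ }
      }

      // Fully reclaiming finalizable garbage therefore takes two passes:
      //   GC.Collect();
      //   GC.WaitForPendingFinalizers();
      //   GC.Collect();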

      Originally posted by mcquiggd
      It's not the increase in thread pool size, it's the unmanaged code (a COM dll) that is used to do it... I'm presuming you don't see this when using the new function call in .NET 2?
      That's the thing - it appears that this crap happens in .NET 2.0 as well, but if you don't have .NET 2.0 at all (the MSCOREE dll is 1.1) then all is fine - so clearly something from 2.0 messes things up. It's real BS, as the side effect of having a destructor should not be THAT bad.

      Originally posted by mcquiggd
      in your case Alexei you have hit the proverbial 'sound barrier' where what works OK under normal circumstances falls apart when a certain threshold is reached.
      This is true - I certainly push .NET to the limit, and to be fair it did a pretty good job; they just should have taken a few more steps to make it more suitable for serious long running processes. I was about to say "I bet Microsoft did not code their search engine in .NET", but then I changed my mind - I've got one entity that needs my donation already.



        #23
        I need to have many objects that will self-destruct when needed.
        If only it were that simple. Mine always seem to self-destruct approximately 2 nanoseconds after I have created them.



          #24
          Originally posted by AtW
          I was about to say "I bet Microsoft did not code their search engine in .NET", but then I changed my mind - I've got one entity that needs my donation already


          As for the .NET side-by-side issue and the odd behaviour, that seems to be down to which version of the runtime is actually used by an assembly - there is all sorts of confusion about this. My understanding is that the default behaviour changed with .NET 2: previously, if you built an assembly in .NET 1.1 but, say, only had the .NET 1.0 runtime installed, it would raise an exception and quit. Now, if it can't find the 1.1 runtime it defaults to loading the 2.0 runtime and doesn't actually notify you. Perhaps if you specify the version in the config file this might revert to the previous behaviour? I'm not sure...
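
          The element I mean is supportedRuntime, in the exe's .config - something like this (the version strings are the standard 1.1 and 2.0 monikers):

          <configuration>
            <startup>
              <!-- the loader tries these in order -->
              <supportedRuntime version="v1.1.4322" />
              <supportedRuntime version="v2.0.50727" />
            </startup>
          </configuration>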
          Vieze Oude Man



            #25
            Originally posted by mcquiggd
            Perhaps if you specify the version in the config file this might revert to previous behavior? Im not sure...
            The decision about which runtime should be used is made before my code actually gets a chance to execute - the COM call that I make happens well after the init stage has passed, so there is no way it would result in .NET aborting execution of the 1.1 machine and switching to 2.0. In fact, I suspect that this MSCOREE may have actually been installed (replacing the 1.1 version of it) without .NET 2.0 being present - not sure, but the definite things that I found were:

            1) using .NET 1.1 (SP1) with the COM call to increase threads is okay - GC is working fine - but you need MSCOREE to be version 1.1

            2) as above, but with MSCOREE at version 2.0, GC goes nuts and won't release classes with destructors - after commenting these out all was okay

            3) .NET 2.0 seems to have the same effect as in 2) - even though the COM call there is not necessary, since they allow a direct call to ThreadPool to increase the number of threads used.

            What really pissed me off is that whoever designed ThreadPool implemented the SetMinThreads function, but did not think about SetMaxThreads - it's like, if you have a MIN function, then you have to have the MAX version of it!
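
            For the record, the 2.0 managed calls look like this, so at least no COM grovelling is needed there (a quick sketch):

            using System.Threading;

            class PoolSetup
            {
                static void Main()
                {
                    int workers, io;
                    ThreadPool.GetMaxThreads(out workers, out io);  // current ceilings

                    ThreadPool.SetMinThreads(50, io);    // threads kept warm, no spin-up delay
                    ThreadPool.SetMaxThreads(200, io);   // the MAX version 1.1 never had;
                                                         // returns false if the change is refused
                }
            }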

            And don't get me started on whoever implemented HttpWebRequest



              #26
              Actually I meant that it might revert to the 'I can't find the right runtime, so I'll raise an exception' behaviour... if this is the case, it would prove / disprove whether ALL your assemblies are being loaded into the 1.1 CLR or the 2.0 CLR... or even switching between them... this might explain the odd behaviour you have detailed...

              Anyway, congratulations on the progress with SKA to date

              David
              Vieze Oude Man



                #27
                Thanks David, there is still loads more work to do - there would have been less if Microsoft had used more of .NET for real big long running apps



                  #28
                  SKA is a bit out of the ordinary in terms of the potential lifetime of objects, I suppose... any possibility of refactoring to limit the depth of references held / the size of objects maintained at field level (which seem to linger), so the more 'transient' objects only exist at method level? As it's a distributed app, perhaps even 'chunking' the calls and saving intermediate data to Protected Storage (initially LosFormatter to serialise your objects to XML, as it's quick and you don't really need to mind if the data isn't optimally sized at that point in time), then periodically uploading (en masse) to the central servers using something like ZLib compression on the XML (smallish overhead and it really compacts the XML) would help...? Or is it server side code that is the biggest issue...?
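
                  Roughly what I have in mind, purely as a sketch (GZipStream standing in for a ZLib library, and the chunk object is hypothetical):

                  using System.IO;
                  using System.IO.Compression;
                  using System.Web.UI;

                  class ChunkPacker
                  {
                      // Serialise an intermediate result and compress it for a later bulk upload.
                      static byte[] Pack(object chunk)
                      {
                          StringWriter text = new StringWriter();
                          new LosFormatter().Serialize(text, chunk);   // quick, no need to be optimal here

                          MemoryStream packed = new MemoryStream();
                          using (StreamWriter gz = new StreamWriter(
                                     new GZipStream(packed, CompressionMode.Compress)))
                          {
                              gz.Write(text.ToString());               // really compacts the text
                          }
                          return packed.ToArray();                     // ready to store, upload en masse later
                      }
                  }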

                  I know some of what I am saying is perhaps stating the obvious.... but it's interesting to find out what techniques would work with an app such as SKA (can't drop the name, I know it's Majesty12), as it leads to a greater understanding of what is actually happening in more modestly scaled systems and might affect my approach in general...


                  By the way, wish me luck - second interview tomorrow for a permanent job at a local asset and investment management company using .NET 2, about 300 yards from where I live... could be the elusive foothold into the finance domain I've been seeking...
                  Last edited by mcquiggd; 2 April 2006, 18:02.
                  Vieze Oude Man



                    #29
                    Don't mention XML serialisation - I use it for configs and started using it for distributed query results, and guess what? The bl00dy XML serialiser involves a LOT of effort and memory usage when it's created, and the memory was not released back! Effectively a leak, with Microsoft recommending using a static instance of it, which is fine, but ffs, someone really needs to compile a list of all the bits in .NET that cause leaks and publish them. I will do it when I get time.
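
                    The workaround, for anyone hitting the same thing - build it once and reuse it (a sketch; QueryResult is just a stand-in type):

                    using System.IO;
                    using System.Xml.Serialization;

                    public class QueryResult { public string Data; }

                    class ResultWriter
                    {
                        // Each construction can emit a dynamic assembly that is never
                        // unloaded (the plain (Type) overload caches internally, the
                        // fancier overloads do not) - hence the static instance.
                        static readonly XmlSerializer serializer =
                            new XmlSerializer(typeof(QueryResult));

                        public static void Write(TextWriter output, QueryResult result)
                        {
                            serializer.Serialize(output, result);
                        }
                    }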

                    The client was the biggest issue, as it runs in different environments including Mono on Linux (a very poor non-generational GC there) - oh well, I suppose without these issues there would not be a sense of achievement.

                    Good luck on the interview!



                      #30
                      Cheers Alexei (hope I am spelling that correctly)...

                      The LosFormatter is pretty good - it's used extensively within the Framework classes and is fairly low cost performance-wise (apparently the built-in types have their own specialised implementations of such serialisation). I would imagine that MS have made it pretty watertight, or else even normal applications would be falling over, as it would be difficult to avoid using it 'under the bonnet', as such.
                      Vieze Oude Man

