I think I am beginning to understand what you are talking about, but you are not really analogising enough for those not deeply involved to truly grasp the concept.
I'm not being facetious here - I mean it in the way that Asian thought views the universe as a self-organising web of inter-related things, which become more and more inter-related the closer you examine them (and hence tend towards quantum mechanics, the most inter-related level you can get).
I'm applying the same to software - make it self-organising and inter-related down to a tiny level. This is obviously the first step on the path, because a working quantum computer would most definitely need to be self-organising in order to maintain coherence.
I am currently considering the concept of Intelligent Drivers.
Basically this grew as a result of my utter frustration that every time I buy new hardware I need a new "driver".
Why should I need a driver?
Why can't the PC "talk to the installed card", find out what it can do, and "learn" how to drive it?
The PC could learn to do this over time from its experience with other hardware, supplemented by pre-programmed hardware details.
In short, it works out what hardware is on the attached card and figures out what it can and can't do (obviously this is no minor problem) and learns how to "drive it".
Over further time it would learn how to drive it better and more efficiently.
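To make the idea concrete, here is a minimal sketch of a "learned driver". Everything in it is hypothetical: the `Device` and `LearnedDriver` classes and their fields are invented for illustration, and real hardware discovery would go through bus-level protocols (PCI configuration space, USB descriptors) rather than toy objects. The point is the shape: query the card for its capabilities, merge them with pre-programmed knowledge, then tune behaviour from observed performance.

```python
# Hypothetical sketch - not a real driver model.

class Device:
    """Stands in for an attached card that can report its capabilities."""
    def __init__(self, capabilities):
        self.capabilities = capabilities  # e.g. {"dma": True, "block_size": 4096}

class LearnedDriver:
    """Queries the device, then refines its driving strategy from experience."""
    def __init__(self, device, known_hardware=None):
        # Start from pre-programmed hardware details, if any.
        self.profile = dict(known_hardware or {})
        # Ask the card what it can do.
        self.profile.update(device.capabilities)
        self.timings = []  # observed transfer times, used to tune behaviour

    def choose_strategy(self):
        # Prefer DMA when the card advertises it; fall back to polled I/O.
        return "dma" if self.profile.get("dma") else "polled"

    def record_transfer(self, seconds):
        # The "learning over time": keep observations of how transfers went.
        self.timings.append(seconds)

    def batch_size(self):
        base = self.profile.get("block_size", 512)
        # Drive the card "better and more efficiently": grow the batch
        # once there is evidence that transfers are quick.
        if self.timings and sum(self.timings) / len(self.timings) < 0.01:
            return base * 2
        return base

card = Device({"dma": True, "block_size": 4096})
driver = LearnedDriver(card, known_hardware={"vendor": "ExampleCorp"})
print(driver.choose_strategy())  # dma
driver.record_transfer(0.005)
print(driver.batch_size())       # 8192
```

The heuristic here is deliberately crude; the real difficulty, as noted above, is working out what the hardware can and can't do in the first place.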
I suspect this is analogous to what you are talking about.
Basically there is the initial database of characteristics, software library "components", and basic, commonly known interface protocols like Ethernet.
No code beyond this initial library needs to be written.
The rest is effectively "learned" by the app and added to using heuristics which are different between apps.
Therefore one app could use a fast version of the same functional code, whereas another could use a more maintainable version for wide compatibility.
Am I close?
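The per-app heuristic idea above can be sketched in a few lines. The implementation names and the scoring rules are invented for illustration; the point is that the library of components stays fixed while each app supplies its own heuristic for choosing among functionally equivalent versions.

```python
# Hypothetical component library: two implementations of the same function,
# with made-up quality scores.
implementations = {
    "fast":         {"speed": 9, "portability": 3},
    "maintainable": {"speed": 4, "portability": 9},
}

def pick(heuristic):
    """Each app supplies its own heuristic; the library is shared."""
    return max(implementations, key=lambda name: heuristic(implementations[name]))

# A latency-sensitive app weights speed heavily...
game = pick(lambda impl: 3 * impl["speed"] + impl["portability"])
# ...while a widely-deployed app weights portability instead.
utility = pick(lambda impl: impl["speed"] + 3 * impl["portability"])

print(game, utility)  # fast maintainable
```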
Perhaps another tack - its effects on code. Code written for Tornado is utterly reusable; in fact, it's very hard to make it not reusable. It's also automatically distributed - across threads, processors and machines - but you need not care about that in your code beyond making it thread-safe. You also need not, and should not, care about anything outside your component - not even its API. You can use any style of solving the problem you feel like, e.g. writing it functionally in Haskell (which I always find "fun"), and it will have absolutely no impact on any other code, because in fact it cannot.
There is also no efficiency penalty for this, because you can use freeform data structures - data can be in any format whatsoever. Issues such as endianness have been made obsolete because that's taken care of for you (so Motorola and Intel now work together seamlessly). Transport of data from A to B, whether between threads or across the planet, is no longer your concern.
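To show what "taken care of for you" can mean in practice, here is the standard trick a transport layer uses: serialise to one fixed byte order at the boundary, so a big-endian (Motorola) and a little-endian (Intel) host exchange identical bytes. This is just a stdlib sketch of the general technique, not Tornado's actual mechanism.

```python
import struct

def pack_record(a, b):
    # "!" means network (big-endian) byte order, regardless of host CPU,
    # for an unsigned 32-bit int followed by an unsigned 16-bit int.
    return struct.pack("!IH", a, b)

def unpack_record(data):
    return struct.unpack("!IH", data)

wire = pack_record(0xDEADBEEF, 42)
print(unpack_record(wire))  # (3735928559, 42)
```

Because both ends agree on the wire format, neither end's native endianness ever leaks into application code.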
I suppose one could look at Tornado as the most flexible and efficient framework you can have - like .NET, just without a lot of the manual work. No more common internet controls, for example, and no more differentiation between a keyboard and a printer.
I also made RAM obsolete - there is no longer the concept of loading a program or file, or indeed of having a program running. RAM is now a cache of the hard drive - which is where it was heading anyway with virtual memory; I just take it all the way.
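The nearest mainstream analogue of "RAM as a cache of the hard drive" is memory-mapped files: the OS pages file contents in and out on demand, so there is no explicit load or save step - writing to memory is writing to the file. A stdlib-only sketch (file name and sizes are arbitrary):

```python
import mmap
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "persistent.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)          # reserve one page on disk

with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 0) as mem:
        mem[0:5] = b"hello"          # writing memory updates the file

with open(path, "rb") as f:
    print(f.read(5))                 # b'hello'
```

Taking it "all the way", as described above, would mean every piece of program state living in such a mapping, so that running and stored programs are the same thing.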
Basically, it's a massive simplification of using a computer - there are about five core rules of use, and everything follows logically from them. So the learning curve is very low - just learn the rules, and the complexity and power follow from combinations of them.
That any better?
Cheers,
Niall