Originally posted by NickFitz
What you suggested is that the OS ought to have the responsibility for performing a real-time analysis of resource consumption (including processor cycles) and step in if an app behaves badly. What OS does that? And if desktop systems, with no worries about battery life and minimal resources such as memory, don't accept responsibility for policing shoddily-written software, why should mobile devices be expected to do so? It's more sensible to simply not allow such garbage on the system in the first place.
If the issue is not allowing any technology on a device that could be detrimental, then you can't allow anything that can be programmed or scripted in any way, which would exclude JavaScript and HTML5. If use of the runtime drawing capabilities of HTML5 becomes commonplace, then there will be loads of inefficient, "shoddy" websites sapping battery life and grinding browsers to a halt instead. Except, of course, that in comparison to the Flash equivalent, the download will be 10x as large, and more CPU and battery power will be used compiling the JavaScript on the client.
However, Flash runs grossly inefficiently on OS X, even though Apple have been trying to get Adobe to do something about it for years, and it is indeed the Flash implementation itself that is a resource hog.
Until you can prove that Flash doesn't use the native graphics capabilities of the OS, which would be the only sane approach to take, it seems reasonable to assume that it does.
After all, it can't use architecture-independent bitmaps without at some point translating them into the native device-independent bitmap structure of the platform on which it's running, which can then translate them to device-dependent bitmaps during final rendering. Why would they want the performance hit of adding that extra layer of abstraction, and an additional step translating from their internal structures to the platform's own structures, rather than taking advantage of the platform's own graphics capabilities, which exist for that very purpose?
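That translation step is just a pixel-format conversion. As a minimal sketch (the format names and function are illustrative, not Flash's actual internals), here is what converting an architecture-independent RGBA bitmap into a platform's native BGRA byte order looks like:

```javascript
// Hypothetical sketch: convert an RGBA pixel buffer into BGRA byte
// order before handing it to the OS. Each pixel is 4 bytes.
function rgbaToBgra(src) {
  const dst = new Uint8Array(src.length);
  for (let i = 0; i < src.length; i += 4) {
    dst[i]     = src[i + 2]; // B
    dst[i + 1] = src[i + 1]; // G
    dst[i + 2] = src[i];     // R
    dst[i + 3] = src[i + 3]; // A
  }
  return dst;
}

// One red, fully opaque pixel: RGBA = [255, 0, 0, 255]
const bgra = rgbaToBgra(new Uint8Array([255, 0, 0, 255]));
console.log(Array.from(bgra)); // → [0, 0, 255, 255]
```

The point of the argument is that this per-pixel pass over the whole frame is exactly the overhead you'd avoid by rendering into the platform's native format in the first place.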
It uses the format of the output device as a memory bitmap and blits that to the screen, that being the fastest way. There are only a few formats it needs to support, and probably only one for phones. Again, you're talking about a single OS call at the end of a lot of rendering code. If it's slower on OS X than on Windows (which I don't believe), that can only be because OS X doesn't do the blit as fast or as efficiently as Windows can with DirectX, because everything else will be the same.

BTW, I've no great love for Flash. I'd much rather see a new, open, non-proprietary, efficient binary format for graphics, animation and interactive applications: one supported as part of the browser rather than through the nasty old plugin interface, that uses hardware acceleration as much as possible, that's as small and fast as possible, and that wouldn't get mired in all these politics. But we don't have one.
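To make the "render to a memory bitmap, then blit" model concrete, here is a minimal sketch. All names and the 4-bytes-per-pixel layout are illustrative assumptions, not anyone's actual implementation; the renderer is imagined to have already drawn into `src`, and `blit` copies it into a framebuffer at (dx, dy):

```javascript
// Hypothetical blit: copy a srcW x srcH pixel rectangle into a
// framebuffer of width dstW at position (dx, dy). 4 bytes per pixel.
function blit(src, srcW, srcH, dst, dstW, dx, dy) {
  const BPP = 4; // bytes per pixel (e.g. BGRA)
  for (let y = 0; y < srcH; y++) {
    const srcOff = y * srcW * BPP;
    const dstOff = ((dy + y) * dstW + dx) * BPP;
    // Copy one full scanline at a time; this bulk memory transfer is
    // the cheap final step a real blit hands off to the OS or GPU.
    dst.set(src.subarray(srcOff, srcOff + srcW * BPP), dstOff);
  }
}

// Usage: blit a 2x2 bitmap into a 4x4 framebuffer at (1, 1).
const src = new Uint8Array(2 * 2 * 4).fill(1);
const dst = new Uint8Array(4 * 4 * 4);
blit(src, 2, 2, dst, 4, 1, 1);
```

The copy itself is trivial compared with the rendering that fills `src`, which is the argument above: if two platforms differ in speed, it's unlikely to be this last step that explains it unless the platform's blit path itself is slower.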
