
Apple and Flash again


    #11
    Morning.

    Originally posted by NickFitz View Post
    iPhone OS is Unix.
    Really? I thought it didn't do multitasking. Even the UNIX I remember from university in 1989 did that.

    What you suggested is that the OS ought to have the responsibility for performing a real-time analysis of resource consumption (including processor cycles) and step in if an app behaves badly. What OS does that? And if desktop systems, with no worries about battery life and minimal resources such as memory, don't accept responsibility for policing shoddily-written software, why should mobile devices be expected to do so? It's more sensible to simply not allow such garbage on the system in the first place.
    Perhaps, but the platform used to create the software in that case is irrelevant. It's also possible to write shoddy software in Objective-C - should Apple ban that? If the approach is to police the quality of the software (as they're doing with the App Store), then there doesn't need to be any issue over what it's written in, and they may as well allow Flash apps and judge the end results as they would something written in Objective-C.

    If the issue is not allowing any technology on a device that could be detrimental, then you can't allow anything that can be programmed or scripted in any way, which would exclude JavaScript and HTML5. If the use of the runtime drawing capabilities of HTML5 becomes commonplace, then there are going to be loads of inefficient, "shoddy" websites all sapping battery life and grinding browsers to a halt instead. Except of course, in comparison to the Flash equivalent, the download will be 10x as large, and more CPU and battery power will be used in compiling the JavaScript on the client.
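    The "runtime drawing capabilities of HTML5" here means the canvas API, where every frame is produced by script. A minimal sketch of what that per-frame work looks like (the particle model and function names are illustrative, not taken from any real example):

    ```javascript
    // Sketch of script-driven canvas animation: every frame, JavaScript
    // recomputes state and asks the 2D context to redraw from scratch.
    // The per-frame bookkeeping all runs on the CPU in script.

    // Pure update step (illustrative particle model).
    function updateParticles(particles, dt) {
      return particles.map(p => ({
        x: p.x + p.vx * dt,
        y: p.y + p.vy * dt,
        vx: p.vx,
        vy: p.vy + 9.8 * dt, // simple gravity
      }));
    }

    // Browser-only render loop using requestAnimationFrame and the
    // canvas 2D API:
    function animate(ctx, particles, last) {
      const now = performance.now();
      particles = updateParticles(particles, (now - last) / 1000);
      ctx.clearRect(0, 0, ctx.canvas.width, ctx.canvas.height);
      for (const p of particles) ctx.fillRect(p.x, p.y, 2, 2);
      requestAnimationFrame(() => animate(ctx, particles, now));
    }
    ```

    Nothing here is cheaper than the equivalent Flash ActionScript; it's the same kind of loop, just delivered as source that the browser has to parse and compile first.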

    However, Flash runs grossly inefficiently on OS X, even though Apple have been trying to get Adobe to do something about it for years, and it is indeed the Flash implementation itself that is a resource hog.
    Well, you keep saying that, but I can't find any technical reason to back that up (video stretching aside). As a Mac is a PC, as far as hardware goes, if one implementation really is so different from the other then that can only be down to the OS or its API. So either OSX is slow and clunky, or this issue is just anti-Adobe propaganda with no basis in fact. I think the latter, but perhaps I'm wrong.

    Until you can prove that Flash doesn't use the native graphics capabilities of the OS, which would be the only sane approach to take, it seems reasonable to assume that it does.
    Actually the only sane approach is to not use the native graphics capabilities. That's what gives Flash its platform independence: because it has its own renderer, it can look the same everywhere. Which is where HTML5 rendering potentially could be different, and could use some kind of hardware vector rendering if devices supported it. But I'd be surprised if they do. Running that exploding video example used some 60% of my CPU (dual core 2.1GHz) in Chrome. That's not being done by the hardware; that's a software renderer, I'm sure. If you did the same thing in Flash, I imagine it'd use about the same amount of CPU and look exactly the same.
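    To make "software renderer" concrete: a purely CPU-driven rasteriser touches every pixel in code itself, writing into a raw RGBA buffer, with no GPU involvement until the finished frame is copied out. A sketch of the idea (illustrative only; this is not Flash's or any browser's actual rendering code):

    ```javascript
    // Software rasterisation sketch: the CPU writes every pixel of an
    // RGBA framebuffer itself. Repeating this for a full screen, every
    // frame, is why a pure software path shows up as high CPU usage.
    function fillRect(buf, width, x, y, w, h, r, g, b) {
      for (let row = y; row < y + h; row++) {
        for (let col = x; col < x + w; col++) {
          const i = (row * width + col) * 4; // 4 bytes per RGBA pixel
          buf[i] = r; buf[i + 1] = g; buf[i + 2] = b; buf[i + 3] = 255;
        }
      }
    }

    const width = 8, height = 8;
    const framebuffer = new Uint8ClampedArray(width * height * 4);
    fillRect(framebuffer, width, 2, 2, 4, 4, 255, 0, 0); // red square
    ```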

    After all, it can't use architecture-independent bitmaps without at some point translating them into the native device-independent bitmap structure of the platform on which it's running, which can then translate them to device-dependent bitmaps during final rendering; why would they want the performance hit of adding that extra layer of abstraction, and an additional step translating from their internal structures to the platform's own structures, rather than taking advantage of the platform's own graphics capabilities, which exist for that very purpose?
    What? It uses the format of the output device as a memory bitmap and blits that to the screen, that being the fastest way. There are only a few formats it needs to support, and probably only one for phones. Again, you're talking about the one single OS call at the end of a lot of rendering code. If it's slower on OSX than Windows (which I don't believe), then that can only be because OSX doesn't do the blit as fast or as efficiently as Windows can with DirectX, because everything else will be the same.
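    The "one single OS call at the end" is the classic double-buffering pattern: render everything into an off-screen buffer in the device's pixel format, then hand the finished frame to the platform in a single copy. A hedged sketch (function names illustrative; on an HTML5 canvas the equivalent final call would be putImageData):

    ```javascript
    // Double-buffered rendering sketch: all drawing goes to a back
    // buffer owned by the renderer; the only per-frame call handed to
    // the platform is one bulk copy (the blit) to the front buffer.
    function blit(front, back) {
      front.set(back); // single bulk copy - the "one OS call" step
    }

    const W = 4, H = 4;
    const back = new Uint8ClampedArray(W * H * 4);
    const front = new Uint8ClampedArray(W * H * 4);

    back.fill(128);    // stand-in for a software-rendered frame
    blit(front, back); // one call pushes the finished frame out
    ```

    On this model, the platform's graphics API only ever sees the final copy, which is why the renderer's own speed, not the OS, dominates the cost of a frame.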

    BTW I've no great love for Flash. I'd much rather see some new open non-proprietary efficient binary format for graphics, animation and interactive applications, that's supported as part of the browser rather than by the nasty old plugin interface, that uses hardware acceleration as much as possible, that's as small and fast as possible, and that wouldn't get mired in all these politics. But we don't have one.
    Will work inside IR35. Or for food.
