Previously on "The road to hell is in fixed point binary"
I found an old notebook recently with a bunch of scribbled notes and diagrams to do with rendering polygons on a bitmapped screen, probably written in the pub twenty-odd years ago. Completely useless now
Not useless at all. I resurrected some of my old bitmapped screen handling stuff recently on the Raspberry Pi - most enjoyable.
There was a brief window where our pointless old knowledge was once again valuable in the mobile world, but then mobile chips doubled in power every generation and are now on a par with PCs.
I felt like that when I got a fully-featured S-Buffer implementation working (dispenses with the need for a Z-buffer). I read a little about the idea in the legendary Computer Graphics: Principles and Practice, and aged 13 or so it was the first time I ever sat down and properly worked out an algorithm on paper, and then implemented it.
This was about the time everyone was switching to Direct3D so my 3D engine with inner-loop hyper-optimisation never actually got used. But I was still dead chuffed.
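For anyone curious what that looks like, here is a much-simplified sketch of the span-buffer idea - my own reconstruction, not the poster's engine. It assumes polygons are rasterised into per-scanline spans and submitted front to back (say, from a BSP traversal), so each scanline just keeps a sorted list of already-covered intervals and a new span only draws the parts not yet covered; no per-pixel depth compare, hence no Z-buffer. A real S-buffer also interpolates depth and splits spans where surfaces cross, and the lists would be reset every frame.

[code]
#include <stdlib.h>

typedef struct Span {
    int x0, x1;               /* half-open pixel interval [x0, x1)  */
    struct Span *next;        /* next span on this scanline, sorted by x0 */
} Span;

/* Draw the parts of [x0, x1) on this scanline that are still uncovered,
 * then record them as covered.  draw() stands in for the real span fill. */
void insert_span(Span **list, int x0, int x1, void (*draw)(int x0, int x1))
{
    Span **p = list;
    while (x0 < x1) {
        Span *s = *p;
        if (s == NULL || x1 <= s->x0) {
            /* wholly uncovered: draw it and link a new span in */
            Span *n = malloc(sizeof *n);
            n->x0 = x0; n->x1 = x1; n->next = s;
            *p = n;
            draw(x0, x1);
            return;
        }
        if (x0 < s->x0) {
            /* leading uncovered piece before the existing (nearer) span */
            Span *n = malloc(sizeof *n);
            n->x0 = x0; n->x1 = s->x0; n->next = s;
            *p = n;
            draw(x0, s->x0);
            p = &n->next;
        }
        /* skip whatever is hidden behind the existing span */
        if (x0 < s->x1) x0 = s->x1;
        p = &(*p)->next;
    }
}
[/code]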
Nice
I remember at least one game which did (what I assume is) the exact same trick. We were amazed when suddenly "real music" came out of the PC speaker, I think it maybe even did speech.
Yes, it was a bit annoying that the boss just said "Ah, cool hack" and did nothing more about it, only even mentioning it to potential clients as an afterthought. A few months later some other company was getting big mentions in ACE and The Games Machine for doing the same thing
So I certainly wasn't the only one to think of it, but I did think of it - inspired by a conversation in the pub with an old friend who was an electronics engineer, as I remember
(And, unlike some implementations, I kept count and called the original interrupt handler every so many times, so the system's real-time clock didn't either go wildly fast or stop completely.)
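For the curious, that counting trick looks roughly like this - a Turbo C-style sketch of my own, assuming the usual getvect/setvect/outportb calls from dos.h; the speed-up factor and the sample-playing step are illustrative placeholders, not the original code.

[code]
#include <dos.h>

#define SPEEDUP 64                  /* fire IRQ0 roughly 64x faster than 18.2 Hz */

static void interrupt (*old_int8)(void);    /* original BIOS timer handler */
static unsigned ticks = 0;

static void interrupt fast_int8(void)
{
    /* ...feed the next sample to the speaker here... */

    if (++ticks >= SPEEDUP) {
        ticks = 0;
        old_int8();                 /* chain: the BIOS clock still sees ~18.2 Hz */
    } else {
        outportb(0x20, 0x20);       /* otherwise acknowledge the PIC ourselves */
    }
}

void hook_timer(void)
{
    old_int8 = getvect(8);          /* IRQ0 is interrupt vector 8 on the PC */
    setvect(8, fast_int8);
}
[/code]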
Did you notice the comment from the chap who'd worked out it was a 23 bit error rather than a 24 bit error?
Bet he wears a beanie hat with a propeller.
And the chap who notes that the original IBM PC clock ticked at 18.206509677 Hz. That came from the 8253 PIT, counting down from 0xffff to zero, which triggered an interrupt.
For some of the games I wrote on the PC I reconfigured it with a smaller start value so it interrupted much more frequently, and used that to modulate the frequency at which the internal speaker was driven (also by the 8253). By this means, we were able to output sampled sounds (albeit at a very low sample rate) via pulse-width modulation through the internal speaker
To this day, I think that was the coolest hack I ever came up with
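For reference, the stock tick rate is just the PIT's input clock of roughly 1,193,182 Hz divided by a full 65,536-count wrap, i.e. about 18.2065 Hz. Reprogramming channel 0 for a faster tick and pulsing the speaker via channel 2 looks roughly like this - again a Turbo C-style sketch of my own; the port numbers and mode bytes are the standard 8253 ones, but the rate and the sample scaling are illustrative guesses rather than the original code.

[code]
#include <dos.h>

#define PIT_CTRL 0x43               /* 8253 mode/command register              */
#define PIT_CH0  0x40               /* channel 0: drives IRQ0 (system timer)   */
#define PIT_CH2  0x42               /* channel 2: gated through to the speaker */
#define PORT_61  0x61               /* bit 0 = ch2 gate, bit 1 = speaker enable */

/* Give channel 0 a smaller reload value so IRQ0 fires at roughly `hz`
 * instead of the default 1193182 / 65536 = ~18.2 Hz. */
void set_timer_rate(unsigned hz)
{
    unsigned divisor = (unsigned)(1193182UL / hz);
    outportb(PIT_CTRL, 0x36);                        /* ch0, lo/hi byte, mode 3 */
    outportb(PIT_CH0, divisor & 0xff);
    outportb(PIT_CH0, divisor >> 8);
}

/* Once per (fast) tick: put channel 2 into one-shot mode with a count taken
 * from the current 8-bit sample.  Its output sits in one state for a time
 * proportional to the count and then flips, so the speaker's duty cycle
 * tracks the sample - crude pulse-width-modulated audio. */
void output_sample(unsigned char sample)
{
    outportb(PIT_CTRL, 0x90);                        /* ch2, lo byte only, mode 0 */
    outportb(PIT_CH2, sample);
}

void speaker_enable(void)
{
    outportb(PORT_61, inportb(PORT_61) | 0x03);      /* open gate + enable speaker */
}
[/code]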
Though really it's not fixed point binary that's the problem; it's mixing two different ways of measuring time in the same calculation. And that's the sort of thing that should ring alarm bells for any experienced programmer.
That and the fact they didn't bother testing it properly. This being the military they probably had extensive test procedures, but as long as everybody ticked the right boxes and signed the right documents, nobody cared whether it was actually tested.
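To put a number on that, here's a back-of-envelope sketch (mine, not from the thread) of the mismatch. Tick counts are exact integers, but convert them to seconds with 0.1 truncated to 23 fractional bits - the "23 bit" point made above - and you lose about 9.5e-8 s per tick; set that against a higher-precision conversion used elsewhere in the same calculation and the two disagree by about a third of a second after 100 hours of uptime.

[code]
#include <stdio.h>

int main(void)
{
    /* 0.1 truncated to 23 fractional bits, held in a double for clarity */
    double tenth_trunc = (double)(long)(0.1 * (1L << 23)) / (1L << 23);

    long   ticks  = 100L * 3600L * 10L;     /* 100 hours of 0.1 s ticks    */
    double t_good = ticks * 0.1;            /* higher-precision conversion */
    double t_bad  = ticks * tenth_trunc;    /* truncated conversion        */

    printf("per-tick error : %.3g s\n", 0.1 - tenth_trunc);   /* ~9.5e-08 */
    printf("drift at 100 h : %.3f s\n", t_good - t_bad);      /* ~0.343 s */
    return 0;
}
[/code]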