You see, it's been five years. The landscape of games technology has changed dramatically in that time, and the next five promise to be even more radical. Five years ago, I certainly wouldn't have predicted that things would turn out as they have.
As a matter of fact, five years ago, my plan was to be releasing the second or third generation of real-time raytracing technology. At the time I was still planning on software-based rendering, but that changed. I had a vision for a world with a radically different approach to computer graphics - and my products at the center of it all, naturally.
I "officially" discarded my other hobby projects to begin focused work on the "Freon 2/7 Project" in April of 2002. The announcement was made on my web site, to the vast audience of about three people. I'd already done quite a bit of experimentation in raytracing, and had written the beginnings of my own raytracer. Over the next two years, I poured thousands of hours into research and development on the technology.
It quickly became clear that software wouldn't cut it. I shifted my focus to developing a prototype of a system that would effectively be a fixed-function hardware accelerator for raytracing. At the peak of my research, I developed a fast global-illumination approximation algorithm, which still remains unpublished (out of hope that I might get to use it properly someday).
Soon, though, things started to slip; the original "release date" of December 17, 2003 came and went, with only a token update to the website. The second anniversary of the project came and went, and the whole thing just quietly shut down. I hacked off and on until the summer of 2004, and then stuffed the code into a .ZIP file, burned it to CD, and haven't opened it since.
Even in the intervening two years, things have changed. Two years ago, it was clear that a fixed-function raytracing card had no place in the market - but a programmable one had a good shot. The problem was, developing a proper emulator for a programmable system would require a full rewrite of my prototype code; more importantly, bringing the product to market would require substantial involvement (and money) from a hardware development team. I put the project away not because it was no longer worth pursuing, but because I no longer had the time - I'd buckled to the temptations of a day job, and had committed my remaining time to working with Egosoft.
After the day job went down the crapper, I found myself again with a surplus of time, but also a decent bit of burnout, and not much desire to work on many side projects. Over the past few months, that's clearly changed, with stuff like TKC2 and the Epoch language grabbing my attention.
Through it all, though, there's been a little nagging doubt in the back of my mind: what about Freon?
I've thought it over at length, and I think the time for the product has come and gone. There was a window of opportunity for a couple of years, but it's now closed. Shader Model 4 and the increasing convergence of CPU and GPU technology have basically eliminated the need for dedicated raytracing hardware. More importantly, programmable shaders in general have eliminated the quality disparity between rasterized and raytraced graphics. There's simply no need for raytracing in the market; there's no place for it.
Yet.
I think another window will open in a few years. Raytracing is inherently a more efficient and elegant rendering method than scanline conversion, and due to its power, it will eventually supplant traditional polygon-rasterization methods. This will become particularly true when hardware is fast enough to genuinely simulate complex lighting effects (global illumination, subsurface scattering, participating media, fully dynamic lights and geometry, and so on) rather than just making hackish approximations.
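To give a sense of the elegance argument: the core of a raytracer is just geometric intersection tests, with no need for the projection and clipping machinery of a rasterizer. Here's a toy ray-sphere intersection sketch (my own illustration, not anything from the Freon codebase):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance of a ray against a sphere, or None.

    origin, direction, center are 3-tuples; direction is assumed to be
    a unit vector, so the quadratic's leading coefficient is 1.
    """
    # Vector from sphere center to ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    # Coefficients of t^2 + b*t + c = 0 (with a = 1 for a unit direction)
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0.0 else None

# A ray starting at z = 5, pointing down -z, toward a unit sphere at the
# origin, hits the near surface (z = 1) at distance 4.
print(ray_sphere((0, 0, 5), (0, 0, -1), (0, 0, 0), 1.0))  # → 4.0
```

That's essentially the whole visibility computation per object; everything else (shadows, reflections, global illumination) falls out of recursively casting more rays, which is exactly why the approach scales so naturally to the complex lighting effects above.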
That time is a long ways off, though, and even that future has no place for dedicated raytracing hardware. Instead, I think the future is shaping up to hold something different.
The first step was AMD's integration of the memory controller and CPU. The AMD/ATi acquisition promises to provide further changes. Dual-core technology has hinted at the challenges to come, and a close look at multi-core processor plans of the future shows that there's no shortage of difficulties to overcome.
Perhaps most important, though, is the role of programming languages in all of this. Freon was killed, by and large, by programmability; with technology developed as far as it has, the age of fixed-function hardware is over. Programmability will be king for the foreseeable future, and likely until the von Neumann architecture is replaced entirely. But programmability requires programming languages, and the languages we have now are not enough.
For a few hours, I pondered the tradeoff between reviving the Freon project (which, after all, still has some promising proprietary technology) and working on the Epoch language. Both have some potential, and both would require tremendous amounts of effort to really accomplish anything.
The more I think about it, though, the more I'm convinced that the future of both projects is one and the same. It lies down a twisting but all-important road of processing advancements, at the logical conclusion of core-proliferation and the challenges of writing multiprocessing-capable software.
And just because I'm a smug bastard, I'm going to make you wait to see what I think that future looks like [smile]
Yes, it'll happen and yes it'll look so damned cool as to make the polygons weep.
I just think it'll take so much investment in all areas that the commercial factors won't bite. We're already getting talk about diminishing returns with XB360/PS3/D3D10. A change from current polygonal techniques to all-out ray-tracing is no small thing, and I just find it hard to see many people liking the cost:reward balance. At least initially...
Anyway, guess time will tell [smile]
Jack