Sunday, June 19, 2011

@Party 2011 results

In this post I will detail the results of two days filled with coding during @Party 2011[]. But first, a brief status update on recent progress. I made progress on two major todo items mentioned in the previous post: testing of general functionality (loading, saving, creating/connecting nodes, and so on) and scene import. The latter, however, did not quite go according to plan. To keep a long story short, I ended up using an off-the-shelf, free-for-non-commercial-use 3ds Max/Maya to Ogre exporter. This exporter is a) full-featured, b) working, c) actively developed, d) documented, and e) supported. Rant: #$@#!#% open-source projects! The number of OSS projects that meet all of the previously mentioned requirements can be counted on the fingers of a hand from which three or four fingers have been amputated. Any advocate of open-source software (hereafter referred to as: 'delusional') who claims otherwise is basically a hobbyist or bedroom coder who does not know what s/he is talking about. I will stop ranting here; I deliberately did not want this to turn into a rant blog. Although I may set up a dedicated blog in the (near) future, just for that ;-)

Back to business. I ran some simple tests exporting (animated) scenes and objects from 3ds Max to Ogre, which worked fine. Since I do not have any 3ds Max skills, I left it at that for now. But learning the basics of 3ds Max has been high on my list of 'things to do or learn before I die'. However, I've always put it off because of the intimidating graphical user interface of 3ds Max (I'll spare you another rant here ;-). So in the next few weeks/months I will spend a few hours every week trying to grasp it.

Last week I spent two evenings implementing a script editor. The script editor allows you to edit, well, scripts! It supports syntax highlighting (currently only for shaders in Cg, HLSL and GLSL syntax), but it can easily be extended to support all types of Ogre scripts (e.g. materials). Furthermore, the script editor allows you to recompile scripts on the fly, while displaying a 3d scene. I finalized this script editor during @Party, and to test whether it worked I ported some Shader Toy [] shaders. There are still some issues to work out, but they are indicative of bigger 'untackled problems': even though the tool can deal with script compilation errors, and as a result will simply not display any shaded geometry on screen, it is possible for Ogre to throw an exception when loading a (successfully compiled) shader. Exception handling is something I have not fully implemented yet, but it is high up on my todo list.
Without further ado, here is my updated todo list:

  • Testing, testing and more testing!
  • Learn some basic 3ds Max or Maya modeling skills, so I can test scene export/import features
  • Come up with an exception handling mechanism, to catch exceptions that originate from user-initiated actions, such as script compiler errors, setting invalid parameter values through the graphical user interface, and so on.
  • Improve the color-curve editor: Replace the current graphical user interface, which requires you to animate individual red, green and blue curves to create a color gradient, with another interface where you can place colors at specific time points.

Note: I am giving myself a break from working on the 'nuts and bolts' of the tool, and have started implementing a GPU-based particle renderer :-) Between that digression, learning basic 3ds Max skills, working on and around the house, and the friends and family that are coming over to visit us during the next few weeks, I do not think I will have much time to work on anything else. Oh well :-)

... to be continued ...

Sunday, June 5, 2011

Testing, testing, 1,2,3...

Note: this post is a follow-up to my previous post.

Here is a quick status update. Basically I finished implementing the following functionality:

  • Keyframe node: The graphical user interface now supports a basic keyframe editor, and keyframe nodes can be shared across different 3d objects in the same scene.
  • Postprocessing node: There is now a separate postprocessing node for applying a single postprocessing effect to a scene that is either rendered directly from a camera node, or that was rendered previously and stored in a rendertarget node ("render to texture"). The node automagically exposes shader script parameters through the graphical user interface, so you can edit and/or animate them. This postprocessing node complements the legacy render node, which by default allows you to 'stack up' as many postprocessing effects as you like, but (for now) does not allow animating the shader script parameters.

I got these features to work in a matter of days, which I am quite happy about. The tool is rapidly nearing the point where it is actually usable by someone other than myself, since you should soon be able to create animations without having to change a single line of code. Of course, it is also possible to code your own effects (the tool includes a C++ software development kit (SDK) with a few examples) if you like. But before that is really feasible, I will still have to work through the following list of todo items:

  • Testing, testing and more testing: including creating a basic scene by placing and connecting nodes. Add some 3d models to the scene. Render the scene to the screen. Remove some nodes, add some other nodes. Keyframe some 3d models. Keyframe the camera. Render the scene to an offscreen rendertarget. Apply several postprocessing effects to the rendertarget. Save. Load. Modify. Save again. Load again. Repeat this process until the program crashes, in which case I need to debug the code and fix the problem, or until the program no longer crashes, in which case I can continue with the next item on this list. I think this will take me at least a few days if I want to do it properly.
  • Scene import: Get a Lightwave exporter plugin up and running. Extend the tool to import scenes (3d objects, materials, animations) that were exported using the plugin. In fact, I am compiling a first version of the plugin while I am writing this post ;-)
  • Improve the color-curve editor: Replace the current graphical user interface, which requires you to animate individual red, green and blue curves to create a color gradient, with another interface where you can place colors at specific time points. This is much more user-friendly than the current implementation.

What happens after that remains to be seen. It depends a bit on where my interests lie. I see two main directions I could go from here:

  • Evolved virtual creatures: Based on the work of Karl Sims [] and, to a lesser extent, Archee [], I have long been interested in creating my own version. More specifically, I want to evolve neural controllers that are able to control any skeletal structure (2, 4, 6 or more legs/wings/fins). This would require me to extend the tool with two specific nodes: an evolving neural network node, and a 3d model node with support for skeletal animation.
  • Deferred particle rendering: As seen in Fairlight's Blunderbuss [] and, later, Agenda Circling Forth [].

I've been sitting on the papers that served as the basis for these algorithms for years now. It's about time I did something for fun again; tool development can be a bit boring at times, staring at the same startup screen with the same placeholder effects, because my priority was to get the tool up & running (while obtaining two master's degrees and emigrating from the Netherlands to the US. Go figure ;-). Implementing these algorithms is not that much work; note that Smash (the programmer behind Blunderbuss) said on his blog that he coded the effect in one Saturday morning. However, history has shown that once I get something like this done, I will spend a considerable amount of time playing with the simulation. This happened when I added support for L-systems (similar to Outracks []), which actually was the first effect evah to be added to the tool. History repeated itself when I added 4d Julia fractals, based on the work of John Hart and Keenan Crane [] and later IQ/Rgba [].

I will have all of the tool's functionality mentioned above, plus either evolved virtual creatures or deferred particle rendering, working by the end of the summer. Mark my words :-D

... to be continued ...