Farm Tutorial

Empty Scene

 

This is an empty Farm scene. You can see the two branches of the tree view: one, called "DX", concerns the DirectX subsystem (graphics, textures, sounds and so on), while the other, called "Deck", concerns the simulation. The term "Deck" comes from Virtual Reality technology, which was heavily inspired by cyberpunk science fiction literature.

 

Empty Scene (2)

 

The same empty scene, with tree views fully expanded.

Loading

Here we load a 3D model. You can load models and sounds by drag and drop or from a menu option. We are currently limited to Lightwave models, but we will add support for other professional modeling tools, such as Maya and Alias. The idea is to let graphics developers use the professional tools they are accustomed to.

 

OneScene

 

This is a sample skate park built with Lightwave. The graphics files are loaded natively; no pre-processing is needed, so every change to the graphics is visible immediately inside Farm. Models, textures, lights, animations, cameras and so on are imported, so building real-time prototypes is really a matter of minutes. You can see some details about the structure of the imported scene in the "DX" branch of the left tree view. Nothing about behaviours has been done yet, so the "Deck" branch is still empty (apart from sensors like the keyboard and joystick, which were detected on Farm startup).

 

TwoScenes

 

Here we loaded a second scene, a hand with a skateboard, and selected a closer camera. In Farm you can load multiple scenes, which overlap in a global "universe". In this way many graphics developers can each work on a very narrow scope without interfering with one another.

 

Actorizing

 

When the graphics are set up, you can start to add interactivity to your scene. This is achieved by "actorizing" the graphics that represent important parts of the scene. The idea is that you will add "behaviours" to these actorized objects: bouncing, being animated, or being controllable with a joystick, for example. You actorize graphics using right-click context menus.

 

Actorized

 

We actorized the root node of the hand and skateboard graphics and, because they were natively animated in Lightwave, an "Animation Behaviour" was automatically detected and added. You can see some new entries in the "Deck" branch, namely a SkateHand representation with an attached SkateHand animation behaviour. Double-clicking on the animation behaviour opens a control panel where you can set the animation frame rate, loop mode and so on. The items were also added to the "Simulation Sequence" section; this allows you to fine-tune simulation priorities (useful when using advanced features like real-time rigid body physics simulation).
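To give an idea of what these settings mean, here is a minimal Python sketch of how an animation behaviour could advance its current frame from a frame rate and a loop mode. The class and attribute names are purely illustrative, not Farm's actual API.

    # Hypothetical sketch: advancing an animation from a frame rate and loop mode.
    class AnimationBehaviour:
        def __init__(self, frame_count, frame_rate=30.0, loop=True):
            self.frame_count = frame_count   # total frames in the imported animation
            self.frame_rate = frame_rate     # frames advanced per simulated second
            self.loop = loop                 # wrap around or clamp at the last frame
            self.active = True
            self.frame = 0.0

        def update(self, dt):
            """Advance the animation by dt simulated seconds."""
            if not self.active:
                return
            self.frame += self.frame_rate * dt
            if self.loop:
                self.frame %= self.frame_count                       # wrap around
            else:
                self.frame = min(self.frame, self.frame_count - 1)   # clamp

    anim = AnimationBehaviour(frame_count=120, frame_rate=24.0, loop=True)
    for _ in range(10):
        anim.update(1.0 / 60.0)   # one 60 Hz simulation step
    print(round(anim.frame, 2))   # about 4.0 frames after ten steps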

 

Behaviours

 

Right-clicking on an actor opens a list of available behaviours. Here we want to control the skate using a joystick, so we have stopped the animation by deactivating it, and we will add a Sensor Modulation Behaviour.

 

SensorModBehaviour

 

We added a Sensor Modulation Behaviour and double-clicked on it. As usual, a control panel opens. In Farm, every actor has input and output channels, and the Sensor Modulation Behaviour lets us bind output channels from sensors like joysticks and the keyboard to input channels of any actor. The nature of these control channels depends on the nature of the actor. For example, here we are binding the joystick's X movement to a translation along the skateboard's X axis, but we could instead bind the joystick to an animation frame rate to control animation speed, or we could even control the real time/simulation time ratio of the deck, slowing down the simulation or even making time go backward in our simulated environment. This allows a very easy and intuitive setup of real-time control strategies, one of the most complex tasks to develop in the simulation field.
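The channel-binding idea can be sketched in a few lines of Python. Everything here (SensorModulation, bind, the fake joystick and skate) is hypothetical and only illustrates the concept, not Farm's real API.

    # Illustrative sketch: a sensor output channel is read each step,
    # scaled, and written into an actor input channel.
    class SensorModulation:
        def __init__(self):
            self.bindings = []   # (read_output, write_input, scale)

        def bind(self, read_output, write_input, scale=1.0):
            self.bindings.append((read_output, write_input, scale))

        def update(self):
            for read_output, write_input, scale in self.bindings:
                write_input(read_output() * scale)

    # A fake joystick and a fake skateboard actor, just for the example.
    joystick = {"x": 0.5}             # stick pushed halfway right
    skate = {"x_translation": 0.0}

    mod = SensorModulation()
    mod.bind(lambda: joystick["x"],                                    # sensor output channel
             lambda v: skate.__setitem__("x_translation", v),          # actor input channel
             scale=2.0)                                                # units per full deflection

    mod.update()
    print(skate["x_translation"])   # 1.0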

 

Observer

Now our skate is able to move around, so we must supply a "smart" camera to follow it. Here we actorize a camera from the "DX" branch, and a SkateObserver is created. By opening its control panel and dropping the skateboard's representation onto it, we set the "looked actor" of this observer. The observer reorients itself in real time to aim at the skateboard.
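The reorientation amounts to recomputing a look-at direction toward the target each step. The following sketch only illustrates that idea; the function and coordinate conventions are assumptions, not Farm's implementation.

    import math

    # Minimal sketch of a "looked actor": recompute the yaw and pitch
    # needed to aim the observer at its target every step.
    def look_at(observer_pos, target_pos):
        dx = target_pos[0] - observer_pos[0]
        dy = target_pos[1] - observer_pos[1]
        dz = target_pos[2] - observer_pos[2]
        yaw = math.atan2(dx, dz)                      # heading around the vertical axis
        pitch = math.atan2(dy, math.hypot(dx, dz))    # elevation toward the target
        return yaw, pitch

    yaw, pitch = look_at((0.0, 2.0, -5.0), (3.0, 0.0, 0.0))
    print(math.degrees(yaw), math.degrees(pitch))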

 

MagnetObserver

Having an observer that just reorients itself is not enough; we want an observer that is actually able to follow our skateboard. So we add a magnet effect between the SkateObserver and the skateboard. To do this, we add a Magnet Behaviour to the deck and register the SkateObserver and the skate by dragging and dropping them onto the magnet behaviour. A magnetization between these two actors is created, and with a double click we can open the magnetization control panel and set some options, like the stiffness and the fact that the magnetization must be applied only to the observer (we don't want the skate to move toward the observer, right?). Also, some "dynamics", which are simple Newtonian dynamics solvers, are generated, because now our actors must accelerate under the influence of forces. The observer moves toward the skate, and from now on it will follow its every movement.
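Conceptually, a one-sided magnetization is a spring-like force on the observer combined with a simple Newtonian integrator. The sketch below is only an illustration of that idea under assumed numbers, not Farm's solver.

    # Illustrative sketch: the observer is pulled toward the skate by a
    # spring-like force and integrated with simple Newtonian dynamics.
    def magnet_step(obs_pos, obs_vel, target_pos,
                    stiffness=4.0, damping=2.0, dt=1.0 / 60.0):
        force = [stiffness * (t - p) - damping * v
                 for p, v, t in zip(obs_pos, obs_vel, target_pos)]
        obs_vel = [v + f * dt for v, f in zip(obs_vel, force)]   # unit mass assumed
        obs_pos = [p + v * dt for p, v in zip(obs_pos, obs_vel)]
        return obs_pos, obs_vel

    pos, vel = [0.0, 2.0, -10.0], [0.0, 0.0, 0.0]
    skate = [5.0, 0.0, 0.0]
    for _ in range(600):                 # ten simulated seconds at 60 Hz
        pos, vel = magnet_step(pos, vel, skate)
    print([round(c, 2) for c in pos])    # converges toward the skate's position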

 

Bounce

Now we want the skate to bounce when it hits a wall. We actorize the skatepark walls from the DX branch, creating a WallsRep actor. We add a Collision Detection behaviour to the deck and register the skate and the walls by dragging and dropping, as we did for the magnetization. This adds a lot of functionality behind the scenes, namely the precise collision detection engine and the physical simulation engine. Also, Farm asks whether to automatically generate a script for the collision response. We answer "yes" and a simple script is created. This lets us customize the collision response, for example if we want an object to slide, stop, or make a noise instead of bouncing when it hits a wall.
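The core of a bounce response is just reflecting the incoming velocity about the wall's surface normal. Here is a small worked sketch of that math with hypothetical names and a hypothetical restitution factor; it is not the script Farm generates.

    # Sketch of a bounce: reflect the velocity about the wall normal
    # and scale it by a restitution factor.
    def bounce(velocity, wall_normal, restitution=0.8):
        dot = sum(v * n for v, n in zip(velocity, wall_normal))
        return [restitution * (v - 2.0 * dot * n)
                for v, n in zip(velocity, wall_normal)]

    print(bounce([3.0, 0.0, -4.0], [0.0, 0.0, 1.0]))   # [2.4, 0.0, 3.2]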

 

Scripting

Here we select the Script Behaviour. A simple text editor opens up and we can see the simple auto-generated bounce script. Scripting is probably the most powerful part of Farm, because we use an industrial-strength programming language called Python. Despite its apparent simplicity, Python is a first-class object-oriented language known and used worldwide, and our integration of it in Farm lets us control any aspect of Farm's virtual worlds by scripting. This is how we build behaviours that are too specialized to be treated as general-purpose behaviours, such as customized switching between multiple animations. We are also evaluating the addition of a completely different language called Haskell, a declarative functional language that could be used efficiently for Artificial Intelligence tasks like path searching, personality simulation and so on.
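As an example of the kind of behaviour that ends up in a script rather than a stock behaviour, here is a hedged sketch of switching between multiple animations based on the skate's speed. Every name and threshold in it is hypothetical.

    # Hypothetical script behaviour: pick one of several animations
    # depending on the skate's current speed.
    ANIMATIONS = {"idle": 0, "roll": 1, "speed_wobble": 2}

    def choose_animation(speed):
        """Map a scalar speed onto an animation name."""
        if speed < 0.1:
            return "idle"
        if speed < 6.0:
            return "roll"
        return "speed_wobble"

    def on_update(skate_speed, play_animation):
        """Called each simulation step with the current speed and a playback hook."""
        play_animation(ANIMATIONS[choose_animation(skate_speed)])

    on_update(3.5, lambda index: print("playing animation", index))   # playing animation 1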

 

Some other "stock" behaviours are also available: Proximity Detection, to have something happen when two actors are within a certain distance from each other, and Terrain Following, to reorient automatically any "vehicle" when moving on non-flat surfaces. Farm is a work in progress, so other behaviours will be developed when our games will need them.