About Homura

Homura Games Development Framework

This section aims to provide an overview of the Homura games development framework. The overview covers the components which comprise a Homura-based application, the aspects of game development which Homura aims to tackle and the motivations for its development. For a more detailed assessment of Homura, several sources of information are available, including published academic papers, presentations, release documentation for the library, and JavaDoc API comments for the classes which comprise the framework. These can be found within the documentation section of this site.


The Homura project's game development framework provides an open-source API for the creation of Java and OpenGL based, hardware-accelerated 3D games applications, which support cross-platform, cross-browser deployment using Java Web Start (JWS) and next-generation Applet technologies.
The framework is written entirely in Java, with support for the Java Standard Edition (SE) Software Development Kit (SDK) 6, update 12 onwards.

Motivations for Development

Currently, web-based games applications predominantly utilise Adobe Flash or Java Applets as their primary technology choices. These games are often casual, two-dimensional games and do not utilise the specialist graphics hardware which has proliferated across modern PCs and consoles. Digital distribution is also becoming an increasingly important method of deploying content to the end-user. The Homura project aims to create an open-source application framework for the development of hardware-accelerated, cross-platform 3D games, using Java technologies and OpenGL. The cross-platform capabilities are inherently encoded in the application framework itself, emulating the "Write Once, Run Everywhere" paradigm of Java. Homura games are also designed for integration with the companion library, Net Homura, which provides the games with an open digital distribution platform built on standard web technologies. Games can be deployed in a cross-browser manner, either outside the browser as windowed or fullscreen applications, or embedded inside the browser as an Applet canvas. The open-source nature of the project aims to reduce development costs, and thus lower the barrier of entry for developing games. With Homura and Net Homura combined, it is possible to develop, host, deploy and even play Homura-based games using entirely open-source software! The framework also aims to provide a platform on which novice and intermediate developers can educate themselves on various aspects of game development, whether through low-level manipulation and alteration of the engine, adaptation of the many sample applications provided, or prototyping of their own concepts on top of the rich platform.

Logical Architecture

Figure 1 illustrates the Logical Architecture of a Homura-based application. Homura utilises and builds upon several open-source application libraries to provide a rich set of game-related functionality. Hover over the components of the architecture in Figure 1 to see a summary of their intended purpose within the system. Clicking on a component will take you to its project website, which provides a more detailed description of the library.
The components shown in Figure 1 are summarised below:
  • Homura Framework: the Homura framework API, downloadable from this site. The library provides the base for all your Homura game creations, and supports JSR-223 scripting languages, which can be used within our game console system or attached to scenegraph objects as Controllers.
  • Java Monkey Engine (jME): the open-source, Java-based scenegraph API which underpins Homura.
  • jME Physics 2: provides a rigid-body dynamics simulation library which integrates directly into Homura's scenegraph and game system, making it easy to add sophisticated physics effects to your games.
  • Java Open Particle System (JOPS): provides a library and editor for the easy creation of particle effects for your games. Homura supports the direct loading of JOPS binary files into your game.
  • Scene Monitor / Scene Worker: an open-source library providing a Swing-based scene introspection and manipulation system, which is used within Homura's runtime debugging system.
  • MD5 Importer: allows Homura to load MD5-based animated models into your games; many popular artist tools, such as Blender, Maya, 3D Studio Max and Softimage, support exporting to MD5.
  • Lightweight Java Games Library (LWJGL): provides Java-based bindings to both OpenGL and OpenAL. It also provides GL canvas support and is used directly by jME as the primary rendering technology.
  • OpenGL: an industry-standard 3D rendering API written in C; it is interfaced by LWJGL to provide the rendering functionality used in Homura.
  • Open Dynamics Engine (ODE): a physics library for rigid-body dynamics, used by jME Physics 2 to provide the core physics functionality.
  • OpenAL: an industry-standard audio library for the utilisation of stereo/3D audio within multimedia applications. Homura uses OpenAL to provide 3D sound entities which can be used directly within the scenegraph.
  • OGG Vorbis: a general-purpose audio and music encoding format which is patent- and royalty-free. Homura directly supports the loading of Vorbis-based sound files.
Figure 1: Logical Architecture of a Homura-Based Game.
The lowermost layer of the architectural stack is the System layer. Homura is a cross-platform framework and will run on Windows, Linux and Mac OS X, with the sole requirement of an OpenGL 1.4+ compatible graphics card.

The second layer is the Native Library Layer. Homura is coded in Java, but utilises native, platform-specific libraries for the key sub-systems. This provides the best combination of performance and feature support, allowing hardware-accelerated rendering and audio to be utilised. Homura relies on the native versions of OpenGL for rendering support, the Open Dynamics Engine (ODE) for physics simulation, OpenAL for audio support and OGG Vorbis for open-source audio format support. Java interfaces with these libraries using the Java Native Interface (JNI).

The Homura Framework comprises the uppermost layer of the API and is programmed exclusively in Java. All libraries directly referenced by Homura are also Java based, with these libraries handling the calls to the Native libraries. This approach was chosen because these existing libraries are already established and have been optimised to handle the native calls in the most efficient way, whereas Homura is primarily concerned with the high-level architecture of a games application.

Homura utilises the Java Monkey Engine (jME) to provide rendering and input-handling functionality. Programmed entirely in Java, jME uses the Lightweight Java Games Library (LWJGL) as its low-level OpenGL-based rendering sub-system. The primary function of LWJGL is to act as a Java binding to OpenGL by mirroring the interface of the C-based OpenGL library with a Java version of each function. For example, OpenGL’s glBegin() is exposed as GL11.glBegin() in LWJGL. The LWJGL function then utilises Java’s JNI system to call the native version of glBegin(), and uses Java’s NIO system to pass information between OpenGL and LWJGL as ByteBuffers. jME provides a high-performance, scenegraph-based graphics API. The scenegraph allows the organisation of 3D geometry into a tree-like structure where a parent node can contain any number of child nodes, but a child node must have only a single parent. The nodes are organised spatially so that whole branches of the graph can be culled. This allows complex scenes to be rendered quickly as, typically, most of the scene is not visible at any one time. The scenegraph’s leaf nodes consist of the geometry that will be rendered to the display. jME is an open-source technology which, over the last five years, has matured into a feature-rich system and one of the most performant Java implementations for 3D applications. Homura also integrates jME’s 3D audio support.
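To make the ByteBuffer hand-off concrete, the sketch below (plain JDK code, no LWJGL required) shows how vertex data is packed into a direct NIO buffer of the kind LWJGL passes across the JNI boundary to OpenGL:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;

// Illustrative only: LWJGL expects vertex data in *direct* NIO buffers in
// native byte order, so the wrapped OpenGL functions can read the memory
// without copying it across the JNI boundary.
public class BufferDemo {

    /** Packs vertex data into a direct FloatBuffer, ready for a GL call. */
    public static FloatBuffer pack(float[] vertices) {
        FloatBuffer buf = ByteBuffer
                .allocateDirect(vertices.length * 4)   // 4 bytes per float
                .order(ByteOrder.nativeOrder())        // match the native ABI
                .asFloatBuffer();
        buf.put(vertices);
        buf.flip();                                    // rewind for reading
        return buf;
    }

    public static void main(String[] args) {
        FloatBuffer tri = pack(new float[] {0f, 0f, 0f, 1f, 0f, 0f, 0f, 1f, 0f});
        System.out.println("direct=" + tri.isDirect() + " floats=" + tri.remaining());
    }
}
```

The real LWJGL call (e.g. GL11.glVertexPointer) would then consume such a buffer directly.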

The audio sub-system again relies on LWJGL to provide the native bridge to the OpenAL audio library, whilst using the open-source OGG Vorbis codec as the media format for audio files. Homura also utilises a jME sub-project, jME Physics 2, to provide the physics simulation functionality of the framework. jME Physics integrates tightly with the jME scenegraph by virtue of its physics object classes inheriting from the jME scenegraph classes. jME Physics uses the concept of Static and Dynamic node types. Static nodes are not affected by physics, but other objects can still react physically to them (e.g. a wall); Dynamic nodes can be affected by forces such as gravity and collisions with other physics objects (e.g. a bouncing ball colliding with the static wall). JNI is used to bridge jME Physics with ODE to provide the low-level physics functionality. Homura also integrates with the Java Open Particle System (JOPS), a framework designed for LWJGL which allows the creation of advanced particle effects (smoke plumes, explosions, fireworks etc.). This has been integrated into Homura by incorporating the JOPS file type into the Homura asset management system and encapsulating the particle generators as a specialised scenegraph node called a JOPSNode, allowing for easy incorporation within a scene.
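The Static/Dynamic distinction can be illustrated with a toy simulation. The following is not the jME Physics 2 API, just a minimal sketch of the concept: only dynamic nodes respond to forces such as gravity.

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of the Static/Dynamic node distinction -- NOT the jME Physics 2
// classes. Static nodes never move; dynamic nodes accumulate forces such as
// gravity on each simulation step.
public class PhysicsSketch {

    static class PhysicsNode {
        final boolean dynamic;
        double y, velocityY;                    // 1D world, for brevity
        PhysicsNode(boolean dynamic, double y) { this.dynamic = dynamic; this.y = y; }
    }

    static final double GRAVITY = -9.81;

    /** Advances the simulation: only dynamic nodes respond to forces. */
    public static void step(List<PhysicsNode> nodes, double dt) {
        for (PhysicsNode n : nodes) {
            if (!n.dynamic) continue;           // static node: e.g. a wall
            n.velocityY += GRAVITY * dt;
            n.y += n.velocityY * dt;
        }
    }

    public static void main(String[] args) {
        PhysicsNode wall = new PhysicsNode(false, 0.0);
        PhysicsNode ball = new PhysicsNode(true, 10.0);
        List<PhysicsNode> world = new ArrayList<>();
        world.add(wall);
        world.add(ball);
        step(world, 0.1);                       // the ball falls; the wall stays
        System.out.println("wall y=" + wall.y + ", ball y=" + ball.y);
    }
}
```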

The framework composites a large set of disparate components into a single system, allowing a game to be easily built on top of the Homura system through linkage with the project’s binary Java Archive (JAR) file. Consequently, the final architectural layer is the User-Creation layer, which comprises the developed game. A game utilises the Homura base classes using OO inheritance and composition to provide the skeleton game - complete with all the aforementioned sub-systems. These classes are then implemented with the required game logic and the user-developed content (Models, textures, particle effects, music, sound effects, backgrounds, etc) which are stored as a Homura asset collection and loaded within the game classes using the Homura Asset Management System to construct the virtual environment which embodies the game.

Whilst the core of a Homura-based game is developed in Java, non-performance-critical sections of the game (e.g. some parts of the game logic) can be implemented as scripts. Homura supports a variety of languages such as Scala, Jython, JRuby and JavaScript (or any other JSR-223 compatible scripting engine). Scripts can easily be written to control any portion of the scenegraph (from the whole scene to a single node) and can be used for a variety of purposes such as AI, cinematics, animation control, event triggers etc.
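As a minimal illustration of the JSR-223 mechanism Homura builds upon, the standard javax.script API locates an engine by name and evaluates a snippet. Which engines are available depends on the JRE and classpath (e.g. Rhino/Nashorn for JavaScript, Jython for Python), so this sketch falls back gracefully when the requested engine is missing:

```java
import javax.script.ScriptEngine;
import javax.script.ScriptEngineManager;
import javax.script.ScriptException;

// Minimal JSR-223 sketch: look up a scripting engine by name and evaluate a
// snippet, falling back to a default when the engine is unavailable.
public class ScriptingDemo {

    /** Evaluates the expression, or returns the fallback if no engine is found. */
    public static Object evalOrDefault(String engineName, String expr, Object fallback) {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName(engineName);
        if (engine == null) return fallback;    // engine not on this JRE
        try {
            return engine.eval(expr);
        } catch (ScriptException e) {
            return fallback;
        }
    }

    public static void main(String[] args) {
        Object result = evalOrDefault("javascript", "6 * 7", 42);
        System.out.println("6 * 7 = " + result);
    }
}
```

A Homura controller script would be evaluated the same way, with scenegraph objects exposed to the engine as script bindings.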

Aspects of Game Development

Homura aims to provide an API which covers the main aspects of modern game development. There are several different disciplines involved in developing a modern games application. This section details the main aspects and how Homura tackles them. For an overview of the key features of the entire Homura project, please visit the Features section.
  • Toolset:

    Modern game development is a multi-disciplinary pursuit, requiring a large number of different skill-sets. The development of a game application requires the creation of several different sets of data, including 3D models, 2D sprites, particle effects, game engine code, application code, gameplay scripting, UI systems, audio etc. Producing this data to a high standard requires the investment of considerable time and effort and, in most cases, considerable money. The Homura project aims to reduce this latter concern by providing, and integrating with, a large set of open-source development libraries, editing tools and media formats. Homura can be used to develop, distribute, maintain - and even play - games applications using entirely open-source projects. Figure 2 illustrates an example of a predominantly Java-based, open-source toolchain.
    Open Source Development Toolchain
    Figure 2: Open Source Development Toolchain
    This is not a definitive list of applications, but illustrates the variety of open-source software which can provide a dramatic boost to development productivity. Also, Homura does not preclude the use of proprietary software, which many developers may already have within their armoury. Operating Systems such as Mac OS X and Windows; Graphics Applications such as Photoshop and Illustrator; 3D Modelling Tools such as Maya and 3D Studio Max; Web Servers such as IIS; all can be used within the toolchain of a Homura application.
  • Asset Pipeline and Asset Management System:

    Getting external assets to work effectively within a games framework is one of the most important development tasks. The ability to consistently and efficiently load data from a variety of sources can be tricky, and Homura's focus on cross-platform gaming and support for web-based deployment exacerbates the problem, as the software will run across different filesystems and access patterns. Homura therefore provides a unified asset-loading system called the AssetHelper, which utilises Java's class system to locate files relatively, irrespective of OS and deployment type, and includes support for loading resources directly from compressed packages such as JAR and PAK files. The asset system supports caching of objects to avoid repeated loads and handles a wide variety of file types for various media, including image formats such as GIF, PNG, JPEG, TGA, DDS and BMP; 3D model formats such as ASE, 3DS, Collada, MD2, MD3, MD5 and Blender XML; existing engine formats such as Ogre's .mesh and .scene formats and jME's .jme binary format; JOPS particle files (.ops format); MOD, Vorbis and WAV audio; XML files; XML curve descriptors; script files and many more. The loader system is extensible, meaning new loaders for new formats can easily be plugged into the system.
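The classpath-relative principle behind this kind of loading can be illustrated with plain JDK code. The AssetHelper adds caching and format-specific loaders on top of this idea; the sketch below is not its actual API:

```java
import java.io.InputStream;

// Sketch of classpath-relative loading: instead of absolute filesystem
// paths, resources are resolved via the class loader, which works
// identically whether assets sit in a directory, a JAR, or a Web Start
// cache.
public class ResourceDemo {

    /** Returns true if the named resource is reachable via the class loader. */
    public static boolean exists(Class<?> anchor, String relativePath) {
        try (InputStream in = anchor.getResourceAsStream(relativePath)) {
            return in != null;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // String.class lives inside the JRE's own archive, so this also
        // demonstrates loading a resource straight out of a compressed package.
        System.out.println(exists(String.class, "String.class"));
    }
}
```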
  • Game State Management System:

    Modern game applications feature many different scenarios within a single game, such as UI systems, levels, gameplay modes, HUDs, pause systems, in-game menus, 2D content and 3D content; persistent data or customisable parameters are often shared between these systems. Therefore, Homura provides a modular, flexible and extensible game state management system, which encapsulates these scenarios into a hierarchical class framework designed for both component-based and inheritance-based extension. The game state management system also features a unified management system which provides each game state with access to commonly required objects, such as the display system, camera system, rendering sub-system and asset manager, for easy access to the core features of the game engine. Figure 3 illustrates how the Game State Manager is used within the context of a Homura application.
    Homura State Management Hierarchy
    Figure 3: Homura State Management Hierarchy
    In this example, the game can be distributed in three ways: as a standard Java application, a Java Web Start application or a next-generation Applet. The GameStart class encapsulates the main method for the application entry point; this class constructs an instance of the Homura game class (which extends the Homura base game, defining the application flow, e.g. single- or multi-threaded). The base game instantiates the display system and renderer and then invokes the Homura Game State Manager, binding the display and renderer to it. The Asset Management System is also set up, creating relative URL links between the asset types (as stated above) and the locations where they can be found. This results in the creation of either a windowed or fullscreen application with an OpenGL rendering context. In this case the game class is the MyGame class. This controls display parameters such as anti-aliasing; buffering modes; VBO and FBO support; bit depths of the main OpenGL buffer types etc. This class then creates the initial game states of the game and binds them to the state manager; in this case that is the MenuState class, which encapsulates the scenegraph for a 2D menu system. The state manager is responsible for handling the transitions between game states and controlling game-loop execution. The Applet version of the game, MyApplet, undertakes the responsibilities of the GameStart and MyGame classes, with the only difference being the creation of a display canvas instead of an application window. The key aspect of this design is that once the HomuraStateManager is constructed and control is delegated to it, it is responsible for controlling the application execution and rendering process. This means that the Applet, Web Start and standard Java versions of the application all behave in identical manners, irrespective of deployment type or OS platform.
The State Manager also provides capability querying to assess the performance of the machine on which the application is running, allowing performance and rendering fidelity to be scaled based on feature set, allowing the games to run across a wide variety of different hardware specifications.

    The Game State Management System features concrete implementations of various game states which encapsulate different components of Homura, such as physics-based game states, game menu states, dialog boxes, shader-based game states and debugging game states, each of which provides a common interface to allow for composition and extension, allowing developers to get a game up and running easily by building upon these templates. Homura also provides a concrete implementation of a stack-based Game State Manager, as shown in Figure 4 below.
    Execution of the Game State Stack
    Figure 4: Execution of the Game State Stack.
    Game states are added to a Homura game by pushing a new instance of a game state onto the stack, which also binds a reference to the state manager to the game state. The manager’s update loop iterates over all the game states in the stack from bottom to top. All game states have their backgroundUpdate() method called, but only the topmost state has its update() method called, as it is in focus. The manager’s render loop also iterates over each of the game states in the same order as the update loop, calling their render() method. This guarantees the order of rendering so that the 3D root node is rendered first, then the HUD node, then the fade node. This means that 3D objects placed in a state higher in the stack are drawn after the 2D objects of the previous state, allowing layering. Timing conditions can be set on the game states to specify a fade-in and fade-out duration when they are pushed onto / popped off the stack, so that a game state can query its current transition state as a value between 0 and 1 (where 1 is on top and 0 is overlaid). This can be used to apply transition effects such as colour fades, transparency fades, slides etc.
    This game state system allows some typical game tasks to be carried out with ease. An example to illustrate the reduction in complexity afforded to the developer is an in game pause menu system, as highlighted in Figure 4.
    In this scenario, an existing game state called Level 1 has an event handler triggered (e.g. by pressing the ‘p’ key) which calls a method called pause(). This method creates a new instance of the PauseMenuGameState and adds it to the state manager. The state manager pushes this onto the stack and initialises this game state’s scenegraph (comprised of menu items such as ‘resume’ or ‘exit game’). This pauses the game instantly, as Level 1 is no longer the topmost game state, which means its update() method is not being called, so no input events or scenegraph changes are being made to this game state. When the PauseMenuGameState is terminated by the player (e.g. by pressing a button to resume the game), this game state is popped from the state manager and its cleanup() method is called to de-allocate unused objects. Subsequently, Level 1 becomes the topmost state again and its update() method is called once more, resuming play. This is all handled via the data structures and state system, requiring no coded logic in the game. The defined rendering order also allows easy visual effects to be applied to the pause system, such as adding a transparency to the Level 1 state’s fade node so that the Pause Menu state’s text becomes more legible when it is rendered on top of Level 1’s scene.
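The pause scenario can be sketched as a toy stack-based manager in plain Java. These are not the Homura classes, just the pattern described above: every state receives backgroundUpdate(), only the topmost receives update(), so pushing a pause menu suspends the level with no extra logic in the level itself.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy sketch of the stack-based game state manager pattern -- not the
// actual Homura classes. Counters record which callbacks each state
// received, so the pause behaviour is observable.
public class StateStackSketch {

    static class GameState {
        final String name;
        int updates, backgroundUpdates;
        GameState(String name) { this.name = name; }
        void update() { updates++; }
        void backgroundUpdate() { backgroundUpdates++; }
    }

    static class StateManager {
        private final Deque<GameState> stack = new ArrayDeque<>();
        void push(GameState s) { stack.push(s); }
        GameState pop() { return stack.pop(); }

        /** One pass of the update loop over the whole stack. */
        void updateAll() {
            for (GameState s : stack) s.backgroundUpdate();   // every state
            if (!stack.isEmpty()) stack.peek().update();      // focus state only
        }
    }

    public static void main(String[] args) {
        StateManager mgr = new StateManager();
        GameState level = new GameState("Level 1");
        mgr.push(level);
        mgr.updateAll();                      // level is in focus
        GameState pause = new GameState("PauseMenu");
        mgr.push(pause);                      // e.g. triggered by the 'p' key
        mgr.updateAll();                      // level is now paused
        mgr.pop();                            // player resumes
        mgr.updateAll();                      // level is in focus again
        System.out.println(level.name + " updates=" + level.updates
                + " backgroundUpdates=" + level.backgroundUpdates);
    }
}
```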
    Whilst Homura provides these in-built Game States and State Manager, these can be switched out for custom versions, allowing you to tailor the framework to suit your needs. A full list of the Game State classes can be found in the JavaDoc API Docs.
  • Graphics Rendering:

    As mentioned previously, Homura utilises Java Monkey Engine version 2 - a Java-based OpenGL scenegraph API for its 3D and 2D Graphics rendering. A scenegraph is a generic data structure which hierarchically organises the logical and spatial elements of a rendered scene representation. The scenegraph creates a tree-like structure containing Nodes. A Node can have multiple child Nodes attached to it, but each Node can only be attached to one parent Node. This results in the classification of two types of Node:
    • Internal Node: These nodes are used to organise the scene. An Internal Node has children, which themselves can be either Internal or Leaf Nodes.
    • Leaf Node: These typically represent the data that is to be processed by the rendering system; Leaf Nodes have no sub-nodes attached to them. They typically encapsulate the geometry of the scene.
    In Homura / jME's Scenegraph API, these two node types share a common base class called a Spatial. An internal Node is represented by the class Node and Leaf Nodes are represented by the class Geometry. Spatial encapsulates the data shared by both Internal and Leaf Nodes. There are four main types of data encapsulated by a spatial:
    • Transforms: These define the three types of transformation which can be applied to a spatial: Translation, Rotation and Scaling. In the node hierarchy these transforms are propagated to all children, so rotational changes to a parent node will also occur for all children. Therefore, two types of transformation are stored, Local and World, related to the context in which the spatial is applied. Local transformations are the characteristics applied to the Spatial structure itself (e.g. a rotation of 90 degrees). World transformations represent these characteristics dependent upon the Spatial's placement within the scenegraph: if another spatial has a rotation of 90 degrees applied to it and is the parent of this spatial, then the world transform would be 180 degrees.
    • Bounding Volumes: These define the volumes which minimally encapsulate the Node. For Leaf Nodes this is the volume which encapsulates the vertices of the geometry. For Internal Nodes, this is the volume which encapsulates the volumes of all child nodes.
    • RenderStates: These define how the geometry will be displayed and are also propagated across any child nodes. RenderStates are described in detail later on this page.
    • Controllers: These define alterations to the spatials during the execution of the application. Controllers typically encapsulate aspects such as model animation, physics calculations, RenderState changes, etc. A Controller object itself is not propagated to child nodes. However, the Controller may alter characteristics of the spatial such as transformations and RenderStates, and thus may also affect the characteristics of all child nodes.
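The bounding-volume rule above can be sketched in a few lines: an internal node's volume is the merge of its children's volumes. This toy axis-aligned box is illustrative only, not jME's BoundingVolume classes:

```java
// Toy sketch of Bounding Volume merging -- not jME's BoundingVolume
// hierarchy. A leaf's bounds enclose its geometry; an internal node's
// bounds are the merge of all of its children's bounds.
public class BoundsSketch {

    /** Axis-aligned bounding box defined by min/max corners (x, y, z). */
    static class Aabb {
        final double[] min, max;
        Aabb(double[] min, double[] max) { this.min = min; this.max = max; }

        /** Smallest box enclosing both boxes -- an internal node's volume. */
        Aabb merge(Aabb other) {
            double[] lo = new double[3], hi = new double[3];
            for (int i = 0; i < 3; i++) {
                lo[i] = Math.min(min[i], other.min[i]);
                hi[i] = Math.max(max[i], other.max[i]);
            }
            return new Aabb(lo, hi);
        }
    }

    public static void main(String[] args) {
        Aabb child1 = new Aabb(new double[]{0, 0, 0}, new double[]{1, 1, 1});
        Aabb child2 = new Aabb(new double[]{2, -1, 0}, new double[]{3, 1, 2});
        Aabb parent = child1.merge(child2);   // bounds of the internal node
        System.out.println("x: " + parent.min[0] + ".." + parent.max[0]);
    }
}
```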

    The Node class represents an Internal Node. This class is responsible for maintaining a List of child Nodes (Internal or Leaf). This list is of arbitrary length (bounded only by processing and memory requirements). The Bounding Volumes of all children are merged together to form the entire BoundingVolume for this node. Node provides various methods for manipulation of the scenegraph, common to data structures, such as insertion, removal, access and modification. There are various helpers to return sub-sets of the node List, determined by class type, name, index etc. The Node also provides methods to assist in Picking and Collision detection.

    The Geometry class represents a Leaf Node. This class encapsulates renderable data such as the Colour Buffer; Normal, Tangent and Bi-Normal Buffers; Texture Buffer (for per-texture, per-vertex mappings); and VBO support; and includes various methods to aid the manipulation of these buffers, along with common operations associated with maintaining geometric data. There are various pre-existing Geometry implementations within the framework which cover a wide range of usage scenarios, such as Line, Point, Curve, TriMesh, QuadMesh and Text. TriMesh is the most utilised of these sub-types and is typically used to represent 3D models loaded from external assets/data files, as well as a number of primitive objects including Sphere, Box, Torus, Teapot (the famous Utah Teapot), Quad, Disk, Cylinder and many more.

    Whilst the scene-graph organises the data associated with the rendering, it does not actually perform the rendering. This is handled by the Renderer object, which is created when a Homura game is launched. As mentioned above, the Renderer is associated with various parameters to control aspects of the output such as colour depths, anti-aliasing and screen resolution. The Renderer provides the mechanism to render a scene to the created display (window, canvas etc.). This is where LWJGL is utilised within the architecture, providing a concrete implementation of the renderer, which in turn uses OpenGL. The renderer is implemented as a pluggable system, meaning that you could provide your own custom renderer implementation (software-based, Sun's OpenGL binding (JOGL), etc.). The top-most node of a scene-graph (the root node) is sent to the Renderer to display. The Renderer has an associated Camera object, which defines the perspective from which the scene is rendered. The Camera encapsulates the view inside a View Frustum, which bounds the elements that are visible within the scene based on the camera's characteristics. The Renderer determines the parts of the scene-graph which are inside the Frustum and culls those which are not in view. The geometry which is inside the Frustum is then placed into a set of RenderBuckets (Transparent, Opaque and Orthographic). The Transparent and Opaque buckets are sorted to determine which objects are in front of each other from the current viewpoint. The buckets are then processed by the GPU via OpenGL calls, producing the final scene on the display.
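The bucket-sorting step can be sketched as follows (a toy model, not jME's renderer): opaque geometry is drawn front-to-back for cheap depth rejection, while transparent geometry is drawn back-to-front so that blending composites correctly.

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Toy sketch of render-bucket sorting. Items are split into opaque and
// transparent buckets, each sorted by camera distance in opposite orders,
// and the resulting draw order is returned as a list of names.
public class BucketSketch {

    static class Item {
        final String name;
        final double cameraDistance;
        final boolean transparent;
        Item(String name, double cameraDistance, boolean transparent) {
            this.name = name;
            this.cameraDistance = cameraDistance;
            this.transparent = transparent;
        }
    }

    /** Returns draw order: sorted opaque bucket first, then transparent. */
    public static List<String> drawOrder(List<Item> visible) {
        List<Item> opaque = new ArrayList<>();
        List<Item> transparent = new ArrayList<>();
        for (Item i : visible) (i.transparent ? transparent : opaque).add(i);
        opaque.sort(Comparator.comparingDouble(i -> i.cameraDistance));        // near first
        transparent.sort(Comparator.comparingDouble((Item i) -> i.cameraDistance).reversed()); // far first
        List<String> order = new ArrayList<>();
        for (Item i : opaque) order.add(i.name);
        for (Item i : transparent) order.add(i.name);
        return order;
    }

    public static void main(String[] args) {
        List<Item> scene = new ArrayList<>();
        scene.add(new Item("far wall", 30, false));
        scene.add(new Item("near crate", 5, false));
        scene.add(new Item("near glass", 6, true));
        scene.add(new Item("far smoke", 25, true));
        System.out.println(drawOrder(scene));
    }
}
```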
    So how do you actually use a scene-graph? Figure 5 illustrates a common method for arranging both a 3D and a 2D scene.
    Figure 5: A Typical Homura Scenegraph
    The scenegraph on the left illustrates a typical 3D scene: in this case, we are rendering the Sun, with both Mars and Earth orbiting the Sun and the Moon orbiting the Earth. There are several internal nodes representing the pivot points of the planets' positions. The Earth and Mars pivots are translated away from the Sun pivot Node, and the Moon's pivot is translated away from the Earth's pivot Node. The Sun, planets and Moon are each children of their requisite pivot Nodes. These leaf nodes have no translation applied to them and are thus centred around the pivot points. Each Leaf Node contains the geometry, textures and renderstates which encapsulate the model of its object. The scene is animated by applying a rotation over time using a controller. The Sun's pivot node is rotated which, via propagation, automatically causes the Earth and Mars pivots to rotate around the Sun, and thus the Earth, Mars and Moon all rotate around the Sun. The Earth's pivot is also rotated over time, causing the Moon to additionally rotate around the Earth. Each of the Leaf Nodes then has its own rotation cycle, causing it to spin about its pivot point to represent day/night cycles. Simply through organising the scene graph we have produced a (quite primitive ;-) ) solar-system-style model. We would also like some advanced lighting effects to be applied to each element of the scene. This is achieved through a Shader. The Shader is placed on the Sun pivot as a RenderState and is automatically propagated down the scene-graph and applied to all other members of the scene (it has no effect on the pivot points as they contain no geometry). If we added a new planet (e.g. Saturn) to the Sun pivot, it would automatically inherit this shader. If we wished to add an item that was not affected by any of the transforms or visual effects (e.g. a rocket ship), then we would simply add a new branch to the root node.
The whole scene is drawn by simply passing the root node of the scene to the renderer, along with the camera positioned somewhere within the scene.
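The pivot arrangement above can be sketched as a toy scenegraph (not the jME API): a controller rotates the Sun pivot, and because transforms propagate to children, the Moon orbits without any code of its own.

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch of the solar-system scenegraph -- not the jME classes.
// Rotations applied to a pivot node propagate to all descendants.
public class SceneGraphSketch {

    static class Node {
        final String name;
        Node parent;
        final List<Node> children = new ArrayList<>();
        double localRotationDeg;

        Node(String name) { this.name = name; }

        void attachChild(Node child) { child.parent = this; children.add(child); }

        /** World rotation accumulates down the hierarchy. */
        double worldRotationDeg() {
            return (parent == null ? 0 : parent.worldRotationDeg()) + localRotationDeg;
        }
    }

    /** A controller in the spirit of Homura's: rotates its target over time. */
    static class SpinController {
        final Node target;
        final double degreesPerSecond;
        SpinController(Node target, double dps) { this.target = target; this.degreesPerSecond = dps; }
        void update(double dt) { target.localRotationDeg += degreesPerSecond * dt; }
    }

    public static void main(String[] args) {
        Node root = new Node("root");
        Node sunPivot = new Node("sunPivot");
        Node earthPivot = new Node("earthPivot");
        Node moonPivot = new Node("moonPivot");
        root.attachChild(sunPivot);
        sunPivot.attachChild(earthPivot);     // Earth orbits the Sun
        earthPivot.attachChild(moonPivot);    // Moon orbits the Earth

        SpinController sunSpin = new SpinController(sunPivot, 10);
        sunSpin.update(1.0);                  // advance one second
        System.out.println("moon world rotation = " + moonPivot.worldRotationDeg());
    }
}
```

Note that a 90-degree parent rotation plus a 90-degree local rotation gives a 180-degree world rotation, matching the Local/World transform rule described earlier.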

    The scenegraph can also be useful for logically organising 2D scenes. The API for transformations remains the same, but translations and scales of the Z co-ordinate and rotations around the X and Y axes will not be effective (and may cause rendering artifacts). Layering (rendering elements on top of each other) is not determined by the Z-order in orthographic mode. Instead, the elements are drawn based on how they are traversed (depth-first search), or through explicit setting of a property called the Z-Order (where the greater the number, the later it is drawn). The example on the right illustrates the organisation of a typical 2D scene. This scene features several internal nodes for organisation. The first tier of organisational nodes separates the elements based on purpose. The UI Node is designed to hold user interface elements. The Text Node is designed to hold additional text, such as frame-rate and controls help text. The texture representing the text font is placed on the Text Node. This means that it is automatically propagated onto the two text geometries, so they share the same font. Due to the positioning of these text nodes in the scenegraph (assuming no Z-order is explicitly set), they will also be drawn on top of the UI elements, should they overlap. The UI Internal Node features two sub-nodes which are also Internal Nodes (Menu List and Menu Controls). These are designed to group common elements together. The Menu List Node has two leaf nodes, representing menu buttons - Start Game and Options. Each of these nodes is linked to the same font, but it is not applied to the UI Node. This is so that new UI elements do not automatically inherit the font, but instead require it to be explicitly set. The same is true of the Menu Controls Node, which contains two leaf nodes representing icons (Exit Game and Cancel). The two Internal Nodes allow us to easily position groups of elements on the screen.
By translating the Menu List Node to the centre of the screen, we automatically apply this translation to all Menu List items. These can then be individually positioned away from each other. We could then also apply the same concept to translate the icons into the top-right of the scene, without affecting the position of the menu items. In Homura, translations in orthographic mode relate directly to the pixel positions of the elements, so we can translate items to the exact X and Y positions required on the screen, based on screen resolution. We can also apply animation effects to 2D nodes. If we applied a controller to the UI Node which scales the node larger and smaller over time, based on a sine curve, then we could create a simple bouncing effect. This would only affect the icons and the menu buttons, leaving the FPS and controls text as is. If we wished to include them in the effect, we would simply place the Controller on the OrthoNode. As you can see, simple organisation of the scene-graph allows relatively complex effects to be created easily.
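The orthographic draw-order rule (depth-first traversal unless an explicit Z-order is set) can be sketched as follows. This is a toy model, not the Homura classes:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Toy sketch of orthographic draw ordering. With no explicit Z-order,
// elements draw in depth-first traversal order; an explicit Z-order
// overrides this (the greater the number, the later -- i.e. more on
// top -- it is drawn).
public class OrthoOrderSketch {

    static class Node2D {
        final String name;
        Integer zOrder;                       // null = use traversal order
        final List<Node2D> children = new ArrayList<>();
        Node2D(String name) { this.name = name; }
        Node2D attach(Node2D child) { children.add(child); return this; }
    }

    /** Flattens the graph depth-first, then stably sorts by effective Z-order. */
    public static List<String> drawOrder(Node2D root) {
        List<Node2D> dfs = new ArrayList<>();
        collect(root, dfs);
        List<Node2D> sorted = new ArrayList<>(dfs);
        // default Z of 0; List.sort is stable, so equal Z keeps DFS order
        sorted.sort(Comparator.comparingInt(n -> n.zOrder == null ? 0 : n.zOrder));
        List<String> names = new ArrayList<>();
        for (Node2D n : sorted) names.add(n.name);
        return names;
    }

    private static void collect(Node2D n, List<Node2D> out) {
        out.add(n);
        for (Node2D c : n.children) collect(c, out);
    }

    public static void main(String[] args) {
        Node2D ortho = new Node2D("ortho");
        Node2D ui = new Node2D("ui");
        Node2D text = new Node2D("text");
        ortho.attach(ui).attach(text);        // text traversed after ui
        System.out.println(drawOrder(ortho)); // text draws on top of ui
        ui.zOrder = 10;                       // explicit Z-order overrides DFS
        System.out.println(drawOrder(ortho)); // now ui draws last (on top)
    }
}
```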

    Homura's game state system provides pre-configured scene graphs for various types of rendering. These are entirely customisable and the developer is free to create their own compatible implementation by implementing the Interfaces and Abstract classes of the game state system (see the Homura API JavaDocs for more information). The most commonly used is the all-purpose scene-graph game state; this utilises three scene graphs - one to hold a 3D scene, one to hold an Orthographic (2D) scene for fast sprite and text manipulation, and finally an orthographic effects node designed for post-render effects such as blurs, fades and transitions. The separation of the state into separate graphs logically organises the state so that the renderbuckets can be sorted quickly, the order of rendering is clearly defined, and multiple layered game-states can easily be combined to achieve multi-rendering effects.

    Some effects in modern games require multiple renderings of the same information (bloom effects, depth of field, shadows etc.). To handle these effects, Homura also features the concept of RenderPasses. Passes encapsulate the logic for performing multiple scene renders and enforce the order in which draw calls are made. They can also be used to wrap an entire scene-graph for exclusion from special effects. For example, if you wish to apply shadow effects to the entire three-dimensional scene, then the root node of the 3D scene graph can be placed within a Shadow Render Pass, which encapsulates the shadow technique. Blank Passes can be wrapped around the 2D node and effects node to exclude them from being rendered as part of this technique. The Render Pass manager will then handle the draw calls to ensure the order of rendering is identical to when passes are not used. Homura's state system handles render-passes automatically: as soon as a render pass is utilised with a scene node, the state manager switches from standard scene-graph rendering to pass rendering.
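The pass mechanism can be illustrated with a short sketch. The names here (RenderPass, effectPass, blankPass, renderAll) are invented for illustration and record labels instead of issuing real draw calls, but they capture the key guarantee: the manager renders the passes strictly in the order they were added, with blank passes excluded from the effect:

```java
import java.util.*;

// Hypothetical sketch of pass-based rendering: each pass wraps one scene
// root, and a manager guarantees the draw order by invoking the passes in
// the order they were registered.
public class PassSketch {
    interface RenderPass { String render(); }   // returns a label instead of drawing

    // A pass that applies an effect (e.g. a shadow technique) while
    // rendering its root node.
    static RenderPass effectPass(String effect, String root) {
        return () -> effect + "(" + root + ")";
    }

    // A "blank" pass: draws its root with no extra effect, excluding it
    // from the technique applied by the other passes.
    static RenderPass blankPass(String root) {
        return () -> root;
    }

    // The manager renders the passes strictly in insertion order, matching
    // the order in which the scene graphs would otherwise be drawn.
    static List<String> renderAll(List<RenderPass> passes) {
        List<String> calls = new ArrayList<>();
        for (RenderPass p : passes) calls.add(p.render());
        return calls;
    }
}
```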
  • Mathematical Functions:

    Modern game development is heavily reliant on mathematical functionality. 3D Graphics modelling and manipulation, Physics, Audio Processing, Collision Detection, Animation Systems and Artificial Intelligence all depend on a solid maths library which is shared between components, so that each sub-system can re-use the data of the others (e.g. a Vector class which is used for translation in the 3D graphics, but also by the AI planners and the Physics system). Homura utilises jME's maths libraries, which have been heavily optimised for performance within the JVM. The libraries encapsulate common mathematical objects such as 2D Vectors, 3D Vectors (also used as Points), Rays, Quaternions, Matrices, Triangles, Planes etc., each complete with the common set of operations typically required of them (e.g. in the case of Vectors, operations such as normalisation, cross-product, dot-product, angle calculations, interpolation and many more). The maths libraries support operations between geometric shapes and approximations (such as bounding volumes) to perform common tests such as intersection (e.g. Ray-Plane, Ray-Polygon, Ray-Sphere, Sphere-Sphere, OBB and AABB comparisons). The FastMath class provides approximation methods for processor-intensive operations such as trigonometric functions (cos, sin, tan, acos etc.) and linear interpolation, as well as common constants such as PI and degree-to-radian conversion factors, which are statically calculated to high precision upon startup. These classes form a very solid foundation for more complex calculations and are utilised by all primitive objects and high-level geometric representations within the framework.
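As a rough illustration of the kind of operations these maths classes provide, the sketch below implements dot product, cross product, normalisation and linear interpolation over plain double arrays. These are not jME's Vector3f signatures - just self-contained stand-ins showing the underlying arithmetic:

```java
// Hypothetical stand-ins for the vector operations described above; jME's
// real Vector3f class offers these (and many more) as instance methods.
public class MathSketch {
    // Dot product: zero when the vectors are perpendicular.
    static double dot(double[] a, double[] b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    }

    // Cross product: a vector perpendicular to both inputs.
    static double[] cross(double[] a, double[] b) {
        return new double[] {
            a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]
        };
    }

    // Normalisation: scale to unit length.
    static double[] normalize(double[] a) {
        double len = Math.sqrt(dot(a, a));
        return new double[] { a[0]/len, a[1]/len, a[2]/len };
    }

    // Linear interpolation between two vectors, t in [0, 1].
    static double[] lerp(double[] a, double[] b, double t) {
        return new double[] {
            a[0] + (b[0]-a[0])*t,
            a[1] + (b[1]-a[1])*t,
            a[2] + (b[2]-a[2])*t
        };
    }

    // A FastMath-style constant, computed once and reused everywhere.
    static final double DEG_TO_RAD = Math.PI / 180.0;
}
```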
  • Render States and Shading Languages:

    In order to apply visual effects in an efficient manner across the scene-graph, Homura utilises jME's concept of RenderStates. RenderStates can be placed upon any graph node within the scene, and can be applied to entire scenes, sub-sections of the scene, or individual scene nodes. Children of a Node inherit these RenderStates from their parents and ancestors, and inherited states are overridden by RenderStates applied directly to a Node. Each node can hold only one of each type of RenderState. Figure 6 illustrates RenderStates applied to a typical scene-graph.
    Application of RenderStates to a Homura Scenegraph
    Figure 6: Application of RenderStates to a Homura Scenegraph
    The example above shows the scenegraph arrangement of a simple car. The car is hierarchically grouped by a single Car Node. This means that as the car is translated and rotated into position around the scene (e.g. controlled by a player), the chassis and wheels are automatically translated with it. The car is attached to the root node, which is in the render queue. This means the car will automatically be drawn. The root node has a Z-Buffer State (see below) attached to it. The car node and all other sub-nodes automatically inherit this state, meaning the calculations associated with this renderstate are applied to all parts of the scene. The car node contains the geometry representing the chassis of the car. The chassis is then textured through the application of the Chassis TextureState, containing a reference to the chassis texture object and the mapping co-ordinates for placing the texture on the chassis geometry. The direct child of the car node is the Wheel node. This node holds no geometric data, but is transformed into position at the centre of the car node and is used to organise the scene efficiently. This node has four children, each representing a wheel of the car, which are each translated to the four wheel bases of the car. The children each possess the geometry representing the wheel. Because we do not want the wheels to have the same texture as the car, but we do want the wheels to share texture data with each other, the TextureState of the car node is overridden by placing a new TextureState with a reference to the wheel texture on the Wheel Node. This is then inherited by each of the wheels, efficiently utilising the texture. Finally, the car has a ShadeState, to control the visual appearance of the car. We want this to act on both the car chassis and two of the wheels, so this is placed on the car node, and inherited by the Wheel Node and each of the four wheels. 
However, as stated previously, we only want the effect on two of the wheels, and therefore we need to stop the inheritance of this state on the other two (numbers 2 and 4). This is achieved by creating a new instance of a ShadeState object, which is then set to disabled; this overrides the inheritance and tells these nodes not to use the shade state. We add this state to both wheels 2 and 4. This does not copy the state, but instead links both wheels to the same state. Thus, if we change the value of this state (e.g. setting it to enabled again), the change will affect both wheels 2 and 4. This referencing can be used as an alternative to the inheritance model, but it means that the same state is evaluated twice. When inherited, this evaluation does not occur; instead, the states propagated through inheritance are calculated and stored once, and the stored result is applied, resulting in optimal efficiency. The inheritance mechanism opens up possibilities for large performance gains. For example, a forest could be constructed by creating 1000 tree nodes and attaching them all to a parent forest node. The texture and effects associated with all the trees could be applied directly to this forest node and propagated across all the trees.
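The inheritance and override behaviour described above can be modelled in a few lines. The SceneNode and RenderState classes below are hypothetical simplifications (Homura/jME's real classes carry far more state): resolving a state walks from the node towards the root, the nearest state of that type wins, and a directly attached disabled state therefore overrides an inherited enabled one:

```java
import java.util.*;

// Hypothetical sketch of RenderState inheritance: one state per type per
// node, and the effective state is the nearest one on the path to the root.
public class StateSketch {
    static class RenderState {
        final String value;
        boolean enabled = true;
        RenderState(String value) { this.value = value; }
    }

    static class SceneNode {
        SceneNode parent;
        final Map<String, RenderState> states = new HashMap<>(); // one state per type
        SceneNode attachTo(SceneNode p) { parent = p; return this; }
        void setState(String type, RenderState s) { states.put(type, s); }

        // Walk towards the root; the first state of this type found wins,
        // so a state set directly on a node overrides any inherited one.
        RenderState effective(String type) {
            for (SceneNode n = this; n != null; n = n.parent) {
                RenderState s = n.states.get(type);
                if (s != null) return s;
            }
            return null;
        }
    }
}
```

Because an overriding disabled state is a shared object rather than a copy, attaching the same instance to both wheels means re-enabling it later affects both, exactly as described above.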

    Homura supports sixteen different types of RenderState, each providing a key function with respect to controlling the visual depiction of a node within the scene:
    • BlendState: A Blend State allows the blending of source and destination pixel colour values to create effects such as transparency and translucency. The user specifies a source value to blend (e.g. Source Colour, Source Alpha, One Minus Source Alpha), a destination value (One, Source Alpha, Source Colour) and combines them using a blend equation (Less Than, Equals, Greater Than or Equal To etc.). Reference values (between 0-1) can be used for Alpha Blends in order to provide finer-grained control over the blending.
    • ClipState: A Clip State can be used to take "slices" out of the geometric objects represented by a Node. This is achieved through the specification of up to six clip planes. This can be used for effects such as Clipping above and below a water line in order to create reflection and refraction models.
    • ColorMaskState: A Color Mask State is used to control a Node's writes to the framebuffer (otherwise known as the colour buffer), allowing the user to switch writes to the individual colour channels (Red, Green and Blue) on or off.
    • CullState: A Cull State can be used to determine which side of a model will be visible when it is rendered; by default, both sides are visible. The front side is defined as the side whose vertices are traced counter-clockwise. A side (front or back) can be culled, meaning it is not drawn when the model is rendered; instead, that side will appear transparent.
    • FogState: Allows the application of a fogging effect. The Fog State supports the specification of the fog density blend function (e.g. Linear, Exponential) and per-pixel or per-vertex application, and will utilise either the depth buffer of the scene or some given fog co-ordinates.
    • FragmentProgramState: The Fragment Program State allows the application of an OpenGL ARB Assembly Language fragment program to a Node. Whilst effective, the GLSL Shader approach below is the recommended method for fragment programming.
    • GLSLShaderObjectsState: The GLSL Shader Objects State allows the application of an OpenGL GLSL shading program (typically split into Fragment and Vertex Programs). GLSL supersedes the ARB Assembly Language with a higher-level, C-style programming language. This approach is recommended over the use of Fragment and Vertex Program States, as the language is more flexible and easier to develop with. See the GLSL Specification for more information.
    • LightState: LightStates provide the ability to dynamically light the scene. One or more lights, of types such as Point, Directional and Spot, can be attached to a LightState, each with its own properties. The LightState can control how multiple lights are combined, the order of lighting and the specification of a global ambient light.
    • MaterialState: Material States are used to control how objects react to the light generated from a scene lit by Light States. This state allows the configuration of the ambient, specular, diffuse and emissive properties of the object, in order to create complex lit scenes. MaterialStates can be used in conjunction with BlendStates to provide translucent lighting effects which can dramatically improve the quality of the rendered scene.
    • ShadeState: Shade States control the shading scheme used to light an object, allowing either Flat or Smooth shading. Smooth shading interpolates the colours of the object's vertices on a per-fragment basis. Flat shading calculates the lighting on a per-polygon basis, which leads to unrealistic, but stylised, lighting effects.
    • StencilState: Stencil State facilitates the control of the renderer's Stencil Buffer for a given Node. The Stencil plane can be used to mask out portions of the rendering to facilitate special effects such as Shadows and is controlled by specifying a Stencil Operation (Decrement, Increment, Invert, Replace etc) and a Stencil Function (Less Than, Equal To etc).
    • StippleState: Stipple State supports the application of Line and Polygon Stippling, through the specification of a Stipple Pattern.
    • TextureState: Texture States are used to manage the texturing of objects within the scene. A Texture State has multiple texture units, each holding a single texture, allowing support for multitexturing. Textures can be combined across inherited scene members and can be controlled by setting the combine mode on the TextureState (Inherit, Replace, Combine Closest etc.). Texture States also control how each of the textures is applied across the object through the specification of the UV co-ordinates. 3D models loaded from external applications such as 3D Studio Max preserve the texture mapping applied using the modelling tool through the use of Texture States.
    • VertexProgramState: The Vertex Program State allows the application of an OpenGL ARB Assembly Language vertex program to a Node. Whilst effective, the GLSL Shader approach above is the recommended method for vertex programming.
    • WireframeState: Wireframe State controls whether an object is drawn as a solid fill, or drawn in Wireframe mode. This state allows the specification of the face side to draw (Front, Back); the line width and use of anti-aliasing.
    • ZBufferState: The Z Buffer State controls how the Depth Buffer is utilised to evaluate the order of polygons drawn to the scene, based on distance between the pixel source and the eye (camera source). This state is configured by specifying the Depth Test Function (Less Than, Not-Equal, Greater Than or Equal To etc) and setting the depth buffer to read-only/writable.
  • Animation Systems:

    Animating the objects placed within the scene can be a tricky task. Homura attempts to alleviate these difficulties by supporting three types of animation: Sprite-Based, Modelled and In-Engine. Sprite-based animation is for 2D games, where the characters are drawn onto a sprite sheet - an image consisting of the various frames of the animation. Homura provides two different methods to control sprites - the SpriteHandler and the SpriteQuad. The SpriteHandler allows users to load spritesheets as textures and manually control the frames displayed, allowing for fine-grained control over the animation. The class is highly configurable, allowing for the specification of arbitrary regions of the image to act as frames. This comes at the cost of more code required to control the animation. SpriteQuad provides a more automated system, where the frames are arranged in a grid within the texture, and the user configures the number of rows and columns within the animation set. The SpriteQuad provides automated methods for cycling through the animation at a given speed and for triggering other animations upon the completion of a given animation. The user can specify the exact frame routine by passing a String of frame numbers to the animation controller (e.g. 0 1 2 3 4 3 2 1 0 for a bouncing animation). The SpriteQuad is less flexible than the SpriteHandler, but requires much less code to integrate sprites into your game. Modelled animations are animated 3D models generated through an external modelling tool such as 3D Studio Max, Maya or Blender. The modelled animation system supports keyframe animation, skeletal animation and morph-target (per-vertex) animation in various model formats (MD2/3/5, 3DS, OgreXML etc.), with more advanced animation systems being added in forthcoming releases. The final type of animation is In-Engine. This involves the use of the geometric transforms and render-states within the game engine itself to produce animations. 
These are controlled through the use of a Controller object attached to a scene node. Homura provides various controllers to animate across a curve (Catmull-rom, Bezier etc), through external Scripts, Key-framed changes, or explicit changes defined in the update() loop of the game. These effects can be applied to 2D and 3D objects, providing a wide variety of possible effects such as UI transitions, object-movement, Physics based interaction, Spring systems etc. Figure 7 provides examples of these animation types:
    Animation using Homura
    Figure 7: Animation using Homura
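The frame-routine idea mentioned above (passing a string of frame numbers such as "0 1 2 3 4 3 2 1 0") can be sketched as follows. This is not the actual SpriteQuad API - the class below is a hypothetical illustration that parses the routine and maps an animation tick onto a sheet frame, looping when the routine ends:

```java
// Hypothetical sketch of a sprite frame routine: a space-separated string of
// frame numbers is parsed into a sequence, and each animation tick maps onto
// the corresponding sheet frame, wrapping around to loop.
public class FrameRoutineSketch {
    final int[] frames;

    FrameRoutineSketch(String routine) {
        String[] parts = routine.trim().split("\\s+");
        frames = new int[parts.length];
        for (int i = 0; i < parts.length; i++) {
            frames[i] = Integer.parseInt(parts[i]);
        }
    }

    // Return the sheet frame to display on the given animation tick,
    // looping back to the start once the routine is exhausted.
    int frameAt(int tick) {
        return frames[tick % frames.length];
    }
}
```

In a real game the tick would be advanced by the animation controller at the configured playback speed; here it is supplied directly to keep the sketch deterministic.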
  • Physics and Particle Systems:

    As games have become more sophisticated and the platforms they run on more powerful, special effects and more believable interactions within the game world have become increasingly important aspects of game development. To this end, Homura provides systems to easily incorporate Physics and Particle Systems into your games. Homura utilises the jME Physics system, using ODE to provide a rigid body dynamics system for modelling physics in realtime. The Physics system is heavily integrated into the game-engine, both through the game-state system, which adds continuous and step-wise physics calculations directly into the game loop, and through the scene graph itself. The jME Physics-based objects inherit from the scene-graph structures so that they can be directly added as members of the scene. The engine supports the addition of both dynamic objects (such as characters, ragdolls, balls, vehicles, see-saws, springs, joints etc.) and static objects (walls, floors, race-tracks, slopes etc.), which seamlessly join with the collision system to provide physics-based collision detection. The Physics system models global physical systems such as wind and gravity, as well as the per-object specification of material properties and forces (such as mass, tension, torque, density, friction, centre of mass, bounciness etc.).
    Particle systems are used to apply special effects such as explosions, fireworks, fire, lightning, smoke, 3D HUD objects etc. and can be statically or dynamically placed within the scene (e.g. a constant fire, or a flash which appears when a collision occurs). Homura supports two types of particle system: Java Monkey Engine's in-built particle system and the Java Open Particle System, which is a more advanced, highly flexible system. Both systems provide GUI-based editors to create your particle effects, and Homura provides integrated support for loading these effects into your game. The particle systems can be configured with forces, life times, decay, placement and rotational generators etc. for a variety of different uses. The particle objects integrate directly into the Homura scene and can be used with the standard geometrical transform system, or within the Physics system.
    Figure 8 illustrates the Physics and Particle Systems within Homura.
    Physics and Particle Effects using Homura
    Figure 8: Physics and Particle Effects using Homura
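As a flavour of what the step-wise physics calculations in the game loop involve, the sketch below performs a semi-implicit Euler integration of a single body under gravity. It is deliberately minimal and is not the jME Physics/ODE API - a real rigid-body system also handles collisions, constraints, torque and the material properties described above:

```java
// Hypothetical sketch, not the jME Physics / ODE API: one body integrated
// under gravity using semi-implicit Euler, the kind of fixed-step update a
// rigid-body system performs on every physics tick of the game loop.
public class PhysicsSketch {
    static final double GRAVITY = -9.81;   // m/s^2 along the Y axis

    double y;          // vertical position in metres
    double velocityY;  // vertical velocity in metres per second

    PhysicsSketch(double y, double velocityY) {
        this.y = y;
        this.velocityY = velocityY;
    }

    // One fixed time step: update velocity from acceleration first, then
    // position from the new velocity (semi-implicit Euler).
    void step(double dt) {
        velocityY += GRAVITY * dt;
        y += velocityY * dt;
    }
}
```

Semi-implicit (rather than explicit) Euler is the common choice in game physics because it stays stable at the fixed step sizes a game loop uses.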
  • Platform Independence:

    Modern games are typically multi-platform releases. Porting a game between different platforms used to be an arduous process, but with the evolution of games engines came a platform-independence layer which separates the underlying hardware from the game implementation, resulting in a massive reduction in the time and effort spent on converting games to run on multiple platforms. Homura features several important platform-independence features allowing your game application to run across a plethora of desktop platforms such as Windows, Linux and Mac OS X based machines. A large part of this independence comes from the use of Java, a language known for its portability across platforms. This provides the developer with a large API of standardised classes covering low-level libraries for IO, data-structures and common data-types, threading, XML manipulation, database connectivity, Swing UI systems, networking etc. Homura supplements the features of Java with additional independence functionality such as data transform classes for converting between platforms with different endianness, and hardware and software introspection libraries for querying the hardware on which the game is running for information such as file-system format, GPU extensions and capabilities, memory usage and constraints, JVM version, screen resolutions, buffer depths and many more. This allows the developer to scale their applications to run on different platforms and displays with minimal code changes. Homura's platform-agnostic Asset Management System, as described above, also makes the locating of external data files a trivial task, regardless of target platform. Homura's input system handles a large array of input devices, such as tablets, mice, keyboards, game-pads and joysticks, and unifies them into a common input system, abstracting each device into a series of commands which can be queried. 
The event-based input paradigm utilised by the Homura game state management system allows for complex command combinations to be bound as triggers for any manipulation of the scene-graph or application e.g. UI Handling, character movement, Mouse Picking etc.
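One concrete example of the data-transform support mentioned above is byte-order handling. The helper below is a hypothetical sketch, not Homura's actual class; it uses the standard java.nio API to serialise values in an explicit endianness, so binary asset data reads identically on little- and big-endian platforms:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Hypothetical sketch of an endianness helper: by fixing the byte order
// explicitly, the on-disk representation of binary assets is identical
// regardless of the native byte order of the host platform.
public class EndianSketch {
    // Serialise an int in little-endian order regardless of the host.
    static byte[] toLittleEndian(int value) {
        return ByteBuffer.allocate(4)
                .order(ByteOrder.LITTLE_ENDIAN)
                .putInt(value)
                .array();
    }

    // Read the int back, again forcing little-endian interpretation.
    static int fromLittleEndian(byte[] bytes) {
        return ByteBuffer.wrap(bytes)
                .order(ByteOrder.LITTLE_ENDIAN)
                .getInt();
    }
}
```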
    In the future, Homura will also be adapted to support mobile devices such as the Android platform. Figure 9 illustrates a Homura application running on both Linux and Windows platforms. This example requires no explicit code changes to run on these disparate systems; the differences are handled entirely by the Homura framework.
    Platform Independence using Homura
    Figure 9: Platform Independence using Homura
    Digital distribution within the games industry is becoming an increasingly important method of delivering games applications and content to users. Homura's support for web-based distribution using the Java Web Start and Next-Generation Applet technologies means that games can be deployed across the internet using standard Java and web technologies in a cross-platform, cross-browser manner. Homura supports virtually all the main browsers, such as Internet Explorer 6/7/8, Mozilla Firefox, Apple Safari, Opera and Google Chrome, running on Windows, Mac OS X and most Linux distributions such as Ubuntu, SUSE, Debian and Fedora.
    Homura's companion library Net Homura provides a framework to easily create web-based applications to host your games applications. For more information regarding Net Homura and how it integrates with the Homura framework, look at the Net Homura summary or browse the related documents within the documentation section of this site.
  • Other Uses for Homura:

    Whilst Homura is primarily a games technology, its web-based focus, cross-platform capabilities and performant rendering engine open up the possibility of utilising it for various other types of application.
    Homura's web-based 3D support can be used to embed lightweight 3D canvases into a web page via the next-generation Java applet technologies, in a similar manner to Flash, with various avenues for exploration such as enhanced brochure sites, shopping sites with 3D previews of the items on sale, 3D flythroughs, avatar systems etc. Integration with JavaScript APIs allows AJAX-style calls to be made to the Homura canvas to perform updates, allowing integration with complex, multi-tiered web applications.
    Homura's canvas classes can also be embedded inside AWT, SWT and Swing applications, making the platform suitable for the creation of 3D editing tools, GIS applications, visualisation tools, 3D graphing etc. Homura can also embed Swing and SWT components inside a Homura application, meaning that existing Swing components can be integrated for various purposes (such as our MPEG4 example, making Homura a fully-functional MPEG-4 player). This has the added benefit that these components exist as either 2D or 3D entities within the scenegraph, allowing them to be integrated with the RenderState system to apply additional visual effects, to use the 3D input system to perform interactive picking actions, and much more. These tools could be made into standalone Java applications, Java Web Start applications and Next-Generation Applets, allowing them to become web-based applications.
    Homura can also be incorporated server-side as a headless rendering environment to perform 3D processing, physics calculations etc., which can be integrated with both standalone and web-based clients using Java's networking APIs. These are just a few examples; there are limitless ways of incorporating Java libraries with Homura, allowing integration with Java EE systems, JavaFX, Swing, SWT, JSR-223 scripting languages etc.
Now you know a bit more about Homura, you can start creating the next-generation of web-based games applications. What are you waiting for? Download the entire collection of APIs, technical demos and example games from the downloads section of this site.