Software:OpenIllusionist

From HandWiki
Short description: Computer program
OpenIllusionist
Developer(s): John Robinson, Dan Parnham, Sean O'Mahoney, Enrico Costanza
Initial release: 2001
Website: www.openillusionist.org.uk

The OpenIllusionist Project is a computer program for the rapid development of augmented reality applications. It provides software libraries that simplify the tasks of generating the augmented imagery, performing the computer vision needed to interpret user input, modelling the behaviour of the virtual objects (or 'agents'), and threading all of the above together to provide the illusion of reality.
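The pipeline described above - interpret a camera frame, update the agents, then project the result - can be sketched in outline. This is a hypothetical illustration only, not OpenIllusionist's actual API: the `Agent` class, the `step_world` function and the grid-of-obstacle-cells representation of detected hands and objects are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of the illusion loop: obstacles (hands, pens,
# objects) are assumed to have been segmented from a camera frame into
# a set of grid cells; agents then move and bounce off them, and would
# finally be rendered by the projector. Not the real OpenIllusionist API.
from dataclasses import dataclass


@dataclass
class Agent:
    x: float
    y: float
    vx: float
    vy: float

    def update(self, obstacles, width, height):
        nx, ny = self.x + self.vx, self.y + self.vy
        # Bounce off the table edges and any detected obstacle cell.
        if not (0 <= nx < width) or (round(nx), round(self.y)) in obstacles:
            self.vx = -self.vx
        if not (0 <= ny < height) or (round(self.x), round(ny)) in obstacles:
            self.vy = -self.vy
        self.x += self.vx
        self.y += self.vy


def step_world(agents, obstacles, width, height):
    # One iteration of the loop; in a real system a fresh frame would be
    # captured and segmented into `obstacles` before each step.
    for agent in agents:
        agent.update(obstacles, width, height)


agents = [Agent(5.0, 5.0, 1.0, 0.0)]
# One obstacle cell at (7, 5); the agent moves right toward it.
step_world(agents, obstacles={(7, 5)}, width=20, height=20)
```

Repeated calls to `step_world` would move the agent up to the obstacle and then reverse its horizontal velocity, giving the bouncing behaviour described for PenPets-style agents.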

Explanation

Open Illusionist focuses on virtually augmented environments (VAEs), where the augmentation is not worn but is instead inherently communal and environmental - most commonly, a digital projector and a video camera are used to make a surface appear to be populated by objects that the user can manipulate physically. These objects do not exist as anything other than projected computer graphics.

History

OpenIllusionist is closely connected with the Media Engineering Group (MEG) of the Department of Electronics at the University of York, United Kingdom - specifically the Visual Systems subgroup. This group was formed when John Robinson took up a professorship in the Department in 2000/2001, bringing with him a background in image coding and an interest in augmented reality.

This manifested itself in the work of three undergraduates: Dan Parnham, who experimented with interpreting the pose of a mannequin using a single webcam during his Master's degree, focusing exclusively on the input side of the augmentation problem; Sean O'Mahoney, who developed the first incarnation of what would come to be termed PenPets as his Master's project; and Enrico Costanza, who developed a variety of tangible augmented interfaces using fiducials stuck to blocks of wood, with augmentation provided variously by audio feedback ("Audio d-Touch") and a projector ("Magic Desk"). Much of Audio d-Touch was created by Costanza in his spare time as a personal project (in collaboration with Robinson and Simon Shelley, another York alumnus), while the Magic Desk became his Master's project.

All of these projects fed into a collective culture in the group, with Justen Hyde, then a research student studying the reconstruction of human facial images, drawn into the work and making minor contributions to all of the projects while officially working on none of them. The projects most often chosen for demonstrations were quickly established as PenPets (O'Mahoney) and d-Touch (Costanza), both of which shared a strong commonality: they appeared to work by magic. The computer could be hidden from view, and the user could simply interact with the augmentation directly. In the case of d-Touch, this meant moving marked blocks in front of a webcam to sample, edit and produce music with very low-cost paraphernalia - just a cheap microphone, printed fiducials and a standard PC. PenPets required more of a hardware overhead: a data projector pointed at a table, onto which agents resembling mice were projected. These would run around the table, bouncing off hands, pen marks and objects.

After O'Mahoney and Costanza left the group, in 2002 and 2003 respectively, d-Touch continued to be developed by Costanza, but PenPets remained only a proof of concept, never technically beyond the prototype stage, and was mothballed. Parnham and Hyde continued to work on other aspects of image processing. However, the desire to work on a VAE was far from lost; funding, as ever, was the main stumbling block, along with the unsuitability of the PenPets code for further development and expansion.

In 2004, a new Centre for Usable Home Technology was launched at York. A virtually augmented environment was promised as part of the launch event, but upon inspection the PenPets demo, which had been considered, was found to be unreliable outside a laboratory environment and almost impossible to maintain, with no developer available who had any experience with the code. To provide a demo, Hyde and Parnham, in their own time, drew on all that had been learned over the years of development of the various group projects and designed and built a new interactive aLife demo from scratch in the space of a week. As PenPets had never made it beyond the experimental prototype stage, they decided to bite the bullet and, rather than building just a demo, instead built a basic but extensible generic augmented reality framework upon which an aLife demo could run. This framework became the core of the OpenIllusionist Project.

Over the latter half of 2004, interest in the framework built for this demo escalated, and the advantages of maintaining a common framework for VAE development became readily apparent. Instead of weeks to build a stable VAE demo application, results could be achieved in hours. The extensible structure of the proto-Illusionist meant that applications completely unlike the original aLife demo could be supported with relative ease. In the autumn of 2004, it was decided that the framework was more useful and important than any given demo implemented on it, and that an opportunity presented itself to begin the push to get augmented desktops out of research labs and into circulation with the general public. The demo framework became an entity in its own right - the Illusionist - and was published as open source software.

Since then, development has continued, with the project still administered and run by the two founders, Dan Parnham and Justen Hyde. In 2006 "Robot Ships", an exhibit built using OpenIllusionist, was installed in the new Connect gallery at the National Museum of Scotland.

Platforms

OpenIllusionist initially ran only on Microsoft Windows, though the latest versions are implemented using wxWidgets and so are inherently cross-platform. However, because of the amount of hardware interfacing OpenIllusionist requires, video capture on other platforms (such as Linux) is still in early development.
