Reality Editor Can Empower Physical Object Connection in the Factory of the Future

The Reality Editor is a next-generation tool that empowers users to connect and manipulate the functionality of physical objects. Point a smartphone camera at an object and its invisible capabilities become visible, allowing users to edit its function. Dragging a virtual line from one object to another creates a new relationship between them. With this simplicity, it is possible to master the entire scope of connected objects.

The Reality Editor uses a new approach to Augmented Reality called Bi-Directional AR, which allows real-time interaction with the machines around you. It is the result of research at MIT aimed at creating technology that grants users maximum control by leveraging human strengths such as spatial coordination, muscle memory and tool-making.

The way we interact with computers has not changed significantly since their invention 40 years ago. The MIT Fluid Interfaces research group is radically rethinking human-computer interaction with the aim of making the user experience more seamless, natural and integrated in our physical lives. The goal is to design and develop interfaces that are a more natural extension of human minds, bodies and behavior, with novel form factors that leverage the full range of sensory capabilities and control modalities of the user.

An example of a Reality Editor application in daily life can be a light you always need to stand up to turn off: just point the Reality Editor at an object next to your seat and draw a line to the light. You have now customized your environment to serve your convenience. Spatial coordination and muscle memory can be used to easily operate objects such as tools. If you want a timer linked to the light, just borrow the functionality of an object with a timer by drawing a line from it to the light.
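Conceptually, drawing a line creates a data link: when the source object's value changes, the new value flows to the target object. The sketch below models that idea in plain JavaScript; the class and method names are illustrative assumptions, not the actual Open Hybrid API.

```javascript
// Minimal sketch of the "draw a line" idea: a link routes value changes
// from one object to another. Names are illustrative, not the real API.
class HybridObject {
  constructor(name) {
    this.name = name;
    this.links = [];    // links drawn away from this object
    this.state = null;  // last value written to this object
  }
  // Drawing a virtual line from this object to `target`
  linkTo(target) {
    this.links.push(target);
  }
  // Writing a value (e.g. pressing the switch) propagates along all links
  write(value) {
    this.state = value;
    for (const target of this.links) {
      target.write(value);
    }
  }
}

// A switch next to your seat, linked to the ceiling light:
const seatSwitch = new HybridObject("seatSwitch");
const ceilingLight = new HybridObject("ceilingLight");
seatSwitch.linkTo(ceilingLight);

seatSwitch.write("off");
console.log(ceilingLight.state); // "off"
```

The timer case works the same way: draw a line from the timer object to the light, and the timer's output becomes the light's input.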

The Reality Editor can also be applied to complex manufacturing operations. In the MIT research, simple universal rules common to all physical objects were identified (Mind Memory; Learn, Setup and Operate; How to connect everything). The Reality Editor is built on these rules to place an intuitive yet powerful technology into the hands of users.


The Reality Editor was developed on the most open and robust web standards: every visual interface is based on HTML5 and therefore leverages the full creative power of the web, allowing the Reality Editor to transform web browsing technology into an interface for physical space. In addition, connections between objects use open and stable internet standards.
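Because every visual interface is plain HTML5, an object's control panel can be sketched as an ordinary web page. The fragment below is purely illustrative: the element IDs and the messaging call are assumptions, not the actual Open Hybrid conventions.

```html
<!-- Hypothetical HTML5 interface for a light object, shown floating
     over the physical light when viewed through the Reality Editor.
     IDs and the messaging channel are illustrative assumptions. -->
<!DOCTYPE html>
<html>
  <body>
    <button id="toggle">Toggle light</button>
    <script>
      let on = false;
      document.getElementById("toggle").addEventListener("click", () => {
        on = !on;
        // A real hybrid object would publish this state change through
        // the platform; here we only post it to the embedding page.
        window.parent.postMessage({ object: "light", state: on }, "*");
      });
    </script>
  </body>
</html>
```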

The Reality Editor can be used to define simple actions, change the functionality of objects, and remix how things work and interact. It can make something virtual physical, and something physical more virtual. Through its simplicity, the Reality Editor allows users to merge two separate realities into one truly interwoven experience.

Reality Editing sounds like science fiction, but it is not. MIT has made some of its research publicly available. The Reality Editor is available in the iOS App Store and works with MIT's open-source Open Hybrid platform to build a new generation of Hybrid Objects; the approach is fully feasible for developing the next generation of manufacturing equipment user interfaces.

The factory of the future will certainly be very different from the workplace of today.

For more information: