Light Converse Shows Complex Video Mapping at Prolight + Sound

Light Converse Ltd. recently presented one of the distinguishing features of its LightConverse 3D Show Platform at the Prolight + Sound show in Frankfurt. The exhibition, which took place April 6-9, allowed the software company to show off precise, real-time video mapping in its just-released version 51.

Visitors to the Hall 11.0 booth enjoyed a view facilitated by six projectors shooting perfectly mapped video onto a four-sided surface. Total screen space was 9 sq. m., with the distance between the projectors and the screen surface a short 1.5 m. The seamlessly blended projection was made possible by the software's new 3D mapping engine and its patented method for automatic projection adjustment based on inverse transformation, which performs precise automatic blending before the video signal is sent to the real-world projectors.

The 3D mapping engine solves common video projection problems, such as projecting video or images onto objects of any geometric complexity. The process can be divided into several simple steps: set up virtual projectors in LightConverse exactly as they are installed in the real world; use the software's material editor to map video or images onto a replicated 3D model of the real-world object; and, finally, send DVI signals from the computer to the real-world projectors (3-15 outputs, depending on the LightConverse version). Each virtual projector simultaneously functions as a virtual camera, allowing it to "see" the correctly mapped virtual world and output that view to the corresponding real projector. Video sources can be fed live into LightConverse from any media server or played from on-board .avi files.

Also on display was the LightConverse Server-Studio, a new hardware solution that can visualize up to 1,536 fixtures (96 universes) in real time in conjunction with the 3D Show Platform software.
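The core idea behind this kind of projection mapping can be illustrated with a standard pinhole projection model: if a virtual projector is placed with the same pose and optics as the real one, rendering the textured 3D model from that viewpoint produces an image that lands correctly on the physical surface. The sketch below is not LightConverse's actual (patented) method; it is a minimal illustration of the underlying geometry, with all intrinsics and pose values chosen hypothetically (the 1.5 m distance echoes the booth setup described above).

```python
import numpy as np

# Hypothetical pinhole model: map a 3D point on the projection surface
# into a virtual projector's pixel space. Rendering the whole scene from
# this viewpoint yields a pre-warped image for that projector.

def project_point(point_world, K, R, t):
    """Map a world-space point to projector pixel coordinates."""
    p_cam = R @ point_world + t      # world -> projector coordinates
    u, v, w = K @ p_cam              # apply projector intrinsics
    return np.array([u / w, v / w])  # perspective divide

# Assumed intrinsics: focal length 800 px, principal point (400, 300)
K = np.array([[800.0,   0.0, 400.0],
              [  0.0, 800.0, 300.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                        # projector axis-aligned with the world
t = np.array([0.0, 0.0, 1.5])        # 1.5 m from the surface

# The centre of the surface projects to the principal point
pixel = project_point(np.array([0.0, 0.0, 0.0]), K, R, t)
print(pixel)                         # -> [400. 300.]
```

In a real multi-projector rig, each projector gets its own K, R, and t (typically recovered by calibration), and overlapping regions are cross-faded for blending; the article's point is that LightConverse computes all of this automatically before the signal leaves the computer.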