SEOUL – VA Corporation, a South Korean virtual production company, will launch a commercial virtual location-scouting service that pre-visualizes filming locations and places users in a virtual environment. Without visiting a site in person, content creators can browse and control backgrounds and environments through the website, scouting and designing locations in advance by setting up virtual cameras and arranging lights.
VA Corporation’s platform, called “V Stage,” helps creators customize the background of their content to fit a shooting concept. The company says it is the world’s first platform that allows content developers to pre-visualize difficult-to-visit places such as deserts and jungles. The shooting environment can be monitored with a 360-degree camera.
“Web virtual search is a service that has not yet been tried,” said VA producer Choi Chan at a media event on Nov. 30 at VA’s virtual production studio in Hanam, a satellite city of Seoul. “Time efficiency can be guaranteed during the pre-production phase, as filming locations can be selected and designed in advance.”
V Stage offers some 3,100 three-dimensional spaces. “There has been no case that has tried to (introduce) web-based virtual search platforms that allow users to check all 3D environmental assets as if they were in a real place,” Choi said. Assets refer to the 3D spaces used as backgrounds in virtual production. The platform was used to recreate the Seoul of 1988 for “Seoul Vibe,” a 2022 Netflix action film with dynamic car chase scenes.
Virtual location scouting reduces the time required for pre-production and allows creators to use optimal, pre-screened settings immediately upon arriving at the virtual production studio. Because the virtual environment can be explored with a 360-degree camera, composition and shot setups can be easily checked.
VA is upgrading the beta version of V Stage, launched in September 2022, by storing motion data for digital humans. Artificial intelligence will be incorporated to combine different objects and naturally create unique backgrounds. “We will improve V Stage’s features, user experience, and user interface while increasing the use of virtual production studios by helping user-to-user communication through the establishment of asset storage and community functions,” said VA director Ko Byung-hyun.
A virtual production studio is an effective tool for content creation because it seamlessly combines physical and virtual elements using a suite of software tools. Studios can save time and money by filming on stage while viewing virtual graphics in real time. Visual effects (VFX), the integration of live-action footage and computer-generated (CG) elements to create realistic images, can be processed and iterated on during pre-production.
In May 2022, CJ ENM, the South Korean entertainment group behind “Parasite,” winner of the best picture Oscar in 2020, opened a virtual production stage equipped with cameras and a massive screen made of micro light-emitting diodes. SK Telecom, a leading mobile operator in South Korea, has formed a consortium with three virtual production companies to expand the domestic media ecosystem through its visual effects studio.
“We are introducing a new paradigm for content production by leading the growth of virtual production with preemptive investment, technology and the best partners in the domestic market,” said Ko.
Covering an area of about 15,000 square meters, VA’s virtual production studio uses the real-time rendering technology of game engines. The studio is equipped for the production of realistic content, with LED walls that create lifelike graphics, as well as VFX equipment and an extended reality (XR) operating system. XR refers to all digital environments created using virtual reality, augmented reality and mixed reality.
A key technology incorporated into the virtual production studio is “in-camera visual effects” (ICVFX), an innovative technique that allows crews to shoot multiple scenes and locations by erecting LED walls in front of the actors in the studio. With ICVFX, the LED walls deliver a high degree of precision in illumination and reflection. Less time spent on lighting setup and tuning leaves more time to focus on capturing the actual image.
“There has been no case where a movie or drama has been completely produced using the virtual production method,” said VA Convergence chief Kim Woo-hyung, adding that the company is working to improve the production quality required in the film industry.
VA aims to build a comprehensive content alliance covering the entire chain of content production and distribution. In May 2022, the company unveiled its plan to acquire ROOT M&C, a broadcast production company specializing in virtual reality and augmented reality content. VA has also made a strategic investment in Amberlin, a performance company known for producing AR and XR content with CJ ENM for various concerts, including KCON, the annual Korean culture convention for global fans.
VA is also working with LG’s AI research institute to develop digital humans. In February 2022, the two companies opened a joint research center in Hanam to develop LED walls optimized for ICVFX.
© Aju Business Daily & www.ajunews.com Copyright: All materials on this site may not be reproduced, distributed, transmitted, displayed, published or broadcast without the permission of Aju News Corporation.