Open Access

Integrating building information modeling and virtual reality development engines for building indoor lighting design

Visualization in Engineering 2017, 5:19

Received: 18 May 2017

Accepted: 11 October 2017

Published: 23 October 2017



Lighting simulation tools are extending the functionality of Building Information Modeling (BIM) authoring software applications to support the lighting design analysis of buildings. Although such tools enable quantitative and qualitative analysis and visualization of indoor lighting, they do not provide an interactive environment between users and the design context. Moreover, their visualization environments do not allow users to experience visual phenomena such as glare. In addition, the lighting energy consumption data generated by traditional tools are often separated from the 3D virtual context of the building. As a result, designers may misinterpret the relationship between their preferred lighting design and its energy implications.


This research proposes a method and develops a BIM-based lighting design feedback (BLDF) prototype system for realistic visualization of lighting conditions and the calculation of energy consumption.


The results of a case study revealed that BLDF helps design stakeholders better perceive and optimize lighting conditions in order to achieve a higher degree of satisfaction in terms of lighting design and energy savings for future occupants.


The developed system utilizes an interactive and immersive virtual reality (VR) environment to simulate daylighting and the illumination of artificial lights in buildings and visualizes realistic VR scenes using head mounted displays (HMD). BLDF allows users to interact with design objects, to change them, and to compare multiple design scenarios, and provides real-time lighting quality and energy consumption feedback.


Building Information Modeling (BIM), Serious game simulation, Indoor lighting design, Design feedback, Virtual reality (VR), Lighting design optimization, Energy feedback


Lighting design relies on a combination of scientific, aesthetic, and human factors (Benya et al., 2001). The following determinants are amongst the factors that need to be considered in lighting design: daylight availability, aesthetics, lighting quality, and efficiency of lighting. Benya et al. (2001) stated that lighting design primarily affects energy efficiency and lighting quality. The World Energy Council (World Energy Resources 2013 Survey, 2016) estimated that buildings account for nearly 40% of the total global energy consumption. A major proportion of the total energy demand for buildings is electricity for lighting. For example, lighting consumes approximately 16.7% of commercial building energy in the U.S. (Book, 2016), 20 to 40% in large office buildings in China (Zhou et al., 2015), and 40% in commercial buildings in Japan (The Energy Conservation Center Japan, 2010). A reduction in lighting energy consumption can make a substantial contribution toward lowering the energy demand for buildings.

The use of Building Information Modeling (BIM)-based computer simulation tools is growing rapidly, and such tools assist designers in making better decisions to reduce energy consumption and to create better lighting conditions for occupants. Several existing commercial simulation tools that are compatible with BIM have been widely used in visualizing, identifying, and examining indoor lighting performance as well as simulating lighting energy consumption. Conventional simulated outputs often comprise quantitative and qualitative data, and the visualization features are based primarily on static two-dimensional (2D) images. Energy consumption outputs are usually generated as numerical data and presented as complex graphs, documents, and/or tables (Sarhan & Rutherford, 2009; Hailemariam et al., 2010), which are separate from the visual context of the building (Hailemariam et al., 2010). Such numerical data are typically used by experts experienced in interpreting them. However, this creates a barrier to collaboration between design teams and clients who are not familiar with interpreting numerical results, and it prevents the parties from understanding the energy requirements of different design options.

Some BIM-based lighting analysis tools, such as Elumtools (What Is ElumTools™?, 2016) and Dialux (Lighting Design Software DIALux, 2016), provide three-dimensional (3D) visualizations and walkthrough features. However, such tools can only provide static outputs with no interaction between the users and the environment (e.g., Bille et al., 2014; Kumar et al., 2014), and are inadequate for dynamically updating or changing design parameters (Kensek & Noble, 2013) and providing simultaneous energy feedback when comparing multiple design scenarios. Moreover, when using traditional tools, designers are unable to directly experience lighting phenomena that may affect the visual perception of the occupants, e.g., brightness, darkness, glare, or insufficient illumination. In addition, when using conventional BIM authoring software applications and their plugins, tasks related to preparing lighting simulations and sharing information between different tools are very time-consuming (Huang et al., 2008), as is the rendering step required to achieve photo-realistic output.

On the other hand, virtual reality (VR) provides new perspectives for designers to visualize their designs through an immersive experience (Weidlich et al., 2007). Game engines are designed for creating dynamic activities, interacting with objects (Edwards et al., 2015), and providing accurate and timely feedback when users interact with building elements in a virtual environment. To this end, coupling BIM and game engines can extend the capabilities of BIM and make it a more powerful tool for addressing the aforementioned issues.

Therefore, this paper proposes a framework for integrating BIM and a game engine for lighting design visualization and analysis. The objectives of this research are: (1) to investigate a method by which to use serious game simulations for lighting design visualization and performance analysis; (2) to develop a prototype system with a user-friendly interface, which integrates BIM and a game engine to support lighting designers in identifying optimal design parameters; (3) to investigate the applicability of the system for assessing lighting design factors, such as visual comfort, energy consumption, and lighting performance; and (4) to validate the proposed method and assess the performance of the developed tool using a real-world case study.

This paper presents a prototype system called BIM-based lighting design feedback (BLDF), for realistic visualization of lighting conditions and calculation of lighting energy consumption with a user-friendly interface using an interactive and immersive VR environment. BLDF provides qualitative and quantitative outputs related to lighting design. The BIM database and scripting features of the game engine are used to create a robust prototype system that performs fast calculation and image rendering. VR technology enables designers to perceive realistic lighting scenes in their designs. Tools such as Autodesk Revit, Unreal Engine, and its scripting environment are used in our system to create an interactive environment that allows changing and visualizing lighting scenes and to facilitate comparison of different design scenarios in our experiments.

The main contributions of the present research are: (1) the proposal of a novel method and a tool for supporting lighting design to assess the required amount of light, the lighting aesthetic, and the total energy consumption using an interactive and immersive VR environment; (2) the proposal of an alternative approach to use a virtual environment to facilitate collaborative and participatory design between designers and clients that provides an efficient decision support system for indoor lighting design; (3) the development of a system that reduces the time required for calculating the illuminance level and achieving a more realistic lighting scene, as compared to conventional lighting simulation software; and (4) the investigation of a method by which to transfer lighting design properties from the VR environment to the BIM database.

Literature review

Lighting design review in the architecture, engineering, and construction (AEC) industry

BIM-based simulations are increasingly being used to assess the success or failure of indoor lighting design. Lighting simulation plugins for BIM applications help users to analyze design options in order to provide sufficient lighting for occupants and improve building energy efficiency and overall building performance (Aksamija & Zaki, 2010). Two types of lighting simulation plugins are compatible with BIM: external plugins and internal plugins (e.g., Lighting Analysis in Revit, Lighting Assistant in 3ds Max, Radiance, Daysim, Diva, and DesignBuilder). Internal plugins or internal extensions can be added to BIM applications, e.g., Revit, to enable the highest degree of interoperability (Nasyrov et al., 2014). External plugins require data exchange across platforms to perform simulations and visualizations. Generating quantitative results is the main use of such plugins. However, the qualitative results of lighting design, which refer to the illumination of an entire scene and involve complex characteristics such as aesthetics, are difficult to quantify (Sorger et al., 2016).

A number of previous studies proposed methods for using lighting analysis tools with the BIM model. For example, Özener et al. (2010) developed BIM-based daylighting simulations and analyses for educational purposes. Such tools help students to understand the impact of design decisions on the performance of daylight and artificial lights in different design options. Daysim, Radiance, and Ecotect were used in their experimentation. In addition, Yan et al. (2013) and Kota et al. (2014) developed a system to connect the BIM database and building energy modeling (BEM) to support thermal simulations and daylighting simulations using an application programming interface (API) called Physical BIM (PBIM). Stavrakantonaki (2013) proposed a framework for selecting different BIM tools for daylighting assessment. DesignBuilder, Diva, and Ecotect were used in their experiments. However, Hu and Olbina (2011) argued that a limitation of traditional lighting simulation tools, e.g., Radiance, is that the setup process is time-consuming and such tools are not easy to use in real-time simulations.

Integrating building information modeling (BIM) with a game engine

BIM has created an opportunity for stakeholders to collaborate and share multidisciplinary building information (Eastman et al., 2008) that can help designers to save time when creating new models (Ham & Golparvar-Fard, 2014). Integration with computer gaming environments is a new capability of BIM technology that allows users to better observe facilities and experience their surrounding environment before it is constructed (Figueres-Munoz & Merschbrock, 2015). Coupling a game engine with BIM can create a highly interactive environment with a digital geometry model derived from the BIM software. Shiratuddin and Thabet (2011) indicated that integration of BIM and gaming technology has the potential for combinatorial innovation. A number of previous studies have proposed methods for integrating game engines, BIM, and VR for various purposes, e.g., design review (e.g., Shiratuddin & Thabet, 2011; Wu, 2015), construction management (e.g., Bille et al., 2014), education (e.g., Wu, 2015; Goedert et al., 2011), energy conservation (e.g., Niu et al., 2015; Jalaei & Jrade, 2014; Brewer et al., 2013), real-time architectural visualization (e.g., Yan et al., 2011), and design feedback (e.g., Hosokawa et al., 2016; Fukuda et al., 2015; Bahar et al., 2013; Motamedi et al., 2016).

A few studies have focused on using game engines and VR for lighting design visualization. For example, Gröhn et al. (2001) presented a method for visualizing indoor climate and visual comfort in a 3D VR environment. In their method, photo-realistic visualization was used to show the lighting distribution on the surfaces of the architectural model. However, they did not include real-time control of lighting conditions. In addition, Sik-lányi (2009) studied how using different types of lights influences the development of realistic VR scenes. Sampaio et al. (2010) developed a virtual interactive model as a tool to support building management by focusing on the lighting system. Santos and Derani (2003) developed an immersive VR system for interior and lighting design using the cave automatic virtual environment (CAVE). This system provided users with an interface to change the design elements of the building, e.g., walls, floor, furniture, and decoration, in a virtual environment. However, their system focused only on the appearance of the lighting design, using ray tracing as the rendering algorithm, and did not provide a function to simultaneously generate quantitative outputs of lighting design, e.g., illuminance levels and energy consumption feedback.

Methods for fully integrating the BIM database, game engine, and VR for real-time lighting design and energy consumption feedback have not yet been proposed, and this topic remains an area of research. Therefore, the present paper investigates a new method that uses BIM, game engines, and VR to facilitate lighting design and lighting energy performance analysis via an immersive and interactive user experience.

Lighting design metrics in the game environment

Lighting plays an important role in indoor environmental conditions. Lighting performance depends on several factors, such as energy cost, energy savings, illuminance, and energy consumption (Deru et al., 2005). Designers want to achieve an optimal design with minimum energy consumption. The design of daylighting and artificial light relies on a combination of specific scientific principles, established standards and conventions, and aesthetics (Benya et al., 2001). The following five physical parameters that influence occupants’ visual comfort in architectural spaces should be considered: lighting illuminance level, lighting distribution (diffusion), correlated color temperature (CCT), brightness ratio, and glare (Fielder, 2001; Descottes & Ramos, 2011). Designers must ensure that the visual comfort parameters comply with lighting design standards. Regarding human visual perception, the most important variables are CCT and illuminance level (Shamsul et al., 2013). Visual perception is defined as the capability to interpret visual sensations when people perceive their surrounding environment. The illuminance value is an indicator for verifying the quantity of light in a given environment. International lighting standards and building regulations specify the level of illuminance required to provide visual comfort and facilitate human visual performance. Correlated color temperature describes the characteristics of light colors. Color temperature plays a particularly important role in both the physical properties of light and the physiological and psychological responses of humans when light enters the eyes (Descottes & Ramos, 2011; Shamsul et al., 2013).

VR experience is based on users’ perception of the virtual world (Mihelj et al., 2013). VR technology enables the creation of realistic models and makes scenes look real (Stahre & Billger, 2006). Immersion (perception) and real-time interaction (action) are two properties of VR. VR allows users to experience an artificial environment through human senses (Souha et al., 2005). Head mounted displays (HMDs) have been used as 3D immersive tools to provide an experience that is close to the human perception of reality (Ciribini et al., 2014; Scarfe & Glennerster, 2015).

In the game environment, lighting plays a significant role in making scenes look realistic. Game engine technology has lighting features that resemble real-world lighting (Shiratuddin & Thabet, 2011). Lighting simulation for gaming and VR presentations relies on a mixture of rendering algorithms, including path tracing and photon mapping. It is possible to effectively and physically simulate indoor illuminated environments at a level of quality where concepts of photorealism and perceptual accuracy can be discussed and measured (Murdoch et al., 2015). Lighting simulation in game engines can produce realistic scenes based on the inverse square law, which states that the illuminance on a surface is directly proportional to the luminous intensity of the source and inversely proportional to the square of the distance between the light source and the lighted surface (IESNA, 2000; de Rousiers & Lagarde, 2014). Other lighting metrics, such as luminous flux and light color, are also provided in the game engine. Luminous flux (light output or intensity), measured in lumens (lm), quantifies the light emitted from the lamp to illuminate the entire scene. Light color is expressed as color temperature (kelvins, or the SI abbreviation “K”). Another parameter for measuring light is luminance, measured in footlamberts (fL) or candelas/m2 (cd/m2). Luminance refers to perceived brightness, which describes the amount of light delivered into a space and reflected off its surfaces, and affects the human ability to see (The Energy and Resources Institute (TERI), 2004). Luminance measurements characterize how interior and exterior finishes reflect light rather than lighting quality itself. Nevertheless, luminance is a useful baseline metric for identifying sources of glare within a person’s field of view.
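As a simple numerical illustration of the inverse square relationship described above, the following Python sketch computes point-source illuminance; the function name and sample values are illustrative, not part of the BLDF system:

```python
def illuminance_lux(luminous_intensity_cd: float, distance_m: float) -> float:
    """Point-source illuminance via the inverse square law: E = I / d^2,
    with I in candelas and d in meters, giving E in lux."""
    if distance_m <= 0:
        raise ValueError("distance must be positive")
    return luminous_intensity_cd / distance_m ** 2

# Doubling the distance quarters the illuminance:
print(illuminance_lux(800.0, 1.0))  # 800.0 lx at 1 m
print(illuminance_lux(800.0, 2.0))  # 200.0 lx at 2 m
```

This is the same falloff behavior that physically based game-engine lights approximate when attenuating light with distance.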

The advanced dynamic lighting features in game engines allow pre-visualization of various designs. Consequently, lighting features in game engines can replace conventional methods and enable faster lighting simulation when parameters change. Moreover, players can perceive the characteristics of light in order to determine which design pattern is more appropriate for the users.

Human visual perception through VR technology

A number of previous studies have examined the visual perception dimensions of VR technology. Menshikova et al. (2012) studied the Simultaneous Lightness Contrast (SLC) illusion, and the results of their study showed that VR technologies may be effectively used in studies of lightness perception. Their study showed that VR technology enables reproducing visual illusions in depth and constructing complex 3D scenes with controlled parameters to create articulated effects. Scarfe and Glennerster (2015) stated that immersive VR opens up new possibilities for studying visual perception under much more natural conditions.

The quality of an HMD display screen depends on its resolution as well as the luminance and contrast ratio of the screen, field of view (FOV), exit pupil, eye relief, and overlap (Vince, 2004). Contrast is the ratio between the higher and lower luminance values that define the feature to be detected; a value of 100:1 is typical. Luminance is the amount of light energy emitted or reflected from a screen in a specific direction (Fuchs, 2017) and a measure of a screen’s brightness; ideally, it should exceed 1000 cd/m2 (Vince, 2004). The luminance that can be observed in nature covers a range of 10⁻⁶ to 10⁶ cd/m2, and the human visual system can accommodate a range of about 10⁴ cd/m2 in a single glance (Fuchs, 2017). The FOV is a measure of the horizontal and vertical angle of view. The horizontal FOV of HMDs is restricted to about 100–120 degrees for each eye (Fuchs, 2017).

VR has great potential to become a usable design tool for the planning of lighting design in buildings (Stahre & Billger, 2006). However, due to technological limitations, there are still problems in representing realistic lighting conditions. A virtual world may produce semi-realistic lighting phenomena; however, precisely perceiving the illuminance and glare of scenes in an HMD is impossible with current technology. The FOV of a typical head-based display is limited, and the resolution can vary from the relatively few pixels offered in early systems to very high resolutions (Sherman & Craig, 2003). Ideally, the screen of a VR headset should display a huge number of pixels, about 6000 pixels horizontally and 8400 pixels vertically, which is not achievable with current technology (Fuchs, 2017). Although visual perception through current technology still differs from reality, there are many visual details in the luminous environment that cannot be expressed through numerical information, such as misaligned luminaires and disturbing light-shade patterns (Inanici, 2007).

Discomfort glare indices

Glare refers to physical discomfort caused by a non-uniform luminance distribution within the visual field (Fasi & Budaiwi, 2015). Glare occurs in two ways: excessive brightness and an excessive range of luminance in a visual environment (IESNA, 2000). Glare is typically divided into three categories: 1) disability glare, which is caused by areas in the field of view whose luminance is greater than that to which the eyes can adapt; 2) veiling glare, which is caused by reflections on specular or diffusive materials that might reduce the contrast and visibility of the task; and 3) discomfort glare, a psychological sensation that does not necessarily impair vision in the short term and can remain unnoticed by observers, but can cause annoyance, headaches, or eyestrain after long exposure (Abraham, 2017). Discomfort glare is caused by luminances that are high relative to the average luminance in the field of view (IESNA, 2000). A glare index is a numerical evaluation of the acceptability of the presence of glare using a mathematical formula.

Several principal indices of discomfort glare have been proposed to quantify discomfort glare generated from small lighting sources (e.g., artificial lighting) and large luminous sources (e.g., windows) (Abraham, 2017). For discomfort generated by small sources, examples of discomfort indices include visual comfort probability (VCP) (Guth, 1963), unified glare rating (UGR) (Sorensen, 1987), and the CIE glare index (CGI) (Einhorn, 1973). Examples of discomfort indices for large luminous sources include the daylight glare index (DGI) (Chauvel et al., 1982) and daylight glare probability (DGP) (Wienold & Christoffersen, 2006). VCP is a rating that evaluates lighting systems in terms of the percentage of people who will find a given level of glare acceptable (IESNA, 2000). VCP is defined as a number that corresponds to a percentage on a scale of 0–100. UGR is used to measure glare possibility in a given environment. UGR is the logarithm of glare from electric light sources (Boyce, 2003). UGR values range from 10 for low levels of discomfort glare to 28 for high levels of discomfort glare. The DGI was developed to predict glare from large-area sources, such as a window. The DGI maps quantitative values onto categorical ratings, from 16 for just noticeable to 28 for intolerable glare (Hirning et al., 2014). DGP is a metric for estimating the appearance of discomfort glare in daylit spaces. DGP considers the vertical illuminance at eye level as well as the glare source luminance, its solid angle, and its position index (Wienold & Christoffersen, 2006).
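To make one of these indices concrete, the following Python sketch evaluates the standard UGR formula, UGR = 8 log10[(0.25/Lb) Σ (L²ω/p²)]; the sample background luminance, source luminance, solid angle, and position-index values are illustrative assumptions, not data from this study:

```python
import math

def unified_glare_rating(background_luminance, sources):
    """UGR = 8 * log10((0.25 / Lb) * sum(L^2 * omega / p^2)), where Lb is the
    background luminance (cd/m2) and each source is a tuple of source
    luminance L (cd/m2), solid angle omega (sr), and Guth position index p."""
    total = sum(L ** 2 * omega / p ** 2 for L, omega, p in sources)
    return 8 * math.log10(0.25 / background_luminance * total)

# One bright luminaire slightly off the line of sight (values illustrative):
ugr = unified_glare_rating(30.0, [(20000.0, 0.001, 1.5)])
print(round(ugr, 1))  # a value toward the high end of the 10-28 UGR scale
```

Note how a single small but very bright source can push the rating toward the intolerable end of the scale, matching the qualitative description above.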

Recently, most architects and lighting designers have preferred not to be guided by glare indices (Hirning et al., 2014). However, preventing glare is an important design consideration (Reinhart et al., 2012), for example for work safety. Various studies have found that other potential subfactors influence the large variations in individual glare sensitivity, such as the age of the person (Reinhart & Wienold, 2011; Van den Berg et al., 2009), gender (Wolska & Sawicki, 2014), culture (Amirkhani et al., 2016), and physical differences (Pulpitlova & Detkova, 1993). Although various metrics have been developed to quantify glare, a major obstacle in quantifying discomfort glare is the difficulty of analyzing complex lighting distributions due to the dynamic nature of daylight (Hirning et al., 2014). Each equation evaluates glare differently, and the evaluation of discomfort glare indices in practice is complicated (Stone & Harker, 1973; Clear, 2012). However, in advanced lighting research, the assessment of discomfort glare in spaces increasingly involves the use of simulated high dynamic range (HDR) images (Boyce, 2003; Giraldo Vásquez et al., 2016).

A number of studies have focused on using HDR images for discomfort glare analysis from daylight, e.g., (Clear, 2012; Suk et al., 2017; Mcneil & Burrell, 2016). Existing glare assessment studies were typically based on real-world buildings. A few studies have proposed methods to simulate light illuminance and glare in a virtual environment, e.g., (Mangkuto et al., 2014; Inanici & Mehlika, 2003). HDR images store luminance data on a pixel scale, which makes it possible to detect glare sources and compute discomfort glare metrics. With the digital evaluation of HDR images, it is possible to integrate the HDR-rendered scenes generated by a game engine with glare evaluation tools, such as Evalglare, Photolux, and Findglare, to improve lighting design and discomfort glare measurement during the design process.
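Pixel-based glare detection on an HDR luminance map can be sketched as follows; the fixed threshold (pixels brighter than five times the scene mean, similar in spirit to the default criterion of tools such as Evalglare) and the list-of-lists input format are simplifying assumptions:

```python
def detect_glare_pixels(luminance_map, threshold_factor=5.0):
    """Return (row, col) indices of pixels whose luminance exceeds
    threshold_factor times the scene mean. luminance_map is a 2D list
    of per-pixel luminance values in cd/m2, as stored in an HDR image."""
    flat = [v for row in luminance_map for v in row]
    mean = sum(flat) / len(flat)
    cutoff = threshold_factor * mean
    return [(r, c)
            for r, row in enumerate(luminance_map)
            for c, v in enumerate(row)
            if v > cutoff]

# A mostly dim scene with one very bright pixel:
hotspots = detect_glare_pixels([[50.0, 50.0, 50.0],
                                [50.0, 50.0, 5000.0]])  # -> [(1, 2)]
```

Clusters of such flagged pixels would then be grouped into glare sources before a discomfort metric such as UGR or DGP is computed.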

Overview of the proposed method

The goal of the present study is to develop a system for visualizing lighting design that allows users to experience, analyze, and assess the lighting quality of their designed space in an immersive environment. The proposed methodology comprises seven major steps, as shown in Fig. 1. The first step is to create the building model using a BIM authoring software application. In this step, a 3D BIM model comprising the virtual equivalents of real building elements is created using a BIM tool, e.g., Autodesk Revit. Building elements (e.g., walls, columns, ceiling, floor, furniture, light bulbs, and fixtures), and their geometric and non-geometric information (e.g., material and properties) are created in the BIM software (Fig. 1a). The model is then exported to a game engine (Fig. 1b). The next step is to adjust the model in the game environment and to set initial values using the user interface (Fig. 1c). Then, visualizations such as lighting illumination (shown with realistic scenes or false-colors), lighting atmosphere, and energy consumption feedback are provided (Fig. 1d). In this step, lighting simulation is performed in the game environment using an embedded physics engine. The fifth step is to use immersive visualizations provided in the system to identify and analyze the conditions of the lighting design by visually analyzing the lighting conditions, examining visual comfort to achieve an optimum lighting design, and analyzing and comparing quantitative information (Fig. 1e). If the design is not satisfactory, the system provides options to change and update the parameters for simulating new scenarios and visualizing lighting results via its graphic user interface (GUI) widgets (Fig. 1f). Once a satisfactory design is achieved, new design parameters are updated in the BIM software (Fig. 1g).
Fig. 1

Overview of the proposed method

BIM-based lighting design feedback is a system for visualizing lighting design using VR technology. This system provides two types of game views: first-person and bird’s-eye views. In the first-person view, an HMD can be used. The HMD responds to the motion and rotation of the user’s head and provides a lifelike lighting experience in game scenes. A first-person view is also suitable for perceiving phenomena caused by excessive luminance in the field of view (FOV), e.g., glare. A bird’s-eye view (or top-down view) is provided in order to clarify the overall setup of the environment and the position of the player when moving in the game environment. Realistic scenes and false-color scenes are the two types of visualization outputs of the BLDF system. The system calculates the amount of illuminance and generates real-time false-color visualizations. Moreover, the system helps to verify the illuminance level of the design area. A mouse and a keyboard are used as input devices to help users navigate in the first-person view and change parameters.

Development of the BLDF system

Autodesk Revit (Revit Family, 2016), 3ds Max (3ds Max 3D Modeling, 2016), Unreal Engine (What Is Unreal Engine 4, 2016) with its scripting feature, and the visual programming in Dynamo (Dynamo, 2016) are used to develop the BLDF system.

The built-in lighting feature in Unreal Engine 4 uses physically based shading, in which the lighting and shading algorithms approximate the physical interaction between light and materials (Walker, 2014). For example, light intensity in lumens falls off at a rate that follows the inverse square law (Bruneton & Neyret, 2008). The BLDF system uses the built-in lighting algorithm in Unreal Engine to generate lighting scenes in a virtual environment. To simulate daylighting, Unreal Engine 4 uses the Bruneton sky model, an accurate method for simulating skylight in real time that considers light-scattering effects, such as Rayleigh and Mie multiple scattering (Bruneton & Neyret, 2008). The precomputed lighting feature in Unreal Engine is available for evaluating lighting results at runtime. Ray Traced Distance Field soft shadows are used for shadow rendering in Unreal Engine 4 (Ray Traced Distance Field Soft Shadows, 2017).

BLDF system user interface components and supported interactions

The prototype system supports interactivity with which users can experiment and adjust lighting design parameters in real time. The following interactions are supported: (a) First-person movement: to control the movement of users’ avatars as they walk in the space; (b) User interface control: to customize parameters such as time (to observe the dynamics of sunlight), lighting fixtures, types, orientation, intensity, and color temperature of the bulbs, as well as the location of furniture. Unreal Motion Graphics (UMG), a visual user interface in Unreal Engine, is used to create various GUI elements in the BLDF system. The GUI interface of the proposed system is shown in Fig. 2. The menus of the game environment enable players to change parameters and freely navigate among game objects using input controllers. The following widgets are created, as shown in Fig. 2: (a) Plan view: to show the layout of furniture and the position of the user. Users can switch between false-color and realistic views; (b) Shading devices menu: to change the type of shading for windows; (c) Material types menu: to change the materials of indoor surfaces; (d) Time slider: to set the time for adjusting the daylight; (e, f) Light switches: to turn on/off individual artificial lights; (g) Lighting intensity menu: to change the intensity of light sources; (h) Color temperature control: to change the color of lights; (i) Lighting fixture types menu: to choose the type of lighting fixtures; (j) Moving and rotation tools: to move and rotate the light source, lighting fixtures, and furniture; (k) Lighting illuminance legend: to help measure the illuminance level (for false-color views); (l) Total energy usage: to display the total lighting energy consumption of the room; (m) Compass: to show the orientation of users when moving; (n) Lighting occupancy sensor: to switch lights on/off when the space is occupied and unoccupied; (o) Task light: to add individual light controls to each workstation.
Fig. 2

Screenshot of the main interface of the prototype BLDF system with all widgets visible

The global illumination algorithm in the Unreal Engine is used to automatically produce lighting illumination scenes and calculate detailed shadows at runtime. The scenes can also be visualized in false-color, which shows the illuminance levels across the scene. Red indicates an illuminance level that exceeds 1000 lx, and dark blue indicates an illuminance level of less than 100 lx. The user can hide/unhide each widget on the viewport. A keyboard and mouse are used as input devices to change design parameters in the proposed system. The designer can change the light intensity (measured in lumens) and the color temperature of a light (measured in kelvins) using the associated widgets on the interface (Figs. 2g and h). Additional lighting equipment items are modeled in the BIM application and are added to the game environment as an alternative equipment repository. Alternatively, libraries of lighting elements can be imported from the BIM application into the game engine. After adding items to the equipment repository, the new types of lighting equipment are shown on the interface (Fig. 2i). The material and texture of indoor envelopes, e.g., the floor, walls, and ceiling, can be modified using the developed widget (Fig. 2c).
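The false-color mapping described above can be sketched as a simple threshold lookup. This is an illustrative Python sketch, not the engine's actual shader; only the two endpoint thresholds (red above 1000 lx, dark blue below 100 lx) come from the text, and the intermediate bands are assumptions for illustration.

```python
# Map an illuminance value (lux) to a false-color legend bucket.
# Only the <100 lx and >1000 lx endpoints are stated in the paper;
# the intermediate thresholds below are hypothetical.
FALSE_COLOR_SCALE = [
    (100, "dark blue"),   # < 100 lx: under-lit
    (300, "blue"),
    (500, "green"),       # around a typical office task minimum
    (750, "yellow"),
    (1000, "orange"),
]

def false_color(lux: float) -> str:
    """Return the legend color for a given illuminance value."""
    for upper_bound, color in FALSE_COLOR_SCALE:
        if lux < upper_bound:
            return color
    return "red"  # > 1000 lx: over-lit / potential glare
```

A real implementation would evaluate this per pixel or per measurement point in the rendered scene rather than per scalar value.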

Ambient lights can be switched on/off using the light switch panel (Fig. 2e). The system provides the ability to set up lighting zones when lighting control per zone is required. An additional option for lighting control is the lighting occupancy sensor, which automatically controls ambient lights and task lights (Figs. 2n and o) in the BLDF system. In order to implement occupancy sensing, the Box Trigger tool and visual scripting of Unreal Engine are used. The lights will automatically turn on when the user walks into a specific zone and automatically turn off when the specific zone is vacated.
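The occupancy-sensing behavior above can be sketched with an axis-aligned zone standing in for Unreal's Box Trigger volume. The class and field names are illustrative, not Unreal Engine API.

```python
from dataclasses import dataclass

@dataclass
class LightingZone:
    # Axis-aligned bounds of the trigger zone (stand-in for a Box Trigger)
    x_min: float
    x_max: float
    y_min: float
    y_max: float
    lights_on: bool = False

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

    def update(self, avatar_x: float, avatar_y: float) -> None:
        # Lights turn on when the avatar is inside the zone, off when it leaves.
        self.lights_on = self.contains(avatar_x, avatar_y)

zone = LightingZone(0.0, 4.0, 0.0, 3.0)
zone.update(2.0, 1.5)   # avatar walks into the zone
zone.update(10.0, 1.5)  # avatar leaves the zone
```

In the actual system this logic is wired in visual scripting, with separate triggers for ambient and task lights.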

The total energy consumption of the lighting equipment is shown in the energy consumption bar. A simple formula is used to calculate the lighting power in watts (number of fixtures × lamp power (W) = total power). The system also provides an interface to enter the time and date for adjusting the daylight duration (Fig. 2d). In order to simulate sunlight, a directional light is added as an actor in the game environment, and visual scripting is used to automatically control the yaw and pitch of the sun, i.e., its position, based on the time and the season.
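Driving the sun's pitch (altitude) and hour rotation from the time and day of year, as the visual script does for the directional-light actor, can be sketched with the standard simplified solar-position equations. The paper does not give its exact sun model, so this is an assumption-based illustration; the latitude value is only an example.

```python
import math

def sun_angles(day_of_year: int, hour: float, latitude_deg: float):
    """Return (altitude_deg, hour_angle_deg) from a simple solar-position model."""
    lat = math.radians(latitude_deg)
    # Solar declination (Cooper's approximation)
    decl = math.radians(
        23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year)))
    )
    # Hour angle: 15 degrees per hour away from solar noon
    hour_angle = 15.0 * (hour - 12.0)
    h = math.radians(hour_angle)
    sin_alt = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(h))
    altitude = math.degrees(math.asin(sin_alt))
    return altitude, hour_angle

# At solar noon on an equinox, the altitude is approximately 90° - latitude.
alt, ha = sun_angles(81, 12.0, 34.7)  # Osaka is near 34.7° N
```

In the game engine, the altitude would set the directional light's pitch and the hour angle its yaw.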

Alternatives for shading devices, e.g., horizontal panel, horizontal multiple blades, vertical fin, slanted vertical fin, and egg crate (Fig. 2b), are provided in the system. Shading device models are created in BIM and are imported into the game environment as an alternative equipment repository. Users can compare the influence of shading devices on indoor lighting. The illuminance range legend (Fig. 2k) is used to identify the illuminance values, and the compass (Fig. 2m) helps to identify the orientation. Fig. 2a shows a false-color visualization, which helps designers to preview the amount of illuminance. The sample visualizations in Fig. 3 show the results of various simulations when parameters are changed using the developed widgets.
Fig. 3

Sample visualizations using BLDF's GUI

BLDF’s usage process flow

The usage process flow of the BLDF system (as shown in Fig. 4) is as follows: (a) The BIM model is created in Autodesk Revit, in which the geometry and material properties of the building elements are modeled; (b) A static mesh is created after exporting the FBX file from Revit to 3ds Max; (c) A 3D geometry in the game environment is automatically created after importing the FBX file of the BIM model into Unreal Engine. Due to interoperability issues between the current versions of the applications, although the geometry information of the building elements and lighting equipment is successfully imported into Unreal Engine, some of their properties, such as color/texture and lighting properties, are not transferred; (d) Light bulbs and their properties (e.g., intensity and color temperature) are manually added in the game engine. In addition, the orientation, the scale of the building, and the colors and textures of the interior envelopes are manually configured in the game environment; (e) Lighting parameters, such as the intensity, color temperature, positions of lighting equipment, and interior materials, are configured using the GUI (as explained in Subsection 4.1); (f) The simulation is executed, which enables users to immediately view quantitative results (e.g., energy consumption) and visualizations; (g) The user analyzes the lighting design outputs; (h) If the results are satisfactory, the game parameters are saved and a text file is generated; (i) The final design parameters, such as the lighting intensities, color temperatures, and positions of fixtures and bulbs, are updated in the BIM using visual programming in Dynamo (explained in Subsection 4.3); (j) The BIM model is saved with the updated parameters. The user can set new parameters and run simulations until a satisfactory design is achieved. Realistic scenes with walk-through capability support qualitative assessment, and the false-color scenes support quantitative assessment.
The system is used in design meetings to facilitate discussions between designers and clients.
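Step (h) of the flow, saving the finalized game parameters to a text file, can be sketched as follows. The paper does not specify the file format, so the tab-separated, one-property-per-line layout and the element IDs and field names below are hypothetical.

```python
# Serialize finalized design parameters to a plain text file for the
# downstream Dynamo script. Format and names are assumptions, not the
# paper's actual file layout.
def save_design_parameters(params: dict, path: str) -> None:
    with open(path, "w") as f:
        for element_id, props in params.items():
            for name, value in props.items():
                f.write(f"{element_id}\t{name}\t{value}\n")

params = {
    "lamp_201": {"intensity_lm": 2850, "color_temperature_K": 5000},
    "fixture_17": {"position_x": 1.2, "position_y": 3.4},
}
save_design_parameters(params, "bldf_parameters.txt")
```

Any line-oriented format with stable element identifiers would serve the same purpose, as long as the BIM-side script can map each line back to a model object.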
Fig. 4

BLDF’s usage process flow

Updating BIM with new design parameters

Dynamo (Dynamo, 2016) visual programming is used to update the new design parameters in the BIM application. Figure 5 shows the process flow of updating the design parameters in BIM. After the parameters are finalized, i.e., the intensity, color temperature, and fixture position of each individual light bulb, the positions of furniture and shading devices, and the material types of interior envelopes, the new parameters are saved in a text file (Fig. 5a). The developed visual script (Fig. 5b) automatically updates the lighting properties in Autodesk Revit (Fig. 5c) using the unique IDs of the model objects.
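The BIM-side update can be sketched in plain Python, with a dictionary standing in for the Revit document. The real script runs inside Dynamo and calls the Revit API to set element parameters; the file format below matches no published specification and is illustrative only.

```python
# Read saved parameters (element_id <TAB> name <TAB> value per line) and
# apply them to model elements by unique ID. A dict stands in for Revit.
def apply_parameters(path: str, model: dict) -> int:
    """Apply saved parameters to model elements; return the number applied."""
    applied = 0
    with open(path) as f:
        for line in f:
            element_id, name, value = line.rstrip("\n").split("\t")
            if element_id in model:
                model[element_id][name] = value
                applied += 1
    return applied

# Write a small sample file and apply it to a toy model.
with open("bldf_update.txt", "w") as f:
    f.write("lamp_201\tintensity_lm\t2850\n")
    f.write("lamp_201\tcolor_temperature_K\t5000\n")

model = {"lamp_201": {}}
count = apply_parameters("bldf_update.txt", model)
```

Matching on unique IDs, as the paper describes, is what keeps the round trip between VR and BIM unambiguous.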
Fig. 5

Updating the design parameters in BIM

System accuracy verification

An office on the fourth floor of the M3 building at Osaka University, Japan, was chosen as the area for the experiment. Figure 6 shows the actual room condition. The room has a typical rectangular shape (Fig. 6a). First, a verification test was performed in order to evaluate the accuracy of the BLDF system. The verification test was performed in a section of the room shown in Fig. 6b.
Fig. 6

Actual room condition

In order to verify the accuracy of the lighting simulation outputs, the lighting illumination levels calculated by the proposed system, those calculated by third-party lighting simulation software applications (i.e., Radiance and Lighting Analysis for 3ds Max), and the actual values measured using a light meter were compared. Radiance is a physically-based lighting simulation tool developed by Lawrence Berkeley National Laboratory in the U.S. In order to simulate lighting in Radiance, the building geometry information from Autodesk Revit is imported into Ecotect using the gbXML file format to define an analysis grid, the locations of artificial lights, the sky condition, and the time. The model from Ecotect is then exported to Radiance to perform the lighting analysis. Radiance uses backward ray tracing to compute the lighting values for a scene and store them (Ochoa et al., 2010).

In the verification test, the lighting intensity and color temperature of the lamps were set based on their specifications, with no obstacles between the lights and the light-measuring devices. The verification area (shown in Fig. 6b) has four ceiling lighting fixtures with eight tubular fluorescent (T8) lamps of 32 W each (2850 lm, CCT 5000 K) (Fig. 6c). The test was performed on the 23rd of August in Osaka, Japan, under a clear-sky weather condition (Fig. 7). The illuminance values were collected at 10:00 a.m. to evaluate the daylight (Fig. 7a), at 10:10 a.m. to evaluate the combination of daylight and artificial lights (Fig. 7b), and at 8:00 p.m. to evaluate only the artificial lights (Fig. 7c).
Fig. 7

Conditions for the accuracy test

The illuminance values were measured at six locations on the tables shown in Fig. 6c using a light meter (CEM DT-1308 light meter; accuracy: ±5%) (Fig. 7). The measurement results for the illuminance level on the tables when the only lighting source is daylight are reported in Table 1(a). The results for the combination of daylight and artificial lights are reported in Table 1(b), and the measured illuminance levels produced by artificial lights at night are reported in Table 1(c). The same scenarios were configured in the BLDF system, and the illuminance values were collected. Table 1 shows the data collected by the light sensors, the BLDF simulation results, the results of Lighting Analysis for 3ds Max, and the results of Radiance. The false-color visualizations of the BLDF simulation are shown in Fig. 8.
Table 1

BLDF lighting simulations and comparison results


Table 1 lists, for each of the six measurement points, the illuminance levels in lux from the sensor readings, the BLDF system, Radiance, and Lighting Analysis for 3ds Max, together with the percent error of each simulation tool relative to the sensor reading, under three conditions: (a) 10:00 a.m. (daylight only); (b) 10:10 a.m. (daylight combined with artificial lights), where values above the measurable maximum are reported as "> 1000"; and (c) 8:00 p.m. (artificial lights only).
Fig. 8

BLDF lighting simulations and the comparison of the results

As shown in Table 1, in the case of the daylighting simulation, the largest errors of the BLDF simulation, Radiance, and 3ds Max were 6.94, 12.38, and 40.92%, respectively. The average errors in the daylight simulation for BLDF, Radiance, and 3ds Max were 3.61, 6.65, and 11.80%, respectively. In the case of the combination of artificial lights and daylight, the largest absolute errors for BLDF, Radiance, and 3ds Max were 11.08, 6.85, and 37.87%, respectively. The BLDF system shows all values above 1000 lx with one color; hence, an accurate comparison for points A and B in this case is not available. However, the average absolute errors for the four other points (i.e., C, D, E, and F) were 5.25% for BLDF, 4.09% for Radiance, and 17.06% for 3ds Max. In addition, in the case of artificial light only, the maximum error of the BLDF system was 5.71%, whereas the maximum error of Radiance was 3.58% and that of 3ds Max was 32.42%. Furthermore, the average absolute error for artificial light was 3.03% for BLDF, 1.85% for Radiance, and 15.48% for 3ds Max. Fisher (1992) recommended that an acceptable error range between measurements and simulation is 10% for average illuminance calculations and 20% for each measurement point. Therefore, the BLDF system provides a level of accuracy consistent with other lighting simulation systems, and its error values are within the acceptable range.
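The error metrics reported above can be computed as a percent error of each simulated point against the sensor reading, then aggregated to the maximum and average absolute error. The sketch below uses hypothetical illuminance values, not the measured data from Table 1.

```python
# Percent error of simulated illuminance against measured illuminance,
# plus the max/average aggregates used in the comparison.
def percent_errors(measured, simulated):
    return [abs(s - m) / m * 100.0 for m, s in zip(measured, simulated)]

measured  = [500.0, 620.0, 710.0]   # hypothetical sensor readings (lux)
simulated = [520.0, 600.0, 730.0]   # hypothetical simulation outputs (lux)

errs = percent_errors(measured, simulated)
max_error = max(errs)               # compare against the 20% per-point limit
avg_error = sum(errs) / len(errs)   # compare against the 10% average limit
```

Applying the Fisher (1992) criteria then amounts to checking `avg_error <= 10` and `max_error <= 20`.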

Case study

BIM modeling and game engine integration

The applicability of the BLDF system is validated in a design assessment and renovation project. The case study was performed in an office at Osaka University, Japan. Figure 9 shows the 3D BIM model and a 2D plan of the existing fixture positions of the case study room. The BIM model of the room was created using Autodesk Revit Architecture 2015 (Fig. 9a). The BIM model contains details of the lighting system, and geometric and non-geometric information of components, e.g., building envelopes and furniture. Lighting properties, such as the intensity and the color temperature, are configured based on the specifications of the lamps. Figure 9b shows a 2D plan of the existing lighting fixtures. The room has 16 lighting fixtures with 32 tubular fluorescent (T8) lamps of 32 W each.
Fig. 9

3D BIM model and 2D plan of the existing fixture positions of the case study room

The goal is to facilitate lighting design by allowing users to interact with the design and to perceive and experience the effects of the modifications simulated by the system. In order to integrate BIM with a game engine, the geometry information of building elements with their reference IDs are transferred from the BIM application (i.e., Revit) to the game engine (i.e., Unreal Engine). The BIM geometry data are transformed to a static mesh using 3ds Max. Figure 10 shows the process of transferring the model, in which the FBX file format is used to export the model from Revit into Unreal Engine. Regardless of the data export format, some important information, e.g., material textures, color temperature, and light intensity, is lost while exporting data to the game engine. The lost features must be redefined in the game environment.
Fig. 10

BIM model transfers to the game environment

Immersive VR for BLDF

In the experiment, users were invited to use the BLDF system with an HMD (Oculus Rift Development Kit 2, OLED display, 960 × 1080 pixels per eye, 100° nominal field of view) to visualize the design from a first-person perspective. In the first-person perspective with an HMD, a mouse is used to interact with game objects and a keyboard is used to navigate in the virtual environment. The user can move the game objects and change the lighting design properties by clicking on the GUI buttons. The goal is to reduce energy consumption while providing adequate illumination on the desks in an aesthetically pleasant environment. In order to achieve this goal, the following six test cases were designed. The test cases are based on three fundamental aspects of the lighting design of buildings: (1) evaluating the illuminance level, (2) calculating the energy consumption of the lighting system, and (3) visualizing lighting appearances.

Current design assessment

In order to evaluate the quality of the artificial lights in the office zone, a simulation is performed in the daytime (8 a.m.) and at night (8 p.m.). This office zone has 12 lighting fixtures with 24 tubular fluorescent (T8) lamps of 32 W each. The results revealed that the illuminance levels on the desks were approximately 500 to 700 lx throughout the day. In the daytime, the illuminance levels on the desks close to the windows reached 800 to 900 lx. This confirmed that the current lighting condition of the office areas complied with the minimum standard of lighting design requirements (Fig. 11).
Fig. 11

Lighting illumination level at daytime (left) and nighttime (right)

Daylight availability assessment

The amount of illumination on all desks from 8 a.m. to 4 p.m. for different scenarios was analyzed in the BLDF system. Figure 12 shows an example of the daylight illumination on the work desks at various times of day without using artificial lights. This shows that the working desks in the window zone still have sufficient illuminance. The working zone within approximately 4 m of the windows can use daylight and does not require artificial light, whereas the zone far from the windows requires artificial light throughout the day.
Fig. 12

Examples of the illumination visualization outputs

Figure 13 shows the lighting energy consumption of two setups in the daytime (8 a.m. to 4 p.m.): (1) all artificial lights are on; (2) the lights close to the windows are turned off. The results revealed that when all artificial lights are on, the lighting power usage is 1280 W and the energy consumption is 10.24 kWh. Moreover, the illuminance levels on the desks are approximately 700 to 800 lx. When the lights in the window zone are switched off, the illuminance levels on the working desks in the window zone are in the range of 450 to 650 lx, which is more than the minimum lighting requirement for an office (Shamsul et al., 2013). In addition, the power usage of the electric lights is 720 W, and the energy consumption is 5.76 kWh. In this experiment, by turning off the artificial lights in the window zone (10 bulbs) from 8 a.m. to 4 p.m., lighting energy consumption can be reduced by 43.75%.
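The energy-feedback arithmetic above is straightforward: energy (kWh) is power (kW) multiplied by hours of use, and the savings figure is the relative reduction between the two setups. The power values below are the ones reported in the text; the 8-hour duration is the stated 8 a.m. to 4 p.m. window.

```python
# Energy (kWh) = power (kW) x hours of use.
def energy_kwh(power_w: float, hours: float) -> float:
    return power_w / 1000.0 * hours

all_on   = energy_kwh(1280.0, 8.0)  # all artificial lights on
zone_off = energy_kwh(720.0, 8.0)   # window-zone lights switched off
savings_pct = (all_on - zone_off) / all_on * 100.0  # relative reduction
```

This reproduces the 10.24 kWh, 5.76 kWh, and 43.75% figures reported for the two scenarios.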
Fig. 13

Lighting energy consumption for two scenarios

Lighting type design and assessment

The BLDF system shows the changes in the lighting power usage and lighting illuminance when different types of bulbs are used. In our case study, the illuminance levels and the power usage of three lamp alternatives, i.e., fluorescent T8 (32 W, 2850 lm), T5 (28 W, 2750 lm), and LED (T8) (15 W, 2000 lm), are compared when all artificial lights (32 bulbs) are turned on from 8 a.m. to 4 p.m. (Fig. 14). The results show that the lighting power usages are 1024 W, 896 W, and 576 W, respectively. The total energy consumption when using T8, T5, and LED (T8) is 8.192 kWh, 7.168 kWh, and 4.536 kWh, respectively. Different illumination levels, i.e., approximately 760 lx (Fig. 14a), 740 lx (Fig. 14b), and 700 lx (Fig. 14c), were generated at each work desk. In this experiment, LED lights provide more energy savings than T8 and T5 lights, at 35% and 25%, respectively. The LED lights generated adequate illuminance levels at every desk, satisfying the lighting standard for computer tasks.
Fig. 14

Illuminance levels using different bulbs

Aesthetics of lighting design assessment

As explained in Section 3, the BLDF system provides a holistic overview of the design atmosphere that helps designers to aesthetically assess the design and to present the outcomes of the setup to their clients. Figure 15 shows visualization examples of the lighting atmospheres of different design options. For example, warm colors (Fig. 15a) produce orange and yellow tones, which create a relaxing atmosphere for occupants (Tomassoni et al., 2015). Cool colors or daylight colors are commonly used in office spaces (Fig. 15b). Tomassoni et al. (2015) stated that colder chromatic temperatures help to stimulate greater work efficiency and productivity. For our case study, the white color temperature (7000 K) generated by fluorescent bulbs is used. The white color temperature is a mixture of cool and warm colors, which is a suitable choice for an office environment.
Fig. 15

Lighting atmospheres

Furniture layout and illuminance level

Figure 16 shows an example of illuminance levels of different furniture layouts. Figure 16a shows the results of a simulation before placing a bookshelf. The illuminance level on the desk surface is approximately 720 lx. For the case in which a bookshelf is placed in the room, the illuminance level on the desk surface is reduced to approximately 550 lx, which is still within the acceptable lighting level required for working areas (Fig. 16b).
Fig. 16

Illuminance levels of different furniture layouts

Integrating lighting sensor control

The occupancy sensor in the BLDF is an optional lighting control that can create an opportunity to achieve a more energy efficient design for the building (Fig. 17). Ambient lights and task lights will be automatically turned on when the specified zone becomes occupied and will be automatically turned off when the specified zone is unoccupied. The user can experience the quality of lighting when occupancy sensors are added to the environment and can visually analyze the quality of the illuminance level when various lights are turned on/off in a specific work area.
Fig. 17

Adding occupancy sensors for ambient light and task light in the virtual environment

Results and Discussion

The results of the aforementioned validation scenarios provide several new insights regarding the integration of BIM and VR for indoor lighting design feedback.

Although existing lighting simulation software packages (e.g., Ecotect, Radiance, Dialux, 3ds Max) can help improve iterative lighting design processes, such commercial software applications have limitations. For example, these applications require long processing times to calculate illuminance levels. There are also limitations in handling models with complex geometries. Additionally, operating existing software packages, such as Radiance and Dialux, usually requires specialized knowledge and technical skills. Furthermore, such applications do not provide analysis of the appearance of a lighting design or quantification of the amount and distribution of lighting in real-time. The lighting simulation outputs are mostly generated and presented as 2D photorealistic, false-color, and HDR images, which lack complementary information such as the geometry and volumetric information of the target space and the real-life experience of the space. Conventional lighting simulation applications primarily measure the amount of light falling on surfaces (i.e., illuminance) and do not provide information about the luminance in the target area. Finally, with traditional lighting simulation applications, human visual perception is not directly involved in the analysis process. The developed BLDF system introduces a new approach to address some of the above-mentioned drawbacks.

One of the goals of the development of the BLDF system was to provide a tool to explore and perceive luminance distribution and illuminance in a virtual environment. The BLDF system provides a unique experience for realistic visualization of lighting design in an immersive environment. It allows users to preview the effect of lighting in a target space and to quantify the amount of illumination with real-time visual feedback.

Currently, the BLDF system enables a holistic overview of lighting design through its visualizations. This benefits not only designers but also all parties involved in the design process (e.g., owners). Compared with non-immersive feedback (e.g., through a computer screen), the system provides more efficient feedback via immersive visualization through a first-person perspective, which helps to present and communicate ideas and provides explicit decision support regarding future designs by clearly indicating areas that require improvement in lighting quality. Using immersive VR display technology, BLDF delivers a semi-realistic perception of artificial light and daylight. It also generates realistic and false-color scenes at runtime. Using the GUI, the materials and textures of indoor envelopes and several lighting parameters can be modified to suit the requirements of the designer. Performing such modifications can be very time-consuming using conventional lighting simulation applications.

The BLDF system was tested in an office building; however, other types of buildings can easily be analyzed by the system. Current BIM authoring software lacks features to transfer data on light bulbs, materials, and textures when exporting BIM data to VR authoring software. Although the game features worked well for our case study, a number of improvements should be considered in future research:
  • The ability to switch between views of alternative design settings in order to visualize and compare the outcomes of different lighting setups would make visual comparison more effective.

  • In false-color scenes in VR, providing the numerical data of illuminance values for each measurement point can reduce the error of estimating the illuminance value. In our system, 1000 lx is the maximum illuminance value that can be measured and visualized in false-color scenes. Therefore, improving the system to visualize and measure illuminance values above 1000 lx is required.

  • The electric power consumption shown in the interface considers only the total wattage. However, in order to approximate lighting energy consumption, both the wattage and the duration of use per day are required. Therefore, providing an occupancy schedule that corresponds to the operating hours of the artificial lights can help in calculating the total energy consumption more thoroughly.

  • Although an HMD provides maximum immersion in the virtual scenes, HMDs with high-resolution display screens are required in order to accurately perceive lighting phenomena such as glare. It is also possible to use BLDF with a higher-resolution LED display.

  • The effect of different sky conditions that influence outdoor/indoor illumination over time should be included in the BLDF.

  • Using conventional input devices, such as a mouse and keyboard, is difficult while wearing an HMD. Therefore, interruptions may occur when the system is being used. The use of game controllers and motion tracking technology can solve this problem.

  • The final design parameters in VR need to be updated in the BIM database both for geometry information and other properties, e.g., bulb types and light intensities. Although this can be done with the help of visual programming tools, such as Dynamo, automatic synchronization between design parameters in VR and BIM should be developed. In addition, using visual programming requires additional time and programming experience. Hence, open source data exchange formats, such as the industry foundation classes (IFC), can be used for exchanging data between applications.
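The occupancy-schedule improvement suggested above amounts to summing wattage × scheduled hours per fixture rather than reporting a single total wattage. The schedule structure and values in this sketch are hypothetical.

```python
# Daily lighting energy from a per-fixture schedule:
# each entry is (wattage in W, hours on per day).
def daily_energy_kwh(fixtures):
    return sum(w * h for w, h in fixtures) / 1000.0

schedule = [
    (32.0, 8.0),   # ambient lamp, on through the working day
    (32.0, 8.0),
    (15.0, 4.0),   # task light, on only while the desk is occupied
]
total = daily_energy_kwh(schedule)
```

With occupancy-sensor data, the hours-on values could be recorded per zone during the VR session instead of being assumed.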

Conclusions and future research

This paper investigated a method by which to integrate BIM and a game engine for lighting design visualization and analysis. The prototype system is developed using the Unreal Engine game authoring software system with its visual scripting environment. In order to use the BLDF system, the following activities should be performed: 1) creating a BIM model; 2) exporting the BIM model into the game engine; 3) adjusting the model in the game environment and redefining the initial values of the design parameters; 4) visualizing lighting illumination, lighting atmosphere, and energy consumption using VR; 5) evaluating and analyzing the condition of lighting design; and 6) configuring design parameters using a GUI to achieve the desired lighting results. An approach to allow users to interact with design objects created in a BIM application and experience lighting design in an immersive environment was discussed. The user can define and explore design parameters, and the results can be simulated in real-time.

The developed system presents a solution for integrating lighting simulation with interactive visualization to support lighting design. The users can perceive the influence of the design modifications on characteristics of light in the space. Using an immersive virtual reality experience, BLDF facilitates the process of design for users. The proposed system can aid understanding, predicting, and assessing indoor lighting environment in real-time during the design phase, which is not possible with traditional lighting simulation tools.

The proposed method was validated in a case study in a campus building at Osaka University, Japan. The initial results revealed that the data can be successfully exchanged between the BIM database and the game engine. The BLDF system enables design parameters to be determined by comparing multiple lighting setups. The BLDF is easy to use with its GUI, which allows design parameters to be redefined interactively. The user can easily change, move, and rotate fixtures. Quantitative and qualitative outputs of the proposed system help in analyzing the lighting design and evaluating visual comfort.

There are several limitations to our developed prototype system. Some of these limitations are related to the hardware and some are related to the developed tool. Although using an HMD provides a sense of immersion and realism of the lighting condition, the constrained field of view affects distance perception, and the pixel density of current VR headsets is limited. Such factors can decrease the accuracy of perceiving lighting phenomena, such as glare, brightness, and darkness. With the rapid advancement of HMD technologies, a higher level of perceptual accuracy of lighting in future releases of HMDs is expected. Our system currently relies on the physics engine of Unreal Engine for simulating glare on the available HMD headsets; hence, an accurate and realistic perception of glare is not possible. However, the simulation outputs of glare in the current implementation are still helpful for lighting designers.

Exchanging information between BIM and the chosen game engine is limited to only 3D geometry, and information such as the properties of light bulbs, materials, and textures cannot be transferred. Thus, the user has to manually redefine bulbs and recreate textures in the game environment. Furthermore, complex geometries contained in the BIM model, e.g., furniture models, were not transferred into the game engine. Therefore, the use of low-polygon models (level of development (LOD) 100–300) (Level of Development Specification, 2016) should be considered in the BIM modeling process.

Areas of future research include integrating the occupancy schedule in the system for the energy simulation, developing a method to export lighting information to the IFC format file using the proposed IFC resources, and developing a system to analyze the amount of excessive light and heat in VR. In addition, the influence of geographic locations and weather conditions were not considered in our prototype system. Therefore, in future work, we intend to study the influence of such factors on natural light in VR. Furthermore, performing glare analysis using VR technology is considered as future work of this research.



This work was supported by Japan Society for the Promotion of Science (JSPS) KAKENHI Grant Number JP26-04368.



Availability of data and materials

Not applicable

Authors’ contributions

WN designed and developed the BIM-based lighting design feedback (BLDF) system for evaluating lighting quality using an immersive environment under the supervision of NY and AM. AM provided suggestions on the system validation and analysis. TF provided the idea of creating a system for lighting design feedback using BIM and VR. All authors contributed to the writing of the manuscript: WN drafted the manuscript, and AM reviewed and revised it. AM and NY supervised the entire process of this study. All authors read and approved the final manuscript.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors’ Affiliations

Division of Sustainable Energy and Environmental Engineering, Environmental Design and Information Technology Laboratory, Graduate School of Engineering, Osaka University




© The Author(s). 2017