- Research article
- Open Access
SMART: scalable and modular augmented reality template for rapid development of engineering visualization applications
© Dong and Kamat; licensee Springer. 2013
- Received: 11 December 2012
- Accepted: 29 April 2013
- Published: 12 June 2013
The visualization of civil infrastructure systems and processes is critical for the validation and communication of computer-generated models to decision-makers. Augmented Reality (AR) visualization blends real-world information with graphical 3D models to create informative composite views that are difficult to create or replicate on the computer alone.
This paper presents a scalable and extensible mobile computing framework that allows users to readily create complex outdoor AR visual applications. The technical challenges of building this reusable framework from the software and hardware perspectives are described.
SMART is a generic and loosely-coupled software computing framework for creating AR visual applications with accurate registration algorithms and visually credible occlusion effects. While SMART is independent of any specific hardware realization, ARMOR is built as a hardware implementation of SMART to test its algorithmic correctness and its adaptation to engineering applications. In particular, ARMOR is a modular mobile hardware platform designed for user position and orientation tracking, as well as augmented view display.
The framework has been validated in several case studies, including the visualization of underground utilities for excavator control and collision avoidance to demonstrate SMART’s rapid creation of complex AR visual applications.
- Augmented Reality
- Finite Impulse Response Filter
- Virtual Object
- Head-Mounted Display
Augmented Reality (AR) refers to a visualization technology that superimposes virtual objects on the real world so as to provide information beyond reality and enhance people’s interaction with the environment. It possesses distinct advantages over other visualization methods in at least three aspects: 1) from the perspective of visualization, the integration of the real world can significantly reduce the effort of creating and rendering contextual models for virtual scenes, and can provide a better perception of the surroundings than virtual reality alone (e.g., the visualization of construction simulations (Behzadan and Kamat 2007) and of architectural designs (Thomas et al. 1999)); 2) from the perspective of information retrieval, AR supplements a user’s normal vision with context-related or georeferenced virtual objects (e.g., looking through walls to see columns (Webster et al. 1996), looking beneath the ground to inspect subsurface utilities (Roberts et al. 2002), or retrieving drawings and BIM models for on-site communication (Yeh et al. 2012)); and 3) from the perspective of interaction, authentic virtual models can be deployed to evaluate the physical condition of real objects (e.g., the evaluation of earthquake-induced building damage (Kamat and El-Tawil 2007) and the automation of construction process monitoring (Golparvar-Fard et al. 2009)). Other applications of AR in construction have also been proposed, such as training platforms for heavy construction equipment operators (Wang and Dunston 2007). For a comprehensive, state-of-the-art review of AR applications, especially in construction and the built environment, the reader is referred to (Wang et al. 2013).
While the aforementioned AR engineering applications possess tangible economic and social value, some fundamental technical challenges have to be overcome before AR can be deployed in practical outdoor applications. The difficulties are associated with two requirements: 1) maintaining a constant spatial alignment between the virtual and real entities, which is referred to as registration; and 2) creating a sustained illusion that the virtual and real entities co-exist, which is referred to as occlusion.
Since most engineering AR applications expect to encounter registration and occlusion problems, it is reasonable to solve these challenges and build the solutions into a scalable and extensible AR computing framework that is openly accessible to the community. Researchers who are interested in exploring AR for their specific application in construction or another domain can immediately have access to the core logic components without starting from scratch on developing solutions for the registration and occlusion issues. The existence of such a reusable framework can significantly shorten the lifecycle of developing AR engineering applications.
A real-time occlusion algorithm with robust depth sensing and frame buffer techniques for handling occlusion in ubiquitous AR applications was presented in (Dong and Kamat 2012b), and the designed software module has been built into the SMART framework. This paper primarily focuses on the fundamental challenge of achieving precise registration in AR visualizations from both the hardware and software perspectives.
The fundamental challenge in Augmented Reality is the difficulty of aligning virtual objects with the real environment in the correct pose, a process called registration in the literature (Azuma 1997). Registration is difficult because its errors arise in both the spatial and temporal domains (Azuma 1997), and different tracking technologies have their own registration error sources. This paper focuses on solving the registration challenge in an unprepared environment (i.e., outdoors), where sensor-based AR is by far the most reliable tracking method that is free from constraints placed on the user.
Errors in the spatial domain are also referred to as static errors, i.e., errors present when neither the user nor the virtual objects are in motion. The static errors of sensor-based AR include: 1) inaccuracy in the sensor measurement; 2) mechanical misalignment between sensors; and 3) an incorrect registration algorithm. The selection of high-accuracy sensors is crucial, because errors contained in the measurement can rarely be eliminated. Rigid placement of the sensors on the AR backpack and helmet is also essential; otherwise mechanical misalignment can further compromise system accuracy. Some early, relatively fragile AR backpack designs can be found in the Touring Machine (Feiner et al. 1997) and Tinmith-Endeavour (Piekarski et al. 2004). A more robust and ergonomic version is demonstrated by the 2006 Tinmith backpack (Stafford et al. 2006), in which a GPS antenna and an InterSense orientation tracker are anchored on top of a helmet. However, the 50 cm accuracy of its GPS receiver is not sufficient for centimeter-level-accuracy AR tasks.
Compared with static errors, dynamic errors (errors in the temporal domain) are much less predictable. Differences in latency between data streams create a pronounced “swimming” effect of dynamic misregistration, termed relative latency by Jacobs et al. (1997). Given modern computational power, relative latency mainly results from: 1) off-host delay: the duration between the occurrence of a physical event and its arrival on the host; and 2) synchronization delay: the time during which data waits between stages without being processed. Two common mitigation methods for relative latency are: 1) adopting multi-threaded programming or scheduling system latency (Jacobs et al. 1997); and 2) predicting head motion using a Kalman filter (Liang et al. 1991; Azuma et al. 1999).
The mobile computing framework presented in this paper provides a comprehensive hardware and software solution for centimeter-level-accuracy AR applications in the outdoor environment. The robustness of the framework has been presented with a case study in visualizing underground infrastructure as part of an excavation planning and safety project.
The Scalable and Modular Augmented Reality Template (SMART) builds on top of the ARVISCOPE software platform (Behzadan and Kamat 2009). Besides its engine that interprets visual simulation scripts, another contribution of ARVISCOPE is a set of basic modules for communicating with peripheral hardware that can be imported into other AR applications. SMART takes advantage of these modules and constructs an AR application framework that separates the AR logic from the application-specific logic and makes the libraries hardware-independent, thus providing a flexible and structured AR development environment for the rapid creation of complex AR visual applications.
The in-built registration algorithm of SMART affords high-accuracy static alignment between real and virtual objects. Efforts have also been made to reduce dynamic misregistration: 1) to reduce synchronization latency, multiple threads are dynamically generated to read and process sensor measurements immediately upon their arrival in the host system; and 2) an adaptive lag compensation algorithm is designed to reduce the latency induced by the Finite Impulse Response (FIR) filter.
In order to test the robustness and effectiveness of SMART in specific engineering applications, ARMOR is constructed as a hardware realization of SMART. The Augmented Reality Mobile OpeRation platform (ARMOR) evolves from the UM-AR-GPS-ROVER hardware platform (Behzadan et al. 2008). ARMOR improves on the UM-AR-GPS-ROVER in two distinct ways: rigidity and ergonomics. It introduces high-accuracy, lightweight devices, rigidly places all tracking instruments with full calibration, and upgrades the carrying harness to make it more wearable.
SMART provides a default application framework for AR tasks, where most of its components are written as generic libraries that can be inherited in specific applications. The framework isolates the domain logic from the AR logic, so that domain developers only need to focus on realizing application-specific functionality, leaving the AR logic to the SMART framework. Furthermore, SMART is designed to be hardware-independent: developers can pick their own hardware profile, write modules that conform to the SMART interface, and run that hardware on top of SMART.
The SMART framework follows the classical model-view-controller (MVC) pattern. Scene-Graph-Controller is the implementation of the MVC pattern in SMART: (1) the model counterpart in SMART is the CARScene, which utilizes application-specific I/O engines to load virtual objects and maintains their spatial and attribute status. An update of a virtual object’s status is reflected when it is time to refresh the associated graphs; (2) the CARGraph corresponds to the view and reflects the AR registration results on each frame update event. Given that the user’s head can be in constant motion, the graph always invokes callbacks to rebuild the transformation matrix based on the latest position and attitude measurements, and refreshes the background image; (3) the CARController manages all of the UI elements and responds to a user’s commands by invoking member functions of its delegates, such as a scene or a graph.
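The Scene-Graph-Controller flow described above can be illustrated with a minimal sketch. The snippet below is a hypothetical Python analogue (the actual CARScene/CARGraph/CARController classes are not reproduced in this paper); it shows only the control flow: controller commands mutate the scene, and on each frame the graph rebuilds its view from the latest tracker pose.

```python
# Hypothetical analogue of SMART's Scene-Graph-Controller (MVC) pattern.
# All class and method names here are illustrative, not SMART's API.

class Scene:                      # model: owns virtual objects and their status
    def __init__(self):
        self.objects = {}
    def load(self, name, position):
        self.objects[name] = {"position": position}

class Graph:                      # view: rebuilds transforms on every frame
    def __init__(self, scene, tracker):
        self.scene, self.tracker = scene, tracker
        self.frame = None
    def render(self):
        pose = self.tracker()     # latest position/attitude measurement
        self.frame = [(name, obj["position"], pose)
                      for name, obj in self.scene.objects.items()]

class Controller:                 # controller: routes UI commands to delegates
    def __init__(self, scene, graph):
        self.scene, self.graph = scene, graph
    def command_load(self, name, position):
        self.scene.load(name, position)   # delegate to the model...
        self.graph.render()               # ...then refresh the view
```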
Application for operations-level construction animation
As a case study of adapting SMART to a specific engineering application, the ARVISCOPE animation functions have been re-implemented under the SMART framework as follows. In order to load the ARVISCOPE animation trace file (Behzadan and Kamat 2009), CARSiteForemanA contains CARSceneA, CARControllerA, and CARGraphA, all of which are subclasses inherited from SMART’s superclasses and adapted for animation. For example, CARSceneA employs the CARStatementProcessor and CARnimation classes as the I/O engine to interpret the trace file, and CARControllerA adds customized elements for controlling the animation, such as play, pause, continue, and jump functions.
Table 1 Comparison between the UM-AR-GPS-ROVER and ARMOR configurations

| Component | UM-AR-GPS-ROVER | ARMOR | Remarks |
|---|---|---|---|
| GPS receiver | Trimble AgGPS 332 using OmniStar XP correction (differential GPS) | Trimble AgGPS 332 using CMR correction broadcast by a Trimble AgGPS RTK Base 450/900 | OmniStar XP provides 10~20 cm accuracy; RTK provides 2.5 cm horizontal and 3.7 cm vertical accuracy |
| Electronic compass | PNI TCM 5 | PNI TCM XB | Same accuracy, but ARMOR places the TCM XB rigidly close to the camera |
| Camera | Fire-I Digital Firewire Camera | Microsoft LifeCam VX-5000 | The LifeCam VX-5000 is lightweight, small, and uses less wiring |
| Head-mounted display | i-Glasses SVGA Pro video see-through HMD | eMagin Z800 3DVisor | The Z800 3DVisor is lightweight and offers stereovision |
| Computer | Dell Precision M60 notebook | ASUS N10J netbook | The ASUS N10J is lighter and smaller |
| User command input | WristPC wearable keyboard and Cirque Smart Cat touchpad | Nintendo Wii Remote | The Wii Remote is lightweight and intuitive to use |
| External power supply | POWERBASE | Tekkeon myPower MP3750 | The MP3750 is lightweight and has multiple voltage outputs, charging both the GPS receiver and the HMD |
| Carrying harness | Kensington Contour laptop backpack | Load-bearing vest | Extensible and easy-to-access equipment |
Step 1 – viewing
Step 2 – modeling
The modeling step positions the virtual objects in the world coordinate system. The definition of the object coordinate system is determined by the drawing software: the origin is fixed to a pivot point on the object with a program-specified geographical location. The geographical location of the world coordinate origin is given by the GPS measurement; the 3D vector between the object and world coordinate origins can thus be calculated. The method for calculating the distance between geographical coordinates was originally introduced by Vincenty (1975), and Behzadan and Kamat (2007) proposed a reference-point concept to calculate the 3D vector between two geographical locations. SMART adopts the same algorithm to place a virtual object in the world coordinate system using the calculated 3D vector. After that, any further translation, rotation, and scaling operations are applied to the object.
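As an illustration of this step, the sketch below computes the 3D vector from the world origin to an object's georeferenced pivot point. Note that SMART uses the exact Vincenty algorithm; this sketch substitutes a simple equirectangular approximation (adequate only over the short distances spanned by a typical AR scene), and its function name and constant are illustrative assumptions, not SMART's code.

```python
import math

# Approximate the 3D vector (east, north, up) in meters from a world origin
# to an object's georeferenced pivot point. A simplified stand-in for the
# exact Vincenty computation used by SMART.
EARTH_RADIUS = 6378137.0  # WGS-84 equatorial radius, meters

def geo_to_local(origin, point):
    """origin, point: (lat_deg, lon_deg, alt_m). Returns (east, north, up)."""
    lat0, lon0, alt0 = origin
    lat1, lon1, alt1 = point
    north = math.radians(lat1 - lat0) * EARTH_RADIUS
    east = math.radians(lon1 - lon0) * EARTH_RADIUS * math.cos(math.radians(lat0))
    up = alt1 - alt0
    return east, north, up
```

The returned vector is then used to translate the object's pivot point relative to the world origin before any further transformations.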
Steps 3 and 4 – viewing frustum and projection
The NEAR and FAR planes do not affect how the virtual object appears on the projection plane. However, to maintain a high-precision z-buffer, the principle is to keep the NEAR plane as far away as possible and the FAR plane as close as possible. The horizontal and vertical angles of view directly influence the magnification of the projected image and are determined by the focal length and aspect ratio of the camera. To ensure consistent perspective projection between the real and virtual cameras, both need to share the same angle of view. This is achieved by calibrating the real camera and conforming the virtual camera’s angle of view to that of the real camera.
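For instance, under the pinhole camera model the angle of view follows from the calibrated focal length and sensor extent, and the vertical angle can be derived from the horizontal one via the aspect ratio. The sketch below is a generic illustration of these relationships, not code from SMART:

```python
import math

def angle_of_view_deg(sensor_extent_mm, focal_length_mm):
    # Pinhole model: angle of view = 2 * atan(extent / (2 * focal length))
    return math.degrees(2.0 * math.atan(sensor_extent_mm / (2.0 * focal_length_mm)))

def vertical_from_horizontal(h_fov_deg, aspect_w_over_h):
    # Derive the vertical angle of view from the horizontal one and the
    # image-plane aspect ratio (width / height).
    half_h = math.radians(h_fov_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.tan(half_h) / aspect_w_over_h))
```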
Registration validation experiment
Calibration of the mechanical attitude discrepancy
Table 2 Power voltage demands of different devices

| Device | Power source | Voltage demand |
|---|---|---|
| HMD | External power supply | 3.6 V~5 V |
| RTK rover receiver | External power supply | 10 V~32 V |
| RTK rover radio | Through RTK rover receiver | 10.5 V~20 V |
| RTK base receiver | Integrated internal battery | — |
| User command input | Integrated internal battery | — |
Validation of the static registration algorithm
A series of experiments was performed to validate the agreement between the real and virtual cameras: if the static registration algorithm works correctly, the virtual box should coincide with the real box when both are moved together in 6 degrees of freedom. Overall, the virtual box accurately matches the real one in all tested cases; a selected set of experiments is shown below in Figure 6, Rows 2~3.
Resolving the latency problem in the electronic compass
Due to the latency induced by the compass module itself, correct static registration does not guarantee that the user can see the same correct and stable augmented image when in motion. This section addresses the cause and solution for the dynamic misregistration problem.
Multi-threading to reduce synchronization latency
There are two modes for communicating with the compass module: PULL and PUSH. PULL is a passive output mode for the compass module and is used by the UM-AR-GPS-ROVER to pull data out of the module. Without separating I/O communication with the electronic compass into a background task, one loop of the pull request costs 70 ms on average and significantly slows down program performance.
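A minimal sketch of the background-task remedy, in Python for brevity (the actual SMART modules are not reproduced here): a reader thread owns the slow PULL transaction, and the render loop only ever copies the latest cached sample. `read_compass` is a stand-in for the real serial I/O.

```python
import threading

# Move the slow (~70 ms) compass PULL transaction into a background thread
# so the render loop never blocks on sensor I/O.

class CompassReader:
    def __init__(self, read_compass):
        self._read = read_compass          # blocking I/O call (stand-in)
        self._lock = threading.Lock()
        self._latest = None
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._loop, daemon=True)

    def start(self):
        self._thread.start()

    def _loop(self):
        while not self._stop.is_set():
            sample = self._read()          # slow pull happens here only
            with self._lock:
                self._latest = sample      # cache the newest measurement

    def latest(self):
        with self._lock:
            return self._latest            # render loop returns immediately

    def stop(self):
        self._stop.set()
        self._thread.join()
```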
Half-window Gaussian filter
Adaptive latency compensation algorithm
Compensation Value = 2 × (Half-Window Gaussian Filter Result − Twice-Applied Half-Window Gaussian Filter Result)

Compensation Value = Twice-Applied Half-Window Gaussian Filter Result − Observed Data
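One possible reading of the compensation formula, sketched in Python: a one-sided (half-window) Gaussian FIR filter smooths the newest samples but introduces a lag; applying the same filter twice doubles that lag, so extrapolating by twice the difference between the once- and twice-filtered results cancels it (exactly so for linearly changing headings). The window size and sigma below are illustrative assumptions, not SMART's tuned values.

```python
import math

def half_gaussian_weights(n, sigma):
    # One-sided Gaussian weights; w[0] weights the newest sample.
    w = [math.exp(-(i * i) / (2.0 * sigma * sigma)) for i in range(n)]
    s = sum(w)
    return [x / s for x in w]

def half_window_filter(samples, weights):
    # samples: oldest -> newest; returns a smoothed (but lagging)
    # estimate of the newest value.
    n = len(weights)
    recent = samples[-n:][::-1]            # newest first
    return sum(w * x for w, x in zip(weights, recent))

def compensated_heading(samples, weights):
    # Filter once at each step, filter the filtered stream again (doubling
    # the lag), then extrapolate past the doubled lag.
    n = len(weights)
    once = [half_window_filter(samples[:k + 1], weights)
            for k in range(n - 1, len(samples))]
    f1 = once[-1]                          # filtered once
    f2 = half_window_filter(once, weights) # filtered twice
    return f2 + 2.0 * (f1 - f2)            # lag-compensated estimate
```

For a heading changing at constant rate, the compensated estimate coincides with the newest true value, while the plain filter output lags behind.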
To test the SMART framework’s correctness and robustness, ARMOR, an evolution of the UM-AR-GPS-ROVER (Behzadan et al. 2008), is designed as a hardware platform implementation of SMART. As a prototype, the UM-AR-GPS-ROVER met the needs of a proof-of-concept platform for visual simulation. However, it has two primary design limitations: accuracy and ergonomics. First, the insecure placement of tracking devices disqualifies the UM-AR-GPS-ROVER from the centimeter-accuracy goal. Second, co-locating all devices, power panels, and wires in an ordinary backpack makes it impossible to accommodate additional equipment such as the Real-Time Kinematic (RTK) rover radio, and the weight of the backpack is too concentrated to distribute evenly around the body.
ARMOR represents a significant upgrade from the UM-AR-GPS-ROVER. The improvements can be divided into three categories: 1) highly accurate tracking devices with rigid placement and full calibration; 2) lightweight selection of input/output devices and external power source, and 3) load-bearing vest to accommodate devices and distribute weight evenly around the user’s body. An overview comparison between UM-AR-GPS-ROVER and ARMOR is listed in Table 1.
Position tracking device — RTK-GPS
The UM-AR-GPS-ROVER uses the AgGPS 332 Receiver with OmniStar XP mode to provide the user’s position (i.e., the position of the camera center) with 10~20 cm accuracy. This level of accuracy is sufficient for animating construction process simulations, where slight positional displacement does not necessarily compromise the validation purpose. However, for precision-critical applications, 10~20 cm accuracy is insufficient.
The AgGPS 332 Receiver used in the UM-AR-GPS-ROVER is thus upgraded with three objectives: 1) the upgraded GPS must support centimeter-level accuracy; 2) the hardware upgrade should have minimal impact on the software communication module; and 3) the existing device should be fully utilized, given the cost of high-accuracy GPS equipment. Ultimately, the AgGPS RTK Base 450/900 GPS Receiver is chosen for implementing the upgrade, for three reasons. First, it leverages RTK technology to provide 2.5 cm horizontal accuracy and 3.7 cm vertical accuracy on a continuous real-time basis. The RTK Base 450/900 Receiver is set up as a base station placed at a known point (i.e., control points set up by the government with 1st-order accuracy) and tracks the same satellites as an RTK rover. The carrier phase measurement is used to calculate the real-time differential correction, which is sent as a Compact Measurement Record (CMR) through a radio link to the RTK rover within 100 km (depending on the radio amplifier and terrain) (Trimble 2007). The RTK rover applies the correction to the position it receives and generates centimeter-level-accuracy output. The second reason for choosing this receiver is that, despite the upgrade, the RTK rover outputs position data in the NMEA format (National Marine Electronics Association) (NMEA 2010), which is also used in OmniStar XP mode; no change is therefore needed in the communication module. The third and final reason is that the original AgGPS 332 Receiver is retained as the RTK rover, with its differential GPS mode changed from OmniStar XP to RTK. A SiteNet 900 radio works with the AgGPS 332 Receiver to receive the CMR from the base station.
Improvement has also been made to the receiver antenna placement. The UM-AR-GPS-ROVER mounted the receiver antenna on a segment of pipe tied to the interior of the backpack. This method proved inefficient at preventing lateral movement. ARMOR therefore anchors the GPS receiver with a bolt at the summit of the helmet, so that the phase center of the receiver never shifts relative to the camera center. The relative distance between the receiver phase center and the camera center is calibrated beforehand and added to the RTK rover measurement.
ARMOR can work in either indoor or outdoor mode. Indoor mode does not necessarily imply that the GPS signal is totally lost, but rather that a qualified GPS signal is absent. The GPS signal quality can be extracted from the $GPGGA section of the NMEA string. The fix quality ranges from 0 to 8; for example, 2 means DGPS fix, 4 means Real-Time Kinematic, and 5 means float RTK. The user can define the standard (i.e., which fix quality is deemed qualified) in the hardware configuration file. When a qualified GPS signal is available, the geographical location is extracted from the $GPGGA section; otherwise, a preset pseudo-location is used, which can be controlled by user input.
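For illustration, extracting the fix quality from a $GPGGA sentence reduces to reading the seventh comma-separated field. The sketch below (with an illustrative qualification rule, not SMART's configuration-file format) shows the idea:

```python
def gga_fix_quality(sentence):
    # $GPGGA,time,lat,N,lon,E,fix,numsats,hdop,alt,M,geoid,M,...*checksum
    fields = sentence.split(',')
    if not fields[0].endswith('GGA'):
        raise ValueError('not a GGA sentence')
    return int(fields[6])          # fix quality: 0-8

def is_qualified(sentence, accepted=(4,)):
    # e.g., accept only RTK fixed (4) for centimeter-level registration
    return gga_fix_quality(sentence) in accepted
```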
Orientation tracking device—electronic compass
The TCM XB electronic compass is employed to measure the yaw, pitch, and roll that represent the relative attitude between the eye coordinate system and the world coordinate system. It measures heading over a full 360-degree range and maintains an accuracy of 0.3° rms when the tilt (pitch and roll) is no larger than 65°, which covers the common motion range of the human head.
In theory, the electronic compass should measure the orientation of the user’s head. However, since the user’s eyes are covered by a video see-through head-mounted display (HMD) and their function is replaced by a video camera, the electronic compass measures the attitude of the camera instead.
The UM-AR-GPS-ROVER places the electronic compass on the zenith of the helmet, which makes it very hard to align the camera and the compass. ARMOR instead anchors the electronic compass rigidly beside the camera on the visor of the helmet, parallel to the line of sight, making the calibration of the physical discrepancy much easier. The calibration approach is described in Section 4.2.1. For safety reasons, the electronic compass is encapsulated in a custom-sized, non-magnetic aluminum enclosure.
Magnetometer calibration also needs to be carried out for the purpose of compensating for distortions to the magnetic field caused by the host system and the local environment (PNI 2009). Given that ARMOR’s entire periphery could have magnetic impact on the sensor—for example GPS receivers, HMDs, and web cameras—the TCM XB needs to be mounted within the host system that is moved as a single unit during the calibration.
Input/output devices and external power supply
Video sequence input: camera
The camera is responsible for capturing the continuous real-time background image. The ideal device possesses high resolution, a high-frequency sampling rate, and a high-speed connection, together with small volume and low weight. The Microsoft LifeCam VX-5000 is chosen for the following reasons: its size is only 45 mm × 45.6 mm, and it does not compromise on resolution (640 × 480) or connection speed (480 Mbps). More importantly, it samples at 30 Hz, the same rate as the electronic compass.
Augmented view output: head-mounted display (HMD)
The augmented view generated by the video compositor is ultimately presented by the video see-through HMD. The eMagin Z800 3DVisor is chosen as the HMD component of ARMOR because of its remarkable performance in the primary factors, including a wide viewing angle, a large number of colors, a lightweight frame, and comfort. Furthermore, its stereovision capability is another important rendering effect that helps the user better appreciate the 3D augmented space.
External power supply
External power supplies with multiple voltage outputs are indispensable for powering devices without integrated internal batteries. ARMOR upgrades the ‘POWERBASE’ of the UM-AR-GPS-ROVER to the ‘Tekkeon myPower ALL MP3750’, which improves on POWERBASE in four ways: 1) both the volume (17 cm × 8 cm × 2 cm) and the weight (0.44 kg) of the MP3750 are only one fifth of POWERBASE’s; 2) the main output voltage varies from 10 V to 19 V for powering the AgGPS 332 Receiver (12 V), and an extra USB output port can charge the HMD (5 V) simultaneously (Table 2); 3) it features automatic voltage detection with an option for manual voltage selection; and 4) an extended battery pack can be added to double the battery capacity (Tekkeon 2009).
The configuration of the vest has several overall advantages: 1) the design of the pouches allows for an even distribution of weight around the body; 2) the separation of devices allows the user to conveniently access and check the condition of individual hardware; and 3) the different parts of the load-bearing vest are loosely joined, so the vest can fit any body type and be donned rapidly even when fully loaded. ARMOR has been tested by several users in outdoor operation lasting continuously for over 30 minutes, without interruption or significant discomfort.
The robustness of ARMOR and the SMART framework has been evaluated in an Augmented Reality application designed to visualize subsurface utilities during ongoing excavation operations, for improved context awareness and accident avoidance.
Current practice of excavation damage prevention and its limitation
Every U.S. state’s “One-Call Center” is a message-handling system for underground utility owners. It collects excavation requests from contractors and distributes them to all of its members (utility owners). The standard procedure can be summarized in the following steps: 1) the contractor issues a text-based request to the One-Call Center describing planned excavation locations; 2) the One-Call Center distributes the request to members who may own buried assets in the proposed excavation area; 3) each facility owner dispatches field locators to mark the approximate location of its underground facilities; and 4) the excavator operators carry out the excavation based on the visual guidance marked on the ground.
Despite its effectiveness in greatly reducing utility line strikes, the current practice of damage prevention for buried utilities has several weaknesses that can be improved using Augmented Reality. First, the description of the proposed excavation is a text-based request ticket that can be vague and ambiguous, and it is up to the field locators to interpret the digging zone from the written request. In case of misinterpretation, the field locators either fail to cover the entire proposed excavation area, or expend significant extra effort marking ground unnecessarily. Second, even though the surface markings (paint, stakes, flags, etc.) serve as visual guidance for the excavation operator, they are vulnerable to heavy traffic, severe weather, and the excavation activity itself, which scrapes the surface and removes the topsoil, destroying the markings. This makes it challenging for an excavator operator to maintain spatial orientation, as he or she must rely on memory and judgment to recollect marked utility locations as excavation proceeds (Talmaki and Kamat 2012).
Proposed practice for improvement of the “One-call” damage prevention practice with augmented reality
Visualization of buried utilities based on SMART framework
Visualization logic of buried utilities
Extract the spatial and attribute information of pipelines from the KML file using libkml, a library for parsing, generating, and operating on KML (Google 2008). For example, the geographical location of a pipeline is recorded under the Geometry element as a “LineString” (Google 2012). A cursor is designed to iterate through the KML file, locate “LineString” elements, and extract the geographical locations.
Convert consecutive vertices within one “LineString” from geographical coordinates to local coordinates in order to raise computational efficiency during the registration routine. The first vertex on the line string is chosen as the origin of the local coordinate system, and the local coordinates of the remaining vertices are determined by calculating the relative 3D vector between each remaining vertex and the first one, using the Vincenty algorithm (Behzadan and Kamat 2007).
In order to save storage memory, a unit cylinder is shared by all pipeline segments as primitive geometry upon which the transformation matrix is built.
Scale, rotate, and translate the primitive cylinder to the correct size, attitude, and position (Figure 18). For simplicity, the normalized vector between two successive vertices is called the pipeline vector. First, the primitive cylinder is scaled along the X- and Y-axes by the radius of the true pipeline, and along the Z-axis by the distance between the two successive vertices. Secondly, the scaled cylinder is rotated about the axis formed by the cross product of the vector <0, 0, 1> and the pipeline vector, by the angle whose cosine is the dot product of those two vectors. Finally, the center of the rotated cylinder is translated to the midpoint between the two successive vertices. This step is applied to each pair of successive vertices.
Translucent rectangles (Figure 12D) are created and laid underneath the pipelines to represent the buffer zone for excavation.
An inverted pyramid without a bottom is created to represent the X-ray vision of the underground pipelines. Each cylinder segment and its associated rectangle (buffer zone) are tested against the surfaces of the pyramid, and the parts falling outside are truncated (Figure 15).
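The scale-rotate-translate step for each pipe segment can be sketched as follows. The helper below only computes the transform parameters (axis-angle form) that map a unit cylinder aligned with +Z onto a segment; matrix assembly is left to the scene graph. The function name and return convention are illustrative, not SMART's API.

```python
import math

def segment_transform(p0, p1, radius):
    """Return (scale, axis, angle_rad, translate) that maps a unit cylinder
    (radius 1, height 1, centered at the origin, axis along +Z) onto the
    pipe segment from p0 to p1, following the scale-rotate-translate order
    described in the text."""
    d = [b - a for a, b in zip(p0, p1)]
    length = math.sqrt(sum(c * c for c in d))
    u = [c / length for c in d]                # normalized pipeline vector
    z = (0.0, 0.0, 1.0)
    # Rotation axis: cross product of +Z and the pipeline vector.
    axis = (z[1] * u[2] - z[2] * u[1],
            z[2] * u[0] - z[0] * u[2],
            z[0] * u[1] - z[1] * u[0])
    # Rotation angle: arccos of the dot product of +Z and the pipeline vector.
    angle = math.acos(max(-1.0, min(1.0, u[2])))
    mid = [(a + b) / 2 for a, b in zip(p0, p1)]   # segment midpoint
    return (radius, radius, length), axis, angle, mid
```

A degenerate case worth handling in practice: when the segment is parallel to +Z the cross product vanishes and the rotation can be skipped.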
This paper has presented the design and implementation of a robust mobile computing platform composed of the rigid hardware platform ARMOR and the application framework SMART with open access to the community. Researchers who are interested in improving the AR graphical algorithms or developing new AR engineering applications can take advantage of the existing AR framework implemented in this research, and can prototype their ideas in a much shorter lifecycle.
Targeting outdoor AR applications at centimeter-level accuracy, algorithms for both static and dynamic registration have been introduced, and several dynamic misregistration correction approaches have been described and evaluated. Dynamic misregistration remains an open research problem and continues to be investigated by the authors. Ongoing efforts include: 1) synchronizing the captured image and sensor measurements; and 2) optimizing the adaptive latency compensation algorithm with image-processing techniques (e.g., optical flow can provide additional heuristics about angular speed).
A video demo about SMART and ARMOR, and its application in excavation damage prevention can be found at <http://pathfinder.engin.umich.edu/videos.htm>. A video demo about its occlusion capability can also be found at the same site. SMART is an open source project that can be downloaded at <http://pathfinder.engin.umich.edu/software.htm>.
- Azuma R: A survey of augmented reality. Presence: Teleoperators and Virtual Environments 1997, 6(4): 355–385.
- Azuma R, Hoff B, Neely H, Sarfaty R: A motion-stabilized outdoor augmented reality system. In Proceedings of the 1999 IEEE Virtual Reality Conference. Houston, TX: IEEE; 1999.
- Behzadan AH, Kamat VR: Georeferenced registration of construction graphics in mobile outdoor augmented reality. Journal of Computing in Civil Engineering 2007, 21(4): 247–258. doi:10.1061/(ASCE)0887-3801(2007)21:4(247)
- Behzadan AH, Kamat VR: Automated generation of operations level construction animations in outdoor augmented reality. Journal of Computing in Civil Engineering 2009, 23(6): 405–417. doi:10.1061/(ASCE)0887-3801(2009)23:6(405)
- Behzadan AH, Timm BW, Kamat VR: General-purpose modular hardware and software framework for mobile outdoor augmented reality applications in engineering. Advanced Engineering Informatics 2008, 22: 90–105. doi:10.1016/j.aei.2007.08.005
- Dong S, Kamat VR: Scalable and Extensible Augmented Reality with Applications in Civil Infrastructure Systems. Ann Arbor, MI: University of Michigan; 2012.
- Dong S, Kamat VR: Real-time occlusion handling for dynamic augmented reality using geometric sensing and graphical shading. Journal of Computing in Civil Engineering 2012, in press.
- Feiner S, MacIntyre B, Höllerer T: A touring machine: prototyping 3D mobile augmented reality systems for exploring the urban environment. In Proceedings of the 1997 International Symposium on Wearable Computers. Piscataway, NJ: IEEE; 1997: 74–81.
- Golparvar-Fard M, Pena-Mora F, Arboleda CA, Lee S: Visualization of construction progress monitoring with 4D simulation model overlaid on time-lapsed photographs. Journal of Computing in Civil Engineering 2009, 23(6): 391–404. doi:10.1061/(ASCE)0887-3801(2009)23:6(391)
- Google: Introducing libkml: a library for reading, writing, and manipulating KML. 2008. http://google-opensource.blogspot.com/2008/03/introducing-libkml-library-for-reading.html
- Google: KML Documentation Introduction. 2012. https://developers.google.com/kml/documentation/
- Jacobs MC, Livingston MA, State A: Managing latency in complex augmented reality systems. In Proceedings of the 1997 Symposium on Interactive 3D Graphics. New York, NY: ACM; 1997: 49–54.
- Kamat VR, El-Tawil S: Evaluation of augmented reality for rapid assessment of earthquake-induced building damage. Journal of Computing in Civil Engineering 2007, 21(5): 303–310. doi:10.1061/(ASCE)0887-3801(2007)21:5(303)
- Liang J, Shaw C, Green M: On temporal-spatial realism in the virtual reality environment. In Proceedings of the 1991 Symposium on User Interface Software and Technology. New York, NY: ACM; 1991.
- Martz P: OpenSceneGraph Quick Start Guide. 2007.
- NMEA: NMEA data. 2010. http://www.gpsinformation.org/dale/nmea.htm
- Piekarski W, Smith R, Thomas BH: Designing backpacks for high fidelity mobile outdoor augmented reality. In Proceedings of the 2004 IEEE and ACM International Symposium on Mixed and Augmented Reality. Piscataway, NJ: IEEE; 2004: 280–281.
- PNI: User Manual, Field Force TCM XB. 2009.
- Roberts G, Evans A, Dodson AH, Denby B, Cooper S, Hollands R: The use of augmented reality, GPS and INS for subsurface data visualization. In Proceedings of the 2002 FIG XXII International Congress. Copenhagen, Denmark: International Federation of Surveyors, FIG; 2002: 1–12.
- Shreiner D, Woo M, Neider J, Davis T: OpenGL Programming Guide. 6th edition. Boston, MA: Addison-Wesley; 2006.
- Stafford A, Piekarski W, Thomas BH: Implementation of god-like interaction techniques for supporting collaboration between outdoor AR and indoor tabletop users. In Proceedings of the 2006 IEEE and ACM International Symposium on Mixed and Augmented Reality. Piscataway, NJ: IEEE; 2006: 165–172.
- Talmaki S, Kamat VR: Real-time hybrid virtuality for prevention of excavation related utility strikes. Journal of Computing in Civil Engineering 2012.
- Tekkeon: MP3450i/MP3450/MP3750 datasheets. 2009.
- Thomas B, Piekarski W, Gunther B: Using augmented reality to visualise architecture designs in an outdoor environment. In Proceedings of Design Computing on the Net. 1999.
- Trimble: AgGPS RTK Base 900 and 450 receivers. 2007.
- Vincenty T: Direct and inverse solutions of geodesics on the ellipsoid with application of nested equations. Survey Review 1975, XXIII: 88–93.
- Wang X, Dunston P: Design, strategies, and issues towards an augmented reality-based construction training platform. Journal of Information Technology in Construction (ITcon) 2007, 12: 16.
- Wang X, Kim MJ, Love PED, Kang S-C: Augmented Reality in built environment: classification and implications for future research. Automation in Construction 2013, 32: 13.
- Webster A, Feiner S, MacIntyre B, Massie W: Augmented reality in architectural construction, inspection, and renovation. In Proceedings of the 1996 ASCE Congress on Computing in Civil Engineering. New York, NY: ASCE; 1996.
- Yeh K-C, Tsai M-H, Kang S-C: On-site building information retrieval by using projection-based augmented reality. Journal of Computing in Civil Engineering 2012, 26(3): 14.
This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.