© 2018 American Society of Civil Engineers. Field reporting methods in the architecture/engineering/construction and facility management (AEC/FM) industry are undergoing a major change in how tasks are conducted in the field. A key component of this trend is the ability to reliably localize distant target objects that require monitoring and documentation, without preinstalled systems, in any built environment, including Global Positioning System (GPS)- and wireless-denied indoor and outdoor environments. This paper proposes a new infrastructure-free approach for three-dimensional (3D) localization of distant target objects by integrating two sources of information on a mobile device: (1) embedded motion sensors, such as an accelerometer and gyroscope, which localize the user carrying the device through the probabilistic integration of multiple dead-reckoning paths and thereby provide the location of each camera center in the global coordinate system; and (2) an embedded visual sensor (i.e., camera), which localizes distant target objects of interest through image-based 3D localization using photos of the objects taken from multiple views. To test the proposed method, several case studies were conducted in an existing instructional facility, and average distance errors of approximately 3% were observed. The experimental results suggest that the proposed multimodal mobile sensing and analytics have significant potential to robustly localize distant in-building assets and to reduce the drift errors that accumulate in proportion to the distance traveled. In addition, the stereoscopic view provides additional semantics about the distant target objects, enabling practitioners to improve their situational awareness of asset performance (e.g., the as-is physical condition).
The perceived benefits of the proposed mobile computing system and related open research challenges are discussed in detail.
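The first component, localizing the user from embedded motion sensors, follows the usual pedestrian dead-reckoning pattern: integrate the gyroscope to track heading, detect steps from accelerometer peaks, and advance the position by one step length per detected step. The following minimal sketch illustrates that pattern only; the function name, the per-step input format, and the fixed step lengths are illustrative assumptions, not the paper's probabilistic multi-path method.

```python
import math

def pdr_positions(steps, start=(0.0, 0.0), heading=0.0):
    """Pedestrian dead reckoning: accumulate 2D positions from
    per-step (heading_change_rad, step_length_m) pairs.

    Each step first applies the heading change integrated from the
    gyroscope, then advances by the step length (detected, e.g., from
    accelerometer peaks) along the current heading. Illustrative only.
    """
    x, y = start
    path = [(x, y)]
    for d_heading, length in steps:
        heading += d_heading              # integrate gyroscope yaw rate
        x += length * math.cos(heading)   # advance along current heading
        y += length * math.sin(heading)
        path.append((x, y))
    return path

# Walking a 1 m square (three 90-degree left turns) should return to the
# start; any gap at the end is exactly the drift that accumulates with
# distance traveled, which the paper's probabilistic integration of
# multiple dead-reckoning paths aims to reduce.
square = [(0.0, 1.0), (math.pi / 2, 1.0), (math.pi / 2, 1.0), (math.pi / 2, 1.0)]
end = pdr_positions(square)[-1]
```

In this idealized closed loop the final position coincides with the start; with real, noisy gyroscope and step-length estimates it would not, motivating the drift-reduction claim in the abstract.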
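The second component, image-based 3D localization from multiple views, reduces at its core to triangulation: each photo defines a viewing ray from a known camera center (supplied by dead reckoning) toward the target, and the target is recovered where the rays from different viewpoints meet. A minimal closed-form sketch under that assumption is shown below, using the midpoint of the two rays at closest approach; this is an illustration of the geometric principle, not the paper's implementation.

```python
def triangulate_midpoint(c1, d1, c2, d2):
    """Midpoint triangulation of two viewing rays.

    c1, c2: camera centers in the global frame (e.g., from dead
    reckoning); d1, d2: ray directions toward the target derived from
    each photo. Returns the 3D point midway between the two rays at
    their closest approach. Illustrative sketch only.
    """
    dot = lambda u, v: sum(a * b for a, b in zip(u, v))
    w0 = [a - b for a, b in zip(c1, c2)]
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b   # near zero means the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    p1 = [ci + t * di for ci, di in zip(c1, d1)]   # closest point on ray 1
    p2 = [ci + s * di for ci, di in zip(c2, d2)]   # closest point on ray 2
    return [(u + v) / 2.0 for u, v in zip(p1, p2)]

# Two camera centers one meter apart, both sighting the same target at
# (1, 2, 5); with exact rays the midpoint recovers the target exactly.
target = triangulate_midpoint((0, 0, 0), (1, 2, 5), (1, 0, 0), (0, 2, 5))
```

With noisy camera centers or bearings the two rays no longer intersect, and the midpoint (or a least-squares multi-view variant) gives the best estimate, which is why errors in the dead-reckoned camera positions propagate into the roughly 3% distance errors reported.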