For the ancient Maya, sights and sounds worked in concert to create synesthetic experiences that influenced daily life and shaped society. Spatial configurations of temples, ball courts, stelae, dwellings, and other architecture played integral roles in shaping and re-shaping city life. Geographic Information Systems (GIS) offer tools and methods to derive quantified and geovisualized data for exploring the potential roles of vision and sound in structuring human interaction and experience. And yet GIS alone leaves research avenues untapped. Digital technologies such as photogrammetry, laser scanning, and airborne LiDAR allow us to create 3D models of extant architecture within contemporary landscapes. But what about missing architecture? How do we simulate ancient cities rather than simply visualize present-day archaeological landscapes? 3D modeling software affords us possibilities to reconstruct non-extant architecture but typically requires time-consuming manual modeling. An alternative, procedural modeling, applies “standardized” rules to rapidly generate 3D architectural models, allowing 3D reconstructions of ancient cities based on GIS data. Exporting these 3D GIS-derived data into an immersive Virtual Reality platform allows for cross-disciplinary humanistic and scientific analysis. In this talk, I discuss how the MayaCityBuilder Project is using GIS and 3D data in multiple platforms to explore sight and sound in eighth-century Copan, today a UNESCO World Heritage Site in Honduras.