In the 1980s, most seismic data were interpreted by hand on paper sections with colored pencils. With the development of interactive workstations, geoscientists became accustomed to performing seismic interpretation on computers. Today, they spend most of their time interpreting, adjusting the interpretation, and checking its coherency along different directions and in different viewers (3D, sections, and maps). But so far, they cannot feel it. What if we could touch it, update it, and visualize it in an augmented reality environment? Our objective is to connect a sandbox with an augmented reality headset and to update the interpretations by touching the sand.
The hands-on sandbox becomes the interface for interpreting seismic horizons, providing a natural interaction mode for seismic sculptors. We worked in Unity 3D to create the initial interactions in the AR goggles. Ultimately, the user will use a pull motion with their hands to resize and slice seismic data while also molding the sandbox to interpret horizon meshes. Multi-user interaction will take this one step further, creating a collaborative, kinetic interpretation environment.
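To make the sandbox-to-horizon link concrete, the core step is converting the sand surface (a heightfield captured by a depth camera) into a triangulated horizon mesh in the coordinates of the seismic volume. The actual system runs in Unity 3D; the sketch below is a minimal, language-agnostic illustration in Python, and all names, calibration values, and trace spacings are hypothetical assumptions, not part of the described implementation.

```python
import numpy as np

def heightfield_to_horizon(depth_map, sand_min, sand_max,
                           z_min, z_max, dx=25.0, dy=25.0):
    """Map a sandbox heightfield to a horizon mesh (illustrative sketch).

    depth_map : 2D array of sand heights from a depth camera.
    sand_min, sand_max : calibrated height range of the sandbox.
    z_min, z_max : depth (or two-way-time) range of the seismic volume.
    dx, dy : assumed trace spacing in metres along x and y.
    Returns (vertices, triangles) for a grid mesh.
    """
    ny, nx = depth_map.shape
    # Linearly rescale sand height to seismic depth:
    # shallow sand maps to a shallow horizon.
    t = (np.asarray(depth_map, dtype=float) - sand_min) / (sand_max - sand_min)
    z = z_min + t * (z_max - z_min)

    # One vertex per heightfield sample, laid out on a regular grid.
    xs, ys = np.meshgrid(np.arange(nx) * dx, np.arange(ny) * dy)
    vertices = np.column_stack([xs.ravel(), ys.ravel(), z.ravel()])

    # Two triangles per grid cell, indices in row-major order.
    tris = []
    for j in range(ny - 1):
        for i in range(nx - 1):
            v0 = j * nx + i
            tris.append((v0, v0 + 1, v0 + nx))
            tris.append((v0 + 1, v0 + nx + 1, v0 + nx))
    return vertices, np.array(tris)
```

Re-running this conversion whenever the depth camera reports a change is what would let a push or pull on the sand propagate to the interpreted horizon in the headset.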