The recent emergence of continuous subsurface monitoring for low-carbon energy storage and sequestration projects has driven the need for an end-to-end workflow that can handle multi-terabyte daily volumes of raw data acquired from distributed fiber-optic sensing (DFOS). We present an embedded data workflow methodology that combines edge computing, high-speed satellite connectivity, intelligently tiered cloud storage, and open-source data management schemas to make DFOS data findable, accessible, interoperable, and reusable for geotechnical end-users. Because DFOS continuously records strain and temperature while simultaneously detecting acoustic energy, recent applications for continuous, permanent, real-time monitoring of CO2 geosequestration sites can require over a petabyte of onsite storage. Edge computing produces decimated datasets for transfer to cloud storage or integration into asset monitoring systems. Offshore, this can be completely automated, eliminating personnel on board and enabling real-time remote logging and data management that fast-tracks decision making. Transmitted data are then captured with industry-standard, technology-agnostic well-known schemas on an open-source data platform. Business and data rules are applied to move field, processed, and interpreted data to intelligently tiered cloud storage. Industry-approved standardized data schemas enforce capture of minimum mandatory metadata, reducing latency for search and filter queries.
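The edge-decimation step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the array shape, decimation factor, and block-averaging anti-alias strategy are all assumptions chosen to show how a raw DFOS buffer might be reduced before satellite transfer.

```python
import numpy as np

def decimate_das(records: np.ndarray, factor: int) -> np.ndarray:
    """Reduce temporal sampling of a (time x channel) DFOS buffer.

    Block-averaging is used here as a simple anti-alias filter;
    a production system would likely apply a proper low-pass
    filter before downsampling.
    """
    n_t, n_ch = records.shape
    n_t -= n_t % factor  # drop any trailing partial block
    blocks = records[:n_t].reshape(n_t // factor, factor, n_ch)
    return blocks.mean(axis=1)

# Hypothetical raw buffer: 10,000 time samples x 4,000 fiber channels.
raw = np.random.randn(10_000, 4_000).astype(np.float32)
slim = decimate_das(raw, factor=10)

print(slim.shape)                    # (1000, 4000)
print(raw.nbytes // slim.nbytes)     # 10x reduction in transfer volume
```

At realistic acquisition rates this factor-of-ten reduction is what makes satellite backhaul of an otherwise petabyte-scale archive tractable, with the full-resolution data retained onsite or in a cold storage tier.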