

Seismic Full-Waveform Inversion Using Deep Learning Tools and Techniques by Alan Richardson



The work demonstrates that the conventional seismic full-waveform inversion algorithm can be constructed as a recurrent neural network and thus implemented using deep learning software such as TensorFlow. Applying another deep learning concept, the Adam optimizer with minibatches of data, produces faster convergence toward the true wave speed model on a 2D dataset than the L-BFGS-B optimizer with the cost function and gradient computed using the entire training dataset. The work also shows that the cost function gradient computed by reverse-mode automatic differentiation is the same as the gradient obtained with the adjoint state method.
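As an illustration of the idea only (a minimal sketch, not the author's code, and in 1D rather than the paper's 2D setting), the snippet below writes finite-difference wave propagation as a differentiable time-stepping loop, so TensorFlow's reverse-mode automatic differentiation supplies the velocity gradient and Adam updates the model from minibatches of one shot at a time. The grid sizes, Gaussian source pulse, shot positions, and learning rate are all placeholder values.

  import numpy as np
  import tensorflow as tf

  # Toy 1D grid; all sizes and values are placeholders.
  nx, nt, dx, dt = 100, 300, 5.0, 0.001
  v_true = np.full(nx, 1500.0, np.float32)
  v_true[nx // 2:] = 2000.0                              # "true" two-layer model
  v = tf.Variable(np.full(nx, 1500.0, np.float32))       # starting model
  t = np.arange(nt) * dt
  wavelet = tf.constant(np.exp(-((t - 0.05) * 25.0 * np.pi) ** 2).astype(np.float32))

  def forward(vel, src_ix):
      # Second-order finite-difference propagation (periodic boundaries for brevity).
      # Every step is a TensorFlow op, so the loop is differentiable, like an
      # unrolled recurrent neural network through time.
      prev, cur, src = tf.zeros(nx), tf.zeros(nx), tf.one_hot(src_ix, nx)
      trace = []
      for it in range(nt):
          lap = (tf.roll(cur, -1, 0) - 2.0 * cur + tf.roll(cur, 1, 0)) / dx ** 2
          prev, cur = cur, 2.0 * cur - prev + (vel * dt) ** 2 * (lap + src * wavelet[it])
          trace.append(tf.reduce_sum(cur * src))          # receiver co-located with source
      return tf.stack(trace)

  shots = [20, 40, 60, 80]                                # placeholder source positions
  observed = {s: forward(tf.constant(v_true), s) for s in shots}

  opt = tf.keras.optimizers.Adam(learning_rate=10.0)      # placeholder step size
  for epoch in range(5):
      for s in np.random.permutation(shots):              # minibatch of a single shot
          with tf.GradientTape() as tape:
              residual = forward(v, int(s)) - observed[int(s)]
              loss = tf.reduce_sum(residual ** 2)
          # Reverse-mode autodiff through the time loop yields the same gradient
          # as the adjoint state method.
          opt.apply_gradients([(tape.gradient(loss, v), v)])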

The archive contains the code necessary to reproduce the article, including its figures.

In addition to the code in this archive, the following software is required:

  • Seismic Un*x. I used version 44R14.
  • Python packages listed in requirements.txt. These can be installed using "pip install -r requirements.txt"
  • LaTeX
  • Asymptote (to produce rnn.eps from rnn.asy)
  • Make and utilities such as wget, unzip, and a shell script interpreter



The Python package tensorflow-gpu is recommended if you have a suitable GPU.
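If tensorflow-gpu is installed, a quick check like the one below (a generic snippet, not part of the archive) confirms that TensorFlow can actually see the GPU before launching the longer make targets. The call shown is from the TensorFlow 1.x-era API; newer releases provide tf.config.list_physical_devices('GPU') instead.

  import tensorflow as tf
  # Prints True when the tensorflow-gpu build has found a usable GPU.
  print(tf.test.is_gpu_available())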

To test that the code runs correctly, run this command: make test

To recreate the article and this archive file, run this command: make

To recreate only the figures (which includes creating the data that they display), run this command: make results_figures

The code is intended for research and demonstration purposes, so clarity and simplicity take priority over computational performance. It is therefore unlikely to perform well on a realistic dataset, although it could be modified to make it more suitable for one.


Paper: https://arxiv.org/pdf/1801.07232.pdf
Code: https://arxiv.org/src/1801.07232v1