SpecFem3D Cartesian is open-source software for simulating seismic wave propagation in 3D. It is based on the Spectral Element Method, includes its own mesher and can also perform Full Waveform Inversion, among other things. It is flexible and powerful enough to run seismic simulations at local and regional scales. The software has been developed by researchers from CNRS, Princeton and ETH Zürich since the late 1990s.
Version available on Qarnot’s cloud platform: 3.0
If you are interested in another version, please send us an email at qlab@qarnot.com.
Before launching the case, please ensure that the following prerequisites have been met.
We will be running one of the toy examples found in the official SpecFem3D git repository. This simulation demonstrates how SpecFem3D handles coupled acoustic/elastic domains by simulating a seismic source inside a coffee cup. The mesh is already provided in this case.
The example folder contains:
We will run two slightly different versions of the same example. The scripts needed to run them can be downloaded from the archive just below; you need to unzip this archive before you can use it.
Before starting a calculation with the Python SDK, a few steps are required:
Note: in addition to the Python SDK, Qarnot provides C# and Node.js SDKs as well as a command-line interface.
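Typically this means creating a Qarnot account, retrieving your authentication token from the console, and installing the Python SDK from PyPI, for example:

pip install qarnot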
Copy the following code into a Python script and save it next to the input folder you unzipped earlier. Make sure you replace <<<MY_SECRET_TOKEN>>> with your own authentication token so that the task can be launched on Qarnot.
#!/usr/bin/env python3
import qarnot
# Create a connection, from which all other objects will be derived
# Enter client token here
conn = qarnot.Connection(client_token='<<<MY_SECRET_TOKEN>>>')
# Create a task
task = conn.create_task("Hello World - SpecFem3D", "docker-batch", 1)
# Create the input bucket and synchronize with a local folder
# Replace with the path to your local input folder
input_bucket = conn.create_bucket("sf3d-in")
input_bucket.sync_directory("test_case_qarnot_specfem3d")
# Attach the bucket to the task
task.resources.append(input_bucket)
# Create a result bucket and attach it to the task
task.results = conn.create_bucket("sf3d-out")
# Define the Docker image and the command to be run in the container
task.constants["DOCKER_REPO"] = "qarnotlab/specfem3d"
task.constants["DOCKER_TAG"] = "4.0"
task.constants['DOCKER_CMD'] = "/job/run.sh"
# Submit the task
task.run(output_dir = "sf3d_results")
Please unzip the input folder, copy the preceding code into a Python script named, for example, Qarnot_SpecFem3D.py, and make sure that your tree view looks like this:
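As a rough sketch (assuming the archive extracts to the test_case_qarnot_specfem3d folder referenced by sync_directory, with the run.sh script used as DOCKER_CMD at its top level):

.
├── Qarnot_SpecFem3D.py
└── test_case_qarnot_specfem3d/
    ├── run.sh
    └── (other example and mesh files from the archive)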
Now simply run the Python script from your terminal and it will launch a SpecFem3D simulation on the platform.
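For example, using the script name suggested above:

python3 Qarnot_SpecFem3D.py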
At any given time, you can monitor the status of your task.
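For instance, here is a minimal monitoring sketch using the Python SDK, to be run from a separate terminal since task.run() blocks until the task completes; it assumes the task listing exposed by conn.tasks() in recent SDK versions:

#!/usr/bin/env python3
import qarnot
# Minimal monitoring sketch: list your tasks and their current states
conn = qarnot.Connection(client_token='<<<MY_SECRET_TOKEN>>>')
for t in conn.tasks():
    print(t.name, t.state)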
Once the task is deployed, it should take around 5 minutes to run, and a few more to download the results to your computer. You can visualize the AVS_movie_*.inp files using ParaView or any other viewer that supports the format. You can also use our ParaView Web payload to view the results online if needed.
Now that you know how to launch a "batch" case, we will demonstrate the same "mug" simulation test case, this time running it simultaneously on a cluster of 2 instances.
#!/usr/bin/env python3
import qarnot
# Create a connection, from which all other objects will be derived
# Enter client token here
conn = qarnot.Connection(client_token='<<<MY_SECRET_TOKEN>>>')
# Create a 'docker-cluster' task
task = conn.create_task('sf3D - cluster - test', 'docker-cluster', 2)
# Create the input bucket and synchronize with a local folder
# Replace with the path to your local input folder
input_bucket = conn.create_bucket("sf3d-cluster-in-nb")
input_bucket.sync_directory("test_case_qarnot_specfem3d_cluster")
# Attach the bucket to the task
task.resources.append(input_bucket)
task.constants["DOCKER_REPO"] = "qarnotlab/specfem3d"
task.constants["DOCKER_TAG"] = "4.0.cluster"
# Commands that set up the master and worker nodes
task.constants['DOCKER_CMD_MASTER'] = "/bin/bash /opt/run-master.sh"
task.constants['DOCKER_CMD_WORKER'] = "/bin/bash /opt/run-worker.sh"
# Name of the user script passed in the input bucket
task.constants['USER_SCRIPT'] = "run.sh"
task.results = conn.create_bucket('sf3d-cluster-output-nb')
# Submit the task
task.run(output_dir = "sf3d_cluster_results")
Please unzip the input folder and make sure that your tree view looks like this:
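Again as a rough sketch (the cluster archive is assumed to extract to the test_case_qarnot_specfem3d_cluster folder referenced by sync_directory, with run.sh, the USER_SCRIPT, at its top level; the Python script name below is just an example):

.
├── Qarnot_SpecFem3D_cluster.py
└── test_case_qarnot_specfem3d_cluster/
    ├── run.sh
    └── (other example and mesh files from the archive)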
The script will run the simulation on the Qarnot platform and, once it has completed, download the results into the 'sf3d_cluster_results' folder on your computer. You should be able to view the same results as in the previous case using ParaView.
That’s it! If you have any questions, please contact qlab@qarnot.com and we will be happy to help!