Weekly reports for Seismic SOS

Weekly Report #1 - 6/25/2013 (behind schedule)

This is the first weekly report for the "Seismic Observations in SOS" GSOC 2013 project. What was done:

  • Development environment finalized using Eclipse with SVN, Maven, and Tomcat plugins.
  • Research conducted on using embedded ObsPy Python scripts or SeisFile for data querying
    • SeisFile advantage: written fully in Java
  • ObsPy worked perfectly in a standalone python test
    • (In progress) I am trying to get my script to run through Jython, which could be used to embed ObsPy scripts in SOS.
  • IDE problems included SVN commit difficulties
    • SVN was not ignoring binaries
  • Jython is a bit unwieldy
    • Online documentation is somewhat out of date. For example, one compilation guide describes the Jythonc compiler, which is deprecated in current versions.
    • Having difficulty compiling with both javac and the built-in Jython runtime compiler.
What needs to be done:

  • Need to get Jython working properly, compiling with either javac or the built-in Jython compiler.
  • Need to test SeisFile, since it may prove to be much easier than learning Jython.
  • Need to start thinking about testing the data querying methods in SOS itself.
    • integrate new seismic module into SOS encoding

Weekly Report #2 - 7/2/2013 (behind schedule)

This is the second weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • More practice and testing using ObsPy/Jython and SeisFile.
    • Jython was successfully used to run a Python script, and then an ObsPy script from a Java main class using PythonInterpreter.
    • SeisFile was successfully used as a query script to pull .mseed (waveform timeseries format) data via FDSNDataSelect.
      • Was not able to find a tool to utilize or display the data, at least not in the SeisFile package.
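The SeisFile query mentioned above goes against the FDSN dataselect web service. As an illustrative sketch (not the project's actual code), the request it builds boils down to a plain URL; the station/channel values here are examples only:

```python
# Sketch of the kind of FDSN dataselect request used to pull .mseed data.
# The endpoint is IRIS's public dataselect service; parameter values are
# illustrative, not the project's actual query.
from urllib.parse import urlencode

DATASELECT = "http://service.iris.edu/fdsnws/dataselect/1/query"

def dataselect_url(net, sta, loc, cha, start, end):
    # Assemble the standard FDSN dataselect query parameters into a URL.
    params = {"net": net, "sta": sta, "loc": loc, "cha": cha,
              "start": start, "end": end}
    return DATASELECT + "?" + urlencode(params)

url = dataselect_url("IU", "ANMO", "00", "BHZ",
                     "2013-06-01T00:00:00", "2013-06-01T01:00:00")
```

The response body of such a request is raw miniSEED, which is why a separate tool is needed to display it.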

  • There was difficulty in making sure Jython knew where to find ObsPy packages.
    • It seemed the Jython runtime was not using the default path used in straightforward Python scripts, but its own sys.path variable that had to be set manually.
      • I would like to find a cleaner way of setting the Jython path variable.
  • SeisFile may or may not be chosen as the data-querying tool, depending on Sensor Web's current capabilities for reading mseed data files.
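The manual path fix described above can be sketched in a few lines. This runs under plain Python as well as Jython; the install directory is hypothetical:

```python
# Minimal sketch of the workaround: Jython keeps its own sys.path, so the
# directory holding the ObsPy packages must be prepended before importing
# them. The directory name here is a hypothetical install location.
import sys

obspy_lib = "/opt/jython-site-packages"  # hypothetical, not a real path from the project
if obspy_lib not in sys.path:
    sys.path.insert(0, obspy_lib)

# After this, `import obspy` would resolve inside the Jython interpreter,
# assuming the packages actually live in that directory.
```

The same effect can be achieved from the embedding Java side by exec-ing these lines through PythonInterpreter before running the real script.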
What needs to be done:

  • Choosing ObsPy or SeisFile for integration into SOS.
    • Setting up maven .pom files to include proper dependencies in the build.
    • Get a clean Maven install of SOS and the servlet to test the capabilities of SeisFile, ObsPy, or both.
  • Study the source code to find a way to efficiently test the new scripts.
    • Run tests to successfully make queries from within SOS.
    • Hopefully there will be time this week to begin on storing queried data within my local SOS database.
  • General catch-up is needed on background reading, such as the SOS and Maven documentation.

Weekly Report #3 - 7/8/2013

This is the third weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • Jython was properly included in maven dependencies, and included in the necessary seismic data encoding module.
  • The data object model for seismic events (i.e., earthquakes) and time series (windows of waveform data) was discussed and decided.
    • A rough draft UML representation of the time series model:

  • The method for bringing data into SOS is under ongoing planning
    • SOS could be used as a proxy data querier, where queries to the IRIS data source would be triggered externally (Sensor Web) and stored in the SOS database.
    • Another approach would involve SOS polling certain stations for updates in data, and importing the data when needed.

  • ObsPy is not currently included in the Maven repository. To include it as I have with Jython, I will have to write a Maven script to define the artifact correctly.
  • Learning the architecture of observations and requests has been a bit difficult, especially when I need to discern the discrepancies between seismic data and the current SOS data formats.
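For reference, a Maven dependency declaration of the kind used above for Jython looks like the following fragment. This is a sketch only; the exact artifact and version used in the project are not stated here, and 2.5.3 is an assumption based on Jython versions current in 2013:

```xml
<!-- Hypothetical dependency entry in the seismic module's pom.xml;
     version 2.5.3 is an assumption, not confirmed from the project. -->
<dependency>
  <groupId>org.python</groupId>
  <artifactId>jython-standalone</artifactId>
  <version>2.5.3</version>
</dependency>
```

Since ObsPy is not published as a Maven artifact, no equivalent entry exists for it, which is the problem described above.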
What needs to be done:

  • Draw up another UML model for the seismic event architecture.
  • Upload ObsPy to the Maven Repository and include the artifact in the SOS maven dependencies.
    • Test the ability to query IRIS data directly from SOS.
  • Figure out decisively how observations will be triggered/created.
  • Start writing test scripts handling sensorML, and parsing the data values into different objects.
    • This will most likely take place outside of SOS for now, in a standalone Java test application.

Weekly Report #4 - 7/15/2013

This is the fourth weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • Drew up a seismic event UML diagram as a rough draft of the object model mapping.

  • XML parsing: Explored StationXML, the sensor-information standard for seismograph stations, and how it relates to SensorML, the corresponding standard for SOS and SWE.
    • The XML attributes needed will be all station info used for IRIS queries, as well as parameters describing the time frame for which data is archived (when the station came online, and the date up to which it produced data).
  • Observation and sensor backend: Found that I will have to clone the existing DAO methods that are called for operations and add in my own functionality. The classes that must be redefined are the cache feeder, GetObservation, and DescribeSensor.

  • ObsPy cannot be imported directly by Maven, as it is Python and is not contained in jars or any other java packaging system.
    • One proposed solution is to place the dependencies for using ObsPy scripts in the webapp resources directory so that they are included with the webapp itself.
      • Jython will have to be able to find these packages and include them in its own build path.
    • Another possibility is to use easy_install, which would involve learning how to implement it in conjunction with Maven.
What needs to be done:

  • StationXML needs to be pulled from IRIS in a new DAO module (DescribeSensorDAO).
    • Once this is done successfully, it will be a matter of feeding the correct information into the rest of SOS so that it can set up the capability to get observations.
      • Reading and understanding the current DAO methods will be necessary to do this properly.
  • Timeseries data will be pulled via GetObservationDAO, and this class must be cloned as well.
  • Maven .pom files will need to know which DAO methods to build, and those must be the cloned ones.

Weekly Report #5 - 7/26/2013

This is the fifth weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • The biggest step made last week was the move from ObsPy to SeisFile.
    • Combining the two libraries, Jython and ObsPy, proved too cumbersome, so I opted for the Java-only SeisFile.
    • This change happened inconveniently late, but it has allowed for much more efficient development.
    • A big pro related to the goals of last week is that SeisFile includes built in StationXML parsing, so no other tools will be needed as far as that is concerned.
  • DAO classes were successfully created by cloning some of the existing Hibernate DAOs.
    • DescribeSensorDAO is properly querying StationXML from the IRIS FDSNWS data source.
    • Feeding the correct information to the rest of SOS is mostly to be done in the SOSCacheFeederDAO.
      • As of this week, my implementation of the cache feeder involves querying a station network (StationXML), and iterating through its stations to find start and end times for available data.
        • This information will be mapped to the corresponding object in the SOS data model, and given the correct associations (see weekly reports 3 and 4)
    • GetObservationDAO will be in charge of actually querying time series data.
      • The scripts used to pull data have been successfully tested, and now the data just has to be mapped to SOS.
  • More frequent conferences were set up between me and my mentors, so productivity should ramp up even more.

  • Cloning the current Hibernate cache feeder was difficult, as there are a lot of peripheral processes taking place, e.g., threading and session holders to optimize performance.
    • The seismic data cache feeder does not require this to function properly, so it merely made it more complicated to make sense of the original cache feeder.
  • The cache is required to know associations within the object model, e.g., which FeatureOfInterest belongs to which Offering, or which Procedure to which Offering, so all programming must be done with the exact object model in mind.
  • For cache feeder queries, I want a broad query that returns whole networks of stations. However, in this format, the StationXML does not seem to contain which channels are available in each Station.
    • This is one of the parameters the cache needs to know about, so the cache feeder may have to be split into sub requests for each station individually, which could potentially hurt performance depending on the scale of request.
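The cache-feeding pass described above — walking a station network and recording each station's window of available data — can be sketched as follows. The dictionaries are stand-ins for the real parsed StationXML objects, and the field names are mine, not the project's:

```python
# Sketch of the cache-feeding iteration: walk a parsed station network and
# record each station's data-availability window for the SOS cache.
# The dicts below are illustrative stand-ins for parsed StationXML objects.
stations = [
    {"code": "ANMO", "start": "1989-08-29", "end": None},          # still producing data
    {"code": "COLA", "start": "1996-01-01", "end": "2010-05-01"},  # archived range
]

# Map station code -> (start, end); an end of None marks an open-ended window.
availability = {s["code"]: (s["start"], s["end"]) for s in stations}
```

In the real cache feeder, each entry would then be mapped onto the corresponding SOS object with the correct associations, as discussed in weekly reports 3 and 4.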
What needs to be done:

  • Seismic events need some thought as to exactly what information the cache must be provided with in order for SOS to get valid observations.
  • Cache feeder needs to start actually injecting the SOS object model with the correct mappings.
    • I expect this to need a lot of effort and testing to complete, but once finished, will mark a huge step towards project completion.
  • If the cache feeder can be completed and working correctly this week, DescribeSensorDAO and GetObservationDAO will only require some quick mapping into SOS to work correctly.

Weekly Report #6 - 7/29/2013

This is the sixth weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • SOSCacheFeederDAO was finished and successfully tested!
    • Its current function is to add references to all global seismic events between magnitudes 8 and 10, and to all possible time series for the seismic station network IU.
      • These parameters are flexible of course, but the network IU is optimal since it covers the largest area of stations, on multiple continents.
    • Last week's channel issue was solved by setting the level query parameter to "channel." No additional queries were needed.
    • Cache test output is in the attachments.
  • Object model mapping for seismic event was redone.
    • Events have nothing to do with sensors, stations, or networks, and the model had to reflect this properly.

  • The biggest issue this last week was having to redo the shoddy event model I created a few weeks back.
    • To properly feed the cache data, a proper mapping for the object model must be kept in mind.
    • For now, if you wish to see the object model, see the event hierarchy in the cache feeder test output. New UML will be created this week.
  • Had trouble creating an instance of WritableContentCache to actually test the adding and getting of cached entities (no class definition exception).
    • Once the cache module was added to the pom, this worked fine.
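The level-parameter fix mentioned above can be sketched as a query against the FDSN station service. With level=channel, a single network-wide request returns every channel per station, so no per-station sub-requests are needed (the endpoint is IRIS's public station service; the network value is an example):

```python
# Sketch of the broad station query: level=channel asks the FDSN station
# service to include channel listings for every station in one response.
from urllib.parse import urlencode

STATION = "http://service.iris.edu/fdsnws/station/1/query"

def station_url(network, level="channel"):
    # level may be "network", "station", "channel", or "response";
    # "channel" gives the per-station channel info the cache needs.
    return STATION + "?" + urlencode({"network": network, "level": level})

url = station_url("IU")
```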
What needs to be done:

  • DescribeSensorDAO and GetObservationDAO must be finalized.
    • Now that the cache feeder is working correctly, these DAOs must take into account what the cache knows to be valid query parameters to make valid timeseries, station, and event queries.
  • The DAOs must be referenced in the webapp so that the server knows to use them.
    • If this works, I can test using the Tomcat servlet application rather than simple main methods, much more accurately signifying overall success.

Weekly Report #7 - 8/5/2013

This is the seventh weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • Some more updates to the cache feeding had to be made in order to support development of DescribeSensorDAO and GetObservationDAO.
  • DescribeSensorDAO was cloned from the hibernate DescribeSensorDAO, in that it extends the same abstract class, and overrides the getSensorDescription function.
    • This method takes a requested procedure from SOS and converts it into a sensor description, to which all necessary values are added.
      • This sensor description is put into a server response and returned.
  • GetObservationDAO was cloned as well, which overrides the getObservation function.
    • The request contains the offering and procedures, as well as other request parameters associated with the observation.
      • The specific observation is determined and queried from the IRIS FDSN data repository, and returned in a server response.
  • The data model is designed to let the queriers determine the observation type, along with a set of observation-specific information from which the exact query parameters can be built.
    • The Offering provides the format for observation type:
      • In the case of seismic event requests: Offering = <iris catalog>. This value gives the type as "PDE" ("Preliminary Determination of Epicenters"), meaning the file returned is a seismic event. The value also contains the data provider, which is used to narrow down the query parameters.
      • In the case of time series requests: Offering = <sensor station code>.
    • In the case of events, the Procedure only acts as a continuation of the Offering, and contains the IRIS contributor. This was done mainly as a placeholder for objects that are referenced from the Procedure, since events do not have an exact sensor they are associated with.
    • In the case of time series, the Procedure contains both the network code and the station code, to describe exactly where the data is to be queried from.
  • Text files linking the SOS operation service names to the related DAO classes were created. They should automatically be found and processed by Jasap during startup.
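The offering/procedure convention above amounts to packing query parameters into identifier strings and unpacking them at request time. A tiny illustrative sketch (the function name and the dot separator are my assumptions, not the project's actual convention):

```python
# Illustrative sketch of unpacking a time-series Procedure identifier.
# The doc says the Procedure carries both network and station codes;
# a "<network>.<station>" packing with a dot separator is assumed here.
def parse_timeseries_procedure(procedure):
    network, station = procedure.split(".", 1)
    return {"network": network, "station": station}

params = parse_timeseries_procedure("IU.ANMO")
```

An event Procedure would be handled analogously, except that it carries the IRIS contributor rather than a sensor location.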

  • The biggest issue for this week has been testing:
    • The requests for DescribeSensorDAO and GetObservationDAO come from the server, so until a working server is set up, they are difficult to test completely.
    • I am currently having trouble with the webapp GUI that lets users set up admin settings on first startup. The webapp is starting, but I am not given the option to set up the server, or test the client.
What needs to be done:

  • All focus this next week is on testing the request services and the DAO describeSensor and getObservation methods.
    • If these can be working correctly, it means I am done with stage 1 of the project, and can move on to visualization in Sensor Web.

Weekly Report #8 - 8/13/2013

This is the eighth weekly report for the "Seismic Observations in SOS" GSOC 2013 project. It is a day late, since I have had a spotty internet connection this past week.

What was done:


  • Challenges this week have been testing based once again.
    • I am still having trouble getting tomcat to connect to my remote debugger, and even once I got it connected, it was not properly configured to my code, and breakpoints were not registering during runtime.
    • Currently, testing is being done using print statements in areas where exceptions are occurring, which is cumbersome at best.
  • Issues with GetCapabilities came down to not relating SOS entities correctly.
    • The problems were fixed when the Offering was made to relate to the Procedure, but not vice versa.
What needs to be done:

  • Still, the goal is to get Seismic SOS working completely.

Weekly Report #9 - 8/22/2013

This is the ninth weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • More work was put into DescribeSensorDAO
  • More work was put into CacheFeederDAO
    • Contents section of response now displays valid query times, offering, procedure, and various formats.

  • Still struggling to get rid of exceptions in the DescribeSensor request.
    • Most of these have to do with SOS encoder not expecting the StationXML schema.
  • CacheFeederDAO still needs to display observableProperties and featuresOfInterest.
    • This probably has to do with cross-relations.

What needs to be done:

  • This past week involved a lot of travel and personal emergencies.
    • This next week needs an improvement in work output
  • Hopefully, I can wrap up GetCapabilities and DescribeSensor by Monday, and possibly GetObservation as well.

Weekly Report #10 - 8/29/2013

This is the tenth weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • A problem was solved in which data queried by SeisFile contained improper station data.
    • The problem probably has to do with an update in the StationXML schema, or an issue with the IRIS database.
    • I am currently in contact with the SeisFile developers trying to solve the exact problem, as my solution was more of a workaround.
  • An issue in DescribeSensor that was inhibiting its ability to synthesize SensorML from StationXML was solved.
    • The issue had to do with an iterator being improperly reused and stepped through twice.
  • Coordinates for the sensor FeaturesOfInterest were added via the envelope element during cache feeding.

  • DescribeSensor is still incomplete in that I am still working out exactly what the SensorML needs in order to be recognized by SOS as valid.
  • GetObservation has seen limited work, since GetCapabilities and DescribeSensor have received the most attention.
What needs to be done:

  • All further problems in DescribeSensor must be solved completely.
  • Light documentation within the code needs to be done in order to prepare for finishing up the project.

Weekly Report #11 - 9/2/2013

This is the eleventh weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • DescribeSensor now works correctly for seismic stations as well as seismic events.

  • That said, not all procedures are working for DescribeSensor yet.
  • I have been doing a good amount of traveling at the end of summer, so a consistent internet connection has been difficult to find.
    • Luckily, DescribeSensor working means another huge step for the project.
What needs to be done:

  • Handling for faulty procedures needs to be decided upon.
    • Will those procedures just not be loaded into the cache, or will the absence of DescribeSensor output for those procedures be handled by whatever higher level program is using SOS?
  • GetObservation needs to go into overhaul and be brought up to speed with the rest of the project.

Weekly Report #12 - 9/10/2013

This is the twelfth weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:


  • The modular design of SOS makes GetObservation somewhat difficult.
    • The data has to be funneled through the correct value frameworks, with the correct observation-type format.
    • This is less complicated than GetCapabilities and DescribeSensor, so I am less worried about this challenge.
What needs to be done:

  • Now, there is less than a week left for the project.
    • GetObservation must be brought up to speed.
      • The data is there, it is now only a matter of formatting it correctly.
  • Also, I will clean up and document the code to the best of my ability in the time given.

Weekly Report #13 - 9/16/2013

This is the thirteenth weekly report for the "Seismic Observations in SOS" GSOC 2013 project.

What was done:

  • GetObservation operation is working correctly for event requests.
  • GetObservation operation for timeseries requests is properly formatting the observation, but cannot currently return the miniseed data needed to create waveform graphs.

  • Queried miniseed data is causing the server to hang somewhere in the encoding process.
    • This is either because it cannot be converted to the text format I am currently using to pass it into the SOS encoder, or because it simply involves too much raw data for the server to handle.
What needs to be done:

  • Since today is the suggested "pencils down" date, the project should be wrapped up. However, I am not quite satisfied not being able to return raw waveform data, and think the problem is on the verge of being solved.
    • The issue will undergo further work this week in addition to general documentation writing in preparation for the final GSOC due date.
  • Documentation will include:
    • in-code comments
    • final draft data models for the various entities involved in SOS observation requests
    • wiki instructions for setting up a working instance of Seismic SOS and making valid server requests
    • a list of known bugs and explanation/workarounds.
  • A live test server instance also needs to be set up for seismic SOS as it currently stands.
Topic attachments

  • cachefeeder_test_output.txt (95 K, 29 Jul 2013) - cache test console output
  • describe_event_response.xml (11 K, 17 Sep 2013) - Example event DescribeSensor response
  • getObservation_event_request.xml (866 bytes, 17 Sep 2013) - Event GetObservation request example
  • getObservation_event_response.xml (1 K, 17 Sep 2013) - Event GetObservation response example
  • getObservation_timeseries_request.xml (1 K, 17 Sep 2013) - Timeseries GetObservation example request
  • getObservation_timeseries_response.xml (3 K, 17 Sep 2013) - Timeseries GetObservation example response
  • getcapabilitiesout.xml (24 K, 13 Aug 2013) - GetCapabilities response output
  • iu_anmo_describesensor_out.xml (3 K, 02 Sep 2013)
Topic revision: r16 - 19 Sep 2013, PatrickNoble