msr_host_latency - Man Page

Calculate the latency between the last sample in a Mini-SEED record and the host computer time.

Synopsis

#include <libmseed.h>

double  msr_host_latency ( MSRecord *msr );

Description

msr_host_latency calculates the latency, in seconds, of the Mini-SEED data as the difference between the current UTC time of the host computer and the time of the last sample in the record.

This routine is primarily useful when dealing with telemetered or other near real-time data streams.

A double precision value is returned, but the effective precision depends on, among other factors, the accuracy of the host system clock.

Return Values

msr_host_latency returns the latency in seconds, or 0.0 on error; an error return is indistinguishable from a true latency of 0.0.
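
Example

The following is a minimal sketch of how this function might be used to report the latency of each record in a data feed. The ms_readmsr() reading loop, its arguments, the MS_NOERROR/MS_ENDOFFILE return codes, and the MSRecord header fields shown here are assumptions of this illustration based on the libmseed 2.x API of this manual's era, not part of msr_host_latency itself; adapt them to your libmseed version.

#include <stdio.h>
#include <libmseed.h>

int
main (int argc, char **argv)
{
  MSRecord *msr = NULL;
  int retcode;

  if ( argc < 2 )
    {
      fprintf (stderr, "Usage: %s <mseedfile>\n", argv[0]);
      return 1;
    }

  /* Read records one at a time; the data samples are not needed for the
   * latency calculation, so they are not unpacked (dataflag = 0). */
  while ( (retcode = ms_readmsr (&msr, argv[1], -1, NULL, NULL, 1, 0, 0)) == MS_NOERROR )
    {
      double latency = msr_host_latency (msr);

      /* Note: a result of 0.0 may mean either zero latency or an error */
      printf ("%s_%s_%s_%s: latency %.1f seconds\n",
              msr->network, msr->station, msr->location, msr->channel, latency);
    }

  if ( retcode != MS_ENDOFFILE )
    fprintf (stderr, "Error reading records (code %d)\n", retcode);

  /* Final call to clean up the MSRecord and file reading state */
  ms_readmsr (&msr, NULL, 0, NULL, NULL, 0, 0, 0);

  return 0;
}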

See Also

ms_intro(3) and msr_unpack(3).

Author

Chad Trabant
IRIS Data Management Center

Info

2006/02/27 Libmseed API