Trace Editing: II. Detailed Seismic Data Processing Techniques

Trace editing involves processing seismic data to remove noise and anomalies, correct polarity, and zero out traces whose amplitudes fall outside set thresholds. This includes despiking, polarity reversal, and trace zeroing. Automatic first-break picking detects the onset of refracted seismic signals and is used to determine near-surface static corrections by inversion. Common automatic picking methods include the STA/LTA ratio method, which uses short- and long-term amplitude averages, and multi-window methods, which use moving windows to calculate amplitude averages and detect signals above thresholds.


TRACE EDITING

Trace editing includes de-spiking (eliminating high-amplitude anomalies), polarity reversal (changing traces to the correct polarity) and trace zeroing (setting trace amplitudes to zero if the average amplitude is outside set thresholds).
Seismic data are examined to spot bad traces on each seismic line that contain spikes or noise trains unrelated to the true seismic signal, and to spot any trace with incorrect polarity.
A bad recording group will affect different traces on different shots. For example, a bad receiver will generate one noisy trace per shot, and its location within the shot record will move as the shot moves.
Land data also contain a noise train from the shot itself. This noise, which travels along the near surface, is usually referred to as ground roll. Traces that contain noise trains and spikes are muted out (that is, replaced with zeros), because if not removed beforehand they can be spread in both time and space by subsequent processing steps.
Note that muting is the removal of the contribution of selected seismic traces in a stack to minimize
air waves, ground roll and other early-arriving noise. Low-frequency traces and long-offset traces are
typical targets for muting.
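The edits described above reduce to simple amplitude tests on each trace. The sketch below illustrates de-spiking and trace zeroing with numpy; the function name, the thresholds and the reference level are illustrative assumptions, not values prescribed by the text.

import numpy as np

def edit_traces(gather, spike_factor=10.0, low=0.1, high=10.0):
    """Trace-editing sketch: de-spike samples and zero out bad traces.

    gather : 2-D array of shape (n_traces, n_samples).
    spike_factor, low, high : illustrative thresholds (assumptions)."""
    edited = gather.copy()
    # Reference level: median of the mean absolute amplitude of all traces.
    trace_level = np.mean(np.abs(edited), axis=1)
    reference = np.median(trace_level)
    for i, trace in enumerate(edited):
        # Trace zeroing: kill traces whose average amplitude is outside thresholds.
        if trace_level[i] < low * reference or trace_level[i] > high * reference:
            edited[i, :] = 0.0
            continue
        # De-spiking: zero isolated samples far above the trace's own RMS level.
        rms = np.sqrt(np.mean(trace ** 2)) + 1e-12
        spikes = np.abs(trace) > spike_factor * rms
        edited[i, spikes] = 0.0
    return edited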

POLARITY REVERSAL

One final problem in trace editing is that of polarity. Polarity is the way in which seismic data are
recorded and displayed. Most seismic data are recorded using the standard specified by the Society of
Exploration Geophysicists (SEG).

II. DETAILED SEISMIC DATA PROCESSING TECHNIQUES


First break picking
From Wikipedia, the free encyclopedia

First-break picking is the detection or picking of the onset of arrivals of refracted signals from all the signals received by receiver arrays and produced by a particular source. It is also called first-arrival picking or first-break detection. First-break picking can be done automatically, manually or as a combination of both. With the growth of computing power and of seismic survey sizes, automatic picking is often preferred.[1]


Significance
First-break picks associated with the refracted arrival times are used in an inversion scheme to study the near-surface low-velocity zone and in the subsequent determination of static corrections. A static correction is a correction applied to geophysical data, especially seismic data, to compensate for the effect of near-surface irregularities, for differences in the elevation of shots and geophones, or, more generally, to correct the positions of sources and receivers.
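As a concrete illustration of one common static correction (an assumed example, not something prescribed above), the sketch below computes an elevation static that shifts a trace as if source and receiver were moved from their surface elevations to a flat datum at a replacement velocity.

def elevation_static_ms(src_elev, rec_elev, datum, v_repl=2000.0):
    """Hypothetical elevation-static sketch: total time shift (ms) that moves
    source and receiver from their elevations (m) to a flat datum (m),
    assuming a constant replacement velocity v_repl (m/s)."""
    t_src = (src_elev - datum) / v_repl   # one-way time from source elevation to datum
    t_rec = (rec_elev - datum) / v_repl   # one-way time from receiver elevation to datum
    return -1000.0 * (t_src + t_rec)      # negative shift removes the extra travel time

# Example: source at 350 m, receiver at 320 m, datum at 300 m, v_repl = 2000 m/s
# gives a static of -35 ms applied to the trace.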

History of First Break Picking


Gelchinsky and Shtivelman[2] (1983) used correlation properties of signals and applied a statistical criterion for the estimation of first-arrival times.
Coppens[3] (1985) calculated the ratio of the energy of the seismogram in two windows and used it to differentiate between signal and noise.
Michael D. McCormack et al.[4] (1993) introduced a backpropagation neural network (BNN) method. The neural network that edits seismic data or picks first breaks was trained by users, who simply selected and presented to the network examples of trace edits or refraction picks. The network then iteratively adjusted its internal weights until it could accurately reproduce the examples provided by the users.
Fabio Boschetti et al.[5] (1996) introduced a fractal-based algorithm that detects the presence of a signal by analyzing the variation in fractal dimension along the trace. This method works when the signal-to-noise ratio is small, but it is considerably slow.
A direct correlation method was introduced by Joseph et al.[6] (1999), developed for use on highly time-resolved, low-noise signals acquired in the laboratory. In this method, the greatest value of Pearson's correlation coefficient between segments of observed waveforms near the pulse onset and an appropriate reference serves as the time-determination criterion.
Zuolin Chen et al.[7] (2005) introduced a multi-window algorithm to detect the first break. In this method, three moving windows are used and the averages of the absolute amplitudes in each window are calculated; ratios based on these window averages then provide criteria to differentiate signals from unwanted noise.
Wong et al.[8] (2009) introduced the STA/LTA ratio method. This method is similar to Coppens'[3] algorithm; the difference is that the ratio is formed between two averages of energy, over a short-term window and a long-term window, denoted STA/LTA (short-term average/long-term average), instead of the ratio of the seismogram energy in the two windows used in Coppens' algorithm.

Methods of Automatic First Break Picking


STA/LTA ratio Method[8]
This method is similar to Coppens' (1985) algorithm. The difference is that the ratio is formed between two averages of energy, over a short-term window and a long-term window, denoted STA/LTA (short-term average/long-term average), instead of the ratio of the seismogram energy in the two windows used in Coppens' algorithm. The numerical derivative of the ratio can be defined as

di = ri+1 - ri,

where ri+1 is the STA/LTA ratio at time index i+1, and ri is the STA/LTA ratio at time index i. For noise-free seismograms, the maximum value of the numerical derivative of the STA/LTA ratio is close to the time of the first arrival.
Wong et al. (2009) modified the algorithm of the energy-ratio method into what they call the modified energy ratio. In this method, they define the energy ratio as

eri = (sum of xj^2 for j = i to i+ne) / (sum of xj^2 for j = i-ne to i),

where xi is the time series representing a seismogram with time index i = 1, 2, ..., N, and ne is the number of points in an energy window. Then, the modified energy ratio is defined as

er3i = (|xi| * eri)^3.

The peak of the modified energy ratio er3i is very close to the time of the first arrivals on noise-free seismograms.
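A minimal numpy sketch of both ratio pickers defined above; the window lengths, the synthetic test trace and the function names are illustrative assumptions.

import numpy as np

def sta_lta_pick(x, ns=10, nl=100):
    """STA/LTA picker sketch: the first break is taken where the numerical
    derivative of the short-term/long-term average ratio is largest."""
    n = len(x)
    e = x ** 2
    r = np.zeros(n)
    for i in range(nl, n - ns):
        sta = e[i:i + ns].mean()          # short-term average after sample i
        lta = e[i - nl:i].mean() + 1e-12  # long-term average before sample i
        r[i] = sta / lta
    d = np.diff(r)                        # di = ri+1 - ri
    return int(np.argmax(d))

def mer_pick(x, ne=20):
    """Modified energy ratio picker sketch (after Wong et al., 2009):
    er3i = (|xi| * eri)^3, picked at its peak."""
    n = len(x)
    er3 = np.zeros(n)
    for i in range(ne, n - ne):
        post = np.sum(x[i:i + ne] ** 2)
        pre = np.sum(x[i - ne:i] ** 2) + 1e-12
        er3[i] = (abs(x[i]) * post / pre) ** 3
    return int(np.argmax(er3))

# Synthetic test: noise followed by a stronger arrival at sample 300.
rng = np.random.default_rng(0)
trace = 0.05 * rng.standard_normal(1000)
trace[300:] += np.sin(0.3 * np.arange(700)) * np.exp(-0.005 * np.arange(700))
print(sta_lta_pick(trace), mer_pick(trace))   # both picks should land near sample 300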
Multi-Window Method[7]
This method calculates the averages of absolute amplitudes of a seismic trace using three moving time windows placed before and after each time point (sample). When the instantaneous absolute amplitude exceeds an automatically adjusted threshold, ratios based on the window averages over previous time samples provide criteria to differentiate signals from unwanted noise.
The multi-window automatic P-phase picker operates in the time domain. It includes procedures to define time windows, standards, corresponding thresholds and a waveform correction.
1. The averages of absolute amplitudes within the BTA (Before Term Average), ATA (After Term Average) and DTA (Delayed Term Average) windows are computed at each sample.
Standards R2(t) and R3(t) are used for the discrimination of high-amplitude short-
duration and long-duration noise.
2. A threshold H1(t) is defined from the mean Em and the standard deviation Esd of the background amplitude over the previous p shifted samples, together with a coefficient that adjusts the height of the first threshold and is taken to be 3. From this definition it is clear that H1(t) is automatically adjusted with the variance of the background noise.
3. Because H1(t) is defined to be larger than most pre-existing noise levels, and the instantaneous absolute amplitude at the trigger time point is higher than H1(t), the real onset time of the first arrival of an event must be earlier than the trigger time point. A waveform correction is therefore used to compensate for this belated onset time. For an impulsive first arrival, the height of the absolute amplitude and the representative gradient at the trigger point can be used to accomplish the correction.
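A rough numpy sketch of the multi-window idea; the window lengths, the ratio thresholds and the exact form of the adaptive threshold are illustrative assumptions rather than the published parameterization, and no waveform correction is applied.

import numpy as np

def multi_window_pick(x, nb=50, na=10, nd=30, c=3.0):
    """Multi-window picker sketch. For each sample, compute average absolute
    amplitudes in a window Before (BTA), just After (ATA) and a Delayed
    window after (DTA) the sample; trigger when the instantaneous amplitude
    exceeds a noise-adaptive threshold and the window ratios indicate a real
    onset. All window lengths and ratio thresholds here are assumptions."""
    ax = np.abs(x)
    n = len(x)
    for t in range(nb, n - na - nd):
        bta = ax[t - nb:t].mean()               # average before the sample
        ata = ax[t:t + na].mean()               # average just after the sample
        dta = ax[t + na:t + na + nd].mean()     # delayed average after the sample
        # Noise-adaptive threshold from the background window (assumed form).
        h1 = bta + c * ax[t - nb:t].std()
        r2 = ata / (bta + 1e-12)                # screens high-amplitude short bursts
        r3 = dta / (bta + 1e-12)                # screens long-duration noise trains
        if ax[t] > h1 and r2 > 2.0 and r3 > 1.5:
            return t                            # trigger time (true onset is slightly earlier)
    return None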

Available Code
Potash SU is a package of Seismic Unix-style codes developed by Balazs Nemeth. It provides a simple window-based first-break picker subroutine; the figure shows the seismic image before and after the application of the subroutine.

Right: before first-break picking; left: after first-break picking.

Future Trend of the Topic


Methods of picking: Automatic first-break picking plays an important role in seismic data processing and directly influences the quality of seismic sections. Because of the increase in seismic survey size, more efficient and faster first-break picking methods are needed, with parallel methods being preferred.
Application of first-break detection: Traditionally, geophysicists use first breaks for static corrections. First-break signals can also be used as observation data for history matching.
---------------------------------------------------------------------------------------------------

Dictionary:Flex binning

Locally increasing bin size to maintain constant multiplicity, designed to compensate for acquisition
irregularities. Bin-flexing schemes usually use some uniqueness criteria involving trace selection so
that only one trace in each offset range is retained.
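As an illustration of the idea (an assumed parameterization, shown in one dimension for brevity): the bin is grown around its centre until a target fold is reached, and only the trace nearest the centre is kept in each offset range.

import numpy as np

def flex_bin(midpoints, offsets, center, base_half_width, target_fold,
             offset_bins, max_flex=2.0, step=0.1):
    """Flex-binning sketch: grow the bin around `center` until `target_fold`
    traces are collected or the maximum flex is reached, keeping at most one
    trace (the one nearest the centre) per offset range."""
    half = base_half_width
    selected = {}
    while half <= max_flex * base_half_width:
        inside = np.where(np.abs(midpoints - center) <= half)[0]
        selected = {}
        for idx in inside:
            k = np.digitize(offsets[idx], offset_bins)      # offset class of this trace
            d = abs(midpoints[idx] - center)
            # Uniqueness criterion: keep only the closest trace in each offset class.
            if k not in selected or d < abs(midpoints[selected[k]] - center):
                selected[k] = idx
        if len(selected) >= target_fold:
            break
        half += step * base_half_width                       # flex the bin outward
    return sorted(selected.values())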

Broadband seismic: What the fuss is all about

Written by Andrew Long, Friday, 01 March 2013

Andrew Long of Petroleum GeoServices provides a rough guide to the science behind broadband seismic
and why it is transforming the possibilities for acquiring high resolution images of the subsurface using
towed streamer marine seismic acquisition technology.
In any towed streamer seismic acquisition project there are three considerations regarding the bandwidth of signal from the earth available in the final seismic image product: 1) the frequency bandwidth propagated into the earth from the source array; 2) the frequency bandwidth recovered from the earth in the recorded data; and 3) the frequency bandwidth preserved throughout all processing and imaging steps.

Seismic signals are described in classical terms of amplitude, phase, and frequency content. Each component must be faithfully preserved in order to accurately interpret geological structure and stratigraphy, and to accurately predict lithology and fluid distribution during reservoir characterization. The latter pursuit benefits in particular from very low frequency amplitudes being recovered from the earth. A generic definition of broadband seismic thus describes an acquisition and processing system with source and receivers which enhances and preserves the bandwidth at both low and high frequencies in a pre-stack amplitude- and phase-compliant manner so that subsequent processing and interpretation can utilize all the information contained in the signal from the earth.

Ghosts

Unwanted reflections from the free surface of the ocean continuously interfere in a constructive and destructive
manner with the seismic wavefield propagated into the earth from a source array. The source wavefield reflected from
the surface (the source ghost) is a time-delayed and opposite polarity version of the source wavefield propagated
directly from the source array into the earth, and the two wavefields propagate together in a coupled manner. The net
effect is that the frequency bandwidth propagated into the earth contains significant notches at periodic frequencies,
and the notch frequencies are a function of both source depth and emission angle (measured with respect to vertical).
Similarly, the receivers (along each streamer) record two versions of the seismic wavefield scattered back from the
earth, coupled together and interfering in a continuously constructive and destructive manner. The wavefield reflected
downwards from the free-surface of the ocean (the receiver ghost) is referred to as the down-going wavefield, and is
a time-delayed and opposite polarity version of the upgoing wavefield. The wavefield recorded with conventional
hydrophone-only streamers is a scalar measurement of pressure; the total pressure, which is the sum of the up-
going and downgoing pressure wavefields. The recorded total pressure wavefield contains significant notches at
periodic frequencies, and the notch frequencies are a function of both receiver depth and emergence angle
(measured with respect to vertical). So collectively, conventional seismic data contains frequency notches related to
both source ghost and receiver ghost effects. These effects notably penalize the low and high frequency content in
seismic data, resulting in a limited frequency bandwidth being recovered from the earth.
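As a quick numerical illustration of the notch behaviour described above (the 1500 m/s water velocity and the 15 m depth are assumed example values): the ghost arrives delayed by 2*d*cos(theta)/v, so the pressure-ghost notches fall at integer multiples of v/(2*d*cos(theta)) and move to higher frequencies as the emergence angle increases.

import numpy as np

def pressure_ghost_notches(depth_m, angle_deg=0.0, v_water=1500.0, n_notches=5):
    """Receiver-ghost notch frequencies (Hz) for a hydrophone at depth_m,
    assuming a flat sea surface and water velocity v_water. The notch spacing
    is v / (2 * d * cos(theta)); notches move up in frequency as the
    emergence angle increases."""
    theta = np.radians(angle_deg)
    f0 = v_water / (2.0 * depth_m * np.cos(theta))
    return f0 * np.arange(n_notches + 1)   # includes the 0 Hz notch of the pressure ghost

# Example: a streamer at 15 m depth has pressure-ghost notches at 0, 50, 100, ... Hz
# at vertical incidence; at 30 degrees emergence the spacing widens to ~57.7 Hz.
print(pressure_ghost_notches(15.0))
print(pressure_ghost_notches(15.0, angle_deg=30.0))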

Physics describes how any pressure wavefield can also be defined in terms of the derivative of pressure normal to
the wavefront; measured in units of particle velocity. Figure 1 (shown overleaf) illustrates how the receiver ghost notch
frequencies are complementary for pressure and particle velocity wavefields, and how the notch frequencies change
as a function of emergence angle. There is usually no usable information in the vicinity of the spectral notches, so any
processing-based solution to recover information in these parts of the spectrum must be based on reconstructing the
data that have not been recovered from other parts of the data with higher signal-to-noise (S/N) content.
Figure 1 illustrates how the pressure (blue) and particle velocity (red) amplitude spectra are complementary when
measured at the same depth and location (collocated). Periodic notches in both spectra are related to the receiver
depth and the angle of emergence of the seismic wavefield. As the emergence angle increases (vertical propagation
means zero emergence angle), the notches move to higher frequencies.

Figure 2. The image on the left is the result of seismic inversion applied to conventional seismic data containing both
source and receiver ghost effects. The color scale represents P impedance: the product of compressional velocity
and density. In contrast, the right image represents the ghost-free result provided from PGS GeoSource and dual-
sensor GeoStreamer technologies. Note the improvements in resolution on the right.

Traditionally, this involves simple 1D deconvolution of the data using the deterministic assumptions that the sea surface is perfectly flat, the streamer depth is constant, and the earth and water column are homogeneous. Inevitably, such methods are bound to fail as the various assumptions are increasingly violated.
Several acquisition-based methods have emerged that do not seek to mitigate the presence of the ghost notches. They record information with different ghost characteristics such that, when all the data are combined, there is good S/N at a wider range of frequencies:

Over-under hydrophone-only streamers.
Variable-depth hydrophone-only streamers.
Dual-sensor streamers (include vertical-component particle velocity sensors).
Multi-component streamers (include vertical and cross-line component particle velocity sensors).

Each methodology requires rather exhaustive explanation to describe its implementation, but the common element is
that a reflection wavefield approximating the true up-going pressure reflection is derived in processing, and the effects
of the receiver ghost are removed.
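For the dual-sensor case, a much-simplified sketch of deriving an up-going pressure estimate, assuming collocated pressure and vertical particle-velocity recordings, vertical incidence and a constant water impedance (the rho and c values are illustrative); this sketches the general idea only, not any vendor's actual processing.

import numpy as np

def upgoing_pressure(p, vz, rho=1000.0, c=1500.0):
    """Very simplified dual-sensor wavefield separation at vertical incidence:
    scale the vertical particle velocity by the acoustic impedance rho*c and
    sum with pressure so the down-going (receiver ghost) part cancels.
    The sign of the velocity term depends on the positive direction chosen
    for vz; real up/down separation is done per frequency-wavenumber
    component with an obliquity (cos theta) correction."""
    return 0.5 * (np.asarray(p) + rho * c * np.asarray(vz))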

The past couple of years have also seen a variety of source array approaches that deploy source elements at two or
more different depths, so that processing can reduce or remove source ghost effects. A family of
processing-based methods has also emerged in recent years that attempts to reduce or remove source and/or
receiver ghost effects from conventionally acquired seismic data. Each makes a series of assumptions, but results
can be favorable in certain scenarios. Figure 2 shows a comparison of conventional source and receiver seismic data
vs ghost-free seismic data.

Using ghost-free data

As demonstrated in Figure 2, removing the effects of the source and receiver ghosts significantly improves the frequency bandwidth recovered from the earth and facilitates high-resolution interpretation. Ghost-free data is in fact a
prerequisite for many processing algorithms and inversion schemes. Overall, each acquisition and processing
solution to mitigating ghost effects and increasing recoverable frequency bandwidth is based upon several
assumptions. In optimal survey conditions and in locations with naturally high S/N seismic images the various
broadband results may be quite comparable in terms of image quality. But the industry is still in the process of
understanding the penalties for reservoir characterization and image quality as the various assumptions in each methodology are violated by the acquisition and survey environment, and by different geological settings and styles.

The most robust broadband seismic solutions are based on an acquisition platform, but even then the industry is still learning how best to process such data.

