953 results for DATA QUALITY


Relevance: 60.00%

Abstract:

Stochastic modelling is critical in GNSS data processing. Currently, GNSS data processing commonly relies on empirical stochastic models that may not reflect the actual data quality or noise characteristics. This paper examines real-time GNSS observation noise estimation methods that determine the observation variance from a single-receiver data stream. The methods involve three steps: forming a linear combination, handling the ionosphere and ambiguity biases, and estimating the variance. Two distinct approaches are applied to overcome the ionosphere and ambiguity biases: the time-differenced method and the polynomial prediction method. The real-time variance estimation methods are compared with the zero-baseline and short-baseline methods. The proposed method requires only single-receiver observations and is therefore applicable to both differenced and un-differenced data processing modes. However, the methods may be limited to normal ionospheric conditions and to GNSS receivers with low measurement autocorrelation. Experimental results also indicate that the proposed method can yield more realistic parameter precision.
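
To make the time-differenced idea concrete, the sketch below estimates the observation noise variance from a single-receiver series by differencing consecutive epochs; over short intervals the ionospheric bias varies slowly and the ambiguity is constant, so the differenced series is dominated by observation noise and has roughly twice its variance. This is a minimal illustration of the general approach, not the paper's implementation; the function name and the synthetic example are hypothetical.

```python
import numpy as np

def time_differenced_variance(obs):
    """Estimate observation noise variance from a single-receiver series.

    obs : 1-D array of a bias-dominated observable (e.g. a geometry-free
          phase combination) sampled at consecutive epochs.

    Differencing consecutive epochs removes the slowly varying ionospheric
    bias and the constant ambiguity to first order; for white noise of
    variance sigma^2 the differenced series has variance 2 * sigma^2.
    """
    d = np.diff(np.asarray(obs, dtype=float))
    return 0.5 * np.var(d, ddof=1)

# Synthetic check: slow drift plus white noise with a 0.003 m standard deviation.
rng = np.random.default_rng(0)
t = np.arange(600.0)
series = 1e-4 * t + 0.003 * rng.standard_normal(t.size)
print(np.sqrt(time_differenced_variance(series)))  # close to 0.003
```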

Relevance: 60.00%

Abstract:

This paper presents a technique for the automated removal of noise from process execution logs. Noise is the result of data quality issues such as logging errors and manifests itself as infrequent process behavior. The proposed technique generates an abstract representation of an event log as an automaton capturing the directly-follows relations between event labels. This automaton is then pruned of arcs with low relative frequency and used to remove from the log those events that do not fit the automaton, which are identified as outliers. The technique has been extensively evaluated on top of various automated process discovery algorithms, using both artificial logs with different levels of noise and a variety of real-life logs. The results show that the technique significantly improves the quality of the discovered process model in terms of fitness, appropriateness, and simplicity, without negative effects on generalization. Furthermore, the technique scales well to large and complex logs.
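
A minimal sketch of the filtering idea is given below: build the directly-follows counts of a log, prune arcs with low relative frequency, and drop events that do not fit the remaining automaton during replay. The pruning rule (frequency relative to the most frequent outgoing arc of the same activity) and the threshold value are illustrative assumptions, not the paper's exact definitions.

```python
from collections import Counter

def filter_infrequent_behavior(log, threshold=0.1):
    """Filter an event log via a pruned directly-follows automaton.

    log       : list of traces, each trace a list of activity labels.
    threshold : arcs whose count, relative to the most frequent outgoing
                arc of the same source activity, falls below this value
                are pruned (illustrative rule).
    Returns a new log with non-fitting events dropped as outliers.
    """
    # 1. Directly-follows counts, with artificial start/end markers.
    df = Counter()
    for trace in log:
        path = ["<start>"] + list(trace) + ["<end>"]
        for a, b in zip(path, path[1:]):
            df[(a, b)] += 1

    # 2. Prune arcs with low relative frequency.
    max_out = Counter()
    for (a, _), n in df.items():
        max_out[a] = max(max_out[a], n)
    kept = {arc for arc, n in df.items() if n / max_out[arc[0]] >= threshold}

    # 3. Replay each trace and drop events that break the kept arcs.
    filtered = []
    for trace in log:
        current, clean = "<start>", []
        for event in trace:
            if (current, event) in kept:
                clean.append(event)
                current = event
        filtered.append(clean)
    return filtered

example_log = [["a", "b", "c"]] * 20 + [["a", "x", "b", "c"]]  # "x" is noise
print(filter_infrequent_behavior(example_log))  # the stray "x" is removed
```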

Relevance: 60.00%

Abstract:

This research used design science research methods to develop, instantiate, implement, and measure the acceptance of a novel software artefact. The primary purpose of this software artefact was to enhance data collection, improve its quality and enable its capture in classroom environments without distracting from the teaching activity. The artefact set is an iOS app, with supporting web services and technologies designed in response to teacher and pastoral care needs. System analysis and design used Enterprise Architecture methods. The novel component of the iOS app implemented proximity detection to identify the student through their iPad and automatically link to that student's data. The use of this novel software artefact and web services was trialled in a school setting, measuring user acceptance and system utility. This integrated system was shown to improve the accuracy, consistency, completeness and timeliness of captured data and the utility of the input and reporting systems.

Relevance: 60.00%

Abstract:

With the introduction of the PCEHR (Personally Controlled Electronic Health Record), the Australian public is being asked to accept greater responsibility for the management of their health information. However, the implementation of the PCEHR has been marked by poor adoption rates, underscored by criticism from stakeholders with concerns about transparency, accountability, privacy, confidentiality, governance, and limited capabilities. This study adopts an ethnographic lens to observe how information is created and used during the patient journey, and the social factors affecting adoption of the PCEHR at the micro level, in order to develop a conceptual model that will encourage the sharing of patient information within the cycle of care. Objective: The study aims, firstly, to establish a basic understanding of healthcare professionals' attitudes toward a national platform for sharing patient summary information in the form of a PCEHR and, secondly, to map the flow of patient-related information as it traverses a patient's personal cycle of care. An ethnographic approach was therefore used to bring a "real world" lens to information flow in a series of case studies in the Australian healthcare system and to discover themes and issues that are important from the patient's perspective. Design: Qualitative study utilising ethnographic case studies. Setting: Case studies were conducted with primary and allied healthcare professionals located in Brisbane, Queensland, between October 2013 and July 2014. Results: In the first dimension, healthcare professionals' concerns about trust and medico-legal issues related to patient control and information quality, together with the limited clinical value offered by the PCEHR, emerged as significant barriers to use. The second dimension of the study, which mapped patient information flow, identified information quality issues, clinical workflow inefficiencies, and interoperability misconceptions, resulting in duplication of effort, unnecessary manual processes, data quality and integrity issues, and an over-reliance on the understanding and communication skills of the patient. Conclusion: Opportunities for process efficiencies, improved data quality, and increased patient safety emerge with the adoption of an appropriate information-sharing platform. More importantly, large-scale eHealth initiatives must be aligned with the value proposition of individual stakeholders in order to achieve widespread adoption. Leveraging the Australian national eHealth infrastructure and the PCEHR, we offer a practical example of a service-driven digital ecosystem suitable for co-creating value in healthcare.

Relevance: 60.00%

Abstract:

Microarrays have a wide range of applications in the biomedical field. From the beginning, arrays have mostly been utilized in cancer research, including the classification of tumors into subgroups and the identification of clinical associations. In the microarray format, a collection of small features, such as different oligonucleotides, is attached to a solid support. The advantage of microarray technology is the ability to simultaneously measure changes in the levels of multiple biomolecules. Because many diseases, including cancer, are complex, involving an interplay between various genes and environmental factors, the detection of only a single marker molecule is usually insufficient for determining disease status. Thus, a technique that simultaneously collects information on multiple molecules allows better insights into a complex disease. Since microarrays can be custom-manufactured or obtained from a number of commercial providers, understanding data quality and comparability between different platforms is important for extending the use of the technology to areas beyond basic research. When standardized, integrated array data could ultimately help to offer a complete profile of the disease, illuminating the mechanisms and genes behind disorders as well as facilitating disease diagnostics. In the first part of this work, we aimed to elucidate the comparability of gene expression measurements from different oligonucleotide and cDNA microarray platforms. We compared three gene expression microarrays: a commercial oligonucleotide microarray and two cDNA microarrays, one commercial and one custom-made. The filtered gene expression data from the commercial platforms correlated better across experiments (r=0.78-0.86) than the expression data between the custom-made array and either of the two commercial platforms (r=0.62-0.76). Although the results from the different platforms correlated reasonably well, combining and comparing the measurements was not straightforward. Clone errors on the custom-made array, together with annotation and technical differences between the platforms, introduced variability into the data. In conclusion, the different gene expression microarray platforms provided results sufficiently concordant for the research setting, but the variability represents a challenge for developing diagnostic applications based on microarrays. In the second part of the work, we performed an integrated high-resolution microarray analysis of gene copy number and expression in 38 laryngeal and oral tongue squamous cell carcinoma cell lines and primary tumors. Our aim was to pinpoint genes whose expression was affected by changes in copy number. The data revealed that amplifications in particular had a clear impact on gene expression. Across the genome, 14-32% of genes in highly amplified regions (copy number ratio >2.5) showed associated overexpression. The impact of decreased copy number on gene underexpression was less clear. Using statistical analysis across the samples, we systematically identified hundreds of genes for which increased copy number was associated with increased expression. For example, our data implied that FADD and PPFIA1 are frequently overexpressed at the 11q13 amplicon in head and neck squamous cell carcinoma (HNSCC). The 11q13 amplicon, which includes known oncogenes such as CCND1 and CTTN, is well characterized in different types of cancer, but the roles of FADD and PPFIA1 remain obscure.
Taken together, the integrated microarray analysis revealed a number of known as well as novel target genes in altered regions in HNSCC. The identified genes provide a basis for functional validation and may eventually lead to the identification of novel candidates for targeted therapy in HNSCC.
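
As an illustration of how such copy-number/expression associations can be screened for, the sketch below compares, gene by gene, the expression of samples whose copy number ratio exceeds 2.5 with that of the remaining samples using a one-sided rank test. The thesis does not specify this exact test here; the function and test choice are assumed, representative ones.

```python
import numpy as np
from scipy import stats

def amplification_expression_association(copy_number, expression, amp_ratio=2.5):
    """Screen genes whose amplification is associated with overexpression.

    copy_number, expression : 2-D arrays (genes x samples) with matched rows.
    Returns a list of (gene_index, p_value) for genes amplified
    (ratio > amp_ratio) in at least two samples, using a one-sided
    Mann-Whitney U test (illustrative choice of statistic).
    """
    results = []
    for g in range(copy_number.shape[0]):
        amplified = copy_number[g] > amp_ratio
        if amplified.sum() >= 2 and (~amplified).sum() >= 2:
            _, p = stats.mannwhitneyu(expression[g][amplified],
                                      expression[g][~amplified],
                                      alternative="greater")
            results.append((g, p))
    return results
```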

Relevance: 60.00%

Abstract:

This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, such data are traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, this data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new, large ontology of human cell types, disease states, organism parts, and cell lines. The ontology was used in a new text mining and decision tree based method for the automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors characteristic of each laboratory in this large cross-laboratory integrated dataset, was ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology. A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these datasets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression at a global level.
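
A minimal sketch of the exploratory step, assuming a quality-filtered samples-by-genes matrix, is shown below: project the samples with principal component analysis and cut a hierarchical clustering of the reduced coordinates. The function name, parameter values, and toy data are illustrative, not the thesis pipeline.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

def explore_expression_space(expr, n_components=3, n_clusters=5):
    """Explore the structure of an integrated expression matrix.

    expr : 2-D array (samples x genes), already normalised and
           quality-filtered so that samples are comparable.
    Returns the PCA coordinates, a flat hierarchical clustering of the
    samples in the reduced space, and the explained variance ratios.
    """
    pca = PCA(n_components=n_components)
    coords = pca.fit_transform(expr)               # samples in PC space
    tree = linkage(coords, method="average")       # hierarchical clustering
    labels = fcluster(tree, t=n_clusters, criterion="maxclust")
    return coords, labels, pca.explained_variance_ratio_

# Toy usage with random numbers standing in for real expression values.
rng = np.random.default_rng(1)
coords, labels, var = explore_expression_space(rng.standard_normal((50, 200)))
print(var, labels[:10])
```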

Relevance: 60.00%

Abstract:

A Finite Element Method (FEM) based forward solver is developed for solving the forward problem of 2D electrical impedance tomography (EIT). The method of weighted residuals with a Galerkin approach is used for the FEM formulation of the EIT forward problem. The algorithm is written in MATLAB 7.0 and the forward problem is studied with a practical biological phantom. The EIT governing equation is solved numerically to calculate the surface potentials at the phantom boundary for a uniform conductivity. An EIT phantom is developed with an array of 16 electrodes placed on the inner surface of a phantom tank filled with KCl solution. A sinusoidal current is injected through the current electrodes and the differential potentials across the voltage electrodes are measured. The measured data are compared with the differential potentials calculated for the known current and solution conductivity, and the comparison is used to identify sources of error in order to improve data quality for better image reconstruction.
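
A minimal sketch of such a Galerkin forward solver is shown below, written in Python rather than the MATLAB used in the study and assuming a hypothetical mesh interface: linear (P1) elements on a triangular mesh, uniform conductivity, point current injection at two boundary nodes, and one grounded node to fix the potential.

```python
import numpy as np

def eit_forward_uniform(nodes, triangles, sigma, inj_node, sink_node, current=1.0):
    """Minimal P1 Galerkin FEM solver for -div(sigma * grad u) = 0.

    nodes     : (N, 2) array of node coordinates.
    triangles : (M, 3) integer array of node indices per element.
    sigma     : scalar conductivity (uniform, as in the phantom study).
    Current is injected as point sources at inj_node and sink_node
    (both assumed different from node 0, which is grounded).
    Returns the nodal potentials.
    """
    n = len(nodes)
    K = np.zeros((n, n))
    for tri in triangles:
        x, y = nodes[tri, 0], nodes[tri, 1]
        # Gradients of the barycentric (linear) basis functions.
        b = np.array([y[1] - y[2], y[2] - y[0], y[0] - y[1]])
        c = np.array([x[2] - x[1], x[0] - x[2], x[1] - x[0]])
        area = 0.5 * abs(b[0] * c[1] - b[1] * c[0])
        Ke = sigma * (np.outer(b, b) + np.outer(c, c)) / (4.0 * area)
        K[np.ix_(tri, tri)] += Ke

    f = np.zeros(n)
    f[inj_node], f[sink_node] = current, -current

    # Ground node 0 to remove the constant null space, then solve.
    keep = np.arange(1, n)
    u = np.zeros(n)
    u[keep] = np.linalg.solve(K[np.ix_(keep, keep)], f[keep])
    return u
```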

Relevance: 60.00%

Abstract:

Mesoscale weather phenomena, such as the sea breeze circulation or lake-effect snow bands, are typically too large to be observed at a single point, yet too small to be captured by a traditional network of weather stations. Hence, the weather radar is one of the best tools for observing, analyzing and understanding their behavior and development. A weather radar network is a complex system with many structural and technical features to be tuned, from the location of each radar to the number of pulses averaged in the signal processing. These design parameters have no universal optimal values; their selection depends on the nature of the weather phenomena to be monitored as well as on the applications for which the data will be used. The priorities and critical values are different for forest fire forecasting, aviation weather services or the planning of snow ploughing, to name a few radar-based applications. The main objective of the work performed within this thesis has been to combine knowledge of the technical properties of radar systems with our understanding of weather conditions in order to produce better applications able to efficiently support decision making in weather- and safety-related services for modern society in northern conditions. When a new application is developed, it must be tested against "ground truth". Two new verification approaches for radar-based hail estimates are introduced in this thesis. For mesoscale applications, finding a representative reference can be challenging, since these phenomena are by definition difficult to catch with surface observations. Hence, almost any valuable information that can be distilled from unconventional data sources, such as newspapers and holiday snapshots, is welcome. However, just as important as obtaining the data is obtaining estimates of data quality and judging to what extent two disparate information sources can be compared. The new applications presented here do not rely on radar data alone, but ingest information from auxiliary sources such as temperature fields. The author concludes that the radar will continue to be a key source of data and information in the future, especially when used effectively together with other meteorological data.

Relevance: 60.00%

Abstract:

Practical phantoms are essential for assessing electrical impedance tomography (EIT) systems for validation, calibration and comparison purposes. Metal surface electrodes are generally used in practical phantoms, and errors in their design and development reduce the SNR of the boundary data. Novel flexible and biocompatible gold electrode arrays of high geometric precision are proposed to improve the boundary data quality in EIT. The flexible gold electrode arrays are developed on flexible FR4 sheets using thin-film technology, and practical gold electrode phantoms are developed with different configurations. A constant current is injected at the phantom boundary, the surface potentials are measured by a LabVIEW based data acquisition system, and the resistivity images are reconstructed in EIDORS. The boundary data profiles and the resistivity images obtained from the gold electrode phantoms are compared with identical phantoms developed with stainless steel electrodes. Surface profilometry, microscopy and impedance spectroscopy show that the gold electrode arrays are smooth, geometrically precise and less resistive. Results show that the boundary data accuracy and image quality are improved with gold electrode arrays, and that the diametric resistivity plot (DRP), contrast-to-noise ratio (CNR), percentage of contrast recovery (PCR) and coefficient of contrast (COC) of the reconstructed images are improved in the gold electrode phantoms. (C) 2013 Elsevier Ltd. All rights reserved.
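
For reference, the sketch below computes one common form of contrast-to-noise ratio from a reconstructed resistivity image and two region masks; the paper's exact definitions of CNR, PCR and COC may differ, so this is an illustrative formula only.

```python
import numpy as np

def contrast_to_noise_ratio(image, roi_mask, background_mask):
    """Contrast-to-noise ratio of a reconstructed resistivity image.

    image           : 2-D array of reconstructed resistivity values.
    roi_mask        : boolean mask of the inhomogeneity (target) region.
    background_mask : boolean mask of the background region.

    Defined here as the difference of the region means divided by a
    pooled standard deviation (one common convention, assumed here).
    """
    roi, bg = image[roi_mask], image[background_mask]
    pooled_sd = np.sqrt(0.5 * (roi.var(ddof=1) + bg.var(ddof=1)))
    return (roi.mean() - bg.mean()) / pooled_sd
```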

Relevance: 60.00%

Abstract:

(Document pdf contains 193 pages)
Executive Summary (pdf, < 0.1 Mb)
1. Introduction (pdf, 0.2 Mb)
   1.1 Data sharing, international boundaries and large marine ecosystems
2. Objectives (pdf, 0.3 Mb)
3. Background (pdf, < 0.1 Mb)
   3.1 North Pacific Ecosystem Metadatabase
   3.2 First federation effort: NPEM and the Korea Oceanographic Data Center
   3.3 Continuing effort: Adding Japan's Marine Information Research Center
4. Metadata Standards (pdf, < 0.1 Mb)
   4.1 Directory Interchange Format
   4.2 Ecological Metadata Language
   4.3 Dublin Core
       4.3.1 Elements of DC
   4.4 Federal Geographic Data Committee
   4.5 The ISO 19115 Metadata Standard
   4.6 Metadata stylesheets
   4.7 Crosswalks
   4.8 Tools for creating metadata
5. Communication Protocols (pdf, < 0.1 Mb)
   5.1 Z39.50
       5.1.1 What does Z39.50 do?
       5.1.2 Isite
6. Clearinghouses (pdf, < 0.1 Mb)
7. Methodology (pdf, 0.2 Mb)
   7.1 FGDC metadata
       7.1.1 Main sections
       7.1.2 Supporting sections
       7.1.3 Metadata validation
   7.2 Getting a copy of Isite
   7.3 NSDI Clearinghouse
8. Server Configuration and Technical Issues (pdf, 0.4 Mb)
   8.1 Hardware recommendations
   8.2 Operating system – Red Hat Linux Fedora
   8.3 Web services – Apache HTTP Server version 2.2.3
   8.4 Create and validate FGDC-compliant metadata in XML format
   8.5 Obtaining, installing and configuring Isite for UNIX/Linux
       8.5.1 Download the appropriate Isite software
       8.5.2 Untar the file
       8.5.3 Name your database
       8.5.4 The zserver.ini file
       8.5.5 The sapi.ini file
       8.5.6 Indexing metadata
       8.5.7 Start the Clearinghouse Server process
       8.5.8 Testing the zserver installation
   8.6 Registering with NSDI Clearinghouse
   8.7 Security issues
9. Search Tutorial and Examples (pdf, 1 Mb)
   9.1 Legacy NSDI Clearinghouse search interface
   9.2 New GeoNetwork search interface
10. Challenges (pdf, < 0.1 Mb)
11. Emerging Standards (pdf, < 0.1 Mb)
12. Future Activity (pdf, < 0.1 Mb)
13. Acknowledgments (pdf, < 0.1 Mb)
14. References (pdf, < 0.1 Mb)
15. Acronyms (pdf, < 0.1 Mb)
16. Appendices
   16.1 KODC-NPEM meeting agendas and minutes (pdf, < 0.1 Mb)
       16.1.1 Seattle meeting agenda, August 22–23, 2005
       16.1.2 Seattle meeting minutes, August 22–23, 2005
       16.1.3 Busan meeting agenda, October 10–11, 2005
       16.1.4 Busan meeting minutes, October 10–11, 2005
   16.2 MIRC-NPEM meeting agendas and minutes (pdf, < 0.1 Mb)
       16.2.1 Seattle meeting agenda, August 14–15, 2006
       16.2.2 Seattle meeting minutes, August 14–15, 2006
       16.2.3 Tokyo meeting agenda, October 19–20, 2006
       16.2.4 Tokyo meeting minutes, October 19–20, 2006
   16.3 XML stylesheet conversion crosswalks (pdf, < 0.1 Mb)
       16.3.1 FGDCI to DIF stylesheet converter
       16.3.2 DIF to FGDCI stylesheet converter
       16.3.3 String-modified stylesheet
   16.4 FGDC Metadata Standard (pdf, 0.1 Mb)
       16.4.1 Overall structure
       16.4.2 Section 1: Identification information
       16.4.3 Section 2: Data quality information
       16.4.4 Section 3: Spatial data organization information
       16.4.5 Section 4: Spatial reference information
       16.4.6 Section 5: Entity and attribute information
       16.4.7 Section 6: Distribution information
       16.4.8 Section 7: Metadata reference information
       16.4.9 Sections 8, 9 and 10: Citation information, time period information, and contact information
   16.5 Images of the Isite server directory structure and the files contained in each subdirectory after Isite installation (pdf, 0.2 Mb)
   16.6 Listing of NPEM's Isite configuration files (pdf, < 0.1 Mb)
       16.6.1 zserver.ini
       16.6.2 sapi.ini
   16.7 Java program to extract records from the NPEM metadatabase and write one XML file for each record (pdf, < 0.1 Mb)
   16.8 Java program to execute the metadata extraction program (pdf, < 0.1 Mb)
A1 Addendum 1: Instructions for Isite for Windows (pdf, 0.6 Mb)
A2 Addendum 2: Instructions for Isite for Windows ADHOST (pdf, 0.3 Mb)

Relevance: 60.00%

Abstract:

This document describes the analytical methods used to quantify core organic chemicals in tissue and sediment collected as part of NOAA's National Status and Trends Program (NS&T) for the years 2000-2006. Organic contaminant analytical methods used during the early years of the program are described in NOAA Technical Memoranda NOS ORCA 71 and 130 (Lauenstein and Cantillo, 1993; Lauenstein and Cantillo, 1998) for the years 1984-1992 and 1993-1996, respectively. These reports are available from our website (http://www.ccma.nos.gov). The methods detailed in this document were utilized by the Mussel Watch Project and the Bioeffects Project, both part of the NS&T Program. The Mussel Watch Project has been monitoring contaminants in bivalves and sediments since 1986 and is the longest active national contaminant monitoring program operating in U.S. coastal waters. Approximately 280 Mussel Watch sites are sampled on biennial and decadal timescales for bivalve tissue and sediment, respectively. Similarly, the Bioeffects Assessment Project began in 1986 to characterize estuaries and near-coastal environs. Using the sediment quality triad approach, which measures (1) levels of contaminants in sediments, (2) incidence and severity of toxicity, and (3) benthic macrofaunal communities, the Bioeffects Project describes the spatial extent of sediment toxicity. Contaminant assessment is a core function of both projects. These methods, while discussed here in the context of sediment and bivalve tissue, were also used with other matrices, including fish fillet, fish liver, nepheloid layer, and suspended particulate matter. The methods described herein cover the core organic contaminants monitored in the NS&T Program, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), butyltins, and organochlorines, which have been analyzed consistently over the past 15-20 years. Organic contaminants such as dioxins, perfluoro compounds and polybrominated diphenyl ethers (PBDEs) were analyzed periodically in special studies of the NS&T Program and will be described in another document. All of the analytical techniques described in this document were used by B&B Laboratories, Inc., an affiliate of TDI-Brook International, Inc., in College Station, Texas, under contract to NOAA. The NS&T Program uses a performance-based system approach to obtain the best possible data quality and comparability, and requires laboratories to demonstrate precision, accuracy, and sensitivity to ensure results-based performance goals and measures. (PDF contains 75 pages)

Relevance: 60.00%

Abstract:

EXECUTIVE SUMMARY: The Coastal Change Analysis Program (C-CAP) is developing a nationally standardized database on land-cover and habitat change in the coastal regions of the United States. C-CAP is part of the Estuarine Habitat Program (EHP) of NOAA's Coastal Ocean Program (COP). C-CAP inventories coastal submersed habitats, wetland habitats, and adjacent uplands and monitors changes in these habitats on a one- to five-year cycle. This type of information and frequency of detection are required to improve scientific understanding of the linkages of coastal and submersed wetland habitats with adjacent uplands and with the distribution, abundance, and health of living marine resources. The monitoring cycle will vary according to the rate and magnitude of change in each geographic region. Satellite imagery (primarily Landsat Thematic Mapper), aerial photography, and field data are interpreted, classified, analyzed, and integrated with other digital data in a geographic information system (GIS). The resulting land-cover change databases are disseminated in digital form for use by anyone wishing to conduct geographic analysis in the completed regions. C-CAP spatial information on coastal change will be input to EHP conceptual and predictive models to support coastal resource policy planning and analysis. C-CAP products will include 1) spatially registered digital databases and images, 2) tabular summaries by state, county, and hydrologic unit, and 3) documentation. Aggregations to larger areas (representing habitats, wildlife refuges, or management districts) will be provided on a case-by-case basis. Ongoing C-CAP research will continue to explore techniques for remote determination of biomass, productivity, and functional status of wetlands and will evaluate new technologies (e.g., remote sensor systems, global positioning systems, image processing algorithms) as they become available. Selected hardcopy land-cover change maps will be produced at local (1:24,000) to regional (1:500,000) scales for distribution. Digital land-cover change data will be provided to users for the cost of reproduction. Much of the guidance contained in this document was developed through a series of professional workshops and interagency meetings that focused on a) coastal wetlands and uplands; b) coastal submersed habitat, including aquatic beds; c) user needs; d) regional issues; e) classification schemes; f) change detection techniques; and g) data quality. Invited participants included technical and regional experts and representatives of key State and Federal organizations. Coastal habitat managers and researchers were given an opportunity for review and comment. This document summarizes C-CAP protocols and procedures that are to be used by scientists throughout the United States to develop consistent and reliable coastal change information for input to the C-CAP nationwide database. It also provides useful guidelines for contributors working on related projects. It is considered a working document subject to periodic review and revision. (PDF file contains 104 pages.)

Relevance: 60.00%

Abstract:

The use of self-contained, low-maintenance sensor systems installed on commercial vessels is becoming an important monitoring and scientific tool in many regions around the world. These systems integrate data from meteorological and water quality sensors with GPS data into a data stream that is automatically transferred from ship to shore. To begin linking some of this developing expertise, the Alliance for Coastal Technologies (ACT) and the European Coastal and Ocean Observing Technology (ECOOT) organized a workshop on this topic in Southampton, United Kingdom, October 10-12, 2006. The participants included technology users, technology developers, and shipping representatives. They collaborated to identify the sensors currently employed on integrated systems, the users of these data, the limitations associated with these systems, and ways to overcome those limitations. The group also identified additional technologies that could be employed on future systems and examined whether standard architectures and data protocols for integrated systems should be established. Participants at the workshop defined 17 different parameters currently being measured by integrated systems. They noted that diverse user groups, ranging from resource management agencies such as the Environmental Protection Agency (EPA) to local tourism groups and educational organizations, utilize information from these systems. Among the limitations identified were instrument compatibility and interoperability, data quality control and quality assurance, and sensor calibration and/or maintenance frequency. Standardization of these integrated systems was viewed as both advantageous and disadvantageous: while participants believed that standardization could be beneficial on many levels, they also felt that users may be hesitant to purchase a suite of instruments from a single manufacturer, and that a "plug and play" system including sensors from multiple manufacturers may be difficult to achieve. A priority recommendation and conclusion for the general integrated sensor system community was to provide vessel operators with real-time access to relevant data (e.g., ambient temperature and salinity to increase the efficiency of water treatment systems, and meteorological data for increased vessel safety and operating efficiency) for broader system value. Simplified data displays are also required for education and public outreach/awareness. Other key recommendations were to encourage the use of integrated sensor packages within observing systems such as IOOS and EuroGOOS, to identify additional customers of sensor system data, and to publish the results of previous work in peer-reviewed journals to increase agency and scientific awareness of, and confidence in, the technology. Priority recommendations and conclusions for ACT entailed highlighting the value of integrated sensor systems for vessels of opportunity through articles in the popular press and the marine science literature. [PDF contains 28 pages]

Relevance: 60.00%

Abstract:

Daily sea surface temperatures have been acquired at the Hopkins Marine Station in Pacific Grove, California, since January 20, 1919. This time series is one of the longest oceanographic records along the U.S. west coast. Because of its length, it is well suited for studying climate-related and oceanic variability on interannual, decadal, and interdecadal time scales. The record, however, is not homogeneous, has numerous gaps, contains possible outliers, and the observations were not always collected at the same time each day. Because of these problems we have undertaken the task of reconstructing this long and unique series. We describe the steps that were taken and the methods that were used in this reconstruction. Although the methods employed are basic, we believe that they are consistent with the quality of the data. The reconstructed record has values, original or estimated, at every time point and has been adjusted for time-of-day variations where this information was available. Possible outliers have also been examined and replaced where their credibility could not be established. Many of the studies that have employed the Hopkins time series have not discussed the issue of data quality or how these problems were addressed. Because of growing interest in this record, it is important that a single, well-documented version be adopted so that the results of future analyses can be directly compared. Although additional work may be done to further improve the quality of this record, it is now available via the internet. [PDF contains 48 pages]
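
The sketch below illustrates, with assumed rules rather than the report's exact procedure, two of the reconstruction steps described above: screening possible outliers against a day-of-year climatology and filling short gaps by interpolation on a daily grid.

```python
import pandas as pd

def reconstruct_daily_series(sst, max_gap_days=5, outlier_z=4.0):
    """Illustrative reconstruction of a daily SST record.

    sst : pandas Series with a daily DatetimeIndex (gaps allowed).
    Values far from the day-of-year climatology are flagged as outliers
    and removed; short gaps are then filled by linear interpolation.
    The thresholds are assumptions, not the report's values.
    """
    s = sst.asfreq("D")                       # insert missing dates as NaN
    doy = s.index.dayofyear
    clim_mean = s.groupby(doy).transform("mean")
    clim_std = s.groupby(doy).transform("std")
    z = (s - clim_mean) / clim_std
    s = s.mask(z.abs() > outlier_z)           # drop implausible values
    return s.interpolate(limit=max_gap_days)  # fill short gaps only
```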

Relevance: 60.00%

Abstract:

Seismic reflection methods have been extensively used to probe the Earth's crust and suggest the nature of its formative processes. The analysis of multi-offset seismic reflection data extends the technique from a reconnaissance method to a powerful scientific tool that can be applied to test specific hypotheses. The treatment of reflections at multiple offsets becomes tractable if the assumptions of high-frequency rays are valid for the problem being considered. Their validity can be tested by applying the methods of analysis to full wave synthetics.

Three studies illustrate the application of these principles to investigations of the nature of the crust in southern California. A survey shot by the COCORP consortium in 1977 across the San Andreas fault near Parkfield revealed events in the record sections whose arrival time decreased with offset. The reflectors generating these events are imaged using a multi-offset three-dimensional Kirchhoff migration. Migrations of full-wave acoustic synthetics having the same limitations in geometric coverage as the field survey demonstrate the utility of this back-projection process for imaging. The migrated depth sections show the locations of the major physical boundaries of the San Andreas fault zone. The zone is bounded on the southwest by a near-vertical fault juxtaposing a Tertiary sedimentary section against uplifted crystalline rocks of the fault zone block. On the northeast, the fault zone is bounded by a fault dipping into the San Andreas, which includes slices of serpentinized ultramafics, intersecting it at 3 km depth. These interpretations can be made despite complications introduced by lateral heterogeneities.
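
To illustrate the back-projection idea behind Kirchhoff migration, the sketch below performs a toy zero-offset, constant-velocity, two-dimensional diffraction stack: each image point accumulates the recorded amplitudes along its diffraction hyperbola. The study itself used a multi-offset three-dimensional implementation, so this is a simplified stand-in with hypothetical parameter names.

```python
import numpy as np

def kirchhoff_migrate_2d(data, dt, dx, velocity, depths):
    """Toy zero-offset 2-D Kirchhoff (diffraction-stack) migration.

    data     : 2-D array (n_traces, n_samples) of zero-offset traces.
    dt, dx   : time sample interval (s) and trace spacing (m).
    velocity : constant migration velocity (m/s).
    depths   : 1-D array of image depths (m).
    Returns an image of shape (n_traces, len(depths)).
    """
    n_traces, n_samples = data.shape
    xs = np.arange(n_traces) * dx
    image = np.zeros((n_traces, len(depths)))
    for ix, x0 in enumerate(xs):                 # image position
        for iz, z in enumerate(depths):          # image depth
            # Two-way traveltime along the diffraction hyperbola.
            r = np.sqrt((xs - x0) ** 2 + z ** 2)
            t = 2.0 * r / velocity
            it = np.rint(t / dt).astype(int)
            valid = it < n_samples
            image[ix, iz] = data[np.arange(n_traces)[valid], it[valid]].sum()
    return image
```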

In 1985 the Calcrust consortium designed a survey in the eastern Mojave Desert to image structures in both the shallow and the deep crust. Preliminary field experiments showed that the major geophysical acquisition problem to be solved was the poor penetration of seismic energy through a low-velocity surface layer. Its effects could be mitigated through special acquisition and processing techniques. Records obtained from industry showed that quality data could be acquired from areas having a deeper, older sedimentary cover, prompting a re-definition of the geologic objectives. Long-offset stationary arrays were designed to provide reversed, wider-angle coverage of the deep crust over parts of the survey. The preliminary field tests, constant monitoring of data quality, and parameter adjustment allowed 108 km of excellent crustal data to be obtained.

This dataset, along with two others from the central and western Mojave, was used to constrain rock properties and the physical condition of the crust. The multi-offset analysis proceeded in two steps. First, an increase in reflection peak frequency with offset is indicative of a thinly layered reflector; the thickness and velocity contrast of the layering can be calculated from the spectral dispersion to discriminate between structures resulting from broad-scale or local effects. Second, the amplitude effects at different offsets of P-P scattering from weak elastic heterogeneities indicate whether the signs of the changes in density, rigidity, and Lamé's parameter at the reflector agree or are opposed. The effects of reflection generation and propagation in a heterogeneous, anisotropic crust were contained by the design of the experiment and the simplicity of the observed amplitude and frequency trends. Multi-offset spectra and amplitude-trend stacks of the three Mojave Desert datasets suggest that the most reflective structures in the middle crust are strong Poisson's ratio (σ) contrasts, indicating porous zones or the juxtaposition of units of mutually distant origin. Heterogeneities in σ increase towards the top of a basal crustal zone at ~22 km depth. The transitions to the basal zone and to the mantle include increases in σ. The Moho itself includes ~400 m of layering having a velocity higher than that of the uppermost mantle. The Moho maintains the same configuration across the Mojave despite 5 km of crustal thinning near the Colorado River. This indicates that Miocene extension there either thinned just the basal zone, or that the basal zone developed regionally after the extensional event.