921 results for seismic data processing
Abstract:
This paper reports on a new satellite sensor, the Geostationary Earth Radiation Budget (GERB) experiment. GERB is designed to make the first measurements of the Earth's radiation budget from geostationary orbit. Measurements of the sunlight reflected by the Earth and of the thermal radiation emitted by the Earth are made at high absolute accuracy every 15 min, with a spatial resolution at the subsatellite point of 44.6 km (north–south) by 39.3 km (east–west). With knowledge of the incoming solar constant, this gives the primary forcing and response components of the top-of-atmosphere radiation. The first GERB instrument is an instrument of opportunity on Meteosat-8, a new spin-stabilized spacecraft platform also carrying the Spinning Enhanced Visible and Infrared Imager (SEVIRI), which is currently positioned over the equator at 3.5°W. This overview of the project includes a description of the instrument design and its preflight and in-flight calibration. An evaluation of the instrument performance after its first year in orbit, including comparisons with data from the Clouds and the Earth's Radiant Energy System (CERES) satellite sensors and with output from numerical models, is also presented. After a brief summary of the data processing system and data products, some of the scientific studies that are being undertaken using these early data are described. This marks the beginning of a decade or more of observations from GERB, as subsequent models will fly on each of the four Meteosat Second Generation satellites.
Abstract:
In the summer of 1982, the ICLCUA CAFS Special Interest Group defined three subject areas for working party activity. These were: 1) interfaces with compilers and databases, 2) end-user language facilities and display methods, and 3) text-handling and office automation. The CAFS SIG convened one working party to address the first subject with the following terms of reference: 1) review facilities and map requirements onto them, 2) "Database or CAFS" or "Database on CAFS", 3) training needs for users to bridge to new techniques, and 4) prepare specifications to cover gaps in software. The working party interpreted the topic broadly as the data processing professional's, rather than the end-user's, view of and relationship with CAFS. This report is the result of the working party's activities. For good reasons, the report's content exceeds the terms of reference in their strictest sense. For example, we examine QUERYMASTER, which ICL deems an end-user tool, from both the DP and end-user perspectives. Firstly, it is the only interface to CAFS in the current SV201. Secondly, the DP department needs to understand the end-user's interface to CAFS. Thirdly, the other subjects have not yet been addressed by other active working parties.
Abstract:
The principles of operation of an experimental prototype instrument known as J-SCAN are described, along with the derivation of formulae for the rapid calculation of normalized impedances; the structure of the instrument; relevant probe design parameters; digital quantization errors; and approaches to the optimization of single-frequency operation. An eddy current probe is used as the inductance element of a passive tuned circuit which is repeatedly excited with short impulses. Each impulse excites an oscillation which is subject to decay dependent upon the values of the tuned-circuit components: resistance, inductance and capacitance. Changing conditions under the probe that affect the resistance and inductance of this circuit will thus be detected through changes in the transient response. These changes in transient response, namely oscillation frequency and rate of decay, are digitized, and normalized values for probe resistance and inductance changes are then calculated immediately in a microprocessor. This approach, coupled with a minimum of analogue processing and a maximum of digital processing, has advantages over conventional eddy current instruments: in particular, the absence of an out-of-balance condition, and the flexibility and stability of digital data processing.
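The abstract does not give the formulae themselves, but the probe resistance and inductance can be recovered from the measured ringing frequency and decay rate of the transient. A minimal sketch, assuming a simple series RLC model for the tuned circuit and a known tuning capacitance C (the model and the example values are illustrative assumptions, not the paper's):

```python
import math

def probe_r_l(f_d_hz, alpha, c_farads):
    """Recover series-RLC resistance and inductance from a measured damped
    ringing frequency f_d (Hz) and exponential decay rate alpha (1/s),
    assuming the tuning capacitance C is known.

    For a series RLC circuit:
        alpha     = R / (2 L)
        omega_0^2 = 1 / (L C)
        omega_d^2 = omega_0^2 - alpha^2
    """
    omega_d = 2.0 * math.pi * f_d_hz
    omega_0_sq = omega_d ** 2 + alpha ** 2      # undamped resonance frequency squared
    inductance = 1.0 / (omega_0_sq * c_farads)  # L = 1 / (omega_0^2 C)
    resistance = 2.0 * alpha * inductance       # R = 2 alpha L
    return resistance, inductance

# Illustrative values: 500 kHz ringing, decay rate 5e4 s^-1, 10 nF capacitor
print(probe_r_l(5e5, 5e4, 10e-9))
```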
Abstract:
Recent developments in the fields of veterinary epidemiology and economics are critically reviewed and assessed. The impacts of recent technological developments in diagnosis, genetic characterisation, data processing and statistical analysis are evaluated. It is concluded that the acquisition and availability of data remain the principal constraint on the application of available techniques in veterinary epidemiology and economics, especially at the population level. As more commercial producers use computerised management systems, the availability of data for analysis within herds is improving. However, consistency of recording and diagnosis remains problematic. The recent trend towards national livestock databases, intended to reassure consumers of the safety and traceability of livestock products, is creating potentially valuable sources of data that could lead to much more effective application of veterinary epidemiology and economics. These opportunities will be greatly enhanced if data from different sources, such as movement recording, official animal health programmes, quality assurance schemes, production recording and breed societies, can be integrated. However, in order to realise such integrated databases, it will be necessary to provide absolute control of user access to guarantee data security and confidentiality. The potential applications of integrated livestock databases in analysis, modelling, decision support, and providing management information for veterinary services and livestock producers are discussed. (c) 2004 Elsevier B.V. All rights reserved.
Abstract:
Hydroponic isotope labelling of entire plants (HILEP) is a cost-effective method enabling metabolic labelling of whole and mature plants with a stable isotope such as N-15. By utilising hydroponic media that contain N-15 inorganic salts as the sole nitrogen source, near to 100% N-15-labelling of proteins can be achieved. In this study, it is shown that HILEP, in combination with mass spectrometry, is suitable for relative protein quantitation of seven-week-old Arabidopsis plants submitted to oxidative stress. Protein extracts from pooled N-14- and N-15-hydroponically grown plants were fractionated by SDS-PAGE, digested and analysed by liquid chromatography electrospray ionisation tandem mass spectrometry (LC-ESI-MS/MS). Proteins were identified and the spectra of N-14/N-15 peptide pairs were extracted using their m/z, chromatographic retention time, isotopic distributions, and the m/z difference between the N-14 and N-15 peptides. Relative amounts were calculated as the ratio of the sums of the peak areas of the two distinct N-14 and N-15 peptide isotope envelopes. Using Mascot and the open-source trans-proteomic pipeline (TPP), the data processing was automated for global proteome quantitation down to the isoform level by extracting isoform-specific peptides. With this combination of metabolic labelling and mass spectrometry it was possible to show differential protein expression in the apoplast of plants submitted to oxidative stress. Moreover, it was possible to discriminate between differentially expressed isoforms belonging to the same protein family, such as isoforms of xylanases and pathogen-related glucanases (PR 2). (C) 2008 Elsevier Ltd. All rights reserved.
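The quantitation step described above reduces to a ratio of summed peak areas. A minimal sketch of that calculation, assuming the isotope-envelope peak areas for the light (N-14) and heavy (N-15) forms of a peptide have already been extracted from the spectra:

```python
def relative_abundance(light_peak_areas, heavy_peak_areas):
    """Relative amount of the N-14 (light) versus N-15 (heavy) form of a
    peptide, computed as the ratio of the summed peak areas of the two
    isotope envelopes, as described in the abstract."""
    light = sum(light_peak_areas)
    heavy = sum(heavy_peak_areas)
    if heavy == 0:
        raise ValueError("heavy envelope has zero total area")
    return light / heavy

# Illustrative envelopes (arbitrary intensity units)
print(relative_abundance([1.2e6, 8.0e5, 3.1e5], [6.0e5, 4.2e5, 1.5e5]))
```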
Abstract:
A method is presented for determining the time to first division of individual bacterial cells growing on agar media. Bacteria were inoculated onto agar-coated slides and viewed by phase-contrast microscopy. Digital images of the growing bacteria were captured at intervals and the time to first division estimated by calculating the "box area ratio". This is the area of the smallest rectangle that can be drawn around an object, divided by the area of the object itself. The box area ratios of cells were found to increase suddenly during growth at a time that correlated with cell division as estimated by visual inspection of the digital images. This was caused by a change in the orientation of the two daughter cells that occurred when sufficient flexibility arose at their point of attachment. This method was used successfully to generate lag time distributions for populations of Escherichia coli, Listeria monocytogenes and Pseudomonas aeruginosa, but did not work with the coccoid organism Staphylococcus aureus. This method provides an objective measure of the time to first cell division, whilst automation of the data processing allows a large number of cells to be examined per experiment. (c) 2005 Elsevier B.V. All rights reserved.
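The "box area ratio" defined above is straightforward to compute from a segmented binary image of a growing cell. A minimal NumPy sketch, using an axis-aligned bounding box as a simplifying assumption (the paper's smallest enclosing rectangle may be rotated):

```python
import numpy as np

def box_area_ratio(mask):
    """Area of the smallest axis-aligned rectangle enclosing the object,
    divided by the area (pixel count) of the object itself.

    `mask` is a 2-D boolean array in which True marks object pixels.
    A rotated minimum-area rectangle could be substituted if the cells'
    orientation matters.
    """
    rows, cols = np.nonzero(mask)
    if rows.size == 0:
        raise ValueError("mask contains no object pixels")
    box_area = (rows.max() - rows.min() + 1) * (cols.max() - cols.min() + 1)
    object_area = rows.size
    return box_area / object_area

# A single rod-like cell gives a ratio near 1; when the two daughter cells
# hinge apart at their point of attachment, the ratio increases suddenly.
```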
Abstract:
External interference can severely degrade the performance of an over-the-horizon radar (OTHR), so suppression of external interference in a strong clutter environment is a prerequisite for target detection. Traditional suppression solutions usually begin with clutter suppression in either the time or the frequency domain, followed by interference detection and suppression. Building on this traditional approach, this paper proposes a method characterized by joint clutter suppression and interference detection: eigenvalues are analyzed in a short-time moving window centered at successive time positions; clutter is suppressed by discarding the three largest eigenvalues at every time position, while interference is detected by analyzing the remaining eigenvalues at each position. Restoration is then achieved by forward-backward linear prediction using interference-free data surrounding the interference position. In the numerical computation, the eigenvalue decomposition (EVD) is replaced by the singular value decomposition (SVD), based on the equivalence of the two. Data processing and experimental results demonstrate the method's effectiveness, with the noise floor reduced by about 10-20 dB.
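A minimal sketch of the joint clutter-suppression and detection step, assuming the data in each short-time window are arranged into a Hankel-style matrix whose three largest singular values are attributed to clutter; the window length, matrix shape and detection threshold below are illustrative, not the paper's values:

```python
import numpy as np

def window_residual_energy(x, win_len=128, n_clutter=3):
    """Slide a short-time window over the signal x; in each window, form a
    Hankel-style matrix, discard the n_clutter largest singular values
    (attributed to clutter), and return the residual energy per window.
    Windows whose residual energy stands out are flagged as containing
    external interference."""
    half = win_len // 2
    energies = []
    for centre in range(half, len(x) - half):
        seg = x[centre - half:centre + half]
        n_rows = win_len // 4
        # Hankel-style rearrangement of the window into a matrix
        hank = np.array([seg[i:i + win_len - n_rows + 1] for i in range(n_rows)])
        s = np.linalg.svd(hank, compute_uv=False)
        energies.append(np.sum(s[n_clutter:] ** 2))  # energy after clutter removal
    energies = np.array(energies)
    # Simple illustrative detector: flag windows well above the median level
    flags = energies > 5.0 * np.median(energies)
    return energies, flags
```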
Abstract:
Mounted on the sides of two widely separated spacecraft, the two Heliospheric Imager (HI) instruments onboard NASA’s STEREO mission view, for the first time, the space between the Sun and Earth. These instruments are wide-angle visible-light imagers that incorporate sufficient baffling to eliminate scattered light to the extent that the passage of solar coronal mass ejections (CMEs) through the heliosphere can be detected. Each HI instrument comprises two cameras, HI-1 and HI-2, which have 20° and 70° fields of view and are off-pointed from the Sun direction by 14.0° and 53.7°, respectively, with their optical axes aligned in the ecliptic plane. This arrangement provides coverage over solar elongation angles from 4.0° to 88.7° at the viewpoints of the two spacecraft, thereby allowing the observation of Earth-directed CMEs along the Sun – Earth line to the vicinity of the Earth and beyond. Given the two separated platforms, this also presents the first opportunity to view the structure and evolution of CMEs in three dimensions. The STEREO spacecraft were launched from Cape Canaveral Air Force Base in late October 2006, and the HI instruments have been performing scientific observations since early 2007. The design, development, manufacture, and calibration of these unique instruments are reviewed in this paper. Mission operations, including the initial commissioning phase and the science operations phase, are described. Data processing and analysis procedures are briefly discussed, and ground-test results and in-orbit observations are used to demonstrate that the performance of the instruments meets the original scientific requirements.
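As a check, the quoted elongation coverage follows directly from the off-pointing angles and the half-widths of the two fields of view: 14.0° − 20°/2 = 4.0° at the inner edge of HI-1, and 53.7° + 70°/2 = 88.7° at the outer edge of HI-2.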
Abstract:
Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research, and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as (15)N labels the sample's entire content of that element, i.e. for (15)N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
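Unlike SILAC's fixed mass offsets, the (15)N mass shift of a peptide depends on how many nitrogen atoms it contains. A minimal sketch of that calculation, assuming complete (15)N incorporation (the nitrogen counts per residue are standard; the mass difference used is 15N − 14N ≈ 0.99703 Da):

```python
# Nitrogen atoms per amino-acid residue (one-letter codes)
NITROGENS = {
    'G': 1, 'A': 1, 'S': 1, 'P': 1, 'V': 1, 'T': 1, 'C': 1, 'L': 1, 'I': 1,
    'M': 1, 'F': 1, 'Y': 1, 'D': 1, 'E': 1,
    'N': 2, 'Q': 2, 'K': 2, 'W': 2,
    'H': 3, 'R': 4,
}
DELTA_15N_14N = 0.997035  # Da, mass difference between 15N and 14N

def n15_mass_shift(peptide):
    """Mass shift of a fully 15N-labelled peptide relative to its 14N form.
    Each residue contributes its backbone nitrogen; N, Q, K, W, H and R
    carry additional side-chain nitrogens."""
    n_atoms = sum(NITROGENS[aa] for aa in peptide.upper())
    return n_atoms * DELTA_15N_14N

# Example: the shift differs between peptides of equal length
print(n15_mass_shift("AGILE"))   # 5 nitrogen atoms
print(n15_mass_shift("HARKN"))   # 3 + 1 + 4 + 2 + 2 = 12 nitrogen atoms
```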
Abstract:
The problems associated with multiple versions of information are well documented in both academic research and industry best practice. Many proposed solutions aim at a single version of the truth, with Business Intelligence (BI) being adopted by many organizations. BI, however, is largely based on the collection of data and the processing and presentation of information to meet different stakeholders' requirements. This paper reviews Enterprise Intelligence, which promises to support decision-making based on a defined strategic understanding of the organization's goals and a unified version of the truth.
Abstract:
Results from an idealized three-dimensional baroclinic life-cycle model are interpreted in a potential vorticity (PV) framework to identify the physical mechanisms by which frictional processes acting in the atmospheric boundary layer modify and reduce the baroclinic development of a midlatitude storm. Considering a life cycle where the only non-conservative process acting is boundary-layer friction, the rate of change of depth-averaged PV within the boundary layer is governed by frictional generation of PV and the flux of PV into the free troposphere. Frictional generation of PV has two contributions: Ekman generation, which is directly analogous to the well-known Ekman-pumping mechanism for barotropic vortices, and baroclinic generation, which depends on the turning of the wind in the boundary layer and low-level horizontal temperature gradients. It is usually assumed, at least implicitly, that an Ekman process of negative PV generation is the mechanism whereby friction reduces the strength and growth rates of baroclinic systems. Although there is evidence for this mechanism, it is shown that baroclinic generation of PV dominates, producing positive PV anomalies downstream of the low centre, close to developing warm and cold fronts. These PV anomalies are advected by the large-scale warm conveyor belt flow upwards and polewards, fluxed into the troposphere near the warm front, and then advected westwards relative to the system. The result is a thin band of positive PV in the lower troposphere above the surface low centre. This PV is shown to be associated with a positive static stability anomaly, which Rossby edge wave theory suggests reduces the strength of the coupling between the upper- and lower-level PV anomalies, thereby reducing the rate of baroclinic development. This mechanism, which is a result of the baroclinic dynamics in the frontal regions, is in marked contrast with simple barotropic spin-down ideas. Finally we note the implications of these frictionally generated PV anomalies for cyclone forecasting.
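The decomposition described above can be made concrete with the standard non-advective PV flux associated with friction; this is a sketch of the usual Haynes-McIntyre-type expression, not necessarily the paper's exact notation. For a boundary-layer frictional force per unit mass $\boldsymbol{F}$, splitting $\nabla\theta$ into vertical and horizontal parts separates the two generation terms:

\[
\boldsymbol{J}_F \;=\; -\,\boldsymbol{F}\times\nabla\theta
\;=\; \underbrace{-\,\frac{\partial\theta}{\partial z}\,\boldsymbol{F}\times\hat{\boldsymbol{z}}}_{\text{Ekman generation}}
\;\;\underbrace{-\;\boldsymbol{F}\times\nabla_{h}\theta}_{\text{baroclinic generation}} .
\]

The first term depends on the static stability and the surface stress, while the second depends on the turning of the wind in the boundary layer and the low-level horizontal temperature gradient, consistent with the two contributions described above.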
Abstract:
This research paper reports the findings from an international survey of fieldwork practitioners on their use of technology to enhance fieldwork teaching and learning. It was found that information technology usage was high before and after time in the field, but some practitioners were also using portable devices such as smartphones and global positioning system (GPS) receivers whilst out in the field. The main pedagogic reasons cited for the use of technology were the need for efficient data processing and the desire to develop students' technological skills. The influencing factors and barriers to the use of technology, as well as the importance of emerging technologies, are discussed.
Abstract:
This paper describes the techniques used to obtain sea surface temperature (SST) retrievals from the Geostationary Operational Environmental Satellite 12 (GOES-12) at the National Oceanic and Atmospheric Administration's Office of Satellite Data Processing and Distribution. Previous SST retrieval techniques relying on channels at 11 and 12 μm are not applicable because GOES-12 lacks the latter channel. Cloud detection is performed using a Bayesian method exploiting fast forward modeling of prior clear-sky radiances using numerical weather predictions. The basic retrieval algorithm used at nighttime is based on a linear combination of brightness temperatures at 3.9 and 11 μm. In comparison with traditional split-window SSTs (using 11- and 12-μm channels), simulations show that this combination has maximum scatter when observing drier, colder scenes, with a comparable overall performance. For daytime retrieval, the same algorithm is applied after estimating and removing the contribution to the brightness temperature in the 3.9-μm channel from solar irradiance. The correction is based on radiative transfer simulations and comprises a parameterization for atmospheric scattering and a calculation of ocean surface reflected radiance. Potential use of the 13-μm channel for SST is shown in a simulation study: in conjunction with the 3.9-μm channel, it can reduce the retrieval error by 30%. Some validation results are shown here, while a companion paper by Maturi et al. presents a detailed analysis of the validation results for the operational algorithms described in the present article.
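The nighttime retrieval described above is a linear combination of the 3.9- and 11-μm brightness temperatures. A minimal sketch of such an estimator; the coefficients below are illustrative placeholders, not the operational values, which would be derived by regression against reference SSTs:

```python
def sst_night(bt39_k, bt11_k, coeffs=(2.0, 1.2, -0.2)):
    """Nighttime SST (K) from 3.9- and 11-um brightness temperatures (K)
    via a linear combination, in the spirit of the algorithm described in
    the abstract:

        SST = a0 + a1 * T_3.9 + a2 * T_11

    `coeffs` = (a0, a1, a2) are illustrative placeholders only.
    """
    a0, a1, a2 = coeffs
    return a0 + a1 * bt39_k + a2 * bt11_k

# Illustrative call with plausible clear-sky brightness temperatures
print(sst_night(bt39_k=288.0, bt11_k=290.0))  # ~289.6 K
```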
Abstract:
Communication signal processing applications often involve complex-valued (CV) functional representations for signals and systems. CV artificial neural networks have been studied theoretically and applied widely in nonlinear signal and data processing [1–11]. Note that most artificial neural networks cannot be automatically extended from the real-valued (RV) domain to the CV domain, because the resulting model would in general violate the Cauchy-Riemann conditions, which renders the training algorithms unusable. A number of analytic functions were introduced for fully CV multilayer perceptrons (MLPs) [4]. A fully CV radial basis function (RBF) network was introduced in [8] for regression and classification applications. Alternatively, the problem can be avoided by using two RV artificial neural networks, one processing the real part and the other processing the imaginary part of the CV signal/system. An even more challenging problem is the inverse of a CV
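The split-complex workaround mentioned above (two real-valued networks, one for the real part and one for the imaginary part of the output) can be sketched as follows; the tiny untrained one-hidden-layer networks and their sizes are illustrative assumptions, not the models discussed in the cited works:

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyRealMLP:
    """Minimal one-hidden-layer real-valued network (random weights,
    no training) used only to illustrate the split-complex idea."""
    def __init__(self, n_in, n_hidden=16, n_out=1):
        self.w1 = rng.standard_normal((n_in, n_hidden)) * 0.1
        self.w2 = rng.standard_normal((n_hidden, n_out)) * 0.1

    def __call__(self, x):
        return np.tanh(x @ self.w1) @ self.w2

class SplitComplexModel:
    """Process a complex-valued signal with two real-valued networks:
    one maps the stacked (Re, Im) input to the real part of the output,
    the other to the imaginary part."""
    def __init__(self, n_in):
        self.net_re = TinyRealMLP(2 * n_in)
        self.net_im = TinyRealMLP(2 * n_in)

    def __call__(self, z):
        x = np.concatenate([z.real, z.imag], axis=-1)  # real-valued features
        return self.net_re(x) + 1j * self.net_im(x)

# Illustrative use on a batch of complex feature vectors
z = rng.standard_normal((4, 8)) + 1j * rng.standard_normal((4, 8))
model = SplitComplexModel(n_in=8)
print(model(z).shape)  # (4, 1) complex-valued output
```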
Abstract:
Human ICT implants, such as RFID implants, cochlear implants, cardiac pacemakers, Deep Brain Stimulation, bionic limbs connected to the nervous system, and networked cognitive prostheses, are becoming increasingly complex. With ever-growing data processing functionalities in these implants, privacy and security become vital concerns. Electronic attacks on human ICT implants can cause significant harm, both to implant subjects and to their environment. This paper explores the vulnerabilities which human implants pose to crime victimisation in light of recent technological developments, and analyses how the law can deal with the emerging challenges of what may well become the next generation of cybercrime: attacks targeted at technology implanted in the human body. After a state-of-the-art description of relevant types of human implants and a discussion of how these implants challenge existing perceptions of the human body, we describe how various modes of attack, such as sniffing, hacking, data interference, and denial of service, can be committed against implants. Subsequently, we analyse how these attacks can be assessed under current substantive and procedural criminal law, drawing on examples from UK and Dutch law. The possibilities and limitations of cybercrime provisions (e.g., unlawful access, system interference) and bodily integrity provisions (e.g., battery, assault, causing bodily harm) in dealing with human-implant attacks are analysed. Based on this assessment, the paper concludes that attacks on human implants are not only a new generation in the evolution of cybercrime, but also raise fundamental questions about how criminal law conceives of attacks. Traditional distinctions between physical and non-physical modes of attack, between human bodies and things, and between the exterior and interior of the body need to be re-interpreted in light of developments in human implants. As the human body and technology become increasingly intertwined, cybercrime legislation and body-integrity crime legislation will also become intertwined, posing a new puzzle that legislators and practitioners will sooner or later have to solve.