66 results for Random data processing
Abstract:
This paper reports on a new satellite sensor, the Geostationary Earth Radiation Budget (GERB) experiment. GERB is designed to make the first measurements of the Earth's radiation budget from geostationary orbit. Measurements at high absolute accuracy of the sunlight reflected by the Earth and the thermal radiation emitted by the Earth are made every 15 min, with a spatial resolution at the subsatellite point of 44.6 km (north–south) by 39.3 km (east–west). With knowledge of the incoming solar constant, this gives the primary forcing and response components of the top-of-atmosphere radiation. The first GERB instrument is an instrument of opportunity on Meteosat-8, a new spin-stabilized spacecraft platform also carrying the Spinning Enhanced Visible and Infrared Imager (SEVIRI) sensor, which is currently positioned over the equator at 3.5°W. This overview of the project includes a description of the instrument design and its preflight and in-flight calibration. An evaluation of the instrument performance after its first year in orbit, including comparisons with data from the Clouds and the Earth's Radiant Energy System (CERES) satellite sensors and with output from numerical models, is also presented. After a brief summary of the data processing system and data products, some of the scientific studies that are being undertaken using these early data are described. This marks the beginning of a decade or more of observations from GERB, as subsequent models will fly on each of the four Meteosat Second Generation satellites.
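As a minimal illustration of the budget arithmetic this abstract describes (not GERB's actual processing chain, which involves calibration and angular modelling), the top-of-atmosphere net radiation for a scene is the incoming solar flux minus the reflected shortwave and the emitted longwave components. All names below are illustrative, not GERB product fields:

```python
def toa_net_flux(incoming_solar, reflected_sw, emitted_lw):
    """Top-of-atmosphere net radiation in W m^-2.

    incoming_solar : solar flux incident on the scene
    reflected_sw   : shortwave flux reflected back to space
    emitted_lw     : longwave (thermal) flux emitted to space
    """
    return incoming_solar - reflected_sw - emitted_lw

# Approximate globally averaged values, for illustration only:
print(toa_net_flux(342.0, 102.0, 239.0))  # prints 1.0 (a small residual)
```

A positive residual means the scene is gaining energy; GERB's 15-min sampling resolves how these components vary through the diurnal cycle.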
Abstract:
In the summer of 1982, the ICLCUA CAFS Special Interest Group defined three subject areas for working party activity. These were: 1) interfaces with compilers and databases, 2) end-user language facilities and display methods, and 3) text-handling and office automation. The CAFS SIG convened one working party to address the first subject with the following terms of reference: 1) review facilities and map requirements onto them, 2) "Database or CAFS" or "Database on CAFS", 3) training needs for users to bridge to new techniques, and 4) repair specifications to cover gaps in software. The working party interpreted the topic broadly as the data processing professional's, rather than the end-user's, view of and relationship with CAFS. This report is the result of the working party's activities. For good reasons, the report's content exceeds the terms of reference in their strictest sense. For example, we examine QUERYMASTER, which ICL deems an end-user tool, from both the DP and end-user perspectives. First, it is the only interface to CAFS in the current SV201. Second, the DP department needs to understand the end-user's interface to CAFS. Third, the other subjects have not yet been addressed by other active working parties.
Abstract:
The principles of operation of an experimental prototype instrument known as J-SCAN are described, along with the derivation of formulae for the rapid calculation of normalized impedances; the structure of the instrument; relevant probe design parameters; digital quantization errors; and approaches to the optimization of single-frequency operation. An eddy current probe is used as the inductance element of a passive tuned circuit which is repeatedly excited with short impulses. Each impulse excites an oscillation whose decay depends upon the values of the tuned-circuit components: resistance, inductance and capacitance. Changing conditions under the probe that affect the resistance and inductance of this circuit will thus be detected through changes in the transient response. These changes in transient response, namely in oscillation frequency and rate of decay, are digitized, and normalized values for the probe resistance and inductance changes are then calculated immediately in a microprocessor. This approach, coupling a minimum of analogue processing with a maximum of digital processing, has advantages over conventional eddy current instruments: in particular, the absence of an out-of-balance condition, and the flexibility and stability of digital data processing.
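The link between the measured transient and the circuit values can be sketched for an idealized series RLC circuit (a simplification; the paper's actual normalization formulae are not reproduced here). The oscillation envelope decays as exp(-αt) with α = R/2L, and the damped angular frequency satisfies ω_d² = 1/LC − α², so with the capacitance known, the measured frequency and decay rate yield R and L:

```python
import math

def rl_from_transient(f_d, alpha, C):
    """Recover resistance and inductance of an ideal series RLC circuit
    from its measured impulse response (hypothetical helper, not the
    instrument's own formulae).

    f_d   : damped oscillation frequency in Hz
    alpha : exponential decay rate in 1/s (envelope ~ exp(-alpha * t))
    C     : known tuned-circuit capacitance in farads
    """
    w_d = 2.0 * math.pi * f_d
    # alpha = R / (2L) and w_d^2 = 1/(L*C) - alpha^2, hence:
    L = 1.0 / (C * (w_d**2 + alpha**2))
    R = 2.0 * alpha * L
    return R, L
```

This mirrors the abstract's point: once frequency and decay rate are digitized, recovering R and L is a small fixed computation well suited to a microprocessor.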
Abstract:
Recent developments in the fields of veterinary epidemiology and economics are critically reviewed and assessed. The impacts of recent technological developments in diagnosis, genetic characterisation, data processing and statistical analysis are evaluated. It is concluded that the acquisition and availability of data remain the principal constraint on the application of available techniques in veterinary epidemiology and economics, especially at the population level. As more commercial producers use computerised management systems, the availability of data for analysis within herds is improving. However, consistency of recording and diagnosis remains problematic. Recent moves towards national livestock databases, intended to reassure consumers of the safety and traceability of livestock products, are potentially valuable sources of data that could lead to much more effective application of veterinary epidemiology and economics. These opportunities will be greatly enhanced if data from different sources, such as movement recording, official animal health programmes, quality assurance schemes, production recording and breed societies, can be integrated. However, in order to realise such integrated databases, it will be necessary to provide absolute control of user access to guarantee data security and confidentiality. The potential applications of integrated livestock databases in analysis, modelling, decision support, and the provision of management information for veterinary services and livestock producers are discussed. (c) 2004 Elsevier B.V. All rights reserved.
Abstract:
The node-density effect is an artifact of phylogeny reconstruction that can cause branch lengths to be underestimated in areas of the tree with fewer taxa. Webster, Payne, and Pagel (2003, Science 301:478) introduced a statistical procedure (the "delta" test) to detect this artifact, and here we report the results of computer simulations that examine the test's performance. In a sample of 50,000 random data sets, we find that the delta test detects the artifact in 94.4% of cases in which it is present. When the artifact is not present (n = 10,000 simulated data sets), the test showed a type I error rate of approximately 1.69%, incorrectly reporting the artifact in 169 data sets. Three measures of tree shape or "balance" failed to predict the size of the node-density effect. This may reflect the relative homogeneity of our randomly generated topologies, but it emphasizes that nearly any topology can suffer from the artifact, which is not confined to highly unevenly sampled or otherwise imbalanced trees. The ability to screen phylogenies for the node-density artifact is important for phylogenetic inference and for researchers using phylogenetic trees to infer evolutionary processes, including their use in molecular clock dating. [Delta test; molecular clock; molecular evolution; node-density effect; phylogenetic reconstruction; speciation; simulation.]
Abstract:
Hydroponic isotope labelling of entire plants (HILEP) is a cost-effective method enabling metabolic labelling of whole and mature plants with a stable isotope such as N-15. By utilising hydroponic media that contain N-15 inorganic salts as the sole nitrogen source, close to 100% N-15-labelling of proteins can be achieved. In this study, it is shown that HILEP, in combination with mass spectrometry, is suitable for relative protein quantitation of seven-week-old Arabidopsis plants submitted to oxidative stress. Protein extracts from pooled N-14- and N-15-hydroponically grown plants were fractionated by SDS-PAGE, digested and analysed by liquid chromatography electrospray ionisation tandem mass spectrometry (LC-ESI-MS/MS). Proteins were identified, and the spectra of N-14/N-15 peptide pairs were extracted using their m/z, chromatographic retention time, isotopic distributions, and the m/z difference between the N-14 and N-15 peptides. Relative amounts were calculated as the ratio of the sums of the peak areas of the two distinct N-14 and N-15 peptide isotope envelopes. Using Mascot and the open-source trans-proteomic pipeline (TPP), the data processing was automated for global proteome quantitation down to the isoform level by extracting isoform-specific peptides. With this combination of metabolic labelling and mass spectrometry it was possible to show differential protein expression in the apoplast of plants submitted to oxidative stress. Moreover, it was possible to discriminate between differentially expressed isoforms belonging to the same protein family, such as isoforms of xylanases and pathogen-related glucanases (PR 2). (C) 2008 Elsevier Ltd. All rights reserved.
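The quantitation step this abstract describes, taking the ratio of the summed peak areas of the light and heavy peptide isotope envelopes, can be sketched as follows (function and variable names are hypothetical, not from the TPP or Mascot APIs):

```python
def relative_amount(light_envelope_areas, heavy_envelope_areas):
    """Relative N-14/N-15 peptide abundance: the ratio of the sums of
    the per-peak areas of the two extracted isotope envelopes.

    light_envelope_areas : peak areas of the N-14 peptide envelope
    heavy_envelope_areas : peak areas of the N-15 peptide envelope
    """
    return sum(light_envelope_areas) / sum(heavy_envelope_areas)

# Example: a peptide twice as abundant in the N-14 (unstressed) sample
ratio = relative_amount([4.0, 2.0, 1.0, 0.5], [2.0, 1.0, 0.5, 0.25])
print(ratio)  # prints 2.0
```

Summing over the whole envelope rather than using a single peak makes the ratio robust to how the isotopic pattern is distributed across peaks.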