937 results for data integration


Relevance:

30.00%

Publisher:

Abstract:

The performance of postdetection integration (PDI) techniques for the detection of Global Navigation Satellite Systems (GNSS) signals in the presence of uncertainties in frequency offsets, noise variance, and unknown data bits is studied. It is shown that the conventional PDI techniques are generally not robust to uncertainty in the data bits and/or the noise variance. Two new modified PDI techniques are proposed and shown to be robust to these uncertainties. The receiver operating characteristics (ROC) and sample complexity of the PDI techniques in the presence of model uncertainties are analytically derived. The proposed methods significantly outperform existing ones, and hence could become increasingly important as GNSS receivers push the envelope on the minimum signal-to-noise ratio (SNR) required for reliable detection.
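A minimal Monte Carlo sketch of plain noncoherent PDI may help fix ideas (this is not the modified techniques proposed here; the signal model, SNR, and threshold are illustrative assumptions). Squaring the correlator outputs before summation removes the unknown data-bit sign, and the integrated statistic is compared to a threshold:

```python
import math
import random

random.seed(1)

def correlator_outputs(m, snr_db, signal_present):
    """Simulate m coherent-block correlator outputs as (I, Q) pairs.
    Signal amplitude, noise level, and data-bit model are illustrative."""
    amp = math.sqrt(2 * 10 ** (snr_db / 10)) if signal_present else 0.0
    out = []
    for _ in range(m):
        bit = random.choice([-1.0, 1.0])      # unknown data bit flips the sign
        out.append((amp * bit + random.gauss(0, 1), random.gauss(0, 1)))
    return out

def pdi_statistic(blocks):
    """Noncoherent PDI: sum of squared magnitudes across blocks.
    Squaring removes the unknown data-bit sign before integration."""
    return sum(i * i + q * q for i, q in blocks)

# Monte Carlo estimate of detection/false-alarm rates at a fixed threshold
m, thresh, trials = 20, 55.0, 2000
pd = sum(pdi_statistic(correlator_outputs(m, 3, True)) > thresh
         for _ in range(trials)) / trials
pfa = sum(pdi_statistic(correlator_outputs(m, 3, False)) > thresh
          for _ in range(trials)) / trials
print(f"Pd  = {pd:.2f}  Pfa = {pfa:.2f}")
```

Because the squared statistic is insensitive to the data-bit sign, detection probability stays high even though each block's sign is unknown.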


In this study, we applied the integration methodology developed in the companion paper by Aires (2014), using real satellite observations over the Mississippi Basin. The methodology provides basin-scale estimates of the four water budget components (precipitation P, evapotranspiration E, water storage change ΔS, and runoff R) in a two-step process: a Simple Weighting (SW) integration and a Postprocessing Filtering (PF) that imposes water budget closure. A comparison with in situ observations of P and E demonstrated that PF improved the estimation of both components. A Closure Correction Model (CCM) was derived from the integrated product (SW+PF) that corrects each observation data set independently, unlike the SW+PF method, which requires simultaneous estimates of all four components. The CCM standardizes the various data sets for each component and greatly reduces the budget residual (P - E - ΔS - R). As a direct application, the CCM was combined with the water budget equation to reconstruct missing values in any component. Results of a Monte Carlo experiment with synthetic gaps demonstrated the good performance of the method, except for the runoff data, whose variability is of the same order of magnitude as the budget residual. Similarly, we proposed a reconstruction of ΔS between 1990 and 2002, where no Gravity Recovery and Climate Experiment data are available. Unlike most studies dealing with water budget closure at the basin scale, only satellite observations and in situ runoff measurements are used. Consequently, the integrated data sets are model independent and can be used for model calibration or validation.
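The two-step idea can be sketched with invented values in place of the satellite products: a Simple Weighting step takes an inverse-variance mean over the data sets for each component, and a closure filter redistributes the residual P - E - ΔS - R across components in proportion to their variances (a simplified stand-in for the PF step):

```python
# Illustrative sketch of the two-step integration; all numbers are made up,
# not the Mississippi Basin values used in the study.
def simple_weighting(estimates):
    """Inverse-variance weighted mean of several (value, variance) data sets."""
    w = [1.0 / v for (_, v) in estimates]
    mean = sum(x * wi for (x, _), wi in zip(estimates, w)) / sum(w)
    return mean, 1.0 / sum(w)

def closure_filter(P, E, dS, R):
    """Distribute the budget residual P - E - dS - R across the four
    components in proportion to their variances, enforcing exact closure."""
    (p, vp), (e, ve), (ds, vds), (r, vr) = P, E, dS, R
    res = p - e - ds - r
    tot = vp + ve + vds + vr
    # signs follow the closure equation: subtract from P, add to E, dS, R
    return (p - res * vp / tot,
            e + res * ve / tot,
            ds + res * vds / tot,
            r + res * vr / tot)

P = simple_weighting([(2.8, 0.04), (3.1, 0.09)])   # mm/day, two products
E = simple_weighting([(1.9, 0.04), (2.2, 0.16)])
dS = simple_weighting([(0.2, 0.01)])
R = simple_weighting([(0.6, 0.01)])

p, e, ds, r = closure_filter(P, E, dS, R)
residual = p - e - ds - r
print(f"closed budget residual = {residual:.6f}")
```

After the filter, the four adjusted components satisfy the water budget equation exactly, which is the property the CCM then encodes for use on each data set independently.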


Chelmsford College has created an observation, appraisal and continuing professional development (CPD) cycle by successfully integrating a collection of bespoke web-based systems within its intranet. Students have benefited from improved teaching and learning because of the rapid, transparent and thorough cycle in which staff are observed, appraised and given appropriate CPD. The College has also saved time and money by being able to use single-source data to schedule observations, appraisals and CPD to individuals' needs.


A new approach based on the gated integration technique is proposed for accurate measurement of the autocorrelation function of speckle intensities scattered from a random phase screen. The Boxcar used for this technique integrates the photoelectric signal while its sampling gate is open, and it repeats the sampling a preset number of times, m. Averaging the m samples output by the Boxcar enhances the signal-to-noise ratio by a factor of √m, because the repeated sampling and averaging keep the useful speckle signal stable while the randomly varying photoelectric noise is suppressed by 1/√m. In the experiment, we use an analog-to-digital converter module to synchronize all the actions, such as the stepped movement of the phase screen, the repeated sampling, and the readout of the averaged output of the Boxcar. The experimental results show that speckle signals are better recovered from contaminated signals, and the autocorrelation function with its secondary maximum is obtained, indicating that the accuracy of the measurement of the autocorrelation function is greatly improved by the gated integration technique.
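The √m noise suppression from repeated gated sampling can be checked with a small simulation (the signal level and noise statistics are illustrative, not the experimental values):

```python
import math
import random

random.seed(0)

def averaged_sample(signal, noise_std, m):
    """Average m repeated gated samples: the stable speckle signal survives
    while zero-mean photoelectric noise is reduced by a factor of sqrt(m)."""
    return sum(signal + random.gauss(0, noise_std) for _ in range(m)) / m

def noise_std_after(m, noise_std=1.0, trials=4000):
    """Estimate the residual noise standard deviation after averaging m samples."""
    vals = [averaged_sample(0.0, noise_std, m) for _ in range(trials)]
    mean = sum(vals) / len(vals)
    return math.sqrt(sum((v - mean) ** 2 for v in vals) / (len(vals) - 1))

s1, s16 = noise_std_after(1), noise_std_after(16)
print(f"noise std, m=1 : {s1:.3f}")
print(f"noise std, m=16: {s16:.3f}  (expected about 1/4)")
```

Averaging 16 repeats leaves roughly a quarter of the original noise, i.e. the 1/√m suppression the Boxcar averaging relies on.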


Sodium phosphate tellurite glasses in the system (NaPO3)x(TeO2)1-x were prepared and structurally characterized by thermal analysis, vibrational spectroscopy, X-ray photoelectron spectroscopy (XPS), and a variety of complementary solid-state nuclear magnetic resonance (NMR) techniques. Unlike the situation in other mixed-network-former glasses, the interaction between the two network formers tellurium oxide and phosphorus oxide produces no new structural units, and no sharing of the network modifier Na2O takes place. The glass structure can be regarded as a network of interlinked metaphosphate-type P(2) tetrahedral and TeO4/2 antiprismatic units. The combined interpretation of the O 1s XPS data and the 31P solid-state NMR spectra presents clear quantitative evidence for a nonstatistical connectivity distribution. Rather, the formation of homoatomic P-O-P and Te-O-Te linkages is favored over mixed P-O-Te connectivities. As a consequence of this chemical segregation effect, the spatial sodium distribution is not random, as also indicated by a detailed analysis of 31P/23Na rotational echo double-resonance (REDOR) experiments.


Molecular markers have been demonstrated to be useful for the estimation of stock mixture proportions where the origin of individuals is determined from baseline samples. Bayesian statistical methods are widely recognized as providing a preferable strategy for such analyses. In general, Bayesian estimation is based on standard latent class models using data augmentation through Markov chain Monte Carlo techniques. In this study, we introduce a novel approach based on recent developments in the estimation of genetic population structure. Our strategy combines analytical integration with stochastic optimization to identify stock mixtures. An important enhancement over previous methods is the possibility of appropriately handling data where only partial baseline sample information is available. We address the potential use of nonmolecular, auxiliary biological information in our Bayesian model.
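As a simplified illustration of the estimation problem (this is not the paper's Bayesian analytical-integration strategy), the sketch below recovers mixture proportions by expectation-maximization from a fully known baseline; the stock names, marker-class probabilities, and the true 70/30 mixture are invented:

```python
import random

random.seed(2)

# Hypothetical baseline: per-stock probabilities of three marker classes.
baseline = {"stockA": [0.7, 0.2, 0.1],
            "stockB": [0.1, 0.3, 0.6]}

# Simulate a mixture sample: 70% stockA, 30% stockB (the "truth" to recover).
true_pi = {"stockA": 0.7, "stockB": 0.3}
sample = []
for _ in range(3000):
    s = "stockA" if random.random() < true_pi["stockA"] else "stockB"
    r, cum = random.random(), 0.0
    for k, p in enumerate(baseline[s]):
        cum += p
        if r < cum:
            sample.append(k)
            break
    else:
        sample.append(len(baseline[s]) - 1)

# EM for the mixture proportions: E-step computes each individual's posterior
# stock membership, M-step re-estimates the proportions.
pi = {"stockA": 0.5, "stockB": 0.5}
for _ in range(200):
    counts = {s: 0.0 for s in pi}
    for g in sample:
        post = {s: pi[s] * baseline[s][g] for s in pi}
        tot = sum(post.values())
        for s in pi:
            counts[s] += post[s] / tot
    pi = {s: counts[s] / len(sample) for s in pi}

print({s: round(v, 2) for s, v in pi.items()})
```

With well-separated baseline frequencies the estimated proportions land close to the simulated 70/30 truth; handling only *partial* baseline information, as the paper does, is the genuinely harder case.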


NOAA’s Coral Reef Conservation Program (CRCP) develops coral reef management priorities by bringing together various partners to better understand threats to coral reef ecosystems, with the goal of conserving, protecting and restoring these resources. Place-based and ecosystem-based management approaches employed by CRCP require that spatially explicit information about benthic habitats and fish utilization is available to characterize coral reef ecosystems and set conservation priorities. To accomplish this, seafloor habitat mapping of coral reefs around the U.S. Virgin Islands (USVI) and Puerto Rico has been ongoing since 2004. In 2008, fishery acoustics surveys were added to NOAA survey missions in the USVI and Puerto Rico to assess fish distribution and abundance in relation to benthic habitats in high priority conservation areas. NOAA’s National Centers for Coastal Ocean Science (NCCOS) have developed fisheries acoustics survey capabilities onboard the NOAA ship Nancy Foster to complement the CRCP seafloor habitat mapping effort spearheaded by the Center for Coastal Monitoring and Assessment Biogeography Branch (CCMA-BB). The integration of these activities has evolved on the Nancy Foster over the three years summarized in this report, and a strategy for improved operations and products has emerged over that time. Not only has the concurrent operation of multibeam and fisheries acoustics surveys been beneficial in terms of optimizing ship time and resources, but this joint effort has also advanced an integrated approach to characterizing bottom and mid-water habitats and the fishes associated with them. CCMA conducts multibeam surveys to systematically map and characterize coral reef ecosystems, resulting in products such as high resolution bathymetric maps, backscatter information, and benthic habitat classification maps. These products focus on benthic features and live bottom habitats associated with them.
NCCOS Centers (the Center for Coastal Fisheries and Habitat Research and the Center for Coastal Environmental Health and Biomolecular Research) characterize coral reef ecosystems by using fisheries acoustics methods to capture biological information through the entire water column. Spatially-explicit information on marine resources derived from fisheries acoustics surveys, such as maps of fish density, supports marine spatial planning strategies and decision making by providing a biological metric for evaluating coral reef ecosystems and assessing impacts from pollution, fishing pressure, and climate change. Data from fisheries acoustics surveys address management needs by providing a measure of biomass in management areas, detecting spatial and temporal responses in distribution relative to natural and anthropogenic impacts, and identifying hotspots that support high fish abundance or fish aggregations. Fisheries acoustics surveys conducted alongside multibeam mapping efforts inherently couple water column data with information on benthic habitats and provide information on the heterogeneity of both benthic habitats and biota in the water column. Building on this information serves to inform resource managers regarding how fishes are organized around habitat structure and the scale at which these relationships are important. Where resource managers require place-based assessments regarding the location of critical habitats along with high abundances of fish, concurrent multibeam and fisheries acoustics surveys serve as an important tool for characterizing and prioritizing coral reef ecosystems. This report summarizes the evolution of fisheries acoustics surveys onboard the NOAA ship Nancy Foster from 2008 to 2010, in conjunction with multibeam data collection, aimed at characterizing benthic and mid-water habitats in high priority conservation areas around the USVI and Puerto Rico. 
It also serves as a resource for the continued development of consistent data products derived from acoustic surveys. By focusing on the activities of 2010, this report highlights the progress made to date and illustrates the potential application of fisheries data derived from acoustic surveys to the management of coral reef ecosystems.


This book explores processes for the retrieval, classification, and integration of construction images in AEC/FM model-based systems. The author describes a combination of techniques from the areas of image and video processing, computer vision, information retrieval, statistics, and content-based image and video retrieval that have been integrated into a novel method for retrieving related construction site image data from components of a project model. This method has been tested on construction site images from a variety of sources, such as past and current building construction and transportation projects, and is able to automatically classify, store, integrate, and retrieve image data files in inter-organizational systems so as to allow their use in project-management-related tasks. Automated methods for the integration of construction images are therefore important for construction information management.
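As a toy illustration of content-based retrieval, one of the ingredient techniques named above (the feature, the similarity measure, and all the data below are invented and far simpler than the book's method): images are reduced to normalized intensity histograms tagged with a project-model component, and a query is answered by ranking histogram similarity.

```python
import math

def histogram(pixels, bins=4):
    """Normalized intensity histogram of a flat pixel list (0-255 values)."""
    h = [0] * bins
    for p in pixels:
        h[min(p * bins // 256, bins - 1)] += 1
    total = sum(h)
    return [c / total for c in h]

def cosine(a, b):
    """Cosine similarity between two histograms."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

# Hypothetical "images", each tagged with a project-model component
database = {
    "steel-frame":   histogram([30, 40, 50, 60, 200, 210] * 20),
    "concrete-slab": histogram([120, 130, 140, 150] * 30),
    "excavation":    histogram([10, 20, 30, 220, 230, 240] * 20),
}
query = histogram([35, 45, 55, 65, 205, 215] * 20)

ranked = sorted(database, key=lambda k: cosine(query, database[k]), reverse=True)
print("best match:", ranked[0])
```

Real systems replace the histogram with richer visual features and couple the ranking to the project model, but the retrieve-by-similarity skeleton is the same.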


A novel integration method for the production of cost-effective optoelectronic printed circuit boards (OE PCBs) is presented. The proposed integration method allows fabrication of OE PCBs with manufacturing processes common to the electronics industry while enabling direct attachment of electronic components onto the board with solder reflow processes as well as board assembly with automated pick-and-place tools. The OE PCB design is based on the use of polymer multimode waveguides, end-fired optical coupling schemes, and simple electro-optic connectors, eliminating the need for additional optical components in the optical layer, such as micro-mirrors and micro-lenses. A proof-of-concept low-cost optical transceiver produced with the proposed integration method is presented. This transceiver is fabricated on a low-cost FR4 substrate, comprises a polymer Y-splitter together with the electronic circuitry of the transmitter and receiver modules and achieves error-free 10-Gb/s bidirectional data transmission. Theoretical studies on the optical coupling efficiencies and alignment tolerances achieved with the employed end-fired coupling schemes are presented while experimental results on the optical transmission characteristics, frequency response, and data transmission performance of the integrated optical links are reported. The demonstrated optoelectronic unit can be used as a front-end optical network unit in short-reach datacommunication links. © 2011-2012 IEEE.


With long-term marine surveys and research, and especially with the development of new marine environment monitoring technologies, prodigious amounts of complex marine environmental data are generated and continue to increase rapidly. Features of these data include massive volume, widespread distribution, multiple sources, heterogeneity, multi-dimensionality, and dynamic structure in time. The present study recommends an integrative visualization solution for these data, to enhance the visual display of data and data archives, and to develop joint use of these data distributed among different organizations or communities. This study also analyses web services technologies and defines the concept of the marine information grid, then focuses on spatiotemporal visualization and proposes a process-oriented spatiotemporal visualization method. We discuss how marine environmental data can be organized based on the spatiotemporal visualization method, and how organized data are represented for use with web services and stored in a reusable fashion. In addition, we provide an original visualization architecture that is integrative and based on the explored technologies. Finally, we propose a prototype system for marine environmental data of the South China Sea, with visualizations of Argo floats, sea surface temperature fields, sea current fields, salinity, in situ investigation data, and ocean stations. The integrative visualization architecture is illustrated by the prototype system, which highlights the process-oriented temporal visualization method and demonstrates the benefits of the architecture and methods described in this study.


Li, Longzhuang, Liu, Yonghuai, Obregon, A., Weatherston, M. Visual Segmentation-Based Data Record Extraction From Web Documents. Proceedings of IEEE International Conference on Information Reuse and Integration, 2007, pp. 502-507. Sponsorship: IEEE


BACKGROUND: In the current climate of high-throughput computational biology, the inference of a protein's function from related measurements, such as protein-protein interaction relations, has become a canonical task. Most existing technologies pursue this task as a classification problem, on a term-by-term basis, for each term in a database such as the Gene Ontology (GO) database, a popular rigorous vocabulary for biological functions. However, ontology structures are essentially hierarchies, with certain top-to-bottom annotation rules which protein function predictions should in principle follow. Currently, the most common approach to imposing these hierarchical constraints on network-based classifiers is to apply transitive closure to the predictions.

RESULTS: We propose a probabilistic framework to integrate information in relational data, in the form of a protein-protein interaction network, and a hierarchically structured database of terms, in the form of the GO database, for the purpose of protein function prediction. At the heart of our framework is a factorization of local neighborhood information in the protein-protein interaction network across successive ancestral terms in the GO hierarchy. We introduce a classifier within this framework, with a computationally efficient implementation, that produces GO-term predictions that naturally obey a hierarchical 'true-path' consistency from root to leaves, without the need for further post-processing.

CONCLUSION: A cross-validation study, using data from the yeast Saccharomyces cerevisiae, shows our method offers substantial improvements over both standard 'guilt-by-association' (i.e., nearest-neighbor) and more refined Markov random field methods, whether in their original form or when post-processed to artificially impose 'true-path' consistency. Further analysis of the results indicates that these improvements are associated with increased predictive capabilities (i.e., increased positive predictive value), and that this increase is consistent across GO-term depth. Additional in silico validation on a collection of new annotations recently added to GO confirms the advantages suggested by the cross-validation study. Taken as a whole, our results show that a hierarchical approach to network-based protein function prediction, which exploits the ontological structure of protein annotation databases in a principled manner, can offer substantial advantages over the successive application of 'flat' network-based methods.
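The 'true-path' rule that the framework obeys by construction can be illustrated with the post-hoc correction it renders unnecessary. The toy hierarchy and the raw scores below are invented; the sketch caps each term's predicted score by its parents' scores, visiting terms in root-to-leaf order:

```python
# Toy GO-like hierarchy: each term maps to its parent terms (root has none).
parents = {"root": [], "A": ["root"], "B": ["root"], "A1": ["A"], "AB": ["A", "B"]}

# Raw per-term scores from a flat classifier (hypothetical values);
# note that A1 scores *higher* than its parent A, violating the true-path rule.
raw = {"root": 1.0, "A": 0.4, "B": 0.7, "A1": 0.6, "AB": 0.5}

def enforce_true_path(parents, raw):
    """Post-hoc fix-up: cap each term's score by its parents' adjusted scores,
    visiting terms in topological (root-to-leaf) order."""
    order, seen = [], set()

    def visit(t):
        if t in seen:
            return
        for p in parents[t]:
            visit(p)          # ensure all ancestors are placed first
        seen.add(t)
        order.append(t)

    for t in parents:
        visit(t)

    adj = {}
    for t in order:
        cap = min((adj[p] for p in parents[t]), default=1.0)
        adj[t] = min(raw[t], cap)
    return adj

adj = enforce_true_path(parents, raw)
print(adj)   # A1 is capped at its parent A's score
```

After the correction every term's score is bounded by all of its parents' scores, which is exactly the root-to-leaf consistency the proposed classifier produces directly.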


How do brain mechanisms carry out the motion integration and segmentation processes that compute unambiguous global motion percepts from ambiguous local motion signals? Consider, for example, a deer running at variable speeds behind forest cover. The forest cover is an occluder that creates apertures through which fragments of the deer's motion signals are intermittently experienced. The brain coherently groups these fragments into a trackable percept of the deer along its trajectory. Form and motion processes are needed to accomplish this, using feedforward and feedback interactions both within and across cortical processing streams. All of the cortical areas V1, V2, MT, and MST are involved in these interactions. Figure-ground processes in the form stream through V2, such as the separation of the occluding boundaries of the forest cover from the boundaries of the deer, select the motion signals which determine global object-motion percepts in the motion stream through MT. Sparse but unambiguous feature-tracking signals are amplified before they propagate across position and are integrated with the far more numerous ambiguous motion signals. Figure-ground and integration processes together determine the global percept. A neural model predicts the processing stages that embody these form and motion interactions. Model concepts and data are summarized concerning motion grouping across apertures in response to a wide variety of displays, and probabilistic decision-making in parietal cortex in response to random-dot displays.


Hydrologic research is a very demanding application of fiber-optic distributed temperature sensing (DTS) in terms of precision, accuracy and calibration. The physics behind the most frequently used DTS instruments is considered as it applies to four calibration methods for single-ended DTS installations. The new methods presented are more accurate than the instrument-calibrated data, achieving accuracies on the order of tenths of a degree root mean square error (RMSE) and mean bias. Effects of localized non-uniformities that violate the assumptions of single-ended calibration data are explored and quantified. Experimental design considerations such as selection of integration times or selection of the length of the reference sections are discussed, and the impacts of these considerations on calibrated temperatures are explored in two case studies.
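As background, single-ended Raman DTS calibration is commonly written as T(z) = γ / (ln(P_S/P_aS) + C - Δα·z), with γ, C, and Δα as calibration parameters. The sketch below assumes γ is known and solves for C and Δα from two reference baths; all numeric values are synthetic, and the paper's methods address more general cases:

```python
import math

# Synthetic "true" parameters for the forward model (illustrative only).
gamma, C_true, dalpha_true = 480.0, 1.6, 1.8e-4

def stokes_ratio(T, z):
    """Forward model: Ps/Pas implied by temperature T (K) at distance z (m)."""
    return math.exp(gamma / T - C_true + dalpha_true * z)

# Two reference sections (baths) at known temperatures along the fiber
(z1, T1), (z2, T2) = (50.0, 278.15), (900.0, 318.15)
r1, r2 = stokes_ratio(T1, z1), stokes_ratio(T2, z2)

# Each bath gives one linear equation: C - dalpha*z = gamma/T - ln(Ps/Pas)
b1 = gamma / T1 - math.log(r1)
b2 = gamma / T2 - math.log(r2)
dalpha = (b1 - b2) / (z2 - z1)
C = b1 + dalpha * z1

def temperature(r, z):
    """Calibrated temperature from a measured Stokes/anti-Stokes ratio."""
    return gamma / (math.log(r) + C - dalpha * z)

# Recover temperature at an uncalibrated point on the fiber
z_test, T_test = 400.0, 293.15
T_est = temperature(stokes_ratio(T_test, z_test), z_test)
print(f"C = {C:.3f}, dalpha = {dalpha:.2e}, T_est = {T_est:.2f} K")
```

With noiseless synthetic data the two-bath solve recovers the parameters exactly; in practice, noise, reference-section length, and integration time drive the attainable RMSE, which is what the paper quantifies.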


An enterprise information system (EIS) is an integrated data-applications platform characterized by diverse, heterogeneous, and distributed data sources. For many enterprises, a number of business processes still depend heavily on static rule-based methods and extensive human expertise. Enterprises are faced with the need for optimizing operation scheduling, improving resource utilization, discovering useful knowledge, and making data-driven decisions.

This thesis research is focused on real-time optimization and knowledge discovery that addresses workflow optimization, resource allocation, as well as data-driven predictions of process-execution times, order fulfillment, and enterprise service-level performance. In contrast to prior work on data analytics techniques for enterprise performance optimization, the emphasis here is on realizing scalable and real-time enterprise intelligence based on a combination of heterogeneous system simulation, combinatorial optimization, machine-learning algorithms, and statistical methods.

On-demand digital-print service is a representative enterprise requiring a powerful EIS. We use real-life data from Reischling Press, Inc. (RPI), a digital-print-service provider (PSP), to evaluate our optimization algorithms.

In order to handle the increase in volume and diversity of demands, we first present a high-performance, scalable, and real-time production scheduling algorithm for production automation based on an incremental genetic algorithm (IGA). The objective of this algorithm is to optimize the order dispatching sequence and balance resource utilization. Compared to prior work, this solution is scalable for a high volume of orders and it provides fast scheduling solutions for orders that require complex fulfillment procedures. Experimental results highlight its potential benefit in reducing production inefficiencies and enhancing the productivity of an enterprise.
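A genetic algorithm for order dispatching can be sketched as follows; this toy version (two stations, made-up processing times, order crossover only, no incremental updates) only hints at the structure of the thesis's IGA:

```python
import random

random.seed(3)

# Hypothetical processing times for eight orders; fitness = makespan on
# two parallel stations under greedy assignment of the dispatch sequence.
times = [4, 7, 2, 9, 5, 3, 6, 8]

def makespan(order):
    loads = [0, 0]                      # assign each order to the lighter station
    for t in order:
        loads[loads.index(min(loads))] += times[t]
    return max(loads)

def crossover(a, b):
    """Order crossover: keep a slice of parent a, fill the rest in b's order."""
    i, j = sorted(random.sample(range(len(a)), 2))
    mid = a[i:j]
    rest = [x for x in b if x not in mid]
    return rest[:i] + mid + rest[i:]

# Evolve a population of dispatch sequences
pop = [random.sample(range(len(times)), len(times)) for _ in range(30)]
for _ in range(60):
    pop.sort(key=makespan)
    elite = pop[:10]                    # keep the best sequences
    pop = elite + [crossover(*random.sample(elite, 2)) for _ in range(20)]

best = min(pop, key=makespan)
print("best makespan:", makespan(best))
```

The total work is 44, so 22 is a lower bound on the makespan; the evolved sequence should land at or near it. An *incremental* GA additionally reuses the evolved population when new orders arrive instead of restarting from scratch.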

We next discuss analysis and prediction of different attributes involved in hierarchical components of an enterprise. We start from a study of the fundamental processes related to real-time prediction. Our process-execution-time and process-status prediction models integrate statistical methods with machine-learning algorithms. In addition to improved prediction accuracy compared to stand-alone machine-learning algorithms, they also perform a probabilistic estimation of the predicted status. An order generally consists of multiple series and parallel processes. We next introduce an order-fulfillment prediction model that combines the advantages of multiple classification models by incorporating flexible decision-integration mechanisms. Experimental results show that adopting due dates recommended by the model can significantly reduce the enterprise late-delivery ratio. Finally, we investigate service-level attributes that reflect the overall performance of an enterprise. We analyze and decompose time-series data into different components according to their hierarchical periodic nature, perform correlation analysis, and develop univariate prediction models for each component as well as multivariate models for correlated components. Predictions for the original time series are aggregated from the predictions of its components. In addition to a significant increase in mid-term prediction accuracy, this distributed modeling strategy also improves short-term time-series prediction accuracy.
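The decompose-predict-aggregate strategy can be sketched on a synthetic daily series with a linear trend plus a weekly cycle (the data and the simple per-component extrapolators are illustrative, not the models used in the thesis):

```python
import math

# Synthetic daily series: y[t] = 100 + 0.5*t + 10*sin(2*pi*t/7)
n, period = 84, 7
y = [100 + 0.5 * t + 10 * math.sin(2 * math.pi * t / period) for t in range(n)]

# Decompose: a centered moving average estimates the trend; the remainder,
# averaged by day-of-week, gives the seasonal component.
half = period // 2
trend = {t: sum(y[t - half:t + half + 1]) / period for t in range(half, n - half)}
seasonal = [0.0] * period
for d in range(period):
    vals = [y[t] - trend[t] for t in trend if t % period == d]
    seasonal[d] = sum(vals) / len(vals)

# Predict each component separately, then aggregate the predictions.
ts = sorted(trend)
slope = (trend[ts[-1]] - trend[ts[0]]) / (ts[-1] - ts[0])

def predict(t):
    return trend[ts[-1]] + slope * (t - ts[-1]) + seasonal[t % period]

t_next = n          # one step ahead of the observed series
y_true = 100 + 0.5 * t_next + 10 * math.sin(2 * math.pi * t_next / period)
err = abs(predict(t_next) - y_true)
print(f"one-step-ahead error = {err:.3f}")
```

Because the series is exactly trend plus seasonality here, the aggregated component forecasts reproduce it; with real data, each component would get its own univariate (or multivariate, for correlated components) model.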

In summary, this thesis research has led to a set of characterization, optimization, and prediction tools for an EIS to derive insightful knowledge from data and use it as guidance for production management. It is expected to provide solutions for enterprises to increase reconfigurability, accomplish more automated procedures, and obtain data-driven recommendations for effective decisions.