967 results for Capture-recapture Data


Relevance:

40.00%

Publisher:

Abstract:

This research investigates the claim that Change Data Capture (CDC) technologies capture data changes in real time. Based on theory, our hypothesis states that real-time CDC is not achievable with traditional approaches (log scanning, triggers, and timestamps), because traditional approaches to CDC require a resource to be polled, which prevents true real-time CDC. We propose an approach to CDC that encapsulates the data source with a set of web services. These web services propagate the changes to the targets and eliminate the need for polling. Additionally, we propose a framework for CDC technologies that allows changes to flow from source to target. This paper discusses current CDC technologies and presents the theory about why they are unable to deliver changes in real time. We then discuss our web service approach to CDC and the accompanying framework, explaining how they can produce real-time CDC. The paper concludes with a discussion of the research required to investigate the real-time capabilities of CDC technologies. © 2010 IEEE.
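
A minimal sketch of the push-based idea this abstract describes, assuming a simple key-value source and HTTP targets; all class names and endpoints below are illustrative, not the paper's actual framework. Wrapping every write in a service layer lets each change be pushed to subscribers, so no target ever has to poll:

```python
# Illustrative push-based CDC: the data source is encapsulated so that
# each write immediately notifies registered targets over HTTP.
import json
import urllib.request

class ChangePropagatingStore:
    """Wraps a key-value store; every write is pushed to subscriber URLs."""

    def __init__(self):
        self._data = {}
        self._subscribers = []  # target endpoints to notify

    def subscribe(self, url):
        self._subscribers.append(url)

    def put(self, key, value):
        old = self._data.get(key)
        self._data[key] = value
        # Push the change event at write time instead of being polled.
        event = {"key": key, "old": old, "new": value}
        for url in self._subscribers:
            req = urllib.request.Request(
                url,
                data=json.dumps(event).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req)  # fire the notification

# Hypothetical usage:
# store = ChangePropagatingStore()
# store.subscribe("http://target.example/changes")
# store.put("order-42", {"status": "shipped"})
```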

Relevance:

40.00%

Publisher:

Abstract:

The Exhibitium Project, funded by the BBVA Foundation, is a data-driven project developed by an international consortium of research groups. One of its main objectives is to build a prototype that will serve as the basis for a platform for recording and exploiting data about art exhibitions available on the Internet. Our proposal therefore aims to expose the methods, procedures, and decision-making processes that have governed the technological implementation of this prototype, especially with regard to the reuse of WordPress (WP) as the development framework.

Relevance:

30.00%

Publisher:

Abstract:

Information on marine and estuarine capture fishery activity in northern Todos os Santos Bay, northeastern Brazil, based on daily data collected between September 2003 and June 2005, is presented. Small-scale artisanal fishery in this area is carried out mainly by canoe or on foot, uses traditional vessels, both nonmotorized and motorized, and involves many different kinds of gear, including gillnets, hook and line, seine nets, and traps. A total of 113 taxa were grouped into 77 resources, including 88 fish, 10 crustaceans, and 15 mollusks. Data on nominal catches of fish, crustaceans, and mollusks are presented by month and location. A total of 345.2 tonnes of fishery resources were produced (285.4 tonnes of fish, 39.2 tonnes of fresh invertebrates, and 20.6 tonnes of processed invertebrates). Temporal variation in the fish catch was associated with the life cycles of the species or with hydrographic conditions. The first-sale value of this catch amounted to around US$ 615,000.00, with fish representing 71.3% of it. A table of the average price of each fishery resource is presented. The results produced in this study may be considered a reference for future monitoring programs of fishery resources in the area.

Relevance:

30.00%

Publisher:

Abstract:

Barium stars are optimal sites for studying the correlations between the neutron-capture elements and other species that may be depleted or enhanced, because they act as neutron seeds or poisons during the operation of the s-process. These data are necessary to help constrain the modeling of the neutron-capture paths and explain the s-process abundance curve of the solar system. Chemical abundances for a large number of barium stars with different degrees of s-process excess, masses, metallicities, and evolutionary states are a crucial step towards this goal. We present abundances of Mn, Cu, Zn, and various light and heavy elements for a sample of barium and normal giant stars, and present correlations between abundances contributed to different degrees by the weak-s, main-s, and r-processes of neutron capture, as well as between Fe-peak elements and heavy elements. Data from the literature are also considered in order to better study the abundance pattern of peculiar stars. The stellar spectra were observed with FEROS/ESO. The stellar atmospheric parameters of the eight barium giant stars and six normal giants that we analyzed lie in the ranges 4300 < Teff/K < 5300, -0.7 < [Fe/H] <= 0.12, and 1.5 <= log g < 2.9. Carbon and nitrogen abundances were derived by spectral synthesis of the molecular bands of C2, CH, and CN. For all other elements we performed spectral synthesis using atomic lines. A very large scatter was found, mainly for the Mn abundances, when data from the literature were considered. We found that [Zn/Fe] correlates well with the heavy-element excesses, its abundance clearly increasing as the heavy-element excesses increase, a trend not shown by the [Cu/Fe] and [Mn/Fe] ratios. Also, the ratios involving Mn, Cu, and Zn and heavy elements usually show an increasing trend toward higher metallicities. Our results suggest that a larger fraction of the Zn synthesis than of the Cu synthesis is due to massive stars, and that the contribution of the main s-process to the synthesis of both elements is small. We also conclude that Mn is mostly synthesized by SNe Ia, and that a non-negligible fraction of the synthesis of Mn, Cu, and Zn is due to the weak s-process.

Relevance:

30.00%

Publisher:

Abstract:

This document records the process of migrating eprints.org data to a Fez repository. Fez is a web-based digital repository and workflow management system based on Fedora (http://www.fedora.info/). At the time of migration, the University of Queensland Library was using EPrints 2.2.1 [pepper] for its ePrintsUQ repository. Once we began to develop Fez, we did not upgrade to later versions of the eprints.org software, since we knew we would be migrating data from ePrintsUQ to the Fez-based UQ eSpace. Since this document records our experience of migrating from an earlier version of eprints.org, anyone seeking to migrate eprints.org data into a Fez repository might encounter some small differences. Moving UQ publication data from an eprints.org repository into a Fez repository (hereafter called UQ eSpace; http://espace.uq.edu.au/) was part of a plan to integrate metadata (and, in some cases, full texts) about all UQ research outputs, including theses, images, multimedia, and datasets, in a single repository. This tied in with the plan to identify and capture the research output of a single institution, the main task of the eScholarshipUQ testbed for the Australian Partnership for Sustainable Repositories project (http://www.apsr.edu.au/). The migration could not occur at UQ until the functionality in Fez was at least equal to that of the existing ePrintsUQ repository. Accordingly, as Fez development proceeded throughout 2006, a list of eprints.org functionality not yet supported in Fez was created so that such development could be planned for and implemented.

Relevance:

30.00%

Publisher:

Abstract:

Examples from the Murray-Darling basin in Australia are used to illustrate different methods of disaggregating reconnaissance-scale maps. One approach to disaggregation revolves around the de-convolution of the soil-landscape paradigm elaborated during a soil survey. The descriptions of soil map units and block diagrams in a soil survey report detail soil-landscape relationships, or soil toposequences, that can be used to disaggregate map units into component landscape elements. Toposequences can be visualised on a computer by combining soil maps with digital elevation data. Expert knowledge or statistics can be used to implement the disaggregation; the use of a restructuring element and of k-means clustering is illustrated. Another approach to disaggregation uses training areas to develop rules for extrapolating detailed mapping into other, larger areas where detailed mapping is unavailable. A two-level decision tree example is presented: at one level, the decision tree method is used to capture mapping rules from the training area; at the other, it is used to define the domain over which those rules can be extrapolated. (C) 2001 Elsevier Science B.V. All rights reserved.
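
A hedged sketch of the k-means step, not the paper's code: per-pixel terrain attributes derived from a DEM (the attribute choice here is invented for illustration) are clustered to split one reconnaissance map unit into candidate landscape elements:

```python
# Cluster DEM-derived attributes inside a single soil map unit into
# toposequence positions (e.g., crest / slope / flat). Data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Hypothetical per-pixel attributes:
# elevation (m), slope (%), topographic wetness index.
pixels = np.column_stack([
    rng.normal(120, 15, 500),   # elevation
    rng.gamma(2.0, 1.5, 500),   # slope
    rng.normal(8, 2, 500),      # wetness index
])

# Standardize so no attribute dominates the distance metric.
z = (pixels - pixels.mean(axis=0)) / pixels.std(axis=0)

# Three clusters ~ three landscape elements within the map unit.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)
for k in range(3):
    print(f"landscape element {k}: {np.sum(labels == k)} pixels")
```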

Relevance:

30.00%

Publisher:

Abstract:

The collection of spatial information to quantify changes to the state and condition of the environment is a fundamental component of the conservation or sustainable utilization of tropical and subtropical forests. Age is an important structural attribute of old-growth forests influencing biological diversity in Australian eucalypt forests. Aerial photograph interpretation has traditionally been used for mapping the age and structure of forest stands, but this method is subjective and cannot accurately capture the fine- to landscape-scale variation necessary for ecological studies. Identification and mapping of fine- to landscape-scale vegetative structural attributes will allow the compilation of information associated with Montreal Process indicators 1b and 1d, which seek to determine linkages between age structure and the diversity and abundance of forest fauna populations. This project integrated measurements of structural attributes derived from a canopy-height elevation model with results from a geometrical-optical/spectral mixture analysis model to map forest age structure at a landscape scale. The availability of multiple-scale data allows the transfer of high-resolution attributes to landscape-scale monitoring. Multispectral image data were obtained from a DMSV (Digital Multi-Spectral Video) sensor over St Mary's State Forest in Southeast Queensland, Australia. Local scene variance levels for different forest types, calculated from the DMSV data, were used to optimize the tree density and canopy size output of a geometric-optical model applied to a Landsat Thematic Mapper (TM) data set. Airborne laser scanner data obtained over the project area were used to calibrate a digital filter that extracts tree heights from a digital elevation model derived from scanned colour stereopairs. The modelled estimates of tree height, crown size, and tree density were used to produce a decision-tree classification of forest successional stage at a landscape scale. The results obtained (72% accuracy) were limited in validation, but demonstrate the potential of the multi-scale methodology to provide spatial information for forestry policy objectives (i.e., monitoring forest age structure).
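
As an illustration of the final classification step, the sketch below trains a decision tree on modelled tree height, crown size, and density to assign a successional stage. The features match those named in the abstract, but the data, labels, and thresholds are invented:

```python
# Decision-tree classification of forest successional stage from
# modelled structural attributes. All values are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 300
height = rng.uniform(5, 45, n)      # canopy height, m (laser/DEM derived)
crown = rng.uniform(2, 15, n)       # mean crown diameter, m
density = rng.uniform(50, 900, n)   # stems per hectare

# Synthetic labels: tall, large-crowned, sparse stands ~ old growth.
stage = np.where((height > 30) & (crown > 8) & (density < 300),
                 "old-growth",
                 np.where(height > 15, "mature", "regrowth"))

X = np.column_stack([height, crown, density])
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, stage)
print(clf.predict([[35, 10, 200]]))  # -> likely 'old-growth'
```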

Relevance:

30.00%

Publisher:

Abstract:

The present paper addresses two major concerns that were identified when developing neural network based prediction models and which can limit their wider applicability in industry. The first problem is that neural network models do not appear to be readily available to a corrosion engineer. Therefore, the first part of this paper describes a neural network model of CO2 corrosion created using a standard commercial software package and simple modelling strategies. Such a model was able to capture practically all of the trends in the experimental data with acceptable accuracy. This exercise demonstrates that a corrosion engineer could readily develop a neural network model such as the one described here for any problem at hand, given that sufficient experimental data exist, even in cases where the understanding of the underlying processes is poor. The second problem arises in cases where not all the required inputs for a model are known, or can be estimated only with a limited degree of accuracy. It is then advantageous to have models that can take a range, rather than a single value, as input. One such model, based on the so-called Monte Carlo approach, is presented. A number of comparisons illustrate how a corrosion engineer might use this approach to rapidly test the sensitivity of a model to the uncertainties associated with the input parameters. (C) 2001 Elsevier Science Ltd. All rights reserved.
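
The Monte Carlo idea generalizes to any predictor: when an input is known only as a range, sample it repeatedly, run the model on each sample, and report the spread of predictions. The sketch below uses a placeholder formula, not the paper's trained neural network; the input ranges and coefficients are assumptions:

```python
# Propagate an uncertain input (pH, known only as a range) through a
# predictive model and summarize the spread of corrosion rates.
import numpy as np

def corrosion_rate(temp_c, ph, co2_bar):
    """Placeholder model: any trained predictor could be used here."""
    return 0.5 * co2_bar * np.exp(0.04 * temp_c) * (7.0 - ph)

rng = np.random.default_rng(42)
n = 10_000
ph_samples = rng.uniform(4.5, 5.5, n)  # pH given as a range, not a value

rates = corrosion_rate(temp_c=60.0, ph=ph_samples, co2_bar=1.0)
print(f"mean = {rates.mean():.2f} mm/y, "
      f"5-95% band = [{np.percentile(rates, 5):.2f}, "
      f"{np.percentile(rates, 95):.2f}] mm/y")
```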

Relevance:

30.00%

Publisher:

Abstract:

Image segmentation is a ubiquitous task in medical image analysis, required to estimate morphological or functional properties of given anatomical targets. While automatic processing is highly desirable, image segmentation remains to date a supervised process in daily clinical practice, as challenging data often require user interaction to capture the required level of anatomical detail. To optimize the analysis of 3D images, the user should be able to interact efficiently with the result of any segmentation algorithm to correct any possible disagreement. Building on a previously developed real-time 3D segmentation algorithm, we propose in the present work an extension towards an interactive application in which user information can be used online to steer the segmentation result. This enables a synergistic collaboration between the operator and the underlying segmentation algorithm, contributing to higher segmentation accuracy while keeping total analysis time competitive. To this end, we formalize the user interaction paradigm using a geometrical approach, in which the user input is mapped to a non-Cartesian space and this information is used to drive the boundary towards the position provided by the user. Additionally, we propose a shape regularization term that improves interaction with the segmented surface, thereby making the interactive segmentation process less cumbersome. The resulting algorithm offers competitive performance both in terms of segmentation accuracy and total analysis time, contributing to a more efficient use of existing segmentation tools in daily clinical practice. Furthermore, it compares favorably to state-of-the-art interactive segmentation software based on a 3D livewire algorithm.
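
The paper's exact geometric formulation is not reproduced here; the following generic 2D sketch only illustrates the broader idea of user input steering a boundary, with a user click pulling nearby contour vertices toward it under a Gaussian falloff. Every name and parameter is an assumption for illustration:

```python
# Generic interactive boundary editing: a click deforms the contour
# locally, leaving distant vertices essentially unchanged.
import numpy as np

def steer_contour(contour, click, sigma=10.0):
    """contour: (N, 2) vertices; click: (2,) user-provided point."""
    d = np.linalg.norm(contour - click, axis=1)      # distance to click
    w = np.exp(-(d ** 2) / (2 * sigma ** 2))         # influence weights
    return contour + w[:, None] * (click - contour)  # weighted pull

theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.column_stack([50 + 20 * np.cos(theta),
                          50 + 20 * np.sin(theta)])
edited = steer_contour(circle, click=np.array([80.0, 50.0]))
print("max displacement:", np.abs(edited - circle).max())
```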

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is to develop models for experimental open-channel water delivery systems and to assess the use of three data-driven modeling tools toward that end. Water delivery canals are nonlinear dynamical systems and should thus be modeled to meet given operational requirements while capturing all relevant dynamics, including transport delays. Typically, the derivation of first-principles models for open-channel systems is based on the Saint-Venant equations for shallow water, which is a time-consuming task and demands specific expertise. The present paper proposes and assesses the use of three data-driven modeling tools: artificial neural networks, composite local linear models, and fuzzy systems. The canal of the Hydraulics and Canal Control Nucleus (Évora University, Portugal) is used as a benchmark: the models are identified using data collected from the experimental facility, and their performances are then assessed against suitable validation criteria. The performances of all models are compared with each other and against the experimental data to show the effectiveness of such tools in capturing all significant dynamics within the canal system and, therefore, in providing accurate nonlinear models that can be used for simulation or control. The models are available upon request to the authors.
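
A sketch of how such data-driven identification is commonly set up (a NARX-style regression on lagged inputs and outputs); the plant, lags, and network size below are assumptions for illustration, not the paper's actual canal data or models:

```python
# Identify a nonlinear input-output model: past gate openings and water
# levels predict the next level, capturing the transport delay via lags.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(7)
T = 2000
u = rng.uniform(0, 1, T)                 # gate opening (input)
y = np.zeros(T)                          # water level (output)
for t in range(2, T):                    # toy plant with a 2-step delay
    y[t] = 0.8 * y[t - 1] + 0.3 * u[t - 2] + 0.01 * rng.normal()

# Regressors: lagged outputs y[t-1], y[t-2] and delayed input u[t-2].
X = np.column_stack([y[1:-1], y[:-2], u[:-2]])
target = y[2:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000,
                     random_state=0).fit(X[:1500], target[:1500])
print("validation R^2:", model.score(X[1500:], target[1500:]))
```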

Relevance:

30.00%

Publisher:

Abstract:

This document presents a tool able to automatically gather data provided by real energy markets and to generate scenarios, and to capture and improve market players' profiles and strategies by using knowledge discovery processes in databases, supported by artificial intelligence techniques, data mining algorithms, and machine learning methods. It provides the means to generate scenarios with different dimensions and characteristics, ensuring the representation of real, adapted markets and their participating entities. The scenario generator module enhances the MASCEM (Multi-Agent Simulator of Competitive Electricity Markets) simulator, making it a more effective tool for decision support. The proposed module enables researchers and electricity market participants to analyze data, create realistic scenarios, and experiment with them. On the other hand, applying knowledge discovery techniques to real data also allows the improvement of MASCEM agents' profiles and strategies, resulting in a better representation of real market players' behavior. This work aims to improve the comprehension of electricity markets and of the interactions among the involved entities through adequate multi-agent simulation.
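
One common form such profile discovery can take is clustering players' historical bidding behavior; the following sketch assumes invented per-player features and is not MASCEM's actual schema or algorithm:

```python
# Derive behaviour profiles by clustering per-player bid statistics;
# simulated agents could then adopt a profile per cluster.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Hypothetical per-player features: mean bid price (EUR/MWh),
# bid price volatility, mean offered volume (MWh).
players = np.column_stack([
    rng.normal(55, 10, 40),
    rng.gamma(2.0, 2.0, 40),
    rng.normal(120, 40, 40),
])

# Standardize features, then group the 40 players into 4 profiles.
z = (players - players.mean(axis=0)) / players.std(axis=0)
profiles = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(z)
print("players per profile:", np.bincount(profiles))
```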

Relevance:

30.00%

Publisher:

Abstract:

Accepted at the 13th IEEE Symposium on Embedded Systems for Real-Time Multimedia (ESTIMedia 2015), Amsterdam, Netherlands.

Relevance:

30.00%

Publisher:

Abstract:

PURPOSE: The Genous™ stent (GS) is designed to accelerate endothelialization, which is potentially useful in the pro-thrombotic environment of ST-elevation acute myocardial infarction (STEMI). We aimed to evaluate the safety and effectiveness of the GS in the first year following primary percutaneous coronary intervention (PCI) and to compare our results with the few previously published studies. METHODS AND MATERIALS: All patients admitted to a single center due to STEMI who underwent primary PCI using exclusively GS, between May 2006 and January 2012, were enrolled. The primary study endpoints were major adverse cardiac events (MACEs), defined as the composite of cardiac death, acute myocardial infarction, and target vessel revascularization, at one and 12 months. RESULTS: In the cohort of 109 patients (73.4% male, 59 ± 12 years), 24.8% were diabetic. PCI was performed in 116 lesions with angiographic success in 99.1%, using 148 GS with a median diameter of 3.00 mm (2.50-4.00) and a median length of 15 mm (9-33). Cumulative MACEs were 2.8% at one month and 6.4% at 12 months. Three stent thromboses (2.8%), all subacute, and one stent restenosis (0.9%) occurred; these accounted for the four target vessel revascularizations (3.7%). At 12 months, 33.9% of patients were not on dual antiplatelet therapy. CONCLUSIONS: The GS was safe and effective in the first year following primary PCI in STEMI, with an apparently safer profile compared with previously published data. SUMMARY: We report the safety and effectiveness of the Genous™ stent (GS) in the first year following primary percutaneous coronary intervention for ST-elevation acute myocardial infarction. A comprehensive review of the few studies published on this subject is included, some of which suggest a less safe profile of the GS. Our results and the critical review included may add information and reinforce the safety and effectiveness of the GS in ST-elevation acute myocardial infarction.

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted to obtain the Master's degree in Informatics Engineering

Relevance:

30.00%

Publisher:

Abstract:

Dissertation submitted to obtain the Master's degree in Chemical and Biochemical Engineering