931 results for Capture-recapture Data
Abstract:
Latent variable models for network data extract a summary of the relational structure underlying an observed network. The simplest possible models subdivide nodes of the network into clusters; the probability of a link between any two nodes then depends only on their cluster assignment. Currently available models can be classified by whether clusters are disjoint or are allowed to overlap. These models can explain a "flat" clustering structure. Hierarchical Bayesian models provide a natural approach to capture more complex dependencies. We propose a model in which objects are characterised by a latent feature vector. Each feature is itself partitioned into disjoint groups (subclusters), corresponding to a second layer of hierarchy. In experimental comparisons, the model achieves significantly improved predictive performance on social and biological link prediction tasks. The results indicate that models with a single layer hierarchy over-simplify real networks.
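A minimal sketch of the kind of link-probability computation such a two-layer latent feature model implies (the function name, the weight tensor W, the bias term, and the logistic link are illustrative assumptions, not the paper's exact formulation):

    import numpy as np

    def link_probability(z_i, z_j, sub_i, sub_j, W, bias=-2.0):
        """Illustrative link probability for a two-layer latent feature model.

        z_i, z_j     : binary latent feature vectors (numpy arrays) for nodes i and j
        sub_i, sub_j : subcluster assignment of each feature for nodes i and j
        W            : weights indexed by (feature, subcluster of i, subcluster of j)
        bias         : background log-odds for otherwise unrelated nodes (assumed)
        """
        score = bias
        for k in np.flatnonzero(z_i * z_j):        # features shared by both nodes
            score += W[k, sub_i[k], sub_j[k]]
        return 1.0 / (1.0 + np.exp(-score))        # logistic link (assumed)

Under this reading, a flat clustering corresponds to a single layer of feature memberships, while the second layer lets the same shared feature contribute differently depending on the subclusters the two nodes fall into.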
Abstract:
Optical motion capture systems suffer from marker occlusions resulting in loss of useful information. This paper addresses the problem of real-time joint localisation of legged skeletons in the presence of such missing data. The data is assumed to be labelled 3d marker positions from a motion capture system. An integrated framework is presented which predicts the occluded marker positions using a Variable Turn Model within an Unscented Kalman filter. Inferred information from neighbouring markers is used as observation states; these constraints are efficient, simple, and real-time implementable. This work also takes advantage of the common case that missing markers are still visible to a single camera, by combining predictions with under-determined positions, resulting in more accurate predictions. An Inverse Kinematics technique is then applied ensuring that the bone lengths remain constant over time; the system can thereby maintain a continuous data-flow. The marker and Centre of Rotation (CoR) positions can be calculated with high accuracy even in cases where markers are occluded for a long period of time. Our methodology is tested against some of the most popular methods for marker prediction and the results confirm that our approach outperforms these methods in estimating both marker and CoR positions. © 2012 Springer-Verlag.
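The bone-length constraint mentioned above can be pictured as projecting each predicted child-joint position back onto a sphere of fixed radius around its parent joint; a minimal sketch follows (function and variable names are our own, and the paper's Inverse Kinematics step is more involved than this):

    import numpy as np

    def enforce_bone_length(parent_pos, child_pos, bone_length):
        """Project a predicted child-joint position onto the constant-bone-length
        sphere centred on its parent joint (illustrative only)."""
        direction = child_pos - parent_pos
        norm = np.linalg.norm(direction)
        if norm < 1e-9:   # degenerate prediction: fall back to an arbitrary fixed offset
            return parent_pos + np.array([bone_length, 0.0, 0.0])
        return parent_pos + direction * (bone_length / norm)

Applied down the kinematic chain after each filter update, a projection of this kind keeps bone lengths constant over time even when the raw marker predictions drift.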
Abstract:
Radio Frequency Identification (RFID) technology allows automatic data capture from tagged objects moving in a supply chain. This data can be very useful when used to answer traceability queries; however, it is distributed across many different repositories owned by different companies. © 2012 IEEE.
Abstract:
Optical films containing the genetic variant bacteriorhodopsin BR-D96N were experimentally studied in view of their properties as media for holographic storage. Different polarization recording schemes were tested and compared. The influence of the polarization states of the recording and readout waves on the retrieved diffractive image's intensity and its signal-to-noise ratio was analyzed. The experimental results showed that, compared with the other tested polarization relations during holographic recording, the discrimination between the polarization states of diffracted and scattered light is optimized with orthogonal circular polarization of the recording beams, and thus a high signal-to-noise ratio and a high diffraction efficiency are obtained. Using a He-Ne laser (633 nm, 3 mW) for recording and readout, a spatial light modulator as a data input element, and a 2D-CCD sensor for data capture in a Fourier-transform holographic setup, a storage density of 2 × 10^8 bits/cm² was obtained on a 60 × 42 µm² area in the BR-D96N film. The readout of encoded binary data was possible with a zero error rate at the tested storage density. © 2005 Optical Society of America.
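As a quick consistency check of the figures quoted above (our arithmetic, not a calculation reported in the paper), the stated density over the stated recording area corresponds to roughly five thousand bits per recorded data page:

    A = 60\,\mu\mathrm{m} \times 42\,\mu\mathrm{m} = 2.52 \times 10^{-5}\,\mathrm{cm}^{2},
    \qquad
    N \approx \left(2 \times 10^{8}\,\mathrm{bits/cm^{2}}\right)\left(2.52 \times 10^{-5}\,\mathrm{cm}^{2}\right) \approx 5 \times 10^{3}\,\mathrm{bits}.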
Abstract:
The ratios R_k1 of k-fold to single ionization of the target atom with simultaneous one-electron capture by the projectile have been measured for 15-480 keV/u (v_p = 0.8-4.4 a.u.) collisions of C^q+ and O^q+ (q = 1-4) with Ar, using time-of-flight techniques which allowed the simultaneous identification of the final charge state of both the low-velocity recoil ion and the high-velocity projectile for each collision event. The present ratios are similar to those for He^+ and He^2+ ion impact. The energy dependence of R_k1 shows a maximum at a certain energy, E_max, which approximately conforms to the q^(1/2)-dependence scaling. For a fixed projectile state, the ratios R_k1 also vary strongly with outgoing reaction channels. The general behavior of the measured data can be qualitatively analyzed by a simple impact-parameter, independent-electron model. © 2009 Elsevier B.V. All rights reserved.
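For reference, the impact-parameter independent-electron model invoked above treats multiple target ionization as a binomial process: if each of N equivalent target electrons is removed with single-electron probability p(b) at impact parameter b, then (standard form, not the paper's specific parameterisation)

    P_k(b) = \binom{N}{k}\, p(b)^{k}\, \bigl[1 - p(b)\bigr]^{N-k},

and the k-fold cross sections follow by integrating 2\pi b\, P_k(b) over the impact parameter.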
Abstract:
The shell effect is included in the improved isospin-dependent quantum molecular dynamics model, in which the shell correction energy of the system is calculated using the deformed two-center shell model. A switch function is introduced to connect the shell correction energy of the projectile and the target with that of the compound nucleus during the dynamical fusion process. It is found that the calculated capture cross sections reproduce the experimental data quantitatively at energies near the Coulomb barrier. The capture cross sections for the reaction ^80_35Br + ^208_82Pb -> ^288_117X are also calculated and discussed.
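For context, capture cross sections of this kind are conventionally written as a partial-wave sum over transmission coefficients through the Coulomb barrier (a standard expression, not necessarily the exact form evaluated in the model above):

    \sigma_{\mathrm{cap}}(E_{\mathrm{c.m.}}) = \frac{\pi \hbar^{2}}{2 \mu E_{\mathrm{c.m.}}} \sum_{l=0}^{\infty} (2l + 1)\, T_{l}(E_{\mathrm{c.m.}}),

where \mu is the reduced mass of the colliding pair and T_l is the transmission probability for partial wave l.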
Abstract:
Cross sections for electron loss in H(1s) + H(1s) collisions and for the total collisional destruction of H(2s) in H(1s) + H(2s) collisions are calculated by the four-body classical-trajectory Monte Carlo (CTMC) method and compared with previous theoretical and experimental data over the energy range 4-100 keV. For the former, good agreement is obtained among the different four-body CTMC calculations, and for incident energies E_p > 10 keV, comparison with the experimental data shows better agreement than the results calculated by the impact parameter approximation. For the latter, our theory predicts the correct experimental behaviour, and the discrepancies between our results and the experimental ones are less than 30%. Based on the successive comparisons with experiment, the cross sections for excitation to H(2p), single and double ionization, and H^- formation in H(2s) + H(2s) collisions are calculated in the energy range 4-100 keV for the first time, and compared with those in H(1s) + H(1s) and H(1s) + H(2s) collisions.
Abstract:
Wireless sensor networks are characterized by limited energy resources. To conserve energy, application-specific aggregation (fusion) of data reports from multiple sensors can be beneficial in reducing the amount of data flowing over the network. Furthermore, controlling the topology by scheduling the activity of nodes between active and sleep modes has often been used to uniformly distribute the energy consumption among all nodes by de-synchronizing their activities. We present an integrated analytical model to study the joint performance of in-network aggregation and topology control. We define performance metrics that capture the tradeoffs among delay, energy, and fidelity of the aggregation. Our results indicate that to achieve high fidelity levels under medium to high event reporting load, shorter and fatter aggregation/routing trees (toward the sink) offer the best delay-energy tradeoff as long as topology control is well coordinated with routing.
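A toy calculation of the tradeoff described above (the cost model, function name, and parameters are our simplifications, not the paper's analytical model): with per-hop aggregation every non-sink node forwards one aggregated packet per reporting round, so energy grows with the number of nodes while delay grows with tree depth.

    def tree_tradeoff(depth, fanout, t_hop=1.0, e_tx=1.0):
        """Toy delay/energy estimate for a complete aggregation tree (illustrative).

        depth  : hops from the deepest leaves to the sink
        fanout : children per node
        t_hop  : per-hop aggregate-and-forward delay
        e_tx   : energy per transmitted aggregated packet
        """
        nodes = sum(fanout ** level for level in range(depth + 1))  # sink included
        delay = depth * t_hop              # leaf data must traverse `depth` hops
        energy = (nodes - 1) * e_tx        # each non-sink node sends one aggregate per round
        return nodes, delay, energy

    # A shorter, fatter tree vs. a taller, thinner one of similar size:
    print(tree_tradeoff(depth=2, fanout=10))   # 111 nodes, delay 2
    print(tree_tradeoff(depth=4, fanout=3))    # 121 nodes, delay 4

Even in this crude form, the comparison shows why shorter and fatter trees can offer a similar per-round energy cost at a lower aggregation delay.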
Abstract:
Colour is everywhere in our daily lives and impacts things like our mood, yet we rarely take notice of it. One method of capturing and analysing the predominant colours that we encounter is through visual lifelogging devices such as the SenseCam. However, an issue related to these devices is the privacy concern of capturing image-level detail. Therefore, in this work we demonstrate a hardware prototype wearable camera that captures only one pixel, representing the dominant colour prevalent in front of the user, thus circumventing the privacy concerns raised in relation to lifelogging. To assess whether capturing the dominant colour alone would be sufficient, we report on a simulation carried out on 1.2 million SenseCam images captured by a group of 20 individuals. We compare the dominant colours that different groups of people are exposed to and show that useful inferences can be made from this data. We believe our prototype may be valuable in future experiments to capture colour data correlated with an individual's mood.
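A minimal sketch of how the dominant-colour simulation over SenseCam images might be carried out (coarse quantisation followed by taking the most frequent bin; the function name and palette size are our assumptions, not the authors' procedure):

    import numpy as np

    def dominant_colour(image, levels=4):
        """Return the most frequent quantised RGB colour in an image.

        image  : H x W x 3 uint8 array
        levels : quantisation levels per channel (coarse palette)
        """
        step = 256 // levels
        quantised = (image // step).reshape(-1, 3)
        colours, counts = np.unique(quantised, axis=0, return_counts=True)
        # map the winning bin back to a representative RGB value (bin centre)
        return tuple(int(c) * step + step // 2 for c in colours[counts.argmax()])

Running a routine like this over each frame would reduce a full lifelog image to the single "pixel" the prototype camera is designed to capture.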
Abstract:
European badgers (Meles meles) are an important part of the Irish ecosystem; they are a component of Ireland’s native fauna and are afforded protection by national and international laws. The species is also a reservoir host for bovine tuberculosis (bTB) and implicated in the epidemiology of bTB in cattle. Due to this latter point, badgers have been culled in the Republic of Ireland (ROI) in areas where persistent cattle bTB outbreaks exist. The population dynamics of badgers are therefore of great pure and applied interest. The studies within this thesis used large datasets and a number of analytical approaches to uncover essential elements of badger populations in the ROI. Furthermore, a review and meta-analysis of all available data on Irish badgers was completed to give a framework from which key knowledge gaps and future directions could be identified (Chapter 1). One main finding suggested that badger densities are significantly reduced in areas of repeated culling, as revealed through declining trends in signs of activity (Chapter 2) and capture numbers (Chapter 2 and Chapter 3). Despite this, the trappability of badgers was shown to be lower than previously thought. This indicates that management programmes would require repeated long-term efforts to be effective (Chapter 4). Mark-recapture modelling of a population (sample area: 755 km²) suggested that mean badger density was typical of continental European populations, but substantially lower than British populations (Chapter 4). Badger movement patterns indicated that most of the population exhibited site fidelity. Long-distance movements were also recorded, the longest of which (20.1 km) was the greatest displacement of an Irish badger currently known (Chapter 5). The studies presented in this thesis allow for the development of more robust models of the badger population at national scales (see Future Directions). Through the use of large-scale datasets, future models will facilitate informed sustainable planning for disease control.
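For readers unfamiliar with the mark-recapture approach used in Chapter 4, the simplest two-sample abundance estimator (Chapman's bias-corrected form of the Lincoln-Petersen index, given here only as background rather than as the thesis's actual model) is:

    def chapman_estimate(n1, n2, m2):
        """Chapman's bias-corrected Lincoln-Petersen abundance estimate.

        n1 : animals captured and marked in the first session
        n2 : animals captured in the second session
        m2 : marked animals recaptured in the second session
        """
        return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

Realistic badger analyses use multi-session models that also estimate trappability, but the same logic of scaling abundance up from recapture proportions underlies them.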
Abstract:
BACKGROUND: Biological processes occur on a vast range of time scales, and many of them occur concurrently. As a result, system-wide measurements of gene expression have the potential to capture many of these processes simultaneously. The challenge, however, is to separate these processes and time scales in the data. In many cases the number of processes and their time scales are unknown. This issue is particularly relevant to developmental biologists, who are interested in processes such as growth, segmentation and differentiation, which can all take place simultaneously, but on different time scales. RESULTS: We introduce a flexible and statistically rigorous method for detecting different time scales in time-series gene expression data, by identifying expression patterns that are temporally shifted between replicate datasets. We apply our approach to a Saccharomyces cerevisiae cell-cycle dataset and an Arabidopsis thaliana root developmental dataset. In both datasets our method successfully detects processes operating on several different time scales. Furthermore we show that many of these time scales can be associated with particular biological functions. CONCLUSIONS: The spatiotemporal modules identified by our method suggest the presence of multiple biological processes, acting at distinct time scales in both the Arabidopsis root and yeast. Using similar large-scale expression datasets, the identification of biological processes acting at multiple time scales in many organisms is now possible.
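A minimal sketch of the shift-detection idea (plain lagged correlation between replicate series; the paper's method is statistically more rigorous than this, and the function name and search window are our own):

    import numpy as np

    def best_shift(series_a, series_b, max_shift=10):
        """Return the lag of series_b relative to series_a that maximises
        Pearson correlation (illustrative shift detection only)."""
        a = (series_a - series_a.mean()) / series_a.std()
        b = (series_b - series_b.mean()) / series_b.std()
        best_lag, best_corr = 0, -np.inf
        for lag in range(-max_shift, max_shift + 1):
            if lag >= 0:
                x, y = a[lag:], b[:len(b) - lag]
            else:
                x, y = a[:len(a) + lag], b[-lag:]
            if len(x) < 3:
                continue
            corr = np.corrcoef(x, y)[0, 1]
            if corr > best_corr:
                best_lag, best_corr = lag, corr
        return best_lag, best_corr

Genes whose replicate profiles align best at a non-zero lag are candidates for temporally shifted expression patterns of the kind the method looks for.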
Abstract:
BACKGROUND: The National Comprehensive Cancer Network and the American Society of Clinical Oncology have established guidelines for the treatment and surveillance of colorectal cancer (CRC), respectively. Considering these guidelines, an accurate and efficient method is needed to measure receipt of care. METHODS: The accuracy and completeness of Veterans Health Administration (VA) administrative data were assessed by comparing them with data manually abstracted during the Colorectal Cancer Care Collaborative (C4) quality improvement initiative for 618 patients with stage I-III CRC. RESULTS: The VA administrative data contained gender, marital, and birth information for all patients, but race information was missing for 62.1% of patients. The percent agreement for demographic variables ranged from 98.1% to 100%. The kappa statistic for receipt of treatments ranged from 0.21 to 0.60, and there was 96.9% agreement on the date of surgical resection. The percentages of post-diagnosis surveillance events recorded in C4 that were also present in VA administrative data were 76.0% for colonoscopy, 84.6% for physician visits, and 26.3% for carcinoembryonic antigen (CEA) tests. CONCLUSIONS: VA administrative data are accurate and complete for non-race demographic variables, receipt of CRC treatment, colonoscopy, and physician visits; but alternative data sources may be necessary to capture patient race and receipt of CEA tests.
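For reference, the agreement measure cited above is Cohen's kappa, which corrects raw percent agreement for the agreement expected by chance; a minimal implementation for two sequences of categorical ratings (illustrative, not the study's analysis code):

    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        """Cohen's kappa for two equal-length sequences of categorical ratings."""
        n = len(ratings_a)
        observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
        freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
        expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
        return (observed - expected) / (1 - expected)

Values of 0.21-0.60, as reported for the treatment variables, are conventionally read as fair to moderate agreement beyond chance.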
Abstract:
The neutron multidetector DéMoN has been used to investigate the symmetric splitting dynamics in the reactions ^58,64Ni + ^208Pb with excitation energies ranging from 65 to 186 MeV for the composite system. An analysis based on the new backtracing technique has been applied on the neutron data to determine the two-dimensional correlations between the parent composite system initial thermal energy (E_th^CN) and the total neutron multiplicity (ν_tot), and between pre- and post-scission neutron multiplicities (ν_pre and ν_post, respectively). The ν_pre distribution shape indicates the possible coexistence of fast-fission and fusion-fission for the system ^58Ni + ^208Pb (E_beam = 8.86 A MeV). The analysis of the neutron multiplicities in the framework of the combined dynamical statistical model (CDSM) gives a reduced friction coefficient β = 23 (+25, −12) × 10^21 s^-1, above the one-body dissipation limit. The corresponding fission time is τ_f = 40 (+46, −20) × 10^-21 s. © 1999 Elsevier Science B.V. All rights reserved.
Abstract:
This short position paper considers issues in developing Data Architecture for the Internet of Things (IoT) through the medium of an exemplar project, Domain Expertise Capture in Authoring and Development Environments (DECADE). A brief discussion sets out the background to the IoT and the development of the distinction between things and computers. The paper makes a strong argument to avoid reinventing the wheel, and instead to reuse approaches to distributed heterogeneous data architectures and the lessons learned from that work, applying them to this situation. DECADE requires an autonomous recording system, local data storage, a semi-autonomous verification model, a sign-off mechanism, and qualitative and quantitative analysis carried out when and where required through a web-service architecture, based on ontology and analytic agents, with a self-maintaining ontology model. To deliver this, we describe a web-service architecture combining a distributed data warehouse, web services for analysis agents, ontology agents, and a verification engine, with a centrally verified outcome database maintained by a certifying body for qualification/professional status.
Abstract:
The Continuous Plankton Recorder has been deployed on a seasonal basis in the north Pacific since 2000, accumulating a database of abundance measurements for over 290 planktonic taxa in over 3,500 processed samples. There is an additional archive of over 10,000 samples available for further analyses. Exxon Valdez Oil Spill Trustee Council financial support has contributed to about half of this tally, through four projects funded since 2002. Time series of zooplankton variables for sub-regions of the survey area are presented together with abstracts of eight papers published using data from these projects. The time series covers a period when the dominant climate signal in the north Pacific, the Pacific Decadal Oscillation (PDO), switched with unusual frequency between warm/positive states (pre-1999 and 2003-2006) and cool/negative states (1999-2002 and 2007). The CPR data suggest that cool (PDO negative) years show higher biomass on the shelf and lower biomass in the open ocean, while the reverse is true in warm (PDO positive) years, with lower shelf biomass (except 2005) and higher oceanic biomass. In addition, there was a delay in the plankton increase on the Alaskan shelf in the colder spring of 2007, compared to the warmer springs of the preceding years. In warm years, smaller species of copepods, which lack lipid reserves, are also more common. Availability of the zooplankton prey to higher trophic levels (including those that society values highly) is therefore dependent on the timing of increase and peak abundance, ease of capture, and nutritional value. Previously published studies using these data highlight the wide-ranging applicability of CPR data and include collaborative studies on phenology in the key copepod species Neocalanus plumchrus, descriptions of distributions of decapod larvae and euphausiid species, the effects of hydrographic features such as mesoscale eddies and the North Pacific Current on plankton populations, and a molecular-based investigation of macro-scale population structure in N. cristatus. The future funding situation is uncertain, but the value of the data and studies accumulated so far is considerable and sets a strong foundation for further studies on plankton dynamics and interactions with higher trophic levels in the northern Gulf of Alaska.