17 results for statistical analysis
in the Cambridge University Engineering Department Publications Database
Abstract:
Spatial normalisation is a key element of statistical parametric mapping and related techniques for analysing cohort statistics on voxel arrays and surfaces. The normalisation process involves aligning each individual specimen to a template using some sort of registration algorithm. Any misregistration will result in data being mapped onto the template at the wrong location. At best, this will introduce spatial imprecision into the subsequent statistical analysis. At worst, when the misregistration varies systematically with a covariate of interest, it may lead to false statistical inference. Since misregistration generally depends on the specimen's shape, we investigate here the effect of allowing for shape as a confound in the statistical analysis, with shape represented by the dominant modes of variation observed in the cohort. In a series of experiments on synthetic surface data, we demonstrate how allowing for shape can reveal true effects that were previously masked by systematic misregistration, and also guard against misinterpreting systematic misregistration as a true effect. We introduce some heuristics for disentangling misregistration effects from true effects, and demonstrate the approach's practical utility in a case study of the cortical bone distribution in 268 human femurs.
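A minimal sketch of the kind of analysis described, assuming per-vertex measurements, a covariate of interest, and shape modes obtained by PCA of the cohort's vertex coordinates; all names, sizes and data layouts below are illustrative assumptions, not the paper's code.

```python
# Hedged sketch: shape modes as confound covariates in a per-vertex GLM.
import numpy as np

rng = np.random.default_rng(0)
n_specimens, n_vertices, n_modes = 268, 5000, 10

# Hypothetical inputs: per-vertex measurements after spatial normalisation,
# a covariate of interest, and specimen shapes as stacked vertex coordinates.
Y = rng.normal(size=(n_specimens, n_vertices))        # e.g. cortical thickness
covariate = rng.normal(size=n_specimens)              # e.g. an external load score
shapes = rng.normal(size=(n_specimens, n_vertices * 3))

# Dominant modes of shape variation via PCA (SVD of the centred shape matrix).
shapes_c = shapes - shapes.mean(axis=0)
_, _, Vt = np.linalg.svd(shapes_c, full_matrices=False)
shape_scores = shapes_c @ Vt[:n_modes].T              # per-specimen mode weights

# Design matrix: intercept, covariate of interest, and shape-mode confounds.
X = np.column_stack([np.ones(n_specimens), covariate, shape_scores])

# Ordinary least squares fitted at every vertex simultaneously.
beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
effect_of_interest = beta[1]                          # one coefficient per vertex
```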
Abstract:
A novel launch scheme is proposed for multimode-fiber (MMF) links. Enhanced performance in 10 Gb/s MMF links using electronic equalization is demonstrated by statistical analysis of installed-base fiber and an experimental investigation. © 2007 Optical Society of America.
Abstract:
Rigorous statistical analysis is applied for the first time to identify optimal launch conditions and carrier frequencies for SCM transmission over worst-case MMF. The feasibility of multichannel schemes for 10 Gb/s over 300 m is demonstrated. © 2005 Optical Society of America.
Abstract:
We have constructed plasmids to be used for in vitro signature-tagged mutagenesis (STM) of Campylobacter jejuni and used these to generate STM libraries in three different strains. Statistical analysis of the transposon insertion sites in the C. jejuni NCTC 11168 chromosome and the plasmids of strain 81-176 indicated that their distribution was not uniform. Visual inspection of the distribution suggested that deviation from uniformity was not due to preferential integration of the transposon into a limited number of hot spots but rather that there was a bias towards insertions around the origin. We screened pools of mutants from the STM libraries for their ability to colonize the ceca of 2-week-old chickens harboring a standardized gut flora. We observed high-frequency random loss of colonization-proficient mutants. When cohoused birds were individually inoculated with different tagged mutants, random loss of colonization-proficient mutants was similarly observed, as was extensive bird-to-bird transmission of mutants. This indicates that the nature of campylobacter colonization in chickens is complex and dynamic, and we hypothesize that bottlenecks in the colonization process and between-bird transmission account for these observations.
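As an illustration of the uniformity analysis mentioned, the sketch below compares insertion positions against a uniform distribution with a Kolmogorov-Smirnov test; the simulated positions and the test choice are assumptions for illustration, not the study's actual pipeline.

```python
# Hedged sketch: testing whether transposon insertion positions deviate
# from a uniform distribution along the chromosome.
import numpy as np
from scipy import stats

genome_length = 1_641_481          # published size of the NCTC 11168 chromosome (bp)
rng = np.random.default_rng(1)

# Hypothetical insertion sites, biased towards the origin (position 0)
# to mimic the reported deviation from uniformity.
insertions = (rng.exponential(scale=0.2, size=300) % 1.0) * genome_length

# Compare the empirical distribution of positions against the uniform
# distribution on [0, genome_length).
stat, p_value = stats.kstest(insertions, "uniform", args=(0, genome_length))
print(f"KS statistic = {stat:.3f}, p = {p_value:.2e}")
```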
Abstract:
The technique of Subcarrier Modulation is assessed by statistical analysis as a viable solution to broadband data transmission over dispersion-limited multimode fibre. It is shown that a suitable passband region for the transmission of 2.5 Gb/s channels exists at 5 GHz in greater than 80% of worst-case fibres under standard SMF/MMF launch conditions. © 2002 Optical Society of America.
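A hedged sketch of the style of statistical assessment described: draw many random multimode-fibre responses and estimate the fraction in which a passband around a candidate subcarrier remains usable. The modal-delay model, thresholds and numbers below are illustrative assumptions only, not the paper's fibre model.

```python
# Hedged Monte Carlo sketch: fraction of fibres with a usable passband near 5 GHz.
import numpy as np

rng = np.random.default_rng(2)
n_fibres = 10_000
n_modes = 20
freqs = np.linspace(0.1e9, 10e9, 400)            # 0.1-10 GHz
subcarrier = 5e9                                  # candidate SCM carrier
channel_bw = 2.5e9                                # rough bandwidth of a 2.5 Gb/s channel

usable = 0
for _ in range(n_fibres):
    # Random mode-group delays and powers for a short link (toy model).
    delays = rng.normal(0.0, 0.5e-9, n_modes)     # seconds
    powers = rng.dirichlet(np.ones(n_modes))
    # Baseband frequency response as a sum of delayed mode contributions.
    H = np.abs(powers @ np.exp(-2j * np.pi * np.outer(delays, freqs)))
    # Passband test: response within the channel band stays above -10 dB.
    band = (freqs > subcarrier - channel_bw / 2) & (freqs < subcarrier + channel_bw / 2)
    if 20 * np.log10(H[band].min() + 1e-12) > -10:
        usable += 1

print(f"Fraction of fibres with a usable 5 GHz passband: {usable / n_fibres:.2%}")
```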
Abstract:
We present a novel, implementation-friendly and occlusion-aware semi-supervised video segmentation algorithm using tree-structured graphical models, which delivers pixel labels along with their uncertainty estimates. Our motivation to employ supervision is to tackle a task-specific segmentation problem where the semantic objects are pre-defined by the user. The video model we propose for this problem is based on a tree-structured approximation of a patch-based undirected mixture model, which includes a novel time series and a soft-label Random Forest classifier participating in a feedback mechanism. We demonstrate the efficacy of our model in cutting out foreground objects and in multi-class segmentation problems in lengthy and complex road scene sequences. Our results have wide applicability, including harvesting labelled video data for training discriminative models, shape/pose/articulation learning and large-scale statistical analysis to develop priors for video segmentation. © 2011 IEEE.
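One component mentioned, the soft-label Random Forest, can be sketched as a classifier producing per-pixel class probabilities from which labels and an uncertainty estimate are derived. The features, class count and array sizes below are placeholders, not the paper's configuration.

```python
# Hedged sketch: soft (probabilistic) pixel labels from a Random Forest,
# with per-pixel entropy as a simple uncertainty proxy.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)

# Hypothetical training data: per-pixel feature vectors (e.g. colour + texture)
# with user-defined semantic labels from a handful of annotated frames.
X_train = rng.normal(size=(5000, 16))
y_train = rng.integers(0, 4, size=5000)           # 4 pre-defined object classes

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)

# Soft labels for the pixels of an unannotated frame.
X_pixels = rng.normal(size=(10_000, 16))
proba = forest.predict_proba(X_pixels)            # shape (n_pixels, n_classes)

labels = proba.argmax(axis=1)
entropy = -(proba * np.log(proba + 1e-12)).sum(axis=1)   # per-pixel uncertainty
```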
Abstract:
There is increasing adoption of computer-based tools to support the product development process. Tools include computer-aided design, computer-aided manufacture, systems engineering and product data management systems. The fact that companies choose to invest in tools might be regarded as evidence that tools, in aggregate, are perceived to possess business value through their application to engineering activities. Yet the ways in which value accrues from tool technology are poorly understood.
This report records the proceedings of an international workshop during which some novel approaches to improving our understanding of this problem of tool valuation were presented and debated. The value of methods and processes was also discussed. The workshop brought together British, Dutch, German and Italian researchers. The presenters included speakers from industry and academia (the University of Cambridge, the University of Magdeburg and the Politecnico di Torino).
The work presented showed great variety. Research methods included case studies, questionnaires, statistical analysis, semi-structured interviews, deduction, inductive reasoning, the recording of anecdotes and analogies. The presentations drew on financial investment theory, the industrial experience of workshop participants, discussions with students developing tools, modern economic theories and speculation on the effects of company capabilities.
Abstract:
In order to better understand stratified combustion, the propagation of a flame through a stratified mixture field under laminar and turbulent flow conditions has been studied using combined PIV/PLIF techniques. Great emphasis was placed on developing methods to improve the accuracy of local measurements of flame propagation. In particular, a new PIV approach has been developed to measure the local fresh-gas velocity near the preheat zone of the flame front. To improve the resolution of the measurement, the shape of the interrogation window was continuously adapted based on the local flame topology and the gas expansion effect. Statistical analysis of local measurements conditioned on the local equivalence ratio allows the properties of flame propagation subjected to mixture stratification to be characterized in laminar and turbulent flows, in particular highlighting the memory effect.
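The conditioning step described, statistics of local propagation measurements conditioned on the local equivalence ratio, can be illustrated with a simple binned analysis; the synthetic data, bin edges and functional form below are assumptions for illustration only.

```python
# Hedged sketch: conditional statistics of a local flame quantity
# binned by the local equivalence ratio.
import numpy as np

rng = np.random.default_rng(4)
n_samples = 20_000

# Hypothetical joint measurements at points along the flame front:
# local equivalence ratio (from PLIF) and local displacement speed (from PIV).
phi = rng.uniform(0.6, 1.2, n_samples)
speed = 0.4 * np.exp(-((phi - 1.05) ** 2) / 0.05) + rng.normal(0, 0.03, n_samples)

# Conditional mean and scatter of the displacement speed in equivalence-ratio bins.
bins = np.linspace(0.6, 1.2, 13)
idx = np.digitize(phi, bins) - 1
for b in range(len(bins) - 1):
    sel = idx == b
    if sel.any():
        print(f"phi in [{bins[b]:.2f}, {bins[b + 1]:.2f}): "
              f"mean speed = {speed[sel].mean():.3f} m/s, "
              f"std = {speed[sel].std():.3f} m/s")
```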