262 results for litter mixture
An external field prior for the hidden Potts model with application to cone-beam computed tomography
Abstract:
In images with low contrast-to-noise ratio (CNR), the information gain from the observed pixel values can be insufficient to distinguish foreground objects. A Bayesian approach to this problem is to incorporate prior information about the objects into a statistical model. A method for representing spatial prior information as an external field in a hidden Potts model is introduced. This prior distribution over the latent pixel labels is a mixture of Gaussian fields, centred on the positions of the objects at a previous point in time. It is particularly applicable in longitudinal imaging studies, where the manual segmentation of one image can be used as a prior for automatic segmentation of subsequent images. The method is demonstrated by application to cone-beam computed tomography (CT), an imaging modality that exhibits distortions in pixel values due to X-ray scatter. The external field prior results in a substantial improvement in segmentation accuracy, reducing the mean pixel misclassification rate for an electron density phantom from 87% to 6%. The method is also applied to radiotherapy patient data, demonstrating how to derive the external field prior in a clinical context.
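The abstract does not give the full model specification; as a rough schematic, an external field enters a hidden Potts prior as a pixel-wise term alongside the usual interaction term, with the field derived from a mixture of Gaussian densities centred on the earlier object positions:

```latex
% Schematic form only; the paper's exact parameterisation is not stated in the abstract.
\begin{align}
  \pi(\mathbf{z} \mid \alpha, \beta) &\propto
    \exp\Big\{ \sum_{i} \alpha_i(z_i)
      + \beta \sum_{i \sim j} \delta(z_i, z_j) \Big\}, \\
  \alpha_i(k) &= \log \sum_{m \in \mathcal{O}_k}
      w_m \, \mathcal{N}\!\big(s_i \mid \mu_m, \Sigma_m\big),
\end{align}
```

where \(z_i\) is the latent label of pixel \(i\), \(\beta\) is the inverse temperature, \(i \sim j\) ranges over neighbouring pixel pairs, \(s_i\) is the spatial coordinate of pixel \(i\), and the Gaussian components for class \(k\) are centred on the positions \(\mu_m\) of the corresponding objects in the previously segmented image.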
Abstract:
This paper examines the issue of face, speaker and bi-modal authentication in mobile environments when there is significant condition mismatch. We introduce this mismatch by enrolling client models on high quality biometric samples obtained on a laptop computer and authenticating them on lower quality biometric samples acquired with a mobile phone. To perform these experiments we develop three novel authentication protocols for the large publicly available MOBIO database. We evaluate state-of-the-art face, speaker and bi-modal authentication techniques and show that inter-session variability modelling using Gaussian mixture models provides a consistently robust system for face, speaker and bi-modal authentication. It is also shown that multi-algorithm fusion provides a consistent performance improvement for face, speaker and bi-modal authentication. Using this bi-modal multi-algorithm system we derive a state-of-the-art authentication system that obtains a half total error rate of 6.3% and 1.9% for Female and Male trials, respectively.
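As a minimal illustration of the reported metric, the half total error rate (HTER) is the average of the false rejection and false acceptance rates at a fixed threshold; the sketch below also shows a simple linear score-level fusion of two modalities. The function names, fusion weight and synthetic scores are illustrative only and are not taken from the MOBIO protocols.

```python
import numpy as np

def hter(genuine_scores, impostor_scores, threshold):
    """Half total error rate: mean of false rejection and false acceptance rates."""
    frr = np.mean(np.asarray(genuine_scores) < threshold)    # genuine trials rejected
    far = np.mean(np.asarray(impostor_scores) >= threshold)  # impostor trials accepted
    return 0.5 * (frr + far)

def fuse(scores_a, scores_b, w=0.5):
    """Illustrative linear score-level fusion of two modalities or algorithms."""
    return w * np.asarray(scores_a) + (1.0 - w) * np.asarray(scores_b)

# Example with synthetic scores (not MOBIO data):
rng = np.random.default_rng(0)
genuine = fuse(rng.normal(2, 1, 1000), rng.normal(2, 1, 1000))
impostor = fuse(rng.normal(0, 1, 1000), rng.normal(0, 1, 1000))
print(hter(genuine, impostor, threshold=1.0))
```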
Abstract:
In this paper we propose a novel approach to multi-action recognition that performs joint segmentation and classification. This approach models each action with a Gaussian mixture over robust low-dimensional action features. Segmentation is achieved by performing classification on overlapping temporal windows, which are then merged to produce the final result. This approach is considerably less complicated than previous methods which use dynamic programming or computationally expensive hidden Markov models (HMMs). Initial experiments on a stitched version of the KTH dataset show that the proposed approach achieves an accuracy of 78.3%, outperforming a recent HMM-based approach which obtained 71.2%.
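A minimal sketch of the window-and-merge idea, assuming one Gaussian mixture is trained per action class over its low-dimensional features; the window length, step size and merging rule below are illustrative placeholders rather than the paper's settings, and feature extraction is not shown.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def train_action_gmms(features_per_action, n_components=8):
    """Fit one Gaussian mixture per action class on its feature vectors."""
    return {a: GaussianMixture(n_components, covariance_type='diag',
                               random_state=0).fit(X)
            for a, X in features_per_action.items()}

def classify_windows(frames, gmms, win=32, step=8):
    """Score overlapping temporal windows against each action GMM."""
    labels = []
    for start in range(0, len(frames) - win + 1, step):
        window = frames[start:start + win]
        scores = {a: g.score_samples(window).sum() for a, g in gmms.items()}
        labels.append((start, max(scores, key=scores.get)))
    return labels

def merge_windows(labels):
    """Merge consecutive windows sharing a label into action segments."""
    segments = []
    for start, lab in labels:
        if segments and segments[-1][2] == lab:
            segments[-1][1] = start
        else:
            segments.append([start, start, lab])
    return [(s, e, lab) for s, e, lab in segments]
```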
Abstract:
The quality of environmental decisions should be gauged according to managers' objectives. Management objectives generally seek to maximize quantifiable measures of system benefit, for instance population growth rate. Reaching these goals often requires a certain degree of learning about the system. Learning can occur by using management action in combination with a monitoring system. Furthermore, actions can be chosen strategically to obtain specific kinds of information. Formal decision making tools can choose actions to favor such learning in two ways: implicitly via the optimization algorithm that is used when there is a management objective (for instance, when using adaptive management), or explicitly by quantifying knowledge and using it as the fundamental project objective, an approach new to conservation. This paper outlines three conservation project objectives - a pure management objective, a pure learning objective, and an objective that is a weighted mixture of these two. We use eight optimization algorithms to choose actions that meet project objectives and illustrate them in a simulated conservation project. The algorithms provide a taxonomy of decision making tools in conservation management when there is uncertainty surrounding competing models of system function. The algorithms build upon each other such that their differences are highlighted and practitioners may see where their decision making tools can be improved. © 2010 Elsevier Ltd.
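The weighted mixture of the two objectives can be written generically as below; the precise form of the learning term used in the paper is not given in the abstract, so the information-gain notation here is only a placeholder.

```latex
% Generic form of an objective mixing management value and information gain
% (the paper's exact formulation is not stated in the abstract).
\begin{equation}
  J_w(a) \;=\; w \, \mathbb{E}\big[ V_{\mathrm{mgmt}}(a) \big]
        \;+\; (1 - w) \, \mathbb{E}\big[ \mathrm{IG}(a) \big],
  \qquad 0 \le w \le 1,
\end{equation}
```

where \(w = 1\) recovers the pure management objective, \(w = 0\) the pure learning objective, and \(\mathrm{IG}(a)\) denotes the expected information gained about the competing system models by taking action \(a\) and monitoring the outcome.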
Abstract:
XD: Experience Design Magazine is an interdisciplinary publication that focuses on the concept and practice of ‘experience design’, as a holistic concept separate from the well-known concept of ‘user experience’. The magazine aims to present a mixture of interrelated perspectives from industry and academic researchers with practicing designers and managers. The informal, journalistic style of the publication aims to simultaneously provide a platform for researchers and other writers to promote their work in an applied way for global impact, and for industry designers to present practical perspectives to inspire a global research audience. Each issue will feature a series of projects, interviews, visuals, reviews and creative inspiration – all of which help everyone understand why experience design is important, who does it and where, how experience design is done in practice and how experience design research can enhance practice.
Contents of Issue 1:
Miller, F. Developing Principles for Designing Optimal Experiences
Lavallee, P. Design for Emotions
Khan, H. The Entropii XD Framework
Bowe, M. & Silvers, A. First Steps in Experience Design
Leaper, N. Learning by Design
Forrest, R. & Roberts, T. Interpretive Design: Think, Do, Feel
Tavakkoli, P. Working Hard at Play
Stow, C. Designing Engaging Learning Experiences
Wood, M. Enhance Your Travel Experience Using Apps
Miller, F. Humanizing It
Wood, M. Designing the White Night Experience
Newberry, P. & Farnham, K. Experience Design Book Excerpt
Abstract:
Background: The expression of biomass-degrading enzymes (such as cellobiohydrolases) in transgenic plants has the potential to reduce the costs of biomass saccharification by providing a source of enzymes to supplement commercial cellulase mixtures. Cellobiohydrolases are the main enzymes in commercial cellulase mixtures. In the present study, a cellobiohydrolase was expressed in transgenic corn stover leaf and assessed as an additive for two commercial cellulase mixtures for the saccharification of pretreated sugar cane bagasse obtained by different processes.
Results: Recombinant cellobiohydrolase in the senescent leaves of transgenic corn was extracted using a simple buffer with no concentration step. The extract significantly enhanced the performance of Celluclast 1.5 L (a commercial cellulase mixture) by up to fourfold on sugar cane bagasse pretreated at the pilot scale using a dilute sulfuric acid steam explosion process, compared to the commercial cellulase mixture on its own. The extracts also enhanced the performance of Cellic CTec2 (a commercial cellulase mixture) by up to fourfold on a range of residues from sugar cane bagasse pretreated at the laboratory (using acidified ethylene carbonate/ethylene glycol, 1-butyl-3-methylimidazolium chloride, and ball-milling) and pilot (dilute sodium hydroxide and glycerol/hydrochloric acid steam explosion) scales. Using tap water as a solvent, under conditions that mimic an industrial process, we extracted about 90% of the recombinant cellobiohydrolase from senescent, transgenic corn stover leaf with minimal tissue disruption.
Conclusions: The accumulation of recombinant cellobiohydrolase in senescent, transgenic corn stover leaf is a viable strategy to reduce the saccharification cost associated with the production of fermentable sugars from pretreated biomass. We envisage an industrial-scale process in which transgenic plants provide both fibre and biomass-degrading enzymes for pretreatment and enzymatic hydrolysis, respectively.
Abstract:
A series of kaolinite–methanol complexes with different basal spacings were synthesized using guest displacement reactions of the intercalation precursors kaolinite–N-methylformamide (Kaol–NMF), kaolinite–urea (Kaol–U), or kaolinite–dimethylsulfoxide (Kaol–DMSO), with methanol (Me). The interaction of methanol with kaolinite was examined using X-ray diffraction (XRD), infrared spectroscopy (IR), and nuclear magnetic resonance (NMR). Kaolinite (Kaol) was initially intercalated with N-methylformamide (NMF), urea (U), or dimethylsulfoxide (DMSO); subsequent reaction with Me formed the final kaolinite–methanol (Kaol–Me) complexes, characterized by basal spacings ranging between 8.6 Å and 9.6 Å, depending on the pre-intercalated reagent. Based on a comparative analysis of the three Kaol–Me displacement intercalation complexes, three types of Me intercalation products were suggested to have been present in the interlayer space of Kaol: (1) molecules grafted onto a kaolinite octahedral sheet in the form of a methoxy group (Al-O-C bond); (2) mobile Me and/or water molecules kept in the interlayer space via hydrogen bonds that could be partially removed during drying; and (3) a mixture of types 1 and 2, with the methoxy group (Al-O-C bond) grafted onto the Kaol sheet and mobile Me and/or water molecules coexisting in the system after the displacement reaction by Me. Various structural models reflecting four possible Kaol–Me complexes were constructed for use in a complementary computational study. Results from the calculation of the methanol–kaolinite interaction indicate that the hydroxyl oxygen atom of methanol plays the dominant role in the stabilization and localization of the molecule intercalated in the interlayer space, and that the presence of water in the intercalated Kaol layer is inevitable.
Abstract:
A software-based environment was developed to provide practical training in medical radiation principles and safety. The Virtual Radiation Laboratory application allowed students to conduct virtual experiments using simulated diagnostic and radiotherapy X-ray generators. The experiments were designed to teach students about the inverse square law, half value layer and radiation protection measures, and utilised genuine clinical and experimental data. Evaluation of the application was conducted in order to ascertain the impact of the software on students’ understanding, satisfaction and collaborative learning skills, and also to determine potential further improvements to the software and guidelines for its continued use. Feedback was gathered via an anonymous online survey consisting of a mixture of Likert-style questions and short-answer open questions. Student feedback was highly positive, with 80 % of students reporting increased understanding of radiation protection principles. Furthermore, 72 % enjoyed using the software and 87 % of students felt that the project facilitated collaboration within small groups. The main themes arising in the qualitative feedback comments related to efficiency and effectiveness of teaching, safety of environment, collaboration and realism. Staff and students both report gains in efficiency and effectiveness associated with the virtual experiments. In addition, students particularly value the visualisation of “invisible” physical principles and the increased opportunity for experimentation and collaborative problem-based learning. Similar ventures will benefit from adopting an approach that allows for individual experimentation while visualising challenging concepts.
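For context, the two physical principles the virtual experiments target reduce to short calculations. The sketch below is a generic numeric illustration; the numbers are made up and are not taken from the application's clinical data.

```python
def dose_rate(rate_at_d0, d0, d):
    """Inverse square law: dose rate from a point source scales as (d0/d)^2."""
    return rate_at_d0 * (d0 / d) ** 2

def transmitted_fraction(thickness_mm, hvl_mm):
    """Attenuation expressed in half value layers (HVL): each HVL halves the beam."""
    return 0.5 ** (thickness_mm / hvl_mm)

# Example: 100 uGy/h at 1 m, measured at 2 m behind 4 mm of a material with HVL 2 mm
rate_2m = dose_rate(100.0, 1.0, 2.0)              # 25 uGy/h from distance alone
print(rate_2m * transmitted_fraction(4.0, 2.0))   # ~6.25 uGy/h after 2 HVLs
```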
Abstract:
Purpose: The role of fine lactose in the dispersion of salmeterol xinafoate (SX) from lactose mixtures was studied by modifying the fine lactose concentration on the surface of the lactose carriers using wet decantation.
Methods: Fine lactose was removed from lactose carriers by wet decantation using ethanol saturated with lactose. Particle sizing was achieved by laser diffraction. Fine particle fractions (FPFs) were determined by Twin Stage Impinger using a 2.5% SX mixture, and SX was analyzed by a validated high-performance liquid chromatography method. Adhesion forces between probes of SX and silica and the lactose surfaces were determined by atomic force microscopy.
Results: FPFs of SX were related to fine lactose concentration in the mixture for inhalation grade lactose samples. Reductions in FPF (2-4-fold) of Aeroflo 95 and 65 were observed after removing fine lactose by wet decantation; FPFs reverted to original values after addition of micronized lactose to decanted mixtures. FPFs of SX of sieved and decanted fractions of Aeroflo carriers were significantly different (p < 0.001). The relationship between FPF and fine lactose concentration was linear. Decanted lactose demonstrated surface modification through increased SX-lactose adhesion forces; however, any surface modification other than removal of fine lactose only slightly influenced FPF.
Conclusions: Fine lactose played a key and dominating role in controlling FPF. SX to fine lactose ratios influenced dispersion of SX, with maximum dispersion occurring as the ratio approached unity.
Abstract:
A new transdimensional Sequential Monte Carlo (SMC) algorithm called SMCVB is proposed. In an SMC approach, a weighted sample of particles is generated from a sequence of probability distributions which ‘converge’ to the target distribution of interest, in this case a Bayesian posterior distribution. The approach is based on the use of variational Bayes to propose new particles at each iteration of the SMCVB algorithm in order to target the posterior more efficiently. The variational-Bayes-generated proposals are not limited to a fixed dimension. This means that the weighted particle sets that arise can have varying dimensions, thereby allowing us the option to also estimate an appropriate dimension for the model. This novel algorithm is outlined within the context of finite mixture model estimation. This provides a less computationally demanding alternative to using reversible jump Markov chain Monte Carlo kernels within an SMC approach. We illustrate these ideas in a simulated data analysis and in applications.
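The abstract gives the algorithm only in outline. A generic way to write the weight update when independent proposals are drawn from a fitted variational approximation targeting a tempered posterior is sketched below; the SMCVB updates in the paper may differ in detail.

```latex
% Schematic tempered-target SMC with independent variational proposals.
\begin{align}
  \pi_t(\theta, k) &\propto p(\theta, k)\, p(y \mid \theta, k)^{\gamma_t},
  \qquad 0 = \gamma_0 < \gamma_1 < \dots < \gamma_T = 1, \\
  w_t^{(i)} &\propto \frac{\pi_t\big(\theta_t^{(i)}, k_t^{(i)}\big)}
                          {q_t\big(\theta_t^{(i)}, k_t^{(i)}\big)},
  \qquad \big(\theta_t^{(i)}, k_t^{(i)}\big) \sim q_t,
\end{align}
```

where \(k\) indexes the number of mixture components, \(q_t\) is the variational-Bayes approximation fitted at iteration \(t\) (which may place mass on different values of \(k\)), and the weighted particles approximate the posterior over both the parameters and the model dimension.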
Abstract:
The scale of environmental problems in China is clearly evident. This paper analyses foreign direct investment (FDI) in China with a finite mixture model, also known as a latent class model, to understand the relationship between FDI and several pollutants. FDI is regressed on a set of covariates that includes pollutant levels. The results reveal that FDI is affected by pollutants, and that there are cases in which reducing pollution deters foreign investment in China.
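As a rough illustration of a finite mixture (latent class) regression of the kind described, the EM sketch below fits K Gaussian regression classes. The two-class default, variable names and prior-free specification are assumptions for illustration, not the paper's model; in an application of this type, y might be an FDI measure and X covariates including pollutant levels.

```python
import numpy as np

def mixture_of_regressions(X, y, K=2, n_iter=100, seed=0):
    """EM for a finite mixture (latent class) of Gaussian linear regressions."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    Xb = np.column_stack([np.ones(n), X])          # add intercept column
    beta = rng.normal(size=(K, p + 1))             # per-class regression coefficients
    sigma2 = np.ones(K)                            # per-class error variances
    pi = np.full(K, 1.0 / K)                       # class mixing proportions
    for _ in range(n_iter):
        # E-step: responsibility of each latent class for each observation
        resid = y[:, None] - Xb @ beta.T
        logphi = -0.5 * (np.log(2 * np.pi * sigma2) + resid**2 / sigma2)
        logr = np.log(pi) + logphi
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted least squares and variance update per class
        for k in range(K):
            w = r[:, k]
            Xw = Xb * w[:, None]
            beta[k] = np.linalg.solve(Xw.T @ Xb, Xw.T @ y)
            sigma2[k] = (w * (y - Xb @ beta[k])**2).sum() / w.sum()
        pi = r.mean(axis=0)
    return beta, sigma2, pi
```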
Abstract:
Hyperthermia, raised temperature, has been used as a means of treating cancer for centuries. Hippocrates (400 BC) and Galen (200 AD) used red-hot irons to treat small tumours. Much later, after the Renaissance, there are many reports of spontaneous tumour regression in patients with fevers produced by erysipelas, malaria, smallpox, tuberculosis and influenza. These illnesses produce fevers of about 40 °C which last for several days. Temperatures of at least 40 °C were found to be necessary for tumour regression. Towards the end of the nineteenth century pyrogenic bacteria were injected into patients with cancer. In 1896, Coley used a mixture of erysipelas and B. prodigiosus, with some success...
Abstract:
We propose a novel technique for conducting robust voice activity detection (VAD) in high-noise recordings. We use Gaussian mixture modeling (GMM) to train two generic models: speech and non-speech. We then score smaller segments of a given (unseen) recording against each of these GMMs to obtain two respective likelihood scores for each segment. These scores are used to compute a dissimilarity measure between pairs of segments and to carry out complete-linkage clustering of the segments into speech and non-speech clusters. We compare the accuracy of our method against state-of-the-art and standardised VAD techniques to demonstrate an absolute improvement of 15% in half-total error rate (HTER) over the best-performing baseline system across the QUT-NOISE-TIMIT database. We then apply our approach to the Audio-Visual Database of American English (AVDBAE) to demonstrate the performance of our algorithm when using visual, audio-visual or a proposed fusion of these features.
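A minimal sketch of the scoring-and-clustering stage, assuming two generic GMMs have already been trained on speech and non-speech features. The dissimilarity measure here (Euclidean distance between the pairs of likelihood scores) is a stand-in, since the exact measure is only named, not defined, in the abstract.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def segment_scores(segments, gmm_speech, gmm_nonspeech):
    """Per-segment average log-likelihoods against the speech and non-speech GMMs."""
    return np.array([[gmm_speech.score(seg), gmm_nonspeech.score(seg)]
                     for seg in segments])

def cluster_segments(scores):
    """Complete-linkage clustering of segments into two clusters, using a
    Euclidean dissimilarity over the (speech, non-speech) score pairs."""
    Z = linkage(pdist(scores), method='complete')
    return fcluster(Z, t=2, criterion='maxclust')
```

Which of the two clusters corresponds to speech could then be decided by, for example, comparing each cluster's mean speech-GMM score.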
Abstract:
Background: There has been growing interest in mixed species plantation systems because of their potential to provide a range of socio-economic and bio-physical benefits which can be matched to the diverse needs of smallholders and communities. Potential benefits include the production of a range of forest products for home and commercial use; improved soil fertility, especially when nitrogen fixing species are included; improved survival rates and greater productivity of species; a reduction in the amount of damage from pests or disease; and improved biodiversity and wildlife habitats. Despite these documented services and growing interest in mixed species plantation systems, the actual planting areas in the tropics are low, and monocultures are still preferred for industrial plantings and many reforestation programs because of perceived higher economic returns and readily available information about the species and their silviculture. In contrast, there are few guidelines for the design and management of mixed-species systems, including the social and ecological factors of successful mixed species plantings.
Methods: This protocol explains the methodology used to investigate the following question: What is the available evidence for the relative performance of different designs of mixed-species plantings for smallholder and community forestry in the tropics? This study will systematically search, identify and describe studies related to mixed species plantings across tropical and temperate zones to identify the social and ecological factors that affect polyculture systems. The objectives of this study are first, to identify the evidence of biophysical or socio-economic factors that have been considered when designing mixed species systems for community and smallholder forestry in the tropics; and second, to identify gaps in research on mixed species plantations. Results of the study will help create guidelines that can assist practitioners, scientists and farmers to better design mixed species plantation systems for smallholders in the tropics.
Abstract:
Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N = 2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two - combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
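For reference, the simplest meta-analytic combination of per-site estimates is the fixed-effects inverse-variance weighting sketched below; a mega-analysis would instead pool the raw FA data and fit a single model. The numbers in the example are synthetic, not ENIGMA-DTI results, and the ENIGMA protocols use more elaborate models.

```python
import numpy as np

def fixed_effects_meta(estimates, std_errors):
    """Inverse-variance weighted (fixed-effects) meta-analysis of per-site estimates."""
    w = 1.0 / np.asarray(std_errors) ** 2          # weight each site by its precision
    pooled = np.sum(w * np.asarray(estimates)) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    return pooled, pooled_se

# Example: combining heritability estimates from three sites (synthetic numbers)
print(fixed_effects_meta([0.55, 0.62, 0.48], [0.06, 0.08, 0.05]))
```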