145 results for range extension


Relevance: 20.00%

Abstract:

We have developed and validated a semi-automated fluorescent method of genotyping human leucocyte antigen (HLA)-DRB1 alleles, HLA-DRB1*01-16, by multiplex primer extension reactions. This method is based on the extension of a primer that anneals immediately adjacent to the single-nucleotide polymorphism with fluorescent dideoxynucleotide triphosphates (minisequencing), followed by analysis on an ABI Prism 3700 capillary electrophoresis instrument. The validity of the method was confirmed by genotyping 261 individuals using both this method and polymerase chain reaction with sequence-specific primer (PCR-SSP) or sequencing and by demonstrating Mendelian inheritance of HLA-DRB1 alleles in families. Our method provides a rapid means of performing high-throughput HLA-DRB1 genotyping using only two PCR reactions followed by four multiplex primer extension reactions and PCR-SSP for some allele groups. In this article, we describe the method and discuss its advantages and limitations.

Relevance: 20.00%

Abstract:

Ship seakeeping operability refers to the quantification of motion performance in waves relative to mission requirements. This is used to make decisions about preferred vessel designs, but it can also serve as a comprehensive assessment of the benefits of ship-motion-control systems. Traditionally, operability computation aggregates statistics of motion computed over the envelope of likely environmental conditions in order to determine a coefficient in the range from 0 to 1 called operability. When used for assessment of motion-control systems, the increase in operability is taken as the key performance indicator. The operability coefficient is often given the interpretation of the percentage of time operable. This paper considers an alternative probabilistic approach to this traditional computation of operability. It characterises operability not as a number to which a frequency interpretation is attached, but as a hypothesis that a vessel will attain the desired performance in one mission over the envelope of likely operational conditions. This enables the use of Bayesian theory to compute the probability that this hypothesis is true conditional on data from simulations. Thus, the metric considered is the probability of operability. This formulation not only adheres to recent developments in reliability and risk analysis, but also allows more accurate descriptions of ship-motion-control systems to be incorporated into the analysis, since the analysis is not limited to linear ship responses in the frequency domain. The paper also discusses an extension of the approach to the assessment of increased levels of autonomy for unmanned marine craft.
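
As a rough illustration of the probability-of-operability idea, the sketch below assumes a simple Beta-Binomial formulation: each simulated mission across the environmental envelope either meets or fails the motion criteria, and a conjugate Beta posterior summarises the probability that the performance hypothesis holds. The simulated outcomes and the Beta(1, 1) prior are placeholders for this sketch, not the paper's actual model.

```python
import numpy as np
from scipy import stats

# Hypothetical per-mission simulation outcomes: 1 if the motion criteria
# (e.g. roll and vertical-acceleration limits) were satisfied, 0 otherwise.
rng = np.random.default_rng(0)
outcomes = rng.binomial(1, 0.85, size=200)  # placeholder for simulator output

# Beta(1, 1) prior on the chance that the performance hypothesis holds;
# conjugate update with the simulated successes and failures.
a_post = 1 + outcomes.sum()
b_post = 1 + len(outcomes) - outcomes.sum()
posterior = stats.beta(a_post, b_post)

# Illustrative "probability of operability"-style summaries.
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```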

Relevance: 20.00%

Abstract:

The primary goal of a phase I trial is to find the maximally tolerated dose (MTD) of a treatment. The MTD is usually defined in terms of a tolerable probability of toxicity, q*. Our objective is to find the highest dose whose toxicity risk does not exceed q*, a criterion that is often desired in designing phase I trials. This criterion differs from that of finding the dose with toxicity risk closest to q*, which is used in methods such as the continual reassessment method. We use the theory of decision processes to find optimal sequential designs that maximize the expected number of patients within the trial allocated to the highest dose with toxicity not exceeding q*, among the doses under consideration. The proposed method is very general in the sense that criteria other than the one considered here can be optimized and that optimal dose assignment can be defined in terms of patients within or outside the trial. It includes the continual reassessment method as an important special case. A numerical study indicates that the strategy compares favourably with other phase I designs.
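
The target criterion (the highest dose whose toxicity risk does not exceed q*) can be sketched with a simple plug-in rule; the counts, prior, and posterior-mean estimator below are illustrative stand-ins, not the optimal sequential design derived in the paper.

```python
import numpy as np

q_star = 0.30                             # tolerable toxicity probability
toxicities = np.array([0, 1, 2, 4, 6])    # hypothetical toxicities per dose
patients   = np.array([6, 6, 6, 6, 6])    # hypothetical patients per dose

# Posterior-mean toxicity risk per dose under a Beta(0.5, 0.5) prior.
post_mean = (toxicities + 0.5) / (patients + 1.0)

# Criterion from the abstract: highest dose whose estimated toxicity risk
# does not exceed q* (a plug-in rule, not the paper's sequential design).
admissible = np.where(post_mean <= q_star)[0]
mtd = admissible.max() if admissible.size else None
print("estimated MTD index:", mtd, "posterior means:", np.round(post_mean, 2))
```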

Relevance: 20.00%

Abstract:

This paper proposes solutions to three issues pertaining to the estimation of finite mixture models with an unknown number of components: the non-identifiability induced by overfitting the number of components, the mixing limitations of standard Markov chain Monte Carlo (MCMC) sampling techniques, and the related label-switching problem. An overfitting approach is used to estimate the number of components in a finite mixture model via the Zmix algorithm. Zmix provides a bridge between multidimensional samplers and test-based estimation methods: priors are chosen to encourage the weights of extra groups to approach zero. MCMC sampling is made possible by the implementation of prior parallel tempering, an extension of parallel tempering. Given a sufficiently large sample size, Zmix can accurately estimate the number of components, the posterior parameter estimates, and the allocation probabilities. The results reflect uncertainty in the final model and report, from a single run, the range of candidate models and their respective estimated probabilities. Label switching is resolved with a computationally lightweight method, Zswitch, developed for overfitted mixtures by exploiting the intuitiveness of allocation-based relabelling algorithms and the precision of label-invariant loss functions. Four simulation studies and three case studies from the literature illustrate Zmix and Zswitch. All methods are available as part of the R package Zmix, which can currently be applied to univariate Gaussian mixture models.
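
The general overfitting idea (fit more components than needed and let a sparse prior empty the superfluous ones) can be illustrated with a variational Gaussian mixture in scikit-learn; this is not the Zmix/prior-parallel-tempering MCMC implementation, and the simulated data, concentration value, and weight cut-off are arbitrary choices for the sketch.

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

# Simulated univariate data from a 3-component Gaussian mixture.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-4, 1.0, 300),
                    rng.normal(0, 0.5, 200),
                    rng.normal(5, 1.0, 500)]).reshape(-1, 1)

# Deliberately overfit with 10 components; a small Dirichlet concentration
# encourages superfluous components to receive weights near zero.
model = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_distribution",
    weight_concentration_prior=0.01,
    max_iter=500,
    random_state=1).fit(x)

active = np.sum(model.weights_ > 0.01)  # crude count of "occupied" components
print("estimated number of components:", active)
print("weights:", np.round(model.weights_, 3))
```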

Relevance: 20.00%

Abstract:

Species distribution modelling (SDM) typically analyses species' presence together with some form of absence information. Ideally, absences comprise observations or are inferred from comprehensive sampling. When such information is not available, pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves the analysis highly susceptible to 'naughty noughts': absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as 'excess zeros', naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences.

Here we consider an extension to excess zero models. The two-stage approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O'Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. SDMs can then be fit separately within each envelope; for this stage, we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006).

We introduce a wider range of model performance measures to improve treatment of naughty noughts in SDM. We retain an overall measure of model performance, the area under the receiver operating characteristic (ROC) curve (AUC), but focus on its constituent measures, the false negative rate (FNR) and false positive rate (FPR), and on how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: the false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absences. A high FDR may be desirable since it could help target future search efforts, whereas a zero or low FOR is desirable since it indicates that none of the (often valuable) presences have been ignored in the SDM.

For illustration, we chose Bradypus variegatus, a species previously published as an exemplar species for MaxEnt (Phillips et al., 2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0) and eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better: the FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence falling in the same range (0.00–0.20) for both true absences and presences.

In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR, yet clearly visible in FOR, FNR, and the comparison of the distributions of predicted probability of presence for presences and absences.
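
A minimal sketch of the threshold-dependent error rates named above (FPR, FNR, FDR, FOR), computed from a confusion matrix; the helper name and toy data are hypothetical, not the authors' code.

```python
import numpy as np

def sdm_error_rates(y_true, p_pred, threshold=0.5):
    """Confusion-matrix rates discussed in the abstract, at a given
    presence/absence threshold (illustrative helper only)."""
    y = np.asarray(y_true).astype(int)
    y_hat = (np.asarray(p_pred) >= threshold).astype(int)
    tp = np.sum((y == 1) & (y_hat == 1))
    fp = np.sum((y == 0) & (y_hat == 1))
    fn = np.sum((y == 1) & (y_hat == 0))
    tn = np.sum((y == 0) & (y_hat == 0))
    return {
        "FPR": fp / (fp + tn),  # false positive rate
        "FNR": fn / (fn + tp),  # false negative rate
        "FDR": fp / (fp + tp),  # false discovery rate
        "FOR": fn / (fn + tn),  # false omission rate
    }

# Toy example: 1 = observed presence, 0 = (pseudo-)absence.
rates = sdm_error_rates([1, 1, 0, 0, 0, 1, 0, 0],
                        [0.9, 0.4, 0.6, 0.1, 0.2, 0.8, 0.05, 0.3],
                        threshold=0.5)
print(rates)
```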

Relevance: 20.00%

Abstract:

Introduction

Markerless motion capture systems are relatively new devices that can significantly speed up the capture of full-body motion. The precision of finger-position assessment with this type of equipment was evaluated at 17.30 ± 9.56 mm when compared to an active-marker system [1]. The Microsoft Kinect has been proposed to standardize and enhance the clinical evaluation of patients with hemiplegic cerebral palsy [2]. Markerless motion capture systems have the potential to be used in a clinical setting for movement analysis, as well as for large-cohort research. However, the precision of such systems needs to be characterized.

Global objectives
• To assess the precision within the recording field of the markerless motion capture system OpenStage 2 (Organic Motion, NY).
• To compare the markerless motion capture system with an optoelectric motion capture system with active markers.

Specific objectives
• To assess the noise of a static body at 13 different locations within the recording field of the markerless motion capture system.
• To assess the smallest oscillation detected by the markerless motion capture system.
• To assess the difference between both systems regarding body joint angle measurement.

Methods

Equipment
• OpenStage® 2 (Organic Motion, NY)
o Markerless motion capture system
o 16 video cameras (acquisition rate: 60 Hz)
o Recording zone: 4 m * 5 m * 2.4 m (depth * width * height)
o Provides position and angle of 23 different body segments
• Visualeyez™ VZ4000 (PhoeniX Technologies Incorporated, BC)
o Optoelectric motion capture system with active markers
o 4-tracker system (total of 12 cameras)
o Accuracy: 0.5–0.7 mm

Protocol & Analysis
• Static noise:
o Motion of a humanoid mannequin was recorded at 13 different locations.
o RMSE was calculated for each segment at each location.
• Smallest oscillation detected:
o Small oscillations were induced in the humanoid mannequin and motion was recorded until it stopped.
o The correlation between the head displacement recorded by both systems was measured, along with the corresponding magnitude.
• Body joint angles:
o Body motion was recorded simultaneously with both systems (left side only).
o 6 participants (3 females; 32.7 ± 9.4 years old)
o Tasks: walk, squat, shoulder flexion & abduction, elbow flexion, wrist extension, pronation/supination (not in results), head flexion & rotation (not in results), leg rotation (not in results), trunk rotation (not in results)
o Several body joint angles were measured with both systems.
o RMSE was calculated between the signals of both systems, as sketched below.

Results

Conclusion

Results show that the Organic Motion markerless system has the potential to be used for the assessment of clinical motor symptoms or motor performance. However, the following points should be considered:
• Precision of the OpenStage system varied within the recording field.
• Precision is not constant between limb segments.
• The error seems to be higher close to the extremities of the range of motion.
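
A minimal sketch of the RMSE comparison between the two systems' joint-angle signals, assuming both signals have already been resampled to a common time base and synchronised; the variable names and toy traces are illustrative only.

```python
import numpy as np

def rmse(reference, test):
    """Root-mean-square error between two equally sampled angle signals."""
    reference, test = np.asarray(reference), np.asarray(test)
    return np.sqrt(np.mean((reference - test) ** 2))

# Toy joint-angle traces (degrees), pretending both systems were resampled
# to a common 60 Hz time base and time-synchronised beforehand.
t = np.linspace(0, 2, 120)
vz4000 = 30 * np.sin(2 * np.pi * 0.5 * t)               # reference system
openstage = vz4000 + np.random.default_rng(2).normal(0, 2, t.size)  # noisy markerless signal

print(f"RMSE = {rmse(vz4000, openstage):.2f} deg")
```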

Relevance: 20.00%

Abstract:

The nanometer-scale surface topography of a solid substrate is known to influence the extent of bacterial attachment and the subsequent proliferation of bacteria to form biofilms. As an extension of our previous work on the development of a novel organic polymer coating for preventing the growth of medically significant bacteria on three-dimensional solid surfaces, this study examines the effect of the surface coating on the adhesion and proliferation tendencies of Staphylococcus aureus and compares them to the previously investigated tendencies of Pseudomonas aeruginosa on similar coatings. Radio-frequency plasma-enhanced chemical vapor deposition was used to coat the surface of the substrate with a thin film of terpinen-4-ol, a constituent of tea-tree oil known to inhibit the growth of a broad range of bacteria. The presence of the coating decreased the substrate surface roughness from approximately 2.1 nm to 0.4 nm. Similar to P. aeruginosa, S. aureus presented notably different patterns of attachment in response to the presence of the surface film: the amount of attachment, extracellular polymeric substance production, and cell proliferation on the coated surface were found to be greatly reduced compared to those obtained on the unmodified surface. This work suggests that the antimicrobial and antifouling coating used in this study could be effectively integrated into medical and other clinically relevant devices to prevent bacterial growth and to minimize bacteria-associated adverse host responses.

Relevance: 20.00%

Abstract:

An important decision brand managers have to make when positioning their products in a retail setting is whether to price new line extensions at parity or to let products vary along the price/quality spectrum. Despite the growing interest in vertical line extension issues, there has been little research investigating how product-line length affects extension favorability. This paper therefore investigates the framing effect that a product-line price structure has on consumer judgments of vertical extensions and, in particular, of upscale extensions. A basic proposition of this research is that the parent-brand price range affects the perceived or psychological distance between the extension and the parent brand, influencing extension favorability ratings. Two experiments show that positioning an upscale extension in the context of a wide product line leads to higher consistency perceptions between the parent brand and the new upscale extension than positioning an equivalent upscale extension in the context of a narrow parent-brand product line.

Relevance: 20.00%

Abstract:

The nature of our moral judgments, and the extent to which we treat others with care, depend in part on the distinctions we make between entities deemed worthy or unworthy of moral consideration: our moral boundaries. Philosophers, historians, and social scientists have noted that people's moral boundaries have expanded over the last few centuries, but the notion of moral expansiveness has received limited empirical attention in psychology. This research explores variation in the size of individuals' moral boundaries using the psychological construct of moral expansiveness and introduces the Moral Expansiveness Scale (MES), designed to capture this variation. Across 6 studies, we established the reliability, convergent validity, and predictive validity of the MES. Moral expansiveness was related (but not reducible) to existing moral constructs (moral foundations, moral identity, "moral" universalism values), predictors of moral standing (moral patiency and warmth), and other constructs associated with concern for others (empathy, identification with humanity, connectedness to nature, and social responsibility). Importantly, the MES uniquely predicted willingness to engage in prosocial intentions and behaviors at personal cost, independently of these established constructs. Specifically, the MES uniquely predicted willingness to prioritize humanitarian and environmental concerns over personal and national self-interest, willingness to sacrifice one's life to save others (ranging from human out-groups to animals and plants), and volunteering behavior. Results demonstrate that moral expansiveness is a distinct and important factor in understanding moral judgments and their consequences.

Relevance: 20.00%

Abstract:

It is a challenge to increase the visible-light photoresponse of wide-gap metal oxides. In this study, we propose a new strategy to enhance the visible-light photoresponse of wide-gap semiconductors by deliberately designing a multi-scale nanostructure with controlled architecture. Hollow ZnO microspheres whose constituent units take the shape of one-dimensional (1D) nanowire networks, 2D nanosheet stacks, or 3D mesoporous nanoball blocks are synthesized via a two-step assembly approach, in which oligomers or constituent nanostructures with specially designed structures are first formed and then further assembled into complex morphologies. By deliberately designing constituent architectures that allow multiple visible-light scattering, reflection, and dispersion events inside the multi-scale nanostructures, enhanced wide-range visible-light photoresponses of the ZnO hollow microspheres were successfully achieved. Compared to one-step-synthesized ZnO hollow microspheres, in which no nanostructured constituents were produced, the ZnO hollow microspheres with 2D nanosheet stacks exhibited a 50-fold higher photocurrent in the visible-light range (λ > 420 nm). This nanostructure-induced enhancement of the visible-light photoresponse points a direction for the development of novel photosensitive materials.