45 results for Informative voting
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
One of the major differences undergraduates experience during the transition to university is the style of teaching. In schools and colleges most students study key stage 5 subjects in relatively small informal groups where teacher–pupil interaction is encouraged and two-way feedback occurs through question-and-answer delivery. On starting in HE, students are amazed by the size of the classes. Even in a relatively small chemistry department with an intake of 60-70 students, biologists, pharmacists, and other first-year undergraduates requiring chemistry can boost numbers in the lecture hall to around 200 or higher. In many universities class sizes of 400 are not unusual for first-year groups, where efficiency is crucial. Clearly the personalised classroom-style delivery is not practical, and it is a brave student who shows his ignorance by venturing to ask a question in front of such an audience. In these environments learning can be a very passive process: the lecture acts as a vehicle for the conveyance of information, and our students are expected to reinforce their understanding by ‘self-study’, a term whose meaning many struggle to understand. The use of electronic voting systems (EVS) in such situations can vastly change the students’ learning experience from a passive to a highly interactive process. This principle has already been demonstrated in physics, most notably in the work of Bates and colleagues at Edinburgh [1]. These small hand-held devices, similar to those which have become familiar through programmes such as ‘Who Wants to be a Millionaire’, can be used to provide instant feedback to students and teachers alike. Advances in technology now allow them to be used in a range of more sophisticated settings, and comprehensive guides on their use have been developed for even the most technophobic staff.
Abstract:
The book develops a novel legal argument about the voting rights of recognised 1951 Geneva Convention Refugees. The main normative contention is that such refugees should have the right to vote in the political community where they reside, assuming that the political community is a democracy and that its citizens have the right to vote. The basis of this contention is that the right to political participation in some political community is a basic right from the point of view of dignity and the protection of one’s interests. Due to their unique political predicament, 1951 Geneva Convention Refugees are a special category of non-citizen residents. They are unable to participate in elections of their state of origin, do not enjoy its diplomatic protection and consular assistance abroad, and – most fundamentally – are unable or unwilling, owing to a well-founded fear of persecution, to return to it; thus, they are in limbo for a potentially protracted period. Refugees, too, deserve to have a place in the world in the Arendtian sense, where their opinions are significant and their actions are effective. Their state of asylum is, for the time being, the only community in which there is any realistic prospect of political participation on their part.
Abstract:
The experimental variogram computed in the usual way by the method of moments and the Haar wavelet transform are similar in that they filter data and yield informative summaries that may be interpreted. The variogram filters out constant values; wavelets can filter variation at several spatial scales and thereby provide a richer repertoire for analysis, and they demand no assumptions other than that of finite variance. This paper compares the two functions, identifying the part of the Haar wavelet transform that gives it its advantages. It goes on to show that the generalized variogram of order k=1, 2, and 3 filters linear, quadratic, and cubic polynomials from the data, respectively, which correspond with more complex wavelets in Daubechies's family. The additional filter coefficients of the latter can reveal features of the data that are not evident in their usual form. Three examples, in which data recorded at regular intervals on transects are analyzed, illustrate the extended form of the variogram. The apparent periodicity of gilgais in Australia seems to be accentuated as filter coefficients are added, but otherwise the analysis provides no new insight. Analysis of hyperspectral data with a strong linear trend showed that the wavelet-based variograms filtered it out. Adding filter coefficients in the analysis of the topsoil across the Jurassic scarplands of England changed the upper bound of the variogram; it then resembled the within-class variogram computed by the method of moments. To elucidate these results, we simulated several series of data to represent a random process with values fluctuating about a mean, data with long-range linear trend, data with local trend, and data with stepped transitions. The results suggest that the wavelet variogram can filter out the effects of long-range trend, but not those of local trend or of transitions from one class to another, as across boundaries.
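To make the comparison concrete, the following minimal sketch (synthetic data and hypothetical function names, not the paper's code) computes both summaries on a regular transect; with the normalisation used here, the Haar-style contrast at scale 1 reduces to the ordinary variogram at lag 1, which is the similarity the abstract starts from.

```python
import numpy as np

def mom_variogram(z, max_lag):
    """Experimental variogram by the method of moments:
    gamma(h) = half the mean squared difference at lag h."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

def haar_variogram(z, max_scale):
    """Haar-style analogue: contrast of the means of two adjacent
    blocks of m values. Like the variogram, it filters out constant
    values; larger m probes coarser spatial scales. The halving is
    one common normalisation, chosen so that scale 1 equals lag 1."""
    out = []
    for m in range(1, max_scale + 1):
        block_means = np.convolve(z, np.ones(m) / m, mode='valid')
        d = block_means[m:] - block_means[:-m]  # adjacent-block contrast
        out.append(0.5 * np.mean(d ** 2))
    return np.array(out)

rng = np.random.default_rng(0)
z = 0.05 * np.cumsum(rng.normal(size=512)) + rng.normal(size=512)
print(mom_variogram(z, 5))   # the two summaries agree at scale/lag 1,
print(haar_variogram(z, 5))  # then diverge as the scale increases
```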
Abstract:
Models of the dynamics of nitrogen in soil (soil-N) can be used to aid the fertilizer management of a crop. The predictions of soil-N models can be validated by comparison with observed data. Validation generally involves calculating non-spatial statistics of the observations and predictions, such as their means, their mean squared difference, and their correlation. However, when the model predictions are spatially distributed across a landscape, the model requires validation with spatial statistics. There are three reasons for this: (i) the model may be more or less successful at reproducing the variance of the observations at different spatial scales; (ii) the correlation of the predictions with the observations may differ between spatial scales; (iii) the spatial pattern of model error may be informative. In this study we used a model, parameterized with spatially variable input information about the soil, to predict the mineral-N content of soil in an arable field, and compared the results with observed data. We validated the performance of the N model spatially with a linear mixed model of the observations and model predictions, estimated by residual maximum likelihood. This novel approach allowed us to describe the joint variation of the observations and predictions as: (i) independent random variation occurring at a fine spatial scale; (ii) correlated random variation occurring at a coarse spatial scale; (iii) systematic variation associated with a spatial trend. The linear mixed model revealed that, in general, the performance of the N model changed with the spatial scale of interest. At the scales associated with random variation, the N model underestimated the variance of the observations, and the predictions were poorly correlated with the observations. At the scale of the trend, the predictions and observations shared a common surface. The spatial pattern of the error of the N model suggested that the observations were affected by the local soil condition, but this was not accounted for by the N model. In summary, the N model would be well suited to field-scale management of soil nitrogen, but poorly suited to management at finer spatial scales. This information was not apparent from a non-spatial validation. © 2007 Elsevier B.V. All rights reserved.
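The REML linear mixed model itself is too involved for a short snippet, but the contrast drawn above can be illustrated simply: whole-field (non-spatial) statistics give a single verdict, whereas comparing variograms of the observations, the predictions and their difference lag by lag shows at which spatial scales the model reproduces the observed variance. A minimal sketch with simulated stand-in data (hypothetical names and values, not the study's model or measurements):

```python
import numpy as np

def nonspatial_validation(obs, pred):
    """The usual whole-field summaries: mean error, mean squared
    difference, and the observation-prediction correlation."""
    err = pred - obs
    return {"mean_error": err.mean(),
            "msd": np.mean(err ** 2),
            "correlation": np.corrcoef(obs, pred)[0, 1]}

def variogram(z, lags):
    """Method-of-moments variogram on a regular transect."""
    return {h: 0.5 * np.mean((z[h:] - z[:-h]) ** 2) for h in lags}

# Stand-in transect of observed and predicted mineral-N values
rng = np.random.default_rng(1)
obs = 40 + 0.3 * np.cumsum(rng.normal(size=200)) + rng.normal(size=200)
pred = 0.8 * obs + rng.normal(scale=2.0, size=200)

print(nonspatial_validation(obs, pred))
# Scale-aware check: if the prediction variogram sits well below the
# observation variogram at short lags, the model underestimates the
# fine-scale variance even when the whole-field statistics look good.
for name, z in [("obs", obs), ("pred", pred), ("error", pred - obs)]:
    print(name, variogram(z, [1, 5, 20]))
```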
Abstract:
Tactile discrimination performance depends on the receptive field (RF) size of somatosensory cortical (SI) neurons. Psychophysical masking effects can reveal the RF of an idealized "virtual" somatosensory neuron. Previous studies show that top-down factors strongly affect tactile discrimination performance. Here, we show that non-informative vision of the touched body part influences tactile discrimination by modulating tactile RFs. Ten subjects performed spatial discrimination between touch locations on the forearm. Performance was improved when subjects saw their forearm compared to viewing a neutral object in the same location. The extent of visual information was relevant, since restricted view of the forearm did not have this enhancing effect. Vibrotactile maskers were placed symmetrically on either side of the tactile target locations, at two different distances. Overall, masking significantly impaired discrimination performance, but the spatial gradient of masking depended on what subjects viewed. Viewing the body reduced the effect of distant maskers, but enhanced the effect of close maskers, as compared to viewing a neutral object. We propose that viewing the body improves functional touch by sharpening tactile RFs in an early somatosensory map. Top-down modulation of lateral inhibition could underlie these effects.
Abstract:
Endorsed by the Society of Light and Lighting, this practical book offers comprehensive guidance on how colour, light and contrast can be incorporated within buildings to enhance their usability. The book provides clear, state-of-the-art guidance and a valuable source of information for busy professionals involved in the design or management of new and existing environments. The ways colour, light and contrast are used within built environments are critical in determining how people interact with the space, and how confident, safe, and secure they will feel when doing so. They also have a major influence on a person’s sense of well-being and their ability to use the environment independently and without undue effort. Understanding how to use colour and contrast, and how they are influenced by both natural and artificial lighting, is vital for all those involved in the design and management of the environments and spaces we all use. In recent years there has been a considerable amount of work undertaken to further our understanding of how colour, light and contrast affect emotion and sensory abilities, and how they can assist or hinder people in their everyday lives. Other publications consider these issues individually, but The Colour, Light and Contrast Manual: designing and managing inclusive built environments draws knowledge and information together to produce a unique, comprehensive and informative guide to how the three elements can work together to improve the design and management of environments for us all.
The sequential analysis of repeated binary responses: a score test for the case of three time points
Abstract:
In this paper a robust method is developed for the analysis of data consisting of repeated binary observations taken at up to three fixed time points on each subject. The primary objective is to compare outcomes at the last time point, using earlier observations to predict this for subjects with incomplete records. A score test is derived. The method is developed for application to sequential clinical trials, as at interim analyses there will be many incomplete records occurring in non-informative patterns. Motivation for the methodology comes from experience with clinical trials in stroke and head injury, and data from one such trial are used to illustrate the approach. Extensions to more than three time points and to allow for stratification are discussed. Copyright © 2005 John Wiley & Sons, Ltd.
Abstract:
Phoretic mites are likely the most abundant arthropods found on carcases and corpses. They outnumber their scavenger carriers in both number and diversity. Many phoretic mites travel on scavenger insects and are highly specific; they will arrive on a particular species of host and no other. Because of this, they may be useful as trace indicators of their carriers even when their carriers are absent. Phoretic mites can be valuable markers of time. They are usually found in a specialised transitional transport or dispersal stage, often moulting and transforming to adults shortly after arrival on a carcase or corpse. Many are characterised by faster development and generation cycles than their carriers. Humans are normally unaware, but we too carry mites: skin mites that are present in our clothes. More than 212 phoretic mite species associated with carcases have been reported in the literature. Among these, mites belonging to the Mesostigmata form the dominant group, represented by 127 species, with 25 phoretic mite species belonging to the family Parasitidae and 48 to the Macrochelidae. Most of these mesostigmatids are associated with particular species of flies or carrion beetles, though some are associated with small mammals arriving during the early stages of decomposition. During dry decay, members of the Astigmata are more frequently found; 52 species are phoretic on scavengers, and the majority of these travel on late-arriving scavengers such as hide beetles, skin beetles and moths. Several species of carrion beetles can visit a corpse simultaneously, and each may carry 1-10 species of phoretic mites. An informative diversity of phoretic mites may be found on a decaying carcase at any given time. The composition of the phoretic mite assemblage on a carcase might provide valuable information about the conditions of and time elapsed since death.
Abstract:
This paper presents a simple Bayesian approach to sample size determination in clinical trials. It is required that the trial should be large enough to ensure that the data collected will provide convincing evidence either that an experimental treatment is better than a control or that it fails to improve upon control by some clinically relevant difference. The method resembles standard frequentist formulations of the problem, and indeed in certain circumstances involving 'non-informative' prior information it leads to identical answers. In particular, unlike many Bayesian approaches to sample size determination, use is made of an alternative hypothesis that an experimental treatment is better than a control treatment by some specified magnitude. The approach is introduced in the context of testing whether a single stream of binary observations is consistent with a given success rate p₀. Next the case of comparing two independent streams of normally distributed responses is considered, first under the assumption that their common variance is known and then for unknown variance. Finally, the more general situation in which a large sample is to be collected and analysed according to the asymptotic properties of the score statistic is explored. Copyright © 2007 John Wiley & Sons, Ltd.
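As a rough illustration of that requirement in the binary single-stream case: choose n large enough that every possible outcome gives convincing posterior evidence one way or the other. Because 'better than control' (p > p₀) and 'fails to improve by δ' (p < p₀ + δ) overlap on (p₀, p₀ + δ), such an n exists. The sketch below assumes a conjugate Beta prior and an evidence threshold η; these conventions are assumptions for illustration, not the paper's exact derivation.

```python
from scipy.stats import beta

def conclusive_n(p0, delta, a=1.0, b=1.0, eta=0.95, n_max=2000):
    """Smallest n such that, for every possible success count s, the
    posterior Beta(a + s, b + n - s) assigns probability >= eta either
    to 'better than control' (p > p0) or to 'fails to improve by a
    clinically relevant delta' (p < p0 + delta)."""
    for n in range(1, n_max + 1):
        def conclusive(s):
            post = beta(a + s, b + n - s)
            return post.sf(p0) >= eta or post.cdf(p0 + delta) >= eta
        if all(conclusive(s) for s in range(n + 1)):
            return n
    return None  # no n up to n_max guarantees a conclusive trial

# Example: control success rate 0.2, clinically relevant lift 0.15
print(conclusive_n(p0=0.2, delta=0.15))
```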
Abstract:
The aim of phase II single-arm clinical trials of a new drug is to determine whether it has sufficiently promising activity to warrant its further development. Over the last several years Bayesian statistical methods have been proposed and used. Bayesian approaches are ideal for earlier-phase trials as they take into account information that accrues during a trial. Predictive probabilities are then updated and so become more accurate as the trial progresses. Suitable priors can act as pseudo-samples, which make small-sample clinical trials more informative. Thus patients have a better chance of receiving better treatments. The goal of this paper is to provide a tutorial for statisticians who are using Bayesian methods for the first time, or for investigators who have some statistical background. In addition, real data from three clinical trials are presented as examples to illustrate how to conduct a Bayesian approach for phase II single-arm clinical trials with binary outcomes.
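For the binary-outcome setting described, the conjugate Beta-Binomial machinery makes the 'prior as pseudo-sample' and predictive-probability ideas concrete. In the sketch below the interim numbers and the design threshold are hypothetical, chosen only for illustration.

```python
from scipy.stats import betabinom

def predictive_prob(a, b, s, n, n_total, s_needed):
    """Posterior predictive probability that the finished trial shows
    at least s_needed responses, given s responses in n patients so
    far. The Beta(a, b) prior acts as a pseudo-sample of a + b
    patients containing a responders; future responses follow a
    beta-binomial under the updated posterior Beta(a + s, b + n - s)."""
    remaining = n_total - n
    return betabinom(remaining, a + s, b + n - s).sf(s_needed - s - 1)

# Interim look: 8 responses in 20 patients, uniform Beta(1, 1) prior,
# planned trial of 40 patients, 'promising' if >= 16 responses overall.
print(predictive_prob(a=1, b=1, s=8, n=20, n_total=40, s_needed=16))
```

A low predictive probability at an interim look would support stopping for futility, while a high one supports continuing; this is how the information accruing during the trial is fed back into the design.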
Abstract:
Background: The amino-terminal half of the cellular prion protein PrPc is implicated in both the binding of copper ions and the conformational changes that lead to disease, but has no defined structure. However, as some structure is likely to exist, we have investigated the use of an established protein refolding technology, fusion to green fluorescent protein (GFP), as a method to examine the refolding of the amino-terminal domain of mouse prion protein. Results: Fusion proteins of PrPc and GFP were expressed at high level in E. coli and could be purified to near homogeneity as insoluble inclusion bodies. Following denaturation, proteins were diluted into a refolding buffer, whereupon GFP fluorescence recovered with time. Using several truncations of PrPc, the rate of refolding was shown to depend on the prion sequence expressed. In a variation of the format involving direct observation in E. coli, mutations introduced randomly into the PrPc sequence that affected folding could be selected directly by recovery of GFP fluorescence. Conclusion: Use of GFP as a measure of refolding of PrPc fusion proteins in vitro and in vivo proved informative. Refolding in vitro suggested a local structure within the amino-terminal domain, while direct selection via fluorescence showed that as little as one amino acid change could significantly alter folding. These assay formats, not previously used to study PrP folding, may be generally useful for investigating PrPc structure and PrPc-ligand interaction.
Abstract:
We describe and evaluate a new estimator of the effective population size (Ne), a critical parameter in evolutionary and conservation biology. This new "SummStat" Ne estimator is based upon the use of summary statistics in an approximate Bayesian computation framework to infer Ne. Simulations of a Wright-Fisher population with known Ne show that the SummStat estimator is useful across a realistic range of individuals and loci sampled, generations between samples, and Ne values. We also address the paucity of information about the relative performance of Ne estimators by comparing the SummStat estimator to two recently developed likelihood-based estimators and a traditional moment-based estimator. The SummStat estimator is the least biased of the four estimators compared. In 32 of 36 parameter combinations investigated, using initial allele frequencies drawn from a Dirichlet distribution, it has the lowest bias. The relative mean square error (RMSE) of the SummStat estimator was generally intermediate to the others. All of the estimators had RMSE > 1 when small samples (n = 20, five loci) were collected a generation apart. In contrast, when samples were separated by three or more generations and Ne ≤ 50, the SummStat and likelihood-based estimators all had greatly reduced RMSE. Under the conditions simulated, SummStat confidence intervals were more conservative than those of the likelihood-based estimators and more likely to include the true Ne. The greatest strength of the SummStat estimator is its flexible structure. This flexibility allows it to incorporate any potentially informative summary statistic from population genetic data.
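The core of the approach can be sketched with plain rejection ABC and a single classical summary statistic, the standardised temporal change in allele frequencies. The SummStat estimator admits any set of summary statistics, so the statistic, prior range and tolerance below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(42)

def wright_fisher(ne, p0, generations):
    """Drift allele frequencies forward by binomial sampling of
    2*ne gene copies per generation (Wright-Fisher model)."""
    p = np.asarray(p0, dtype=float)
    for _ in range(generations):
        p = rng.binomial(2 * ne, p) / (2 * ne)
    return p

def temporal_f(p_before, p_after):
    """Standardised allele-frequency change across loci; its
    expectation shrinks as Ne grows, so it is informative for Ne."""
    num = (p_before - p_after) ** 2
    den = (p_before + p_after) / 2 - p_before * p_after
    return np.mean(num / np.maximum(den, 1e-12))

# Pseudo-observed data: 50 loci sampled 5 generations apart from a
# population whose true Ne = 100 (pretend this is unknown).
p0 = rng.uniform(0.2, 0.8, size=50)
p1 = wright_fisher(100, p0, 5)
f_obs = temporal_f(p0, p1)

# Rejection ABC: draw candidate Ne values from a flat prior, simulate,
# and keep the candidates whose summary statistic lies closest to the
# observed one; the accepted values approximate the posterior for Ne.
candidates = rng.integers(10, 500, size=5000)
dist = np.array([abs(temporal_f(p0, wright_fisher(ne, p0, 5)) - f_obs)
                 for ne in candidates])
accepted = candidates[dist <= np.quantile(dist, 0.02)]
print("posterior median Ne:", np.median(accepted))
print("95% credible interval:", np.percentile(accepted, [2.5, 97.5]))
```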