928 results for Binary hypothesis testing


Relevance:

30.00%

Publisher:

Abstract:

Background: The various cell types and their relative numbers in multicellular organisms are controlled by growth factors and related extracellular molecules, which affect genetic expression pathways. However, these substances may have inhibitory or stimulatory effects, or both, on cell division and cell differentiation, depending on the cellular environment, and it is not known how cells respond to these substances in such an ambiguous way. Many cellular effects have been investigated and reported using cell culture from cancer cell lines, in an effort to define normal cellular behaviour using these abnormal cells. A model is offered to explain the harmony of cellular life in multicellular organisms in terms of interacting extracellular substances.

Methods: A basic model was proposed based on asymmetric cell division, and evidence to support the hypothetical model was accumulated from the literature. In particular, relevant published evidence was selected for the Insulin-Like Growth Factor system, especially from certain cell lines, to support the model. The evidence was chosen selectively, in an attempt to derive a picture of normal cellular responses from the cell-line data.

Results: The formation of a pair of coupled cells by asymmetric cell division is an integral part of the model, as is the interaction of couplet molecules derived from these cells. Each couplet cell will have a receptor to measure the amount of the couplet molecule produced by the other cell; each cell will be receptor-positive or receptor-negative for the respective receptor. The couplet molecules will form a binary complex whose level is also measured by the cell. The hypothesis is supported largely by a selective collection of circumstantial evidence, together with some direct evidence. The basic model can be expanded to other cellular interactions.

Conclusions: These couplet cells and interacting couplet molecules can be viewed as a mechanism that provides a controlled and balanced division of labour between the two progeny cells and, in turn, their progeny. The presence or absence of a particular receptor for a couplet molecule will define a cell type, and the presence or absence of many such receptors will define the cell types of the progeny within cell lineages.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents an effective classification method based on Support Vector Machines (SVMs) in the context of activity recognition. Local features that capture both spatial and temporal information in activity videos have recently driven significant progress, and efficient and effective features, feature representations, and classifiers play a crucial role in activity recognition. SVMs are popular classifiers because of their simplicity and efficiency; however, the common multi-class SVM approaches suffer from limitations, including easily confused classes and computational inefficiency. We propose a binary tree SVM to address these shortcomings in activity recognition. The tree is constructed using Gaussian Mixture Models (GMMs): activities are repeatedly allocated to subnodes until each newly created node contains only one activity. A separate SVM is then learned for each internal node, which significantly reduces training time and increases testing speed compared to the popular 'one-against-the-rest' multi-class SVM classifier. Experiments on the challenging and complex Hollywood dataset demonstrate performance comparable to the baseline bag-of-features method.
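Since the abstract does not include an implementation, here is a minimal sketch of the binary-tree idea under stated assumptions: pre-computed bag-of-features vectors per video, scikit-learn available, and a two-component GMM fitted to class-mean features to split the activities at each node (the paper's per-activity GMM step is simplified here). All names are illustrative, not from the paper.

```python
# Hypothetical sketch of a GMM-built binary tree of SVMs; not the paper's code.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

def build_tree(X, y, classes):
    """Recursively split the activity set with a 2-component GMM fitted to
    class-mean feature vectors; train one binary SVM per internal node."""
    if len(classes) == 1:
        return {"label": classes[0]}                  # leaf: one activity left
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    side = GaussianMixture(n_components=2, random_state=0).fit_predict(means)
    if side.min() == side.max():                      # degenerate split: force one
        side = np.arange(len(classes)) % 2
    left = [c for c, s in zip(classes, side) if s == 0]
    keep = np.isin(y, classes)
    go_right = (~np.isin(y[keep], left)).astype(int)  # 0 = left subtree, 1 = right
    return {"svm": SVC(kernel="rbf").fit(X[keep], go_right),
            "left": build_tree(X, y, left),
            "right": build_tree(X, y, [c for c in classes if c not in left])}

def classify(node, x):
    """Route a feature vector down the tree until a single activity remains."""
    while "label" not in node:
        node = node["right"] if node["svm"].predict(x[None])[0] else node["left"]
    return node["label"]

# Usage (X: (n_samples, n_features) array, y: integer labels):
# tree = build_tree(X_train, y_train, sorted(set(y_train)))
# print(classify(tree, X_test[0]))
```

Each test video then traverses at most O(log K) SVMs for K activities, which is the source of the training- and testing-speed advantage over one-against-the-rest.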

Relevance:

30.00%

Publisher:

Abstract:

A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both Gaussian and Student t-distributions. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test.
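The score-based construction can be illustrated with a much-simplified sketch (assuming standardized series and a constant-correlation bivariate Gaussian null; this is in the spirit of the test, not the authors' exact DCS statistic):

```python
# Simplified score-based LM check for time-varying correlation. Illustrative
# only: x and y are assumed already standardized (e.g. by fitted volatility
# models), and the null is a constant-correlation bivariate Gaussian.
import numpy as np
from scipy import stats

def lm_correlation_test(x, y, lags=1):
    rho = np.corrcoef(x, y)[0, 1]                  # correlation under the null
    d = 1.0 - rho ** 2
    # Score of the bivariate Gaussian log-density with respect to rho;
    # note that it depends on the estimated level of correlation rho:
    u = rho / d + x * y / d - rho * (x**2 - 2*rho*x*y + y**2) / d**2
    u -= u.mean()
    # LM statistic = T * R^2 from regressing the score on its own lags.
    Y = u[lags:]
    Z = np.column_stack([np.ones(len(Y))] +
                        [u[lags - k: len(u) - k] for k in range(1, lags + 1)])
    beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
    r2 = 1.0 - ((Y - Z @ beta) ** 2).sum() / ((Y - Y.mean()) ** 2).sum()
    lm = len(Y) * r2
    return lm, stats.chi2.sf(lm, df=lags)          # statistic and p-value
```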

Relevance:

30.00%

Publisher:

Abstract:

The solidification pathways of Nb-rich Nb-Si alloys processed under non-equilibrium conditions are not well understood. Continuing our earlier work on alloying additions at the single eutectic composition [1,2], we report a detailed characterization of the microstructures of binary Nb-Si alloys over a wide composition range (10-25 at% Si). The alloys were processed by suction casting into a chilled copper mould, which has allowed us to correlate the evolution of microstructure and phases with the different possible solidification pathways. Finally, these are correlated with mechanical properties through studies of deformation under indentation and compressive loads. It is shown that microstructure modification can significantly influence the plasticity of these alloys.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we consider the problem of finding a spectrum hole of a specified bandwidth in a given wide band of interest. We propose a new, simple, and easily implementable sub-Nyquist sampling scheme for signal acquisition and a spectrum hole search algorithm that exploits sparsity in the primary spectral occupancy in the frequency domain by testing a group of adjacent subbands in a single test. The sampling scheme deliberately introduces aliasing during signal acquisition, resulting in a signal that is the sum of signals from adjacent subbands. Energy-based hypothesis tests are used to provide an occupancy decision over the group of subbands, and this forms the basis of the proposed algorithm for finding contiguous spectrum holes. We extend this framework to a multi-stage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including non-contiguous spectrum hole search. Further, we provide the analytical means to optimize the hypothesis tests with respect to the detection thresholds, number of samples, and group size so as to minimize the detection delay under a given error rate constraint. Depending on the sparsity and SNR, the proposed algorithms can lead to significantly lower detection delays than a conventional bin-by-bin energy detection scheme; the latter is in fact a special case of the group test with the group size set to 1. We validate our analytical results via Monte Carlo simulations.
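To make the group test concrete, here is a toy numerical sketch (assumed parameters and a real-valued noise model standing in for an RF front end): energy detection on the sum of adjacent subband signals, as the deliberate aliasing would produce.

```python
# Toy sketch of the energy-based group test on aliased samples; the group
# size, threshold, and noise model are illustrative assumptions.
import numpy as np

def group_energy_test(subband_samples, threshold):
    """subband_samples: (G, N) array, one row per subband in the group.
    Aliased acquisition yields their sum; declare the group occupied if
    the average energy of the summed signal exceeds the threshold."""
    aliased = subband_samples.sum(axis=0)        # aliasing folds subbands together
    return np.mean(np.abs(aliased) ** 2) > threshold

rng = np.random.default_rng(0)
G, N = 8, 1000
noise = rng.normal(size=(G, N))                       # all 8 subbands vacant
print(group_energy_test(noise, threshold=1.5 * G))    # likely False: a hole
noise[3] += 3.0 * rng.normal(size=N)                  # a primary occupies subband 3
print(group_energy_test(noise, threshold=1.5 * G))    # likely True: occupied
```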

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates the use of adaptive group testing to find a spectrum hole of a specified bandwidth in a given wide band of interest. We propose a group testing-based spectrum hole search algorithm that exploits sparsity in the primary spectral occupancy by testing a group of adjacent subbands in a single test. This is enabled by a simple and easily implementable sub-Nyquist sampling scheme for signal acquisition by the cognitive radios (CRs). The sampling scheme deliberately introduces aliasing during signal acquisition, resulting in a signal that is the sum of signals from adjacent subbands. Energy-based hypothesis tests are used to provide an occupancy decision over the group of subbands, and this forms the basis of the proposed algorithm to find contiguous spectrum holes of a specified bandwidth. We extend this framework to a multistage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including noncontiguous spectrum hole search. Furthermore, we provide the analytical means to optimize the group tests with respect to the detection thresholds, number of samples, group size, and number of stages to minimize the detection delay under a given error probability constraint. Our analysis allows one to identify the sparsity and SNR regimes where group testing can lead to significantly lower detection delays compared with a conventional bin-by-bin energy detection scheme; the latter is, in fact, a special case of the group test when the group size is set to 1 bin. We validate our analytical results via Monte Carlo simulations.
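The search logic itself can be sketched as follows (illustrative Python, with a boolean occupancy oracle standing in for the energy-based group test above): a group of `width` adjacent subbands passes the test only if it contains no primary energy, i.e., it is a hole of the requested bandwidth.

```python
# Hypothetical single-stage hole search over groups of adjacent subbands;
# a multistage variant would revisit failed groups with smaller group sizes.
def find_hole(occupied, width):
    """Return the start index of a vacant group of `width` subbands, or None.
    `occupied[i]` is True if subband i carries primary energy; a group test
    passes (low aliased energy) only when every subband in it is vacant."""
    for lo in range(0, len(occupied) - width + 1):
        if not any(occupied[lo:lo + width]):    # group test on [lo, lo+width)
            return lo
    return None

# Usage: 16 subbands, primaries on a few of them, searching for a 3-wide hole.
occ = [False] * 16
for i in (0, 1, 7, 12):
    occ[i] = True
print(find_hole(occ, width=3))   # -> 2 (subbands 2-4 are vacant)
```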

Relevance:

30.00%

Publisher:

Abstract:

Also published as: Documento de Trabajo Banco de España 0504/2005.

Relevance:

30.00%

Publisher:

Abstract:

In the quest for a descriptive theory of decision-making, the rational actor model in economics imposes rather unrealistic expectations and abilities on human decision makers. The further we move from idealized scenarios, such as perfectly competitive markets, and the more ambitiously we extend the reach of the theory to everyday decision-making situations, the less sense these assumptions make. Behavioural economics has instead proposed models based on more psychologically realistic assumptions, with the aim of gaining precision and descriptive power. Increased psychological realism, however, comes at the cost of more parameters and greater model complexity. There is now a plethora of models, based on different assumptions and applicable in different contextual settings, and selecting the right model tends to be an ad hoc process. In this thesis, we develop optimal experimental design methods and evaluate different behavioural theories against evidence from lab and field experiments.

We first look at evidence from controlled laboratory experiments, in which subjects are presented with choices between monetary gambles or lotteries. Different decision-making theories evaluate the choices differently and make distinct predictions about the subjects' choices, so theories whose predictions are inconsistent with the actual choices can be systematically eliminated. Behavioural theories can have multiple parameters, requiring complex experimental designs with a very large number of possible choice tests; this imposes computational and economic constraints on classical experimental design methods. We develop an adaptive testing methodology, Bayesian Rapid Optimal Adaptive Designs (BROAD), which sequentially chooses the "most informative" test at each stage and, based on the response, updates its posterior beliefs over the theories, which in turn determines the next test to run. BROAD uses the Equivalence Class Edge Cutting (EC2) criterion to select tests. We prove that the EC2 criterion is adaptively submodular, which allows us to prove theoretical guarantees against the Bayes-optimal testing sequence even in the presence of noisy responses. In simulated ground-truth experiments, we find that the EC2 criterion recovers the true hypotheses with significantly fewer tests than more widely used criteria such as Information Gain and Generalized Binary Search. We show, theoretically as well as experimentally, that these popular criteria can, surprisingly, perform poorly in the presence of noise or subject errors. Furthermore, we use the adaptive submodularity of EC2 to implement an accelerated greedy version of BROAD, which leads to orders-of-magnitude speedups over other methods.
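As an illustration of the EC2 selection step, here is a compact sketch under strong simplifying assumptions: noiseless binary responses and one hypothesis per theory, far simpler than the thesis's noisy, parameterized setting. Names and structure are ours, not the thesis code.

```python
# Illustrative greedy EC2 test selection; setting and names are assumptions.
import numpy as np

def ec2_score(prior, pred_col, outcomes=(0, 1)):
    """Expected weight of hypothesis-pair 'edges' cut by a test. An edge
    (i, j) has weight p_i * p_j and is cut once either endpoint is
    eliminated by the observed outcome."""
    p = np.asarray(prior, float)
    w = np.outer(p, p)
    np.fill_diagonal(w, 0.0)                      # no self-edges
    total = 0.0
    for o in outcomes:
        consistent = (pred_col == o)
        survive = np.outer(consistent, consistent)
        total += p[consistent].sum() * w[~survive].sum() / 2.0
    return total

def next_test(prior, predictions, remaining):
    """Greedy step: pick the test with the largest expected cut weight.
    predictions[i, t] = theory i's predicted (binary) choice on test t."""
    return max(remaining, key=lambda t: ec2_score(prior, predictions[:, t]))

def update(prior, predictions, t, outcome):
    """Noiseless update: zero out theories inconsistent with the outcome."""
    post = np.where(predictions[:, t] == outcome, prior, 0.0)
    return post / post.sum()
```

Running `next_test`, observing the subject's choice, and calling `update` in a loop is the basic BROAD cycle; adaptive submodularity of the cut objective is what justifies this greedy choice.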

We use BROAD to perform two experiments. First, we compare the main classes of theories for decision-making under risk, namely expected value, prospect theory, constant relative risk aversion (CRRA), and moments models. Subjects are given an initial endowment and sequentially presented with choices between two lotteries, with the possibility of losses. The lotteries are selected using BROAD, and 57 subjects from Caltech and UCLA are incentivized by randomly realizing one of the lotteries chosen. Aggregate posterior probabilities over the theories show limited evidence in favour of CRRA and moments models. Classifying the subjects into types shows that most subjects are described by prospect theory, followed by expected value. Adaptive experimental design raises the possibility that subjects could engage in strategic manipulation, i.e., subjects could mask their true preferences and choose differently in order to obtain more favourable tests in later rounds, thereby increasing their payoffs. We pay close attention to this problem; strategic manipulation is ruled out, since it is infeasible in practice and since we find no signatures of it in our data.

In the second experiment, we compare the main theories of time preference: exponential discounting, hyperbolic discounting, the "present bias" models (quasi-hyperbolic (α, β) discounting and fixed-cost discounting), and generalized-hyperbolic discounting. Forty subjects from UCLA were given choices between two options: a smaller but more immediate payoff versus a larger but later payoff. We found very limited evidence for the present bias models and for hyperbolic discounting; most subjects were classified as generalized-hyperbolic discounting types, followed by exponential discounting.

In all of these models, the passage of time is linear. We instead consider a psychological model in which the perception of time is subjective. We prove that when biological (subjective) time is positively dependent, it gives rise to hyperbolic discounting and temporally inconsistent choice.
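A standard way to see how a stochastic subjective clock can produce hyperbolic discounting (a classical illustration of the mechanism, not the thesis's proof): if rewards are discounted exponentially in subjective time and the resulting effective rate ρ is uncertain, averaging over ρ yields a hyperbolic discount function,

```latex
% Exponential discounting at an uncertain rate, averaged over a Gamma
% distribution on the rate, gives a generalized-hyperbolic discount function.
D(t) = \mathbb{E}\left[e^{-\rho t}\right], \quad \rho \sim \mathrm{Gamma}(\alpha,\beta)
\;\Longrightarrow\;
D(t) = \left(1 + \frac{t}{\beta}\right)^{-\alpha}.
```

Because the ratio D(t+s)/D(t) then depends on t, preferences between a smaller-sooner and a larger-later reward can reverse as both recede into the future, which is exactly the temporal choice inconsistency referred to above.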

We also test the predictions of behavioural theories in the "wild", paying particular attention to prospect theory, which emerged as the dominant theory in our lab experiments on risky choice. Loss aversion and reference dependence predict that consumers will behave in ways distinctly different from what the standard rational model predicts. Specifically, loss aversion predicts that when an item is offered at a discount, the demand for it will be greater than its price elasticity alone can explain; even more importantly, when the item is no longer discounted, demand for its close substitute will increase excessively. We tested this prediction using a discrete choice model with a loss-averse utility function on data from a large eCommerce retailer. Not only did we identify loss aversion, but we also found that the effect decreased with consumers' experience. We outline the policy implications of consumer loss aversion, and strategies for competitive pricing.
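A schematic of a loss-averse utility inside a logit discrete choice rule illustrates the reference-dependence mechanism; the functional form, the reference-price device, and the λ = 2.25 coefficient (the classic Tversky-Kahneman estimate) are illustrative assumptions, not the fitted model from the retailer data.

```python
# Toy loss-averse discrete choice: utility is linear in price plus a
# gain-loss term around a reference price, with losses weighted by lam.
import numpy as np

def loss_averse_utility(price, ref_price, alpha=1.0, lam=2.25):
    gap = ref_price - price                  # >0: perceived gain (a discount)
    gain_loss = np.where(gap >= 0, gap, lam * gap)   # losses loom larger
    return -alpha * price + gain_loss

def choice_probs(prices, ref_prices):
    u = loss_averse_utility(np.asarray(prices), np.asarray(ref_prices))
    e = np.exp(u - u.max())                  # softmax over the choice set
    return e / e.sum()

# Two substitutes A and B at a base price of 10:
print(choice_probs([8.0, 10.0], ref_prices=[10.0, 10.0]))  # A discounted: demand for A spikes
print(choice_probs([10.0, 10.0], ref_prices=[8.0, 10.0]))  # discount withdrawn: A reads as a
                                                           # loss, demand shifts to substitute B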

In future work, BROAD can be applied widely to test different behavioural models, e.g., in social preference and game theory, and in different contextual settings. Additional measurements beyond choice data, including biological measurements such as skin conductance, could be used to eliminate hypotheses more rapidly and speed up model comparison. Discrete choice models also provide a framework for testing behavioural models with field data, and encourage combined lab-field experiments.

Relevance:

30.00%

Publisher:

Abstract:

Heavy goods vehicles exhibit poor braking performance in emergency situations when compared to other vehicles. Part of the problem is caused by sluggish pneumatic brake actuators, which limit the control bandwidth of their anti-lock braking systems. In addition, heuristic control algorithms are used that do not achieve the maximum braking force throughout the stop. In this article, a novel braking system is introduced for pneumatically braked heavy goods vehicles. The conventional brake actuators are improved by placing high-bandwidth, binary-actuated valves directly on the brake chambers. A made-for-purpose valve is described. It achieves a switching delay of 3-4 ms in tests, which is an order of magnitude faster than the solenoids in conventional anti-lock braking systems. The heuristic braking control algorithms are replaced with a wheel slip regulator based on sliding mode control. The combined actuator and slip controller are shown to reduce stopping distances on smooth and rough, high-friction (μ = 0.9) surfaces by 10% and 27%, respectively, in hardware-in-the-loop tests compared with conventional ABS. On smooth and rough, low-friction (μ = 0.2) surfaces, stopping distances are reduced by 23% and 25%, respectively. Moreover, the overall air reservoir size required on a heavy goods vehicle is governed by its air usage during an anti-lock braking stop on a low-friction, smooth surface. The 37% reduction in air usage observed in hardware-in-the-loop tests on this surface therefore represents the potential reduction in reservoir size that could be achieved by the new system. © 2012 IMechE.
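A toy simulation sketches the control concept: a sliding-mode slip regulator driving a binary (fill/dump) valve. The quarter-vehicle model, tyre curve, and every parameter below are illustrative assumptions, not the authors' design or data.

```python
# Illustrative bang-bang sliding-mode wheel-slip control with a binary valve.
import numpy as np

def braking_stop(slip_ref=0.15, dt=1e-3, t_max=5.0):
    m, r, J, g = 2000.0, 0.5, 10.0, 9.81    # corner mass, wheel radius, inertia
    mu = lambda lam: 0.9 * np.sin(1.2 * np.arctan(8.0 * lam))  # toy tyre curve
    v, omega, p = 25.0, 25.0 / r, 0.0       # speed, wheel speed, chamber pressure
    rate, p_max, k_brake = 8.0e4, 2.0e4, 0.8   # valve slew, limits (arbitrary units)
    t = 0.0
    while v > 1.0 and t < t_max:
        lam = max((v - omega * r) / v, 0.0)     # longitudinal wheel slip
        s = lam - slip_ref                      # sliding surface
        # Binary valve as the switching control u = -sign(s):
        # fill when slip is below target, dump when it overshoots.
        p = np.clip(p + (rate if s < 0 else -rate) * dt, 0.0, p_max)
        Fx = mu(lam) * m * g                    # longitudinal tyre force
        v += dt * (-Fx / m)                     # vehicle deceleration
        omega = max(omega + dt * (Fx * r - k_brake * p) / J, 0.0)
        t += dt
    return t, v

print(braking_stop())   # elapsed time and remaining speed for the toy stop
```

The point of the high-bandwidth binary valves is precisely that this kind of fast switching law becomes feasible, holding the slip near the peak of the tyre curve instead of cycling around it heuristically.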

Relevance:

30.00%

Publisher:

Abstract:

We develop a methodology for testing Hicks's induced innovation hypothesis by estimating a product-characteristics model of energy-using consumer durables, augmenting the hypothesis to allow for the influence of government regulations. For the products we explored, the evidence suggests that (i) the rate of overall innovation was independent of energy prices and regulations; (ii) the direction of innovation was responsive to energy price changes for some products but not for others; (iii) energy price changes induced changes in the subset of technically feasible models that were offered for sale; (iv) this responsiveness increased substantially during the period after energy-efficiency product labeling was required; and (v) nonetheless, a sizable portion of efficiency improvements were autonomous.

Relevance:

30.00%

Publisher:

Abstract:

One of the first attempts to develop a formal model of depth cue integration is found in Maloney and Landy's (1989) "human depth combination rule". They advocate that the combination of depth cues by the visual system is best described by a weighted linear model. The present experiments tested whether the linear combination rule applies to the integration of texture and shading. As would be predicted by a linear combination rule, the weight assigned to the shading cue varied as a function of its own curvature value. However, the weight assigned to the texture cue varied systematically as a function of the curvature values of both cues. Here we describe a non-linear model which provides a better fit to the data. Redescribing the stimuli in terms of depth rather than curvature reduced the goodness of fit for all models tested. These results support the hypothesis that the locus of cue integration is a curvature map rather than a depth map. We conclude that the linear combination rule does not generalize to the integration of shading and texture, and that for these cues integration likely occurs after the recovery of surface curvature.
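In symbols (an illustrative formalization; the notation is ours, not Maloney and Landy's): the linear rule combines the curvature signalled by texture, c_T, and by shading, c_S, with fixed weights, whereas the data reported here require a texture weight that depends on both cues:

```latex
% Weighted linear rule vs. the non-linear alternative suggested by the data.
\hat{c} = w_T\, c_T + w_S\, c_S, \qquad w_T + w_S = 1
\qquad\text{vs.}\qquad
\hat{c} = w_T(c_T, c_S)\, c_T + w_S(c_T, c_S)\, c_S .
```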

Relevance:

30.00%

Publisher:

Abstract:

Mid- to high-latitude forest ecosystems have undergone several major compositional changes during the Holocene. The temporal and spatial patterns of these vegetation changes hold potential information about their causes and triggers. Here we test the hypothesis that the timing of vegetation change was synchronous on a sub-continental scale, which would imply a common trigger or a step-like change in climate parameters. Pollen diagrams from selected European regions were statistically divided into assemblage zones and the temporal pattern of the zone boundaries analysed. The results show that the temporal pattern of vegetation change was significantly different from random. Times of change cluster around 8.2, 4.8, 3.7, and 1.2 ka, while times of higher-than-average stability were found around 2.1 and 5.1 ka. Compositional changes linked to the expansion of Corylus avellana and Alnus glutinosa centre around 10.6 and 9.5 ka, respectively. A climatic trigger initiating these changes may have occurred 0.5 to 1 ka earlier in each case. The synchronous expansion of C. avellana and A. glutinosa exemplifies that dispersal is not necessarily followed by population expansion. The partly synchronous, partly random expansion of A. glutinosa in adjacent European regions shows that sudden synchronous population expansions are not species-specific traits but vary regionally.

Relevance:

30.00%

Publisher:

Abstract:

Moving beyond simply documenting that political violence negatively affects children, we tested a social-ecological hypothesis for the relations between political violence and child outcomes. Participants were 700 mother-child dyads (child M = 12.1 years, SD = 1.8) from 18 working-class, socially deprived areas in Belfast, Northern Ireland, including single- and two-parent families. Sectarian community violence was associated with elevated family conflict and with children's reduced security about multiple aspects of their social environment (i.e., family, parent-child relations, and community), with links to child adjustment problems and reductions in prosocial behavior. By comparison, and consistent with expectations, links with negative family processes, child regulatory problems, and child outcomes were less consistent for nonsectarian community violence. Support was found for a social-ecological model of the relations between political violence and child outcomes among both single- and two-parent families, with evidence that emotional security and adjustment problems were more negatively affected in single-parent families. The implications for understanding the social ecologies of political violence and children's functioning are discussed.

Relevance:

30.00%

Publisher:

Abstract:

Integrating evidence from multiple domains is useful in prioritizing disease candidate genes for subsequent testing. We ranked all known human genes (n = 3819) under linkage peaks in the Irish Study of High-Density Schizophrenia Families using three different evidence domains: 1) a meta-analysis of microarray gene expression results using the Stanley Brain collection, 2) a schizophrenia protein-protein interaction network, and 3) a systematic literature search. Each gene was assigned a domain-specific p-value and ranked after evaluating the evidence within each domain. For comparison with this ranking process, a large-scale candidate gene hypothesis was also tested by including genes with Gene Ontology terms related to neurodevelopment. Subsequently, genotypes of 3725 SNPs in 167 genes from a custom Illumina iSelect array were used to evaluate the top-ranked vs. hypothesis-selected genes. Seventy-three genes were both highly ranked and involved in neurodevelopment (category 1), while 42 and 52 genes were exclusive to neurodevelopment (category 2) or highly ranked (category 3), respectively. The most significant associations were observed in the genes PRKG1, PRKCE, and CNTN4, but no individual SNPs were significant after correction for multiple testing. Comparison of the approaches showed an excess of significant tests in the hypothesis-driven neurodevelopment category, and random selection of similar-sized gene sets from two independent genome-wide association studies (GWAS) of schizophrenia showed that this excess was unlikely to have arisen by chance. In a further meta-analysis of three GWAS datasets, four candidate SNPs reached nominal significance. Although gene ranking using integrated sources of prior information did not enrich for significant results in the current experiment, gene selection using an a priori hypothesis (neurodevelopment) was superior to random selection. As such, further development of gene ranking strategies using more carefully selected sources of information is warranted.
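As a small illustration of the ranking step, domain-specific p-values can be combined per gene, for example with Fisher's method; the combination rule and the numbers below are standard-choice assumptions for illustration, not the study's actual procedure or data.

```python
# Sketch: combine per-domain p-values into one ranking via Fisher's method.
import numpy as np
from scipy import stats

def fisher_combine(pvals):
    """Fisher's method: X = -2 * sum(log p) ~ chi2(2k) under the joint null,
    assuming the k domain p-values are independent."""
    pvals = np.asarray(pvals, dtype=float)
    x = -2.0 * np.log(pvals).sum()
    return stats.chi2.sf(x, df=2 * len(pvals))

# Hypothetical p-values per gene from the three domains: expression
# meta-analysis, protein-protein interaction network, literature search.
genes = {"PRKG1": [0.01, 0.20, 0.05],
         "PRKCE": [0.03, 0.10, 0.30],
         "CNTN4": [0.50, 0.40, 0.60]}
ranked = sorted(genes, key=lambda g: fisher_combine(genes[g]))
print(ranked)   # genes ordered by combined evidence, strongest first
```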

Relevance:

30.00%

Publisher:

Abstract:

Background: Molecular characteristics of cancer vary between individuals. In future, most trials will require assessment of biomarkers to allocate patients into enriched populations in which targeted therapies are more likely to be effective. The MRC FOCUS3 trial is a feasibility study to assess key elements in the planning of such studies.

Patients and methods: Patients with advanced colorectal cancer were registered from 24 centres between February 2010 and April 2011. With their consent, patients' tumour samples were analysed for KRAS/BRAF oncogene mutation status and for topoisomerase 1 (topo-1) immunohistochemistry. Patients were then classified into one of four molecular strata; within each stratum, patients were randomised to one of two hypothesis-driven experimental therapies or to a common control arm (FOLFIRI chemotherapy). A four-stage suite of patient information sheets (PISs) was developed to avoid overloading patients with information.

Results: A total of 332 patients were registered and 244 randomised. Among randomised patients, biomarker results were provided within 10 working days (w.d.) for 71%, within 15 w.d. for 91%, and within 20 w.d. for 99%. DNA mutation analysis was 100% concordant between the two laboratories. Over 90% of participants reported excellent understanding of all aspects of the trial. In this randomised phase II setting, omission of irinotecan in the low topo-1 group was associated with an increased response rate, and addition of cetuximab in the KRAS/BRAF wild-type cohort was associated with longer progression-free survival.

Conclusions: Patient samples can be collected and analysed within workable time frames and with reproducible mutation results. Complex multi-arm designs are acceptable to patients when supported by good PISs. Randomisation within each cohort provides outcome data that can inform clinical practice.