892 results for Two-stage classification


Relevance: 80.00%

Abstract:

Most statistical methodology for phase III clinical trials focuses on the comparison of a single experimental treatment with a control. An increasing desire to reduce the time before regulatory approval of a new drug is sought has led to the development of two-stage or sequential designs for trials that combine the definitive analysis associated with phase III with the treatment-selection element of a phase II study. In this paper we consider a trial in which the most promising of a number of experimental treatments is selected at the first interim analysis. This considerably reduces the computational load associated with the construction of stopping boundaries compared with the approach proposed by Follmann, Proschan and Geller (Biometrics 1994; 50: 325-336); the computational requirement does not exceed that for the sequential comparison of a single experimental treatment with a control. Existing methods are extended in two ways. First, the use of the efficient score as a test statistic makes the analysis of binary, normal or failure-time data, as well as adjustment for covariates or stratification, straightforward. Second, the question of trial power is also considered, enabling determination of the sample size required to give specified power. Copyright © 2003 John Wiley & Sons, Ltd.
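For illustration only, the following Python sketch simulates the overall flow of a select-the-best two-stage design with normal outcomes and a known standard deviation. The critical values c1 and c2 are arbitrary placeholders, and the final test naively pools both stages without correcting for the selection at the interim analysis, which is exactly the problem the paper's stopping boundaries are constructed to handle.

    import numpy as np

    rng = np.random.default_rng(1)

    def two_stage_select_trial(deltas, n1, n2, c1=1.0, c2=2.0, sigma=1.0):
        """Illustrative select-the-best two-stage trial with normal outcomes.

        deltas : true mean advantages of the experimental arms over control
        n1, n2 : per-arm sample sizes at stage 1 and stage 2
        c1, c2 : placeholder futility / efficacy critical values (not the
                 boundaries derived in the paper)
        Returns (selected_arm, reject_null).
        """
        k = len(deltas)
        # Stage 1: all experimental arms plus control
        control1 = rng.normal(0.0, sigma, n1)
        arms1 = [rng.normal(d, sigma, n1) for d in deltas]
        z1 = [(a.mean() - control1.mean()) / (sigma * np.sqrt(2.0 / n1)) for a in arms1]
        best = int(np.argmax(z1))            # select the most promising arm
        if z1[best] < c1:                    # stop early for futility
            return best, False
        # Stage 2: continue with the selected arm only
        control2 = rng.normal(0.0, sigma, n2)
        arm2 = rng.normal(deltas[best], sigma, n2)
        # Pool both stages for the final comparison (ignores selection bias)
        x = np.concatenate([arms1[best], arm2])
        c = np.concatenate([control1, control2])
        z = (x.mean() - c.mean()) / (sigma * np.sqrt(1.0 / len(x) + 1.0 / len(c)))
        return best, z > c2

    # Crude power estimate under one configuration of true effects
    results = [two_stage_select_trial([0.0, 0.2, 0.5], n1=50, n2=100) for _ in range(2000)]
    print("empirical power:", np.mean([r[1] for r in results]))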

Relevance: 80.00%

Abstract:

We introduce a procedure for association-based analysis of nuclear families that allows for dichotomous and more general measurements of phenotype and the inclusion of covariate information. Standard generalized linear models are used to relate the phenotype to its predictors. Our test procedure, based on the likelihood ratio, unifies the estimation of all parameters through the likelihood itself and yields maximum likelihood estimates of the genetic relative risk and interaction parameters. Our method has advantages in modelling the covariate and gene-covariate interaction terms over recently proposed conditional score tests that include covariate information via a two-stage modelling approach. We apply our method in a study of human systemic lupus erythematosus and C-reactive protein that includes sex as a covariate.
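As a rough illustration of the kind of likelihood ratio comparison described, the Python sketch below fits a logistic generalized linear model with a genotype term, a sex covariate and their interaction, and compares it with a covariate-only model. The data and variable names are invented, and the sketch ignores the family-based conditioning that the authors' procedure uses; it only shows the generic GLM likelihood ratio machinery.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from scipy import stats

    # Illustrative data: 'geno' counts risk alleles (0/1/2), 'sex' is the covariate
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "geno": rng.integers(0, 3, n),
        "sex": rng.integers(0, 2, n),
    })
    logit = -1.0 + 0.4 * df.geno + 0.3 * df.sex + 0.5 * df.geno * df.sex
    df["pheno"] = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    def fit(cols):
        X = sm.add_constant(df[cols])
        return sm.GLM(df.pheno.astype(float), X, family=sm.families.Binomial()).fit()

    df["gxs"] = df.geno * df.sex
    full = fit(["geno", "sex", "gxs"])       # genotype, covariate, interaction
    reduced = fit(["sex"])                   # covariate only (no genetic effect)

    lr = 2.0 * (full.llf - reduced.llf)      # likelihood ratio statistic
    p = stats.chi2.sf(lr, df=2)              # 2 extra parameters: geno and gxs
    print("LR =", round(lr, 2), "p =", p)
    print("genetic relative risk (odds-ratio scale):", np.exp(full.params["geno"]))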

Relevance: 80.00%

Abstract:

Analyses of high-density single-nucleotide polymorphism (SNP) data, such as genetic mapping and linkage disequilibrium (LD) studies, require phase-known haplotypes to allow for the correlation between tightly linked loci. However, current SNP genotyping technology cannot determine phase, which must be inferred statistically. In this paper, we present a new Bayesian Markov chain Monte Carlo (MCMC) algorithm for population haplotype frequency estimation, particularly in the context of LD assessment. The novel feature of the method is the incorporation of a log-linear prior model for population haplotype frequencies. We present simulations to suggest that (1) the log-linear prior model is more appropriate than the standard coalescent process in the presence of recombination (>0.02 cM between adjacent loci), and (2) there is substantial inflation in measures of LD obtained by a "two-stage" approach to the analysis that treats the "best" haplotype configuration as correct, without regard to uncertainty in the recombination process. Genet Epidemiol 25:106-114, 2003. © 2003 Wiley-Liss, Inc.
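For context on the downstream LD assessment, here is a minimal Python sketch computing D, D' and r-squared for two biallelic loci from a vector of phase-known haplotype frequencies. The frequencies below are made up and simply stand in for the kind of posterior estimates the MCMC algorithm would produce; this is not the paper's algorithm itself.

    import numpy as np

    def ld_measures(freq_AB, freq_Ab, freq_aB, freq_ab):
        """Two-locus LD from phase-known haplotype frequencies (must sum to 1)."""
        pA = freq_AB + freq_Ab          # allele A frequency at locus 1
        pB = freq_AB + freq_aB          # allele B frequency at locus 2
        D = freq_AB - pA * pB           # coefficient of linkage disequilibrium
        r2 = D**2 / (pA * (1 - pA) * pB * (1 - pB))
        if D > 0:
            Dmax = min(pA * (1 - pB), (1 - pA) * pB)
        else:
            Dmax = min(pA * pB, (1 - pA) * (1 - pB))
        return D, D / Dmax, r2          # D, D', r^2

    # Made-up posterior mean haplotype frequencies from an MCMC run
    print(ld_measures(0.45, 0.15, 0.05, 0.35))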

Relevance: 80.00%

Abstract:

1. Demographic models are assuming an important role in management decisions for endangered species. Elasticity analysis and scope for management analysis are two such applications. Elasticity analysis determines the vital rates that have the greatest impact on population growth. Scope for management analysis examines the effects that feasible management might have on vital rates and population growth. Both methods target management in an attempt to maximize population growth.

2. The Seychelles magpie robin Copsychus sechellarum is a critically endangered island endemic, the population of which underwent significant growth in the early 1990s following the implementation of a recovery programme. We examined how the formal use of elasticity and scope for management analyses might have shaped management in the recovery programme, and assessed their effectiveness by comparison with the actual population growth achieved.

3. The magpie robin population doubled from about 25 birds in 1990 to more than 50 by 1995. A simple two-stage demographic model showed that this growth was driven primarily by a significant increase in the annual survival probability of first-year birds and an increase in the birth rate. Neither the annual survival probability of adults nor the probability of a female breeding at age 1 changed significantly over time.

4. Elasticity analysis showed that the annual survival probability of adults had the greatest impact on population growth. There was some scope to use management to increase survival, but because survival rates were already high (>0.9) this had a negligible effect on population growth. Scope for management analysis showed that significant population growth could have been achieved by targeting management measures at the birth rate and the survival probability of first-year birds, although predicted growth rates were lower than those achieved by the recovery programme when all management measures were in place (i.e. 1992-95).

5. Synthesis and applications. We argue that scope for management analysis can provide a useful basis for management but will inevitably be limited to some extent by a lack of data, as our study shows. This means that identifying perceived ecological problems and designing management to alleviate them must be an important component of endangered species management. The corollary of this is that it will not be possible or wise to consider only management options for which there is a demonstrable ecological benefit. Given these constraints, we see little role for elasticity analysis because, when data are available, a scope for management analysis will always be of greater practical value and, when data are lacking, precautionary management demands that as many perceived ecological problems as possible are tackled.
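For illustration, a minimal Python sketch of the elasticity calculation for a generic two-stage (first-year, adult) matrix model. The matrix entries are placeholder vital rates, not the magpie robin estimates from the paper; only the standard eigenvalue-based elasticity formula is shown.

    import numpy as np

    # Illustrative two-stage (first-year, adult) projection matrix; entries are
    # placeholder vital rates, not the magpie robin estimates from the paper.
    A = np.array([[0.10, 0.80],    # reproduction by first-years and adults
                  [0.55, 0.92]])   # survival of first-years and adults

    # Dominant eigenvalue (population growth rate) and its right/left eigenvectors
    vals, vecs = np.linalg.eig(A)
    i = np.argmax(vals.real)
    lam = vals.real[i]
    w = np.abs(vecs[:, i].real)                  # stable stage distribution
    vals_t, vecs_t = np.linalg.eig(A.T)
    j = np.argmax(vals_t.real)
    v = np.abs(vecs_t[:, j].real)                # reproductive values

    sens = np.outer(v, w) / (v @ w)              # sensitivities d(lambda)/d(a_ij)
    elas = A * sens / lam                        # elasticities (sum to 1)
    print("lambda =", round(lam, 3))
    print("elasticities:\n", np.round(elas, 3))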

Relevance: 80.00%

Abstract:

Isothermal titration microcalorimetry (ITC) has been applied to investigate protein-tannin interactions. Two hydrolyzable tannins were studied, namely myrabolan and tara tannins, for their interaction with bovine serum albumin (BSA), a model globular protein, and gelatin, a model proline-rich random-coil protein. Calorimetry data indicate that protein-tannin interaction mechanisms depend on the nature of the protein involved. Tannins apparently interact nonspecifically with the globular BSA, leading to binding saturation at estimated tannin/BSA molar ratios of 48:1 for tara and 178:1 for myrabolan tannins. Tannins bind to the random-coil protein gelatin by a two-stage mechanism: the energetics of the first stage show evidence for cooperative binding of tannins to the protein, while the second stage indicates gradual saturation of binding sites, as observed for the interaction with BSA. The structure and flexibility of the tannins themselves alter the stoichiometry of the interaction but do not appear to have any significant effect on the overall binding mechanism observed. This study demonstrates the potential of ITC for providing insight into the nature of protein-tannin interactions.

Relevance: 80.00%

Abstract:

The level set method is commonly used for image noise removal. Existing studies concentrate mainly on determining the speed function of the evolution equation. Based on the idea of the Canny operator, this letter introduces a new method of controlling the level set evolution, in which the edge strength is taken into account when choosing curvature flows for the speed function and the normal-to-edge direction is used to orient the diffusion of the moving interface. The addition of an energy term to penalize irregularity allows for better preservation of local edge information. In contrast with previous Canny-based level set methods, which usually adopt a two-stage framework, the proposed algorithm executes all of the above operations in a single process during noise removal.
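The sketch below is not the letter's algorithm; it only illustrates, in plain NumPy, the basic ingredient such methods build on: curvature flow of the image level sets, weighted by an edge-strength function so that diffusion is suppressed across strong edges. The directional (normal-to-edge) diffusion and the irregularity penalty described above are omitted.

    import numpy as np

    def edge_weighted_curvature_flow(img, n_iter=50, dt=0.1, k=0.05):
        """Denoise by curvature flow of image level sets, damped near edges."""
        u = img.astype(float).copy()
        eps = 1e-8
        for _ in range(n_iter):
            uy, ux = np.gradient(u)                 # gradients along rows, columns
            mag = np.sqrt(ux**2 + uy**2)
            g = 1.0 / (1.0 + (mag / k)**2)          # edge-stopping weight (small on edges)
            nx, ny = ux / (mag + eps), uy / (mag + eps)
            # curvature = divergence of the unit normal field
            kappa = np.gradient(nx, axis=1) + np.gradient(ny, axis=0)
            u += dt * g * kappa * mag               # evolve intensities along level sets
        return u

    # Toy example: noisy step edge
    img = np.zeros((64, 64)); img[:, 32:] = 1.0
    noisy = img + 0.2 * np.random.default_rng(0).normal(size=img.shape)
    denoised = edge_weighted_curvature_flow(noisy)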

Relevance: 80.00%

Abstract:

This paper presents a two-stage image restoration framework intended especially for a novel rectangular poor-pixels detector which, with its miniature size, light weight and low power consumption, is of great value in micro vision systems. To meet the demand for fast processing, only a few measured images, shifted at the subpixel level, are needed for the fusion operation, fewer than are required in traditional approaches. By maximum likelihood estimation with a least squares method, a preliminary restored image is obtained by linear interpolation. After noise removal via Canny-operator-based level set evolution, the final high-quality restored image is achieved. Experimental results demonstrate the effectiveness of the proposed framework. It is a sensible step towards subsequent image understanding and object identification.
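To give a feel for least-squares fusion of a few subpixel-shifted measurements, here is a simplified one-dimensional Python sketch under assumed conditions: the shifts are known, each coarse sample is modelled as linear interpolation of a fine grid, and there is no blur or detector model. It is not the paper's restoration pipeline, only the generic maximum likelihood / least squares step.

    import numpy as np

    rng = np.random.default_rng(2)
    n_hi, factor = 64, 2                       # fine grid size, downsampling factor
    x_true = np.sin(np.linspace(0, 4 * np.pi, n_hi)) + (np.arange(n_hi) > 40)

    def sampling_matrix(shift):
        """Each coarse sample linearly interpolates the fine grid at an offset."""
        pos = np.arange(0, n_hi - 3, factor) + shift
        A = np.zeros((len(pos), n_hi))
        lo = np.floor(pos).astype(int)
        frac = pos - lo
        A[np.arange(len(pos)), lo] = 1 - frac
        A[np.arange(len(pos)), lo + 1] = frac
        return A

    shifts = [0.0, 0.7, 1.6, 2.3]              # known subpixel shifts of each frame
    As = [sampling_matrix(s) for s in shifts]
    ys = [A @ x_true + 0.05 * rng.normal(size=A.shape[0]) for A in As]

    # Stack all observations and solve the least-squares (ML under Gaussian noise) problem
    A_all, y_all = np.vstack(As), np.concatenate(ys)
    x_hat, *_ = np.linalg.lstsq(A_all, y_all, rcond=None)
    print("relative reconstruction error:",
          np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))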

Relevance: 80.00%

Abstract:

A common problem in many data-based modelling algorithms, such as associative memory networks, is the curse of dimensionality. In this paper, a new two-stage neurofuzzy system design and construction algorithm (NeuDeC) for nonlinear dynamical processes is introduced to tackle this problem effectively. A simple new preprocessing method is first derived and applied to reduce the rule base, followed by a fine model detection process on the reduced rule set using forward orthogonal least squares model structure detection. In both stages, new criteria based on A-optimality experimental design are used. In the preprocessing stage, a lower bound of the A-optimality design criterion is derived and applied as a subset selection metric; in the later stage, the A-optimality design criterion is incorporated into a new composite cost function that minimises the model prediction error while penalising the model parameter variance. NeuDeC yields unbiased model parameters with low parameter variance and the additional benefit of a parsimonious model structure. Numerical examples are included to demonstrate the effectiveness of this new modelling approach for high-dimensional inputs.
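For reference, the A-optimality design criterion itself is the trace of the inverse information matrix, tr((X^T X)^{-1}). The short Python sketch below scores arbitrary candidate regressor subsets with it; the candidate terms are invented stand-ins for fuzzy rule regressors, not NeuDeC's actual rule base or cost function.

    import numpy as np
    from itertools import combinations

    def a_optimality(X):
        """A-optimality criterion: trace of (X^T X)^{-1}; smaller values mean a
        lower average variance of the least-squares parameter estimates."""
        return np.trace(np.linalg.inv(X.T @ X))

    rng = np.random.default_rng(3)
    x = rng.uniform(-1.0, 1.0, size=(200, 2))

    # Arbitrary candidate basis terms (stand-ins for fuzzy rule regressors)
    candidates = {
        "x1": x[:, 0],
        "x2": x[:, 1],
        "x1*x2": x[:, 0] * x[:, 1],
        "x1^2": x[:, 0] ** 2,
    }

    # Score every two-term subset by the A-optimality criterion
    for subset in combinations(candidates, 2):
        X = np.column_stack([candidates[name] for name in subset])
        print(subset, round(a_optimality(X), 4))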

Relevance: 80.00%

Abstract:

This paper proposes subspace-based space-time (ST) dual-rate blind linear detectors for synchronous DS/CDMA systems, which can be viewed as the ST extension of our previously presented purely temporal dual-rate blind linear detectors. Theoretical analyses of their performance are also carried out. Finally, two-stage ST blind detectors are presented that combine the adaptive purely temporal dual-rate blind MMSE filters with a non-adaptive beamformer. Their adaptive stages, which have a parallel structure, converge much faster than the corresponding adaptive ST dual-rate blind MMSE detectors while having comparable computational complexity.
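The sketch below is not the paper's space-time dual-rate detector; it only illustrates the standard purely temporal blind linear MMSE detector that such schemes build on for a synchronous single-rate system: estimate the received-signal covariance from the data and apply w = R^{-1} s1, using only the desired user's spreading code. Parameters and codes are made up.

    import numpy as np

    rng = np.random.default_rng(4)
    N, K, n_sym, sigma = 16, 4, 2000, 0.3     # spreading gain, users, symbols, noise std

    S = rng.choice([-1.0, 1.0], size=(N, K)) / np.sqrt(N)   # normalized spreading codes
    amps = np.array([1.0, 1.5, 2.0, 2.5])                   # user amplitudes
    bits = rng.choice([-1.0, 1.0], size=(K, n_sym))
    r = S @ (amps[:, None] * bits) + sigma * rng.normal(size=(N, n_sym))

    # Blind linear MMSE detector for user 0: only its code and the received
    # covariance (estimated from the data) are needed.
    R_hat = r @ r.T / n_sym
    w = np.linalg.solve(R_hat, S[:, 0])
    decisions = np.sign(w @ r)
    print("user-0 bit error rate:", np.mean(decisions != bits[0]))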

Relevance: 80.00%

Abstract:

This paper explores the possibility of combining moderate vacuum frying with post-frying application of a high vacuum during the oil drainage stage, with the aim of reducing the oil content of potato chips. Potato slices were initially vacuum fried under two operating conditions (140 °C, 20 kPa and 162 °C, 50.67 kPa) until the moisture content reached 10 and 15 % (wet basis), prior to holding the samples in the head space under a high vacuum (1.33 kPa). This two-stage process was found to lower the amount of oil taken up by potato chips significantly, by as much as 48 %, compared with drainage at the same pressure as the frying pressure. Reducing the pressure to 1.33 kPa lowered the water saturation temperature (to 11 °C), causing the product to lose moisture continuously during the course of drainage. The continuous release of water vapour prevented the occluded surface oil from penetrating into the product structure and released it from the surface of the product. When frying and drainage occurred at the same pressure, the temperature of the product fell below the water saturation temperature soon after it was lifted out of the oil, so the oil was drawn into the product. Thus, lowering the pressure after frying to a value well below the frying pressure is a promising way to reduce oil uptake by the product.

Relevance: 80.00%

Abstract:

This study examines differences in net selling price for residential real estate across male and female agents. A sample of 2,020 home sales transactions from Fulton County, Georgia, is analyzed in a two-stage least squares, geospatial autoregressive-corrected, semi-log hedonic model to test for gender and gender selection effects. Although agent gender seems to play a role in naïve models, its role becomes inconclusive as variables controlling for the possible price and time-on-market expectations of buyers and sellers are introduced. Clear differences in real estate sales prices, time on market, and agent incomes across genders are unlikely to be due to differences in negotiation performance between genders or to the mix of genders in a two-agent negotiation. The evidence suggests an interesting alternative to agent performance: buyers and sellers with different reservation price and time-on-market expectations, such as those selling foreclosure homes, tend to select agents along gender lines.
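As background, a minimal Python sketch of plain two-stage least squares, without the geospatial autoregressive correction used in the study: regress the endogenous regressor on the instruments, then use the fitted values in the second-stage regression. The data-generating process and variable names are invented.

    import numpy as np

    rng = np.random.default_rng(5)
    n = 1000

    z = rng.normal(size=(n, 2))                      # instruments
    u = rng.normal(size=n)                           # common unobserved factor
    endog = z @ np.array([0.8, -0.5]) + 0.7 * u + rng.normal(size=n)   # endogenous regressor
    exog = rng.normal(size=n)                        # exogenous control
    log_price = 0.3 * endog + 0.5 * exog + u + 0.5 * rng.normal(size=n)

    def ols(y, X):
        return np.linalg.lstsq(X, y, rcond=None)[0]

    add1 = lambda *cols: np.column_stack([np.ones(n), *cols])

    # Stage 1: project the endogenous variable onto the instruments and exogenous controls
    gamma = ols(endog, add1(z, exog))
    endog_hat = add1(z, exog) @ gamma

    # Stage 2: hedonic-style regression using the first-stage fitted values
    beta_2sls = ols(log_price, add1(endog_hat, exog))
    beta_ols = ols(log_price, add1(endog, exog))
    print("OLS coefficient on endogenous regressor: ", round(beta_ols[1], 3))
    print("2SLS coefficient on endogenous regressor:", round(beta_2sls[1], 3))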

Relevance: 80.00%

Abstract:

This paper proposes a method for describing the distribution of observed temperatures on any day of the year such that the distribution, and summary statistics of interest derived from it, vary smoothly through the year. The method removes the noise inherent in calculating summary statistics directly from the data, easing comparisons of distributions and summary statistics between different periods. The method is demonstrated using daily effective temperatures (DET) derived from observations of temperature and wind speed at De Bilt, Holland. Distributions and summary statistics are obtained for 1985 to 2009 and compared with the period 1904–1984. A two-stage process first obtains the parameters of a theoretical probability distribution, in this case the generalized extreme value (GEV) distribution, which describes the distribution of DET on any day of the year. Second, linear models describe seasonal variation in the parameters. The model predictions provide parameters of the GEV distribution, and therefore summary statistics, that vary smoothly through the year. There is evidence of an increasing mean temperature, a decrease in the variability of temperatures mainly in the winter, and more positive skew (more warm days) in the summer. In the winter the 2% point, the value below which 2% of observations are expected to fall, has risen by 1.2 °C; in the summer the 98% point has risen by 0.8 °C. Medians have risen by 1.1 and 0.9 °C in winter and summer, respectively. The method can be used to describe distributions of future climate projections and other climate variables. Further extensions to the methodology are suggested.
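A simplified Python sketch of the two-stage idea follows, assuming synthetic data: a GEV distribution is fitted to the observations in a window around each day of the year, and the fitted location parameter is then smoothed with a first-order harmonic regression. The paper's method smooths all GEV parameters with linear models fitted to real DET observations; this only illustrates the mechanics.

    import numpy as np
    from scipy import stats

    # Synthetic "daily effective temperatures": seasonal cycle plus GEV-shaped noise
    days = np.tile(np.arange(365), 30)            # 30 years of day-of-year labels
    temps = (10 + 8 * np.sin(2 * np.pi * (days - 110) / 365)
             + stats.genextreme.rvs(c=0.1, scale=2.0, size=days.size, random_state=0))

    # Stage 1: fit a GEV to a +/- 5 day window around each day of the year
    loc_by_day = np.empty(365)
    for d in range(365):
        window = np.abs((days - d + 182) % 365 - 182) <= 5   # circular day distance
        _, loc, _ = stats.genextreme.fit(temps[window])
        loc_by_day[d] = loc

    # Stage 2: smooth the location parameter with a first-order harmonic regression
    t = np.arange(365)
    X = np.column_stack([np.ones(365),
                         np.sin(2 * np.pi * t / 365),
                         np.cos(2 * np.pi * t / 365)])
    coef, *_ = np.linalg.lstsq(X, loc_by_day, rcond=None)
    loc_smooth = X @ coef
    print("smoothed GEV location on day 200:", round(loc_smooth[200], 2))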

Relevance: 80.00%

Abstract:

In Sub-Saharan Africa (SSA) the technological advances of the Green Revolution (GR) have not been very successful. However, the efforts being made to re-introduce the revolution call for more socio-economic research into the adoption and the effects of the new technologies. This paper discusses an investigation of the effects of GR technology adoption on poverty among households in Ghana. Maximum likelihood estimation of a poverty model within the framework of Heckman's two-stage method of correcting for sample selection was employed. Technology adoption was found to have positive effects in reducing poverty. Other factors that reduce poverty include education, credit, durable assets, and living in the forest belt or in the south of the country. Technology adoption itself was facilitated by education, credit, non-farm income and household labour supply, as well as by living in urban centres. Clearly, technology adoption can be encouraged by increasing the levels of complementary inputs such as credit, extension services and infrastructure. Above all, the fundamental problems of illiteracy, inequality and the lack of effective markets must be addressed through increasing the levels of formal and non-formal education, equitable distribution of the 'national cake' and more pragmatic management of the ongoing Structural Adjustment Programme.
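For orientation, a minimal Python sketch of the classic Heckman two-step selection correction using statsmodels: a probit adoption (selection) equation, the inverse Mills ratio computed from it, and an outcome regression for adopters that includes the Mills ratio term. The paper estimates its poverty model by maximum likelihood within this framework; the data and variable names here are invented.

    import numpy as np
    import statsmodels.api as sm
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    n = 2000

    educ = rng.normal(8, 3, n)                       # years of education
    credit = rng.binomial(1, 0.4, n)                 # access to credit
    u = rng.normal(size=n)                           # unobservable shared by both equations

    # Selection (adoption) equation: who adopts the GR technology
    adopt = (-1.0 + 0.12 * educ + 0.8 * credit + u > 0).astype(float)

    # Outcome equation (e.g. a welfare measure); its error is correlated with the
    # adoption unobservable, which is what the Heckman correction addresses.
    y = 5 + 0.05 * educ + 0.3 * adopt + 0.5 * u + rng.normal(size=n)

    # Step 1: probit for adoption, then the inverse Mills ratio
    Z = sm.add_constant(np.column_stack([educ, credit]))
    probit = sm.Probit(adopt, Z).fit(disp=0)
    xb = Z @ probit.params
    mills = norm.pdf(xb) / norm.cdf(xb)

    # Step 2: outcome regression for adopters including the Mills ratio term
    X = sm.add_constant(np.column_stack([educ[adopt == 1], mills[adopt == 1]]))
    ols = sm.OLS(y[adopt == 1], X).fit()
    print(ols.params)   # coefficient on the Mills ratio reflects selection on unobservables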

Relevance: 80.00%

Abstract:

Purpose – This study aims to provide a review of brownfield policy and the emerging sustainable development agenda in the UK, and to examine the development industry's (both commercial and residential) role and attitudes towards brownfield regeneration and contaminated land.

Design/methodology/approach – The paper analyses results from a two-stage survey of commercial and residential developers carried out in mid-2004, underpinned by structured interviews with 11 developers.

Findings – The results suggest that housebuilding on brownfield is no longer the preserve of specialists and is now widespread throughout the industry in the UK. The redevelopment of contaminated sites for residential use could be threatened by the impact of the EU Landfill Directive. The findings also suggest that developers are not averse to developing on contaminated sites, although post-remediation stigma remains an issue. The market for warranties and insurance continues to evolve.

Research limitations/implications – The survey is based on a sample which represents nearly 30 per cent of UK volume housebuilding. Although the response from the smaller developer groups was relatively under-represented, non-response bias was not found to be a significant issue. More research is needed to assess the way in which developers approach brownfield regeneration at a local level.

Practical implications – The research suggests that clearer Government guidance is needed in the UK on how to integrate concepts of sustainability into brownfield development, and that EU policy, introduced for laudable aims, is creating tensions within the development industry. There may be an emphasis on greenfield development in the future, as the implications of the Barker review are felt.

Originality/value – This is a national survey of developers' attitudes towards brownfield development in the UK, following the Barker Review, and highlights key issues in UK and EU policy layers.

Keywords: Brownfield sites, Contamination

Paper type: Research paper

Relevance: 80.00%

Abstract:

Using simulation methods, we studied the adsorption of binary CO2-CH4 mixtures on various CH4-preadsorbed carbonaceous materials (e.g., triply periodic carbon minimal surfaces, slit-shaped carbon micropores, and Harris's virtual porous carbons) at 293 K. Regardless of micropore geometry, a two-stage mechanism of CH4 displacement from carbon nanospaces by coadsorbed CO2 is proposed. In the first stage, the coadsorbed CO2 molecules induce an enhancement of the adsorbed amount of CH4. In the second stage, the stronger affinity of CO2 for flat and curved graphitic surfaces, together with CO2-CO2 interactions, causes the displacement of CH4 molecules from the carbonaceous materials. The operating conditions for CO2-induced cleaning of the adsorbed phase of the CH4 mixture component depend strongly on the size of the carbon micropores; in general, however, the enhanced adsorption field in narrow carbon ultramicropores facilitates the nonreactive displacement of CH4 by coadsorbed CO2, because the equilibrium CO2/CH4 selectivity (i.e., preferential adsorption of CO2) increases significantly in such pores. In wider micropores, the adsorption field (i.e., the overall surface energy) is very similar for CO2 and CH4, which decreases the preferential adsorption of CO2. This suppresses the displacement of CH4 by coadsorbed CO2 and assists further adsorption of CH4 from the bulk mixture (i.e., CO2/CH4 mixing in the adsorbed phase).