50 results for Stair Nested Designs

in CentAUR: Central Archive, University of Reading - UK


Relevance:

30.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components, starting from the shortest lag, one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys: one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
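
The component-accumulation step lends itself to a short illustration. Below is a minimal sketch, in Python rather than the Fortran/REML code referred to above, of how hierarchical variance components (hypothetical lags and values here) accumulate from the shortest lag upward into a rough variogram.

```python
# Minimal sketch (not the paper's Fortran or REML code): a rough variogram built
# by accumulating hierarchical variance components from the shortest lag upward.
# The stage lags and component values below are hypothetical.

lags_m = [6, 30, 150, 750]          # separating distances in geometric progression
components = [0.8, 1.5, 2.1, 0.9]   # variance component per stage, shortest lag first

semivariance = []
running_total = 0.0
for gamma_k in components:
    running_total += gamma_k        # gamma(lag_j) ~ sum of components up to stage j
    semivariance.append(running_total)

for lag, gamma in zip(lags_m, semivariance):
    print(f"lag {lag:>4} m : semivariance {gamma:.2f}")
```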

Relevance:

30.00%

Publisher:

Abstract:

The paper is concerned with the uniformization of a system of affine recurrence equations. This transformation is used in the design (or compilation) of highly parallel embedded systems (VLSI systolic arrays, signal processing filters, etc.). We present and implement an automatic system to achieve uniformization of systems of affine recurrence equations. We unify the results from many earlier papers, develop some theoretical extensions, and then propose effective uniformization algorithms. Our results can be used in any high level synthesis tool based on polyhedral representation of nested loop computations.
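
A minimal sketch of the core idea may help: the affine recurrence below has a broadcast dependence (every point reads a[0]), and introducing a propagation variable turns it into a uniform recurrence in which every dependence is a fixed offset. The variable names are illustrative only and are not taken from the paper.

```python
# Minimal sketch of the idea behind uniformization (illustrative, not the paper's
# algorithm): the broadcast b[i] = a0 * c[i] reads the single value a0 at every
# index i, an affine (non-uniform) dependence; a pipeline variable p turns every
# dependence into the fixed offset i-1 -> i.

N = 8
a0 = 3.0
c = [float(i) for i in range(N)]

# Affine (broadcast) form: every b[i] depends on the one value a0.
b_affine = [a0 * c[i] for i in range(N)]

# Uniformized form: p[i] = p[i-1] propagates a0, so each point reads
# only its immediate neighbour.
p = [0.0] * N
b_uniform = [0.0] * N
for i in range(N):
    p[i] = a0 if i == 0 else p[i - 1]   # uniform dependence (distance 1)
    b_uniform[i] = p[i] * c[i]

assert b_affine == b_uniform
print(b_uniform)
```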

Relevance:

20.00%

Publisher:

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small- and medium-scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals less than 60 m; however, the sampling design did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
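
For illustration, the sketch below (with hypothetical distances, not the survey's actual layout) generates sampling locations for an unbalanced nested design along a transect, with stage separations shrinking roughly geometrically and only one branch per stage carried to the deeper stages.

```python
# Minimal sketch (hypothetical distances, not the survey's actual layout):
# sampling locations for an unbalanced nested design along a transect; stage
# separations shrink roughly geometrically and only one branch per stage is
# carried to the deeper stages, which economizes on samples.

stage_sep_m = [60.0, 20.0, 6.0, 2.0]   # separating distance at each stage

def branch(x, stage):
    """One branch of the hierarchy: the current point, its deeper subdivisions,
    and one offset point per stage (the offsets are not subdivided further)."""
    if stage == len(stage_sep_m):
        return [x]
    return branch(x, stage + 1) + [x + stage_sep_m[stage]]

main_stations = [0.0, 200.0, 400.0]    # hypothetical first-stage centres
locations = sorted(pt for x0 in main_stations for pt in branch(x0, 0))
print(f"{len(locations)} sampling locations:", locations)
```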

Relevance:

20.00%

Publisher:

Abstract:

Architects and engineers depend on copyright law to protect their original works. Copyright protection is automatic once an original work is fixed in a tangible medium of expression, in accordance with the Copyright, Designs and Patents Act 1988. Architectural works are protected as literary works (design drawings and plans) and as artistic works (the building or a model of the building). The case law on the concept of “originality”, however, shows that it may be difficult for certain artistic works of architecture to achieve copyright protection. Although copyright law provides automatic protection to all original architectural plans, the limitation is that it only protects the expression of ideas, not the ideas themselves. The purpose of this research is to explore how effectively the UK’s copyright law regime protects the rights and interests of architects in their works. In addition, the United States system of copyright law is analysed to determine whether it provides more effective protection for architects and engineers with regard to architectural works. The key objective of this comparison is to compare and contrast the extent to which the two systems protect the rights and interests of architects against copyright infringement. The comparative analysis concludes by considering the possibility of copyright law reform in the UK.

Relevance:

20.00%

Publisher:

Abstract:

Copyright protects the rights and interests of authors in their original works of authorship, such as literary, dramatic, musical, artistic and certain other intellectual works, including architectural works and designs. It is automatic once an original work is fixed in a tangible medium of expression, in accordance with the Copyright, Designs and Patents Act 1988 (CDPA 1988); this includes the building and the architectural plans and drawings. There is no official copyright registry, no fees need to be paid, and the works may be published or unpublished. Copyright owners have the right to control the reproduction, display, publication and even derivation of the design. However, there are limitations on the rights of copyright owners concerning copyright infringement. Infringement of copyright is an unauthorised violation of the exclusive rights of the copyright author. Architects and engineers depend on copyright law to protect their works and designs. Copyright protects the arrangement of spaces and elements as well as the overall form of the architectural design; however, it does not cover the design of functional elements and standard features. Although copyright law provides automatic protection to all original architectural plans, the limitation is that copyright only protects the expression of ideas, not the ideas themselves. It can be argued that architectural drawings and designs, including models, are recognised categories of artistic works protected under copyright law. This research investigates to what extent copyright protects the rights and interests of designers in architectural works and design.

Relevance:

20.00%

Publisher:

Abstract:

This paper reviews state-of-the-art statistical designs for dose-escalation procedures in first-into-man studies. The main focus is on studies in oncology, as most statistical procedures for phase I trials have been proposed in this context. Extensions to situations such as the observation of bivariate outcomes and healthy volunteer studies are also discussed. The number of dose levels and the cohort sizes used in early phase trials are considered. Finally, the paper raises some practical issues for dose-escalation procedures.
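
As one concrete example of the kind of rule-based procedure such a review covers, the sketch below implements the decision rule of the classic "3+3" design; it is offered only as an illustration and is not drawn from the paper itself.

```python
# Minimal sketch of one rule-based dose-escalation procedure (the classic "3+3"),
# shown only to illustrate the kind of design the review surveys; the paper
# itself covers a range of rule-based and model-based procedures.

def three_plus_three(toxicities_by_cohort):
    """toxicities_by_cohort: DLT counts for successive cohorts of 3 at the
    current dose. Returns the escalation decision as a string."""
    first = toxicities_by_cohort[0]
    if first == 0:
        return "escalate to next dose"
    if first == 1:
        if len(toxicities_by_cohort) == 1:
            return "treat 3 more at this dose"
        return ("escalate to next dose" if toxicities_by_cohort[1] == 0
                else "stop: MTD exceeded")
    return "stop: MTD exceeded"

print(three_plus_three([0]))      # 0/3 DLTs -> escalate
print(three_plus_three([1]))      # 1/3 DLTs -> expand the cohort
print(three_plus_three([1, 1]))   # 2/6 DLTs -> stop
```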

Relevance:

20.00%

Publisher:

Abstract:

In this paper, Bayesian decision procedures previously proposed for dose-escalation studies in healthy volunteers are reviewed and evaluated. Modifications are made to the expression of the prior distribution in order to make the procedure simpler to implement, and a more relevant criterion for optimality is introduced. The results of an extensive simulation exercise to establish the properties of the procedure and to aid choice between designs are summarized, and the way in which readers can use simulation to choose a design for their own trials is described. The influence of the value of the within-subject correlation on the procedure is investigated, and the use of a simple prior to reflect uncertainty about the correlation is explored. Copyright (c) 2005 John Wiley & Sons, Ltd.
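
The role of simulation in choosing between designs can be illustrated with a minimal sketch. The dose-response model, safety rule and escalation scheme below are assumptions for illustration only, not the Bayesian decision procedure evaluated in the paper.

```python
# Minimal sketch (assumed model and rules, not the paper's procedure): using
# simulation to compare the operating characteristics of two candidate cohort
# sizes under a hypothetical linear dose-response with a safety limit.

import random

doses = [10, 20, 40, 80, 160]
true_mean = {d: 0.01 * d for d in doses}   # hypothetical response means
safety_limit = 1.2                         # stop if any observed response exceeds this

def run_trial(cohort_size, rng):
    """Escalate one dose level per cohort until the safety limit is breached."""
    highest_safe = None
    for d in doses:
        obs = [rng.gauss(true_mean[d], 0.3) for _ in range(cohort_size)]
        if max(obs) > safety_limit:
            return highest_safe
        highest_safe = d
    return highest_safe

rng = random.Random(42)
for cohort_size in (2, 4):
    finals = [run_trial(cohort_size, rng) for _ in range(2000)]
    reach_top = sum(f == doses[-1] for f in finals) / len(finals)
    print(f"cohort size {cohort_size}: P(reach top dose) = {reach_top:.2f}")
```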

Relevance:

20.00%

Publisher:

Abstract:

In clinical trials, situations often arise where more than one response from each patient is of interest, and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between the test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of the correlation from data collected as part of the trial. An adaptive approach that makes use of these formulas is proposed and evaluated, and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
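
A minimal sketch of the adaptive ingredient, using a plain Pearson estimate on hypothetical paired efficacy and safety responses rather than the paper's formulas for the correlation between test statistics:

```python
# Minimal sketch (not the paper's formulas): estimating the correlation between
# paired efficacy and safety responses from accumulating trial data, the kind of
# quantity an adaptive bivariate sequential design would update at each interim
# analysis instead of fixing rho in advance.

import math

def sample_correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical interim data: efficacy and safety scores for the same patients.
efficacy = [1.2, 0.8, 1.5, 0.3, 1.1, 0.9, 1.7, 0.5]
safety   = [0.9, 0.7, 1.2, 0.4, 1.0, 0.6, 1.4, 0.6]
rho_hat = sample_correlation(efficacy, safety)
print(f"estimated rho = {rho_hat:.2f}  (used to recompute the joint stopping boundary)")
```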

Relevance:

20.00%

Publisher:

Abstract:

Most statistical methodology for phase III clinical trials focuses on the comparison of a single experimental treatment with a control. An increasing desire to reduce the time before regulatory approval of a new drug is sought has led to the development of two-stage or sequential designs for trials that combine the definitive analysis associated with phase III with the treatment-selection element of a phase II study. In this paper we consider a trial in which the most promising of a number of experimental treatments is selected at the first interim analysis. This considerably reduces the computational load associated with the construction of stopping boundaries compared to the approach proposed by Follmann, Proschan and Geller (Biometrics 1994; 50: 325-336). The computational requirement does not exceed that for the sequential comparison of a single experimental treatment with a control. Existing methods are extended in two ways. First, the use of the efficient score as a test statistic makes the analysis of binary, normal or failure-time data, as well as adjustment for covariates or stratification, straightforward. Second, the question of trial power is also considered, enabling determination of the sample size required to give specified power. Copyright © 2003 John Wiley & Sons, Ltd.
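
The select-then-test structure can be sketched briefly. The simulation below assumes a normal model, illustrative sample sizes and no formal stopping boundary; it shows only the selection of the most promising arm at the first interim analysis followed by a second-stage comparison with control.

```python
# Minimal sketch (assumed normal model, illustrative sample sizes, no formal
# boundary): select the most promising of several experimental arms at the first
# interim analysis, then compare only that arm with control at the second stage.

import random, statistics

rng = random.Random(7)
true_means = {"control": 0.0, "A": 0.1, "B": 0.3, "C": 0.2}
n1, n2 = 30, 60                        # per-arm sample sizes at the two stages

def mean_sample(mu, n):
    return statistics.fmean(rng.gauss(mu, 1.0) for _ in range(n))

# Stage 1: estimate every arm, keep the experimental arm with the largest effect.
stage1 = {arm: mean_sample(mu, n1) for arm, mu in true_means.items()}
selected = max((a for a in stage1 if a != "control"), key=stage1.get)

# Stage 2: additional data on the selected arm and control only.
diff2 = mean_sample(true_means[selected], n2) - mean_sample(0.0, n2)
z = diff2 / (2.0 / n2) ** 0.5          # z statistic for the stage-2 difference
print(f"selected {selected}; stage-2 z = {z:.2f}",
      "(compared with a boundary chosen to control the overall type I error)")
```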

Relevance:

20.00%

Publisher:

Abstract:

This article describes an approach to the optimal design of phase II clinical trials using Bayesian decision theory. The method proposed extends that suggested by Stallard (1998, Biometrics 54, 279–294), in which designs were obtained to maximize a gain function including the cost of drug development and the benefit from a successful therapy. Here, the approach is extended by the consideration of other potential therapies, the development of which is competing for the same limited resources. The resulting optimal designs are shown to have frequentist properties much more similar to those traditionally used in phase II trials.
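
As a rough illustration of a gain-function approach (with a hypothetical gain function and prior probability of effectiveness, not the model of Stallard or of this paper), the sketch below chooses a phase II sample size to maximise expected gain, trading the benefit of advancing an effective therapy against per-patient cost.

```python
# Minimal sketch (hypothetical gain function and prior, not the paper's model):
# choosing a phase II sample size to maximise expected gain, where the gain
# trades off the benefit of correctly advancing an effective therapy against
# per-patient development cost.

import math

def expected_gain(n, p_effective=0.3, benefit=100.0, cost_per_patient=0.5,
                  effect=0.5, alpha_z=1.645):
    """Expected gain for a one-arm trial of size n with a one-sided z-test at 0.05."""
    # Power to detect the assumed effect if the therapy is truly effective.
    power = 0.5 * (1 + math.erf((effect * math.sqrt(n) - alpha_z) / math.sqrt(2)))
    return p_effective * power * benefit - cost_per_patient * n

best_n = max(range(5, 200), key=expected_gain)
print(f"sample size maximising expected gain: {best_n}",
      f"(expected gain {expected_gain(best_n):.1f})")
```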