85 results for purely sequential procedure
in CentAUR: Central Archive, University of Reading - UK
Abstract:
A procedure is described in which patients are randomized between two experimental treatments and a control. At a series of interim analyses, each experimental treatment is compared with control. One of the experimental treatments might then be found sufficiently superior to the control for it to be declared the best treatment, and the trial stopped. Alternatively, experimental treatments might be eliminated from further consideration at any stage. It is shown how the procedure can be conducted while controlling overall error probabilities. Data concerning evaluation of different doses of riluzole in the treatment of motor neurone disease are used for illustration.
Abstract:
Pharmacovigilance, the monitoring of adverse events (AEs), is an integral part of the clinical evaluation of a new drug. Until recently, attempts to relate the incidence of AEs to putative causes have been restricted to the evaluation of simple demographic and environmental factors. The advent of large-scale genotyping, however, provides an opportunity to look for associations between AEs and genetic markers, such as single nucleotide polymorphisms (SNPs). It is envisaged that a very large number of SNPs, possibly over 500 000, will be used in pharmacovigilance in an attempt to identify any genetic difference between patients who have experienced an AE and those who have not. We propose a sequential genome-wide association test for analysing AEs as they arise, allowing evidence-based decision-making at the earliest opportunity. This gives us the capability of quickly establishing whether there is a group of patients at high risk of an AE based upon their DNA. Our method provides a valid test which takes account of linkage disequilibrium and allows for the sequential nature of the procedure. The method is more powerful than using a correction, such as Šidák's, that assumes that the tests are independent. Copyright © 2006 John Wiley & Sons, Ltd.
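For comparison with the abstract's claim, the Šidák correction sets the per-test significance level that controls the family-wise error rate at α only when all m tests are independent; it is conservative under linkage disequilibrium. A minimal sketch (the function name is illustrative):

```python
# Šidák correction: per-test significance level that keeps the
# family-wise error rate at alpha when all m tests are independent.
def sidak_threshold(alpha: float, m: int) -> float:
    return 1.0 - (1.0 - alpha) ** (1.0 / m)

# For 500 000 SNP tests at a family-wise alpha of 0.05, the per-SNP
# threshold is on the order of 1e-7:
t = sidak_threshold(0.05, 500_000)
```

Because correlated SNPs carry overlapping information, the effective number of independent tests is smaller than m, which is why a method accounting for linkage disequilibrium can be more powerful.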
Abstract:
This investigation examines metal release from freshwater sediment using sequential extraction and single-step cold-acid leaching. The concentrations of Cd, Cr, Cu, Fe, Ni, Pb and Zn released using a standard 3-step sequential extraction (Rauret et al., 1999) are compared to those released using a 0.5 M HCl leach. The results show that the three sediments behave in very different ways when subject to the same leaching experiments: the cold-acid extraction appears to remove higher relative concentrations of metals from the iron-rich sediment than from the other two sediments. Cold-acid extraction also appears to be more effective than the "reducible" step of the sequential extraction at removing metals from sediments with crystalline iron oxides. The results show that a single-step acid leach can be just as effective as sequential extraction at removing metals from sediment, while being a great deal less time-consuming.
Abstract:
Oral nutrition supplements (ONS) are routinely prescribed to those with, or at risk of, malnutrition. Previous research identified poor compliance due to taste and sweetness. This paper investigates taste and hedonic liking of ONS, of varying sweetness and metallic levels, over consumption volume, an important consideration as patients are prescribed large volumes of ONS daily. A sequential descriptive profile was developed to determine the perception of sensory attributes over repeat consumption of ONS. Changes in liking of ONS following repeat consumption were characterised by a boredom test. Certain flavour (metallic taste, soya milk flavour) and mouthfeel (mouthdrying, mouthcoating) attributes built up over increased consumption volume (p ≤ 0.002). Hedonic liking data from two cohorts, healthy older volunteers (n = 32, median age 73) and patients (n = 28, median age 85), suggested such build-up was disliked. Efforts made to improve the palatability of ONS must take account of the build-up of taste and mouthfeel characteristics over increased consumption volume.
Abstract:
Reducing carbon conversion of ruminally degraded feed into methane increases feed efficiency and reduces emission of this potent greenhouse gas into the environment. Accurate, yet simple, predictions of methane production of ruminants on any feeding regime are important in the nutrition of ruminants, and in modeling methane produced by them. The current work investigated feed intake, digestibility and methane production by open-circuit respiration measurements in sheep fed 15 untreated, sodium hydroxide (NaOH) treated and anhydrous ammonia (NH3) treated wheat, barley and oat straws. In vitro fermentation characteristics of straws were obtained from incubations using the Hohenheim gas production system that measured gas production, true substrate degradability, short-chain fatty acid production and efficiency of microbial production from the ratio of truly degraded substrate to gas volume. In the 15 straws, organic matter (OM) intake and in vivo OM digestibility ranged from 563 to 1201 g and from 0.464 to 0.643, respectively. Total daily methane production ranged from 13.0 to 34.4 l, whereas methane produced/kg OM apparently digested in vivo varied from 35.0 to 61.8 l. The OM intake was positively related to total methane production (R2 = 0.81, P<0.0001), and in vivo OM digestibility was also positively associated with methane production (R2 = 0.67, P<0.001), but negatively associated with methane production/kg digestible OM intake (R2 = 0.61, P<0.001). In the in vitro incubations of the 15 straws, the ratio of acetate to propionate ranged from 2.3 to 2.8 (P<0.05) and efficiencies of microbial production ranged from 0.21 to 0.37 (P<0.05) at half asymptotic gas production.
Total daily methane production, calculated from in vitro fermentation characteristics (i.e., true degradability, SCFA ratio and efficiency of microbial production) and OM intake, compared well with methane measured in the open-circuit respiration chamber (y = 2.5 + 0.86x, R2 = 0.89, P<0.0001, Sy.x = 2.3). Methane production from forage-fed ruminants can be predicted accurately by simple in vitro incubations combining true substrate degradability and gas volume measurements, if feed intake is known.
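The reported calibration amounts to a one-line predictor. The coefficients below are those quoted in the abstract; the function name is illustrative, and y is taken to be chamber-measured methane with x the in vitro-based prediction:

```python
def chamber_methane_from_in_vitro(x: float) -> float:
    """Chamber-measured daily methane (l) as a function of the in
    vitro-based prediction x (l), using the fitted relation reported
    in the abstract: y = 2.5 + 0.86x (R2 = 0.89, Sy.x = 2.3 l)."""
    return 2.5 + 0.86 * x

# e.g. an in vitro-based prediction of 30 l/day corresponds to an
# expected chamber measurement of 2.5 + 0.86 * 30 = 28.3 l/day
y = chamber_methane_from_in_vitro(30.0)
```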
Abstract:
Fixed transaction costs that prohibit exchange engender bias in supply analysis due to censoring of the sample observations. The associated bias in conventional regression procedures applied to censored data and the construction of robust methods for mitigating bias have been preoccupations of applied economists since Tobin [Econometrica 26 (1958) 24]. This literature assumes that the true point of censoring in the data is zero and, when this is not the case, imparts a bias to parameter estimates of the censored regression model. We conjecture that this bias can be significant; affirm this from experiments; and suggest techniques for mitigating this bias using Bayesian procedures. The bias-mitigating procedures are based on modifications of the key step that facilitates Bayesian estimation of the censored regression model; are easy to implement; work well in both small and large samples; and lead to significantly improved inference in the censored regression model. These findings are important in light of the widespread use of the zero-censored Tobit regression and we investigate their consequences using data on milk-market participation in the Ethiopian highlands. (C) 2004 Elsevier B.V. All rights reserved.
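A small simulation makes the censoring issue concrete: observations whose latent supply falls below the true censoring point c pile up at c, and a Tobit model that assumes censoring at zero misattributes that mass. All names and parameter values here are illustrative, not the paper's:

```python
import random

# Illustrative data-generating process for a censored (Tobit-type)
# supply model with a nonzero censoring point c: exchange is only
# observed when latent supply y* exceeds the fixed transaction cost.
def simulate_censored(n, beta0, beta1, sigma, c, seed=0):
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        x = rng.uniform(0.0, 10.0)
        y_star = beta0 + beta1 * x + rng.gauss(0.0, sigma)  # latent supply
        y_obs = max(y_star, c)  # observations censored at c, not at zero
        data.append((x, y_obs, y_star > c))
    return data

sample = simulate_censored(200, 1.0, 0.5, 1.0, c=2.0)
# Share of observations piled up at the censoring point; a zero-censored
# Tobit would treat these as censored at 0 rather than at 2:
censored_share = sum(1 for _, _, uncens in sample if not uncens) / len(sample)
```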
Abstract:
This article illustrates the usefulness of applying bootstrap procedures to total factor productivity Malmquist indices, derived with data envelopment analysis (DEA), for a sample of 250 Polish farms during 1996-2000. The confidence intervals constructed as in Simar and Wilson suggest that the common portrayal of productivity decline in Polish agriculture may be misleading. However, a cluster analysis based on bootstrap confidence intervals reveals that important policy conclusions can be drawn regarding productivity enhancement.
Abstract:
The conventional method for assessing acute oral toxicity (OECD Test Guideline 401) was designed to identify the median lethal dose (LD50), using the death of animals as an endpoint. Introduced as an alternative method (OECD Test Guideline 420), the Fixed Dose Procedure (FDP) relies on the observation of clear signs of toxicity, uses fewer animals and causes less suffering. More recently, the Acute Toxic Class method and the Up-and-Down Procedure have also been adopted as OECD test guidelines. Both of these methods also use fewer animals than the conventional method, although they still use death as an endpoint. Each of the three new methods incorporates a sequential dosing procedure, which results in increased efficiency. In 1999, with a view to replacing OECD Test Guideline 401, the OECD requested that the three new test guidelines be updated. This was to bring them in line with the regulatory needs of all OECD Member Countries, provide further reductions in the number of animals used, and introduce refinements to reduce the pain and distress experienced by the animals. This paper describes a statistical modelling approach for the evaluation of acute oral toxicity tests, by using the revised FDP for illustration. Opportunities for further design improvements are discussed.
Abstract:
In clinical trials, situations often arise where more than one response from each patient is of interest, and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of correlation from data collected as part of the trial. An adaptive approach is proposed and evaluated that makes use of these formulas, and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
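For the adaptive step the abstract describes, the correlation between the two monitored responses can be estimated from accumulating patient data. The sketch below uses a plain Pearson estimate rather than the paper's exact formulas, and the interim data are invented for illustration:

```python
import math

# Pearson correlation estimate between paired efficacy and safety
# measurements collected on the same patients at an interim analysis.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Invented interim measurements for six patients:
eff = [1.2, 0.8, 1.9, 1.4, 0.5, 1.1]
saf = [0.9, 0.7, 1.5, 1.3, 0.4, 1.0]
rho_hat = pearson(eff, saf)  # estimate fed back into the sequential design
```

An adaptive design of the kind described would recompute this estimate at each interim analysis instead of fixing ρ at its lowest plausible value in advance.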
Abstract:
The conventional method for the assessment of acute dermal toxicity (OECD Test Guideline 402, 1987) uses death of animals as an endpoint to identify the median lethal dose (LD50). A new OECD Testing Guideline called the dermal fixed dose procedure (dermal FDP) is being prepared to provide an alternative to Test Guideline 402. In contrast to Test Guideline 402, the dermal FDP does not provide a point estimate of the LD50, but aims to identify that dose of the substance under investigation that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonised System of Classification and Labelling scheme (GHS). The dermal FDP has been validated using statistical modelling rather than by in vivo testing. The statistical modelling approach enables calculation of the probability of each GHS classification and the expected numbers of deaths and animals used in the test for imaginary substances with a range of LD50 values and dose-response curve slopes. This paper describes the dermal FDP and reports the results from the statistical evaluation. It is shown that the procedure will be completed with considerably less death and suffering than guideline 402, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LD50 value.
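Statistical evaluations of this kind typically posit a dose-response curve indexed by the LD50 and a slope, so that classification probabilities and expected deaths can be computed for imaginary substances. A common probit form, given here as an illustration rather than as the paper's actual model:

```python
import math

# Standard normal CDF via the error function.
def phi(z: float) -> float:
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative probit dose-response model:
# P(lethal response at dose d) = Phi(slope * (log10(d) - log10(LD50)))
def p_response(dose: float, ld50: float, slope: float) -> float:
    return phi(slope * (math.log10(dose) - math.log10(ld50)))

# By construction, the response probability at the LD50 itself is 0.5;
# a steeper slope makes the curve rise faster around the LD50.
p = p_response(300.0, 300.0, 2.0)
```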
Statistical evaluation of the fixed concentration procedure for acute inhalation toxicity assessment
Abstract:
The conventional method for the assessment of acute inhalation toxicity (OECD Test Guideline 403, 1981) uses death of animals as an endpoint to identify the median lethal concentration (LC50). A new OECD Testing Guideline called the Fixed Concentration Procedure (FCP) is being prepared to provide an alternative to Test Guideline 403. Unlike Test Guideline 403, the FCP does not provide a point estimate of the LC50, but aims to identify an airborne exposure level that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonized System of Classification and Labelling scheme (GHS). The FCP has been validated using statistical simulation rather than by in vivo testing. The statistical simulation approach predicts the GHS classification outcome and the numbers of deaths and animals used in the test for imaginary substances with a range of LC50 values and dose-response curve slopes. This paper describes the FCP and reports the results from the statistical simulation study assessing its properties. It is shown that the procedure will be completed with considerably less death and suffering than Test Guideline 403, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LC50 value.