32 results for sequential data
Abstract:
Oral nutrition supplements (ONS) are routinely prescribed to those with, or at risk of, malnutrition. Previous research identified poor compliance due to taste and sweetness. This paper investigates taste and hedonic liking of ONS, of varying sweetness and metallic levels, over consumption volume, an important consideration as patients are prescribed large volumes of ONS daily. A sequential descriptive profile was developed to determine the perception of sensory attributes over repeat consumption of ONS. Changes in liking of ONS following repeat consumption were characterised by a boredom test. Certain flavour (metallic taste, soya milk flavour) and mouthfeel (mouthdrying, mouthcoating) attributes built up over increased consumption volume (p 0.002). Hedonic liking data from two cohorts, healthy older volunteers (n = 32, median age 73) and patients (n = 28, median age 85), suggested such build-up was disliked. Efforts made to improve the palatability of ONS must take account of the build-up of taste and mouthfeel characteristics over increased consumption volume.
Abstract:
In clinical trials, situations often arise where more than one response from each patient is of interest, and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between the test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of the correlation from data collected as part of the trial. An adaptive approach is proposed and evaluated that makes use of these formulas, and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
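The sensitivity to ρ can be illustrated with a small simulation. The sketch below is not the procedure developed in this paper: it simply calibrates a constant boundary for a two-look rule that requires both monitored statistics to cross, using an assumed value of ρ, and then evaluates the attained type I error when the true ρ differs. The two-look schedule, the "both statistics must cross" rule, the boundary grid and the simulation sizes are all illustrative assumptions.

```python
# Illustrative sketch (not this paper's method): how misspecifying the
# correlation rho between two monitored test statistics can distort the
# type I error of a simple two-look rule that stops only when BOTH
# statistics cross a constant boundary c.  The boundary is calibrated by
# simulation under an assumed rho and then evaluated under other values.
import numpy as np

rng = np.random.default_rng(1)

def two_look_statistics(rho, n_sim):
    """Standardized statistics at two equally spaced looks, for two
    endpoints with correlation rho, under the global null hypothesis."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    inc1 = rng.multivariate_normal([0.0, 0.0], cov, size=n_sim)  # 1st half
    inc2 = rng.multivariate_normal([0.0, 0.0], cov, size=n_sim)  # 2nd half
    return inc1, (inc1 + inc2) / np.sqrt(2.0)

def type_one_error(c, rho, n_sim):
    """P(both endpoint statistics exceed c at look 1 or look 2 | null)."""
    z1, z2 = two_look_statistics(rho, n_sim)
    return np.mean((z1 > c).all(axis=1) | (z2 > c).all(axis=1))

def calibrate_boundary(rho, alpha=0.05, n_sim=40_000):
    """Grid search for the constant boundary giving error alpha at rho."""
    grid = np.linspace(0.5, 3.5, 121)
    errors = np.array([type_one_error(c, rho, n_sim) for c in grid])
    return grid[np.argmin(np.abs(errors - alpha))]

assumed_rho = 0.0                       # design value used for calibration
c = calibrate_boundary(assumed_rho)
for true_rho in (0.0, 0.3, 0.6, 0.9):
    err = type_one_error(c, true_rho, 200_000)
    print(f"boundary c = {c:.2f} (calibrated at rho = {assumed_rho}); "
          f"true rho = {true_rho:.1f}: attained type I error = {err:.4f}")
```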
Abstract:
A number of authors have proposed clinical trial designs involving the comparison of several experimental treatments with a control treatment in two or more stages. At the end of the first stage, the most promising experimental treatment is selected, and all other experimental treatments are dropped from the trial. Provided it is good enough, the selected experimental treatment is then compared with the control treatment in one or more subsequent stages. The analysis of data from such a trial is problematic because of the treatment selection and the possibility of stopping at interim analyses. These aspects lead to bias in the maximum-likelihood estimate of the advantage of the selected experimental treatment over the control and to inaccurate coverage for the associated confidence interval. In this paper, we evaluate the bias of the maximum-likelihood estimate and propose a bias-adjusted estimate. We also propose an approach to the construction of a confidence region for the vector of advantages of the experimental treatments over the control based on an ordering of the sample space. These regions are shown to have accurate coverage, although they are also shown to be necessarily unbounded. Confidence intervals for the advantage of the selected treatment are obtained from the confidence regions and are shown to have more accurate coverage than the standard confidence interval based upon the maximum-likelihood estimate and its asymptotic standard error.
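The selection bias at issue here is easy to reproduce by simulation. The following sketch is only an illustration of the problem, not the bias-adjusted estimator or confidence regions proposed in the paper: it generates stage-one estimates for several experimental arms, picks the apparently best arm, and shows that the naive estimate of its advantage over control is biased upwards. The number of arms, standard error and simulation size are illustrative assumptions.

```python
# Illustrative sketch of the selection bias in a select-the-best design:
# when all experimental arms are truly no better than control, the naive
# estimate of the selected arm's advantage is biased upwards.  This is
# the problem the paper addresses; it is NOT the paper's bias-adjusted
# estimator or confidence region construction.
import numpy as np

rng = np.random.default_rng(7)
N_SIM = 100_000        # simulated trials (assumption)
K = 4                  # number of experimental arms (assumption)
SE_STAGE1 = 1.0        # standard error of each stage-1 estimate (assumption)

true_advantage = np.zeros(K)                       # all arms equal to control
est = rng.normal(true_advantage, SE_STAGE1, size=(N_SIM, K))
selected = est.argmax(axis=1)                      # pick the best-looking arm
naive = est[np.arange(N_SIM), selected]            # naive estimate for that arm

print("true advantage of every arm          : 0.000")
print(f"mean naive estimate for selected arm : {naive.mean():.3f}")
print(f"i.e. average selection bias          : {naive.mean():.3f}")
```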
Abstract:
Most statistical methodology for phase III clinical trials focuses on the comparison of a single experimental treatment with a control. An increasing desire to reduce the time before regulatory approval of a new drug is sought has led to the development of two-stage or sequential designs for trials that combine the definitive analysis associated with phase III with the treatment selection element of a phase II study. In this paper we consider a trial in which the most promising of a number of experimental treatments is selected at the first interim analysis. This considerably reduces the computational load associated with the construction of stopping boundaries compared to the approach proposed by Follmann, Proschan and Geller (Biometrics 1994; 50: 325-336). The computational requirement does not exceed that for the sequential comparison of a single experimental treatment with a control. Existing methods are extended in two ways. First, the use of the efficient score as a test statistic makes the analysis of binary, normal or failure-time data, as well as adjustment for covariates or stratification, straightforward. Second, the question of trial power is also considered, enabling determination of the sample size required to give specified power. Copyright © 2003 John Wiley & Sons, Ltd.
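For context, the efficient score approach summarises the accumulating data by a score statistic Z and Fisher's information V, which are then monitored against stopping boundaries. One commonly quoted form for a two-arm binary comparison on the log odds ratio scale is sketched below; it is a generic illustration and not necessarily the exact formulation used in this paper.

```latex
% Generic efficient score (Z) and Fisher information (V) for a two-arm
% binary comparison on the log odds ratio scale; illustrative only.
% n_E, n_C : patients on the experimental and control arms, n = n_E + n_C
% S_E, S_C : successes on each arm, S = S_E + S_C
\[
  Z \;=\; S_E - \frac{n_E\,S}{n},
  \qquad
  V \;=\; \frac{n_E\,n_C\,S\,(n - S)}{n^{2}\,(n-1)},
\]
\[
  Z \;\sim\; N(\theta V,\; V) \quad \text{approximately, for small values of
  the log odds ratio } \theta ,
\]
% so interim monitoring reduces to tracking the pair (Z, V) against the
% chosen stopping boundaries.
```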
The sequential analysis of repeated binary responses: a score test for the case of three time points
Abstract:
In this paper a robust method is developed for the analysis of data consisting of repeated binary observations taken at up to three fixed time points on each subject. The primary objective is to compare outcomes at the last time point, using earlier observations to predict this for subjects with incomplete records. A score test is derived. The method is developed for application to sequential clinical trials, as at interim analyses there will be many incomplete records occurring in non-informative patterns. Motivation for the methodology comes from experience with clinical trials in stroke and head injury, and data from one such trial is used to illustrate the approach. Extensions to more than three time points and to allow for stratification are discussed. Copyright © 2005 John Wiley & Sons, Ltd.
Abstract:
Sequential methods provide a formal framework by which clinical trial data can be monitored as they accumulate. The results from interim analyses can be used either to modify the design of the remainder of the trial or to stop the trial as soon as sufficient evidence of either the presence or absence of a treatment effect is available. The circumstances under which the trial will be stopped with a claim of superiority for the experimental treatment must, however, be determined in advance so as to control the overall type I error rate. One approach to calculating the stopping rule is the group-sequential method. A relatively recent alternative to group-sequential approaches is the adaptive design method. This latter approach provides considerable flexibility in changes to the design of a clinical trial at an interim point. However, a criticism is that the method by which evidence from different parts of the trial is combined means that a final comparison of treatments is not based on a sufficient statistic for the treatment difference, suggesting that the method may lack power. The aim of this paper is to compare two adaptive design approaches with the group-sequential approach. We first compare the form of the stopping boundaries obtained using the different methods. We then focus on a comparison of the power of the different trials when they are designed so as to be as similar as possible. We conclude that all methods acceptably control type I error rate and power when the sample size is modified based on a variance estimate, provided no interim analysis is so small that the asymptotic properties of the test statistic no longer hold. In the latter case, the group-sequential approach is to be preferred. Provided that asymptotic assumptions hold, the adaptive design approaches control the type I error rate even if the sample size is adjusted on the basis of an estimate of the treatment effect, showing that the adaptive designs allow more modifications than the group-sequential method.
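The combination-of-evidence point can be made concrete with the weighted inverse-normal combination test used by many adaptive designs. The sketch below is a generic illustration, not the specific group-sequential or adaptive designs compared in this paper: stage-wise Z statistics are combined with weights fixed at the design stage, so when the second-stage sample size is modified the final statistic generally differs from the statistic computed from the pooled data, which is the source of the potential loss of power.

```python
# Illustrative weighted inverse-normal combination of two stage-wise Z
# statistics, as used by many adaptive designs.  The weights are fixed
# from the PLANNED stage sizes even if the second stage is later resized.
# Generic sketch; not the specific designs compared in the paper.
import numpy as np
from scipy.stats import norm

def combined_z(z1, z2, n1_planned, n2_planned):
    """Weighted inverse-normal combination with prespecified weights."""
    w1 = np.sqrt(n1_planned / (n1_planned + n2_planned))
    w2 = np.sqrt(n2_planned / (n1_planned + n2_planned))
    return w1 * z1 + w2 * z2            # w1**2 + w2**2 == 1

def pooled_z(z1, z2, n1_actual, n2_actual):
    """Statistic based on pooling all the data (sufficient-statistic form)."""
    return (np.sqrt(n1_actual) * z1 + np.sqrt(n2_actual) * z2) \
           / np.sqrt(n1_actual + n2_actual)

# Example: stage 2 enlarged from a planned 100 to an actual 200 patients.
z1, z2 = 1.2, 2.1
zc = combined_z(z1, z2, n1_planned=100, n2_planned=100)
zp = pooled_z(z1, z2, n1_actual=100, n2_actual=200)
print(f"combination test : Z = {zc:.3f}, one-sided p = {norm.sf(zc):.4f}")
print(f"pooled statistic : Z = {zp:.3f}, one-sided p = {norm.sf(zp):.4f}")
```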
Abstract:
During the planning of the GAIN International Study of gavestinel in acute stroke, a sequential triangular test was proposed but not implemented. Before the trial commenced it was agreed to evaluate the sequential design retrospectively, determining the differences in the resulting analyses, trial durations and sample sizes, in order to assess the potential of sequential procedures for future stroke trials. This paper presents four sequential reconstructions of the GAIN study made under various scenarios. For the data as observed, the sequential design would have reduced the trial sample size by 234 patients and shortened its duration by 3 or 4 months. Had the study not achieved a recruitment rate that far exceeded expectation, the advantages of the sequential design would have been much greater. Sequential designs appear to be an attractive option for trials in stroke. Copyright 2004 S. Karger AG, Basel
Abstract:
Sequential crystallization of poly(L-lactide) (PLLA) followed by poly(epsilon-caprolactone) (PCL) in double crystalline PLLA-b-PCL diblock copolymers is studied by differential scanning calorimetry (DSC), polarized optical microscopy (POM), wide-angle X-ray scattering (WAXS) and small-angle X-ray scattering (SAXS). Three samples with different compositions are studied. The sample with the shortest PLLA block (32 wt.-% PLLA) crystallizes from a homogeneous melt, the other two (with 44 and 60% PLLA) from microphase separated structures. The microphase structure of the melt is changed as PLLA crystallizes at 122 degrees C (a temperature at which the PCL block is molten), forming spherulites regardless of composition, even with 32% PLLA. SAXS indicates that a lamellar structure with a different periodicity than that obtained in the melt forms (for melt segregated samples). Where PCL is the majority block, PCL crystallization at 42 degrees C following PLLA crystallization leads to rearrangement of the lamellar structure, as observed by SAXS, possibly due to local melting at the interphases between domains. POM results showed that PCL crystallizes within previously formed PLLA spherulites. WAXS data indicate that the PLLA unit cell is modified by crystallization of PCL, at least for the two majority PCL samples. The PCL minority sample did not crystallize at 42 degrees C (well below the PCL homopolymer crystallization temperature), pointing to the influence of pre-crystallization of PLLA on PCL crystallization, although it did crystallize at lower temperature. Crystallization kinetics were examined by DSC and WAXS, with good agreement in general. The crystallization rate of PLLA decreased with increase in PCL content in the copolymers. The crystallization rate of PCL decreased with increasing PLLA content. The Avrami exponents were in general depressed for both components in the block copolymers compared to the parent homopolymers. [Graphical abstract caption: Polarized optical micrographs during isothermal crystallization of (a) homo-PLLA, (b) homo-PCL, (c) and (d) block copolymer after 30 min at 122 degrees C and after 15 min at 42 degrees C.]
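The Avrami exponents mentioned above come from fitting the isothermal crystallization kinetics to the Avrami model; its standard form is given below for reference (the symbols are generic, not values reported in this work).

```latex
% Avrami model for isothermal crystallization kinetics (generic form).
% X(t) : relative crystallinity at time t
% k    : overall rate constant
% n    : Avrami exponent, reflecting nucleation and growth geometry
\[
  X(t) \;=\; 1 - \exp\!\bigl(-k\,t^{\,n}\bigr),
  \qquad
  \ln\!\bigl[-\ln\bigl(1 - X(t)\bigr)\bigr] \;=\; \ln k \;+\; n\,\ln t ,
\]
% so n and k follow from the slope and intercept of the usual
% double-logarithmic plot.
```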
Abstract:
We present a novel algorithm for joint state-parameter estimation using sequential three-dimensional variational data assimilation (3D-Var) and demonstrate its application in the context of morphodynamic modelling using an idealised two-parameter 1D sediment transport model. The new scheme combines a static representation of the state background error covariances with a flow-dependent approximation of the state-parameter cross-covariances. For the case presented here, this involves calculating a local finite difference approximation of the gradient of the model with respect to the parameters. The new method is easy to implement and computationally inexpensive to run. Experimental results are positive, with the scheme able to recover the model parameters to a high level of accuracy. We expect that there is potential for successful application of this new methodology to larger, more realistic models with more complex parameterisations.
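A minimal sketch of the scheme described above is given below, using a toy advection-decay model with two uncertain parameters in place of the paper's sediment transport model; the model, grid, covariance magnitudes and parameter values are all placeholder assumptions. The state-parameter cross-covariance block is built from a local finite-difference estimate of the model's sensitivity to the parameters and combined with a static state background error covariance before a single 3D-Var update of the augmented vector.

```python
# Illustrative hybrid sequential 3D-Var sketch for joint state-parameter
# estimation.  The forward model, grid, covariances and parameter values
# are placeholder assumptions, not the paper's morphodynamic model.  The
# state-parameter cross-covariances are approximated with a local
# finite-difference estimate of the model's sensitivity to the parameters
# (flow-dependent), while the state background covariance B_xx is static.
import numpy as np

rng = np.random.default_rng(0)
NX = 50                                        # grid points (assumption)
x_grid = np.linspace(0.0, 1.0, NX)

def model(z0, params, n_steps=20, dt=0.01, dx=1.0 / (NX - 1)):
    """Toy upwind advection-decay model; params = (speed, decay rate)."""
    speed, decay = params
    z = z0.copy()
    for _ in range(n_steps):
        z[1:] = z[1:] - speed * dt / dx * (z[1:] - z[:-1]) - decay * dt * z[1:]
    return z

def cross_covariance(z0, params, B_pp, eps=1e-3):
    """B_xp ~ (dz/dparams) @ B_pp via one-sided finite differences."""
    base = model(z0, params)
    sens = np.empty((NX, len(params)))
    for j in range(len(params)):
        p = np.array(params, dtype=float)
        p[j] += eps
        sens[:, j] = (model(z0, p) - base) / eps
    return sens @ B_pp

# Background (prior) state and parameters, and a synthetic truth
true_params = np.array([1.0, 0.3])
prior_params = np.array([0.6, 0.1])
z0 = np.exp(-((x_grid - 0.3) ** 2) / 0.01)     # initial profile
xb = model(z0, prior_params)                   # background forecast
truth = model(z0, true_params)

# Static state covariance, parameter covariance, flow-dependent cross term
B_xx = 0.05 * np.exp(-np.abs(x_grid[:, None] - x_grid[None, :]) / 0.1)
B_pp = np.diag([0.2 ** 2, 0.1 ** 2])
B_xp = cross_covariance(z0, prior_params, B_pp)
B = np.block([[B_xx, B_xp], [B_xp.T, B_pp]])   # augmented covariance

# Observe the state at every 5th grid point, with noise
obs_idx = np.arange(0, NX, 5)
H = np.zeros((len(obs_idx), NX + 2))
H[np.arange(len(obs_idx)), obs_idx] = 1.0      # observe state entries only
R = 0.01 * np.eye(len(obs_idx))
y = truth[obs_idx] + rng.normal(0.0, 0.1, len(obs_idx))

# 3D-Var analysis of the augmented vector w = [state, parameters]
wb = np.concatenate([xb, prior_params])
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
wa = wb + K @ (y - H @ wb)

print("true parameters     :", true_params)
print("prior parameters    :", prior_params)
print("analysed parameters :", np.round(wa[NX:], 3))
```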
Abstract:
A study or experiment can be described as sequential if its design includes one or more interim analyses at which it is possible to stop the study, having reached a definitive conclusion concerning the primary question of interest. The potential of the sequential study to terminate earlier than the equivalent fixed sample size study means that, typically, there are ethical and economic advantages to be gained from using a sequential design. These advantages have secured a place for the methodology in the conduct of many clinical trials of novel therapies. Recently, there has been increasing interest in pharmacogenetics: the study of how DNA variation in the human genome affects the safety and efficacy of drugs. The potential for using sequential methodology in pharmacogenetic studies is considered and the conduct of candidate gene association studies, family-based designs and genome-wide association studies within the sequential setting is explored. The objective is to provide a unified framework for the conduct of these types of studies as sequential designs and hence allow experimenters to consider using sequential methodology in their future pharmacogenetic studies.
Abstract:
Objective: To examine the impact of increasing numbers of metabolic syndrome (MetS) components on postprandial lipaemia. Methods: Healthy men (n = 112) underwent a sequential meal postprandial investigation, in which blood samples were taken at regular intervals after a test breakfast (0 min) and lunch (330 min). Lipids and glucose were measured in the fasting sample, with triacylglycerol (TAG), non-esterified fatty acids and glucose analysed in the postprandial samples. Results: Subjects were grouped according to the number of MetS components regardless of the combinations of components (0/1, 2, 3 and 4/5). As expected, there was a trend for an increase in body mass index, blood pressure, fasting TAG, glucose and insulin, and a decrease in fasting high-density lipoprotein cholesterol with increasing numbers of MetS components (P ≤ 0.0004). A similar trend was observed for the summary measures of the postprandial TAG and glucose responses. For TAG, the area under the curve (AUC) and maximum concentration (maxC) were significantly greater in men with ≥ 3 than < 3 components (P < 0.001), whereas incremental AUC was greater in those with 3 than with 0/1 and 2 components, and in those with 4/5 compared with 2 components (P < 0.04). For glucose, maxC after the test breakfast (0-330 min) and total AUC (0-480 min) were higher in men with ≥ 3 than < 3 components (P ≤ 0.001). Conclusions: Our data analysis has revealed a linear trend between increasing numbers of MetS components and the magnitude (AUC) of the postprandial TAG and glucose responses. Furthermore, the two-meal challenge discriminated a worsening of postprandial lipaemic control in subjects with ≥ 3 MetS components.
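For reference, the postprandial summary measures used above (total AUC, incremental AUC and maxC) can be computed directly from the sampled time course; the sketch below uses the trapezoidal rule on an invented TAG profile (the times and values are illustrative, not study data, and the incremental AUC here follows the common convention of counting only the rise above the fasting value).

```python
# Illustrative calculation of postprandial summary measures from a
# sampled concentration-time profile: total AUC (trapezoidal rule),
# incremental AUC above the fasting baseline, and maximum concentration.
# The time points and TAG values below are invented for illustration.
import numpy as np

time_min = np.array([0, 60, 120, 180, 240, 330, 420, 480])   # minutes
tag = np.array([1.2, 1.8, 2.6, 3.1, 2.9, 2.4, 2.8, 2.2])     # mmol/L

total_auc = np.trapz(tag, time_min)                 # mmol/L . min
rise = np.clip(tag - tag[0], 0.0, None)             # rise above fasting value
incremental_auc = np.trapz(rise, time_min)
max_c = tag.max()

print(f"total AUC (0-480 min)        : {total_auc:.0f} mmol/L.min")
print(f"incremental AUC (0-480 min)  : {incremental_auc:.0f} mmol/L.min")
print(f"maximum concentration (maxC) : {max_c:.1f} mmol/L")
```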
Abstract:
Data assimilation is predominantly used for state estimation: combining observational data with model predictions to produce an updated model state that most accurately approximates the true system state whilst keeping the model parameters fixed. This updated model state is then used to initiate the next model forecast. Even with perfect initial data, inaccurate model parameters will lead to the growth of prediction errors. To generate reliable forecasts we need good estimates of both the current system state and the model parameters. This paper presents research into data assimilation methods for morphodynamic model state and parameter estimation. First, we focus on state estimation and describe the implementation of a three-dimensional variational (3D-Var) data assimilation scheme in a simple 2D morphodynamic model of Morecambe Bay, UK. The assimilation of observations of bathymetry derived from SAR satellite imagery and a ship-borne survey is shown to significantly improve the predictive capability of the model over a 2 year run. Here, the model parameters are set by manual calibration; this is laborious and is found to produce different parameter values depending on the type and coverage of the validation dataset. The second part of this paper considers the problem of model parameter estimation in more detail. We explain how, by employing the technique of state augmentation, it is possible to use data assimilation to estimate uncertain model parameters concurrently with the model state. This approach removes inefficiencies associated with manual calibration and enables more effective use of observational data. We outline the development of a novel hybrid sequential 3D-Var data assimilation algorithm for joint state-parameter estimation and demonstrate its efficacy using an idealised 1D sediment transport model. The results of this study are extremely positive and suggest that there is great potential for the use of data assimilation-based state-parameter estimation in coastal morphodynamic modelling.
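In generic data assimilation notation, the state augmentation described above amounts to minimising the usual 3D-Var cost function over an augmented vector holding both the model state and the uncertain parameters; the statement below is a standard textbook form, not necessarily the exact formulation used in this work.

```latex
% Generic 3D-Var cost function with state augmentation (standard notation).
% w = (x, theta) : augmented vector of model state x and parameters theta
% w_b            : background (prior) estimate of w
% B              : augmented background error covariance, including the
%                  state-parameter cross-covariance blocks
% y, H, R        : observations, observation operator, observation error cov.
\[
  J(\mathbf{w}) \;=\;
  \tfrac{1}{2}\,(\mathbf{w}-\mathbf{w}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{w}-\mathbf{w}_b)
  \;+\;
  \tfrac{1}{2}\,\bigl(\mathbf{y}-H(\mathbf{w})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{w})\bigr),
\]
\[
  \mathbf{w} = \begin{pmatrix}\mathbf{x}\\[2pt] \boldsymbol{\theta}\end{pmatrix},
  \qquad
  \mathbf{B} = \begin{pmatrix}
     \mathbf{B}_{xx} & \mathbf{B}_{x\theta}\\[2pt]
     \mathbf{B}_{x\theta}^{\mathsf T} & \mathbf{B}_{\theta\theta}
  \end{pmatrix}.
\]
```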
Abstract:
A procedure is described in which patients are randomized between two experimental treatments and a control. At a series of interim analyses, each experimental treatment is compared with control. One of the experimental treatments might then be found sufficiently superior to the control for it to be declared the best treatment, and the trial stopped. Alternatively, experimental treatments might be eliminated from further consideration at any stage. It is shown how the procedure can be conducted while controlling overall error probabilities. Data concerning evaluation of different doses of riluzole in the treatment of motor neurone disease are used for illustration.
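As a rough illustration of the decision flow at an interim analysis (the boundaries below are placeholders and do not reproduce the error-controlling boundaries derived in the paper), each active experimental arm is compared with control and can either be declared sufficiently superior, be eliminated, or continue to the next analysis.

```python
# Illustrative interim decision logic for a trial with two experimental
# arms and a control: each active arm is compared with control; an arm
# crossing the upper boundary is declared best (trial stops), an arm
# falling below the lower boundary is eliminated.  The constant
# boundaries here are placeholders and do NOT control the overall error
# probabilities in the way the procedure in the paper does.
from dataclasses import dataclass

@dataclass
class Arm:
    name: str
    active: bool = True

def interim_decision(z_vs_control, arms, upper=2.5, lower=0.0):
    """One interim analysis: returns the arm declared best, or None."""
    for arm in arms:
        if not arm.active:
            continue
        z = z_vs_control[arm.name]
        if z >= upper:
            return arm.name            # sufficiently superior: stop the trial
        if z <= lower:
            arm.active = False         # eliminate from further consideration
    return None

arms = [Arm("dose A"), Arm("dose B")]
print(interim_decision({"dose A": 1.1, "dose B": -0.4}, arms))   # None; dose B dropped
print([a.name for a in arms if a.active])                        # ['dose A']
print(interim_decision({"dose A": 2.8, "dose B": -0.4}, arms))   # 'dose A'
```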
Abstract:
As we enter an era of ‘big data’, asset information is becoming a deliverable of complex projects. Prior research suggests digital technologies enable rapid, flexible forms of project organizing. This research analyses practices of managing change in Airbus, CERN and Crossrail, through desk-based review, interviews, visits and a cross-case workshop. These organizations deliver complex projects, rely on digital technologies to manage large data-sets, and use configuration management, a systems engineering approach with mid-20th century origins, to establish and maintain integrity. In them, configuration management has become more, rather than less, important. Asset information is structured, with change managed through digital systems, using relatively hierarchical, asynchronous and sequential processes. The paper contributes by uncovering limits to flexibility in complex projects where integrity is important. Challenges of managing change are discussed, considering the evolving nature of configuration management; potential use of analytics on complex projects; and implications for research and practice.
Abstract:
Recruitment of patients to a clinical trial usually occurs over a period of time, resulting in the steady accumulation of data throughout the trial's duration. Yet, according to traditional statistical methods, the sample size of the trial should be determined in advance, and data collected on all subjects before analysis proceeds. For ethical and economic reasons, the technique of sequential testing has been developed to enable the examination of data at a series of interim analyses. The aim is to stop recruitment to the study as soon as there is sufficient evidence to reach a firm conclusion. In this paper we present the advantages and disadvantages of conducting interim analyses in phase III clinical trials, together with the key steps to enable the successful implementation of sequential methods in this setting. Examples are given of completed trials that have been carried out sequentially, and references to relevant literature and software are provided.