Abstract:
This paper presents practical approaches to the problem of sample size re-estimation in clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where only limited survival experiences are available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates.
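The abstract does not reproduce the underlying formulae; under the proportional-hazards assumption it invokes, the standard driver of such calculations is Schoenfeld's formula for the required number of events. A minimal sketch in Python, where the hazard ratio, alpha, and power values are illustrative assumptions rather than the paper's:

```python
from math import ceil, log
from scipy.stats import norm

def required_events(hazard_ratio, alpha=0.05, power=0.8):
    """Schoenfeld's formula: total events needed in a 1:1 two-arm trial
    under proportional hazards to detect the given hazard ratio."""
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return ceil(4 * z**2 / log(hazard_ratio)**2)

# A blinded review would re-estimate nuisance quantities (e.g. the pooled
# event rate) and rescale accrual or follow-up to reach this many events.
print(required_events(hazard_ratio=0.75))  # 380 events
```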
Abstract:
Data assimilation refers to the problem of finding trajectories of a prescribed dynamical model in such a way that the output of the model (usually some function of the model states) follows a given time series of observations. Typically, though, these two requirements cannot both be met at the same time: tracking the observations is not possible without the trajectory deviating from the proposed model equations, while adherence to the model requires deviations from the observations. Thus, data assimilation faces a trade-off. In this contribution, the sensitivity of the data assimilation with respect to perturbations in the observations is identified as the parameter that controls the trade-off. A relation between the sensitivity and the out-of-sample error is established, which allows the latter to be calculated under operational conditions. A minimum out-of-sample error is proposed as a criterion for setting an appropriate sensitivity and settling the discussed trade-off. Two approaches to data assimilation are considered, namely variational data assimilation and Newtonian nudging, also known as synchronization. Numerical examples demonstrate the feasibility of the approach.
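To make the nudging idea concrete, here is a minimal sketch (not the paper's experiment) using a Lorenz-63 toy model; the model, gain k, and forward Euler step size are illustrative assumptions. The gain plays the role of the sensitivity that controls the model-versus-observations trade-off:

```python
import numpy as np

def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def nudged_run(obs, k, dt=0.01, s0=(1.0, 1.0, 1.0)):
    """Newtonian nudging (synchronization): relax the observed component
    of the model state toward the observations with gain k."""
    s, traj = np.array(s0, dtype=float), []
    for y_obs in obs:
        d = lorenz(s)
        d[0] += k * (y_obs - s[0])  # nudge only the observed x-component
        s = s + dt * d              # forward Euler step
        traj.append(s.copy())
    return np.array(traj)

# Toy experiment: observe x from a "truth" run, then assimilate by nudging.
truth = np.array([1.0, 1.0, 1.0])
xs = []
for _ in range(2000):
    truth = truth + 0.01 * lorenz(truth)
    xs.append(truth[0])
analysis = nudged_run(np.array(xs), k=10.0)
```

Large k tracks the observations closely but forces departures from the model dynamics; small k does the opposite, which is exactly the trade-off the paper's sensitivity criterion is designed to settle.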
Abstract:
The detection of long-range dependence in time series analysis is an important task, to which this paper contributes by showing that, whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, long memory cannot be identified using the sum of the sample autocorrelations, as usually defined. The reason is that this sample sum is a predetermined constant for any stationary time series, a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher-order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third-order sample (auto)bicorrelations at lags h,k≥1 is also a predetermined constant, different from that in the second-order case, for any stationary time series of arbitrary length.
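The constant in question is easy to check numerically: with the usual estimator that divides by n and subtracts the sample mean, the sample autocorrelations at lags 1 to n-1 always sum to -1/2, whatever the data. A short Python demonstration (the two example series are arbitrary choices):

```python
import numpy as np

def acf_sum(x):
    """Sum of sample autocorrelations at lags 1..n-1, using the standard
    estimator (divide by n, subtract the sample mean)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    c0 = np.dot(xc, xc) / n  # lag-0 sample autocovariance
    return sum(np.dot(xc[h:], xc[:-h]) / n / c0 for h in range(1, n))

rng = np.random.default_rng(0)
print(acf_sum(rng.normal(size=300)))             # white noise  -> -0.5
print(acf_sum(np.cumsum(rng.normal(size=300))))  # random walk  -> -0.5
```

The identity is purely algebraic (the mean-centred series sums to zero), which is why the result cannot discriminate long memory from anything else.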
Abstract:
A series of imitation games, involving both 3-participant (simultaneous comparison of two hidden entities) and 2-participant (direct interrogation of a hidden entity) formats, was conducted at Bletchley Park on the 100th anniversary of Alan Turing’s birth: 23 June 2012. From the ongoing analysis of over 150 games involving judges (expert and non-expert, male and female, adult and child), machines and hidden humans (foils for the machines), we present six particular conversations that took place between human judges and a hidden entity and that produced unexpected results. From this sample we focus on a feature of Turing’s machine intelligence test that the mathematician/code breaker did not consider in his examination of machine thinking: the subjective nature of attributing intelligence to another mind.
Abstract:
The self-assembly of proteins and peptides into β-sheet-rich amyloid fibers is a process that has gained notoriety because of its association with human diseases and disorders. Spontaneous self-assembly of peptides into nonfibrillar supramolecular structures can also provide a versatile and convenient mechanism for the bottom-up design of biocompatible materials with functional properties favoring a wide range of practical applications.[1] One subset of these fascinating and potentially useful nanoscale constructions are the peptide nanotubes, elongated cylindrical structures with a hollow center bounded by a thin wall of peptide molecules.[2] A formidable challenge in optimizing and harnessing the properties of nanotube assemblies is to gain atomistic insight into their architecture, and to elucidate precisely how the tubular morphology is constructed from the peptide building blocks. Some of these fine details have been elucidated recently with the use of magic-angle-spinning (MAS) solid-state NMR (SSNMR) spectroscopy.[3] MAS SSNMR measurements of chemical shifts and through-space interatomic distances provide constraints on peptide conformation (e.g., β-strands and turns) and quaternary packing. We describe here a new application of a straightforward SSNMR technique which, when combined with FTIR spectroscopy, reports quantitatively on the orientation of the peptide molecules within the nanotube structure, thereby providing an additional structural constraint not accessible to MAS SSNMR.
Abstract:
This paper considers the effect of using a GARCH filter on the properties of the BDS test statistic, as well as a number of other issues relating to the application of the test. It is found that, for certain values of the user-adjustable parameters, the finite-sample distribution of the test is far removed from asymptotic normality. In particular, when data generated from some completely different model class are filtered through a GARCH model, the frequency of rejection of the iid hypothesis falls, often substantially. The implication of this result is that it might be inappropriate to use non-rejection of iid for the standardised residuals of a GARCH model as evidence that the GARCH model ‘fits’ the data.
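In practice the procedure being criticised amounts to fitting a GARCH model and applying the BDS test to the standardized residuals. A minimal sketch, assuming the third-party arch package and statsmodels' bds implementation; the data and parameter choices are illustrative:

```python
import numpy as np
from arch import arch_model                # third-party 'arch' package
from statsmodels.tsa.stattools import bds  # statsmodels' BDS test

rng = np.random.default_rng(1)
returns = rng.standard_t(df=5, size=1000)  # toy data, not from a GARCH process

# Filter the series through a GARCH(1,1), then test the standardized
# residuals for iid across embedding dimensions 2 and 3.
res = arch_model(returns, p=1, q=1).fit(disp="off")
stat, pval = bds(np.asarray(res.std_resid), max_dim=3)
print(stat, pval)  # per the paper, non-rejection here is weak evidence of fit
```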
Abstract:
The projected hand illusion (PHI) is a variant of the rubber hand illusion (RHI), and both are commonly used to study mechanisms of self-perception. A questionnaire was developed by Longo et al. (2008) to measure qualitative changes in the RHI, but such psychometric analyses have not yet been conducted on a questionnaire for the PHI. The present study validates minor modifications of the Longo et al. questionnaire for assessing the PHI in a community sample (n = 48) and determines its association with selected demographic (age, sex, years of education), cognitive (Digit Span), and clinical (psychotic-like experiences) variables. Principal components analysis of the questionnaire data extracted four components: Embodiment of “Other” Hand, Disembodiment of Own Hand, Deafference, and Agency, in both the synchronous and asynchronous PHI conditions. Questions assessing “Embodiment” and “Agency” loaded onto orthogonal components. Greater illusion ratings were positively associated with being female, being younger, and having higher scores on psychotic-like experiences. There was no association with cognitive performance. Overall, this study confirmed that self-perception as measured with the PHI is a multicomponent construct, similar in many respects to the RHI. The main difference lies in the separation of Embodiment and Agency into distinct constructs, which likely reflects the fact that the “live” image in the PHI presents a more realistic picture of the hand and of the experimenter's stroking movements than the RHI does.
Abstract:
The concept of rationally designing MALDI matrices has been extended to the next, “whole sample”, level. These studies have revealed some unexpected and exploitable insights for improving MALDI sensitivity. It is shown that (i) additives which only provide additional laser energy absorption are best avoided; (ii) the addition of proton donors in the form of protonated weak bases can be highly beneficial; and (iii) the addition of glycerol to coat crystalline samples is highly recommended. Overall, analytical sensitivity has been significantly increased compared with the current “gold” standards in MALDI MS, and new insights into the mechanisms and processes of MALDI have been gained.
Abstract:
Evidence is presented that the performance of the rationally designed MALDI matrix 4-chloro-α-cyanocinnamic acid (ClCCA), in comparison to its well-established predecessor α-cyano-4-hydroxycinnamic acid (CHCA), depends significantly on the sample preparation, such as the choice of the target plate. In this context, it becomes clear that the rational design of MALDI matrices and their successful employment have to consider a larger set of physicochemical parameters, including sample crystallization and morphology/topology, in addition to parameters of basic (solution and/or gas-phase) chemistry.
Abstract:
This systematic review was performed to assess consumer purchasing behaviour towards fish and seafood products in the wide context of developed countries. The Web of Science, Scopus, ScienceDirect and Google Scholar engines were used to search the existing literature, and a total of 49 studies were identified for inclusion. These studies investigated consumer purchasing behaviour towards a variety of fish and seafood products, in different countries and by means of different methodological approaches. In particular, the review identifies and discusses the main drivers of and barriers to fish consumption, as well as consumers’ preferences regarding the most relevant attributes of fish and seafood products, providing useful insights for both practitioners and policy makers. Finally, the main gaps in the existing literature and possible trajectories for future research are also discussed.
Abstract:
This paper proposes and tests a new framework for weighting recursive out-of-sample prediction errors according to their corresponding levels of in-sample estimation uncertainty. In essence, we show how to use the maximum possible amount of information from the sample in the evaluation of the prediction accuracy, by commencing the forecasts at the earliest opportunity and weighting the prediction errors. Via a Monte Carlo study, we demonstrate that the proposed framework selects the correct model from a set of candidate models considerably more often than the existing standard approach when only a small sample is available. We also show that the proposed weighting approaches result in tests of equal predictive accuracy that have much better sizes than the standard approach. An application to an exchange rate dataset highlights relevant differences in the results of tests of predictive accuracy based on the standard approach versus the framework proposed in this paper.
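As a rough illustration of the idea, and not the authors' weighting scheme, the sketch below starts recursive one-step-ahead AR(1) forecasts as early as possible and down-weights squared errors from forecasts estimated on few observations, using the estimation sample size as a crude proxy for in-sample uncertainty (the function name and all choices here are assumptions for illustration):

```python
import numpy as np

def weighted_oos_mse(x, first_est=10):
    """Recursive one-step-ahead AR(1) forecasts, started at the earliest
    opportunity; each squared error is weighted by the size of the sample
    its forecast was estimated on (a simple proxy for in-sample uncertainty)."""
    errs, wts = [], []
    for t in range(first_est, len(x)):
        y, z = x[1:t], x[:t - 1]
        phi = z @ y / (z @ z)             # OLS AR(1) slope on data before t
        errs.append(x[t] - phi * x[t - 1])
        wts.append(t)                     # larger sample -> larger weight
    errs, wts = np.array(errs), np.array(wts)
    return np.sum(wts * errs**2) / np.sum(wts)

rng = np.random.default_rng(0)
x = np.empty(200)
x[0] = 0.0
for t in range(1, 200):                   # simulate an AR(1) with phi = 0.6
    x[t] = 0.6 * x[t - 1] + rng.normal()
print(weighted_oos_mse(x))
```

The point of the design is visible in the loop: commencing forecasts at `first_est` uses far more of the sample than waiting for a long estimation window, and the weights keep the noisy early errors from dominating the comparison.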
Abstract:
This paper presents an approximate closed-form sample size formula for determining non-inferiority in active-control trials with binary data. We use the odds ratio as the measure of the relative treatment effect, derive the sample size formula based on the score test, and compare it with a second, well-known formula based on the Wald test. Both closed-form formulae are compared with simulations based on the likelihood ratio test. Within the range of parameter values investigated, the score test closed-form formula is reasonably accurate when non-inferiority margins are based on odds ratios of about 0.5 or above and when the magnitude of the odds ratio under the alternative hypothesis lies between about 1 and 2.5. The accuracy generally decreases as the odds ratio under the alternative hypothesis moves upwards from 1. As the non-inferiority margin odds ratio decreases from 0.5, the score test closed-form formula increasingly overestimates the sample size, irrespective of the magnitude of the odds ratio under the alternative hypothesis. The Wald test closed-form formula is also reasonably accurate in the cases where the score test closed-form formula works well. Outside these scenarios, the Wald test closed-form formula can either underestimate or overestimate the sample size, depending on the magnitudes of the non-inferiority margin odds ratio and the odds ratio under the alternative hypothesis. Although neither approximation is accurate in all cases, both approaches lead to satisfactory sample size calculations for non-inferiority trials with binary data where the odds ratio is the parameter of interest.
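The abstract does not reproduce either formula; for orientation, the textbook Wald-type calculation on the log odds-ratio scale can be sketched as below (the function name and all parameter values are illustrative assumptions, not the paper's):

```python
from math import ceil, log
from scipy.stats import norm

def n_per_arm_wald(p_ctrl, or_alt, or_margin, alpha=0.025, power=0.9):
    """Per-arm sample size for a non-inferiority test on the odds ratio,
    using the usual Wald approximation on the log odds-ratio scale."""
    odds_exp = or_alt * p_ctrl / (1 - p_ctrl)  # odds in the experimental arm
    p_exp = odds_exp / (1 + odds_exp)          # implied event probability
    z = norm.ppf(1 - alpha) + norm.ppf(power)
    var = 1 / (p_ctrl * (1 - p_ctrl)) + 1 / (p_exp * (1 - p_exp))
    return ceil(z**2 * var / (log(or_alt) - log(or_margin))**2)

# e.g. 40% control event rate, no true difference (OR = 1), margin OR = 0.5
print(n_per_arm_wald(p_ctrl=0.4, or_alt=1.0, or_margin=0.5))
```

Note how the denominator shrinks as the alternative odds ratio approaches the margin, which is where both closed-form approximations become most sensitive to error.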
Abstract:
Objective. Numerous studies have reported elevated levels of overgeneral autobiographical memory among depressed patients and also among those previously exposed to a traumatic event. No previous study has examined their joint association with overgeneral memory in a community sample, nor whether the associations hold for both juvenile- and adult-onset depression. Methods. The current study examined the relative importance of exposure to childhood abuse and neglect in the overgeneral memory of women with and without a history of major depressive disorder (MDD). The Autobiographical Memory Test, together with standardized interviews covering childhood experiences and MDD, was administered to a risk-stratified community sample of 103 women aged 25–37. Results. Overgenerality in memory was associated with recalled childhood sexual abuse (CSA) but not with other adversities. A history of CSA was predictive of an overgeneral memory bias even in the absence of MDD. Our analyses indicated no significant association between a history of MDD and overgeneral memory in women who reported no CSA. However, overgeneral memory was increased in women who reported both CSA and MDD, with a significant difference found in relation to positive cues, the highest scores being seen among those with adult- rather than juvenile-onset depression. Conclusions. The findings highlight the significance of CSA in predicting overgeneral memory and a differential response in relation to positive and negative cue memories, and they point to a specific role for overgeneral memory following CSA in the development of depression.
Abstract:
This study examines the effects of a multi-session Cognitive Bias Modification (CBM) program on interpretative biases and social anxiety in an Iranian sample. Thirty-six volunteers with high scores on social anxiety measures were recruited from a student population and randomly allocated to experimental and control groups. In the experimental group, participants received 4 sessions of positive CBM for interpretative biases (CBM-I) over 2 weeks in the laboratory. Participants in the control condition completed a neutral task that matched the active CBM-I intervention in format and duration but did not encourage positive disambiguation of socially ambiguous scenarios. The results indicated that, after training, the positive CBM-I group exhibited more positive (and less negative) interpretations of ambiguous scenarios and fewer social anxiety symptoms relative to the control condition, at both 1-week post-test and 7-week follow-up. It is suggested that clinical trials are required to establish the clinical efficacy of this intervention for social anxiety.
Abstract:
Human observers exhibit large, systematic, distance-dependent biases when estimating the three-dimensional (3D) shape of objects defined by binocular image disparities. This has led some to question the utility of disparity as a cue to 3D shape and to ask whether accurate estimation of 3D shape is at all possible. Others have argued that accurate perception is possible, but only with large continuous perspective transformations of an object. Using a stimulus that is known to elicit large distance-dependent perceptual bias (random-dot stereograms of elliptical cylinders), we show that, contrary to these findings, simply adopting a more naturalistic viewing angle completely eliminates the bias. Using behavioural psychophysics coupled with a novel surface-based reverse-correlation methodology, we show that it is binocular edge and contour information that allows for accurate and precise perception, and that observers actively exploit and sample this information when it is available.