883 results for "Variable sample size X- control chart"
Abstract:
Assaying a large number of genetic markers from patients in clinical trials is now possible in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker, once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and the individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
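The permutation scheme described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the per-SNP statistic below is a simple squared interaction contrast standing in for the paper's F-statistics, and the data (`y`, `treat`, `snps`) are simulated. The key idea — sum a per-marker interaction statistic across SNPs, then permute treatment labels to obtain a global p-value that respects the correlation between SNPs — is the one the abstract describes.

```python
import numpy as np

def omnibus_perm_test(y, treat, snps, n_perm=999, seed=0):
    """Global p-value for any treatment x SNP interaction, by permutation.

    The per-SNP statistic is a squared interaction contrast (a stand-in
    for per-SNP F-statistics); statistics are summed across SNPs, and
    treatment labels are permuted to build the null distribution.
    """
    rng = np.random.default_rng(seed)

    def stat(t):
        s = 0.0
        for g in snps.T:  # one binary marker at a time
            # treatment effect among carriers minus among non-carriers
            eff1 = y[(t == 1) & (g == 1)].mean() - y[(t == 0) & (g == 1)].mean()
            eff0 = y[(t == 1) & (g == 0)].mean() - y[(t == 0) & (g == 0)].mean()
            s += (eff1 - eff0) ** 2
        return s

    obs = stat(treat)
    null = [stat(rng.permutation(treat)) for _ in range(n_perm)]
    return (1 + sum(v >= obs for v in null)) / (1 + n_perm)

# simulated data with a strong interaction at the first marker
rng = np.random.default_rng(1)
n = 80
treat = np.array([0] * 40 + [1] * 40)
snps = rng.integers(0, 2, size=(n, 3))
y = rng.normal(size=n) + 6.0 * treat * snps[:, 0]
print(omnibus_perm_test(y, treat, snps))  # small global p-value
```

Because the whole vector of labels is permuted at once, no per-SNP multiplicity correction is needed: a single threshold on the global statistic controls the family-wise error.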
Abstract:
Natural exposure to prion disease is likely to occur throughout successive challenges, yet most experiments focus on single large doses of infectious material. We analyze the results from an experiment in which rodents were exposed to multiple doses of feed contaminated with the scrapie agent. We formally define hypotheses for how the doses combine in terms of statistical models. The competing hypotheses are that only the total dose of infectivity is important (cumulative model), doses act independently, or a general alternative that interaction between successive doses occurs (to raise or lower the risk of infection). We provide sample size calculations to distinguish these hypotheses. In the experiment, a fixed total dose has a significantly reduced probability of causing infection if the material is presented as multiple challenges, and as the time between challenges lengthens. Incubation periods are shorter and less variable if all material is consumed on one occasion. We show that the probability of infection is inconsistent with the hypothesis that each dose acts as a cumulative or independent challenge. The incubation periods are inconsistent with the independence hypothesis. Thus, although a trend exists for the risk of infection with prion disease to increase with repeated doses, it does so to a lesser degree than is expected if challenges combine independently or in a cumulative manner.
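The competing hypotheses can be written down directly as probability models. In this sketch the per-challenge infection probability `p_single` is a hypothetical logistic dose-response, chosen for illustration rather than fitted to the experiment: under the cumulative model only the total dose enters it, while under the independence model each challenge is a separate Bernoulli trial.

```python
import math

def p_single(dose, beta=0.8):
    """Hypothetical per-challenge infection probability
    (logistic in log-dose; illustrative, not fitted to the data)."""
    return 1.0 / (1.0 + math.exp(-beta * (math.log(dose) - 1.0)))

def p_cumulative(doses):
    """Cumulative model: only the total dose matters."""
    return p_single(sum(doses))

def p_independent(doses):
    """Independence model: each challenge is an independent trial,
    so infection occurs unless every single challenge is escaped."""
    escape = 1.0
    for d in doses:
        escape *= 1.0 - p_single(d)
    return 1.0 - escape

doses = [1.0] * 4  # total dose 4.0, split over four challenges
print(p_cumulative(doses))   # identical to p_single(4.0)
print(p_independent(doses))  # larger here: independent hits accumulate
```

The experiment's finding — that splitting a fixed total dose reduces the infection probability — corresponds to the observed value falling below both model predictions, which is what motivates the interaction alternative.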
Abstract:
A novel and generic miniaturization methodology for determining partition coefficient values of organic compounds in n-octanol/water using magnetic nanoparticles is described for the first time. We have successfully designed, synthesised and characterised new colloidally stable porous silica-encapsulated magnetic nanoparticles of controlled dimensions. These nanoparticles, absorbing a tiny amount of n-octanol in their porous silica over-layer, are homogeneously dispersed into a bulk aqueous phase (pH 7.40) containing an organic compound prior to magnetic separation. The small size of the particles and the efficient mixing allow rapid establishment of the partition equilibrium of the organic compound between the solid-supported n-octanol nano-droplets and the bulk aqueous phase. UV-vis spectrophotometry is then applied as a quantitative method to determine the concentration of the organic compound in the aqueous phase both before and after partitioning (after magnetic separation). log D values of organic compounds of pharmaceutical interest (0.65-3.50), determined by this novel methodology, were found to be in excellent agreement with the values measured by the shake-flask method in two independent laboratories, and are also consistent with the literature data. This new technique offers several advantages, including accurate measurement of the log D value, a much shorter experimental time and a smaller required sample size. With this approach, the formation of a problematic emulsion, commonly encountered in shake-flask experiments, is eliminated. It is envisaged that this method could be applicable to high-throughput log D screening of drug candidates. (c) 2005 Elsevier B.V. All rights reserved.
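Whichever apparatus supplies the concentrations, the partition coefficient itself follows from a mass balance on the aqueous phase. A minimal sketch; the volumes below are illustrative assumptions, not values from the paper:

```python
import math

def log_d(c_before, c_after, v_aq, v_oct):
    """log D from aqueous-phase concentrations (e.g. from UV-vis
    absorbance) measured before and after partitioning.

    Mass balance: solute lost from the aqueous phase ended up in the
    octanol, so D = [(c_before - c_after) * v_aq] / (c_after * v_oct).
    """
    d = (c_before - c_after) * v_aq / (c_after * v_oct)
    return math.log10(d)

# illustrative numbers: 90% of the solute partitions into 1 uL of
# octanol (the nano-droplet phase) from 1 mL of aqueous buffer
print(round(log_d(1.0, 0.10, v_aq=1.0e-3, v_oct=1.0e-6), 2))  # 3.95
```

The large aqueous-to-octanol volume ratio is what makes the miniaturized format sensitive: even a high log D leaves a measurable change in the aqueous concentration.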
Abstract:
The effects of milk protein fortification on the texture and microstructure of cottage cheese curd were evaluated. Protein powder (92.6% protein) was added to the skim milk at a level of 0.4% (w/w) to produce curds. Control curds with no added protein powder were also produced. These curds were analysed for differences in yield, total solids, curd size, texture and structure. It was found that the addition of protein powder contributed to a significant yield increase, which can be attributed to increased water retention, along with a better curd size distribution. Control curds were firmer than the fortified curds and showed a less open pore structure, as revealed by electron microscopy. However, the addition of dressing masked the textural differences, and a sensory panel was unable to distinguish between cheeses produced from fortified milk and controls.
Abstract:
The convergence speed of the standard least mean square (LMS) adaptive array may be degraded in mobile communication environments. Various conventional variable step size LMS algorithms have been proposed to enhance the convergence speed while maintaining a low steady-state error. In this paper, a new variable step size LMS algorithm based on the accumulated instantaneous error concept is proposed. In the proposed algorithm, the accumulated instantaneous error is used to vary the step size parameter of the standard LMS algorithm. Simulation results show that the proposed algorithm is simpler and yields better performance than conventional variable step size LMS algorithms.
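A minimal sketch of the idea, with an assumed update rule: the abstract does not give the exact formula, so here the step size is simply made proportional to a leaky (exponentially accumulated) sum of squared instantaneous errors, clipped to a safe range — large while the filter is far from converged, small near the steady state.

```python
import numpy as np

def variable_step_lms(x, d, n_taps=8, mu_min=1e-4, mu_max=0.05,
                      gamma=0.9, beta=0.5):
    """LMS filter whose step size tracks an accumulated instantaneous
    error (hypothetical rule, for illustration only)."""
    w = np.zeros(n_taps)
    acc = 0.0
    err = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]        # tap-input vector
        e = d[n] - w @ u                         # instantaneous error
        acc = gamma * acc + (1 - gamma) * e * e  # accumulated error
        mu = float(np.clip(beta * acc, mu_min, mu_max))
        w = w + mu * e * u                       # standard LMS update
        err[n] = e
    return w, err

# system identification demo: recover a known 4-tap FIR filter
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.normal(size=2000)
d = np.convolve(x, h)[:len(x)]                   # d[n] = sum_k h[k] x[n-k]
w, err = variable_step_lms(x, d, n_taps=4)
print(np.max(np.abs(w - h)))                     # small residual error
```

Clipping `mu` between `mu_min` and `mu_max` keeps the update stable while still letting the step size follow the error energy, which is the trade-off the abstract highlights.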
Abstract:
Reliable techniques for screening large numbers of plants for root traits are still being developed, but include aeroponic, hydroponic and agar plate systems. Coupled with digital cameras and image analysis software, these systems permit the rapid measurement of root numbers, length and diameter in moderate (typically <1000) numbers of plants. Usually such systems are employed with relatively small seedlings, and information is recorded in 2D. Recent developments in X-ray microtomography have facilitated 3D non-invasive measurement of small root systems grown in solid media, allowing angular distributions to be obtained in addition to numbers and length. However, because of the time taken to scan samples, only a small number can be screened (typically <10 per day, not including analysis time of the large spatial datasets generated) and, depending on sample size, limited resolution may mean that fine roots remain unresolved. Although agar plates allow differences between lines and genotypes to be discerned in young seedlings, the rank order may not be the same when the same materials are grown in solid media. For example, root length of dwarfing wheat (Triticum aestivum L.) lines grown on agar plates was increased by ~40% relative to wild-type and semi-dwarfing lines, but in a sandy loam soil under well watered conditions it was decreased by 24-33%. Such differences in ranking suggest that significant soil environment-genotype interactions are occurring. Developments in instruments and software mean that a combination of high-throughput simple screens and more in-depth examination of root-soil interactions is becoming viable.
Abstract:
The detection of long-range dependence in time series analysis is an important task to which this paper contributes by showing that whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, it is not possible for long memory to be identified using the sum of the sample autocorrelations, as usually defined. The reason for this is that the sample sum is a predetermined constant for any stationary time series; a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third order sample (auto) bicorrelations at lags h,k≥1, is also a predetermined constant, different from that in the second order case, for any stationary time series of arbitrary length.
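The constant-sum result is easy to verify numerically: for any series whatsoever, the sample autocorrelations at lags 1 to n-1, defined with the usual mean-centred, c_0-normalised estimator, sum to exactly -1/2. This follows because the centred deviations sum to zero, so the square of their sum — which expands into the variance term plus twice all the cross-lag products — vanishes identically.

```python
import numpy as np

def sample_acf_sum(x):
    """Sum of the sample autocorrelations r_1, ..., r_{n-1},
    using the usual mean-centred, c_0-normalised estimator."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    d = x - x.mean()
    c0 = np.sum(d * d)
    return sum(np.sum(d[:n - k] * d[k:]) for k in range(1, n)) / c0

# the sum is -1/2 for *any* series, short- or long-memory alike:
rng = np.random.default_rng(0)
print(round(sample_acf_sum(rng.normal(size=50)), 8))              # -0.5
print(round(sample_acf_sum(np.cumsum(rng.normal(size=400))), 8))  # -0.5
```

Since the value is the same for white noise and for a random walk, no diagnostic built on this sum can discriminate long memory from short memory, which is the paper's point.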
Abstract:
Expressions for the viscosity correction function, and hence bulk complex impedance, density, compressibility, and propagation constant, are obtained for a rigid frame porous medium whose pores are prismatic with fixed cross-sectional shape, but of variable pore size distribution. The low- and high-frequency behavior of the viscosity correction function is derived for the particular case of a log-normal pore size distribution, in terms of coefficients which can, in general, be computed numerically, and are given here explicitly for the particular cases of pores of equilateral triangular, circular, and slitlike cross-section. Simple approximate formulae, based on two-point Padé approximants for the viscosity correction function, are obtained, which avoid a requirement for numerical integration or evaluation of special functions, and their accuracy is illustrated and investigated for the three pore shapes already mentioned.
Abstract:
• UV-B radiation currently represents c. 1.5% of incoming solar radiation. However, significant changes are known to have occurred in the amount of incoming radiation both on recent and on geological timescales. Until now it has not been possible to reconstruct a detailed measure of UV-B radiation beyond c. 150 yr ago.
• Here, we studied the suitability of fossil Pinus spp. pollen to record variations in UV-B flux through time. In view of the large size of the grain and its long fossil history, we hypothesized that this grain could provide a good proxy for recording past variations in UV-B flux.
• Two key objectives were addressed: to determine whether there was, as in other studied species, a clear relationship between UV-B-absorbing compounds in the sporopollenin of extant pollen and the magnitude of UV-B radiation to which it had been exposed; and to determine whether these compounds could be extracted from a small enough sample of fossil pollen to make reconstruction of a continuous record through time a realistic prospect.
• Preliminary results indicate the excellent potential of this species for providing a quantitative record of UV-B through time. Using this technique, we present the first record of UV-B flux during the last 9500 yr from a site near Bergen, Norway.
Abstract:
Background Appropriately conducted adaptive designs (ADs) offer many potential advantages over conventional trials. They make better use of accruing data, potentially saving time, trial participants, and limited resources compared to conventional, fixed sample size designs. However, one can argue that ADs are not implemented as often as they should be, particularly in publicly funded confirmatory trials. This study explored barriers, concerns, and potential facilitators to the appropriate use of ADs in confirmatory trials among key stakeholders. Methods We conducted three cross-sectional, online parallel surveys between November 2014 and January 2015. The surveys were based upon findings drawn from in-depth interviews of key research stakeholders, predominantly in the UK, and targeted Clinical Trials Units (CTUs), public funders, and private sector organisations. Response rates were as follows: 30 (55%) UK CTUs, 17 (68%) private sector, and 86 (41%) public funders. A Rating Scale Model was used to rank barriers and concerns in order of perceived importance for prioritisation. Results Top-ranked barriers included the lack of bridge funding accessible to UK CTUs to support the design of ADs, limited practical implementation knowledge, preference for traditional mainstream designs, difficulties in marketing ADs to key stakeholders, time constraints to support ADs relative to competing priorities, lack of applied training, and insufficient access to case studies of undertaken ADs to facilitate practical learning and successful implementation. Associated practical complexities and inadequate data management infrastructure to support ADs were reported as more pronounced in the private sector. For funders of public research, the inadequate description of the rationale, scope, and decision-making criteria to guide the planned AD in grant proposals by researchers were all viewed as major obstacles.
Conclusions There are still persistent and important perceptions of individual and organisational obstacles hampering the use of ADs in confirmatory trials research. Stakeholder perceptions about barriers are largely consistent across sectors, with a few exceptions that reflect differences in organisations’ funding structures, experiences and characterisation of study interventions. Most barriers appear connected to a lack of practical implementation knowledge and applied training, and limited access to case studies to facilitate practical learning. Keywords: Adaptive designs; flexible designs; barriers; surveys; confirmatory trials; Phase 3; clinical trials; early stopping; interim analyses
Abstract:
Recruitment of patients to a clinical trial usually occurs over a period of time, resulting in the steady accumulation of data throughout the trial's duration. Yet, according to traditional statistical methods, the sample size of the trial should be determined in advance, and data collected on all subjects before analysis proceeds. For ethical and economic reasons, the technique of sequential testing has been developed to enable the examination of data at a series of interim analyses. The aim is to stop recruitment to the study as soon as there is sufficient evidence to reach a firm conclusion. In this paper we present the advantages and disadvantages of conducting interim analyses in phase III clinical trials, together with the key steps to enable the successful implementation of sequential methods in this setting. Examples are given of completed trials that were carried out sequentially, and references to relevant literature and software are provided.
Abstract:
This is a reply to Ortega-Baes et al.'s (2010) survey of 25 Argentinean species of cacti evaluated for vivipary. We argue that the sample size and geographic area of the species investigated are insufficient to totally exclude the putative commonness of this condition in the Cactaceae. We indicate possible reasons why they did not find viviparous fruits in their survey. Failure to detect vivipary in cacti of NW Argentina may be correlated with limited taxonomic sampling and geographic region in addition to intrinsic and extrinsic plant factors, including different stages of fruit and seed development and genetic, ecological, and edaphic aspects, which, individually or in concert, control precocious germination. We maintain that viviparity is putatively frequent in this family and list 16 new cases for a total of 53 viviparous cacti, which make up ca. 4% incidence of viviparism in the Cactaceae, a substantially higher percentage than most angiosperm families exhibiting this condition. The Cactaceae ranks fourth in frequency of viviparity after the aquatic families of mangroves and seagrasses. We suggest the re-evaluation of cactus vivipary, primarily as a reproductive adaptation to changing environments and physiological stress with a secondary role as a reproductive strategy with limited offspring dispersal/survival and fitness advantages. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
This work reports on magnetic measurements of the quasi-two-dimensional (quasi-2D) system Zn(1-x)Mn(x)In(2)Se(4), with 0.01 <= x <= 1.00. For x > 0.67, the quasi-2D system seems to develop spin-glass behaviour. Evidence of a true phase transition is provided by the steep increase of the nonlinear susceptibility χ_nl when approaching T_C from above. The static scaling of the χ_nl data yields critical exponents δ = 4.0 ± 0.2, φ = 4.37 ± 0.17 and T_C = 3.4 ± 0.1 K for the sample with x = 1.00, and similar values for the sample with x = 0.87. These critical exponents are in good agreement with values reported for other spin-glass systems with short-range interactions.
Abstract:
In the context of either Bayesian or classical sensitivity analyses of over-parametrized models for incomplete categorical data, it is well known that prior dependence of posterior inferences on nonidentifiable parameters, or the use of overly parsimonious over-parametrized models, may lead to erroneous conclusions. Nevertheless, some authors either pay no attention to which parameters are nonidentifiable or do not appropriately account for possible prior-dependence. We review the literature on this topic and consider simple examples to emphasize that in both inferential frameworks, the subjective components can influence results in nontrivial ways, irrespective of the sample size. Specifically, we show that prior distributions commonly regarded as slightly informative or noninformative may actually be too informative for nonidentifiable parameters, and that the choice of over-parametrized models may drastically impact the results, suggesting that a careful examination of their effects should be considered before drawing conclusions.
Abstract:
The purpose of this paper is to investigate how individuals with different characteristics make their choice decisions when purchasing STIGA table tennis blades, which are combinations of various attributes such as price, control, attack, etc. It is expected that the general trend of choice behavior for this special commodity can be, at least to some extent, revealed. Data were collected using questionnaires sent to registered members of a table tennis club in China. The questionnaires included information and questions about individuals' monthly income levels, ages, technique styles, etc. A multinomial logit model was then applied to analyze the factors determining Chinese consumers' choice behavior for STIGA table tennis blades. The results indicated that the main element influencing Chinese consumers' choice of STIGA ping-pong blades was technique style; other variables did not seem to influence the choice of table tennis blades. These results might be explained by the limited sample size as well as unmeasured and immeasurable factors. Thus, more extensive research needs to be conducted in the future.