966 results for Class I error
Abstract:
Coordination and task allocation in distributed environments have been an important focus of research in recent years, and these topics lie at the heart of multi-agent systems. Agents in such systems need to cooperate and to take the other agents into account in their actions and decisions. Moreover, agents must coordinate among themselves to accomplish complex tasks that require more than one agent to be completed. These tasks can be so complex that agents may not know the location of a task or the time remaining before the task becomes obsolete. Agents may need to use communication in order to learn about the tasks in the environment; otherwise, they can waste a great deal of time searching for tasks within the scenario. Similarly, the distributed decision-making process can be even more complex if the environment is dynamic, uncertain, and real-time. In this dissertation, we consider constrained, cooperative multi-agent system environments (dynamic, uncertain, and real-time). Two approaches are proposed that enable the coordination of the agents. The first is a semi-centralized mechanism based on combinatorial auction techniques, whose main idea is to minimize the cost of the tasks allocated by the central agent to the teams of agents. This algorithm takes into account the agents' preferences over the tasks, which are included in the bid sent by each agent. The second is a fully decentralized scheduling approach, which allows agents to allocate their tasks taking into account their temporal preferences over the tasks. In this case, the performance of the system depends not only on the maximization or optimization criterion, but also on the agents' ability to adapt their allocations efficiently.
Additionally, in a dynamic environment, execution errors can occur in any plan due to uncertainty and to the failure of individual actions; an indispensable part of a planning system is therefore the ability to re-plan. This dissertation also provides a re-planning approach whose goal is to allow agents to re-coordinate their plans when problems in the environment prevent the execution of the plan. All of these approaches have been developed to allow agents to allocate and coordinate all of the complex tasks efficiently in a cooperative, dynamic, and uncertain multi-agent environment, and all of them have demonstrated their efficiency in experiments carried out in the RoboCup Rescue simulation environment.
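The semi-centralized mechanism described above can be illustrated with a minimal sketch: the central agent collects bids (costs reflecting each team's preferences) and awards each task to the cheapest bidder. All names and numbers are illustrative, and awarding tasks one at a time is a simplification of the dissertation's combinatorial approach, in which teams bid on bundles of tasks.

```python
def allocate_tasks(bids):
    """Greedy semi-centralized allocation: the central agent awards each
    task to the team whose bid (a cost reflecting its preferences) is
    lowest. bids: {task: {team: cost}}. Returns {task: team}.
    Note: a real combinatorial auction solves winner determination over
    bundles of tasks; this per-task version is only a sketch."""
    allocation = {}
    for task, offers in sorted(bids.items()):
        allocation[task] = min(offers, key=offers.get)
    return allocation

# hypothetical rescue-scenario bids
bids = {"rescue": {"team_a": 3.0, "team_b": 5.0},
        "extinguish": {"team_a": 4.0, "team_b": 2.5}}
assignment = allocate_tasks(bids)  # each task goes to its cheapest bidder
```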
Abstract:
In clinical trials, situations often arise where more than one response from each patient is of interest; and it is required that any decision to stop the study be based upon some or all of these measures simultaneously. Theory for the design of sequential experiments with simultaneous bivariate responses is described by Jennison and Turnbull (Jennison, C., Turnbull, B. W. (1993). Group sequential tests for bivariate response: interim analyses of clinical trials with both efficacy and safety endpoints. Biometrics 49:741-752) and Cook and Farewell (Cook, R. J., Farewell, V. T. (1994). Guidelines for monitoring efficacy and toxicity responses in clinical trials. Biometrics 50:1146-1152) in the context of one efficacy and one safety response. These expositions are in terms of normally distributed data with known covariance. The methods proposed require specification of the correlation, ρ, between test statistics monitored as part of the sequential test. It can be difficult to quantify ρ, and previous authors have suggested simply taking the lowest plausible value, as this will guarantee power. This paper begins with an illustration of the effect that inappropriate specification of ρ can have on the preservation of trial error rates. It is shown that both the type I error and the power can be adversely affected. As a possible solution to this problem, formulas are provided for the calculation of correlation from data collected as part of the trial. An adaptive approach is proposed and evaluated that makes use of these formulas, and an example is provided to illustrate the method. Attention is restricted to the bivariate case for ease of computation, although the formulas derived are applicable in the general multivariate case.
Abstract:
In a sequential clinical trial, accrual of data on patients often continues after the stopping criterion for the study has been met. This is termed “overrunning.” Overrunning occurs mainly when the primary response from each patient is measured after some extended observation period. The objective of this article is to compare two methods of allowing for overrunning. In particular, simulation studies are reported that assess the two procedures in terms of how well they maintain the intended type I error rate. The effect on power resulting from the incorporation of “overrunning data” using the two procedures is evaluated.
Abstract:
There is increasing interest in combining Phases II and III of clinical development into a single trial in which one of a small number of competing experimental treatments is ultimately selected and where a valid comparison is made between this treatment and the control treatment. Such a trial usually proceeds in stages, with the least promising experimental treatments dropped as soon as possible. In this paper we present a highly flexible design that uses adaptive group sequential methodology to monitor an order statistic. By using this approach, it is possible to design a trial which can have any number of stages, begins with any number of experimental treatments, and permits any number of these to continue at any stage. The test statistic used is based upon efficient scores, so the method can be easily applied to binary, ordinal, failure time, or normally distributed outcomes. The method is illustrated with an example, and simulations are conducted to investigate its type I error rate and power under a range of scenarios.
Abstract:
Sequential methods provide a formal framework by which clinical trial data can be monitored as they accumulate. The results from interim analyses can be used either to modify the design of the remainder of the trial or to stop the trial as soon as sufficient evidence of either the presence or absence of a treatment effect is available. The circumstances under which the trial will be stopped with a claim of superiority for the experimental treatment must, however, be determined in advance so as to control the overall type I error rate. One approach to calculating the stopping rule is the group-sequential method. A relatively recent alternative to group-sequential approaches is the adaptive design method. This latter approach provides considerable flexibility in changes to the design of a clinical trial at an interim point. However, a criticism is that the method by which evidence from different parts of the trial is combined means that a final comparison of treatments is not based on a sufficient statistic for the treatment difference, suggesting that the method may lack power. The aim of this paper is to compare two adaptive design approaches with the group-sequential approach. We first compare the form of the stopping boundaries obtained using the different methods. We then focus on a comparison of the power of the different trials when they are designed so as to be as similar as possible. We conclude that all methods acceptably control type I error rate and power when the sample size is modified based on a variance estimate, provided no interim analysis is so small that the asymptotic properties of the test statistic no longer hold. In the latter case, the group-sequential approach is to be preferred.
Provided that asymptotic assumptions hold, the adaptive design approaches control the type I error rate even if the sample size is adjusted on the basis of an estimate of the treatment effect, showing that the adaptive designs allow more modifications than the group-sequential method.
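A common way adaptive designs combine evidence from different parts of a trial is the inverse-normal combination rule, which pools stage-wise p-values with prespecified weights. The two-stage setup and the numbers below are illustrative, not taken from the paper; this is only a generic sketch of the combination step.

```python
from math import sqrt
from statistics import NormalDist

def inverse_normal_combination(p_values, weights):
    """Combine independent stage-wise one-sided p-values into a single
    z-statistic using prespecified weights (inverse-normal rule)."""
    nd = NormalDist()
    z = sum(w * nd.inv_cdf(1 - p) for w, p in zip(weights, p_values))
    z /= sqrt(sum(w * w for w in weights))
    return z, 1 - nd.cdf(z)  # combined z and combined one-sided p-value

# two stages with equal prespecified weights (illustrative numbers)
z, p = inverse_normal_combination([0.04, 0.03], weights=[1.0, 1.0])
```

Because the weights are fixed in advance, the combined statistic retains its null distribution even if the second-stage sample size is modified at the interim analysis, which is the property the paper's comparison turns on.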
Abstract:
Viral fusion proteins mediate the merger of host and viral membranes during cell entry for all enveloped viruses. Baculovirus glycoprotein gp64 (gp64) is unusual in promoting entry into both insect and mammalian cells and is distinct from established class I and class II fusion proteins. We report the crystal structure of its postfusion form, which explains a number of gp64's biological properties including its cellular promiscuity, identifies the fusion peptides and shows it to be the third representative of a new class (III) of fusion proteins with unexpected structural homology with vesicular stomatitis virus G and herpes simplex virus type 1 gB proteins. We show that domains of class III proteins have counterparts in both class I and II proteins, suggesting that all these viral fusion machines are structurally more related than previously thought.
Abstract:
Assaying a large number of genetic markers from patients in clinical trials is now possible in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker, once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and the individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
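The permutation step described above can be sketched generically: compute an omnibus statistic on the observed data, then recompute it under random relabellings of the treatment arms, so that correlation among markers is preserved. The toy statistic below (a sum of squared group-mean differences) stands in for the paper's combination of per-SNP interaction F-statistics, and all data and names are illustrative.

```python
import random

def stat_omnibus(markers, labels):
    """Toy omnibus statistic: sum over markers of the squared difference
    in group means (a stand-in for the paper's combined F-statistics)."""
    total = 0.0
    for col in markers:
        g1 = [x for x, lab in zip(col, labels) if lab == 1]
        g0 = [x for x, lab in zip(col, labels) if lab == 0]
        total += (sum(g1) / len(g1) - sum(g0) / len(g0)) ** 2
    return total

def omnibus_permutation_pvalue(markers, labels, n_perm=999, seed=1):
    """Global p-value by permuting treatment labels; the correlation
    among markers is preserved because every permutation reuses the
    whole marker matrix."""
    rng = random.Random(seed)
    observed = stat_omnibus(markers, labels)
    exceed = 0
    for _ in range(n_perm):
        perm = labels[:]
        rng.shuffle(perm)
        if stat_omnibus(markers, perm) >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

labels = [1] * 10 + [0] * 10                 # toy treatment arms
markers = [[1.0] * 10 + [0.0] * 10,          # two correlated toy "SNPs"
           [0.5] * 10 + [0.0] * 10]          # with a strong group effect
p_global = omnibus_permutation_pvalue(markers, labels)
```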
Abstract:
This paper considers methods for testing for superiority or non-inferiority in active-control trials with binary data, when the relative treatment effect is expressed as an odds ratio. Three asymptotic tests for the log-odds ratio based on the unconditional binary likelihood are presented, namely the likelihood ratio, Wald and score tests. All three tests can be implemented straightforwardly in standard statistical software packages, as can the corresponding confidence intervals. Simulations indicate that the three alternatives are similar in terms of the Type I error, with values close to the nominal level. However, when the non-inferiority margin becomes large, the score test slightly exceeds the nominal level. In general, the highest power is obtained from the score test, although all three tests are similar and the observed differences in power are not of practical importance. Copyright (C) 2007 John Wiley & Sons, Ltd.
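Of the three tests, the Wald test is the simplest to sketch for a 2×2 table; the counts and margin below are illustrative, and the score and likelihood ratio tests require iterative maximization not shown here.

```python
from math import log, sqrt
from statistics import NormalDist

def wald_log_odds_ratio(a, b, c, d, margin=0.0):
    """One-sided Wald z-test for H0: log(OR) <= margin on a 2x2 table.
    a/b = events/non-events on the experimental arm,
    c/d = events/non-events on the control arm.
    margin = 0 tests superiority; a negative margin tests non-inferiority."""
    log_or = log((a * d) / (b * c))
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # asymptotic standard error
    z = (log_or - margin) / se
    return log_or, z, 1 - NormalDist().cdf(z)  # one-sided p-value

# illustrative counts only: 40/50 responders vs 25/50 on control
log_or, z, p = wald_log_odds_ratio(40, 10, 25, 25)
```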
Abstract:
The node-density effect is an artifact of phylogeny reconstruction that can cause branch lengths to be underestimated in areas of the tree with fewer taxa. Webster, Payne, and Pagel (2003, Science 301:478) introduced a statistical procedure (the "delta" test) to detect this artifact, and here we report the results of computer simulations that examine the test's performance. In a sample of 50,000 random data sets, we find that the delta test detects the artifact in 94.4% of cases in which it is present. When the artifact is not present (n = 10,000 simulated data sets) the test showed a type I error rate of approximately 1.69%, incorrectly reporting the artifact in 169 data sets. Three measures of tree shape or "balance" failed to predict the size of the node-density effect. This may reflect the relative homogeneity of our randomly generated topologies, but emphasizes that nearly any topology can suffer from the artifact, the effect not being confined only to highly unevenly sampled or otherwise imbalanced trees. The ability to screen phylogenies for the node-density artifact is important for phylogenetic inference and for researchers using phylogenetic trees to infer evolutionary processes, including their use in molecular clock dating. [Delta test; molecular clock; molecular evolution; node-density effect; phylogenetic reconstruction; speciation; simulation.]
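The reported type I error rate is a simple proportion over the null-simulated data sets; the bookkeeping, using the counts given in the abstract, is:

```python
def empirical_type_i_error(false_positives, n_null_datasets):
    """Empirical type I error rate of a test: the fraction of data sets
    simulated without the artifact in which the test reports it anyway."""
    return false_positives / n_null_datasets

# counts reported in the simulation study: 169 false reports in 10,000
rate = empirical_type_i_error(169, 10_000)  # 0.0169, i.e. ~1.69%
```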
Abstract:
Background: MHC Class I molecules present antigenic peptides to cytotoxic T cells, which forms an integral part of the adaptive immune response. Peptides are bound within a groove formed by the MHC heavy chain. Previous approaches to MHC Class I-peptide binding prediction have largely concentrated on the peptide anchor residues located at the P2 and C-terminus positions. Results: A large dataset comprising MHC-peptide structural complexes was created by remodelling pre-determined X-ray crystallographic structures. Static energetic analysis, following energy minimisation, was performed on the dataset in order to characterise interactions between bound peptides and the MHC Class I molecule, partitioning the interactions within the groove into van der Waals, electrostatic and total non-bonded energy contributions. Conclusion: The QSAR techniques of Genetic Function Approximation (GFA) and Genetic Partial Least Squares (G/PLS) were used to identify key interactions between the two molecules by comparing the calculated energy values with experimentally-determined BL50 data. Although the peptide termini binding interactions help ensure the stability of the MHC Class I-peptide complex, the central region of the peptide is also important in defining the specificity of the interaction. As thermodynamic studies indicate that peptide association and dissociation may be driven entropically, it may be necessary to incorporate entropic contributions into future calculations.
Abstract:
In recent years there has been a rapid growth of interest in exploring the relationship between nutritional therapies and the maintenance of cognitive function in adulthood. Emerging evidence reveals an increasingly complex picture with respect to the benefits of various food constituents on learning, memory and psychomotor function in adults. However, to date, there has been little consensus in human studies on the range of cognitive domains to be tested or the particular tests to be employed. To illustrate the potential difficulties that this poses, we conducted a systematic review of existing human adult randomised controlled trial (RCT) studies that have investigated the effects of 24 d to 36 months of supplementation with flavonoids and micronutrients on cognitive performance. There were thirty-nine studies employing a total of 121 different cognitive tasks that met the criteria for inclusion. Results showed that less than half of these studies reported positive effects of treatment, with some important cognitive domains either under-represented or not explored at all. Although there was some evidence of sensitivity to nutritional supplementation in a number of domains (for example, executive function, spatial working memory), interpretation is currently difficult given the prevailing 'scattergun approach' for selecting cognitive tests. Specifically, the practice means that it is often difficult to distinguish between a boundary condition for a particular nutrient and a lack of task sensitivity. We argue that for significant future progress to be made, researchers need to pay much closer attention to existing human RCT and animal data, as well as to more basic issues surrounding task sensitivity, statistical power and type I error.
Abstract:
Ribonucleotide reductases supply cells with their deoxyribonucleotides. Three enzyme types are known, classes I, II and III. Class II enzymes are anaerobic whereas class I enzymes are aerobic, and so class I and II enzymes are often produced by the same organism under opposing oxygen regimes. Escherichia coli contains two types of class I enzyme (Ia and Ib) with the Fe-dependent Ia enzyme (NrdAB) performing the major role aerobically, leaving the purpose of the Ib enzyme (NrdEF) unclear. Several papers have recently focused on the class Ib enzymes showing that they are Mn (rather than Fe) dependent and suggesting that the E. coli NrdEF may function under redox-stress conditions. A paper published in this issue of Molecular Microbiology from James Imlay's group confirms that this unexplained NrdEF Ib enzyme is Mn-dependent, but shows that it does not substitute for NrdAB during redox stress. Instead, a role during iron restriction is demonstrated. Thus, the purpose of NrdEF (and possibly other class Ib enzymes) is to enhance growth under aerobic, low-iron conditions, and to functionally replace the Fe-dependent NrdAB when iron is unavailable. This finding reveals a new mechanism by which bacteria adjust to life under iron deprivation.
Abstract:
Background: Early microbial colonization of the gut reduces the incidence of infectious, inflammatory and autoimmune diseases. Recent population studies reveal that childhood hygiene is a significant risk factor for development of inflammatory bowel disease, thereby reinforcing the hygiene hypothesis and the potential importance of microbial colonization during early life. The extent to which early-life environment impacts on microbial diversity of the adult gut and subsequent immune processes has not been comprehensively investigated thus far. We addressed this important question using the pig as a model to evaluate the impact of early-life environment on microbe/host gut interactions during development. Results: Genetically-related piglets were housed in either indoor or outdoor environments or in experimental isolators. Analysis of over 3,000 16S rRNA sequences revealed major differences in mucosa-adherent microbial diversity in the ileum of adult pigs attributable to differences in early-life environment. Pigs housed in a natural outdoor environment showed a dominance of Firmicutes, in particular Lactobacillus, whereas animals housed in a hygienic indoor environment had reduced Lactobacillus and higher numbers of potentially pathogenic phylotypes. Our analysis revealed a strong negative correlation between the abundance of Firmicutes and pathogenic bacterial populations in the gut. These differences were exaggerated in animals housed in experimental isolators. Affymetrix microarray technology and Real-time Polymerase Chain Reaction revealed significant gut-specific gene responses also related to early-life environment. Significantly, indoor-housed pigs displayed increased expression of Type 1 interferon genes, Major Histocompatibility Complex class I and several chemokines. Gene Ontology and pathway analysis further confirmed these results.
Abstract:
We propose and analyze a simple mathematical model for susceptible prey (S)–infected prey (I)–predator (P) interaction, where the susceptible prey population (S) is infected directly from external sources as well as through contact with the infected class (I), and the predator completely avoids consuming the infected prey. The model is analyzed to obtain different thresholds of the key parameters under which the system exhibits stability around the biologically feasible equilibria. Through numerical simulations we display the effects of external infection and of infection through contact on the system dynamics, both in the absence and in the presence of the predator. We compare the system dynamics when infection occurs only through contact with that when it occurs through both contact and external sources. Our analysis demonstrates that under disease-selective predation, the stability and oscillations of the system are determined by two key parameters: the external infection rate and the force of infection through contact. Due to the introduction of external infection, the predator and prey populations show limit-cycle oscillations over a range of parametric values. We suggest that when predicting the dynamics of such an eco-epidemiological system, the modes of infection and the infection rates should be carefully investigated.
Abstract:
Histone deacetylase inhibitors (HDACIs) interfere with the epigenetic process of histone acetylation and are known to have analgesic properties in models of chronic inflammatory pain. The aim of this study was to determine whether these compounds could also affect neuropathic pain. Different class I HDACIs were delivered intrathecally into rat spinal cord in models of traumatic nerve injury and antiretroviral drug-induced peripheral neuropathy (stavudine, d4T). Mechanical and thermal hypersensitivity was attenuated by 40% to 50% as a result of HDACI treatment, but only if started before any insult. The drugs globally increased histone acetylation in the spinal cord, but appeared to have no measurable effects in relevant dorsal root ganglia in this treatment paradigm, suggesting that any potential mechanism should be sought in the central nervous system. Microarray analysis of dorsal cord RNA revealed the signature of the specific compound used (MS-275) and suggested that its main effect was mediated through HDAC1. Taken together, these data support a role for histone acetylation in the emergence of neuropathic pain.