967 results for Sequential analysis
Abstract:
In this paper we identify requirements for choosing a threat modelling formalisation for modelling sophisticated malware such as Duqu 2.0. We discuss the gaps in current formalisations and propose the use of Attack Trees with Sequential Conjunction for analysing complex attacks. The paper models Duqu 2.0 using the latest information drawn from both formal and informal sources, and provides a well-structured model that can be used for future analysis of Duqu 2.0 and related attacks.
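As a hedged illustration of the formalism the paper argues for, the sketch below shows one way an attack tree with sequential conjunction (SAND) could be represented and evaluated against an ordered attacker trace. The node labels and the Duqu-2.0-style steps are hypothetical examples, not taken from the paper's model.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    """Attack-tree node; op is LEAF, OR, AND (unordered) or SAND (sequential)."""
    label: str
    op: str = "LEAF"
    children: List["Node"] = field(default_factory=list)

def completion(node: Node, trace: List[str]) -> Optional[int]:
    """Position in the ordered trace at which the (sub)goal is achieved,
    or None if the trace does not achieve it."""
    if node.op == "LEAF":
        return trace.index(node.label) if node.label in trace else None
    times = [completion(c, trace) for c in node.children]
    if node.op == "OR":
        done = [t for t in times if t is not None]
        return min(done) if done else None
    if None in times:
        return None
    if node.op == "SAND" and times != sorted(times):
        return None                      # children achieved, but out of order
    return max(times)                    # AND, or SAND achieved in order

# Hypothetical fragment of a Duqu-2.0-style intrusion:
root = Node("exfiltrate data", "SAND", [
    Node("spear-phish foothold"),
    Node("escalate privileges"),
    Node("move laterally"),
])
print(completion(root, ["spear-phish foothold", "escalate privileges", "move laterally"]))  # 2
print(completion(root, ["move laterally", "spear-phish foothold", "escalate privileges"]))  # None
```

The SAND operator is what distinguishes this from a plain AND: the same set of achieved steps fails the goal when they occur in the wrong order.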
Abstract:
Ligands targeting G protein-coupled receptors (GPCRs) are currently classified as orthosteric, allosteric, or dualsteric/bitopic. Here, we introduce a new pharmacological concept for GPCR functional modulation: sequential receptor activation. Its hallmark is a stepwise ligand binding mode, with transient activation of a first receptor site followed by sustained activation of a second, topographically distinct site. We identify 4-CMTB (2-(4-chlorophenyl)-3-methyl-N-(thiazol-2-yl)butanamide), previously classified as a purely allosteric agonist of the free fatty acid receptor 2, as the first sequential activator, and corroborate its two-step activation in living cells by tracking integrated responses with innovative label-free biosensors that visualize multiple signaling inputs in real time. We validate this unique pharmacology with traditional cellular readouts, including mutational and pharmacological perturbations, along with computational methods, and propose a kinetic model applicable to the analysis of sequential receptor activation. We envision this form of dynamic agonism as a common principle by which nature spatiotemporally encodes cellular information.
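As a purely illustrative reduction of the concept described above (not the authors' fitted model), a minimal two-step scheme with transient occupation of a first site followed by sustained occupation of a second site, together with its mass-action rate equations, might look like:

```latex
% Illustrative two-step sequential activation scheme:
% R + L binds site 1 (transient state R1*L), then converts to the
% topographically distinct site-2 state (sustained state R2*L).
\begin{align*}
  \mathrm{R} + \mathrm{L}
    \;\underset{k_{-1}}{\overset{k_{+1}}{\rightleftharpoons}}\; \mathrm{R_1^{*}L}
    \;\underset{k_{-2}}{\overset{k_{+2}}{\rightleftharpoons}}\; \mathrm{R_2^{*}L} \\
  \frac{d[\mathrm{R_1^{*}L}]}{dt}
    &= k_{+1}[\mathrm{R}][\mathrm{L}] - (k_{-1}+k_{+2})[\mathrm{R_1^{*}L}]
       + k_{-2}[\mathrm{R_2^{*}L}] \\
  \frac{d[\mathrm{R_2^{*}L}]}{dt}
    &= k_{+2}[\mathrm{R_1^{*}L}] - k_{-2}[\mathrm{R_2^{*}L}]
\end{align*}
```

A transient signal from R1*L followed by a sustained signal from R2*L is the qualitative behaviour such a scheme produces when forward conversion to the second site dominates dissociation from the first.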
Abstract:
The European “Community Bureau of Reference” (BCR) sequential extraction procedure, the diffusive gradients in thin-films technique (DGT), and a physiologically based extraction test were applied to assess metal bioavailability in sediments of Lake Taihu (n = 13). All three methods showed that Cd was a significant problem in the western lake, whereas Cu, Zn, and Ni pollution was most severe in the northern lake. The sequential extraction revealed that, in the majority of the sediments, more than 50 % of the Cu and Zn was highly mobile and fell within the extractable fraction (AS1 + FM2 + OS3); in contrast, the extractable fractions of Ni and Cd were below 50 % at most sampling sites. Average Cu, Zn, Ni, and Cd bioaccessibilities were <50 % in the gastric phase. Zn and Cd bioaccessibilities in the intestinal phase were ∼50 % lower than in the gastric phase, while Cu and Ni bioaccessibilities were 47–57 % greater. Linear regression between the DGT and BCR measurements indicated that the extractable fractions (AS1 + FM2 + OS3) in the reducing environment were the main source of DGT uptake, suggesting that DGT is a good in situ tool for evaluating metal bioavailability in sediments.
Abstract:
Doctoral thesis, Electronic Engineering and Computing, Faculdade de Ciências e Tecnologia, Univ. do Algarve, 2005
Abstract:
The current epidemic of hepatitis C virus (HCV) infection in HIV-positive men who have sex with men (MSM) is associated with increasing use of recreational drugs. Multiple HCV infections have been reported in haemophiliacs and intravenous drug users. Using ultra-deep sequencing (UDS) analysis, we present the case of an HIV-positive MSM with evidence of three sequential HCV infections, each occurring during the acute phase of the preceding infection and following risk exposures. We observed rapid replacement of the original strain by the incoming genotype at subsequent time points. The impact of HCV super-infection remains unclear, and UDS may provide new insights.
Abstract:
This paper proposes an explanation for why efficient reforms are not carried out when losers have the power to block their implementation, even though compensating them is feasible. We construct a signaling model with two-sided incomplete information in which a government faces the task of sequentially implementing two reforms by bargaining with interest groups. The organization of interest groups is endogenous. Compensations are distortionary, and government types differ in their concern about these distortions. We show that, when compensations are allowed to be informative about the government’s type, there is a bias against the payment of compensations and the implementation of reforms. This is because paying high compensations today gives some interest groups an incentive to organize and oppose subsequent reforms with the sole purpose of receiving a transfer. By paying lower compensations, governments attempt to prevent such interest groups from organizing; however, this comes at the cost of reforms being blocked by interest groups with relatively high losses.
Abstract:
We study the problem of deriving a complete welfare ordering from a choice function. Under the sequential solution, the best alternative is the alternative chosen from the universal set; the second best is the one chosen when the best alternative is removed; and so on. We show that this is the only completion of Bernheim and Rangel's (2009) welfare relation that satisfies two natural axioms: neutrality, which ensures that the names of the alternatives are welfare-irrelevant; and persistence, which stipulates that every choice function between two welfare-identical choice functions must exhibit the same welfare ordering.
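The sequential solution described above is mechanical enough to state as a short sketch; here the choice function is assumed to be supplied as a callable, and the alternative names are illustrative.

```python
def sequential_solution(alternatives, choose):
    """Rank alternatives from best to worst under the sequential solution:
    the best is the one chosen from the universal set, the second best the
    one chosen once the best is removed, and so on."""
    remaining = set(alternatives)
    ranking = []
    while remaining:
        best = choose(remaining)
        ranking.append(best)
        remaining.remove(best)
    return ranking

# Example: a menu-dependent choice function, tabulated over all non-empty menus.
choice = {frozenset("abc"): "b", frozenset("ab"): "a", frozenset("ac"): "c",
          frozenset("bc"): "b", frozenset("a"): "a", frozenset("b"): "b",
          frozenset("c"): "c"}
print(sequential_solution("abc", lambda menu: choice[frozenset(menu)]))  # ['b', 'c', 'a']
```

Note that even though 'a' is chosen over 'b' from the menu {a, b}, the sequential solution still ranks b above a, because only the choices from the successively shrinking menus starting at the universal set matter.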
Abstract:
The study was carried out to understand the effect of a silver-silica nanocomposite (Ag-SiO2NC) on the cell wall integrity, metabolism, and genetic stability of Pseudomonas aeruginosa, a multiple drug-resistant bacterium. Bacterial sensitivity towards antibiotics and Ag-SiO2NC was studied using the standard disc diffusion and death rate assays, respectively. The effect of Ag-SiO2NC on cell wall integrity was monitored using an SDS assay and fatty acid profile analysis, while the effects on metabolism and genetic stability were assayed microscopically using CTC viability staining and the comet assay, respectively. P. aeruginosa was found to be resistant to the β-lactam, glycopeptide, sulfonamide, quinolone, nitrofurantoin, and macrolide classes of antibiotics. Complete mortality of the bacterium was achieved at an Ag-SiO2NC concentration of 80 μg ml−1. Cell wall integrity decreased with time, reaching a plateau of 70 % at 110 min. Changes were also noticed in the proportions of fatty acids after treatment. Inside the cytoplasm, complete inhibition of the electron transport system was achieved with 100 μg ml−1 Ag-SiO2NC, followed by DNA breakage. The study thus demonstrates that Ag-SiO2NC invades the cytoplasm of multiple drug-resistant P. aeruginosa by impinging upon cell wall integrity and kills the cells by interfering with the electron transport chain and genetic stability.
Abstract:
In most classical frameworks for learning from examples, it is assumed that examples are randomly drawn and presented to the learner. In this paper, we consider the possibility of a more active learner who is allowed to choose his/her own examples. Our investigations are carried out in a function approximation setting. In particular, using arguments from optimal recovery (Micchelli and Rivlin, 1976), we develop an adaptive sampling strategy (equivalent to adaptive approximation) for arbitrary approximation schemes. We provide a general formulation of the problem and show how it can be regarded as sequential optimal recovery. We demonstrate the application of this general formulation to two special cases of functions on the real line: (1) monotonically increasing functions and (2) functions with bounded derivative. An extensive investigation of the sample complexity of approximating these functions is conducted, yielding both theoretical and empirical results on test functions. Our theoretical results (stated in PAC-style), along with the simulations, demonstrate the superiority of our active scheme over both passive learning and classical optimal recovery. The analysis of active function approximation is conducted in a worst-case setting, in contrast with Bayesian paradigms obtained from optimal design (MacKay, 1992).
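The flavour of the active strategy can be illustrated for the second special case, functions with bounded derivative: with samples already taken, the derivative bound confines the function to a cone between neighbouring samples, and the learner queries where the worst-case envelope gap is largest. The sketch below is an assumption-level illustration in that spirit, not the paper's exact algorithm; the test function and derivative bound are arbitrary.

```python
import math

def active_sample(f, lip, n_queries, a=0.0, b=1.0):
    """Adaptively sample f on [a, b], assuming |f'| <= lip: at each step,
    query the point where the worst-case uncertainty implied by the
    derivative bound is largest (sequential optimal-recovery flavour)."""
    xs, ys = [a, b], [f(a), f(b)]
    for _ in range(n_queries):
        best_gap, best_x = -1.0, None
        for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
            gap = (lip * (x1 - x0) - abs(y1 - y0)) / 2.0        # max envelope width
            x_star = 0.5 * (x0 + x1) + (y0 - y1) / (2.0 * lip)  # where the cones meet
            if gap > best_gap:
                best_gap, best_x = gap, x_star
        i = next(k for k, x in enumerate(xs) if x > best_x)     # keep xs sorted
        xs.insert(i, best_x)
        ys.insert(i, f(best_x))
    return xs, ys

# Example: a smooth test function with derivative bounded by 1.
xs, ys = active_sample(lambda x: math.sin(2 * math.pi * x) / (2 * math.pi),
                       lip=1.0, n_queries=10)
print([round(x, 3) for x in xs])
```

A passive learner would spread the same budget of samples uniformly; the active rule instead concentrates queries where the data so far leave the most room for error, which is the source of the sample-complexity gains the paper quantifies.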
Abstract:
Changes in mature forest cover amount, composition, and configuration can be of significant consequence to wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of such competing approaches to forest planning as aggregated vs. dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios on biodiversity conservation objectives. Scenarios were assessed in the context of a broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate future forest conditions resulting from alternative forest management scenarios, and used a process-based fire-simulation model to simulate future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies. This included use of sequential time-restricted harvest blocks (created for woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis but reduced the generality of interpretations. We found that forest management options that create linear strips of old forest deviated the most from simulated natural patterns and had the greatest negative effects on habitat occupancy, whereas policy options that specify deferment and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario that focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but this scenario may be improved by adding some broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.
Abstract:
A number of authors have proposed clinical trial designs involving the comparison of several experimental treatments with a control treatment in two or more stages. At the end of the first stage, the most promising experimental treatment is selected, and all other experimental treatments are dropped from the trial. Provided it is good enough, the selected experimental treatment is then compared with the control treatment in one or more subsequent stages. The analysis of data from such a trial is problematic because of the treatment selection and the possibility of stopping at interim analyses. These aspects lead to bias in the maximum-likelihood estimate of the advantage of the selected experimental treatment over the control and to inaccurate coverage for the associated confidence interval. In this paper, we evaluate the bias of the maximum-likelihood estimate and propose a bias-adjusted estimate. We also propose an approach to the construction of a confidence region for the vector of advantages of the experimental treatments over the control based on an ordering of the sample space. These regions are shown to have accurate coverage, although they are also shown to be necessarily unbounded. Confidence intervals for the advantage of the selected treatment are obtained from the confidence regions and are shown to have more accurate coverage than the standard confidence interval based upon the maximum-likelihood estimate and its asymptotic standard error.
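The selection bias being corrected can be seen in a few lines of simulation: pick the best-looking of several identical experimental arms at an interim look, and the naive estimate of its advantage over control is too large on average. The sample sizes, number of arms, and Gaussian outcomes below are arbitrary illustrations, not the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(0)
n_per_arm, n_arms, true_effect, sigma, reps = 50, 4, 0.0, 1.0, 20_000

naive_estimates = []
for _ in range(reps):
    control = rng.normal(0.0, sigma, n_per_arm).mean()
    arms = rng.normal(true_effect, sigma, (n_arms, n_per_arm)).mean(axis=1)
    naive_estimates.append(arms.max() - control)   # naive MLE for the selected arm

print("true advantage:", true_effect)
print("mean naive estimate:", round(float(np.mean(naive_estimates)), 3))  # clearly > 0
```

With four arms and no real treatment effect, the average naive estimate is noticeably positive, which is exactly the bias that a bias-adjusted estimator and the sample-space-ordering confidence regions are designed to address.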
Abstract:
Most statistical methodology for phase III clinical trials focuses on the comparison of a single experimental treatment with a control. An increasing desire to reduce the time before regulatory approval of a new drug is sought has led to the development of two-stage or sequential designs for trials that combine the definitive analysis associated with phase III with the treatment selection element of a phase II study. In this paper we consider a trial in which the most promising of a number of experimental treatments is selected at the first interim analysis. This considerably reduces the computational load associated with the construction of stopping boundaries compared with the approach proposed by Follmann, Proschan and Geller (Biometrics 1994; 50: 325-336). The computational requirement does not exceed that for the sequential comparison of a single experimental treatment with a control. Existing methods are extended in two ways. First, the use of the efficient score as a test statistic makes the analysis of binary, normal or failure-time data, as well as adjustment for covariates or stratification, straightforward. Second, the question of trial power is considered, enabling determination of the sample size required to give specified power. Copyright © 2003 John Wiley & Sons, Ltd.
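For binary data, the efficient score statistic referred to above has a simple closed form at each interim look; the sketch below uses the standard score and Fisher information for the log odds ratio (generic formulas given as an illustration, not the paper's full boundary construction).

```python
def score_statistics(successes_exp, n_exp, successes_ctl, n_ctl):
    """Efficient score Z and Fisher information V for the log odds ratio
    comparing an experimental arm with control on binary outcomes."""
    n = n_exp + n_ctl
    s = successes_exp + successes_ctl
    z = (n_ctl * successes_exp - n_exp * successes_ctl) / n
    v = n_exp * n_ctl * s * (n - s) / n**3
    return z, v

# Illustrative interim look: 30/50 responders on the selected arm vs 20/50 on control.
z, v = score_statistics(30, 50, 20, 50)
print(z, v, z / v**0.5)   # Z = 5.0, V = 6.25, standardized statistic = 2.0
```

The appeal noted in the abstract is that analogous (Z, V) pairs exist for normal and failure-time outcomes and for covariate-adjusted models, so the same stopping boundaries can be applied across endpoint types.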
Abstract:
Sequential methods provide a formal framework by which clinical trial data can be monitored as they accumulate. The results from interim analyses can be used either to modify the design of the remainder of the trial or to stop the trial as soon as sufficient evidence of either the presence or absence of a treatment effect is available. The circumstances under which the trial will be stopped with a claim of superiority for the experimental treatment must, however, be determined in advance so as to control the overall type I error rate. One approach to calculating the stopping rule is the group-sequential method. A relatively recent alternative to group-sequential approaches is the adaptive design method. This latter approach provides considerable flexibility for changes to the design of a clinical trial at an interim point. However, a criticism is that the way in which evidence from different parts of the trial is combined means that the final comparison of treatments is not based on a sufficient statistic for the treatment difference, suggesting that the method may lack power. The aim of this paper is to compare two adaptive design approaches with the group-sequential approach. We first compare the form of the stopping boundaries obtained using the different methods. We then focus on a comparison of the power of the different trials when they are designed to be as similar as possible. We conclude that all methods acceptably control the type I error rate and power when the sample size is modified based on a variance estimate, provided no interim analysis is so small that the asymptotic properties of the test statistic no longer hold. In the latter case, the group-sequential approach is to be preferred. Provided that asymptotic assumptions hold, the adaptive design approaches control the type I error rate even if the sample size is adjusted on the basis of an estimate of the treatment effect, showing that the adaptive designs allow more modifications than the group-sequential method.
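The contrast drawn above can be made concrete with a toy two-stage example: the group-sequential statistic pools all the data into a single sufficient statistic, while an adaptive (combination-test) statistic combines stage-wise p-values with prespecified weights, here via the inverse-normal rule. Stage sizes, means, and weights are arbitrary illustrations.

```python
import math
from statistics import NormalDist

def pooled_z(stage_means, stage_ns, sigma=1.0):
    """Group-sequential flavour: one z statistic from all data pooled."""
    n = sum(stage_ns)
    mean = sum(m * k for m, k in zip(stage_means, stage_ns)) / n
    return mean / (sigma / math.sqrt(n))

def inverse_normal_z(stage_pvalues, weights):
    """Adaptive-design flavour: weighted sum of stage-wise z-scores,
    with prespecified weights satisfying sum(w**2) = 1."""
    nd = NormalDist()
    return sum(w * nd.inv_cdf(1.0 - p) for w, p in zip(weights, stage_pvalues))

# One-sided, one-sample illustration with known sigma = 1 and two stages.
nd = NormalDist()
p1 = 1.0 - nd.cdf(0.30 / (1.0 / math.sqrt(40)))   # stage 1: mean 0.30, n = 40
p2 = 1.0 - nd.cdf(0.15 / (1.0 / math.sqrt(60)))   # stage 2: mean 0.15, n = 60
print(round(pooled_z([0.30, 0.15], [40, 60]), 3))                              # ~2.10
print(round(inverse_normal_z([p1, p2], [math.sqrt(0.5), math.sqrt(0.5)]), 3))  # ~2.16
```

With equal prespecified weights, the combination statistic coincides with the pooled (sufficient-statistic) test only when the realized stage sizes match the weights, which is the root of the concern about lost power that the paper investigates.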
Abstract:
The International Citicoline Trial in acUte Stroke is a sequential phase III study of the use of the drug citicoline in the treatment of acute ischaemic stroke, which was initiated in 2006 in 56 treatment centres. The primary objective of the trial is to demonstrate improved recovery of patients randomized to citicoline relative to those randomized to placebo after 12 weeks of follow-up. The primary analysis will take the form of a global test combining the dichotomized results of assessments on three well-established scales: the Barthel Index, the modified Rankin scale and the National Institutes of Health Stroke Scale. This approach was previously used in the analysis of the influential National Institute of Neurological Disorders and Stroke trial of recombinant tissue plasminogen activator in stroke. The purpose of this paper is to describe how the trial was designed and, in particular, how the simultaneous objectives of taking three assessment scales into account, performing a series of interim analyses, and conducting treatment allocation and adjusting the analyses to account for prognostic factors, including more than 50 treatment centres, were addressed. Copyright © 2008 John Wiley & Sons, Ltd.
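The global-test idea can be illustrated with a deliberately simple statistic: dichotomize each scale, average the resulting indicators within a patient, and compare the two arms on that patient-level score, so that the correlation between the scales is absorbed into the variance estimate. This is only a hedged illustration of what a global test over several dichotomized scales can look like; the trial's actual analysis uses its own prespecified global statistic and adjustments.

```python
import numpy as np

def simple_global_test(success_trt, success_ctl):
    """Two-sample z statistic on per-patient averages of several dichotomized
    outcome scales (rows = patients, columns = scales, entries 0/1)."""
    s_t = success_trt.mean(axis=1)              # patient-level score, treated
    s_c = success_ctl.mean(axis=1)              # patient-level score, control
    diff = s_t.mean() - s_c.mean()
    se = np.sqrt(s_t.var(ddof=1) / len(s_t) + s_c.var(ddof=1) / len(s_c))
    return diff / se

# Hypothetical data: 200 patients per arm, three correlated binary outcomes.
rng = np.random.default_rng(1)
latent_t = rng.normal(0.3, 1.0, (200, 1)) + rng.normal(0.0, 0.5, (200, 3))
latent_c = rng.normal(0.0, 1.0, (200, 1)) + rng.normal(0.0, 0.5, (200, 3))
print(round(float(simple_global_test((latent_t > 0).astype(int),
                                     (latent_c > 0).astype(int))), 2))
```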
Abstract:
Assaying a large number of genetic markers from patients in clinical trials is now possible in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker, once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and the individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright © 2007 John Wiley & Sons, Ltd.
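For the fixed-sample-size version of the omnibus test, the structure described above can be sketched directly: compute a per-SNP F statistic for the treatment-by-SNP interaction, combine them (here by summation), and permute the treatment labels to obtain a global p-value that respects the correlation between SNPs. This is an illustrative implementation under simplifying assumptions (raw treatment labels are permuted, additive SNP coding), not the paper's exact procedure or its sequential extension.

```python
import numpy as np

def interaction_F(y, treat, snp):
    """F statistic for adding a treatment-by-SNP interaction to a linear model
    that already contains the main effects (SNP coded as 0/1/2 allele counts)."""
    n = len(y)
    X0 = np.column_stack([np.ones(n), treat, snp])        # main effects only
    X1 = np.column_stack([X0, treat * snp])               # + interaction
    rss0 = np.sum((y - X0 @ np.linalg.lstsq(X0, y, rcond=None)[0]) ** 2)
    rss1 = np.sum((y - X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]) ** 2)
    return (rss0 - rss1) / (rss1 / (n - X1.shape[1]))

def omnibus_p(y, treat, snps, n_perm=200, seed=2):
    """Global permutation p-value for 'any gene-drug interaction': per-SNP F
    statistics are summed, and the same permuted treatment vector is reused for
    every SNP, preserving the linkage (correlation) structure among the SNPs."""
    rng = np.random.default_rng(seed)
    observed = sum(interaction_F(y, treat, s) for s in snps.T)
    perm_stats = []
    for _ in range(n_perm):
        t_perm = rng.permutation(treat)
        perm_stats.append(sum(interaction_F(y, t_perm, s) for s in snps.T))
    return (1 + sum(p >= observed for p in perm_stats)) / (n_perm + 1)

# Tiny illustration: 100 patients, 20 SNPs, a SNP main effect but no interaction.
rng = np.random.default_rng(3)
treat = rng.integers(0, 2, 100)
snps = rng.binomial(2, 0.3, (100, 20))
y = 0.3 * snps[:, 0] + rng.normal(0, 1, 100)
print(omnibus_p(y, treat, snps))   # null p-value, roughly uniform over repetitions
```

The sequential version in the paper repeats this kind of permutation reasoning at each interim look to set stopping boundaries that keep the overall type I error rate controlled.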