930 results for sequential injection analysis
Abstract:
The purpose of the present study was to explore the usefulness of the Mexican sequential organ failure assessment (MEXSOFA) score for assessing the risk of mortality for critically ill patients in the ICU. A total of 232 consecutive patients admitted to an ICU were included in the study. The MEXSOFA was calculated using the original SOFA scoring system with two modifications: the PaO2/FiO2 ratio was replaced with the SpO2/FiO2 ratio, and the evaluation of neurologic dysfunction was excluded. The ICU mortality rate was 20.2%. Patients with an initial MEXSOFA score of 9 points or less calculated during the first 24 h after admission to the ICU had a mortality rate of 14.8%, while those with an initial MEXSOFA score of 10 points or more had a mortality rate of 40%. The MEXSOFA score at 48 h was also associated with mortality: patients with a score of 9 points or less had a mortality rate of 14.1%, while those with a score of 10 points or more had a mortality rate of 50%. In a multivariate analysis, only the MEXSOFA score at 48 h was an independent predictor for in-ICU death with an OR = 1.35 (95%CI = 1.14-1.59, P < 0.001). The SOFA and MEXSOFA scores calculated 24 h after admission to the ICU demonstrated a good level of discrimination for predicting the in-ICU mortality risk in critically ill patients. The MEXSOFA score at 48 h was an independent predictor of death; with each 1-point increase, the odds of death increased by 35%.
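Because the model is logistic, the reported odds ratio acts multiplicatively on the odds of death. A minimal sketch of that arithmetic (the 1.35 figure comes from the abstract; everything else is illustrative):

```python
# Odds ratio per 1-point increase in the 48-h MEXSOFA score (from the abstract).
OR_PER_POINT = 1.35

def odds_multiplier(score_increase):
    """Multiplicative change in the odds of in-ICU death for a given
    increase in the 48-h MEXSOFA score, assuming a constant per-point
    odds ratio as in the fitted logistic model."""
    return OR_PER_POINT ** score_increase

print(round((odds_multiplier(1) - 1) * 100))  # 35 (% increase per point)
print(round(odds_multiplier(3), 2))           # 2.46 (odds compound multiplicatively)
```

Note the compounding: a 3-point deterioration multiplies the odds by 1.35³ ≈ 2.46, not by 3 × 35%.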
Abstract:
The effects of sample solvent composition and injection volume on the chromatographic peak profiles of two carbamate derivatives, methyl 2-benzimidazolecarbamate (MBC) and 3-butyl-2,4-dioxo[1,2-a]-s-triazinobenzimidazole (STB), were studied using reversed-phase high performance liquid chromatography. The study examined the effects of increasing the acetonitrile percentage in the sample solvent from 5 to 50%, increasing the methanol percentage from 5 to 50%, increasing the pH from 4.42 to 9.10, and increasing the buffer concentration from 0 to 0.12 M. The effects were studied at constant and increasing injection mass and at four injection volumes: 10, 50, 100 and 200 µL. The study demonstrated that the amount and type of organic solvent, the pH, and the buffer strength of the sample solution can have a pronounced effect on the peak heights, peak widths, and retention times of the compounds analysed. MBC, which is capable of intramolecular hydrogen bonding and has no tendency to ionize, showed a predictable increase in band broadening and a decrease in retention time at higher eluting strengths of the sample solvent. STB, which has a tendency to ionize or to interact strongly with the sample solvent, was influenced in various ways by changes in the sample solvent composition. The sample solvent effects became more pronounced as the injection volume increased and as the percentage of organic solvent in the sample solution became greater. The peak height increases for STB at increasing buffer concentrations became much more pronounced at higher analyte concentrations. It was shown that the widely accepted procedure of dissolving samples in the mobile phase does not yield the most efficient chromatograms. For that reason, samples should be dissolved in solutions with a higher aqueous content than that of the mobile phase whenever possible.
These results strongly suggest that all samples and standards, regardless of whether the standards are external or internal, be analysed at a constant sample solvent composition and a constant injection volume.
Abstract:
This paper proposes an explanation for why efficient reforms are not carried out when losers have the power to block their implementation, even though compensating them is feasible. We construct a signaling model with two-sided incomplete information in which a government faces the task of sequentially implementing two reforms by bargaining with interest groups. The organization of interest groups is endogenous. Compensations are distortionary, and government types differ in their concern about distortions. We show that, when compensations are allowed to be informative about the government’s type, there is a bias against the payment of compensations and the implementation of reforms. This is because paying high compensations today gives some interest groups an incentive to organize and oppose subsequent reforms with the sole purpose of receiving a transfer. By paying lower compensations, governments attempt to prevent such interest groups from organizing. However, this comes at the cost of reforms being blocked by interest groups with relatively high losses.
Abstract:
Introduction: Despite comparable efficacy rates of antiviral treatment for hepatitis C (HCV) between injection drug users (IDUs) and non-IDUs, there are still important barriers to treatment access for this vulnerable population. IDUs' distrust of medical authorities, together with their often disorganized lifestyle, affects treatment initiation. The objective of this study is to examine the links between HCV treatment initiation and the use of health services among active IDUs. Methods: 758 active IDUs seropositive for anti-HCV antibodies were interviewed between November 2004 and March 2011 in the Montreal area. Interviewer-administered questionnaires provided information on socioeconomic characteristics as well as on variables related to drug use and the use of health services. Blood samples were collected and tested for anti-HCV antibodies. Multivariate logistic regression was used to estimate associations between health-service factors and HCV treatment initiation. Results: Of the 758 subjects, 55 (7.3%) had initiated HCV treatment before enrolment in the study. In the multivariate analyses, the variables significantly associated with treatment initiation were: having seen a family physician in the past 6 months (adjusted odds ratio (aOR): 1.96; 95% confidence interval (CI): 1.04-3.69); more than 2 years of lifetime addiction treatment without current methadone use (aOR: 2.25; CI: 1.12-4.51); more than 2 years of lifetime addiction treatment with current methadone use (aOR: 3.78; CI: 1.85-7.71); and having ever been incarcerated (aOR: 0.44; CI: 0.22-0.87).
Conclusion: Exposure to addiction-support services and to medical services is associated with HCV treatment initiation. These results suggest that these services play their role as entry points to treatment. Alternatively, IDUs who initiated HCV treatment may have adopted a proactive attitude toward improving their overall health. Incarceration, on the other hand, emerges as an obstacle to the management of HCV infection.
Abstract:
This research project aimed to conduct a strategic analysis of the implementation of a supervised injecting facility (SIF) in Montérégie. Using a mixed design, we first completed a portrait of the injection drug user (IDU) population. We then explored the perceptions of IDUs and stakeholders with regard to the relevance of implementing a SIF in the region. Although some similarities were found with the IDU populations of Montreal and the province of Quebec, this population in Montérégie is characterized by a lower frequency of injection in public, fewer homeless people, and lower rates of HIV and HCV infection. Despite these differences, the IDU population in Montérégie was found to have important physical and psychosocial needs. Although the relevance of a SIF in Montérégie is undeniable, improvements in the accessibility, continuity and appreciation of the current services dedicated to IDUs remain a priority.
Abstract:
We study the problem of deriving a complete welfare ordering from a choice function. Under the sequential solution, the best alternative is the alternative chosen from the universal set; the second best is the one chosen when the best alternative is removed; and so on. We show that this is the only completion of Bernheim and Rangel's (2009) welfare relation that satisfies two natural axioms: neutrality, which ensures that the names of the alternatives are welfare-irrelevant; and persistence, which stipulates that every choice function between two welfare-identical choice functions must exhibit the same welfare ordering.
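The sequential solution lends itself to a direct implementation: choose from the universal set, remove the chosen alternative, and repeat. A minimal sketch, assuming the choice function is given as a callable on frozensets (the score-maximizing choice function below is a hypothetical example, not from the paper):

```python
from typing import Callable, FrozenSet, List, TypeVar

T = TypeVar("T")

def sequential_solution(universal: FrozenSet[T],
                        choice: Callable[[FrozenSet[T]], T]) -> List[T]:
    """Derive a complete welfare ordering from a choice function:
    the best alternative is the one chosen from the universal set,
    the second best is the one chosen once the best is removed,
    and so on until no alternatives remain."""
    remaining = set(universal)
    ordering: List[T] = []
    while remaining:
        best = choice(frozenset(remaining))
        ordering.append(best)
        remaining.remove(best)
    return ordering

# Hypothetical choice function: pick the alternative with the highest score.
scores = {"a": 3, "b": 1, "c": 2}
order = sequential_solution(frozenset(scores), lambda s: max(s, key=scores.get))
print(order)  # ['a', 'c', 'b']
```

Any choice function works, rational or not; the procedure always terminates after exactly one pass per alternative.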
Abstract:
The study was carried out to understand the effect of a silver-silica nanocomposite (Ag-SiO2NC) on the cell wall integrity, metabolism and genetic stability of Pseudomonas aeruginosa, a multiple drug-resistant bacterium. Bacterial sensitivity towards antibiotics and Ag-SiO2NC was studied using standard disc diffusion and death rate assays, respectively. The effect of Ag-SiO2NC on cell wall integrity was monitored using an SDS assay and fatty acid profile analysis, while the effects on metabolism and genetic stability were assayed microscopically using CTC viability staining and the comet assay, respectively. P. aeruginosa was found to be resistant to the β-lactam, glycopeptide, sulfonamide, quinolone, nitrofurantoin and macrolide classes of antibiotics. Complete mortality of the bacterium was achieved with an 80 μg ml-1 concentration of Ag-SiO2NC. Cell wall integrity decreased with increasing exposure time, reaching a plateau of 70% at 110 min. Changes were also noticed in the proportions of fatty acids after the treatment. Inside the cytoplasm, complete inhibition of the electron transport system was achieved with 100 μg ml-1 Ag-SiO2NC, followed by DNA breakage. The study thus demonstrates that Ag-SiO2NC invades the cytoplasm of multiple drug-resistant P. aeruginosa by impinging upon cell wall integrity and kills the cells by interfering with the electron transport chain and genetic stability.
Abstract:
In most classical frameworks for learning from examples, it is assumed that examples are randomly drawn and presented to the learner. In this paper, we consider the possibility of a more active learner who is allowed to choose his/her own examples. Our investigations are carried out in a function approximation setting. In particular, using arguments from optimal recovery (Micchelli and Rivlin, 1976), we develop an adaptive sampling strategy (equivalent to adaptive approximation) for arbitrary approximation schemes. We provide a general formulation of the problem and show how it can be regarded as sequential optimal recovery. We demonstrate the application of this general formulation to two special cases of functions on the real line: 1) monotonically increasing functions and 2) functions with bounded derivative. An extensive investigation of the sample complexity of approximating these functions is conducted, yielding both theoretical and empirical results on test functions. Our theoretical results (stated in PAC-style), along with the simulations, demonstrate the superiority of our active scheme over both passive learning and classical optimal recovery. The analysis of active function approximation is conducted in a worst-case setting, in contrast with other Bayesian paradigms obtained from optimal design (MacKay, 1992).
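For the monotone case, the adaptive idea can be sketched as greedy uncertainty reduction: on each subinterval, the product of its width and its rise bounds how badly a monotone function can be approximated there, so the learner repeatedly bisects the worst interval. An illustrative sketch under these assumptions, not the paper's exact algorithm:

```python
def active_sample_monotone(f, a, b, n):
    """Adaptive sampling of a monotonically increasing f on [a, b]:
    repeatedly bisect the subinterval whose width-times-rise product
    (an upper bound on the remaining uncertainty) is largest."""
    xs = [a, b]
    ys = [f(a), f(b)]
    for _ in range(n):
        # Pick the interval with the largest uncertainty bound.
        i = max(range(len(xs) - 1),
                key=lambda j: (xs[j + 1] - xs[j]) * (ys[j + 1] - ys[j]))
        m = (xs[i] + xs[i + 1]) / 2
        xs.insert(i + 1, m)
        ys.insert(i + 1, f(m))
    return xs, ys

xs, ys = active_sample_monotone(lambda x: x ** 3, 0.0, 1.0, 10)
print([round(x, 3) for x in xs])  # samples cluster where the function rises fastest
```

On x³ the samples concentrate near 1, where the function changes most, which is exactly the behaviour a passive (uniform) sampler cannot exploit.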
Abstract:
Changes in mature forest cover amount, composition, and configuration can be of significant consequence to wildlife populations. The response of wildlife to forest patterns is of concern to forest managers because it lies at the heart of such competing approaches to forest planning as aggregated vs. dispersed harvest block layouts. In this study, we developed a species assessment framework to evaluate the outcomes of forest management scenarios on biodiversity conservation objectives. Scenarios were assessed in the context of a broad range of forest structures and patterns that would be expected to occur under natural disturbance and succession processes. Spatial habitat models were used to predict the effects of varying degrees of mature forest cover amount, composition, and configuration on habitat occupancy for a set of 13 focal songbird species. We used a spatially explicit harvest scheduling program to model forest management options and simulate future forest conditions resulting from alternative forest management scenarios, and used a process-based fire-simulation model to simulate future forest conditions resulting from natural wildfire disturbance. Spatial pattern signatures were derived for both habitat occupancy and forest conditions, and these were placed in the context of the simulated range of natural variation. Strategic policy analyses were set in the context of current Ontario forest management policies. This included use of sequential time-restricted harvest blocks (created for Woodland caribou (Rangifer tarandus) conservation) and delayed harvest areas (created for American marten (Martes americana atrata) conservation). This approach increased the realism of the analysis, but reduced the generality of interpretations. 
We found that forest management options that create linear strips of old forest deviate the most from simulated natural patterns, and had the greatest negative effects on habitat occupancy, whereas policy options that specify deferment and timing of harvest for large blocks helped ensure the stable presence of an intact mature forest matrix over time. The management scenario that focused on maintaining compositional targets best supported biodiversity objectives by providing the composition patterns required by the 13 focal species, but this scenario may be improved by adding some broad-scale spatial objectives to better maintain large blocks of interior forest habitat through time.
Abstract:
A number of authors have proposed clinical trial designs involving the comparison of several experimental treatments with a control treatment in two or more stages. At the end of the first stage, the most promising experimental treatment is selected, and all other experimental treatments are dropped from the trial. Provided it is good enough, the selected experimental treatment is then compared with the control treatment in one or more subsequent stages. The analysis of data from such a trial is problematic because of the treatment selection and the possibility of stopping at interim analyses. These aspects lead to bias in the maximum-likelihood estimate of the advantage of the selected experimental treatment over the control and to inaccurate coverage for the associated confidence interval. In this paper, we evaluate the bias of the maximum-likelihood estimate and propose a bias-adjusted estimate. We also propose an approach to the construction of a confidence region for the vector of advantages of the experimental treatments over the control based on an ordering of the sample space. These regions are shown to have accurate coverage, although they are also shown to be necessarily unbounded. Confidence intervals for the advantage of the selected treatment are obtained from the confidence regions and are shown to have more accurate coverage than the standard confidence interval based upon the maximum-likelihood estimate and its asymptotic standard error.
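The selection bias described here is easy to reproduce by simulation: even when every true treatment advantage is zero, the estimate attached to the selected (best-performing) arm is biased upward. A minimal illustration (all numbers invented; standard normal draws stand in for the arm-level estimates):

```python
import random

def mean_selected_estimate(n_sims=20000, k=4, seed=7):
    """Monte Carlo illustration of selection bias: the average value of
    the largest of k unbiased (standard normal) arm estimates when every
    true treatment advantage is zero. Without selection the average
    would be 0; selecting the maximum makes it strictly positive."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        total += max(rng.gauss(0.0, 1.0) for _ in range(k))
    return total / n_sims

print(mean_selected_estimate())  # close to 1.03, not 0: the naive estimate is biased upward
```

This is the bias the paper's adjusted estimate is designed to remove; the simulation only demonstrates its existence and direction.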
Abstract:
Most statistical methodology for phase III clinical trials focuses on the comparison of a single experimental treatment with a control. An increasing desire to reduce the time before regulatory approval of a new drug is sought has led to the development of two-stage or sequential designs for trials that combine the definitive analysis associated with phase III with the treatment selection element of a phase II study. In this paper we consider a trial in which the most promising of a number of experimental treatments is selected at the first interim analysis. This considerably reduces the computational load associated with the construction of stopping boundaries compared to the approach proposed by Follmann, Proschan and Geller (Biometrics 1994; 50: 325-336). The computational requirement does not exceed that for the sequential comparison of a single experimental treatment with a control. Existing methods are extended in two ways. First, the use of the efficient score as a test statistic makes the analysis of binary, normal or failure-time data, as well as adjustment for covariates or stratification, straightforward. Second, the question of trial power is also considered, enabling the determination of the sample size required to give a specified power. Copyright © 2003 John Wiley & Sons, Ltd.
Abstract:
Sequential methods provide a formal framework by which clinical trial data can be monitored as they accumulate. The results from interim analyses can be used either to modify the design of the remainder of the trial or to stop the trial as soon as sufficient evidence of either the presence or absence of a treatment effect is available. The circumstances under which the trial will be stopped with a claim of superiority for the experimental treatment, must, however, be determined in advance so as to control the overall type I error rate. One approach to calculating the stopping rule is the group-sequential method. A relatively recent alternative to group-sequential approaches is the adaptive design method. This latter approach provides considerable flexibility in changes to the design of a clinical trial at an interim point. However, a criticism is that the method by which evidence from different parts of the trial is combined means that a final comparison of treatments is not based on a sufficient statistic for the treatment difference, suggesting that the method may lack power. The aim of this paper is to compare two adaptive design approaches with the group-sequential approach. We first compare the form of the stopping boundaries obtained using the different methods. We then focus on a comparison of the power of the different trials when they are designed so as to be as similar as possible. We conclude that all methods acceptably control type I error rate and power when the sample size is modified based on a variance estimate, provided no interim analysis is so small that the asymptotic properties of the test statistic no longer hold. In the latter case, the group-sequential approach is to be preferred. 
Provided that asymptotic assumptions hold, the adaptive design approaches control the type I error rate even if the sample size is adjusted on the basis of an estimate of the treatment effect, showing that the adaptive designs allow more modifications than the group-sequential method.
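A group-sequential stopping rule of this kind can be checked by simulation under the null hypothesis. A minimal sketch using a Pocock-style constant critical value (2.289 is the published two-sided 5% Pocock constant for three equally spaced looks; the rest is illustrative):

```python
import math
import random

def pocock_type1(n_sims=40000, k=3, c=2.289, seed=1):
    """Monte Carlo estimate of the overall type I error of a Pocock-style
    group-sequential test: reject at the first of k equally spaced looks
    whose z-statistic exceeds a constant critical value c.  Under the
    null, each stage contributes an independent N(0, 1) increment and
    the z-statistic after j stages is the cumulative sum over sqrt(j)."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(n_sims):
        s = 0.0
        for j in range(1, k + 1):
            s += rng.gauss(0.0, 1.0)  # one stage's standardized increment
            if abs(s / math.sqrt(j)) > c:
                rejections += 1
                break
    return rejections / n_sims

print(round(pocock_type1(), 3))  # overall type I error stays near the nominal 0.05
```

Using an unadjusted 1.96 at every look would inflate the overall error well above 5%, which is why the constant must be computed in advance, as the abstract notes.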
Abstract:
A sequential study design generally makes more efficient use of available information than a fixed sample counterpart of equal power. This feature is gradually being exploited by researchers in genetic and epidemiological investigations that utilize banked biological resources and in studies where time, cost and ethics are prominent considerations. Recent work in this area has focussed on the sequential analysis of matched case-control studies with a dichotomous trait. In this paper, we extend the sequential approach to a comparison of the associations within two independent groups of paired continuous observations. Such a comparison is particularly relevant in familial studies of phenotypic correlation using twins. We develop a sequential twin method based on the intraclass correlation and show that use of sequential methodology can lead to a substantial reduction in the number of observations without compromising the study error rates. Additionally, our approach permits straightforward allowance for other explanatory factors in the analysis. We illustrate our method in a sequential heritability study of dysplasia that allows for the effect of body mass index and compares monozygotes with pairs of singleton sisters. Copyright (c) 2006 John Wiley & Sons, Ltd.
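The intraclass correlation at the heart of the twin comparison can be computed from a one-way ANOVA decomposition. A minimal sketch for clusters of size two (the data are invented; this is the standard ANOVA estimator only, not the paper's sequential procedure):

```python
from statistics import mean

def intraclass_correlation(pairs):
    """One-way ANOVA intraclass correlation for paired observations
    (e.g. twin pairs), with k = 2 members per pair:
    ICC = (MSB - MSW) / (MSB + MSW)."""
    n = len(pairs)
    grand = mean(x for p in pairs for x in p)
    # Between-pair and within-pair mean squares.
    msb = sum(2 * (mean(p) - grand) ** 2 for p in pairs) / (n - 1)
    msw = sum((x - mean(p)) ** 2 for p in pairs for x in p) / n
    return (msb - msw) / (msb + msw)

# Perfectly concordant pairs give an ICC of 1.
print(intraclass_correlation([(1.0, 1.0), (2.0, 2.0), (3.0, 3.0)]))  # 1.0
```

In the sequential setting described above, this statistic would be recomputed as pairs accrue and compared against pre-specified stopping boundaries.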
Abstract:
The International Citicoline Trial in acUte Stroke is a sequential phase III study of the use of the drug citicoline in the treatment of acute ischaemic stroke, which was initiated in 2006 in 56 treatment centres. The primary objective of the trial is to demonstrate improved recovery of patients randomized to citicoline relative to those randomized to placebo after 12 weeks of follow-up. The primary analysis will take the form of a global test combining the dichotomized results of assessments on three well-established scales: the Barthel Index, the modified Rankin scale and the National Institutes of Health Stroke Scale. This approach was previously used in the analysis of the influential National Institute of Neurological Disorders and Stroke trial of recombinant tissue plasminogen activator in stroke. The purpose of this paper is to describe how this trial was designed, and in particular how the simultaneous objectives of taking into account three assessment scales, performing a series of interim analyses and conducting treatment allocation and adjusting the analyses to account for prognostic factors, including more than 50 treatment centres, were addressed. Copyright (C) 2008 John Wiley & Sons, Ltd.
Abstract:
Assaying a large number of genetic markers from patients in clinical trials is now possible in order to tailor drugs with respect to efficacy. The statistical methodology for analysing such massive data sets is challenging. The most popular type of statistical analysis is to use a univariate test for each genetic marker, once all the data from a clinical study have been collected. This paper presents a sequential method for conducting an omnibus test for detecting gene-drug interactions across the genome, thus allowing informed decisions at the earliest opportunity and overcoming the multiple testing problems from conducting many univariate tests. We first propose an omnibus test for a fixed sample size. This test is based on combining F-statistics that test for an interaction between treatment and the individual single nucleotide polymorphism (SNP). As SNPs tend to be correlated, we use permutations to calculate a global p-value. We extend our omnibus test to the sequential case. In order to control the type I error rate, we propose a sequential method that uses permutations to obtain the stopping boundaries. The results of a simulation study show that the sequential permutation method is more powerful than alternative sequential methods that control the type I error rate, such as the inverse-normal method. The proposed method is flexible as we do not need to assume a mode of inheritance and can also adjust for confounding factors. An application to real clinical data illustrates that the method is computationally feasible for a large number of SNPs. Copyright (c) 2007 John Wiley & Sons, Ltd.
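The permutation step is what controls the type I error without per-SNP multiplicity corrections: the observed omnibus statistic is referred to its own permutation distribution. A minimal sketch with a simple mean-difference statistic standing in for the combined per-SNP F-statistics (all data invented):

```python
import random

def permutation_pvalue(stat_fn, labels, n_perm=999, seed=42):
    """Global p-value by permuting treatment labels: count how often a
    statistic computed on shuffled labels meets or exceeds the observed
    one.  Valid for any global statistic, however the per-marker tests
    are combined."""
    rng = random.Random(seed)
    observed = stat_fn(labels)
    shuffled = list(labels)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        if stat_fn(shuffled) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p > 0

# Invented data: the response clearly depends on the treatment label.
responses = [0.10, 0.30, 0.20, 0.40, 0.25, 1.10, 1.40, 1.20, 1.30, 1.15]
treatment = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]

def mean_diff(labels):
    treated = [r for r, l in zip(responses, labels) if l == 1]
    control = [r for r, l in zip(responses, labels) if l == 0]
    return abs(sum(treated) / len(treated) - sum(control) / len(control))

print(permutation_pvalue(mean_diff, treatment))  # small p: the labels are informative
```

Because the shuffling preserves the correlation structure among markers, the same recipe extends to the correlated-SNP setting the abstract describes, with `stat_fn` combining per-SNP interaction F-statistics instead of a single mean difference.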