995 results for Continuous Optimization


Relevance:

20.00%

Publisher:

Abstract:

Stable integration and expression of a transgene in the cellular genome remain major challenges for gene-based therapies and for bioproduction purposes. While transposon vectors mediate efficient transgene integration, expression may be limited by epigenetic silencing, and persistent transposase expression may mediate multiple transposition cycles. Here, we evaluated the delivery of piggyBac transposase messenger RNA combined with genetically insulated transposons to isolate the transgene from neighboring regulatory elements and stabilize its expression. piggyBac transposase expression from messenger RNA and DNA vectors was compared in terms of expression levels, transposition efficiency, transgene expression and genotoxic effects, in order to calibrate and secure the transposition-based delivery system. Messenger RNA reduced the persistence of the transposase to a narrow time window, thus decreasing side effects such as superfluous genomic DNA cleavage. Both the CTF/NF1 and the D4Z4 insulators were found to mediate more efficient expression from a few transposition events. We conclude that the use of engineered piggyBac transposase mRNA and insulated transposons offers promising ways of improving the quality of the integration process and sustaining the expression of transposon vectors.

Relevance:

20.00%

Publisher:

Abstract:

Numerous recent reports by non-governmental organisations (NGOs), academics and international organisations have focused on so-called 'climate refugees'. This article examines the turn from a discourse of 'climate refugees', in which organisations perceive migration as a failure of both mitigation of and adaptation to climate change, to one of 'climate migration', in which organisations promote migration as a strategy of adaptation. Its focus is the promotion of climate migration management, and it traces these discourses in two parts. The first provides an empirical account of the two discourses, emphasising the differentiation between them; it then focuses on the discourse of climate migration, its origins, extent and content, and the associated practices of 'migration management'. The second part argues that the turn to the promotion of 'climate migration' should be understood as a way to manage the insecurity created by climate change. International organisations, however, enact this management within the forms of neoliberal capitalism, including the framework of governance. The promotion of 'climate migration' as a strategy of adaptation to climate change is therefore located within the tendencies of neoliberalism and the reconfiguration of southern states' sovereignty through governance.

Relevance:

20.00%

Publisher:

Abstract:

Tractography is a class of algorithms that aim to map in vivo the major neuronal pathways of the white matter from diffusion magnetic resonance imaging (MRI) data. These techniques offer a powerful tool to noninvasively investigate, at the macroscopic scale, the architecture of the neuronal connections of the brain. Unfortunately, however, the reconstructions recovered with existing tractography algorithms are not truly quantitative, even though diffusion MRI is a quantitative modality by nature. Several techniques have been proposed in recent years to estimate, at the voxel level, intrinsic microstructural features of the tissue, such as axonal density and diameter, by using multicompartment models. In this paper, we present a novel framework to re-establish the link between tractography and tissue microstructure. Starting from an input set of candidate fiber tracts, estimated from the data using standard fiber-tracking techniques, we model the diffusion MRI signal in each voxel of the image as a linear combination of the restricted and hindered contributions generated in every location of the brain by these candidate tracts. We then seek the global weight of each tract, i.e., its effective contribution or volume, such that the tracts jointly fit the measured signal as well as possible. We demonstrate that these weights can be recovered efficiently by solving a global convex optimization problem. The effectiveness of our approach has been evaluated both on a realistic phantom with known ground truth and on in vivo brain data. The results clearly demonstrate the benefits of the proposed formulation, opening new perspectives for a more quantitative and biologically plausible assessment of the structural connectivity of the brain.
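
The fitting step described here is, at its core, a non-negative least-squares problem. The following is a minimal sketch of that convex step, assuming the per-tract contribution profiles have already been assembled into a linear operator; the matrix A, the signal y, the weights w and all dimensions are illustrative placeholders, not the authors' actual implementation.

import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_voxels, n_tracts = 500, 50            # toy dimensions
A = rng.random((n_voxels, n_tracts))    # contribution of each candidate tract to each voxel signal
w_true = np.zeros(n_tracts)
w_true[:10] = rng.random(10)            # only a few tracts truly contribute
y = A @ w_true + 0.01 * rng.standard_normal(n_voxels)   # simulated measured signal

# Convex problem: minimize ||A w - y||_2 subject to w >= 0.
# Non-negativity keeps the recovered tract volumes physically meaningful.
w_hat, res = nnls(A, y)
print("residual norm:", res)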

Relevance:

20.00%

Publisher:

Abstract:

Undernutrition is a widespread problem in the intensive care unit (ICU) and is associated with a worse clinical outcome. A state of negative energy balance increases stress catabolism and is associated with increased morbidity and mortality in ICU patients. Undernutrition-related morbidity is in turn correlated with longer hospital stays and higher health care costs. Enteral nutrition is the recommended feeding route in critically ill patients, but it is often insufficient to cover nutritional needs. Initiating supplemental parenteral nutrition when enteral nutrition is insufficient could optimize nutritional therapy by preventing the onset of early energy deficiency, and could thus reduce morbidity, length of stay and costs, shorten the recovery period and, finally, improve quality of life.

Relevance:

20.00%

Publisher:

Abstract:

Analyzing the relationship between the baseline value and subsequent change of a continuous variable is a frequent matter of inquiry in cohort studies. These analyses are surprisingly complex, particularly if only two waves of data are available. It is unclear to non-biostatisticians where the complexity of this analysis lies and which statistical method is adequate. With the help of simulated longitudinal data of body mass index in children, we review statistical methods for the analysis of the association between the baseline value and subsequent change, assuming linear growth with time. Key issues in such analyses are mathematical coupling, measurement error, variability of change between individuals, and regression to the mean. Ideally, one relies on multiple repeated measurements at different times, and a linear random-effects model is the standard approach if more than two waves of data are available. If only two waves of data are available, our simulations show that Blomqvist's method, which adjusts the estimated regression coefficient of observed change on baseline value for the measurement error variance, provides accurate estimates. The adequacy of the methods to assess the relationship between the baseline value and subsequent change depends on the number of data waves, the availability of information on measurement error, and the variability of change between individuals.
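
A minimal sketch of the two-wave correction named above, assuming the measurement error variance is known (e.g., from repeated measurements). With lam the ratio of error variance to observed baseline variance, the corrected slope is (b_obs + lam) / (1 - lam); the BMI-like simulated data and all parameter values below are illustrative.

import numpy as np

rng = np.random.default_rng(1)
n = 2000
sigma2_e = 0.5                                  # measurement error variance, assumed known
T = rng.normal(25.0, 2.0, n)                    # true baseline values (BMI-like)
beta_true = -0.3
D = beta_true * (T - T.mean()) + rng.normal(0.0, 1.0, n)   # true change

e1 = rng.normal(0.0, np.sqrt(sigma2_e), n)      # error on the baseline measurement
e2 = rng.normal(0.0, np.sqrt(sigma2_e), n)      # error on the follow-up measurement
x = T + e1                                      # observed baseline
d = D - e1 + e2                                 # observed change: (T2 + e2) - (T1 + e1)

# Naive slope of observed change on observed baseline: biased towards -1
b_obs = np.cov(d, x)[0, 1] / np.var(x, ddof=1)

# Blomqvist's correction
lam = sigma2_e / np.var(x, ddof=1)
b_corr = (b_obs + lam) / (1.0 - lam)
print(f"naive: {b_obs:.3f}  corrected: {b_corr:.3f}  true: {beta_true}")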

Relevance:

20.00%

Publisher:

Abstract:

Forests are key ecosystems of the earth and are associated with a wide range of functions. Many of these functions benefit humans and are referred to as ecosystem services. Sustainable development requires that all relevant ecosystem services be quantified, managed and monitored equally. Natural resource management therefore targets the services associated with ecosystems. The main hypothesis of this thesis is that the spatial and temporal domains of the relevant services do not correspond to a discrete forest ecosystem. As a consequence, the services are not quantified, managed and monitored in an equal and sustainable manner. The aims of the thesis were therefore to test this hypothesis, establish an improved conceptual approach and provide spatial applications for the relevant land cover and structure variables. The study was carried out in western Switzerland and based primarily on data from a countrywide landscape inventory. This inventory is part of the third Swiss national forest inventory and assesses continuous landscape variables based on a regular sampling of true-colour aerial imagery. In addition, land cover variables were derived from Landsat 5 TM passive sensor data, and land structure variables from active sensor data acquired with a small-footprint laser scanning system. The results confirmed the main hypothesis, as the relevant services did not scale well with the forest ecosystem. Instead, a new conceptual approach for the sustainable management of natural resources is described. This concept quantifies the services as a continuous function of the landscape rather than a discrete function of the forest ecosystem. The explanatory landscape variables are therefore called continuous fields, and the forest becomes a dependent, function-driven management unit. Continuous field mapping methods were established for land cover and structure variables. In conclusion, the discrete forest ecosystem is an adequate planning and management unit; however, monitoring the state of and trends in the sustainability of services requires that they be quantified as a continuous function of the landscape. Sustainable natural resource management iteratively combines the ecosystem and gradient approaches.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: A software-based tool (Optem) has been developed to automate the recommendations of the Canadian Multiple Sclerosis Working Group for optimizing MS treatment and thereby avoid subjective interpretation. METHODS: Treatment Optimization Recommendations (TORs) were applied to our database of patients treated with IFN beta-1a IM. Patient data were assessed during year 1 for disease activity, and patients were assigned to two groups according to TOR: "change treatment" (CH) and "no change treatment" (NCH). These assessments were then compared to the observed clinical outcomes for disease activity over the following years. RESULTS: Data were available for 55 patients. "Change treatment" status was assigned to 22 patients and "no change treatment" to 33 patients. The estimated sensitivity and specificity according to last-visit status were 73.9% and 84.4%. During the following years, the relapse rate was always higher in the "change treatment" group than in the "no change treatment" group (5 y: CH 0.7, NCH 0.07, p < 0.001; 12 m to last visit: CH 0.536, NCH 0.34). We obtained the same results with the EDSS (4 y: CH 3.53, NCH 2.55; annual progression rate from 12 m to last visit: CH 0.29, NCH 0.13). CONCLUSION: Applying TOR at the first year of therapy allowed accurate prediction of continued disease activity in relapses and disability progression.
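
As a rough illustration of what automating such recommendations involves, the sketch below classifies first-year data as CH or NCH from a handful of criteria. The thresholds and the two-concern rule are invented placeholders, not the actual Canadian MSWG criteria, which grade relapses, EDSS progression and MRI activity across several levels of concern.

from dataclasses import dataclass

@dataclass
class FirstYearData:
    relapses: int            # relapses during the first year of therapy
    edss_increase: float     # confirmed EDSS increase over the first year
    new_mri_lesions: int     # new T2 or gadolinium-enhancing lesions

def tor_decision(d: FirstYearData) -> str:
    """Return "CH" (change treatment) or "NCH", using illustrative thresholds."""
    concerns = 0
    if d.relapses >= 1:
        concerns += 1
    if d.edss_increase >= 1.0:
        concerns += 1
    if d.new_mri_lesions >= 2:
        concerns += 1
    # Illustrative rule: two or more simultaneous areas of concern trigger a change
    return "CH" if concerns >= 2 else "NCH"

print(tor_decision(FirstYearData(relapses=2, edss_increase=1.0, new_mri_lesions=3)))  # CH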

Relevance:

20.00%

Publisher:

Abstract:

OBJECT: Cerebrovascular pressure reactivity is the ability of cerebral vessels to respond to changes in transmural pressure. A cerebrovascular pressure reactivity index (PRx) can be determined as the moving correlation coefficient between mean intracranial pressure (ICP) and mean arterial blood pressure. METHODS: The authors analyzed a database consisting of 398 patients with head injuries who underwent continuous monitoring of cerebrovascular pressure reactivity. In 298 patients, the PRx was compared with a transcranial Doppler ultrasonography assessment of cerebrovascular autoregulation (the mean index [Mx]); in 17 patients, with the PET-assessed static rate of autoregulation; and in 22 patients, with the cerebral metabolic rate for O2. Patient outcome was assessed 6 months after injury. RESULTS: There was a positive and significant association between the PRx and the Mx (R² = 0.36, p < 0.001) and between the PRx and the static rate of autoregulation (R² = 0.31, p = 0.02). A PRx > 0.35 was associated with a high mortality rate (> 50%). The PRx showed significant deterioration in refractory intracranial hypertension, was correlated with outcome, and was able to differentiate patients with good outcome, moderate disability, severe disability, and death. The plot of PRx against cerebral perfusion pressure (CPP) showed a U-shaped curve, suggesting that both too low and too high a CPP is associated with a disturbance in pressure reactivity, and hence that an optimal CPP exists. Such an optimal CPP was confirmed in individual cases, and a greater difference between the current and the optimal CPP was associated with worse outcome (for patients who, on average, were treated below the optimal CPP [R² = 0.53, p < 0.001] and for patients whose mean CPP was above the optimal CPP [R² = -0.40, p < 0.05]). Following decompressive craniectomy, pressure reactivity initially worsened (median PRx -0.03 [interquartile range -0.13 to 0.06] to 0.14 [interquartile range 0.12-0.22]; p < 0.01) and improved later in the postoperative course. After therapeutic hypothermia, in 17 (70.8%) of 24 patients in whom rewarming exceeded a brain temperature threshold of 37°C, ICP remained stable, but the average PRx increased to 0.32 (p < 0.0001), indicating a significant derangement of cerebrovascular reactivity. CONCLUSIONS: The PRx is a secondary index derived from changes in ICP and arterial blood pressure and can be used as a surrogate marker of cerebrovascular impairment. In view of an autoregulation-guided CPP therapy, continuous determination of the PRx is feasible, but its value has to be evaluated in a prospective controlled trial.
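
A minimal sketch of how a moving correlation index of this kind can be computed. The 10-second averaging and the 30-sample (roughly 5-minute) correlation window are typical choices from the monitoring literature, assumed here for illustration; the synthetic signals are made up.

import numpy as np

def prx(icp: np.ndarray, abp: np.ndarray, fs: float,
        mean_window_s: float = 10.0, n_samples: int = 30) -> np.ndarray:
    """Moving Pearson correlation of 10-s means of ICP and ABP over ~5-min windows."""
    step = int(mean_window_s * fs)
    n = min(len(icp), len(abp)) // step
    icp_m = icp[:n * step].reshape(n, step).mean(axis=1)   # 10-s mean ICP
    abp_m = abp[:n * step].reshape(n, step).mean(axis=1)   # 10-s mean ABP
    out = np.full(n, np.nan)
    for i in range(n_samples, n + 1):
        out[i - 1] = np.corrcoef(icp_m[i - n_samples:i], abp_m[i - n_samples:i])[0, 1]
    return out   # values near +1 suggest impaired reactivity; near or below 0, intact

# Example with synthetic signals sampled at 100 Hz: ICP passively follows ABP
fs = 100.0
t = np.arange(0, 3600, 1 / fs)
abp_sig = 90 + 5 * np.sin(2 * np.pi * t / 60) + np.random.randn(t.size)
icp_sig = 12 + 2 * np.sin(2 * np.pi * t / 60) + np.random.randn(t.size)
print(np.nanmean(prx(icp_sig, abp_sig, fs)))   # positive: impaired reactivity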

Relevance:

20.00%

Publisher:

Abstract:

This paper examines a dataset which is modeled well by the Poisson-Log Normal process and by this process mixed with Log Normal data, which are both turned into compositions. This generates compositional data that has zeros without any need for conditional models or assuming that there is missing or censored data that needs adjustment. It also enables us to model dependence on covariates and within the composition.
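
A small simulation sketch of the mechanism described: counts drawn from a Poisson-log normal process naturally contain zeros, and closing the counts to proportions yields compositions with genuine zeros. All parameter values are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(42)
n, parts = 200, 4
mu = np.array([2.0, 1.0, 0.0, -1.0])        # log-scale means, one per part
Sigma = 0.5 * np.eye(parts) + 0.2           # log-scale covariance, inducing dependence

log_lam = rng.multivariate_normal(mu, Sigma, size=n)
counts = rng.poisson(np.exp(log_lam))       # Poisson-log normal counts; zeros arise naturally

totals = counts.sum(axis=1)
compositions = counts[totals > 0] / totals[totals > 0, None]   # close counts to proportions

print("fraction of zero components:", (counts == 0).mean())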

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Only multifaceted, hospital-wide interventions have been successful in achieving sustained improvements in hand hygiene (HH) compliance. METHODOLOGY/PRINCIPAL FINDINGS: Pre-post intervention study of HH performance at baseline (October 2007-December 2009) and during an intervention comprising two phases. Phase 1 (2010) applied the multimodal WHO approach. Phase 2 (2011) added Continuous Quality Improvement (CQI) tools and was based on: a) an increase in alcohol hand rub (AHR) dispenser placement (from 0.57 dispensers/bed to 1.56); b) an increase in the frequency of audits (three days every three weeks: the "3/3 strategy"); c) implementation of a standardized register form of HH corrective actions; d) Statistical Process Control (SPC) as the time-series analysis methodology, using appropriate control charts. During the intervention period we performed 819 scheduled direct observation audits, which provided data from 11,714 HH opportunities. The most remarkable findings were: a) a significant improvement in HH compliance with respect to baseline (25% mean increase); b) a sustained high level (82%) of HH compliance during the intervention; c) a significant increase in AHR consumption over time; d) a significant decrease in the rate of healthcare-acquired MRSA; e) a small but significant improvement in HH compliance when comparing phase 2 to phase 1 (79.5% [95% CI: 78.2-80.7] vs 84.6% [95% CI: 83.8-85.4], p < 0.05); f) successful use of control charts to identify significant negative and positive deviations (special causes) in the HH compliance process over time ("positive": 90.1%, the highest HH compliance, coinciding with World Hand Hygiene Day; "negative": 73.7%, the lowest HH compliance, coinciding with a statutory lay-off proceeding). CONCLUSIONS/SIGNIFICANCE: CQI tools may be a key addition to the WHO strategy to maintain good HH performance over time. In addition, SPC has proven to be a powerful methodology for detecting special causes in HH performance (positive and negative) and for helping to establish adequate feedback to healthcare workers.
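
As an illustration of the SPC component, the sketch below builds a p-chart for audit-level HH compliance with the usual 3-sigma binomial limits; the audit counts are made-up numbers, not the study's data.

import numpy as np

observed = np.array([52, 61, 55, 70, 64, 58, 49, 66])     # compliant actions per audit
opportunities = np.array([70, 75, 68, 80, 78, 72, 65, 74])

p = observed / opportunities
p_bar = observed.sum() / opportunities.sum()              # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / opportunities)      # sigma depends on audit size
ucl = p_bar + 3 * sigma
lcl = np.clip(p_bar - 3 * sigma, 0, 1)

for i, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), 1):
    flag = "special cause" if (pi < lo or pi > hi) else "in control"
    print(f"audit {i}: p={pi:.2f}, limits=({lo:.2f}, {hi:.2f}) -> {flag}")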

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: The ideal local anesthetic regime for femoral nerve block that balances analgesia with mobility after total knee arthroplasty (TKA) remains undefined. QUESTIONS/PURPOSES: We compared two volumes and concentrations of a fixed dose of ropivacaine for continuous femoral nerve block after TKA to a single-injection femoral nerve block with ropivacaine to determine (1) time to discharge readiness; (2) early pain scores and analgesic consumption; and (3) functional outcomes, including range of motion and WOMAC scores at the time of recovery. METHODS: Ninety-nine patients were allocated to one of three continuous femoral nerve block groups for this randomized, placebo-controlled, double-blind trial: a high concentration group (ropivacaine 0.2% infusion), a low concentration group (ropivacaine 0.1% infusion), or a placebo infusion group (saline 0.9% infusion). Infusions were discontinued on postoperative Day (POD) 2. The primary outcome was time to discharge readiness. Secondary outcomes included opioid consumption, pain, and functional outcomes. Ninety-three patients completed the study protocol; the study was halted early because of unanticipated changes to pain protocols at the host institution, by which time only 61% of the required number of patients had been enrolled. RESULTS: With the numbers available, the mean time to discharge readiness was not different between groups (high concentration group, 62 hours [95% confidence interval [CI], 51-72 hours]; low concentration group, 73 hours [95% CI, 63-83 hours]; placebo infusion group, 65 hours [95% CI, 56-75 hours]; p = 0.27). Patients in the low concentration group consumed significantly less morphine during the period of infusion (POD 1, high concentration group, 56 mg [95% CI, 42-70 mg]; low concentration group, 35 mg [95% CI, 27-43 mg]; placebo infusion group, 48 mg [95% CI, 38-59 mg], p = 0.02; POD 2, high concentration group, 50 mg [95% CI, 41-60 mg]; low concentration group, 33 mg [95% CI, 24-42 mg]; placebo infusion group, 39 mg [95% CI, 30-48 mg], p = 0.04); however, there were no important differences in pain scores or opioid-related side effects with the numbers available. Likewise, there were no important differences in functional outcomes between groups. CONCLUSIONS: Based on this study, which was terminated prematurely before the desired sample size could be achieved, we were unable to demonstrate that varying the concentration and volume of a fixed-dose ropivacaine infusion for continuous femoral nerve block influences time to discharge readiness when compared with a conventional single-injection femoral nerve block after TKA. A low concentration of ropivacaine infusion can reduce postoperative opioid consumption but without any important differences in pain scores, side effects, or functional outcomes. These pilot data may be used to inform the statistical power of future randomized trials. LEVEL OF EVIDENCE: Level II, therapeutic study.

Relevance:

20.00%

Publisher:

Abstract:

Three out of five human endometrial carcinomas were successfully grafted into nude mice (BALB/c nu/nu). Two of these tumors could be maintained by serial transplantation. The morphological characteristics displayed by the grafted tumors were comparable to those of the original carcinomas. Permanent cell lines were established from these two tumors. Reinjection of cells grown in vitro into nude mice produced nodules histologically identical to the original solid transplants. The influence of medroxyprogesterone acetate on tumor growth in vivo and on cell proliferation in vitro was studied. For both endometrial carcinomas, this hormonal treatment produced no significant effect on the tumor cells either in vitro or in vivo: after medroxyprogesterone administration, only a slight, non-significant growth inhibition of the tumor cells was observed in vitro, and the tumor transplants in vivo did not appear to be affected. These experiments illustrate the possible use of this model for testing potential anti-cancer agents.

Relevance:

20.00%

Publisher:

Abstract:

Viruses evolve rapidly, and HIV in particular is known to be one of the fastest-evolving human viruses. It is now commonly accepted that viral evolution is the cause of the intriguing dynamics exhibited during HIV infections and of the ultimate success of the virus in its struggle with the immune system. To study viral evolution, we use a simple mathematical model of the within-host dynamics of HIV which incorporates random mutations. In this model, we assume a continuous distribution of viral strains in a one-dimensional phenotype space, where random mutations are modelled by diffusion. Numerical simulations show that random mutations combined with competition result in evolution towards higher Darwinian fitness: a stable traveling wave of evolution, moving towards higher levels of fitness, is formed in the phenotype space.
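
A minimal numerical sketch of this class of models, assuming a logistic-type reaction-diffusion equation dv/dt = D d2v/dx2 + (a*x - c*integral(v)) * v on a one-dimensional phenotype axis. The equation, parameters, and boundary handling are illustrative assumptions, not the authors' exact model; run long enough, the strain cloud drifts towards higher x, mimicking the traveling wave of fitness.

import numpy as np

D, a, c = 1e-3, 1.0, 1.0                 # mutation (diffusion), fitness slope, competition
nx, dx, dt, steps = 400, 0.01, 1e-4, 20000
x = np.arange(nx) * dx                   # phenotype axis; higher x = higher fitness
v = np.exp(-((x - 0.5) ** 2) / 0.01)     # initial cloud of strains at low fitness

for _ in range(steps):
    lap = (np.roll(v, 1) - 2.0 * v + np.roll(v, -1)) / dx**2
    lap[0] = lap[-1] = 0.0               # crude no-flux boundaries
    total = v.sum() * dx                 # total viral load drives competition
    v = v + dt * (D * lap + (a * x - c * total) * v)   # replication minus competition
    v = np.maximum(v, 0.0)

print("mean phenotype (fitness proxy):", (x * v).sum() / v.sum())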

Relevance:

20.00%

Publisher:

Abstract:

We analyse in a unified way how the presence of a trader with privileged information makes the market efficient when the release time of the information is known. We establish a general relation between the problem of finding an equilibrium and the problem of enlargement of filtrations. We also consider the case where the time of the announcement is random. In that case the market is not fully efficient, and an equilibrium exists if the sensitivity of prices with respect to the global demand decreases in time in accordance with the distribution of the random time.
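
A schematic of the objects involved, in the spirit of Kyle-Back insider models; the notation below is illustrative and not taken from the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Let $V$ be the fundamental value released at time $T$, let $\mathbb{F}$ be the
market filtration, and let $\mathbb{G} = \mathbb{F} \vee \sigma(V)$ be its
initial enlargement, modelling the insider's information. With $X$ the
insider's demand and $B$ a Brownian motion driving the noise traders, the
global demand is $Y_t = X_t + B_t$, and prices follow a pricing rule
\[
  P_t = H(t, Y_t), \qquad \lambda_t = \partial_y H(t, Y_t),
\]
where $\lambda_t$ is the sensitivity of prices to the global demand. Market
efficiency means $P_t = \mathbb{E}\left[ V \mid \mathcal{F}^{Y}_t \right]$.
When the release time is random, an equilibrium requires $t \mapsto \lambda_t$
to decrease in a manner dictated by the law of the release time.
\end{document}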