30 results for Triple Bottom Line Approach

at Université de Lausanne, Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Breast abscesses are either puerperal or non-puerperal. Staphylococcus aureus is the most common causative organism. The diagnosis is based on clinical criteria and confirmed by ultrasonography. Percutaneous ultrasound-guided drainage should be proposed as the first-line treatment. Surgical treatment remains valid for a relapsing or chronic abscess, or after non-operative approaches have failed.

Relevance:

100.00%

Publisher:

Abstract:

The treatment of reflux disease did not change during the review period. PPI therapy remains the first-line treatment and surgery the second-line approach. Endoscopic anti-reflux procedures should only be performed in controlled studies. Besides classic triple therapy, sequential treatment of Helicobacter pylori infection can now be considered a first-line therapy. PPIs are effective in preventing gastroduodenal lesions and in treating dyspeptic symptoms induced by NSAID treatment. Only patients younger than 65 years and without any risk factors do not need a preventive PPI prescription during classic NSAID treatment.

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we perform a societal and economic risk assessment for debris flows at the regional scale for lower Valtellina, Northern Italy. We apply a simple empirical debris-flow model, FLOW-R, which couples a probabilistic flow-routing algorithm with an energy-line approach, providing the relative probability of transit and the maximum kinetic energy for each cell. By assessing the vulnerability of people and of other exposed elements (buildings, public facilities, crops, woods, communication lines), together with their economic value, we calculated the expected annual losses both in terms of lives (societal risk) and goods (direct economic risk). For the societal risk assessment, we distinguish between day and night scenarios: the distribution of people at different moments of the day was considered, accounting for occupational and recreational activities, to provide a more realistic assessment of risk. Market studies were performed in order to assign a realistic economic value to goods, structures, and lifelines. As terrain unit, a 20 m x 20 m cell was used, in accordance with data availability and the spatial resolution required for a risk assessment at this scale. Societal risk for the whole area amounts to 1.98 and 4.22 deaths/year for the day and night scenarios, respectively, with a maximum of 0.013 deaths/year/cell. Economic risk for goods amounts to 1,760,291 €/year, with a maximum of 13,814 €/year/cell.
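The per-cell computation described above follows the standard risk convention: expected annual loss = hazard frequency × probability of transit × vulnerability × exposed value (with occupancy in place of value for societal risk). The sketch below illustrates only that aggregation; it is not the FLOW-R implementation, and the frequencies, vulnerabilities, and values are hypothetical placeholders.

```python
import numpy as np

# Hypothetical per-cell inputs for a three-cell toy example (not FLOW-R outputs).
p_transit = np.array([0.8, 0.3, 0.1])      # relative probability of debris-flow transit
frequency = 0.05                           # assumed annual frequency of a triggering event
vulnerability = np.array([0.6, 0.4, 0.2])  # fraction of value (or lives) lost if hit
value_eur = np.array([150_000, 80_000, 20_000])  # exposed economic value per cell (EUR)
occupants = np.array([2.0, 0.5, 0.0])      # expected persons present (night scenario)

# Expected annual losses per cell: hazard x vulnerability x exposure.
economic_risk = frequency * p_transit * vulnerability * value_eur  # EUR/year/cell
societal_risk = frequency * p_transit * vulnerability * occupants  # deaths/year/cell

print(f"economic risk: {economic_risk.sum():,.0f} EUR/year")
print(f"societal risk: {societal_risk.sum():.4f} deaths/year")
```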

Relevance:

100.00%

Publisher:

Abstract:

Since the advent of high-throughput DNA sequencing technologies, the ever-increasing rate at which genomes are published has generated new challenges, notably at the level of genome annotation. Even though gene predictors and annotation software are becoming ever more efficient, the ultimate validation still lies in the observation of the predicted gene product(s). Mass spectrometry-based proteomics provides the necessary high-throughput technology to demonstrate the presence of proteins and, from the identified sequences, to confirm or invalidate predicted annotations. We review here the different strategies used to perform an MS-based proteogenomics experiment with a bottom-up approach. We start from the strengths and weaknesses of the different database-construction strategies, based on different kinds of genomic information (whole genome, ORF, cDNA, EST or RNA-Seq data), which are then used to match mass spectra to peptides and proteins. We also review the important points to be considered for a correct statistical assessment of the peptide identifications. Finally, we provide references for tools used to map and visualize the peptide identifications back to the original genomic information.
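One of the database-construction strategies mentioned above, the naive six-frame translation of a raw genome into a protein search database, can be sketched in a few lines. This is an illustrative toy using Biopython's standard codon table, with an invented minimum ORF length, not the pipeline of any particular study.

```python
from Bio.Seq import Seq  # Biopython

def six_frame_orfs(dna, min_len=5):
    """Translate a genome in all six reading frames and yield putative
    ORFs (stop-to-stop stretches) of at least min_len residues."""
    seq = Seq(dna)
    for strand in (seq, seq.reverse_complement()):
        for frame in range(3):
            sub = strand[frame:]
            sub = sub[: len(sub) - len(sub) % 3]  # whole codons only
            for stretch in str(sub.translate()).split("*"):
                if len(stretch) >= min_len:
                    yield stretch

# Entries like these would be written to a FASTA file and searched
# against the acquired spectra.
genome = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"  # toy sequence
print(list(six_frame_orfs(genome)))
```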

Relevance:

100.00%

Publisher:

Abstract:

The pharmacokinetics (PK) of efavirenz (EFV) is characterized by marked interpatient variability that correlates with its pharmacodynamics (PD). In vitro-in vivo extrapolation (IVIVE) is a "bottom-up" approach that combines drug data with system information to predict PK and PD. The aim of this study was to simulate EFV PK and PD after dose reductions. At the standard dose, the simulated probability was 80% for viral suppression and 28% for central nervous system (CNS) toxicity. After a dose reduction to 400 mg, the probabilities of viral suppression were reduced to 69, 75, and 82%, and those of CNS toxicity were 21, 24, and 29% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. With reduction of the dose to 200 mg, the probabilities of viral suppression decreased to 54, 62, and 72% and those of CNS toxicity decreased to 13, 18, and 20% for the 516 GG, 516 GT, and 516 TT genotypes, respectively. These findings indicate how dose reductions might be applied in patients with favorable genetic characteristics.
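A "bottom-up" simulation of this kind can be caricatured as a Monte Carlo run over between-patient variability: sample individual clearances, compute exposure, and count how often it clears an efficacy threshold. The sketch below shows only that skeleton; the clearance medians, genotype effects, variability, and threshold are invented placeholders, not the parameters or results of this study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented placeholder parameters (NOT the study's values): median apparent
# clearance (L/h) by CYP2B6 516 genotype; TT carriers metabolize EFV slowest.
cl_median = {"GG": 11.0, "GT": 8.0, "TT": 4.5}
cv = 0.45            # assumed between-patient variability (log-normal sigma)
dose_mg = 400        # daily dose being simulated
tau_h = 24           # dosing interval (h)
target_css = 1.0     # assumed minimal effective average concentration (mg/L)

for genotype, cl in cl_median.items():
    cl_i = rng.lognormal(np.log(cl), cv, size=10_000)  # individual clearances
    css = dose_mg / (cl_i * tau_h)  # steady-state average concentration
    print(f"{genotype}: P(Css > target) ~ {(css > target_css).mean():.2f}")
```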

Relevance:

100.00%

Publisher:

Abstract:

Energy demand is an important constraint on neural signaling. Several methods have been proposed to assess the energy budget of the brain based on a bottom-up approach, in which the energy demand of each individual biophysical process is first estimated independently and then summed to compute the brain's total energy budget. Here, we address this question using a novel approach that makes use of published datasets reporting average cerebral glucose and oxygen utilization in humans and rodents during different activation states. Our approach allows us (1) to decipher neuron-glia compartmentalization in energy metabolism and (2) to compute a precise state-dependent energy budget for the brain. Under the assumption that the fraction of energy used for signaling is proportional to the cycling of neurotransmitters, we find that in the activated state most of the energy (approximately 80%) is oxidatively produced and consumed by neurons to support neuron-to-neuron signaling. Glial cells, while contributing only a small fraction of energy production (approximately 6%), actually take up a significant fraction of glucose (50% or more) from the blood and provide neurons with glucose-derived energy substrates. Our results suggest that a significant part of glycolysis occurs in astrocytes, whereas most of the oxygen is utilized in neurons. As a consequence, a transfer of glucose-derived metabolites from glial cells to neurons has to take place. Furthermore, we find that the amplitude of this transfer is correlated with (1) the activity level of the brain - the larger the activity, the more metabolites are shuttled from glia to neurons - and (2) the oxidative activity in astrocytes - with higher glial pyruvate metabolism, fewer metabolites are shuttled from glia to neurons. While our method necessarily glosses over some of the details captured by a bottom-up biophysical approach, it allows for a straightforward assessment of the brain's energy budget from macroscopic measurements with minimal underlying assumptions.
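The accounting rests on simple stoichiometry: every glucose yields ~2 ATP by glycolysis, and each fully oxidized glucose (6 O2) yields roughly 30 ATP in total, so measured glucose and oxygen utilization pin down the budget. The sketch below illustrates that bookkeeping with rounded textbook stoichiometries and order-of-magnitude metabolic rates; neither the rates nor the yields are the study's values.

```python
# Illustrative whole-brain metabolic rates (umol/g/min, textbook orders of
# magnitude only; NOT the datasets analyzed in the study).
cmr_glc = 0.30  # cerebral metabolic rate of glucose
cmr_o2 = 1.60   # cerebral metabolic rate of oxygen

# Rounded stoichiometric assumptions: 6 O2 per fully oxidized glucose,
# ~30 ATP per oxidized glucose in total, 2 ATP per glucose from glycolysis.
glc_oxidized = cmr_o2 / 6.0              # glucose that is fully oxidized
atp_oxidative = glc_oxidized * (30 - 2)  # ATP beyond the glycolytic 2
atp_glycolytic = cmr_glc * 2             # glycolytic ATP from all glucose

total = atp_oxidative + atp_glycolytic
print(f"oxygen-glucose index: {cmr_o2 / cmr_glc:.2f} (6.0 = fully oxidative)")
print(f"ATP production: {total:.2f} umol/g/min, "
      f"{atp_oxidative / total:.0%} oxidative")
```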

Relevance:

100.00%

Publisher:

Abstract:

1. Biogeographical models of species' distributions are essential tools for assessing impacts of changing environmental conditions on natural communities and ecosystems. Practitioners need more reliable predictions to integrate into conservation planning (e.g. reserve design and management). 2. Most models still largely ignore or inappropriately take into account important features of species' distributions, such as spatial autocorrelation, dispersal and migration, biotic and environmental interactions. Whether distributions of natural communities or ecosystems are better modelled by assembling individual species' predictions in a bottom-up approach or modelled as collective entities is another important issue. An international workshop was organized to address these issues. 3. We discuss more specifically six issues in a methodological framework for generalized regression: (i) links with ecological theory; (ii) optimal use of existing data and artificially generated data; (iii) incorporating spatial context; (iv) integrating ecological and environmental interactions; (v) assessing prediction errors and uncertainties; and (vi) predicting distributions of communities or collective properties of biodiversity. 4. Synthesis and applications. Better predictions of the effects of impacts on biological communities and ecosystems can emerge only from more robust species' distribution models and better documentation of the uncertainty associated with these models. An improved understanding of causes of species' distributions, especially at their range limits, as well as of ecological assembly rules and ecosystem functioning, is necessary if further progress is to be made. A better collaborative effort between theoretical and functional ecologists, ecological modellers and statisticians is required to reach these goals.
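The "methodological framework for generalized regression" in point 3 refers to models such as GLMs that relate species presence/absence to environmental predictors. The sketch below is a generic, minimal example of such a model on synthetic data (all predictors and coefficients invented); it is not the workshop's framework.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Synthetic environmental predictors for 500 sites (invented for illustration).
n = 500
temperature = rng.normal(10, 4, n)        # degrees C
precipitation = rng.normal(1200, 300, n)  # mm/year
X = np.column_stack([temperature, precipitation])

# Synthetic truth: the species prefers warm, wet sites.
logit = -8 + 0.5 * temperature + 0.003 * precipitation
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

# A GLM with binomial error and logit link is logistic regression.
model = LogisticRegression(max_iter=1000).fit(X, presence)

# Predicted probability of presence at a new site (12 C, 1400 mm/year).
print(model.predict_proba([[12.0, 1400.0]])[0, 1])
```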

Relevance:

40.00%

Publisher:

Abstract:

Carbapenemases should be detected accurately and rapidly, given their possible epidemiological spread and their impact on treatment options. Here, we developed a simple, easy, and rapid matrix-assisted laser desorption ionization-time of flight (MALDI-TOF)-based assay to detect carbapenemases and compared this innovative test with four other diagnostic approaches on 47 clinical isolates. Tandem mass spectrometry (MS-MS) was also used to determine accurately the amount of antibiotic present in the supernatant after 1 h of incubation, and both the MALDI-TOF and MS-MS approaches exhibited 100% sensitivity and 100% specificity. By comparison, molecular genetic techniques (Check-MDR Carba PCR and Check-MDR CT103 microarray) showed 90.5% sensitivity and 100% specificity, as two strains of Aeromonas were not detected because their chromosomal carbapenemase is not targeted by the probes used in either kit. Altogether, this innovative MALDI-TOF-based approach, which uses a stable 10-μg disk of ertapenem, was highly efficient in detecting carbapenemases, with a sensitivity higher than that of PCR and microarray.
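Hydrolysis assays of this family classify an isolate by whether, after incubation with the bacteria, the intact-antibiotic peak in the spectrum gives way to the ring-opened product about 18 Da heavier (addition of water). The decision rule below is a generic illustration on made-up peak lists; the masses are approximate and the tolerance and ratio threshold are invented, not the published assay's settings.

```python
# Approximate masses (Da) for illustration only.
ERTAPENEM = 475.5               # intact ertapenem, approximate
HYDROLYZED = ERTAPENEM + 18.0   # ring-opened product gains one water

def carbapenemase_positive(peaks, tol=0.5, max_intact_ratio=0.2):
    """Classify from a {m/z: intensity} peak list measured after incubation:
    positive if the intact drug peak has shrunk relative to the +18 Da
    hydrolysis product."""
    def intensity(target):
        return sum(i for mz, i in peaks.items() if abs(mz - target) <= tol)

    intact, hydrolyzed = intensity(ERTAPENEM), intensity(HYDROLYZED)
    total = intact + hydrolyzed
    return total > 0 and intact / total < max_intact_ratio

# Toy spectra: a hydrolyzing isolate vs. a susceptible control.
print(carbapenemase_positive({475.5: 5.0, 493.5: 95.0}))   # True
print(carbapenemase_positive({475.5: 90.0, 493.5: 10.0}))  # False
```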

Relevance:

40.00%

Publisher:

Abstract:

The aim of this work is to present a new concept, called on-line desorption of dried blood spots (on-line DBS), allowing the direct analysis of a dried blood spot coupled to a liquid chromatography-mass spectrometry (LC/MS) device. The system is based on a stainless-steel cell which receives a blood sample (10 microL) previously spotted on filter paper. The cell is then integrated into the LC/MS system, where the analytes are desorbed out of the paper toward a column-switching system ensuring the purification and separation of the compounds before their detection on a single-quadrupole MS coupled to an atmospheric pressure chemical ionisation (APCI) source. The described procedure requires no sample pretreatment, even though the analysis is based on a whole-blood sample. To demonstrate the applicability of the concept, saquinavir, imipramine, and verapamil were chosen. Despite the use of a small sampling volume and a single-quadrupole detector, on-line DBS allowed the analysis of these three compounds over their therapeutic concentration ranges, from 50 to 500 ng/mL for imipramine and verapamil and from 100 to 1000 ng/mL for saquinavir. Moreover, the method showed good repeatability, with relative standard deviations (RSD) lower than 15% at two concentration levels (low and high). Response functions were found to be linear over the therapeutic range for each compound and were used to determine the concentrations in real patient samples for saquinavir. Comparison of the values obtained with those of a validated method used routinely in a reference laboratory showed good correlation between the two methods. Moreover, good selectivity was observed, ensuring that no endogenous or chemical components interfered with the quantitation of the analytes. This work demonstrates the feasibility and applicability of the on-line DBS procedure for bioanalysis.
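Quantification in such assays rests on an ordinary linear calibration: calibrators spiked across the therapeutic range, a least-squares fit of detector response against concentration, then back-calculation of unknowns from the fitted line. The numbers below are invented peak-area ratios for an imipramine-like analyte, shown only to make the arithmetic concrete; this is not the validated reference method mentioned above.

```python
import numpy as np

# Invented calibration data across a 50-500 ng/mL therapeutic range.
conc = np.array([50, 100, 200, 300, 400, 500], dtype=float)  # ng/mL
area_ratio = np.array([0.11, 0.21, 0.43, 0.62, 0.85, 1.04])  # analyte/IS area

# Least-squares line: response = slope * concentration + intercept.
slope, intercept = np.polyfit(conc, area_ratio, 1)

def back_calculate(ratio):
    """Concentration (ng/mL) for a measured analyte/IS area ratio."""
    return (ratio - intercept) / slope

r = np.corrcoef(conc, area_ratio)[0, 1]
print(f"r^2 = {r**2:.4f}")
print(f"sample with ratio 0.50 -> {back_calculate(0.50):.0f} ng/mL")
```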

Relevance:

40.00%

Publisher:

Abstract:

The extension of traditional data mining methods to time series has been effectively applied to a wide range of domains such as finance, econometrics, biology, security, and medicine. Many existing mining methods deal with the task of change-point detection, but very few provide a flexible approach. Querying specific change points with linguistic variables is particularly useful in crime analysis, where intuitive, understandable, and appropriate detection of changes can significantly improve the allocation of resources for timely and precise operations. In this paper, we propose an on-line method for detecting and querying change points in crime-related time series with the use of a meaningful representation and a fuzzy inference system. Change-point detection is based on a shape-space representation, and linguistic terms describing geometric properties of the change points are used to express queries, offering the advantages of intuitiveness and flexibility. An empirical evaluation is first conducted on a crime data set to confirm the validity of the proposed method, and then on a financial data set to test its general applicability. A comparison to a similar change-point detection algorithm and a sensitivity analysis are also conducted. Results show that the method is able to accurately detect change points at very low computational cost. More broadly, the detection of specific change points within time series of virtually any domain is made more intuitive and more understandable, even for experts not familiar with data mining.
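A shape-space representation of this general kind fits a low-order polynomial in a sliding window and treats the normalized coefficients as geometric descriptors, onto which linguistic terms such as "steep increase" can be mapped as fuzzy sets. The sketch below illustrates that idea with a quadratic fit and a toy membership function; the window length, normalization, and membership bounds are invented, not the paper's.

```python
import numpy as np

def shape_descriptors(series, window=12):
    """For each sliding window, return the (slope, curvature) coefficients
    of a least-squares quadratic fitted to the z-normalized values."""
    t = np.linspace(-1, 1, window)
    out = []
    for start in range(len(series) - window + 1):
        seg = np.asarray(series[start:start + window], dtype=float)
        seg = (seg - seg.mean()) / (seg.std() + 1e-9)
        curv, slope, _ = np.polyfit(t, seg, 2)  # highest degree first
        out.append((slope, curv))
    return np.array(out)

def steep_increase(slope, lo=0.5, hi=1.5):
    """Toy fuzzy membership of the linguistic term 'steep increase'."""
    return float(np.clip((slope - lo) / (hi - lo), 0.0, 1.0))

# Toy series: flat, then an abrupt ramp (the change to be detected).
series = [10.0] * 30 + [10.0 + 2.0 * k for k in range(30)]
memberships = [steep_increase(s) for s, _ in shape_descriptors(series)]
print("most 'steep increase'-like window starts at", int(np.argmax(memberships)))
```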

Relevance:

30.00%

Publisher:

Abstract:

Over the last few years, there has been a surge of work in a new field called "moral psychology", which uses experimental methods to test the psychological processes underlying human moral activity. In this paper, I shall follow this line of approach with the aim of working out a model of how people form value judgements and how they are motivated to act morally. I call this model an "affective picture": 'picture' because it remains strictly at the descriptive level, and 'affective' because it assigns an important role to affects and emotions. This affective picture is grounded in a number of plausible and empirically supported hypotheses. The main idea is that we should distinguish between various kinds of value judgements by focusing on the sort of state of mind people find themselves in while uttering a judgement. "Reasoned judgements" are products of rational considerations and are based on the preliminary acceptance of norms and values. By contrast, "basic value judgements" are affective, primitive and non-reflective ways of assessing the world. As we shall see, this analysis has some consequences for the traditional internalism-externalism debate in philosophy; it highlights the fact that motivation is primarily linked to "basic value judgements" and that the judgements we openly defend might not have a particular effect on our actions, unless we are inclined to have an emotional attitude that conforms to them.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: To assess the clinical outcome of patients who underwent long-axis sacroplasty as first-line treatment for sacral insufficiency fractures. Methods and materials: Nineteen patients with unilateral (n = 3) or bilateral (n = 16) sacral fractures were included. Under local anaesthesia, each patient underwent CT-guided sacroplasty using the long-axis approach through a single entry point. An average of 6 ml of PMMA was delivered along the path of each sacral fracture. For each patient, the VAS pain score was obtained before sacroplasty and at 1, 4, 24, and 48 weeks after the procedure. The use of analgesics (narcotic/non-narcotic) and the evolution of patient mobility before and after sacroplasty were also recorded. Results: The mean pre-procedure VAS score was 8 ± 1.9. It declined rapidly in the first week after the procedure (mean 4 ± 1.5), followed by a gradual decrease over the rest of the follow-up period at 4 weeks (mean 3 ± 1.2), 24 weeks (mean 2 ± 1.3), and 48 weeks (mean 1.3 ± 1.4). Eleven (58%) patients were under narcotic analgesia before sacroplasty, whereas 8 (42%) patients were using non-narcotics. Corresponding values after the procedure were 2/19 (10%) for narcotics and 10/19 (53%) for non-narcotics; seven (37%) patients did not report post-procedure analgesic use. The evolution of post-interventional mobility was favourable, with patients showing a significant improvement on the mobility point scale. Conclusion: Long-axis percutaneous sacroplasty is a suitable minimally invasive treatment option for patients presenting with sacral insufficiency fractures. Future studies with larger patient numbers are warranted to identify any potential limitations of this therapeutic approach.

Relevance:

30.00%

Publisher:

Abstract:

High-throughput technologies are now used to generate more than one type of data from the same biological samples. To properly integrate such data, we propose using co-modules, which describe coherent patterns across paired data sets, and we conceive several modular methods for their identification. We first test these methods using in silico data, demonstrating that the integrative scheme of our Ping-Pong Algorithm uncovers drug-gene associations more accurately when considering noisy or complex data. Second, we provide an extensive comparative study using the gene-expression and drug-response data from the NCI-60 cell lines. Using information from the DrugBank and Connectivity Map databases, we show that the Ping-Pong Algorithm predicts drug-gene associations significantly better than other methods. Co-modules provide insights into possible mechanisms of action for a wide range of drugs and suggest new targets for therapy.
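The "ping-pong" idea is to bounce a candidate signature back and forth between the paired matrices: a drug-response pattern over the shared cell lines selects correlated genes, whose combined expression pattern in turn refines the drug set, until the pair stabilizes as a co-module. The sketch below is a heavily simplified alternating iteration on random data illustrating only that back-and-forth; it omits the normalization and thresholding steps of the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

# Paired data over the same 60 cell lines (shapes chosen arbitrarily):
E = rng.normal(size=(200, 60))  # genes x cell lines (expression)
D = rng.normal(size=(50, 60))   # drugs x cell lines (drug response)

# Start from a random cell-line signature and bounce it between data sets.
z = rng.normal(size=60)
for _ in range(100):
    g = E @ z               # gene scores implied by the signature
    z = E.T @ g             # signature refined through expression data
    a = D @ z               # drug scores implied by the refined signature
    z = D.T @ a             # signature refined through drug-response data
    z /= np.linalg.norm(z)  # keep the iteration bounded

# A co-module: the genes and drugs scoring highest on the converged signature.
top_genes = np.argsort(-np.abs(E @ z))[:10]
top_drugs = np.argsort(-np.abs(D @ z))[:5]
print(top_genes, top_drugs)
```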

Relevance:

30.00%

Publisher:

Abstract:

Methods like Event History Analysis can show the existence of diffusion and part of its nature, but do not study the process itself. Nowadays, thanks to the increasing performance of computers, such processes can be studied using computational modeling. This thesis presents an agent-based model of policy diffusion mainly inspired by the model developed by Braun and Gilardi (2006). I start by developing a theoretical framework of policy diffusion that presents the main internal drivers of policy diffusion - such as the preference for the policy, the effectiveness of the policy, the institutional constraints, and the ideology - and its main mechanisms, namely learning, competition, emulation, and coercion. Diffusion, expressed by these interdependencies, is thus a complex process that needs to be studied with computational agent-based modeling. In a second step, computational agent-based modeling is defined along with its most significant concepts: complexity and emergence. Using computational agent-based modeling implies the development of an algorithm and its programming. Once this has been done, the different agents are left to interact. Consequently, a phenomenon of diffusion, derived from learning, emerges, meaning that the choice made by an agent is conditional on the choices made by its neighbors. As a result, learning follows an inverted S-curve, which leads to partial convergence - global divergence and local convergence - that triggers the emergence of political clusters, i.e. the creation of regions with the same policy. Furthermore, the average effectiveness in this computational world tends to follow a J-shaped curve, meaning not only that time is needed for a policy to deploy its effects, but also that it takes time for a country to find the best-suited policy. To conclude, diffusion is an emergent phenomenon arising from complex interactions, and the outcomes of my model are in line with both the theoretical expectations and the empirical evidence.
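A minimal model in this spirit places agents on a grid and lets each adopt the policy with a probability that grows with the share of adopting neighbors (learning); aggregate adoption then traces the S-curve and forms the spatial clusters described above. The toy below is generic and its parameters invented; it is a sketch of the mechanism, not the thesis's model.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 40                                  # N x N grid of countries/agents
adopted = np.zeros((N, N), dtype=bool)
adopted[N // 2, N // 2] = True          # a single initial adopter

base_rate, social_weight = 0.002, 0.25  # invented parameters
history = []
for step in range(150):
    # fraction of the four von Neumann neighbors that have adopted
    nbrs = (np.roll(adopted, 1, 0).astype(float) + np.roll(adopted, -1, 0)
            + np.roll(adopted, 1, 1) + np.roll(adopted, -1, 1)) / 4.0
    p_adopt = base_rate + social_weight * nbrs  # learning from neighbors
    adopted |= rng.random((N, N)) < p_adopt
    history.append(adopted.mean())

# Aggregate adoption rises along an S-curve; sample a few time points:
print([round(history[t], 2) for t in (0, 30, 60, 90, 120, 149)])
```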

Relevance:

30.00%

Publisher:

Abstract:

Stable isotope labels are routinely introduced into proteomes for quantification purposes. Full labeling of cells in varying biological states, followed by sample mixing, fractionation and intensive data acquisition, is used to obtain accurate large-scale quantification of total protein levels. However, biological processes often affect only a small group of proteins for a short time, resulting in changes that are difficult to detect against the total proteome background. An alternative approach could be the targeted analysis of the proteins synthesized in response to a given biological stimulus. Such proteins can be pulse-labeled with a stable isotope by metabolic incorporation of 'heavy' amino acids. In this study we investigated the specific detection and identification of labeled proteins using acquisition methods based on Precursor Ion Scans (PIS) on a triple-quadrupole ion trap mass spectrometer. PIS-based methods were set to detect unique immonium ions originating from labeled peptides. Different labels and methods were tested in standard mixtures to optimize performance. We showed that, in comparison with an untargeted analysis on the same instrument, the approach allowed a several-fold increase in the specificity of detection of labeled proteins over unlabeled ones. The technique was applied to the identification of proteins secreted by human cells into growth media containing bovine serum proteins, allowing the preferential detection of labeled cellular proteins over unlabeled bovine ones. However, compared with untargeted acquisitions on two different instruments, the PIS-based strategy showed some limitations in sensitivity. We discuss possible perspectives of the technique.
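Immonium ions make labeled peptides selectively detectable because the isotope label shifts the diagnostic fragment by a fixed, known mass; the immonium m/z is the residue mass minus CO plus H. The short computation below derives target m/z values for two common heavy labels; the masses are standard monoisotopic values, but the choice of labels is mine for illustration, not necessarily the one used in the study.

```python
# Immonium ion m/z = residue mass - CO + H (monoisotopic masses, Da).
CO = 27.99491
H = 1.00783

residue_mass = {"Leu": 113.08406, "Arg": 156.10111}

# Approximate mass added by common heavy labels:
# 13C6,15N1-leucine and 13C6,15N4-arginine.
label_shift = {"Leu": 7.01717, "Arg": 10.00827}

for aa in ("Leu", "Arg"):
    light = residue_mass[aa] - CO + H
    heavy = light + label_shift[aa]
    print(f"{aa}: immonium m/z {light:.4f} (light) -> {heavy:.4f} (heavy)")
```

A precursor ion scan set to one of the heavy fragment m/z values would then pass only precursors that produce that fragment, which is what filters labeled peptides away from the unlabeled background.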