793 results for PHARMACY-BASED MEASURES


Relevance: 30.00%

Abstract:

Formation of hydrates is one of the major flow assurance problems faced by the oil and gas industry. Hydrates tend to form in natural gas pipelines in the presence of water under favorable temperature and pressure conditions, generally low temperatures and correspondingly high pressures. Agglomeration of hydrates can block flowlines and equipment; such blockages are time-consuming to remove from subsea equipment and raise safety issues, and hydrate plugging makes natural gas pipelines more susceptible to bursts and explosions. A rigorous risk assessment of hydrate formation is therefore required to help prevent hydrate blockage and ensure equipment integrity. This thesis presents a novel methodology for assessing the probability of hydrate formation and a risk-based approach to determining the parameters of winterization schemes that avoid hydrate formation in natural gas pipelines operating in Arctic conditions. It also presents a lab-scale multiphase flow loop for studying the effects of geometric and hydrodynamic parameters on hydrate formation, and discusses the effects of those parameters on the multiphase development length of a pipeline. This study thus contributes substantially to assessing the probability of hydrate formation and to the decision-making process for winterization strategies that prevent hydrate formation in Arctic conditions.
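As a hedged illustration of the probability-of-formation idea (not the thesis's actual methodology), one might estimate the chance of entering the hydrate-stable region by Monte Carlo sampling of operating conditions against a hydrate equilibrium curve; the curve and all distribution parameters below are invented assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def hydrate_equilibrium_temp(p_mpa):
    """Hypothetical hydrate equilibrium temperature (K) vs pressure (MPa).
    A real study would use a thermodynamic model or measured curve."""
    return 273.0 + 8.0 * np.log(p_mpa)

# Assumed distributions of operating conditions along the pipeline.
n = 100_000
pressure = rng.normal(10.0, 1.5, n)       # MPa
temperature = rng.normal(285.0, 4.0, n)   # K

# Hydrates are possible where the fluid is colder than the
# equilibrium temperature at the local pressure.
at_risk = temperature < hydrate_equilibrium_temp(pressure)
print(f"Estimated probability of hydrate-stable conditions: {at_risk.mean():.3f}")
```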

Relevance: 30.00%

Abstract:

Research on temporal-order perception uses temporal-order judgment (TOJ) tasks or synchrony judgment (SJ) tasks, the latter in binary (SJ2) or ternary (SJ3) variants. In all cases, two stimuli are presented with some temporal delay, and observers judge their temporal order or simultaneity. Arbitrary psychometric functions are typically fitted to obtain performance measures such as sensitivity or the point of subjective simultaneity, but the parameters of these functions are uninterpretable. We describe routines in MATLAB and R that fit model-based functions whose parameters are interpretable in terms of the processes underlying temporal-order and simultaneity judgments and responses. These functions arise from an independent-channels model assuming arrival latencies with exponential distributions and a trichotomous decision space. Different routines fit data separately for the SJ2, SJ3, and TOJ tasks, jointly for any two tasks, or jointly for all three tasks (for the common cases in which two or even all three tasks were used with the same stimuli and participants). Additional routines provide bootstrap p-values and confidence intervals for estimated parameters. A further routine obtains performance measures from the fitted functions. An R package for Windows and source code for the MATLAB and R routines are available as Supplementary Files.
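The fitting routines themselves are in MATLAB and R; purely as a hedged sketch of the underlying independent-channels model (exponential arrival latencies and a trichotomous decision space), one can simulate the response proportions an SJ3 observer would produce. All parameter values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def sj3_proportions(soa_ms, rate1=1/40, rate2=1/40, delta=60.0, n=50_000):
    """Simulate an independent-channels observer in an SJ3 task.

    Each stimulus's arrival latency is exponential (rates in 1/ms);
    the second stimulus is shifted by the stimulus onset asynchrony.
    If the arrival-time difference falls within (-delta, delta), the
    response is 'simultaneous'; otherwise one stimulus is judged first.
    """
    t1 = rng.exponential(1 / rate1, n)            # arrival of stimulus 1
    t2 = soa_ms + rng.exponential(1 / rate2, n)   # arrival of stimulus 2
    d = t2 - t1
    p_first = (d > delta).mean()                  # stimulus 1 perceived first
    p_simult = (np.abs(d) <= delta).mean()
    p_second = (d < -delta).mean()                # stimulus 2 perceived first
    return p_first, p_simult, p_second

for soa in (-100, -50, 0, 50, 100):
    print(soa, [round(p, 3) for p in sj3_proportions(soa)])
```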

Relevance: 30.00%

Abstract:

In the central nervous system, dopamine plays a crucial role in many physiological functions such as learning, voluntary movement, motivation, cognition and hormone production. Dopaminergic signaling has also been shown to be altered in several neurological and psychiatric disorders, including Parkinson's disease and schizophrenia. Studies carried out in the laboratory of Dr. Daniel Lévesque (the host laboratory) have shown that the nuclear receptors Nur77 (NR4A1, NGFI-B) and RXRγ (retinoid X receptor γ) are involved in regulating the effects of dopamine in the central nervous system. These data further suggest that the Nur77/RXR complex plays a crucial role in the action of antipsychotic and antiparkinsonian drugs. However, very few drugs targeting Nur77 have been identified to date, and drugs acting on RXRγ remain poorly characterized. Moreover, currently available assays cannot capture the complexity of nuclear receptor (NR) activities and yield only indirect measures of drug activity. To better understand how the Nur77/RXRγ interaction is regulated in these processes, my project was to develop BRET (Bioluminescence Resonance Energy Transfer) and PCA-BRET (Protein Complementation Assay-BRET) assays based on the recruitment of a co-activator-mimicking motif fused to YFP. Our assays were validated with dose-response curves using various RXR compounds. The EC50 values obtained (median effective concentration, a measure of a compound's potency) were very similar to values previously reported in the literature. We also identified a compound, SR11237 (BMS649), that appears to be selective for the Nur77/RXRγ complex over the Nurr1/RXRγ and RXRγ/RXRγ complexes. Our results indicate that these BRET assays can be used to evaluate the selectivity of new compounds for the Nur77/RXRγ, Nurr1/RXRγ and RXRγ/RXRγ complexes. Another aspect of my doctoral project was to use BRET to demonstrate the importance of SUMOylation in regulating the activity of Nur77 in its monomeric, homodimeric and heterodimeric forms. We found that Nur77 mainly recruits SUMO2 at lysine 577, and, notably, that SUMO2 recruitment to Nur77 is potentiated in the presence of the SUMO E3 ligase PIASγ. Moreover, loss of SUMOylation at lysine 577 renders monomeric Nur77 unable to recruit various co-activation motifs, whereas its homo- and heterodimeric forms are unaffected. However, the presence of PIASγ does not potentiate co-activator recruitment, suggesting that this SUMO E3 ligase is involved only in SUMO recruitment and not in co-activator recruitment. We have thus identified a new post-translational modification of Nur77 that specifically regulates its monomeric activity. These projects could therefore provide crucial new data for improving the treatment of Parkinson's disease and schizophrenia, as well as a better understanding of the mechanisms regulating Nur77 function.
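As a hedged sketch of how EC50 values are typically extracted from dose-response data such as the BRET readouts described above, a four-parameter logistic (Hill) curve can be fitted; the readings and starting values below are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, slope):
    """Four-parameter logistic dose-response curve."""
    return bottom + (top - bottom) / (1 + (ec50 / conc) ** slope)

# Invented BRET-like readings over a concentration range (M).
conc = np.array([1e-9, 1e-8, 1e-7, 1e-6, 1e-5])
signal = np.array([0.05, 0.12, 0.48, 0.88, 0.97])

params, _ = curve_fit(hill, conc, signal, p0=[0.0, 1.0, 1e-7, 1.0])
print(f"Estimated EC50: {params[2]:.2e} M")
```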

Relevance: 30.00%

Abstract:

This dissertation presents a study and experimental research on asymmetric coding of stereoscopic video. A review of 3D technologies, video formats and coding is first presented, with particular emphasis on asymmetric coding of 3D content and on subjective methods for evaluating the performance of asymmetric coding. The research objective was defined as an extension of the current concept of asymmetric coding for stereo video. To achieve this objective, the first step is to define regions in the spatial dimension of the auxiliary view with different perceptual relevance within the stereo pair, identified by a binary mask. These regions are then encoded with better quality (lower quantisation) for the most relevant ones and worse quality (higher quantisation) for those with lower perceptual relevance. The relevance of a given region is estimated from a measure of disparity based on the absolute difference between views. To allow encoding of a stereo sequence using this method, a reference H.264/MVC encoder (JM) was modified to accept additional configuration parameters and inputs; the final encoder remains standard compliant. To show the viability of the method, subjective assessment tests were performed over a wide range of objective qualities of the auxiliary view. The results of these tests support three main conclusions. First, the proposed method can be more efficient than traditional asymmetric coding when encoding stereo video at higher qualities/rates. Second, the method can be used to extend the threshold at which uniform asymmetric coding methods start to degrade the subjective quality perceived by observers. Finally, the issue of eye dominance is addressed: results from stereo still images displayed over a short period of time showed it has little or no impact on the proposed method.
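A minimal numpy sketch of one plausible block-level version of the disparity-based relevance mask; the block size, threshold, and function name are assumptions, not the dissertation's exact algorithm:

```python
import numpy as np

def relevance_mask(left, right, block=16, threshold=10.0):
    """Hypothetical sketch: mark blocks of the auxiliary view whose mean
    absolute inter-view difference exceeds a threshold as perceptually
    relevant (1), so they can be encoded with lower quantisation."""
    h, w = left.shape
    diff = np.abs(left.astype(np.int16) - right.astype(np.int16))
    mask = np.zeros((h // block, w // block), dtype=np.uint8)
    for by in range(h // block):
        for bx in range(w // block):
            blk = diff[by*block:(by+1)*block, bx*block:(bx+1)*block]
            mask[by, bx] = 1 if blk.mean() > threshold else 0
    return mask

# Example with random 8-bit luma frames standing in for a stereo pair.
rng = np.random.default_rng(2)
left = rng.integers(0, 256, (64, 64), dtype=np.uint8)
right = rng.integers(0, 256, (64, 64), dtype=np.uint8)
print(relevance_mask(left, right))
```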

Relevance: 30.00%

Abstract:

The primary objective is to investigate the main factors contributing to GMS expenditure on pharmaceutical prescribing and to project this expenditure to 2026. This study is situated within the pharmacoeconomic literature on cost containment and projections. The thesis has five main aims:

1. To determine the main factors contributing to GMS expenditure on pharmaceutical prescribing.
2. To develop a model to project GMS prescribing expenditure in five-year intervals to 2026, using 2006 Central Statistics Office (CSO) Census data and 2007 Health Service Executive-Primary Care Reimbursement Service (HSE-PCRS) sample data.
3. To develop a model to project GMS prescribing expenditure in five-year intervals to 2026, using 2012 HSE-PCRS population data, incorporating cost containment measures, and 2011 CSO Census data.
4. To investigate the impact of demographic factors and the pharmacology of drugs (Anatomical Therapeutic Chemical (ATC) classification) on GMS expenditure.
5. To explore the consequences of GMS policy changes on prescribing expenditure and behaviour between 2008 and 2014.

The thesis is centred around three published articles and spans the end of a booming Irish economy in 2007, a recession from 2008 to 2013, and the beginning of a recovery in 2014. The literature identified a number of factors influencing pharmaceutical expenditure, including population growth, population aging, changes in drug utilisation and drug therapies, age, gender and location. It also identified the methods previously used in predictive modelling; consequently, a Monte Carlo Simulation (MCS) model was used to simulate projected expenditures to 2026, and Ordinary Least Squares (OLS) regression was used to determine the demographic and pharmacological factors influencing prescribing expenditure. The study commences against a backdrop of growing GMS prescribing costs, which rose from €250 million in 1998 to over €1 billion by 2007. Using sample 2007 HSE-PCRS prescribing data (n=192,000) and CSO population data from 2008, Conway et al. (2014) estimated that GMS prescribing expenditure could rise to €2 billion by 2026. The cogency of these findings was affected by the global economic crisis of 2008, which resulted in a sharp contraction in the Irish economy and mounting fiscal deficits, leading to Ireland's entry into a bailout programme. The sustainability of funding community drug schemes such as the GMS came under the spotlight of the EU, IMF and ECB (Troika), who set stringent targets for reducing drug costs as conditions of the bailout programme. Cost containment measures included the introduction of income eligibility limits for GP visit cards and medical cards for those aged 70 and over, the introduction of co-payments for prescription items, and reductions in wholesale mark-up and pharmacy dispensing fees. Projections for GMS expenditure were re-evaluated using 2012 HSE-PCRS prescribing population data and CSO population data based on Census 2011. Taking into account both cost containment measures and revised population predictions, GMS expenditure is estimated to increase by 64%, from €1.1 billion in 2016 to €1.8 billion by 2026 (Conway Lenihan and Woods, 2015). In the final paper, a cross-sectional study was carried out on the HSE-PCRS population prescribing database (n=1.63 million claimants) to investigate the impact of demographic factors and the pharmacology of the drugs on GMS prescribing expenditure.
Those aged over 75 (β = 1.195) and cardiovascular prescribing (β = 1.193) were the greatest contributors to annual GMS prescribing costs. Respiratory drugs (montelukast) accounted for the highest proportion and expenditure for GMS claimants under the age of 15. Drugs prescribed for the nervous system (escitalopram, olanzapine and pregabalin) were highest for those aged 16 to 64, while cardiovascular drugs (statins) were highest for those aged over 65. Prescribing costs were higher for females than for males, and females were prescribed more items across the four ATC groups, except among children under 11 (Conway Lenihan et al., 2016). This research indicates that growth in the proportion of elderly claimants and the associated levels of cardiovascular prescribing, particularly for statins, will present difficulties for Ireland in terms of cost containment. While policies aimed at cost containment (co-payment charges, generic substitution, reference pricing, adjustments to GMS eligibility) can be used to curtail expenditure, health promotion programmes and educational interventions should be given equal emphasis. Policies intended to affect physicians' prescribing behaviour, such as guidelines, information (about prices and less expensive alternatives) and feedback, together with budgetary restrictions, could also yield savings.
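A hedged sketch of the Monte Carlo Simulation (MCS) style of projection described above; the claimant counts and per-claimant cost distributions below are invented placeholders, not the thesis's data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical claimant counts per age band and per-claimant annual cost
# (mean and uncertainty of the mean, EUR): placeholders, not thesis data.
claimants = {"0-15": 300_000, "16-64": 900_000, "65+": 430_000}
cost_eur = {"0-15": (150, 15), "16-64": (450, 40), "65+": (1400, 120)}

def simulate_total_spend(n_sims=10_000):
    totals = np.zeros(n_sims)
    for band, n in claimants.items():
        mean, sd = cost_eur[band]
        # Draw each simulation's average per-claimant cost, then scale
        # by the projected number of claimants in the band.
        totals += rng.normal(mean, sd, n_sims) * n
    return totals

totals = simulate_total_spend()
print(f"Median projected spend: EUR {np.median(totals) / 1e9:.2f} bn")
lo, hi = np.percentile(totals, [2.5, 97.5]) / 1e9
print(f"95% interval: EUR {lo:.2f}-{hi:.2f} bn")
```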

Relevance: 30.00%

Abstract:

In this study we propose using the distribution of a performance measure, rather than its point value, to rank hedge funds. The Generalized Sharpe Ratio and other similar measures that take into account the higher-order moments of portfolio return distributions are commonly used to evaluate hedge fund performance. The literature in this field has reported non-significant differences in rankings between performance measures that do and do not account for higher moments of the distribution. Our approach provides a much more powerful way to differentiate between hedge funds' performance. We use a semi-nonparametric density based on Gram-Charlier expansions to forecast the conditional distribution of hedge fund returns and the corresponding distribution of the performance measure. Through a forecasting exercise we show the advantages of our technique relative to the more traditional point performance measures.
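For illustration, a Gram-Charlier expansion of a standardized density can be written directly in terms of skewness and excess kurtosis; the sketch below uses invented parameter values, not estimates from hedge fund data (note that such expansions can turn slightly negative in the tails):

```python
import numpy as np
from scipy.stats import norm

def gram_charlier_pdf(x, skew=-0.5, ex_kurt=1.2):
    """Gram-Charlier expansion of a standardized density:
    phi(x) * [1 + skew/6 * He3(x) + ex_kurt/24 * He4(x)],
    with probabilists' Hermite polynomials He3 and He4."""
    he3 = x**3 - 3*x
    he4 = x**4 - 6*x**2 + 3
    return norm.pdf(x) * (1 + skew/6 * he3 + ex_kurt/24 * he4)

x = np.linspace(-4, 4, 9)
print(np.round(gram_charlier_pdf(x), 4))
```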

Relevance: 30.00%

Abstract:

Climate change and continuing urbanization contribute to increased urban vulnerability to flooding. Relying solely on traditional flood control measures is recognized as inadequate, since the damage can be catastrophic if flood controls fail. The idea of a flood-resilient city, one that can withstand or adapt to a flood event without being harmed in its functionality, seems promising. But what does resilience actually mean when applied to urban environments exposed to flood risk, and how can it be achieved? This paper presents a heuristic framework for assessing the flood resilience of cities, for scientists and policy-makers alike. It enriches the current literature on flood resilience by clarifying the meaning of its three key characteristics (robustness, adaptability and transformability) and by identifying important components for implementing resilience strategies. The resilience discussion thus moves a step forward, from predominantly defining resilience to generating insight into "doing" resilience in practice. The framework is illustrated with two case studies from Hamburg, which show that resilience, and particularly the underlying notions of adaptability and transformability, first and foremost requires further capacity-building among public as well as private stakeholders. The case studies suggest that flood resilience alone is currently not enough motivation to move from traditional to more resilient flood protection schemes in practice; rather, it needs to be integrated into a bigger urban agenda.

Relevance: 30.00%

Abstract:

Aberrant behavior of biological signaling pathways has been implicated in diseases such as cancer. Therapies have been developed to target proteins in these networks in the hope of curing the illness or bringing about remission. However, identifying drug targets with a good therapeutic index has proven challenging, since signaling pathways have many components and many interconnections such as feedback, crosstalk, and divergence. Unfortunately, characteristics of these pathways such as redundancy, feedback, and drug resistance reduce the efficacy of single-target therapy and necessitate using more than one drug to target multiple nodes in the system. However, choosing multiple targets with a high therapeutic index poses further challenges, since the combinatorial search space can be enormous. To cope with the complexity of these systems, computational tools such as ordinary differential equations have been used to successfully model some of these pathways. Building such models, however, requires experimentally measured initial concentrations of the components and reaction rates, which are difficult to obtain and, in very large networks, may simply not be available. Fortunately, other modeling tools exist that, though not as powerful as ordinary differential equations, do not need rates and initial conditions to model signaling pathways; Petri nets and graph theory are among them. In this thesis, we introduce a methodology based on Petri net siphon analysis and graph-theoretic network centrality measures for identifying prospective targets for single- and multiple-drug therapies. In this methodology, potential targets are first identified in a Petri net model of a signaling pathway using siphon analysis. Graph-theoretic centrality measures are then employed to prioritize the candidate targets. An algorithm is also developed to check whether the candidate targets are able to disable the intended outputs in the graph model of the system. We implement structural and dynamical models of the ErbB1-Ras-MAPK pathway and use them to assess and evaluate this methodology. The identified drug targets, single and multiple, correspond to clinically relevant drugs. Overall, the results suggest that this methodology, using siphons and centrality measures, shows promise for identifying and ranking drug targets. Since it uses only the structural information of the signaling pathways and needs neither initial conditions nor dynamical rates, it can be applied to larger networks.
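A hedged toy sketch of the prioritization and output-disabling checks on a graph model, using networkx; the pathway graph below is invented and far simpler than the ErbB1-Ras-MAPK model used in the thesis:

```python
import networkx as nx

# Toy directed signaling graph loosely shaped like a receptor -> MAPK
# cascade; node names and edges are invented for illustration.
G = nx.DiGraph([
    ("ErbB1", "Ras"), ("Ras", "Raf"), ("Raf", "MEK"), ("MEK", "ERK"),
    ("ErbB1", "PI3K"), ("PI3K", "ERK"),  # crosstalk branch
])

# Rank candidate targets (e.g., survivors of a siphon analysis) by
# betweenness centrality, standing in for the prioritization step.
candidates = ["Ras", "Raf", "PI3K"]
scores = nx.betweenness_centrality(G)
for node in sorted(candidates, key=lambda n: scores[n], reverse=True):
    print(node, round(scores[node], 3))

# A crude reachability check: does removing a candidate disconnect the
# output ERK from the input ErbB1?
for node in candidates:
    H = G.copy()
    H.remove_node(node)
    print(node, "disables output:", not nx.has_path(H, "ErbB1", "ERK"))
```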

Relevance: 30.00%

Abstract:

The paper starts from the concern that, while there is a large body of literature on theoretical definitions and measurements of accessibility, the extent to which such measures are used in planning practice is less clear. Previous reviews of accessibility instruments have in fact identified a gap between their clear theoretical underpinnings and their infrequent application in spatial and transport planning. In this paper we present the results of a structured workshop involving private and public stakeholders to test the usability of gravity-based accessibility measures (GraBaM) for assessing integrated land-use and transport policies. The research is part of COST Action TU1002 "Accessibility Instruments for Planning Practice", during which different accessibility instruments were tested on different case studies. Here we report on the empirical case study of Rome.
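For reference, a minimal sketch of a Hansen-type gravity-based accessibility measure of the kind GraBaM operationalizes, A_i = sum_j O_j * exp(-beta * c_ij); the decay parameter and toy data are assumptions, not the calibrated values from the Rome case study:

```python
import numpy as np

def gravity_accessibility(opportunities, cost, beta=0.1):
    """Hansen-type gravity-based accessibility per origin zone i:
    A_i = sum_j O_j * exp(-beta * c_ij)."""
    return (opportunities[None, :] * np.exp(-beta * cost)).sum(axis=1)

opportunities = np.array([500.0, 1200.0, 300.0])   # e.g., jobs in zone j
cost = np.array([[5.0, 15.0, 30.0],                # travel cost i -> j (min)
                 [15.0, 5.0, 20.0],
                 [30.0, 20.0, 5.0]])
print(np.round(gravity_accessibility(opportunities, cost), 1))
```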

Relevance: 30.00%

Abstract:

Advertising investment and audience figures indicate that television continues to lead as a mass advertising medium. However, its effectiveness is questioned due to problems such as zapping, saturation and audience fragmentation, which has favoured the development of non-conventional advertising formats. This study provides empirical evidence for the theoretical work on such formats. It analyzes the recall generated by four non-conventional advertising formats in a real environment: short programme (branded content), television sponsorship, and internal and external telepromotion, versus the more conventional spot. The methodology integrated secondary data with primary data from ad hoc computer-assisted telephone interviews (CATI) conducted with a sample of 2,000 individuals aged 16 to 65, representative of the total television audience. Our findings show that non-conventional advertising formats are more effective at the cognitive level, generating higher levels of both unaided and aided recall in all the formats analyzed when compared to the spot.

Relevance: 30.00%

Abstract:

Objective: To examine the effectiveness of an "Enhancing Positive Emotions Procedure" (EPEP), based on positive psychology and cognitive behavioral therapy, in relieving distress at the time of adjuvant chemotherapy treatment in colorectal cancer (CRC) patients. It is expected that EPEP will increase quality of life and positive affect in CRC patients during the chemotherapy intervention and at 1-month follow-up.

Method: A group of 24 CRC patients received the EPEP (intervention group), whereas another group of 20 CRC patients did not (control group). Quality of life (EORTC QLQ-C30) and mood (PANAS) were assessed at three time points: prior to entering the study (T1), at the end of the period required to apply the EPEP (T2, 6 weeks after T1), and at follow-up (T3, one month after T2). Patients' assessments of the EPEP (improvement in mood states, and the significance of the attention received) were measured with Likert scales.

Results: Insomnia was reduced in the intervention group. The intervention group had better scores on positive affect, although there were no significant differences between groups or over time. There was a trend toward better scores at T2 and T3 for the intervention group on the global health status and the physical, role and social functioning scales. Patients stated that their positive mood was enhanced and that the EPEP was an important resource.

Conclusions: CRC patients receiving the EPEP during chemotherapy believed that this intervention was important. Furthermore, the EPEP seems to improve positive affect and quality of life. The EPEP has potential benefits, and its implementation for CRC patients should be considered.

Relevance: 30.00%

Abstract:

IMPORTANCE: Prevention strategies for heart failure are needed.

OBJECTIVE: To determine the efficacy of a screening program using brain-type natriuretic peptide (BNP) and collaborative care in an at-risk population in reducing newly diagnosed heart failure and prevalence of significant left ventricular (LV) systolic and/or diastolic dysfunction.

DESIGN, SETTING, AND PARTICIPANTS: The St Vincent's Screening to Prevent Heart Failure Study, a parallel-group randomized trial involving 1374 participants with cardiovascular risk factors (mean age, 64.8 [SD, 10.2] years) recruited from 39 primary care practices in Ireland between January 2005 and December 2009 and followed up until December 2011 (mean follow-up, 4.2 [SD, 1.2] years).

INTERVENTION: Patients were randomly assigned to receive usual primary care (control condition; n=677) or screening with BNP testing (n=697). Intervention-group participants with BNP levels of 50 pg/mL or higher underwent echocardiography and collaborative care between their primary care physician and specialist cardiovascular service.

MAIN OUTCOMES AND MEASURES: The primary end point was prevalence of asymptomatic LV dysfunction with or without newly diagnosed heart failure. Secondary end points included emergency hospitalization for arrhythmia, transient ischemic attack, stroke, myocardial infarction, peripheral or pulmonary thrombosis/embolus, or heart failure.

RESULTS: A total of 263 patients (41.6%) in the intervention group had at least 1 BNP reading of 50 pg/mL or higher. The intervention group underwent more cardiovascular investigations (control, 496 per 1000 patient-years vs intervention, 850 per 1000 patient-years; incidence rate ratio, 1.71; 95% CI, 1.61-1.83; P<.001) and received more renin-angiotensin-aldosterone system-based therapy at follow-up (control, 49.6%; intervention, 56.5%; P=.01). The primary end point of LV dysfunction with or without heart failure was met in 59 (8.7%) of 677 in the control group and 37 (5.3%) of 697 in the intervention group (odds ratio [OR], 0.55; 95% CI, 0.37-0.82; P = .003). Asymptomatic LV dysfunction was found in 45 (6.6%) of 677 control-group patients and 30 (4.3%) of 697 intervention-group patients (OR, 0.57; 95% CI, 0.37-0.88; P = .01). Heart failure occurred in 14 (2.1%) of 677 control-group patients and 7 (1.0%) of 697 intervention-group patients (OR, 0.48; 95% CI, 0.20-1.20; P = .12). The incidence rates of emergency hospitalization for major cardiovascular events were 40.4 per 1000 patient-years in the control group vs 22.3 per 1000 patient-years in the intervention group (incidence rate ratio, 0.60; 95% CI, 0.45-0.81; P = .002).

CONCLUSION AND RELEVANCE: Among patients at risk of heart failure, BNP-based screening and collaborative care reduced the combined rates of LV systolic dysfunction, diastolic dysfunction, and heart failure.

TRIAL REGISTRATION: clinicaltrials.gov Identifier: NCT00921960.
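As a worked check of the arithmetic, the crude (unadjusted) odds ratio and 95% Wald interval for the heart-failure endpoint can be reproduced from the counts in the abstract; the primary-endpoint OR of 0.55 is presumably covariate-adjusted and will not be recovered this way:

```python
import math

def odds_ratio(events_a, n_a, events_b, n_b):
    """Crude odds ratio of group A vs group B with a 95% Wald CI."""
    or_ = (events_a / (n_a - events_a)) / (events_b / (n_b - events_b))
    se = math.sqrt(1/events_a + 1/(n_a - events_a)
                   + 1/events_b + 1/(n_b - events_b))
    lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
    return or_, lo, hi

# Heart failure: 7 of 697 (intervention) vs 14 of 677 (control).
print([round(v, 2) for v in odds_ratio(7, 697, 14, 677)])
# -> approximately [0.48, 0.19, 1.20], matching the reported 0.48 (0.20-1.20)
```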

Relevance: 30.00%

Abstract:

This study investigates the effect of solid dispersions prepared from polyethylene glycol (PEG) 3350 and 6000 Da, alone or combined with the non-ionic surfactant Tween 80, on the solubility and dissolution rate of the poorly soluble drug eprosartan mesylate (ESM), in an attempt to improve its bioavailability following oral administration.

INTRODUCTION

ESM is a potent anti-hypertensive agent [1]. It has low water solubility and is classified as a Class II drug under the Biopharmaceutics Classification System (BCS), leading to low and variable oral bioavailability (approximately 13%) [2]. Improving ESM solubility and/or dissolution rate would therefore be expected to improve the drug's bioavailability. Solid dispersion is a widely used technique for improving the water solubility of poorly water-soluble drugs using various biocompatible polymers. In this study, we aimed to enhance the solubility and dissolution of ESM using solid dispersions (SDs) formulated from two grades of polyethylene glycol (PEG 3350 and PEG 6000 Da), either individually or in combination with Tween 80.

MATERIALS AND METHODS

ESM SDs were prepared by the solvent evaporation method using either PEG 3350 or PEG 6000 at various drug:polymer (w/w) ratios (1:1, 1:2, 1:3, 1:4 and 1:5), alone or combined with Tween 80 added at a fixed proportion of 0.1 of the drug by weight. Physical mixtures (PMs) of drug and carriers were also prepared at the same ratios. The solid dispersions and physical mixtures were characterized in terms of drug content and drug dissolution using a USP apparatus II dissolution tester, with the drug assayed by HPLC. The dissolution enhancement ratio (ER%) of each SD relative to the plain drug was calculated. Drug-polymer interactions were evaluated using Differential Scanning Calorimetry (DSC) and FT-IR.
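ER% is not defined in the abstract; a common convention (assumed here) compares the percentage of drug dissolved at a fixed time point $t$:

$$\mathrm{ER}\,(\%) = \frac{D_{\mathrm{SD}}(t) - D_{\mathrm{plain}}(t)}{D_{\mathrm{plain}}(t)} \times 100,$$

where $D_{\mathrm{SD}}(t)$ and $D_{\mathrm{plain}}(t)$ are the percentages of drug dissolved from the solid dispersion and from the plain drug, respectively.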

RESULTS AND DISCUSSION

The in vitro solubility and dissolution studies showed that SDs prepared with both polymers produced a marked improvement (p < 0.05) in comparison to the plain drug, which reached around 32% (Fig. 1). The dissolution enhancement ratio was dependent on polymer type and concentration. Adding Tween 80 to the SD did not further enhance dissolution, but it reduced the amount of polymer required to achieve the same enhancement. The DSC and FT-IR studies indicated that the SD transformed the drug from its crystalline to its amorphous form.

CONCLUSIONS

This study indicated that SDs prepared using both polymers, PEG 3350 and PEG 6000, markedly improved the in vitro solubility and dissolution of ESM, which may translate into improved drug bioavailability in vivo.

Acknowledgments

This work is part of the MSc thesis of O.M. Ali at the Faculty of Pharmacy, Aleppo University, Syria.

REFERENCES

[1] Ruilope L, Jager B. Eprosartan for the treatment of hypertension. Expert Opin Pharmacother 2003;4(1):107-14.

[2] Tenero D, Martin D, Wilson B, Jushchyshyn J, Boike S, Lundberg D, et al. Pharmacokinetics of intravenously and orally administered eprosartan in healthy males: absolute bioavailability and effect of food. Biopharm Drug Dispos 1998;19(6):351-6.


Relevance: 30.00%

Abstract:

There has been increasing interest in the development of new methods that use Pareto optimality to deal with multi-objective criteria (for example, accuracy and time complexity). Once one has developed an approach to a problem of interest, the question is how to compare it with the state of the art. In machine learning, algorithms are typically evaluated by comparing their performance on different data sets by means of statistical tests. The standard tests used for this purpose can consider neither multiple performance measures jointly nor multiple competitors at once. The aim of this paper is to resolve these issues by developing statistical procedures that account for multiple competing measures at the same time and compare multiple algorithms altogether. In particular, we develop two tests: a frequentist procedure based on the generalized likelihood-ratio test and a Bayesian procedure based on a multinomial-Dirichlet conjugate model. We further extend both by discovering conditional independences among measures to reduce the number of model parameters, as the number of cases studied in such comparisons is usually very small. Data from a comparison among general-purpose classifiers is used to show a practical application of our tests.
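A hedged sketch of the multinomial-Dirichlet side of such a comparison (not the paper's exact procedure): outcomes of pairwise Pareto-dominance comparisons across data sets are treated as multinomial counts with a Dirichlet prior, and the posterior is sampled:

```python
import numpy as np

rng = np.random.default_rng(4)

# Per-dataset outcomes when comparing two classifiers on two measures
# (e.g., accuracy and time) via Pareto dominance: A dominates, B
# dominates, or the pair is incomparable/tied. Counts are invented.
counts = np.array([14, 6, 10])        # [A wins, B wins, neither]
prior = np.array([1.0, 1.0, 1.0])     # symmetric Dirichlet prior

# Posterior is Dirichlet(counts + prior); sample it to estimate the
# probability that A dominates B more often than the reverse.
theta = rng.dirichlet(counts + prior, size=100_000)
print("P(theta_A > theta_B | data) =", (theta[:, 0] > theta[:, 1]).mean())
```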