950 results for contribution analysis


Relevance: 30.00%

Abstract:

BACKGROUND Systemic approaches are needed to understand how variations in the genes associated with opioid pharmacokinetics and response can be used to predict patient outcome. The application of pharmacogenetic analysis to two cases of life-threatening opioid-induced respiratory depression is presented. The usefulness of genotyping in the context of these cases is discussed. METHODS A panel of 20 functional candidate polymorphisms in genes involved in the opioid biotransformation pathway (CYP2D6, UGT2B7, ABCB1, OPRM1, COMT) was genotyped in these two patients using commercially available genotyping assays. RESULTS In case 1, the patient experienced adverse outcomes when administered codeine and morphine, but not hydromorphone. Genetic test results suggested that this differential response may be due to an inherent propensity to generate active metabolites from both codeine and morphine. These active metabolites are not generated with hydromorphone. In case 2, the patient experienced severe respiratory depression during postoperative recovery following standard doses of morphine. The patient was found to carry genetic variations that result in decreased morphine efflux transporter activity at the blood-brain barrier and increased sensitivity to opioids. CONCLUSIONS Knowledge of the relative contribution of pharmacogenetic biomarkers and of their influence on opioid response is continually evolving. Pharmacogenetic analysis, together with clinical history, has the potential to provide mechanistic insight into severe respiratory depressive events in patients who receive opioids at therapeutic doses.

Relevance: 30.00%

Abstract:

The renin-angiotensin system (RAS) regulates blood pressure through its effects on vascular tone, renal hemodynamics, and renal sodium and fluid balance. The genes encoding the four major components of the RAS, angiotensinogen, renin, angiotensin I-converting enzyme (ACE), and angiotensin II receptor type 1 (AT1), have been investigated as candidate genes in the pathogenesis of essential hypertension. However, studies have primarily focused on small samples of diseased individuals and, therefore, have provided little information about the determinants of interindividual variation in blood pressure (BP) in the general population. Using data from a large population-based sample from Rochester, MN, I have evaluated the contribution of variation in the region of the RAS genes to interindividual variation in systolic, diastolic, and mean arterial pressure in the population at large. Marker genotype data from four polymorphisms located within or very near these genes were first collected on 3,974 individuals from 583 randomly ascertained three-generation pedigrees. Haseman-Elston regression and variance component methods of linkage analysis were then carried out to estimate the proportion of interindividual variance in BP attributable to the effects of variation at these four measured loci. A significant effect of the ACE locus on interindividual variation in mean arterial pressure (MAP) was detected in a sample of siblings belonging to the youngest generation. After allowing for measured covariates, this effect accounted for 15-25% of the interindividual variance in MAP, and was even greater in a subset with a positive family history of hypertension. When gender-specific analyses were carried out, this effect was significant in males but not in females. Extended pedigree analyses also provided evidence for an effect of the ACE locus on interindividual variation in MAP, but no difference between males and females was observed. Circumstantial evidence suggests that the ACE gene itself may be responsible for the observed effects on BP, although the possibility that other genes in the region may be at play cannot be excluded. No definitive evidence for an effect of the renin, angiotensinogen, or AT1 loci on interindividual variation in BP was obtained in this study, suggesting that the impact of these genes on BP may not be great in the Caucasian population at large. However, this does not preclude a larger effect of these genes in some subsets of individuals, especially among those with clinically manifest hypertension or coronary heart disease, or in other populations.
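The Haseman-Elston regression used above tests for linkage by regressing the squared trait difference of each sib pair on the pair's estimated identity-by-descent (IBD) allele sharing at a marker; a significantly negative slope indicates that the marker region contributes to trait variance. A minimal sketch on simulated data (the variance parameters are hypothetical, not the Rochester estimates):

```python
import random

random.seed(1)

# Haseman-Elston sketch on simulated sib pairs. pi_hat is the proportion of
# alleles the pair shares identity-by-descent (IBD) at the marker; the squared
# trait difference shrinks, on average, as IBD sharing rises when the marker
# region influences the trait. Variance parameters are hypothetical.
QTL_VAR, RES_VAR = 0.4, 0.6

def squared_sib_difference(pi_hat):
    var_diff = 2 * RES_VAR + 2 * QTL_VAR * (1 - pi_hat)
    return random.gauss(0.0, var_diff ** 0.5) ** 2

pairs = [(pi, squared_sib_difference(pi))
         for _ in range(2000)
         for pi in (0.0, 0.5, 1.0)]  # pairs sharing 0, 1, or 2 alleles IBD

# Regress squared differences on pi_hat (ordinary least squares).
n = len(pairs)
mx = sum(p for p, _ in pairs) / n
my = sum(y for _, y in pairs) / n
slope = (sum((p - mx) * (y - my) for p, y in pairs)
         / sum((p - mx) ** 2 for p, _ in pairs))

print(f"Haseman-Elston slope: {slope:.2f}")  # negative slope suggests linkage
```

In this simulation the expected slope is -2 × QTL_VAR; under the null hypothesis of no linkage it is zero, which is the basis of the test.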

Relevance: 30.00%

Abstract:

Treatment of mice with the immunomodulating agent Corynebacterium parvum (C. parvum) was shown to result in a severe and long-lasting depression of splenic natural killer (NK) cell-mediated cytotoxicity 5-21 days post-inoculation. Because NK cells have been implicated in immunosurveillance against malignancy (due to their spontaneous occurrence and rapid reactivity to a variety of histological types of tumors), as well as in resistance to established tumors, this decreased activity was of particular concern: it is contrary to what would be considered therapeutically desirable in cancer treatment, where a potentiation of antitumor effector functions, including NK cell activity, would be expected to lead to more effective destruction of malignant cells. Therefore, an analysis of the mechanism of this decline of splenic NK cell activity in C. parvum-treated mice was undertaken. From in vitro co-culturing experiments, it was found that low NK-responsive C. parvum splenocytes were capable of reducing the normally high reactivity of cells from untreated syngeneic mice to YAC-1 lymphoma, suggesting the presence of NK-directed suppressor cells in C. parvum-treated animals. This was further supported by the demonstration of normal levels of cytotoxicity in C. parvum splenocyte preparations following Ficoll-Hypaque separation, which coincided with removal of the NK-suppressive capabilities of these cells. The T cell nature of these regulatory cells was indicated by (1) the failure of C. parvum to cause a reduction of NK cell activity, or the generation of NK-directed suppressor cells, in T cell-deficient athymic mice, (2) the removal of C. parvum-induced suppression by T cell-depleting fractionation procedures or treatments, and (3) the demonstration of suppression of NK cell activity by T cell-enriched C. parvum splenocytes. These studies suggest, therefore, that the eventual reduction of suppression by T cell elimination and/or inhibition may result in a promotion of the antitumor effectiveness of C. parvum due to the contribution of "freed" NK effector cell activity. However, the temporary suppression of NK cell activity induced by C. parvum (reactivity of treated mice returns to normal levels within 28 days after C. parvum injection) may in fact be favorable in some situations, e.g. in bone marrow transplantation, since NK cells have also been suggested to play a role in the process of bone marrow graft rejection. Therefore, the discriminating use of agents such as C. parvum may allow for the controlled regulation of NK cell activity suggested to be necessary for the optimization of therapeutic regimens.

Relevance: 30.00%

Abstract:

We address under what conditions a magma generated by partial melting at 100 km depth in the mantle wedge above a subduction zone can reach the crust in dikes before stalling. We also address under what conditions primitive basaltic magma (Mg# > 60) can be delivered from this depth to the crust. We employ linear elastic fracture mechanics with magma solidification theory and perform a parametric sensitivity analysis. All dikes are initiated at a depth of 100 km in the thermal core of the wedge, and the Moho is fixed at 35 km depth. We consider a range of melt solidus temperatures (800-1100 °C), viscosities (10-100 Pa s), and densities (2400-2700 kg m^-3). We also consider a range of host rock fracture toughness values (50-300 MPa m^1/2) and dike lengths (2-5 km), and two thermal structures for the mantle wedge (1260 and 1400 °C at 100 km depth; 760 and 900 °C at 35 km depth). For the given parameter space, many dikes can reach the Moho in less than a few hundred hours, well within the time constraints provided by U-series isotope disequilibria studies. Increasing the temperature in the mantle wedge, or increasing the dike length, allows additional dikes to propagate to the Moho. We conclude that some dikes with vertical lengths near their critical lengths and relatively high solidus temperatures will stall in the mantle before reaching the Moho, and these may be returned by corner flow to depths where they can melt under hydrous conditions. Thus, a chemical signature in arc lavas suggesting partial melting of slab basalts may be partly influenced by these recycled dikes. Alternatively, dikes with lengths well above their critical lengths can easily deliver primitive magmas to the crust, particularly if the mantle wedge is relatively hot. Dike transport remains a viable primary mechanism of magma ascent in convergent tectonic settings, but the potential for less rapid mechanisms making an important contribution increases as the mantle temperature at the Moho approaches the solidus temperature of the magma.
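In the linear elastic fracture mechanics framework used above, a dike propagates only while the stress intensity at its upper tip exceeds the host rock fracture toughness K_c. As a rough sketch (a Weertman-type buoyant crack with a linear pressure gradient, not necessarily the authors' exact formulation), K = ½ Δρ g a √(πa) for half-length a, which gives a critical length estimate:

```python
import math

def critical_dike_length(K_c, d_rho, g=9.81):
    """Critical total length 2*a_c (m) below which a buoyant dike stalls.

    Uses the Weertman-type estimate K = 0.5 * d_rho * g * a * sqrt(pi * a)
    for a crack of half-length a; setting K = K_c and solving for a gives
    a_c = (2 * K_c / (sqrt(pi) * d_rho * g)) ** (2/3).
    """
    a_c = (2.0 * K_c / (math.sqrt(math.pi) * d_rho * g)) ** (2.0 / 3.0)
    return 2.0 * a_c

# Hypothetical mid-range values: K_c = 100 MPa m^1/2, magma density
# 2600 kg/m^3 against a mantle of roughly 3300 kg/m^3 (d_rho = 700).
L_c = critical_dike_length(K_c=100e6, d_rho=700.0)
print(f"critical dike length ~ {L_c / 1000:.1f} km")
```

For these placeholder values the estimate comes out near the low end of the 2-5 km dike lengths considered above; with K_c = 300 MPa m^1/2 it roughly doubles, which is why dikes near their critical lengths can stall in the mantle.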

Relevance: 30.00%

Abstract:

The contribution of Starlette, Stella, and AJISAI is currently neglected when defining the International Terrestrial Reference Frame, despite a long time series of precise SLR observations and a huge amount of available data. The inferior accuracy of the orbits of low orbiting geodetic satellites is the main reason for this neglect. The Analysis Centers of the International Laser Ranging Service (ILRS ACs) do, however, consider including low orbiting geodetic satellites in the standard ILRS products, which are currently based on LAGEOS and Etalon satellites, in place of the sparsely observed and thus virtually negligible Etalons. We process ten years of SLR observations to Starlette, Stella, AJISAI, and LAGEOS and assess the impact of these Low Earth Orbiting (LEO) SLR satellites on the SLR-derived parameters. We study different orbit parameterizations, in particular different arc lengths and the impact of pseudo-stochastic pulses and dynamical orbit parameters on the quality of the solutions. We find that the repeatability of the East and North components of station coordinates, the quality of polar coordinates, and the scale estimates of the reference frame are improved when combining LAGEOS with low orbiting SLR satellites. In the multi-SLR solutions, the scale and the Z component of geocenter coordinates are less affected by deficiencies in solar radiation pressure modeling than in the LAGEOS-1/2 solutions, due to substantially reduced correlations between the Z geocenter coordinate and empirical orbit parameters. Finally, we find that the standard values of center-of-mass corrections (CoM) for geodetic LEO satellites are not valid for the currently operating SLR systems. The variations of station-dependent differential range biases reach 52 and 25 mm for AJISAI and Starlette/Stella, respectively, which is why estimating station-dependent range biases, or using station-dependent CoM instead of one value for all SLR stations, is strongly recommended. This clearly indicates that the ILRS effort to produce CoM corrections for each satellite, which are site-specific and depend on the system characteristics at the time of tracking, is very important and needs to be implemented in the SLR data analysis.

Relevance: 30.00%

Abstract:

BACKGROUND Vitamin D deficiency is prevalent in HIV-infected individuals, and vitamin D supplementation is proposed according to standard care. This study aimed at characterizing the kinetics of 25(OH)D in a cohort of HIV-infected individuals of European ancestry to better define the influence of genetic and non-genetic factors on 25(OH)D levels. These data were used for the optimization of vitamin D supplementation in order to reach therapeutic targets. METHODS 1,397 25(OH)D plasma levels and relevant clinical information were collected in 664 participants during medical routine follow-up visits. Participants were genotyped for 7 SNPs in 4 genes known to be associated with 25(OH)D levels. 25(OH)D concentrations were analyzed using a population pharmacokinetic approach. The percentage of individuals with 25(OH)D concentrations within the recommended range of 20-40 ng/ml during 12 months of follow-up was evaluated by simulation for several dosage regimens. RESULTS A one-compartment model with linear absorption and elimination was used to describe 25(OH)D pharmacokinetics, while integrating endogenous baseline plasma concentrations. Covariate analyses confirmed the effect of seasonality, body mass index, smoking habits, the analytical method, darunavir/r, and the genetic variant in GC (rs2282679) on 25(OH)D concentrations. 11% of the interindividual variability in 25(OH)D levels was explained by seasonality and other non-genetic covariates, and 1% by genetics. The optimal supplementation for severely vitamin D deficient patients was 300,000 IU twice per year. CONCLUSIONS This analysis identified factors associated with 25(OH)D plasma levels in HIV-infected individuals. Improvements to the dosage regimen and timing of vitamin D supplementation are proposed based on these results.
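The structural model described above (first-order absorption and elimination superimposed on an endogenous baseline) can be sketched with a Bateman-type function; all parameter values below are hypothetical placeholders, not the fitted population estimates:

```python
import math

def conc_25ohd(t, dose, baseline, ka, ke, vd):
    """25(OH)D concentration (ng/ml) at time t (days) after one oral dose.

    Bateman-type one-compartment kinetics (first-order absorption ka and
    elimination ke per day, apparent volume vd scaling dose to concentration
    units), superimposed on the endogenous baseline concentration.
    """
    bateman = (dose * ka / (vd * (ka - ke))) * (math.exp(-ke * t) - math.exp(-ka * t))
    return baseline + bateman

# Hypothetical illustration: a 300,000 IU bolus on top of a 15 ng/ml baseline.
for day in (0, 7, 30, 90, 180):
    c = conc_25ohd(day, dose=300_000, baseline=15.0, ka=0.5, ke=0.01, vd=5000)
    print(f"day {day:3d}: {c:5.1f} ng/ml")
```

With these placeholder parameters the concentration decays back toward baseline over roughly half a year, the kind of profile that would motivate a twice-yearly regimen; the actual fitted estimates are in the study itself.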

Relevance: 30.00%

Abstract:

The European Territorial Cohesion Policy has been the subject of numerous debates in recent years. Most contributions focus on understanding the term itself and figuring out what is behind it, or on arguing for or against a stronger formal competence of the European Union in this field. This article leaves these aspects aside and pays attention to the (undefined and legally non-binding) conceptual elements of territorial cohesion, focusing on the challenge of linking it with spatial policies and organising the relations between them. To this end, the theoretical approach of Cultural Theory and its concept of the clumsy solution are applied to overcome the dilemma of typical dichotomies by adding a third and a fourth (but not a fifth) perspective. In doing so, normative contradictions between different rational approaches can be revealed, explained and addressed with the concept of ‘clumsy solutions’. This contribution aims at discussing how this theoretical approach helps us explain and frame a coalition between the Territorial Cohesion Policy and spatial policies. This approach contributes to finding the best way of linking and organising policies, although the solution might be clumsy according to the different rationalities involved.

Relevance: 30.00%

Abstract:

We estimate the effects of climatic changes, as predicted by six climate models, on lake surface temperatures on a global scale, using the lake surface equilibrium temperature as a proxy. We evaluate interactions between different forcing variables, the sensitivity of lake surface temperatures to these variables, as well as differences between climate zones. Lake surface equilibrium temperatures are predicted to increase by 70 to 85 % of the increase in air temperatures. On average, air temperature is the main driver for changes in lake surface temperatures, and its effect is reduced by ~10 % by changes in other meteorological variables. However, the contribution of these other variables to the variance is ~40 % of that of air temperature, and their effects can be important at specific locations. The warming increases the importance of longwave radiation and evaporation for the lake surface heat balance compared to shortwave radiation and convective heat fluxes. We discuss the consequences of our findings for the design and evaluation of different types of studies on climate change effects on lakes.
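The lake surface equilibrium temperature used as a proxy above is the temperature at which the net surface heat flux vanishes. A minimal sketch with an illustrative, partly linearized heat balance (all coefficients hypothetical) solved by bisection:

```python
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_flux(T, T_air, sw_abs=150.0, lw_in=320.0, emissivity=0.97, h=20.0):
    """Net surface heat flux (W/m^2): absorbed shortwave plus incoming
    longwave, minus emitted longwave and a linearized sensible+latent term."""
    return sw_abs + lw_in - emissivity * SIGMA * T**4 - h * (T - T_air)

def equilibrium_temperature(T_air, lo=250.0, hi=330.0, tol=1e-6):
    """Bisection for the surface temperature (K) where net_flux crosses zero."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_flux(mid, T_air) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

T_e = equilibrium_temperature(T_air=288.15)  # 15 degC air temperature
print(f"equilibrium surface temperature: {T_e - 273.15:.1f} degC")
```

In this toy balance the sensitivity dT_e/dT_air = h / (h + 4εσT³) is about 0.8, i.e. the equilibrium temperature tracks somewhat less than the full change in air temperature, qualitatively in line with the 70 to 85% figure above.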

Relevance: 30.00%

Abstract:

Up to the present day, Sabina Spielrein has been seen as a means to a deeper understanding of Freud and Jung and, in particular, of the relationship between these two “great men”. This is also the reason why her scholarly achievements after her 1912 essay “Destruction as the Cause of Coming Into Being” are hardly taken into account. This study shows that Spielrein's main research work was in the areas of child analysis and developmental psychology—that is, beyond the work and the persons of Freud and Jung—and that she made numerous significant contributions to the field, many of them ahead of her time.

Relevance: 30.00%

Abstract:

The hadronic light-by-light contribution to the anomalous magnetic moment of the muon was recently analyzed in the framework of dispersion theory, providing a systematic formalism where all input quantities are expressed in terms of on-shell form factors and scattering amplitudes that are in principle accessible in experiment. We briefly review the main ideas behind this framework and discuss the various experimental ingredients needed for the evaluation of one- and two-pion intermediate states. In particular, we identify processes that in the absence of data for doubly-virtual pion–photon interactions can help constrain parameters in the dispersive reconstruction of the relevant input quantities, the pion transition form factor and the helicity partial waves for γ⁎γ⁎→ππ.
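The dispersive framework alluded to above reconstructs each input quantity from its imaginary part along the unitarity cut; schematically, for a suitable (possibly subtracted) amplitude or form factor F,

```latex
F(q^2) \;=\; \frac{1}{\pi} \int_{s_{\rm thr}}^{\infty} \mathrm{d}s \,
\frac{\operatorname{Im} F(s)}{s - q^2 - i\epsilon},
```

where unitarity fixes Im F(s) in terms of on-shell quantities such as the pion vector form factor and the γ⁎γ⁎ → ππ helicity partial waves, which is why the experimental ingredients listed above suffice in principle.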

Relevance: 30.00%

Abstract:

We analyze the pion transition form factor using dispersion theory. We calculate the singly-virtual form factor in the time-like region based on data for the e⁺e⁻ → 3π cross section, generalizing previous studies on ω,ϕ → 3π decays and γπ → ππ scattering, and verify our result by comparing to e⁺e⁻ → π⁰γ data. We perform the analytic continuation to the space-like region, predicting the poorly-constrained space-like transition form factor below 1 GeV, and extract the slope of the form factor at vanishing momentum transfer, a_π = (30.7 ± 0.6) × 10⁻³. We derive the dispersive formalism necessary for the extension of these results to the doubly-virtual case, as required for the pion-pole contribution to hadronic light-by-light scattering in the anomalous magnetic moment of the muon.
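The slope parameter quoted above is defined by the low-energy expansion of the singly-virtual form factor; assuming an unsubtracted dispersion relation (a schematic sum-rule form, not necessarily the subtraction scheme used by the authors), it can be written as

```latex
F_{\pi^0\gamma^*\gamma}(q^2)
= F_{\pi^0\gamma\gamma}(0)\Big[1 + a_\pi \frac{q^2}{M_{\pi^0}^2} + \cdots\Big],
\qquad
a_\pi = \frac{M_{\pi^0}^2}{\pi\, F(0)} \int_{s_{\rm thr}}^{\infty}
\mathrm{d}s\, \frac{\operatorname{Im} F(s)}{s^2}.
```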

Relevance: 30.00%

Abstract:

Neuroimaging (NI) technologies are having increasing impact in the study of complex cognitive and social processes. In this emerging field of social cognitive neuroscience, a central goal should be to increase the understanding of the interaction between the neurobiology of the individual and the environment in which humans develop and function. The study of sex/gender is often a focus for NI research, and may be motivated by a desire to better understand general developmental principles, mental health problems that show female-male disparities, and gendered differences in society. In order to ensure the maximum possible contribution of NI research to these goals, we draw attention to four key principles—overlap, mosaicism, contingency and entanglement—that have emerged from sex/gender research and that should inform NI research design, analysis and interpretation. We discuss the implications of these principles in the form of constructive guidelines and suggestions for researchers, editors, reviewers and science communicators.

Relevance: 30.00%

Abstract:

A measurement of the B_s^0 → J/ψφ decay parameters, updated to include flavor tagging, is reported using 4.9 fb⁻¹ of integrated luminosity collected by the ATLAS detector from √s = 7 TeV pp collisions recorded in 2011 at the LHC. The values measured for the physical parameters are:

φ_s = 0.12 ± 0.25 (stat) ± 0.05 (syst) rad
ΔΓ_s = 0.053 ± 0.021 (stat) ± 0.010 (syst) ps⁻¹
Γ_s = 0.677 ± 0.007 (stat) ± 0.004 (syst) ps⁻¹
|A_∥(0)|² = 0.220 ± 0.008 (stat) ± 0.009 (syst)
|A_0(0)|² = 0.529 ± 0.006 (stat) ± 0.012 (syst)
δ_⊥ = 3.89 ± 0.47 (stat) ± 0.11 (syst) rad

where the parameter ΔΓ_s is constrained to be positive. The S-wave contribution was measured and found to be compatible with zero. Results for φ_s and ΔΓ_s are also presented as 68% and 95% likelihood contours, which show agreement with the Standard Model expectations.

Relevance: 30.00%

Abstract:

A detailed characterization of air quality in the megacity of Paris (France) during two 1-month intensive campaigns and from additional 1-year observations revealed that about 70% of the urban background fine particulate matter (PM) is transported on average into the megacity from upwind regions. This dominant influence of regional sources was confirmed by in situ measurements during short intensive and longer-term campaigns, aerosol optical depth (AOD) measurements from ENVISAT, and modeling results from PMCAMx and CHIMERE chemistry transport models. While advection of sulfate is well documented for other megacities, there was a surprisingly high contribution from long-range transport for both nitrate and organic aerosol. The origin of organic PM was investigated by comprehensive analysis of aerosol mass spectrometer (AMS), radiocarbon and tracer measurements during two intensive campaigns. Primary fossil fuel combustion emissions constituted less than 20% (winter) and 40% (summer) of carbonaceous fine PM, unexpectedly small shares for a megacity. Cooking activities and, during winter, residential wood burning are the major primary organic PM sources. This analysis suggests that the major part of secondary organic aerosol is of modern origin, i.e., from biogenic precursors and from wood burning. Black carbon concentrations are on the lower end of values encountered in megacities worldwide, but still represent an issue for air quality. These comparatively low air pollution levels are due to a combination of low emissions per inhabitant, flat terrain, and a meteorology that is in general not conducive to local pollution build-up. This revised picture of a megacity only being partially responsible for its own average and peak PM levels has important implications for air pollution regulation policies.

Relevance: 30.00%

Abstract:

Bargaining is the building block of many economic interactions, ranging from bilateral to multilateral encounters and from situations in which the actors are individuals to negotiations between firms or countries. In all these settings, economists have long been intrigued by the fact that some projects, trades or agreements are not realized even though they are mutually beneficial. On the one hand, this has been explained by incomplete information. A firm may not be willing to offer a wage that is acceptable to a qualified worker, because it knows that there are also unqualified workers and cannot distinguish between the two types. This phenomenon is known as adverse selection. On the other hand, it has been argued that even with complete information, the presence of externalities may impede efficient outcomes. To see this, consider the example of climate change. If a subset of countries agrees to curb emissions, non-participant regions benefit from the signatories' efforts without incurring costs. These free-riding opportunities give rise to incentives to strategically improve one's bargaining power that work against the formation of a global agreement. This thesis is concerned with extending our understanding of both factors, adverse selection and externalities. The findings are based on empirical evidence from original laboratory experiments as well as game-theoretic modeling. On a very general note, it is demonstrated that the institutions through which agents interact matter to a large extent. Insights are provided about which institutions we should expect to perform better than others, at least in terms of aggregate welfare.

Chapters 1 and 2 focus on the problem of adverse selection. Effective operation of markets and other institutions often depends on good information transmission properties. In terms of the example introduced above, a firm is only willing to offer high wages if it receives enough positive signals about the worker's quality during the application and wage bargaining process. In Chapter 1, it is shown that repeated interaction coupled with time costs facilitates information transmission. By making the wage bargaining process costly for the worker, the firm is able to obtain more accurate information about the worker's type. The cost could be pure time cost from delaying agreement or cost of effort arising from a multi-step interviewing process. In Chapter 2, I abstract from time cost and show that communication can play a similar role. The simple fact that a worker states to be of high quality may be informative.

In Chapter 3, the focus is on a different source of inefficiency. Agents strive for bargaining power and thus may be motivated by incentives that are at odds with the socially efficient outcome. I have already mentioned the example of climate change. Other examples are coalitions within committees that are formed to secure voting power to block outcomes, or groups that commit to different technological standards although a single standard would be optimal (e.g. the format war between HD DVD and Blu-ray). It is shown that such inefficiencies are directly linked to the presence of externalities and a certain degree of irreversibility in actions.

I now discuss the three articles in more detail. In Chapter 1, Olivier Bochet and I study a simple bilateral bargaining institution that eliminates trade failures arising from incomplete information. In this setting, a buyer makes offers to a seller in order to acquire a good. Whenever an offer is rejected by the seller, the buyer may submit a further offer. Bargaining is costly, because both parties suffer a (small) time cost after any rejection. The difficulties arise because the good can be of low or high quality and the quality of the good is only known to the seller. Indeed, without the possibility to make repeated offers, it is too risky for the buyer to offer prices that allow for trade of high quality goods. When allowing for repeated offers, however, at equilibrium both types of goods trade with probability one. We provide an experimental test of these predictions. Buyers gather information about sellers using specific price offers, and rates of trade are high, matching the model's qualitative predictions. We also observe a persistent over-delay before trade occurs, which reduces efficiency substantially. Possible channels for over-delay are identified in the form of two behavioral assumptions missing from the standard model, loss aversion (buyers) and haggling (sellers), which reconcile the data with the theoretical predictions.

Chapter 2 also studies adverse selection, but interaction between buyers and sellers now takes place within a market rather than in isolated pairs. Remarkably, in a market it suffices to let agents communicate in a very simple manner to mitigate trade failures. The key insight is that better informed agents (sellers) are willing to truthfully reveal their private information, because by doing so they are able to reduce search frictions and attract more buyers. Behavior observed in the experimental sessions closely follows the theoretical predictions. As a consequence, costless and non-binding communication (cheap talk) significantly raises rates of trade and welfare. Previous experiments have documented that cheap talk alleviates inefficiencies due to asymmetric information; these findings are explained by pro-social preferences and lie aversion. I use appropriate control treatments to show that such considerations play only a minor role in our market. Instead, the experiment highlights the ability to organize markets as a new channel through which communication can facilitate trade in the presence of private information.

In Chapter 3, I theoretically explore coalition formation via multilateral bargaining under complete information. The environment studied is extremely rich in the sense that the model allows for all kinds of externalities. This is achieved by using so-called partition functions, which pin down a coalitional worth for each possible coalition in each possible coalition structure. It is found that although binding agreements can be written, efficiency is not guaranteed, because the negotiation process is inherently non-cooperative. The prospects of cooperation are shown to depend crucially on (i) the degree to which players can renegotiate and gradually build up agreements and (ii) the absence of a certain type of externalities that can loosely be described as incentives to free ride. Moreover, the willingness to concede bargaining power is identified as a novel reason for gradualism. Another key contribution of the study is that it identifies a strong connection between the Core, one of the most important concepts in cooperative game theory, and the set of environments for which efficiency is attained even without renegotiation.
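The partition functions used in Chapter 3 can be made concrete with a hypothetical three-player climate example in the spirit of the free-riding discussion above: the worth of an embedded coalition depends on the entire coalition structure, and a singleton gains when the others cooperate. All numbers are purely illustrative:

```python
# Hypothetical 3-player partition function: for each coalition structure
# (a partition of {1, 2, 3}), the worth of every embedded coalition.
# Symmetric structures such as ({1, 3}, {2}) are omitted for brevity.
partition_function = {
    (frozenset({1, 2, 3}),): {
        frozenset({1, 2, 3}): 12.0,   # grand coalition
    },
    (frozenset({1, 2}), frozenset({3})): {
        frozenset({1, 2}): 6.0,       # partial agreement
        frozenset({3}): 5.0,          # singleton free rides on {1, 2}
    },
    (frozenset({1}), frozenset({2}), frozenset({3})): {
        frozenset({1}): 3.0,
        frozenset({2}): 3.0,
        frozenset({3}): 3.0,          # no cooperation at all
    },
}

def total_worth(structure):
    """Aggregate worth generated under a given coalition structure."""
    return sum(partition_function[structure].values())

grand = (frozenset({1, 2, 3}),)
partial = (frozenset({1, 2}), frozenset({3}))
singletons = (frozenset({1}), frozenset({2}), frozenset({3}))

# The grand coalition is efficient (12 > 11 > 9), but an equal split of 12
# gives player 3 only 4.0, less than the 5.0 it earns by free riding.
print(total_worth(grand), total_worth(partial), total_worth(singletons))
```

The gap between the efficient aggregate worth and the singleton's free-riding payoff is exactly the externality-driven incentive problem that the chapter links to inefficiency.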