27 results for Proportional counters.


Relevance: 10.00%

Abstract:

Background: Helicobacter pylori infection is usually acquired in early childhood and rarely resolves spontaneously. Eradication therapy is currently recommended for virtually all patients. Since first- and second-line therapies are prescribed without knowledge of the antibiotic resistance of the bacteria, it is important to know the primary resistance in the population. Aim: This study evaluates the primary resistance of H. pylori among patients in primary health care throughout Finland, the efficacy of three eradication regimens, the symptomatic response to successful therapy, and the effect of smoking on gastric histology and humoral response in H. pylori-positive patients. Patients and methods: A total of 23 endoscopy referral centres located throughout Finland recruited 342 adult patients with positive rapid urease test results who were referred to upper gastrointestinal endoscopy from primary health care. Gastric histology, H. pylori resistance and H. pylori serology were evaluated. The patients were randomized to receive a seven-day regimen comprising 1) lansoprazole 30 mg b.d., amoxicillin 1 g b.d. and metronidazole 400 mg t.d. (LAM), 2) lansoprazole 30 mg b.d., amoxicillin 1 g b.d. and clarithromycin 500 mg b.d. (LAC) or 3) ranitidine bismuth citrate 400 mg b.d., metronidazole 400 mg t.d. and tetracycline 500 mg q.d. (RMT). Eradication was assessed using the 13C-urea breath test 4 weeks after therapy. The patients completed a symptom questionnaire before and a year after the therapy. Results: Primary resistance of H. pylori to metronidazole was 48% among women and 25% among men. In women, metronidazole resistance correlated with previous use of antibiotics for gynaecological infections and with alcohol consumption. The resistance rate to clarithromycin was only 2%. Intention-to-treat cure rates of LAM, LAC and RMT were 78%, 91% and 81%, respectively. While the cure rates of LAM, LAC and RMT were similar in metronidazole-sensitive cases, in metronidazole-resistant cases LAM and RMT were inferior to LAC (53%, 67% and 84%, respectively). Previous antibiotic therapies reduced the efficacy of LAC to the level of RMT. Dyspeptic symptoms on the Gastrointestinal Symptom Rating Scale (GSRS) decreased by 30.5%. In logistic regression analysis, duodenal ulcer, gastric antral neutrophilic inflammation and age from 50 to 59 years independently predicted a greater decrease in dyspeptic symptoms. In the gastric body, smokers had milder inflammation and less atrophy, and in the antrum a denser H. pylori load. Smokers also had lower IgG antibody titres against H. pylori and a smaller proportional decrease in antibodies after successful eradication. Smoking tripled the risk of duodenal ulcer. Conclusions: In Finland, H. pylori resistance to clarithromycin is low, but metronidazole resistance among women is high, making metronidazole-based therapies unfavourable. Thus, LAC is the best choice for first-line eradication therapy. The effect of eradication on dyspeptic symptoms was only modest. Smoking slows the progression of atrophy in the gastric body.

Relevance: 10.00%

Abstract:

Solar flares were first observed with the naked eye, in white light, by Richard Carrington in England in 1859. Since then these eruptions in the solar corona have intrigued scientists. Flares are known to influence the space weather experienced by the planets in a multitude of ways, for example by causing the aurora borealis. Understanding flares is central to human survival in space, as astronauts cannot survive high doses of the energetic particles associated with large flares without contracting serious radiation sickness, unless they shield themselves effectively during space missions. Flares may have been central to survival in the past as well: it has been suggested that giant flares might have played a role in exterminating many of the large species on Earth, including the dinosaurs. On the other hand, prebiotic synthesis studies have shown lightning to be a decisive requirement for amino acid synthesis on the primordial Earth, and increased lightning activity could be attributed to space weather and flares. This thesis studies flares in two domains: the spectral and the spatial. We extracted solar spectra for the same flares using three different instruments, namely GOES (Geostationary Operational Environmental Satellite), RHESSI (Reuven Ramaty High Energy Solar Spectroscopic Imager) and XSM (X-ray Solar Monitor). The GOES spectra are low-resolution, obtained with a gas proportional counter; the RHESSI spectra are higher-resolution, obtained with germanium detectors; and the XSM spectra are very high-resolution, observed with a silicon detector. It turns out that the detector technology and response substantially influence the spectra we see and are important for deciding what conclusions to draw from the data. For imaging data there was no such luxury of choice: we used RHESSI imaging data to measure the spatial size of solar flares. The present work focuses primarily on present-day solar flares; however, we also used our improved understanding of solar flares to observe young suns in NGC 2547. The same techniques used with the solar monitors were applied to XMM-Newton, a stellar X-ray monitor, and coupled with ground-based Hα observations they yielded estimates for flare parameters in young suns. The material in this thesis is therefore structured from technology to application, covering the full processing path from raw data and detector responses to concrete physical parameter results, such as the first measurement of the length of flare plasma loops in young suns.
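
The point about detector responses can be illustrated with a minimal sketch (not the thesis code: the toy flare spectrum, the constant-width Gaussian redistribution and all parameter values are assumptions for illustration). Folding one and the same photon spectrum through responses of different energy resolution yields visibly different count spectra, which is why the response must be accounted for before drawing physical conclusions.

```python
import numpy as np

energies = np.linspace(1.0, 50.0, 500)        # keV grid
de = energies[1] - energies[0]

# Toy flare model: thermal-like exponential plus a power-law tail.
photons = 1e4 * np.exp(-energies / 2.0) + 10.0 * energies**-3.0

def response_matrix(e, fwhm):
    """Gaussian energy-redistribution matrix: column j gives the
    probability that a photon of true energy e[j] is recorded in
    channel e[i]. (Real responses vary with energy; a constant FWHM
    is assumed here for simplicity.)"""
    sigma = fwhm / 2.355
    r = np.exp(-0.5 * ((e[:, None] - e[None, :]) / sigma) ** 2)
    return r / r.sum(axis=0, keepdims=True)    # conserve counts per column

# Coarse resolution (proportional-counter-like) vs fine (solid-state-like).
counts_coarse = response_matrix(energies, fwhm=5.0) @ (photons * de)
counts_fine = response_matrix(energies, fwhm=0.3) @ (photons * de)
```

Plotting `counts_coarse` against `counts_fine` shows the coarse detector smearing out spectral features that the fine detector preserves, even though both record the same incident photons.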

Relevance: 10.00%

Abstract:

A novel method for functional lung imaging was introduced by adapting the K-edge subtraction method (KES) to in vivo studies of small animals. In this method two synchrotron radiation energies, which bracket the K-edge of the contrast agent, are used for simultaneous recording of absorption-contrast images. Stable xenon gas is used as the contrast agent, and imaging is performed in projection or computed tomography (CT) mode. Subtraction of the two images yields the distribution of xenon while removing practically all features due to other structures, and the xenon density can be calculated quantitatively. Because the images are recorded simultaneously, there are no movement artifacts in the subtraction image. The time resolution for a series of CT images is one image per second, which allows functional studies. The voxel size is 0.1 mm³, an order of magnitude better than in traditional lung imaging methods. The KES imaging technique was used to study ventilation distribution and the effects of histamine-induced airway narrowing in healthy, mechanically ventilated, anaesthetized rabbits. First, the effect of tidal volume on ventilation was studied; the results show that an increase in tidal volume without an increase in minute ventilation results in a proportional increase in regional ventilation. Second, spiral CT was used to quantify airspace volumes in the lungs under normal conditions and after histamine aerosol inhalation; the results showed large patchy filling defects in the peripheral lungs following histamine provocation. Third, the kinetics of the proximal and distal airway response to histamine aerosol were examined; the findings show that the distal airways react immediately to histamine and start to recover, while the reaction and recovery in the proximal airways are slower. Fourth, the fractal dimension of the lungs was studied; it was found to be higher in the apical part of the lungs than in the basal part, indicating structural differences between the apical and basal lung levels. These results provide new insights into lung function and the effects of drug challenges. The technique is currently available at synchrotron radiation facilities, but compact synchrotron radiation sources are being developed, and in the relatively near future the method may become usable in hospitals.
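
A minimal numerical sketch of the two-energy decomposition (illustrative only: the attenuation coefficients below are invented placeholders, not calibrated beamline values). Images recorded just above and just below the xenon K-edge (34.56 keV) give, per pixel, a 2×2 linear system for the xenon and tissue components; the subtraction isolates xenon because only its coefficient jumps across the edge.

```python
import numpy as np

# Hypothetical mass attenuation coefficients (cm^2/g): xenon jumps
# sharply across its K-edge while tissue barely changes.
MU = np.array([[25.0, 4.1],    # above edge: [mu_Xe, mu_tissue]
               [ 6.0, 4.0]])   # below edge: [mu_Xe, mu_tissue]

def xenon_density(log_att_above, log_att_below):
    """Solve the per-pixel 2x2 system MU @ [Xe area density,
    tissue area density] = measured log-attenuations, and return
    only the xenon component."""
    m = np.stack([log_att_above.ravel(), log_att_below.ravel()])
    rho = np.linalg.solve(MU, m)                  # shape (2, n_pixels)
    return rho[0].reshape(log_att_above.shape)    # xenon image
```

Because both images are acquired simultaneously, the system is solved with identical anatomy in both measurements, which is what suppresses movement artifacts in the subtraction image.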

Relevance: 10.00%

Abstract:

Atmospheric aerosol particles have a strong impact on the global climate. A deep understanding of the physical and chemical processes in the atmospheric aerosol-climate system is crucial for describing those processes properly in global climate models. Besides their climatic effects, aerosol particles can, for example, impair visibility and harm human health. Nucleation is a fundamental step in atmospheric new particle formation, yet the details of atmospheric nucleation mechanisms have remained unresolved. The main reason has been the lack of instruments capable of measuring neutral newly formed particles below 3 nm in diameter. This thesis aims to extend the detectable particle size range towards the close-to-molecular sizes (~1 nm) of freshly nucleated clusters and to obtain, by direct measurement, the concentrations of sub-3 nm particles both in the atmosphere and under well-defined laboratory conditions. In the work presented in this thesis, new methods and instruments for sub-3 nm particle detection were developed and tested. The selected approach comprises four different condensation-based techniques and one electrical detection scheme. All of them are capable of detecting particles with diameters well below 3 nm, some even down to ~1 nm. The developed techniques and instruments were deployed in field measurements as well as in laboratory nucleation experiments. Ambient air studies showed that a persistent population of 1-2 nm particles or clusters exists in the boreal forest environment. The observation was made with four different instruments, demonstrating a consistent capability for the direct measurement of atmospheric nucleation. The laboratory experiments showed that sulphuric acid is a key species in atmospheric nucleation. The mismatch between earlier laboratory data and ambient observations regarding the dependence of the nucleation rate on sulphuric acid concentration was explained: the reason was shown to lie in the inefficient growth of the nucleated clusters and in the insufficient detection efficiency of the particle counters used in the previous experiments. Even though the exact molecular steps of nucleation remain an open question, the instrumental techniques developed in this work, and their application in laboratory and ambient studies, opened a new view into atmospheric nucleation and prepared the way for investigating nucleation processes with more suitable tools.
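
The sulphuric-acid dependence at issue is commonly summarized in the field by a power law (the notation here is the standard parameterization, not a formula quoted from the thesis):

\[
J = k\,[\mathrm{H_2SO_4}]^{\,n},
\]

where J is the nucleation rate and k an empirical coefficient. Ambient observations typically give exponents n of about 1-2, whereas earlier laboratory experiments suggested much steeper dependences; the abstract attributes that discrepancy to inefficient cluster growth and insufficient counter detection efficiency in the earlier experiments rather than to the nucleation mechanism itself.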

Relevance: 10.00%

Abstract:

The study seeks to find out whether the real burden of personal taxation has increased or decreased. To determine this, we investigate how the same real income has been taxed in different years. Whenever the taxes on the same real income are higher in a given year than in the base year, the real tax burden has increased; if they are lower, it has decreased. The study thus seeks to estimate how changes in tax regulations affect the real tax burden. It should be kept in mind that the progression of the central government income tax schedule ensures that a real change in income will bring about a change in the tax ratio. In times of inflation, keeping the tax schedules nominally unchanged will likewise increase the real tax burden. In the calculations of the study the real income is assumed to remain constant, so that we obtain an unbiased measure of the effects of governmental actions in real terms. The main factors influencing the amount of income taxes an individual must pay are as follows:
- gross income (income subject to central and local government taxes);
- deductions from gross income and from taxes calculated according to the tax schedules;
- the central government income tax schedule (progressive income taxation);
- the rates of local taxes and social security payments (proportional taxation).
We investigate how much a certain group of taxpayers would have paid in taxes according to the actual tax regulations prevailing in different years if their income had been kept constant in real terms. Other factors affecting tax liability are kept strictly unchanged (as constants). The resulting taxes, expressed in fixed prices, are then compared to the taxes levied in the base year (hypothetical taxation). The question we are addressing is thus how much taxes a certain group of taxpayers with the same socioeconomic characteristics would have paid on the same real income under the actual tax regulations prevailing in different years. This has been suggested as the main way to measure real changes in taxation, although there are several alternative measures with essentially the same aim. Next, an aggregate indicator of changes in income tax rates is constructed, designed to show how much the taxation of income has increased or decreased, on average, from one year to the next. The main question is how the aggregation over all income levels should be performed. To determine the average real changes in the tax scales, the difference functions (differences between the actual and hypothetical taxation functions) were aggregated using taxable income as weights. Besides the difference functions, the relative changes in real taxes can be used as indicators of change. In this case the ratio between the taxes computed under the new and the old situation indicates whether taxation has become heavier or lighter. The relative changes in tax scales can be described in a way similar to that used in describing the cost of living, that is, by means of price indices. For example, we can use Laspeyres' price index formula to compute the ratio between the taxes determined by the new tax scales and the old tax scales. The formula answers the question: how much more or less will be paid in taxes according to the new tax scales than according to the old ones when the real income situation corresponds to the old situation?

In real terms, the central government tax burden declined steadily from its high post-war level up until the mid-1950s. The real tax burden then drifted upwards until the mid-1970s; the real level of taxation in 1975 was twice that of 1961. The 1980s were a steady phase owing to the inflation corrections of the tax schedules. In 1989 the tax schedule was lowered drastically, and from the mid-1990s onwards tax schedules have decreased the real tax burden significantly. Local tax rates have risen continuously, from 10 percent in 1948 to nearly 19 percent in 2008. Deductions have lowered the real tax burden especially in recent years. The aggregate figures indicate how the tax ratio for the same real income has changed over the years according to the prevailing tax regulations. We call the tax ratio calculated in this manner the real income tax ratio; a change in it depicts an increase or decrease in the real tax burden. The real income tax ratio declined for some years after the war, nearly doubled from the beginning of the 1960s to the mid-1970s, and has fallen by about 35% since the mid-1990s.
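
One way to write the Laspeyres-type comparison described above (the notation is introduced here for illustration and is not quoted from the study):

\[
I_L \;=\; \frac{\sum_i w_i\, T_{\text{new}}(y_i)}{\sum_i w_i\, T_{\text{old}}(y_i)},
\]

where the incomes y_i are fixed in real terms at the base-year situation, T_old and T_new are the tax functions implied by the old and new tax regulations, and the weights w_i are taxable incomes. A value I_L > 1 indicates that taxation of the same real income has become heavier, I_L < 1 that it has become lighter.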

Relevance: 10.00%

Abstract:

Earlier work has suggested that large-scale dynamos can reach and maintain equipartition field strengths on a dynamical time scale only if the magnetic helicity of the fluctuating field can be shed from the domain through open boundaries. Here this scenario is tested in convection-driven dynamos by comparing results for open and closed boundary conditions. Three-dimensional numerical simulations of turbulent compressible convection with shear and rotation are used to study the effects of boundary conditions on the excitation and saturation level of large-scale dynamos. Open (vertical-field) and closed (perfect-conductor) boundary conditions are used for the magnetic field. The contours of shear are vertical and cross the outer surface, and are thus ideally suited for driving a shear-induced magnetic helicity flux. We find that for a given shear and rotation rate, the growth rate of the magnetic field is larger when open boundary conditions are used. The growth rate first increases for small magnetic Reynolds number, Rm, but then levels off at an approximately constant value for intermediate values of Rm. For large enough Rm, a small-scale dynamo is excited and the growth rate in this regime increases proportionally to Rm^(1/2). In the nonlinear regime, the saturation level of the energy of the mean magnetic field is independent of Rm when open boundaries are used. In the case of perfect-conductor boundaries, the saturation level first increases as a function of Rm, but then decreases proportionally to Rm^(-1) for Rm > 30, indicative of catastrophic quenching. These results suggest that the shear-induced magnetic helicity flux is efficient in alleviating catastrophic quenching when open boundaries are used. The horizontally averaged mean field is, however, still weakly decreasing as a function of Rm even for open boundaries.
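
In compact form, the scalings reported above read (the symbols, a growth rate λ and a saturated mean-field energy, are introduced here for convenience):

\[
\lambda \propto \mathrm{Rm}^{1/2} \quad \text{(small-scale dynamo regime)}, \qquad
\overline{B}^{\,2}_{\mathrm{sat}} \propto
\begin{cases}
\mathrm{Rm}^{0} & \text{open (vertical-field) boundaries,}\\
\mathrm{Rm}^{-1} & \text{closed (perfect-conductor) boundaries, } \mathrm{Rm} \gtrsim 30,
\end{cases}
\]

with the Rm^(-1) branch being the signature of catastrophic quenching that the shear-induced helicity flux alleviates.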

Relevance: 10.00%

Abstract:

Molecular machinery on the micro-scale, believed to comprise the fundamental building blocks of life, involves forces of 1-100 pN and movements from nanometers to micrometers. Micromechanical single-molecule experiments seek to understand the physics of nucleic acids, molecular motors, and other biological systems through direct measurement of forces and displacements. Optical tweezers are a popular choice among several complementary techniques for sensitive force spectroscopy in single-molecule biology. The main objective of this thesis was to design and construct an optical tweezers instrument capable of investigating the physics of molecular motors and the mechanisms of protein/nucleic-acid interactions at the single-molecule level. A double-trap optical tweezers instrument incorporating acousto-optic trap steering, two independent detection channels, and a real-time digital controller was built. A numerical simulation and a theoretical study were performed to assess the signal-to-noise ratio in a constant-force molecular-motor stepping experiment. Real-time feedback control of optical tweezers was explored in three studies. Position clamping was implemented and compared to theoretical models using both proportional and predictive control. A force clamp was implemented and tested with a DNA tether in the presence of the enzyme lambda exonuclease. The results indicate that the presented models, describing the signal-to-noise ratio in constant-force experiments and feedback control in optical tweezers, agree well with the experimental data. The effective trap stiffness can be increased by an order of magnitude using the presented position-clamping method. The force clamp can be used for constant-force experiments, and the results of a proof-of-principle experiment, in which lambda exonuclease converts double-stranded DNA to single-stranded DNA, agree with previous research. The main objective of the thesis was thus achieved. The developed instrument and the presented results on feedback control serve as a stepping stone for future contributions to the growing field of single-molecule biology.
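
A minimal simulation sketch of a proportional position clamp (all parameter values are typical textbook numbers assumed here, not the instrument's calibration): the controller displaces the trap against the measured bead position, so the net restoring force becomes -k(1 + gain)·x and the effective stiffness rises by roughly the factor (1 + gain).

```python
import numpy as np

rng = np.random.default_rng(0)

kT = 4.11e-21      # thermal energy at room temperature (J)
gamma = 9.4e-9     # Stokes drag of a ~1 um bead in water (kg/s)
k = 5e-5           # trap stiffness (N/m)
dt = 1e-5          # feedback update interval (s)
gain = 10.0        # proportional gain; expect k_eff ~ (1 + gain) * k

x, x_trap = 0.0, 0.0
trace = []
for _ in range(200_000):
    # Euler-Maruyama step of the overdamped Langevin equation.
    thermal = np.sqrt(2 * kT * dt / gamma) * rng.standard_normal()
    x += -(k / gamma) * (x - x_trap) * dt + thermal
    x_trap = -gain * x     # proportional control: steer trap against displacement
    trace.append(x)

# Equipartition estimate of the effective stiffness from the position variance.
k_eff = kT / np.var(trace)
print(f"k_eff = {k_eff:.2e} N/m vs bare k = {k:.0e} N/m")
```

With these numbers the positional variance drops by about the factor 1 + gain, mirroring the order-of-magnitude stiffness increase reported above.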

Relevance: 10.00%

Abstract:

Physical inactivity has become a major threat to public health worldwide. Finnish health and welfare policies emphasize that the working population should maintain good health and functioning until the normal retirement age and remain healthy and independent later in life. Health behaviours such as physical activity potentially play an important role in reaching this target, as physical activity contributes to better physical fitness and to a reduced risk of major chronic diseases. The first aim of this study was to examine whether the volume and intensity of leisure-time physical activity affect subsequent physical health functioning, sickness absence and disability retirement. The second aim was to examine changes in moderate- and vigorous-intensity leisure-time physical activity after the transition to retirement. This study is part of the ongoing Helsinki Health Study. The baseline data were collected by questionnaires in 2000-02 among employees of the City of Helsinki aged 40 to 60, and the follow-up survey data were collected in 2007. Data on sickness absence were obtained from the employer's (City of Helsinki) sickness absence registers, and pension data were obtained from the Finnish Centre for Pensions. Leisure-time physical activity was measured in four grades of intensity and classified according to physical activity recommendations, considering both the volume and the intensity of activity. Statistical techniques including analysis of covariance, logistic regression, Cox proportional hazards models and Poisson regression were used. Employees who were vigorously active during leisure time, in particular, had better physical health functioning than the physically inactive. High physical activity especially contributed to the maintenance of good physical health functioning. High physical activity also reduced the risk of subsequent sickness absence, as well as the risk of all-cause disability retirement and of retirement due to musculoskeletal and mental causes. Among those who transferred to old-age retirement, moderate-intensity leisure-time physical activity increased on average by more than half an hour per week, and the occurrence of physical inactivity decreased; such changes were not observed among those who remained employed or transferred to disability retirement. This prospective cohort study provided novel results on the effects of leisure-time physical activity on health-related functioning and on changes in leisure-time physical activity after retirement. Although the health benefits of moderate-intensity physical activity are well known, these results suggest the importance of vigorous physical activity for subsequent health-related functioning. Thus vigorous, fitness-enhancing physical activity should be given more emphasis from a public health perspective. In addition, physical activity should be encouraged among those who are about to retire.
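
To make the survival-analysis setup concrete, here is a generic Cox proportional hazards sketch using the lifelines library (the data frame, variable names and values are invented for illustration and are not the Helsinki Health Study data; a real analysis would use the full cohort rather than this toy sample):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort: one row per employee, follow-up in years until
# disability retirement (event = 1) or censoring (event = 0), with
# baseline leisure-time physical activity as the exposure of interest.
df = pd.DataFrame({
    "followup_years":        [6.2, 7.0, 3.5, 7.0, 5.1, 7.0, 4.8, 7.0],
    "disability_retirement": [0,   1,   1,   0,   1,   0,   1,   0],
    "vigorous_activity":     [1,   1,   0,   1,   0,   0,   0,   1],
    "age":                   [44,  52,  58,  47,  55,  60,  57,  49],
    "female":                [1,   0,   1,   1,   0,   1,   0,   1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="disability_retirement")
cph.print_summary()  # exp(coef) < 1 for vigorous_activity => lower hazard
```

The hazard ratio exp(coef) for the activity variable is the quantity behind statements such as "high physical activity reduced the risk of all-cause disability retirement".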

Relevance: 10.00%

Abstract:

QCD factorization in the Bjorken limit makes it possible to separate the long-distance physics from the hard subprocess. At leading twist, only one parton in each hadron is coherent with the hard subprocess. Higher-twist effects increase as one of the active partons carries most of the longitudinal momentum of the hadron, x → 1. In the Drell-Yan process π N → μ⁻ μ⁺ + X, the polarization of the virtual photon is observed to change to longitudinal when the photon carries a fraction x_F > 0.6 of the pion momentum. I define and study the Berger-Brodsky limit of Q² → ∞ with Q²(1-x) fixed. A new kind of factorization holds in the Drell-Yan process in this limit, in which both pion valence quarks are coherent with the hard subprocess, the virtual photon is longitudinal rather than transverse, and the cross section is proportional to a multiparton distribution. Generalized parton distributions contain information on the longitudinal momentum and transverse position densities of partons in a hadron. Transverse charge densities are Fourier transforms of the electromagnetic form factors. I discuss the application of these methods to the QED electron, studying the form factors, charge densities and spin distributions of the leading-order |eγ⟩ Fock state in impact parameter and longitudinal momentum space. I show how the transverse shape of any virtual-photon-induced process, γ*(q) + i → f, may be measured. Qualitative arguments concerning the size of such transitions have previously been made in the literature, but without a precise analysis. Properly defined, the amplitudes and the cross section in impact parameter space provide information on the transverse shape of the transition process.
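
The Fourier-transform relation mentioned above has the standard form used in the literature (the conventions and the choice of the Dirac form factor F₁ are as commonly adopted, and may differ in detail from the thesis):

\[
\rho(b) \;=\; \int \frac{d^2 q_\perp}{(2\pi)^2}\; e^{-i\,\vec q_\perp \cdot\, \vec b}\; F_1(q_\perp^2)
\;=\; \int_0^\infty \frac{dQ}{2\pi}\, Q\, J_0(bQ)\, F_1(Q^2),
\]

where b is the impact parameter (the transverse distance from the transverse centre of momentum) and J₀ is a Bessel function; ρ(b) is then interpreted as the transverse charge density of the hadron.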

Relevance: 10.00%

Abstract:

The aim of this thesis was to examine the understanding of community in George Lindbeck's The Nature of Doctrine (ND). Intrinsic to this question was also examining how Lindbeck understands the relation between the text and the world, which meet in a Christian community. Third, this study aimed at understanding what the persuasiveness of this understanding depends on. The method applied was systematic analysis. The study first provides an orientation into the non-theological substance of the ND, assumed to be useful for the aim of this study. It then places Lindbeck in his own context of postliberal theology in order to see how the ND was received, and attempts to provide a picture of how the ND relates to Lindbeck as a theologian. The third chapter is a descriptive analysis of the cultural-linguistic perspective, which is understood as directly proportional to his understanding of community. The fourth chapter analyses how the cultural-linguistic perspective sees the relation between the text and the world. When religion is understood from a cultural-linguistic perspective, it presents itself as a cultural-linguistic entity, which Lindbeck understands as a comprehensive interpretive scheme that structures human experience and one's understanding of oneself and the world in which one lives. The entity shapes the subjectivities of all those who are at home in it, which makes participation in the life of a cultural-linguistic entity a condition for understanding it. Religion is above all an external word that moulds and shapes our religious existence and experience. Understanding faith as coming from hearing thus correlates with the cultural-linguistic depiction of reality: religion informs us of a religious reality; it does not originate in any way from ourselves. This externality, linked to the axiomatic nature of religion, also distinguishes Lindbeck sharply from liberalist tendencies, which understand religion as ultimately expressing the prereflective depths of the inner self. Language is the central analogy for understanding the medium in which one moves when inhabiting a cultural-linguistic system, because language is the transmitting medium in which the cultural-linguistic system is embodied. The realism entailed in Lindbeck's understanding of community is that we are fundamentally on the receiving end when it comes to our identities, whether cultural or religious; we always witness to something. Its persuasiveness rests on the fact that we never exist in an unpersuaded reality. The language of Christ is a self-sustaining and irreducible cultural-linguistic entity, ontologically founded upon Christ. It transmits the reality of a new being. The basic relation of a Christian to the world is that of witnessing salvation in Christ: witnessing Christ as the home of hearing the message of salvation, which is the God-willed way. Following this logic, the relation of the world and the text is one of relating to the world from the text, i.e. in Christ, through the word (text), for the world, because it assumes its logic from the way Christ ontologically relates to us.

Relevance: 10.00%

Abstract:

In past decades, agricultural work was first heavily mechanized, and automation has since entered the picture. Today, increasing the size of machines no longer raises productivity significantly; instead, work must be made more efficient by using existing resources more effectively. This study examines the self-propelled forage harvester chain in grass silage harvesting. The intensity of silage harvesting and the large number of machine units form a demanding combination from the standpoint of work management. The aim of the study was to identify requirements for a data management system to be developed in support of agricultural contracting. A total of 12 contractors or cooperating farmers were interviewed for the study. The study indicates that contractors have a need for information systems; naturally, the scale and organization of the contracting business affect this need. Based on the study, the most central requirements for data management are:
• data collection on the work performed that is as comprehensive, detailed and automatic as possible
• map-based operation, with guidance of drivers to work sites
• a customer register and electronic ordering of work
• templates for requests for quotations and price calculators
• reliability and preservation of data
• applicability to many kinds of work
• compatibility with other systems
Based on the study, the system to be developed should thus include the following components: an easy-to-use planning and customer-register tool; functions for machine monitoring, guidance and management; data collection during work; and functions for processing the collected data. Not all users need all functions, however, so a contractor must be able to select the components they need and possibly add functions later. Contractors operating within tight financial and time constraints are demanding customers whose technology must be functional and reliable. On the other hand, even experienced operators make human errors, so a good information system makes the work easier and more efficient.

Relevance: 10.00%

Abstract:

The prefrontal cortex (PFC), located in the anterior region of the frontal lobe, is considered to have several key roles in higher cognitive and executive functions. In general, the PFC can be seen as a coordinator of thought and action, allowing subjects to behave in a goal-directed manner. Owing to its anatomical connections with a variety of cortical and subcortical structures, several neurotransmitters, including dopamine, are involved in the regulation of PFC activity. In most brain regions the majority of released dopamine is cleared by the dopamine transporter (DAT). In the PFC, however, the number of presynaptic DAT is diminished, emphasizing the relative importance of catechol-O-methyltransferase (COMT) in dopamine metabolism. As a result, the role of COMT in the etiology of psychotic disorders is under constant debate. The present study investigated the role of COMT in prefrontal cortical dopamine metabolism using different neurochemical methods in COMT knockout (COMT-KO) mice. Pharmacological tools inhibiting other dopamine-clearing mechanisms were also used to obtain a more comprehensive picture. In addition, the study investigated how the lack of the soluble (S-)COMT isoform affects total COMT activity and the pharmacokinetics of orally administered L-dopa, using mutant mice expressing only the membrane-bound (MB-)COMT isoform. The role of COMT in striatal and accumbal dopamine turnover during Δ9-tetrahydrocannabinol (THC) challenge was also studied. We found markedly increased basal dopamine concentrations in the PFC, but not in the striatum or nucleus accumbens (NAcc), of mice lacking COMT. Pharmacological inhibition of the noradrenaline transporter (NET) and of monoamine oxidase (MAO) elevated prefrontal cortical dopamine levels severalfold, whereas inhibition of DAT did not. The lack of COMT doubled the dopamine-raising effects of NET and MAO inhibition. No compensatory expression of either DAT or NET was found in the COMT-KO mice. The lack of S-COMT decreased total COMT activity by 50-70% and modified dopamine transmission and the pharmacokinetics of exogenous L-dopa in a sex- and tissue-specific manner. Finally, we found that sequential tolcapone and THC administration increased dopamine levels in the NAcc, but not in the striatum. In conclusion, this study presents neurochemical evidence for the important role of COMT in the PFC and shows that COMT is responsible for about half of prefrontal cortical dopamine metabolism. It also highlights the previously underestimated proportional role of MB-COMT and supports the clinical evidence of a gene × environment interaction between COMT and cannabis.