51 results for Flow Vector Tracking


Relevance: 20.00%

Abstract:

Part I: Parkinson's disease is a slowly progressive neurodegenerative disorder in which, in particular, the dopaminergic neurons of the substantia nigra pars compacta degenerate and die. Current conventional treatment alleviates symptoms but has no effect on the progression of the disease. Gene therapy research has focused on the possibility of restoring the lost brain function in at least two ways: substitution of the critical enzymes needed for the synthesis of dopamine, and slowing the progression of the disease by supporting the functions of the remaining nigral dopaminergic neurons with neurotrophic factors. Striatal levels of enzymes such as tyrosine hydroxylase, DOPA decarboxylase and GTP-CH1 decrease as the disease progresses. By replacing one or all of these enzymes, dopamine levels in the striatum may be restored to normal and the behavioral impairments caused by the disease may be ameliorated, especially in its later stages. The neurotrophic factors glial cell line-derived neurotrophic factor (GDNF) and neurturin have been shown to protect and restore the function of dopaminergic cell somas and terminals, as well as to improve behavior, in animal lesion models. This therapy may be best suited to the early stages of the disease, when more dopaminergic neurons remain for the neurotrophic factors to reach. Viral vector-mediated gene transfer provides a tool for delivering proteins with complex structures into specific brain locations and provides long-term protein over-expression.

Part II: The aim of our study was to investigate the effects of two orally dosed COMT inhibitors, entacapone (10 and 30 mg/kg) and tolcapone (10 and 30 mg/kg), followed by administration of the peripheral DOPA decarboxylase inhibitor carbidopa (30 mg/kg) and L-dopa (30 mg/kg), on dopamine and its metabolite levels in the dorsal striatum and nucleus accumbens of freely moving rats, using dual-probe in vivo microdialysis. Earlier, similarly designed studies have been conducted only in the dorsal striatum. We also confirmed the results of earlier ex vivo studies on the effects of intraperitoneally dosed tolcapone (30 mg/kg) and entacapone (30 mg/kg) on striatal and hepatic COMT activity. The results obtained from the dorsal striatum were generally in line with earlier studies: tolcapone tended to increase dopamine and DOPAC levels and to decrease HVA levels, whereas entacapone tended to keep striatal dopamine and HVA levels elevated longer than in controls and also tended to elevate DOPAC levels. Surprisingly, in the nucleus accumbens dopamine levels were not elevated after either dose of entacapone or tolcapone. Accumbal DOPAC levels, especially in the tolcapone 30 mg/kg group, were elevated to nearly the same extent as in the dorsal striatum. Entacapone 10 mg/kg elevated accumbal HVA levels more than the 30 mg/kg dose did, and the effect was more pronounced in the nucleus accumbens than in the dorsal striatum. This suggests that entacapone 30 mg/kg has minor central effects. Our ex vivo results from the dorsal striatum also suggest that the central effects of entacapone 30 mg/kg are minor and transient, even though central HVA levels were not suppressed below those of the control group in either brain area in the microdialysis study. Both entacapone and tolcapone suppressed hepatic COMT activity more than striatal COMT activity, and tolcapone was more effective than entacapone in the dorsal striatum. The differences in dopamine and metabolite levels between the dorsal striatum and the nucleus accumbens may be due to the different properties of the two brain areas.

Relevance: 20.00%

Abstract:

Agriculture is an economic activity that relies heavily on the availability of natural resources. Through its role in food production, agriculture is a major factor affecting public welfare and health, and its indirect contribution to gross domestic product and employment is significant. Agriculture also contributes to numerous ecosystem services through the management of rural areas. However, the environmental impact of agriculture is considerable and reaches far beyond the agroecosystems. The questions related to farming for food production are thus manifold and of great public concern. Improving the environmental performance of agriculture and the sustainability of food production, sustainabilizing food production, calls for the application of a wide range of expert knowledge. This study falls within the field of agro-ecology, with interfaces to food systems and sustainability research, and exploits methods typical of industrial ecology. Research in these fields extends from multidisciplinary to interdisciplinary and transdisciplinary, a holistic approach being the key tenet. The methods of industrial ecology have been applied extensively to explore the interaction between human economic activity and resource use. In particular, the material flow approach (MFA) has established its position through the application of systematic environmental and economic accounting statistics. However, very few studies have applied MFA specifically to agriculture. This thesis applies MFA in such a context in Finland.

The focus of this study is the ecological sustainability of primary production. The aim was to explore the possibilities of assessing the ecological sustainability of agriculture using two different approaches: in the first, MFA methods from industrial ecology were applied to agriculture, whereas the second is based on food consumption scenarios. The two approaches were used to capture some of the environmental impacts of dietary changes and of changes in production mode. The methods were applied at levels ranging from national to sector and local levels. Through the supply-demand approach, the viewpoint shifted from food production to food consumption. The main data sources were official statistics, complemented with published research results and expert appraisals. The MFA approach was used to define the system boundaries, to quantify the material flows and to construct eco-efficiency indicators for agriculture. The results were further elaborated into an input-output model that was used to analyse the food flux in Finland and to determine its relationship to economy-wide physical and monetary flows. The methods based on food consumption scenarios were applied at the regional and local level to assess the feasibility and environmental impacts of re-localising food production. The approach was also used for the quantification and source allocation of greenhouse gas (GHG) emissions of primary production. The GHG assessment thus provided a means of cross-checking the results obtained with the two different approaches.

MFA data, as such or expressed as eco-efficiency indicators, are useful for describing overall development. However, the data are not sufficiently detailed to identify the hot spots of environmental sustainability. Eco-efficiency indicators should not be used bluntly in environmental assessment: the carrying capacity of nature, the potential exhaustion of non-renewable natural resources and the possible rebound effect also need to be accounted for when striving towards improved eco-efficiency. The input-output model is suitable for nationwide economic analyses and shows the distribution of monetary and material flows among the various sectors. Environmental impact can be captured only at a very general level, in terms of total material requirement, gaseous emissions, energy consumption and agricultural land use. Improving the environmental performance of food production requires more detailed and more local information. The approach based on food consumption scenarios can be applied at regional or local scales. Based on various diet options, the method assesses the feasibility of re-localising food production and the environmental impacts of such re-localisation in terms of nutrient balances, gaseous emissions, agricultural energy consumption, agricultural land use and diversity of crop cultivation. The approach is applicable anywhere, but the calculation parameters need to be adjusted to the specific circumstances. The food consumption scenario approach thus pays attention to the variability of production circumstances and may provide environmental information that is locally relevant. The approaches based on the input-output model and on food consumption scenarios represent small steps towards more holistic systemic thinking. However, neither one alone, nor the two together, provides sufficient information for sustainabilizing food production. The environmental performance of food production should be assessed together with the other criteria of sustainable food provisioning. This requires the evaluation and integration of research results from many different disciplines in the context of a specified geographic area. A foodshed area that comprises both the rural hinterlands of food production and the population centres of food consumption is suggested as a suitable areal extent for such research. Finding a balance between the various aspects of sustainability is a matter of optimal trade-offs. The balance cannot be universally determined; the assessment methods and the actual measures depend on what the bottlenecks of sustainability are in the area concerned, and these have to be agreed upon among the actors of the area.
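
The input-output model mentioned above links final demand to economy-wide output. As a rough illustration of how such a model works, the sketch below solves the standard Leontief quantity relation x = (I - A)^-1 y for a hypothetical three-sector economy; the sector breakdown, coefficient matrix and demand vector are invented for illustration and are not taken from the thesis.

    # Standard Leontief quantity model: the total output x needed to satisfy a
    # final demand y, given the technical coefficient matrix A, is
    # x = (I - A)^-1 y. The three sectors and all numbers are hypothetical.
    import numpy as np

    sectors = ["agriculture", "food industry", "rest of economy"]

    # A[i, j]: input from sector i required per unit of output of sector j.
    A = np.array([
        [0.10, 0.30, 0.01],
        [0.05, 0.10, 0.04],
        [0.20, 0.25, 0.30],
    ])

    # Final demand, e.g. household food consumption, in the model's units.
    y = np.array([10.0, 50.0, 200.0])

    # Total sectoral output, including all intermediate deliveries.
    x = np.linalg.solve(np.eye(3) - A, y)

    for name, output in zip(sectors, x):
        print(f"{name}: total output {output:.1f}")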

Relevance: 20.00%

Abstract:

ALICE (A Large Ion Collider Experiment) is the LHC (Large Hadron Collider) experiment devoted to investigating the strongly interacting matter created in nucleus-nucleus collisions at LHC energies. The ALICE Inner Tracking System (ITS) consists of six cylindrical layers of silicon detectors based on three different technologies; in the outward direction: two layers of pixel detectors, two layers of drift detectors and two layers of strip detectors. About 13,000 parameters must be determined in the spatial alignment of the 2198 sensor modules of the ITS. The target alignment precision is well below 10 microns in some cases (pixels). The sources of alignment information include survey measurements and tracks reconstructed from cosmic rays and from proton-proton collisions. The main track-based alignment method uses the Millepede global approach; an iterative local method was developed and used as well. We present the results obtained for the ITS alignment using about 10^5 charged tracks from cosmic rays collected during the summer of 2008 with the ALICE solenoidal magnet switched off.
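
As a rough illustration of the idea behind the iterative local method, the toy sketch below estimates the transverse offsets of three parallel detector layers from track residuals and repeats the procedure until it stabilizes. All numbers are invented; the real ITS alignment determines thousands of correlated parameters in three dimensions with Millepede or the local method.

    # Toy 1-D illustration of an iterative local alignment: each layer's offset
    # is estimated from the residuals of straight tracks and corrected, and the
    # procedure is repeated. All numbers are invented; the real ITS alignment
    # determines about 13,000 correlated parameters in three dimensions.
    import numpy as np

    rng = np.random.default_rng(0)

    true_offsets = np.array([0.05, -0.03, 0.02])   # unknown layer shifts (cm)
    align = np.zeros(3)                            # alignment corrections so far
    n_tracks, resolution = 1000, 0.01              # number of tracks, hit resolution (cm)

    # For straight tracks crossing three parallel layers at the same local
    # coordinate, each measured hit = true crossing + layer shift + noise.
    crossing = rng.uniform(-1.0, 1.0, size=(n_tracks, 1))
    hits = crossing + true_offsets + rng.normal(0.0, resolution, (n_tracks, 3))

    for _ in range(5):
        corrected = hits - align
        for layer in range(3):
            # Track prediction for this layer: average of the other layers
            # (a crude refit); the mean residual updates the correction.
            prediction = np.delete(corrected, layer, axis=1).mean(axis=1)
            align[layer] += np.mean(corrected[:, layer] - prediction)
            corrected = hits - align

    # Tracks alone constrain only relative shifts: a common translation of all
    # layers (a "weak mode") remains undetermined, so the estimates can differ
    # from the true offsets by a common constant.
    print("estimated offsets:", np.round(align, 3))
    print("true offsets:     ", true_offsets)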

Relevance: 20.00%

Abstract:

We present three measurements of the top-quark mass in the lepton plus jets channel with approximately 1.9 fb^-1 of integrated luminosity collected with the CDF II detector, using quantities with minimal dependence on the jet energy scale. One measurement exploits the transverse decay length of b-tagged jets to determine a top-quark mass of 166.9 +9.5/-8.5 (stat) +/- 2.9 (syst) GeV/c^2, and another the transverse momentum of electrons and muons from W-boson decays to determine a top-quark mass of 173.5 +8.8/-8.9 (stat) +/- 3.8 (syst) GeV/c^2. These quantities are combined in a third, simultaneous mass measurement to determine a top-quark mass of 170.7 +/- 6.3 (stat) +/- 2.6 (syst) GeV/c^2.
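
The quoted combination comes from a simultaneous fit of both quantities. Purely to illustrate the arithmetic behind combining two measurements, the sketch below forms an inverse-variance weighted average of the two individual results, symmetrizing the asymmetric statistical errors and ignoring correlations and systematics; it lands near, but not exactly at, the published combined value.

    # Naive inverse-variance combination of the two individual measurements,
    # using symmetrized statistical errors only and ignoring correlations.
    # This illustrates the arithmetic, not the paper's simultaneous fit.
    import math

    measurements = [
        (166.9, 0.5 * (9.5 + 8.5)),   # decay-length analysis, GeV/c^2
        (173.5, 0.5 * (8.8 + 8.9)),   # lepton-pT analysis, GeV/c^2
    ]

    weights = [1.0 / err**2 for _, err in measurements]
    combined = sum(w * value for (value, _), w in zip(measurements, weights)) / sum(weights)
    combined_err = math.sqrt(1.0 / sum(weights))

    print(f"naive combination: {combined:.1f} +/- {combined_err:.1f} GeV/c^2 (stat only)")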

Relevance: 20.00%

Abstract:

We present the first observation in hadronic collisions of the electroweak production of vector boson pairs (VV, V = W, Z) where one boson decays to a dijet final state. The data correspond to 3.5 fb^-1 of integrated luminosity of pp̅ collisions at √s = 1.96 TeV collected by the CDF II detector at the Fermilab Tevatron. We observe 1516 ± 239 (stat) ± 144 (syst) diboson candidate events and measure a cross section σ(pp̅ → VV+X) of 18.0 ± 2.8 (stat) ± 2.4 (syst) ± 1.1 (lumi) pb, in agreement with the expectations of the standard model.
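
A short back-of-the-envelope reading of these numbers: the cross section follows from the usual relation σ = N_signal / (A · ε · ∫L dt), and the separately quoted uncertainty components can be added in quadrature if treated as independent. The acceptance-times-efficiency factor in the sketch below is back-solved from the quoted yield, luminosity and cross section, not taken from the paper.

    # Back-of-the-envelope arithmetic around the quoted result. The implied
    # acceptance-times-efficiency is solved from the numbers in the abstract,
    # not quoted by the paper; uncertainties are treated as independent.
    import math

    sigma_pb = 18.0                 # measured cross section
    lumi_pb = 3.5 * 1000.0          # 3.5 fb^-1 expressed in pb^-1
    n_signal = 1516.0               # diboson candidate yield

    acc_times_eff = n_signal / (sigma_pb * lumi_pb)
    total_unc = math.sqrt(2.8**2 + 2.4**2 + 1.1**2)

    print(f"implied A*eps ~ {acc_times_eff:.3f}")
    print(f"sigma = {sigma_pb:.1f} +/- {total_unc:.1f} pb (stat, syst and lumi in quadrature)")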

Relevance: 20.00%

Abstract:

We present the first observation in hadronic collisions of the electroweak production of vector boson pairs (VV, V = W, Z) where one boson decays to a dijet final state. The data correspond to 3.5 inverse femtobarns of integrated luminosity of ppbar collisions at sqrt(s) = 1.96 TeV collected by the CDF II detector at the Fermilab Tevatron. We observe 1516 +/- 239 (stat) +/- 144 (syst) diboson candidate events and measure a cross section sigma(ppbar -> VV+X) of 18.0 +/- 2.8 (stat) +/- 2.4 (syst) +/- 1.1 (lumi) pb, in agreement with the expectations of the standard model.

Relevance: 20.00%

Abstract:

We present the result of a search for a massive color-octet vector particle (e.g., a massive gluon) decaying to a pair of top quarks in proton-antiproton collisions with a center-of-mass energy of 1.96 TeV. This search is based on 1.9 fb$^{-1}$ of data collected using the CDF detector during Run II of the Tevatron at Fermilab. We study $t\bar{t}$ events in the lepton+jets channel with at least one $b$-tagged jet. A massive gluon is characterized by its mass, decay width, and the strength of its coupling to quarks. These parameters are determined from the observed invariant mass distribution of top-quark pairs. We set limits on the massive-gluon coupling strength for masses between 400 and 800 GeV$/c^2$ and width-to-mass ratios between 0.05 and 0.50. The coupling strength of the hypothetical massive gluon to quarks is consistent with zero within the explored parameter space.
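
To give a sense of what "characterized by its mass and decay width" means for the ttbar invariant-mass spectrum, the sketch below evaluates a generic relativistic Breit-Wigner resonance shape at one example point of the quoted scan range. This is only an illustration, not the paper's signal model, which also depends on the coupling strength to quarks.

    # Generic relativistic Breit-Wigner shape for a resonance characterized by
    # a mass and a width-to-mass ratio, evaluated across the ttbar mass range.
    # Example parameters only; this is not the paper's full signal model.
    import numpy as np

    def breit_wigner(m, mass, width):
        # Unnormalized shape, equal to 1 at the resonance peak.
        return (m * width) ** 2 / ((m**2 - mass**2) ** 2 + (m * width) ** 2)

    mass, width_over_mass = 600.0, 0.20            # one point of the quoted scan
    m_ttbar = np.linspace(350.0, 1000.0, 6)        # GeV/c^2

    for m, s in zip(m_ttbar, breit_wigner(m_ttbar, mass, mass * width_over_mass)):
        print(f"m(ttbar) = {m:6.1f} GeV/c^2 -> relative shape {s:.3f}")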

Relevance: 20.00%

Abstract:

In the thesis we consider inference for cointegration in vector autoregressive (VAR) models. The thesis consists of an introduction and four papers. The first paper proposes a new test for cointegration in VAR models that is based directly on the eigenvalues of the least squares (LS) estimate of the autoregressive matrix. In the second paper we compare a small-sample correction for the likelihood ratio (LR) test of cointegrating rank with the bootstrap. The simulation experiments show that the bootstrap works very well in practice and dominates the correction factor. The tests are applied to international stock price data, and the finite-sample performance of the tests is investigated by simulating the data. The third paper studies the demand for money in Sweden in 1970–2000 using the I(2) model. In the fourth paper we re-examine the evidence of cointegration between international stock prices. The paper shows that some of the previous empirical results can be explained by the small-sample bias and size distortion of Johansen's LR tests for cointegration. In all papers we work with two data sets. The first data set is a Swedish money demand data set with observations on the money stock, the consumer price index, gross domestic product (GDP), the short-term interest rate and the long-term interest rate. The data are quarterly and the sample period is 1970(1)–2000(1). The second data set consists of month-end stock market index observations for Finland, France, Germany, Sweden, the United Kingdom and the United States from 1980(1) to 1997(2). Both data sets are typical of the sample sizes encountered in economic data, and the applications illustrate the usefulness of the models and tests discussed in the thesis.
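
As a concrete example of the kind of rank test discussed here, the sketch below runs a Johansen trace (LR) test on a simulated two-variable system with one cointegrating relation, using the coint_johansen routine from statsmodels. The simulated data, lag order and deterministic-term choice are illustrative only and are not the thesis's data or test variants.

    # Johansen trace (LR) test for cointegrating rank on simulated data with
    # one common stochastic trend (true rank 1). Illustrative data and lag
    # choice only; coint_johansen is provided by statsmodels.
    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    rng = np.random.default_rng(1)
    n = 120                                   # about 30 years of quarterly data

    trend = np.cumsum(rng.normal(size=n))     # one common stochastic trend
    y1 = trend + rng.normal(scale=0.5, size=n)
    y2 = 0.8 * trend + rng.normal(scale=0.5, size=n)
    data = np.column_stack([y1, y2])

    result = coint_johansen(data, det_order=0, k_ar_diff=1)

    for r, (stat, cv95) in enumerate(zip(result.lr1, result.cvt[:, 1])):
        decision = "reject" if stat > cv95 else "do not reject"
        print(f"H0: rank <= {r}: trace = {stat:.2f}, 5% critical value = {cv95:.2f} -> {decision}")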

Relevance: 20.00%

Abstract:

Knowledge Flow, my dear friend! I would like to introduce you to a close relative of yours: Organizational Communication. You might want to take a moment to hear what your newfound kin has to say. As bright as you are, dear Flow, you are missing a piece of the puzzle - for one cannot study any aspect of an organization relating to communication without acknowledging the message. Without a message, communication does not exist. Organizational Communication has always appreciated this. Perhaps the time has come for you to join ranks and do so too? The main point of this work is to prove that the form of a message considerably affects communication, interpretation - and knowledge flow. As stories are at the heart of this thesis, and entertaining, reader-friendly communication its main argument, the entire manuscript is written in story form and intentionally breaks with academic writing tradition as far as writing style goes. Each chapter reads as a story of sorts, and put together they create a grand narrative of my journey as a PhD student, the research I have conducted and the outcomes of this work. Thus, if a reader hopes to make any sense of this thesis, she must read it in the same way one would read a novel, from beginning to end. This is a thesis with three aspirations. First, it sets out to prove that knowledge flow cannot be studied without a message. Second, it gives the reader a once-over of a much-used message form: storytelling. Once these two goals are tackled, the path is clear to investigate whether message form is indeed as essential as claimed. I do so through both a qualitative and a quantitative study. The former acted both as a stepping stone into the research area and as an inspirational pilot, from which the research design for the larger quantitative study was drawn. Together, these two studies answered my research question - and allowed me to fulfill the third, final and foremost aspiration of this study - bridging the gap between two separate fields of knowledge management: knowledge flow and storytelling.

Relevance: 20.00%

Abstract:

Forecasting daily flow in the general planning of municipal water supply and sewerage works (Vuorokausivirtaaman ennustaminen yhdyskuntien vesi- ja viemärilaitosten yleissuunnittelussa).

Relevance: 20.00%

Abstract:

The study presents a theory of utility models based on aspiration levels, as well as the application of this theory to the planning of timber flow economics. The first part of the study derives the utility-theoretic basis for the application of aspiration levels. Two basic models are dealt with: the additive and the multiplicative. Applied here solely to partial utility functions, aspiration and reservation levels are interpreted as defining piecewise linear functions. The decision-maker's point of view is emphasized through the use of indifference curves. The second part of the study introduces a model for the management of timber flows. The model is based on the assumption that the decision-maker is willing to specify a shape of income flow that differs from the capital-theoretic optimum. The utility model comprises four aspiration-based compound utility functions. The theory and the flow model are tested numerically by computations covering three forest holdings. The results show that the additive model is sensitive even to slight changes in relative importances and aspiration levels. This applies particularly to nearly linear production possibility boundaries of monetary variables. The multiplicative model, on the other hand, is stable because it generates strictly convex indifference curves. Due to a higher marginal rate of substitution, the multiplicative model implies a stronger dependence on forest management than the additive model. For income trajectory optimization, a method utilizing an income trajectory index is more efficient than one based on aspiration levels per management period. Smooth trajectories can be attained by penalizing the squared deviations of the feasible trajectories from the desired one.
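
To make the aspiration-level idea concrete, the sketch below builds piecewise-linear partial utilities from hypothetical reservation and aspiration levels and aggregates them with a simple additive form and a multiplicative (Cobb-Douglas-type) form. The functional forms, weights and income figures are illustrative assumptions and may differ from those used in the thesis.

    # Piecewise-linear partial utilities defined by (reservation, aspiration)
    # levels, aggregated additively and multiplicatively. All levels, weights
    # and income figures below are hypothetical, not taken from the thesis.

    def partial_utility(x, reservation, aspiration):
        # 0 at the reservation level, 1 at the aspiration level, clamped outside.
        if x <= reservation:
            return 0.0
        if x >= aspiration:
            return 1.0
        return (x - reservation) / (aspiration - reservation)

    def additive_utility(values, weights):
        return sum(w * v for v, w in zip(values, weights))

    def multiplicative_utility(values, weights):
        # One common multiplicative (Cobb-Douglas-type) aggregation.
        u = 1.0
        for v, w in zip(values, weights):
            u *= v ** w
        return u

    # Two monetary criteria, e.g. periodic incomes from a holding (hypothetical).
    incomes = [12000.0, 9000.0]
    levels = [(8000.0, 15000.0), (6000.0, 14000.0)]   # (reservation, aspiration)
    weights = [0.6, 0.4]

    u = [partial_utility(x, lo, hi) for x, (lo, hi) in zip(incomes, levels)]
    print("partial utilities:", [round(v, 3) for v in u])
    print("additive compound utility:      ", round(additive_utility(u, weights), 3))
    print("multiplicative compound utility:", round(multiplicative_utility(u, weights), 3))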