854 results for Theory of electronic transport scattering mechanisms
Abstract:
Drug addiction manifests clinically as compulsive drug seeking and cravings that can persist and recur even after extended periods of abstinence. The fundamental principle that unites addictive drugs is that each one enhances synaptic dopamine (DA) by means that dissociate it from normal behavioral control, so that drugs act to reinforce their own acquisition. Our attention has focused on phenomena associated with the consumption of alcohol and heroin. Although alcohol has long been considered an unspecific pharmacological agent, recent molecular pharmacology studies have shown that it acts on distinct primary targets. Gene expression studies conducted recently have shown that the classical opioid receptors are differently involved in ethanol consumption and, furthermore, that the nociceptin/NOP system, a member of the endogenous opioid family, appears able to play a key role in the initiation of alcohol use in rodents. What emerges is that manipulation of the nociceptin opioid system may be useful in the treatment of addictions, and several lines of evidence support this strategy. The link between gene expression alterations and epigenetic modulation of the PDYN and PNOC promoters following alcohol treatment supports the chromatin-remodeling mechanism already proposed for alcoholism. In the second part of the present study, we also investigated alterations in signaling molecules directly associated with the MAPK pathway in a unique collection of postmortem brains from heroin abusers. The interest was focused on understanding the effects that prolonged exposure to heroin can have on the entire MAPK cascade and, consequently, on the transcription factor ELK1, which is regulated by this pathway. We have shown that activation of ERK1/2 results in Elk-1 phosphorylation in striatal neurons, supporting the hypothesis that prolonged exposure to substances of abuse causes a dysregulation of the MAPK pathway.
Abstract:
In this thesis we studied the quantization of a gauge theory of differential forms on complex spaces equipped with a Kaehler metric. The peculiarity of these theories lies in the fact that they exhibit reducible gauge invariances, that is, gauge invariances that are not independent of one another. Invariance under gauge transformations is one of the pillars of the modern understanding of the physical world. The main feature of such theories is that not all variables are actually dynamical: some turn out to be auxiliary. The reason this point of view is often preferred is that such theories are manifestly covariant under important symmetry groups, such as the Lorentz group. One of the most widely used methods for quantizing field theories with gauge symmetries requires the introduction of unphysical fields called ghosts and of a global fermionic symmetry that replaces the original local gauge invariance: the BRST symmetry. In this thesis we chose to use one of the most modern formalisms for treating gauge theories: the Lagrangian BRST formalism of Batalin-Vilkovisky. This method introduces ghosts for each degree of reducibility of the gauge transformations, together with suitable "antifields" associated with every field previously introduced. The formalism allowed us to arrive directly at a complete path-integral formulation of the quantum theory of (p,0)-forms. In particular, it correctly yields the ghost structure of the theory and the associated BRST symmetry. Obtaining this structure necessarily requires a gauge-fixing procedure that completely eliminates the invariance under gauge transformations.
This procedure eliminates the antifields in favor of the original fields and the ghosts, and makes it possible to implement, directly in the path integral, the covariant gauge-fixing conditions needed to properly define the propagators of the theory. In the last part we presented an expansion of the (Euclidean) effective action that allows the divergences of the theory to be studied. In particular, we computed the first coefficients of this expansion (the Seeley-DeWitt coefficients) using the heat-kernel technique. The calculation takes into account a possible coupling to a background metric as well as a possible additional coupling to the trace of the connection associated with the metric.
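The two key ingredients mentioned above can be written schematically; these are the standard textbook formulas of the Batalin-Vilkovisky formalism and of the heat-kernel expansion, not expressions taken from the thesis itself:

```latex
% Classical BV master equation for the extended action S[\phi,\phi^*]
% of fields and antifields; its solution encodes the full (reducible)
% ghost structure and the associated BRST symmetry:
\[
(S, S) \;=\; 2\,\frac{\partial_r S}{\partial \phi^a}\,
\frac{\partial_l S}{\partial \phi^*_a} \;=\; 0 .
\]
% Small-t heat-kernel expansion used to organize the one-loop
% divergences; the a_k(x) are the Seeley-DeWitt coefficients of the
% kinetic operator \Delta on a manifold of real dimension n:
\[
\mathrm{Tr}\, e^{-t\Delta} \;\sim\; \frac{1}{(4\pi t)^{n/2}}
\sum_{k \ge 0} t^{k} \int d^n x \,\sqrt{g}\; a_k(x),
\qquad t \to 0^+ .
\]
```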
Abstract:
The quest for universal memory is driving the rapid development of memories with superior all-round capabilities in non-volatility, high speed, high endurance and low power. The memory subsystem accounts for a significant share of the cost and power budget of a computer system, and current DRAM-based main memory systems are starting to hit their power and cost limits. To resolve this issue the industry is improving existing technologies such as Flash and exploring new ones. Among these new technologies is Phase Change Memory (PCM), which overcomes some of the shortcomings of Flash, such as durability and scalability. This alternative non-volatile memory technology, which exploits the resistance contrast of phase-change materials, offers higher density than DRAM and can help increase the main memory capacity of future systems while remaining within cost and power constraints. Chalcogenide materials can suitably be exploited for manufacturing phase-change memory devices. Charge transport in the amorphous chalcogenide GST used in memory devices is modeled using two contributions: hopping of trapped electrons and motion of band electrons in extended states. Crystalline GST exhibits an almost ohmic I(V) curve. In contrast, amorphous GST shows a high resistance at low bias while, above a threshold voltage, a transition takes place from a highly resistive to a conductive state, characterized by negative differential-resistance behavior. A clear and complete understanding of the threshold behavior of the amorphous phase is fundamental for exploiting such materials in the fabrication of innovative non-volatile memories. The type of feedback that produces the snapback phenomenon is described as a filamentation in energy controlled by electron–electron interactions between trapped electrons and band electrons. The model thus derived is implemented within a state-of-the-art simulator.
An analytical version of the model is also derived and is useful for discussing the snapback behavior and the scaling properties of the device.
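The two-contribution picture described above lends itself to a compact numerical illustration. The following sketch is not the thesis model (which is implemented in a device simulator); it only shows, with invented parameter values, how an ohmic band term plus a field-activated hopping term combine into a strongly nonlinear I(V) characteristic.

```python
import math

# Illustrative sketch: total current in amorphous GST approximated as the
# sum of the two contributions named in the text: (1) field-assisted hopping
# of trapped electrons and (2) ohmic conduction of band electrons in
# extended states. All parameter values are hypothetical.

K_B = 8.617e-5  # Boltzmann constant [eV/K]

def hopping_current(voltage, thickness=50e-9, temperature=300.0,
                    i0=1e-9, barrier=0.3, dz=5e-9):
    """Trapped-electron hopping: thermally activated, with a barrier
    lowering proportional to the local field times half the hop distance."""
    field = voltage / thickness          # [V/m]
    delta = field * dz / 2.0             # barrier lowering [eV]
    kt = K_B * temperature
    return i0 * math.exp(-barrier / kt) * 2.0 * math.sinh(delta / kt)

def band_current(voltage, g_band=1e-8):
    """Band electrons in extended states: ohmic contribution."""
    return g_band * voltage

def total_current(voltage):
    return hopping_current(voltage) + band_current(voltage)

# With these made-up numbers the ohmic band term dominates at low bias,
# while the exponentially growing sinh term of the hopping contribution
# takes over at higher bias, qualitatively reproducing the strong
# nonlinearity of the amorphous phase below threshold.
```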
Abstract:
This work deals with the theory of relativity and its diffusion in Italy in the first decades of the twentieth century. Not many scientists at Italian universities were actively engaged with relativity, but two of them, Max Abraham and Tullio Levi-Civita, left a deep mark. Max Abraham engaged in a substantial debate with Einstein between 1912 and 1914 on the electromagnetic and gravitational aspects of the theories. Levi-Civita played a fundamental role in giving Einstein the correct mathematical instruments for the formulation of general relativity from 1915 onward. This work, which does not aim at a mere historical chronicle of the events, seeks to highlight two particular perspectives: on the one hand, the importance of the Abraham-Einstein debate for clarifying the foundations of special relativity, for observing the rigorous logical structure that emerged from a fragmentary sequence of reasoning, and for understanding Einstein's thinking; on the other hand, the originality of Levi-Civita's approach, quite different from Einstein's, characterized by applying a method typical of general relativity even to special relativity and by the attempt to hide Einstein's two special-relativity postulates.
Abstract:
We introduce labelled sequent calculi for indexed modal logics. We prove that the structural rules of weakening and contraction are height-preserving admissible, that all rules are invertible, and that cut is admissible. Then we prove that each calculus introduced is sound and complete with respect to the appropriate class of transition frames.
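For readers unfamiliar with the format, labelled sequents mix labelled formulas with relational atoms; rules of the following standard shape (shown only to illustrate the style, not the indexed-modal rules actually introduced in the thesis) are typical:

```latex
% Standard labelled rules for a modal box, with xRy a relational atom
% and x:A a labelled formula (illustrative only):
\[
\frac{xRy,\ \Gamma \Rightarrow \Delta,\ y{:}A}
     {\Gamma \Rightarrow \Delta,\ x{:}\Box A}\; R\Box
\ \ (y \text{ fresh})
\qquad
\frac{xRy,\ x{:}\Box A,\ y{:}A,\ \Gamma \Rightarrow \Delta}
     {xRy,\ x{:}\Box A,\ \Gamma \Rightarrow \Delta}\; L\Box
\]
```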
Abstract:
The first part of this work deals with the solution of the inverse problem in the field of X-ray spectroscopy. An original strategy for solving the inverse problem using the maximum entropy principle is illustrated. The code UMESTRAT was built to apply the described strategy in a semiautomatic way, and its application is shown with a computational example. The second part of this work deals with the improvement of the X-ray Boltzmann model through the study of two radiative interactions neglected in current photon models. First, the characteristic line emission due to Compton ionization is studied. A strategy is developed that allows this contribution to be evaluated for the K, L and M shells of all elements with Z from 11 to 92. The single-shell Compton/photoelectric ratio is evaluated as a function of the primary photon energy, and the energies at which the Compton interaction becomes the prevailing ionization process for the considered shells are derived. Finally, a new kernel for the XRF from Compton ionization is introduced. Second, the bremsstrahlung radiative contribution due to secondary electrons is characterized in terms of space, angle and energy for all elements with Z = 1-92 in the energy range 1–150 keV, using the Monte Carlo code PENELOPE. It is demonstrated that this bremsstrahlung contribution can be well approximated by an isotropic point photon source. A data library comprising the energy distributions of the bremsstrahlung is created, and a new bremsstrahlung kernel is developed that allows this contribution to be introduced into the modified Boltzmann equation. An example of application to the simulation of a synchrotron experiment is shown.
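The maximum-entropy strategy for linear inverse problems can be illustrated with a generic SMART-style multiplicative scheme, whose fixed points maximize entropy subject to the data constraints. This is not the UMESTRAT code; the operator, data, and sizes below are toy values chosen only to show the idea.

```python
import numpy as np

def maxent_solve(A, y, n_iter=500, eps=1e-12):
    """Simultaneous multiplicative (SMART-like) update for y ≈ A x, x >= 0,
    starting from the flat (maximum-entropy) vector of ones."""
    m, n = A.shape
    x = np.ones(n)
    col_sum = A.sum(axis=0) + eps
    for _ in range(n_iter):
        ratio = y / (A @ x + eps)
        # Geometric-mean correction weighted by the system matrix columns
        x *= np.exp((A.T @ np.log(ratio + eps)) / col_sum)
    return x

# Toy demonstration: a Gaussian blurring operator and a known spectrum.
n = 40
x_true = np.exp(-0.5 * ((np.arange(n) - 15) / 3.0) ** 2)
A = np.array([[np.exp(-0.5 * ((i - j) / 2.0) ** 2) for j in range(n)]
              for i in range(n)])
y = A @ x_true
x_rec = maxent_solve(A, y)   # non-negative reconstruction of x_true
```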
Abstract:
Urban centers significantly contribute to anthropogenic air pollution, although they cover only a minor fraction of the Earth's land surface. Since the worldwide degree of urbanization is steadily increasing, the anthropogenic contribution to air pollution from urban centers is expected to become more substantial in future air quality assessments. The main objective of this thesis was to obtain a more profound insight into the dispersion and the deposition of aerosol particles from 46 individual major population centers (MPCs), as well as their regional and global influence on the atmospheric distribution of several aerosol types. For the first time, this was assessed in one model framework, for which the global model EMAC was applied with different representations of aerosol particles. First, in an approach with passive tracers and a setup in which the results depend only on the source location and the size and the solubility of the tracers, several metrics and a regional climate classification were used to quantify the major outflow pathways, both vertically and horizontally, and to compare the balance between pollution export away from and pollution build-up around the source points. Then, in a more comprehensive approach, the anthropogenic emissions of key trace species were changed at the MPC locations to determine the cumulative impact of the MPC emissions on the atmospheric aerosol burdens of black carbon, particulate organic matter, sulfate, and nitrate. Ten different mono-modal passive aerosol tracers were continuously released at the same constant rate at each emission point. The results clearly showed that on average about five times more mass is advected quasi-horizontally at low levels than exported into the upper troposphere. The strength of the low-level export is mainly determined by the location of the source, while the vertical transport is mainly governed by the lifting potential and the solubility of the tracers.
Similar to insoluble gas phase tracers, the low-level export of aerosol tracers is strongest at middle and high latitudes, while the regions of strongest vertical export differ between aerosol (temperate winter dry) and gas phase (tropics) tracers. The emitted mass fraction that is kept around MPCs is largest in regions where aerosol tracers have short lifetimes; this mass is also critical for assessing the impact on humans. However, the number of people who live in a strongly polluted region around urban centers depends more on the population density than on the size of the area affected by strong air pollution. Another major result was that fine aerosol particles (diameters smaller than 2.5 micrometers) from MPCs undergo substantial long-range transport, with about half of the emitted mass being deposited beyond 1000 km from the source. In contrast to this diluted remote deposition, there are areas around the MPCs which experience high deposition rates, especially regions which are frequently affected by heavy precipitation or are situated in poorly ventilated locations. Moreover, most MPC aerosol emissions are removed over land surfaces. In particular, forests experience more deposition from MPC pollutants than other land ecosystems. In addition, it was found that the generic treatment of aerosols has no substantial influence on the major conclusions drawn in this thesis. Moreover, in the more comprehensive approach, it was found that emissions of black carbon, particulate organic matter, sulfur dioxide, and nitrogen oxides from MPCs influence the atmospheric burden of various aerosol types very differently, with impacts generally being larger for the secondary species, sulfate and nitrate, than for the primary species, black carbon and particulate organic matter.
While the changes in the burdens of sulfate, black carbon, and particulate organic matter show an almost linear response to changes in the emission strength, the formation of nitrate was found to depend on many more factors (e.g., the abundance of sulfuric acid) than the strength of the nitrogen oxide emissions alone. The generic tracer experiments were further extended to conduct the first global-scale assessment of the cumulative risk of contamination from multiple nuclear reactor accidents. For this, many factors had to be taken into account: the probability of major accidents, the cumulative deposition field of the radionuclide cesium-137, and a threshold value that defines contamination. By collecting the necessary data and after accounting for uncertainties, it was found that the risk is highest in western Europe, the eastern US, and in Japan, where on average contamination by major accidents is expected about every 50 years.
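The risk logic described above combines three ingredients: accident probability, the deposition field, and a contamination threshold. A back-of-the-envelope sketch with purely hypothetical numbers (not values from the thesis) shows how they yield an expected return period at a given location:

```python
# Hypothetical illustration of the risk combination described in the text.
# All numbers below are invented placeholders.

def contamination_return_period(n_reactors, p_accident_per_reactor_year,
                                p_deposition_exceeds_threshold):
    """Expected years between contamination events at a location:
    the reciprocal of the expected number of contaminating accidents
    per year affecting that location."""
    events_per_year = (n_reactors * p_accident_per_reactor_year
                       * p_deposition_exceeds_threshold)
    return 1.0 / events_per_year

# Example: 150 reactors within transport range, one major accident per
# 3000 reactor-years, and a 10% chance that a given accident deposits
# cesium-137 above the contamination threshold at this location.
period = contamination_return_period(150, 1.0 / 3000.0, 0.10)
# period = 1 / (150 * (1/3000) * 0.1) = 1 / 0.005 = 200 years
```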
Abstract:
Design, testing and construction of electronic boards for studying the atmosphere under harsh environmental conditions.
Abstract:
Recent advances have revealed that during exogenous airway challenge, airway diameters cannot be adequately predicted by their initial diameters. Furthermore, airway diameters can also vary greatly in time on scales shorter than a breath. In order to better understand these phenomena, we developed a multiscale model which allows us to simulate aerosol challenge in the airways during ventilation. The model incorporates agonist-receptor binding kinetics to govern the temporal response of airway smooth muscle (ASM) contraction on individual airway segments, which, together with airway wall mechanics, determines local airway caliber. Global agonist transport and deposition are coupled with pressure-driven flow, linking local airway constrictions with global flow dynamics. During the course of challenge, airway constriction alters the flow pattern, redistributing agonist to less constricted regions. This results in a negative feedback which may be a protective property of the normal lung. As a consequence, repetitive challenge can cause spatial constriction patterns to evolve in time, resulting in a loss of predictability of airway diameters. Additionally, the model offers new insight into several phenomena, including the intra- and inter-breath dynamics of airway constriction throughout the tree structure.
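The local building block described above (binding kinetics driving smooth-muscle tone on a single airway segment) can be sketched as a one-state ODE. The parameters, the linear radius response, and the forward-Euler scheme are invented for illustration; the actual multiscale model also couples transport, wall mechanics and pressure-driven flow, which are omitted here.

```python
# Hypothetical single-segment sketch: agonist-receptor binding kinetics
#   db/dt = k_on * C * (1 - b) - k_off * b
# where b is the bound-receptor fraction, and airway radius shrinks
# linearly with b up to a maximum narrowing. All parameters are made up.

def simulate_segment(agonist_conc, t_end=60.0, dt=0.01,
                     k_on=0.5, k_off=0.1, max_narrowing=0.6, r0=1.0):
    """Forward-Euler integration of b(t); returns (b, radius) at t_end."""
    b = 0.0
    for _ in range(int(t_end / dt)):
        db = k_on * agonist_conc * (1.0 - b) - k_off * b
        b += dt * db
    radius = r0 * (1.0 - max_narrowing * b)
    return b, radius

b_lo, r_lo = simulate_segment(agonist_conc=0.1)
b_hi, r_hi = simulate_segment(agonist_conc=1.0)
# Higher local agonist concentration -> more bound receptors -> narrower
# airway; redistribution of agonist away from constricted segments would
# therefore act as the negative feedback described in the text.
```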
Abstract:
Background Through this paper, we present the initial steps in the creation of an integrated platform for the provision of a series of eHealth tools and services to both citizens and travelers in isolated areas of the southeast Mediterranean, and on board ships travelling across it. The platform was created through an INTERREG IIIB ARCHIMED project called INTERMED. Methods The support of primary healthcare, home care and the continuous education of physicians are the three major needs that the proposed platform tries to address. The proposed system is based on state-of-the-art telemedicine systems and is able to provide the following healthcare services: i) telecollaboration and teleconsultation services between remotely located healthcare providers, ii) telemedicine services in emergencies, iii) home telecare services for "at risk" citizens such as the elderly and patients with chronic diseases, and iv) eLearning services for the continuing training, through seminars, of both healthcare personnel (physicians, nurses, etc.) and persons supporting "at risk" citizens. These systems support data transmission over simple phone lines, internet connections, integrated services digital network/digital subscriber lines, satellite links, mobile networks (GPRS/3G), and wireless local area networks. The data include, among other types, voice, vital biosignals, still medical images, video, and data used by eLearning applications. The proposed platform comprises several systems, each supporting different services. These were integrated using a common data storage and exchange scheme in order to achieve system interoperability in terms of software, language and national characteristics. Results The platform has been installed and evaluated in different rural and urban sites in Greece, Cyprus and Italy. The evaluation was mainly related to technical issues and user satisfaction.
The selected sites include rural health centers, ambulances, homes of "at-risk" citizens, and a ferry. Conclusions The results demonstrated the functionality and usability of the platform in various rural places in Greece, Cyprus and Italy. However, further actions are needed to enable the local healthcare systems and the different population groups to become familiar with, and use in their everyday lives, mature technological solutions for the provision of healthcare services.
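The kind of common data storage and exchange scheme mentioned above can be illustrated with a minimal shared record type. The field names, values and serialization choice below are hypothetical, invented only to show the interoperability idea; they are not the INTERMED scheme.

```python
# Hypothetical sketch of a common exchange record that heterogeneous
# telemedicine subsystems (teleconsultation, home telecare, eLearning)
# could all read and write. Field names are invented for illustration.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class ExchangeRecord:
    patient_id: str
    site: str                 # e.g. rural health center, ambulance, ferry
    modality: str             # "biosignal", "image", "video", "voice"
    payload_ref: str          # reference to the stored data object
    language: str = "en"      # handling of national characteristics
    metadata: dict = field(default_factory=dict)

    def to_json(self) -> str:
        """Serialize to a transport-neutral form usable over any of the
        links listed in the text (phone line, GPRS/3G, satellite, ...)."""
        return json.dumps(asdict(self))

record = ExchangeRecord("p-001", "rural-hc-3", "biosignal", "store://ecg/42")
payload = record.to_json()
```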
Abstract:
Advanced electronic alerts (eAlerts) and computerised physician order entry (CPOE) increase adequate thromboprophylaxis orders among hospitalised medical patients. It remains unclear whether eAlerts maintain their efficacy over time, after withdrawal of continuing medical education (CME) on eAlerts and on thromboprophylaxis indications from the study staff. We analysed 5,317 hospital cases from the University Hospital Zurich during 2006-2009: 1,854 cases from a medical ward with eAlerts (intervention group) and 3,463 cases from a surgical ward without eAlerts (control group). In the intervention group, an eAlert with hospital-specific venous thromboembolism (VTE) prevention guidelines was issued in the electronic patient chart 6 hours after admission if no pharmacological or mechanical thromboprophylaxis had been ordered. Data were analysed for three phases: pre-implementation (phase 1), eAlert implementation with CME (phase 2), and post-implementation without CME (phase 3). The rates of thromboprophylaxis in the intervention group were 43.4% in phase 1 and 66.7% in phase 2 (p<0.001), and increased further to 73.6% in phase 3 (p=0.011). Early thromboprophylaxis orders within 12 hours after admission were placed more often in phases 2 and 3 than in phase 1 (67.1% vs. 52.1%, p<0.001). In the surgical control group, the thromboprophylaxis rates in the three phases were 88.6%, 90.7%, and 90.6% (p=0.16). Advanced eAlerts may provide sustained efficacy over time, with stable rates of thromboprophylaxis orders among hospitalised medical patients.