11 results for Myopic addiction
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
The aim of human movement analysis (HMA) is to measure a subject's ability to stand or to walk. In the field of HMA, tests are performed daily in research laboratories, hospitals and clinics to diagnose a disease, distinguish between disease entities, monitor the progress of a treatment and predict the outcome of an intervention [Brand and Crowninshield, 1981; Brand, 1987; Baker, 2006]. To achieve these purposes, clinicians and researchers use measurement devices such as force platforms, stereophotogrammetric systems, accelerometers and baropodometric insoles. This thesis focuses on the force platform (FP) and in particular on the quality assessment of FP data. The principal objective of our work was the design and experimental validation of a portable system for the in situ calibration of FPs. The thesis is structured as follows. Chapter 1: description of the physical principles underlying the functioning of an FP and of how these principles are used to build force transducers, such as strain gauges and piezoelectric transducers; description of the two categories of FPs (three- and six-component), of signal acquisition (hardware structure) and of signal calibration; finally, a brief description of the use of FPs in HMA, for balance or gait analysis. Chapter 2: description of inverse dynamics, the most common method used in the field of HMA. This method uses the signals measured by an FP to estimate kinetic quantities, such as joint forces and moments. These variables cannot be measured directly except with very invasive techniques; consequently, they can only be estimated using indirect techniques such as inverse dynamics. Finally, a brief description of the sources of error present in gait analysis. Chapter 3: state of the art in FP calibration.
The selected literature is divided into sections, each describing: systems for the periodic control of FP accuracy; systems for the reduction of errors in FP signals; and systems and procedures for the construction of an FP. In particular, a calibration system designed by our group, based on the theoretical method proposed by ?, is described in detail. This system was the "starting point" for the new system presented in this thesis. Chapter 4: description of the new system, divided into its parts: 1) the algorithm; 2) the device; and 3) the calibration procedure to be followed for the correct execution of the calibration process. The characteristics of the algorithm were optimised through a simulation approach, whose results are presented here. In addition, the different versions of the device are described. Chapter 5: experimental validation of the new system, achieved by testing it on 4 commercial FPs. The effectiveness of the calibration was verified by measuring, before and after calibration, the accuracy of the FPs in measuring the centre of pressure of an applied force. The new system can estimate local and global calibration matrices; by means of these, the non-linearity of the FPs was quantified and locally compensated. Furthermore, a non-linear calibration is proposed, which compensates the non-linear effect in FP functioning due to the bending of the upper plate. The experimental results are presented. Chapter 6: influence of FP calibration on the estimation of kinetic quantities with the inverse dynamics approach. Chapter 7: the conclusions of this thesis: the need for FP calibration and the consequent enhancement of kinetic data quality. Appendix: calibration of the load cell (LC) used in the presented system. Different calibration set-ups of a 3D force transducer are presented, and the optimal set-up is proposed, with particular attention to the compensation of non-linearities; the optimal set-up is verified by experimental results.
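The idea of a global calibration matrix and of the centre-of-pressure accuracy check can be sketched as follows. This is a minimal illustration: the matrix entries, signal values and the sign convention for the centre of pressure are assumptions for the example, not values from the thesis.

```python
import numpy as np

# Illustrative global calibration: corrected loads L = C @ s, where s holds
# the six raw platform outputs [Fx, Fy, Fz, Mx, My, Mz]. The off-diagonal
# entries of C model channel cross-talk; all numbers here are invented.
C = np.eye(6)
C[0, 2] = 0.005   # small Fz cross-talk into Fx
C[4, 0] = -0.002  # small Fx cross-talk into My

s = np.array([10.0, 5.0, 700.0, 14.0, -21.0, 1.0])  # raw outputs (N, N*m)
Fx, Fy, Fz, Mx, My, Mz = C @ s

# Centre of pressure on the top surface, with moments taken about a point
# on that surface (one common sign convention):
cop_x = -My / Fz
cop_y = Mx / Fz
```

Comparing the centre of pressure computed this way against the known application point of a test load is exactly the kind of before/after accuracy measure the validation in Chapter 5 relies on.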
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size, which reduces the area, power and cost of integrated circuits while increasing their computational performance. This trend, known as technology scaling, is approaching the nanometre scale. The lithographic process in the manufacturing stage is becoming more uncertain as transistor size scales down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of the threshold and supply voltages, increasing power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects can no longer be addressed at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard CAD automated design flow: i) the implementation of new analysis algorithms able to predict the thermal behaviour of the system and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system;
iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of the devices by acting on tunable parameters, such as supply voltage or body bias; ii) error-detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library that has been integrated both in a cycle-accurate NoC simulator and in an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates integrating thermal analysis into the first design stages of embedded NoC design. Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyse the impact of self-timed asynchronous logic stages in an embedded microprocessor, and the results confirmed the capability of self-timed logic to increase manufacturability and reliability.
Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variation. In this case we discovered the superior robustness of low-swing links to systematic process variation, together with their good response to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
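The kind of per-cycle thermal prediction described above can be illustrated with a minimal one-node lumped RC thermal model, a standard abstraction and not the thesis library; all parameter values here are invented for the sketch.

```python
def simulate_temperature(power_trace, dt=1e-3, r_th=2.0, c_th=0.5, t_amb=45.0):
    """Forward-Euler integration of a one-node lumped RC thermal model.

    power_trace: dissipated power per step (W); r_th: thermal resistance
    (K/W); c_th: thermal capacitance (J/K); t_amb: ambient temperature (C).
    dT/dt = (P - (T - t_amb) / r_th) / c_th. Values are illustrative only.
    """
    t = t_amb
    trace = []
    for p in power_trace:
        t += dt * (p - (t - t_amb) / r_th) / c_th
        trace.append(t)
    return trace

# For constant power P the temperature settles at t_amb + P * r_th,
# here 45 + 10 * 2 = 65 C after many thermal time constants:
temps = simulate_temperature([10.0] * 20000)
```

A per-tile version of such a model, driven by the power reported each cycle by the NoC simulator, is the simplest way to flag hot spots and count thermal cycles early in the design flow.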
Abstract:
The main task of this research is to investigate the drug situation in the city of Bologna. A first discussion concerns the method to adopt when studying an ethical question such as drugs actually are: it is widely known that the drug problem involves many political and religious considerations that are misleading from a scientific point of view. After a methodological chapter setting out the purpose of the research, a logical definition of drugs is discussed: an Aristotelian definition of drugs is examined with semantic instruments from the philosophy of language to fix the meaning of the terms. The following chapter discusses the personal stories of different people involved with drugs in the city, who represent the main characters of the drug subculture. Afterwards, the official drug-enforcement statistics are discussed and compared with a specific police operation, which makes it possible to criticise those data and to formulate some hypotheses about the quantities of drugs circulating in town. The next step is an investigation of drug addicts in town, using a validation technique based on database queries. The result is a statistical picture of users showing a prevalence of foreigners and non-resident Italians who come to use drugs in this city. Demographic analysis of the identified people shows that drug addiction is widely diffused across all age groups and mainly concerns males, with an increasing trend. The geographic distribution of users' residences and places of use is then examined, showing that drug abuse is spread among all classes of the population, while drug markets are located at certain points of the town, forming a kind of drug area with a concentration of dealers who are not organised together. Through some detailed queries of police report statistics, specific aspects of present-day drug abuse are studied: the phenomenon of multi-use, the relation between drugs and crime, and the relation between drugs and mental disease, recording some evidence on each topic.
Finally, a survey of the city media over the last two years shows the interest in this topic and gives an idea of the information available to public opinion about drugs. The study refers to the city of Bologna only, and concerns data recorded over the last ten years by the local metropolitan police corps.
Abstract:
Drug addiction manifests clinically as compulsive drug seeking and cravings that can persist and recur even after extended periods of abstinence. The fundamental principle that unites addictive drugs is that each one enhances synaptic dopamine (DA) by means that dissociate it from normal behavioural control, so that drugs act to reinforce their own acquisition. Our attention has focused on the study of phenomena associated with the consumption of alcohol and heroin. Although alcohol has long been considered an unspecific pharmacological agent, recent molecular pharmacology studies have shown that it acts on different primary targets. Gene expression studies conducted recently have shown that the classical opioid receptors are differently involved in ethanol consumption and, furthermore, that the nociceptin/NOP system, part of the endogenous opioid family, appears able to play a key role in the initiation of alcohol use in rodents. What emerges is that manipulation of the opioid and nociceptin systems may be useful in the treatment of addictions, and several lines of evidence support this strategy. The link between gene expression alterations and epigenetic modulation at the PDYN and PNOC promoters following alcohol treatment confirms the chromatin remodelling mechanism already proposed for alcoholism. In the second part of the present study, we also investigated alterations in signalling molecules directly associated with the MAPK pathway in a unique collection of postmortem brains from heroin abusers. The interest was focused on understanding the effects that prolonged exposure to heroin can cause in an individual over the entire MAPK cascade, and consequently on the transcription factor ELK1, which is regulated by this pathway. We have shown that the activation of ERK1/2 results in Elk-1 phosphorylation in striatal neurons, supporting the hypothesis that prolonged exposure to substances of abuse causes a dysregulation of the MAPK pathway.
Abstract:
Drug abuse is a major global problem with a strong impact not only on the individual but also on society as a whole. Among the different strategies that can be used to address this issue, an important role is played by the identification of abusers and by proper medical treatment. Such therapy should be carefully monitored in order to discourage improper use of the medication and to tailor the dose to the specific needs of the patient. Hence, reliable analytical methods are needed to reveal drug intake and to support physicians in the pharmacological management of drug dependence. In the present PhD thesis, original analytical methods are presented for the determination of drugs with a potential for abuse and of substances used in the pharmacological treatment of drug addiction. In particular, the work has focused on the analysis of ketamine, naloxone, long-acting opioids (buprenorphine and methadone), oxycodone, disulfiram and bupropion in human plasma and in dried blood spots. The developed methods are based on high-performance liquid chromatography (HPLC) coupled to various kinds of detectors (mass spectrometer, coulometric detector, diode array detector). For biological sample pre-treatment, different techniques have been exploited, namely solid phase extraction and microextraction by packed sorbent. All the presented methods have been validated according to official guidelines with good results, and some of them have been successfully applied to the therapeutic drug monitoring of patients under treatment for drug abuse.
Abstract:
The aim of this research is to compare two metropolitan contexts, Valencia and Bologna, with respect to job-placement support practices aimed at disadvantaged groups, in particular people with psychotropic substance dependence problems. The study compares several cross-cutting themes (the type of actions deployed, territorial organisation and governance, user profiles, social inclusion, involvement of the productive sector) and highlights the elements that allow us to identify both transferable good practices and design guidelines. It starts from the assumption that building a person's capabilities means first of all offering them adequate opportunities for choice, in Sen's sense and as explained by Nussbaum herself, but above all accompanying and supporting them along the path of occupational and, in parallel, social integration. The need that emerged is for motivational and guidance support following a socio-educational approach capable of giving the person an integrated, individualised response, one that acts on autonomy, self-esteem and the processing of one's own life and work experiences, as well as on contextual elements such as housing and family and friendship networks, which are often compromised. The distinctive element that makes it possible to act in this direction is the collaboration between the different services and the co-design of the pathway with the users themselves.
Job placement is a very complex topic involving several aspects: social change and the transformation of work; the emergence of new vulnerable groups and the risk of worsening exclusion for the "traditional" vulnerable groups; the importance of work in building identity and recognition; the impact of active labour policies on disadvantaged groups and the concepts of capacitation and activation; the role of social capital and the emergence of new welfare models; the network of actors involved in the placement process and the issue of territorial governance.
Abstract:
Objective: To evaluate the hypothesis that manual handling of loads may be a risk factor for retinal detachment. Methods: A multicentre hospital-based case-control study was conducted in Bologna (Ophthalmology ward of the S. Orsola-Malpighi polyclinic, Prof. Campos) and in Brescia (Ophthalmology ward of the "Spedali Civili" hospital, Prof. Semeraro). The cases were 104 patients operated on for retinal detachment. The controls were 173 patients recruited among the outpatients of the same wards as the cases. Both cases and controls (blind to the study hypothesis) were interviewed using a structured questionnaire covering individual characteristics, past diseases and occupational (and non-occupational) risk factors for retinal detachment. The data on manual handling of loads were used to build a "cumulative lifting index" (ICS: weight of the load lifted x number of lifts/hour x number of years of lifting). Odds ratios (OR) for the association between retinal detachment and various risk factors, including manual handling of loads, were computed with an unconditional logistic regression model adjusted for age and sex. Results: Besides ocular surgery and myopia (known risk factors), a positive trend emerged between increasing ICS and the risk of retinal detachment; the highest risk was observed for the severe-lifting category (OR 3.6, 95% CI 1.5-9.0). Conclusion: The results show a higher risk of developing retinal detachment for workers whose jobs involve manual handling of loads and, confirming the literature, also for myopic subjects and for those who underwent cataract surgery.
This highlights the importance of preventive interventions for workers engaged in manual handling of loads, particularly if they are myopic.
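The cumulative lifting index used in this study (weight of the load x lifts per hour x years of lifting) can be sketched as follows; the category cutoffs below are hypothetical, not those adopted in the study.

```python
def cumulative_lifting_index(load_kg, lifts_per_hour, years):
    """ICS = weight of the load x number of lifts/hour x years of lifting."""
    return load_kg * lifts_per_hour * years

def lifting_category(ics, moderate_cutoff=1000, severe_cutoff=5000):
    """Map an ICS value to an exposure category (cutoffs are invented)."""
    if ics == 0:
        return "none"
    if ics < moderate_cutoff:
        return "light"
    if ics < severe_cutoff:
        return "moderate"
    return "severe"

# e.g. a 15 kg load lifted 10 times per hour over 20 years of work:
ics = cumulative_lifting_index(15, 10, 20)  # 3000
```

In the study, a categorical index of this kind entered an unconditional logistic regression (adjusted for age and sex) as the exposure variable, yielding the odds ratios reported above.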
Abstract:
The nucleus accumbens (NAc), the major component of the mesocorticolimbic system, is involved in mediating the reinforcing properties of, and the dependence on, several substances of abuse. Glutamatergic synapses in the NAc can express plasticity, including an endocannabinoid (eCB)-dependent form of long-term depression (LTD). Recent studies have demonstrated an interaction between eCB signalling pathways and those of other receptor systems, including the serotonergic (5-HT) system; the extensive colocalisation of serotonergic and CB1 receptors in the NAc suggests a possible interaction between these two systems. In this study we found that a 20-minute, 4 Hz stimulation (LFS-4Hz) of the glutamatergic afferents in rat brain slices induces a novel form of eCB-LTD in the NAc core, which requires the activation of CB1 and 5-HT2 receptors and the opening of L-type voltage-dependent Ca2+ channels. We also found that exogenous application of 5-HT (5 µM, 20 min) induces an analogous LTD (5-HT-LTD) at the same synapses, requiring the activation of the same receptors and the opening of the same Ca2+ channels; LFS-4Hz-LTD and 5-HT-LTD are mutually occluding. These results suggest that LFS-4Hz induces the release of 5-HT, which binds postsynaptic 5-HT2 receptors, increasing Ca2+ influx through the L-type voltage-dependent channels and the production and release of 2-arachidonoylglycerol; the eCB travels retrogradely and binds presynaptic CB1 receptors, causing a long-lasting decrease in glutamate release that results in LTD. These observations may help clarify the neurophysiological mechanisms underlying drug addiction, major depression and other psychiatric disorders characterised by dysfunctional 5-HT neurotransmission in the NAc.
Abstract:
In the first chapter we develop a theoretical model of food consumption and body weight with a novel assumption about human caloric expenditure (i.e. metabolism), in order to investigate why individuals can be rationally trapped in an excessive-weight equilibrium and why they struggle to lose weight even when offered incentives for weight loss. This assumption allows the model to have multiple equilibria and to explain why losing weight is so difficult even in the presence of incentives, without relying on rational addiction, time-inconsistent preferences or bounded rationality. In addition, we characterise the circumstances under which a temporary incentive can create a persistent weight loss. In the second chapter we investigate the possible contributions of social norms and peer effects to the spread of obesity. In the recent literature, peer effects and social norms have been characterised as important pathways for the biological and behavioural spread of body weight, along with decreased food prices and physical activity. We add to this literature by proposing a novel concept of social norm related to what we define as social distortion in weight perception. The theoretical model shows that, in equilibrium, the effect of an increase in peers' weight on individual i's weight is unrelated to health concerns and mainly associated with social concerns. Using regional data from England, we show that this social component significantly influences individual weight. In the last chapter we investigate the relationship between body weight and employment probability. Using a semi-parametric regression, we show that the employment probability of men and women does not follow a linear relationship with body mass index (BMI) but rather an inverted U-shaped one, peaking at a BMI well above the clinical threshold for overweight.
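The last chapter's finding is estimated semi-parametrically; as a minimal parametric stand-in, a logistic model quadratic in BMI reproduces an inverted-U employment probability. The coefficients below are invented for illustration and place the peak at BMI 30, above the clinical overweight cutoff of 25; they are not estimates from the thesis.

```python
import math

def employment_prob(bmi, b0=-3.0, b1=0.30, b2=-0.005):
    """Logistic model quadratic in BMI (invented coefficients).

    With b2 < 0 the probability is inverted-U shaped in BMI,
    peaking at bmi = -b1 / (2 * b2).
    """
    z = b0 + b1 * bmi + b2 * bmi ** 2
    return 1.0 / (1.0 + math.exp(-z))

peak_bmi = -0.30 / (2 * -0.005)  # 30.0, above the overweight cutoff of 25
```

A semi-parametric fit lets the data choose this shape instead of imposing the quadratic, but the qualitative pattern, rising then falling employment probability with a peak past the overweight threshold, is the same.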
Abstract:
This thesis work aims to develop original analytical methods for the determination of drugs with a potential for abuse, for the analysis of substances used in the pharmacological treatment of drug addiction in biological samples, and for the monitoring of potentially toxic compounds added to street drugs. Reliable analytical techniques can play an important role in this setting: they can be employed to reveal drug intake, allowing the identification of drug users, and to assess drug blood levels, assisting physicians in the management of the treatment. Pharmacological therapy indeed needs to be carefully monitored in order to optimise dose scheduling according to the specific needs of the patient and to discourage improper use of the medication. In particular, different methods have been developed for the detection of gamma-hydroxybutyric acid (GHB), prescribed for the treatment of alcohol addiction; of glucocorticoids, one of the pharmaceutical classes most abused to enhance sport performance; and of adulterants, pharmacologically active compounds added to illicit drugs for recreational purposes. All the presented methods are based on capillary electrophoresis (CE) and high-performance liquid chromatography (HPLC) coupled to various detectors (diode array detector, mass spectrometer). Biological sample pre-treatment was carried out using different extraction techniques, liquid-liquid extraction (LLE) and solid phase extraction (SPE). Different matrices have been considered: human plasma, dried blood spots, human urine and simulated street drugs. The developed analytical methods are individually described and discussed in this thesis work.