839 results for "life time"
Resumo:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Resumo:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Resumo:
With the fast growth of cancer research, new analytical methods are needed to measure anticancer drugs. This is usually accomplished with sophisticated analytical instruments. Biosensors are attractive candidates for measuring anticancer drugs, but few current biosensors can achieve this goal; in particular, it is challenging to devise a general method to monitor the various types of anticancer drugs with different structures. In this work, a biosensor was developed to detect anticancer drugs by modifying carbon paste electrodes with glutathione S-transferase (GST) enzymes. GST is widely studied in the metabolism of xenobiotics and is a major contributing factor in resistance to anticancer drugs. The measurement of anticancer drugs is based on competition between 1-chloro-2,4-dinitrobenzene (CDNB) and the drugs for the GST enzyme, monitored electrochemically at 0.1 V vs. Ag/AgCl by square wave voltammetry (SWV) or by a colorimetric method. The sensor shows a detection limit of 8.8 μM for cisplatin and exhibits a relatively long lifetime in daily measurements. (C) 2014 Elsevier B.V. All rights reserved.
Resumo:
Caprine arthritis-encephalitis virus is found in most regions of the world, especially in countries where goat dairying is highly technified, and it has already been detected in many Brazilian states. The disease causes great economic losses, adversely affecting the productive lifetime of the flock through low production, reduced reproductive performance, and loss of genetic potential, forcing constant renewal of the flock. Many authors call attention to the probable dissemination of the disease and reinforce the need to adopt strong sanitary policies for its control. Control programs are based essentially on knowledge of the target cells that harbor and distribute the virus in the host organism; on the routes of transmission; on the sensitivity and specificity of the serological tests, as well as on the frequency with which they are performed in the flock; and, finally, on the management to which the flock is subjected.
Resumo:
Memory, the ability to retain and recall past states of consciousness and everything that may be associated with them, is one of the human faculties responsible for the construction of experience and knowledge and for the preservation of individual identity. Recalling is also the way we maintain our relationship with time, a factor that is essential and inherent to our existence and responsible for the very fact that we have memory. For narrative, as for life, time is a primary category, because it is in time that narrated or experienced events take place. This relationship between time and memory is striking in "O burrinho pedrês", a short story by Guimarães Rosa from his debut book, Sagarana. In this story we can observe that, in general, the characters use memory as a way of returning to the past, evoking, consciously or unconsciously, experience and its result, wisdom, to help them understand present situations. This research therefore aims to analyze how time and memory interrelate and become indispensable to the development of the narrative and to the construction of its other categories: character, narrator, and space. The theoretical foundation rests on studies grouped into three strands: a) critical essays on Guimarães Rosa's work in general and on this short story in particular; b) philosophical and/or psychological propositions about memory and time; and c) propositions about the aforementioned categories of narrative.
Resumo:
Objective: Individuals with obsessive-compulsive disorder (OCD) and separation anxiety disorder (SAD) tend to present higher morbidity than those with OCD alone. However, the relationship between OCD and SAD has yet to be fully explored. Method: This was a cross-sectional study using multiple logistic regression to identify differences between OCD patients with SAD (OCD + SAD, n = 260) and without SAD (OCD, n = 695) in terms of clinical and socio-demographic variables. Data were extracted from those collected between 2005 and 2009 via the Brazilian Research Consortium on Obsessive-Compulsive Spectrum Disorders project. Results: SAD was currently present in only 42 (4.4%) of the patients, although 260 (27.2%) had a lifetime diagnosis of the disorder. In comparison with the OCD group, patients with OCD + SAD were more likely to present sensory phenomena, to undergo psychotherapy, and to have more psychiatric comorbidities, mainly bulimia. Conclusion: In patients with primary OCD, comorbid SAD might be related to greater personal dysfunction and a poorer response to treatment, since sensory phenomena may be a confounding aspect in diagnosis and therapeutics. Patients with OCD + SAD might be more prone to developing specific psychiatric comorbidities, especially bulimia. Our results suggest that SAD symptom assessment should be included in the management and prognostic evaluation of OCD, although the psychobiological role that such symptoms play in OCD merits further investigation. (C) 2014 Elsevier Masson SAS. All rights reserved.
Resumo:
The development of the digital electronics market is founded on the continuous reduction of transistor size, which reduces the area, power, and cost of integrated circuits while increasing their computational performance. This trend, known as technology scaling, is approaching the nanometer regime. The lithographic process in the manufacturing stage becomes more uncertain as transistor sizes scale down, resulting in larger parameter variation in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of threshold and supply voltages, increasing power density and creating local thermal issues such as hot spots, thermal runaway, and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects can no longer be addressed at the process level alone. Consequently, deep sub-micron devices will require solutions spanning several design levels, such as system and logic, and new approaches called Design for Manufacturability (DFM) and Design for Reliability. The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic- and system-level techniques able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) new analysis algorithms able to predict the thermal behavior of a system and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. These new analysis tools must be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase device reliability and lifetime by acting on tunable parameters such as supply voltage or body bias; ii) error-detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB), and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research, I investigated the need for thermal management in future embedded low-power Network-on-Chip (NoC) devices. I developed a thermal analysis library that has been integrated both in a cycle-accurate NoC simulator and in an FPGA-based NoC simulator. The results have shown that a careful layout distribution can avoid the onset of hot spots in a NoC chip, and that applying thermal management can reduce temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates integrating thermal analysis into the first design stages of embedded NoC design. Later, I focused my research on the development of a statistical process-variation analysis tool able to address both random and systematic variations. The tool was used to analyze the impact of self-timed asynchronous logic stages in an embedded microprocessor, and the results confirmed the capability of self-timed logic to improve manufacturability and reliability.
Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variation. Here we found that low-swing links are markedly more robust to systematic process variation and respond well to compensation techniques such as ASV and ABB. Low-swing signaling is therefore a good alternative to standard CMOS communication in terms of power, speed, reliability, and manufacturability. In summary, my work demonstrates the advantage of integrating a statistical process-variation analysis tool into the first stages of the design flow.
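To illustrate the kind of high-level wear-out model mentioned above, here is a minimal sketch of an Arrhenius-style temperature acceleration law (the form used in Black's equation for electromigration). The activation energy and prefactor values are purely illustrative assumptions, not figures from the thesis.

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K


def mttf_arrhenius(temp_c, ea_ev=0.7, a0=1.0e5):
    """Arrhenius-style MTTF estimate (arbitrary time units) at a given
    junction temperature in Celsius. ea_ev is the activation energy and
    a0 a technology-dependent prefactor -- both values are illustrative."""
    temp_k = temp_c + 273.15
    return a0 * math.exp(ea_ev / (K_B * temp_k))


# A hot spot 20 C above nominal shortens the predicted lifetime,
# which is why thermal management directly improves reliability:
mttf_nominal = mttf_arrhenius(85.0)
mttf_hotspot = mttf_arrhenius(105.0)
acceleration = mttf_nominal / mttf_hotspot  # > 1: hotter fails sooner
```

Under these illustrative parameters, the 20 °C rise cuts the predicted MTTF by roughly a factor of three, which is the quantitative motivation for reducing hot spots and thermal cycles at design time.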
Resumo:
In this thesis, laser atomic spectroscopy was performed for the first time on an element for which no atomic levels were previously known. The experiments were carried out on the element fermium (atomic number Z = 100) using resonance ionization spectroscopy (RIS) in a buffer-gas cell. The isotope 255Fm, with a half-life of 20.1 h, was used; it was produced in the high-flux reactor at ORNL, Oak Ridge, USA. Fm atoms evaporated from an electrochemically prepared filament into the argon buffer gas at a temperature of 960(20) °C were resonantly ionized by lasers in a two-step process. For this, the light of an excimer-laser-pumped dye laser was tuned around a wavelength of 400 nm for the first excitation step, and part of the excimer (XeF) pump light at wavelengths of 351/353 nm was used for the non-resonant ionization. The ions were extracted from the optical cell by electric fields and detected mass-selectively with a channeltron detector after a quadrupole mass filter. Despite the small sample of 2.7 x 10^10 atoms, two atomic resonances were found at energies of 25099.8(2) cm-1 and 25111.8(2) cm-1, and the saturation behavior of these lines was measured. A theoretical model was developed that describes both the spectral profile of the saturation-broadened lines and the saturation curves. By fitting it to the measured data, the partial transition rates into the 3H6 ground state were determined to be Aki = 3.6(7) x 10^6/s and Aki = 3.6(6) x 10^6/s. Comparison of the level energies and transition rates with multiconfiguration Dirac-Fock calculations suggests a spectroscopic classification of the observed levels as 5f12 7s7p 5I6 and 5G6 terms.
Furthermore, a transition at 25740 cm-1 was found which, on account of its observed linewidth of 1000 GHz, was interpreted as a Rydberg state with a level energy of 51480 cm-1 that can be excited via a two-photon process. Based on this assumption, an upper limit for the ionization energy of IP = 52140 cm-1 = 6.5 eV was estimated. In the measurements, shifts were observed in the time-distribution spectra between the mono-atomic ions Fm+ and Cf+ and the molecular ion UO+, and were attributed to differences in drift time in the electric field of the gas-filled optical cell. Under simple model assumptions, relative differences in the ionic radii of Delta_r(Fm+, Cf+)/r(Cf+) = -0.2 % and Delta_r(UO+, Cf+)/r(Cf+) = 20 % were inferred from them. From the decrease of the Fm alpha activity of the filament on the one hand, and the measured resonance count rate on the other, the detection efficiency of the apparatus was determined to be 4.5(3) x 10^-4. The apparatus was further developed with the goal of performing laser spectroscopy on the isotope 251Fm, which is to be produced directly in the optical cell via the reaction 249Cf(a,2n)251Fm. The procedure was tested on the chemical homologue erbium: the isotope 163Er was produced via the reaction 161Dy(a,2n)163Er and detected after resonance ionization. The detection efficiency of this method was determined to be 1 x 10^-4.
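The saturation behavior measured for the Fm resonances can be illustrated with a minimal two-level saturation model: the ion signal saturates as S = S_max · P/(P + P_sat), while the line is power-broadened as FWHM = FWHM_0 · sqrt(1 + P/P_sat). This is a generic textbook sketch, not the specific model developed in the thesis, and all parameter values are illustrative.

```python
import math


def saturation_signal(power, s_max, p_sat):
    """Resonance-ionization signal vs. laser power for a simple
    two-level saturation model: S = S_max * P / (P + P_sat)."""
    return s_max * power / (power + p_sat)


def broadened_fwhm(power, fwhm0, p_sat):
    """Power-broadened linewidth: FWHM = FWHM_0 * sqrt(1 + P/P_sat)."""
    return fwhm0 * math.sqrt(1.0 + power / p_sat)


# At P = P_sat the signal reaches half of its asymptotic maximum:
half = saturation_signal(1.0, s_max=100.0, p_sat=1.0)  # -> 50.0
# At P = 3 * P_sat the line is twice its natural width:
width = broadened_fwhm(3.0, fwhm0=1.0, p_sat=1.0)  # -> 2.0
```

Fitting both curves simultaneously to measured data is what allows the partial transition rates to be extracted, since P_sat is tied to the transition strength.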
Resumo:
The Gaia space mission is a major project for the European astronomical community. As challenging as it is, the processing and analysis of the huge data flow incoming from Gaia is the subject of thorough study and preparatory work by the DPAC (Data Processing and Analysis Consortium), in charge of all aspects of the Gaia data reduction. This PhD thesis was carried out in the framework of the DPAC, within the team based in Bologna. The task of the Bologna team is to define the calibration model and to build a grid of spectro-photometric standard stars (SPSS) suitable for the absolute flux calibration of the Gaia G-band photometry and the BP/RP spectrophotometry. Such a flux calibration can be performed by repeatedly observing each SPSS during the lifetime of the Gaia mission and by comparing the observed Gaia spectra to the spectra obtained by our ground-based observations. Because of the different observing sites involved and the huge number of frames expected (≃100000), it is essential to maintain the maximum homogeneity in data quality, acquisition, and treatment. Particular care must be taken to test the capabilities of each telescope/instrument combination (through the "instrument familiarization plan") and to devise methods to keep under control, and where necessary correct for, the typical instrumental effects that can affect the high precision required for the Gaia SPSS grid (a few % with respect to Vega). I contributed to the ground-based survey of Gaia SPSS in many respects: the observations, the instrument familiarization plan, the data reduction and analysis activities (both photometry and spectroscopy), and the maintenance of the data archives. The field I was personally responsible for, however, was photometry, in particular relative photometry for the production of short-term light curves.
In this context I defined and tested a semi-automated pipeline that performs the pre-reduction of SPSS imaging data and produces aperture-photometry catalogues ready for further analysis. A series of semi-automated quality-control criteria are included in the pipeline at various levels, from pre-reduction to aperture photometry to light-curve production and analysis.
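As an illustration of the aperture-photometry step in such a pipeline, here is a minimal NumPy sketch that sums the pixel values inside a circular aperture. It is a generic simplification, not the thesis pipeline: a real reduction would also estimate and subtract the sky background (typically from an annulus) and propagate uncertainties.

```python
import numpy as np


def aperture_sum(image, x0, y0, radius):
    """Simple circular-aperture photometry: sum the values of all
    pixels whose centers fall within `radius` of (x0, y0).
    Sky subtraction is deliberately omitted in this sketch."""
    yy, xx = np.indices(image.shape)
    mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
    return image[mask].sum()


# On a flat 5x5 patch of ones, a radius-1 aperture at the center
# covers the 5 pixels whose centers lie within distance 1:
img = np.ones((5, 5))
flux = aperture_sum(img, 2, 2, 1)  # -> 5.0
```

Relative photometry for light curves then divides such raw fluxes by those of comparison stars on the same frame, cancelling frame-to-frame transparency variations.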
Resumo:
Discotic hexa-peri-hexabenzocoronenes (HBCs), as well-defined molecular graphitic substructures, have long been the subject of studies on the delocalization of π-electrons. In this work, platinum complexes were additionally introduced into the peripheral substitution pattern of HBC. This improved the emission from the excited triplet state to the singlet ground state and additionally prolonged the lifetime of the excited state. Moreover, this configuration allowed fast intersystem crossing via enhanced spin-orbit coupling, which led exclusively to phosphorescence (T1→S0) both at low temperature and at room temperature. An understanding of such processes is also essential for the development of improved opto-electronic devices. The construction of precisely defined molecular structures tailored for specific interactions required the incorporation of hydrophobic-hydrophilic, hydrogen-bonded, or electrostatically functionalized units in order to control the supramolecular assembly. HBC derivatives functionalized with imidazolium salts were prepared for this purpose. An interesting property of these molecules is their amphiphilicity, which allowed their behavior in a polar solvent to be investigated, along with their processability and fiber formation on silicon-oxide substrates. Depending on the solvent and the chosen conditions, highly crystalline fibers could be obtained. By substituting the HBCs with long, sterically demanding side chains, homeotropic alignment on substrates could be achieved through suitable processing, which makes this material interesting for photovoltaic applications.
Novel polyphenylene-metal complexes with discotic, linear, and dendritic geometries were prepared via a simple reaction between Co2(CO)8 and ethynyl functionalities in dichloromethane. Pyrolysis of these complexes yielded various carbon nanoparticles, including nanotubes, graphitic nanorods, and carbon/metal hybrid complexes, which were examined by electron microscopy. The resulting structures depended on the composition and structure of the starting materials. These results open up several avenues for better understanding the mechanism that leads to the formation of graphitic nanoparticles.
Resumo:
Aseptic loosening of metal implants is mainly attributed to the formation of metal degradation products. These include particulate debris and corrosion products such as metal ions (anodic half-reaction) and ROS (cathodic half-reaction). While numerous clinical studies describe various adverse effects of metal degradation products, detailed knowledge of metal-induced cellular reactions, which might be important for possible therapeutic intervention, remains incomplete. Since endothelial cells are involved in inflammation and angiogenesis, two processes critical for wound healing and for the integration of metal implants, the effects of different metal alloys and their degradation products on these cells were investigated. Endothelial cells on Ti6Al4V alloy showed signs of oxidative stress, similar to the response of endothelial cells to the cathodic partial reaction of corrosion induced directly on Ti6Al4V surfaces. Furthermore, oxidative stress on Ti6Al4V alloy reduced the pro-inflammatory stimulation of endothelial cells by TNF-α and LPS. Oxidative stress and other stress-related responses were also observed in endothelial cells in contact with Co28Cr6Mo alloy. Importantly, these effects could be reduced by coating Co28Cr6Mo with a TiO2 layer, thus favouring the use of such surface modification in the development of medical devices for orthopaedic surgery. The reaction of endothelial cells to Co28Cr6Mo alloy was partially similar to the effects exerted by Co2+, which is known to be released from metal implants. Co2+ also induced ROS formation and DNA damage in endothelial cells, which correlated with p53 and p21 up-regulation, indicating the possibility of cell cycle arrest. Since CoCl2 is used as a hypoxia-mimicking agent, the HIF-1α dependence of cellular responses to Co2+ was studied in comparison with anoxia-induced effects.
Although important HIF-1α-dependent genes were identified, a more detailed analysis of the microarray data will be required to provide additional information about the mechanisms of Co2+ action. All these reactions of endothelial cells to metal degradation products might play a role in the complex processes taking place in the body following metal device implantation; in the worst case, these processes can lead to aseptic loosening of the implant and the need for revision surgery. Knowledge of the molecular mechanisms of metal-induced responses will hopefully make it possible to interfere with undesirable processes at the implant/tissue interface, thus extending the lifetime of the implant and improving the overall success of metal implant applications.
Resumo:
The medieval town of Leopoli-Cencelle (founded by Pope Leo IV in 854 AD, not far from Civitavecchia) has been the subject of study and periodic excavation campaigns since 1994. The stratigraphies, investigated with traditional methods, have brought to light the numerous transformations the town underwent over the course of its existence. Houses, towers, workshops, and occupation layers have, since the beginning of the excavation, been interpreted on the basis of traditional two-dimensional documentation, tied to paper records and drawings. The present work aims to re-interpret the excavation data with the aid of digital technologies. The project employed a laser scanner, Computer Vision techniques, and 3D modelling. The three methods were combined so as to visualize the excavated dwellings in three dimensions, with the possibility of superimposing simple 3D models that allow different hypotheses about the form and use of the spaces to be formulated. Modelling space and time while offering various options makes it possible to combine real three-dimensional data, acquired with a laser scanner, with simple philological 3D models, and offers the opportunity to evaluate different possible interpretations of a building's characteristics on the basis of its spaces, materials, and construction techniques. The aim of the project is to go beyond Virtual Reality, with the possibility of analyzing the remains and re-interpreting the function of a building both during and after the excavation. From a research standpoint, the possibility of visualizing hypotheses in the field fosters a deeper understanding of the archaeological context. A second objective is communication to a public of "non-archaeologists": ordinary visitors are offered the chance to understand and experience the interpretive process, giving them something more than a single definitive hypothesis.