Abstract:
Introduction: Apoptotic cell death of cardiomyocytes is involved in several cardiovascular diseases, including ischemia, hypertrophy and heart failure, and thus represents a potential therapeutic target. Apoptosis of cardiac cells can be induced experimentally by several stimuli, including hypoxia, serum withdrawal or a combination of both. Several lines of research suggest that neurohormonal mechanisms play a central role in the progression of heart failure. In particular, excessive activation of the sympathetic nervous system or the renin-angiotensin-aldosterone system is known to have deleterious effects on the heart. Recent studies report that norepinephrine (NE), the primary transmitter of the sympathetic nervous system, and aldosterone (ALD), which is actively produced in the failing human heart, are able to induce apoptosis of rat cardiomyocytes. Polyamines are biogenic amines involved in many cellular processes, including apoptosis. Indeed, it appears that these molecules can act as promoting, modulating or protective agents in apoptosis, depending on the apoptotic stimulus and the cellular model. We have studied the involvement of polyamines in the apoptosis of cardiac cells induced in a model of simulated ischemia and following treatment with NE or ALD. Methods: H9c2 cardiomyoblasts were exposed to a condition of simulated ischemia, consisting of hypoxia plus serum deprivation. Cardiomyocyte cultures were prepared from 1-3 day-old neonatal Wistar rat hearts. Polyamine depletion was obtained by culturing the cells in the presence of α-difluoromethylornithine (DFMO). Polyamines were separated and quantified in acidic cellular extracts by HPLC after derivatization with dansyl chloride. Caspase activity was measured by the cleavage of a fluorogenic peptide substrate. Ornithine decarboxylase (ODC) activity was measured by estimating the release of 14C-CO2 from 14C-ornithine.
DNA fragmentation was visualized by terminal transferase-mediated dUTP nick end-labeling (TUNEL) and by DNA laddering on agarose gel electrophoresis. Cytochrome c was detected by immunofluorescent staining. Activation of signal transduction pathways was investigated by western blotting. Results: The results indicate that simulated ischemia, NE and ALD cause an early induction of the activity of ornithine decarboxylase (ODC), the first enzyme in polyamine biosynthesis, followed by a later increase in the activity of caspases, a family of proteases that execute the death program and induce cell death. This effect was prevented in the presence of DFMO, an irreversible inhibitor of ODC, suggesting that polyamines are involved in the execution of the death program activated by these stimuli. In H9c2 cells, DFMO inhibits several molecular events related to apoptosis that follow simulated ischemia, such as the release of cytochrome c from mitochondria, down-regulation of Bcl-xL, and DNA fragmentation. The anti-apoptotic protein survivin is down-regulated after ALD or NE treatment, and polyamine depletion obtained by DFMO partially opposes the decrease in survivin. Moreover, a study of key signal transduction pathways governing cell death and survival revealed an involvement of AMP-activated protein kinase (AMPK) and AKT kinase in the modulation by polyamines of the response of cardiomyocytes to NE. In fact, polyamine-depleted cells show an altered pattern of AMPK and AKT activation that may counteract apoptosis and appears to result from a differential effect on the specific phosphatases that dephosphorylate and switch off these signaling proteins. Conclusions: These results indicate that polyamines are involved in the execution of the death program activated in cardiac cells by heart failure-related stimuli, such as ischemia, ALD and NE, and suggest that their apoptosis-facilitating action is mediated by a network of specific phosphatases and kinases.
Abstract:
In this work I address the study of language comprehension in an “embodied” framework. First, I show behavioral evidence supporting the idea that language modulates the motor system in a specific way, both at a proximal level (sensitivity to the effector) and at a distal level (sensitivity to the goal of the action in which the single motor acts are embedded). I present two studies in which the method is basically the same: we manipulated the linguistic stimuli (the kind of sentence: hand action vs. foot action vs. mouth action) and the effector with which participants had to respond (hand vs. foot vs. mouth; dominant hand vs. non-dominant hand). Response time analyses showed a specific modulation depending on the kind of sentence: participants were facilitated in the task (a sentence sensibility judgment) when the effector they had to use to respond was the same as the one to which the sentence referred. That is, during language comprehension a pre-activation of the motor system seems to take place. This activation is analogous (even if less intense) to the one detectable when we actually execute the action described by the sentence. Beyond this effector-specific modulation, we also found an effect of the goal suggested by the sentence. That is, the hand effector was pre-activated not only by hand-action-related sentences, but also by sentences describing mouth actions, consistent with the fact that, in order to execute an action on an object with the mouth, we first have to bring the object to the mouth with the hand. After reviewing the evidence on simulation specificity directly referring to the body (for instance, the kind of effector activated by the language), I focus on the specific properties of the object to which the words refer, particularly on weight. In this case the hypothesis to test was whether both lifting movement perception and lifting movement execution are modulated by language comprehension.
We used behavioral and kinematic methods, and we manipulated the linguistic stimuli (the kind of sentence: the lifting of heavy objects vs. the lifting of light objects). To study movement perception, we measured the correlations between the weight of the objects lifted by an actor (heavy objects vs. light objects) and the estimates provided by the participants. To study movement execution, we measured the variance of kinematic parameters (velocity, acceleration, time to the first peak of velocity) during the actual lifting of objects (heavy objects vs. light objects). Both kinds of measures revealed that language had a specific effect on the motor system, at both the perceptual and the motor level. Finally, I address the issue of abstract words. Different studies in the “embodied” framework have tried to explain the meaning of abstract words. The limit of these works is that they account only for subsets of phenomena, so the results are difficult to generalize. We tried to circumvent this problem by contrasting transitive verbs (abstract and concrete) and nouns (abstract and concrete) in different combinations. The behavioral study was conducted with both German and Italian participants, as the two languages are syntactically different. We found that response times were faster for the compatible pairs (concrete verb + concrete noun; abstract verb + abstract noun) than for the mixed ones. Interestingly, for the mixed combinations the analyses showed a modulation due to the specific language (German vs. Italian): when the concrete word preceded the abstract one, responses were faster, regardless of the word's grammatical class. The results are discussed in the framework of current views on abstract words. They highlight the important role of developmental and social aspects of language use, and confirm theories assigning a crucial role to both sensorimotor and linguistic experience for abstract words.
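The compatibility analysis described above can be sketched as follows. This is a minimal illustration with invented response times and only four trials; the actual studies used full factorial designs and inferential statistics:

```python
from statistics import mean

# Hypothetical trials: (sentence effector, response effector, RT in ms).
# The numbers are made up for illustration only.
trials = [
    ("hand", "hand", 612), ("hand", "foot", 655),
    ("foot", "foot", 623), ("foot", "hand", 660),
]

def mean_rt(match):
    """Mean RT over trials where sentence and response effector match (or not)."""
    return mean(rt for s, r, rt in trials if (s == r) == match)

# Positive value: responses are faster when the effectors match.
compatibility_effect = mean_rt(False) - mean_rt(True)
```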
Abstract:
Herpes simplex virus entry into cells requires a multipartite fusion apparatus made of gD, gB and the heterodimer gH/gL. gD serves as the receptor-binding glycoprotein and the trigger of fusion; its ectodomain is organized in an N-terminal domain carrying the receptor-binding sites, and a C-terminal domain carrying the profusion domain, required for fusion but not for receptor binding. gB and gH/gL execute fusion. To understand how the four glycoproteins cross-talk with each other, we searched for biochemically defined complexes in infected and transfected cells, and in virions. We report that gD formed complexes with gB in the absence of gH/gL, and with gH/gL in the absence of gB. Complexes with similar composition were formed in infected and transfected cells. They were also present in virions prior to entry, and did not increase upon virus fusion with the cell. A panel of gD mutants enabled a preliminary mapping of part of the gB-binding site in gD to the aa 240-260 portion and downstream, with T306-P307 as critical residues, and of the gH/gL-binding site to the aa 260-310 portion, with P291-P292 as critical residues. The results indicate that gD carries composite, independent binding sites for gB and gH/gL, both partly located in the profusion domain. The second part of the project dealt with the rational design of peptides that inhibit virus entry. For gB and gD the crystal structure is known, so we designed peptides that dock into the structure or prevent the target molecule from adopting its final conformation. For the other glycoproteins, whose structures are not known, peptide libraries were analyzed. Among the several peptides tested, some designed on glycoprotein B were identified as active. Two of them were further analyzed. We identified the peptide residues fundamental for the inhibitory activity, suggesting a possible mechanism of action. Furthermore, by changing the flexibility of the peptides, increased activity was observed, with an EC50 below 10 μM.
New approaches will try to demonstrate the direct interaction between these peptides and the target glycoprotein B.
Abstract:
Several activities were conducted during my PhD. For the NEMO experiment, a collaboration between the INFN/University groups of Catania and Bologna led to the development and production of a mixed-signal acquisition board for the NEMO Km3 telescope. The research concerned the feasibility study of an acquisition technique quite different from that adopted in the NEMO Phase 1 telescope. The DAQ board that we realized exploits the LIRA06 front-end chip for the analog acquisition of the anodic and dynodic sources of a PMT (Photo-Multiplier Tube). The low-power analog acquisition makes it possible to sample multiple channels of the PMT simultaneously at different gain factors, in order to increase the linearity of the signal response over a wider dynamic range. The auto-triggering and self-event-classification features also help to improve the acquisition performance and the knowledge of the neutrino event. A fully functional interface towards the first-level data concentrator, the Floor Control Module, has also been integrated on the board, and specific firmware has been realized to comply with the present communication protocols. This stage of the project foresees the use of an FPGA, a high-speed configurable device, to provide the board with a flexible digital logic control core. After the validation of the whole front-end architecture, this feature would probably be integrated in a common mixed-signal ASIC (Application Specific Integrated Circuit). The volatile nature of the configuration memory of the FPGA required the integration of a flash ISP (In-System Programming) memory and a smart architecture for its safe remote reconfiguration. All the integrated features of the board have been tested. At the Catania laboratory, the behavior of the LIRA chip was investigated in the digital environment of the DAQ board, and we succeeded in driving the acquisition with the FPGA.
The PMT pulses generated with an arbitrary waveform generator were correctly triggered and acquired by the analog chip, and subsequently digitized by the on-board ADC under the supervision of the FPGA. For the communication towards the data concentrator, a test bench was set up in Bologna where, thanks to a loan from the University of Roma and INFN, a full readout chain equivalent to that present in NEMO Phase 1 was installed. These tests showed good behavior of the digital electronics, which were able to receive and execute commands issued from the PC console and to answer back with a reply. The remotely configurable logic also behaved well and demonstrated, at least in principle, the validity of this technique. A new prototype board is now under development at the Catania laboratory as an evolution of the one described above. This board is going to be deployed within the NEMO Phase 2 tower, in one of its floors dedicated to new front-end proposals. It will integrate a new analog acquisition chip called SAS (Smart Auto-triggering Sampler), thus introducing a new analog front end but inheriting most of the digital logic present in the current DAQ board discussed in this thesis. As for the activity on high-resolution vertex detectors, I worked within the SLIM5 collaboration on the characterization of a MAPS (Monolithic Active Pixel Sensor) device called APSEL-4D. This chip is a matrix of 4096 active pixel sensors with deep N-well implantations, meant to collect charge and to shield the analog electronics from digital noise. The chip integrates the full-custom sensor matrix and the sparsification/readout logic, realized with standard cells in 130 nm STM CMOS technology. For the chip characterization, a test beam was set up on the 12 GeV PS (Proton Synchrotron) line at CERN, Geneva (CH).
The collaboration prepared a silicon strip telescope and a DAQ system (hardware and software) for data acquisition and control of the telescope, which allowed about 90 million events to be stored in 7 equivalent days of beam live-time. My activities basically concerned the realization of a firmware interface towards and from the MAPS chip, in order to integrate it into the general DAQ system. Thereafter, I worked on the DAQ software to implement a proper Slow Control interface for the APSEL4D. Several APSEL4D chips with different thinning were tested during the test beam. Those thinned to 100 and 300 um showed an overall efficiency of about 90% with a threshold of 450 electrons. The test beam also allowed the resolution of the pixel sensor to be estimated, providing good results consistent with the pitch/sqrt(12) formula. The MAPS intrinsic resolution was extracted from the width of the residuals plot, taking into account the multiple scattering effect.
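The pitch/sqrt(12) figure mentioned above is the standard resolution of a binary-readout pixel: a hit uniformly distributed over one pixel pitch p has standard deviation p/√12. A quick illustration (the 50 µm pitch is an assumed value, not the actual APSEL4D pitch):

```python
import math

def binary_resolution(pitch_um):
    # Standard deviation of a uniform distribution of width `pitch_um`:
    # sigma = pitch / sqrt(12)
    return pitch_um / math.sqrt(12)

sigma = binary_resolution(50.0)  # about 14.4 um for an assumed 50 um pitch
```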
Abstract:
Particular interest lies in elucidating the mechanism by which the stoichiometric iron nitrides form during the reaction of α-iron with ammonia. This reaction is used on an industrial scale for the nitriding hardening of iron workpieces. As the measurement method, 57Fe Mössbauer spectroscopy, which is based on nuclear spin transitions of 57Fe, was chosen. In order to follow in situ nitriding of iron samples with ammonia by 57Fe Mössbauer spectroscopy, a high-temperature measurement cell was developed that allows measurements up to 1100 K. The cell was validated by measurements on iron nitrides of known stoichiometry. In addition to the 57Fe Mössbauer measurements, further measurements (including high-temperature conductivity measurements) were carried out within the DFG priority program "Reactivity in Solids". The experimental methods were complemented by band-structure calculations. Using the TB-LMTO-ASA method, calculations were performed on the transition-metal nitrides M3N (M = Mn, Fe, Co, Ni, Cu) of the 3d series. The experimentally determined structural transition from hexagonal Ni3N to cubic Cu3N could thus be confirmed.
Abstract:
The work for the present thesis started in California, during my semester as an exchange student overseas. California is known worldwide for its seismicity and for its effort in the earthquake engineering research field. For this reason, I immediately found interesting the proposal of the Structural Dynamics professor, Maria Q. Feng, to work on a pushover analysis of the existing Jamboree Road Overcrossing bridge. Concrete is a popular building material in California, and for the most part it serves its functions well. However, concrete is inherently brittle and performs poorly during earthquakes if not reinforced properly. The San Fernando Earthquake of 1971 dramatically demonstrated this characteristic. Shortly thereafter, code writers revised the design provisions for new concrete buildings so as to provide adequate ductility to resist strong ground shaking. There remain, nonetheless, millions of square feet of non-ductile concrete buildings in California. The purpose of this work is to perform a pushover analysis and compare the results with those of a nonlinear time-history analysis of an existing bridge located in Southern California. The analyses have been executed with the software OpenSees, the Open System for Earthquake Engineering Simulation. The Jamboree Road Overcrossing is classified as a Standard Ordinary Bridge: the JRO is a typical three-span continuous cast-in-place prestressed post-tensioned box girder. The total length of the bridge is 366 ft, and the heights of the two bents are 26.41 ft and 28.41 ft, respectively. Both the pushover analysis and the nonlinear time-history analysis require a model that accounts for the nonlinearities of the system; in fact, in order to execute nonlinear analyses of highway bridges it is essential to incorporate an accurate model of the material behavior.
It has been observed that, after destructive earthquakes, the columns are among the most damaged elements of highway bridges. To evaluate the performance of bridge columns during seismic events, an adequate model of the column must be incorporated. Part of the work of the present thesis is, in fact, dedicated to the modeling of the bents. Different types of nonlinear elements have been studied and modeled, with emphasis on the determination of the length and location of the plastic zone. Furthermore, different models for the concrete and steel materials have been considered, and the parameters defining the constitutive laws of the different materials have been selected with care. The work is structured into four chapters; a brief overview of their content follows. The first chapter introduces the concepts related to capacity design, the current philosophy of seismic design. Furthermore, nonlinear analyses, both static (pushover) and dynamic (time-history), are presented. The final paragraph concludes with a short description of how to determine the seismic demand at a specific site, according to the latest design criteria in California. The second chapter deals with the formulation of force-based finite elements and the issues regarding the objectivity of the response in the nonlinear range. Both concentrated- and distributed-plasticity elements are discussed in detail. The third chapter presents the existing structure, the software used (OpenSees), and the modeling assumptions and issues. The creation of the nonlinear model represents a central part of this work. Nonlinear material constitutive laws, for concrete and reinforcing steel, are discussed in detail, as are the different scenarios employed in the column modeling. Finally, the results of the pushover analysis are presented in chapter four. Capacity curves are examined for the different model scenarios used, and the failure modes of concrete and steel are discussed.
The capacity curve is converted into a capacity spectrum and intersected with the design spectrum. In the last paragraph, the results of the nonlinear time-history analyses are compared to those of the pushover analysis.
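The capacity-curve-to-capacity-spectrum conversion can be sketched with the standard ATC-40-style formulas: each point (base shear V, roof displacement d_roof) maps to spectral coordinates Sa = (V/W)/α1 and Sd = d_roof/(PF1·φroof). The modal quantities and the sample point below are illustrative numbers, not those of the JRO model:

```python
def to_capacity_spectrum(V, d_roof, W, alpha1, PF1, phi_roof):
    """Convert one point of the pushover capacity curve to (Sa, Sd).

    V        : base shear (same force unit as W)
    d_roof   : roof displacement
    W        : seismic weight of the structure
    alpha1   : modal mass coefficient of the first mode
    PF1      : modal participation factor of the first mode
    phi_roof : first-mode shape amplitude at the roof
    """
    Sa = (V / W) / alpha1            # spectral acceleration, in g
    Sd = d_roof / (PF1 * phi_roof)   # spectral displacement
    return Sa, Sd

# Illustrative point: V = 500 kip, d_roof = 0.12 ft, W = 2000 kip.
Sa, Sd = to_capacity_spectrum(500.0, 0.12, 2000.0,
                              alpha1=0.8, PF1=1.3, phi_roof=1.0)
```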
Abstract:
The purpose of this research is to contribute to the literature on organizational demography and new product development by investigating how diverse individual career histories impact team performance. Moreover, we highlight the importance of also considering the institutional context and the specific labour market arrangements in which a team is embedded, in order to correctly interpret the effect of career-related diversity measures on performance. The empirical setting of the study is the videogame industry, and the teams in charge of the development of new game titles. Videogame development teams are an ideal setting in which to investigate the influence of career histories on team performance, since the development of videogames is performed by multidisciplinary teams composed of specialists with a wide variety of technical and artistic backgrounds, who execute a significant amount of creative thinking. We investigate our research question both with quantitative methods and with a case study on the Japanese videogame industry, one of the most innovative in this sector. Our results show that career histories, in terms of occupational diversity, prior functional diversity and prior product diversity, usually have a positive influence on team performance. However, when the moderating effect of the institutional setting is taken into account, career diversity has different or even opposite effects on team performance, according to the specific national context in which a team operates.
Abstract:
Life is full of uncertainties. Legal rules should have a clear intention, motivation and purpose in order to diminish daily uncertainties. However, practice shows that their consequences are complex and hard to predict. For instance, tort law has the general objectives of deterring future negligent behavior and compensating the victims of someone else's negligence. Achieving these goals is particularly difficult in medical malpractice cases. To start with, when patients seek medical care they are typically sick in the first place. In case harm materializes during the treatment, it might be very hard to assess whether it was due to substandard medical care or to the patient's poor health conditions. Moreover, the practice of medicine has a positive externality on society, meaning that the design of legal rules is crucial: for instance, it should not result in physicians avoiding practicing their activity just because they are afraid of being sued even when they acted according to the standard level of care. The empirical literature on medical malpractice has developed substantially in the past two decades, with the American case being the most studied one. Evidence from civil law tradition countries is more difficult to find. The aim of this thesis is to contribute to the empirical literature on medical malpractice, using two civil law countries as case studies: Spain and Italy. The goal of this thesis is to investigate, in the first place, some of the consequences of having two separate sub-systems (administrative and civil) coexisting within the same legal system, which is common in civil law tradition countries with a public national health system (such as Spain, France and Portugal). When this holds, different procedures might apply depending on the type of hospital where the injury took place (essentially, whether it is a public or a private hospital).
Therefore, a patient injured in a public hospital should file a claim in the administrative courts, while a patient suffering an identical medical accident in a private hospital should file a claim in the civil courts. A natural question that the reader might pose is why both administrative and civil courts should decide medical malpractice cases. Moreover, can this specialization of courts influence how judges decide medical malpractice cases? In the past few years there has been a general concern with patient safety, which is currently on the agenda of several national governments. Some initiatives have been taken at the international level with the aim of preventing harm to patients during treatment and care. A negligently injured patient might present a claim against the health care provider with the aim of being compensated for the economic loss and for pain and suffering. In several European countries, health care is mainly provided by a public national health system, which means that if a patient harmed in a public hospital succeeds in a claim against the hospital, public expenditures increase because the State takes part in the litigation process. This poses a problem in a context of increasing national health expenditures and public debt. In Italy, with the aim of increasing patient safety, some regions implemented a monitoring system for medical malpractice claims. Moreover, if properly implemented, this reform should also allow for a reduction in medical malpractice insurance costs. This thesis is organized as follows. Chapter 1 provides a review of the empirical literature on medical malpractice, presenting studies on the outcomes and merit of claims, costs, and defensive medicine. Chapter 2 presents an empirical analysis of medical malpractice claims arriving at the Spanish Supreme Court. The focus is on reversal rates for civil and administrative decisions. Administrative decisions appealed by the plaintiff have the highest reversal rates.
The results show a bias in the lower administrative courts, which tend to favor the State. We provide a detailed explanation for these results, which may be related to the organization of administrative judges' careers. Chapter 3 assesses predictors of compensation in medical malpractice cases appealed to the Spanish Supreme Court and investigates the amount of damages awarded to patients. The results show horizontal equity between administrative and civil decisions (controlling for observable case characteristics) and vertical inequity (patients suffering more severe injuries tend to receive higher payouts). In order to execute these analyses, a database of medical malpractice decisions appealed to the Administrative and Civil Chambers of the Spanish Supreme Court from 2006 to 2009, designated as the Spanish Supreme Court Medical Malpractice Dataset (SSCMMD), was created. A description of how the SSCMMD was built and of the Spanish legal system is presented as well. Chapter 4 includes an empirical investigation of the effect of a monitoring system for medical malpractice claims on insurance premiums. In Italy, some regions adopted this policy in different years, while others did not. The study uses data on insurance premiums from Italian public hospitals for the years 2001-2008; this is a significant difference from most studies, which use the insurance company as the unit of analysis. Although insurance premiums rose from 2001 to 2008, the increase was lower for regions adopting a monitoring system for medical claims. Possible implications of this system are also discussed. Finally, Chapter 5 discusses the main findings, describes possible future research and concludes.
Abstract:
Neurorehabilitation is a process through which individuals affected by neurological diseases aim at achieving full recovery or at realizing their optimal physical, mental and social well-being. Essential elements of effective rehabilitation are: a clinical assessment by a multidisciplinary team, a targeted rehabilitation program, and the evaluation of outcomes through scientifically and clinically appropriate measures. The main objective of this thesis was to develop quantitative methods and tools for the treatment and the motor assessment of neurological patients. Conventional rehabilitation treatments require neurological patients to perform repetitive exercises, which reduces their motivation. Virtual reality and feedback are able to engage them in the treatment, while allowing repeatability and standardization of the protocols. A tool based on augmented feedback for trunk control was developed and evaluated. Moreover, virtual reality makes it possible to individualize the treatment according to the patient's needs. A virtual application for gait rehabilitation was developed and tested during training with multiple sclerosis patients, assessing its feasibility and acceptance and demonstrating the efficacy of the treatment. The quantitative assessment of patients' motor abilities is performed using motion capture systems. Since their use in clinical practice is limited, a methodology based on inertial sensors was proposed for assessing arm swing in parkinsonian subjects. These sensors are small, accurate and flexible, but they accumulate errors during long measurements. This problem was addressed, and the results suggest that, if the sensor is placed on the foot and the accelerations are integrated starting from the mid-stance phase, the error and its consequences on the determination of the spatial parameters are limited. Finally, a validation of the Kinect for gait tracking in a virtual environment was presented. Preliminary results allow the field of use of this sensor in rehabilitation to be defined.
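The drift-limiting strategy described above (restarting the integration at each mid-stance instant, when the foot is momentarily still) can be sketched as follows. This is a simplified sketch: it assumes gravity-compensated forward acceleration, known mid-stance instants, and plain Euler integration; the signal values are synthetic:

```python
def stride_length(acc, dt):
    """Double-integrate forward acceleration between two mid-stance events.

    acc : forward acceleration samples (m/s^2), gravity already removed,
          starting at mid-stance so the initial velocity is zero.
    dt  : sampling interval (s).

    Restarting from zero velocity at every mid-stance bounds the drift
    accumulated by the inertial sensor over long recordings.
    """
    v, d = 0.0, 0.0
    for a in acc:
        v += a * dt   # velocity (Euler step)
        d += v * dt   # displacement
    return d

# Synthetic example: constant 1 m/s^2 for 1 s sampled at 100 Hz -> ~0.5 m.
length = stride_length([1.0] * 100, 0.01)
```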
Abstract:
The efficient emulation of a many-core architecture is a challenging task: each core could be emulated by a dedicated thread, and such threads would be interleaved on either a single-core or a multi-core processor. The high number of context switches would result in unacceptable performance. To support this kind of application, the computational power of the GPU is exploited in order to schedule the emulation threads on the GPU cores. This presents a non-trivial divergence issue, since GPU computational power is offered through SIMD processing elements, which are forced to synchronously execute the same instruction on different memory portions. Thus, a new emulation technique is introduced in order to overcome this limitation: instead of providing a routine for each ISA opcode, the emulator mimics the behavior of the micro-architecture level, where instructions are data that a single routine takes as input. Our new technique has been implemented and compared with the classic emulation approach, in order to investigate the feasibility of a hybrid solution.
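The idea can be illustrated with a toy sketch (in Python, with an invented instruction encoding, not the emulator's actual format): rather than one routine per opcode, which would make SIMD lanes branch apart, a single routine treats each instruction as data, computes all candidate ALU results, and selects among them arithmetically, the way a hardware multiplexer would, so the same instruction stream is executed whatever the opcode is:

```python
def step(regs, inst):
    """Execute one hypothetical (sel, src1, src2, dst) instruction.

    Every candidate result is computed, then selected arithmetically:
    no opcode-dependent branch is taken, so SIMD lanes running this
    routine on different instructions would stay in lockstep.
    """
    sel, s1, s2, d = inst
    a, b = regs[s1], regs[s2]
    results = (a + b, a - b, a & b, a | b)        # ADD, SUB, AND, OR
    regs[d] = sum((sel == k) * r for k, r in enumerate(results))

# Tiny illustrative program on a 4-register file.
regs = [0] * 4
regs[0], regs[1] = 6, 3
step(regs, (0, 0, 1, 2))   # ADD r2, r0, r1
step(regs, (1, 0, 1, 3))   # SUB r3, r0, r1
```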
Abstract:
Biological systems are complex and highly organized architectures governed by noncovalent interactions, which are responsible for molecular recognition, self-assembly, self-organization, adaptation and evolution processes. These systems provided the inspiration for the development of supramolecular chemistry, which aims at the design of artificial multicomponent molecular assemblies, namely supramolecular systems, properly designed to perform different operations: each constituting unit performs a single act, whereas the entire supramolecular system is able to execute a more complex function, resulting from the cooperation of the constituting components. Supramolecular chemistry deals with the development of molecular systems able to mimic naturally occurring events, for example complexation and self-assembly through the establishment of noncovalent interactions. Moreover, the application of external stimuli, such as light, makes it possible to perform these operations in a time- and space-controlled manner. These systems can interact with biological systems and can thus be applied for bioimaging, therapeutic and drug delivery purposes. In this work, the study of biocompatible supramolecular species able to interact with light is presented. The first part deals with the photophysical, photochemical and electrochemical characterization of water-soluble, blue-emitting triazoloquinolinium and triazolopyridinium salts. Moreover, their interaction with DNA has been explored, in the perspective of developing water-soluble systems for bioimaging applications. In the second part, the effect exerted by the presence of azobenzene-bearing supramolecular species in liposomes, inserted both in the phospholipid bilayer and in the aqueous core of the vesicles, has been studied, in order to develop systems able to deliver small molecules and ions in a photocontrolled manner.
Moreover, the versatility of azobenzene and its broad range of applications have been highlighted: although conjugated oligoazobenzene derivatives proved inadequate for insertion into the phospholipid bilayer of liposomes, their electrochemical properties make them interesting candidates as electron-acceptor materials for photovoltaic applications.
Abstract:
In the domain of safety-critical embedded systems, the design process for applications is highly complex. For a given hardware architecture, electronic control units can be upgraded so that all existing processes and signals execute on time. The timing requirements are strict and must be satisfied in every periodic recurrence of the processes, since guaranteeing the parallel execution is of utmost importance. Existing approaches can compute design alternatives quickly, but they do not guarantee that the cost of the necessary hardware changes is minimal. We present an approach that computes cost-minimal solutions to the problem which satisfy all timing constraints. Our algorithm uses linear programming with column generation, embedded in a tree structure, to provide lower and upper bounds during the optimization process. Through a decomposition of the master problem into independent subproblems, formulated as integer linear programs, the complex side constraints guaranteeing periodic execution are moved out of the master problem. Both the analyses of process execution and the methods for signal transmission are examined, and linearized representations are given. Furthermore, we present a new formulation for fixed-priority execution that additionally computes worst-case process response times, which are needed for scenarios in which timing constraints are imposed on subsets of processes and signals. We demonstrate the applicability of our methods by analyzing instances containing process structures from real-world applications. Our results show that lower bounds can be computed quickly in order to prove the optimality of heuristic solutions. 
When we deliver optimal solutions with response times, our new formulation compares favorably with other approaches in terms of runtime. The best results are achieved with a hybrid approach that combines heuristic starting solutions, preprocessing, and a heuristic computation phase followed by a short exact one.
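The worst-case response times mentioned above are, in their classical form, computed by a fixed-point recurrence for preemptive fixed-priority scheduling. The sketch below shows that standard analysis only, not the abstract's ILP linearization of it, and the task parameters are illustrative:

```python
import math

def response_time(C, T, i):
    """Classic fixed-point iteration for the worst-case response time of
    task i under preemptive fixed-priority scheduling. Tasks 0..i-1 have
    higher priority; C = worst-case execution times, T = periods, and each
    task's deadline is assumed equal to its period (illustrative model)."""
    R = C[i]
    while True:
        # Interference: every higher-priority task j preempts ceil(R/T_j) times.
        interference = sum(math.ceil(R / T[j]) * C[j] for j in range(i))
        R_new = C[i] + interference
        if R_new == R:
            return R          # converged: worst-case response time
        if R_new > T[i]:
            return None       # exceeds the deadline: unschedulable
        R = R_new

# Example task set: C = execution times, T = periods (priority = index order).
C, T = [1, 2, 3], [4, 6, 12]
worst = [response_time(C, T, i) for i in range(3)]
```

For this task set the recurrence converges for all three tasks, so an upgrade step in the design loop described above would accept the configuration.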
Abstract:
Under President Ronald Reagan, the White House pursued a complex foreign policy towards the Contras, rebels trying to overthrow the Sandinista regime in Nicaragua. In 1979, the leftist Sandinista government seized power in Nicaragua. The loss of the previous pro-United States Somoza military dictatorship deeply troubled conservatives, for whom the eradication of communism internationally was a top foreign policy goal. Consequently, the Reagan Administration sought to reverse the policy of Reagan's predecessor, Jimmy Carter, and assume a hard-line stance against leftist regimes in Central America. Reagan and the conservatives within his administration therefore supported the Contras with military arms, humanitarian aid, and financial contributions. This intervention in Nicaragua, however, failed to garner popular support from American citizens and Democrats. Consequently, between 1982 and 1984 Congress prohibited further funding to the Contras in a series of legislation called the Boland Amendments. These Amendments barred any military aid from reaching the Contras, including through intelligence agencies. Shortly after their passage, Central Intelligence Agency Director William Casey and influential members of Reagan's National Security Council (NSC), including National Security Advisor Robert McFarlane, NSC Aide Oliver North, and Deputy National Security Advisor John Poindexter, cooperated to identify and exploit loopholes in the legislation. By recognizing the NSC as a non-intelligence body, these masterminds orchestrated a scheme in which third parties, including foreign countries and private donors, contributed both financially and through arms donations to sustain the Contras independently of Congressional oversight. This thesis explores the mechanism and process of soliciting donations from private individuals, recognizing the forces and actors that created a situation in which covert action could continue without detection. 
Oliver North, the central state actor, worked within his role as an NSC bureaucrat to network with influential politicians and private individuals, executing the orders of his superiors and shaping foreign policy. Although Reagan articulated his desire for the Contras to remain a military presence in Nicaragua, he delegated the details of policy to his subordinates, which allowed this scheme to flourish. Next, this thesis explores the individual donors, analyzing their role as private citizens in sustaining and encouraging the policy of the Reagan Administration. The Contra movement found non-state support from followers of the New Right, demonstrated through financial and organizational assistance, which allowed the Reagan Administration's statistically unpopular policy in Nicaragua to continue. I interpret these donors as politically involved, but politically philanthropic, individuals, donating to their charity of choice to further the principles of American freedom internationally in a Cold War environment. The thesis then proceeds to assess the balance of power between the executive and other political actors in shaping policy, concluding that the executive cannot act alone in the formulation and implementation of foreign policy.
Abstract:
After decades of development in programming languages and programming environments, Smalltalk is still one of the few environments that provide advanced features, and it is still widely used in industry. However, as Java has become prevalent, the ability to call Java code from Smalltalk and vice versa has become important. Traditional approaches to integrating the Java and Smalltalk languages rely on low-level communication between separate Java and Smalltalk virtual machines. We are not aware of any attempt to execute and integrate the Java language directly in the Smalltalk environment. A direct integration allows for very tight and almost seamless integration of the languages and their objects within a single environment. Yet integration and language interoperability impose challenging issues related to method naming conventions, method overloading, exception handling and thread-locking mechanisms. In this paper we describe ways to overcome these challenges and to integrate Java into the Smalltalk environment. Using the techniques described in this paper, the programmer can call Java code from Smalltalk using standard Smalltalk idioms while the semantics of each language remains preserved. We present STX:LIBJAVA, an implementation of the Java virtual machine within Smalltalk/X, as a validation of our approach.
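One of the interoperability issues named above, method overloading, arises because Java selects among same-named methods by parameter types while a Smalltalk selector must be unique. A common way to bridge such a gap is to flatten each overloaded signature into a distinct name; the sketch below illustrates that general idea only and is not STX:LIBJAVA's actual mapping scheme (names and format are hypothetical):

```python
# Illustrative only: flatten a Java-style overloaded signature into a unique
# name, one plausible way to expose overloads to a language whose selectors
# cannot be overloaded. Not the mapping used by STX:LIBJAVA.

def mangle(name, param_types):
    """Append parameter type names so each overload gets a distinct selector."""
    if not param_types:
        return name
    return name + "_" + "_".join(param_types)

mangle("println", ["int"])       # distinct from the String overload
mangle("println", ["String"])
mangle("wait", [])               # no parameters: name is unchanged
```

A real bridge must also handle primitive widening, varargs, and ambiguity between overloads, which is why the abstract calls overloading a challenging issue rather than a solved one.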
Abstract:
The intent of the work presented in this thesis is to show that relativistic perturbations should be considered in the same manner as the well-known perturbations currently taken into account in planet-satellite systems. It is also the aim of this research to show that relativistic perturbations are comparable to standard perturbations in specific-force magnitude and effects. This work would have been regarded as little more than a curiosity by most engineers until recent advancements in space propulsion methods (e.g., the creation of artificial neutron stars, light sails, and continuous propulsion techniques). These cutting-edge technologies have the potential to thrust the human race into interstellar, and hopefully intergalactic, travel in the not so distant future. The relativistic perturbations were simulated for two orbit cases: (1) a general orbit and (2) a Molniya-type orbit. The simulations were completed using MATLAB's ODE45 integration scheme. The methods used to organize, execute, and analyze these simulations are explained in detail. The results of the simulations are presented in graphical and statistical form. The simulation data reveal that the specific forces arising from the relativistic perturbations do manifest as variations in the classical orbital elements. It is also apparent from the simulated data that these specific forces exhibit magnitudes and effects similar to those of commonly considered perturbations used in trajectory design, optimization, and maintenance. Due to the similarities in behavior of relativistic versus non-relativistic perturbations, a case is made for the development of a fully relativistic formulation of the trajectory design and trajectory optimization problems. This new framework would afford the possibility of illuminating new, more optimal solutions to the aforementioned problems that do not arise in current formulations. 
This type of reformulation has already shown promise: the previously unknown Space Superhighways arose as an optimal solution when classical astrodynamics was reformulated using geometric mechanics.
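The relativistic specific force compared in simulations like these is, to first post-Newtonian order, the Schwarzschild correction to the two-body acceleration. The sketch below evaluates that standard term; it is not the thesis's simulation code, and the Earth-orbit state used in the usage comment is an illustrative assumption:

```python
# First-order (Schwarzschild) relativistic correction to the two-body
# acceleration: a_rel = (mu / (c^2 r^3)) * [ (4 mu / r - v.v) r + 4 (r.v) v ].
# This is the standard post-Newtonian perturbing specific force, not the
# thesis's own MATLAB/ODE45 implementation.

MU = 398600.4418e9   # Earth's gravitational parameter, m^3/s^2
C  = 299792458.0     # speed of light, m/s

def accel_rel(r, v):
    """Relativistic perturbing acceleration (m/s^2) for position r and
    velocity v, each a 3-element sequence in meters and m/s."""
    rn = sum(x * x for x in r) ** 0.5          # |r|
    v2 = sum(x * x for x in v)                 # v . v
    rv = sum(a * b for a, b in zip(r, v))      # r . v
    k = MU / (C * C * rn ** 3)
    return [k * ((4 * MU / rn - v2) * ri + 4 * rv * vi)
            for ri, vi in zip(r, v)]
```

For an illustrative circular orbit at 7000 km radius the term comes out on the order of 1e-8 m/s^2 and is purely radial (since r.v = 0 there), small enough to explain why it was long neglected yet comparable to other perturbations routinely carried in precise trajectory work.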