851 results for Distributed agent system
Abstract:
The wide diffusion of cheap, small, and portable sensors integrated into an unprecedentedly large variety of devices, together with the availability of almost ubiquitous Internet connectivity, makes it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and promptly analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and by identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing through a large experimental study on the prototype of our novel LAAR dynamic replication technique. Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated with different pervasive data flows, it is possible to improve system scalability while reducing costs.
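To make the central idea concrete, a minimal sketch of per-stream QoS annotations follows; the QoSPolicy fields, the register_stream interface, and the toy cost model are illustrative assumptions, not part of the Quasit or LAAR implementations.

```python
from dataclasses import dataclass

# Hypothetical illustration of per-stream QoS annotations; the field names
# and middleware interface are assumptions, not the Quasit API.

@dataclass
class QoSPolicy:
    max_latency_ms: int    # end-to-end delivery deadline
    delivery: str          # e.g. "at-least-once" or "at-most-once"
    replication: float     # fraction of operators replicated (partial fault tolerance)

class StreamMiddleware:
    def __init__(self):
        self.streams = {}

    def register_stream(self, name: str, policy: QoSPolicy) -> None:
        """Associate a QoS policy with a pervasive data flow."""
        self.streams[name] = policy

    def placement_cost(self, name: str) -> float:
        """Toy cost model: stricter QoS implies higher provisioning cost."""
        p = self.streams[name]
        return p.replication * 10 + 100 / max(p.max_latency_ms, 1)

mw = StreamMiddleware()
mw.register_stream("health-monitoring", QoSPolicy(50, "at-least-once", 1.0))
mw.register_stream("entertainment", QoSPolicy(500, "at-most-once", 0.2))
print({name: round(mw.placement_cost(name), 2) for name in mw.streams})
```

The point of the sketch is only that flows with relaxed requirements (here, "entertainment") can be provisioned far more cheaply than strictly guaranteed ones, which is the trade-off the thesis quantifies.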
Abstract:
Modern software systems, in particular distributed ones, are everywhere around us and underpin our everyday activities. Hence, guaranteeing their correctness, consistency, and safety is of paramount importance. Their complexity makes the verification of such properties a very challenging task. It is natural to expect that these systems are reliable and, above all, usable. i) In order to be reliable, compositional models of software systems need to account for consistent dynamic reconfiguration, i.e., changing at runtime the communication patterns of a program. ii) In order to be usable, compositional models of software systems need to account for interaction, which can be seen as communication patterns among components that collaborate to achieve a common task. The aim of this Ph.D. was to develop powerful techniques based on formal methods for the verification of correctness, consistency, and safety properties related to dynamic reconfiguration and communication in complex distributed systems. In particular, static analysis techniques based on types and type systems appeared to be an adequate methodology, considering their success in guaranteeing not only basic safety properties but also more sophisticated ones, such as deadlock or livelock freedom in a concurrent setting. The main contributions of this dissertation are twofold. i) On the components side, we design types and a type system for a concurrent object-oriented calculus to statically ensure the consistency of dynamic reconfigurations related to modifications of communication patterns in a program during execution. ii) On the communication side, we study advanced safety properties related to communication in complex distributed systems, such as deadlock freedom, livelock freedom, and progress. Most importantly, we exploit an encoding of the types and terms of a typical distributed language, the session π-calculus, into the standard typed π-calculus in order to understand their expressive power.
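For reference, the standard syntax of binary session types can be written as follows; this is a textbook presentation, not the dissertation's exact type system.

```latex
% Textbook binary session type syntax (illustrative, not the
% dissertation's exact system):
\[
  S \;::=\; \mathord{!}T.S \;\mid\; \mathord{?}T.S
    \;\mid\; \oplus\{l_i : S_i\}_{i \in I}
    \;\mid\; \&\{l_i : S_i\}_{i \in I}
    \;\mid\; \mu X.S \;\mid\; X \;\mid\; \mathtt{end}
\]
```

Here !T.S sends a value of type T and continues as S, ?T.S receives one, ⊕ and & denote internal and external choice over labels l_i, and end closes the session; disciplines built on this syntax are the kind that enable static guarantees such as deadlock and livelock freedom.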
Abstract:
Aberrant expression of ETS transcription factors, including FLI1 and ERG, due to chromosomal translocations has been described as a driver event in the initiation and progression of different tumors. In this study, the impact of the prostate cancer (PCa) fusion gene TMPRSS2-ERG was evaluated on components of the insulin-like growth factor (IGF) system and on the CD99 molecule, two well-documented targets of EWS-FLI1, the hallmark of Ewing sarcoma (ES). The aim of this study was to identify common or distinctive ETS-related mechanisms that could be exploited at the biological and clinical level. The results demonstrate that IGF-1R represents a common target of ETS rearrangements, as ERG and FLI1 bind the IGF-1R gene promoter and their modulation alters IGF-1R protein levels. At the clinical level, this mechanism provides a basis for a more rational use of anti-IGF-1R agents, as PCa cells expressing the fusion gene respond better to them. The EWS-FLI1/IGF-1R axis provides a rationale for combining anti-IGF-1R agents with trabectedin, an alkylating agent that enhances EWS-FLI1 occupancy on the IGF-1R promoter. TMPRSS2-ERG also influences the prognostic relevance of the IGF system, as high IGF-1R correlates with better biochemical progression-free survival (BPFS) in PCa patients negative for the fusion gene, while only marginal or no association was found in the total cases or in TMPRSS2-ERG-positive cases, respectively. This study indicates that CD99 is differentially regulated between ETS-related tumors, as CD99 is not a target of ERG. In PCa, CD99 did not show differential expression between TMPRSS2-ERG-positive and -negative cells. Nevertheless, a direct correlation was found between ERG and CD99 proteins both in vitro and in patients, suggesting that ERG target genes may include regulators of CD99. Despite a slight trend suggesting a correlation between CD99 expression and better BPFS, no clinical relevance for CD99 was found as a prognostic biomarker.
Abstract:
Besides the traditional paradigm of "centralized" power generation, a new concept of "distributed" generation is emerging, in which the user also becomes a prosumer. In this transition, Energy Storage Systems (ESS) can provide multiple services and features that are necessary for a higher quality of the electrical system and for the optimization of non-programmable Renewable Energy Source (RES) power plants. An ESS prototype was designed, developed, and integrated into a renewable energy production system in order to create a smart microgrid and consequently manage the energy flow efficiently and intelligently as a function of the power demand. The produced energy can be fed into the grid, supplied directly to the load, or stored in batteries. The microgrid comprises a 7 kW wind turbine (WT) and a 17 kW photovoltaic (PV) plant. The load is given by the electrical utilities of a cheese factory. The ESS is composed of two subsystems: a Battery Energy Storage System (BESS) and a Power Control System (PCS). With the aim of sizing the ESS, a Remote Grid Analyzer (RGA) was designed, built, and connected to the wind turbine, the photovoltaic plant, and the switchboard. Afterwards, different electrochemical storage technologies were studied and, taking into account the load requirements of the cheese factory, the most suitable solution was identified in high-temperature Na-NiCl2 salt battery technology. The data acquired from all electrical utilities provided a detailed load analysis, indicating an optimal storage size of 30 kW. Moreover, a container was designed and built to house the BESS and PCS, meeting all requirements and safety conditions. Furthermore, a smart control system was implemented to handle the different applications of the ESS, such as peak shaving or load levelling.
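As an illustration of the kind of control rule mentioned for the ESS applications (peak shaving), a minimal sketch follows; the 30 kW rating comes from the sizing result above, while the import threshold, state-of-charge limits, and function interface are assumptions.

```python
# Minimal peak-shaving rule for a battery energy storage system; the
# 30 kW rating comes from the sizing study, while the import threshold,
# state-of-charge limits, and interface are illustrative assumptions.

BATTERY_POWER_KW = 30.0     # battery system rating from the load analysis
PEAK_THRESHOLD_KW = 20.0    # assumed maximum desired grid import

def peak_shaving_setpoint(load_kw: float, generation_kw: float, soc: float) -> float:
    """Battery power setpoint in kW (positive = discharge, negative = charge)."""
    net_demand = load_kw - generation_kw
    if net_demand > PEAK_THRESHOLD_KW and soc > 0.1:
        # Discharge just enough to cap the grid import at the threshold.
        return min(net_demand - PEAK_THRESHOLD_KW, BATTERY_POWER_KW)
    if net_demand < 0 and soc < 0.9:
        # Store the renewable surplus instead of exporting it.
        return max(net_demand, -BATTERY_POWER_KW)
    return 0.0

print(peak_shaving_setpoint(load_kw=35.0, generation_kw=5.0, soc=0.8))  # 10.0
```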
Abstract:
This thesis focuses on Smart Grid applications in medium voltage distribution networks. For the development of new applications, simulation tools able to model the dynamic behavior of both the power system and the communication network are useful. Such a co-simulation environment allows assessing the feasibility of using a given network technology to support communication-based Smart Grid control schemes on an existing segment of the electrical grid, and determining the range of control schemes that different communication technologies can support. For this reason, a co-simulation platform is presented, built by linking the Electromagnetic Transients Program Simulator (EMTP v3.0) with a Telecommunication Network Simulator (OPNET-Riverbed v18.0). The simulator is used to design and analyze the coordinated use of Distributed Energy Resources (DERs) for voltage/var control (VVC) in distribution networks. The thesis focuses on a control structure based on the use of phasor measurement units (PMUs). In order to limit the required reinforcements of the communication infrastructures currently adopted by Distribution Network Operators (DNOs), the study focuses on leader-less multi-agent system (MAS) schemes that do not assign special coordinating roles to specific agents. Leader-less MAS are expected to produce more uniform communication traffic than centralized approaches that include a moderator agent. Moreover, leader-less MAS are expected to be less affected by the limitations and constraints of individual communication links. The developed co-simulator has allowed the definition of specific countermeasures against the limitations of the communication network, with particular reference to latency and loss of information, for both wired and wireless communication networks. Moreover, the co-simulation platform has also been coupled with a mobility simulator in order to study specific countermeasures against the negative effects on the medium voltage distribution network caused by the concurrent connection of electric vehicles.
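A minimal sketch of the leader-less coordination principle (neighbour-averaging consensus among DER agents) is given below; the topology, gain, and state variable are illustrative assumptions rather than the VVC law developed in the thesis.

```python
# Leader-less multi-agent update based on neighbour averaging, a common
# building block of leader-less MAS schemes; topology, gain, and state
# variable are invented for illustration.

neighbours = {                  # communication graph between DER agents
    "A": ["B"], "B": ["A", "C"], "C": ["B"],
}
reactive_setpoint = {"A": 0.10, "B": 0.40, "C": 0.70}   # per-agent Q setpoints (p.u.)
GAIN = 0.5

def consensus_step(state):
    """Each agent moves toward the average of its neighbours' states."""
    new_state = {}
    for agent, value in state.items():
        avg = sum(state[n] for n in neighbours[agent]) / len(neighbours[agent])
        new_state[agent] = value + GAIN * (avg - value)
    return new_state

for _ in range(20):
    reactive_setpoint = consensus_step(reactive_setpoint)
print({k: round(v, 3) for k, v in reactive_setpoint.items()})  # converges toward a common value
```

No agent plays a coordinating role: every agent runs the same local rule using only neighbour data, which is why the traffic pattern stays uniform and no single link or moderator becomes critical.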
Abstract:
This Master's thesis presents the results obtained during the curricular traineeship, carried out in the laboratories of the Department of Chemistry of the University of Bergen during the Erasmus period and in the Department of Industrial Chemistry of the University of Bologna. The project carried out in Bergen concerned the synthesis of key intermediates used for the functionalization of the imidazole backbone, using N,N'-diiodo-5,5-dimethylhydantoin ("DIH") as an iodinating agent and employing an innovative kind of chemical reactor, the "Multijet Oscillating Disc Millireactor" (MJOD Reactor). Afterwards, the work performed in Bologna consisted of verifying the stability in solution of the above-mentioned N,N'-diiodo-5,5-dimethylhydantoin using spectrophotometric techniques and High-Performance Liquid Chromatography (HPLC) analyses.
Abstract:
Central nervous system involvement is a rare and serious complication of Behçet's disease (BD). Herein, we describe a patient with an atypical central lesion who experienced progressive hypesthesia of the right arm and sensory loss of the trigeminal nerve together with intense headache. A repeated biopsy was necessary to conclusively establish the diagnosis of BD. Therapy with infusions of infliximab led to a remarkable full remission. TNF-α-blocking therapy was successfully replaced by azathioprine. This well-illustrated case demonstrates the difficulty of establishing the diagnosis of BD with central nervous system involvement, the dramatic benefit of a briefly administered TNF-α-blocking agent, and the long-term remission achieved with azathioprine.
Abstract:
Energy transfer between the interacting waves in a distributed Brillouin sensor can result in a distorted measurement of the local Brillouin gain spectrum, leading to systematic errors. It is demonstrated that this depletion effect can be precisely modelled, as validated by experimental tests in excellent quantitative agreement. From the model, strict guidelines can be formulated to make the impact of depletion negligible for any type and any length of fiber. (C) 2013 Optical Society of America
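For context, the pump-probe interaction that causes depletion is commonly described by the textbook steady-state coupled-intensity equations for stimulated Brillouin scattering; the paper's exact model may differ in form.

```latex
% Textbook steady-state coupled-intensity equations for stimulated Brillouin
% scattering (pump travelling in +z, counter-propagating Stokes/probe);
% the paper's depletion model builds on equations of this general form,
% though its exact formulation may differ.
\begin{align}
  \frac{dI_p}{dz} &= -\,g_B(\nu)\, I_p I_s - \alpha I_p,\\
  \frac{dI_s}{dz} &= -\,g_B(\nu)\, I_p I_s + \alpha I_s,
\end{align}
% where $g_B(\nu)$ is the Brillouin gain spectrum and $\alpha$ the fiber
% attenuation; the $-g_B I_p I_s$ term acting on the pump is the depletion
% that distorts the measured local gain spectrum.
```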
Abstract:
Squirrel monkeys (Saimiri sciureus) were infected experimentally with the agent of classical bovine spongiform encephalopathy (BSE). Two to four years later, six of the monkeys developed alterations in interactive behaviour and cognition and other neurological signs typical of transmissible spongiform encephalopathy (TSE). At necropsy examination, the brains from all of the monkeys showed pathological changes similar to those described in variant Creutzfeldt-Jakob disease (vCJD) of man, except that the squirrel monkey brains contained no PrP-amyloid plaques typical of that disease. Constant neuropathological features included spongiform degeneration, gliosis, deposition of abnormal prion protein (PrP(TSE)) and many deposits of abnormally phosphorylated tau protein (p-Tau) in several areas of the cerebrum and cerebellum. Western blots showed large amounts of proteinase K-resistant prion protein in the central nervous system. The striking absence of PrP plaques (prominent in brains of cynomolgus macaques [Macaca fascicularis] with experimentally-induced BSE and vCJD and in human patients with vCJD) reinforces the conclusion that the host plays a major role in determining the neuropathology of TSEs. Results of this study suggest that p-Tau, found in the brains of all BSE-infected monkeys, might play a role in the pathogenesis of TSEs. Whether p-Tau contributes to development of disease or appears as a secondary change late in the course of illness remains to be determined.
Abstract:
The glycine deportation system is an essential component of glycine catabolism in man whereby 400 to 800 mg of glycine per day are deported into urine as hippuric acid. The molecular escort for this deportation is benzoic acid, which derives from the diet and from gut microbiota metabolism of dietary precursors. Three components of this system, involving hepatic and renal metabolism, and renal active tubular secretion help regulate systemic and central nervous system levels of glycine. When glycine levels are pathologically high, as in congenital nonketotic hyperglycinemia, the glycine deportation system can be upregulated with pharmacological doses of benzoic acid to assist in normalization of glycine homeostasis. In congenital urea cycle enzymopathies, similar activation of the glycine deportation system with benzoic acid is useful for the excretion of excess nitrogen in the form of glycine. Drugs which can substitute for benzoic acid as substrates for the glycine deportation system have adverse reactions that may involve perturbations of glycine homeostasis. The cancer chemotherapeutic agent ifosfamide is associated with an unacceptably high incidence of encephalopathy. This would appear to arise as a result of the production of toxic aldehyde metabolites which deplete ATP production and sequester NADH in the mitochondrial matrix, thereby inhibiting the glycine deportation system and causing de novo glycine synthesis by the glycine cleavage system. We hypothesize that this would result in hyperglycinemia and encephalopathy. This understanding may lead to novel prophylactic strategies for ifosfamide encephalopathy. Thus, the glycine deportation system plays multiple key roles in physiological and neurotoxicological processes involving glycine.
Abstract:
This thesis explores system performance for reconfigurable distributed systems and provides an analytical model for determining the throughput of theoretical systems based on the OpenSPARC FPGA Board and the SIRC Communication Framework. This model was developed by studying a small set of variables that together determine a system's throughput. The importance of this model lies in assisting system designers in deciding whether or not to commit to designing a reconfigurable distributed system, based on the estimated performance and hardware costs. Because custom hardware design and distributed system design are both time consuming and costly, it is important for designers to make decisions regarding system feasibility early in the development cycle. Based on experimental data, the model presented in this work shows a close fit, with less than 10% experimental error on average. The model is limited to a certain range of problems, but it can still be used within those limitations and also provides a foundation for further work on modeling reconfigurable distributed systems.
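To illustrate the style of such an analytical model, a toy throughput estimate is sketched below; the variables and functional form are assumptions chosen for illustration and do not reproduce the thesis's actual model.

```python
# Illustrative throughput estimate for a reconfigurable distributed system;
# the parameters and functional form are assumptions, not the thesis's model.

def estimated_throughput(job_bytes: float, boards: int, link_mbps: float,
                         fpga_bytes_per_s: float, per_job_overhead_s: float) -> float:
    """Jobs per second for `boards` FPGA boards fed over a shared link."""
    comm_s = job_bytes * 8 / (link_mbps * 1e6)      # serialized transfer time per job
    compute_s = job_bytes / fpga_bytes_per_s        # per-board processing time per job
    per_board_rate = 1.0 / (compute_s + per_job_overhead_s)
    link_rate = 1.0 / comm_s                        # bound imposed by link saturation
    return min(boards * per_board_rate, link_rate)  # whichever resource saturates first

print(round(estimated_throughput(64_000, boards=4, link_mbps=100,
                                 fpga_bytes_per_s=50e6, per_job_overhead_s=0.001), 1))
```

Even a toy model of this kind captures the design question the thesis targets: whether added boards keep paying off or the communication link becomes the bottleneck.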
Abstract:
This thesis presents two frameworks, a software framework and a hardware core manager framework, which together can be used to develop a processing platform using a distributed system of field-programmable gate array (FPGA) boards. The software framework provides users with the ability to easily develop applications that exploit the processing power of FPGAs, while the hardware core manager framework gives users the ability to configure and interact with multiple FPGA boards and/or hardware cores. This thesis describes the design and development of these frameworks and analyzes the performance of a system that was constructed using them. The performance analysis included measuring the effect of incorporating additional hardware components into the system and comparing the system to a software-only implementation. This work draws conclusions based on the results of the performance analysis and offers suggestions for future work.
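A hypothetical sketch of the kind of interface such frameworks expose is shown below; every class and method name is invented for illustration and is not the API described in the thesis.

```python
# Hypothetical core-manager interface for a distributed set of FPGA boards;
# names and behavior are invented for illustration only.

class HardwareCore:
    def __init__(self, board_id: int, core_name: str):
        self.board_id = board_id
        self.core_name = core_name

    def run(self, data: bytes) -> bytes:
        # Placeholder: a real core manager would stream `data` to the FPGA
        # board hosting this core and collect the processed result.
        return data[::-1]

class CoreManager:
    """Configures boards and hands out handles to hardware cores."""
    def __init__(self, board_ids):
        self.board_ids = list(board_ids)

    def load_core(self, core_name: str) -> list:
        """Load the named core on every managed board and return handles."""
        return [HardwareCore(b, core_name) for b in self.board_ids]

manager = CoreManager(board_ids=[0, 1])
cores = manager.load_core("fir_filter")
print([core.run(b"sample-block") for core in cores])
```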
Abstract:
OBJECTIVE: The objective of our study was to establish optimal perfusion conditions for high-resolution postmortem angiography that would permit dynamic visualization of the arterial and venous systems. MATERIALS AND METHODS: Cadavers of two dogs and one cat were perfused with diesel oil through a peristaltic pump. The lipophilic contrast agent Lipiodol Ultra Fluide was then injected, and angiography was performed. The efficiency of perfusion was evaluated in the chick chorioallantoic membrane. RESULTS: Vessels could be seen up to the level of the smaller supplying and draining vessels. Hence, both the arterial and the venous sides of the vascular system could be distinguished. The chorioallantoic membrane assay revealed that diesel oil enters microvessels up to 50 μm in diameter and that it does not penetrate the capillary network. CONCLUSION: After establishing a postmortem circulation by diesel oil perfusion, angiography can be performed by injection of Lipiodol Ultra Fluide. The resolution of the images obtained up to 3 days after death is comparable to that achieved in clinical angiography.
Abstract:
Site-specific delivery of anticancer agents to tumors represents a promising therapeutic strategy because it increases efficacy and reduces toxicity to normal tissues compared with untargeted drugs. Sterically stabilized immunoliposomes (SIL), guided by antibodies that specifically bind to well internalizing antigens on the tumor cell surface, are effective nanoscale delivery systems capable of accumulating large quantities of anticancer agents at the tumor site. The epithelial cell adhesion molecule (EpCAM) holds major promise as a target for antibody-based cancer therapy due to its abundant expression in many solid tumors and its limited distribution in normal tissues. We generated EpCAM-directed immunoliposomes by covalently coupling the humanized single-chain Fv antibody fragment 4D5MOCB to the surface of sterically stabilized liposomes loaded with the anticancer agent doxorubicin. In vitro, the doxorubicin-loaded immunoliposomes (SIL-Dox) showed efficient cell binding and internalization and were significantly more cytotoxic against EpCAM-positive tumor cells than nontargeted liposomes (SL-Dox). In athymic mice bearing established human tumor xenografts, pharmacokinetic and biodistribution analysis of SIL-Dox revealed long circulation times in the blood with a half-life of 11 h and effective time-dependent tumor localization, resulting in up to 15% injected dose per gram tissue. These favorable pharmacokinetic properties translated into potent antitumor activity, which resulted in significant growth inhibition (compared with control mice), and was more pronounced than that of doxorubicin alone and nontargeted SL-Dox at low, nontoxic doses. Our data show the promise of EpCAM-directed nanovesicular drug delivery for targeted therapy of solid tumors.
Abstract:
Comments on an article by Kashima et al. (see record 2007-10111-001). In their target article, Kashima and colleagues try to show how a connectionist model conceptualization of the self is best suited to capture the self's temporal and socio-culturally contextualized nature. They propose a new model and, to support it, conduct computer simulations of psychological phenomena whose importance for the self has long been clear, even if not formally modeled, such as imitation and the learning of sequence and narrative. As explicated when we advocated connectionist models as a metaphor for the self in Mischel and Morf (2003), we fully endorse the utility of such a metaphor, as these models have some of the processing characteristics necessary for capturing key aspects and functions of a dynamic cognitive-affective self-system. As elaborated in that chapter, we see their principal strength in the fact that connectionist models can take account of multiple simultaneous processes without invoking a single central control. All outputs reflect a distributed pattern of activation across a large number of simple processing units, the nature of which depends on (and changes with) the connection weights between the links and the satisfaction of mutual constraints across these links (Rumelhart & McClelland, 1986). This allows a simple account of why certain input features will at times predominate, while others take over on other occasions. (PsycINFO Database Record (c) 2008 APA, all rights reserved)
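As a minimal illustration of the mechanism described (distributed activation shaped by connection weights and mutual constraint satisfaction), a Hopfield-style sketch follows; the units and weights are invented and bear no relation to Kashima et al.'s actual model.

```python
import random

# Toy constraint-satisfaction network with binary units; units and weights
# are invented for illustration only.

units = ["warm", "competent", "self_as_helper"]
weights = {                      # symmetric connection weights (constraints)
    ("warm", "self_as_helper"): 1.0,
    ("competent", "self_as_helper"): 0.5,
    ("warm", "competent"): 0.2,
}

def w(a, b):
    return weights.get((a, b)) or weights.get((b, a)) or 0.0

state = {u: random.choice([-1, 1]) for u in units}
state["warm"] = 1                # clamp an "input" feature

for _ in range(10):              # repeated updates settle the network
    for u in units[1:]:          # keep the clamped unit fixed
        net_input = sum(w(u, v) * state[v] for v in units if v != u)
        state[u] = 1 if net_input >= 0 else -1

print(state)                     # pattern of activation satisfying the constraints
```

The output is not read off any single unit: it is the settled pattern across all units, which is the "distributed pattern of activation without a single central control" the commentary emphasizes.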