15 results for GST and incapacitated entities

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

The advent of distributed and heterogeneous systems has laid the foundation for the birth of new architectural paradigms, in which many separate, autonomous entities collaborate and interact with the aim of achieving complex strategic goals that would be impossible to accomplish on their own. A non-exhaustive list of systems targeted by such paradigms includes Business Process Management, Clinical Guidelines and Careflow Protocols, and Service-Oriented and Multi-Agent Systems. It is largely recognized that engineering these systems requires novel modeling techniques. In particular, many authors claim that an open, declarative perspective is needed to complement the closed, procedural nature of state-of-the-art specification languages. For example, the ConDec language has recently been proposed for the declarative and open specification of Business Processes, overcoming the over-specification and over-constraining issues of classical procedural approaches. On the one hand, the success of such novel modeling languages strongly depends on their usability by non-IT-savvy users: they must provide an appealing, intuitive graphical front-end. On the other hand, they must be amenable to verification, in order to guarantee the trustworthiness and reliability of the developed model, as well as to ensure that the actual executions of the system effectively comply with it. In this dissertation, we claim that Computational Logic is a suitable framework for dealing with the specification, verification, execution, monitoring and analysis of these systems. We propose to adopt an extended version of the ConDec language for specifying interaction models with a declarative, open flavor.
We show how all the (extended) ConDec constructs can be automatically translated to the CLIMB Computational Logic-based language, and illustrate how its corresponding reasoning techniques can be successfully exploited to provide support and verification capabilities along the whole life cycle of the targeted systems.
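As an illustration of the declarative flavor discussed above, consider the ConDec-style response constraint, which states that every occurrence of one activity must eventually be followed by another. The following is a minimal sketch in Python; the function name and trace encoding are our own illustrative assumptions, not the CLIMB translation itself:

```python
def satisfies_response(trace, a, b):
    """Check the ConDec-style constraint response(a, b) on a finished trace:
    every occurrence of activity `a` must eventually be followed by `b`."""
    expecting_b = False
    for activity in trace:
        if activity == a:
            expecting_b = True   # an obligation is opened: b must still occur
        elif activity == b:
            expecting_b = False  # the pending obligation (if any) is discharged
    return not expecting_b       # satisfied iff no obligation is left open
```

Run-time monitoring, as described above, amounts to evaluating such constraints incrementally as events of the execution arrive.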

Relevance:

100.00%

Publisher:

Abstract:

The present work offers a comprehensive and comparative study of the different legal and regulatory problems involved in international securitization transactions. First, an introduction to securitization is provided, covering the basic elements of the transaction, followed by its different varieties, including dynamic securitization and synthetic securitization structures. Together with this introduction to the intricacies of the structure, an insight into the influence of securitization on the financial and economic crisis of 2007-2009 is provided, as well as an overview of the process of regulatory competition and cooperation that constitutes the framework for the international aspects of securitization. The next Chapter focuses on the aspects that constitute the foundations of structured finance: the inception of the vehicle, and the transfer of the risks associated with the securitized assets, with particular emphasis on the validity of those elements and on how a securitization transaction could be threatened at its root. In this sense, special importance is given to the validity of the trust as an instrument of finance, to the assignment of future receivables or receivables in block, and to the importance of formalities for the validity of corporations, trusts, assignments, etc., and to the interaction of such formalities contained in general corporate, trust and assignment law with those contemplated under specific securitization regulations. Chapter III then focuses on creditor protection aspects. As such, we provide some insights on the debate on the capital structure of the firm, and on its inadequacy for assessing the financial soundness problems inherent to securitization. We then proceed to analyze the importance of rules on creditor protection in the context of securitization. The corollary lies in the rules that apply in case of insolvency.
In this sense, we distinguish the cases where a party involved in the transaction goes bankrupt from those where the transaction itself collapses. Finally, we focus on the scenario where a substance-over-form analysis may compromise some of the elements of the structure (notably the limited liability of the sponsor, and/or the transfer of assets) by means of veil piercing, substantive consolidation, or recharacterization theories. Once these elements have been covered, the following Chapters focus on the regulatory aspects involved in the transaction. Chapter IV deals with “market” regulations, i.e. those concerned with information disclosure and other rules (appointment of the indenture trustee, and elaboration of a rating by a rating agency) concerning the offering of asset-backed securities to the public. Chapter V, on the other hand, focuses on the “prudential” regulation of the entity entrusted with securitizing assets (the so-called Special Purpose Vehicle), and of the other entities involved in the process. Regarding the SPV, reference is made to licensing requirements, restriction of activities, and governance structures to prevent abuses. Regarding the sponsor of the transaction, the focus is on provisions on sound originating practices and on the servicing function. Finally, we study accounting and banking regulations, including the Basel I and Basel II Frameworks, which determine the consolidation of the SPV and the de-recognition of the securitized asset from the originating company’s balance sheet, as well as the subsequent treatment of those assets, in particular by banks. Chapters VI-IX are concerned with liability matters. Chapter VI is an introduction to the different sources of liability. Chapter VII focuses on the liability of the SPV and its management for the information supplied to investors, the management of the asset pool, and the breach of loyalty (or fiduciary) duties.
Chapter VIII addresses the liability of the originator as a result of such information and statements, but also as a result of inadequate or reckless originating or servicing practices. Chapter IX finally focuses on the third parties entrusted with the soundness of the transaction towards the market, the so-called gatekeepers. In this respect, we place special emphasis on the liability of indenture trustees, underwriters and rating agencies. Chapters X and XI focus on the international aspects of securitization. Chapter X contains a conflict-of-laws analysis of the different aspects of structured finance. In this respect, a study is made of the laws applicable to the vehicle, to the transfer of risks (either by assignment or by means of derivatives contracts), and to liability issues; a study is also made of the competent jurisdiction (and applicable law) in bankruptcy cases, as well as in cases where a substance-over-form analysis is performed. Special attention is then devoted to the role of financial and securities regulations, as well as to their territorial limits and the extraterritoriality problems involved. Chapter XI supplements the prior Chapter, for it analyzes the limits to the States’ exercise of regulatory power set by the personal and “market” freedoms included in the US Constitution or the EU Treaties. Reference is also made to the (still insufficient) rules of the WTO Framework, and to their significance for the States’ recognition and regulation of securitization transactions.

Relevance:

100.00%

Publisher:

Abstract:

Current trends in software development push the need to deal with a multiplicity of diverse activities and interaction styles characterizing complex and distributed application domains, in such a way that the resulting dynamics exhibits some degree of order, i.e. in terms of the evolution of the system and of the desired equilibrium. Autonomous agents and Multiagent Systems are widely argued in the literature to be one of the most immediate approaches for tackling such challenges. Indeed, agent research seems to converge towards the definition of renewed abstraction tools aimed at better capturing the new demands of open systems. Besides agents, which are assumed to be autonomous entities pursuing a series of design objectives, Multiagent Systems introduce new notions as first-class entities, aimed above all at modeling institutional/organizational entities, in place for normative regulation, interaction and teamwork management, as well as environmental entities, in place as resources to further support and regulate agent work. The starting point of this thesis is the recognition that both organizations and environments can be rooted in a unifying perspective. Whereas recent research in agent systems accounts for a set of diverse approaches, each specifically addressing at least one of the aspects mentioned above, this work proposes a unifying approach in which both agents and their organizations can be straightforwardly situated in properly designed working environments. Along this line, this work pursues the reconciliation of environments with sociality, of social interaction with environment-based interaction, and of environmental resources with organizational functionalities, with the aim of smoothly integrating the various aspects of complex and situated organizations in a coherent programming approach.
Rooted in the Agents and Artifacts (A&A) meta-model, which has recently been introduced in the context of both agent-oriented software engineering and programming, the thesis promotes the notion of Embodied Organizations, characterized by computational infrastructures attaining a seamless integration between agents, organizations and environmental entities.

Relevance:

100.00%

Publisher:

Abstract:

Oxidative stress is considered to be of major relevance for a variety of pathological processes. Thus, it is valuable to identify compounds that might act as antioxidants, i.e. compounds that antagonize the deleterious action of reactive oxygen species (ROS) on biomolecules. The mode of action of these compounds could be either to scavenge ROS directly or to trigger protective mechanisms inside the cell, thereby resulting in improved defense against ROS. Sulforaphane (SF) (1-isothiocyanato-(4R)-(methylsulfinyl)butane) is a naturally occurring cancer chemopreventive agent found as a precursor glucosinolate in cruciferous vegetables like broccoli. Although SF is not a direct-acting antioxidant, there is substantial evidence that SF acts indirectly to increase the antioxidant capacity of animal cells and their ability to cope with oxidative stress. Induction of phase 2 enzymes is one means by which SF enhances the cellular antioxidant capacity. Enzymes induced by SF include glutathione S-transferases (GST) and NAD[P]H:quinone oxidoreductase (NQO1), which can function as protectors against oxidative stress. To protect themselves from oxidative stress, cells are equipped with reducing buffer systems including GSH and thioredoxin (Trx) reductase. GSH is an important tripeptide thiol which, in addition to being the substrate for GSTs, maintains the cellular oxidation-reduction balance and protects cells against free radical species. The aim of the first part of this thesis was to investigate the ability of SF to induce the expression and the activity of different phase 2 and antioxidant enzymes (such as GST, GR, GPx, NQO1, TR, SOD, CAT) in an in vitro model of rat cardiomyocytes, and also to determine whether SF treatment supports cells in counteracting oxidative stress induced by H2O2. It is well known that acute exhaustive exercise causes significant reactive oxygen species generation that results in oxidative stress, which can have negative effects on health and well-being.
In fact, increased oxidative stress biomarkers (e.g. protein carbonyls, MDA, and 8-hydroxyguanosine) as well as muscle damage biomarkers (e.g. plasma creatine kinase and lactate dehydrogenase) have been observed after supramaximal sprint exercises, exhaustive long-distance cycling or running, as well as resistance-type exercises, in both trained and untrained humans. Markers of oxidative stress also increase in rodents following exhaustive exercise. Moreover, antioxidant enzyme activities and the expression of antioxidant enzymes are known to increase in response to exhaustive exercise in both animal and human tissues. The aim of this project was to evaluate the effect of SF supplementation in counteracting oxidative stress induced by physical activity, through its ability to induce phase 2 and antioxidant enzymes in rat muscle. The results show that SF is a nutraceutical compound able to induce the activity of different phase 2 and antioxidant enzymes in both cardiac and skeletal muscle. Thanks to these actions, SF is emerging as a promising molecule able to prevent cardiovascular damage induced by oxidative stress and muscle damage induced by acute exhaustive exercise.

Relevance:

100.00%

Publisher:

Abstract:

The reduced cancer risk associated with fruit and vegetable phytochemicals initially dictated chemopreventive approaches focused on the consumption of specific vegetable varieties, or even on single-nutrient supplementation. However, these strategies not only failed to provide any health benefit but also gave rise to detrimental effects. In parallel, public-health chemoprevention programmes were developed in the USA and Europe to increase whole-vegetable consumption. Among these, the National Cancer Institute (NCI) sponsored plan “5 to 9 a day for better health” was one of the most popular. This campaign promoted a wide food choice through the consumption of at least 5 to 9 servings a day of colourful fruits and vegetables. In this study, the effects of the diet suggested by the NCI on the transcription, translation and catalytic activity of both xenobiotic-metabolizing (XME) and antioxidant enzymes were studied in an animal model. In fact, boosting both antioxidant defences and “good” phase-II XMEs, together with down-regulating “bad” phase-I XMEs, is still considered one of the most widely used strategies of cancer control. Six male Sprague-Dawley rats were used for each treatment group. According to the Italian Society of Human Nutrition, a serving of fruit, vegetables and leafy greens corresponds to 150, 250 and 50 g, respectively, in a 70 kg man. Proportionally, rats received one or five servings of lyophilized onion, tomato, peach, black grape or lettuce - for the white, red, yellow, violet or green diet, respectively - or five servings of each vegetable (“5 a day” diet) by daily oral gavage for 10 consecutive days. Liver subcellular fractions were tested for various cytochrome P450 (CYP) linked monooxygenases, phase-II supported XMEs such as glutathione S-transferase (GST) and UDP-glucuronosyl transferase (UDPGT), as well as for some antioxidant enzymes.
Hepatic transcriptional and translational effects were evaluated by reverse transcription-polymerase chain reaction (RT-PCR) and Western blot analysis, respectively. The dROMs test was used to measure plasma oxidative stress. Routine haematochemical parameters were also monitored. While administration of five servings did not significantly alter XME catalytic activity, the lower dose caused a complex pattern of CYP inactivation, with lettuce exerting particularly strong effects (a loss of up to 43% and 45% for CYP content and CYP2B1/2-linked XME, respectively; P<0.01). The “5 a day” supplementation produced the most pronounced modulations (a loss of up to 60% for CYP2E1-linked XME and a reduction of CYP content of 54%; P<0.01). Testosterone hydroxylase activity confirmed these results. RT-PCR and Western blot analysis revealed that the XME inactivations under the “5 a day” diet resulted from both transcriptional and translational effects, while lettuce did not exert such effects. All administrations produced little or no modulation of phase-II supported XMEs. Apart from the “5 a day” supplementation and the single serving of lettuce, which strongly induced DT-diaphorase (an increase of up to 141% and 171%, respectively; P<0.01), antioxidant enzymes were not significantly changed. RT-PCR analysis confirmed the DT-diaphorase induction brought about by the administration of both the “5 a day” diet and a single serving of lettuce. Furthermore, it revealed a similar result for heme oxygenase. The dROMs test indicated a condition of high systemic oxidative stress as a consequence of supplementing the animal diet with the “5 a day” diet or a single serving of lettuce (an increase of up to 600% and 900%, respectively; P<0.01). Haematochemical parameters were mildly affected by such dietary manipulations. According to classical chemopreventive theory, these results could be of particular relevance.
In fact, even if antioxidant enzymes were only mildly affected, the phase-I inactivating ability of these vegetables would make them a worthy strategy for cancer control. However, the considerable systemic amount of reactive oxygen species recorded, and the complexity of these enzymes and their functions, suggest caution in the widespread use of vegan/vegetarian diets as human chemopreventive strategies. In fact, recent literature rather suggests that only diets rich in fruits and vegetables and poor in certain types of fat, together with a moderate caloric intake, may be associated with reduced cancer risk.

Relevance:

100.00%

Publisher:

Abstract:

The monograph analyzes the period ca. 478-461 BC of Athenian history and the career of Cimon son of Miltiades within its contemporary context. The study of Athens, and more generally of the various Greek communities facing the Aegean, in the years immediately after the Persian wars is organized in two parts: the first follows, in chronological order, the available, essentially literary, evidence on the political and military activities of Athens as leader of the Greek alliance; the second draws broader conclusions, grounded in the preceding analysis, and seeks a synthesis of the period and of the man that moves beyond literary stereotypes and conditioning. From this perspective, starting from Thucydides' terse treatment, a reflection is offered on the mechanisms through which the tradition deformed and sedimented the available information, generating a progressive enrichment that led, in effect, to the definition of a 'Cimonian era' whose essential traits can be called into question. The aim is thus to propose an assessment of the period freed from certain elements, ultimately most evident in Plutarch's approach to the subject, that appear alien to the context of the first half of the fifth century: the main themes addressed are Athenian imperialism, philo-Laconism, the polarity between democracy and oligarchy, and political-mythological propaganda. Athens' relationship with Sparta, with the Ionians and with the other communities of the Aegean world is read in light of the available evidence on the climate of uncertainty and delicate equilibrium created in the aftermath of the Persian withdrawal.
The portrait of Cimon proposed here is that of a figure undoubtedly significant in contemporary politics but, at the same time, strongly conditioned and at times overshadowed by dynamics in some way shared across the Athenian political scene, shaped by the satisfaction of needs and aspirations that the tradition would make archetypal of the democratic paradigm.

Relevance:

100.00%

Publisher:

Abstract:

This work aims to provide an overall framing of the mixed public-private company, understood first of all as a private-law instrument in which public and private parties participate. The investigation proceeds on different levels of assessment. It accounts for the peculiar features of this associative contract and for the limits our legal system places on the development of the figure. Attention then shifts to the specific form that the mixed-company model has taken at the European level, through the analysis of the institutionalized public-private partnership. The institution is of particular interest because it identifies in the mixed company an organizational model with specific traits, within which the role of the private partner takes on connotations and forms not common to all corporate models. The research aims to show how this figure has been received in the domestic legal order and what further developments it may find in different fields of economic life. In these terms, we seek to assess the impact of competitive procedures on the formation and life of the mixed company, and how the performance of the activities entrusted to the private partner should be framed within the partnership relationship. On this point, the form the institution has taken in a specific field of activity is central: local public services of economic relevance. In this context, the search for a balance between compliance with the rules protecting competition and the pursuit of the aims that led to the choice of establishing a mixed company is particularly relevant from a systematic standpoint. The choice of this organizational model indeed seems justified only where it brings a real advantage in the management of the service and the achievement of concrete positive synergies.

Relevance:

30.00%

Publisher:

Abstract:

Interaction protocols establish how different computational entities can interact with each other. The interaction can be aimed at the exchange of data, as in 'communication protocols', or oriented towards achieving some result, as in 'application protocols'. Moreover, with the increasing complexity of modern distributed systems, protocols are also used to control such complexity and to ensure that the system as a whole evolves with certain features. However, the extensive use of protocols has raised several issues, from the language for specifying them to the various verification aspects. Computational Logic provides models, languages and tools that can be effectively adopted to address such issues: its declarative nature can be exploited for a protocol specification language, while its operational counterpart can be used to reason upon such specifications. In this thesis we propose a proof-theoretic framework, called SCIFF, together with its extensions. SCIFF is based on Abductive Logic Programming, and provides a formal specification language with a clear declarative semantics (based on abduction). The operational counterpart is given by a proof procedure that allows one to reason upon the specifications and to test the conformance of given interactions w.r.t. a defined protocol. Moreover, by suitably adapting the SCIFF Framework, we propose solutions for addressing (1) the verification of protocol properties (g-SCIFF Framework), and (2) the a-priori conformance verification of peers w.r.t. a given protocol (AlLoWS Framework). We also introduce an agent-based architecture, the SCIFF Agent Platform, where the same protocol specification can be used to program the interacting peers and to ease their implementation task.
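The abductive idea behind conformance testing can be sketched as follows: protocol rules raise positive expectations when certain events happen, and a trace conforms only if every expectation is eventually fulfilled by a matching event. This Python sketch uses hypothetical names and a drastically simplified rule format, not the actual SCIFF syntax:

```python
def conforms(trace, rules):
    """trace: ordered list of event names; rules: maps a triggering event
    to the event it expects to happen later (a positive expectation)."""
    pending = []                          # expectations raised but not yet fulfilled
    for event in trace:
        if event in pending:
            pending.remove(event)         # this event fulfills an expectation
        if event in rules:
            pending.append(rules[event])  # rule fires: raise a new expectation
    return not pending                    # conformant iff nothing is left pending
```

For example, with rules = {"request": "answer"}, the trace ["request", "answer"] conforms, while ["request"] leaves an expectation unfulfilled and does not.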

Relevance:

30.00%

Publisher:

Abstract:

Human movement analysis (HMA) aims to measure the ability of a subject to stand or to walk. In the field of HMA, tests are performed daily in research laboratories, hospitals and clinics, with the aim of diagnosing a disease, distinguishing between disease entities, monitoring the progress of a treatment, and predicting the outcome of an intervention [Brand and Crowninshield, 1981; Brand, 1987; Baker, 2006]. To achieve these purposes, clinicians and researchers use measurement devices such as force platforms, stereophotogrammetric systems, accelerometers, baropodometric insoles, etc. This thesis focuses on the force platform (FP) and in particular on the quality assessment of FP data. The principal objective of our work was the design and the experimental validation of a portable system for the in situ calibration of FPs. The thesis is structured as follows. Chapter 1: description of the physical principles underlying the functioning of a FP, and of how these principles are used to create force transducers, such as strain gauges and piezoelectric transducers; then, a description of the two categories of FPs, three- and six-component, of the signal acquisition (hardware structure), and of the signal calibration; finally, a brief description of the use of FPs in HMA, for balance or gait analysis. Chapter 2: description of inverse dynamics, the most common method used in the field of HMA. This method uses the signals measured by a FP to estimate kinetic quantities, such as joint forces and moments. These variables cannot be measured directly without very invasive techniques; consequently, they can only be estimated using indirect techniques, such as inverse dynamics. Finally, a brief description of the sources of error present in gait analysis. Chapter 3: state of the art in FP calibration.
The selected literature is divided into sections, each describing: systems for the periodic control of FP accuracy; systems for error reduction in the FP signals; systems and procedures for the construction of a FP. In particular, a calibration system designed by our group, based on the theoretical method proposed by ?, is described in detail. This system was the “starting point” for the new system presented in this thesis. Chapter 4: description of the new system, divided into its parts: 1) the algorithm; 2) the device; and 3) the calibration procedure, for correctly performing the calibration process. The characteristics of the algorithm were optimized by a simulation approach, whose results are presented here. In addition, the different versions of the device are described. Chapter 5: experimental validation of the new system, achieved by testing it on 4 commercial FPs. The effectiveness of the calibration was verified by measuring, before and after calibration, the accuracy of the FPs in measuring the center of pressure of an applied force. The new system can estimate local and global calibration matrices; by means of these matrices, the non-linearity of the FPs was quantified and locally compensated. Furthermore, a non-linear calibration is proposed. This calibration compensates for the non-linear effect in the FP functioning due to the bending of its upper plate. The experimental results are presented. Chapter 6: influence of the FP calibration on the estimation of kinetic quantities with the inverse dynamics approach. Chapter 7: the conclusions of this thesis are presented: the need for a calibration of FPs and the consequent enhancement in the quality of kinetic data. Appendix: calibration of the load cell (LC) used in the presented system. Different calibration set-ups of a 3D force transducer are presented, and the optimal set-up is proposed, with particular attention to the compensation of non-linearities. The optimal set-up is verified by experimental results.
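The accuracy check mentioned in Chapter 5 relies on the standard moment-balance relation between the six components measured by a FP and the center of pressure (COP) of the applied force. A minimal Python sketch follows; the function name is ours, and dz (the vertical offset of the sensing origin below the plate surface) is an assumed device parameter:

```python
def center_of_pressure(fx, fy, fz, mx, my, mz, dz=0.0):
    """COP of a force applied to a six-component FP, from moment balance:
    COPx = (-my - fx*dz) / fz,  COPy = (mx - fy*dz) / fz.
    mz (the free moment about the vertical axis) does not enter the COP."""
    if fz == 0:
        raise ValueError("the vertical force component must be non-zero")
    cop_x = (-my - fx * dz) / fz
    cop_y = (mx - fy * dz) / fz
    return cop_x, cop_y
```

For instance, a purely vertical 700 N load applied at (0.1, -0.05) m produces mx = -35 N·m and my = -70 N·m at the plate origin, from which the function recovers the application point.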

Relevance:

30.00%

Publisher:

Abstract:

Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing, at runtime, typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a central point of the scientific activity. Currently, most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented, and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating the methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e.
the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems: however, it is at least clear that all non-agent elements of a multi-agent system are typically considered to be part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions - entities of the environment encapsulating some functions - and topology abstractions - entities of the environment that represent its (either logical or physical) spatial structure. In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for supporting the management of the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, where environment abstractions and layering principles are exploited for engineering multi-agent systems.
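The two environment ingredients named above, together with the layering principle, can be given a minimal object-oriented sketch. All class names here are our own illustrative assumptions, not the actual SODA abstractions:

```python
class EnvironmentAbstraction:
    """An entity of the environment encapsulating some function."""
    def __init__(self, name, function):
        self.name, self.function = name, function

class TopologyAbstraction:
    """An entity representing a (logical or physical) place in the
    spatial structure; it may contain other entities."""
    def __init__(self, name):
        self.name, self.contents = name, []
    def add(self, entity):
        self.contents.append(entity)

class LayeredDescription:
    """Multi-layered description: each entity is placed at an abstraction
    level, and a designer's view at level n shows levels 0..n."""
    def __init__(self):
        self.layers = {}
    def place(self, level, entity):
        self.layers.setdefault(level, []).append(entity)
    def view(self, level):
        return [e for lvl, es in sorted(self.layers.items())
                if lvl <= level for e in es]
```

A designer would then zoom between coarse views (the organization as a whole) and finer ones (individual agents and artifacts) by choosing the level passed to view.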

Relevance:

30.00%

Publisher:

Abstract:

A recent initiative of the European Space Agency (ESA) aims at the definition and adoption of a software reference architecture for use in the on-board software of future space missions. Our PhD project is placed in the context of that effort. At the outset of our work we gathered the industrial needs relevant to ESA and to all the main European space stakeholders, and we were able to consolidate a set of technical high-level requirements for their fulfillment. The conclusion we reached from that phase confirmed that the adoption of a software reference architecture was indeed the best solution for the fulfillment of the high-level requirements. The software reference architecture we set about building rests on four constituents: (i) a component model, to design the software as a composition of individually verifiable and reusable software units; (ii) a computational model, to ensure that the architectural description of the software is statically analyzable; (iii) a programming model, to ensure that the implementation of the design entities conforms to the semantics, the assumptions and the constraints of the computational model; (iv) a conforming execution platform, to actively preserve at run time the properties asserted by static analysis. The nature, feasibility and fitness of constituents (ii), (iii) and (iv) had already been proved by the author in an international project that preceded the commencement of the PhD work. The core of the PhD project was therefore centered on the design and prototype implementation of constituent (i), the component model. Our proposed component model is centered on: (i) a rigorous separation of concerns, achieved with the support for design views and by a careful allocation of concerns to dedicated software entities; (ii) the support for the specification and model-based analysis of extra-functional properties; (iii) the inclusion of space-specific concerns.
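The role of constituent (i) can be illustrated with a toy sketch: components declare provided and required interfaces, and a composition becomes statically checkable for completeness. The names below are our own illustrative assumptions, not the actual component model:

```python
class Component:
    """A reusable software unit with explicit provided/required interfaces."""
    def __init__(self, name, provides=(), requires=()):
        self.name = name
        self.provides = set(provides)
        self.requires = set(requires)

def well_formed(components):
    """A composition is well formed only if every required interface
    is provided by some component in the composition."""
    offered = set().union(*(c.provides for c in components))
    return all(c.requires <= offered for c in components)
```

Checks of this kind, performed on the architectural description rather than on the code, are what makes the composition individually verifiable before implementation.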

Relevance:

30.00%

Publisher:

Abstract:

The quest for universal memory is driving the rapid development of memories with superior all-round capabilities in non-volatility, high speed, high endurance and low power. The memory subsystem accounts for a significant share of the cost and power budget of a computer system. Current DRAM-based main memory systems are starting to hit the power and cost limit. To resolve this issue, the industry is improving existing technologies such as Flash and exploring new ones. Among those new technologies is Phase Change Memory (PCM), which overcomes some of the shortcomings of Flash, such as durability and scalability. This alternative non-volatile memory technology, which uses the resistance contrast in phase-change materials, offers more density relative to DRAM and can help increase the main memory capacity of future systems while remaining within the cost and power constraints. Chalcogenide materials can suitably be exploited for manufacturing phase-change memory devices. Charge transport in the amorphous chalcogenide GST used for memory devices is modeled using two contributions: hopping of trapped electrons and motion of band electrons in extended states. Crystalline GST exhibits an almost Ohmic I(V) curve. In contrast, amorphous GST shows a high resistance at low biases while, above a threshold voltage, a transition takes place from a highly resistive to a conductive state, characterized by a negative differential-resistance behavior. A clear and complete understanding of the threshold behavior of the amorphous phase is fundamental for exploiting such materials in the fabrication of innovative non-volatile memories. The type of feedback that produces the snapback phenomenon is described as a filamentation in energy that is controlled by electron–electron interactions between trapped electrons and band electrons. The model thus derived is implemented within a state-of-the-art simulator. An analytical version of the model is also derived and is useful for discussing the snapback behavior and the scaling properties of the device.
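The qualitative shape of the threshold switching described above (high resistance below threshold, then a snapback to a conductive branch) can be illustrated with a textbook-style empirical V(I) curve for a threshold switch. The parameters and the formula below are a generic toy model for illustration, not the physical electron-interaction model or the analytical model of the dissertation.

```python
import math

# Toy empirical voltage-current characteristic of an amorphous threshold
# switch, illustrating snapback (negative differential resistance).
# Parameter values (v_t, v_h, i0, r_on) are illustrative, NOT fitted
# values from the dissertation's model.

def v_of_i(i, v_t=1.2, v_h=0.4, i0=1e-6, r_on=1e3):
    """Voltage across the device as a function of the driven current.

    For i << i0 the device sits near the threshold voltage v_t (highly
    resistive branch); for i >> i0 it collapses toward the holding
    voltage v_h plus an Ohmic term (conductive branch). In between,
    dV/dI < 0: the snapback region.
    """
    return v_h + (v_t - v_h) * math.exp(-i / i0) + r_on * i
```

Sweeping the current upward through i0 makes the voltage drop from roughly v_t toward v_h, which is exactly the negative differential-resistance signature the abstract refers to.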

Relevance:

30.00%

Publisher:

Abstract:

In this study, some important aspects of the relationship between honey bees (Apis mellifera L.) and pesticides have been investigated. In the first part of the research, the effects of the exposure of honey bees to neonicotinoid- and fipronil-contaminated dusts were analyzed. Considerable amounts of these pesticides, employed for maize seed dressing treatments, may be dispersed during sowing operations, thus representing a route of intoxication for honey bees. In particular, a specific route of exposure to this pesticide formulation, indirect contact, was taken into account. To this aim, we conducted different experiments, in laboratory, semi-field and open-field conditions, in order to assess the effects on mortality, foraging behaviour, colony development and orientation capacity. The actual dispersal of contaminated dusts was previously assessed in specific field trials. In the second part, the impact of various pesticides (chemical and biological) on honey bee biochemical and physiological changes was evaluated. Different routes and durations of exposure to the tested products were also employed. Three experiments were performed, combining Bt spores and deltamethrin, Bt spores and fipronil, and difenoconazole and deltamethrin. Several important enzymes (GST, ALP, SOD, CAT, G6PDH, GAPDH) were selected in order to test the pesticide-induced variations in their activity. These enzymes are involved in different pathways of detoxification, oxidative stress defence and energetic metabolism. The results showed a significant effect of neonicotinoid- and fipronil-contaminated dusts on mortality, both in laboratory and in semi-field trials. However, no effects on the honey bees' orientation capacity were evidenced. The analysis of different biochemical indicators highlighted some interesting physiological variations that can be linked to pesticide exposure. We therefore draw attention to the possibility of using such a methodology as a novel toxicity endpoint in environmental risk assessment.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents several techniques designed to drive a swarm of robots through an a priori unknown environment, moving the group from a starting area to a final one while avoiding obstacles. The presented techniques are based on two theories, used alone or in combination: Swarm Intelligence (SI) and Graph Theory. Both theories study the interactions between different entities (also called agents or units) in Multi-Agent Systems (MAS): the first belongs to the Artificial Intelligence context and the second to the Distributed Systems context. Each theory, from its own point of view, exploits the emergent behaviour that arises from the interactive work of the entities in order to achieve a common goal. The flexibility and adaptability of the swarm have been exploited to overcome and minimize difficulties and problems that can affect one or more units of the group, with minimal impact on the whole group and on the common main target. Another aim of this work is to show the importance of the information shared between the units of the group through the communication topology, because it helps keep the environmental information detected by each single agent up to date across the swarm. Swarm Intelligence is applied through the Particle Swarm Optimization (PSO) algorithm, whose features are exploited as a navigation system. Graph Theory is applied by exploiting Consensus, using the agreement protocol to maintain the units in a desired, controlled formation. This approach preserves the power of PSO while controlling part of its random behaviour with a distributed control algorithm such as Consensus.
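The two building blocks named above, a PSO velocity update for navigation and a consensus (agreement) step for formation keeping, can be sketched in a few lines. This is a minimal 1-D illustration with illustrative gains and topology, not the combined controller developed in the thesis.

```python
import random

# Minimal sketch of the two ingredients: a standard PSO velocity update
# (navigation) and one iteration of a consensus/agreement protocol
# (formation keeping). Gains w, c1, c2 and the consensus gain are
# illustrative values, not the thesis's tuned parameters.

def pso_velocity(v, x, p_best, g_best, w=0.7, c1=1.5, c2=1.5):
    """Standard PSO velocity update (1-D for brevity):
    inertia + cognitive pull toward the personal best
    + social pull toward the global best."""
    r1, r2 = random.random(), random.random()
    return w * v + c1 * r1 * (p_best - x) + c2 * r2 * (g_best - x)

def consensus_step(positions, neighbours, gain=0.2):
    """One iteration of the agreement protocol:
    x_i <- x_i + gain * sum_{j in N(i)} (x_j - x_i),
    which drives neighbouring units toward agreement."""
    return [x + gain * sum(positions[j] - x for j in neighbours[i])
            for i, x in enumerate(positions)]
```

In a combined scheme, PSO proposes each unit's motion toward the goal, while interleaved consensus steps contract the group toward the desired formation, taming part of PSO's randomness.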

Relevance:

30.00%

Publisher:

Abstract:

In many application domains data can be naturally represented as graphs. When the application of analytical solutions for a given problem is unfeasible, machine learning techniques can be a viable way to solve it. Classical machine learning techniques are defined for data represented in a vectorial form. Recently, some of them have been extended to deal directly with structured data. Among these techniques, kernel methods have shown promising results both from the computational complexity and the predictive performance points of view. Kernel methods avoid an explicit mapping into a vectorial form by relying on kernel functions, which, informally, compute a similarity measure between two entities. However, the definition of good kernels for graphs is a challenging problem because of the difficulty of finding a good tradeoff between computational complexity and expressiveness. Another problem we face is learning on data streams, where a potentially unbounded sequence of data is generated by some source. There are three main contributions in this thesis. The first contribution is the definition of a new family of kernels for graphs based on Directed Acyclic Graphs (DAGs). We analyzed two kernels from this family, achieving state-of-the-art results from both the computational and the classification points of view on real-world datasets. The second contribution consists in making the application of learning algorithms to streams of graphs feasible; moreover, we defined a principled approach to memory management. The third contribution is the application of machine learning techniques for structured data to non-coding RNA function prediction. In this setting, the secondary structure is thought to carry relevant information. However, existing methods that consider the secondary structure have prohibitively high computational complexity. We propose to apply kernel methods to this domain, obtaining state-of-the-art results.
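The core notion above, a kernel function as a similarity measure that stands in for an explicit vectorial mapping, can be illustrated with a deliberately simple graph kernel. The node-label histogram kernel below is far cruder than the DAG-based kernels of the thesis, but it is a valid positive semi-definite kernel and shows the basic idea.

```python
from collections import Counter

# Toy graph kernel for illustration: the dot product of node-label
# histograms, k(G1, G2) = <phi(G1), phi(G2)> where phi counts labels.
# Much simpler than the thesis's DAG-based kernels, but it shows how a
# similarity between two structures replaces an explicit feature vector.

def label_histogram_kernel(labels_a, labels_b):
    """Similarity between two labelled graphs, given as lists of
    node labels: sum over shared labels of count_a * count_b."""
    ca, cb = Counter(labels_a), Counter(labels_b)
    return sum(ca[l] * cb[l] for l in ca.keys() & cb.keys())
```

Because the kernel is symmetric and corresponds to an inner product in the (implicit) label-count feature space, it can be plugged directly into kernel machines such as SVMs; richer kernels differ only in what structure (walks, subtrees, DAGs) their implicit features count.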