811 results for New concept


Relevance:

60.00%

Publisher:

Abstract:

The mechanisms responsible for containing activity in systems represented by networks are crucial in various phenomena, for example, in diseases such as epilepsy that affect neuronal networks and in information dissemination in social networks. The first models to account for contained activity included triggering and inhibition processes, but they cannot be applied to social networks, where inhibition is clearly absent. A recent model showed that contained activity can be achieved without inhibition processes provided that the network is subdivided into modules (communities). In this paper, we introduce a new concept inspired by Hebbian theory, through which containment of activity is achieved by incorporating a decaying-activity dynamics into a random-walk mechanism preferential to node activity. Upon selecting the decay coefficient within a proper range, we observed sustained activity in all the networks tested, namely random, Barabási-Albert and geographical networks. The generality of this finding was confirmed by showing that modularity is no longer needed if the integrate-and-fire dynamics incorporates the decay factor. Taken together, these results provide a proof of principle that persistent, restrained network activation can occur in the absence of any particular topological structure. This may be the reason why neuronal activity does not spread out to the entire neuronal network, even when no special topological organization exists.
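
As a rough illustration of the kind of dynamics described above (a sketch under stated assumptions, not the authors' actual model), the following Python snippet implements a random walk that hops preferentially towards active nodes while all node activities decay exponentially at every step; the toy ring network, decay coefficient and deposit rule are invented for the example:

```python
import random

# Illustrative sketch: a walker hops to a neighbour with probability
# proportional to that neighbour's activity, deposits activity there,
# and all activities decay by the factor `decay` at every time step.

def simulate(adj, steps=1000, decay=0.95, deposit=1.0, seed=0):
    rng = random.Random(seed)
    activity = {v: 1.0 for v in adj}          # uniform initial activity
    node = rng.choice(list(adj))              # walker's starting node
    totals = []
    for _ in range(steps):
        nbrs = adj[node]
        weights = [activity[n] + 1e-12 for n in nbrs]  # preferential to activity
        node = rng.choices(nbrs, weights=weights, k=1)[0]
        activity[node] += deposit             # the walker activates the node
        for v in activity:                    # global exponential decay
            activity[v] *= decay
        totals.append(sum(activity.values()))
    return totals

# Toy ring network: for a suitable decay the total activity stays
# bounded (contained) without ever dying out.
ring = {i: [(i - 1) % 20, (i + 1) % 20] for i in range(20)}
trace = simulate(ring)
print(f"total activity after {len(trace)} steps: {trace[-1]:.3f}")
```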

Relevance:

60.00%

Publisher:

Abstract:

Transportation planning is currently being confronted with a broader planning view, given by the concept of mobility. The Index of Sustainable Urban Mobility (I_SUM) is among the tools developed for supporting the implementation of this new concept. It is a tool to assess the current mobility conditions of any city, which can also be applied to policy formulation. This study focuses on the application of I_SUM in the city of Curitiba, Brazil. Considering that the city is known worldwide as a reference of successful urban and transportation planning, the index application was expected to confirm this. An additional objective of the study was to evaluate the index itself, i.e., the underlying assessment method and reference values. A global I_SUM value of 0.747 confirmed that the city indeed has very positive characteristics regarding sustainable mobility policies. However, some deficiencies were also detected, particularly with respect to non-motorized transport modes. The application also showed that a few I_SUM indicators were not able to capture some of the positive aspects of the city, which may suggest the need for changes in their formulation. Finally, the index application in parts of the city suggests that the city provides fair and equitable mobility conditions to all citizens throughout the city. This is certainly a good attribute for becoming a benchmark of sustainable mobility, even if it is not yet the ideal model. (C) 2012 Elsevier Ltd. All rights reserved.
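
The abstract does not give the I_SUM aggregation rule; the snippet below is a hypothetical illustration of how a composite index of this kind is commonly computed, as a weighted average of normalized indicator scores. The domains, weights and scores are invented for the example, not the actual I_SUM hierarchy or Curitiba's data:

```python
# Hypothetical composite index in the spirit of I_SUM: normalized
# indicator scores in [0, 1] aggregated with weights summing to 1.
indicators = {
    # domain: (weight, normalized score) -- all values invented
    "public transport":     (0.30, 0.85),
    "non-motorized modes":  (0.25, 0.55),
    "traffic safety":       (0.20, 0.80),
    "environmental impact": (0.25, 0.78),
}

assert abs(sum(w for w, _ in indicators.values()) - 1.0) < 1e-9

index = sum(w * s for w, s in indicators.values())
print(f"composite index = {index:.3f}")  # ~0.748 for these invented numbers
```

A low score in a single heavily weighted domain (here "non-motorized modes") drags the global value down in exactly the way the Curitiba application detected for non-motorized transport.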

Relevance:

60.00%

Publisher:

Abstract:

Infants born to HIV-infected mothers are at high risk of becoming infected during gestation or the breastfeeding period. A search is thus warranted for vaccine formulations that will prevent mother-to-child HIV transmission. The LAMP/gag DNA chimeric vaccine encodes the HIV-1 p55gag fused to the lysosome-associated membrane protein-1 (LAMP-1) and has been shown to enhance anti-Gag antibody (Ab) and cellular immune responses in adult and neonatal mice; such a vaccine represents a new concept in antigen presentation. In this study, we evaluated the effect on neonates of LAMP/gag DNA immunization administered either before conception or during pregnancy. LAMP/gag immunization of BALB/c mice before conception by the intradermal route led to the transfer of anti-Gag IgG1 Ab through the placenta and via breastfeeding. Furthermore, there was an increased percentage of CD4+ CD25+ Foxp3+ T cells in the spleens of neonates. When offspring were immunized with LAMP/gag DNA, the anti-Gag Ab response and the number of Gag-specific IFN-gamma-secreting cells were decreased. Inhibition of anti-Gag Ab production and cellular responses was not observed six months after immunization, indicating that maternal immunization did not interfere with the long-lasting memory response in offspring. Injection of purified IgG in conjunction with LAMP/gag DNA immunization decreased humoral and cytotoxic T-cell responses. LAMP/gag DNA immunization by intradermal injection prior to conception promoted the transfer of Ab, leading to a diminished response to Gag without interfering with the development of anti-Gag T- and B-cell memory. Finally, we assessed responses after one intravenous injection of LAMP/gag DNA during the last five days of pregnancy. The intravenous injection led to in utero immunization. In conclusion, DNA vaccines encoding LAMP-1 with Gag and other HIV-1 antigens should be considered in the development of a protective vaccine for the maternal/fetal and newborn periods.

Relevance:

60.00%

Publisher:

Abstract:

Reconstruction of bone is needed in cases of high bone loss due to congenital deformities, trauma or neoplastic diseases. Common orthopaedic surgical treatments are autologous or allogenic bone implants or prosthetic implants. An alternative to the traditional approaches could be tissue engineering, which uses cells (and/or their products) and innovative biomaterials to produce biologically active bone substitutes in place of artificial devices. In recent years there have been wide advances in the biology of stem cell potential and in biomedical engineering, through the development of new biomaterials designed to resemble physiological tissues. Tissue engineering strategies and smart materials together aim to stimulate bone regeneration in vivo. These approaches aim not only to restore the structural integrity and/or function of the original tissue, but also to induce new tissue deposition in situ. An intelligent bone substitute is now designed not only as a scaffold but also as a carrier of biomolecular regeneration signals. Biomimetics has helped in designing new tissue-engineered devices that simulate the architecture of physiological substrates, such as the extracellular matrix (ECM), and the molecular signals that drive integration at the interface between pre-existing tissue and scaffold. Biomimetic strategies seek to increase the biological activity of material surfaces through physical modifications (topography) or chemical ones (adhesive peptides), to improve cell adhesion to the material surface and, possibly, scaffold colonization. This study evaluated the effects of biomimetic modifications of the surfaces of surgical materials, such as poly-caprolactone (PCL) and titanium, on bone stem cell behaviour in an in vitro marrow experimental model. Two biomimetic strategies were analyzed: ion beam irradiation, which changes the surface roughness at the nanoscale, and surface functionalization with specific adhesive peptides or Self-Assembled Monolayers (SAMs). These new concepts could be a means to improve the early (cell adhesion, spreading, ...) and late (osteoblast differentiation) phases of cell/substrate interactions.

Relevance:

60.00%

Publisher:

Abstract:

Bioinformatics is a recent and emerging discipline which aims at studying biological problems through computational approaches. Most branches of bioinformatics, such as Genomics, Proteomics and Molecular Dynamics, are particularly computationally intensive, requiring huge amounts of computational resources for running algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as of physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three-year Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I created three major frameworks, Vnas, GridDBManager and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides an abstraction with reliability over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that eases the creation of arbitrarily complex multistage computational pipelines, and it provides an abstracted virtual sandbox which bypasses Grid limitations. Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting the equality of virtual sandbox files based on their content, across different submissions, even when performed by different users. BGBlast, the evolution of the earlier project GridBlast, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager offers novel features such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing response times (performance) against storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage costs required to keep such older versions available in the Grid environment by two orders of magnitude. The SETest framework provides a way for the user to test and regression-test Python applications riddled with side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing invocations and results of filtered calls. The framework hence significantly accelerates the development of new applications and computational pipelines for the Grid environment, and reduces the effort required for their maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D. work originated various publications in journals and conference proceedings, as reported in the Appendix. I also presented my work orally at numerous international conferences related to Grid and bioinformatics.
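
Vnas's content-based detection of identical sandbox files is described only at a high level above; the following is a minimal sketch of the general technique (content-addressable deduplication via a cryptographic digest), with file handling and naming that are assumptions for illustration rather than Vnas's actual implementation:

```python
import hashlib
from pathlib import Path

# Minimal sketch of content-based file deduplication: files with the same
# digest have the same content, so later copies can be replaced by a
# reference instead of being re-uploaded to Grid storage.

def digest(path: Path, chunk: int = 1 << 20) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):   # read in 1 MiB chunks
            h.update(block)
    return h.hexdigest()

def deduplicate(paths):
    """Map each content digest to the first file seen with that content;
    later files with an identical digest reuse the stored copy."""
    store, plan = {}, []
    for p in map(Path, paths):
        key = digest(p)
        if key in store:
            plan.append((p, "reuse", store[key]))   # skip the upload
        else:
            store[key] = p
            plan.append((p, "upload", p))
    return plan
```

Because the digest depends only on file content, the scheme works across submissions and across users, which is exactly the property the abstract attributes to Vnas.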

Relevance:

60.00%

Publisher:

Abstract:

Nowadays licensing practices have increased in importance and relevance, driving the widespread diffusion of markets for technologies. Firms are shifting from a tactical to a strategic attitude towards licensing, addressing both business- and corporate-level objectives. The Open Innovation Paradigm has been embraced: firms rely more and more on collaboration and external sourcing of knowledge. This new model of innovation requires firms to leverage external technologies to unlock the potential of their internal innovative efforts. In this context, firms' competitive advantage depends both on their ability to recognize available opportunities inside and outside their boundaries and on their readiness to exploit them in order to fuel their innovation process dynamically. Licensing is one of the ways available to a firm to reap the advantages associated with an open attitude in technology strategy. From the licensee's point of view this implies challenging the so-called not-invented-here syndrome, which affects the more traditional firms that emphasize the myth of internal research and development supremacy. It also entails understanding the so-called cognitive constraints affecting the perfect functioning of markets for technologies, which are associated with the costs of assimilation, integration and exploitation of external knowledge by recipient firms. My thesis aimed at shedding light on new, interesting issues associated with in-licensing activities that have been neglected by the literature on licensing and markets for technologies. The reason for this gap is the "perspective bias" affecting the works within this stream of research. With very few notable exceptions, they have generally been concerned with the investigation of the so-called licensing dilemma of the licensor (whether to license out or to internally exploit the in-house developed technologies), while neglecting the licensee's perspective. In my opinion, this has left room for improving the understanding of the determinants and conditions affecting licensing-in practices. From the licensee's viewpoint, the licensing strategy deals with the search, integration, assimilation and exploitation of external technologies. As such it lies at the very heart of the firm's technology strategy. Improving our understanding of this strategy is thus required to assess the full implications of in-licensing decisions, as they shape firms' innovation patterns and the evolution of their technological capabilities. It also allows for understanding the cognitive constraints associated with the not-invented-here syndrome. In recognition of that, the aim of my work is to contribute to the theoretical and empirical literature explaining the determinants of the licensee's behavior, by providing a comprehensive theoretical framework as well as ad-hoc conceptual tools to understand and overcome frictions and to ease the achievement of satisfactory technology transfer agreements in the marketplace. To this end, I investigate licensing-in in three different ways, developed in three research papers. In the first work, I investigate the links between licensing and the patterns of firms' technological search diversification according to the frameworks of the Search literature, Resource-Based Theory and the theory of general-purpose technologies. In the second paper, which continues where the first one left off, I analyze the new concept of learning-by-licensing, in terms of the development of new knowledge inside the licensee firms (e.g. new patents) some years after the acquisition of the license, according to the Dynamic Capabilities perspective. Finally, in the third study, I deal with the determinants of the remuneration structure of patent licenses (form and amount), and in particular with the role of the upfront fee from the licensee's perspective. To this end, I combine the insights of two theoretical approaches: agency theory and real options theory.

Relevance:

60.00%

Publisher:

Abstract:

This study concerns the representation of space in Caribbean literature, both Francophone and Anglophone, and in particular, but not only, in Martinican literature, in the works of authors born on the island. The analysis focuses on the second half of the last century, a period in which the Martinican production of novels increased considerably and in which the representation and role of space held a relevant place. The thesis thus explores the literary modalities of this representation. The work consists of five chapters, and the critical and methodological approaches are of both an analytical and a comparative type. The first chapter, "The Caribbean space: geography, history and society", presents the geographic context through an analysis of the major historical and political events that occurred in the Caribbean archipelago, in particular in the French Antilles, from the first colonization until the départementalisation. The first paragraph, "The colonized space: historical-political excursus", explores the history of the European colonization that forever marked the theatre of the relationship between Europe, Africa and the New World. This social situation set in motion a long and complex process of "re-appropriation and renegotiation of the space" (second paragraph), always the space of the Other, involving both Antillean society and the writers' universe. A series of questions thus arises in the third paragraph, "Landscape and identity": what is the function of space in the process of identity construction? What are the literary forms and representations of space in the Caribbean context? Can writing be a tool for defining cultural identity, both individual and collective? The second chapter, "The literary representation of the Antillean space", is a methodological analysis of the notions of literary space and of the descriptive genre. The first paragraph, "The literary space of and in the novel", is an excursus through the theories of critics such as Blanchot, Bachelard, Genette and Greimas, in particular the recent innovations of the 20th century; the second, "Space of the Antilles, space of the writing", is an attempt to apply these theories to the Antillean literary space. Finally, the last paragraph, "Signs on the page: the symbolic places of the Antillean novel landscape", presents an inventory of the most recurrent Antillean places (mornes, ravines, traces, cachots, En-ville, ...), symbols of history and of the past, described in literary works but according to new modalities of representation. The third chapter, the core of the thesis, "Re-drawing the map of the French Antilles", focuses the study of space representation on Francophone literature, in particular on selected works of four Martinican writers: Roland Brival, Édouard Glissant, Patrick Chamoiseau and Raphaël Confiant. Through this section a spatial evolution emerges step by step, from the first to the second paragraph, whose titles are linked together: "The novel space evolution: from the forest of the morne... to the jungle of the ville". The virgin and uncontaminated space of the Antilles prior to colonisation, where the Indios lived in harmony with nature, finds representation in the works of Brival (Le sang du roucou, Le dernier des Aloukous) and of Glissant (Le Quatrième siècle, Ormerod). The arrival of the European colonizer brought a violent and sudden metamorphosis of the original space and landscape, together with the traditions and culture of the Caraïbes population. These radical changes are visible in the works of Chamoiseau (Chronique des sept misères, Texaco, L'esclave vieil homme et le molosse, Livret des villes du deuxième monde, Un dimanche au cachot) and Confiant (Le Nègre et l'Amiral, Eau de Café, Ravines du devant-jour, Nègre marron), which explore the urban space of the creole En-ville. The fourth chapter represents the second step, "The Anglophone novel space", in the exploration of the literary representation of space, through an analytical study of the works of three Anglophone writers: the 19th-century Lafcadio Hearn (A Midsummer Trip To the West Indies, Two Years in the French West Indies, Youma) and the contemporary authors Derek Walcott (Omeros, Map of the New World, What the Twilight Says) and Edward Kamau Brathwaite (The Arrivants: A New World Trilogy). The Anglophone voice of the Caribbean archipelago brings a very interesting contribution to the critical idea of a spatial evolution in the literary representation of space begun with the Francophone production: "The spatial evolution goes on: from the Martinique Sketches of Hearn... to the modern bards of the Caribbean archipelago" is the new linked title of the two paragraphs. The fifth chapter, "Extended look, space shared: the Caribbean archipelago", is a comparative analysis of the results achieved in the prior sections, through a dialogue between all the texts in the first paragraph, "Francophone and Anglophone representation of space compared: differences and analogies". The last paragraph, instead, is an attempt to renegotiate the conventional notions of space and place, from a geographical and physical meaning to the new concept of "commonplace": not a synonym of prejudice, but a "common place" of sharing and dialogue. The question posed in the last paragraph, "The 'commonplaces' of the physical and mental map of the Caribbean archipelago: toward a non-place?", contains the critical idea of the entire thesis.

Relevance:

60.00%

Publisher:

Abstract:

Family businesses have acquired a very specific weight in the economies of Western countries, generating most of the employment and wealth over recent decades. In Spain, family businesses represent 65% of all enterprises, some 1.5 million companies. They give employment to 8 million people, 80% of private employment, and generate 65% of the Spanish GNP (Gross National Product). Nevertheless, the family business needs a complete legal regulation that satisfies its own necessities and challenges. These companies have to deal with the national and international economic scene to ensure their permanence and competitiveness. In fact, statistics show that family companies have a median life of 35 years, and only between 10% and 25% of European family businesses survive their succession process. It is said: the first generation makes, the second generation stays, the third generation distributes. In that sense, the Recommendation of the European Commission of 7 December 1994 on the succession of small and medium companies has prompted reforms of European national legal orders intended to ease the succession process and to introduce practices of good family-company governance. Thus Italian law, through Law No. 14 of February 2006, reformed the Civil Code, introducing a new concept called the "Patto di famiglia", which abolishes the prohibition laid down in Article 458 on succession agreements, admitting the possibility that the testator guarantees the continuity of the company, or of the family society, by giving it, totally or in part, to one or more of his descendants. On the other hand, Spain has promulgated Royal Decree 171/2007 of 9 February 2007, which governs the publicity of family agreements (protocolos familiares). These "protocolos familiares" (family agreements) are accords of wills, consented to and accepted unanimously by all the members of the family and the company, taking into account the recommendations and practices of good family-company governance.

Relevance:

60.00%

Publisher:

Abstract:

The quality of astronomical sites is the first consideration in obtaining the best performance from telescopes. In particular, the efficiency of large telescopes in the UV, IR, radio, etc. critically depends on atmospheric transparency. It is well known that the random optical effects induced on light propagation by the turbulent atmosphere also limit a telescope's performance. The importance of correlating the main atmospheric physical parameters with the optical quality reachable by large-aperture telescopes is now clear. Sky quality evaluation has improved with the introduction of new techniques and new instrumentation, and with the understanding of the link between the meteorological (or synoptic) parameters and the observational conditions, thanks to the application of the theories of electromagnetic wave propagation in turbulent media: what we now call astroclimatology. At present, site campaigns have evolved and are performed using the classical scheme of optical seeing properties, meteorological parameters, sky transparency, sky darkness and cloudiness. New concepts have been added, related to geophysical properties such as seismicity, microseismicity, local variability of the climate, atmospheric conditions related to ground optical turbulence and ground wind regimes, aerosol presence, and the use of satellite data. The purpose of this project is to provide reliable methods to analyze the atmospheric properties that affect ground-based optical astronomical observations and to correlate them with the main atmospheric parameters generating turbulence and affecting photometric accuracy. The first part of the research concerns the analysis and interpretation of long- and short-time-scale meteorological data at two of the most important astronomical sites, located in very different environments: the Paranal Observatory in the Atacama Desert (Chile), and the Observatorio del Roque de los Muchachos (ORM) in La Palma (Canary Islands, Spain). The optical properties of airborne dust at ORM have been investigated by collecting outdoor data using a ground-based dust monitor. Because of its dryness, Paranal is a suitable observatory for near-IR observations; thus the extinction properties in the spectral range 1.00-2.30 um have been investigated using an empirical method. Furthermore, this PhD research made use of several turbulence profilers in the selection of the site for the European Extremely Large Telescope (E-ELT). During the campaigns, the properties of the turbulence at different heights at Paranal and at sites located in northern Chile and Argentina were studied. This gave the possibility of characterizing the surface-layer turbulence at Paranal and its connection with local meteorological conditions.

Relevance:

60.00%

Publisher:

Abstract:

Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of the quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottled water or soda and buy products in boxes, such as food or cigarettes. Another indication of their complexity is the fact that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact among themselves in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through the verification of the correctness of the processing; guiding the operator attending the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing diagnostic information in real time, as a support for the maintenance operations of the machine. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, in fact provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have been adopted for a long time, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been receiving this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During recent years it has been possible to note considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS, together with reliable mechanical elements, an increasing number of electronic devices are also present, which are more vulnerable by their own nature. The diagnosis problem and fault isolation in a generic dynamical system consist in the design of an elaboration unit that, appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to the formal verification of logic control, fault diagnosis and fault tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of the Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach is presented, based on Discrete Event Systems, for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. In Appendix A some concepts and results about Discrete Event Systems are briefly reported, which should help the reader in understanding some crucial points in Chapter 5, while in Appendix B an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna is reported, used to validate the approaches presented in Chapters 3, 4 and 5. In Appendix C some component models used in Chapter 5 for formal verification are reported.
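
As a toy illustration of the discrete-event view of logic control discussed above (invented states and events, not the thesis's Generalized Actuator or its verification machinery), the following Python snippet models a plant as a finite automaton and flags event sequences that reach a forbidden fault state:

```python
# Toy discrete-event model: a finite automaton tracks the plant state,
# and a supervisor check rejects event sequences reaching a forbidden
# state. All states and events here are invented for the example.

TRANSITIONS = {
    ("idle",    "start"): "running",
    ("running", "stop"):  "idle",
    ("running", "fault"): "failed",
    ("failed",  "reset"): "idle",
}
FORBIDDEN = {"failed"}

def run(events, state="idle"):
    trace = [state]
    for e in events:
        nxt = TRANSITIONS.get((state, e))
        if nxt is None:
            raise ValueError(f"event {e!r} undefined in state {state!r}")
        state = nxt
        trace.append(state)
    ok = not (FORBIDDEN & set(trace))   # no forbidden state was visited
    return ok, trace

print(run(["start", "stop", "start"]))    # (True,  [...])
print(run(["start", "fault", "reset"]))   # (False, [...]) fault occurred
```

In the same discrete-event spirit, formal verification amounts to checking such reachability properties over all admissible event strings rather than over a single simulated trace.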

Relevance:

60.00%

Publisher:

Abstract:

The present PhD thesis summarizes a three-year study on the neutronic investigation of a new-concept nuclear reactor aiming at the optimization and sustainable management of nuclear fuel in a possible European scenario. A new-generation nuclear reactor for the nuclear renaissance is indeed desired by today's industrialized world, both to answer the energy question arising from continuously growing energy demand together with the corresponding reduction in oil availability, and to answer the environmental question by providing a sustainable energy source free from long-lived radioisotopes, and therefore from geological repositories. Among the Generation IV candidate typologies, the Lead Fast Reactor concept has been pursued, being the one top-rated in sustainability. The European Lead-cooled SYstem (ELSY) was investigated first. The neutronic analysis of the ELSY core was performed via deterministic analysis by means of the ERANOS code, in order to obtain a stable configuration for the overall design of the reactor. Further analyses were carried out by means of the Monte Carlo general-purpose transport code MCNP, in order to check the former analysis and to define an exact model of the system. An innovative system of absorbers has been conceptualized and designed both for the reactivity compensation and regulation of the core due to cycle swing, and for safety, in order to guarantee the cold shutdown of the system in case of accident. Aiming at the sustainability of nuclear energy, the steady-state nuclear equilibrium has been investigated and generalized into the definition of the "extended" equilibrium state. On this basis, the Adiabatic Reactor Theory has been developed, together with a New Paradigm for Nuclear Power: in order to design a reactor that does not exchange anything valuable with the environment (thus the term "adiabatic"), in the sense of both plutonium and minor actinides, it is indeed required to invert the logical design scheme of nuclear cores, starting from the definition of the equilibrium composition of the fuel and subordinating the whole core design to the latter. The New Paradigm has then been applied to the core design of an Adiabatic Lead Fast Reactor complying with the ELSY overall system layout. A complete core characterization has been carried out in order to assess criticality and power flattening; a preliminary evaluation of the main safety parameters has also been done to verify the viability of the system. Burn-up calculations have then been performed in order to investigate the operating cycle of the Adiabatic Lead Fast Reactor; the fuel performances have been extracted and inserted in a more general analysis for a European scenario. The present nuclear reactor fleet has been modeled and its evolution simulated by means of the COSI code, in order to investigate the material fluxes to be managed in the European region. Different plausible scenarios have been identified to forecast the evolution of European nuclear energy production, including one involving the introduction of Adiabatic Lead Fast Reactors, and compared in order to better analyze the advantages introduced by the adoption of new-concept reactors. At last, since both ELSY and the ALFR represent new-concept systems based upon innovative solutions, the neutronic design of a demonstrator reactor has been carried out: such a system is intended to prove the viability of the technology to be implemented in the First-of-a-Kind industrial power plant, with the aim of attesting the general strategy to the largest extent. It was chosen, then, to base the DEMO design upon a compromise between the demonstration of developed technology and the testing of emerging technology, in order to significantly serve the purpose of reducing uncertainties about construction and licensing, both validating the main ELSY/ALFR features and performances, and qualifying numerical codes and tools.
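
The abstract does not spell out the equilibrium condition it builds on; in standard burn-up notation, the steady-state ("equilibrium") fuel composition referred to above is the one for which each nuclide's production balances its destruction. This is the textbook form of the balance, not necessarily the thesis's exact "extended" formulation:

```latex
% Equilibrium nuclide balance: N_i = nuclide densities, \lambda = decay
% constants, \sigma = one-group cross sections, \phi = neutron flux.
\frac{dN_i}{dt}
  \;=\; \underbrace{\sum_{j \ne i} \bigl(\lambda_{j\to i} + \sigma_{j\to i}\,\phi\bigr)\, N_j}_{\text{production of nuclide } i}
  \;-\; \underbrace{\bigl(\lambda_i + \sigma_{a,i}\,\phi\bigr)\, N_i}_{\text{destruction of nuclide } i}
  \;=\; 0
```

Solving this coupled system for the vector of densities N_i, and only then designing the core around the resulting composition, is the inversion of the usual design scheme that the New Paradigm describes.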

Relevance:

60.00%

Publisher:

Abstract:

A new concept for a proton-conducting polymer is presented that does without a second, liquid phase. It is based on covalently attaching basic groups (imidazole) to a polymer backbone via flexible spacers and introducing charge carriers (protons) into this system by doping with a small amount of acid. In order to identify the quantities responsible for the conductivity and its mechanism, a set of low-molecular-weight model compounds of defined structure and high purity was synthesized and comprehensively characterized, both in the pure state and after doping with small amounts of acid. The thermal properties, the conductivity, the diffusion of the respective model compound and, where applicable, of the added acid, the protonation equilibrium and the dielectric properties were investigated. In particular, conclusions about the conduction mechanism were drawn by comparing conductivity and diffusion data using the Nernst-Einstein relation. Conductivities of up to 6.5E-3 S/cm at 120°C were achieved. The contribution of structure diffusion (comparable to the Grotthuss mechanism in water) to the protonic conductivity amounted to more than 90%. The glass-transition temperature and, with lower priority, the imidazole content of the material were identified as the decisive factors for the conductivity. The temperature dependence of all investigated transport quantities was excellently described by the Vogel-Tamman-Fulcher equation. The data presented form the basis for the design of a corresponding polymer.
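
For reference, the two relations invoked above, in their standard textbook forms (conventional symbols, not notation taken from the thesis):

```latex
% Nernst-Einstein relation: conductivity expected from diffusion alone,
% with N the number density of charge carriers, q their charge,
% D their diffusion coefficient, k_B T the thermal energy.
\sigma_{\mathrm{NE}} \;=\; \frac{N q^{2}}{k_{B} T}\, D

% Vogel-Tamman-Fulcher equation: transport above the glass transition,
% with fit parameters \sigma_0, B, and the Vogel temperature T_0.
\sigma(T) \;=\; \sigma_{0} \exp\!\left( \frac{-B}{T - T_{0}} \right)
```

Comparing the measured conductivity with the Nernst-Einstein value computed from the measured diffusion coefficients is what allows the structure-diffusion contribution quoted above to be separated from simple vehicle-type transport of the protonated species.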

Relevance:

60.00%

Publisher:

Abstract:

Membrane lipid rafts are detergent-resistant microdomains containing glycosphingolipids, cholesterol and glycosylphosphatidylinositol-linked proteins; they seem to be actively involved in many cellular processes, including signal transduction, apoptosis, cell adhesion and migration. Lipid rafts may represent important functional platforms where redox signals are produced and transmitted in response to various agonists or stimuli. In addition, a new concept is emerging that could be used to define the interaction and mutual amplification of redox signalling and lipid raft-associated signalling. This concept is characterized by redox-mediated feed-forward amplification in lipid platforms. It is proposed that lipid rafts are formed in response to various stimuli; for instance, NAD(P)H oxidase (Nox) subunits are aggregated or recruited in these platforms, increasing Nox activity. Superoxide and hydrogen peroxide generation could induce various regulatory activities, such as the induction of glucose transport activity and proliferation in leukaemia cells. The aim of our study is to probe: i) the involvement of lipid rafts in the modulation of the glucose transporter Glut1 in human acute leukaemia cells; ii) the involvement of plasma membrane caveolae/lipid rafts in VEGF-mediated redox signalling via Nox activation in human leukaemic cells; iii) the role of p66shc, an adaptor protein, in VEGF signalling and ROS production in endothelial cells (ECs); iv) the role of Syndecan-2, a transmembrane heparan sulphate proteoglycan, in VEGF signalling and the physiological response of ECs; and v) the antioxidant and pro-apoptotic activities of simple dietary phenolic acids, i.e. caffeic, syringic and protocatechuic acids, in leukaemia cells, which are characterized by a very high ROS content. Our results suggest that the role played by NAD(P)H oxidase-derived ROS in the regulation of glucose uptake, proliferation and migration of leukaemia and endothelial cells likely occurs through the control of lipid raft-associated signalling.

Relevance:

60.00%

Publisher:

Abstract:

During this internship, the α-alkylation of branched aldehydes was investigated. An enantiopure Betti's base derivative was used as catalyst, applying a new concept in catalysis: organocatalysis. Betti's base may be of particular interest to organic chemists working in the field of reactions catalysed by enantiopure small organic molecules, in particular to those interested in enantiopure primary amines. The potential of secondary amines as catalysts has certainly been known for years; it is more innovative to conduct such reactions using primary amine derivatives as catalysts. Further significant aspects are the many apparent advantages of this new type of catalysis from the economic and environmental points of view, as well as the operational and synthetic ones. In this work, the efficacy of the primary amine was checked first. Then, the focus was set on finding the optimal reaction conditions. Finally, to obtain a more complete picture of the structure of the compounds used in the project (product and catalyst), experimental and computational IR spectra were compared, after the computational method had been validated.
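
The abstract mentions comparing experimental and computational IR spectra after validating the method; a common minimal form of such a comparison (sketched below with invented frequency values, not the project's data) is to fit a single scaling factor to the computed harmonic frequencies and report the residual error:

```python
# Hypothetical sketch: least-squares scaling of computed harmonic IR
# frequencies against matched experimental band positions. The numbers
# are placeholders, not data from this work.
calc = [1742.0, 1465.3, 1210.8, 1098.4]   # computed frequencies (cm^-1)
expt = [1715.0, 1448.0, 1195.0, 1085.0]   # experimental bands   (cm^-1)

# Closed-form least-squares scale factor c minimizing sum (e - c*v)^2
c = sum(e * v for e, v in zip(expt, calc)) / sum(v * v for v in calc)
rmsd = (sum((e - c * v) ** 2 for e, v in zip(expt, calc)) / len(calc)) ** 0.5
print(f"scale factor = {c:.4f}, RMSD = {rmsd:.1f} cm^-1")
```

A scale factor close to the values typical for the chosen level of theory, together with a small RMSD, is what "validating the method" amounts to in this kind of comparison.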

Relevance:

60.00%

Publisher:

Abstract:

The main goals of this work were the design, synthesis and characterization of new functional polyphenylene dendrimers. Polyphenylene dendrimers are highly branched, monodisperse macromolecules consisting exclusively of benzene rings. They can be obtained in high yield by a repetitive Diels-Alder cycloaddition/deprotection protocol. Their shape-persistent dendritic scaffold makes it possible to obtain nanoparticles with functional groups in a defined relative orientation. In the first chapter, polyphenylene dendrimers with a pyrene core are presented. The focus of the investigation was the shielding efficiency of dendritic shells of different generations upon the pyrene functionality in the core. The materials presented here combine high quantum efficiency, good solubility and improved film-forming properties, making them possible candidates for several applications in electronic devices. The defined functionalization of polyphenylene dendrimers often requires great synthetic effort, since for every desired function the appropriate building block has to be synthesized. To overcome these disadvantages, a new functionalization concept based upon benzophenone precursors has been developed. This new concept has been successfully applied to the functionalization of the dendritic core, the dendrimer shell and the dendrimer surface. To investigate the accessibility and reactivity of the embedded groups, many functions of different size and nature were introduced. Moreover, suitable precursors for the synthesis of dendrimer-entrapped species, namely trityl cations, trityl radicals and ketyl radical anions, were obtained. The combination of the synthetic protocols of core- and surface-functionalization resulted in a new type of functional molecule, highly interesting from the point of view of electron-transfer processes. A polyphenylene dendron was used to arrange a triphenylamine donor and a perylene acceptor moiety at a defined spatial distance and orientation. An in-depth photophysical investigation of a first model compound is reported. The functionalized dendrimers presented here are highly interesting both from the point of view of fundamental research (looking into the optical and electronic properties of such unique shape-persistent structures) and from the point of view of their potential application as tailor-made nanomaterials in the field of optoelectronics.