980 results for Design Complexity


Relevance: 30.00%

Abstract:

Multicommodity flow (MF) problems have a wide variety of applications in areas such as VLSI circuit design and network design, and are therefore very well studied. The fractional MF problems are polynomial-time solvable, while the integer versions are NP-complete. However, exact algorithms for the fractional MF problems have high computational complexity, so approximation algorithms have been explored in the literature to reduce it. Combining these approximation algorithms with the randomized rounding technique yields polynomial-time approximation algorithms for the integer versions. In the design of high-speed networks, such as optical wavelength division multiplexing (WDM) networks, providing survivability carries great significance. Survivability is the ability of the network to recover from failures; it further increases the complexity of network design and presents network designers with more formidable challenges. In this work we formulate the survivable versions of the MF problems. We build approximation algorithms for the survivable multicommodity flow (SMF) problems based on the framework of the approximation algorithms for the MF problems presented in [1] and [2]. We discuss applications of the SMF problems to solve survivable routing in capacitated networks.
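For reference, the fractional problem admits a standard linear programming statement; one common path-based form of the maximum multicommodity flow problem (a generic textbook formulation, not necessarily the one used in [1] and [2]) is

    \max \sum_{k} \sum_{P \in \mathcal{P}_k} f_P
    \quad \text{s.t.} \quad \sum_{k} \sum_{P \in \mathcal{P}_k : e \in P} f_P \le c_e \;\; \forall e \in E, \qquad f_P \ge 0,

where \mathcal{P}_k is the set of admissible paths for commodity k, f_P the flow on path P, and c_e the capacity of edge e. The LP is polynomial-time solvable, while requiring integral f_P makes the problem NP-complete, which is where randomized rounding of the fractional optimum enters.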

Relevance: 30.00%

Abstract:

There are some variants of the widely used Fuzzy C-Means (FCM) algorithm that support clustering of data distributed across different sites. Those methods have been studied under different names, such as collaborative and parallel fuzzy clustering. In this study, we offer some augmentation of the two FCM-based clustering algorithms used to cluster distributed data, by arriving at constructive ways of determining essential parameters of the algorithms (including the number of clusters) and by forming a set of systematically structured guidelines, such as how to select the specific algorithm depending on the nature of the data environment and the assumptions made about the number of clusters. A thorough complexity analysis, covering space, time, and communication aspects, is reported. A series of detailed numeric experiments is used to illustrate the main ideas discussed in the study.
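For reference, the standard (centralized) FCM iteration underlying these distributed variants alternates, for fuzzifier m > 1, data x_k, prototypes v_i and memberships u_{ik}, the updates

    u_{ik} = \left[ \sum_{j=1}^{c} \left( \frac{\lVert x_k - v_i \rVert}{\lVert x_k - v_j \rVert} \right)^{2/(m-1)} \right]^{-1},
    \qquad
    v_i = \frac{\sum_{k=1}^{N} u_{ik}^{m} x_k}{\sum_{k=1}^{N} u_{ik}^{m}},

until the partition matrix stabilizes; the collaborative and parallel variants differ mainly in which of these quantities (e.g. prototypes or partition information) are exchanged between sites.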

Relevance: 30.00%

Abstract:

In fluid dynamics research, pressure measurements are of great importance for defining the flow field acting on aerodynamic surfaces. In fact, the experimental approach is fundamental to avoid the complexity of the mathematical models for predicting fluid phenomena. It is important to note that, when in-situ sensors are used to monitor pressure over large domains with highly unsteady flows, classical techniques run into several problems related to transducer cost, intrusiveness, time response and operating range. An interesting approach to satisfying these sensor requirements is to implement a sensor network capable of acquiring pressure data on an aerodynamic surface through a wireless communication system, collecting the pressure data with the lowest possible level of environmental invasion. In this thesis a wireless sensor network for fluid-field pressure has been designed, built and tested. To develop the system, a capacitive pressure sensor, based on a polymeric membrane, and read-out circuitry, based on a microcontroller, have been designed, built and tested. The wireless communication has been implemented on the Zensys Z-Wave platform, and network and data management have been implemented. Finally, the full embedded system with antenna has been created. As a proof of concept, monitoring of pressure on the top of the mainsail of a sailboat was chosen as a working example.
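As a first-order illustration (a textbook parallel-plate model, not a design detail taken from the abstract), a polymeric-membrane capacitive sensor can be described by

    C = \varepsilon A / d,

where A is the electrode area and d the gap: pressure deflects the membrane, changing the effective gap d and hence the capacitance C, which the microcontroller-based read-out circuitry converts into a pressure reading.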

Relevance: 30.00%

Abstract:

Introduction. Postnatal neurogenesis in the hippocampal dentate gyrus can be modulated by numerous determinants, such as hormones, transmitters and stress. Among the factors positively interfering with neurogenesis, the complexity of the environment appears to play a particularly striking role. Adult mice reared in an enriched environment produce more neurons and exhibit better performance in hippocampus-specific learning tasks. While the effects of complex environments on hippocampal neurogenesis are well documented, there is a lack of information on the effects of living under socio-sensory deprivation conditions. Due to the immaturity of rats and mice at birth, studies dealing with the effects of environmental enrichment on hippocampal neurogenesis have been carried out in adult animals, i.e. during a period of relatively low neurogenesis. The impact of environment is likely to be more dramatic during the first postnatal weeks, because at this time granule cell production is remarkably higher than at later phases of development. The aim of the present research was to clarify whether, and to what extent, isolated or enriched rearing conditions affect hippocampal neurogenesis during the early postnatal period, a time window characterized by a high rate of precursor proliferation, and to elucidate the mechanisms underlying these effects. The experimental model chosen for this research was the guinea pig, a precocious rodent which, at 4-5 days of age, can be independent from maternal care.

Experimental design. Animals were assigned to a standard (control), an isolated, or an enriched environment a few days after birth (P5-P6). On P14-P17 animals received one daily bromodeoxyuridine (BrdU) injection, to label dividing cells, and were sacrificed either on P18, to evaluate cell proliferation, or on P45, to evaluate cell survival and differentiation.

Methods. Brain sections were processed for BrdU immunohistochemistry to quantify the newborn and surviving cells. The phenotype of the surviving cells was examined by means of confocal microscopy and immunofluorescent double-labeling for BrdU and either a marker of neurons (NeuN) or a marker of astrocytes (GFAP). Apoptotic cell death was examined with the TUNEL method. Serial sections were processed for immunohistochemistry for i) vimentin, a marker of radial glial cells, ii) BDNF (brain-derived neurotrophic factor), a neurotrophin involved in neuron proliferation/survival, and iii) PSA-NCAM (the polysialylated form of the neural cell adhesion molecule), a molecule associated with neuronal migration. Total granule cell number in the dentate gyrus was evaluated by stereological methods in Nissl-stained sections.

Results. Effects of isolation. In P18 isolated animals we found reduced cell proliferation (-35%) compared to controls and a lower expression of BDNF. Though in absolute terms P45 isolated animals had fewer surviving cells than controls, they showed no differences in survival rate and phenotype percent distribution compared to controls. Evaluation of the absolute number of surviving cells of each phenotype showed that isolated animals had fewer cells with a neuronal phenotype than controls. Looking at the location of the new neurons, we found that while in control animals 76% of them had migrated to the granule cell layer, in isolated animals only 55% of the new neurons had reached this layer. Examination of radial glia cells of P18 and P45 animals by vimentin immunohistochemistry showed that in isolated animals radial glia cells were reduced in density and had fewer and shorter processes. Granule cell counts revealed that isolated animals had fewer granule cells than controls (-32% at P18 and -42% at P45). Effects of enrichment. In P18 enriched animals there was an increase in cell proliferation (+26%) compared to controls and a higher expression of BDNF. Though in both groups the number of BrdU-positive cells declined by P45, enriched animals had more surviving cells (+63%) and a higher survival rate than controls. No differences were found between control and enriched animals in phenotype percent distribution. Evaluation of the absolute number of cells of each phenotype showed that enriched animals had a larger number of cells of each phenotype than controls. Looking at the location of cells of each phenotype, we found that enriched animals had more new neurons in the granule cell layer and more astrocytes and cells with undetermined phenotype in the hilus. Enriched animals had a higher expression of PSA-NCAM in the granule cell layer and hilus. Vimentin immunohistochemistry showed that in enriched animals radial glia cells were more numerous and had more processes. Granule cell counts revealed that enriched animals had more granule cells than controls (+37% at P18 and +31% at P45).

Discussion. Results show that isolation rearing reduces hippocampal cell proliferation but does not affect cell survival, while enriched rearing increases both cell proliferation and cell survival. Changes in the expression of BDNF are likely to contribute to the effects of environment on precursor cell proliferation. The reduction and increase in the final number of granule neurons in isolated and enriched animals, respectively, are attributable to the effects of environment on cell proliferation and survival, and not to changes in the differentiation program. As radial glia cells play a pivotal role in guiding neurons to the granule cell layer, the reduced number of radial glia cells in isolated animals and the increased number in enriched animals suggest that the size of the radial glia population may change dynamically, in order to match changes in neuron production. The high PSA-NCAM expression in enriched animals may concur to favor the survival of the new neurons by facilitating their migration to the granule cell layer.

Conclusions. By using a precocious rodent we could demonstrate that isolated/enriched rearing conditions, applied during a time window of intense granule cell proliferation, lead to a notable decrease/increase in total granule cell number. The time course and magnitude of postnatal granule cell production in guinea pigs are more similar to the human and non-human primate condition than those of rats and mice. Translation of the current data to humans would imply that exposure of children to environments poor/rich in stimuli may have a notably large impact on dentate neurogenesis and, very likely, on hippocampus-dependent memory functions.

Relevance: 30.00%

Abstract:

Nowadays, computing is migrating from traditional high-performance and distributed computing to pervasive and utility computing based on heterogeneous networks and clients. The current trend suggests that future IT services will rely on distributed resources and on fast communication of heterogeneous contents. The success of this new range of services is directly linked to the effectiveness of the infrastructure in delivering them. The communication infrastructure will be the aggregation of different technologies, even though the current trend suggests the emergence of a single IP-based transport service. Optical networking is a key technology for answering the increasing requests for dynamic bandwidth allocation and for configuring multiple topologies over the same physical-layer infrastructure; however, optical networks today are still far from being directly configurable to offer network services, and need to be enriched with more user-oriented functionalities. Moreover, current Control Plane architectures only facilitate efficient end-to-end connectivity provisioning and certainly cannot meet future network service requirements, e.g. the coordinated control of resources. The overall objective of this work is to provide the network with improved usability and accessibility of the services offered by the optical network. More precisely, the definition of a service-oriented architecture is the enabling technology that allows user applications to benefit from advanced services over an underlying dynamic optical layer. The definition of a service-oriented networking architecture based on advanced optical network technologies facilitates user and application access to abstracted levels of information regarding the offered advanced network services. This thesis faces the problem of defining a Service Oriented Architecture and its relevant building blocks, protocols and languages. In particular, this work has focused on the use of the SIP protocol as an inter-layer signalling protocol, which defines the Session Plane in conjunction with the Network Resource Description language. On the other hand, an advanced optical network must accommodate high data bandwidth with different granularities. Currently, two main technologies are emerging to promote the development of the future optical transport network: Optical Burst Switching and Optical Packet Switching. The two technologies promise to provide all-optical burst or packet switching, respectively, instead of the current circuit switching. However, the electronic domain is still present in the scheduler's forwarding and routing decisions. Because of the high optical transmission frequency, the burst or packet scheduler faces a difficult challenge; consequently, a high-performance and timing-focused design of both memory and forwarding logic is needed. This open issue has been faced in this thesis by proposing a highly efficient implementation of a burst and packet scheduler. The main novelty of the proposed implementation is that the scheduling problem is turned into the simple calculation of a min/max function, whose complexity is almost independent of the traffic conditions.
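As a hedged illustration of how burst scheduling can reduce to a min/max computation (this is the classic horizon/LAUC idea from the optical burst switching literature, not necessarily the exact design proposed in the thesis): each output channel keeps a "horizon", the time at which it becomes free, and assigning an arriving burst is a single max-style selection over the channels that are free before the burst arrives.

    # Horizon-based (LAUC-style) burst scheduling sketch: assigning a burst
    # reduces to a max over per-channel horizons -- hypothetical illustration.

    def schedule_burst(horizons, arrival, duration):
        """Pick the channel whose horizon is latest but still <= arrival
        (minimizing the idle gap). Returns the channel index or None."""
        best, best_h = None, -1.0
        for ch, h in enumerate(horizons):
            if h <= arrival and h > best_h:   # feasible and tightest fit
                best, best_h = ch, h
        if best is None:
            return None                       # no free channel: burst dropped
        horizons[best] = arrival + duration   # channel busy until burst ends
        return best

    channels = [0.0, 0.0, 0.0]                # all channels initially free
    print(schedule_burst(channels, arrival=1.0, duration=0.5))  # -> 0
    print(schedule_burst(channels, arrival=1.2, duration=0.8))  # -> 1

Because only a running max (or min, for the smallest gap variant) over the per-channel horizons is maintained, the per-burst decision cost depends on the number of channels rather than on the traffic load, consistent with the complexity claim in the abstract.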

Relevance: 30.00%

Abstract:

Technology advances in recent years have dramatically changed the way users exploit contents and services available on the Internet, enforcing pervasive and mobile computing scenarios and enabling access to networked resources almost from everywhere, at any time, and independently of the device in use. In addition, people increasingly require the ability to customize their experience, by exploiting specific device capabilities and limitations, inherent features of the communication channel in use, and interaction paradigms that significantly differ from the traditional request/response one. The so-called Ubiquitous Internet scenario calls for solutions that address many different challenges, such as device mobility, session management, content adaptation, context-awareness and the provisioning of multimodal interfaces. Moreover, new service opportunities demand simple and effective ways to integrate existing resources into new and value-added applications that can also undergo run-time modifications, according to ever-changing execution conditions. Although service-oriented architectural models are gaining momentum to tame the increasing complexity of composing and orchestrating distributed and heterogeneous functionalities, existing solutions generally lack a unified approach and only provide support for specific Ubiquitous Internet aspects. Moreover, they usually target rather static scenarios and scarcely support the dynamic nature of pervasive access to Internet resources, which can make existing compositions soon become obsolete or inadequate, and hence in need of reconfiguration. This thesis proposes a novel middleware approach to deal comprehensively with Ubiquitous Internet facets and to assist in establishing innovative application scenarios. We claim that a truly viable ubiquity support infrastructure must neatly decouple the distributed resources to be integrated, and must push any kind of content-related logic outside its core layers, keeping only management and coordination responsibilities. Furthermore, we promote an innovative, open, and dynamic resource composition model that makes it easy to describe and enforce complex scenario requirements, and to react suitably to changes in the execution conditions.

Relevance: 30.00%

Abstract:

Alzheimer's disease (AD) and cancer represent two of the main causes of death worldwide. They are complex multifactorial diseases, and several biochemical targets have been recognized to play a fundamental role in their development. Given their complex nature, a promising therapeutic approach is represented by the so-called "Multi-Target-Directed Ligand" (MTDL) approach. This new strategy is based on the assumption that a single molecule can hit several targets responsible for the onset and/or progression of the pathology. In AD in particular, most currently prescribed drugs aim to increase the level of acetylcholine in the brain by inhibiting the enzyme acetylcholinesterase (AChE). However, clinical experience shows that AChE inhibition is a palliative treatment, and the simple modulation of a single target does not address AD aetiology. Research into newer and more potent anti-AD agents is thus focused on compounds whose properties go beyond AChE inhibition (such as inhibition of the enzyme β-secretase and inhibition of the aggregation of beta-amyloid). The MTDL strategy therefore seems a more appropriate approach for addressing the complexity of AD and may provide new drugs for tackling its multifactorial nature. This thesis describes the design of new MTDLs able to tackle the multifactorial nature of AD. The new MTDLs designed are less flexible analogues of Caproctamine, one of the first MTDLs with biological properties useful for AD treatment. These new compounds are able to inhibit the enzymes AChE and β-secretase, and to inhibit both AChE-induced and self-induced beta-amyloid aggregation. In particular, the most potent compound of the series inhibits AChE in the subnanomolar range, inhibits β-secretase at micromolar concentration, and inhibits both AChE-induced and self-induced beta-amyloid aggregation at micromolar concentration. Cancer, like AD, is a very complex pathology, and many different therapeutic approaches are currently used for its treatment. However, due to its multifactorial nature, the MTDL approach could in principle be applied to this pathology as well. A further aim of this thesis has been the development of new molecules bearing different structural motifs able to interact simultaneously with some of the multitude of targets responsible for the pathology. The designed compounds displayed cytotoxic activity in different cancer cell lines. In particular, the most potent compounds of the series were further evaluated and were able to bind DNA, proving 100-fold more potent than the reference compound Mitonafide. Furthermore, these compounds were able to trigger apoptosis through caspase activation and to inhibit PIN1 (preliminary result). This last protein is a very promising target because it is overexpressed in many human cancers, it functions as a critical catalyst for multiple oncogenic pathways, and in several cancer cell lines its depletion determines mitotic arrest followed by apoptosis induction. In conclusion, this study may represent a promising starting point for the development of new MTDLs hopefully useful for cancer and AD treatment.

Relevance: 30.00%

Abstract:

In recent years, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by the higher requirements on systems with great performance in terms of quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda and buy boxed products such as food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact among themselves in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS, from a functional and behavioural point of view, is still to be attributed to the design choices operated in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called to perform other main functions such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the operator devoted to the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support to the maintenance operations of the machine. The kind of facilities that designers can directly find on the market, in terms of software component libraries, provides adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.

What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have been adopted for a long time, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has been receiving this approach lately, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During the last years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.

This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and electronic devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey on the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to obtain better reusability and modularity of the control logic. In Chapter 5 a new approach is presented, based on Discrete Event Systems, for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader in understanding some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
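As a toy illustration of the Discrete Event Systems viewpoint used for formal verification (a generic sketch under assumed names, not the thesis's actual models): the plant is modelled as a finite automaton over event labels, and a property such as "a fault state is unreachable under the supervised behaviour" reduces to a reachability search over the restricted event set.

    # Minimal Discrete Event System sketch: a finite automaton over event
    # labels and a breadth-first reachability check -- hypothetical example.
    from collections import deque

    # transitions[state][event] -> next state (a tiny actuator model)
    transitions = {
        "idle":   {"start": "moving"},
        "moving": {"done": "idle", "jam": "fault"},
        "fault":  {"reset": "idle"},
    }

    def reachable(initial, target, allowed_events):
        """Return True if `target` can be reached from `initial` using only
        events in `allowed_events` (the supervisor's enabled event set)."""
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            for ev, nxt in transitions.get(s, {}).items():
                if ev in allowed_events and nxt not in seen:
                    if nxt == target:
                        return True
                    seen.add(nxt)
                    frontier.append(nxt)
        return False

    # The unsupervised plant can reach the fault state...
    print(reachable("idle", "fault", {"start", "done", "jam", "reset"}))  # True
    # ...while a supervisor disabling the jam-prone behaviour cannot.
    print(reachable("idle", "fault", {"start", "done", "reset"}))         # False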

Relevance: 30.00%

Abstract:

This thesis deals with the design of advanced OFDM systems. Both waveform and receiver design have been treated. The main scope of the thesis is to study, create, and propose ideas and novel design solutions able to cope with the weaknesses and crucial aspects of modern OFDM systems. Starting from the transmitter side, the problem represented by low resilience to non-linear distortion has been assessed. A novel technique that considerably reduces the Peak-to-Average Power Ratio (PAPR), yielding a quasi-constant signal envelope in the time domain (PAPR close to 1 dB), has been proposed. The proposed technique, named Rotation Invariant Subcarrier Mapping (RISM), is a novel scheme for subcarrier data mapping, where the symbols belonging to the modulation alphabet are not anchored, but maintain some degrees of freedom. In other words, a bit tuple is not mapped onto a single point; rather, it is mapped onto a geometrical locus which is totally or partially rotation invariant. The final positions of the transmitted complex symbols are chosen by an iterative optimization process in order to minimize the PAPR of the resulting OFDM symbol. Numerical results confirm that RISM makes OFDM usable even in severely non-linear channels. Another well-known problem which has been tackled is the vulnerability to synchronization errors. Indeed, in OFDM systems an accurate recovery of carrier frequency and symbol timing is crucial for the proper demodulation of the received packets. In general, timing and frequency synchronization is performed in two separate phases, called PRE-FFT and POST-FFT synchronization. Regarding the PRE-FFT phase, a novel joint symbol timing and carrier frequency synchronization algorithm has been presented. The proposed algorithm is characterized by a very low hardware complexity and, at the same time, guarantees very good performance in both AWGN and multipath channels. Regarding the POST-FFT phase, a novel approach for both pilot structure and receiver design has been presented. In particular, a novel pilot pattern has been introduced in order to minimize the occurrence of overlaps between two pattern-shifted replicas. This makes it possible to replace conventional pilots with nulls in the frequency domain, introducing the so-called Silent Pilots. As a result, the optimal receiver turns out to be very robust against severe Rayleigh multipath fading and is characterized by low complexity. The performance of this approach has been analytically and numerically evaluated. Comparing the proposed approach with state-of-the-art alternatives, in both AWGN and multipath fading channels, considerable performance improvements have been obtained. The crucial problem of channel estimation has been thoroughly investigated, with particular emphasis on the decimation of the Channel Impulse Response (CIR) through the selection of the Most Significant Samples (MSSs). In this context our contribution is twofold: from the theoretical side, we derived lower bounds on the estimation mean-square error (MSE) performance for any MSS selection strategy; from the receiver design side, we proposed novel MSS selection strategies which have been shown to approach these MSE lower bounds and to outperform the state-of-the-art alternatives. Finally, the possibility of using Single Carrier Frequency Division Multiple Access (SC-FDMA) in the Broadband Satellite Return Channel has been assessed.

Notably, SC-FDMA is able to improve the physical-layer spectral efficiency with respect to the single carrier systems which have been used so far in the Return Channel Satellite (RCS) standards. However, it requires strict synchronization and is also sensitive to the phase noise of local radio frequency oscillators. For this reason, an effective pilot tone arrangement within the SC-FDMA frame and a novel Joint Multi-User (JMU) estimation method for SC-FDMA have been proposed. As shown by numerical results, the proposed scheme manages to satisfy strict synchronization requirements and to guarantee proper demodulation of the received signal.
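For reference, the quantity RISM minimizes is the standard PAPR of the discrete-time OFDM symbol x[n] (a textbook definition, stated here as background rather than taken from the thesis):

    \mathrm{PAPR}\{x\} = \frac{\max_{n} |x[n]|^2}{\frac{1}{N}\sum_{n=0}^{N-1} |x[n]|^2},

usually reported in decibels as 10 \log_{10} of this ratio; a perfectly constant envelope gives PAPR = 1 (0 dB), which is why a value close to 1 dB indicates a quasi-constant envelope.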

Relevance: 30.00%

Abstract:

Synthetic biology has recently undergone great development; many papers have been published and many applications have been presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, however, most of the applications are quite simple and do not fully exploit the potential of this discipline. This limitation in complexity has many causes, like the incomplete characterization of some components or the intrinsic variability of biological systems, but one of the most important reasons is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to try to solve this problem through the engineering of a multicellular behaviour in prokaryotic cells. This system will introduce a cooperative behaviour allowing the implementation of complex functionalities that cannot be obtained with a single cell. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to identify a single process as organizer and coordinator of a series of tasks assigned to the whole population. The election of the Leader greatly simplifies the computation by providing a centralized control. Furthermore, this system may even be useful for evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and the experimental characterization of a component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after they receive the signal that a leader is present in the colony. The most important element, in this case, is the hybrid promoter; it has been realized in different versions, applying the heuristic rules stated in [22], and their activity has been experimentally tested. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction, in the cellular environment, of particular molecules, the inducers, which can be considered inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is one only in the presence of both inducers. The robustness and the stability of this behaviour have been tested by changing the concentration of the input signals and building dose-response curves. From these data it is possible to conclude that the analysed constructs have an AND-like behaviour over a wide range of inducer concentrations, even if it is possible to identify many differences in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so that their binary representation is not able to capture the full complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system that is necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission. In particular, the interaction between this element and the one responsible for emitting the chemical signal has been tested.

The desired behaviour is still similar to a logic AND since, even in this case, the output signal is determined by the hybrid promoter activity. The experimental results have demonstrated that the systems behave correctly, even if there is still substantial variability between them. The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter the DNA sequences of the hybrid promoters are analysed, trying to identify the regulatory elements that are most important for the determination of gene expression. Given the available data, it was not possible to draw definitive conclusions. In the end, a few considerations on promoter engineering and the realization of complex circuits are presented. This section aims to briefly recall some of the problems outlined in the introduction and to provide a few possible solutions.
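As a hedged illustration of the AND-like, graded dose-response behaviour described above (a generic Hill-function model with made-up parameters, not fitted to the thesis data): the normalized output can be modelled as the product of two Hill activation terms, one per inducer, so that it is high only when both inputs are present.

    # Generic two-input AND-gate dose-response model (Hill kinetics).
    # Parameters K1, K2, n are illustrative placeholders, not fitted values.

    def hill(x, K, n):
        """Hill activation: ~0 when x << K, ~1 when x >> K."""
        return x**n / (K**n + x**n)

    def and_gate_output(i1, i2, K1=10.0, K2=10.0, n=2.0):
        """Normalized fluorescence: high only when both inducers are high."""
        return hill(i1, K1, n) * hill(i2, K2, n)

    # A coarse dose-response surface over inducer concentrations (a.u.):
    for i1 in (0.0, 1.0, 10.0, 100.0):
        row = [and_gate_output(i1, i2) for i2 in (0.0, 1.0, 10.0, 100.0)]
        print(" ".join(f"{v:5.2f}" for v in row))
    # Only the corner where both inducers are high approaches 1.0, while
    # intermediate concentrations give the graded profiles seen in real
    # dose-response curves (the continuous behaviour the binary AND misses).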

Relevance: 30.00%

Abstract:

The research activities described in the present thesis have been oriented to the design and development of components and technological processes aimed at optimizing the performance of plasma sources in advanced material treatments. Consumable components for high-definition plasma arc cutting (PAC) torches were studied and developed. Experimental activities focused in particular on modifications of the emissive insert, with respect to the standard electrode configuration comprising a press-fit hafnium insert in a copper body holder, in order to improve its durability. Based on a deep analysis of both the scientific and patent literature, different solutions were proposed and tested. First, the behaviour of Hf cathodes operating at high current levels (250 A) in an oxidizing atmosphere was experimentally investigated, optimizing the initial shape of the electrode emissive surface with respect to the expected service life. Moreover, the microstructural modifications of the Hf insert in PAC electrodes were experimentally investigated during the first cycles, in order to understand the phenomena occurring on and under the Hf emissive surface and involved in the electrode erosion process. Thereafter, the research activity focused on producing, characterizing and testing prototypes of composite inserts, combining powders of high-thermal-conductivity (Cu, Ag) and high-thermionic-emissivity (Hf, Zr) materials. The complexity of the thermal plasma torch environment required an integrated approach also involving physical modelling. Accordingly, a detailed line-by-line method was developed to compute the net emission coefficient of Ar plasmas at temperatures ranging from 3000 K to 25000 K and pressures ranging from 50 kPa to 200 kPa, for optically thin and partially self-absorbed plasmas. Finally, prototypal electrodes were studied and realized for a newly developed plasma source, based on the plasma needle concept and devoted to the generation of atmospheric-pressure non-thermal plasmas for biomedical applications.
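For context (a standard construct in thermal plasma radiation modelling, stated here as background rather than taken from the abstract): the net emission coefficient \varepsilon_N(T, p, R) is defined so that the net radiative power lost per unit volume at the centre of an isothermal plasma sphere of radius R is

    \nabla \cdot \mathbf{F}_{\mathrm{rad}} = 4\pi\, \varepsilon_N(T, p, R),

where R = 0 recovers the optically thin limit and larger R accounts for partial self-absorption of the emitted lines, which is what a line-by-line computation over the spectrum evaluates.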

Relevance: 30.00%

Abstract:

The aim of Tissue Engineering is to develop biological substitutes that restore lost morphological and functional features of diseased or damaged portions of organs. Recently, computer-aided technology has received considerable attention in the area of tissue engineering, and the advance of additive manufacturing (AM) techniques has significantly improved control over the pore network architecture of tissue engineering scaffolds. To regenerate tissues more efficiently, an ideal scaffold should have appropriate porosity and pore structure. More sophisticated porous configurations, with pore networks and scaffolding structures that mimic the intricate architecture and complexity of native organs and tissues, are therefore required. This study adopts a macro-structural shape design approach to the production of open porous materials (titanium foams), which utilizes spatial periodicity as a simple way to generate the models. From among the various pore architectures that have been studied, this work modelled the pore structure by triply periodic minimal surfaces (TPMS) for the construction of tissue engineering scaffolds. TPMS are shown to be a versatile source of biomorphic scaffold designs. A set of tissue scaffolds using TPMS-based unit cell libraries was designed. The TPMS-based titanium foams were intended to be 3D printed with the predicted geometry and microstructure, and consequently the predicted mechanical properties. Through finite element analysis (FEA), the mechanical properties of the designed scaffolds were determined in compression and analyzed in terms of their porosity and assemblies of unit cells. The purpose of this work was to investigate the mechanical performance of the TPMS models, seeking the best compromise between the mechanical and geometrical requirements of the scaffolds. The intention was to predict the structural modulus of open porous materials via the structural design of interconnected three-dimensional lattices, hence optimizing the geometrical properties. With the aid of the FEA results, it is expected that the effective mechanical properties of the TPMS-based scaffold units can be used to design optimized scaffolds for tissue engineering applications. Regardless of the influence of the fabrication method, it is desirable to calculate scaffold properties so that the effect of these properties on tissue regeneration may be better understood.
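For reference, TPMS-based unit cells are commonly generated from level-set (nodal) approximations of the surfaces; two widely used examples, which may or may not coincide with the unit cell library used in this work, are

    \text{Schwarz P:}\quad \cos x + \cos y + \cos z = 0,
    \qquad
    \text{Gyroid:}\quad \sin x \cos y + \sin y \cos z + \sin z \cos x = 0.

Replacing the right-hand side with a constant t \neq 0 offsets the surface, which is a common way to tune the porosity (and hence the effective modulus) of the resulting scaffold.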

Relevance: 30.00%

Abstract:

The application of dexterous robotic hands outside research laboratories has been limited by the intrinsic complexity that these devices present. This is directly reflected in an economically unreasonable cost and a low overall reliability. Within the research reported in this thesis it is shown how the problem of complexity in the design of robotic hands can be tackled, taking advantage of modern technologies (i.e. rapid prototyping), leading to innovative concepts for the design of the mechanical structure and of the actuation and sensory systems. The solutions adopted drastically reduce the prototyping and production costs and increase the reliability, by reducing the number of parts required and averaging their individual reliability factors. In order to derive guidelines for the design process, the problem of robotic grasping and manipulation by a dual arm/hand system has been reviewed. In this way, the requirements that should be fulfilled at the hardware level to guarantee successful execution of the task have been highlighted. The contribution of this research on the manipulation planning side focuses on the redundancy resolution that arises in the execution of a task by a dexterous arm/hand system. In the literature, the problem of coordinating arm and hand during the manipulation of an object has been widely analyzed in theory, but often experimentally demonstrated only in simplified robotic setups. Our aim is to cover this gap in the study of the topic and to evaluate it experimentally in a complex system such as an anthropomorphic arm/hand system.
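As background (a standard formulation; the abstract does not state which resolution scheme the thesis adopts): for a redundant arm/hand system with task Jacobian J, a common way to resolve redundancy at the velocity level is

    \dot{q} = J^{\dagger}\,\dot{x} + \left(I - J^{\dagger}J\right) z,

where J^{\dagger} is the Moore-Penrose pseudoinverse, \dot{x} the desired task-space velocity, and the null-space term (I - J^{\dagger}J)z realizes secondary objectives (e.g. joint-limit avoidance) without perturbing the primary task.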

Relevance: 30.00%

Abstract:

The analysis of Komendant's design of the Kimbell Art Museum was carried out in order to determine the effectiveness of the ring beams, edge beams and prestressing in the shells of the roof system. Finite element analysis was not available to Komendant or other engineers of the time to aid them in the design and analysis. Thus, the use of this tool helped to form a new perspective on the Kimbell Art Museum and analyze the engineer's work. In order to carry out the finite element analysis of the Kimbell Art Museum, the ADINA finite element analysis software was utilized. Eight finite element models (FEM-1 through FEM-8) of increasing complexity were created. The results of the most realistic model, FEM-8, which included ring beams, edge beams and prestressing, were compared to Komendant's calculations. The maximum deflection at the crown of the mid-span surface of -0.1739 in. in FEM-8 was found to be larger than Komendant's deflection in the design documents before the loss in prestressing force (-0.152 in.) but smaller than his prediction after the loss in prestressing force (-0.3814 in.). Komendant predicted a larger longitudinal stress of -903 psi at the crown (vs. -797 psi in FEM-8) and 37 psi at the edge (vs. -347 psi in FEM-8). Considering the strength of concrete of 5000 psi, the difference in results is not significant. From the analysis it was determined that both FEM-5, which included prestressing and fixed rings, and FEM-8 can be successfully and effectively implemented in practice. Prestressing was used in both models and thus served as the main contribution to efficiency. FEM-5 showed that ring and edge beams can be avoided; however, an architect might find them more aesthetically appropriate than rigid walls.

Relevance: 30.00%

Abstract:

A growing body of literature indicates a modestly positive association between religiosity and spirituality as predictors of psychological health (anxiety and depression), suggesting they serve as personal resiliency factors. The purpose of this study was to expand our understanding of the relationships among these constructs. Using Lazarus's Transactional Model of Stress as a theoretical framework, we examined: (a) the extent to which spirituality and religiosity mediated and/or moderated the association between perceived stress and psychological health and (b) whether there was a moderated (religiosity) mediation (spirituality) between stress and health. The Perceived Stress Scale, Daily Spiritual Experiences Scale, Religious Commitment Inventory, and Hospital Anxiety and Depression Scale were administered to measure the following constructs: stress, spirituality, religiosity, and psychological health. This study utilized a nonexperimental, quantitative, correlational, cross-sectional, moderated-mediation design, and included a convenience sample of 331 research participants. Both spirituality and religiosity moderated the association between stress and health. However, only spirituality partially mediated the relationship. In addition, religiosity did not moderate the mediating effects of spirituality. Overall, this study confirmed the role of both religiosity and spirituality as effective resiliency resources.