25 results for singleton design pattern, symmetric key encryption

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance: 100.00%

Abstract:

Recently, an ever increasing degree of automation has been observed in most industrial automation processes. This increase is motivated by the growing demand for systems with high performance in terms of quality of the products/services generated, productivity, efficiency, and low costs in design, realization and maintenance. This growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy boxed products such as food or cigarettes. Another indication of their complexity derives from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably packaging machine industries; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact with each other in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices operated in the definition of the mechanical structure and of the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent in the automation of the machine cycles, the supervisory system is called to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the machine operator to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support to the maintenance operations of the machine. The kinds of facilities that designers can directly find on the market, in terms of software component libraries, provide adequate support for the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to enlighten the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been receiving this approach, as testified by some IEC standards (IEC 61131-3, IEC 61499) which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During recent years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, an important improvement to formal verification of logic control, fault diagnosis and fault tolerant control results derives from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture that help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, conclusive remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
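
To give a concrete flavour of the discrete-event view of logic control underpinning Chapter 5, the following minimal Python sketch (purely illustrative; states, events and the toy safety check are assumptions, not material from the thesis) models a working station as a finite automaton and uses reachability as a stand-in for formal verification:

```python
# Minimal sketch (illustrative only): a working station modelled as a
# discrete-event automaton, with a reachability check used as a toy
# stand-in for formal verification of the control logic.
from collections import deque

# States and events are hypothetical, not taken from the thesis.
TRANSITIONS = {
    ("idle",    "start"):  "working",
    ("working", "done"):   "idle",
    ("working", "fault"):  "failed",
    ("failed",  "repair"): "idle",
}

def reachable(initial, transitions):
    """Breadth-first enumeration of the reachable state set."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        for (src, _event), dst in transitions.items():
            if src == state and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

# A (toy) safety property: with the "fault" event disabled by a
# supervisor, the "failed" state must be unreachable.
supervised = {k: v for k, v in TRANSITIONS.items() if k[1] != "fault"}
assert "failed" in reachable("idle", TRANSITIONS)
assert "failed" not in reachable("idle", supervised)
print("toy safety check passed")
```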

Relevance: 100.00%

Abstract:

Self-organisation is increasingly being regarded as an effective approach to tackle the complexity of modern systems. The self-organisation approach allows the development of systems exhibiting complex dynamics and adapting to environmental perturbations without requiring complete knowledge of the future surrounding conditions. However, the development of self-organising systems (SOS) is driven by different principles with respect to traditional software engineering. For instance, engineers typically design systems by combining smaller elements, where the composition rules depend on the reference paradigm but typically produce predictable results. Conversely, SOS display non-linear dynamics, which can hardly be captured by deterministic models, and, although robust with respect to external perturbations, are quite sensitive to changes in inner working parameters. In this thesis, we describe methodological aspects concerning the early-design stage of SOS built relying on the Multiagent paradigm: in particular, we refer to the A&A metamodel, where MAS are composed of agents and artefacts, i.e. environmental resources. Then, we describe an architectural pattern that has been extracted from a recurrent solution in designing self-organising systems: this pattern is based on a MAS environment formed by artefacts, modelling non-proactive resources, and environmental agents acting on artefacts so as to enable self-organising mechanisms. In this context, we propose a scientific approach for the early design stage of the engineering of self-organising systems: the process is an iterative one and each cycle is articulated in four stages: modelling, simulation, formal verification, and tuning. During the modelling phase we mainly rely on the existence of a self-organising strategy observed in Nature and, hopefully, encoded as a design pattern. Simulations of an abstract system model are used to drive design choices until the required quality properties are obtained, thus providing guarantees that the subsequent design steps will lead to a correct implementation. However, system analysis exclusively based on simulation results does not provide sound guarantees for the engineering of complex systems: to this purpose, we envision the application of formal verification techniques, specifically model checking, in order to exactly characterise the system behaviours. During the tuning stage, parameters are tweaked in order to meet the target global dynamics and feasibility constraints. In order to evaluate the methodology, we analysed several systems: in this thesis, we describe only three of them, i.e. the most representative ones for each of the three years of the PhD course. We analyse each case study using the presented method, and describe the formal tools and techniques exploited.
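
A minimal sketch of the architectural pattern described above, under assumed names (the artefact, agent and evaporation mechanism are illustrative, not the thesis code): a passive artefact holds environment state, while an environmental agent acts on it to enact a self-organising mechanism, here pheromone evaporation:

```python
# Illustrative sketch (assumed names): a non-proactive artefact holding
# environment state, and an environmental agent acting on it so that a
# self-organising mechanism (pheromone evaporation) emerges.
class PheromoneGridArtefact:
    """Passive artefact: stores pheromone levels, offers operations."""
    def __init__(self, size):
        self.levels = [0.0] * size

    def deposit(self, cell, amount):      # invoked by user agents
        self.levels[cell] += amount

    def scale_all(self, factor):          # invoked by environmental agents
        self.levels = [v * factor for v in self.levels]

class EvaporationAgent:
    """Environmental agent: periodically acts on the artefact to
    realize the evaporation mechanism in the environment."""
    def __init__(self, artefact, rate=0.1):
        self.artefact, self.rate = artefact, rate

    def step(self):
        self.artefact.scale_all(1.0 - self.rate)

grid = PheromoneGridArtefact(size=4)
grid.deposit(2, 1.0)
evaporator = EvaporationAgent(grid)
for _ in range(5):
    evaporator.step()
print(grid.levels)   # pheromone in cell 2 decays toward zero
```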

Relevance: 100.00%

Abstract:

The research project aims at developing an innovative methodology for decision support in the selection among design alternatives, based on performance indicators. In particular, the work focused on the definition of indicators able to support decisions in debottlenecking interventions on a process plant. Two indicators, "bottleneck indicators", were developed which allow assessing the actual need for debottlenecking, identifying the causes that limit production and equipment exploitation. These were validated by applying them to the analysis of an intervention on an existing plant and verifying that equipment exploitation was correctly identified. Once the need for the debottlenecking intervention was established, the problem of selecting among the possible process alternatives to carry it out was addressed. A method based on sustainability indicators, which allows comparing the alternatives considering not only the economic return on investment but also the impacts on the environment and safety, was applied to the choice and further developed in this thesis. Two indicators, "area hazard indicators", related to fugitive emissions, were defined to integrate these aspects into the sustainability analysis of the alternatives. To improve the accuracy of impact quantification, a new predictive model for estimating the fugitive emissions of a plant was developed, based solely on the data available at the design stage, which takes into account the types of emitting sources, their leak mechanisms and maintenance. Validated by comparison with experimental data from a production plant, this method proved indispensable for a correct comparison of the alternatives, since existing models grossly overestimate the actual emissions. Finally, by applying the indicators to an existing plant, it was shown that they are fundamental to simplify the decision-making process, providing clear and precise indications while requiring a limited amount of information.

Relevance: 40.00%

Abstract:

Through modelling activity, experimental campaigns, test bench and on-field validation, a complete powertrain for a BEV has been designed, assembled and used in a motorsport competition. The activity can be split into three main subjects, representing the three key components of a BEV. First of all, a model of the entire powertrain was developed in order to understand how the various design choices would influence the race lap time. The data obtained were then used to design, build and test a first battery pack. After bench tests and track tests, it was understood that, by fully exploiting the cell characteristics without breaking the rule limitations, higher energy and power densities could be achieved. An updated battery pack was then designed, produced and raced at Motostudent 2018, resulting in a third place on debut. The second topic of this PhD was the design of novel inverter topologies. Three inverters have been designed, two of them using Gallium Nitride devices, a promising semiconductor technology that can achieve high switching speeds while maintaining low switching losses. A high switching frequency is crucial to reduce the DC-bus capacitor and thus increase the power density of three-phase inverters. The third inverter uses classic Silicon devices but employs a ZVS (Zero Voltage Switching) topology. Despite the increased complexity of both the hardware and the control software, it can offer reduced switching losses while using conventional and established silicon MOSFET technology. Finally, the mechanical parts of a three-phase permanent magnet motor have been designed with the aim of employing it in UniBo Motorsport's 2020 Formula Student car.
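
As a back-of-the-envelope illustration of how cell characteristics propagate to the pack-level figures of merit mentioned above, the following Python sketch computes energy and power density for a hypothetical pack (all numbers and the layout are assumptions, not the actual design):

```python
# Back-of-the-envelope sketch (hypothetical numbers, not the thesis data):
# how cell characteristics translate into pack-level energy and power
# density, the figures of merit discussed above.
cell_voltage_nom = 3.6      # V, assumed Li-ion cell
cell_capacity_ah = 3.0      # Ah
cell_mass_kg     = 0.048
cell_peak_amps   = 60.0     # A, assumed peak discharge

n_series, n_parallel = 96, 4   # assumed pack layout
overhead = 1.35                # assumed mass factor for housing,
                               # busbars, BMS, cooling

n_cells   = n_series * n_parallel
energy_wh = n_cells * cell_voltage_nom * cell_capacity_ah
power_w   = n_series * cell_voltage_nom * n_parallel * cell_peak_amps
pack_mass = n_cells * cell_mass_kg * overhead

print(f"pack energy   : {energy_wh/1000:.1f} kWh")
print(f"energy density: {energy_wh/pack_mass:.0f} Wh/kg")
print(f"power density : {power_w/pack_mass/1000:.2f} kW/kg")
```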

Relevance: 30.00%

Abstract:

Herpes simplex virus 1 (HSV-1) infects oral epithelial cells, then spreads to the nerve endings and establishes latency in sensory ganglia, from where it may, or may not, reactivate. Diseases caused by virus reactivation range from mild ones, such as muco-cutaneous lesions, to more severe and even life-threatening encephalitis, or systemic infections affecting diverse organs. Herpes simplex virus represents the most comprehensive example of virus-receptor interaction in the Herpesviridae family, and the prototype virus encoding multipartite entry genes. In fact, it encodes 11-12 glycoproteins and a number of additional membrane proteins; five of these proteins play key roles in virus entry into susceptible cells. Thus, glycoprotein B (gB) and glycoprotein C (gC) interact with heparan sulfate proteoglycan to enable initial attachment to cell surfaces. In the next step of the entry cascade, gD binds a specific surface receptor such as nectin1 or HVEM. The interaction of glycoprotein D with the receptor alters the conformation of gD to enable the activation of gB, glycoprotein H, and glycoprotein L, a trio of glycoproteins that execute the fusion of the viral envelope with the plasma membrane. In this thesis, I describe two distinct projects: I. The retargeting of viral tropism for the design of oncolytic Herpesviruses: • capable of infecting cells through the human epidermal growth factor receptor 2 (HER2), which is overexpressed in highly malignant mammary and ovarian tumors and correlates with a poor prognosis; • detargeted from the natural receptors, HVEM and nectin1. To this end, we inserted a ligand to HER2 in gD. Because HER2 has no natural ligand, the selected ligand was a single-chain antibody (scFv) derived from MAb4D5 (a monoclonal antibody to HER2), herein designated scHER2. All recombinant viruses were targeted to the HER2 receptor, but only two viruses (R-LM113 and R-LM249) were completely detargeted from HVEM and nectin1. To engineer R-LM113, we removed a large portion at the N-terminus of gD (from aa 6 to aa 38) and inserted the scHER2 sequence plus a 9-aa serine-glycine flexible linker at position 39. On the other hand, to engineer R-LM249, we replaced the Ig-folded core of gD (from aa 61 to aa 218) with scHER2 flanked by Ser-Gly linkers. In summary, these results provide evidence that: i. gD can tolerate an insert almost as big as gD itself; ii. the Ig-like domain of gD can be removed; iii. the large portion at the N-terminus of gD (from aa 6 to aa 38) can be removed without loss of key functions; iv. the R-LM113 and R-LM249 recombinants are ready to be assayed in animal models of mammary and ovarian tumour. This finding and the availability of a large number of scFvs greatly increase the collection of potential receptors to which HSV can be redirected. II. The production and purification of a recombinant truncated form of the heterodimer gHgL. We cloned a stable insect cell line expressing a soluble form of gH in complex with gL under the control of a metalloprotein-inducible promoter, and purified the heterodimer by means of the ONE-STrEP-tag system by IBA. With respect to biological function, the purified heterodimer is capable: • of reacting to antibodies that recognize conformation-dependent epitopes and neutralize virion infectivity; • of binding a variety of cells at the cell surface.
No doubt, the availability of biologically active purified gHgL heterodimer, in sufficient quantities, will speed up the efforts to solve its crystal structure, and makes it feasible to identify more clearly whether gHgL has a cellular partner, and what the role of this interaction in virus entry is.

Relevance: 30.00%

Abstract:

In the past decade, the advent of efficient genome sequencing tools and high-throughput experimental biotechnology has led to enormous progress in the life sciences. Among the most important innovations is microarray technology. It allows the expression of thousands of genes to be quantified simultaneously by measuring the hybridization from a tissue of interest to probes on a small glass or plastic slide. The characteristics of these data include a fair amount of random noise, a predictor dimension in the thousands, and a sample size in the dozens. One of the most exciting areas to which microarray technology has been applied is the challenge of deciphering complex diseases such as cancer. In these studies, samples are taken from two or more groups of individuals with heterogeneous phenotypes, pathologies, or clinical outcomes. These samples are hybridized to microarrays in an effort to find a small number of genes which are strongly correlated with the group of individuals. Even though today methods to analyse the data are well developed and close to reaching a standard organization (through the effort of proposed international projects like the Microarray Gene Expression Data (MGED) Society [1]), it is not infrequent to stumble upon a clinician's question for which there is no compelling statistical method that would permit answering it. The contribution of this dissertation to deciphering disease regards the development of new approaches aimed at handling open problems posed by clinicians in specific experimental designs. In Chapter 1, starting from a necessary biological introduction, we review microarray technologies and all the important steps an experiment involves, from the production of the array to quality controls, ending with the preprocessing steps that will be used in the data analysis in the rest of the dissertation. In Chapter 2 a critical review of standard analysis methods is provided, stressing most of their problems. In Chapter 3 a method is introduced to address the issue of unbalanced design of microarray experiments. In microarray experiments, experimental design is a crucial starting point for obtaining reasonable results. In a two-class problem, an equal or similar number of samples should be collected for the two classes. However, in some cases, e.g. rare pathologies, the approach to be taken is less evident. We propose to address this issue by applying a modified version of SAM [2]. MultiSAM consists in a reiterated application of a SAM analysis, comparing the less populated class (LPC) with 1,000 random samplings of the same size from the more populated class (MPC). A list of the differentially expressed genes is generated for each SAM application. After 1,000 reiterations, each single probe is given a "score" ranging from 0 to 1,000 based on its recurrence in the 1,000 lists as differentially expressed. The performance of MultiSAM was compared to the performance of SAM and LIMMA [3] over two simulated data sets based on beta and exponential distributions. The results of all three algorithms over low-noise data sets seem acceptable. However, on a real unbalanced two-channel data set regarding Chronic Lymphocytic Leukemia, LIMMA finds no significant probe and SAM finds 23 significantly changed probes but cannot separate the two classes, while MultiSAM finds 122 probes with score >300 and separates the data into two clusters by hierarchical clustering.
We also report extra-assay validation in terms of differentially expressed genes. Although standard algorithms perform well over low-noise simulated data sets, MultiSAM seems to be the only one able to reveal subtle differences in gene expression profiles on real unbalanced data. In Chapter 4 a method to address similarity evaluation in a three-class problem by means of the Relevance Vector Machine [4] is described. In fact, looking at microarray data in a prognostic and diagnostic clinical framework, differences are not the only thing that can play a crucial role. In some cases similarities can give useful and, sometimes, even more important information. The goal, given three classes, could be to establish, with a certain level of confidence, whether the third one is similar to the first or to the second one. In this work we show that the Relevance Vector Machine (RVM) [2] could be a possible solution to the limitations of standard supervised classification. In fact, RVM offers many advantages compared, for example, with its well-known precursor (the Support Vector Machine, SVM [3]). Among these advantages, the estimate of the posterior probability of class membership represents a key feature for addressing the similarity issue. This is a highly important, but often overlooked, option of any practical pattern recognition system. We focused on a tumor-grade three-class problem, with 67 samples of grade 1 (G1), 54 samples of grade 3 (G3) and 100 samples of grade 2 (G2). The goal is to find a model able to separate G1 from G3, and then evaluate the third class G2 as a test set to obtain the probability for samples of G2 to be members of class G1 or class G3. The analysis showed that breast cancer samples of grade 2 have a molecular profile more similar to breast cancer samples of grade 1. This result had been guessed at in the literature, but no measure of significance had been given before.
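
A minimal sketch of the MultiSAM resampling scheme described above (illustrative Python: a plain two-sample t-test stands in for SAM, and the data, sizes and thresholds are assumptions, not the thesis pipeline):

```python
# Illustrative sketch of the MultiSAM resampling scheme described above.
# A plain two-sample t-test stands in for SAM; data and thresholds are
# assumptions, not the thesis pipeline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_probes, n_lpc, n_mpc = 500, 8, 80
lpc = rng.normal(0.0, 1.0, size=(n_probes, n_lpc))   # less populated class
mpc = rng.normal(0.0, 1.0, size=(n_probes, n_mpc))   # more populated class
mpc[:20] += 1.5                                      # 20 truly changed probes

scores = np.zeros(n_probes, dtype=int)
for _ in range(1000):
    # draw a random MPC subset of the same size as the LPC
    subset = mpc[:, rng.choice(n_mpc, size=n_lpc, replace=False)]
    _, pvals = stats.ttest_ind(lpc, subset, axis=1)
    scores += pvals < 0.01   # probe recurs in this iteration's list

selected = np.flatnonzero(scores > 300)   # score threshold as in the text
print(f"{selected.size} probes with score > 300")
```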

Relevance: 30.00%

Abstract:

Nowadays, computing is migrating from traditional high performance and distributed computing to pervasive and utility computing based on heterogeneous networks and clients. The current trend suggests that future IT services will rely on distributed resources and on fast communication of heterogeneous contents. The success of this new range of services is directly linked to the effectiveness of the infrastructure in delivering them. The communication infrastructure will be the aggregation of different technologies, even though the current trend suggests the emergence of a single IP-based transport service. Optical networking is a key technology to answer the increasing requests for dynamic bandwidth allocation and to configure multiple topologies over the same physical-layer infrastructure; however, optical networks today are still "far" from being directly accessible for configuring and offering network services, and need to be enriched with more "user oriented" functionalities. Moreover, current Control Plane architectures only facilitate efficient end-to-end connectivity provisioning and certainly cannot meet future network service requirements, e.g. the coordinated control of resources. The overall objective of this work is to provide the network with improved usability and accessibility of the services offered by the Optical Network. More precisely, the definition of a service-oriented architecture is the enabling technology that allows user applications to benefit from advanced services over an underlying dynamic optical layer. The definition of a service-oriented networking architecture based on advanced optical network technologies facilitates users' and applications' access to abstracted levels of information regarding the offered advanced network services. This thesis faces the problem of defining a Service Oriented Architecture and its relevant building blocks, protocols and languages. In particular, this work has focused on the use of the SIP protocol as an inter-layer signalling protocol, which defines the Session Plane in conjunction with the Network Resource Description language. On the other hand, an advanced optical network must accommodate high data bandwidth with different granularities. Currently, two main technologies are emerging to promote the development of the future optical transport network: Optical Burst and Packet Switching. The two technologies promise to provide all-optical burst or packet switching, respectively, instead of the current circuit switching. However, the electronic domain is still present in the scheduling, forwarding and routing decisions. Because of the high optical transmission rates, the burst or packet scheduler faces a difficult challenge; consequently, a high-performance, timing-focused design of both the memory and the forwarding logic is needed. This open issue has been faced in this thesis by proposing a highly efficient implementation of a burst and packet scheduler. The main novelty of the proposed implementation is that the scheduling problem is turned into the simple calculation of a min/max function, whose complexity is almost independent of the traffic conditions.
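
The reduction of scheduling to a min/max computation can be illustrated with a textbook horizon-based (LAUC-style) wavelength scheduler; the Python sketch below is an assumption-laden illustration of that general scheme, not the implementation proposed in the thesis:

```python
# Horizon-based burst scheduling sketch (LAUC-style; illustrative, with
# assumed names). Choosing an output wavelength reduces to a min/max
# computation over per-channel horizons.
def schedule_burst(horizons, arrival, duration):
    """horizons[i] = time until which channel i is already booked.
    Among channels free at `arrival`, pick the one whose horizon is
    latest (max) to minimise the unused gap; return (channel, horizons)."""
    candidates = [i for i, h in enumerate(horizons) if h <= arrival]
    if not candidates:
        return None, horizons            # burst dropped (no void filling)
    best = max(candidates, key=lambda i: horizons[i])
    horizons = list(horizons)
    horizons[best] = arrival + duration  # extend the chosen horizon
    return best, horizons

horizons = [0.0, 5.0, 3.0, 9.0]
ch, horizons = schedule_burst(horizons, arrival=4.0, duration=2.0)
print(ch, horizons)   # channel 2: the latest horizon not exceeding 4.0
```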

Relevance: 30.00%

Abstract:

The following Ph.D. work was mainly focused on catalysis, as a key technology, to achieve the objectives of sustainable (green) chemistry. After introducing the concepts of sustainable (green) chemistry and an assessment of new sustainable chemical technologies, the relationship between catalysis and sustainable (green) chemistry is briefly discussed and illustrated via an analysis of some selected and relevant examples. Afterwards, as a continuation of the ongoing interest in Dr. Marco Bandini's group in organometallic and organocatalytic processes, I addressed my efforts to the design and development of novel catalytic green methodologies for the synthesis of enantiomerically enriched molecules. In the first two projects the attention was focused on the employment of solid supports to carry out reactions that still remain a prerogative of homogeneous catalysis. Firstly, particular emphasis was placed on the discovery of catalytic enantioselective variants of the nitroaldol condensation (commonly termed the Henry reaction), using a complex consisting of polyethylene-supported diamino thiophene (DATx) ligands and copper as the active species. In the second project, a new class of electrochemically modified surfaces with DATx palladium complexes was presented. The DATx-graphite system proved to be efficient in promoting the Suzuki reaction. Moreover, in collaboration with Prof. Wolf at the University of British Columbia (Vancouver), cyclic voltammetry studies were reported. This study disclosed new opportunities for carbon–carbon bond-forming processes by using heterogeneous, electrodeposited catalyst films. Straightforward metal-free catalysis then allowed an exploration of the world of organocatalysis. In fact, three different and novel methodologies, using Cinchona, guanidine and phosphine derivatives, were envisioned in the three following projects. An interesting variant of the nitroaldol condensation with simple trifluoromethyl ketones, and also their application in a non-conventional activation of indolyl cores by Friedel-Crafts functionalization, led to two novel synthetic protocols. These approaches allowed the preparation of synthetically useful trifluoromethyl derivatives bearing quaternary stereocenters. Lastly, in the sixth project the first γ-alkylation of allenoates with conjugated carbonyl compounds was envisioned. The last part of this Ph.D. thesis, based on an extraordinary collaboration with Prof. Balzani and Prof. Gigli, involved the synthesis and characterization of a new type of heteroleptic cyclometalated Ir(III) complexes bearing bis-oxazolines (BOXs) as ancillary ligands. The new heteroleptic complexes were fully characterized and, in order to examine the electroluminescent properties of FIrBOX(CH2), an Organic Light Emitting Device was realized.

Relevance: 30.00%

Abstract:

This thesis deals with the design of advanced OFDM systems. Both waveform and receiver design have been treated. The main scope of the thesis is to study, create, and propose ideas and novel design solutions able to cope with the weaknesses and crucial aspects of modern OFDM systems. Starting from the transmitter side, the problem represented by low resilience to non-linear distortion has been assessed. A novel technique that considerably reduces the Peak-to-Average Power Ratio (PAPR), yielding a quasi-constant signal envelope in the time domain (PAPR close to 1 dB), has been proposed. The proposed technique, named Rotation Invariant Subcarrier Mapping (RISM), is a novel scheme for subcarrier data mapping, where the symbols belonging to the modulation alphabet are not anchored, but maintain some degrees of freedom. In other words, a bit tuple is not mapped on a single point; rather, it is mapped onto a geometrical locus, which is totally or partially rotation invariant. The final positions of the transmitted complex symbols are chosen by an iterative optimization process in order to minimize the PAPR of the resulting OFDM symbol. Numerical results confirm that RISM makes OFDM usable even in severely non-linear channels. Another well-known problem which has been tackled is the vulnerability to synchronization errors. Indeed, in OFDM systems an accurate recovery of the carrier frequency and symbol timing is crucial for the proper demodulation of the received packets. In general, timing and frequency synchronization is performed in two separate phases, called PRE-FFT and POST-FFT synchronization. Regarding the PRE-FFT phase, a novel joint symbol timing and carrier frequency synchronization algorithm has been presented. The proposed algorithm is characterized by very low hardware complexity and, at the same time, guarantees very good performance in both AWGN and multipath channels. Regarding the POST-FFT phase, a novel approach for both pilot structure and receiver design has been presented. In particular, a novel pilot pattern has been introduced in order to minimize the occurrence of overlaps between two pattern-shifted replicas. This allows conventional pilots to be replaced with nulls in the frequency domain, introducing the so-called Silent Pilots. As a result, the optimal receiver turns out to be very robust against severe Rayleigh-fading multipath and is characterized by low complexity. The performance of this approach has been analytically and numerically evaluated. Comparing the proposed approach with state-of-the-art alternatives, in both AWGN and multipath fading channels, considerable performance improvements have been obtained. The crucial problem of channel estimation has been thoroughly investigated, with particular emphasis on the decimation of the Channel Impulse Response (CIR) through the selection of the Most Significant Samples (MSSs). In this context our contribution is twofold: on the theoretical side, we derived lower bounds on the estimation mean-square error (MSE) performance for any MSS selection strategy; on the receiver design side, we proposed novel MSS selection strategies which have been shown to approach these MSE lower bounds and to outperform the state-of-the-art alternatives. Finally, the possibility of using Single Carrier Frequency Division Multiple Access (SC-FDMA) in the Broadband Satellite Return Channel has been assessed.
Notably, SC-FDMA is able to improve the physical-layer spectral efficiency with respect to single carrier systems, which have been used so far in the Return Channel Satellite (RCS) standards. However, it requires strict synchronization and it is also sensitive to the phase noise of local radio frequency oscillators. For this reason, an effective pilot tone arrangement within the SC-FDMA frame and a novel Joint Multi-User (JMU) estimation method for SC-FDMA have been proposed. As shown by numerical results, the proposed scheme manages to satisfy the strict synchronization requirements and to guarantee proper demodulation of the received signal.
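
For reference, the PAPR metric that RISM minimizes is the ratio between the peak and the mean power of the time-domain OFDM symbol; a quick NumPy sketch (illustrative parameters, plain QPSK mapping rather than RISM) computes it as follows:

```python
# Quick PAPR computation for an OFDM symbol (NumPy sketch, illustrative
# parameters): the metric that the RISM technique described above aims
# to minimise.
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers, oversample = 64, 4

# random QPSK symbols on the subcarriers
qpsk = (rng.integers(0, 2, n_subcarriers) * 2 - 1 +
        1j * (rng.integers(0, 2, n_subcarriers) * 2 - 1)) / np.sqrt(2)

# zero-padded IFFT gives the oversampled time-domain OFDM symbol
spectrum = np.zeros(n_subcarriers * oversample, dtype=complex)
spectrum[:n_subcarriers] = qpsk
x = np.fft.ifft(spectrum)

papr_db = 10 * np.log10(np.max(np.abs(x)**2) / np.mean(np.abs(x)**2))
print(f"PAPR = {papr_db:.1f} dB")   # typically ~8-12 dB for plain OFDM
```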

Relevance: 30.00%

Abstract:

This thesis proposes design methods and test tools for optical systems which may be used in an industrial environment, where not only precision and reliability but also ease of use are important. The approach to the problem has been conceived to be as general as possible, although in the present work the design of a portable device for automatic identification applications has been studied, because this doctorate has been funded by Datalogic Scanning Group s.r.l., a world-class producer of barcode readers. The main functional components of the complete device are the electro-optical imaging, illumination and pattern generator systems. Concerning the electro-optical imaging system, a characterization tool and an analysis tool have been developed to check whether the desired performance of the system has been achieved. Moreover, two design tools for optimizing the imaging system have been implemented. The first optimizes just the core of the system, the optical part, improving its performance while ignoring all other contributions and generating a good starting point for the optimization of the whole complex system. The second tool optimizes the system taking into account its behavior, with a model as near as possible to reality, including optics, electronics and detection. Concerning the illumination and pattern generator systems, two tools have been implemented. The first allows the design of free-form lenses described by an arbitrary analytical function, excited by an incoherent source, and is able to provide custom illumination conditions for all kinds of applications. The second tool consists of a new method to design Diffractive Optical Elements excited by a coherent source for large pattern angles, using the Iterative Fourier Transform Algorithm. Validation of the design tools has been obtained, whenever possible, by comparing the performance of the designed systems with that of fabricated prototypes. In other cases, simulations have been used.
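
The Iterative Fourier Transform Algorithm mentioned above is, in its basic Gerchberg-Saxton form, a short loop alternating between the DOE plane and the far field; the following NumPy sketch (illustrative beam, target and iteration count, not the thesis tool) shows the idea:

```python
# Sketch of the Iterative Fourier Transform Algorithm (Gerchberg-Saxton
# style) for computing a DOE phase that shapes a coherent beam into a
# target far-field pattern. Illustrative parameters only.
import numpy as np

n = 256
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
source_amp = np.exp(-(x**2 + y**2) / 0.2)                # Gaussian input beam
target_amp = ((np.abs(x) < 0.3) & (np.abs(y) < 0.3)).astype(float)  # square spot

phase = np.zeros((n, n))
for _ in range(50):
    # forward: DOE plane -> far field (Fourier transform)
    far = np.fft.fftshift(np.fft.fft2(source_amp * np.exp(1j * phase)))
    # impose the target amplitude, keep the far-field phase
    far = target_amp * np.exp(1j * np.angle(far))
    # backward: far field -> DOE plane, keep only the phase
    near = np.fft.ifft2(np.fft.ifftshift(far))
    phase = np.angle(near)

print("DOE phase computed:", phase.shape)
```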

Relevance: 30.00%

Abstract:

Among the scientific objectives addressed by the Radio Science Experiment hosted on board the ESA mission BepiColombo is the retrieval of the rotational state of planet Mercury. In fact, the estimation of the obliquity and of the libration amplitude has been proven fundamental for constraining the interior composition of Mercury. This is accomplished by the Mercury Orbiter Radio science Experiment (MORE) via a strict interaction among different payloads, thus making the experiment particularly challenging. The underlying idea consists in capturing images of the same landmark on the surface of the planet in different epochs, in order to observe a displacement of the identified features with respect to a nominal rotation, which allows the rotational parameters to be estimated. Observations must be planned accurately in order to obtain image pairs carrying the highest information content for the following estimation process. This is not a trivial task, especially in light of the several dynamical constraints involved. Another delicate issue is represented by the pattern matching process between image pairs, for which the lowest correlation errors are desired. The research activity was conducted in the frame of the MORE rotation experiment and addressed the design and implementation of an end-to-end simulator of the experiment, with the final objective of establishing an optimal science planning of the observations. In the thesis, the implementation of the individual modules forming the simulator is illustrated, along with the simulations performed. The results obtained from the preliminary release of the optimization algorithm are finally presented; the software implemented is only at a preliminary release and will be improved and refined in the future, also taking into account the developments of the mission.
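
As an illustration of the pattern matching step discussed above, a basic normalized cross-correlation search can recover the displacement of a landmark between two epochs; the Python sketch below (synthetic images and a brute-force search, purely illustrative) shows the principle:

```python
# Sketch of landmark pattern matching via normalized cross-correlation:
# locating the same feature in a second image to measure its displacement.
# Illustrative only, with synthetic data.
import numpy as np

def ncc_match(image, template):
    """Return the (row, col) offset maximising normalized cross-correlation."""
    th, tw = template.shape
    t = (template - template.mean()) / template.std()
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r+th, c:c+tw]
            score = np.sum(t * (w - w.mean()) / (w.std() + 1e-12)) / t.size
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos

rng = np.random.default_rng(1)
epoch1 = rng.normal(size=(64, 64))
epoch2 = np.roll(epoch1, shift=(3, 5), axis=(0, 1))   # simulated displacement
template = epoch1[20:32, 20:32]
print(ncc_match(epoch2, template))   # ~ (23, 25): shifted by (3, 5)
```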

Relevance: 30.00%

Abstract:

The new generation of multicore processors opens new perspectives for the design of embedded systems. Multiprocessing, however, poses new challenges to the scheduling of real-time applications, in which ever-increasing computational demands are constantly flanked by the need to meet critical time constraints. Many research works have contributed to this field by introducing new advanced scheduling algorithms. However, although many of these works have solidly demonstrated their effectiveness, the actual support for multiprocessor real-time scheduling offered by current operating systems is still very limited. This dissertation deals with the implementation aspects of real-time schedulers in modern embedded multiprocessor systems. The first contribution is represented by an open-source scheduling framework, which is capable of realizing complex multiprocessor scheduling policies, such as G-EDF, on conventional operating systems, exploiting only their native scheduler from user space. A set of experimental evaluations compares the proposed solution to other research projects that pursue the same goals by means of kernel modifications, highlighting comparable scheduling performance. The principles that underpin the operation of the framework, originally designed for symmetric multiprocessors, have been further extended first to asymmetric ones, which are subject to major restrictions such as the lack of support for task migrations, and later to re-programmable hardware architectures (FPGAs). In the latter case, this work introduces a scheduling accelerator, which offloads most of the scheduling operations to the hardware and exhibits extremely low scheduling jitter. The realization of a portable scheduling framework presented many interesting software challenges. One of these was timekeeping. In this regard, a further contribution is represented by a novel data structure, called the addressable binary heap (ABH). The ABH, which is conceptually a pointer-based implementation of a binary heap, shows very interesting average and worst-case performance when addressing the problem of tick-less timekeeping with high-resolution timers.
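
The key property of such an addressable heap is that a timer can be cancelled in O(log n) through a handle, without scanning the heap. As a compact illustration (the thesis structure is pointer-based; this sketch tracks positions with an index map instead, and all names are assumptions), a minimal addressable min-heap in Python:

```python
# Sketch of an "addressable" binary heap: a min-heap whose entries can be
# cancelled in O(log n) through a handle, as needed for high-resolution
# timer management. Index-map variant of the pointer-based idea.
class AddressableHeap:
    def __init__(self):
        self.heap = []   # list of (key, handle)
        self.pos = {}    # handle -> index in self.heap

    def _swap(self, i, j):
        self.heap[i], self.heap[j] = self.heap[j], self.heap[i]
        self.pos[self.heap[i][1]] = i
        self.pos[self.heap[j][1]] = j

    def _sift_up(self, i):
        while i > 0 and self.heap[i][0] < self.heap[(i - 1) // 2][0]:
            self._swap(i, (i - 1) // 2)
            i = (i - 1) // 2

    def _sift_down(self, i):
        n = len(self.heap)
        while True:
            small = i
            for c in (2 * i + 1, 2 * i + 2):
                if c < n and self.heap[c][0] < self.heap[small][0]:
                    small = c
            if small == i:
                return
            self._swap(i, small)
            i = small

    def insert(self, key, handle):
        self.heap.append((key, handle))
        self.pos[handle] = len(self.heap) - 1
        self._sift_up(len(self.heap) - 1)

    def remove(self, handle):          # O(log n) cancellation by address
        i = self.pos.pop(handle)
        last = self.heap.pop()
        if i < len(self.heap):
            self.heap[i] = last
            self.pos[last[1]] = i
            self._sift_down(i)
            self._sift_up(i)

    def peek_min(self):
        return self.heap[0] if self.heap else None

timers = AddressableHeap()
timers.insert(120, "t1"); timers.insert(80, "t2"); timers.insert(200, "t3")
timers.remove("t2")                    # cancel a timer without scanning
print(timers.peek_min())               # (120, 't1')
```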

Relevance: 30.00%

Abstract:

Regenerative medicine and tissue engineering attempt to repair or improve the biological functions of tissues that have been damaged or have ceased to perform their role, through three main components: a biocompatible scaffold, a cellular component and bioactive molecules. Nanotechnology provides a toolbox of innovative scaffold fabrication procedures for regenerative medicine. In fact, nanotechnology, using manufacturing techniques such as conventional and unconventional lithography, allows supports to be fabricated with different geometries and sizes, as well as displaying physico-chemical properties tunable over different length scales. Soft lithography techniques allow the support to be functionalized with specific molecules that promote adhesion and control the growth of cells. Understanding the cell's response to the scaffold, and vice versa, is a key issue; here we show our investigation of the essential features required for improving the cell-surface interaction over different length scales. The main goal of this thesis has been to devise a nanotechnology-based strategy for the fabrication of scaffolds for tissue regeneration. We made four types of scaffolds, which are able to accurately control cell adhesion and proliferation. For each scaffold, we chose properly designed materials, fabrication and characterization techniques.

Relevance: 30.00%

Abstract:

Cancer is a multifactorial disease characterized by a very complex etiology. Given its complex nature, a promising therapeutic strategy could be based on the "Multi-Target-Directed Ligand" (MTDL) approach, built on the assumption that a single molecule could hit several targets responsible for the pathology. Several agents acting on DNA are clinically used, but the severe side effects they cause limit their therapeutic application. G-quadruplex structures are DNA secondary structures located in key zones of the human genome; targeting quadruplex structures could allow an anticancer therapy freer from side effects. In the last few years it has been proved that epigenetic modulation can control the expression of human genes, playing a crucial role in carcinogenesis; in particular, an abnormal expression of histone deacetylase (HDAC) enzymes is related to tumor onset and progression. This thesis deals with the design and synthesis of new naphthalene diimide (NDI) derivatives endowed with anticancer activity, interacting with DNA together with other targets implicated in cancer development, such as HDACs. NDI-polyamine and NDI-polyamine-hydroxamic acid conjugates have been designed with the aim of providing potential MTDLs, in order to create molecules able to interact simultaneously with different targets involved in this pathology, specifically the G-quadruplex structures and HDAC, and to exploit the polyamine transport system to get selectively into cancer cells. Macrocyclic NDIs have been designed with the aim of improving the quadruplex-targeting profile of the disubstituted NDIs. These compounds proved able to induce a high and selective stabilization of the quadruplex structures, together with cytotoxic activities in the micromolar range. Finally, trisubstituted NDIs have been developed as G-quadruplex binders, potentially effective against pancreatic adenocarcinoma. In conclusion, all these studies may represent a promising starting point for the development of new interesting molecules useful for the treatment of cancer, underlining the versatility of the NDI scaffold.

Relevance: 30.00%

Abstract:

In the aerospace, automotive, printing, and sports industries, the development of hybrid Carbon Fiber Reinforced Polymer (CFRP)-metal components is becoming increasingly important. Coupling metal with CFRP in axially symmetric components results in reduced production costs and improved mechanical properties, such as bending and torsional stiffness, damping, and critical speed, as well as mass reduction, compared to single-material components. In this thesis, thanks to a novel methodology involving a rubbery/viscoelastic interface layer, several hybrid aluminum-CFRP prototype tubes were produced. Besides, an innovative system for the cure of the CFRP part has been studied, analyzed, tested, and developed in the company that financed these research activities (Reglass SRL, Minerbio BO, Italy). The residual thermal stresses and strains have been investigated with numerical models based on the Finite Element Method (FEM) and compared with experimental tests. Thanks to the numerical models, it was also possible to reduce the residual thermal stresses by optimizing the lamination sequence of the CFRP and determining the influence of the system parameters. Novel software and a methodology for evaluating the mechanical and damping properties of specimens and tubes made of CFRP were also developed. Moreover, to increase the component's damping properties, rubber nanofibers have been produced and interposed throughout the lamination of specimens. The promising results indicated that the nanofibrous mat can improve the material damping factor by over 77% and can be adopted in CFRP components with a negligible increase in weight and no loss of mechanical properties.
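
A standard way to quantify the damping factor mentioned above is the logarithmic decrement of a free-decay vibration record; the following Python sketch (synthetic signal and assumed parameters, not the thesis software) illustrates the computation:

```python
# Sketch of damping estimation from a free-decay vibration record via the
# logarithmic decrement, a standard way to quantify the damping factor.
# Synthetic data with assumed parameters, purely illustrative.
import numpy as np

fs, f_n, zeta_true = 2000.0, 50.0, 0.02   # sample rate, natural freq, damping
t = np.arange(0, 1.0, 1.0 / fs)
x = np.exp(-zeta_true * 2 * np.pi * f_n * t) * np.cos(2 * np.pi * f_n * t)

# successive positive peaks of the decaying oscillation
peaks = [i for i in range(1, len(x) - 1)
         if x[i] > x[i - 1] and x[i] > x[i + 1] and x[i] > 0]
p0, p1 = x[peaks[0]], x[peaks[10]]        # peaks 10 periods apart
delta = np.log(p0 / p1) / 10              # logarithmic decrement
zeta = delta / np.sqrt(4 * np.pi**2 + delta**2)
print(f"estimated damping ratio: {zeta:.4f}")   # ~0.02
```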