845 results for exploratory design methods
Abstract:
The present dissertation concerns methodologies and techniques of industrial and mechanical design. The author aims to give a complete picture of the world of design, presenting the theories of Quality Function Deployment (QFD) and TRIZ together with other methods such as planning, budgeting, Value Analysis and Engineering, Concurrent Engineering, Design for Assembly and Manufacturing, etc., and their application to five concrete cases. These cases also illustrate design techniques such as CAD, CAS, CAM and rendering, which are means of turning an idea into reality. The most important objective of the work is, however, the birth of a new methodology, arising from a comparison between QFD and TRIZ and from their integration with other methodologies, such as Time and Cost Analysis, learned and practised during a significant experience in a well-known Italian automotive factory.
Abstract:
In the past decade the study of superparamagnetic nanoparticles has been intensively developed for many biomedical applications, such as magnetically assisted drug delivery, MRI contrast agents, cell separation and hyperthermia therapy. All of these applications require nanoparticles with high magnetization, equipped with a suitable surface coating that has to be non-toxic and biocompatible. In this master thesis, the silica coating of commercially available magnetic nanoparticles was investigated. Silica is a versatile material with many intrinsic features, such as hydrophilicity and low toxicity; proper design and derivatization yield particularly stable colloids even under physiological conditions. The coating process was applied to commercial magnetite particles dispersed in an aqueous solution. The formation of silica-coated magnetite nanoparticles was performed following two main strategies: the Stöber process, in which the silica coating of the nanoparticle was formed directly by hydrolysis and condensation of a suitable precursor in water-alcohol mixtures; and the reverse microemulsion method, in which inverse micelles were used to confine the hydrolysis and condensation reactions that lead to nanoparticle formation. Of these two methods, the reverse microemulsion one proved the most versatile and reliable because of the high level of control over monodispersity, silica shell thickness and overall particle size. Moving from low to high concentration within the microemulsion region, a gradual shift from larger particles to smaller ones was detected. The silica shell thickness can also be tuned by increasing the amount of silica precursor. Fluorescent dyes have also been incorporated within the silica shell by linking them to the silica matrix. The structure of the studied nanoparticles was investigated using transmission electron microscopy (TEM) and dynamic light scattering (DLS).
These techniques were used to monitor the synthetic procedures and for the final characterization of the silica-coated and silica dye-doped nanoparticles. Finally, field-dependent magnetization measurements showed that the magnetic properties of the core-shell nanoparticles were preserved. Thanks to a very well defined structure that combines magnetic and luminescent properties, together with the possibility of further functionalization, these multifunctional nanoparticles are potentially useful platforms in biomedical fields such as labelling and imaging.
Abstract:
This work presents exact, hybrid algorithms for mixed resource Allocation and Scheduling problems; in general terms, these consist in assigning finite-capacity resources over time to a set of precedence-connected activities. The proposed methods have broad applicability, but are mainly motivated by applications in the field of Embedded System Design. In particular, high-performance embedded computing has recently witnessed the shift from single-CPU platforms with application-specific accelerators to programmable Multi-Processor Systems-on-Chip (MPSoCs). These allow higher flexibility, real-time performance and low energy consumption, but the programmer must be able to exploit the platform parallelism effectively. This raises interest in the development of algorithmic techniques to be embedded in CAD tools; in particular, given a specific application and platform, the objective is to perform an optimal allocation of hardware resources and to compute an execution schedule. In this regard, since embedded systems tend to run the same set of applications for their entire lifetime, off-line, exact optimization approaches are particularly appealing. Quite surprisingly, the use of exact algorithms has not been well investigated so far; this is in part explained by the complexity of integrated allocation and scheduling, which sets tough challenges for ``pure'' combinatorial methods. The use of hybrid CP/OR approaches presents the opportunity to exploit the mutual advantages of different methods, while compensating for their weaknesses. In this work, we first consider an Allocation and Scheduling problem over the Cell BE processor by Sony, IBM and Toshiba; we propose three different solution methods, leveraging decomposition, cut generation and heuristic-guided search.
Next, we face Allocation and Scheduling of so-called Conditional Task Graphs, explicitly accounting for branches whose outcome is not known at design time; we extend the CP scheduling framework to deal effectively with the introduced stochastic elements. Finally, we address Allocation and Scheduling with uncertain, bounded execution times via conflict-based tree search; we introduce a simple and flexible time model to take duration variability into account and provide an efficient conflict detection method. The proposed approaches achieve good results on problems of practical size, thus demonstrating that the use of exact approaches for system design is feasible. Furthermore, the developed techniques bring significant contributions to combinatorial optimization methods.
Abstract:
In a large number of problems, the high dimensionality of the search space, the vast number of variables and economic constraints limit the ability of classical techniques to reach the optimum of a function, known or unknown. In this thesis we investigate the possibility of combining approaches from advanced statistics with optimization algorithms, so as to better explore the combinatorial search space and to increase the performance of the resulting methods. To this purpose we propose two methods: (i) Model Based Ant Colony Design and (ii) Naïve Bayes Ant Colony Optimization. We test the performance of the two proposed solutions in a simulation study and we apply the novel techniques to an application in the field of Enzyme Engineering and Design.
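As a hedged illustration of the ant-colony family of methods this abstract builds on (the thesis's Model Based Ant Colony Design and Naïve Bayes Ant Colony Optimization are specialized variants not reproduced here), a minimal pheromone-driven search over a binary space might be sketched as follows; the function name, the OneMax objective and all parameter values are illustrative assumptions:

```python
import random

def aco_onemax(n_bits=20, n_ants=20, iters=50, rho=0.1, seed=0):
    """Minimal ant colony optimization on a binary search space (OneMax).

    Each bit carries a pheromone value used as the probability of setting
    that bit to 1; pheromones evaporate at rate rho and are reinforced
    toward the best solution found so far.
    """
    rng = random.Random(seed)
    tau = [0.5] * n_bits                      # pheromone per bit position
    best, best_fit = None, -1
    for _ in range(iters):
        # each ant samples a candidate bit string from the pheromones
        ants = [[1 if rng.random() < t else 0 for t in tau]
                for _ in range(n_ants)]
        for ant in ants:
            fit = sum(ant)                    # OneMax: count of ones
            if fit > best_fit:
                best, best_fit = ant, fit
        # evaporate, then deposit along the incumbent best solution
        tau = [(1 - rho) * t + rho * b for t, b in zip(tau, best)]
    return best, best_fit
```

Each bit's pheromone value acts as a sampling probability; the evaporation rate rho trades off exploration against reinforcement of the incumbent solution, which is the core mechanism the thesis's model-based variants refine with statistical models.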
Abstract:
Because of the genetic origin of the disease, DNA remains a highly attractive target for the design of antitumor cytostatic drugs. A major focus of current research lies in the development of low-molecular-weight, sequence-specific DNA ligands for the targeted silencing of defective genes. In this work, therefore, following the antitumorally active lead compound Netropsin - an AT-selective minor groove binder with a bispyrrolecarboxamide core structure - a new series of bioisosteric hybrid molecules was systematically assembled for the first time, consisting of an intercalating structural element (acridone, naphthalimide, 5-nitronaphthalimide, anthraquinone, 11H-pyrido[2,3-a]carbazole) and a thiophenepyrrole-, imidazolepyrrole-, thiazolepyrrole- or bisimidazolecarboxamide as the groove-binding oligoamide unit (so-called combilexins). The chromophoric systems at the N-terminus were connected to the carboxamide chain via aliphatic linkers of variable chain length. As the C-terminal function, both the N,N-dimethyl-1,3-diaminopropane side chain and the dimethylaminoethylamine side chain, shorter by one carbon atom, were employed. Using state-of-the-art reagents from peptide-coupling chemistry, a preparatively accessible, reproducible procedure for the synthesis of these bioisosteric combilexins was developed. By means of biophysical/biochemical, cell-biological and physicochemical (1H-NMR spectroscopic and X-ray structure analysis) methods, as well as molecular modelling studies, preliminary structure-activity relationships for the bioisosteric ligands were established for the first time over a broad range with respect to DNA binding, topoisomerase inhibition and antitumor cell cytotoxicity.
Although no definite regularities could be discerned between the in vitro and in silico findings, the sum of the results nevertheless suggested that the naphthalimidepropionic and acridonebutyric acid derivatives with a C-terminal propylenediamine function were the most promising candidates with respect to DNA affinity and cytotoxicity.
Abstract:
This thesis involves various aspects of crystal engineering. Chapter 1 focuses on crystals containing crown ether complexes. Aspects such as the possibility of preparing these materials by non-solution methods, i.e. by direct reaction of the solid components, thermal behaviour, and isomorphism and interconversion between hydrates are taken into account. In chapter 2 a study is presented aimed at understanding the relationship between the hydrogen-bonding capability and the shape of the building blocks chosen to construct crystals. The focus is on the control exerted by shape on the organization of sandwich cations such as cobalticinium, decamethylcobalticinium and bisbenzenechromium(I), and on the aggregation of monoanions, all containing carboxylic and carboxylate groups, into 0-D, 1-D, 2-D and 3-D networks. Reactions conducted in multi-component molecular assemblies, or co-crystals, have been recognized as a way to control reactivity in the solid state. The [2+2] photodimerization of olefins is a successful demonstration of how templated solid-state synthesis can efficiently produce unique materials with remarkable stereoselectivity under environment-friendly conditions. A demonstration of this synthetic strategy is given in chapter 3. The combination of various types of intermolecular linkages, leading either to high-order aggregation and crystalline materials or to random aggregation resulting in an amorphous precipitate, may not go to completion. In such rare cases an aggregation process intermediate between crystalline and amorphous materials is observed, resulting in the formation of a gel, i.e. a viscoelastic solid-like or liquid-like material. In chapter 4 the design of new Low Molecular Weight Gelators is presented. Aspects such as the relationships between molecular structure, crystal packing and gelation properties, and the application of this kind of gel as a medium for the crystal growth of organic molecules, such as APIs, are also discussed.
Abstract:
In the last couple of decades we have witnessed a reappraisal of spatial design-based techniques. Usually, the information on the spatial location of the individuals of a population has been used to develop efficient sampling designs. This thesis aims at offering a new technique for inference on both individual values and global population values, able to employ the spatial information available before sampling at the estimation level, by rewriting a deterministic interpolator within a design-based framework. The resulting point estimator of the individual values is treated both in the case of finite spatial populations and in that of continuous spatial domains, while the theory on the estimator of the global population value covers the finite-population case only. A fairly broad simulation study compares the results of the point estimator with the simple random sampling without replacement estimator in predictive form and with kriging, which is the benchmark technique for inference on spatial data. The Monte Carlo experiment is carried out on populations generated according to different superpopulation methods, in order to control different aspects of the spatial structure. The simulation outcomes point out that the proposed point estimator behaves almost like the kriging predictor regardless of the parameters adopted for generating the populations, especially for low sampling fractions. Moreover, the use of the spatial information substantially improves design-based spatial inference on individual values.
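The deterministic interpolator that is rewritten under the design-based framework is not spelled out in this abstract; as an assumed stand-in, an inverse-distance-weighted predictor, one of the simplest deterministic spatial interpolators, can be sketched as follows (the function name and data layout are hypothetical):

```python
import math

def idw_predict(sample, target, power=2):
    """Inverse-distance-weighted prediction of the value at `target`
    from sampled (location, value) pairs -- a simple deterministic
    interpolator of the kind the thesis recasts in a design-based setting.
    """
    num = den = 0.0
    for (x, y), v in sample:
        d = math.hypot(target[0] - x, target[1] - y)
        if d == 0:
            return v                  # exact interpolation at sampled sites
        w = d ** -power               # closer sites get larger weights
        num += w * v
        den += w
    return num / den
```

For example, predicting at the centre of a unit square from its four equidistant corners returns the plain mean of the corner values, since all weights coincide; kriging would instead derive the weights from a fitted spatial covariance model.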
Abstract:
The research activity characterizing the present thesis was mainly centred on the design, development and validation of methodologies for the estimation of stationary and time-varying connectivity between different regions of the human brain during specific complex cognitive tasks. This activity involved two main aspects: i) the development of a stable, consistent and reproducible procedure for functional connectivity estimation with a high impact on the neuroscience field, and ii) its application to real data from healthy volunteers eliciting specific cognitive processes (attention and memory). In particular, the methodological issues addressed in the present thesis consisted in identifying an approach, applicable in the neuroscience field, able to: i) include all the cerebral sources in the connectivity estimation process; ii) accurately describe the temporal evolution of connectivity networks; iii) assess the significance of connectivity patterns; iv) consistently describe relevant properties of brain networks. The advancements provided in this thesis allowed the identification of quantifiable descriptors of cognitive processes during a high-resolution EEG experiment involving subjects performing complex cognitive tasks.
Abstract:
Fibre-Reinforced Plastics are composite materials composed of thin fibres with high mechanical properties, made to work together with a cohesive plastic matrix. The great advantage of fibre-reinforced plastics over traditional materials lies in their high specific mechanical properties, i.e. their high stiffness-to-weight and strength-to-weight ratios. This kind of composite material is the most disruptive innovation seen in the structural materials field in recent years, and the areas of potential application are still many. However, a few aspects limit their growth: on the one hand, the information available about their properties and long-term behaviour is still scarce, especially compared with traditional materials, for which an extensive database has been developed through years of use and research. On the other hand, the production technologies are still not as developed as those available for forming plastics, metals and other traditional materials. A third aspect is that the new properties presented by these materials, e.g. their anisotropy, complicate the design of components. This thesis provides several case studies with advancements regarding the three limitations mentioned. In particular, the long-term mechanical properties have been studied through an experimental analysis of the impact of seawater on GFRP. Regarding production methods, the autoclave-cured pre-impregnated process was considered: a rapid tooling method to produce moulds is presented, together with a study of the production of thick components. Two liquid composite moulding methods are also presented, with a case study regarding a large component with a sandwich structure produced with the Vacuum-Assisted Resin Infusion method, and a case study regarding a thick con-rod beam produced with the Resin Transfer Moulding process.
The final case study analyses the loads acting during the use of a particular sports component made with FRP layers and a sandwich structure; practical design rules are provided.
Abstract:
Data Distribution Management (DDM) is a core part of the High Level Architecture (HLA) standard, as its goal is to optimize the resources used by simulation environments to exchange data. It has to filter and match the set of information generated during a simulation, so that each federate (i.e. each simulation entity) only receives the information it needs. It is important that this is done quickly and as well as possible, in order to obtain better performance and to avoid the transmission of irrelevant data; otherwise network resources may saturate quickly. The main topic of this thesis is the implementation of a super partes DDM testbed. It evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and it may also support other, still unknown, methods. It uses three factors to rank them: execution time, memory and distance from the optimal solution. A prearranged set of instances is already available, but we also allow the creation of instances with user-provided parameters. This thesis is structured as follows. We start by introducing what DDM and HLA are and what they do in detail. Then, in the first chapter, we describe the state of the art, providing an overview of the best-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes how the testbed we implemented is structured. In the fourth chapter we present and compare the results we obtained from the execution of the four approaches we implemented. The result of the work described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
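As a hedged sketch of what region-based DDM matching involves, the brute-force baseline, testing every update region against every subscription region for extent overlap, can be written as follows; the data layout (a region as a list of per-dimension (lower, upper) pairs) is an assumption for illustration, not the testbed's actual input format:

```python
def regions_overlap(r1, r2):
    """Two d-dimensional regions overlap iff their extents intersect
    in every dimension. A region is a list of (lower, upper) pairs."""
    return all(lo1 <= hi2 and lo2 <= hi1
               for (lo1, hi1), (lo2, hi2) in zip(r1, r2))

def brute_force_match(updates, subscriptions):
    """Baseline region-based DDM matching: test every update region
    against every subscription region, O(n*m) overlap tests."""
    return [(i, j)
            for i, u in enumerate(updates)
            for j, s in enumerate(subscriptions)
            if regions_overlap(u, s)]
```

Smarter DDM algorithms (grid-based, sort-based) aim to beat this O(n·m) baseline while producing the same match set, which is what makes brute force a natural reference point for a testbed of this kind.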
Abstract:
This work presents exact algorithms for Resource Allocation and Cyclic Scheduling Problems (RA&CSPs). Cyclic Scheduling Problems arise in a number of application areas, such as hoist scheduling, mass production, compiler design (implementing scheduling loops on parallel architectures), software pipelining, and embedded system design. The RA&CS problem concerns time and resource assignment to a set of activities, to be repeated indefinitely, subject to precedence and resource-capacity constraints. In this work we present two constraint programming frameworks addressing two different types of cyclic problems. First, we consider the disjunctive RA&CSP, where the allocation problem involves unary resources. Instances are described through the Synchronous Data-flow (SDF) Model of Computation. The key problem of finding a maximum-throughput allocation and scheduling of Synchronous Data-Flow graphs onto a multi-core architecture is NP-hard and has traditionally been solved by means of heuristic (incomplete) algorithms. We propose an exact (complete) algorithm for the computation of a maximum-throughput mapping of applications, specified as SDFGs, onto multi-core architectures. Results show that the approach can handle realistic instances in terms of size and complexity. Next, we tackle the Cyclic Resource-Constrained Scheduling Problem (CRCSP). We propose a Constraint Programming approach based on modular arithmetic: in particular, we introduce a modular precedence constraint and a global cumulative constraint, along with their filtering algorithms. Many traditional approaches to cyclic scheduling operate by fixing the period value and then solving a linear problem in a generate-and-test fashion. Conversely, our technique is based on a non-linear model and tackles the problem as a whole: the period value is inferred from the scheduling decisions.
The proposed approaches have been tested on a number of non-trivial synthetic instances and on a set of realistic industrial instances, achieving good results on problems of practical size.
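As a hedged illustration of the modular precedence relation mentioned above, a feasibility check for a candidate cyclic schedule can be sketched as follows; the constraint form start(i) + dur(i) <= start(j) + delta * period is the usual textbook statement of modular precedence between iterations, and the data layout here is an assumption rather than the thesis's actual model:

```python
def feasible_cyclic(starts, durations, period, precedences):
    """Check a candidate cyclic schedule against modular precedence
    constraints. For an arc (i, j, delta), activity j in iteration
    k + delta must start after activity i in iteration k has ended:
    start[i] + dur[i] <= start[j] + delta * period.
    """
    return all(starts[i] + durations[i] <= starts[j] + delta * period
               for (i, j, delta) in precedences)
```

In a generate-and-test scheme the period would be fixed before such checks; the approach described above instead treats the period as a decision variable, so its filtering algorithms must reason over this inequality with the period still unknown.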
Abstract:
The use of guided ultrasonic waves (GUW) has increased considerably in the fields of non-destructive evaluation (NDE) and structural health monitoring (SHM) due to their ability to perform long-range inspections, to probe hidden areas, and to provide complete monitoring of the entire waveguide. Guided waves can be fully exploited only once their dispersive properties are known for the given waveguide. In this context, well-established analytical and numerical methods are represented by the Matrix family methods and the Semi-Analytical Finite Element (SAFE) methods. However, while the former are limited to simple geometries of finite or infinite extent, the latter can model arbitrary cross-section waveguides of finite domain only. This thesis is aimed at developing three different numerical methods for modelling wave propagation in complex translationally invariant systems. First, a classical SAFE formulation for viscoelastic waveguides is extended to account for a three-dimensional, translationally invariant static prestress state. The effect of prestress, residual stress and applied loads on the dispersion properties of the guided waves is shown. Next, a two-and-a-half-dimensional Boundary Element Method (2.5D BEM) for the dispersion analysis of damped guided waves in waveguides and cavities of arbitrary cross-section is proposed. The attenuation dispersion spectrum due to material damping and to the geometrical spreading of cavities with arbitrary shape is shown for the first time. Finally, a coupled SAFE-2.5D BEM framework is developed to study the dispersion characteristics of waves in viscoelastic waveguides of arbitrary geometry embedded in infinite solid or liquid media. The dispersion of leaky and non-leaky guided waves, in terms of speed and attenuation, as well as the radiated wavefields, can be computed. The results obtained in this thesis can be helpful for the design of both actuation and sensing systems in practical applications, as well as for tuning experimental setups.
Abstract:
Objective: to understand the relationship between equine colic and intestinal parasitoses, and thus the actual clinical relevance of parasitic infections. Type of study: clinical-surgical and parasitological study. Methods: this study examined 92 horses referred to the SARGA service of the Department of Veterinary Medical Sciences during the years 2009-2011. 27 of these subjects underwent exploratory laparotomy for colic, 22 had colic that resolved with medical therapy, and 43 horses were referred to the service for conditions other than acute abdomen. The horses from which an adequate quantity of faeces could be collected (# 86) underwent qualitative and quantitative coprological examination. The data obtained were subjected to descriptive statistical analysis, to the Chi-square test and to the Kruskal-Wallis test, for prevalences and quantitative data respectively, as well as to a logistic regression to identify risk factors. From the horses undergoing celiotomy, the intestinal contents were collected to recover adult parasites. Results: the prevalence and abundance of strongyles were significantly lower in horses undergoing abdominal surgery than in the total population examined. Significant differences in prevalence were also found between horses with medical and surgical colic. The only risk factor identified by the logistic regression analysis was age, and only for surgically treated colic. Neither strongyles nor ascarids appear to increase the risk of colic. The probability of death increases significantly in the case of surgical colic, but is not influenced in any way by parasitic infections.
Abstract:
Most of the problems in modern structural design can be described with a set of equations; the solutions of these mathematical models can provide the engineer and designer with information during the design stage. The same holds true for physical chemistry; this branch of chemistry uses mathematics and physics to explain real chemical phenomena. In this work two extremely different chemical processes are studied: the dynamics of an artificial molecular motor, and the generation and propagation of nervous signals between excitable cells and tissues such as neurons and axons. These two processes, in spite of their chemical and physical differences, can both be described successfully by partial differential equations: respectively, the Fokker-Planck equation and the Hodgkin-Huxley model. With the aid of advanced engineering software these two processes have been modelled and simulated in order to extract physical information about them and to predict properties that can, in future, be extremely useful during the design stage of both molecular motors and devices whose action relies on nervous communication between active fibres.
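As a hedged sketch of the second model named above, the Hodgkin-Huxley membrane equations can be integrated with a simple forward-Euler scheme using the standard squid-axon parameters from the literature; the thesis itself used dedicated engineering software, so the following is only an illustrative stand-in:

```python
import math

def hodgkin_huxley(i_ext=10.0, t_max=50.0, dt=0.01):
    """Forward-Euler integration of the classical Hodgkin-Huxley
    membrane model (standard squid-axon parameters) under a constant
    injected current i_ext (uA/cm^2). Returns the voltage trace in mV.
    """
    c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3      # capacitance, conductances
    e_na, e_k, e_l = 50.0, -77.0, -54.387            # reversal potentials (mV)
    v, m, h, n = -65.0, 0.0529, 0.596, 0.3177        # resting-state values
    trace = [v]
    for _ in range(int(t_max / dt)):
        # voltage-dependent gating rates (1/ms)
        a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
        b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
        b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
        b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
        # ionic currents: sodium, potassium, leak
        i_ion = (g_na * m**3 * h * (v - e_na)
                 + g_k * n**4 * (v - e_k)
                 + g_l * (v - e_l))
        v += dt * (i_ext - i_ion) / c_m
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        trace.append(v)
    return trace
```

Under a constant supra-threshold current the model fires repetitively, with action-potential peaks overshooting 0 mV, which is the kind of excitable-membrane behaviour the simulations described above reproduce at much higher fidelity.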
Abstract:
The main goal of this thesis is to facilitate the process of developing industrial automated systems by applying formal methods to ensure system reliability. A new formulation of the distributed diagnosability problem in terms of Discrete Event Systems theory and the automata framework is presented, and is then used to enforce the desired property of the system, rather than just verifying it. This approach tackles the state-explosion problem with modelling patterns and new algorithms aimed at the verification of the diagnosability property in the context of the distributed diagnosability problem. The concepts are validated with a newly developed software tool.