Abstract:
Decomposition-based approaches are recalled from the primal and the dual point of view. The possibility of building partially disaggregated reduced master problems is investigated. This extends the idea of aggregated-versus-disaggregated formulations to a gradual choice among alternative levels of aggregation. Partial aggregation is applied to the linear multicommodity minimum-cost flow problem. The possibility of having only partially aggregated bundles opens a wide range of alternatives with different trade-offs between the number of iterations and the computation required to solve them. This trade-off is explored for several sets of instances, and the results are compared with those obtained by directly solving the natural node-arc formulation. An iterative solution process for the route assignment problem is then proposed, based on the well-known Frank-Wolfe algorithm. To provide a first feasible solution to the Frank-Wolfe algorithm, a linear multicommodity min-cost flow problem is solved to optimality using the decomposition techniques mentioned above. Solutions of this problem are useful for network orientation and design, especially in relation to public transportation systems such as Personal Rapid Transit. A single-commodity robust network design problem is then addressed: an undirected graph with edge costs is given together with a discrete set of balance matrices representing different supply/demand scenarios, and the goal is to determine the minimum-cost installation of capacities on the edges such that the flow exchange is feasible in every scenario. A set of new instances that are computationally hard for the natural flow formulation is solved by means of a new heuristic algorithm. Finally, an efficient decomposition-based heuristic approach for a large-scale stochastic unit commitment problem is presented. The real-world stochastic problem addressed employs at its core a deterministic unit commitment planning model developed by the California Independent System Operator (ISO).
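To make the route-assignment step concrete, the following is a minimal sketch of a Frank-Wolfe loop on a toy traffic-assignment instance; the BPR-style cost function, the predetermined step-size rule and all names and numbers are illustrative assumptions, not the formulation used in the thesis (where the initial feasible flow comes from the decomposition-based min-cost flow solver rather than from zero flow).

# Hedged sketch: a Frank-Wolfe loop for a toy traffic-assignment problem.
import networkx as nx

def bpr_cost(flow, free_time, capacity):
    """BPR-style link travel time (assumed congestion model)."""
    return free_time * (1.0 + 0.15 * (flow / capacity) ** 4)

def all_or_nothing(G, demands, costs):
    """Assign each OD demand entirely to its current shortest path."""
    y = {e: 0.0 for e in G.edges}
    for (o, d), q in demands.items():
        path = nx.shortest_path(G, o, d, weight=lambda u, v, a: costs[(u, v)])
        for e in zip(path, path[1:]):
            y[e] += q
    return y

def frank_wolfe(G, demands, n_iter=50):
    x = {e: 0.0 for e in G.edges}
    for k in range(n_iter):
        costs = {e: bpr_cost(x[e], G.edges[e]['t0'], G.edges[e]['cap'])
                 for e in G.edges}
        y = all_or_nothing(G, demands, costs)    # linearized subproblem
        gamma = 2.0 / (k + 2.0)                  # predetermined step size
        x = {e: x[e] + gamma * (y[e] - x[e]) for e in G.edges}
    return x

# Tiny demo network (all values are made up).
G = nx.DiGraph()
G.add_edge('A', 'B', t0=1.0, cap=100.0)
G.add_edge('B', 'C', t0=1.0, cap=100.0)
G.add_edge('A', 'C', t0=2.5, cap=50.0)
print(frank_wolfe(G, {('A', 'C'): 120.0}))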
Abstract:
The aim of Tissue Engineering is to develop biological substitutes that restore the lost morphological and functional features of diseased or damaged portions of organs. Computer-aided technology has recently received considerable attention in tissue engineering, and the advance of additive manufacturing (AM) techniques has significantly improved control over the pore-network architecture of tissue engineering scaffolds. To regenerate tissues more efficiently, an ideal scaffold should have appropriate porosity and pore structure. More sophisticated porous configurations, with pore networks and scaffolding structures that mimic the intricate architecture and complexity of native organs and tissues, are therefore required. This study adopts a macro-structural shape design approach to the production of open porous materials (titanium foams) that uses spatial periodicity as a simple way to generate the models. Among the various pore architectures that have been studied, this work models the pore structure with triply periodic minimal surfaces (TPMS) for the construction of tissue engineering scaffolds. TPMS are shown to be a versatile source of biomorphic scaffold designs. A set of tissue scaffolds was designed using TPMS-based unit-cell libraries. The TPMS-based titanium foams were intended to be 3D printed with the predicted geometry, microstructure and, consequently, mechanical properties. Through finite element analysis (FEA), the mechanical properties of the designed scaffolds were determined in compression and analyzed in terms of their porosity and unit-cell assemblies. The purpose of this work was to investigate the mechanical performance of the TPMS models and to identify the best compromise between the mechanical and geometrical requirements of the scaffolds. The intention was to predict the structural modulus of open porous materials via the structural design of interconnected three-dimensional lattices, hence optimising the geometrical properties. With the aid of the FEA results, the effective mechanical properties of the TPMS-based scaffold units are expected to support the design of optimized scaffolds for tissue engineering applications. Regardless of the fabrication method, it is desirable to calculate scaffold properties so that their effect on tissue regeneration may be better understood.
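As an illustration of TPMS-based scaffold design, the sketch below voxelizes the classic gyroid approximation and estimates the resulting porosity; the resolution, number of unit cells and level-set threshold are assumed parameters, not values from this study.

# Hedged sketch: a gyroid-type TPMS scaffold as a voxel model, with porosity estimate.
import numpy as np

def gyroid_scaffold(n=64, cells=2, t=0.6):
    """Voxelize one of the classic TPMS approximations (Schoen's gyroid)."""
    u = np.linspace(0, 2 * np.pi * cells, n)
    X, Y, Z = np.meshgrid(u, u, u, indexing='ij')
    g = (np.sin(X) * np.cos(Y) +
         np.sin(Y) * np.cos(Z) +
         np.sin(Z) * np.cos(X))
    solid = np.abs(g) <= t          # sheet-like solid around the minimal surface
    porosity = 1.0 - solid.mean()   # void fraction of the voxel model
    return solid, porosity

solid, porosity = gyroid_scaffold()
print(f"estimated porosity: {porosity:.2f}")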
Abstract:
Infectious complications associated with implants account for a large share of all hospital-acquired infections and drive up healthcare costs significantly. Bacterial colonization of implant surfaces has serious medical consequences that can, in some cases, be fatal. Despite extensive research in the field of antibacterial surface coatings, the spectrum of effective substances is limited by the adaptability of various microorganisms and their development of resistance. The research and development of new antibacterial materials is therefore of fundamental importance.

In the present work, several systems based on polymer nanoparticles and inorganic/polymer composite materials were developed as alternatives to existing antibacterial surface coatings. Polymer particles are used in many different fields, since their size, composition and morphology can be tailored in versatile ways. The miniemulsion technique makes it possible, among other things, to produce functional polymer nanoparticles in the size range of 50-500 nm. It was applied in the first system to synthesize PEGylated poly(styrene) nanoparticles, whose anti-adhesive potential against P. aeruginosa was evaluated. In the second system, so-called contact-active colloidal dispersions were developed, which showed bacteriostatic properties against S. aureus. In analogy to the first system, poly(styrene) nanoparticles were functionalized with quaternary ammonium groups by copolymerization in miniemulsion. The previously quaternized, surface-active monomer (2-dimethylamino)ethyl methacrylate (qDMAEMA) served as costabilizer. The antibacterial properties were optimized in the following system: the surface-active monomer qDMAEMA was polymerized into a surface-active polyelectrolyte, which was converted into the corresponding polyelectrolyte nanoparticles using a combined miniemulsion and solvent evaporation technique. Owing to its surface-active properties, the polyelectrolyte formed stable particle dispersions without the addition of further surfactants. The selective toxicity of the polyelectrolyte nanoparticles toward S. aureus, in contrast to body cells, underlines their promising potential as a bactericidal, contact-active agent.

Because of their antibacterial properties, ZnO nanoparticles were selected and integrated into different release systems. Highly defined, faceted ZnO nanocrystals with a mean diameter of 23 nm were synthesized by thermal decomposition of the precursor material. By subsequent encapsulation in poly(L-lactide) latex particles, new antibacterial and UV-responsive hybrid nanoparticles were developed. Photocatalytic activation of ZnO by UV irradiation shortened the degradation of the ZnO/PLLA hybrid nanoparticles significantly, from several months to several weeks. The photoactivation of ZnO thus opens up the possibility of a controlled release of ZnO. In a further system, thin composite films consisting of poly(N-isopropylacrylamide) hydrogel layers with embedded ZnO nanoparticles were prepared and used as bactericidal surface coatings against E. coli. Even with a minimal ZnO content, the films showed antibacterial activity comparable to that of silver-based coatings, and the ZnO content can be adjusted relatively easily via the film thickness. Furthermore, films with bactericidal ZnO concentrations proved to be non-cytotoxic toward body cells. In summary, several promising antibacterial prototypes were developed that can be further tailored and optimized for the respective application as potential implant coatings.
Abstract:
The aim of this work is to investigate the influence of blister design and foil quality on the functionality of blister packs. For this purpose, analytical methods based on interferometry, IR spectroscopy, beta backscattering, eddy-current measurement and impedance spectroscopy are developed that are suitable for the quantitative determination of heat-seal lacquers and laminate coatings on aluminium blister foils. A comparison of the methods shows that beta backscattering, interferometry and IR measurements are suitable for determining the heat-seal lacquer, while interferometry and the eddy-current method are suitable for determining polymer laminates.

In the second part of the work, the influence of the heat-seal lacquer coating weight of lidding foils on the quality of blister packs is investigated. With increasing coating weight, the seal-seam strength increases, but so does the water-vapour permeability of the blisters. The heat-seal lacquers investigated show permeation coefficients comparable to polyvinyl chloride. In studies of sealing-process validity, the heat-seal lacquer coating weight shows only minor effects.

In the third part of the work, the influence of blister design on the user-friendliness of blister packs is examined in a handling study. Variations in the opening forces of push-through blisters clearly affect how the test subjects rate the blisters. While most subjects were able to open all tested push-through blisters within the 4-minute test period (>84%), considerably more handling problems occurred with the peel blister and the peel-off-push-through blister. The handling problems correlate with the subjects' age, living situation, state of health and visual ability.
Abstract:
During the last few decades, unprecedented technological growth has shaped embedded systems design, with Moore's Law being the leading factor of this trend. Today an ever increasing number of cores can be integrated on the same die, marking the transition from state-of-the-art multi-core chips to the new many-core design paradigm. Despite the extraordinarily high computing power, the complexity of many-core chips opens the door to several challenges. As a result of the increased silicon density of modern Systems-on-a-Chip (SoC), the design space exploration needed to find the best design has exploded, and hardware designers face the problem of a huge design space. Virtual Platforms have long been used to enable hardware-software co-design, but today they must cope with the huge complexity of both hardware and software systems. In this thesis, two research works on Virtual Platforms are presented: the first is intended for the hardware developer, to easily allow complex cycle-accurate simulations of many-core SoCs; the second exploits the parallel computing power of off-the-shelf General Purpose Graphics Processing Units (GPGPUs), with the goal of increased simulation speed. In the context of many-core systems, the term virtualization refers not only to the aforementioned hardware emulation tools (Virtual Platforms), but also to two other main purposes: 1) helping the programmer achieve the maximum possible performance of an application by hiding the complexity of the underlying hardware; 2) efficiently exploiting the highly parallel hardware of many-core chips in environments with multiple active Virtual Machines. This thesis focuses on virtualization techniques that aim to mitigate, and where possible overcome, some of the challenges introduced by the many-core design paradigm.
Abstract:
This PhD thesis focused on nanomaterial (NM) engineering for occupational health and safety, within the EU project "Safe Nano Worker Exposure Scenarios (SANOWORK)". Following a safety-by-design approach, surface engineering techniques (surface coating, purification, colloidal force control, wet milling, film coating deposition and granulation) were proposed as risk remediation strategies (RRS) to decrease the toxicity and emission potential of NMs within real processing lines. In the first case investigated, the PlasmaChem ZrO2 manufacturing line, colloidal force control applied to the washing of the synthesis reactor reduced ZrO2 contamination in wastewater and enabled an efficient recycling procedure for the recovered ZrO2. The ZrO2 NM was further investigated in the ceramic process owned by CNR-ISTEC and GEA-Niro, where spray drying and freeze drying were employed to decrease NM emissivity while maintaining a reactive surface in the dried NM. For the handling of nanofibers (NFs) obtained through the Elmarco electrospinning procedure, film coating deposition was applied on polyamide non-woven to avoid free fiber release. For TiO2 NFs, wet milling was applied to reduce and homogenize the aspect ratio, leading to a significant mitigation of fiber toxicity. In the Colorobbia spray coating line, Ag and TiO2 nanosols, employed to transfer antibacterial or depolluting properties, respectively, to different substrates, were investigated. Ag was subjected to surface coating and purification, decreasing NM toxicity; TiO2 was modified by surface coating, spray drying and blending with colloidal SiO2, improving its technological performance. In the extrusion of polymeric matrices filled with carbon nanotubes (CNTs), owned by Leitat, the CNTs used as filler were granulated by spray drying and freeze spray drying, reducing their exposure potential. The engineered NMs tested by biologists were further investigated under relevant biological conditions, to improve the understanding of structure/toxicity mechanisms and obtain new insights for the design of safer NMs.
Abstract:
This thesis collects the outcomes of a Ph.D. course in Telecommunications Engineering and focuses on the study and design of techniques able to counteract interference signals in Global Navigation Satellite Systems (GNSS). The subject is the jamming threat in navigation systems, which has become an increasingly important topic in recent years due to the wide diffusion of GNSS-based civil applications. Detection and mitigation techniques are developed to counteract jamming signals and are tested in different scenarios, including sophisticated signals. The thesis is organized in two main parts, both dealing with the management of intentional counterfeit GNSS signals. The first part deals with interference management, focusing on intentional interfering signals. In particular, a technique for the detection and localization of the interfering signal level in the GNSS bands in the frequency domain is proposed. In addition, an effective mitigation technique is introduced which exploits the periodic characteristics of common jamming signals to reduce interfering effects at the receiver side. This technique has also been tested in a different and more complicated scenario, remaining effective in the mitigation and cancellation of the interfering signal without high complexity. The second part also deals with interference management, but for more sophisticated signals. The attention is focused on the detection of spoofing signals, the most complex among the jamming signal types. Because this kind of signal is so difficult to detect and mitigate, the spoofing threat is considered the most dangerous. In this work, a possible technique able to detect this sophisticated signal is proposed, jointly observing and exploiting the outputs of several operational block measurements along the GNSS receiver chain.
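As a rough illustration of frequency-domain interference detection, the sketch below flags FFT bins rising above an estimated noise floor; the threshold rule, sample rate and synthetic test signal are assumptions and do not reproduce the detection and localization technique proposed in the thesis.

# Hedged sketch: flag a narrowband jammer in raw IF samples via an FFT power threshold.
import numpy as np

def detect_jammer(samples, fs, n_fft=4096, k_sigma=5.0):
    """Flag FFT bins whose power exceeds a robust estimate of the noise floor."""
    spectrum = np.fft.fftshift(np.fft.fft(samples[:n_fft]))
    power_db = 20 * np.log10(np.abs(spectrum) + 1e-12)
    floor = np.median(power_db)                                # robust floor estimate
    sigma = 1.4826 * np.median(np.abs(power_db - floor))       # MAD-based spread
    hits = power_db > floor + k_sigma * sigma
    freqs = np.fft.fftshift(np.fft.fftfreq(n_fft, d=1.0 / fs))
    return freqs[hits], power_db[hits]

# Synthetic test: band noise plus a continuous-wave interferer at 0.5 MHz.
fs = 4.0e6
t = np.arange(8192) / fs
x = np.random.randn(t.size) + 10 * np.cos(2 * np.pi * 0.5e6 * t)
print(detect_jammer(x, fs)[0])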
Abstract:
Data deduplication describes a class of approaches that reduce the storage capacity needed to store data or the amount of data that has to be transferred over a network. These approaches detect coarse-grained redundancies within a data set, e.g. a file system, and remove them.

One of the most important applications of data deduplication is backup storage systems, where these approaches are able to reduce the storage requirements to a small fraction of the logical backup data size. This thesis introduces multiple new extensions of so-called fingerprinting-based data deduplication. It starts with the presentation of a novel system design that allows using a cluster of servers to perform exact data deduplication with small chunks in a scalable way.

Afterwards, a combination of compression approaches for an important, but often overlooked, data structure in data deduplication systems, so-called block and file recipes, is introduced. Using these compression approaches, which exploit unique properties of data deduplication systems, the size of these recipes can be reduced by more than 92% in all investigated data sets. As file recipes can occupy a significant fraction of the overall storage capacity of data deduplication systems, the compression enables significant savings.

A technique to increase the write throughput of data deduplication systems, based on the aforementioned block and file recipes, is introduced next. The novel Block Locality Caching (BLC) uses properties of block and file recipes to overcome the chunk lookup disk bottleneck of data deduplication systems, which limits either their scalability or their throughput. The presented BLC overcomes the disk bottleneck more efficiently than existing approaches and is shown to be less prone to aging effects.

Finally, it is investigated whether large HPC storage systems exhibit redundancies that can be found by fingerprinting-based data deduplication. Over 3 PB of HPC storage data from different data sets have been analyzed. In most data sets, between 20 and 30% of the data can be classified as redundant. According to these results, future work should further investigate how data deduplication can be integrated into future HPC storage systems.

This thesis presents important novel work in different areas of data deduplication research.
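The core idea of fingerprinting-based deduplication, and of a file recipe, can be sketched as follows; real systems use content-defined chunking and a persistent on-disk chunk index, both of which are omitted in this illustrative fixed-size-chunk example.

# Hedged sketch: fixed-size-chunk fingerprint deduplication with a file recipe.
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Return a store of unique chunks and a 'file recipe' of ordered fingerprints."""
    store = {}      # fingerprint -> chunk bytes (unique chunks only)
    recipe = []     # ordered fingerprints, enough to reconstruct the file
    for off in range(0, len(data), chunk_size):
        chunk = data[off:off + chunk_size]
        fp = hashlib.sha1(chunk).hexdigest()
        store.setdefault(fp, chunk)
        recipe.append(fp)
    return store, recipe

def restore(store, recipe):
    return b"".join(store[fp] for fp in recipe)

data = b"A" * 20000 + b"B" * 20000 + b"A" * 20000   # redundant, backup-like data
store, recipe = deduplicate(data)
assert restore(store, recipe) == data
print(len(store), "unique chunks for", len(recipe), "logical chunks")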
Abstract:
Every year, thousands of surgical treatments are performed to repair or, where possible, completely replace organs or tissues affected by degenerative diseases. Patients with these kinds of illnesses wait a long time for a donor who could replace, in a short time, the damaged organ or tissue. The lack of biological alternatives to conventional surgical treatments such as autografts, allografts and xenografts led researchers from different areas to collaborate on innovative solutions. This research gave rise to a new discipline able to merge molecular biology, biomaterials, engineering, biomechanics and, recently, design and architecture knowledge. This discipline is named Tissue Engineering (TE), and it represents a step towards substitutive or regenerative medicine. One of the major challenges of TE is to design and develop, using a biomimetic approach, an artificial 3D anatomical scaffold suitable for the adhesion of cells that are able to proliferate and differentiate in response to the biological and biophysical stimuli of the specific tissue to be replaced. Nowadays, powerful instruments allow analyses that are ever more accurate and detailed for patients who need precise diagnoses and treatments. Starting from patient-specific information provided by CT (Computed Tomography), micro-CT and MRI (Magnetic Resonance Imaging), an image-based approach can be followed to reconstruct the site to be replaced. With the aid of recent Additive Manufacturing techniques, which allow printing three-dimensional objects with sub-millimetric precision, it is now possible to exercise almost complete control over the parametric characteristics of the scaffold: this is the way to achieve correct cellular regeneration. In this work, we focus on a branch of TE known as Bone TE, in which bone is the main subject. Bone TE combines the osteoconductive and morphological aspects of the scaffold, whose main properties are pore diameter, structure porosity and interconnectivity. Realizing the ideal values of these parameters is the main goal of this work: we create a simple and interactive biomimetic design process, based on 3D CAD modeling and generative algorithms, that provides a way to control the main properties and to create a structure morphologically similar to cancellous bone. Two different typologies of scaffold are compared: the first is based on Triply Periodic Minimal Surfaces (TPMS), whose basic crystalline geometries are nowadays used for Bone TE scaffolding; the second is based on Voronoi diagrams, which are more often used in the design of decorations and jewellery for their capacity to decompose and tessellate a volumetric space with a heterogeneous spatial distribution (frequent in nature). We show how to manipulate the main properties (pore diameter, structure porosity and interconnectivity) of the TE-oriented scaffold design through the implementation of generative algorithms: "bringing back the nature to the nature".
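As an illustration of the Voronoi-based approach, the sketch below derives a trabecular-like strut network from a 3D Voronoi diagram of random seed points; the seed density, domain size and the idea of turning facet edges into struts are assumed design choices, not the generative algorithm developed in this work.

# Hedged sketch: strut network from a 3D Voronoi diagram of random seeds.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(0)
seeds = rng.uniform(0.0, 10.0, size=(200, 3))    # mm; seed density controls pore size
vor = Voronoi(seeds)

struts = set()
for ridge in vor.ridge_vertices:                 # each ridge is a polygonal cell facet
    if -1 in ridge:                              # skip unbounded facets
        continue
    for a, b in zip(ridge, ridge[1:] + ridge[:1]):
        struts.add(tuple(sorted((a, b))))        # facet edges become struts

lengths = [np.linalg.norm(vor.vertices[a] - vor.vertices[b]) for a, b in struts]
print(f"{len(struts)} struts, mean length {np.mean(lengths):.2f} mm")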
Abstract:
The goal of this thesis is to describe the design and development of a tool for collecting reports of architectural barriers that can engage as many users as possible through a serious game, implementing game mechanics that encourage its use while keeping it entertaining, also thanks to themed content such as zombies, all through a simple and functional interface. The reports are publicly available through Fusion Tables, where they are stored, and an overall view of the critical spots is provided by displaying them on Google Maps.
Abstract:
Cloud services are becoming ever more important for everyone's life. Cloud storage? Web mail? We don't need to work at big IT companies to be surrounded by cloud services. Another thing that is growing in importance, or at least that should be considered ever more important, is the concept of privacy. The more we rely on services about which we know close to nothing, the more we should worry about our privacy. In this work, I analyze a prototype software based on a peer-to-peer architecture for offering cloud services, to see whether it is possible to make it completely anonymous: not only will the users be anonymous, but the peers composing it will not know each other's real identity. To make this possible, I use anonymizing networks such as Tor. I start by studying the state of the art of Cloud Computing, looking at some real examples, and then analyze the architecture of the prototype, trying to expose the differences between its distributed nature and the somewhat centralized solutions offered by the well-known vendors. After that, I go as deep as possible into the working principles of anonymizing networks, because they are not something that can simply be 'applied' mindlessly: some de-anonymizing techniques are very subtle, so things must be studied carefully. I then implement the required changes and test the new anonymized prototype to see how its performance differs from that of the standard one. The prototype is run on many machines, orchestrated by a tester script that automatically starts, stops, and performs all the required API calls. As for where to find all these machines, I use Amazon EC2 cloud services and their on-demand instances.
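As an illustration of how a peer's API traffic can be anonymized, the sketch below routes HTTP calls through a local Tor SOCKS proxy; the endpoint URL is a hypothetical placeholder, port 9050 assumes a default Tor client is already running, and requests needs the 'socks' extra installed. This is a generic pattern, not the prototype's actual integration.

# Hedged sketch: routing a peer's HTTP API calls through a local Tor SOCKS proxy.
import requests

TOR_PROXY = {
    "http": "socks5h://127.0.0.1:9050",    # socks5h: resolve DNS through Tor as well
    "https": "socks5h://127.0.0.1:9050",
}

def anonymous_call(url, payload):
    """Perform one API call of the prototype through the Tor network."""
    resp = requests.post(url, json=payload, proxies=TOR_PROXY, timeout=60)
    resp.raise_for_status()
    return resp.json()

# Example with a hypothetical peer endpoint:
# print(anonymous_call("http://examplepeer.onion/api/store", {"key": "k", "value": "v"}))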
Abstract:
A real living cell is a complex system governed by many processes which are not yet fully understood: cell differentiation is one of these. In this thesis we use a cell differentiation model to develop gene regulatory networks (Boolean networks) with desired differentiation dynamics. To accomplish this task we introduce automatic design techniques and perform experiments using various differentiation trees. The results obtained show that the developed algorithms, except the Random algorithm, are able to generate Boolean networks with interesting differentiation dynamics. Moreover, we present some possible future applications and developments of the cell differentiation model in robotics and in medical research. Understanding the mechanisms involved in biological cells may make it possible to explain diseases that are not yet understood, such as cancer.
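A minimal random Boolean network with synchronous update, the kind of gene regulatory model such design algorithms search over, can be sketched as follows; the node count, connectivity and attractor-detection loop are illustrative choices, not the algorithms developed in the thesis.

# Hedged sketch: a random Boolean network and the length of one of its attractors.
import random

def random_bn(n=8, k=2, seed=1):
    """Each node reads k random inputs through a random truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update of all nodes."""
    nxt = []
    for ins, table in zip(inputs, tables):
        idx = int("".join(str(state[j]) for j in ins), 2)
        nxt.append(table[idx])
    return tuple(nxt)

def attractor_length(state, inputs, tables):
    """Follow the trajectory until a state repeats; the cycle is an attractor
    (interpreted as a cell type in differentiation models of this kind)."""
    seen = {}
    while state not in seen:
        seen[state] = len(seen)
        state = step(state, inputs, tables)
    return len(seen) - seen[state]

inputs, tables = random_bn()
print(attractor_length((0,) * 8, inputs, tables))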
Abstract:
A new 2-D hydrophone array for ultrasound therapy monitoring is presented, along with a novel algorithm for passive acoustic mapping using a sparse weighted aperture. The array is constructed using existing polyvinylidene fluoride (PVDF) ultrasound sensor technology, and is utilized for its broadband characteristics and its high receive sensitivity. For most 2-D arrays, high-resolution imagery is desired, which requires a large aperture at the cost of a large number of elements. The proposed array's geometry is sparse, with elements only on the boundary of the rectangular aperture. The missing information from the interior is filled in using linear imaging techniques. After receiving acoustic emissions during ultrasound therapy, this algorithm applies an apodization to the sparse aperture to limit side lobes and then reconstructs acoustic activity with high spatiotemporal resolution. Experiments show verification of the theoretical point spread function, and cavitation maps in agar phantoms correspond closely to predicted areas, showing the validity of the array and methodology.
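For context, a conventional delay-and-sum style passive acoustic map with per-element apodization weights can be sketched as below; the geometry, sound speed, weights and synthetic data are illustrative, and this is not the sparse-aperture reconstruction algorithm proposed in the paper.

# Hedged sketch: apodized delay-and-sum passive acoustic mapping over a pixel grid.
import numpy as np

def passive_map(rf, t, elem_pos, grid_pts, weights, c=1540.0):
    """Delay, apodize and sum channel data for each pixel, then integrate energy."""
    energy = np.zeros(len(grid_pts))
    dt = t[1] - t[0]
    for p, pt in enumerate(grid_pts):
        delays = np.linalg.norm(elem_pos - pt, axis=1) / c         # pixel-to-element
        aligned = np.array([np.interp(t + d, t, ch) for d, ch in zip(delays, rf)])
        beam = np.sum(weights[:, None] * aligned, axis=0)          # apodized sum
        energy[p] = np.sum(beam ** 2) * dt                         # time-integrated power
    return energy

# Tiny synthetic example: one monochromatic source seen by a 16-element line array.
t = np.linspace(0, 40e-6, 2000)
elem_pos = np.array([[x, 0.0, 0.0] for x in np.linspace(-0.02, 0.02, 16)])
src = np.array([0.0, 0.0, 0.03])
rf = np.array([np.sin(2 * np.pi * 1e6 * (t - np.linalg.norm(e - src) / 1540.0))
               for e in elem_pos])
print(passive_map(rf, t, elem_pos, [src, np.array([0.01, 0.0, 0.03])], np.hanning(16)))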
Abstract:
Analog filters and direct digital filters are implemented using digital signal processing techniques. Specifically, Butterworth, elliptic, and Chebyshev filters are implemented on the Motorola 56001 Digital Signal Processor by integrating three software packages: MATLAB, C++, and Motorola's Application Development System. The integrated environment allows the novice user to design a filter automatically by specifying the filter order and critical frequencies, while permitting more experienced designers to take advantage of MATLAB's advanced design capabilities. This project bridges the gap between the theoretical results produced by MATLAB and the practicalities of implementing digital filters on the Motorola 56001 Digital Signal Processor. While these results are specific to the Motorola 56001, they may be extended to other digital signal processors. MATLAB handles the filter calculations, a C++ routine handles the conversion to assembly code, and the Motorola software compiles and transmits the code to the processor.
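An analogous design step in Python/SciPy (standing in for the MATLAB stage described above) might look as follows; the order, cutoff and sample rate are example values, and the coefficient quantization and assembly export performed by the C++ and Motorola tools are omitted.

# Hedged sketch: order-and-cutoff driven digital filter design, SciPy equivalent of the MATLAB step.
from scipy import signal

order, fc, fs = 4, 1000.0, 8000.0
b, a = signal.butter(order, fc, btype="low", fs=fs)   # digital Butterworth coefficients
# Elliptic and Chebyshev designs follow the same pattern, e.g.:
# b, a = signal.ellip(order, 1, 40, fc, fs=fs)
# b, a = signal.cheby1(order, 1, fc, fs=fs)
print("numerator:", b)
print("denominator:", a)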
Digital signal processing and digital system design using discrete cosine transform [student course]
Abstract:
The discrete cosine transform (DCT) is an important functional block for image processing applications. The implementation of a DCT has been viewed as a specialized research task. We apply a micro-architecture based methodology to the hardware implementation of an efficient DCT algorithm in a digital design course. Several circuit optimization and design space exploration techniques at the register-transfer and logic levels are introduced in class for generating the final design. The students not only learn how the algorithm can be implemented, but also receive insights about how other signal processing algorithms can be translated into a hardware implementation. Since signal processing has very broad applications, the study and implementation of an extensively used signal processing algorithm in a digital design course significantly enhances the learning experience in both digital signal processing and digital design areas for the students.
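As a software reference for such a hardware DCT block, a direct 8-point DCT-II might be sketched as follows; a hardware implementation would use fixed-point constants and shared multipliers, which this illustrative floating-point version ignores, and the test vector is an arbitrary example.

# Hedged sketch: direct (O(N^2)) 8-point DCT-II with orthonormal scaling.
import numpy as np

def dct8(x):
    """Compute X[k] = c_k * sum_n x[n] * cos(pi*(2n+1)*k / 16)."""
    N = 8
    n = np.arange(N)
    C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    X = C @ x
    X[0] *= np.sqrt(1.0 / N)      # DC term scaling
    X[1:] *= np.sqrt(2.0 / N)     # AC term scaling
    return X

x = np.array([52, 55, 61, 66, 70, 61, 64, 73], dtype=float)
print(np.round(dct8(x), 2))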