Abstract:
Two of the main features of today's complex software systems, such as pervasive computing systems and Internet-based applications, are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control, whereby systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution, whereby entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution, whereby interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic in that they allow components to be updated, added, or removed while the system is running. Engineering open and distributed computational systems mandates the adoption of a software infrastructure whose underlying model and technology can provide the required level of uncoupling among system components. This is the main motivation behind current research trends in coordination middleware that exploit tuple-based coordination models in the engineering of complex software systems, since such models intrinsically provide coordinated components with communication uncoupling. An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely scenarios where most activities are based on knowledge in some form, and where knowledge becomes the prominent means by which systems are coordinated. Handling knowledge in tuple-based systems raises problems of syntax (e.g., two tuples containing the same data may not match due to differences in tuple structure) and, above all, of semantics (e.g., two tuples representing the same information may not match because they adopt different syntaxes). So far, the problem has been tackled by exploiting tuple-based coordination within middleware for knowledge-intensive environments, e.g., experiments with tuple-based coordination within Semantic Web middleware; analogous approaches have been surveyed in the literature. However, these approaches appear to target coordination for specific application contexts, such as the Semantic Web and Semantic Web Services, and they result in rather involved extensions of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, the model and technology of semantic tuple centres were developed. The tuple centre model is adopted as the main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space in which the behaviour of the tuple space can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. The tuple centre model was then semantically enriched: a main design choice in this work was not to completely redesign the existing syntactic tuple space model, but rather to provide a smooth extension that, while supporting semantic reasoning, keeps tuples and tuple matching as simple as possible.
By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components. The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model on top of an existing coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem suitable as coordination media.
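As an editorial illustration of the abstraction described above, the following is a minimal Python sketch of a Linda-style tuple space extended with a programmable reaction hook; all names are hypothetical, and real tuple centres (e.g. ReSpecT-programmable ones) express coordination laws in a dedicated logic language rather than in host-language callbacks.

```python
from threading import Condition

class TupleCentre:
    """Minimal Linda-style tuple space with a programmable reaction hook.

    A hypothetical sketch, not the thesis's actual implementation.
    """
    def __init__(self):
        self._tuples = []
        self._cv = Condition()
        self._reactions = []          # callbacks fired on insertion events

    def add_reaction(self, callback):
        self._reactions.append(callback)

    def out(self, tup):
        """Insert a tuple and trigger any registered reactions."""
        with self._cv:
            self._tuples.append(tup)
            for react in self._reactions:
                react(self, tup)      # coordination law runs inside the medium
            self._cv.notify_all()

    def rd(self, template):
        """Blocking, non-destructive read of a matching tuple."""
        with self._cv:
            while True:
                for t in self._tuples:
                    if self._matches(template, t):
                        return t
                self._cv.wait()

    @staticmethod
    def _matches(template, tup):
        # Purely syntactic matching: None acts as a wildcard field.
        return len(template) == len(tup) and all(
            f is None or f == v for f, v in zip(template, tup))

ts = TupleCentre()
# A reaction implementing a simple coordination law: log every insertion.
ts.add_reaction(lambda space, tup: print("inserted:", tup))
ts.out(("temperature", "room1", 21))
print(ts.rd(("temperature", "room1", None)))
```

A semantic tuple centre would replace the purely syntactic `_matches` above with ontology-aware matching, so that tuples differing in structure or vocabulary but equivalent in meaning can still match.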
Abstract:
The objective of this Ph.D. thesis is to lay the foundations of an all-embracing link analysis procedure that may form a general reference scheme for the future state of the art of RF/microwave link design. It is basically meant as a circuit-level simulation of an entire radio link, with (generally multiple) transmitting and receiving antennas examined by EM analysis. In this way the influence of mutual couplings on the frequency-dependent near-field and far-field performance of each element is fully accounted for. The set of transmitters is treated as a single nonlinear system loaded by the multiport antenna, and is analyzed by nonlinear circuit techniques. In order to establish the connection between transmitters and receivers, the far fields incident on the receivers are evaluated by EM analysis and are combined by extending an available ray-tracing technique to the link study. EM theory is used to describe the receiving array as a linear active multiport network. Link performance in terms of bit error rate (BER) is eventually verified a posteriori by a fast system-level algorithm. In order to validate the proposed approach, four heterogeneous application contexts are provided. A complete MIMO link design in a realistic propagation scenario constitutes the reference case study. The second context regards the design, optimization and testing of various types of rectennas for power generation from common RF sources. Finally, two types of radio identification tags, at X-band and V-band respectively, are designed and implemented. In all cases, exhaustive nonlinear/electromagnetic co-simulation and co-design is demonstrated to be essential for accurate system performance prediction.
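To illustrate the kind of system-level BER verification mentioned above, here is a hedged Monte Carlo sketch for BPSK over a 2x2 i.i.d. Rayleigh MIMO channel with a zero-forcing receiver; it is a generic textbook baseline, not the thesis's algorithm, and all parameters are illustrative.

```python
import numpy as np

def ber_mimo_zf(snr_db, n_sym=20_000, nt=2, nr=2, seed=0):
    """Monte Carlo BER of BPSK over an i.i.d. Rayleigh MIMO channel with ZF."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, size=(nt, n_sym))
    x = 2 * bits - 1                                   # BPSK mapping
    errors = 0
    for k in range(n_sym):
        H = (rng.standard_normal((nr, nt))
             + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        n = (rng.standard_normal(nr)
             + 1j * rng.standard_normal(nr)) * np.sqrt(nt / (2 * snr))
        y = H @ x[:, k] + n
        x_hat = np.linalg.pinv(H) @ y                  # zero-forcing equalizer
        errors += np.sum((x_hat.real > 0).astype(int) != bits[:, k])
    return errors / (nt * n_sym)

for snr_db in (0, 5, 10):
    print(snr_db, "dB ->", ber_mimo_zf(snr_db))
```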
Abstract:
Design parameters, process flows, electro-thermal-fluidic simulations and experimental characterizations of Micro-Electro-Mechanical Systems (MEMS) suited for gas-chromatographic (GC) applications are presented and thoroughly described in this thesis, whose topic belongs to the research activities in which the Institute for Microelectronics and Microsystems (IMM)-Bologna has been involved for several years, i.e. the development of micro-systems for chemical analysis, based on silicon micro-machining techniques and able to analyse complex gaseous mixtures, especially in the field of environmental monitoring. In this regard, attention has been focused on the development of micro-fabricated devices to be employed in a portable mini-GC system for the analysis of aromatic Volatile Organic Compounds (VOC) like Benzene, Toluene, Ethyl-benzene and Xylene (BTEX), i.e. chemical compounds which can significantly affect the environment and human health because of their demonstrated carcinogenicity (benzene) or toxicity (toluene, xylene) even at parts-per-billion (ppb) concentrations. The most significant results achieved through the laboratory functional characterization of the mini-GC system are reported, together with the results of in-field analyses carried out in a station of the Bologna air monitoring network and compared with those provided by a commercial GC system. The development of more advanced prototypes of micro-fabricated devices specifically suited for FAST-GC is also presented (silicon capillary columns, an Ultra-Low-Power (ULP) Metal OXide (MOX) sensor, a Thermal Conductivity Detector (TCD)), together with the technological processes for their fabrication. The experimentally demonstrated very high sensitivity of ULP-MOX sensors to VOCs, coupled with their extremely low power consumption, makes the developed ULP-MOX sensor the best-performing metal oxide sensor reported in the literature to date, while preliminary tests proved that the developed silicon capillary columns are capable of performance comparable to that of the best fused-silica capillary columns. Finally, the development and validation of a coupled electro-thermal Finite Element Model suited for both steady-state and transient analysis of the micro-devices is described; the model was subsequently extended with a fluidic part to investigate device behaviour in the presence of a gas flowing at given volumetric flow rates.
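As a much-reduced illustration of coupled electro-thermal simulation (the thesis develops a full finite element model; this is only a lumped-parameter sketch with invented values), a micro-heater's temperature can be integrated from the power balance C dT/dt = V^2/R(T) - G(T - T_amb), with the electrical resistance R(T) = R0(1 + alpha(T - T0)) feeding back on the thermal side:

```python
# Hypothetical lumped parameters for a micro-hotplate (illustrative only).
R0, alpha, T0 = 100.0, 3e-3, 25.0   # ohms, 1/K, reference temperature (degC)
C, G, T_amb = 1e-6, 2e-4, 25.0      # heat capacity J/K, conductance W/K, degC
V = 2.0                             # applied voltage (V)

def simulate(t_end=0.2, dt=1e-5):
    """Explicit Euler integration of the coupled electro-thermal balance."""
    T = T_amb
    for _ in range(int(t_end / dt)):
        R = R0 * (1 + alpha * (T - T0))      # electrical side: R depends on T
        P = V ** 2 / R                       # Joule heating
        T += dt * (P - G * (T - T_amb)) / C  # thermal side: lumped heat balance
    return T

print(f"steady-state temperature ~ {simulate():.1f} degC")
```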
Abstract:
Synthetic biology has recently undergone great development: many papers have been published and many applications have been presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, however, most applications are quite simple and do not fully exploit the potential of this discipline. This limitation in complexity has many causes, like the incomplete characterization of some components or the intrinsic variability of biological systems, but one of the most important reasons is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to address this problem through the engineering of a multicellular behaviour in prokaryotic cells. This system will introduce a cooperative behaviour that allows the implementation of complex functionalities that cannot be obtained with a single cell. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to identify a single process as organizer and coordinator of a series of tasks assigned to the whole population. The election of the Leader greatly simplifies the computation by providing centralized control. Furthermore, this system may even be useful to evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and the experimental characterization of a component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after receiving the signal that a leader is present in the colony. The most important element, in this case, is the hybrid promoter: it has been realized in different versions, applying the heuristic rules stated in [22], and their activity has been experimentally tested. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction, in the cellular environment, of particular molecules, the inducers, which can be considered the inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is high only in the presence of both inducers. The robustness and stability of this behaviour have been tested by changing the concentrations of the input signals and building dose-response curves. From these data it is possible to conclude that the analysed constructs have an AND-like behaviour over a wide range of inducer concentrations, even if many differences can be identified in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so their binary representation is unable to capture the full complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system that is necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission. In particular, the interaction between this element and the one responsible for emitting the chemical signal has been tested.
The desired behaviour is still similar to a logic AND since, in this case too, the output signal is determined by the hybrid promoter activity. The experimental results demonstrate that the systems behave correctly, even if there is still substantial variability between them. The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter the DNA sequences of the hybrid promoters are analysed, trying to identify the regulatory elements that matter most for the determination of gene expression. Given the available data, it was not possible to draw definitive conclusions. Finally, a few considerations on promoter engineering and the realization of complex circuits are presented; this section briefly recalls some of the problems outlined in the introduction and proposes a few possible solutions.
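A common way to rationalize such dose-response surfaces is a product of Hill functions, which yields exactly the AND-like but continuous behaviour described above; the sketch below uses hypothetical parameters and is not a fit to the constructs characterized in the thesis.

```python
import numpy as np

def hill(x, K, n):
    """Hill activation function: fraction of promoter activated by inducer x."""
    return x ** n / (K ** n + x ** n)

def and_gate_output(a, b, K_a=10.0, K_b=5.0, n_a=2.0, n_b=2.0, basal=0.02):
    """AND-like hybrid promoter: strong output only when both inducers are high.

    Parameters are illustrative; a real characterization would fit them to
    the measured fluorescence dose-response curves.
    """
    return basal + (1 - basal) * hill(a, K_a, n_a) * hill(b, K_b, n_b)

# Dose-response grid: rows sweep inducer A, columns sweep inducer B.
A = np.array([0.0, 1.0, 10.0, 100.0])
B = np.array([0.0, 0.5, 5.0, 50.0])
for a in A:
    print([f"{and_gate_output(a, b):.2f}" for b in B])
```

Because the output is continuous, any binary ("0/1") reading of such a gate necessarily discards information, which is consistent with the variability reported above.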
Abstract:
Multi-Processor SoC (MPSoC) design brings to the foreground a large number of challenges, one of the most prominent of which is the design of the chip interconnection. With the number of on-chip blocks presently ranging in the tens and quickly approaching the hundreds, the novel issue of how best to provide on-chip communication resources is clearly felt. The scaling down of process technologies has increased process and dynamic variations as well as transistor wearout. Because of this, delay variations increase and impact the performance of MPSoCs. The interconnect architecture in MPSoCs becomes a single point of failure, as it connects all other components of the system together. A faulty processing element may be shut down entirely, but the interconnect architecture must be able to tolerate partial failure and variations and keep operating, at some performance, power or latency overhead. This dissertation focuses on techniques at different levels of abstraction to cope with reliability and variability issues in on-chip interconnection networks. By showing the test results of a GALS NoC test chip, this dissertation motivates the need for techniques to detect and work around manufacturing faults and process variations in the MPSoC interconnection infrastructure. As a physical design technique, we propose the bundle routing framework as an effective way to route the global links of Networks-on-Chip. For architecture-level design, two cases are addressed: (i) intra-cluster communication, where we propose a low-latency interconnect with variability robustness; (ii) inter-cluster communication, where online functional testing together with a reliable NoC configuration is proposed. We also propose dual Vdd as an orthogonal way of compensating variability at the post-fabrication stage. This is an alternative strategy with respect to the design techniques, since it enforces the compensation at the post-silicon stage.
Abstract:
Heavy pig breeding in Italy is mainly oriented towards the production of high-quality processed products. Of particular importance is dry-cured ham production, which is strictly regulated and requires specific carcass characteristics correlated with green leg characteristics. Furthermore, as pigs are slaughtered at about 160 kg live weight, the Italian pig breeding sector faces severe problems of production efficiency, related to all the biological aspects linked to growth, feed conversion, fat deposition and so on. It is well known that production and carcass traits are in part genetically determined. Therefore, as a first step towards understanding the genetic basis of traits that could have a direct or indirect impact on dry-cured ham production, a candidate gene approach can be used to identify DNA markers associated with parameters of economic importance. In this thesis, we investigated three candidate genes for carcass and production traits (TRIB3, PCSK1, MUC4) in pig breeds used for dry-cured ham production, using different experimental approaches in order to find molecular markers associated with these parameters.
Abstract:
This work deals with the design of a technological and organizational intervention capable of guiding the creation of a business ecosystem for a commercial district: Centergross (Funo d'Argelato, Bologna). The work was prepared during a five-month internship at Epoca S.r.l (Bologna), following the need, identified by the district's management, to rethink its business model. The objective of Epoca's intervention project is to support the district in this phase of change, from a "condominium" of autonomous firms to an ecosystem, through the identification and implementation of organizational and technological tools that can create opportunities for synergy, increase the completeness of the offering and exploit economies of scope. To this end, a new information system (a social network and a mobile application) was built, able to support the strategy and evolve with it. The thesis is structured in three parts. In the first part, through a survey of the relevant literature, the main models of organizational change and the role of information systems within organizations are investigated, with a closer look at the approaches, processes and management of change related to the introduction of a new information system within organizations, with reference to Business Process Reengineering. The second part is devoted to the evolution of the Web, with the cultural revolution brought about by the tools of the participatory, semantic and augmented web (a social network and a mobile application in this specific case), and to the design approach adopted: Design Thinking, a technique that aims to find the best fit between objectives, resources and available technologies, and to foster the development of innovative ideas by empathizing with the context of the problem and proposing solutions centred on users' needs. The last part of the work describes the case study. The analysis of the current situation is presented, followed by the subsequent steps in the design of the information system: the definition of prototypes, the population of the database, the domain model, the user interfaces and the navigation experiences within the proposed software platform. The approach has the project proceed through successive iterations and continuous adaptation to the client's needs.
Abstract:
The aim of this research is to compare two metropolitan contexts, Valencia and Bologna, with respect to work-placement support practices aimed at disadvantaged groups, in particular people with problems of dependence on psychotropic substances. The study proposes a comparison on several cross-cutting themes (types of actions deployed, territorial organization and governance, user profiles, social inclusion, involvement of the productive sector) and highlights the elements that allow us to identify and point out both transferable good practices and project guidelines. It starts from the assumption that enabling a person means first of all offering them adequate opportunities for choice, in Sen's sense and as explained by Nussbaum herself, but above all accompanying and supporting them along the path of work placement and, in parallel, social inclusion. The need that emerged is for motivational and orientational support following a socio-educational approach capable of providing the person with an integrated, individualized response, one that acts on autonomy, self-esteem and the elaboration of one's life and work experiences, as well as on contextual elements such as housing and networks of friends and family, which are often compromised. The distinctive element that makes it possible to act in this direction is the collaboration between the different services and the co-design of the pathway with the users themselves. Work placement is a very complex subject that calls into play several aspects: social changes and the transformations of work; the emergence of new vulnerable groups and the risk of worsening exclusion for the "traditional" vulnerable groups; the importance of work for the construction of identity and recognition; the impact of active policies on disadvantaged groups and the concepts of capacitation and activation; the role of social capital and the emergence of new forms of welfare; the network of actors involved in the placement process and the theme of territorial governance.
Abstract:
Data Distribution Management (DDM) is a core part of the High Level Architecture standard, as its goal is to optimize the resources used by simulation environments to exchange data. It has to filter and match the set of information generated during a simulation, so that each federate, i.e. each simulation entity, only receives the information it needs. It is important that this be done quickly and as well as possible, in order to obtain better performance and avoid the transmission of irrelevant data; otherwise, network resources may quickly saturate. The main topic of this thesis is the implementation of an impartial (super partes) DDM testbed that evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and may also accommodate different methods yet to be devised. It ranks them using three factors: execution time, memory usage and distance from the optimal solution. A prearranged set of instances is already available, but we also allow the creation of instances with user-provided parameters. The thesis is structured as follows. We start by introducing DDM and HLA and describing in detail what they do. Then, in the first chapter, we describe the state of the art, providing an overview of the most well-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes how the testbed we implemented is structured. In the fourth chapter we present and compare the results obtained from the execution of the four approaches we implemented. The software described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
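To make the matching task concrete, here is a minimal one-dimensional region-based brute-force matcher in Python, in the spirit of the baseline approaches such a testbed compares; the data and function names are hypothetical, and the testbed's own code is not reproduced here.

```python
from typing import List, Tuple

Region = Tuple[float, float]   # (lower, upper) extent on one dimension

def overlaps(u: Region, s: Region) -> bool:
    """Two extents match iff their intervals intersect."""
    return u[0] <= s[1] and s[0] <= u[1]

def brute_force_matching(updates: List[Region],
                         subscriptions: List[Region]) -> List[Tuple[int, int]]:
    """O(U*S) baseline: test every update region against every subscription.

    Grid-based approaches instead bucket regions into cells and only test
    pairs sharing a cell, trading exactness for speed.
    """
    return [(i, j)
            for i, u in enumerate(updates)
            for j, s in enumerate(subscriptions)
            if overlaps(u, s)]

updates = [(0.0, 2.0), (5.0, 6.0)]
subscriptions = [(1.5, 3.0), (4.0, 4.9), (5.5, 7.0)]
print(brute_force_matching(updates, subscriptions))  # [(0, 0), (1, 2)]
```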
Abstract:
Constructing ontology networks typically occurs at design time, at the hands of knowledge engineers who assemble their components statically. There are, however, use cases where ontology networks need to be assembled upon request and processed at runtime, without altering the stored ontologies and without tampering with one another. These are what we call "virtual [ontology] networks", and keeping track of how an ontology changes in each virtual network is called "multiplexing". Issues may arise from the connectivity of ontology networks. In many cases, simple flat import schemes will not work, because many ontology managers can cause property assertions to be erroneously interpreted as annotations and ignored by reasoners. Also, multiple virtual networks should optimize their cumulative memory footprint, and where they cannot, this should occur only for very limited periods of time. We claim that these problems should be handled by the software that serves these ontology networks, rather than by ontology engineering methodologies. We propose a method that spreads multiple virtual networks across a 3-tier structure and can reduce the number of erroneously interpreted axioms, under certain raw statement distributions across the ontologies. We assumed OWL as the core language handled by semantic applications in the framework at hand, due to the greater availability of reasoners and rule engines. We also verified that, in common OWL ontology management software, OWL axiom interpretation occurs in the worst-case scenario of a pre-order visit. To measure the effectiveness and space-efficiency of our solution, a Java and RESTful implementation was produced within an Apache project. We verified that a 3-tier structure can accommodate reasonably complex ontology networks better, in terms of the expressivity of OWL axiom interpretation, than flat-tree import schemes can. We measured both the memory overhead of the additional components placed on top of traditional ontology networks and the framework's caching capabilities.
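The multiplexing idea can be pictured as copy-on-write layering over immutable stored ontologies. The following plain-Python sketch is purely illustrative (the actual implementation is Java-based, OWL-aware and RESTful); it only shows how several virtual networks can share stored axiom sets without altering them or interfering with one another.

```python
class OntologyStore:
    """Immutable stored ontologies: name -> frozenset of axioms (as strings)."""
    def __init__(self):
        self._stored = {}

    def put(self, name, axioms):
        self._stored[name] = frozenset(axioms)

    def get(self, name):
        return self._stored[name]

class VirtualNetwork:
    """A runtime assembly of stored ontologies plus a private delta layer.

    Changes go to the delta, so stored ontologies stay untouched and
    other virtual networks never see them (the 'multiplexing' idea).
    """
    def __init__(self, store, names):
        self._store = store
        self._names = list(names)
        self._delta = set()

    def add_axiom(self, axiom):
        self._delta.add(axiom)       # copy-on-write: never mutate the store

    def axioms(self):
        merged = set(self._delta)
        for name in self._names:
            merged |= self._store.get(name)
        return merged

store = OntologyStore()
store.put("people", {"Class(Person)", "ObjectProperty(knows)"})
net_a = VirtualNetwork(store, ["people"])
net_b = VirtualNetwork(store, ["people"])
net_a.add_axiom("ClassAssertion(Person alice)")
print("net_a sees alice:", "ClassAssertion(Person alice)" in net_a.axioms())
print("net_b sees alice:", "ClassAssertion(Person alice)" in net_b.axioms())
```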
Abstract:
Reliable electronic systems, namely sets of reliable electronic devices connected to each other and working correctly together for the same functionality, represent an essential ingredient for the large-scale commercial implementation of any technological advancement. Microelectronics technologies and new powerful integrated circuits provide noticeable improvements in performance and cost-effectiveness, and allow the introduction of electronic systems in increasingly diversified contexts. On the other hand, the opening of new fields of application leads to new, unexplored reliability issues. The development of semiconductor device and electrical models (such as the well-known SPICE models) able to describe the electrical behavior of devices and circuits is a useful means to simulate and analyze the functionality of new electronic architectures and new technologies. Moreover, it represents an effective way to point out the reliability issues arising from the employment of advanced electronic systems in new application contexts. This thesis considers the modeling and design of both advanced reliable circuits for general-purpose applications and devices for energy efficiency. In more detail, the following activities have been carried out. First, reliability issues concerning the security of standard communication protocols in wireless sensor networks are discussed, and a new communication protocol that increases network security is introduced. Second, a novel scheme for the on-die measurement of either clock jitter or process parameter variations is proposed; the developed scheme can be used to evaluate both jitter and process parameter variations at low cost. Then, reliability issues in the field of energy scavenging systems are analyzed, with an accurate analysis and modeling of the effects of faults affecting circuits for energy harvesting from mechanical vibrations. Finally, the problem of modeling the electrical and thermal behavior of photovoltaic (PV) cells under hot-spot conditions is addressed with the development of an electrical and thermal model.
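For the PV modelling part, the customary electrical starting point is the single-diode equation I = I_ph - I_0(exp((V + I R_s)/(n V_t)) - 1) - (V + I R_s)/R_sh; the sketch below solves it numerically with illustrative parameters (the coupled electrical and thermal hot-spot model developed in the thesis goes well beyond this).

```python
import numpy as np

# Illustrative single-diode parameters for one cell (not fitted values).
I_ph, I_0 = 5.0, 1e-9        # photocurrent, diode saturation current [A]
R_s, R_sh = 0.02, 20.0       # series and shunt resistances [ohm]
n, V_t = 1.3, 0.0257         # ideality factor, thermal voltage at ~25 degC

def cell_current(V, iters=500, damping=0.5):
    """Solve the implicit single-diode equation by damped fixed-point iteration."""
    I = I_ph
    for _ in range(iters):
        I_new = (I_ph
                 - I_0 * np.expm1((V + I * R_s) / (n * V_t))
                 - (V + I * R_s) / R_sh)
        I = (1 - damping) * I + damping * I_new   # damped update for stability
    return I

for V in (0.0, 0.3, 0.5, 0.6):
    print(f"V = {V:.2f} V -> I = {cell_current(V):.3f} A")
```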
Abstract:
It is currently widely accepted that the understanding of complex cell functions depends on an integrated network-theoretical approach, and not on an isolated view of the different molecular agents. The aim of this thesis was the examination of topological properties that mirror known biological aspects, by depicting the human protein network with methods from graph and network theory. The presented network is a partial human interactome of 9222 proteins and 36324 interactions, consisting of single interactions reliably extracted from peer-reviewed scientific publications. In general, one can focus on intra- or intermodular characteristics, where a functional module is defined as "a discrete entity whose function is separable from those of other modules". The presented human network is found to be scale-free and hierarchically organised, as shown before for yeast networks. The interactome also exhibits proteins with high betweenness and low connectivity, which are biologically analyzed and interpreted here, for the first time, as proteins shuttling between organelles (e.g. ER to Golgi, internal ER protein translocation, peroxisomal import, nuclear pore import/export). As an optimisation for finding proteins that connect modules, a new method is developed here, based on proteins located between highly clustered regions rather than on highly connected regions. As a proof of principle, the Mediator complex, the prime example of a connector complex, is found in first place. Focusing on intramodular aspects, the measurement of k-clique communities discriminates overlapping modules very well. Twenty of the largest identified modules are analysed in detail and annotated to known biological structures (e.g. the proteasome, the NF-κB and TGF-β complexes). Additionally, two large and highly interconnected modules of signal transducer and transcription factor proteins are revealed, separated by known shuttling proteins. These proteins also yield the highest number of redundant shortcuts (computed via the network skeleton), exhibit the highest numbers of interactions, and might constitute highly interconnected but spatially separated rich-clubs for signal transduction and for transcription factors, respectively. This design principle allows manifold regulatory events for signal transduction and enables a high diversity of transcription events in the nucleus with a limited set of proteins. Altogether, biological aspects are mirrored by purely topological features, leading to a new view and to new methods that assist the annotation of proteins to biological functions, structures and subcellular localisations. As the human protein network is one of the most complex networks of all, these results will be fruitful for other fields of network theory and will help in understanding complex network functions in general.
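The "high betweenness, low connectivity" criterion for candidate shuttling proteins is easy to reproduce on any interaction graph; the sketch below applies it with networkx to a toy two-module graph, not to the thesis's 9222-protein interactome, and its thresholds are arbitrary.

```python
import networkx as nx

# Toy interactome: two dense modules bridged by a single 'shuttle' node.
G = nx.Graph()
module_a = [("a1", "a2"), ("a1", "a3"), ("a2", "a3"), ("a1", "a4"), ("a3", "a4")]
module_b = [("b1", "b2"), ("b1", "b3"), ("b2", "b3"), ("b2", "b4"), ("b3", "b4")]
bridge = [("a4", "shuttle"), ("shuttle", "b1")]
G.add_edges_from(module_a + module_b + bridge)

betweenness = nx.betweenness_centrality(G)
degree = dict(G.degree())

# Candidate shuttling proteins: high betweenness but few direct partners.
candidates = [v for v in G
              if betweenness[v] > 0.3 and degree[v] <= 2]
print(candidates)   # ['shuttle']
```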
Abstract:
The aim of Tissue Engineering is to develop biological substitutes that restore lost morphological and functional features of diseased or damaged portions of organs. Recently, computer-aided technology has received considerable attention in the area of tissue engineering, and the advance of additive manufacturing (AM) techniques has significantly improved control over the pore network architecture of tissue engineering scaffolds. To regenerate tissues more efficiently, an ideal scaffold should have appropriate porosity and pore structure. More sophisticated porous configurations, with pore network and scaffolding structures that mimic the intricate architecture and complexity of native organs and tissues, are therefore required. This study adopts a macro-structural shape design approach to the production of open porous materials (titanium foams), which uses spatial periodicity as a simple way to generate the models. Among the various pore architectures that have been studied, this work modelled the pore structure with triply periodic minimal surfaces (TPMS) for the construction of tissue engineering scaffolds. TPMS are shown to be a versatile source of biomorphic scaffold design, and a set of tissue scaffolds was designed using TPMS-based unit cell libraries. The TPMS-based titanium foams were meant to be 3D-printed with the predicted geometry, microstructure and, consequently, mechanical properties. Through finite element analysis (FEA), the mechanical properties of the designed scaffolds were determined in compression and analyzed in terms of their porosity and assemblies of unit cells. The purpose of this work was to investigate the mechanical performance of TPMS models, trying to find the best compromise between the mechanical and geometrical requirements of the scaffolds. The intention was to predict the structural modulus of open porous materials via the structural design of interconnected three-dimensional lattices, hence optimising geometrical properties. With the aid of the FEA results, it is expected that the effective mechanical properties of the TPMS-based scaffold units can be used to design optimized scaffolds for tissue engineering applications. Regardless of the influence of the fabrication method, it is desirable to calculate scaffold properties so that the effect of these properties on tissue regeneration may be better understood.
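The most common TPMS in scaffold design, the gyroid, is the implicit surface sin x cos y + sin y cos z + sin z cos x = t; as an illustration independent of the thesis's actual unit cell libraries and FEA models, the sketch below voxelizes one unit cell and estimates porosity as a function of the level-set offset t.

```python
import numpy as np

def gyroid(x, y, z):
    """Level-set function of the gyroid TPMS (one period is 2*pi)."""
    return (np.sin(x) * np.cos(y)
            + np.sin(y) * np.cos(z)
            + np.sin(z) * np.cos(x))

def porosity(t, n=64):
    """Voxelize one unit cell; material where gyroid(x, y, z) <= t."""
    s = np.linspace(0, 2 * np.pi, n, endpoint=False)
    X, Y, Z = np.meshgrid(s, s, s, indexing="ij")
    solid = gyroid(X, Y, Z) <= t
    return 1.0 - solid.mean()     # void fraction = 1 - solid fraction

for t in (-0.5, 0.0, 0.5):
    print(f"t = {t:+.1f} -> porosity ~ {porosity(t):.2f}")
```

Sweeping t trades solid fraction against pore volume, which is the kind of geometry/mechanics compromise the FEA study above explores.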
Abstract:
The aim of this work is to investigate the influence of blister design and foil quality on the functionality of blister packaging. To this end, analytical methods based on interferometry, IR spectroscopy, beta backscatter, eddy current measurement and impedance spectroscopy are developed, suitable for the quantitative determination of heat-seal lacquers and laminate coatings on aluminium blister foils. A comparison of the methods shows that beta backscatter, interferometry and IR measurements are suitable for determining heat-seal lacquers, while interferometry and the eddy current method are suitable for determining plastic laminates. In the second part of the work, the influence of the heat-seal lacquer coating weight of lidding foils on the quality of blister packaging is investigated. As the coating weight increases, both the seal seam strength and the water vapour permeability of the blisters increase. The investigated heat-seal lacquers show permeation coefficients comparable to polyvinyl chloride. In investigations of the validity of the sealing process, the heat-seal lacquer coating weight shows only minor effects. In the third part of the work, the influence of blister design on the user-friendliness of blister packaging is investigated by means of a handling study. Variations in the opening forces of push-through blisters clearly affect the participants' ratings of the blisters. While most participants were able to open all tested push-through blisters within the test period of 4 minutes (>84%), considerably more handling problems occurred with peel blisters and peel-off-push-through blisters. The handling problems correlate with the participants' age, living situation, state of health and visual acuity.
Abstract:
This thesis presents a new Artificial Neural Network (ANN) able to predict at once the main parameters representative of wave-structure interaction processes, i.e. the wave overtopping discharge, the wave transmission coefficient and the wave reflection coefficient. The new ANN has been specifically developed in order to provide managers and scientists with a tool that can be efficiently used for design purposes. The development of this ANN started with the preparation of a new, extended and homogeneous database that collects all the available tests reporting at least one of the three parameters, for a total of 16,165 data points. The variety of structure types and wave attack conditions in the database includes smooth, rock and armour-unit slopes, berm breakwaters, vertical walls, low-crested structures and oblique wave attacks. Some of the existing ANNs were compared and improved, leading to the selection of a final ANN whose architecture was optimized through an in-depth sensitivity analysis of the ANN training parameters. Each of the 15 selected input parameters represents a physical aspect of the wave-structure interaction process, describing the wave attack (wave steepness and obliquity, breaking and shoaling factors), the structure geometry (submergence, straight or non-straight slope, with or without berm or toe, presence or absence of a crown wall), or the structure type (smooth or covered by an armour layer, with permeable or impermeable core). The advanced ANN proposed here provides accurate predictions for all three parameters, and is shown to overcome the limits imposed by the traditional formulae and by the approach adopted so far by some of the existing ANNs. The possibility of adopting just one model to obtain a handy and accurate evaluation of the overall performance of a coastal or harbour structure represents the most important and exportable result of this work.
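As an illustration of the model class only, the following scikit-learn stand-in maps 15 inputs to the three output coefficients with a single feed-forward network; the synthetic data, architecture and parameters are invented and do not reproduce the thesis's ANN or its 16,165-test database.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in: 15 inputs (wave attack + geometry + structure type),
# 3 outputs (overtopping discharge, transmission Kt, reflection Kr).
X = rng.uniform(size=(2000, 15))
y = np.column_stack([
    np.exp(-3 * X[:, 0] - X[:, 1]),          # fake dimensionless overtopping
    0.5 * (1 - X[:, 2]) * X[:, 3] + 0.1,     # fake transmission coefficient
    0.7 * X[:, 4] + 0.1 * X[:, 5],           # fake reflection coefficient
])

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
net.fit(scaler.transform(X), y)              # one network, three outputs at once
print("training R^2:", round(net.score(scaler.transform(X), y), 3))
```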