934 results for Complex adaptive systems
Abstract:
This doctoral dissertation is triggered by an emergent problem: how can firms reinvent themselves? Continuity- and change-oriented decisions fundamentally shape, over time, the activities and potential revenues of organizations and other adaptive systems, but both types of actions draw upon limited resources and rely on different organizational routines and capabilities. Most organizations appear to have difficulties in making tradeoffs, so that it is easier to overinvest in one of them than to successfully achieve a mixture of both. Nevertheless, theory and empirical evidence suggest that too little of either may reduce performance, indicating a need to learn more about how organizations reconcile these tensions. In the first paper, I start from the consideration that rapid changes in competitive environments increasingly require firms to be “ambidextrous”, implementing organizational mechanisms and structures that allow continuity- and change-oriented activities to be engaged at the same time. More specifically, I show that continuity- and change-related decisions cannot be confined either inside or outside the firm, but span, over time, across distinct decision domains located within and beyond the organizational boundaries. Reconciling static and dynamic perspectives of ambidexterity, I conceptualize a firm’s strategy as a bundle of decisions about product attributes and components of the production team, proposing a multidimensional and dynamic model of structural ambidexterity that explains why and how firms could manage conflicting pressures for continuity and change in the context of new products. In the second study I note that rigorous systematic evidence documenting the success of ambidextrous organizations is lacking, and that there has been very little investigation of how firms deal with continuity and change in new products. How should the transition from one successful product to the next be managed? What should change and what should be kept?
Incumbents that deal with series of products over time need to update their offerings so as to present the most relevant attributes to prospective clients without disappointing the current customer base. They need to both match and anticipate consumers’ preferences, blending something old with something new to satisfy current demand and enlarge the herd by appealing to newer audiences. This paper contributes to strategic renewal and ambidexterity-related research with the first empirical assessment of a positive consumer response to ambidexterity in new products. This study also provides a practical method to monitor, over time, the degree to which a brand or a firm is continuity- or change-oriented, and to evaluate different strategy profiles across two decision domains that play a pivotal role in new products: product attributes and components of the production team.
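The monitoring idea described above can be illustrated with a minimal sketch. This is an invented toy metric, not the paper's actual measure: for each decision domain, score the fraction of a new product's elements that did not appear in its predecessor, so that 0 means pure continuity and 1 means pure change.

```python
def change_score(prev, curr):
    """Fraction of the current product's elements that are new relative to
    the previous product: 0.0 = pure continuity, 1.0 = pure change."""
    prev, curr = set(prev), set(curr)
    if not curr:
        return 0.0
    return len(curr - prev) / len(curr)

# Two decision domains, with invented example data:
# product attributes and production-team members.
attrs_change = change_score({"V8", "coupe", "manual"}, {"V8", "coupe", "hybrid"})
team_change = change_score({"Rossi", "Bianchi"}, {"Rossi", "Verdi", "Neri"})
```

Tracking such scores release by release would yield the kind of longitudinal continuity/change profile the abstract describes, here in the crudest possible form.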
Abstract:
The aim of this PhD thesis was to study different liquid crystal (LC) systems at a microscopic level, in order to determine their physical properties, resorting to two distinct methodologies: one involving computer simulations, and the other spectroscopic techniques, in particular electron spin resonance (ESR) spectroscopy. By means of the computer simulation approach we tried to demonstrate the effectiveness of this tool for calculating anisotropic static properties of an LC material, as well as for predicting its behaviour and features. This required the development and adoption of suitable molecular models based on convenient intermolecular potentials reflecting the essential molecular features of the investigated system. Concerning the simulation approach, we set up models for discotic liquid crystal dimers and studied, by means of Monte Carlo simulations, their phase behaviour and self-assembling properties with respect to the simple monomer case. Each discotic dimer is described by two oblate Gay-Berne ellipsoids connected by a flexible spacer, modelled by a harmonic "spring" of three different lengths. In particular we investigated the effects of dimerization on the transition temperatures, as well as on the characteristics of the molecular aggregation displayed and the related orientational order. Moving to the experimental results, among the many experimental techniques typically employed to evaluate the distinctive features of LC systems, ESR has proved to be a powerful tool for the microscopic-scale investigation of the properties, structure, order and dynamics of these materials.
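The Monte Carlo sampling behind such simulations rests on the standard Metropolis acceptance criterion. A minimal sketch in reduced units follows, with an invented harmonic spacer term standing in for the full Gay-Berne dimer energy (this is an illustration of the sampling scheme, not the thesis' simulation code):

```python
import math
import random

def harmonic_spacer_energy(r, r0, k=1.0):
    """Harmonic 'spring' linking the two discs of a dimer (reduced units)."""
    return 0.5 * k * (r - r0) ** 2

def metropolis_accept(dE, T, rng=random.random):
    """Standard Metropolis criterion: always accept downhill moves,
    accept uphill moves with probability exp(-dE/T)."""
    return dE <= 0 or rng() < math.exp(-dE / T)

# Trial move on the spacer length of one dimer at reduced temperature T = 1.0
r_old, r_new, r0 = 1.2, 1.5, 1.0
dE = harmonic_spacer_energy(r_new, r0) - harmonic_spacer_energy(r_old, r0)
accepted = metropolis_accept(dE, T=1.0)
```

Repeating such trial moves over positions, orientations, and spacer lengths generates configurations from which transition temperatures and aggregation behaviour can be estimated.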
We have taken advantage of the high sensitivity of the ESR spin probe technique to investigate increasingly complex LC systems, ranging from devices constituted by a polymer matrix in which LC molecules are confined in the shape of nanodroplets, to biaxial liquid crystalline elastomers, and dimers whose monomeric units or lateral groups are constituted by rod-like mesogens (11BCB). Reflection-mode holographic-polymer dispersed liquid crystals (H-PDLCs) are devices in which LCs are confined into nanosized (50-300 nm) droplets, arranged in layers which alternate with polymer layers, forming a diffraction grating. We have determined the configuration of the LC local director and derived a model of the nanodroplet organization inside the layers. Resorting also to additional information on the nanodroplet size and shape distribution provided by SEM images of the H-PDLC cross-section, the observed director configuration has been modeled as a bidimensional distribution of elongated nanodroplets whose long axis is, on average, parallel to the layers and whose internal director configuration is a uniaxial quasi-monodomain aligned along the nanodroplet long axis. The results suggest that the molecular organization is dictated mainly by the confinement, explaining, at least in part, the significantly higher switching voltages and the faster turn-off times observed in H-PDLCs compared to standard PDLC devices. Liquid crystal elastomers consist of cross-linked polymers in which mesogens represent the monomers constituting the main chain or the laterally attached side groups. They bring together three important aspects: orientational order in amorphous soft materials, responsive molecular shape and quenched topological constraints.
In biaxial nematic liquid crystalline elastomers (BLCEs), two orthogonal directions, rather than the single one of normal uniaxial nematics, can be controlled, greatly enhancing their potential value for applications as novel actuators. Two versions of side-chain BLCEs were characterized: side-on and end-on. Many tests were carried out on both types of LCE, the main features detected being the lack of significant dynamical behaviour, together with a strong permanent alignment along the principal director, and the confirmation of the transition temperatures already determined by DSC measurements. The end-on sample demonstrates a less hindered rotation of the side-group mesogenic units and a greater freedom of alignment to the magnetic field, as already shown by previous NMR studies. Biaxial nematic ESR static spectra were also obtained on the basis of Molecular Dynamics generated biaxial configurations, to be compared to the experimentally determined ones, as a means to establish a possible relation between biaxiality and the spectral features. This provides a concrete example of the advantages of combining the computer simulation and spectroscopic approaches. Finally, the dimer α,ω-bis(4'-cyanobiphenyl-4-yl)undecane (11BCB), synthesized in the "quest" for the biaxial nematic phase, has been analysed. Its importance lies in the significance of dimers as building blocks in the development of new materials to be employed in innovative technological applications, such as faster-switching displays, resorting to the easier aligning ability of the secondary director in biaxial phases. A preliminary series of tests was performed, revealing the population of mesogenic molecules as divided into two groups: one of elongated, straightened conformers sharing a common director, and one of bent molecules which display no order, being equally distributed in the three dimensions.
Employing this model, the calculated values show a consistent trend, confirming at the same time the transition temperatures indicated by the DSC measurements, together with rotational diffusion tensor values that follow closely those of the constituting monomer 5CB.
Abstract:
Recently, an ever-increasing degree of automation has been observed in most industrial processes. This increase is motivated by higher requirements for systems with great performance in terms of quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of mechatronics, is merging with other technologies such as informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy boxed products such as food or cigarettes. Another indication of their complexity comes from the fact that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the “packaging valley”. Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in the definition of the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties assigned to it. Apart from the activities inherent to the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and the different operational scenarios; obtaining a high quality of the final product through the verification of the correctness of the processing; guiding the operator assigned to the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, as a support for the maintenance operations of the machine. The facilities that designers can find directly on the market, in terms of software component libraries, in fact provide adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, by focussing on the cross-cutting functionalities characterizing the automation domain, may help designers in modelling and structuring their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological/implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have long been adopted, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very “unstructured”. No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different “dynamical framework” of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies.
Industrial automation has lately been adopting this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During the last years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As for logic control design, Model-Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems such as AMSs, together with reliable mechanical elements an increasing number of electronic devices are also present, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and of reconfiguring the control system so as to guarantee satisfactory performance.
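The diagnosis principle just described, processing plant inputs and outputs to detect incipient faults, can be sketched with a toy residual generator. The first-order model and the data below are invented for illustration only:

```python
# Residual-based fault detection sketch: compare the measured output of a
# plant with a nominal model prediction; a residual above a threshold flags
# a fault.  Model coefficients and signals are invented, not from the thesis.
a, b = 0.9, 0.5  # assumed nominal model: y[k+1] = a*y[k] + b*u[k]

def residuals(u, y):
    """Absolute one-step prediction errors of the nominal model."""
    return [abs(y[k + 1] - (a * y[k] + b * u[k])) for k in range(len(y) - 1)]

u = [1.0, 1.0, 1.0]
y_nominal = [0.0, 0.5, 0.95, 1.355]   # follows the model exactly
y_faulty = [0.0, 0.5, 1.40, 1.355]    # actuator fault injected at step 2

threshold = 0.1
fault_detected = any(r >= threshold for r in residuals(u, y_faulty))
```

The elaboration unit of the abstract would additionally isolate which device failed and trigger a reconfiguration; this sketch only covers detection.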
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, important improvements to formal verification of logic control, fault diagnosis and fault tolerant control derive from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey on the state of the software engineering paradigms applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader in understanding some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
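The Discrete Event Systems flavour of fault diagnosis mentioned above can be illustrated with a toy diagnoser: from observable events alone, it decides whether an unobservable fault event must have occurred. The small plant automaton below is invented for illustration and is not the thesis' testbed model:

```python
# Toy DES diagnoser.  States are tracked as (state, fault_flag) estimates;
# "f" is the unobservable fault event, all other events are observable.
TRANSITIONS = {
    ("idle", "start"): "working",
    ("working", "f"): "faulty",     # unobservable fault
    ("working", "done"): "idle",
    ("faulty", "alarm"): "faulty",  # only a faulty plant can emit "alarm"
}

def closure(estimates):
    """Extend estimates with states reachable via the unobservable fault."""
    est = set(estimates)
    frontier = list(est)
    while frontier:
        state, _ = frontier.pop()
        nxt = TRANSITIONS.get((state, "f"))
        if nxt is not None and (nxt, True) not in est:
            est.add((nxt, True))
            frontier.append((nxt, True))
    return est

def diagnose(observed):
    """'F' = fault certain, 'N' = no fault possible, 'U' = undecided."""
    est = closure({("idle", False)})
    for event in observed:
        est = closure({
            (TRANSITIONS[(s, event)], flag)
            for s, flag in est if (s, event) in TRANSITIONS
        })
    flags = {flag for _, flag in est}
    return "F" if flags == {True} else "N" if flags == {False} else "U"
```

For instance, `diagnose(["start", "alarm"])` yields "F", because every plant run consistent with an alarm passes through the fault, while `diagnose(["start"])` yields "U", since both faulty and healthy runs explain the observation.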
Structure and dynamics of supramolecular assemblies studied by advanced solid-state NMR spectroscopy
Abstract:
The aim of the present work is the elucidation of the structure and dynamics of complex supramolecular systems by means of solid-state NMR spectroscopy. The investigation of pi-pi interactions, which have a decisive influence on the structural and dynamic properties of supramolecular systems, helps to better understand the self-organization processes of these complex materials. With dipolar 1H-1H and 1H-13C recoupling NMR methods under fast MAS, both 1H chemical shifts and dipolar 1H-1H and 1H-13C couplings can be investigated without the need for isotopic labelling. In this way, detailed information about the structure and mobility of individual molecular segments is obtained. In combination with so-called nucleus independent chemical shift (NICS) maps (calculated with ab-initio methods), the distances of protons relative to pi-electron systems can be determined and structural proposals derived. With the help of homo- and heteronuclear dipolar rotational sideband patterns, order parameters for different molecular segments can furthermore be determined. The information thus obtained about the structural and dynamic properties of supramolecular systems contributes to identifying structure-determining molecular units and principal ordering phenomena, as well as to quantifying local interactions, in order to better understand the process of self-organization.
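The quantity such sideband analyses ultimately yield is the second-rank orientational order parameter S = (3<cos^2 theta> - 1)/2. It can be sketched numerically; the angle sample below is synthetic, not experimental data:

```python
import math

# Second-rank order parameter from a (synthetic) set of tilt angles of
# molecular segment axes relative to the director, in radians.
angles = [0.1, 0.2, 0.15, 0.3]
c2 = sum(math.cos(t) ** 2 for t in angles) / len(angles)
S = (3 * c2 - 1) / 2
# S close to 1 -> highly ordered segment; S = 0 -> isotropic distribution
```

In practice the average is not computed from explicit angles but extracted from the dipolar sideband pattern; this sketch only fixes the definition of S.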
Abstract:
In this work, multi-step resonance ionization was employed for spectroscopy of gadolinium and samarium, and further developed on gadolinium for analytical investigations. The range of application of RIMS with continuous-wave and pulsed lasers on complex atoms was thereby considerably extended. Samarium and gadolinium belong to the lanthanides; owing to their complicated electron configurations, they exhibit an interesting atomic spectrum. In samarium, the first of up to three resonant transitions was investigated with respect to isotope shift and hyperfine structure splitting; just below the first ionization limit, Rydberg series as unperturbed as possible were sought, and from the convergence of these series the ionization potential for 154Sm was determined isotope-selectively to be IP = 45519.30793(43) cm-1. Samarium and gadolinium possess a complex continuum structure, characterized by narrow and strong autoionizing resonances. Data from earlier investigations of the gadolinium continuum structure were systematically evaluated in this work and supplemented by own measurements. For the theoretical description of the line profiles of interfering autoionizing states, besides Fano profiles an approach from nuclear physics, the K-matrix formalism, was also drawn upon, and a corresponding simulation program was employed. Application to selected spectral regions in samarium and gadolinium shows good reproduction of the line shapes. Within this work, the applicability of pulsed lasers for trace analysis was furthermore investigated, and the attainability of the specifications required for medical applications was demonstrated.
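The extraction of an ionization potential from the convergence of a Rydberg series, E_n = IP - R/(n - delta)^2, can be sketched with a toy least-squares fit. The series below is synthetic, with an assumed quantum defect; only the quoted 154Sm value comes from the thesis:

```python
import numpy as np

RYD = 109737.0     # Rydberg constant in cm^-1 (approximate)
IP_TRUE = 45519.3  # cm^-1, used here only to generate the synthetic series
DELTA = 2.7        # assumed quantum defect (invented)

n = np.arange(30, 60)
energies = IP_TRUE - RYD / (n - DELTA) ** 2  # synthetic Rydberg levels

# With delta fixed, E_n is linear in x = 1/(n-delta)^2:
# intercept = IP, slope = -R.  Solve by linear least squares.
x = 1.0 / (n - DELTA) ** 2
A = np.vstack([x, np.ones_like(x)]).T
slope, ip_fit = np.linalg.lstsq(A, energies, rcond=None)[0]
```

In real data the quantum defect is itself fitted and perturbed series members must be excluded, which is why the thesis stresses the search for unperturbed series.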
Abstract:
Two of the main features of today's complex software systems, such as pervasive computing systems and Internet-based applications, are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control: systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution: entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution: interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic since they allow components to be updated, added, or removed while the system is running. The engineering of open and distributed computational systems mandates the adoption of a software infrastructure whose underlying model and technology can provide the required level of uncoupling among system components. This is the main motivation behind current research trends in the area of coordination middleware to exploit tuple-based coordination models in the engineering of complex software systems, since they intrinsically provide coordinated components with communication uncoupling. An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely scenarios where most of the activities are based on knowledge in some form, and where knowledge becomes the prominent means by which systems get coordinated.
Handling knowledge in tuple-based systems induces problems in terms of syntax (e.g., two tuples containing the same data may not match due to differences in the tuple structure) and, mostly, of semantics (e.g., two tuples representing the same information may not match because a different syntax is adopted). Until now, the problem has been faced by exploiting tuple-based coordination within a middleware for knowledge-intensive environments, e.g., experiments with tuple-based coordination within a Semantic Web middleware; analogous approaches have been surveyed in the literature. However, such approaches appear to be designed to tackle the design of coordination for specific application contexts, like the Semantic Web and Semantic Web Services, and they result in rather involved extensions of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, the model and technology of semantic tuple centres were developed. The tuple centre model is adopted as the main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space, where the behaviour of the tuple space can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. Then, the tuple centre model was semantically enriched: a main design choice in this work was not to completely redesign the existing syntactic tuple space model, but rather to provide a smooth extension that, although supporting semantic reasoning, keeps tuples and tuple matching as simple as possible. By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components.
The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model on top of an existing coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem suitable as coordination media.
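The syntactic-versus-semantic matching problem, and the kind of smooth extension the thesis argues for, can be illustrated with a toy Linda-style sketch. The ontology dictionary and tuple formats are invented, and the actual semantic tuple centre model is far richer than this:

```python
# Toy Linda-style matching: two tuples carrying the same information fail
# to match syntactically but match once terms are normalised through a
# shared (invented) ontology mapping terms to canonical concepts.
ONTOLOGY = {"automobile": "car", "car": "car"}

def syntactic_match(template, tup):
    """Linda-style matching: None acts as a formal (wildcard) field."""
    return len(template) == len(tup) and all(
        t is None or t == v for t, v in zip(template, tup))

def semantic_match(template, tup):
    """Same matching, but over ontology-normalised terms."""
    norm = lambda x: ONTOLOGY.get(x, x)
    return len(template) == len(tup) and all(
        t is None or norm(t) == norm(v) for t, v in zip(template, tup))

space = [("automobile", "red")]
template = ("car", None)

found_syntactic = any(syntactic_match(template, t) for t in space)  # False
found_semantic = any(semantic_match(template, t) for t in space)    # True
```

Note how the semantic layer is a drop-in refinement of the syntactic matcher, echoing the thesis' design choice of extending rather than replacing the tuple space model.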
Abstract:
The research activity of this doctoral thesis concerned complex tribological systems of industrial interest, for which the dominant wear mechanisms were identified by means of failure analysis. For each of them, improved solutions were studied on the basis of laboratory tribological tests. Traditional quenched and tempered steels are widely used in the production of track links for earth-moving machines. The possibility of using the new medium-carbon microalloyed steels would allow a considerable simplification of the production cycle and benefits in terms of cost. Part of the thesis concerned the study of the tribological behaviour of these steels. The tribological study of hydraulic motors was also addressed, with the objective of improving their wear resistance and hence their service life. Bench tests were carried out to evaluate the main wear mechanisms, along with laboratory tests designed to reproduce the actual operating conditions, evaluating surface modification techniques capable of reducing component wear. Different types of Thermal Spray coatings were analysed in terms of deposition method (AFS-APS) and deposited metal alloys (Ni, Mo, Cu/Al). Finally, tribological contacts in the packaging sector were characterized, where the use of austenitic stainless steels is in some cases mandatory. AISI 316L stainless steel is widely used in applications requiring high corrosion resistance; however, its low wear resistance limits its use in the tribological field. In this context, a tribological problem concerning automatic machines for the dosing of pharmaceutical powders was analysed.
Alternative solutions were studied, involving both the complete replacement of the materials of the tribological pair and the identification of innovative surface modification techniques, such as low-temperature carburizing, also followed by the deposition of a hydrogenated amorphous carbon (a-C:H) coating.
Abstract:
The work carried out in this thesis falls within the area of aerial robotics and computer vision, through the integration of vision algorithms for the control of an unmanned aerial vehicle. This work intends to contribute to the European project SHERPA (Smart collaboration between Humans and ground-aErial Robots for imProving rescuing activities in Alpine environments), coordinated by the University of Bologna with the participation of the universities of Bremen, Zurich, Twente, Leuven and Linköping, of CREATE (Consorzio di Ricerca per l’Energia e le Applicazioni Tecnologiche dell’Elettromagnetismo), of several small and medium-sized enterprises and of the Italian Alpine Club, and which consists in building a team of heterogeneous robots able to collaborate with humans to rescue people lost in the Alpine environment. The objective pursued within SHERPA is to design and integrate the autopilot within the team. In this context, problems of great complexity must be handled, such as controlling the stability of the vehicle in the presence of uncertainties due to wind, detecting obstacles along the flight trajectory, managing flight in proximity to obstacles, and so on. Moreover, all these operations must be performed in real time. The thesis was carried out at CASY (Center for Research on Complex Automated Systems) of the University of Bologna, using a PX4FLOW Smart Camera for the experimental tests. Initially an autopilot, the PIXHAWK, to which the PX4FLOW can be interfaced, was studied; then some vision algorithms based on optical flow were studied and simulated in MATLAB. Finally, the PX4FLOW Smart Camera itself was studied and used for the experimental tests. The PX4FLOW is used as an interface to the PIXHAWK, so as to control the vehicle with maximum efficiency.
It consists of a camera for capturing the scene, a gyroscope for measuring the angular velocity, and a sonar for distance measurements. It is able to provide the translational velocity of the vehicle which, once integrated, allows the trajectory flown by the vehicle to be reconstructed.
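The pipeline from optical flow to a reconstructed path can be sketched as follows. Gyro compensation is omitted and the sample data is invented, so this only illustrates the chain flow rate x height -> velocity -> integration:

```python
# Toy PX4FLOW-style pipeline: an optical flow rate (rad/s) scaled by the
# sonar height (m) gives ground speed, which forward-Euler integration
# turns into a position trace.  All numbers below are invented samples.
dt = 0.1                          # sample period [s]
flow_x = [0.02, 0.02, 0.01, 0.0]  # optical flow rate about one axis [rad/s]
height = [1.5, 1.5, 1.6, 1.6]     # sonar distance to ground [m]

vel = [f * h for f, h in zip(flow_x, height)]  # ground speed [m/s]
pos = [0.0]
for v in vel:                                  # forward-Euler integration
    pos.append(pos[-1] + v * dt)
```

A real implementation would additionally subtract the gyroscope-measured rotation rate from the raw flow before scaling, since camera rotation also produces apparent flow.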
Abstract:
This thesis describes the investigation of systematically varied organic molecules for use in molecular self-assembly processes. All experiments were performed using high-resolution non-contact atomic force microscopy under UHV conditions and at room temperature. Using this technique, three different approaches for influencing intermolecular and molecule-surface interaction on the insulating calcite(10.4) surface were investigated by imaging the structure formation at the molecular scale. I first demonstrated the functionalization of shape-persistent oligo(p-benzamide)s that was engineered by introducing different functional groups and investigating their effect on the structural formation on the sample surface. The molecular core was designed to provide significant electrostatic anchoring towards the surface, while at the same time maintaining the flexibility to fine-tune the resulting structure by adjusting the intermolecular cohesion energy. The success of this strategy is based on a clear separation of the molecule-substrate interaction from the molecule-molecule interaction. My results show that sufficient molecule-surface anchoring can be achieved without restricting the structural flexibility that is needed for the design of complex molecular systems. Three derivatives of terephthalic acid (TPA) were investigated in chapter 7. Here, the focus was on changing the adhesion to the calcite surface by introducing different anchor functionalities to the TPA backbone. For all observed molecules, the strong substrate templating effect results in molecular structures that are strictly oriented along the calcite main crystal directions. This templating is especially pronounced in the case of 2-ATPA where chain formation on the calcite surface is observed in contrast to the formation of molecular layers in the bulk. At the same time, the amino group of 2-ATPA proved an efficient anchor functionality, successfully stabilizing the molecular chains on the sample surface. 
These findings emphasize, once again, the importance of balancing and fine-tuning molecule-molecule and molecule-surface interactions in order to achieve stable, yet structurally flexible, molecular arrangements on the sample surface. In the last chapter, I showed how the intrinsic property of molecular chirality decisively influences structure formation in molecular self-assembly. This effect is especially pronounced in the case of the chiral heptahelicene-2-carboxylic acid. Deposition of the enantiopure molecules results in the formation of homochiral islands on the sample surface, in sharp contrast to the formation of uni-directional double rows upon deposition of the racemate onto the same surface. While it remained uncertain from previous experiments whether the double rows are composed of hetero- or homochiral molecules, I could clearly answer that question here and demonstrate that the rows are of heterochiral origin. Chirality thus proves to be another important parameter to steer intermolecular interaction on surfaces. Altogether, the results of this thesis demonstrate that, in order to successfully control structure formation in molecular self-assembly, the correct combination of molecule and surface properties is crucial. This is of special importance when working on substrates that exert a strong influence on structure formation, such as the calcite(10.4) surface. Through the systematic variation of functional groups, several important parameters that influence the balance between molecule-surface and molecule-molecule interactions were identified here, and the results of this thesis can thus act as a guideline for the rational design of molecules for use in molecular self-assembly.
Resumo:
The ever-growing number of applications involving sensor networks, cooperating robots, and vehicle formations has made the coordination of multi-agent systems (MAS) one of the most studied problems in control theory. Numerous approaches exist to tackle the problem, often profoundly different from one another. The strategy studied in this thesis is based on Consensus Theory, which is distributed and completely leader-less in nature; moreover, the information exchanged between agents is kept to a minimum. The first three chapters introduce and analyze the interaction laws (Consensus Protocols) that allow a network of dynamical systems to be coordinated. Chapter 4 applies the theory to the problem of circular loitering of multiple flying robots around a moving target. To this end, a Matlab/Simulink simulation was developed that generates reference trajectories with configurable radius and center, starting from any initial position of the agents. This simulation was used at the "Center for Research on Complex Automated Systems" (CASY-DEI, Università di Bologna) to implement loitering with a network of "CrazyFlie" quadrotors. The results and the laboratory setup are reported in chapter 5. Future work will focus on local algorithms that allow the agents to avoid collisions during transients: the collision-avoidance control must be completely independent of the consensus control, so as not to distort the Consensus protocol itself.
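The leader-less coordination idea can be illustrated with a minimal sketch (this is not the thesis's Matlab/Simulink code; the ring topology, step size, and initial states are assumptions for illustration): each agent repeatedly nudges its own state toward its neighbors' states, and on a connected undirected graph all states converge to the average of the initial conditions.

```python
import numpy as np

# Minimal discrete-time consensus protocol (illustrative sketch):
# x(k+1) = x(k) - eps * L * x(k), where L is the graph Laplacian.
# Each agent uses only its neighbors' states: fully distributed, leader-less.

def consensus_step(x, A, eps):
    """One synchronous consensus update over adjacency matrix A."""
    L = np.diag(A.sum(axis=1)) - A   # graph Laplacian of the network
    return x - eps * L @ x

# Assumed topology: undirected ring of 4 agents.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

x = np.array([0.0, 2.0, 4.0, 6.0])   # arbitrary initial agent states
for _ in range(200):
    x = consensus_step(x, A, eps=0.2)

# For this connected undirected graph, the states converge to the
# average of the initial states (3.0).
print(np.round(x, 3))
```

The step size `eps` must be small enough relative to the largest Laplacian eigenvalue for the update to be stable; here the ring's eigenvalues are at most 4, so `eps = 0.2` is safely inside the stability region.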
Resumo:
Fine powders commonly have poor flowability and dispersibility due to interparticle adhesion, which leads to the formation of agglomerates. Knowledge of adhesion in particle collectives is indispensable for a deeper fundamental understanding of particle behavior in powders. In the pharmaceutical industry in particular, control of adhesion forces in powders is mandatory to improve the performance of inhalation products; typically, inhalable particles are in the size range of 1 - 5 µm. In this thesis, a new method was developed to measure the adhesion forces of particles as an alternative to the established colloidal probe and centrifuge techniques, which are both experimentally demanding, time consuming, and of limited practical applicability. The new method is based on the detachment of individual particles from a surface due to their inertia. The required acceleration, on the order of 500 000 g, is provided by a Hopkinson bar shock excitation system and measured via laser vibrometry. Particle detachment events are detected on-line by optical video microscopy, and subsequent automated data evaluation yields a statistical distribution of particle adhesion forces. To validate the new method, adhesion forces for ensembles of single polystyrene and silica microspheres on a polystyrene-coated steel surface were measured under ambient conditions. It was possible to investigate more than 150 individual particles in one experiment and to obtain adhesion values for particles in a diameter range of 3 - 13 µm. This enables a statistical evaluation, while measuring effort and time are considerably lower than for the established techniques. Measured adhesion forces of smaller particles agreed well with values from colloidal probe measurements and theoretical predictions. For the larger particles, however, a stronger increase of adhesion with diameter was observed; this discrepancy might be induced by surface roughness and heterogeneity, which influence small and large particles differently.
By measuring adhesion forces of corrugated dextran particles with sizes down to 2 µm it was demonstrated that the Hopkinson bar method can be used to characterize more complex sample systems as well. Thus, the new device will be applicable to study a broad variety of different particle-surface combinations on a routine basis, including strongly cohesive powders like pharmaceutical drugs for inhalation.
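As a rough consistency check of the inertia-based detachment principle (not a calculation from the thesis; the particle density and diameter below are illustrative assumptions for a polystyrene sphere), the detachment force available at 500 000 g follows directly from F = m * a:

```python
import math

# Back-of-the-envelope estimate of the inertial detachment force F = m * a
# for a spherical particle. Density and diameter are assumed values.

g = 9.81                       # m/s^2, standard gravity
a = 5e5 * g                    # acceleration provided by the Hopkinson bar
rho = 1050.0                   # kg/m^3, polystyrene (assumed)
d = 5e-6                       # m, particle diameter in the inhalable range

m = rho * math.pi * d**3 / 6   # mass of a sphere: rho * (pi/6) * d^3
F = m * a                      # inertial force acting on the particle

print(f"mass = {m:.3e} kg, detachment force = {F * 1e9:.1f} nN")
```

The resulting force is on the order of a few hundred nanonewtons, which is the right magnitude to compete with typical microparticle adhesion forces; since F scales with d^3 while adhesion models predict a roughly linear scaling with d, smaller particles require correspondingly higher accelerations.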
Resumo:
The biggest challenge facing software developers today is how to gracefully evolve complex software systems in the face of changing requirements. We clearly need software systems to be more dynamic, compositional and model-centric, but instead we continue to build systems that are static, baroque and inflexible. How can we better build change-enabled systems in the future? To answer this question, we propose to look back to one of the most successful systems to support change, namely Smalltalk. We briefly introduce Smalltalk with a few simple examples, and draw some lessons for software evolution. Smalltalk's simplicity, its reflective design, and its highly dynamic nature all go a long way towards enabling change in Smalltalk applications. We then illustrate how these lessons work in practice by reviewing a number of research projects that support software evolution by exploiting Smalltalk's design. We conclude by summarizing open issues and challenges for change-enabled systems of the future.
Resumo:
Java Enterprise Applications (JEAs) are complex software systems written using multiple technologies. Moreover, they are usually distributed systems and use a database to deal with persistence. A particular problem that appears in the design of these systems is the lack of a rich business model. In this paper we propose a technique to support the recovery of such rich business objects starting from anemic Data Transfer Objects (DTOs). By exposing the code duplication among the application's elements that use the DTOs, we suggest which business logic can be moved from the other classes into the DTOs.
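The duplication-driven recovery idea might be sketched as follows (a hypothetical Python analogue, not the paper's Java tooling; all class and method names are invented): methods with identical compiled bodies that appear in several classes operating on the same DTO are flagged as candidates to be moved into the DTO itself.

```python
import inspect

class CustomerDTO:                       # anemic DTO: data, no behavior
    def __init__(self, net, tax_rate):
        self.net, self.tax_rate = net, tax_rate

# Two services duplicating the same business logic on the DTO's fields.
class InvoiceService:
    def gross(self, dto):
        return dto.net * (1 + dto.tax_rate)

class ReportService:
    def gross(self, dto):
        return dto.net * (1 + dto.tax_rate)

def duplicated_methods(*classes):
    """Group methods by their compiled body; a body shared by several
    classes suggests business logic that belongs on the shared DTO."""
    seen = {}
    for cls in classes:
        for name, fn in inspect.getmembers(cls, inspect.isfunction):
            code = fn.__code__
            key = (code.co_code, code.co_consts, code.co_names)
            seen.setdefault(key, []).append(f"{cls.__name__}.{name}")
    return [hits for hits in seen.values() if len(hits) > 1]

print(duplicated_methods(InvoiceService, ReportService))
```

In this toy case the tool reports `InvoiceService.gross` and `ReportService.gross` as a duplicated pair, so `gross` would be proposed as a method of `CustomerDTO`, turning the anemic DTO into a rich business object.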
Resumo:
Model-based calibration has gained popularity in recent years as a method to optimize increasingly complex engine systems. However, virtually all model-based techniques are applied to steady-state calibration; transient calibration is, by and large, an emerging technology. An important piece of any transient calibration process is the ability to constrain the optimizer to treat the problem as a dynamic one and not as a quasi-static process. The optimized air-handling parameters corresponding to any instant of time must be achievable in a transient sense; this in turn depends on the trajectory of the same parameters over previous time instants. In this work, dynamic constraint models are proposed to translate commanded into actually achieved air-handling parameters. These models enable the optimization to be realistic in a transient sense. The air-handling system has been treated as a linear second-order system with PD control, whose parameters were extracted from real transient data. This model has been shown to be the best choice among a list of appropriate candidates, such as neural networks and first-order models. The selected second-order model was used in conjunction with transient emission models to predict emissions over the FTP cycle. It has been shown that emission predictions based on air-handling parameters predicted by the dynamic constraint model do not differ significantly from corresponding predictions based on measured air-handling parameters.
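A minimal sketch of such a dynamic constraint model (illustrative only; the natural frequency and damping ratio below are assumed values, not parameters identified from engine data): the achieved air-handling parameter y follows the commanded value u through a damped linear second-order response, y'' + 2*zeta*wn*y' + wn^2*y = wn^2*u, so a step in the command is reached only after a physically plausible lag.

```python
import numpy as np

# Propagate a commanded air-handling trajectory through an assumed
# second-order constraint model to obtain the achievable trajectory.

def achieved(u, wn=4.0, zeta=0.8, dt=0.01):
    """Integrate y'' = wn^2*(u - y) - 2*zeta*wn*y' with semi-implicit Euler."""
    y, v = u[0], 0.0            # start at the initial command, at rest
    out = []
    for uk in u:
        a = wn**2 * (uk - y) - 2 * zeta * wn * v
        v += a * dt             # update velocity first (semi-implicit Euler)
        y += v * dt             # then position, for better stability
        out.append(y)
    return np.array(out)

# Hypothetical step in commanded boost pressure from 1.0 to 1.5 bar at t = 1 s.
t = np.arange(0, 4, 0.01)
u = np.where(t < 1.0, 1.0, 1.5)
y = achieved(u)

# The achieved value lags the command and settles near 1.5 bar.
print(round(float(y[-1]), 3))
```

An optimizer constrained by this model can only request set-point trajectories the air-handling hardware could actually follow, which is exactly the role the dynamic constraint models play in the transient calibration loop.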
Resumo:
In an effort to understand some of the ways that accountability-based reform efforts have influenced teacher education, this article details the politics of accountability in Pennsylvania that motivated sweeping changes in the policies governing teacher preparation in 2006. This case study provides a poignant example of the kind of complex accountability systems now being constructed across the United States in an effort to change teacher preparation. By analyzing primary documents including the legal statutes governing teacher preparation in Pennsylvania, correspondence from the Pennsylvania Department of Education, related newsletters, memos, reports, transcripts of meetings, and testimony before the Pennsylvania House of Representatives, the complex nature of the conflicts underlying the development and implementation of teacher education reform is brought into focus. The study's findings suggest that a deep and uncritical acceptance of accountability-based teacher education reform on the part of educational policy makers is likely to do more harm than good. The article concludes by outlining a framework for developing more intelligent measures of accountability that might preserve professional autonomy and judgment.