21 results for "Reasonable Length of Process"
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This thesis analyses problems related to the applicability, in business environments, of Process Mining tools and techniques. The first contribution is a presentation of the state of the art of Process Mining and a characterization of companies in terms of their "process awareness". The work continues by identifying the circumstances in which problems can emerge: data preparation, actual mining, and results interpretation. Other problems are the configuration of parameters by non-expert users and computational complexity. We concentrate on two possible scenarios: "batch" and "on-line" Process Mining. Concerning batch Process Mining, we first investigated the data preparation problem and proposed a solution for the identification of the "case-ids" whenever this field is not explicitly indicated. After that, we concentrated on problems at mining time and proposed a generalization of a well-known control-flow discovery algorithm that exploits non-instantaneous events. The usage of interval-based recording leads to an important improvement in performance. Later on, we report our work on parameter configuration for non-expert users. We present two approaches to select the "best" parameter configuration: one is completely autonomous; the other requires human interaction to navigate a hierarchy of candidate models. Concerning data interpretation and results evaluation, we propose two metrics: a model-to-model metric and a model-to-log metric. Finally, we present an automatic approach for extending a control-flow model with social information, in order to simplify the analysis of these perspectives. The second part of this thesis deals with control-flow discovery algorithms in on-line settings. We propose a formal definition of the problem and two baseline approaches. Two actual mining algorithms are proposed: the first is the adaptation, to the control-flow discovery problem, of a frequency counting algorithm; the second constitutes a framework of models which can be used for different kinds of streams (stationary versus evolving).
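[Editorial illustration] To make the streaming idea concrete, the sketch below shows one plausible adaptation of a frequency counting algorithm to on-line control-flow discovery. The abstract does not name the algorithm; the choice of Lossy Counting, the (case_id, activity) event format and all identifiers are assumptions made here for illustration. The counter approximates the directly-follows relation of an event stream in bounded memory.

import math

class StreamingDirectlyFollows:
    def __init__(self, epsilon=0.01):
        self.bucket_width = math.ceil(1 / epsilon)   # Lossy Counting bucket size
        self.n = 0                                   # events seen so far
        self.counts = {}                             # (a, b) -> [count, max_error]
        self.last_activity = {}                      # case id -> last activity seen

    def observe(self, case_id, activity):
        self.n += 1
        current_bucket = math.ceil(self.n / self.bucket_width)
        prev = self.last_activity.get(case_id)
        if prev is not None:
            pair = (prev, activity)                  # directly-follows pair
            if pair in self.counts:
                self.counts[pair][0] += 1
            else:
                # the pair may have been evicted before: track the error bound
                self.counts[pair] = [1, current_bucket - 1]
        self.last_activity[case_id] = activity
        if self.n % self.bucket_width == 0:
            # eviction step of Lossy Counting: drop low-frequency pairs
            self.counts = {p: ce for p, ce in self.counts.items()
                           if ce[0] + ce[1] > current_bucket}

    def frequent_pairs(self, min_support):
        # pairs whose true frequency is at least (min_support - epsilon) * n
        threshold = (min_support - 1 / self.bucket_width) * self.n
        return [p for p, (c, _) in self.counts.items() if c >= threshold]

Feeding (case_id, activity) events into observe() maintains an approximate directly-follows graph in bounded memory; a Heuristics-Miner-style dependency measure could then be computed over frequent_pairs().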
Abstract:
Society's increasing aversion to technological risks requires the development of inherently safer and environmentally friendlier processes, besides assuring the economic competitiveness of industrial activities. The different forms of impact (e.g. environmental, economic and societal) are frequently characterized by conflicting reduction strategies and must be taken into account holistically in order to identify the optimal solutions in process design. Though the literature reports an extensive discussion of strategies and specific principles, quantitative assessment tools are required to identify the marginal improvements of alternative design options, to allow trade-offs among contradictory aspects and to prevent "risk shift". In the present work a set of integrated quantitative tools for design assessment (i.e. a design support system) was developed. The tools were specifically dedicated to the implementation of sustainability and inherent safety in process and plant design activities, with respect to chemical and industrial processes in which substances dangerous to humans and the environment are used or stored. The tools were mainly devoted to application in the "conceptual" and "basic design" stages, when the project is still open to changes (due to the large number of degrees of freedom) which may include strategies to improve sustainability and inherent safety. The set of developed tools covers different phases of the design activities, all through the lifecycle of a project (inventories, process flow diagrams, preliminary plant layout plans). The development of such tools gives a substantial contribution to filling the present gap in the availability of sound supports for implementing safety and sustainability in the early phases of process design. The proposed decision support system was based on the development of a set of leading key performance indicators (KPIs), which ensure the assessment of the economic, societal and environmental impacts of a process (i.e. its sustainability profile). The KPIs were based on impact models (some of them complex), but are easy and swift to apply in practice. They can be fully evaluated even from the limited data available during early process design. Innovative reference criteria were developed to compare and aggregate the KPIs on the basis of the actual site-specific impact burden and the sustainability policy. Particular attention was devoted to the development of reliable criteria and tools for the assessment of inherent safety in different stages of the project lifecycle. The assessment follows an innovative approach to the analysis of inherent safety, based both on the calculation of the expected consequences of potential accidents and on the evaluation of the hazards related to equipment. The methodology overcomes several problems present in previous methods proposed for quantitative inherent safety assessment (use of arbitrary indexes, subjective judgement, built-in assumptions, etc.). A specific procedure was defined for the assessment of the hazards related to the formation of undesired substances in chemical systems undergoing "out of control" conditions. In the assessment of layout plans, "ad hoc" tools were developed to account for the hazard of domino escalation and for safety economics.
The effectiveness and value of the tools were demonstrated by application to a large number of case studies concerning different kinds of design activities (choice of materials; design of the process, the plant and the layout) and different types of processes/plants (chemical industry, storage facilities, waste disposal). An experimental survey (analysis of the thermal stability of isomers of nitrobenzaldehyde) provided the input data necessary to demonstrate the method for the inherent safety assessment of materials.
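[Editorial illustration] The sketch below shows one plausible way to normalize a set of KPIs against site-specific reference values and aggregate them into a single index, as a decision support system of this kind might do. The structure, names, reference values and weights are hypothetical, not the thesis's actual criteria.

def sustainability_profile(kpis, references, weights):
    """kpis, references: dict name -> value; weights: dict name -> weight.
    Each KPI is normalized by its site-specific reference value, then the
    normalized scores are combined into one weighted aggregate index."""
    normalized = {k: v / references[k] for k, v in kpis.items()}
    total_weight = sum(weights.values())
    index = sum(weights[k] * normalized[k] for k in kpis) / total_weight
    return normalized, index

# Hypothetical example: one design option scored on three impact areas
option_a = {"economic": 1.2e6, "societal": 3.4e-6, "environmental": 850.0}
refs     = {"economic": 1.0e6, "societal": 1.0e-5, "environmental": 1000.0}
wts      = {"economic": 0.4,   "societal": 0.3,    "environmental": 0.3}
print(sustainability_profile(option_a, refs, wts))

Normalizing before aggregation is what allows impacts with incompatible units (costs, risk measures, emissions) to be traded off on a common scale.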
Abstract:
INTRODUCTION: The orthotopic left lung transplantation model in rats has been developed to answer a variety of scientific questions in transplant immunology and in the related fields of respiratory diseases. However, its widespread use has been hampered by the complexity of the procedure. AIM OF THE RESEARCH: Our purpose is to provide a detailed description of this technique, including the complications and difficulties from the very first microsurgical step until the ultimate successful completion of the transplant procedure. MATERIALS AND METHODS: The transplant procedures were performed by two collaborating transplant surgeons with microsurgical and thoracic surgery skills. A total of 150 left lung transplants in rats were performed. Twenty-seven syngeneic (Lewis to Lewis) and 123 allogeneic (Brown-Norway to Lewis) lung transplants were performed using the cuff technique. RESULTS: In the first 50 transplant procedures, the post-transplant survival rate was 74%, of which 54% reached the end-point of 3 or 7 days post-transplant; the overall complication rate was 66%. In the subsequent 50 transplant surgeries (from 51 to 100) the post-transplant survival rate increased to 88%, of which 56% reached the end-point; the overall complication rate was 32%. In the final 50 transplants (from 101 to 150) the post-transplant survival rate was confirmed at 88%, of which 74% reached the end-point; the overall complication rate was again 32%. CONCLUSIONS: One hundred and fifty transplants can represent a reasonable number of procedures to obtain a satisfactory surgical outcome. A training period with simpler animal models is mandatory to develop the anesthesiological and microsurgical skills required to successfully establish this model. The collaboration of at least two microsurgeons is mandatory to perform all the simultaneous procedures required to complete the transplant surgery.
Abstract:
A very recent and exciting new area of research is the application of Concurrency Theory tools to formalize and analyze biological systems, and one of the most promising approaches comes from process algebras (process calculi). A process calculus is a formal language that makes it possible to describe concurrent systems and comes with well-established techniques for quantitative and qualitative analysis. Biological systems can be regarded as concurrent systems and therefore modeled by means of process calculi. In this thesis we focus on the process calculi approach to the modeling of biological systems and investigate, mostly from a theoretical point of view, several promising bio-inspired formalisms: Brane Calculi and the k-calculus family. We provide several expressiveness results, mostly by means of comparisons between calculi. We provide a lower bound on the computational power of the non-Turing-complete MDB Brane Calculi by showing an encoding of a simple P-System into MDB. We address the issue of local implementation within the k-calculus family: whether n-way rewrites can be simulated by binary interactions only. A solution introducing divergence is provided, and we prove that a deterministic solution preserving the termination property is not possible. We use the symmetric leader election problem to test synchronization capabilities within the k-calculus family. Several fragments of the original k-calculus are considered, and we prove an impossibility result about encoding n-way synchronization into (n-1)-way synchronization. A similar impossibility result is obtained in a pure computer science context. We introduce CCSn, an extension of CCS with multiple input prefixes, and show, using the dining philosophers problem, that there is no reasonable encoding of CCS(n+1) into CCSn.
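[Editorial illustration] One plausible way to formalize the multi-input prefix that separates CCS(n+1) from CCSn is an inference rule in which a single prefix synchronizes atomically with n output-offering processes; this rendering is ours and is not necessarily the thesis's exact semantics:

\[
\frac{Q_i \xrightarrow{\;\overline{a_i}\;} Q_i' \quad (1 \le i \le n)}
     {(a_1 \cdots a_n).P \;\mid\; Q_1 \mid \cdots \mid Q_n \;\xrightarrow{\;\tau\;}\; P \mid Q_1' \mid \cdots \mid Q_n'}
\]

Intuitively, the impossibility result then states that this atomic (n+1)-way step cannot be decomposed into lower-arity synchronizations by any reasonable encoding, which the dining philosophers problem is used to witness.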
Abstract:
This research activity studied how uncertainties arise and interact in the multi-model approach, since they appear to be the biggest challenge of ocean and weather forecasting. Moreover, we tried to reduce the model error through the superensemble approach. To this aim, we created different datasets and, by means of suitable algorithms, obtained the superensemble estimate. We studied the sensitivity of this algorithm as a function of its characteristic parameters. Clearly, a reasonable estimate of the error cannot be obtained while neglecting the grid size of the ocean model, because of the large number of sub-grid phenomena embedded in the spatial discretization, which can only be roughly parametrized rather than explicitly resolved. For this reason we also developed a high-resolution model, in order to calculate for the first time the impact of grid resolution on model error.
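[Editorial illustration] A minimal sketch of the superensemble idea in its classical least-squares formulation: forecasts from several models are combined with weights fitted against observations over a training period. The anomaly-based setup and all names are illustrative assumptions, not the thesis's actual algorithm.

import numpy as np

def superensemble_weights(train_forecasts, train_obs):
    # train_forecasts: (n_times, n_models) model forecast anomalies during training
    # train_obs: (n_times,) observed anomalies during the same period
    weights, *_ = np.linalg.lstsq(train_forecasts, train_obs, rcond=None)
    return weights

def superensemble_forecast(forecast_anomalies, weights, observed_mean):
    # weighted combination of model anomalies, added back to the observed climatology
    return observed_mean + forecast_anomalies @ weights

Because each model receives its own weight, systematically biased or poorly performing models are automatically down-weighted, which is how the superensemble can beat both the best single model and the plain ensemble mean.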
Abstract:
The use of atmospheric pressure plasmas for thin film deposition on thermo-sensitive materials is currently one of the main challenges for the plasma scientific community. Despite the growing interest in this field, the existing knowledge gap between gas-phase reaction mechanisms and thin film properties is still one of the most important barriers to overcome for a complete understanding of the process. In this work, thin-film surface characterization techniques, combined with passive and active gas-phase diagnostic methods, were used to provide a comprehensive study of the Ar/TEOS deposition process assisted by an atmospheric pressure plasma jet. SiO2-based thin films exhibiting a well-defined chemistry, a good morphological structure and high uniformity were studied in detail by FTIR, XPS, AFM and SEM analysis. Furthermore, non-intrusive spectroscopy techniques (OES, filter imaging) and laser spectroscopic methods (Rayleigh scattering, LIF and TALIF) were employed to shed light on the complexity of the gas-phase mechanisms involved in the deposition process and to discuss the influence of the TEOS admixture on gas temperature, electron density and the spatio-temporal behaviour of active species. The poly-diagnostic approach proposed in this work opens interesting perspectives both in terms of process control and in the optimization of thin-film performance.
Abstract:
Against a backdrop of rapidly increasing worldwide population and growing energy demand, the development of renewable energy technologies has become of primary importance in the effort to reduce greenhouse gas emissions. However, it is often technically and economically infeasible to transport discontinuous renewable electricity over long distances to the shore. Another shortcoming of non-programmable renewable power is its integration into the onshore grid without affecting the dispatching process. On the other hand, the offshore oil & gas industry is striving to reduce the overall carbon footprint of onsite power generators and to limit the large expenses associated with carrying electricity from remote offshore facilities. Furthermore, the increased complexity of offshore hydrocarbon operations and their expansion towards challenging areas call for greater attention to safety and environmental protection issues arising from major accident hazards. Innovative hybrid energy systems, such as Power-to-Gas (P2G), Power-to-Liquid (P2L) and Gas-to-Power (G2P) options, implemented at offshore locations, would offer the opportunity to overcome challenges of both the renewable and oil & gas sectors. This study aims at the development of systematic methodologies, based on proper sustainability and safety performance indicators, supporting the choice of P2G, P2L and G2P hybrid energy options for offshore green projects in early design phases. An in-depth analysis of the different offshore hybrid strategies was performed. Literature reviews were carried out on existing methods proposing metrics to assess the sustainability of hybrid energy systems, the inherent safety of process routes in the conceptual design stage, and the environmental protection of installations from oil and chemical accidental spills. To fill the gaps, a suite of specific decision-making methodologies was developed, based on representative multi-criteria indicators addressing the technical, economic, environmental and societal aspects of alternative options. A set of five case studies, covering different offshore scenarios of concern, was defined to provide an assessment of the effectiveness and value of the developed tools.
Abstract:
The research project aims to improve the Design for Additive Manufacturing of metal components. Firstly, the scenario of Additive Manufacturing is depicted, describing its role in Industry 4.0, with a particular focus on Metal Additive Manufacturing technologies and applications in the Automotive sector. Secondly, the state of the art in Design for Additive Manufacturing is described, contextualizing the methodologies and classifying guidelines, rules, and approaches. The key phases of product design and process design for achieving lightweight functional designs and reliable processes are examined in depth, together with the Computer-Aided Technologies that support the implementation of these approaches. On this basis, a general Design for Additive Manufacturing workflow based on product and process optimization has been systematically defined. From the analysis of the state of the art, a holistic approach was considered fundamental, and the use of integrated product-process design platforms was therefore evaluated as a key element for its development. Indeed, a computer-based methodology exploiting integrated tools and numerical simulations to drive product and process optimization has been proposed. A validation of CAD platform-based approaches has been performed, and the potential offered by integrated tools has been evaluated. Concerning product optimization, systematic approaches to integrate topology optimization in the design have been proposed and validated through the product optimization of an automotive case study. Concerning process optimization, the use of process simulation techniques to prevent manufacturing flaws related to the high thermal gradients of metal processes has been developed, providing case studies that validate the results against experimental data, and an application to the process optimization of an automotive case study. Finally, an example of product and process design through the proposed simulation-driven integrated approach is provided to prove the method's suitability for effective redesigns of Additive-Manufacturing-based high-performance metal products. The results are then outlined, and further developments are discussed.
Abstract:
The microstructure of 6XXX aluminum alloys deeply affects the mechanical, crash, corrosion and aesthetic properties of extruded profiles. Unfortunately, grain structure evolution during manufacturing is a complex phenomenon, because several process and material parameters, such as alloy chemical composition, temperature, extrusion speed, tool geometries, quenching and thermal treatment parameters, affect the grain evolution during the manufacturing process. The aim of the present PhD thesis was the analysis of recrystallization kinetics during the hot extrusion of 6XXX aluminum alloys and the development of reliable recrystallization models to be used in FEM codes for microstructure prediction at the die design stage. Experimental activities were carried out in order to acquire data for the development and validation of the recrystallization models and to investigate the effect of process parameters and die design on the microstructure of the final component. The experimental campaign reported in this thesis involved the extrusion of AA6063, AA6060 and AA6082 profiles with different process parameters, in order to provide a reliable amount of data for model validation. Particular focus was placed on investigating the PCG defect evolution during the extrusion of medium-strength alloys such as AA6082. Several die designs and process conditions were analysed in order to understand the influence of each of them on the recrystallization behaviour of the investigated alloy. From the numerical point of view, innovative models for microstructure prediction were developed and validated on the extrusion of industrial-scale profiles with complex geometries, showing good agreement in terms of grain size and surface recrystallization prediction. The achieved results suggest the reliability of the developed models and their applicability in the industrial field for process and material properties optimization at the die-design stage.
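[Editorial illustration] For orientation, recrystallization kinetics models are commonly built on the Johnson-Mehl-Avrami-Kolmogorov (JMAK) form shown below; the models actually developed in the thesis may differ in detail:

\[
X(t) = 1 - \exp\!\left(-k\,t^{\,n}\right),
\qquad
k = k_0 \exp\!\left(-\frac{Q}{RT}\right),
\]

where \(X(t)\) is the recrystallized volume fraction, \(n\) the Avrami exponent, \(k_0\) a pre-exponential constant, \(Q\) an activation energy, \(R\) the gas constant and \(T\) the absolute temperature. Coupling such a kinetic law to the strain, strain-rate and temperature fields computed by an FEM extrusion code is what enables microstructure prediction at the die design stage.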
Abstract:
Nowadays, the chemical industry has reached significant goals in producing components essential for human beings. The growing competitiveness of the market has caused an important acceleration in R&D activities, introducing new opportunities and procedures for process improvement and optimization. In this dynamic context, sustainability is becoming one of the key aspects of technological progress, encompassing economic, environmental protection and safety aspects. With respect to the conceptual definition of sustainability, the literature reports an extensive discussion of strategies, as well as sets of specific principles and guidelines. However, literature procedures are not completely suitable for and applicable to process design activities. Therefore, the development and introduction of sustainability-oriented methodologies is a necessary step to enhance process and plant design. The definition of key drivers as a support system is a focal point for early process design decisions and for the implementation of process modifications. In this context, three different methodologies were developed to support design activities, providing criteria and guidelines in a sustainable perspective. In this framework, a set of Key Performance Indicators was selected and adopted to characterize the environmental, safety, economic and energetic aspects of a reference process. The methodologies are based on heat and material balances, and the level of detail required for input data is compatible with the information available for the specific application. Multiple case studies were defined to prove the effectiveness of the methodologies. The principal application is the polyolefin production lifecycle chain, with particular focus on polymerization technologies. In this context, different design phases were investigated, spanning from early process feasibility studies to operational and improvement assessments. This flexibility allows the methodologies to be applied at any level of design, providing supporting guidelines for design activities, comparing alternative solutions, monitoring operating processes and identifying potential improvements.
Abstract:
Every seismic event produces seismic waves which travel throughout the Earth. Seismology is the science of interpreting measurements to derive information about the structure of the Earth. Seismic tomography is the most powerful tool for the determination of the 3D structure of the Earth's deep interior. Tomographic models obtained at global and regional scales are an underlying tool for determining the geodynamical state of the Earth, showing evident correlation with other geophysical and geological characteristics. The global tomographic images of the Earth can be written as linear combinations of basis functions from a specifically chosen set, defining the model parameterization. A number of different parameterizations are commonly seen in the literature: seismic velocities in the Earth have been expressed, for example, as combinations of spherical harmonics or by means of the simpler characteristic functions of discrete cells. In this work we focus our attention on this aspect, evaluating a new type of parameterization based on wavelet functions. It is known from classical Fourier theory that a signal can be expressed as the sum of a, possibly infinite, series of sines and cosines. This sum is often referred to as a Fourier expansion. The big disadvantage of a Fourier expansion is that it has only frequency resolution and no time resolution. Wavelet Analysis (or the Wavelet Transform) is probably the most recent solution to overcome the shortcomings of Fourier analysis. The fundamental idea behind this innovative analysis is to study the signal according to scale. Wavelets, in fact, are mathematical functions that cut up data into different frequency components, so that each component can be studied with a resolution matched to its scale; they are therefore especially useful in the analysis of non-stationary processes that contain multi-scale features, discontinuities and sharp spikes. Wavelets are essentially used in two ways when applied to the study of geophysical processes or signals: 1) as a basis for the representation or characterization of a process; 2) as an integration kernel for analysis, to extract information about the process. These two types of application of wavelets in the geophysical field are the object of study of this work. First, we use wavelets as a basis to represent and solve the tomographic inverse problem. After a brief introduction to seismic tomography theory, we assess the power of wavelet analysis in the representation of two different types of synthetic models; we then apply it to real data, obtaining surface wave phase velocity maps and evaluating its abilities by comparison with another type of parameterization (i.e., block parameterization). For the second type of wavelet application, we analyze the ability of the Continuous Wavelet Transform in spectral analysis, starting again with synthetic tests to evaluate its sensitivity and capabilities, and then applying the same analysis to real data to obtain Local Correlation Maps between different models at the same depth or between different profiles of the same model.
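[Editorial illustration] As a didactic example of the scale localization discussed above, the toy single-level Haar decomposition below shows how a wavelet basis splits a signal into coarse averages and local details, whereas a Fourier expansion spreads a local feature over all coefficients. This sketch is ours and is not the parameterization used in the thesis.

import numpy as np

def haar_decompose(signal):
    """One Haar step: returns (approximation, detail), each half-length."""
    s = np.asarray(signal, dtype=float)
    pairs = s.reshape(-1, 2)
    approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)   # coarse-scale averages
    detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2)   # local differences
    return approx, detail

# A spike contaminates every Fourier coefficient, but shows up in exactly
# one detail coefficient at the matching location and scale:
x = np.zeros(16)
x[5] = 1.0
a, d = haar_decompose(x)
print(d)   # non-zero only at index 2 (= position 5 at half resolution)

Iterating haar_decompose on the approximation yields a full multiresolution pyramid, the same principle that lets a wavelet parameterization adapt its resolution to the local data coverage of a tomographic model.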
Abstract:
Tissue engineering is a discipline that aims at regenerating damaged biological tissues by using a cell-construct engineered in vitro, made of cells grown in a porous 3D scaffold. The role of the scaffold is to guide cell growth and differentiation by acting as a bioresorbable temporary substrate that will eventually be replaced by new tissue produced by the cells. As a matter of fact, obtaining a successful engineered tissue requires a multidisciplinary approach that integrates the basic principles of biology, engineering and materials science. The present Ph.D. thesis aimed at developing and characterizing innovative polymeric bioresorbable scaffolds made of hydrolysable polyesters. The potential of both commercial polyesters (i.e. poly-ε-caprolactone, polylactide and some lactide copolymers) and non-commercial polyesters (i.e. poly-ω-pentadecalactone and some of its copolymers) was explored and discussed. Two techniques were employed to fabricate the scaffolds: supercritical carbon dioxide (scCO2) foaming and electrospinning (ES). The former is a powerful technology that makes it possible to produce 3D microporous foams while avoiding the use of solvents that can be toxic to mammalian cells. The scCO2 process, which is commonly applied to amorphous polymers, was successfully modified to foam a highly crystalline poly(ω-pentadecalactone-co-ε-caprolactone) copolymer, and the effect of process parameters on scaffold morphology and thermo-mechanical properties was investigated. In the course of the present research activity, sub-micrometric fibrous non-woven meshes were produced using ES technology. Electrospun materials are considered highly promising scaffolds because they resemble the 3D organization of the native extracellular matrix. Careful control of the process parameters made it possible to fabricate defect-free fibres with diameters ranging from hundreds of nanometers to several microns, having either smooth or porous surfaces. Moreover, the versatility of ES technology made it possible to produce electrospun scaffolds from different polyesters, as well as "composite" non-woven meshes obtained by concomitantly electrospinning fibres differing in both morphology and polymer material. The 3D architecture of the electrospun scaffolds fabricated in this research was controlled in terms of mutual fibre orientation by suitably modifying the instrumental apparatus. This aspect is particularly interesting, since the micro/nano-architecture of the scaffold is known to affect cell behaviour. Since last-generation scaffolds are expected to induce a specific cell response, the present research activity also explored the possibility of producing electrospun scaffolds that are bioactive towards cells. Bio-functionalized substrates were obtained by loading the polymer fibres with growth factors (i.e. biomolecules that elicit a specific cell behaviour), and it was demonstrated that, despite the high voltages applied during electrospinning, the growth factor retains its biological activity once released from the fibres upon contact with the cell culture medium. A second functionalization approach, ultimately aimed at controlling cell adhesion on electrospun scaffolds, consisted in covering the fibre surface with highly hydrophilic polymer brushes of glycerol monomethacrylate synthesized by Atom Transfer Radical Polymerization. Future investigations will exploit the hydroxyl groups of the polymer brushes to functionalize the fibre surface with desired biomolecules.
Electrospun scaffolds were employed in cell culture experiments, performed in collaboration with biochemical laboratories, aimed at evaluating the biocompatibility of the new electrospun polymers and at investigating the effect of fibre orientation on cell behaviour. Moreover, at a preliminary stage, electrospun scaffolds were also cultured with mammalian tumour cells to develop in vitro tumour models aimed at better understanding the role of the natural ECM in tumour malignancy in vivo.
Abstract:
This thesis deals with the professional characteristics of secondary school teachers, with particular regard to their competence and their education. The topic is approached starting from the characteristics and transformations social research has identified concerning Italian teachers, focusing on secondary teacher training. After a brief look at Europe, attention is directed to Italy, with particular regard to the Postgraduate Schools of Specialisation for Secondary School Teachers (SSIS); hence the need for an analysis that focuses on teaching per se and its concrete practice. For its nature to be fully grasped, teaching must be reconsidered as an independent object of study: a performance in which competence manifests itself, and a form of action involving a set of tacit and personal knowledge. A further perspective opens up for analysis, according to which the professional characteristics of teachers are the result of an education in which the whole history of the subject is involved, in its educative, formative, professional and personal aspects. The teaching profession is imbued with implicit meanings which are inaccessible to consciousness but orient action and affect the interpretation of experience. Through the analysis of three different empirical data sets, collected among teachers-in-training and teachers qualified at SSIS, I investigate the actual existence, the nature and the features of such implicit knowledge. It appears necessary to put the claims of process-product approaches back into perspective, to the benefit of a holistic conception of teaching competence. The teacher is, at the same time, "the one who is teaching" and the one who offers a concrete receiver the fruit of an endless work of study, reflection, practice and self-improvement. To understand this process means to penetrate ever more deeply into the core of teaching and teaching competence, a competence that in some respects "is" always "that" teacher, with his or her own story, implicit knowledge and representations.
Abstract:
Multi-Processor SoC (MPSoC) design brings to the foreground a large number of challenges, one of the most prominent of which is the design of the chip interconnection. With the number of on-chip blocks presently ranging in the tens, and quickly approaching the hundreds, the novel issue of how best to provide on-chip communication resources is clearly felt. The scaling down of process technologies has increased process and dynamic variations as well as transistor wearout. Because of this, delay variations increase and impact the performance of MPSoCs. The interconnect architecture in MPSoCs becomes a single point of failure, as it connects all other components of the system together. A faulty processing element may be shut down entirely, but the interconnect architecture must be able to tolerate partial failures and variations and operate with a performance, power or latency overhead. This dissertation focuses on techniques at different levels of abstraction to address the reliability and variability issues in on-chip interconnection networks. By showing the test results of a GALS NoC testchip, this dissertation motivates the need for techniques to detect and work around manufacturing faults and process variations in the interconnection infrastructure of MPSoCs. As a physical design technique, we propose the bundle routing framework as an effective way to route the global links of Networks-on-Chip. At the architecture level, two cases are addressed: (i) intra-cluster communication, for which we propose a low-latency interconnect robust to variability; (ii) inter-cluster communication, for which online functional testing with a reliable NoC configuration is proposed. We also propose dual-Vdd operation as an orthogonal way of compensating variability at the post-fabrication stage. This is an alternative strategy with respect to the design techniques, since it enforces compensation at the post-silicon stage.
Abstract:
This dissertation takes as its object of analysis the poetic production of Robert Kroetsch (1927-2011), a Canadian writer and literary critic born in Alberta (Canada), who between 1960 and 2010 published a remarkable number of works (nine novels, more than twenty poetic works, both single poems and collections, two volumes of essays and several interviews). In particular, the focus is on his last three poetry collections, respectively The Hornbooks of Rita K (2001), The Snowbird Poems (2004) and Too Bad: Sketches Toward a Self-Portrait (2010), which, when compared with his earlier production, show evidence of new elements within Kroetsch's poetic perspective. The hypothesis from which this study originates is that, starting with the collection The Hornbooks of Rita K, Kroetsch embarked on a path of stylistic evolution running in parallel with the formulation of a new poetics. Specifically, it can be observed that, over the last ten years, from a formal point of view the poems became progressively fragmented, moving from a long form, the long poem, to a short one, the sketch, which proves better suited to expressing a changed perception of the poetic self. To this is added the fact that the depiction of his own human story increasingly becomes an occasion for reflection on the universal human condition and on the ethical dimension of human action.