20 results for Design For Manufacturing and Assembly
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This study focuses on radio-frequency inductively coupled thermal plasma (ICP) synthesis of nanoparticles, combining experimental and modelling approaches towards process optimization and industrial scale-up, in the framework of the FP7-NMP SIMBA European project (Scaling-up of ICP technology for continuous production of Metallic nanopowders for Battery Applications). First, the state of the art of nanoparticle production through conventional and plasma routes is summarized; then results on the characterization of the plasma source and on the investigation of the nanoparticle synthesis phenomenon are presented, aiming at highlighting the fundamental process parameters while adopting a design-oriented modelling approach. In particular, an energy balance of the torch and of the reaction chamber, based on a calorimetric method, is presented, while results of three- and two-dimensional modelling of an ICP system are compared with calorimetric and enthalpy probe measurements to validate the temperature field predicted by the model, which is then used to characterize the ICP system under powder-free conditions. Moreover, results from the modelling of critical phases of the ICP synthesis process, such as precursor evaporation, vapour conversion into nanoparticles and nanoparticle growth, are presented, with the aim of providing useful insights both for the design and optimization of the process and into the underlying physical phenomena. Precursor evaporation, one of the phases with the highest impact on the industrial feasibility of the process, is discussed in detail: by employing models describing particle trajectories and thermal histories, adapted from those originally developed for other plasma technologies or applications, such as DC non-transferred arc torches and powder spheroidization, the evaporation of a micro-sized Si solid precursor in a laboratory-scale ICP system is investigated. Finally, a discussion of the role of thermo-fluid dynamic fields in nanoparticle formation is presented, as well as a study of the effect of the reaction chamber geometry on the characteristics of the produced nanoparticles and on process yield.
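The calorimetric energy balance mentioned above amounts to measuring the heat carried away by the cooling circuits of the torch and reaction chamber and comparing it with the power delivered to the plasma. A minimal sketch of that arithmetic, with purely illustrative flow rates, temperatures and plate power (none of these values come from the thesis):

```python
# Illustrative calorimetric energy balance for an ICP torch (all values are made up).
# Heat removed by each water-cooling circuit: Q = m_dot * c_p * (T_out - T_in)

CP_WATER = 4186.0  # J/(kg K), specific heat of water

def cooling_power(m_dot_kg_s, t_in_c, t_out_c):
    """Thermal power removed by a water-cooling circuit [W]."""
    return m_dot_kg_s * CP_WATER * (t_out_c - t_in_c)

p_plate = 50e3  # W, RF power delivered to the torch (assumed)
q_torch = cooling_power(m_dot_kg_s=0.20, t_in_c=20.0, t_out_c=45.0)    # torch walls
q_chamber = cooling_power(m_dot_kg_s=0.35, t_in_c=20.0, t_out_c=32.0)  # reaction chamber

# Power actually transferred to the gas and resulting torch thermal efficiency
p_gas = p_plate - q_torch
eta_torch = p_gas / p_plate
print(f"torch losses: {q_torch/1e3:.1f} kW, chamber losses: {q_chamber/1e3:.1f} kW, "
      f"torch efficiency: {eta_torch:.2f}")
```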
Abstract:
This final thesis summarizes the research program I carried out during my PhD studies, which dealt with the design, preparation, characterization and applications of new Re(I), Ru(II) and Ir(III) metal complexes containing anionic ligands such as 5-aryl tetrazolates [R-CN4]- or their neutral analogues, N-alkyltetrazoles [R-CN4-R1]. Chapter 1 consists of a brief introduction on tetrazoles and metal-tetrazolato complexes, and on the photophysical properties of d6 transition metal complexes. In Chapter 2, the synthesis, characterization and study of the photophysical properties of new luminescent Ir(III)-tetrazolate complexes are discussed. Moreover, the application of one of the new Ir(III)-CN complexes as the emissive core in the fabrication of an OLED device is reported. In Chapter 3, the study of the antimicrobial activity of new Ru(II)-alkyltetrazole complexes is reported: when the pentatomic ring was substituted with a long alkyl residue, antimicrobial activity toward Deinococcus radiodurans was observed. In Chapter 4, a new family of luminescent Re(I)-tetrazolate complexes is reported. In this study, different N-alkyl tetrazoles play the role of diimine (diim) ligands in the preparation of new Re(I) tricarbonyl complexes. In addition, absorption and emission titration experiments were performed to study their interaction with Bovine Serum Albumin (BSA). In Chapter 5, the synthesis and characterization of new luminescent Re(I)-tetrazolate complexes are discussed: the use of sulfonated diimine ligands in the preparation of new Re(I) tricarbonyl complexes led to the first example of Re(I) complexes for the luminescent staining of proteins. In Chapter 6, the synthesis of a new family of Ir(III)-NO2 tetrazole complexes displaying unexpected photophysical properties is discussed. Moreover, the possibility of tuning the luminescent output of such systems upon chemical modification of the pendant nitro group was verified by performing reduction tests with sodium dithionite; this represents encouraging evidence for their possible application as hypoxia-responsive luminescent probes in bioimaging.
Abstract:
The project aims to build an understanding of additive manufacturing (AM) and other manufacturing 4.0 techniques with an eye toward industrialization. First, the internal material anisotropy of elements created with the most economically feasible technique, FDM, was established. The main drivers of variability in AM were then identified, with the focus on achieving internal material isotropy. Subsequently, a technique for deposition parameter optimization was presented, and the procedure was further tested on other polymeric materials and composites. A replicability assessment based on technology 4.0 tools was proposed, and the subsequent industry findings highlighted the need to develop a process that shows how to re-engineer designs in order to obtain the best results with AM processing. The final study applies the Industrial Design and Structure Method (IDES), combining all the knowledge previously gathered to fully re-engineer a product using tools from the 4.0 era, from product feasibility studies to CAE (FEM analysis) and CAM (DfAM). These results would help make AM and FDM processes a viable option, to be combined with composite technologies, for achieving a reliable, cost-effective manufacturing method that could also be used for mass-market industrial applications.
Abstract:
This thesis presents a new approach to the design and fabrication of bond wire magnetics for power converter applications, using standard IC gold bonding wires and micro-machined magnetic cores. It presents a systematic design and characterization study of bond wire transformers with toroidal and race-track cores on both PCB and silicon substrates. Measurement results show that the use of ferrite cores increases the secondary self-inductance up to 315 µH with a Q-factor up to 24.5 at 100 kHz. Measurement results on the LTCC core report an enhancement of the secondary self-inductance up to 23 µH with a Q-factor up to 10.5 at 1.4 MHz. A resonant DC-DC converter was designed in the 0.32 µm BCD6s technology of STMicroelectronics with a depletion-mode NMOSFET and a bond wire micro-transformer for energy harvesting (EH) applications. Measurements show that the circuit begins to oscillate from a TEG voltage of 280 mV and starts to convert from an input as low as 330 mV, providing a rectified output of 0.8 V at an input of 400 mV. Bond wire magnetics is a cost-effective approach that enables a flexible design of inductors and transformers with high inductance and high turns ratio. Additionally, it supports the development of magnetics on top of the IC active circuitry for package- and wafer-level integration, thus enabling the design of high-density power components. This makes possible the evolution of PwrSiP and PwrSoC with reliable, highly efficient magnetics.
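For reference, the quoted Q-factors relate frequency, inductance and winding resistance through Q = 2πfL/R (series L-ESR model). A small sketch using the ferrite-core figures quoted above, with the series resistance back-calculated rather than measured, so purely illustrative:

```python
import math

def q_factor(freq_hz, inductance_h, esr_ohm):
    """Quality factor of an inductor modelled as L in series with its ESR."""
    return 2 * math.pi * freq_hz * inductance_h / esr_ohm

# Ferrite-core secondary quoted above: 315 uH with Q ~ 24.5 at 100 kHz
# implies an effective series resistance of roughly 2*pi*f*L/Q ~ 8 ohm.
esr = 2 * math.pi * 100e3 * 315e-6 / 24.5
print(f"implied ESR ~ {esr:.1f} ohm, Q = {q_factor(100e3, 315e-6, esr):.1f}")
```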
Abstract:
A flexure hinge is a flexible connector that can provide a limited rotational motion between two rigid parts by means of material deformation. These connectors can be used to substitute traditional kinematic pairs (such as bearing couplings) in rigid-body mechanisms. Compared with their rigid-body counterparts, flexure hinges are characterized by reduced weight, absence of backlash and friction, and part-count reduction, but also by a restricted range of motion. Several types of flexure hinges have been studied and characterized in the literature for different applications. In our study, we have introduced new types of flexures with curved structures, i.e. circularly curved-beam flexures and spherical flexures. These flexures have been utilized for both planar applications (e.g. articulated robotic fingers) and spatial applications (e.g. spherical compliant mechanisms). We have derived closed-form compliance equations for both circularly curved-beam flexures and spherical flexures. Each element of the spatial compliance matrix is computed analytically as a function of the hinge dimensions and the employed material. The theoretical model is then validated by comparing the analytical data with the results obtained through Finite Element Analysis. A case study is also presented for each class of flexures, concerning potential applications in the optimal design of planar and spatial compliant mechanisms. Each case study is followed by a comparison of the performance of these novel flexures with that of commonly used geometries in terms of principal compliance factors, parasitic motions and maximum stress demands. Furthermore, we have extended our study to the design and analysis of serial and parallel compliant mechanisms, where the proposed flexures have been employed to achieve spatial motions, e.g. compliant spherical joints.
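As a point of reference for the compliance-matrix formulation mentioned above, the sketch below gives the planar compliance matrix of a plain straight-beam flexure under Euler-Bernoulli assumptions. The curved-beam and spherical expressions derived in the thesis are more general; this is only the textbook baseline case, with illustrative dimensions:

```python
import numpy as np

def straight_beam_compliance(L, E, b, h):
    """Planar compliance matrix mapping [Fx, Fy, Mz] to [ux, uy, theta]
    at the free end of a cantilevered straight beam (Euler-Bernoulli, shear neglected)."""
    A = b * h              # cross-section area
    I = b * h**3 / 12.0    # second moment of area about the bending axis
    return np.array([
        [L / (E * A), 0.0,                 0.0                ],
        [0.0,         L**3 / (3 * E * I),  L**2 / (2 * E * I) ],
        [0.0,         L**2 / (2 * E * I),  L / (E * I)        ],
    ])

# Illustrative steel leaf flexure: 20 mm long, 10 mm wide, 0.5 mm thick
C = straight_beam_compliance(L=0.020, E=210e9, b=0.010, h=0.0005)
print(C)
```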
Abstract:
In the digital age, e-health technologies play a pivotal role in the processing of medical information. As personal health data represent sensitive information concerning a data subject, enhancing the data protection and security of systems and practices has become a primary concern. In recent years, there has been increasing interest in the concept of Privacy by Design, which aims at developing a product or a service in a way that supports privacy principles and rules. In the EU, Article 25 of the General Data Protection Regulation provides a binding obligation to implement Data Protection by Design technical and organisational measures. This thesis explores how an e-health system could be developed, and how data processing activities could be carried out, so as to apply data protection principles and requirements from the design stage. The research attempts to bridge the gap between the legal and technical disciplines on DPbD by providing a set of guidelines for the implementation of the principle. The work is based on a literature review, legal and comparative analysis, and an investigation of existing technical solutions and engineering methodologies. The work can be divided into theoretical and applied perspectives. First, it conducts a critical legal analysis of the principle of PbD and studies the DPbD legal obligation and the related provisions. It then contextualises the rule in the health care field by investigating the applicable legal framework for personal health data processing. Moreover, the research focuses on the US legal system by conducting a comparative analysis. Adopting an applied perspective, the research investigates existing technical methodologies and tools for designing data protection, and it proposes a set of comprehensive DPbD organisational and technical guidelines for a crucial case study, namely an Electronic Health Record system.
Abstract:
The growing demand for lightweight solutions in every field of engineering is driving industry to seek new technological solutions that exploit the full potential of different materials. The combination of dissimilar materials with distinct property ranges enables a clear allocation of component functions while allowing an optimal mix of their characteristics. From both technological and design perspectives, however, the interaction between dissimilar materials can lead to severe defects that compromise a multi-material hybrid component's performance and structural integrity. This thesis aims to develop methodologies for the design, manufacturing, and monitoring of hybrid metal-composite joints and hybrid composite components. In Chapter 1, a methodology for designing and manufacturing hybrid aluminum/composite co-cured tubes is assessed. In Chapter 2, a full-field methodology for fiber misalignment detection and stiffness prediction in hybrid, long-fiber-reinforced composite systems is presented and demonstrated. Chapter 3 reports the development of a novel technology for joining short-fiber systems and metals in a one-step co-curing process using lattice structures. Chapter 4 is dedicated to a novel analytical framework for the design optimization of two lattice architectures.
Abstract:
This research proposes a solution for integrating RFID (Radio Frequency Identification) technology within a structure based on CFRPs (Carbon Fiber Reinforced Polymers). The main objective is to use the technology to monitor and track composite components during manufacturing and service life. The study can be divided into two macro-areas. The first portion of the research evaluates the impact of the composite materials used on the transmission of the electromagnetic signal to and from the tag, since RFID technology communicates through radio frequencies to track and trace the items associated with the tags. In the first instance, a feasibility study was carried out to assess the use of commercially available tags. Then, after evaluating different solutions, it was decided to incorporate the tags into coupons during production. The second portion of the research focuses on evaluating the impact of tag embedding on the strength of the composite material. It starts with the design of tensile test specimens through FEM models of different housing configurations. Subsequently, the best configuration was tested at the Faculty of Aerospace Engineering of TU Delft, in particular in the Structure & Materials Laboratory, where two tests were conducted: the first based on ASTM D3039/D3039M-14 (Standard Test Method for Tensile Properties of Polymer Matrix Composite Materials), and the second dividing the path to failure into intervals through a load-unload-reload sequence. Both tests were accompanied by instrumentation such as DIC, AE, C-Scan and optical microscopes. The expected result of including RFID tags in composite components is that they bring added value to the parts with which they are associated without overly affecting their mechanical properties, first of all through automatic identification during the production cycle and the parts' useful life. As a result, improvements were made in the design of production facilities.
Abstract:
Intangible resources have attracted the interest of scholars from different research areas due to their importance as crucial factors for firm performance; yet contributions to this field still lack a theoretical framework. This research analyses the state-of-the-art results reached in the literature concerning intangibles, their main features, and the related evaluation problems and models. In search of a possible theoretical framework, the research develops an indirect analysis of intangibles through the theories of the firm, their critiques and developments. The heterodox approaches of evolutionary theory and the resource-based view are indicated as possible frameworks. Based on this theoretical analysis, organization capital (OC) is identified, for its features, as the most important intangible for firm performance. Empirical studies on the relationship between intangibles and firm performance have been sporadic and have failed to reach firm conclusions with respect to OC; in an attempt to fill this gap, the effect of OC is tested on a large sample of European firms from the Compustat Global database. OC is proxied by capitalizing an income statement item (Selling, General and Administrative expenses) that includes expenses linked to information technology, business process design, reputation enhancement and employee training. This measure of OC is employed in a cross-sectional estimation of a firm-level production function - modelled with different functional specifications (Cobb-Douglas and Translog) - that measures the contribution of OC to firm output and profitability. Results are robust and confirm the importance of OC for firm performance.
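The production-function estimation described above can be summarised, in its Cobb-Douglas form, roughly as follows; the exact variable definitions, the capitalization rule for SG&A and the estimation details are those of the thesis, so this is only the generic log-linear form:

\[
\ln Y_i = \beta_0 + \beta_L \ln L_i + \beta_K \ln K_i + \beta_{OC} \ln OC_i + \varepsilon_i
\]

where \(Y_i\) is firm output, \(L_i\) labour, \(K_i\) physical capital and \(OC_i\) the stock of organization capital obtained by capitalizing SG&A expenses; the Translog specification adds squared and interaction terms of the same logged inputs.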
Abstract:
The development of the digital electronics market is founded on the continuous reduction of transistor size, which reduces area, power and cost and increases the computational performance of integrated circuits. This trend, known as technology scaling, is approaching the nanometer scale. The lithographic process in the manufacturing stage is becoming more and more uncertain as transistor size scales down, resulting in larger parameter variations in future technology generations. Furthermore, the exponential relationship between leakage current and threshold voltage is limiting the scaling of threshold and supply voltages, increasing the power density and creating local thermal issues such as hot spots, thermal runaway and thermal cycles. In addition, the introduction of new materials and the smaller device dimensions are reducing transistor robustness, which, combined with high temperatures and frequent thermal cycles, speeds up wear-out processes. These effects can no longer be addressed at the process level alone. Consequently, deep sub-micron devices will require solutions involving several design levels, such as system and logic, and new approaches called Design For Manufacturability (DFM) and Design For Reliability (DFR). The purpose of these approaches is to bring awareness of device reliability and manufacturability into the early design stages, in order to introduce logic and systems able to cope with yield and reliability loss. The ITRS roadmap suggests the following research steps to integrate design for manufacturability and reliability into the standard automated CAD design flow: i) new analysis algorithms able to predict the system thermal behaviour and its impact on power and speed performance; ii) high-level wear-out models able to predict the mean time to failure (MTTF) of the system; iii) statistical performance analysis able to predict the impact of process variation, both random and systematic. The new analysis tools have to be developed alongside new logic and system strategies to cope with future challenges, for instance: i) thermal management strategies that increase the reliability and lifetime of the devices by acting on tunable parameters such as supply voltage or body bias; ii) error detection logic able to interact with compensation techniques such as Adaptive Supply Voltage (ASV), Adaptive Body Bias (ABB) and error recovery, in order to increase yield and reliability; iii) architectures that are fundamentally resistant to variability, including locally asynchronous designs, redundancy, and error-correcting signal encodings (ECC). The literature already features works addressing the prediction of the MTTF, papers focusing on thermal management in general-purpose chips, and publications on statistical performance analysis. In my PhD research activity, I investigated the need for thermal management in future embedded low-power Network on Chip (NoC) devices. I developed a thermal analysis library that has been integrated into a cycle-accurate NoC simulator and into an FPGA-based NoC simulator. The results have shown that an accurate layout distribution can avoid the onset of hot spots in a NoC chip. Furthermore, the application of thermal management can reduce the temperature and the number of thermal cycles, increasing system reliability. The thesis therefore advocates the need to integrate thermal analysis into the first stages of embedded NoC design.
Later on, I focused my research on the development of a statistical process variation analysis tool able to address both random and systematic variations. The tool was used to analyse the impact of self-timed asynchronous logic stages in an embedded microprocessor. The results confirmed the capability of self-timed logic to increase manufacturability and reliability. Furthermore, we used the tool to investigate the suitability of low-swing techniques for NoC system communication under process variations. In this case we found that low-swing links have superior robustness to systematic process variation and respond well to compensation techniques such as ASV and ABB. Hence low-swing signalling is a good alternative to standard CMOS communication in terms of power, speed, reliability and manufacturability. In summary, my work proves the advantage of integrating a statistical process variation analysis tool into the first stages of the design flow.
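The thermal analysis libraries referred to above typically build on lumped thermal RC networks driven by the per-block dissipated power. A minimal single-node sketch of that idea (the actual library, its granularity and its parameters are not described here, so names and values are illustrative):

```python
def simulate_node_temperature(power_trace_w, r_th=2.0, c_th=0.05,
                              t_amb=45.0, dt=1e-3):
    """Explicit-Euler update of a single lumped thermal RC node.

    power_trace_w : dissipated power per time step [W]
    r_th          : thermal resistance to ambient [K/W]
    c_th          : thermal capacitance [J/K]
    t_amb         : ambient temperature [deg C]
    dt            : time step [s]
    """
    t = t_amb
    trace = []
    for p in power_trace_w:
        dT = (p - (t - t_amb) / r_th) / c_th  # net heat flow into the node
        t += dt * dT
        trace.append(t)
    return trace

# Example: a 2 W activity burst followed by an idle phase
temps = simulate_node_temperature([2.0] * 500 + [0.2] * 500)
print(f"peak temperature: {max(temps):.1f} deg C")
```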
Abstract:
Synchronization is a key issue in any communication system, but it becomes fundamental in navigation systems, which are entirely based on estimating the time delay of the signals coming from the satellites. Thus, even if synchronization has been a well-known topic for many years, the introduction of new modulations and new physical layer techniques in modern standards makes the traditional synchronization strategies completely ineffective. For this reason, the design of advanced and innovative synchronization techniques for modern communication systems, such as DVB-SH, DVB-T2, DVB-RCS, WiMAX and LTE, and for modern navigation systems such as Galileo, has been the topic of this research activity. Recent years have seen the consolidation of two different trends: the introduction of Orthogonal Frequency Division Multiplexing (OFDM) in communication systems, and of the Binary Offset Carrier (BOC) modulation in modern Global Navigation Satellite Systems (GNSS). Particular attention has therefore been given to the investigation of synchronization algorithms in these two areas.
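At their core, most of the synchronization schemes alluded to above reduce to locating the peak of a correlation between the received samples and a known reference (a training sequence, the cyclic prefix, or a BOC/PRN code). A minimal baseband sketch of that idea with made-up signals, not a specific DVB or Galileo algorithm:

```python
import numpy as np

def estimate_delay(rx, ref):
    """Coarse time-delay estimate (in samples) via the cross-correlation peak."""
    corr = np.correlate(rx, ref, mode="valid")
    return int(np.argmax(np.abs(corr)))

rng = np.random.default_rng(0)
ref = rng.choice([-1.0, 1.0], size=128)            # known training sequence
true_delay = 37
rx = np.concatenate([np.zeros(true_delay), ref, np.zeros(64)])
rx += 0.5 * rng.standard_normal(rx.size)           # additive noise
print(estimate_delay(rx, ref))                     # expected output: 37
```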
Abstract:
This PhD thesis reports on car fluff management, recycling and recovery. Car fluff is the residual waste produced by car recycling operations, particularly by hulk shredding. Car fluff is also known as Automotive Shredder Residue (ASR); it is made of plastics, rubbers, textiles, metals and other materials, and it is very heterogeneous both in composition and in particle size. In fact, fines may amount to about 50%, making it difficult to sort out recyclable materials or to exploit the heat value of ASR through energy recovery. This three-year study started with the definition of the state of the art of Italian End-of-Life Vehicle (ELV) recycling. A national recycling trial revealed that the Italian recycling rate was around 81% in 2008, while the European Community recycling target is set at 85% by 2015. Consequently, within the Industrial Ecology framework, a life cycle assessment (LCA) was conducted, revealing that sorting and recycling the polymers and metals contained in car fluff, followed by recovering the residual energy, is the route with the best environmental perspective. These results guided the second-year investigation, which involved pyrolysis trials on pretreated ASR fractions aimed at identifying processes suitable for an industrial-scale ASR treatment plant. Sieving followed by flotation gave good results in the thermochemical conversion of polymers, with polyolefins showing excellent conversion rates. This factor triggered ecodesign considerations. Ecodesign, together with LCA, is one of the pillars of Industrial Ecology and consists of design for recycling and design for disassembly, both aimed at improving the dismantling speed of car components and at substituting non-recyclable materials. Finally, during the last year, innovative plants and technologies for metal recovery from car fluff were visited and tested worldwide in order to design a new car fluff treatment plant aimed at ASR energy and material recovery.
Abstract:
This collection of essays examines various aspects of regional development and the issues of internationalization. The first essay investigates the implications of the impressive growth of China from a rural-urban perspective and addresses the topic of convergence in China by employing a non-parametric approach to study the distribution dynamics of per capita income at the province, rural and urban levels. To better understand the degree of inequality characterizing China and the long-term predictions of convergence or divergence of its different territorial aggregations, the second essay formulates a composite indicator of Regional Development (RDI) to benchmark development at the province and sub-province level. The RDI goes beyond the uni-dimensional concept of development, generally proxied by GDP per capita, and gives attention to the rural-urban dimension. The third essay, "Internationalization and Trade Specialization in Italy. The role of China in the international intra-firm trade of the Italian regions", deals with another aspect of regional economic development: the progressive de-industrialisation and de-localization of local production. This essay looks at the trade specialization of selected Italian regions (those specialized in manufacturing) and at the fragmentation of local production on a global scale. In this context China represents an important stakeholder, and the essay documents the importance of this country in the regional intra-firm trade.
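Composite indicators such as the RDI are usually built by normalizing each sub-indicator and aggregating the normalized values with weights. The thesis's exact choice of sub-indicators, normalization and weighting scheme is not reported here, so the following is only the generic min-max / weighted-sum form:

\[
\tilde{x}_{ij} = \frac{x_{ij} - \min_i x_{ij}}{\max_i x_{ij} - \min_i x_{ij}}, \qquad
RDI_i = \sum_j w_j\,\tilde{x}_{ij}, \qquad \sum_j w_j = 1
\]

where \(x_{ij}\) is the value of sub-indicator \(j\) for territorial unit \(i\) and \(w_j\) its weight.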
Abstract:
Life Cycle Assessment (LCA) is a chain-oriented tool to evaluate the environmental performance of products, focusing on their entire life cycle: from the extraction of resources, via manufacturing and use, to the final processing of the disposed products. Throughout all these stages, the consumption of resources and the releases of pollutants to air, water and soil are identified and quantified in the Life Cycle Inventory (LCI) analysis. The LCI phase is followed by the Life Cycle Impact Assessment (LCIA) phase, whose purpose is to convert resource consumption and pollutant releases into environmental impacts. The LCIA aims to model and evaluate environmental issues, called impact categories. Several reports emphasise the importance of LCA in the field of engineered nanomaterials (ENMs). ENMs offer enormous potential for the development of new products and applications; there are, however, unanswered questions about their impacts on human health and the environment. In the last decade, the increasing production, use and consumption of nanoproducts, with the consequent release into the environment, has accentuated the obligation to ensure that potential risks are adequately understood in order to protect both human health and the environment. Thanks to its holistic and comprehensive assessment, LCA is an essential tool to evaluate, understand and manage the environmental and health effects of nanotechnology. The evaluation of the health and environmental impacts of nanotechnologies throughout their whole life cycle using the LCA methodology is, however, still hampered by the lack of knowledge in relation to risk assessment: to date, the knowledge on human and environmental exposure to nanomaterials such as engineered nanoparticles (ENPs) is limited. This bottleneck is reflected in LCA, where characterisation models, and consequently characterisation factors, for ENPs are missing. The PhD project aims to assess the limitations and challenges of evaluating the freshwater aquatic ecotoxicity potential in the LCIA phase for ENPs, and in particular for nanoparticles such as n-TiO2.
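In LCIA, the conversion of the inventory into impact-category scores mentioned above is, in its standard form, a weighted sum of the inventoried emissions by their characterization factors; the missing-CF problem for ENPs means the \(CF\) term is simply unavailable for those flows:

\[
IS_c = \sum_i CF_{c,i} \times m_i
\]

where \(m_i\) is the mass of substance \(i\) emitted (from the LCI) and \(CF_{c,i}\) its characterization factor for impact category \(c\) (e.g. freshwater aquatic ecotoxicity).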
Abstract:
Nanotechnology entails the manufacturing and manipulation of matter at length scales ranging from single atoms to micron-sized objects. The ability to address properties on the biologically relevant nanometer scale has made nanotechnology attractive for Nanomedicine. This is perceived as a great opportunity in healthcare, especially in diagnostics and therapeutics, and more generally for the development of personalized medicine. Nanomedicine has the potential to enable early detection and prevention, and to improve diagnosis, mass screening, treatment and follow-up of many diseases. From the biological standpoint, nanomaterials match the typical size of naturally occurring functional units or components of living organisms and, for this reason, enable more effective interaction with biological systems. Nanomaterials have the potential to influence functionality and cell fate in the regeneration of organs and tissues. To this aim, nanotechnology provides an arsenal of techniques for intervening in, fabricating, and modulating the environment in which cells live and function. Unconventional micro- and nano-fabrication techniques allow biomolecules and biocompatible materials to be patterned down to feature sizes of a few nanometers. Patterning is not simply the deterministic placement of a material; in a broader sense it allows a controlled fabrication of structures and gradients of different nature. Gradients are emerging as one of the key factors guiding cell adhesion, proliferation, migration and even differentiation in the case of stem cells. The main goal of this thesis has been to devise a nanotechnology-based strategy and tools to spatially and temporally control biologically relevant in-vitro phenomena that are important in some fields of medical research.