490 results for Workflow


Relevance:

10.00%

Publisher:

Abstract:

Introduction: Bone scintigraphy is one of the most frequently performed examinations in Nuclear Medicine. This medical imaging modality requires an appropriate balance between image quality and radiation dose: the acquired images must contain the minimum number of counts needed for a quality considered sufficient for diagnostic purposes. Objective: The main objective of this study is the application of the Enhanced Planar Processing (EPP) software to bone scintigraphy examinations of patients with breast and prostate carcinoma presenting bone metastases, in order to evaluate the performance of the EPP algorithm in clinical practice in terms of image quality and diagnostic confidence when the acquisition time is reduced by 50%. Material and Methods: This investigation took place in the Department of Radiology and Nuclear Medicine of the Radboud University Nijmegen Medical Centre. Fifty-one patients with suspected bone metastases were administered 500 MBq of technetium-99m-labelled methylene diphosphonate. Each patient underwent two image acquisitions: the first followed the department's standard protocol (scan speed = 8 cm/min), and in the second the acquisition time was halved (scan speed = 16 cm/min). The images acquired with the second protocol were processed with the EPP algorithm. All images underwent both objective and subjective evaluation. For the subjective analysis, three Nuclear Medicine physicians evaluated the images in terms of lesion detectability, image quality, diagnostic acceptability, lesion localization and diagnostic confidence. For the objective evaluation, two regions of interest were selected, one in the middle third of the femur and the other in the adjacent soft tissue, in order to obtain signal-to-noise ratio, contrast-to-noise ratio and coefficient of variation values. Results: The results show that images processed with the EPP software provide physicians with sufficient diagnostic information for the detection of metastases, since no statistically significant differences were found (p>0.05). In addition, the inter-observer agreement between these images and the images acquired with the standard protocol was 95% (k=0.88). Regarding image quality, on the other hand, statistically significant differences were found when the imaging modalities were compared with each other (p≤0.05). Regarding diagnostic acceptability, no statistically significant differences were found between the images acquired with the standard protocol and the images processed with the EPP software (p>0.05), with an inter-observer agreement of 70.6%. However, statistically significant differences were found between the images acquired with the standard protocol and the images acquired with the second protocol but not processed with the EPP software (p≤0.05). Furthermore, no statistically significant differences (p>0.05) were found in terms of signal-to-noise ratio, contrast-to-noise ratio and coefficient of variation between the images acquired with the standard protocol and the images processed with EPP.
Conclusion: From the results of this study it can be concluded that the EPP algorithm, developed by Siemens, offers the possibility of reducing the acquisition time by 50% while maintaining image quality considered sufficient for diagnostic purposes. Besides increasing patient satisfaction, the use of this technology is highly advantageous for the department's workflow.
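The abstract reports signal-to-noise ratio, contrast-to-noise ratio and coefficient of variation without giving their formulas. Below is a minimal sketch, assuming the common ROI-based definitions (target ROI mean over background ROI standard deviation for SNR, and so on); the ROI names and count values are illustrative placeholders, not data from the study.

```python
import numpy as np

def roi_metrics(target_counts, background_counts):
    """Common ROI-based figures of merit for planar scintigraphy.

    target_counts     -- pixel counts in the femur (or lesion) ROI
    background_counts -- pixel counts in the adjacent soft-tissue ROI
    """
    t_mean, t_std = np.mean(target_counts), np.std(target_counts)
    b_mean, b_std = np.mean(background_counts), np.std(background_counts)
    return {
        "SNR": t_mean / b_std,                # signal-to-noise ratio
        "CNR": abs(t_mean - b_mean) / b_std,  # contrast-to-noise ratio
        "CV_target": t_std / t_mean,          # coefficient of variation, femur ROI
        "CV_background": b_std / b_mean,      # coefficient of variation, soft tissue
    }

# Illustrative Poisson count data only, not values from the study:
rng = np.random.default_rng(0)
femur_roi = rng.poisson(lam=200, size=500)
soft_tissue_roi = rng.poisson(lam=60, size=500)
print(roi_metrics(femur_roi, soft_tissue_roi))
```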

Relevance:

10.00%

Publisher:

Abstract:

Single-cell functional proteomics assays can connect genomic information to biological function through quantitative and multiplex protein measurements. Tools for single-cell proteomics have developed rapidly over the past 5 years and are providing unique opportunities. This thesis describes an emerging microfluidics-based toolkit for single cell functional proteomics, focusing on the development of the single cell barcode chips (SCBCs) with applications in fundamental and translational cancer research.

The thesis begins with a microchip designed to simultaneously quantify a panel of secreted, cytoplasmic and membrane proteins from single cells; this chip is the prototype for subsequent proteomic microchips with more sophisticated designs for preclinical cancer research and clinical applications. The SCBCs are a highly versatile and information-rich tool for single-cell functional proteomics. They are based upon isolating individual cells, or defined numbers of cells, within microchambers, each of which is equipped with a large antibody microarray (the barcode); a single microchip contains between a few hundred and ten thousand microchambers. Functional proteomics assays at single-cell resolution yield unique information that is significantly reshaping how cancer research is approached. An in-depth discussion of the analysis and interpretation of this unique information, such as functional protein fluctuations and protein-protein correlative interactions, follows.

The SCBC is a powerful tool for resolving the functional heterogeneity of cancer cells. It has the capacity to extract a comprehensive picture of the signal transduction network from single tumor cells and thus provides insight into the effect of targeted therapies on protein signaling networks. We will demonstrate this point by applying SCBCs to investigate three isogenic cell lines of glioblastoma multiforme (GBM).

Cancer cell populations are highly heterogeneous, with high-amplitude fluctuations at the single-cell level that in turn confer robustness on the entire population. A stable population existing in the presence of random fluctuations is reminiscent of many physical systems that are successfully understood using statistical physics, so tools derived from that field can likely be applied to use fluctuations to determine the nature of signaling networks. The second part of the thesis focuses on such a case, using thermodynamics-motivated principles to understand cancer cell hypoxia: single-cell proteomics assays coupled with a quantitative version of Le Chatelier's principle derived from statistical mechanics yield detailed and surprising predictions, which were found to be correct in both cell line and primary tumor models.
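The abstract does not spell out the form of this quantitative principle. As an illustration of the kind of relation involved, and assuming a maximum-entropy description in which the measured protein copy numbers follow a distribution of the form P(x) proportional to exp of a weighted sum of the x_j, a small change in the constraint parameters (the perturbation) shifts the mean levels in proportion to the measured covariance of the unperturbed fluctuations. This is a standard statistical-mechanics identity, not necessarily the exact formulation used in the thesis:

\[
\frac{\partial \langle x_i \rangle}{\partial \lambda_j} = \operatorname{Cov}(x_i, x_j)
\qquad\Longrightarrow\qquad
\delta\langle x_i \rangle \;\approx\; \sum_j \operatorname{Cov}(x_i, x_j)\,\delta\lambda_j .
\]

In this reading, the single-cell covariance matrix plays the role of a susceptibility: the fluctuations predict the direction and magnitude of the population's response to a perturbation, which is the quantitative analogue of Le Chatelier's principle.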

The third part of the thesis demonstrates the application of this technology in preclinical cancer research to study GBM cancer cell resistance to molecular targeted therapy. Physical approaches to anticipate therapy resistance and to identify effective therapy combinations are discussed in detail. Our approach is based upon elucidating the signaling coordination within the phosphoprotein signaling pathways that are hyperactivated in human GBMs, and interrogating how that coordination responds to perturbation by a targeted inhibitor. Most signaling cascades consist of strongly coupled protein-protein interactions; a physical analogy of such a system is the strongly coupled atom-atom interactions in a crystal lattice. Just as the atomic interactions can be decomposed into a series of independent normal vibrational modes, a simplified picture of signaling network coordination can be obtained by diagonalizing protein-protein correlation or covariance matrices, decomposing the pairwise correlative interactions into a set of distinct linear combinations of signaling proteins (i.e., independent signaling modes). In this way, two independent signaling modes, one associated with mTOR signaling and a second associated with ERK/Src signaling, have been resolved, which in turn allows us to anticipate resistance, to design effective combination therapies, and to identify those therapies and therapy combinations that will be ineffective. We validated our predictions in mouse tumor models and all predictions were borne out.
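A minimal numerical sketch of the decomposition step described above, assuming a matrix of single-cell measurements with one column per assayed phosphoprotein; the protein names and data are placeholders, and this shows only the generic eigendecomposition, not the full analysis in the thesis.

```python
import numpy as np

# Rows = single cells, columns = assayed phosphoproteins (placeholder data).
proteins = ["p-mTOR", "p-S6K", "p-ERK", "p-Src"]
rng = np.random.default_rng(1)
measurements = rng.normal(size=(500, len(proteins)))

# Covariance of the protein-protein fluctuations across single cells.
cov = np.cov(measurements, rowvar=False)

# Diagonalize: eigenvectors are independent "signaling modes", i.e. linear
# combinations of proteins that fluctuate independently; eigenvalues give
# the variance carried by each mode.
eigenvalues, eigenvectors = np.linalg.eigh(cov)
order = np.argsort(eigenvalues)[::-1]   # strongest mode first
for idx in order:
    weights = ", ".join(f"{p}: {w:+.2f}" for p, w in zip(proteins, eigenvectors[:, idx]))
    print(f"mode variance {eigenvalues[idx]:.2f} -> {weights}")
```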

In the last part, some preliminary results on the clinical translation of single-cell proteomics chips will be presented. The successful demonstration of our work on human-derived xenografts provides the rationale for extending our current work into the clinic. It will enable us to interrogate GBM tumor samples in a way that could yield a straightforward, rapid interpretation, so that we can give therapeutic guidance to the attending physicians within a clinically relevant time scale. The technical challenges of this clinical translation will be presented, and our solutions to address them will be discussed as well. A clinical case study will then follow, in which preliminary data collected from a pediatric GBM patient bearing an EGFR-amplified tumor will be presented to demonstrate the general protocol and workflow of the proposed clinical studies.

Relevance:

10.00%

Publisher:

Abstract:

A crosswell data set contains a range of angles limited only by the geometry of the source and receiver configuration, the separation of the boreholes, and the depth to the target. However, the wide-angle reflections present in crosswell imaging produce amplitude-versus-angle (AVA) features not usually observed in surface data. These features include reflections from angles that are near critical and beyond critical for many of the interfaces; some of these reflections are visible only over a small range of angles, presumably near their critical angle. High-resolution crosswell seismic surveys were conducted over a Silurian (Niagaran) reef at two fields in northern Michigan, Springdale and Coldspring. The Springdale wells extended to much greater depths than the reef, and imaging was conducted from above and from beneath the reef. Combining the images obtained from above with those from beneath provides additional information, first by exhibiting ranges of angles that differ between the two images, especially for reflectors at shallow depths, and second by providing additional constraints on the solutions of the Zoeppritz equations. Inversion of seismic data for impedance has become a standard part of the workflow for quantitative reservoir characterization. Inversion of crosswell data using either deterministic or geostatistical methods can give poor results because of the phase change beyond the critical angle, so the simultaneous pre-stack inversion of partial angle stacks may be best conducted with angles restricted to less than critical. Deterministic inversion is designed to yield only a single, best-fit model of elastic properties, while geostatistical inversion produces multiple models (realizations) of elastic properties, lithology and reservoir properties. Geostatistical inversion produces results with far more detail than deterministic inversion, and the difference in detail between the two becomes increasingly pronounced for thinner reservoirs, particularly those below the vertical resolution of the seismic. For any interface imaged from both above and beneath, the resulting AVA characters must arise from identical contrasts in elastic properties in the two sets of images, albeit in reverse order. An inversion approach that handles both datasets simultaneously, at pre-critical angles, is demonstrated in this work. The main exploration problem for carbonate reefs is determining the porosity distribution. Images of elastic properties, obtained from deterministic and geostatistical simultaneous inversion of a high-resolution crosswell seismic survey, were used to obtain the internal structure and reservoir properties (porosity) of a Niagaran Michigan reef. The images obtained are the best of any Niagaran pinnacle reef to date.
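To make the critical-angle behaviour concrete, the sketch below uses the plane-wave acoustic reflection coefficient as a simplified stand-in for the elastic Zoeppritz equations; beyond the critical angle the coefficient becomes complex, with unit magnitude and a rotating phase, which is the behaviour that complicates post-critical AVA inversion. The velocities and densities are illustrative values, not measurements from the Michigan surveys.

```python
import numpy as np

def acoustic_reflection(theta1_deg, v1, rho1, v2, rho2):
    """Plane-wave acoustic reflection coefficient at a welded interface.

    Simplified stand-in for the elastic Zoeppritz equations: beyond the
    critical angle the transmission angle becomes complex, |R| tends to 1
    and the reflection acquires a phase rotation.
    """
    theta1 = np.radians(theta1_deg)
    z1, z2 = rho1 * v1, rho2 * v2
    sin_theta2 = (v2 / v1) * np.sin(theta1)        # Snell's law
    cos_theta2 = np.sqrt(1 - sin_theta2**2 + 0j)   # complex beyond critical
    return (z2 * np.cos(theta1) - z1 * cos_theta2) / (z2 * np.cos(theta1) + z1 * cos_theta2)

# Illustrative properties (not values from the surveys):
v1, rho1 = 3000.0, 2.40   # overlying unit: m/s, g/cc
v2, rho2 = 4500.0, 2.65   # reef carbonate: m/s, g/cc
theta_c = np.degrees(np.arcsin(v1 / v2))
print(f"critical angle ~ {theta_c:.1f} deg")
for ang in (10, 30, theta_c, 60):
    r = acoustic_reflection(ang, v1, rho1, v2, rho2)
    print(f"theta={ang:5.1f} deg  |R|={abs(r):.3f}  phase={np.degrees(np.angle(r)):+.1f} deg")
```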

Relevance:

10.00%

Publisher:

Abstract:

Today, modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while maintaining the size of the hardware circuit. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications can be identified using profiling tools, and hardware acceleration can yield significant performance improvements for highly mathematical calculations or repeated functions. The performance of SoC systems can then be improved if hardware acceleration is applied to the elements that incur performance overheads. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core; the hotspot function of the target application is identified using critical attributes such as cycles per loop and loop rounds. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance; the identified hotspot function is converted into a hardware accelerator and mapped onto the hardware platform, and two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture. (3) System specifications such as performance, energy consumption, and resource costs are measured and analyzed; the trade-off among these three factors is compared and balanced, and different hardware accelerators are implemented and evaluated based on system requirements. (4) The system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques used to obtain higher performance at lower resource cost. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves a 7.9X performance improvement and saves 75.85% of energy consumption.
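As a back-of-the-envelope companion to this profile-then-accelerate flow, the sketch below applies Amdahl's law to estimate the end-to-end speedup from offloading a profiled hotspot to an FPGA accelerator; the hotspot fraction and accelerator speedup are illustrative inputs, not figures from the thesis.

```python
def overall_speedup(hotspot_fraction, accelerator_speedup):
    """Amdahl's law: end-to-end speedup when only the profiled hotspot
    (hotspot_fraction of total runtime) runs accelerator_speedup times faster."""
    return 1.0 / ((1.0 - hotspot_fraction) + hotspot_fraction / accelerator_speedup)

# Illustrative profiling result: the hotspot takes 70% of runtime and the
# FPGA accelerator runs it 10x faster than the software implementation.
print(f"{overall_speedup(0.70, 10.0):.2f}x")   # ~2.70x end-to-end
```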

Relevance:

10.00%

Publisher:

Abstract:

This presentation was given at the Digital Commons Southeastern User Group conference at Winthrop University, South Carolina, on June 5, 2015. The presentation discusses how the Digital Collections Center (DCC) at Florida International University uses Digital Commons as its tool for ingesting, editing, tracking, and publishing university theses and dissertations. The basic DCC workflow is covered, as well as institutional repository promotion.

Relevance:

10.00%

Publisher:

Abstract:

Concurrent software executes multiple threads or processes to achieve high performance. However, concurrency results in a huge number of different system behaviors that are difficult to test and verify. The aim of this dissertation is to develop new methods and tools for modeling and analyzing concurrent software systems at the design and code levels. This dissertation consists of several related results. First, a formal model of Mondex, an electronic purse system, is built from user requirements using Petri nets and formally verified using model checking. Second, Petri net models are automatically mined from the event traces generated by scientific workflows. Third, partial order models are automatically extracted from instrumented concurrent program executions, and potential atomicity violation bugs are automatically verified on the partial order models using model checking. Our formal specification and verification of Mondex contribute to the worldwide effort to develop a verified software repository. Our method for automatically mining Petri net models from provenance offers a new approach to building scientific workflows. Our dynamic prediction tool, named McPatom, can predict several known bugs in real-world systems, including one that evades several other existing tools. McPatom is efficient and scalable because it takes advantage of the nature of atomicity violations and considers only a pair of threads and accesses to a single shared variable at a time. However, predictive tools need to consider the trade-off between precision and coverage. Building on McPatom, this dissertation presents two methods for improving the coverage and precision of atomicity violation predictions: 1) a post-prediction analysis method that increases coverage while ensuring precision; and 2) a follow-up replaying method that further increases coverage. Both methods are implemented in a completely automatic tool.
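To make the two-threads, one-shared-variable restriction concrete, here is a minimal sketch based on the classic classification of unserializable access interleavings (a local access pair by one thread with a conflicting remote access by another thread in between). It illustrates the kind of pattern such predictors flag; it is not McPatom's actual partial-order and model-checking algorithm.

```python
# The four classically unserializable interleavings of a local access pair
# (first, second) by one thread with a remote access in between:
#   R-W-R : the two local reads can observe different values
#   W-W-R : the local read observes the remote write, not its own thread's write
#   W-R-W : the remote read observes an intermediate (temporary) value
#   R-W-W : the local write is based on a value made stale by the remote write
UNSERIALIZABLE = {("R", "W", "R"), ("W", "W", "R"), ("W", "R", "W"), ("R", "W", "W")}

def atomicity_violations(trace):
    """trace: list of (thread_id, access) events on ONE shared variable,
    where access is "R" or "W". Flags local access pairs with a conflicting
    remote access interleaved between them."""
    violations = []
    for i, (tid, first) in enumerate(trace):
        # the next access by the same thread closes the local pair
        for j in range(i + 1, len(trace)):
            if trace[j][0] == tid:
                second = trace[j][1]
                for k in range(i + 1, j):   # any conflicting remote access between?
                    rtid, racc = trace[k]
                    if rtid != tid and (first, racc, second) in UNSERIALIZABLE:
                        violations.append((i, k, j))
                break
    return violations

# Classic lost update on a counter: both threads read, then both write.
trace = [(1, "R"), (2, "R"), (2, "W"), (1, "W")]
print(atomicity_violations(trace))   # [(0, 2, 3)] -> thread 1's R..W pair is broken
```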

Relevance:

10.00%

Publisher:

Abstract:

The Building Information Modeling (BIM) concept has the potential to reshape every AEC project and the industry in general, offering a comprehensive collaboration process built around a model of the structure with regularly updated and synchronized information. This report presents an overview of BIM, focusing on its core concepts, its applications across the project life cycle, and its benefits for project stakeholders, through four case studies carried out during an internship at the engineering office NEWTON - Engineering Consultancy Company. The aim of the four case studies was to cover multidisciplinary and varied projects. The first case study highlights the engineering project workflow and compares traditional procedures with BIM concepts applied to the rehabilitation of an existing building. In the second and third case studies, attention is focused on the goals achieved, particularly by the structural engineer, through the implementation of this technology on a full-lifecycle BIM project of a small residence and on a complex residential building project in Porto, including its architectural integration. In addition, in the fourth case study, the spatial coordination of Mechanical, Electrical and Plumbing (MEP) systems at a large-scale hotel project was analyzed and accomplished, highlighting the merits of BIM at this project stage: through a reduction of the space used for facilities and infrastructure, and the ability to identify conflicts and nullify the related costs, its advantage for a complex building was demonstrated.
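The conflict identification mentioned above reduces, in its simplest form, to an overlap test between element bounding boxes. Below is a minimal sketch assuming axis-aligned bounding boxes extracted from the MEP and structural models; the element names and coordinates are placeholders, not data from the NEWTON projects.

```python
from itertools import product

def boxes_clash(a, b, tolerance=0.0):
    """Axis-aligned bounding-box overlap test.
    a, b: ((xmin, ymin, zmin), (xmax, ymax, zmax)) in metres."""
    (a_min, a_max), (b_min, b_max) = a, b
    return all(a_min[i] - tolerance <= b_max[i] and b_min[i] - tolerance <= a_max[i]
               for i in range(3))

# Placeholder elements: a duct run vs. a concrete beam.
mep_elements = {"duct_01": ((0.0, 0.0, 2.8), (4.0, 0.4, 3.2))}
structural_elements = {"beam_A": ((2.0, 0.0, 3.0), (2.3, 6.0, 3.5))}

clashes = [(m, s)
           for (m, mb), (s, sb) in product(mep_elements.items(), structural_elements.items())
           if boxes_clash(mb, sb)]
print(clashes)   # [('duct_01', 'beam_A')] -> the duct passes through the beam zone
```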

Relevance:

10.00%

Publisher:

Abstract:

Introduction: There has been continuous development of new technologies in healthcare derived from national quality registries. However, this innovation needs to be translated into the workflow of healthcare delivery to enable children with long-term conditions to get the best possible support in managing their health during everyday life. Since children living with long-term conditions experience different levels of interference in their lives, healthcare professionals need to assess the impact of care on children's day-to-day lives as a complement to biomedical assessments. Aim: The overall aim of this thesis was to explore and describe the use of health-related quality of life (HRQOL) instruments in outpatient care for children with long-term conditions on the basis of a national quality registry system. Methods: The research was conducted using comparative, cross-sectional and explorative designs, and data collection was performed with different methods: the DISABKIDS Chronic Generic Measure-37 questionnaire, semi-structured interviews and video recordings of consultations. Altogether, 156 children (8-18 years) and nine healthcare professionals participated in the studies. Children with Type 1 Diabetes (T1D) (n = 131) answered the DISABKIDS questionnaire, and children with rheumatic diseases, kidney diseases and T1D (n = 25) were interviewed after their consultation at the outpatient clinic, where web-DISABKIDS had been used. In total, nine healthcare professionals used the HRQOL instrument as an assessment tool during the encounters, which were video-recorded (n = 21). Quantitative deductive content analysis was used to describe the content of different HRQOL instruments, statistical inference was used to analyse the results from DISABKIDS, and qualitative content analysis was used to analyse the interviews and video recordings. Results: The findings showed that, from a biopsychosocial perspective, both generic and disease-specific instruments should be used to gain a comprehensive evaluation of the child's HRQOL. The DISABKIDS instrument is applicable for describing different aspects of health in children with T1D. When DISABKIDS was used in the encounters, children expressed positive experiences about sharing their results with the healthcare professional. Different approaches by the healthcare professionals when using DISABKIDS during the encounter led to different outcomes for the child: when an instructing approach is used, the child's ability to learn more about their health and how to improve it is limited, whereas when an inviting or engaging approach is used, the child may become more involved in the conversation. Conclusions: It can be argued that HRQOL instruments could be used as a complement to biomedical variables to promote a biopsychosocial perspective on the child's health. According to the children in this thesis, feedback on their results after answering web-DISABKIDS is important, which implies that healthcare professionals need to prioritize time in the encounters for discussing the results from HRQOL instruments. If healthcare professionals involve the child in the discussion of the HRQOL results, misinterpreted answers can be corrected during the conversation. At the same time, this requires that healthcare professionals invite and engage the child.

Relevance:

10.00%

Publisher:

Abstract:

Background: Physician-rating websites have become a popular tool for creating more transparency about the quality of health care providers. So far, it remains unknown whether online rating websites have the potential to contribute to a better standard of care. Objective: Our goal was to examine which health care providers use online rating websites and for what purposes, and whether health care providers use online patient ratings to improve patient care. Methods: We conducted an online cross-sectional study by surveying 2360 physicians and other health care providers (September 2015). In addition to descriptive statistics, we performed multilevel logistic regression models to ascertain the effects of providers' demographics as well as report-card-related variables on the likelihood that providers implement measures to improve patient care. Results: Overall, more than half of the responding providers (54.66%, 1290/2360) used online ratings to derive measures to improve patient care (implemented measures: mean 3.06, SD 2.29). Ophthalmologists (68%, 40/59) and gynecologists (65.4%, 123/188) were most likely to implement any measures. The most widely implemented quality measures were related to communication with patients (28.77%, 679/2360), the appointment scheduling process (23.60%, 557/2360), and office workflow (21.23%, 501/2360). Scaled-survey results had a greater impact on deriving measures than narrative comments. The multilevel logistic regression models revealed medical specialty, the frequency of report card use, and the appraisal of the trustworthiness of scaled-survey ratings to be significant predictors of implementing measures to improve patient care because of online ratings. Conclusions: Our results suggest that online ratings displayed on physician-rating websites have an impact on patient care. Despite the limitations of our study and the unintended consequences of physician-rating websites, these sites may still have the potential to improve patient care.
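A minimal sketch of the kind of regression reported here, simplified to a single-level logistic model (the study itself used multilevel models); the predictor names and data below are synthetic placeholders, not the survey data.

```python
import numpy as np
import statsmodels.api as sm

# Placeholder provider-level data: outcome = implemented at least one measure (0/1),
# predictors = frequency of report card use and trust in scaled-survey ratings.
rng = np.random.default_rng(42)
n = 2360
report_card_use = rng.integers(0, 5, size=n)     # e.g. 0 = never ... 4 = weekly
trust_in_ratings = rng.integers(1, 6, size=n)    # e.g. 1-5 Likert scale
logit_p = -2.0 + 0.5 * report_card_use + 0.3 * trust_in_ratings
implemented = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

X = sm.add_constant(np.column_stack([report_card_use, trust_in_ratings]))
model = sm.Logit(implemented, X).fit(disp=False)
print(model.summary(xname=["const", "report_card_use", "trust_in_ratings"]))
```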

Relevance:

10.00%

Publisher:

Abstract:

This thesis is divided into two main topics related to geometry preparation for MCNP models. The first is the geometric errors that are generated when a conversion from CAD to CSG format takes place, and their relationship with the phenomenon of lost particles. Software-based conversion to CSG is unavoidable for the construction of complex models such as those used to represent ITER components, and it can generate regions of the geometry that are not correctly defined. Such regions cause the loss of particles during the Monte Carlo simulation, undermining the statistical integrity of the transport solution. For this reason it is very important to reduce this type of error as much as possible, and with this in mind the work carried out consisted of finding standardized methods to identify such errors and, finally, to estimate their size. While the first part of the thesis focuses on the problems arising from CSG modelling, the second suggests an alternative to it: the use of Unstructured Meshes (UM), an approach that underlies CFD and FEM but is innovative in the context of Monte Carlo codes. In particular, UM were applied to a portion of the Upper Launcher (an ITER component) in order to validate this methodology on nuclear models of high complexity. The traditional CSG approach and the UM approach were compared in terms of required computational resources, speed, precision and accuracy, at the level of both global and local results. It emerges that, although some limits to the application of UM still exist, partly due to its novelty, several advantages can be attributed to this type of approach, including a more linear workflow, greater accuracy in local results and, above all, the future possibility of using the same mesh for different types of analysis (such as thermal or structural ones).

Relevance:

10.00%

Publisher:

Abstract:

This dissertation explores the practice of transcreation as a consulting service aimed at companies wishing to enter the global market. Since a universally accepted definition of such term does not exist, different players use it to refer to different activities. In an attempt to investigate the meaning and scope of transcreation, as well as the skillset it requires, this dissertation consists of a theoretical part (Chapters 1 and 2) and a practical part (Chapter 3). The first chapter presents the opinions of academics, language services providers (LSPs) and transcreation experts. The different positions collected in this section are compared and discussed in order to better define transcreation and avoid any further misunderstanding about the practice. Lastly, the first chapter analyses the role of the transcreation expert by explaining in detail the four main skills it requires and the reasons for its increasing importance in the global market. The second chapter examines advertising and promotional materials, i.e. the kinds of texts to which transcreation applies. Not only does it illustrate the difference between above-the-line and below-the-line communications, but it also covers the different media used in advertising. In addition, the analysis of a billboard and two web pages in their Italian transcreation will help to further clarify the difference between translation and transcreation, both in the approach to a text and in the actual workflow followed. The third and final chapter of the dissertation, which entails the English to Italian transcreation of five different print ads performed by this author, aims to show how transcreation works in practice. By highlighting the main strategies used and difficulties encountered, it will also contribute to the notion of transcreation as a hybrid practice – something halfway between translation and copywriting, performed by professionals who possess the skills of both translators and copywriters.

Relevance:

10.00%

Publisher:

Abstract:

This thesis aims to investigate some of the new frontiers offered by the syncretic, multidisciplinary growth of digital languages applied to architecture and cultural heritage. The fundamental theoretical concepts of digital information are examined in depth: the semantic web as an environment for exchange, metadata as information about data, and LOD (Linked Open Data) as both a standard and a goal. For the cultural heritage domain, the research and development topics in digital cataloguing and access are presented: ontologies, open controlled vocabularies, databases (the Catalogo Digitale), etc. For the building and architecture domain, semantic Heritage Building Information Modeling (HBIM) is introduced as a multidisciplinary methodology focused on the geometric survey, modelling, archiving and exchange of all the information useful for the knowledge and conservation of historic assets. The meeting point between the two worlds lies in the possibility of enriching the geometries by defining a semantics (parameters-metadata) linked to the information (values-data) held in the digital catalogues, effectively creating a 3D model of historic architecture that acts as a multidisciplinary database. The web-based platform Inception, developed by the startup of the same name incubated as a spin-off of the Università di Ferrara, is presented; among its various applications and potentials, it is used here as a tool for sharing and access, making it possible to query geometries and metadata in keeping with LOD principles. A general workflow is defined (Scan2BIM procedures, geometric modelling, definition of scripts for the automatic extraction of data from the Catalogo Digitale, data-geometry association, and upload to the platform) and is then applied and adapted to the specific needs of the case study: the Church of S. Maria delle Vergini (MC), commissioned by the ICCD, which reports to the MiBACT.
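The data-geometry association step of this workflow is, in essence, a join between catalogue records and model elements on a shared identifier. Below is a minimal sketch in which the catalogue fields, element IDs and parameter names are hypothetical placeholders, not the actual Catalogo Digitale schema or the Inception platform API.

```python
# Hypothetical catalogue records extracted from the digital catalogue
# (field names are placeholders, not the real Catalogo Digitale schema).
catalogue_records = {
    "SMV-001": {"denomination": "Portale principale", "material": "stone", "century": "XVI"},
    "SMV-002": {"denomination": "Altare maggiore", "material": "marble", "century": "XVII"},
}

# Hypothetical HBIM elements, each carrying a catalogue identifier
# assigned during Scan2BIM modelling.
hbim_elements = [
    {"element_id": "el_0148", "catalogue_id": "SMV-001", "parameters": {}},
    {"element_id": "el_0231", "catalogue_id": "SMV-002", "parameters": {}},
    {"element_id": "el_0305", "catalogue_id": None, "parameters": {}},  # not catalogued
]

def associate(elements, records):
    """Copy catalogue data into the semantic parameters of each matching element."""
    unmatched = []
    for element in elements:
        record = records.get(element["catalogue_id"])
        if record is None:
            unmatched.append(element["element_id"])
        else:
            element["parameters"].update(record)
    return unmatched

print("elements without catalogue data:", associate(hbim_elements, catalogue_records))
print(hbim_elements[0]["parameters"])   # now carries denomination/material/century
```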

Relevance:

10.00%

Publisher:

Abstract:

Additive Manufacturing (AM) is nowadays considered an important alternative to traditional manufacturing processes. The literature attributes several advantages to AM, such as design flexibility, and its use is increasing in automotive, aerospace and biomedical applications. As a systematic literature review suggests, AM is sometimes coupled with voxelization, mainly for representation and simulation purposes. Voxelization can be defined as a volumetric representation technique based on the discretization of the model with hexahedral elements, just as pixels discretize a 2D image. Voxels are used to simplify geometric representation, to store intricate interior details, and to speed up geometric and algebraic manipulation. Compared with the boundary representation used in common CAD software, the inherent advantages of voxels are magnified in specific applications such as lattice or topologically optimized structures for visualization or simulation purposes; such structures can only be manufactured with AM because of their complex topology. After an accurate review of the existing literature, this project aims to exploit the potential of voxelization algorithms to develop optimized Design for Additive Manufacturing (DfAM) tools. The final aim is to manipulate and support mechanical simulations of lightweight, optimized structures that are ready to be manufactured with AM, with particular attention to automotive applications. A voxel-based methodology is developed for efficient structural simulation of lattice structures. Moreover, thanks to an optimized smoothing algorithm specific to voxel-based geometries, a topologically optimized and voxelized structure can be transformed into a surface-triangulated mesh file ready for the AM process. In addition, a modified panel code is developed for simple CFD simulations that use voxels as the discretization unit, in order to understand the fluid-dynamic performance of industrial components for preliminary aerodynamic evaluation. The developed design tools and methodologies perfectly fit the automotive industry's needs, accelerating and increasing the efficiency of the design workflow from the conceptual idea to the final product.
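As a concrete illustration of the basic operation, the sketch below voxelizes a point sampling of a part into a binary occupancy grid. It is a toy version of the idea (production pipelines voxelize triangle meshes and handle interior filling and smoothing); the geometry and resolution are arbitrary.

```python
import numpy as np

def voxelize_points(points, voxel_size):
    """Binary occupancy grid from sampled surface/volume points.

    points: (N, 3) array of coordinates; voxel_size: edge length of the cubic voxels.
    Returns the grid and the origin needed to map voxel indices back to coordinates.
    """
    origin = points.min(axis=0)
    indices = np.floor((points - origin) / voxel_size).astype(int)
    grid = np.zeros(indices.max(axis=0) + 1, dtype=bool)
    grid[tuple(indices.T)] = True
    return grid, origin

# Toy geometry: points sampled inside a sphere of radius 5 mm.
rng = np.random.default_rng(0)
pts = rng.uniform(-5, 5, size=(20000, 3))
pts = pts[np.linalg.norm(pts, axis=1) <= 5.0]

grid, origin = voxelize_points(pts, voxel_size=0.5)
print(grid.shape, "occupied voxels:", int(grid.sum()))
# Each occupied voxel can become a hexahedral element for structural simulation
# or a cell in a panel/CFD discretization.
```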

Relevance:

10.00%

Publisher:

Abstract:

Besides increasing the share of electric and hybrid vehicles, in order to comply with more stringent environmental protection limits, in the mid term the automotive industry must improve the efficiency of the internal combustion engine and the well-to-wheel efficiency of the fuel employed. Achieving this target requires deeper knowledge of the phenomena that influence mixture formation and of the chemical reactions involving new synthetic fuel components, which is complex and time-intensive to obtain purely by experimentation. Numerical simulations therefore play an important role in this development process, but they are effective only if they are accurate enough to capture these variations. The models most relevant to simulating reacting mixture formation and the subsequent chemical reactions are investigated in the present work with a critical approach, in order to provide instruments for defining the most suitable approaches in an industrial context as well, which is constrained by time and budget. To overcome these limitations, new methodologies have been developed to combine detailed and simplified modelling techniques for the phenomena involving chemical reactions and mixture formation in non-traditional conditions (e.g. water injection, biofuels, etc.). Thanks to the extensive use of machine learning and deep learning algorithms, several applications have been revised or implemented with the goal of reducing the computing time of some traditional tasks by orders of magnitude. Finally, a complete workflow leveraging these new models has been defined and used to evaluate the effects of different surrogate formulations of the same experimental fuel on a proof-of-concept GDI engine model.
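One common way such machine-learning speedups are obtained is to train a fast surrogate that emulates an expensive calculation over the operating range of interest and then query the surrogate inside the engine workflow. The sketch below shows this generic pattern with scikit-learn on a synthetic stand-in for an expensive chemistry calculation; the inputs, target and model choice are illustrative assumptions, not the methods actually used in the thesis.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for an expensive calculation: a target quantity
# (a flame-speed-like value) as a function of pressure, unburned-gas
# temperature and equivalence ratio.
rng = np.random.default_rng(3)
X = np.column_stack([
    rng.uniform(1, 80, 5000),      # pressure [bar]
    rng.uniform(300, 900, 5000),   # unburned temperature [K]
    rng.uniform(0.6, 1.4, 5000),   # equivalence ratio [-]
])
y = 0.4 * (X[:, 1] / 300) ** 2 / np.sqrt(X[:, 0]) * np.exp(-4 * (X[:, 2] - 1.05) ** 2)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(64, 64),
                                       max_iter=2000, random_state=0))
surrogate.fit(X_train, y_train)
print(f"R^2 on held-out points: {surrogate.score(X_test, y_test):.3f}")
# Once trained, evaluating the surrogate is orders of magnitude faster than
# rerunning the detailed calculation it was fitted to.
```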

Relevance:

10.00%

Publisher:

Abstract:

The research project aims to improve the Design for Additive Manufacturing of metal components. Firstly, the scenario of Additive Manufacturing is depicted, describing its role in Industry 4.0 and focusing in particular on Metal Additive Manufacturing technologies and applications in the automotive sector. Secondly, the state of the art in Design for Additive Manufacturing is described, contextualizing the methodologies and classifying guidelines, rules, and approaches. The key phases of product design and process design for achieving lightweight functional designs and reliable processes are examined in depth, together with the Computer-Aided Technologies that support the implementation of these approaches. On this basis, a general Design for Additive Manufacturing workflow based on product and process optimization has been systematically defined. From the analysis of the state of the art, a holistic approach was considered fundamental, and the use of integrated product-process design platforms was therefore evaluated as a key element for its development. Accordingly, a computer-based methodology exploiting integrated tools and numerical simulations to drive the product and process optimization has been proposed. A validation of CAD platform-based approaches has been performed, and the potential offered by integrated tools has been evaluated. Concerning product optimization, systematic approaches for integrating topology optimization into the design have been proposed and validated through the product optimization of an automotive case study. Concerning process optimization, the use of process simulation techniques to prevent manufacturing flaws related to the high thermal gradients of metal processes is developed, with case studies that validate the results against experimental data and an application to the process optimization of an automotive case study. Finally, an example of product and process design through the proposed simulation-driven integrated approach is provided to prove the method's suitability for effective redesigns of high-performance metal products based on Additive Manufacturing. The results are then outlined, and further developments are discussed.
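To make the integrated product-process loop more tangible, here is a schematic, runnable skeleton of such a simulation-driven workflow. Every class and function in it is a hypothetical placeholder standing in for a real CAD/CAE tool step; none of the names come from the thesis or from any specific platform.

```python
from dataclasses import dataclass

@dataclass
class BuildResult:
    """Placeholder outcome of a thermo-mechanical build (process) simulation."""
    max_distortion_mm: float
    def is_manufacturable(self, limit_mm: float = 0.5) -> bool:
        return self.max_distortion_mm <= limit_mm

def topology_optimize(geometry, load_cases):
    return geometry  # placeholder: would return a lightweight optimized shape

def simulate_build(geometry, process_parameters):
    # placeholder: would run a thermo-mechanical process simulation
    return BuildResult(max_distortion_mm=0.8 / process_parameters["n_support_passes"])

def apply_design_corrections(geometry, build_result):
    return geometry  # placeholder: supports, orientation, local thickening

def design_for_am(geometry, load_cases, process_parameters, max_iterations=5):
    for _ in range(max_iterations):
        geometry = topology_optimize(geometry, load_cases)      # product optimization
        result = simulate_build(geometry, process_parameters)   # process simulation
        if result.is_manufacturable():
            return geometry, process_parameters
        geometry = apply_design_corrections(geometry, result)   # feed findings back
        process_parameters["n_support_passes"] += 1             # adjust the process
    raise RuntimeError("no manufacturable design within the iteration budget")

print(design_for_am("bracket_v0", ["load_case_1"], {"n_support_passes": 1}))
```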