976 results for Industry applications
Abstract:
The classical way of managing product development processes for mass production appears to be changing: strong pressure for cost reduction, higher quality standards and markets demanding innovation create the need for new development-control tools. In this context, and learning from the automotive and aerospace industries, factories in other segments are starting to understand and apply manufacturing- and assembly-oriented design to ease the task of producing goods and thereby obtain at least part of the expected results. This paper demonstrates the applicability of the concepts of Concurrent Engineering and DFM/DFA (Design for Manufacturing and Assembly) to the development of products and parts for the White Goods industry in Brazil (major appliances such as refrigerators, cookers and washing machines), presenting one case concerning the development and release of a component. Finally, it shows briefly how a solution that provided cost savings and a shorter time to delivery was reached using those techniques.
Abstract:
Spent coffee grounds (SCG), the residue obtained from the treatment of coffee with hot water or steam, can be used for industrial applications due to their high lipid content. Cosmetic products might be a suitable application for this type of residue, because the barrier properties of the stratum corneum (SC) are largely dependent on the intactness of the lipid lamellae that surround the corneocytes. The purpose of this work was to assess the feasibility of using the lipid fraction of SCG, extracted with supercritical carbon dioxide, in the development of new cosmetic formulations that improve skin lipid (sebum) levels and hydration. The use of spent coffee lipid extract in the cosmetic industry seems to be a suitable approach to recycling waste from the coffee industry. An emulsion containing 10% of the lipid fraction of SCG (SpentCofOil cream) showed promising improvement of skin sebum levels and good acceptance by consumers when compared with an emulsion containing 10% w/w of green coffee oil (GreenCofOil cream) and a placebo without coffee oil (NoCofOil cream). Practical applications: In this work, the authors develop and characterize a cream containing 10% of the lipid fraction of SCG, extracted with supercritical carbon dioxide, that improves skin lipid (sebum) levels and hydration. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Abstract:
The worldwide demand for clean, low-fuel-consumption transport promotes the development of safe, high-energy and high-power electrochemical storage and conversion systems. Lithium-ion batteries (LIBs) are considered today the best technology for this application, as demonstrated by the recent interest of the automotive industry in hybrid (HEV) and electric vehicles (EV) based on LIBs. This thesis work, starting from the synthesis and characterization of electrode materials and the use of non-conventional electrolytes, demonstrates that LIBs with novel, safe electrolytes and electrode materials meet the specific energy and power targets established by the U.S. Department of Energy (DOE) for automotive application in HEVs and EVs. Chapter 2 reports the origin of all chemicals used and describes the instruments used for synthesis and chemical-physical characterization, the electrode preparation, the battery configurations and the electrochemical characterization procedures for electrodes and batteries. Since the electrolyte is the main critical point of a battery, particularly in large-format modules, Chapter 3 focuses on the characterization of innovative, safe electrolytes based on ionic liquids (characterized by high boiling/decomposition points, thermal and electrochemical stability and appreciable conductivity) and on mixtures of ionic liquid with a conventional electrolyte. Chapter 4 discusses the microwave-accelerated sol-gel synthesis of carbon-coated lithium iron phosphate (LiFePO4-C), an excellent cathode material for LIBs thanks to its intrinsic safety and tolerance to abusive conditions, which showed excellent electrochemical performance in terms of specific capacity and stability. Chapter 5 presents the chemical-physical and electrochemical characterization of graphite and titanium-based anode materials in different electrolytes. We also characterized a new anode material, an amorphous SnCo alloy synthesized with a nanowire morphology, which strongly enhanced the electrochemical stability of the material during galvanostatic full charge/discharge cycling. Finally, Chapter 6 reports different types of batteries, assembled with the LiFePO4-C cathode material and different anode materials and electrolytes, characterized by deep galvanostatic charge/discharge cycles at different C-rates and by the DOE test procedures for evaluating pulse power capability and available energy. First, we tested a battery with the innovative LiFePO4-C cathode material, a conventional graphite anode and a carbonate-based electrolyte (EC DMC LiPF6 1M), which easily surpassed the target for power-assist HEV application. Since the main concern with conventional lithium-ion batteries is the flammability of the highly volatile organic carbonate-based electrolytes, we made safe batteries with electrolytes based on ionic liquids (ILs). In order to use a graphite anode in an IL electrolyte, we added 10% w/w of vinylene carbonate (VC) to the IL, which produces a stable SEI (solid electrolyte interphase) and prevents graphite exfoliation. We then assembled batteries with a LiFePO4-C cathode, a graphite anode and PYR14TFSI 0.4m LiTFSI with 10% w/w of VC that overcame the DOE targets for HEV application and were stable for over 275 cycles.
We also assembled and characterized "high-safety" batteries with electrolytes based on the pure IL, PYR14TFSI with 0.4m LiTFSI as lithium salt, and on a mixture of this IL with the standard electrolyte (PYR14TFSI 50% w/w and EC DMC LiPF6 50% w/w), using titanium-based anodes (TiO2 and Li4Ti5O12), which are commonly considered safer than graphite under abusive conditions. The batteries with the pure ionic liquid did not satisfy the targets for HEV application, but the batteries with a Li4Ti5O12 anode and the 50-50 mixture electrolyte were able to surpass the targets. We also assembled and characterized a lithium battery (with a lithium metal anode) with a polymeric electrolyte based on poly(ethylene oxide) (PEO20-LiCF3SO3 + 10% ZrO2), which satisfied the targets for EV application and showed very impressive cycling stability. In conclusion, we developed three lithium-ion batteries of different chemistries that proved suitable for application in power-assist hybrid vehicles: graphite/EC DMC LiPF6/LiFePO4-C, graphite/PYR14TFSI 0.4m LiTFSI with 10% VC/LiFePO4-C and Li4Ti5O12/PYR14TFSI 50%-EC DMC LiPF6 50%/LiFePO4-C. We also demonstrated that an all-solid-state polymer lithium battery such as Li/PEO20-LiCF3SO3 + 10% ZrO2/LiFePO4-C is suitable for application in electric vehicles. Furthermore, we developed a promising anode material alternative to graphite, based on an amorphous SnCo alloy.
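For context on the DOE figures of merit cited in this abstract, the block below states the standard textbook definitions of gravimetric specific energy and specific power obtained from a galvanostatic discharge; this is generic background, not the thesis's own notation or data.

    % Standard definitions (generic background, not taken from the thesis):
    % a cell of mass m discharged at current I(t) with terminal voltage V(t)
    % over the interval [0, t_d].
    E_{\mathrm{spec}} = \frac{1}{m}\int_{0}^{t_d} V(t)\,I(t)\,\mathrm{d}t
    \qquad
    P_{\mathrm{spec}} = \frac{V(t)\,I(t)}{m}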
Abstract:
In recent decades, the building materials and construction industry has contributed to a great extent to the impact on our environment. As it is considered one of the key areas in which to act in order to significantly reduce our environmental footprint, there is widespread belief that particular attention now has to be paid, and specific measures taken, to limit the use of non-renewable resources. The aim of this thesis is therefore to study and evaluate sustainable alternatives to commonly used building materials, mainly based on ordinary Portland cement, and to find a viable path to reduce CO2 emissions and promote the re-use of waste materials. More specifically, this research explores different solutions for replacing cementitious binders in distinct application fields, particularly where special and more restrictive requirements apply, such as the restoration and conservation of architectural heritage. Emphasis was thus placed on aspects and implications more closely related to the concept of non-invasiveness and environmental sustainability. A first part of the research addressed the study and development of sustainable inorganic matrices, based on lime putty, for the pre-impregnation and on-site binding of continuous carbon fiber fabrics for structural rehabilitation and heritage restoration. Moreover, with the aim of further limiting the exploitation of non-renewable resources, the synthesis of chemically activated silico-aluminate materials, such as metakaolin, ladle slag or fly ash, was successfully achieved. New sustainable binders were hence proposed as novel building materials, suitable for use as the primary component of construction and repair mortars, as bulk materials in high-temperature applications or as matrices for high-toughness fiber-reinforced composites.
Abstract:
Nanotechnologies are rapidly expanding because of the opportunities that the new materials offer in many areas, such as the manufacturing industry, food production, processing and preservation, and the pharmaceutical and cosmetic industries. The size distribution of nanoparticles determines their properties and is a fundamental parameter that needs to be monitored from small-scale synthesis up to bulk production and quality control of nanotech products on the market. A consequence of the increasing number of applications of nanomaterials is that the EU regulatory authorities are introducing an obligation for companies that make use of nanomaterials to acquire analytical platforms for the assessment of the size parameters of the nanomaterials. In this work, Asymmetrical Flow Field-Flow Fractionation (AF4) and Hollow Fiber Flow Field-Flow Fractionation (HF5), hyphenated with Multiangle Light Scattering (MALS), are presented as tools for a deep functional characterization of nanoparticles. In particular, the applicability of AF4-MALS for the characterization of liposomes in a wide range of media is demonstrated. The technique is then used to explore the functional features of a liposomal drug vector in terms of its biological and physical interaction with blood serum components: a comprehensive approach to understanding the behavior of lipid vesicles in terms of drug release and fusion/interaction with other biological species is described, together with the strengths and weaknesses of the method. Afterwards, the size characterization, size stability, and conjugation of azidothymidine drug molecules with a new generation of metastable drug vectors, the Metal Organic Frameworks, are discussed. Lastly, the applicability of HF5-ICP-MS for the rapid screening of samples of relevant nanorisk is shown: rather than a deep and comprehensive characterization, a quick and smart methodology is presented that, within a few steps, provides qualitative information on the content of metallic nanoparticles in tattoo ink samples.
Abstract:
Small, smaller, nano - it is a milestone in the development of new materials and technologies. Nanoscience is now present in our daily lives: in the car industry with self-cleaning surfaces, in medicine with cancer therapies; even our clothes and cosmetics utilize nanoparticles. The number and variety of applications has been growing fast in recent years, and the possibilities seem almost infinite. Nanoparticles made of inorganic materials have found applications in new electronic technologies, and organic nanomaterials have been added to resins to produce very strong but lightweight materials. This work deals with the combination of organic and inorganic materials for the fabrication of new, functional hybrid systems. For that purpose, block copolymers were made with a long, solubility-enhancing and semiconducting block and a short anchor block. They were synthesized by either RAFT polymerization or Siegrist polycondensation. For the second block, an active ester was grafted on and subsequently reacted with the anchor molecules in a polymer-analogous reaction. The resulting block copolymers had different properties; poly(para-phenylene vinylene) showed self-assembly in organic solvents, which resulted in gelling of the solution. The fibers from a diluted solution were visible by microscopy. When polymer chains were attached to TiO2 nanorods, the hybrids could be integrated into polymer fibers. A light-induced charge separation was demonstrated by KPFM: the polymer charged positively and the charge could travel along the fibers for several hundred nanometers. Polymers made via RAFT polymerization were based on poly(vinyltriphenylamine). Ruthenium chromophores carrying anchor groups were attached to the second block. These novel block copolymers were then attached to ZnO nanorods, and a light-induced charge separation was also demonstrated in this system. The ability to disperse inorganic nanoparticles within the film is another advantage of these block copolymers. This was shown with the example of CdSe tetrapods. Poly(vinyltriphenylamine dimer) with disulfide anchor groups was attached to CdSe tetrapods. These four-armed nanoparticles are expected to show very high charge transport. A polymer without anchor groups was also mixed with the tetrapods in order to investigate the influence of the anchor groups. It was shown that without them no good films were formed and the tetrapods aggregated heavily in the samples. Additionally, a large difference in film quality and tetrapod aggregation was found in the samples of the polymer with anchor groups, depending on the tetrapod arm length and the polymer loading. These systems are very interesting for hybrid solar cells. This work also presents similar systems with quantum dots. The influence of the energy level of the polymer on the hole transport from the polymer to the quantum dots, as well as on the efficiency of QLEDs, was studied. For this purpose two different polymers with different HOMO levels were synthesized. It was clearly shown that the polymer with the adjusted, lower HOMO level had better hole injection into the quantum dots, which resulted in more efficient light-emitting diodes. These systems all have in common the fact that novel, specially designed polymers were attached to inorganic nanocrystals. All of these hybrid materials show fascinating properties and are helpful in the research of new materials for optoelectronic applications.
Abstract:
BACKGROUND: Not all clinical trials are published, which may distort the evidence that is available in the literature. We studied the publication rate of a cohort of clinical trials and identified factors associated with publication and nonpublication of results. METHODS: We analyzed the protocols of randomized clinical trials of drug interventions submitted to the research ethics committee of the University Hospital (Inselspital) Bern, Switzerland from 1988 to 1998. We identified full articles published up to 2006 by searching the Cochrane CENTRAL database (issue 02/2006) and by contacting investigators. We analyzed factors associated with the publication of trials using descriptive statistics and logistic regression models. RESULTS: 451 study protocols and 375 corresponding articles were analyzed. 233 protocols resulted in at least one publication, a publication rate of 52%. A total of 366 (81%) trials were commercially funded and 47 (10%) had non-commercial funding. 346 trials (77%) were multi-centre studies and 272 of these (79%) were international collaborations. In the adjusted logistic regression model, non-commercial funding (Odds Ratio [OR] 2.42, 95% CI 1.14-5.17), multi-centre status (OR 2.09, 95% CI 1.03-4.24), international collaboration (OR 1.87, 95% CI 0.99-3.55) and a sample size above the median of 236 participants (OR 2.04, 95% CI 1.23-3.39) were associated with full publication. CONCLUSIONS: In this cohort of applications to an ethics committee in Switzerland, only about half of the clinical drug trials were published. Large multi-centre trials with non-commercial funding were more likely to be published than other trials, but most trials were funded by industry.
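To illustrate how adjusted odds ratios of the kind reported above are typically derived, here is a minimal sketch of a multivariable logistic regression; the data are randomly generated and the variable names are hypothetical, so it only mirrors the structure of the analysis, not the study's actual dataset or code.

    # Illustrative sketch only: random data, hypothetical variable names.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 451  # number of protocols, as in the cohort above
    df = pd.DataFrame({
        "published":     rng.integers(0, 2, n),  # outcome: full publication (0/1)
        "noncommercial": rng.integers(0, 2, n),  # non-commercial funding (0/1)
        "multicentre":   rng.integers(0, 2, n),  # multi-centre status (0/1)
        "international": rng.integers(0, 2, n),  # international collaboration (0/1)
        "large_sample":  rng.integers(0, 2, n),  # sample size above the median (0/1)
    })

    X = sm.add_constant(df[["noncommercial", "multicentre", "international", "large_sample"]])
    fit = sm.Logit(df["published"], X).fit(disp=False)

    # Adjusted odds ratios with 95% confidence intervals: exponentiate the coefficients.
    ci = fit.conf_int()
    or_table = pd.DataFrame({"OR": np.exp(fit.params),
                             "CI 2.5%": np.exp(ci[0]),
                             "CI 97.5%": np.exp(ci[1])})
    print(or_table.drop(index="const"))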
Abstract:
Laser Welding (LW) is increasingly used in manufacturing due to its advantages, such as accurate control, good repeatability, low heat input, opportunities for joining special materials, high speed, the capability to join small parts, etc. LW is well suited to robotized manufacturing, and fabrication cells use various levels of flexibility, from specialized robots to very flexible setups. This paper features several LW applications using two industrially-scaled manufacturing cells at the UPM Laser Centre (CLUPM) of the Polytechnical University of Madrid (Universidad Politécnica de Madrid). The first, dedicated to Remote Laser Welding (RLW) of thin sheets for the automotive and other sectors, uses a 3500 W CO2 laser. The second, which is highly flexible, is based on a 6-axis ABB robot and a 3300 W Nd:YAG laser, and is meant for various laser processing methods, including welding. After a short description of each cell, several LW applications tested at CLUPM and recently implemented in industry are briefly presented: RLW of coated automotive sheets, LW of high-strength automotive sheets, LW vs. laser hybrid welding (LHW) of Double Phase steel thin sheets, and LHW of thin sheets of stainless steel and carbon steel (dissimilar joints). The main technological issues overcome and the critical process parameters are pointed out. Conclusions about achievements and trends are provided.
Abstract:
Enabling real end-user programming is the next logical stage in the evolution of Internet-wide service-based applications. Even so, the vision of end users programming their own web-based solutions has not yet materialized. This will continue to be so unless both industry and the research community rise to the ambitious challenge of devising an end-to-end compositional model for developing a new age of end-user web application development tools. This paper describes a new composition model designed to empower programming-illiterate end users to create and share their own off-the-shelf rich Internet applications in a fully visual fashion. It presents the main insights and outcomes of our research and development efforts as part of a number of successful European Union research projects. A framework implementing this model was developed as part of the European Seventh Framework Programme FAST Project and the Spanish EzWeb Project and allowed us to validate the rationale behind our approach.
Abstract:
A high productivity rate in Engineering is related to the efficient management of the flow of the large quantities of information, and the associated decision-making activities, that are inherent to Engineering processes in both design and production contexts. Dealing with such problems from an integrated point of view and mimicking real scenarios is not given much attention in Engineering degrees. In the context of Engineering education, there are a number of courses designed for developing specific competencies, as required by the academic curricula, but not that many in which integration competencies are the main target. In this paper, a course devoted to that aim is discussed. The course is taught in a Marine Engineering degree, but the philosophy could be used in any Engineering field. All the lessons are given in a computer room in which every student can use all of the software applications covered. The first part of the course is dedicated to Project Management: the students acquire skills in defining, using MS-PROJECT, the work breakdown structure (WBS) and the organization breakdown structure (OBS) of Engineering projects, through a series of examples of increasing complexity, ending with the case of vessel construction. The second part of the course is dedicated to the use of a database manager, MS-ACCESS, for managing production-related information. A series of examples of increasing complexity is treated, ending with the management of the pipe database of a real vessel. This database consists of a few thousand pipes, for which a production timing frame is defined, which connects this part of the course with the first one. Finally, the third part of the course is devoted to working with FORAN, an Engineering Production package in widespread use in the shipbuilding industry. With this package, the frames and plates where all the outfitting will be carried out are defined through cooperative work by the students, working simultaneously on the same 3D model. In the paper, specific details about the learning process are given. Surveys have been posed to the students in order to get feedback on their experience as well as to assess their satisfaction with the learning process. Results from these surveys are discussed in the paper.
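As a flavour of the production-data management exercise described in the second part of the course, here is a minimal sketch of a pipe table linked to a production timing frame, using SQLite in place of MS-ACCESS; the schema and data are hypothetical, not the real vessel database used in the course.

    # Minimal sketch: a toy pipe-production table queried by planned week.
    # SQLite stands in for MS-ACCESS; schema and data are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE pipes (
            pipe_id      TEXT PRIMARY KEY,
            block        TEXT,     -- hull block where the pipe is installed
            diameter_mm  REAL,
            length_m     REAL,
            planned_week INTEGER   -- production timing frame (links to the WBS schedule)
        )
    """)
    conn.executemany(
        "INSERT INTO pipes VALUES (?, ?, ?, ?, ?)",
        [("P-0001", "B12", 50.0, 3.2, 14),
         ("P-0002", "B12", 80.0, 1.7, 14),
         ("P-0003", "B07", 25.0, 6.5, 16)],
    )

    # Workload per planned week: number of pipes and total length to fabricate.
    rows = conn.execute(
        "SELECT planned_week, COUNT(*), SUM(length_m) "
        "FROM pipes GROUP BY planned_week ORDER BY planned_week"
    )
    for week, n_pipes, total_length in rows:
        print(f"week {week}: {n_pipes} pipes, {total_length:.1f} m of pipe")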
Abstract:
Cadmium has been widely used as a coating to provide protection against galvanic corrosion for steels and for its natural lubricity on threaded applications. However, it is a toxic metal and a known carcinogenic agent, which is plated from an aqueous bath containing cyanide salts. For these reasons, the use of cadmium has been banned in Europe for most industrial applications. However, the aerospace industry is still exempt due to the stringent technical and safety requirements associated with aeronautical applications, as an acceptable replacement is yet to be found. Al slurry coatings have been developed as an alternative to replace cadmium coatings. The coatings were deposited on AISI 4340 steel and have been characterized by optical and electron microscopy. Testing included salt fog corrosion exposure, fluid corrosion exposure (immersion), humidity resistance, coating-substrate and paint-coating adhesion, electric conductivity, galvanic corrosion, embrittlement and fatigue. The results indicated that Al slurry coatings are an excellent alternative for Cd replacement.
Abstract:
Nowadays, Computational Fluid Dynamics (CFD) solvers are widely used within industry to model fluid flow phenomena. Several fluid flow model equations have been employed in recent decades to simulate and predict the forces acting, for example, on different aircraft configurations. Computational time and accuracy depend strongly on the fluid flow model equation and on the spatial dimension of the problem considered. While simple models based on perfect flows, such as panel methods or potential flow models, can be very fast to solve, they usually suffer from poor accuracy when simulating real (transonic, viscous) flows. On the other hand, more complex models such as the full Navier-Stokes equations provide high-fidelity predictions, but at a much higher computational cost. Thus, a good compromise between accuracy and computational time has to be fixed for engineering applications. A discretisation technique widely used within industry is the so-called Finite Volume approach on unstructured meshes. This technique spatially discretises the flow motion equations onto a set of elements which form a mesh, a discrete representation of the continuous domain. Using this approach, for a given flow model equation, the accuracy and computational time depend mainly on the distribution of the nodes forming the mesh. Therefore, a good compromise between accuracy and computational time might be obtained by carefully defining the mesh. However, defining an optimal mesh for complex flows and geometries requires a very high level of expertise in fluid mechanics and numerical analysis, and in most cases it is impossible simply to guess which regions of the computational domain affect the accuracy the most. Thus, it is desirable to have an automated remeshing tool, which is more flexible with unstructured meshes than with their structured counterpart. However, adaptive methods currently in use still leave an open question: how to drive the adaptation efficiently? Pioneering sensors based on flow features generally suffer from a lack of reliability, so in the last decade more effort has been put into developing numerical error-based sensors, such as adjoint-based adaptation sensors. While very efficient at adapting meshes for a given functional output, the latter method is very expensive, as it requires solving a dual set of equations and computing the sensor on an embedded mesh. It would therefore be desirable to develop a more affordable numerical error estimation method. The present work aims at estimating the truncation error, which arises when discretising a partial differential equation; it consists of the higher-order terms neglected in the construction of the numerical scheme. The truncation error provides very useful information, as it is strongly related to the flow model equation and its discretisation. On the one hand, it is a very reliable measure of the quality of the mesh, and therefore very useful for driving a mesh adaptation procedure. On the other hand, it is strongly linked to the flow model equation, so that a careful estimation actually gives information on how well a given equation is solved, which may be useful in the context of τ-extrapolation or zonal modelling. This work is organized as follows: Chap. 1 contains a short review of mesh adaptation techniques as well as numerical error prediction. In the first section, Sec. 1.1, the basic refinement strategies are reviewed and the main contributions to structured and unstructured mesh adaptation are presented.
Sec. 1.2 introduces the definitions of the errors encountered when solving Computational Fluid Dynamics problems and reviews the most common approaches to predict them. Chap. 2 is devoted to the mathematical formulation of truncation error estimation in the context of the finite volume methodology, as well as to a complete verification procedure. Several features are studied, such as the influence of grid non-uniformities, non-linearity, boundary conditions and non-converged numerical solutions. This verification part has been submitted and accepted for publication in the Journal of Computational Physics. Chap. 3 presents a mesh adaptation algorithm based on truncation error estimates and compares the results to a feature-based and an adjoint-based sensor (in collaboration with Jorge Ponsín, INTA). Two- and three-dimensional cases relevant for validation in the aeronautical industry are considered. This part has been submitted and accepted in the AIAA Journal. An extension to the Reynolds-Averaged Navier-Stokes equations is also included, in which τ-estimation-based mesh adaptation and τ-extrapolation are applied to viscous wing profiles. The latter has been submitted to the Proceedings of the Institution of Mechanical Engineers, Part G: Journal of Aerospace Engineering. Keywords: mesh adaptation, numerical error prediction, finite volume
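For reference, the quantity at the heart of this thesis can be stated compactly; the following is the standard generic definition of the local truncation error of a discretisation, written here for a one-dimensional model problem rather than in the thesis's exact notation.

    % Generic definition: L is the exact differential operator of the PDE
    % L(u) = f, and L_h its discrete (e.g. finite volume) counterpart on a
    % mesh of characteristic size h. Evaluating L_h on the exact solution u
    % gives the local truncation error
    \tau_h = L_h(u) - f = L_h(u) - L(u),
    % which, for a scheme of order p on a smooth 1-D problem, collects the
    % neglected higher-order Taylor terms:
    \tau_h = C\,h^{p}\,\frac{\partial^{p+1} u}{\partial x^{p+1}} + O\!\left(h^{p+1}\right).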
Abstract:
Mobile games are a prime example of a successful mobile application and demonstrate the increasing range of platforms for the media and entertainment industries. Against this convergent background, this paper introduces the basic features of the mobile gaming market and its industrial ecosystem together with its main actors and activities. The focus of the paper lies in the challenges ahead for the evolution of mobile applications into a potentially dominant game platform and the possible disruptions along this road. The deep personal relationships between users and their mobile devices are considered to further explore the link between mobile games, players’ strategies and pending techno-economic developments. The paper concludes with a brief discussion of some policy options to assist with the development of this domain.
Abstract:
Background: Europe is living in an unsustainable situation.
The economic crisis has been reducing governments' economic resources since 2008 and threatening social and health systems, while the proportion of older people in the European population continues to increase, so that it is foreseen that in 2050 there will be only two workers per retiree [54]. To this should be added the rise, strongly related to age, of chronic diseases, the burden of which has been estimated at up to 7% of a country's gross domestic product [51]. There is a need for a paradigm shift, for a new way of caring for people's health, shifting the focus from curing conditions that have already arisen to a sustainable and effective approach with the emphasis on prevention. Some advocate the adoption of personalised health care (pHealth), a model where medical practices are tailored to the patient's unique life, from the detection of risk factors to the customization of treatments based on each individual's response [81]. Personalised health is often associated with the use of Information and Communications Technology (ICT), which, with its exponential development, offers interesting opportunities for improving healthcare. The shift towards pHealth is slowly taking place, both in research and in industry, but the change is not yet significant. Many barriers still exist related to economy, politics and culture, while others are purely technological, such as the lack of interoperable information systems [199]. Though interoperability aspects are evolving, there is still a need for a reference design, especially one tackling the implementation and large-scale deployment of pHealth systems. This thesis contributes to organizing the subject of ICT systems for personalised health into a reference model that allows for the creation of software development platforms to ease common development issues in the domain. Research questions: RQ1 Is it possible to define a model, based on software engineering techniques, for representing the personalised health domain in an abstract and representative way? RQ2 Is it possible to build a development platform based on this model? RQ3 Does the development platform help developers create complex integrated pHealth systems? Methods: As the method for describing the model, the ISO/IEC/IEEE 42010 framework [25] was adopted for its generality and high level of abstraction. The model is specified in different parts: a conceptual model, which makes use of concept maps, for representing stakeholders, artefacts and shared information, and scenarios and use cases for the representation of the functionalities of pHealth systems. The model was derived from a literature analysis including 7 industrial and scientific reports, 9 standards, 10 conference papers, 37 journal papers, 25 websites and 5 books. Based on the reference model, requirements were drawn up for building the development platform, enriched with a set of requirements gathered in a survey of 11 experienced engineers. For developing the platform, the continuous integration methodology [74] was adopted, which allowed automatic tests to be performed on a server and packaged releases to be deployed on a web site. As a validation methodology, a theory-building framework for software engineering was adopted from [181]. The framework, chosen as a guide to finding evidence for justifying the research questions, required the creation of theories based on models and propositions to be validated within a defined scope.
The validation of the model was conducted as an on-line survey in three rounds, encompassing a growing number of participants. The survey was submitted to 134 experts in the field and distributed on some public channels such as relevant mailing lists and social networks. Its objective was to assess the model's readability, its level of coverage of the domain and its potential usefulness in the design of actual, derived systems. The questionnaires included quantitative Likert-scale questions and free-text fields for comments. The development platform was validated in two scopes. As a small-scale experiment, the platform was used in a 12-hour training session in which 4 developers had to perform an exercise consisting in developing a set of typical pHealth use cases. At the end of the session, a focus group was held to identify benefits and drawbacks of the platform. The second validation was carried out as a test-case study in a large-scale research project called HeartCycle, the aim of which was to develop a closed-loop disease management system for heart failure and coronary heart disease patients [160]. During this project three applications were developed by a team of programmers and designers. One of these applications was tested in a clinical trial with actual patients. At the end of the project, the team was interviewed in a focus group to assess the role the platform had played within the project. Results: As regards the model that describes the pHealth domain, its conceptual part includes a description of the main roles and concerns of pHealth stakeholders, a model of the ICT artefacts that are commonly adopted and a model representing the typical data that need to be formalized and exchanged among pHealth systems. The functional model includes a set of 18 scenarios, divided into the assisted person's view, caregiver's view, developer's view, technology and services providers' view and authority's view, and a set of 52 use cases grouped into 6 categories: assisted person's activities, system reactions, caregiver's activities, user engagement, developer's activities and deployer's activities. Concerning the validation of the model, a total of 65 people participated in the online survey, providing their level of agreement on all the assessed dimensions and a total of 248 comments on how to improve and complete the model. Participants' backgrounds spanned from engineering and software development (70%) to medical specialities (15%), with declared interest in the fields of eHealth (24%), mHealth (16%), Ambient Assisted Living (21%), Personalized Medicine (5%), Personal Health Systems (15%), Medical Informatics (10%) and Biomedical Engineering (8%), with an average of 7.25±4.99 years of experience in these fields. From the analysis of the answers it can be observed that the contacted experts considered the model easily readable (average of 1.89±0.79, 1 being the most favourable score and 5 the worst), sufficiently abstract (1.99±0.88) and formal (2.13±0.77) for its purpose, with sufficient coverage of the domain (2.26±0.95), useful for describing the domain (2.02±0.7) and for generating more specific systems (2±0.75), and they reported a partial interest in using the model in their job (2.48±0.91). Thanks to their comments, the model was improved and enriched with concepts that were missing at the beginning; nonetheless it was not possible to prove an improvement across the iterations, due to the diversity of the participants in the three rounds.
From the model, a development platform for the pHealth domain was generated, called the pHealth Patient Platform (pHPP). The platform includes a set of libraries, programming and deployment tools, a tutorial and a sample application. The four main modules of the architecture are: the Data Collection Engine, which allows abstracting sources of information like sensors or external services, mapping data to databases and ontologies, and allowing event-based interaction and filtering; the GUI Engine, which abstracts the user interface in a message-like interaction model; the Workflow Engine, which allows programming the application's user interaction flows with graphical workflows; and the Rule Engine, which gives developers a simple means of programming the application's logic in the form of "if-then" rules. After the five-year experience of HeartCycle, partially programmed with pHPP, 5 developers were brought together in a focus group to discuss the advantages and drawbacks of the platform. The view that emerged from the training course and the focus group was that the platform is well suited to the needs of the engineers working in the field: it allowed the separation of concerns among the different specialities and it simplified some common development tasks such as data management and asynchronous interaction. Nevertheless, some deficiencies were pointed out in terms of a lack of maturity of some technological choices, and the absence of some domain-specific tools, e.g. for data processing or for health-related communication protocols. Within HeartCycle, the platform was used to develop part of the Guided Exercise system, a composition of ICT tools for the physical rehabilitation of patients who have suffered a myocardial infarction. The system developed using the platform was tested in a randomized controlled clinical trial in which 55 patients used the system for 21 weeks. The technical results of this trial showed that the system was stable and reliable. Some minor bugs were detected, but these were promptly corrected using the platform. This shows that the platform, as well as facilitating the development task, can be successfully used to produce reliable software. Conclusions: The research work carried out in developing this thesis provides responses to the three research questions that motivated the work. RQ1 A model was developed representing the domain of personalised health systems, and the assessment of experts in the field was that it represents the domain accurately, with an appropriate balance between abstraction and detail. RQ2 A development platform based on the model was successfully developed. RQ3 The platform has been shown to assist developers in creating complex pHealth software. This was demonstrated within the scope of one large-scale project, but the generic approach adopted provides indications that it would offer benefits more widely. The results of these evaluations provide indications that both the model and the platform are good candidates for becoming a reference for future pHealth developments.
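As an illustration of the "if-then" rule style that the Rule Engine is described as offering, the sketch below shows one plausible way such event-driven rules could be expressed; the API is invented for illustration and is not the actual pHPP interface.

    # Hypothetical sketch of event-driven "if-then" rules, in the spirit of the
    # Rule Engine described above; this is NOT the actual pHPP API.
    from dataclasses import dataclass, field
    from typing import Callable, Dict, List, Tuple

    Rule = Tuple[Callable[[dict], bool], Callable[[dict], None]]

    @dataclass
    class RuleEngine:
        rules: Dict[str, List[Rule]] = field(default_factory=dict)

        def when(self, event: str, condition: Callable[[dict], bool],
                 action: Callable[[dict], None]) -> None:
            """Register an if-then rule triggered by a named event."""
            self.rules.setdefault(event, []).append((condition, action))

        def emit(self, event: str, data: dict) -> None:
            """Fire an event (e.g. a sensor sample) and run every matching rule."""
            for condition, action in self.rules.get(event, []):
                if condition(data):
                    action(data)

    engine = RuleEngine()
    # "If the measured heart rate exceeds 120 bpm, notify the caregiver."
    engine.when("heart_rate_sample",
                condition=lambda d: d["bpm"] > 120,
                action=lambda d: print(f"Alert caregiver: heart rate {d['bpm']} bpm"))
    engine.emit("heart_rate_sample", {"bpm": 135})  # triggers the alert
    engine.emit("heart_rate_sample", {"bpm": 80})   # no rule fires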
Abstract:
A course focused on the acquisition of integration competencies in ship production engineering, organized in collaboration with selected industry partners, is presented in this paper. The first part of the course is dedicated to Project Management: the students acquire skills in defining, using MS-PROJECT, the work breakdown structure (WBS) and the organization breakdown structure (OBS) of Engineering projects, through a series of examples of increasing complexity, with the final one being the construction planning of a vessel. The second part of the course is dedicated to the use of a database manager, MS-ACCESS, in managing production-related information. A series of examples of increasing complexity is treated, the final one being the management of the piping database of a real vessel. This database consists of several thousand pipes, for which a production timing frame is defined, connecting this part of the course with the first one. Finally, the third part of the course is devoted to working with FORAN, an Engineering Production application developed by SENER and widely used in the shipbuilding industry. With this application, the structural elements where all the outfittings will be located are defined through cooperative work by the students, working simultaneously on the same 3D model. In this paper, specific details about the learning process are given. Surveys have been posed to the students in order to get feedback from their experience as well as to assess their satisfaction with the learning process, compared to more traditional ones. Results from these surveys are discussed in the paper.