14 results for Minkowski metrics
at Universidade do Minho
Abstract:
Occupational risks in nanotechnology research laboratories are an important topic, since a great number of researchers are involved in this area. Risk assessment, performed by both qualitative and quantitative methods, is a necessary step for the management of occupational risks. Risk assessment can be performed by qualitative methods that gather consensus in the scientific community. It is also possible to use quantitative methods, based on different techniques and metrics, as indicative exposure limits are being established by several institutions. While performing the risk assessment, information on the materials used is very important and, if it is not up to date, it can bias the assessment results. The risk of exposure to TiO2 nanoparticles was assessed in a research laboratory using a quantitative exposure method and qualitative risk assessment methods. It was found that the results from the direct-reading Condensation Particle Counter (CPC) equipment and the CB Nanotool seem to be related and aligned, while the results obtained with the Stoffenmanager Nano seem to indicate a higher risk level.
Abstract:
Human activity is very dynamic and subtle, and most physical environments are also highly dynamic, supporting a vast range of social practices that do not map directly into any immediate ubiquitous computing functionality. Identifying what is valuable to people is very hard and obviously leads to great uncertainty regarding the type of support needed and the type of resources needed to create such support. We have addressed the issues of system development through the adoption of a crowdsourced software development model [13]. We have designed and developed Anywhere places, an open and flexible system support infrastructure for ubiquitous computing that is based on a balanced combination of global services and applications and situated devices. Evaluation, however, is still an open problem. The characteristics of ubiquitous computing environments make their evaluation very complex: there are no globally accepted metrics and it is very difficult to evaluate large-scale and long-term environments in real contexts. In this paper, we describe a first proposal for a hybrid 3D simulated prototype of Anywhere places that combines simulated and real components to generate a mixed reality which can be used to assess the envisaged ubiquitous computing environments [17].
Abstract:
This paper presents a proposal for a management model based on reliability requirements concerning Cloud Computing (CC). The proposal was based on a literature review focused on the problems, challenges and ongoing studies related to the safety and reliability of Information Systems (IS) in this technological environment. This literature review examined the existing obstacles and challenges from the point of view of respected authors on the subject. The main issues are addressed and structured as a model, called the "Trust Model for Cloud Computing environment". This is a proactive proposal that aims to organize and discuss management solutions for the CC environment, with a view to improved reliability of IS application operation for both providers and their customers. On the other hand, and central to trust, one of the CC challenges is the development of models for mutual audit management agreements, so that a formal relationship can be established involving the relevant legal responsibilities. To establish and control the appropriate contractual requirements, it is necessary to adopt technologies that can collect the data needed to inform risk decisions, such as access usage, security controls, location and other references related to the use of the service. In this process, cloud service providers and consumers themselves must have metrics and controls to support cloud-use management in compliance with the SLAs agreed between the parties. The organization of these studies, and their dissemination in the market as a conceptual model able to establish parameters to regulate a reliable relation between provider and user of IT services in the CC environment, is a useful instrument to guide providers, developers and users in delivering secure and reliable services and applications.
Abstract:
Although most of the accidents that occur in Olive Oil Mills (OOM) result from "basic" risks, there is a need to apply adequate tools to support risk decisions that can meet the specificities of this sector. This study aims to analyse the views of Occupational Safety & Health (OSH) practitioners about the risk assessment process in OOM, identifying the key difficulties inherent to the risk assessment process in this sector, as well as identifying some improvements to current practice. This analysis was based on a questionnaire developed and applied to 13 OSH practitioners working at OOM. The results showed that the time available to perform the risk assessment is the most frequent limitation. Practitioners believe that the methodologies available are not an important limitation to this process. However, a specific risk assessment methodology that includes acceptance criteria adjusted to the OOM reality, using risk metrics based on the frequency of accidents and workdays lost, was also indicated as an important contribution to improve the process. A semi-quantitative approach, complemented with the use of the sector's accident statistics, can be a good solution for this sector. However, further strategies should also be adopted, mainly those that can lead to an easier application of the risk assessment process.
Abstract:
Doctoral thesis in Industrial and Systems Engineering (PDEIS)
Abstract:
Prognostic Health Management (PHM) has been asserting itself as the most promising methodology to enhance the effective reliability and availability of a product or system under its life-cycle conditions, by detecting current and approaching failures and thus mitigating system risks with reduced logistics and support costs. However, PHM is at an early stage of development, and there are concerns about possible shortcomings of its methods, tools, metrics and standardization. These factors have been severely restricting the applicability of PHM and its adoption by industry. This paper presents a comprehensive literature review of the main general weaknesses of PHM. Exploring the research opportunities identified in recent publications, general guidelines for addressing these issues are discussed and outlined.
Abstract:
The main features of most components consist of simple basic functional geometries: planes, cylinders, spheres and cones. Shape and position recognition of these geometries is essential for the dimensional characterization of components, and represents an important contribution to the life cycle of the product, concerning in particular the manufacturing and inspection processes of the final product. This work aims to establish an algorithm to automatically recognize such geometries, without operator intervention. Using differential geometry, large volumes of data can be processed and the basic functional geometries recognized. The original data can be obtained by rapid acquisition methods, such as 3D scanning or photography, and then converted into Cartesian coordinates. The satisfaction of intrinsic decision conditions allows different geometries to be quickly identified, without operator intervention. Since inspection is generally a time-consuming task, this method reduces operator intervention in the process. The algorithm was first tested using geometric data generated in MATLAB and then with a set of data points acquired by measuring real physical surfaces with a coordinate measuring machine and a 3D scanner. A comparison of the time spent in measuring is presented to show the advantage of the method. The results validated the suitability and potential of the proposed algorithm.
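To illustrate the kind of curvature-based decision conditions this abstract describes, the sketch below is a minimal Python example, not the authors' algorithm: the tolerance value and the specific classification rules are assumptions. It classifies a surface patch from principal curvatures already estimated at a set of sample points, using the standard differential-geometry facts that a plane has both curvatures zero, cylinders and cones have zero Gaussian curvature, and a sphere is umbilic with constant curvature.

```python
import numpy as np

def classify_surface(k1, k2, tol=1e-3):
    """Classify a surface patch from principal curvatures sampled at many points.

    k1, k2 : per-point principal curvatures (illustrative inputs; estimating
    them from a raw point cloud is a separate step not shown here).
    Returns 'plane', 'cylinder', 'sphere', 'cone', or 'unknown'.
    """
    k1, k2 = np.asarray(k1, float), np.asarray(k2, float)
    gaussian = k1 * k2            # Gaussian curvature K = k1 * k2
    mean = 0.5 * (k1 + k2)        # mean curvature H = (k1 + k2) / 2
    if np.all(np.abs(k1) < tol) and np.all(np.abs(k2) < tol):
        return 'plane'            # both principal curvatures vanish
    if np.all(np.abs(gaussian) < tol):
        # developable surface: one principal curvature is (near) zero
        nonzero = np.where(np.abs(k1) > np.abs(k2), k1, k2)
        if np.std(nonzero) < tol:
            return 'cylinder'     # constant nonzero curvature -> radius 1/k
        return 'cone'             # curvature varies along the axis
    if np.all(np.abs(k1 - k2) < tol) and np.std(mean) < tol:
        return 'sphere'           # umbilic everywhere with constant H
    return 'unknown'

# Synthetic samples: a cylinder of radius 2 gives k1 = 0.5, k2 = 0 everywhere.
print(classify_surface([0.5, 0.5, 0.5], [0.0, 0.0, 0.0]))  # cylinder
```

In a full pipeline the decision conditions would be checked patch by patch over the acquired Cartesian data, which is what removes the need for operator intervention.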
Abstract:
Integrated master's dissertation in Industrial Engineering and Management
Abstract:
This research aimed to establish tyre-road noise models using a Data Mining approach, which allowed building a predictive model and assessing the importance of the tested input variables. The data modelling took into account three learning algorithms and three metrics to define the best predictive model. The variables tested included basic properties of pavement surfaces, macrotexture, megatexture, unevenness and, for the first time, damping. The importance of those variables was measured using a sensitivity analysis procedure. Two types of models were set: one with basic variables and another with complex variables, such as megatexture and damping, all as a function of vehicle speed. More detailed models were additionally set by speed level. As a result, several models with very good tyre-road noise predictive capacity were achieved. The most relevant variables were speed, temperature, aggregate size, Mean Profile Depth and damping, which had the highest importance, even though influenced by speed. Megatexture and IRI had the lowest importance. The models developed in this work are applicable to truck tyre-noise prediction, represented by the AVON V4 test tyre, at the early stage of road pavement use. Therefore, the obtained models are highly useful for pavement design and for noise prediction by road authorities and contractors.
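To make the sensitivity-analysis idea in this abstract concrete, here is a minimal Python sketch. It is not the study's model: the data are synthetic, the variable names and the noise equation are invented for illustration, and a random forest with scikit-learn's permutation importance stands in for the learning algorithms and sensitivity procedure actually used.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: noise level driven mainly by speed, with smaller
# contributions from texture depth and damping (all values illustrative).
n = 500
X = np.column_stack([
    rng.uniform(40, 90, n),    # speed (km/h)
    rng.uniform(5, 30, n),     # air temperature (deg C)
    rng.uniform(0.5, 2.0, n),  # Mean Profile Depth, MPD (mm)
    rng.uniform(0.0, 1.0, n),  # damping (normalized)
])
y = (60 + 25 * np.log10(X[:, 0] / 50)   # speed dominates, log-law shape
     + 2 * X[:, 2] - 3 * X[:, 3]        # texture raises, damping lowers noise
     + rng.normal(0, 0.5, n))           # measurement noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Permutation-based sensitivity: importance = drop in R^2 when one input
# variable is shuffled, breaking its relationship with the output.
result = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, imp in zip(['speed', 'temperature', 'MPD', 'damping'],
                     result.importances_mean):
    print(f'{name:12s} {imp:.3f}')
```

With this synthetic target, speed comes out as the most important variable, mirroring the ranking reported in the abstract.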
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Integrated master's dissertation in Information Systems Engineering and Management
Abstract:
Kinetic models have great potential for metabolic engineering applications. They can be used to test which genetic and regulatory modifications can increase the production of metabolites of interest, while simultaneously monitoring other key functions of the host organism. This work presents a methodology for increasing productivity in biotechnological processes by exploiting dynamic models. It uses multi-objective dynamic optimization to identify the combination of targets (enzymatic modifications) and the degree of up- or down-regulation that must be applied in order to optimize a set of pre-defined performance metrics subject to process constraints. The capabilities of the approach are demonstrated on a realistic and computationally challenging application: a large-scale metabolic model of Chinese Hamster Ovary (CHO) cells, which are used for antibody production in a fed-batch process. The proposed methodology provides sustained and robust growth in CHO cells, increasing productivity while simultaneously increasing biomass production and product titer, and keeping the concentrations of lactate and ammonia at low values. The approach presented here can be used to optimize metabolic models by finding the best combination of targets and their optimal level of up/down-regulation. Furthermore, it can accommodate additional trade-offs and constraints with great flexibility.
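To convey the flavour of that optimization problem, the toy sketch below is an assumption-laden miniature, not the paper's CHO model or solver: a three-enzyme steady-state pathway with conflicting objectives (product flux up, byproduct flux and expression burden down) scalarized into one objective and optimized over bounded log2 fold-changes of enzyme levels.

```python
import numpy as np
from scipy.optimize import minimize

# Toy steady-state pathway (illustrative, not the CHO model from the paper):
# substrate --e1--> X --e2--> product, with a side branch X --e3--> byproduct.
# Decision variables are log2 fold-changes of the three enzyme levels,
# bounded to [-2, 2], i.e. 4-fold down- to 4-fold up-regulation.

def fluxes(log2_fc):
    e1, e2, e3 = 2.0 ** np.asarray(log2_fc)   # enzyme fold-changes
    v_in = e1 * 1.0                           # uptake flux
    v_prod = v_in * e2 / (e2 + e3)            # X split between the two branches
    v_by = v_in * e3 / (e2 + e3)
    return v_prod, v_by

def objective(log2_fc, w_by=0.5, w_burden=0.1):
    v_prod, v_by = fluxes(log2_fc)
    # Scalarized multi-objective: maximize product flux, keep the byproduct
    # low, and penalize large expression changes (metabolic burden).
    return -v_prod + w_by * v_by + w_burden * np.sum(np.abs(log2_fc))

res = minimize(objective, x0=[0.0, 0.0, 0.0],
               bounds=[(-2, 2)] * 3, method='L-BFGS-B')
print(np.round(res.x, 2))   # up-regulate e1 and e2, down-regulate e3
```

The real methodology replaces this algebraic toy with a large-scale kinetic model integrated over the fed-batch horizon and keeps the objectives separate (a Pareto front) rather than collapsing them into one weighted sum.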
Abstract:
Master's internship report in Communication Sciences (specialization in Advertising and Public Relations)