995 results for Atmospheric systems
Abstract:
It is difficult to avoid the “smart systems” topic when discussing smart prevention and, similarly, it is difficult to address smart systems without focusing on their ability to learn. Following the same line of thought, in the current reality, it seems a Herculean task (or an irreparable omission) to approach the topic of certified occupational health and safety management systems (OHSMS) without discussing integrated management systems (IMSs). The available data suggest that an OHSMS seldom operates as the single management system (MS) in a company, so any statement concerning OHSMSs should mainly be interpreted from an integrated perspective. A major distinction between generic systems can be drawn between those that learn, i.e., those systems that have “memory”, and those that do not. The former are often depicted as adaptive since they take past events into account to deal with novel, similar and future events, modifying their structure to enable success in their environment. Often, these systems present nonlinear behavior and huge uncertainty in the forecasting of some events. This paper seeks to portray, for the first time to the best of our knowledge, IMSs as complex adaptive systems (CASs) by listing their properties and dissecting the features that enable them to evolve and self-organize in order to, holistically, fulfil the requirements of different stakeholders and thus thrive by assuring the successful sustainability of a company. Based on the literature review carried out, this is the first time that IMSs are identified as CASs, which may develop fruitful synergies both for the MS and the CAS communities. By performing a thorough literature review and drawing on concepts embedded in the “DNA” of the subsystems’ implementation standards, it is intended, specifically, to identify, determine and discuss the properties of a generic IMS that allow it to be classified as a CAS.
Abstract:
The MAP-i doctoral program of the Universities of Minho, Aveiro and Porto
Abstract:
Biometric systems are increasingly being used as a means of authentication to provide system security in modern technologies. The performance of a biometric system depends on the accuracy, the processing speed, the template size, and the time necessary for enrollment. While much research has focused on the first three factors, enrollment time has not received as much attention. In this work, we present the findings of our research on users’ behavior when enrolling in a biometric system. Specifically, we collected information about the users’ availability for enrollment with respect to hand recognition systems (e.g., hand geometry, palm geometry or any other requiring positioning the hand on an optical scanner). A sample of 19 participants, chosen randomly regardless of age, gender, profession and nationality, was used in an experiment to study the patience of users enrolling in a biometric hand recognition system.
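The abstract does not describe the logging protocol; a minimal sketch of how per-user enrollment time and retry patience might be recorded is given below. The `capture_fn` callable, the retry limit, and the simulated capture sequence are all assumptions for illustration, not the study's actual procedure.

```python
import time

def timed_enrollment(capture_fn, max_attempts=3):
    """Run a (hypothetical) enrollment routine and measure how long the
    user takes until a template is accepted or the retry limit is hit."""
    start = time.perf_counter()
    for attempt in range(1, max_attempts + 1):
        template = capture_fn()
        if template is not None:  # an acceptable capture ends enrollment
            return {"attempts": attempt,
                    "seconds": time.perf_counter() - start,
                    "enrolled": True}
    return {"attempts": max_attempts,
            "seconds": time.perf_counter() - start,
            "enrolled": False}

# Simulated optical-scanner capture: fails twice, then succeeds.
outcomes = iter([None, None, "hand-template"])
result = timed_enrollment(lambda: next(outcomes))
```

Aggregating such records over participants would yield the availability and patience measures the study is concerned with.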
Abstract:
Usually, data warehousing populating processes are data-oriented workflows composed of dozens of granular tasks that are responsible for the integration of data coming from different data sources. Specific subsets of these tasks can be grouped into a collection, together with their relationships, to form higher-level constructs. Increasing task granularity allows for the generalization of processes, simplifying their views and providing methods to carry expertise over to new applications. Well-proven practices can be used to describe general solutions that use basic skeletons configured and instantiated according to a set of specific integration requirements. Patterns can be applied to ETL processes aiming not only to simplify a possible conceptual representation but also to reduce the gap that often exists between the two design perspectives. In this paper, we demonstrate the feasibility and effectiveness of an ETL pattern-based approach using task clustering, analyzing a real-world ETL scenario through the definition of two commonly used clusters of tasks: a data lookup cluster and a data conciliation and integration cluster.
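To make the idea of a task cluster concrete, the sketch below assembles the granular steps of a data lookup cluster (index the dimension, probe it per row, handle misses) into one higher-level construct. The function name, field names, and the `-1` miss default are illustrative assumptions, not the paper's notation.

```python
def lookup_cluster(rows, dim_table, key, surrogate_field, default=-1):
    """Data lookup cluster: replace the business key of each incoming row
    with the surrogate key found in a dimension table (default on a miss)."""
    # Granular task 1: build an in-memory index over the dimension table.
    index = {r[key]: r[surrogate_field] for r in dim_table}
    out = []
    for row in rows:
        # Granular task 2: probe the index and enrich the row.
        enriched = dict(row)
        enriched[surrogate_field] = index.get(row[key], default)
        out.append(enriched)
    return out

facts = [{"cust_id": "C1", "amount": 10}, {"cust_id": "C9", "amount": 5}]
dim_customer = [{"cust_id": "C1", "cust_sk": 101}]
loaded = lookup_cluster(facts, dim_customer, "cust_id", "cust_sk")
```

Grouping these steps behind a single interface is what lets the same skeleton be reconfigured for new integration requirements.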
Abstract:
During the last few years, many research efforts have been made to improve the design of ETL (Extract-Transform-Load) systems. ETL systems are considered very time-consuming, error-prone and complex, involving several participants from different knowledge domains. ETL processes are one of the most important components of a data warehousing system and are strongly influenced by the complexity of business requirements and by their change and evolution. These aspects influence not only the structure of a data warehouse but also the structures of the data sources involved. To minimize the negative impact of such variables, we propose the use of ETL patterns to build specific ETL packages. In this paper, we formalize this approach using BPMN (Business Process Model and Notation) for modelling more conceptual ETL workflows, mapping them to real execution primitives through the use of a domain-specific language that allows for the generation of specific instances that can be executed in a commercial ETL tool.
Abstract:
ETL conceptual modeling is a very important activity in any data warehousing system project. Having a high-level system representation that allows for a clear identification of the main parts of a data warehousing system is clearly a great advantage, especially in the early stages of design and development. However, the effort to conceptually model an ETL system is rarely properly rewarded. Translating ETL conceptual models directly into something that saves work and time on the concrete implementation of the system would, in fact, be a great help. In this paper we present and discuss a hybrid approach to this problem, combining the simplicity of interpretation and expressive power of BPMN for ETL systems conceptualization with the use of ETL patterns to automatically produce an ETL skeleton, a first prototype system, which can be executed in a commercial ETL tool such as Kettle.
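The expansion step from conceptual model to skeleton can be sketched as a catalogue lookup: each pattern named in the conceptual model unfolds into an ordered list of executable primitives. The pattern names and primitive step names below are hypothetical placeholders; the actual mapping targets a tool such as Kettle.

```python
# Hypothetical pattern catalogue: each ETL pattern name expands to an
# ordered list of executable primitive steps (illustrative names only).
PATTERNS = {
    "data_lookup": ["read_source", "build_key_index", "resolve_surrogates"],
    "data_conciliation": ["read_source", "deduplicate",
                          "merge_conflicts", "load_target"],
}

def generate_skeleton(conceptual_model):
    """Expand a BPMN-like conceptual model (here, just a sequence of
    pattern names) into a flat skeleton of primitive steps."""
    skeleton = []
    for pattern in conceptual_model:
        skeleton.extend(PATTERNS[pattern])
    return skeleton

steps = generate_skeleton(["data_lookup", "data_conciliation"])
```

The resulting skeleton is only a first prototype: each primitive still needs to be configured for the concrete sources and targets before execution.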
Abstract:
Information technologies have changed the way health organizations work, contributing to their effectiveness, efficiency and sustainability. Hospital Information Systems (HIS) are emerging in all health institutions, helping health professionals and patients. However, HIS are not always implemented and used in the best way, leading to low levels of benefit and acceptance by the users of these systems. In order to mitigate this problem, it is essential to take measures that ensure the HIS and their interfaces are designed in a simple and interactive way. With this in mind, a study was conducted to measure user satisfaction and gather user opinions. The Technology Acceptance Model (TAM) was applied to a HIS implemented in various hospital centers (AIDA), using the Pathologic Anatomy Service. The study identified weak and strong features of AIDA and pointed out some solutions to improve the medical record.
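TAM questionnaires are typically scored by averaging Likert-scale answers into the model's two core constructs, perceived usefulness (PU) and perceived ease of use (PEOU). The sketch below assumes a 5-point scale and one PU and one PEOU answer per respondent; the actual AIDA questionnaire items are not reproduced here.

```python
from statistics import mean

def tam_scores(responses):
    """Average Likert answers into the two core TAM constructs:
    perceived usefulness (PU) and perceived ease of use (PEOU)."""
    return {
        "PU": mean(r["PU"] for r in responses),
        "PEOU": mean(r["PEOU"] for r in responses),
    }

# Three illustrative respondents on a 1-5 scale.
answers = [{"PU": 4, "PEOU": 3}, {"PU": 5, "PEOU": 4}, {"PU": 3, "PEOU": 2}]
scores = tam_scores(answers)  # PU averages to 4, PEOU to 3
```

In a fuller analysis each construct would aggregate several questionnaire items and could feed a regression against intention to use.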
Abstract:
This article compiles the main topics addressed in the management systems (MSs) literature concerning MS integration by performing a systematic literature review. It is intended to present the main limitations of non-integrated MSs, the main motivations driving an integrated management system (IMS) implementation, the major resistances faced, the most common resultant benefits, the suitable guidelines and standards, and the critical success factors. In addition, this paper addresses the issues concerning integration strategies and models, the integration levels or degrees achieved by an IMS, and the audit function in an integrated context. The motivations that drive companies to integrate their management subsystems, the obstacles faced and the benefits collected may have internal or external origins. The publishing of standards guiding companies on how to integrate their management subsystems has been done mainly at a national level. There are several models that could be used to support companies in their management subsystems integration processes, and either a sequential or an all-in strategy may be adopted. Four audit typologies can be distinguished, and the adoption of any of these typologies should consider resource availability and audit team know-how, among other features.
Abstract:
Maturity models are adopted to minimise our perceived complexity of a truly complex phenomenon. In this sense, maturity models are tools that enable the assessment of the most relevant variables that impact the outputs of a specific system. Ideally, a maturity model should provide information concerning the qualitative and quantitative relationships between variables and how they affect the latent variable, that is, the maturity level. Management systems (MSs) are implemented worldwide and by an increasing number of companies. Integrated management systems (IMSs) consider the implementation of one or several MSs, usually coexisting with the quality management subsystem (QMS). This chapter reports a model based on two components that enables the assessment of IMS maturity, considering the key process agents (KPAs) identified through a systematic literature review and the results collected from two surveys.
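A common way to turn KPA assessments into a single latent maturity level is a weighted mean followed by thresholding. The sketch below assumes a 1-5 score per KPA and illustrative weights and thresholds; the chapter's actual two-component model and KPA set are not reproduced.

```python
def maturity_level(kpa_scores, weights, thresholds=(1.5, 2.5, 3.5, 4.5)):
    """Aggregate weighted KPA scores (1-5 scale) into a latent maturity
    level (1-5) by thresholding the weighted mean."""
    total_w = sum(weights.values())
    weighted = sum(kpa_scores[k] * w for k, w in weights.items()) / total_w
    # Each threshold crossed raises the level by one, starting at 1.
    level = 1 + sum(weighted >= t for t in thresholds)
    return weighted, level

# Illustrative KPAs and weights only.
score, level = maturity_level({"policy": 4, "audit": 3, "training": 5},
                              {"policy": 0.5, "audit": 0.3, "training": 0.2})
```

The weights are where the qualitative/quantitative relationships between variables and the latent maturity variable would be encoded.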
Abstract:
The distinction between convective and stratiform precipitation profiles around the various precipitating systems existing in tropical regions is very important to the global atmospheric circulation, which is extremely sensitive to the vertical latent heat distribution. In South America, convective activity responds to the Intraseasonal Oscillation (IOS). This paper analyzes disdrometer and radar profiler data collected at the Ji-Paraná airport, RO, Brazil, for the WETAMC/LBA & TRMM/LBA field experiment during January and February of 1999. The microphysical analysis of wind regimes associated with the IOS showed a large difference in the type, size and microphysical growth processes of hydrometeors in each wind regime: easterly regimes had more turbulence and consequently convective precipitation formation, while westerly regimes had more stratiform precipitation formation.
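Convective rain is typically intense and highly variable, while stratiform rain is weaker and steadier, so a crude first-pass separation can threshold rain rate and its variability. The thresholds below are illustrative assumptions only and are not those used in the WETAMC/LBA analysis.

```python
def classify_profile(rain_rate_mm_h, std_dev_mm_h,
                     rate_threshold=10.0, var_threshold=1.5):
    """Very simplified convective/stratiform flag based on rain-rate
    intensity and variability (thresholds are illustrative only)."""
    if rain_rate_mm_h >= rate_threshold or std_dev_mm_h >= var_threshold:
        return "convective"
    return "stratiform"

label = classify_profile(rain_rate_mm_h=25.0, std_dev_mm_h=4.0)
```

Operational classifications use considerably richer information, such as the presence of a radar bright band and vertical velocity profiles.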
Abstract:
PhD thesis - Doctoral Programme in Industrial and Systems Engineering (PDEIS)
Abstract:
PhD thesis in Molecular and Environmental Biology - Specialty in Cell Biology and Health
Abstract:
This paper presents part of a study aimed at finding a suitable, yet cost-effective, surface finish for a steel structure subject to the car washing environment and corrosive chemicals. The initial, life cycle and average equivalent annual costs (AEAC) for surface finishing methods were calculated for a steel structure using the LCCC algorithm developed by the American Galvanizers Association (AGA). The cost study covered 45 common surface finish systems including: hot-dip galvanization (HDG), metallization, acrylic, alkyd and epoxy, as well as duplex coatings such as epoxy zinc and inorganic zinc (IOZ). The results show that the initial, life cycle and AEAC costs for hot-dip galvanization are the lowest among all the methods, followed by coal tar epoxy painting. The average annual cost of HDG for this structure was estimated at about €0.22/m², while the other cost-effective alternatives were: IOZ, polyurea, epoxy waterborne and the IOZ/epoxy duplex coating.
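An equivalent annual cost is conventionally obtained by spreading a life-cycle cost over the service life with a capital recovery factor, which is the standard engineering-economics form of the calculation; the AGA's LCCC algorithm itself is not reproduced here, and the numbers below are illustrative, not the study's.

```python
def aeac(life_cycle_cost, interest_rate, service_years):
    """Average equivalent annual cost via the capital recovery factor:
    AEAC = LCC * i(1+i)^n / ((1+i)^n - 1)."""
    i, n = interest_rate, service_years
    crf = i * (1 + i) ** n / ((1 + i) ** n - 1)  # capital recovery factor
    return life_cycle_cost * crf

# Illustrative inputs only: €1000/m² life-cycle cost, 5% rate, 30 years.
annual = aeac(life_cycle_cost=1000.0, interest_rate=0.05, service_years=30)
```

Comparing finishes on AEAC rather than initial cost is what makes long-lived, high-upfront options such as HDG come out cheapest.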
Abstract:
One of the major challenges in the development of an immersive system is handling the delay between the tracking of the user’s head position and the updated projection of a 3D image or auralised sound, also called end-to-end delay. Excessive end-to-end delay can result in a general decrement of the “feeling of presence”, the occurrence of motion sickness, and poor performance in perception-action tasks. These latencies must be known in order to provide insights into the technological (hardware/software optimization) or psychophysical (recalibration sessions) strategies to deal with them. Our goal was to develop a new measurement method for end-to-end delay that is both precise and easily replicated. We used a Head and Torso Simulator (HATS) as an auditory signal sensor, a fast-response photo-sensor to detect a visual stimulus from a motion capture system, and a voltage input trigger as the real-time event. The HATS was mounted on a turntable, which allowed us to precisely change the 3D sound relative to the head position. When the virtual sound source was at 90° azimuth, the corresponding HRTF would set all the intensity values to zero; at the same time, a trigger would register the real-time event of turning the HATS to 90° azimuth. Furthermore, with the HATS turned 90° to the left, the motion capture marker visualization would fall exactly on the photo-sensor receptor. This method allowed us to precisely measure the delay from tracking to displaying. Moreover, our results show that the tracking method, its tracking frequency, and the rendering of the sound reflections are the main predictors of end-to-end delay.
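Once the trigger and sensor channels are recorded, the end-to-end delay is just the gap between each real-time trigger event (the HATS reaching 90° azimuth) and the next detection on the photo-sensor or auditory channel. The sketch below assumes timestamp lists in milliseconds with illustrative values; it is a pairing scheme, not the study's acquisition code.

```python
def end_to_end_delay(trigger_events, sensor_events):
    """Pair each trigger timestamp with the next sensor detection and
    return the per-event latencies (same time unit as the inputs)."""
    delays = []
    j = 0
    for t in trigger_events:
        # Skip sensor detections that precede this trigger.
        while j < len(sensor_events) and sensor_events[j] < t:
            j += 1
        if j < len(sensor_events):
            delays.append(sensor_events[j] - t)
            j += 1
    return delays

# Illustrative timestamps in milliseconds.
delays = end_to_end_delay([100.0, 600.0], [152.5, 661.0])
```

Repeating the rotation many times gives a latency distribution, against which predictors such as tracking frequency can then be regressed.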
Abstract:
Prognostic Health Management (PHM) has been asserting itself as the most promising methodology to enhance the effective reliability and availability of a product or system during its life cycle by detecting current and approaching failures, thus mitigating system risks with reduced logistics and support costs. However, PHM is at an early stage of development, and there are concerns about possible shortcomings of its methods, tools, metrics and standardization. These factors have been severely restricting the applicability of PHM and its adoption by industry. This paper presents a comprehensive literature review of the main general weaknesses of PHM. Exploring the research opportunities present in some recent publications, general guidelines for addressing these issues are discussed and outlined.