25 results for Generalized Lévy Process
at Universidade do Minho
Abstract:
This paper aims at developing a collision prediction model for three-leg junctions located on national roads (NR) in Northern Portugal. The focus is to identify factors that contribute to collision-type crashes at those locations, mainly factors related to road geometric consistency, since the literature on these is scarce, and to research the impact of three modeling methods (generalized estimating equations, random-effects negative binomial models and random-parameters negative binomial models) on the factors of those models. The database used included data, published between 2008 and 2010, on 177 three-leg junctions. It was split into three groups of contributing factors, which were tested sequentially for each of the adopted models: first only traffic; then traffic and the geometric characteristics of the junctions within their area of influence; and, lastly, factors expressing the difference between the geometric characteristics of the segments bordering the junctions' area of influence and the segment included in that area. The choice of the best modeling technique was supported by the result of a cross-validation made to ascertain the best model for the three sets of researched contributing factors. The models fitted with random-parameters negative binomial models had the best performance in this process. In the best models obtained for every modeling technique, the characteristics of the road environment, including proxy measures for geometric consistency, along with traffic volume, contribute significantly to the number of collisions. Both the variables concerning the junctions and the national highway segments within their area of influence, as well as the variations of those characteristics relative to the roadway segments bordering that area of influence, proved relevant; there is therefore a clear need to incorporate the effect of geometric consistency in safety studies of three-leg junctions.
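As a rough illustration of the kind of fixed-dispersion baseline the abstract mentions, the sketch below fits a negative binomial crash-frequency model in Python; all column names and parameter values are hypothetical placeholders, not the study's data or specification.

# Hedged sketch: a simple negative binomial crash-frequency model, standing in
# for the fixed-effects baseline described in the abstract. Column names
# (aadt_major, aadt_minor, curvature_diff) are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

junctions = pd.DataFrame({
    "crashes": np.random.poisson(2, 177),           # observed collisions per junction
    "aadt_major": np.random.uniform(2e3, 2e4, 177),
    "aadt_minor": np.random.uniform(1e2, 5e3, 177),
    "curvature_diff": np.random.normal(0, 1, 177),  # proxy for geometric consistency
})

# Log-transformed traffic volumes follow the usual safety-performance-function form.
model = smf.glm(
    "crashes ~ np.log(aadt_major) + np.log(aadt_minor) + curvature_diff",
    data=junctions,
    family=sm.families.NegativeBinomial(alpha=1.0),
).fit()
print(model.summary())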
Abstract:
Given the current economic situation of Portuguese municipalities, it is necessary to identify priority investments in order to achieve more efficient financial management. Classifying a municipality's road network according to the occurrence of traffic accidents is fundamental to setting priorities for road interventions. This paper presents a model for road network classification based on traffic accidents, integrated in a geographic information system. Its practical application was developed through a case study in the municipality of Barcelos. An equation was defined to obtain a road safety index by combining the following indicators: severity, property damage only and accident costs. In addition to the road network classification, the application of the model makes it possible to analyze the spatial coverage of accidents in order to determine the centrality and dispersion of the locations with the highest incidence of road accidents. This analysis can be further refined according to the nature of the accidents, namely collision, run-off-road and pedestrian crashes.
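The published equation is not reproduced in the abstract; a generic weighted-sum form of such a road safety index, with illustrative weights, would be:

\[
\mathrm{RSI}_i = w_S\, S_i + w_P\, P_i + w_C\, C_i, \qquad w_S + w_P + w_C = 1,
\]

where, for road section i, S_i, P_i and C_i are the normalised severity, property-damage-only and accident-cost indicators, and the weights w would have to be calibrated as done in the case study.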
Abstract:
Due to the increasing acceptance of BPM, BPM tools are nowadays extensively used in organizations. Core to BPM are process modeling languages, of which BPMN is currently the one receiving the most attention. Once a business process is described using BPMN, a process simulation approach can be used to find its optimized form. In this context, the simulation of business processes, such as those defined in BPMN, appears as an obvious way of improving processes. This paper analyzes the business process modeling and simulation areas, identifying the elements that must be present in the BPMN language so that processes described in BPMN can be simulated. During this analysis, a set of existing BPM tools that support BPMN are compared regarding their limitations in terms of simulation support.
Abstract:
Although most of the accidents occurring in Olive Oil Mills (OOM) result from "basic" risks, there is a need to apply adequate tools to support risk decisions that can meet the specificities of this sector. This study aims to analyse the views of Occupational Safety & Health (OSH) practitioners about the risk assessment process in OOM, identifying the key difficulties inherent to risk assessment in this sector, as well as some improvements to current practice. The analysis was based on a questionnaire developed and applied to 13 OSH practitioners working in OOM. The results showed that the time available to perform the risk assessment is the most frequent limitation. Practitioners believe that the methodologies available are not an important limitation to this process. However, a specific risk assessment methodology, including acceptance criteria adjusted to the OOM reality and using risk metrics supported by the frequency of accidents and workdays lost, was also indicated as an important contribution to improve the process. A semi-quantitative approach, complemented with the use of the sector's accident statistics, can be a good solution for this sector. However, further strategies should also be adopted, mainly those that can lead to an easy application of the risk assessment process.
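As a purely illustrative sketch of the semi-quantitative approach suggested above, the Python fragment below maps sector accident frequency and workdays lost onto likelihood and severity bands; the band limits and the 5x5 scoring are assumptions, not the calibrated criteria the study calls for.

# Hedged sketch of a semi-quantitative risk matrix in the spirit of the
# approach suggested in the abstract; thresholds are illustrative assumptions.
def likelihood_level(accidents_per_year: float) -> int:
    """Map sector accident frequency to a 1-5 likelihood band."""
    bands = [0.1, 0.5, 1.0, 5.0]           # accidents/year thresholds (assumed)
    return 1 + sum(accidents_per_year > b for b in bands)

def severity_level(workdays_lost: float) -> int:
    """Map average workdays lost per accident to a 1-5 severity band."""
    bands = [1, 7, 30, 90]                  # workdays-lost thresholds (assumed)
    return 1 + sum(workdays_lost > b for b in bands)

def risk_score(accidents_per_year: float, workdays_lost: float) -> int:
    return likelihood_level(accidents_per_year) * severity_level(workdays_lost)

print(risk_score(accidents_per_year=0.8, workdays_lost=12))  # -> 3 * 3 = 9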
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation with the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
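The following Python sketch illustrates the simplest of the heat-transfer balances discussed above (convection and radiation with the environment, for a lumped-capacitance filament); material properties and coefficients are assumed values for illustration only.

# Hedged sketch: lumped-capacitance cooling of a freshly deposited filament,
# keeping only convection and radiation with the environment.
rho, cp = 1050.0, 2000.0        # density [kg/m^3], specific heat [J/(kg.K)] (assumed ABS-like)
d = 0.4e-3                      # filament diameter [m]
h = 60.0                        # convection coefficient [W/(m^2.K)] (assumed)
eps, sigma = 0.9, 5.67e-8       # emissivity, Stefan-Boltzmann constant
T_env = 300.0                   # environment temperature [K]

area_per_vol = 4.0 / d          # surface-to-volume ratio of a cylindrical filament

T, dt = 500.0, 1e-3             # initial filament temperature [K], time step [s]
for step in range(int(2.0 / dt)):                             # simulate 2 s of cooling
    q = h * (T - T_env) + eps * sigma * (T**4 - T_env**4)     # heat flux lost [W/m^2]
    T -= q * area_per_vol / (rho * cp) * dt
print(f"Filament temperature after 2 s: {T:.1f} K")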
Abstract:
Business Intelligence (BI) can be seen as a method that gathers information and data from information systems in order to help companies be more accurate in their decision-making process. Traditionally, BI systems were associated with the use of Data Warehouses (DW). The prime purpose of a DW is to serve as a repository that stores all the relevant information required for making the correct decision. The necessity to integrate streaming data became crucial with the need to improve the efficiency and effectiveness of the decision process. In primary and secondary education, there is a lack of BI solutions. Given the reality of schools, the main purpose of this study is to provide a Pervasive BI solution able to monitor school and student data anywhere, anytime and in real time, as well as to disseminate the information through ubiquitous devices. The first task consisted in gathering data regarding the different choices made by a student, from enrolment in a certain school year until its end. Thereafter, a dimensional model was developed to make it possible to build a BI platform. This paper presents the dimensional model, a set of pre-defined indicators, the Pervasive Business Intelligence characteristics and the prototype designed. The main contribution of this study is to offer schools a tool that can help them make accurate decisions in real time. Data dissemination was achieved through a localized application that can be accessed anywhere and anytime.
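A minimal sketch of the kind of star schema such a dimensional model implies is given below; the fact table, dimensions and the dropout-rate indicator are hypothetical examples, not the paper's actual model.

# Hedged sketch of a star schema with one fact table of student enrolment
# events and two dimensions; names and the indicator are illustrative only.
import pandas as pd

dim_school = pd.DataFrame({"school_id": [1, 2], "school": ["School A", "School B"]})
dim_year = pd.DataFrame({"year_id": [1], "school_year": ["2013/2014"]})
fact_enrolment = pd.DataFrame({
    "school_id": [1, 1, 2],
    "year_id": [1, 1, 1],
    "student_id": [101, 102, 103],
    "dropped_out": [0, 1, 0],            # source of an example pre-defined indicator
})

# Example indicator: dropout rate per school, recomputed as new rows stream in.
indicator = (fact_enrolment.merge(dim_school, on="school_id")
             .groupby("school")["dropped_out"].mean()
             .rename("dropout_rate"))
print(indicator)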
Abstract:
Children are an especially vulnerable population, particularly with respect to drug administration. It is estimated that neonatal and pediatric patients are at least three times more vulnerable to harm due to adverse events and medication errors than adults. With the development of this framework, the intent is to provide a Clinical Decision Support System based on a prototype already tested in a real environment. The framework will include features such as the preparation of Total Parenteral Nutrition prescriptions, a table of pediatric and neonatal emergency drugs, medical scales of morbidity and mortality, anthropometry percentiles (weight, length/height, head circumference and BMI), utilities for supporting medical decisions on the treatment of neonatal jaundice and anemia, support for technical procedures, and other calculators and widely used tools. The solution under development is an extension of the INTCare project. The main goal is to provide an approach that makes this functionality available at all times of clinical practice, and outside the hospital environment for dissemination, education and simulation of hypothetical situations. The aim is also to develop an area for the study and analysis of information and for the extraction of knowledge from the data collected through the use of the system. This paper presents the architecture, its requirements and functionalities, and a SWOT analysis of the proposed solution.
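As an example of the kind of calculator the framework is said to bundle, the sketch below computes a pediatric BMI and converts it to a z-score with the standard LMS method; the reference L, M and S values are illustrative placeholders rather than actual WHO/CDC growth-table entries.

# Hedged sketch of a pediatric BMI z-score calculator using the LMS method.
from math import log

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def lms_zscore(x: float, L: float, M: float, S: float) -> float:
    """Convert a measurement to a z-score given LMS reference parameters."""
    if L == 0:
        return log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

value = bmi(weight_kg=14.0, height_m=0.95)
print(round(lms_zscore(value, L=-1.6, M=15.7, S=0.08), 2))   # illustrative reference values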
Abstract:
Understanding the behavior of complex composite materials during mixing procedures is fundamental in several industrial processes. For instance, polymer composites are usually manufactured by dispersing fillers in polymer melt matrices. The success of the filler dispersion depends both on the complex flow patterns generated and on the polymer melt rheological behavior. Consequently, a numerical tool that allows modeling both the fluid and the particles would be very useful to increase the process insight. Nowadays there are computational tools that allow modeling the behavior of filled systems, taking into account both the behavior of the fluid (Computational Rheology) and of the particles (Discrete Element Method). One example is the DPMFoam solver of the OpenFOAM® framework, where the averaged volume fraction momentum and mass conservation equations are used to describe the fluid (continuous phase) rheology, and Newton's second law of motion is used to compute the movement of the particles (discrete phase). In this work the referred solver is extended to take into account the elasticity of the polymer melts in the continuous phase. The solver capabilities are illustrated by studying the effect of the fluid rheology on the filler dispersion, considering different fluid types (generalized Newtonian or viscoelastic) and particle volume fractions and sizes. The results obtained are used to evaluate the relevance of considering the fluid's complex rheology for the prediction of the composite morphology.
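In generic notation (not necessarily the exact formulation used in DPMFoam), the averaged continuous-phase equations and the discrete-phase equation of motion referred to above take the form:

\[
\frac{\partial(\alpha\rho)}{\partial t} + \nabla\cdot(\alpha\rho\mathbf{u}) = 0
\]
\[
\frac{\partial(\alpha\rho\mathbf{u})}{\partial t} + \nabla\cdot(\alpha\rho\mathbf{u}\mathbf{u}) = -\alpha\nabla p + \nabla\cdot(\alpha\boldsymbol{\tau}) + \alpha\rho\mathbf{g} + \mathbf{S}_p
\]
\[
m_i\,\frac{d\mathbf{u}_i}{dt} = \mathbf{F}_{\mathrm{drag},i} + \mathbf{F}_{\mathrm{contact},i} + m_i\,\mathbf{g}
\]

where \(\alpha\) is the fluid volume fraction, \(\mathbf{S}_p\) the momentum exchange with the particles, and the last equation is Newton's second law for particle i.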
Abstract:
Understanding the mixing process of complex composite materials is fundamental in several industrial processes. For instance, the dispersion of fillers in polymer melt matrices is commonly employed to manufacture polymer composites using a twin-screw extruder. The effectiveness of the filler dispersion depends not only on the complex flow patterns generated, but also on the polymer melt rheological behavior. Therefore, a numerical tool able to predict mixing, taking into account both the fluid and the particle phases, would be very useful to increase the process insight and thus provide useful guidelines for its optimization. In this work, a new Eulerian-Lagrangian numerical solver is developed in the OpenFOAM® computational library and used to better understand the mechanisms determining the dispersion of fillers in polymer matrices. Particular attention is given to the effect of the rheological model used to represent the fluid behavior on the level of dispersion obtained. For the Eulerian phase, the averaged volume fraction governing equations (conservation of mass and linear momentum) are used to describe the fluid behavior. For the Lagrangian phase, Newton's second law of motion is used to compute the particle trajectories and velocities. To study the effect of the fluid behavior on the filler dispersion, several systems are modeled considering different fluid types (generalized Newtonian or viscoelastic) and particle volume fractions and sizes. The results obtained are used to correlate the fluid and particle characteristics with the effectiveness of mixing and the morphology obtained.
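A minimal Python sketch of the Lagrangian step described above is shown next: a single particle relaxing toward a prescribed fluid velocity field under Stokes drag. The flow field, fluid viscosity and particle properties are assumed for illustration and bear no relation to the simulated systems.

# Hedged sketch: Newton's second law for one particle under Stokes drag in a
# prescribed shear flow, standing in for the resolved Eulerian velocity field.
import numpy as np

rho_p, d_p = 2500.0, 10e-6           # particle density [kg/m^3] and diameter [m] (assumed)
mu = 100.0                           # melt viscosity [Pa.s] (assumed Newtonian here)
tau_p = rho_p * d_p**2 / (18 * mu)   # Stokes relaxation time

def fluid_velocity(x: np.ndarray) -> np.ndarray:
    """Placeholder simple shear flow, shear rate 10 1/s."""
    return np.array([10.0 * x[1], 0.0])

x, u, dt = np.array([0.0, 1e-3]), np.zeros(2), 1e-7
for _ in range(10000):
    v_f = fluid_velocity(x)
    u = v_f + (u - v_f) * np.exp(-dt / tau_p)   # exact relaxation toward the fluid velocity
    x = x + u * dt
print("particle position:", x, "velocity:", u)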
Abstract:
In several industrial applications, materials with highly complex behaviour are processed in intricate mixing operations, which makes it difficult to achieve the desired properties for the produced materials. This is the case of the well-known dispersion of nano-sized fillers in a polymer melt matrix, used to improve the mechanical and/or electrical properties of the nanocomposite. This mixing is usually performed in twin-screw extruders, which promote complex flow patterns, and, since an in loco analysis of the material evolution and mixing is difficult to perform, numerical tools can be very useful to predict the evolution and behaviour of the material. This work presents a numerically based study to improve the understanding of mixing processes. Initial numerical studies were performed with generalized Newtonian fluids, but, due to the null relaxation time that characterizes this type of fluid, the assumption of viscoelastic behavior was required. Therefore, the polymer melt was rheologically characterized, and six-mode Phan-Thien-Tanner and Giesekus models were used to fit the rheological data. These viscoelastic rheological models were then used to model the process. The conclusions obtained in this work provide additional and useful data to correlate the type and intensity of the deformation history imposed on the polymer nanocomposite with the quality of the mixing obtained.
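For reference, generic single-mode forms of the two constitutive models named above are (the work itself uses six modes fitted to the measured rheology):

Giesekus:
\[
\boldsymbol{\tau} + \lambda\,\overset{\nabla}{\boldsymbol{\tau}} + \frac{\alpha\lambda}{\eta_p}\,(\boldsymbol{\tau}\cdot\boldsymbol{\tau}) = 2\eta_p\,\mathbf{D}
\]
Linear Phan-Thien-Tanner:
\[
\left(1 + \frac{\varepsilon\lambda}{\eta_p}\,\operatorname{tr}\boldsymbol{\tau}\right)\boldsymbol{\tau} + \lambda\,\overset{\nabla}{\boldsymbol{\tau}} = 2\eta_p\,\mathbf{D}
\]

where \(\lambda\) is the relaxation time, \(\eta_p\) the polymer viscosity, \(\mathbf{D}\) the rate-of-deformation tensor, \(\overset{\nabla}{\boldsymbol{\tau}}\) the upper-convected derivative of the stress, and \(\alpha\) and \(\varepsilon\) the models' nonlinear parameters.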
Abstract:
Integrated master's dissertation in Psychology
Abstract:
Master's dissertation in Applied Psychology
Abstract:
When representing the requirements for an intended software solution during the development process, a logical architecture is a model that provides an organized vision of how functionalities behave, regardless of the technologies to be implemented. If the logical architecture represents an ambient assisted living (AAL) ecosystem, such representation is a complex task due to the existence of interrelated multidomains, which, most of the time, results in incomplete and incoherent user requirements. In this chapter, we present the results obtained when applying process-level modeling techniques to the derivation of the logical architecture for a real industrial AAL project. We adopt a V-Model-based approach that expresses the AAL requirements in a process-level perspective, instead of the traditional product-level view. Additionally, we ensure compliance of the derived logical architecture with the National Institute of Standards and Technology (NIST) reference architecture as nonfunctional requirements to support the implementation of the AAL architecture in cloud contexts.
Abstract:
There is currently an increasing demand for robots able to acquire the sequential organization of tasks from social learning interactions with ordinary people. Interactive learning-by-demonstration and communication is a promising research topic in current robotics research. However, the efficient acquisition of generalized task representations that allow the robot to adapt to different users and contexts is a major challenge. In this paper, we present a dynamic neural field (DNF) model that is inspired by the hypothesis that the nervous system uses the off-line re-activation of initial memory traces to incrementally incorporate new information into structured knowledge. To achieve this, the model combines fast activation-based learning, to robustly represent sequential information from single task demonstrations, with slower, weight-based learning during internal simulations, to establish longer-term associations between neural populations representing individual subtasks. The efficiency of the learning process is tested in an assembly paradigm in which the humanoid robot ARoS learns to construct a toy vehicle from its parts. User demonstrations with different serial orders, together with the correction of initial prediction errors, allow the robot to acquire generalized task knowledge about possible serial orders and the longer-term dependencies between subgoals in very few social learning interactions. This success is shown in a joint action scenario in which ARoS uses the newly acquired assembly plan to construct the toy together with a human partner.
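The core of such dynamic neural field models is usually an Amari-type field equation; a generic form (the paper's model adds the fast activation-based and slower weight-based learning dynamics on top of it) is:

\[
\tau\,\frac{\partial u(x,t)}{\partial t} = -u(x,t) + \int w(x-x')\,f\big(u(x',t)\big)\,dx' + S(x,t) + h
\]

where u(x,t) is the field activation, w the interaction kernel, f the output nonlinearity, S the external input and h the resting level.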
Abstract:
Integrated master's dissertation in Mechanical Engineering