Abstract:
Hospitals nowadays collect vast amounts of data related to patient records. All these data hold valuable knowledge that can be used to improve hospital decision making. Data mining techniques aim precisely at extracting useful knowledge from raw data. This work describes the implementation of a medical data mining project based on the CRISP-DM methodology. Recent real-world data, from 2000 to 2013, related to inpatient hospitalization, were collected from a Portuguese hospital. The goal was to predict generic hospital Length Of Stay based on indicators that are commonly available during the hospitalization process (e.g., gender, age, episode type, medical specialty). At the data preparation stage, the data were cleaned and variables were selected and transformed, leading to 14 inputs. Next, at the modeling stage, a regression approach was adopted, in which six learning methods were compared: Average Prediction, Multiple Regression, Decision Tree, Artificial Neural Network ensemble, Support Vector Machine and Random Forest. The best learning model was obtained by the Random Forest method, which presented a high coefficient of determination (0.81). This model was then opened up using a sensitivity analysis procedure that revealed three influential input attributes: the hospital episode type, the physical service where the patient is hospitalized and the associated medical specialty. The extracted knowledge confirmed that the obtained predictive model is credible and potentially valuable for supporting the decisions of hospital managers.
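As an illustration only (the abstract does not include the authors' implementation), the modeling stage described above could be sketched with scikit-learn as follows; the file name, column names and train/test split are hypothetical, and only a subset of the 14 inputs is shown.

```python
# Minimal sketch of the modeling stage described above, using scikit-learn
# instead of the authors' tooling. File name, column names and split are
# hypothetical; the paper's full set of 14 inputs is not reproduced here.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

df = pd.read_csv("inpatient_episodes.csv")  # hypothetical data export
X = pd.get_dummies(df[["gender", "age", "episode_type", "medical_specialty"]])
y = df["length_of_stay"]                    # target: length of stay in days

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("Multiple Regression", LinearRegression()),
                    ("Random Forest", RandomForestRegressor(n_estimators=500, random_state=0))]:
    model.fit(X_train, y_train)
    print(name, "R^2 =", round(r2_score(y_test, model.predict(X_test)), 3))

# A rough analogue of the sensitivity analysis: impurity-based importances
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_train, y_train)
for col, imp in sorted(zip(X.columns, rf.feature_importances_), key=lambda t: -t[1])[:5]:
    print(col, round(imp, 3))
```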
Abstract:
Drawing on an extensive oral corpus that is currently being compiled, this paper reflects on how orally produced language allows us to understand what languages really are and how they work beyond the formalized dimension of writing. It aims to demonstrate that language in use best reveals many aspects that cannot be perceived through a purely system-based description. It then examines to what extent actual oral language production respects the notions of sentence and norm, as well as how it specifically realizes various lexical and pragmatic dimensions.
Abstract:
Following detailed studies of Portuguese vernacular building typologies, this paper deals with buildings located in historical urban centres. An analysis of the history of the urban centre and, in particular, of some of its vernacular buildings is presented. Additionally, a discussion of the influence of changes in geometry, and of volumes added to the original buildings, on the seismic vulnerability of those buildings is also provided.
Abstract:
Research and development around indoor positioning and navigation is capturing the attention of an increasing number of research groups and labs around the world. Among the several techniques proposed for indoor positioning, solutions based on Wi-Fi fingerprinting are the most popular, since they exploit existing WLAN infrastructures to support software-only positioning, tracking and navigation applications. Despite the enormous research effort in this domain, and despite the existence of some commercial products based on Wi-Fi fingerprinting, it is still difficult to compare the real-world performance of the several existing solutions. The EvAAL competition, hosted by the IPIN 2015 conference, contributed to filling this gap. This paper describes the experience of the RTLS@UM team in participating in track 3 of that competition.
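For illustration only, a minimal Wi-Fi fingerprinting position estimator of the kind mentioned above (k-nearest neighbours in RSSI space) could look like the sketch below; it does not reproduce the RTLS@UM system, and the radio map values are toy data.

```python
# Minimal k-NN Wi-Fi fingerprinting sketch (illustrative only).
# A fingerprint is a vector of RSSI values, one per access point.
import numpy as np

def estimate_position(radio_map, positions, sample, k=3):
    """radio_map: (n_refs, n_aps) RSSI matrix collected offline.
    positions: (n_refs, 2) x/y coordinates of the reference points.
    sample:    (n_aps,) RSSI vector observed online."""
    d = np.linalg.norm(radio_map - sample, axis=1)   # distance in signal space
    nearest = np.argsort(d)[:k]                      # k closest fingerprints
    w = 1.0 / (d[nearest] + 1e-6)                    # inverse-distance weights
    return (positions[nearest] * w[:, None]).sum(axis=0) / w.sum()

# Toy radio map with 3 reference points and 2 access points
radio_map = np.array([[-40.0, -70.0], [-60.0, -55.0], [-80.0, -45.0]])
positions = np.array([[0.0, 0.0], [5.0, 0.0], [10.0, 0.0]])
print(estimate_position(radio_map, positions, np.array([-42.0, -68.0]), k=2))
```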
Abstract:
Modeling the Extract-Transform-Load (ETL) processes of a Data Warehousing System has always been a challenge. The heterogeneity of the sources, the quality of the data obtained and the conciliation process are some of the issues that must be addressed in the design phase of this critical component. Commercial ETL tools often provide proprietary diagrammatic components and modeling languages that are not standard, and thus do not provide the ideal separation between a modeling platform and an execution platform. This separation, in conjunction with the use of standard notations and languages, is critical in a system that tends to evolve over time and that cannot be undermined by a typically expensive tool that turns into an unsatisfactory component. In this paper we demonstrate the application of Relational Algebra as a modeling language for an ETL system, in an effort to standardize operations and provide a basis for uncommon ETL execution platforms.
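As a hedged illustration of the idea (not an example taken from the paper), the sketch below writes a small ETL conciliation step first as a relational-algebra expression and then executes an equivalent with pandas; the table and column names are hypothetical.

```python
# Toy illustration: an ETL step written as a relational-algebra expression
# and then executed with pandas. Names are hypothetical, not from the paper.
#
#   result = pi_{customer_id, name, total}(
#              sigma_{total > 0}( customers |><|_{customer_id} orders ) )
import pandas as pd

customers = pd.DataFrame({"customer_id": [1, 2], "name": ["Ana", "Rui"]})
orders = pd.DataFrame({"customer_id": [1, 1, 2], "total": [10.0, 0.0, 7.5]})

joined = customers.merge(orders, on="customer_id")    # natural join
selected = joined[joined["total"] > 0]                # selection (sigma)
result = selected[["customer_id", "name", "total"]]   # projection (pi)
print(result)
```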
Abstract:
Master's dissertation in Special Education (specialization in Early Intervention)
Abstract:
The observational method in tunnel engineering allows the real-time evaluation of the actual ground conditions and the adoption of measures if the ground behavior deviates considerably from predictions. However, it lacks a consistent and structured methodology for using the monitoring data to adapt the support system in real time. Limit criteria above which adaptation is required are not defined, and complex inverse analysis procedures (Rechea et al. 2008, Levasseur et al. 2010, Zentar et al. 2001, Lecampion et al. 2002, Finno and Calvello 2005, Goh 1999, Cui and Pan 2012, Deng et al. 2010, Mathew and Lehane 2013, Sharifzadeh et al. 2012, 2013) may be needed to analyze the problem consistently. In this paper, a methodology for the real-time adaptation of support systems during tunneling is presented. In a first step, limit criteria for displacements and stresses are proposed. The methodology uses charts, constructed during the design stage from parametric calculations, to assist in the process; when these charts are not available, since it is not possible to predict every possible scenario, inverse analysis calculations are carried out. The methodology is applied to the “Bois de Peu” tunnel, which is composed of two tubes over 500 m long. High uncertainty levels existed concerning the heterogeneity of the soil and, consequently, the geomechanical design parameters. The methodology was applied in four sections and the results focus on two of them. It is shown that the methodology has the potential to be applied in real cases, contributing to a consistent approach to the real-time adaptation of the support system, and the results highlight the importance of good-quality, specific monitoring data to improve the inverse analysis procedure.
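For illustration only, the inverse analysis step mentioned above can be sketched as a generic least-squares back-analysis that fits model parameters to measured displacements; the forward model, parameters and numbers below are placeholders, not the calculations used for the Bois de Peu tunnel.

```python
# Generic inverse-analysis sketch: fit parameters so that a forward model
# reproduces measured displacements. The forward model is a placeholder;
# in practice it would be a numerical (e.g. finite-element) tunnel model,
# and the parameter bounds and measurements below are illustrative.
import numpy as np
from scipy.optimize import least_squares

measured = np.array([12.0, 18.0, 21.0])   # mm, monitored displacements (hypothetical)
sections = np.array([1.0, 2.0, 3.0])      # distance behind the face (arbitrary units)

def forward_model(params, x):
    E, k = params                               # stiffness-like and shape-like parameters
    return 1000.0 / E * (1.0 - np.exp(-k * x))  # placeholder response curve

def residuals(params):
    return forward_model(params, sections) - measured

fit = least_squares(residuals, x0=[50.0, 1.0], bounds=([1.0, 0.01], [500.0, 10.0]))
print("back-analysed parameters:", fit.x)
```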
Abstract:
One of the major challenges in the development of an immersive system is handling the delay between the tracking of the user's head position and the updated projection of a 3D image or auralised sound, also called end-to-end delay. Excessive end-to-end delay can result in a general decrease of the “feeling of presence”, the occurrence of motion sickness, and poor performance in perception-action tasks. These latencies must be known in order to provide insights into the technological (hardware/software optimization) or psychophysical (recalibration sessions) strategies to deal with them. Our goal was to develop a new measurement method for end-to-end delay that is both precise and easily replicated. We used a Head and Torso Simulator (HATS) as an auditory signal sensor, a fast-response photo-sensor to detect the visual stimulus response from a Motion Capture System, and a voltage input trigger as the real-time event. The HATS was mounted on a turntable, which allowed us to precisely change the 3D sound relative to the head position. When the virtual sound source was at 90º azimuth, the corresponding HRTF would set all the intensity values to zero; at the same time, a trigger would register the real-time event of turning the HATS to 90º azimuth. Furthermore, with the HATS turned 90º to the left, the motion capture marker visualization would fall exactly on the photo-sensor receptor. This method allowed us to precisely measure the delay from tracking to display. Moreover, our results show that the tracking method, its tracking frequency, and the rendering of sound reflections are the main predictors of end-to-end delay.
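As a rough illustration (not the authors' processing chain), the end-to-end delay can be computed from synchronously recorded trigger and response signals by subtracting their onset times; the sampling rate, thresholds and toy signals below are assumptions.

```python
# Sketch of the delay computation implied above: given synchronously sampled
# recordings of the trigger voltage and of a response channel (photo-sensor
# or HATS output), the end-to-end delay is the time between the two onsets.
import numpy as np

FS = 48000.0  # assumed common sampling rate in Hz

def onset_time(signal, threshold):
    """Return the time (s) of the first sample crossing the threshold."""
    idx = np.flatnonzero(np.abs(signal) >= threshold)
    return idx[0] / FS if idx.size else None

def end_to_end_delay(trigger, response, trig_thr=2.5, resp_thr=0.1):
    t0 = onset_time(trigger, trig_thr)    # real-time event (HATS reaches 90 deg)
    t1 = onset_time(response, resp_thr)   # rendered visual/auditory response
    return None if t0 is None or t1 is None else (t1 - t0) * 1000.0  # ms

# Toy signals: trigger fires at sample 1000, response 2400 samples (~50 ms) later
trigger = np.zeros(10000); trigger[1000:] = 5.0
response = np.zeros(10000); response[3400:] = 1.0
print(end_to_end_delay(trigger, response), "ms")
```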
Abstract:
It is successfully demonstrated that the magnetostriction of nanoparticles can be accurately determined from the magnetoelectric effect measured on polymeric composite materials. This represents a novel, simple and versatile method for determining the magnetostriction of particles in their nano-sized and dispersed state, which has, to date, been a difficult and imprecise task.
Abstract:
Doctoral thesis in Architecture / Architectural Culture.
Abstract:
Studies of the spin and parity quantum numbers of the Higgs boson in the WW∗ → eνμν final state are presented, based on proton-proton collision data collected by the ATLAS detector at the Large Hadron Collider, corresponding to an integrated luminosity of 20.3 fb⁻¹ at a centre-of-mass energy of √s = 8 TeV. The Standard Model spin-parity J^CP = 0^++ hypothesis is compared with alternative hypotheses for both spin and CP. The case where the observed resonance is a mixture of the Standard-Model-like Higgs boson and a CP-even (J^CP = 0^++) or CP-odd (J^CP = 0^+−) Higgs boson in scenarios beyond the Standard Model is also studied. The data are found to be consistent with the Standard Model prediction, and limits are placed on alternative spin and CP hypotheses, including CP mixing in different scenarios.
Abstract:
The normalized differential cross section for top-quark pair production in association with at least one jet is studied as a function of the inverse of the invariant mass of the tt̄+1-jet system. This distribution can be used for a precise determination of the top-quark mass, since gluon radiation depends on the mass of the quarks. The experimental analysis is based on proton-proton collision data collected by the ATLAS detector at the LHC at a centre-of-mass energy of 7 TeV, corresponding to an integrated luminosity of 4.6 fb⁻¹. The selected events were identified using the lepton+jets top-quark-pair decay channel, where lepton refers to either an electron or a muon. The observed distribution is compared to a theoretical prediction at next-to-leading-order accuracy in quantum chromodynamics using the pole-mass scheme. With this method, the measured value of the top-quark pole mass is m_t^pole = 173.7 ± 1.5 (stat.) ± 1.4 (syst.) +1.0/−0.5 (theory) GeV. This result represents the most precise measurement of the top-quark pole mass to date.
Abstract:
Master's internship report in International Business
Abstract:
Master's dissertation in Advanced Optometry
Abstract:
High transverse momentum jets produced in pp collisions at a centre-of-mass energy of 7 TeV are used to measure the transverse energy-energy correlation function and its associated azimuthal asymmetry. The data were recorded with the ATLAS detector at the LHC in 2011 and correspond to an integrated luminosity of 158 pb⁻¹. The selection criteria require the average transverse momentum of the two leading jets in an event to be larger than 250 GeV. The data at detector level are well described by Monte Carlo event generators. They are unfolded to particle level and compared with theoretical calculations at next-to-leading-order accuracy. The agreement between data and theory is good and provides a precision test of perturbative Quantum Chromodynamics at large momentum transfers. From this comparison, the strong coupling constant evaluated at the Z boson mass is determined to be α_s(m_Z) = 0.1173 ± 0.0010 (exp.) +0.0065/−0.0026 (theo.).