796 results for Empirical Algorithm Analysis
Abstract:
Are return migrants more productive than non-migrants? If so, is it a causal effect or simply self-selection? The existing literature has not reached a consensus on the role of return migration for origin countries. To answer these research questions, an empirical analysis was performed on household data collected in Cape Verde. One of the most common identification problems in the migration literature is migrant self-selection. To address this potential selection bias, we use instrumental variable estimation, exploiting the variation provided by unemployment rates in migrant destination countries, and compare it with OLS and Nearest Neighbor Matching (NNM) methods. The instrumental variable approach provides evidence of labour income gains due to return migration, while OLS underestimates the coefficient of interest. This bias points towards negative self-selection of return migrants on unobserved characteristics, although the different estimates cannot be distinguished statistically. Interestingly, migration duration and occupational changes after migration do not seem to influence post-migration income. There is weak evidence that return migrants from the United States have higher income gains from migration than those who returned from Portugal.
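The identification logic described above (OLS biased by negative selection, recovered by an instrument) can be illustrated with a minimal two-stage least squares sketch on simulated data. The variable names, coefficients, and data-generating process below are illustrative assumptions, not the paper's actual specification: an unobserved "ability" term lowers both the migration probability and enters income, while an exogenous shock (standing in for destination-country unemployment) shifts migration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000
ability = rng.normal(size=n)        # unobserved productivity (not in the regressions)
z = rng.normal(size=n)              # instrument: destination unemployment shock

# Negative selection: low-ability individuals are more likely to return-migrate.
migrant = (0.8 * z - 0.8 * ability + rng.normal(size=n) > 0).astype(float)
income = 1.0 * migrant + 1.0 * ability + rng.normal(size=n)   # true effect = 1.0

# OLS of income on migration status (biased: migrant correlates with ability)
X = np.column_stack([np.ones(n), migrant])
ols = np.linalg.lstsq(X, income, rcond=None)[0][1]

# 2SLS: first stage projects migration status on the instrument,
# second stage regresses income on the fitted values.
Z = np.column_stack([np.ones(n), z])
mig_hat = Z @ np.linalg.lstsq(Z, migrant, rcond=None)[0]
Xhat = np.column_stack([np.ones(n), mig_hat])
iv = np.linalg.lstsq(Xhat, income, rcond=None)[0][1]
```

With this setup the OLS coefficient lands well below the true effect while the IV point estimate is close to 1.0, reproducing the downward-bias pattern the abstract reports.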
Abstract:
INTRODUCTION: Leprosy in Brazil is a public health issue, and there are many regions in the State of Espírito Santo with high endemic incidence levels of leprosy, characterizing this state as a priority for leprosy programs. The aim of this study was to determine the spatial distribution of coefficients of new cases of leprosy in the State of Espírito Santo, Brazil. METHODS: We conducted a descriptive and ecologic study based on the spatial distribution of leprosy in the State of Espírito Santo between 2004 and 2009. Data were gathered from the available records of the Espírito Santo State Health Secretary. The global and local empirical Bayesian methods were used to produce an estimate of leprosy risk, smoothing the fluctuation effects of the detection coefficients. RESULTS: The study resulted in a coefficient adjustment of new cases in 10 towns that changed their classification, among which, 2 went from low to medium, 4 from medium to high, 3 from high to very high, and 1 from very high to hyper-endemic. An average variation of 1.02, fluctuating between 0 and 12.39 cases/100,000 inhabitants, was found in a comparative calculation between the local EBest value and the average coefficient of new leprosy cases in the State of Espírito Santo. CONCLUSIONS: The spatial analysis of leprosy favors the establishment of control strategies with a better cost-benefit relationship since it reveals specific and priority regions, thereby enabling the development of actions that can interfere in the transmission chain.
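The empirical Bayesian smoothing used here shrinks unstable rates from small towns toward a reference rate, so that a town with 2 cases in 500 inhabitants is not classified by its noisy raw rate alone. A minimal sketch of the global variant (a Marshall-style method-of-moments estimator; the case counts and populations below are invented for illustration):

```python
import numpy as np

def eb_smooth(cases, pop):
    """Global empirical Bayes smoothing of area-level detection rates."""
    cases, pop = np.asarray(cases, float), np.asarray(pop, float)
    rates = cases / pop
    m = cases.sum() / pop.sum()                       # overall mean rate (prior mean)
    # Method-of-moments estimate of the between-area variance of true rates
    s2 = (pop * (rates - m) ** 2).sum() / pop.sum() - m / pop.mean()
    s2 = max(s2, 0.0)                                 # variance cannot be negative
    w = s2 / (s2 + m / pop)                           # shrinkage weight per area
    return w * rates + (1 - w) * m                    # small areas shrink toward m

cases = [2, 0, 30, 5]
pop = [500, 800, 20_000, 1_200]
smoothed = eb_smooth(cases, pop)
```

Areas with small populations get small weights and are pulled strongly toward the global mean, while the rate of the large area is left almost unchanged; the local variant replaces the global mean with a neighbourhood mean.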
Abstract:
Enterprise Risk Management (ERM) is gaining relevance among financial and non-financial companies, but its benefits are still uncertain. This paper aims to investigate the relationship between ERM adoption and firm performance based on a sample of 1130 non-financial companies belonging to the STOXX® index. A content analysis of individual accounts is performed to distinguish adopters, and a regression analysis explores the effect of ERM adoption on firm performance, proxied by Tobin’s Q. The findings suggest a statistically significant positive effect of ERM adoption on firm performance, meaning that firms benefit from the implementation of this process.
Abstract:
In this research we conducted a mixed-methods study, using qualitative and quantitative analysis to examine the relationship between mobile advertising and mobile app user acquisition, and the conclusions companies can derive from it. Data were gathered from the management of mobile advertising campaigns for a portfolio of three different mobile apps. We found that a number of implications can be extracted from this intersection, namely for product development, internationalisation and management of the marketing budget. We propose further research on alternative app user sources, the impact of revenue on apps and the exploitation of product segments: wearable technology and the Internet of Things.
Abstract:
Introduction In 1999, Birigui and Araçatuba were the first municipalities in the State of São Paulo to present autochthonous cases of visceral leishmaniasis in humans (VLH). The aim of this study was to describe the temporal, spatial and spatiotemporal behaviors of VLH in Birigui. Methods Secondary data were obtained from the Notifiable Diseases Information System from 1999 to 2012. The incidence, mortality and case fatality rates by sex and age were calculated. The cases of VLH were geocoded and grouped according to census tracts. Local empirical Bayesian incidence rates were calculated. The existence of spatial and spatiotemporal clusters was investigated using SaTScan software. Results There were 156 confirmed cases of autochthonous VLH. The incidence rate was higher in the 0-4-year-old children, and the mortality and case fatality rates were higher in people aged 60 years and older. The peaks of incidence occurred in 2006 and 2011. The Bayesian rates identified the presence of VLH in all of the census tracts in the municipality; however, spatial and spatiotemporal clusters were found in the central area of the municipality. Conclusions Birigui, located in the Araçatuba region, has recently experienced increasing numbers of VLH cases; this increase is contrary to the behavior observed over the entire region, which has shown a decreasing trend in the number of VLH cases. The observations that the highest incidence is in children 0-4 years old and the highest mortality is in people 60 years and older are in agreement with the expected patterns of VLH.
Abstract:
Ship tracking systems allow maritime organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, the geographical coverage of ship tracking platforms has increased significantly in recent years, from radar-based near-shore traffic monitoring to a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for estimating the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a Genetic Algorithm to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a maritime safety expert.
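The route-estimation idea can be sketched with a toy genetic algorithm: candidate routes are lists of intermediate waypoints, and fitness rewards passing through areas of high historical traffic density while penalising route length. Everything below is an illustrative stand-in (a synthetic density ridge along the diagonal instead of EMSA position data, and arbitrary GA settings), not the thesis's actual encoding:

```python
import random

# Synthetic "historical traffic density": high along the corridor y = x.
def density(p):
    x, y = p
    return 1.0 / (1.0 + (y - x) ** 2)

START, END = (0.0, 0.0), (10.0, 10.0)
N_WAY = 4                      # intermediate waypoints per candidate route

def fitness(route):
    pts = [START] + route + [END]
    dens = sum(density(p) for p in route)
    length = sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
                 for a, b in zip(pts, pts[1:]))
    return dens - 0.1 * length  # reward the traffic corridor, penalise detours

def random_route(rng):
    return [(rng.uniform(0, 10), rng.uniform(0, 10)) for _ in range(N_WAY)]

def mutate(route, rng):
    route = list(route)
    i = rng.randrange(N_WAY)
    x, y = route[i]
    route[i] = (x + rng.gauss(0, 0.5), y + rng.gauss(0, 0.5))
    return route

def crossover(a, b, rng):
    cut = rng.randrange(1, N_WAY)     # one-point crossover on the waypoint list
    return a[:cut] + b[cut:]

rng = random.Random(42)
pop = [random_route(rng) for _ in range(60)]
for _ in range(120):
    pop.sort(key=fitness, reverse=True)
    elite = pop[:20]                  # truncation selection
    pop = elite + [mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
                   for _ in range(40)]
best = max(pop, key=fitness)
```

After evolution the best route's waypoints drift toward the high-density corridor, which is the behaviour the prototype exploits when reconstructing the most operated route between two locations.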
Abstract:
The present paper reports the precipitation process of Al3Sc structures in an aluminum-scandium alloy, simulated with a synchronous parallel kinetic Monte Carlo (spkMC) algorithm. The spkMC implementation is based on the vacancy diffusion mechanism. To filter the raw data generated by the spkMC simulations, the density-based spatial clustering of applications with noise (DBSCAN) method was employed. The spkMC and DBSCAN algorithms were implemented in C using the MPI library. The simulations were conducted on the SeARCH cluster located at the University of Minho. The Al3Sc precipitation was successfully simulated at the atomistic scale with the spkMC. DBSCAN proved to be a valuable aid in identifying the precipitates by performing a cluster analysis of the simulation results. The simulation results are in good agreement with those reported in the literature for sequential kinetic Monte Carlo (kMC) simulations. The parallel implementation of kMC provided a 4x speedup over the sequential version.
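DBSCAN groups points that are densely packed (each core point has at least `min_pts` neighbours within radius `eps`) and marks isolated points as noise, which is exactly what makes it suitable for separating precipitates from dissolved atoms. A minimal from-scratch sketch (in Python rather than the paper's C/MPI implementation, and with toy 2-D points instead of atomistic positions):

```python
import numpy as np

def dbscan(points, eps, min_pts):
    """Minimal DBSCAN: returns one label per point (-1 = noise)."""
    pts = np.asarray(points, float)
    n = len(pts)
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    neighbors = [np.flatnonzero(dist[i] <= eps) for i in range(n)]  # includes self
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue                      # already assigned, or not a core point
        labels[i] = cluster               # expand a new cluster from core point i
        stack = list(neighbors[i])
        while stack:
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_pts:   # j is also a core point
                    stack.extend(neighbors[j])
        cluster += 1
    return labels

# Two tight "precipitates" plus one isolated point that should be noise.
blob1 = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1), (0.05, 0.05)]
blob2 = [(5.0, 5.0), (5.1, 5.0), (5.0, 5.1), (5.1, 5.1), (5.05, 5.05)]
labels = dbscan(blob1 + blob2 + [(10.0, 10.0)], eps=0.3, min_pts=3)
```

The two blobs come out as two distinct cluster labels and the lone point is labelled -1; in the paper's setting, cluster membership counts would then give the precipitate sizes.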
Abstract:
The monitoring data collected during tunnel excavation can be used in inverse analysis procedures to identify more realistic geomechanical parameters that increase knowledge about the formations of interest. These more realistic parameters can be used in real time to adapt the project to the actual in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not specifically for the purpose of inverse analysis. In fact, there is a lack of knowledge about what types and quantities of measurements are needed to succeed in identifying the parameters of interest. The optimisation algorithm chosen for the identification procedure may also be important in this regard. In this work, this problem is addressed using a theoretical case with which a thorough parametric study was carried out using two optimisation algorithms based on different calculation paradigms, namely a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters to identify several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and the chosen algorithm for the success rate of the inverse analysis process.
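The inverse analysis loop amounts to minimising the misfit between measured and predicted responses over the geomechanical parameters. A compact sketch of the evolution-strategy side, using a (1+1)-ES with 1/5th-rule-style step adaptation; the linear "forward model" and the two parameters below are purely illustrative stand-ins, not the paper's tunnel model:

```python
import random

# Illustrative forward model: "displacements" predicted from two parameters
# (stand-ins for, e.g., a stiffness and a stress-ratio parameter).
def forward(E, k):
    return [E * 0.1 + k, E * 0.05 - k, E * 0.2 + 2 * k]

true_obs = forward(50.0, 1.5)     # synthetic monitoring data (true params known)

def misfit(p):
    pred = forward(*p)
    return sum((a - b) ** 2 for a, b in zip(pred, true_obs))

# (1+1)-evolution strategy: mutate, keep the candidate if it is no worse,
# grow the step size on success and shrink it on failure.
rng = random.Random(1)
p, sigma = [10.0, 0.0], 2.0
for _ in range(2000):
    cand = [x + rng.gauss(0, sigma) for x in p]
    if misfit(cand) <= misfit(p):
        p, sigma = cand, sigma * 1.1
    else:
        sigma *= 0.98
best_E, best_k = p
```

Unlike a gradient-based method, this needs only forward-model evaluations (no derivatives), which is why evolution strategies remain usable when the numerical model makes gradients noisy or unavailable, at the cost of many more evaluations.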
Abstract:
NIPE - WP 02/2016
Abstract:
NIPE - WP 01/2016
Abstract:
OBJECTIVE: Theoretical and empirical analysis of items and internal consistency of the Portuguese-language version of the Social Phobia and Anxiety Inventory (SPAI-Portuguese). METHODS: Social phobia experts conducted a 45-item content analysis of the SPAI-Portuguese administered to a sample of 1,014 university students. Item discrimination was evaluated by Student's t test; inter-item, mean and item-to-total correlations, by Pearson coefficient; reliability was estimated by Cronbach's alpha. RESULTS: There was 100% agreement among experts concerning the 45 items. On the SPAI-Portuguese, 43 items were discriminative (p < 0.05). A few inter-item correlations between both subscales were below 0.2. The mean inter-item correlations were: 0.41 on the social phobia subscale; 0.32 on the agoraphobia subscale and 0.32 on the SPAI-Portuguese. Item-to-total correlations were all higher than 0.3 (p < 0.001). Cronbach's alphas were: 0.95 on the SPAI-Portuguese; 0.96 on the social phobia subscale; 0.85 on the agoraphobia subscale. CONCLUSION: The 45-item content analysis revealed appropriateness concerning the underlying construct of the SPAI-Portuguese (social phobia, agoraphobia), with good discriminative capacity on 43 items. The mean inter-item correlations and reliability coefficients demonstrated the internal consistency and multidimensionality of the SPAI-Portuguese and its subscales. No item was suppressed in the SPAI-Portuguese, but the authors suggest that a shortened SPAI, in its different versions, could be an even more useful tool for research settings in social phobia.
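Cronbach's alpha, the reliability coefficient reported above, compares the sum of the item variances with the variance of the total score: alpha = k/(k-1) * (1 - sum(var_i)/var_total). A minimal sketch with made-up score matrices (rows = respondents, columns = items):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    X = np.asarray(items, float)
    k = X.shape[1]
    item_var_sum = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)        # variance of the total score
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Perfectly consistent items give alpha = 1; unrelated items give a low alpha.
alpha_high = cronbach_alpha([[1, 1], [2, 2], [3, 3], [4, 4]])
alpha_low = cronbach_alpha([[1, 4], [2, 1], [3, 3], [4, 2]])
```

When items move together, the total-score variance dwarfs the summed item variances and alpha approaches 1, which is the pattern behind the 0.95-0.96 values reported for the SPAI-Portuguese scales.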
Abstract:
Allied to an epidemiological study of the population of the Senology Unit of Braga’s Hospital diagnosed with malignant breast cancer, we describe the progression in time of repeated measurements of the tumor marker carcinoembryonic antigen (CEA). Our main purpose is to describe the progression of this tumor marker as a function of possible risk factors and, hence, to understand how these risk factors influence that progression. The response variable, the CEA value, was analyzed using longitudinal models, testing for different correlation structures. The same covariates used in a previous survival analysis were considered in the longitudinal model. The reference time used was the time from diagnosis until death from breast cancer. For diagnostics of the fitted models we used empirical and theoretical variograms. To evaluate the fixed term of the longitudinal model we tested for a change point in the effect of time on the tumor marker progression. A longitudinal model was also fitted only to the subset of patients who died from breast cancer, using as reference time the time from the date of death until the blood test.
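The empirical variogram used for model diagnostics estimates, for each time lag h, half the mean squared difference between measurements that far apart: gamma(h) = 0.5 * E[(z(t) - z(t+h))^2]. A minimal one-series sketch (the binned-lag implementation and the toy data are illustrative; in practice the variogram is pooled over patients' residuals):

```python
import numpy as np

def empirical_variogram(t, z, bin_edges):
    """Empirical semivariogram of a series z observed at times t.

    gamma(h) = 0.5 * mean (z_i - z_j)^2 over pairs whose lag falls in bin h.
    """
    t, z = np.asarray(t, float), np.asarray(z, float)
    i, j = np.triu_indices(len(t), k=1)          # all pairs once
    lags = np.abs(t[i] - t[j])
    half_sq = 0.5 * (z[i] - z[j]) ** 2
    which = np.digitize(lags, bin_edges)
    return np.array([half_sq[which == b].mean() if np.any(which == b) else np.nan
                     for b in range(1, len(bin_edges))])

# Alternating series: pairs one step apart differ, two steps apart agree.
gamma = empirical_variogram([0, 1, 2, 3], [0, 1, 0, 1], [0.5, 1.5, 2.5, 3.5])
```

Comparing this empirical curve against the theoretical variogram implied by a fitted correlation structure is the diagnostic step the abstract refers to.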
Abstract:
Background: Several researchers seek methods for the selection of homogeneous groups of animals in experimental studies, since homogeneity is an indispensable prerequisite for the randomization of treatments. The lack of robust methods that comply with statistical and biological principles is the reason why researchers use empirical or subjective methods, influencing their results. Objective: To develop a multivariate statistical model for the selection of a homogeneous group of animals for experimental research and to elaborate a computational package to use it. Methods: The set of echocardiographic data of 115 male Wistar rats with supravalvular aortic stenosis (AoS) was used as an example of model development. Initially, the data were standardized and became dimensionless. Then, the variance matrix of the set was submitted to principal components analysis (PCA), aiming to reduce the parametric space while retaining the relevant variability. That technique established a new Cartesian system into which the animals were allocated, and finally the confidence region (ellipsoid) was built for the profile of the animals’ homogeneous responses. The animals located inside the ellipsoid were considered as belonging to the homogeneous batch; those outside the ellipsoid were considered spurious. Results: The PCA established eight descriptive axes that accounted for 88.71% of the accumulated variance of the data set. The allocation of the animals in the new system and the construction of the confidence region revealed six spurious animals as compared to the homogeneous batch of 109 animals. Conclusion: The biometric criterion presented proved to be effective, because it considers the animal as a whole, analyzing jointly all parameters measured, in addition to having a small discard rate.
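The standardize-then-PCA-then-ellipsoid pipeline can be sketched compactly: after projecting the standardized data onto the leading principal components, the confidence ellipsoid is the set of animals whose Mahalanobis distance in component space falls below a chi-square quantile. The sketch below keeps 2 components and uses the 95% chi-square quantile for 2 degrees of freedom (5.991); the random data and the single planted outlier are illustrative, not the rats' echocardiographic measurements:

```python
import numpy as np

def select_homogeneous(X, n_pc=2, cutoff=5.991):
    """True = inside the confidence ellipsoid (homogeneous batch)."""
    X = np.asarray(X, float)
    Z = (X - X.mean(0)) / X.std(0, ddof=1)       # standardized, dimensionless
    vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    order = np.argsort(vals)[::-1][:n_pc]        # leading principal components
    scores = Z @ vecs[:, order]                  # coordinates in the new system
    d2 = (scores ** 2 / vals[order]).sum(axis=1)  # squared Mahalanobis distance
    return d2 <= cutoff

rng = np.random.default_rng(3)
data = rng.normal(size=(100, 5))                 # 100 "animals", 5 parameters
data[0] = 15.0                                   # one clearly aberrant animal
inside = select_homogeneous(data)
```

The planted outlier lands far outside the ellipsoid and is discarded, while the bulk of the animals form the homogeneous batch; the paper's version retains eight components and builds the region accordingly.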
Abstract:
Extending the traditional input-output model to account for the environmental impacts of production processes reveals the channels by which environmental burdens are transmitted throughout the economy. In particular, the environmental input-output approach is a useful technique for quantifying the changes in the levels of greenhouse emissions caused by changes in the final demand for production activities. The input-output model can also be used to determine the changes in the relative composition of greenhouse gas emissions due to exogenous inflows. In this paper we describe a method for evaluating how exogenous changes in sectorial demand, such as changes in private consumption, public consumption, investment and exports, affect the relative contribution of the six major greenhouse gases regulated by the Kyoto Protocol to total greenhouse emissions. The empirical application is for Spain, and the economic and environmental data are for the year 2000. Our results show that there are significant differences in the effects of different sectors on the composition of greenhouse emissions. Therefore, the final impact on the relative contribution of pollutants will basically depend on the activity that receives the exogenous shock in final demand, because there are considerable differences in the way, and the extent to which, individual activities affect the relative composition of greenhouse gas emissions.
Keywords: greenhouse emissions, composition of emissions, sectorial demand, exogenous shock.
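The mechanics of the environmental input-output approach reduce to two matrix operations: gross output needed to satisfy final demand is x = (I - A)^-1 f via the Leontief inverse, and emissions follow by weighting output with sectoral emission intensities. A toy three-sector sketch (the coefficient matrix and intensities below are invented for illustration, not the Spanish data for 2000):

```python
import numpy as np

# Technical coefficients A[i, j]: input from sector i per unit of output of j.
A = np.array([[0.1, 0.2, 0.0],
              [0.3, 0.1, 0.2],
              [0.0, 0.2, 0.1]])
e = np.array([2.0, 0.5, 1.0])        # GHG intensity: emissions per unit of output

L = np.linalg.inv(np.eye(3) - A)     # Leontief inverse (I - A)^-1

def emissions(final_demand):
    x = L @ final_demand             # gross output required, direct + indirect
    return e @ x                     # total emissions

base = emissions(np.array([100.0, 100.0, 100.0]))
shock = emissions(np.array([110.0, 100.0, 100.0]))  # +10 exports in sector 1
```

The induced change (shock - base) exceeds the direct effect of 10 x 2.0 because the extra demand also pulls output, and hence emissions, from the supplying sectors; repeating this gas by gas yields the compositional shifts the paper analyses.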
Abstract:
The aim of the paper is to analyse the economic impact of alternative policies implemented on the energy activities of the Catalan production system. Specifically, we analyse the effects of a tax on intermediate energy uses, a reduction in the final production of energy, and a reduction in intermediate energy uses. The methodology involves two versions of the input-output price model: a competitive price formulation and a mark-up price formulation. The input-output price framework will make it possible to evaluate how the alternative measures modify production prices, consumption prices, private welfare, and intermediate energy uses. The empirical application is for the Catalan economy and uses economic data for the year 2001.