948 results for "Non-parametric methods"
Abstract:
Large and sustained differences in economic performance across regions of developing countries have long provided motivation for fiscal incentives designed to encourage firm entry in lagging areas. Empirical evidence in support of these policies has, however, been weak at best. This paper undertakes a direct evaluation of the most prominent fiscal incentive policy in Brazil, the Fundos Constitucionais de Financiamento (Constitutional Funds). In doing so, we exploit valuable features of the Brazilian Ministry of Labor's RAIS data set to address two important elements of firm location decisions that have the potential to bias an assessment of the Funds: (i) firm "family structure" (in particular, proximity to headquarters for vertically integrated firms), and (ii) unobserved spatial heterogeneity (with the potential to confound the effects of the Funds). We find that the pull of firm headquarters is very strong relative to the Constitutional Funds for vertically integrated firms, but that, with non-parametric controls for time-invariant spatial heterogeneity, the Funds provide statistically and economically significant incentives for firms in many of the targeted industries.
Abstract:
This paper proposes a novel method for calculating tail risk that incorporates risk-neutral information without depending on options data. Proceeding via a non-parametric approach, we derive a stochastic discount factor that correctly prices a chosen panel of stock returns. Under the assumption that state probabilities are homogeneous, we back out the risk-neutral distribution and calculate five primitive tail risk measures, all extracted from this risk-neutral probability. The final measure is then set as the first principal component of the preliminary measures. Using six Fama-French size and book-to-market portfolios to calculate our tail risk, we find that it has significant predictive power when forecasting market returns one month ahead, aggregate U.S. consumption and GDP one quarter ahead, and macroeconomic activity indexes. Conditional Fama-MacBeth two-pass cross-sectional regressions reveal that our factor carries a positive risk premium when controlling for traditional factors.
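The final aggregation step named in the abstract, collapsing several preliminary tail-risk measures into one factor via the first principal component, can be sketched as follows; the simulated panel of measures is illustrative, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative panel: 120 months x 5 preliminary tail-risk measures.
measures = rng.normal(size=(120, 5))
measures[:, 1] += 0.8 * measures[:, 0]  # make the measures correlated

# First principal component: project the standardized measures onto the
# leading eigenvector of their correlation matrix.
z = (measures - measures.mean(0)) / measures.std(0)
eigval, eigvec = np.linalg.eigh(np.corrcoef(z, rowvar=False))
pc1 = z @ eigvec[:, -1]          # eigh sorts eigenvalues in ascending order

tail_risk = pc1 / pc1.std()      # final measure, normalized to unit variance
```

The resulting `tail_risk` series would then enter the forecasting and Fama-MacBeth regressions described in the abstract.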
Abstract:
Life cycle general equilibrium models with heterogeneous agents have a very hard time reproducing the American wealth distribution. A common assumption made in this literature is that all young adults enter the economy with no initial assets. In this article, we relax this assumption, which is not supported by the data, and evaluate the ability of an otherwise standard life cycle model to account for U.S. wealth inequality. The new feature of the model is that agents enter the economy with assets drawn from an initial distribution of assets, which is estimated using a non-parametric method applied to data from the Survey of Consumer Finances. We find that heterogeneity with respect to initial wealth is key for this class of models to replicate the data. According to our results, American inequality can be explained almost entirely by the fact that some individuals are lucky enough to be born into wealth, while others are born with few or no assets.
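The non-parametric estimation of an initial asset distribution can be sketched with a kernel density estimate; the log-normal stand-in for the Survey of Consumer Finances data below is purely illustrative:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)
# Stand-in for SCF net worth of young adults: a right-skewed sample
# (log-normal), since the actual SCF data are not reproduced here.
observed_assets = rng.lognormal(mean=10.0, sigma=1.5, size=2000)

# Non-parametric (Gaussian kernel) estimate of the initial asset distribution.
kde = gaussian_kde(observed_assets)

# Endow each newborn agent in the model with a draw from the estimate.
initial_assets = kde.resample(size=500, seed=2).ravel()
initial_assets = np.clip(initial_assets, 0.0, None)  # no negative endowments
```

Each cohort entering the model economy would be initialized with such draws instead of zero assets.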
Abstract:
The objective of this work is to verify whether Multimarket (Multimercado) investment funds in Brazil generate significantly positive alphas, that is, whether their managers have skill and contribute positively to the returns of their funds. To calculate fund alphas, we used a seven-factor model based mainly on Edwards and Caglayan (2001), with the inclusion of a stock illiquidity factor. The period analyzed runs from 2003 to 2013. We find that, on average, multimarket funds generate negative alpha. However, although the percentage of funds generating positive intercepts is low, their magnitude is considerable. The results differ substantially by Anbima classification and by the database used. We also test whether the performance of these funds is persistent, using a non-parametric model based on contingency tables. We find no evidence of persistence, not even when funds are separated by classification.
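The contingency-table persistence test named above is commonly built as a 2x2 winner/loser table across two consecutive periods; a minimal sketch with simulated alphas (not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(3)
# Simulated fund alphas in two consecutive periods (no true persistence).
alpha_t1 = rng.normal(size=200)
alpha_t2 = rng.normal(size=200)

# Classify each fund as winner (above median) or loser in each period.
win1 = alpha_t1 > np.median(alpha_t1)
win2 = alpha_t2 > np.median(alpha_t2)

# 2x2 contingency table: rows = period-1 status, columns = period-2 status.
table = np.array([
    [np.sum(win1 & win2),  np.sum(win1 & ~win2)],   # WW, WL
    [np.sum(~win1 & win2), np.sum(~win1 & ~win2)],  # LW, LL
])
chi2, pvalue, dof, expected = chi2_contingency(table)
```

A significant chi-square statistic would indicate that winners tend to stay winners, i.e. persistence.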
Abstract:
The application of ergonomics in product design is essential to accessibility and usability. The development of manual devices should be based on ergonomic principles, and effort perception analysis is an essential approach to understanding the physical and subjective aspects of the interface. The objective of the present study was to analyze effort perception during a simulated task with different door handles, performed by Portuguese subjects of both genders and different ages. This cross-sectional study complied with ethical requirements. A total of 180 subjects of both genders, divided into three age groups, participated. Five door handles with different shapes were evaluated, and a subjective 5-point numeric rating scale was used to rate the effort. The non-parametric Friedman test was applied for statistical analysis. The results showed no significant differences in effort perception between door handles "A" and "B", "A" and "D", or "D" and "C". Door handle "E" presented the lowest values of all. In general, there is an inverse relationship between the results of biomechanical studies and the effort perceived in the same task. This shows that door handle design directly influences both variables and can affect the accessibility and usability of these kinds of products.
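A minimal sketch of the Friedman test applied to this design (180 subjects each rating five handles), with simulated ratings standing in for the study's data:

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(4)
# Illustrative 5-point effort ratings: 180 subjects x 5 door handles (A..E).
ratings = rng.integers(1, 6, size=(180, 5)).astype(float)
# Handle "E" is given systematically lower (easier) scores, as in the study.
ratings[:, 4] = np.clip(ratings[:, 4] - 1, 1, 5)

# Friedman test for related samples: every subject rates every handle.
stat, pvalue = friedmanchisquare(*[ratings[:, k] for k in range(5)])
```

A significant result would then be followed by pairwise comparisons such as those reported for handles A-B, A-D, and D-C.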
Abstract:
Understanding the origin and maintenance of current biodiversity, and the mechanisms that operate within it, is the major goal of ecology. Species ecology can be influenced by different factors at different scales. There are three approaches to the ecological differences between species: the first holds that differences result from current processes acting on niche characteristics (e.g. diet, time, space); the second, that species differences are explained by random patterns of speciation, extinction, and dispersal; the third, that historical events explain the formation and composition of species in communities. This study aims to evaluate the influence of phylogenetic relationships in determining ecological characteristics in amphibians (globally) and thereby to test whether ecological differences between frog species are the result of ancient pre-existing differences or of current interactions. Another objective is to verify whether ecological, historical, or current characteristics determine the size of species' geographical distributions. The diet data for the analysis of trophic ecology were collected from the published literature. We performed a non-parametric MANOVA to test for phylogenetic effects on diet shifts across frog history, with the aim of identifying the main factors that allow the coexistence of anuran species. We also performed a phylogenetic regression to analyze whether niche breadth, body size, and evolutionary age determine the size of the geographical distribution of amphibians in the Amazon. In the present study, new contributions to the knowledge of major ecological patterns of anurans are discussed under a phylogenetic perspective.
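A non-parametric MANOVA is often run as a PERMANOVA, a permutation test of a pseudo-F statistic on a distance matrix. A minimal hand-rolled sketch under that assumption, with toy "diet composition" data in place of the real trait matrix:

```python
import numpy as np

def permanova(dist, labels, n_perm=999, seed=0):
    """One-way PERMANOVA (non-parametric MANOVA) on a distance matrix."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    n = len(labels)
    groups = np.unique(labels)
    sq = dist ** 2

    def pseudo_f(lab):
        ss_total = sq[np.triu_indices(n, 1)].sum() / n
        ss_within = 0.0
        for g in groups:
            idx = np.where(lab == g)[0]
            block = sq[np.ix_(idx, idx)]
            ss_within += block[np.triu_indices(len(idx), 1)].sum() / len(idx)
        ss_between = ss_total - ss_within
        return (ss_between / (len(groups) - 1)) / (ss_within / (n - len(groups)))

    f_obs = pseudo_f(labels)
    hits = sum(pseudo_f(rng.permutation(labels)) >= f_obs for _ in range(n_perm))
    return f_obs, (hits + 1) / (n_perm + 1)

# Toy example: two clades with clearly distinct diet compositions.
rng = np.random.default_rng(5)
pts = np.vstack([rng.normal(0, 1, (15, 3)), rng.normal(2, 1, (15, 3))])
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
f_stat, p_value = permanova(dist, [0] * 15 + [1] * 15, n_perm=199)
```

The group labels here play the role of phylogenetic clades, and a small p-value indicates a phylogenetic effect on diet.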
Abstract:
This work analyzes the cost efficiency of producers in the Baixo-Açu irrigation project and identifies the determinants of that efficiency. To this end, a cost frontier was first estimated by the non-parametric method of Data Envelopment Analysis (DEA), and producers' efficiency scores were measured. In a second stage, a Tobit regression model was used to estimate a cost-inefficiency function and identify the factors associated with resource waste. Among the results, a high level of resource waste was found, representing more than 54% of effective cost. The inputs with the greatest waste include energy, herbicides, pesticides, and chemical fertilizers. In general, the producers showed low efficiency levels, and only two of the seventy-five surveyed reached the cost-minimization frontier. These results suggest that irrigated fruit producers in the Baixo-Açu project do not seek to minimize production costs. It was also found that reducing resource waste, and thus cost inefficiency, is associated with the farmer's education, experience in agriculture, and access to technical assistance and credit.
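The first-stage DEA frontier can be sketched as an input-oriented efficiency linear program. This is a standard CCR formulation and may differ from the specific DEA cost model used in the work; the producer data are toy numbers:

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y, unit):
    """Input-oriented CCR DEA efficiency score (theta in (0, 1]) for one unit.
    X: (n, m) input matrix, Y: (n, s) output matrix."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [theta, lambda_1 .. lambda_n]; minimize theta.
    c = np.zeros(1 + n)
    c[0] = 1.0
    # Inputs:  sum_j lambda_j * x_ij - theta * x_i0 <= 0
    A_in = np.hstack([-X[unit][:, None], X.T])
    # Outputs: -sum_j lambda_j * y_rj <= -y_r0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[unit]]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

# Toy data: 5 producers, 2 cost inputs, 1 output (normalized production).
X = np.array([[2.0, 3.0], [4.0, 2.0], [4.0, 6.0], [6.0, 3.0], [8.0, 8.0]])
Y = np.ones((5, 1))
scores = np.array([dea_efficiency(X, Y, j) for j in range(5)])
```

A score below 1 measures how proportionally a producer could shrink its inputs while keeping output, i.e. the resource waste the abstract quantifies; the second-stage Tobit regression would then use `1 - scores` as the dependent variable.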
Abstract:
The analysis of some aspects of development in Brazil over the past three decades reveals improvement on a range of indicators, taken in isolation, in the Southeast, the richest region, and the Northeast, the poorest. Using a database of twenty variables, the main purpose of the study was to verify whether there are signs of convergence or divergence across five dimensions of development between the two regions from 1990 to 2010. Cluster analysis was used to identify the most similar and most dissimilar states and to track changes in the composition of the low- and high-development groups over the period. Additionally, the non-parametric Wilcoxon test was used to test the equality of distances between states over time, making it possible to verify whether the distance between the states of the two regions has been increasing or falling, indicating divergence or convergence. The cluster analysis suggests signs of convergence within the Northeast cluster, but the distance between the two regions has not changed. The Wilcoxon test likewise indicates no statistically significant changes in the distances between the states: within each region the standards of development became more homogeneous, but the two regions remain far apart.
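The Wilcoxon comparison of inter-state distances at two points in time can be sketched as a paired signed-rank test; the distances below are illustrative, not the study's indicators:

```python
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(6)
# Illustrative pairwise distances between Northeast and Southeast states
# on a composite development index, measured in 1990 and again in 2010.
dist_1990 = rng.uniform(0.5, 2.0, size=36)
dist_2010 = dist_1990 + rng.normal(0.0, 0.05, size=36)  # essentially unchanged

# Paired non-parametric test: has the distribution of distances shifted?
stat, pvalue = wilcoxon(dist_1990, dist_2010)
```

A non-significant p-value, as in the study, means the gap between the regions has neither widened nor narrowed.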
Abstract:
This work analyzes the concept of "paradox" as posed in A Budget of Paradoxes (1872) by the English mathematician and logician Augustus De Morgan (1806-1871). It is important to note that a large part of this book consists of reprints of a series of writings by the author in the journal Athenaeum, where he worked as a reviewer of literature. The texts refer to scientific works produced between 1489 and 1866, and the rule of selection for the composition of the book is, basically, the methodological approach used or disclosed by those scholars. The concept of paradox is presented at two distinct moments. First, we examine definitions of the term in a philosophical approach, characterizing it as something that requires further investigation, complemented by classic examples from a scientific context. Second, we present the concept advocated by De Morgan, under whose perspective the "paradox" is directly related to the non-usual methods employed in the formulation of new scientific theories. Some of these scientific concepts are detailed in this study, where, through historical reconstruction, questions of Mathematics, Physics, and Logic, among others, enter our discussion. With the preliminary analysis in hand and its comparison with De Morgan's conception, it became possible to diagnose some limitations in the conceptualization suggested by the author and, further, to evidence, in light of the cases, the nonlinearity of the process of knowledge production and hence of the progress of science.
Abstract:
The adaptability and stability of soybean (Glycine max L.) genotypes were evaluated using the classical methodology of Eberhart and Russell, and the stability of the same genotypes was assessed using Huhn's non-parametric methodology. The experiments were conducted in a randomized complete block design with three replications and 30 treatments (soybean genotypes) over three consecutive years. The experimental plots consisted of four rows, spaced 0.50 m apart, at a density of 25 plants per linear meter. The usable area comprised the central rows, discarding 0.5 m from each end. The methodologies were compared using the grain yield trait. A significant rank correlation was found between the genotypes' regression deviation and the two non-parametric stability measures, but the same was not observed between the regression coefficient and the non-parametric measures (Si(1) and Si(2)). The measures Si(1) and Si(2) proved to be almost perfectly correlated.
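The rank correlations between the Eberhart-Russell regression deviation and Huhn's Si(1) and Si(2) measures can be sketched with Spearman's coefficient; the genotype statistics below are simulated, not the trial's data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(7)
# Illustrative stability statistics for 30 soybean genotypes:
# Eberhart-Russell regression deviation (s2d) and Huhn's Si(1), Si(2).
s2d = rng.gamma(2.0, 1.0, size=30)
si1 = s2d + rng.normal(0.0, 0.3, size=30)   # built to rank-correlate with s2d
si2 = si1 + rng.normal(0.0, 0.1, size=30)   # built to be nearly tied to Si(1)

# Spearman rank correlations between the stability measures.
rho_s2d_si1, p1 = spearmanr(s2d, si1)
rho_si1_si2, p2 = spearmanr(si1, si2)
```

With the real trial data, the first correlation was significant while Si(1) and Si(2) were almost perfectly correlated, as the abstract reports.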
Abstract:
Posture is one of the most worrying problems dentists face, given the high incidence of low back pathologies related to the professional activity despite developments in the field of dental ergonomics. This work took place at the dental school clinic of a federal university and was grounded in ergonomic principles in the workplace. Its main objective was to analyze the determinants of the inadequate postures adopted by students, inasmuch as the adoption of non-ergonomic methods at the school clinic may lead them to carry inadequate postures into their later working environment. The analysis of the activity showed that it requires complex procedures in the patient's mouth. Thus, when carrying out the activity, students begin to adopt, although unconsciously, inadequate postures that ease visual accuracy and access to the operating field. In the absence of internal warning mechanisms (body awareness) or external ones (the professor's or a partner's counseling) regarding posture and its risks, which would prompt self-correction, the students become vulnerable to musculoskeletal disorders. Time pressure also plays a role, since the students are expected to perform their task within a predetermined clinical time. The variability of each patient, together with the stress of getting the work done in time, leads the students to press ahead, believing they will waste time if they help their partners or use indirect vision. We also noted that there was no assistant to perform the minor jobs, nor any professor who could effectively connect the knowledge of ergonomics to working practice. The conclusions of this work highlight the need to widen the discussion of health professionals' working conditions in academic settings such as universities.
The ergonomic principles in the workplace call for a multidisciplinary analysis based on the experience of students, professors, staff members, and janitors, which can foster reflection upon the issue and, consequently, actions that will bring positive changes to the working environment.
Abstract:
This work presents a study of quality in health care, focusing on outpatient appointment scheduling. The main purpose is to define a statistical model and propose a quality grade for the appointment waiting time, measured from the day the patient books the appointment to the day the consultation takes place. Reliability techniques and functions are used, whose main characteristic is the analysis of data on the time until the occurrence of a given event. A random sample of 1,743 patients was drawn from the appointment system of a university hospital, the Hospital Universitário Onofre Lopes of the Federal University of Rio Grande do Norte, Brazil, stratified by clinical specialty. The data were analyzed using the parametric methods of reliability statistics, and the fitting of the regression model showed the Weibull distribution to be the best fit to the data. The proposed quality grade is based on the PAHO criteria for appointment scheduling, and no clinic attained the PAHO quality grade. The proposed grade could be used to set priorities for improvement and as a quality control criterion.
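Fitting a Weibull distribution to waiting times, as described, can be sketched with scipy; the simulated waits and the 30-day target used below are illustrative (the actual PAHO criterion is not reproduced here):

```python
import numpy as np
from scipy.stats import weibull_min

# Illustrative waiting times in days from booking to consultation,
# drawn from a known Weibull so the fit can be checked.
true_shape, true_scale = 1.4, 30.0
waits = weibull_min.rvs(true_shape, scale=true_scale, size=1743, random_state=9)

# Fit a two-parameter Weibull (location fixed at zero, natural for durations).
shape, loc, scale = weibull_min.fit(waits, floc=0)

# Example summary: fraction of appointments expected within a 30-day target.
frac_within_30 = weibull_min.cdf(30.0, shape, loc=loc, scale=scale)
```

The fitted survival function S(t) = exp(-(t/scale)^shape) then gives, for any target time, the share of patients seen in time, which is the kind of quantity a quality grade can be built on.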
Abstract:
Problems of combinatorial optimization have engaged a large number of researchers in the search for approximate solutions, since it is generally accepted that they are unsolvable in polynomial time. Initially, these solutions were centered on heuristics; currently, metaheuristics are used, especially those based on evolutionary algorithms. The two main contributions of this work are: the creation of a heuristic called "Operon" for the construction of the information chains necessary for the implementation of transgenetic (evolutionary) algorithms, mainly using statistical methodology, namely Cluster Analysis and Principal Component Analysis; and the utilization of statistical analyses adequate for evaluating the performance of the algorithms developed to solve these problems. The aim of the Operon is to construct good-quality dynamic information chains that promote an "intelligent" search of the solution space. The methods are applied to the Traveling Salesman Problem (TSP) through a transgenetic algorithm known as ProtoG. A strategy is also proposed for renewing part of the chromosome population, triggered by a minimum limit on the coefficient of variation of the individuals' fitness function, computed over the population. Statistical methodology is used to evaluate the performance of four algorithms: the proposed ProtoG, two memetic algorithms, and a Simulated Annealing algorithm. Three performance analyses are proposed. The first uses Logistic Regression, based on the probability of the tested algorithm finding an optimal solution for a TSP instance. The second uses Survival Analysis, based on the probability distribution of the execution time observed until an optimal solution is reached.
The third uses a non-parametric Analysis of Variance on the Percent Error of the Solution (PES), the percentage by which the solution found exceeds the best solution available in the literature. Six experiments were conducted on sixty-one instances of the Euclidean TSP with up to 1,655 cities. The first two experiments deal with the tuning of four parameters of the ProtoG algorithm in an attempt to improve its performance; the last four evaluate the performance of ProtoG against the three other algorithms. For these sixty-one instances, statistical tests provide evidence that ProtoG outperforms the three algorithms on fifty instances. In addition, for the thirty-six instances considered in the last three trials, in which performance was evaluated through PES, the average PES obtained with ProtoG was below 1% in almost half of the instances, reaching its largest value, 3.52%, on an instance of 1,173 cities. ProtoG can therefore be considered a competitive algorithm for solving the TSP, since average PES values greater than 10% are not rarely reported in the literature for instances of this size.
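The third analysis, a non-parametric analysis of variance on PES, corresponds to a Kruskal-Wallis-type test on ranks (the abstract does not name the exact test, so this is an assumption); a sketch with simulated PES values for the four algorithms:

```python
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(10)
# Illustrative Percent Error of the Solution (PES) for four algorithms
# over the same 36 TSP instances; ProtoG is built to be the smallest.
pes_protog = rng.gamma(1.0, 0.8, size=36)
pes_memetic1 = rng.gamma(2.0, 1.5, size=36)
pes_memetic2 = rng.gamma(2.0, 1.7, size=36)
pes_annealing = rng.gamma(3.0, 2.0, size=36)

# Non-parametric one-way analysis of variance on ranks.
h_stat, pvalue = kruskal(pes_protog, pes_memetic1, pes_memetic2, pes_annealing)
```

A significant H statistic indicates that at least one algorithm's PES distribution differs, which would then be localized with post-hoc pairwise comparisons.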
Abstract:
Following the trend toward interdisciplinarity in modern science, a new field called neuroengineering has emerged in recent decades; since 2000, scientific journals and conferences around the world have been created on the theme. The present work comprises three subareas related to neuroengineering, electrical engineering, and biomedical engineering: neural stimulation; theoretical and computational neuroscience; and neuronal signal processing. The research can be divided into three parts. (i) A new method of neuronal photostimulation was developed based on the use of caged compounds. Using the inhibitory neurotransmitter GABA caged by a ruthenium complex, it was possible to block neuronal population activity with a laser pulse; the results were evaluated by wavelet analysis and tested with non-parametric statistics. (ii) A mathematical method was created to identify neuronal assemblies. Neuronal assemblies, proposed by Donald Hebb as the basis of learning, remain the most accepted theory for the neuronal representation of external stimuli. Using the Marchenko-Pastur law of eigenvalue distribution, it was possible to detect neuronal assemblies and to compute their activity with high temporal resolution. Applying the method to real electrophysiological data revealed that neurons from the neocortex and hippocampus can be part of the same assembly, and that neurons can participate in multiple assemblies. (iii) A new method of automatic heart-beat classification was developed that does not rely on a training database and is not specialized to specific pathologies; it is based on wavelet decomposition and normality measures of random variables. Taken together, the results presented in these three fields of knowledge represent contributions to neural and biomedical engineering.
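The Marchenko-Pastur criterion for assembly detection compares the eigenvalues of the neurons' correlation matrix with the analytical upper bound (1 + sqrt(n/b))^2 for uncorrelated data, where n is the number of neurons and b the number of time bins; eigenvalues above the bound signal coordinated activity. A minimal sketch with one synthetic assembly:

```python
import numpy as np

rng = np.random.default_rng(11)
n_neurons, n_bins = 50, 5000

# Z-scored binned activity with one embedded assembly:
# neurons 0-9 share a common fluctuation.
activity = rng.normal(size=(n_neurons, n_bins))
shared = rng.normal(size=n_bins)
activity[:10] += 0.4 * shared

# Eigenvalues of the neuron-by-neuron correlation matrix.
corr = np.corrcoef(activity)
eigvals = np.linalg.eigvalsh(corr)

# Marchenko-Pastur upper bound for an uncorrelated n x b matrix.
mp_upper = (1.0 + np.sqrt(n_neurons / n_bins)) ** 2
n_assemblies = int(np.sum(eigvals > mp_upper))
```

The eigenvectors associated with the supra-bound eigenvalues identify which neurons belong to each assembly, and projecting the activity onto them tracks assembly activation over time.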
Abstract:
This study evaluates the influence of different cartographic representations in in-car navigation systems on visual demand, subjective preference, and navigational error, taking into account the type and complexity of the representation, maneuvering complexity, road layout, and driver gender. A group of 28 drivers (14 male and 14 female) participated in the experiment, which was performed in a low-cost driving simulator. The tests were performed on a limited number of instances of each type of representation; their purpose was to carry out a preliminary assessment and suggest avenues for further study. Data collected for the visual demand study were analyzed using non-parametric statistical analyses. Results confirmed previous research showing that different levels of design complexity significantly influence visual demand. Non-grid-like road networks, for example, significantly influence visual demand and navigational error. An analysis of simple maneuvers on a grid-like road network showed no significant difference between static and blinking arrows: of the representations analyzed for visual demand, both arrows were equally efficient. From a gender perspective, women seemed to look at the display more than men, but this factor was not significant. With respect to subjective preferences, drivers preferred representations with mimetic landmarks when performing straight-ahead tasks; for maneuvering tasks, landmarks in a perspective model created higher visual demands.
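The abstract does not name the specific non-parametric tests used; for a two-group comparison such as the gender analysis, a Mann-Whitney U test is one standard choice, sketched here on illustrative glance counts (not the experiment's data):

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(12)
# Illustrative glance counts at the navigation display per route,
# for 14 female and 14 male drivers (a small, possibly inconclusive sample).
glances_female = rng.poisson(22, size=14)
glances_male = rng.poisson(20, size=14)

# Non-parametric two-sample comparison (no normality assumption).
u_stat, pvalue = mannwhitneyu(glances_female, glances_male,
                              alternative="two-sided")
```

With samples this small, a real but modest difference can easily fail to reach significance, consistent with the non-significant gender effect the study reports.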