580 results for Benchmarks


Relevance:

10.00%

Publisher:

Abstract:

Brazilian public policy entered the so-called new social federalism through its conditional cash transfers. States and municipalities can operate together through the nationwide platform of the Bolsa Família Program (BFP), complementing federal actions with local innovations. The state and the city of Rio de Janeiro have created programs named, respectively, Renda Melhor (RM) and Família Carioca (FC). These programs use the operational structure of the BFP, which facilitates locating beneficiaries, issuing cards, synchronizing payment dates and access passwords, and introducing new conditionalities. The payment system of the two programs complements the estimated permanent household income up to the established poverty line, giving more to those who have less. A similar income complementation system was subsequently adopted in the BFP and in the Chilean Ingreso Ético Familiar, which also follow the principle of income estimation used in the FC and the RM. Instead of using declared income, the value of the Rio cash transfers is set using the extensive information collected in the Single Registry of Social Programs (Cadastro Único): physical configuration of housing; access to public services; education and work conditions of all family members; presence of vulnerable groups, disabilities, pregnant or lactating women, and children; and benefits from other official transfers such as the BFP. From this multitude of assets and limitations, the permanent income of each individual is estimated. The basic benefit is defined by the poverty gap, and priority is given to the poorest. These subnational programs use international benchmarks as neutral ground between different government levels and mandates. Their poverty line is the higher of the two lines of the first UN Millennium Development Goal: US$ 2 per person per day adjusted for the cost of living. The other UN poverty line, US$ 1.25, was implicitly adopted as the national extreme poverty line in 2011. The exchange of methodologies between federal entities has happened in both directions. The FC began with the 575,000 individuals living in the city of Rio de Janeiro who were on the BFP payroll. Its system of impact evaluation benefited from bimonthly standardized examinations. In the educational conditionalities, the two programs reward students' progress, a potential advantage for those who most need to advance. The municipal program requires greater school attendance than the BFP and the presence of students' parents at the bimonthly meetings held on Saturdays. Students must achieve a grade of 8 or improve at least 20% in each exam to receive a bimonthly premium of R$50. In early childhood, priority is given to poor children registered in the Single Registry (CadÚnico) for enrollment in kindergartens, preschools and complementary activities. The state program reaches more than one million people with a payment system similar to the municipal one. Moreover, it innovates by transferring the awards given to high school students into savings accounts. The prize grows and is paid to the student, who can withdraw up to 30% annually; the total can reach R$3,800 per low-income student. The state and the city already rewarded education professionals according to student performance, now completing the chain of demand-side incentives aimed at poor students and their parents.
Performance gains are higher among beneficiaries, and attendance of their guardians at meetings is twice that of non-beneficiaries. The Houston program also focuses on aligning the incentives of teachers, parents and students. In general, the plan is to explore strategic complementarities, where the whole is greater than the sum of its parts. The objective is to stimulate, through targets and incentives, synergies between social actors (teachers, parents, students), between areas (education, assistance, work) and between different levels of government. The cited programs sum their efforts and divide labor so as to multiply interactions and make a difference in the lives of the poor.
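For readers interested in the payment rule, a minimal Python sketch of the income-complementation logic described above: the benefit tops the estimated permanent per-capita income up to the poverty line, so those further below the line receive more. The poverty-line value and the function name are illustrative, not the programs' actual parameters.

```python
# Hypothetical monthly poverty line: US$ 2 per person per day over a 30-day month.
POVERTY_LINE = 2.0 * 30

def monthly_benefit(estimated_permanent_income_pc: float) -> float:
    """Per-capita benefit equals the poverty gap; zero if income is above the line."""
    gap = POVERTY_LINE - estimated_permanent_income_pc
    return max(gap, 0.0)

if __name__ == "__main__":
    for income in (10.0, 35.0, 70.0):
        print(income, monthly_benefit(income))
```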

Relevance:

10.00%

Publisher:

Abstract:

Market risk exposure plays a key role in financial institutions' risk management. A possible measure of this exposure is to evaluate the losses likely to be incurred when the price of the portfolio's assets declines, using Value-at-Risk (VaR) estimates, one of the most prominent measures of financial downside market risk. This paper suggests an evolving possibilistic fuzzy modeling (ePFM) approach for VaR estimation. The approach is based on an extension of possibilistic fuzzy c-means clustering and functional fuzzy rule-based modeling, which employs memberships and typicalities to update clusters and creates new clusters based on a statistical control distance-based criterion. ePFM also uses a utility measure to evaluate the quality of the current cluster structure. Computational experiments consider data from the main equity market indexes of the United States, London, Germany, Spain and Brazil from January 2000 to December 2012 for VaR estimation using ePFM, traditional VaR benchmarks such as Historical Simulation, GARCH, EWMA and Extreme Value Theory, and state-of-the-art evolving approaches. The results show that ePFM is a potential candidate for VaR modeling, with better performance than the alternative approaches.
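As a point of reference for the benchmarks cited above, a minimal sketch of Historical Simulation VaR (not the ePFM method itself); the window length, confidence level and simulated returns are illustrative.

```python
import numpy as np

def historical_var(returns: np.ndarray, alpha: float = 0.99, window: int = 250) -> float:
    """One-day VaR as the (1 - alpha) quantile of the last `window` returns, reported as a positive loss."""
    tail = returns[-window:]
    return -np.quantile(tail, 1.0 - alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    r = rng.normal(0.0, 0.01, size=3000)   # stand-in for daily index returns
    print(f"99% 1-day VaR: {historical_var(r):.4f}")
```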

Relevance:

10.00%

Publisher:

Abstract:

Publicly traded companies listed on stock exchanges are naturally those that have delivered superior returns relative to the other companies in their sector. Does the selection bias of these assets, then, significantly influence the result of the Equity Premium Puzzle originally posed by Mehra and Prescott (1985)? This is the question this work investigates, and it concludes that, yes, this bias can indeed help explain the Puzzle. To this end, we generate an economy whose assets are, by assumption, priced according to the consumption-based stochastic discount factor (SDF), that is, the class of models known as CCAPM (Consumption Capital Asset Pricing Model). This economy is generated via Monte Carlo simulation, and we construct a benchmark index of this economy in which only the historically most profitable assets participate. This methodology parallels the way real benchmarks are constructed (S&P 500, Nasdaq, Ibovespa), which essentially include the most heavily traded listed companies, commonly the historically most profitable companies in the economy. We then estimate, via GMM (Generalized Method of Moments), one of the parameters of interest of a CCAPM economy: the coefficient of relative risk aversion (CRRA). Finally, the results obtained are compared and analyzed with respect to the estimation bias.
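A hedged sketch of the kind of moment condition used in the GMM estimation mentioned above: the CCAPM Euler equation E[β(C_{t+1}/C_t)^(−γ)R_{t+1} − 1] = 0, solved for γ by a grid search on simulated data. The discount factor, return and consumption-growth processes, and the grid are illustrative and do not reproduce the thesis's simulated economy.

```python
import numpy as np

def euler_moment(gamma, beta, cons_growth, gross_returns):
    """Sample average of the CCAPM pricing-error moment for a candidate gamma."""
    sdf = beta * cons_growth ** (-gamma)      # consumption-based stochastic discount factor
    return np.mean(sdf * gross_returns - 1.0)

rng = np.random.default_rng(1)
g = np.exp(rng.normal(0.02, 0.02, 5000))      # illustrative consumption growth
R = np.exp(rng.normal(0.06, 0.15, 5000))      # illustrative gross asset returns
grid = np.linspace(0.1, 20.0, 400)
moments = np.array([euler_moment(x, 0.97, g, R) for x in grid])
print("gamma minimizing the squared moment:", grid[np.argmin(moments ** 2)])
```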

Relevance:

10.00%

Publisher:

Abstract:

The general objective of this thesis has been the seasonal monitoring (on a quarterly time scale) of coastal and estuarine areas of a stretch of the Northern Coast of Rio Grande do Norte, Brazil, an environmentally sensitive area with intense sediment erosion and oil-industry activity, in order to underpin the implementation of erosion-containment projects and mitigate the impacts of coastal dynamics. To achieve the general objective, the work was carried out systematically in three stages, which constituted the specific objectives. The first stage was the implementation of the geodetic reference infrastructure for the geodetic surveying of the study area. This process included the implementation of the RGLS (Northern Coast of RN GPS Network), consisting of stations with precise geodetic coordinates and orthometric heights; the positioning of benchmarks and the evaluation of the available gravimetric geoid, for use in precise GPS altimetry; and the development of software for precise GPS altimetry. The second stage was the development and improvement of methodologies for the collection, processing, representation, integration and analysis of CoastLines (CL) and Digital Elevation Models (DEM) obtained by geodetic positioning techniques. This stage included the choice of equipment and positioning methods, depending on the required precision and the infrastructure implanted, and the definition of the CL indicator and of the geodetic references best suited to precise coastal monitoring. The third stage was the seasonal geodetic monitoring of the study area. It comprised defining the execution times of the geodetic surveys by analyzing the pattern of sediment dynamics of the study area; performing the surveys in order to calculate and locate areas and volumes of erosion and accretion (sandy and volumetric sedimentary balance) on the CL and on the beach and island surfaces throughout the year; and studying the correlations between the variations measured (in area and volume) between surveys and the action of the coastal dynamic agents. The results allowed an integrated study of the spatial and temporal interrelationships between the causes and consequences of the intense coastal processes operating in the area, especially the measurement of the variability of erosion, transport, balance and sediment supply over the annual cycle of construction and destruction of beaches. In the analysis of the results, it was possible to identify the causes and consequences of the severe coastal erosion on exposed beaches and to analyze the recovery of beaches and the accretion occurring in tidal inlets and estuaries. Based on the seasonal variations of the CL, human interventions for erosion containment have been proposed with the aim of restoring the previous situation of the beaches undergoing erosion.
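A minimal sketch of the volumetric sedimentary balance computed between two surveys, assuming co-registered DEM grids on regular cells; the grids and cell size below are illustrative stand-ins, not the thesis's survey data.

```python
import numpy as np

def sediment_balance(dem_before: np.ndarray, dem_after: np.ndarray, cell_area: float):
    """Accretion, erosion and net volumes from two co-registered elevation grids."""
    dz = dem_after - dem_before
    accretion = dz[dz > 0].sum() * cell_area
    erosion = -dz[dz < 0].sum() * cell_area
    return accretion, erosion, accretion - erosion

rng = np.random.default_rng(2)
before = rng.normal(2.0, 0.3, (100, 100))               # illustrative 100 x 100 grid of heights (m)
after = before + rng.normal(0.0, 0.05, (100, 100))
print(sediment_balance(before, after, cell_area=1.0))   # 1 m x 1 m cells -> volumes in m^3
```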

Relevance:

10.00%

Publisher:

Abstract:

There are no data on cardiac measurements in Brazilians obtained by cardiac magnetic resonance (CMR). This is a multidisciplinary study with the objective of obtaining measurements of the left ventricle (LV) and right ventricle (RV) diastolic diameter (Dd), systolic diameter (Ds), diastolic volume (Dv), systolic volume (Sv), ejection fraction (EF) and myocardial mass in Brazilians. One hundred and seven asymptomatic individuals without heart disease (54 men and 53 women, mean age 43.4 ± 13.1 years) underwent CMR studies using the steady-state free precession technique. The means and standard deviations of the LV and RV parameters were, respectively: LVDD = 4.8 ± 0.5 cm; LVSD = 3.0 ± 0.6 cm; LVDV = 128.4 ± 29.6 ml; LVSV = 45.2 ± 16.6 ml; LVEF = 65.5 ± 6.3%; LV mass = 95.2 ± 30.8 g; RVDD = 3.9 ± 1.3 cm; RVSD = 2.5 ± 0.5 cm; RVDV = 126.5 ± 30.7 ml; RVSV = 53.6 ± 18.4 ml; RVEF = 58.3 ± 8.0%; and RV mass = 26.1 ± 6.1 g. The masses and volumes were significantly higher in men, except for the LVSV. The RVEF was significantly higher in women. There was an inverse correlation between RV systolic volume and age, which was more significant in men. This study describes for the first time benchmarks for cardiac measurements obtained by CMR among asymptomatic Brazilian individuals without heart disease and demonstrates differences according to sex and age.

Relevance:

10.00%

Publisher:

Abstract:

Ferritin is a protein composed of heavy and light chains, non-covalently linked, which accommodates in its core thousands of iron atoms. This protein represents the body's iron store and is characterized as an acute-phase marker and predictor of diseases such as iron deficiency anemia and hereditary hemochromatosis, among others. Considering the variability of reference values and the analytical methods currently available, the aim of this work was to propose 95% confidence intervals for adults in the State of Rio Grande do Norte, Brazil, after determining the average concentration of serum ferritin for both sexes and its correlation with age. We analyzed 385 blood samples, collected by venipuncture from individuals residing in the State after a 12-14 hour fast. The population sample comprised 169 men and 216 women between 18 and 59 years old, who filled out a questionnaire on socioeconomic status, food habits and previous and current diseases. Sample collection was itinerant, and the results of the erythrogram, fasting glucose, alanine aminotransferase, aspartate aminotransferase, γ-glutamyl transferase, urea, creatinine, leukocyte count and platelets, as well as C-reactive protein, were issued to each participant, so that, after selection of the apparently healthy individuals, serum ferritin was measured. Statistical analysis was performed using SPSS 11.0 for Windows, Epi Info 3.3.2 and GraphPad InStat (version 3.02); simple random sampling (finite population) was used, and the linear correlation test and dispersion diagram were also produced. After selection of individuals and determination of serum ferritin, the most discrepant outliers were discarded (N = 358; men = 154, women = 207). The average value for men was 167.18 ng/dL, and for women 81.55 ng/dL. Moreover, we found that 25% of men had values < 90.30 ng/dL, 50% ≤ 156.25 ng/dL and 75% ≤ 229.00 ng/dL. Among women, 25% had values < 38.80 ng/dL, 50% ≤ 65.00 ng/dL and 75% ≤ 119.00 ng/dL. The correlation coefficient for men (r = 0.23, p = 0.003) suggests a positive linear correlation between age and serum ferritin, and the correlation coefficient for women (r = 0.16, p = 0.025) also confirms a positive linear correlation between serum ferritin and age. Considering the analyses carried out and the specific methods corroborating the proposed benchmarks, we conclude that the average value found for men is higher than that found for women, that ferritin rises with age in both sexes, and that the 95% confidence intervals obtained were 74 ng/dL ≤ μ ≤ 89 ng/dL and 152 ng/dL ≤ μ ≤ 183 ng/dL for women and men, respectively.
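For illustration, a minimal sketch of the normal-approximation 95% confidence interval for mean serum ferritin (mean ± 1.96·s/√n); the sample below is simulated and is not the study's data.

```python
import numpy as np

def ci95(values: np.ndarray):
    """Normal-approximation 95% confidence interval for the mean."""
    mean = values.mean()
    half_width = 1.96 * values.std(ddof=1) / np.sqrt(len(values))
    return mean - half_width, mean + half_width

rng = np.random.default_rng(3)
men = rng.lognormal(mean=5.0, sigma=0.5, size=154)   # simulated stand-in for the male sample
print("95% CI for the mean:", ci95(men))
```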

Relevance:

10.00%

Publisher:

Abstract:

The purpose of this work is to reflect on audiovisual production by the visually impaired. The starting point for the research was a documentary video production workshop offered at the Instituto de Educação e Reabilitação de Cegos do Rio Grande do Norte - IERC/RN, with the participation of blind people, people with low vision and sighted employees of the institution. The research approach follows the precepts of complex thinking, in which the work is woven as a network together with those researched. The theoretical framework is based on the theory of the French sociologist Edgar Morin and other thinkers important to this work, namely Erving Goffman, Paulo Freire, Michel Foucault, Edward Said, Jacques Aumont and Philippe Dubois, as well as scholars who think and theorize about their own condition and conduct discussions on the issue of blindness: Francisco José de Lima, Evgen Bavcar, Jacques Lusseyran and Joana Belarmino. The research was formulated based on the respondents' stated interest in understanding and producing visual images using video as a tool. In this sense, the methodology adopted approaches action research, constructing the text in dialogue with the participation of those involved in the project. The information-gathering technique was based on ethnographic description, describing the dynamics of the workshop, the relationships between participants, the relationship with the sighted other and the way the equipment was operated. The main focus is the dialogue-based exchange of information, attitudes and ways of knowing drawn from experience, the capacities developed and the obstacles faced by blind people in producing visual images using other benchmarks, such as touch, smell and the dimensions of time and space, and the addition of references that give new meaning to the visually based guidelines of those conducting the workshop. Aspects related to the concept of image are also discussed, with a sociological reflection on audiovisual production by blind people as something socially constructed and perpetuated by what Edgar Morin called cultural imprinting. Thus we sought to walk this route, with its obstacles and achievements, in the production of new images that came to be seen.

Relevance:

10.00%

Publisher:

Abstract:

The discussion we establish in this study concerns how teaching practice develops when educational projects are adopted as the teacher's methodological choice. The study, with a qualitative ethnographic approach, was conducted with a group of six teachers in two public schools in the North Administrative Zone of the city of Natal/RN. Data were constructed from semi-structured interviews, reinforced by direct observation of the teachers' teaching practice. The goals outlined were to identify the reasons for the option for projects, to list the benchmarks that underpinned their construction, and to observe the everyday experience of educational planning and the development of the projects carried out. In light of this information, we notice a complex and even contradictory picture, in which concepts are confused and practices are weakened. The analysis revealed inconsistencies between theory and practice in working with projects, the fruit of the teachers' limited theoretical grounding. The adoption of projects as a methodological option has meant a change in the direction of effective didactic action. Reflecting on this situation, it was concluded that there is a need to broaden the understanding of the significance of this type of work, covering the different dimensions that involve the practice of research, both in research itself and in teacher training. Without detracting from the intentions and initiatives of the teachers, what must be emphasized is the learning process, within an approach that focuses on learning in its multiple, inter-relational dimensions, encompassing both the students' capabilities and the areas of knowledge.

Relevance:

10.00%

Publisher:

Abstract:

Portfolio theory is a field of study devoted to investigating how investors decide to allocate resources. The purpose of this process is to reduce risk through diversification and thus secure a return. Nevertheless, the classical Mean-Variance (MV) model has been criticized regarding its parameters, since the variance and covariance estimates are sensitive to the market and to parameter estimation. In order to reduce estimation errors, Bayesian models offer more flexibility in modeling, being able to incorporate quantitative and qualitative information about the behavior of the market. With this in mind, the present study aimed to formulate a new matrix model using Bayesian inference to replace the covariance matrix in the MV model, called MCB - Bayesian Covariance Model. To evaluate the model, some hypotheses were analyzed using the ex post facto method and sensitivity analysis. The benchmarks used as references were: (1) the classical Mean-Variance, (2) the Bovespa market index, and (3) 94 investment funds. The returns earned from May 2002 to December 2009 demonstrated the superiority of the MCB over the classical MV model and the Bovespa index, while taking slightly more diversifiable risk than the MV. The robustness analysis of the model, considering the time horizon, found returns close to the Bovespa index while taking less risk than the market. Finally, with respect to the Mao index, the model showed satisfactory return and risk, especially at longer horizons. Some considerations were made, as well as suggestions for further work.
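A hedged sketch, not the thesis's MCB itself: a simple Bayesian-style shrinkage of the sample covariance toward a diagonal prior, plugged into minimum-variance portfolio weights. The shrinkage weight and simulated returns are illustrative.

```python
import numpy as np

def shrunk_cov(returns: np.ndarray, prior_weight: float = 0.3) -> np.ndarray:
    """Blend the sample covariance with a diagonal prior (illustrative weight)."""
    sample = np.cov(returns, rowvar=False)
    prior = np.diag(np.diag(sample))
    return prior_weight * prior + (1.0 - prior_weight) * sample

def min_variance_weights(cov: np.ndarray) -> np.ndarray:
    """Fully invested minimum-variance weights: w = C^-1 1 / (1' C^-1 1)."""
    ones = np.ones(cov.shape[0])
    raw = np.linalg.solve(cov, ones)
    return raw / raw.sum()

rng = np.random.default_rng(4)
returns = rng.normal(0.001, 0.02, size=(500, 5))     # 500 days, 5 hypothetical assets
print(min_variance_weights(shrunk_cov(returns)).round(3))
```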

Relevance:

10.00%

Publisher:

Abstract:

Reinforcement learning is a machine learning technique that, although finding a large number of applications, may not yet have reached its full potential. One of the insufficiently explored possibilities is the use of reinforcement learning in combination with other methods for solving pattern classification problems. The problems that support vector machine ensembles face in terms of generalization capacity are well documented in the literature. Algorithms such as AdaBoost do not deal appropriately with the imbalances that arise in those situations. Several alternatives have been proposed, with varying degrees of success. This dissertation presents a new approach to building committees of support vector machines. The presented algorithm combines the AdaBoost algorithm with a reinforcement learning layer that adjusts committee parameters in order to prevent imbalances among the committee components from affecting the generalization performance of the final hypothesis. Comparisons were made between ensembles with and without the reinforcement learning layer, tested on benchmark data sets widely known in the pattern classification field.
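A minimal sketch of the baseline committee only: an AdaBoost ensemble of SVM classifiers in scikit-learn (the `estimator` parameter name assumes a recent release). The dissertation's reinforcement-learning layer for adjusting committee parameters is not reproduced here, and the synthetic data stands in for the benchmark data sets.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for the benchmark data sets mentioned above.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)

base = SVC(kernel="rbf", probability=True)           # base learner; supports sample weights
committee = AdaBoostClassifier(estimator=base, n_estimators=25, random_state=0)
print("5-fold accuracy:", cross_val_score(committee, X, y, cv=5).mean())
```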

Relevance:

10.00%

Publisher:

Abstract:

Pattern classification is one of the most prominent subareas of machine learning. Among the various approaches to solving pattern classification problems, Support Vector Machines (SVM) receive great emphasis due to their ease of use and good generalization performance. The Least Squares formulation of the SVM (LS-SVM) finds the solution by solving a set of linear equations instead of the quadratic programming implemented in the SVM. LS-SVMs have some free parameters that must be correctly chosen to achieve satisfactory results in a given task. Although LS-SVMs perform well, many tools have been developed to improve them, mainly new classification methods and the use of ensembles, that is, combinations of several classifiers. In this work, our proposal is to use an ensemble and a Genetic Algorithm (GA), a search algorithm based on the evolution of species, to enhance LS-SVM classification. In the construction of this ensemble, we use a random selection of attributes of the original problem, which splits it into smaller problems on which each classifier acts. We then apply a genetic algorithm to find effective values of the LS-SVM parameters and also a weight vector measuring the importance of each machine in the final classification. Finally, the final classification is obtained by a linear combination of the LS-SVMs' decision values with the weight vector. We used several classification problems, taken as benchmarks, to evaluate the performance of the algorithm and compared the results with those of other classifiers.
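A hedged sketch of the final combination step described above: a weighted linear combination of member decision values over random attribute subsets. Standard SVCs stand in for LS-SVMs, and a fixed weight vector stands in for the GA-optimized one; the data are synthetic.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)
y_signed = np.where(y == 1, 1, -1)                   # labels in {-1, +1}

members, subsets = [], []
for _ in range(5):                                   # 5 members, each on 8 random attributes
    cols = rng.choice(X.shape[1], size=8, replace=False)
    members.append(SVC(kernel="rbf").fit(X[:, cols], y_signed))
    subsets.append(cols)

weights = np.full(len(members), 1.0 / len(members))  # stand-in for the GA-found weight vector
decision = sum(w * m.decision_function(X[:, c])
               for w, m, c in zip(weights, members, subsets))
print("training accuracy:", (np.sign(decision) == y_signed).mean())
```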

Relevance:

10.00%

Publisher:

Abstract:

Bayesian networks are powerful tools, as they represent probability distributions as graphs and can work with the uncertainties of real systems. Since the last decade there has been special interest in learning network structures from data. However, learning the best network structure is an NP-hard problem, so many heuristic algorithms for generating network structures from data have been created. Many of these algorithms use score metrics to generate the network model. This thesis compares three of the most used score metrics. The K2 algorithm and two benchmark networks, ASIA and ALARM, were used to carry out the comparison. Results show that score metrics whose hyperparameters strengthen the tendency to select simpler network structures are better than score metrics with a weaker tendency to select simpler structures, for both metrics (Heckerman-Geiger and modified MDL). The Heckerman-Geiger Bayesian score metric works better than MDL with large datasets, and MDL works better than Heckerman-Geiger with small datasets. The modified MDL gives results similar to Heckerman-Geiger for large datasets and close to MDL for small datasets, with a stronger tendency to select simpler network structures.
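For illustration, a minimal sketch of an MDL/BIC-style structure score for fully observed binary data; the K2 search and the Heckerman-Geiger metric are not reproduced, and the data and candidate structures below are illustrative.

```python
import itertools
import numpy as np

def mdl_score(data: np.ndarray, parents: dict) -> float:
    """data: (n_samples, n_vars) array of 0/1; parents: {child index: list of parent indices}."""
    n = data.shape[0]
    score = 0.0
    for child, pa in parents.items():
        n_params = 2 ** len(pa)                      # free parameters for a binary child
        for config in itertools.product([0, 1], repeat=len(pa)):
            mask = np.all(data[:, pa] == config, axis=1) if pa else np.ones(n, bool)
            n_ij = mask.sum()
            if n_ij == 0:
                continue
            for k in (0, 1):
                n_ijk = np.logical_and(mask, data[:, child] == k).sum()
                if n_ijk > 0:
                    score += n_ijk * np.log(n_ijk / n_ij)   # log-likelihood term
        score -= 0.5 * np.log(n) * n_params                 # complexity penalty
    return score

rng = np.random.default_rng(5)
a = rng.integers(0, 2, 1000)
b = (a ^ (rng.random(1000) < 0.1)).astype(int)               # b depends on a with 10% noise
data = np.column_stack([a, b])
print(mdl_score(data, {0: [], 1: [0]}), mdl_score(data, {0: [], 1: []}))  # dependent vs. empty graph
```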

Relevance:

10.00%

Publisher:

Abstract:

We present a nestedness index that measures the nestedness pattern of bipartite networks, a problem that arises in theoretical ecology. Our measure is derived from the sum of distances of the occupied elements in the adjacency matrix of the network. This index quantifies directly the deviation of a given matrix from the nested pattern. In the simplest case, the distance of the matrix element a(i,j) is d(i,j) = i + j, the Manhattan distance; a generic distance is obtained as d(i,j) = (i^ν + j^ν)^(1/ν). The nestedness index is defined as 1 − T, where T is the temperature of the matrix. We construct the temperature using two benchmarks: the distance of the completely nested matrix, which corresponds to temperature zero, and the distance of the average random matrix, which defines temperature one. We also discuss an important feature of the problem, matrix occupancy, which we address using the metric exponent ν to adjust for occupancy.
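A hedged sketch of the construction described above: the temperature interpolates the occupied-cell distance between the completely nested matrix (temperature zero) and the average random matrix of equal occupancy (temperature one), and the index is one minus the temperature. The Manhattan distance and the randomization scheme below are illustrative choices, not the paper's exact procedure.

```python
import numpy as np

def occupied_distance(m: np.ndarray) -> float:
    """Sum of Manhattan distances i + j over the occupied cells of the matrix."""
    rows, cols = np.nonzero(m)
    return float(np.sum(rows + cols))

def nestedness_index(m: np.ndarray, n_random: int = 200, seed: int = 0) -> float:
    n_rows, n_cols = m.shape
    occupancy = int(m.sum())
    # Benchmark 1: completely nested matrix -> fill the cells closest to the origin.
    order = np.argsort(np.add.outer(np.arange(n_rows), np.arange(n_cols)).ravel())
    nested = np.zeros(m.size)
    nested[order[:occupancy]] = 1
    d_nested = occupied_distance(nested.reshape(m.shape))
    # Benchmark 2: average distance of random matrices with the same occupancy.
    rng = np.random.default_rng(seed)
    d_random = np.mean([occupied_distance(rng.permutation(nested).reshape(m.shape))
                        for _ in range(n_random)])
    temperature = (occupied_distance(m) - d_nested) / (d_random - d_nested)
    return 1.0 - temperature

example = np.array([[1, 1, 1], [1, 1, 0], [1, 0, 0]])
print(nestedness_index(example))                     # close to 1 for a nested pattern
```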

Relevance:

10.00%

Publisher:

Abstract:

The increasing complexity of applications has demanded hardware that is ever more flexible and able to achieve higher performance. Traditional hardware solutions have not been successful in meeting these applications' constraints. General purpose processors are inherently flexible, since they perform several tasks; however, they cannot reach high performance when compared to application-specific devices. Conversely, since application-specific devices perform only a few tasks, they achieve high performance but have less flexibility. Reconfigurable architectures emerged as an alternative to the traditional approaches and have become an area of rising interest over the last decades. The purpose of this new paradigm is to modify the device's behavior according to the application. Thus, it is possible to balance flexibility and performance and to meet the applications' constraints. This work presents the design and implementation of a coarse-grained hybrid reconfigurable architecture for stream-based applications. The architecture, named RoSA, consists of reconfigurable logic attached to a processor. Its goal is to exploit the instruction-level parallelism of data-flow-intensive applications to accelerate their execution on the reconfigurable logic. Instruction-level parallelism is extracted at compile time; thus, this work also presents an optimization phase for the RoSA architecture to be included in the GCC compiler. To design the architecture, this work also presents a methodology based on hardware reuse of datapaths, named RoSE. RoSE aims to view the reconfigurable units through reusability levels, which provides area savings and datapath simplification. The architecture presented was implemented in a hardware description language (VHDL) and validated through simulations and prototyping. Some benchmarks were used for performance analysis and demonstrated a speedup of 11x on the execution of some applications.

Relevance:

10.00%

Publisher:

Abstract:

The tourism industry is gaining importance by moving and stimulating the economy, especially by enabling the generation of employment and income, thus creating growth opportunities for the localities where tourism develops. Accordingly, the present study, entitled "Determinants of competitiveness of tourist destinations applied to regional routes: an evaluation of the Seridó/RN Route", discusses the issue of competitiveness in tourism and seeks to understand the scenario of this Route. The main objective of the study is to assess the competitiveness conditions of the Seridó/RN Route according to global benchmarks and determinants of competitiveness for tourist destinations. The study also has the following specific objectives: to define the dimensions of the reference model for use in evaluating the competitiveness of the Seridó/RN Route; to identify levels of governance and competitiveness in the municipalities that make up the sample of the Route; and to analyze to what extent the competitiveness of the Route corresponds to the global references of competitiveness of tourist destinations. Regarding the methodology, this is exploratory-descriptive research combining quantitative and qualitative methods, as expected and required in the application of the evaluation tool called the Compet&enible Model. Data collection involved technical visits and the analysis of documents and materials. Data analysis was based on the records and documents and on simple descriptive statistics for the scores of the elements provided by the Compet&enible Model. The results made it possible to know the actual competitiveness conditions of the Seridó/RN Route with respect to the attributes of global competitiveness of tourist destinations: dimension I, Governance, reached 17 points, classified as "under structuring", and dimension II, Competitiveness, reached 10 points, classified as "weak". These results highlight the need for greater involvement of the actors in the tourism supply chain of the Seridó/RN Polo so that actions, programs and projects are put into practice. It is expected that tourism will be considered an important activity for local and global development, serving as a reference for the future management of the Seridó/RN Route and guiding new policy guidelines, planning and organization for better competitiveness.