97 results for "Erro de concordância" (agreement error)

at the Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

20.00%

Publisher:

Abstract:

This study developed software routines for a system built around a digital signal processor (DSP) board and a supervisory application, whose main function was to correct the measurements of a turbine gas meter. The correction is based on an intelligent algorithm built on an artificial neural network. The routines were implemented both in the supervisory environment and in the DSP environment and comprise three main parts: processing, communication, and supervision.
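The abstract above does not specify the network architecture, so as an illustration only, here is a minimal Python sketch of a small feedforward net mapping a raw meter reading plus process conditions to a corrected value. The layer sizes, inputs, and all weights are hypothetical placeholders, not the trained values from the study.

```python
import math

def mlp_correct(raw_flow, temperature, pressure, weights):
    """Tiny 3-2-1 feedforward net mapping (raw meter reading, temperature,
    pressure) to a corrected flow value. Weights are placeholders chosen
    only so the call runs end to end."""
    (w_hidden, b_hidden), (w_out, b_out) = weights
    inputs = (raw_flow, temperature, pressure)
    # Hidden layer: tanh of each weighted sum
    hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    # Linear output layer
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

# Placeholder weights for demonstration only
weights = (([[0.1, 0.0, 0.0], [0.0, 0.1, 0.0]], [0.0, 0.0]),
           ([1.0, 1.0], 0.0))
corrected = mlp_correct(100.0, 25.0, 1.0, weights)
```

In practice the weights would be fitted to calibration data before the net is deployed on the DSP.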

Relevance:

20.00%

Publisher:

Abstract:

The present work applied experimental design to improve the separation efficiency of a new type of mixer-settler used to treat wastewater contaminated with oil. A laboratory-scale unit was installed in the Graduate Program in Chemical Engineering at UFRN and constructed in partnership with Petrobras S.A. The device, called Misturador-Decantador à Inversão de Fases (MDIF), combines features of a conventional mixer-settler and a spray-column extractor. The equipment comprises three main parts: a mixing chamber, a decantation chamber, and a separation chamber. Separation efficiency is evaluated by comparing the oil concentration in the water at the feed and at the outlet of the device, using the gravimetric oil-and-grease (TOG) method. The system under study is produced water emulsified with oil; the extractant is a mixture of turpentine-spirit hydrocarbons supplied by Petrobras. To optimize the separation efficiency of the equipment, a central composite design was applied, with a 2^(5-2) fractional factorial as the factorial portion, star-point augmentation, and five replicates at the central point. The following independent variables were studied: oil content in the feed, volumetric ratio (O/A), total flow rate, agitation in the mixing chamber, and height of the organic bed. Minimum and maximum limits for the variables were fixed according to previous works. The analysis of variance for the empirical model equation showed the model to be statistically significant and useful for prediction. The residuals followed a normal distribution, and since their dispersion did not depend on the factor levels, the independence assumption could be verified. The model explains 98.98% of the variation around the mean, with a fit to the experimental points of 0.98981. The results show a strong interaction between the oil content in the feed and the agitation in the mixing chamber, with a large positive influence on separation efficiency. The height of the organic bed also showed a large positive influence. The best separation efficiencies were obtained at high flow rates combined with high oil concentrations and high agitation. The present results show excellent agreement with previous works on the phase-inversion mixer-settler.
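To illustrate the kind of model fitting behind the reported R² and interaction effect, here is a minimal Python sketch that fits a linear model with one interaction term to synthetic data. The factor effects, noise level, and data are invented stand-ins, not the MDIF measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for the five coded factors (oil content in the feed,
# O/A ratio, total flow rate, agitation, organic-bed height), scaled -1..+1
X = rng.uniform(-1, 1, size=(30, 5))
# Hypothetical response surface featuring the oil-content x agitation
# interaction reported in the study, plus noise
y = (60 + 8 * X[:, 0] + 5 * X[:, 3] + 6 * X[:, 0] * X[:, 3]
     + 4 * X[:, 4] + rng.normal(0, 0.5, 30))

# Design matrix: intercept, five linear terms, one interaction term
A = np.column_stack([np.ones(len(y)), X, X[:, 0] * X[:, 3]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination of the fitted model
r2 = 1 - np.var(y - A @ coef) / np.var(y)
```

A full central composite analysis would also include quadratic terms and an ANOVA table; the sketch shows only the least-squares core.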

Relevance:

20.00%

Publisher:

Abstract:

In this work, we study the survival cure rate model proposed by Yakovlev et al. (1993), based on a structure of competing risks concurring to cause the event of interest, together with the approach of Chen et al. (1999), in which covariates are introduced to model the number of competing risks. We focus on covariate measurement error, using the corrected score method to obtain consistent estimators. A simulation study evaluates the finite-sample behavior of the estimators obtained by this method. The simulation aims to identify the impact of measurement error not only on the regression coefficients of the covariates measured with error (Mizoi et al., 2007) but also on the coefficients of covariates measured without error. We also verify the adequacy of the piecewise exponential distribution for the cure rate model with measurement error. Finally, the model is applied to real data.
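The corrected score idea is easiest to see in the simpler linear-regression case, sketched below with synthetic data: the naive slope computed from the error-prone covariate is attenuated toward zero, and subtracting the known measurement-error variance from the denominator removes the bias. This is an illustration of the general principle, not the cure rate model estimator from the study.

```python
import random

random.seed(42)
n, beta, sigma_u = 5000, 2.0, 0.5
x = [random.gauss(0, 1) for _ in range(n)]           # true covariate
y = [beta * xi + random.gauss(0, 0.3) for xi in x]   # response
w = [xi + random.gauss(0, sigma_u) for xi in x]      # covariate observed with error

mw, my = sum(w) / n, sum(y) / n
s_wy = sum((wi - mw) * (yi - my) for wi, yi in zip(w, y)) / n
s_ww = sum((wi - mw) ** 2 for wi in w) / n

naive = s_wy / s_ww                        # attenuated toward zero
corrected = s_wy / (s_ww - sigma_u ** 2)   # corrected-score slope
```

With sigma_u = 0.5 the naive slope shrinks by the factor 1/(1 + sigma_u²) ≈ 0.8, while the corrected slope recovers the true value of 2.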

Relevance:

20.00%

Publisher:

Abstract:

The calculation of tooth mass discrepancy, essential for good planning and proper orthodontic finishing, is laborious and time-consuming when performed manually. The aim of this study was to develop and test Bolton Freeware, a software application for Bolton tooth mass discrepancy analysis designed to reduce the time and effort required. The software's digital analysis, based on two-dimensional scanning of plaster study models, was compared with manual evaluation (the gold standard) using 75 pairs of stone plaster study models divided into two groups according to the magnitude of the curve of Spee (group I, 0 to 2 mm; group II, greater than 2 up to 3 mm). All models had permanent dentition and were in perfect condition. The manual evaluation was performed with a digital caliper and a calculator, and the time required for each method was recorded and compared. In addition, orthodontists evaluated the usability of the software by means of questionnaires developed specifically for this purpose. Calibration was performed prior to the manual analysis, and excellent levels of inter-rater agreement were achieved, with ICC > 0.75 and r > 0.9 for the overall and anterior ratios. In the evaluation of the digital method's error, some teeth showed a significant systematic error, the largest being 0.08 mm. For cases with mild and moderate curves of Spee, the total tooth mass discrepancy calculated by Bolton Freeware differed from the manual analysis by, on average, 0.09 mm and 0.07 mm respectively per tooth evaluated, with r > 0.8 for the overall and anterior ratios. According to the specificity and sensitivity test, Bolton Freeware is better at detecting true negatives, i.e., the presence of discrepancy. The digital Bolton analysis was also faster: the average difference in time consumed between the two methods was approximately 6 minutes. Most of the experts interviewed (93%) approved the usability of the software.
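The arithmetic behind a Bolton analysis is a pair of ratio calculations over per-tooth mesiodistal widths. The sketch below shows the standard overall and anterior ratios; the width values are illustrative, not patient data, and the assumed tooth ordering (first molar to first molar) is a convention of this sketch rather than of the Bolton Freeware interface.

```python
def bolton_ratios(maxillary, mandibular):
    """Overall and anterior Bolton ratios (%) from per-tooth mesiodistal
    widths in mm, listed first molar to first molar (12 teeth per arch).
    The anterior ratio uses the six anterior teeth (canine to canine),
    assumed here to be the middle six entries of each list."""
    overall = 100 * sum(mandibular) / sum(maxillary)
    anterior = 100 * sum(mandibular[3:9]) / sum(maxillary[3:9])
    return overall, anterior

# Illustrative widths only
maxilla  = [10.0, 6.8, 7.0, 7.6, 6.5, 8.5, 8.5, 6.5, 7.6, 7.0, 6.8, 10.0]
mandible = [11.0, 7.0, 7.0, 6.6, 5.5, 5.0, 5.0, 5.5, 6.6, 7.0, 7.0, 11.0]
overall, anterior = bolton_ratios(maxilla, mandible)
# Bolton's classical norms are roughly 91.3% overall and 77.2% anterior
```

Values far from the norms indicate a tooth-size discrepancy between the arches.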

Relevance:

20.00%

Publisher:

Abstract:

Students' mistakes, viewed from a didactic and pedagogical perspective, are a phenomenon inevitably observed in any context where formal teaching and learning take place. Researchers have shown that such mistakes are most often seen as undesirable, frequently attributed to lack of attention or poor commitment on the part of the student, and rarely considered didactically useful. The object of our reflections in this work is precisely those mistakes, born within the teaching-and-learning process. In our understanding, a mistake is a tool that mediates knowledge and may therefore become a strong ally of the instructor's teaching, so it deserves the teacher's careful consideration. Under this view, we postulate that the teacher should treat a mistake as a possibility to be exploited rather than as a negative occurrence; such an attitude can yield productive didactic situations. To deepen this understanding, we conducted a case study on the perceptions of senior undergraduate students in the Mathematics program at UFRN in the second term of 2009. We chose Mathematics because it is the field with traditionally the poorest records in terms of school grades. We also present data associated with the ENEM, the UFRN Vestibular, and the undergraduate courses in Mathematics. The theoretical framework supporting our reflections follows Castorina (1988); Davis and Espósito (1990); Aquino (1997); Luckesi (2006); Cury (1994; 2008); Pinto (2000); and Torre (2007). To carry out the study, we applied a semi-structured questionnaire containing 14 questions, 10 of them open-ended. The questions were analyzed using Thematic Analysis, one of the content-analysis techniques proposed by Bardin (1977), with the aid of Modalisa 6.0, a software package developed at the University of Paris VIII. The results indicate that most teacher-training instructors, in their pedagogical practice, view students' mistakes only as a basis for grading, a procedure in which the student is frequently labeled as guilty. The conclusions therefore point to the need to guide teacher-training instructors toward a new theoretical view of students' mistakes and their pedagogical potential, so that these professionals perceive the importance of such mistakes, which reveal gaps in the learning process and provide valuable avenues for teaching.

Relevance:

20.00%

Publisher:

Abstract:

Cephalometric analysis is the measurement of linear and angular quantities, defined by landmark points, distances, and lines on a teleradiograph, and is considered of fundamental importance for orthodontic diagnosis and planning. The objective of this research was to compare cephalometric measurements obtained by dentists and by radiologists from the analysis of the same radiograph in a computerized cephalometric analysis program. All participants marked 18 cephalometric points on a 14-inch notebook computer, as directed by the program itself (Radiocef 2®). From these points the program generated 14 cephalometric parameters covering skeletal, dentoskeletal, dental, and soft-tissue measures. To verify intra-examiner agreement, 10 professionals from each group repeated the marking of the points with a minimum interval of eight days between the two markings. Intra-group variability was calculated from the coefficients of variation (CV). Groups were compared using Student's t-test for normally distributed variables and the Mann-Whitney test for those with non-normal distribution. In the group of orthodontists, the measurements Pog and 1-NB, SL, S-Ls line, S-Li line, and 1.NB showed high internal variability; in the group of radiologists, the same occurred for Pog and 1-NB, S-Ls line, S-Li line, and 1.NA. In the comparison between groups, all the linear values analyzed and two angular values showed statistically significant differences between radiologists and dentists (p < 0.05). According to the results, inter-examiner error in cephalometric analysis requires more attention, but it is not specific to one class of specialists, whether dentists or radiologists.
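For the non-normal case the study uses the Mann-Whitney test, whose core statistic can be computed directly by counting pairwise wins between the two groups. The sketch below does exactly that; the angle readings are invented illustrative values, not the Radiocef data.

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample a versus sample b: the number
    of pairs (x, y) with x > y, counting ties as one half."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

# Hypothetical angle readings (degrees) from two examiner groups
dentists = [78.2, 79.5, 80.1, 77.8, 79.0]
radiologists = [80.4, 81.0, 79.8, 80.9, 81.5]
u = mann_whitney_u(dentists, radiologists)
# Under no group difference, U would sit near len(a) * len(b) / 2 = 12.5
```

A U far from its null expectation (as here) suggests a systematic shift between the groups; in practice the statistic is then referred to its null distribution for a p-value.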

Relevance:

10.00%

Publisher:

Abstract:

Given the need to measure fall risk in accordance with standardized nursing language, the nursing outcome Fall Prevention Behavior from the Nursing Outcomes Classification (NOC) was selected, with the aim of identifying evidence on its elements and measurement, comparing it with existing indicators, and constructing constitutive definitions. An integrative review was carried out between April and November 2009, through identification of the question, establishment of inclusion/exclusion criteria, extraction of information, evaluation, interpretation, and synthesis. Cross-sectional studies and expert opinions predominated. The indicators "use of vision-correction devices" and "use of properly tied, well-fitting shoes" were considered insufficient to assess risk factors such as sensory deficits and inadequate clothing or footwear. Some definitions need further development, and this nursing outcome deserves refinement, especially regarding its indicators. Twenty-two indicators were identified, and definitions were proposed based on evidence from the literature.

Relevance:

10.00%

Publisher:

Abstract:

This study examines the factors that influence public managers in the adoption of advanced practices related to information security management. The research based its assertions on the security standard ISO 27001:2005 and on a theoretical model derived from the Technology Acceptance Model (TAM) of Venkatesh and Davis (2000). The method adopted was a field survey of national scope with the participation of eighty public administrators from Brazilian states, all managers and planners in state governments. The approach was quantitative, using descriptive statistics, factor analysis, and multiple linear regression for data analysis. The results showed correlation between the TAM constructs (ease of use, perceived usefulness, attitude, and intention to use) and agreement with the assertions drawn from ISO 27001, indicating that these factors influence managers in the adoption of such practices. For the other independent variables of the model (organizational profile, demographic profile, and manager behavior), no significant correlation with the assertions of the standard was identified, which indicates the need for further research using these constructs. It is hoped that this study contributes positively to the discussion of information security management, the adoption of security standards, and the Technology Acceptance Model.
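The correlations between TAM constructs and agreement scores that the study reports can be illustrated with a plain Pearson correlation over Likert-scale responses. The scores below are invented for the example; they are not the survey data.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical Likert-scale scores for two TAM constructs
ease_of_use = [4, 5, 3, 4, 2, 5, 4, 3]
intention = [4, 5, 3, 5, 2, 5, 4, 2]
r = pearson_r(ease_of_use, intention)
```

A strongly positive r, as these toy scores give, is the pattern the study describes between the TAM constructs and agreement with the ISO 27001 assertions.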

Relevance:

10.00%

Publisher:

Abstract:

This study analyzed the leadership style adopted by managers of non-governmental organizations in the metropolitan region of Belém, based on the situational leadership theory of Hersey and Blanchard, which classifies leadership styles as E1, E2, E3, and E4, with corresponding maturity levels M1, M2, M3, and M4. The study examined the relationship between leadership styles and job maturity, between leadership styles and psychological maturity, and between job maturity and psychological maturity. The main objectives were to analyze and relate leadership styles to the maturity of the leaders and to understand the phenomenon of leadership through the self-perception of those who lead the organizations studied. To achieve these objectives, a questionnaire already validated for situational leadership theory was applied in 320 non-governmental organizations in the metropolitan region of Belém. The methodology was quantitative, descriptive, and exploratory. The analysis employed descriptive and inferential statistics, in univariate and bivariate form, applying the chi-square test, Cramér's V, and Spearman correlation. The data analysis, supported by frequencies, means, and margins of error, proved reliable, and after the tests were applied a relationship was found between leadership style and both job maturity and psychological maturity. The managers of the non-governmental organizations practice various styles of leadership and concentrate in the high-maturity quadrant. When a manager uses only one leadership style, E3 ("sharing or supporting") predominates, representing 24% of the sample; when two styles are used, E3 and E2 predominate, representing 76%. Thus the managers of non-governmental organizations in the metropolitan region of Belém practice a supportive leadership style, sharing ideas for decision-making in a democratic style.
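The chi-square test of independence used in the study reduces to comparing observed counts with the expected counts implied by the row and column totals. Here is a self-contained sketch on a hypothetical 2x2 table of leadership style versus maturity level; the counts are invented for illustration.

```python
def chi_square(table):
    """Pearson chi-square statistic for an r x c contingency table
    (list of rows of observed counts)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: leadership style (E2/E3) by maturity level (M2/M3)
table = [[30, 10],
         [15, 45]]
stat = chi_square(table)
```

The statistic is then compared against the chi-square distribution with (r-1)(c-1) degrees of freedom; a large value, as with these toy counts, indicates association between the two variables.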

Relevance:

10.00%

Publisher:

Abstract:

This study aimed to measure the perceived maturity of project management in the state departments of Rio Grande do Norte, from the perspective of their managers. It argues that project management has become a critical success factor for any organization, because projects are directly related to the activities that produce organizational innovation in products, services, and processes, and because improving project management is directly aligned with the main pillars of the New Public Management. Methodologically, this is a quantitative, descriptive study in which 161 questionnaires were administered to coordinators and sub-coordinators of the state departments of Rio Grande do Norte, yielding a sampling error below 6% at 95% confidence according to finite-sampling procedures. Tabulation and analysis were carried out with the Statistical Package for the Social Sciences (SPSS) 18.0, using techniques such as means, standard deviations, frequency distributions, cluster analysis, and factor analysis. The results indicate that the levels of project management maturity in the state departments of Rio Grande do Norte are below the national average and that behavioral competencies are the main obstacle to improving management in these departments. Two groups with different perceptions of project management were detected, indicating, according to the managers, islands of excellence in project management in some sectors of the state departments. Eight factors affecting project management maturity were also identified: planning and control, development of management skills, project management environment, acceptance of the subject of project management, performance incentives, project evaluation and learning, project management office, and visibility of project managers. It concludes that project management in the state departments of Rio Grande do Norte has not reached satisfactory maturity levels, affecting the efficiency and effectiveness of the state apparatus, and showing that some of the assumptions guiding the New Public Management are not achieving the levels of excellence advocated by this management model.
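The "sampling error below 6% at 95% confidence" claim follows from the standard margin-of-error formula with a finite-population correction. The sketch below reproduces the calculation for n = 161; the population size used is a hypothetical figure, since the abstract does not report the total number of coordinators and sub-coordinators.

```python
import math

def margin_of_error(n, population, z=1.96, p=0.5):
    """Margin of error for a proportion at 95% confidence (z = 1.96),
    with the finite-population correction; p = 0.5 is the worst case."""
    fpc = math.sqrt((population - n) / (population - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

# n = 161 respondents as in the study; population size is assumed here
e = margin_of_error(161, 400)
```

With the assumed population of 400, the margin comes out just under 6%; without the correction (a very large population), the same n would give roughly 7.7%, which shows why the finite-sampling procedure matters for the study's claim.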

Relevance:

10.00%

Publisher:

Abstract:

This research investigates hedge effectiveness and the optimal hedge ratio for the futures markets of cattle, coffee, ethanol, corn, and soybean. The optimal hedge ratio and hedge effectiveness are estimated through multivariate GARCH models with error correction, with attention to a possible differential in the optimal hedge ratio between the crop and intercrop periods. The optimal hedge ratio should be larger in the intercrop period, owing to the uncertainty associated with a possible supply shock (Lazzarini, 2010). Among the futures contracts studied, the coffee, ethanol, and soybean contracts had not previously been examined for this phenomenon, and the corn and ethanol contracts had not been the object of research on dynamic hedging strategies. This paper distinguishes itself by including a GARCH model with error correction, which had never been considered in investigating the possible crop/intercrop differential in the optimal hedge ratio. Futures prices were taken from BM&FBOVESPA quotations and spot prices from the CEPEA index, at daily frequency, from May 2010 to June 2013 for cattle, coffee, ethanol, and corn, and to August 2012 for soybean. Similar results were obtained for all commodities: there is a long-run relationship between the spot and futures markets, with bicausality between the spot and futures markets for cattle, coffee, ethanol, and corn, and unidirectional causality from the futures price of soybean to its spot price. The optimal hedge ratio was estimated with three strategies: linear regression by OLS, a diagonal BEKK-GARCH model, and a diagonal BEKK-GARCH model with an intercrop dummy. The OLS regression indicated hedge inefficiency, given that the estimated optimal hedge ratio was too low. The second model implements a dynamic hedging strategy, capturing time variation in the optimal hedge ratio. The last strategy did not detect an optimal-hedge-ratio differential between the crop and intercrop periods; therefore, contrary to expectations, the investor does not need to increase his or her futures-market position during the intercrop period.
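The static OLS strategy mentioned above is simply the slope of spot returns on futures returns; hedge effectiveness is the variance reduction achieved by the hedged position. The sketch below demonstrates both on synthetic returns; the data and parameters are invented, and the BEKK-GARCH dynamic estimation is beyond the scope of this illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500
# Hypothetical daily returns for a futures series and a correlated
# spot series; illustrative, not the BM&FBOVESPA/CEPEA data
futures = rng.normal(0.0, 0.01, n)
spot = 0.6 * futures + rng.normal(0.0, 0.005, n)

# Static OLS hedge ratio: slope of spot returns on futures returns
cov = np.cov(spot, futures)
h = cov[0, 1] / cov[1, 1]

# Hedge effectiveness: variance reduction of the hedged position
effectiveness = 1 - np.var(spot - h * futures) / np.var(spot)
```

In the dynamic strategies, the covariance matrix (and hence h) is re-estimated at each period from the conditional GARCH variances instead of being held constant.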

Relevance:

10.00%

Publisher:

Abstract:

Reactive oxygen species (ROS) are produced by aerobic metabolism and react with biomolecules such as lipids, proteins, and DNA; at high concentrations they lead to oxidative stress. Among ROS, singlet oxygen (¹O₂) is one of the main species involved in oxidative stress and one of the most reactive forms of molecular oxygen. Exposing certain dyes, such as methylene blue (MB), to visible light (MB+VL) generates ¹O₂, which is the principle underlying photodynamic therapy (PDT). ¹O₂ and other ROS cause toxic and carcinogenic effects and have been associated with ageing, neurodegenerative diseases, and cancer. Oxidative DNA damage is mainly repaired by the base excision repair (BER) pathway; however, recent studies have observed the involvement of nucleotide excision repair (NER) factors in the repair of this type of lesion. One of these factors is the Xeroderma Pigmentosum complementation group A (XPA) protein, which acts with other proteins in DNA damage recognition and in the recruitment of other repair factors. Moreover, oxidative agents such as ¹O₂ can induce gene expression. In this context, this study evaluated the response of XPA-deficient cells to treatment with photosensitized MB. We analyzed cell viability and the occurrence of oxidative DNA damage in XPA-proficient and XPA-deficient cell lines after MB+VL treatment, and evaluated XPA expression in proficient and complemented cells. Our results indicate increased resistance to the treatment in complemented cells and a higher level of oxidative damage in the deficient cell lines. Furthermore, the treatment modulated XPA expression for up to 24 hours afterwards. These results provide direct evidence for the involvement of NER enzymes in the repair of oxidative damage and contribute to a better understanding of the effects of PDT on the induction of gene expression.

Relevance:

10.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior

Relevance:

10.00%

Publisher:

Abstract:

In this paper we developed a prototype for dynamic, quantitative analysis of the hardness of metal surfaces by penetration tests. It consists of a micro-indenter driven by a motor-driven gear system. The sample under test is placed on a table containing a load cell that measures the deformation of the sample during penetration of the micro-indenter. With this prototype it is possible to measure the elastic deformation of the material, obtained by calculating the depth of penetration in the sample from the difference in turns between the start of load application, the application of the test load, and the return of the indenter until complete removal of the load. To determine hardness, the depth of plastic deformation was used. Seven types of commercial steel were used to test the apparatus. The dispersion was less than 10% across five measurements on each sample, and the results showed good agreement with the hardness values provided by the manufacturers.
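The depth bookkeeping described above converts screw turns to millimetres and separates the elastic (recovered) part from the plastic part used for hardness. The sketch below is a minimal reading of that procedure; the screw pitch and turn counts are hypothetical values, not the prototype's actual calibration.

```python
def penetration_depths(turns_total, turns_recovered, screw_pitch_mm):
    """Convert indenter screw turns to depths in mm. turns_total is the
    advance from first contact to full test load; turns_recovered is the
    return measured while unloading. All values here are hypothetical."""
    total = turns_total * screw_pitch_mm
    elastic = turns_recovered * screw_pitch_mm   # recovered on unloading
    plastic = total - elastic                    # depth used for hardness
    return total, elastic, plastic

total, elastic, plastic = penetration_depths(2.5, 0.3, 0.5)
```

In the prototype, the hardness value would then be derived from the plastic depth together with the applied load.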

Relevance:

10.00%

Publisher:

Abstract:

In this work we developed a computer simulation program for the physics of porous structures, written in C++ and using a GeForce 9600 GT with the PhysX chip, originally developed for video games. This tool enlarges the capacity for physical interaction between simulated objects, allowing the simulation of porous structures such as reservoir rocks, including structures of high density. The initial procedure is the construction of a cubic porous structure composed of spheres, either of a single size or of varying sizes; structures with various volume fractions can also be simulated. The results are divided into two parts: in the first, the spheres are treated as solid grains, i.e., the matrix phase represents the porosity; in the second, the spheres are treated as pores, so the matrix phase represents the solid phase. The simulations are the same in both cases, but the simulated structures are intrinsically different. To validate the program, simulations were performed varying the number of grains, the grain size distribution, and the void fraction of the structure. All results were statistically reliable and consistent with the literature. The mean values and distributions of the measured stereological parameters, such as linear intercept, section perimeter, section area, and mean free path, agree with results from the literature for the simulated structures.
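The basic setup described above, spheres placed in a cube with the two phases swapping roles, can be sketched without a physics engine. The Python snippet below places spheres at random (overlaps are not resolved, unlike the PhysX packing in the study) and estimates the solid fraction by Monte Carlo sampling; all sizes and counts are illustrative.

```python
import random

random.seed(3)
edge, radius, n_spheres = 10.0, 1.0, 60
# Sphere centres placed uniformly, kept fully inside the cube
centres = [tuple(random.uniform(radius, edge - radius) for _ in range(3))
           for _ in range(n_spheres)]

def inside_a_sphere(point):
    """True if the sample point falls inside any sphere."""
    return any(sum((a - b) ** 2 for a, b in zip(point, c)) <= radius ** 2
               for c in centres)

# Monte Carlo estimate of the solid fraction (spheres treated as grains)
samples = 10000
hits = sum(inside_a_sphere(tuple(random.uniform(0.0, edge)
                                 for _ in range(3)))
           for _ in range(samples))
solid_fraction = hits / samples
porosity = 1.0 - solid_fraction
```

Reinterpreting the spheres as pores, as in the study's second case, simply swaps the two fractions: the same structure then has porosity equal to solid_fraction.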