909 results for data accuracy


Relevance: 20.00%

Abstract:

Objectives. The objectives of this study were to assess the accuracy of working length determination using 3 electronic apex locators and direct digital radiography, and to compare the results with those obtained using the visual method (control measurement). Study design. Twenty extracted human maxillary premolars were selected: 17 two-rooted and 3 single-rooted (a total of 37 canals). Working length was measured using the Elements Diagnostic, Root ZX, and Just II electronic apex locators. Subsequently, the teeth were positioned in the alveolar bone of a dry skull and submitted to direct digital radiography. A variation of ±1 mm was considered acceptable. Results were analyzed using the Wilcoxon and chi-square tests. Results. Within the ±1 mm margin relative to the control measurement, accuracy was 94.6% for Elements Diagnostic, 91.9% for Root ZX, 73.0% for Just II, and 64.9% for direct digital radiography. When exact agreement with the control measurement was required, accuracy fell to 13.51%, 13.51%, 10.10%, and 2.70%, respectively. Conclusions. Root ZX and Elements Diagnostic are more accurate in determining working length than Just II and Schick direct digital radiography. (Oral Surg Oral Med Oral Pathol Oral Radiol Endod 2011;111:e44-e49)
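The accuracy figures above can be back-calculated from the 37-canal sample, and a chi-square comparison of two of the devices can be sketched in a few lines. This is a hypothetical reconstruction, not the study's actual analysis: the counts are inferred from the reported percentages.

```python
from scipy.stats import chi2_contingency

# Counts back-calculated from the reported percentages over 37 canals
# (35/37 = 94.6%, 34/37 = 91.9%, 27/37 = 73.0%, 24/37 = 64.9%).
n = 37
within_1mm = {
    "Elements Diagnostic": 35,
    "Root ZX": 34,
    "Just II": 27,
    "Digital radiography": 24,
}
for device, k in within_1mm.items():
    print(f"{device}: {k}/{n} = {k / n:.1%} within the ±1 mm margin")

# 2x2 chi-square test: Root ZX vs. digital radiography,
# counting canals inside vs. outside the ±1 mm margin.
table = [[34, n - 34],
         [24, n - 24]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
```

The same table layout extends to any pair of methods; only the back-calculated counts change.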

Relevance: 20.00%

Abstract:

Objectives: The diagnosis of caries lesions is still a matter of concern in dentistry. The diagnosis of dental caries by digital radiography has a number of advantages over conventional radiography; however, this method has not been fully explored in the field of paediatric dentistry. This in vitro study evaluated the accuracy of direct digital radiography compared with visual inspection and conventional radiography in the diagnosis of occlusal caries lesions in primary molars. Methods: Fifty molars were selected and evaluated under standardized conditions by 2 previously calibrated examiners according to 3 diagnostic methods (visual inspection, conventional radiography, and direct digital radiography). Direct digital radiographs were obtained with the Dixi3 system (Planmeca, Helsinki, Finland) and the conventional radiographs with InSight film (Kodak Eastman Co., Rochester, NY). The images were scored and a reference standard was obtained histologically. Interexaminer reliability was calculated using Cohen's kappa test, and the specificity, sensitivity, and accuracy of the methods were calculated. Results: Examiner reliability was good. For lesions limited to the enamel, visual inspection showed significantly higher sensitivity and accuracy than both radiographic methods, but no significant difference was found in specificity. For teeth with dentinal caries, no significant differences were found for any parameter when comparing visual and radiographic evaluation. Conclusions: Although less accurate than the visual method for detecting caries lesions confined to the enamel, the direct digital radiographic method is as effective as conventional radiographic examination and visual inspection of primary teeth with occlusal caries when the dentine is involved. Dentomaxillofacial Radiology (2010) 39, 362-367. doi: 10.1259/dmfr/22865872
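The metrics named above are straightforward to compute from a 2x2 confusion matrix against the histological reference standard. The sketch below uses hypothetical counts, not the study's data; `diagnostic_metrics` and `cohens_kappa` are illustrative helpers.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and accuracy from a 2x2 confusion matrix
    against a reference standard (here, histology)."""
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "accuracy": (tp + tn) / total,
    }

def cohens_kappa(scores_a, scores_b):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(scores_a)
    p_obs = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    cats = set(scores_a) | set(scores_b)
    p_exp = sum((scores_a.count(c) / n) * (scores_b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical counts for 50 molars: 30 carious on histology, 28 detected;
# 20 sound, 16 correctly scored sound.
metrics = diagnostic_metrics(tp=28, fp=4, fn=2, tn=16)
print(metrics)  # sensitivity ≈ 0.933, specificity 0.8, accuracy 0.88

# Two examiners' scores on six teeth (0 = sound, 1 = enamel, 2 = dentine)
kappa = cohens_kappa([0, 0, 1, 1, 1, 2], [0, 1, 1, 1, 1, 2])
print(round(kappa, 2))
```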

Relevance: 20.00%

Abstract:

The literature shows contradictory results regarding the role of composite shrinkage and elastic modulus as determinants of polymerization stress. The present study aimed at a better understanding of the test mechanics that could explain such divergences among studies. The hypothesis was that the effects of composite shrinkage and elastic modulus on stress depend upon the compliance of the testing system. A commonly used test apparatus was simulated by finite element analysis, with different compliance levels defined by the bonding substrate (steel, glass, composite, or acrylic). Composites with moduli between 1 and 12 GPa and shrinkage values between 0.5% and 6% were modeled. Shrinkage was simulated by thermal analogy. The hypothesis was confirmed. When shrinkage and modulus increased simultaneously, stress increased regardless of the substrate. However, if shrinkage and modulus were inversely related, their magnitudes and interaction with rod material determined the stress response.
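The thermal analogy mentioned above maps a volumetric shrinkage onto an equivalent temperature change. A minimal sketch, with purely illustrative material values, ignoring Poisson effects and substrate compliance (the zero-compliance case is only an upper bound, not the study's finite element result):

```python
# Thermal analogy for polymerization shrinkage: an isotropic volumetric
# shrinkage S is imposed as a linear thermal strain epsilon = alpha * dT = S / 3,
# so the temperature drop is purely a modelling device.
def equivalent_dT(volumetric_shrinkage, alpha=1e-5):
    """Temperature change that reproduces the target linear strain."""
    return (volumetric_shrinkage / 3.0) / alpha

def rigid_bound_stress_MPa(modulus_GPa, volumetric_shrinkage):
    """Zero-compliance upper bound: sigma = E * epsilon (Poisson effects ignored)."""
    return modulus_GPa * 1000.0 * (volumetric_shrinkage / 3.0)

print(round(equivalent_dT(0.03), 6))                # ≈ 1000.0 K for alpha = 1e-5 per K
print(round(rigid_bound_stress_MPa(6.0, 0.03), 6))  # ≈ 60.0 MPa
```

A compliant substrate (glass, composite, acrylic) relaxes part of this strain, which is why the rod material matters when shrinkage and modulus vary inversely.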

Relevance: 20.00%

Abstract:

Introduction: The aim of this study was to compare the influence of preflaring on the accuracy of 4 electronic apex locators (EALs): Root ZX, Elements Diagnostic Unit and Apex Locator, Mini Apex Locator, and Apex DSP. Methods: Forty extracted teeth were preflared by using S1 and SX ProTaper instruments. The working length was established by reducing 1 mm from the total length (TL). The ability of the EALs to detect precise (-1 mm from TL) and acceptable (-1 ± 0.5 mm from TL) measurements in unflared and preflared canals was determined. Results: The precise and acceptable (P/A) readings in unflared canals for Root ZX, Elements Diagnostic Unit and Apex Locator, Mini Apex Locator, and Apex DSP were 50%/97.5%, 47.5%/95%, 50%/97.5%, and 45%/67.5%, respectively. For preflared canals, the readings were 75%/97.5%, 55%/95%, 75%/97.5%, and 60%/87.5%, respectively. For the precise criterion, preflaring increased the percentage of accurate electronic readings for the Root ZX and the Mini Apex Locator (P < .05). For the acceptable criterion, no differences were found among the Root ZX, Elements Diagnostic Unit and Apex Locator, and Mini Apex Locator (P > .05); the Fisher test indicated lower accuracy for the Apex DSP (P < .05). Conclusions: The Root ZX and the Mini Apex Locator showed significantly increased precision in determining the real working length after preflaring. All the EALs determined the working length acceptably within the ±0.5 mm range, except for the Apex DSP device, which had the lowest accuracy. (J Endod 2009;35:1300-1302)
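The Fisher test comparison can be sketched from the reported percentages, assuming 40 canals per group. The contingency table below is our back-calculated reconstruction, not the study's published table.

```python
from scipy.stats import fisher_exact

# Counts back-calculated from the reported percentages (40 canals per group):
# Root ZX precise readings were 20/40 (50%) unflared and 30/40 (75%) preflared.
table = [[30, 10],   # preflared: precise, not precise
         [20, 20]]   # unflared:  precise, not precise
odds_ratio, p = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p:.3f}")
```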

Relevance: 20.00%

Abstract:

There are two main types of data sources of income distributions in China: household survey data and grouped data. Household survey data are typically available for isolated years and individual provinces. In comparison, aggregate or grouped data are typically available more frequently and usually have national coverage. In principle, grouped data allow investigation of the change of inequality over longer, continuous periods of time, and the identification of patterns of inequality across broader regions. Nevertheless, a major limitation of grouped data is that only mean (average) income and income shares of quintile or decile groups of the population are reported. Directly using grouped data reported in this format is equivalent to assuming that all individuals in a quintile or decile group have the same income. This potentially distorts the estimate of inequality within each region. The aim of this paper is to apply an improved econometric method designed to use grouped data to study income inequality in China. A generalized beta distribution is employed to model income inequality in China at various levels and periods of time. The generalized beta distribution is more general and flexible than the lognormal distribution that has been used in past research, and also relaxes the assumption of a uniform distribution of income within quintile and decile groups of populations. The paper studies the nature and extent of inequality in rural and urban China over the period 1978 to 2002. Income inequality in the whole of China is then modeled using a mixture of province-specific distributions. The estimated results are used to study the trends in national inequality, and to discuss the empirical findings in the light of economic reforms, regional policies, and globalization of the Chinese economy.
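The distortion from assuming a single income within each quintile can be illustrated with a small simulation. The lognormal sample and its parameters below are purely illustrative, not estimates for China.

```python
import math
import random

def gini(values):
    """Gini coefficient via the sorted-index (mean difference) formula."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    cum = sum((2 * i - n - 1) * x for i, x in enumerate(xs, start=1))
    return cum / (n * total)

random.seed(0)
# Illustrative lognormal income sample (parameters chosen arbitrarily)
incomes = sorted(math.exp(random.gauss(0.0, 0.8)) for _ in range(50_000))

true_gini = gini(incomes)

# Collapse to quintiles, i.e. assume everyone in a quintile earns its mean:
k = len(incomes) // 5
quintile_means = [sum(incomes[i * k:(i + 1) * k]) / k for i in range(5)]
grouped_gini = gini([m for m in quintile_means for _ in range(k)])

# Within-group inequality vanishes, so the grouped estimate is biased downward;
# this is the distortion the generalized beta approach is designed to avoid.
print(round(true_gini, 3), round(grouped_gini, 3))
```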

Relevance: 20.00%

Abstract:

This article investigates the researcher's work in the coproduction (or not) of complaint sequences in research interviews. Using a conversation analytic approach, we show how the interviewer's management of complaint sequences in a research setting is consequential for subsequent talk and thus directly affects the data generated. In the examples shown here, researchers sharing cocategorial incumbency with respondents may well provide spaces for research participants to formulate complaints. This article examines sequences of talk surrounding complaints to show how researchers generate complaints (or not) and handle unsafe complaints. Researchers are able to provoke specific types of accounts from respondents, whereas their respondents may actively resist the researchers' direction. For researchers using the interview as a method of data generation, examination of complaint sequences and how these appear in interview data provides insight into how interview talk is coproduced and managed within a socially situated setting.

Relevance: 20.00%

Abstract:

This paper examines the effects of information request ambiguity and construct incongruence on end users' ability to develop SQL queries with an interactive relational database query language. In this experiment, ambiguity in information requests adversely affected accuracy and efficiency. Incongruities among the information request, the query syntax, and the data representation adversely affected accuracy, efficiency, and confidence. The results for ambiguity suggest that organizations might elicit better query development if end users were sensitized to the nature of ambiguities that could arise in their business contexts. End users could translate natural language queries into pseudo-SQL that could be examined for precision before the queries were developed. The results for incongruence suggest that better query development might ensue if semantic distances could be reduced by giving users data representations and database views that maximize construct congruence for the kinds of queries in typical domains. (C) 2001 Elsevier Science B.V. All rights reserved.
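The kind of ambiguity studied here is easy to reproduce: a single natural-language request such as "customers with large orders" admits at least two SQL readings that return different rows. A minimal sketch with a hypothetical schema:

```python
import sqlite3

# Hypothetical orders table; the point is that "customers with large orders"
# could mean a large single order or a large order total.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (customer TEXT, amount REAL);
INSERT INTO orders VALUES
  ('Ada', 900), ('Ada', 50), ('Bob', 400), ('Bob', 400), ('Cid', 700);
""")

# Reading 1: any single order over 500
r1 = con.execute(
    "SELECT DISTINCT customer FROM orders WHERE amount > 500 ORDER BY customer"
).fetchall()

# Reading 2: order total over 500
r2 = con.execute(
    "SELECT customer FROM orders GROUP BY customer "
    "HAVING SUM(amount) > 500 ORDER BY customer"
).fetchall()

print(r1)  # [('Ada',), ('Cid',)]
print(r2)  # [('Ada',), ('Bob',), ('Cid',)]
```

Bob appears only under the second reading, so the two interpretations of the same request yield different answers, which is the accuracy hazard the experiment measured.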

Relevance: 20.00%

Abstract:

With the proliferation of relational database programs for PCs and other platforms, many business end-users are creating, maintaining, and querying their own databases. More importantly, business end-users use the output of these queries as the basis for operational, tactical, and strategic decisions. Inaccurate data reduce the expected quality of these decisions. Implementing various input validation controls, including higher levels of normalisation, can reduce the number of data anomalies entering the databases. Even in well-maintained databases, however, data anomalies will still accumulate. To improve the quality of data, databases can be queried periodically to locate and correct anomalies. This paper reports the results of two experiments that investigated the effects of different data structures on business end-users' abilities to detect data anomalies in a relational database. The results demonstrate that both unnormalised structures and higher levels of normalisation lower the effectiveness and efficiency of queries relative to first normal form. First normal form databases appear to provide the most effective and efficient data structure for business end-users formulating queries to detect data anomalies.
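The periodic anomaly-locating queries described above can be sketched against a hypothetical first normal form table, where a customer name repeated on every order makes an update anomaly detectable by grouping:

```python
import sqlite3

# Hypothetical first-normal-form order table in which the customer name is
# stored on every order row, so an update anomaly appears as one cust_id
# carrying two different spellings.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (order_id INTEGER, cust_id INTEGER, cust_name TEXT);
INSERT INTO orders VALUES
  (1, 101, 'Smith'), (2, 101, 'Smith'),
  (3, 102, 'Jones'), (4, 102, 'Jonse');
""")

# Periodic data-quality query: flag ids whose name is not single-valued.
anomalies = con.execute("""
    SELECT cust_id, COUNT(DISTINCT cust_name) AS variants
    FROM orders
    GROUP BY cust_id
    HAVING COUNT(DISTINCT cust_name) > 1
""").fetchall()

print(anomalies)  # [(102, 2)]
```

In a fully normalised schema the name would be stored once, so this query would have nothing to find; in the flat form it serves as the kind of anomaly-detection query the experiments asked end-users to write.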