881 results for Accuracy.
Abstract:
Clinical Research Data Quality Literature Review and Pooled Analysis
We present a literature review and secondary analysis of data accuracy in clinical research and related secondary data uses. A total of 93 papers meeting our inclusion criteria were categorized according to the data processing methods. Quantitative data accuracy information was abstracted from the articles and pooled. Our analysis demonstrates that the accuracy associated with data processing methods varies widely, with error rates ranging from 2 errors per 10,000 fields to 5019 errors per 10,000 fields. Medical record abstraction was associated with the highest error rates (70–5019 errors per 10,000 fields). Data entered and processed at healthcare facilities had error rates comparable to data processed at central data processing centers. Error rates for data processed with single entry in the presence of on-screen checks were comparable to those for double-entered data. While data processing and cleaning methods may explain a significant amount of the variability in data accuracy, additional factors not resolvable here likely exist.

Defining Data Quality for Clinical Research: A Concept Analysis
Despite notable previous attempts by experts to define data quality, the concept remains ambiguous and subject to the vagaries of natural language. This lack of clarity continues to hamper research related to data quality issues. We present a formal concept analysis of data quality, which builds on and synthesizes previously published work. We further posit that discipline-level specificity may be required to achieve the desired definitional clarity. To this end, we combine work from the clinical research domain with findings from the general data quality literature to produce a discipline-specific definition and operationalization for data quality in clinical research. While the results are helpful to clinical research, the methodology of concept analysis may be useful in other fields to clarify data quality attributes and to achieve operational definitions.

Medical Record Abstractor's Perceptions of Factors Impacting the Accuracy of Abstracted Data
Medical record abstraction (MRA) is known to be a significant source of data errors in secondary data uses. Factors impacting the accuracy of abstracted data are not reported consistently in the literature. Two Delphi processes were conducted with experienced medical record abstractors to assess their perceptions of these factors. The Delphi process identified 9 factors not found in the literature and differed from the literature on 5 factors in the top 25%. The Delphi results refuted seven factors reported in the literature as impacting the quality of abstracted data. The results provide insight into, and indicate content validity of, a significant number of the factors reported in the literature. Further, the results indicate general consistency between the perceptions of clinical research medical record abstractors and those of registry and quality improvement abstractors.

Distributed Cognition Artifacts on Clinical Research Data Collection Forms
Medical record abstraction, a primary mode of data collection in secondary data use, is associated with high error rates. Distributed cognition in medical record abstraction has not been studied as a possible explanation for abstraction errors. We employed the theory of distributed representation and representational analysis to systematically evaluate cognitive demands in medical record abstraction and the extent of external cognitive support employed in a sample of clinical research data collection forms. We show that the cognitive load required for abstraction was high for 61% of the sampled data elements, and exceedingly so for 9%. Further, the data collection forms did not support external cognition for the most complex data elements. High working memory demands are a possible explanation for the association of data errors with data elements requiring abstractor interpretation, comparison, mapping, or calculation. The representational analysis used here can be applied to identify data elements with high cognitive demands.
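To make the error-rate units above concrete, here is a minimal sketch (in Python, with purely illustrative counts, not figures from the reviewed studies) of how per-study error counts are normalized to errors per 10,000 fields and pooled:

```python
# Hypothetical per-study counts: (errors_found, fields_inspected)
studies = [
    (12, 60_000),
    (450, 25_000),
    (7, 35_000),
]

# Per-study rates, normalized to errors per 10,000 fields
rates = [10_000 * e / n for e, n in studies]

# Pooled rate, weighted by the number of fields inspected in each study
total_errors = sum(e for e, _ in studies)
total_fields = sum(n for _, n in studies)
pooled = 10_000 * total_errors / total_fields

print([f"{r:.1f}" for r in rates])
print(f"pooled: {pooled:.1f} errors per 10,000 fields")
```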
Abstract:
This paper addresses the target localization problem in wireless visual sensor networks. Specifically, each node with a low-resolution camera extracts multiple feature points to represent the target at the sensor-node level. A statistical method is presented for merging the position information from different sensor nodes at the base station to select the most correlated feature-point pair. This method reduces the influence of target-extraction accuracy on target-localization accuracy in the universal coordinate system. Simulations show that, compared with related approaches, the proposed method yields more accurate target localization and a better trade-off between camera-node usage and localization accuracy.
Abstract:
The accuracy of Tomás López's historical cartography of the Canary Islands, included in the "Atlas Particular" of the Kingdoms of Spain, Portugal and the Adjacent Islands, is analyzed. For this purpose, we propose a methodology based on Geographic Information Systems (GIS) that compares population centres digitized from the historical cartography with current ones. This study shows that the linear error is small for the smaller islands: Lanzarote, El Hierro, La Palma and La Gomera. On the larger islands of Tenerife, Fuerteventura and Gran Canaria, the error is smaller in the central zones and increases towards the coast. This indicates that Tomás López began his cartography from the central zones of each island, accumulating errors due to the lack of geodetic references as he moved toward the coast.
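A minimal sketch of the kind of comparison described, taking the linear error of a settlement as the great-circle distance between its digitized historical position and its modern counterpart; the coordinates below are hypothetical, not Tomás López's data:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two geographic points."""
    R = 6371.0  # mean Earth radius, km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

# (historical lat, lon) vs. (current lat, lon) for one settlement -- illustrative
error_km = haversine_km(28.10, -15.45, 28.12, -15.41)
print(f"linear error: {error_km:.2f} km")
```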
Abstract:
The aim of this study was to examine the effect of positioning on the correctness of decision making of top-class referees and assistant referees during international games. Match analyses were carried out during the Fédération Internationale de Football Association (FIFA) Confederations Cup 2009, and 380 foul play incidents and 165 offside situations were examined. The error percentage for the referees when indicating the incidents averaged 14%. The lowest error percentage occurred in the central area of the field, where the collaboration of the assistant referee is limited, and was achieved when indicating incidents from a distance of 11–15 m, whereas the percentage peaked (23%) in the final 15-min match period. The error rate for the assistant referees was 13%. The distance of the assistant referee to the offside line did not affect the quality of the offside decision. The risk of making incorrect decisions was reduced when the assistant referees viewed the offside situations from an angle between 46° and 60°. Incorrect offside decisions occurred twice as often in the second half of the games as in the first. Perceptual-cognitive training sessions specific to the requirements of the game should be implemented in the weekly schedule of football officials to reduce the overall error rate.
Abstract:
In pressurized irrigation-water distribution networks, pressure-regulating devices are needed to control the flow rate discharged by irrigation units, owing to the variability of that flow rate. In addition, the applied water volume is usually controlled by operating the valve for a calculated time interval, assuming a constant flow rate. In general, a pressure-regulating valve (PRV) is the pressure-regulating device commonly used in a hydrant, and it also performs the open and close function. A hydrant feeds several irrigation units, requiring a wide range of flow rates. Some flow meters are also available: one as a component of the hydrant and the rest placed downstream. Every landowner has one flow meter for each group of field plots downstream of the hydrant. Its reading could be used to refine the water balance, but its accuracy must be taken into account. An ideal PRV would maintain a constant downstream pressure; the true performance, however, depends on both the upstream pressure and the discharged flow rate. The objective of this work is to assess the influence of PRV performance on the volume applied over a whole year of irrigation events. The results of the study were obtained by introducing the flow rate into a PRV model. Variations in flow rate are simulated by taking into account variations in climate conditions as well as irrigation-operation decisions, such as application duration and frequency. The model comprises the continuity, dynamic, and energy equations of the PRV components.
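A minimal sketch of the mechanism at stake, not the authors' PRV model (which solves the continuity, dynamic, and energy equations): it only illustrates how a pressure-dependent discharge turns imperfect regulation into an applied-volume error during a timed event. The linear droop law, the emitter law, and all coefficients are assumptions for illustration:

```python
P_SET = 30.0      # PRV setpoint for downstream pressure, m of water column
DROOP = 0.02      # assumed regulation droop, m per (m3/h) of discharged flow
K_EMITTER = 0.45  # assumed discharge coefficient of the irrigation unit
X_EMITTER = 0.5   # assumed emitter exponent (~0.5 for turbulent emitters)

def event_volumes(duration_h):
    """Volume assuming ideal regulation vs. volume with droop-affected pressure."""
    q_ideal = K_EMITTER * P_SET ** X_EMITTER   # flow if pressure held at setpoint
    p_real = P_SET - DROOP * q_ideal           # downstream pressure sags with flow
    q_real = K_EMITTER * p_real ** X_EMITTER   # actual discharged flow
    return q_ideal * duration_h, q_real * duration_h

v_assumed, v_applied = event_volumes(3.0)
print(f"assumed {v_assumed:.3f} m3, applied {v_applied:.3f} m3 "
      f"({100 * (v_applied - v_assumed) / v_assumed:+.2f}%)")
```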
Abstract:
A methodology is established to evaluate the cartography of GIS layers.
Abstract:
Novel formulas are proposed to evaluate the accuracy of cartography.
Abstract:
Most forestry applications of airborne laser scanning (ALS) require the integration and simultaneous use of various data sources in pursuit of a variety of objectives. Projects based on remotely sensed data generally consist of progressively upscaling through several data-fusion stages: from the most detailed information obtained for a limited area (the field plot) to a more uncertain forest response sensed over a much larger extent (the airborne or satellite swath). All data sources ultimately rely on global navigation satellite systems (GNSS), which are especially error-prone when operating under forest canopies. Additional processing stages, such as orthorectification, may also be affected by vegetation, deteriorating the accuracy of the optical imagery's reference coordinates. These errors introduce noise into the models, as predictors are displaced from the true position of their response variable. The degree to which forest estimations are affected depends on the spatial dispersion of the variables involved and on the scale used in each case.
This thesis reviews the sources of positioning error that can affect the various inputs involved in an ALS-assisted forest inventory project and how the properties of the forest canopy itself affect their magnitude, recommending methods for reducing them. It also discusses how accuracy and precision should be assessed in each case, and how positioning errors actually affect the quality of forest estimations, with a view to cost-efficient planning of data acquisition. The final optimization of GNSS positioning and of the optical sensor's radiometry allowed us to detect the importance of the latter in predicting the relative density of a monospecific Pinus sylvestris L. forest.
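A minimal sketch of the noise mechanism described, using synthetic data: a spatially smooth predictor surface (an assumption standing in for an ALS canopy metric) is sampled at GNSS-displaced plot coordinates, and the model fit degrades as the positioning error grows:

```python
import numpy as np

rng = np.random.default_rng(0)

def canopy_metric(x, y):
    """Synthetic, spatially smooth ALS predictor surface (illustrative)."""
    return np.sin(x / 15.0) + np.cos(y / 20.0)

n = 400
x, y = rng.uniform(0, 200, n), rng.uniform(0, 200, n)   # true plot positions, m
response = 5.0 + 3.0 * canopy_metric(x, y) + rng.normal(0, 0.3, n)

for gnss_sd in (0.0, 2.0, 5.0, 10.0):   # plot positioning error, metres
    xd = x + rng.normal(0, gnss_sd, n)
    yd = y + rng.normal(0, gnss_sd, n)
    pred = canopy_metric(xd, yd)         # predictor read at the displaced spot
    r = np.corrcoef(pred, response)[0, 1]
    print(f"GNSS sd {gnss_sd:4.1f} m -> R^2 = {r ** 2:.3f}")
```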
Wireless measurement system for structural health monitoring with high time synchronization accuracy
Abstract:
Structural health monitoring (SHM) systems have excellent potential to improve the regular operation and maintenance of structures. Wireless networks (WNs) have been used to avoid the high cost of traditional generic wired systems. The most important limitations of wireless SHM systems are time-synchronization accuracy, scalability, and reliability. A complete wireless system for structural identification under environmental load was designed, implemented, deployed, and tested on three different real bridges. Our contribution ranges from the hardware to the graphical front end. The system's goal is to overcome the main limitations of WNs for SHM, particularly with regard to reliability, scalability, and synchronization. We reduce spatial jitter to 125 ns, far below the 120 μs required for high-precision acquisition systems and much better than the 10 μs of current solutions, without adding complexity. The system scales to a large number of nodes, allowing dense sensor coverage of real-world structures, limited only by the compromise between measurement length and the time required to obtain the final result. The system addresses a myriad of problems encountered in a real deployment under difficult conditions, rather than in a simulation or laboratory test bed.
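The jitter figures above can be put in context with a standard relation (not from the paper): a time offset dt between two nodes appears as a phase error of 2π·f·dt when comparing signals at frequency f. A minimal sketch:

```python
import math

def phase_error_deg(f_hz, dt_s):
    """Phase error between two channels caused by a sync offset dt at f_hz."""
    return math.degrees(2 * math.pi * f_hz * dt_s)

for dt in (125e-9, 10e-6, 120e-6):   # jitter figures quoted above
    print(f"dt = {dt:.0e} s -> {phase_error_deg(10.0, dt):.4f} deg at 10 Hz")
```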
Abstract:
Accuracy in the custody transfer of liquid hydrocarbons is mandatory because it has a great economic impact. By far the most accurate meter is the positive displacement (PD) meter. Increasing that accuracy may adversely affect the cost of the custody transfer unless simple models are developed to lower the cost, which is the purpose of this work. A PD meter consists of a rotating chamber of fixed volume. For each turn a pulse is counted; hence, the measured volume is the number of pulses times the volume of the chamber. This does not coincide with the real volume, so corrections have to be made. All the corrections are grouped into a meter factor. Notable among the corrections is the slippage flow. By solving the Navier-Stokes equations one can find an analytical expression for this flow. Applying the slippage correction directly is neither easy nor cheap; therefore we have built a simple model in which slippage is regarded as a single parameter with the dimension of time. The model has been tested on several PD meters. In our careful experiments, the meter factor grows with temperature at a constant pace of 8×10⁻⁵ °C⁻¹.
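A minimal sketch of the volume bookkeeping described above. The slippage formulation (a leak characterized by a time parameter relative to the chamber turnover time) is an illustrative assumption, not the paper's analytical model; the temperature slope is the value reported in the abstract:

```python
V_CHAMBER = 0.01   # chamber volume per pulse, m3 (illustrative)
TAU = 3600.0       # assumed slippage time parameter, s (illustrative)
TEMP_SLOPE = 8e-5  # meter-factor growth per degC (from the abstract)

def real_volume(pulses, turn_period_s, temp_c, ref_temp_c=20.0):
    """Convert a pulse count into a corrected delivered volume."""
    indicated = pulses * V_CHAMBER
    # Slippage: a fraction of each turn's volume leaks past the chamber;
    # modelled here as turn_period / tau (assumption for illustration).
    meter_factor = 1.0 + turn_period_s / TAU
    # The meter factor grows linearly with temperature
    meter_factor *= 1.0 + TEMP_SLOPE * (temp_c - ref_temp_c)
    return indicated * meter_factor

print(f"{real_volume(12500, turn_period_s=0.5, temp_c=35.0):.4f} m3")
```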
Abstract:
The effects of considering the comminution rate (kc) and correcting for microbial contamination (using ¹⁵N techniques) of particles in the rumen on estimates of ruminally undegraded fractions and their intestinal digestibility were examined by generating composite samples (from rumen-incubated residues) representative of the undegraded feed rumen outflow. The study used sunflower meal (SFM) and Italian ryegrass hay (RGH) and three rumen- and duodenum-cannulated wethers fed a 40:60 RGH-to-concentrate diet (75 g DM/kg BW^0.75). Transit studies up to the duodenum with Yb-marked SFM and Eu-marked RGH samples showed higher kc values (/h) in SFM than in RGH (0.577 vs. 0.0892, p = 0.034), whereas similar values occurred for the rumen passage rate (kp). Estimates of the ruminally undegraded fractions and their intestinal digestibility decreased when kc was considered and also when microbial correction was applied. Thus, microbially uncorrected, kp-based proportions of intestinally digested undegraded crude protein overestimated the corrected, kc-kp-based values by 39% in SFM (0.146 vs. 0.105) and by 761% in RGH (0.373 vs. 0.0433). The results show that both kc and the correction for microbial contamination should be considered to obtain accurate in situ estimates in grasses, whereas in protein concentrates not considering kc is an important source of error.
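A quick check of the overestimation figures quoted above, computing the uncorrected estimate relative to the corrected, kc-kp-based one:

```python
def overestimation_pct(uncorrected, corrected):
    """Relative overestimation of `uncorrected` with respect to `corrected`."""
    return 100.0 * (uncorrected - corrected) / corrected

print(f"SFM: {overestimation_pct(0.146, 0.105):.0f}%")   # ~39%
print(f"RGH: {overestimation_pct(0.373, 0.0433):.0f}%")  # ~761%
```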
Abstract:
Purpose: In this paper we study all settlements shown on the map of the Province of Madrid, sheet number 1 of the AGE (Atlas Geográfico de España of Tomás López, 1804), and their correspondence with current ones. This map is divided into two zones: Madrid and Almonacid de Zorita. Method: The steps of the methodology are as follows: (1) geo-reference the maps within a latitude and longitude framework, moving the historical longitude origin to the origin used by modern cartography; (2) digitize all population settlements (97 in Madrid and 42 in Almonacid de Zorita); (3) identify the historical settlements corresponding to current ones; (4) since the maps share orientation and scale, transform the coordinates of the historical settlements by a translation in latitude and longitude equal to the mean offset computed over all corresponding points; (5) calculate the absolute accuracy of the two maps; (6) map the settlement accuracy in the GIS. Result: Most AGE settlements correspond well with current ones; only 27 settlements could not be matched in Madrid and 2 in Almonacid. The average accuracy is 2.3 km for Madrid and 5.7 km for Almonacid de Zorita. Discussion & Conclusion: The final accuracy map shows that the error is smaller in the middle of the map. This study highlights the great work done by Tomás López in producing this map without fieldwork, demonstrating the great value of his work in the history of cartography.
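A minimal sketch of steps 4 and 5 above, assuming matched settlement pairs: align the historical points to the modern ones with a mean latitude/longitude translation, then report residual distances. Coordinates are illustrative, not the AGE data:

```python
import numpy as np

historical = np.array([[40.42, -3.75], [40.60, -3.92], [40.31, -3.55]])  # lat, lon
modern     = np.array([[40.41, -3.70], [40.58, -3.88], [40.30, -3.51]])

shift = (modern - historical).mean(axis=0)   # mean translation (step 4)
aligned = historical + shift

# Residual error per settlement, in km (rough flat-earth conversion, step 5)
KM_PER_DEG_LAT = 111.32
res = aligned - modern
res_km = np.hypot(res[:, 0] * KM_PER_DEG_LAT,
                  res[:, 1] * KM_PER_DEG_LAT * np.cos(np.radians(modern[:, 0])))
print("shift (deg):", shift, "residuals (km):", res_km.round(2))
```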
Abstract:
Subtraction of Ictal SPECT Co-registered to MRI (SISCOM) is an imaging technique used to localize the epileptogenic focus in patients with intractable partial epilepsy. The aim of this study was to determine the accuracy of the registration algorithms involved in SISCOM analysis using FocusDET, a new user-friendly application. To this end, Monte Carlo simulation was employed to generate realistic SPECT studies. Simulated sinograms were reconstructed using the Filtered BackProjection (FBP) algorithm and an Ordered Subsets Expectation Maximization (OSEM) reconstruction method that included compensation for all degradations. Registration errors in SPECT-SPECT and SPECT-MRI registration were evaluated by comparing the theoretical and actual transforms. Patient studies with well-localized epilepsy were also included in the registration assessment. Global registration errors, including SPECT-SPECT and SPECT-MRI registration errors, were less than 1.2 mm on average and in no case exceeded the voxel size (3.32 mm) of the SPECT studies. Although images reconstructed using OSEM led to lower registration errors than images reconstructed with FBP, the differences between OSEM and FBP reconstructions were less than 0.2 mm on average. This indicates that correction for degradations does not play a major role in the SISCOM process, facilitating the application of the methodology in centers where OSEM is not implemented with correction for all degradations. These findings, together with those obtained by clinicians from patients via MRI, interictal and ictal SPECT, and video-EEG, show that FocusDET is a robust application for performing SISCOM analysis in clinical practice.
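A minimal sketch of how registration error can be quantified by comparing a theoretical (simulated) transform with the one recovered by a registration algorithm, evaluated over points spanning a SPECT volume; the 4×4 rigid transforms below are illustrative, not FocusDET output:

```python
import numpy as np

def rigid(tx, ty, tz, yaw_deg):
    """Homogeneous 4x4 transform: rotation about z plus translation (mm)."""
    a = np.radians(yaw_deg)
    m = np.eye(4)
    m[:2, :2] = [[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]
    m[:3, 3] = (tx, ty, tz)
    return m

theoretical = rigid(2.0, -1.0, 0.5, 1.0)
recovered   = rigid(1.8, -0.9, 0.6, 1.1)   # what the algorithm estimated

# Sample points across the field of view (voxel size 3.32 mm, as above)
pts = np.mgrid[0:128:16, 0:128:16, 0:128:16].reshape(3, -1).T * 3.32
pts_h = np.c_[pts, np.ones(len(pts))]

# Mean displacement between the two mappings, in mm
err = np.linalg.norm((pts_h @ theoretical.T - pts_h @ recovered.T)[:, :3], axis=1)
print(f"mean registration error: {err.mean():.2f} mm")
```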
Abstract:
This paper numerically analyses the electric field distribution of a liquid contained in a Petri dish when exposed to electromagnetic waves excited in a rectangular waveguide. Solutions exhibit high gradients due to the presence of the dielectric liquid contained in the dish. Furthermore, the electromagnetic fields within the dielectric have a dramatically lower value than in the remaining part of the domain, which makes the simulation difficult. Additionally, various singularities of different intensity appear along the boundary of the Petri dish. To properly reproduce and numerically study those effects, we employ a highly accurate hp-adaptive finite element method. The results of this study demonstrate that the electric field generated within the circular Petri dish is non-homogeneous; thus, a better shape, size, or location of the dish is needed to achieve an evenly distributed radiation enabling uniform growth of cell cultures.
Abstract:
The applicability of a portable NIR spectrometer for estimating the °Brix content of grapes by non-destructive measurement has been analysed in the field. The AOTF-NIR Luminar 5030 spectrometer, from Brimrose, was used, working with a spectral range from 1100 to 2300 nm. A total of 600 samples of Cabernet Sauvignon grapes, belonging to two vintages, were measured non-destructively. The specific objective of this research is to analyse the influence of the statistical treatment of the spectral information on the development of °Brix estimation models. Different data pretreatments were tested before applying multivariate analysis techniques to generate estimation models. Calibration using PLS regression applied to spectral data pretreated with multiplicative scatter correction (MSC) was the procedure with the best results. For the models developed with data from the first campaign, errors of about 1.35 °Brix for calibration (SEC = 1.36) and about 1.50 °Brix for validation (SECV = 1.52) were obtained. The coefficients of determination were R² = 0.78 for calibration and R² = 0.77 for validation. In addition, the large variability of the °Brix content across the tested plots was analysed. The variation of °Brix within the plots was up to 4 °Brix for all varieties, always greater than the errors of the generated models; therefore, the models can be considered valid for field application. The models were validated with data from the second campaign, and the validation results were worse than those obtained in the first campaign. We conclude that the spectrometer needs to be adjusted for each season and that specific predictive models should be developed for each vineyard.
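A minimal sketch of the winning pipeline, MSC pretreatment followed by PLS regression, on synthetic data with the shapes mentioned above (600 samples; X stands in for the 1100-2300 nm spectra, y for the °Brix values):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def msc(spectra, reference=None):
    """Multiplicative scatter correction: regress each spectrum on the mean
    spectrum, then remove the fitted additive offset and multiplicative slope."""
    ref = spectra.mean(axis=0) if reference is None else reference
    corrected = np.empty_like(spectra)
    for i, s in enumerate(spectra):
        slope, intercept = np.polyfit(ref, s, 1)
        corrected[i] = (s - intercept) / slope
    return corrected

rng = np.random.default_rng(1)
wl = np.linspace(1100, 2300, 400)                 # wavelength axis, nm
band = np.exp(-((wl - 1700) / 250) ** 2)          # synthetic absorbance band
scale = rng.uniform(0.8, 1.2, (600, 1))           # multiplicative scatter
offset = rng.uniform(-0.05, 0.05, (600, 1))       # additive scatter
X = scale * band + offset + rng.normal(0, 0.01, (600, 400))
y = rng.uniform(18, 26, 600)                      # synthetic °Brix values

model = PLSRegression(n_components=8)
model.fit(msc(X), y)
print(f"R^2 on synthetic training data: {model.score(msc(X), y):.2f}")
```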