909 results for Data accuracy


Relevance:

30.00%

Publisher:

Abstract:

Parkinson's disease (PD) is a degenerative illness whose cardinal symptoms include rigidity, tremor, and slowness of movement. Beyond these widely recognized effects, PD can have a profound effect on speech and voice. The speech symptoms most commonly demonstrated by patients with PD are reduced vocal loudness, monopitch, disruptions of voice quality, and an abnormally fast rate of speech; this cluster of symptoms is often termed hypokinetic dysarthria. Because the disease can be difficult to diagnose accurately, especially in its early stages, automatic techniques based on artificial intelligence could increase diagnostic accuracy and help doctors make better decisions. The aim of this thesis is to predict PD from audio files collected from various patients. The audio files are preprocessed to extract features; the preprocessed data contain 23 attributes and 195 instances, with on average six voice recordings per person. A data compression technique, the Discrete Cosine Transform (DCT), is used to reduce the number of instances. After compression, attribute selection is performed with several built-in WEKA methods (ChiSquared, GainRatio, InfoGain), and the identified attributes are then evaluated one by one using stepwise regression. Based on the selected attributes, classification is carried out in WEKA using a cost-sensitive classifier with algorithms such as MultiPass LVQ, Logistic Model Tree (LMT), and K-Star. The classification results average about 80%; using the selected features, approximately 95% classification accuracy for PD is achieved. This shows that, using the audio dataset, PD can be predicted with a high level of accuracy.
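The compression and attribute-selection steps described above can be sketched as follows. This is a minimal illustration on synthetic data: the array shapes mirror the dataset (about 23 attributes, six recordings per person), but all values, the 32-person split, and the simple correlation-based ranking (standing in for WEKA's ChiSquared/GainRatio/InfoGain scorers) are assumptions, not the thesis's actual pipeline.

```python
import numpy as np
from scipy.fft import dct

rng = np.random.default_rng(0)
n_people, n_rec, n_feat = 32, 6, 22
X = rng.random((n_people, n_rec, n_feat))   # per-recording voice features
y = np.tile([0, 1], 16)                     # 1 = PD, 0 = healthy (synthetic)

# Compress the ~6 recordings per person with a DCT along the recording
# axis, keeping only the low-order coefficient: 6 instances -> 1.
X_compressed = dct(X, axis=1, norm='ortho')[:, 0, :]   # shape (32, 22)

# Stand-in for WEKA's attribute rankers: score each attribute by its
# absolute correlation with the class label and keep the top 10.
scores = np.abs([np.corrcoef(X_compressed[:, j], y)[0, 1]
                 for j in range(n_feat)])
top10 = np.argsort(scores)[::-1][:10]
X_selected = X_compressed[:, top10]
print(X_selected.shape)   # (32, 10)
```

The reduced matrix `X_selected` would then be exported for the classifier stage (cost-sensitive LVQ/LMT/K-Star in WEKA), which is not reproduced here.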

Relevance:

30.00%

Publisher:

Abstract:

Good data quality with high complexity is often seen as important. Intuition says that the higher the accuracy and complexity of the data, the better the analytic solutions become, provided the increasing computing time can be handled. For most practical computational problems, however, high-complexity data means that computation times become too long or that the heuristics used to solve the problem struggle to reach good solutions. This is stressed even further as the size of the combinatorial problem increases. Consequently, simplified data are often needed to deal with complex combinatorial problems. In this study we examine how the complexity and accuracy of a network affect the quality of heuristic solutions for different sizes of the combinatorial problem. We evaluate this question with the commonly used p-median model, which finds optimal locations in a network for p supply points that serve n demand points. We vary both the accuracy (the number of nodes) of the network and the size of the combinatorial problem (p). The investigation is conducted as a case study of Dalecarlia, a region in Sweden with an asymmetrically distributed population (15,000 weighted demand points). To locate 5 to 50 supply points we use the national transport administration's official road network (NVDB), which consists of 1.5 million nodes. We start with 500 candidate nodes in the network and increase the number of candidate nodes in steps up to 67,000 (aggregated from the 1.5 million nodes). To find the optimal solution we use a simulated annealing algorithm with adaptive tuning of the temperature. The results show only a limited improvement in the optimal solutions when the accuracy of the road network increases and the combinatorial problem is simple (low p). When the combinatorial problem is complex (large p), the improvements from increasing the accuracy of the road network are much larger. The results also show that the best choice of network accuracy depends on the complexity of the combinatorial problem (varying p).
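A minimal sketch of the p-median objective and the simulated annealing search described above, on a toy planar network. Euclidean distances stand in for the NVDB road-network distances, the fixed geometric cooling schedule stands in for the study's adaptive temperature tuning, and all sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
demand = rng.random((200, 2))        # toy demand points (the study uses ~15,000)
weights = rng.random(200)            # population weight per demand point
candidates = rng.random((40, 2))     # candidate supply nodes (study: 500-67,000)
p = 5

# Euclidean distances stand in for road-network distances.
dist = np.linalg.norm(demand[:, None, :] - candidates[None, :, :], axis=2)

def cost(sol):
    # p-median objective: total weighted distance from every demand
    # point to its nearest chosen supply point.
    return float(weights @ dist[:, sol].min(axis=1))

init = list(rng.choice(40, size=p, replace=False))
init_cost = cost(init)
sol, best, best_cost = init[:], init[:], init_cost
T = 1.0
for _ in range(2000):
    new = sol[:]
    new[int(rng.integers(p))] = int(rng.integers(40))   # move one supply point
    if len(set(new)) < p:
        continue                                        # skip duplicate nodes
    delta = cost(new) - cost(sol)
    # Accept improvements always; accept worse moves with Boltzmann probability.
    if delta < 0 or rng.random() < np.exp(-delta / T):
        sol = new
        if cost(sol) < best_cost:
            best, best_cost = sol[:], cost(sol)
    T *= 0.995   # geometric cooling (the study tunes T adaptively)

print(len(best), round(best_cost, 3))
```

Increasing the number of `candidates` plays the role of raising network accuracy; increasing `p` raises the combinatorial complexity.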

Relevance:

30.00%

Publisher:

Abstract:

Jakarta is vulnerable to flooding, mainly caused by prolonged and heavy rainfall, so robust hydrological modeling is called for. Good-quality spatial precipitation data are therefore desired so that a good hydrological model can be achieved. Two sources of rainfall data are available: satellite and gauge-station observations. At-site rainfall is considered a reliable and accurate source, but the limited number of stations makes spatial interpolation unappealing. Gridded satellite rainfall, on the other hand, now offers high spatial resolution and improved accuracy, yet it remains less accurate than gauge observations. To achieve a better precipitation data set, this study proposes the cokriging method, a blending algorithm, to yield a blended satellite-gauge gridded rainfall at approximately 10-km resolution. The Global Satellite Mapping of Precipitation product (GSMaP, 0.1°×0.1°) and daily rainfall observations from gauge stations are used. The blended product is compared with the satellite data by cross-validation and is then used to re-calibrate the hydrological model. Several scenarios are simulated with hydrological models calibrated on gauge observations alone and on the blended product, and the performance of the two calibrated models is assessed and compared based on simulated and observed runoff.
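Full cokriging requires fitting variograms and cross-variograms; the sketch below uses a much simpler residual-based blend (the satellite field corrected by inverse-distance-weighted gauge residuals) only to illustrate the idea of merging gridded and at-site rainfall. The grid, gauge locations, and rain amounts are all invented.

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy 0.1-degree grid standing in for GSMaP cells around Jakarta.
lon, lat = np.meshgrid(np.linspace(106.5, 107.0, 6), np.linspace(-6.5, -6.0, 6))
satellite = 10 + 5 * rng.random(lon.shape)     # gridded rain (mm/day)

gauges = np.array([[106.6, -6.4], [106.8, -6.2], [106.9, -6.1]])
gauge_obs = np.array([14.0, 9.0, 12.0])        # at-site rain (mm/day)

# Residuals between the gauges and the co-located satellite cells.
gi = [np.unravel_index(np.argmin((lon - g[0])**2 + (lat - g[1])**2), lon.shape)
      for g in gauges]
residual = gauge_obs - np.array([satellite[i] for i in gi])

# Spread the residuals over the grid by inverse-distance weighting, a
# lightweight stand-in for the cross-variogram weights of cokriging.
d2 = (lon[..., None] - gauges[:, 0])**2 + (lat[..., None] - gauges[:, 1])**2
w = 1.0 / (d2 + 1e-6)
blended = satellite + (w * residual).sum(axis=-1) / w.sum(axis=-1)
print(blended.shape)   # (6, 6)
```

Near each gauge the blended field is pulled toward the gauge observation, while far from the gauges it relaxes back to the satellite estimate.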

Relevance:

30.00%

Publisher:

Abstract:

This work compares the forecasting efficiency of different methodologies applied to Brazilian consumer inflation (IPCA). We compare forecasting models using disaggregated and aggregated data over horizons of up to twelve months ahead. The disaggregated models were estimated by SARIMA at different levels of disaggregation. The aggregated models were estimated by time-series techniques such as SARIMA, state-space structural models, and Markov-switching models. Forecast accuracy is compared through the model selection procedure known as the Model Confidence Set and through the Diebold-Mariano procedure. We find evidence of forecast accuracy gains in models using more disaggregated data.
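A minimal sketch of the Diebold-Mariano comparison mentioned above, under squared-error loss with a plain variance estimate (the full test uses a HAC variance correction for multi-step horizons, and the Model Confidence Set procedure is not shown). The forecast-error series are synthetic stand-ins for the aggregated and disaggregated models' errors.

```python
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2):
    """DM statistic for equal predictive accuracy under squared-error
    loss (one-step horizon, plain variance estimate, no HAC correction)."""
    d = e1**2 - e2**2                              # loss differential series
    dm = d.mean() / np.sqrt(d.var(ddof=1) / len(d))
    pval = 2 * (1 - stats.norm.cdf(abs(dm)))       # two-sided normal p-value
    return dm, pval

rng = np.random.default_rng(3)
errs_agg = rng.normal(0, 1.3, 120)   # hypothetical aggregated-model errors
errs_dis = rng.normal(0, 1.0, 120)   # hypothetical disaggregated-model errors
dm, pval = diebold_mariano(errs_agg, errs_dis)
print(round(float(dm), 2), pval < 0.05)
```

A positive DM statistic here indicates larger average loss for the first series, i.e. the disaggregated model forecasting more accurately.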

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Purpose: The objective of this study was to evaluate and compare three impression techniques for osseointegrated implant transfer procedures. Materials and Methods: (1) Group Splinted with Acrylic Resin (SAR): impression with square copings splinted with a prefabricated autopolymerizing acrylic resin bar; (2) Group Splinted with Light-Curing Resin (SLR): impression with square copings splinted with a prefabricated light-curing composite resin bar; (3) Group Independent Air-abraded (IAA): impression with independent square copings air-abraded with aluminum oxide. Impression procedures were performed with a polyether material, and the data obtained were compared with a control group, characterized by metal matrix (MM) measurements of the implant inclination positions at 90 and 05 degrees in relation to the matrix surface. Readings of analog and implant inclinations were assessed randomly using AutoCAD graphic-computation software. The experimental groups' angular deviations from the MM were submitted to analysis of variance, and means were compared by Tukey's test (P < .05). Results: There was no statistically significant difference between the SAR and SLR experimental groups and the MM for vertical and angulated implants. Group IAA presented a statistically significant difference for angulated implants. Conclusion: Within the limitations of this study, it was concluded that the SAR and SLR techniques produced more accurate casts than the IAA technique, which presented inferior results.
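The statistical step described above, analysis of variance on angular deviations before Tukey's pairwise test, can be sketched with a one-way ANOVA; all deviation values below are invented for illustration and are not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical angular deviations (degrees) from the metal matrix for
# the three impression techniques; the numbers are illustrative only.
sar = rng.normal(0.20, 0.05, 10)   # splinted, acrylic resin bar
slr = rng.normal(0.22, 0.05, 10)   # splinted, light-curing resin bar
iaa = rng.normal(0.45, 0.05, 10)   # independent, air-abraded copings

# One-way ANOVA across the three groups; the study follows up with
# Tukey's test at P < .05 to identify which pairs differ.
f, p = stats.f_oneway(sar, slr, iaa)
print(p < 0.05)   # True: the IAA group mean is far from the other two
```

A significant ANOVA only says that some group differs; the pairwise Tukey comparison is what isolates IAA as the inferior technique.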

Relevance:

30.00%

Publisher:

Abstract:

The objective of this work is to present a preliminary investigation of the accuracy of the results of a transmitter geolocation system developed using the software of the Brazilian data collection network. A set of Doppler-shift measurements from a single satellite pass, considering one Data Collection Platform (PCD) and a network of ground reception stations, is called a data reception network. By using multiple reception stations, the Brazilian data collection network will allow an increase in the amount of collected data, with a consequent improvement in the precision and reliability of the computed locations, and therefore a larger number of valid and more precise locations. The results and analyses were obtained under two conditions: first, a practical condition with real data and simulated ideal data, comparing results for the same satellite pass, transmitter, and two known reception stations; second, ideal conditions simulated from measurements of a fixed transmitter, three reception stations, and two satellites. The results obtained with the data reception network were quite satisfactory. The study showed the importance of installing new ground reception stations distributed across the national territory to increase the number of measurements and, consequently, the number of valid and more precise locations.

Relevance:

30.00%

Publisher:

Abstract:

The geometric accuracy of a close-range photogrammetric system is assessed in this paper, with surface reconstruction by structured light as its main purpose. The system is based on an off-the-shelf digital camera and a pattern projector. The mathematical model for reconstruction combines the parametric equation of the projected straight line with the collinearity equations. A sequential approach for system calibration was developed and is presented. Results obtained from real data are also presented and discussed. Experiments with a prototype and real data indicated an accuracy of 0.5 mm in height determination and 0.2 mm in the XY plane, for an application in which the object was 1630 mm from the camera.
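The core reconstruction idea, intersecting the camera ray given by the collinearity model with the ray of the projected line, can be illustrated as a least-squares intersection of two 3-D lines. The camera/projector geometry below is invented for the sketch; only the 1630 mm object distance echoes the text.

```python
import numpy as np

def closest_point(p1, d1, p2, d2):
    """Least-squares intersection of two 3-D lines given points p and
    unit directions d: midpoint of the shortest connecting segment."""
    # Normal equations for t1, t2 minimizing |(p1 + t1*d1) - (p2 + t2*d2)|.
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    b = np.array([(p2 - p1) @ d1, (p2 - p1) @ d2])
    t1, t2 = np.linalg.solve(A, b)
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2

# Toy geometry: camera at the origin, projector offset along X (a
# stand-in for the calibrated camera/projector pair).
target = np.array([100.0, 50.0, 1630.0])     # object point, mm
cam_origin = np.zeros(3)
proj_origin = np.array([300.0, 0.0, 0.0])
cam_dir = target - cam_origin
proj_dir = target - proj_origin
X = closest_point(cam_origin, cam_dir / np.linalg.norm(cam_dir),
                  proj_origin, proj_dir / np.linalg.norm(proj_dir))
print(np.round(X))   # [ 100.   50. 1630.]
```

With calibration errors the two rays no longer meet exactly, and the length of the shortest connecting segment gives a per-point measure of reconstruction quality.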

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to determine the accuracy of mechanical torque devices in delivering target torque values in dental offices in Salvador, Brazil. A team of researchers visited 16 dental offices, and the clinicians applied torque values (20 and 32 Ncm) to electronic torque controllers. Five repetitions were completed at each torque value and the data were collected. At the 20-Ncm target, 62.5% of the measured values were accurate (within 10% of the target value); at 32 Ncm, however, only 37.5% of the measured values were accurate. Several of the tested mechanical torque devices were inaccurate. Int J Prosthodont 2011;24:38-39.
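The accuracy criterion used above, a reading counting as accurate when it falls within 10% of the target torque, is simple to compute; the readings below are invented for illustration.

```python
# Fraction of torque readings within +/-10% of the target value, the
# accuracy criterion used in the study (readings are illustrative).
def within_tolerance(measured, target, tol=0.10):
    return sum(abs(m - target) <= tol * target for m in measured) / len(measured)

readings_20 = [19.1, 20.5, 22.4, 18.2, 20.0, 23.0, 19.8, 21.5]
print(f"{within_tolerance(readings_20, 20.0):.1%} within 10% of 20 Ncm")
# prints "75.0% within 10% of 20 Ncm" (6 of 8 fall inside 18-22 Ncm)
```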

Relevance:

30.00%

Publisher:

Abstract:

Starting from the well-established form of the Dirac action coupled to the electromagnetic and torsion fields, we find an additional softly broken local symmetry associated with torsion. This symmetry fixes the form of the divergences of the effective action after the spinor fields are integrated out. The requirement of renormalizability then forces the torsion field to be equivalent to a massive pseudovector, and fixes its action up to the values of the torsion-spinor coupling constant, the torsion mass, and higher-derivative terms. Implementing this action in the abelian sector of the Standard Model, we establish upper bounds on the torsion mass and coupling. In our study we used present experimental limits on four-fermion contact interactions (LEP, HERA, SLAC, SLD, CCFR) and TEVATRON limits on the cross section of a new gauge boson that could be produced as a resonance in high-energy pp̄ collisions. © 1998 Elsevier Science B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Until mid-2006, the SCIAMACHY data processors for the operational retrieval of nitrogen dioxide (NO2) column data were based on the historical version 2 of the GOME Data Processor (GDP). On top of known problems inherent to GDP 2, ground-based validations of SCIAMACHY NO2 data revealed issues specific to SCIAMACHY, such as a large cloud-dependent offset at northern latitudes. In 2006, the GDOAS prototype algorithm of the improved GDP version 4 was transferred to the off-line SCIAMACHY Ground Processor (SGP) version 3.0, and in parallel the calibration of SCIAMACHY radiometric data was upgraded. Before the operational switch-on of SGP 3.0 and the public release of upgraded SCIAMACHY NO2 data, we investigated the accuracy of the algorithm transfer (a) by checking the consistency of SGP 3.0 with the prototype algorithms, and (b) by comparing SGP 3.0 NO2 data with ground-based observations reported by the WMO/GAW NDACC network of UV-visible DOAS/SAOZ spectrometers. This delta-validation study concludes that SGP 3.0 is a significant improvement with respect to the previous processor, IPF 5.04. For three particular SCIAMACHY states, the study reveals unexplained features in the slant columns and air mass factors, although their quantitative impact on SGP 3.0 vertical columns is not significant.

Relevance:

30.00%

Publisher:

Abstract:

The contents of some nutrients in 35 Brazilian green and roasted coffee samples were determined by flame atomic absorption spectrometry (Ca, Mg, Fe, Cu, Mn, and Zn), flame atomic emission photometry (Na and K), and the Kjeldahl method (N), after preparing the samples by wet digestion using (i) a heating-block digester and (ii) a conventional microwave oven system with pressure and temperature control. The accuracy of the procedures was checked with three standard reference materials (National Institute of Standards and Technology SRM 1573a Tomato Leaves, SRM 1547 Peach Leaves, and SRM 1570a Trace Elements in Spinach). A t-test showed that the results obtained by microwave-assisted digestion were more accurate than those obtained with the block digester at the 95% confidence level. In addition to better accuracy, other favorable characteristics were lower analytical blanks, lower reagent consumption, and shorter digestion times. Exploratory analysis of the results using Principal Component Analysis (PCA) and Hierarchical Cluster Analysis (HCA) showed that Na, K, Ca, Cu, Mg, and Fe were the principal elements for discriminating between green and roasted coffee samples. ©2007 Sociedade Brasileira de Química.
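The PCA step of the exploratory analysis can be sketched with an SVD on autoscaled data; the element concentrations below are synthetic stand-ins for the measured Na, K, Ca, Cu, Mg, and Fe values, with an assumed green-versus-roasted mean shift.

```python
import numpy as np

rng = np.random.default_rng(5)
# Synthetic concentrations for the six discriminating elements (Na, K,
# Ca, Cu, Mg, Fe); the means and shifts are illustrative, not measured.
mu_green = np.array([40.0, 17000.0, 1200.0, 15.0, 2000.0, 35.0])
mu_roast = mu_green * np.array([1.3, 0.9, 1.1, 1.2, 0.95, 1.15])
green = mu_green * (1 + 0.05 * rng.standard_normal((10, 6)))
roast = mu_roast * (1 + 0.05 * rng.standard_normal((10, 6)))
X = np.vstack([green, roast])

# Autoscale (mean-center, unit variance) so the large K values do not
# dominate, then compute the PCA via the SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * S                       # sample scores on the components
explained = S**2 / (S**2).sum()      # variance explained per component
print(scores.shape)                  # (20, 6)
```

Plotting the first two columns of `scores` would show the green and roasted samples separating along the leading component, the pattern the PCA/HCA analysis reports.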