889 results for Method Evaluation


Relevance:

30.00%

Publisher:

Abstract:

Crop type classification using remote sensing data plays a vital role in planning cultivation activities and in the optimal use of available fertile land; a reliable and precise classification of agricultural crops can therefore help improve agricultural productivity. In this paper, a gene expression programming (GEP) based fuzzy logic approach for multiclass crop classification using multispectral satellite images is proposed. The purpose of this work is to exploit the optimization capabilities of GEP for tuning the fuzzy membership functions. The capability of GEP as a classifier is also studied. The proposed method is compared with Bayesian and maximum likelihood classifiers in terms of performance. The results show that the proposed method is effective for crop classification.
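
As an illustration of the classification step (not code from the paper), the following Python sketch shows a fuzzy classifier whose Gaussian membership-function parameters would, in the proposed approach, be tuned by GEP rather than set by hand; the crop names, band values, and (center, sigma) parameters are hypothetical.

    import numpy as np

    def gaussian_mf(x, c, sigma):
        # Gaussian fuzzy membership: degree to which band value x belongs to a class
        return np.exp(-0.5 * ((x - c) / sigma) ** 2)

    def classify_pixel(pixel, params):
        # pixel: spectral band values; params[crop] = (center, sigma) per band.
        # Class score = product of per-band memberships (fuzzy AND); GEP would
        # evolve these (center, sigma) parameters instead of fixing them a priori.
        scores = {}
        for crop, band_params in params.items():
            memberships = [gaussian_mf(x, c, s) for x, (c, s) in zip(pixel, band_params)]
            scores[crop] = float(np.prod(memberships))
        return max(scores, key=scores.get)

    # Hypothetical tuned parameters for two crops over three spectral bands
    params = {"wheat": [(0.42, 0.05), (0.31, 0.04), (0.55, 0.06)],
              "paddy": [(0.28, 0.05), (0.47, 0.04), (0.39, 0.06)]}
    print(classify_pixel(np.array([0.41, 0.30, 0.54]), params))  # -> wheat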

Relevance:

30.00%

Publisher:

Abstract:

Single receive antenna selection (AS) is a popular method for obtaining diversity benefits without the additional costs of multiple radio receiver chains. Since only one antenna receives at any time, the transmitter sends a pilot multiple times to enable the receiver to estimate the channel gains of its N antennas to the transmitter and select an antenna. In time-varying channels, the channel estimates of different antennas are outdated to different extents. We analyze the symbol error probability (SEP) of the N-pilot and (N+1)-pilot AS training schemes in time-varying channels. In the former, the transmitter sends one pilot for each receive antenna. In the latter, the transmitter sends one additional pilot that helps sample the channel fading process of the selected antenna twice. We present several new results about the SEP, optimal energy allocation across pilots and data, and the optimal selection rule in time-varying channels for the two schemes. We show that, due to the unique nature of AS, the (N+1)-pilot scheme, despite its longer training duration, is much more energy-efficient than the conventional N-pilot scheme. An extension to a practical scenario where all data symbols of a packet are received by the same antenna is also investigated.
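
To make the effect of outdated channel estimates concrete, here is a small Monte Carlo sketch in Python (an illustration, not the paper's analytical SEP derivation). It models outdating with a per-antenna Gauss-Markov correlation, selects on the stale estimates, and averages the exact conditional SEP assuming BPSK; the correlation values and SNR are hypothetical.

    import numpy as np
    from scipy.special import erfc

    rng = np.random.default_rng(0)
    N, trials, snr = 4, 100_000, 10.0   # antennas, Monte Carlo runs, linear SNR

    def rayleigh(shape):
        return (rng.standard_normal(shape) + 1j * rng.standard_normal(shape)) / np.sqrt(2)

    # Gauss-Markov fading: channel at data time = rho * channel at pilot time + innovation.
    # In the N-pilot scheme each antenna is trained at a different time, so the
    # estimates are outdated to different extents (earlier pilot -> smaller rho).
    rho = np.array([0.90, 0.93, 0.96, 0.99])
    h_pilot = rayleigh((trials, N))
    h_data = rho * h_pilot + np.sqrt(1.0 - rho ** 2) * rayleigh((trials, N))

    sel = np.argmax(np.abs(h_pilot), axis=1)        # select on the outdated estimates
    h_sel = h_data[np.arange(trials), sel]

    # Conditional BPSK error Q(sqrt(2*snr)|h|) = 0.5*erfc(sqrt(snr)|h|), averaged over fading
    sep = np.mean(0.5 * erfc(np.sqrt(snr) * np.abs(h_sel)))
    print(f"SEP with outdated selection: {sep:.4f}")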

Relevance:

30.00%

Publisher:

Abstract:

We report a blood pressure evaluation methodology based on recording the radial arterial pulse waveform in real time using a fiber Bragg grating pulse device (FBGPD). The pressure responses of the arterial pulse, in the form of beat-to-beat pulse amplitude and arterial diametrical variations, are monitored. In particular, unique signatures of pulse pressure variation are recorded in the arterial pulse waveform that indicate the systolic and diastolic blood pressure while the patient is subjected to a sphygmomanometric blood pressure examination. The proposed method of blood pressure evaluation using the FBGPD has been validated against the auscultatory method of detecting the acoustic pulses (Korotkoff sounds) with an electronic stethoscope. (C) 2013 Society of Photo-Optical Instrumentation Engineers (SPIE)
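
For illustration, a minimal Python sketch of the beat-to-beat peak-to-peak amplitude extraction that such an analysis relies on (the FBGPD signal processing in the paper may differ, and the test waveform here is synthetic):

    import numpy as np
    from scipy.signal import find_peaks

    def beat_amplitudes(signal, fs, max_hr=180):
        # Return the peak-to-peak amplitude of each detected beat, the quantity
        # tracked in the radial pulse waveform during cuff deflation.
        min_dist = int(fs * 60.0 / max_hr)          # minimum samples between beats
        peaks, _ = find_peaks(signal, distance=min_dist)
        troughs, _ = find_peaks(-signal, distance=min_dist)
        amps = []
        for p in peaks:
            prior = troughs[troughs < p]
            if prior.size:
                amps.append(signal[p] - signal[prior[-1]])
        return np.array(amps)

    # Synthetic 10 s "radial pulse" sampled at 500 Hz (1.2 Hz beat + noise)
    fs = 500
    t = np.arange(0, 10, 1.0 / fs)
    rng = np.random.default_rng(0)
    pulse = np.sin(2 * np.pi * 1.2 * t) ** 3 + 0.05 * rng.standard_normal(t.size)
    print(beat_amplitudes(pulse, fs)[:5])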

Relevance:

30.00%

Publisher:

Abstract:

A necessary step in the recognition of scanned documents is binarization, which is essentially the segmentation of the document. Several binarization algorithms can be found in the literature, so which one gives the best binarization result for a given document image? To answer this question, a user needs to check different binarization algorithms for suitability, since different algorithms may work better for different types of documents, and manually choosing the best from a set of binarized documents is time-consuming. To automate the selection of the best segmented document, we either need the ground truth of the document or an evaluation metric. If ground truth is available, precision and recall can be used to choose the best binarized document. But what if ground truth is not available? Can we devise a metric that evaluates these binarized documents? We therefore propose a metric for evaluating binarized document images using eigenvalue decomposition. We have evaluated this measure on the DIBCO and H-DIBCO datasets. The proposed method chooses the binarized document that is closest to the ground truth of the document.
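
The paper's exact eigen-decomposition metric is not reproduced here, but the following Python sketch illustrates the general idea of a reference-free measure: rank candidate binarizations by how well they preserve the dominant spectral structure of the grayscale document. The use of singular values and the particular distance are assumptions for illustration.

    import numpy as np

    def spectral_distance(gray, binary, k=20):
        # Compare the top-k singular values of the grayscale document and a
        # candidate binarization; a smaller distance suggests the binarization
        # preserves more of the document's global structure.
        g = gray.astype(float) / 255.0
        b = binary.astype(float)
        sg = np.linalg.svd(g, compute_uv=False)[:k]
        sb = np.linalg.svd(b, compute_uv=False)[:k]
        sg, sb = sg / sg.sum(), sb / sb.sum()       # normalize the spectra
        return float(np.linalg.norm(sg - sb))

    def pick_best(gray, candidates):
        # candidates: outputs of different binarization algorithms
        # (e.g., Otsu, Sauvola, Niblack) for the same document image.
        return min(candidates, key=lambda b: spectral_distance(gray, b))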

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we evaluate the performance of a burst retransmission method for an optical burst switched (OBS) network with the intermediate-node-initiation (INI) signaling technique. The proposed method aims to reduce the burst contention probability at the intermediate core nodes. We develop an analytical model for the burst contention and burst loss probabilities of an OBS network with INI signaling, and we simulate the performance of optical burst retransmission. Simulation results show that at low traffic loads the loss probability is low compared to conventional burst retransmission in the OBS network. Results also show that the retransmission method for an OBS network with INI signaling significantly reduces the burst loss probability.
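
For intuition (not the paper's analytical model), a toy Monte Carlo in Python showing why retransmission lowers the loss probability when the per-attempt contention probability is modest; equating that probability with the offered load is a deliberate oversimplification.

    import numpy as np

    rng = np.random.default_rng(1)

    def loss_probability(load, max_retx, trials=200_000):
        # A burst is lost only if its original transmission and every
        # retransmission all suffer contention at an intermediate core node.
        p_c = load                      # crude stand-in for contention probability
        attempts = 1 + max_retx
        contended = rng.random((trials, attempts)) < p_c
        return contended.all(axis=1).mean()

    for load in (0.1, 0.3, 0.5):
        print(f"load={load}: no retx={loss_probability(load, 0):.3f}, "
              f"one retx={loss_probability(load, 1):.3f}")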

Relevance:

30.00%

Publisher:

Abstract:

Background: Deviated nasal septum (DNS) is one of the major causes of nasal obstruction. The polyvinylidene fluoride (PVDF) nasal sensor is a new technique developed to assess the nasal obstruction caused by DNS. This study evaluates PVDF nasal sensor measurements in comparison with peak nasal inspiratory flow (PNIF) measurements and a visual analog scale (VAS) of nasal obstruction. Methods: Owing to their piezoelectric property, two PVDF nasal sensors provide output voltage signals corresponding to the right and left nostrils when subjected to nasal airflow. The peak-to-peak amplitude of the voltage signal corresponding to nasal airflow was analyzed to assess nasal obstruction. PVDF nasal sensor and PNIF measurements were performed on 30 healthy subjects and 30 DNS patients. Receiver operating characteristic (ROC) analysis was used to compare the two methods in detecting DNS. Results: Measurements of the PVDF nasal sensor strongly correlated with findings of PNIF (r = 0.67; p < 0.01) in DNS patients. A significant difference (p < 0.001) was observed between the PVDF nasal sensor measurements and PNIF measurements of the DNS and control groups. Cutoffs between normal and pathological of 0.51 Vp-p for the PVDF nasal sensor and 120 L/min for PNIF were calculated. No significant difference was found between the PVDF nasal sensor and PNIF in terms of sensitivity (89.7% versus 82.6%) or specificity (80.5% versus 78.8%). Conclusion: The results show that PVDF measurements closely agree with PNIF findings. The developed PVDF nasal sensor is an objective method that is simple, inexpensive, fast, and portable for determining DNS in clinical practice.
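
A short Python sketch of the kind of ROC-based cutoff computation reported above (illustrative only: the sample values are synthetic, and scikit-learn's roc_curve with Youden's J is one standard way to pick a cutoff, not necessarily the study's exact procedure).

    import numpy as np
    from sklearn.metrics import roc_curve

    def optimal_cutoff(values, labels):
        # Lower sensor amplitude indicates obstruction, so use score = -value.
        fpr, tpr, thresholds = roc_curve(labels, -values)
        j = tpr - fpr                              # Youden's J statistic
        i = int(np.argmax(j))
        return -thresholds[i], tpr[i], 1.0 - fpr[i]

    # Hypothetical peak-to-peak amplitudes (Vp-p): 30 controls, then 30 DNS patients
    rng = np.random.default_rng(2)
    vals = np.concatenate([rng.normal(0.65, 0.10, 30), rng.normal(0.40, 0.10, 30)])
    labels = np.array([0] * 30 + [1] * 30)         # 1 = DNS
    cutoff, sens, spec = optimal_cutoff(vals, labels)
    print(f"cutoff={cutoff:.2f} Vp-p, sensitivity={sens:.2f}, specificity={spec:.2f}")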

Relevance:

30.00%

Publisher:

Abstract:

Sparse estimation methods that utilize the l_p norm, with p between 0 and 1, have shown better utility in providing optimal solutions to the inverse problem in diffuse optical tomography. These l_p-norm-based regularizations make the optimization function nonconvex, and algorithms that implement l_p-norm minimization utilize approximations to the original l_p-norm function. In this work, three such typical methods for implementing the l_p norm were considered, namely, iteratively reweighted l_1 minimization (IRL1), iteratively reweighted least squares (IRLS), and the iterative thresholding method (ITM). These methods were deployed for diffuse optical tomographic image reconstruction, and a systematic comparison was executed with the help of three numerical and gelatin phantom cases. The results indicate that these three implementations of l_p minimization yield similar results, with IRL1 faring marginally better in the cases considered here in terms of shape recovery and quantitative accuracy of the reconstructed diffuse optical tomographic images. (C) 2014 Optical Society of America
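
A minimal, generic IRLS sketch in Python for l_p-regularized least squares (a standard formulation, not the paper's DOT-specific implementation; the smoothing constant eps and the demo problem are illustrative).

    import numpy as np

    def irls_lp(A, b, p=0.5, lam=1e-2, eps=1e-8, iters=50):
        # min ||Ax - b||^2 + lam * ||x||_p^p, 0 < p <= 1. Each iteration replaces
        # the nonconvex penalty with a weighted l_2 term using weights
        # w_i = (x_i^2 + eps)^(p/2 - 1) and solves the resulting linear system.
        x = np.linalg.lstsq(A, b, rcond=None)[0]
        for _ in range(iters):
            w = (x ** 2 + eps) ** (p / 2.0 - 1.0)
            x = np.linalg.solve(A.T @ A + lam * np.diag(w), A.T @ b)
        return x

    # Hypothetical underdetermined recovery of a 3-sparse vector
    rng = np.random.default_rng(3)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100)
    x_true[[5, 40, 77]] = [1.0, -2.0, 1.5]
    x_hat = irls_lp(A, A @ x_true, p=0.5)
    print(np.round(x_hat[[5, 40, 77]], 2))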

Relevance:

30.00%

Publisher:

Abstract:

The tonic is a fundamental concept in Indian art music. It is the base pitch, which an artist chooses in order to construct the melodies during a rāga rendition, and all accompanying instruments are tuned using the tonic pitch. Consequently, tonic identification is a fundamental task for most computational analyses of Indian art music, such as intonation analysis, melodic motif analysis and rāga recognition. In this paper we review existing approaches for tonic identification in Indian art music and evaluate them on six diverse datasets for a thorough comparison and analysis. We study the performance of each method in different contexts such as the presence/absence of additional metadata, the quality of audio data, the duration of audio data, music tradition (Hindustani/Carnatic) and the gender of the singer (male/female). We show that the approaches that combine multi-pitch analysis with machine learning provide the best performance in most cases (90% identification accuracy on average), and are robust across the aforementioned contexts compared to the approaches based on expert knowledge. In addition, we also show that the performance of the latter can be improved when additional metadata is available to further constrain the problem. Finally, we present a detailed error analysis of each method, providing further insights into the advantages and limitations of the methods.
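
As a rough illustration of the multi-pitch histogram stage common to the best-performing approaches (not any single method from the paper), the Python sketch below folds pitch estimates into one octave, builds a histogram, and returns the most salient peaks as tonic candidates; real systems resolve the octave and choose among candidates with a trained classifier.

    import numpy as np

    def tonic_candidates(pitches_hz, n_candidates=5, ref_hz=55.0):
        # Fold all pitch estimates into one octave (in cents above ref_hz),
        # histogram them, and return the most salient peaks as candidates.
        cents = (1200.0 * np.log2(np.asarray(pitches_hz) / ref_hz)) % 1200.0
        hist, edges = np.histogram(cents, bins=120, range=(0.0, 1200.0))
        peak_bins = np.argsort(hist)[-n_candidates:][::-1]
        return ref_hz * 2.0 ** (edges[peak_bins] / 1200.0)

    # Hypothetical pitch track clustered around a 146.8 Hz tonic and its fifth
    rng = np.random.default_rng(4)
    track = np.concatenate([146.8 * 2 ** (rng.normal(0, 0.01, 500)),
                            220.2 * 2 ** (rng.normal(0, 0.02, 200))])
    print(tonic_candidates(track))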

Relevance:

30.00%

Publisher:

Abstract:

We report a systematic comparative study of highly c-axis-oriented, crystalline piezoelectric ZnO thin films deposited on four different flexible substrates for vibration sensing applications. The flexible substrates employed in the present experimental study were a metal alloy (Phynox), a metal (aluminum), a polyimide (Kapton), and a polyester (Mylar). The ZnO thin films were deposited by an RF reactive magnetron sputtering technique. ZnO thin films of similar thickness, 700 +/- 30 nm, were deposited on the four flexible substrates to allow a proper comparison. The crystallinity, surface morphology, chemical composition, and roughness of the ZnO thin films were evaluated by the respective material characterization techniques. The transverse piezoelectric coefficient (d_31), used to assess the piezoelectric property of the ZnO thin films on the different flexible substrates, was measured by a four-point bending method. ZnO thin films deposited on the Phynox alloy substrate showed relatively better material characterization results and a higher piezoelectric d_31 coefficient than ZnO films on the metal and polymer substrates. To verify these observations experimentally, vibration sensing studies were performed. As expected, the ZnO thin film deposited on the Phynox alloy substrate showed the best vibration sensing performance, generating the highest peak-to-peak output voltage amplitude of 256 mV, compared to aluminum (224 mV), Kapton (144 mV), and Mylar (46 mV). A metal alloy flexible substrate therefore proves to be a more suitable, advantageous, and versatile choice for integrating ZnO thin films than metal and polymer flexible substrates for vibration sensing applications. The present study should help in selecting a suitable flexible substrate for various applications in the field of sensor and actuator technology.

Relevance:

30.00%

Publisher:

Abstract:

Simulated boundary potential data for electrical impedance tomography (EIT) are generated by a MATLAB-based EIT data generator, and the resistivity reconstruction is evaluated with the Electrical Impedance Tomography and Diffuse Optical Tomography Reconstruction Software (EIDORS). Circular domains containing subdomains as inhomogeneities are defined in the MATLAB-based EIT data generator, and the boundary data are calculated by a constant-current simulation with the opposite current injection (OCI) method. The resistivity images reconstructed from the different boundary data sets are analyzed with image parameters to evaluate the reconstruction.
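
For illustration, a tiny Python sketch of the electrode pairing that an opposite current injection protocol implies (the 16-electrode count is an assumption; the data generator in the paper is MATLAB-based).

    def opposite_current_pairs(n_electrodes=16):
        # OCI: current is driven between diametrically opposite electrodes
        # while the remaining electrodes record boundary potentials; one
        # current projection per electrode pair.
        half = n_electrodes // 2
        return [(i, i + half) for i in range(half)]

    print(opposite_current_pairs())  # [(0, 8), (1, 9), ..., (7, 15)]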

Relevance:

30.00%

Publisher:

Abstract:

In the domain of manual mechanical assembly, expert knowledge is an important means of supporting assembly planning and leads to fewer issues during actual assembly. Knowledge-based systems can be used to provide assembly planners with expert knowledge as advice. However, acquisition of knowledge remains a difficult task to automate, while manual acquisition is tedious, time-consuming, and requires the engagement of knowledge engineers with specialist knowledge to understand and translate expert knowledge. This paper describes the development, implementation, and preliminary evaluation of a method that asks an expert a series of questions so as to automatically acquire the necessary diagnostic and remedial knowledge, as rules, for use in a knowledge-based system that advises assembly planners in diagnosing and resolving issues. The method, called a questioning procedure, organizes its questions around an assembly situation, which it presents to the expert as the context, and adapts its questions based on the answers it receives from the expert. (C) 2014 Elsevier Ltd. All rights reserved.
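
A minimal Python sketch of what such a questioning loop could look like (purely illustrative: the rule schema, the prompts, and the ask callback are hypothetical, and the actual procedure adapts its question sequence far more richly).

    def questioning_procedure(situation, ask):
        # Present an assembly situation as context, ask the expert for issues,
        # their observable symptoms, and remedies, and emit IF-THEN rules for
        # the knowledge base. `ask` is any prompt function, e.g. input().
        rules = []
        while ask(f"In situation '{situation}', is there another issue? (y/n) ") == "y":
            issue = ask("Describe the issue: ")
            symptom = ask(f"What observation indicates '{issue}'? ")
            remedy = ask(f"How should a planner resolve '{issue}'? ")
            rules.append({"if": {"situation": situation, "symptom": symptom},
                          "then": {"diagnosis": issue, "remedy": remedy}})
        return rules

    # Usage: rules = questioning_procedure("insert shaft into housing", input)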

Relevance:

30.00%

Publisher:

Abstract:

Several statistical downscaling models have been developed in the past couple of decades to assess the hydrologic impacts of climate change by projecting station-scale hydrological variables from large-scale atmospheric variables simulated by general circulation models (GCMs). This paper presents and compares different statistical downscaling models that use multiple linear regression (MLR), positive coefficient regression (PCR), stepwise regression (SR), and support vector machine (SVM) techniques for estimating monthly rainfall amounts in the state of Florida. Mean sea level pressure, air temperature, geopotential height, specific humidity, U wind, and V wind are used as the explanatory variables/predictors in the downscaling models. Data for these variables are obtained from the National Centers for Environmental Prediction-National Center for Atmospheric Research (NCEP-NCAR) reanalysis dataset and the Canadian Centre for Climate Modelling and Analysis (CCCma) Coupled Global Climate Model, version 3 (CGCM3) GCM simulations. Principal component analysis (PCA) and the fuzzy c-means clustering method (FCM) are used as part of the downscaling models to reduce the dimensionality of the dataset and to identify clusters in the data, respectively. Evaluation of the performance of the models using different error and statistical measures indicates that the SVM-based model performed better than all the other models in reproducing most monthly rainfall statistics at 18 sites. Output from the third-generation CGCM3 GCM for the A1B scenario was used for future projections. For the projection period 2001-10, MLR was used to relate variables at the GCM and NCEP grid scales. Use of MLR in linking the predictor variables at the GCM and NCEP grid scales yielded better reproduction of monthly rainfall statistics at most of the stations (12 out of 18) compared to the spatial interpolation technique used in earlier studies.
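
As a sketch of one such downscaling model (an illustration, not the paper's exact configuration), the scikit-learn pipeline below standardizes the predictors, reduces dimensionality with PCA, and regresses monthly rainfall with support vector regression; the data are synthetic, and the component count and SVR settings are assumptions.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    # Stand-in predictor matrix: monthly reanalysis variables (MSLP, air
    # temperature, geopotential height, specific humidity, U and V wind)
    # stacked over grid points; target: observed monthly rainfall at a site.
    rng = np.random.default_rng(5)
    X = rng.standard_normal((360, 60))              # 30 years x 12 months
    y = np.maximum(0.0, 100.0 + 20.0 * X[:, :5].sum(axis=1) + rng.normal(0, 10, 360))

    model = make_pipeline(StandardScaler(), PCA(n_components=10), SVR(C=10.0))
    model.fit(X[:300], y[:300])
    rmse = np.sqrt(np.mean((model.predict(X[300:]) - y[300:]) ** 2))
    print(f"hold-out RMSE: {rmse:.1f} mm")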

Relevance:

30.00%

Publisher:

Abstract:

Optical emission from emitters strongly interacting among themselves and also with other polarizable matter in close proximity has been approximated by emission from independent emitters. This is primarily due to our inability to evaluate the self-energy matrices and radiative properties of the collective eigenstates of emitters in heterogeneous ensembles. A method to evaluate self-energy matrices that is not limited by the geometry and material composition is presented to understand and exploit such collective excitations. Numerical evaluations using this method are used to highlight the significant differences between the independent and the collective modes of emission in nanoscale heterostructures. A set of N Lorentz emitters and other polarizable entities is used to represent the coupled system of a generalized geometry in a volume integral approach. Closed-form relations between the Green tensors of entity pairs in free space and their correspondents in a heterostructure are derived concisely. This is made possible for general geometries because the global matrices consisting of all free-space Green dyads are subject to conservation laws. The self-energy matrix can then be assembled using the evaluated Green tensors of the heterostructure, but a decomposition of its components into their radiative and nonradiative decay contributions is nontrivial. The relations to compute the observables of the eigenstates (such as quantum efficiency, power/energy of emission, and radiative and nonradiative decay rates) are presented. A note on the extension of this method to collective excitations that also include strong interactions with a surface in the near field is added. (C) 2014 Optical Society of America
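
To convey the idea of collective modes in code (a scalar toy, not the paper's volume-integral method with full dyadic Green tensors), the Python sketch below builds a coupled-emitter interaction matrix from free-space scalar Green functions and eigendecomposes it; the positions, wavelength, and coupling scale are hypothetical.

    import numpy as np

    def collective_modes(positions, k, alpha):
        # Off-diagonal entries couple emitter pairs via the free-space scalar
        # Green function exp(ikr)/(4*pi*r); eigenvectors are collective modes,
        # and the eigenvalues shift the emitters' decay rates away from the
        # independent-emitter values.
        n = len(positions)
        M = np.eye(n, dtype=complex)
        for i in range(n):
            for j in range(n):
                if i != j:
                    r = np.linalg.norm(positions[i] - positions[j])
                    M[i, j] = alpha * np.exp(1j * k * r) / (4.0 * np.pi * r)
        return np.linalg.eig(M)

    pos = np.array([[0.0, 0, 0], [50e-9, 0, 0], [100e-9, 0, 0]])   # 3 emitters
    evals, evecs = collective_modes(pos, k=2 * np.pi / 500e-9, alpha=1e-7)
    print(np.round(evals, 3))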

Relevance:

30.00%

Publisher:

Abstract:

In this study, we applied the integration methodology developed in the companion paper by Aires (2014) to real satellite observations over the Mississippi Basin. The methodology provides basin-scale estimates of the four water budget components (precipitation P, evapotranspiration E, water storage change Delta S, and runoff R) in a two-step process: a Simple Weighting (SW) integration and a Postprocessing Filtering (PF) that imposes closure of the water budget. A comparison with in situ observations of P and E demonstrated that PF improves the estimation of both components. A Closure Correction Model (CCM) was derived from the integrated product (SW+PF) that allows each observation data set to be corrected independently, unlike the SW+PF method, which requires simultaneous estimates of the four components. The CCM makes it possible to standardize the various data sets for each component and greatly decreases the budget residual (P - E - Delta S - R). As a direct application, the CCM was combined with the water budget equation to reconstruct missing values in any component. Results of a Monte Carlo experiment with synthetic gaps demonstrated the good performance of the method, except for the runoff data, whose variability is of the same order of magnitude as the budget residual. Similarly, we propose a reconstruction of Delta S between 1990 and 2002, a period for which no Gravity Recovery and Climate Experiment data are available. Unlike most studies dealing with water budget closure at the basin scale, only satellite observations and in situ runoff measurements are used. Consequently, the integrated data sets are model independent and can be used for model calibration or validation.
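
A compact Python sketch of the two steps as described above (the inverse-variance weighting for SW and the uncertainty-weighted closure adjustment for PF are standard formulations consistent with the abstract, not the authors' code; the numbers are invented).

    import numpy as np

    def sw_pf(estimates, variances):
        # Step 1 (SW): merge each component's independent satellite estimates
        # by inverse-variance weighting. Step 2 (PF): minimally adjust the
        # merged values, weighted by their uncertainties, so that the budget
        # closes: P - E - dS - R = 0.
        comps = ["P", "E", "dS", "R"]
        x, s = [], []
        for c in comps:
            w = 1.0 / np.asarray(variances[c])
            x.append(np.sum(w * np.asarray(estimates[c])) / w.sum())
            s.append(1.0 / w.sum())                 # variance of the SW merge
        x, S = np.array(x), np.diag(s)
        g = np.array([1.0, -1.0, -1.0, -1.0])       # closure constraint g.x = 0
        x_pf = x - S @ g * (g @ x) / (g @ S @ g)
        return dict(zip(comps, x_pf))

    # Hypothetical monthly estimates (mm) and their error variances
    est = {"P": [90.0, 95.0], "E": [60.0], "dS": [10.0, 12.0], "R": [25.0]}
    var = {"P": [25.0, 36.0], "E": [16.0], "dS": [9.0, 9.0], "R": [4.0]}
    print(sw_pf(est, var))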

Relevance:

30.00%

Publisher:

Abstract:

The present immunodiagnostic methods for trypanosomosis, which use soluble antigens from whole cell lysate (WCL), have certain inherent problems, such as a lack of standardized and reproducible antigens as well as ethical issues due to in vivo production, that could be alleviated by in vitro production. In the present study we identified heat shock protein 70 (HSP70) in the T. evansi proteome. The nucleotide sequence of T. evansi HSP70 was 2116 bp long, encoding 690 amino acid residues. Phylogenetic analysis of T. evansi HSP70 showed that T. evansi falls within the Trypanosoma clade and is most closely related to T. brucei brucei and T. brucei gambiense, whereas T. congolense HSP70 lay in a separate clade. Two partial HSP70 sequences (HSP-1 from the N-terminal region and HSP-2 from the C-terminal region) were expressed and evaluated as diagnostic antigens using experimentally infected equine serum samples. Both recombinant proteins detected antibody in immunoblots using serum samples from donkeys experimentally infected with T. evansi. Recombinant HSP-2 showed an antibody response comparable to the WCL antigen in immunoblot and ELISA. These initial results indicate that HSP70 has the potential to detect T. evansi infection and needs further validation on a large set of equine serum samples.