885 results for fuzzy based evaluation method


Relevance:

40.00%

Publisher:

Abstract:

Hyperspectral imaging has become one of the main topics in remote sensing. Hyperspectral sensors acquire hundreds of spectral bands at different (almost contiguous) wavelength channels over the same area, generating large data volumes of several GBs per flight. This high spectral resolution can be used for object detection and for discriminating between different objects based on their spectral characteristics. One of the main problems in hyperspectral analysis is the presence of mixed pixels, which arise when the spatial resolution of the sensor is not able to separate spectrally distinct materials. Spectral unmixing is therefore one of the most important tasks in hyperspectral data exploitation. However, unmixing algorithms can be computationally very expensive and power consuming, which compromises their use in applications under onboard constraints. In recent years, graphics processing units (GPUs) have evolved into highly parallel and programmable systems. Several hyperspectral imaging algorithms have been shown to benefit from this hardware, taking advantage of the extremely high floating-point performance, compact size, huge memory bandwidth, and relatively low cost of these units, which make them appealing for onboard data processing. In this paper, we propose a parallel GPU implementation, using CUDA, of an augmented Lagrangian based method for unsupervised hyperspectral linear unmixing. The method, called simplex identification via split augmented Lagrangian (SISAL), identifies the endmembers of a scene, i.e., it is able to unmix hyperspectral data sets in which the pure pixel assumption is violated. The efficient implementation of SISAL presented in this work exploits the GPU architecture at a low level, using shared memory and coalesced memory accesses.
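As a minimal illustration of the linear mixing model that SISAL operates under (not the SISAL algorithm itself, which solves a nonconvex problem via a split augmented Lagrangian), the sketch below estimates abundances by ordinary least squares once the endmember signatures are known; the matrix values are synthetic:

```python
import numpy as np

def unmix_abundances(Y, M):
    # Least-squares abundance estimation under the linear mixing
    # model Y ≈ M @ A, given the endmember signatures M (e.g. the
    # ones SISAL identifies). Y: (bands, pixels), M: (bands, p).
    A, *_ = np.linalg.lstsq(M, Y, rcond=None)
    return A

# Synthetic check: 4 bands, 2 endmembers, one pixel mixing them 60/40.
M = np.array([[1.0, 0.0],
              [0.8, 0.2],
              [0.2, 0.8],
              [0.0, 1.0]])
a_true = np.array([[0.6], [0.4]])
y = M @ a_true
A_hat = unmix_abundances(y, M)
```

A full pipeline would add nonnegativity and sum-to-one constraints on the abundances; the unconstrained solve above is only meant to show the model structure.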


Performance evaluation plays an increasingly important role in any organizational environment. In the transport sector, drivers are the face of the company, so it is important to develop and increase their performance and their commitment to the company's goals. Such an evaluation can be used to motivate drivers to improve their performance and to uncover training needs. This work aims to create a performance appraisal model for drivers based on a multi-criteria decision aid methodology. The MMASSI (Multicriteria Methodology to Support Selection of Information Systems) methodology was adapted using a template that supports the evaluation according to the freight transportation company under study. The evaluation process involved all drivers (the collaborators being evaluated), their supervisors, and the company management. The final output is a ranking of the drivers, based on their performance, for each of the scenarios used.


Arguably, the most difficult task in text classification is choosing an appropriate set of features that allows machine learning algorithms to provide accurate classification. Most state-of-the-art techniques for this task involve careful feature engineering and a pre-processing stage, which may be too expensive in the emerging context of massive collections of electronic texts. In this paper, we propose efficient methods for text classification based on information-theoretic dissimilarity measures, which are used to define dissimilarity-based representations. These methods dispense with any feature design or engineering by mapping texts into a feature space using universal dissimilarity measures; in this space, classical classifiers (e.g., nearest neighbor or support vector machines) can then be used. The reported experimental evaluation of the proposed methods, on sentiment polarity analysis and authorship attribution problems, reveals that they approximate, and sometimes even outperform, previous state-of-the-art techniques, while being much simpler in the sense that they do not require any text pre-processing or feature engineering.
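One widely used universal, information-theoretic dissimilarity is the normalized compression distance (NCD); the sketch below pairs it with a 1-nearest-neighbor classifier operating on raw bytes. The compressor choice (zlib) and the tiny training set are illustrative assumptions, not the paper's exact setup:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: an off-the-shelf approximation
    # of information-theoretic dissimilarity; no features needed.
    cx, cy, cxy = (len(zlib.compress(s)) for s in (x, y, x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

def nearest_neighbor_label(text, labeled):
    # 1-NN classification directly in the dissimilarity space.
    return min(labeled, key=lambda pair: ncd(text, pair[0]))[1]

train = [(b"great film, loved every minute of it", "pos"),
         (b"wonderful acting and a great story", "pos"),
         (b"terrible plot, a complete waste of time", "neg"),
         (b"boring, terrible and far too long", "neg")]
label = nearest_neighbor_label(b"a great story, loved it", train)
```

On realistic corpora a stronger compressor and larger training sets are needed, but the point stands: the raw byte sequences are compared directly, with no tokenization or feature design.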


Load forecasting has gradually become a major field of research in the electricity industry. It is extremely important for the electric sector in a deregulated environment, as it provides useful support to power system management. Accurate power load forecasting models are required for the operation and planning of a utility company, and they have received increasing attention from researchers in this field. Many mathematical methods have been developed for load forecasting. This work aims to develop and implement a method for short-term load forecasting (STLF), based on Holt-Winters exponential smoothing and an artificial neural network (ANN). One of the main contributions of this paper is the application of the Holt-Winters exponential smoothing approach to the forecasting problem; in addition, as an evaluation of past forecasting work, data mining techniques are also applied to short-term load forecasting. Both the ANN and the Holt-Winters exponential smoothing approaches are compared and evaluated.
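A minimal, self-contained version of additive Holt-Winters smoothing (a common textbook formulation; the paper's exact variant and parameter choices are not specified in the abstract) can be sketched as:

```python
def holt_winters_additive(y, season, alpha, beta, gamma, horizon):
    # Additive Holt-Winters: track a level, a trend and one seasonal
    # index per position in the cycle, each updated by exponential
    # smoothing. Requires len(y) >= 2 * season for initialization.
    level = sum(y[:season]) / season
    trend = (sum(y[season:2 * season]) - sum(y[:season])) / season ** 2
    seas = [y[i] - level for i in range(season)]
    for t in range(len(y)):
        s = seas[t % season]
        last_level = level
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        seas[t % season] = gamma * (y[t] - level) + (1 - gamma) * s
    # h-step-ahead forecast: extrapolate trend, reuse seasonal indices.
    return [level + (h + 1) * trend + seas[(len(y) + h) % season]
            for h in range(horizon)]

# Synthetic "load" with a perfectly repeating period-4 pattern.
y = [10.0, 20.0, 30.0, 20.0] * 6
fc = holt_winters_additive(y, season=4, alpha=0.3, beta=0.1, gamma=0.2, horizon=4)
```

For a purely periodic series like this one, the forecast reproduces the seasonal pattern; on real load data the smoothing parameters are typically tuned on a held-out window.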


Power systems have been experiencing huge changes, mainly due to the substantial increase of distributed generation (DG) and to operation in competitive environments. Virtual Power Players (VPP) can aggregate several players, namely a diversity of energy resources, including DG based on several technologies, electric storage systems (ESS) and demand response (DR). Energy resources management gains increasing relevance in this competitive context, which makes the use of DR more interesting and flexible, giving rise to a wide range of new opportunities. This paper proposes a methodology to support VPPs in the management of DR programs, considering all the existing energy resources (generation and storage units) and the distribution network. The proposed method is based on locational marginal price (LMP) values. The evaluation of the impact of specific DR programs on the LMP values supports the manager's decision concerning the use of DR. The proposed method has been computationally implemented, and its application is illustrated in this paper using a 33-bus network with intensive use of DG.


A comparison of the Etest and the reference broth macrodilution susceptibility test for fluconazole, ketoconazole, itraconazole and amphotericin B was performed with 59 isolates of Candida species obtained from the oral cavities of AIDS patients. The Etest was performed according to the manufacturer's instructions, and the reference method according to the National Committee for Clinical Laboratory Standards document M27-A guidelines. Our data showed a good correlation between the MICs obtained by the Etest and broth dilution methods. When only MIC results within ±2 dilutions for both methods were considered, the agreement rates for the C. albicans isolates tested were 90.4% for itraconazole, ketoconazole and amphotericin B, and 84.6% for fluconazole. In contrast to the reference method, the Etest classified as susceptible three fluconazole-resistant isolates and one itraconazole-resistant isolate, representing four very major errors. These results indicate that the Etest can be considered useful for the antifungal susceptibility evaluation of yeasts in clinical laboratories.
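The "±2 dilutions" agreement criterion can be computed directly on a log2 scale, since MICs follow a two-fold dilution series; the MIC pairs below are hypothetical, not the study's data:

```python
import math

def within_two_dilutions(mic_etest, mic_ref):
    # MICs sit on a two-fold dilution scale, so "within ±2 dilutions"
    # means the log2 values differ by at most 2.
    return abs(math.log2(mic_etest) - math.log2(mic_ref)) <= 2

def agreement_rate(pairs):
    # Percentage of (Etest, reference) MIC pairs that agree.
    return 100.0 * sum(within_two_dilutions(e, r) for e, r in pairs) / len(pairs)

# Hypothetical MIC pairs in µg/mL: (Etest, broth macrodilution).
pairs = [(0.25, 0.5), (1.0, 0.25), (8.0, 0.5), (0.5, 0.5), (2.0, 16.0)]
rate = agreement_rate(pairs)
```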


This study evaluated the whole blood immunochromatographic card test (ICT card test) in a survey performed in Northeastern Brazil. A total of 625 people were examined by thick blood film (TBF) and by the ICT card test. Residents of a non-endemic area were also tested by the whole blood card test and by Og4C3. The sensitivity of the ICT card test was 94.7% overall, but lower in females than in males, based on the reasonable assumption that TBF is 100% specific. However, since TBF and other methods have unknown sensitivity, the true specificity of the card test is unknown. Nevertheless, it is possible to estimate upper and lower limits for the specificity and to relate them to the prevalence of the disease. In the endemic area, the possible range of the specificity was 72.4% to 100%. In the non-endemic area, 29.6% of the card tests exhibited faint lines that were interpreted as positives. Characteristics of the method, including high sensitivity, promptness and simplicity, justify its use for the screening of filariasis. However, detailed information about the correct interpretation of extremely faint lines is essential. Further studies designed to consider the problems arising from imperfect standards are necessary, as is a sounder diagnostic definition for the card test.
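The bounding argument for specificity can be made concrete in a few lines of arithmetic: among reference-negative subjects, every card-positive result is, in the worst case, a false positive and, in the best case, a true infection the reference missed. The counts below are hypothetical, chosen only to reproduce the 72.4% lower limit quoted above:

```python
def specificity_bounds(tbf_neg_total, card_pos_among_tbf_neg):
    # Among reference-negative subjects, each card-positive is either
    # a false positive (worst case for specificity) or an infection
    # the imperfect reference missed (best case), so the true
    # specificity lies between these two extremes.
    lower = (tbf_neg_total - card_pos_among_tbf_neg) / tbf_neg_total
    return lower, 1.0

# Hypothetical counts: 500 TBF-negative subjects, 138 card-positive.
lo, hi = specificity_bounds(tbf_neg_total=500, card_pos_among_tbf_neg=138)
```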


The Internet of Things (IoT) has emerged as a paradigm over the last few years as a result of the tight integration of the computing and the physical worlds. The requirement of remote sensing makes low-power wireless sensor networks one of the key enabling technologies of IoT. These networks face several challenges, especially in communication and networking, due to their inherent constraints: low-power operation, deployment in harsh and lossy environments, and limited computing and storage resources. The IPv6 Routing Protocol for Low-Power and Lossy Networks (RPL) [1] was proposed by the IETF ROLL (Routing Over Low-power and Lossy links) working group and has been adopted as an IETF standard in RFC 6550 since March 2012. Although RPL largely satisfies the requirements of low-power and lossy sensor networks, several issues remain open for improvement and specification, in particular with respect to Quality of Service (QoS) guarantees and support for mobility. In this paper, we focus mainly on the RPL routing protocol and propose enhancements to the standard specification in order to provide QoS guarantees for static as well as mobile LLNs. For this purpose, we propose OF-FL (Objective Function based on Fuzzy Logic), a new objective function that overcomes the limitations of the objective functions standardized for RPL by considering important link and node metrics, namely end-to-end delay, number of hops, ETX (expected transmission count) and LQL (link quality level). In addition, we present the design of Co-RPL, an extension to RPL based on the corona mechanism that supports mobility, in order to overcome the problem of slow reactivity to frequent topology changes and thus provide a better quality of service, mainly in dynamic network applications.
Performance evaluation results show that both OF-FL and Co-RPL yield a great improvement over the standard specification, mainly in terms of packet loss ratio and average network latency.
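To make the fuzzy-logic idea behind OF-FL concrete, the sketch below maps each routing metric to a [0, 1] "good link" membership and combines them with the minimum t-norm (fuzzy AND). The membership shapes and thresholds are illustrative assumptions, not the OF-FL specification:

```python
def falling_membership(x, good, bad):
    # Degree to which a "smaller is better" metric (delay, hops, ETX)
    # counts as good: 1 at or below `good`, 0 at or above `bad`,
    # linear in between.
    if x <= good:
        return 1.0
    if x >= bad:
        return 0.0
    return (bad - x) / (bad - good)

def route_quality(delay_ms, hops, etx, lql):
    # Fuzzy AND (minimum t-norm) over the per-metric memberships;
    # LQL is assumed to be already normalized to [0, 1].
    return min(falling_membership(delay_ms, 20, 200),
               falling_membership(hops, 2, 10),
               falling_membership(etx, 1.0, 5.0),
               lql)

q = route_quality(delay_ms=50, hops=4, etx=2.0, lql=0.9)
```

A parent-selection rule would then prefer the candidate parent with the highest quality score; real fuzzy objective functions typically also defuzzify a full rule base rather than a single t-norm.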


Optimization methods have been used in many areas of knowledge, such as engineering, statistics and chemistry, among others, to solve optimization problems. In many cases it is not possible to use derivative-based methods, due to the characteristics of the problem and/or its constraints, for example when the functions involved are non-smooth and/or their derivatives are not known. To solve this type of problem, a Java-based API has been implemented that includes only derivative-free optimization methods and can be used to solve both constrained and unconstrained problems. For solving constrained problems, the classic penalty and barrier functions were included in the API. In this paper a new approach to penalty and barrier functions, based on fuzzy logic, is proposed. Two penalty functions that impose a progressive penalization on solutions violating the constraints are discussed. The implemented functions impose a low penalization when the violation of the constraints is small and a heavy penalty when the violation is large. Numerical results obtained on twenty-eight test problems, comparing the proposed fuzzy-logic-based functions to six classic penalty and barrier functions, are presented. Considering the results achieved, it can be concluded that the proposed penalty functions are not only very robust but also perform very well.
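A progressive penalty in the spirit described above (almost no cost for small violations, a heavy cost for large ones) can be sketched as follows; the threshold, weight and quadratic-style ramp are illustrative assumptions, not the paper's two functions:

```python
def constraint_violation(x, constraints):
    # Total violation of inequality constraints written as g_i(x) <= 0.
    return sum(max(0.0, g(x)) for g in constraints)

def fuzzy_penalty(violation, severe=1.0, weight=1e3):
    # Progressive penalty: the "degree of infeasibility" mu grows
    # linearly with the violation, so the cost ramps up smoothly
    # (quadratic-like) for small violations and becomes heavy
    # (linear, at full weight) once the violation reaches `severe`.
    if violation <= 0.0:
        return 0.0
    mu = min(violation / severe, 1.0)
    return weight * mu * violation

def penalized(f, x, constraints):
    # Penalized objective for a minimization problem.
    return f(x) + fuzzy_penalty(constraint_violation(x, constraints))

f = lambda x: x * x
constraints = [lambda x: 1.0 - x]              # encodes x >= 1
p_feasible = penalized(f, 1.2, constraints)    # no violation
p_violating = penalized(f, 0.5, constraints)   # violation of 0.5
```

Any derivative-free search (e.g. a pattern search) can then minimize `penalized` directly, which is the point of pairing penalty functions with derivative-free methods.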


Introduction & Objectives: Several factors may influence the decision to pursue non-surgical modalities for the treatment of non-melanoma skin cancer. Topical photodynamic therapy (PDT) is a non-invasive alternative treatment reported to have high efficacy when standardized protocols are used in Bowen's disease (BD), superficial basal cell carcinoma (BCC) and thin nodular BCC. However, long-term recurrence studies are lacking. The aim of this study was to evaluate the long-term efficacy of PDT with topical methylaminolevulinate (MAL) for the treatment of BD and BCC in a dermato-oncology department. Materials & Methods: All patients diagnosed with BD or BCC and treated with MAL-PDT from 2004 to 2008 were enrolled. The treatment protocol comprised two MAL-PDT sessions one week apart, repeated at three months in case of an incomplete response, using a red light dose of 37-40 J/cm2 and an exposure time of 8'20''. Clinical records were retrospectively reviewed, and data regarding age, sex, tumour location, size, treatment outcome and recurrence were registered. Descriptive analysis was performed using chi-square tests, followed by survival analysis with the Kaplan-Meier and Cox regression models. Results: Sixty-eight patients (median age 71.0 years, P25;P75=30;92) with a total of 78 tumours (31 BD, 45 superficial BCC, 2 nodular BCC) and a median tumour size of 5 cm2 were treated. Overall, the median follow-up period was 43.5 months (P25;P75=0;100), and a total recurrence rate of 33.8% was observed (24.4% for BCC vs. 45.2% for BD). Estimated recurrence rates for BCC and BD were 5.0% vs. 7.4% at 6 months, 23.4% vs. 27.9% at 12 months, and 30.0% vs. 72.4% at 60 months. Both age and diagnosis were independent prognostic factors for recurrence, with significantly higher estimated recurrence rates in patients with BD (p=0.0036) or younger than 58 years (p=0.039).
The risk of recurrence (hazard ratio) was 2.4 times higher in patients with BD than in those with superficial BCC (95% CI: 1.1-5.3; p=0.033), and 2.8 times higher in patients younger than 58 years (95% CI: 1.2-6.5; p=0.02). Conclusions: In the studied population, the estimated recurrence rates are higher than those expected from the available literature, possibly due to the longer follow-up period. To the authors' knowledge, there is only one other study with a similar follow-up period, and it concerns BCC only. BD, as an in situ squamous cell carcinoma, has a higher tendency to recur than superficial BCC. Despite the better cosmesis, PDT might not be the best treatment option for young patients, considering their higher risk of recurrence.


Currently there are several methods to extract bacterial DNA based on different principles. However, the amount and quality of the DNA obtained by each of these methods is highly variable and microorganism dependent, as illustrated by coagulase-negative staphylococci (CoNS), which have a thick cell wall that is difficult to lyse. This study was designed to compare the quality and amount of CoNS DNA extracted by four different techniques: two in-house protocols and two commercial kits. DNA amount and quality were determined by spectrophotometry. The extracted DNA was also analyzed using agarose gel electrophoresis and PCR. A total of 267 CoNS isolates were used in this study. The column method and thermal lysis showed the best results with regard to DNA quality (mean A260/280 ratio = 1.95) and average DNA concentration, respectively. All four methods tested provided DNA suitable for PCR amplification, but with different yields. DNA quality is important since it allows the application of a large number of molecular biology techniques, as well as storage for a longer period of time. In this sense, the column-based extraction method presented the best results for CoNS.


Enterococci are increasingly responsible for nosocomial infections worldwide. This study was undertaken to compare the identification and susceptibility profiles of Enterococcus spp. obtained with an automated MicroScan system, a PCR-based assay and the disk diffusion assay. We evaluated 30 clinical isolates of Enterococcus spp. Isolates were identified by the MicroScan system and by the PCR-based assay. The presence of antibiotic resistance genes (vancomycin, gentamicin, tetracycline and erythromycin) was also determined by PCR. Antimicrobial susceptibilities to vancomycin (30 µg), gentamicin (120 µg), tetracycline (30 µg) and erythromycin (15 µg) were tested by the automated system and by the disk diffusion method, and were interpreted according to the criteria recommended in the CLSI guidelines. Concerning Enterococcus identification, the overall agreement between the PCR method and the automated system was 90.0% (27/30). For all isolates of E. faecium and E. faecalis we observed 100% agreement. Resistance frequencies were higher in E. faecium than in E. faecalis. The resistance rates obtained were highest for erythromycin (86.7%), followed by vancomycin (80.0%), tetracycline (43.3%) and gentamicin (33.3%). The correlation between disk diffusion and the automated system revealed agreement for the majority of the antibiotics, with category agreement rates above 80%. In the PCR-based assay, the vanA gene was detected in 100% of vancomycin-resistant enterococci. This assay is simple to conduct and reliable for the identification of clinically relevant enterococci. The data obtained reinforce the need for improvement of the automated system to identify some enterococci.


Dissertation submitted in partial fulfillment of the requirements for the Degree of Master of Science in Geospatial Technologies.
