986 results for supplier selection criterion
Abstract:
The objectives of this study were to evaluate the possibility of selecting anthracnose-resistant common bean plants using detached primary leaves in the partially controlled environment of a greenhouse and to identify differences in the reaction of genotypes to anthracnose. The common bean cultivars Ouro Negro, Ouro Vermelho, Manteigão Fosco 11, Rudá, Rudá-R, VP8, BRSMG Madrepérola, Pérola, Meia Noite and BRSMG Talismã were characterized for resistance to races 65, 81 and 453 of Colletotrichum lindemuthianum, and the detached primary leaf method was compared to the traditional method of inoculating plants at phenological stage V2. The lines Rudá, Rudá-R and Pérola were inoculated with races 65 and 453 of C. lindemuthianum to assess the rate of coincidence of anthracnose severity between the two inoculation methods. In general, the two methods gave similar results for the reaction of the cultivars. The use of detached primary leaves of common bean plants in the partially controlled environment was feasible for the selection of plants resistant to anthracnose and has the advantages of requiring little infrastructure and of saving resources, space and time.
Abstract:
The silvopastoral system is a viable technological alternative to extensive cattle grazing; however, for it to be successful, forage grass genotypes adapted to reduced light need to be identified. The objective of this study was to select progenies of Panicum maximum tolerant to low light conditions for use in breeding programs and to study the genetic control and performance of some traits associated with shade tolerance. Six full-sib progenies were evaluated in pots under full sun and under 50% and 70% light reduction, and subjected to cuttings. Progeny genotypic values (GV) increased with light reduction for plant height (H) and specific leaf area (SLA). Total dry mass accumulation (DM) and leaf dry mass accumulation (LDM) had higher GV under 50% shade and intermediate GV under 70% shade. The GV of tiller number (TIL) and root dry mass accumulation (RDM) decreased with light reduction. The highest positive correlations were obtained for the traits H and RDM with SLA and DM; the highest negative correlations were between TIL and both SLA and RDM, and between H and LDM. The progenies showed higher tolerance to 50% light reduction and, among them, two stood out and will be used in breeding programs. It was also found that it is not necessary to evaluate some traits under all light conditions. All traits had high broad-sense heritability and high genotypic correlation between progenies at all light intensities. There is genetic difference among the progenies in their response to different light intensities, which will allow selection for shade tolerance.
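The heritability mentioned above can be computed from variance components. Below is a minimal sketch, with made-up variance components rather than values from the study, of how broad-sense heritability on a progeny-mean basis is typically obtained.

```python
# Minimal sketch with hypothetical numbers (not data from the study): broad-sense
# heritability on a progeny-mean basis from ANOVA variance components, assuming
# progenies evaluated with r replicates.
def broad_sense_heritability(sigma2_g, sigma2_e, r):
    """H2 = genotypic variance / phenotypic variance of progeny means."""
    return sigma2_g / (sigma2_g + sigma2_e / r)

# Example with invented values for one trait under 50% shade.
print(round(broad_sense_heritability(sigma2_g=12.4, sigma2_e=6.1, r=4), 2))  # 0.89
```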
Abstract:
Motion compensated frame interpolation (MCFI) is one of the most efficient solutions to generate side information (SI) in the context of distributed video coding. However, it creates SI with rather significant motion compensated errors for some frame regions, while the errors are rather small for other regions, depending on the video content. In this paper, a low complexity Intra mode selection algorithm is proposed to select the most 'critical' blocks in the WZ frame and to provide the decoder with some reliable data for those blocks. For each block, the novel coding mode selection algorithm estimates the encoding rate for the Intra-based and WZ coding modes and determines the best coding mode while maintaining a low encoder complexity. The proposed solution is evaluated in terms of rate-distortion performance, with improvements of up to 1.2 dB over a WZ-coding-mode-only solution.
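As an illustration of the kind of per-block decision described above, the sketch below compares rough rate estimates for the two coding modes; the variance-based estimators and constants are assumptions for illustration, not the algorithm proposed in the paper.

```python
# Illustrative sketch only (not the paper's algorithm): choose, per block, the
# coding mode with the lower estimated rate; the variance-based estimators below
# are stand-ins for the real Intra and WZ rate estimation.
import statistics

def select_mode(blocks, estimate_intra_rate, estimate_wz_rate):
    """Return one 'intra'/'wz' decision per block."""
    return ['intra' if estimate_intra_rate(b) < estimate_wz_rate(b) else 'wz'
            for b in blocks]

blocks = [[10, 12, 11, 13],      # well-predicted region (small residual)
          [0, 200, 5, 190]]      # 'critical' region (large residual)
modes = select_mode(
    blocks,
    estimate_intra_rate=lambda b: 50 + 0.1 * statistics.pvariance(b),
    estimate_wz_rate=lambda b: 5 + 1.5 * statistics.pvariance(b),
)
print(modes)  # ['wz', 'intra']: the high-error block is coded in Intra mode
```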
Abstract:
INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code from the International Classification of Diseases are important issues for achieving accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer software programs to automatically identify the underlying cause of death. OBJECTIVE: This work was conceived to compare the underlying causes of death processed, respectively, by the Automated Classification of Medical Entities (ACME) and the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparative evaluation of the underlying causes of death processed by the ACME and SCB systems was performed using the input data file for the ACME system, which included deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 records of the corresponding death certificates. The differences between the underlying causes selected by the ACME and SCB systems in the month of June, when regarded as SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: The processing of the underlying causes of death by the ACME and SCB systems resulted in 3,278 differences, which were analysed and ascribed to unanswered dialogue boxes during processing, to deaths due to human immunodeficiency virus [HIV] disease, for which there was no specific provision in either system, to coding and/or keying errors, and to actual problems. The detailed analysis of the latter disclosed that the majority of the underlying causes of death processed by the SCB system were correct, that different interpretations of the mortality coding rules were adopted by each system, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were identified as SCB errors. CONCLUSION: These results, disclosing a very low and insignificant number of actual problems, warrant the use of this version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.
Abstract:
Reclaimed water from small wastewater treatment facilities in the rural areas of the Beira Interior region (Portugal) may constitute an alternative water source for aquifer recharge. A 21-month monitoring period in a constructed wetland treatment system showed that 21,500 m³ year⁻¹ of treated wastewater (reclaimed water) could be used for aquifer recharge. A GIS-based multi-criteria analysis was performed, combining ten thematic maps and economic, environmental and technical criteria, in order to produce a suitability map for the location of sites for reclaimed water infiltration. The areas chosen for aquifer recharge with infiltration basins are mainly composed of anthrosols more than 1 m deep with a fine sand texture, which allows an average infiltration velocity of up to 1 m d⁻¹. These characteristics will provide a final polishing treatment of the reclaimed water after infiltration (soil aquifer treatment (SAT)), suitable for the removal of the residual load (trace organics, nutrients, heavy metals and pathogens). The risk of groundwater contamination is low, since the water table in the anthrosol areas ranges from 10 m to 50 m. On the other hand, these depths guarantee an unsaturated zone suitable for SAT. An area of 13,944 ha was selected for study, but only 1,607 ha are suitable for reclaimed water infiltration. Approximately 1,280 m² were considered enough to set up 4 infiltration basins operating in flooding and drying cycles.
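A GIS multi-criteria analysis of this kind usually reduces to a weighted overlay of normalized criterion layers. The sketch below shows that operation in generic form; the layer names, weights and toy grids are invented for illustration and are not the study's thematic maps.

```python
# Generic weighted-overlay sketch (layer names, weights and grids are invented,
# not the study's data): each criterion raster is assumed already normalized to
# [0, 1] on a common grid; the output is a suitability score per cell.
import numpy as np

def suitability(criteria, weights):
    """criteria: dict name -> 2-D array in [0, 1]; weights: dict name -> float."""
    total = sum(weights.values())
    score = np.zeros_like(next(iter(criteria.values())), dtype=float)
    for name, layer in criteria.items():
        score += (weights[name] / total) * layer
    return score

layers = {
    "soil_depth": np.array([[1.0, 0.6], [0.2, 0.9]]),
    "texture": np.array([[0.8, 0.8], [0.4, 1.0]]),
    "depth_to_groundwater": np.array([[1.0, 0.5], [0.3, 0.7]]),
}
weights = {"soil_depth": 0.4, "texture": 0.3, "depth_to_groundwater": 0.3}
print(suitability(layers, weights))
```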
Abstract:
Frame rate upconversion (FRUC) is an important post-processing technique to enhance the visual quality of low frame rate video. A major recent advance in this area is FRUC based on trilateral filtering, whose novelty mainly derives from the combination of an edge-based motion estimation block matching criterion with the trilateral filter. However, there is still room for improvement, notably towards reducing the size of the uncovered regions in the initial estimated frame, that is, the estimated frame before trilateral filtering. In this context, an improved motion estimation block matching criterion is proposed, in which a combined luminance and edge error metric is weighted according to the motion vector components, notably to regularise the motion field. Experimental results confirm that significant improvements are achieved for the final interpolated frames, reaching average PSNR gains of up to 2.73 dB over recent alternative solutions, for video content with varied motion characteristics.
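The criterion described above combines a luminance error, an edge error and a motion-vector-dependent weight. The sketch below writes one plausible form of such a cost; the weighting constants and the exact combination are assumptions, not the formula from the letter.

```python
# Plausible form of a combined block matching cost (assumed, not the letter's
# exact criterion): luminance SAD plus edge-map SAD, penalised by the candidate
# motion vector magnitude to regularise the motion field.
import numpy as np

def matching_cost(cur_blk, ref_blk, cur_edges, ref_edges, mv, alpha=0.5, lam=2.0):
    """Lower is better; mv is the candidate motion vector (dx, dy)."""
    sad_lum = np.abs(cur_blk.astype(float) - ref_blk.astype(float)).sum()
    sad_edge = np.abs(cur_edges.astype(float) - ref_edges.astype(float)).sum()
    mv_penalty = lam * (abs(mv[0]) + abs(mv[1]))   # discourage erratic vectors
    return alpha * sad_lum + (1.0 - alpha) * sad_edge + mv_penalty
```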
Abstract:
Master's dissertation in Sociology, 31 March 2014, Universidade dos Açores.
Abstract:
27th Annual Conference of the European Cetacean Society. Setúbal, Portugal, 8-10 April 2013.
Abstract:
The supplier/partner selection problem is an integral and important part of companies that aim for competitive and profitable performance in their field of activity. Choosing the best supplier/partner most often involves a careful analysis of the factors that can positively or negatively influence that choice. This problem has long been the subject of numerous studies, which focus essentially on the criteria to consider and on the methodologies to adopt to optimise the choice of partners. Among the various studies carried out, many consider product cost, quality, delivery and the supplier's reputation as the key criteria. Even so, many other criteria are mentioned, most of which appear as sub-criteria. Within the scope of this work, five major criteria were identified: Quality, Financial System, Synergies, Cost and Production System. Within these criteria, it was found necessary to include some sub-criteria, so each key criterion comprises five sub-criteria. Once the criteria were identified, it was necessary to understand how they are applied and which models are used to make the most of the available information. Knowing that some models favour mathematical programming and others use linear weighting to identify the best supplier, a survey was carried out and companies were contacted in order to understand which factors carried the most weight in their partner selection decisions. After interpreting the results and processing the data, a linear weighting model was adopted to translate the importance of each factor. The proposed model has a hierarchical structure and can be applied with Saaty's AHP method or the Value Analysis method. This model makes it possible to choose the alternative or alternatives that best fit the companies' requirements.
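A linear weighting model of the kind adopted in this work scores each alternative as the weighted sum of its criterion scores and ranks the alternatives. The sketch below illustrates that calculation over the five key criteria; the weights and scores are invented for illustration and the sub-criterion level is omitted.

```python
# Illustrative linear weighted scoring over the five key criteria (weights and
# scores are made up; the dissertation's sub-criteria are not modelled here).
criteria_weights = {"quality": 0.25, "financial": 0.15, "synergies": 0.15,
                    "cost": 0.30, "production": 0.15}

# One score in [0, 10] per criterion and per alternative (invented data).
suppliers = {
    "Supplier A": {"quality": 8, "financial": 6, "synergies": 7, "cost": 5, "production": 9},
    "Supplier B": {"quality": 6, "financial": 8, "synergies": 5, "cost": 9, "production": 6},
}

def weighted_score(scores, weights):
    return sum(weights[c] * scores[c] for c in weights)

ranking = sorted(suppliers,
                 key=lambda s: weighted_score(suppliers[s], criteria_weights),
                 reverse=True)
print(ranking)  # best alternative first
```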
Abstract:
Cluster analysis for categorical data has been an active area of research. A well-known problem in this area is the determination of the number of clusters, which is unknown and must be inferred from the data. In order to estimate the number of clusters, one often resorts to information criteria, such as BIC (Bayesian information criterion), MML (minimum message length, proposed by Wallace and Boulton, 1968), and ICL (integrated classification likelihood). In this work, we adopt the approach developed by Figueiredo and Jain (2002) for clustering continuous data. They use an MML criterion to select the number of clusters and a variant of the EM algorithm to estimate the model parameters. This EM variant seamlessly integrates model estimation and selection in a single algorithm. For clustering categorical data, we assume a finite mixture of multinomial distributions and implement a new EM algorithm, following a previous version (Silvestre et al., 2008). Results obtained with synthetic datasets are encouraging. The main advantage of the proposed approach, when compared to the above referred criteria, is the speed of execution, which is especially relevant when dealing with large data sets.
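For context, the sketch below outlines the structure of an EM pass for a finite mixture of multinomials; the crude pruning step only hints at the MML-driven component annihilation of Figueiredo and Jain (2002), and this is not the authors' implementation.

```python
# Structural sketch (not the authors' code): EM for a finite mixture of
# multinomial/categorical distributions, with a crude pruning step standing in
# for MML-based component annihilation.
import numpy as np

def em_multinomial_mixture(X, k, n_iter=50, prune_tol=1e-3, rng=np.random.default_rng(0)):
    """X: (n, d) one-hot count matrix; k: initial number of components."""
    n, d = X.shape
    weights = np.full(k, 1.0 / k)
    theta = rng.dirichlet(np.ones(d), size=k)           # category probabilities
    for _ in range(n_iter):
        # E-step: responsibilities from component log-likelihoods.
        log_r = np.log(weights) + X @ np.log(theta).T    # (n, k)
        log_r -= log_r.max(axis=1, keepdims=True)
        r = np.exp(log_r)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: update mixing weights and category probabilities.
        weights = r.sum(axis=0) / n
        theta = (r.T @ X) + 1e-9
        theta /= theta.sum(axis=1, keepdims=True)
        # Crude pruning in place of the MML-driven component annihilation.
        keep = weights > prune_tol
        weights, theta = weights[keep] / weights[keep].sum(), theta[keep]
    return weights, theta

# Toy categorical data: 3 categories, two obvious clusters.
X = np.array([[1, 0, 0]] * 20 + [[0, 0, 1]] * 30, dtype=float)
w, t = em_multinomial_mixture(X, k=4)
print(w.round(2))
```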
Abstract:
Resource constraints are becoming a problem as many wireless mobile devices have become increasingly general-purpose. Our work addresses this growing demand for resources and performance by proposing the dynamic selection of neighbor nodes for cooperative service execution. This selection is influenced by the user's quality of service requirements expressed in the request, tailoring the provided service to the user's specific needs. In this paper we improve our proposal's formulation algorithm with the ability to trade off time for the quality of the solution. At any given time, a complete solution for service execution exists, and the quality of that solution is expected to improve over time.
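An anytime formulation of this kind keeps a feasible node selection available at all times and refines it while the time budget lasts. The sketch below shows one such loop; the node model (capacity/cost tuples), the randomized greedy construction and the budget are illustrative assumptions, not the authors' formulation.

```python
# Illustrative anytime selection sketch (hypothetical node model, not the
# authors' algorithm): a feasible neighbor set exists after the first pass and
# the cheapest feasible set found so far is kept until the deadline.
import random
import time

def anytime_select(nodes, required_capacity, budget_s=0.05):
    """nodes: list of (node_id, capacity, cost). Returns the best set found."""
    deadline = time.monotonic() + budget_s
    best, best_cost = None, float("inf")
    while time.monotonic() < deadline:
        candidate, cap, cost = [], 0.0, 0.0
        for node_id, c, k in sorted(nodes, key=lambda _: random.random()):
            if cap >= required_capacity:
                break
            candidate.append(node_id)
            cap, cost = cap + c, cost + k
        if cap >= required_capacity and cost < best_cost:
            best, best_cost = candidate, cost    # keep the cheapest feasible set
    return best, best_cost

nodes = [("n1", 4, 2.0), ("n2", 3, 1.0), ("n3", 5, 3.5), ("n4", 2, 0.5)]
print(anytime_select(nodes, required_capacity=6))
```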
Abstract:
Electrocardiography (ECG) biometrics is emerging as a viable biometric trait. Recent developments at the sensor level have shown the feasibility of performing signal acquisition at the fingers and hand palms, using one-lead sensor technology and dry electrodes. These new locations lead to ECG signals with a lower signal-to-noise ratio that are more prone to noise artifacts; heart rate variability is another major challenge of this biometric trait. In this paper we propose a novel approach to ECG biometrics, with the purpose of reducing the computational complexity and increasing the robustness of the recognition process, enabling the fusion of information across sessions. Our approach is based on clustering, grouping individual heartbeats based on their morphology. We study several methods to perform automatic template selection and to account for variations observed in a person's biometric data. This approach allows the identification of different template groupings that take heart rate variability into account, and the removal of outliers due to noise artifacts. Experimental evaluation on real-world data demonstrates the advantages of our approach.
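The template selection idea can be pictured as clustering aligned heartbeats and keeping one representative per sufficiently populated cluster. The sketch below does this with k-means from scikit-learn; the clustering algorithm, thresholds and toy data are assumptions for illustration, not the paper's pipeline.

```python
# Illustrative template selection sketch (assumed approach and tooling, not the
# paper's pipeline): group heartbeats by morphology with k-means, keep each
# cluster centroid as a template, and drop sparse clusters as likely outliers.
import numpy as np
from sklearn.cluster import KMeans

def select_templates(heartbeats, n_clusters=3, min_fraction=0.1):
    """heartbeats: (n_beats, n_samples) array of aligned heartbeat segments."""
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(heartbeats)
    counts = np.bincount(km.labels_, minlength=n_clusters)
    keep = counts >= min_fraction * len(heartbeats)   # drop sparse clusters
    return km.cluster_centers_[keep]

# Toy data: two morphologies plus one stray noisy beat.
rng = np.random.default_rng(0)
beats = np.vstack([np.tile([0, 1, 0], (20, 1)) + rng.normal(0, 0.05, (20, 3)),
                   np.tile([0, 2, 1], (15, 1)) + rng.normal(0, 0.05, (15, 3)),
                   [[5, 5, 5]]])
print(select_templates(beats).shape)
```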
Abstract:
In research on Silent Speech Interfaces (SSI), different sources of information (modalities) have been combined, aiming at obtaining better performance than with the individual modalities. However, when combining these modalities, the dimensionality of the feature space rapidly increases, yielding the well-known "curse of dimensionality". As a consequence, in order to extract useful information from these data, one has to resort to feature selection (FS) techniques to lower the dimensionality of the learning space. In this paper, we assess the impact of FS techniques on silent speech data, in a dataset with four non-invasive and promising modalities, namely video, depth, ultrasonic Doppler sensing, and surface electromyography. We consider two supervised (mutual information and Fisher's ratio) and two unsupervised (mean-median and arithmetic mean geometric mean) FS filters. The evaluation was made by assessing the classification accuracy (word recognition error) of three well-known classifiers (k-nearest neighbors, support vector machines, and dynamic time warping). The key results of this study show that both unsupervised and supervised FS techniques improve the classification accuracy on both individual and combined modalities. For instance, on the video component, we attain relative gains of 36.2% in error rate. FS is also useful as pre-processing for feature fusion.
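Of the filters listed, Fisher's ratio has a particularly compact form. The sketch below implements it from the usual textbook definition (between-class over within-class variance per feature) and keeps the top-k features; it is written from that standard definition, not from the paper's code.

```python
# Fisher's ratio feature selection filter, written from the standard definition
# (not the paper's implementation): rank features by between-class variance over
# within-class variance and keep the top-k.
import numpy as np

def fisher_ratio(X, y):
    """X: (n, d) features; y: (n,) class labels. Returns one score per feature."""
    classes = np.unique(y)
    overall_mean = X.mean(axis=0)
    between, within = np.zeros(X.shape[1]), np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        between += len(Xc) * (Xc.mean(axis=0) - overall_mean) ** 2
        within += len(Xc) * Xc.var(axis=0)
    return between / (within + 1e-12)

def select_top_k(X, y, k):
    return np.argsort(fisher_ratio(X, y))[::-1][:k]   # indices of the k best features

# Toy usage: feature 0 separates the classes, feature 1 is noise.
rng = np.random.default_rng(0)
X = np.column_stack([np.r_[rng.normal(0, 1, 50), rng.normal(5, 1, 50)],
                     rng.normal(0, 1, 100)])
y = np.r_[np.zeros(50), np.ones(50)]
print(select_top_k(X, y, k=1))   # expected: [0]
```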