903 results for "Evaluation of performance"
Abstract:
This paper presents the evaluation of Morpheme, a sketching interface for the control of sound synthesis. We explain the task that was designed to assess the effectiveness of the interface, detect usability issues and gather participants' responses regarding cognitive, experiential and expressive aspects of the interaction. The evaluation comprises a design task in which participants were asked to design two soundscapes with the Morpheme interface for two pieces of video footage. Responses were gathered using a series of Likert-type and open-ended questions. The analysis of the data revealed a number of usability issues; however, the performance of Morpheme was satisfactory, and participants recognised the creative potential of the interface and of the synthesis methods for sound design applications.
Abstract:
Fuzzy logic admits infinitely many intermediate truth values between false and true. Based on this principle, this work developed a fuzzy rule-based system that indicates the body mass index of ruminant animals, with the goal of determining the best moment for slaughter. The fuzzy system takes the variables mass and height as inputs and outputs a new body mass index, called the Fuzzy Body Mass Index (Fuzzy BMI), which can serve as a system for detecting the slaughter moment of cattle, comparing animals with one another through the linguistic variables "Very Low", "Low", "Medium", "High" and "Very High". To demonstrate and apply this fuzzy system, 147 Nelore cows were analyzed, determining the Fuzzy BMI value for each animal and indicating the body mass status of the entire herd. The system was validated through a statistical analysis using Pearson's correlation coefficient, which reached 0.923, representing a high positive correlation and indicating that the proposed method is adequate. Thus, the present method makes it possible to evaluate the herd, comparing each animal with its peers in the group, thereby providing a quantitative decision-making method for the rancher. It can also be concluded that this work established a computational method based on fuzzy logic, capable of imitating part of human reasoning and of interpreting the body mass index of any bovine breed in any region of the country.
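As an illustration of the kind of system described, the sketch below fuzzifies a crisp mass/height index into the five linguistic terms using triangular membership functions. It shows only the fuzzification step, not the full rule base or defuzzification of the actual system, and the membership ranges are hypothetical, not those used in the study.

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_bmi(mass_kg, height_m):
    """Fuzzify a crisp mass/height^2 index into linguistic terms.
    Term ranges are invented for illustration."""
    bmi = mass_kg / height_m ** 2
    terms = {
        "Very Low":  (0,   150, 220),
        "Low":       (180, 240, 300),
        "Medium":    (260, 320, 380),
        "High":      (340, 400, 460),
        "Very High": (420, 500, 700),
    }
    degrees = {t: tri(bmi, *p) for t, p in terms.items()}
    # Return the dominant term plus all membership degrees
    return max(degrees, key=degrees.get), degrees
```

In a complete Mamdani system, these membership degrees would feed a rule base over mass and height, with a defuzzification step producing the final Fuzzy BMI value.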
Abstract:
Aim: To evaluate the clinical performance of a composite resin (CR) and a resin-modified glass-ionomer cement (RMGIC) for the treatment of abfraction lesions. Methods: Thirty patients with abfraction lesions in at least two premolar teeth were selected and invited to participate in this study. All restorations were made within the same clinical time frame. One tooth was restored with CR Z100TM (3M, St. Paul, MN, USA), and the other was restored with RMGIC VitremerTM (3M). The restorations were assessed immediately and 1, 6 and 12 months after placement, using modified US Public Health Service (USPHS) criteria: marginal integrity, marginal discoloration, wear, retention, secondary caries and hypersensitivity. The statistical analysis was based on the Friedman ANOVA test and the Mann-Whitney test, with p<0.05 considered statistically significant. Results: Both materials demonstrated satisfactory clinical performance after one year. In the individual analysis of each material, there was a significant difference (p<0.05) in the criteria marginal integrity and wear, for both CR and RMGIC, with RMGIC exhibiting more damage one year after restoration. Comparing the two materials, a significant difference was found only for marginal discoloration, with the RMGIC restorations showing the worse prognosis after a year of evaluation. There was no significant difference between CR and RMGIC in retention, secondary caries or hypersensitivity. Conclusions: CR exhibited the better clinical performance according to the cost-effectiveness and evaluation criteria used in this study.
Abstract:
The aim of this study was to evaluate the performance of progenies from Citrullus lanatus var. lanatus (cultivated watermelon) when crossed with progenies from C. lanatus var. citroides (fodder watermelon with a history of resistance to the nematode Meloidogyne enterolobii). The parents and their F1s were evaluated for resistance to this nematode. In the initial stages of the eleven treatments, watermelon plantlets were transplanted to six-kilogram plastic bags once the first leaves developed. Ten plants inoculated with 5,200 eggs in the soil near the stem and four non-inoculated plants were used in each treatment, in a complete block design. Sixty-two days after sowing, the following characteristics were evaluated: length of the aerial part of the plant (LAP, in m), fresh mass of the aerial part (FMAP, in g), root fresh mass (RFM, in g), egg number (EN) and reproduction factor (RF). A comparison between the averages of inoculated and non-inoculated plants was performed using the Scott-Knott test at 5%, and the diallelic analysis was performed using the GENES program. The morphological characteristics did not allow for the identification of the parent plants or the F1s with respect to nematode resistance, but the variables EN and RF were useful for such identification. The analyses of the general and specific combining abilities indicate highly significant effects with respect to this resistance, showing additive gene effects as well as dominance and epistatic gene effects, allowing for the identification of parents and F1s that can be used in watermelon breeding programs to improve resistance to M. enterolobii.
Abstract:
Consumers currently enjoy a surplus of goods (books, videos, music, and other items) available for purchase. While this surplus often allows a consumer to find a product tailored to their preferences or needs, the volume of items available may require considerable time or effort on the part of the user to find the most relevant item. Recommendation systems have become a common part of many online businesses that supply books, videos, music, or other items to consumers. These systems attempt to assist consumers in finding the items that fit their preferences. This report presents an overview of recommendation systems. We also briefly explore the history of recommendation systems and the large boost given to research in this field by the Netflix Challenge. The classical methods for collaborative recommendation systems are reviewed and implemented, and an examination is performed contrasting the complexity and performance of the various models. Finally, current challenges and approaches are discussed.
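As a minimal illustration of the classical collaborative methods reviewed in such reports, a user-based neighbourhood model with cosine similarity can be sketched as follows; the ratings matrix, user names and item names are invented for the example.

```python
import math

# Toy user-item ratings (hypothetical data for illustration only)
ratings = {
    "alice": {"book_a": 5, "book_b": 3, "book_c": 4},
    "bob":   {"book_a": 4, "book_b": 3, "book_c": 5, "book_d": 4},
    "carol": {"book_a": 1, "book_b": 5, "book_d": 2},
}

def cosine(u, v):
    """Cosine similarity over the items both users have rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = (math.sqrt(sum(u[i] ** 2 for i in common))
           * math.sqrt(sum(v[i] ** 2 for i in common)))
    return num / den

def predict(user, item):
    """Similarity-weighted average of neighbours' ratings for the item."""
    num = den = 0.0
    for other, r in ratings.items():
        if other != user and item in r:
            s = cosine(ratings[user], r)
            num += s * r[item]
            den += abs(s)
    return num / den if den else None
```

Matrix-factorization models popularized by the Netflix Challenge replace this neighbourhood computation with learned low-rank user and item factors, trading interpretability for accuracy and scalability.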
Abstract:
Elemental analysis can become an important piece of evidence to assist the solution of a case. The work presented in this dissertation aims to evaluate the evidential value of the elemental composition of three particular matrices: ink, paper and glass. In the first part of this study, the analytical performance of LIBS and LA-ICP-MS methods was evaluated for paper, writing inks and printing inks. A total of 350 ink specimens were examined including black and blue gel inks, ballpoint inks, inkjets and toners originating from several manufacturing sources and/or batches. The paper collection set consisted of over 200 paper specimens originating from 20 different paper sources produced by 10 different plants. Micro-homogeneity studies show smaller variation of elemental compositions within a single source (i.e., sheet, pen or cartridge) than the observed variation between different sources (i.e., brands, types, batches). Significant and detectable differences in the elemental profile of the inks and paper were observed between samples originating from different sources (discrimination of 87 – 100% of samples, depending on the sample set under investigation and the method applied). These results support the use of elemental analysis, using LA-ICP-MS and LIBS, for the examination of documents and provide additional discrimination to the currently used techniques in document examination. In the second part of this study, a direct comparison between four analytical methods (µ-XRF, solution-ICP-MS, LA-ICP-MS and LIBS) was conducted for glass analyses using interlaboratory studies. 
The data provided by 21 participants were used to assess the performance of the analytical methods in associating glass samples from the same source and discriminating between different sources, as well as the use of different match criteria (confidence interval (±6s, ±5s, ±4s, ±3s, ±2s), modified confidence interval, t-test (sequential univariate, p=0.05 and p=0.01), t-test with Bonferroni correction (for multivariate comparisons), range overlap, and Hotelling's T2 tests). Error rates (Type 1 and Type 2) are reported for each of these match criteria and depend on the heterogeneity of the glass sources, the repeatability of the analytical measurements, and the number of elements measured. The study provided recommendations on analytical performance-based parameters for µ-XRF and LA-ICP-MS, as well as the best-performing match criteria for both analytical techniques, which forensic glass examiners can now apply.
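As an illustration of the interval-based match criteria listed above, a widened-interval test (here ±4s) can be sketched as follows; the element names, measurement values and per-element structure are hypothetical, not taken from the interlaboratory data.

```python
import statistics

def matches(known, questioned_mean, k=4.0):
    """Widened-interval criterion for one element: the questioned mean must
    fall within mean ± k*s of repeated measurements of the known fragment."""
    m = statistics.mean(known)
    s = statistics.stdev(known)  # sample standard deviation
    return m - k * s <= questioned_mean <= m + k * s

def compare_fragments(known_profile, questioned_profile, k=4.0):
    """Associate the fragments only if every measured element passes.
    Profiles map element name -> replicate list / mean (hypothetical)."""
    return all(matches(known_profile[e], questioned_profile[e], k)
               for e in known_profile)
```

The Type 1 / Type 2 error trade-off reported in the study follows directly from the choice of k: narrower intervals reject more same-source pairs, wider intervals associate more different-source pairs.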
Abstract:
The attention on green building is driven by the desire to reduce a building's running cost over its entire life cycle. Moreover, by using sustainable technologies and more environmentally friendly products, the construction industry can contribute significantly to the sustainability efforts of our society. Different certification systems have entered the market with the aim of measuring a building's sustainability; however, each system uses its own set of rating criteria. The primary goal of this study is to identify a comprehensive set of criteria for the measurement of building sustainability, and thereby to facilitate the comparison of existing rating methods. The collection and analysis of the criteria, identified through a comprehensive literature review, led to the establishment of two additional categories besides the three pillars of sustainability. The comparative analyses presented in this thesis reveal strengths and weaknesses of the chosen green building certification systems: LEED, BREEAM, and DGNB.
Abstract:
The isoprene degradation mechanism included in version 3 of the Master Chemical Mechanism (MCM v3) has been evaluated and refined, using the Statewide Air Pollution Research Center (SAPRC) environmental chamber datasets on the photo-oxidation of isoprene and its degradation products, methacrolein (MACR) and methyl vinyl ketone (MVK). Prior to this, the MCM v3 butane degradation chemistry was also evaluated using chamber data on the photo-oxidation of butane and its degradation products, methyl ethyl ketone (MEK), acetaldehyde (CH3CHO) and formaldehyde (HCHO), in conjunction with an initial evaluation of the chamber-dependent auxiliary mechanisms for the series of relevant chambers. The MCM v3 mechanisms for both isoprene and butane generally performed well and were found to provide an acceptable reaction framework for describing the NOx photo-oxidation experiments on the above systems, although a number of parameter modifications and refinements were identified which resulted in an improved performance. All of these relate to the magnitude of sources of free radicals from organic chemical processes, such as carbonyl photolysis rates and the yields of radicals from the reactions of O3 with unsaturated oxygenates, and specific recommendations are made for refinements. In addition, it was necessary to include a representation of the reactions of O(3P) with isoprene, MACR and MVK (which were not previously treated in MCM v3), and conclusions are drawn concerning the required extent of free radical formation from these reactions. Throughout the study, the performance of MCM v3 was also compared with that of the SAPRC-99 mechanism, which was developed and optimized in conjunction with the chamber datasets.
Abstract:
The representation of alkene degradation in version 3 of the Master Chemical Mechanism (MCM v3) has been evaluated, using environmental chamber data on the photo-oxidation of ethene, propene, 1-butene and 1-hexene in the presence of NOx, from up to five chambers at the Statewide Air Pollution Research Center (SAPRC) at the University of California. As part of this evaluation, it was necessary to include a representation of the reactions of the alkenes with O(3P), which are significant under chamber conditions but generally insignificant under atmospheric conditions. The simulations for the ethene and propene systems, in particular, were found to be sensitive to the branching ratios assigned to the molecular and free-radical-forming pathways of the O(3P) reactions, with the extent of radical formation required for proper fitting of the model to the chamber data being substantially lower than the reported consensus. With this constraint, the MCM v3 mechanisms for ethene and propene generally performed well. The sensitivity of the simulations to the parameters applied to a series of other radical source and sink reactions (radical formation from the alkene ozonolysis reactions and product carbonyl photolysis; radical removal from the reaction of OH with NO2 and β-hydroxynitrate formation) was also considered, and the implications of these results are discussed. Evaluation of the MCM v3 1-butene and 1-hexene degradation mechanisms, using a more limited dataset from only one chamber, was found to be inconclusive. The results of sensitivity studies demonstrate that it is impossible to reconcile the simulated and observed formation of ozone in these systems for the ranges of parameter values which can currently be justified on the basis of the literature.
As a result of this work, gaps and uncertainties in the kinetic, mechanistic and chamber database are identified and discussed, in relation to both tropospheric chemistry and chemistry important under chamber conditions which may compromise the evaluation procedure, and recommendations are made for future experimental studies. Throughout the study, the performance of the MCM v3 chemistry was also simultaneously compared with that of the corresponding chemistry in the SAPRC-99 mechanism, which was developed and optimized in conjunction with the chamber datasets.
Abstract:
Recent developments in interactive technologies have seen major changes in the manner in which artists, performers, and creative individuals interact with digital music technology; this is due to the increasing variety of interactive technologies that are readily available today. Digital Musical Instruments (DMIs) present musicians with performance challenges that are unique to this form of computer music. One of the most significant deviations from conventional acoustic musical instruments is the level of physical feedback conveyed by the instrument to the user. Currently, new interfaces for musical expression are not designed to be as physically communicative as acoustic instruments. Specifically, DMIs are often devoid of haptic feedback and therefore lack the ability to impart important performance information to the user. Moreover, there is currently no standardised way to measure the effect of this lack of physical feedback. Best practice would suggest a set of methods to effectively, repeatably, and quantifiably evaluate the functionality, usability, and user experience of DMIs. Earlier theoretical and technological applications of haptics have tried to address device performance issues associated with the lack of feedback in DMI designs, and it has been argued that the level of haptic feedback presented to a user can significantly affect the user's overall emotive feeling towards a musical device. The outcomes of the investigations contained within this thesis are intended to inform the design of new haptic interfaces.
Abstract:
High-quality, pure DNA is required for ensuring reliable and reproducible results in molecular diagnostic applications. A number of in-house and commercial methods are available for the extraction and purification of genomic DNA from faecal material, each one offering a specific combination of performance, cost-effectiveness, and ease of use that should be evaluated with respect to the pathogen of interest. In this comparative study, the marketed kits QIAamp DNA stool mini (Qiagen), SpeedTools DNA extraction (Biotools), DNAExtract-VK (Vacunek), PowerFecal DNA isolation (MoBio), and Wizard magnetic DNA purification system (Promega Corporation) were assessed for their efficacy in obtaining DNA of the most relevant enteric protozoan parasites associated with gastrointestinal disease globally. A panel of 113 stool specimens from clinically confirmed patients with cryptosporidiosis (n = 29), giardiasis (n = 47) and amoebiasis by Entamoeba histolytica (n = 3) or E. dispar (n = 10), and from apparently healthy subjects (n = 24), was used for this purpose. Stool samples were divided into five sub-samples and individually processed by each extraction method evaluated. Purified DNA samples were subsequently tested in PCR-based assays routinely used in our laboratory. The five compared methods yielded amplifiable amounts of DNA of the pathogens tested, although performance differences were observed among them depending on the parasite and the infection burden. Methods combining chemical, enzymatic and/or mechanical lysis procedures at temperatures of at least 56 °C proved more efficient for the release of DNA from Cryptosporidium oocysts.
Abstract:
Background: Intensified selection of polled individuals has recently gained importance in predominantly horned dairy cattle breeds as an alternative to routine dehorning. The current polled breeding pool of genetically closely related artificial insemination sires with lower breeding values for performance traits raises questions regarding the effects of intensified selection based on this founder pool. Methods: We developed a stochastic simulation framework that combines the stochastic simulation software QMSim with a self-designed R program named QUALsim that acts as an external extension. Two traits were simulated in a dairy cattle population for 25 generations: one quantitative trait (QMSim) and one qualitative trait with Mendelian inheritance (i.e. polledness, QUALsim). The genotype-assignment scheme for the qualitative trait created realistic initial breeding situations regarding allele frequencies, true breeding values for the quantitative trait and genetic relatedness. Intensified selection for polled cattle was achieved with an approach that weights the estimated breeding values from the animal best linear unbiased prediction model for the quantitative trait, depending on the genotypes or phenotypes for the polled trait, by a user-defined weighting factor. Results: Selection response for the polled trait was highest in the selection scheme based on genotypes. Selection based on phenotypes led to significantly lower allele frequencies for polled. The male selection path played a significantly greater role in the fast dissemination of polled alleles than female selection strategies. Fixation of the polled allele requires selection based on polled genotypes among males. In comparison to a base breeding scenario that does not take polledness into account, intensive selection for polled substantially reduced genetic gain for the quantitative trait after 25 generations.
Reducing selection intensity for polled males while maintaining strong selection intensity among females simultaneously decreased losses in genetic gain and achieved a final allele frequency of 0.93 for the polled allele. Conclusions: A fast transition to a completely polled population through intensified selection for polled contradicted the preservation of high genetic gain for the quantitative trait. Selection on male polled genotypes with moderate weighting, combined with selection on female polled phenotypes with high weighting, could be a suitable compromise with regard to all important breeding aspects.
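The genotype-dependent weighting of estimated breeding values described in this abstract can be sketched as a ranking rule; the weighting factors, genotype codes and candidate data below are hypothetical and not taken from the study.

```python
# Hypothetical genotype weights: homozygous polled (PP), heterozygous (Pp),
# horned (pp). Larger weight = stronger preference in selection.
POLLED_WEIGHT = {"PP": 1.00, "Pp": 0.90, "pp": 0.70}

def weighted_ebv(ebv, genotype, weights=POLLED_WEIGHT):
    """Scale an estimated breeding value by a polled-genotype factor."""
    return ebv * weights[genotype]

def rank_candidates(candidates):
    """candidates: list of (animal_id, ebv, genotype) tuples.
    Returns candidates sorted by weighted EBV, best first."""
    return sorted(candidates,
                  key=lambda c: weighted_ebv(c[1], c[2]),
                  reverse=True)
```

With such a rule, a horned bull with a high EBV can be outranked by a polled bull with a somewhat lower EBV, which is the mechanism behind the trade-off between polled allele frequency and genetic gain reported above.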
Abstract:
The purpose of this evaluation is to assess the performance of Iowa's mental health system in relation to current standards, benchmarks and best practices found in public health systems in the United States.
Abstract:
Gait analysis makes it possible to characterize motor function, highlighting deviations from normal motor behavior related to an underlying pathology. The widespread use of wearable inertial sensors has opened the way to the evaluation of gait in ecological conditions, and a variety of methodological approaches and algorithms have been proposed for the characterization of gait from inertial measurements (e.g. temporal parameters, motor stability and variability, specific pathological alterations). However, no comparative analysis of their performance (i.e. accuracy, repeatability) was yet available, in particular of how this performance is affected by extrinsic factors (i.e. sensor location, computational approach, analysed variable, environmental constraints of the test) and intrinsic factors (i.e. functional alterations resulting from pathology). The aim of the present project was to comparatively analyze the influence of intrinsic and extrinsic factors on the performance of the numerous algorithms proposed in the literature for the quantification of specific characteristics (i.e. timing, variability/stability) and alterations (i.e. freezing) of gait. Regarding extrinsic factors, the influence of sensor location, analysed variable, and computational approach on the performance of a selection of gait segmentation algorithms drawn from a literature review was analysed under different environmental conditions (e.g. solid ground, sand, in water). Moreover, the influence of altered environmental conditions (i.e. in water) on the minimum number of strides necessary to obtain reliable estimates of gait variability and stability metrics was analysed, complementing what is already available in the literature for overground gait in healthy subjects. Regarding intrinsic factors, the influence of specific pathological conditions (i.e. Parkinson's Disease), with and without freezing of gait, on the performance of segmentation algorithms was analysed.
Finally, the analysis of the performance of algorithms for the detection of freezing of gait showed that results depend on the domain of implementation and on IMU position.
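As an illustration of the kind of gait segmentation algorithm compared in such studies, a crude peak-picking event detector over an angular-velocity signal might look as follows; the signal, threshold, minimum event gap and sampling rate are all hypothetical.

```python
def detect_events(signal, threshold=1.0, min_gap=10):
    """Return sample indices of local maxima above threshold, at least
    min_gap samples apart: a crude stride-segmentation rule applied to,
    e.g., a shank angular-velocity trace."""
    events = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]
                and (not events or i - events[-1] >= min_gap)):
            events.append(i)
    return events

def stride_times(events, fs):
    """Stride durations in seconds from consecutive event indices,
    given sampling frequency fs in Hz."""
    return [(b - a) / fs for a, b in zip(events, events[1:])]
```

The extrinsic factors studied in the project map directly onto this sketch: sensor location changes the shape of `signal`, the computational approach changes the detection rule, and the environment (e.g. water) changes how well a fixed `threshold` generalizes.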
Abstract:
The topic of seismic loss assessment not only incorporates many aspects of earthquake engineering, but also entails social factors, public policies and business interests. Because of its multidisciplinary character, this process can be challenging and may sound discouraging to neophytes. In this context, there is an increasing need for simplified methodologies that streamline the process and provide tools for decision-makers and practitioners. This dissertation investigates possible applications both in the modelling of seismic losses and in the analysis of observational seismic data. Regarding the first topic, the PRESSAFE-disp method is proposed for the fast evaluation of fragility curves of precast reinforced-concrete (RC) structures. A direct application of the method to the productive area of San Felice is then studied to assess the number of collapses under a specific seismic scenario. In particular, with reference to the 2012 events, two large-scale stochastic models are outlined. The outcomes of the framework are promising and in good agreement with the observed damage scenario. Furthermore, a simplified displacement-based methodology is outlined to estimate different loss performance metrics for the decision-making phase of the seismic retrofit of a single RC building. The aim is to evaluate the seismic performance of different retrofit options, in a comparative analysis of their effectiveness and convenience. Finally, a contribution to the analysis of observational data is presented in the last part of the dissertation. A specific database of losses of precast RC buildings damaged by the 2012 earthquake is created, and a statistical analysis is performed, allowing several consequence functions to be derived. The outcomes presented may be implemented in probabilistic seismic risk assessments to forecast losses at the large scale.
Furthermore, they may be adopted to establish retrofit policies that prevent and reduce the consequences of future earthquakes in industrial areas.
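Fragility curves of the kind discussed in this abstract are commonly modelled as lognormal cumulative distribution functions of a ground-motion intensity measure. The sketch below shows that general form only, not the PRESSAFE-disp formulation itself, and the median and dispersion values in the test are hypothetical.

```python
import math

def fragility(im, median, beta):
    """P(damage state is reached or exceeded | intensity measure im),
    modelled as a lognormal CDF with given median and log-standard
    deviation (dispersion) beta."""
    if im <= 0:
        return 0.0
    z = (math.log(im) - math.log(median)) / beta
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

In a large-scale loss model, each building class gets its own (median, beta) pair, and expected losses follow by combining these exceedance probabilities with consequence functions such as those derived from the 2012 damage database.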