880 results for Evaluation models
Abstract:
Thesis (Master's)--University of Washington, 2016-09
Abstract:
MELO, Dulce Maria de Araújo et al. Evaluation of the Zinox and Zeolite materials as adsorbents to remove H2S from natural gas. Colloids and Surfaces A: Physicochemical and Engineering Aspects, United States, v. 272, p. 32-36, 2006.
Abstract:
Current practices in agricultural management involve the application of rules and techniques to ensure high-quality and environmentally friendly production. Based on their experience, agricultural technicians and farmers make critical decisions affecting crop growth while considering several interwoven agricultural, technological, environmental, legal and economic factors. In this context, decision support systems and the knowledge models that support them enable the incorporation of valuable experience into software systems, supporting agricultural technicians in making rapid and effective decisions for efficient crop growth. Pest control is an important issue in agricultural management due to the crop yield reductions caused by pests, and it involves expert knowledge. This paper presents a formalisation of the pest control problem and the workflow followed by agricultural technicians and farmers in integrated pest management, the crop production strategy that combines different practices for growing healthy crops whilst minimising pesticide use. A generic decision schema for estimating the infestation risk of a given pest on a given crop is defined; it acts as a metamodel for the maintenance and extension of the knowledge embedded in a pest management decision support system, which is also presented. This software tool has been implemented by integrating a rule-based tool into a web-based architecture. Evaluation from validity and usability perspectives concluded that both agricultural technicians and farmers considered it a useful tool in pest control, particularly for training new technicians and inexperienced farmers.
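The kind of rule-based infestation-risk schema described in this abstract can be sketched in a few lines of Python. The function, the input variables, and every threshold below are hypothetical placeholders chosen for illustration, not the rules of the actual knowledge base:

```python
# A minimal sketch of a rule-based infestation-risk estimator.
# All thresholds here are hypothetical, not those of the real system.

def infestation_risk(temperature_c, humidity_pct, pest_count_per_trap):
    """Classify infestation risk from simple field observations."""
    score = 0
    if 20 <= temperature_c <= 30:   # temperature range favourable to the pest
        score += 1
    if humidity_pct > 70:           # high humidity promotes pest development
        score += 1
    if pest_count_per_trap > 10:    # trap counts above an action threshold
        score += 2
    if score >= 3:
        return "high"
    if score == 2:
        return "medium"
    return "low"

print(infestation_risk(25, 80, 15))  # favourable weather, high trap counts
```

A real decision support system would chain many such rules in a rule engine and expose them through the web architecture the abstract mentions; this sketch only shows the shape of one decision node.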
Abstract:
Statistical association between a single nucleotide polymorphism (SNP) genotype and a quantitative trait in genome-wide association studies is usually assessed using a linear regression model or, in the case of non-normally distributed trait values, the Kruskal-Wallis test. While linear regression models assume an additive mode of inheritance via equidistant genotype scores, the Kruskal-Wallis test merely tests global differences in trait values associated with the three genotype groups. Both approaches thus exhibit suboptimal power when the underlying inheritance mode is dominant or recessive. Furthermore, these tests do not perform well in the common situations when only a few trait values are available in a rare genotype category (disbalance), or when the values associated with the three genotype categories exhibit unequal variance (variance heterogeneity). We propose a maximum test based on a Marcus-type multiple contrast test for relative effect sizes. This test allows model-specific testing of a dominant, additive or recessive mode of inheritance, and it is robust against variance heterogeneity. We show how to obtain mode-specific simultaneous confidence intervals for the relative effect sizes to aid in interpreting the biological relevance of the results. Further, we discuss the use of a related all-pairwise comparisons contrast test with range-preserving confidence intervals as an alternative to the Kruskal-Wallis heterogeneity test. We applied the proposed maximum test to the Bogalusa Heart Study dataset and gained a remarkable increase in the power to detect association, particularly for rare genotypes. Our simulation study also demonstrated that, unlike the standard parametric approaches, the proposed non-parametric tests control the family-wise error rate in the presence of non-normality and variance heterogeneity.
We provide a publicly available R library nparcomp that can be used to estimate simultaneous confidence intervals or compatible multiplicity-adjusted p-values associated with the proposed maximum test.
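The contrast between a global Kruskal-Wallis test and mode-specific rank contrasts can be illustrated with SciPy on simulated data. This is a simplified sketch: the simulated genotype groups, effect sizes, and the unadjusted min-p shortcut at the end are all assumptions for illustration, and do not reproduce the Marcus-type maximum test with multiplicity-adjusted p-values that the nparcomp package provides:

```python
# Simplified sketch: global Kruskal-Wallis vs. mode-specific rank contrasts.
# The data and the unadjusted min-p are illustrative only; nparcomp's
# maximum test handles the multiplicity adjustment properly.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated trait values for genotypes aa, Aa, AA with a dominant effect
# of the A allele, unequal variances, and a rare AA genotype.
aa = rng.normal(0.0, 1.0, 200)
Aa = rng.normal(0.8, 2.0, 150)
AA = rng.normal(0.8, 2.0, 30)

# Global heterogeneity test (what Kruskal-Wallis assesses).
p_global = stats.kruskal(aa, Aa, AA).pvalue

# One rank-based two-group contrast per inheritance mode.
p_dom = stats.mannwhitneyu(aa, np.concatenate([Aa, AA])).pvalue  # dominant: aa vs {Aa, AA}
p_rec = stats.mannwhitneyu(np.concatenate([aa, Aa]), AA).pvalue  # recessive: {aa, Aa} vs AA
geno = np.concatenate([np.zeros(200), np.ones(150), np.full(30, 2)])
trait = np.concatenate([aa, Aa, AA])
p_add = stats.spearmanr(geno, trait).pvalue                      # additive: monotone trend

# Taking the best of three correlated contrasts needs adjustment;
# this raw minimum is only a rough stand-in for the maximum test.
p_min = min(p_dom, p_rec, p_add)
print(p_global, p_min)
```

With a true dominant effect, the dominant contrast is the most sensitive of the three, which is the intuition behind preferring a mode-specific maximum test over the global Kruskal-Wallis test.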
Abstract:
The ability to estimate the impact of ongoing climate change on the hydrological behaviour of hydro-systems is essential if our societies are to anticipate the inevitable and necessary adaptations. In this context, this doctoral project evaluates the sensitivity of future hydrological projections to: (i) the non-robustness of hydrological model parameter identification, (ii) the use of several equifinal parameter sets, and (iii) the use of different hydrological model structures. To quantify the impact of the first source of uncertainty on model outputs, four climatically contrasted sub-periods are first identified within the observed records. The models are calibrated on each of these four periods, and the resulting outputs are analysed in calibration and in validation following the four configurations of the differential split-sample test (Klemeš, 1986; Wilby, 2005; Seiller et al., 2012; Refsgaard et al., 2014). To study the second source of uncertainty, the equifinality of parameter sets is then taken into account by considering, for each calibration, the outputs associated with equifinal parameter sets. Finally, to evaluate the third source of uncertainty, five hydrological models of different levels of complexity (GR4J, MORDOR, HSAMI, SWAT and HYDROTEL) are applied to the Au Saumon River catchment in Quebec. The three sources of uncertainty are evaluated both under past observed climatic conditions and under future climatic conditions. The results show that, given the evaluation method followed in this doctoral work, the use of hydrological models of different levels of complexity is the main source of variability in streamflow projections under future climatic conditions.
This is followed by the lack of robustness of parameter identification. Hydrological projections generated by an ensemble of equifinal parameter sets are close to those associated with the optimal parameter set. Consequently, more effort should be invested in improving model robustness for climate-change impact studies, notably by developing more appropriate model structures and by proposing calibration procedures that increase their robustness. This work provides a detailed answer regarding our capacity to diagnose the impacts of climate change on the water resources of the Au Saumon catchment, and proposes an original methodological approach that can be directly applied or adapted to other hydro-climatic contexts.
Abstract:
Adoptive immunotherapy and oncolytic virotherapy are two promising strategies for treating primary and metastatic malignant brain tumors. We demonstrate the ability of adoptively transferred tumor-specific T cells to rapidly mediate the clearance of established brain tumors in several mouse models. Similar to the clinical situation, tumor recurrences are frequent and result from immune editing of tumors. T cells can eliminate antigen-expressing tumor cells but are not effective against antigen loss variant (ALV) cancer cells that multiply and repopulate a tumor. We show that the level of tumor antigen present affects the success of adoptive T cell therapy. When high levels of antigen are present, tumor stromal cells such as microglia and macrophages present tumor peptide on their surface. As a result, T cells directly eliminate cancer cells and cross-presenting stromal cells and indirectly eliminate ALV cells. We were able to show the first direct evidence of tumor antigen cross-presentation by CD11b+ stromal cells in the brain using soluble, high-affinity T cell receptor monomers. Strategies that target brain tumor stroma or increase antigen shedding from tumor cells, leading to increased cross-presentation by stromal cells, may improve the clinical success of T cell adoptive therapies. We evaluated one potential strategy to complement adoptive T cell therapy by characterizing the oncolytic effects of myxoma virus (MYXV) in a syngeneic mouse brain tumor model of metastatic melanoma. MYXV is a rabbit poxvirus with strict species tropism for European rabbits. MYXV can also infect mouse and human cancer cell lines due to signaling defects in innate antiviral mechanisms and hyperphosphorylation of Akt. MYXV kills B16.SIY melanoma cells in vitro, and intratumoral injection of virus leads to robust, selective and transient infection of the tumor.
We observed that virus treatment recruits innate immune cells to the tumor, induces TNFα and IFNβ production in the brain, and results in limited oncolytic effects in vivo. To overcome this, we evaluated the safety and efficacy of co-administering 2C T cells, MYXV, and neutralizing antibodies against IFNβ. Mice that received the triple combination therapy survived significantly longer with no apparent side effects, but eventually relapsed. Based on these findings, methods to enhance viral replication in the tumor and limit immune clearance of the virus will be pursued. We conclude that myxoma virus should be further explored as a vector for transient delivery of therapeutic genes to a tumor to enhance T cell responses.
Abstract:
Reliability and dependability modeling can be employed during many stages of analysis of a computing system to gain insights into its critical behaviors. To provide useful results, realistic models of systems are often necessarily large and complex. Numerical analysis of these models presents a formidable challenge because the sizes of their state-space descriptions grow exponentially with the sizes of the models. On the other hand, simulation of the models requires analysis of many trajectories in order to compute statistically correct solutions. This dissertation presents a novel framework for performing both numerical analysis and simulation. The new numerical approach computes bounds on the solutions of transient measures in large continuous-time Markov chains (CTMCs). It extends existing path-based and uniformization-based methods by identifying sets of paths that are equivalent with respect to a reward measure and related to one another via a simple structural relationship. This relationship makes it possible for the approach to explore multiple paths at the same time, thus significantly increasing the number of paths that can be explored in a given amount of time. Furthermore, the use of a structured representation for the state space and the direct computation of the desired reward measure (without ever storing the solution vector) allow it to analyze very large models using a very small amount of storage. Often, path-based techniques must compute many paths to obtain tight bounds. In addition to presenting the basic path-based approach, we also present algorithms for computing more paths and tighter bounds quickly. One resulting approach is based on the concept of path composition, whereby precomputed subpaths are composed to compute whole paths efficiently. Another approach is based on selecting important paths (among a set of many paths) for evaluation. Many path-based techniques suffer from having to evaluate many (unimportant) paths.
Evaluating only the important ones helps to compute tight bounds quickly.
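The uniformization technique that the dissertation's numerical approach extends can be sketched on a toy chain. The 3-state generator below is an assumed reliability-style example (up, degraded, down), not a model from the dissertation, and the full-vector series here is exactly what the path-based method avoids for large chains:

```python
# Minimal uniformization sketch for transient analysis of a small CTMC.
# The 3-state generator is a hypothetical up/degraded/down model.
import numpy as np

# Generator matrix Q: rows sum to zero, off-diagonals are transition rates.
Q = np.array([[-0.3,  0.2,  0.1],
              [ 0.5, -0.7,  0.2],
              [ 1.0,  0.0, -1.0]])

Lam = max(-Q[i, i] for i in range(3))   # uniformization rate >= max exit rate
P = np.eye(3) + Q / Lam                 # uniformized DTMC (row-stochastic)

def transient(pi0, t, n_terms=100):
    """pi(t) = sum_k Poisson(k; Lam*t) * pi0 @ P^k, truncated at n_terms."""
    pi = np.zeros(3)
    weight = np.exp(-Lam * t)           # Poisson weight for k = 0
    v = pi0.copy()
    for k in range(n_terms):
        pi += weight * v
        v = v @ P                       # advance the DTMC one jump
        weight *= Lam * t / (k + 1)     # next Poisson weight
    return pi

pi_t = transient(np.array([1.0, 0.0, 0.0]), t=2.0)
print(pi_t, pi_t.sum())
```

Truncating the Poisson series yields bounds on the transient measure; the dissertation's contribution is computing such bounds over grouped paths without ever storing the solution vector.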
Abstract:
Traditional air delivery to high-bay buildings involves ceiling-level supply and return ducts that create an almost uniform temperature in the space. Problems with this system include potential recirculation of supply air and higher-than-necessary return air temperatures. A new air delivery strategy was investigated that involves changing the height of conventional supply and return ducts to gain control over thermal stratification in the space. A full-scale experiment using ten vertical temperature profiles was conducted in a manufacturing facility over one year. The experimental data were used to validate CFD and EnergyPlus models. CFD simulation results show that supplying air directly to the occupied zone increases stratification while holding thermal comfort constant during cooling operation. The building energy simulation identified how return air temperature offset, set point offset, and stratification influence the building's energy consumption. A utility bill analysis for cooling shows 28.8% HVAC energy savings, while the building energy simulation shows 19.3–37.4% HVAC energy savings.
Abstract:
This paper presents an experimental study on the evolution of carrot properties during convective drying by hot air at different temperatures (50 °C, 60 °C and 70 °C). The thermo-physical properties calculated were: specific heat, thermal conductivity, diffusivity, enthalpy, and heat and mass transfer coefficients. Furthermore, the drying kinetics data were fitted to three empirical models: Page, Henderson & Pabis and Logarithmic. The sorption isotherms were also determined and fitted using the GAB model. The results showed that the thermo-physical properties generally declined during the drying process, and that the decrease was fastest at 70 °C. The Page model showed the best predictive ability for representing the kinetics of the drying process. The GAB model used to fit the sorption isotherms also showed good predictive capacity and, at a given water activity, despite some variations, the amount of water sorbed increased as the drying temperature decreased.
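Fitting the Page thin-layer model, MR = exp(-k * t**n), can be sketched with SciPy. The data below are synthetic with assumed parameters, not the paper's measured carrot drying curves:

```python
# Hedged sketch: fitting the Page drying model MR = exp(-k * t**n)
# to synthetic moisture-ratio data (assumed parameters, not the paper's).
import numpy as np
from scipy.optimize import curve_fit

def page(t, k, n):
    """Page thin-layer drying model: moisture ratio vs. time."""
    return np.exp(-k * t**n)

t = np.linspace(0.1, 8.0, 30)                  # drying time, h (synthetic)
rng = np.random.default_rng(1)
mr_obs = page(t, 0.35, 1.2) + rng.normal(0, 0.01, t.size)  # noisy "data"

(k_hat, n_hat), _ = curve_fit(page, t, mr_obs, p0=(0.1, 1.0))
rmse = float(np.sqrt(np.mean((page(t, k_hat, n_hat) - mr_obs) ** 2)))
print(k_hat, n_hat, rmse)
```

The same `curve_fit` call with a different model function serves for the Henderson & Pabis and Logarithmic models, and comparing the resulting RMSE values is one common way to conclude, as the paper does, that the Page model fits best.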
Abstract:
As academic student mobility is increasing, improving the functionality of international operations is recognised as a competitive advantage at tertiary education institutions. Although many scholars have researched the experiences of exchange students, the role of student tutors and their contribution to exchange students' experiences is still an unknown factor. This research examines international tutoring at the University of Turku and aims to better understand how tutoring contributes to exchange experiences, to explore the functionality of the tutor system, and to discover areas for improvement. To achieve these goals, the research seeks to answer the fundamental research question: What is the role of tutors in mediating exchange experiences? The theoretical framework combines literature on mediating exchange experiences, the phenomenon of studying abroad, the process of adaptation, the importance of cross-cultural communication, and the role of student tutors as mediators. Based on the literature review, a theoretical model for studying the mediation of exchange experiences is introduced. The model's applicability and validity are examined through a case study. Three methods were used in the empirical research: surveys, participant observations, and interviews. These methods provided extensive data from the three major parties of the tutor system: tutors, exchange students, and the international office. The findings reveal that tutoring, both instrumental leading and social and cultural mediating, generates negative as well as positive experiences depending on the individuals' expectations, motivations, relationships, and the nature of the tutoring. Although functional, the tutor system has a few weaknesses.
Tutors tend to act as effective instrumental leaders, but often fail to create friendships and to contribute to exchange students' experiences through social and cultural mediation. Such mediation is significantly more important to the students' overall experience in terms of building networks, adapting, gaining emotional experiences, and achieving personal development and mental change. Based on these weaknesses, three improvements are suggested: (1) increasing comprehensive sharing of information, effective communication, and collective cooperation; (2) emphasising the importance of social and cultural mediation and increasing the frequency of interaction between tutors and exchange students; and (3) improving recruitment and training, revising the process of reporting and rewarding, and enhancing services and coordination.
Abstract:
The production and use of synthetic nanoparticles are growing rapidly, and the presence of these materials in the environment therefore seems inevitable. Titanium dioxide (TiO2) has various possible uses in industry, in cosmetics, and even in the treatment of contaminated environments. Studies of the potential ecotoxicological risks of TiO2 nanoparticles (nano-TiO2) have been published, but their results are still inconclusive. It should be noted that the properties of the diverse nano-TiO2 must be considered in order to establish experimental models for studying their toxicity to environmentally relevant species. Moreover, the lack of description and characterization of nanoparticles, as well as differences in the experimental conditions employed, has compromised the comparison of results across studies. The purpose of this paper is therefore to provide a brief review of the principal properties of TiO2, especially in nanoparticulate form, that should be considered in aquatic toxicology studies, together with a compilation of the works that have been published on the subject.
Abstract:
In recent years, the security of industrial control systems has been a major research focus due to potential cyber-attacks that can impact physical operations. These risks have created an urgent need for stronger protection against such threats. Conventional firewalls with stateful rules can be deployed in critical cyberinfrastructure environments, but they require constant updates. Despite the ongoing effort to maintain the rules, this protection mechanism does not restrict malicious data flows and poses a greater risk of intrusion. The contributions of this thesis are motivated by these issues and include a systematic, reliability-oriented investigation of attack-related scenarios within a substation network. The proposed work is two-fold: (i) system architecture evaluation and (ii) construction of an attack tree for a substation network. Cyber-system reliability remains one of the important factors in determining system bottlenecks for investment planning and maintenance; it determines how long the system can operate with or without disruption. First, existing implementations are exhaustively enumerated, covering existing bidirectional communication architectures and new, strictly unidirectional ones. A detailed model of the ten extended system architectures has been evaluated. Next, attack tree modeling for potential substation threats is formulated, quantifying the risks of possible attack scenarios originating within a network or from external networks. The analytical models proposed in this thesis can serve as a foundation for further research.
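The attack tree formalism mentioned in this abstract can be sketched with a small recursive evaluator. The tree shape, the leaf probabilities, and the independence assumption below are all hypothetical; the thesis builds far more detailed trees for substation networks:

```python
# Minimal attack-tree evaluation sketch. Leaf probabilities and the
# tree itself are hypothetical; leaf events are assumed independent.

def evaluate(node):
    """Return the success probability of an attack (sub)tree.

    A node is either a float (leaf success probability) or a tuple
    ("AND" | "OR", [children]).
    """
    if isinstance(node, float):
        return node
    gate, children = node
    probs = [evaluate(c) for c in children]
    if gate == "AND":                    # all child attacks must succeed
        p = 1.0
        for q in probs:
            p *= q
        return p
    # OR gate: at least one child attack succeeds
    p_fail = 1.0
    for q in probs:
        p_fail *= 1.0 - q
    return 1.0 - p_fail

# Hypothetical scenario: compromise a substation HMI either by phishing
# an operator (0.1), or by both breaching the firewall (0.3) AND
# cracking a credential (0.4).
tree = ("OR", [0.1, ("AND", [0.3, 0.4])])
p_root = evaluate(tree)
print(p_root)
```

Propagating probabilities through AND/OR gates like this is the basic quantification step; richer variants attach costs or detection likelihoods to the leaves instead.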
Abstract:
Consumers currently enjoy a surplus of goods (books, videos, music, and other items) available to purchase. While this surplus often allows a consumer to find a product tailored to their preferences or needs, the volume of items available may require considerable time or effort to find the most relevant one. Recommendation systems have become a common part of many online businesses that supply such items to consumers, and they attempt to assist consumers in finding the items that fit their preferences. This report presents an overview of recommendation systems. We also briefly explore the history of recommendation systems and the large boost that the Netflix Challenge gave to research in this field. The classical methods for collaborative recommendation systems are reviewed and implemented, and the complexity and performance of the various models are contrasted. Finally, current challenges and approaches are discussed.
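The classical collaborative-filtering approach the report reviews can be sketched in a few lines: predict a user's rating of an unseen item as a similarity-weighted average of other users' ratings. The toy ratings matrix below is fabricated for illustration, not Netflix Challenge data:

```python
# Minimal user-based collaborative filtering on a toy ratings matrix
# (rows = users, columns = items, 0 = unrated). Data are fabricated.
import numpy as np

R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)

def cosine(u, v):
    """Cosine similarity over the items both users have rated."""
    mask = (u > 0) & (v > 0)
    if not mask.any():
        return 0.0
    return float(u[mask] @ v[mask] /
                 (np.linalg.norm(u[mask]) * np.linalg.norm(v[mask])))

def predict(R, user, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    num = den = 0.0
    for other in range(R.shape[0]):
        if other == user or R[other, item] == 0:
            continue
        s = cosine(R[user], R[other])
        num += s * R[other, item]
        den += abs(s)
    return num / den if den else 0.0

p = predict(R, user=0, item=2)   # predict user 0's rating of item 2
print(p)
```

Because user 0's most similar neighbor rated item 2 low, the prediction lands well below the scale's maximum; matrix-factorization methods, popularized by the Netflix Challenge, replace this neighborhood average with learned latent factors.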