5 results for complexity metrics

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance: 20.00%

Publisher:

Abstract:

Pre-treatment HCV quasispecies complexity and diversity may predict response to interferon-based antiviral therapy. The objective of this study was to retrospectively (1) examine temporal changes in quasispecies prior to the start of therapy and (2) extensively investigate quasispecies evolution in a group of 10 chronically infected patients with genotype 3a, treated with pegylated alpha-2a interferon and ribavirin. The degree of sequence heterogeneity within hypervariable region 1 was assessed by analyzing 20-30 individual clones in serial serum samples. Genetic parameters, including amino acid Shannon entropy, Hamming distance and genetic distance, were calculated for each sample. Treatment outcome was divided into (1) sustained virological responders (SVR) and (2) treatment failure (TF). Our results indicate that (1) quasispecies complexity and diversity are lower in the SVR group, (2) quasispecies vary temporally and (3) genetic heterogeneity at baseline can be used to predict treatment outcome. We discuss the results from the perspective of replicative homeostasis.
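For reference, the sample-level measures named above (amino acid Shannon entropy and mean Hamming distance) can be computed directly from a set of clone sequences. The Python sketch below is purely illustrative and is not the study's analysis code: the sequences and function names are hypothetical, and entropy is normalised by ln(N), a convention commonly used for quasispecies data.

from collections import Counter
from itertools import combinations
import math

def shannon_entropy(clones):
    # Shannon entropy over the distinct variants observed in a sample of clones.
    counts = Counter(clones)
    n = len(clones)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def normalized_entropy(clones):
    # Entropy divided by ln(N), so 0 = a single variant, 1 = all clones distinct.
    n = len(clones)
    return shannon_entropy(clones) / math.log(n) if n > 1 else 0.0

def mean_hamming(clones):
    # Mean pairwise Hamming distance between equal-length clone sequences.
    pairs = list(combinations(clones, 2))
    if not pairs:
        return 0.0
    return sum(sum(a != b for a, b in zip(s, t)) for s, t in pairs) / len(pairs)

# Toy sample of five equal-length HVR1 amino acid clones (made-up sequences).
sample = ["TTQVGGAAH", "TTQVGGAAH", "TTQVSGAAH", "TTRVGGAAH", "TTQVGGATH"]
print(normalized_entropy(sample), mean_hamming(sample))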

Relevance: 20.00%

Publisher:

Abstract:

Energy efficiency and user comfort have recently become priorities in the Facility Management (FM) sector. This has resulted in the use of innovative building components, such as thermal solar panels, heat pumps, etc., as they have the potential to provide better performance, energy savings and increased user comfort. However, as the complexity of components increases, the requirement for maintenance management also increases. The standard routine for building maintenance is inspection, which results in repairs or replacement when a fault is found. This routine leads to unnecessary inspections, which incur costs in component downtime and work hours. This research proposes an alternative routine: performing building maintenance at the point in time when the component is degrading and requires maintenance, thus reducing the frequency of unnecessary inspections. This thesis demonstrates that statistical techniques can be used as part of a maintenance management methodology to invoke maintenance before failure occurs. The proposed FM process is presented through a scenario utilising current Building Information Modelling (BIM) technology and innovative contractual and organisational models. This FM scenario supports a Degradation-based Maintenance (DbM) scheduling methodology, implemented using two statistical techniques, Particle Filters (PFs) and Gaussian Processes (GPs). DbM consists of extracting and tracking a degradation metric for a component. Limits for the degradation metric are identified based on one of a number of proposed processes, which determine the limits according to the maturity of the historical information available. DbM is implemented for three case study components: a heat exchanger, a heat pump, and a set of bearings. The degradation points identified for each case study by a PF, a GP and a hybrid (PF and GP combined) DbM implementation are assessed against known degradation points. The GP implementations are successful for all components. For the PF implementations, the results presented in this thesis find that the extracted metrics and limits identify degradation occurrences accurately for components which are in continuous operation. For components which have seasonal operational periods, the PF may wrongly identify degradation. The GP performs more robustly than the PF, but the PF, on average, results in fewer false positives. The hybrid implementations, which combine GP and PF results, are successful for 2 of 3 case studies and are not affected by seasonal data. Overall, DbM is effectively applied for the three case study components. The accuracy of the implementations is dependent on the relationships modelled by the PF and GP, and on the type and quantity of data available. This novel maintenance process can improve equipment performance and reduce energy wastage from BSC operation.
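To make the DbM idea concrete, the sketch below shows one way a Gaussian Process could be used to smooth a noisy degradation metric and flag the point at which it reaches a limit. This is a minimal illustration, not the thesis implementation: the synthetic metric, the kernel choices and the fixed LIMIT value are all assumptions, and it relies on scikit-learn's GaussianProcessRegressor.

# Illustrative GP-based degradation tracking (assumed setup, not the thesis code).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)
t = np.arange(0.0, 200.0)                          # observation times (e.g. days)
metric = 0.02 * t + rng.normal(0.0, 0.1, t.size)   # synthetic drifting degradation metric

# The GP smooths measurement noise so that a limit crossing reflects the
# underlying trend rather than an individual noisy reading.
kernel = RBF(length_scale=20.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t.reshape(-1, 1), metric)
trend, std = gp.predict(t.reshape(-1, 1), return_std=True)

LIMIT = 2.5  # assumed degradation limit; in practice derived from historical data
crossed = np.flatnonzero(trend >= LIMIT)
if crossed.size:
    print(f"Degradation limit reached around t = {t[crossed[0]]:.0f}; schedule maintenance")
else:
    print("Metric has not reached the degradation limit")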

Relevance: 20.00%

Publisher:

Abstract:

This research has explored the relationship between system test complexity and tacit knowledge. This thesis proposes that the process of system testing (comprising test planning, test development, test execution, test fault analysis, test measurement, and case management) is directly affected both by complexity associated with the system under test and by other sources of complexity that are independent of the system under test but related to the wider process of system testing. While a certain amount of knowledge related to the system under test is inherent, tacit in nature, and therefore difficult to make explicit, it has been found that a significant amount of knowledge relating to these other sources of complexity can indeed be made explicit. While the importance of explicit knowledge has been reinforced by this research, there has been a lack of evidence to suggest that the availability of tacit knowledge to a test team is of any less importance to the process of system testing when operating in a traditional software development environment. Participants commonly expressed the sentiment that, even though a considerable amount of explicit knowledge relating to the system is freely available, a good deal of the knowledge about the system under test that is demanded for effective system testing is actually tacit in nature (approximately 60% of participants operating in a traditional development environment, and 60% of participants operating in an agile development environment, expressed similar sentiments). To cater for the availability of tacit knowledge relating to the system under test, and indeed both the explicit and tacit knowledge required by system testing in general, an appropriate knowledge management structure needs to be in place. This appears to be required irrespective of the development methodology employed.

Relevance: 20.00%

Publisher:

Abstract:

The lived environment is the arena where our cognitive skills, preferences, and attitudes come together to determine our ability to interact with the world. The mechanisms through which lived environments can benefit cognitive health in older age are yet to be fully understood. The existing literature suggests that environments which are perceived as stimulating, usable and aesthetically appealing can improve or facilitate cognitive performance in both young and older age. Importantly, optimal stimulation for cognition seems to depend on experiencing environments that are sufficiently stimulating without being overly challenging. Environmental complexity is an important contributor to determining whether an environment provides such optimal stimulation. The present paper reviews a selection of studies which have explored complexity in relation to perceptual load, environmental preference and perceived usability, in order to propose a framework that explores direct and indirect environmental influences on cognition and to understand these influences in relation to aging processes. We identify ways to define complexity at different environmental scales, ranging from low-level perceptual features of scenes, to design qualities of proximal environments (e.g., streets, neighborhoods), to broad geographical areas (i.e., natural vs. urban environments). We propose that studying complexity at these different scales will provide new insight into the design of cognitive-friendly environments.

Relevance: 20.00%

Publisher:

Abstract:

The pace at which challenges are introduced in a game has long been identified as a key determinant of both the enjoyment and difficulty experienced by game players, and of their ability to learn from game play. In order to understand how best to pace challenges in games, there is great value in analysing games already demonstrated to be highly engaging. Play-through videos of four puzzle games (Portal, Portal 2 Co-operative mode, Braid and Lemmings) were observed and analysed using metrics derived from a behavioural psychology understanding of how people solve problems. Findings suggest that: (1) the main skills learned in each game are introduced separately, (2) each is introduced through simple puzzles that require only basic performance of that skill, (3) the player is then given the opportunity to practice and integrate that skill with previously learned skills, and (4) puzzles increase in complexity until the next new skill is introduced. These data provide practical guidance for designers, support contemporary thinking on the design of learning structures in games, and suggest future directions for empirical research.