35 results for computational thinking
at University of Queensland eSpace - Australia
Abstract:
This paper provides a computational framework, based on Defeasible Logic, to capture some aspects of institutional agency. Our background is the Kanger-Lindahl-Pörn account of organised interaction, which describes this interaction within a multi-modal logical setting. This work focuses in particular on the notion of the counts-as link and on the notions of attempt and of personal and direct action to realise states of affairs. We show how standard Defeasible Logic can be extended to represent these concepts: the resulting system preserves some basic properties commonly attributed to them. In addition, the framework enjoys nice computational properties, as it turns out that the extension of any theory can be computed in time linear in the size of the theory itself.
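To make the style of inference concrete: a minimal sketch of defeasible forward chaining, not the authors' extended institutional-agency logic. The rule names, the classic bird/penguin example, and the superiority relation are invented for illustration.

```python
# Minimal defeasible inference sketch (illustrative only). Strict rules
# ("->") always fire when their body is provable; defeasible rules ("=>")
# fire unless a conflicting rule with a provable body is not beaten via
# the superiority relation.
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    body: frozenset      # antecedent literals
    head: str            # conclusion literal, e.g. "p" or "~p"
    defeasible: bool

def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def extension(facts, rules, superior):
    """Forward-chain to a fixpoint. `superior` maps a rule name to the
    set of rule names it beats. Each pass scans every rule once, echoing
    the flavour of the paper's linear-time complexity claim."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for r in rules:
            if r.head in derived or not r.body <= derived:
                continue
            # conflicting rules whose bodies are also currently provable
            attackers = [s for s in rules
                         if s.head == negate(r.head) and s.body <= derived]
            if r.defeasible and any(
                    s.name not in superior.get(r.name, set())
                    for s in attackers):
                continue  # defeated: some attacker is not beaten
            derived.add(r.head)
            changed = True
    return derived

# Classic example: penguins are birds (strict); birds fly and penguins
# do not fly (both defeasible), with the penguin rule r2 beating r1.
rules = [
    Rule("r1", frozenset({"bird"}), "flies", True),
    Rule("r2", frozenset({"penguin"}), "~flies", True),
    Rule("r3", frozenset({"penguin"}), "bird", False),
]
ext = extension({"penguin"}, rules, {"r2": {"r1"}})
```

Here `ext` contains `bird` and `~flies` but not `flies`: the more specific rule wins, which is the defeasible behaviour the framework builds on.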
Abstract:
Traditional waste stabilisation pond (WSP) models encounter problems predicting pond performance because they cannot account for the influence of pond features, such as inlet structure or pond geometry, on fluid hydrodynamics. In this study, two dimensional (2-D) computational fluid dynamics (CFD) models were compared to experimental residence time distributions (RTD) from literature. In one of the three geometries simulated, the 2-D CFD model successfully predicted the experimental RTD. However, flow patterns in the other two geometries were not well described due to the difficulty of representing the three dimensional (3-D) experimental inlet in the 2-D CFD model, and the sensitivity of the model results to the assumptions used to characterise the inlet. Neither a velocity-similarity nor a geometric-similarity approach to inlet representation in 2-D gave results correlating with experimental data. However, it was shown that 2-D CFD models were not affected by changes in values of model parameters which are difficult to predict, particularly the turbulent inlet conditions. This work suggests that 2-D CFD models cannot be used a priori to give an adequate description of the hydrodynamic patterns in WSP. (C) 1998 Elsevier Science Ltd. All rights reserved.
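The model-versus-experiment comparison above rests on reducing tracer curves to RTD statistics. A hedged sketch of that standard reduction; the tracer data used below are synthetic, not the literature measurements the study compared against.

```python
# Sketch: reducing a tracer concentration curve c(t) to RTD statistics.

def trapezoid(y, x):
    """Simple trapezoidal integration over sampled points."""
    return sum((y[i] + y[i + 1]) * (x[i + 1] - x[i]) / 2
               for i in range(len(x) - 1))

def rtd_statistics(t, c):
    """Normalise c(t) into the RTD E(t), then return the mean residence
    time and variance -- the two moments usually compared between a CFD
    model and a tracer experiment."""
    area = trapezoid(c, t)
    E = [ci / area for ci in c]                  # E(t) integrates to 1
    t_mean = trapezoid([ti * Ei for ti, Ei in zip(t, E)], t)
    var = trapezoid([(ti - t_mean) ** 2 * Ei for ti, Ei in zip(t, E)], t)
    return t_mean, var
```

A mean residence time well below the nominal hydraulic retention time V/Q indicates short-circuiting, exactly the kind of inlet-driven flow feature the abstract says 2-D models struggled to capture.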
Abstract:
This paper describes U2DE, a finite-volume code that numerically solves the Euler equations. The code was used to perform multi-dimensional simulations of the gradual opening of a primary diaphragm in a shock tube. From the simulations, the speed of the developing shock wave was recorded and compared with other estimates. The ability of U2DE to compute shock speed was confirmed by comparing numerical results with the analytic solution for an ideal shock tube. For high initial pressure ratios across the diaphragm, previous experiments have shown that the measured shock speed can exceed the shock speed predicted by one-dimensional models. The shock speeds computed with the present multi-dimensional simulation were higher than those estimated by previous one-dimensional models and, thus, were closer to the experimental measurements. This indicates that multi-dimensional flow effects were partly responsible for the relatively high shock speeds measured in the experiments.
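The ideal-shock-tube baseline mentioned above has a closed-form relation between the diaphragm pressure ratio and the shock strength, but it must be solved iteratively. A sketch under simplifying assumptions (same gas and initial temperature on both sides, gamma = 1.4); this is the textbook relation, not the U2DE code.

```python
# Ideal (instantaneous-rupture) shock tube: given the diaphragm pressure
# ratio p4/p1, solve the classic shock-tube equation for p2/p1 by
# bisection, then convert to the shock Mach number.
import math

def ideal_shock_tube(p4_over_p1, gamma=1.4, a_ratio=1.0):
    """Return (p2/p1, shock Mach number). a_ratio = a1/a4, the
    driven-to-driver sound speed ratio (1.0 for equal temperatures)."""
    g = gamma

    def residual(x):  # shock-tube equation residual in x = p2/p1
        term = 1 - (g - 1) * a_ratio * (x - 1) / math.sqrt(
            2 * g * (2 * g + (g + 1) * (x - 1)))
        return x * term ** (-2 * g / (g - 1)) - p4_over_p1

    lo, hi = 1.0 + 1e-9, p4_over_p1   # residual is negative at lo, positive at hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:
            hi = mid
        else:
            lo = mid
    p2_over_p1 = 0.5 * (lo + hi)
    # Normal-shock relation: p2/p1 = 1 + 2g/(g+1) * (Ms^2 - 1)
    Ms = math.sqrt((g + 1) / (2 * g) * (p2_over_p1 - 1) + 1)
    return p2_over_p1, Ms
```

Multiplying the returned Mach number by the driven-gas sound speed a1 gives the 1-D shock speed that the multi-dimensional simulations in the paper are compared against.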
Abstract:
Computer models can be combined with laboratory experiments for the efficient determination of (i) peptides that bind MHC molecules and (ii) T-cell epitopes. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures. This requires the definition of standards and experimental protocols for model application. We describe the requirements for validation and assessment of computer models. The utility of combining accurate predictions with a limited number of laboratory experiments is illustrated by practical examples. These include the identification of T-cell epitopes from IDDM-, melanoma- and malaria-related antigens by combining computational and conventional laboratory assays. The success rate in determining antigenic peptides, each in the context of a specific HLA molecule, ranged from 27 to 71%, while the natural prevalence of MHC-binding peptides is 0.1-5%.
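The headline numbers above imply a large practical saving; a back-of-envelope sketch of the enrichment, using only the ranges quoted in the abstract:

```python
# Fold enrichment of a model-guided screen over blind screening:
# the ratio of the model's hit rate to the natural prevalence of binders.
def fold_enrichment(model_hit_rate, background_prevalence):
    """How many times fewer peptides must be assayed per confirmed hit."""
    return model_hit_rate / background_prevalence

# Conservative pairing of the quoted figures: 27% hit rate vs 5% prevalence
low = fold_enrichment(0.27, 0.05)
# Favourable pairing: 71% hit rate vs 0.1% prevalence
high = fold_enrichment(0.71, 0.001)
```

Even the conservative pairing is a roughly 5-fold reduction in laboratory assays per epitope found, and the favourable pairing is several hundred-fold, which is the efficiency argument the abstract makes.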
Abstract:
In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated data approach. As such they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.
Abstract:
Protein kinases exhibit various degrees of substrate specificity. The large number of different protein kinases in the eukaryotic proteomes makes it impractical to determine the specificity of each enzyme experimentally. To test if it were possible to discriminate potential substrates from non-substrates by simple computational techniques, we analysed the binding enthalpies of modelled enzyme-substrate complexes and attempted to correlate them with experimental enzyme kinetics measurements. The crystal structures of phosphorylase kinase and cAMP-dependent protein kinase were used to generate models of the enzyme with a series of known peptide substrates and non-substrates, and the approximate enthalpy of binding assessed following energy minimization. We show that the computed enthalpies do not correlate closely with kinetic measurements, but the method can distinguish good substrates from weak substrates and non-substrates. Copyright (C) 2002 John Wiley & Sons, Ltd.
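The central question above, whether computed enthalpies track kinetic measurements, is a rank-correlation question. A minimal Spearman sketch (no tie handling; the paper does not specify its statistic, and the example data below are invented):

```python
# Spearman rank correlation: rank both series, then take the Pearson
# correlation of the ranks. Suitable when only the ordering of computed
# enthalpies vs measured kinetics matters, not their scales.
import math

def spearman_rho(xs, ys):
    """Rank correlation in [-1, 1]; ties are not handled here."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / math.sqrt(vx * vy)
```

A rho near -1 would mean more negative (more favourable) computed enthalpies go with faster measured turnover; the abstract reports the correlation is in fact weak, yet the enthalpies still separate good substrates from non-substrates.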
Abstract:
The explosive growth in biotechnology combined with major advances in information technology has the potential to radically transform immunology in the postgenomics era. Not only do we now have ready access to vast quantities of existing data, but new data with relevance to immunology are being accumulated at an exponential rate. Resources for computational immunology include biological databases and methods for data extraction, comparison, analysis and interpretation. Publicly accessible biological databases of relevance to immunologists number in the hundreds and are growing daily. The ability to efficiently extract and analyse information from these databases is vital for efficient immunology research. Most importantly, a new generation of computational immunology tools enables modelling of peptide transport by the transporter associated with antigen processing (TAP), modelling of antibody binding sites, identification of allergenic motifs and modelling of T-cell receptor serial triggering.
Abstract:
Allergy is a major cause of morbidity worldwide. The number of characterized allergens and related information is increasing rapidly creating demands for advanced information storage, retrieval and analysis. Bioinformatics provides useful tools for analysing allergens and these are complementary to traditional laboratory techniques for the study of allergens. Specific applications include structural analysis of allergens, identification of B- and T-cell epitopes, assessment of allergenicity and cross-reactivity, and genome analysis. In this paper, the most important bioinformatic tools and methods with relevance to the study of allergy have been reviewed.
Abstract:
The present exploratory-descriptive cross-national study focused on the career development of 11- to 14-yr.-old children, in particular whether they can match their personal characteristics with their occupational aspirations. Further, the study explored whether their matching may be explained in terms of a fit between person and environment using Holland's theory as an example. Participants included 511 South African and 372 Australian children. Findings relate to two items of the Revised Career Awareness Survey that require children to relate personal-social knowledge to their favorite occupation. Data were analyzed in three stages using descriptive statistics, i.e., mean scores, frequencies, and percentage agreement. The study indicated that children perceived their personal characteristics to be related to their occupational aspirations. However, how this matching takes place is not adequately accounted for in terms of a career theory such as that of Holland.
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
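Of the prediction methods listed above, the quantitative matrix is the simplest to illustrate: each residue at each peptide position contributes an additive score. A sketch with a toy matrix that is invented for illustration (real matrices are fitted to binding data); it only rewards the classic HLA-A*0201 anchor preferences at positions 2 and 9.

```python
# Quantitative (position-specific scoring) matrix sketch for 9-mer
# MHC-binding prediction. The matrix values below are made up.

def pssm_score(peptide, matrix):
    """Sum per-position residue scores; residues absent from the matrix
    contribute 0. `matrix` is one {residue: score} dict per position."""
    return sum(matrix[i].get(aa, 0.0) for i, aa in enumerate(peptide))

# Toy 9-mer matrix: Leu/Met anchor at P2, Val/Leu anchor at P9.
toy_matrix = [dict() for _ in range(9)]
toy_matrix[1] = {"L": 2.0, "M": 1.5}
toy_matrix[8] = {"V": 2.0, "L": 1.5}
```

Scoring the well-known HLA-A*0201-restricted epitope SLYNTVATL against this toy matrix gives 2.0 (L at P2) + 1.5 (L at P9) = 3.5. A real workflow, as the abstract advocates, would score every 9-mer of an antigen with a trained matrix, rank them, and validate only the top candidates in the laboratory.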
Abstract:
Given escalating concern worldwide about the loss of biodiversity, and given biodiversity's centrality to quality of life, it is imperative that current ecological knowledge fully informs societal decision making. Over the past two decades, ecological science has undergone many significant shifts in emphasis and perspective, which have important implications for how we manage ecosystems and species. In particular, a shift has occurred from the equilibrium paradigm to one that recognizes the dynamic, non-equilibrium nature of ecosystems. Revised thinking about the spatial and temporal dynamics of ecological systems has important implications for management. Thus, it is of growing concern to ecologists and others that these recent developments have not been translated into information useful to managers and policy makers. Many conservation policies and plans are still based on equilibrium assumptions. A fundamental difficulty with integrating current ecological thinking into biodiversity policy and management planning is that field observations have yet to provide compelling evidence for many of the relationships suggested by non-equilibrium ecology. Yet despite this scientific uncertainty, management and policy decisions must still be made. This paper was motivated by the need for considered scientific debate on the significance of current ideas in theoretical ecology for biodiversity conservation. This paper aims to provide a platform for such discussion by presenting a critical synthesis of recent ecological literature that (1) identifies core issues in ecological theory, and (2) explores the implications of current ecological thinking for biodiversity conservation.