829 results for Two Approaches
Abstract:
This paper presents a novel two-stage information filtering model which combines the merits of term-based and pattern-based approaches to effectively filter the sheer volume of information. In particular, the first filtering stage is supported by a novel rough analysis model which efficiently removes a large number of irrelevant documents, thereby addressing the overload problem. The second filtering stage is empowered by a semantically rich pattern taxonomy mining model which effectively fetches incoming documents according to the specific information needs of a user, thereby addressing the mismatch problem. Experiments have been conducted to compare the proposed two-stage filtering (T-SM) model with other possible "term-based + pattern-based" or "term-based + term-based" IF models. The results based on the RCV1 corpus show that the T-SM model significantly outperforms other types of "two-stage" IF models.
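A minimal, hypothetical sketch of how such a two-stage pipeline can be wired together is given below: a cheap term-based pass discards clearly irrelevant documents (the overload problem), and a pattern-based pass ranks the survivors (the mismatch problem). The scoring functions, threshold and toy data are illustrative assumptions, not the authors' rough analysis or pattern taxonomy mining models.

```python
# Sketch of a generic two-stage filter: a cheap term-based pass discards
# clearly irrelevant documents, then a pattern-based pass ranks the rest.
# Thresholds, scoring and data are illustrative only.
from collections import Counter

def term_score(doc_tokens, profile_terms):
    # Stage 1: overlap between document terms and the user-profile terms.
    counts = Counter(doc_tokens)
    return sum(counts[t] for t in profile_terms) / (len(doc_tokens) or 1)

def pattern_score(doc_tokens, patterns):
    # Stage 2: reward documents containing whole term patterns (contiguous
    # phrases stand in here for mined taxonomy patterns).
    text = " ".join(doc_tokens)
    return sum(weight for phrase, weight in patterns if " ".join(phrase) in text)

def two_stage_filter(docs, profile_terms, patterns, stage1_threshold=0.05):
    survivors = [d for d in docs if term_score(d, profile_terms) >= stage1_threshold]
    return sorted(survivors, key=lambda d: pattern_score(d, patterns), reverse=True)

docs = [["stock", "market", "crash", "fears"],
        ["football", "season", "opens"],
        ["market", "rally", "after", "stock", "rebound"]]
profile = {"stock", "market"}
patterns = [(("stock", "market"), 2.0), (("market", "rally"), 1.5)]
print(two_stage_filter(docs, profile, patterns))
```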
Abstract:
Stochastic models for competing clonotypes of T cells, formulated as multivariate, continuous-time, discrete-state Markov processes, have been proposed in the literature by Stirk, Molina-París and van den Berg (2008). A stochastic modelling framework is important because of rare events associated with small populations of some critical cell types. Usually, computational methods for these problems employ a trajectory-based approach based on Monte Carlo simulation. This is partly because the complementary probability density function (PDF) approaches can be expensive, but here we describe some efficient PDF approaches by directly solving the governing equations, known as the Master Equation. These computations are made very efficient through an approximation of the state space by the Finite State Projection and through the use of Krylov subspace methods when evaluating the matrix exponential. These computational methods allow us to explore the evolution of the PDFs associated with these stochastic models, and bimodal distributions arise in some parameter regimes. Time-dependent propensities naturally arise in immunological processes due to, for example, age-dependent effects. Incorporating time-dependent propensities into the framework of the Master Equation significantly complicates the corresponding computational methods, but here we describe an efficient approach via Magnus formulas. Although this contribution focuses on the example of competing clonotypes, the general principles are relevant to multivariate Markov processes and provide fundamental techniques for computational immunology.
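As a concrete illustration of the PDF approach, the sketch below solves the Master Equation of a toy one-species birth-death process by truncating the state space (a Finite State Projection) and applying the action of the matrix exponential. SciPy's expm_multiply stands in for a dedicated Krylov solver, and the rates are illustrative; this is not the two-clonotype model of Stirk et al.

```python
# Hedged sketch: solve a toy Master Equation by truncating the state space
# (Finite State Projection) and applying the matrix exponential to the
# initial distribution. One-species birth-death process with made-up rates.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import expm_multiply

N = 200                      # FSP truncation: states n = 0, ..., N
birth, death = 5.0, 0.1      # constant birth rate, per-capita death rate

A = lil_matrix((N + 1, N + 1))
for n in range(N + 1):
    if n < N:                          # birth: n -> n + 1
        A[n + 1, n] += birth
        A[n, n] -= birth
    if n > 0:                          # death: n -> n - 1
        A[n - 1, n] += death * n
        A[n, n] -= death * n

p0 = np.zeros(N + 1)
p0[0] = 1.0                            # start with zero cells
p_t = expm_multiply(A.tocsc() * 10.0, p0)   # distribution at t = 10

print("mass kept by truncation:", p_t.sum())          # close to 1 if N is large enough
print("mean copy number:", (np.arange(N + 1) * p_t).sum())
```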
Abstract:
This chapter focuses on the interactions between, and roles of, delays and intrinsic noise effects within cellular pathways and regulatory networks. We address these aspects by focusing on genetic regulatory networks that share a common network motif, namely the negative feedback loop, leading to oscillatory gene expression and protein levels. In this context, we discuss computational simulation algorithms for addressing the interplay of delays and noise within the signaling pathways based on biological data. We address implementation issues associated with efficiency and robustness. In a molecular biology setting, we present two case studies of temporal models for the Hes1 gene (Monk, 2003; Hirata et al., 2002), known to act as a molecular clock, and the Her1/Her7 regulatory system controlling the periodic somite segmentation in vertebrate embryos (Giudicelli and Lewis, 2004; Horikawa et al., 2006).
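The kind of simulation algorithm discussed here can be sketched as a delay-aware Gillespie scheme: transcription events are initiated stochastically but only complete after a fixed delay. The toy implementation below simulates a single delayed negative feedback loop with illustrative parameters; it is not the fitted Hes1 or Her1/Her7 model from the cited studies.

```python
# Hedged sketch of a delayed stochastic simulation for a negative feedback
# loop: protein P represses transcription of its own mRNA M, and each
# initiated transcript completes only after a fixed delay tau.
import heapq, math, random

def delayed_ssa(t_end=500.0, tau=20.0, seed=1):
    random.seed(seed)
    M, P = 0, 0
    t, pending = 0.0, []        # pending completion times of delayed transcriptions
    history = []
    k_m, k_p, d_m, d_p, P0, h = 1.0, 2.0, 0.03, 0.03, 100.0, 4
    while t < t_end:
        a = [k_m / (1.0 + (P / P0) ** h),   # transcription initiation (delayed)
             k_p * M,                        # translation
             d_m * M,                        # mRNA decay
             d_p * P]                        # protein decay
        a0 = sum(a)
        dt = -math.log(1.0 - random.random()) / a0
        # If a previously initiated transcript finishes first, apply it and resample.
        if pending and pending[0] <= t + dt:
            t = heapq.heappop(pending)
            M += 1
        else:
            t += dt
            r, acc = random.random() * a0, 0.0
            for i, ai in enumerate(a):
                acc += ai
                if r <= acc:
                    if i == 0:
                        heapq.heappush(pending, t + tau)  # schedule delayed completion
                    elif i == 1:
                        P += 1
                    elif i == 2:
                        M -= 1
                    else:
                        P -= 1
                    break
        history.append((t, M, P))
    return history

traj = delayed_ssa()
print(traj[-1])   # final (time, mRNA, protein) state of one trajectory
```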
Abstract:
Over the last two decades, particularly in Australia and the UK, the doctoral landscape has changed considerably, with increasingly hybridised approaches to methodologies and research strategies as well as greater choice of examinable outputs. This paper provides an overview of doctoral practices that are emerging in the context of the creative industries, with a focus on practice-led approaches within the Doctor of Philosophy and recent developments in professional doctorates, from a predominantly Australian perspective. In interrogating what constitutes ‘doctorateness’ in this context, the paper examines some of the diverse theoretical principles which foreground the practitioner/researcher, methodological approaches that incorporate tacit knowledge and reflective practice together with qualitative strategies, blended learning delivery modes, and flexible doctoral outputs, and considers how these are shaping this shifting environment. The paper concludes with a study of the Doctor of Creative Industries at Queensland University of Technology as one model of an interdisciplinary professional research doctorate.
Abstract:
This study seeks to analyse the adequacy of the current regulation of the payday lending industry in Australia, and consider whether there is a need for additional regulation to protect consumers of these services. The report examines the different regulatory approaches adopted in comparable OECD countries, and reviews alternative models for payday regulation, in particular, the role played by responsible lending. The study also examines the consumer protection mechanisms now in existence in Australia in the National Consumer Credit Protection Act 2009 (Cth) (NCCP) and the National Credit Code (NCC) contained in Schedule 1 of that Act and in the Australian Securities and Investments Commission Act 2001 (Cth).
Abstract:
Unstructured text data, such as emails, blogs, contracts, academic publications, organizational documents, transcribed interviews, and even tweets, are important sources of data in Information Systems research. Various forms of qualitative analysis of the content of these data exist and have revealed important insights. Yet, to date, these analyses have been hampered by limitations of human coding of large data sets, and by bias due to human interpretation. In this paper, we compare and combine two quantitative analysis techniques to demonstrate the capabilities of computational analysis for content analysis of unstructured text. Specifically, we seek to demonstrate how two quantitative analytic methods, viz., Latent Semantic Analysis and data mining, can aid researchers in revealing core content topic areas in large (or small) data sets, and in visualizing how these concepts evolve, migrate, converge or diverge over time. We exemplify the complementary application of these techniques through an examination of a 25-year sample of abstracts from selected journals in Information Systems, Management, and Accounting disciplines. Through this work, we explore the capabilities of two computational techniques, and show how these techniques can be used to gather insights from a large corpus of unstructured text.
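As a rough illustration of the LSA side of such an analysis, the sketch below builds TF-IDF vectors for a few toy abstracts, reduces them with truncated SVD, and lists the top terms per latent topic. The sample texts and the number of topics are assumptions for illustration only, not the journals' corpus used in the study.

```python
# Hedged sketch of Latent Semantic Analysis over a tiny toy corpus:
# TF-IDF vectors reduced by truncated SVD, then top terms per topic.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

abstracts = [
    "information filtering with term based user profiles",
    "pattern taxonomy mining for relevance decision making",
    "stochastic models of gene regulation and intrinsic noise",
    "master equation methods for cell population dynamics",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(abstracts)            # documents x terms

svd = TruncatedSVD(n_components=2, random_state=0)
doc_topics = svd.fit_transform(X)             # documents x latent topics

terms = tfidf.get_feature_names_out()
for k, component in enumerate(svd.components_):
    top = component.argsort()[::-1][:5]
    print(f"topic {k}:", [terms[i] for i in top])
```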
Abstract:
Information mismatch and overload are two fundamental issues influencing the effectiveness of information filtering systems. Even though both term-based and pattern-based approaches have been proposed to address the issues, neither of these approaches alone can provide a satisfactory decision for determining the relevant information. This paper presents a novel two-stage decision model for solving the issues. The first stage is a novel rough analysis model to address the overload problem. The second stage is a pattern taxonomy mining model to address the mismatch problem. The experimental results on RCV1 and TREC filtering topics show that the proposed model significantly outperforms the state-of-the-art filtering systems.
Abstract:
Virtual environments can provide, through digital games and online social interfaces, extremely exciting forms of interactive entertainment. Because of their capability in displaying and manipulating information in natural and intuitive ways, such environments have found extensive applications in decision support, education and training in the health and science domains, amongst others. Currently, the burden of validating both the interactive functionality and visual consistency of virtual environment content is carried entirely by developers and play-testers. While considerable research has been conducted in assisting the design of virtual world content and mechanics, to date, only limited contributions have been made regarding the automatic testing of the underpinning graphics software and hardware. The aim of this thesis is to determine whether the correctness of the images generated by a virtual environment can be quantitatively defined, and automatically measured, in order to facilitate the validation of the content. In an attempt to provide an environment-independent definition of visual consistency, a number of classification approaches were developed. First, a novel model-based object description was proposed in order to enable reasoning about the color and geometry change of virtual entities during a play-session. From such an analysis, two view-based connectionist approaches were developed to map from geometry and color spaces to a single, environment-independent, geometric transformation space; we used such a mapping to predict the correct visualization of the scene. Finally, an appearance-based aliasing detector was developed to show how incorrectness, too, can be quantified for debugging purposes. Since computer games heavily rely on the use of highly complex and interactive virtual worlds, they provide an excellent test bed against which to develop, calibrate and validate our techniques. Experiments were conducted on a game engine and other virtual world prototypes to determine the applicability and effectiveness of our algorithms. The results show that quantifying visual correctness in virtual scenes is a feasible enterprise, and that effective automatic bug detection can be performed through the techniques we have developed. We expect these techniques to find application in large 3D games and virtual world studios that require a scalable solution to testing their virtual world software and digital content.
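The thesis's model-based and connectionist detectors are not reproduced here; the sketch below only illustrates the more basic idea that visual correctness can be given a number, by comparing a rendered frame against a reference frame and computing a crude high-frequency "jaggedness" proxy for aliasing. The image shapes, the simulated defect and both metrics are illustrative assumptions.

```python
# Minimal sketch of quantifying visual consistency: per-pixel error against a
# reference frame plus a crude gradient-based proxy for aliasing. Stand-ins
# for the thesis's detectors, not a reimplementation of them.
import numpy as np

def frame_error(rendered: np.ndarray, reference: np.ndarray) -> float:
    # Mean absolute per-pixel difference, normalised to [0, 1].
    return float(np.abs(rendered.astype(float) - reference.astype(float)).mean() / 255.0)

def jaggedness(frame: np.ndarray) -> float:
    # Rough proxy for aliasing: average gradient magnitude of the grey image.
    gray = frame.astype(float).mean(axis=2)
    gx = np.abs(np.diff(gray, axis=1)).mean()
    gy = np.abs(np.diff(gray, axis=0)).mean()
    return float(gx + gy)

reference = np.zeros((64, 64, 3), dtype=np.uint8)
rendered = reference.copy()
rendered[30:34, :, :] = 255          # simulated rendering defect: a bright bar

print("pixel error:", frame_error(rendered, reference))
print("jaggedness:", jaggedness(rendered))
```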
Abstract:
The literature supporting the notion that active, student-centered learning is superior to passive, teacher-centered instruction is encyclopedic (Bonwell & Eison, 1991; Bruning, Schraw, & Ronning, 1999; Haile, 1997a, 1997b, 1998; Johnson, Johnson, & Smith, 1999). Previous action research demonstrated that introducing a learning activity in class improved the learning outcomes of students (Mejias, 2010). People acquire knowledge and skills through practice and reflection, not by watching and listening to others telling them how to do something. In this context, this project aims to gain more insight into the level of interactivity a class curriculum should have and its alignment with assessment, so that the intended learning outcomes (ILOs) are achieved. In this project, interactivity is implemented in the form of problem-based learning (PBL). I present the argument that more continuous formative feedback, when implemented with the correct amount of PBL, stimulates student engagement, bringing enormous benefits to student learning. Different levels of practical work (PBL) were implemented together with two different assessment approaches in two subjects. The outcomes were measured using qualitative and quantitative data to evaluate the levels of student engagement and satisfaction in terms of the ILOs.
Abstract:
Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice-based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental setups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment are used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as with lattice-based models, but is an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
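A hypothetical minimal sketch of a lattice-free model with crowding is given below: cells are points in a square domain, each attempts to proliferate at a constant rate, and the attempt is aborted if the daughter would sit closer than one cell diameter to any existing cell, so the confluent density emerges rather than being imposed. The rate, domain size and cell diameter are illustrative, not the fitted experimental values.

```python
# Hedged sketch of a lattice-free individual-based growth model with
# cell-to-cell crowding. Proliferation attempts that would place a daughter
# cell within one diameter of an existing cell are aborted.
import math, random

def simulate(lam=0.05, diameter=1.0, L=20.0, n0=10, t_end=100.0, dt=0.1, seed=0):
    random.seed(seed)
    cells = [(random.uniform(0, L), random.uniform(0, L)) for _ in range(n0)]
    t, counts = 0.0, [(0.0, n0)]
    min_sep2 = (0.999 * diameter) ** 2   # small tolerance so the parent spacing itself passes
    while t < t_end:
        for (x, y) in list(cells):
            if random.random() < lam * dt:          # proliferation attempt
                angle = random.uniform(0, 2 * math.pi)
                nx, ny = x + diameter * math.cos(angle), y + diameter * math.sin(angle)
                if not (0 <= nx <= L and 0 <= ny <= L):
                    continue
                # Crowding: abort if the daughter would overlap any existing cell.
                if all((nx - cx) ** 2 + (ny - cy) ** 2 >= min_sep2 for cx, cy in cells):
                    cells.append((nx, ny))
        t += dt
        counts.append((t, len(cells)))
    return counts

growth = simulate()
print("final population:", growth[-1])   # confluent density emerges from the geometry
```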
Abstract:
This study explored the relationship between student approaches to learning, teaching methods, and critical thinking in two business units. Key findings included differences in critical thinking scores between student approaches to learning and some evidence of an interaction between student approaches to learning and the critical thinking teaching method (immersion vs. infusion). Possible explanations for the results are examined and implications for developing critical thinking skills across a degree are discussed. What is apparent is that as universities move towards program-wide assessment of critical thinking, further work is required in terms of the design of critical thinking teaching interventions and assessment at the unit, school, and degree level. The session will discuss the challenges in developing critical thinking programs in individual units and at the Faculty level.
Abstract:
The discovery of protein variation is an important strategy in disease diagnosis within the biological sciences. The current benchmark for elucidating information from multiple biological variables is the so-called “omics” disciplines of the biological sciences. Such variability is uncovered by the implementation of multivariable data mining techniques which fall under two primary categories: machine learning strategies and statistically based approaches. Typically, proteomic studies can produce hundreds or thousands of variables, p, per observation, n, depending on the analytical platform or method employed to generate the data. Many classification methods cannot cope with data where n≪p and, as such, require pre-treatment to reduce the dimensionality prior to classification. Recently, machine learning techniques have gained popularity in the field for their ability to successfully classify unknown samples. One limitation of such methods is the lack of a functional model allowing meaningful interpretation of results in terms of the features used for classification. This is a problem that might be solved using a statistical model-based approach where not only is the importance of each individual protein made explicit, but the proteins are also combined into a readily interpretable classification rule without relying on a black-box approach. Here we incorporate the statistical dimension-reduction techniques Partial Least Squares (PLS) and Principal Components Analysis (PCA), followed by both statistical and machine learning classification methods, and compare them to a popular machine learning technique, Support Vector Machines (SVM). Both PLS and SVM demonstrate strong utility for proteomic classification problems.
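The sketch below mirrors this kind of comparison on synthetic data with far more variables than samples (n ≪ p): PCA- and PLS-based dimension reduction followed by a simple classifier, set against a linear support vector machine. The synthetic data, component counts and classifiers are illustrative assumptions, not the study's actual proteomic data or settings.

```python
# Hedged sketch: dimension reduction (PCA, PLS) plus a simple classifier,
# compared with a linear SVM, on synthetic n << p data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n, p = 60, 500                       # 60 samples, 500 "protein" variables
y = np.repeat([0, 1], n // 2)
X = rng.normal(size=(n, p))
X[y == 1, :10] += 1.0                # 10 informative variables

pca_clf = make_pipeline(PCA(n_components=5), LogisticRegression(max_iter=1000))
svm_clf = SVC(kernel="linear")

print("PCA + logistic:", cross_val_score(pca_clf, X, y, cv=5).mean())
print("linear SVM:    ", cross_val_score(svm_clf, X, y, cv=5).mean())

# PLS scores can be classified the same way (for brevity, PLS is fitted on all
# of X here; in practice it would be refit inside each cross-validation fold).
pls = PLSRegression(n_components=5)
scores = pls.fit(X, y).transform(X)
print("logistic on PLS scores:",
      cross_val_score(LogisticRegression(max_iter=1000), scores, y, cv=5).mean())
```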
Abstract:
The role of the judiciary in common law systems is to create law, interpret law and uphold the law. As such, decisions by courts on matters related to ecologically sustainable development, natural resource use and management, and climate change make an important contribution to earth jurisprudence. There are examples where judicial decisions further the goals of earth jurisprudence and examples where decisions go against the principles of earth jurisprudence. This presentation will explore judicial approaches to standing in Australia and America. The paper will explore two trends in each jurisdiction. Approaches by American courts to standing will be examined with reference to climate change and environmental justice litigation, while Australian approaches to standing will be examined in the context of public interest litigation and environmental criminal negligence cases. The presentation will draw some conclusions about the role of standing in each of these cases and the implications of this for earth jurisprudence.
Abstract:
Many older people have difficulties using modern consumer products due to increased product complexity, both in terms of functionality and interface design. Previous research has shown that older people have more difficulty in using complex devices intuitively when compared to younger people. Furthermore, increased life expectancy and a falling birth rate have been catalysts for changes in world demographics over the past two decades. This trend also suggests a proportional increase of older people in the work-force. This realisation has led to research on the effective use of technology by older populations in an effort to engage them more productively and to assist them in leading independent lives. Ironically, not enough attention has been paid to the development of interaction design strategies that would actually enable older users to better exploit new technologies. Previous research suggests that if products are designed to reflect people's prior knowledge, they will appear intuitive to use. Since intuitive interfaces utilise domain-specific prior knowledge of users, they require minimal learning for effective interaction. However, older people are very diverse in their capabilities and domain-specific prior knowledge. In addition, ageing also slows down the process of acquiring new knowledge. Keeping these suggestions and limitations in view, the aim of this study was to investigate possible approaches to developing interfaces that facilitate their intuitive use by older people. In this quest to develop intuitive interfaces for older people, two experiments were conducted that systematically investigated redundancy (the use of both text and icons) in interface design, complexity of interface structure (nested versus flat), and personal user factors such as cognitive abilities, perceived self-efficacy and technology anxiety. All of these factors could interfere with intuitive use. The results from the first experiment suggest that, contrary to what was hypothesised, older people (65+ years) completed the tasks on the text-only interface design faster than on the redundant interface design. The outcome of the second experiment showed that, as expected, older people took more time on a nested interface. However, they did not make significantly more errors compared with younger age groups. Contrary to what was expected, older age groups also did better under anxious conditions. The findings of this study also suggest that older age groups are more heterogeneous in their capabilities, and that their intuitive use of contemporary technological devices is mediated more by domain-specific technology prior knowledge and by their cognitive abilities than by chronological age. This makes it extremely difficult to develop product interfaces that are entirely intuitive to use. However, keeping the cognitive limitations of older people in view when interfaces are developed, and using simple text-based interfaces with a flat interface structure, would help them intuitively learn and use complex technological products successfully during early encounters with a product. These findings indicate that it might be more pragmatic to design interfaces for intuitive learning rather than for intuitive use. Based on this research and the existing literature, a model for adaptable interface design was proposed as a strategy for developing intuitively learnable product interfaces.
An adaptable interface can initially use a simple text-only interface to help older users learn and successfully use the new system. Over time, this can be progressively changed to a symbol-based nested interface for more efficient and intuitive use.
Abstract:
The robust and diversely useful isoindoline nitroxide, 5-carboxy-1,1,3,3-tetramethylisoindolin-2-yloxyl (1; CTMIO), has previously been synthesised in low-to-moderate yields from phthalic anhydride (3). Recent interest in its biological potential as a potent antioxidant and in other areas has seen an increased demand for its production. Herein, three new synthetic routes to CTMIO are presented and their efficiencies assessed. Two routes, via the nitrile 9 and the formyl compound 11, derive from 5-bromo-1,1,3,3-tetramethylisoindoline (6). The third approach starts from the readily accessible starting material, 4-methylphthalic anhydride (12), and proceeds by a methylarene oxidation with potassium permanganate. The three new approaches yield CTMIO in comparable overall yields (16–18 %); however, the synthetic efficiency is most improved when employing the nitrile intermediate 9.