82 results for categorization IT PFC computational neuroscience model HMAX
at Queensland University of Technology - ePrints Archive
Abstract:
This paper develops a framework for classifying term dependencies in query expansion with respect to the role terms play in structural linguistic associations. The framework is used to classify and compare the query expansion terms produced by the unigram and positional relevance models. As the unigram relevance model does not explicitly model term dependencies in its estimation process, it is often thought to ignore dependencies that exist between words in natural language. The framework presented in this paper is underpinned by two types of linguistic association, namely syntagmatic and paradigmatic associations. It was found that syntagmatic associations were the more prevalent form of linguistic association in query expansion. Paradoxically, it was the unigram model that exhibited this association more than the positional relevance model. This surprising finding has two potential implications for information retrieval models: (1) if linguistic associations underpin query expansion, then a probabilistic term dependence assumption based on position is inadequate for capturing them; (2) the unigram relevance model captures more term dependency information than its underlying theoretical model suggests, so its normative position as a baseline that ignores term dependencies should perhaps be reviewed.
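The unigram relevance model discussed above scores candidate expansion terms by P(w|R) ≈ Σ_D P(w|D)·P(Q|D), summing over feedback documents. The following is a minimal illustrative sketch of that estimation, not the authors' implementation; the function name and the choice of Dirichlet smoothing are assumptions:

```python
from collections import Counter

def unigram_relevance_model(query, docs, mu=2000):
    """Score expansion terms with RM1: P(w|R) ~ sum_D P(w|D) * P(Q|D).
    docs is a list of token lists; document language models are
    Dirichlet-smoothed against the collection model."""
    # Collection language model used for smoothing
    coll = Counter(t for d in docs for t in d)
    coll_len = sum(coll.values())

    def p_w_d(w, d, d_counts):
        return (d_counts[w] + mu * coll[w] / coll_len) / (len(d) + mu)

    scores = Counter()
    for d in docs:
        d_counts = Counter(d)
        # Query likelihood under this document's language model
        p_q_d = 1.0
        for q in query:
            p_q_d *= p_w_d(q, d, d_counts)
        # Each document votes for its terms, weighted by query likelihood
        for w in d_counts:
            scores[w] += p_w_d(w, d, d_counts) * p_q_d
    total = sum(scores.values())
    return {w: s / total for w, s in scores.items()}
```

The top-ranked terms of such a distribution are the expansion terms whose linguistic associations (syntagmatic vs. paradigmatic) the paper's framework then classifies.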
Abstract:
Computational neuroscience aims to elucidate the mechanisms of neural information processing and population dynamics through a methodology of incorporating biological data into complex mathematical models. Existing simulation environments each model at a particular level of detail; none allows a multi-level approach to neural modelling. Moreover, most are not engineered to produce compute-efficient solutions, an important issue because insufficient processing power is a major impediment in the field. This project aims to apply modern software engineering techniques to create a flexible, high-performance neural modelling environment that will allow rigorous exploration of model parameter effects and modelling at multiple levels of abstraction.
Abstract:
Oscillations of neural activity may bind widespread cortical areas into a neural representation that encodes disparate aspects of an event. In order to test this theory, we have turned to data collected from complex partial epilepsy (CPE) patients with chronically implanted depth electrodes. Data from regions critical to word and face information processing were analyzed using spectral coherence measurements. Similar analyses of intracranial EEG (iEEG) during seizure episodes display HippoCampal Formation (HCF)-NeoCortical (NC) spectral coherence patterns that are characteristic of specific seizure stages (Klopp et al. 1996). We are now building a computational memory model to examine whether spatio-temporal patterns of human iEEG spectral coherence emerge in a computer simulation of HCF cellular distribution, membrane physiology and synaptic connectivity. Once the model is reasonably scaled, it will be used as a tool to explore neural parameters that are critical to memory formation and epileptogenesis.
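Spectral coherence of the kind used in this analysis quantifies frequency-band coupling between two signals. A hedged sketch using synthetic data standing in for an HCF and an NC channel; the sampling rate, band limits, and segment length are assumptions, not values from the study:

```python
import numpy as np
from scipy.signal import coherence

fs = 200.0  # assumed iEEG sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Two synthetic channels sharing a 6 Hz (theta-band) component plus noise
shared = np.sin(2 * np.pi * 6 * t)
hcf = shared + 0.5 * rng.standard_normal(t.size)
nc = shared + 0.5 * rng.standard_normal(t.size)

# Magnitude-squared coherence, Welch-averaged over 2 s segments
f, cxy = coherence(hcf, nc, fs=fs, nperseg=int(2 * fs))

# Coherence peaks in the band the channels actually share
theta = cxy[(f >= 4) & (f <= 8)].max()
```

High coherence in a band shared across sites, against a low broadband baseline, is the signature such analyses look for in stage-specific seizure patterns.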
Abstract:
Organizations generally are not responding effectively to rising IT security threats because people issues receive inadequate attention. The stark example of IT security is just the latest strategic IT priority demonstrating deficient IT leadership attention to the social dimension of IT. Universities in particular, with their devolved organizational structures, diverse adoption of IT, and split central/local federated approach to the governance and leadership of IT, demand higher levels of interpersonal sophistication and strategic engagement from their IT leaders. An idealized model of the IT leader for the twenty-first-century university is proposed as a framework for further investigation, to be tested in an action research study.
Abstract:
This action research examines the enhancement of visual communication within the architectural design studio through physical model making. 'It is through physical model making that designers explore their conceptual ideas and develop the creation and understanding of space' (Salama & Wilkinson 2007:126). This research supplements Crowther's findings, extending the understanding of visual dialogue to include physical models. 'Architecture Design 8' is the final core design unit at QUT in the fourth year of the Bachelor of Design Architecture. At this stage it is essential that students have the ability to communicate their ideas in a comprehensive manner, relying on a combination of skill sets including drawing, physical model making, and computer modeling. Observations within this research indicate that students did not integrate these skill sets in the design process through the first half of the semester, focusing primarily on drawing and computer modeling. The challenge was to promote deeper learning through physical model making. This research addresses one of the primary reasons for the lack of physical model making: the limited assessment emphasis on physical models. The unit was modified midway through the semester to better correlate the lecture theory with studio activities by incorporating a series of model making exercises conducted during studio time. The outcome of each exercise was assessed. Tutors were surveyed regarding the model making activities, and a focus group was conducted to obtain formal feedback from students. Students and tutors recognised the added value in communicating design ideas through physical forms and model making. The studio environment was invigorated by the enhanced learning outcomes of the students who participated in the model making exercises. The conclusions of this research will guide the structure of the upcoming iteration of the fourth-year design unit.
Abstract:
This article examines the current transfer pricing regime to consider whether it is a sound model to be applied to modern multinational entities. The arm's length price methodology is examined to enable a discussion of the arguments in favour of such a regime. The article then refutes these arguments, concluding that, contrary to the very reason multinational entities exist, applying arm's length rules involves a legal fiction of imagining transactions between unrelated parties. Multinational entities exist to operate in ways that independent entities would not, which the arm's length rules fail to take into account. As such, there is clearly an air of artificiality in applying the arm's length standard. To demonstrate this artificiality with respect to modern multinational entities, multinational banks are used as an example. The article concludes that the separate entity paradigm adopted by the traditional transfer pricing regime is incongruous with the economic theory of modern multinational enterprises.
Abstract:
Successful firms use business model innovation to rethink the way they do business and transform industries. However, current research on business model innovation lacks theoretical underpinnings and is in need of new insights. The objective of this paper is to advance our understanding of both the business model concept and business model innovation based on service logic as the foundation for customer value and value creation. We present and discuss a rationale for business models based on 'service logic', with service as a value-supporting process, and compare it with a business model based on 'goods logic', with goods as value-supporting resources. The implications for each of the business model dimensions (customer, value proposition, organizational architecture and revenue model) are described and discussed in detail.
Abstract:
Design Science Research (DSR) has emerged as an important approach in Information Systems (IS) research. However, DSR is still in its genesis and has yet to achieve consensus on even fundamentals such as what methodology/approach to use for DSR. While there has been much effort to establish DSR methodologies, a complete, holistic and validated approach for the conduct of DSR to guide IS researchers (especially novice researchers) is yet to be established. Alturki et al. (2011) present a DSR ‘Roadmap’, making the claim that it is a complete and comprehensive guide for conducting DSR. This paper aims to further assess this Roadmap by positioning it against the ‘Idealized Model for Theory Development’ (IM4TD) (Fischer & Gregor 2011). The IM4TD highlights the role of discovery and justification and the forms of reasoning used to progress theory development. Fischer and Gregor (2011) applied IM4TD’s hypothetico-deductive method to analyze DSR methodologies; that method is adopted in this study to deductively validate the Alturki et al. (2011) Roadmap. The results suggest that the Roadmap adheres to the IM4TD, is reasonably complete, overcomes most shortcomings identified in other DSR methodologies, and also highlights valuable refinements that should be considered within the IM4TD.
Abstract:
In microscopic traffic simulators, the interaction between vehicles is considered. The dynamics of the system then becomes an emergent property of the interaction between its components. Such interactions include lane-changing, car-following behaviours and intersection management. Although, in some cases, such simulators produce realistic predictions, they do not allow for an important aspect of the dynamics: the driver-vehicle interaction. This paper introduces a physically sound vehicle-driver model for realistic microscopic simulation. By building a nanoscopic traffic simulation model that uses steering angle and throttle position as parameters, the model aims to overcome the unrealistic acceleration and deceleration values found in various microscopic simulation tools. A physics engine calculates the driving force of the vehicle, and the preliminary results presented here show that, through a realistic driver-vehicle-environment simulator, it becomes possible to model realistic driver and vehicle behaviours in a traffic simulation.
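The idea of driving the simulation from throttle position and deriving motion through a physics engine can be illustrated with a minimal longitudinal vehicle model. All parameters and names below are illustrative assumptions, not the paper's model:

```python
from dataclasses import dataclass

@dataclass
class Vehicle:
    # Illustrative parameters (assumed, not from the paper)
    mass: float = 1200.0             # kg
    max_drive_force: float = 4000.0  # N at full throttle
    drag_coeff: float = 0.4          # lumped drag, N/(m/s)^2
    v: float = 0.0                   # current speed, m/s

    def step(self, throttle: float, dt: float = 0.05) -> float:
        """Advance speed by one time step; throttle in [0, 1].
        Driving force scales with throttle; drag opposes motion."""
        throttle = min(max(throttle, 0.0), 1.0)
        force = throttle * self.max_drive_force - self.drag_coeff * self.v ** 2
        self.v = max(0.0, self.v + (force / self.mass) * dt)
        return self.v
```

Because acceleration is derived from force and mass rather than prescribed directly, it stays within a physically plausible bound (here at most max_drive_force/mass ≈ 3.3 m/s²), which is the kind of constraint the nanoscopic approach uses to avoid the unrealistic acceleration values seen in some microscopic tools.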
Abstract:
From human biomonitoring data that are increasingly collected in the United States, Australia, and other countries from large-scale field studies, we obtain snapshots of the concentration levels of various persistent organic pollutants (POPs) within a cross section of the population at different times. Not only can we observe trends within this population over time, but we can also gain information going beyond the obvious time trends. By combining the biomonitoring data with pharmacokinetic modeling, we can reconstruct the time-variant exposure to individual POPs, determine their intrinsic elimination half-lives in the human body, and predict future levels of POPs in the population. Different approaches have been employed to extract information from human biomonitoring data. Pharmacokinetic (PK) models have been combined with longitudinal data [1], with single [2] or multiple [3] average concentrations from cross-sectional data (CSD), or with multiple CSD with or without empirical exposure data [4]. In the latter study, for the first time, the authors based their modeling outputs on two sets of CSD together with empirical exposure data, so that their model outputs were further constrained by an extensive body of empirical measurements. Here we use a PK model to analyze recent levels of PBDE concentrations measured in the Australian population. In this study, we are able to base our model results on four sets [5-7] of CSD; we focus on two PBDE congeners that have been shown [3,5,8-9] to differ in intake rates and half-lives, with BDE-47 being associated with high intake rates and a short half-life and BDE-153 with lower intake rates and a longer half-life. By fitting the model to PBDE levels measured in different age groups in different years, we determine the level of intake of BDE-47 and BDE-153, as well as the half-lives of these two chemicals in the Australian population.
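The reconstruction rests on a one-compartment mass balance, dB/dt = intake − k·B with elimination rate k = ln 2 / t½. A sketch with purely illustrative intake rates and half-lives, not the study's fitted values:

```python
import math

def simulate_burden(intake_ng_per_day, half_life_years, years=30, dt_days=1.0):
    """One-compartment PK sketch: dB/dt = intake - k*B, k = ln2 / t_half.
    Returns the body-burden trajectory (ng) by forward Euler integration.
    Parameters are illustrative, not fitted values from the study."""
    k = math.log(2) / (half_life_years * 365.0)
    burden, traj = 0.0, []
    for _ in range(int(years * 365 / dt_days)):
        burden += (intake_ng_per_day - k * burden) * dt_days
        traj.append(burden)
    return traj

# Illustrative contrast: a BDE-47-like congener (high intake, short
# half-life) versus a BDE-153-like one (low intake, long half-life)
bde47 = simulate_burden(intake_ng_per_day=50.0, half_life_years=1.5)
bde153 = simulate_burden(intake_ng_per_day=10.0, half_life_years=7.0)
```

With a short half-life, the BDE-47-like burden equilibrates at intake/k within a few years, while the BDE-153-like congener is still approaching steady state after decades; fitting such curves to cross-sectional levels across age groups is what yields the intake and half-life estimates.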
Abstract:
Articular cartilage is a complex structure in which fluid-swollen proteoglycans are constrained within a 3D network of collagen fibrils. Because of this structural complexity, the relationship between the tissue's mechanical behaviour at the macroscale and its components at the microscale is not completely understood. The research objective of this thesis is to create a new model of articular cartilage that can be used to simulate, and gain insight into, the micro-macro interactions and mechanisms underlying its mechanical response during physiological function. The new model has two distinguishing characteristics: i) it does not use a fibre-reinforced composite material idealisation; and ii) it provides a framework for probing the micro-mechanism of the fluid-solid interaction underlying the deformation of articular cartilage, using simple rules of repartition instead of constitutive/physical laws and intuitive curve-fitting. Although various microstructural and mechanical behaviours could be studied, the scope of this thesis is limited to osmotic pressure formation and distribution and their influence on cartilage fluid diffusion and percolation, which in turn govern the deformation of the compression-loaded tissue. The study can be divided into two stages. In the first stage, the distributions and concentrations of proteoglycans, collagen and water were investigated using histological protocols. Based on this, the structure of cartilage was conceptualised as microscopic osmotic units consisting of these constituents, distributed according to the histological results. These units were repeated three-dimensionally to form the structural model of articular cartilage.
In the second stage, cellular automata were incorporated into the resulting matrix (lattice) to simulate the osmotic pressure of the fluid and the movement of water within and out of the matrix, following the osmotic pressure gradient in accordance with the chosen rule of repartition of the pressure. The outcome of this study is a new model of articular cartilage that can be used to simulate and study the micromechanical behaviour of cartilage under different conditions of health and loading. These behaviours are illuminated at the microscale using the so-called neighbourhood rules developed in the thesis in accordance with the typical requirements of cellular automata modelling. Using these rules and relevant boundary conditions to simulate pressure distribution and related fluid motion produced significant results that provide the following insights into the relationships between the osmotic pressure gradient, the associated fluid micro-movement, and the deformation of the matrix: 1. It is possible to model articular cartilage with the agent-based model of cellular automata and the Margolus neighbourhood rule. 2. The concept of 3D interconnected osmotic units is a viable structural model for the extracellular matrix of articular cartilage. 3. Different rules of osmotic pressure advection lead to different patterns of deformation in the cartilage matrix, giving insight into how this micromechanism influences macromechanical deformation. 4. When features such as the transition coefficient (representing permeability) are altered through changes in the concentrations of collagen and proteoglycans (i.e. degenerative conditions), the deformation process is affected. 5. The boundary conditions also influence the relationship between the osmotic pressure gradient and fluid movement at the microscale.
These outcomes are important to cartilage research because they can be used to study microscale damage in the cartilage matrix. From this, related diseases and their progression can be monitored, offering potential insight into drug-cartilage interaction for treatment. This model is an incremental step towards further computational modelling approaches for cartilage research and for other fluid-saturated tissues and material systems.
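The Margolus neighbourhood rule mentioned above partitions the lattice into 2x2 blocks whose alignment alternates every step, so that quantities can propagate across block boundaries. A toy sketch follows; the repartition rule used here (equalising pressure within each block) and the periodic boundaries are illustrative assumptions, not the thesis's rules:

```python
import numpy as np

def margolus_step(grid, parity):
    """One Margolus-neighbourhood update: partition the lattice into 2x2
    blocks (offset by one cell on odd steps) and, as a toy rule of
    repartition, equalise the fluid content within each block.
    np.roll implies periodic (wrap-around) boundaries."""
    g = np.roll(grid, shift=(-parity, -parity), axis=(0, 1))
    n = g.shape[0]
    for i in range(0, n, 2):
        for j in range(0, n, 2):
            block = g[i:i + 2, j:j + 2]
            block[:] = block.mean()  # redistribute toward equal pressure
    return np.roll(g, shift=(parity, parity), axis=(0, 1))

# Toy lattice: a pressurised central region diffusing into its surroundings
grid = np.zeros((16, 16))
grid[6:10, 6:10] = 1.0  # high osmotic pressure at the centre
total = grid.sum()
for step in range(50):
    grid = margolus_step(grid, parity=step % 2)
```

Two properties make this partitioning scheme attractive for fluid modelling: the update conserves the total quantity exactly (each block only redistributes its own content), and the alternating offsets let pressure gradients drive transport across the whole lattice.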
Abstract:
Responding to the global and unprecedented challenge of capacity building for twenty-first century life, this book is a practical guide for tertiary education institutions to quickly and effectively renew the curriculum towards education for sustainable development. The book begins by exploring why curriculum change has been so slow. It then describes a model for rapid curriculum renewal, highlighting the important roles of setting timeframes, formal and informal leadership, and key components and action strategies. The second part of the book provides detailed coverage of six core elements that have been trialled and peer reviewed by institutions around the world:
- raising awareness among staff and students
- mapping graduate attributes
- auditing the curriculum
- developing niche degrees, flagship courses and fully integrated programs
- engaging and catalysing community and student markets
- integrating curriculum with green campus operations.
With input from more than seventy academics and grounded in engineering education experiences, this book will provide academic staff with tools and insights to rapidly align program offerings with the needs of present and future generations of students.
Abstract:
Parametric roll is a critical phenomenon for ships, whose onset may cause roll oscillations of up to ±40 degrees, leading to very dangerous situations and possibly capsizing. Container ships have been shown to be particularly prone to parametric roll resonance when sailing in moderate to heavy head seas. A Matlab/Simulink parametric roll benchmark model for a large container ship has been implemented and validated against a wide set of experimental data. The model is part of a Matlab/Simulink Toolbox (MSS, 2007). The benchmark implements a 3rd-order nonlinear model in which the dynamics of roll is strongly coupled with the heave and pitch dynamics. The implemented model has shown good accuracy in predicting the container ship motions, both in the vertical plane and in the transverse plane. Parametric roll has been reproduced for all the data sets in which it occurred, and the model provides realistic results that are in good agreement with the model tank experiments.
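In simplified single-degree-of-freedom form (not the benchmark's 3rd-order coupled heave-pitch-roll model), parametric roll is commonly illustrated by a damped Mathieu-type equation in which the wave-induced variation of the restoring term pumps energy into roll:

```latex
\ddot{\phi} + 2\zeta\omega_{\phi}\,\dot{\phi}
  + \omega_{\phi}^{2}\bigl(1 + h\cos(\omega_{e} t)\bigr)\phi = 0
```

Here \phi is the roll angle, \zeta the roll damping ratio, \omega_{\phi} the natural roll frequency, h the relative amplitude of the wave-induced metacentric-height variation, and \omega_{e} the wave encounter frequency. The principal parametric resonance arises near \omega_{e} \approx 2\omega_{\phi}, which is why head seas at particular speeds are especially dangerous for container ships.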
Abstract:
A hippocampal-CA3 memory model was constructed with PGENESIS, a recently developed version of GENESIS that allows for distributed processing of a neural network simulation. A number of neural models of the human memory system have identified the CA3 region of the hippocampus as storing the declarative memory trace. However, computational models designed to assess the viability of the putative mechanisms of storage and retrieval have generally been too abstract to allow comparison with empirical data. Recent experimental evidence has shown that selective knock-out of NMDA receptors in the CA1 of mice leads to reduced stability of firing specificity in place cells. Here a similar reduction in the stability of input specificity is demonstrated in a biologically plausible neural network model of the CA3 region, under conditions of Hebbian synaptic plasticity versus an absence of plasticity. The CA3 region is also commonly associated with seizure activity. Further simulations of the same model tested the response to continuously repeating versus randomized non-repeating input patterns. Each paradigm delivered input of equal intensity and duration. Non-repeating input patterns elicited a greater pyramidal cell spike count. This suggests that repetitive versus non-repeating neocortical input has a quantitatively different effect on the hippocampus. This may be relevant to the production of independent epileptogenic zones and the process of encoding new memories.
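The role of plasticity in stabilising input specificity can be caricatured with a toy Hebbian rule: with plasticity on, repeated presentation of a pattern strengthens and stabilises the response to it relative to novel input. Everything below (network size, learning rate, normalisation) is an illustrative assumption, not the PGENESIS model:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 50, 10
W = rng.random((n_out, n_in)) * 0.1  # random initial synaptic weights

def hebbian_step(W, x, lr=0.05):
    """One Hebbian update (dW ~ post * pre) with row normalisation as a
    toy stand-in for the NMDA-dependent plasticity discussed above."""
    y = W @ x
    W = W + lr * np.outer(y, x)
    return W / np.linalg.norm(W, axis=1, keepdims=True)

# Repeatedly present one sparse input pattern, as in the repeating paradigm
pattern = (rng.random(n_in) < 0.2).astype(float)
resp_before = W @ pattern
for _ in range(100):
    W = hebbian_step(W, pattern)
resp_after = W @ pattern
```

After training, the population response to the repeated pattern dominates the response to a novel pattern of the same sparsity, which is the input specificity that disappears in the no-plasticity condition.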
Abstract:
A computational framework for enhancing design in an evolutionary approach with a dynamic hierarchical structure is presented in this paper. This framework can be used as an evolutionary kernel for building computer-supported design systems. It provides computational components for generating, adapting and exploring alternative design solutions at multiple levels of abstraction with hierarchically structured design representations. In this paper, preliminary experimental results of using this framework in several design applications are presented.
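The generate-adapt-explore loop such an evolutionary kernel provides can be sketched as a generic evolutionary algorithm. This is a generic illustration only; the paper's hierarchically structured design representations would replace the flat bit-string genome used here:

```python
import random

def evolve(init, fitness, mutate, crossover, pop_size=30, gens=40, seed=0):
    """Minimal evolutionary kernel sketch: generate, evaluate, select, vary.
    Parents survive unchanged (elitism), so the best design never degrades."""
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            children.append(mutate(crossover(a, b, rng), rng))
        pop = parents + children
    return max(pop, key=fitness)

# Toy usage: maximise the number of ones in a bit-string "design"
best = evolve(
    init=lambda r: [r.randint(0, 1) for _ in range(20)],
    fitness=sum,
    mutate=lambda g, r: [1 - x if r.random() < 0.05 else x for x in g],
    crossover=lambda a, b, r: [r.choice(p) for p in zip(a, b)],
)
```

Passing the generation, variation, and evaluation operators in as parameters mirrors the framework's role as a reusable kernel: a design system supplies its own representation and operators while the kernel drives the exploration of alternative solutions.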