Abstract:
Modelling how a word is activated in human memory is an important requirement for determining the probability of recall of a word in an extra-list cueing experiment. The spreading activation, spooky-action-at-a-distance and entanglement models have all been used to model the activation of a word. Recently a hypothesis was put forward that the mean activation levels of the respective models are ordered as follows: Spreading ≤ Entanglement ≤ Spooky-action-at-a-distance. This article investigates this hypothesis by means of a substantial empirical analysis of each model using the University of South Florida word association, rhyme and word norms.
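To make the spreading activation idea concrete, here is a minimal sketch of activation flowing from a cue to a target over a free-association network. The network, its strengths, and the one-intermediate-step formulation are illustrative assumptions, not the USF norms or the models compared in the article.

```python
# Minimal sketch of spreading activation over a free-association network.
# The network and its strengths are hypothetical toy data, not USF norms.

def spread_activation(network, cue, target):
    """Activation of `target` given `cue`: the direct association strength
    plus strength flowing through one intermediate associate."""
    direct = network.get(cue, {}).get(target, 0.0)
    indirect = sum(
        s1 * network.get(mid, {}).get(target, 0.0)
        for mid, s1 in network.get(cue, {}).items()
        if mid != target
    )
    return direct + indirect

# Toy free-association strengths: network[w1][w2] ~ P(w2 produced | cue w1)
network = {
    "planet": {"earth": 0.6, "mars": 0.2, "space": 0.2},
    "earth": {"planet": 0.3, "space": 0.4},
    "mars": {"planet": 0.5, "space": 0.1},
}

print(spread_activation(network, "planet", "space"))  # → 0.46
```

Here "space" receives 0.2 directly from "planet" plus 0.24 via "earth" and 0.02 via "mars"; the mean of such activations over many cue-target pairs is the quantity the hypothesis above orders across models.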
Abstract:
This ALTC Teaching Fellowship aimed to establish Guiding Principles for Library and Information Science Education 2.0. The aim was achieved by (i) identifying the current and anticipated skills and knowledge required by successful library and information science (LIS) professionals in the age of web 2.0 (and beyond), and (ii) establishing the current state of LIS education in Australia in supporting the development of librarian 2.0 and, in doing so, identifying models of best practice.
The fellowship has contributed to curriculum renewal in the LIS profession. It has helped to ensure that LIS education in Australia continues to meet the changing skills and knowledge requirements of the profession it supports. It has also provided a vehicle through which LIS professionals and LIS educators may find opportunities for greater collaboration and more open communication. This will help bridge the gap between LIS theory and practice and will foster more authentic engagement between LIS education and other parts of the LIS industry in the education of the next generation of professionals. Through this fellowship the LIS discipline has become a role model for other disciplines that will be facing similar issues in the coming years.
Eighty-one members of the Australian LIS profession participated in a series of focus groups exploring the current and anticipated skills and knowledge needed by the LIS professional in the web 2.0 world and beyond. Whilst each focus group tended to draw on specific themes of interest to that particular group of people, there was a great deal of common ground. Eight key themes emerged: technology, learning and education, research or evidence-based practice, communication, collaboration and team work, user focus, business savvy and personal traits.
It was acknowledged that the need for successful LIS professionals to possess transferable skills and interpersonal attributes was not new. It was noted however that the speed with which things are changing in the web 2.0 world was having a significant impact and that this faster pace is placing a new and unexpected emphasis on the transferable skills and knowledge. It was also acknowledged that all librarians need to possess these skills, knowledge and attributes and not just the one or two role models who lead the way.
The most interesting finding however was that web 2.0, library 2.0 and librarian 2.0 represented a ‘watershed’ for the LIS profession. Almost all the focus groups spoke about how they are seeing and experiencing a culture change in the profession. Librarian 2.0 requires a ‘different mindset or attitude’. The Levels of Perspective model by Daniel Kim provides one lens through which to view this finding. The focus group findings suggest that we are witnessing a re-awakening of the Australian LIS profession as it begins to move towards the higher levels of Kim’s model (i.e. mental models, vision).
Thirty-six LIS educators participated in telephone interviews aimed at exploring the current state of LIS education in supporting the development of librarian 2.0. The skills and knowledge of LIS professionals in a web 2.0 world identified and discussed by the LIS educators mirrored those highlighted in the focus group discussions with LIS professionals. Similarly, it was noted that librarian 2.0 needed a focus less on skills and knowledge and more on attitude. However, whilst LIS professionals felt that there was a paradigm shift within the profession, LIS educators did not speak with one voice on this matter, with quite a number of the educators suggesting that this might be ‘overstating it a bit’. This study provides evidence for “disparate viewpoints” (Hallam, 2007) between LIS educators and LIS professionals that can have significant implications for the future not just of LIS professional education specifically but of the profession generally.
Library and information science education 2.0: guiding principles and models of best practice
Inviting the LIS academics to discuss how their teaching and learning activities support the development of librarian 2.0 was a core part of the interviews conducted. The strategies used and the challenges faced by LIS educators in developing their teaching and learning approaches to support the formation of librarian 2.0 are identified and discussed. A core part of the fellowship was the identification of best practice examples on how LIS educators were developing librarian 2.0. Twelve best practice examples were identified. Each educator was recorded discussing his or her approach to teaching and learning. Videos of these interviews are available via the Fellowship blog at
Abstract:
Emergency Health Services (EHS), encompassing hospital-based Emergency Departments (ED) and pre-hospital ambulance services, are a significant and high-profile component of Australia’s health care system, and congestion of these, evidenced by physical overcrowding and prolonged waiting times, is causing considerable community and professional concern. This concern relates not only to Australia’s capacity to manage daily health emergencies but also to its ability to respond to major incidents and disasters. EHS congestion is a result of the combined effects of increased demand for emergency care, increased complexity of acute health care, and blocked access to ongoing care (e.g. inpatient beds). Despite this conceptual understanding there is a lack of robust evidence to explain the factors driving increased demand, or how demand contributes to congestion, and therefore public policy responses have relied upon limited or unsound information. The Emergency Health Services Queensland (EHSQ) research program proposes to determine the factors influencing the growing demand for emergency health care and to establish options for alternative service provision that may safely meet patients’ needs. The EHSQ study is funded by the Australian Research Council (ARC) through its Linkage Program and is supported financially by the Queensland Ambulance Service (QAS). This monograph is part of a suite of publications based on the research findings that examines the existing literature and current operational context. Literature was sourced using standard search approaches and a range of databases, as well as a selection of articles cited in the reviewed literature. Public sources including the Australian Institute of Health and Welfare (AIHW), the Council of Ambulance Authorities (CAA) Annual Reports, the Australian Bureau of Statistics (ABS) and the Department of Health and Ageing (DoHA) were examined for trend data across Australia.
Abstract:
Continuum partial differential equation models are often used to describe the collective motion of cell populations, with various types of motility represented by the choice of diffusion coefficient, and cell proliferation captured by the source terms. Previously, the choice of diffusion coefficient has been largely arbitrary, with the decision to choose a particular linear or nonlinear form generally based on calibration arguments rather than making any physical connection with the underlying individual-level properties of the cell motility mechanism. In this work we provide a new link between individual-level models, which account for important cell properties such as varying cell shape and volume exclusion, and population-level partial differential equation models. We work in an exclusion process framework, considering aligned, elongated cells that may occupy more than one lattice site, in order to represent populations of agents with different sizes. Three different idealizations of the individual-level mechanism are proposed, and these are connected to three different partial differential equations, each with a different diffusion coefficient; one linear, one nonlinear and degenerate and one nonlinear and nondegenerate. We test the ability of these three models to predict the population level response of a cell spreading problem for both proliferative and nonproliferative cases. We also explore the potential of our models to predict long time travelling wave invasion rates and extend our results to two dimensional spreading and invasion. Our results show that each model can accurately predict density data for nonproliferative systems, but that only one does so for proliferative systems. Hence great care must be taken when predicting density data for populations with varying cell shape.
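The population-level models above take the form ∂C/∂t = ∂/∂x(D(C) ∂C/∂x). A minimal sketch of how such models are solved and compared follows; the two diffusivities used (constant, and D(C) = C) are generic stand-ins for the linear and degenerate nonlinear forms mentioned in the abstract, not the coefficients derived in the paper.

```python
import numpy as np

# Explicit finite-difference solve of dC/dt = d/dx( D(C) dC/dx )
# with zero-flux boundaries. Diffusivities are illustrative only.

def solve(D, C0, dx=0.1, dt=0.001, steps=1000):
    C = C0.copy()
    for _ in range(steps):
        Dm = 0.5 * (D(C[1:]) + D(C[:-1]))          # diffusivity at interfaces
        flux = Dm * (C[1:] - C[:-1]) / dx          # D(C) dC/dx at interfaces
        C[1:-1] += dt / dx * (flux[1:] - flux[:-1])
        C[0] += dt / dx * flux[0]                  # zero-flux boundaries
        C[-1] -= dt / dx * flux[-1]
    return C

x = np.linspace(-5, 5, 101)
C0 = np.exp(-x**2)                                 # initial cell density profile

linear = solve(lambda C: np.ones_like(C), C0)      # D(C) = 1 (linear)
degenerate = solve(lambda C: C, C0)                # D(C) = C (vanishes as C -> 0)
```

Because the boundaries are zero-flux, total cell mass is conserved under both diffusivities; the two models differ in how the spreading front advances, which is the kind of density comparison the paper performs against individual-level simulations.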
Abstract:
Many modern business environments employ software to automate the delivery of workflows; however, workflow design and generation remain laborious technical tasks for domain specialists. Several different approaches have been proposed for deriving workflow models. Some rely on process data mining, whereas others derive workflow models from operational structures, domain-specific knowledge or workflow model compositions from knowledge-bases. Many approaches draw on principles from automatic planning, but are conceptual in nature and lack mathematical justification. In this paper we present a mathematical framework for deducing tasks in workflow models from plans in mechanistic or strongly controlled work environments, with a focus on automatic plan generation. In addition, we prove the associativity of a composition operator that permits crisp hierarchical task compositions for workflow models through a set of mathematical deduction rules. The result is a logical framework that can be used to prove tasks in workflow hierarchies from operational information about work processes and machine configurations in controlled or mechanistic work environments.
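Why associativity of the composition operator matters can be shown with a toy model. The paper's operator is defined through formal deduction rules; here, as an illustrative assumption only, tasks are modelled as tuples of primitive steps and composition as concatenation, which is associative, so hierarchical groupings of the same tasks always yield the same workflow.

```python
# Illustrative sketch only: tasks as tuples of primitive steps,
# composition as sequential concatenation (an associative operation).
# The paper's operator is defined by formal deduction rules instead.

def compose(task_a, task_b):
    """Compose two tasks into one sequential task."""
    return task_a + task_b

cut = ("position", "clamp", "cut")
drill = ("position", "drill")
deburr = ("deburr",)

left = compose(compose(cut, drill), deburr)    # ((cut ; drill) ; deburr)
right = compose(cut, compose(drill, deburr))   # (cut ; (drill ; deburr))

print(left == right)  # → True: grouping does not change the workflow
```

This is exactly the property that lets a workflow hierarchy be flattened or regrouped freely: any bracketing of the same task sequence deduces the same composite task.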
Abstract:
The question of under what conditions conceptual representation is compositional remains debatable within cognitive science. This paper proposes a well-developed mathematical apparatus for a probabilistic representation of concepts, drawing upon methods developed in quantum theory, to propose a formal test that can determine whether a specific conceptual combination is compositional or not. This test examines a joint probability distribution modeling the combination, asking whether or not it is factorizable. Empirical studies indicate that some combinations should be considered non-compositional.
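The core of the factorizability check can be sketched in a few lines. The 2×2 joint distributions below are hypothetical stand-ins for empirical data over the senses of two combined concepts; the test simply asks whether the joint equals the product of its marginals, P(a, b) = P(a)·P(b).

```python
import numpy as np

# Sketch of the factorizability test on hypothetical joint distributions
# over the senses of two combined concepts. A combination is treated as
# compositional here only if the joint factorizes into its marginals.

def is_factorizable(joint, tol=1e-9):
    joint = np.asarray(joint, dtype=float)
    pa = joint.sum(axis=1)            # marginal over the first concept
    pb = joint.sum(axis=0)            # marginal over the second concept
    return np.allclose(joint, np.outer(pa, pb), atol=tol)

independent = np.outer([0.7, 0.3], [0.4, 0.6])   # factorizes by construction
entangled = np.array([[0.5, 0.0],
                      [0.0, 0.5]])               # perfectly correlated senses

print(is_factorizable(independent), is_factorizable(entangled))  # → True False
```

The second distribution is the simple failure case: each marginal is uniform, yet the senses are perfectly correlated, so no product of marginals can reproduce the joint, and the combination would be flagged as non-compositional.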
Abstract:
This thesis develops a detailed conceptual design method and a system software architecture defined with a parametric and generative evolutionary design system to support an integrated interdisciplinary building design approach. The research recognises the need to shift design efforts toward the earliest phases of the design process to support crucial design decisions that have a substantial cost implication on the overall project budget. The overall motivation of the research is to improve the quality of designs produced at the author's employer, the General Directorate of Major Works (GDMW) of the Saudi Arabian Armed Forces. GDMW produces many buildings that have standard requirements, across a wide range of environmental and social circumstances. A rapid means of customising designs for local circumstances would have significant benefits. The research considers the use of evolutionary genetic algorithms in the design process and the ability to generate and assess a wider range of potential design solutions than a human could manage. This wider ranging assessment, during the early stages of the design process, means that the generated solutions will be more appropriate for the defined design problem. The research work proposes a design method and system that promotes a collaborative relationship between human creativity and the computer capability. The tectonic design approach is adopted as a process-oriented design that values the process of design as much as the product. The aim is to connect the evolutionary systems to performance assessment applications, which are used as prioritised fitness functions. This will produce design solutions that respond to their environmental and functional requirements. This integrated, interdisciplinary approach to design will produce solutions through a design process that considers and balances the requirements of all aspects of the design.
Since this thesis covers a wide area of research material, a 'methodological pluralism' approach was used, incorporating both prescriptive and descriptive research methods. Multiple models of research were combined and the overall research was undertaken following three main stages: conceptualisation, development and evaluation. The first two stages lay the foundations for the specification of the proposed system, where key aspects of the system that have not previously been proven in the literature were implemented to test the feasibility of the system. As a result of combining the existing knowledge in the area with the newly verified key aspects of the proposed system, this research can form the base for a future software development project. The evaluation stage, which includes building the prototype system to test and evaluate the system performance based on the criteria defined in the earlier stage, is not within the scope of this thesis. The research results in a conceptual design method and a proposed system software architecture. The proposed system is called the 'Hierarchical Evolutionary Algorithmic Design (HEAD) System'. The HEAD system has been shown to be feasible through the initial illustrative paper-based simulation. The HEAD system consists of two main components: the 'Design Schema' and the 'Synthesis Algorithms'. The HEAD system reflects the major research contribution in the way it is conceptualised, while secondary contributions are achieved within the system components. The design schema provides constraints on the generation of designs, thus enabling the designer to create a wide range of potential designs that can then be analysed for desirable characteristics. The design schema supports the digital representation of the human creativity of designers into a dynamic design framework that can be encoded and then executed through the use of evolutionary genetic algorithms.
The design schema incorporates 2D and 3D geometry and graph theory for space layout planning and building formation using the Lowest Common Design Denominator (LCDD) of a parameterised 2D module and a 3D structural module. This provides a bridge between the standard adjacency requirements and the evolutionary system. The use of graphs as an input to the evolutionary algorithm supports the introduction of constraints in a way that is not supported by standard evolutionary techniques. The process of design synthesis is guided as a higher level description of the building that supports geometrical constraints. The Synthesis Algorithms component analyses designs at four levels, 'Room', 'Layout', 'Building' and 'Optimisation'. At each level multiple fitness functions are embedded into the genetic algorithm to target the specific requirements of the relevant decomposed part of the design problem. Decomposing the design problem to allow for the design requirements of each level to be dealt with separately and then reassembling them in a bottom up approach reduces the generation of non-viable solutions through constraining the options available at the next higher level. The iterative approach, in exploring the range of design solutions through modification of the design schema as the understanding of the design problem improves, assists in identifying conflicts in the design requirements. Additionally, the hierarchical set-up allows the embedding of multiple fitness functions into the genetic algorithm, each relevant to a specific level. This supports an integrated multi-level, multi-disciplinary approach. The HEAD system promotes a collaborative relationship between human creativity and the computer capability. The design schema component, as the input to the procedural algorithms, enables the encoding of certain aspects of the designer's subjective creativity. 
By focusing on finding solutions for the relevant sub-problems at the appropriate levels of detail, the hierarchical nature of the system assists in the design decision-making process.
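One level of the hierarchical approach described above can be sketched as a small genetic algorithm with an embedded fitness function. Everything here is an illustrative assumption: a hypothetical 'Room' level requirement (target area of 20 m² with a 3:2 aspect ratio) and a generic elitist GA, not the HEAD system's actual schema or fitness functions.

```python
import random

# Highly simplified sketch of one level ('Room') of a hierarchical
# evolutionary design system: a genetic algorithm with a fitness function
# embedded for that level's requirements. Target values are hypothetical.

random.seed(1)
TARGET_AREA = 20.0  # m^2, assumed room requirement

def fitness(room):
    w, d = room
    area_error = abs(w * d - TARGET_AREA)
    aspect_penalty = abs(w / d - 1.5)      # prefer a 3:2 room
    return -(area_error + aspect_penalty)  # higher is better

def evolve(pop_size=30, generations=60):
    pop = [(random.uniform(1, 10), random.uniform(1, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # elitist selection
        children = [
            (max(0.5, w + random.gauss(0, 0.2)),    # Gaussian mutation
             max(0.5, d + random.gauss(0, 0.2)))
            for w, d in parents
        ]
        pop = parents + children
    return max(pop, key=fitness)

w, d = evolve()
print(round(w * d, 1), round(w / d, 2))
```

In the hierarchical scheme, the surviving 'Room' solutions would then constrain the search space at the 'Layout' level, which is how decomposition reduces the generation of non-viable designs.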
Abstract:
Traditional crash prediction models, such as generalized linear regression models, are incapable of taking into account the multilevel data structure which extensively exists in crash data. Disregarding the possible within-group correlations can lead to models giving unreliable and biased estimates of unknowns. This study innovatively proposes a multilevel hierarchy, viz. geographic region level – traffic site level – traffic crash level – driver-vehicle unit level – vehicle-occupant level, crossed with a time level, to establish a general form of multilevel data structure in traffic safety analysis. To properly model the potential cross-group heterogeneity due to the multilevel data structure, a framework of Bayesian hierarchical models that explicitly specify the multilevel structure and correctly yield parameter estimates is introduced and recommended. The proposed method is illustrated in an individual-severity analysis of intersection crashes using Singapore crash records. This study demonstrates the importance of accounting for within-group correlations, and the flexibility and effectiveness of the Bayesian hierarchical method in modeling the multilevel structure of traffic crash data.
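The effect of acknowledging grouped structure can be sketched with a much simpler stand-in for the full Bayesian hierarchical framework: empirical-Bayes partial pooling of site-level crash means toward the overall mean. The data are simulated, and the shrinkage estimator is an illustrative simplification, not the models recommended in the study.

```python
import numpy as np

# Sketch of partial pooling for crash counts grouped by site. Simulated
# data; a simple empirical-Bayes stand-in for Bayesian hierarchical models.

rng = np.random.default_rng(0)
n_sites, n_years = 8, 5
site_means = rng.normal(10, 3, n_sites)                 # between-site variation
counts = rng.poisson(
    np.repeat(site_means.clip(1), n_years)
).reshape(n_sites, n_years)

grand = counts.mean()
site_avg = counts.mean(axis=1)
between = site_avg.var(ddof=1)                          # between-site variance
within = counts.var(axis=1, ddof=1).mean() / n_years    # variance of site means

# Shrinkage factor: how strongly each site mean is pooled toward the grand mean
shrink = between / (between + within)
pooled = grand + shrink * (site_avg - grand)

print(pooled.round(1))
```

Sites with noisy averages are pulled toward the overall mean, which is the intuition behind why hierarchical models yield more reliable estimates than models that ignore the grouping.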
Abstract:
A building information model (BIM) is an electronic repository of structured, three-dimensional data that captures both the physical and dynamic functional characteristics of a facility. In addition to its more traditional function as a tool to aid design and construction, a BIM can be used throughout the life cycle of a facility, functioning as a living database that places resources contained within the building in their spatial and temporal context. Through its comprehension of spatial relationships, a BIM can meaningfully represent and integrate previously isolated control and management systems and processes, and thereby provide a more intuitive interface to users. By placing processes in a spatial context, decision-making can be improved, with positive flow-on effects for security and efficiency. In this article, we systematically analyse the authorization requirements involved in the use of BIMs. We introduce the concept of using a BIM as a graphical tool to support spatial access control configuration and management (including physical access control). We also consider authorization requirements for regulating access to the structured data that exists within a BIM as well as to external systems and data repositories that can be accessed via the BIM interface. With a view to addressing these requirements we present a survey of relevant spatiotemporal access control models, focusing on features applicable to BIMs and highlighting capability gaps. Finally, we present a conceptual authorization framework that utilizes BIMs.
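The kind of spatiotemporal access control surveyed above can be illustrated with a toy policy check: permission depends jointly on the zone a point falls in (here a flat dictionary standing in for a BIM's spatial model) and on the time of day. The zones, roles and rules are all invented for illustration.

```python
from datetime import time

# Toy spatiotemporal access-control check. Zones stand in for spaces
# extracted from a BIM; roles and rules are hypothetical.

ZONES = {"lobby": (0, 0, 20, 10), "server_room": (20, 0, 30, 10)}

RULES = [
    # (role, zone, allowed from, allowed until)
    ("visitor", "lobby", time(8, 0), time(18, 0)),
    ("admin", "server_room", time(0, 0), time(23, 59)),
]

def zone_of(x, y):
    """Map a coordinate to the zone containing it, if any."""
    for name, (x0, y0, x1, y1) in ZONES.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None

def can_access(role, x, y, at):
    """Grant access only if some rule matches role, zone and time."""
    zone = zone_of(x, y)
    return any(r == role and z == zone and start <= at <= end
               for r, z, start, end in RULES)

print(can_access("visitor", 5, 5, time(9, 30)))   # → True: lobby, in hours
print(can_access("visitor", 25, 5, time(9, 30)))  # → False: server room
```

A BIM-backed implementation would replace `zone_of` with real spatial queries over the model's geometry, which is precisely where the graphical configuration interface proposed in the article would sit.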
Abstract:
Objective: To use our Bayesian method of motor unit number estimation (MUNE) to evaluate lower motor neuron degeneration in ALS. Methods: In subjects with ALS we performed serial MUNE studies. We examined the repeatability of the test and then determined whether the loss of MUs was fitted by an exponential or Weibull distribution. Results: The decline in motor unit (MU) numbers was well fitted by an exponential decay curve. We calculated the half-life of MUs in the abductor digiti minimi (ADM), abductor pollicis brevis (APB) and/or extensor digitorum brevis (EDB) muscles. The mean half-life of the MUs of the ADM muscle was greater than that of the APB or EDB muscles. The half-life of MUs was less in the ADM muscle of subjects with upper limb onset than in those with lower limb onset. Conclusions: The rate of loss of lower motor neurons in ALS is exponential, the motor units of the APB decay more quickly than those of the ADM muscle, and the rate of loss of motor units is greater at the site of onset of disease. Significance: This shows that the Bayesian MUNE method is useful in following the course and exploring the clinical features of ALS. © 2012 International Federation of Clinical Neurophysiology.
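The exponential-decay fit behind the half-life calculation can be sketched as follows. MU counts are modelled as N(t) = N₀·exp(−kt), so a linear fit on the log scale recovers k and the half-life is ln(2)/k. The counts and time points below are simulated, not patient MUNE data.

```python
import numpy as np

# Sketch of fitting an exponential decay to serial motor unit counts
# and deriving the half-life. Simulated data, not MUNE measurements.

t = np.array([0.0, 6.0, 12.0, 18.0, 24.0])     # months since baseline
counts = 100 * np.exp(-0.05 * t)               # simulated MU counts, k = 0.05

# N(t) = N0 * exp(-k t)  =>  -log N(t) = k t - log N0: fit a line
k, neg_log_n0 = np.polyfit(t, -np.log(counts), 1)
half_life = np.log(2) / k

print(round(half_life, 1))                     # → 13.9 (months)
```

The same calculation per muscle is what underlies the comparison of ADM, APB and EDB half-lives in the abstract; in practice the counts are noisy and the fit quality itself is part of testing exponential versus Weibull decay.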
Abstract:
Recently, ‘business model’ and ‘business model innovation’ have gained substantial attention in management literature and practice. However, many firms lack the capability to develop a novel business model to capture the value from new technologies. Existing literature on business model innovation highlights the central role of ‘customer value’. Further, it suggests that firms need to experiment with different business models and engage in ‘trial-and-error’ learning when participating in business model innovation. Trial-and-error processes and prototyping with tangible artifacts are fundamental characteristics of design. This conceptual paper explores the role of design-led innovation in enabling firms to conceive and prototype novel and meaningful business models. It provides a brief review of the conceptual discussion on business model innovation and highlights the opportunities for linking it with the research stream of design-led innovation. We propose design-led business model innovation as a future research area and highlight the role that design-led prototyping and new types of artifacts and prototypes play within it. We present six propositions in order to outline future research avenues.
Abstract:
HtrA is a complex, multimeric chaperone and serine protease important for the virulence and survival of many bacteria. Chlamydia trachomatis is an obligate, intracellular bacterial pathogen that is responsible for severe disease pathology. C. trachomatis HtrA (CtHtrA) has been shown to be highly expressed in laboratory models of disease. In this study, molecular modelling of CtHtrA protein active site structure identified putative S1-S3 subsite residues I242, I265, and V266. These residues were altered by site-directed mutagenesis, and these changes were shown to considerably reduce protease activity on known substrates and resulted in a narrower and distinct range of substrates compared to wild type. Bacterial two-hybrid analysis revealed that CtHtrA is able to interact in vivo with a broad range of protein sequences with high affinity. Notably, however, the interaction was significantly altered in 35 out of 69 clones when residue V266 was mutated, indicating that this residue has an important function during substrate binding.
Abstract:
Many construction industry decision-makers believe there is a lack of off-site manufacture (OSM) adoption for non-residential construction in Australia. Identification of construction business process was considered imperative in order to assist decision-makers to increase OSM utilisation. The premise that domain knowledge can be re-used to provide an intervention point in the construction process led a team of researchers to construct simple base-line process models for the complete construction process, segmented into six phases. Sixteen domain knowledge industry experts were asked to review the construction phase base-line models to answer the question “Where in the process illustrated by this base-line model phase is an OSM task?”. Through an iterative and generative process a number of off-site manufacture intervention points were identified and integrated into the process models. The re-use of industry expert domain knowledge provided suggestions for new ways to do basic tasks thus facilitating changes to current practice. It is expected that implementation of the new processes will lead to systemic industry change and thus a growth in productivity due to increased adoption of OSM.
Abstract:
Topic modeling has been widely utilized in information retrieval, text mining and text classification. Most existing statistical topic modeling methods, such as LDA and pLSA, generate a term-based representation of a topic by selecting single words from the multinomial word distribution over that topic. There are two main shortcomings: firstly, popular or common words occur very often across different topics, which brings ambiguity to the interpretation of topics; secondly, single words lack the coherent semantic meaning needed to accurately represent topics. In order to overcome these problems, in this paper we propose a two-stage model that combines text mining and pattern mining with statistical modeling to generate more discriminative and semantically rich topic representations. Experiments show that the optimized topic representations generated by the proposed methods outperform the typical statistical topic modeling method LDA in terms of accuracy and certainty.
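The two-stage idea can be sketched on toy data: stage one takes the top single words of a topic's word distribution, and stage two mines frequent co-occurring word pairs from topic-relevant documents as more discriminative, pattern-based descriptors. The topic distribution, documents and the simple pair-mining step are illustrative assumptions, not the paper's actual method.

```python
from collections import Counter
from itertools import combinations

# Sketch of a two-stage topic representation on toy data: top single
# words from a topic's word distribution, then frequent word pairs
# mined from topic-relevant documents (minimum support of 2).

topic_dist = {"learning": 0.25, "machine": 0.2, "data": 0.15,
              "the": 0.3, "model": 0.1}

docs = [
    ["machine", "learning", "model"],
    ["machine", "learning", "data"],
    ["data", "model", "the"],
]

# Stage 1: term-based representation (top words by probability)
top_words = sorted(topic_dist, key=topic_dist.get, reverse=True)[:3]

# Stage 2: frequent co-occurring word pairs across documents
pair_counts = Counter(
    pair for doc in docs for pair in combinations(sorted(set(doc)), 2)
)
patterns = [pair for pair, n in pair_counts.items() if n >= 2]

print(top_words)  # the common word "the" ranks first: the ambiguity problem
print(patterns)   # → [('learning', 'machine')]
```

Note how stage one surfaces the uninformative word "the" purely because it is probable, while the mined pair ('learning', 'machine') is both discriminative and semantically coherent, which is exactly the gap the two-stage model targets.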