73 results for Information literacy integration model
Abstract:
Examined the barriers faced by people with Spinal Cord Injuries (SCI) when integrating their Assistive Technology (AT) into the workplace, as well as factors that contribute to successful integration. In-depth interviews were conducted with 5 men (aged 37-50 yrs) with SCI, 3 of their employers and 2 co-workers. Results indicate that in addition to the barriers previously outlined in the literature related to funding the technology, time delays, information availability, training and maintenance, other issues were highlighted. Implications for service providers are considered in relation to these barriers and the factors that prompted successful integration. The author discusses limitations of the study and makes recommendations for future research. (PsycINFO Database Record (c) 2007 APA, all rights reserved)
Abstract:
Ligaments undergo finite strain, displaying hyperelastic behaviour as the initially tangled fibrils straighten out, combined with viscoelastic behaviour (strain rate sensitivity). In the present study the anterior cruciate ligament of the human knee joint is modelled in three dimensions to gain an understanding of the stress distribution over the ligament due to motion imposed on the ends, determined from experimental studies. A three-dimensional, finite-strain material model of ligaments has recently been proposed by Pioletti in Ref. [2]. It is attractive as it separates out elastic stress from that due to the present strain rate and that due to the past history of deformation. However, it treats the ligament as isotropic and incompressible. While the second assumption is reasonable, the first is clearly untrue. In the present study an alternative model of the elastic behaviour due to Bonet and Burton (Ref. [4]) is generalized. Bonet and Burton consider finite strain with constant moduli for the fibres and for the matrix of a transversely isotropic composite. In the present work, the fibre modulus is first made to increase exponentially from zero with an invariant that provides a measure of the stretch in the fibre direction. At 12% strain in the fibre direction, a new reference state is then adopted, after which the material modulus is made constant, as in Bonet and Burton's model. The strain rate dependence can be added, either using Pioletti's isotropic approximation, or by making the effect depend on the strain rate in the fibre direction only. A solid model of a ligament is constructed, based on experimentally measured sections, and the deformation predicted using explicit integration in time. This approach simplifies the coding of the material model, but has a limitation due to the detrimental effect on stability of integration of the substantial damping implied by the nonlinear dependence of stress on strain rate.
At present, an artificially high density is being used to provide stability, while the dynamics are being removed from the solution using artificial viscosity. The result is a quasi-static solution incorporating the effect of strain rate. Alternative approaches to material modelling and integration are discussed, that may result in a better model.
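The fibre response described in this abstract (a modulus growing exponentially from zero with fibre stretch, then held constant beyond about 12% strain) can be sketched in one dimension. This is an illustrative reading of that description, not the paper's actual constitutive model: the constants `c`, `k` and the function `fibre_stress` are hypothetical, and the switch point is placed at stretch 1.12.

```python
import numpy as np

def fibre_stress(stretch, c=0.5, k=30.0, lam_star=1.12):
    """Piecewise fibre response: an exponential toe region whose tangent
    modulus grows from zero with fibre stretch, switching to a constant
    modulus beyond ~12% strain (lam_star = 1.12). Constants are
    illustrative, not taken from the paper."""
    stretch = np.asarray(stretch, dtype=float)
    sigma_star = c * (np.exp(k * (lam_star - 1.0)) - 1.0)  # stress at the switch point
    E_star = c * k * np.exp(k * (lam_star - 1.0))          # tangent modulus there
    toe = c * (np.exp(k * (stretch - 1.0)) - 1.0)          # exponential toe region
    linear = sigma_star + E_star * (stretch - lam_star)    # constant-modulus region
    return np.where(stretch < lam_star, toe, linear)
```

Because the linear branch reuses the toe region's stress and tangent modulus at the switch point, both stress and stiffness are continuous there, mirroring the adoption of a new reference state at 12% strain.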
Abstract:
This paper presents a method of formally specifying, refining and verifying concurrent systems which uses the object-oriented state-based specification language Object-Z together with the process algebra CSP. Object-Z provides a convenient way of modelling complex data structures needed to define the component processes of such systems, and CSP enables the concise specification of process interactions. The basis of the integration is a semantics of Object-Z classes identical to that of CSP processes. This allows classes specified in Object-Z to be used directly within the CSP part of the specification. In addition to specification, we also discuss refinement and verification in this model. The common semantic basis enables a unified method of refinement to be used, based upon CSP refinement. To enable state-based techniques to be used for the Object-Z components of a specification we develop state-based refinement relations which are sound and complete with respect to CSP refinement. In addition, a verification method for static and dynamic properties is presented. The method allows us to verify properties of the CSP system specification in terms of its component Object-Z classes by using the laws of the CSP operators together with the logic for Object-Z.
Abstract:
As individuals gain expertise in a chosen field they can begin to conceptualize how what they know can be applied more broadly, to new populations and situations, or to increase desirable outcomes. Judd's book does just this. It takes our current understanding of the etiology, course, and sequelae of brain injuries, combines this with established psychotherapy and rehabilitation techniques, and expands these into a cogent model of what Judd calls “neuropsychotherapy.” Simply put, neuropsychotherapy attempts to address the cognitive, emotional and behavioral changes in brain-injured persons, changes that may go undiagnosed, misdiagnosed, or untreated.
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been occurring for many years and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to object-oriented schemas today. Unfortunately, researchers have been able to offer practitioners only limited theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty in applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques that have similar meta models defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Internationalisation occurs when the firm expands its selling, production, or other business activities into international markets. Many enterprises, especially small- and medium-size firms (SMEs), are internationalising today at an unprecedented rate. Managers are strategically using information to achieve degrees of internationalisation previously considered the domain of large firms. We extend existing explanations of firm internationalisation by examining the nature and fundamental, antecedent role of internalising appropriate information and translating it into relevant knowledge. Based on case studies of internationalising firms, we advance a conceptualisation of information internalisation and knowledge creation within the firm as it achieves internationalisation readiness. In the process, we offer several propositions intended to guide future research. (C) 2002 Elsevier Science Inc. All rights reserved.
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
The efficacy of psychological treatments emphasising a self-management approach to chronic pain has been demonstrated by substantial empirical research. Nevertheless, high drop-out and relapse rates and low or unsuccessful engagement in self-management pain rehabilitation programs have prompted the suggestion that people vary in their readiness to adopt a self-management approach to their pain. The Pain Stages of Change Questionnaire (PSOCQ) was developed to assess a patient's readiness to adopt a self-management approach to their chronic pain. Preliminary evidence has supported the PSOCQ's psychometric properties. The current study was designed to further examine the psychometric properties of the PSOCQ, including its reliability, factorial structure and predictive validity. A total of 107 patients with an average age of 36.2 years (SD = 10.63) attending a multi-disciplinary pain management program completed the PSOCQ, the Pain Self-Efficacy Questionnaire (PSEQ) and the West Haven-Yale Multidimensional Pain Inventory (WHYMPI) pre-admission and at discharge from the program. Initial data analysis found inadequate internal consistencies of the precontemplation and action scales of the PSOCQ and a high correlation (r = 0.66, P < 0.01) between the action and maintenance scales. Principal component analysis supported a two-factor structure: 'Contemplation' and 'Engagement'. Subsequent analyses revealed that the PSEQ was a better predictor of treatment outcome than the PSOCQ scales. Discussion centres upon the utility of the PSOCQ in a clinical pain setting in light of the above findings, and a need for further research. (C) 2002 International Association for the Study of Pain. Published by Elsevier Science B.V. All rights reserved.
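The two-factor structure reported above came from a principal component analysis of the questionnaire's scale intercorrelations. A common first look at such structure is to eigen-decompose the correlation matrix and count components with eigenvalue greater than 1 (the Kaiser criterion). The sketch below is illustrative only, using a toy block-structured correlation matrix rather than the PSOCQ data, and is not the authors' analysis.

```python
import numpy as np

def kaiser_components(R):
    """Number of principal components of the correlation matrix R with
    eigenvalue > 1 (Kaiser criterion), a rough guide to how many factors
    a questionnaire's items support."""
    eigvals = np.linalg.eigvalsh(R)   # eigenvalues of a symmetric matrix
    return int(np.sum(eigvals > 1.0))

# Toy correlation matrix: six items forming two uncorrelated blocks of
# three, each with within-block correlation 0.7 (hypothetical values).
R = np.eye(6)
for block in ([0, 1, 2], [3, 4, 5]):
    for i in block:
        for j in block:
            if i != j:
                R[i, j] = 0.7
```

For this toy matrix each block contributes one dominant eigenvalue (1 + 2 × 0.7 = 2.4), so the criterion retains exactly two components, the kind of pattern that would support a two-factor reading like 'Contemplation' and 'Engagement'.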
Abstract:
In this paper we refer to the gene-to-phenotype modeling challenge as the GP problem. Integrating information across levels of organization within a genotype-environment system is a major challenge in computational biology. However, resolving the GP problem is a fundamental requirement if we are to understand and predict phenotypes given knowledge of the genome and model dynamic properties of biological systems. Organisms are consequences of this integration, and it is a major property of biological systems that underlies the responses we observe. We discuss the E(NK) model as a framework for investigation of the GP problem and the prediction of system properties at different levels of organization. We apply this quantitative framework to an investigation of the processes involved in genetic improvement of plants for agriculture. In our analysis, N genes determine the genetic variation for a set of traits that are responsible for plant adaptation to E environment-types within a target population of environments. The N genes can interact in epistatic NK gene-networks through the way that they influence plant growth and development processes within a dynamic crop growth model. We use a sorghum crop growth model, available within the APSIM agricultural production systems simulation model, to integrate the gene-environment interactions that occur during growth and development and to predict genotype-to-phenotype relationships for a given E(NK) model. Directional selection is then applied to the population of genotypes, based on their predicted phenotypes, to simulate the dynamic aspects of genetic improvement by a plant-breeding program. The outcomes of the simulated breeding are evaluated across cycles of selection in terms of the changes in allele frequencies for the N genes and the genotypic and phenotypic values of the populations of genotypes.
Abstract:
Concerns of reduced productivity and land degradation in the Mitchell grasslands of central western Queensland were addressed through a range monitoring program to interpret condition and trend. Botanical and edaphic parameters were recorded along piosphere and grazing gradients, and across fenceline impact areas, to maximise changes resulting from grazing. The Degradation Gradient Method was used in conjunction with State and Transition Models to develop models of rangeland dynamics and condition. States were found to be ordered along a degradation gradient, indicator species developed according to rainfall trends and transitions determined from field data and available literature. Astrebla spp. abundance declined with declining range condition and increasing grazing pressure, while annual grasses and forbs increased in dominance under poor range condition. Soil erosion increased and litter decreased with decreasing range condition. An approach to quantitatively define states within a variable rainfall environment based upon a time-series ordination analysis is described. The derived model could provide the interpretive framework necessary to integrate on-ground monitoring, remote sensing and geographic information systems to trace states and transitions at the paddock scale. However, further work is needed to determine the full catalogue of states and transitions and to refine the model for application at the paddock scale.
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
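The mixture structure described above (a logistic model for which failure type occurs, component hazards for when) can be illustrated with a fully parametric stand-in. This sketch is not the paper's semi-parametric method: it substitutes exponential hazards `lam1`, `lam2` for the unspecified component-baseline hazards, uses a single covariate, and the function names are hypothetical.

```python
import numpy as np

def mixture_loglik(t, delta, cause, x, beta, lam1, lam2):
    """Full log-likelihood of a two-component competing-risks mixture.
    A logistic model in x gives P(failure type 1); each component has an
    exponential hazard (a stand-in for the unspecified baselines).
    t: times; delta: 1 if a failure was observed; cause: failure type
    (1 or 2, ignored when censored); x: covariate."""
    p1 = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * x)))       # logistic component
    f1, f2 = lam1 * np.exp(-lam1 * t), lam2 * np.exp(-lam2 * t)  # component densities
    S1, S2 = np.exp(-lam1 * t), np.exp(-lam2 * t)                # survivor functions
    observed = np.where(cause == 1, p1 * f1, (1.0 - p1) * f2)    # failure of known type
    censored = p1 * S1 + (1.0 - p1) * S2                         # type still unknown
    return float(np.sum(np.where(delta == 1, np.log(observed), np.log(censored))))

def e_step_type1(t, x, beta, lam1, lam2):
    """E-step responsibility: posterior probability that a censored subject
    will eventually fail from type 1 (the quantity an ECM-style algorithm
    imputes before its conditional maximization steps)."""
    p1 = 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * x)))
    S1, S2 = np.exp(-lam1 * t), np.exp(-lam2 * t)
    return p1 * S1 / (p1 * S1 + (1.0 - p1) * S2)
```

An ECM fit would alternate this E-step with conditional maximizations of the logistic and hazard parameters; the paper's method additionally leaves the baseline hazards unspecified rather than exponential.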
Abstract:
This paper is concerned with methods for refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems which involve state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics of Object-Z classes identical to CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components in an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed in a refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP where we change the structure of the specification when performing the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model and they are illustrated via a number of examples.
Abstract:
Over the past decade or so, there has been increasing demand for greater clarity about the major causes of disease and injury, how these differentially affect populations, and how they are changing. In part, this demand has been motivated by resource constraints and a realisation that better health is possible with more informed allocation of resources. At the same time, there has been a change in the way population health and its determinants are quantified, with a much closer integration of the quantitative population sciences (such as epidemiology, demography and health economics) to strengthen and broaden the evidence base for healthcare policy.