899 results for: Information literacy integration model
Abstract:
At the core of the analysis task in the development process is information systems requirements modelling. Modelling of requirements has been occurring for many years, and the techniques used have progressed from flowcharting through data flow diagrams and entity-relationship diagrams to object-oriented schemas today. Unfortunately, researchers have been able to give practitioners only little theoretical guidance on which techniques to use and when. In an attempt to address this situation, Wand and Weber have developed a series of models based on the ontological theory of Mario Bunge: the Bunge-Wand-Weber (BWW) models. Two particular criticisms of the models have persisted, however: the understandability of the constructs in the BWW models and the difficulty in applying the models to a modelling technique. This paper addresses these issues by presenting a meta model of the BWW constructs using a meta language that is familiar to many IS professionals, more specific than plain English text, but easier to understand than the set-theoretic language of the original BWW models. Such a meta model also facilitates the application of the BWW theory to other modelling techniques that have similar meta models defined. Moreover, this approach supports the identification of patterns of constructs that might be common across meta models for modelling techniques. Such findings are useful in extending and refining the BWW theory. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
Internationalisation occurs when the firm expands its selling, production, or other business activities into international markets. Many enterprises, especially small- and medium-sized firms (SMEs), are internationalising today at an unprecedented rate. Managers are strategically using information to achieve degrees of internationalisation previously considered the domain of large firms. We extend existing explanations of firm internationalisation by examining the nature and fundamental, antecedent role of internalising appropriate information and translating it into relevant knowledge. Based on case studies of internationalising firms, we advance a conceptualisation of information internalisation and knowledge creation within the firm as it achieves internationalisation readiness. In the process, we offer several propositions intended to guide future research. (C) 2002 Elsevier Science Inc. All rights reserved.
Abstract:
Within the information systems field, the task of conceptual modeling involves building a representation of selected phenomena in some domain. High-quality conceptual-modeling work is important because it facilitates early detection and correction of system development errors. It also plays an increasingly important role in activities like business process reengineering and documentation of best-practice data and process models in enterprise resource planning systems. Yet little research has been undertaken on many aspects of conceptual modeling. In this paper, we propose a framework to motivate research that addresses the following fundamental question: How can we model the world to better facilitate our developing, implementing, using, and maintaining more valuable information systems? The framework comprises four elements: conceptual-modeling grammars, conceptual-modeling methods, conceptual-modeling scripts, and conceptual-modeling contexts. We provide examples of the types of research that have already been undertaken on each element and illustrate research opportunities that exist.
Abstract:
The efficacy of psychological treatments emphasising a self-management approach to chronic pain has been demonstrated by substantial empirical research. Nevertheless, high drop-out and relapse rates and low or unsuccessful engagement in self-management pain rehabilitation programs have prompted the suggestion that people vary in their readiness to adopt a self-management approach to their pain. The Pain Stages of Change Questionnaire (PSOCQ) was developed to assess a patient's readiness to adopt a self-management approach to their chronic pain. Preliminary evidence has supported the PSOCQ's psychometric properties. The current study was designed to further examine the psychometric properties of the PSOCQ, including its reliability, factorial structure and predictive validity. A total of 107 patients with an average age of 36.2 years (SD = 10.63) attending a multi-disciplinary pain management program completed the PSOCQ, the Pain Self-Efficacy Questionnaire (PSEQ) and the West Haven-Yale Multidimensional Pain Inventory (WHYMPI) pre-admission and at discharge from the program. Initial data analysis found inadequate internal consistencies of the precontemplation and action scales of the PSOCQ and a high correlation (r = 0.66, P < 0.01) between the action and maintenance scales. Principal component analysis supported a two-factor structure: 'Contemplation' and 'Engagement'. Subsequent analyses revealed that the PSEQ was a better predictor of treatment outcome than the PSOCQ scales. Discussion centres upon the utility of the PSOCQ in a clinical pain setting in light of the above findings, and a need for further research. (C) 2002 International Association for the Study of Pain. Published by Elsevier Science B.V. All rights reserved.
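The internal-consistency findings reported above (inadequate reliabilities of the precontemplation and action scales) are conventionally assessed with Cronbach's alpha. As a minimal illustrative sketch of that statistic, using hypothetical toy scores rather than the PSOCQ data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score columns
    (one list of respondent scores per questionnaire item)."""
    k = len(items)                       # number of items
    n = len(items[0])                    # number of respondents

    def svar(xs):                        # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # total score per respondent across all items
    totals = [sum(col[i] for col in items) for i in range(n)]
    return (k / (k - 1)) * (1 - sum(svar(col) for col in items) / svar(totals))

# Hypothetical 3-item scale, 4 respondents (not PSOCQ data):
scores = [[1, 2, 3, 4], [2, 2, 3, 5], [1, 3, 3, 4]]
alpha = cronbach_alpha(scores)
```

Scales whose items covary strongly yield alpha near 1; the low alphas reported for two PSOCQ scales indicate items that do not hang together as a single construct.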
Abstract:
In this paper we refer to the gene-to-phenotype modeling challenge as the GP problem. Integrating information across levels of organization within a genotype-environment system is a major challenge in computational biology. However, resolving the GP problem is a fundamental requirement if we are to understand and predict phenotypes given knowledge of the genome and model dynamic properties of biological systems. Organisms are consequences of this integration, and it is a major property of biological systems that underlies the responses we observe. We discuss the E(NK) model as a framework for investigation of the GP problem and the prediction of system properties at different levels of organization. We apply this quantitative framework to an investigation of the processes involved in genetic improvement of plants for agriculture. In our analysis, N genes determine the genetic variation for a set of traits that are responsible for plant adaptation to E environment-types within a target population of environments. The N genes can interact in epistatic NK gene-networks through the way that they influence plant growth and development processes within a dynamic crop growth model. We use a sorghum crop growth model, available within the APSIM agricultural production systems simulation model, to integrate the gene-environment interactions that occur during growth and development and to predict genotype-to-phenotype relationships for a given E(NK) model. Directional selection is then applied to the population of genotypes, based on their predicted phenotypes, to simulate the dynamic aspects of genetic improvement by a plant-breeding program. The outcomes of the simulated breeding are evaluated across cycles of selection in terms of the changes in allele frequencies for the N genes and the genotypic and phenotypic values of the populations of genotypes.
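The epistatic NK gene-network component of this framework can be illustrated with Kauffman's classic NK model, in which each of N genes contributes a fitness value that depends on its own allele and the alleles of K epistatic partners. The following is a minimal sketch of that idea with one round of truncation selection; all parameter values are hypothetical, and this is not the APSIM/sorghum implementation:

```python
import itertools
import random

def nk_fitness_table(n, k, rng):
    """Random fitness contribution for each gene, keyed by its own
    allele together with the alleles of its k epistatic neighbours."""
    return [{bits: rng.random() for bits in itertools.product((0, 1), repeat=k + 1)}
            for _ in range(n)]

def fitness(genotype, table, k):
    """Mean contribution over all genes; neighbours chosen cyclically."""
    n = len(genotype)
    total = 0.0
    for i in range(n):
        bits = tuple(genotype[(i + j) % n] for j in range(k + 1))
        total += table[i][bits]
    return total / n

rng = random.Random(0)
n, k = 8, 2
table = nk_fitness_table(n, k, rng)
pop = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(20)]
# Directional (truncation) selection on predicted phenotype:
pop.sort(key=lambda g: fitness(g, table, k), reverse=True)
```

Iterating selection over cycles, as the simulated breeding program does, shifts allele frequencies toward genotypes with higher predicted phenotypic values.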
Abstract:
Concerns of reduced productivity and land degradation in the Mitchell grasslands of central western Queensland were addressed through a range monitoring program to interpret condition and trend. Botanical and edaphic parameters were recorded along piosphere and grazing gradients, and across fenceline impact areas, to maximise changes resulting from grazing. The Degradation Gradient Method was used in conjunction with State and Transition Models to develop models of rangeland dynamics and condition. States were found to be ordered along a degradation gradient, indicator species developed according to rainfall trends and transitions determined from field data and available literature. Astrebla spp. abundance declined with declining range condition and increasing grazing pressure, while annual grasses and forbs increased in dominance under poor range condition. Soil erosion increased and litter decreased with decreasing range condition. An approach to quantitatively define states within a variable rainfall environment based upon a time-series ordination analysis is described. The derived model could provide the interpretive framework necessary to integrate on-ground monitoring, remote sensing and geographic information systems to trace states and transitions at the paddock scale. However, further work is needed to determine the full catalogue of states and transitions and to refine the model for application at the paddock scale.
Abstract:
We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is based on maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (C) 2003 John Wiley & Sons, Ltd.
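The mixture formulation described above combines a logistic model for the probability of each failure type with a type-specific proportional-hazards component. As a toy fully parametric illustration of the resulting overall density (exponential baseline hazards stand in for the paper's unspecified semi-parametric baselines, and all parameter values are hypothetical):

```python
import math

def mixture_density(t, x, gamma, beta, lam):
    """Overall failure density f(t | x) for a two-type mixture:
    logistic mixing probability, proportional-hazards exponential components."""
    p1 = 1.0 / (1.0 + math.exp(-(gamma[0] + gamma[1] * x)))  # P(type 1 | x)
    pis = (p1, 1.0 - p1)
    dens = 0.0
    for j in (0, 1):
        h = lam[j] * math.exp(beta[j] * x)      # type-j hazard under PH
        dens += pis[j] * h * math.exp(-h * t)   # exponential component density
    return dens

# Hypothetical parameters: logistic coefficients, PH coefficients, baseline rates
gamma, beta, lam = (0.5, 1.0), (0.3, -0.2), (1.0, 2.0)
```

Because the mixing probabilities sum to one and each component is a proper density, the overall density integrates to one for any covariate value; the semi-parametric method replaces the exponential components with unspecified baseline hazards estimated via ECM.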
Abstract:
This paper is concerned with methods for refinement of specifications written using a combination of Object-Z and CSP. Such a combination has proved to be a suitable vehicle for specifying complex systems which involve state and behaviour, and several proposals exist for integrating these two languages. The basis of the integration in this paper is a semantics of Object-Z classes identical to CSP processes. This allows classes specified in Object-Z to be combined using CSP operators. It has been shown that this semantic model allows state-based refinement relations to be used on the Object-Z components in an integrated Object-Z/CSP specification. However, the current refinement methodology does not allow the structure of a specification to be changed in a refinement, whereas a full methodology would, for example, allow concurrency to be introduced during the development life-cycle. In this paper, we tackle these concerns and discuss refinements of specifications written using Object-Z and CSP where we change the structure of the specification when performing the refinement. In particular, we develop a set of structural simulation rules which allow single components to be refined to more complex specifications involving CSP operators. The soundness of these rules is verified against the common semantic model and they are illustrated via a number of examples.
Abstract:
Over the past decade or so, there has been increasing demand for greater clarity about the major causes of disease and injury, how these differentially affect populations, and how they are changing. In part, this demand has been motivated by resource constraints and a realisation that better health is possible with more informed allocation of resources. At the same time, there has been a change in the way population health and its determinants are quantified, with a much closer integration of the quantitative population sciences (such as epidemiology, demography and health economics) to strengthen and broaden the evidence base for healthcare policy.
Abstract:
The maintenance of arterial pressure at levels adequate to perfuse the tissues is a basic requirement for the constancy of the internal environment and survival. The objective of the present review was to provide information about the basic reflex mechanisms that are responsible for the moment-to-moment regulation of the cardiovascular system. We demonstrate that this control is largely provided by the action of arterial and non-arterial reflexes that detect and correct changes in arterial pressure (baroreflex), blood volume or chemical composition (mechano- and chemosensitive cardiopulmonary reflexes), and changes in blood-gas composition (chemoreceptor reflex). The importance of the integration of these cardiovascular reflexes is well understood and it is clear that processing mainly occurs in the nucleus tractus solitarii, although the mechanism is poorly understood. There are several indications that the interactions of baroreflex, chemoreflex and Bezold-Jarisch reflex inputs, and the central nervous system control the activity of autonomic preganglionic neurons through parallel afferent and efferent pathways to achieve cardiovascular homeostasis. It is surprising that so little appears in the literature about the integration of these neural reflexes in cardiovascular function. Thus, our purpose was to review the interplay between peripheral neural reflex mechanisms of arterial blood pressure and blood volume regulation in physiological and pathophysiological states. Special emphasis is placed on the experimental model of arterial hypertension induced by N-nitro-L-arginine methyl ester (L-NAME) in which the interplay of these three reflexes is demonstrable.
Abstract:
The integration of information and communication technologies into the educational context has been the theme of many conferences and symposia around the world and in Brazil. In this vein, several studies have been carried out with the goal of obtaining methodologies that make the use of new technologies in teaching effective. This article presents a study that investigated the interaction between university students in the exact sciences and a qualitative computational modelling environment in expressive modelling activities. The results obtained show that the students were able to create and modify the model of the proposed system based on their own conceptions.
Abstract:
What sort of component coordination strategies emerge in a software integration process? How can such strategies be discovered and further analysed? How close are they to the coordination component of the envisaged architectural model which was supposed to guide the integration process? This paper introduces a framework in which such questions can be discussed and illustrates its use by describing part of a real case study. The approach is based on a methodology which enables semi-automatic discovery of coordination patterns from source code, combining generalized slicing techniques and graph manipulation.
Abstract:
Nowadays, various standards exist for individual management systems (MSs), at least one for each stakeholder, and new ones will be published. An integrated management system (IMS) aims to integrate some or all components of the business into one coherent and efficient MS. Maximizing integration is increasingly a strategic priority, in that it constitutes an opportunity to eliminate and/or reduce potential factors of value destruction for organizations and also to become more competitive and consequently promote sustainable success. A preliminary investigation was conducted at a Portuguese industrial company which, over the years, has gradually adopted, in whole or in part, individualized management system standards (MSSs). A questionnaire-based study was performed with the objective of developing, in a real business environment, an adequate and efficient IMS-QES (quality, environment, and safety) model, and of laying the groundwork for a future generic IMS model able to integrate other MSSs. The strategy and research methods took the case study into consideration. A set of relevant conclusions was obtained from the statistical analyses of the survey responses. Globally, the investigation results by themselves justified and prioritized the conception of a model for developing the IMS-QES, and the consequent definition and validation of the structure of an IMS-QES model, to be implemented at the small- and medium-sized enterprise (SME) where the investigation was conducted.
Abstract:
Nanotechnology is the manipulation of matter on an almost atomic scale to produce new structures, materials, and devices. As potential occupational exposure to nanomaterials (NMs) becomes more prevalent, it is important that the principles of medical surveillance and risk management be considered for workers in the nanotechnology industry. However, much information about health risk is beyond our current knowledge. Thus, NMs present new challenges to understanding, predicting, and managing potential health risks. First, we briefly describe some general features of NMs and list the most important types of NMs. This review discusses the toxicological potential of NMs by comparing possible injury mechanisms and known, or potentially adverse, health effects. We review the limited research to date on occupational exposure to these particles and how a worker might be exposed to NMs. The principles of medical surveillance are reviewed to further the discussion of occupational health surveillance for workers exposed to NMs. We outline how occupational health professionals could contribute to a better knowledge of health effects through the use of a health surveillance program and by minimizing exposure. Finally, we discuss the early steps towards regulation and the difficulties facing regulators in controlling potentially harmful exposures in the absence of sufficient scientific evidence.
Abstract:
The growing proliferation of management system standards (MSSs), and their individualized implementation, is a real problem faced by organizations. MSSs are aimed at improving the efficiency and effectiveness of organizational responses in order to satisfy the requirements, needs and expectations of stakeholders. Each organization has its own identity, and this is an issue that cannot be neglected; hence, two possible approaches can be considered: first, continue with the implementation of individualized management systems (MSs); or, second, integrate the several MSSs and their related MSs into an integrated management system (IMS). In this context, organizations therefore face a dilemma as a result of the increasing proliferation and diversity of MSSs. This paper draws on the knowledge gained through a case study conducted in the context of a Portuguese company and unveils some of the advantages and disadvantages of integration. A methodology is also proposed and presented to support organizations in developing and structuring the integration process of their individualized MSs, and consequently to minimize problems that generate inefficiencies, value destruction and loss of competitiveness. The obtained results provide relevant information that can support Top Management decisions in solving that dilemma and consequently promote a successful integration, including better control of the business risks associated with MSS requirements and enhanced sustainable performance, considering the context in which organizations operate.