807 results for Information Model
Abstract:
In this paper, a simulation model of glucose-insulin metabolism for Type 1 diabetes patients is presented. The proposed system is based on the combination of Compartmental Models (CMs) and artificial Neural Networks (NNs). The model aims to develop an accurate system that assists Type 1 diabetes patients in managing their blood glucose profile and recognizing dangerous metabolic states. Data from a Type 1 diabetes patient, stored in a database, have been used as input to the hybrid system. The data contain information about measured blood glucose levels, insulin intake, and a description of food intake, along with the corresponding times. The data are passed to three separate CMs, which produce estimates of (i) the effect of Short Acting (SA) insulin intake on blood insulin concentration, (ii) the effect of Intermediate Acting (IA) insulin intake on blood insulin concentration, and (iii) the effect of carbohydrate intake on blood glucose absorption from the gut. The outputs of the three CMs are passed to a Recurrent NN (RNN) in order to predict subsequent blood glucose levels. The RNN is trained with the Real Time Recurrent Learning (RTRL) algorithm. The resulting blood glucose predictions are promising for the use of the proposed model in blood glucose level estimation for Type 1 diabetes patients.
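A minimal sketch of this hybrid pipeline in Python, assuming illustrative one-compartment (Bateman-type) kinetics for the three CMs and a small Elman-style recurrent network; all rate constants, doses, and network sizes are placeholders rather than the authors' fitted values, and the RTRL weight updates are only indicated in comments:

```python
import numpy as np

# Hypothetical one-compartment kinetics (Bateman function): plasma appearance
# of a single dose over time. Rate constants are illustrative placeholders.
def compartment_output(dose, t, k_abs, k_elim):
    return dose * k_abs / (k_abs - k_elim) * (np.exp(-k_elim * t) - np.exp(-k_abs * t))

t = np.arange(0, 24, 0.5)                                    # hours
sa  = compartment_output(8.0,  t, k_abs=2.0, k_elim=0.5)     # SA insulin effect
ia  = compartment_output(12.0, t, k_abs=0.4, k_elim=0.1)     # IA insulin effect
gut = compartment_output(60.0, t, k_abs=1.0, k_elim=0.3)     # gut glucose absorption

# Minimal recurrent network: the hidden state carries glucose history forward.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 8
W_in  = rng.normal(0, 0.1, (n_hidden, n_in))
W_rec = rng.normal(0, 0.1, (n_hidden, n_hidden))
w_out = rng.normal(0, 0.1, n_hidden)

h = np.zeros(n_hidden)
predictions = []
for x in np.stack([sa, ia, gut], axis=1):    # one CM-output vector per time step
    h = np.tanh(W_in @ x + W_rec @ h)        # recurrent state update
    predictions.append(w_out @ h)            # predicted blood glucose deviation

# In the paper the weights are trained online with RTRL, which carries dh/dW
# sensitivity matrices forward at every step; that machinery is omitted here.
print(np.round(predictions[:5], 4))
```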
Abstract:
In this paper we analyze a dynamic agency problem where contracting parties do not know the agent's future productivity at the beginning of the relationship. We consider a two-period model where both the agent and the principal observe the agent's second-period productivity at the end of the first period. This observation is assumed to be non-verifiable information. We compare long-term contracts with short-term contracts with respect to their suitability for motivating effort in both periods. On the one hand, short-term contracts allow for a better fine-tuning of second-period incentives as they can be aligned with the agent's second-period productivity. On the other hand, under short-term contracts, first-period effort incentives might be distorted because contracts have to be sequentially optimal. Hence, the difference between long-term and short-term contracts is characterized by a trade-off between inducing effort in the first and in the second period. We analyze the determinants of this trade-off and demonstrate its implications for performance measurement and information system design.
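As a hedged illustration (the notation below is assumed for exposition, not taken from the paper), the contracting problem can be framed as a standard two-period moral-hazard program:

```latex
% Illustrative only: \theta_2 is the second-period productivity that both
% parties observe (non-verifiably) at the end of period 1; c(e) is the
% agent's cost of effort.
\begin{aligned}
  \max_{w_1(\cdot),\,w_2(\cdot)} \quad
    & \mathbb{E}\!\left[\theta_1 e_1 - w_1 + \theta_2 e_2 - w_2\right] \\
  \text{s.t.} \quad
    & e_t \in \arg\max_{e}\; \mathbb{E}\!\left[w_t \mid e\right] - c(e),
      \qquad t = 1, 2.
\end{aligned}
```

In this framing, a short-term contract can tailor w_2 to the observed θ_2 but must be sequentially optimal, which may distort e_1; a long-term contract fixes w_2 ex ante and gives up that second-period fine-tuning.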
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national lawmakers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test’s open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, Web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay the groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1), and the specific merits of these two distinct approaches (section 2), will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
Endocrine-disrupting compounds (EDCs) are widespread in the aquatic environment and can cause alterations in development, physiological homeostasis, and health of vertebrates. Zebrafish, Danio rerio, has been suggested as a model species to identify targets as well as modes of EDC action. In fact, zebrafish has been found useful in EDC screening, in EDC effects assessment, and in studying targets and mechanisms of EDC action. Since many of the environmental EDCs interfere with the sex steroid system of vertebrates, most EDC studies with zebrafish have addressed disruption of sexual differentiation and reproduction. However, other targets of EDC action must not be overlooked. To use a species as a toxicological model, good knowledge of its biological traits is a prerequisite for the rational design of test protocols and endpoints as well as for the interpretation and extrapolation of the toxicological findings. Due to the genomic resources available for zebrafish and the long experience with zebrafish in toxicity testing, it is easily possible to establish molecular endpoints for EDC effects assessment. Additionally, the zebrafish model offers a number of technical advantages, including ease and cost of maintenance, rapid development, high fecundity, optical transparency of embryos supporting phenotypic screening, the existence of many mutant strains, and amenability to both forward and reverse genetics. To date, the zebrafish has been mainly used to identify molecular targets of EDC action and to determine effect thresholds, while the potential of this model species to study immediate and delayed physiological consequences of molecular interactions has so far been exploited only in part. One factor that may limit the exploitation of this potential is the still rather fragmentary knowledge of basic biological and endocrine traits of zebrafish. Information on species-specific features of endocrine processes and biological properties, however, needs to be considered in establishing EDC test protocols using zebrafish, in extrapolating findings from zebrafish to other vertebrate species, and in understanding how EDC-induced gene expression changes translate into disease.
Abstract:
With steadily increasing regulatory requirements for business processes, automated support for compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules in order to enable widespread adoption and application. Applying a structured literature search strategy, we review and discuss compliance-checking approaches in order to provide insight into their generalizability and evaluation. The results imply that current approaches mainly focus on specific modeling techniques and/or a restricted set of compliance rule types. Most approaches abstain from real-world evaluation, which raises the question of their practical applicability. Referring to the search results, we propose a roadmap for further research in model-based business process compliance checking.
Abstract:
We analyze the impact of stratospheric volcanic aerosols on the diurnal temperature range (DTR) over Europe using long-term subdaily station records. We compare the results with a 28-member ensemble of European Centre/Hamburg version 5.4 (ECHAM5.4) general circulation model simulations. Eight stratospheric volcanic eruptions during the instrumental period are investigated. Seasonal all-sky and clear-sky DTR anomalies are compared with contemporary (approximately 20 year) reference periods. Clear-sky conditions are used to eliminate cloud effects and better estimate the signal from the direct radiative forcing of the volcanic aerosols. We do not find a consistent effect of stratospheric aerosols on all-sky DTR. For clear skies, we find average DTR anomalies of −0.08°C (−0.13°C) in the observations (in the model), with the largest effect in the second winter after the eruption. Although the clear-sky DTR anomalies from different stations, volcanic eruptions, and seasons show heterogeneous signals in terms of order of magnitude and sign, the significantly negative DTR anomalies (e.g., after the Tambora eruption) are qualitatively consistent with other studies. Relating the clear-sky DTR anomalies to the radiative forcing from stratospheric volcanic eruptions, we find the resulting sensitivity to be of the same order of magnitude as previously published estimates for tropospheric aerosols during the so-called “global dimming” period (i.e., 1950s to 1980s). Analyzing cloud cover changes after volcanic eruptions reveals an increase in clear-sky days in both data sets. Quantifying the impact of stratospheric volcanic eruptions on clear-sky DTR over Europe provides valuable information for the study of the radiative effect of stratospheric aerosols and for geoengineering purposes.
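A minimal sketch of the anomaly calculation, assuming a hypothetical daily station record with placeholder column names, cloud screening, and eruption date (Tambora, April 1815, purely for illustration):

```python
import numpy as np
import pandas as pd

# Simulated station record; in practice tmax/tmin come from subdaily readings
# and the clear-sky flag from a cloud-cover screening of the observations.
rng = np.random.default_rng(1)
idx = pd.date_range("1800-01-01", "1830-12-31", freq="D")
df = pd.DataFrame({
    "tmax": 10 + 8 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 2, len(idx)),
    "tmin":  2 + 6 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 2, len(idx)),
    "clear_sky": rng.random(len(idx)) > 0.5,
}, index=idx)

df["dtr"] = df["tmax"] - df["tmin"]              # diurnal temperature range
clear = df[df["clear_sky"]]

eruption = pd.Timestamp("1815-04-10")            # Tambora, for illustration
ref  = clear[eruption - pd.DateOffset(years=20): eruption]   # ~20-yr reference
post = clear[eruption: eruption + pd.DateOffset(years=3)]    # post-eruption window

# Seasonal anomaly: post-eruption mean clear-sky DTR minus the reference mean.
def season(frame):
    return frame.index.month % 12 // 3           # 0=DJF, 1=MAM, 2=JJA, 3=SON

anomaly = post.groupby(season(post))["dtr"].mean() - ref.groupby(season(ref))["dtr"].mean()
print(anomaly)
```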
Abstract:
We examine the impact of identity preferences on the interrelation between incentives and performance measurement. In our model, a manager identifies with an organization and loses utility to the extent that his actions conflict with effort-standards issued by the principal. Contrary to prior arguments in the literature, we find conditions under which a manager who identifies strongly with the organization receives stronger incentives and faces more performance evaluation reports than a manager who does not identify with the organization. Our theory predicts that managers who experience events that boost their identification with the firm can decrease their effort in short-term value creation. We also find that firms are more likely to employ less precise but more congruent performance measures, such as stock prices, when contracting with managers who identify little with the organization. In contrast, they use more precise but less congruent measures, such as accounting earnings, when contracting with managers who identify strongly with the firm.
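For intuition, the precision/congruence distinction can be written in standard agency notation (assumed here for exposition, not the paper's exact model):

```latex
% Illustrative only: the manager's action vector a produces firm value V and
% a performance measure y; congruity compares the directions of \mu and v,
% while precision is the inverse of the measure's noise variance.
V = v^{\top} a, \qquad y = \mu^{\top} a + \varepsilon, \qquad
\text{congruity} = \frac{\mu^{\top} v}{\lVert \mu \rVert \, \lVert v \rVert},
\qquad \text{precision} = \frac{1}{\operatorname{Var}(\varepsilon)}.
```

In these terms, stock price acts as a high-congruity but low-precision measure, and accounting earnings as the reverse, which is how the contracting predictions above are usually read.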
Abstract:
Despite promising cost-saving potential, many offshore software projects fail to realize the expected benefits. A frequent source of failure lies in the insufficient transfer of knowledge during the transition phase. Prior literature has reported cases where some domains of knowledge were successfully transferred to vendor personnel whereas others were not. There is further evidence that the actual knowledge transfer processes often vary from case to case. This raises the question whether there is a systematic relationship between the chosen knowledge transfer process and knowledge transfer success. This paper introduces a dynamic perspective that distinguishes different types of knowledge transfer processes, explaining under which circumstances which type is deemed most appropriate to successfully transfer knowledge. Our paper draws on knowledge transfer literature, the Model of Work-Based Learning, and theories from cognitive psychology to show how characteristics of knowledge and the absorptive capacity of knowledge recipients fit particular knowledge transfer processes. The knowledge transfer processes are conceptualized as combinations of generic knowledge transfer activities. This results in six gestalts of knowledge transfer processes, each representing a fit between the knowledge transfer process, the characteristics of the knowledge to be transferred, and the absorptive capacity of the knowledge recipient.
Abstract:
PURPOSE To extend the capabilities of the Cone Location and Magnitude Index algorithm to include a combination of topographic information from the anterior and posterior corneal surfaces and corneal thickness measurements to further improve our ability to correctly identify keratoconus using this new index: ConeLocationMagnitudeIndex_X. DESIGN Retrospective case-control study. METHODS Three independent data sets were analyzed: 1 development and 2 validation. The AnteriorCornealPower index was calculated to stratify the keratoconus data from mild to severe. The ConeLocationMagnitudeIndex algorithm was applied to all tomography data collected using a dual Scheimpflug-Placido-based tomographer. The ConeLocationMagnitudeIndex_X formula, resulting from analysis of the Development set, was used to determine the logistic regression model that best separates keratoconus from normal and was applied to all data sets to calculate PercentProbabilityKeratoconus_X. The sensitivity/specificity of PercentProbabilityKeratoconus_X was compared with that of the original PercentProbabilityKeratoconus, which uses only anterior axial data. RESULTS The AnteriorCornealPower severity distribution for the combined data sets is 136 mild, 12 moderate, and 7 severe. The logistic regression model generated for ConeLocationMagnitudeIndex_X produces complete separation for the Development set. Validation Set 1 has 1 false negative and Validation Set 2 has 1 false positive. The overall sensitivity/specificity results for the logistic model produced using the ConeLocationMagnitudeIndex_X algorithm are 99.4% and 99.6%, respectively. The overall sensitivity/specificity results using the original ConeLocationMagnitudeIndex algorithm are 89.2% and 98.8%, respectively. CONCLUSIONS ConeLocationMagnitudeIndex_X provides a robust index that can detect the presence or absence of a keratoconic pattern in corneal tomography maps with improved sensitivity/specificity over the original anterior-surface-only ConeLocationMagnitudeIndex algorithm.
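A hedged sketch of the core classification step in Python: a logistic regression separating keratoconus from normal eyes, with sensitivity and specificity computed from confusion counts. The three features and all simulated values are placeholders, not the ConeLocationMagnitudeIndex_X inputs or the study's data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical tomography-derived features (e.g., anterior/posterior cone
# magnitude and thinnest pachymetry in microns); values are simulated.
rng = np.random.default_rng(2)
n = 400
X_normal = rng.normal([0.5, 0.4, 550], [0.2, 0.2, 30], (n, 3))
X_kc     = rng.normal([2.5, 2.0, 480], [0.8, 0.7, 40], (n, 3))
X = np.vstack([X_normal, X_kc])
y = np.r_[np.zeros(n), np.ones(n)]               # 0 = normal, 1 = keratoconus

model = LogisticRegression(max_iter=1000).fit(X, y)
pred = model.predict(X)                          # class with probability > 50%

tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
print(f"sensitivity = {tp / (tp + fn):.3f}, specificity = {tn / (tn + fp):.3f}")
```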
Abstract:
Localized short-echo-time ¹H-MR spectra of the human brain contain contributions from many low-molecular-weight metabolites as well as baseline contributions from macromolecules. Two approaches to model such spectra are compared, and the data acquisition sequence, optimized for reproducibility, is presented. Modeling relies on prior-knowledge constraints and linear combination of metabolite spectra. We investigated what can be gained by basis parameterization, i.e., description of basis spectra as sums of parametric lineshapes. Effects of basis composition and of adding experimentally measured macromolecular baselines were also investigated. Both fitting methods yielded quantitatively similar values, model deviations, error estimates, and reproducibility in the evaluation of 64 spectra of human gray and white matter from 40 subjects. Major advantages of parameterized basis functions are the possibilities to evaluate fitting parameters separately, to treat subgroup spectra as independent moieties, and to incorporate deviations from straightforward metabolite models. It was found that most of the 22 basis metabolites used may provide meaningful data when comparing patient cohorts. In individual spectra, sums of closely related metabolites are often more meaningful. Inclusion of a macromolecular basis component leads to relatively small but significantly different estimates of tissue content for most metabolites. It provides a means to quantitate baseline contributions that may contain crucial clinical information.
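A minimal sketch of the linear-combination idea in Python, with Gaussian lineshapes standing in for measured metabolite basis spectra and one broad component standing in for the macromolecular baseline (all centers, widths, and concentrations are invented for illustration):

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical basis set: narrow peaks stand in for metabolite spectra, one
# broad component for the macromolecular baseline.
ppm = np.linspace(0.5, 4.5, 512)
def gauss(center, width):
    return np.exp(-((ppm - center) / width) ** 2)

basis = np.column_stack([
    gauss(2.02, 0.05),   # "NAA"-like singlet
    gauss(3.03, 0.05),   # "Cr"-like singlet
    gauss(3.22, 0.05),   # "Cho"-like singlet
    gauss(1.30, 0.60),   # broad macromolecular baseline component
])

# Simulated in vivo spectrum: known mixture plus noise.
true_conc = np.array([10.0, 8.0, 2.0, 5.0])
spectrum = basis @ true_conc + np.random.default_rng(3).normal(0, 0.2, ppm.size)

# Linear-combination fit with non-negativity (concentrations cannot be < 0).
conc, residual_norm = nnls(basis, spectrum)
print("estimated concentrations:", conc.round(2))
```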
Abstract:
This paper develops a process model of how and why complementarity and substitution form over time between contractual and relational governance in the context of information systems outsourcing. Our analysis identifies four distinct process patterns that explain this formation as the outcome of interaction processes between key elements of both contractual and relational governance. These patterns unveil the dynamic nature of complementarity and substitution. In particular, we show that the relationship between contractual and relational governance oscillates between complementarity and substitution. Those oscillations are triggered mainly by three types of contextual events (goal fuzziness, goal conflict, and goal misalignment). Surprisingly, substitution of informal control did not occur as an immediate reaction to external events but emerged as a consequence of preceding complementarity. Thus, our study challenges the prevailing view of an either/or dichotomy of complementarity and substitution by showing that they are causally connected over time.
Abstract:
Fourier transform infrared spectroscopy (FTIRS) can provide detailed information on organic and minerogenic constituents of sediment records. Based on a large number of sediment samples of varying age (0–340 000 yrs) and from very diverse lake settings in Antarctica, Argentina, Canada, Macedonia/Albania, Siberia, and Sweden, we have developed universally applicable calibration models for the quantitative determination of biogenic silica (BSi; n = 816), total inorganic carbon (TIC; n = 879), and total organic carbon (TOC; n = 3164) using FTIRS. These models are based on the differential absorbance of infrared radiation at specific wavelengths with varying concentrations of individual parameters, due to molecular vibrations associated with each parameter. The calibration models have low prediction errors, and the predicted values are highly correlated with conventionally measured values (R = 0.94–0.99). Robustness tests indicate that the accuracy of the newly developed FTIRS calibration models is similar to that of conventional geochemical analyses. Consequently, FTIRS offers a useful and rapid alternative to conventional analyses for the quantitative determination of BSi, TIC, and TOC. The rapidity, cost-effectiveness, and small sample size required enable FTIRS determination of geochemical properties to be undertaken at higher resolution than would otherwise be possible with the same resource allocation, thus providing crucial sedimentological information for climatic and environmental reconstructions.
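A hedged sketch of such a calibration in Python, using partial least squares regression (a common choice for spectra-to-concentration calibrations; the paper's exact method is not restated here). The spectra, band shape, and TOC range are simulated placeholders:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Simulated FTIR absorbance spectra: one organic absorption band whose
# strength scales with TOC, plus noise. Real calibrations use measured
# spectra and conventionally analyzed TOC values as the response.
rng = np.random.default_rng(4)
n_samples, n_wavenumbers = 300, 600
toc = rng.uniform(0, 20, n_samples)                      # % TOC, placeholder range
band = np.exp(-np.linspace(-3, 3, n_wavenumbers) ** 2)
spectra = np.outer(toc, band) + rng.normal(0, 0.05, (n_samples, n_wavenumbers))

# Calibrate on a training split, then check predictions on held-out samples.
X_train, X_test, y_train, y_test = train_test_split(spectra, toc, random_state=0)
pls = PLSRegression(n_components=5).fit(X_train, y_train)
print("held-out R^2:", round(pls.score(X_test, y_test), 3))
```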