801 results for Ontology generation
Abstract:
We show that stochastic electrodynamics and quantum mechanics give quantitatively different predictions for the quantum nondemolition (QND) correlations in travelling wave second harmonic generation. Using phase space methods and stochastic integration, we calculate correlations in both the positive-P and truncated Wigner representations, the latter being equivalent to the semiclassical theory of stochastic electrodynamics. We show that the semiclassical results differ in the regions where the system performs best in relation to the QND criteria, and that they significantly overestimate the performance in these regions. (C) 2001 Published by Elsevier Science B.V.
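The phase-space calculations described above come down to numerically integrating stochastic differential equations. As a hedged illustration only (a generic damped, noisy mode, not the paper's actual positive-P or truncated Wigner equations), the sketch below uses the Euler-Maruyama scheme, the standard workhorse for such stochastic integration:

```python
import numpy as np

def euler_maruyama(x0, drift, diffusion, dt, steps, rng):
    """Integrate dx = drift(x) dt + diffusion(x) dW with the Euler-Maruyama scheme."""
    xs = np.empty(steps + 1)
    xs[0] = x0
    for k in range(steps):
        dW = rng.normal(0.0, np.sqrt(dt))  # Wiener increment over one step
        xs[k + 1] = xs[k] + drift(xs[k]) * dt + diffusion(xs[k]) * dW
    return xs

# Illustrative damped, noisy mode: dx = -gamma*x dt + sqrt(2*D) dW
gamma, D = 1.0, 0.5
rng = np.random.default_rng(0)
traj = euler_maruyama(1.0, lambda x: -gamma * x, lambda x: np.sqrt(2 * D), 1e-3, 5000, rng)
```

In practice the correlations reported in such studies are averages over many independent trajectories of this kind, one set per phase-space representation.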
Abstract:
Management are keen to maximize the life span of an information system because of the high cost, organizational disruption, and risk of failure associated with the re-development or replacement of an information system. This research investigates the effects that various factors have on an information system's life span by understanding how the factors affect an information system's stability. The research builds on a previously developed two-stage model of information system change whereby an information system is either in a stable state of evolution, in which the information system's functionality is evolving, or in a state of revolution, in which the information system is being replaced because it is not providing the functionality expected by its users. A case study surveyed a number of systems within one organization. The aim was to test whether a relationship existed between the base value of the volatility index (a measure of the stability of an information system) and certain system characteristics. Data relating to some 3000 user change requests covering 40 systems over a 10-year period were obtained. The following factors were hypothesized to have significant associations with the base value of the volatility index: language level (generation of language of construction), system size, system age, and the timing of changes applied to a system. Significant associations were found in the hypothesized directions except that the timing of user changes was not associated with any change in the value of the volatility index. Copyright (C) 2002 John Wiley & Sons, Ltd.
Abstract:
This paper attempts a state-of-the-art summary of research into thunderstorm wind fields from an engineering perspective. The characteristics of thunderstorms and the two extreme wind events spawned by thunderstorms, tornadoes and downbursts, are described. The significant differences from traditional boundary layer flows are highlighted. The importance of thunderstorm gusts in the worldwide database of extreme wind events is established. Physical simulations of tornadoes and downbursts are described and discussed, leading to the recommendation that Wind Engineering needs to focus more resources on the fundamental issue: what is the flow structure in the strongest winds? © 2002 Published by Elsevier Science Ltd.
Abstract:
Studies on purified blood dendritic cells (DCs) are hampered by poor viability in tissue culture. We, therefore, attempted to study some of the interactions/relationships between DCs and other blood cells by culturing unseparated peripheral blood mononuclear cell (PBMC) preparations in vitro. Flow cytometric techniques were used to undertake a phenotypic and functional analysis of DCs within the cultured PBMC population. We discovered that both the CD11c(+) and CD11c(-) CD123(hi) DC subsets maintained their viability throughout the 3-day culture period, without the addition of exogenous cytokines. This viability was accompanied by progressive up-regulation of the surface costimulatory (CD40, CD80, CD86) and activation (CMRF-44, CMRF-56, CD83) molecules. The survival and apparent production of DCs in PBMC culture (without exogenous cytokines) and that of sorted DCs (with cytokines) were evaluated and compared by using TruCOUNT analysis. Absolute DC counts increased (for CD123(hi) and CD11c(+) subsets) after overnight culture of PBMCs. Single-cell lineage depletion experiments demonstrated the rapid and spontaneous emergence of new in vitro generated DCs from CD14(+)/CD16(+) PBMC radioresistant precursors, additional to the preexisting ex vivo DC population. Unlike monocyte-derived DCs, blood DCs increased dextran uptake with culture and activation. Finally, DCs obtained after culture of PBMCs for 3 days were as effective as freshly isolated DCs in stimulating an allogeneic mixed leukocyte reaction. (C) 2002 by The American Society of Hematology.
Abstract:
Solid earth simulations have recently been developed to address issues such as natural disasters, global environmental destruction and the conservation of natural resources. The simulation of solid earth phenomena involves the analysis of complex structures including strata, faults, and heterogeneous material properties. Simulation of the generation and cycle of earthquakes is particularly important, but such simulations require the analysis of complex fault dynamics. GeoFEM is a parallel finite-element analysis system intended for problems in solid earth phenomena. This paper describes recent developments in the GeoFEM project for the simulation of earthquake generation and cycles.
Abstract:
Conceptual modelling is an activity undertaken during information systems development work to build a representation of selected semantics about some real-world domain. Ontological theories have been developed to account for the structure and behavior of the real world in general. In this paper, I discuss why ontological theories can be used to inform conceptual modelling research, practice, and pedagogy. I provide examples from my research to illustrate how a particular ontological theory has enabled me to improve my understanding of certain conceptual modelling practices and grammars. I also describe how some colleagues and I have used this theory to generate several counter-intuitive, sometimes surprising predictions about widely advocated conceptual modelling practices, predictions that were subsequently supported in empirical research we undertook. Finally, I discuss several possibilities and pitfalls I perceive to be associated with using ontological theories to underpin research on conceptual modelling.
Abstract:
The Kunjin replicon was used to express a polytope that consisted of seven hepatitis C virus cytotoxic T lymphocyte epitopes and one influenza cytotoxic T lymphocyte epitope for vaccination studies. The self-replicating nature of, and expression from, the ribonucleic acid was confirmed in vitro. Initial vaccinations with one dose of Kun-Poly ribonucleic acid showed that an influenza-specific cytotoxic T lymphocyte response was elicited more efficiently by intradermal inoculation compared with intramuscular delivery. Two micrograms of ribonucleic acid delivered in the ear pinnae of mice was sufficient to elicit a detectable cytotoxic T lymphocyte response 10 days post-vaccination. Further vaccination studies showed that four of the seven hepatitis C virus cytotoxic T lymphocyte epitopes were able to elicit weak cytotoxic T lymphocyte responses whereas the influenza epitope was able to elicit strong, specific cytotoxic T lymphocyte responses following three doses of Kun-Poly ribonucleic acid. These studies vindicate the use of the Kunjin replicon as a vector to deliver encoded proteins for the development of cell-mediated immune responses.
Abstract:
For dynamic simulations to be credible, verification of the computer code must be an integral part of the modelling process. This two-part paper describes a novel approach to verification through program testing and debugging. In Part 1, a methodology is presented for detecting and isolating coding errors using back-to-back testing. Residuals are generated by comparing the output of two independent implementations, in response to identical inputs. The key feature of the methodology is that a specially modified observer is created using one of the implementations, so as to impose an error-dependent structure on these residuals. Each error can be associated with a fixed and known subspace, permitting errors to be isolated to specific equations in the code. It is shown that the geometric properties extend to multiple errors in either one of the two implementations. Copyright (C) 2003 John Wiley & Sons, Ltd.
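The core of back-to-back testing can be sketched in a few lines: drive two independently coded implementations with identical inputs and inspect the residual (the output difference), which is zero when both are correct. The sketch below is illustrative only; `impl_a`, `impl_b`, the filter, and the deliberate bug are invented for demonstration and are not the paper's observer-based construction:

```python
def impl_a(u):
    """Reference implementation of a first-order low-pass filter."""
    a, y, out = 0.9, 0.0, []
    for x in u:
        y = a * y + (1 - a) * x
        out.append(y)
    return out

def impl_b(u):
    """Independent re-implementation, seeded here with a deliberate coding error."""
    a, y, out = 0.9, 0.0, []
    for x in u:
        y = a * y + (1 - a) * x + 0.01   # bug: spurious additive offset
        out.append(y)
    return out

def residuals(u):
    """Back-to-back test: feed both implementations the same input sequence
    and return the output differences; nonzero residuals flag a coding error."""
    return [ya - yb for ya, yb in zip(impl_a(u), impl_b(u))]

r = residuals([1.0] * 50)   # nonzero residuals expose the bug in impl_b
```

The paper goes further by shaping the residuals with a modified observer so that each error lands in a known subspace; the sketch only shows the detection half of the idea.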
Abstract:
Background: Although excessive ethanol consumption is known to lead to a variety of adverse effects in the heart, the molecular mechanisms of such effects have remained poorly defined. We hypothesized that posttranslational covalent binding of reactive molecular species to proteins occurs in the heart in response to acute ethanol exposure. Methods: The generation of protein adducts with several aldehydic species was examined by using monospecific antibodies against adducts with malondialdehyde (MDA), acetaldehyde (AA), MDA-AA hybrids, and hydroxyethyl radicals. Specimens of heart tissue were obtained from rats after intraperitoneal injections with alcohol (75 mmol/kg body weight) with or without pretreatment with cyanamide (0.05 mmol/kg body weight), an aldehyde dehydrogenase inhibitor. Results: The amounts of MDA and unreduced AA adducts were found to be significantly increased in the heart of the rats treated with ethanol, cyanamide, or both, whereas no other adducts were detected in statistically significant quantities. Immunohistochemical studies for characterization of adduct distribution revealed sarcolemmal adducts of both MDA and AA in the rats treated with ethanol and cyanamide in addition to intracellular adducts, which were also present in the group treated with ethanol alone. Conclusions: These findings support the role of enhanced lipid peroxidation and the generation of protein-aldehyde condensates in vivo as a result of excessive ethanol intake. These findings may have implications in the molecular mechanisms of cardiac dysfunction in alcoholics.
Abstract:
Enterprise Architecture promotes the establishment of a holistic view of an organization's structure and way of working. One of the aspects addressed in Enterprise Architecture is the organization's "active structure", which concerns "who" performs the organizational activities. Several approaches have been proposed to provide a means of representing Enterprise Architecture, among them ARIS, RM-ODP, UPDM and ArchiMate. Despite their acceptance by the community, the existing approaches focus on different purposes, have scope limitations, and some lack well-defined real-world semantics. Besides the modelling approaches, many ontology approaches have been proposed to describe the active structure domain, including the ontologies of the SUPER Project, TOVE, the Enterprise Ontology and the W3C Org Ontology. Although specified for semantic grounding and meaning negotiation, some of the proposed approaches have specific purposes and limited coverage. Moreover, some of the approaches are not defined using formal languages, and others are specified using languages without well-defined semantics. This work presents a well-founded reference ontology for the organizational domain. The reference organizational ontology presented covers the basic aspects discussed in the organizational literature, such as division of labour, social relations and the classification of structural units. It also covers the organizational aspects defined in existing approaches, taking into account both modelling approaches and ontological approaches. The resulting ontology is specified in OntoUML and extends the social concepts of UFO-C.
Abstract:
The paper proposes a methodology focused on the generation of strategic plans of action, emphasizing the relevance of a structured timeframe classification for the actions. The methodology explicitly recognizes the relevance of long-term goals as strategic drivers, which must ensure that the complex system can respond effectively to changes in the environment. In addition, the methodology employs engineering systems techniques to understand the inner workings of the system and to build alternative plans of action. Owing to these different aspects, the proposed approach offers greater flexibility than traditional methods. The validity and effectiveness of the methodology are demonstrated by analyzing an airline company composed of five subsystems, with the aim of defining a plan of action for the next five years that can improve efficiency, redefine the mission, or increase revenues.
Abstract:
One of the most efficient approaches to generate the side information (SI) in distributed video codecs is through motion compensated frame interpolation, where the current frame is estimated based on past and future reference frames. However, this approach leads to significant spatial and temporal variations in the correlation noise between the source at the encoder and the SI at the decoder. In such a scenario, it would be useful to design an architecture where the SI can be more robustly generated at the block level, avoiding the creation of SI frame regions with lower correlation, which are largely responsible for some coding efficiency losses. In this paper, a flexible framework to generate SI at the block level in two modes is presented: while the first mode corresponds to a motion compensated interpolation (MCI) technique, the second mode corresponds to a motion compensated quality enhancement (MCQE) technique where a low quality Intra block sent by the encoder is used to generate the SI by doing motion estimation with the help of the reference frames. The novel MCQE mode can be overall advantageous from the rate-distortion (RD) point of view, even if some rate has to be invested in the low quality Intra coding blocks, for blocks where the MCI produces SI with lower correlation. The overall solution is evaluated in terms of RD performance, with improvements of up to 2 dB, especially for high motion video sequences and long Group of Pictures (GOP) sizes.
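The block-level mode decision described above can be sketched as follows. This is a deliberately naive illustration: the averaging interpolation, the block size, the agreement threshold, and the fallback rule are assumptions for demonstration, not the paper's actual MCI (motion compensated) or MCQE algorithms:

```python
import numpy as np

def block_si(past, future, intra_hint, block=8, thresh=20.0):
    """Per-block side information: average the co-located past/future blocks
    (interpolation mode) when the two references agree; otherwise fall back
    to the low-quality Intra block sent by the encoder (enhancement mode)."""
    h, w = past.shape
    si = np.empty((h, w))
    modes = []
    for r in range(0, h, block):
        for c in range(0, w, block):
            p = past[r:r+block, c:c+block].astype(float)
            f = future[r:r+block, c:c+block].astype(float)
            if np.abs(p - f).mean() < thresh:               # references agree
                si[r:r+block, c:c+block] = (p + f) / 2.0    # interpolate
                modes.append("MCI")
            else:                                           # poor correlation
                si[r:r+block, c:c+block] = intra_hint[r:r+block, c:c+block]
                modes.append("MCQE")
    return si, modes

past = np.zeros((16, 16))
future = np.zeros((16, 16))
future[8:, 8:] = 200.0      # one region changes a lot between the references
si, modes = block_si(past, future, np.full((16, 16), 100.0))
```

In the static regions the two references agree and the SI is interpolated; in the changed region the sketch switches mode, mirroring the intuition that low-correlation blocks are better served by the Intra hint.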
Abstract:
A Blumlein line is a particular Pulse Forming Line (PFL) configuration that allows the generation of high-voltage sub-microsecond square pulses, with the same voltage amplitude as the dc charging voltage, into a matched load. By stacking n Blumlein lines one can, in theory, multiply the input dc charging voltage amplitude by n. In order to understand the operating behavior of this electromagnetic system and to further optimize its operation, it is fundamental to model it theoretically, that is, to calculate the voltage amplitude at each circuit point and the time instant at which it occurs. To do this, one needs to define the reflection and transmission coefficients at each point where an impedance discontinuity occurs. The experimental results of a fast solid-state switch discharging a three-stage Blumlein stack will be compared with the theoretical ones.
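The reflection and transmission coefficients at an impedance discontinuity follow the standard transmission-line formulas G = (Z2 - Z1)/(Z2 + Z1) and T = 1 + G; a minimal sketch (the example impedances are illustrative, not from the paper):

```python
def reflection(z1, z2):
    """Voltage reflection coefficient at a discontinuity from impedance z1 into z2."""
    return (z2 - z1) / (z2 + z1)

def transmission(z1, z2):
    """Voltage transmission coefficient into z2 (equals 1 + reflection)."""
    return 1.0 + reflection(z1, z2)

# A matched load absorbs the pulse: no reflection.
g_matched = reflection(50.0, 50.0)
# A step up in impedance partially reflects and boosts the transmitted voltage.
g_step = reflection(50.0, 150.0)
```

Tracking these coefficients at every discontinuity, together with the line transit times, is what lets one tabulate the voltage amplitude and arrival instant at each circuit point of the stack.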