985 results for Model transformations
Abstract:
In this study we propose a Bayesian model selection methodology in which the best model is selected from a list of candidate structural explanatory models. The model structure is based on Zellner's (1971) explanatory model with autoregressive errors. For the selection technique we use a parsimonious model in which the model variables are transformed using the Box and Cox (1964) class of transformations.
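The Box and Cox (1964) class of transformations mentioned above has a simple closed form; the following is an illustrative sketch, not the authors' code:

```python
import numpy as np

def box_cox(y, lam):
    """Box-Cox (1964) power transformation of strictly positive data:
    (y**lam - 1) / lam for lam != 0, approaching log(y) as lam -> 0."""
    y = np.asarray(y, dtype=float)
    if np.any(y <= 0):
        raise ValueError("Box-Cox requires strictly positive data")
    if lam == 0:
        return np.log(y)
    return (y ** lam - 1.0) / lam

y = np.array([1.0, 2.0, 4.0])
print(box_cox(y, 1.0))   # shifted identity: [0. 1. 3.]
print(box_cox(y, 0.0))   # natural log of y
```

In a model selection setting, lam is typically treated as an extra parameter per variable, so candidate models differ both in their regressors and in the transformation applied to each.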
Abstract:
We consider the problem of assessing the number of clusters in a limited number of tissue samples containing gene expressions for possibly several thousand genes. It is proposed to use a normal mixture model-based approach to the clustering of the tissue samples. One advantage of this approach is that the question of the number of clusters in the data can be formulated as a test of the smallest number of components in the mixture model compatible with the data. This test can be carried out on the basis of the likelihood ratio test statistic, using resampling to assess its null distribution. The effectiveness of this approach is demonstrated on simulated data and on some microarray datasets considered previously in the bioinformatics literature. (C) 2004 Elsevier Inc. All rights reserved.
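The resampling approach to the likelihood ratio test described above can be sketched as a parametric bootstrap; this is an illustrative reconstruction using scikit-learn's `GaussianMixture`, not the authors' implementation:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def bootstrap_lrt(X, g0, n_boot=99, seed=0):
    """Parametric-bootstrap likelihood ratio test of H0: g0 mixture
    components against H1: g0 + 1 components."""
    def fit(data, g):
        return GaussianMixture(n_components=g, n_init=5,
                               random_state=0).fit(data)
    n = len(X)
    m0, m1 = fit(X, g0), fit(X, g0 + 1)
    lr_obs = 2 * n * (m1.score(X) - m0.score(X))   # -2 log LR statistic
    # Share one RandomState so successive sample() calls draw fresh data
    m0.random_state = np.random.RandomState(seed)
    exceed = 0
    for _ in range(n_boot):
        Xb, _ = m0.sample(n)                       # simulate under H0
        b0, b1 = fit(Xb, g0), fit(Xb, g0 + 1)
        if 2 * n * (b1.score(Xb) - b0.score(Xb)) >= lr_obs:
            exceed += 1
    return (1 + exceed) / (n_boot + 1)             # bootstrap p-value
```

The smallest number of components compatible with the data is then found by increasing g0 until the test no longer rejects.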
Abstract:
The edge-to-edge matching model, originally developed for predicting crystallographic features of diffusional phase transformations in solids, has been used to understand the formation of in-plane textures in TiSi2 (C49) thin films on the (001) surface of single-crystal Si. The model predicts all four previously reported orientation relationships between C49 and the Si substrate based only on the actual atom matching across the interface and basic crystallographic data. The model has strong potential for use in developing new thin-film materials. (c) 2006 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
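The basic screening quantity of the edge-to-edge matching model is the fractional misfit between interatomic spacings along candidate matching atom rows; a minimal sketch with hypothetical spacings (not values from the paper):

```python
def interatomic_misfit(d_matrix, d_phase):
    """Fractional misfit between the interatomic spacings of a candidate
    matching atom-row pair. In edge-to-edge matching, pairs with small
    misfit (typically below roughly 10%) are candidates for an
    orientation relationship."""
    return abs(d_matrix - d_phase) / d_matrix

# Hypothetical row spacings in angstroms (illustrative only)
misfit = interatomic_misfit(3.84, 3.60)
print(f"misfit = {misfit:.1%}")
```

Repeating this screen over all close-packed or nearly close-packed rows and planes of the two phases yields the small set of predicted orientation relationships.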
Abstract:
Current models of word production assume that words are stored as linear sequences of phonemes which are structured into syllables only at the moment of production, on the grounds that syllable structure is always recoverable from the sequence of phonemes. In contrast, we present theoretical and empirical evidence that syllable structure is lexically represented. Storing syllable structure would have the advantage of making representations more stable and resistant to damage; moreover, re-syllabifications affect only a minimal part of phonological representations and occur only in some languages, depending on speech register. Evidence for these claims comes from analyses of aphasic errors which not only respect phonotactic constraints, but also avoid transformations that move the syllabic structure of the word further away from the original structure, even when equated for segmental complexity. This is true across tasks, types of errors and, crucially, types of patients. The same syllabic effects are shown by apraxic patients and by phonological patients who have more central difficulties in retrieving phonological representations. If syllable structure were only computed after phoneme retrieval, it would have no way to influence the errors of phonological patients. Our results have implications for psycholinguistic and computational models of language as well as for clinical and educational practices.
Abstract:
The article describes the structure of an ontology model for the optimization of sequential programs. The components of an intelligent modeling system for program optimization are described, and the functions of the intelligent modeling system are defined.
Abstract:
The paper presents a short review of systems that perform program transformations on the basis of internal intermediate representations of the programs. Many such systems aim to support several source languages and must solve the task of translating them into the internal representation. This task remains a challenge because it is effort-consuming. To reduce the effort, translator-construction systems and ready-made compilers with existing grammars from outside designers are used. Although this approach saves effort, it has drawbacks and constraints. The paper presents the general idea of using a mapping approach to solve the task within the framework of program transformations and to overcome the disadvantages of existing systems. It demonstrates a fragment of the ontology model for mapping high-level languages onto a single representation and gives an example of how the description of (a fragment of) a particular mapping is represented in accordance with the ontology model.
Abstract:
Supply chains comprise complex processes spanning multiple trading partners. The various operations involved generate a large number of events that need to be integrated in order to enable internal and external traceability. Furthermore, provenance of the artifacts and agents involved in supply chain operations is now a key traceability requirement. In this paper we propose a Semantic Web/Linked Data powered framework for the event-based representation and analysis of supply chain activities governed by the EPCIS specification. We specifically show how a new EPCIS event type called "Transformation Event" can be semantically annotated using EEM (the EPCIS Event Model) to generate linked data that can be exploited for internal event-based traceability in supply chains involving transformation of products. For integrating provenance with traceability, we propose a mapping from EEM to PROV-O. We exemplify our approach on an abstraction of the production processes that are part of the wine supply chain.
Abstract:
The objective of this study was to develop a model to predict the transport and fate of gasoline components of environmental concern in the Miami River by mathematically simulating the movement of dissolved benzene, toluene and xylene (BTX) and methyl tertiary-butyl ether (MTBE) originating from minor gasoline spills in the inter-tidal zone of the river. Computer codes were based on mathematical algorithms that account for the advective and dispersive physical phenomena along the river and the prevailing phase transformations of BTX and MTBE. Phase transformations included volatilization and settling.
The model used a finite-difference scheme under steady-state conditions, with a set of numerical equations that was solved by two numerical methods: Gauss-Seidel and Jacobi iteration. A numerical validation process was conducted by comparing the results from both methods with analytical and numerical reference solutions. Since similar trends were achieved in the validation process, it was concluded that the computer codes were algorithmically correct. The Gauss-Seidel iteration yielded a faster convergence rate than the Jacobi iteration and was therefore selected for further development of the computer program and software. The model was then analysed for its sensitivity: it was found to be very sensitive to wind speed but not to sediment settling velocity.
Computer software was developed with the model code embedded. The software provided two major user-friendly visual forms, one to interface with the database files and the other to execute the model and present the graphical and tabulated results. For all predicted concentrations of BTX and MTBE, the maximum concentrations were over an order of magnitude lower than current drinking water standards. It should be pointed out, however, that concentrations below these reported standards, although not harmful to humans, may be very harmful to organisms at the trophic levels of the Miami River ecosystem and associated waters. This computer model can be used for the rapid assessment and management of the effects of minor gasoline spills on inter-tidal riverine water quality.
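The two iterative solvers compared in the study can be illustrated on a small diagonally dominant system; this sketch is generic and not the study's code:

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=10_000):
    """Jacobi iteration: each sweep uses only the previous iterate."""
    x = np.zeros_like(b, dtype=float)
    D = np.diag(A)                     # diagonal entries
    R = A - np.diagflat(D)             # off-diagonal part
    for k in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x, np.inf) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

def gauss_seidel(A, b, tol=1e-10, max_iter=10_000):
    """Gauss-Seidel: each update reuses components already refreshed in
    the current sweep, which typically converges in fewer sweeps."""
    n = len(b)
    x = np.zeros(n)
    for k in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, k + 1
    return x, max_iter

# Small diagonally dominant system standing in for the discretized
# steady-state transport equations (hypothetical values)
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([15.0, 10.0, 10.0])
xj, itj = jacobi(A, b)
xg, itg = gauss_seidel(A, b)
print(itj, itg)   # Gauss-Seidel takes fewer sweeps
```

For diagonally dominant systems like those arising from finite-difference discretizations, both methods converge, with Gauss-Seidel roughly halving the iteration count, consistent with the study's choice of Gauss-Seidel.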
Abstract:
Dissolved organic matter (DOM) is one of the largest carbon reservoirs on the planet and is present in aquatic environments as a highly complex mixture of organic compounds. The Florida coastal Everglades (FCE) is one of the largest wetlands in the world, and DOM is an important biogeochemical component of this system, as most of the nitrogen (N) and phosphorus (P) are in organic forms. Achieving a better understanding of DOM dynamics in large coastal wetlands is critical, and a particularly important issue in the context of Everglades restoration. In this work, the environmental dynamics of surface-water DOM were investigated on spatial and temporal scales. In addition, the photo- and bio-reactivity of this DOM was determined, surface-to-groundwater exchange of DOM was investigated, and the size distribution of freshwater DOM in the Everglades was assessed. The data show that DOM dynamics in this ecosystem are controlled by both hydrological and ecological drivers, differ clearly on spatial scales, and vary seasonally. The DOM reactivity data, modelled with a multi-pool first-order degradation kinetics model, showed that fluorescent DOM in the FCE is generally photo-reactive and bio-refractory. Yet sequential degradation experiments demonstrated a "priming effect" of sunlight on the bacterial uptake and reworking of this subtropical wetland DOM. Interestingly, specific PARAFAC components were found to have different photo- and bio-degradation rates, suggesting a highly heterogeneous nature of the fluorophores associated with the DOM. Surface-to-groundwater exchange of DOM was observed in different regions of the system, and compositional differences were associated with source and photo-reactivity. Lastly, the high degree of heterogeneity of DOM-associated fluorophores suggested by the degradation studies was confirmed through EEM-PARAFAC analysis of DOM along a molecular size continuum, indicating that the fluorescence characteristics of DOM are strongly controlled by different size fractions, which can exhibit significant differences in reactivity.
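A multi-pool first-order degradation kinetics model of the kind used above expresses the remaining signal as a sum of independently decaying pools; a minimal sketch with hypothetical pool fractions and rate constants:

```python
import numpy as np

def multi_pool_decay(t, fractions, rate_constants):
    """Remaining fraction under multi-pool first-order kinetics:
        C(t) / C0 = sum_i f_i * exp(-k_i * t),   with sum_i f_i = 1,
    where each pool i decays independently at rate k_i."""
    t = np.asarray(t, dtype=float)
    f = np.asarray(fractions, dtype=float)
    k = np.asarray(rate_constants, dtype=float)
    return (f[:, None] * np.exp(-k[:, None] * t)).sum(axis=0)

# Hypothetical two-pool example: a photo-labile pool decaying quickly
# and a refractory pool decaying slowly (rates per day)
t_days = np.linspace(0.0, 30.0, 7)
remaining = multi_pool_decay(t_days, [0.6, 0.4], [0.5, 0.01])
```

Fitting the fractions and rate constants to a degradation time series is what allows different PARAFAC components to be assigned different photo- and bio-degradation rates.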
Abstract:
Moving through a stable, three-dimensional world is a hallmark of our motor and perceptual experience. This stability is constantly being challenged by movements of the eyes and head, inducing retinal blur and retino-spatial misalignments for which the brain must compensate. To do so, the brain must account for eye and head kinematics to transform two-dimensional retinal input into the reference frame necessary for movement or perception. The four studies in this thesis used both computational and psychophysical approaches to investigate several aspects of this reference frame transformation. In the first study, we examined the neural mechanism underlying the visuomotor transformation for smooth pursuit using a feedforward neural network model. After training, the model performed the general, three-dimensional transformation using gain modulation. This gave mechanistic significance to gain modulation observed in cortical pursuit areas while also providing several testable hypotheses for future electrophysiological work. In the second study, we asked how anticipatory pursuit, which is driven by memorized signals, accounts for eye and head geometry using a novel head-roll updating paradigm. We showed that the velocity memory driving anticipatory smooth pursuit relies on retinal signals, but is updated for the current head orientation. In the third study, we asked how forcing retinal motion to undergo a reference frame transformation influences perceptual decision making. We found that simply rolling one's head impairs perceptual decision making in a way captured by stochastic reference frame transformations. In the final study, we asked how torsional shifts of the retinal projection occurring with almost every eye movement influence orientation perception across saccades. We found a pre-saccadic, predictive remapping consistent with maintaining a purely retinal (but spatially inaccurate) orientation perception throughout the movement. 
Together these studies suggest that, despite their spatial inaccuracy, retinal signals play a surprisingly large role in our seamless visual experience. This work therefore represents a significant advance in our understanding of how the brain performs one of its most fundamental functions.
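The reference frame transformation at the heart of these studies can be illustrated in its simplest two-dimensional form, rotating a retinal vector by the head-roll angle (the thesis treats the full three-dimensional eye-head kinematics):

```python
import numpy as np

def retinal_to_spatial(retinal_vec, head_roll_deg):
    """Rotate a 2-D retinal vector by the head-roll angle to recover
    its direction in space: the simplest case of an eye/head
    reference frame transformation."""
    a = np.deg2rad(head_roll_deg)
    rot = np.array([[np.cos(a), -np.sin(a)],
                    [np.sin(a),  np.cos(a)]])
    return rot @ np.asarray(retinal_vec, dtype=float)

# With the head rolled 90 degrees, rightward retinal motion is
# upward in space
print(retinal_to_spatial([1.0, 0.0], 90.0))
```

Noise in the estimate of the roll angle propagates through this rotation, which is the intuition behind the stochastic reference frame transformations invoked in the third study.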
Abstract:
We study work extraction from the Dicke model achieved using simple unitary cyclic transformations, taking into account both a non-optimal unitary protocol and the energetic cost of creating the initial state. By analyzing the role of entanglement, we find that highly entangled states can be inefficient for energy storage once the energetic cost of creating the state is considered. This surprising result holds notwithstanding the fact that the criticality of the model at hand can markedly improve the extraction of work. While showing the advantages of using a many-body system for work extraction, our results demonstrate that entanglement is not necessarily advantageous for energy storage purposes when non-optimal processes are considered. Our work shows the importance of better understanding the complex interconnections between the non-equilibrium thermodynamics of quantum systems and the correlations among their subparts.
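Work extraction by cyclic unitaries is bounded by the ergotropy of the state; a generic qubit-sized sketch of that optimal-protocol benchmark (not the paper's Dicke-model analysis):

```python
import numpy as np

def ergotropy(rho, H):
    """Maximum work extractable from state rho by a cyclic unitary:
    W = tr(rho H) - tr(sigma H), where sigma is the passive state
    obtained by pairing rho's populations (sorted decreasingly) with
    H's energies (sorted increasingly)."""
    p = np.sort(np.linalg.eigvalsh(rho))[::-1]   # populations, descending
    e = np.sort(np.linalg.eigvalsh(H))           # energies, ascending
    return float(np.real(np.trace(rho @ H)) - p @ e)

# Qubit example (not the Dicke model): the fully excited state
# yields one unit of extractable work, the ground state none
H = np.diag([0.0, 1.0])
print(ergotropy(np.diag([0.0, 1.0]), H))   # 1.0
print(ergotropy(np.diag([1.0, 0.0]), H))   # 0.0
```

A non-optimal protocol extracts strictly less than this bound, and subtracting the energetic cost of preparing rho in the first place is what can make highly entangled states net-inefficient for storage.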
Abstract:
The forensic toxicologist faces challenges in the detection of drugs and poisons in biological samples due to transformations which occur both during life and after death. For example, changes can result from drug metabolism during life or from the use of formalin solution for post-mortem embalming purposes. The former requires the identification of drug metabolites, and the latter the identification of chemical reaction products, in order to know which substances had been administered. The work described in this thesis was aimed at providing ways of tackling these challenges and was divided into two parts. Part 1 investigated the use of in vitro drug metabolism by human liver microsomes (HLM) to obtain information on drug metabolites, and Part 2 investigated the chemical reactions of drugs and a carbamate pesticide with formalin solution and formalin-blood. The initial aim of Part 1 was to develop an in vitro metabolism method using HLM, based on a literature review of previous studies of this type. MDMA was chosen as a model compound to develop the HLM method because its metabolism was known and standards of its metabolites were commercially available. In addition, a sensitive and selective method was developed for the identification and quantitation of hydrophilic phase I drug metabolites using LC/MS/MS with a conventional reversed-phase (C18) column. In order to obtain suitable retention factors for polar drug metabolites on this column, acetyl derivatives were evaluated for converting the metabolites to more lipophilic compounds, and an optimal separation system was developed. Acetate derivatives were found to be stable in the HPLC mobile phase and to provide good chromatographic separation of the target analytes. In vitro metabolism of MDMA and, subsequently, of other drugs involved incubation of 4 µg of drug substance in pH 7.4 buffer with an NADPH-generating system (NGS) at 37 °C for 90 min, with addition of more NGS after 30 min.
The reaction was stopped at 90 min by the addition of acetonitrile before extraction of the metabolites. Acetate derivatives of MDMA metabolites were identified by LC/MS/MS using multiple reaction monitoring (MRM). Three phase I metabolites (both major and minor) of MDMA were detected in HLM samples: 3,4-dihydroxymethamphetamine and 4-hydroxy-3-methoxymethamphetamine were found to be major metabolites of MDMA, whereas 3,4-methylenedioxyamphetamine was found to be a minor metabolite. Subsequently, ten MDMA-positive urine samples were analysed to compare the metabolite patterns with those produced by HLM. An LC/MS method for MDMA and its metabolites in urine samples was developed and validated. The method demonstrated good linearity, accuracy and precision, insignificant matrix effects, and limits of quantitation of 0.025 µg/ml. Moreover, derivatives of MDMA and its metabolites were quantified in all ten positive human urine samples. The urine metabolite pattern was found to be similar to that from HLM. The second aim of Part 1 was to use the HLM system to study the metabolism of some new psychoactive substances, whose worldwide misuse has necessitated the development of analytical methods for these drugs in biological specimens. Methylone and butylone were selected as representative cathinones, and para-methoxyamphetamine (PMA) was chosen as a representative ring-substituted amphetamine, because of the involvement of these drugs in recent drug-related deaths, the relative lack of information on their metabolism, and the fact that reference standards of their metabolites were not commercially available. An LC/MS/MS method for the analysis of methylone, butylone, PMA and their metabolites was developed. Three phase I metabolites of methylone and butylone were detected in HLM samples.
Ketone reduction to β-OH metabolites and demethylenation to dihydroxy metabolites were found to be the major phase I metabolic pathways of butylone and methylone, whereas N-demethylation to nor-methylone and nor-butylone was a minor pathway. Likewise, demethylation to para-hydroxyamphetamine was found to be the major phase I metabolic pathway of PMA, whereas β-hydroxylation to β-OH-PMA was a minor pathway. Formaldehyde is used for embalming, to reduce decomposition and preserve cadavers, especially in tropical countries such as Thailand. Drugs present in the body can be exposed to formaldehyde, resulting in decreasing concentrations of the original compounds and production of new substances. The aim of Part 2 of the study was to evaluate the in vitro reactions of formaldehyde with selected drug groups, including amphetamines (amphetamine, methamphetamine and MDMA), benzodiazepines (alprazolam and diazepam) and opiates (morphine, hydromorphone, codeine and hydrocodone), and with a carbamate insecticide (carbosulfan). The study would identify degradation products to serve as markers for the parent compounds when these were no longer detectable. Drug standards were spiked into 10% formalin solution and 10% formalin-blood; water and whole blood without formalin were used as controls. Samples were analysed by LC/MS/MS at different times from the start, over periods of up to 30 days. Amphetamine, methamphetamine and MDMA were found to convert rapidly to methamphetamine, DMA and MDDMA respectively, in both formalin solution and formalin-blood, confirming the Eschweiler-Clarke reaction between amine-containing compounds and formaldehyde. Alprazolam was found to be unstable, whereas diazepam was stable, in both formalin solution and water. Both were found to hydrolyse in formalin solution to give open-ring alprazolam and open-ring diazepam.
Other alprazolam conversion products attached to paraformaldehyde were detected in both formalin solution and formalin-blood. Morphine and codeine were found to be more stable than hydromorphone and hydrocodone in formalin solution. Conversion products of hydromorphone and hydrocodone attached to paraformaldehyde were tentatively identified in formalin solution. Moreover, hydrocodone and hydromorphone decreased rapidly within 24 h in formalin-blood and could not be detected after 7 days. Carbosulfan was found to be unstable in formalin solution and was rapidly hydrolysed within 24 h, whereas in water it was stable for up to 48 h. Carbofuran was the major degradation product, together with smaller amounts of other products, 3-ketocarbofuran and 3-hydroxycarbofuran. By contrast, carbosulfan hydrolysed slowly in formalin-blood and was still detected after 15 days. It was concluded that HLM provide a useful tool for human drug metabolism studies when ethical considerations preclude the controlled administration of drugs to humans. The use of chemical derivatisation of hydrophilic compounds such as polar drug metabolites for analysis by LC/MS/MS with a conventional C18 column is effective, inexpensive, and suitable for routine use in the identification and quantitation of drugs and their metabolites. The detection of parent drugs and their metabolites or conversion and decomposition products is potentially very useful for the interpretation of cases in forensic toxicology, especially when the original compounds can no longer be observed.
Abstract:
The majority of organizations store their historical business information in data warehouses, which are queried with online analytical processing (OLAP) tools to make strategic decisions. This information has to be properly protected against unauthorized access; nevertheless, a great number of legacy OLAP applications have been developed without considering security aspects, or security has been incorporated only after the system was implemented. This work defines a reverse engineering process that allows us to obtain the conceptual model corresponding to a legacy OLAP application, and also analyses and represents the security aspects that may have been established. This process has been aligned with a model-driven architecture for developing secure OLAP applications by defining the transformations needed to apply it automatically. Once the conceptual model has been extracted, it can easily be modified and improved with security, and automatically transformed to generate the new implementation.