982 results for Semi-implicit methods
Abstract:
This paper surveys some of the fundamental problems in natural language (NL) understanding (syntax, semantics, pragmatics, and discourse) and the current approaches to solving them. Some recent developments in NL processing include increased emphasis on corpus-based rather than example- or intuition-based work, attempts to measure the coverage and effectiveness of NL systems, dealing with discourse and dialogue phenomena, and attempts to use both analytic and stochastic knowledge. Critical areas for the future include grammars that are appropriate to processing large amounts of real language; automatic (or at least semi-automatic) methods for deriving models of syntax, semantics, and pragmatics; self-adapting systems; and integration with speech processing. Of particular importance are techniques that can be tuned to such requirements as full versus partial understanding and spoken language versus text. Portability (the ease with which one can configure an NL system for a particular application) is one of the largest barriers to application of this technology.
Abstract:
Recently the Balanced method was introduced as a class of quasi-implicit methods for solving stiff stochastic differential equations. We examine asymptotic and mean-square stability for several implementations of the Balanced method and give a generalized result for the mean-square stability region of any Balanced method. We also investigate the optimal implementation of the Balanced method with respect to strong convergence.
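For the scalar linear test equation dX = λX dt + μX dW, one implementation of the Balanced method can be sketched as follows. This is a minimal illustration, assuming the common choice of constant control functions c0 = |λ|, c1 = |μ|; the paper compares several such implementations and this is not its exact setup:

```python
import numpy as np

def balanced_step(x, lam, mu, dt, dW, c0, c1):
    """One step of the Balanced method for dX = lam*X dt + mu*X dW:
    X_{n+1} = X_n + lam*X_n*dt + mu*X_n*dW + C*(X_n - X_{n+1}),
    with control C = c0*dt + c1*|dW|, so the update stays solvable in closed form."""
    C = c0 * dt + c1 * np.abs(dW)
    return (x + lam * x * dt + mu * x * dW + C * x) / (1.0 + C)

def ms_estimate(lam=-10.0, mu=1.0, dt=0.1, n_paths=20000, n_steps=200, seed=0):
    """Crude Monte Carlo check of mean-square stability: does E[X_n^2] decay?"""
    rng = np.random.default_rng(seed)
    x = np.ones(n_paths)
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)
        x = balanced_step(x, lam, mu, dt, dW, c0=abs(lam), c1=abs(mu))
    return (x ** 2).mean()

print(ms_estimate())  # a value near zero suggests mean-square stability here
```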
Abstract:
A stochastic model for solute transport in aquifers is studied based on the concepts of stochastic velocity and stochastic diffusivity. By applying finite difference techniques to the spatial variables of the stochastic governing equation, a system of stiff stochastic ordinary differential equations is obtained. Both the semi-implicit Euler method and the balanced implicit method are used for solving this stochastic system. Based on the Karhunen-Loève expansion, stochastic processes in time and space are calculated by means of a spatial correlation matrix. Four types of spatial correlation matrices are presented based on the hydraulic properties of physical parameters. Simulations with two types of correlation matrices are presented.
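A minimal sketch of the drift-implicit (semi-implicit) Euler step for a stiff linear SDE system of the kind such a spatial discretization produces. The 1-D finite-difference Laplacian below is a hypothetical stand-in; the paper's aquifer transport operator and noise model are not reproduced:

```python
import numpy as np

# Hypothetical stand-in for the spatially discretized operator: a stiff 1-D
# finite-difference Laplacian with homogeneous Dirichlet boundaries.
n = 50
dx = 1.0 / (n + 1)
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / dx**2

def semi_implicit_euler(x0, A, g, dt, n_steps, rng):
    """Drift-implicit (semi-implicit) Euler for dX = A X dt + g(X) dW:
    (I - dt*A) X_{n+1} = X_n + g(X_n) dW_n, i.e. the stiff linear drift is
    treated implicitly while the noise term stays explicit."""
    I = np.eye(len(x0))
    x = x0.copy()
    for _ in range(n_steps):
        dW = rng.normal(0.0, np.sqrt(dt), size=len(x0))
        x = np.linalg.solve(I - dt * A, x + g(x) * dW)
    return x

rng = np.random.default_rng(1)
x_final = semi_implicit_euler(np.ones(n), A, lambda x: 0.1 * x,
                              dt=1e-3, n_steps=1000, rng=rng)
```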
Abstract:
Aluminium (Al) is known to be neurotoxic and has been associated with the aetiology of Alzheimer's disease. To date, only desferrioxamine (DFO), a trihydroxamic acid siderophore, has been used in the clinical environment for the removal of Al from the body. However, this drug is expensive, orally inactive and associated with many side effects. These studies employed a theoretical approach, using quantum mechanics (QM) via semi-empirical molecular orbital (MO) calculations, and a practical approach using U87-MG glioblastoma cells as a model for evaluating the influence of potential chelators on the passage of aluminium into cells. Preliminary studies involving the Cambridge Structural Database (CSD) identified that Al prefers binding to bidentate ligands in a 3:1 manner, with oxygen as the exclusive donating atom. Statistically significant differences in M-O bond lengths compared to other trivalent metal ions such as Fe3+ were established and used as an acceptance criterion for subsequent MO calculations. Of the semi-empirical methods parameterised for Al, the PM3 Hamiltonian was found to give the most reliable final optimised geometries of simple 3:1 Al complexes. Consequently, the PM3 Hamiltonian was used for evaluating the heats of formation (Hf) of 3:1 complexes with more complicated ligands. No correlation exists between published stability constants and individual parameters calculated via PM3 optimisations, although investigation of the dicarboxylates reveals a correlation coefficient of 0.961, showing promise for affinity prediction of closely related ligands. A simple and inexpensive morin spectrofluorescence assay was developed and optimised, producing results comparable to atomic absorption spectroscopy methods for the quantitative analysis of Al. This assay was used in subsequent in vitro models, initially on E. coli, which indicated that Al inhibits the antimicrobial action of ciprofloxacin, a potent quinolone antibiotic. Ensuing studies using the second model, U87-MG cells, investigated the influence of chelators on the transmembrane transport of Al, identifying 1,2-diethylhydroxypyridin-4-one as the ligand showing the greatest potential for chelating Al in the clinical situation. In conclusion, these studies have explored semi-empirical MO Hamiltonians and an in vitro U87-MG cell line, both as possible methods for predicting effective chelators of Al.
Abstract:
This research explores how news media reports construct representations of a business crisis through language. In an innovative approach to dealing with the vast pool of potentially relevant texts, media texts concerning the BP Deepwater Horizon oil spill are gathered from three different time points: immediately after the explosion in 2010, one year later in 2011 and again in 2012. The three sets of 'BP texts' are investigated using discourse analysis and semi-quantitative methods within a semiotic framework that gives an account of language at the semiotic levels of sign, code, mythical meaning and ideology. The research finds in the texts three discourses of representation concerning the crisis that show a movement from the ostensibly representational to the symbolic and conventional: a discourse of 'objective factuality', a discourse of 'positioning' and a discourse of 'redeployment'. This progression can be shown to have useful parallels with Peirce's sign classes of Icon, Index and Symbol, with their implied movement from a clear motivation by the Object (in this case the disaster events), to an arbitrary, socially-agreed connection. However, the naturalisation of signs, whereby ideologies are encoded in ways of speaking and writing that present them as 'taken for granted' is at its most complete when it is least discernible. The findings suggest that media coverage is likely to move on from symbolic representation to a new kind of iconicity, through a fourth discourse of 'naturalisation'. Here the representation turns back towards ostensible factuality or iconicity, to become the 'naturalised icon'. This work adds to the study of media representation a heuristic for understanding how the meaning-making of a news story progresses. It offers a detailed account of what the stages of this progression 'look like' linguistically, and suggests scope for future research into both language characteristics of phases and different news-reported phenomena.
Abstract:
Purpose. The goal of this study is to improve the favorable molecular interactions between starch and PPC by the addition of the grafting monomers MA and ROM as compatibilizers, which would advance the mechanical properties of starch/PPC composites.
Methodology. Calculations based on DFT and semi-empirical methods were performed on three systems: (a) starch/PPC, (b) starch/PPC-MA, and (c) starch-ROM/PPC. Theoretical computations involved the determination of optimal geometries, binding energies and vibrational frequencies of the blended polymers.
Findings. Calculations performed on five starch/PPC composites revealed hydrogen bond formation as the driving force behind stable composite formation, which is also confirmed by the negative relative energies of the composites, indicating the existence of binding forces between the constituent co-polymers. The interaction between starch and PPC is further confirmed by the computed decrease in the stretching frequencies of the CO and OH groups participating in hydrogen bond formation, which agree qualitatively with the experimental values.
A three-step mechanism of grafting MA onto PPC was proposed to improve the compatibility of PPC with starch. Nine types of 'blends' produced by covalent bond formation between starch and MA-grafted PPC were found to be energetically stable, with blends involving MA grafted at the 'B' and 'C' positions of PPC showing binding-energy increases of 6.8 and 6.2 kcal/mol, respectively, compared to the non-grafted starch/PPC composites. A similar increase in binding energies was also observed for three types of 'composites' formed by hydrogen bond formation between starch and MA-grafted PPC.
Next, the grafting of ROM onto starch and subsequent blend formation with PPC was studied. All four types of blends formed by the reaction of ROM-grafted starch with PPC were found to be more energetically stable than the starch/PPC composite and the starch/PPC-MA composites and blends. A blend of PPC and ROM grafted at the 'a&d12;' position on amylose exhibited a maximal increase of 17.1 kcal/mol compared with the starch/PPC-MA blend.
Conclusions. ROM was found to be a more effective compatibilizer than MA in improving the favorable interactions between starch and PPC. The 'a&d12;' position was found to be the most favorable attachment point of ROM to amylose for stable blend formation with PPC.
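The energy bookkeeping behind the reported binding-energy comparisons reduces to a difference of total energies. A small sketch, assuming total energies in hartree from the DFT or PM3 runs; the function name and the unit conversion are illustrative, not taken from the study:

```python
HARTREE_TO_KCAL = 627.509  # kcal/mol per hartree

def binding_energy_kcal(e_complex, e_frag_a, e_frag_b):
    """Binding energy of an A--B composite from total electronic energies
    (hartree); a more negative value indicates a more stable composite."""
    return (e_complex - (e_frag_a + e_frag_b)) * HARTREE_TO_KCAL
```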
Abstract:
This study focuses on the properties that the presence of fluorine confers on molecules, specifically on fluoroquinolones, antibiotics that are increasingly used. Several parameters were analysed to obtain information about the drug-receptor interaction in fluoroquinolones. Computational chemistry characterization techniques were used to characterize the fluoroquinolones electronically and structurally (3D), complementing the semi-empirical methods used initially. As is well known, specificity and affinity for the target site are essential to a drug's efficacy. Fluoroquinolones have developed considerably since the first quinolone was synthesized in 1958, and countless derivatives have been synthesized since then. This is because they are easily manipulated, yielding highly potent drugs with a broad spectrum, optimized pharmacokinetic factors and reduced adverse effects. The major pharmacological change that raised interest in this group was the substitution at C6 of a fluorine atom in place of hydrogen. To obtain information about the influence of fluorine on the structural and electronic properties of fluoroquinolones, a comparison was made between fluoroquinolones bearing fluorine at C6 and hydrogen at C6. The four fluoroquinolones in this study were ciprofloxacin, moxifloxacin, sparfloxacin and pefloxacin. The information was obtained with quantum mechanics and molecular mechanics software. It was concluded that the fluorine substituent does not significantly modify the geometry of the molecules, but does change the charge distribution on the vicinal carbon and on the atoms at the alpha, beta and gamma positions relative to it. This modification of the electronic distribution may condition the binding of the drug to the receptor, modifying its pharmacological activity.
Abstract:
Rigid adherence to pre-specified thresholds and static graphical representations can lead to incorrect decisions on the merging of clusters. As an alternative to existing automated or semi-automated methods, we developed a visual analytics approach for performing hierarchical clustering analysis of short time-series gene expression data. Dynamic sliders control parameters such as the similarity threshold at which clusters are merged and the level of relative intra-cluster distinctiveness, which can be used to identify "weak edges" within clusters. An expert user can drill down to further explore the dendrogram and detect nested clusters and outliers. This is done by using the sliders and by pointing and clicking on the representation to cut the branches of the tree at multiple heights. A prototype of this tool has been developed in collaboration with a small group of biologists for analysing their own datasets. Initial feedback on the tool has been positive.
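What a similarity-threshold slider does under the hood can be sketched with standard library calls. This is an illustration, not the tool's implementation, and the expression matrix is a hypothetical random stand-in:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Hypothetical expression matrix: 100 genes x 6 time points.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 6))

Z = linkage(pdist(X, metric="correlation"), method="average")

# Moving a similarity-threshold slider amounts to re-cutting the dendrogram
# at each new height t and re-colouring the resulting flat clusters.
for t in (0.4, 0.6, 0.8):
    labels = fcluster(Z, t=t, criterion="distance")
    print(f"t={t}: {labels.max()} clusters")
```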
Abstract:
Choosing a single similarity threshold for cutting dendrograms is not sufficient for performing hierarchical clustering analysis of heterogeneous data sets. In addition, alternative automated or semi-automated methods that cut dendrograms at multiple levels make assumptions about the data at hand. To help the user find patterns in the data and resolve ambiguities in cluster assignments, we developed MLCut: a tool that provides visual support for exploring dendrograms of heterogeneous data sets at different levels of detail. The interactive exploration of the dendrogram is coordinated with a representation of the original data, shown as parallel coordinates. The tool supports three analysis steps. Firstly, a single-height similarity threshold can be applied using a dynamic slider to identify the main clusters. Secondly, a distinctiveness threshold can be applied using a second dynamic slider to identify "weak edges" that indicate heterogeneity within clusters. Thirdly, the user can drill down to further explore the dendrogram structure, always in relation to the original data, and cut the branches of the tree at multiple levels. Interactive drill-down is supported using mouse events such as hovering, pointing and clicking on elements of the dendrogram. Two prototypes of this tool have been developed in collaboration with a group of biologists for analysing their own data sets. We found that enabling users to cut the tree at multiple levels, while viewing the effect on the original data, is a promising method for clustering which could lead to scientific discoveries.
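A sketch of multi-level cutting together with a crude weak-edge heuristic. The distinctiveness measure below is an assumption for illustration only; MLCut's exact definition and its interaction model are not reproduced:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cut_tree
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 10))           # hypothetical heterogeneous data set
Z = linkage(pdist(X), method="average")

# Steps 1 and 3, roughly: cut at several heights at once; each column of
# `labels` is one flat clustering of the 80 rows.
labels = cut_tree(Z, height=[2.0, 4.0, 6.0])

# Step 2, roughly: flag "weak edges" as merges whose height jumps far above
# both children's merge heights.
n = Z.shape[0] + 1
node_height = np.concatenate([np.zeros(n), Z[:, 2]])   # leaves sit at height 0
child_max = np.maximum(node_height[Z[:, 0].astype(int)],
                       node_height[Z[:, 1].astype(int)])
gap = Z[:, 2] - child_max
weak_edges = np.where(gap > np.percentile(gap, 95))[0]  # suspicious merge rows
```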
Abstract:
In this talk, we propose an all-regime Lagrange-Projection-like numerical scheme for the gas dynamics equations. By all regime, we mean that the numerical scheme is able to compute accurate approximate solutions with an under-resolved discretization with respect to the Mach number M, i.e. such that the ratio between the Mach number M and the mesh size or the time step is small with respect to 1. The key idea is to decouple acoustic and transport phenomena and then alter the numerical flux in the acoustic approximation to obtain a uniform truncation error in terms of M. This modified scheme is conservative and endowed with good stability properties with respect to the positivity of the density and the internal energy. A discrete entropy inequality under a condition on the modification is obtained thanks to a reinterpretation of the modified scheme in the Harten, Lax and van Leer formalism. A natural extension to multi-dimensional problems discretized over unstructured meshes is proposed. Then a simple and efficient semi-implicit scheme is also proposed. The resulting scheme is stable under a CFL condition driven by the (slow) material waves rather than the (fast) acoustic waves, and so satisfies the all-regime property. Numerical evidence shows the ability of the scheme to deal with tests where the flow regime may vary from low to high Mach values.
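A sketch of the interface states used in the acoustic step, with the low-Mach modification expressed as a scaling factor theta on the pressure-jump dissipation. This follows the usual Suliciu-type relaxation form of such schemes; the precise choices of theta and of the relaxation speed a in the talk may differ:

```python
def acoustic_star_states(uL, uR, pL, pR, a, theta):
    """Interface velocity and pressure for the acoustic step of a
    Lagrange-Projection scheme with a Suliciu-type relaxation speed a.
    theta in (0, 1] scales the pressure-jump dissipation; choosing
    theta = min(M_local, 1) is the low-Mach correction that keeps the
    truncation error uniform in the Mach number M."""
    u_star = 0.5 * (uL + uR) - (pR - pL) / (2.0 * a)
    p_star = 0.5 * (pL + pR) - 0.5 * theta * a * (uR - uL)
    return u_star, p_star
```

In the semi-implicit variant described above, this acoustic step is treated implicitly, which is why the time step is constrained only by the slow material waves.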
Abstract:
Time perception is studied with subjective or semi-objective psychophysical methods. With subjective methods, observers provide quantitative estimates of duration and data depict the psychophysical function relating subjective duration to objective duration. With semi-objective methods, observers provide categorical or comparative judgments of duration and data depict the psychometric function relating the probability of a certain judgment to objective duration. Both approaches are used to study whether subjective and objective time run at the same pace or whether time flies or slows down under certain conditions. We analyze theoretical aspects affecting the interpretation of data gathered with the most widely used semi-objective methods, including single-presentation and paired-comparison methods. For this purpose, a formal model of psychophysical performance is used in which subjective duration is represented via a psychophysical function and the scalar property. This provides the timing component of the model, which is invariant across methods. A decisional component that varies across methods reflects how observers use subjective durations to make judgments and give the responses requested under each method. Application of the model shows that psychometric functions in single-presentation methods are uninterpretable because the various influences on observed performance are inextricably confounded in the data. In contrast, data gathered with paired-comparison methods permit separating out those influences. Prevalent approaches to fitting psychometric functions to data are also discussed and shown to be inconsistent with widely accepted principles of time perception, implicitly assuming instead that subjective time equals objective time and that observed differences across conditions do not reflect differences in perceived duration but criterion shifts. These analyses prompt evidence-based recommendations for best methodological practice in studies on time perception.
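A minimal sketch of such a two-component model for a paired-comparison task, assuming Gaussian scalar timing with Weber fraction `gamma` and an additive decisional criterion `bias`; the paper's formal model is more general, and these names are illustrative:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def p_longer(t_test, t_std, gamma, bias):
    """P("test judged longer") under Gaussian scalar timing: the subjective
    duration of t has mean t and SD gamma*t (the scalar property); `bias`
    is a decisional criterion shift, separate from the timing component."""
    sd = np.sqrt(gamma**2 * (t_test**2 + t_std**2))
    return norm.cdf((t_test - t_std - bias) / sd)

def fit_paired_comparison(t_test, n_longer, n_total, t_std):
    """Maximum-likelihood fit of (gamma, bias) to paired-comparison counts:
    n_longer[i] "test longer" responses out of n_total[i] at t_test[i]."""
    def nll(params):
        gamma, bias = params
        p = np.clip(p_longer(t_test, t_std, gamma, bias), 1e-9, 1 - 1e-9)
        return -np.sum(n_longer * np.log(p)
                       + (n_total - n_longer) * np.log(1 - p))
    return minimize(nll, x0=np.array([0.2, 0.0]), method="Nelder-Mead")
```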
Abstract:
With the increasing importance given to building rehabilitation comes the need for simple, fast and non-destructive testing (NDT) methods to identify problems and diagnose anomalies. Ceramic tiles are one of the most typical kinds of exterior wall cladding in several countries; the earliest known examples are Egyptian, dating from 4000 BC. This type of building facade coating, though often used for its aesthetic and architectural characteristics, is one of the most complex that can be applied, given the several layers of which it is composed; hence, it is also one of the most difficult to diagnose correctly with expeditious methods. The detachment of ceramic wall tiles is probably the most common anomaly associated with this kind of cladding, the most difficult to identify, and definitely the one that can most compromise safety. It is therefore necessary to study an inspection process more efficient and economical than those currently used, which often consist of semi-destructive methods (the most common being the pull-off test) that can only be applied to a small part of the building at a time, allowing only assumptions about the condition of the rest of the cladding. Infrared thermography (IRT) is an NDT technique with a wide variety of applications in building inspection that is becoming commonly used to identify anomalies associated with thermal variations in the inspected surfaces. The few authors who have studied the application of IRT to anomalies in ceramic claddings report that the presence of air or water beneath the superficial layer influences heat transfer in a way that can be detected both qualitatively and quantitatively by the thermal camera, providing information about the state of the wall over a much broader area per trial than the methods commonly used nowadays. This article presents a review of the state of the art of this NDT technique and its potential to become a more efficient way to diagnose anomalies in ceramic wall claddings.
Abstract:
The left ventricular ejection fraction is an excellent marker of cardiac function. Several techniques, invasive or not, are used to calculate it: angiography, echocardiography, cardiac magnetic resonance imaging, cardiac CT, radionuclide ventriculography, and myocardial perfusion imaging in nuclear medicine. More than 40 years of scientific publications praise radionuclide ventriculography for its speed of execution, availability, low cost, and intra-observer and inter-observer reproducibility. The left ventricular ejection fraction was calculated twice in 47 patients, by two technologists, on two separate acquisitions, using three methods: manual, automatic and semi-automatic. The automatic and semi-automatic methods show overall better reproducibility, a smaller standard error of measurement and a smaller minimal detectable difference. The manual method yields a result that is systematically and significantly lower than the other two methods. It is the only technique that showed a significant difference in the intra-observer analysis. Its standard error of measurement is 40 to 50% larger than with the other techniques, as is its minimal detectable difference. Although all three methods are excellent, reproducible techniques for evaluating the left ventricular ejection fraction, the reliability estimates of the automatic and semi-automatic methods are superior to those of the manual method.
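The reliability statistics mentioned here are standard. A small sketch of the conventional ICC-based definitions (an assumption; the thesis may use variants of these formulas):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the between-subject SD and the ICC."""
    return sd * math.sqrt(1.0 - icc)

def minimal_detectable_difference(sd, icc, z=1.96):
    """Smallest change exceeding measurement error at ~95% confidence
    (MDC95 = z * sqrt(2) * SEM)."""
    return z * math.sqrt(2.0) * sem(sd, icc)
```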
Abstract:
Nitrobenzoxadiazole (NBD)-labeled lipids are popular fluorescent membrane probes. However, the understanding of important aspects of the photophysics of NBD remains incomplete, including the observed shift in the emission spectrum of NBD-lipids to longer wavelengths following excitation at the red edge of the absorption spectrum (red-edge excitation shift, or REES). REES of NBD-lipids in membrane environments has previously been interpreted as reflecting restricted mobility of the solvent surrounding the fluorophore. However, this requires a large change in the dipole moment (Δμ) of NBD upon excitation. Previous calculations of the value of Δμ of NBD in the literature were carried out using outdated semi-empirical methods, leading to conflicting values. Using up-to-date density functional theory methods, we recalculated the value of Δμ and verified that it is rather small (≈2 D). Fluorescence measurements confirmed that the value of REES is ≈16 nm for 1,2-dioleoyl-sn-glycero-3-phospho-L-serine-N-(NBD) (NBD-PS) in dioleoylphosphatidylcholine vesicles. However, the observed shift is independent of both the temperature and the presence of cholesterol and is therefore insensitive to the mobility and hydration of the membrane. Moreover, red-edge excitation leads to an increased contribution of the decay component with a shorter lifetime, whereas time-resolved emission spectra (TRES) of NBD-PS displayed an atypical blue shift following excitation. This excludes restrictions to solvent relaxation as the cause of the measured REES and TRES of NBD, pointing instead to the heterogeneous transverse location of the probes as the origin of these effects. The latter hypothesis was confirmed by molecular dynamics simulations, from which the calculated heterogeneity of the hydration and location of NBD correlated with the measured fluorescence lifetimes/REES. Globally, our combination of theoretical and experiment-based techniques has led to a considerably improved understanding of the photophysics of NBD and a reinterpretation of its REES in particular.
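A back-of-envelope check, not taken from the paper, of why a small Δμ cannot support a solvent-relaxation REES: the Lippert-Mataga equation predicts only a modest solvent-relaxation Stokes shift for Δμ ≈ 2 D. The values for water and the 5 Å cavity radius below are illustrative assumptions:

```python
import numpy as np

EPS0, H, C_CM = 8.854e-12, 6.626e-34, 2.998e10  # SI units; c in cm/s gives cm^-1
DEBYE = 3.336e-30                               # C*m per debye

def orientation_polarizability(eps, n):
    """Solvent orientation polarizability Delta_f entering the Lippert equation."""
    return (eps - 1) / (2 * eps + 1) - (n**2 - 1) / (2 * n**2 + 1)

def lippert_shift(delta_mu_debye, cavity_radius_m, eps=78.4, n=1.33):
    """Solvent-relaxation Stokes shift (cm^-1) from the Lippert-Mataga equation:
    Delta_nu = 2 * Delta_mu^2 * Delta_f / (4*pi*eps0 * h * c * a^3)."""
    dmu = delta_mu_debye * DEBYE
    df = orientation_polarizability(eps, n)
    return 2 * dmu**2 * df / (4 * np.pi * EPS0 * H * C_CM * cavity_radius_m**3)

print(lippert_shift(2.0, 5e-10))  # ~1e2 cm^-1 for delta_mu ~ 2 D: a small shift
```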