939 results for mathematical equation correction approach


Relevance:

100.00%

Publisher:

Abstract:

Genes, which encode the biological functions of living organisms, are the basic molecular unit of heredity. To explain the diversity of species observable today, it is essential to understand how genes evolve. To do so, we must recreate the past by inferring their phylogeny, that is, a gene tree representing the kinship relationships among the coding regions of living organisms. Classical phylogenetic inference methods were developed mainly to build species trees and rely only on DNA sequences. Genes, however, are rich in information, and reconstruction methods that exploit their specific properties are only beginning to appear. In particular, the history of a gene family in terms of duplications and losses, obtained by reconciling a gene tree with a species tree, can allow us to detect weaknesses within a tree and to improve it. In this thesis, reconciliation is applied to the construction and correction of gene trees from three different angles: 1) We address the problem of resolving a non-binary gene tree; in particular, we present a linear-time algorithm that resolves a polytomy based on reconciliation. 2) We propose a new approach to gene tree correction using orthology and paralogy relations; polynomial-time algorithms are presented for the following problems: correcting a gene tree so that it contains a given set of orthologs, and validating a set of partial orthology and paralogy relations. 3) We show how reconciliation can be used to "combine" several gene trees; more precisely, we study the problem of choosing a gene supertree according to its reconciliation cost.
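Since reconciliation is central to this abstract, a minimal sketch may help: the classical LCA mapping labels each internal gene tree node as a speciation or a duplication by mapping it to the lowest common ancestor of its leaves' species. The Python below is a generic textbook illustration with toy trees, not the thesis's linear-time polytomy algorithm.

```python
# Minimal LCA-mapping reconciliation: label each internal node of a binary
# gene tree as a speciation or a duplication against a species tree.
# A node is a duplication when it maps to the same species tree node as
# one of its children. Trees here are toy examples.

SPECIES_PARENT = {"human": "primates", "chimp": "primates",
                  "primates": "root", "mouse": "root", "root": None}

def ancestors(s):
    """Path from a species tree node up to the root (inclusive)."""
    path = []
    while s is not None:
        path.append(s)
        s = SPECIES_PARENT[s]
    return path

def lca(a, b):
    """Lowest common ancestor of two species tree nodes."""
    anc = set(ancestors(a))
    return next(s for s in ancestors(b) if s in anc)

def reconcile(gene_tree):
    """gene_tree: a species name (leaf) or a pair (left, right).
    Returns the node's mapping and a list of (mapping, event) labels."""
    if isinstance(gene_tree, str):
        return gene_tree, []
    lmap, lev = reconcile(gene_tree[0])
    rmap, rev = reconcile(gene_tree[1])
    m = lca(lmap, rmap)
    event = "duplication" if m in (lmap, rmap) else "speciation"
    return m, lev + rev + [(m, event)]

# ((human, chimp), (human, mouse)): the root maps to "root", as does its
# right child, so the root is labeled a duplication.
_, events = reconcile((("human", "chimp"), ("human", "mouse")))
print(events)  # [('primates', 'speciation'), ('root', 'speciation'), ('root', 'duplication')]
```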

Relevance:

100.00%

Publisher:

Abstract:

Phase change problems arise in many practical applications such as air-conditioning and refrigeration, thermal energy storage systems, and thermal management of electronic devices. The physical phenomena in such applications are complex and often difficult to study in detail with experimental techniques alone. Efforts to improve computational techniques for analyzing two-phase flow problems with phase change are therefore gaining momentum. The development of numerical methods for multiphase flow has generally been motivated by the need to account more accurately for (a) large topological changes such as phase breakup and merging, (b) sharp representation of the interface and its discontinuous properties, and (c) accurate and mass-conserving motion of the interface. In addition to these considerations, numerical simulation of multiphase flow with phase change introduces further challenges related to discontinuities in the velocity and temperature fields. Moreover, the velocity field is no longer divergence free. For phase change problems, developmental efforts have thus focused on numerically attaining proper conservation of energy across the interface, in addition to the accurate treatment of mass and momentum fluxes and the associated interface advection.

Among the initial efforts related to the simulation of bubble growth in film boiling applications, the work in \cite{Welch1995} was based on an interface tracking method using a moving unstructured mesh; that study considered moderate interfacial deformations. A similar problem was subsequently studied using moving, boundary-fitted grids \cite{Son1997}, again for regimes of relatively small topological changes. A hybrid interface tracking method with a moving interface grid overlapping a static Eulerian grid was developed in \cite{Juric1998} for the computation of a range of phase change problems, including three-dimensional film boiling \cite{esmaeeli2004computations}, multimode two-dimensional pool boiling \cite{Esmaeeli2004}, and film boiling on horizontal cylinders \cite{Esmaeeli2004a}. The handling of interface merging and pinch-off, however, remains a challenge for methods that explicitly track the interface. As large topological changes are crucial for phase change problems, attention has turned in recent years to front capturing methods utilizing implicit interfaces, which are more effective in treating complex interface deformations. The VOF (Volume of Fluid) method was adopted in \cite{Welch2000} to simulate the one-dimensional Stefan problem and the two-dimensional film boiling problem; the approach employed a specific model for mass transfer across the interface involving a mass source term within cells containing the interface. This VOF-based approach was further coupled with the level set method in \cite{Son1998}, employing a smeared-out Heaviside function to avoid the numerical instability related to the source term. A coupled level set and volume of fluid method with a diffused interface approach was used for film boiling with water and R134a at near-critical pressure \cite{Tomar2005}; the effects of superheat and saturation pressure on the frequency of bubble formation were analyzed with this approach. The work in \cite{Gibou2007} used the ghost fluid and level set methods for phase change simulations.

A similar approach was adopted in \cite{Son2008} to study various boiling problems, including three-dimensional film boiling on a horizontal cylinder, nucleate boiling in a microcavity \cite{lee2010numerical}, and flow boiling in a finned microchannel \cite{lee2012direct}. The work in \cite{tanguy2007level} also used the ghost fluid method and proposed an improved algorithm based on enforcing continuity and a divergence-free condition for the extended velocity field. The work in \cite{sato2013sharp} employed a multiphase model based on volume fraction with an interface sharpening scheme and derived a phase change model based on local interface area and mass flux. Among the front capturing methods, sharp interface methods have been found to be particularly effective both for implementing sharp jumps and for resolving the interfacial velocity field. However, sharp velocity jumps render the solution susceptible to erroneous oscillations in pressure and also lead to spurious interface velocities. To implement phase change, the work in \cite{Hardt2008} employed point mass source terms derived from a physical basis for the evaporating mass flux; to avoid numerical instability, the authors smeared the mass source by solving a pseudo-time-step diffusion equation. This measure, however, led to mass conservation issues due to non-symmetric integration over the distributed mass source region. The problem of spurious pressure oscillations related to point mass sources was also investigated in \cite{Schlottke2008}; although their method is based on the VOF, the large pressure peaks associated with a sharp mass source were observed to be similar to those of the interface tracking method. Such spurious fluctuations in pressure are particularly undesirable because their effect is transmitted globally in incompressible flow. Hence, the pressure field arising from phase change needs to be computed with greater accuracy than is reported in the current literature. The accuracy of interface advection in the presence of an interfacial mass flux (mass flux conservation) has been discussed in \cite{tanguy2007level,tanguy2014benchmarks}. The authors found that the method of extending one phase velocity to the entire domain, suggested by Nguyen et al. in \cite{nguyen2001boundary}, suffers from a lack of mass flux conservation when the density difference is high. To improve the solution, the authors impose a divergence-free condition on the extended velocity field by solving a constant coefficient Poisson equation. This approach has shown good results for an enclosed bubble or droplet, but it is not general for more complex flows and requires an additional solution of a linear system of equations.

In the current thesis, an improved approach that addresses both the numerical oscillation of pressure and the spurious interface velocity field is presented, featuring (i) continuous velocity and density fields within a thin interfacial region and (ii) temporal velocity correction steps to avoid an unphysical pressure source term. I also propose a general (iii) mass flux projection correction for improved mass flux conservation. The pressure and temperature gradient jump conditions are treated sharply. A series of one-dimensional and two-dimensional problems are solved to verify the performance of the new algorithm. Two-dimensional and cylindrical film boiling problems are also demonstrated and show good qualitative agreement with experimental observations and heat transfer correlations. Finally, a study of Taylor bubble flow with heat transfer and phase change in a small vertical tube in axisymmetric coordinates is carried out using the new multiphase, phase change method.
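The one-dimensional Stefan problem mentioned above is a standard verification case because it admits a closed-form similarity solution: the interface sits at X(t) = 2λ√(αt), with λ given by a transcendental equation in the Stefan number Ste = c_pΔT/L. The SciPy sketch below evaluates that textbook benchmark under assumed parameter values; it is a reference solution, not the thesis's numerical scheme.

```python
# Analytical benchmark for the one-phase Stefan problem (a common
# verification case for phase change solvers; not the thesis's method).
# Interface position: X(t) = 2 * lam * sqrt(alpha * t), where lam solves
#   lam * exp(lam**2) * erf(lam) = Ste / sqrt(pi).
import numpy as np
from scipy.optimize import brentq
from scipy.special import erf

def stefan_lambda(ste):
    """Solve the transcendental equation for the growth coefficient."""
    f = lambda lam: lam * np.exp(lam**2) * erf(lam) - ste / np.sqrt(np.pi)
    return brentq(f, 1e-9, 5.0)

# Illustrative values (assumed, not from the thesis): Ste = 0.1,
# alpha = 1e-7 m^2/s, evaluated at t = 10 s.
lam = stefan_lambda(0.1)
alpha, t = 1e-7, 10.0
print("interface position X(t) =", 2.0 * lam * np.sqrt(alpha * t), "m")
```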

Relevance:

100.00%

Publisher:

Abstract:

Structured abstract. Purpose: To deepen, in the grocery retail context, the roles of consumer perceived value and consumer satisfaction as antecedent dimensions of customer loyalty intentions. Design/methodology/approach: Employing a short version (12 items) of the original 19-item PERVAL scale of Sweeney & Soutar (2001), a structural equation modeling approach was applied to investigate the statistical properties of the indirect influence on loyalty of a reflective second-order customer perceived value model. The performance of three alternative estimation methods was compared through bootstrapping techniques. Findings: The results provided i) support for the use of the short form of the PERVAL scale in measuring consumer perceived value; ii) evidence that the influence of the four highly correlated independent latent predictors on satisfaction was well summarized by a higher-order reflective specification of consumer perceived value; iii) evidence that the emotional and functional dimensions were determinant for the relationship with the retailer; iv) evidence that parameter bias for the three estimation methods was significant only for small bootstrap sample sizes. Research limitations/implications: Future research is needed to explore the use of the short form of the PERVAL scale in more homogeneous groups of consumers. Originality/value: First, a recent short form of the PERVAL scale and a second-order reflective conceptualization of value were adopted to explain customer loyalty indirectly, mediated by customer satisfaction. Second, three alternative estimation methods were used and compared through bootstrapping and simulation procedures.
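The abstract compares estimators "through bootstrapping techniques". As a generic, hedged illustration of bootstrap bias assessment (with a toy estimator and toy data, not the paper's SEM models):

```python
# Minimal sketch of bootstrap bias estimation for an estimator, in the
# spirit of the paper's method comparison (illustrative only; the paper's
# SEM estimators and data are not reproduced here).
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=5.0, scale=2.0, size=50)  # assumed toy data

def estimator(x):
    return x.std(ddof=0)  # deliberately biased estimator of the std. dev.

boot = np.array([estimator(rng.choice(sample, size=sample.size, replace=True))
                 for _ in range(2000)])
bias = boot.mean() - estimator(sample)   # bootstrap estimate of the bias
print(f"estimate={estimator(sample):.3f}  bootstrap bias={bias:.3f}")
```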

Relevance:

100.00%

Publisher:

Abstract:

Liver transplantation increased 1.84-fold from 1988 to 2004. However, the number of patients on the waiting list for a liver increased 2.71-fold, from 553 to 1500. We used a mathematical equation to analyze the potential effect of using ABO-compatible living-donor liver transplantation (LDLT) on both our liver transplantation program and the waiting list. We calculated the prevalence distribution of blood groups (O, A, B, and AB) in the population and the probability of having a compatible parent or sibling for LDLT. The incidence of ABO compatibility in the overall population was as follows: A, 0.31; B, 0.133; O, 0.512; and AB, 0.04. The ABO compatibility for parent donors was blood group A, 0.174; B, 0.06; O, 0.152; and AB, 0.03; and for sibling donors was A, 0.121; B, 0.05; O, 0.354; and AB, 0.03. Use of LDLT can reduce the pressure on our liver transplantation waiting list by decreasing its size by at least 16.5% at 20 years after its introduction. Such a program could save an estimated 3600 lives over the same period.
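The abstract's calculation rests on standard ABO compatibility rules combined with blood group prevalences. A minimal sketch of that kind of computation, with assumed prevalence values rather than the paper's, is:

```python
# Probability that a randomly drawn donor is ABO-compatible with a given
# recipient, under assumed population blood group prevalences. This is a
# generic illustration of the kind of calculation the abstract describes,
# not the authors' exact equation.
prevalence = {"O": 0.45, "A": 0.40, "B": 0.11, "AB": 0.04}  # assumed values

# Blood groups each recipient may receive from (standard ABO rules).
compatible_donors = {
    "O": {"O"},
    "A": {"O", "A"},
    "B": {"O", "B"},
    "AB": {"O", "A", "B", "AB"},
}

for recipient, donors in compatible_donors.items():
    p = sum(prevalence[d] for d in donors)
    print(f"recipient {recipient}: P(compatible random donor) = {p:.3f}")
```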

Relevance:

100.00%

Publisher:

Abstract:

Objective. The purpose of this study was to construct nomograms of placental volumes according to gestational age and estimated fetal weight. Methods. From March to November 2007, placental volumes were prospectively measured by ultrasonography in 295 normal pregnancies from 12 to 40 weeks' gestation and correlated with gestational age and estimated fetal weight. Inclusion criteria were healthy women, singleton pregnancies with normal fetal morphologic characteristics on ultrasonography, and gestational age confirmed by first-trimester ultrasonography. Results. The mean placental volume ranged from 83 cm³ at 12 weeks to 427.7 cm³ at 40 weeks. Linear regression yielded the following formula for the expected placental volume (ePV) according to gestational age (GA): ePV (cm³) = -64.68 + 12.31 × GA (r = 0.572; P < .001). Placental volumes also varied according to estimated fetal weight (EFW), and the following equation was likewise obtained by linear regression: ePV = 94.19 + 0.09 × EFW (r = 0.505; P < .001). Conclusions. Nomograms of placental volumes according to gestational age and estimated fetal weight were constructed, generating reference values.
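The two reported regressions are straightforward to evaluate; the sketch below transcribes them directly (the EFW formula is assumed to take grams, which the abstract does not state explicitly):

```python
# Evaluating the regression formulas reported in the abstract (a direct
# transcription; clinical interpretation and validity are the paper's).
def expected_placental_volume_ga(ga_weeks: float) -> float:
    """Expected placental volume (cm^3) from gestational age in weeks."""
    return -64.68 + 12.31 * ga_weeks

def expected_placental_volume_efw(efw_grams: float) -> float:
    """Expected placental volume (cm^3) from estimated fetal weight,
    assumed to be in grams."""
    return 94.19 + 0.09 * efw_grams

print(expected_placental_volume_ga(40.0))    # 427.72, matching the abstract
print(expected_placental_volume_efw(1200.0)) # assumed EFW of 1200 g
```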

Relevance:

100.00%

Publisher:

Abstract:

Red cell number and size increase during puberty, particularly in males. The aim of the present study was to determine whether expression of genes affecting red cell indices varied with age and sex. Haemoglobin, red cell count, and mean cellular volume were measured longitudinally on 578 pairs of twins at twelve, fourteen and sixteen years of age. Data were analysed using a structural equation modeling approach, in which a variety of univariate and longitudinal simplex models were fitted to the data. Significant heritability was demonstrated for all variables across all ages. The genes involved did not differ between the sexes, although there was evidence for sex limitation in the case of haemoglobin at age twelve. Longitudinal analyses indicated that new genes affecting red cell indices were expressed at different stages of puberty. Some of these genes affected the different red cell indices pleiotropically, while others had effects specific to one variable only.

Relevance:

100.00%

Publisher:

Abstract:

This study aimed to analyze the economic viability of a third milking in production systems using mechanical milking in a closed circuit, aiming to provide technicians and farmers with information to assist them in decision-making. Specifically, it aimed: (a) to estimate the cost of one milking; (b) to estimate the cost of the third milking; (c) to develop a mathematical equation to estimate the minimum amount of milk produced with two milkings from which it would be economically feasible to do the third milking. Data were collected from three dairy farms, from November 2010 to March 2011, keeping a twice-a-day milking frequency, with three data collections on each farm, totaling nine collections. Considering the average data, it would be feasible to do the third milking if the average daily milk yield of lactating cows on a twice-a-day milking frequency was greater than or equal to 24.43 kg of milk.
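The paper's equation itself is not given in the abstract. As a loosely analogous, purely hypothetical break-even sketch (all parameters and the yield-response assumption below are made up; the paper's 24.43 kg/day threshold comes from its own cost data):

```python
# Generic break-even sketch for the decision the abstract describes: add a
# third daily milking only if the extra milk pays for the extra cost. All
# numbers and the yield-gain assumption are hypothetical illustrations.
def third_milking_feasible(avg_daily_yield_kg: float,
                           yield_gain_fraction: float,
                           milk_price_per_kg: float,
                           extra_cost_per_cow_day: float) -> bool:
    extra_milk = avg_daily_yield_kg * yield_gain_fraction
    return extra_milk * milk_price_per_kg >= extra_cost_per_cow_day

# Assumed: 10% yield gain, price 0.40/kg, extra cost 1.00 per cow-day.
print(third_milking_feasible(24.43, 0.10, 0.40, 1.00))
```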

Relevance:

100.00%

Publisher:

Abstract:

Among the largest resources for biological sequence data is the large number of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors. Therefore, when analyzing EST sequences computationally, such errors must be taken into account. Earlier attempts to model error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error correction approach to more complex HMMs. We show that our method maintains the performance in detecting coding sequences.
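At the core of any such HMM-based detector is a decoding step. Below is a generic log-space Viterbi implementation with toy states and probabilities, not the paper's mRNA-plus-error-correction model:

```python
# Minimal log-space Viterbi decoder, the core inference step in HMM-based
# gene/EST analysis. This is a generic textbook sketch; states,
# observations, and probabilities are toy values.
import numpy as np

def viterbi(obs, log_init, log_trans, log_emit):
    """obs: observation indices; returns the most probable state path."""
    n_states = log_init.shape[0]
    T = len(obs)
    score = np.full((T, n_states), -np.inf)
    back = np.zeros((T, n_states), dtype=int)
    score[0] = log_init + log_emit[:, obs[0]]
    for t in range(1, T):
        cand = score[t - 1][:, None] + log_trans  # cand[i, j]: state i -> j
        back[t] = cand.argmax(axis=0)
        score[t] = cand.max(axis=0) + log_emit[:, obs[t]]
    path = [int(score[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Toy 2-state example (e.g., "coding" vs "noncoding") over 3 symbols.
li = np.log([0.5, 0.5])
lt = np.log([[0.9, 0.1], [0.1, 0.9]])
le = np.log([[0.5, 0.4, 0.1], [0.1, 0.2, 0.7]])
print(viterbi([0, 1, 2, 2, 0], li, lt, le))
```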

Relevance:

100.00%

Publisher:

Abstract:

A prospective study was undertaken to determine prognostic markers for patients with obstructive jaundice. Along with routine liver function tests, antipyrine clearance was determined in 20 patients. Four patients died after basal investigations. Five patients underwent definitive surgery. The remaining 11 patients were subjected to percutaneous transhepatic biliary decompression. Four patients died during the drainage period, while surgery was carried out for seven patients within 1-3 weeks of drainage. Of the 20 patients, only six survived. Basal liver function tests were comparable in survivors and nonsurvivors. Discriminant analysis of the basal data revealed that plasma bilirubin, proteins, and antipyrine half-life taken together had a strong association with mortality. A mathematical equation was derived using these variables, and a score was computed for each patient. It was observed that a score greater than or equal to 0.84 indicated survival. Omission of antipyrine half-life from the data, however, resulted in a prediction of false security in 55% of patients. This study highlights the importance of adding the antipyrine elimination test to routine liver function tests for precise identification of high-risk patients.
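As a sketch of the kind of score the abstract describes (a linear discriminant combination of bilirubin, proteins, and antipyrine half-life compared against the reported 0.84 cutoff), with entirely hypothetical coefficients since the fitted equation is not given in the abstract:

```python
# Discriminant-style prognostic score: a linear combination of three
# predictors compared with a cutoff. Coefficients, intercept, and the
# example inputs are hypothetical; only the 0.84 survival threshold and
# the choice of variables come from the abstract.
def prognostic_score(bilirubin, proteins, antipyrine_half_life,
                     w=(-0.02, 0.15, -0.01), intercept=0.5):
    b0, b1, b2 = w
    return intercept + b0 * bilirubin + b1 * proteins + b2 * antipyrine_half_life

score = prognostic_score(bilirubin=12.0, proteins=6.5, antipyrine_half_life=30.0)
print("predicted survivor" if score >= 0.84 else "high risk")
```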

Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To improve the traditional Nyquist ghost correction approach in echo planar imaging (EPI) at high fields, via schemes based on the reversal of the EPI readout gradient polarity for every other volume throughout a functional magnetic resonance imaging (fMRI) acquisition train. MATERIALS AND METHODS: An EPI sequence in which the readout gradient was inverted every other volume was implemented on two ultrahigh-field systems. Phantom images and fMRI data were acquired to evaluate ghost intensities and the presence of false-positive blood oxygenation level-dependent (BOLD) signal with and without ghost correction. Three different algorithms for ghost correction of alternating readout EPI were compared. RESULTS: Irrespective of the chosen processing approach, ghosting was significantly reduced (up to 70% lower intensity) in both rat brain images acquired on a 9.4 T animal scanner and human brain images acquired at 7 T, resulting in a reduction of sources of false-positive activation in fMRI data. CONCLUSION: It is concluded that at high B0 fields, substantial gains in Nyquist ghost correction of echo planar time series are possible by alternating the readout gradient every other volume.
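For context, the "traditional" correction that the paper aims to improve applies reference-derived zeroth- and first-order phase terms to every other k-space line in hybrid x-ky space. A generic numpy sketch of that baseline follows (the phase terms are assumed reference-scan estimates; this is not the paper's alternating-polarity scheme):

```python
# Generic sketch of reference-based Nyquist ghost correction for EPI:
# transform along the readout axis, apply a constant + linear phase to the
# odd (reversed) lines, and transform back. phi0/phi1 are assumed to come
# from a separate reference scan.
import numpy as np

def correct_nyquist_ghost(kspace, phi0, phi1):
    """kspace: (n_lines, n_read) raw EPI data; phi0/phi1: phase terms."""
    hybrid = np.fft.fftshift(np.fft.ifft(np.fft.ifftshift(kspace, axes=1),
                                         axis=1), axes=1)  # FFT along readout
    x = np.arange(kspace.shape[1]) - kspace.shape[1] // 2
    phase = np.exp(-1j * (phi0 + phi1 * x))
    hybrid[1::2, :] *= phase  # correct odd (reversed-gradient) lines
    return np.fft.fftshift(np.fft.fft(np.fft.ifftshift(hybrid, axes=1),
                                      axis=1), axes=1)

# Toy usage with assumed phase terms.
data = np.random.randn(64, 64) + 1j * np.random.randn(64, 64)
corrected = correct_nyquist_ghost(data, phi0=0.05, phi1=0.002)
```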

Relevance:

100.00%

Publisher:

Abstract:

The concept of conditional stability constant is extended to the competitive binding of small molecules to heterogeneous surfaces or macromolecules via the introduction of the conditional affinity spectrum (CAS). The CAS describes the distribution of effective binding energies experienced by one complexing agent at a fixed concentration of the rest. We show that, when the multicomponent system can be described in terms of an underlying affinity spectrum [integral equation (IE) approach], the system can always be characterized by means of a CAS. The thermodynamic properties of the CAS and its dependence on the concentration of the rest of components are discussed. In the context of metal/proton competition, analytical expressions for the mean (conditional average affinity) and the variance (conditional heterogeneity) of the CAS as functions of pH are reported and their physical interpretation discussed. Furthermore, we show that the dependence of the CAS variance on pH allows for the analytical determination of the correlation coefficient between the binding energies of the metal and the proton. Nonideal competitive adsorption isotherm and Frumkin isotherms are used to illustrate the results of this work. Finally, the possibility of using CAS when the IE approach does not apply (for instance, when multidentate binding is present) is explored. © 2006 American Institute of Physics.
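The single-site case makes the idea of a conditional constant concrete: at fixed pH, a metal competing with protons for one site sees an effective affinity K' = K_M/(1 + K_H[H+]), so that the metal coverage follows a Langmuir isotherm in K'. The sketch below uses assumed toy constants; the paper's CAS generalizes this idea to heterogeneous binding:

```python
# Single-site illustration of a conditional stability constant under
# metal/proton competition. With competitive Langmuir binding,
#   theta_M = K_M[M] / (1 + K_M[M] + K_H[H]) = K'[M] / (1 + K'[M]),
# where K' = K_M / (1 + K_H[H]) is the conditional constant at fixed pH.
K_M, K_H = 1e6, 1e9   # intrinsic binding constants (assumed toy values)

def conditional_K(pH):
    h = 10.0 ** (-pH)
    return K_M / (1.0 + K_H * h)

def metal_coverage(c_metal, pH):
    Kc = conditional_K(pH)
    return Kc * c_metal / (1.0 + Kc * c_metal)

for pH in (4.0, 7.0, 10.0):
    print(f"pH {pH}: K' = {conditional_K(pH):.3e}, "
          f"theta_M = {metal_coverage(1e-6, pH):.3f}")
```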

Relevance:

100.00%

Publisher:

Abstract:

Objective: To evaluate BI-RADS as a predictive factor of suspicion for malignancy in breast lesions, correlating radiological with histological results and calculating the positive predictive value for categories 3, 4 and 5 in a breast cancer reference center in the city of São Paulo. Materials and Methods: Retrospective, analytical and cross-sectional study including 725 patients with mammographic and/or sonographic findings classified as BI-RADS categories 3, 4 and 5 who were referred to the authors' institution to undergo percutaneous biopsy. The test results were reviewed and the positive predictive value was calculated by means of a specific mathematical equation. Results: The positive predictive values found for categories 3, 4 and 5 were, respectively: 0.74%, 33.08% and 92.95% for cases submitted to ultrasound-guided biopsy, and 0.00%, 14.90% and 100% for cases submitted to stereotactic biopsy. Conclusion: The present study demonstrated high suspicion for malignancy in lesions classified as category 5 and low risk for category 3. As regards category 4, the need for systematic biopsies was observed.
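A positive predictive value is, by definition, PPV = TP/(TP + FP): of the lesions biopsied in a category, the fraction that proved malignant. The counts below are hypothetical, chosen only to land near the paper's category 4 figure:

```python
# Positive predictive value per BI-RADS category. Counts are illustrative
# only; the paper's 725 cases are not reproduced here.
def positive_predictive_value(true_positives: int, false_positives: int) -> float:
    return true_positives / (true_positives + false_positives)

# e.g., a hypothetical category with 43 malignant results in 130 biopsies:
print(f"PPV = {positive_predictive_value(43, 130 - 43):.2%}")  # 33.08%
```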

Relevance:

100.00%

Publisher:

Abstract:

In this paper we set out a confirmatory factor analysis model relating the values adolescents and their parents aspire to for the child's future. We address a problem that arises when collecting parents' answers and analysing paired data from parents and their child: in some families only one parent answers, while in others both parents meet to answer together. In order to account for differences between one-parent and two-parent responses, we follow a multiple group structural equation modelling approach. Some significant differences emerged between the two-parent and one-parent answering groups. We observed only weak relationships between parents' and children's values.

Relevance:

100.00%

Publisher:

Abstract:

Identification of low-dimensional structures and main sources of variation from multivariate data are fundamental tasks in data analysis. Many methods aimed at these tasks involve the solution of an optimization problem. Thus, the objective of this thesis is to develop computationally efficient and theoretically justified methods for solving such problems.

Most of the thesis is based on a statistical model in which ridges of the density estimated from the data are considered as relevant features. Finding ridges, which are generalized maxima, necessitates the development of advanced optimization methods. An efficient and convergent trust region Newton method for projecting a point onto a ridge of the underlying density is developed for this purpose. The method is utilized in a differential equation-based approach for tracing ridges and computing projection coordinates along them. The density estimation is done nonparametrically using Gaussian kernels, which allows the application of ridge-based methods with only mild assumptions on the underlying structure of the data.

The statistical model and the ridge finding methods are adapted to two different applications. The first is the extraction of curvilinear structures from noisy data mixed with background clutter. The second is a novel nonlinear generalization of principal component analysis (PCA) and its extension to time series data. The methods have a wide range of potential applications where most of the earlier approaches are inadequate; examples include the identification of faults from seismic data and of filaments from cosmological data. Applicability of the nonlinear PCA to climate analysis and to the reconstruction of periodic patterns from noisy time series data is also demonstrated.

Other contributions of the thesis include the development of an efficient semidefinite optimization method for embedding graphs into the Euclidean space. The method produces structure-preserving embeddings that maximize interpoint distances. It is primarily developed for dimensionality reduction, but it also has potential applications in graph theory and various areas of physics, chemistry and engineering. The asymptotic behaviour of ridges and maxima of Gaussian kernel densities is also investigated as the kernel bandwidth approaches infinity. The results are applied to the nonlinear PCA and to finding significant maxima of such densities, which is a typical problem in visual object tracking.
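The thesis develops a trust region Newton method for projecting points onto density ridges. As an accessible stand-in for the same task, the well-known subspace-constrained mean shift (SCMS) iteration is sketched below with a Gaussian kernel density estimate and toy data; it is not the thesis's algorithm:

```python
# Subspace-constrained mean shift (SCMS): a simple, well-known iteration
# for moving a point onto a 1-D ridge of a Gaussian kernel density
# estimate. Shown as a stand-in for the thesis's trust region Newton
# projection; data and bandwidth below are toy values.
import numpy as np

def scms_step(x, data, h):
    """One SCMS step toward the 1-D ridge of a Gaussian KDE."""
    diff = data - x                                    # (n, d)
    w = np.exp(-0.5 * np.sum(diff**2, axis=1) / h**2)  # kernel weights
    # Unnormalized Hessian of the KDE at x.
    H = (np.einsum('n,ni,nj->ij', w, diff, diff) / h**4
         - np.eye(x.size) * w.sum() / h**2)
    m = (w[:, None] * data).sum(axis=0) / w.sum() - x  # mean-shift vector
    vals, vecs = np.linalg.eigh(H)                     # ascending eigenvalues
    V = vecs[:, :x.size - 1]   # directions orthogonal to the ridge
    return x + V @ V.T @ m     # move only within the constrained subspace

# Toy data along a noisy parabola; project one point onto the ridge.
rng = np.random.default_rng(1)
t = rng.uniform(-2, 2, 400)
data = np.c_[t, t**2 + 0.1 * rng.normal(size=t.size)]
x = np.array([0.5, 1.0])
for _ in range(200):
    x = scms_step(x, data, h=0.3)
print("ridge point:", x)
```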