884 results for Problem analysis
Abstract:
Introduction. Tricuspid regurgitation (TR) is the most common valvular dysfunction found after heart transplantation (HTx). It may be related to endomyocardial biopsy (EMB) performed for allograft rejection surveillance. Objective. This investigation evaluated the presence of tricuspid valve tissue fragments obtained during routine EMB performed after HTx and its possible effect on short-term and long-term hemodynamic status. Method. This single-center review included prospectively collected and retrospectively analyzed data. From 1985 to 2010, 417 patients underwent 3550 EMB after HTx. All myocardial specimens were reviewed for the presence of tricuspid valve tissue, initially by 2 observers and, in doubtful cases, by a third observer. Echocardiographic and hemodynamic parameters were considered for analysis of valvular functional damage only in cases in which tricuspid tissue was inadvertently removed during EMB. Results. The 417 HTx patients underwent 3550 EMB, yielding 17,550 myocardial specimens. Tricuspid valve tissue was observed in 12 (2.9%) patients, corresponding to 0.07% of the removed fragments. Comparison of the echocardiographic and hemodynamic parameters of these patients before versus after the biopsy showed increased TR in 2 cases (2/12; 16.7%), quantified as moderate, without progression in the long term. Only the right atrial pressure showed a significant increase (P = .0420) after tricuspid injury; however, functional class did not worsen significantly in any of the subjects. Thus, surgical intervention was not required. Conclusions. Histological evidence of chordal tissue in EMB specimens is a real-world problem of relatively low frequency. Traumatic tricuspid valve injury due to EMB rarely leads to severe valvular regurgitation; only a minority of patients develop significant clinical symptoms, and hemodynamic and echocardiographic alterations are likewise infrequent.
Abstract:
A deep theoretical analysis of the graph cut image segmentation framework presented in this paper simultaneously translates into important contributions in several directions. The most important practical contribution of this work is a full theoretical description, and implementation, of a novel powerful segmentation algorithm, GC_max. The output of GC_max coincides with a version of a segmentation algorithm known as Iterative Relative Fuzzy Connectedness, IRFC. However, GC_max is considerably faster than the classic IRFC algorithm, which we prove theoretically and show experimentally. Specifically, we prove that, in the worst-case scenario, the GC_max algorithm runs in linear time with respect to the variable $M = |C| + |Z|$, where $|C|$ is the image scene size and $|Z|$ is the size of the allowable range, $Z$, of the associated weight/affinity function. For most implementations, $Z$ is identical to the set of allowable image intensity values, and its size can be treated as small with respect to $|C|$, meaning that $O(M) = O(|C|)$. In such a situation, GC_max runs in linear time with respect to the image size $|C|$. We show that the output of GC_max constitutes a solution of a graph cut energy minimization problem, in which the energy is defined as the $\ell_\infty$ norm $\|F_P\|_\infty$ of the map $F_P$ that associates, with every element $e$ from the boundary of an object $P$, its weight $w(e)$. This formulation brings IRFC algorithms to the realm of the graph cut energy minimizers, with energy functions $\|F_P\|_q$ for $q \in [1, \infty]$. Of these, the best known minimization problem is for the energy $\|F_P\|_1$, which is solved by the classic min-cut/max-flow algorithm, often referred to as the Graph Cut algorithm. We notice that a minimization problem for $\|F_P\|_q$, $q \in [1, \infty)$, is identical to that for $\|F_P\|_1$ when the original weight function $w$ is replaced by $w^q$. Thus, any algorithm GC_sum solving the $\|F_P\|_1$ minimization problem also solves the one for $\|F_P\|_q$ with $q \in [1, \infty)$, so just two algorithms, GC_sum and GC_max, are enough to solve all $\|F_P\|_q$-minimization problems. We also show that, for any fixed weight assignment, the solutions of the $\|F_P\|_q$-minimization problems converge to a solution of the $\|F_P\|_\infty$-minimization problem (the identity $\|F_P\|_\infty = \lim_{q \to \infty} \|F_P\|_q$ alone is not enough to deduce that). An experimental comparison of the performance of the GC_max and GC_sum algorithms is included. This concentrates on comparing the actual (as opposed to provable worst-case) running times of the algorithms, as well as the influence of the choice of seeds on the output.
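For reference, a minimal LaTeX restatement of the energies discussed above, written under the assumption (not spelled out in the abstract) that bd(P) denotes the set of boundary edges of the object P:

\[
\|F_P\|_q = \Big( \sum_{e \in \mathrm{bd}(P)} w(e)^q \Big)^{1/q}, \quad q \in [1,\infty),
\qquad
\|F_P\|_\infty = \max_{e \in \mathrm{bd}(P)} w(e).
\]

Since $x \mapsto x^{1/q}$ is increasing, minimizing $\|F_P\|_q$ over $P$ is the same as minimizing $\sum_{e \in \mathrm{bd}(P)} w(e)^q$, which is exactly the $\|F_P\|_1$ problem for the modified weights $w^q$; this is the substitution that lets GC_sum cover every finite $q$.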
Abstract:
The concept of Education for Sustainable Development (ESD) has been introduced in a period in which chemistry education is undergoing a major change, both in emphasis and in methods of teaching. Studying an everyday problem with an important socio-economic impact in the laboratory is part of this approach. Presently, the students in many countries go to school in vehicles that run, at least partially, on biofuels; it is high time to let them test these fuels. The use of renewable fuels is not new: since 1931 the gasoline sold in Brazil has contained 20 to 25 vol% bioethanol, and this composition is continually monitored. With ESD in mind, we have employed a constructivist approach in an undergraduate course in which UV-vis spectroscopy was used to determine the composition of two fuel blends, namely bioethanol/water and bioethanol/gasoline. The activities started with a three-part quiz. The first and second parts introduced the students to historical and practical aspects of the theme (biofuels). In the third part, we asked them to develop a UV-vis experiment for the determination of the composition of fuel blends. They tested two approaches: (i) use of a solvatochromic dye, followed by determination of fuel composition from plots of the empirical fuel polarity versus its composition; (ii) use of an ethanol-soluble dye, followed by determination of the blend composition from a Beer's law plot; the former proved to be much more convenient. Their evaluation of the experiment was highly positive because of the relevance of the problem, the (constructivist) approach employed, and the bright colors that the solvatochromic dye acquires in these fuel blends. Thus, ESD can be fruitfully employed to motivate the students, make the laboratory "fun", and teach them theory (solvation). The experiments reported here can also be given to undergraduate students whose major is not chemistry (engineering, pharmacy, biology, etc.). They are low cost and safe enough to be introduced at the high-school level.
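A minimal Python sketch (not from the paper; the numbers are invented placeholders) of how the Beer's law route described in (ii) could be implemented, fitting a calibration line of absorbance versus bioethanol volume fraction and inverting it for an unknown blend:

import numpy as np

# calibration data for the ethanol-soluble dye at a fixed wavelength
# (placeholder values for illustration, not measurements from the course)
ethanol_fraction = np.array([0.00, 0.20, 0.40, 0.60, 0.80, 1.00])  # volume fraction
absorbance       = np.array([0.02, 0.18, 0.35, 0.51, 0.69, 0.85])

# Beer's law plot: linear fit A = a*x + b
a, b = np.polyfit(ethanol_fraction, absorbance, 1)

# invert the calibration for an unknown blend
A_unknown = 0.44
x_unknown = (A_unknown - b) / a
print(f"estimated bioethanol volume fraction: {x_unknown:.2f}")

The solvatochromic route in (i) is analogous: one calibrates the empirical polarity (computed from the dye's absorption maximum) against blend composition and interpolates the unknown on that curve.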
Abstract:
This paper compares the effectiveness of the Tsallis entropy over the classic Boltzmann-Gibbs-Shannon entropy for general pattern recognition, and proposes a multi-q approach to improve pattern analysis using entropy. A series of experiments were carried out for the problem of classifying image patterns. Given a dataset of 40 pattern classes, the goal of our image case study is to assess how well the different entropies can be used to determine the class of a newly given image sample. Our experiments show that the Tsallis entropy using the proposed multi-q approach has great advantages over the Boltzmann-Gibbs-Shannon entropy for pattern classification, boosting image recognition rates by a factor of 3. We discuss the reasons behind this success, shedding light on the usefulness of the Tsallis entropy and the multi-q approach. (C) 2012 Elsevier B.V. All rights reserved.
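A minimal Python sketch of how a multi-q Tsallis feature vector could be computed from an image's gray-level histogram; the particular q values and the 256-bin histogram are illustrative assumptions, not the authors' settings:

import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1); tends to the
    Boltzmann-Gibbs-Shannon entropy -sum_i p_i ln p_i as q -> 1."""
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def multi_q_features(image, qs=(0.1, 0.5, 1.0, 1.5, 2.0, 3.0)):
    """Feature vector obtained by evaluating the Tsallis entropy at several q values."""
    hist, _ = np.histogram(image, bins=256, range=(0, 256))
    p = hist / hist.sum()                 # normalized gray-level distribution
    return np.array([tsallis_entropy(p, q) for q in qs])

# usage: features for a random 8-bit test image
img = np.random.randint(0, 256, size=(64, 64))
print(multi_q_features(img))

A classifier (for example, nearest neighbor) would then be trained on such vectors for the 40 pattern classes.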
Abstract:
In this paper we analyze the problem of light-matter interaction when absorptive resonances are imbedded in the material dispersion. We apply an improved approach to aluminum (Al) in the optical frequency range to investigate the impact of these resonances on the operating characteristics of Al-based nanoscale devices. Quantities such as group velocity, stored energy density, and energy velocity, normally obtained using a single resonance model [Wave Propagation and Group Velocity (Academic Press, 1960), Nat. Mater. 11, 208 (2012)], are now accurately calculated regardless of the medium adopted. We adapt the Loudon approach [Nat. Mater. 11, 208 (2012)] to media with several optical resonances and present the details of the extended model. We also show pertinent results for Al-based metal-dielectric-metal (MDM) waveguides, around spectral resonances. The model delineated here can be applied readily to any metal accurately characterized by Drude-Lorentz spectral resonance features. (C) 2012 Optical Society of America
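A minimal Python sketch of a multi-resonance Drude-Lorentz permittivity of the kind the extended model operates on; the parameter values are illustrative placeholders, not the fitted aluminum data used in the paper:

import numpy as np

def drude_lorentz_epsilon(omega, eps_inf, omega_p, gamma_d, resonances):
    """Relative permittivity with one Drude term plus several Lorentz resonances.
    resonances: iterable of (f_j, omega_j, gamma_j) = oscillator strength,
    resonance frequency, damping; all frequencies in consistent units (rad/s)."""
    eps = eps_inf - omega_p**2 / (omega**2 + 1j * gamma_d * omega)
    for f_j, omega_j, gamma_j in resonances:
        eps = eps + f_j * omega_p**2 / (omega_j**2 - omega**2 - 1j * gamma_j * omega)
    return eps

# usage with placeholder parameters (rad/s)
w = np.linspace(1.0e15, 1.2e16, 500)
eps = drude_lorentz_epsilon(
    w, eps_inf=1.0, omega_p=2.2e16, gamma_d=1.2e14,
    resonances=[(0.2, 2.4e15, 5.0e14), (0.1, 3.9e15, 1.0e15)],
)
print(eps[:3])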
Abstract:
Statement of problem. The retention of an Aramany Class IV removable partial dental prosthesis can be compromised by a lack of support. The biomechanics of this obturator prosthesis result in an unusual stress distribution on the residual maxillary bone. Purpose. This study evaluated the biomechanics of an Aramany Class IV obturator prosthesis with finite element analysis and a digital 3-dimensional (3-D) model developed from a computed tomography scan; bone stress was evaluated according to the load placed on the prosthesis. Material and methods. A 3-D model of an Aramany Class IV maxillary resection and prosthesis was constructed. This model was used to develop a finite element mesh. A 120 N load was applied to the occlusal and incisal platforms corresponding to the prosthetic teeth. Qualitative analysis was based on the scale of maximum principal stress; values obtained through quantitative analysis were expressed in MPa. Results. Under posterior load, tensile and compressive stresses were observed; the tensile stress was greater than the compressive stress, regardless of the bone region, and the greatest compressive stress was observed on the anterior palate near the midline. Under an anterior load, tensile stress was observed in all of the evaluated bone regions; the tensile stress was greater than the compressive stress, regardless of the bone region. Conclusions. The Aramany Class IV obturator prosthesis tended to rotate toward the surgical resection when subjected to posterior or anterior loads. The amount of tensile and compressive stress caused by the Aramany Class IV obturator prosthesis did not exceed the physiological limits of the maxillary bone tissue. (J Prosthet Dent 2012;107:336-342)
Abstract:
The solution of structural reliability problems by the First Order method requires optimization algorithms to find the smallest distance between the limit state surface and the origin of the standard Gaussian space. The Hasofer-Lind-Rackwitz-Fiessler (HLRF) algorithm, developed specifically for this purpose, has been shown to be efficient but not robust, as it fails to converge for a significant number of problems. On the other hand, recent developments in general (augmented Lagrangian) optimization techniques have not been tested in application to structural reliability problems. In the present article, three new optimization algorithms for structural reliability analysis are presented. One algorithm is based on the HLRF, but uses a new differentiable merit function with Wolfe conditions to select the step length in the line search. It is shown in the article that, under certain assumptions, the proposed algorithm generates a sequence that converges to the local minimizer of the problem. Two new augmented Lagrangian methods are also presented, which use quadratic penalties to solve nonlinear problems with equality constraints. The performance and robustness of the new algorithms are compared to those of the classic augmented Lagrangian method, the HLRF, and the improved HLRF (iHLRF) algorithms in the solution of 25 benchmark problems from the literature. The new proposed HLRF algorithm is shown to be more robust than HLRF or iHLRF, and as efficient as the iHLRF algorithm. The two augmented Lagrangian methods proposed herein are shown to be more robust and more efficient than the classical augmented Lagrangian method.
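For orientation, a minimal Python sketch of the classic HLRF step that the article's merit-function variant builds on (no line search or merit function here); the limit state function in the usage example is a toy assumption:

import numpy as np

def hlrf(g, u0, tol=1e-8, max_iter=100, h=1e-6):
    """Classic HLRF iteration in standard Gaussian space.
    g: limit state function g(u), with failure for g(u) <= 0.
    Returns the design point u* and the reliability index beta = ||u*||."""
    u = np.asarray(u0, dtype=float)
    for _ in range(max_iter):
        gu = g(u)
        # forward-difference gradient of g at u
        grad = np.array([(g(u + h * e) - gu) / h for e in np.eye(len(u))])
        u_new = (grad @ u - gu) / (grad @ grad) * grad   # HLRF update
        if np.linalg.norm(u_new - u) < tol:
            return u_new, np.linalg.norm(u_new)
        u = u_new
    return u, np.linalg.norm(u)

# usage: linear limit state g(u) = 3 - u1 - u2, exact beta = 3 / sqrt(2)
u_star, beta = hlrf(lambda u: 3.0 - u[0] - u[1], u0=[0.0, 0.0])
print(beta)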
Abstract:
Objective: To assess the risk factors for delayed diagnosis of uterine cervical lesions. Materials and Methods: This is a case-control study that recruited 178 women at 2 Brazilian hospitals. The cases (n = 74) were women with a late diagnosis of a lesion in the uterine cervix (invasive carcinoma at any stage). The controls (n = 104) were women with cervical lesions diagnosed early (low- or high-grade intraepithelial lesions). The analysis was performed by means of a hierarchical logistic regression model. The socioeconomic and demographic variables were included at level I (distal). Level II (intermediate) included personal and family antecedents and knowledge about the Papanicolaou test and human papillomavirus. Level III (proximal) encompassed the variables relating to individuals' care for their own health, gynecologic symptoms, and access to the health care system. Results: The risk factors for late diagnosis of uterine cervical lesions were age older than 40 years (odds ratio [OR], 10.4; 95% confidence interval [CI], 2.3-48.4), not knowing the difference between the Papanicolaou test and gynecological pelvic examinations (OR, 2.5; 95% CI, 1.3-4.9), not thinking that the Papanicolaou test was important (OR, 4.2; 95% CI, 1.3-13.4), and abnormal vaginal bleeding (OR, 15.0; 95% CI, 6.5-35.0). Previous treatment for a sexually transmissible disease was a protective factor (OR, 0.3; 95% CI, 0.1-0.8) against delayed diagnosis. Conclusions: Deficiencies in cervical cancer prevention programs in developing countries are not simply a matter of better provision and coverage of Papanicolaou tests. The misconception about the Papanicolaou test is a serious educational problem, as demonstrated by the present study.
Abstract:
Background: Hepatitis B virus (HBV) infection is one of the most prevalent viral infections in humans and represents a serious public health problem. In Colombia, our group recently reported the presence of subgenotypes F3 and A2 and genotype G in Bogota. The aim of this study was to characterize the HBV genotypes circulating in Quibdo, the largest Afro-descendant community in Colombia. Sixty HBsAg-positive samples were studied. A fragment of 1306 bp (S/POL) was amplified by nested PCR. Samples positive for the S/POL fragment were submitted to PCR amplification of the HBV complete genome. Findings: The distribution of HBV genotypes was: A1 (52.17%), E (39.13%), D3 (4.3%) and F3/A1 (4.3%). An HBV recombinant strain of subgenotype F3/A1 was found for the first time. Conclusions: This study is the first analysis of complete HBV genome sequences from an Afro-Colombian population. An important presence of the HBV/A1 and HBV/E genotypes was found, and a new recombinant strain of HBV genotype F3/A1 was identified in this population. These findings may be correlated with the introduction of these genotypes during the time of slavery.
Abstract:
In this work, the differentiability of the principal eigenvalue $\lambda = \lambda_1(\Gamma)$ of the localized Steklov problem $-\Delta u + qu = 0$ in $\Omega$, $\partial u / \partial \nu = \lambda \chi_\Gamma(x) u$ on $\partial\Omega$, where $\Gamma \subset \partial\Omega$ is a smooth subdomain of $\partial\Omega$ and $\chi_\Gamma$ is its characteristic function relative to $\partial\Omega$, is shown. As a key point, the flux subdomain $\Gamma$ is regarded here as the variable with respect to which such differentiation is performed. An explicit formula for the derivative of $\lambda_1(\Gamma)$ with respect to $\Gamma$ is obtained. The lack of regularity up to the boundary of the first derivative of the principal eigenfunctions is a further intrinsic feature of the problem. Therefore, the whole analysis must be done in the weak sense of $H^1(\Omega)$. The study is of interest in mathematical models in morphogenesis. (C) 2011 Elsevier Inc. All rights reserved.
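A minimal LaTeX restatement of the weak ($H^1$) formulation implied above, included only for readability; the hypotheses on $q$, $\Omega$ and $\Gamma$ are those of the paper:

\[
u \in H^1(\Omega), \qquad
\int_\Omega \big( \nabla u \cdot \nabla v + q\,u\,v \big)\,dx
= \lambda \int_\Gamma u\,v\,d\sigma
\quad \text{for all } v \in H^1(\Omega),
\]

with the principal eigenvalue $\lambda_1(\Gamma)$ being the eigenvalue associated with a positive eigenfunction.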
Abstract:
Background The search for enriched (aka over-represented or enhanced) ontology terms in a list of genes obtained from microarray experiments is becoming a standard procedure for a system-level analysis. This procedure tries to summarize the information by focussing on classification designs such as Gene Ontology, KEGG pathways, and so on, instead of on individual genes. Although it is well known in statistics that association and significance are distinct concepts, only the latter has been used to deal with the ontology term enrichment problem. Results BayGO implements a Bayesian approach to search for enriched terms from microarray data. The R source-code is freely available at http://blasto.iq.usp.br/~tkoide/BayGO in three versions: Linux, which can be easily incorporated into pre-existent pipelines; Windows, to be controlled interactively; and as a web-tool. The software was validated using a bacterial heat shock response dataset, since this stress triggers known system-level responses. Conclusion The Bayesian model accounts for the fact that not all the genes from a given category are necessarily observable in microarray data, owing to low-intensity signal, quality filters, genes that were not spotted, and so on. Moreover, BayGO allows one to measure the statistical association between generic ontology terms and differential expression, instead of working only with the common significance analysis.
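To make the association-versus-significance distinction concrete, here is a minimal Python sketch (this is not BayGO's Bayesian model; the counts and the 0.5 continuity correction are illustrative assumptions) contrasting the classic hypergeometric enrichment p-value with a simple measure of association for one ontology term:

from scipy.stats import hypergeom

def enrichment_summary(k, n_cat, n_de, n_total):
    """k: differentially expressed (DE) genes annotated with the term,
    n_cat: genes annotated with the term, n_de: DE genes, n_total: genes on the array."""
    # significance: hypergeometric upper-tail probability P(X >= k)
    p_value = hypergeom.sf(k - 1, n_total, n_cat, n_de)
    # association: sample odds ratio from the 2x2 table (0.5 added to each cell)
    a, b = k, n_cat - k
    c, d = n_de - k, n_total - n_cat - (n_de - k)
    odds_ratio = ((a + 0.5) * (d + 0.5)) / ((b + 0.5) * (c + 0.5))
    return p_value, odds_ratio

print(enrichment_summary(k=12, n_cat=40, n_de=200, n_total=4000))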
Abstract:
Background A popular model for gene regulatory networks is the Boolean network model. In this paper, we propose an algorithm to perform an analysis of gene regulatory interactions using the Boolean network model and time-series data. The Boolean network considered here is restricted in the sense that only a subset of all possible Boolean functions is allowed. We explore some mathematical properties of these restricted Boolean networks in order to avoid a full search approach. The problem is modeled as a Constraint Satisfaction Problem (CSP), and CSP techniques are used to solve it. Results We applied the proposed algorithm to two data sets. First, we used an artificial dataset obtained from a model of the budding yeast cell cycle. The second data set is derived from experiments performed using HeLa cells. The results show that some interactions can be fully or, at least, partially determined under the Boolean model considered. Conclusions The proposed algorithm can be used as a first step in the detection of gene/protein interactions. It is able to infer gene relationships from time-series data of gene expression, and this inference process can be aided by available a priori knowledge.
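A minimal Python sketch of the underlying idea (a naive consistency filter over a restricted function class, here AND/OR of possibly negated inputs, taken as an assumption; it is not the paper's CSP formulation): candidate rules for a target gene are kept only if they reproduce every observed transition in the time series.

from itertools import combinations, product

def consistent_rules(series, target, n_inputs=2):
    """series: list of tuples with the Boolean state of all genes at successive times.
    Returns (input indices, negation flags, operator) for every restricted Boolean
    function of n_inputs inputs that is consistent with all observed transitions."""
    n_genes = len(series[0])
    transitions = list(zip(series[:-1], series[1:]))
    rules = []
    for inputs in combinations(range(n_genes), n_inputs):
        for negs in product((False, True), repeat=n_inputs):
            for op_name, op in (("AND", all), ("OR", any)):
                if all(op(bool(s[i]) != neg for i, neg in zip(inputs, negs)) == bool(t[target])
                       for s, t in transitions):
                    rules.append((inputs, negs, op_name))
    return rules

# usage: 3-gene toy series in which gene 2 follows gene0 AND gene1
series = [(1, 1, 0), (1, 0, 1), (0, 1, 0), (1, 1, 0), (0, 0, 1)]
print(consistent_rules(series, target=2))

The CSP formulation in the paper prunes this search rather than enumerating it exhaustively.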
Abstract:
This paper proposes two new approaches for the sensitivity analysis of multiobjective design optimization problems whose performance functions are highly susceptible to small variations in the design variables and/or design environment parameters. In both methods, the less sensitive design alternatives are preferred over others during the multiobjective optimization process. In the first approach, the designer chooses the design variables and/or parameters that cause the uncertainties, associates a robustness index with each design alternative, and adds this index as an objective function in the optimization problem. In the second approach, the designer must know, a priori, the interval of variation in the design variables or in the design environment parameters, because the resulting interval of variation in the objective functions will have to be accepted. The second method does not require any probability distribution law for the uncontrollable variations. Finally, the authors give two illustrative examples to highlight the contributions of the paper.
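A minimal Python sketch of the first approach under an assumed definition of the robustness index (the abstract does not fix a formula): the worst-case relative change of the objective vector under a small perturbation of an uncertain variable, to be appended as an extra objective and minimized.

import numpy as np

def robustness_index(objectives, x, i_uncertain, delta):
    """Worst-case relative change of the objectives when design variable
    x[i_uncertain] is perturbed by +/- delta (illustrative index, not the paper's)."""
    x = np.asarray(x, dtype=float)
    f0 = np.array([f(x) for f in objectives])
    worst = 0.0
    for sign in (-1.0, 1.0):
        xp = x.copy()
        xp[i_uncertain] += sign * delta
        fp = np.array([f(xp) for f in objectives])
        worst = max(worst, float(np.max(np.abs(fp - f0) / (np.abs(f0) + 1e-12))))
    return worst

# usage: two toy objectives of a 2-variable design, x[1] uncertain by +/- 0.05
f1 = lambda x: (x[0] - 1.0) ** 2 + 0.1 * x[1]
f2 = lambda x: x[0] ** 2 + 10.0 * np.sin(x[1])
print(robustness_index([f1, f2], x=[0.5, 0.3], i_uncertain=1, delta=0.05))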
Abstract:
The thesis consists of three independent parts. Part I: Polynomial amoebas. We study the amoeba of a polynomial, as defined by Gelfand, Kapranov and Zelevinsky. A central role in the treatment is played by a certain convex function which is linear in each complement component of the amoeba, which we call the Ronkin function. This function is used in two different ways. First, we use it to construct a polyhedral complex, which we call a spine, approximating the amoeba. Second, the Monge-Ampère measure of the Ronkin function has interesting properties which we explore. This measure can be used to derive an upper bound on the area of an amoeba in two dimensions. We also obtain results on the number of complement components of an amoeba, and consider possible extensions of the theory to varieties of codimension higher than 1. Part II: Differential equations in the complex plane. We consider polynomials in one complex variable arising as eigenfunctions of certain differential operators, and obtain results on the distribution of their zeros. We show that in the limit when the degree of the polynomial approaches infinity, its zeros are distributed according to a certain probability measure. This measure has its support on the union of finitely many curve segments, and can be characterized by a simple condition on its Cauchy transform. Part III: Radon transforms and tomography. This part is concerned with different weighted Radon transforms in two dimensions, in particular the problem of inverting such transforms. We obtain stability results for this inverse problem for rather general classes of weights, including weights of attenuation type with data acquisition limited to a 180-degree range of angles. We also derive an inversion formula for the exponential Radon transform, with the same restriction on the angle.
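For reference, the standard definition of the exponential Radon transform mentioned in Part III (the thesis's general weighted transforms and its inversion formula are not reproduced here):

\[
R_\mu f(\theta, s) = \int_{\mathbb{R}} f\big(s\theta + t\theta^{\perp}\big)\, e^{\mu t}\, dt,
\qquad \theta = (\cos\varphi, \sin\varphi), \quad \theta^{\perp} = (-\sin\varphi, \cos\varphi), \quad s \in \mathbb{R},
\]

a weighted Radon transform whose weight depends only on the signed distance $t$ along the line of integration.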
Abstract:
Doctoral program: Sistemas Inteligentes y Aplicaciones Numéricas en Ingeniería, Instituto Universitario (SIANI)