937 results for Spectral method with domain decomposition
Abstract:
3rd Portuguese Meeting on Medicinal Chemistry and 1st Portuguese-Spanish-Brazilian Meeting on Medicinal Chemistry, Aveiro, 28-30 November 2012.
Abstract:
On the basis of its electrochemical behaviour, a new flow-injection analysis (FIA) method with amperometric detection has been developed for quantification of the herbicide bentazone (BTZ) in estuarine waters. Standard solutions and samples (200 µL) were injected into a water carrier stream, and both pH and ionic strength were automatically adjusted inside the manifold. Optimization of critical FIA conditions indicated that the best analytical results were obtained at an oxidation potential of 1.10 V, pH 4.5, and an overall flow rate of 2.4 mL min⁻¹. Analysis of real samples was performed by means of calibration curves over the concentration range 2.5×10⁻⁶ to 5.0×10⁻⁵ mol L⁻¹, and results were compared with those obtained by use of an independent method (HPLC). The accuracy of the amperometric determinations was ascertained; errors relative to the comparison method were below 4% and sampling rates were approximately 100 samples h⁻¹. The repeatability of the proposed method was calculated by assessing the relative standard deviation (%) of ten consecutive determinations of one sample; the value obtained was 2.1%.
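As an illustration of the calibration and repeatability arithmetic behind these figures of merit, the sketch below fits a least-squares calibration line and computes the relative standard deviation of replicate injections; all numerical values in it are hypothetical placeholders, not the paper's data:

```python
import numpy as np

# Hypothetical calibration standards (mol/L) and peak currents (arbitrary units);
# the real curve spans 2.5e-6 to 5.0e-5 mol/L as stated in the abstract.
conc = np.array([2.5e-6, 5.0e-6, 1.0e-5, 2.5e-5, 5.0e-5])
current = np.array([0.21, 0.43, 0.86, 2.10, 4.25])

slope, intercept = np.polyfit(conc, current, 1)   # linear calibration fit
sample = (1.30 - intercept) / slope               # interpolate an unknown current

# Repeatability: relative standard deviation (%) of ten consecutive
# determinations of one sample (the abstract reports 2.1%).
replicates = np.array([1.02, 1.00, 0.99, 1.04, 1.01,
                       0.98, 1.03, 1.00, 1.02, 0.99])
rsd = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"slope = {slope:.3e}, unknown = {sample:.2e} mol/L, RSD = {rsd:.1f}%")
```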
Abstract:
Master's final project, prepared at the Laboratório Nacional de Engenharia Civil (LNEC) to obtain the degree of Master in Civil Engineering from the Instituto Superior de Engenharia de Lisboa, under the cooperation protocol between ISEL and LNEC.
Abstract:
Constrained nonlinear optimization problems can be solved using penalty or barrier functions. This strategy, based on solving unconstrained problems derived from the original problem, has been shown to be effective, particularly when used with direct search methods. An alternative way to solve such problems is the filter method. The filter method, introduced by Fletcher and Leyffer in 2002, has been widely used to solve problems of the type mentioned above. These methods use a strategy different from barrier or penalty functions: whereas those functions define a new function that combines the objective function and the constraints, the filter method treats the optimization problem as a bi-objective problem that minimizes the objective function and a function that aggregates the constraints. Motivated by the work of Audet and Dennis in 2004, which used the filter method with derivative-free algorithms, the authors developed works in which other direct search methods were used, combining their potential with the filter method. More recently, a new variant of these methods was presented, in which some alternative constraint aggregations for the construction of filters were proposed. This paper presents a variant of the filter method, more robust than the previous ones, implemented with a safeguard procedure in which values of the function and constraints are interlinked and not treated completely independently.
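For concreteness, a filter in Fletcher and Leyffer's sense stores non-dominated pairs (f, h) of objective value and aggregated constraint violation, and a trial point is accepted only if no stored pair dominates it. A minimal sketch of one common variant of that acceptance test follows; the margin parameter and all names are illustrative, not taken from the paper:

```python
def acceptable(trial, filt, gamma=1e-5):
    """Filter acceptance test for a trial pair (f, h), where f is the
    objective value and h the aggregated constraint violation: accept if,
    against every stored pair, the trial sufficiently improves either the
    objective or the violation (a common sloping-filter rule)."""
    f, h = trial
    return all(f <= fk - gamma * hk or h <= (1 - gamma) * hk
               for fk, hk in filt)

def add_to_filter(filt, trial):
    """Insert an accepted pair and discard stored pairs it dominates."""
    f, h = trial
    filt[:] = [(fk, hk) for fk, hk in filt if fk < f or hk < h]
    filt.append(trial)

# An improving point is accepted; a dominated point is rejected.
filt = [(10.0, 0.5)]
print(acceptable((9.0, 0.1), filt))   # True
print(acceptable((11.0, 0.6), filt))  # False
```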
Abstract:
Constrained nonlinear optimization problems are usually solved using penalty or barrier methods combined with unconstrained optimization methods. Another alternative used to solve constrained nonlinear optimization problems is the filter method. The filter method, introduced by Fletcher and Leyffer in 2002, has been widely used in several areas of constrained nonlinear optimization. These methods treat the optimization problem as a bi-objective problem that attempts to minimize the objective function and a continuous function that aggregates the constraint violation functions. Audet and Dennis presented the first filter method for derivative-free nonlinear programming, based on pattern search methods. Motivated by this work, we have developed a new direct search method, based on simplex methods, for general constrained optimization, which combines the features of the simplex method and the filter method. This work presents a new variant of these methods that combines the filter method with other direct search methods, and some alternatives for aggregating the constraint violation functions are proposed.
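The alternatives for aggregating the constraint violation functions mentioned above typically amount to choosing a norm for the vector of individual violations; the sketch below illustrates three generic choices (these norms are a plausible reading, not necessarily the authors' proposals):

```python
import numpy as np

def violation_vector(g, x):
    """Individual violations of inequality constraints g_i(x) <= 0."""
    return np.maximum(0.0, np.asarray(g(x), dtype=float))

def h(g, x, kind="l2"):
    """Aggregate constraint violation under a chosen norm."""
    v = violation_vector(g, x)
    if kind == "l1":
        return v.sum()
    if kind == "linf":
        return v.max(initial=0.0)
    return float(np.sqrt((v ** 2).sum()))  # default: Euclidean norm

# Example with g(x) = (x0 + x1 - 1, -x0): only the first constraint is violated.
g = lambda x: (x[0] + x[1] - 1.0, -x[0])
x = np.array([0.8, 0.5])
print(h(g, x, "l1"), h(g, x, "l2"), h(g, x, "linf"))  # 0.3 0.3 0.3
```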
Abstract:
Hyperspectral remote sensing exploits the electromagnetic scattering patterns of different materials at specific wavelengths [2, 3]. Hyperspectral sensors have been developed to sample the scattered portion of the electromagnetic spectrum extending from the visible region through the near-infrared and mid-infrared, in hundreds of narrow contiguous bands [4, 5]. The number and variety of potential civilian and military applications of hyperspectral remote sensing is enormous [6, 7]. Very often, the resolution cell corresponding to a single pixel in an image contains several substances (endmembers) [4]. In this situation, the scattered energy is a mixture of the endmember spectra. A challenging task underlying many hyperspectral imagery applications is then decomposing a mixed pixel into a collection of reflectance spectra, called endmember signatures, and the corresponding abundance fractions [8–10]. Depending on the mixing scales at each pixel, the observed mixture is either linear or nonlinear [11, 12]. The linear mixing model holds approximately when the mixing scale is macroscopic [13] and there is negligible interaction among distinct endmembers [3, 14]. If, however, the mixing scale is microscopic (intimate mixtures) [15, 16] and the incident solar radiation is scattered by the scene through multiple bounces involving several endmembers [17], the linear model is no longer accurate. Linear spectral unmixing has been intensively researched in recent years [9, 10, 12, 18–21]. It considers that a mixed pixel is a linear combination of endmember signatures weighted by the corresponding abundance fractions. Under this model, and assuming that the number of substances and their reflectance spectra are known, hyperspectral unmixing is a linear problem for which many solutions have been proposed (e.g., maximum likelihood estimation [8], spectral signature matching [22], spectral angle mapper [23], subspace projection methods [24, 25], and constrained least squares [26]). In most cases, the number of substances and their reflectances are not known, and hyperspectral unmixing then falls into the class of blind source separation problems [27]. Independent component analysis (ICA) has recently been proposed as a tool to blindly unmix hyperspectral data [28–31]. ICA is based on the assumption of mutually independent sources (abundance fractions), which is not the case for hyperspectral data, since the sum of the abundance fractions is constant, implying statistical dependence among them. This dependence compromises the applicability of ICA to hyperspectral images, as shown in Refs. [21, 32]. In fact, ICA finds the endmember signatures by multiplying the spectral vectors with an unmixing matrix that minimizes the mutual information among sources. If the sources are independent, ICA provides the correct unmixing, since the minimum of the mutual information is obtained only when the sources are independent. This is no longer true for dependent abundance fractions. Nevertheless, some endmembers may be approximately unmixed. These aspects are addressed in Ref. [33]. Under the linear mixing model, the observations from a scene lie in a simplex whose vertices correspond to the endmembers. Several approaches [34–36] have exploited this geometric feature of hyperspectral mixtures [35]. The minimum volume transform (MVT) algorithm [36] determines the simplex of minimum volume containing the data. The method presented in Ref.
[37] is also of MVT type but, by introducing the notion of bundles, it takes into account the endmember variability usually present in hyperspectral mixtures. The MVT-type approaches are complex from the computational point of view. Usually, these algorithms first find the convex hull defined by the observed data and then fit a minimum volume simplex to it. For example, the gift wrapping algorithm [38] computes the convex hull of n data points in a d-dimensional space with a computational complexity of O(n^(⌊d/2⌋+1)), where ⌊x⌋ is the largest integer less than or equal to x and n is the number of samples. The complexity of the method presented in Ref. [37] is even higher, since the temperature of the simulated annealing algorithm used must follow a log(·) law [39] to assure convergence (in probability) to the desired solution. Aiming at a lower computational complexity, some algorithms such as the pixel purity index (PPI) [35] and N-FINDR [40] still find the minimum volume simplex containing the data cloud, but they assume the presence of at least one pure pixel of each endmember in the data. This is a strong requisite that may not hold in some data sets. In any case, these algorithms find the set of most pure pixels in the data. The PPI algorithm uses the minimum noise fraction (MNF) [41] as a preprocessing step to reduce dimensionality and to improve the signal-to-noise ratio (SNR). The algorithm then projects every spectral vector onto skewers (a large number of random vectors) [35, 42, 43]. The points corresponding to the extremes, for each skewer direction, are stored. A cumulative account records the number of times each pixel (i.e., a given spectral vector) is found to be an extreme. The pixels with the highest scores are the purest ones. The N-FINDR algorithm [40] is based on the fact that in p spectral dimensions, the p-volume defined by a simplex formed by the purest pixels is larger than any other volume defined by any other combination of pixels. This algorithm finds the set of pixels defining the largest volume by inflating a simplex inside the data. ORASIS [44, 45] is a hyperspectral framework developed by the U.S. Naval Research Laboratory consisting of several algorithms organized in six modules: exemplar selector, adaptive learner, demixer, knowledge base or spectral library, and spatial postprocessor. The first step consists in flat-fielding the spectra. Next, the exemplar selection module is used to select spectral vectors that best represent the smaller convex cone containing the data. The other pixels are rejected when the spectral angle distance (SAD) is less than a given threshold. The procedure finds the basis for a subspace of lower dimension using a modified Gram–Schmidt orthogonalization. The selected vectors are then projected onto this subspace and a simplex is found by an MVT process. ORASIS is oriented to real-time target detection from uncrewed air vehicles using hyperspectral data [46]. In this chapter we develop a new algorithm to unmix linear mixtures of endmember spectra. First, the algorithm determines the number of endmembers and the signal subspace using a newly developed concept [47, 48]. Second, the algorithm extracts the most pure pixels present in the data. Unlike other methods, this algorithm is completely automatic and unsupervised. To estimate the number of endmembers and the signal subspace in hyperspectral linear mixtures, the proposed scheme begins by estimating the signal and noise correlation matrices.
The latter is based on multiple regression theory. The signal subspace is then identified by selecting the set of signal eigenvalues that best represents the data, in the least-squares sense [48, 49]; we note, however, that VCA works both with projected and with unprojected data. The extraction of the endmembers exploits two facts: (1) the endmembers are the vertices of a simplex and (2) the affine transformation of a simplex is also a simplex. Like the PPI and N-FINDR algorithms, VCA also assumes the presence of pure pixels in the data. The algorithm iteratively projects the data onto a direction orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of the projection. The algorithm iterates until all endmembers are exhausted. VCA performs much better than PPI and better than or comparably to N-FINDR; yet it has a computational complexity between one and two orders of magnitude lower than N-FINDR. The chapter is structured as follows. Section 19.2 describes the fundamentals of the proposed method. Section 19.3 and Section 19.4 evaluate the proposed algorithm using simulated and real data, respectively. Section 19.5 presents some concluding remarks.
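The extraction step just described admits a compact sketch: simulate noise-free linear mixtures, then repeatedly project the data onto a direction orthogonal to the endmembers found so far and keep the pixel at the extreme of the projection. This is an illustration of the stated idea (random data, no noise or subspace estimation), not the published VCA implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate noise-free linear mixtures R = M @ A: each column of A is an
# abundance vector on the simplex (nonnegative entries summing to one).
bands, p, n = 50, 3, 2000
M = rng.random((bands, p))                # endmember signatures (unknown)
A = rng.dirichlet(np.ones(p) * 0.3, n).T  # abundance fractions
R = M @ A                                 # observed mixed pixels (columns)

# Extraction loop: project onto a direction orthogonal to the subspace
# spanned by the endmembers already found; the pixel at the extreme of
# the projection is taken as the next endmember signature.
E = np.zeros((bands, p))
for i in range(p):
    f = rng.standard_normal(bands)
    if i > 0:
        Q, _ = np.linalg.qr(E[:, :i])     # orthonormal basis of found endmembers
        f -= Q @ (Q.T @ f)                # remove components along that subspace
    idx = np.argmax(np.abs(f @ R))        # extreme of the projected data
    E[:, i] = R[:, idx]
```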
Abstract:
The purpose of this work is to present an algorithm to solve nonlinear constrained optimization problems, using the filter method with the inexact restoration (IR) approach. In the IR approach two independent phases are performed in each iteration: the feasibility phase and the optimality phase. The first directs the iterative process toward the feasible region, i.e. it finds a point with smaller constraint violation. The optimality phase starts from this point and its goal is to optimize the objective function within the space of satisfied constraints. To evaluate the solution approximations in each iteration, a scheme based on the filter method is used in both phases of the algorithm. This method replaces the merit functions that are based on penalty schemes, avoiding the related difficulties such as penalty parameter estimation and the non-differentiability of some of them. The filter method is implemented in the context of the line search globalization technique. A set of more than two hundred AMPL test problems is solved. The algorithm developed is compared with the LOQO and NPSOL software packages.
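The two-phase structure can be sketched abstractly as follows, with a crude gradient step on the squared violation standing in for the feasibility phase and a backtracking objective step for the optimality phase; the filter test decides acceptance. All names, step sizes and the line search below are illustrative, not the paper's algorithm:

```python
def ir_filter_step(x, f, grad_f, h, grad_h2, filt, t=1e-2, gamma=1e-5):
    """One sketched inexact-restoration iteration with filter acceptance.
    f/grad_f: objective and its gradient; h: aggregated constraint
    violation; grad_h2: gradient of h(x)**2, used for restoration."""
    # Feasibility phase: move toward the feasible region by reducing h.
    z = x - t * grad_h2(x)
    # Optimality phase: backtracking line search on the objective from z,
    # accepting the first trial point that passes the filter test.
    step = 1.0
    while step > 1e-8:
        y = z - step * grad_f(z)
        fy, hy = f(y), h(y)
        if all(fy <= fk - gamma * hk or hy <= (1 - gamma) * hk
               for fk, hk in filt):
            filt.append((fy, hy))  # augment the filter with the accepted pair
            return y
        step *= 0.5
    return z  # fall back to the restored point
```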
Abstract:
Dissertation for obtaining the degree of Master in Computational Logic.
Abstract:
INTRODUCTION: The goal was to develop an in-house serological method with high specificity and sensitivity for diagnosis and monitoring of Chagas disease morbidity. METHODS: With this purpose, the reactivities of anti-T. cruzi IgG and its subclasses were tested in successive serum dilutions of patients from Berilo municipality, Jequitinhonha Valley, Minas Gerais, Brazil. The performance of the in-house ELISA was also evaluated in samples from other relevant infectious diseases, including HIV, hepatitis C (HCV), syphilis (SYP), visceral leishmaniasis (VL), and American tegumentary leishmaniasis (ATL), and from noninfected controls (NI). Further analysis was performed to evaluate the applicability of this in-house methodology for monitoring Chagas disease morbidity in three groups of patients, classified by clinical status: indeterminate (IND), cardiac (CARD), and digestive/mixed (DIG/Mix). RESULTS: The analysis of total IgG reactivity at serum dilution 1:40 was an excellent approach to Chagas disease diagnosis (100% sensitivity and specificity). The analysis of IgG subclasses showed cross-reactivity, mainly with NI, VL, and ATL, at all selected serum dilutions. Based on the data analysis, the IND group displayed higher IgG3 levels, and the DIG/Mix group presented higher levels of total IgG than the IND and CARD groups. CONCLUSIONS: These findings demonstrate that this methodology has promising applicability in the analysis of anti-T. cruzi IgG reactivity for the differential diagnosis and evaluation of Chagas disease morbidity.
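The 100% figures quoted for total IgG at dilution 1:40 are the usual sensitivity and specificity of a binary test; for reference, the computation from assay counts (placeholder counts, not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Placeholder counts: every Chagas-positive serum reactive and every
# control non-reactive reproduces the quoted 100%/100% figures.
sens, spec = sensitivity_specificity(tp=100, fn=0, tn=120, fp=0)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```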
Abstract:
The present paper focuses on a damage identification method based on the use of the second-order spectral properties of the nodal response processes. The explicit dependence on the frequency content of the outputs' power spectral densities makes them suitable for damage detection and localization. The well-known case study of the Z24 Bridge in Switzerland is chosen to apply and further investigate this technique, with the aim of validating its reliability. Numerical simulations of the dynamic response of the structure subjected to different types of excitation are carried out to assess the variability of the spectrum-driven method with respect to both the type and the position of the excitation sources. The simulated data obtained from random vibrations, impulse, ramp and shaking forces allow the power spectrum matrix to be built, from which the main eigenparameters of the reference and damage scenarios are extracted. Afterwards, complex eigenvectors and real eigenvalues are properly weighted and combined, and a damage index based on the difference between spectral modes is computed to pinpoint the damage. Finally, a group of vibration-based damage identification methods is selected from the literature to compare the results obtained and to evaluate the performance of the spectral index.
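As a rough sketch of the spectrum-driven pipeline described above, the code below estimates the output power spectral density matrix from multichannel response data, extracts the leading eigenvector at the dominant frequency line, and compares reference and damaged spectral modes through a MAC-based index; the index definition and all details are hedged stand-ins, not the paper's exact formulation:

```python
import numpy as np
from scipy.signal import csd

def psd_matrix(Y, fs, nperseg=256):
    """Cross power spectral density matrix S[f, i, j] of multichannel data Y
    (one row per measurement channel), estimated with Welch averaging."""
    m = Y.shape[0]
    freqs, _ = csd(Y[0], Y[0], fs=fs, nperseg=nperseg)
    S = np.zeros((freqs.size, m, m), dtype=complex)
    for i in range(m):
        for j in range(m):
            _, S[:, i, j] = csd(Y[i], Y[j], fs=fs, nperseg=nperseg)
    return freqs, S

def spectral_mode(S):
    """Leading eigenvector of S at the frequency line of maximum total power."""
    k = int(np.argmax(np.trace(S, axis1=1, axis2=2).real))
    _, V = np.linalg.eigh(S[k])
    return V[:, -1]                 # eigenvector of the largest eigenvalue

def damage_index(phi_ref, phi_dam):
    """1 - MAC between reference and damaged spectral modes: zero when the
    modes coincide, increasing as damage alters the spectral shape."""
    num = abs(np.vdot(phi_ref, phi_dam)) ** 2
    den = np.vdot(phi_ref, phi_ref).real * np.vdot(phi_dam, phi_dam).real
    return 1.0 - num / den
```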
Abstract:
Since 1984, DNA tests based on the highly repeated subtelomeric sequences of Plasmodium falciparum (rep 20) have been frequently used in malaria diagnosis. Rep 20 is very specific for this parasite and is made of 21 bp units, organized in repeated blocks with direct and inverted orientation. Based on this particular organization, we selected a unique consensus oligonucleotide (pf-21) to drive a PCR reaction coupled to hybridization with non-radioactively labeled probes. The pf-21 unique oligo PCR (pf-21-I) assay produced DNA amplification fingerprints when applied to purified P. falciparum DNA samples (Brazil and Colombia), as well as to patients' blood samples from a large area of Venezuela. The performance of the pf-21-I assay was compared against Giemsa-stained thick blood smears from samples collected in a malaria-endemic area of Bolívar State, Venezuela, at the field station of Malariología in Tumeremo. Coupled to non-radioactive hybridization, the pf-21-I performed better than the traditional microscopic method, with a ratio r = 1.7:1. In the case of mixed infections, the r value of P. falciparum detection increased to 2.5:1. The increased diagnostic sensitivity of the test produced with this homologous oligonucleotide could provide an alternative for the epidemiological diagnosis of P. falciparum currently used in Venezuelan endemic areas, where low parasitemia levels and asymptomatic malaria are frequent. In addition, the DNA fingerprint could be tested in molecular population studies.
Abstract:
Indirect drug susceptibility tests of Mycobacterium tuberculosis were done to investigate the accuracy and feasibility of a broth microdilution method (BMM) for determining minimal inhibitory concentrations of conventional drugs against M. tuberculosis. Test drugs included isoniazid (H), rifampicin (R), ethambutol (E), streptomycin (S) and pyrazinamide (Z). Fifty isolates of M. tuberculosis from patients who had never received drug therapy, and the H37Rv strain as a control, were evaluated in the system. When comparing this method with the gold-standard proportional method in Lowenstein-Jensen medium, a sensitivity of 100% for all drugs and specificities of 91, 100, 96, 98 and 85% were observed for H, R, E, S and Z, respectively. The BMM was read faster (14-20 days) than the proportional method (20-28 days). The microdilution method evaluated allows the testing of multiple drugs in multiple concentrations. It is easy to perform and does not require special equipment or expensive supplies. In contrast to the radiometric method, it does not use radioactive material.
Abstract:
Jaboticatubas is a municipality in the metropolitan region of Belo Horizonte which has been the target of wide media coverage as "the capital of schistosomiasis" since the 1960s. In order to support a work based on integrated control, we sought to identify the disease determinants at the site. A transversal study was carried out to identify prevalence rates of the disease and factors associated with the infection in the district of São José de Almeida and two nearby localities, Cipó Velho and São José da Serra, all located in the municipality of Jaboticatubas. A parasitological survey was performed, applying the Kato-Katz method with two slides per sample, in 1186 schoolchildren, representing 77% of all registered pupils in four public schools in 2001. Among these schoolchildren, 101 (8.6%) proved positive for Schistosoma mansoni eggs in their stool samples. A total of 64 families whose schoolchildren had tested positive for schistosomiasis also underwent examinations. As a negative control, a random sample was drawn from the 206 families whose children had tested negative for schistosomiasis. The prevalence among the 270 families (1304 people) was 12%. To assess those who continued to have contact with possibly contaminated water, 1061 (81.4%) people of the 270 families were interviewed. A multivariate analysis identified the following factors associated with the infection: time of residence in the area (short period), garbage disposal (use of deserted areas), gender (male), age (from 10 to 29 years), and water contact (daily and weekly). Further analysis of these factors revealed a close correlation between water contact and the disease, with a significantly positive frequency for almost all those items. Depending on gender and age, significant variations of water contact patterns associated with leisure and professional activities were found. A malacological survey of water collections in the area identified snails of the species Biomphalaria straminea and B. glabrata; of the latter, 17 specimens (0.6%) were positive for S. mansoni. Qualitative studies have complemented this evidence, which allowed us to design a reference picture and specific indicators of the disease for the local population. These data provided the essential information to continue the development of an already ongoing educational process, as well as projects on environmental improvements.
Abstract:
This study aimed to standardise an in-house real-time polymerase chain reaction (rtPCR) to allow quantification of hepatitis B virus (HBV) DNA in serum or plasma samples, and to compare this method with two commercial assays, the Cobas Amplicor HBV Monitor and the Cobas AmpliPrep/Cobas TaqMan HBV test. Samples from 397 patients from the state of São Paulo were analysed by all three methods. Fifty-two samples were from patients who were human immunodeficiency virus and hepatitis C virus positive, but HBV negative. Genotypes were characterised and the viral load was measured in each sample. The in-house rtPCR showed an excellent success rate compared with the commercial tests; inter-assay and intra-assay coefficients correlated with the commercial tests (r = 0.96 and r = 0.913, p < 0.001), and the in-house test showed no genotype-dependent differences in detection and quantification rates. The in-house assay tested in this study could be used for screening and quantifying HBV DNA in order to monitor patients during therapy.
Abstract:
We propose a restoration algorithm for band-limited images that considers irregular (perturbed) sampling, denoising, and deconvolution. We explore the application of a family of regularizers that allow us to control the spectral behavior of the solution, combined with the irregular-to-regular sampling algorithms proposed by H.G. Feichtinger, K. Gröchenig, M. Rauth and T. Strohmer. Moreover, the constraints given by the image acquisition model are incorporated as a set of local constraints, and the analysis of these constraints leads to an early stopping rule meant to improve the speed of the algorithm. Finally, we present experiments focused on the restoration of satellite images, where micro-vibrations are responsible for the type of distortions considered here. We compare results of the proposed method with previous methods and show an extension to zoom.
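As a simplified illustration of a regularizer that controls the spectral behavior of the solution, the sketch below performs Tikhonov-regularized deconvolution in the Fourier domain on a regularly sampled image; the irregular-to-regular sampling stage and the local constraints of the actual method are omitted, and all parameters are assumptions:

```python
import numpy as np

def tikhonov_deconvolve(y, psf, lam=1e-2):
    """Frequency-domain Tikhonov restoration: per Fourier coefficient this
    minimizes |H*X - Y|^2 + lam*|X|^2, so lam directly shapes the spectral
    behavior of the solution (heavier damping where the blur kills signal)."""
    H = np.fft.fft2(psf, s=y.shape)       # transfer function of the blur
    Y = np.fft.fft2(y)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft2(X))

# Hypothetical usage: blur a random image with a box PSF, add noise, restore.
rng = np.random.default_rng(1)
x = rng.random((64, 64))
psf = np.ones((5, 5)) / 25.0
y = np.real(np.fft.ifft2(np.fft.fft2(psf, s=x.shape) * np.fft.fft2(x)))
y += 0.01 * rng.standard_normal(y.shape)
x_hat = tikhonov_deconvolve(y, psf, lam=1e-2)
```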