930 results for "Method of linear transformations"


Relevance: 100.00%

Abstract:

In this thesis we use statistical physics techniques to study the typical performance of four families of error-correcting codes based on very sparse linear transformations: Sourlas codes, Gallager codes, MacKay-Neal codes and Kanter-Saad codes. We map the decoding problem onto an Ising spin system with many-spin interactions. We then employ the replica method to calculate averages over the quenched disorder represented by the code constructions, the arbitrary messages and the random noise vectors. We find, as the noise level increases, a phase transition between successful decoding and failure phases. This phase transition coincides with upper bounds derived in the information theory literature in most cases. We connect the practical decoding algorithm known as probability propagation with the task of finding local minima of the related Bethe free energy. We show that the practical decoding thresholds correspond to noise levels where suboptimal minima of the free energy emerge. Simulations of practical decoding scenarios using probability propagation agree with theoretical predictions of the replica-symmetric theory. The typical performance predicted by the thermodynamic phase transitions is shown to be attainable in computation times that grow exponentially with the system size. We use the insights obtained to design a method to calculate the performance and optimise the parameters of the high-performance codes proposed by Kanter and Saad.
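For orientation, a schematic form of this mapping in notation assumed here (not quoted from the thesis): each received K-bit check J couples the candidate message bits S_i = ±1 through a multi-spin Hamiltonian whose low-energy states are the most probable decodings,

\[
\mathcal{H}(\mathbf{S}) \;=\; -\sum_{\langle i_1 \cdots i_K \rangle} \mathcal{A}_{i_1 \cdots i_K}\, J_{i_1 \cdots i_K}\, S_{i_1} \cdots S_{i_K}, \qquad S_i \in \{-1, +1\},
\]

where \(\mathcal{A}\) is a very sparse tensor selecting which K-tuples of bits enter each check; the quenched averages of the replica calculation run over \(\mathcal{A}\), the message and the noise.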

Relevance: 100.00%

Abstract:

Numerical techniques have been finding increasing use in all aspects of fracture mechanics, and often provide the only means for analyzing fracture problems. The work presented here is concerned with the application of the finite element method to cracked structures. The present work was directed towards the establishment of a comprehensive two-dimensional finite element, linear elastic, fracture analysis package. Significant progress has been made to this end, and features which can now be studied include multi-crack-tip mixed-mode problems involving partial crack closure. The crack-tip core element was refined and special local crack-tip elements were employed to reduce the element density in the neighbourhood of the core region. The work builds upon experience gained by previous research workers and, as part of the general development, the program was modified to incorporate the eight-node isoparametric quadrilateral element. Also, a more flexible solving routine was developed, which provided a very compact method of solving large sets of simultaneous equations stored in a segmented form. To complement the finite element analysis programs, an automatic mesh generation program has been developed, which enables complex problems involving fine element detail to be investigated with a minimum of input data. The scheme has proven to be versatile and reasonably easy to implement. Numerous examples are given to demonstrate the accuracy and flexibility of the finite element technique.
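The quantities such a package extracts are the stress intensity factors of linear elastic fracture mechanics. The standard near-tip asymptotics (a textbook relation, not specific to this thesis) that special crack-tip elements are built to reproduce is

\[
\sigma_{ij}(r,\theta) \;=\; \frac{K_I}{\sqrt{2\pi r}}\, f^{I}_{ij}(\theta) \;+\; \frac{K_{II}}{\sqrt{2\pi r}}\, f^{II}_{ij}(\theta) \;+\; O(1),
\]

so displacements vary as \(\sqrt{r}\) near the tip, which singular local elements capture with far fewer degrees of freedom than uniform mesh refinement.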

Relevance: 100.00%

Abstract:

We investigate two numerical procedures for the Cauchy problem in linear elasticity, involving the relaxation of either the given boundary displacements (Dirichlet data) or the prescribed boundary tractions (Neumann data) on the over-specified boundary, in the alternating iterative algorithm of Kozlov et al. (1991). The two mixed direct (well-posed) problems associated with each iteration are solved using the method of fundamental solutions (MFS), in conjunction with the Tikhonov regularization method, while the optimal value of the regularization parameter is chosen via the generalized cross-validation (GCV) criterion. An efficient regularizing stopping criterion, which terminates the iterative procedure at the point where the accumulation of noise becomes dominant and the errors in predicting the exact solution increase, is also presented. The MFS-based iterative algorithms with relaxation are tested on Cauchy problems for isotropic linear elastic materials in various geometries to confirm the numerical convergence, stability, accuracy and computational efficiency of the proposed method.
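A minimal sketch of the regularization step named above, assuming the MFS collocation matrix A and the (noisy) boundary data b are already assembled; this illustrates Tikhonov regularization with a GCV-chosen parameter, not the authors' full alternating algorithm.

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas):
    """Tikhonov-regularized least squares, with the regularization
    parameter chosen by generalized cross-validation (GCV)."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b                         # data in the singular basis
    n = len(b)
    best = None
    for lam in lambdas:
        f = s**2 / (s**2 + lam)            # Tikhonov filter factors
        # squared residual: filtered part plus component outside range(A)
        resid = np.sum(((1 - f) * beta)**2) + max(np.sum(b**2) - np.sum(beta**2), 0.0)
        gcv = n * resid / (n - np.sum(f))**2
        if best is None or gcv < best[0]:
            x = Vt.T @ (f * beta / s)      # regularized solution
            best = (gcv, lam, x)
    return best                            # (GCV value, lambda, solution)

# usage sketch: gcv, lam, x = tikhonov_gcv(A, b, np.logspace(-12, 0, 60))
```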

Relevance: 100.00%

Abstract:

Objective: Development and validation of a selective and sensitive LC-MS method for the determination of methotrexate polyglutamates in dried blood spots (DBS). Methods: DBS samples (spiked or patient samples) were prepared by applying blood to Guthrie cards, which were then dried at room temperature. The method utilised 6-mm disks punched from the DBS samples (equivalent to approximately 12 μl of whole blood). The simple treatment procedure was based on protein precipitation using perchloric acid, followed by solid-phase extraction using MAX cartridges. The extracted sample was chromatographed using a reversed-phase system involving an Atlantis T3-C18 column (3 μm, 2.1 × 150 mm) preceded by an Atlantis guard column of matching chemistry. Analytes were subjected to LC-MS analysis using positive electrospray ionization. Key Results: The method was linear over the range 5-400 nmol/L. The limits of detection and quantification were 1.6 and 5 nmol/L for individual polyglutamates and 1.5 and 4.5 nmol/L for total polyglutamates, respectively. The method was applied successfully to the determination of methotrexate polyglutamates in DBS finger-prick samples from 47 paediatric patients, and the results were confirmed against concentrations measured in matched RBC samples using a conventional HPLC-UV technique. Conclusions and Clinical Relevance: The methodology has potential for application in a range of clinical studies (e.g. pharmacokinetic evaluations or medication adherence assessment) since it is minimally invasive and easy to perform, potentially allowing parents to take blood samples at home. The feasibility of DBS sampling can be of major value for future clinical trials or clinical care in paediatric rheumatology. © 2014 Hawwa et al.
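For illustration, one standard (ICH-style) way to derive detection and quantification limits from a calibration line; the calibration points below are made up, since the abstract reports only the final figures, and this is not necessarily the exact procedure the authors used.

```python
import numpy as np

# Hypothetical calibration data (nmol/L vs. peak-area ratio).
conc = np.array([5, 25, 50, 100, 200, 400], dtype=float)
resp = np.array([0.021, 0.102, 0.198, 0.405, 0.792, 1.603])

slope, intercept = np.polyfit(conc, resp, 1)
resid = resp - (slope * conc + intercept)
sigma = resid.std(ddof=2)           # SD of the regression residuals

lod = 3.3 * sigma / slope           # ICH-style limit of detection
loq = 10.0 * sigma / slope          # ICH-style limit of quantification
print(f"LOD ~ {lod:.1f} nmol/L, LOQ ~ {loq:.1f} nmol/L")
```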

Relevance: 100.00%

Abstract:

An HPLC method has been developed and validated for the rapid determination of mercaptopurine and four of its metabolites (thioguanine, thiouric acid, thioxanthine and methylmercaptopurine) in plasma and red blood cells. The method involves a simple treatment procedure based on deproteinisation by perchloric acid, followed by acid hydrolysis and heating for 45 min at 100 °C. The developed method was linear over the concentration range studied, with a correlation coefficient >0.994 for all compounds in both plasma and erythrocytes. The lower limits of quantification were 13, 14, 3, 2 and 95 pmol/8 × 10^8 RBCs and 2, 5, 2, 3 and 20 ng/ml plasma for thioguanine, thiouric acid, mercaptopurine, thioxanthine and methylmercaptopurine, respectively. The method described is selective and sensitive enough to analyse the different metabolites in a single run under isocratic conditions. Furthermore, it has been shown to be applicable for monitoring these metabolites in paediatric patients owing to its low volume requirement (200 μl of plasma or erythrocytes), and it has been successfully applied to investigating population pharmacokinetics, pharmacogenetics and non-adherence to therapy in these patients.
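A small sketch of the linearity check quoted above (acceptance: r > 0.994), again with hypothetical calibration points since none are given in the abstract.

```python
import numpy as np

# Hypothetical calibration points for one metabolite (ng/ml vs. response).
conc = np.array([2, 10, 50, 100, 250, 500], dtype=float)
resp = np.array([0.9, 4.6, 23.8, 47.1, 119.0, 236.5])

r = np.corrcoef(conc, resp)[0, 1]   # Pearson correlation coefficient
print(f"r = {r:.4f}, passes linearity criterion: {r > 0.994}")
```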

Relevance: 100.00%

Abstract:

An increasing number of publications on the dried blood spot (DBS) sampling approach for the quantification of drugs and metabolites have been spurred on by the inherent advantages of this sampling technique. In the present research, a selective and sensitive high-performance liquid chromatography method for the concurrent determination of multiple antiepileptic drugs (AEDs) [levetiracetam (LVT), lamotrigine (LTG), phenobarbital (PHB), carbamazepine (CBZ) and its active metabolite carbamazepine-10,11-epoxide (CBZE)] in a single DBS has been developed and validated. Whole blood was spotted onto Guthrie cards and dried. Using a standard punch (6 mm diameter), a circular disc was punched from the card and extracted with methanol:acetonitrile (3:1, v/v) containing hexobarbital (internal standard), then sonicated prior to evaporation. The extract was then dissolved in water and vortex-mixed before undergoing solid-phase extraction using HLB cartridges. Chromatographic separation of the AEDs was achieved using a Waters XBridge™ C18 column with a gradient system. The developed method was linear over the concentration ranges studied, with r ≥ 0.995 for all compounds. The lower limits of quantification (LLOQs) were 2, 1, 2, 0.5 and 1 μg/mL for LVT, LTG, PHB, CBZE and CBZ, respectively. Accuracy (%RE) and precision (%CV) values within and between days were <20% at the LLOQs and <15% at all other concentrations tested. This method was successfully applied to the analysis of the AEDs in DBS samples taken from children with epilepsy for the assessment of their adherence to prescribed treatments.
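As an illustration of the acceptance criteria quoted above, a minimal sketch of how %RE (accuracy) and %CV (precision) are computed from replicate QC measurements; the replicate values below are hypothetical.

```python
import numpy as np

def accuracy_precision(measured, nominal):
    """%RE (relative error of the mean) and %CV (coefficient of
    variation) for replicates at one nominal concentration."""
    m = np.asarray(measured, dtype=float)
    re = 100.0 * (m.mean() - nominal) / nominal
    cv = 100.0 * m.std(ddof=1) / m.mean()
    return re, cv

# Hypothetical LLOQ-level replicates for LTG (nominal 1 ug/mL)
re, cv = accuracy_precision([0.92, 1.08, 1.11, 0.95, 1.04], nominal=1.0)
print(f"%RE = {re:.1f}, %CV = {cv:.1f}")  # acceptance: <20% at the LLOQ
```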

Relevance: 100.00%

Abstract:

Highway bridges are of great value to a country because, in the event of a natural disaster, they may serve as lifelines. Since they are vulnerable to significant seismic loads, different methods can be considered to design resistant highway bridges and to rehabilitate existing ones. In this study, base isolation is considered as one efficient method in this regard, which in some cases significantly reduces the seismic load effects on the structure. By reducing the ductility demand on the structure without a notable increase in strength, the structure is designed to remain elastic under seismic loads. The problem associated with isolated bridges, especially those on elastomeric bearings, can be their excessive displacements under service and seismic loads. This can defeat the purpose of using elastomeric bearings for small- to medium-span typical bridges, where expansion joints and clearances may result in a significant increase in initial and maintenance costs. Thus, supplementing the structure with dampers that provide some stiffness can serve as a solution, which in turn, however, may increase the structure's base shear. The main objective of this thesis is to provide a simplified method for the evaluation of optimal parameters for dampers in isolated bridges. Firstly, through a parametric study, some directions are given for the use of simple isolation devices such as elastomeric bearings to rehabilitate existing bridges of high importance. Parameters such as the geometry of the bridge, code provisions and the type of soil on which the structure is constructed were applied to a typical two-span bridge. It is concluded that the stiffness of the substructure, the soil type and special provisions in the code can determine whether base isolation is employed for the retrofitting of bridges. Secondly, based on the elastic response coefficient of isolated bridges, a simplified design method for dampers in seismically isolated regular highway bridges is presented. By setting objectives for the reduction of displacement and the variation of base shear, the required stiffness and damping of a hysteretic damper can be determined. Numerical analyses of a typical two-span bridge model were carried out to verify the effectiveness of the method. The method was used to identify equivalent linear parameters and, subsequently, nonlinear parameters of the hysteretic damper for various designated scenarios of displacement and base shear requirements. Comparison of the results of the nonlinear numerical model with and without the damper showed that the method is sufficiently accurate. Finally, an innovative and simple hysteretic steel damper was designed. Five specimens were fabricated from two steel grades and were tested together with a full-scale elastomeric isolator in the structural laboratory of the Université de Sherbrooke. The test procedure was to characterize the specimens by cyclic displacement-controlled tests and subsequently to test them by the real-time dynamic substructuring (RTDS) method. The test results were then used to establish a numerical model of the system, which was subjected to nonlinear time-history analyses under several earthquakes. The outcomes of the experimental and numerical studies showed acceptable conformity with the simplified method.
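The "equivalent linear parameters" of such a device are conventionally obtained from one force-displacement cycle; in the standard notation assumed here (not quoted from the thesis):

\[
k_{\mathrm{eff}} \;=\; \frac{F^{+} - F^{-}}{d^{+} - d^{-}}, \qquad \xi_{\mathrm{eq}} \;=\; \frac{E_D}{2\pi\, k_{\mathrm{eff}}\, d_{\max}^{2}},
\]

where \(F^{\pm}\) and \(d^{\pm}\) are the peak forces and displacements of the cycle, \(E_D\) is the energy dissipated per cycle (the hysteresis loop area), and \(d_{\max}\) is the maximum displacement.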

Relevance: 100.00%

Abstract:

Linear cascade testing serves a fundamental role in the research, development and design of turbomachines, as it is a simple yet very effective way to assess the performance of a generic blade geometry. These kinds of experiments are usually carried out in specialized wind tunnel facilities. This thesis deals with the numerical characterization and subsequent partial redesign of the S-1/C Continuous High Speed Wind Tunnel of the von Karman Institute for Fluid Dynamics. The current facility is powered by a 13-stage axial compressor that is not powerful enough to balance the energy loss experienced when testing low-turning airfoils. In order to address this issue, a performance assessment of the wind tunnel was performed under several flow regimes via numerical simulations. After that, a redesign proposal aimed at reducing the pressure loss was investigated. It consists of a linear cascade of turning blades, placed downstream of the test section and designed specifically for the type of linear cascade being tested. An automatic design procedure was created, taking as input the parameters measured at the outlet of the cascade. The parametrization method employed Bézier curves to produce an airfoil geometry that could be imported into CAD software so that a cascade could be designed. The proposal was simulated via CFD analysis and proved to be effective in reducing pressure losses by up to 41%. The tool developed in this thesis could be adopted to design similar apparatuses and could also be optimized and specialized for the design of turbomachinery components.
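A minimal sketch of the Bézier evaluation underlying such a parametrization, via the de Casteljau recursion; the control points below are made up for illustration, since the thesis's actual parameter set is not given here.

```python
import numpy as np

def bezier(control_pts, n=200):
    """Evaluate a Bézier curve by the de Casteljau recursion.
    control_pts: (k, 2) array of (hypothetical) airfoil control points."""
    pts = np.asarray(control_pts, dtype=float)
    out = np.empty((n, pts.shape[1]))
    for i, t in enumerate(np.linspace(0.0, 1.0, n)):
        p = pts.copy()
        while len(p) > 1:                 # repeated linear interpolation
            p = (1 - t) * p[:-1] + t * p[1:]
        out[i] = p[0]
    return out

# e.g. a crude camber-line sketch from four control points
curve = bezier([[0.0, 0.0], [0.3, 0.15], [0.7, 0.12], [1.0, 0.0]])
```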

Relevance: 100.00%

Abstract:

Modern High-Performance Computing (HPC) systems are gradually increasing in size and complexity due to the corresponding demand for larger simulations requiring more complicated tasks and higher accuracy. However, as a side effect of Dennard scaling approaching its ultimate power limit, the efficiency of software also plays an important role in increasing the overall performance of a computation. Tools to measure application performance in these increasingly complex environments provide insights into the intricate ways in which software and hardware interact. Monitoring power consumption in order to save energy is possible through processor interfaces such as Intel's Running Average Power Limit (RAPL). Given the low level of these interfaces, they are often paired with an application-level tool such as the Performance Application Programming Interface (PAPI). Since several problems in many heterogeneous fields can be represented as a complex linear system, an optimized and scalable linear-system solver can significantly decrease the time spent computing its solution. One of the most widely used algorithms deployed for the resolution of large simulations is Gaussian elimination, whose most popular implementation for HPC systems is in the Scalable Linear Algebra PACKage (ScaLAPACK) library. Another relevant algorithm, which is gaining popularity in the academic field, is the Inhibition Method. This thesis compares the energy consumption of the Inhibition Method and of Gaussian elimination from ScaLAPACK by profiling their execution during the resolution of linear systems on the HPC architecture offered by CINECA. Moreover, it also collates the energy and power values for different rank, node and socket configurations. The monitoring tools employed to track the energy consumption of these algorithms are PAPI and RAPL, integrated with the parallel execution of the algorithms managed with the Message Passing Interface (MPI).
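As an illustration of the kind of measurement involved, a minimal sketch that samples the package-energy counter RAPL exposes through the Linux powercap sysfs interface; the thesis itself accesses RAPL through PAPI, for which this stands in, and `run_solver` is a hypothetical placeholder for the profiled linear-system solve.

```python
import time

def run_solver():
    """Placeholder for the linear-system solve being profiled."""
    sum(i * i for i in range(10**7))  # stand-in workload

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-0 counter

def read_uj(path=RAPL):
    with open(path) as f:
        return int(f.read())          # cumulative energy in microjoules

e0, t0 = read_uj(), time.time()
run_solver()
e1, t1 = read_uj(), time.time()
# NB: the counter wraps at max_energy_range_uj; wrap handling omitted.
joules = (e1 - e0) / 1e6
print(f"energy: {joules:.2f} J, mean power: {joules / (t1 - t0):.2f} W")
```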

Relevance: 100.00%

Abstract:

Privacy issues and data scarcity in the PET field call for efficient methods to expand datasets via the synthetic generation of new data that cannot be traced back to real patients and that are also realistic. In this thesis, machine learning techniques were applied to 1001 amyloid-beta PET images, each of which had undergone a diagnosis of Alzheimer's disease: the evaluations were 540 positive, 457 negative and 4 unknown. The Isomap algorithm was used as a manifold learning method to reduce the dimensions of the PET dataset; a numerical scale-free interpolation method was applied to invert the dimensionality reduction map. The interpolant was tested on the PET images via LOOCV, where the removed images were compared with the reconstructed ones using the mean SSIM index (MSSIM = 0.76 ± 0.06). The effectiveness of this measure is questioned, since it indicated slightly higher performance for a comparison method using PCA (MSSIM = 0.79 ± 0.06), which gave clearly poorer-quality reconstructed images than those recovered by the numerical inverse mapping. Ten synthetic PET images were generated and, after being mixed with ten originals, were sent to a team of clinicians for a visual assessment of their realism; no significant agreement was found either between the clinicians and the true image labels or among the clinicians, meaning that original and synthetic images were indistinguishable. The future perspective of this thesis points to the improvement of the amyloid-beta PET research field by increasing the available data, overcoming the constraints of data acquisition and privacy issues. Potential improvements can be achieved via refinements of the manifold learning and inverse mapping stages of the PET image analysis, by exploring different combinations in the choice of algorithm parameters and by applying other non-linear dimensionality reduction algorithms. A final prospect of this work is the search for new methods to assess image reconstruction quality.
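A sketch of the evaluation loop under stated assumptions: `images` is an (N, H, W) array of co-registered PET slices, scikit-learn's Isomap provides the embedding, scikit-image's SSIM the comparison, and `invert` stands in for the thesis's numerical inverse map, which is not available as a library call.

```python
import numpy as np
from sklearn.manifold import Isomap
from skimage.metrics import structural_similarity as ssim

def loocv_mssim(images, invert, n_components=10):
    """Leave-one-out reconstruction quality: embed N-1 images, project
    the held-out one, invert the embedding, and score with SSIM."""
    X = images.reshape(len(images), -1)
    scores = []
    for i in range(len(X)):
        train = np.delete(X, i, axis=0)
        emb = Isomap(n_components=n_components).fit(train)
        y = emb.transform(X[i:i + 1])          # embed the held-out image
        rec = invert(emb, y).reshape(images.shape[1:])
        scores.append(ssim(images[i], rec,
                           data_range=images[i].max() - images[i].min()))
    return np.mean(scores), np.std(scores)     # MSSIM and its spread
```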

Relevance: 100.00%

Abstract:

A fully automated methodology was developed for the determination of the thyroid hormones levothyroxine (T4) and liothyronine (T3). The proposed method exploits the formation of highly coloured charge-transfer (CT) complexes between these compounds, acting as electron donors, and π-acceptors such as chloranilic acid and 2,3-dichloro-5,6-dicyano-p-benzoquinone (DDQ). For automation of the analytical procedure, a simple, fast and versatile single interface flow system (SIFA) was implemented, guaranteeing simplified performance optimisation, low maintenance and cost-effective operation. Moreover, the single reaction interface provided a convenient and straightforward means of implementing Job's method of continuous variations, used to establish the stoichiometry of the formed CT complexes. Linear calibration plots for levothyroxine and liothyronine concentrations ranging from 5.0 × 10^-5 to 2.5 × 10^-4 mol L^-1 and from 1.0 × 10^-5 to 1.0 × 10^-4 mol L^-1, respectively, were obtained, with good precision (R.S.D. <4.6% and <3.9%) and a determination frequency of 26 h^-1 for both drugs. The results obtained for pharmaceutical formulations were statistically comparable to the declared hormone amounts, with relative deviations lower than 2.1%. The accuracy was confirmed by recovery studies, which furnished recovery values ranging from 96.3% to 103.7% for levothyroxine and of 100.1% for liothyronine. (C) 2009 Elsevier B.V. All rights reserved.
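A small sketch of Job's method of continuous variations as used here: the absorbance of the CT complex is recorded while the donor mole fraction x varies at constant total concentration, and the x at maximum absorbance gives the complex stoichiometry. The readings below are hypothetical.

```python
import numpy as np

# Hypothetical Job's-plot readings: donor mole fraction vs. absorbance.
x = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
A = np.array([0.12, 0.25, 0.36, 0.44, 0.49, 0.43, 0.34, 0.23, 0.11])

a, b, c = np.polyfit(x, A, 2)       # parabolic fit around the maximum
x_max = -b / (2 * a)                # mole fraction at peak absorbance
ratio = x_max / (1 - x_max)         # donor:acceptor stoichiometric ratio
print(f"maximum at x = {x_max:.2f} -> ~{ratio:.1f}:1 complex")
```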

Relevance: 100.00%

Abstract:

In this paper, the Galerkin method and the Askey-Wiener scheme are used to obtain approximate solutions to the stochastic displacement response of Kirchhoff plates with uncertain parameters. Theoretical and numerical results are presented. The Lax-Milgram lemma is used to express the conditions for existence and uniqueness of the solution. Uncertainties in plate and foundation stiffness are modeled respecting these conditions, hence using Legendre polynomials indexed in uniform random variables. The space of approximate solutions is built using results on density between the space of continuous functions and Sobolev spaces. Approximate Galerkin solutions are compared with results of Monte Carlo simulation, in terms of first and second order moments and of histograms of the displacement response. Numerical results for two example problems show very fast convergence to the exact solution, with excellent accuracy. The Askey-Wiener Galerkin scheme developed herein is able to reproduce the histogram of the displacement response. The scheme is shown to be a theoretically sound and efficient method for the solution of stochastic problems in engineering. (C) 2009 Elsevier Ltd. All rights reserved.
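In outline, using standard polynomial-chaos notation assumed here (not quoted from the paper): the response is expanded in Legendre polynomials of the uniform random variables \(\boldsymbol{\xi}\) that parametrize the uncertain stiffnesses, and the Galerkin condition determines the deterministic coefficient fields,

\[
w(x,\boldsymbol{\xi}) \;\approx\; \sum_{i=0}^{P} w_i(x)\,\psi_i(\boldsymbol{\xi}), \qquad \mathbb{E}\!\left[\, R\Big(\sum_i w_i\,\psi_i\Big)\, \psi_j \right] = 0, \quad j = 0,\dots,P,
\]

where the \(\psi_i\) are multivariate Legendre polynomials, orthogonal with respect to the uniform measure, and \(R(\cdot)\) denotes the residual of the plate equilibrium equation.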

Relevance: 100.00%

Abstract:

Urban solid residues consist of food remains, grass, leaves, fruit peelings, paper, cardboard, rubber, plastic, etc. The organic fraction, which represents about 50%, yields biogas and leachate during decomposition, both of which are sources of pollution. Residue samples were collected from landfill cells of several different ages, together with the corresponding leachate, and both were submitted to thermal analysis after treatment. Kinetic parameters were determined using the Flynn-Wall-Ozawa method. A linear relation between the two kinetic parameters (ln A and E) was verified for the urban organic residue samples, but not for the leachate sample. This difference can be attributed to the constituents present in the leachate.
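For reference, the Flynn-Wall-Ozawa relation behind this analysis, in its standard form based on Doyle's approximation: at a fixed conversion \(\alpha\), plots of \(\log \beta\) against \(1/T\) for several heating rates \(\beta\) are straight lines whose slope yields the activation energy \(E\),

\[
\log \beta \;=\; \log\!\left(\frac{A\,E}{R\,g(\alpha)}\right) \;-\; 2.315 \;-\; 0.4567\,\frac{E}{R\,T},
\]

where \(A\) is the pre-exponential factor, \(R\) the gas constant and \(g(\alpha)\) the integral form of the reaction model.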

Relevance: 100.00%

Abstract:

The purpose of this work was to assess the degradation of linear alkylbenzene sulfonate (LAS) in a horizontal-flow anaerobic immobilized biomass (HAIB) reactor. The reactor was filled with polyurethane foam on which sludge from a sanitary sewage treatment plant was immobilized. The hydraulic detention time (HDT) used in the experiments was 12 h. The reactor was fed with a synthetic substrate (410 mg l^-1 of meat extract, 115 mg l^-1 of starch, 80 mg l^-1 of saccharose, 320 mg l^-1 of sodium bicarbonate and 5 ml l^-1 of salt solution) in the following stages of operation: SI, synthetic substrate; SII, synthetic substrate with 7 mg l^-1 of LAS; SIII, synthetic substrate with 14 mg l^-1 of LAS; and SIV, synthetic substrate containing yeast extract (substituting meat extract) and 14 mg l^-1 of LAS, without starch. At the end of the experiment (313 days), a degradation of approximately 35% of LAS was achieved. The higher the concentration of LAS, the greater the amount of foam required for its adsorption; this is because the isotherm of LAS adsorption onto the foam is linear over the studied concentrations (2 to 50 mg l^-1). Microscopic analyses of the biofilm revealed diverse microbial morphologies, while denaturing gradient gel electrophoresis (DGGE) profiling showed variations in the populations of total bacteria and sulphate-reducing bacteria (SRB). 16S rRNA gene sequencing and phylogenetic analyses revealed that members of the order Clostridiales were the major components of the bacterial community in the last stage of reactor operation.
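The linear (Henry-type) isotherm referred to, in the usual notation assumed here: the adsorbed amount per unit mass of foam grows proportionally with the equilibrium liquid-phase concentration, so the required foam mass scales with the LAS load,

\[
q_e \;=\; K_d\, C_e,
\]

where \(q_e\) is the LAS adsorbed per gram of foam, \(C_e\) the equilibrium concentration in the liquid phase, and \(K_d\) the partition coefficient.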

Relevance: 100.00%

Abstract:

This study aimed to determine the efficiency of an anaerobic stirred sequencing-batch reactor containing granular biomass for the degradation of linear alkylbenzene sulfonate (LAS), a surfactant present in household detergent. The bioreactor was monitored for LAS concentrations in the influent, effluent and sludge, as well as for pH, chemical oxygen demand, bicarbonate alkalinity, total solids and volatile solids. The degradation of LAS was found to be higher in the absence of co-substrates (53%) than in their presence (24-37%). Using the polymerase chain reaction and denaturing gradient gel electrophoresis (PCR/DGGE), we identified populations of microorganisms from the Bacteria and Archaea domains. Among the Bacteria, we identified uncultivated populations of Arcanobacterium spp. (94%) and Opitutus spp. (96%). Among the Archaea, we identified Methanospirillum spp. (90%), Methanosaeta spp. (98%) and Methanobacterium spp. (96%). The presence of methanogenic microorganisms shows that LAS did not inhibit anaerobic digestion. Sampling at the last stage of reactor operation recovered 61 clones belonging to the domain Bacteria. These represented a variety of phyla: 34% shared significant homology with Bacteroidetes, 18% with Proteobacteria, 11% with Verrucomicrobia, 8% with Fibrobacteres, 2% with Acidobacteria, 3% with Chlorobi and Firmicutes, and 1% with Acidobacteres and Chloroflexi. A small fraction of the clones (13%) were not related to any phylum. Published by Elsevier Ltd.