923 results for Search Engine Optimization Methods


Relevance: 30.00%

Abstract:

Objective: To compare measurements of the upper arm cross-sectional areas (total arm area, arm muscle area, and arm fat area) of healthy neonates as calculated using anthropometry with the values obtained by ultrasonography. Materials and methods: This study was performed on 60 consecutively born healthy neonates: gestational age (mean±SD) 39.6±1.2 weeks, birth weight 3287.1±307.7 g, 27 males (45%) and 33 females (55%). Mid-arm circumference and tricipital skinfold thickness measurements were taken on the left upper mid-arm according to the conventional anthropometric method to calculate total arm area, arm muscle area and arm fat area. The ultrasound evaluation was performed at the same arm location using a Toshiba Sonolayer SSA-250A®, which allows the calculation of the total arm area, arm muscle area and arm fat area from the number of pixels enclosed in the plotted areas. Statistical analysis: whenever appropriate, parametric and non-parametric tests were used to compare measurements of paired samples and of groups of samples. Results: No significant differences between males and females were found in any evaluated measurement, estimated either by anthropometry or by ultrasound. The median total arm area also did not differ significantly between the two methods (P=0.337). Although there is evidence of concordance of the total arm area measurements (r=0.68, 95% CI: 0.55–0.77), the two methods of measurement differed for arm muscle area and arm fat area. The estimated median arm muscle area measured by ultrasound was significantly lower than that estimated by the anthropometric method, differing by as much as 111% (P<0.001). The estimated median ultrasound measurement of the arm fat area was higher than the anthropometric arm fat area by as much as 31% (P<0.001). Conclusion: Compared with ultrasound measurements, using skinfold measurements and mid-arm circumference without further correction may lead to overestimation of the cross-sectional area of muscle and underestimation of the cross-sectional area of fat. The correlation between the two methods could be interpreted as an indication for a further search for correction factors in the equations.
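
For reference, the conventional anthropometric method mentioned above derives the three cross-sectional areas from mid-arm circumference and triceps skinfold thickness via the classical Gurney-Jelliffe equations; a minimal sketch (the function name and example values are illustrative, not from the paper):

```python
import math

def arm_areas(circumference_cm, skinfold_cm):
    """Gurney-Jelliffe cross-sectional arm areas (cm^2).

    circumference_cm: mid-arm circumference C
    skinfold_cm: triceps skinfold thickness T
    """
    total = circumference_cm ** 2 / (4 * math.pi)                  # total arm area
    muscle = (circumference_cm - math.pi * skinfold_cm) ** 2 / (4 * math.pi)  # arm muscle area
    fat = total - muscle                                           # arm fat area, by subtraction
    return total, muscle, fat

# Illustrative neonatal values only: C = 10.5 cm, T = 0.45 cm
print(arm_areas(10.5, 0.45))
```

The abstract's conclusion follows directly from this construction: any error in C or T propagates quadratically into the muscle area and, by subtraction, into the fat area.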

Relevance: 30.00%

Abstract:

This dissertation is presented to obtain a Master's degree in Structural and Functional Biochemistry.

Relevance: 30.00%

Abstract:

Dissertation presented to obtain the Master's degree in Computational Logic.

Relevance: 30.00%

Abstract:

Breast cancer is the most common cancer among women and a major public health problem. Worldwide, X-ray mammography is the current gold standard for medical imaging of breast cancer. However, it has some well-known limitations: false-negative rates of up to 66% in symptomatic women and false-positive rates of up to 60% are a continuing source of concern and debate. These drawbacks have prompted the development of other imaging techniques for breast cancer detection, among them Digital Breast Tomosynthesis (DBT). DBT is a 3D radiographic technique that reduces the obscuring effect of tissue overlap and appears to address both the false-negative and the false-positive rates. The 3D images in DBT are only achieved through image reconstruction methods. These methods play an important role in a clinical setting, since there is a need for a reconstruction process that is both accurate and fast. This dissertation deals with the optimization of iterative algorithms through parallel computing on Graphics Processing Units (GPUs), using the Compute Unified Device Architecture (CUDA), to make the 3D reconstruction faster. Iterative algorithms have been shown to produce the highest-quality DBT images, but because they are computationally intensive, their clinical use has so far been rejected. These algorithms have the potential to reduce patient dose in DBT scans. A method of integrating CUDA in Interactive Data Language (IDL) is proposed in order to accelerate the DBT image reconstructions; this method has never before been attempted for DBT. In this work the system matrix calculation, the most computationally expensive part of iterative algorithms, is accelerated. A speedup of 1.6 was achieved, demonstrating that GPUs can accelerate the IDL implementation.
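
To illustrate why the system matrix dominates the cost of iterative reconstruction, here is a toy SIRT-style update in NumPy (a generic sketch, not the dissertation's CUDA-in-IDL code; the matrix A, projection data p and relaxation factor lam are placeholders):

```python
import numpy as np

def sirt(A, p, n_iter=20, lam=0.5):
    """Toy SIRT update: x <- x + lam * C^-1 A^T R^-1 (p - A x).

    A: (n_rays, n_voxels) system matrix; in DBT it is huge, and building
       and applying it is the hotspot a GPU port would target.
    p: measured projection data, one value per ray.
    """
    x = np.zeros(A.shape[1])
    row_sums = A.sum(axis=1); row_sums[row_sums == 0] = 1.0   # R diagonal
    col_sums = A.sum(axis=0); col_sums[col_sums == 0] = 1.0   # C diagonal
    for _ in range(n_iter):
        residual = (p - A @ x) / row_sums        # forward project, compare
        x += lam * (A.T @ residual) / col_sums   # back project the residual
    return x
```

Every iteration applies A and its transpose over the full volume, which is precisely the system-matrix work that the dissertation offloads to the GPU.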

Relevance: 30.00%

Abstract:

INTRODUCTION: This study was developed to evaluate the situation of leprosy in the general population of the municipality of Buriticupu, State of Maranhão, Brazil. METHODS: We used the method of active search to identify new cases from 2008 to 2010. Bacilloscopy of intradermal scrapings was performed in all patients with skin lesions compatible with leprosy, and histopathological examination in those whose clinical form was in doubt. RESULTS: The study included 19,104 individuals, with 42 patients diagnosed with leprosy after clinical examination, representing a detection rate of 219.84 per 100,000 inhabitants. The predominant clinical presentation was tuberculoid, with 24 (57.1%) cases, followed by borderline with 11, indeterminate with four, and lepromatous with three cases. The study also allowed the identification of 81 patients with a history of leprosy and of other skin diseases, such as pityriasis versicolor, dermatophytosis, scabies, vitiligo, and skin carcinoma. The binomial test showed that the proportion of cases in the municipal seat was significantly higher than that in the villages (p = 0.04), and the generalized exact test showed no association between age and clinical form (p = 0.438) or between age and gender (p = 0.083). CONCLUSIONS: The elevated detection rate defines the city as hyperendemic for leprosy; the active search for cases, together with the organization of health services, is an important method for disease control.
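
The detection rate quoted above is a simple proportion scaled to 100,000 inhabitants; a one-line check using the abstract's own figures:

```python
cases, population = 42, 19_104
rate = cases / population * 100_000   # new-case detection rate per 100,000
print(f"{rate:.2f}")                  # 219.85; the abstract's 219.84, up to rounding
```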

Relevance: 30.00%

Abstract:

The goal of this thesis is the investigation and optimization of the synthesis of potential fragrances. The work was carried out as a collaboration between the University of Applied Sciences in Merseburg and the company Miltitz Aromatics GmbH in Bitterfeld-Wolfen (Germany). Fragrance compounds can be synthesized in different ways and by various methods. In this work, methods such as phase-transfer catalysis and the Cope rearrangement were investigated and applied to obtain high yields of the desired substances without by-products or side reactions. This involved studying the syntheses under different process parameters such as temperature, solvent, pressure and reaction time. The main focus was on the Cope rearrangement, a common method in the synthesis of new potential fragrance compounds. The substances synthesized in this work have a hepta-1,5-diene structure and can therefore easily undergo this [3,3]-sigmatropic rearrangement. The lead compound of the research was 2,5-dimethyl-2-vinyl-4-hexenenitrile (Neronil). Neronil is synthesized by alkylation of 2-methyl-3-butenenitrile with prenyl chloride under basic conditions in a phase-transfer system. In this work the yield of isolated Neronil was improved from about 35% to 46% by adjusting the reaction conditions, and the amount of side product was decreased. The synthesized hexenenitrile contains not only the aforementioned 1,5-diene structure but also a cyano group, which makes it a suitable base for the synthesis of new potential fragrance compounds. It was observed that Neronil can be converted into 2,5-dimethyl-2-vinyl-4-hexenoic acid by hydrolysis under basic conditions; after five hours the acid is obtained in 96% yield. The subsequent esterification with isobutanol produces 2,5-dimethyl-2-vinyl-4-hexenoic acid isobutyl ester with quantitative conversion. Neronil and the corresponding ester can be converted into the corresponding Cope products with conversions of 30% and 80%, respectively. When the acid was heated to effect the Cope rearrangement, an unexpected decarboxylated product was formed instead. Reaction progress and product structures were carefully verified by GC-MS, 1H-NMR and 13C-NMR analyses.

Relevance: 30.00%

Abstract:

Phosphorus (P) is becoming a scarce element due to the decreasing availability of primary sources. Therefore, recovering P from secondary sources, e.g. waste streams, has become extremely important. Sewage sludge ash (SSA) is a reliable secondary source of P. However, the direct use of SSAs as fertilizer is heavily restricted by legislation due to the presence of inorganic contaminants; furthermore, the P present in SSAs is not in a plant-available form. The electrodialytic (ED) process is one of the methods under development to recover P and simultaneously remove heavy metals. The present work aimed to optimize P recovery using a two-compartment electrodialytic cell. The research was divided into three independent phases. In the first phase, ED experiments were carried out on two SSAs from different seasons, varying the duration of the ED process (2, 4, 6 and 9 days). During the ED treatment the SSA was suspended in distilled water in the anolyte, which was separated from the catholyte by a cation exchange membrane. From both ashes, 90% of the P was successfully extracted after 6 days of treatment. Regarding heavy metal removal, one of the SSAs performed better than the other. It was therefore possible to conclude that SSAs from different seasons can be submitted to the ED process under the same parameters. In the second phase, the two SSAs were exposed to humidity and air prior to ED in order to carbonate them. Although the carbonation itself was not successful, ED experiments were carried out varying the duration of the treatment (2 and 6 days) and the period of air exposure (7, 14 and 30 days). After 6 days of treatment and 30 days of air exposure, 90% of the phosphorus was successfully extracted from both ashes, and no differences were identified between the air-exposed and the original SSAs. Thus, SSAs that were exposed to air and humidity, e.g. SSAs stored for 30 days in an open deposit, can be treated under the same parameters as SSAs collected directly from the incineration process. In the third phase, ED experiments were carried out for 6 days varying the stirring time (0, 1, 2 and 4 h/day) in order to investigate whether energy could be saved in the stirring process. After 6 days of treatment with 4 h/day stirring, 80% and 90% of the P was successfully extracted from SSA-A and SSA-B, respectively, values very similar to those obtained with 24 h/day stirring.

Relevance: 30.00%

Abstract:

Polysaccharides are gaining increasing attention as potentially environmentally friendly and sustainable building blocks in many fields of the (bio)chemical industry. The microbial production of polysaccharides is envisioned as a promising path, since higher biomass growth rates are possible and therefore higher productivities may be achieved compared to vegetable or animal polysaccharide sources. This Ph.D. thesis focuses on the modeling and optimization of the production of a particular microbial polysaccharide, the extracellular polysaccharide (EPS) produced by the bacterial strain Enterobacter A47. Enterobacter A47 was found to be a metabolically versatile organism in terms of its adaptability to complex media, notably capable of achieving high growth rates in media containing glycerol byproduct from the biodiesel industry. However, the industrial implementation of this production process is still hampered by a largely unoptimized process. Kinetic rates in the bioreactor are heavily dependent on operational parameters such as temperature, pH, stirring and aeration rate. An increase in culture broth viscosity is a common feature of this culture and has a major impact on overall performance. This fact complicates the mathematical modeling of the process, limiting the ability to understand, control and optimize productivity. In order to tackle this difficulty, data-driven mathematical methodologies such as Artificial Neural Networks can be employed to incorporate additional process data and complement the known mathematical description of the fermentation kinetics. In this Ph.D. thesis, we adopted such a hybrid modeling framework, which enabled the incorporation of temperature, pH and viscosity effects on the fermentation kinetics and thereby improved the dynamical modeling and optimization of the process. A model-based optimization method was implemented that enabled the design of optimal bioreactor control strategies in the sense of EPS productivity maximization. It is also critical to understand EPS synthesis at the level of the bacterial metabolism, since the production of EPS is a tightly regulated process. Methods of pathway analysis provide a means to unravel the fundamental pathways and their controls in bioprocesses. In the present Ph.D. thesis, a novel methodology called Principal Elementary Mode Analysis (PEMA) was developed and implemented that enabled the identification of which cellular fluxes are activated under different conditions of temperature and pH. It is shown that differences in these two parameters affect the chemical composition of EPS; hence they are critical for the regulation of product synthesis. In future studies, the knowledge provided by PEMA could foster the development of metabolically meaningful control strategies that target the EPS sugar content and other product quality parameters.
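
The hybrid framework described above couples a mechanistic kinetic backbone with a data-driven correction for temperature, pH and viscosity. A minimal sketch of the idea, assuming Monod-type kinetics and a tiny untrained network (all names, shapes and values are illustrative, not the thesis's model):

```python
import numpy as np

def monod(S, mu_max=0.4, Ks=0.5):
    """Mechanistic part: assumed Monod growth kinetics, rate vs. substrate S."""
    return mu_max * S / (Ks + S)

class TinyNet:
    """Data-driven part: correction factor from (temperature, pH, viscosity)."""
    def __init__(self, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W1, self.b1 = rng.normal(size=(5, 3)), np.zeros(5)
        self.W2, self.b2 = rng.normal(size=(1, 5)), np.zeros(1)

    def __call__(self, env):
        h = np.tanh(self.W1 @ env + self.b1)
        return float(np.exp((self.W2 @ h + self.b2)[0]))  # positive multiplier

def hybrid_rate(S, env, net):
    """Hybrid kinetics: known rate law scaled by a learned environment factor."""
    return monod(S) * net(env)

net = TinyNet()  # weights would be fitted to fermentation data in practice
print(hybrid_rate(S=2.0, env=np.array([30.0, 7.0, 0.1]), net=net))
```

The design point is that the mechanistic term preserves known structure while the network absorbs only what the rate law cannot explain.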

Relevance: 30.00%

Abstract:

Earthworks involve the levelling or shaping of a target area through the moving or processing of the ground surface. Most construction projects require earthworks, which are heavily dependent on mechanical equipment (e.g., excavators, trucks and compactors). Earthworks are often the most costly and time-consuming component of infrastructure construction (e.g., roads, railways and airports), and current pressure for higher productivity and safety highlights the need to optimize them, which is a nontrivial task. Most previous attempts at tackling this problem focus on single-objective optimization of partial processes or aspects of earthworks, overlooking the advantages of a multi-objective and global optimization. This work describes a novel optimization system based on an evolutionary multi-objective approach, capable of globally optimizing several objectives simultaneously and dynamically. The proposed system views an earthwork construction as a production line, where the goal is to optimize resources under two crucial criteria (cost and duration), and focuses the evolutionary search (non-dominated sorting genetic algorithm II, NSGA-II) on compaction allocation, using linear programming to distribute the remaining equipment (e.g., excavators). Several experiments were carried out using real-world data from a Portuguese construction site, showing that the proposed system is quite competitive when compared with current manual earthwork equipment allocation.
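
The core of the NSGA-II search referenced above is non-dominated sorting over the two criteria; a compact sketch of that ranking step (a quadratic illustration, not the algorithm's optimized bookkeeping, with made-up (cost, duration) pairs):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better
    in at least one; both objectives (e.g. cost, duration) are minimized."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Group points into Pareto fronts; front 0 holds the best trade-offs."""
    fronts, remaining = [], list(points)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Illustrative (cost, duration) pairs for candidate equipment allocations
print(non_dominated_sort([(3, 9), (5, 7), (4, 8), (6, 6), (7, 8)]))
```

Solutions in front 0 are the cost/duration trade-offs no other allocation improves on in both criteria at once.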

Relevance: 30.00%

Abstract:

PhD Thesis in Bioengineering

Relevance: 30.00%

Abstract:

The artificial fish swarm algorithm has recently emerged in continuous global optimization. It uses points of a population in space to identify the position of fish in the school. Many real-world optimization problems are described by 0-1 multidimensional knapsack problems, which are NP-hard, and in the last decades several exact as well as heuristic methods have been proposed for solving them. In this paper, a new simplified binary version of the artificial fish swarm algorithm is presented, where a point/fish is represented by a binary string of 0/1 bits. Trial points are created by using crossover and mutation in the different fish behaviors, which are randomly selected using two user-defined probability values. In order to make the points feasible, the presented algorithm uses a random heuristic drop-item procedure followed by an add-item procedure aiming to increase the profit by adding more items to the knapsack (see the sketch below). A cyclic reinitialization of 50% of the population, and a simple local search that allows the progress of a small percentage of points towards optimality and then refines the best point in the population, greatly improve the quality of the solutions. The presented method is tested on a set of benchmark instances and compared with other methods available in the literature. The comparison shows that the proposed method can be an alternative for solving these problems.
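
A minimal sketch of such a drop/add repair operator for the 0-1 multidimensional knapsack (a simplified illustration under assumed data structures, not the paper's exact procedure; the profit-sorted add phase is one plausible choice for "aiming to increase the profit"):

```python
import random

def repair(x, profits, weights, capacities, rng=random.Random(0)):
    """Make a 0/1 point feasible (random drops), then greedily add items.

    x: list of 0/1 bits, one per item
    weights[j][i]: amount of resource j consumed by item i
    capacities[j]: capacity of resource j
    """
    m, n = len(capacities), len(x)
    used = [sum(weights[j][i] * x[i] for i in range(n)) for j in range(m)]

    # Drop phase: remove randomly chosen packed items until all constraints hold.
    packed = [i for i in range(n) if x[i]]
    rng.shuffle(packed)
    while any(used[j] > capacities[j] for j in range(m)) and packed:
        i = packed.pop()
        x[i] = 0
        for j in range(m):
            used[j] -= weights[j][i]

    # Add phase: insert profitable items that still fit, most profitable first.
    for i in sorted(range(n), key=lambda i: -profits[i]):
        if not x[i] and all(used[j] + weights[j][i] <= capacities[j] for j in range(m)):
            x[i] = 1
            for j in range(m):
                used[j] += weights[j][i]
    return x
```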

Relevance: 30.00%

Abstract:

The Electromagnetism-like (EM) algorithm is a population-based stochastic global optimization algorithm that uses an attraction-repulsion mechanism to move sample points towards the optimum. In this paper, an implementation of the EM algorithm in the Matlab environment, as a useful function for practitioners and for those who want to experiment with a new global optimization solver, is proposed. A set of benchmark problems is solved in order to evaluate the performance of the implemented method when compared with other stochastic methods available in the Matlab environment. The results confirm that our implementation is a competitive alternative both in terms of numerical results and performance. Finally, a case study based on a parameter estimation problem of a biological system shows that the EM implementation can be applied with promising results in the control optimization area.
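
For orientation, the attraction-repulsion mechanism assigns each point a charge from its objective value and moves points along the resulting net force. A condensed sketch of one EM move (written in Python rather than the paper's Matlab; bound handling and the customary exemption of the best point from movement are omitted for brevity):

```python
import numpy as np

def em_step(X, f):
    """One Electromagnetism-like move: charge, force, normalized displacement.

    X: (m, n) array of sample points; f: objective function to minimize.
    """
    m, n = X.shape
    vals = np.array([f(x) for x in X])
    best = vals.min()
    # Charges: points with lower objective values receive larger charges.
    q = np.exp(-n * (vals - best) / (vals.sum() - m * best + 1e-12))
    F = np.zeros_like(X)
    for i in range(m):
        for j in range(m):
            if i == j:
                continue
            d = X[j] - X[i]
            r2 = d @ d + 1e-12
            if vals[j] < vals[i]:
                F[i] += q[i] * q[j] * d / r2   # attraction toward better point
            else:
                F[i] -= q[i] * q[j] * d / r2   # repulsion from worse point
    step = np.random.uniform(size=(m, 1))
    norms = np.linalg.norm(F, axis=1, keepdims=True) + 1e-12
    return X + step * F / norms                # move along the unit force
```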

Relevance: 30.00%

Abstract:

Integrated Master's dissertation in Mechanical Engineering.

Relevance: 30.00%

Abstract:

OBJECTIVE: To report the hemodynamic and functional responses obtained with clinical optimization guided by hemodynamic parameters in patients with severe and refractory heart failure. METHODS: Invasive hemodynamic monitoring using right heart catheterization aimed to reach low filling pressures and low peripheral resistance. Frequent adjustments of intravenous diuretics and vasodilators were performed according to the hemodynamic measurements. RESULTS: We assessed 19 patients (age 48±12 years, ejection fraction 21±5%) with severe heart failure. The intravenous use of diuretics and vasodilators reduced pulmonary artery occlusion pressure by 12 mm Hg (a relative reduction of 43%; P<0.001), with a concomitant increment of 6 mL per beat in stroke volume (a relative increment of 24%; P<0.001). We observed significant associations of pulmonary artery occlusion pressure with mean pulmonary artery pressure (r=0.76; P<0.001) and with central venous pressure (r=0.63; P<0.001). After clinical optimization, functional class improved (P<0.001), with a tendency towards improvement in ejection fraction and no impairment of renal function. CONCLUSION: Optimization guided by hemodynamic parameters in patients with refractory heart failure provides a significant improvement in the hemodynamic profile with concomitant improvement in functional class. This study emphasizes that adjustments in blood volume result in immediate benefits for patients with severe heart failure.

Relevance: 30.00%

Abstract:

This thesis describes a search for very high energy (VHE) gamma-ray emission from the starburst galaxy IC 342. The analysis was based on data from the 2003–2004 observing season recorded using the Whipple 10-metre imaging atmospheric Cherenkov telescope located on Mount Hopkins in southern Arizona. IC 342 may be classed as a non-blazar type galaxy, and to date only a few such galaxies (M 87, Cen A, M 82 and NGC 253) have been detected as VHE gamma-ray sources. Analysis of approximately 24 hours of good-quality IC 342 data, consisting entirely of ON/OFF observations, was carried out using a number of methods (standard Supercuts, optimised Supercuts, scaled optimised Supercuts and the multivariate kernel analysis technique). No evidence for TeV gamma-ray emission from IC 342 was found. The significance was 0.6 σ, with a nominal rate of 0.04 ± 0.06 gamma rays per minute. The flux upper limit above 600 GeV (at 99.9% confidence) was determined to be 5.5 × 10⁻⁸ m⁻² s⁻¹, corresponding to 8% of the Crab Nebula flux in the same energy range.