894 results for Conventional Methods


Relevance: 60.00%

Abstract:

In this paper we propose and analyze a hybrid $hp$ boundary element method for the solution of problems of high frequency acoustic scattering by sound-soft convex polygons, in which the approximation space is enriched with oscillatory basis functions which efficiently capture the high frequency asymptotics of the solution. We demonstrate, both theoretically and via numerical examples, exponential convergence with respect to the order of the polynomials, moreover providing rigorous error estimates for our approximations to the solution and to the far field pattern, in which the dependence on the frequency of all constants is explicit. Importantly, these estimates prove that, to achieve any desired accuracy in the computation of these quantities, it is sufficient to increase the number of degrees of freedom in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods.
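The scaling claim above can be illustrated numerically: a conventional boundary element method keeps a fixed number of points per wavelength, so its degrees of freedom N grow linearly with the wavenumber k, while the hybrid hp method needs only logarithmic growth. The perimeter, points-per-wavelength rule and the constant `c` below are illustrative assumptions, not values from the paper.

```python
import math

def dofs_conventional(k, ppw=10):
    # Conventional BEM: fixed points per wavelength, so N grows linearly in k.
    # Perimeter L = 1 and ppw = 10 are illustrative assumptions.
    L = 1.0
    return math.ceil(ppw * k * L / (2 * math.pi))

def dofs_hybrid(k, c=20):
    # Hybrid hp-BEM: N need only grow like log(k) for fixed accuracy
    # (the constant c is purely illustrative).
    return math.ceil(c * math.log(k))

for k in (10, 100, 1000, 10000):
    print(k, dofs_conventional(k), dofs_hybrid(k))
```

Even with these made-up constants, the gap between the two growth rates widens rapidly as the frequency increases.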

Relevance: 60.00%

Abstract:

In this paper we propose and analyse a hybrid numerical-asymptotic boundary element method for the solution of problems of high frequency acoustic scattering by a class of sound-soft nonconvex polygons. The approximation space is enriched with carefully chosen oscillatory basis functions; these are selected via a study of the high frequency asymptotic behaviour of the solution. We demonstrate via a rigorous error analysis, supported by numerical examples, that to achieve any desired accuracy it is sufficient for the number of degrees of freedom to grow only in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods. This appears to be the first such numerical analysis result for any problem of scattering by a nonconvex obstacle. Our analysis is based on new frequency-explicit bounds on the normal derivative of the solution on the boundary and on its analytic continuation into the complex plane.

Relevance: 60.00%

Abstract:

There is an increasing demand in higher education institutions for training in complex environmental problems. Such training requires a careful mix of conventional methods and innovative solutions, a task not always easy to accomplish. In this paper we review literature on this theme, highlight relevant advances in the pedagogical literature, and report on some examples resulting from our recent efforts to teach complex environmental issues. The examples range from full credit courses in sustainable development and research methods to project-based and in-class activity units. A consensus from the literature is that lectures are not sufficient to fully engage students in these issues. A conclusion from the review of examples is that problem-based and project-based learning, e.g., through case studies, experiential learning opportunities, or real-world applications, offers much promise. This could greatly be facilitated by online hubs through which teachers, students, and other members of the practitioner and academic community share experiences in teaching and research, the way that we have done here.

Relevance: 60.00%

Abstract:

An ability to quantify the reliability of probabilistic flood inundation predictions is a requirement not only for guiding model development but also for their successful application. Probabilistic flood inundation predictions are usually produced by choosing a method of weighting the model parameter space, but previous studies suggest that this choice leads to clear differences in inundation probabilities. This study aims to address the evaluation of the reliability of these probabilistic predictions. However, the lack of an adequate number of observations of flood inundation for a catchment limits the application of conventional methods of evaluating predictive reliability. Consequently, attempts have been made to assess the reliability of probabilistic predictions using multiple observations from a single flood event. Here, a LISFLOOD-FP hydraulic model of an extreme (>1 in 1000 years) flood event in Cockermouth, UK, is constructed and calibrated using multiple performance measures from both peak flood wrack mark data and aerial photography captured post-peak. These measures are used in weighting the parameter space to produce multiple probabilistic predictions for the event. Two methods of assessing the reliability of these probabilistic predictions using limited observations are utilized: an existing method assessing the binary pattern of flooding, and a method developed in this paper to assess predictions of water surface elevation. This study finds that the water surface elevation method has both a better diagnostic and discriminatory ability, but this result is likely to be sensitive to the unknown uncertainties in the upstream boundary condition.
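As a hedged illustration of the kind of reliability assessment discussed above: one common check asks how often point observations fall inside a stated central band of a weighted-ensemble prediction. The ensemble, the synthetic "observations" and the 90% band below are invented stand-ins, not the paper's LISFLOOD-FP outputs or either of its two assessment methods.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble of predicted water surface elevations (m):
# rows = weighted parameter sets, columns = wrack-mark survey points.
ensemble = rng.normal(loc=10.0, scale=0.3, size=(200, 50))
observed = rng.normal(loc=10.0, scale=0.3, size=50)  # synthetic "observations"

# Reliability check: how often does the observation fall inside the
# central 90% band of the probabilistic prediction?
lo = np.percentile(ensemble, 5, axis=0)
hi = np.percentile(ensemble, 95, axis=0)
coverage = np.mean((observed >= lo) & (observed <= hi))
print(f"90% band coverage: {coverage:.2f}")
```

A reliable probabilistic prediction would give coverage close to the nominal 0.90; systematic over- or under-spread shows up as coverage far from that value.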

Relevance: 60.00%

Abstract:

This study investigated the presence of potentially human pathogenic strains of Vibrio spp., Aeromonas spp., Escherichia coli, Salmonella spp. and Staphylococcus aureus in fish commercialized in street markets of São Paulo city, Brazil. Twenty fish of different species were analyzed for foodborne pathogens using conventional methods. High levels of fecal contamination were detected in 25% of samples. S. aureus was isolated from 10% of samples. All samples were negative for Salmonella. Vibrio species, including Vibrio cholerae non-O1/non-O139, were observed in 85% of samples, although Vibrio parahaemolyticus was not found in this study. Aeromonas spp., including A. hydrophila, were isolated from 50% of fish samples. The occurrence of these pathogens suggests that the fish commercialized in São Paulo may represent a health risk to consumers.

Relevance: 60.00%

Abstract:

Direct analysis, with minimal sample pretreatment, of the antidepressant drugs fluoxetine, imipramine, desipramine, amitriptyline, and nortriptyline in biofluids was developed with a total run time of 8 min. The setup consists of two HPLC pumps, an injection valve, a capillary RAM-ADS-C18 pre-column and a capillary analytical C18 column connected by means of a six-port valve in backflush mode. Detection was performed with ESI-MS/MS and only 1 µL of sample was injected. Validation was adequately carried out using FLU-d(5) as internal standard. Calibration curves were constructed over a linear range of 1-250 ng mL(-1) in plasma, with the limit of quantification (LOQ) determined as 1 ng mL(-1) for all analytes. With the described approach it was possible to reach a quantified mass sensitivity of 0.3 pg for each analyte (equivalent to 1.1-1.3 fmol), translating to lower sample consumption (on the order of 10(3) times less sample than conventional methods). (C) 2008 Elsevier B.V. All rights reserved.

Relevance: 60.00%

Abstract:

Quantifying precipitation is made difficult by the extreme randomness of the phenomenon in nature. Conventional methods for measuring precipitation work by spatializing the precipitation measured pointwise at rain gauge stations over the whole area of interest; thus, a network with a large number of well-distributed stations across the area of interest is necessary for a satisfactory result. However, the scarcity of rain gauge stations and the poor spatial distribution of the few existing ones is notorious, not only in Brazil but in vast areas of the globe. In this context, precipitation estimates based on remote sensing and geoprocessing techniques aim to leverage the existing rain gauge stations through a spatialization based on physical criteria. Moreover, remote sensing is the most capable tool for generating precipitation estimates over the oceans and over the vast continental areas devoid of any kind of rainfall information. This work investigated the use of remote sensing and geoprocessing techniques for precipitation estimates in southern Brazil. Three computer algorithms were tested, using images from channels 1, 3 and 4 (visible, water vapor and infrared) of the GOES 8 satellite (Geostationary Operational Environmental Satellite 8) provided by the Centro de Previsão de Tempo e Estudos Climáticos of the Instituto Nacional de Pesquisas Espaciais. The study area covered the whole state of Rio Grande do Sul, using daily rainfall data from 142 stations in 1998. The algorithms seek to identify precipitable clouds in order to build statistical models correlating the daily and ten-day precipitation observed on the ground with certain physical characteristics of the clouds accumulated over the same time period and at the same geographic position as each rain gauge considered.
The decision criteria guiding the algorithms were based on cloud-top temperature (through the thermal infrared), reflectance in the visible channel, neighborhood characteristics, and the temperature x temperature-gradient plane. The results obtained by the statistical models are expressed as precipitation maps per time interval, which can be compared with precipitation maps obtained by conventional means.
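A minimal sketch, under invented thresholds, of the cloud-screening step described above: flag pixels as candidate precipitable cloud from the thermal-infrared brightness temperature and visible reflectance, before any statistical model is fitted against gauge data. The threshold values here are illustrative, not the calibrated criteria of the tested algorithms.

```python
# Cold, bright pixels are taken as candidate raining cloud; t_max and r_min
# are illustrative placeholders, not the paper's calibrated values.
def precipitable(tb_ir_k, vis_reflectance, t_max=235.0, r_min=0.4):
    """True if the pixel looks like precipitable cloud."""
    return tb_ir_k <= t_max and vis_reflectance >= r_min

# (IR brightness temperature in K, visible reflectance) for four pixels:
pixels = [(220.0, 0.6), (250.0, 0.5), (230.0, 0.3), (210.0, 0.8)]
flags = [precipitable(t, r) for t, r in pixels]
print(flags)  # [True, False, False, True]
```

In the full scheme, accumulated flags per gauge location and time window would be the predictors correlated with observed rainfall.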

Relevance: 60.00%

Abstract:

This work presents a proposal for a voltage and frequency control system for a wind power induction generator. An experimental structure was developed, composed basically of a three-phase induction machine, a three-phase capacitor bank and a static reactive power compensator controlled by hysteresis. Control algorithms were developed using conventional methods (PI control) and linguistic methods (using concepts of fuzzy logic and fuzzy control) to compare their performances in the variable speed generator system. The control loop was designed using an AD/DA PCL 818 board in a Pentium 200 MHz computer. The induction generator mathematical model was studied through the Park transformation. Simulations were carried out in the PSpice software to verify the system characteristics in transient and steady-state situations. The real-time control program was developed in C language, making it possible to verify the algorithm performance in the 2.2 kW didactic experimental system.
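Purely as a hedged illustration of the conventional (PI) branch of the comparison above: a minimal discrete PI loop driving a crude first-order plant toward a voltage setpoint. The gains, the 220 V setpoint and the plant model are invented for the sketch and are not the paper's values or its compensator dynamics.

```python
# Minimal discrete PI controller sketch; all numbers are illustrative.
def make_pi(kp, ki, dt):
    state = {"integral": 0.0}
    def step(setpoint, measured):
        error = setpoint - measured
        state["integral"] += error * dt          # accumulate integral term
        return kp * error + ki * state["integral"]
    return step

pi = make_pi(kp=0.5, ki=2.0, dt=0.001)
voltage = 200.0  # initial measured terminal voltage (V), illustrative
for _ in range(5000):
    u = pi(220.0, voltage)   # control action on the reactive compensator
    voltage += 0.01 * u      # crude first-order plant response
print(round(voltage, 1))
```

The integral term removes the steady-state error that a proportional-only controller would leave; a fuzzy controller replaces the fixed-gain law with rule-based, nonlinear gain scheduling.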

Relevance: 60.00%

Abstract:

Numerous studies have investigated non-destructive methods for the evaluation of materials and their application to materials with complex matrices, such as wood. One of the first non-destructive methods investigated for such applications was transverse vibration. Despite its simple conception, and despite the great advances obtained in this area with other methods, such as ultrasound, the transverse vibration method for determining the modulus of elasticity of wood shows great application potential, above all because of the precision of its associated mathematical model and the possibility of applying it to structural-sized pieces (in-grade testing). This work presents the use of this method to determine the modulus of elasticity of three eucalyptus species. Specimens of 2 cm x 2 cm x 46 cm of E. grandis, E. saligna and E. citriodora were tested non-destructively and by conventional mechanical bending tests. The non-destructive tests were carried out using the BING system (Beam Identification by Non-destructive Grading), which allows the analysis of the vibrations of the material in the time and frequency domains. The results showed good correlation between the two types of tests, justifying the start of tests with structural-sized pieces to make the technique viable in structural grading practice.
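The mathematical model behind transverse vibration testing can be sketched as follows: for a prismatic beam in free-free flexural vibration, Euler-Bernoulli theory ties the first natural frequency to the modulus of elasticity, so E can be back-calculated from a measured frequency. The free-free boundary condition, the frequency and the density below are illustrative assumptions, not the BING system's actual configuration or the paper's measurements.

```python
import math

def modulus_from_frequency(f1, length, width, height, density):
    """E (Pa) from the first free-free transverse natural frequency f1 (Hz)."""
    lam1 = 4.730  # first free-free mode root of cos(x)*cosh(x) = 1
    area = width * height
    inertia = width * height ** 3 / 12.0  # second moment of a rectangle
    # f1 = (lam1**2 / (2*pi*L**2)) * sqrt(E*I / (rho*A))  =>  solve for E
    return (2 * math.pi * f1 * length ** 2 / lam1 ** 2) ** 2 * density * area / inertia

# 2 cm x 2 cm x 46 cm specimen as in the abstract; f1 and density are invented.
E = modulus_from_frequency(f1=400.0, length=0.46, width=0.02, height=0.02,
                           density=800.0)  # eucalyptus-like density, kg/m^3
print(f"E ~ {E / 1e9:.1f} GPa")
```

The precision of this closed-form model is one reason the method correlates well with conventional static bending tests.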

Relevance: 60.00%

Abstract:

Nanocellulose consists of the crystalline domains obtained from renewable cellulosic sources and is used to increase mechanical properties and biodegradability in polymer composites. The aim of this work was to study how high pressure defibrillation and chemical purification affect the PALF fibre morphology from micro to nanoscale. Microscopy techniques and X-ray diffraction were used to study the structure and properties of the prepared nanofibers and composites. Microscopy studies showed that the individualization processes used lead to a unique morphology of interconnected web-like structure of PALF fibers. The produced nanofibers were bundles of cellulose fibers with widths ranging between 5 and 15 nm and estimated lengths of several micrometers. The percentage yield and aspect ratio of the nanofibers obtained by this technique are found to be very high in comparison with other conventional methods. The nanocomposites were prepared by compression moulding, stacking the nanocellulose fibre mats between polyurethane films. The results showed that the nanofibrils reinforced the polyurethane efficiently. The addition of 5 wt% of cellulose nanofibrils to PU increased the strength by nearly 300% and the stiffness by 2600%. The developed composites were utilized to fabricate various versatile medical implants. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance: 60.00%

Abstract:

Conventional methods for solving the nonlinear blind source separation problem generally use a series of restrictions to obtain the solution, often leading to imperfect separation of the original sources and high computational cost. In this paper, we propose an alternative measure of independence based on information theory and use artificial intelligence tools to solve linear and, later, nonlinear blind source separation problems. In the linear model, we apply genetic algorithms with Rényi's negentropy as the independence measure to find a separation matrix from linear mixtures of waveform, audio, and image signals. A comparison is made with two types of Independent Component Analysis algorithms widespread in the literature. Subsequently, we use the same independence measure as the cost function in the genetic algorithm to recover source signals that were mixed by nonlinear functions, using a radial basis function artificial neural network. Genetic algorithms are powerful global search tools and are therefore well suited to blind source separation problems. Tests and analyses are carried out through computer simulations.
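A hedged sketch of the linear branch of the approach: a toy genetic algorithm searching for the demixing rotation of whitened two-channel mixtures. The absolute kurtosis used as fitness here is a simple stand-in for the Rényi negentropy measure used in the paper; the sources, mixing matrix and GA settings are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two independent non-Gaussian sources, linearly mixed.
n = 2000
s = np.vstack([np.sign(rng.standard_normal(n)) * rng.uniform(0.5, 1.5, n),
               rng.uniform(-1, 1, n)])
A = np.array([[0.8, 0.6], [-0.5, 0.9]])
x = A @ s

# Whiten the mixtures so the demixing matrix reduces to a rotation by theta.
x = x - x.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(x))
z = (E / np.sqrt(d)) @ E.T @ x

def fitness(theta):
    # Non-Gaussianity of the rotated outputs; |kurtosis| stands in here for
    # the Rényi-negentropy independence measure used in the paper.
    W = np.array([[np.cos(theta), np.sin(theta)],
                  [-np.sin(theta), np.cos(theta)]])
    y = W @ z
    k = np.mean(y ** 4, axis=1) - 3.0
    return float(np.sum(np.abs(k)))

# Minimal genetic algorithm over the rotation angle.
pop = rng.uniform(0, np.pi, 30)
for _ in range(40):
    scores = np.array([fitness(t) for t in pop])
    parents = pop[np.argsort(scores)[-10:]]                       # selection
    pop = rng.choice(parents, 30) + rng.normal(0, 0.05, 30)       # mutation
best = max(pop, key=fitness)
print(round(fitness(best), 2))
```

Because whitening reduces the 2x2 problem to a single angle, the GA's global search is cheap here; the paper's contribution is using the same evolutionary machinery with a negentropy cost on the much harder nonlinear mixing case.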

Relevance: 60.00%

Abstract:

Petroleum is a complex combination of various classes of hydrocarbons, with paraffinic, naphthenic and aromatic compounds being those most commonly found in its composition. The recent changes in the world scenario, the large reserves of heavy oils and the lack of new discoveries of large petroleum fields indicate that, in the near future, oil recovery by conventional methods will be limited. In order to increase the efficiency of the extraction process, enhanced recovery methods are cited in applications where conventional techniques have proven to be of little effectiveness. The injection of surfactant solutions as an enhanced recovery method is advantageous in that surfactants are able to reduce the interfacial tension between water and oil, thus augmenting the displacement efficiency and, as a consequence, increasing the recovery factor. This work aims to investigate the effects of some parameters that influence surfactant behavior in solution, namely the type of surfactant, the critical micelle concentration (CMC) and the surface and interfacial tensions between fluids. Seawater solutions containing the surfactants PAN, PHN and PJN were prepared for presenting lower interfacial tensions with petroleum and higher stability under increasing temperature and salinity. They were examined in an experimental apparatus designed to assess the recovery factor. Botucatu (Brazil) sandstone plug samples were submitted to assay steps comprising saturation with seawater and petroleum, conventional recovery with seawater and enhanced recovery with surfactant solutions. The plugs had porosities between 29.6 and 32.0%, with an average effective permeability to water of 83 mD. The PJN surfactant, at a concentration 1000% above its CMC in water, had the highest recovery factor, causing an extra 20.97% of the original oil in place to be recovered after conventional recovery with seawater.
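As a hedged illustration of the CMC concept used above: the critical micelle concentration is commonly estimated as the breakpoint where surface tension stops falling with the logarithm of surfactant concentration. The curve below is synthetic; it does not reproduce PAN/PHN/PJN data.

```python
import numpy as np

# Synthetic surface tension (mN/m) vs. log10 concentration, with a
# breakpoint (the CMC) placed at log10(c) = -2.5.
logc = np.linspace(-4, -1, 30)
cmc_true = -2.5
gamma = np.where(logc < cmc_true,
                 35.0 - 15.0 * (logc - cmc_true),  # descending branch below CMC
                 35.0)                             # plateau above CMC
gamma = gamma + np.random.default_rng(0).normal(0, 0.1, logc.size)  # noise

# Fit a line to each branch and take their intersection as the CMC estimate.
pre = logc < -2.8
post = logc > -2.2
a1, b1 = np.polyfit(logc[pre], gamma[pre], 1)
a2, b2 = np.polyfit(logc[post], gamma[post], 1)
log_cmc = (b2 - b1) / (a1 - a2)
print(f"estimated log10(CMC): {log_cmc:.2f}")  # close to the true -2.5
```

Above the CMC, added surfactant forms micelles rather than further lowering interfacial tension, which is why dosing relative to the CMC (e.g. the 1000%-above-CMC solution above) is the natural way to specify injection concentrations.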

Relevance: 60.00%

Abstract:

Anhydrous ethanol is used in the chemical, pharmaceutical and fuel industries. However, current processes for obtaining it involve high cost, high energy demand and the use of toxic and polluting solvents. This problem occurs due to the formation of an azeotropic mixture of ethanol + water, which does not allow complete separation by conventional methods such as simple distillation. As an alternative to currently used processes, this study proposes the use of ionic liquids as solvents in extractive distillation. These are organic salts which are liquid at low temperatures (below 373.15 K). They exhibit characteristics such as low volatility (almost negligible vapor pressure), thermal stability and low corrosiveness, which make them interesting for applications such as catalysts and entrainers. In this work, experimental data for the vapor pressure of pure ethanol and water in the pressure range of 20 to 101 kPa were obtained, as well as vapor-liquid equilibrium (VLE) data for the system ethanol + water at atmospheric pressure and equilibrium data for ethanol + water + 2-HDEAA (2-hydroxydiethanolamine acetate) at strategic points in the diagram. The device used for these experiments was the Fischer ebulliometer, together with density measurements to determine phase compositions. The experimental data were consistent with literature data and presented thermodynamic consistency, thus the methodology was properly validated. The results were favorable, showing an increase of the ethanol concentration in the vapor phase, although the increase was not pronounced. The predictive model COSMO-SAC (COnductor-like Screening MOdels Segment Activity Coefficient) proposed by Lin & Sandler (2002) was studied for predicting the vapor-liquid equilibrium of ethanol + water + ionic liquid systems at atmospheric pressure. This is an alternative for predicting phase equilibria, especially for substances of recent interest such as ionic liquids, because neither experimental data nor functional group parameters (as in the UNIFAC method) are needed.
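A hedged numerical aside on the phase-equilibrium calculations discussed above: pure-component vapor pressures from the Antoine equation, combined by Raoult's law for an ideal mixture. The Antoine coefficients are common literature values (log10 of P in mmHg, T in deg C) and should be treated as illustrative; note that this ideal-solution calculation predicts no azeotrope, which is exactly why non-ideal liquid-phase activity coefficients (e.g. from COSMO-SAC) are needed for ethanol + water.

```python
import math

# Antoine coefficients (A, B, C): log10(P/mmHg) = A - B / (C + T/degC).
# Common literature values, used here purely as an illustration.
ANTOINE = {
    "ethanol": (8.20417, 1642.89, 230.300),
    "water":   (8.07131, 1730.63, 233.426),
}

def psat_kpa(component, t_celsius):
    a, b, c = ANTOINE[component]
    p_mmhg = 10 ** (a - b / (c + t_celsius))
    return p_mmhg * 0.133322  # mmHg -> kPa

# Ideal (Raoult) bubble pressure of an equimolar liquid at 78 deg C:
x_eth = 0.5
p = x_eth * psat_kpa("ethanol", 78.0) + (1 - x_eth) * psat_kpa("water", 78.0)
print(f"ideal bubble pressure at 78 C: {p:.1f} kPa")
```

Multiplying each partial-pressure term by an activity coefficient (gamma_i, predicted by COSMO-SAC from molecular sigma profiles) is what lets the model reproduce the azeotrope and the entrainer effect of the ionic liquid.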

Relevance: 60.00%

Abstract:

This work presents a spray-dryer designed to dry oxalate-niobate precursors, suitable for the production of niobium carbide. The dryer was intended to produce powders of controlled particle size. First, the precursor is dissolved in water to produce a solution of known concentration; this solution is then atomized in the spray-dryer to produce the powder. The equipment consists of a 304 stainless steel chamber, 0.48 m x 1.9 m (diameter x length), with a conical shape at the lower portion, which is assembled on a vertical platform. The chamber is heated by three 4 kW electrical resistances. In this process, the drying air is heated as it flows inside a serpentine surrounding the chamber, contrary to more traditional processes in which hot drying air is used to heat the component. The air enters the chamber at the same temperature as the chamber, thus avoiding adherence of particles to the internal surface. The low speed flow is concurrent, directed from the top to the bottom of the chamber. Powders are deposited on a 0.4 m diameter tray, which separates the cylindrical portion from the conical portion of the chamber. The humid air is discharged through a plug placed underneath the collecting tray. A factorial experimental plan was prepared to study the influence of five parameters (concentration, input flow, operating temperature, drying air flow and spray air flow) on the characteristics of the powders produced. Particle size distribution and shape were measured by laser granulometry and scanning electron microscopy. The powders were then submitted to reaction in a CH4 / H2 atmosphere to compare the characteristics of spray-dried powders with powders synthesized by conventional methods.
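The factorial planning mentioned above can be sketched as a full two-level design over the five stated parameters. The factor names follow the abstract; the low/high level labels are placeholders, not the paper's actual settings.

```python
from itertools import product

# Five factors at two levels each -> a full 2**5 factorial design.
factors = {
    "concentration":   ("low", "high"),
    "input_flow":      ("low", "high"),
    "temperature":     ("low", "high"),
    "drying_air_flow": ("low", "high"),
    "spray_air_flow":  ("low", "high"),
}
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 2**5 = 32 experimental runs
```

A full design of 32 runs lets main effects and interactions on particle size and shape be estimated; fractional designs (e.g. 2**(5-1)) would halve the run count at the cost of confounding some interactions.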

Relevance: 60.00%

Abstract:

In this work, cellulose nanofibers were extracted from banana fibers via a steam explosion technique. The chemical composition, morphology and thermal properties of the nanofibers were characterized to investigate their suitability for use in bio-based composite material applications. Chemical characterization of the banana fibers confirmed that the cellulose content was increased from 64% to 95% due to the application of alkali and acid treatments. Assessment of fiber chemical composition before and after chemical treatment showed evidence for the removal of non-cellulosic constituents such as hemicelluloses and lignin that occurred during steam explosion, bleaching and acid treatments. Surface morphological studies using SEM and AFM revealed that there was a reduction in fiber diameter during steam explosion followed by acid treatments. The percentage yield and aspect ratio of the nanofibers obtained by this technique are found to be very high in comparison with other conventional methods. TGA and DSC results showed that the developed nanofibers exhibit enhanced thermal properties over the untreated fibers. (C) 2010 Elsevier Ltd. All rights reserved.