Abstract:
The Earth's climate is a highly dynamic and complex system in which atmospheric aerosols have been increasingly recognized to play a key role. Aerosol particles affect the climate through a multitude of processes, directly by absorbing and reflecting radiation and indirectly by changing the properties of clouds. Because of this complexity, quantification of the effects of aerosols continues to be a highly uncertain science. Better understanding of the effects of aerosols requires more information on aerosol chemistry. Before the determination of aerosol chemical composition by the various available analytical techniques, aerosol particles must be reliably sampled and prepared. Indeed, sampling is one of the most challenging steps in aerosol studies, since all available sampling techniques harbor drawbacks. In this study, novel methodologies were developed for sampling and determination of the chemical composition of atmospheric aerosols. In the particle-into-liquid sampler (PILS), aerosol particles grow in saturated water vapor and are then impacted and dissolved in liquid water. Once in water, the aerosol sample can be transported and analyzed by various off-line or on-line techniques. In this study, the PILS was modified and the sampling procedure was optimized to obtain less altered aerosol samples with good time resolution. A combination of denuders with different coatings was tested to adsorb gas-phase compounds before the PILS. Mixtures of water with alcohols were introduced to increase the solubility of aerosols. The minimum sampling time required was determined by collecting samples off-line every hour and proceeding with liquid-liquid extraction (LLE) and analysis by gas chromatography-mass spectrometry (GC-MS). The laboriousness of LLE followed by GC-MS analysis prompted an evaluation of solid-phase extraction (SPE) for the extraction of aldehydes and acids in aerosol samples. These two compound groups are thought to be key for aerosol growth. Octadecylsilica, hydrophilic-lipophilic balance (HLB), and mixed phase anion exchange (MAX) materials were tested as extraction materials. MAX proved to be efficient for acids, but no tested material offered sufficient adsorption for aldehydes. Thus, PILS samples were extracted only with MAX to guarantee good results for organic acids determined by high-performance liquid chromatography-mass spectrometry (HPLC-MS). On-line coupling of SPE with HPLC-MS is relatively easy, and here on-line coupling of the PILS with HPLC-MS through an SPE trap produced some interesting data on relevant acids in atmospheric aerosol samples. A completely different approach to aerosol sampling, namely differential mobility analyzer (DMA)-assisted filter sampling, was employed in this study to provide information about the size-dependent chemical composition of aerosols and understanding of the processes driving aerosol growth from nano-size clusters to climatically relevant particles (>40 nm). The DMA was set to sample particles with diameters of 50, 40, and 30 nm, and aerosols were collected on Teflon or quartz fiber filters. To clarify the gas-phase contribution, zero gas-phase samples were collected by switching off the DMA during alternating 15-minute periods. Gas-phase compounds were adsorbed equally well on both types of filter and were found to contribute significantly to the total compound mass. Gas-phase adsorption is especially significant during the collection of nanometer-size aerosols and always needs to be taken into account.
A further aim of this study was to determine the oxidation products of β-caryophyllene (the major sesquiterpene in the boreal forest) in aerosol particles. Since reference compounds are needed for verification of the accuracy of analytical measurements, three oxidation products of β-caryophyllene were synthesized: β-caryophyllene aldehyde, β-nocaryophyllene aldehyde, and β-caryophyllinic acid. All three were identified for the first time in ambient aerosol samples, at relatively high concentrations, and their contribution to the aerosol mass (and probably growth) was concluded to be significant. The methodological and instrumental developments presented in this work enable a fuller understanding of the processes behind biogenic aerosol formation and provide new tools for more precise determination of biosphere-atmosphere interactions.
Abstract:
Methodologies are presented for minimization of risk in a river water quality management problem. A risk minimization model is developed to minimize the risk of low water quality along a river in the face of conflict among various stakeholders. The model consists of three parts: a water quality simulation model, a risk evaluation model with uncertainty analysis, and an optimization model. Sensitivity analysis, First Order Reliability Analysis (FORA) and Monte Carlo simulations are performed to evaluate the fuzzy risk of low water quality. Fuzzy multiobjective programming is used to formulate the multiobjective model. Probabilistic Global Search Lausanne (PGSL), a recently developed global search algorithm, is used for solving the resulting non-linear optimization problem. The algorithm is based on the assumption that better sets of points are more likely to be found in the neighborhood of good sets of points, and it therefore intensifies the search in the regions that contain good solutions. Another model is developed for risk minimization, which deals only with the moments of the generated probability density functions of the water quality indicators. Suitable skewness values of the water quality indicators, which lead to low fuzzy risk, are identified. Results of the models are compared with the results of a deterministic fuzzy waste load allocation model (FWLAM) when the methodologies are applied to the case study of the Tunga-Bhadra river system in southern India, with a steady-state BOD-DO model. The fractional removal levels resulting from the risk minimization model are slightly higher, but result in a significant reduction in the risk of low water quality. (c) 2005 Elsevier Ltd. All rights reserved.
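To illustrate the risk evaluation step described above, the following Python sketch estimates a fuzzy risk of low water quality by Monte Carlo simulation of an uncertain dissolved oxygen (DO) level at a checkpoint. The membership breakpoints, the assumed normal uncertainty in DO, and all numerical values are hypothetical and only show the structure of the calculation; they are not the formulation or data used in the paper.

import random

def low_quality_membership(do_mg_l, low=4.0, high=6.0):
    # Fuzzy membership of the event "low water quality": 1 below `low` mg/L DO,
    # 0 above `high` mg/L, linear in between (hypothetical breakpoints).
    if do_mg_l <= low:
        return 1.0
    if do_mg_l >= high:
        return 0.0
    return (high - do_mg_l) / (high - low)

def fuzzy_risk(mean_do=5.5, sd=0.8, n_samples=10000, seed=1):
    # Monte Carlo estimate of the expected membership in the fuzzy event
    # "low water quality", given an uncertain DO level at a checkpoint.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        do = rng.gauss(mean_do, sd)  # assumed normal uncertainty in DO
        total += low_quality_membership(do)
    return total / n_samples

print("Estimated fuzzy risk of low water quality:", round(fuzzy_risk(), 3))

A global search such as PGSL would then vary the decision variables (the fractional pollutant removal levels) to drive a risk measure of this kind down while balancing treatment requirements.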
Abstract:
This study discusses the scope of historical earthquake analysis in low-seismicity regions. Examples of non-damaging earthquake reports are given from the Eastern Baltic (Fennoscandian) Shield in north-eastern Europe from the 16th to the 19th centuries. The information available for past earthquakes in the region is typically sparse and cannot be increased through a careful search of the archives. This study applies the rigorous methodologies recommended in historical seismology, which were developed on the basis of ample data, to the sparse reports from the Eastern Baltic Shield. Attention is paid to the context of reporting, the identity and role of the authors, the circumstances of the reporting, and the opportunity to verify the available information by collating the sources. We evaluate the reliability of oral earthquake recollections and develop criteria for cases when a historical earthquake is attested to by a single source. We propose parametric earthquake scenarios as a way to deal with sparse macroseismic reports and as an improvement to existing databases.
Abstract:
An attempt is made in this paper to arrive at a methodology for generating building technologies appropriate to rural housing. An evaluation of traditional and 'modern' technologies currently in use reveals the need for alternatives. The lacunae in the presently available technologies also lead to a definition of rural housing needs. It is emphasised that contending technologies must establish a 'goodness of fit' between the house form and the pattern of needs. A systems viewpoint, which looks at the dynamic process of building construction and the static structure of the building, is then suggested as a means to match the technologies to the needs. The process viewpoint emphasises the role of building materials production and transportation in achieving desired building performances. A couple of examples of technological alternatives, such as the compacted soil block and the polythene-stabilised soil roof covering, are then discussed. The static structural system viewpoint is then studied to arrive at methodologies of cost reduction. An illustrative analysis is carried out using the dynamic programming technique to arrive at combinations of alternatives for the building components which lead to cost reduction. Some of the technological options are then evaluated against the need patterns. Finally, a guideline for developments in building technology is suggested.
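As an illustration of the cost-reduction analysis mentioned above, the following Python sketch uses dynamic programming to pick one alternative per building component so that total cost is minimized subject to a minimum overall performance requirement. The components, alternatives, costs and performance scores are invented for illustration and are not data from the paper.

from functools import lru_cache

# Hypothetical component alternatives: (name, cost, performance score).
ALTERNATIVES = {
    "wall": [("burnt brick", 120, 8), ("compacted soil block", 70, 6)],
    "roof": [("reinforced concrete slab", 150, 9), ("polythene-stabilised soil", 60, 5)],
    "floor": [("cement concrete", 50, 6), ("rammed earth", 20, 4)],
}
COMPONENTS = list(ALTERNATIVES)

@lru_cache(maxsize=None)
def best(i, needed):
    # Minimum cost over components i.. given `needed` performance still required.
    if i == len(COMPONENTS):
        return (0, ()) if needed <= 0 else (float("inf"), ())
    best_cost, best_choice = float("inf"), ()
    for name, cost, perf in ALTERNATIVES[COMPONENTS[i]]:
        sub_cost, sub_choice = best(i + 1, max(0, needed - perf))
        if cost + sub_cost < best_cost:
            best_cost = cost + sub_cost
            best_choice = ((COMPONENTS[i], name),) + sub_choice
    return best_cost, best_choice

cost, choice = best(0, 16)  # require a total performance score of at least 16
print(cost, choice)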
Abstract:
Human sport doping control analysis is a complex and challenging task for anti-doping laboratories. The List of Prohibited Substances and Methods, updated annually by the World Anti-Doping Agency (WADA), consists of hundreds of chemically and pharmacologically different low and high molecular weight compounds. This poses a considerable challenge for laboratories to analyze them all in a limited amount of time from a limited sample aliquot. The continuous expansion of the Prohibited List obliges laboratories to keep their analytical methods updated and to investigate newly available methodologies. In this thesis, an accurate mass-based analysis employing liquid chromatography-time-of-flight mass spectrometry (LC-TOFMS) was developed and validated to improve the power of doping control analysis. New analytical methods were developed utilizing the high mass accuracy and high information content obtained by TOFMS to generate comprehensive and generic screening procedures. The suitability of LC-TOFMS for comprehensive screening was demonstrated for the first time in the field, with mass accuracies better than 1 mDa. Further attention was given to generic sample preparation, an essential part of screening analysis, to rationalize the whole workflow and minimize the need for several separate sample preparation methods. Utilizing both positive and negative ionization allowed the detection of almost 200 prohibited substances. Automatic data processing produced a Microsoft Excel based report highlighting the entries fulfilling the criteria of the reverse database search (retention time (RT), mass accuracy, isotope match). The quantitative performance of LC-TOFMS was demonstrated with morphine, codeine and their intact glucuronide conjugates. After a straightforward sample preparation, the compounds were analyzed directly without the need for hydrolysis, solvent transfer, evaporation or reconstitution. Hydrophilic interaction liquid chromatography (HILIC) provided good chromatographic separation, which was critical for the morphine glucuronide isomers. A wide linear range (50-5000 ng/ml) with good precision (RSD<10%) and accuracy (±10%) was obtained, showing performance comparable to or better than other methods in use. In-source collision-induced dissociation (ISCID) allowed confirmation analysis with three diagnostic ions, with a median mass accuracy of 1.08 mDa and repeatable ion ratios fulfilling WADA's identification criteria. The suitability of LC-TOFMS for screening of high molecular weight doping agents was demonstrated with plasma volume expanders (PVE), namely dextran and hydroxyethyl starch (HES). Specificity of the assay was improved, since interfering matrix compounds were removed by size exclusion chromatography (SEC). ISCID produced three characteristic ions with an excellent mean mass accuracy of 0.82 mDa at physiological concentration levels. In summary, by combining TOFMS with proper sample preparation and chromatographic separation, the technique can be utilized extensively in doping control laboratories for comprehensive screening of chemically different low and high molecular weight compounds, for quantification of threshold substances and even for confirmation. LC-TOFMS rationalized the workflow in doping control laboratories by simplifying the screening scheme, expediting reporting and minimizing analysis costs. Therefore LC-TOFMS can be exploited widely in doping control, and the need for several separate analysis techniques is reduced.
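To illustrate the reverse database search criteria mentioned above (retention time, mass accuracy, isotope match), the following Python sketch flags observed peaks whose retention time and accurate mass fall within tolerance of library entries. The library entries, tolerance windows and peak values are hypothetical and are not those of the validated method; the isotope-match criterion is omitted for brevity.

# Hypothetical screening library: name -> (retention time in min, theoretical m/z).
LIBRARY = {
    "compound A": (2.10, 286.1438),
    "compound B": (3.45, 300.1594),
}

RT_TOL_MIN = 0.2     # assumed retention time window (min)
MASS_TOL_MDA = 1.0   # assumed mass accuracy window (mDa)

def screen(peaks):
    # Each observed peak is (retention time in min, measured m/z, intensity).
    hits = []
    for rt, mz, intensity in peaks:
        for name, (lib_rt, lib_mz) in LIBRARY.items():
            mass_error_mda = abs(mz - lib_mz) * 1000.0  # Da -> mDa
            if abs(rt - lib_rt) <= RT_TOL_MIN and mass_error_mda <= MASS_TOL_MDA:
                hits.append((name, rt, mz, round(mass_error_mda, 2), intensity))
    return hits

observed = [(2.08, 286.1441, 1.2e5), (5.00, 310.2000, 3.0e4)]
for hit in screen(observed):
    print(hit)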
Abstract:
Simple two-dimensional C-13-satellite J/D-resolved experiments have been proposed for the visualization of enantiomers, the extraction of homo- and heteronuclear residual dipolar couplings, and the H-1 chemical shift differences between the enantiomers in the anisotropic medium. The significant advantages of the techniques lie in the determination of scalar couplings of larger organic molecules. The scalar couplings specific to a second abundant spin, such as F-19, can be selectively extracted from a severely overlapped spectrum. The methodologies are demonstrated on a chiral molecule aligned in a chiral liquid crystal medium and on two different organic molecules in isotropic solutions. (C) 2010 Elsevier B.V. All rights reserved.
Abstract:
The subject and methodology of biblical scholarship have expanded immensely during the last few decades. The traditional text-, literary-, source- and form-critical approaches, labeled 'historical-critical scholarship', have faced the challenge of the social sciences. Various new literary, synchronic readings, sometimes characterized with the vague term postmodernism, have in turn challenged historical-critical and social-scientific approaches. Widened limits and diverging methodologies have caused a sense of crisis in biblical criticism. This metatheoretical thesis attempts to bridge the gap between philosophical discussion about the basis of biblical criticism and practical academic biblical scholarship. The study attempts to trace those epistemological changes that have produced the wealth of methods and results within biblical criticism. The account of the cult reform of King Josiah of Judah as reported in 2 Kings 22:1-23:30 serves as the case study because of its importance for critical study of the Hebrew Bible. Various scholarly approaches embracing 2 Kings 22:1-23:30 are experimentally arranged around four methodological positions: text, author, reader, and context. The heuristic model is a tentative application of Oliver Jahraus's model of four paradigms in literary theory. The study argues for six theses: 1) Our knowledge of the world is constructed, fallible and theory-laden. 2) Methodological plurality is the necessary result of changes in epistemology and culture in general. 3) Oliver Jahraus's four methodological positions in regard to literature are also an applicable model within biblical criticism to comprehend the methodological plurality embracing the study of the Hebrew Bible. 4) Underlying the methodological discourse embracing biblical criticism is the epistemological tension between the natural sciences and the humanities. 5) Biblical scholars should reconsider and analyze in detail concepts such as author and editor to overcome the dichotomy between the Göttingen and Cross schools. 6) To say something about the historicity of 2 Kings 22:1-23:30 one must bring together disparate elements from various disciplines and, finally, admit that though it may be possible to draw some permanent results, our conclusions often remain provisional.
Abstract:
This study is a pragmatic description of the evolution of the genre of English witchcraft pamphlets from the mid-sixteenth century to the end of the seventeenth century. Witchcraft pamphlets were produced for a new kind of readership, the semi-literate, uneducated masses, and the central hypothesis of this study is that publishing for the masses entailed rethinking the ways of writing and printing texts. Analysis of the use of typographical variation and illustrations indicates how printers and publishers catered to the tastes and expectations of this new audience. Analysis of the language of witchcraft pamphlets shows how pamphlet writers took the new readership into account by transforming formal written source materials (trial proceedings) into more immediate ways of writing. The material for this study comes from the Corpus of Early Modern English Witchcraft Pamphlets, which has been compiled by the author. The multidisciplinary analysis incorporates both visual and linguistic aspects of the texts, with methodologies and theoretical insights adopted eclectically from historical pragmatics, genre studies, book history, corpus linguistics, systemic functional linguistics and cognitive psychology. The findings are anchored in the socio-historical context of early modern publishing, reading, literacy and witchcraft beliefs. The study shows not only how consideration of a new audience by both authors and printers influenced the development of a genre, but also the value of combining visual and linguistic features in pragmatic analyses of texts.
Abstract:
The machine replication of human reading has been the subject of intensive research for more than three decades. A large number of research papers and reports have already been published on this topic. Many commercial establishments have manufactured recognizers of varying capabilities. Handheld, desk-top, medium-size and large systems costing as much as half a million dollars are available and are in use for various applications. However, the ultimate goal of developing a reading machine with the same reading capabilities as humans remains unachieved. There is thus still a great gap between human and machine reading capabilities, and a great amount of further effort is required to narrow this gap, if not bridge it. This review is organized into six major sections covering a general overview (an introduction), applications of character recognition techniques, methodologies in character recognition, research work in character recognition, some practical OCRs and the conclusions.
Abstract:
This work is a survey of the average cost control problem for discrete-time Markov processes. The authors have attempted to put together a comprehensive account of the considerable research on this problem over the past three decades. The exposition ranges from finite to Borel state and action spaces and includes a variety of methodologies to find and characterize optimal policies. The authors have included a brief historical perspective of the research efforts in this area and have compiled a substantial yet not exhaustive bibliography. The authors have also identified several important questions that are still open to investigation.
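For concreteness, the long-run average cost criterion surveyed here is conventionally written, in standard notation assumed for this note rather than taken from the survey itself, as

J(\pi, x) = \limsup_{N \to \infty} \frac{1}{N}\, \mathbb{E}^{\pi}_{x}\left[ \sum_{n=0}^{N-1} c(x_n, a_n) \right],

where \pi is a control policy, x the initial state, (x_n, a_n) the state-action process and c the one-stage cost; an average-cost optimal policy \pi^* satisfies J(\pi^*, x) = \inf_{\pi} J(\pi, x) for every initial state x.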
Abstract:
A simple method using a combination of conformal mapping and a vortex panel method to simulate potential flow in cascades is presented. The cascade is first transformed to a single body using a conformal mapping, and the potential flow over this body is solved using a simple higher-order vortex panel method. The advantage of this method over existing methodologies is that it enables the use of higher-order panel methods, such as those used to solve flow past an isolated airfoil, to solve the cascade problem without the need for any numerical integrations or iterations. The fluid loading on the blades, such as the normal force and pitching moment, may be easily calculated from the resultant velocity field. The coefficient of pressure on cascade blades calculated with this methodology shows good agreement with previous numerical and experimental results.
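For reference, the blade pressure coefficient referred to above follows from the resultant surface velocity by the standard incompressible potential-flow relation, stated here as an assumed convention rather than quoted from the paper:

C_p = 1 - \left( \frac{V_s}{V_{\mathrm{ref}}} \right)^2,

where V_s is the tangential velocity on the blade surface obtained from the panel solution and V_{\mathrm{ref}} is the chosen reference velocity (for cascades, often the inlet or vector-mean velocity); the normal force and pitching moment then follow by integrating this pressure distribution over the blade surface.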
Abstract:
This paper studies the problem of constructing robust classifiers when the training data is plagued with uncertainty. The problem is posed as a Chance-Constrained Program (CCP), which ensures that the uncertain data points are classified correctly with high probability. Unfortunately, such a CCP turns out to be intractable. The key novelty is in employing Bernstein bounding schemes to relax the CCP as a convex second-order cone program whose solution is guaranteed to satisfy the probabilistic constraint. Prior to this work, only Chebyshev-based relaxations were exploited in learning algorithms. Bernstein bounds employ richer partial information and hence can be far less conservative than Chebyshev bounds. Due to this more efficient modeling of uncertainty, the resulting classifiers achieve higher classification margins and hence better generalization. Methodologies for classifying uncertain test data points and error measures for evaluating classifiers robust to uncertain data are discussed. Experimental results on synthetic and real-world datasets show that the proposed classifiers are better equipped to handle data uncertainty and outperform the state of the art in many cases.
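As an illustration of the chance-constrained formulation described above, a generic linear-classifier version can be written, in notation assumed here rather than taken from the paper, as

\min_{w, b, \xi \ge 0} \ \tfrac{1}{2}\|w\|^2 + C \sum_i \xi_i \quad \text{subject to} \quad \Pr\big( y_i (w^{\top} X_i + b) \ge 1 - \xi_i \big) \ge 1 - \epsilon \ \ \forall i,

where the X_i are the uncertain training points, y_i their labels and \epsilon the tolerated probability of constraint violation; bounding the probability on the left with Bernstein-type moment bounds (rather than Chebyshev bounds) is what allows each probabilistic constraint to be replaced by a convex second-order cone constraint.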
Abstract:
This paper presents methodologies for fracture analysis of concrete structural components with and without considering the tension softening effect. The stress intensity factor (SIF) is computed using both an analytical approach and finite element analysis. In the analytical approach, the SIF accounting for the tension softening effect has been obtained as the difference between the SIF obtained using linear elastic fracture mechanics (LEFM) principles and the SIF due to the closing pressure. The superposition principle has been used, accounting for non-linearity in incremental form. The SIF due to the crack closing force applied on the effective crack face inside the process zone has been computed using a Green's function approach. In the finite element analysis, the domain integral method has been used for computation of the SIF. The domain integral method is used to calculate the strain energy release rate and SIF when a crack grows. Numerical studies have been conducted on notched three-point bending concrete specimens with and without considering the cohesive stresses. It is observed from the studies that the SIF obtained from the finite element analysis with and without considering the cohesive stresses is in good agreement with the corresponding analytical value. The effect of the cohesive stress on the SIF decreases with increasing crack length. Further, studies have been conducted on geometrically similar structures, and it is observed that (i) the effect of the cohesive stress on the SIF is significant with increasing load for a particular crack length and (ii) SIF values decrease with increasing tensile strength for a particular crack length and load.
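The analytical superposition described above can be summarized, with notation assumed here for illustration, as

K_{\mathrm{eff}} = K_{\mathrm{LEFM}} - K_{\mathrm{closure}},

where K_{\mathrm{LEFM}} is the stress intensity factor computed from linear elastic fracture mechanics for the applied load and K_{\mathrm{closure}} is the stress intensity factor produced by the cohesive (crack-closing) pressure acting over the process zone, evaluated via a Green's function; the tension softening effect thus reduces the effective crack-tip driving force.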
Abstract:
Enantioselective synthesis of both enantiomeric forms of the hydrindane derivatives mentioned in the title, potential chiral precursors in terpenoid synthesis, starting from (R)-carvone and employing two different cyclopentannulation methodologies, is described.