952 results for Methods: analytical


Relevance: 30.00%

Publisher:

Abstract:

Application of "advanced analysis" methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A research project has been conducted with the aim of developing concentrated plasticity methods suitable for practical advanced analysis of steel frame structures comprising non-compact sections. This paper contains a comprehensive set of analytical benchmark solutions for steel frames comprising non-compact sections, which can be used to verify the accuracy of simplified concentrated plasticity methods of advanced analysis. The analytical benchmark solutions were obtained using a distributed plasticity shell finite element model that explicitly accounts for the effects of gradual cross-sectional yielding, longitudinal spread of plasticity, initial geometric imperfections, residual stresses, and local buckling. A brief description and verification of the shell finite element model is provided in this paper.
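The gradual cross-sectional yielding that the benchmark shell model captures can be illustrated with a simple fiber-section moment-curvature calculation. This is a generic sketch with assumed section dimensions and material properties, not the paper's distributed plasticity model (which also accounts for residual stresses, imperfections and local buckling):

```python
# Fiber-section moment-curvature sketch for a rectangular steel section,
# illustrating gradual cross-sectional yielding. Dimensions and yield
# stress are assumed for illustration only.

def moment(curvature, depth=0.2, width=0.1, fy=300e6, E=200e9, n_fib=400):
    """Bending moment (N*m) about the centroid for a given curvature (1/m),
    assuming an elastic-perfectly-plastic material and plane sections."""
    dz = depth / n_fib
    m = 0.0
    for i in range(n_fib):
        z = -depth / 2 + (i + 0.5) * dz          # fiber distance from centroid
        strain = curvature * z                   # plane-sections assumption
        stress = max(-fy, min(fy, E * strain))   # elastic-perfectly-plastic
        m += stress * width * dz * z             # fiber force times lever arm
    return m

fy, E, d, b = 300e6, 200e9, 0.2, 0.1
My = fy * b * d**2 / 6        # first-yield moment
Mp = fy * b * d**2 / 4        # full plastic moment (1.5*My for a rectangle)
kappa_y = 2 * fy / (E * d)    # curvature at first yield

# At first yield the fiber model matches the elastic result; at large
# curvature the moment approaches (but stays below) the plastic moment.
print(moment(kappa_y) / My)
print(moment(20 * kappa_y) / Mp)
```

A concentrated plasticity (plastic hinge) model replaces this gradual transition with a sudden jump at the hinge moment, which is why benchmark solutions like those in the paper are needed to verify its accuracy.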

Abstract:

This study resulted in the development of a decision-making tool for engineering consultancies looking to diversify into new markets. It reviewed existing decision tools used by contractors entering new markets in order to develop a bespoke tool for engineering consultants, establishing more rigor around the decision-making process rather than relying purely on the intuition of company executives. The tool can be used for developing medium- and long-term company strategies, or as a quick and efficient way to assess the viability of new market opportunities as they arise. A combination of the Delphi method and the Analytic Hierarchy Process (AHP) was selected as the basis of the decision theory.
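The Analytic Hierarchy Process component of such a tool can be sketched as follows: criteria weights are derived from a pairwise comparison matrix and checked for consistency. The criteria and judgments below are hypothetical:

```python
# AHP priority calculation sketch: weights from a pairwise comparison
# matrix via the geometric-mean method, plus Saaty's consistency check.
# The criteria and comparison values are illustrative only.
import math

def ahp_weights(A):
    """Priority vector via the geometric-mean (logarithmic least squares) method."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]   # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

def consistency_ratio(A, w):
    """Saaty's consistency ratio; CR < 0.1 is conventionally acceptable."""
    n = len(A)
    # lambda_max estimated from (A w) / w, averaged over rows
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]               # Saaty's random indices
    return ci / ri

# Hypothetical market-entry criteria: market size, competition, capability fit
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = ahp_weights(A)
print([round(x, 3) for x in w])   # dominant weight on the first criterion
print(consistency_ratio(A, w))
```

In a combined Delphi/AHP tool, the pairwise judgments would be elicited and iterated among an expert panel rather than supplied by a single decision maker.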

Abstract:

Barmah Forest virus (BFV) disease is an emerging mosquito-borne disease in Australia. We aimed to outline some recent methods in using GIS for the analysis of BFV disease in Queensland, Australia. A large database of geocoded BFV cases has been established in conjunction with population data. The database has been used in recently published studies by the authors to determine spatio-temporal BFV disease hotspots and spatial patterns using spatial autocorrelation and semi-variogram analysis, in conjunction with the development of interpolated BFV disease standardised incidence maps. This paper briefly outlines the GIS-based spatial analysis methodologies used in those studies, summarises their methods and results, and presents a GIS methodology to be used in future spatial analytical studies in an attempt to enhance the understanding of BFV disease in Queensland. The methodology developed improves the analysis of BFV disease data and will enhance the understanding of the distribution of BFV disease in Queensland, Australia.
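The spatial autocorrelation analysis mentioned above rests on statistics such as global Moran's I, which can be sketched in a few lines (the district layout and incidence values are invented; a real analysis would use GIS-derived spatial weights):

```python
# Global Moran's I sketch: positive values indicate spatial clustering,
# negative values dispersion. Toy incidence values on a hypothetical
# transect of four neighbouring districts.

def morans_i(values, weights):
    """Global Moran's I for a list of values and a neighbour weight matrix."""
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    w_sum = sum(sum(row) for row in weights)
    return (n / w_sum) * (num / den)

# Four districts along a transect; adjacent districts are neighbours
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
print(morans_i([10, 8, 2, 0], W))   # similar values adjacent -> positive I
print(morans_i([10, 0, 8, 2], W))   # alternating values -> negative I
```

Hotspot detection in the studies cited would use local variants of such statistics together with the standardised incidence maps.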

Abstract:

Twitter and other social networking sites play an increasingly prominent role in the spread of current events, yet the dynamics of information dissemination through digital network structures remain relatively unexplored. When is an issue taken up, and by whom? Who forwards a message, when, and to whom? What role do individual participants, existing digital communities, or the technical foundations of each network platform play in the spread of news? In this chapter we use the example of a video on a current sociopolitical issue in Australia that was shared on Twitter to discuss a number of new methods for the dynamic visualisation and analysis of communication processes. Our method combines temporal and spatial analytical approaches and provides new insights into the spread of news in digital networks. [Social media increasingly serve as dissemination mechanisms for media content. On Twitter, the retweet function in particular enables the fast and wide-reaching transfer of news. In this chapter we establish new methodological approaches for capturing, visualising and analysing retweet chains. In particular, we highlight how existing network analysis methods can be extended to capture the forwarding process both temporally and spatially. Our case study traces the spread of the video clip of a spontaneous, angry speech given on 9 October 2012 by Australian Prime Minister Julia Gillard, in which she branded opposition leader Tony Abbott a misogynist. By collecting background data on the users involved in forwarding the video clip, we build a detailed picture of the dissemination process in this case, allowing the key actors and the course of the forwarding to be visualised and analysed, and yielding insights into the general dissemination patterns of news on Twitter.]
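A minimal sketch of the kind of temporal cascade reconstruction described here: rebuilding a retweet chain as a diffusion tree and reading off chain depth and key amplifiers (the tweets and user names are invented):

```python
# Reconstructing a retweet cascade as a diffusion tree. The data below
# are invented for illustration; a real study would use collected tweet
# metadata with timestamps and retweet attributions.
from collections import defaultdict

# (user, retweeted_from, minutes_since_original); None marks the original post
tweets = [
    ("alice", None,    0),
    ("bob",   "alice", 3),
    ("carol", "alice", 7),
    ("dave",  "bob",  12),
    ("erin",  "dave", 30),
]

children = defaultdict(list)
for user, source, t in tweets:
    if source is not None:
        children[source].append(user)

def depth(user):
    """Longest forwarding chain starting at this user."""
    if not children[user]:
        return 0
    return 1 + max(depth(c) for c in children[user])

# Depth shows how far the item travelled beyond the originator's audience;
# the branching at each node shows who the key amplifiers were.
print(depth("alice"))
amplifiers = sorted(children, key=lambda u: len(children[u]), reverse=True)
print(amplifiers[0])   # user with the most direct retweets
```

The timestamps carried alongside each edge are what make the temporal animation of the cascade possible; adding user locations gives the spatial dimension.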

Abstract:

The burning of fossil fuels is simultaneously exhausting the world's energy reserves and, through greenhouse gas (GHG) emissions, threatening the global climate. In many developing countries, the significant improvement in living standards brought by accelerating economic development has resulted in a disproportionate increase in household energy consumption. A major reduction in household carbon emissions (HCEs) is therefore essential if global carbon reduction targets are to be met. Major Organisation for Economic Co-operation and Development (OECD) states have already implemented policies to alleviate the negative environmental effects of household behaviors, and less carbon-intensive technologies have been proposed to promote energy efficiency and reduce carbon emissions. Before further remedial actions can be contemplated, however, it is important to fully understand the actual causes of such large HCEs, to help researchers both gain deep insights into the development of the research domain and identify valuable topics for future study. This paper reviews the existing literature on HCEs. The critical review provides a systematic understanding of current work in the field, describing the factors influencing HCEs under the themes of household income, household size, age, education level, location, gender and rebound effects. The main quantification methodologies of input–output models, life cycle assessment and emission coefficient methods are also presented, as are the proposed measures to mitigate HCEs at the policy, technology and consumer levels. Finally, the limitations of work done to date and further research directions are identified for the benefit of future studies.
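The emission coefficient method mentioned among the quantification methodologies reduces, in sketch form, to multiplying consumption quantities by fuel-specific factors. The factors and household figures below are illustrative placeholders, not official values:

```python
# Emission-coefficient sketch of household carbon emissions (HCEs):
# energy use times fuel-specific emission factors. All numbers below
# are assumed placeholders for illustration, not official coefficients.

# kg CO2 per unit of consumption (assumed values)
EMISSION_FACTORS = {
    "electricity_kwh": 0.5,
    "natural_gas_m3":  1.9,
    "petrol_litre":    2.3,
}

def household_emissions(consumption):
    """Annual HCEs in kg CO2 from a dict of fuel consumption quantities."""
    return sum(EMISSION_FACTORS[fuel] * qty for fuel, qty in consumption.items())

household = {"electricity_kwh": 4000, "natural_gas_m3": 600, "petrol_litre": 900}
print(household_emissions(household))   # 0.5*4000 + 1.9*600 + 2.3*900
```

Input-output models and life cycle assessment extend this direct accounting to the indirect emissions embodied in purchased goods and services.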

Abstract:

The rectangular dielectric waveguide is the most commonly used structure in integrated optics, especially in semiconductor diode lasers. Demands for new applications such as high-speed data backplanes in integrated electronics, waveguide filters, optical multiplexers and optical switches are driving technology toward better materials and processing techniques for planar waveguide structures. The familiar infinite-slab and circular waveguides are not practical for use on a substrate: the slab waveguide provides no lateral confinement, and the circular fiber is not compatible with the planar processing technology used to make planar structures. The rectangular waveguide is the natural structure. In this review, we discuss several analytical methods for analyzing the mode structure of rectangular waveguides, beginning with a wave analysis based on the pioneering work of Marcatili. We study three basic techniques, with examples to compare their performance: Marcatili's analytical approach, the perturbation techniques that improve on the analytical solutions, and the effective index method.
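The effective index method can be sketched numerically: the rectangular guide is reduced to two successive slab problems, each solved from the symmetric-slab TE0 dispersion relation. The wavelength, indices and dimensions below are assumed, and the sketch reuses the TE relation in both steps, glossing over the polarisation bookkeeping of the full method:

```python
# Effective index method sketch: solve the symmetric-slab TE0 dispersion
# relation kappa*tan(kappa*a) = gamma by bisection, first for the vertical
# slab, then for a horizontal slab whose "core" index is the step-1 result.
# Indices and dimensions are illustrative; both steps use the TE relation.
import math

def slab_te0_neff(n_core, n_clad, thickness, wavelength):
    """Effective index of the fundamental TE mode of a symmetric slab."""
    k0 = 2 * math.pi / wavelength
    a = thickness / 2.0

    def resid(neff):
        kappa = k0 * math.sqrt(n_core**2 - neff**2)  # transverse wavenumber, core
        gamma = k0 * math.sqrt(neff**2 - n_clad**2)  # decay constant, cladding
        return kappa * math.tan(kappa * a) - gamma   # = 0 on the even-mode branch

    lo, hi = n_clad + 1e-9, n_core - 1e-9   # guided modes lie between the indices
    for _ in range(100):                    # bisection; resid changes sign on [lo, hi]
        mid = 0.5 * (lo + hi)
        if resid(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Step 1: vertical slab (1.0 um core at 1.55 um wavelength)
n_vertical = slab_te0_neff(1.50, 1.45, 1.0e-6, 1.55e-6)
# Step 2: horizontal slab of width 2.0 um, reusing the step-1 effective index
n_eff = slab_te0_neff(n_vertical, 1.45, 2.0e-6, 1.55e-6)
print(n_vertical, n_eff)
```

Marcatili's approach instead matches approximate field expressions across the four core boundaries, and the perturbation techniques correct the residual error of either approximation.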

Abstract:

The feasibility of different modern analytical techniques for the mass spectrometric detection of anabolic androgenic steroids (AAS) in human urine was examined in order to improve on prevailing analytical practice and to find reasonable strategies for effective sports drug testing. A comparative study of the sensitivity and specificity of gas chromatography (GC) combined with low-resolution (LRMS) and high-resolution mass spectrometry (HRMS) in screening for AAS was carried out with four metabolites of methandienone. Measurements were done in selected ion monitoring mode, with HRMS using a mass resolution of 5000. With HRMS the detection limits were considerably lower than with LRMS, enabling detection of steroids at levels as low as 0.2-0.5 ng/ml. Even with HRMS, however, the biological background hampered the detection of some steroids. The applicability of liquid-phase microextraction (LPME) was studied with metabolites of fluoxymesterone, 4-chlorodehydromethyltestosterone, stanozolol and danazol. Factors affecting the extraction process were studied, and a novel LPME method with in-fiber silylation was developed and validated for GC/MS analysis of the danazol metabolite. The method allowed precise, selective and sensitive analysis of the metabolite and enabled simultaneous filtration, extraction, enrichment and derivatization of the analyte from urine without any other sample preparation steps. Liquid chromatographic/tandem mass spectrometric (LC/MS/MS) methods utilizing electrospray ionization (ESI), atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) were developed and applied for the detection of oxandrolone and metabolites of stanozolol and 4-chlorodehydromethyltestosterone in urine. All methods exhibited high sensitivity and specificity. ESI showed the best applicability, however, and an LC/ESI-MS/MS method for routine screening of nine 17-alkyl-substituted AAS was therefore developed, enabling fast and precise measurement of all analytes with detection limits below 2 ng/ml. The potential of chemometrics to resolve complex GC/MS data was demonstrated with samples prepared for AAS screening. Acquired full-scan spectral data (m/z 40-700) were processed by the OSCAR algorithm (Optimization by Stepwise Constraints of Alternating Regression). The deconvolution process was able to extract from a GC/MS run more than twice as many components as there were visible chromatographic peaks. Severely overlapping components, as well as components hidden in the chromatographic background, could be isolated successfully. All of the studied techniques proved to be useful analytical tools for improving the detection of AAS in urine. The superiority of any one procedure is, however, compound-dependent, and the different techniques complement each other.
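Detection limits such as the 0.2-0.5 ng/ml figures above are, in general practice, often estimated from blank variability and calibration slope; the following generic sketch (with invented numbers) illustrates one common convention, not the validation procedure actually used in this work:

```python
# Generic limit-of-detection estimate from blank noise and calibration
# slope (ICH-style LOD = 3.3 * sd(blank) / slope). All numbers are
# invented for illustration.
import statistics

def limit_of_detection(blank_signals, slope):
    """LOD in concentration units, given blank responses and the
    calibration slope (signal units per concentration unit)."""
    return 3.3 * statistics.stdev(blank_signals) / slope

# Hypothetical blank responses (arbitrary units) and slope (units per ng/ml)
blanks = [12.1, 11.8, 12.4, 12.0, 12.2, 11.9]
slope = 4.0
lod = limit_of_detection(blanks, slope)
print(round(lod, 3))   # ng/ml
```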

Abstract:

In this dissertation, I present an overall methodological framework for studying linguistic alternations, focusing specifically on lexical variation in denoting a single meaning, that is, synonymy. As the practical example, I employ the synonymous set of the four most common Finnish verbs denoting THINK, namely ajatella, miettiä, pohtia and harkita ‘think, reflect, ponder, consider’. As a continuation of previous work, I describe in considerable detail the extension of statistical methods from dichotomous linguistic settings (e.g., Gries 2003; Bresnan et al. 2007) to polytomous ones, that is, those concerning more than two possible alternative outcomes. The applied statistical methods are arranged into a succession of stages of increasing complexity, proceeding from univariate via bivariate to multivariate techniques. As the central multivariate method, I argue for the use of polytomous logistic regression and demonstrate its practical implementation for the studied phenomenon, thus extending the work by Bresnan et al. (2007), who applied simple (binary) logistic regression to a dichotomous structural alternation in English. The results of the various statistical analyses confirm that a wide range of contextual features across different categories are indeed associated with the use and selection of the selected THINK lexemes; however, a substantial part of these features are not exemplified in current Finnish lexicographical descriptions. The multivariate analysis results indicate that the semantic classifications of syntactic argument types are on average the most distinctive feature category, followed by overall semantic characterizations of the verb chains, and then syntactic argument types alone, with morphological features pertaining to the verb chain and extra-linguistic features relegated to the last position.
In terms of the overall performance of the multivariate analysis and modeling, the prediction accuracy seems to reach a ceiling at a recall rate of roughly two-thirds of the sentences in the research corpus. The analysis of these results suggests a limit to what can be explained and determined within the immediate sentential context using the conventional descriptive and analytical apparatus based on currently available linguistic theories and models. The results also support Bresnan's (2007) and others' (e.g., Bod et al. 2003) probabilistic view of the relationship between linguistic usage and the underlying linguistic system, in which only a minority of linguistic choices are categorical, given the known context – represented as a feature cluster – that can be analytically grasped and identified. Instead, most contexts exhibit degrees of variation as to their outcomes, resulting in proportionate choices over longer stretches of usage in texts or speech.
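The central method, polytomous logistic regression, can be sketched as a softmax model fitted by gradient descent. The two-feature data and four-way outcome below are invented stand-ins for the dissertation's contextual features and THINK verbs:

```python
# Polytomous (multinomial) logistic regression sketch: a softmax model
# trained by batch gradient descent on invented two-feature "contexts",
# each labelled with one of four outcome classes. Illustration only.
import math, random

def softmax(zs):
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def fit(X, y, n_classes, lr=0.5, epochs=500):
    """Weights[c] = [bias, w1, w2, ...]; trained by batch gradient descent."""
    n_feat = len(X[0])
    W = [[0.0] * (n_feat + 1) for _ in range(n_classes)]
    for _ in range(epochs):
        grad = [[0.0] * (n_feat + 1) for _ in range(n_classes)]
        for x, label in zip(X, y):
            xs = [1.0] + x
            p = softmax([sum(w * v for w, v in zip(W[c], xs))
                         for c in range(n_classes)])
            for c in range(n_classes):
                err = p[c] - (1.0 if c == label else 0.0)
                for j, v in enumerate(xs):
                    grad[c][j] += err * v
        for c in range(n_classes):
            for j in range(n_feat + 1):
                W[c][j] -= lr * grad[c][j] / len(X)
    return W

def predict(W, x):
    xs = [1.0] + x
    scores = [sum(w * v for w, v in zip(row, xs)) for row in W]
    return scores.index(max(scores))

# Invented contexts: two features roughly separate the four outcome classes
random.seed(0)
X, y = [], []
centres = [(-2, -2), (-2, 2), (2, -2), (2, 2)]   # one centre per outcome
for label, (cx, cy) in enumerate(centres):
    for _ in range(30):
        X.append([cx + random.gauss(0, 0.5), cy + random.gauss(0, 0.5)])
        y.append(label)

W = fit(X, y, n_classes=4)
accuracy = sum(predict(W, x) == t for x, t in zip(X, y)) / len(X)
print(accuracy)   # well-separated toy data -> high training accuracy
```

In the dissertation's setting the predictors are hundreds of binary contextual features rather than two continuous ones, and the model's class probabilities, not just its argmax predictions, carry the linguistic interpretation.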

Abstract:

Much progress in nanoscience and nanotechnology has been made in the past few years thanks to the increased availability of sophisticated physical methods to characterize nanomaterials. These techniques include electron microscopy and scanning probe microscopies, in addition to standard techniques such as X-ray and neutron diffraction, X-ray scattering, and various spectroscopies. Characterization of nanomaterials includes the determination not only of size and shape, but also of the atomic and electronic structures and other important properties. In this article we describe some of the important methods employed for characterization of nanostructures, describing a few case studies for illustrative purposes. These case studies include characterizations of Au, ReO3, and GaN nanocrystals; ZnO, Ni, and Co nanowires; inorganic and carbon nanotubes; and two-dimensional graphene.
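As one concrete example of the kind of routine calculation behind XRD characterization of nanocrystals, the Scherrer equation estimates crystallite size from diffraction peak broadening (the peak parameters below are illustrative, not taken from the article):

```python
# Scherrer estimate of crystallite size from XRD peak broadening:
# size = K * lambda / (beta * cos(theta)). The peak position and width
# below are assumed values for illustration.
import math

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, K=0.9):
    """Crystallite size (nm) from a peak's FWHM (degrees 2-theta)."""
    beta = math.radians(fwhm_deg)              # FWHM in radians
    theta = math.radians(two_theta_deg / 2)    # Bragg angle
    return K * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation; a hypothetical peak at 38.2 deg with 1.0 deg FWHM
size = scherrer_size(0.15406, 1.0, 38.2)
print(round(size, 1))   # nm
```

Broad peaks thus signal small crystallites, which is why nanocrystal XRD patterns look smeared compared with those of bulk material; instrumental broadening must be subtracted in careful work.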

Abstract:

Identification of the optimum generation schedule by various methods of coordinating incremental generation costs and incremental transmission losses has been described previously in the literature. This paper presents an analytical approach which reduces the time-consuming iterative procedure into a mere positive-root determination of a third-order polynomial in λ. This approach includes the effect of transmission losses and is suitable for systems with any number of plants. The validity and effectiveness of this method are demonstrated by analysing a sample system.
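The coordination equations behind this approach can be illustrated in the lossless special case, where equal incremental cost yields λ in closed form; the paper's contribution is the analogous closed-form treatment with transmission losses included, which this sketch (with assumed cost coefficients) omits:

```python
# Lossless economic dispatch sketch: with quadratic plant costs
# a + b*P + c*P^2, setting all incremental costs equal to lambda gives
# P_i = (lambda - b_i) / (2 c_i), and the demand balance yields lambda
# in closed form. Cost coefficients are assumed for illustration.

def dispatch(plants, demand):
    """plants: list of (b, c) pairs. Returns (lambda, plant outputs)."""
    inv = sum(1.0 / (2 * c) for b, c in plants)
    lam = (demand + sum(b / (2 * c) for b, c in plants)) / inv
    return lam, [(lam - b) / (2 * c) for b, c in plants]

plants = [(8.0, 0.010), (9.0, 0.008)]   # $/MWh and $/MWh^2, assumed values
lam, P = dispatch(plants, demand=500.0)
print(lam)   # common incremental cost ($/MWh)
print(P)     # plant outputs, summing to the demand
```

Including the transmission-loss penalty factors couples these equations, which is what the paper resolves analytically via the positive root of a cubic in λ instead of iterating.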

Abstract:

The quantification and characterisation of soil phosphorus (P) is of agricultural and environmental importance, and different extraction methods are widely used to assess the bioavailability of P and to characterize soil P reserves. However, the large variety of extractants, pre-treatments and sample preparation procedures complicates the comparison of published results. In order to improve our understanding of the behaviour and cycling of P in soil, it is crucial to know the scientific relevance of the methods used for various purposes. Knowledge of the factors affecting the analytical outcome is a prerequisite for justified interpretation of the results. The aim of this thesis was to study the effects of sample preparation procedures on soil P and to determine the dependence of the recovered P pool on the chemical nature of the extractants. Sampling is a critical step in soil testing, and sampling strategy depends on the land-use history and the purpose of sampling. This study revealed that pre-treatments changed soil properties: air-drying was found to affect soil P, particularly extractable organic P, by disrupting organic matter. This was evidenced by an increase in the water-extractable small-sized (<0.2 µm) P that, at least partly, took place at the expense of the large-sized (>0.2 µm) P. Freezing, however, induced only insignificant changes, and can thus be taken to be a suitable storage method for soils from the boreal zone that naturally undergo periodic freezing. The results demonstrated that the chemical nature of the extractant affects its sensitivity for detecting changes in soil P solubility. Buffered extractants obscured the alterations in P solubility induced by pH changes; water extraction, though sensitive to physicochemical changes, can be used to reveal short-term changes in soil P solubility.
As for organic P, the analysis was found to be sensitive to the sample preparation procedures: filtering may leave a large proportion of extractable organic P undetected, whereas the outcome of centrifugation was found to be affected by the ionic strength of the extractant. The widely used sequential fractionation procedures proved able to detect land-use-derived differences in the distribution of P among fractions of different solubilities. However, interpretation of the results from extraction experiments requires a better understanding of the biogeochemical function of the recovered P fraction in the P cycle in differently managed soils under dissimilar climatic conditions.

Abstract:

Past studies that have compared LBB-stable discontinuous- and continuous-pressure finite element formulations on a variety of problems have concluded that both methods yield solutions of comparable accuracy, and that the choice of interpolation is dictated by which of the two is more efficient. In this work, we show that using discontinuous-pressure interpolations can yield inaccurate solutions at large times on a class of transient problems, while the continuous-pressure formulation yields solutions that are in good agreement with the analytical solution.

Abstract:

The analysis of lipid compositions from biological samples has become increasingly important. Lipids play a role in cardiovascular disease, metabolic syndrome and diabetes. They also participate in cellular processes such as signalling, inflammatory response, aging and apoptosis. Moreover, the mechanisms regulating cell membrane lipid compositions are poorly understood, partially because of a lack of good analytical methods. Mass spectrometry has opened up new possibilities for lipid analysis due to its high resolving power, its sensitivity and the possibility of structural identification by fragment analysis. The introduction of electrospray ionization (ESI) and advances in instrumentation revolutionized the analysis of lipid compositions. ESI is a soft ionization method, i.e. it avoids unwanted fragmentation of the lipids. Mass spectrometric analysis of lipid compositions is complicated by incomplete separation of the signals, differences in the instrument response of different lipids and the large amount of data generated by the measurements. These factors necessitate the use of computer software for the analysis of the data. The topic of this thesis is the development of methods for the mass spectrometric analysis of lipids. The work includes both computational and experimental aspects of lipid analysis. The first article explores the practical aspects of quantitative mass spectrometric analysis of complex lipid samples and describes how the properties of phospholipids and their concentration affect the response of the mass spectrometer. The second article describes a new algorithm for computing the theoretical mass spectrometric peak distribution, given the elemental isotope composition and the molecular formula of a compound. The third article introduces programs aimed specifically at the analysis of complex lipid samples and discusses different computational methods for separating the overlapping mass spectrometric peaks of closely related lipids.
The fourth article applies the developed methods, simultaneously measuring the progress curves of enzymatic hydrolysis for a large number of phospholipids in order to determine the substrate specificity of various A-type phospholipases. The data provide evidence that substrate efflux from the bilayer is the key factor determining the rate of hydrolysis.
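The isotope-pattern computation described in the second article can be illustrated with the textbook baseline it builds on: convolving per-element isotope distributions. The abundances for C and H below are standard values, but the naive repeated convolution is our illustration, not the article's algorithm:

```python
# Theoretical isotope pattern by repeated convolution of elemental
# isotope distributions, using nominal mass offsets. Standard natural
# abundances for C and H; a naive textbook baseline, not the article's
# own algorithm.

# nominal mass offset -> abundance, per element
ISOTOPES = {
    "C": {0: 0.9893, 1: 0.0107},      # 12C, 13C
    "H": {0: 0.999885, 1: 0.000115},  # 1H, 2H
}

def convolve(dist_a, dist_b):
    out = {}
    for m1, p1 in dist_a.items():
        for m2, p2 in dist_b.items():
            out[m1 + m2] = out.get(m1 + m2, 0.0) + p1 * p2
    return out

def isotope_pattern(formula):
    """formula: dict element -> count, e.g. hexane {'C': 6, 'H': 14}."""
    dist = {0: 1.0}
    for elem, count in formula.items():
        for _ in range(count):
            dist = convolve(dist, ISOTOPES[elem])
    return dist

pattern = isotope_pattern({"C": 6, "H": 14})   # hexane, for illustration
print(round(pattern[1] / pattern[0], 4))       # M+1 ratio, ~1.1% per carbon
```

For lipid-sized molecules the M+1 and M+2 peaks of one species overlap the monoisotopic peak of a lipid two mass units heavier, which is exactly the deconvolution problem the third article's software addresses.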

Abstract:

In order to evaluate the influence of ambient aerosol particles on cloud formation, climate and human health, detailed information about the concentration and composition of ambient aerosol particles is needed. The durations of aerosol formation, growth and removal processes in the atmosphere range from minutes to hours, which highlights the need for high-time-resolution data in order to understand the underlying processes. This thesis focuses on the characterization of ambient levels, size distributions and sources of water-soluble organic carbon (WSOC) in ambient aerosols. The results show that at the study location typically 50-60 % of the organic carbon in fine particles is water-soluble. The amount of WSOC was observed to increase as aerosols age, likely due to further oxidation of organic compounds. In the boreal region the main sources of WSOC were biomass burning during the winter and secondary aerosol formation during the summer. WSOC was mainly attributed to a fine particle mode between 0.1 and 1 μm, although different size distributions were measured for different sources. The WSOC concentrations and size distributions had a clear seasonal variation. Another main focus of this thesis was to test and further develop high-time-resolution methods for the chemical characterization of ambient aerosol particles. The concentrations of the main chemical components (ions, OC, EC) of ambient aerosol particles were measured online during a year-long intensive measurement campaign conducted at the SMEAR III station in Southern Finland. The results were compared to those of traditional filter collections in order to study the sampling artifacts and limitations of each method. To achieve a better time resolution for the WSOC and ion measurements, a particle-into-liquid sampler (PILS) was coupled with a total organic carbon analyzer (TOC) and two ion chromatographs (IC).
The PILS-TOC-IC provided important data about diurnal variations and short-lived plumes, which cannot be resolved from filter samples. In summary, the measurements made for this thesis provide new information on the concentrations, size distributions and sources of WSOC in ambient aerosol particles in the boreal region. The analytical and collection methods needed for the online characterization of aerosol chemical composition were further developed in order to provide more reliable high-time-resolution measurements.

Abstract:

We discuss three methods of correcting spherical aberration for a point-to-point imaging system. First, results obtained using Fermat's principle and the ray-tracing method are described briefly. Next, we obtain solutions using Lie algebraic techniques. Even though one cannot always obtain analytical results using this method, it is often more powerful than the first. The result obtained with this approach is compared with, and found to agree with, the exact result of the first method.
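The spherical aberration being corrected can be exhibited with a short ray-tracing calculation in the spirit of the first method: for a concave spherical mirror, a parallel ray of height h crosses the axis at R/(2 cos θ) from the centre of curvature (sin θ = h/R), not at the paraxial focus R/2. The mirror example and numbers are ours, chosen for illustration:

```python
# Ray-tracing demonstration of spherical aberration for a concave
# spherical mirror (centre of curvature at the origin, rays along +x).
# The axial crossing point R/(2*cos(theta)) depends on ray height h,
# so marginal rays miss the paraxial focus R/2.
import math

def axis_crossing(h, R=1.0):
    """Distance from the centre at which the reflected ray crosses the axis."""
    px, py = math.sqrt(R * R - h * h), h        # point where the ray hits the mirror
    nx, ny = -px / R, -py / R                   # unit normal, pointing at the centre
    dot = nx                                    # (1, 0) . (nx, ny) for a ray along +x
    rx, ry = 1.0 - 2 * dot * nx, -2 * dot * ny  # reflected direction
    t = -py / ry                                # parameter value where y = 0
    return px + t * rx                          # x of the axial crossing

paraxial = axis_crossing(1e-6)   # tends to R/2, the paraxial focus
marginal = axis_crossing(0.4)
print(paraxial, marginal)
print(marginal - paraxial)       # longitudinal spherical aberration
```

Correcting the aberration means reshaping the surface (or adding elements) so that this crossing point becomes independent of h, which is what the Fermat-principle and Lie algebraic treatments each determine in their own way.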