835 results for sampling methodology
Abstract:
Learning English as a foreign language is an opportunity that a growing number of schools in Catalonia offer at the pre-primary (Educació Infantil) stage. The methodology used to introduce the language varies from school to school. This research focuses on the use of drama techniques as a methodology for teaching English at the pre-primary level. Based on a questionnaire answered by 129 English teachers in Catalonia, we analysed their perception of drama techniques and of their use. The results show a general lack of knowledge about the methodology.
Abstract:
Given their central role in mercury (Hg) excretion and their suitability as reservoirs, bird feathers are useful Hg biomonitors. Nevertheless, the interpretation of Hg concentrations is still debated, owing to limited knowledge of feather physiology and of the mechanisms affecting Hg deposition. Given the constraints that feather availability places on ecotoxicological studies, we tested the effect on intra-individual differences in Hg concentrations of feather type (body vs. flight feathers), position in the wing, and size (mass and length), in order to understand how these factors could affect Hg estimates. We measured the Hg concentration of 154 feathers from 28 un-moulted barn owls (Tyto alba) collected dead on roadsides. Median Hg concentration was 0.45 (0.076-4.5) mg kg⁻¹ in body feathers, 0.44 (0.040-4.9) mg kg⁻¹ in primaries and 0.60 (0.042-4.7) mg kg⁻¹ in secondaries, and we found only a weak effect of feather type on intra-individual Hg levels. We also found a negative effect of wing feather mass on Hg concentration, but no effect of feather length or of position in the wing. We hypothesize that differences in feather growth rate may be the main driver of between-feather differences in Hg concentrations, which can have implications for the interpretation of Hg concentrations in feathers. Finally, we recommend that, whenever possible, several feathers from the same individual be analysed. The five innermost primaries show the lowest mean deviations from both the between-feather and the intra-individual mean Hg concentrations, and should therefore be selected under restrictive sampling scenarios.
Abstract:
This study aimed at comparing the efficiency of various sampling materials for the collection and subsequent analysis of organic gunshot residues (OGSR). To the best of our knowledge, this is the first time that sampling devices have been investigated in detail with a view to the quantitation of OGSR by LC-MS. Seven sampling materials, namely two "swab"-type and five "stub"-type collection materials, were tested. The investigation started with the development of a simple and robust LC-MS method able to separate and quantify molecules typically found in gunpowders, such as diphenylamine or ethylcentralite. The sampling materials were then evaluated systematically, first by analysing blank extracts of the materials to check for potential interferences, and by determining matrix effects. Based on these results, the four best materials, namely cotton buds, polyester swabs, a 3M tape and PTFE, were compared in terms of collection efficiency in shooting experiments using a set of 9 mm Luger ammunition. The tape was found to recover the highest amounts of OGSR. As tape-lifting is the technique currently used routinely for inorganic gunshot residues (IGSR), OGSR analysis might be implemented without modifying the IGSR sampling and analysis procedure.
Abstract:
A method to generate carbonyl compounds from alkynes under mild and neutral conditions, with excellent functional group compatibility and high yields, is described. Hydration takes place under catalytic conditions using 0.1 to 0.2 equivalents of the readily available and inexpensive mercury(II) p-toluenesulfonamidate in a hydroalcoholic solution. After use, the catalyst is inertized and/or recycled ...
Abstract:
When gasoline is used to start and/or spread a fire, inferring the source of the gasoline can establish a link between the fire and a potential source. Source inference is a valuable alternative means of providing evidence in this type of event, where physical traces left by the perpetrator are scarce. The main purpose of this research was to develop a GC-IRMS method for the analysis of gasoline samples, a non-routine technique that has received little attention in forensic science, and to evaluate its potential for inferring the source of gasoline traces in comparison with the performance of GC-MS. An instrument allowing samples to be analysed simultaneously by MS and IRMS was used in this research. An analytical method was developed, optimised and validated for this instrument. Gasoline samples from a large collection representative of the Lausanne-area market were then analysed, and the resulting data were processed and interpreted using chemometric methods. The analyses showed that the methodology, for both the MS and the IRMS components, can differentiate unweathered gasoline samples from different service stations. It was also demonstrated that each new filling of a station's tanks yields an almost unique gasoline composition. GC-MS achieves better differentiation of samples from different stations, whereas GC-IRMS performs better when comparing samples collected after successive fillings of the same tank. These results indicate that the two components of the method can be complementary for the analysis of unweathered gasoline samples. The results also showed that evaporation of gasoline samples does not compromise the ability to group samples from the same source by GC-MS, although a variable selection must first be performed to eliminate the variables influenced by evaporation. By contrast, evaporation has such a strong influence on the isotopic composition of the samples that evaporated samples cannot be grouped correctly by GC-IRMS, even after variable selection. Consequently, only the MS component of the methodology allows the source of evaporated gasoline samples to be inferred.
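As a schematic illustration of the chemometric grouping step described above, the following Python sketch (all data and variable indices are invented) drops evaporation-sensitive variables before projecting the samples with PCA, so that samples sharing a source cluster together:

import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 6 samples x 8 peak-area variables; two sources, three samples each.
source_a = rng.normal(1.0, 0.05, size=(3, 8))
source_b = rng.normal(2.0, 0.05, size=(3, 8))
X = np.vstack([source_a, source_b])

evaporation_sensitive = [0, 3]               # indices assumed to drift with evaporation
keep = [j for j in range(X.shape[1]) if j not in evaporation_sensitive]

scores = PCA(n_components=2).fit_transform(X[:, keep])
print(scores)   # the first three rows (source A) separate from the last three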
Abstract:
A BASIC computer program (REMOVAL) was developed to perform, in a VAX/VMS environment, all the calculations of the removal method for population-size estimation (a catch-effort method for closed populations with constant sampling effort). The program follows the maximum-likelihood methodology, checks the failure conditions, applies the appropriate formula, and displays the estimates of population size and catchability, with their standard deviations and coefficients of variation, together with two goodness-of-fit statistics and their significance levels. Data from removal experiments on the cyprinodontid fish Aphanius iberus in the Alt Empordà wetlands are used to illustrate the use of the program.
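For readers unfamiliar with the calculation the program automates, here is a minimal Python sketch (not the original BASIC code) of the constant-effort removal estimator: it maximizes the standard Zippin-type likelihood over the population size N, with the catchability p profiled out. The catch data in the example are invented.

import numpy as np
from scipy.special import gammaln

def removal_mle(catches):
    c = np.asarray(catches, dtype=float)
    k = len(c)
    T = int(c.sum())                    # total animals removed
    S = float(np.sum(np.arange(k) * c)) # sum over passes of (i-1)*c_i
    # Failure condition: if catches do not decline enough, the likelihood
    # increases without bound in N and no finite estimate exists.
    best_N, best_p, best_ll = None, None, -np.inf
    for N in range(T, 50 * T + 100):
        X = S + k * (N - T)             # pass-weighted "escapes"
        p = T / (T + X)                 # profile MLE of p for this N
        ll = (gammaln(N + 1) - gammaln(N - T + 1)
              + T * np.log(p) + X * np.log1p(-p))
        if ll > best_ll:
            best_N, best_p, best_ll = N, p, ll
    return best_N, best_p

# Example with declining catches over three passes:
N_hat, p_hat = removal_mle([65, 42, 23])
print(N_hat, round(p_hat, 3))   # point estimates of N and catchability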
Abstract:
We study the relationship between stable sampling sequences for bandlimited functions in $L^p(\mathbb{R}^n)$ and the Fourier multipliers in $L^p$. In the case that the sequence is a lattice and the spectrum is a fundamental domain for the lattice, the connection is complete. In the case of irregular sequences there is still a partial relationship.
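As background, and not taken from the abstract itself: a discrete set $\Lambda \subset \mathbb{R}^n$ is commonly called a stable sampling sequence for the space $PW_S^p$ of $L^p$ functions with spectrum in $S$ when a Plancherel–Pólya-type norm equivalence holds,

\[
  A \, \|f\|_{L^p(\mathbb{R}^n)}^p \;\le\; \sum_{\lambda \in \Lambda} |f(\lambda)|^p \;\le\; B \, \|f\|_{L^p(\mathbb{R}^n)}^p , \qquad f \in PW_S^p ,
\]

for constants $0 < A \le B < \infty$ independent of $f$.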
Abstract:
Real-time predictions are an indispensable requirement for traffic management, as they make it possible to evaluate the effects of the different available strategies or policies. Combining predictions of the state of the network with the evaluation of alternative traffic management strategies over a short-term horizon allows system managers to anticipate the effects of traffic control strategies and thus mitigate congestion. This paper presents the current framework of decision support systems for traffic management based on short- and medium-term predictions, and offers some reflections on their likely evolution in light of current scientific research, the growing availability of new types of data, and their associated methodologies.
Abstract:
Little is known about the water and ash content of Brazilian foodstuffs and plants. The relationships between fresh, dry and ash weight were determined in 40 different biological samples. These relationships can be an important tool when studying biological material containing low concentrations of chemical elements. This study set out to determine them and to establish the amount of biological material that needs to be collected, thereby supplying information that can be used to improve the detection limit, precision and accuracy of the analytical methodology employed.
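As an illustration of how such relationships translate into sampling guidance (all numbers invented), the Python sketch below computes how much fresh material to collect so that drying yields enough residue for the analysis:

# Illustrative sketch (values invented) of how fresh/dry weight ratios
# determine the amount of fresh material to collect: measuring an element
# near its detection limit requires enough dry (or ashed) residue.
def fresh_mass_needed(dry_mass_required_g, dry_fraction):
    """Fresh mass to collect so that drying yields the required dry mass."""
    return dry_mass_required_g / dry_fraction

# Suppose a sample is 12% dry matter and the method needs 2 g of dry material:
print(fresh_mass_needed(2.0, 0.12))  # ~16.7 g of fresh sample to collect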
Abstract:
The general methodology of classical trajectories, as applied to elementary chemical reactions of the A+BC type, is presented. The goal is to give students an understanding of the main theoretical features and of the potential of this versatile method for calculating the dynamical properties of reactive systems. Only the methodology for the two-dimensional (2D) case is described, from which the general 3D theory follows straightforwardly. The adopted point of view is, as far as possible, one that allows a direct translation of the concepts into a working program. An application to the reaction O(¹D)+H2 -> OH+H, of relevance in atmospheric chemistry, is also presented. The FORTRAN codes used are available through the web page www.qqesc.qui.uc.pt.
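The following Python sketch (not the authors' FORTRAN code; the toy potential, its parameters and the initial conditions are invented) shows the core of the 2D method: Hamilton's equations for a collinear A + BC collision, integrated numerically:

# Minimal classical-trajectory sketch for a collinear A + BC collision.
# Coordinates: R = A-to-BC-centre distance, r = B-C bond length.
import numpy as np
from scipy.integrate import solve_ivp

mu_R, mu_r = 10.0, 0.95          # reduced masses (arbitrary units)
De, a, r0 = 0.17, 1.0, 1.4       # Morse well for the BC bond
A_rep, b = 5.0, 1.8              # repulsive A--BC interaction

def V(R, r):
    morse = De * (1.0 - np.exp(-a * (r - r0)))**2
    return morse + A_rep * np.exp(-b * R)

def grad_V(R, r, h=1e-6):        # numerical gradient keeps the sketch short
    dVdR = (V(R + h, r) - V(R - h, r)) / (2 * h)
    dVdr = (V(R, r + h) - V(R, r - h)) / (2 * h)
    return dVdR, dVdr

def hamilton(t, y):              # dq/dt = p/mu, dp/dt = -dV/dq
    R, r, pR, pr = y
    dVdR, dVdr = grad_V(R, r)
    return [pR / mu_R, pr / mu_r, -dVdR, -dVdr]

# Start far out, moving inward, with the diatomic at its well minimum.
y0 = [12.0, r0, -1.5, 0.0]
sol = solve_ivp(hamilton, (0.0, 200.0), y0, max_step=0.05, rtol=1e-8)
print("final R, r:", sol.y[0, -1], sol.y[1, -1])  # did A bounce back?

Batches of such trajectories, sampled over initial conditions, are what yield the dynamical properties (cross sections, product distributions) the abstract refers to.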
Abstract:
The preparation of 2',3'-di-O-hexanoyluridine (2) by a Candida antarctica B lipase-catalysed alcoholysis of 2',3',5'-tri-O-hexanoyluridine (1) was optimised using an experimental design. At 25 °C, the improved experimental conditions increased the yield of 2 from 80% to 96%. In addition to the yield improvement, the reaction volume could be reduced by a factor of 5 and the reaction time significantly shortened.
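As a toy illustration of the experimental-design approach mentioned above (factors, levels and yields invented, not the paper's actual design), a two-level factorial with a linear-model fit looks like this in Python:

# Two-level factorial sketch: measure yield at each corner of the design,
# fit a linear model, and read off which factors to raise or lower.
import numpy as np

# Coded levels for factors x1 (e.g., enzyme load) and x2 (e.g., alcohol ratio)
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1]], dtype=float)
y = np.array([80.0, 86.0, 88.0, 96.0])        # yields (%) at each corner

A = np.column_stack([np.ones(4), X])           # intercept + main effects
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # positive effects suggest raising both factors improves yield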
Abstract:
Case-crossover is one of the most widely used designs for analyzing the health effects of air pollution; nevertheless, its application and methodology in this context had not previously been reviewed. Objective: We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of both methodology and application. Data sources and extraction: A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis: The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects and the remainder involved the design's application. In the methodological reports, the designs that yielded the best results in simulation were the symmetric bidirectional CCO and the time-stratified CCO. Furthermore, we observed an increase over time in the use of certain CCO designs, mainly the symmetric bidirectional and time-stratified variants. The dependent variables most frequently analyzed related to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions: The use of CCO designs has grown considerably; the most widely used designs were those that yielded better results in simulation studies, the symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method for analyzing variables at the individual level remain largely unexploited.
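As a minimal sketch of the time-stratified referent selection highlighted in the review (the example date is invented), each event day is matched to the other same-weekday days within its calendar month:

# Time-stratified case-crossover referent selection: for each event date,
# the referents are the other days of the same calendar month falling on
# the same weekday.
from datetime import date
import calendar

def time_stratified_referents(event_day: date):
    """All same-weekday days in the event's month, excluding the event day."""
    _, n_days = calendar.monthrange(event_day.year, event_day.month)
    days = (date(event_day.year, event_day.month, d)
            for d in range(1, n_days + 1))
    return [d for d in days
            if d.weekday() == event_day.weekday() and d != event_day]

event = date(2008, 3, 14)            # a hospital admission, say
print(time_stratified_referents(event))
# Exposure (e.g., PM10) on the event day is then compared with exposure on
# these referent days via conditional logistic regression.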
Abstract:
This thesis concentrates on developing a practical local-approach methodology, based on micromechanical models, for the analysis of ductile fracture of welded joints. Two major problems involved in the local approach have been studied in detail: the dilational constitutive relation reflecting the softening behaviour of the material, and the failure criterion associated with the constitutive equation.

Firstly, considerable effort was devoted to the numerical integration and computer implementation of the non-trivial dilational Gurson-Tvergaard model. Considering the weaknesses of the widely used Euler forward integration algorithms, a family of generalized mid-point algorithms is proposed for the Gurson-Tvergaard model. Correspondingly, based on the decomposition of stresses into hydrostatic and deviatoric parts, an explicit seven-parameter expression for the consistent tangent moduli of the algorithms is presented. This explicit formula avoids any matrix inversion during the numerical iteration, which greatly facilitates the computer implementation of the algorithms and increases the efficiency of the code. The accuracy of the proposed algorithms and of other conventional algorithms has been assessed in a systematic manner in order to identify the best algorithm for this study. The accurate and efficient performance of the present finite element implementation of the proposed algorithms has been demonstrated by various numerical examples. It was found that the true mid-point algorithm (α = 0.5) is the most accurate one when the deviatoric strain increment is radial to the yield surface, and that it is very important to use the consistent tangent moduli in the Newton iteration procedure.

Secondly, the consistency of current local failure criteria for ductile fracture has been assessed: the critical void growth criterion, the constant critical void volume fraction criterion, and Thomason's plastic limit-load failure criterion. Significant differences in the ductility predicted by the three criteria were found. By assuming that voids grow spherically and using the void volume fraction from the Gurson-Tvergaard model to calculate the current void-matrix geometry, Thomason's failure criterion has been modified and a new failure criterion for the Gurson-Tvergaard model is presented. Comparison with Koplik and Needleman's finite element results shows that the new failure criterion is indeed fairly accurate. A novel feature of the new failure criterion is that a mechanism for void coalescence is incorporated into the constitutive model; material failure is thus a natural result of the development of macroscopic plastic flow and of the microscopic internal-necking mechanism. Under the new failure criterion, the critical void volume fraction is not a material constant; the initial void volume fraction and/or the void nucleation parameters essentially control material failure. This feature is very desirable and makes the numerical calibration of the void nucleation parameter(s) possible and physically sound.

Thirdly, a local-approach methodology based on the above two major contributions has been built in ABAQUS via the user material subroutine UMAT and applied to welded T-joints. Using void nucleation parameters calibrated from simple smooth and notched specimens, it was found that the fracture behaviour of the welded T-joints can be predicted well with the present methodology. This application has shown how the damage parameters of both the base material and the heat-affected zone (HAZ) material can be obtained in a step-by-step manner, and how useful and capable the local-approach methodology is for analysing fracture behaviour and crack development, as well as for the structural integrity assessment of practical problems involving non-homogeneous materials. Finally, a procedure for the possible engineering application of the present methodology is suggested and discussed.
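For reference, the Gurson-Tvergaard yield function at the heart of the thesis has the standard form (generic notation, not copied from the thesis: $\sigma_{\mathrm{eq}}$ is the macroscopic von Mises stress, $\sigma_m$ the mean stress, $\bar{\sigma}$ the matrix flow stress, $f$ the void volume fraction, and $q_1, q_2, q_3$ Tvergaard's fitting parameters):

\[
  \Phi \;=\; \frac{\sigma_{\mathrm{eq}}^{2}}{\bar{\sigma}^{2}}
  \;+\; 2 q_{1} f \cosh\!\left( \frac{3 q_{2} \sigma_{m}}{2 \bar{\sigma}} \right)
  \;-\; 1 \;-\; q_{3} f^{2} \;=\; 0 .
\]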
Abstract:
In this paper, we present view-dependent, information-theoretic quality measures for pixel sampling and scene discretization in flatland. The measures are based on a definition of the mutual information of a line, and have a purely geometrical basis. Several algorithms that exploit them are presented, and they compare well with an existing measure based on depth differences.
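As background (the standard Shannon definition, not the paper's line-based variant): the mutual information between two discrete random variables $X$ and $Y$, on which such measures build, is

\[
  I(X;Y) \;=\; \sum_{x}\sum_{y} p(x,y)\, \log \frac{p(x,y)}{p(x)\, p(y)} .
\]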