970 results for SAMPLING METHODS
Abstract:
Cosmetic products are substances used to maintain or modify the appearance of the superficial parts of the human body (such as the skin, nails, or hair). In many African and Asian countries, and in some African immigrant communities, many women and sometimes men use products containing active agents such as mercury, hydroquinone, and clobetasol propionate to lighten their skin. These main agents are toxic, and their presence in cosmetics is regulated, or even banned, in several countries. In our study, we determined the concentrations of these ingredients in several products used in West Africa and in Canada. We also explored the effect of these products on the skin microbiome. Our results reveal that 68 to 84% of creams and 7.5 to 65% of soaps exceed the standards when the bans on mercury, hydroquinone, and clobetasol propionate are taken into account, and that the concentrations declared on the labels are often unreliable. According to Shannon diversity, there appears to be more evenness, and therefore less dominance, in the group of women using skin-lightening creams than in the group of women who do not use them. Moreover, we found no significant differences in the skin microbiome between the cream and non-cream groups at the phylum and genus levels. However, more in-depth methods with more extensive sampling could reveal the effect of these products on the skin microbiome at finer scales (species, strains, etc.).
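The Shannon-diversity comparison described above can be sketched in a few lines; the abundance vectors below are illustrative stand-ins, not the study's microbiome data.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) over nonzero taxa."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def pielou_evenness(counts):
    """Pielou's evenness J' = H' / ln(S), where S is taxon richness."""
    counts = np.asarray(counts, dtype=float)
    s = np.count_nonzero(counts)
    return shannon_diversity(counts) / np.log(s)

# Illustrative abundance profiles (not the study's data):
# one community dominated by a single taxon, one more even.
dominated = [90, 4, 3, 2, 1]
even = [22, 20, 20, 19, 19]

print(pielou_evenness(dominated))  # lower evenness -> more dominance
print(pielou_evenness(even))       # closer to 1 -> less dominance
```

Higher evenness in the cream-user group, as reported above, corresponds to the second profile's pattern.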
Abstract:
The project goal was to determine plant operations and maintenance workers' level of exposure to mercury during routine and non-routine (i.e. turnarounds and inspections) maintenance events in eight gas processing plants. The project team prepared sampling and analysis plans tailored to each plant's process design and scheduled maintenance events. Occupational exposure sampling and monitoring efforts focused on the measurement of mercury vapor concentration in worker breathing-zone air during specific maintenance events, including pipe scraping, process filter replacement, and process vessel inspection. Similar exposure groups were identified, and worker breathing-zone and ambient air samples were collected and analyzed for total mercury. Occupational exposure measurement techniques included portable field monitoring instruments, standard passive and active monitoring methods, and an emerging passive absorption technology. Process sampling campaigns focused on inlet gas streams, mercury removal unit outlets, treated gas, acid gas, and sales gas. The results were used to identify process areas with increased potential for mercury exposure during maintenance events. Sampling methods used for the determination of total mercury in gas-phase streams were based on USEPA Method 30B, EPA Method 1631, and EPA Method 1669. The results of four six-week sampling campaigns have been evaluated, and conclusions and recommendations have been made. The author's role in this project included directing all field phases of the project and developing and implementing the sampling strategy. Additionally, the author participated in the development and implementation of the Quality Assurance Project Plan, Data Quality Objectives, and Similar Exposure Groups identification. All field-generated data were reviewed by the author, along with laboratory reports, in order to generate conclusions and recommendations.
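Task-based breathing-zone results like those above are typically rolled up into an 8-hour time-weighted average (TWA); a minimal sketch follows. The durations and concentrations are illustrative, not project data, and the 0.025 mg/m3 ACGIH TLV-TWA for elemental mercury vapor is quoted only as a common comparison point.

```python
# Minimal sketch of an 8-hour time-weighted average (TWA) from task-based
# breathing-zone samples; durations/concentrations are illustrative only.

def twa_8h(samples):
    """samples: list of (duration_hours, concentration_mg_per_m3).
    Unsampled time up to 8 h is assumed to be zero exposure."""
    exposure = sum(t * c for t, c in samples)
    return exposure / 8.0

# Illustrative maintenance-day samples: (hours, mg/m3)
day = [(1.5, 0.040),   # process filter replacement
       (2.0, 0.010),   # vessel inspection
       (4.5, 0.002)]   # routine work

twa = twa_8h(day)
print(round(twa, 4))    # -> 0.0111
# Compare against a limit, e.g. the ACGIH TLV-TWA of 0.025 mg/m3
print(twa < 0.025)
```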
Abstract:
Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and, as a consequence, spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory data collected by different sampling methods to generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for a part of the Lake States and species-specific site index maps for the entire Lake States. Targeting small-area application of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way.
The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
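A minimal sketch of an RF-kNN-style imputation follows, assuming proximity between a target pixel and a reference plot is measured by the fraction of trees in which they fall in the same terminal node; the data, predictors, and choice of k are illustrative, not those of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Illustrative data: X = remotely sensed predictors at plots, y = field biomass
X_ref = rng.normal(size=(200, 5))
y_ref = X_ref[:, 0] * 10 + X_ref[:, 1] * 5 + rng.normal(size=200)
X_target = rng.normal(size=(50, 5))     # pixels to impute

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_ref, y_ref)

# RF proximity: fraction of trees in which two observations share a leaf
leaves_ref = rf.apply(X_ref)            # (n_ref, n_trees)
leaves_tgt = rf.apply(X_target)         # (n_tgt, n_trees)
prox = (leaves_tgt[:, None, :] == leaves_ref[None, :, :]).mean(axis=2)

# kNN imputation: average the field values of the k most proximate plots
k = 5
nn = np.argsort(-prox, axis=1)[:, :k]
y_imputed = y_ref[nn].mean(axis=1)
print(y_imputed.shape)  # (50,)
```

Because the imputed value is always an average of observed plot values, predictions stay within the range of the field data, one reason imputation approaches are attractive for operational inventories.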
Abstract:
The need to understand a population drives a process of collecting and analyzing information. It is usually very difficult or impossible to study the entire population, hence the importance of studies based on samples. Designing a sampling study is a complex process, from before the data are collected through to their analysis. Most studies combine several probabilistic sampling methods to select a sample intended to be representative of the population; this is known as a complex sampling design. Knowledge of sampling errors is necessary for the correct interpretation of survey results and for the evaluation of sampling plans. For complex samples, variance estimation uses approximations adjusted to the complex nature of the sample design, the most widely used being the Taylor linearization method and resampling and replication techniques. The main objective of this work, "Evaluation of variance estimation methods in complex samples", is to evaluate the performance of the usual variance estimators in complex samples. Inspired by a set of real data, three populations with distinct characteristics were generated, from which samples were drawn under different sampling designs, in the expectation of obtaining some indication of the situations in which each variance estimator should be preferred. Based on the results obtained, we can conclude that the performance of the Taylor, jackknife, and bootstrap estimators of the variance of the sample mean varies with the type of design and population. In general, the bootstrap estimator is the least precise, and in stratified designs the Taylor and jackknife estimators give the same results.
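The three variance estimators can be illustrated for the simplest complex design, a stratified mean; the data below are synthetic. For this estimator the delete-one jackknife reproduces the Taylor (exact design-based) variance in stratified designs, while the bootstrap only approximates it.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stratified sample: two strata with known population shares W_h
strata = {
    "A": {"W": 0.6, "y": rng.normal(10, 2, size=30)},
    "B": {"W": 0.4, "y": rng.normal(20, 5, size=20)},
}

def strat_mean(s):
    return sum(d["W"] * d["y"].mean() for d in s.values())

# Taylor linearization (here: the exact design-based variance of the mean)
def var_taylor(s):
    return sum(d["W"]**2 * d["y"].var(ddof=1) / len(d["y"]) for d in s.values())

# Stratified delete-one jackknife
def var_jackknife(s):
    v, theta = 0.0, strat_mean(s)
    for h, d in s.items():
        n = len(d["y"])
        others = sum(e["W"] * e["y"].mean() for g, e in s.items() if g != h)
        reps = [others + d["W"] * np.delete(d["y"], i).mean() for i in range(n)]
        v += (n - 1) / n * sum((r - theta)**2 for r in reps)
    return v

# Stratified bootstrap: resample with replacement within each stratum
def var_bootstrap(s, B=500):
    reps = [sum(d["W"] * rng.choice(d["y"], size=len(d["y"])).mean()
                for d in s.values()) for _ in range(B)]
    return np.var(reps, ddof=1)

print(var_taylor(strata), var_jackknife(strata), var_bootstrap(strata))
```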
Abstract:
The purpose of this study was to implement research methodologies and assess the effectiveness and impact of management tools to promote best practices for the long-term conservation of the endangered African wild dog (Lycaon pictus). Different methods were included in the project framework to investigate and expand the applicability of these methodologies to free-ranging African wild dogs in the southern African region: ethology, behavioural endocrinology, and ecology field methodologies were tested and implemented. Additionally, research was performed to test the effectiveness and implications of a contraceptive implant (Suprelorin) as a management tool for a subpopulation of the species hosted in fenced areas. Attention was especially given to the social structure and survival of treated packs. This research provides useful tools and advances the applicability of these methods for field studies, standardizing and improving research instruments in the fields of conservation biology and behavioural endocrinology. The results reported here provide effective methodologies to expand the applicability of non-invasive endocrine assessment to previously inaccessible fields, and a validation of sampling methods for faecal hormone analysis. The final aim was to fill a knowledge gap on the behaviours of the species, provide common ground for future researchers applying non-invasive methods to research on this species, and test the effectiveness of contraception in a managed metapopulation.
Abstract:
The cation chloride cotransporters (CCCs) represent a vital family of ion transporters, with several members implicated in significant neurological disorders. Specifically, conditions such as cerebrospinal fluid accumulation, epilepsy, Down’s syndrome, Asperger’s syndrome, and certain cancers have been attributed to various CCCs. This thesis delves into these pharmacological targets using advanced computational methodologies. I primarily employed GPU-accelerated all-atom molecular dynamics simulations, deep learning-based collective variables, enhanced sampling methods, and custom Python scripts for comprehensive simulation analyses. Our research predominantly centered on KCC1 and NKCC1 transporters. For KCC1, I examined its equilibrium dynamics in the presence/absence of an inhibitor and assessed the functional implications of different ion loading states. In contrast, our work on NKCC1 revealed its unique alternating access mechanism, termed the rocking-bundle mechanism. I identified a previously unobserved occluded state and demonstrated the transporter's potential for water permeability under specific conditions. Furthermore, I confirmed the actual water flow through its permeable states. In essence, this thesis leverages cutting-edge computational techniques to deepen our understanding of the CCCs, a family of ion transporters with profound clinical significance.
Abstract:
Quantitative Susceptibility Mapping (QSM) is an advanced magnetic resonance technique that can quantify in vivo biomarkers of pathology, such as alterations in iron and myelin concentration. It allows for the comparison of magnetic susceptibility properties within and between different subject groups. In this thesis, the QSM acquisition and processing pipeline is discussed, together with clinical and methodological applications of QSM to neurodegeneration. In designing the studies, significant emphasis was placed on the reproducibility and interpretability of results. The first project focuses on the investigation of cortical regions in amyotrophic lateral sclerosis. By examining various histogram susceptibility properties, a pattern of increased iron content was revealed in patients with amyotrophic lateral sclerosis compared to controls and other neurodegenerative disorders. Moreover, there was a correlation between susceptibility and upper motor neuron impairment, particularly in patients experiencing rapid disease progression. Similarly, in the second application, QSM was used to examine cortical and sub-cortical areas in individuals with myotonic dystrophy type 1. The thalamus and brainstem were identified as structures of interest, with relevant correlations with clinical and laboratory data such as neurological evaluations and sleep records. In the third project, a robust pipeline for assessing the reliability of radiomic susceptibility-based features was implemented within a cohort of patients with multiple sclerosis and healthy controls. Lastly, a deep learning super-resolution model was applied to QSM images of healthy controls. The employed model demonstrated excellent generalization abilities and outperformed traditional up-sampling methods, without requiring customized re-training.
Across the three disorders investigated, it was evident that QSM is capable of distinguishing between patient groups and healthy controls while establishing correlations between imaging measurements and clinical data. These studies lay the foundation for future research, with the ultimate goal of achieving earlier and less invasive diagnoses of neurodegenerative disorders within the context of personalized medicine.
Abstract:
Despite the success of the ΛCDM model in describing the Universe, a possible tension between early- and late-Universe cosmological measurements is calling for new independent cosmological probes. Amongst the most promising ones, gravitational waves (GWs) can provide a self-calibrated measurement of the luminosity distance. However, to obtain cosmological constraints, additional information is needed to break the degeneracy between parameters in the gravitational waveform. In this thesis, we exploit the latest LIGO-Virgo-KAGRA Gravitational Wave Transient Catalog (GWTC-3) of GW sources to constrain the background cosmological parameters together with the astrophysical properties of Binary Black Holes (BBHs), using information from their mass distribution. We expand the public code MGCosmoPop, previously used for the application of this technique, by implementing a state-of-the-art model for the mass distribution, needed to account for the presence of non-trivial features, i.e. a truncated power law with two additional Gaussian peaks, referred to as Multipeak. We then analyse GWTC-3, comparing this model with simpler and more commonly adopted ones, both with fixed and with varying cosmology, and assess their goodness-of-fit with different model selection criteria, as well as their constraining power on the cosmological and population parameters. We also begin to explore different sampling methods, namely Markov Chain Monte Carlo and Nested Sampling, comparing their performance and evaluating the advantages of each. We find concurring evidence that the Multipeak model is favoured by the data, in line with previous results, and show that this conclusion is robust to variation of the cosmological parameters. We find a constraint on the Hubble constant of H0 = 61.10 (+38.65/−22.43) km/s/Mpc (68% C.L.), which shows the potential of this method for providing independent constraints on cosmological parameters. The results obtained in this work have been included in [1].
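The Multipeak parameterization referred to above (truncated power law plus two Gaussian peaks) can be sketched as a mixture density; all parameter values below are illustrative placeholders, not GWTC-3 fits, and the smoothing and mass-pairing terms of the full population model are omitted.

```python
import numpy as np

# Hedged sketch of a "Multipeak"-style primary-mass model: a truncated
# power law plus two Gaussian peaks, combined with mixture weights w1, w2.
def multipeak_pdf(m, alpha=3.4, m_min=5.0, m_max=87.0,
                  mu1=10.0, s1=1.5, w1=0.05,
                  mu2=34.0, s2=3.0, w2=0.03):
    m = np.asarray(m, dtype=float)
    # normalized truncated power law m^(-alpha) on [m_min, m_max]
    norm = (m_min**(1 - alpha) - m_max**(1 - alpha)) / (alpha - 1)
    pl = np.where((m >= m_min) & (m <= m_max), m**(-alpha) / norm, 0.0)
    g1 = np.exp(-0.5 * ((m - mu1) / s1)**2) / (s1 * np.sqrt(2 * np.pi))
    g2 = np.exp(-0.5 * ((m - mu2) / s2)**2) / (s2 * np.sqrt(2 * np.pi))
    return (1 - w1 - w2) * pl + w1 * g1 + w2 * g2

grid = np.linspace(5.0, 87.0, 2000)
p = multipeak_pdf(grid)
print((p * (grid[1] - grid[0])).sum())  # integrates to approximately 1
```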
Abstract:
The role of land cover change as a significant component of global change has become increasingly recognized in recent decades. Large databases measuring land cover change, and the data which can potentially be used to explain the observed changes, are also becoming more commonly available. When developing statistical models to investigate observed changes, it is important to be aware that the chosen sampling strategy and modelling techniques can influence results. We present a comparison of three sampling strategies and two forms of grouped logistic regression models (multinomial and ordinal) in the investigation of patterns of successional change after agricultural land abandonment in Switzerland. Results indicated that both ordinal and nominal transitional change occurs in the landscape and that the use of different sampling regimes and modelling techniques as investigative tools yields different results. Synthesis and applications. Our multimodel inference successfully identified a set of consistently selected indicators of land cover change, which can be used to predict further change, including annual average temperature, the number of already overgrown neighbouring areas of land, and distance to historically destructive avalanche sites. This allows for more reliable decision making and planning with respect to landscape management. Although both model approaches gave similar results, ordinal regression yielded more parsimonious models that identified the important predictors of land cover change more efficiently. Thus, this approach is favourable where the land cover change pattern can be interpreted as an ordinal process. Otherwise, multinomial logistic regression is a viable alternative.
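The contrast between the two grouped logistic models can be sketched on synthetic data; the cumulative-logit construction below (K−1 separate binary fits that are then differenced) is a crude stand-in for a proper proportional-odds model, and all variables are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic ordinal outcome: three successional stages driven by one predictor
n = 400
x = rng.normal(size=(n, 1))
latent = x[:, 0] + rng.logistic(size=n)
y = np.digitize(latent, [-0.5, 0.8])        # stages 0 < 1 < 2

# Multinomial logistic regression ignores the ordering of the stages
multi = LogisticRegression().fit(x, y)

# A simple ordinal (cumulative-logit) alternative: fit K-1 binary models
# for P(Y <= k) and difference them to get category probabilities.
def ordinal_proba(x_new):
    cum = [LogisticRegression().fit(x, (y <= k).astype(int))
           .predict_proba(x_new)[:, 1] for k in (0, 1)]
    p0 = cum[0]
    p1 = np.clip(cum[1] - cum[0], 0, 1)
    p2 = 1 - np.clip(cum[1], 0, 1)
    probs = np.column_stack([p0, p1, p2])
    return probs / probs.sum(axis=1, keepdims=True)

x_new = np.array([[-2.0], [0.0], [2.0]])
print(multi.predict_proba(x_new).round(2))
print(ordinal_proba(x_new).round(2))
```

The multinomial model estimates a separate coefficient vector per stage, while the cumulative construction shares the ordering, which is the source of the parsimony noted above.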
Abstract:
When rare is just a matter of sampling: Unexpected dominance of clubtail dragonflies (Odonata, Gomphidae) through different collecting methods at Parque Nacional da Serra do Cipó, Minas Gerais State, Brazil. The capture of adult dragonflies during two short expeditions to Parque Nacional da Serra do Cipó, Minas Gerais State, using three distinct collecting methods (aerial nets, Malaise traps, and light sheet traps) is reported. The results are outstanding due to the high number of Gomphidae species (7 of the 26 Odonata species), including a new species of Cyanogomphus Selys, 1873, obtained by the two non-traditional collecting methods. Because active collecting with aerial nets is the standard approach for dragonfly inventories, we discuss some aspects of the use of traps, comparing our results with those in the literature, and suggest that traps should be used as complementary methods in faunistic studies. Furthermore, Zonophora campanulata annulata Belle, 1983 is recorded for the first time from Minas Gerais State, and taxonomic notes on Phyllogomphoides regularis (Selys, 1873) and Progomphus complicatus Selys, 1854 are also given.
Abstract:
The aim of this study was to compare two methods of tear sampling for protein quantification. Tear samples were collected from 29 healthy dogs (58 eyes) using Schirmer tear test (STT) strips and microcapillary tubes. The samples were frozen at -80ºC and analyzed by the Bradford method. Results were analyzed by Student's t test. The average protein concentration and standard deviation of tears collected with microcapillary tubes were 4.45 mg/mL ±0.35 and 4.52 mg/mL ±0.29 for the right and left eyes, respectively. The average protein concentration and standard deviation of tears collected with STT strips were 54.5 mg/mL ±0.63 and 54.15 mg/mL ±0.65 for the right and left eyes, respectively. Statistically significant differences (p<0.001) were found between the methods. Under the conditions of this study, the average protein concentration obtained with the Bradford test from tear samples collected with STT strips was higher than that obtained with microcapillary tubes. Reference values for tear protein concentration should therefore be interpreted according to the method used to collect the tear samples.
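The reported comparison rests on a two-sample Student's t test, which can be sketched as follows; the values below are synthetic draws matching the reported magnitudes, not the study's raw measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Illustrative tear-protein concentrations (mg/mL): synthetic samples with
# means and SDs matching the magnitudes reported above, not the raw data.
microcapillary = rng.normal(4.45, 0.35, size=29)
stt_strip = rng.normal(54.5, 0.63, size=29)

t, p = stats.ttest_ind(microcapillary, stt_strip)
print(p < 0.001)  # the two sampling methods differ significantly
```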
Abstract:
The goal of this review is to provide a state-of-the-art survey of sampling and probe methods for the solution of inverse problems. Further, a configuration approach to some of the problems is presented. We study the concepts and analytical results for several recent sampling and probe methods. We give an introduction to the basic idea behind each method using a simple model problem and then provide a general formulation in terms of particular configurations to study the range of the arguments which are used to set up the method. This provides a novel way to present the algorithms and the analytic arguments for their investigation in a variety of different settings. In detail, we investigate the probe method (Ikehata), the linear sampling method (Colton-Kirsch), the factorization method (Kirsch), the singular sources method (Potthast), the no response test (Luke-Potthast), the range test (Kusiak, Potthast and Sylvester), and the enclosure method (Ikehata) for the solution of inverse acoustic and electromagnetic scattering problems. The main ideas, approaches, and convergence results of the methods are presented, and for each method we provide a historical survey of applications to different situations.
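The linear algebra at the core of the linear sampling method can be sketched on a synthetic operator; the matrix F below is a random matrix with rapidly decaying spectrum standing in for a discretized far-field operator, so this shows only the regularized-solve and indicator structure, not a real scattering computation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Structural sketch of the linear sampling method's linear algebra only:
# for each sampling point z, solve the regularized far-field equation
# (alpha*I + F^T F) g_z = F^T r_z and use 1/||g_z|| as an indicator.
n = 64
U, _ = np.linalg.qr(rng.normal(size=(n, n)))
V, _ = np.linalg.qr(rng.normal(size=(n, n)))
F = U @ np.diag(np.exp(-np.arange(n) / 4.0)) @ V.T   # decaying spectrum

def indicator(r_z, alpha=1e-12):
    g = np.linalg.solve(alpha * np.eye(n) + F.T @ F, F.T @ r_z)
    return 1.0 / np.linalg.norm(g)

# A right-hand side well inside the range of F keeps ||g|| moderate (large
# indicator); one concentrated on the smallest singular direction makes
# ||g|| blow up (small indicator), which is how the scatterer is outlined.
inside = F @ rng.normal(size=n)
outside = U[:, -1]
print(indicator(inside) > indicator(outside))
```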
Abstract:
The high computational cost of calculating the radiative heating rates in numerical weather prediction (NWP) and climate models requires that calculations are made infrequently, leading to poor sampling of the fast-changing cloud field and a poor representation of the feedback that would occur. This paper presents two related schemes for improving the temporal sampling of the cloud field. Firstly, the ‘split time-stepping’ scheme takes advantage of the independent nature of the monochromatic calculations of the ‘correlated-k’ method to split the calculation into gaseous absorption terms that are highly dependent on changes in cloud (the optically thin terms) and those that are not (optically thick). The small number of optically thin terms can then be calculated more often to capture changes in the grey absorption and scattering associated with cloud droplets and ice crystals. Secondly, the ‘incremental time-stepping’ scheme uses a simple radiative transfer calculation using only one or two monochromatic calculations representing the optically thin part of the atmospheric spectrum. These are found to be sufficient to represent the heating rate increments caused by changes in the cloud field, which can then be added to the last full calculation of the radiation code. We test these schemes in an operational forecast model configuration and find a significant improvement is achieved, for a small computational cost, over the current scheme employed at the Met Office. The ‘incremental time-stepping’ scheme is recommended for operational use, along with a new scheme to correct the surface fluxes for the change in solar zenith angle between radiation calculations.
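The 'incremental time-stepping' idea can be sketched with stand-in physics: a full calculation every N steps, with a cheap optically-thin proxy supplying increments in between. All functions and numbers below are illustrative, not the Met Office scheme itself.

```python
import numpy as np

# Toy sketch of the 'incremental time-stepping' idea: a full (expensive)
# heating-rate calculation every n_full steps, with a cheap optically-thin
# proxy supplying increments in between. All physics here is stand-in.

def full_heating(cloud):          # expensive full-spectrum calculation
    return -2.0 * cloud + 0.5     # stand-in radiative response

def thin_heating(cloud):          # cheap optically-thin proxy
    return -1.8 * cloud + 0.4     # approximates the cloud-driven part

def run(clouds, n_full=6):
    rates, last_full, last_thin = [], None, None
    for step, cloud in enumerate(clouds):
        if step % n_full == 0:
            last_full = full_heating(cloud)
            last_thin = thin_heating(cloud)
            rates.append(last_full)
        else:
            # add the thin-term increment to the last full calculation
            rates.append(last_full + thin_heating(cloud) - last_thin)
    return np.array(rates)

clouds = np.sin(np.linspace(0, 3, 24)) * 0.3 + 0.5   # evolving cloud field
approx = run(clouds)
exact = np.array([full_heating(c) for c in clouds])
stale = np.repeat(exact[::6], 6)   # baseline: no increments between calls
print(np.abs(approx - exact).max() < np.abs(stale - exact).max())
```

The increment tracks the cloud-driven part of the heating rate between full calls, so the error relative to recomputing everything every step is much smaller than simply holding the last full calculation constant.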