944 results for Quantitative methods


Relevance: 60.00%

Abstract:

Across the nation, librarians work with caregivers and children to encourage engagement in their early literacy programs. However, these early literacy programs that libraries provide have been left mostly undocumented by research, especially through quantitative methods. Valuable Initiatives in Early Learning that Work Successfully (VIEWS2) was designed to test new ways to measure the effectiveness of these early literacy programs for young children (birth to kindergarten), leveraging a mixed methods, quasi-experimental design. Using two innovative tools, researchers collected data at 120 public library storytimes in the first year of research, observing approximately 1,440 children ranging from birth to 60 months of age. Analysis of year-one data showed a correlation between the early literacy content of the storytime program and children’s outcomes in terms of early literacy behaviors. These findings demonstrate that young children who attend public library storytimes are responding to the early literacy content in the storytime programs.

Relevance: 60.00%

Abstract:

The purpose of this case study is to report on the use of learning journals as a strategy to encourage critical reflection in the field of graphic design. Very little empirical research has been published regarding the use of critical reflection in learning journals in this field. Furthermore, nothing has been documented at the college level. To that end, the goal of this research endeavor was to investigate whether second-year students in the NewMedia and Publication Design Program at a small Anglophone CEGEP in Québec, enrolled in a Page Layout and Design course, learn more deeply by reflecting in action during design projects or reflecting on action after completing design projects. Secondarily, indications of a possible change in self-efficacy were examined. Two hypotheses were posited: 1) reflection-on-action journaling will promote a deeper approach to learning than reflection-in-action journaling, and 2) the level of self-efficacy in graphic design improves as students are encouraged to think reflectively. A mixed methods approach, combining qualitative and quantitative methods, was used to collect and analyze the data. Content analysis of journal entries and interview responses was the primary method used to address the first hypothesis. Students were required to journal twice for each of three projects, once during the project and again one week after the project had been submitted. In addition, data regarding the students' perception of journaling were obtained by administering a survey and conducting interviews. For the second hypothesis, quantitative data were collected through two surveys, one administered early in the Fall 2011 semester and the second early in the Winter 2012 semester. Supplementary data regarding self-efficacy were obtained through content analysis of journal entries and interviews. Coded journal entries firmly supported the hypothesis that reflection-on-action journaling promotes deep learning. 
Using a taxonomy developed by Kember et al. (1999) wherein "critical reflection" is considered the highest level of reflection, it was found that only 5% of the coded responses in the reflection-in-action journals were deemed of the highest level, whereas 39% were considered critical reflection in the reflection-on-action journals. The findings from the interviews suggest that students had some initial concerns about the value of journaling, but these concerns were later dismissed as students learned that journaling was a valuable tool that helped them reflect and learn. All participants indicated that journaling changed their learning processes as they thought much more about what they were doing while they were doing it. They were taking the learning they had acquired and thinking about how they would apply it to new projects; this is critical reflection. The survey findings did not support the conclusive results of the comparison of journal instruments, where an increase of 35% in critical reflection was noted in the reflection-on-action journals. In Chapter 5, reasons for this incongruence are explored. Furthermore, based on the journals, surveys, and interviews, there is not enough evidence at this time to support the hypothesis that self-efficacy improves when students are encouraged to think reflectively. It could be hypothesized, however, that one's self-efficacy does not change in such a short period of time. In conclusion, the findings established in this case study make a practical contribution to the literature concerning the promotion of deep learning in the field of graphic design, as this researcher's hypothesis was supported that reflection-on-action journaling promoted deeper learning than reflection-in-action journaling. When examining the increases in critical reflection from reflection-in-action to the reflection-on-action journals, it was found that all students but one showed an increase in critical reflection in reflection-on-action journals. 
It is therefore recommended that production-oriented program instructors consider integrating reflection-on-action journaling into their courses where projects are given.

Relevance: 60.00%

Abstract:

This study presents a review of new instruments for the impact assessment of libraries and a case study of the impact evaluation of the Library of the Faculty of Science, University of Porto (FCUP), from the students' point of view. We conducted mixed methods research, combining qualitative data, which describe characteristics and in particular human actions, with quantitative data, represented by numbers indicating exact amounts that can be statistically manipulated. Applying the International Standard ISO 16439:2014(E) - Information and documentation - Methods and procedures for assessing the impact of libraries, we collected 20 opinion texts from students of different nationalities, published in «Notícias da Biblioteca» from January 2013 to December 2014, and conducted seven interviews.

Relevance: 60.00%

Abstract:

First of a series of interviews with researchers who have made outstanding contributions to the history of Costa Rica. Héctor Pérez speaks on the use of quantitative methods in history, with special reference to Costa Rica.

Relevance: 60.00%

Abstract:

Pesticide application has been described by many researchers as a very inefficient process; in some cases, reportedly only 0.02% of the applied product contributes to effective control of the target problem. The main factor influencing pesticide application is the droplet size formed at the spray nozzle. Many parameters affect the dynamics of the droplets, such as wind, temperature, and relative humidity. Small droplets are biologically more active but are prone to evaporation and drift; large droplets, on the other hand, do not distribute the product well over the target. Given the risk of contaminating non-target areas and the high costs involved in application, knowledge of droplet size is therefore of fundamental importance in application technology. When sophisticated droplet-analysis technology is unavailable, it is common to sample droplets on artificial targets such as water-sensitive paper. In field sampling, water-sensitive papers are placed on the plots where the product will be applied; when droplets impinge on the paper, its yellow surface is stained dark blue, making the droplets easy to recognize. The droplets collected on these papers have a range of sizes, so determining the droplet size distribution yields the mass distribution of the material and hence the efficiency of product application. The stains produced by the droplets show a spread factor proportional to their respective initial sizes. One methodology for analysing the droplets is counting and measuring them under a microscope: a Porton N-G12 graticule, which shows equally spaced class intervals in a geometric progression of ratio √2, is coupled to the microscope lens. The droplet size parameters most frequently used are the Volume Median Diameter (VMD) and the Number Median Diameter (NMD). The VMD is the diameter that divides a representative droplet sample into two parts of equal volume, one containing droplets smaller than the VMD and the other droplets larger than it. The NMD is obtained analogously, dividing the sample into two equal parts with respect to the number of droplets. The ratio between VMD and NMD allows the uniformity of the droplets to be evaluated. The cumulative probabilities of droplet volume and number are then plotted on log-scale paper (cumulative probability versus the median diameter of each size class); each graph provides the median diameter at the x-axis point corresponding to the 50% value on the y-axis. This whole process is very slow and subject to operator error. To reduce the difficulty involved in measuring droplets, a numerical model was therefore developed, implemented in an easy and accessible computational language, which yields approximate VMD and NMD values with good precision. The inputs to this model are the frequencies of the droplet sizes collected on the water-sensitive paper, observed through the Porton N-G12 graticule fitted to the microscope. With these data, the cumulative distributions of droplet volume and size are evaluated, and the graphs obtained by plotting these distributions allow the VMD and NMD to be obtained by linear interpolation, since the middle portions of the curves are approximately linear. These values are essential for evaluating droplet uniformity and for estimating the volume deposited on the observed paper from the droplet density (droplets/cm²). This methodology for estimating droplet volume was developed by Project 11.0.94.224 of CNPMA/EMBRAPA. Observed data from aerial herbicide spraying samples, carried out by the Project in the county of Pelotas/RS, were used to compare the values obtained by the manual graphical method with those obtained by the model. The model reproduced the VMD and NMD values for each sampled collector with great precision, allowing the quantity of deposited product and, by consequence, the quantity lost to drift to be estimated. The plots of the variability of VMD and NMD showed that the number of droplets reaching the collectors had a small dispersion, while the deposited volume showed a wide range of variation, probably because of the strong action of air turbulence on droplet distribution, emphasizing the need for deeper study to verify this influence on drift.
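The interpolation step described above can be sketched in code. The following is a minimal illustration with made-up class midpoints and counts (not data from the study): it builds cumulative number and volume distributions (droplet volume taken proportional to d³) and reads off the NMD and VMD by linear interpolation at the 50% point.

```python
import numpy as np

def median_diameter(diams, weights):
    """Diameter at which the cumulative weight fraction reaches 50%."""
    d = np.asarray(diams, dtype=float)
    w = np.asarray(weights, dtype=float)
    order = np.argsort(d)
    d, w = d[order], w[order]
    cum = np.cumsum(w) / w.sum()          # cumulative fraction, increasing
    return float(np.interp(0.5, cum, d))  # linear interpolation at 50%

def vmd_nmd(diams, counts):
    """Volume and number median diameters from per-class droplet counts."""
    d = np.asarray(diams, dtype=float)
    n = np.asarray(counts, dtype=float)
    nmd = median_diameter(d, n)           # number-weighted median
    vmd = median_diameter(d, n * d ** 3)  # droplet volume scales as d^3
    return vmd, nmd

# Illustrative class midpoints (micrometres) and counts per class, as might
# be read from a Porton N-G12 graticule -- not data from the study.
diams = [50, 100, 150, 200, 300]
counts = [40, 80, 60, 30, 10]
vmd, nmd = vmd_nmd(diams, counts)
print(vmd, nmd, vmd / nmd)  # the VMD/NMD ratio gauges spray uniformity
```

Because volume weighting emphasizes the largest droplets, the VMD always sits at or above the NMD; a ratio near 1 indicates a uniform spray.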

Relevance: 60.00%

Abstract:

This PhD thesis explores the ecological responses of bird species to glacial-interglacial transitions during the late Quaternary in the Western Palearctic, using multiple approaches at different scales and highlighting the importance of the bird fossil record and of quantitative methods for elucidating biotic trends in relation to long-term climate changes. The taxonomic and taphonomic analyses of the avian fossil assemblages from four Italian Middle and Upper Pleistocene sedimentary successions (Grotta del Cavallo, Grotta di Fumane, Grotta di Castelcivita, and Grotta di Uluzzo C) allowed us to reconstruct local-scale patterns in birds' response to climate changes. These bird assemblages are characterized by the presence of temperate species and by the occasional presence of cold-dwelling species during glacials, related to range shifts. These local patterns are supported by those identified at the continental scale: I mapped the present-day and LGM climatic envelopes of species with different climatic requirements, and the results show substantial stability in the range of temperate species and pronounced changes in the range of cold-dwelling species, supported by their fossil records. The responses to climate oscillations are therefore highly related to the thermal niches of the investigated species. I also clarified the dynamics of the presence of boreal and arctic bird species in Mediterranean Europe, due to southward range shifts, during the glacial phases. After a reassessment of the reliability of the existing fossil evidence, I show that this phenomenon is not as common as previously thought, with important implications for the paleoclimatic and paleoenvironmental significance of the targeted species. I was also able to explore the potential of multivariate and rarefaction methods in the analyses of avian fossils from Grotta del Cavallo. These approaches helped to delineate the main drivers of taphonomic damage and the dynamics of species diversity in relation to climate-driven paleoenvironmental changes.
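The rarefaction methods mentioned above estimate the species richness expected in a subsample of fixed size, allowing assemblages of different sizes to be compared on an equal footing. A minimal sketch of individual-based rarefaction (Hurlbert's formula), using hypothetical bone counts rather than data from the thesis:

```python
from math import comb

def rarefied_richness(counts, n):
    """Expected number of species in a random subsample of n individuals
    (individual-based rarefaction, Hurlbert 1971): each species i with
    N_i individuals contributes 1 - C(N - N_i, n) / C(N, n)."""
    N = sum(counts)
    return sum(1 - comb(N - Ni, n) / comb(N, n) for Ni in counts)

# Hypothetical bird-bone counts per species from one stratigraphic layer
# (illustrative only, not data from the thesis).
layer = [30, 12, 8, 5, 3, 1, 1]
print(round(rarefied_richness(layer, 20), 2))
```

The expected richness rises monotonically with subsample size and reaches the observed species count when the whole assemblage is drawn.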

Relevance: 60.00%

Abstract:

Since its implementation in 2012, the European Citizens' Initiative (ECI) has captured the attention of academics and politicians for its apparent potential as an instrument of participatory democracy capable of promoting citizens' direct involvement in EU decision-making. Since its launch, however, the instrument seems to have disappointed the hopes and expectations placed in it and, instead of serving as a bridge between citizens and the EU institutions, appears to have become clear evidence of the EU's bureaucratic leadership in Brussels. With the reform of its implementing legislation, the European institutions sought to give the ECI another chance to reach its full democratizing potential. Three years after the entry into force of the new Regulation 2019/788, and more than ten years after its introduction into the European legal order, we believe the time is right to assess the real impact of the ECI in promoting participatory democracy in the EU. To this end, this doctoral thesis undertakes a comprehensive analysis of this opportunity structure for citizen participation, exploring its origins, its regulatory framework, its practical application, and its implications for European democracy through an interdisciplinary approach combining both qualitative and quantitative methods. This research aims to provide a deeper and more critical understanding of the ECI and of its role in building a more participatory Europe that is closer to its citizens.

Relevance: 40.00%

Abstract:

High-performance liquid-chromatographic (HPLC) methods were validated for determination of pravastatin sodium (PS), fluvastatin sodium (FVS), atorvastatin calcium (ATC), and rosuvastatin calcium (RC) in pharmaceuticals. Two stability-indicating HPLC methods were developed with a small change (10%) in the composition of the organic modifier in the mobile phase. The HPLC method for each statin was validated using isocratic elution. An RP-18 column was used with mobile phases consisting of methanol-water (60:40, v/v, for PS and RC and 70:30, v/v, for FVS and ATC). The pH of each mobile phase was adjusted to 3.0 with orthophosphoric acid, and the flow rate was 1.0 mL/min. Calibration plots showed correlation coefficients (r) greater than 0.999, calculated by the least-squares method. The detection limit (DL) and quantitation limit (QL) were 1.22 and 3.08 µg/mL for PS, 2.02 and 6.12 µg/mL for FVS, 0.44 and 1.34 µg/mL for ATC, and 1.55 and 4.70 µg/mL for RC. Intraday and interday relative standard deviations (RSDs) were below 2.0%. The methods were applied successfully for quantitative determination of statins in pharmaceuticals.
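The validation parameters reported above (least-squares calibration, detection and quantitation limits) follow standard formulas; one common convention, e.g. from ICH guidelines, is DL = 3.3·σ/S and QL = 10·σ/S, where σ is the residual standard deviation of the calibration line and S its slope. A minimal sketch with invented calibration data, not the paper's measurements:

```python
import numpy as np

# Invented calibration data for one statin: concentration (ug/mL) vs.
# detector response -- illustrative only, not the paper's measurements.
conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
resp = np.array([12.1, 24.3, 48.2, 97.0, 193.5])

slope, intercept = np.polyfit(conc, resp, 1)   # least-squares calibration line
pred = slope * conc + intercept
residual_sd = np.sqrt(((resp - pred) ** 2).sum() / (len(conc) - 2))

r = np.corrcoef(conc, resp)[0, 1]              # correlation coefficient
dl = 3.3 * residual_sd / slope                 # detection limit (ICH convention)
ql = 10.0 * residual_sd / slope                # quantitation limit
print(round(r, 4), round(dl, 3), round(ql, 3))
```

By construction the QL is 10/3.3 ≈ 3 times the DL, which matches the roughly threefold DL-to-QL spacing reported for each statin.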

Relevance: 40.00%

Abstract:

The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies minimizing the l2 or l1-TV norm of the prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically varied in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation time. The l-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.

Relevance: 40.00%

Abstract:

Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data and is based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. Evaluated methods encompass a mixture of classical techniques well known in the literature, such as diffusion tensor, Q-Ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing, and also brand new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground truth were synthetically generated, and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm with varying experimental conditions and highlights the strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in the choice of the most adequate technique for their studies.
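The angular-accuracy criterion described above needs an angle metric that treats a fiber direction v and its opposite -v as the same orientation. A minimal sketch with hypothetical directions and a simple nearest-truth matching (the challenge's exact matching rule may differ):

```python
import numpy as np

def angular_error_deg(est, true):
    """Angle (degrees) between two fiber directions, treating v and -v
    as the same orientation (antipodal symmetry via abs of dot product)."""
    est, true = np.asarray(est, float), np.asarray(true, float)
    cos = abs(est @ true) / (np.linalg.norm(est) * np.linalg.norm(true))
    return float(np.degrees(np.arccos(np.clip(cos, -1.0, 1.0))))

def mean_angular_error(estimates, truths):
    """Match each estimated direction to its nearest ground-truth fiber
    and average the angular errors (simple greedy matching)."""
    errs = [min(angular_error_deg(e, t) for t in truths) for e in estimates]
    return float(np.mean(errs))

# Hypothetical two-fiber voxel: true 90-degree crossing, with estimates
# slightly off (one flipped, which the antipodal symmetry absorbs).
true_dirs = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
est_dirs = [[0.99, 0.14, 0.0], [0.0, -1.0, 0.0]]
print(round(mean_angular_error(est_dirs, true_dirs), 2))
```

Clipping the cosine before `arccos` guards against floating-point values marginally outside [-1, 1] for nearly parallel vectors.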

Relevance: 40.00%

Abstract:

Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and the uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or by using a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate nonnormal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires computation time of less than one computer-day for a typical genome-wide scan, with 2.5 M single nucleotide polymorphisms and 5000 individuals.
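A common baseline for association testing with uncertain genotypes is to regress the trait on the expected allele dosage computed from the genotype posterior probabilities; the paper's exact maximum-likelihood and mixture-model methods go beyond this, so the following is only an illustrative sketch with simulated data:

```python
import numpy as np

# Simulated data: per-individual posterior probabilities for carrying 0, 1
# or 2 copies of the effect allele (as imputation would produce), plus a
# quantitative trait with a true effect of 0.5 per allele copy.
rng = np.random.default_rng(0)
n = 500
probs = rng.dirichlet([1.0, 1.0, 1.0], size=n)   # genotype posteriors
dosage = probs @ np.array([0.0, 1.0, 2.0])       # expected allele count
trait = 0.5 * dosage + rng.normal(0.0, 1.0, n)

# Dosage test: least-squares regression of trait on expected dosage and a
# Wald z statistic for the slope.
X = np.column_stack([np.ones(n), dosage])
beta, *_ = np.linalg.lstsq(X, trait, rcond=None)
resid = trait - X @ beta
sigma2 = resid @ resid / (n - 2)
se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
z = beta[1] / se
print(round(float(beta[1]), 3), round(float(z), 2))
```

Using the expected dosage rather than a hard best-guess genotype is one simple way to propagate imputation uncertainty into the test; the paper shows where such approximations lose control of the false-positive rate.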

Relevance: 40.00%

Abstract:

Five selective serotonin reuptake inhibitors (SSRIs) have been introduced recently: citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline. Although no therapeutic window has been defined for SSRIs, in contrast to tricyclic antidepressants, analytical methods for therapeutic drug monitoring of SSRIs are useful in several instances. SSRIs differ widely in their chemical structure and in their metabolism. The fact that some of them have N-demethylated metabolites, which are also SSRIs, requires that methods be available which allow therapeutic drug monitoring of the parent compounds and of these active metabolites. Most procedures are based on prepurification of the SSRIs by liquid-liquid extraction before they are submitted to separation by chromatographic procedures (high-performance liquid chromatography, gas chromatography, thin-layer chromatography) and detection by various detectors (UV, fluorescence, electrochemical detector, nitrogen-phosphorus detector, mass spectrometry). This literature review shows that most methods allow quantitative determination of SSRIs in plasma, in the lower ng/ml range, and that they are, therefore, suitable for therapeutic drug monitoring purposes of this category of drugs.

Relevance: 40.00%

Abstract:

Viruses are among the most important pathogens present in water contaminated with feces or urine and represent a serious risk to human health. Four procedures for concentrating viruses from sewage have been compared in this work, three of which were developed in the present study. Viruses were quantified using PCR techniques. According to statistical analysis and the sensitivity to detect human adenoviruses (HAdV), JC polyomaviruses (JCPyV) and noroviruses genogroup II (NoV GGII): (i) a new procedure (elution and skimmed-milk flocculation procedure (ESMP)) based on the elution of the viruses with glycine-alkaline buffer followed by organic flocculation with skimmed-milk was found to be the most efficient method when compared to (ii) ultrafiltration and glycine-alkaline elution, (iii) a lyophilization-based method and (iv) ultracentrifugation and glycine-alkaline elution. Through the analysis of replicate sewage samples, ESMP showed reproducible results with a coefficient of variation (CV) of 16% for HAdV, 12% for JCPyV and 17% for NoV GGII. Using spiked samples, the viral recoveries were estimated at 30-95% for HAdV, 55-90% for JCPyV and 45-50% for NoV GGII. ESMP was validated in a field study using twelve 24-h composite sewage samples collected in an urban sewage treatment plant in the North of Spain that reported 100% positive samples with mean values of HAdV, JCPyV and NoV GGII similar to those observed in other studies. Although all of the methods compared in this work yield consistently high values of virus detection and recovery in urban sewage, some require expensive laboratory equipment. ESMP is an effective low-cost procedure which allows a large number of samples to be processed simultaneously and is easily standardizable for its performance in a routine laboratory working in water monitoring. Moreover, in the present study, a CV was applied and proposed as a parameter to evaluate and compare the methods for detecting viruses in sewage samples.
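The coefficient of variation used above to quantify reproducibility across replicate samples is simply the sample standard deviation expressed as a percentage of the mean. A minimal sketch with invented replicate values:

```python
import statistics

def coefficient_of_variation(values):
    """Sample standard deviation as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

# Invented HAdV quantifications (genome copies/mL) from four replicate
# sewage samples -- illustrative only, not the study's measurements.
replicates = [1.1e4, 1.3e4, 1.0e4, 1.4e4]
print(round(coefficient_of_variation(replicates), 1))
```

Because the CV is dimensionless, it lets methods be compared across viruses whose absolute concentrations differ by orders of magnitude, which is why the study proposes it as a comparison parameter.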
