944 results for Quantitative Methods
Abstract:
Quantitative Methods (QM) is a compulsory course in the Social Science program in CEGEP. Many QM instructors assign a number of homework exercises to give students the opportunity to practice the statistical methods, which enhances their learning. However, traditional written exercises have two significant disadvantages. The first is that the feedback process is often very slow. The second is that written exercises can generate a large amount of correction work for the instructor. WeBWorK is an open-source system that allows instructors to write exercises which students answer online. Although originally designed for writing exercises for math and science students, WeBWorK programming allows for the creation of a variety of questions which can be used in the Quantitative Methods course. Because many statistical exercises generate objective and quantitative answers, the system is able to instantly assess students’ responses and tell them whether they are right or wrong. This immediate feedback has been shown to be theoretically conducive to positive learning outcomes. In addition, the system can be set up to allow students to re-try a problem if they got it wrong. This has benefits both in terms of student motivation and in terms of reinforcing learning. Through a quasi-experiment, this research project measured and analysed the effects of using WeBWorK exercises in the Quantitative Methods course at Vanier College. Three specific research questions were addressed. First, we looked at whether students who did the WeBWorK exercises got better grades than students who did written exercises. Second, we looked at whether students who completed more of the WeBWorK exercises got better grades than students who completed fewer of them. Finally, we used a self-report survey to find out students’ perceptions and opinions of the WeBWorK and written exercises. For the first research question, a crossover design was used to compare whether the group that did WeBWorK problems during one unit scored significantly higher on that unit test than the group that did the written problems. We found no significant difference in grades between students who did the WeBWorK exercises and students who did the written exercises. The second research question looked at whether students who completed more of the WeBWorK exercises would get significantly higher grades than students who completed fewer of them. The linear relationship between the number of WeBWorK exercises completed and grades was positive in both groups. However, the correlation coefficients for these two variables showed no consistent pattern. Our third research question was investigated with a survey eliciting students’ perceptions and opinions of the WeBWorK and written exercises. Students reported no difference in the amount of effort put into completing each type of exercise. Students were also asked to rate each type of exercise along six dimensions, and a composite score was calculated. Overall, students gave a significantly higher score to the written exercises, and reported that the written exercises were better for understanding the basic statistical concepts and for learning the basic statistical methods. However, when presented with the choice of having only written or only WeBWorK exercises, slightly more students preferred or strongly preferred having only WeBWorK exercises.
The results of this research suggest that the advantages of using WeBWorK to teach Quantitative Methods are variable. The WeBWorK system offers immediate feedback, which often seems to motivate students to try again if they do not have the correct answer. However, this does not necessarily translate into better performance on the written tests and on the final exam. What has been learned is that the WeBWorK system can be used by interested instructors to enhance student learning in the Quantitative Methods course. Further research may examine more specifically how this system can be used more effectively.
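A minimal sketch of the two analyses described above, using invented data and hypothetical variable names (unit-test grades and number of exercises completed): a Welch t-test comparing the two crossover arms on one unit test, and a Pearson correlation between WeBWorK completion and grades. This illustrates the statistical approach only; it is not the study's code or data.

```python
# Hypothetical illustration of the two analyses described above; the data
# and variable names are invented, not taken from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Unit-test grades for the two crossover arms in one unit.
grades_webwork = rng.normal(72, 12, size=40)   # group doing WeBWorK this unit
grades_written = rng.normal(71, 12, size=40)   # group doing written exercises

t, p = stats.ttest_ind(grades_webwork, grades_written, equal_var=False)
print(f"WeBWorK vs written unit test: t = {t:.2f}, p = {p:.3f}")

# Relationship between exercises completed and grade within one group.
n_completed = rng.integers(0, 20, size=40)
grade = 60 + 0.5 * n_completed + rng.normal(0, 10, size=40)
r, p_r = stats.pearsonr(n_completed, grade)
print(f"completion vs grade: r = {r:.2f}, p = {p_r:.3f}")
```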
Abstract:
Consider the problem of testing k hypotheses simultaneously. In this paper, we discuss finite and large sample theory of stepdown methods that provide control of the familywise error rate (FWE). In order to improve upon the Bonferroni method or Holm's (1979) stepdown method, Westfall and Young (1993) make effective use of resampling to construct stepdown methods that implicitly estimate the dependence structure of the test statistics. However, their methods depend on an assumption called subset pivotality. The goal of this paper is to construct general stepdown methods that do not require such an assumption. In order to accomplish this, we take a close look at what makes stepdown procedures work, and a key component is a monotonicity requirement of critical values. By imposing such monotonicity on estimated critical values (which is not an assumption on the model but an assumption on the method), it is demonstrated that the problem of constructing a valid multiple test procedure which controls the FWE can be reduced to the problem of constructing a single test which controls the usual probability of a Type 1 error. This reduction allows us to draw upon an enormous resampling literature as a general means of test construction.
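As a concrete reference point, the sketch below implements Holm's (1979) stepdown method mentioned in the abstract, whose critical values alpha/k, alpha/(k-1), ... are monotone across the steps; the p-values are hypothetical, and the paper's resampling-based construction is not reproduced here.

```python
# Holm's (1979) stepdown procedure, the baseline the paper improves upon.
# The p-values below are hypothetical.
import numpy as np

def holm_stepdown(pvals, alpha=0.05):
    """Return a boolean rejection decision for each of the k hypotheses."""
    pvals = np.asarray(pvals)
    k = len(pvals)
    order = np.argsort(pvals)            # step down from the smallest p-value
    reject = np.zeros(k, dtype=bool)
    for step, idx in enumerate(order):
        # Critical value alpha/(k - step) is monotone over the steps.
        if pvals[idx] <= alpha / (k - step):
            reject[idx] = True
        else:
            break                        # stop: all remaining hypotheses are kept
    return reject

print(holm_stepdown([0.001, 0.020, 0.030, 0.400]))
```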
Abstract:
Many multivariate methods that are apparently distinct can be linked by introducing one or more parameters in their definition. Methods that can be linked in this way are correspondence analysis, unweighted or weighted logratio analysis (the latter also known as "spectral mapping"), nonsymmetric correspondence analysis, principal component analysis (with and without logarithmic transformation of the data) and multidimensional scaling. In this presentation I will show how several of these methods, which are frequently used in compositional data analysis, may be linked through parametrizations such as power transformations, linear transformations and convex linear combinations. Since the methods of interest here all lead to visual maps of data, a "movie" can be made where the linking parameter is allowed to vary in small steps: the results are recalculated "frame by frame" and one can see the smooth change from one method to another. Several of these "movies" will be shown, giving a deeper insight into the similarities and differences between these methods.
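A simplified sketch of the "movie" idea under illustrative assumptions: a Box-Cox-type power transformation (whose limit as the power tends to zero is the log transform) is applied to invented compositional data, and the two-dimensional map is recalculated frame by frame as the parameter varies. The exact parametrizations used in the presentation may differ.

```python
# A simplified illustration of linking two maps by a power transformation:
# as alpha -> 0 the Box-Cox transform approaches the log, so the map moves
# smoothly from an untransformed analysis toward a logratio-style one.
# Data and the exact parametrization are illustrative, not from the talk.
import numpy as np

rng = np.random.default_rng(1)
X = rng.dirichlet(np.ones(6), size=30)          # compositional data, rows sum to 1

def map_coordinates(X, alpha):
    Z = (X**alpha - 1) / alpha if alpha > 0 else np.log(X)
    Z = Z - Z.mean(axis=0) - Z.mean(axis=1, keepdims=True) + Z.mean()  # double centering
    U, s, _ = np.linalg.svd(Z, full_matrices=False)
    return U[:, :2] * s[:2]                      # 2-D row coordinates for one "frame"

for alpha in [1.0, 0.5, 0.25, 0.1, 0.0]:         # frames of the "movie"
    coords = map_coordinates(X, alpha)
    print(f"alpha = {alpha:4.2f}  spread of first axis = {coords[:, 0].std():.3f}")
```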
Abstract:
We develop a general error analysis framework for the Monte Carlo simulation of densities for functionals in Wiener space. We also study variance reduction methods with the help of Malliavin derivatives. For this, we give some general heuristic principles which are applied to diffusion processes. A comparison with kernel density estimates is made.
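A minimal sketch of the kernel-estimate side of the comparison, with illustrative parameters: terminal values of a diffusion are simulated by an Euler scheme and their density is estimated with a Gaussian kernel. The Malliavin-based variance-reduction estimators discussed in the paper are not reproduced.

```python
# Monte Carlo density estimation for a diffusion functional (here simply the
# terminal value of geometric Brownian motion), using a kernel estimate as in
# the comparison mentioned above. Parameters are illustrative.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
n_paths, n_steps, T = 20_000, 200, 1.0
mu, sigma, x0 = 0.05, 0.2, 1.0
dt = T / n_steps

# Euler-Maruyama simulation of dX = mu X dt + sigma X dW.
X = np.full(n_paths, x0)
for _ in range(n_steps):
    X += mu * X * dt + sigma * X * np.sqrt(dt) * rng.standard_normal(n_paths)

kde = gaussian_kde(X)                       # kernel density estimate of X_T
grid = np.linspace(0.5, 1.8, 5)
print(np.round(kde(grid), 3))               # estimated density at a few points
```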
Abstract:
Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used depending on the researchers' theoretical premise, the type of material which is examined, the context of discovery and the questions that are addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove to be of use to all researchers. Contents: 1) Introduction (Samuel Verdan); 2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan); 3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner); 4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack); 5) 'Erfahrungsbericht' of application of different quantitative methods at Kalapodi (Sara Strack); 6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder); 7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce); 8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S. Lemos); 9) Households and workshops at Early Iron Age Oropos: A quantitative approach of the fine, wheel-made pottery (Vicky Vlachou); 10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis); 11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros); 12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit (Emanuela Santaniello); 13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas); 14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou); 15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi); 16) Pottery quantification: Some guidelines (Samuel Verdan).
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems, population growth increases exposure, and changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and do not give information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that the vulnerability factors change with the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on hazards. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to a range of audiences, including 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas. Characterizing risk at the global level and identifying the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and especially unsustainable development, shapes a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
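A schematic stand-in, with invented countries, values and weights, for how a multiple-hazard risk index can combine hazard frequency, exposure and vulnerability; it is not the PREVIEW/GRID model itself.

```python
# Schematic stand-in for a multiple-hazard risk index: risk is modelled at the
# intersection of hazard, exposure and vulnerability. Countries, weights and
# the functional form are hypothetical, not those of the PREVIEW/GRID model.
import numpy as np

hazards = ["flood", "cyclone", "earthquake", "landslide"]
countries = ["A", "B", "C"]

rng = np.random.default_rng(3)
frequency   = rng.uniform(0.1, 2.0, size=(3, 4))   # events per year, by hazard
exposed_pop = rng.uniform(1e5, 1e7, size=(3, 4))   # people in hazard zones
vulnerability = np.array([0.8, 0.3, 0.5])          # proxy, e.g. poverty-based

# A simple multiplicative index, log-scaled and summed over hazards.
risk = (np.log1p(frequency * exposed_pop) * vulnerability[:, None]).sum(axis=1)
for c, r in zip(countries, risk):
    print(f"country {c}: relative multi-hazard risk index = {r:.1f}")
```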
Abstract:
There is currently little empirical knowledge regarding the construction of a musician’s identity and social class. With a theoretical framework based on Bourdieu’s (1984) distinction theory, Bronfenbrenner’s (1979) theory of ecological systems, and the identity theories of Erikson (1950; 1968) and Marcia (1966), a survey called the Musician’s Social Background and Identity Questionnaire (MSBIQ) is developed to test three research hypotheses related to the construction of a musician’s identity, social class and ecological systems of development. The MSBIQ is administered to the music students at Sibelius Academy of the University of Arts Helsinki and Helsinki Metropolia University of Applied Sciences, representing the ’highbrow’ and the ’middlebrow’ samples in the field of music education in Finland. Acquired responses (N = 253) are analyzed and compared with quantitative methods including Pearson’s chi-square test, factor analysis and an adjusted analysis of variance (ANOVA). The study revealed that (1) the music students at Sibelius Academy and Metropolia construct their subjective musician’s identity differently, but (2) social class does not affect this identity construction process significantly. In turn, (3) the ecological systems of development, especially the individual’s residential location, do significantly affect the construction of a musician’s identity, as well as the age at which one starts to play one’s first musical instrument. Furthermore, a novel finding related to the structure of a musician’s identity was the tripartite model of musical identity consisting of the three dimensions of a musician’s identity: (I) ’the subjective dimension of a musician’s identity’, (II) ’the occupational dimension of a musician’s identity’ and, (III) ’the conservative-liberal dimension of a musician’s identity’. According to this finding, a musician’s identity is not a uniform, coherent entity, but a structure consisting of different elements continuously working in parallel within different dimensions. The results and limitations related to the study are discussed, as well as the objectives related to future studies using the MSBIQ to research the identity construction and social backgrounds of a musician or other performing artists.
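A hypothetical sketch of the three analysis steps named above (chi-square test, factor analysis, ANOVA) applied to invented Likert-type responses; it is not the MSBIQ data or the study's code.

```python
# Hypothetical sketch of the analysis steps named above (chi-square test,
# factor analysis, ANOVA) on invented Likert-type responses, not MSBIQ data.
import numpy as np
from scipy import stats
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(4)
responses = rng.integers(1, 6, size=(253, 12)).astype(float)   # 12 Likert items
school = rng.integers(0, 2, size=253)                          # 0 or 1, two institutions

# 1) Chi-square test of independence between school and a single item.
table = np.array([[np.sum((school == g) & (responses[:, 0] == v))
                   for v in range(1, 6)] for g in (0, 1)])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi-square: chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")

# 2) Factor analysis: extract three latent dimensions from the items.
fa = FactorAnalysis(n_components=3, random_state=0).fit(responses)
print("loadings shape:", fa.components_.shape)

# 3) One-way ANOVA of a composite identity score across the two schools.
composite = responses.mean(axis=1)
F, p_anova = stats.f_oneway(composite[school == 0], composite[school == 1])
print(f"ANOVA: F = {F:.2f}, p = {p_anova:.3f}")
```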
Abstract:
We propose finite sample tests and confidence sets for models with unobserved and generated regressors as well as various models estimated by instrumental variables methods. The validity of the procedures is unaffected by the presence of identification problems or "weak instruments", so no detection of such problems is required. We study two distinct approaches for various models considered by Pagan (1984). The first one is an instrument substitution method which generalizes an approach proposed by Anderson and Rubin (1949) and Fuller (1987) for different (although related) problems, while the second one is based on splitting the sample. The instrument substitution method uses the instruments directly, instead of generated regressors, in order to test hypotheses about the "structural parameters" of interest and build confidence sets. The second approach relies on "generated regressors", which allows a gain in degrees of freedom, and a sample split technique. For inference about general possibly nonlinear transformations of model parameters, projection techniques are proposed. A distributional theory is obtained under the assumptions of Gaussian errors and strictly exogenous regressors. We show that the various tests and confidence sets proposed are (locally) "asymptotically valid" under much weaker assumptions. The properties of the tests proposed are examined in simulation experiments. In general, they outperform the usual asymptotic inference methods in terms of both reliability and power. Finally, the techniques suggested are applied to a model of Tobin’s q and to a model of academic performance.
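For orientation, the sketch below applies a textbook Anderson-Rubin test of H0: beta = beta0 to simulated data with a weak instrument set, illustrating the kind of identification-robust inference the paper generalizes; the instrument-substitution and sample-split procedures themselves are not reproduced.

```python
# A textbook Anderson-Rubin test of H0: beta = beta0 in a simple IV model.
# Simulated data; the paper's instrument-substitution and sample-split
# extensions are not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n, q = 200, 3                                   # sample size, number of instruments
Z = rng.standard_normal((n, q))
v = rng.standard_normal(n)
u = 0.5 * v + rng.standard_normal(n)            # error correlated with x -> endogeneity
x = Z @ np.array([0.2, 0.1, 0.0]) + v           # weakly identified regressor
y = 1.0 * x + u                                 # true beta = 1

def anderson_rubin(y, x, Z, beta0):
    y0 = y - x * beta0                          # residual under the null
    PZ = Z @ np.linalg.solve(Z.T @ Z, Z.T @ y0) # projection of y0 on the instruments
    rss_fit, rss_res = PZ @ PZ, y0 @ y0 - PZ @ PZ
    F = (rss_fit / q) / (rss_res / (n - q))
    return F, 1 - stats.f.cdf(F, q, n - q)

for beta0 in (0.0, 1.0):
    F, p = anderson_rubin(y, x, Z, beta0)
    print(f"H0: beta = {beta0}:  AR F = {F:.2f}, p = {p:.3f}")
```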
Abstract:
In the context of multivariate regression (MLR) and seemingly unrelated regressions (SURE) models, it is well known that commonly employed asymptotic test criteria are seriously biased towards overrejection. In this paper, we propose finite- and large-sample likelihood-based test procedures for possibly non-linear hypotheses on the coefficients of MLR and SURE systems.
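A simplified illustration of why simulation-based procedures help here, under invented dimensions and a hypothetical zero-restriction: the asymptotic chi-square p-value of the likelihood-ratio criterion in a small MLR system is compared with a parametric Monte Carlo p-value obtained by simulating the statistic under the null.

```python
# Simplified illustration, with invented dimensions and hypothesis, of the
# overrejection issue: asymptotic chi-square p-value of the LR criterion in a
# small MLR system versus a parametric Monte Carlo p-value under the null.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
n, m, k = 30, 5, 4                               # small sample, 5 equations, 4 regressors
X = np.column_stack([np.ones(n), rng.standard_normal((n, k - 1))])
B = np.zeros((k, m))                             # the null (last regressor irrelevant) is true
Y = X @ B + rng.standard_normal((n, m))

def lr_stat(Y, X, restricted_cols):
    """LR criterion for H0: the rows of B named by restricted_cols are zero."""
    Bu = np.linalg.lstsq(X, Y, rcond=None)[0]
    Su = (Y - X @ Bu).T @ (Y - X @ Bu) / n
    Xr = np.delete(X, restricted_cols, axis=1)
    Br = np.linalg.lstsq(Xr, Y, rcond=None)[0]
    Sr = (Y - Xr @ Br).T @ (Y - Xr @ Br) / n
    return n * (np.log(np.linalg.det(Sr)) - np.log(np.linalg.det(Su)))

obs = lr_stat(Y, X, [k - 1])
df = m                                           # one row of B restricted, m equations
p_asym = 1 - stats.chi2.cdf(obs, df)

# Parametric Monte Carlo p-value: simulate the statistic under the null DGP.
sims = [lr_stat(rng.standard_normal((n, m)), X, [k - 1]) for _ in range(499)]
p_mc = (1 + sum(s >= obs for s in sims)) / 500
print(f"LR = {obs:.2f}, asymptotic p = {p_asym:.3f}, Monte Carlo p = {p_mc:.3f}")
```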
Abstract:
This paper employs the one-sector Real Business Cycle model as a testing ground for four different procedures to estimate Dynamic Stochastic General Equilibrium (DSGE) models. The procedures are: 1) Maximum Likelihood, with and without measurement errors and incorporating Bayesian priors, 2) Generalized Method of Moments, 3) Simulated Method of Moments, and 4) Indirect Inference. Monte Carlo analysis indicates that all procedures deliver reasonably good estimates under the null hypothesis. However, there are substantial differences in statistical and computational efficiency in the small samples currently available to estimate DSGE models. GMM and SMM appear to be more robust to misspecification than the alternative procedures. The implications of the stochastic singularity of DSGE models for each estimation method are fully discussed.
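A toy illustration of one of the four procedures, Simulated Method of Moments, applied to an AR(1) process standing in for the DSGE model (the RBC model itself is not solved here); parameter values and the chosen moments are for illustration only.

```python
# Simulated Method of Moments on a toy AR(1) process, standing in for the
# DSGE setting described above (the RBC model itself is not solved here).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(7)

def simulate_ar1(rho, sigma, T, shocks):
    y = np.zeros(T)
    for t in range(1, T):
        y[t] = rho * y[t - 1] + sigma * shocks[t]
    return y

def moments(y):
    return np.array([y.var(), np.corrcoef(y[:-1], y[1:])[0, 1]])

# "Observed" data generated with rho = 0.9, sigma = 1.0.
data = simulate_ar1(0.9, 1.0, 500, rng.standard_normal(500))
m_data = moments(data)

# Fixed simulation shocks so the SMM objective is smooth in the parameters.
sim_shocks = rng.standard_normal(5000)

def smm_objective(theta):
    rho, sigma = theta[0], abs(theta[1])     # keep sigma positive
    m_sim = moments(simulate_ar1(rho, sigma, 5000, sim_shocks))
    diff = m_sim - m_data
    return diff @ diff                       # identity weighting matrix

res = minimize(smm_objective, x0=[0.5, 0.5], method="Nelder-Mead")
print("SMM estimates (rho, sigma):", np.round(res.x, 3))
```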
Abstract:
A group of agents participate in a cooperative enterprise producing a single good. Each participant contributes a particular type of input; output is nondecreasing in these contributions. How should it be shared? We analyze the implications of the axiom of Group Monotonicity: if a group of agents simultaneously decrease their input contributions, not all of them should receive a higher share of output. We show that in combination with other more familiar axioms, this condition pins down a very small class of methods, which we dub nearly serial.
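For context, the sketch below computes the classic serial rule for sharing output among agents with one-dimensional input contributions; the "nearly serial" methods of the paper are related to, but not identical with, this rule, and the production function and contributions shown are hypothetical.

```python
# The classic serial rule for sharing the output of a common production
# function among agents with different input contributions. The production
# function and the contributions below are hypothetical.
import numpy as np

def serial_shares(contributions, f):
    """Serial output shares for one-dimensional input contributions."""
    q = np.sort(np.asarray(contributions, dtype=float))
    n = len(q)
    shares = np.zeros(n)
    prev_output, prev_share = 0.0, 0.0
    for i in range(n):
        # Output if every larger contributor were capped at q[i].
        capped = f(q[:i].sum() + (n - i) * q[i])
        prev_share = prev_share + (capped - prev_output) / (n - i)
        shares[i] = prev_share
        prev_output = capped
    return shares  # ordered from smallest to largest contributor

f = lambda x: np.sqrt(x)                     # nondecreasing production function
print(np.round(serial_shares([1.0, 4.0, 9.0], f), 3))
```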
Abstract:
The Kineticist's Workbench is a program that simulates chemical reaction mechanisms by predicting, generating, and interpreting numerical data. Prior to simulation, it analyzes a given mechanism to predict that mechanism's behavior; it then simulates the mechanism numerically; and afterward, it interprets and summarizes the data it has generated. In performing these tasks, the Workbench uses a variety of techniques: graph-theoretic algorithms (for analyzing mechanisms), traditional numerical simulation methods, and algorithms that examine simulation results and reinterpret them in qualitative terms. The Workbench thus serves as a prototype for a new class of scientific computational tools: tools that provide symbiotic collaborations between qualitative and quantitative methods.
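A toy example in the spirit of the Workbench's combination of numerical simulation and qualitative summarization (not the Workbench's actual code): a simple A -> B -> C mechanism is integrated numerically and the trajectory is then reinterpreted in qualitative terms.

```python
# Toy illustration, not the Workbench itself: integrate a simple A -> B -> C
# mechanism, then report when the intermediate B peaks and when the product C
# is essentially at steady state. Rate constants are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2 = 1.0, 0.5                               # illustrative rate constants

def mechanism(t, y):
    A, B, C = y
    return [-k1 * A, k1 * A - k2 * B, k2 * B]

sol = solve_ivp(mechanism, (0, 20), [1.0, 0.0, 0.0], max_step=0.05)
t, (A, B, C) = sol.t, sol.y

# Qualitative reinterpretation of the numerical trajectory.
t_peak = t[np.argmax(B)]
t_steady = t[np.argmax(np.abs(np.gradient(C, t)) < 1e-3)]
print(f"B peaks near t = {t_peak:.2f}; C roughly steady after t = {t_steady:.2f}")
```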
Abstract:
Quantitation is an inherent requirement in comparative proteomics and there is no exception to this for plant proteomics. Quantitative proteomics has high demands on the experimental workflow, requiring a thorough design and often a complex multi-step structure. It has to include sufficient numbers of biological and technical replicates and methods that are able to facilitate a quantitative signal read-out. Quantitative plant proteomics in particular poses many additional challenges but because of the nature of plants it also offers some potential advantages. In general, analysis of plants has been less prominent in proteomics. Low protein concentration, difficulties in protein extraction, genome multiploidy, high Rubisco abundance in green tissue, and an absence of well-annotated and completed genome sequences are some of the main challenges in plant proteomics. However, the latter is now changing with several genomes emerging for model plants and crops such as potato, tomato, soybean, rice, maize and barley. This review discusses the current status in quantitative plant proteomics (MS-based and non-MS-based) and its challenges and potentials. Both relative and absolute quantitation methods in plant proteomics from DIGE to MS-based analysis after isotope labeling and label-free quantitation are described and illustrated by published studies. In particular, we describe plant-specific quantitative methods such as metabolic labeling methods that can take full advantage of plant metabolism and culture practices, and discuss other potential advantages and challenges that may arise from the unique properties of plants.
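A generic sketch of one relative, label-free quantitation step (median normalization, log2 fold change, per-protein t-test) on invented intensities; it is not tied to any particular study or platform mentioned above.

```python
# Generic sketch of a relative, label-free quantitation step: median
# normalisation, log2 fold change and a per-protein t-test across replicates.
# Intensities are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n_proteins = 500
control = rng.lognormal(10, 1, size=(n_proteins, 3))     # 3 biological replicates
treated = rng.lognormal(10, 1, size=(n_proteins, 3))
treated[:25] *= 2.5                                       # a few truly changed proteins

def median_normalise(M):
    return M / np.median(M, axis=0) * np.median(M)        # equalise replicate medians

log_c = np.log2(median_normalise(control))
log_t = np.log2(median_normalise(treated))

fold_change = log_t.mean(axis=1) - log_c.mean(axis=1)
_, pvals = stats.ttest_ind(log_t, log_c, axis=1)
hits = np.sum((pvals < 0.01) & (np.abs(fold_change) > 1))
print(f"proteins passing |log2 FC| > 1 and p < 0.01: {hits}")
```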
Abstract:
Replacement, expansion and upgrading of assets in the electricity network represent a financial investment for distribution utilities. Network Investment Deferral (NID) is a well-discussed benefit of wider adoption of Distributed Generation (DG), and there have been many attempts to quantify and evaluate the financial benefit for distribution utilities. While the carbon benefits of NID are commonly mentioned, there has been little attempt to quantify these impacts. This paper explores the quantitative methods previously used to evaluate the financial benefits in order to discuss the carbon impacts. These carbon impacts matter to companies owning DG equipment, both for internal reporting and for their emissions-reduction ambitions. Currently, a GB-wide approach is taken as a basis for discussing more regional and local methods to be used in future work. By investigating these principles, the paper offers a novel approach to quantifying carbon emissions from various DG technologies.
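An illustrative back-of-envelope calculation, with placeholder figures, of how annual generation and net carbon emissions might be computed for a few DG technologies from capacity, capacity factor and emission factors; these are not the values or the method used in the paper.

```python
# Illustrative calculation of annual energy and net carbon emissions for a few
# DG technologies from capacity, capacity factor and emission factors.
# All figures are placeholders, not values used in the paper.
import numpy as np

technologies    = ["CHP (gas)", "Wind", "Solar PV"]
capacity_mw     = np.array([5.0, 10.0, 8.0])
capacity_factor = np.array([0.60, 0.30, 0.11])
emission_factor = np.array([0.40, 0.0, 0.0])     # tCO2 per MWh generated
grid_factor     = 0.20                           # tCO2 per MWh displaced from the grid

energy_mwh = capacity_mw * capacity_factor * 8760
net_tco2 = energy_mwh * (emission_factor - grid_factor)   # negative = net saving
for tech, e, c in zip(technologies, energy_mwh, net_tco2):
    print(f"{tech:10s}: {e:8.0f} MWh/yr, net emissions {c:+9.0f} tCO2/yr")
```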