948 results for [JEL:C70] Mathematical and Quantitative Methods - Game Theory and Bargaining Theory - General
Abstract:
This study evaluated the early development and pilot-testing of Project IMPACT, a case management intervention for victims of stalking. The Design and Development framework (Rothman & Thomas, 1994) was used as a guide for program development and evaluation. Nine research questions examined the processes and outcomes associated with program implementation. The sample included all 36 clients who participated in Project IMPACT between February of 2000 and June of 2001, as well as the victim advocates who provided them with services. Quantitative and qualitative data were drawn from client case files, participant observation field notes and interview transcriptions. Quantitative data were entered into three databases where: (1) clients were the units of analysis (n = 36), (2) services were the units of analysis (n = 1146), and (3) goals were the units of analysis (n = 149). These data were analyzed using descriptive statistics, Pearson's Chi-square, Spearman's Rho, Phi, Cramer's V, Wilcoxon's Matched-Pairs Signed-Rank Test and McNemar's Test Statistic. Qualitative data were reduced via open, axial and selective coding methods. Grounded theory and case study frameworks were utilized to analyze these data. Results showed that most clients noted an improved sense of well-being and safety, although residual symptoms of trauma remained for numerous individuals. Stalkers appeared to respond to criminal and civil justice-based interventions by reducing violent and threatening behaviors; however, covert behaviors continued. The study produced findings that provided preliminary support for the use of several intervention components, including support services, psycho-education, safety planning, and boundary spanning. The psycho-education and safety planning in particular seemed to help clients cognitively reframe their perceptions of the stalking experience and gain a sense of increased safety and well-being.
A 65% level of satisfactory goal achievement was observed overall, although goals involving justice-based organizations were associated with lower achievement. High service usage was associated with low-income clients and those lacking social support. Numerous inconsistencies in program implementation were found to be associated with the skills and experiences of victim advocates. Thus, recommendations were made to further refine, develop and evaluate the intervention.
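As a hedged illustration of one of the paired tests listed above, McNemar's statistic for pre/post binary outcomes reduces to simple arithmetic. The counts below are invented for illustration, not drawn from the study's data.

```python
# McNemar's test statistic for paired binary outcomes (toy counts, not study data).
# b = cases that changed from "unsafe" pre to "safe" post; c = the reverse.
def mcnemar_statistic(b: int, c: int) -> float:
    """Chi-square approximation, without continuity correction."""
    if b + c == 0:
        raise ValueError("no discordant pairs")
    return (b - c) ** 2 / (b + c)

# Hypothetical example: 12 clients improved, 3 worsened.
stat = mcnemar_statistic(12, 3)
print(round(stat, 2))  # 5.4
```

The statistic is compared against a chi-square distribution with one degree of freedom; in practice a library routine would also supply the p-value.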
Abstract:
The purpose of the study was to compare the English III success of students whose home language is Haitian Creole (SWHLIHC) with that of the more visible African American high school students in the Miami Dade County Public Schools System, in an effort to offer insight that might assist educators in facilitating the educational success of SWHLIHC in American Literature class. The study was guided by two important theories on how students interact with and learn from literature: Reader Response Theory, which advocates giving students the opportunity to become involved in the literature experience (Rosenblatt, 1995), and Critical Literacy, a theory developed by Paulo Freire and Henry Giroux, which espouses a critical approach to the analysis of society that enables people to analyze social problems through lenses that reveal social inequities and assist in transforming society into a more equitable entity. Data for the study (10th grade reading FCAT scores, English III/American Literature grades, and promotion to English IV records for the school year 2010-2011) were retrieved from the records division of the Miami Dade County Public Schools System. The study used a quantitative methods approach, the central feature of which was an ex post facto design with hypotheses (Newman, Newman, Brown, & McNeely, 2006). This design was chosen because the researcher postulated hypotheses about the relationships that might exist between the performances of SWHLIHC and those of African American students on the three above-mentioned variables, and it supported the researcher's purpose of comparing these performances. One-way analysis of variance (ANOVA), two-way ANOVAs, and chi-square tests were used to examine the two groups' performances on the 10th grade reading FCAT, their English III grades, and their promotion to English IV.
The study findings show that there was a significant difference in the performance of SWHLIHC and African American high school students on all three variables. SWHLIHC performed significantly higher on English III success and promotion to English IV. African American high school students performed significantly higher on the reading FCAT.
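For readers unfamiliar with the one-way ANOVA used above, the F statistic can be computed from first principles. The two groups of scores below are invented for illustration; they are not the study's data.

```python
# One-way ANOVA F statistic computed by hand (toy scores, not study data).
def one_way_anova_f(groups):
    k = len(groups)                                   # number of groups
    n = sum(len(g) for g in groups)                   # total observations
    grand_mean = sum(x for g in groups for x in g) / n
    means = [sum(g) / len(g) for g in groups]
    # Between-group and within-group sums of squares:
    ss_between = sum(len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Two hypothetical groups of three test scores each:
f_stat = one_way_anova_f([[70, 75, 80], [60, 65, 70]])
print(round(f_stat, 2))  # 6.0
```

The resulting F is referred to an F(k-1, n-k) distribution; a statistics library would normally report the p-value alongside it.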
Abstract:
The present study, employing psychometric meta-analysis of 92 independent studies with sample sizes ranging from 26 to 322 leaders, examined the relationship between EI and leadership effectiveness. Overall, the results supported a linkage between leader EI and effectiveness that was moderate in nature (ρ = .25). In addition, the positive manifold of the effect sizes presented in this study, ranging from .10 to .44, indicates that emotional intelligence has meaningful relations with myriad leadership outcomes, including effectiveness, transformational leadership, LMX, follower job satisfaction, and others. Furthermore, this paper examined potential process mechanisms that may account for the EI-leadership effectiveness relationship and showed that both transformational leadership and LMX partially mediate this relationship. However, while the predictive validities of EI were moderate in nature, path analysis and hierarchical regression suggest that EI contributes less than or equal to 1% of explained variance in leadership effectiveness once personality and intelligence are accounted for.
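The gap between the bivariate and incremental effects reported above is just the arithmetic of squared correlations; this fragment restates it (the 1% figure is the paper's reported ceiling, not a computation):

```python
# A correlation of rho = .25 implies EI shares about 6.25% of variance with
# leadership effectiveness in the bivariate case ...
rho = 0.25
shared_variance = rho ** 2
print(shared_variance)  # 0.0625
# ... yet the paper reports <= 1% incremental variance once personality and
# intelligence are already in the model.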
Abstract:
Allocating resources optimally is a nontrivial task, especially when multiple
self-interested agents with conflicting goals are involved. This dissertation
uses techniques from game theory to study two classes of such problems:
allocating resources to catch agents that attempt to evade them, and allocating
payments to agents in a team in order to stabilize it. Besides discussing what
allocations are optimal from various game-theoretic perspectives, we also study
how to compute them efficiently and, when no efficient algorithm is found, what
computational hardness results can be proved.
The first class of problems is inspired by real-world applications such as the
TOEFL iBT test, course final exams, driver's license tests, and airport security
patrols. We call them test games and security games. This dissertation first
studies test games separately, and then proposes a framework of Catcher-Evader
games (CE games) that generalizes both test games and security games. We show
that the optimal test strategy can be efficiently computed for scored test
games, but it is hard to compute for many binary test games. Optimal Stackelberg
strategies are hard to compute for CE games, but we give an empirically
efficient algorithm for computing their Nash equilibria. We also prove that the
Nash equilibria of a CE game are interchangeable.
The second class of problems involves how to split a reward that is collectively
obtained by a team. For example, how should a startup distribute its shares, and
what salary should an enterprise pay its employees? Several stability-based
solution concepts in cooperative game theory, such as the core, the least core,
and the nucleolus, are well suited to this purpose when the goal is to avoid
coalitions of agents breaking off. We show that some of these solution concepts
can be justified as the most stable payments under noise. Moreover, by adjusting
the noise models (to be arguably more realistic), we obtain new solution
concepts including the partial nucleolus, the multiplicative least core, and the
multiplicative nucleolus. We then study the computational complexity of those
solution concepts under the constraint of superadditivity. Our result is based
on what we call Small-Issues-Large-Team games and it applies to popular
representation schemes such as MC-nets.
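To make the stability notion concrete, here is a minimal sketch of a core-membership check for a tiny cooperative game. The three-player characteristic function is invented for illustration (a hypothetical startup-shares split); it is not an example from the dissertation, and computing the least core or nucleolus would require a linear-programming solver on top of this check.

```python
from itertools import combinations

# Hypothetical 3-player characteristic function: v(S) is the value coalition S
# can obtain on its own. Values are illustrative only.
v = {frozenset(): 0, frozenset({1}): 0, frozenset({2}): 0, frozenset({3}): 0,
     frozenset({1, 2}): 4, frozenset({1, 3}): 4, frozenset({2, 3}): 2,
     frozenset({1, 2, 3}): 6}

def in_core(payoff):
    """True iff the payoff vector is efficient and no coalition can break off."""
    players = frozenset(payoff)
    if abs(sum(payoff.values()) - v[players]) > 1e-9:  # efficiency
        return False
    for r in range(1, len(players)):
        for coal in combinations(players, r):
            if sum(payoff[i] for i in coal) < v[frozenset(coal)] - 1e-9:
                return False  # this coalition would be better off alone
    return True

print(in_core({1: 3, 2: 2, 3: 1}))  # True: every coalition is satisfied
print(in_core({1: 0, 2: 3, 3: 3}))  # False: {1,2} gets 3 < v({1,2}) = 4
```

The least core tightens these coalition constraints uniformly until they bind, and the nucleolus refines that lexicographically; both keep exactly the structure shown here.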
Abstract:
A correct understanding of how computers run code is essential in order to learn to program effectively. Lectures have historically been used in programming courses to teach how computers execute code, and students are assessed through traditional evaluation methods, such as exams. Constructivist learning theory objects to students' passiveness during lessons and to traditional quantitative methods for evaluating a complex cognitive process such as understanding. Constructivism proposes complementary techniques, such as conceptual contraposition and colloquies. We enriched the lectures of a Programming II (CS2) course by combining conceptual contraposition with program memory tracing, then evaluated students' understanding of programming concepts through colloquies. Results revealed that these techniques applied to the lecture are insufficient to help students develop satisfactory mental models of the C++ notional machine, and that colloquies behaved like the most comprehensive traditional evaluations conducted in the course.
Abstract:
This work aims to understand and unify information on epidemiological modelling methods and how those methods relate to public policy on human health, specifically in the context of infectious disease prevention, pandemic planning, and health behaviour change. The thesis employs multiple qualitative and quantitative methods and is presented as a manuscript of several individual, data-driven projects combined in a narrative arc. The first chapter introduces the scope and complexity of this interdisciplinary undertaking, describing several topical intersections of importance. The second chapter begins the presentation of original data and describes in detail two exercises in computational epidemiological modelling pertinent to pandemic influenza planning and policy; the next chapter presents additional original data on how the public's confidence in modelling methodology may affect their planned health behaviour change as recommended in public health policy. The final data-driven chapter describes how health policymakers use modelling methods and scientific evidence to inform and construct health policies for the prevention of infectious diseases, and the thesis concludes with a narrative chapter that evaluates the breadth of these data and recommends strategies for the optimal use of modelling methodologies when informing public health policy in applied public health scenarios.
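As a hedged illustration of the kind of computational epidemiological model such pandemic-planning exercises rest on, a minimal SIR compartmental model takes only a few lines. The transmission and recovery rates below are invented, not taken from the thesis's influenza work.

```python
# Minimal SIR model integrated with explicit Euler steps (illustrative
# parameters only). Fractions s, i, r always sum to 1.
def sir_trajectory(beta, gamma, s0, i0, steps, dt=0.1):
    s, i, r = s0, i0, 1.0 - s0 - i0
    out = [(s, i, r)]
    for _ in range(steps):
        new_inf = beta * s * i * dt   # susceptible -> infected
        new_rec = gamma * i * dt      # infected -> recovered
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        out.append((s, i, r))
    return out

# Hypothetical outbreak with basic reproduction number beta/gamma = 2:
traj = sir_trajectory(beta=0.4, gamma=0.2, s0=0.99, i0=0.01, steps=500)
peak_infected = max(i for _, i, _ in traj)
```

The peak of the infected curve is the quantity pandemic-planning exercises typically interrogate, since it drives surge capacity.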
Abstract:
In this dissertation I draw a connection between quantum adiabatic optimization, spectral graph theory, heat diffusion, and sub-stochastic processes through the operators that govern these processes and their associated spectra. In particular, we study Hamiltonians which have recently become known as "stoquastic" or, equivalently, the generators of sub-stochastic processes. The operators corresponding to these Hamiltonians are of interest in all of the settings mentioned above. I predominantly explore the connection between the spectral gap of an operator, i.e., the difference between the two lowest energies of that operator, and certain equilibrium behavior. In the context of adiabatic optimization, this corresponds to the likelihood of solving the optimization problem of interest. I provide an instance of an optimization problem that is easy to solve classically but leaves open the possibility of being difficult adiabatically. Aside from this concrete example, the work in this dissertation is predominantly mathematical, and we focus on bounding the spectral gap. Our primary tool for doing this is spectral graph theory, which provides the most natural approach to this task by simply considering Dirichlet eigenvalues of subgraphs of host graphs. I derive tight bounds for the gap of one-dimensional, hypercube, and general convex subgraphs. The techniques used also adapt methods recently used by Andrews and Clutterbuck to prove the long-standing "Fundamental Gap Conjecture".
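For the one-dimensional case, the Dirichlet eigenvalues of a path subgraph are classical and make the gap explicit. This sketch uses the standard closed form for the Dirichlet spectrum of a path's graph Laplacian (a textbook fact, not a result specific to this dissertation):

```python
import math

# Dirichlet eigenvalues of the graph Laplacian of a path on n interior
# vertices: lambda_k = 2 - 2*cos(k*pi/(n+1)), k = 1..n (classical formula).
def dirichlet_eigenvalues(n):
    return [2 - 2 * math.cos(k * math.pi / (n + 1)) for k in range(1, n + 1)]

def spectral_gap(n):
    lam = dirichlet_eigenvalues(n)   # already sorted in increasing order
    return lam[1] - lam[0]           # difference of the two lowest energies

print(spectral_gap(100))
```

The gap closes like 3π²/(n+1)² as the path lengthens, which is the kind of scaling that controls adiabatic run times.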
Abstract:
This research concerns the conceptual and empirical relationship between environmental justice and social-ecological resilience as it relates to climate change vulnerability and adaptation. Two primary questions guided this work. First, what is the level of resilience and adaptive capacity for social-ecological systems that are characterized by environmental injustice in the face of climate change? And second, what is the role of an environmental justice approach in developing adaptation policies that will promote social-ecological resilience? These questions were investigated in three African American communities that are particularly vulnerable to flooding from sea-level rise on the Eastern Shore of the Chesapeake Bay. Using qualitative and quantitative methods, I found that in all three communities, religious faith and the church, rootedness in the landscape, and race relations were highly salient to community experience. The degree to which these common aspects of the communities have imparted adaptive capacity has changed over time. Importantly, a given social-ecological factor does not have the same effect on vulnerability in all communities; however, in all communities political isolation decreases adaptive capacity and increases vulnerability. This political isolation is at least partly due to procedural injustice, which occurs for a number of interrelated reasons. This research further revealed that while all stakeholders (policymakers, environmentalists, and African American community members) generally agree that justice needs to be increased on the Eastern Shore, stakeholder groups disagree about what a justice approach to adaptation would look like. When brought together at a workshop, however, these stakeholders were able to identify numerous challenges and opportunities for increasing justice. 
Resilience was assessed by the presence of four resilience factors: living with uncertainty, nurturing diversity, combining different types of knowledge, and creating opportunities for self-organization. Overall, these communities seem to have low resilience; however, there is potential for resilience to increase. Finally, I argue that the use of resilience theory for environmental justice communities is limited by the great breadth and depth of knowledge required to evaluate the state of the social-ecological system, the complexities of simultaneously promoting resilience at both the regional and local scale, and the lack of attention to issues of justice.
Abstract:
In this thesis we examined Zipf's law from both an applied and a theoretical point of view. This empirical law states that the rank-frequency (RF) distribution of the words in a text follows a power law with exponent -1. On the theoretical side, we treated two classes of models capable of reproducing power laws in their probability distributions. In particular, we considered generalizations of Polya urns and SSR (Sample Space Reducing) processes. For the latter we gave a formalization in terms of Markov chains. Finally, we proposed a population-dynamics model capable of unifying and reproducing the results of the three SSR processes in the literature. We then turned to the quantitative analysis of the RF behaviour of the words in a corpus of texts. In this case the RF does not follow a pure power law but shows a twofold behaviour, which can be represented by a power law whose exponent changes. We investigated whether the analysis of the RF behaviour could be linked to the topological properties of a graph. In particular, starting from a corpus of texts we built an adjacency network in which each word was linked to the word that follows it. A topological analysis of the graph structure yielded results that seem to confirm the hypothesis that its structure is related to the change of slope of the RF. This result may lead to developments in the study of language and the human mind. Moreover, since the graph structure appears to contain components that group words by meaning, a deeper study could lead to developments in automatic text comprehension (text mining).
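A rank-frequency check of the kind described above fits in a few lines. The toy word list here is invented, and a real corpus would be needed to see the two-slope behaviour.

```python
import math
from collections import Counter

# Rank-frequency (RF) check in the spirit of the analysis above.
words = "the of and the to the of in the and a the of to the the and of a the".split()
freqs = sorted(Counter(words).values(), reverse=True)

# Least-squares slope of log(frequency) vs log(rank); Zipf's law predicts -1.
xs = [math.log(r) for r in range(1, len(freqs) + 1)]
ys = [math.log(f) for f in freqs]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
print(round(slope, 2))  # negative, near -1 for Zipfian data
```

On a large corpus, fitting this slope separately over low and high ranks exposes the change of exponent the abstract discusses.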
Abstract:
Noise is a constant presence in measurements. Its origin lies in the microscopic properties of matter. Since Brown's seminal work in 1828, the study of stochastic processes has attracted increasing interest with the development of new mathematical and analytical tools. In recent decades, the central role that noise plays in chemical and physiological processes has become recognized. The dual role of noise as nuisance/resource pushes toward the development of new decomposition techniques that divide a signal into its deterministic and stochastic components. In this thesis I show how methods based on Singular Spectrum Analysis (SSA) have the right properties to fulfil this requirement. During my work I applied SSA to different signals of interest in chemistry: I developed a novel iterative procedure for denoising powder X-ray diffractograms, and I "denoised" two-dimensional images from experiments of electrochemiluminescence (ECL) imaging of micro-beads, obtaining new insight into the ECL mechanism. I also used Principal Component Analysis to investigate the relationship between brain electrophysiological signals and voice emission.
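A minimal sketch of the machinery behind SSA, showing only the embedding and reconstruction steps (the SVD and grouping of components are omitted): the signal is folded into a trajectory (Hankel) matrix, and diagonal averaging maps any such matrix back to a series.

```python
# Embedding step of Singular Spectrum Analysis: build the trajectory (Hankel)
# matrix of a 1-D signal, and invert it by diagonal averaging.
def trajectory_matrix(signal, window):
    return [signal[i:i + window] for i in range(len(signal) - window + 1)]

def diagonal_average(matrix):
    """Average each anti-diagonal to map a matrix back to a 1-D series."""
    rows, cols = len(matrix), len(matrix[0])
    n = rows + cols - 1
    sums, counts = [0.0] * n, [0] * n
    for i in range(rows):
        for j in range(cols):
            sums[i + j] += matrix[i][j]
            counts[i + j] += 1
    return [s / c for s, c in zip(sums, counts)]

x = [1.0, 2.0, 3.0, 4.0, 5.0]
X = trajectory_matrix(x, window=3)   # 3 lagged copies of the signal
print(diagonal_average(X) == x)      # True: Hankelization inverts the embedding
```

In full SSA the trajectory matrix is decomposed by SVD, components are grouped into deterministic and stochastic parts, and each group is Hankelized back into a signal exactly as above.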
Abstract:
The challenges of current global food systems are often framed around feeding the world's growing population while meeting sustainable development goals for future generations. Globalization has brought about a fragmentation of food spaces, leading to a flexible and mutable supply chain. This poses a major challenge to food and nutrition security and also affects rural-urban dynamics in territories. Furthermore, recent crises have highlighted the vulnerability of food systems and the ecosystem to shocks and disruptions, owing to the intensive management of natural, human and economic capital. Hence, a sustainable and resilient transition of food systems is required, through a multi-faceted approach that tackles the causes of unsustainability and promotes sustainable practices at all levels of the food system. In this respect, a territorial approach becomes a relevant entry point for analysing the food system's multifunctionality: it can support the evaluation of sustainability by quantifying impacts with quantitative methods and by understanding the territorial responsibility of different actors with qualitative ones. Against this background, the present research aims to i) investigate the environmental, costing and social indicators suitable for a scoring system able to measure the integrated sustainability performance of food initiatives within the City/Region territorial context; ii) develop a territorial assessment framework to measure the sustainability impacts of agricultural systems; and iii) define an integrated methodology to match production and consumption at a territorial level and foster a long-term vision of short food supply chains. From a methodological perspective, the research proposes a mixed quantitative and qualitative research method.
The outcomes provide an in-depth view into the environmental and socio-economic impacts of food systems at the territorial level, investigating possible indicators, frameworks, and business strategies to foster their future sustainable development.
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Cellulose acetates with different degrees of substitution (DS, from 0.6 to 1.9) were prepared from previously mercerized linter cellulose, in a homogeneous medium, using N,N-dimethylacetamide/lithium chloride as a solvent system. The influence of different degrees of substitution on the properties of cellulose acetates was investigated using thermogravimetric analyses (TGA). Quantitative methods were applied to the thermogravimetric curves in order to determine the apparent activation energy (Ea) related to the thermal decomposition of untreated and mercerized celluloses and cellulose acetates. Ea values were calculated using Broido's method and considering dynamic conditions. Ea values of 158 and 187 kJ mol-1 were obtained for untreated and mercerized cellulose, respectively. A previous study showed that C6OH is the most reactive site for acetylation, probably due to the steric hindrance of C2 and C3. The C6OH takes part in the first step of cellulose decomposition, leading to the formation of levoglucosan and, when it is changed to C6OCOCH3, the results indicate that the mechanism of thermal decomposition changes to one with a lower Ea. A linear correlation between Ea and the DS of the acetates prepared in the present work was identified.
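Broido's method extracts Ea from the slope of ln(ln(1/y)) against 1/T, where y is the residual mass fraction. The following sketch performs that fit on synthetic TGA points constructed to be consistent with Ea = 158 kJ/mol; the intercept of 30 is arbitrary, and none of these numbers are the paper's measured data.

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def broido_ea(temps_K, y_fractions):
    """Apparent activation energy (J/mol) from the Broido plot slope."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(math.log(1.0 / y)) for y in y_fractions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (v - my) for x, v in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope * R  # slope of the Broido plot is -Ea/R

# Synthetic TGA points generated from ln(ln(1/y)) = 30 - Ea/(R*T):
Ea_true = 158e3
temps = [580.0, 600.0, 620.0, 640.0]
ys = [math.exp(-math.exp(30.0 - Ea_true / (R * T))) for T in temps]
print(round(broido_ea(temps, ys) / 1000))  # 158
```

On real thermogravimetric curves the fit would be restricted to the temperature window where the single-step decomposition assumption behind Broido's method holds.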
Abstract:
Purpose: To evaluate patellar kinematics of volunteers without knee pain at rest and during isometric contraction in open- and closed-kinetic-chain exercises. Methods: Twenty individuals took part in this study. All were submitted to magnetic resonance imaging (MRI) during rest and voluntary isometric contraction (VIC) in the open and closed kinetic chains at 15 degrees, 30 degrees, and 45 degrees of knee flexion. Through MRI and using medical e-film software, the following measurements were evaluated: sulcus angle, patellar-tilt angle, and bisect offset. The mixed-effects linear model was used for comparison between knee positions, between rest and isometric contractions, and between the exercises. Results: Data analysis revealed that the sulcus angle decreased as knee flexion increased and increased with isometric contractions in both the open and closed kinetic chain for all knee-flexion angles. The patellar-tilt angle decreased with isometric contractions in both the open and closed kinetic chain for every knee position. However, in the closed kinetic chain, patellar tilt increased significantly with the knee flexed at 15 degrees. The bisect offset increased with the knee flexed at 15 degrees during isometric contractions and decreased as knee flexion increased during both exercises. Conclusion: VIC in the last degrees of knee extension may compromise patellar dynamics. On the other hand, it is possible to favor patellar stability by performing muscle contractions with the knee flexed at 30 degrees and 45 degrees in either the open or closed kinetic chain.