22 results for practical epistemology analysis
at Université de Lausanne, Switzerland
Abstract:
This guide introduces Data Envelopment Analysis (DEA), a performance measurement technique, in a way that is accessible to decision makers with little or no background in economics or operational research. The use of mathematics is kept to a minimum. The guide therefore adopts a strongly practical approach so that decision makers can conduct their own efficiency analyses and easily interpret the results. DEA helps decision makers in the following ways:
- By calculating an efficiency score, it indicates whether a firm is efficient or has capacity for improvement.
- By setting target values for inputs and outputs, it calculates by how much inputs must be decreased or outputs increased for the firm to become efficient.
- By identifying the nature of returns to scale, it indicates whether a firm has to decrease or increase its scale (or size) in order to minimize average cost.
- By identifying a set of benchmarks, it specifies which other firms' processes should be analysed so the firm can improve its own practices.
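The efficiency score described above can be sketched as a small linear program. This is a minimal input-oriented CCR model on hypothetical firms, solved with scipy as one possible tool; it is an illustration of the technique, not the guide's own software:

```python
# Minimal input-oriented CCR DEA sketch (hypothetical data).
import numpy as np
from scipy.optimize import linprog

# Rows are firms; columns are inputs / outputs (all values assumed).
X = np.array([[2.0], [4.0], [3.0]])   # one input per firm
Y = np.array([[1.0], [2.0], [1.0]])   # one output per firm

def efficiency(k):
    """Efficiency score of firm k: minimise theta such that a composite
    benchmark uses at most theta * inputs of firm k and produces at
    least its outputs."""
    n = X.shape[0]
    c = np.r_[1.0, np.zeros(n)]                    # variables: theta, lambdas
    # Inputs:  sum_j lambda_j * X[j] - theta * X[k] <= 0
    A_in = np.hstack([-X[k][:, None], X.T])
    b_in = np.zeros(X.shape[1])
    # Outputs: -sum_j lambda_j * Y[j] <= -Y[k]
    A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
    b_out = -Y[k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (n + 1))
    return res.fun

print([round(efficiency(k), 3) for k in range(3)])
```

A score of 1 marks an efficient firm; a score below 1 gives the proportional input reduction needed to reach the efficient frontier.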
Abstract:
Game theory describes and analyzes strategic interaction. A distinction is usually drawn between static games, strategic situations in which the players choose only once and simultaneously, and dynamic games, strategic situations involving sequential choices. Dynamic games can be further classified according to perfect and imperfect information. A dynamic game is said to exhibit perfect information whenever, at any point of the game, every player has full informational access to all choices made so far; in the case of imperfect information, some players are not fully informed about some choices. Game-theoretic analysis proceeds in two steps. Firstly, games are modelled by so-called form structures, which extract and formalize the significant parts of the underlying strategic interaction. The basic and most commonly used models of games are the normal form, which sparsely describes a game merely in terms of the players' strategy sets and utilities, and the extensive form, which models a game in more detail as a tree. Indeed, it is standard to formalize static games with the normal form and dynamic games with the extensive form. Secondly, solution concepts are developed to solve models of games, in the sense of identifying the choices that should be made by rational players. The ultimate objective of the classical approach to game theory, which is of normative character, is the development of a solution concept capable of identifying a unique choice for every player in an arbitrary game. However, given the large variety of games, it is not at all certain that it is possible to devise a solution concept with such universal capability. Alternatively, interactive epistemology provides an epistemic approach to game theory of descriptive character.
This rather recent discipline analyzes the relation between knowledge, belief and choice of game-playing agents in an epistemic framework. The description of the players' choices in a given game, relative to various epistemic assumptions, constitutes the fundamental problem addressed by an epistemic approach to game theory. In a general sense, the objective of interactive epistemology consists in characterizing existing game-theoretic solution concepts in terms of epistemic assumptions, as well as in proposing novel solution concepts by studying the game-theoretic implications of refined or new epistemic hypotheses. Intuitively, an epistemic model of a game can be interpreted as representing the reasoning of the players. Indeed, before making a decision in a game, the players reason about the game and their respective opponents, given their knowledge and beliefs. Precisely these epistemic mental states, on which players base their decisions, are explicitly expressible in an epistemic framework. In this PhD thesis, we consider an epistemic approach to game theory from a foundational point of view. In Chapter 1, basic game-theoretic notions as well as Aumann's epistemic framework for games are expounded and illustrated; Aumann's sufficient conditions for backward induction are also presented and his conceptual views discussed. In Chapter 2, Aumann's interactive epistemology is conceptually analyzed. In Chapter 3, which is based on joint work with Conrad Heilmann, a three-stage account for dynamic games is introduced and a type-based epistemic model is extended with a notion of agent connectedness. Then, sufficient conditions for backward induction are derived. In Chapter 4, which is based on joint work with Jérémie Cabessa, a topological approach to interactive epistemology is initiated. In particular, the epistemic-topological operator limit knowledge is defined and some of its implications for games are considered.
In Chapter 5, which is based on joint work with Jérémie Cabessa and Andrés Perea, Aumann's impossibility theorem on agreeing to disagree is revisited and weakened in the sense that possible contexts are provided in which agents can indeed agree to disagree.
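The backward-induction solution concept discussed above can be illustrated on a toy perfect-information game tree. This is a plain algorithmic sketch with hypothetical payoffs, not the epistemic formalism of the thesis:

```python
# Backward induction on a two-player perfect-information game tree
# (illustrative toy example; payoffs are hypothetical).

def backward_induction(node):
    """Return the payoff profile (p0, p1) reached under rational play."""
    if "payoffs" in node:                       # terminal history
        return node["payoffs"]
    player = node["player"]                     # index of the mover here
    outcomes = [backward_induction(child) for child in node["children"]]
    return max(outcomes, key=lambda p: p[player])  # mover picks best for self

# Centipede-style toy tree: each player can "take" (end the game) or "pass".
game = {"player": 0, "children": [
    {"payoffs": (1, 0)},                        # player 0 takes immediately
    {"player": 1, "children": [
        {"payoffs": (0, 2)},                    # player 1 takes
        {"payoffs": (3, 1)},                    # both pass
    ]},
]}

print(backward_induction(game))  # → (1, 0): the game unravels to "take"
```

Working from the leaves upward, player 1 would take (0, 2) over (3, 1), so player 0, anticipating this, takes (1, 0) at the root, the classic unravelling result.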
Abstract:
Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions of how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here draws on methodology from decision theory that may help to cope usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information or the (expected) decision loss. All of these aspects relate directly to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples.
The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
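The kind of probability statement described above, that the proportion of a seizure satisfying a criterion exceeds a threshold, can be sketched with a conjugate Beta model. The prior and the counts below are hypothetical, not the paper's figures:

```python
# Bayesian probability statement about a seizure proportion
# (assumed uniform prior and hypothetical inspection counts).
from scipy.stats import beta

a0, b0 = 1, 1            # uniform Beta(1, 1) prior on the proportion
n, k = 10, 10            # items inspected, items found to contain the substance

posterior = beta(a0 + k, b0 + n - k)   # conjugate update -> Beta(11, 1)

# Probability that at least 90% of the whole seizure contains the substance:
p = 1 - posterior.cdf(0.9)
print(round(p, 3))       # → 0.686
```

Inspecting more items tightens the posterior, which is exactly where the decision-theoretic notions above (expected value of sample information versus sampling cost) come into play.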
Abstract:
Cough is a very frequent symptom in children. Different reviews have tried to delineate the best approach to pediatric cough.1 Clinical evaluation remains the most important initial diagnostic step. Although the relations between cough and asthma are not straightforward,2 wheeze should be considered a physical sign of increased resistance to air flow. Lung function testing is the gold standard for analyzing pulmonary resistance to air flow but has limited practical value in young children. The clinical evaluation of the presence or absence of wheeze thus remains a primary clinical step in coughing children. Young children do not necessarily breathe deeply in and out when asked to. For years, the author has used a so-called "squeeze and wheeze" maneuver (SWM, see Methods section for definition) to elicit chest signs in young children. The basic idea is to increase expiratory flows in children who do not cooperate adequately during lung sound analysis. This study was conducted to report the author's experience of an as yet unreported physical sign and to study its prevalence in young children cared for in a general pediatrics practice.
Abstract:
The right to be treated humanely when detained is universally recognized. Deficiencies in detention conditions and violence, however, subvert this right. When this occurs, proper medico-legal investigations are critical irrespective of the nature of death. Unfortunately, the very context of custody raises serious concerns over the effectiveness and fairness of medico-legal examinations. The aim of this manuscript is to identify and discuss the practical and ethical difficulties encountered in the medico-legal investigation following deaths in custody. Data for this manuscript come from a larger project on Death in Custody that examined the causes of deaths in custody and the conditions under which these deaths should be investigated and prevented. A total of 33 stakeholders from forensic medicine, law, prison administration or national human rights administration were interviewed. Data obtained were analyzed qualitatively. Forensic experts are an essential part of the criminal justice process as they offer evidence for subsequent indictment and eventual punishment of perpetrators. Their independence when investigating a death in custody was deemed critical and lack thereof, problematic. When experts were not independent, concerns arose in relation to conflicts of interest, biased perspectives, and low-quality forensic reports. The solutions to ensure independent forensic investigations of deaths in custody must be structural and simple: setting binding standards of practice rather than detailed procedures and relying on preexisting national practices as opposed to encouraging new practices that are unattainable for countries with limited resources.
Abstract:
Short description of the proposed presentation (less than 100 words): This paper describes the interdisciplinary work done in Uspantán, Guatemala, a city vulnerable to natural hazards. We investigated local responses to landslides that occurred in 2007 and 2010 and had a strong impact on the local community. We present a complete example of a systemic approach that incorporates physical, social and environmental aspects in order to understand risk. The objective of this work is to present the combination of social and geological data (mapping) and to describe the methodology used for the identification and assessment of risk. The article discusses both the limitations and the methodological challenges encountered when conducting interdisciplinary research.
Why it is important to present this topic at the Global Platform (less than 50 words): This work shows the benefits of addressing risk from an interdisciplinary perspective, in particular how integrating the social sciences can help identify new phenomena and natural hazards and assess risk. It gives a practical example of how one can integrate data from different fields.
What is innovative about this presentation? The use of mapping to combine qualitative and quantitative data. By coupling approaches, we could associate a hazard map with qualitative data gathered through interviews with the population. This map is an important document for the authorities: it allows them to be aware of the most dangerous zones, the affected families and the places where intervention is most urgent.
Abstract:
Purpose - Work values are an important characteristic for understanding gender differences in career intentions, but how gender affects the relationship between values and career intentions is not well established. The purpose of this paper is to investigate whether gender moderates the effects of work values on the level and change of entrepreneurial intentions (EI).
Design/methodology/approach - In total, 218 German university students were surveyed on work values, with EI assessed three times over the course of 12 months. Data were analysed with latent growth modelling.
Findings - Self-enhancement and openness-to-change values predicted higher levels, and conservation values lower levels, of EI. Gender moderated the effects of enhancement and conservation values on change in EI.
Research limitations/implications - The authors relied on self-reported measures and the sample was restricted to university students. Future research needs to verify to what extent these results generalize to other samples and other career fields, such as science or nursing.
Practical implications - The results imply that men and women are interested in an entrepreneurial career based on the same work values, but that values have different effects for men and women regarding individual changes in EI. The results suggest that the prototypical work values of a career domain are important for increasing the career intent of the gender that is underrepresented in that domain.
Originality/value - The results enhance understanding of how gender affects the relation between work values and a specific career intention, such as entrepreneurship.
Abstract:
In Part I of this review, we covered basic concepts regarding cardiorespiratory interactions. Here, we put this theoretical framework to practical use. We describe the mechanisms underlying Kussmaul's sign and pulsus paradoxus. We review the literature on the use of respiratory variations of blood pressure to evaluate volume status. We show the possibilities of attaining the latter aim by investigating with ultrasonography how the geometry of the great veins fluctuates with respiration. We provide a Guytonian analysis of the effects of PEEP on cardiac output. We conclude with some remarks on the potential of positive pressure breathing to induce acute cor pulmonale, and on the cardiovascular mechanisms that at times may underlie the failure to wean a patient from the ventilator.
Abstract:
OBJECTIVE: To validate a revision of the Mini Nutritional Assessment short-form (MNA(R)-SF) against the full MNA, a standard tool for nutritional evaluation. METHODS: A literature search identified studies that used the MNA for nutritional screening in geriatric patients. The contacted authors submitted original datasets that were merged into a single database. Various combinations of the questions on the current MNA-SF were tested using this database through combination analysis and ROC-based derivation of classification thresholds. RESULTS: Twenty-seven datasets (n=6257 participants) were initially processed, of which twelve were used in the current analysis on a sample of 2032 study participants (mean age 82.3 y) with complete information on all MNA items. The original MNA-SF was a combination of six questions from the full MNA. A revised MNA-SF, in which calf circumference (CC) was substituted for BMI, performed equally well. A revised three-category scoring classification for this revised MNA-SF, using BMI and/or CC, had good sensitivity compared to the full MNA. CONCLUSION: The newly revised MNA-SF is a valid nutritional screening tool applicable to geriatric health care professionals, with the option of using CC when BMI cannot be calculated. The inclusion of a "malnourished" category increases the applicability of this rapid screening tool in clinical practice.
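The ROC-based derivation of classification thresholds mentioned in the Methods can be sketched as follows, using Youden's J statistic on synthetic screening scores (not the study's data; low scores are assumed to flag nutritional risk):

```python
# ROC-style threshold derivation via Youden's J (synthetic data).
import numpy as np

def best_threshold(scores, is_malnourished):
    """Pick the cut-off maximising Youden's J = sensitivity + specificity - 1."""
    best_j, best_t = -1.0, None
    for t in np.unique(scores):
        pred = scores <= t                      # a low score flags risk
        tp = np.sum(pred & is_malnourished)
        tn = np.sum(~pred & ~is_malnourished)
        sens = tp / np.sum(is_malnourished)
        spec = tn / np.sum(~is_malnourished)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t, best_j

scores = np.array([3, 5, 6, 8, 10, 11, 12, 13])          # screening scores
status = np.array([1, 1, 1, 1, 0, 0, 0, 0], dtype=bool)  # reference standard
print(best_threshold(scores, status))
```

Repeating this against the full-MNA classification, for each candidate question set, is the general shape of the derivation described above.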
Abstract:
Practical guidelines for monitoring and measuring compounds such as jasmonates, ketols, ketodi(tri)enes and hydroxy-fatty acids as well as detecting the presence of novel oxylipins are presented. Additionally, a protocol for the penetrant analysis of non-enzymatic lipid oxidation is described. Each of the methods, which employ gas chromatography/mass spectrometry, can be applied without specialist knowledge or recourse to the latest analytical instrumentation. Additional information on oxylipin quantification and novel protocols for preparing oxygen isotope-labelled internal standards are provided. Four developing areas of research are identified: (i) profiling of the unbound cellular pools of oxylipins; (ii) profiling of esterified oxylipins and/or monitoring of their release from parent lipids; (iii) monitoring of non-enzymatic lipid oxidation; (iv) analysis of unstable and reactive oxylipins. The methods and protocols presented herein are designed to give technical insights into the first three areas and to provide a platform from which to enter the fourth area.
Abstract:
Purpose: Given the preponderance of education reform since the No Child Left Behind Act (U.S. Department of Education, 2001), reform efforts have shaped the nature of the work and culture in schools. The emphasis on standardized testing to determine schools' status and student performance, among other factors, has generated stress, particularly for teachers. Therefore, district and school administrators are encouraged to consider the contextual factors that contribute to teacher stress in order to address them and to retain high-performing teachers. Research Methods/Approach: Participants were recruited from two types of schools in order to test hypotheses related to directional responding as a function of working in a more challenging (high-priority) or less challenging (non-high-priority) school environment. We employed content analysis to analyze 64 suburban elementary school teachers' free responses to a prompt regarding their stress as teachers. We cross-analyzed our findings through external auditing to bolster trustworthiness in the data and in the procedure. Findings: Teachers reported personal and contextual stressors. Herein, we report concrete examples of the five categories of contextual stressors teachers identified: political and educational structures, instructional factors, student factors, parent and family factors, and school climate. We found directional qualities and overlapping relationships in the data, partially confirming our hypotheses. Implications for Research and Practice: We offer specific recommendations for practical ways in which school administrators might systemically address teacher stress based on the five categories of stressors reported by participants. We also suggest means of conducting action research to measure the effects of implemented suggestions.
Abstract:
This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning amounts to an application that requires extensive quantitative information, along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering before focusing on particular numerical probability assignments. Analyses are proposed that aim to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations shows that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute.
This case will serve the purpose of exemplifying the possible usage of data from various sources in casework and help to discuss the difficulty associated with reconciling the depth of theoretical likelihood ratio developments and limitations in the degree to which these developments can actually be applied in practice.
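The point about relative versus absolute values can be illustrated with a minimal likelihood-ratio computation (the probability assignments below are hypothetical, not the footwear case data):

```python
# Minimal likelihood-ratio sketch: only the ratio of the two
# assignments matters, not their absolute scale (hypothetical numbers).

def likelihood_ratio(p_e_given_hp, p_e_given_hd):
    """Value of evidence E for hypothesis Hp relative to hypothesis Hd."""
    return p_e_given_hp / p_e_given_hd

lr = likelihood_ratio(0.8, 0.01)      # absolute probability assignments
lr_rel = likelihood_ratio(80, 1)      # same assessment on a relative scale
print(lr, lr_rel)                     # both ≈ 80: E supports Hp over Hd
```

Because any common scaling of numerator and denominator cancels, an examiner who can only defend the judgment "E is about eighty times more probable under Hp than under Hd" still characterizes the same likelihood ratio, which is the qualitative point argued above.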
Abstract:
Whether for investigative or intelligence aims, crime analysts often face the need to analyse the spatiotemporal distribution of crimes or of traces left by suspects. This article presents a visualisation methodology supporting recurrent practical analytical tasks such as the detection of crime series or the analysis of traces left by digital devices such as mobile phones or GPS devices. The proposed approach has led to the development of a dedicated tool that has proven its effectiveness in real inquiries and intelligence practices. It supports a more fluent visual analysis of the collected data and may provide critical clues to support police operations, as exemplified by the presented case studies.
Abstract:
This article extends the existing discussion in the literature on probabilistic inference and decision making with respect to continuous hypotheses, which are prevalent in forensic toxicology. As its main aim, this research investigates the properties of a widely followed approach for quantifying the level of toxic substances in blood samples and compares this procedure with a Bayesian probabilistic approach. As an example, attention is confined to the presence of toxic substances, such as THC, in the blood of car drivers. In this context, the interpretation of results from laboratory analyses needs to take into account the legal requirements for establishing the 'presence' of target substances in blood. In the first part, the performance of the proposed Bayesian model for the estimation of an unknown parameter (here, the amount of a toxic substance) is illustrated and compared with the currently used method. In the second part, the model is used to address, in a rational way, the decision component of the problem, that is, judicial questions of the kind 'Is the quantity of THC measured in the blood over the legal threshold of 1.5 μg/l?'. This is illustrated with a practical example.
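The kind of statement at stake in such judicial questions, the posterior probability that a true concentration exceeds the 1.5 μg/l threshold, can be sketched under a simple normal measurement model with a flat prior (all numbers below are hypothetical, not the article's model or data):

```python
# Posterior probability that a true THC concentration exceeds a legal
# threshold, under a normal measurement model (hypothetical numbers).
import math
from statistics import NormalDist

measurements = [1.55, 1.60, 1.48]   # replicate analyses, ug/L
sigma = 0.10                         # assumed known measurement s.d., ug/L
threshold = 1.5                      # legal threshold, ug/L

# With a flat prior, the posterior for the true concentration is normal,
# centred on the sample mean with s.d. sigma / sqrt(n).
n = len(measurements)
post_mean = sum(measurements) / n
post_sd = sigma / math.sqrt(n)

p_over = 1 - NormalDist(post_mean, post_sd).cdf(threshold)
print(round(p_over, 3))
```

A decision-theoretic treatment would then weigh this probability against the losses attached to wrongly declaring the quantity over or under the threshold, rather than reporting the point estimate alone.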
Abstract:
Objectives To consider the various specific substance-taking activities in sport, an examination of three psychological models of doping behaviour used by researchers is presented, in order to evaluate their real and potential impact and to improve the relevance and efficiency of anti-doping campaigns. Design Adopting the notion of a "research program" (Lakatos, 1978) from the philosophy of science, a range of studies into the psychology of doping behaviour is classified and critically analysed. Method The theoretical and practical parameters of three research programs are critically evaluated: (i) cognitive; (ii) drive; and (iii) situated-dynamic. Results The analysis reveals the diversity of the theoretical commitments of the research programs and their practical consequences. The "cognitive program" assumes that athletes are accountable for acts that reflect the endeavour to attain sporting and non-sporting goals; attitudes, knowledge and rational decisions are understood to be the basis of doping behaviour. The "drive program" characterises the variety of traces and consequences, on psychological and somatic states, arising from the athlete's experience of sport; doping behaviour is here conceived of as a solution for reducing unconscious psychological and somatic distress. The "situated-dynamic program" considers the broader context of athletes' doping activity and its evolution during a sporting career; doping is considered an emergent, self-organized behaviour, grounded in temporally critical couplings between athletes' actions and situations and the specific dynamics of their development during the sporting life course.
Conclusions These hypothetical, theoretical and methodological considerations offer a more nuanced understanding of doping behaviours, making an effective contribution to anti-doping education and research by enabling researchers and policy personnel to become more critically reflective about their explicit and implicit assumptions regarding models of explanations for doping behaviour.