878 results for implementation and complexity theory
Abstract:
The neo-liberal capitalist ideology has come under heavy fire, with anecdotal evidence indicating a link between these same values and unethical behavior. Academic institutions reflect social values and act as socializing agents for the young. Can this explain the high and increasing rates of cheating that currently prevail in education? Our first chapter examines the question of whether self-enhancement values of power and achievement, the individual-level equivalent of neo-liberal capitalist values, predict positive attitudes towards cheating. Furthermore, we explore the mediating role of motivational factors. Results of four studies reveal that self-enhancement value endorsement predicts the adoption of performance-approach goals, a relationship mediated by introjected regulation, namely the desire for social approval, and that self-enhancement value endorsement also predicts the condoning of cheating, a relationship mediated by performance-approach goal adoption. However, self-transcendence values prescribed by a normatively salient source have the potential to reduce the link between self-enhancement value endorsement and attitudes towards cheating. Normative assessment constitutes a key tool used by academic institutions to socialize young people to accept the competitive, meritocratic nature of a society driven by a neo-liberal capitalist ideology. As such, the manifest function of grades is to motivate students to work hard and to buy into the competitive ethos. Does normative assessment fulfill these functions? Our second chapter explores the reward-intrinsic motivation question in the context of grading, arguably a high-stakes reward. In two experiments, we assess the relative capacity of graded high performance, as compared to the task autonomy experienced in an ungraded task, to predict post-task intrinsic motivation. Results show that whilst graded task performance predicts post-task appreciation, it fails to predict ongoing motivation. However, perceived autonomy experienced in the non-graded condition predicts both post-task appreciation and ongoing motivation. Our third chapter asks whether normative assessment inspires the spirit of competition in students. Results of three experimental studies reveal that expectation of a grade for a task, compared to no grade, induces greater adoption of performance-avoidance, but not performance-approach, goals. Experiment 3 provides an explanatory mechanism for this, showing that reduced autonomous motivation experienced in previous graded tasks mediates the relationship between grading and the adoption of performance-avoidance goals in a subsequent task. Taken together, these results provide evidence of the deleterious effects of self-enhancement values, and the associated practice of normative assessment in school, on student motivation, goals and ethics. We conclude by using value and motivation theory to explore solutions to this problem.
Abstract:
The innovative subject "Introduction to ICT" combines a general introductory course to the university with elements of the Information and Communication Technologies sector (including the ICT engineer competence profile, market aspects, etc.). This new course has been developed and implemented in three degree programmes offered by the Polytechnic School at Universitat Pompeu Fabra, Barcelona. The course team consists of thirteen teachers, including business professionals, librarians, computer technicians and institutional representatives, as well as an educationalist responsible for advising on methodology and study techniques. The subject was designed for a high number of students (260). At the end of the course, we collected quantitative and qualitative information about the students' satisfaction. The findings show that students had a positive view of the topics covered during the subject. This paper describes the course, its implementation and evaluation and, of course, the details of the findings we collected about students' satisfaction.
Abstract:
Forensic science casework involves making a series of choices. The difficulty in making these choices lies in the inevitable presence of uncertainty, the unique context of circumstances surrounding each decision and, in some cases, the complexity due to numerous, interrelated random variables. Given that these decisions can lead to serious consequences in the administration of justice, forensic decision making should be supported by a robust framework that makes inferences under uncertainty and decisions based on these inferences. The objective of this thesis is to respond to this need by presenting a framework for making rational choices in decision problems encountered by scientists in forensic science laboratories. Bayesian inference and decision theory meet the requirements for such a framework. To attain its objective, this thesis consists of three propositions, advocating the use of (1) decision theory, (2) Bayesian networks, and (3) influence diagrams for handling forensic inference and decision problems. The results present a uniform and coherent framework for making inferences and decisions in forensic science using the above theoretical concepts. They describe how to organize each type of problem by breaking it down into its different elements, and how to find the most rational course of action by distinguishing between one-stage and two-stage decision problems and applying the principle of expected utility maximization. To illustrate the framework's application to the problems encountered by scientists in forensic science laboratories, theoretical case studies apply decision theory, Bayesian networks and influence diagrams to a selection of different types of inference and decision problems dealing with different categories of trace evidence. Two studies of the two-trace problem illustrate how the construction of Bayesian networks can handle complex inference problems, and thus overcome the hurdle of complexity that can be present in decision problems. Three studies, one on what to conclude when a database search provides exactly one hit, one on what genotype to search for in a database based on the observations made on DNA typing results, and one on whether to submit a fingermark to the process of comparing it with prints of its potential sources, explain the application of decision theory and influence diagrams to each of these decisions. The results of the theoretical case studies support the thesis's three propositions. Hence, this thesis presents a uniform framework for organizing and finding the most rational course of action in decision problems encountered by scientists in forensic science laboratories. The proposed framework is an interactive and exploratory tool for better understanding a decision problem so that this understanding may lead to better informed choices.
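To make the decision-theoretic core concrete, here is a minimal sketch of one-stage expected utility maximization applied to a decision of the kind the thesis studies, such as whether to submit a fingermark for comparison. The posterior probabilities and utilities below are hypothetical illustrations, not values from the thesis.

```python
# A minimal sketch of one-stage decision making by expected utility
# maximization. All numbers are hypothetical illustrations.

import numpy as np

# States of nature: the fingermark does / does not come from the suspect.
posterior = np.array([0.85, 0.15])  # P(state | evidence), hypothetical

# Utility of each (action, state) pair, on an arbitrary 0-1 scale.
#                      source  not source
utilities = np.array([
    [1.00, 0.20],   # action 0: submit the mark for comparison
    [0.50, 0.50],   # action 1: do not submit
])

expected_utility = utilities @ posterior        # EU of each action
best_action = int(np.argmax(expected_utility))  # maximize expected utility

print(expected_utility)          # [0.88 0.5 ]
print("choose action", best_action)
```

A two-stage problem would extend this by evaluating, for each first-stage action, the expected utility of the best second-stage response before maximizing, which is exactly the kind of bookkeeping influence diagrams automate.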
Abstract:
OBJECTIVE: This study was undertaken to determine the delay of extubation attributable to ventilator-associated pneumonia (VAP), in comparison to other complications and the complexity of surgery, after repair of congenital heart lesions in neonates and children. METHODS: Cohort study in a pediatric intensive care unit of a tertiary referral center. All patients who had cardiac operations during a 22-month period and who survived surgery were eligible (n = 272, median age 1.3 years). Primary outcome was time to successful extubation. The primary variable of interest was VAP. Surgical procedures were classified according to complexity. Cox proportional hazards models were calculated to adjust for confounding. Potential confounders comprised other known risk factors for delayed extubation. RESULTS: Median time to extubation was 3 days. VAP occurred in 26 patients (9.6%). The rate of VAP was not associated with complexity of surgery (P = 0.22) or cardiopulmonary bypass (P = 0.23). The adjusted analysis revealed further factors associated with delayed extubation: other respiratory complications (n = 28; chylothorax, airway stenosis, diaphragm paresis), prolonged inotropic support (n = 48, 17.6%), and the need for secondary surgery (n = 51, 18.8%; e.g., re-operation, secondary closure of thorax). Older age promoted early extubation. The median delay of extubation attributable to VAP was 3.7 days (hazard ratio HR = 0.29, 95% CI 0.18-0.49), exceeding the effect size of secondary surgery (HR = 0.48) and other respiratory complications (HR = 0.50). CONCLUSION: VAP accounts for a major delay of extubation in pediatric cardiac surgery.
Abstract:
It has long been standard in agency theory to search for incentive-compatible mechanisms on the assumption that people care only about their own material wealth. However, this assumption is clearly refuted by numerous experiments, and we feel that it may be useful to consider nonpecuniary utility in mechanism design and contract theory. Accordingly, we devise an experiment to explore optimal contracts in an adverse-selection context. A principal proposes one of three contract menus, each of which offers a choice of two incentive-compatible contracts, to two agents whose types are unknown to the principal. The agents know the set of possible menus, and choose to either accept one of the two contracts offered in the proposed menu or to reject the menu altogether; a rejection by either agent leads to lower (and equal) reservation payoffs for all parties. While all three possible menus favor the principal, they do so to varying degrees. We observe numerous rejections of the more lopsided menus, and approach an equilibrium where one of the more equitable contract menus (which one depends on the reservation payoffs) is proposed and agents accept a contract, selecting actions according to their types. Behavior is largely consistent with all recent models of social preferences, strongly suggesting there is value in considering nonpecuniary utility in agency theory.
Abstract:
One of the disadvantages of old age is that there is more past than future: this, however, may be turned into an advantage if the wealth of experience and, hopefully, wisdom gained in the past can be reflected upon and throw some light on possible future trends. To an extent, then, this talk is necessarily personal, certainly nostalgic, but also self-critical and inquisitive about our understanding of the discipline of statistics. A number of almost philosophical themes will run through the talk: search for appropriate modelling in relation to the real problem envisaged, emphasis on sensible balances between simplicity and complexity, the relative roles of theory and practice, the nature of communication of inferential ideas to the statistical layman, the inter-related roles of teaching, consultation and research. A list of keywords might be: identification of sample space and its mathematical structure, choices between transform and stay, the role of parametric modelling, the role of a sample space metric, the underused hypothesis lattice, the nature of compositional change, particularly in relation to the modelling of processes. While the main theme will be relevance to compositional data analysis, we shall point to substantial implications for general multivariate analysis arising from experience of the development of compositional data analysis…
Abstract:
Ectoparasites are common in most bird species, but experimental evidence of their effects on life-history traits is scarce. We investigated experimentally the effects of the hematophagous hen flea (Ceratophyllus gallinae) on timing of reproduction, nest-site choice, nest desertion, clutch size, and hatching success in the great tit (Parus major). When great tits were offered a choice on their territory between an infested and a parasite-free nest-box, they chose the one without parasites. When there was no choice, the great tits in a territory containing an infested nest-box delayed laying the clutch by 11 days as compared with the birds that were offered a parasite-free nesting opportunity. The finding that there was no difference in phenotypic traits related to dominance between the birds nesting in infested boxes and birds nesting in parasite-free boxes suggests that the delay is not imposed by social dominance. Nest desertion between laying and shortly after hatching was significantly higher in infested nests. There was no difference between infested and parasite-free nests in clutch size, but hatching success and hence brood size at hatching were significantly smaller in infested nests. Nest-box studies of great tits have been seminal in the development of evolutionary, ecological, and behavioral theory, but recently a polemic has arisen in the literature about the validity of the conclusions drawn from nest-box studies where the naturally occurring, detrimental ectoparasites are eliminated by the routine removal of old nests between breeding seasons. Our study suggests that this criticism is valid and that the evaluation of the effects of ectoparasites may improve our understanding of behavioral traits, life-history traits, or population dynamics.
Abstract:
BACKGROUND/PURPOSE: A new coordinated interdisciplinary unit was created in the acute section of the department of clinical neurosciences, the Acute NeuroRehabilitation (NRA) unit. The objective was to evaluate the impact of the unit and its neurosensory programme on the management of tracheostomy patients in terms of reduction in the average time taken for weaning, weaning success rate and therapeutic efficiency. METHODS: This 49-month retrospective study compares two groups of tracheostomy patients before (n = 34) and after (n = 46) NRA intervention. The outcome measures evaluate the benefits of the NRA unit intervention (time to decannulation, weaning and complication rates) and the benefits of the coordination (time to registration in a rehabilitation centre and rate of non-compliance with standards of care). RESULTS: Weaning failure rate was reduced from 27.3% to 9.1%, no complications or recannulations were observed in the post-intervention group after weaning, and time to decannulation following admission to our unit decreased from 19.13 to 12.75 days. The rate of non-compliance with patient standards of care was significantly reduced from 45% to 30% (Mann-Whitney p = 0.003). DISCUSSION/CONCLUSIONS: This interdisciplinary weaning programme helped to reduce weaning time and weaning failure, without increased complications, in the sample studied. Coordination improved the efficiency of the interdisciplinary team in managing the multiplicity and complexity of the different treatments.
Abstract:
This guide was created to aid communities in the process of smart planning and is organized around the 10 Smart Planning Principles signed into Iowa law in 2010. A general description of the concept, strategies for encouraging use, policy tools for implementation, and a current Iowa example are presented for each Principle. In addition, a brief list of resources is provided to help local governments, community organizations and citizen planners find information and ideas on community involvement and incorporation of smart planning concepts in everyday decisions.
Abstract:
The emergence of powerful new technologies, the existence of large quantities of data, and increasing demands for the extraction of added value from these technologies and data have created a number of significant challenges for those charged with both corporate and information technology management. The possibilities are great, the expectations high, and the risks significant. Organisations seeking to employ cloud technologies and exploit the value of the data to which they have access, be this in the form of "Big Data" available from different external sources or data held within the organisation, in structured or unstructured formats, need to understand the risks involved in such activities. Data owners have responsibilities towards the subjects of the data and must also, frequently, demonstrate that they are in compliance with current standards, laws and regulations. This thesis sets out to explore the nature of the technologies that organisations might utilise, identify the most pertinent constraints and risks, and propose a framework for the management of data from discovery to external hosting that will allow the most significant risks to be managed through the definition, implementation, and performance of appropriate internal control activities.
Abstract:
We consider mean first-passage times (MFPTs) for systems driven by non-Markov gamma and McFadden dichotomous noises. A simplified derivation is given of the underlying integral equations and the theory for ordinary renewal processes is extended to modified and equilibrium renewal processes. The exact results are compared with the MFPT for Markov dichotomous noise and with the results of Monte Carlo simulations.
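As a rough illustration of the Monte Carlo comparison mentioned above, the sketch below estimates an MFPT for a linear system driven by Markov dichotomous (telegraph) noise, the baseline case against which the non-Markov results are checked. The drift, noise amplitude, switching rate and absorbing boundary are hypothetical choices, not the paper's setup.

```python
# A minimal Monte Carlo sketch of a mean first-passage time (MFPT) estimate
# for a system driven by Markov dichotomous noise. All parameters are
# hypothetical illustrations.

import numpy as np

rng = np.random.default_rng(0)

def estimate_mfpt(n_paths=2000, dt=5e-3, t_max=50.0,
                  a=1.5, lam=1.0, x0=0.0, x_abs=1.0):
    """Estimate the MFPT of dx/dt = -x + xi(t) to the boundary x_abs,
    where xi(t) is Markov dichotomous noise taking the values +/- a and
    flipping sign at rate lam (exponential waiting times)."""
    times = []
    for _ in range(n_paths):
        x, t = x0, 0.0
        xi = a if rng.random() < 0.5 else -a      # random initial noise state
        next_flip = rng.exponential(1.0 / lam)
        while t < t_max:
            if t >= next_flip:                    # Markov switching event
                xi = -xi
                next_flip = t + rng.exponential(1.0 / lam)
            x += (-x + xi) * dt                   # Euler step of the dynamics
            t += dt
            if x >= x_abs:                        # first passage detected
                times.append(t)
                break
    # Paths still below the boundary at t_max are rare here and are dropped.
    return float(np.mean(times))

print("estimated MFPT:", estimate_mfpt())
```

For the non-Markov gamma or McFadden noises treated in the paper, only the waiting-time draw changes (e.g. gamma-distributed intervals between switches), which is what distinguishes ordinary, modified and equilibrium renewal processes in the exact theory.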
Abstract:
The renin-angiotensin aldosterone system (RAAS) is central to the pathogenesis of cardiovascular disease. RAAS inhibition can reduce blood pressure, prevent target organ damage in hypertension and diabetes, and improve outcomes in patients with heart failure and/or myocardial infarction. This review presents the history of RAAS inhibition including a summary of key heart failure, myocardial infarction, hypertension and atrial fibrillation trials. Recent developments in RAAS inhibition are discussed including implementation and optimization of current drug therapies. Finally, ongoing clinical trials, opportunities for future trials and issues related to the barriers and approvability of novel RAAS inhibitors are highlighted.
Abstract:
Tax reform proposals in the spirit of the "flat tax" model typically aim to reduce three parameters: the average tax burden, the progressivity of the tax schedule, and the complexity of the tax code. We explore the implications of changes in these three parameters for entrepreneurial activity, measured by counts of firm births. The Swiss fiscal system offers sufficient intra-national variation in tax codes to allow us to estimate such effects with considerable precision. We find that high average taxes and complicated tax codes depress firm birth rates, while tax progressivity per se promotes firm births. The latter result supports the existence of an insurance effect from progressive corporate income taxes for risk averse entrepreneurs. However, implied elasticities with respect to the level and complexity of corporate taxes are an order of magnitude larger than elasticities with respect to the progressivity of tax schedules.
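A minimal sketch of the kind of count-data regression such an analysis suggests, assuming hypothetical canton-level data and a plain Poisson specification rather than the paper's actual estimator and controls:

```python
# A minimal sketch of a Poisson regression of firm-birth counts on the
# three tax parameters. Data, column names, and coefficients are simulated
# and hypothetical; they only mirror the signs of the reported findings.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200  # hypothetical canton-year observations

df = pd.DataFrame({
    "avg_tax": rng.uniform(0.1, 0.35, n),        # average tax burden
    "progressivity": rng.uniform(0.0, 0.2, n),   # schedule progressivity
    "complexity": rng.integers(1, 10, n),        # tax-code complexity index
})
# Simulated counts: higher level and complexity depress firm births,
# progressivity promotes them (per the abstract's signs).
mu = np.exp(3.0 - 4.0 * df["avg_tax"] + 2.0 * df["progressivity"]
            - 0.1 * df["complexity"])
df["firm_births"] = rng.poisson(mu)

X = sm.add_constant(df[["avg_tax", "progressivity", "complexity"]])
model = sm.GLM(df["firm_births"], X, family=sm.families.Poisson()).fit()
print(model.summary())
# With a log link, each coefficient is a semi-elasticity: the proportional
# change in expected firm births per unit change in the regressor.
```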
Abstract:
A range of models describing metapopulations is surveyed and their implications for conservation biology are described. An overview of the use of both population genetic elements and demographic theory in metapopulation models is given. It would appear that most of the current models suffer from either the use of over-simplified demography or the avoidance of selectively important genetic factors. The scale for which predictions are made by the various models is often obscure. A conceptual framework for describing metapopulations by utilising the concept of fitness of local populations is provided and some examples are given. The expectation that any general theory, such as that of metapopulations, can make useful predictions for particular problems of conservation is examined and compared with the prevailing 'state of the art' recommendations.
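For concreteness, the sketch below integrates the classic Levins patch-occupancy model, the simplest demographic metapopulation model of the kind the survey covers (and an example of the over-simplified demography it critiques); parameter values are hypothetical.

```python
# A minimal sketch of the classic Levins metapopulation model; parameter
# values are hypothetical illustrations.

def levins(p0=0.1, c=0.4, e=0.1, dt=0.01, t_end=200.0):
    """Euler-integrate dp/dt = c*p*(1 - p) - e*p, where p is the fraction
    of occupied habitat patches, c the colonization rate and e the local
    extinction rate."""
    p = p0
    for _ in range(int(t_end / dt)):
        p += (c * p * (1.0 - p) - e * p) * dt
    return p

# Occupancy settles at p* = 1 - e/c when c > e; otherwise the metapopulation
# goes extinct -- the intuition behind many persistence recommendations.
print(round(levins(), 3))               # ~0.75, i.e. 1 - 0.1/0.4
print(round(levins(c=0.08, e=0.1), 3))  # c < e: occupancy decays towards 0
```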