906 results for Reading and Interpretation of Statistical Graphs


Relevance:

100.00%

Publisher:

Abstract:

Secure communications in distributed Wireless Sensor Networks (WSN) operating under adversarial conditions necessitate efficient key management schemes. In the absence of a priori knowledge of post-deployment network configuration, and due to limited resources at sensor nodes, key management schemes cannot be based on post-deployment computations. Instead, a list of keys, called a key-chain, is distributed to each sensor node before deployment. For secure communication, either two nodes should have a key in common in their key-chains, or they should establish a key through a secure-path on which every link is secured with a key. We first provide a comparative survey of well-known key management solutions for WSN. Probabilistic, deterministic and hybrid key management solutions are presented and compared based on their security properties and resource usage. We provide a taxonomy of solutions and identify trade-offs among them, concluding that there is no one-size-fits-all solution. Second, we design and analyze deterministic and hybrid techniques to distribute pair-wise keys to sensor nodes before deployment. We present novel deterministic and hybrid approaches based on combinatorial design theory and graph theory for deciding how many and which keys to assign to each key-chain before the sensor network deployment. Performance and security of the proposed schemes are studied both analytically and computationally. Third, we address the key establishment problem in WSN, which requires that key agreement algorithms lacking authentication be executed over a secure-path. The length of the secure-path impacts the power consumption and the initialization delay for a WSN before it becomes operational. We formulate the key establishment problem as a constrained bi-objective optimization problem, break it into two sub-problems, and show that they are both NP-Hard and MAX-SNP-Hard. Having established inapproximability results, we focus on addressing the authentication problem that prevents key agreement algorithms from being used directly over a wireless link. We present a fully distributed algorithm where each pair of nodes can establish a key with authentication by using their neighbors as witnesses.
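As a rough illustration of the pre-distribution setting described above, the sketch below assigns each node a key-chain drawn at random from a common key pool and checks whether two nodes share a key, i.e. the basic probabilistic scheme; the combinatorial-design and hybrid constructions proposed in the work would replace the key-chain assignment step. Pool size and chain length are illustrative assumptions.

    # Minimal sketch of key pre-distribution and shared-key discovery.
    # Pool size and chain length are hypothetical, not taken from the thesis.
    import random

    POOL_SIZE, CHAIN_LENGTH = 1000, 50
    KEY_POOL = list(range(POOL_SIZE))            # global pool of key identifiers

    def assign_key_chain(rng):
        """Pre-deployment: give a node a random key-chain from the pool."""
        return set(rng.sample(KEY_POOL, CHAIN_LENGTH))

    def shared_key(chain_a, chain_b):
        """Post-deployment: two nodes can talk directly iff they share a key."""
        common = chain_a & chain_b
        return min(common) if common else None   # pick a deterministic common key

    rng = random.Random(42)
    node_a, node_b = assign_key_chain(rng), assign_key_chain(rng)
    key = shared_key(node_a, node_b)
    print("direct link possible" if key is not None else "need a secure path")

When no common key exists, the nodes would fall back to establishing a key over a secure path, which is where the path-length optimisation discussed above becomes relevant.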

Relevance:

100.00%

Publisher:

Abstract:

Management of groundwater systems requires realistic conceptual hydrogeological models as a framework for numerical simulation modelling, but also for system understanding and for communicating this to stakeholders and the broader community. To help address these challenges we developed GVS (Groundwater Visualisation System), a stand-alone desktop software package that uses interactive 3D visualisation and animation techniques. The goal was a user-friendly groundwater management tool that could support a range of existing real-world and pre-processed data, both surface and subsurface, including geology and various types of temporal hydrological information. GVS allows these data to be integrated into a single conceptual hydrogeological model. In addition, 3D geological models produced externally using other software packages can readily be imported into GVS models, as can outputs of simulations (e.g. piezometric surfaces) produced by software such as MODFLOW or FEFLOW. Boreholes can be integrated, showing any down-hole data and properties, including screen information, intersected geology, water level data and water chemistry. Animation is used to display spatial and temporal changes, with time-series data such as rainfall, standing water levels and electrical conductivity used to display dynamic processes. Time and space variations can be presented using a range of contouring and colour mapping techniques, in addition to interactive plots of time-series parameters. Other types of data, for example demographics and cultural information, can also be readily incorporated. The GVS software can run on a standard Windows or Linux-based PC with a minimum of 2 GB RAM, and the model output is easy and inexpensive to distribute, by download or via USB/DVD/CD. Example models are described here for three groundwater systems in Queensland, northeastern Australia: two unconfined alluvial groundwater systems with intensive irrigation, the Lockyer Valley and the upper Condamine Valley, and the Surat Basin, a large sedimentary basin of confined artesian aquifers. This latter example required more detail in the hydrostratigraphy, correlation of formations with drillholes and visualisation of simulated piezometric surfaces. Both alluvial system GVS models were developed during drought conditions to support government strategies to implement groundwater management. The Surat Basin model was industry-sponsored research for coal seam gas groundwater management and community information and consultation. The “virtual” groundwater systems in these 3D GVS models can be interactively interrogated using standard functions, plus production of 2D cross-sections, data selection from the 3D scene, and back-end database and plot displays. A unique feature is that GVS allows investigation of time-series data across different display modes, both 2D and 3D. GVS has been used successfully as a tool to enhance community/stakeholder understanding and knowledge of groundwater systems and is of value for training and educational purposes. Completed projects confirm that GVS provides powerful support for management and decision making, and serves as a tool for interpreting groundwater system hydrological processes. A highly effective visualisation output is the production of short videos (e.g. 2–5 min) based on sequences of camera ‘fly-throughs’ and screen images. Further work involves developing support for multi-screen displays and touch-screen technologies, distributed rendering, and gestural interaction systems.
To highlight the visualisation and animation capability of the GVS software, links to related multimedia hosted online are included in the references.
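As a rough sketch of the two display modes mentioned above, a time-series plot and a contoured piezometric surface can be produced from synthetic data as follows; GVS itself is a stand-alone 3D package, so this only illustrates the kinds of inputs and outputs involved, not its implementation.

    # Synthetic borehole time series and piezometric surface, for illustration only.
    import numpy as np
    import matplotlib.pyplot as plt

    months = np.arange(60)                        # monthly standing water levels
    water_level = 20 + 2 * np.sin(2 * np.pi * months / 12) - 0.02 * months

    # Synthetic piezometric surface (e.g. as imported from a MODFLOW/FEFLOW run).
    x, y = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
    head = 25 - 0.5 * x + 0.2 * np.sin(y)

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
    ax1.plot(months, water_level)
    ax1.set(xlabel="month", ylabel="standing water level (m)")
    cs = ax2.contourf(x, y, head)
    fig.colorbar(cs, ax=ax2, label="piezometric head (m)")
    ax2.set(xlabel="easting (km)", ylabel="northing (km)")
    plt.show()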

Relevance:

100.00%

Publisher:

Abstract:

We applied a texture-based flow visualisation technique to a numerical hydrodynamic model of the Pumicestone Passage in southeast Queensland, Australia. The quality of the visualisations produced with our flow visualisation tool is compared with animations generated using more traditional drogue release plots and velocity contour and vector techniques. The texture-based method is found to be far more effective in visualising advective flow within the model domain. In some instances, it also makes it easier for the researcher to identify specific hydrodynamic features within the complex flow regimes of this shallow tidal barrier estuary, compared with the direct and geometry-based methods.
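For readers unfamiliar with the technique, the sketch below gives a minimal line integral convolution (LIC), the standard texture-based approach: a white-noise texture is smeared along streamlines of a steady 2D velocity field so that coherent streaks reveal the flow. The field, grid and parameters are assumptions for illustration, not the model or tool used in the study.

    # Minimal line integral convolution over a synthetic rotational flow.
    import numpy as np

    def lic(u, v, noise, steps=20):
        """Smear a noise texture along streamlines of the field (u, v)."""
        ny, nx = noise.shape
        out = np.zeros_like(noise)
        mag = np.hypot(u, v) + 1e-9          # avoid division by zero
        du, dv = u / mag, v / mag            # unit direction field
        for sign in (+1.0, -1.0):            # integrate forward and backward
            x, y = np.meshgrid(np.arange(nx, dtype=float),
                               np.arange(ny, dtype=float))
            for _ in range(steps):
                ix = np.clip(x.round().astype(int), 0, nx - 1)
                iy = np.clip(y.round().astype(int), 0, ny - 1)
                out += noise[iy, ix]         # accumulate texture along the path
                x += sign * du[iy, ix]       # Euler step along the flow
                y += sign * dv[iy, ix]
        return out / (2 * steps)

    ny, nx = 200, 200                        # a synthetic rotational flow stands in
    yy, xx = np.mgrid[0:ny, 0:nx]            # for hydrodynamic model output
    u, v = -(yy - ny / 2), (xx - nx / 2)
    image = lic(u, v, np.random.rand(ny, nx))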

Relevance:

100.00%

Publisher:

Abstract:

Analysis of fossils from cave deposits at Mount Etna (eastern-central Queensland) has established that a species-rich rainforest palaeoenvironment existed in that area during the middle Pleistocene. This unexpected finding has implications for several fields (e.g., biogeography/phylogeography of rainforest-adapted taxa, and the impact of climate change on rainforest communities), but it was unknown whether the Mount Etna sites represented a small refugial patch of rainforest or whether rainforest was more widespread. In this study, numerous bone deposits in caves in north-east Queensland are analysed to reconstruct the environmental history of the area during the late Quaternary. Study sites are in the Chillagoe/Mitchell-Palmer and Broken River/Christmas Creek areas. The cave fossil records in these study areas are compared with dated (middle Pleistocene-Holocene) cave sites in the Mount Etna area. Substantial taxonomic work on the Mount Etna faunas (particularly dasyurid marsupials and murine rodents) is also presented as a prerequisite for meaningful comparison with the study sites further north. Middle Pleistocene sites at Mount Etna contain species indicative of a rainforest palaeoenvironment. Small mammal assemblages in the Mount Etna rainforest sites (>500-280 ka) are unexpectedly diverse and composed almost entirely of new species. Included in the rainforest assemblages are lineages with no extant representatives in rainforest (e.g., Leggadina), one genus previously known only from New Guinea (Abeomelomys), and forms that appear to bridge gaps between related but morphologically divergent extant taxa ('B-rat' and 'Pseudomys C'). Curiously, some taxa (e.g., Melomys spp.) are notable for their absence from the Mount Etna rainforest sites. After 280 ka the rainforest faunas are replaced by species adapted to open, dry habitats. At that time the extinct ‘rainforest’ dasyurids and rodents are replaced by species that are either extant or recently extant. By the late Pleistocene all ‘rainforest’ and several ‘dry’ taxa are locally or completely extinct, and the small mammal fauna resembles that found in the area today. The faunal/environmental changes recorded in the Mount Etna sites were interpreted by previous workers as the result of shifts in climate during the Pleistocene. Many samples from caves in the Chillagoe/Mitchell-Palmer and Broken River/Christmas Creek areas are held in the Queensland Museum’s collection. These, supplemented with additional samples collected in the field as well as samples supplied by other workers, were systematically and palaeoecologically analysed for the first time. Palaeoecological interpretation of the faunal assemblages in the sites suggests that they encompass a similar array of palaeoenvironments to the Mount Etna sites. ‘Rainforest’ sites at the Broken River are here interpreted as being of similar age to those at Mount Etna, suggesting the possibility of extensive rainforest coverage in eastern tropical Queensland during part of the Pleistocene. Likewise, faunas suggesting open, dry palaeoenvironments are found at Chillagoe, the Broken River and Mount Etna, and may be of similar age. The 'dry' faunal assemblage at Mount Etna (Elephant Hole Cave) dates to 205-170 ka. Dating of one of the Chillagoe sites (QML1067) produced a maximum age for the deposit of approximately 200 ka, and the site is interpreted as being close to that age, supporting the interpretation of roughly contemporaneous deposition at Mount Etna and Chillagoe.
Finally, study sites interpreted as being of late Pleistocene-Holocene age show faunal similarities to sites of that age near Mount Etna. This study has several important implications for the biogeography and phylogeography of murine rodents, and represents a major advance in the study of the Australian murine fossil record. Likewise, the survey of the northern study areas is the first systematic analysis of multiple sites in those areas, and is thus a major contribution to knowledge of tropical Australian faunas during the Quaternary. This analysis suggests that climatic changes during the Pleistocene affected a large area of eastern tropical Queensland in similar ways. Further fieldwork and dating are required to properly analyse the geographical extent and timing of faunal change in eastern tropical Queensland.

Relevance:

100.00%

Publisher:

Abstract:

There are no population studies of the prevalence or incidence of child maltreatment in Australia. Child protection data gives some understanding but is restricted by system capacity and definitional issues across jurisdictions. Child protection data currently suggests that numbers of reports are increasing yearly; the child protection system then becomes focussed on investigating all reports, diluting the resources available for those children who are most in need of intervention. A public health response across multiple agencies enables responses to child safety across the entire population. All families are targeted at the primary level; examples include ensuring all parents know the dangers of shaking a baby, or teaching children to say no if a situation makes them uncomfortable. The secondary level of prevention targets families with a number of risk factors; examples include subsidised child care so children aren't left unsupervised after school when both parents have to be at work, or home visiting for drug-addicted parents to ensure children are cared for. The tertiary response then becomes the responsibility of the child protection system and is reserved for those children for whom abuse and neglect are identified. This model requires that child safety is seen in a broader context than just the child protection system, and increasingly health professionals are being identified as an important component of the public health framework. If all injury is viewed as preventable and considered along a continuum from 'accidental' through to 'inflicted', it becomes possible to conceptualise child maltreatment in an injury context. Parental intent may not be to cause harm to the child, but through lack of insight or concern about risk, the potential for injury is high. The mechanisms for unintentional and intentional injury overlap, and some suggest that by segregating child abuse (with the possible exception of sexual abuse) from unintentional injury, child abuse is excluded from the broader injury prevention initiative that is gaining momentum in the community. This research uses a public health perspective, specifically that of injury prevention, to consider the problem of child abuse. The study employed a mixed-method design incorporating secondary data analysis, data linkage and structured interviews with different professional groups. Datasets from the Queensland Injury Surveillance Unit (QISU) and the Department of Child Safety (DCS) were evaluated. Coded injury data was grouped according to intent of injury: records with a code indicating the ED presentation was due to child abuse, records with a code indicating the injury was possibly due to abuse, and records whose intent code indicated the injury was unintentional and not due to abuse. Primary data collection from ED records was undertaken and information recoded to assess reliability and completeness. Emergency department data (QISU) was linked to Department of Child Safety data to examine concordance and data quality. Factors influencing the collection and collation of these data were identified through structured interviews and analysed using qualitative methods. Secondary analysis of QISU data indicated that records lacking specific information on the injury event were more likely to also have an intent code indicating abuse than records where specific information on the injury event was present. Codes for abuse appeared in only 1.2% of the 84,765 records analysed.
Unintentional injury was the most commonly coded intent (95.3%). In the group with a definite abuse code assigned at triage, 83% linked to a record with DCS, and cases where documentation indicated police involvement were significantly more likely to be associated with a DCS record than those without such documentation. Of those coded with an unintentional injury code, 22% linked to a DCS record; cases assigned an urgent triage category were more likely to link than those with a triage category for resuscitation, and children who presented to regional or remote hospitals were more likely to link to a DCS record than those presenting to urban hospitals. Twenty-nine per cent of cases with a code indicating possible abuse linked to a DCS record. Documentation indicating police involvement in the case, a code for unspecified activity (compared with a code indicating involvement in a sporting activity), and age under 12 months (compared with the 13-17 year old age group) were all significantly associated with linkage to a DCS record. Only 13% of records contained documentation indicating that child abuse and neglect were considered in the diagnosis of the injury, despite almost half of the sample having a code of abuse or possible abuse. Doctors and nurses were confident in their knowledge of the process of reporting child maltreatment but less confident about identifying child abuse and neglect and deciding what should be reported. Many were concerned about the implications of reporting, for the child and family and for themselves. A number were concerned about the implications of not reporting, mostly for the wellbeing of the child and a few in terms of their legal obligations as mandatory reporters. The outcomes of this research will help improve knowledge of the barriers to effective surveillance of child abuse in emergency departments. This will, in turn, ensure better identification and reporting practices, more reliable official statistical collections, and the potential of flagging high-risk cases to ensure adequate departmental responses have been initiated.
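The linkage step can be illustrated with a small, entirely hypothetical example: de-identified ED presentations grouped by intent code are matched against child-protection records on a shared linkage key, and the proportion linking is reported per group. The column names and codes below are invented, not the actual QISU or DCS schemas.

    # Hypothetical illustration of linking ED presentations to DCS records.
    import pandas as pd

    qisu = pd.DataFrame({                      # ED presentations by intent group
        "link_id": [101, 102, 103, 104, 105],
        "intent_group": ["abuse", "unintentional", "possible_abuse",
                         "unintentional", "abuse"],
    })
    dcs = pd.DataFrame({"link_id": [101, 103, 105, 106]})   # child-protection records

    # Flag which ED records link to a DCS record, then summarise by intent group,
    # mirroring the percentages reported above (e.g. 83% of definite-abuse codes).
    linked = qisu.assign(has_dcs_record=qisu["link_id"].isin(dcs["link_id"]))
    print(linked.groupby("intent_group")["has_dcs_record"].mean())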

Relevance:

100.00%

Publisher:

Abstract:

Deterministic computer simulation of physical experiments is now a common technique in science and engineering. Often, physical experiments are too time consuming, expensive or impossible to conduct. The use of complex computer models or codes, rather than physical experiments, leads to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This paper discusses some practical issues that arise when designing computer simulations and/or experiments for manufacturing systems. A case study approach is reviewed and presented.
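One common way to choose the different input settings for such runs is a space-filling Latin hypercube design; the sketch below generates and scales a small design. The three factors and their ranges are hypothetical, since the abstract does not specify the manufacturing-system inputs.

    # Ten-run Latin hypercube design over three hypothetical input factors.
    from scipy.stats import qmc

    sampler = qmc.LatinHypercube(d=3, seed=1)     # three input factors
    unit_design = sampler.random(n=10)            # ten runs in [0, 1]^3

    # Scale to hypothetical ranges (e.g. machine speed, buffer size, demand rate).
    lower, upper = [50.0, 1.0, 100.0], [150.0, 20.0, 500.0]
    design = qmc.scale(unit_design, lower, upper)

    for run in design:
        print(run)      # each row is one set of inputs for a simulation run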

Relevance:

100.00%

Publisher:

Abstract:

Computer experiments, consisting of a number of runs of a computer model with different inputs, are now commonplace in scientific research. Using a simple fire model for illustration, some guidelines are given for the size of a computer experiment. A graph is provided relating the error of prediction to the sample size, which should be of use when designing computer experiments. Methods for augmenting computer experiments with extra runs are also described and illustrated. The simplest method involves adding one point at a time, choosing the point with the maximum prediction variance. Another method that appears to work well is to choose points from a candidate set with the maximum determinant of the variance-covariance matrix of the predictions.
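The one-point-at-a-time augmentation described above can be sketched with a Gaussian-process emulator: fit the emulator to the existing runs, then add the candidate point with the largest prediction variance. The test function, kernel and candidate set below are illustrative assumptions, not the fire model or the exact procedure of the paper.

    # Sequential augmentation of a computer experiment by maximum prediction variance.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    rng = np.random.default_rng(0)
    simulator = lambda x: np.sin(6 * x[:, 0]) + x[:, 1] ** 2   # stand-in for the code

    X = rng.uniform(size=(8, 2))                  # initial design of 8 runs
    y = simulator(X)
    candidates = rng.uniform(size=(500, 2))       # candidate set of possible new runs

    for _ in range(5):                            # add five extra runs, one at a time
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3),
                                      normalize_y=True).fit(X, y)
        _, std = gp.predict(candidates, return_std=True)
        new_x = candidates[[np.argmax(std)]]      # most uncertain candidate point
        X = np.vstack([X, new_x])
        y = np.append(y, simulator(new_x))

    print(X.shape)                                # (13, 2): the augmented design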

Relevance:

100.00%

Publisher:

Abstract:

Deterministic computer simulations of physical experiments are now common techniques in science and engineering. Often, physical experiments are too time consuming, expensive or impossible to conduct. The use of complex computer models or codes, rather than physical experiments, leads to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using computer experiments. In particular, the questions of how many runs a computer experiment requires and how it should be augmented are studied, and attention is given to the case where the response is a function over time.

Relevance:

100.00%

Publisher:

Abstract:

Recent advances in the area of ‘Transformational Government’ position the citizen at the centre of focus. This paradigm shift from a department-centric to a citizen-centric focus requires governments to re-think their approach to service delivery, thereby decreasing costs and increasing citizen satisfaction. The introduction of franchises as a virtual business layer between the departments and their citizens is intended to provide a solution. Franchises are structured to address the needs of citizens independently of internal departmental structures. For delivering services online, governments pursue the development of a One-Stop Portal, which structures information and services through those franchises. Thus, each franchise can be mapped to a specific service bundle, which groups together services that are deemed to be of relevance to a specific citizen need. This study focuses on the development and evaluation of these service bundles. In particular, two research questions guide the line of investigation of this study: Research Question 1) What methods can be used by governments to identify service bundles as part of governmental One-Stop Portals? Research Question 2) How can the quality of service bundles in governmental One-Stop Portals be evaluated? The first research question asks about the identification of suitable service bundle identification methods. A literature review was conducted to conceptualise the service bundling task in general. As a consequence, a 4-layer model of service bundling and a morphological box were created, detailing characteristics that are of relevance when identifying service bundles. Furthermore, a literature review of Decision-Support Systems was conducted to identify approaches of relevance in different bundling scenarios. These initial findings were complemented by targeted studies of multiple leading governments in the e-government domain, as well as consultation with a local expert in the field. Here, the aim was to identify the current status of online service delivery and service bundling in practice. These findings led to the conceptualisation of two service bundle identification methods, applicable in the context of Queensland Government: on the one hand, a provider-driven approach based on service description languages, attributes, and relationships between services; on the other hand, a citizen-driven approach based on analysing the outcomes of content identification and grouping workshops with citizens. Both methods were then applied and evaluated in practice. The conceptualisation of the provider-driven method for service bundling required the initial specification of relevant attributes that could be used to identify similarities between services, called relationships; these relationships then formed the basis for the identification of service bundles. This study conceptualised and defined seven relationships, namely ‘Co-location’, ‘Resource’, ‘Co-occurrence’, ‘Event’, ‘Consumer’, ‘Provider’, and ‘Type’. The relationships, and the bundling method itself, were applied and refined as part of six Action Research cycles in collaboration with the Queensland Government. The findings show that attributes and relationships can be used effectively as a means for bundle identification, if distinct decision rules are in place to prescribe how services are to be identified.
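A toy illustration of this provider-driven idea: describe each service with a few attributes and apply a decision rule that bundles services sharing a relationship (here, a shared ‘Event’ value). The services, attributes and rule below are invented and greatly simplified relative to the seven relationships defined in the study.

    # Hypothetical service catalogue and a single-relationship bundling rule.
    from collections import defaultdict

    services = {
        "register_birth":       {"event": "having a baby",   "consumer": "parent"},
        "child_immunisation":   {"event": "having a baby",   "consumer": "parent"},
        "enrol_primary_school": {"event": "starting school", "consumer": "parent"},
        "change_address":       {"event": "moving house",    "consumer": "resident"},
    }

    def bundle_by(attribute, catalogue):
        """Group services whose descriptions share a value for one attribute."""
        bundles = defaultdict(list)
        for name, attrs in catalogue.items():
            bundles[attrs[attribute]].append(name)
        return dict(bundles)

    # Decision rule: services linked by the 'Event' relationship form a bundle.
    print(bundle_by("event", services))
    # {'having a baby': ['register_birth', 'child_immunisation'], ...}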
For the conceptualisation of the citizen-driven method, insights from the case studies led to the decision to involve citizens through card sorting activities. Based on an initial list of services relevant to a certain franchise, participating citizens grouped services according to their preferences. The card sorting activity, as well as the required analysis and aggregation of the individual card sorting results, was analysed in depth as part of this study. A framework was developed that can be used as a decision-support tool to assist with the decision of which card sorting analysis method should be utilised in a given scenario. The characteristic features associated with card sorting in a government context led to the decision to utilise statistical analysis approaches, such as cluster analysis and factor analysis, to aggregate the card sorting results. The second research question asks how the quality of service bundles can be assessed. An extensive literature review was conducted focussing on bundle, portal, and e-service quality. It was found that different studies use different constructs, terminology, and units of analysis, which makes comparing these models a difficult task. As a direct result, a framework was conceptualised that can be used to position past and future studies in this research domain. Complementing the literature review, interviews conducted as part of the case studies with leaders in e-government indicated that, typically, satisfaction is evaluated for the overall portal once the portal is online, but quality tests are not conducted during the development phase. Consequently, a research model which appropriately defines perceived service bundle quality needed to be developed from scratch. Based on existing theory, such as the Theory of Reasoned Action, Expectation Confirmation Theory, and the Theory of Affordances, perceived service bundle quality was defined as an inferential belief and positioned within the nomological net of services. Based on the literature analysis on quality, and on the subsequent work of a focus group, the hypothesised antecedents (descriptive beliefs) of the construct and the associated question items were defined and the research model conceptualised. The model was then tested, refined, and finally validated during six Action Research cycles. Results show no significant difference in perceived quality or satisfaction between users of bundles identified through the provider-driven method and those identified through the citizen-driven method. The decision on which method to choose, it was found, should be based on contextual factors, such as objectives, resources, and the need for visibility. The constructs of the bundle quality model were examined. While the quality of bundles identified through the citizen-centric approach could be explained through the constructs ‘Navigation’, ‘Ease of Understanding’, and ‘Organisation’, bundles identified through the provider-driven approach could be explained solely through the constructs ‘Navigation’ and ‘Ease of Understanding’. An active labelling style for bundles, as part of the provider-driven Information Architecture, had a larger impact on ‘Quality’ than the topical labelling style used in the citizen-centric Information Architecture. However, ‘Organisation’, reflecting the internal, logical structure of the Information Architecture, was a significant factor impacting on ‘Quality’ only in the citizen-driven Information Architecture.
Hence, it was concluded that active labelling can compensate for a lack of logical structure. Further studies are needed to test this conjecture. Such studies may involve building alternative models and conducting additional empirical research (e.g. use of an active labelling style for the citizen-driven Information Architecture). This thesis contributes to the body of knowledge in several ways. Firstly, it presents an empirically validated model of the factors explaining and predicting a citizen’s perception of service bundle quality. Secondly, it provides two alternative methods that can be used by governments to identify service bundles when structuring the content of a One-Stop Portal. Thirdly, it provides a detailed narrative suggesting how the recent paradigm shift in the public domain, towards a citizen-centric focus, can be pursued by governments; the research methodology followed by this study can serve as an exemplar for governments seeking to achieve a citizen-centric approach to service delivery.

Relevance:

100.00%

Publisher:

Abstract:

The latest paradigm shift in government, termed Transformational Government, puts the citizen at the centre of attention. Including citizens in the design of online one-stop portals can help governmental organisations become more customer-focussed. This study describes the initial efforts of an Australian state government to develop an information architecture to structure the content of its future one-stop portal. To this end, card sorting exercises were conducted and analysed, utilising contemporary approaches found in academic and non-scientific literature. This paper describes the findings of the card sorting exercises in this particular case and discusses the suitability of the applied approaches in general, distinguishing between non-statistical, statistical, and hybrid approaches. Thus, on the one hand, this paper contributes to academia by describing the application of different card sorting approaches and discussing their strengths and weaknesses. On the other hand, it contributes to practice by explaining the approach taken by the authors’ research partner to develop a customer-focussed governmental one-stop portal, thereby providing decision support for practitioners with regard to the different analysis methods that can be used to complement recent approaches in Transformational Government.
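The statistical aggregation of card sorts mentioned above can be sketched as follows: build a card-by-card co-occurrence matrix from the individual sorts, convert it to a distance matrix, and apply hierarchical cluster analysis to suggest groupings for the information architecture. The cards and sorts below are invented for illustration; the study's actual data and chosen analysis method may differ.

    # Aggregating hypothetical card sorts with hierarchical cluster analysis.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    cards = ["register birth", "child immunisation", "dog registration",
             "report a pothole"]
    sorts = [                   # each participant assigns every card a group label
        [0, 0, 1, 1],
        [0, 0, 1, 2],
        [0, 1, 1, 2],
    ]

    n = len(cards)
    co = np.zeros((n, n))
    for sort in sorts:          # count how often each pair of cards is grouped together
        for i in range(n):
            for j in range(n):
                co[i, j] += (sort[i] == sort[j])
    dist = 1 - co / len(sorts)  # turn co-occurrence into a distance

    tree = linkage(squareform(dist, checks=False), method="average")
    labels = fcluster(tree, t=2, criterion="maxclust")
    for card, label in zip(cards, labels):
        print(label, card)      # suggested grouping of cards into two clusters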

Relevance:

100.00%

Publisher:

Abstract:

Football, or soccer as it is more commonly referred to in Australia and the US, is arguably the world’s most popular sport. It generates a proportionate volume of related writing. Within this landscape, works of novel-length fiction are seemingly rare. This paper establishes and maps a substantial body of football fiction works, and explores the elements and qualities they exhibit individually and collectively. In bringing together current, limited surveys of the field, it presents the first rigorous definition of football fiction and captures the first historiography of the corpus. Drawing on distant reading methods developed in conjunction with closer textual analyses, the historiography and subsequent taxonomy represent the first articulation of relationships across the body of work, identify growth areas and establish a number of movements and trends. In advancing the understanding of football fiction as a collective body, the paper lays foundations for further research and consideration of the works in generic terms.

Relevance:

100.00%

Publisher:

Abstract:

The emic perspective of criminal desistance (ex-offenders’ personal explanations of how they gave up crime) is largely ignored by criminology. This thesis attempts to address this absence of the storyteller’s perspective by inviting desisters to participate in the exploration and interpretation of their individual desistance journeys. Significant attention is drawn to the importance of philosophical self-enquiry to personal change. This detailed journey through the desistance stories of five ex-offenders has produced an emphatic re-statement of the need for the non-judgemental listener as the beginning point of cathartic healing in damaged lives.

Relevance:

100.00%

Publisher:

Abstract:

The representation of business process models has been a continuing research topic for many years now. However, many process model representations have not developed beyond minimally interactive 2D icon-based representations of directed graphs and networks, with little or no annotation for information overlays. In addition, very few of these representations have undergone a thorough analysis or design process with reference to psychological theories on data and process visualization. This dearth of visualization research, we believe, has led to problems with BPM uptake in some organizations, as the representations can be difficult for stakeholders to understand; this remains an open research question for the BPM community. In addition, business analysts and process modeling experts themselves need visual representations that are able to assist with key BPM life cycle tasks in the process of generating optimal solutions. With the rise of desktop computers and commodity mobile devices capable of supporting rich interactive 3D environments, we believe that much of the research performed in computer-human interaction, virtual reality, games and interactive entertainment has much potential in areas of BPM: to engage, provide insight, and promote collaboration amongst analysts and stakeholders alike. We believe this is a timely topic, with research relevant to this workshop emerging in a number of places around the globe. This is the second TAProViz workshop being run at BPM. The intention this year is to consolidate the results of last year's successful workshop by further developing this important topic and identifying the key research topics of interest to the BPM visualization community.

Relevance:

100.00%

Publisher:

Abstract:

Keeping exotic plant pests out of our country relies on good border control or quarantine. However, with increasing globalization and mobilization, some things slip through. Then the back-up systems become important. These can include an expensive form of surveillance that purposively targets particular pests. A much wider net is provided by general surveillance, which is assimilated into everyday activities, like farmers checking the health of their crops. In fact, farmers and even home gardeners have provided a front-line warning system for some pests (e.g. the European wasp) that could otherwise have wreaked havoc. Mathematics is used to model how surveillance works in various situations. Within this virtual world we can play with various surveillance and management strategies to "see" how they would work, or how to make them work better. One of our greatest challenges is estimating some of the input parameters: because the pest hasn't been here before, it's hard to predict how it might behave: how readily it establishes and spreads, and what types of symptoms it might express. So we rely on experts to help us with this. This talk will look at the mathematical, psychological and logical challenges of helping experts to quantify what they think. We show how the subjective Bayesian approach is useful for capturing expert uncertainty, ultimately providing a more complete picture of what they think... and what they don't!
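As a small illustration of the subjective Bayesian approach mentioned above, an expert's best guess about an establishment probability can be encoded as a Beta prior and then updated with surveillance data; every number below is invented for illustration.

    # Encoding an expert's opinion as a Beta prior and updating it with data.
    from scipy import stats

    best_guess, prior_weight = 0.2, 20            # elicited guess, worth ~20 'trials'
    alpha = best_guess * prior_weight             # gives a Beta(4, 16) prior
    beta = (1 - best_guess) * prior_weight

    establishments, exposures = 3, 50             # hypothetical surveillance outcomes
    posterior = stats.beta(alpha + establishments,
                           beta + exposures - establishments)

    print("prior mean:", alpha / (alpha + beta))
    print("posterior mean:", posterior.mean())
    print("95% credible interval:", posterior.interval(0.95))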

Relevance:

100.00%

Publisher:

Abstract:

Both the United States and Canada have federal legislation that attempts to address employment inequities across specific target groups. The US has a long tradition of affirmative action, dating back to President Kennedy’s 1961 Executive Order; Canada enacted its Employment Equity Act in 1986. Employment Equity/Affirmative Action policy has attracted significant controversy, with high-profile court cases and the repeal of state/provincial legislation. Coate and Loury (1993) examine the theoretical impact of introducing affirmative action; unfortunately, that impact is theoretically ambiguous. The current paper employs a laboratory experiment to shed empirical light on this theoretical ambiguity.