Abstract:
Earlier research found evidence for electro-cortical race bias towards black target faces in white American participants irrespective of the task relevance of race. The present study investigated whether an implicit race bias generalizes across cultural contexts and racial in- and out-groups. An Australian sample of 56 Chinese and Caucasian males and females completed four oddball tasks that required sex judgements for pictures of male and female Chinese and Caucasian posers. The nature of the background (across task) and of the deviant stimuli (within task) was fully counterbalanced. Event-related potentials (ERPs) to deviant stimuli recorded from three midline sites were quantified in terms of mean amplitude for four components: N1, P2, N2 and a late positive complex (LPC; 350–700 ms). Deviants that differed from the backgrounds in sex or race elicited enhanced LPC activity. These differences were not modulated by participant race or sex. The current results replicate earlier reports of effects of poser race relative to background race on the LPC component of the ERP waveform. In addition, they indicate that an implicit race bias occurs regardless of participants' or posers' race and is not confined to a particular cultural context.
Abstract:
The case presents an ethical dilemma faced by a Public Service Director that could affect his career, the career of his boss, and the career of the governor of a state. There is a strong need for ethical leaders in a changing global organizational world where the headlines are filled with stories of private-sector and public-sector leaders who have made serious ethical and moral compromises. It is easy to follow ethical leaders whom you can count on to do what is right, and difficult to follow those who will do what is expedient or personally beneficial. However, as this case portrays, ethical leadership is not always black and white. Difficult decisions must be made where it may not always be clear what to do. The names in the case have been changed, although the situation is a real one.
Abstract:
The focus of this case study concerns Peter Davies, one of three Assistant Principals in a large Australian secondary school, who faces an ethical dilemma regarding a student discipline issue. It is an important case because it underscores the point that ethical decision-making for leaders is fraught with complexity and whatever decision is made, there will be implications for all parties concerned.
Abstract:
Significant numbers of children are severely abused and neglected by parents and caregivers. Infants and very young children are the most vulnerable and are unable to seek help. To identify these situations and enable child protection and the provision of appropriate assistance, many jurisdictions have enacted ‘mandatory reporting laws’ requiring designated professionals such as doctors, nurses, police and teachers to report suspected cases of severe child abuse and neglect. Other jurisdictions have not adopted this legislative approach, at least partly motivated by a concern that the laws produce dramatic increases in unwarranted reports, which, it is argued, lead to investigations which infringe on people’s privacy, cause trauma to innocent parents and families, and divert scarce government resources from deserving cases. The primary purpose of this paper is to explore the extent to which opposition to mandatory reporting laws is valid based on the claim that the laws produce ‘overreporting’. The first part of this paper revisits the original mandatory reporting laws, discusses their development into various current forms, explains their relationship with policy and common law reporting obligations, and situates them in the context of their place in modern child protection systems. This part of the paper shows that in general, contemporary reporting laws have expanded far beyond their original conceptualisation, but that there is also now a deeper understanding of the nature, incidence, timing and effects of different types of severe maltreatment, an awareness that the real incidence of maltreatment is far higher than that officially recorded, and that there is strong evidence showing the majority of identified cases of severe maltreatment are the result of reports by mandated reporters. 
The second part of this paper discusses the apparent effect of mandatory reporting laws on ‘overreporting’ by referring to Australian government data about reporting patterns and outcomes, with a particular focus on New South Wales. It will be seen that raw descriptive data about report numbers and outcomes appear to show that reporting laws produce both desirable consequences (identification of severe cases) and problematic consequences (increased numbers of unsubstantiated reports). Yet, to explore the extent to which the data supports the overreporting claim, and because numbers of unsubstantiated reports alone cannot demonstrate overreporting, this part of the paper asks further questions of the data. Who makes reports, about which maltreatment types, and what are the outcomes of those reports? What is the nature of these reports; for example, to what extent are multiple reports made about the same child? What meaning can be attached to an ‘unsubstantiated’ report, and can such reports be used to show flaws in reporting effectiveness and problems in reporting laws? It will be suggested that available evidence from Australia is not sufficiently detailed or strong to demonstrate the overreporting claim. However, it is also apparent that, whether adopting an approach based on public health or other principles, much better evidence about reporting needs to be collected and analyzed. In addition, more nuanced research needs to be conducted to identify what can reasonably be said to constitute ‘overreports’, and efforts must be made to minimize unsatisfactory reporting practice, informed by the relevant jurisdiction’s context and aims. It is also concluded that, depending on the jurisdiction, the available data may provide useful indicators of positive, negative and unanticipated effects of specific components of the laws, and of the strengths, weaknesses and needs of the child protection system.
Abstract:
This work offers a critical introduction to sociology for New Zealand students. Written in an accessible narrative style, it seeks to challenge and debunk students' assumptions about key elements of their social worlds, encouraging them to develop a "critical imagination" as a tool to identify broader social themes in personal issues.
Abstract:
On 20 September 2001, the former US President, George W. Bush, declared what is now widely, and arguably infamously, known as a ‘war on terror’. In response to the fatal 9/11 attacks in New York and Washington, DC, President Bush identified the US military response as having far-reaching and long-lasting consequences. It was, he argued, ‘our war on terror’ that began ‘with al Qaeda, but … it will not end until every terrorist group of global reach has been found, stopped and defeated’ (CNN 2001). This was to be a war that would, in the words of former British Prime Minister, Tony Blair, seek to eliminate a threat that was ‘aimed at the whole democratic world’ (Blair 2001). Blair claimed that this threat is of such magnitude that unprecedented measures would need to be taken to uphold freedom and security. Blair would later admit that it was a war that ‘divided the country’ and that the evidence ‘about Saddam having actual biological and chemical weapons, as opposed to the capability to develop them, has turned out to be wrong’ (Blair 2004). The failures of intelligence ushered in new political rhetoric in the form of ‘trust me’ because ‘instinct is no science’ (Blair 2004). The war on terror has been one of the most significant international events in the past three decades, alongside the collapse of the former Soviet Union, the end of apartheid in South Africa, the unification of Europe and the marketization of the People's Republic of China. Yet, unlike the other events, it will not be remembered for advancing democracy or sovereignty, but for the conviction politics of particular politicians who chose to dispense with international law and custom in pursuit of personal instincts that proved fatal. Since the invasions of Afghanistan in October 2001 and …
Abstract:
Genetically modified or engineered foods are produced from rapidly expanding technologies that have sparked international debates and concerns about health and safety. These concerns focus on the potential dangers to human health, the risks of genetic pollution, and the demise of alternative farming techniques, as well as biopiracy and economic exploitation by large private corporations. This article discusses the findings of the world's first Royal Commission on Genetic Modification conducted in New Zealand and reveals that there are potential social, ecological and economic risks created by genetically modified foods that require closer criminological scrutiny. As contemporary criminological discourses continue to push new boundaries in areas of crimes of the economy, environmental pollution, risk management, governance and globalization, the potential concerns posed by genetically modified foods create fertile ground for criminological scholarship and activism.
Abstract:
Consider the concept combination ‘pet human’. In word association experiments, human subjects produce the associate ‘slave’ in relation to this combination. The striking aspect of this associate is that it is not produced as an associate of ‘pet’, or ‘human’ in isolation. In other words, the associate ‘slave’ seems to be emergent. Such emergent associations sometimes have a creative character and cognitive science is largely silent about how we produce them. Departing from a dimensional model of human conceptual space, this article will explore concept combinations, and will argue that emergent associations are a result of abductive reasoning within conceptual space, that is, below the symbolic level of cognition. A tensor-based approach is used to model concept combinations allowing such combinations to be formalized as interacting quantum systems. Free association norm data is used to motivate the underlying basis of the conceptual space. It is shown by analogy how some concept combinations may behave like quantum-entangled (non-separable) particles. Two methods of analysis were presented for empirically validating the presence of non-separable concept combinations in human cognition. One method is based on quantum theory and another based on comparing a joint (true theoretic) probability distribution with another distribution based on a separability assumption using a chi-square goodness-of-fit test. Although these methods were inconclusive in relation to an empirical study of bi-ambiguous concept combinations, avenues for further refinement of these methods are identified.
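The second method described above, comparing a joint probability distribution with one constructed under a separability assumption, can be sketched as a standard chi-square goodness-of-fit test against the product of the marginals. This is a minimal illustration, not the study's actual analysis: the 2×2 coding of interpretations, the counts, and the function name are all hypothetical.

```python
# Sketch: testing whether a joint distribution over two binary
# interpretation variables is separable, i.e. the product of its
# marginals. Under the separability null hypothesis the expected
# count for pair (a, b) is N * P(a) * P(b); the chi-square statistic
# measures departure from that product. Counts are hypothetical.

def separability_chi_square(counts):
    """counts[a][b]: observed frequency of interpretation pair (a, b)."""
    n = sum(sum(row) for row in counts)
    row_p = [sum(row) / n for row in counts]                      # marginal P(a)
    col_p = [sum(counts[a][b] for a in range(len(counts))) / n    # marginal P(b)
             for b in range(len(counts[0]))]
    chi2 = 0.0
    for a, row in enumerate(counts):
        for b, obs in enumerate(row):
            exp = n * row_p[a] * col_p[b]
            chi2 += (obs - exp) ** 2 / exp
    return chi2

# Strongly correlated (non-separable) hypothetical data: large statistic.
print(separability_chi_square([[40, 10], [10, 40]]))
# Data matching the product of its marginals: statistic near zero.
print(separability_chi_square([[25, 25], [25, 25]]))
```

A large statistic relative to the chi-square critical value (here, 1 degree of freedom for a 2×2 table) would argue against separability of the combination.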
Abstract:
Chapter aims
By the end of your study of this chapter, you should be able to:
- See public relations as a link between organisations and their environments
- Use systems theory to guide your understanding and practical application of public relations
- Understand the make-up of a public relations management team within an organisation
- Identify and understand how a range of internal forces, including culture and power, affect the practice of public relations
- Identify and understand how a range of external forces, including conflict, activism and corporate social responsibility, affect the practice of public relations.
Abstract:
The role of networks and their contribution to sustaining and developing creative industries is well documented (Wittel 2001; Kong 2005; Pratt 2007). This article argues that although networks operate across geographical boundaries, particularly through the use of communication technologies, the majority of studies have focussed on the ways in which networks operate among creative industry workers located in a) specific inner-urban metropolitan regions or b) specific industries. Such studies are informed by the geographical mindset of creative city proponents such as Florida (2002) and Landry (2000), in which inner-urban precincts are seen as the prime location or ‘hub’ for creative industries activity, business development and opportunity. But what of those creative industries situated beyond the inner city? Evidence in Australia suggests that there is increasing creative industries activity beyond the inner city, in outer-suburban and ex-urban areas (Gibson and Brennan-Horley 2006). This article identifies features of networks operating in two outer-suburban locations.
Abstract:
Determining the temporal scale of biological evolution has traditionally been the preserve of paleontology, with the timing of species originations and major diversifications all being read from the fossil record. However, the ages of the earliest (correctly identified) records will underestimate actual origins due to the incomplete nature of the fossil record and the necessity for lineages to have evolved sufficiently divergent morphologies in order to be distinguished. The possibility of inferring divergence times more accurately has been promoted by the idea that the accumulation of genetic change between modern lineages can be used as a molecular clock (Zuckerkandl and Pauling, 1965). In practice, though, molecular dates have often been so old as to be incongruent even with liberal readings of the fossil record. Prominent examples include inferred diversifications of metazoan phyla hundreds of millions of years before their Cambrian fossil record appearances (e.g., Nei et al., 2001) and a basal split between modern birds (Neoaves) that is almost double the age of their earliest recognizable fossils (e.g., Cooper and Penny, 1997).
Abstract:
In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
Time dependency of molecular rate estimates and systematic overestimation of recent divergence times
Abstract:
Studies of molecular evolutionary rates have yielded a wide range of rate estimates for various genes and taxa. Recent studies based on population-level and pedigree data have produced remarkably high estimates of mutation rate, which strongly contrast with substitution rates inferred in phylogenetic (species-level) studies. Using Bayesian analysis with a relaxed-clock model, we estimated rates for three groups of mitochondrial data: avian protein-coding genes, primate protein-coding genes, and primate d-loop sequences. In all three cases, we found a measurable transition between the high, short-term (<1–2 Myr) mutation rate and the low, long-term substitution rate. The relationship between the age of the calibration and the rate of change can be described by a vertically translated exponential decay curve, which may be used for correcting molecular date estimates. The phylogenetic substitution rates in mitochondria are approximately 0.5% per million years for avian protein-coding sequences and 1.5% per million years for primate protein-coding and d-loop sequences. Further analyses showed that purifying selection offers the most convincing explanation for the observed relationship between the estimated rate and the depth of the calibration. We rule out the possibility that it is a spurious result arising from sequence errors, and find it unlikely that the apparent decline in rates over time is caused by mutational saturation. Using a rate curve estimated from the d-loop data, several dates for last common ancestors were calculated: modern humans and Neandertals (354 ka; 222–705 ka), Neandertals (108 ka; 70–156 ka), and modern humans (76 ka; 47–110 ka). If the rate curve for a particular taxonomic group can be accurately estimated, it can be a useful tool for correcting divergence date estimates by taking the rate decay into account. 
Our results show that it is invalid to extrapolate molecular rates of change across different evolutionary timescales, which has important consequences for studies of populations, domestication, conservation genetics, and human evolution.
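The ‘vertically translated exponential decay curve’ relating the apparent rate to the age of the calibration can be sketched as below. All parameter values are illustrative assumptions, not the fitted values from the study: the curve has the form rate(t) = k + c·exp(−λt), decaying from a high short-term mutation rate toward the long-term substitution rate k.

```python
import math

# Sketch of a vertically translated exponential decay rate curve,
# rate(t) = K + C * exp(-LAM * t), of the kind described for correcting
# molecular date estimates. All parameter values are hypothetical.

K = 0.015   # long-term substitution rate (subs/site/Myr), assumed
C = 0.25    # short-term excess above the long-term rate, assumed
LAM = 2.0   # decay constant per Myr, assumed

def rate(t_myr):
    """Apparent rate for a calibration of age t_myr (million years)."""
    return K + C * math.exp(-LAM * t_myr)

# A recent calibration yields a much higher apparent rate than a deep one,
# so extrapolating a short-term rate to deep divergences systematically
# underestimates their ages.
print(rate(0.1))   # near the high short-term mutation rate
print(rate(10.0))  # has converged to the long-term rate K
```

Given such a curve for a taxonomic group, a naive date computed with a single extrapolated rate could be corrected by solving for the age at which the accumulated divergence matches the integral of rate(t), which is the sense in which the abstract describes the curve as a correction tool.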