188 results for transcendental arguments


Relevance:

10.00%

Publisher:

Abstract:

This research investigates the home literacy education practices of Taiwanese families in Australia. As Taiwanese immigrants represent the largest "Chinese Australian" subgroup to have settled in the state of Queensland, teachers in this state often face the challenges of cultural differences between Australian schools and Taiwanese homes. Extensive work by previous researchers suggests that understanding the cultural and linguistic differences that influence how an immigrant child views and interacts with his or her environment is a possible way to minimise these challenges. Cultural practices begin in infancy and at home; therefore, this study focuses on young children around the age of four to five. It examines the forms of literacy education that are enacted and valued by Taiwanese parents in Australia. Specifically, this study analyses "what literacy knowledge and skill is taught at home?", "how is it taught?" and "why is it taught?" The study is framed in Pierre Bourdieu's theory of social practice, which defines literacy from a sociological perspective. The aim is to understand the practices through which literacy is taught in the Taiwanese homes. Practices of literacy education are culturally embedded. Accordingly, the study shows the culturally specialised ways of learning and knowing that are enacted in the study homes. The study entailed four case studies that draw on: observations and recordings of the interactions between the study parent and child in their literacy events; interviews and dialogues with the parents involved; and a collection of photographs of the children's linguistic resources and artefacts. The methodological arguments and design addressed the complexity of home literacy education, where Taiwanese parents raise children in their own cultural ways while adapting to a new country in an immigrant context. In other words, the methodology involves not only cultural practices but also change and continuity in home literacy practices. Bernstein's theory of pedagogic discourse was used to undertake a detailed analysis of parents' selection and organisation of content for home literacy education, and the evaluative criteria they established for the selected literacy knowledge and skill. This analysis showed how parents selected and controlled the interactions in their child's literacy learning. Bernstein's theory of pedagogic discourse, specifically the concepts of "classification" and "framing", was also used to analyse change and continuity in home literacy practice. The design of this study aimed to gain an understanding of parents' literacy teaching in an immigrant context. The study found that parents tended to value and enact traditional practices, yet most of the parents were also searching for innovative ideas for their adult-structured learning. Home literacy education of Taiwanese families in this study was found to be complex, multi-faceted and influenced in an ongoing way by external factors. Implications for educators and recommendations for future study are provided. The findings of this study offer early childhood teachers in Australia understandings that will help them build knowledge about the home literacy education of Taiwanese Australian families.

Relevance:

10.00%

Publisher:

Abstract:

In this review piece, we survey the literature on the cost of equity capital implications of corporate disclosure and conservative accounting policy choice decisions, with the principal objective of providing insights into the design and methodological issues which underlie the empirical investigations. We begin with a review of the analytical studies most typically cited in the empirical research as providing a theoretical foundation. We then turn to the literature that offers insights into the selection of proxies for each of our points of interest: cost of equity capital, disclosure quality and accounting conservatism. As a final step, we review selected empirical studies to illustrate the relevant evidence found within the literature. Based on our review, we interpret the literature as providing the researcher with only limited direct guidance on the appropriate choice of measure for each of the constructs of interest. Further, we view the literature as raising questions about both the interpretation of empirical findings in the face of measurement concerns and the suitability of certain theoretical arguments to the research setting. Overall, perhaps the clearest message is that one of the most controversial and fundamental issues underlying the literature is the diversifiability or nondiversifiability of information effects.

Relevance:

10.00%

Publisher:

Abstract:

Many of the classification algorithms developed in the machine learning literature, including the support vector machine and boosting, can be viewed as minimum contrast methods that minimize a convex surrogate of the 0–1 loss function. The convexity makes these algorithms computationally efficient. The use of a surrogate, however, has statistical consequences that must be balanced against the computational virtues of convexity. To study these issues, we provide a general quantitative relationship between the risk as assessed using the 0–1 loss and the risk as assessed using any nonnegative surrogate loss function. We show that this relationship gives nontrivial upper bounds on excess risk under the weakest possible condition on the loss function—that it satisfies a pointwise form of Fisher consistency for classification. The relationship is based on a simple variational transformation of the loss function that is easy to compute in many applications. We also present a refined version of this result in the case of low noise, and show that in this case, strictly convex loss functions lead to faster rates of convergence of the risk than would be implied by standard uniform convergence arguments. Finally, we present applications of our results to the estimation of convergence rates in function classes that are scaled convex hulls of a finite-dimensional base class, with a variety of commonly used loss functions.
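The variational transform mentioned in the abstract can be made concrete numerically. The following sketch is illustrative only (it is not code from the paper): it compares the best achievable conditional surrogate risk with the best risk attainable under a wrong-sign prediction, for two common convex losses whose transforms have known closed forms. The particular losses and the grid-search approximation are assumptions for this example.

```python
# Numerical sketch of a variational transform of a margin loss phi:
# for conditional probability eta, compare the optimal conditional phi-risk
# with the risk attainable when the prediction has the wrong sign.
import numpy as np

def psi_tilde(phi, theta, alphas=np.linspace(-20, 20, 40001)):
    """psi~(theta) = H^-(eta) - H(eta) with eta = (1 + theta) / 2, where
    H(eta) = inf_a [eta*phi(a) + (1-eta)*phi(-a)] and H^- restricts the
    infimum to predictions a with the 'wrong' sign, a*(2*eta - 1) <= 0."""
    eta = (1.0 + theta) / 2.0
    cond_risk = eta * phi(alphas) + (1.0 - eta) * phi(-alphas)
    H = cond_risk.min()
    wrong_sign = alphas * (2.0 * eta - 1.0) <= 0.0
    H_minus = cond_risk[wrong_sign].min()
    return H_minus - H

hinge = lambda a: np.maximum(0.0, 1.0 - a)
exponential = lambda a: np.exp(-a)

for theta in [0.1, 0.5, 0.9]:
    print(theta,
          psi_tilde(hinge, theta),        # known closed form: |theta|
          psi_tilde(exponential, theta))  # known closed form: 1 - sqrt(1 - theta^2)
```

For the hinge loss the printed values should track |theta|, and for the exponential loss 1 - sqrt(1 - theta^2), which is one way of seeing how different surrogates translate excess surrogate risk into excess 0-1 risk at different rates.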

Relevance:

10.00%

Publisher:

Abstract:

We analyze the puzzling behavior of the volatility of individual stock returns over the past few decades. The literature has provided many different explanations for the trend in volatility, and this paper tests the viability of these different explanations. Virtually all current theoretical arguments offered for the trend in the average level of volatility over time lend themselves to explanations about the difference in volatility levels between firms in the cross-section. We therefore focus separately on the cross-sectional and time-series explanatory power of the different proxies. We fail to find a proxy that is able to explain both dimensions well. In particular, we find that the market-to-book ratio of Cao et al. [Cao, C., Simin, T.T., Zhao, J., 2008. Can growth options explain the trend in idiosyncratic risk? Review of Financial Studies 21, 2599–2633] tracks average volatility levels well, but has no cross-sectional explanatory power. On the other hand, the low-price proxy suggested by Brandt et al. [Brandt, M.W., Brav, A., Graham, J.R., Kumar, A., 2010. The idiosyncratic volatility puzzle: time trend or speculative episodes. Review of Financial Studies 23, 863–899] has much cross-sectional explanatory power, but virtually no time-series explanatory power. We also find that the different proxies do not explain the trend in volatility in the period prior to 1995 (R-squared of virtually zero), but explain rather well the trend in volatility at the turn of the millennium (1995–2005).
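To illustrate the distinction between time-series and cross-sectional explanatory power, the sketch below separates the two dimensions for a single candidate proxy. It is not the authors' procedure; the panel layout, the column names ('year', 'firm', 'vol', 'proxy'), the simple OLS R-squared and the synthetic data are all illustrative assumptions.

```python
# Separate the two dimensions of explanatory power for one volatility proxy.
import numpy as np
import pandas as pd

def r_squared(y, x):
    """R^2 of a simple OLS regression of y on x (with intercept)."""
    x = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(x, y, rcond=None)
    resid = y - x @ beta
    return 1.0 - resid.var() / y.var()

def time_series_r2(panel):
    """Regress the yearly average volatility on the yearly average proxy."""
    yearly = panel.groupby("year")[["vol", "proxy"]].mean()
    return r_squared(yearly["vol"].to_numpy(), yearly["proxy"].to_numpy())

def cross_sectional_r2(panel):
    """Average R^2 of year-by-year cross-sectional regressions."""
    return np.mean([r_squared(g["vol"].to_numpy(), g["proxy"].to_numpy())
                    for _, g in panel.groupby("year")])

# Tiny synthetic firm-year panel, purely for illustration.
rng = np.random.default_rng(0)
panel = pd.DataFrame({
    "year": np.repeat(np.arange(1990, 2000), 50),
    "firm": np.tile(np.arange(50), 10),
    "proxy": rng.normal(size=500),
})
panel["vol"] = 0.5 * panel["proxy"] + rng.normal(size=500)

print(time_series_r2(panel), cross_sectional_r2(panel))
```

A proxy like the market-to-book ratio would score well on the first measure but poorly on the second; a proxy like the low-price indicator would show the opposite pattern.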

Relevance:

10.00%

Publisher:

Abstract:

Evidence suggests that both start-up and young firms (henceforth: new firms) – despite typically being resource-constrained – are sometimes able to innovate (Katila & Shane 2005). Such firms are seldom able to invest in expensive innovation processes, which suggests that they may rely on other pathways to innovation. In this paper, we test arguments that “bricolage,” defined as making do by applying combinations of the resources at hand to new problems and opportunities, provides a pathway to innovation for new firms. Our results suggest that variations in bricolage behaviors can provide an explanation of innovation under resource constraints by new firms.

Relevance:

10.00%

Publisher:

Abstract:

Current climate mitigation policies have not fully resolved contentious issues regarding the inclusion of carbon sequestration through changes in forestry and agricultural management practices. Terrestrial carbon sinks could be a low-cost mitigation option that fosters conservation and development, yet issues related to accurately documenting the amount of carbon sequestered undermine confidence that emission offsets through sequestration are equivalent to emission reductions. From an atmospheric perspective, net CO2 removals through sequestration are equivalent to emission reductions over a given period of time. But carbon will not remain sequestered in biomass or soils indefinitely, and investments in sequestration could stifle investments in reducing emissions from other sources. Many international climate agreements cap emissions from some countries or sectors but enable participation of uncapped countries or sectors through forestry and agricultural sequestration. This structure can prompt emission increases in parts of the uncapped entities that weaken the value of emission reductions earned through sequestration. This has been a minor issue under the Clean Development Mechanism of the Kyoto Protocol. Reducing emissions from deforestation and degradation is susceptible to the same problems. The purpose of this article is to review the science, politics, and policy that form the basis of arguments for and against the inclusion of forestry and agricultural sequestration as a component of current and future international climate mitigation policies.

Relevance:

10.00%

Publisher:

Abstract:

The global release of 250,000 US Embassy diplomatic cables to selected media sites worldwide through the WikiLeaks website was arguably the major global media event of 2010. As well as the implications of the content of the cables for international politics and diplomacy, the actions of WikiLeaks and its controversial editor-in-chief, the Australian Julian Assange, bring together a range of arguments about how the media, news and journalism are being transformed in the 21st century. This paper focuses on the reactions of Australian online news media sites to the release of the diplomatic cables by WikiLeaks, including both the online sites of established news outlets such as The Australian, Sydney Morning Herald and The Age, the ABC's The Drum site, and online-only sites such as Crikey, New Matilda and On Line Opinion. The study focuses on opinion and commentary rather than straight news reportage, and analysis is framed around three issues: WikiLeaks and international diplomacy; the implications of WikiLeaks for journalism; and WikiLeaks and democracy, including debates about the organisation and the ethics of its own practice. It also considers whether a "WikiLeaks Effect" has wider implications for how journalism is conducted in the future, particularly the method of 'redaction' of large amounts of computational data.

Relevance:

10.00%

Publisher:

Abstract:

How do humans respond to their social context? This question is becoming increasingly urgent in a society where democracy requires that the citizens of a country help to decide upon its policy directions, and yet those citizens frequently have very little knowledge of the complex issues that these policies seek to address. Frequently, we find that humans make their decisions more with reference to their social setting than to the arguments of scientists, academics, and policy makers. It is broadly anticipated that the agent-based modelling (ABM) of human behaviour will make it possible to treat such social effects, but we take the position here that a more sophisticated treatment of context will be required in many such models. While notions such as historical context (where the past history of an agent might affect its later actions) and situational context (where the agent will choose a different action in a different situation) abound in ABM scenarios, we will discuss a case of a potentially changing context, where social effects can have a strong influence upon the perceptions of a group of subjects. In particular, we shall discuss a recently reported case where a biased worm in an election debate led to significant distortions in the reports given by participants as to who won the debate (Davis et al. 2011). Thus, participants in a different social context drew different conclusions about the perceived winner of the same debate, with associated significant differences between the two groups as to who they would vote for in the coming election. We extend this example to the problem of modelling the likely electoral responses of agents in the context of the climate change debate, and discuss the notion of interference between related questions that might be asked of an agent in a social simulation intended to simulate their likely responses. A modelling technology which could account for such strong social contextual effects would benefit regulatory bodies which need to navigate between multiple interests and concerns, and we shall present one viable avenue for constructing such a technology. A geometric approach will be presented, where the internal state of an agent is represented in a vector space, and their social context is naturally modelled as a set of basis states that are chosen with reference to the problem space.
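The geometric approach can be sketched in a few lines. The representation below is an assumption for illustration, not the authors' implementation: an agent's internal state is a unit vector, a social context is an orthonormal basis for the same space, and the probability of a given response is the squared projection of the state onto the corresponding basis vector, so the same state can yield different response probabilities in different contexts.

```python
# Geometric sketch: one internal state, two contexts, two response distributions.
import numpy as np

def response_probabilities(state, context_basis):
    """Squared projections of a unit state vector onto each basis vector."""
    projections = context_basis @ state
    return projections ** 2

# Two-dimensional toy problem space: responses "candidate A won" / "candidate B won".
state = np.array([0.8, 0.6])                 # agent's internal state (unit length)
neutral_context = np.eye(2)                  # basis aligned with the raw question
theta = np.pi / 6                            # a socially shifted context (e.g. a biased worm)
biased_context = np.array([[np.cos(theta), np.sin(theta)],
                           [-np.sin(theta), np.cos(theta)]])

print(response_probabilities(state, neutral_context))  # e.g. [0.64, 0.36]
print(response_probabilities(state, biased_context))   # same state, different context
```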

Relevance:

10.00%

Publisher:

Abstract:

This paper posits that the 'student as customer' model has a negative impact upon academic leadership, which in turn is responsible for the erosion of objectivity in the assessment process in the higher education sector. The paper draws on the existing literature to explore the relationship between the student as customer model, academic leadership, and student assessment. The existing research emanating from the literature provides the basis from which the shortcomings of the student as customer model are exposed. From a practical perspective, the arguments made in this paper provide the groundwork for possible future research into the adverse effects of the student as customer model on academic leadership and job satisfaction in the academic workforce. The concern for quality may benefit from empirical investigation of the relationship between the student as customer model and quality learning and assessment outcomes in the higher education sector. The paper raises awareness of the faults of the present reliance on the student as customer model and the negative impact on both students and academic staff. The issues explored have the potential to influence the future directions of the higher education sector with regard to the social implications of its quest for quality educational outcomes. The paper addresses a gap in the literature in regard to the use of the student as customer model and the subsequent adverse effect on academic leadership and assessment in higher education.

Relevance:

10.00%

Publisher:

Abstract:

In this paper I examine the recent arguments by Charles Foster, Jonathan Herring, Karen Melham and Tony Hope against the utility of the doctrine of double effect. One basis on which they reject the utility of the doctrine is their claim that it is notoriously difficult to apply what they identify as its 'core' component, namely, the distinction between intention and foresight. It is this contention that is the primary focus of my article. I argue against this claim that the intention/foresight distinction remains a fundamental part of the law in those jurisdictions where intention remains an element of the offence of murder and that, accordingly, it is essential to resolve the putative difficulties of applying the intention/foresight distinction so as to ensure the integrity of the law of murder. I argue that the main reasons advanced for the claim that the intention/foresight distinction is difficult to apply are ultimately unsustainable, and that the distinction is not as difficult to apply as the authors suggest.

Relevance:

10.00%

Publisher:

Abstract:

The history of public discourse (and in many cases, academic publishing) on pornography is, notoriously, largely polemical and polarised. There is perhaps no other media form that has been so relentlessly the centre of what boils down to little more than arguments "for" or "against"; most famously, on the basis of the oppression, dominance or liberation of sexual subjectivities. These polarised debates leave much conceptual space for researchers to explore: discussions of pornography often lack specificity (when speaking of porn, what exactly do we mean? Which genre? Which markets?); assumptions (e.g. about exactly how the sexualised "white male body" functions culturally, or what the "uses" of porn actually might be) can be buried; and empirical opportunities (how porn as a media industry connects to innovation and the rest of the mediasphere) are missed. In this issue, we have tried to create and populate such a space, not only for the rethinking of some of our core assumptions about pornography, but also for the treatment of pornography as a bona fide, even while contested and problematic, segment of the media and cultural industries, linked economically and symbolically to other media forms.

Relevance:

10.00%

Publisher:

Abstract:

Velocity jump processes are discrete random walk models that have many applications including the study of biological and ecological collective motion. In particular, velocity jump models are often used to represent a type of persistent motion, known as a “run and tumble”, which is exhibited by some isolated bacteria cells. All previous velocity jump processes are non-interacting, which means that crowding effects and agent-to-agent interactions are neglected. By neglecting these agent-to-agent interactions, traditional velocity jump models are only applicable to very dilute systems. Our work is motivated by the fact that many applications in cell biology, such as wound healing, cancer invasion and development, often involve tissues that are densely packed with cells where cell-to-cell contact and crowding effects can be important. To describe these kinds of high cell density problems using a velocity jump process we introduce three different classes of crowding interactions into a one-dimensional model. Simulation data and averaging arguments lead to a suite of continuum descriptions of the interacting velocity jump processes. We show that the resulting systems of hyperbolic partial differential equations predict the mean behavior of the stochastic simulations very well.
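As a rough illustration of the kind of interacting velocity jump process described, the sketch below simulates agents on a one-dimensional periodic lattice that run with velocity +1 or -1, tumble (reverse direction) with a fixed probability per step, and are blocked from stepping onto an occupied site. The lattice formulation, the particular crowding rule and all parameter values are assumptions for illustration rather than the paper's algorithm.

```python
# One-dimensional velocity jump ("run and tumble") walk with simple exclusion.
import numpy as np

def simulate(num_sites=200, num_agents=50, steps=1000, turn_prob=0.1, seed=0):
    rng = np.random.default_rng(seed)
    positions = rng.choice(num_sites, size=num_agents, replace=False)
    velocities = rng.choice([-1, 1], size=num_agents)
    occupied = np.zeros(num_sites, dtype=bool)
    occupied[positions] = True

    for _ in range(steps):
        for i in rng.permutation(num_agents):      # random sequential update
            if rng.random() < turn_prob:           # tumble: reverse direction
                velocities[i] *= -1
            target = (positions[i] + velocities[i]) % num_sites  # periodic domain
            if not occupied[target]:               # crowding: move aborted if occupied
                occupied[positions[i]] = False
                occupied[target] = True
                positions[i] = target
    return positions, velocities

positions, velocities = simulate()
print(np.histogram(positions, bins=20)[0])  # coarse-grained density profile
```

Averaging many such realisations gives the stochastic density data that a continuum (hyperbolic PDE) description would aim to reproduce.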

Relevance:

10.00%

Publisher:

Abstract:

Continuum, partial differential equation models are often used to describe the collective motion of cell populations, with various types of motility represented by the choice of diffusion coefficient, and cell proliferation captured by the source terms. Previously, the choice of diffusion coefficient has been largely arbitrary, with the decision to choose a particular linear or nonlinear form generally based on calibration arguments rather than making any physical connection with the underlying individual-level properties of the cell motility mechanism. In this work we provide a new link between individual-level models, which account for important cell properties such as varying cell shape and volume exclusion, and population-level partial differential equation models. We work in an exclusion process framework, considering aligned, elongated cells that may occupy more than one lattice site, in order to represent populations of agents with different sizes. Three different idealizations of the individual-level mechanism are proposed, and these are connected to three different partial differential equations, each with a different diffusion coefficient; one linear, one nonlinear and degenerate, and one nonlinear and nondegenerate. We test the ability of these three models to predict the population-level response of a cell spreading problem for both proliferative and nonproliferative cases. We also explore the potential of our models to predict long-time travelling wave invasion rates, and extend our results to two-dimensional spreading and invasion. Our results show that each model can accurately predict density data for nonproliferative systems, but that only one does so for proliferative systems. Hence great care must be taken when predicting density data for populations with varying cell shape.
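A minimal sketch of the kind of individual-level mechanism described, in which aligned, elongated agents occupy several contiguous lattice sites and a move is aborted if the newly required site is occupied, is given below. The specific update rule, initial placement and parameter values are illustrative assumptions, not the paper's exact idealizations.

```python
# One-dimensional exclusion process with agents occupying `length` consecutive sites.
import numpy as np

def simulate(num_sites=400, num_agents=40, length=3, steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    # Place agents (identified by the index of their leftmost site) without overlap.
    starts = np.arange(num_agents) * (num_sites // num_agents)
    occupied = np.zeros(num_sites, dtype=bool)
    for s in starts:
        occupied[s:s + length] = True

    for _ in range(steps):
        for i in rng.permutation(num_agents):
            step = rng.choice([-1, 1])
            s = starts[i]
            # Site that would be newly covered by the attempted move.
            new_site = s + length if step == 1 else s - 1
            if 0 <= new_site < num_sites and not occupied[new_site]:
                occupied[new_site] = True
                # Vacate the site left behind by the move.
                occupied[s if step == 1 else s + length - 1] = False
                starts[i] = s + step
    return starts, occupied

starts, occupied = simulate()
print(occupied.mean())  # overall occupancy fraction (here 40 * 3 / 400 = 0.3)
```

Averaging the occupancy over many realisations yields the density profiles against which candidate diffusion coefficients in the continuum models can be tested.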

Relevance:

10.00%

Publisher:

Abstract:

Identity is unique, multiple and dynamic. This paper explores common attributes of organisational identities, and examines the role of performance management systems (PMSs) in revealing identity attributes. One of the influential PMSs, the balanced scorecard, is used to illustrate the arguments. A case study of a public-sector organisation suggests that PMSs now place a value on the intangible aspects of organisational life as well as the financial, periodically revealing the distinctiveness, relativity, visibility, fluidity and manageability of public-sector identities that sustain their viability. This paper contributes to a multi-disciplinary approach and its practical application, demonstrating an alternative pathway to identity-making using PMSs.

Relevance:

10.00%

Publisher:

Abstract:

Female genital mutilation (FGM) is a cultural practice common in many Islamic societies. It involves the deliberate, non-therapeutic physical modification of young girls’ genitalia. FGM can take several forms, ranging from less damaging incisions to actual removal of genitalia and narrowing or even closing of the vagina. While often thought to be required by religion, FGM both predates and has no basis in the Koran. Rather, it is a cultural tradition, motivated by a patriarchal social desire to control female bodies to ensure virginity at marriage (preserving family honour), and to prevent infidelity by limiting sexual desire. In the USA and Australia in 2010, peak medical bodies considered endorsing the medical administration of a ‘lesser’ form of FGM. The basis for this was pragmatic: it would be preferable to satisfy patients’ desire for FGM in medically-controlled conditions, rather than have these patients seek it, possibly in more severe forms, under less safe conditions. While arguments favouring medically-administered FGM were soon overcome, the prospect of endorsing FGM illuminated the issue in these two Western countries and beyond. This paper will review the nature of FGM, its physical and psychological health consequences, and Australian laws prohibiting FGM. Then, it will scan recent developments in Africa, where FGM has been made illegal by a growing number of nations and by the Protocol to the African Charter on Human and Peoples’ Rights 2003 (the Maputo Protocol), but is still proving difficult to eradicate. Finally, based on arguments derived from theories of rights, health evidence, and the historical and religious contexts, this paper will ask whether an absolute human right against FGM can be developed.