845 results for Political theory
Abstract:
This chapter recognizes that research is a cultural invention and explains why. It discusses what equity, research and research design mean, and suggests that the concept of equity is enriched considerably when ideas from Indigenous, critical and politically committed research traditions are involved in research design. When research design and the processes of research are guided by principles of equity, several issues warrant investigation. These include power relations, deficit models of research, homogeneity and reflexivity. Research design that is informed by principles of equity is explicit in its political purpose of seeking socially just outcomes for the short and long term.
Abstract:
The need for the development of effective business curricula that meet the needs of the marketplace has created an increase in the adoption of core competencies lists identifying appropriate graduate skills. Many organisations and tertiary institutions have individual graduate capabilities lists including skills deemed essential for success. Skills recognised as ‘critical thinking’ are popular inclusions on core competencies and graduate capability lists. While there is literature outlining ‘critical thinking’ frameworks, methods of teaching it and calls for its integration into business curricula, few studies actually identify quantifiable improvements achieved in this area. This project sought to address the development of ‘critical thinking’ skills in a management degree program by embedding a process for critical thinking within a theory unit undertaken by students early in the program. Focus groups and a student survey were used to identify issues of both content and implementation and to develop a student perspective on their needs in thinking critically. A process utilising a framework of critical thinking was integrated through a workbook of weekly case studies for group analysis, discussions and experiential exercises. The experience included formative and summative assessment. Initial results indicate a greater valuation by students of their experience in the organisation theory unit, better marks for mid-semester essay assignments and higher evaluations on the university-administered survey of students’ satisfaction.
Abstract:
There has been considerable research conducted over the last 20 years focused on predicting motor vehicle crashes on transportation facilities. The range of statistical models commonly applied includes binomial, Poisson, Poisson-gamma (or negative binomial), zero-inflated Poisson and negative binomial models (ZIP and ZINB), and multinomial probability models. Given the range of possible modeling approaches and the host of assumptions associated with each modeling approach, making an intelligent choice for modeling motor vehicle crash data is difficult. There is little discussion in the literature comparing different statistical modeling approaches, identifying which statistical models are most appropriate for modeling crash data, and providing a strong justification from basic crash principles. In the recent literature, it has been suggested that the motor vehicle crash process can successfully be modeled by assuming a dual-state data-generating process, which implies that entities (e.g., intersections, road segments, pedestrian crossings, etc.) exist in one of two states—perfectly safe and unsafe. As a result, the ZIP and ZINB are two models that have been applied to account for the preponderance of “excess” zeros frequently observed in crash count data. The objective of this study is to provide defensible guidance on how to appropriately model crash data. We first examine the motor vehicle crash process using theoretical principles and a basic understanding of the crash process. It is shown that the fundamental crash process follows Bernoulli trials with unequal probabilities of independent events, also known as Poisson trials. We examine the evolution of statistical models as they apply to the motor vehicle crash process, and indicate how well they statistically approximate the crash process. We also present the theory behind dual-state process count models, and note why they have become popular for modeling crash data. A simulation experiment is then conducted to demonstrate how crash data give rise to the “excess” zeros frequently observed in practice. It is shown that the Poisson and other mixed probabilistic structures are approximations assumed for modeling the motor vehicle crash process. Furthermore, it is demonstrated that under certain (fairly common) circumstances excess zeros are observed—and that these circumstances arise from low exposure and/or inappropriate selection of time/space scales, not from an underlying dual-state process. In conclusion, carefully selecting the time/space scales for analysis, including an improved set of explanatory variables and/or unobserved heterogeneity effects in count regression models, or applying small-area statistical methods (observations with low exposure) represent the most defensible modeling approaches for datasets with a preponderance of zeros.
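A minimal Python sketch of the low-exposure argument summarized above (not the study's actual simulation design; the number of sites, the exposure distribution and the crash probabilities below are hypothetical): counts are generated as independent Bernoulli trials with unequal probabilities (Poisson trials), and the observed share of zeros is compared with what a single Poisson model fitted to the pooled mean would predict.

```python
# Illustrative sketch only: crash counts generated as Bernoulli/Poisson trials at
# sites with heterogeneous, low exposure; the observed share of zeros is then
# compared with a single Poisson fit to the pooled mean.
import numpy as np

rng = np.random.default_rng(42)
n_sites = 5000

# Hypothetical exposure and per-vehicle crash probability at each site
exposure = rng.lognormal(mean=3.0, sigma=1.0, size=n_sites)   # vehicles per period
p_crash = rng.beta(2, 2000, size=n_sites)                     # small, site-specific risk

# Poisson trials: each vehicle is an independent Bernoulli with its site's probability
counts = rng.binomial(exposure.astype(int), p_crash)

observed_zeros = np.mean(counts == 0)
poisson_zeros = np.exp(-counts.mean())   # P(X = 0) under one Poisson with the sample mean

print(f"observed share of zeros  : {observed_zeros:.3f}")
print(f"single-Poisson prediction: {poisson_zeros:.3f}")
# The gap ("excess" zeros) comes from low exposure and heterogeneity,
# not from a separate 'perfectly safe' state in the data-generating process.
```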
Abstract:
Statisticians along with other scientists have made significant computational advances that enable the estimation of formerly complex statistical models. The Bayesian inference framework combined with Markov chain Monte Carlo estimation methods such as the Gibbs sampler enables the estimation of discrete choice models such as the multinomial logit (MNL) model. MNL models are frequently applied in transportation research to model choice outcomes such as mode, destination, or route choices, or to model categorical outcomes such as crash outcomes. Recent developments allow for the modification of the potentially limiting assumptions of MNL such as the independence from irrelevant alternatives (IIA) property. However, relatively little transportation-related research has focused on Bayesian MNL models, the tractability of which is of great value to researchers and practitioners alike. This paper addresses MNL model specification issues in the Bayesian framework, such as the value of including prior information on parameters, allowing for nonlinear covariate effects, and extensions to random parameter models, thereby relaxing the usual limiting IIA assumption. This paper also provides an example that demonstrates, using route-choice data, the considerable potential of the Bayesian MNL approach for many transportation applications. The paper concludes with a discussion of the pros and cons of this Bayesian approach and identifies when its application is worthwhile.
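As a rough illustration of Bayesian MNL estimation (the abstract mentions Gibbs sampling; the sketch below instead uses a simple random-walk Metropolis sampler with Gaussian priors, applied to hypothetical simulated choice data rather than the paper's route-choice data):

```python
# Minimal Bayesian MNL sketch: random-walk Metropolis with N(0, 5^2) priors on the
# utility coefficients, fitted to simulated data. Illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_obs, n_alt, n_feat = 1000, 3, 2

X = rng.normal(size=(n_obs, n_alt, n_feat))           # alternative-specific covariates
beta_true = np.array([1.0, -0.5])
util = X @ beta_true
prob = np.exp(util) / np.exp(util).sum(axis=1, keepdims=True)
y = np.array([rng.choice(n_alt, p=p) for p in prob])  # observed choices

def log_post(beta, prior_sd=5.0):
    u = X @ beta
    u -= u.max(axis=1, keepdims=True)                  # numerical stability
    loglik = (u[np.arange(n_obs), y] - np.log(np.exp(u).sum(axis=1))).sum()
    logprior = -0.5 * np.sum((beta / prior_sd) ** 2)   # Gaussian priors
    return loglik + logprior

beta = np.zeros(n_feat)
samples, step = [], 0.05
lp = log_post(beta)
for it in range(5000):
    prop = beta + step * rng.normal(size=n_feat)       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:           # Metropolis accept/reject
        beta, lp = prop, lp_prop
    samples.append(beta.copy())

post = np.array(samples[1000:])                        # discard burn-in
print("posterior means:", post.mean(axis=0), "(true:", beta_true, ")")
```

The posterior means recover the data-generating coefficients; the specifications discussed in the paper (nonlinear covariate effects, random parameters relaxing IIA) extend well beyond this toy setup.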
Abstract:
A short discussion concerning the theory of endemic governance problems.
Abstract:
This paper proposes a simple variation of the Allingham and Sandmo (1972) construct and integrates it into a dynamic general equilibrium framework with heterogeneous agents. We study an overlapping generations framework in which agents must initially decide whether to evade taxes or not. In the event they decide to evade, they then have to decide the extent of income or wealth they wish to under-report. We find that, in comparison with the basic approach, the ‘evade or not’ choice drastically reduces the extent of evasion in the economy. This outcome is the result of an anomaly intrinsic to the basic Allingham and Sandmo version of the model, which makes the evade-or-not extension a more suitable approach to modelling the issue. We also find that the basic model and the model with an ‘evade-or-not’ choice have strikingly different political economy implications, which suggest fruitful avenues for empirical research.
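For orientation, a hedged numerical sketch of the underlying Allingham and Sandmo (1972) under-reporting problem and the 'evade or not' comparison, using CRRA utility and purely hypothetical parameter values (the paper embeds this decision in a calibrated overlapping generations general equilibrium model, which this sketch does not attempt to reproduce):

```python
# Static Allingham-Sandmo sketch: choose how much income to declare, or comply fully.
# All parameter values are hypothetical illustrations.
import numpy as np

W, tax, p_audit, penalty, sigma = 100.0, 0.30, 0.05, 0.75, 2.0

def crra(c, s=sigma):
    return np.log(c) if s == 1.0 else (c ** (1 - s) - 1) / (1 - s)

def expected_utility(declared):
    """Expected utility when `declared` income is reported."""
    c_no_audit = W - tax * declared
    c_audit = W - tax * declared - penalty * (W - declared)  # fine on undeclared income
    return (1 - p_audit) * crra(c_no_audit) + p_audit * crra(c_audit)

grid = np.linspace(0.0, W, 2001)
eu = expected_utility(grid)
best = grid[np.argmax(eu)]

eu_comply = crra(W * (1 - tax))          # utility from declaring everything
print(f"optimal declaration if evading: {best:.1f} of {W}")
print(f"evade? {eu.max() > eu_comply} (EU evade {eu.max():.4f} vs comply {eu_comply:.4f})")
```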
Abstract:
In Australian universities, journalism educators usually come to the academy from the journalism profession and consequently place a high priority on leading students to develop a career-focussed skill set. The changing nature of the technological, political and economic environments and the professional destinations of journalism graduates place demands on journalism curricula and educators alike. The profession is diverse, such that the better description is of many ‘journalisms’ rather than one ‘journalism’, with consequential pressure on curricula to extend beyond the traditional skill set, where practical ‘writing’ and ‘editing’ skills dominate, to incorporate critical theory and the social construction of knowledge. A parallel set of challenges faces academic staff operating in a higher education environment where change is the only constant and research takes precedence over curriculum development. In this paper, three educators at separate universities report on their attempts to implement curriculum change to imbue graduates with better skills and attributes, such as enhanced teamwork, problem solving and critical thinking, to operate in the divergent environment of 21st century journalism. The paper uses narrative case study to illustrate the different approaches. Data collected from formal university student evaluations inform the narratives, along with rich but less formal qualitative data including anecdotal student comments and student reflective assessment presentations. Comparison of the three approaches illustrates the dilemmas academic staff face when teaching in disciplines that are impacted by rapid changes in technology requiring new pedagogical approaches. Recommendations for future directions are considered against the background of learning purpose.
Abstract:
We present a novel modified theory based upon Rayleigh scattering of ultrasound from composite nanoparticles with a liquid core and solid shell. We derive closed-form solutions for the scattering cross-section and have applied this model to an ultrasound contrast agent consisting of a liquid-filled core (perfluorooctyl bromide, PFOB) encapsulated by a polymer shell (polycaprolactone, PCL). Sensitivity analysis was performed to predict the dependence of the scattering cross-section upon material and dimensional parameters. A rapid increase in the scattering cross-section was achieved by increasing the compressibility of the core, validating the incorporation of the highly compressible PFOB; the compressibility of the shell had little impact on the overall scattering cross-section, although a more compressible shell is desirable. Changes in the density of the shell and the core result in predicted local minima in the scattering cross-section, approximately corresponding to the PFOB-PCL contrast agent considered; hence, incorporation of a lower shell density could potentially improve the scattering cross-section significantly. A 50% reduction in shell thickness relative to the external radius increased the predicted scattering cross-section by 50%. Although it has often been considered that the shell has a negative effect on the echogenicity due to its low compressibility, we have shown that it can potentially play an important role in the echogenicity of the contrast agent. The challenge for the future is to identify suitable shell and core materials that meet the predicted characteristics in order to achieve optimal echogenicity.
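For context, the sketch below evaluates the classical Rayleigh scattering cross-section for a small homogeneous fluid sphere, the baseline that the composite core-shell theory modifies; the particle and host parameter values are hypothetical, and the core-shell corrections derived in the paper are not reproduced here.

```python
# Classical Rayleigh (long-wavelength) scattering cross-section for a small fluid
# sphere suspended in a host fluid. Baseline illustration only; not the paper's
# modified core-shell model. Parameter values are hypothetical.
import numpy as np

def rayleigh_cross_section(freq, radius, kappa_s, rho_s, kappa0, rho0, c0):
    """Total scattering cross-section of a small fluid sphere (ka << 1)."""
    k = 2 * np.pi * freq / c0                          # wavenumber in the host medium
    compress_term = ((kappa_s - kappa0) / kappa0) ** 2
    density_term = 3 * ((rho_s - rho0) / (2 * rho_s + rho0)) ** 2
    return (4 * np.pi / 9) * k**4 * radius**6 * (compress_term + density_term)

# Water-like host medium
c0, rho0 = 1500.0, 1000.0                              # m/s, kg/m^3
kappa0 = 1 / (rho0 * c0**2)                            # compressibility of the host

# Hypothetical particle: 200 nm radius, more compressible and denser than water
sigma = rayleigh_cross_section(freq=10e6, radius=200e-9,
                               kappa_s=5 * kappa0, rho_s=1900.0,
                               kappa0=kappa0, rho0=rho0, c0=c0)
print(f"scattering cross-section ~ {sigma:.3e} m^2")
```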
Abstract:
In 2008 the Australian government decided to remove white blood cells from all blood products. This policy of universal leucodepletion was a change to the existing policy of supplying leucodepleted products to high risk patients only. The decision was made without strong information about the cost-effectiveness of universal leucodepletion. The aims for this policy analysis are to generate cost-effectiveness data about universal leucodepletion, and to add to our understanding of the role of evidence and the political reality of healthcare decision-making in Australia. The cost-effectiveness analysis revealed universal leucodepletion costs $398,943 to save one year of life. This exceeds the normal maximum threshold for Australia. We discuss this result within the context of how policy decisions are made about blood, and how it relates to the theory and process of policy making. We conclude that the absence of a strong voice for cost-effectiveness was an important omission in this decision.
Abstract:
Neo-liberalism has become one of the boom concepts of our time. From its original reference point as a descriptor of the economics of the “Chicago School” such as Milton Friedman, or authors such as Friedrich von Hayek, neo-liberalism has become an all-purpose descriptor and explanatory device for phenomena as diverse as Bollywood weddings, standardized testing in schools, violence in Australian cinema, and the digitization of content in public libraries. Moreover, it has become an entirely pejorative term: no-one refers to their own views as “neo-liberal”, but it rather refers to the erroneous views held by others, whether they acknowledge this or not. Neo-liberalism as it has come to be used, then, bears many of the hallmarks of a dominant ideology theory in the classical Marxist sense, even if it is often not explored in these terms. This presentation will take the opportunity provided by the English language publication of Michel Foucault’s 1978-79 lectures, under the title of The Birth of Biopolitics, to consider how he used the term neo-liberalism, and how this equates with its current uses in critical social and cultural theory. It will be argued that Foucault did not understand neo-liberalism as a dominant ideology in these lectures, but rather as marking a point of inflection in the historical evolution of liberal political philosophies of government. It will also be argued that his interpretation of neo-liberalism was more nuanced and more comparative than the more recent uses of Foucault in the literature on neo-liberalism. It will also look at how Foucault develops comparative historical models of liberal capitalism in The Birth of Biopolitics, arguing that this dimension of his work has been lost in more recent interpretations, which tend to retro-fit Foucault to contemporary critiques of either U.S. neo-conservatism or the “Third Way” of Tony Blair’s New Labour in the UK.
Abstract:
Globally, teaching has become more complex and more challenging over recent years, with new and increased demands being placed on teachers by students, their families, governments and wider society. Teachers work with more diverse communities in times characterised by volatility, uncertainty and moral ambiguity. Societal, political, economic and cultural shifts have transformed the contexts in which teachers work and have redefined the ways in which teachers interact with students. This qualitative study uses phenomenographic methods to explore the nature of pedagogic teacher-student interactions. The data analysis reveals five qualitatively different ways in which teachers experience pedagogic engagements with students. The resultant categories of description ranged from information providing, in which teachers were viewed as transmitters of a body of knowledge, through to mentoring, in which teachers were perceived as significant others in the lives of students, with their influence extending beyond the walls of the classroom and beyond the years of schooling. The paper concludes by arguing that if teachers are to prepare students for the challenges and opportunities in changing times, teacher education programs need to consider ways to facilitate the development of mentoring capacities in new teachers.
Abstract:
Plenary Session: "New Voices in Children's Literature"
Abstract:
This study reports on the impact of a "drink driving education program" taught to grade ten high school students. The program, which involves twelve lessons, uses strategies based on the Ajzen and Madden theory of planned behavior. Students were trained to use alternatives to drink driving and passenger behaviors. One thousand seven hundred and seventy-four students who had been taught the program in randomly assigned control and intervention schools were followed up three years later. There had been a major reduction in drink driving behaviors in both intervention and control students. In addition to this cohort change, there was a trend toward reduced drink driving in the intervention group and a significant reduction in passenger behavior in this group. Readiness to use alternatives suggested that the major impact of the program was on students who were experimenting with the behavior at the time the program was taught. The program seems to have optimized concurrent social attitude and behavior change.
Abstract:
A consistent finding in the literature is that males report greater usage of drugs and subsequently greater amounts of drug driving. Research also suggests that vicarious influences may be more pertinent to males than to females. Utilising Stafford and Warr’s (1993) reconceptualization of deterrence theory, this study sought to determine whether the relative deterrent impact of zero-tolerance drug driving laws differs between genders. A sample of motorists (N = 899) completed a self-report questionnaire assessing participants’ frequency of drug driving and personal and vicarious experiences with punishment and punishment avoidance. Results show that males were significantly more likely to report future intentions of drug driving. Additionally, vicarious experience of punishment avoidance was a more influential predictor of future drug driving for males, while personal experience of punishment avoidance was a more influential predictor for females. These findings can inform gender-sensitive media campaigns and interventions for convicted drug drivers.
Abstract:
The changes in economic status in Malaysia have led to many psychosocial problems, especially among young people. Counselling and psychotherapy, as practised in Western culture, have been seen as one of the solutions. Most counselling theorists believe that their theory is universal; however, there is limited research to prove it. This paper describes an ongoing study conducted in Malaysia on the applicability of one Western counselling theory, Bowen’s family theory. In Bowen’s theory, levels of differentiation of self within the family allow a person to both leave the family’s boundaries in search of uniqueness and continually return to the family in order to further establish a sense of belonging. The study comprises four measures: the Differentiation of Self Inventory (DSI), the Family Inventory of Life Events (ILE), the Depression Anxiety and Stress Scale (DASS) and the Connor-Davidson Resilience Scale (CD-RISC). Preliminary findings are discussed and the implications for enhancing the quality of teaching family counselling in universities are explored.