951 results for divided societies


Relevance: 10.00%

Abstract:

Freeways are divided roadways designed to facilitate the uninterrupted movement of motor vehicles. However, many freeways now experience demand flows in excess of capacity, leading to recurrent congestion. The Highway Capacity Manual (TRB, 1994) uses empirical macroscopic relationships between speed, flow and density to quantify freeway operations and performance. Capacity may be predicted as the maximum uncongested flow achievable. Although they are effective tools for design and analysis, macroscopic models lack an understanding of the nature of the processes taking place in the system. Szwed and Smith (1972, 1974) and Makigami and Matsuo (1990) have shown that microscopic modelling is also applicable to freeway operations. Such models facilitate an understanding of the processes whilst providing for the assessment of performance, through measures of capacity and delay. However, these models are limited to only a few circumstances. The aim of this study was to produce more comprehensive and practical microscopic models. These models were required to portray accurately the mechanisms of freeway operations at the specific locations under consideration, to be calibrated using data acquired at those locations, and to have their output validated against data acquired at the same sites, so that the outputs are truly descriptive of the performance of the facility. The form of the models needed to rest on a theoretical basis rather than on empiricism, as is the case for the macroscopic models currently used. Finally, the models needed to be adaptable to variable operating conditions, so that they could be applied, where possible, to other similar systems and facilities. It was not possible in this single study to produce a stand-alone model applicable to all facilities and locations; however, the scene has been set for the application of the models to a much broader range of operating conditions.
Opportunities for further development of the models were identified, and procedures provided for calibrating and validating the models across a wide range of conditions. The models developed do, however, have limitations in their applicability. Only uncongested operations were studied and represented. Driver behaviour in Brisbane was applied to the models; different mechanisms are likely in other locations owing to variability in road rules and driving cultures. Not all manoeuvres evident were modelled; some unusual manoeuvres were considered unwarranted to model. However, the models developed contain the principal processes of freeway operations: merging and lane changing. Gap acceptance theory was applied to these critical operations to assess freeway performance. Gap acceptance theory was found to be applicable to merging; however, the major stream, the kerb lane traffic, exercises only a limited priority over the minor stream, the on-ramp traffic. Theory was established to account for this activity. Kerb lane drivers were also found to change to the median lane where possible, to assist coincident mergers. The net limited priority model accounts for this by predicting a reduced major stream flow rate which excludes lane changers. Cowan's M3 model was calibrated for both streams; on-ramp and total upstream flow are required as input. The relationship between flow and the proportion of headways greater than 1 s differed between on-ramps fed by signalised intersections and those fed by unsignalised intersections. Constant-departure on-ramp metering was also modelled. Minimum follow-on times of 1 to 1.2 s were calibrated. Critical gaps were shown to lie between the minimum follow-on time and the sum of the minimum follow-on time and the 1 s minimum headway. Limited priority capacity and other boundary relationships were established by Troutbeck (1995).
The minimum average minor stream delay and the corresponding proportion of drivers delayed were quantified theoretically in this study. A simulation model was constructed to predict intermediate minor and major stream delays across all minor and major stream flows, and pseudo-empirical relationships were established to predict average delays. Major stream average delays are limited to 0.5 s, insignificant compared with minor stream delays, which reach infinity at capacity. Minor stream delays were shown to be smaller when unsignalised intersections, rather than signalised intersections, are located upstream of on-ramps, and smaller still when ramp metering is installed. Smaller delays correspond to improved merge area performance. A more tangible performance measure, the distribution of distances required to merge, was established by including design speeds. This distribution can be measured to validate the model, and merging probabilities can be predicted for given taper lengths, a most useful performance measure. This model was also shown to be applicable to lane changing. Tolerable limits to merging probabilities require calibration; from these, practical capacities can be estimated. Further calibration of the traffic inputs, critical gap and minimum follow-on time is required for both merging and lane changing, and a general relationship to predict the proportion of drivers delayed requires development. These models can then be used to complement existing macroscopic models in assessing performance, and to provide further insight into the nature of operations.
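The gap-acceptance calculation underlying this kind of analysis can be sketched with the standard (absolute-priority) capacity formula for a Cowan M3 major-stream headway distribution. The parameter values below are illustrative only, not the calibrated Brisbane values, and the limited-priority extension described in the abstract would further modify the major-stream flow term.

```python
import math

def m3_merge_capacity(q_major, alpha, delta, t_c, t_f):
    """Minor-stream (merge) capacity in veh/s under gap acceptance with a
    Cowan M3 major-stream headway distribution (absolute-priority form).

    q_major : major-stream flow (veh/s); requires q_major * delta < 1
    alpha   : proportion of free (unbunched) vehicles, i.e. headways > delta
    delta   : minimum (bunched) headway, s
    t_c     : critical gap, s
    t_f     : follow-on time, s
    """
    # Decay rate of the shifted-exponential tail of free headways.
    lam = alpha * q_major / (1.0 - delta * q_major)
    return (q_major * alpha * math.exp(-lam * (t_c - delta))
            / (1.0 - math.exp(-lam * t_f)))

# Illustrative values: 1 s minimum headway, 1.1 s follow-on time, and a
# critical gap of 2 s, lying between t_f and t_f + 1 s as the study found.
cap = m3_merge_capacity(q_major=0.3, alpha=0.7, delta=1.0, t_c=2.0, t_f=1.1)
print(f"merge capacity ≈ {cap * 3600:.0f} veh/h")
```

As expected, the predicted merge capacity falls as the major-stream flow rises, since fewer acceptable gaps remain.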

Relevance: 10.00%

Abstract:

Stigmergy is a biological term used when discussing insect or swarm behaviour, and describes a model of communication in which agents interact indirectly through traces left in a shared environment rather than directly with one another. This phenomenon is demonstrated in the behaviour of ants following pheromone trails during food gathering, and similarly in the mound-building of termites. What is interesting about this mechanism is that highly organised societies are achieved without any apparent management structure. Stigmergic behaviour is implicit in the Web, where the volume of users provides a self-organisation and self-contextualisation of content in sites which facilitate collaboration. However, the majority of content is generated by a minority of Web participants. A significant contribution of this research would be a model of Web stigmergy identifying virtual pheromones and their importance in the collaborative process. This paper explores how exploiting stigmergy has the potential to provide a valuable mechanism for identifying and analysing online user behaviour, recording actionable knowledge otherwise lost in existing Web interaction dynamics. Ultimately this might assist us in building better collaborative Web sites.
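The positive-feedback mechanism described here can be illustrated with a toy simulation: hypothetical "virtual pheromone" scores on a set of pages, where each visit deposits pheromone on the currently strongest page while all scores evaporate, so a small initial advantage self-reinforces without any central coordination. The page names and constants are invented for illustration.

```python
EVAPORATION = 0.95   # fraction of pheromone retained each step
DEPOSIT = 1.0        # pheromone left by each simulated visit

# Hypothetical pages with slightly different initial attractiveness.
pheromone = {"wiki/TopicA": 1.0, "wiki/TopicB": 0.5, "wiki/TopicC": 0.2}

for _ in range(50):
    # Each simulated user follows the strongest trail (real stigmergic
    # models choose probabilistically in proportion to trail strength).
    target = max(pheromone, key=pheromone.get)
    for page in pheromone:
        pheromone[page] *= EVAPORATION   # trails decay over time
    pheromone[target] += DEPOSIT         # visits reinforce the trail

ranked = sorted(pheromone, key=pheromone.get, reverse=True)
print(ranked)
```

After a few dozen steps the initially strongest page has accumulated almost all of the pheromone, mirroring the observation that a minority of participants (and pages) come to dominate content and attention.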

Relevance: 10.00%

Abstract:

This paper presents a comprehensive review of scientific and grey literature on gross pollutant traps (GPTs). GPTs are designed with internal screens to capture gross pollutants—organic matter and anthropogenic litter. Their application involves professional societies, research organisations, local city councils, government agencies and the stormwater industry—often in partnership. In view of this, the 113 references include unpublished manuscripts from these bodies along with scientific peer-reviewed conference papers and journal articles. The literature reviewed was organised into a matrix of six main devices and nine research areas (testing methodologies) which include: design appraisal study, field monitoring/testing, experimental flow fields, gross pollutant capture/retention characteristics, residence time calculations, hydraulic head loss, screen blockages, flow visualisations and computational fluid dynamics (CFD). When the fifty-four-cell (six-by-nine) matrix was analysed, twenty-eight research gaps were found in the tabulated literature. It was also found that the number of research gaps increased if only the scientific literature was considered. It is hoped that, in addition to informing the research community at QUT, this literature review will also be of use to other researchers in this field.

Relevance: 10.00%

Abstract:

Mathematics education literature has called for an abandonment of ontological and epistemological ideologies that have often divided theory-based practice. Instead, a consilience of theories has been sought which would leverage the strengths of each learning theory and so positively impact upon contemporary educational practice. This research activity is based upon Popper’s notion of three knowledge worlds, which differentiates the knowledge shared in a community from the personal knowledge of the individual, and Bereiter’s characterisation of understanding as the individual’s relationship to tool-like knowledge. Using these notions, a re-conceptualisation of knowledge and understanding and a subsequent re-consideration of learning theories are proposed as a way to address the challenge set by the literature. Referred to as the alternative theoretical framework, the proposed theory accounts for the scaffolded transformation of each individual’s unique understanding, whilst acknowledging the existence of a body of domain knowledge shared amongst participants in a scientific community of practice. The alternative theoretical framework is embodied within an operational model that is accompanied by a visual nomenclature with which to describe consensually developed shared knowledge and personal understanding. This research activity has sought to iteratively evaluate this proposed theory through the practical application of the operational model and visual nomenclature to the domain of early-number counting, addition and subtraction. This domain of mathematical knowledge has been comprehensively analysed and described. Through this process, the viability of the proposed theory as a tool with which to discuss, and thus improve, knowledge and understanding within the domain of mathematics has been validated.
Putting the proposed theory into practice has led to its refinement and the subsequent achievement of a solid theoretical base for the future development of educational tools to support teaching and learning practice, including computer-mediated learning environments. Such future activity, using the proposed theory, will advance contemporary mathematics educational practice by bringing together the strengths of cognitivist, constructivist and post-constructivist learning theories.

Relevance: 10.00%

Abstract:

The aim of this thesis has been to map the ethical journey of experienced nurses now practising in rural and remote hospitals in central and south-west Queensland and in domiciliary services in Brisbane. One group of the experienced nurses in the study were Directors of Nursing in rural and remote hospitals. These nurses were “hands-on”, “multi-skilled” nurses who also had the task of managing the hospital. There were also two Directors of Nursing from domiciliary services in Brisbane. A grounded theory method was used. The nurses were interviewed, and the data retrieved from the interviews were coded and categorised; from these categories a conceptual framework was generated. The literature dealing with ethical decision making and nurses also became part of the data. The study revealed that all these nurses experienced moral distress as they made ethical decisions. The decision making categories revealed in the data were: the area of financial management; issues as end of life approaches; allowing to die with dignity; emergency decisions; experience of unexpected death; and the dilemma of providing care in very difficult circumstances. These categories were divided into two chapters: the category related to administrative and financial constraints, and the categories dealing with ethical issues in clinical settings. A further chapter discussed the overarching category of coping with moral distress. These experienced nurses suffered moral distress as they made ethical decisions, confirming many instances of moral distress in ethical decision making documented in the literature to date. Significantly, the nurses in their interviews never mentioned the ethical principles used in bioethics as an influence in their decision making. Only one referred to lectures on ethics as being an influence in her thinking.
As they described their ethical problems and how they worked through them, they drew on their own previous experience rather than any knowledge of ethics gained from nursing education. They spoke from a caring responsibility towards their patients, but they were also concerned for justice on their patients' behalf. This study demonstrates that these nurses operated from the ethic of care, tempered with the ethic of responsibility and a concern for justice. Reflection on professional experience, rather than formal ethics education and training, was the primary influence on their ethical decision making.

Relevance: 10.00%

Abstract:

Over the last few decades, there has been a marked increase in media coverage of, and debate surrounding, a specific group of offences in modern democratic nations which bear the brunt of the label ‘crimes against morality’. Included within this group are offences related to prostitution and pornography, homosexuality and incest, and child sexual abuse. This book examines the nexus between sex, crime and morality from a theoretical perspective. It is the first academic text to offer an examination and analysis of the philosophical underpinnings of sex-related crimes and social attitudes towards them, and of the historical, anthropological and moral reasons for differentiating these crimes in contemporary western culture. The book is divided into three sections corresponding to three theoretical frameworks: Part 1 examines the moral temporality of sex and taboo as a foundation for legislation governing sex crimes; Part 2 focuses on the geography of sex and deviance, specifically notions of public morality and the public/private divide; and Part 3 examines the moral economy of sex and harm, including the social construction of harm. Sex, Crime and Morality will be key reading for students of criminology, criminal justice, gender studies and ethics, and will also be of interest to justice professionals.

Relevance: 10.00%

Abstract:

A series of kaolinite-potassium acetate intercalation composites was prepared. The thermal behavior and decomposition of these composites were investigated by simultaneous differential scanning calorimetry-thermogravimetric analysis (DSC-TGA), X-ray diffraction (XRD) and Fourier-transform infrared (FT-IR) spectroscopy. The XRD pattern at room temperature indicated that intercalation of potassium acetate into kaolinite increases the basal spacing from 0.718 to 1.428 nm. The peak intensity of the expanded phase of the composite decreased on heating above 300°C, and the basal spacing reduced to 1.19 nm at 350°C and 0.718 nm at 400°C. These observations were supported by DSC-TGA and FT-IR measurements, in which endothermic reactions were observed between 300 and 600°C. These reactions can be divided into two stages: (1) removal of the intercalated molecules between 300 and 400°C; and (2) dehydroxylation of kaolinite between 400 and 600°C. Significant changes were observed in the infrared bands assigned to the outer surface hydroxyls, inner surface hydroxyls, inner hydroxyls and hydrogen bonds.
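The basal spacings quoted above can be related to XRD peak positions through Bragg's law, 2d sin θ = nλ. The sketch below assumes Cu Kα radiation (λ ≈ 0.15406 nm), which the abstract does not state; it simply shows where the 0.718 nm and 1.428 nm reflections would appear on a 2θ axis under that assumption.

```python
import math

WAVELENGTH = 0.15406  # nm, Cu K-alpha (assumed; not stated in the abstract)

def two_theta_deg(d_nm, n=1):
    """Bragg angle 2θ in degrees for basal spacing d (nm), order n."""
    return 2.0 * math.degrees(math.asin(n * WAVELENGTH / (2.0 * d_nm)))

for label, d in [("kaolinite (001)", 0.718), ("intercalated phase", 1.428)]:
    print(f"{label}: d = {d} nm -> 2θ ≈ {two_theta_deg(d):.2f}°")
```

The expanded (intercalated) phase diffracts at a lower angle than raw kaolinite, which is why the collapse of the expanded peak on heating can be followed directly in the diffractogram.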

Relevance: 10.00%

Abstract:

Women and Representation in Local Government opens up an opportunity to critique and move beyond suppositions and labels in relation to women in local government. Presenting a wealth of new empirical material, this book brings together international experts to examine and compare the presence of women at this level of government, and features case studies on the US, UK, France, Germany, Spain, Finland, Uganda, China, Australia and New Zealand. The book is divided into four main sections, each of which explores a key theme related to women and representation in local government and engages with contemporary gender theory and the broader literature on women and politics. The contributors explore local government as a gendered environment, critique strategies to address the limited number of elected female members in local government, and examine the impact of significant recent changes on local government through a gender lens. Addressing key questions of how gender equality can be achieved in this sector, the book will be of strong interest to students and academics working in the fields of gender studies, local government and international politics.

Relevance: 10.00%

Abstract:

It is commonly accepted that contemporary schoolchildren live in a world that is intensely visual and commercially motivated, where what is imagined and what is experienced intermingle. Because of this, contemporary education should encourage a child to make reference to, and connection with, their ‘out-of-school’ life. The core critical underpinnings of curriculum-based arts appreciation and theory hinge on educators and students taking a historical look at the ways artists have engaged with, and made comment upon, their contemporary societies. My article uses this premise to argue for the need to persist in pushing for critique of, and through, the visual, and for that critique to be delivered as an active process via the arts classroom rather than as visual literacy, here regarded as a more passive process for interpreting and understanding visual material. The article asserts that visual arts lessons are best placed to provide students fully with such critique because they help students to develop a ‘critical eye’, an interpretive lens often used by artists to view, analyse and independently navigate and respond to contemporary society.

Relevance: 10.00%

Abstract:

We both love and hate our journalists. They are perceived as sexy and glamorous on the one hand, despicable and sleazy on the other. Opinion polls regularly indicate that we experience a kind of cultural schizophrenia in our relationship to journalists and the news media: sometimes they are viewed as heroes, at other times villains. From Watergate to the fabrication scandals of the 2000s, journalists have risen and fallen in public esteem. In this book, leading journalism studies scholar Brian McNair explores how journalists have been represented through the prism of one of our key cultural forms, cinema. Drawing on the history of cinema since the 1930s, and with a focus on the period 1997-2008, McNair explores how journalists have been portrayed in film, and what these images tell us about the role of the journalist in liberal democratic societies. Separate chapters are devoted to the subject of female journalists in film, foreign correspondents, investigative reporters and other categories of news maker who have featured regularly in cinema. The book also discusses the representation of public relations professionals in film. Illustrated throughout and written in an accessible and lively style suitable for academic and lay readers alike, Journalists in Film will be essential reading for students and teachers of journalism, and for all those concerned about the role of the journalist in contemporary society, not least journalists themselves. An appendix contains mini-essays on every film about journalism released in the cinema between 1997 and 2008.

Relevance: 10.00%

Abstract:

Circuit breaker restrikes are unwanted occurrences which can ultimately lead to breaker failure. Before 2008, there was little evidence in the literature of monitoring techniques based on the measurement and interpretation of restrikes produced during the switching of capacitor banks and shunt reactor banks. In 2008 a non-intrusive radiometric restrike measurement method, as well as a restrike hardware detection algorithm, was developed. The limitations of the radiometric measurement method are a band-limited frequency response and limited amplitude determination. Existing detection methods and algorithms require the use of wide-bandwidth current transformers and voltage dividers. A novel non-intrusive restrike diagnostic algorithm using ATP (Alternative Transient Program) and wavelet transforms is proposed. Wavelet transforms are widely used in signal processing; here the diagnostic is divided into two tests, i.e. restrike detection and an energy-level test based on deteriorated waveforms for different types of restrike. A ‘db5’ wavelet was selected as it gave a 97% correct diagnostic rate when evaluated against a database of diagnostic signatures. The algorithm was also tested using restrike waveforms simulated under different network parameters, which gave 92% correct diagnostic responses. The diagnostic technique and methodology developed in this research can be applied to any power monitoring system, with slight modification, for restrike detection.
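The core idea of wavelet-based restrike detection, that a sharp transient concentrates energy in the high-frequency detail coefficients while the smooth power-frequency waveform does not, can be sketched with a single-level Haar decomposition. This is a simplification for brevity: the study used a 'db5' wavelet (available in libraries such as PyWavelets), and all signal values below are synthetic and illustrative.

```python
import math

def haar_detail_energy(signal):
    """Energy of first-level Haar wavelet detail coefficients.

    Restrike transients are sharp, so they concentrate energy in the
    detail (high-frequency) band; a smooth 50 Hz waveform does not.
    """
    details = [(signal[2 * k] - signal[2 * k + 1]) / math.sqrt(2.0)
               for k in range(len(signal) // 2)]
    return sum(d * d for d in details)

# Synthetic 50 Hz voltage waveform, sampled at 10 kHz (illustrative).
fs, f0 = 10_000, 50.0
clean = [math.sin(2 * math.pi * f0 * n / fs) for n in range(2000)]

# Superimpose a short high-frequency burst to mimic a restrike transient.
restrike = clean[:]
for n in range(1000, 1040):
    restrike[n] += 0.5 * math.sin(2 * math.pi * 3000.0 * n / fs)

e_clean = haar_detail_energy(clean)
e_restrike = haar_detail_energy(restrike)
print(e_clean, e_restrike)  # detail energy jumps when the transient is present
```

A practical detector would compare detail energy in sliding windows against a calibrated threshold, which is the role the diagnostic-signature database plays in the research described above.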

Relevance: 10.00%

Abstract:

This special issue presents an excellent opportunity to study applied epistemology in public policy. This is an important task because the arena of public policy is the social domain in which macro conditions for ‘knowledge work’ and ‘knowledge industries’ are defined and created. We argue that knowledge-related public policy has become overly concerned with creating the politico-economic parameters for the commodification of knowledge. Our policy scope is broader than that of Fuller (1988), who emphasizes the need for a social epistemology of science policy. We extend our focus to a range of policy documents that include communications, science, education and innovation policy (collectively called knowledge-related public policy in acknowledgement of the fact that there is no defined policy silo called ‘knowledge policy’), all of which are central to policy concerned with the ‘knowledge economy’ (Rooney and Mandeville, 1998). However, what we will show here is that, as Fuller (1995) argues, ‘knowledge societies’ are not industrial societies permeated by knowledge, but that knowledge societies are permeated by industrial values. Our analysis is informed by an autopoietic perspective. Methodologically, we approach it from a sociolinguistic position that acknowledges the centrality of language to human societies (Graham, 2000). Here, what we call ‘knowledge’ is posited as a social and cognitive relationship between persons operating on and within multiple social and non-social (or, crudely, ‘physical’) environments. Moreover, knowing, we argue, is a sociolinguistically constituted process. Further, we emphasize that the evaluative dimension of language is most salient for analysing contemporary policy discourses about the commercialization of epistemology (Graham, in press). 
Finally, we provide a discourse analysis of a sample of exemplary texts drawn from a 1.3 million-word corpus of knowledge-related public policy documents that we compiled from local, state, national and supranational legislatures throughout the industrialized world. Our analysis exemplifies a propensity in policy for resorting to technocratic, instrumentalist and anti-intellectual views of knowledge. We argue that what underpins these patterns is a commodity-based conceptualization of knowledge, one that rests on an axiology of narrowly economic imperatives at odds with the very nature of knowledge. The commodity view of knowledge, therefore, is flawed in its ignorance of the social systemic properties of knowing.
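The corpus work described above can be sketched in miniature: the snippet below counts the words co-occurring within a small window of the node word "knowledge", the most basic form of collocation analysis. The text fragment is invented for illustration; the actual study analysed a 1.3 million-word policy corpus with far richer sociolinguistic methods.

```python
import re
from collections import Counter

# Invented fragment standing in for the policy corpus.
corpus = """Knowledge industries will drive economic growth. Investment in
knowledge assets and knowledge capital must deliver commercial returns.
The knowledge economy demands measurable outcomes from knowledge work."""

tokens = re.findall(r"[a-z]+", corpus.lower())
WINDOW = 2  # words counted on either side of the node word

collocates = Counter()
for i, tok in enumerate(tokens):
    if tok == "knowledge":
        lo, hi = max(0, i - WINDOW), min(len(tokens), i + WINDOW + 1)
        collocates.update(t for t in tokens[lo:hi] if t != "knowledge")

print(collocates.most_common(5))
```

Even in this toy fragment, the collocates of "knowledge" are economic terms (assets, capital, returns), which is precisely the commodity-based framing the analysis above identifies in the policy corpus.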

Relevance: 10.00%

Abstract:

The Dark Ages are generally held to be a time of technological and intellectual stagnation in western development. But that is not necessarily the case. Indeed, from a certain perspective, nothing could be further from the truth. In this paper we draw historical comparisons, focusing especially on the thirteenth and fourteenth centuries, between the technological and intellectual ruptures in Europe during the Dark Ages, and those of our current period. Our analysis is framed in part by Harold Innis’s notion of "knowledge monopolies". We give an overview of how these were affected by new media, new power struggles, and new intellectual debates that emerged in thirteenth- and fourteenth-century Europe. The historical salience of our focus may seem elusive. Our world has changed so much, and history seems to be an increasingly far-from-favoured method for understanding our own period and its future potentials. Yet our seemingly distant historical focus provides some surprising insights into the social dynamics that are at work today: the fracturing of established knowledge and power bases; the democratisation of certain "sacred" forms of communication and knowledge, and, conversely, the "sacrosanct" appropriation of certain vernacular forms; challenges and innovations in social and scientific method and thought; the emergence of social world-shattering media practices; struggles over control of vast networks of media and knowledge monopolies; and the enclosure of public discursive and social spaces for singular, manipulative purposes. The period between the eleventh and fourteenth centuries in Europe prefigured what we now call the Enlightenment, perhaps more so than any other period before or after; it shaped what the Enlightenment was to become. We claim no knowledge of the future here.
But in the "post-everything" society, where history is as much up for sale as it is for argument, we argue that our historical perspective provides a useful analogy for grasping the wider trends in the political economy of media, and for recognising clear and actual threats to the future of the public sphere in supposedly democratic societies.
