991 results for Historical Methodology


Relevance:

20.00%

Publisher:

Abstract:

Relationships between LGBT people and police have been turbulent for some time now, and have been variously characterized as supportive (McGhee, 2004) and antagonistic (Radford, Betts, & Ostermeyer, 2006). These relationships were, and continue to be, influenced by a range of political, legal, cultural, and social factors. This chapter examines historical and social science accounts of LGBT-police histories to chart the peaks and troughs in these relationships. The discussion demonstrates how, in Western contexts, we oscillate between historical moments of police criminalizing homosexual perversity and contemporary landscapes of partnership between police and LGBT people. However, the chapter challenges the notion that it is possible to trace a linear progression from a painful past to a more productive present. Rather, it focuses on specific moments, marked by pain or pleasure or both, and on how these moments emerge and re-emerge to shape LGBT-police landscapes in patchy, uneven ways. The chapter concludes by noting that, although certain ideas and police practices may shift towards more progressive notions of partnership policing, the history that emerged out of mistrust and pain cannot simply be set aside.

Relevance:

20.00%

Publisher:

Abstract:

Exposure-control and case-control methodologies are common techniques for estimating crash risks; however, they require either observational data on control cases or exogenous exposure data, such as vehicle-kilometres travelled. This study proposes an alternative methodology for estimating the crash risk of road user groups, whilst controlling for exposure under a variety of roadway, traffic and environmental factors, using readily available police-reported crash data. In particular, the proposed method employs a combination of a log-linear model and the quasi-induced exposure technique to identify significant interactions among a range of roadway, environmental and traffic conditions and to estimate the associated crash risks. The proposed methodology is illustrated using a set of police-reported crash data from January 2004 to June 2009 on roadways in Queensland, Australia. Exposure-controlled crash risks of motorcyclists involved in multi-vehicle crashes at intersections were estimated under various combinations of variables such as posted speed limit, intersection control type, intersection configuration, and lighting condition. Results show that the crash risk of motorcycles at three-legged intersections is high if the posted speed limits along the approaches are greater than 60 km/h. The crash risk at three-legged intersections is also high when they are unsignalized. Dark lighting conditions appear to increase the crash risk of motorcycles at signalized intersections, but the problem of night-time conspicuity of motorcyclists at intersections is lessened on approaches with lower speed limits. This study demonstrates that the combined methodology is a promising tool for gaining new insights into the crash risks of road user groups, and that it is transferable to other road users.
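
To illustrate how a log-linear model can be combined with the quasi-induced exposure technique, the following minimal sketch fits a Poisson log-linear model to a small, invented table of crash involvements cross-classified by road user group, posted speed limit and at-fault role, and derives exposure-controlled involvement ratios. The column names, counts and statsmodels usage are illustrative assumptions, not the study's actual data or code.

```python
# Hedged sketch: quasi-induced exposure with a Poisson log-linear model.
# The data frame and column names below are hypothetical illustrations,
# not the study's Queensland crash dataset.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Crash involvements cross-classified by road user group, posted speed limit
# and role: "at_fault" involvements proxy risk, while "not_at_fault"
# (innocent) involvements proxy exposure (quasi-induced exposure).
data = pd.DataFrame({
    "user":  ["motorcycle"] * 4 + ["car"] * 4,
    "speed": ["<=60", "<=60", ">60", ">60"] * 2,
    "role":  ["at_fault", "not_at_fault"] * 4,
    "count": [30, 55, 42, 38, 310, 620, 280, 300],
})

# Log-linear model: the user x speed x role interaction tests whether
# relative crash involvement differs across speed limits.
model = smf.glm("count ~ user * speed * role",
                data=data, family=sm.families.Poisson()).fit()
print(model.summary())

# Exposure-controlled relative involvement: ratio of at-fault to
# not-at-fault counts, compared across speed limits and user groups.
tab = data.pivot_table(index=["user", "speed"], columns="role", values="count")
print(tab["at_fault"] / tab["not_at_fault"])
```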

Relevance:

20.00%

Publisher:

Abstract:

Transit passenger market segmentation enables transit operators to target different classes of transit users with customized information and services. Smart Card (SC) data from Automated Fare Collection systems facilitate the understanding of the multiday travel regularity of transit passengers, and can be used to segment them into identifiable classes with similar behaviors and needs. However, the use of SC data for market segmentation has attracted very limited attention in the literature. This paper proposes a novel methodology for mining spatial and temporal travel regularity from each individual passenger’s historical SC transactions and segmenting passengers into four classes of transit users. After reconstructing travel itineraries from historical SC transactions, the paper adopts the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm to mine the travel regularity of each SC user. The travel regularity is then used to segment SC users through an a priori market segmentation approach. The proposed methodology assists transit operators in understanding their passengers and providing them with targeted information and services.
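
A minimal sketch of the regularity-mining step, assuming scikit-learn's DBSCAN and invented journey fields, thresholds and segment labels; the paper's exact features and a-priori segmentation rules are not reproduced here.

```python
# Hedged sketch: per-passenger travel regularity from smart-card journeys via
# DBSCAN, followed by a simple segment assignment. Field names, scaling,
# thresholds and segment labels are illustrative assumptions.
import numpy as np
from sklearn.cluster import DBSCAN

def travel_regularity(journeys, eps=0.5, min_samples=3):
    """journeys: array of shape (n, 3) with columns
    (boarding hour of day, origin stop id, destination stop id)."""
    X = np.asarray(journeys, dtype=float)
    # Crude scaling so one "eps" unit spans roughly one hour and stop ids
    # only match when identical.
    X_scaled = np.column_stack([X[:, 0], X[:, 1] * 10.0, X[:, 2] * 10.0])
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(X_scaled)
    # Share of journeys falling in any dense cluster = regularity score.
    return np.mean(labels != -1)

def segment(regularity):
    # Illustrative a-priori cut-offs producing four passenger segments.
    if regularity >= 0.8:
        return "regular commuter"
    if regularity >= 0.5:
        return "habitual traveller"
    if regularity >= 0.2:
        return "occasional traveller"
    return "irregular traveller"

journeys = [(8, 12, 45), (8, 12, 45), (8, 12, 45), (17, 45, 12), (13, 3, 7)]
r = travel_regularity(journeys)
print(r, segment(r))
```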

Relevance:

20.00%

Publisher:

Abstract:

The methoxyamine group represents an ideal protecting group for the nitroxide moiety. It can be easily and selectively introduced in high yield (typically >90%) onto a range of functionalised nitroxides using FeSO4·7H2O and H2O2 in DMSO. Its removal is readily achieved under mild conditions and in high yield (70-90%) using mCPBA in a Cope-type elimination process.

Relevance:

20.00%

Publisher:

Abstract:

The story of prickly pear in Australia is usually told as a tale of triumphant scientific intervention in an environmental disaster. Instead, this article considers it as a transnational network in order to better understand the myriad elements that made this event so important. Through this methodology emerges the complex nature of prickly pear land, which included people, places, ideas, rhetoric and objects that travelled from all over the world into settler Australia.

Relevance:

20.00%

Publisher:

Abstract:

A multi-resource multi-stage scheduling methodology is developed to solve short-term open-pit mine production scheduling problems as a generic multi-resource multi-stage scheduling problem. It is modelled using essential characteristics of short-term mining production operations such as drilling, sampling, blasting and excavating under the capacity constraints of mining equipment at each processing stage. Based on an extended disjunctive graph model, a shifting-bottleneck-procedure algorithm is enhanced and applied to obtain feasible short-term open-pit mine production schedules and near-optimal solutions. The proposed methodology and its solution quality are verified and validated using a real mining case study.
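
The disjunctive-graph view can be illustrated with a minimal sketch that computes earliest start and finish times (a longest-path calculation) for blocks flowing through the drilling, sampling, blasting and excavating stages once machine sequences are fixed. It is not the enhanced shifting-bottleneck procedure itself, and the stage names, single-machine assumption and durations are invented for illustration.

```python
# Hedged sketch: critical-path timing on a fixed-sequence schedule, i.e. the
# longest-path computation over a disjunctive graph once all disjunctive arcs
# have been oriented. One machine per stage; durations are invented.
STAGES = ["drill", "sample", "blast", "excavate"]

def schedule(durations):
    """durations: list of dicts, one per block, mapping stage -> processing time.
    Blocks visit every stage in the given (fixed) order."""
    n = len(durations)
    finish = [[0.0] * len(STAGES) for _ in range(n)]
    for b in range(n):
        for s, stage in enumerate(STAGES):
            ready_block = finish[b][s - 1] if s > 0 else 0.0    # precedence arc
            ready_machine = finish[b - 1][s] if b > 0 else 0.0  # disjunctive arc
            finish[b][s] = max(ready_block, ready_machine) + durations[b][stage]
    return finish, finish[-1][-1]   # per-operation finish times, makespan

blocks = [
    {"drill": 4, "sample": 2, "blast": 1, "excavate": 6},
    {"drill": 3, "sample": 2, "blast": 1, "excavate": 5},
    {"drill": 5, "sample": 3, "blast": 2, "excavate": 4},
]
finish_times, makespan = schedule(blocks)
print("makespan:", makespan)
```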

Relevance:

20.00%

Publisher:

Abstract:

This project involved writing Turrwan (great man), a novel set in Queensland in the nineteenth century, and investigating the way historical novels portray the past. Turrwan tells the story of Tom Petrie, who was six when he arrived with his family at the notorious Moreton Bay Penal Colony in 1837. The thesis examines historical fiction as a genre, with particular focus on notions of historical authenticity. It analyses the complexities involved in a non-Indigenous person writing about Australian Aboriginal people, and reflects on the process of researching, planning and writing a historical novel.

Relevance:

20.00%

Publisher:

Abstract:

The terrorist attacks in the United States on September 11, 2001 appeared to be a harbinger of increased terrorism and violence in the 21st century, bringing terrorism and political violence to the forefront of public discussion. Questions about these events abound, and “Estimating the Historical and Future Probabilities of Large Scale Terrorist Events” [Clauset and Woodard (2013)] asks specifically, “how rare are large scale terrorist events?” and, more generally, encourages discussion of the role of quantitative methods in terrorism research and in policy and decision-making. Answering the primary question raises two challenges. The first is identifying terrorist events. The second is finding a simple yet robust model for rare events that has good explanatory and predictive capabilities. The challenge of identifying terrorist events is acknowledged and addressed by reviewing and using data from two well-known and reputable sources: the Memorial Institute for the Prevention of Terrorism-RAND database (MIPT-RAND) [Memorial Institute for the Prevention of Terrorism] and the Global Terrorism Database (GTD) [National Consortium for the Study of Terrorism and Responses to Terrorism (START) (2012), LaFree and Dugan (2007)]. Clauset and Woodard (2013) provide a detailed discussion of the limitations of the data and the models used, in the context of the larger issues surrounding terrorism and policy.
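
As a rough illustration of the kind of rare-event modelling discussed, the sketch below fits a power-law tail to an invented list of event severities by maximum likelihood and evaluates the tail probability of a very large event; the threshold xmin is assumed known rather than estimated, and the numbers are not drawn from MIPT-RAND or the GTD.

```python
# Hedged sketch: power-law tail fit and tail probability for rare, severe
# events, in the spirit of the modelling discussed by Clauset and Woodard
# (2013). Severities below are invented for illustration.
import math

def fit_powerlaw_alpha(severities, xmin):
    """Maximum-likelihood exponent for a continuous power-law tail."""
    tail = [x for x in severities if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

def tail_probability(x, alpha, xmin):
    """P(X >= x) under the fitted power law, for x >= xmin."""
    return (x / xmin) ** (1.0 - alpha)

severities = [12, 15, 22, 30, 41, 60, 85, 120, 200, 510, 900, 2995]
alpha = fit_powerlaw_alpha(severities, xmin=10)
print("alpha:", round(alpha, 2))
print("P(severity >= 2500):", tail_probability(2500, alpha, xmin=10))
```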

Relevance:

20.00%

Publisher:

Abstract:

The analysis of content and metadata has long been the subject of most Twitter studies; however, such research tells only part of the story of the development of Twitter as a platform. In this work, we introduce a methodology to determine the growth patterns of individual users of the platform, a technique we refer to as follower accession, and through a number of case studies consider the factors which lead to follower growth and the identification of non-authentic followers. Finally, we consider what such an approach tells us about the history of the platform itself, and the way in which changes to the new-user signup process have affected users.
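
A minimal sketch of the follower-accession idea, under the assumptions that a follower list is returned in reverse-chronological follow order and that each follower's account creation date is available; both assumptions are mine for illustration, not details confirmed by the paper.

```python
# Hedged sketch: walking a follower list from oldest to newest follow and
# taking a running maximum of account-creation dates gives a lower bound on
# when each follow could have occurred. Data below are invented.
from datetime import date

def accession_dates(followers):
    """followers: list of (follower_id, account_created) tuples, ordered
    most-recent follow first, as a hypothetical API might return them."""
    bounds, running_max = [], date.min
    for follower_id, created in reversed(followers):
        running_max = max(running_max, created)  # a follow cannot predate the account
        bounds.append((follower_id, running_max))
    return bounds  # (follower_id, earliest possible follow date), oldest follow first

followers = [
    (104, date(2014, 6, 1)),   # newest follower
    (103, date(2009, 3, 2)),
    (102, date(2013, 1, 15)),
    (101, date(2008, 7, 20)),  # oldest follower
]
for fid, bound in accession_dates(followers):
    print(fid, "followed no earlier than", bound)
```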

Relevance:

20.00%

Publisher:

Abstract:

Background: Timely diagnosis and reporting of patient symptoms in hospital emergency departments (ED) is a critical component of health services delivery. However, due to dispersed information resources and a vast amount of manual processing of unstructured information, accurate point-of-care diagnosis is often difficult. Aims: The aim of this research is to report an initial experimental evaluation of a clinician-informed automated method addressing initial misdiagnoses associated with delayed receipt of unstructured radiology reports. Method: A method was developed that resembles clinical reasoning for identifying limb abnormalities. It consists of a gazetteer of keywords related to radiological findings and classifies an X-ray report as abnormal if it contains evidence from the gazetteer. A set of 99 narrative reports of radiological findings was sourced from a tertiary hospital. Reports were manually assessed by two clinicians and discrepancies were validated by a third, expert ED clinician; the final manual classification generated by the expert ED clinician was used as ground truth to empirically evaluate the approach. Results: The automated method, which attempts to identify limb abnormalities by searching for keywords expressed by clinicians, achieved an F-measure of 0.80 and an accuracy of 0.80. Conclusion: While the automated clinician-driven method achieved promising performance, a number of avenues for improvement were identified, including the use of advanced natural language processing (NLP) and machine learning techniques.
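
A minimal sketch of a gazetteer-based classifier and its evaluation, with an invented keyword list and toy reports standing in for the clinician-elicited gazetteer and the 99 hospital reports.

```python
# Hedged sketch: flag a free-text X-ray report as abnormal if it contains any
# gazetteer keyword, then score predictions against a manual gold standard.
GAZETTEER = {"fracture", "dislocation", "effusion", "avulsion", "lucency"}

def classify(report):
    """Abnormal if any gazetteer keyword appears in the report text."""
    words = set(report.lower().replace(",", " ").replace(".", " ").split())
    return "abnormal" if words & GAZETTEER else "normal"

def evaluate(reports, gold):
    predicted = [classify(r) for r in reports]
    tp = sum(p == g == "abnormal" for p, g in zip(predicted, gold))
    fp = sum(p == "abnormal" and g == "normal" for p, g in zip(predicted, gold))
    fn = sum(p == "normal" and g == "abnormal" for p, g in zip(predicted, gold))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    accuracy = sum(p == g for p, g in zip(predicted, gold)) / len(gold)
    return f1, accuracy

reports = ["Undisplaced fracture of the distal radius.",
           "No bony abnormality detected.",
           "Small joint effusion, no fracture seen."]
gold = ["abnormal", "normal", "abnormal"]
print(evaluate(reports, gold))  # (F-measure, accuracy)
```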

Relevance:

20.00%

Publisher:

Abstract:

The aim of this ongoing research is to interrogate the era of colonialism in Australia (1896-1966) and the denial of paid employment to Aboriginal women. The Aboriginals Protection and Restriction of the Sale of Opium Act 1897 saw thousands of Aboriginal people placed on Government-run reserves and missions, resulting in all aspects of their lives being controlled through state mechanisms. Under various Acts of Parliament, Aboriginal women were sent to privately owned properties to be utilised as ‘domestic servants’ through a system of forced indentured labour, which continued until the 1970s. This paper discusses the hidden histories of these women through the use of primary source documents, including records from the Australian Department of Native Affairs and the Department of Home and Health. This social history research reveals that removing Aboriginal women from their families at the age of 12 or 13 and sending them to white families was more often the rule than the exception. These women were often not paid, worked up to 15-hour days, were not allowed leave, and were subjected to many forms of abuse. Wages that were meant to be paid were redirected to others, including the Government. While the retrieval of these ‘stolen wages’ is now an ongoing issue, with the Queensland Government in 2002 offering AUD 2,000 to 4,000 in compensation for a lifetime of work, Aboriginal women were also asked to waive their legal right to further compensation. There are few documented histories of these Aboriginal women as told through the archives. This hidden history of Aboriginal Australian women needs to be revealed to better understand their experiences and the depth of the misappropriation of Aboriginal women as domestic workers. In doing so, it also reveals a more accurate reflection of women’s work in Australia.

Relevance:

20.00%

Publisher:

Abstract:

As the first academically rigorous interrogation of the generation of performance within the global frame of the motion capture volume, this research presents a historical contextualisation and develops and tests a set of first principles, through an original series of theoretically informed practical exercises, to guide those working in the emergent space of performance capture. It contributes a new understanding of the framing of performance in The Omniscient Frame, and initiates and positions performance capture as a new and distinct interdisciplinary discourse in the fields of theatre, animation, performance studies and film.

Relevance:

20.00%

Publisher:

Abstract:

Most existing motorway traffic safety studies using disaggregate traffic flow data aim to develop models for identifying real-time traffic risks by comparing pre-crash and non-crash conditions. A serious shortcoming of those studies is that non-crash conditions are arbitrarily selected and hence not representative: the selected non-crash data might not be properly comparable with the pre-crash data, and the non-crash/pre-crash ratio is arbitrarily chosen, neglecting the abundance of non-crash over pre-crash conditions. Here, we present a methodology for developing a real-time MotorwaY Traffic Risk Identification Model (MyTRIM) using individual vehicle data, meteorological data, and crash data. Non-crash data are clustered into groups called traffic regimes. Thereafter, pre-crash data are classified into regimes to match with the relevant non-crash data. Of the eight traffic regimes obtained, four were identified as highly risky, and three regime-based Risk Identification Models (RIMs) were developed for the regimes with sufficient pre-crash data. MyTRIM memorizes the latest risk evolution identified by the RIMs to predict near-future risks. Traffic practitioners can choose MyTRIM’s memory size based on the trade-off between detection and false alarm rates: decreasing the memory size from 5 to 1 increases the detection rate from 65.0% to 100.0% and the false alarm rate from 0.21% to 3.68%. Moreover, critical factors in differentiating pre-crash and non-crash conditions are identified and can be used to develop preventive measures. MyTRIM can be used by practitioners in real time as an independent tool for online decision-making or integrated with existing traffic management systems.
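
A minimal sketch of the memory mechanism, with invented regime rules and thresholds standing in for the clustering step and the regime-based RIMs.

```python
# Hedged sketch of a memory-based risk decision: keep the last `memory_size`
# regime-based risk flags and raise an alarm only if all of them signal risk,
# so a smaller memory reacts faster (higher detection, more false alarms).
# The regime classifier and per-regime risk rules are invented stand-ins.
from collections import deque

def classify_regime(obs):
    # Invented rule: high-occupancy, low-speed traffic is the riskier regime.
    return "congested" if obs["occupancy"] > 0.25 and obs["speed"] < 60 else "free_flow"

def regime_risk(regime, obs):
    # Stand-in for a regime-based Risk Identification Model (RIM): True when
    # current conditions resemble pre-crash conditions for that regime.
    if regime == "congested":
        return obs["speed_variance"] > 80
    return obs["speed_variance"] > 150

def risk_alarms(observations, memory_size=3):
    memory = deque(maxlen=memory_size)
    alarms = []
    for obs in observations:
        memory.append(regime_risk(classify_regime(obs), obs))
        alarms.append(len(memory) == memory_size and all(memory))
    return alarms

stream = [
    {"occupancy": 0.30, "speed": 45, "speed_variance": 90},
    {"occupancy": 0.32, "speed": 42, "speed_variance": 110},
    {"occupancy": 0.31, "speed": 40, "speed_variance": 95},
    {"occupancy": 0.10, "speed": 95, "speed_variance": 60},
]
print(risk_alarms(stream, memory_size=3))  # [False, False, True, False]
```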

Relevance:

20.00%

Publisher:

Abstract:

Background: Heatwaves can cause excess deaths ranging from tens to thousands in a local area within a couple of weeks. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality expected under ‘normal’ conditions from the historical daily mortality records. Calculating the excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changes in the mean level (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series data in the field of signal processing; however, it has not been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data. Methods: Special R functions were developed to implement the HHT algorithm to decompose the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series. Results: The Brisbane (Queensland, Australia) and Chicago (United States) daily mortality time series data were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue; it estimated 510 excess deaths for that event. To exemplify potential applications, the HHT decomposition results were used as input data for a subsequent regression analysis, using the Brisbane data, to investigate the association between excess mortality and different risk factors. Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a non-linear and non-stationary time series into trend and non-trend components consistently and efficiently.
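
Although the study implemented the HHT in special R functions, the idea of computing excess mortality from the non-trend components can be sketched in Python using the third-party PyEMD package (installed as EMD-signal). The synthetic mortality series, the heatwave window and the assumption that the last component returned by emd() is the residual trend are all illustrative, not the paper's setup.

```python
# Hedged sketch: heatwave excess deaths from the non-trend part of a daily
# mortality series, using empirical mode decomposition as a stand-in for the
# study's R implementation of the HHT. All data below are synthetic.
import numpy as np
from PyEMD import EMD

rng = np.random.default_rng(0)
days = np.arange(730)
# Synthetic daily deaths: slowly rising trend + noise + a one-week heatwave spike.
deaths = 40 + 0.01 * days + rng.normal(0, 2, days.size)
heatwave = slice(400, 407)                       # hypothetical heatwave week
deaths[heatwave] += np.array([3, 8, 14, 18, 12, 6, 2], dtype=float)

imfs = EMD().emd(deaths)           # rows: oscillatory components; last row ~ trend
non_trend = imfs[:-1].sum(axis=0)  # fluctuations around the long-term trend

# Excess deaths = sum of the non-trend component over the heatwave window.
print("estimated excess deaths:", round(non_trend[heatwave].sum(), 1))
```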