494 results for Quit Attempt Methods
ADI-Euler and extrapolation methods for the two-dimensional fractional advection-dispersion equation
Abstract:
Numerous expert elicitation methods have been suggested for generalised linear models (GLMs). This paper compares three relatively new approaches to eliciting expert knowledge in a form suitable for Bayesian logistic regression. These methods were trialled on two experts in order to model the habitat suitability of the threatened Australian brush-tailed rock-wallaby (Petrogale penicillata). The first elicitation approach is a geographically assisted indirect predictive method with a geographic information system (GIS) interface. The second approach is a predictive indirect method which uses an interactive graphical tool. The third method uses a questionnaire to elicit expert knowledge directly about the impact of a habitat variable on the response. Two variables (slope and aspect) are used to examine prior and posterior distributions of the three methods. The results indicate that there are some similarities and dissimilarities between the expert-informed priors of the two experts formulated from the different approaches. The choice of elicitation method depends on the statistical knowledge of the expert, their mapping skills, time constraints, accessibility to experts and funding available. This trial reveals that expert knowledge can be important when modelling rare event data, such as threatened species, because experts can provide additional information that may not be represented in the dataset. However, care must be taken with the way in which this information is elicited and formulated.
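To make the modelling setting concrete, the sketch below shows how an elicited prior enters a Bayesian logistic regression in the simplest possible case: a single habitat coefficient with a Normal prior summarising the expert's belief, and a crude grid-search MAP estimator. The data, prior parameters, and estimator are all illustrative, not the paper's actual elicitation machinery.

```python
import math

def log_posterior(beta, x, y, prior_mean, prior_sd):
    """Log-posterior for a one-coefficient logistic model with an elicited
    Normal(prior_mean, prior_sd) prior on the slope (no intercept, for brevity)."""
    lp = -0.5 * ((beta - prior_mean) / prior_sd) ** 2  # Normal prior, up to a constant
    for xi, yi in zip(x, y):
        p = 1 / (1 + math.exp(-beta * xi))  # logistic link
        lp += yi * math.log(p) + (1 - yi) * math.log(1 - p)
    return lp

def map_estimate(x, y, prior_mean, prior_sd, grid=None):
    """Crude MAP estimate by grid search (adequate for a one-dimensional sketch)."""
    if grid is None:
        grid = [i / 100 for i in range(-500, 501)]
    return max(grid, key=lambda b: log_posterior(b, x, y, prior_mean, prior_sd))
```

With a vague prior the data dominate; with a tight prior centred at zero the elicited belief pulls the estimate toward zero, which is the kind of prior-versus-posterior behaviour the paper examines for the slope and aspect variables.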
Abstract:
A new steady state method for determination of the electron diffusion length in dye-sensitized solar cells (DSCs) is described and illustrated with data obtained using cells containing three different types of electrolyte. The method is based on using near-IR absorbance methods to establish pairs of illumination intensities for which the total number of trapped electrons is the same at open circuit (where all electrons are lost by interfacial electron transfer) as at short circuit (where the majority of electrons are collected at the contact). Electron diffusion length values obtained by this method are compared with values derived by intensity-modulated methods and by impedance measurements under illumination. The results indicate that the values of electron diffusion length derived from the steady state measurements are consistently lower than the values obtained by the non-steady-state methods. For all three electrolytes used in the study, the electron diffusion length was sufficiently high to guarantee electron collection efficiencies greater than 90%. Measurement of the trap distributions by near-IR absorption confirmed earlier observations of much higher electron trap densities for electrolytes containing Li+ ions. It is suggested that the electron trap distributions may not be intrinsic properties of the TiO2 nanoparticles, but may be associated with electron-ion interactions.
Abstract:
Background: Work-related injuries in Australia are estimated to cost around $57.5 billion annually; however, there are currently insufficient surveillance data available to support an evidence-based public health response. Emergency departments (EDs) in Australia are a potential source of information on work-related injuries, though most EDs do not have an ‘Activity Code’ to identify work-related cases, with information about the presenting problem recorded in a short free-text field. This study compared methods for interrogating text fields to identify work-related injuries presenting at emergency departments, to inform approaches to surveillance of work-related injury.
Methods: Three approaches were used to interrogate an injury description text field to classify cases as work-related: keyword search, index search, and content analytic text mining. Sensitivity and specificity were examined by comparing cases flagged by each approach to cases coded with an Activity code during triage. Methods to improve the sensitivity and/or specificity of each approach were explored by adjusting the classification techniques within each broad approach.
Results: The basic keyword search detected 58% of cases (specificity 0.99), an index search detected 62% of cases (specificity 0.87), and the content analytic text mining (using adjusted probabilities) approach detected 77% of cases (specificity 0.95).
Conclusions: The findings of this study provide strong support for continued development of text searching methods to obtain information from routine emergency department data, to improve the capacity for comprehensive injury surveillance.
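The simplest of the three approaches, the keyword search, can be sketched as follows: flag a case as work-related if its free-text injury description contains any term from a keyword list, then score the flags against the Activity-coded labels. The keyword list and toy records below are hypothetical, not the study's actual search terms or data.

```python
# Hypothetical keyword list; the study's actual terms are not reproduced here.
KEYWORDS = ("work", "workplace", "employer", "on duty", "forklift")

def flag_work_related(description: str) -> bool:
    """Classify a free-text injury description by simple substring matching."""
    text = description.lower()
    return any(kw in text for kw in KEYWORDS)

def sensitivity_specificity(records):
    """records: iterable of (description, is_work_related_per_activity_code).
    Returns (sensitivity, specificity) of the keyword flag against the labels."""
    tp = fp = tn = fn = 0
    for description, label in records:
        flagged = flag_work_related(description)
        if flagged and label:
            tp += 1
        elif flagged and not label:
            fp += 1
        elif not flagged and label:
            fn += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity
```

The pattern the study reports falls out naturally: a narrow keyword list misses paraphrased work mentions (lower sensitivity) while rarely flagging non-work cases (high specificity).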
Abstract:
We study the suggestion that Markov switching (MS) models should be used to determine cyclical turning points. A Kalman filter approximation is used to derive the dating rules implicit in such models. We compare these with dating rules in an algorithm that provides a good approximation to the chronology determined by the NBER. We find that there is very little that is attractive in the MS approach when compared with this algorithm. The most important difference relates to robustness. The MS approach depends on the validity of that statistical model. Our approach is valid in a wider range of circumstances.
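The paper's dating algorithm is not reproduced here, but the flavour of a rule-based approach can be sketched: flag a point as a peak if it is the strict maximum of the series within a moving window, and as a trough if it is the strict minimum. The window half-width and the strictness condition are illustrative choices in the spirit of NBER-style dating rules, not the authors' actual algorithm.

```python
def turning_points(y, k=2):
    """Flag index t as a peak if y[t] is the unique maximum of y[t-k : t+k+1],
    and as a trough if it is the unique minimum. k is the window half-width."""
    peaks, troughs = [], []
    for t in range(k, len(y) - k):
        window = y[t - k : t + k + 1]
        if y[t] == max(window) and window.count(y[t]) == 1:
            peaks.append(t)
        elif y[t] == min(window) and window.count(y[t]) == 1:
            troughs.append(t)
    return peaks, troughs
```

A rule of this kind is robust in the sense the paper emphasises: it makes no assumption about the statistical model generating the series, unlike the Markov switching approach.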
Abstract:
Road curves are an important feature of road infrastructure and many serious crashes occur on road curves. In Queensland, the number of fatalities on curves is twice that on straight roads. Therefore, there is a need to reduce drivers’ exposure to crash risk on road curves. Road crashes in Australia and in the Organisation for Economic Co-operation and Development (OECD) have plateaued in the last five years (2004 to 2008) and the road safety community is desperately seeking innovative interventions to reduce the number of crashes. However, designing an innovative and effective intervention may prove to be difficult as it relies on providing theoretical foundation, coherence, understanding, and structure to both the design and validation of the efficiency of the new intervention. Researchers from multiple disciplines have developed various models to determine the contributing factors for crashes on road curves with a view towards reducing the crash rate. However, most of the existing methods are based on statistical analysis of contributing factors described in government crash reports. In order to further explore the contributing factors related to crashes on road curves, this thesis designs a novel method to analyse and validate these contributing factors. The use of crash claim reports from an insurance company is proposed for analysis using data mining techniques. To the best of our knowledge, this is the first attempt to use data mining techniques to analyse crashes on road curves. Text mining is employed because the reports consist of thousands of textual descriptions, from which the contributing factors can be identified. Beyond identifying the contributing factors, few studies to date have investigated the relationships between these factors, especially for crashes on road curves. Thus, this study proposes the use of the rough set analysis technique to determine these relationships.
The results from this analysis are used to assess the effect of these contributing factors on crash severity. The findings obtained through the use of data mining techniques presented in this thesis have been found to be consistent with existing identified contributing factors. Furthermore, this thesis has identified new contributing factors towards crashes and the relationships between them. A significant pattern related to crash severity is the time of day: severe road crashes occur more frequently in the evening or at night. Tree collision is another common pattern, where crashes that occur in the morning and involve hitting a tree are likely to have a higher crash severity. Another factor that influences crash severity is the age of the driver. Most age groups face a high crash severity except for drivers between 60 and 100 years old, who have the lowest crash severity. The significant relationship identified between contributing factors consists of the time of the crash, the manufacture year of the vehicle, the age of the driver and hitting a tree. Having identified new contributing factors and relationships, a validation process is carried out using a traffic simulator in order to determine their accuracy. The validation process indicates that the results are accurate. This demonstrates that data mining techniques are a powerful tool in road safety research, and can be usefully applied within the Intelligent Transport System (ITS) domain. The research presented in this thesis provides an insight into the complexity of crashes on road curves. The findings of this research have important implications for both practitioners and academics. For road safety practitioners, the results from this research illustrate practical benefits for the design of interventions for road curves that will potentially help in decreasing related injuries and fatalities.
For academics, this research opens up a new research methodology for assessing the severity of road crashes on curves.
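The rough set machinery the thesis applies can be illustrated with its core construction: the lower and upper approximations of a target set (say, the set of severe crashes) under an indiscernibility relation induced by the recorded attributes. Objects in the lower approximation certainly belong to the target given the attributes; objects in the upper approximation possibly do. The toy universe and equivalence classes below are hypothetical.

```python
def rough_approximations(universe, equivalence, target):
    """Lower/upper approximations of `target` under an indiscernibility relation.
    equivalence: dict mapping each object to the key of its equivalence class
    (objects with identical attribute values share a key)."""
    classes = {}
    for obj in universe:
        classes.setdefault(equivalence[obj], set()).add(obj)
    lower, upper = set(), set()
    for cls in classes.values():
        if cls <= target:       # class entirely inside the target: certain members
            lower |= cls
        if cls & target:        # class overlapping the target: possible members
            upper |= cls
    return lower, upper
```

The gap between the two approximations (the boundary region) is where the attributes fail to discriminate severity, which is exactly the kind of relationship between contributing factors the thesis sets out to characterise.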
Abstract:
Aims: To describe a local data linkage project to match hospital data with the Australian Institute of Health and Welfare (AIHW) National Death Index (NDI) to assess long-term outcomes of intensive care unit patients. Methods: Data were obtained from hospital intensive care and cardiac surgery databases on all patients aged 18 years and over admitted to either of two intensive care units at a tertiary-referral hospital between 1 January 1994 and 31 December 2005. Date of death was obtained from the AIHW NDI by probabilistic software matching, in addition to manual checking through hospital databases and other sources. Survival was calculated from time of ICU admission, with a censoring date of 14 February 2007. Data for patients with multiple hospital admissions requiring intensive care were analysed only from the first admission. Summary and descriptive statistics were used for preliminary data analysis. Kaplan-Meier survival analysis was used to analyse factors determining long-term survival. Results: During the study period, 21 415 unique patients had 22 552 hospital admissions that included an ICU admission; 19 058 surgical procedures were performed with a total of 20 092 ICU admissions. There were 4936 deaths. Median follow-up was 6.2 years, totalling 134 203 patient years. The casemix was predominantly cardiac surgery (80%), followed by cardiac medical (6%), and other medical (4%). The unadjusted survival at 1, 5 and 10 years was 97%, 84% and 70%, respectively. The 1-year survival ranged from 97% for cardiac surgery to 36% for cardiac arrest. An APACHE II score was available for 16 877 patients. In those discharged alive from hospital, the 1, 5 and 10-year survival varied with discharge location. Conclusions: ICU-based linkage projects are feasible to determine long-term outcomes of ICU patients.
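The Kaplan-Meier estimator used in the study's survival analysis can be sketched in a few lines: at each distinct event time, multiply the running survival probability by the fraction of at-risk patients who survive that time, with censored patients (those alive at the censoring date) leaving the risk set without an event. The toy data below are illustrative, not the study's.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times:  follow-up time for each patient
    events: 1 if death was observed at that time, 0 if the patient was censored.
    Returns a list of (time, survival probability) at each event time."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n = at_risk                      # number at risk just before time t
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]   # count deaths; censored cases add 0
            at_risk -= 1                 # everyone at time t leaves the risk set
            i += 1
        if deaths:
            survival *= 1 - deaths / n   # product-limit step
            curve.append((t, survival))
    return curve
```

Reading the unadjusted survival figures off such a curve at 1, 5 and 10 years is what yields summaries like the 97%, 84% and 70% reported above.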
Abstract:
The call to innovate is ubiquitous across the Australian educational policy context. The claims of innovative practices and environments that occur frequently in university mission statements, strategic plans and marketing literature suggest that this exhortation to innovate appears to have been taken up enthusiastically by the university sector. Throughout the history of universities, a range of reported deficiencies of higher education have worked to produce a notion of crisis. At present, it would seem that innovation is positioned as the solution to the notion of crisis. This thesis is an inquiry into how the insistence on innovation works to both enable and constrain teaching and learning practices in Australian universities. Alongside the interplay between innovation and crisis is the link between resistance and innovation, a link which remains largely unproblematized in the scholarly literature. This thesis works to locate and unsettle understandings of a relationship between innovation and Australian higher education. The aim of this inquiry is to generate new understandings of what counts as innovation within this context and how innovation is enacted. The thesis draws on a number of postmodernist theorists, whose works have informed firstly the research method, and then the analysis and findings. Firstly, there is an assumption that power is capillary and works through discourse to enact power relations which shape certain truths (Foucault, 1990). Secondly, this research scrutinised language practices which frame the capacity for individuals to act, alongside the language practices which encourage an individual to adopt certain attitudes and actions as one’s own (Foucault, 1988). Thirdly, innovation talk is read in this thesis as an example of needs talk, that is, as a medium through which what is considered domestic, political or economic is made and contested (Fraser, 1989). 
Fourthly, relationships between and within discourses were identified and analysed beyond cause and effect descriptions, and more productively considered to be in a constant state of becoming (Deleuze, 1987). Finally, the use of ironic research methods assisted in producing alternate configurations of innovation talk which are useful and new (Rorty, 1989). The theoretical assumptions which underpin this thesis inform a document analysis methodology, used to examine how certain texts work to shape the ways in which innovation is constructed. The data consisted of three Federal higher education funding policies selected on the rationale that these documents, as opposed to state or locally based policy and legislation, represent the only shared policy context for all Australian universities. The analysis first provided a modernist reading of the three documents, and this was followed by postmodernist readings of these same policy documents. The modernist reading worked to locate and describe the current truths about innovation. The historical context in which the policy was produced as well as the textual features of the document itself were important to this reading. In the first modernist reading, the binaries involved in producing proper and improper notions of innovation were described and analysed. In the process of the modernist analysis and the subsequent location of binary organisation, a number of conceptual collisions were identified, and these sites of struggle were revisited, through the application of a postmodernist reading. By applying the theories of Rorty (1989) and Fraser (1989) it became possible to not treat these sites as contradictory and requiring resolution, but rather as spaces in which binary tensions are necessary and productive. This postmodernist reading constructed new spaces for refusing and resisting dominant discourses of innovation which value only certain kinds of teaching and learning practices. 
By exploring a number of ironic language practices found within the policies, this thesis proposes an alternative way of thinking about what counts as innovation and how it happens. The new readings of innovation made possible through the work of this thesis were in response to a suite of enduring, inter-related questions: what counts as innovation? Who or what supports innovation? How does innovation occur? And who are the innovators? The truths presented in response to these questions were treated as the language practices which constitute a dominant discourse of innovation talk. The collisions that occur within these truths were the contested sites which were of most interest for the analysis. The thesis concludes by presenting a theoretical blueprint which works to shift the boundaries of what counts as innovation and how it happens in a manner which is productive, inclusive and powerful. This blueprint forms the foundation upon which a number of recommendations are made for both my own professional practice and broader contexts. In keeping with the conceptual tone of this study, these recommendations are a suite of new questions which focus attention on the boundaries of innovation talk as an attempt to re-configure what is valued about teaching and learning at university.
Abstract:
This paper discusses the choice to use two less conventional or “interesting” research methods, Q Methodology and Experience Sampling Method, rather than “status quo” research methods so common in the marketing discipline. It is argued that such methods have value for marketing academics because they widen the potential for discovery. The paper outlines these two research methods, providing examples of how they have been used in an experiential consumption perspective. Additionally, the paper identifies some of the challenges to be faced when trying to publish research that uses such less conventional methods, as well as offering suggestions to address them.
Abstract:
PURPOSE: We report our telephone-based system for selecting community control series appropriate for a complete Australia-wide series of Ewing's sarcoma cases. METHODS: We used electronic directory random sampling to select age-matched controls. The sampling frame comprised all listed telephone numbers on an updated CD-ROM. RESULTS: 95% of 2245 telephone numbers selected were successfully contacted. The mean number of attempts needed was 1.94, with 58% answering at the first attempt. On average, we needed 4.5 contacts per control selected. Calls were more likely to be successful (reach a respondent) when made in the evening (except Saturdays). The overall response rate among contacted telephone numbers was 92.8%. Participation rates among female and male respondents were practically the same. The exclusion of unlisted numbers (13.5% of connected households) and unconnected households (3.7%) led to potential selection bias. However, restricting the case series to listed cases only, plus having external information on the direction of potential bias, allows meaningful interpretation of our data. CONCLUSION: Sampling from an electronic directory is convenient, economical and simple, and gives a very good yield of eligible subjects compared to other methods.
Abstract:
The aim of this chapter is to provide you with a basic understanding of epidemiology, and to introduce you to some of the epidemiological concepts and methods used by researchers and practitioners working in public health. It is hoped that you will recognise how the principles and practice of epidemiology help to provide information and insights that can be used to achieve better health outcomes for all. Epidemiology is fundamental to preventive medicine and public health policy. Rather than examine health and illness on an individual level, as clinicians do, epidemiologists focus on communities and population health issues. The word epidemiology is derived from the Greek epi (on, upon), demos (the people) and logos (the study of). Epidemiology, then, is the study of that which falls upon the people. Its aims are to describe health-related states or events and, through systematic examination of the available information, attempt to determine their causes. The ultimate goal is to contribute to the prevention of disease and disability and to delay mortality. The primary question of epidemiology is: why do certain diseases affect particular population groups? Drawing upon statistics, the social and behavioural sciences, the biological sciences and medicine, epidemiologists collect and interpret information to assist in preventing new cases of disease, eradicating existing disease and prolonging the lives of people who have disease.