226 results for analytic semigroups
Abstract:
In this paper, we consider the numerical solution of a fractional partial differential equation with Riesz space fractional derivatives (FPDE-RSFD) on a finite domain. Two types of FPDE-RSFD are considered: the Riesz fractional diffusion equation (RFDE) and the Riesz fractional advection–dispersion equation (RFADE). The RFDE is obtained from the standard diffusion equation by replacing the second-order space derivative with the Riesz fractional derivative of order α ∈ (1, 2]. The RFADE is obtained from the standard advection–dispersion equation by replacing the first-order and second-order space derivatives with the Riesz fractional derivatives of order β ∈ (0, 1) and of order α ∈ (1, 2], respectively. Firstly, analytic solutions of both the RFDE and RFADE are derived. Secondly, three numerical methods are provided to deal with the Riesz space fractional derivatives, namely, the L1/L2-approximation method, the standard/shifted Grünwald method, and the matrix transform method (MTM). Thirdly, the RFDE and RFADE are transformed into a system of ordinary differential equations, which is then solved by the method of lines. Finally, numerical results are given, which demonstrate the effectiveness and convergence of the three numerical methods.
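Of the three discretisations named above, the shifted Grünwald method lends itself to a compact sketch. The following minimal Python illustration assembles the shifted Grünwald matrix for the Riesz derivative (using the recurrence for the weights g_k = (-1)^k·C(α, k)) and steps the resulting method-of-lines system with explicit Euler. The grid size, diffusion coefficient K, time step, and zero Dirichlet boundaries are illustrative assumptions, not parameters from the paper.

```python
import numpy as np

def grunwald_weights(alpha, n):
    # g_0 = 1, g_k = g_{k-1} * (1 - (alpha + 1) / k)  [equals (-1)^k C(alpha, k)]
    g = np.empty(n)
    g[0] = 1.0
    for k in range(1, n):
        g[k] = g[k - 1] * (1.0 - (alpha + 1.0) / k)
    return g

def riesz_matrix(alpha, n, h):
    """Shifted Grünwald approximation of the Riesz fractional derivative of
    order alpha on n interior points; zero Dirichlet boundaries assumed."""
    c = -1.0 / (2.0 * np.cos(np.pi * alpha / 2.0))  # Riesz prefactor, alpha in (1, 2]
    g = grunwald_weights(alpha, n + 2)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i - j + 1 >= 0:          # left-sided operator, shift 1
                A[i, j] += g[i - j + 1]
            if j - i + 1 >= 0:          # right-sided operator, shift 1
                A[i, j] += g[j - i + 1]
    return c * A / h**alpha

# Method of lines for u_t = K * d^alpha u / d|x|^alpha, explicit Euler stepping:
n, h, alpha, K = 50, 1.0 / 51, 1.8, 0.25
A = riesz_matrix(alpha, n, h)
u = np.sin(np.pi * np.linspace(h, 1.0 - h, n))   # illustrative initial condition
dt = 0.1 * h**alpha                              # small step for stability
for _ in range(100):
    u = u + dt * K * (A @ u)
```

As a sanity check, at α = 2 the weights reduce to (1, -2, 1) and the matrix collapses to the standard second-difference Laplacian, consistent with the RFDE reducing to classical diffusion.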
Abstract:
We propose that a general analytic framework for cultural science can be constructed as a generalization of the generic micro-meso-macro framework proposed by Dopfer and Potts (2008). This paper outlines this argument along with some implications for the creative industries research agenda.
Abstract:
This paper discusses a method, Generation in Context, for interrogating theories of music analysis and music perception. Given an analytic theory, the method consists of creating a generative process that implements the theory in reverse. Instead of using the theory to create analyses from scores, the theory is used to generate scores from analyses. Subjective evaluation of the quality of the musical output provides a mechanism for testing the theory in a contextually robust fashion. The method is exploratory, meaning that in addition to testing extant theories it provides a general mechanism for generating new theoretical insights. We outline our initial explorations in the use of generative processes for music research, and we discuss how generative processes provide evidence as to the veracity of theories about how music is experienced, with insights into how these theories may be improved and, concurrently, provide new techniques for music creation. We conclude that Generation in Context will help reveal new perspectives on our understanding of music.
Abstract:
Organizations invest heavily in Customer Relationship Management (CRM) and Supply Chain Management (SCM) systems, and their related infrastructure, presumably expecting positive benefits to the organization. Assessing the benefits of such applications is an important aspect of managing such systems. Given the salient differences between CRM and SCM applications and other intra-organizational applications, existing Information Systems benefits measurement models and frameworks are ill-suited to gauge the benefits of inter-organizational systems. This paper reports the preliminary findings of a measurement model developed to assess the benefits of CRM and SCM applications. The preliminary model, which reflects the characteristics of the Analytic Theory, is derived from a review of 55 academic studies and 44 papers from practice. Six hundred and six identified benefits were then synthesized into 74 non-overlapping benefits, arranged under six dimensions.
Abstract:
This thesis argues that the end of Soviet Marxism and a bipolar global political imaginary at the dissolution of the short Twentieth Century poses an obstacle for anti-systemic political action. Such a blockage of alternate political imaginaries can be discerned by reading the work of Francis Fukuyama and "Endism" as performative invocations of the closure of political alternatives, and thus as an ideological proclamation which enables and constrains forms of social action. It is contended that the search through dialectical thought for a competing universal to posit against "liberal democracy" is a fruitless one, because it reinscribes the terms of teleological theories of history which work to effect closure. Rather, constructing a phenomenological analytic of the political conjuncture, the thesis suggests that the figure of messianism without a Messiah is central to a deconstructive reframing of the possibilities of political action - a reframing attentive to the rhetorical tone of texts. The project of recovering the political is viewed through a phenomenological lens. An agonistic political distinction must be made so as to memorialise the remainders and ghosts of progress, and thus to gesture towards an indeconstructible justice which would serve as a horizon for the articulation of an empty universal. This project is furthered by a return to a certain phenomenology inspired by Cornelius Castoriadis, Claude Lefort, Maurice Merleau-Ponty and Ernesto Laclau. The thesis provides a reading of Jacques Derrida and Walter Benjamin as thinkers of a minor universalism, a non-prescriptive utopia, and places their work in the context of new understandings of religion and the political as quasi-transcendentals which can be utilised to think through the aporias of political time in order to grasp shards of meaning. 
Derrida and Chantal Mouffe's deconstructive critique and supplement to Carl Schmitt's concept of the political is read as suggestive of a reframing of political thought which would leave the political question open and thus enable the articulation of social imaginary significations able to inscribe meaning in the field of political action. Thus, the thesis gestures towards a form of thought which enables rather than constrains action under the sign of justice.
Abstract:
Leucodepletion, the removal of leucocytes from blood products, improves the safety of blood transfusion by reducing adverse events associated with the incidental non-therapeutic transfusion of leucocytes. Leucodepletion has been shown to have clinical benefit for immuno-suppressed patients who require transfusion. The selective leucodepletion of blood products by bedside filtration for these patients has been widely practiced. This study investigated the economic consequences in Queensland of moving from a policy of selective leucodepletion to one of universal leucodepletion, that is, providing all transfused patients with blood products leucodepleted during the manufacturing process. Using an analytic decision model, a cost-effectiveness analysis was conducted. An ICER of $16.3M per life year gained was derived. Sensitivity analysis found this result to be robust to uncertainty in the parameters used in the model. This result argues against moving to a policy of universal leucodepletion. However, during the course of the study the policy decision for universal leucodepletion was made and implemented in Queensland in October 2008. This study has concluded that cost-effectiveness is not an influential factor in policy decisions regarding quality and safety initiatives in the Australian blood sector.
Abstract:
The MDG deadline is fast approaching and the climate within the United Nations remains positive but skeptical. A common feeling is that a great deal of work and headway has been made, but the MDG goals will not be achieved in full by 2015. The largest problem facing the success of the MDGs is, and unless mitigated may remain, mismanaged governance. This argument is confirmed by a strong line of publications stemming from the United Nations targeting methods (depending on a region or country context) such as improving governance by combating corruption and instituting accountability, peace and stability, and transparency. Furthermore, a logical assessment of the framework in which the MDGs operate (i.e. international pressure and local civil socio-economic and/or political initiatives pushing governments to progress with the MDGs) identifies the State's governing apparatus as the key to the success of the MDGs. It is argued that a new analytic framework and grounded theory of democracy (the Element of Democracy) is needed in order to improve governance and enhance democracy. By looking beyond the confines of the MDGs and focusing on properly rectifying poor governance, the progress of the MDGs can be accelerated, as societies and their governments will be - at minimum - held more accountable for the success of programs in their respective countries. The paper demonstrates the logic of this argument - especially highlighting a new way of viewing democracy - and certain early practices which can accelerate the MDGs in the short to medium term.
Abstract:
Background: Reducing rates of healthcare-acquired infection has been identified by the Australian Commission on Safety and Quality in Health Care as a national priority. One of the goals is the prevention of central venous catheter-related bloodstream infection (CR-BSI). At least 3,500 cases of CR-BSI occur annually in Australian hospitals, resulting in unnecessary deaths and costs to the healthcare system of between $25.7 and $95.3 million. Two approaches to preventing these infections have been proposed: use of antimicrobial catheters (A-CVCs); or a catheter care and management ‘bundle’. Given finite healthcare budgets, decisions about the optimal infection control policy require consideration of the effectiveness and value for money of each approach. Objectives: The aim of this research is to use a rational economic framework to inform efficient infection control policy relating to the prevention of CR-BSI in the intensive care unit. It addresses three questions relating to decision-making in this area: 1. Is additional investment in activities aimed at preventing CR-BSI an efficient use of healthcare resources? 2. What is the optimal infection control strategy from amongst the two major approaches that have been proposed to prevent CR-BSI? 3. What uncertainty is there in this decision and can a research agenda to improve decision-making in this area be identified? Methods: A decision-analytic model-based economic evaluation was undertaken to identify an efficient approach to preventing CR-BSI in Queensland Health intensive care units. A Markov model was developed in conjunction with a panel of clinical experts which described the epidemiology and prognosis of CR-BSI. The model was parameterised using data systematically identified from the published literature and extracted from routine databases. The quality of the data used in the model, its validity to clinical experts, and its sensitivity to modelling assumptions were assessed.
Two separate economic evaluations were conducted. The first evaluation compared all commercially available A-CVCs alongside uncoated catheters to identify which was cost-effective for routine use. The uncertainty in this decision was estimated along with the value of collecting further information to inform the decision. The second evaluation compared the use of A-CVCs to a catheter care bundle. We were unable to estimate the cost of the bundle because it is unclear what the full resource requirements are for its implementation, and what the value of these would be in an Australian context. As such, we undertook a threshold analysis to identify the cost and effectiveness thresholds at which a hypothetical bundle would dominate the use of A-CVCs under various clinical scenarios. Results: In the first evaluation of A-CVCs, the findings from the baseline analysis, in which uncertainty is not considered, show that the use of any of the four A-CVCs will result in health gains accompanied by cost-savings. The MR catheters dominate the baseline analysis, generating 1.64 QALYs and cost-savings of $130,289 per 1,000 catheters. With uncertainty, and based on current information, the MR catheters remain the optimal decision and return the highest average net monetary benefits ($948 per catheter) relative to all other catheter types. This conclusion was robust to all scenarios tested; however, the probability of error in this conclusion is high, 62% in the baseline scenario. Using a value of $40,000 per QALY, the expected value of perfect information associated with this decision is $7.3 million. An analysis of the expected value of perfect information for individual parameters suggests that it may be worthwhile for future research to focus on providing better estimates of the mortality attributable to CR-BSI and the effectiveness of both SPC and CH/SSD (int/ext) catheters.
In the second evaluation of the catheter care bundle relative to A-CVCs, the results which do not consider uncertainty indicate that a bundle must achieve a relative risk of CR-BSI of at least 0.45 to be cost-effective relative to MR catheters. If the bundle can reduce rates of infection from 2.5% to effectively zero, it is cost-effective relative to MR catheters if national implementation costs are less than $2.6 million ($56,610 per ICU). If the bundle can achieve a relative risk of 0.34 (comparable to that reported in the literature) it is cost-effective, relative to MR catheters, if costs over an 18 month period are below $613,795 nationally ($13,343 per ICU). Once uncertainty in the decision is considered, the cost threshold for the bundle increases to $2.2 million. Therefore, if each of the 46 Level III ICUs could implement an 18 month catheter care bundle for less than $47,826 each, this approach would be cost-effective relative to A-CVCs. However, the uncertainty is substantial and the probability of error in concluding that the bundle is the cost-effective approach at a cost of $2.2 million is 89%. Conclusions: This work highlights that infection control to prevent CR-BSI is an efficient use of healthcare resources in the Australian context. If there is no further investment in infection control, an opportunity cost is incurred, which is the potential for a more efficient healthcare system. Minocycline/rifampicin catheters are the optimal choice of antimicrobial catheter for routine use in Australian Level III ICUs; however, if a catheter care bundle implemented in Australia was as effective as those used in the large studies in the United States it would be preferred over the catheters if it was able to be implemented for less than $47,826 per Level III ICU. Uncertainty is very high in this decision and arises from multiple sources.
There are likely greater costs to this uncertainty for A-CVCs, which may carry hidden costs, than there are for a catheter care bundle, which is more likely to provide indirect benefits to clinical practice and patient safety. Research into the mortality attributable to CR-BSI, the effectiveness of SPC and CH/SSD (int/ext) catheters and the cost and effectiveness of a catheter care bundle in Australia should be prioritised to reduce uncertainty in this decision. This thesis provides the economic evidence to inform one area of infection control, but there are many other infection control decisions for which information about the cost-effectiveness of competing interventions does not exist. This work highlights some of the challenges and benefits to generating and using economic evidence for infection control decision-making and provides support for commissioning more research into the cost-effectiveness of infection control.
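The decision metric in these evaluations, net monetary benefit, converts health gains into dollars at the willingness-to-pay threshold and subtracts incremental cost. A minimal sketch using the baseline deterministic figures quoted above (1.64 QALYs gained and $130,289 saved per 1,000 catheters, valued at $40,000 per QALY); note that the $948-per-catheter figure in the abstract is an average from the probabilistic analysis, which this sketch does not attempt to reproduce.

```python
def net_monetary_benefit(delta_qalys, delta_cost, wtp=40000.0):
    """NMB = willingness-to-pay * QALYs gained - incremental cost.
    A negative incremental cost is a cost-saving, which adds to the NMB."""
    return wtp * delta_qalys - delta_cost

# Baseline deterministic result for MR catheters, per 1,000 catheters:
nmb_per_1000 = net_monetary_benefit(delta_qalys=1.64, delta_cost=-130289.0)
print(nmb_per_1000, nmb_per_1000 / 1000.0)  # total NMB, and NMB per catheter
```

Under this rule, a competing strategy (such as the bundle) is preferred whenever its own NMB, net of implementation cost, exceeds this figure, which is exactly the threshold logic used in the second evaluation.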
Abstract:
Background: Work-related injuries in Australia are estimated to cost around $57.5 billion annually; however, there are currently insufficient surveillance data available to support an evidence-based public health response. Emergency departments (EDs) in Australia are a potential source of information on work-related injuries, though most EDs do not have an ‘Activity Code’ to identify work-related cases, with information about the presenting problem recorded in a short free-text field. This study compared methods for interrogating text fields to identify work-related injuries presenting at emergency departments, to inform approaches to surveillance of work-related injury. Methods: Three approaches were used to interrogate an injury description text field to classify cases as work-related: keyword search, index search, and content analytic text mining. Sensitivity and specificity were examined by comparing cases flagged by each approach to cases coded with an Activity Code during triage. Methods to improve the sensitivity and/or specificity of each approach were explored by adjusting the classification techniques within each broad approach. Results: The basic keyword search detected 58% of cases (specificity 0.99), the index search detected 62% of cases (specificity 0.87), and the content analytic text mining approach (using adjusted probabilities) detected 77% of cases (specificity 0.95). Conclusions: The findings of this study provide strong support for continued development of text-searching methods to obtain information from routine emergency department data, to improve the capacity for comprehensive injury surveillance.
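As an illustration of the simplest of the three approaches, a keyword search can be scored against the triage Activity Code exactly as in the study's sensitivity/specificity comparison. The keyword list and records below are hypothetical, not the study's:

```python
def keyword_flag(text, keywords=("work", "job", "factory", "worksite")):
    """Flag a free-text injury description as work-related if any keyword
    appears. The keyword list is a hypothetical example."""
    t = text.lower()
    return any(k in t for k in keywords)

def sensitivity_specificity(records):
    """records: (free_text, activity_code_says_work_related) pairs."""
    tp = fp = tn = fn = 0
    for text, truth in records:
        flagged = keyword_flag(text)
        tp += int(flagged and truth)
        fp += int(flagged and not truth)
        fn += int(not flagged and truth)
        tn += int(not flagged and not truth)
    return tp / (tp + fn), tn / (tn + fp)

cases = [
    ("crush injury at work", True),
    ("fell at home", False),
    ("laceration at the factory", True),
    ("sports sprain", False),
    ("back strain", True),   # work-related but no keyword: a missed case
]
sens, spec = sensitivity_specificity(cases)
```

The missed "back strain" case shows why the basic keyword search only reached 58% sensitivity in the study: work-relatedness is often not stated in words the keyword list anticipates.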
Abstract:
This article uses critical discourse analysis to analyse material shifts in the political economy of communications. It examines texts of major corporations to describe four key changes in political economy: (1) the separation of ownership from control; (2) the separation of business from industry; (3) the separation of accountability from responsibility; and (4) the subjugation of ‘going concerns’ by overriding concerns. The authors argue that this amounts to a political economic shift from traditional concepts of ‘capitalism’ to a new ‘corporatism’ in which the relationships between public and private, state and individual interests have become redefined and obscured through new discourse strategies. They conclude that the present financial and regulatory ‘crisis’ cannot be adequately resolved without a new analytic framework for examining the relationships between corporation, discourse and political economy.
Abstract:
In this thesis an investigation into theoretical models for the formation and interaction of nanoparticles is presented. The work presented includes a literature review of current models followed by a series of five chapters of original research. This thesis has been submitted in partial fulfilment of the requirements for the degree of doctor of philosophy by publication and therefore each of the five chapters consists of a peer-reviewed journal article. The thesis is then concluded with a discussion of what has been achieved during the PhD candidature, the potential applications for this research and ways in which the research could be extended in the future. In this thesis we explore stochastic models pertaining to the interaction and evolution mechanisms of nanoparticles. In particular, we explore in depth the stochastic evaporation of molecules due to thermal activation and its ultimate effect on nanoparticle sizes and concentrations. Secondly, we analyse the thermal vibrations of nanoparticles suspended in a fluid and subject to standing oscillating drag forces (as would occur in a standing sound wave) and finally on lattice surfaces in the presence of high heat gradients. We have described in this thesis a number of new models for the description of multicompartment networks joined by multiple, stochastically evaporating links. The primary motivation for this work is the description of thermal fragmentation, in which multiple molecules holding parts of a carbonaceous nanoparticle may evaporate. Ultimately, these models predict the rate at which the network or aggregate fragments into smaller networks/aggregates and with what aggregate size distribution. The models are highly analytic and describe the fragmentation of a link holding multiple bonds using Markov processes that best describe different physical situations, and these processes have been analysed using a number of mathematical methods.
The fragmentation of the network/aggregate is then predicted using combinatorial arguments. Whilst there is some scepticism in the scientific community pertaining to the proposed mechanism of thermal fragmentation, we have presented compelling evidence in this thesis supporting the currently proposed mechanism and shown that our models can accurately match experimental results. This was achieved using a realistic simulation of the fragmentation of the fractal carbonaceous aggregate structure using our models. Furthermore, in this thesis a method of manipulation using acoustic standing waves is investigated. In our investigation we analysed the effect of frequency and particle size on the ability of the particle to be manipulated by means of a standing acoustic wave. In our results, we report the existence of a critical frequency for a particular particle size. This frequency is inversely proportional to the Stokes time of the particle in the fluid. We also find that for large frequencies the subtle Brownian motion of even larger particles plays a significant role in the efficacy of the manipulation. This is due to the decreasing size of the boundary layer between acoustic nodes. Our model utilises a multiple time scale approach to calculating the long-term effects of the standing acoustic field on the particles interacting with the sound. These effects are then combined with the effects of Brownian motion in order to obtain a complete mathematical description of the particle dynamics in such acoustic fields. Finally, in this thesis, we develop a numerical routine for the description of "thermal tweezers". Currently, the technique of thermal tweezers is predominantly theoretical; however, there have been a handful of successful experiments which demonstrate the effect in practice. Thermal tweezers is the name given to the way in which particles can be easily manipulated on a lattice surface by careful selection of a heat distribution over the surface.
Typically, the theoretical simulations of the effect can be rather time consuming, with supercomputer facilities processing data over days or even weeks. Our alternative numerical method for the simulation of particle distributions pertaining to the thermal tweezers effect uses the Fokker-Planck equation to derive a quick method for the calculation of the effective diffusion constant resulting from the lattice and the temperature. We then use this diffusion constant and solve the diffusion equation numerically using the finite volume method. This saves the algorithm from calculating many individual particle trajectories, since it describes the flow of the probability distribution of particles in a continuous manner. The alternative method outlined in this thesis can produce a larger quantity of accurate results on a household PC in a matter of hours, which is much better than was previously achievable.
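The thesis's actual routine is not reproduced in the abstract. As a hedged illustration of the two ingredients it names, the sketch below computes an effective diffusion constant for a 1D periodic lattice potential via the classical Lifson-Jackson formula (one standard Fokker-Planck-derived result, which may differ in detail from the method used in the thesis) and then advances the diffusion equation with an explicit finite-volume step:

```python
import numpy as np

def lifson_jackson_deff(V, kT, D0=1.0):
    """Effective diffusion constant over a 1D periodic potential V (sampled on
    one period): D_eff = D0 / (<exp(V/kT)> <exp(-V/kT)>), <.> a period average.
    Always <= D0, by the Cauchy-Schwarz inequality."""
    b = np.exp(V / kT)
    return D0 / (b.mean() * (1.0 / b).mean())

def diffuse_fv(p, deff, dx, dt, steps):
    """Explicit finite-volume update of p_t = deff * p_xx with zero-flux
    boundaries; total probability is conserved by construction."""
    r = deff * dt / dx**2
    assert r <= 0.5, "explicit scheme stability limit violated"
    p = p.astype(float).copy()
    for _ in range(steps):
        flux = r * np.diff(p)   # numerical flux between adjacent cells
        p[:-1] += flux
        p[1:] -= flux
    return p

# Illustrative parameters (not from the thesis): a sinusoidal surface
# potential, then spreading of an initially localised particle distribution.
V = 0.5 * np.sin(2 * np.pi * np.linspace(0.0, 1.0, 200, endpoint=False))
deff = lifson_jackson_deff(V, kT=1.0)
p0 = np.zeros(64)
p0[32] = 1.0
p1 = diffuse_fv(p0, deff, dx=1.0, dt=0.5, steps=50)
```

This captures the speed-up the abstract describes: one cheap quadrature replaces many stochastic trajectories, and the remaining work is a standard conservative diffusion solve.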
Abstract:
Understanding the complexities that are involved in the genetics of multifactorial diseases is still a monumental task. In addition to environmental factors that can influence the risk of disease, there are also a number of other complicating factors. Genetic variants associated with age of disease onset may be different from those variants associated with overall risk of disease, and variants may be located in positions that are not consistent with the traditional protein-coding genetic paradigm. Latent Variable Models are well suited to the analysis of genetic data. A latent variable is one that we do not directly observe, but which is believed to exist or is included for computational or analytic convenience in a model. This thesis presents a mixture of methodological developments utilising latent variables, and results from case studies in genetic epidemiology and comparative genomics. Epidemiological studies have identified a number of environmental risk factors for appendicitis, but the disease aetiology of this oft-thought-useless vestige remains largely a mystery. The effects of smoking on other gastrointestinal disorders are well documented, and in light of this, the thesis investigates the association between smoking and appendicitis through the use of latent variables. By utilising data from a large Australian twin study questionnaire as both cohort and case-control, evidence is found for an association between tobacco smoking and appendicitis. Twin and family studies have also found evidence for the role of heredity in the risk of appendicitis. Results from previous studies are extended here to estimate the heritability of age-at-onset and account for the effect of smoking. This thesis presents a novel approach for performing a genome-wide variance components linkage analysis on transformed residuals from a Cox regression.
This method finds evidence for a different subset of genes responsible for variation in age at onset than those associated with overall risk of appendicitis. Motivated by increasing evidence of functional activity in regions of the genome once thought of as evolutionary graveyards, this thesis develops a generalisation of the Bayesian multiple changepoint model on aligned DNA sequences to more than two species. This sensitive technique is applied to evaluating the distributions of evolutionary rates, with the finding that they are much more complex than previously apparent. We show strong evidence for at least 9 well-resolved evolutionary rate classes in an alignment of four Drosophila species and at least 7 classes in an alignment of four mammals, including human. A pattern of enrichment and depletion of genic regions in the profiled segments suggests they are functionally significant, and most likely consist of various functional classes. Furthermore, a method of incorporating alignment characteristics representative of function, such as GC content and type of mutation, into the segmentation model is developed within this thesis. Evidence of fine-structured segmental variation is presented.
Abstract:
This special issue of Innovation: Management, Policy & Practice (also released as a book: ISBN 978-1-921348-31-0) will explore some empirical and analytic connections between creative industries and innovation policy. Seven papers are presented. The first four are empirical, providing analysis of large and/or detailed data sets on creative industries businesses and occupations to discern their contribution to innovation. The next three papers focus on comparative and historical policy analysis, connecting creative industries policy (broadly considered, including media, arts and cultural policy) and innovation policy. To introduce this special issue I want to review the arguments connecting the statistical, conceptual and policy neologism of ‘creative industries’ to: (1) the elements of a national innovation system; and (2) innovation policy. In approaching this connection, two overarching issues arise.
Abstract:
This book explores the interrelation of literacy and religion as practiced by Western Christians in, first, historical contexts and, second, in one contemporary church setting. Using both a case study and a Foucauldian theoretical framework, the book provides a sustained analysis of the reciprocal discursive construction of literacy, religiosity and identity in one Seventh-day Adventist Church community of Northern Australia. Critical linguistic and discourse analytic theory is used to disclose processes of theological (church), familial (home) and educational (school) normalisation of community members into regulated ways of hearing and speaking, reading and writing, being and believing. Detailed analyses of spoken and written texts taken from institutional and local community settings show how textual religion is an exemplary technology of the self, a politics constituted by canonical texts, interpretive norms, textual practices, ritualised events and sociopolitical protocols that, ultimately, are turned in upon the self. The purpose of these analyses is to show how, across denominational difference in belief (tradition) and practice, particular versions of self and society are constructed through economies of truth from text, enabling and constraining what can and cannot be spoken and enacted by believers.
Abstract:
Objective: To demonstrate properties of the International Classification of the External Cause of Injury (ICECI) as a tool for use in injury prevention research. Methods: The Childhood Injury Prevention Study (CHIPS) is a prospective longitudinal follow-up study of a cohort of 871 children aged 5–12 years, with a nested case-crossover component. The ICECI is the latest tool in the International Classification of Diseases (ICD) family and has been designed to improve the precision of coding injury events. The details of all injury events recorded in the study, as well as all measured injury-related exposures, were coded using the ICECI. This paper reports a substudy on the utility and practicability of using the ICECI in the CHIPS to record exposures. Interrater reliability was quantified for a sample of injured participants using the Kappa statistic to measure concordance between codes independently assigned by two research staff. Results: There were 767 diaries collected at baseline, with event details from 563 injuries and exposure details from the injury crossover periods. There were no event, location, or activity details which could not be coded using the ICECI. Kappa statistics for concordance between raters within each of the dimensions ranged from 0.31 to 0.93 for the injury events and from 0.94 to 0.97 for activity and location in the control periods. Discussion: This study represents the first detailed account of the properties of the ICECI revealed by its use in a primary analytic epidemiological study of injury prevention. The results of this study provide considerable support for the ICECI and its further use.
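For two raters, the Kappa statistic used here is Cohen's kappa, which corrects raw agreement for the agreement expected by chance given each rater's marginal code frequencies. A minimal sketch with illustrative location codes (not the CHIPS data):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa = (p_o - p_e) / (1 - p_e), where p_o is observed
    agreement and p_e is the chance agreement from the raters' marginals."""
    n = len(rater1)
    assert n == len(rater2) and n > 0
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[k] * c2.get(k, 0) for k in c1) / n ** 2
    return (p_o - p_e) / (1.0 - p_e)

# Hypothetical location codes assigned independently by two coders:
coder_a = ["home", "school", "home", "road", "home", "school"]
coder_b = ["home", "school", "road", "road", "home", "school"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # → 0.75
```

A kappa of 0.94-0.97, as reported above for activity and location, indicates near-perfect agreement after chance is discounted, whereas values toward the 0.31 end of the injury-event range indicate only fair agreement.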