32 results for leap
in Queensland University of Technology - ePrints Archive
Abstract:
This paper demonstrates how Indigenous Studies is controlled in some Australian universities in ways that continue the marginalisation, denigration and exploitation of Indigenous peoples. Moreover, it shows how the engagement of white notions of “inclusion” can result in the maintenance of racism, systemic marginalisation, white race privilege and racialised subjectivity. A case study will be utilised which draws from the experience of two Indigenous scholars who were invited to be part of a panel to review one Australian university’s plan and courses in Indigenous studies. The case study offers the opportunity to destabilise the relationships between oppression and privilege and the epistemology that maintains them. The paper argues for the need to examine exactly what is being offered when universities provide opportunities for “inclusion”.
Abstract:
The delay stochastic simulation algorithm (DSSA) by Barrio et al. [PLoS Comput. Biol. 2, e117 (2006)] was developed to simulate delayed processes in cell biology in the presence of intrinsic noise, that is, when there are small-to-moderate numbers of certain key molecules present in a chemical reaction system. These delayed processes can faithfully represent complex interactions and mechanisms that imply a number of spatiotemporal processes often not explicitly modeled, such as transcription and translation, which are basic to the modeling of cell signaling pathways. However, for systems with widely varying reaction rate constants or large numbers of molecules, the simulation time steps of both the stochastic simulation algorithm (SSA) and the DSSA can become very small, causing considerable computational overheads. In order to overcome the limit of small step sizes, various τ-leap strategies have been suggested for improving the computational performance of the SSA. In this paper, we present a binomial τ-DSSA method that extends the τ-leap idea to the delay setting and avoids drawing insufficient numbers of reactions, a common shortcoming of existing binomial τ-leap methods that becomes evident when dealing with complex chemical interactions. The resulting inaccuracies are most evident in the delayed case, even when considering reaction products as potential reactants within the same time step in which they are produced. Moreover, we extend the framework to account for multicellular systems with different degrees of intercellular communication. We apply these ideas to two important genetic regulatory models, namely, the hes1 gene, implicated as a molecular clock, and a Her1/Her7 model for coupled oscillating cells.
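As a rough illustration of two ingredients this abstract combines, binomial leap draws that cannot exceed the available reactant molecules and a queue of delayed products, a minimal Python sketch follows; the single conversion reaction, rate constant and delay are hypothetical, and this is not the paper's binomial τ-DSSA.

```python
# Minimal, illustrative sketch (not the paper's algorithm) of (i) binomial
# tau-leap draws, which cap the number of firings by the available reactant
# molecules so populations cannot go negative, and (ii) a delay queue that
# releases reaction products only after a fixed delay. The reaction, rate
# constant, delay and step size below are hypothetical.
import heapq
import numpy as np

rng = np.random.default_rng(1)

def binomial_delay_leap(x0, k, delay, tau, t_end):
    """Delayed conversion X -> Y with propensity k*X; Y appears `delay` later."""
    t, x, y = 0.0, x0, 0
    pending = []                         # min-heap of (release_time, count)
    while t < t_end:
        a = k * x                        # propensity of the consuming reaction
        n_max = x                        # cannot fire more times than molecules left
        p = min(a * tau / n_max, 1.0) if n_max > 0 else 0.0
        fired = rng.binomial(n_max, p)   # binomial draw bounded by n_max
        x -= fired
        if fired:
            heapq.heappush(pending, (t + delay, fired))  # schedule delayed products
        t += tau
        while pending and pending[0][0] <= t:            # release matured products
            _, n = heapq.heappop(pending)
            y += n
    return x, y

print(binomial_delay_leap(x0=1000, k=0.05, delay=5.0, tau=0.1, t_end=40.0))
```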
Abstract:
The stochastic simulation algorithm was introduced by Gillespie and, in a different form, by Kurtz. There have been many attempts at accelerating the algorithm without deviating from the behavior of the simulated system. The crux of the explicit τ-leaping procedure is the use of Poisson random variables to approximate the number of occurrences of each type of reaction event during a carefully selected time period, τ. This method is acceptable provided the leap condition, that no propensity function changes “significantly” during any time-step, is met. Using this method there is a possibility that species numbers can, artificially, become negative. Several recent papers have demonstrated methods that avoid this situation. One such method classifies as critical those reactions in danger of sending species populations negative. At most one of these critical reactions is allowed to occur in the next time-step. We argue that the criticality of a reactant species and its dependent reaction channels should be related to the probability of the species number becoming negative. This way, only reactions that, if fired, produce a high probability of driving a reactant population negative are labeled critical. The number of firings of more reaction channels can then be approximated using Poisson random variables, thus speeding up the simulation while maintaining accuracy. In implementing this revised method of criticality selection we make use of the probability distribution from which the random variable describing the change in species number is drawn. We give several numerical examples to demonstrate the effectiveness of our new method.
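To make the criticality criterion concrete, here is a minimal Python sketch, not the authors' implementation: a consuming channel is flagged critical when the Poisson-approximated probability of driving one of its reactant populations negative over a candidate step τ exceeds a tolerance. The tolerance value, the simplification of lumping all consumption of a species into one Poisson variable, and the toy reaction system are assumptions for illustration.

```python
# Illustrative sketch of the criticality test described above (not the
# authors' code). A channel is critical if, over a candidate step tau, the
# probability that total consumption of one of its reactants exceeds that
# reactant's current count is larger than a tolerance delta. Lumping the
# total loss into a single Poisson variable is exact only when every
# consuming stoichiometry is -1; here it is a deliberate simplification.
import numpy as np
from scipy.stats import poisson

def critical_channels(x, propensities, stoich, tau, delta=0.01):
    """x: species counts; propensities: a_j; stoich[j, i]: net change of
    species i by reaction j. Returns indices of channels deemed critical."""
    critical = set()
    for i in range(len(x)):
        consumers = [j for j in range(len(propensities)) if stoich[j, i] < 0]
        if not consumers:
            continue
        # Expected total consumption of species i over tau (Poisson mean)
        mean_loss = sum(-stoich[j, i] * propensities[j] * tau for j in consumers)
        # Probability the Poisson-distributed loss exceeds the current count
        p_negative = poisson.sf(x[i], mean_loss)
        if p_negative > delta:
            critical.update(consumers)
    return sorted(critical)

# Tiny hypothetical example: A -> B (a = 4), B -> 0 (a = 1), with few B molecules.
x = np.array([100, 3])
a = np.array([4.0, 1.0])
v = np.array([[-1, +1], [0, -1]])
print(critical_channels(x, a, v, tau=1.0))
```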
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture this fundamentally discrete and stochastic nature of cellular biology, most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time stepping, so faster approximations such as the Poisson and binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
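For orientation, the following minimal Python sketch contrasts an exact SSA step with a Poisson τ-leap step, the simplest Euler-type discretisation of the Poisson-driven formulation mentioned above, on a hypothetical birth-death system; the rate constants and step size are illustrative choices, not values from the paper.

```python
# Illustrative sketch only: exact SSA vs. Poisson tau-leap for a birth-death
# process (0 -> X at rate k1, X -> 0 at rate k2*X). The rate constants and
# the step size tau are hypothetical choices, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 10.0, 0.1          # hypothetical rate constants

def ssa(x0, t_end):
    """Exact Gillespie SSA: one reaction fires per (exponential) time step."""
    t, x = 0.0, x0
    while t < t_end:
        a = np.array([k1, k2 * x])      # propensities
        a0 = a.sum()
        if a0 == 0:
            break
        t += rng.exponential(1.0 / a0)  # time to next reaction
        if rng.random() < a[0] / a0:    # choose which reaction fires
            x += 1
        else:
            x -= 1
    return x

def tau_leap(x0, t_end, tau=0.05):
    """Poisson tau-leap: an Euler-type step using Poisson increments."""
    t, x = 0.0, x0
    while t < t_end:
        a = np.array([k1, k2 * x])
        k = rng.poisson(a * tau)        # number of firings of each channel
        x = max(x + k[0] - k[1], 0)     # crude guard against negative copy numbers
        t += tau
    return x

print(ssa(0, 50.0), tau_leap(0, 50.0))
```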
Abstract:
This dissertation examines the compliance and performance of a large sample of faith-based (religious) ethical funds, the Shari'ah-compliant equity funds (SEFs), which may be viewed as a form of ethical investing. SEFs screen their investments for compliance with Islamic law, where riba (conventional interest expense), maysir (gambling), gharar (excessive uncertainty), and non-halal (non-ethical) products are prohibited. Using a set of stringent Shari'ah screens similar to those of MSCI Islamic, we first examine the extent to which SEFs comply with Shari'ah law. Results show that only about 27% of the equities held by SEFs are Shari'ah-compliant. While most of the fund holdings pass the business screens, only about 42% pass the total debt to total assets ratio screen. This finding suggests that, in order to overcome a significant reduction in the investment opportunity set, Shari'ah principles are compromised, with SEFs adopting lax screening rules so as to achieve financial performance. While younger funds and funds that charge higher fees and are domiciled in predominantly Muslim countries are more Shari'ah-compliant, we find little evidence of a positive relationship between fund disclosure of the Shari'ah compliance framework and Shari'ah compliance. Clearly, Shari'ah compliance remains a major challenge for fund managers, and SEF investors should be aware of Shari'ah-compliance risk since fund managers do not always fulfill their fiduciary obligation as promised in their prospectus. Employing a matched-firm approach for a survivorship-free sample of 387 SEFs, we then examine an issue that has been heavily debated in the literature: does ethical screening reduce investment performance? Results show that it does, but only by an average of 0.04% per month when benchmarked against matched conventional funds, a relatively small price to pay for religious faith. Cross-sectional regressions show an inverse relationship between Shari'ah compliance and fund performance: every one percentage point increase in total compliance decreases fund performance by 0.01% per month. However, compliance fails to explain differences in performance between SEFs and matched funds. Although SEFs do not generally perform better during crisis periods, further analysis shows evidence of better performance relative to conventional funds only during the recent Global Financial Crisis; the latter is consistent with popular media claims.
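By way of illustration of the ratio screens described above, the following Python sketch applies a debt-to-total-assets test to hypothetical holdings; the one-third threshold echoes commonly cited MSCI Islamic practice but, like the holdings, it is an assumption rather than the thesis's actual screening rule.

```python
# Illustrative sketch of the kind of financial-ratio screen discussed above.
# The one-third debt-to-total-assets threshold and the holdings below are
# hypothetical illustrations, not the thesis's screening rules or data.
def passes_debt_screen(total_debt: float, total_assets: float,
                       threshold: float = 1 / 3) -> bool:
    """A holding passes if total debt / total assets stays below the threshold."""
    return total_assets > 0 and (total_debt / total_assets) < threshold

holdings = {"Firm A": (20.0, 100.0), "Firm B": (45.0, 100.0)}  # (debt, assets)
compliance = {name: passes_debt_screen(debt, assets)
              for name, (debt, assets) in holdings.items()}
share_compliant = sum(compliance.values()) / len(compliance)
print(compliance, f"{share_compliant:.0%} of holdings pass the debt screen")
```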
Abstract:
Monitoring foodservice satisfaction is a risk management strategy for malnutrition in the acute care sector, as low satisfaction may be associated with poor intake. This study aimed to investigate the relationship between age and foodservice satisfaction in the private acute care setting. Patient satisfaction was assessed using a validated tool, the Acute Care Hospital Foodservice Patient Satisfaction Questionnaire, for data collected 2008–2010 (n = 779) at a private hospital, Brisbane. Age was grouped into three categories: ≤50 years, 51–70 years and >70 years. Fisher’s exact test assessed independence of categorical responses and age group; ANOVA or the Kruskal–Wallis test was used for continuous variables. Dichotomised responses were analysed using logistic regression and odds ratios (95% confidence interval, p < 0.05). Overall foodservice satisfaction (5-point scale) was high (≥4 out of 5) and was independent of age group (p = 0.377). There was an increasing trend with age in mean satisfaction scores for individual dimensions of foodservice: food quality (p < 0.001), meal service quality (p < 0.001), staff service issues (p < 0.001) and physical environment (p < 0.001). A preference for being able to choose different sized meals (59.8% >70 years vs 40.6% ≤50 years; p < 0.001) and the response to ‘the foods are just the right temperature’ (55.3% >70 years vs 35.9% ≤50 years; p < 0.001) were dependent on age. For the food quality dimension, based on dichotomised responses (satisfied or not), the odds of satisfaction were higher for >70 years (OR = 5.0, 95% CI: 1.8–13.8; ≤50 years referent). These results suggest that dimensions of foodservice satisfaction are associated with age and can assist foodservices to meet varying generational expectations of clients.
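For readers reproducing this style of analysis, a minimal Python sketch of the dichotomised-response odds ratio with a Wald 95% confidence interval follows; the 2x2 counts are hypothetical and are not the study's data.

```python
# Illustrative only: odds ratio with a 95% confidence interval for a
# dichotomised satisfaction response across two age groups. The 2x2 counts
# below are hypothetical, not the study's data.
import numpy as np

#                satisfied  not satisfied
# age > 70          a            b
# age <= 50         c            d
a, b, c, d = 90, 10, 70, 30

odds_ratio = (a * d) / (b * c)
# Wald 95% CI on the log odds ratio
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
lo, hi = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
```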
Abstract:
How does globalization influence transitions toward more sustainable socio-technical regimes in the developing world? This paper argues that transformations of regimes, the networks and institutions governing technological and environmental practices in an industry, can be positively influenced by globalization, but this depends on how global forces interact with local socio-political landscapes: the political-economic institutions, values, and regulations broadly guiding an economy and its relationship to the environment. We evaluate these relationships through a comparison of two kinds of socio-political landscapes: the neo-liberal export-led development model commonly found in the developing world and the uniquely Asian capitalist developmental state. We first show how the neo-liberal model overemphasizes the power of market forces to facilitate upgrading and more sustainable industrialization. We then argue that capitalist developmental states in East and Southeast Asia have been better able to harness global economic forces for technological and sustainability transitions through an openness to trade and investment and effective public-private institutions able to link cleaner technologies and environmental standards to production activities in firms. We buttress this argument with firm-level evidence showing the evolution of socio-technical regimes in two industries, cement and electronics. The case studies demonstrate how interactions with OECD firms can contribute to environmental technique effects provided the socio-political landscape is amenable to changes in an industry's regime. Ultimately, we find the process of transition to be complex and contingent; a hard slog, not a leapfrog, toward a potentially more sustainable future. We close by considering the limitations of the capitalist developmental state model and with comments about what else needs to be learned about globalization's role in sustainability transitions.
Abstract:
1. Strategic searching for invasive pests presents a formidable challenge for conservation managers. Limited funding can necessitate choosing between surveying many sites cursorily, or focussing intensively on fewer sites. While existing knowledge may help to target more likely sites, e.g. with species distribution models (maps), this knowledge is not flawless and improving it also requires management investment. 2. In a rare example of trading off action against knowledge gain, we combine search coverage and accuracy, and its future improvement, within a single optimisation framework. More specifically, we examine under which circumstances managers should adopt one of two search-and-control strategies (cursory or focussed), and when they should divert funding to improving knowledge, making better predictive maps that benefit future searches. 3. We use a family of Receiver Operating Characteristic curves to reflect the quality of maps that direct search efforts. We demonstrate our framework by linking these to a logistic model of invasive spread such as that for the red imported fire ant Solenopsis invicta in south-east Queensland, Australia. 4. Cursory widespread searching is only optimal if the pest is already widespread or knowledge is poor; otherwise focussed searching exploiting the map is preferable. For longer management timeframes, eradication is more likely if funds are initially devoted to improving knowledge, even if this results in a short-term explosion of the pest population. 5. Synthesis and applications. By combining trade-offs between knowledge acquisition and utilization, managers can better focus, and justify, their spending to achieve optimal results in invasive control efforts. This framework can improve the efficiency of any ecological management that relies on predicting occurrence.
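The trade-off can be illustrated with a toy Python calculation: a discrete-time logistic model of the proportion of sites infested, plus the expected number of infested sites found when a fixed search budget is directed by a map summarised by a single ROC operating point. All parameter values in the sketch are hypothetical, and it is not the authors' optimisation framework.

```python
# Illustrative sketch only, not the paper's framework: logistic spread of the
# proportion of sites infested, and expected detections when searches are
# directed by a predictive map summarised by one (false-positive rate,
# true-positive rate) point on its ROC curve. All values are hypothetical.
def logistic_spread(p0=0.01, r=0.6, steps=10):
    """p[t+1] = p[t] + r * p[t] * (1 - p[t]): proportion of sites infested."""
    p = [p0]
    for _ in range(steps):
        p.append(p[-1] + r * p[-1] * (1 - p[-1]))
    return p

def expected_finds(p_infested, n_sites, budget, tpr, fpr):
    """Searches are spent on sites the map flags as likely infested."""
    infested = p_infested * n_sites
    clean = (1 - p_infested) * n_sites
    flagged_true = tpr * infested          # infested sites the map flags
    flagged_false = fpr * clean            # clean sites the map flags
    flagged = flagged_true + flagged_false
    searched = min(budget, flagged)
    return searched * (flagged_true / flagged) if flagged else 0.0

p = logistic_spread()
# Compare a sharp map (good ROC point) with a poor one for the same budget.
print(expected_finds(p[3], n_sites=10_000, budget=500, tpr=0.8, fpr=0.1))
print(expected_finds(p[3], n_sites=10_000, budget=500, tpr=0.5, fpr=0.4))
```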
Abstract:
Design as seen from the designer's perspective is a series of amazing imaginative jumps or creative leaps. But design as seen by the design historian is a smooth progression or evolution of ideas that seem self-evident and inevitable after the event. But the next step is anything but obvious for the artist/creator/inventor/designer stuck at that point just before the creative leap. They know where they have come from and have a general sense of where they are going, but often do not have a precise target or goal. This is why it is misleading to talk of design as a problem-solving activity; it is better defined as a problem-finding activity. This has been very frustrating for those trying to assist the design process with computer-based, problem-solving techniques. By the time the problem has been defined, it has been solved. Indeed the solution is often the very definition of the problem. Design must be creative, or it is mere imitation. But since this crucial creative leap seems inevitable after the event, the question must arise: can we find some way of searching the space ahead? Of course there are serious problems of knowing what we are looking for and of the vastness of the search space. It may be better to discard altogether the term "searching" in the context of the design process: conceptual analogies such as search, search spaces and fitness landscapes aim to elucidate the design process. However, the vastness of the multidimensional spaces involved makes these analogies misguided, and they thereby further confound the issue. The term search becomes a misnomer, since it has connotations that imply it is possible to find what you are looking for. In such vast spaces the term search must be discarded. Thus, any attempt at searching for the highest peak in the fitness landscape as an optimal solution is also meaningless. Furthermore, even the very existence of a fitness landscape is fallacious. Although alternatives in the same region of the vast space can be compared to one another, distant alternatives will stem from radically different roots and will therefore not be comparable in any straightforward manner (Janssen 2000). Nevertheless we still have this tantalizing possibility: if a creative idea seems inevitable after the event, then might the process somehow be reversed? This may be as improbable as attempting to reverse time. A more helpful analogy is from nature, where it is generally assumed that the process of evolution is not long-term goal-directed or teleological. Dennett points out a common misunderstanding of Darwinism: the idea that evolution by natural selection is a procedure for producing human beings. Evolution can have produced humankind by an algorithmic process, without its being true that evolution is an algorithm for producing us. If we were to wind the tape of life back and run this algorithm again, the likelihood of "us" being created again is infinitesimally small (Gould 1989; Dennett 1995). But nevertheless Mother Nature has proved a remarkably successful, resourceful, and imaginative inventor, generating a constant flow of incredible new design ideas to fire our imagination. Hence the current interest in the potential of the evolutionary paradigm in design. These evolutionary methods are frequently based on techniques such as the application of evolutionary algorithms, which are usually thought of as search algorithms.
It is necessary to abandon such connections with searching and see the evolutionary algorithm as a direct analogy with the evolutionary processes of nature. The process of natural selection can generate a wealth of alternative experiments, and the better ones survive. There is no one solution, there is no optimal solution, but there is continuous experiment. Nature is profligate with her prototyping and ruthless in her elimination of less successful experiments. Most importantly, nature has all the time in the world. As designers we cannot afford such prototyping and ruthless experiment, nor can we operate on the time scale of the natural design process. Instead we can use the computer to compress space and time and to perform virtual prototyping and evaluation before committing ourselves to actual prototypes. This is the hypothesis underlying the evolutionary paradigm in design (1992, 1995).
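A minimal generate-evaluate-select loop of the kind alluded to above might look as follows in Python; the toy design representation and scoring function are hypothetical and do not correspond to the authors' evolutionary design systems.

```python
# A minimal generate-evaluate-select loop, illustrating the evolutionary
# process the passage describes (profligate generation of alternatives and
# elimination of the less successful ones). The toy "design" is a vector of
# numbers and the scoring function is hypothetical; nothing here corresponds
# to the authors' own evolutionary design systems.
import random

random.seed(0)

def score(design):
    """Toy evaluation: prefer designs whose values are close to 0.5."""
    return -sum((v - 0.5) ** 2 for v in design)

def evolve(pop_size=30, genes=8, generations=40, mutation=0.1):
    population = [[random.random() for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=score, reverse=True)
        survivors = population[: pop_size // 2]            # ruthless elimination
        offspring = []
        while len(survivors) + len(offspring) < pop_size:  # profligate regeneration
            parent = random.choice(survivors)
            child = [v + random.gauss(0, mutation) for v in parent]
            offspring.append(child)
        population = survivors + offspring
    return max(population, key=score)

best = evolve()
print(round(score(best), 4))
```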
Abstract:
Qualitative research methods require transparency to ensure the ‘trustworthiness’ of the data analysis. The intricate processes of organizing, coding and analyzing the data are often rendered invisible in the presentation of the research findings, which requires a ‘leap of faith’ for the reader. Computer-assisted data analysis software can be used to make the research process more transparent, without sacrificing rich, interpretive analysis by the researcher. This article describes in detail how one software package was used in a poststructural study to link and code multiple forms of data to four research questions for fine-grained analysis. This description will be useful for researchers seeking to use qualitative data analysis software as an analytic tool.