992 results for parameter uncertainty
Abstract:
Children’s literature has conventionally and historically been concerned with identity and the often tortuous journey to becoming a subject who is generally older and wiser, a journey typically characterised by mishap, adventure, and detours. Narrative closure in children’s and young adult novels and films typically provides a point of self-realisation or self-actualisation, whereby the struggles of finding one’s “true” identity have been overcome. In this familiar coming-of-age narrative, there is often an underlying premise of an essential self that will emerge or be uncovered. This kind of narrative resolution reassures readers that things will work out for the best in the end, which is an enduring feature of children’s literature, and part of liberal-humanism’s project of harmonious individuality. However, uncertainty is a constant that has always characterised the ways lives are lived, regardless of best-laid plans. Children’s literature provides a field of narrative knowledge whereby readers gain impressions of childhood and adolescence, or more specifically, knowledge of ways of being at a time in life that is marked by uncertainty. Despite the prevalence of children’s texts which continue to offer normative ways of being, in particular normative forms of gender behaviour, there are texts which resist the pull for characters to be “like everyone else” by exploring alternative subjectivities. Fiction, however, cannot be regarded as a source of evidence about the material realities of life, as its strength lies in its affective and imaginative dimensions, which nevertheless can offer readers moments of reflection, recognition, or, in some cases, reality lessons. As a form of cultural production, contemporary children’s literature is highly responsive to social change and political debates, and is crucially implicated in shaping the values, attitudes and behaviours of children and young people.
Abstract:
Increasingly, societies and their governments are facing important social issues that have science and technology as key features. A number of these socio-scientific issues have two features that distinguish them from the restricted contexts in which school science has traditionally been presented: some of their science is uncertain, and scientific knowledge is not the only knowledge involved. As a result, the concepts of uncertainty, risk and complexity become essential aspects of the science underlying these issues. In this chapter we discuss the nature and role of these concepts in the public understanding of science and consider their links with school science. We argue that these same concepts and their role in contemporary scientific knowledge need to be addressed in school science curricula. The new demands that this urgent challenge makes of science educators in terms of content, pedagogy and assessment are outlined. These will be essential if the goal of science education for citizenship is to be achieved with our students, who will increasingly be required to make personal and collective decisions on issues involving science and technology.
Abstract:
The estimation of phylogenetic divergence times from sequence data is an important component of many molecular evolutionary studies. There is now a general appreciation that the procedure of divergence dating is considerably more complex than that initially described in the 1960s by Zuckerkandl and Pauling (1962, 1965). In particular, there has been much critical attention toward the assumption of a global molecular clock, resulting in the development of increasingly sophisticated techniques for inferring divergence times from sequence data. In response to the documentation of widespread departures from clocklike behavior, a variety of local- and relaxed-clock methods have been proposed and implemented. Local-clock methods permit different molecular clocks in different parts of the phylogenetic tree, thereby retaining the advantages of the classical molecular clock while casting off the restrictive assumption of a single, global rate of substitution (Rambaut and Bromham 1998; Yoder and Yang 2000).
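As a minimal illustration of the strict-clock idea that these local- and relaxed-clock methods generalise, the sketch below dates a divergence from a pairwise genetic distance and a single global substitution rate; the figures are invented for illustration and are not taken from the cited studies.

```python
def strict_clock_divergence_time(genetic_distance, substitution_rate):
    """Under a single global clock, time since divergence is the pairwise
    distance divided by twice the rate (both lineages accumulate change)."""
    return genetic_distance / (2.0 * substitution_rate)

# e.g. 0.04 substitutions/site between two sequences, rate 1e-8 subs/site/year
print(strict_clock_divergence_time(0.04, 1e-8))   # 2,000,000 years
```

Local-clock methods relax this by allowing a different rate on different branches of the tree, so a single calculation like the one above no longer applies uniformly.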
Abstract:
We develop a stochastic endogenous growth model to explain the diversity in growth and inequality patterns and the non-convergence of incomes in transitional economies where an underdeveloped financial sector imposes an implicit, fixed cost on the diversification of idiosyncratic risk. In the model, endogenous growth occurs through physical and human capital deepening, with the latter being the more dominant element. We interpret the fixed cost as a ‘learning by doing’ cost for entrepreneurs who undertake risk in the absence of well-developed financial markets and institutions that help diversify such risk. As such, this cost may be interpreted as the implicit returns foregone due to the lack of diversification opportunities that would otherwise have been available, had such institutions been present. The analytical and numerical results of the model suggest three growth outcomes depending on the productivity differences between the projects and the fixed cost associated with the more productive project. We label these outcomes as poverty trap, dual economy and balanced growth. Further analysis of these three outcomes highlights the existence of a diversity within diversity. Specifically, within the ‘poverty trap’ and ‘dual economy’ scenarios, growth and inequality patterns differ depending on the initial conditions. This additional diversity allows the model to capture a richer range of outcomes that are consistent with the empirical experience of several transitional economies.
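The fixed-cost threshold logic described here can be caricatured in a few lines: a household pays the fixed cost to run the more productive project only when it can afford to, so initial wealth decides whether it stagnates or grows. The returns, savings rate and cost below are illustrative assumptions for a toy sketch, not calibrated values from the model.

```python
FIXED_COST = 2.0
HIGH_RETURN = 1.25      # gross return of the costly, more productive project
LOW_RETURN = 1.02       # gross return of the basic project
SAVINGS_RATE = 0.9

def wealth_path(wealth, periods=50):
    """Simple wealth dynamics: pay the fixed cost for the productive
    project only when wealth exceeds it, otherwise use the basic project."""
    path = [wealth]
    for _ in range(periods):
        if wealth > FIXED_COST:
            wealth = SAVINGS_RATE * (HIGH_RETURN * (wealth - FIXED_COST) + FIXED_COST)
        else:
            wealth = SAVINGS_RATE * LOW_RETURN * wealth
        path.append(wealth)
    return path

# A poor household decays towards zero (poverty trap); a richer one keeps growing.
print(wealth_path(1.0)[-1], wealth_path(30.0)[-1])
```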
Abstract:
This study compared the performance of a local optimality criterion and three robust optimality criteria in terms of the standard error for a one-parameter and a two-parameter nonlinear model with uncertainty in the parameter values. The designs were also compared in conditions where the prior parameter distribution was misspecified. The impact of different correlations between parameters on the optimal design was examined in the two-parameter model. The designs and standard errors were derived analytically whenever possible and numerically otherwise.
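A small sketch of this kind of comparison, under the assumption of a one-parameter exponential mean function E[y] = exp(-θx) with independent normal errors (an illustrative choice, not necessarily the model used in the study): the locally optimal single-point design minimises the asymptotic standard error at a prior guess of θ, a simple robust (pseudo-Bayesian) design minimises the prior-averaged standard error, and the two can then be compared when the prior guess is misspecified.

```python
import numpy as np

rng = np.random.default_rng(3)

def std_error(x, theta, sigma=1.0, n=10):
    """Asymptotic standard error of the estimate of theta from n replicates
    at design point x for the model E[y] = exp(-theta * x):
    SE = sigma / (sqrt(n) * |d E[y] / d theta|)."""
    sensitivity = x * np.exp(-theta * x)
    return sigma / (np.sqrt(n) * sensitivity)

grid = np.linspace(0.01, 5.0, 500)
theta_guess = 1.0
# Prior draws for theta, clipped to stay positive.
prior_draws = np.clip(rng.normal(theta_guess, 0.3, size=2000), 0.05, None)

# Locally optimal single-point design: minimise the SE at the prior guess.
x_local = grid[np.argmin([std_error(x, theta_guess) for x in grid])]

# Robust (pseudo-Bayesian) design: minimise the prior-averaged SE.
x_robust = grid[np.argmin([std_error(x, prior_draws).mean() for x in grid])]

# Compare both designs when the prior guess is misspecified.
theta_true = 1.5
print(x_local, x_robust)
print(std_error(x_local, theta_true), std_error(x_robust, theta_true))
```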
Abstract:
Advances in algorithms for approximate sampling from a multivariable target function have led to solutions to challenging statistical inference problems that would otherwise not be considered by the applied scientist. Such sampling algorithms are particularly relevant to Bayesian statistics, since the target function is the posterior distribution of the unobservables given the observables. In this thesis we develop, adapt and apply Bayesian algorithms, whilst addressing substantive applied problems in biology and medicine as well as other applications. For an increasing number of high-impact research problems, the primary models of interest are often sufficiently complex that the likelihood function is computationally intractable. Rather than discard these models in favour of inferior alternatives, a class of Bayesian "likelihood-free" techniques (often termed approximate Bayesian computation (ABC)) has emerged in the last few years, which avoids direct likelihood computation by repeatedly sampling data from the model and comparing observed and simulated summary statistics. In Part I of this thesis we utilise sequential Monte Carlo (SMC) methodology to develop new algorithms for ABC that are more efficient in terms of the number of model simulations required and are almost black-box, since very little algorithmic tuning is required. In addition, we address the issue of deriving appropriate summary statistics to use within ABC via a goodness-of-fit statistic and indirect inference. Another important problem in statistics is the design of experiments: that is, how one should select the values of the controllable variables in order to achieve some design goal. The presence of parameter and/or model uncertainty is a computational obstacle when designing experiments and can lead to inefficient designs if not accounted for correctly. The Bayesian framework accommodates such uncertainties in a coherent way. If the amount of uncertainty is substantial, it can be of interest to perform adaptive designs in order to accrue information to make better decisions about future design points. This is of particular interest if the data can be collected sequentially. In a sense, the current posterior distribution becomes the new prior distribution for the next design decision. Part II of this thesis creates new algorithms for Bayesian sequential design to accommodate parameter and model uncertainty using SMC. The algorithms are substantially faster than previous approaches, allowing the simulation properties of various design utilities to be investigated in a more timely manner. Furthermore, the approach offers convenient estimation of Bayesian utilities and other quantities that are particularly relevant in the presence of model uncertainty. Finally, Part III of this thesis tackles a substantive medical problem. A neurological disorder known as motor neuron disease (MND) progressively causes motor neurons to lose the ability to innervate the muscle fibres, causing the muscles to eventually waste away. When this occurs the motor unit effectively ‘dies’. There is no cure for MND, and fatality often results from a lack of muscle strength to breathe. The prognosis for many forms of MND (particularly amyotrophic lateral sclerosis (ALS)) is particularly poor, with patients usually surviving only a small number of years after the initial onset of the disease. Measuring the progress of diseases of the motor units, such as ALS, is a challenge for clinical neurologists.
Motor unit number estimation (MUNE) is an attempt to directly assess underlying motor unit loss, rather than relying on indirect techniques such as muscle strength assessment, which is generally unable to detect progression because of the body’s natural attempts at compensation. Part III of this thesis builds upon a previous Bayesian technique, which develops a sophisticated statistical model that takes into account physiological information about motor unit activation and various sources of uncertainty. More specifically, we develop a more reliable MUNE method by applying marginalisation over latent variables in order to improve the performance of a previously developed reversible jump Markov chain Monte Carlo sampler. We make other subtle changes to the model and algorithm to improve the robustness of the approach.
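For readers unfamiliar with the likelihood-free idea underpinning Part I, a basic ABC rejection sampler looks roughly like the sketch below; the Poisson model, uniform prior, mean summary statistic and tolerance are illustrative assumptions rather than the models or algorithms of the thesis (which uses sequential Monte Carlo).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setting: infer the rate of a Poisson count model while
# pretending the likelihood is unavailable.
observed = rng.poisson(lam=4.0, size=50)
obs_summary = observed.mean()            # summary statistic of the observed data

def simulate(theta, size=50):
    """Draw a synthetic data set from the model for a given parameter value."""
    return rng.poisson(lam=theta, size=size)

def abc_rejection(n_draws=20000, tol=0.1):
    """Basic ABC rejection: keep prior draws whose simulated summary
    lies within `tol` of the observed summary."""
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(0.0, 10.0)   # draw from the prior
        sim_summary = simulate(theta).mean()
        if abs(sim_summary - obs_summary) <= tol:
            accepted.append(theta)
    return np.array(accepted)

posterior_sample = abc_rejection()
print(posterior_sample.mean(), posterior_sample.std())
```

The SMC-ABC algorithms developed in the thesis aim to achieve the same target far more efficiently by reducing the number of model simulations required.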
Abstract:
Reliable pollutant build-up prediction plays a critical role in the accuracy of urban stormwater quality modelling outcomes. However, water quality data collection is resource demanding compared to streamflow data monitoring, where a greater quantity of data is generally available. Consequently, available water quality data sets span only relatively short time scales, unlike water quantity data. Therefore, the ability to take due consideration of the variability associated with pollutant processes and natural phenomena is constrained. This in turn gives rise to uncertainty in the modelling outcomes, as research has shown that pollutant loadings on catchment surfaces and rainfall within an area can vary considerably over space and time scales. Therefore, the assessment of model uncertainty is an essential element of informed decision making in urban stormwater management. This paper presents the application of a range of regression approaches, such as ordinary least squares regression, weighted least squares regression and Bayesian weighted least squares regression, for the estimation of uncertainty associated with pollutant build-up prediction using limited data sets. The study outcomes confirmed that the use of ordinary least squares regression with fixed model inputs and limited observational data may not provide realistic estimates. The stochastic nature of the dependent and independent variables needs to be taken into consideration in pollutant build-up prediction. It was found that the use of the Bayesian approach along with the Monte Carlo simulation technique provides a powerful tool, which attempts to make the best use of the available knowledge in the prediction and thereby presents a practical solution to counteract the limitations which are otherwise imposed on water quality modelling.
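To make the contrast concrete, the sketch below fits a simple linear build-up relationship by ordinary least squares and then propagates parameter uncertainty into a prediction by Monte Carlo sampling from an approximate flat-prior posterior; the data values, the linear build-up form and the flat prior are illustrative assumptions, not the paper's data set or its Bayesian weighted least squares formulation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical build-up observations: pollutant load versus antecedent dry days.
dry_days = np.array([1, 2, 3, 5, 7, 10, 14], dtype=float)
load = np.array([0.8, 1.3, 1.6, 2.1, 2.4, 2.9, 3.3])

# Design matrix for a simple linear build-up relationship.
X = np.column_stack([np.ones_like(dry_days), dry_days])

# Ordinary least squares point estimate (fixed inputs, no parameter uncertainty).
beta_ols, *_ = np.linalg.lstsq(X, load, rcond=None)

# Minimal Bayesian treatment with a flat prior: the coefficient posterior is
# approximately normal around the OLS estimate, and Monte Carlo draws
# propagate parameter uncertainty into the prediction.
resid = load - X @ beta_ols
sigma2 = resid @ resid / (len(load) - X.shape[1])
cov = sigma2 * np.linalg.inv(X.T @ X)

draws = rng.multivariate_normal(beta_ols, cov, size=5000)
x_new = np.array([1.0, 8.0])                 # predict build-up after 8 dry days
pred = draws @ x_new
print(pred.mean(), np.percentile(pred, [2.5, 97.5]))
```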
Abstract:
This volume puts together the works of a group of distinguished scholars and active researchers in the field of media and communication studies to reflect upon the past, present, and future of new media research. The chapters examine the implications of new media technologies for everyday life, existing social institutions, and society at large at various levels of analysis. Macro-level analyses of changing techno-social formations – such as discussions of the rise of the surveillance society and the "fifth estate" – are combined with studies of concrete and specific new media phenomena, such as the rise of Pro-Am collaboration and "fan labor" online. In the process, prominent concepts in the field of new media studies, such as social capital, displacement, and convergence, are critically examined, while new theoretical perspectives are proposed and explicated. Reflecting the inter-disciplinary nature of the field of new media studies and of communication research in general, the chapters interrogate the problematic through a range of theoretical and methodological approaches. The book should offer students and researchers who are interested in the social impact of new media both critical reviews of the existing literature and inspiration for developing new research questions.
Abstract:
The ultimate goal of an access control system is to allocate each user the precise level of access they need to complete their job - no more and no less. This proves to be challenging in an organisational setting. On one hand, employees need enough access to the organisation’s resources in order to perform their jobs; on the other hand, more access brings an increasing risk of misuse - either intentional, where an employee uses the access for personal benefit, or unintentional, through carelessness or being socially engineered into giving access to an adversary. This thesis investigates issues with existing approaches to access control in allocating an optimal level of access to users and proposes solutions in the form of new access control models. These issues are most evident when uncertainty surrounding users’ access needs, incentives to misuse and accountability are considered, hence the title of the thesis. We first analyse access control in environments where the administrator is unable to identify the users who may need access to resources. To resolve this uncertainty, an administrative model with delegation support is proposed. Further, a detailed technical enforcement mechanism is introduced to ensure delegated resources cannot be misused. Then we explicitly consider that users are self-interested and capable of misusing resources if they choose to. We propose a novel game-theoretic access control model to reason about, and influence, the factors that may affect users’ incentive to misuse. Next we study access control in environments where users’ access needs can neither be predicted nor can users be held accountable for misuse. It is shown that by allocating a budget to users, a virtual currency through which they can pay for the resources they deem necessary, the need for a precise pre-allocation of permissions can be relaxed. The budget also imposes an upper bound on users’ ability to misuse. A generalised budget allocation function is proposed and it is shown that, given the context information, the optimal level of budget for users can always be determined numerically. Finally, the Role Based Access Control (RBAC) model is analysed under the explicit assumption of administrators’ uncertainty about self-interested users’ access needs and their incentives to misuse. A novel Budget-oriented Role Based Access Control (B-RBAC) model is proposed. The new model introduces the notion of users’ behaviour into RBAC and provides means to influence users’ incentives. It is shown how RBAC policy can be used to individualise the cost of access to resources and also to determine users’ budgets. The implementation overheads of B-RBAC are examined and several low-cost sub-models are proposed.
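As a rough sketch of the budget idea (a virtual currency that bounds misuse), rather than of the thesis's B-RBAC model itself, the snippet below grants a request only while the user can still pay the resource's price; the resources, prices and budget are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    budget: float
    spent: float = 0.0

# Hypothetical resource prices; the thesis's individualisation of costs
# via RBAC policy is not reproduced here.
RESOURCE_COST = {"payroll_db": 5.0, "source_repo": 1.0, "hr_records": 3.0}

def request_access(user: User, resource: str) -> bool:
    """Grant access only while the user can still pay the resource's price,
    so total (mis)use is bounded by the allocated budget."""
    cost = RESOURCE_COST[resource]
    if user.spent + cost > user.budget:
        return False
    user.spent += cost
    return True

alice = User("alice", budget=6.0)
print(request_access(alice, "payroll_db"))   # True  (5.0 of 6.0 spent)
print(request_access(alice, "hr_records"))   # False (would exceed the budget)
```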
Abstract:
This paper draws upon the current situation within Japanese Higher Education. In particular, the paper focuses on educational reforms and how they relate to the notions of Yutori Kyoiku, which constituted a major attempt by Japanese education to develop individual student capacity. A clear subtext of the recent neo-liberal reform agenda is a desire to incorporate free-market ideals into the Japanese educational system. This paper raises several important problems connected to the reforms, such as the decrease in classroom hours, changes to the contents of textbooks and a growing discrepancy in academic skills between students in different localities. These education reforms have impacted on notions of Yutori Kyoiku through the continuation of nationally standardized testing and changes directed at controlling the practices of classroom teachers. While acknowledging that the current Japanese cabinet’s (DP) education policy has been inherited from an earlier LDP government, the paper points to similarities between the current reforms and the iconic Meiji era reforms of the late 1800s.
Abstract:
New venture growth is a central topic in entrepreneurship research. Although sales growth is emerging as the most commonly used measure of growth for emerging ventures, employment growth has also been used frequently. However, empirical research demonstrates that there are only very low to moderately sized correlations between the two (Delmar et al., 2003; Weinzimmer et al., 1998). In addition, sales growth and employment growth respond differently to a wide variety of criteria (Baum et al., 2001; Delmar et al., 2003). In this study we use transaction cost economics (Williamson, 1996) as a theoretical base to examine transaction cost influences on the addition of new employees as emerging ventures experience sales growth. We theorize that transaction cost economics variables will moderate the relationship between sales growth and employment growth. We develop and test hypotheses related to asset specificity, behavioral uncertainty, and the influence of resource munificence on the strength of the sales growth/employment growth relationship. Asset specificity is theorized to be a positive moderator of the relationship between sales growth and employment growth. When the behavioral uncertainty associated with adding new employees is greater than that of outsourcing or subcontracting, it is hypothesized to be a negative moderator of the sales growth/employment growth relationship. We also hypothesize that resource scarcity will strengthen those relationships.
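The moderation hypotheses translate into a regression with an interaction term; a minimal illustration on simulated data (not the study's sample or measures) is sketched below, where a positive interaction coefficient would indicate that asset specificity strengthens the sales growth/employment growth relationship.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated stand-ins for the study's variables.
n = 200
sales_growth = rng.normal(size=n)
asset_specificity = rng.normal(size=n)
employment_growth = (0.3 * sales_growth
                     + 0.1 * asset_specificity
                     + 0.2 * sales_growth * asset_specificity   # moderation effect
                     + rng.normal(scale=0.5, size=n))

# OLS with an interaction term; a positive interaction coefficient is the
# usual evidence that the moderator strengthens the relationship.
X = np.column_stack([np.ones(n), sales_growth, asset_specificity,
                     sales_growth * asset_specificity])
coef, *_ = np.linalg.lstsq(X, employment_growth, rcond=None)
print(dict(zip(["intercept", "sales_growth", "asset_specificity", "interaction"], coef)))
```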
Abstract:
Morris' (1986) analysis of the factors affecting project success and failure is considered in relation to the psychology of judgement under uncertainty. A model is proposed whereby project managers may identify the specific circumstances in which human decision-making is prone to systematic error, and hence may apply a number of de-biasing techniques.
Abstract:
Preventative health has become central to contemporary health care, with youth physical activity identified as a key factor in determining health and functioning. Schools offer a unique research setting due to distinctive methodological circumstances. However, school-based researchers face several obstacles in their endeavour to complete successful research investigations, as they are often confronted with complex research designs and methodological procedures that are not easily amenable to school contexts. The purpose of this paper is to provide a practical guide for teachers (both teacher educators and teaching practitioners) seeking to conduct physical activity-based research in Australian school settings, as well as to discuss research practices. The research enabling process has been divided into six phases: preparation; design; outcome measures; procedures; participants; and feedback. Careful planning and consideration must be undertaken prior to the commencement of, and during, the research process, due to the complex nature of school settings and research processes that exist in the Australian context.