567 results for 230201 Probability Theory
Abstract:
Theories of search and search behavior can be used to glean insights and generate hypotheses about how people interact with retrieval systems. This paper examines three such theories: the long-standing Information Foraging Theory, along with the more recently proposed Search Economic Theory and the Interactive Probability Ranking Principle. Our goal is to develop a model for ad-hoc topic retrieval using each approach, all within a common framework, in order to (1) determine what predictions each approach makes about search behavior, and (2) show the relationships, equivalences and differences between the approaches. While each approach takes a different perspective on modeling searcher interactions, we show that under certain assumptions they lead to similar hypotheses regarding search behavior. Moreover, we show that the models are complementary to each other but operate at different levels (i.e., sessions, patches and situations). We further show how the differences between the approaches lead to new insights into the theories and to new models. This contribution will not only lead to further theoretical developments but will also enable practitioners to employ one of the three equivalent models depending on the data available.
Abstract:
We consider the motion of a diffusive population on a growing domain, 0 < x < L(t), which is motivated by various applications in developmental biology. Individuals in the diffusing population, which could represent molecules or cells in a developmental scenario, undergo two different kinds of motion: (i) undirected movement, characterized by a diffusion coefficient, D, and (ii) directed movement, associated with the underlying domain growth. For a general class of problems with a reflecting boundary at x = 0 and an absorbing boundary at x = L(t), we provide an exact solution to the partial differential equation describing the evolution of the population density function, C(x,t). Using this solution, we derive an exact expression for the survival probability, S(t), and an accurate approximation for the long-time limit, S∞ = lim_{t→∞} S(t). Unlike traditional analyses on a nongrowing domain, where S∞ ≡ 0, we show that domain growth leads to a very different situation where S∞ can be positive. The theoretical tools developed and validated in this study allow us to distinguish situations where the diffusive population reaches the moving boundary at x = L(t) from other situations where the diffusive population never reaches the moving boundary. Making this distinction is relevant to certain applications in developmental biology, such as the development of the enteric nervous system (ENS). All theoretical predictions are verified by implementing a discrete stochastic model.
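The discrete stochastic model mentioned in this abstract can be sketched as a Monte Carlo random walk: particles diffuse, are advected by the stretching domain, reflect at x = 0, and are absorbed at x = L(t). The linear growth law, step sizes and all parameter values below are illustrative assumptions, not the paper's actual settings.

```python
import math
import random

def survival_fraction(D=1.0, L0=10.0, growth=0.5, x0=0.0,
                      dt=0.01, t_max=50.0, n_walkers=2000, seed=1):
    """Monte Carlo estimate of the survival probability S(t_max) for a
    diffusing particle on a growing domain 0 < x < L(t), with a
    reflecting boundary at x = 0 and an absorbing boundary at x = L(t).
    Linear growth L(t) = L0 + growth*t is an illustrative assumption."""
    rng = random.Random(seed)
    sigma = math.sqrt(2.0 * D * dt)       # diffusive step scale
    alive = 0
    for _ in range(n_walkers):
        x, t, L = x0, 0.0, L0
        survived = True
        while t < t_max:
            L_new = L0 + growth * (t + dt)
            x = x * L_new / L             # directed movement from domain growth
            x += rng.gauss(0.0, sigma)    # undirected diffusive step
            if x < 0.0:
                x = -x                    # reflect at x = 0
            if x >= L_new:
                survived = False          # absorbed at the moving boundary
                break
            L, t = L_new, t + dt
        if survived:
            alive += 1
    return alive / n_walkers
```

Running this with increasing `growth` shows the qualitative effect the abstract describes: fast enough growth leaves a positive fraction of walkers that never reach the moving boundary.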
Abstract:
This paper critiques a traditional approach to music theory pedagogy. It argues that music theory courses should draw on pedagogies that reflect the diversity and pluralism inherent in 21st-century music making. It presents the findings of an action research project investigating the experiences of undergraduate students undertaking an innovative contemporary art music theory course. It describes the students' struggle in coming to terms with a course that integrated composing, performing, listening and analysing, coupled with what for many was their first exposure to the diversity of contemporary art music. The paper concludes by suggesting that the approach could be adopted more widely throughout music programs.
Abstract:
We report on ongoing research to develop a design theory for classes of information systems that allow for work practices that exhibit a minimal harmful impact on the natural environment. We call such information systems Green IS. In this paper we describe the building blocks of our Green IS design theory, which develops prescriptions for information systems that allow for: (1) belief formation, action formation and outcome measurement relating to (2) environmentally sustainable work practices and environmentally sustainable decisions on (3) a macro or micro level. For each element, we specify structural features, symbolic expressions, user abilities and goals required for the affordances to emerge. We also provide a set of testable propositions derived from our design theory and declare two principles of implementation.
Abstract:
This research examined the implementation of clinical information system technology in a large Saudi Arabian health care organisation. The research was underpinned by symbolic interactionism, and grounded theory methods informed data collection and analysis. Observations, a review of policy documents and 38 interviews with registered nurses produced in-depth data. Analysis generated three abstracted concepts that explained how imported technology increased practice and health care complexity rather than enhancing quality patient care. The core category, Disseminating Change, also depicted a hierarchical and patriarchal culture that shaped the implementation process at the levels of government, organisation and the individual.
Abstract:
Anticipating the number and identity of bidders has a significant influence on many theoretical results of an auction and on bidders' bidding behaviour. When a bidder knows in advance which specific bidders are likely competitors, that knowledge gives the company a head start when setting the bid price. However, despite these competitive implications, most previous studies have focused almost entirely on forecasting the number of bidders, and only a few authors have dealt with the identity dimension, and then only qualitatively. Using a case study with immediate real-life applications, this paper develops a method for estimating every potential bidder's probability of participating in a future auction as a function of the tender's economic size, removing the bias caused by the distribution of contract size opportunities. This way, a bidder or auctioneer will be able to estimate the likelihood that a specific group of key, previously identified bidders will participate in a future tender.
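One simple way to model a bidder's participation probability as a function of tender size, in the spirit of this abstract, is a logistic curve fitted to that bidder's past participation record. This is a minimal sketch, not the paper's actual method; the model form (sigmoid of log tender size) and the training data in the test are hypothetical.

```python
import math

def fit_logistic(sizes, participated, lr=0.1, epochs=5000):
    """Fit P(participate) = sigmoid(b0 + b1*log(size)) by gradient ascent
    on the log-likelihood. `sizes` are tender economic sizes; `participated`
    is 1 if the bidder entered that auction, else 0."""
    b0, b1 = 0.0, 0.0
    xs = [math.log(s) for s in sizes]
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, participated):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)                 # gradient w.r.t. intercept
            g1 += (y - p) * x             # gradient w.r.t. slope
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def participation_probability(size, b0, b1):
    """Predicted probability that this bidder enters a tender of `size`."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(size))))
```

Fitting one such curve per previously identified key bidder would give the kind of per-bidder participation estimate the abstract describes.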
Abstract:
Innovation enables organisations to endure by responding to emergence and to improve efficiency. Innovation in a complex organisation can be difficult because organisational complexity slows decision-making. Complex projects fail due to an inability to respond to emergence, which consumes finances and impacts resources and organisational success. Therefore, for complex organisations to improve performance and resilience, it would be advantageous to understand how to improve the management of innovation and thus the ability to respond to emergence. The benefits to managers are an increase in the number of successful projects and improved productivity. This study will explore innovation management in a complex project-based organisation. The contribution to the academic literature will be an in-depth, qualitative exploration of innovation in a complex project-based organisation using a comparative case study approach.
Abstract:
Accurate determination of same-sex twin zygosity is important for medical, scientific and personal reasons. Determination may be based upon questionnaire data, blood groups, enzyme isoforms and fetal membrane examination, but assignment of zygosity must ultimately be confirmed by genotypic data. Here, methods are reviewed for calculating the average probability of correctly concluding that a twin pair is monozygotic, given that they share the same genotypes across all loci for commonly used multiplex short tandem repeat (STR) kits.
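The core calculation behind such methods is a Bayesian update: monozygotic (MZ) twins match at every locus with probability 1, so the posterior probability of monozygosity depends only on the prior and on the chance that dizygotic (DZ) twins would match at all loci. The sketch below assumes the per-locus DZ match probabilities are already known (in practice they come from population allele frequencies of the STR kit in use); the values in the test are invented.

```python
def prob_monozygotic(dz_match_probs, prior_mz=0.5):
    """Posterior probability that a fully concordant same-sex twin pair is
    monozygotic, given per-locus probabilities that DZ twins would share
    the same genotype by chance.

    P(MZ | match at all loci)
        = prior_mz / (prior_mz + (1 - prior_mz) * prod_i P(match_i | DZ)),
    since P(match at all loci | MZ) = 1."""
    p_match_dz = 1.0
    for p in dz_match_probs:
        p_match_dz *= p               # loci assumed independent (unlinked)
    return prior_mz / (prior_mz + (1.0 - prior_mz) * p_match_dz)
```

With typical multiplex kits of 15 or more loci, the product term is tiny, which is why a full-profile match gives near-certain assignment of monozygosity.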
Abstract:
I agree with Costanza and Finkelstein (2015) that it is futile to further invest in the study of generational differences in the work context due to a lack of appropriate theory and methods. The key problem with the generations concept is that splitting continuous variables such as age or time into a few discrete units involves arbitrary cutoffs and atheoretical groupings of individuals (e.g., stating that all people born between the early 1960s and early 1980s belong to Generation X). As noted by methodologists, this procedure leads to a loss of information about individuals and reduced statistical power (MacCallum, Zhang, Preacher, & Rucker, 2002). Due to these conceptual and methodological limitations, I regard it as very difficult, if not impossible, to develop a "comprehensive theory of generations" (Costanza & Finkelstein, p. 20) and to rigorously examine generational differences at work in empirical studies.
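The information loss from dichotomization that MacCallum et al. (2002) describe is easy to demonstrate with a small simulation: correlate an outcome with a continuous predictor, then with an arbitrary median split of the same predictor, and compare. The data below are simulated purely for illustration, not drawn from any real study.

```python
import math
import random

def corr(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def dichotomization_demo(n=5000, seed=7):
    """Simulate a continuous 'age' effect on an outcome, then split age at
    the median (an arbitrary cutoff, as with generational labels).
    Returns (r_continuous, r_dichotomized); the split correlation is
    systematically smaller, illustrating the loss of information."""
    rng = random.Random(seed)
    ages = [rng.uniform(20, 60) for _ in range(n)]
    ys = [0.05 * a + rng.gauss(0, 1) for a in ages]
    cutoff = sorted(ages)[n // 2]                  # arbitrary median split
    groups = [1.0 if a >= cutoff else 0.0 for a in ages]
    return corr(ages, ys), corr(groups, ys)
```

The shrunken correlation for the grouped predictor translates directly into reduced statistical power, which is the methodological objection raised in this commentary.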
Abstract:
Background: Excessive speed is a primary contributing factor to young novice road trauma, including intentional and unintentional speeds above posted limits or too fast for conditions. The objective of this research was to conduct a systematic review of recent investigations into novice drivers' speed selection, with particular attention to applications and limitations of theory and methodology.
Method: Systematic searches of peer-reviewed and grey literature were conducted during September 2014. Abstract reviews identified 71 references potentially meeting the selection criteria: investigations since the year 2000 into factors that influence (directly or indirectly) the actual speed (i.e., behaviour or performance) of young (age <25 years) and/or novice (recently licensed) drivers.
Results: Full paper reviews resulted in 30 final references: 15 focused on intentional speeding and 15 on broader speed selection investigations. Both sets identified a range of individual (e.g., beliefs, personality) and social (e.g., peer, adult) influences, were predominantly theory-driven, and applied cross-sectional designs. Intentional speeding investigations largely utilised self-reports, while other investigations more often included actual driving (simulated or 'real world'). The latter also identified cognitive workload and external environment influences, as well as targeted interventions.
Discussion and implications: Applications of theory have shifted the novice speed-related literature beyond a simplistic focus on intentional speeding as human error. The potential emerged to develop a 'grand theory' of intentional speeding and to fill gaps in understanding broader speed selection influences. This includes the need for future investigations of vehicle-related and physical environment-related influences, and for methodologies that move beyond cross-sectional designs and rely less on self-reports.
Abstract:
The adequacy and efficiency of existing legal and regulatory frameworks dealing with corporate phoenix activity have been repeatedly called into question over the past two decades through various reviews, inquiries, targeted regulatory operations and the implementation of piecemeal legislative reform. Despite these efforts, phoenix activity does not appear to have abated. While there is no law in Australia that declares 'phoenix activity' to be illegal, the behaviour that tends to manifest in phoenix activity is capable of transgressing a vast array of law, including, for example, corporate law, tax law and employment law. This paper explores the notion that the persistence of phoenix activity, despite the sheer extent of this law, suggests that the law is not acting as powerfully as it might as a deterrent. Economic theories of entrepreneurship and innovation can to some extent explain why this is the case, and they also offer a sound basis for the evaluation and reconsideration of the existing law. The challenges facing key regulators are significant. Phoenix activity is not limited to a particular corporate demographic: it occurs in SMEs, large companies and corporate groups. The range of behaviour that can amount to phoenix activity is so broad that not all phoenix activity is illegal. This paper will consider regulatory approaches to these challenges via analysis of approaches to the detection and enforcement of the underlying law capturing illegal phoenix activity. Remedying the mischief of phoenix activity is of practical importance. The benefits include continued confidence in our economy, law that inspires best practice among directors, and law that is articulated in a manner such that penalties act as a sufficient deterrent and the regulatory system is able to detect offenders and bring them to account. Any further reforms must accommodate and tolerate legal phoenix activity, at least to some extent. Even then, phoenix activity pushes tolerance of repeated entrepreneurial failure to its absolute limit. The more limited liability is misused and abused, the stronger the argument for placing restrictions on access to limited liability. This paper proposes that such an approach is a legitimate next step for a robust and mature capitalist economy.
Abstract:
Sampling design is critical to the quality of quantitative research, yet it does not always receive appropriate attention in nursing research. The current article details how balancing probability techniques with practical considerations produced a representative sample of Australian nursing homes (NHs). Budgetary, logistical, and statistical constraints were managed by excluding some NHs (e.g., those too difficult to access) from the sampling frame; a stratified, random sampling methodology yielded a final sample of 53 NHs from a population of 2,774. In testing the adequacy of representation of the study population, chi-square tests for goodness of fit generated nonsignificant results for distribution by distance from major city and type of organization. A significant result for state/territory was expected and was easily corrected for by the application of weights. The current article provides recommendations for conducting high-quality, probability-based samples and stresses the importance of testing the representativeness of achieved samples.
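The two steps this abstract describes, proportionally allocated stratified random sampling followed by a chi-square goodness-of-fit check of representativeness, can be sketched as follows. The sampling frame, stratum labels and sample size in the test are hypothetical stand-ins for the study's 2,774 nursing homes.

```python
import random
from collections import Counter

def stratified_sample(frame, strata_key, n_total, seed=3):
    """Draw a proportionally allocated stratified random sample from a
    sampling frame given as a list of dicts, stratifying on `strata_key`."""
    rng = random.Random(seed)
    pop_counts = Counter(r[strata_key] for r in frame)
    N = len(frame)
    sample = []
    for stratum, count in pop_counts.items():
        members = [r for r in frame if r[strata_key] == stratum]
        k = max(1, round(n_total * count / N))   # proportional allocation
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample

def chi_square_gof(sample, frame, strata_key):
    """Chi-square goodness-of-fit statistic comparing the sample's stratum
    distribution to the population's: sum of (obs - exp)^2 / exp."""
    pop_counts = Counter(r[strata_key] for r in frame)
    obs = Counter(r[strata_key] for r in sample)
    n, N = len(sample), len(frame)
    return sum((obs[s] - n * c / N) ** 2 / (n * c / N)
               for s, c in pop_counts.items())
```

A statistic below the critical value for the relevant degrees of freedom supports the claim that the achieved sample represents the population on that characteristic; a significant result can be handled by weighting, as the article notes for state/territory.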
Abstract:
We have shown that novel synthesis methods, combined with careful evaluation of DFT phonon calculations, provide new insight into boron compounds, including a capacity to predict Tc for AlB2-type superconductors.
Abstract:
This chapter challenges current approaches to defining the context and process of entrepreneurship education. In modeling our classrooms as a microcosm of the world our current and future students will enter, this chapter brings to life (and celebrates) the ever-present diversity found within. The chapter attempts to make an important (and unique) contribution to the field of enterprise education by illustrating how we can determine the success of (1) our efforts as educators, (2) our students, and (3) our various teaching methods. The chapter is based on two specific premises, the most fundamental being the assertion that the performance of student, educator and institution can only be accounted for by accepting the nature of the dialogic relationship between the student and educator and between the educator and institution. A second premise is that at any moment in time the educator can be assessed as being either efficient or inefficient, due to the presence of observable heterogeneity in the learning environment that produces differential learning outcomes. This chapter claims that understanding and appreciating the nature of heterogeneity in our classrooms provides an avenue for improvement in all facets of learning and teaching. To explain this claim, Haskell's (1949) theory of coaction is resurrected to provide a lens through which all manner of interaction occurring within all forms of educational contexts can be explained. Haskell (1949) asserted that coaction theory had three salient features.
Abstract:
A central tenet of reliability modelling theory is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, for many organisations, different aspects of these data are recorded in different databases (e.g., work order notifications, event logs, condition monitoring data and process control data). These recorded data cannot be interpreted individually, since they typically do not contain all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for extracting failure and preventive maintenance times from commonly available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of the maintenance event. Using these keywords, a Naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive. The accuracy of the algorithm is assessed and the classified failure-time data are then presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
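The classification step this abstract describes can be sketched as a multinomial Naive Bayes model over keywords in work-order text. This is a minimal illustration of the technique, not the paper's implementation; the training records in the test are invented examples, not the Australian utility's data.

```python
import math
from collections import Counter, defaultdict

class NaiveBayesStoppage:
    """Multinomial Naive Bayes with Laplace smoothing, classifying each
    maintenance stoppage as 'failure' or 'preventive' from keywords in
    its free-text description."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)   # label -> word frequencies
        self.class_counts = Counter()             # label -> record count
        self.vocab = set()

    def train(self, records):
        """records: iterable of (text, label) pairs."""
        for text, label in records:
            words = text.lower().split()
            self.class_counts[label] += 1
            self.word_counts[label].update(words)
            self.vocab.update(words)

    def predict(self, text):
        """Return the label with the highest posterior log-probability."""
        words = text.lower().split()
        total = sum(self.class_counts.values())
        best, best_lp = None, -math.inf
        for label, c in self.class_counts.items():
            lp = math.log(c / total)              # class prior
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for w in words:                       # smoothed word likelihoods
                lp += math.log((self.word_counts[label][w] + 1) / denom)
            if lp > best_lp:
                best, best_lp = label, lp
        return best
```

Each classified stoppage then contributes either a failure time or a preventive maintenance time to the downstream reliability model.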