824 results for Probabilistic decision process model
Abstract:
Business Process Management (BPM) (Dumas et al. 2013) investigates how organizations function and can be improved on the basis of their business processes. The starting point for BPM is that organizational performance is a function of process performance. Thus, BPM proposes a set of methods, techniques and tools to discover, analyze, implement, monitor and control business processes, with the ultimate goal of improving these processes. Most importantly, BPM is not just an organizational management discipline. BPM also studies how technology, and particularly information technology, can effectively support the process improvement effort. In the past two decades, the field of BPM has been the focus of extensive research, which spans an ever-widening scope and advances technology in various directions. The main international forum for state-of-the-art research in this field is the International Conference on Business Process Management, or “BPM” for short—an annual meeting of the aca ...
Abstract:
Despite extensive literature on female mate choice, empirical evidence on women’s mating preferences in the search for a sperm donor is scarce, even though this search, by isolating a male’s genetic impact on offspring from other factors like paternal investment, offers a naturally “controlled” research setting. In this paper, we work to fill this void by examining the rapidly growing online sperm donor market, which is raising new challenges by offering women novel ways to seek out donor sperm. We not only identify individual factors that influence women’s mating preferences but find strong support for the proposition that behavioural traits (inner values) are more important in these choices than physical appearance (exterior values). We also report evidence that physical factors matter more than resources or other external cues of material success, perhaps because the relevance of good character in donor selection is part of a female psychological adaptation throughout evolutionary history. The lack of evidence on a preference for material resources, on the other hand, may indicate the ability of peer socialization and better access to resources to rapidly shape the female decision process. Overall, the paper makes useful contributions to both the literature on human behaviour and that on decision-making in extreme and highly important situations.
Abstract:
Semantic priming occurs when a subject is faster in recognising a target word when it is preceded by a related word compared to an unrelated word. The effect is attributed to automatic or controlled processing mechanisms elicited by short or long interstimulus intervals (ISIs) between primes and targets. We employed event-related functional magnetic resonance imaging (fMRI) to investigate blood oxygen level dependent (BOLD) responses associated with automatic semantic priming using an experimental design identical to that used in standard behavioural priming tasks. Prime-target semantic strength was manipulated by using lexical ambiguity primes (e.g., bank) and target words related to dominant or subordinate meaning of the ambiguity. Subjects made speeded lexical decisions (word/nonword) on dominant related, subordinate related, and unrelated word pairs presented randomly with a short ISI. The major finding was a pattern of reduced activity in middle temporal and inferior prefrontal regions for dominant versus unrelated and subordinate versus unrelated comparisons, respectively. These findings are consistent with both a dual process model of semantic priming and recent repetition priming data that suggest that reductions in BOLD responses represent neural priming associated with automatic semantic activation and implicate the left middle temporal cortex and inferior prefrontal cortex in more automatic aspects of semantic processing.
Abstract:
Fusing data from multiple sensing modalities, e.g. laser and radar, is a promising approach to achieve resilient perception in challenging environmental conditions. However, this may lead to “catastrophic fusion” in the presence of inconsistent data, i.e. when the sensors do not detect the same target due to distinct attenuation properties. It is often difficult to discriminate consistent from inconsistent data across sensing modalities using local spatial information alone. In this paper we present a novel consistency test based on the log marginal likelihood of a Gaussian process model that evaluates data from range sensors in a relative manner. A new data point is deemed to be consistent if the model statistically improves as a result of its fusion. This approach avoids the need for absolute spatial distance threshold parameters as required by previous work. We report results from object reconstruction with both synthetic and experimental data that demonstrate an improvement in reconstruction quality, particularly in cases where data points are inconsistent yet spatially proximal.
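The core idea of such a consistency test can be sketched in a few lines. This is a minimal single-sensor illustration, not the paper's implementation: the RBF kernel, noise level, lengthscale, and the per-point acceptance score are all assumed choices. A candidate point's "fusion score" is the change in per-point log marginal likelihood when it is added to the model; a large drop flags the point as inconsistent.

```python
import numpy as np

def rbf_kernel(x1, x2, length=0.2, var=1.0):
    # Squared-exponential covariance between two 1-D input arrays.
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def log_marginal_likelihood(x, y, noise=1e-2):
    """Gaussian-process log marginal likelihood under an RBF kernel."""
    K = rbf_kernel(x, x) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2.0 * np.pi))

def fusion_score(x, y, x_new, y_new, noise=1e-2):
    """Change in per-point log marginal likelihood from fusing the
    candidate (x_new, y_new); consistent data should not degrade it."""
    before = log_marginal_likelihood(x, y, noise) / len(x)
    x2, y2 = np.append(x, x_new), np.append(y, y_new)
    after = log_marginal_likelihood(x2, y2, noise) / len(x2)
    return after - before
```

Because the score is relative to the model's own fit, no absolute distance threshold is needed: a point consistent with the underlying surface barely moves the score, while an inconsistent one sharply lowers it.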
Abstract:
Unlike US and Continental European jurisdictions, Australian monetary policy announcements are not followed promptly by projections materials or comprehensive summaries that explain the decision process. This information is disclosed 2 weeks later when the explanatory minutes of the Reserve Bank board meeting are released. This paper is the first study to exploit the features of the Australian monetary policy environment in order to examine the differential impact of monetary policy announcements and explanatory statements on the Australian interest rate futures market. We find that both monetary policy announcements and explanatory minutes releases have a significant impact on the implied yield and volatility of Australian interest rate futures contracts. When the differential impact of these announcements is examined using the full sample, no statistically significant difference is found. However, when the sample is partitioned based on stable periods and the Global Financial Crisis, a differential impact is evident. Further, contrary to the findings of Kim and Nguyen (2008), Lu et al. (2009), and Smales (2012a), the response along the yield curve is found not to differ between the short and medium terms.
Abstract:
Modelling fluvial processes is an effective way to reproduce basin evolution and to recreate riverbed morphology. However, due to the complexity of alluvial environments, deterministic modelling of fluvial processes is often impossible. To address the related uncertainties, we derive a stochastic fluvial process model on the basis of the convective Exner equation that uses the statistics (mean and variance) of river velocity as input parameters. These statistics allow for quantifying the uncertainty in riverbed topography, river discharge and position of the river channel. In order to couple the velocity statistics and the fluvial process model, the perturbation method is employed with a non-stationary spectral approach to develop the Exner equation as two separate equations: the first one is the mean equation, which yields the mean sediment thickness, and the second one is the perturbation equation, which yields the variance of sediment thickness. The resulting solutions offer an effective tool to characterize alluvial aquifers resulting from fluvial processes, which allows incorporating the stochasticity of the paleoflow velocity.
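The backbone of this derivation can be sketched in standard notation; the symbols below are illustrative, not necessarily the paper's exact ones. Starting from the convective Exner equation and a perturbation expansion of flow velocity and bed elevation:

```latex
% Convective Exner equation: bed elevation \eta, sediment flux q_s(u),
% bed porosity \lambda_p
(1-\lambda_p)\,\frac{\partial \eta}{\partial t}
  + \frac{\partial q_s(u)}{\partial x} = 0,
% perturbation expansion about the means
u = \bar{u} + u', \qquad \eta = \bar{\eta} + \eta'.
```

Taking the expectation of the expanded equation yields the mean equation for \(\bar{\eta}\) (mean sediment thickness); subtracting the mean equation from the full equation leaves a perturbation equation whose second moments supply the variance of sediment thickness.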
Abstract:
The aim of this project was to evaluate the cost-effectiveness of hand hygiene interventions in resource-limited hospital settings. Using data from north-east Thailand, the research found that such interventions are likely to be very cost-effective in intensive care unit settings as a result of reduced incidence of methicillin-resistant Staphylococcus aureus bloodstream infection alone. This study also found evidence showing that the World Health Organization's (WHO) multimodal intervention is effective and when adding either goal-setting, reward incentives, or accountability strategies to the WHO intervention, compliance could be further improved.
Abstract:
Abnormally high price spikes in spot electricity markets represent a significant risk to market participants. As such, a literature has developed that focuses on forecasting the probability of such spike events, moving beyond simply forecasting the level of price. Many univariate time series models have been proposed to deal with spikes within an individual market region. This paper is the first to develop a multivariate self-exciting point process model for dealing with price spikes across connected regions in the Australian National Electricity Market. The importance of the physical infrastructure connecting the regions on the transmission of spikes is examined. It is found that spikes are transmitted between the regions, and the size of spikes is influenced by the available transmission capacity. It is also found that improved risk estimates are obtained when inter-regional linkages are taken into account.
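For context, a self-exciting (Hawkes) point process is one whose intensity jumps after each event and then decays, so events cluster in time, which is exactly the behaviour of price spikes. A minimal univariate sketch via Ogata's thinning algorithm follows; the paper's model is multivariate across market regions, and the parameter values here are illustrative only.

```python
import numpy as np

def simulate_hawkes(mu=0.5, alpha=0.8, beta=1.5, horizon=50.0, seed=0):
    """Simulate a univariate Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i))
    by Ogata's thinning algorithm (stability requires alpha/beta < 1)."""
    rng = np.random.default_rng(seed)
    events = []
    t = 0.0

    def intensity(s):
        past = np.array([ti for ti in events if ti < s])
        return mu + alpha * np.sum(np.exp(-beta * (s - past)))

    while t < horizon:
        # Intensity decays between events, so the value just after t
        # (plus one kernel jump, in case t is itself an event) bounds it.
        lam_bar = intensity(t) + alpha
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            break
        if rng.uniform() <= intensity(t) / lam_bar:
            events.append(t)  # accepted: a spike event at time t
    return events
```

In the multivariate setting each region has its own intensity, and an event in one region also excites the intensities of connected regions, which is how spike transmission between regions is captured.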
Abstract:
The aim of this report was to present findings of an economic evaluation of the UP Pilot. A decision analytic model was used to examine the monetary cost of offering each of the four interventions in the UP Pilot against success measures. The evaluation also included subgroup analysis by demographic groups to offer insights into groups that are more resistant to undertaking preventive actions, with the possibility of further research to better understand client motivation for undertaking preventive behaviour. Based on the evaluation, this report makes recommendations for further investment and implementation of the UP Pilot.
Abstract:
Background Optimal infant nutrition comprises exclusive breastfeeding, with complementary foods introduced from six months of age. How parents make decisions regarding this is poorly studied. This study begins to address the dearth of research into the decision-making processes used by first-time mothers relating to the introduction of complementary foods. Methods This qualitative explorative study was conducted using interviews (13) and focus groups (3), with a semi-structured interview guide based on the Theory of Planned Behaviour (TPB). The TPB, a well-validated decision-making model, identifies the key determinants of a behaviour through behavioural beliefs, subjective norms, and perceived behavioural control over the behaviour. The theory posits that these beliefs predict behavioural intention to perform the behaviour, and ultimately performance of the behaviour itself. A purposive convenience sample of 21 metropolitan parents, recruited through advertising at local playgroups and childcare centres and electronically through the University community email list, self-selected to participate. Data were analysed thematically within the theoretical constructs: behavioural beliefs, subjective norms and perceived behavioural control. Data relating to sources of information about the introduction of complementary foods were also collected. Results Overall, first-time mothers found that waiting until six months was challenging despite knowledge of the WHO recommendations and an initial desire to comply with this guideline. Beliefs that complementary foods would assist the infants' weight gain, sleeping patterns and enjoyment at meal times were identified. Barriers preventing parents complying with the recommendations included subjective and group norms, peer influences, infant cues indicating early readiness and food labelling inconsistencies. The most valued information source was from peers who had recently introduced complementary foods.
Conclusions First-time mothers in this study did not demonstrate a good understanding of the rationale behind the WHO recommendations, nor did they understand fully the signs of readiness of infants to commence solid foods. Factors that assisted waiting until six months were a trusting relationship with a health professional whose practice and advice was consistent with the recommendations and/or when their infant was developmentally ready for complementary foods at six months and accepted them with ease and enthusiasm. Barriers preventing parents complying with the recommendations included subjective and group norms, peer influences, infant cues indicating early readiness and food labelling inconsistencies.
Abstract:
Analyzing and redesigning business processes is a complex task, which requires the collaboration of multiple actors. Current approaches focus on workshops where process stakeholders together with modeling experts create a graphical visualization of a process in a model. Within these workshops, stakeholders are mostly limited to verbal contributions, which are integrated into a process model by a modeling expert using traditional input devices. This limitation negatively affects the collaboration outcome and also the perception of the collaboration itself. In order to overcome this problem we created CubeBPM – a system that allows groups of actors to interact with process models through a touch-based interface on a large interactive touch display wall. Using this system for collaborative modeling, we expect to provide a more effective collaboration environment, thus improving modeling performance and the collaboration experience.
Abstract:
Background: A genetic network can be represented as a directed graph in which a node corresponds to a gene and a directed edge specifies the direction of influence of one gene on another. The reconstruction of such networks from transcript profiling data remains an important yet challenging endeavor. A transcript profile specifies the abundances of many genes in a biological sample of interest. Prevailing strategies for learning the structure of a genetic network from high-dimensional transcript profiling data assume sparsity and linearity. Many methods consider relatively small directed graphs, inferring graphs with up to a few hundred nodes. This work examines large undirected graph representations of genetic networks (graphs with many thousands of nodes, in which an undirected edge between two nodes does not indicate the direction of influence) and the problem of estimating the structure of such a sparse linear genetic network (SLGN) from transcript profiling data. Results: The structure learning task is cast as a sparse linear regression problem which is then posed as a LASSO (l1-constrained fitting) problem and finally solved by formulating a Linear Program (LP). A bound on the Generalization Error of this approach is given in terms of the Leave-One-Out Error. The accuracy and utility of LP-SLGNs is assessed quantitatively and qualitatively using simulated and real data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) initiative provides gold standard data sets and evaluation metrics that enable and facilitate the comparison of algorithms for deducing the structure of networks. The structures of LP-SLGNs estimated from the INSILICO1, INSILICO2 and INSILICO3 simulated DREAM2 data sets are comparable to those proposed by the first and/or second ranked teams in the DREAM2 competition. The structures of LP-SLGNs estimated from two published Saccharomyces cerevisiae cell cycle transcript profiling data sets capture known regulatory associations.
In each S. cerevisiae LP-SLGN, the number of nodes with a particular degree follows an approximate power law, suggesting that its degree distribution is similar to that observed in real-world networks. Inspection of these LP-SLGNs suggests biological hypotheses amenable to experimental verification. Conclusion: A statistically robust and computationally efficient LP-based method for estimating the topology of a large sparse undirected graph from high-dimensional data yields representations of genetic networks that are biologically plausible and useful abstractions of the structures of real genetic networks. Analysis of the statistical and topological properties of learned LP-SLGNs may have practical value; for example, genes with high random walk betweenness, a measure of the centrality of a node in a graph, are good candidates for intervention studies and hence integrated computational–experimental investigations designed to infer more realistic and sophisticated probabilistic directed graphical model representations of genetic networks. The LP-based solutions of the sparse linear regression problem described here may provide a method for learning the structure of transcription factor networks from transcript profiling and transcription factor binding motif data.
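One standard way to pose an l1-constrained sparse fit as a linear program is to split each coefficient into positive and negative parts and bound the residuals. The sketch below uses an l-infinity residual bound and `scipy.optimize.linprog`; both are illustrative choices, not necessarily the paper's exact LP formulation.

```python
import numpy as np
from scipy.optimize import linprog

def sparse_fit_lp(X, y, eps=0.1):
    """Minimise ||beta||_1 subject to ||y - X beta||_inf <= eps,
    posed as an LP via the split beta = bp - bm with bp, bm >= 0."""
    n, p = X.shape
    c = np.ones(2 * p)                       # objective: sum(bp) + sum(bm)
    A = np.vstack([np.hstack([X, -X]),       #  X beta - y <= eps
                   np.hstack([-X, X])])      # -X beta + y <= eps
    b = np.concatenate([y + eps, eps - y])
    res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (2 * p))
    bp, bm = res.x[:p], res.x[p:]
    return bp - bm
```

Because the objective and constraints are all linear, the problem scales to the many-thousands-of-nodes regime described above; in the network setting, one such regression is solved per gene, with the nonzero coefficients defining that gene's neighbours.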
Abstract:
Yao, Begg, and Livingston (1996, Biometrics 52, 992-1001) considered the optimal group size for testing a series of potentially therapeutic agents to identify a promising one as soon as possible for given error rates. The number of patients to be tested with each agent was fixed as the group size. We consider a sequential design that allows early acceptance and rejection, and we provide an optimal strategy to minimize the sample sizes (patients) required using Markov decision processes. The minimization is under the constraints of the two types (false positive and false negative) of error probabilities, with the Lagrangian multipliers corresponding to the cost parameters for the two types of errors. Numerical studies indicate that there can be a substantial reduction in the number of patients required.
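The flavour of such a design can be illustrated with a toy Bayesian reformulation; the state space, two-point prior, and cost values below are assumptions for illustration, not the authors' exact model. The Markov state is (patients tested, successes observed), each patient costs one unit, and the two error types enter as terminal penalties playing the role of the Lagrange multipliers; backward induction then yields the optimal accept/reject/continue rule.

```python
from functools import lru_cache

# Toy sequential testing MDP: the agent's response rate is either
# p0 (not promising) or p1 (promising), each with prior 0.5.
# Costs: 1 per patient tested, C_FP for accepting a p0 agent,
# C_FN for rejecting a p1 agent (Lagrangian-style error penalties).
P0, P1 = 0.2, 0.5
C_FP, C_FN = 50.0, 50.0
N_MAX = 25  # maximum number of patients per agent

def posterior_p1(n, s):
    """P(promising | s successes in n patients), two-point prior."""
    l0 = P0**s * (1 - P0) ** (n - s)
    l1 = P1**s * (1 - P1) ** (n - s)
    return l1 / (l0 + l1)

@lru_cache(maxsize=None)
def value(n, s):
    """Minimal expected cost and optimal action at state (n, s)."""
    w1 = posterior_p1(n, s)
    accept = C_FP * (1 - w1)   # expected false-positive penalty
    reject = C_FN * w1         # expected false-negative penalty
    best, action = min((accept, "accept"), (reject, "reject"))
    if n < N_MAX:
        p_succ = w1 * P1 + (1 - w1) * P0   # predictive success prob.
        cont = (1.0 + p_succ * value(n + 1, s + 1)[0]
                    + (1 - p_succ) * value(n + 1, s)[0])
        if cont < best:
            best, action = cont, "test"
    return best, action
```

Early acceptance and rejection fall out naturally: at states where the posterior is already extreme, the terminal penalty is smaller than the cost of testing another patient, so the optimal action stops the trial.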
Abstract:
Hand hygiene is the primary measure in hospitals to reduce the spread of infections, with nurses experiencing the greatest frequency of patient contact. The ‘5 critical moments’ of hand hygiene initiative has been implemented in hospitals across Australia, accompanied by awareness-raising, staff training and auditing. The aim of this study was to understand the determinants of nurses’ hand hygiene decisions, using an extension of a common health decision-making model, the theory of planned behaviour (TPB), to inform future health education strategies to increase compliance. Nurses from 50 Australian hospitals (n = 2378) completed standard TPB measures (attitude, subjective norm, perceived behavioural control [PBC], intention) and the extended variables of group norm, risk perceptions (susceptibility, severity) and knowledge (subjective, objective) at Time 1, while a sub-sample (n = 797) reported their hand hygiene behaviour 2 weeks later. Regression analyses identified subjective norm, PBC, group norm, subjective knowledge and risk susceptibility as the significant predictors of nurses’ hand hygiene intentions, with intention and PBC predicting their compliance behaviour. Rather than targeting attitudes which are already very favourable among nurses, health education strategies should focus on normative influences and perceptions of control and risk in efforts to encourage hand hygiene adherence.
Abstract:
Replacement of deteriorated water pipes is a capital-intensive activity for utility companies. Replacement planning aims to minimize total costs while maintaining a satisfactory level of service and is usually conducted for individual pipes. Scheduling replacement in groups is seen to be a better method and has the potential to provide benefits such as the reduction of maintenance costs and service interruptions. However, developing group replacement schedules is a complex task and often beyond the ability of a human expert, especially when multiple or conflicting objectives need to be catered for, such as minimization of total costs and service interruptions. This paper describes the development of a novel replacement decision optimization model for group scheduling (RDOM-GS), which enables multiple group-scheduling criteria by integrating new cost functions, a service interruption model, and optimization algorithms into a unified procedure. An industry case study demonstrates that RDOM-GS can improve replacement planning significantly and reduce costs and service interruptions.