824 results for "Probabilistic decision process model"


Relevance: 100.00%

Publisher:

Abstract:

This paper uses a practice perspective to study coordinating as dynamic activities that are continuously created and modified in order to enact organizational relationships and activities. It is based on the case of Servico, an organization undergoing a major restructuring of its value chain in response to a change in government regulation. In our case, the actors iterate between the abstract concept of a coordinating mechanism referred to as end-to-end management and its performance in practice. They do this via five performative–ostensive cycles: (1) enacting disruption, (2) orienting to absence, (3) creating elements, (4) forming new patterns, and (5) stabilizing new patterns. These cycles and the relationships between them constitute a process model of coordinating. This model highlights the importance of absence in the coordinating process and demonstrates how experiencing absence shapes subsequent coordinating activity.

Relevance: 100.00%

Publisher:

Abstract:

This thesis addresses the question of how business schools established as public-private partnerships (PPPs) within a regional university in the English-speaking Caribbean survived for over twenty-one years and achieved legitimacy in their environment. The aim of the study was to examine how public and private sector actors contributed to the evolution of the PPPs. A social network perspective provided a broad relational focus from which to explore the phenomenon and engage disciplinary and middle-range theories to develop explanations. Legitimacy theory provided an appropriate performance dimension from which to assess PPP success. An embedded multiple-case research design, with three case sites analysed at three levels (the country and university environment, the PPP as a firm, and the subgroup level), constituted the methodological framing of the research process. The analysis techniques included four methods but relied primarily on discourse and social network analysis of interview data from 40 respondents across the three sites. A staged analysis of the evolution of the firm provided the ‘time and effects’ antecedents which formed the basis for sense-making to arrive at explanations of the public-private relationship-influenced change. A conceptual model guided the study, and explanations from the cross-case analysis were used to refine the process model and develop a dynamic framework and set of theoretical propositions that would underpin explanations of PPP success and legitimacy in matched contexts through analytical generalisation. The study found that PPP success was based on different models of collaboration and partner resource contribution that arose from a confluence of variables, including the development of shared purpose, private voluntary control in corporate governance mechanisms and boundary-spanning leadership.
The study contributes a contextual theory that explains how PPPs work and a research agenda of ‘corporate governance as inspiration’ from a sociological perspective of ‘liquid modernity’. Recommendations for policy and management practice were developed.

Relevance: 100.00%

Publisher:

Abstract:

OBJECTIVE: To determine the accuracy, acceptability and cost-effectiveness of polymerase chain reaction (PCR) and optical immunoassay (OIA) rapid tests for maternal group B streptococcal (GBS) colonisation at labour. DESIGN: A test accuracy study was used to determine the accuracy of rapid tests for GBS colonisation of women in labour. Acceptability of testing to participants was evaluated through a questionnaire administered after delivery, and acceptability to staff through focus groups. A decision-analytic model was constructed to assess the cost-effectiveness of various screening strategies. SETTING: Two large obstetric units in the UK. PARTICIPANTS: Women booked for delivery at the participating units other than those electing for a Caesarean delivery. INTERVENTIONS: Vaginal and rectal swabs were obtained at the onset of labour and the results of vaginal and rectal PCR and OIA (index) tests were compared with the reference standard of enriched culture of combined vaginal and rectal swabs. MAIN OUTCOME MEASURES: The accuracy of the index tests, the relative accuracies of tests on vaginal and rectal swabs and whether test accuracy varied according to the presence or absence of maternal risk factors. RESULTS: PCR was significantly more accurate than OIA for the detection of maternal GBS colonisation. Combined vaginal or rectal swab index tests were more sensitive than either test considered individually [combined swab sensitivity for PCR 84% (95% CI 79-88%); vaginal swab 58% (52-64%); rectal swab 71% (66-76%)]. The highest sensitivity for PCR came at the cost of lower specificity [combined specificity 87% (95% CI 85-89%); vaginal swab 92% (90-94%); rectal swab 92% (90-93%)]. The sensitivity and specificity of rapid tests varied according to the presence or absence of maternal risk factors, but not consistently. PCR results were determinants of neonatal GBS colonisation, but maternal risk factors were not. 
Overall levels of acceptability for rapid testing amongst participants were high. Vaginal swabs were more acceptable than rectal swabs. South Asian women were least likely to have participated in the study and were less happy with the sampling procedure and with the prospect of rapid testing as part of routine care. Midwives were generally positive towards rapid testing but had concerns that it might lead to overtreatment and unnecessary interference in births. Modelling analysis revealed that the most cost-effective strategy was to provide routine intravenous antibiotic prophylaxis (IAP) to all women without screening. Removing this strategy, which is unlikely to be acceptable to most women and midwives, resulted in screening, based on a culture test at 35-37 weeks' gestation, with the provision of antibiotics to all women who screened positive being most cost-effective, assuming that all women in premature labour would receive IAP. The results were sensitive to very small increases in costs and changes in other assumptions. Screening using a rapid test was not cost-effective based on its current sensitivity, specificity and cost. CONCLUSIONS: Neither rapid test was sufficiently accurate to recommend it for routine use in clinical practice. IAP directed by screening with enriched culture at 35-37 weeks' gestation is likely to be the most acceptable cost-effective strategy, although it is premature to suggest the implementation of this strategy at present.
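The decision-analytic comparison described above can be sketched as a toy expected-cost calculation. The sensitivity and specificity below are the combined-swab PCR figures from the abstract (84% and 87%); the colonisation prevalence and unit costs are invented placeholders, not values from the study.

```python
def strategy_outcomes(prevalence, sensitivity, specificity, test_cost, iap_cost, n=1000):
    """Expected total cost and number of colonised women treated, per n
    women, for a test-and-treat strategy (all positives receive IAP)."""
    colonised = prevalence * n
    true_pos = sensitivity * colonised
    false_pos = (1 - specificity) * (n - colonised)
    cost = n * test_cost + (true_pos + false_pos) * iap_cost
    return cost, true_pos

# Sensitivity/specificity: combined-swab PCR from the abstract (84%, 87%).
# Prevalence and unit costs are hypothetical placeholders.
prev = 0.21
pcr_cost, pcr_detected = strategy_outcomes(prev, 0.84, 0.87, test_cost=30, iap_cost=15)

# Comparator: universal IAP with no test at all.
treat_all_cost = 1000 * 15
treat_all_detected = prev * 1000   # every colonised woman is covered

print(round(pcr_cost, 1), round(pcr_detected, 1))
print(treat_all_cost, round(treat_all_detected, 1))
```

With these placeholder costs, universal IAP covers every colonised woman at a lower total cost than test-and-treat, which echoes the modelling result that screening-free IAP dominated even though it is unlikely to be acceptable in practice.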

Relevance: 100.00%

Publisher:

Abstract:

Background: Screening for congenital heart defects (CHDs) relies on antenatal ultrasound and postnatal clinical examination; however, life-threatening defects often go undetected. Objective: To determine the accuracy, acceptability and cost-effectiveness of pulse oximetry as a screening test for CHDs in newborn infants. Design: A test accuracy study determined the accuracy of pulse oximetry. Acceptability of testing to parents was evaluated through a questionnaire, and to staff through focus groups. A decision-analytic model was constructed to assess cost-effectiveness. Setting: Six UK maternity units. Participants: These were 20,055 asymptomatic newborns at ≥ 35 weeks’ gestation, their mothers and health-care staff. Interventions: Pulse oximetry was performed prior to discharge from hospital and the results of this index test were compared with a composite reference standard (echocardiography, clinical follow-up and follow-up through interrogation of clinical databases). Main outcome measures: Detection of major CHDs – defined as causing death or requiring invasive intervention up to 12 months of age (subdivided into critical CHDs causing death or intervention before 28 days, and serious CHDs causing death or intervention between 1 and 12 months of age); acceptability of testing to parents and staff; and the cost-effectiveness in terms of cost per timely diagnosis. Results: Fifty-three of the 20,055 babies screened had a major CHD (24 critical and 29 serious), a prevalence of 2.6 per 1000 live births. Pulse oximetry had a sensitivity of 75.0% [95% confidence interval (CI) 53.3% to 90.2%] for critical cases and 49.1% (95% CI 35.1% to 63.2%) for all major CHDs. When 23 cases were excluded, in which a CHD was already suspected following antenatal ultrasound, pulse oximetry had a sensitivity of 58.3% (95% CI 27.7% to 84.8%) for critical cases (12 babies) and 28.6% (95% CI 14.6% to 46.3%) for all major CHDs (35 babies).
False-positive (FP) results occurred in 1 in 119 babies (0.84%) without major CHDs (specificity 99.2%, 95% CI 99.0% to 99.3%). However, of the 169 FPs, there were six cases of significant but not major CHDs and 40 cases of respiratory or infective illness requiring medical intervention. The prevalence of major CHDs in babies with normal pulse oximetry was 1.4 (95% CI 0.9 to 2.0) per 1000 live births, as 27 babies with major CHDs (6 critical and 21 serious) were missed. Parent and staff participants were predominantly satisfied with screening, perceiving it as an important test to detect ill babies. There was no evidence that mothers given FP results were more anxious after participating than those given true-negative results, although they were less satisfied with the test. White British/Irish mothers were more likely to participate in the study, and were less anxious and more satisfied than those of other ethnicities. The incremental cost-effectiveness ratio of pulse oximetry plus clinical examination compared with examination alone is approximately £24,900 per timely diagnosis in a population in which antenatal screening for CHDs already exists. Conclusions: Pulse oximetry is a simple, safe, feasible test that is acceptable to parents and staff and adds value to existing screening. It is likely to identify cases of critical CHDs that would otherwise go undetected. It is also likely to be cost-effective given current acceptable thresholds. The detection of other pathologies, such as significant CHDs and respiratory and infective illnesses, is an additional advantage. Other pulse oximetry techniques, such as perfusion index, may enhance detection of aortic obstructive lesions.
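The headline accuracy figures above imply a low positive predictive value, which can be reconstructed with a few lines of arithmetic. The inputs (prevalence 2.6 per 1000 live births, sensitivity 49.1% for all major CHDs, specificity 99.2%) come directly from the abstract; the cohort size of 100,000 is just a convenient round number for scaling.

```python
def screening_counts(n, prevalence, sensitivity, specificity):
    """Expected true/false positives and negatives for n babies screened."""
    affected = n * prevalence
    tp = affected * sensitivity            # major CHDs flagged by oximetry
    fn = affected - tp                     # major CHDs missed
    fp = (n - affected) * (1 - specificity)
    tn = (n - affected) - fp
    return tp, fp, fn, tn

# Abstract figures: prevalence 2.6/1000, sensitivity 49.1% (all major CHDs),
# specificity 99.2%. Cohort size is an arbitrary round number.
tp, fp, fn, tn = screening_counts(100_000, 0.0026, 0.491, 0.992)
ppv = tp / (tp + fp)
print(round(tp), round(fp), round(ppv, 3))
```

Roughly one in seven positives would be a true major CHD; as the abstract notes, though, many false positives were babies with other significant pathology (significant CHDs, respiratory or infective illness), so a positive result is often clinically useful anyway.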

Relevance: 100.00%

Publisher:

Abstract:

This paper explains how dynamic client portfolios can be a source of ambidexterity (i.e., exploration and exploitation) for knowledge-intensive firms (KIFs). Drawing from a unique qualitative dataset of firms in the global reinsurance market, we show how different types of client relationships underpin a dynamic client portfolio and become a source of ambidexterity for a KIF. We develop a process model to show how KIFs attain knowledge by segmenting their client portfolios, use that knowledge to explore and exploit within and across their client relationships, and dynamically adjust their client portfolios over time. Our study contributes to the literature on external sources of ambidexterity and dynamic management of client knowledge within KIFs.

Relevance: 100.00%

Publisher:

Abstract:

It has been reported that high-speed communication network traffic exhibits both long-range dependence (LRD) and burstiness, which poses new challenges in network engineering. While many models have been studied for capturing traffic LRD, they cannot efficiently capture the traffic's impulsiveness. It is therefore desirable to develop a model that captures both LRD and burstiness. In this letter, we propose a truncated α-stable LRD process model for this purpose, which characterizes both LRD and burstiness accurately. A procedure is further developed to estimate the model parameters from real traffic. Simulations demonstrate that the proposed model is more accurate than existing models and is flexible in capturing the characteristics of high-speed network traffic. © 2012 Springer-Verlag GmbH.
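The letter itself is not reproduced here, but its core ingredient, truncated symmetric α-stable noise, is easy to sketch. The sampler below uses the standard Chambers–Mallows–Stuck construction and then clips the tails; the parameter values are arbitrary, and the long-memory filtering that a full LRD traffic model would also need is omitted.

```python
import numpy as np

def symmetric_alpha_stable(alpha, size, rng):
    """Chambers-Mallows-Stuck sampler for symmetric alpha-stable variates."""
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1 / alpha)
            * (np.cos((1 - alpha) * V) / W) ** ((1 - alpha) / alpha))

def truncated_alpha_stable(alpha, bound, size, rng):
    """Clip the heavy tails so moments stay finite, as in truncated-stable
    traffic models; the clipped body keeps the bursty behaviour."""
    return np.clip(symmetric_alpha_stable(alpha, size, rng), -bound, bound)

rng = np.random.default_rng(1)
x = truncated_alpha_stable(alpha=1.5, bound=50.0, size=100_000, rng=rng)
gauss = rng.normal(size=100_000)

# Burstiness: the stable sample puts far more mass beyond +-3 than unit
# Gaussian noise does, while truncation bounds the extremes at +-50.
print((np.abs(x) > 3).mean(), (np.abs(gauss) > 3).mean())
```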

Relevance: 100.00%

Publisher:

Abstract:

This paper describes work conducted as a joint collaboration between the Virtual Design Team (VDT) research group at Stanford University (USA), the Systems Engineering Group (SEG) at De Montfort University (UK) and Elipsis Ltd. We describe a new docking methodology in which we combine the use of two radically different types of organizational simulation tool. The VDT simulation tool operates on a standalone computer, and employs computational agents during simulated execution of a pre-defined process model (Kunz, 1998). The other software tool, DREAMS, operates over a standard TCP/IP network, and employs human agents (real people) during a simulated execution of a pre-defined process model (Clegg, 2000).

Relevance: 100.00%

Publisher:

Abstract:

Guided by theory in both the trust and leadership domains, the overarching aim of this thesis was to answer a fundamental question: namely, how and when does trust-building between leaders and followers enhance leader-member exchange (LMX) development and organisational trust? Although trust is considered to be at the crux of the leader-follower relationship, surprisingly little theoretical or empirical attention has been devoted to understanding the precise nature of this relationship. By integrating both a typology of trustworthy behaviour and a process model of trust development with LMX theory, study one developed and tested a new model of LMX development with leader-follower trust-building as the primary mechanism. In a three-wave cross-lagged design, 294 student dyads in a business simulation completed measures of trust perceptions and LMX across the first 6 months of the LMX relationship. Trust-building was found to account for unexplained variance in the LMX construct over time, while controlling for initial relationship quality, thus confirming the critical role of the trust-building process in LMX development. The strongest evidence was found for the role of integrity-based trust-building behaviour, albeit only when such behaviour was not attributed to insincere motives. The results for ability- and benevolence-based trustworthy behaviour revealed valuable insights into the developmental nature of trustworthiness perceptions within LMX relationships. Thus, the pattern of results in study one provided a more comprehensive and nuanced understanding of the dynamic interplay between trust and LMX. In study two, leader trust-building was investigated cross-sectionally within an organisational sample of 201 employees. The central aim of this study was to investigate whether leader trust-building within leader-follower relationships could be leveraged for organisational trust.
As expected, the trust-building process instigated by members in study one was replicated for leaders in study two. In addition, the results were most consistent for benevolence-based trust-building, whereas both integrity- and ability-based trust-building were moderated by the position of the leader within the organisation’s hierarchy. Overall, the findings of this thesis shed considerable light on the richness of trusting perceptions in organisations, and on the critical role of trust-building in LMX development and organisational trust.

Relevance: 100.00%

Publisher:

Abstract:

Optimal design for parameter estimation in Gaussian process regression models with input-dependent noise is examined. The motivation stems from the area of computer experiments, where computationally demanding simulators are approximated using Gaussian process emulators to act as statistical surrogates. In the case of stochastic simulators, which produce a random output for a given set of model inputs, repeated evaluations are useful, supporting the use of replicate observations in the experimental design. The findings are also applicable to the wider context of experimental design for Gaussian process regression and kriging. Designs are proposed with the aim of minimising the variance of the Gaussian process parameter estimates. A heteroscedastic Gaussian process model is presented which allows for an experimental design technique based on an extension of Fisher information to heteroscedastic models. It is empirically shown that the error of the approximation of the parameter variance by the inverse of the Fisher information is reduced as the number of replicated points is increased. Through a series of simulation experiments on both synthetic data and a systems biology stochastic simulator, optimal designs with replicate observations are shown to outperform space-filling designs both with and without replicate observations. Guidance is provided on best practice for optimal experimental design for stochastic response models. © 2013 Elsevier Inc. All rights reserved.
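A minimal version of the Fisher-information design criterion can be sketched for a one-dimensional GP with an RBF kernel and constant (rather than input-dependent) noise; the kernel parameters, designs and point budget below are arbitrary illustrations, not the paper's setup. The score is the standard expression I_ij = 0.5 * tr(K^-1 dK/dθ_i K^-1 dK/dθ_j); a design is preferred when the corresponding diagonal entries of I^-1 (the approximate parameter variances) are smaller.

```python
import numpy as np

def rbf_kernel(X, lengthscale, variance=1.0):
    d2 = (X[:, None] - X[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def fisher_information(X, lengthscale, noise):
    """Fisher information for (lengthscale, noise variance) of a 1-D
    zero-mean GP, via I_ij = 0.5 tr(K^-1 dK_i K^-1 dK_j)."""
    K = rbf_kernel(X, lengthscale) + noise * np.eye(len(X))
    Kinv = np.linalg.inv(K)
    d2 = (X[:, None] - X[None, :]) ** 2
    dK_dl = rbf_kernel(X, lengthscale) * d2 / lengthscale**3  # d/d lengthscale
    dK_dn = np.eye(len(X))                                    # d/d noise variance
    derivs = [dK_dl, dK_dn]
    I = np.empty((2, 2))
    for i, Di in enumerate(derivs):
        for j, Dj in enumerate(derivs):
            I[i, j] = 0.5 * np.trace(Kinv @ Di @ Kinv @ Dj)
    return I

# Same budget of 8 runs: a space-filling design vs 4 sites with 2 replicates.
designs = {
    "space-filling": np.linspace(0.0, 1.0, 8),
    "replicated":    np.repeat(np.linspace(0.0, 1.0, 4), 2),
}
for name, X in designs.items():
    I = fisher_information(X, lengthscale=0.3, noise=0.1)
    var_noise = np.linalg.inv(I)[1, 1]   # approx variance of noise estimate
    print(name, round(var_noise, 4))
```

Printing the approximate noise-parameter variance for each candidate lets the designs be ranked directly, which is the essence of the design technique described above.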

Relevance: 100.00%

Publisher:

Abstract:

Existing approaches to quality estimation of e-learning systems are analyzed. A “layered” approach for quality estimation of e-learning systems, enhanced with learning process modeling and simulation, is presented. A method of quality estimation using learning process modeling, together with quality criteria, is suggested. The learning process model, based on an extended colored stochastic Petri net, is described. The method has been implemented in the automated system for quality estimation of e-learning systems named “QuAdS”. Results of evaluating the developed method and quality criteria are shown. We argue that using learning process modeling for quality estimation makes it easier for an expert to identify the shortcomings of an e-learning system.
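The paper's model is an extended colored stochastic Petri net; the sketch below shows only the bare mechanics of a stochastic Petri net, firing exponentially timed transitions Gillespie-style over a toy place/transition structure. The place and transition names are invented for illustration and are not from QuAdS.

```python
import random

# Minimal stochastic Petri net: integer markings on places; each enabled
# transition fires with probability proportional to its rate, after an
# exponentially distributed delay (Gillespie-style simulation).
places = {"new_topic": 5, "studying": 0, "assessed": 0}
transitions = {
    "start_study": {"in": ["new_topic"], "out": ["studying"], "rate": 1.0},
    "take_test":   {"in": ["studying"],  "out": ["assessed"], "rate": 0.5},
}

def enabled(t):
    return all(places[p] >= 1 for p in transitions[t]["in"])

def step(rng):
    """Fire one transition; return the elapsed time, or None if deadlocked."""
    live = [t for t in transitions if enabled(t)]
    if not live:
        return None
    rates = [transitions[t]["rate"] for t in live]
    dt = rng.expovariate(sum(rates))
    t = rng.choices(live, weights=rates)[0]
    for p in transitions[t]["in"]:
        places[p] -= 1
    for p in transitions[t]["out"]:
        places[p] += 1
    return dt

rng = random.Random(42)
clock = 0.0
while (dt := step(rng)) is not None:
    clock += dt

# All five tokens must eventually flow new_topic -> studying -> assessed.
print(places, round(clock, 2))
```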

Relevance: 100.00%

Publisher:

Abstract:

To meet the need for wider accessibility and popularization of the treasury of Bulgarian folklore, a team from the Institute of Mathematics and Informatics at the Bulgarian Academy of Sciences has planned to develop the Bulgarian folklore artery within the national project “Knowledge Technologies for Creation of Digital Presentation and Significant Repositories of Folklore Heritage”. This paper presents the process of business modeling of the application architecture of the Bulgarian folklore artery, which aids requirements analysis, application design and its software implementation. The folklore domain process model is made in the context of the target social applications: e-learning, virtual expositions of folklore artifacts, research, news, cultural/ethno-tourism, etc. The basic processes are analyzed and modeled, and some inferences are made for the use cases and requirements specification of the Bulgarian folklore artery application. In conclusion, the application architecture of the Bulgarian folklore artery is presented.

Relevance: 100.00%

Publisher:

Abstract:

Recognising the importance of alliance decision making in a virtual enterprise (VE), this paper proposes an analysis template to facilitate this process. The existing transaction-cost and resource-based theories in the literature are first reviewed, showing some deficiencies in both types of theories and the potential of resource-based explanations. The paper then goes on to propose a resource-based analysis template, integrating both the motives for using certain business forms and the factors explaining why different forms help achieve different objectives. Resource-combination effectiveness, management complexity and flexibility are identified as the three factors providing fundamental explanations of an organization's alliance-making decision process. The template provides a comprehensive and generic approach for analysing alliance decisions.

Relevance: 100.00%

Publisher:

Abstract:

The financial community is well aware that continued underfunding of state and local government pension plans poses many public policy and fiduciary management concerns. However, a well-defined theoretical rationale has not been developed to explain why and how public sector pension plans underfund. This study uses three methods: a survey of national pension experts, an incomplete covariance panel method, and field interviews. A survey of national public sector pension experts was conducted to provide a conceptual framework by which underfunding could be evaluated. Experts suggest that plan design, fiscal stress, and political culture factors impact underfunding. However, experts do not agree with previous research findings that unions actively pursue underfunding to secure current wage increases. Within the conceptual framework and determinants identified by experts, several empirical regularities are documented for the first time. Analysis of 173 local government pension plans, observed from 1987 to 1992, was conducted. Findings indicate that underfunding occurs in plans that have lower retirement ages, increased costs due to benefit enhancements, when the sponsor faces current year operating deficits, or when a local government relies heavily on inelastic revenue sources. Results also suggest that elected officials artificially inflate interest rate assumptions to reduce current pension costs, consequently shifting these costs to future generations. In concurrence with some experts, there is no data to support the assumption that highly unionized employees secure more funding than less unionized employees. Empirical results provide satisfactory but not overwhelming statistical power, and only minor predictive capacity. To further explore why underfunding occurs, field interviews were carried out with 62 local government officials.
Practitioners indicated that perceived fiscal stress, the willingness of policymakers to advance funding, bargaining strategies used by union officials, apathy among employees and retirees, pension board composition, and the level of influence of internal pension experts have an impact on funding outcomes. A pension funding process model was posited by triangulating the expert survey, empirical findings, and field survey results. The funding process model should help shape and refine our theoretical knowledge of state and local government pension underfunding in the future.

Relevance: 100.00%

Publisher:

Abstract:

The theoretical analysis and research of cultural activities have been limited, for the most part, to the study of the role the public sector plays in the funding and support of nonprofit Arts organizations. The tools used to evaluate this intervention follow a macroeconomic perspective and fail to account for microeconomic principles and assumptions that affect the behavior of these organizations. This dissertation describes through conceptual models the behavior of the agents involved in the artistic process and the economic sectors affected by it. The first paper deals with issues related to economic impact studies and formulates a set of guidelines that should be followed when conducting this type of study. One of the ways to assess more accurately the impact culture has in a community is by assuming that artists can re-create the public space of a blighted community and get it ready for a regeneration process. The second paper of this dissertation assumes just that and explains in detail the cultural, political, economic and sociological interactions that are taking place in the Arts-led regeneration process in Miami Beach, Florida. The paper models the behavior of these agents by indicating what their goals and decision process mechanisms are. The results support the claim that the public space artists create in a city actually stimulates development. The third paper discusses the estimation of a demand function for artistic activities, specifically the New World Symphony (NWS) located in Miami Beach, Florida. The behavior of the consumers and producers of NWS' concerts is modeled. The results support the notion that consumers make their decisions based, among other things, on the perceived value these concerts have. Economists engage in the analysis of the effects of cultural activities in a community since many cities rely on them for their development.
The history of many communities is no longer told by their assembly lines and machinery but by their centers of entertainment, hotels and restaurants. Many cities in Europe and North America that have seen the manufacturing sector migrate to the South are trying to meet the demands of the new economy by using the Arts as catalysts for development.

Relevance: 100.00%

Publisher:

Abstract:

This dissertation establishes a novel system for human face learning and recognition based on incremental multilinear Principal Component Analysis (PCA). Most existing face recognition systems need training data during the learning process. The system proposed in this dissertation utilizes an unsupervised or weakly supervised learning approach, in which the learning phase requires a minimal amount of training data. It also overcomes the inability of traditional systems to adapt during the testing phase, when the decision process for newly acquired images continues to rely on the same old training data set; consequently, when a new training set is to be used, the traditional approach requires that the entire eigensystem be generated again. To speed up this computational process, the proposed method uses the eigensystem generated from the old training set, together with the new images, to generate the new eigensystem more effectively in a so-called incremental learning process. In the empirical evaluation phase, two key factors are essential in evaluating the performance of the proposed method: (1) recognition accuracy and (2) computational complexity. To establish the most suitable algorithm for this research, a comparative analysis of the best-performing methods was carried out first; its results supported the initial use of multilinear PCA in this research. To address the computational complexity of the subspace update procedure, a novel incremental algorithm was established, combining the traditional sequential Karhunen-Loeve (SKL) algorithm with a newly developed incremental modified fast PCA algorithm. To utilize multilinear PCA in the incremental process, a new unfolding method was developed to affix the newly added data at the end of the previous data.
Results of the incremental process based on these two methods bear out these theoretical improvements. Object tracking results using video images are also provided as a further challenging task to demonstrate the soundness of this incremental multilinear learning method.
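The eigensystem-update idea described above can be illustrated with a deliberately naive incremental PCA that keeps only a running mean and scatter matrix, so old images never need to be revisited. This is not the dissertation's SKL-based multilinear algorithm (which avoids the O(d²) scatter matrix); it is a minimal sketch of the incremental-learning principle.

```python
import numpy as np

class NaiveIncrementalPCA:
    """Incremental PCA via a running mean and scatter matrix: new batches
    update the sufficient statistics, and components are recomputed on
    demand without revisiting old samples."""

    def __init__(self, n_components):
        self.n_components = n_components
        self.n = 0
        self.mean = None
        self.scatter = None   # running sum of outer products of raw samples

    def partial_fit(self, X):
        X = np.asarray(X, dtype=float)
        if self.n == 0:
            d = X.shape[1]
            self.mean = np.zeros(d)
            self.scatter = np.zeros((d, d))
        m = X.shape[0]
        self.mean = (self.n * self.mean + X.sum(axis=0)) / (self.n + m)
        self.scatter += X.T @ X
        self.n += m
        return self

    def components(self):
        # population covariance from the sufficient statistics
        cov = self.scatter / self.n - np.outer(self.mean, self.mean)
        vals, vecs = np.linalg.eigh(cov)
        order = np.argsort(vals)[::-1][: self.n_components]
        return vecs[:, order].T

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 5)) @ rng.normal(size=(5, 5))

ipca = NaiveIncrementalPCA(n_components=2)
for batch in np.array_split(data, 4):        # new images arrive in batches
    ipca.partial_fit(batch)

# Compare against batch PCA computed on all the data at once.
cov = np.cov(data, rowvar=False, bias=True)
vals, vecs = np.linalg.eigh(cov)
batch_top = vecs[:, np.argsort(vals)[::-1][:2]].T
agreement = np.abs(ipca.components() @ batch_top.T)   # identity up to sign
print(agreement)
```

Because the running statistics reproduce the full-data covariance exactly, the incremental components match batch PCA up to sign; the dissertation's contribution lies in achieving such updates efficiently for multilinear (tensor) data.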