940 results for NONLINEAR SIGMA-MODELS


Relevance:

20.00%

Publisher:

Abstract:

The concept of Six Sigma was initiated in the 1980s by Motorola and has since been implemented in numerous manufacturing and service organizations. To date, its implementation has been mostly limited to healthcare and financial services in the private sector, though it is gradually spreading to services such as call centers, education, and construction and related engineering, in both the private and public sectors. Through a literature review, a questionnaire survey, and a multiple case study approach, the paper develops a conceptual framework to facilitate widening the scope of Six Sigma implementation in service organizations. Using grounded theory methodology, the study builds theory for Six Sigma implementation in service organizations. The survey, exploratory in nature, was conducted among service organizations in Singapore; the case studies involved three service organizations that had implemented Six Sigma. The objective is to explore and understand the issues highlighted by the survey and the literature. The findings confirm the inclusion of critical success factors, critical-to-quality characteristics, and a set of tools and techniques, as observed in the literature. In the case of key performance indicators, interpretations differ both in the literature and among industry practitioners: some sources describe them as performance metrics, whereas others treat them as key process input or output variables, an interpretation shared by many Six Sigma practitioners. The responses "not relevant" and "unknown to us" as reasons for not implementing Six Sigma show the need to understand the specific requirements of service organizations. Although much theoretical description of Six Sigma is available, rigorous academic research on it has been limited, and this gap is far more pronounced for Six Sigma implementation in service organizations, where the theory is not yet mature. Identifying this need, the study contributes by undertaking a theory-building exercise and developing a conceptual framework for understanding the issues involved in its implementation in service organizations.
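
The sigma-level arithmetic at the heart of Six Sigma is not spelled out in the abstract; as a minimal illustrative sketch (the standard convention, not part of this study), the usual conversion from defect counts to DPMO and a sigma level looks like:

```python
# Sketch of the standard Six Sigma capability metric: convert observed
# defect counts to DPMO (defects per million opportunities), then to a
# sigma level using the conventional 1.5-sigma long-term shift.
from statistics import NormalDist

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Short-term sigma level: z-quantile of the defect rate plus the
    conventional 1.5-sigma shift."""
    p_defect = dpmo_value / 1_000_000
    return NormalDist().inv_cdf(1 - p_defect) + 1.5

# By convention, 3.4 defects per million opportunities is "six sigma".
print(round(sigma_level(3.4), 1))  # -> 6.0
```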

Abstract:

In this paper, the goal of identifying disease subgroups based on differences in observed symptom profiles is considered. Commonly referred to as phenotype identification, solutions to this task often involve the application of unsupervised clustering techniques. In this paper, we investigate the application of a Dirichlet Process mixture (DPM) model for this task. This model is defined by placing a Dirichlet Process (DP) prior on the unknown components of a mixture model, allowing uncertainty about the partitioning of the observed data into homogeneous subgroups to be expressed. To exemplify this approach, an application to phenotype identification in Parkinson’s disease (PD) is considered, with symptom profiles collected using the Unified Parkinson’s Disease Rating Scale (UPDRS).

Keywords: Clustering, Dirichlet Process mixture, Parkinson’s disease, UPDRS.
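
The abstract includes no code; as a minimal sketch of the partition prior underlying a DPM (not the authors' implementation), the Chinese restaurant process view of the DP can be sampled directly — note that the number of subgroups is inferred rather than fixed in advance:

```python
# Minimal sketch: the partition prior induced by a Dirichlet Process can be
# sampled with the Chinese restaurant process (CRP). Each new observation
# joins an existing cluster with probability proportional to that cluster's
# size, or opens a new cluster with probability proportional to alpha.
import random

def crp_partition(n: int, alpha: float, seed: int = 0) -> list[int]:
    """Return cluster labels for n observations drawn from CRP(alpha)."""
    rng = random.Random(seed)
    sizes: list[int] = []          # current cluster sizes
    labels: list[int] = []
    for _ in range(n):
        # weights: existing clusters by size, plus alpha for a new cluster
        weights = sizes + [alpha]
        k = rng.choices(range(len(weights)), weights=weights)[0]
        if k == len(sizes):
            sizes.append(1)        # open a new cluster
        else:
            sizes[k] += 1
        labels.append(k)
    return labels

labels = crp_partition(100, alpha=1.0)
print(len(set(labels)))  # number of clusters is not fixed in advance
```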

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers’ expectations by supplying reliable, good-quality products and services is the key factor for organizations and even governments. Over recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance, and in some cases these tools have been modified and developed to better suit the characteristics and needs of the health sector. Some of the work in the healthcare area appears to have evolved independently of the development of industrial statistical process control methods. Analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors therefore presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the Bayesian approach, particularly Bayesian hierarchical models and the associated computational techniques, in which all uncertainty is expressed as a structure of probability, facilitates decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. First, a framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; second, a data-capturing algorithm using Bayesian decision-making methods is developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the need to monitor attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from the industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. In addition, a Bayesian approach is proposed as a new framework for estimating the change point following a control chart’s signal. This estimate aims to facilitate root-cause analysis within the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. The approach yields highly informative estimates of the change point parameters, since the results are obtained as probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and demonstrated in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then developed further for healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variants of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. The estimator is then extended to identify the time of a shift in mean survival time after a clinical intervention monitored by risk-adjusted survival time control charts; in this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly support the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is further enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach observed in this quality control context may also extend to the industrial and business domains where quality monitoring was initially developed.
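
As a minimal illustration of the change point idea described above (a conjugate sketch, not the thesis's MCMC or risk-adjusted machinery), Gamma priors on the two segment rates of a Poisson process can be marginalised in closed form, so the posterior over the change point can be computed on a grid:

```python
# Minimal sketch of Bayesian change point estimation in a Poisson process:
# conjugate Gamma(a, b) priors on the segment rates are marginalised in
# closed form, leaving a discrete posterior over the change point tau.
from math import lgamma, log, exp

def log_marginal(counts, a=1.0, b=1.0):
    """Log marginal likelihood of a segment of Poisson counts under a
    Gamma(a, b) prior on the rate."""
    n, s = len(counts), sum(counts)
    return (a * log(b) - lgamma(a) + lgamma(a + s)
            - (a + s) * log(b + n) - sum(lgamma(c + 1) for c in counts))

def change_point_posterior(y):
    """Posterior over tau (length of the first segment), uniform prior."""
    logp = [log_marginal(y[:t]) + log_marginal(y[t:]) for t in range(1, len(y))]
    m = max(logp)
    w = [exp(v - m) for v in logp]
    z = sum(w)
    return [v / z for v in w]   # posterior[t-1] = P(tau = t | y)

# Hypothetical counts: rate shifts upward after observation 10.
y = [1, 0, 2, 1, 1, 0, 1, 2, 1, 1, 9, 8, 10, 9, 11, 8, 9, 10, 9, 8]
post = change_point_posterior(y)
tau_hat = 1 + max(range(len(post)), key=post.__getitem__)
print(tau_hat)  # MAP estimate of the change point
```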

Abstract:

This paper proposes a nonlinear H_infinity controller for stabilization of the velocities, attitudes and angular rates of a fixed-wing unmanned aerial vehicle (UAV) in a windy environment. The suggested controller aims to achieve a steady-state flight condition in the presence of wind gusts, such that the host UAV can be maneuvered to avoid collisions with other UAVs during cruise flight with safety guarantees. The paper begins by building a model capturing the flight aerodynamics of UAVs; a nonlinear controller with gust attenuation and rapid response properties is then developed. Simulations are conducted for the Shadow UAV to verify the performance of the proposed controller. Comparative studies with proportional-integral-derivative (PID) controllers demonstrate that the proposed controller exhibits substantial performance improvement in a gusty environment, making it suitable for integration into the design of flight control systems for cruise flight of UAVs.
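
The abstract gives no equations; as a toy illustration of how the PID baseline handles a constant gust (a hypothetical first-order plant, not the paper's UAV aerodynamics), integral action accumulates until it cancels the disturbance:

```python
# Toy illustration (not the paper's UAV model): a PI controller rejecting a
# constant "gust" disturbance on a first-order plant dx/dt = -x + u + d.
def simulate_pi(kp=2.0, ki=1.0, setpoint=1.0, gust=0.5, dt=0.01, steps=2000):
    x, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - x
        integral += error * dt
        u = kp * error + ki * integral   # PI control law
        x += (-x + u + gust) * dt        # explicit Euler step of the plant
    # the integral term has accumulated enough to cancel the constant gust
    return x

print(round(simulate_pi(), 3))  # settles near the setpoint despite the gust
```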

Abstract:

Much of our understanding of human thinking is based on probabilistic models. This innovative book by Jerome R. Busemeyer and Peter D. Bruza argues that, actually, the underlying mathematical structures from quantum theory provide a much better account of human thinking than traditional models. They introduce the foundations for modelling probabilistic-dynamic systems using two aspects of quantum theory. The first, "contextuality", is a way to understand interference effects found with inferences and decisions under conditions of uncertainty. The second, "entanglement", allows cognitive phenomena to be modelled in non-reductionist ways. Employing these principles drawn from quantum theory allows us to view human cognition and decision in a totally new light...
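
A toy numeric sketch of the interference idea (illustrative numbers, not an example from the book): classical probabilities of reaching an outcome via two mutually exclusive paths simply add, whereas complex amplitudes add before squaring, leaving a cross-term:

```python
# Toy illustration of "interference" in quantum cognition models: classical
# probability obeys the law of total probability, while complex amplitudes
# add a cross-term 2*Re(amp1 * conj(amp2)) before squaring.
import cmath
import math

# Amplitudes for reaching outcome B via two mutually exclusive paths
# (hypothetical magnitudes and a 1-radian relative phase, for illustration).
r = math.sqrt(0.5 * 0.6)
amp1 = cmath.rect(r, 0.0)   # via path A1
amp2 = cmath.rect(r, 1.0)   # via path A2

# Classical: P(B) = P(A1)P(B|A1) + P(A2)P(B|A2)
p_classical = abs(amp1) ** 2 + abs(amp2) ** 2
# Quantum: amplitudes are summed first, then squared.
p_quantum = abs(amp1 + amp2) ** 2
interference = p_quantum - p_classical

print(round(p_classical, 3), round(p_quantum, 3))  # the two rules disagree
```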

Abstract:

Crop simulation models have the potential to assess the risk associated with the selection of a specific N fertilizer rate by integrating the effects of soil-crop interactions on crop growth under different pedo-climatic and management conditions. The objective of this study was to simulate the environmental and economic impact (nitrate leaching and N2O emissions) of a spatially variable N fertilizer application in an irrigated maize field in Italy. The validated SALUS model was run with five nitrogen rate scenarios, 50, 100, 150, 200, and 250 kg N ha−1, with the latter being the N fertilization adopted by the farmer. The long-term (25-year) simulations were performed on two previously identified spatially and temporally stable zones, a high-yielding and a low-yielding zone. The simulation results showed that the N fertilizer rate can be reduced without affecting yield or net return. The marginal net return was on average higher for the high-yield zone, with values ranging from 1550 to 2650 € ha−1 for the 200 N rate and from 1485 to 2875 € ha−1 for the 250 N rate. N leaching varied between 16.4 and 19.3 kg N ha−1 for the 200 N and 250 N rates in the high-yield zone; in the low-yield zone, the 250 N rate had significantly higher N leaching. N2O emissions varied from 0.28 kg N2O ha−1 for the 50 kg N ha−1 rate to a maximum of 1.41 kg N2O ha−1 for the 250 kg N ha−1 rate.
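
The marginal net return comparison above reduces to simple arithmetic; a sketch with hypothetical prices and yields (not the study's data) illustrates why a reduced N rate can win when yield plateaus:

```python
# Sketch of the marginal net return arithmetic behind comparisons like those
# above; all prices and yields are hypothetical, not the study's data.
def marginal_net_return(yield_t_ha: float, grain_price_eur_t: float,
                        n_rate_kg_ha: float, n_price_eur_kg: float) -> float:
    """Revenue from grain minus the cost of the N fertilizer applied."""
    return yield_t_ha * grain_price_eur_t - n_rate_kg_ha * n_price_eur_kg

# If yield plateaus, a lower N rate can beat the net return of a higher one.
high_rate = marginal_net_return(12.0, 180.0, 250, 1.0)  # 250 kg N/ha
reduced = marginal_net_return(12.0, 180.0, 200, 1.0)    # 200 kg N/ha, same yield
print(high_rate, reduced)  # the reduced rate nets 50 EUR/ha more
```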

Abstract:

Models of cell invasion incorporating directed cell movement up a gradient of an external substance, together with carrying capacity-limited proliferation, give rise to travelling wave solutions. Travelling wave profiles with various shapes, including smooth monotonically decreasing, shock-fronted monotonically decreasing and shock-fronted nonmonotone shapes, have been reported previously in the literature. The existence of tactically driven shock-fronted nonmonotone travelling wave solutions is analysed here for the first time. We develop a necessary condition for nonmonotone shock-fronted solutions. This condition shows that some of the previously reported shock-fronted nonmonotone solutions are genuine, while others are a consequence of numerical error. Our results demonstrate that, under certain conditions, travelling wave solutions can be smooth and monotone, smooth and nonmonotone, or discontinuous and nonmonotone, and that these different shapes correspond to different invasion speeds. A necessary and sufficient condition for the travelling wave with minimum wave speed to be nonmonotone is presented. Several common forms of the tactic sensitivity function have the potential to satisfy the condition for nonmonotone shock-fronted solutions developed in this work.
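
The paper's model is tactically driven, but the simplest related example of a travelling wave arising from carrying capacity-limited proliferation is the Fisher-KPP equation; the sketch below (an illustration only, not the authors' model) simulates it and recovers a front speed near the minimum value 2*sqrt(D*r):

```python
# Minimal related illustration: the Fisher-KPP equation
# u_t = D*u_xx + r*u*(1 - u) combines logistic (carrying capacity-limited)
# proliferation with diffusion and produces a smooth monotone travelling
# front with minimum speed 2*sqrt(D*r).
def front_position(u, x, level=0.5):
    """Leftmost grid point where the density falls below `level`."""
    for xi, ui in zip(x, u):
        if ui < level:
            return xi
    return x[-1]

D, r, dx, dt = 1.0, 1.0, 0.5, 0.05        # dt <= dx^2/(2D) for stability
n = 400
x = [i * dx for i in range(n)]
u = [1.0 if xi < 10.0 else 0.0 for xi in x]  # occupied region on the left

start = front_position(u, x)
steps = 1200                               # total time T = 60
for _ in range(steps):
    lap = [0.0] * n
    for i in range(1, n - 1):
        lap[i] = (u[i - 1] - 2.0 * u[i] + u[i + 1]) / dx**2
    u = [ui + dt * (D * li + r * ui * (1.0 - ui)) for ui, li in zip(u, lap)]
    u[0], u[-1] = u[1], u[-2]              # zero-flux boundaries
end = front_position(u, x)

speed = (end - start) / (steps * dt)
print(round(speed, 2))  # close to the minimum wave speed 2*sqrt(D*r) = 2
```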

Abstract:

Physical access control systems play a central role in the protection of critical infrastructure, where both the provision of timely access and the preservation of the security of sensitive areas are paramount. In this paper we discuss the shortcomings of existing approaches to the administration of physical access control in complex environments. At the heart of the problem is the current dependency on human administrators to reason about the implications of provisioning or revoking staff access to an area within these facilities. We demonstrate how utilising Building Information Models (BIMs) and the capabilities they provide, including 3D representation of a facility and path-finding, can reduce possible intentional or accidental errors made by security administrators.
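
As a minimal sketch of the path-finding capability mentioned above (a hypothetical floor plan, not a real BIM), a breadth-first search over a room-adjacency graph can reveal when revoking access to one space unintentionally severs the route to another:

```python
# Sketch of the path-finding check described above: given a room-adjacency
# graph extracted from a building model, verify whether revoking access to
# one space cuts off every route to an area a person still needs to reach.
from collections import deque

def reachable(adjacency, start, accessible):
    """Rooms reachable from `start` moving only through accessible rooms."""
    seen, queue = {start}, deque([start])
    while queue:
        room = queue.popleft()
        for nxt in adjacency.get(room, []):
            if nxt in accessible and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Hypothetical floor plan: lobby - corridor - {lab, server_room}
adjacency = {
    "lobby": ["corridor"],
    "corridor": ["lobby", "lab", "server_room"],
    "lab": ["corridor"],
    "server_room": ["corridor"],
}
full = {"lobby", "corridor", "lab", "server_room"}

print("lab" in reachable(adjacency, "lobby", full))                 # True
# Revoking corridor access unintentionally severs the route to the lab:
print("lab" in reachable(adjacency, "lobby", full - {"corridor"}))  # False
```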

Abstract:

Recent efforts in mission planning for underwater vehicles have utilised predictive models to aid navigation and optimal path planning and to drive opportunistic sampling. Although these models provide information at unprecedented resolutions and have proven to increase accuracy and effectiveness in multiple campaigns, most are deterministic in nature. Thus, their predictions cannot be incorporated into probabilistic planning frameworks, nor do they provide any metric of the variance or confidence of the output variables. In this paper, we provide an initial investigation into determining the confidence of ocean model predictions based on the results of multiple field deployments of two autonomous underwater vehicles. For multiple missions conducted over a two-month period in 2011, we compare actual vehicle executions with simulations of the same missions through the Regional Ocean Modeling System in an ocean region off the coast of southern California. This comparison provides a qualitative analysis of the current velocity predictions for areas within the selected deployment region. Ultimately, we present a spatial heat-map of the correlation between the ocean model predictions and the actual mission executions. Knowing where the model provides unreliable predictions can be incorporated into planners to increase the utility and application of the deterministic estimations.

Abstract:

Australian higher education institutions (HEIs) have entered a new phase of regulation and accreditation which includes performance-based funding relating to the participation and retention of students from social and cultural groups previously underrepresented in higher education. However, in addressing these priorities, it is critical that HEIs do not further disadvantage students from certain groups by singling them out for attention because of their social or cultural backgrounds, circumstances which are largely beyond the control of students. In response, many HEIs are focusing effort on university-wide approaches to enhancing the student experience, because such approaches enhance the engagement, success and retention of all students and, in doing so, particularly benefit students who come from underrepresented groups. Measuring and benchmarking the student experiences and engagement that arise from these efforts is well supported by extensive collections of student experience survey data. However, no comparable instrument exists that measures the capability of institutions to influence and/or enhance student experiences, where capability is an indication of how well an organisational process does what it is designed to do (Rosemann & de Bruin, 2005). This paper proposes that the concept of a maturity model (Marshall, 2010; Paulk, 1999) may be useful as a way of assessing the capability of HEIs to provide and implement student engagement, success and retention activities. We describe the Student Engagement, Success and Retention Maturity Model (SESR-MM) (Clarke, Nelson & Stoodley, 2012; Nelson, Clarke & Stoodley, 2012) we are currently investigating.
We discuss how our research may address the current gap by facilitating the development of an SESR-MM instrument that aims (i) to enable institutions to assess the capability of their current student engagement and retention programs and strategies to influence and respond to student experiences within the institution; and (ii) to provide institutions with the opportunity to understand various practices across the sector with a view to further improving programs and practices relevant to their context. The first aim of our research is to extend the generational approach which has been useful in considering the evolutionary nature of the first year experience (FYE) (Wilson, 2009). Three generations have been identified and explored: first generation approaches that focus on co-curricular strategies (e.g. orientation and peer programs); second generation approaches that focus on curriculum (e.g. pedagogy, curriculum design, and learning and teaching practice); and third generation approaches, also referred to as transition pedagogy, that focus on producing an institution-wide, integrated, holistic and intentional blend of curricular and co-curricular activities (Kift, Nelson & Clarke, 2010). The second aim of this research is to move beyond assessments of students’ experiences to focus on assessing institutional processes and their capability to influence student engagement. In essence, we propose to develop and use the maturity model concept to produce an instrument that will indicate the capability of HEIs to manage and improve student engagement, success and retention programs and strategies.

References

Australian Council for Educational Research. (n.d.). Australasian Survey of Student Engagement. Retrieved from http://www.acer.edu.au/research/ausse/background

Clarke, J., Nelson, K., & Stoodley, I. (2012, July). The Maturity Model concept as framework for assessing the capability of higher education institutions to address student engagement, success and retention: New horizon or false dawn? Nuts & Bolts presentation at the 15th International Conference on the First Year in Higher Education, “New Horizons,” Brisbane, Australia.

Department of Education, Employment and Workplace Relations. (n.d.). The University Experience Survey. Advancing quality in higher education information sheet. Retrieved from http://www.deewr.gov.au/HigherEducation/Policy/Documents/University_Experience_Survey.pdf

Kift, S., Nelson, K., & Clarke, J. (2010). Transition pedagogy: A third generation approach to FYE. A case study of policy and practice for the higher education sector. The International Journal of the First Year in Higher Education, 1(1), 1-20.

Marshall, S. (2010). A quality framework for continuous improvement of e-Learning: The e-Learning Maturity Model. Journal of Distance Education, 24(1), 143-166.

Nelson, K., Clarke, J., & Stoodley, I. (2012). An exploration of the Maturity Model concept as a vehicle for higher education institutions to assess their capability to address student engagement: A work in progress. Submitted for publication.

Paulk, M. (1999). Using the Software CMM with good judgment. ASQ Software Quality Professional, 1(3), 19-29.

Wilson, K. (2009, June-July). The impact of institutional, programmatic and personal interventions on an effective and sustainable first-year student experience. Keynote address presented at the 12th Pacific Rim First Year in Higher Education Conference, “Preparing for Tomorrow Today: The First Year as Foundation,” Townsville, Australia. Retrieved from http://www.fyhe.com.au/past_papers/papers09/ppts/Keithia_Wilson_paper.pdf

Abstract:

In this paper we present a methodology for designing experiments for efficiently estimating the parameters of models with computationally intractable likelihoods. The approach combines a commonly used methodology for robust experimental design, based on Markov chain Monte Carlo sampling, with approximate Bayesian computation (ABC) to ensure that no likelihood evaluations are required. The utility function considered for precise parameter estimation is based upon the precision of the ABC posterior distribution, which we form efficiently via the ABC rejection algorithm based on pre-computed model simulations. Our focus is on stochastic models and, in particular, we investigate the methodology for Markov process models of epidemics and macroparasite population evolution. The macroparasite example involves a multivariate process and we assess the loss of information from not observing all variables.
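
The ABC rejection step mentioned above can be sketched on a toy binomial model (not the paper's epidemic or macroparasite models): parameters are drawn from the prior, the model is simulated, and draws are kept only when the simulated summary lands within a tolerance of the observed one:

```python
# Minimal sketch of ABC rejection: draw parameters from the prior, simulate
# the model, and keep draws whose simulated summary statistic is close to
# the observed one (toy binomial model, hypothetical data).
import random

random.seed(1)
n_trials, observed = 100, 30                       # observed success count

accepted = []
for _ in range(20000):
    theta = random.random()                        # Uniform(0, 1) prior
    sim = sum(random.random() < theta for _ in range(n_trials))
    if abs(sim - observed) <= 2:                   # ABC tolerance
        accepted.append(theta)

posterior_mean = sum(accepted) / len(accepted)
print(round(posterior_mean, 2))  # close to the analytic posterior mean ~0.30
```

Shrinking the tolerance sharpens the ABC posterior toward the true posterior at the cost of fewer acceptances, which is exactly the precision trade-off the utility function above has to balance.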

Abstract:

Recently, ‘business model’ and ‘business model innovation’ have gained substantial attention in management literature and practice. However, many firms lack the capability to develop a novel business model to capture the value from new technologies. Existing literature on business model innovation highlights the central role of ‘customer value’. Further, it suggests that firms need to experiment with different business models and engage in ‘trial-and-error’ learning when participating in business model innovation. Trial-and-error processes and prototyping with tangible artifacts are a fundamental characteristic of design. This conceptual paper explores the role of design-led innovation in facilitating firms to conceive and prototype novel and meaningful business models. It provides a brief review of the conceptual discussion on business model innovation and highlights the opportunities for linking it with the research stream of design-led innovation. We propose design-led business model innovation as a future research area and highlight the role that design-led prototyping and new types of artifacts and prototypes play within it. We present six propositions in order to outline future research avenues.

Abstract:

The identification of the primary drivers of stock returns has been of great interest to both financial practitioners and academics alike for many decades. Influenced by classical financial theories such as the CAPM (Sharpe, 1964; Lintner, 1965) and the APT (Ross, 1976), a linear relationship is conventionally assumed between company characteristics, as derived from their financial accounts, and forward returns. Whilst this assumption may be a fair approximation to the underlying structural relationship, it is often adopted for the purpose of convenience. It is actually quite rare that the assumptions of distributional normality and a linear relationship are explicitly assessed in advance, even though this information would help to inform the appropriate choice of modelling technique. Non-linear models have nevertheless been applied successfully to the task of stock selection in the past (Sorensen et al., 2000). However, their take-up by the investment community has been limited, despite the fact that researchers in other fields have found them to be a useful way to express knowledge and aid decision-making...
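
A toy example of the point about untested linearity assumptions (hypothetical numbers, not a real factor): a least-squares line fitted to a purely quadratic characteristic-return relationship has zero slope, so a linear screen would report no relationship at all:

```python
# Toy illustration: a least-squares line fitted to a purely quadratic
# relationship has zero slope, so a linear factor model would report "no
# relationship" even though the returns are fully determined by the
# characteristic (hypothetical, standardised numbers).
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]   # hypothetical company characteristic
ys = [x * x for x in xs]           # hypothetical forward return, y = x^2

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(slope)  # -> 0.0: the linear model sees nothing to exploit
```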

Abstract:

Animal models typically require a known genetic pedigree to estimate quantitative genetic parameters. Here we test whether animal models can alternatively be based on estimates of relatedness derived entirely from molecular marker data. Our case study is the morphology of a wild bird population, for which we report estimates of the genetic variance-covariance matrices (G) of six morphological traits using three methods: the traditional animal model; a molecular marker-based approach to estimating heritability based on Ritland's pairwise regression method; and a new approach using a molecular genealogy arranged in a relatedness matrix (R) to replace the pedigree in an animal model. Using the traditional animal model, we found significant genetic variance for all six traits and positive genetic covariance among traits. The pairwise regression method did not return reliable estimates of quantitative genetic parameters in this population, with estimates of genetic variance and covariance typically being very small or negative. In contrast, we found mixed evidence for the use of the pedigree-free animal model. Similar to the pairwise regression method, the pedigree-free approach performed poorly when the full-rank R matrix based on the molecular genealogy was employed. However, performance improved substantially when we reduced the dimensionality of the R matrix in order to maximize the signal-to-noise ratio. Using reduced-rank R matrices generated estimates of genetic variance that were much closer to those from the traditional model. Nevertheless, this method was less reliable at estimating covariances, which were often estimated to be negative. Taken together, these results suggest that pedigree-free animal models can recover quantitative genetic information, although the signal remains relatively weak.
It remains to be determined whether this problem can be overcome by the use of a more powerful battery of molecular markers and improved methods for reconstructing genealogies.
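
The rank-reduction step described above can be sketched with a toy relatedness matrix (not the study's data): truncating the eigendecomposition of a noisy R keeps the leading family-structure axes and discards noise-dominated dimensions:

```python
# Sketch of the reduced-rank idea: truncate the eigendecomposition of a
# marker-based relatedness matrix R, keeping only the leading eigenvectors
# so that noise-dominated dimensions are discarded (toy matrix, not the
# study's data).
import numpy as np

def reduce_rank(R: np.ndarray, k: int) -> np.ndarray:
    """Best rank-k symmetric approximation of R by eigenvalue truncation."""
    vals, vecs = np.linalg.eigh(R)        # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]      # indices of the k largest
    return (vecs[:, idx] * vals[idx]) @ vecs[:, idx].T

# Hypothetical relatedness matrix for two families of three individuals,
# perturbed by symmetric "marker noise".
rng = np.random.default_rng(0)
families = np.kron(np.eye(2), np.full((3, 3), 0.5)) + 0.5 * np.eye(6)
E = rng.normal(size=(6, 6))
R_noisy = families + 0.05 * (E + E.T) / 2

R_low = reduce_rank(R_noisy, k=2)         # two families -> two leading axes
print(np.linalg.matrix_rank(R_low))       # at most 2
```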