540 results for predictive models


Relevance:

30.00%

Publisher:

Abstract:

Over the last few decades, construction project performance has come under scrutiny due to increasing delays, cost overruns and quality failures. Growing numbers of disputes, inharmonious working environments, conflict, blame cultures, and mismatches of objectives among project teams have been found to contribute to poor project performance. Performance measurement (PM) approaches have been developed to overcome these issues; however, the comprehensiveness of PM as an overall approach is still criticised in terms of the iron triangle, namely time, cost, and quality. PM has primarily focused on objective measures; however, continuous improvement requires the inclusion of subjective measures, particularly contractor satisfaction (Co-S). Dealing with the two different groups of large and small-medium contractors is challenging because, to date, Co-S has not been extensively defined, particularly in developing countries such as Malaysia. Therefore, a Co-S model is developed in this research that aims to fulfil current needs in the construction industry by integrating performance measures addressing large and small-medium contractor perceptions. The positivist paradigm adopted in the research was adhered to by reviewing relevant literature and evaluating expert discussions on the research topic. This yielded a basis for the development of the contractor satisfaction model (CoSMo), which consists of three elements: contractor satisfaction (Co-S) dimensions, contributory factors, and characteristics (project and participant). Valid questionnaire results from 136 contractors in Malaysia led to the prediction of several key factors of contractor satisfaction and to an examination of the relationships between elements. The relationships were examined through a series of sequential statistical analyses, namely correlation, one-way analysis of variance (ANOVA), t-tests and multiple regression analysis (MRA).
Forward and backward MRAs were used to develop Co-S mathematical models. Sixteen Co-S models were developed for both large and small-medium contractors. These determined that Malaysian large-contractor Co-S was most affected by the conciseness of the project scope and the quality of the project brief. By contrast, Co-S for small-medium contractors was strongly affected by the efficiency of risk control in a project. The results of the research provide empirical evidence that appropriate communication systems in projects contribute negatively to large-contractor Co-S with respect to cost and profitability. The uniqueness of several Co-S predictors was also identified through a series of analyses on small-medium contractors. These contractors appear to be less satisfied than large contractors when participants lack effectiveness in timely authoritative decision-making and in communication between project team members. Interestingly, the empirical results show that effective project health and safety measures are influencing factors in satisfying both large and small-medium contractors. The perspectives of large and small-medium contractors on the performance of the entire project development were derived from the Co-S models. These were statistically validated and refined before a new Co-S model was developed. Such a unique model has the potential to increase project value and benefit all project participants. Improving participant collaboration is important because it leads to better project performance. This study may encourage key project participants, such as clients, consultants, subcontractors and suppliers, to pay greater attention to contractor needs in the development of a project. Recommendations for future research include investigating other participants' perspectives on CoSMo and the impact of implementing CoSMo in a project, since this study focused purely on the contractor perspective.
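The forward multiple-regression procedure used to build the Co-S models can be sketched as greedy predictor entry. Everything below is a hedged illustration on synthetic data: the number of predictors, the entry threshold, and the true coefficients are assumptions, not the study's actual variables or results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the survey data: 136 respondents, 4 candidate
# predictors; only predictors 0 and 2 truly drive the satisfaction score.
n = 136
X = rng.normal(size=(n, 4))
y = 2.0 * X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.3, size=n)

def r_squared(X_sub, y):
    """R^2 of an OLS fit with an intercept."""
    A = np.column_stack([np.ones(len(y)), X_sub])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Forward selection: greedily add the predictor that raises R^2 the most,
# stopping when the improvement falls below an (assumed) entry threshold.
selected, remaining, best_r2 = [], list(range(X.shape[1])), 0.0
while remaining:
    gains = {j: r_squared(X[:, selected + [j]], y) for j in remaining}
    j_best = max(gains, key=gains.get)
    if gains[j_best] - best_r2 < 0.01:
        break
    best_r2 = gains[j_best]
    selected.append(j_best)
    remaining.remove(j_best)

print(selected, round(best_r2, 3))
```

Backward MRA runs the same loop in reverse, starting from the full model and dropping the predictor whose removal costs the least R².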

Relevance:

30.00%

Publisher:

Abstract:

A synthesis is presented of the predictive capability of a family of near-wall, wall-normal-free Reynolds stress models (which are completely independent of wall topology, i.e., of the distance from the wall and the normal-to-the-wall orientation) for oblique-shock-wave/turbulent-boundary-layer interactions. For the purpose of comparison, results are also presented using a standard low-turbulence-Reynolds-number k–ε closure and a Reynolds stress model that uses geometric wall normals and wall distances. The shock-wave Mach numbers studied are in the range M_SW = 2.85–2.9 and the incoming boundary-layer-thickness Reynolds numbers are in the range Re_δ0 = 1–2×10^6. Computations were carefully checked for grid convergence. Comparison with measurements shows satisfactory agreement, improving on results obtained using the k–ε model, and highlights the relative importance of the redistribution and diffusion closures, indicating directions for future modeling work.
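The baseline closure used for comparison is the standard k–ε model, whose eddy viscosity follows ν_t = C_μ k²/ε with the conventional constant C_μ = 0.09. A minimal evaluation of that relation (the k and ε values below are arbitrary illustrative inputs, not data from the study):

```python
C_MU = 0.09  # standard k-epsilon model constant

def eddy_viscosity(k, eps):
    """Turbulent (eddy) viscosity from turbulence kinetic energy k
    [m^2/s^2] and dissipation rate eps [m^2/s^3]."""
    return C_MU * k**2 / eps

# Illustrative values only (assumptions, not from the paper):
nu_t = eddy_viscosity(k=0.5, eps=10.0)
print(nu_t)
```

Reynolds stress models, by contrast, transport each stress component directly rather than deriving them all from a single scalar ν_t.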

Relevance:

30.00%

Publisher:

Abstract:

Background: There is currently no early predictive marker of survival for patients receiving chemotherapy for malignant pleural mesothelioma (MPM). Tumour response may be predictive of overall survival (OS), though this has not been explored. We have therefore undertaken a combined analysis of OS, from a 42-day landmark, of 526 patients receiving systemic therapy for MPM. We also validate published progression-free survival rates (PFSRs) and a progression-free survival (PFS) prognostic-index model. Methods: The analyses included nine MPM clinical trials, incorporating six European Organisation for Research and Treatment of Cancer (EORTC) studies. OS from the landmark (day 42 post-treatment) was analysed with respect to tumour response. The PFSR analysis included data from six non-EORTC MPM clinical trials. Prognostic-index validation was performed on one non-EORTC dataset with available survival data. Results: Median OS from the landmark was 12·8 months for patients with partial response (PR), 9·4 months for stable disease (SD) and 3·4 months for progressive disease (PD). Both PR and SD were associated with longer OS from the landmark compared with disease progression (both p < 0·0001). PFSRs for platinum-based combination therapies were consistent with published ranges for significant clinical activity. Effective separation between the PFS and OS curves provided a validation of the EORTC prognostic model, based on histology, stage and performance status. Conclusion: Response to chemotherapy is associated with significantly longer OS from the landmark in patients with MPM. © 2012 Elsevier Ltd. All rights reserved.
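The landmark method described above conditions on patients still alive at a fixed time (day 42) and measures survival from that point, which avoids guarantee-time bias when comparing response groups. A minimal sketch on entirely synthetic data (the patient records and the simple median summary are assumptions; a real analysis would use Kaplan–Meier estimates):

```python
import numpy as np

LANDMARK = 42.0  # day of the landmark, as in the abstract

# Synthetic records: (survival_time_days, event_observed, response_at_day_42)
patients = [
    (400, 1, "PR"), (510, 0, "PR"), (350, 1, "PR"),
    (290, 1, "SD"), (310, 1, "SD"), (250, 0, "SD"),
    (100, 1, "PD"), (120, 1, "PD"), (90, 1, "PD"),
    (30, 1, "PD"),   # died before the landmark: must be excluded
]

def median_os_from_landmark(group):
    """Median time from the landmark among patients alive at the landmark.
    (A crude stand-in for a Kaplan-Meier median; adequate here only
    because censoring in this toy data is light.)"""
    times = [t - LANDMARK for t, _, g in patients
             if g == group and t > LANDMARK]
    return float(np.median(times))

medians = {g: median_os_from_landmark(g) for g in ("PR", "SD", "PD")}
print(medians)
```

Note the day-30 death is dropped entirely: patients who do not survive to the landmark cannot contribute to any response group's landmark OS.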

Relevance:

30.00%

Publisher:

Abstract:

An important aspect of robotic path planning is ensuring that the vehicle is in the best location to collect the data necessary for the problem at hand. Given that features of interest are dynamic and move with oceanic currents, vehicle speed is an important factor in any planning exercise to ensure vehicles are at the right place at the right time. Here, we examine different Gaussian process models to find a suitable predictive kinematic model that enables the speed of an underactuated, autonomous surface vehicle to be accurately predicted given a set of input environmental parameters.
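A minimal Gaussian process regression sketch of the kind of model compared above: predict speed from a single environmental input. The kernel choice, hyperparameters, and sine-shaped "speed" data are all illustrative assumptions, not the paper's model or measurements.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf(a, b, length=1.0, var=1.0):
    """Squared-exponential covariance between 1-D input vectors a and b."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return var * np.exp(-0.5 * d2 / length**2)

# Synthetic training data: noisy "speed" as a function of one input.
x_train = np.linspace(0, 5, 20)
y_train = np.sin(x_train) + 0.05 * rng.normal(size=20)
noise = 0.05**2

# Standard GP regression equations via a Cholesky factorisation.
K = rbf(x_train, x_train) + noise * np.eye(20)
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))

x_test = np.array([2.5])
k_star = rbf(x_train, x_test)
mean = k_star.T @ alpha                      # predictive mean
v = np.linalg.solve(L, k_star)
var = rbf(x_test, x_test) - v.T @ v          # predictive variance

print(float(mean[0]), float(var[0, 0]))
```

The predictive variance is what makes GPs attractive for planning: the planner can see not just the predicted speed but how much to trust it.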

Relevance:

30.00%

Publisher:

Abstract:

A predictive model of terrorist activity is developed by examining the daily number of terrorist attacks in Indonesia from 1994 through 2007. The dynamic model employs a shot noise process to capture the self-exciting nature of terrorist activity, estimating the probability of future attacks as a function of the times since past attacks. In addition, the excess of non-attack days, coupled with the presence of multiple coordinated attacks on the same day, motivated the use of hurdle models to jointly model the probability of an attack day and the corresponding number of attacks. A power-law distribution with a shot-noise-driven parameter best modeled the number of attacks on an attack day. Interpretation of the model parameters is discussed and the predictive performance of the models is evaluated.
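The self-exciting (shot noise) mechanism described above can be sketched as a conditional intensity in which each past attack adds a contribution that decays with the time since that attack. The parameter values below are arbitrary illustrative assumptions, not estimates from the Indonesian data.

```python
import numpy as np

mu = 0.05        # baseline daily attack intensity (assumed)
kappa = 0.3      # jump in intensity added by each attack (assumed)
beta = 0.1       # per-day decay rate of each jump (assumed)

def intensity(t, attack_times):
    """Conditional intensity at day t given the days of past attacks:
    baseline plus an exponentially decaying shot for each past attack."""
    past = [s for s in attack_times if s < t]
    return mu + sum(kappa * np.exp(-beta * (t - s)) for s in past)

attacks = [10.0, 12.0, 40.0]
lam_quiet = intensity(5.0, attacks)    # before any attack: baseline only
lam_after = intensity(13.0, attacks)   # just after a cluster: elevated
print(lam_quiet, lam_after)
```

In the full model this intensity would feed the hurdle structure: it drives the probability that a day is an attack day, while a separate power-law count distribution governs how many attacks occur on such a day.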

Relevance:

30.00%

Publisher:

Abstract:

An important aspect of decision support systems is applying sophisticated, flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging because they are ‘doubly stochastic’, i.e., obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches either use simple models that obtain predictions from plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation; or they employ sophisticated non-parametric Bayesian Cox process models, but do not use visualization to render complex spatio-temporal forecasts interpretable. The contribution here is to fill this gap by inferring predictive distributions of log-Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a large-scale discretized grid and adapting an existing spatial-diurnal kernel to the log-Gaussian Cox process context.
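The ‘doubly stochastic’ structure on a discretized grid can be sketched directly: a latent Gaussian field is exponentiated to give a cell-wise intensity, and counts in each cell are then Poisson given that intensity. The 1-D grid, kernel parameters, and jitter below are illustrative assumptions, not the paper's spatial-diurnal model.

```python
import numpy as np

rng = np.random.default_rng(2)
n_cells = 50
x = np.linspace(0, 1, n_cells)

# Latent Gaussian field with a squared-exponential covariance
# (small jitter on the diagonal for numerical stability).
d2 = (x[:, None] - x[None, :]) ** 2
K = np.exp(-0.5 * d2 / 0.1**2) + 1e-6 * np.eye(n_cells)
f = np.linalg.cholesky(K) @ rng.normal(size=n_cells)

# First layer of randomness: the intensity itself is random.
cell_area = 1.0 / n_cells
lam = np.exp(f) * cell_area          # expected count per grid cell

# Second layer: counts are Poisson *given* that random intensity.
counts = rng.poisson(lam)

print(counts.sum(), round(float(lam.sum()), 3))
```

Inference reverses this simulation: given observed counts, one infers the posterior over the latent field f, and hence a predictive distribution (not just a point estimate) for future incidence in every cell.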

Relevance:

30.00%

Publisher:

Abstract:

Electrification of vehicular systems has gained momentum in recent years, with particular attention to constant power loads (CPLs). Since a CPL can threaten system stability, stability analysis of hybrid electric vehicles with CPLs becomes necessary. A new power buffer configuration with a battery is introduced to mitigate the instability caused by CPLs. Model predictive control (MPC) is applied to regulate the power buffer so as to decouple source and load dynamics. Moreover, MPC provides an optimal tradeoff between modification of the load impedance, variation of the dc-link voltage and battery current ripple. This is particularly important during transients or at the onset of system faults, since the battery response is not very fast. The optimal tradeoff becomes even more significant when considering a low-cost power buffer without a battery. This paper analyzes system models for both voltage-swell and voltage-dip faults. Furthermore, a dual-mode MPC algorithm is implemented in real time, offering improved stability. A comprehensive set of experimental results is included to verify the efficacy of the proposed power buffer.
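The receding-horizon principle behind MPC can be sketched on a toy plant: at each step, optimise a finite-horizon tradeoff between state regulation and control effort, apply only the first input, then repeat. The scalar "dc-link voltage deviation" model, horizon, and weights below are illustrative assumptions, not the paper's buffer dynamics.

```python
import numpy as np

a, b = 0.9, 0.5  # assumed scalar plant: x[k+1] = a*x[k] + b*u[k]
N = 5            # prediction horizon (assumed)
r = 0.1          # input-effort weight: trades regulation vs. control ripple

def mpc_step(x0):
    """Solve the unconstrained horizon problem by least squares and
    return only the first input (receding-horizon principle)."""
    # Stacked predictions over the horizon: x = F*x0 + G*u.
    F = np.array([a ** (k + 1) for k in range(N)])
    G = np.zeros((N, N))
    for i in range(N):
        for j in range(i + 1):
            G[i, j] = a ** (i - j) * b
    # Minimise ||F*x0 + G*u||^2 + r*||u||^2 via a stacked least squares.
    A = np.vstack([G, np.sqrt(r) * np.eye(N)])
    rhs = np.concatenate([-F * x0, np.zeros(N)])
    u, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return u[0]

# Closed loop from an initial voltage deviation of 1.0.
x = 1.0
for _ in range(20):
    x = a * x + b * mpc_step(x)
print(round(x, 6))
```

A dual-mode scheme, as in the paper, would additionally switch to a fixed stabilising feedback law once the state enters a terminal region; the sketch above shows only the optimisation core.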

Relevance:

30.00%

Publisher:

Abstract:

The functions of the volunteer functions inventory were combined with the constructs of the theory of planned behaviour (i.e., attitudes, subjective norms, and perceived behavioural control) to establish whether a stronger, single explanatory model prevailed. Undertaken in the context of episodic, skilled volunteering by individuals who were retired or approaching retirement (N = 186), the research advances prior studies, which either examined the predictive capacity of each model independently or compared their explanatory value. Using hierarchical regression analysis, the functions of the volunteer functions inventory (when controlling for demographic variables) explained an additional 7.0% of the variability in individuals’ willingness to volunteer over and above that accounted for by the theory of planned behaviour. Significant predictors in the final model included attitudes, subjective norms and perceived behavioural control from the theory of planned behaviour, and the understanding function from the volunteer functions inventory. It is proposed that the items comprising the understanding function may represent a deeper psychological construct (e.g., self-actualisation) not accounted for by the theory of planned behaviour. The findings highlight the potential benefit of combining these two prominent models to improve the understanding of volunteerism and provide a single parsimonious model for raising rates of this important behaviour.
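The hierarchical-regression logic above (enter one block of predictors, then a second block, and report the increment in explained variance, ΔR²) can be sketched as follows. The data are synthetic and the effect sizes are arbitrary assumptions; only the ΔR² procedure mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 186  # matching the abstract's sample size; data are synthetic

# Block 1: theory of planned behaviour constructs (simulated).
attitude, norms, control = rng.normal(size=(3, n))
# Block 2: one volunteer-functions-inventory function (simulated).
understanding = rng.normal(size=n)
willingness = (0.5 * attitude + 0.3 * norms + 0.2 * control
               + 0.4 * understanding + rng.normal(scale=0.5, size=n))

def r_squared(predictors, y):
    """R^2 of an OLS fit with an intercept."""
    A = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

# Step 1: TPB block only; step 2: TPB block plus the VFI function.
r2_step1 = r_squared([attitude, norms, control], willingness)
r2_step2 = r_squared([attitude, norms, control, understanding], willingness)
delta_r2 = r2_step2 - r2_step1
print(round(delta_r2, 3))
```

A real analysis would also enter demographic controls as an initial block, exactly as the study does, before the two theory blocks.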

Relevance:

30.00%

Publisher:

Abstract:

Introduction Risk factor analyses for nosocomial infections (NIs) are complex. First, due to competing events for NI, the association between risk factors and NI as measured using hazard rates may not coincide with the association measured using cumulative probability (risk). Second, patients from the same intensive care unit (ICU), who share the same environmental exposure, are likely to be more similar with regard to risk factors predisposing to an NI than patients from different ICUs. We aimed to develop an analytical approach that accounts for both features and to use it to evaluate associations between patient- and ICU-level characteristics and both the rates of NI and competing risks and the cumulative probability of infection. Methods We considered a multicenter database of 159 intensive care units containing 109,216 admissions (813,739 admission-days) from the Spanish HELICS-ENVIN ICU network. We analyzed the data using two models: an etiologic (rate-based) model and a predictive (risk-based) model. In both models, random effects (shared frailties) were introduced to assess heterogeneity. Death and discharge without NI were treated as competing events for NI. Results There was large heterogeneity across ICUs in NI hazard rates, which remained after accounting for multilevel risk factors, meaning that unobserved ICU-specific factors influencing NI occurrence remain. Heterogeneity across ICUs in the cumulative probability of NI was even more pronounced. Several risk factors had markedly different associations in the rate-based and risk-based models. For some, the associations differed in magnitude: for example, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores were associated with modest increases in the rate of nosocomial bacteremia but large increases in the risk.
Others differed in sign: for example, a respiratory versus cardiovascular diagnostic category was associated with a reduced rate of nosocomial bacteremia but an increased risk. Conclusions A combination of competing-risks and multilevel models is required to understand direct and indirect risk factors for NI and to distinguish patient-level from ICU-level factors.
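The rate-versus-risk distinction above hinges on competing events: the cause-specific hazard of NI counts NI events among those still at risk, while the cumulative incidence (risk) also depends on how fast the competing events (death or discharge without NI) remove patients from the risk set. A minimal Aalen–Johansen-style sketch on synthetic daily counts (all numbers are invented):

```python
# Synthetic daily summaries: (n_at_risk, n_NI_events, n_competing_events),
# where competing events are death or discharge without NI.
days = [(100, 2, 5), (93, 3, 4), (86, 1, 6), (79, 2, 5)]

surv = 1.0          # P(still in ICU without NI) just before each day
cum_inc_ni = 0.0    # cumulative incidence (risk) of NI
for n_risk, n_ni, n_comp in days:
    h_ni = n_ni / n_risk                 # cause-specific hazard of NI
    h_all = (n_ni + n_comp) / n_risk     # hazard of leaving the risk set
    cum_inc_ni += surv * h_ni            # Aalen-Johansen increment
    surv *= 1.0 - h_all                  # update event-free probability

print(round(cum_inc_ni, 4), round(surv, 4))
```

With complete follow-up, as here, the estimator reduces to the crude proportion (8 NIs out of 100 admissions); the machinery matters once censoring enters, and it is exactly where a covariate can raise the NI rate yet lower the NI risk by also accelerating the competing events.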

Relevance:

30.00%

Publisher:

Abstract:

Due to its ability to represent intricate systems with material nonlinearities as well as irregular loading, boundary, geometrical and material domains, the finite element (FE) method has been recognized as an important computational tool in spinal biomechanics. Current FE models generally account for a single distinct spinal geometry with one set of material properties, despite inherently large inter-subject variability. The uncertainty and high variability in tissue material properties, geometry, loading and boundary conditions have cast doubt on the reliability of their predictions and their comparability with reported in vitro and in vivo values. A multicenter study was undertaken to compare the results of eight well-established models of the lumbar spine that have been developed, validated and applied for many years. The models were subjected to pure and combined loading modes, and their predictions were compared to in vitro and in vivo measurements for intervertebral rotations, disc pressures and facet joint forces. Under pure moment loading, the predicted L1-5 rotations of almost all models fell within the reported in vitro ranges; their median values differed on average by only 2° for flexion-extension, 1° for lateral bending and 5° for axial rotation. Predicted median facet joint forces and disc pressures were also in good agreement with previously published median in vitro values. However, the ranges of predictions were larger and exceeded the in vitro ranges, especially for facet joint forces. For all combined loading modes, except for flexion, predicted median segmental intervertebral rotations and disc pressures were in good agreement with in vivo values. The simulations yielded median facet joint forces of 0 N in flexion, 38 N in extension, 14 N in lateral bending and 60 N in axial rotation that could not be validated due to the paucity of in vivo facet joint forces.
In light of the high inter-subject variability, one must be cautious when generalizing predictions obtained from a single deterministic model. This study demonstrates, however, that predictive power increases when FE models are combined. The median of the individual numerical results can hence be used as an improved tool for estimating the response of the lumbar spine.
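The median-of-models idea above amounts to pooling the predictions of the independently built models and using their median as the ensemble estimate. The numbers below are invented stand-ins (e.g. for an L1-5 flexion rotation in degrees), not results from the eight models.

```python
import numpy as np

# Hypothetical predictions from eight independently developed FE models.
model_predictions = np.array([7.1, 8.4, 6.9, 9.2, 7.6, 8.0, 7.3, 10.5])
in_vitro_range = (6.5, 9.5)   # hypothetical reported in vitro range

# The median is robust to a single outlying model (the 10.5 here),
# which is why it is preferred over the mean for pooling.
ensemble = float(np.median(model_predictions))
within = in_vitro_range[0] <= ensemble <= in_vitro_range[1]
print(ensemble, within)
```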

Relevance:

30.00%

Publisher:

Abstract:

Overprocessing waste occurs in a business process when effort is spent in a way that adds value neither to the customer nor to the business. Previous studies have identified a recurrent overprocessing pattern in business processes with so-called "knockout checks": activities that classify a case as "accepted" or "rejected", such that an accepted case proceeds forward while a rejected one is cancelled and all work performed on it is considered unnecessary. Thus, when a knockout check rejects a case, the effort spent in other (previous) checks becomes overprocessing waste. Traditional process redesign methods propose ordering knockout checks according to their mean effort and rejection rate. This paper presents a more fine-grained approach in which knockout checks are ordered at runtime based on predictive machine learning models. Experiments on two real-life processes show that this predictive approach outperforms traditional methods while incurring minimal runtime overhead.
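The traditional redesign rule referenced above can be sketched as ordering checks by their rejection-rate-to-effort ratio, which minimises the expected effort spent per case; the paper's runtime approach would replace the static rejection rates with per-case predictions from a machine learning model. The check names, efforts, and rates below are illustrative assumptions.

```python
# Hypothetical knockout checks with mean effort and static rejection rate.
checks = {
    "credit_check":   {"effort": 5.0, "reject_rate": 0.30},
    "identity_check": {"effort": 1.0, "reject_rate": 0.10},
    "fraud_check":    {"effort": 8.0, "reject_rate": 0.05},
}

def expected_effort(order):
    """Expected total effort per case: each check runs only if the case
    survived (was not knocked out by) every earlier check."""
    total, p_reach = 0.0, 1.0
    for name in order:
        total += p_reach * checks[name]["effort"]
        p_reach *= 1.0 - checks[name]["reject_rate"]
    return total

# Classic heuristic: cheapest rejections first (highest rate/effort ratio).
heuristic = sorted(checks,
                   key=lambda n: -checks[n]["reject_rate"] / checks[n]["effort"])
print(heuristic, round(expected_effort(heuristic), 3))
```

Under a predictive reordering, `reject_rate` would differ per case, so two cases could legitimately traverse the same checks in different orders.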

Relevance:

30.00%

Publisher:

Abstract:

The quality of species distribution models (SDMs) relies to a large degree on the quality of the input data, from bioclimatic indices to environmental and habitat descriptors (Austin, 2002). Recent reviews of SDM techniques have sought to optimize predictive performance (e.g. Elith et al., 2006). In general, SDMs employ one of three approaches to variable selection. The simplest approach relies on the expert to select the variables, as in environmental niche models (Nix, 1986) or a generalized linear model without variable selection (Miller and Franklin, 2002). A second approach explicitly incorporates variable selection into model fitting, which allows examination of particular combinations of variables. Examples include generalized linear or additive models with variable selection (Hastie et al., 2002), and classification trees with complexity-based or model-based pruning (Breiman et al., 1984; Zeileis, 2008). A third approach uses model averaging to summarize the overall contribution of a variable without considering particular combinations. Examples include neural networks, boosted or bagged regression trees, and maximum entropy, as compared in Elith et al. (2006). Typically, users of SDMs will either consider a small number of variable sets, via the first approach, or else supply all of the candidate variables (often numbering more than a hundred) to the second or third approach. Bayesian SDMs exist, with several methods for eliciting and encoding priors on model parameters (see the review in Low Choy et al., 2010). However, few methods have been published for informative variable selection; one example is Bayesian trees (O’Leary, 2008). Here we report an elicitation protocol that helps make explicit a priori expert judgements on the quality of candidate variables. This protocol can be flexibly applied to any of the three approaches to variable selection described above, Bayesian or otherwise.
We demonstrate how this information can be obtained and then used to guide variable selection in classical or machine-learning SDMs, or to define priors within Bayesian SDMs.
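One simple way elicited judgements can feed variable selection, in the spirit of the protocol described above, is to map expert quality scores onto prior inclusion probabilities. The variables, scores, and linear mapping below are illustrative assumptions, not the paper's elicitation protocol.

```python
# Hypothetical expert quality scores (0-10) for candidate SDM predictors.
scores = {"annual_rainfall": 9, "mean_temp": 8, "soil_ph": 4, "ndvi_noise": 1}

def inclusion_prior(score, floor=0.05, ceiling=0.95):
    """Map a 0-10 quality score linearly onto a bounded prior probability,
    keeping every variable's prior away from the degenerate values 0 and 1."""
    return floor + (ceiling - floor) * score / 10.0

priors = {v: inclusion_prior(s) for v, s in scores.items()}

# In a classical (non-Bayesian) workflow, the same scores could simply
# gate a shortlist passed to stepwise selection or a tree ensemble:
shortlist = [v for v, p in priors.items() if p >= 0.5]
print(priors, shortlist)
```

In a Bayesian SDM these probabilities would instead enter the model directly, e.g. as priors on indicator variables in a spike-and-slab formulation, so that expert scepticism about a variable must be overcome by the data.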