913 results for Model predictive control


Relevance: 40.00%

Abstract:

In construction projects, the aim of project control is to ensure that projects finish on time, within budget, and achieve their other objectives. Over the last few decades, numerous project control methods have been developed and adopted by project managers in practice. However, many existing methods focus on describing what the processes and tasks of project control are, not on how these tasks should be conducted. There is also a potential gap between the principles that underlie these methods and project control practice. As a result, time and cost overruns are still common in construction projects, partly attributable to deficiencies of existing project control methods and difficulties in implementing them. This paper describes a new project cost and time control model, the project control and inhibiting factors management (PCIM) model, developed through a study involving extensive interaction with construction practitioners in the UK, which better reflects the real needs of project managers. A set of good-practice checklists is also developed to facilitate implementation of the model. © 2013 American Society of Civil Engineers.

Relevance: 40.00%

Abstract:

The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
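
Growth-curve forecasting of the kind described above maps elapsed time fraction to spent cost fraction along a cumulative S-curve. A minimal sketch, using an illustrative smoothstep cubic rather than the actual DHSS parameterisation (whose fitted coefficients are not given here):

```python
def cumulative_cost_fraction(t: float) -> float:
    """Fraction of total project cost spent at time fraction t in [0, 1].

    Illustrative cubic S-curve (smoothstep); the DHSS model uses a fitted
    cubic, so the coefficients here are assumptions for illustration only.
    """
    if not 0.0 <= t <= 1.0:
        raise ValueError("t must lie in [0, 1]")
    return t * t * (3.0 - 2.0 * t)
```

Slow spend at the start and end of a project, with the peak spend rate mid-project, is the qualitative behaviour such cumulative cost curves capture.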

Relevance: 40.00%

Abstract:

This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order and in what quantity are the controllable or independent variables in cost expressions which are minimised. The four systems considered are referred to as (Q, R), (nQ, R, T), (M, T) and (M, R, T). With (Q, R), a fixed quantity Q is ordered each time the order cover (i.e., stock in hand plus on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ, R, T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. In (M, T), each order increases the order cover to M. Finally, in (M, R, T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q, R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models can be compared with the increases in computational costs. Since the exact model was preferable for the (Q, R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, is assumed to follow a normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma distributed. Lastly, the actual supply quantity is itself allowed to follow a distribution.
All the sets of equations were programmed for a KDF 9 computer and the computed performances of the four inventory control procedures are compared under each assumption.
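
The (Q, R) continuous-review rule can be sketched in a toy simulation; the parameters, demand distribution and starting stock below are illustrative assumptions, not values from the thesis (demand is truncated at zero, echoing the truncated-normal assumption, and backorders appear as negative on-hand stock):

```python
import random

def simulate_qr(Q, R, periods, mean_demand, lead_time, seed=0):
    """Toy (Q, R) simulation: order Q whenever order cover
    (stock in hand plus on order) falls to R or below.

    Orders arrive after a fixed lead time. Returns average on-hand stock.
    """
    rng = random.Random(seed)
    on_hand = R + Q          # assumed ample starting stock
    pipeline = []            # (due_period, quantity) of outstanding orders
    total = 0.0
    for t in range(periods):
        # receive orders whose lead time has elapsed
        on_hand += sum(q for (due, q) in pipeline if due == t)
        pipeline = [(due, q) for (due, q) in pipeline if due != t]
        # demand, truncated so it is never less than zero
        on_hand -= max(0, round(rng.gauss(mean_demand, mean_demand / 4)))
        # (Q, R) rule applied to order cover
        cover = on_hand + sum(q for (_, q) in pipeline)
        if cover <= R:
            pipeline.append((t + lead_time, Q))
        total += on_hand
    return total / periods
```

The thesis compares exact cost models analytically; a simulation like this is only a sanity check of the ordering logic.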

Relevance: 40.00%

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 40.00%

Abstract:

The author analyzes some peculiarities of information perception and the problems of test evaluation. A fuzzy model of test evaluation is suggested as a means of increasing the effectiveness of knowledge control.

Relevance: 40.00%

Abstract:

The link between off-target anticholinergic effects of medications and acute cognitive impairment in older adults requires urgent investigation. We aimed to determine whether a relevant in vitro model may aid the identification of anticholinergic responses to drugs and the prediction of anticholinergic risk during polypharmacy. In this preliminary study we employed a co-culture of human-derived neurons and astrocytes (NT2.N/A) derived from the NT2 cell line. NT2.N/A cells possess much of the functionality of mature neurons and astrocytes, key cholinergic phenotypic markers and muscarinic acetylcholine receptors (mAChRs). The cholinergic response of NT2 astrocytes to the mAChR agonist oxotremorine was examined using the fluorescent dye fluo-4 to quantitate increases in intracellular calcium [Ca2+]i. Inhibition of this response by drugs classified as severe (dicycloverine, amitriptyline), moderate (cyclobenzaprine) and possible (cimetidine) on the Anticholinergic Cognitive Burden (ACB) scale was examined after exposure to the compounds individually and in pairs. Individually, dicycloverine had the most significant effect on inhibition of the astrocytic cholinergic response to oxotremorine, followed by amitriptyline, then cyclobenzaprine and cimetidine, in agreement with the ACB scale. In combination, dicycloverine with cyclobenzaprine had the most significant effect, followed by dicycloverine with amitriptyline. The order of potency of the drugs in combination frequently disagreed with predicted ACB scores derived from summation of the individual drug scores, suggesting current scales may underestimate the effect of polypharmacy. Overall, this NT2.N/A model may be appropriate for further investigation of adverse anticholinergic effects of multiple medications, in order to inform clinical choices of suitable drug use in the elderly.
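
The predicted combination scores that the assay results were compared against come from simple addition of individual ACB scores. A minimal sketch; the numeric scores follow the usual severe = 3 / moderate = 2 / possible = 1 convention matching the classifications quoted above, but should be checked against the published scale:

```python
# Assumed ACB scores (severe = 3, moderate = 2, possible = 1),
# matching the classifications given for each drug in the abstract.
ACB_SCORE = {
    "dicycloverine": 3,
    "amitriptyline": 3,
    "cyclobenzaprine": 2,
    "cimetidine": 1,
}

def combined_acb(drugs):
    """Predicted burden of a combination: sum of individual ACB scores."""
    return sum(ACB_SCORE[d] for d in drugs)
```

It is exactly this additive prediction that the in vitro combination results frequently disagreed with.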

Relevance: 40.00%

Abstract:

How are the image statistics of global image contrast computed? We answered this by using a contrast-matching task for checkerboard configurations of ‘battenberg’ micro-patterns where the contrasts and spatial spreads of interdigitated pairs of micro-patterns were adjusted independently. Test stimuli were 20 × 20 arrays with various sized cluster widths, matched to standard patterns of uniform contrast. When one of the test patterns contained a pattern with much higher contrast than the other, that determined global pattern contrast, as in a max() operation. Crucially, however, the full matching functions had a curious intermediate region where low contrast additions for one pattern to intermediate contrasts of the other caused a paradoxical reduction in perceived global contrast. None of the following models predicted this: RMS, energy, linear sum, max, Legge and Foley. However, a gain control model incorporating wide-field integration and suppression of nonlinear contrast responses predicted the results with no free parameters. This model was derived from experiments on summation of contrast at threshold, and masking and summation effects in dipper functions. Those experiments were also inconsistent with the failed models above. Thus, we conclude that our contrast gain control model (Meese & Summers, 2007) describes a fundamental operation in human contrast vision.
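
Two of the rejected pooling rules are easy to state; a minimal sketch of RMS and max() pooling over a set of local contrasts (the gain-control model that actually fit the data is more elaborate and is not reproduced here):

```python
def rms_contrast(local_contrasts):
    """Root-mean-square pooling of local (e.g. Michelson) contrasts."""
    n = len(local_contrasts)
    return (sum(c * c for c in local_contrasts) / n) ** 0.5

def max_contrast(local_contrasts):
    """max() pooling: global contrast equals the largest local contrast."""
    return max(local_contrasts)
```

The two rules diverge as soon as contrasts are mixed: adding zero-contrast patches lowers the RMS value but leaves the max() value unchanged, which is the kind of difference the matching experiments could discriminate.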

Relevance: 40.00%

Abstract:

A novel simulation model for pyrolysis processes of lignocellulosic biomass in Aspen Plus® was presented at the BC&E 2013. Based on kinetic reaction mechanisms, the simulation calculates product compositions and yields depending on reactor conditions (temperature, residence time, flue gas flow rate) and feedstock composition (biochemical composition, atomic composition, ash and alkali metal content). The simulation model was found to show good correlation with existing publications. In order to further verify the model, our own pyrolysis experiments are performed in a 1 kg/h continuously fed fluidized bed fast pyrolysis reactor. Two types of biomass with different characteristics, one wood and one straw-like feedstock, are processed in order to evaluate the influence of feedstock composition on the yields of the pyrolysis products and their composition. Furthermore, the temperature response of yields and product compositions is evaluated by varying the reactor temperature between 450 and 550 °C for one of the feedstocks. The yields of the pyrolysis products (gas, oil, char) are determined and their detailed composition is analysed. The experimental runs are reproduced with the corresponding reactor conditions in the Aspen Plus model and the results are compared with the experimental findings.

Relevance: 40.00%

Abstract:

The development of a new set of frost property measurement techniques to be used in the control of frost growth and defrosting processes in refrigeration systems was investigated. Holographic interferometry and infrared thermometry were used to measure the temperature of the frost-air interface, while a beam element load sensor was used to obtain the weight of a deposited frost layer. The proposed measurement techniques were tested for the cases of natural and forced convection, and characteristic charts were obtained for a set of operational conditions. An improvement of existing frost growth mathematical models was also investigated. The early stage of frost nucleation was commonly not considered in these models; instead, an initial value of layer thickness and porosity was regularly assumed. A nucleation model was developed to obtain the droplet diameter and surface porosity at the end of the early frosting period. The drop-wise early condensation on a cold flat plate under natural convection from hot (room temperature) humid air was modeled. A nucleation rate was found, and the relation of heat to mass transfer (the Lewis number) was obtained. The Lewis number was found to be much smaller than unity, in contrast with the standard value of one usually assumed in most frosting numerical models. The nucleation model was validated against available experimental data for the early nucleation and full growth stages of the frosting process. The combination of frost top temperature and weight variation signals can now be used to control defrosting timing, and the developed early nucleation model can now be used to simulate the entire process of frost growth on any surface material.
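
The Lewis number reported as much smaller than unity is the ratio of thermal diffusivity to mass diffusivity; a minimal sketch, where the example property values are illustrative figures for humid air near room temperature, not values from the study:

```python
def lewis_number(thermal_diffusivity, mass_diffusivity):
    """Le = alpha / D, both in m^2/s: heat transfer relative to mass transfer."""
    if mass_diffusivity <= 0:
        raise ValueError("mass diffusivity must be positive")
    return thermal_diffusivity / mass_diffusivity

# Illustrative values for air / water vapour near room temperature:
# alpha ~ 2.2e-5 m^2/s and D ~ 2.5e-5 m^2/s give Le slightly below one,
# which is why Le = 1 is the common default the study calls into question.
```
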

Relevance: 40.00%

Abstract:

The theoretical construct of control has been defined as necessary (Etzioni, 1965), ubiquitous (Vickers, 1967), and on-going (E. Langer, 1983). Empirical measures, however, have not adequately given meaning to this potent construct, especially within complex organizations such as schools. Four stages of theory-development and empirical testing of school building managerial control using principals and teachers working within the nation's fourth largest district are presented in this dissertation as follows: (1) a review and synthesis of social science theories of control across the literatures of organizational theory, political science, sociology, psychology, and philosophy; (2) a systematic analysis of school managerial activities performed at the building level within the context of curricular and instructional tasks; (3) the development of a survey questionnaire to measure school building managerial control; and (4) initial tests of construct validity including inter-item reliability statistics, principal components analyses, and multivariate tests of significance. The social science synthesis provided support of four managerial control processes: standards, information, assessment, and incentives. The systematic analysis of school managerial activities led to further categorization between structural frequency of behaviors and discretionary qualities of behaviors across each of the control processes and the curricular and instructional tasks. Teacher survey responses (N=486) reported a significant difference between these two dimensions of control, structural frequency and discretionary qualities, for standards, information, and assessments, but not for incentives. 
The descriptive model of school managerial control suggests that (1) teachers perceive structural and discretionary managerial behaviors under information and incentives more clearly than activities representing standards or assessments, (2) standards are primarily structural while assessments are primarily qualitative, (3) teacher satisfaction is most closely related to the equitable distribution of incentives, (4) each of the structural managerial behaviors has a qualitative effect on teachers, and (5) certain qualities of managerial behaviors are perceived by teachers as distinctly discretionary, apart from school structure. The variables of teacher tenure and school effectiveness showed significant effects on school managerial control processes, while instructional levels (elementary, junior, and senior) and individual school differences were not found to be significant for the construct of school managerial control.

Relevance: 40.00%

Abstract:

We analyzed projections of current and future ambient temperatures along the eastern United States in relationship to the thermal tolerance of harbor seals in air. Using the Earth system model HadGEM2-ES and representative concentration pathways (RCPs) 4.5 and 8.5, which are indicative of two different atmospheric CO2 concentrations, we were able to examine possible shifts in distribution based on three metrics: current preferences, the thermal limit of juveniles, and the thermal limits of adults. Our analysis focused on average ambient temperatures because harbor seals are least effective at regulating their body temperature in air, making them most susceptible to rising air temperatures in the coming years. Our study focused on the months of May, June, and August from 2041-2060 (2050) and 2061-2080 (2070), as these are the historic months in which harbor seals are known to annually come ashore to pup, breed, and molt. May, June, and August are also some of the warmest months of the year. We found that breeding colonies along the eastern United States will be limited by the thermal tolerance of juvenile harbor seals in air, while their foraging range will extend as far south as the thermal tolerance of adult harbor seals in air. Our analysis revealed that in 2070, harbor seal pups should be absent from the United States coastline near the end of the summer due to exceptionally high air temperatures.

Relevance: 40.00%

Abstract:

Previously developed models for predicting absolute risk of invasive epithelial ovarian cancer have included a limited number of risk factors and have had low discriminatory power (area under the receiver operating characteristic curve (AUC) < 0.60). Because of this, we developed and internally validated a relative risk prediction model that incorporates 17 established epidemiologic risk factors and 17 genome-wide significant single nucleotide polymorphisms (SNPs) using data from 11 case-control studies in the United States (5,793 cases; 9,512 controls) from the Ovarian Cancer Association Consortium (data accrued from 1992 to 2010). We developed a hierarchical logistic regression model for predicting case-control status that included imputation of missing data. We randomly divided the data into an 80% training sample and used the remaining 20% for model evaluation. The AUC for the full model was 0.664. A reduced model without SNPs performed similarly (AUC = 0.649). Both models performed better than a baseline model that included age and study site only (AUC = 0.563). The best predictive power was obtained in the full model among women younger than 50 years of age (AUC = 0.714); however, the addition of SNPs increased the AUC the most for women older than 50 years of age (AUC = 0.638 vs. 0.616). Adapting this improved model to estimate absolute risk and evaluating it in prospective data sets is warranted.
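
The AUC values quoted above are Mann-Whitney statistics: the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen control. A minimal sketch on hypothetical score lists (the study's models themselves are hierarchical logistic regressions and are not reproduced here):

```python
def auc(case_scores, control_scores):
    """Empirical AUC: fraction of (case, control) pairs in which the case
    outscores the control, counting ties as one half."""
    wins = 0.0
    for c in case_scores:
        for k in control_scores:
            if c > k:
                wins += 1.0
            elif c == k:
                wins += 0.5
    return wins / (len(case_scores) * len(control_scores))
```

On this reading, the reported AUC of 0.664 means the full model ranks a random case above a random control about two times in three, against 0.5 for chance.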

Relevance: 40.00%

Abstract:

This paper presents MOTION, a modular on-line model for urban traffic signal control. It consists of a network and a local level and builds on enhanced traffic state estimation. Special consideration is given to the prioritization of public transit. MOTION provides possibilities for the interaction with integrated urban management systems.

Relevance: 40.00%

Abstract:

The role of computer modeling has grown recently to integrate itself as an inseparable tool to experimental studies for the optimization of automotive engines and the development of future fuels. Traditionally, computer models rely on simplified global reaction steps to simulate the combustion and pollutant formation inside the internal combustion engine. With the current interest in advanced combustion modes and injection strategies, this approach depends on arbitrary adjustment of model parameters that could reduce credibility of the predictions. The purpose of this study is to enhance the combustion model of KIVA, a computational fluid dynamics code, by coupling its fluid mechanics solution with detailed kinetic reactions solved by the chemistry solver, CHEMKIN. As a result, an engine-friendly reaction mechanism for n-heptane was selected to simulate diesel oxidation. Each cell in the computational domain is considered as a perfectly-stirred reactor which undergoes adiabatic constant-volume combustion. The model was applied to an ideally-prepared homogeneous-charge compression-ignition (HCCI) combustion and direct injection (DI) diesel combustion. Ignition and combustion results show that the code successfully simulates the premixed HCCI scenario when compared to traditional combustion models. Direct injection cases, on the other hand, do not offer a reliable prediction, mainly due to the lack of a turbulent-mixing model, inherent in the perfectly-stirred reactor formulation. In addition, the model is sensitive to intake conditions and experimental uncertainties, which require implementation of enhanced predictive tools. It is recommended that future improvements consider turbulent-mixing effects as well as optimization techniques to accurately simulate the actual in-cylinder process with reduced computational cost. Furthermore, the model requires the extension of existing fuel oxidation mechanisms to include pollutant formation kinetics for emission control studies.
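
The per-cell chemistry step, a perfectly-stirred reactor undergoing adiabatic constant-volume combustion, can be caricatured with one-step Arrhenius kinetics; every rate constant and heat-release parameter below is an illustrative assumption, not a value from the n-heptane mechanism or from KIVA/CHEMKIN:

```python
import math

def ignition_delay(T0, Y0=1.0, A=1e10, Ea_over_R=15000.0, q=2000.0,
                   dt=1e-6, t_max=0.1):
    """Toy adiabatic constant-volume ignition with one-step Arrhenius kinetics.

    dY/dt = -A * Y * exp(-Ea/(R*T));  dT/dt = -q * dY/dt  (explicit Euler).
    All parameters are illustrative. Returns the time (s) at which the
    temperature first rises 400 K above T0, or None if no ignition by t_max.
    """
    T, Y, t = float(T0), float(Y0), 0.0
    while t < t_max:
        rate = A * Y * math.exp(-Ea_over_R / T)   # fuel consumption rate, 1/s
        Y = max(0.0, Y - rate * dt)               # fuel mass fraction depletes
        T += q * rate * dt                        # released heat raises T
        t += dt
        if T >= T0 + 400.0:
            return t
    return None
```

Real per-cell integration uses stiff ODE solvers and hundreds of species; this sketch only shows the structure of the coupled fuel-temperature update that makes each cell behave as a stirred reactor.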