955 results for Internal model control
Abstract:
How do signals from the two eyes combine and interact? Our recent work has challenged earlier schemes in which monocular contrast signals are subject to square-law transduction followed by summation across eyes and binocular gain control. Much more successful was a new 'two-stage' model in which the initial transducer was almost linear and contrast gain control occurred both pre- and post-binocular summation. Here we extend that work by: (i) exploring the two-dimensional stimulus space (defined by left- and right-eye contrasts) more thoroughly, and (ii) performing contrast discrimination and contrast matching tasks for the same stimuli. Twenty-five base-stimuli, made from 1 c/deg patches of horizontal grating, were defined by the factorial combination of five contrasts for the left eye (0.3-32%) with five contrasts for the right eye (0.3-32%). Other than in contrast, the gratings in the two eyes were identical. In a 2IFC discrimination task, the base-stimuli were masks (pedestals), where the contrast increment was presented to one eye only. In a matching task, the base-stimuli were standards to which observers matched the contrast of either a monocular or binocular test grating. In the model, discrimination depends on the local gradient of the observer's internal contrast-response function, while matching equates the magnitude (rather than gradient) of response to the test and standard. With all model parameters fixed by previous work, the two-stage model successfully predicted both the discrimination and the matching data and was much more successful than linear or quadratic binocular summation models. These results show that performance measures and perception (contrast discrimination and contrast matching) can be understood in the same theoretical framework for binocular contrast vision. © 2007 VSP.
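The two-stage structure described above (a nearly linear initial transducer with interocular suppression, followed by binocular summation and a second gain-control stage) can be sketched as follows. This is a minimal illustration only: the function name and all parameter values are placeholders, not the fitted values from the original work.

```python
def two_stage_response(c_left, c_right,
                       m=1.3, s=1.0, p=8.0, q=6.5, z=0.08):
    """Sketch of a two-stage binocular contrast model.

    Stage 1: each eye's contrast passes through a nearly linear
    transducer (exponent m) and is divisively normalised by the
    contrast in both eyes (interocular suppression); the two
    monocular outputs are then summed.
    Stage 2: the binocular sum passes through a second contrast
    gain-control stage.  All parameter values are illustrative.
    """
    denom = s + c_left + c_right          # interocular suppression pool
    binoc = c_left**m / denom + c_right**m / denom
    return binoc**p / (z + binoc**q)      # post-summation gain control
```

In this framework, discrimination thresholds would be read off the local gradient of `two_stage_response` at the pedestal, while matching equates the response magnitudes of test and standard.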
Abstract:
Blurred edges appear sharper in motion than when they are stationary. We have previously shown how such distortions in perceived edge blur may be explained by a model which assumes that luminance contrast is encoded by a local contrast transducer whose response becomes progressively more compressive as speed increases. To test this model further, we measured the sharpening of drifting, periodic patterns over a large range of contrasts, blur widths, and speeds. The results indicate that, while sharpening increased with speed, it was practically invariant with contrast. This contrast invariance cannot be explained by a fixed compressive nonlinearity, since that predicts almost no sharpening at low contrasts. We show by computational modelling of spatiotemporal responses that, if a dynamic contrast gain control precedes the static nonlinear transducer, then motion sharpening, its speed dependence, and its invariance with contrast can be predicted with reasonable accuracy.
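The core of that argument can be illustrated with a toy sketch: a dynamic gain control normalises the edge profile before a speed-dependent compressive transducer, so the slope steepening (a proxy for sharpening) becomes nearly independent of input contrast. All functional forms and parameter values below are assumptions for illustration, not the fitted model.

```python
import numpy as np

def perceived_edge(x, contrast, speed, blur=1.0, g=0.05):
    """Toy gain-control-then-transducer model of motion sharpening.

    The blurred edge is first divided by a dynamic contrast gain
    pool, so the signal reaching the compressive transducer spans
    a similar range regardless of input contrast; the transducer's
    compression (k) grows with speed.  Illustrative values only.
    """
    edge = contrast * np.tanh(x / blur)      # blurred luminance edge
    pool = np.mean(np.abs(edge))             # dynamic contrast gain pool
    normed = edge / (g + pool)               # gain control precedes transducer
    k = 1.0 + 2.0 * speed                    # compression increases with speed
    return np.tanh(k * normed) / np.tanh(k)  # compressive transducer
```

Measuring the slope at the edge centre shows that the sharpening ratio (drifting vs stationary) grows with speed but is almost the same at low and high contrast, mirroring the psychophysical result.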
Abstract:
In construction projects, the aim of project control is to ensure projects finish on time, within budget, and achieve other project objectives. During the last few decades, numerous project control methods have been developed and adopted by project managers in practice. However, many existing methods focus on describing what the processes and tasks of project control are, not on how these tasks should be conducted. There is also a potential gap between the principles that underlie these methods and project control practice. As a result, time and cost overruns are still common in construction projects, partly attributable to deficiencies of existing project control methods and difficulties in implementing them. This paper describes a new project cost and time control model, the project control and inhibiting factors management (PCIM) model, developed through a study involving extensive interaction with construction practitioners in the UK, which better reflects the real needs of project managers. A good-practice checklist is also developed to facilitate implementation of the model. © 2013 American Society of Civil Engineers.
Abstract:
This thesis is concerned with the inventory control of items that can be considered independent of one another. The decisions of when to order, and in what quantity, are the controllable or independent variables in cost expressions which are minimised. The four systems considered are referred to as (Q, R), (nQ, R, T), (M, T) and (M, R, T). With (Q, R), a fixed quantity Q is ordered each time the order cover (i.e. stock in hand plus stock on order) equals or falls below R, the re-order level. With the other three systems, reviews are made only at intervals of T. With (nQ, R, T), an order for nQ is placed if on review the inventory cover is less than or equal to R, where n, an integer, is chosen at the time so that the new order cover just exceeds R. In (M, T), each order increases the order cover to M. Finally, in (M, R, T), when on review the order cover does not exceed R, enough is ordered to increase it to M. The (Q, R) system is examined at several levels of complexity, so that the theoretical savings in inventory costs obtained with more exact models could be compared with the increases in computational costs. Since the exact model proved preferable for the (Q, R) system, only exact models were derived for the other three systems. Several methods of optimization were tried, but most were found inappropriate for the exact models because of non-convergence; however, one method did work for each of the exact models. Demand is considered continuous and, with one exception, the distribution assumed is the normal distribution truncated so that demand is never less than zero. Shortages are assumed to result in backorders, not lost sales. However, the shortage cost is a function of three items, one of which, the backorder cost, may be either a linear, quadratic or exponential function of the length of time of a backorder, with or without a period of grace. Lead times are assumed constant or gamma-distributed. Lastly, the actual supply quantity is allowed to be a random variable.
All the sets of equations were programmed for a KDF 9 computer, and the computed performances of the four inventory control procedures are compared under each assumption.
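The ordering rules of the four systems can be summarised in a short sketch. The function and the particular values of Q, R and M are hypothetical illustrations; the decision logic follows the descriptions above.

```python
def order_quantity(policy, cover, Q=50, R=20, M=100):
    """Ordering decision for the four inventory systems (sketch).

    cover = order cover, i.e. stock in hand plus stock on order.
    (Q, R)     : continuous review; order Q when cover <= R.
    (nQ, R, T) : periodic review; if cover <= R, order the smallest
                 integer multiple nQ that lifts cover above R.
    (M, T)     : periodic review; always order up to M.
    (M, R, T)  : periodic review; order up to M only when cover <= R.
    Q, R and M values here are illustrative placeholders.
    """
    if policy == "Q,R":
        return Q if cover <= R else 0
    if policy == "nQ,R,T":
        if cover > R:
            return 0
        n = 1
        while cover + n * Q <= R:   # smallest n with new cover above R
            n += 1
        return n * Q
    if policy == "M,T":
        return M - cover
    if policy == "M,R,T":
        return M - cover if cover <= R else 0
    raise ValueError(f"unknown policy: {policy}")
```

For example, under (nQ, R, T) a review finding heavy backorders (negative cover) yields a multiple of Q large enough to lift the order cover just above R.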
Abstract:
Recent research has highlighted several job characteristics salient to employee well-being and behavior for which there are no adequate generally applicable measures. These include timing and method control, monitoring and problem-solving demand, and production responsibility. In this article, an attempt to develop measures of these constructs provided encouraging results. Confirmatory factor analyses applied to data from 2 samples of shop-floor employees showed a consistent fit to a common 5-factor measurement model. Scales corresponding to each of the dimensions showed satisfactory internal and test–retest reliabilities. As expected, the scales also discriminated between employees in different jobs and employees working with contrasting technologies.
Abstract:
In recent years the topic of risk management has moved up the agenda of both government and industry, and private sector initiatives to improve risk and internal control systems have been mirrored by similar promptings for change in the public sector. Both regulators and practitioners now view risk management as an integral part of the process of corporate governance, and an aid to the achievement of strategic objectives. The paper uses case study material on the risk management control system at Birmingham City Council to extend existing theory by developing a contingency theory for the public sector. The case demonstrates that whilst the structure of the control system fits a generic model, the operational details indicate that controls are contingent upon three core variables: central government policies, information and communication technology, and organisational size. All three contingent variables are suitable for testing the theory across the broader public sector arena.