853 results for performance measures
Abstract:
Classical metapopulation theory assumes a static landscape. However, empirical evidence indicates many metapopulations are driven by habitat succession and disturbance. We develop a stochastic metapopulation model, incorporating habitat disturbance and recovery, coupled with patch colonization and extinction, to investigate the effect of habitat dynamics on persistence. We discover that habitat dynamics play a fundamental role in metapopulation dynamics. The mean number of suitable habitat patches is not adequate for characterizing the dynamics of the metapopulation. For a fixed mean number of suitable patches, we discover that the details of how disturbance affects patches and how patches recover influence metapopulation dynamics in a fundamental way. Moreover, metapopulation persistence depends not only on the average lifetime of a patch, but also on the variance in patch lifetime and the synchrony in patch dynamics that results from disturbance. Finally, there is an interaction between the habitat and metapopulation dynamics; for instance, declining metapopulations react differently to habitat dynamics than expanding metapopulations. We close by emphasizing the importance of using performance measures appropriate to stochastic systems when evaluating their behavior, such as the probability distribution of the state of the metapopulation, conditional on it being extant (i.e., the quasistationary distribution).
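As a rough illustration of the quasistationary measure this abstract advocates, the sketch below simulates a toy three-state patch model (unsuitable, suitable-empty, occupied) and tallies the occupied-patch count conditional on the metapopulation being extant. All rates, the three-state scheme, and the restart-on-extinction conditioning trick are assumptions for illustration, not the authors' model.

```python
import random

def simulate_quasistationary(n_patches=20, c=0.3, e=0.1, d=0.05, r=0.2,
                             steps=20000, burn_in=2000, seed=1):
    """Toy stochastic patch-occupancy model with habitat dynamics.
    States: 0 = unsuitable, 1 = suitable but empty, 2 = occupied.
    Disturbance (d) destroys patches, recovery (r) restores them;
    colonization (c, scaled by occupied fraction) and extinction (e)
    act on suitable patches. Returns the empirical distribution of
    occupied-patch counts conditional on non-extinction (a crude
    estimate of the quasistationary distribution)."""
    rng = random.Random(seed)
    patches = [2] * n_patches
    counts = {}
    for t in range(steps):
        frac_occ = sum(1 for p in patches if p == 2) / n_patches
        new = []
        for p in patches:
            if p == 0:                       # unsuitable: may recover
                new.append(1 if rng.random() < r else 0)
            elif p == 1:                     # suitable-empty
                if rng.random() < d:
                    new.append(0)            # disturbed
                elif rng.random() < c * frac_occ:
                    new.append(2)            # colonized
                else:
                    new.append(1)
            else:                            # occupied
                if rng.random() < d:
                    new.append(0)            # disturbance destroys patch
                elif rng.random() < e:
                    new.append(1)            # local extinction
                else:
                    new.append(2)
        patches = new
        occ = sum(1 for p in patches if p == 2)
        if occ == 0:
            # condition on extancy: restart and do not record this step
            patches = [2] * n_patches
            continue
        if t >= burn_in:
            counts[occ] = counts.get(occ, 0) + 1
    total = sum(counts.values())
    return {k: v / total for k, v in sorted(counts.items())}
```

The returned distribution, rather than its mean alone, is the kind of stochastic performance measure the abstract argues for.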
Abstract:
Business Process Management (BPM) is widely seen as the top priority in organizations wanting to survive competitive markets. However, the current academic research agenda does not seem to align with industry demands. In this paper, we address the need to identify the actual issues that organizations face in their efforts to manage business processes. To that end, we report a number of critical issues identified by industry, in what we consider to be the first steps towards an industry-driven research agenda for the BPM area. The reported issues are derived from a series of focus groups conducted with Australian organizations. The findings point to, among others, a need for more consolidated efforts in the areas of business process governance, systematic change management, the development of BPM methodologies, and the introduction of appropriate performance measures.
Abstract:
Traditionally, machine learning algorithms have been evaluated in applications where assumptions can be reliably made about class priors and/or misclassification costs. In this paper, we consider the case of imprecise environments, where little may be known about these factors and they may well vary significantly when the system is applied. Specifically, the use of precision-recall analysis is investigated and compared to better-known performance measures such as error rate and the receiver operating characteristic (ROC). We argue that while ROC analysis is invariant to variations in class priors, this invariance in fact hides an important factor of the evaluation in imprecise environments. Therefore, we develop a generalised precision-recall analysis methodology in which variation due to prior class probabilities is incorporated into a multi-way analysis of variance (ANOVA). The increased sensitivity and reliability of this approach is demonstrated in a remote sensing application.
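The prior-sensitivity point this abstract makes can be shown in a few lines: a classifier's TPR and FPR (the coordinates of the ROC curve) do not change with class priors, but the precision implied by that operating point does. A minimal sketch (function names are illustrative, not from the paper):

```python
def precision_recall(tp, fp, fn):
    """Precision and recall from confusion-matrix counts."""
    return tp / (tp + fp), tp / (tp + fn)

def precision_at_prior(tpr, fpr, pos_prior):
    """Precision implied by a fixed (TPR, FPR) operating point at a new
    positive-class prior. TPR and FPR are invariant to the prior (the ROC
    property), but precision is not -- which is exactly the sensitivity
    the abstract argues ROC analysis hides."""
    return (tpr * pos_prior) / (tpr * pos_prior + fpr * (1 - pos_prior))
```

For example, an operating point with TPR = 0.8 and FPR = 0.1 yields precision near 0.89 at balanced priors but under 0.3 when positives make up only 5% of cases.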
Abstract:
Many maintenance managers find it difficult to justify investments in maintenance improvement initiatives. In part, this is due to a tendency by mine managers to regard maintenance purely as a cost centre, and not as a process able to influence productive capacity and profit. It is also hindered by a lack of alignment between commonly used maintenance performance measures and key business drivers, and the lack of formal business training amongst maintenance professionals. With this in mind, a model to assist maintenance managers in evaluating the benefits of maintenance improvement projects was recently formulated. The model considers four cost saving dimensions. These are: 1. reduction in the cost of unplanned repairs and maintenance, 2. increased or accelerated production and/or sales, 3. spares inventory reduction, and 4. reduction in over-investment in physical assets and operating costs. This paper discusses the application of this model and a number of numerical examples are given to justify investments in maintenance improvement projects having varying objectives.
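The structure of the four-dimension model can be illustrated with a toy benefit calculation; the variable names and the simple sum-and-ratio form below are illustrative assumptions, not the model's actual formulation.

```python
def maintenance_benefit(repair_savings, production_gain,
                        inventory_reduction, asset_savings, project_cost):
    """Sum the four cost-saving dimensions listed in the abstract:
    1. reduced unplanned repair costs, 2. increased/accelerated production,
    3. spares inventory reduction, 4. reduced over-investment in assets and
    operating costs -- then express the net benefit as a simple return on
    the improvement project's cost."""
    total = (repair_savings + production_gain
             + inventory_reduction + asset_savings)
    roi = (total - project_cost) / project_cost
    return total, roi
```

A project costing 100 that saves 100 in repairs, gains 50 in production, frees 30 in spares and 20 in assets would, under this toy accounting, double its money.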
Abstract:
This research empirically investigates the performance of firms in the Greater ABC, an industrialized region of growing economic importance to Brazil. The seven cities that make up the region (Santo André, São Bernardo do Campo, São Caetano do Sul, Diadema, Mauá, Ribeirão Pires, and Rio Grande da Serra) have in recent years grown economically at a rate considerably above the national average, and their development has helped drive the country's growth. The empirical analysis uses panel data to investigate the performance of firms in the seven cities of the Greater ABC from 2001 to 2008, applying a multilevel methodology and three performance measures: ROA, OROA, and ROE. The multilevel methodology made it possible to identify the main effects associated (or not) with firm performance, among them the year, the firm itself, the subsector, the sector, and the city in which the firm is located. The three performance measures showed substantial convergence; in addition, the study found a significant effect on firm performance associated with the year and with the firm itself, while the sectors, subsectors, and the city where the firm is located showed no significant effect on performance.
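For context, the three performance measures named in the abstract have standard textbook definitions, sketched below; the thesis's exact variable construction (e.g. averaging of assets, treatment of operating income) may differ.

```python
def roa(net_income, total_assets):
    """Return on assets: net income over total assets."""
    return net_income / total_assets

def oroa(operating_income, total_assets):
    """Operating return on assets: operating income over total assets
    (the usual definition of OROA)."""
    return operating_income / total_assets

def roe(net_income, shareholders_equity):
    """Return on equity: net income over shareholders' equity."""
    return net_income / shareholders_equity
```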
Abstract:
We explore the dependence of performance measures, such as the generalization error and generalization consistency, on the structure and the parameterization of the prior on `rules', instanced here by the noisy linear perceptron. Using a statistical mechanics framework, we show how one may assign values to the parameters of a model for a `rule' on the basis of data instancing the rule. Information about the data, such as input distribution, noise distribution and other `rule' characteristics may be embedded in the form of general gaussian priors for improving net performance. We examine explicitly two types of general gaussian priors which are useful in some simple cases. We calculate the optimal values for the parameters of these priors and show their effect in modifying the most probable, MAP, values for the rules.
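The effect of a Gaussian prior on MAP rule estimates can be seen in a one-parameter analogue: for a single weight with Gaussian noise, the MAP estimate is the least-squares estimate shrunk by the noise-to-prior variance ratio. This is a minimal sketch of the general idea, not the paper's statistical mechanics calculation.

```python
def map_weight(xs, ys, noise_var, prior_var):
    """MAP estimate of a single weight for y = w*x + Gaussian noise under
    a zero-mean Gaussian prior on w. The prior shrinks the estimate toward
    zero by lam = noise_var / prior_var:
        w_MAP = sum(x*y) / (sum(x^2) + lam)"""
    lam = noise_var / prior_var
    return (sum(x * y for x, y in zip(xs, ys))
            / (sum(x * x for x in xs) + lam))
```

As the prior variance grows (lam → 0) the estimate approaches the unregularized fit; a tight prior pulls it toward the prior mean.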
Abstract:
The dynamics of supervised learning in layered neural networks were studied in the regime where the size of the training set is proportional to the number of inputs. The evolution of macroscopic observables, including the two relevant performance measures, can be predicted using dynamical replica theory. Three approximation schemes, aimed at eliminating the need to solve a functional saddle-point equation at each time step, have been derived.
Abstract:
How do signals from the 2 eyes combine and interact? Our recent work has challenged earlier schemes in which monocular contrast signals are subject to square-law transduction followed by summation across eyes and binocular gain control. Much more successful was a new 'two-stage' model in which the initial transducer was almost linear and contrast gain control occurred both pre- and post-binocular summation. Here we extend that work by: (i) exploring the two-dimensional stimulus space (defined by left- and right-eye contrasts) more thoroughly, and (ii) performing contrast discrimination and contrast matching tasks for the same stimuli. Twenty-five base-stimuli made from 1 c/deg patches of horizontal grating, were defined by the factorial combination of 5 contrasts for the left eye (0.3-32%) with five contrasts for the right eye (0.3-32%). Other than in contrast, the gratings in the two eyes were identical. In a 2IFC discrimination task, the base-stimuli were masks (pedestals), where the contrast increment was presented to one eye only. In a matching task, the base-stimuli were standards to which observers matched the contrast of either a monocular or binocular test grating. In the model, discrimination depends on the local gradient of the observer's internal contrast-response function, while matching equates the magnitude (rather than gradient) of response to the test and standard. With all model parameters fixed by previous work, the two-stage model successfully predicted both the discrimination and the matching data and was much more successful than linear or quadratic binocular summation models. These results show that performance measures and perception (contrast discrimination and contrast matching) can be understood in the same theoretical framework for binocular contrast vision. © 2007 VSP.
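The shape of a two-stage scheme (monocular gain control, binocular summation, then a second gain-control stage) can be sketched as below. The parameter values and the exact functional form here are illustrative placeholders only, not the fitted model from the study.

```python
def two_stage_response(c_left, c_right, m=1.3, s=1.0, p=8.0, q=6.5, z=0.01):
    """Sketch of a two-stage binocular contrast response: a nearly linear
    monocular transducer divided by an interocular gain pool (stage 1),
    binocular summation, then a second accelerating gain-control stage."""
    pool = s + c_left + c_right                    # interocular suppression
    binocular = (c_left ** m + c_right ** m) / pool  # sum across the eyes
    return binocular ** p / (z + binocular ** q)     # stage-2 gain control
```

In such a model, discrimination thresholds track the local gradient of this response function while matching equates its magnitude, which is how the two task types can share one framework.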
Abstract:
This article reports the results of a web-based survey of real estate portfolio managers in the pension fund industry. The study focused on ascertaining the real estate research interests of the respondents as well as whether or not research funding should be allocated to various research topics. Performance measures of real estate assets and portfolios, microeconomic factors affecting real estate and the role of real estate in a mixed-asset portfolio were the top three real estate research interests. There was some variation by the type and size of fund providing evidence that segmentation is important within the money management industry. Respondents were also queried on more focused research subtopics and additional questions in the survey focused on satisfaction with existing real estate benchmarks, and perceptions of the usefulness of published research. Findings should be used to guide research practitioners and academics as to the most important research interests of plan sponsor real estate investment managers.
Abstract:
In recent discussions of the contribution of marketing to the strategy dialogue, market orientation has been singled out as being of particular importance to the understanding of competitive advantage (Day et al 1992, Hunt and Lamb 2000). Research in the past has focused primarily on firms operating in domestic markets. As such, despite recent progress, the relevance of market orientation as a construct in the context of multinational corporations (MNCs) and their foreign subsidiaries remains unclear. In this study, we set out to explore the role of market orientation in subsidiary business performance. An investigation of a sample of 252 foreign subsidiaries in the UK revealed that, except for "receptive" subsidiaries (Taggart 1998), market orientation has significant positive relationships with a number of business performance measures in all three other types of subsidiaries, suggesting that market orientation is a key driver of business performance at foreign subsidiaries.
Abstract:
A local area network that can support both voice and data packets offers economic advantages: only a single network is needed for both types of traffic, it provides greater flexibility to changing user demands, and it enables efficient use of the transmission capacity. The latter aspect is very important in local broadcast networks where capacity is a scarce resource, for example mobile radio. This research has examined two types of local broadcast network: the Ethernet-type bus local area network and a mobile radio network with a central base station. With such contention networks, medium access control (MAC) protocols are required to gain access to the channel. MAC protocols must provide efficient scheduling on the channel between the distributed population of stations who want to transmit. No access scheme can exceed the performance of a single-server queue, due to the spatial distribution of the stations: stations cannot in general form a queue without using part of the channel capacity to exchange protocol information. In this research, several medium access protocols have been examined and developed in order to increase the channel throughput compared to existing protocols. However, the established performance measures of average packet time delay and throughput cannot adequately characterise protocol performance for packet voice. Rather, the percentage of bits delivered within a given time bound becomes the relevant performance measure. Performance evaluation of the protocols has been examined using discrete event simulation and, in some cases, also by mathematical modelling. All the protocols use either implicit or explicit reservation schemes, with their efficiency dependent on the fact that many voice packets are generated periodically within a talkspurt. Two of the protocols are based on the existing 'Reservation Virtual Time CSMA/CD' protocol, which forms a distributed queue through implicit reservations.
This protocol has been improved firstly by utilising two channels: a packet transmission channel and a packet contention channel. Packet contention is then performed in parallel with a packet transmission to increase throughput. The second protocol uses variable-length packets to reduce the contention time between transmissions on a single channel. A third protocol developed is based on contention for explicit reservations. Once a station has achieved a reservation, it maintains this effective queue position for the remainder of the talkspurt and transmits after it has sensed the transmission from the preceding station in the queue. In the mobile radio environment, adaptations to the protocols were necessary so that their operation was robust to signal fading. This was achieved through centralised control at a base station, unlike the local area network versions, where control was distributed among the stations. The results show an improvement in throughput compared to some previous protocols. Further work includes subjective testing to validate the protocols' effectiveness.
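The voice-oriented performance measure described in this abstract, the fraction of packets delivered within a delay bound rather than the mean delay, is simple to state in code (names here are illustrative):

```python
def fraction_within_bound(delays_ms, bound_ms):
    """Fraction of packets whose delivery delay is within the bound.
    For packet voice this can diverge sharply from average delay: a few
    very late packets inflate the mean but barely move this measure."""
    return sum(1 for d in delays_ms if d <= bound_ms) / len(delays_ms)
```

For delays of [5, 5, 5, 500] ms against a 50 ms bound, the mean delay (128.75 ms) looks hopeless, yet three quarters of the packets arrive in time.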
Abstract:
The ERS-1 Satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snap-shots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish if the wind is blowing toward or away from the sensor device. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that is reliant on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method to model multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, that incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed. 
The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori probability wind field retrieved from the posterior distribution as the prediction. For the third technique, Markov Chain Monte Carlo (MCMC) techniques were employed to estimate the mass associated with significant modes of the posterior distribution, and make predictions based on the mode with the greatest mass associated with it. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. It was shown that the general methods were unsuitable for this application due to computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
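The third prediction rule described above, choosing the posterior mode carrying the greatest mass rather than the single highest-probability point, can be caricatured with a one-dimensional binning sketch. The function name, the histogram binning, and the scalar setting are all illustrative stand-ins for the thesis's MCMC procedure over wind fields.

```python
from collections import Counter

def heaviest_mode(samples, bin_width):
    """Bin posterior samples and predict the centre of the bin carrying
    the greatest mass -- a crude analogue of preferring the mode with the
    most probability mass over the bare MAP point."""
    bins = Counter(int(s // bin_width) for s in samples)
    best_bin = max(bins, key=bins.get)
    return (best_bin + 0.5) * bin_width
```

With samples [1.1, 1.2, 1.3, 5.0] and unit bins, the prediction is 1.5 even if the single most probable sample were the outlier near 5.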
Abstract:
This thesis examines individual differences in work behaviour of rubber tappers. The study examined sex, age, experience and race differences and their interactions with terrain on job performance, absenteeism, and job satisfaction of 1053 rubber tappers. Rubber tappers are unskilled blue-collar workers who essentially do the same type of work and are paid the same rates of pay. There are very few studies that have compared male and female blue-collar workers doing similar jobs in organisational settings. This study is one of the few investigations that examine sex differences in job performance of blue-collar workers doing the same job using production data. Studies on age differences in work behaviour encounter numerous methodological difficulties such as high turnover, internal transfers and problems associated with age differences in educational levels. The participation of rubber tappers in this study is envisaged to overcome these difficulties because attrition rates of rubber tappers are low, and internal transfers are non-existent. Further, the educational levels of rubber tappers are relatively similar across different age cohorts, as most rubber tappers have little or no education. Two measures of both job performance and absenteeism were derived from payroll records. The two job performance measures were total crop production and attendance. The two absenteeism measures were avoidable and unavoidable absence rates. Overall job satisfaction was determined using a 4-item scale. Significant sex, age, experience and race differences were obtained for job performance, absenteeism and job satisfaction. Significant interactive effects were also obtained for sex, age, experience, race and terrain for job performance and absenteeism. The results are discussed in relation to the abilities and motivation of rubber tappers. The implication of these findings for employee selection and human resource management in rubber estates is discussed.
Abstract:
This thesis investigates how people select items from a computer display using the mouse input device. The term computer mouse refers to a class of input devices which share certain features, but these may have different characteristics which influence the ways in which people use the device. Although task completion time is one of the most commonly used performance measures for input device evaluation, there is no consensus as to its definition. Furthermore, most mouse studies fail to provide adequate assurances regarding its correct measurement. Therefore, precise and accurate timing software was developed which permitted the recording of movement data that, by means of automated analysis, yielded the device movements made. Input system gain, an important task parameter, has been poorly defined and misconceptualized in most previous studies. The issue of gain has been clarified and investigated within this thesis. Movement characteristics varied between users and within users, even for the same task conditions. The variables of target size, movement amplitude, and experience exerted significant effects on performance. Subjects consistently undershot the target area. This may be a consequence of the particular task demands. Although task completion times indicated that mouse performance had stabilized after 132 trials, the movement traces, even of very experienced users, indicated that there was still considerable room for improvement in performance, as indicated by the proportion of poorly made movements. The mouse input device was suitable for older novice device users, but they took longer to complete the experimental trials. Given the diversity and inconsistency of device movements, even for the same task conditions, caution is urged when interpreting averaged grouped data.
Performance was found to be sensitive to task conditions, device implementations, and experience in ways that are problematic for theoretical descriptions of device movement and limit the generalizability of such findings within this thesis.
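For context on how target size and movement amplitude are conventionally combined in pointing studies, the standard Shannon formulation of Fitts' index of difficulty is sketched below; the thesis's own movement analysis may well depart from it.

```python
import math

def fitts_index_of_difficulty(amplitude, width):
    """Fitts' index of difficulty (Shannon form): ID = log2(A/W + 1).
    Larger movement amplitudes and smaller targets both raise the ID,
    and task completion time is typically modelled as linear in ID."""
    return math.log2(amplitude / width + 1)
```

Doubling the amplitude or halving the target width each adds difficulty, but sub-linearly, which is why averaged completion times can mask the movement-level variability the thesis documents.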
Abstract:
Corporate restructuring is perceived as a challenge to research. Prior studies do not provide conclusive evidence regarding the effects of restructuring. Given these inconclusive findings, this research attempts to examine the effects of restructuring events amongst UK listed firms. The sample firms are listed on the LSE and the London AIM stock exchange. Only completed restructuring transactions are included in the study. The time horizon extends from 1999 to 2003. A three-year floating window is assigned to examine the sample firms. The key enquiry is to scrutinise the ex post effects of restructuring on performance and value measures of firms, in contrast to a matched-criteria non-restructured sample. A cross-sectional study employing logit estimates is undertaken to examine firm characteristics of the restructuring samples. Further, additional parameters, i.e. Conditional Volatility and Asymmetry, are generated under the GJR-GARCH estimate and reiterated in logit models to capture time-varying heteroscedasticity of the samples. This research incorporates most forms of restructuring, while prior studies have examined only certain forms; in particular, those studies have made limited attempts to examine different restructuring events simultaneously. In addition to the logit analysis, an event study is adopted to evaluate the announcement effect of restructuring under both the OLS and GJR-GARCH estimates, supplementing our prior results. By engaging a composite empirical framework, our estimation method validates a full appreciation of the restructuring effect. The study provides evidence that restructurings have a non-trivial, significant positive effect. There is some evidence that the response differs with the type of restructuring, particularly when the event study is applied. The results establish that performance measures, i.e.
Operating Profit Margin, Return on Equity, Return on Assets, Growth, Size, Profit Margin and Shareholders' Ownership indicate consistent and significant increases. However, Leverage and Asset Turnover suggest a reasonable influence on restructuring across the sample period. Similarly, value measures, i.e. Abnormal Returns, Return on Equity and Cash Flow Margin, suggest sizeable improvement. A notable characteristic seen coherently throughout the analysis is the decreasing proportion of Systematic Risk. Consistent with these findings, Conditional Volatility and Asymmetry exhibit a similar trend. The event study analysis suggests that, on average, the market perceives restructuring favourably and shareholders experience significant and systematic positive gains.
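The abnormal-return measure central to the event study above has a standard market-model definition, sketched here; the study's GJR-GARCH variant additionally models time-varying, asymmetric volatility in the error term, which this sketch omits.

```python
def abnormal_return(realized, alpha, beta, market_return):
    """Market-model abnormal return for one period:
    AR_t = R_t - (alpha + beta * R_m,t), i.e. the realized return minus
    the return the market model predicts for that period."""
    return realized - (alpha + beta * market_return)
```

For example, a stock returning 5% in a period when beta times the market return predicts 2% shows a 3% abnormal return attributable to the event window.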