51 results for Masculinity in performance


Relevance: 90.00%

Abstract:

As a source or sink of reactive power, compensators can be made from a voltage-sourced inverter circuit with the a.c. terminals of the inverter connected to the system through an inductive link and with a capacitor connected across the d.c. terminals. Theoretical calculations on linearised models of the compensators have shown that the parameters characterising the performance are the reduced firing angle and the resonance ratio. The resonance ratio is the ratio of the natural frequency of oscillation of the energy storage components in the circuit to the system frequency. The reduced firing angle is the firing angle of the inverter divided by the damping coefficient, β, where β is half the R-to-X ratio of the link between the inverter and the system. The theoretical results have been verified by computer simulation and experiment. There is a narrow range of suitable values for the resonance ratio: below this range there is no appreciable improvement in performance despite an increase in the cost of the energy storage components, and above it the performance of the equipment is poor, with the current dominated by harmonics. The harmonic performance of the equipment is improved by using multiple inverters and phase-shifting transformers to increase the pulse number. The optimum value of the resonance ratio increases with pulse number, indicating a reduction in the energy storage components needed at high pulse numbers. The reactive power output from the compensator varies linearly with the reduced firing angle, while the losses vary as its square.
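
The parameter relationships described above can be summarised compactly. The Python sketch below is a minimal illustration of the definitions given in the abstract (resonance ratio, damping coefficient and reduced firing angle); the proportionality constants k_q and k_loss and all numerical values are invented for illustration and are not taken from the thesis.

```python
import math

def resonance_ratio(f_natural_hz, f_system_hz):
    """Natural frequency of the energy-storage components divided by
    the system frequency."""
    return f_natural_hz / f_system_hz

def reduced_firing_angle(alpha_rad, r, x):
    """Firing angle divided by the damping coefficient beta, where
    beta is half the R-to-X ratio of the inverter-system link."""
    beta = 0.5 * r / x
    return alpha_rad / beta

# Reactive power output is linear in the reduced firing angle, while
# losses are quadratic in it; k_q and k_loss are placeholder constants.
k_q, k_loss = 1.0, 0.05
alpha_r = reduced_firing_angle(math.radians(2.0), r=0.02, x=0.2)
print(f"resonance ratio = {resonance_ratio(250.0, 50.0):.2f}")
print(f"Q ~ {k_q * alpha_r:.3f}, losses ~ {k_loss * alpha_r**2:.4f}")
```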

Relevance: 90.00%

Abstract:

This thesis includes analysis of disordered spin ensembles corresponding to Exact Cover, a multi-access channel problem, and composite models combining sparse and dense interactions. The satisfiability problem in Exact Cover is addressed using a statistical analysis of a simple branch-and-bound algorithm. In the large-system limit the algorithm can be formulated as a branching process, for which critical properties can be analysed. Far from the critical point, a set of differential equations may be used to model the process; these are solved by numerical integration and exact bounding methods. The multi-access channel problem is formulated as an equilibrium statistical physics problem for the case of bit transmission on a channel with power control and synchronisation. A sparse code division multiple access method is considered; its optimal detection properties are examined in the typical case by use of the replica method and compared to the detection performance achieved by iterative decoding methods. These codes are found to exhibit phenomena closely resembling those of the well-understood dense codes. The composite model is introduced as an abstraction of canonical sparse and dense disordered spin models, and includes couplings on both dense and sparse topologies simultaneously. The new codes are shown to outperform sparse and dense codes in some regimes, both in optimal performance and in the performance achieved by iterative detection methods in finite systems.
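
As a concrete reference point for the kind of algorithm being analysed, the following Python sketch is a generic textbook backtracking (branch-and-bound) search for Exact Cover; it is not the thesis's exact algorithm, and the instance data are invented for illustration.

```python
def exact_cover(universe, subsets, partial=None):
    """Find indices of subsets covering each element of `universe`
    exactly once, or return None if no exact cover exists."""
    if partial is None:
        partial = []
    if not universe:
        return partial                    # every element covered exactly once
    pivot = next(iter(universe))          # branch on one uncovered element
    for i, s in enumerate(subsets):
        # s may be used only if it covers the pivot without re-covering
        # anything already covered (i.e. it lies inside the remainder).
        if pivot in s and s <= universe:
            result = exact_cover(universe - s, subsets, partial + [i])
            if result is not None:        # bound: stop at the first solution
                return result
    return None

U = {1, 2, 3, 4, 5}
S = [{1, 2}, {3, 4}, {2, 3}, {5}, {4, 5}]
print(exact_cover(U, S))                  # one valid cover, e.g. [0, 1, 3]
```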

Relevance: 90.00%

Abstract:

Combining the results of classifiers has shown much promise in machine learning generally. However, published work on combining text categorizers suggests that, for this particular application, improvements in performance are hard to attain. Exploratory research using a simple voting system is presented and discussed in the light of a probabilistic model originally developed for safety-critical software. It was found that typical categorization approaches produce predictions that are too similar for combining them to be effective, since they tend to fail on the same records. Further experiments using two less orthodox categorizers are also presented, which suggest that combining text categorizers can be successful provided the essential element of 'difference' is considered.
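
For readers unfamiliar with the combination scheme, the sketch below shows simple majority voting over categorizer outputs in Python; the labels and predictions are hypothetical, and the paper's probabilistic model of categorizer 'difference' is not reproduced here.

```python
from collections import Counter

def majority_vote(predictions):
    """Return the most common label among one document's predictions
    (ties resolve to the label counted first)."""
    return Counter(predictions).most_common(1)[0][0]

# Three hypothetical categorizers labelling four documents. Voting only
# helps when the categorizers fail on *different* records -- if they all
# fail on the same document, the majority is wrong along with them.
votes_per_doc = [
    ["sports", "sports", "politics"],
    ["politics", "politics", "politics"],
    ["tech", "sports", "tech"],
    ["sports", "politics", "sports"],
]
print([majority_vote(v) for v in votes_per_doc])
```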

Relevance: 90.00%

Abstract:

The objective of Total Productive Maintenance (TPM) is to maximise plant and equipment effectiveness, to create a sense of ownership among operators, and to promote continuous improvement through small-group activities involving production, engineering and maintenance personnel. This paper describes and analyses a case study of TPM implementation at a newspaper printing house in Singapore. Rather than adopting more conventional implementation methods, such as employing consultants or running a project with external training, a unique approach was adopted based on Action Research, using a spiral of cycles of planning, acting, observing and reflecting. An Action Research team of company personnel was specially formed to undertake the necessary fieldwork, and the team subsequently assisted with administering the resulting action plan. The main sources of maintenance and operational data were interviews with shop-floor workers, participative observation, and reviews conducted with members of the team. Content analysis using appropriate statistical techniques was used to test the significance of changes in performance between the start and completion of the TPM programme. The paper identifies the characteristics associated with the Action Research method when used to implement TPM and discusses the applicability of the approach in related industries and processes.

Relevance: 90.00%

Abstract:

EEG hyperscanning is a method for studying two or more individuals simultaneously, with the objective of elucidating how co-variations in their neural activity (i.e., hyperconnectivity) are influenced by their behavioral and social interactions. The aim of this study was to compare the performance of different hyperconnectivity measures using (i) simulated data, where the degree of coupling could be systematically manipulated, and (ii) individually recorded human EEG combined into pseudo-pairs of participants, where no hyper-connections could exist. With simulated data we found that each of the most widely used measures of hyperconnectivity was biased and detected hyper-connections where none existed. With pseudo-pairs of human data we found spurious hyper-connections that arose because there were genuine similarities between the EEG recorded from different people independently but under the same experimental conditions. Specifically, there were systematic differences between experimental conditions in the rhythmicity of the EEG that were common across participants. As any imbalance between experimental conditions in terms of stimulus presentation or movement may affect the rhythmicity of the EEG, this problem could apply in many hyperscanning contexts. Furthermore, because these spurious hyper-connections reflected real similarities between the EEGs, they were not Type-1 errors that could be overcome by an appropriate statistical control. However, some measures that have not previously been used in hyperconnectivity studies, notably the circular correlation coefficient (CCorr), were less susceptible to detecting spurious hyper-connections of this type. The reason for this advantage in performance is discussed, and the use of the CCorr as an alternative measure of hyperconnectivity is advocated.
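
Since the circular correlation coefficient may be unfamiliar, the Python sketch below computes the Fisher-Lee form of CCorr for two series of instantaneous phase angles, such as those extracted from two participants' EEG. It is a minimal illustration, not the paper's full hyperconnectivity pipeline, and the simulated phase data are invented.

```python
import numpy as np

def circ_mean(phases):
    """Circular mean of a vector of phase angles (radians)."""
    return np.angle(np.mean(np.exp(1j * phases)))

def ccorr(a, b):
    """Fisher-Lee circular correlation between two phase series."""
    sa = np.sin(a - circ_mean(a))
    sb = np.sin(b - circ_mean(b))
    return np.sum(sa * sb) / np.sqrt(np.sum(sa**2) * np.sum(sb**2))

# Two phase series with a fixed offset plus noise: CCorr should be high.
rng = np.random.default_rng(0)
a = rng.uniform(-np.pi, np.pi, 1000)
b = np.angle(np.exp(1j * (a + 0.3 + 0.5 * rng.standard_normal(1000))))
print(f"CCorr = {ccorr(a, b):.3f}")
```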

Relevance: 90.00%

Abstract:

The intensity of global competition and ever-increasing economic uncertainties have led organizations to search for more efficient and effective ways to manage their business operations. Data envelopment analysis (DEA) has been widely used as a conceptually simple yet powerful tool for evaluating organizational productivity and performance. Fuzzy DEA (FDEA) is a promising extension of conventional DEA proposed for dealing with imprecise and ambiguous data in performance measurement problems. This book is the first volume in the literature to present the state-of-the-art developments and applications of FDEA. It is designed for students, educators, researchers, consultants and practicing managers in business, industry, and government with a basic understanding of DEA and fuzzy logic concepts.

Relevance: 90.00%

Abstract:

Data envelopment analysis (DEA) is a methodology for measuring the relative efficiencies of a set of decision making units (DMUs) that use multiple inputs to produce multiple outputs. Crisp input and output data are fundamentally indispensable in conventional DEA. However, the observed values of the input and output data in real-world problems are sometimes imprecise or vague, and many researchers have proposed fuzzy methods for dealing with such data in DEA. This chapter provides a taxonomy and review of fuzzy DEA (FDEA) methods. We present a classification scheme with six categories: the tolerance approach, the α-level based approach, the fuzzy ranking approach, the possibility approach, fuzzy arithmetic, and fuzzy random/type-2 fuzzy sets. We discuss each category and group the FDEA papers published in the literature over the past 30 years accordingly.
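
As background for the fuzzy extensions surveyed in the chapter, the sketch below solves the conventional (crisp) input-oriented CCR model, the standard DEA formulation that the fuzzy variants build on, as one linear program per DMU; the toy data are invented for illustration.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """Input-oriented CCR efficiency (multiplier form) of DMU j0.
    X: (n_dmus, n_inputs) inputs, Y: (n_dmus, n_outputs) outputs.
    Maximise u.y0 subject to v.x0 = 1 and u.yj - v.xj <= 0 for all j."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])          # variables: [u, v]
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None, :]
    A_ub = np.hstack([Y, -X])
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun                                    # score in (0, 1]

# Toy example: 4 DMUs using 2 inputs to produce 1 output.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 3.0], [5.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0]])
print([round(ccr_efficiency(X, Y, j), 3) for j in range(len(X))])
```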

Relevance: 90.00%

Abstract:

This paper explores experimentally the impairments in performance that arise when multiple single-sideband (SSB) subcarrier multiplexing (SCM) signals are closely allocated in frequency to establish a spectrally efficient wavelength division multiplexing (WDM) link. The performance of cost-effective SSB WDM/SCM implementations without optical filters in the transmitter depends strongly on the imperfect sideband suppression ratio that can be achieved directly with the electro-optical modulator. A directly detected broadband multichannel SCM link composed of a state-of-the-art optical IQ modulator and five quadrature phase-shift keyed (QPSK) subcarriers per optical channel is presented, showing that a suppression ratio of 20 dB obtained directly with the modulator produced a penalty of 2 dB in overall performance due to interference between adjacent optical channels.

Relevance: 90.00%

Abstract:

Over the course of the last twenty years there has been a growing academic interest in performance management, particularly in respect of the evolution of new techniques and their resulting impact. One important theoretical development has been the emergence of multidimensional performance measurement models that are potentially applicable within the public sector. Empirically, academic researchers are increasingly supporting the use of such models as a way of improving public sector management and the effectiveness of service provision (Mayston, 1985; Pollitt, 1986; Bates and Brignall, 1993; Massey, 1999). This paper seeks to add to the literature by using both theoretical and empirical evidence to argue that Comprehensive Performance Assessment (CPA), the external inspection tool used by the Audit Commission to evaluate local authority performance management, is a version of the Balanced Scorecard which, when adapted for internal use, may have beneficial effects. After demonstrating the parallels between the CPA framework and Kaplan and Norton's public sector Balanced Scorecard (BSC), we use a case study of the BSC-based performance management system in Hertfordshire County Council to demonstrate the empirical linkages between a local scorecard and CPA. We conclude that CPA is based upon the BSC and has the potential to serve as a springboard for the evolution of local authority performance management systems.

Relevance: 90.00%

Abstract:

This thesis investigates the visual deficits associated with developmental dyslexia, particularly deficits of visual attention. Visual attention has previously been investigated in a wide array of behavioural and psychophysical studies, among others, but few have produced consistent findings. Attention processes are believed to play an integral part in determining the overall "extent" of reading deficits in dyslexia, so it was of paramount importance to target such attention mechanisms in this research. The experiments in this thesis focused on signal enhancement and noise (distractor) exclusion. Given the flexibility of the visual search paradigms employed, factors such as visual crowding and attention distribution were also investigated. The experiments systematically manipulated noise (by increasing distractor count, i.e. set size), crowding (by varying the spacing between distractors), attention allocation (the use of peripheral cues to direct attention), and attention distribution (the influence of one visual field over the other), all of which were tied to a critical factor: location/spatial/decisional uncertainty. Adults with dyslexia were: (i) able to modulate attention appropriately using peripheral pre-cues, (ii) severely affected by crowding, and (iii) unable to counteract increased set sizes when post-cued or uncued, the latter signifying poor distractor (noise) suppression. When location uncertainty was controlled for, the findings confirmed that adults with dyslexia were again affected by crowding and set size, in addition to showing an asymmetric attention distribution. Confounding effects of ADHD symptoms did not explain a significant independent variance in performance, suggesting that the difficulties shown by adults with dyslexia were not accounted for by co-morbid ADHD. Furthermore, the effects of crowding, set size and asymmetric attention correlated significantly with literacy measures, but not with ADHD measures. A more diffuse and asymmetric attention system in dyslexia is believed to be the limiting factor in noise exclusion and attention distribution. The findings from this thesis add to the current understanding of the potential role of visual attention deficits in dyslexia and in the literacy difficulties experienced by this population.

Relevance: 90.00%

Abstract:

This thesis is a study of performance management of Complex Event Processing (CEP) systems. CEP systems have characteristics distinct from other well-studied computer systems, such as batch and online transaction processing systems and database-centric applications, and these characteristics introduce new challenges and opportunities for performance management. Methodologies used for benchmarking CEP systems in many performance studies focus on scaling the injected load but do not consider the impact of the functional capabilities of CEP systems. This thesis proposes evaluating the performance of CEP engines' functional behaviours on events and develops a benchmark platform for CEP systems: CEPBen. The CEPBen benchmark platform is developed to explore the fundamental functional performance of event processing systems: filtering, transformation and event pattern detection. It is also designed to provide a flexible environment for exploring new metrics and influential factors for CEP systems and for evaluating their performance. Studies of factors and new metrics are carried out with the CEPBen platform running on Esper. Different measurement points for response time in the performance management of CEP systems are discussed, and the response time of a targeted event is proposed as a quality-of-service metric to be used alongside traditional response time in CEP systems. Maximum query load is proposed as a capacity indicator with respect to query complexity, and the number of live objects in memory as a performance indicator with respect to memory management. Query depth is studied as a factor that influences CEP system performance.
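
To make the three functional behaviours concrete, the Python sketch below applies filtering, transformation and a simple consecutive-event pattern detection to a toy event stream; the event schema, thresholds and pattern are invented for illustration, and this is not CEPBen or Esper code.

```python
from dataclasses import dataclass

@dataclass
class Event:
    sensor: str
    value: float

def filter_events(stream, threshold):
    """Filtering: keep only events whose value exceeds the threshold."""
    return (e for e in stream if e.value > threshold)

def to_fahrenheit(stream):
    """Transformation: derive a new event with a converted attribute."""
    return (Event(e.sensor, round(e.value * 9 / 5 + 32, 1)) for e in stream)

def detect_runs(stream, n=3):
    """Pattern detection: flag n consecutive events from one sensor."""
    run, last = 0, None
    for e in stream:
        run = run + 1 if e.sensor == last else 1
        last = e.sensor
        if run == n:
            yield e.sensor

events = [Event("s1", 20.0), Event("s1", 25.0), Event("s1", 30.0),
          Event("s2", 15.0)]
print(list(to_fahrenheit(filter_events(events, 18.0))))
print(list(detect_runs(events)))   # -> ['s1']
```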

Relevance: 90.00%

Abstract:

This paper develops an index for comparing the productivity of groups of operating units in cost terms when input prices are available. In that sense it extends a similar index available in the literature for comparing groups of units in terms of technical productivity in the absence of input prices. The index is decomposed to reveal the origins of differences in the performance of the groups of units in terms of both technical and cost productivity. The index and its decomposition are of value in contexts where units that perform the same function must be compared but can be grouped by virtue of operating in different contexts, as might arise, for example, in comparisons of water or gas transmission companies operating in different countries.

Relevance: 90.00%

Abstract:

Previous work has shown that human vision performs spatial integration of luminance contrast energy, whereby signals are squared and summed (with internal noise) over area at detection threshold. We tested that model here in an experiment using arrays of micro-pattern textures that varied in overall stimulus area and in the sparseness of their target elements, where the contrast of each element was normalised for sensitivity across the visual field. We found a power-law improvement in performance with stimulus area and a decrease in sensitivity with sparseness. While the contrast integrator model performed well when target elements constituted 50–100% of the target area (replicating previous results), observers outperformed the model when texture elements were sparser than this. Accounting for this result required the inclusion of further templates in our model, selective for grids of various regular texture densities. By assuming a MAX operation across these noisy mechanisms, the model also accounted for the increase in the slope of the psychometric function as texture density decreased. Thus, for the first time, mechanisms selective for texture density have been revealed at contrast detection threshold. We suggest that these mechanisms have a role to play in the perception of visual textures.
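
The decision model described above can be sketched in a few lines of Python: contrast is squared and summed within each density-selective template (with additive internal noise), and the final response is the MAX across the noisy mechanisms. The grid templates, noise level and stimulus below are illustrative assumptions rather than the fitted model from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def template_response(contrast, mask, noise_sd=1.0):
    """Contrast energy inside one density-selective template,
    perturbed by additive internal noise."""
    return np.sum((contrast * mask) ** 2) + noise_sd * rng.standard_normal()

def model_decision(contrast, templates):
    """MAX across the noisy density-selective mechanisms."""
    return max(template_response(contrast, t) for t in templates)

# Toy stimulus: a sparse micro-pattern array on an 8x8 element grid.
stim = np.zeros((8, 8))
stim[::4, ::4] = 0.5                       # sparse target elements
templates = [np.ones((8, 8))]              # full-field energy integrator
for step in (2, 4):                        # grids of regular densities
    t = np.zeros((8, 8))
    t[::step, ::step] = 1.0
    templates.append(t)
print(f"decision variable = {model_decision(stim, templates):.3f}")
```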