977 results for feature representation
Abstract:
Drawing on attitude theory, this study investigates the drivers of employees' expression of favorable opinions about their workplace. Despite its theoretical and managerial importance, the marketing literature largely ignores the topic. This study advances prior research by developing, and empirically testing, a conceptual framework of the relationship between workgroup support and favorable external representation of the workplace, mediated by emotional responses to this support. The present research investigates four new relationships: between workgroup support and emotional exhaustion, workgroup support and organizational commitment, workgroup support and job satisfaction, and emotional exhaustion and external representation of the workplace. Based on a sample of over 700 frontline service employees, this study finds that workgroup support affects favorable external representation of the workplace through various emotional responses (i.e., emotional exhaustion, organizational commitment and job satisfaction). In addition, the results identify employees' organizational commitment as the most important determinant of favorable external representation of the workplace, followed by job satisfaction and reduced emotional exhaustion. These results suggest that companies should develop practices that encourage workgroup support and organizational commitment to achieve favorable external representation of the workplace.
Abstract:
This article, in the peer-reviewed Oxford Bibliographies series, gives an introduction to the literatures on the varieties, origins, and effects of proportional electoral systems.
Abstract:
In this paper, ensembles of forecasts (of up to six hours) from a convection-permitting model with a representation of model error due to unresolved processes are studied. The ensemble prediction system (EPS) used is an experimental convection-permitting version of the UK Met Office's 24-member Global and Regional Ensemble Prediction System (MOGREPS). The method of representing model error variability, which perturbs parameters within the model's parameterisation schemes, has been modified, and we investigate the impact of applying this scheme in different ways. These are: a control ensemble where all ensemble members have the same parameter values; an ensemble where the parameters differ between members but are fixed in time; and ensembles where the parameters are updated randomly every 30 or 60 min. The choice of parameters and their ranges of variability were determined from expert opinion and parameter sensitivity tests. A case of frontal rain over the southern UK, which has a multi-banded rainfall structure, has been chosen. The consequences of including model error variability in the case studied are mixed and are summarised as follows. The multiple banding, evident in the radar, is not captured by any single member. However, the single band is positioned in some members where a secondary band is present in the radar. This is found for all ensembles studied. Adding model error variability with parameters fixed in time does increase the ensemble spread for near-surface variables like wind and temperature, but can actually decrease the spread of the rainfall. Perturbing the parameters periodically throughout the forecast does not further increase the spread and exhibits "jumpiness" in the spread at the times when the parameters are perturbed. Adding model error variability improves forecast skill after the first 2–3 h of the forecast for near-surface temperature and relative humidity.
For precipitation skill scores, adding model error variability has the effect of improving the skill in the first 1–2 h of the forecast, but then of reducing the skill after that. Complementary experiments were performed where the only difference between members was the set of parameter values (i.e. no initial condition variability). The resulting spread was found to be significantly less than the spread from initial condition variability alone.
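The three ways of applying the parameter-perturbation scheme compared above (a control, parameters fixed in time per member, and parameters redrawn periodically) can be sketched as follows. This is a minimal illustration: the parameter names and ranges here are invented placeholders, not the actual MOGREPS parameters, which the paper says were set from expert opinion and sensitivity tests.

```python
import random

# Hypothetical parameter ranges (placeholders for illustration only).
PARAM_RANGES = {
    "entrainment_rate": (0.5, 2.0),
    "mixing_length_m": (20.0, 200.0),
}

def draw_parameters(rng):
    """Draw one parameter set uniformly within its allowed range."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

def member_parameters(member_id, scheme, n_steps, update_every=1, seed=0):
    """Yield the parameter set used at each forecast step for one member.

    scheme:
      'control'  - every member uses the same default (midpoint) values
      'fixed'    - parameters differ between members but are fixed in time
      'periodic' - parameters are redrawn every `update_every` steps
                   (standing in for the 30- or 60-min updates)
    """
    rng = random.Random(seed + member_id)
    if scheme == "control":
        params = {n: (lo + hi) / 2 for n, (lo, hi) in PARAM_RANGES.items()}
    else:
        params = draw_parameters(rng)
    for step in range(n_steps):
        if scheme == "periodic" and step > 0 and step % update_every == 0:
            params = draw_parameters(rng)
        yield dict(params)
```

Each member gets its own random stream, so the "fixed" scheme produces between-member spread without within-forecast variability, while the "periodic" scheme adds the in-forecast redraws whose timing the abstract links to "jumpiness" in the spread.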
Abstract:
This article critically explores the nature and purpose of relationships and inter-dependencies between stakeholders in the context of a parastatal chromite mining company in the Betsiboka Region of Northern Madagascar. An examination of the institutional arrangements at the interface between the mining company and local communities identified power hierarchies and dependencies in the context of a dominant paternalistic environment. The interactions, inter alia, limited social cohesion and intensified the fragility and weakness of community representation, which was further influenced by ethnic hierarchies between the varied community groups; namely, indigenous communities and migrants to the area from different ethnic groups. Moreover, dependencies and nepotism, which may exist at all institutional levels, can create civil society stakeholder representatives who are unrepresentative of the society they are intended to represent. Similarly, a lack of horizontal and vertical trust and reciprocity inherent in Malagasy society engenders a culture of low expectations regarding transparency and accountability, which further catalyses a cycle of nepotism and elite rent-seeking behaviour. On the other hand, leaders retain power with minimal vertical delegation or decentralisation of authority among levels of government and limit opportunities to benefit the elite, perpetuating rent-seeking behaviour within the privileged minority. Within the union movement, pluralism and the associated politicisation of individual unions restricts solidarity, which impacts on the movement’s capacity to act as a cohesive body of opinion and opposition. Nevertheless, the unions’ drive to improve their social capital has increased expectations of transparency and accountability, resulting in demands for greater engagement in decision-making processes.
Abstract:
Considerable effort is presently being devoted to producing high-resolution sea surface temperature (SST) analyses with a goal of spatial grid resolutions as low as 1 km. Because grid resolution is not the same as feature resolution, a method is needed to objectively determine the resolution capability and accuracy of SST analysis products. Ocean model SST fields are used in this study as simulated “true” SST data and subsampled based on actual infrared and microwave satellite data coverage. The subsampled data are used to simulate sampling errors due to missing data. Two different SST analyses are considered and run using both the full and the subsampled model SST fields, with and without additional noise. The results are compared as a function of spatial scales of variability using wavenumber auto- and cross-spectral analysis. The spectral variance at high wavenumbers (smallest wavelengths) is shown to be attenuated relative to the true SST because of smoothing that is inherent to both analysis procedures. Comparisons of the two analyses (both having grid sizes of roughly ) show important differences. One analysis tends to reproduce small-scale features more accurately when the high-resolution data coverage is good but produces more spurious small-scale noise when the high-resolution data coverage is poor. Analysis procedures can thus generate small-scale features with and without data, but the small-scale features in an SST analysis may be just noise when high-resolution data are sparse. Users must therefore be skeptical of high-resolution SST products, especially in regions where high-resolution (~5 km) infrared satellite data are limited because of cloud cover.
Abstract:
From a whole-building lifecycle perspective, occupants' actions could account for about 50% of energy use. However, how occupants' activities influence building energy performance is still a blind spot. Building energy performance is thought to be the result of a combination of building fabric, building services and occupants' activities, along with their interactions. In this sense, energy consumption in the built environment is regarded as a socio-technical system. To understand how such a system works, a range of physical, technical and social information needs to be integrated and aligned. This paper proposes a semiotic framework to add value to Building Information Modelling by incorporating energy-related occupancy factors in the context of office buildings. Further, building information is addressed semantically to describe a building space from the facility management perspective. Finally, the framework guides the setup of a building information representation system, which can help facility managers manage buildings efficiently by improving their understanding of how office buildings are operated and used.
Abstract:
Svalgaard (2014) has recently pointed out that the calibration of the Helsinki magnetic observatory's H component variometer was probably in error in published data for the years 1866–1874.5, and that this makes the interdiurnal variation index based on daily means, IDV(1d) (Lockwood et al., 2013a), and the interplanetary magnetic field strength derived from it (Lockwood et al., 2013b), too low around the peak of solar cycle 11. We use data from the modern Nurmijarvi station, relatively close to the site of the original Helsinki Observatory, to confirm a 30% underestimation in this interval; hence our results are fully consistent with the correction derived by Svalgaard. We show that the best method for recalibration uses the Helsinki Ak(H) and aa indices and is accurate to ±10%. This makes it preferable to recalibration using either the sunspot number or the diurnal range of geomagnetic activity, which we find to be accurate to ±20%. In the case of Helsinki data during cycle 11, the two recalibration methods produce very similar corrections, which are here confirmed using newly digitised data from the nearby St Petersburg observatory and also using declination data from Helsinki. However, we show that the IDV index is, compared with later years, too similar to the sunspot number before 1872, revealing that the independence of the two data series has been lost: either the geomagnetic data used to compile IDV have been corrected using sunspot numbers, or vice versa, or both. We present corrected data sequences for both the IDV(1d) index and the reconstructed interplanetary magnetic field (IMF). We also analyse the relationship between the derived near-Earth IMF and the sunspot number, and point out the relevance of the prior history of solar activity, in addition to the contemporaneous value, to estimating any “floor” value of the near-Earth interplanetary field.
Abstract:
Most prominent models of bilingual representation assume a degree of interconnection or shared representation at the conceptual level. However, in the context of the linguistic and cultural specificity of human concepts, and given recent findings that reveal a considerable amount of bidirectional conceptual transfer and conceptual change in bilinguals, a particular challenge that bilingual models face is to account for the non-equivalence or partial equivalence of L1- and L2-specific concepts in the bilingual conceptual store. The aim of the current paper is to provide a state-of-the-art review of the available empirical evidence from the fields of psycholinguistics and cognitive, experimental, and cross-cultural psychology, and to discuss how these may inform and further develop both traditional and more recent accounts of bilingual conceptual representation. Based on a synthesis of the available evidence against the theoretical postulates of existing models, I argue that the most coherent account of bilingual conceptual representation combines three fundamental assumptions. The first is the distributed, multi-modal nature of representation. The second concerns cross-linguistic and cross-cultural variation of concepts. The third makes assumptions about the development of concepts and the emergent links between those concepts and their linguistic instantiations.
Abstract:
These findings support the view that abnormal neural responses to reward may be an endophenotype for depression and a potential target for intervention and prevention strategies.
Abstract:
We show that the affective experience of touch and the sight of touch can be modulated by cognition, and investigate in an fMRI study where top-down cognitive modulations of bottom-up somatosensory and visual processing of touch and its affective value occur in the human brain. The cognitive modulation was produced by word labels, 'Rich moisturizing cream' or 'Basic cream', while cream was being applied to the forearm, or was seen being applied to a forearm. The subjective pleasantness and richness were modulated by the word labels, as were the fMRI activations to touch in parietal cortex area 7, the insula and ventral striatum. The cognitive labels influenced the activations to the sight of touch and also the correlations with pleasantness in the pregenual cingulate/orbitofrontal cortex and ventral striatum. Further evidence of how the orbitofrontal cortex is involved in affective aspects of touch was that touch to the forearm [which has C fiber Touch (CT) afferents sensitive to light touch] compared with touch to the glabrous skin of the hand (which does not) revealed activation in the mid-orbitofrontal cortex. This is of interest as previous studies have suggested that the CT system is important in affiliative caress-like touch between individuals.
Abstract:
Recent studies have shown that features extracted from brain MRIs can discriminate well between Alzheimer's disease and Mild Cognitive Impairment. This study provides an algorithm that sequentially applies advanced feature selection methods to find the best subset of features in terms of binary classification accuracy. The classifiers that provided the highest accuracies were then used to solve a multi-class problem with the one-versus-one strategy. Although several approaches based on the extraction of Regions of Interest (ROIs) exist, the predictive power of features has not yet been investigated by comparing filter and wrapper techniques. The findings of this work suggest that (i) IntraCranial Volume (ICV) normalization can lead to overfitting and worsen prediction accuracy on the test set, and (ii) the combined use of a Random Forest-based filter with a Support Vector Machine-based wrapper improves binary classification accuracy.
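The filter-then-wrapper pipeline has a simple generic shape, sketched below. To keep the sketch self-contained, a correlation ranking stands in for the Random Forest-based filter and a nearest-centroid classifier for the SVM-based wrapper; this shows the structure of the procedure, not the paper's implementation.

```python
import numpy as np

def filter_rank(X, y, k):
    """Filter step: rank features by absolute correlation with the binary
    label (a stand-in for Random Forest feature importance)."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    scores = np.abs(Xc.T @ yc) / (X.std(axis=0) * yc.std() * len(y) + 1e-12)
    return list(np.argsort(scores)[::-1][:k])

def centroid_accuracy(Xs, y):
    """Training accuracy of a nearest-centroid classifier
    (a toy stand-in for an SVM)."""
    c0 = Xs[y == 0].mean(axis=0)
    c1 = Xs[y == 1].mean(axis=0)
    pred = np.linalg.norm(Xs - c1, axis=1) < np.linalg.norm(Xs - c0, axis=1)
    return float((pred == y).mean())

def wrapper_select(X, y, candidates, score_fn):
    """Wrapper step: greedy forward selection; keep a candidate feature
    only if it improves the classifier's accuracy."""
    chosen, best = [], 0.0
    for f in candidates:
        acc = score_fn(X[:, chosen + [f]], y)
        if acc > best:
            chosen.append(f)
            best = acc
    return chosen, best

# Synthetic data: ten features, only feature 3 is informative.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=200)
X = rng.standard_normal((200, 10))
X[:, 3] += 2.0 * y

candidates = filter_rank(X, y, k=5)
chosen, accuracy = wrapper_select(X, y, candidates, centroid_accuracy)
```

The filter cheaply prunes the candidate set; the wrapper then pays the cost of retraining the classifier only on that shortlist, which is the point of combining the two techniques.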
Abstract:
This paper presents a novel approach to the automatic classification of very large data sets composed of terahertz pulse transient signals, highlighting their potential use in biochemical, biomedical, pharmaceutical and security applications. Two different types of THz spectra are considered in the classification process. First, a binary classification study of poly-A and poly-C ribonucleic acid samples is performed. This is then contrasted with a difficult multi-class classification problem of spectra from six different powder samples that, although fairly indistinguishable in the optical spectrum, possess a few discernible spectral features in the terahertz part of the spectrum. Classification is performed using a complex-valued extreme learning machine algorithm that takes into account features in both the amplitude and the phase of the recorded spectra. Classification speed and accuracy are contrasted with those achieved using a support vector machine classifier. The study systematically compares the classifier performance achieved after adopting different Gaussian kernels when separating amplitude and phase signatures. The two signatures are presented as feature vectors for both training and testing purposes. The study confirms the utility of complex-valued extreme learning machine algorithms for the classification of the very large data sets generated with current terahertz imaging spectrometers. The classifier can take into consideration heterogeneous layers within an object, as would be required within a tomographic setting, and is sufficiently robust to detect patterns hidden inside noisy terahertz data sets. The proposed study opens up the opportunity to establish complex-valued extreme learning machine algorithms as new chemometric tools that will assist the wider proliferation of terahertz sensing technology for chemical sensing, quality control, security screening and clinical diagnosis.
Furthermore, the proposed algorithm should also be very useful in other applications requiring the classification of very large data sets.
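A complex-valued extreme learning machine of the general kind described above can be sketched in a few lines: a fixed random complex hidden layer followed by a least-squares readout. The split-tanh activation and the toy amplitude/phase data below are illustrative assumptions; the paper's Gaussian-kernel variants are not reproduced here.

```python
import numpy as np

def split_tanh(Z):
    """Split-type activation: tanh applied separately to the real and
    imaginary parts (one common choice for complex-valued networks)."""
    return np.tanh(Z.real) + 1j * np.tanh(Z.imag)

def train_celm(X, targets, n_hidden, seed=0):
    """Complex-valued ELM: random complex hidden weights, then output
    weights fitted by least squares on the hidden-layer responses."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_hidden)) + 1j * rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = split_tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ targets
    return W, b, beta

def predict_celm(X, W, b, beta):
    return split_tanh(X @ W + b) @ beta

# Toy "spectra": amplitude and phase signatures folded into one complex
# feature vector per sample, as the amplitude * exp(i * phase) of each bin.
rng = np.random.default_rng(2)
n = 120
labels = np.repeat([0, 1], n // 2)
amplitude = 1.0 + 0.8 * labels[:, None] + 0.1 * rng.standard_normal((n, 4))
phase = 0.5 * labels[:, None] + 0.1 * rng.standard_normal((n, 4))
X = amplitude * np.exp(1j * phase)
targets = np.where(labels == 1, 1.0, -1.0)

W, b, beta = train_celm(X, targets, n_hidden=30)
predicted = (predict_celm(X, W, b, beta).real > 0).astype(int)
train_accuracy = float((predicted == labels).mean())
```

Because only the readout weights are fitted, and by a single pseudoinverse, training is fast even for large data sets, which is the speed advantage the abstract contrasts against the support vector machine.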
Abstract:
Fractals with microscopic anisotropy show a unique type of macroscopic isotropy restoration phenomenon that is absent in Euclidean space [M. T. Barlow et al., Phys. Rev. Lett. 75, 3042]. In this paper the isotropy restoration feature is considered for a family of two-dimensional Sierpinski gasket type fractal resistor networks. A parameter xi is introduced to describe this phenomenon. Our numerical results show that xi satisfies the scaling law xi ~ l^(-alpha), where l is the system size and alpha is an exponent independent of the degree of microscopic anisotropy, characterizing the isotropy restoration feature of the fractal systems. By changing the underlying fractal structure towards the Euclidean triangular lattice through increasing the side length b of the gasket generators, the fractal-to-Euclidean crossover behavior of the isotropy restoration feature is discussed.
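An exponent alpha in a scaling law of the form xi ~ l^(-alpha) is typically estimated from numerical data by a straight-line fit in log-log space, since log(xi) = -alpha * log(l) + const. The sketch below uses system sizes of a Sierpinski-gasket flavour (powers of 3) and an illustrative exponent value, not the paper's result.

```python
import math

def fit_power_law_exponent(l_values, xi_values):
    """Estimate alpha in xi ~ l**(-alpha) by least squares on
    (log l, log xi) pairs; the fitted slope is -alpha."""
    xs = [math.log(l) for l in l_values]
    ys = [math.log(v) for v in xi_values]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope

# Synthetic data obeying xi = l**(-0.8) exactly (illustrative exponent).
sizes = [3, 9, 27, 81, 243]
xis = [l ** -0.8 for l in sizes]
alpha = fit_power_law_exponent(sizes, xis)
```

With noisy simulation data the same fit would be done over several decades of l, and the claim that alpha is independent of the microscopic anisotropy corresponds to the fitted slope being unchanged across anisotropy settings.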