18 results for Complex dimensions of fractal strings
in Aston University Research Archive
Abstract:
Despite abundant literature on human behaviour in the face of danger, much remains to be discovered. Some descriptive models of behaviour in the face of danger are reviewed in order to identify areas where documentation is lacking. It is argued that little is known about the recognition and assessment of danger, and yet these are important aspects of cognitive processes. Speculative arguments about hazard assessment are reviewed and tested against the results of previous studies. Once hypotheses are formulated, the reasons for retaining the repertory grid as the main research instrument are outlined, and the choice of data analysis techniques is described. Whilst all samples used repertory grids, the rating scales differed between samples; therefore, an analysis is performed of the way in which rating scales were used in the various samples and of some reasons why the scales were used differently. Then, individual grids are examined and compared between respondents within each sample; consensus grids are also discussed. The major results from all samples are then contrasted and compared. It was hypothesized that hazard assessment would encompass three main dimensions, i.e. 'controllability', 'severity of consequences' and 'likelihood of occurrence', which would emerge in that order. The results suggest that these dimensions are but facets of two broader dimensions labelled 'scope of human intervention' and 'dangerousness'. It seems that these two dimensions encompass a number of more specific dimensions, some of which can be further fragmented. Thus, hazard assessment appears to be a more complex process about which much remains to be discovered. Some of the ways in which further discovery might proceed are discussed.
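As an illustration of the kind of grid analysis described above (a minimal sketch, not the author's original procedure; the ratings and construct labels are invented), the following extracts broad underlying dimensions from a hypothetical repertory-grid rating matrix by singular value decomposition:

    import numpy as np

    # Hypothetical repertory grid: rows are constructs (rating scales),
    # columns are elements (hazards); ratings on a 1-7 scale.
    ratings = np.array([
        [6, 2, 5, 1, 4],   # e.g. "uncontrollable - controllable"
        [5, 3, 6, 2, 4],   # e.g. "severe - mild consequences"
        [2, 6, 3, 7, 4],   # e.g. "unlikely - likely to occur"
        [6, 1, 5, 2, 3],   # e.g. "little - much scope for intervention"
    ], dtype=float)

    # Centre each construct, then decompose; the leading components play the
    # role of the broad assessment dimensions discussed in the abstract.
    centred = ratings - ratings.mean(axis=1, keepdims=True)
    U, s, Vt = np.linalg.svd(centred, full_matrices=False)
    explained = s**2 / np.sum(s**2)
    print("variance explained by each component:", np.round(explained, 2))
    print("construct loadings on first two components:\n", np.round(U[:, :2], 2))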
Abstract:
Market orientation (MO) and marketing performance measurement (MPM) are two of the most widespread strategic marketing concepts among practitioners. However, some have questioned the benefits of extensive investments in MO and MPM. More importantly, little is known about which combinations of MO and MPM are optimal in ensuring high business performance. To address this research gap, the authors analyze a unique data set of 628 firms with a novel method of configurational analysis: fuzzy-set qualitative comparative analysis. In line with prior research, the authors find that MO is an important determinant of business performance. However, to reap its benefits, managers need to complement it with appropriate MPM, the level and focus of which vary across firms. For example, whereas large firms and market leaders generally benefit from comprehensive MPM, small firms may benefit from measuring marketing performance only selectively or by focusing on particular dimensions of marketing performance. The study also finds that many of the highest-performing firms do not follow any of the particular best practices identified.
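For readers unfamiliar with the configurational method mentioned above, the sketch below (a simplified illustration, not the authors' analysis; the membership scores are invented) computes the standard fuzzy-set consistency and coverage measures for one candidate configuration:

    import numpy as np

    # Hypothetical fuzzy-set memberships for five firms:
    # X = membership in the configuration "high MO AND comprehensive MPM",
    # Y = membership in the outcome "high business performance".
    X = np.array([0.9, 0.7, 0.2, 0.8, 0.4])
    Y = np.array([0.8, 0.9, 0.3, 0.7, 0.6])

    # Standard fsQCA measures (Ragin): a configuration is judged sufficient
    # for the outcome when consistency is high; coverage indicates how much
    # of the outcome the configuration accounts for.
    consistency = np.minimum(X, Y).sum() / X.sum()
    coverage = np.minimum(X, Y).sum() / Y.sum()
    print(f"consistency = {consistency:.2f}, coverage = {coverage:.2f}")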
Abstract:
Experiments combining different groups or factors and which use ANOVA are a powerful method of investigation in applied microbiology. ANOVA enables not only the effect of individual factors to be estimated but also their interactions; information which cannot be obtained readily when factors are investigated separately. In addition, combining different treatments or factors in a single experiment is more efficient and often reduces the sample size required to estimate treatment effects adequately. Because of the treatment combinations used in a factorial experiment, the degrees of freedom (DF) of the error term in the ANOVA are a more important indicator of the ‘power’ of the experiment than the number of replicates. A good method is to ensure, where possible, that sufficient replication is present to achieve 15 DF for the error term of the ANOVA testing effects of particular interest. Finally, it is important to always consider the design of the experiment because this determines the appropriate ANOVA to use. Hence, it is necessary to be able to identify the different forms of ANOVA appropriate to different experimental designs and to recognise when a design is a split-plot or incorporates a repeated measure. If there is any doubt about which ANOVA to use in a specific circumstance, the researcher should seek advice from a statistician with experience of research in applied microbiology.
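To make the replication guideline concrete, here is a small worked sketch (my own illustration, not taken from the paper) for a completely randomised two-factor factorial design, where the error DF is a*b*(n-1) for a and b factor levels and n replicates per treatment combination:

    # Error degrees of freedom for a completely randomised a x b factorial
    # design with n replicates per treatment combination: DF_error = a*b*(n-1).
    def error_df(a_levels: int, b_levels: int, replicates: int) -> int:
        return a_levels * b_levels * (replicates - 1)

    # Example: a 2 x 3 factorial with 3 replicates gives 12 error DF,
    # so 4 replicates (18 DF) would be needed to reach the suggested 15 DF.
    for n in (2, 3, 4):
        print(n, "replicates ->", error_df(2, 3, n), "error DF")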
Abstract:
Taking issue with the prevalent practice of measuring customer satisfaction with a single global measurement item, this article stresses the importance of measuring customer satisfaction through its underlying dimensions, especially in retail settings. Empirical results of a survey of 351 consumers demonstrate that (a) consumer satisfaction with retail stores has 6 key dimensions, (b) the suggested dimensions of retail satisfaction predict overall satisfaction, and (c) the dimensions of retail satisfaction have a greater effect on overall satisfaction than SERVQUAL dimensions. However, the predictive power of the dimensions of retail satisfaction is still fairly low. Implications for retail management as well as academic research are outlined.
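As a rough illustration of how such a predictive comparison is typically made (not the authors' actual analysis; all data below are simulated and the dimension counts are merely taken from the abstract), the sketch regresses an overall-satisfaction score on two candidate sets of dimension scores and compares their explained variance:

    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated survey data (invented): 351 respondents, an overall
    # satisfaction score, and two candidate predictor sets: six
    # retail-satisfaction dimensions and five SERVQUAL-style dimensions.
    n = 351
    retail_dims = rng.normal(size=(n, 6))
    servqual_dims = rng.normal(size=(n, 5))
    overall = (retail_dims @ rng.normal(size=6)
               + 0.5 * servqual_dims @ rng.normal(size=5)
               + rng.normal(scale=2.0, size=n))

    def r_squared(X, y):
        # Ordinary least squares with an intercept; return the R^2 value.
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        return 1 - resid.var() / y.var()

    print("R^2, retail dimensions:  ", round(r_squared(retail_dims, overall), 2))
    print("R^2, SERVQUAL dimensions:", round(r_squared(servqual_dims, overall), 2))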
Abstract:
With an increased emphasis on outsourcing and shortening business cycles, contracts between firms have become more important. Carefully written contracts contribute to the efficiency and longevity of inter-firm relationships as they may constrain opportunism and are often a less costly governance mechanism than maintaining complex social relationships (Larson 1992). This exploratory examination adds to our understanding of how incomplete contracts affect interorganizational exchange. First, we consider the multiple dimensions of contract constraints (safeguards). We also investigate the extent to which constraints affect decisions to enforce the relationship by delaying payments, and whether the decision is efficient. Finally, we examine the extent to which the constraints are effective (and ineffective) at reducing transaction problems associated with enforcement. Based on 971 observations of transactions using explicit, written terms, together with other secondary data, in the context of IT transactions in the Netherlands, we test our research propositions.
Abstract:
We measure the complex amplitude of the scattered wave in the far field and justify, theoretically and numerically, the solution of the inverse scattering problem. This allows single-shot reconstruction of the dielectric-function distribution during direct femtosecond laser micro-fabrication.
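As a schematic of this kind of reconstruction (a simplified sketch under a first Born approximation, not the authors' method; the perturbation and grid are invented), the code below generates a weak synthetic permittivity perturbation, treats its 2-D Fourier transform as the measured complex far-field amplitude, and recovers the perturbation with a single inverse transform:

    import numpy as np

    # Synthetic weak permittivity perturbation (a small disc) on a 2-D grid.
    N = 128
    x = np.linspace(-1, 1, N)
    X, Y = np.meshgrid(x, x)
    delta_eps = 0.01 * (X**2 + Y**2 < 0.1)

    # Under the first Born approximation the far-field complex amplitude is,
    # up to constant factors, the Fourier transform of the perturbation.
    far_field = np.fft.fftshift(np.fft.fft2(delta_eps))

    # Single-shot reconstruction: one inverse Fourier transform of the
    # measured complex amplitude recovers the perturbation map.
    recovered = np.fft.ifft2(np.fft.ifftshift(far_field)).real
    print("max reconstruction error:", np.abs(recovered - delta_eps).max())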
Abstract:
In this paper we present a novel method for emulating a stochastic, or random output, computer model and show its application to a complex rabies model. The method is evaluated both in terms of accuracy and computational efficiency on synthetic data and the rabies model. We address the issue of experimental design and provide empirical evidence on the effectiveness of utilizing replicate model evaluations compared to a space-filling design. We employ the Mahalanobis error measure to validate the heteroscedastic Gaussian process based emulator predictions for both the mean and (co)variance. The emulator allows efficient screening to identify important model inputs and better understanding of the complex behaviour of the rabies model.
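As background on the validation measure mentioned above (an illustration with invented numbers, not the paper's implementation), the sketch below computes the Mahalanobis error between held-out simulator outputs and emulator predictions:

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical validation set: emulator predictive mean and covariance
    # at 10 held-out inputs, plus the simulator outputs observed there.
    n = 10
    pred_mean = rng.normal(size=n)
    A = rng.normal(size=(n, n))
    pred_cov = A @ A.T + np.eye(n)            # any symmetric positive-definite matrix
    observed = pred_mean + rng.multivariate_normal(np.zeros(n), pred_cov)

    # Mahalanobis error: D^2 = (y - m)^T V^{-1} (y - m).
    resid = observed - pred_mean
    d2 = resid @ np.linalg.solve(pred_cov, resid)

    # For a well-calibrated Gaussian emulator D^2 should be comparable with
    # its reference mean, which equals the number of validation points.
    print(f"Mahalanobis error = {d2:.1f} (reference mean = {n})")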
Abstract:
This research explores the role of internal customers in the delivery of external service quality. It considers the potentially different internal customer types that may exist within the organisation, and explores possible differences in the dimensions used to measure service quality internally and externally. If different internal customer types exist, the dimensions used to measure service quality may also differ between these types; this possibility is considered as well. Given the depth and breadth of understanding required, an action research, case-based approach was adopted. The research objectives were: (i) to determine the dimensions of internal service quality between internal customer-supplier cells; (ii) to determine what variation, if any, there is in the dimension sets between internal customer-supplier cells; (iii) to determine any ranking in the dimensions that could exist by internal customer-supplier cell type; (iv) to investigate the impact of internal service quality on external service quality over time. The research findings were: (i) the majority of the dimensions used in measuring external service quality were also used internally; however, new dimensions were added, and some dimensions used externally had to be redefined for internal use; (ii) variation in dimension sets was revealed during the research; four different dimension sets were identified and these were matched with four different types of internal service interaction; (iii) differences in the ranking of dimensions within each dimension set for each internal customer-supplier cell type were confirmed; (iv) internal service quality was seen to influence external service quality, but at a cellular level rather than company level. At the company level, the average internal service quality at the start and finish of the research showed no improvement, yet external service quality had improved. Further investigation at the cellular level showed that improvements in internal service quality had occurred; those improvements were found in the cells closest to the customer. The research implications were found to be: (i) some cells may not be necessary in the delivery of external service quality; (ii) the immediacy of the cell to the external customer and the number of interactions into and out of that cell have the greatest effect on external customer satisfaction; (iii) internal service quality may be driven by the customer, affecting those cells at the front end of the business first; this then cascades back to the less immediate cells until ultimately the whole organisation shows improvements in internal service quality.
Abstract:
A preliminary study by Freeman et al (1996b) has suggested that when complex patterns of motion elicit impressions of three-dimensionality, odd-item-out detection improves provided targets can be differentiated on the basis of surface properties. Their results can be accounted for if it is supposed that observers are permitted efficient access to 3-D surface descriptions, but access to 2-D motion descriptions is restricted. To test the hypothesis, a standard search technique was employed, in which targets could be distinguished on the basis of slant sign. In one experiment, slant impressions were induced through the summing of deformation and translation components; in a second, they were induced through the summing of shear and translation components. Neither showed any evidence of efficient access. A third experiment explored the possibility that access to these representations may have been hindered by a lack of grouping between the stimuli. Attempts to improve grouping failed to produce convincing evidence in support of the hypothesis. An alternative explanation is that complex patterns of motion are simply not processed simultaneously. Psychophysical and physiological studies have, however, suggested that multiple mechanisms selective for complex motion do exist. Using a subthreshold summation technique, I found evidence supporting the notion that complex motions are processed in parallel. Furthermore, in a spatial summation experiment, coherence thresholds were measured for displays containing different numbers of complex motion patches. Consistent with the idea that complex motion processing proceeds in parallel, increases in the number of motion patches were seen to decrease thresholds, both for expansion and rotation. Moreover, the rates of decrease were higher than those typically expected from probability summation, thus implying that mechanisms are available which can pool signals from spatially distinct complex motion flows.
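For context on the comparison with probability summation mentioned above, the sketch below (an illustration with an assumed Quick-pooling exponent, not the thesis's analysis) shows the threshold decrease predicted by probability summation across independent motion patches:

    import numpy as np

    # Probability-summation prediction under Quick pooling: with n independent
    # detectors of equal sensitivity, threshold falls as n**(-1/beta), where
    # beta is the slope of the psychometric function (assumed here to be 3.5).
    beta = 3.5
    patches = np.array([1, 2, 4])
    prob_summation = patches ** (-1.0 / beta)

    # Steeper threshold declines than this prediction would point to genuine
    # pooling of signals across patches rather than independent detection.
    for n, t in zip(patches, prob_summation):
        print(f"{n} patch(es): relative threshold {t:.2f}")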
Abstract:
This thesis consisted of two major parts, one determining the masking characteristics of pixel noise and the other investigating the properties of the detection filter employed by the visual system. The theoretical cut-off frequency of white pixel noise can be defined from the size of the noise pixel. The empirical cut-off frequency, i.e. the largest size of noise pixels that mimics the effect of white noise in detection, was determined by measuring contrast energy thresholds for grating stimuli in the presence of spatial noise consisting of noise pixels of various sizes and shapes. The critical, i.e. minimum, number of noise pixels per grating cycle needed to mimic the effect of white noise in detection was found to decrease with the bandwidth of the stimulus. The shape of the noise pixels did not have any effect on the whiteness of pixel noise as long as there was at least the minimum number of noise pixels in all spatial dimensions. Furthermore, the masking power of white pixel noise is best described when the spectral density is calculated by taking into account all the dimensions of noise pixels, i.e. width, height, and duration, even when there is random luminance only in one of these dimensions. The properties of the detection mechanism employed by the visual system were studied by measuring contrast energy thresholds for complex spatial patterns as a function of area in the presence of white pixel noise. Human detection efficiency was obtained by comparing human performance with an ideal detector. The stimuli consisted of band-pass filtered symbols, uniform and patched gratings, and point stimuli with randomised phase spectra. In agreement with the existing literature, detection performance was found to decline with the increasing amount of detail and contour in the stimulus. A measure of image complexity was developed and successfully applied to the data. The accuracy of the detection mechanism seems to depend on the spatial structure of the stimulus and the spatial spread of contrast energy.
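To illustrate the spectral-density point above (a sketch based on the standard definition of the spectral density of white pixel noise, with invented parameter values), the code below multiplies the contrast variance of the noise pixels by pixel width, height, and duration:

    # Spectral density of white pixel noise: the contrast variance of the
    # noise pixels multiplied by pixel width, height, and duration
    # (N = c_rms^2 * p_x * p_y * p_t). Units below are degrees and seconds.
    def noise_spectral_density(contrast_rms: float,
                               pixel_width_deg: float,
                               pixel_height_deg: float,
                               pixel_duration_s: float) -> float:
        return (contrast_rms ** 2) * pixel_width_deg * pixel_height_deg * pixel_duration_s

    # Example with invented values: 0.2 rms contrast, 0.05 x 0.05 deg pixels,
    # refreshed every 20 ms.
    print(noise_spectral_density(0.2, 0.05, 0.05, 0.020))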
Abstract:
With its implications for vaccine discovery, the accurate prediction of T cell epitopes is one of the key aspirations of computational vaccinology. We have developed a robust multivariate statistical method, based on partial least squares, for the quantitative prediction of peptide binding to major histocompatibility complexes (MHC), the principal checkpoint on the antigen presentation pathway. As a service to the immunobiology community, we have made a Perl implementation of the method available via a World Wide Web server. We call this server MHCPred. Access to the server is freely available from the URL: http://www.jenner.ac.uk/MHCPred. We have exemplified our method with a model for peptides binding to the common human MHC molecule HLA-B*3501.
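As a rough sketch of this type of partial-least-squares model (a simplified stand-in using scikit-learn, not the MHCPred implementation itself; the peptide encoding and affinity values are invented), the code below fits a PLS regression from one-hot-encoded 9-mer peptides to binding affinities:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def one_hot(peptide: str) -> np.ndarray:
        # Position-wise one-hot encoding of a 9-mer: 9 x 20 binary features.
        vec = np.zeros(len(peptide) * 20)
        for i, aa in enumerate(peptide):
            vec[i * 20 + AMINO_ACIDS.index(aa)] = 1.0
        return vec

    # Invented training data: random 9-mers with random log-affinities.
    rng = np.random.default_rng(0)
    peptides = ["".join(rng.choice(list(AMINO_ACIDS), 9)) for _ in range(200)]
    X = np.array([one_hot(p) for p in peptides])
    y = rng.normal(size=200)                      # e.g. -log10(IC50) values

    # Partial least squares maps the high-dimensional, collinear peptide
    # features onto a few latent components before regression.
    model = PLSRegression(n_components=5).fit(X, y)
    print("predicted affinity:", float(model.predict(X[:1]).ravel()[0]))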