70 results for due credibility


Relevance: 20.00%

Abstract:

PURPOSE: To explore the effects of glaucoma and aging on low-spatial-frequency contrast sensitivity by using tests designed to assess performance of either the magnocellular (M) or parvocellular (P) visual pathways. METHODS: Contrast sensitivity was measured for spatial frequencies of 0.25 to 2 cyc/deg by using a published steady- and pulsed-pedestal approach. Sixteen patients with glaucoma and 16 approximately age-matched control subjects participated. Patients with glaucoma were tested foveally and at two midperipheral locations: (1) an area of early visual field loss, and (2) an area of normal visual field. Control subjects were assessed in matched locations. An additional group of 12 younger control subjects (aged 20-35 years) were also tested. RESULTS: Older control subjects demonstrated reduced sensitivity relative to the younger group for the steady (presumed M)- and pulsed (presumed P)-pedestal conditions. Sensitivity was reduced foveally and in the midperiphery across the spatial frequency range. In the area of early visual field loss, the glaucoma group demonstrated further sensitivity reduction relative to older control subjects across the spatial frequency range for both the steady- and pulsed-pedestal tasks. Sensitivity was also reduced in the midperipheral location of "normal" visual field for the pulsed condition. CONCLUSIONS: Normal aging results in a reduction of contrast sensitivity for the low-spatial-frequency-sensitive components of both the M and P pathways. Glaucoma results in a further reduction of sensitivity that is not selective for M or P function. The low-spatial-frequency-sensitive channels of both pathways, which are presumably mediated by cells with larger receptive fields, are approximately equivalently impaired in early glaucoma.

Relevance: 20.00%

Abstract:

Channel measurements and simulations have been carried out to observe the effects of pedestrian movement on multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) channel capacity. An in-house-built MIMO-OFDM packet transmission demonstrator equipped with four transmitters and four receivers has been utilized to perform channel measurements at 5.2 GHz. Variations in the channel capacity dynamic range have been analysed for 1 to 10 pedestrians and different antenna arrays (2 × 2, 3 × 3 and 4 × 4). Results show a predicted 5.5 bits/s/Hz and a measured 1.5 bits/s/Hz increment in the capacity dynamic range with the number of pedestrians and the number of antennas in the transmitter and receiver arrays.
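As background to the capacity figures quoted above, the sketch below (not the demonstrator's code; all values are illustrative) computes the standard MIMO channel capacity, C = log2 det(I + (SNR/Nt) H Hᴴ), over random Rayleigh-fading realisations for 2 × 2, 3 × 3 and 4 × 4 arrays and reports the spread (dynamic range) of the resulting capacities. In the measurements, pedestrian movement is what perturbs H over time; here i.i.d. realisations merely stand in for that variation, and the 10 dB SNR is an assumption.

```python
# Illustrative sketch only: Shannon capacity of one MIMO channel realisation
# and its spread over many random realisations (a stand-in for the
# pedestrian-induced channel variation measured in the paper).
import numpy as np

def mimo_capacity(H: np.ndarray, snr_linear: float) -> float:
    """Capacity in bits/s/Hz: log2 det(I + (SNR/Nt) * H H^H)."""
    nr, nt = H.shape
    m = np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T)
    _, logdet = np.linalg.slogdet(m)   # natural log of |det m|
    return float(logdet / np.log(2))

rng = np.random.default_rng(0)
snr = 10 ** (10 / 10)                  # assumed 10 dB SNR (hypothetical)
for n in (2, 3, 4):                    # 2x2, 3x3 and 4x4 arrays
    caps = np.array([
        mimo_capacity((rng.standard_normal((n, n))
                       + 1j * rng.standard_normal((n, n))) / np.sqrt(2), snr)
        for _ in range(2000)])         # i.i.d. Rayleigh-fading realisations
    print(f"{n}x{n}: mean {caps.mean():.1f} bits/s/Hz, "
          f"dynamic range {caps.max() - caps.min():.1f} bits/s/Hz")
```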

Relevance: 20.00%

Abstract:

We assess the increase in particle number emissions from motor vehicles driving at steady speed when forced to stop and accelerate from rest. Considering the example of a signalized pedestrian crossing on a two-way single-lane urban road, we use a complex line source method to calculate the total emissions produced by a specific number and mix of light petrol cars and diesel passenger buses, and show that the total emissions during a red light are significantly higher than during the time when the light remains green. Replacing two cars with one bus increased the emissions by over an order of magnitude. Considering these large differences, we conclude that the importance attached to particle number emissions in traffic management policies should be reassessed in the future.
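To make the red-versus-green comparison concrete, the toy sketch below aggregates particle number emissions for a hypothetical queue of cars and one bus, using made-up emission rates (not the paper's line source model or measured factors), simply to show why stop-and-accelerate cycles dominate the total.

```python
# Toy illustration with hypothetical emission rates (particles per second);
# these are NOT the paper's measured values or its line source model.
CRUISE = {"car": 1e11, "bus": 1e13}    # assumed rate at steady speed / idling
ACCEL  = {"car": 1e12, "bus": 1e14}    # assumed rate while accelerating from rest

def green_total(vehicles, pass_time_s=5.0):
    """Vehicles pass the crossing at steady speed during a green light."""
    return sum(CRUISE[v] * pass_time_s for v in vehicles)

def red_total(vehicles, idle_time_s=30.0, accel_time_s=10.0):
    """Vehicles queue at a red light, then accelerate from rest."""
    return sum(CRUISE[v] * idle_time_s + ACCEL[v] * accel_time_s for v in vehicles)

fleet = ["car"] * 10 + ["bus"]         # hypothetical traffic mix
print(f"green phase: {green_total(fleet):.2e} particles")
print(f"red phase:   {red_total(fleet):.2e} particles")
```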

Relevance: 20.00%

Abstract:

Shrinkage cracking is commonly observed in concrete flat structures such as highway pavements, slabs, and bridge decks. Crack spacing due to shrinkage has received considerable attention for many years [1-3]. However, some aspects concerning the mechanism of crack spacing still remain unclear. Though it is well known that the interval between cracks generally falls within a range, no satisfactory explanation has been put forward as to why the minimum spacing exists.

Relevance: 20.00%

Abstract:

Since the 1980s, industries and researchers have sought to better understand the quality of services due to the rise in their importance (Brogowicz, Delene and Lyth 1990). More recent developments with online services, coupled with growing recognition of service quality (SQ) as a key contributor to national economies and as an increasingly important competitive differentiator, amplify the need to revisit our understanding of SQ and its measurement. Although ‘SQ’ can be broadly defined as “a global overarching judgment or attitude relating to the overall excellence or superiority of a service” (Parasuraman, Berry and Zeithaml 1988), the term has many interpretations. There has been considerable progress on how to measure SQ perceptions, but little consensus has been achieved on what should be measured. There is agreement that SQ is multi-dimensional, but little agreement as to the nature or content of these dimensions (Brady and Cronin 2001). For example, within the banking sector, there exist multiple SQ models, each consisting of varying dimensions. The existence of multiple conceptions and the lack of a unifying theory bring the credibility of existing conceptions into question, and beg the question of whether it is possible at some higher level to define SQ broadly such that it spans all service types and industries.

This research aims to explore the viability of a universal conception of SQ, primarily through a careful re-visitation of the services and SQ literature. The study analyses the strengths and weaknesses of the highly regarded and widely used global SQ model (SERVQUAL), which reflects a single-level approach to SQ measurement. The SERVQUAL model states that customers evaluate SQ (of each service encounter) based on five dimensions, namely reliability, assurance, tangibles, empathy and responsiveness. SERVQUAL, however, failed to address what needs to be reliable, assured, tangible, empathetic and responsive. This research also addresses a more recent global SQ model from Brady and Cronin (2001), the B&C (2001) model, which has the potential to be the successor of SERVQUAL in that it encompasses other global SQ models and addresses the ‘what’ questions that SERVQUAL didn’t. The B&C (2001) model conceives SQ as being multidimensional and multi-level; this hierarchical approach to SQ measurement better reflects human perceptions. In line with the initial intention of SERVQUAL, which was developed to be generalizable across industries and service types, this research aims to develop a conceptual understanding of SQ, via literature and reflection, that encompasses the content/nature of factors related to SQ and addresses the benefits and weaknesses of various SQ measurement approaches (i.e. disconfirmation versus perceptions-only). Such understanding of SQ seeks to transcend industries and service types with the intention of extending our knowledge of SQ and assisting practitioners in understanding and evaluating SQ.

The candidate’s research has been conducted within, and seeks to contribute to, the ‘IS-Impact’ research track of the IT Professional Services (ITPS) Research Program at QUT. The vision of the track is “to develop the most widely employed model for benchmarking Information Systems in organizations for the joint benefit of research and practice.” The ‘IS-Impact’ research track has developed an Information Systems (IS) success measurement model, the IS-Impact Model (Gable, Sedera and Chan 2008), which seeks to fulfill the track’s vision.
Results of this study will help future researchers in the ‘IS-Impact’ research track address questions such as:
• Is SQ an antecedent or consequence of the IS-Impact model, or both?
• Has SQ already been addressed by existing measures of the IS-Impact model?
• Is SQ a separate, new dimension of the IS-Impact model?
• Is SQ an alternative conception of the IS?

Results from the candidate’s research suggest that SQ dimensions can be classified at a higher level, which is encompassed by the B&C (2001) model’s three primary dimensions (interaction, physical environment and outcome). The candidate also notes that it might be viable to re-word the ‘physical environment quality’ primary dimension to ‘environment quality’ so as to better encompass both physical and virtual scenarios (e.g., web sites). The candidate does not rule out the global feasibility of the B&C (2001) model’s nine sub-dimensions, but acknowledges that more work has to be done to better define the sub-dimensions.

The candidate observes that the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions are supportive representations of the ‘interaction’, ‘physical environment’ and ‘outcome’ primary dimensions respectively. The latter statement suggests that customers evaluate each primary dimension (or each higher level of SQ classification), namely ‘interaction’, ‘physical environment’ and ‘outcome’, based on the ‘expertise’, ‘design’ and ‘valence’ sub-dimensions respectively. The ability to classify SQ dimensions at a higher level, coupled with support for the measures that make up this higher level, leads the candidate to propose the B&C (2001) model as a unifying theory that acts as a starting point for measuring SQ and the SQ of IS. The candidate also notes, in parallel with the continuing validation and generalization of the IS-Impact model, that there is value in alternatively conceptualizing the IS as a ‘service’ and ultimately triangulating measures of IS SQ with the IS-Impact model. These further efforts are beyond the scope of the candidate’s study.

Results from the candidate’s research also suggest that both the disconfirmation and perceptions-only approaches have their merits, and the choice of approach would depend on the objective(s) of the study. Should the objective(s) be an overall evaluation of SQ, the perceptions-only approach is more appropriate as this approach is more straightforward and reduces administrative overheads in the process. However, should the objective(s) be to identify SQ gaps (shortfalls), the (measured) disconfirmation approach is more appropriate as this approach has the ability to identify areas that need improvement.
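To illustrate the disconfirmation-versus-perceptions-only distinction discussed above, the sketch below scores the five SERVQUAL dimensions both ways for one hypothetical respondent (the ratings are invented, not data from the study): gap scores flag shortfalls dimension by dimension, while the perceptions-only average gives a single overall evaluation.

```python
# Hypothetical 7-point ratings for one respondent; for illustration only.
DIMENSIONS = ["reliability", "assurance", "tangibles", "empathy", "responsiveness"]
expectations = {"reliability": 6.5, "assurance": 6.0, "tangibles": 5.0,
                "empathy": 5.5, "responsiveness": 6.2}
perceptions  = {"reliability": 5.8, "assurance": 6.1, "tangibles": 5.2,
                "empathy": 4.9, "responsiveness": 5.5}

# Disconfirmation (SERVQUAL-style): gap = perception - expectation per dimension;
# negative gaps point to areas needing improvement.
gaps = {d: round(perceptions[d] - expectations[d], 2) for d in DIMENSIONS}
shortfalls = {d: g for d, g in gaps.items() if g < 0}

# Perceptions-only (SERVPERF-style): overall SQ as the mean perception score.
overall = sum(perceptions[d] for d in DIMENSIONS) / len(DIMENSIONS)

print("gap scores:", gaps)
print("shortfalls:", shortfalls)
print("overall perceptions-only score:", round(overall, 2))
```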

Relevance: 20.00%

Abstract:

The range of political information sources available to modern Australians is greater and more varied today than at any point in the nation’s history, incorporating print, broadcast, Internet, mainstream and non-mainstream media. In such a competitive media environment, the factors which influence the selection of some information sources above others are of interest to political agents, media institutions and communications researchers alike. A key factor in information source selection is credibility. At the same time that the range of political information sources is increasing rapidly, due to the development of new information and communication technologies, audience research suggests that trust in mainstream media organisations in many countries is declining. So if people distrust the mainstream media, but have a vast array of alternative political information sources available to them, what do their personal media consumption patterns look like? How can we analyse such media consumption patterns in a meaningful way? In this paper I will briefly map the development of media credibility research in the US and Australia, leading to a discussion of one of the most recent media credibility constructs to be shown to influence political information consumption, media scepticism. Looking at the consequences of media scepticism, I will then consider the associated media consumption construct, media diet, and evaluate its usefulness in an Australian, as opposed to US, context. Finally, I will suggest alternative conceptualisations of media diets which may be more suited to Australian political communications research.

Relevance: 20.00%

Abstract:

It is now well known that pesticide spraying by farmers has an adverse impact on their health. This is especially so in developing countries where pesticide spraying is undertaken manually. The estimated health costs are large. Studies to date have examined farmers’ exposure to pesticides, the costs of ill-health and their determinants based on information provided by farmers. Hence, some doubt has been cast on the reliability of such studies. In this study, we rectify this situation by conducting surveys among two groups of farmers: farmers who perceive that their ill-health is due to exposure to pesticides and who obtained treatment, and farmers whose ill-health has been diagnosed by doctors and who have been treated in hospital for exposure to pesticides. In the paper, cost comparisons between the two groups of farmers are made. Furthermore, regression analysis of the determinants of health costs shows that the quantity of pesticides used per acre per month, frequency of pesticide use and number of pesticides used per hour per day are the most important determinants of medical costs for both samples. The results have important policy implications.
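As a sketch of the kind of analysis described (the data, variable names and coefficients below are entirely hypothetical, not the study's), an ordinary least squares regression of medical costs on the three determinants named in the abstract might look like this:

```python
# Hypothetical data and coefficients; illustrates an OLS regression of medical
# costs on the determinants named in the abstract, not the study's dataset.
import numpy as np

rng = np.random.default_rng(1)
n = 200                                               # hypothetical sample size

qty_per_acre_month  = rng.gamma(2.0, 1.5, n)          # quantity of pesticides used
spray_frequency     = rng.poisson(4, n).astype(float) # frequency of pesticide use
pesticides_per_hour = rng.gamma(1.5, 1.0, n)          # number of pesticides used per hour

# Simulated medical cost (arbitrary currency units) with noise
cost = (50 + 30 * qty_per_acre_month + 20 * spray_frequency
        + 25 * pesticides_per_hour + rng.normal(0, 40, n))

X = np.column_stack([np.ones(n), qty_per_acre_month, spray_frequency, pesticides_per_hour])
beta, *_ = np.linalg.lstsq(X, cost, rcond=None)       # OLS estimates
for name, b in zip(["intercept", "qty/acre/month", "frequency", "pesticides/hour"], beta):
    print(f"{name}: {b:.1f}")
```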

Relevance: 20.00%

Abstract:

There are a number of gel dosimeter calibration methods in contemporary usage. The present study is a detailed Monte Carlo investigation into the accuracy of several calibration techniques. Results show that for most arrangements the dose to gel accurately reflects the dose to water, with the most accurate method involving the use of a large-diameter flask of gel into which multiple small fields of varying dose are directed. The least accurate method was found to be that of a long test tube in a water phantom, coaxial with the beam. The large-flask method is also the most straightforward and the least likely to introduce errors during setup, though, to its detriment, it requires a much larger volume of gel than the other methods.