886 results for "Critical factors of success"


Relevance: 100.00%

Publisher:

Abstract:

Contrast susceptibility is defined as the difference in visual acuity recorded for high and low contrast optotypes. Other researchers refer to this parameter as "normalised low contrast acuity". Pilot surveys have revealed that contrast susceptibility deficits are more strongly related to driving accident involvement than are deficits in high contrast visual acuity. It has been hypothesised that driving situation avoidance is based purely upon high contrast visual acuity. Hence, the relationship between high contrast visual acuity and accidents is masked by situation avoidance, whilst drivers with contrast susceptibility deficits remain prone to accidents in poor visibility conditions. A national survey carried out to test this hypothesis provided no support either for the link between contrast susceptibility deficits and accident involvement or for the proposed hypothesis. Further, systematically worse contrast susceptibility scores emerged from vision screeners compared to wall-mounted test charts. This discrepancy was not due to variations in test luminance or instrument myopia. Instead, optical imperfections inherent in vision screeners were considered to be responsible. Although contrast susceptibility is unlikely to provide a useful means of screening drivers' vision, previous research does support its ability to detect visual deficits that may influence everyday tasks. In this respect, individual contrast susceptibility variations were found to reflect variations in the contrast sensitivity function - a parameter that provides a global estimate of human contrast sensitivity.
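The opening definition is simple arithmetic: contrast susceptibility is the low-contrast acuity score minus the high-contrast one. A minimal sketch, assuming acuities expressed in logMAR units (the function name and the unit convention are illustrative assumptions, not taken from the survey itself):

```python
# Hedged sketch: contrast susceptibility as the acuity drop between
# low- and high-contrast optotypes, with acuities in logMAR
# (higher logMAR = worse acuity, so a larger result = larger deficit).
def contrast_susceptibility(high_contrast_logmar: float,
                            low_contrast_logmar: float) -> float:
    """Difference in visual acuity between low and high contrast charts."""
    return low_contrast_logmar - high_contrast_logmar

# Example: 0.00 logMAR (6/6) at high contrast, 0.20 logMAR at low contrast
print(contrast_susceptibility(0.00, 0.20))  # → 0.2
```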

Relevance: 100.00%

Publisher:

Abstract:

Increasingly, users are seen as the weak link in the chain when it comes to the security of corporate information. Should the users of computer systems act in any inappropriate or insecure manner, they may put their employers in danger of financial losses, information degradation or litigation, and themselves in danger of dismissal or prosecution. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of inappropriate behaviours, and in so doing protecting corporate information, is the formulation and application of a formal 'acceptable use policy' (AUP). Whilst the AUP has attracted some academic interest, the literature has tended to be prescriptive and overly focussed on the role of the Internet, and there is relatively little empirical material that explicitly addresses the purpose, positioning or content of real acceptable use policies. The broad aim of the study reported in this paper is to fill this gap in the literature by critically examining the structure and composition of a sample of authentic policies – taken from the higher education sector – rather than simply making general prescriptions about what they ought to contain. There are two important conclusions to be drawn from this study: (1) the primary role of the AUP appears to be as a mechanism for dealing with unacceptable behaviour, rather than proactively promoting desirable and effective security behaviours; and (2) the wide variation found in the coverage and positioning of the reviewed policies is unlikely to be fostering a coherent approach to security management across the higher education sector.

Relevance: 100.00%

Publisher:

Abstract:

Ensuring the security of corporate information, which is increasingly stored, processed and disseminated using information and communications technologies (ICTs), has become an extremely complex and challenging activity. This is a particularly important concern for knowledge-intensive organisations, such as universities, as the effective conduct of their core teaching and research activities is becoming ever more reliant on the availability, integrity and accuracy of computer-based information resources. One increasingly important mechanism for reducing the occurrence of security breaches, and in so doing protecting corporate information, is the formulation and application of a formal information security policy (InSPy). Whilst a great deal has now been written about the importance and role of the information security policy, and about approaches to its formulation and dissemination, there is relatively little empirical material that explicitly addresses the structure or content of security policies. The broad aim of the study reported in this paper is to fill this gap in the literature by critically examining the structure and content of authentic information security policies, rather than simply making general prescriptions about what they ought to contain. Having established the structure and key features of the reviewed policies, the paper critically explores the underlying conceptualisation of information security embedded in the policies. There are two important conclusions to be drawn from this study: (1) the wide diversity of disparate policies and standards in use is unlikely to foster a coherent approach to security management; and (2) the range of specific issues explicitly covered in university policies is surprisingly low and reflects a highly techno-centric view of information security management.

Relevance: 100.00%

Publisher:

Abstract:

Some critical aspects of a new kind of on-line measurement technique for micro- and nanoscale surface measurement are described. The technique uses spatial light-wave scanning in place of mechanical stylus scanning, and an optical fibre interferometer in place of optically bulky interferometers, for measuring surfaces. The principle is to measure the phase shift of a reflected optical signal. Wavelength-division-multiplexing and fibre Bragg grating techniques are used to carry out wavelength-to-field transformation and phase-to-depth detection, allowing a large dynamic measurement ratio (range/resolution) and a high signal-to-noise ratio with remote access. In effect, the paper consists of two parts: multiplexed fibre interferometry and a remote on-machine surface detection sensor (an optical dispersive probe). The paper aims to investigate the metrology properties of a multiplexed fibre interferometer and to verify its feasibility through both theoretical and experimental studies. Two types of optical probe, using a dispersive prism and a blazed grating respectively, are introduced to realize wavelength-to-spatial scanning.
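The phase-to-depth detection mentioned above rests on a standard relation for reflective interferometry: a surface height step h shifts the returned phase by 4πh/λ, the factor of two arising from the double pass of the light. A hedged sketch of that relation, assuming a single unwrapped fringe; this is generic interferometry, not the paper's own algorithm:

```python
import math

# Generic reflective-interferometry relation (an assumption, not the
# paper's method): phase = 4*pi*height / wavelength, so inverting gives
# height = phase * wavelength / (4*pi).
def phase_to_depth(phase_rad: float, wavelength_m: float) -> float:
    """Recover a surface height step from a measured phase shift."""
    return phase_rad * wavelength_m / (4 * math.pi)

# A pi/2 phase shift at 1550 nm, a typical fibre-optic wavelength
h = phase_to_depth(math.pi / 2, 1550e-9)
print(f"{h * 1e9:.1f} nm")  # → 193.8 nm
```

Note the limited unambiguous range: once the phase exceeds 2π the depth wraps, which is one reason multi-wavelength (here, wavelength-division-multiplexed) schemes are attractive for extending the range/resolution ratio.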

Relevance: 100.00%

Publisher:

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Publisher:

Abstract:

Due to copyright restrictions, only available for consultation at Aston University Library and Information Services with prior arrangement.

Relevance: 100.00%

Publisher:

Abstract:

With the European Commission making global leadership claims in the field of audit regulation, the content of its 2010 Green Paper on ‘Audit Policy: Lessons from the Crisis’ warrants careful scrutiny. Important issues raised in the Green Paper include regulatory oversight, competition in the audit market, the dangers of having very few firms with the capacity to audit global transnational corporations, professional judgement, innovative audit practices and, last but not least, social responsibility. This article analyses the principal perspectives and assumptions underpinning the construction of the Green Paper. The aims are threefold: to enhance understanding of the contemporary regulatory mindset of the European Commission, contribute to policy debate and inspire future research.

Relevance: 100.00%

Publisher:

Abstract:

This paper critically reviews the evolution of financial reporting in the banking sector with specific reference to the reporting of market risk and the growing use of the measure known as Value at Risk (VaR). The paper investigates the process by which VaR became 'institutionalised'. The analysis highlights a number of inherent limitations of VaR as a risk measure and questions the usefulness of published VaR disclosures, concluding that risk 'disclosure' might be more apparent than real. It also looks at some of the implications for risk reporting practice and the accounting profession more generally.
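As a concrete illustration of the measure under discussion, one common VaR estimator is historical simulation: sort the observed losses and read off the quantile at the chosen confidence level. The sketch below is a generic textbook estimator under assumed toy data, not the disclosure methodology of any institution reviewed in the paper:

```python
# Hedged sketch of historical-simulation Value at Risk: the loss that is
# not exceeded with the given confidence, estimated from past returns.
def historical_var(returns: list[float], confidence: float = 0.99) -> float:
    """One-period VaR, reported as a positive loss figure."""
    losses = sorted(-r for r in returns)      # losses as positive numbers
    idx = int(confidence * len(losses))       # empirical quantile index
    return losses[min(idx, len(losses) - 1)]

# Toy daily returns (illustrative only)
daily_returns = [0.01, -0.02, 0.005, -0.015, 0.002, -0.03, 0.012, -0.001]
print(historical_var(daily_returns, confidence=0.75))  # → 0.02
```

The example also hints at the limitations the paper highlights: the figure depends entirely on the chosen window, confidence level and quantile convention, so two published VaR numbers are rarely comparable.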

Relevance: 100.00%

Publisher:

Abstract:

The 'amyloid cascade hypothesis' (ACH) is the most influential model of the pathogenesis of Alzheimer's disease (AD). The hypothesis proposes that the deposition of β-amyloid (Aβ) is the initial pathological event in AD, leading to the formation of extracellular senile plaques (SP), tau-immunoreactive neurofibrillary tangles (NFT), neuronal loss, and ultimately, clinical dementia. Ever since the formulation of the ACH, however, there have been questions regarding whether it completely describes AD pathogenesis. This review critically examines various aspects of the ACH including its origin and development, the role of amyloid precursor protein (APP), whether SP and NFT are related to the development of clinical dementia, whether Aβ and tau are 'reactive' proteins, and whether there is a pathogenic relationship between SP and NFT. The results of transgenic experiments and treatments for AD designed on the basis of the ACH are also reviewed. It was concluded that: (1) Aβ and tau could be the products rather than the cause of neurodegeneration in AD; (2) it is doubtful whether there is a direct causal link between Aβ and tau; (3) SP and NFT may not be directly related to the development of dementia; (4) transgenic models involving APP alone do not completely replicate AD pathology; and (5) treatments based on the ACH have been unsuccessful. Hence, a modification of the ACH is proposed which may provide a more complete explanation of the pathogenesis of AD.

Relevance: 100.00%

Publisher:

Abstract:

The glucagon-like peptide-1 receptor (GLP-1R) is a class B G protein-coupled receptor that has a critical role in the regulation of glucose homeostasis, principally through the regulation of insulin secretion. The receptor system is highly complex, able to be activated by both endogenous [GLP-1(1-36)NH2, GLP-1(1-37), GLP-1(7-36)NH2, GLP-1(7-37), oxyntomodulin] and exogenous (exendin-4) peptides, in addition to small-molecule allosteric agonists (compound 2 [6,7-dichloro-2-methylsulfonyl-3-tert-butylaminoquinoxaline] and BETP [4-(3-(benzyloxy)phenyl)-2-ethylsulfinyl-6-(trifluoromethyl)pyrimidine]). Furthermore, the GLP-1R is subject to single-nucleotide polymorphic variance, resulting in amino acid changes in the receptor protein. In this study, we investigated two polymorphic variants previously reported to impact peptide-mediated receptor activity (M149) and small-molecule allostery (C333). These residues were mutated to a series of alternate amino acids, and their functionality was monitored across physiologically significant signaling pathways, including cAMP, extracellular signal-regulated kinase 1 and 2 phosphorylation, and intracellular Ca2+ mobilization, in addition to peptide binding and cell-surface expression. We observed that residue 149 is highly sensitive to mutation, with almost all peptide responses significantly attenuated at mutated receptors. However, most reductions in activity were able to be restored by the small-molecule allosteric agonist compound 2. Conversely, mutation of residue 333 had little impact on peptide-mediated receptor activation, but this activity could not be modulated by compound 2 to the same extent as that observed at the wild-type receptor. These results provide insight into the importance of residues 149 and 333 in peptide function and highlight the complexities of allosteric modulation within this receptor system.