20 results for Metaphysics of reasons

in Aston University Research Archive


Relevance: 100.00%

Abstract:

The objective of the study was to identify common reasons for non-adherence (NA) to highly active antiretroviral therapy (HAART) and the number of reasons reported by non-adherent individuals. A confidential questionnaire was administered to HIV-seropositive patients taking proteinase inhibitor-based HAART. Median self-reported adherence was 95% (n = 178, range = 60-100%). The most frequent reasons for at least 'sometimes' missing a dose were eating a meal at the wrong time (38.2%), oversleeping (36.3%), forgetting (35.0%) and being in a social situation (30.5%). The mean number of reasons occurring at least 'sometimes' was 3.2; 20% of patients gave six or more reasons, and those reporting the lowest adherence reported a significantly greater number of reasons (ρ = -0.59; p < 0.001). Principal component analysis derived three factors from the data, reflecting 'negative experiences of HAART', 'having a low priority for taking medication' and 'unintentionally missing doses', together accounting for 53.8% of the variance. On multivariate analysis only the latter two factors were significantly related to NA (odds ratios 0.845 and 0.849, respectively). There was a wide spectrum of reasons for NA in our population, and the number of reasons reported by an individual increased as adherence decreased. A variety of support modalities, individualized for each patient, are required to assist patients with the lowest adherence.
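As a loose illustration of the factor-derivation step reported above, here is a minimal sketch of a principal component analysis over Likert-style questionnaire responses, assuming scikit-learn is available; the synthetic data and item count are hypothetical, and only the three-component choice mirrors the study.

```python
# Minimal sketch: deriving factors from Likert-style questionnaire data with
# PCA. Synthetic data; only the 3-component choice mirrors the study above.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_patients, n_items = 178, 12
# Synthetic 1-5 responses ('never' .. 'always'); purely illustrative.
responses = rng.integers(1, 6, size=(n_patients, n_items))

# Standardise items so each contributes equally to the components.
scaled = StandardScaler().fit_transform(responses)

# Derive three factors and inspect how much variance they explain.
pca = PCA(n_components=3)
scores = pca.fit_transform(scaled)  # per-patient factor scores
print(f"variance explained: {pca.explained_variance_ratio_.sum():.1%}")
```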

Relevance: 90.00%

Abstract:

Uncertified absence from work has traditionally been difficult to link to personality. The present paper argues that personality is best conceptualized as influencing an individual's intention to be absent from work for reasons that are within their control. This was investigated in an employed community sample of 128 individuals, who used a self-report measure to express their future intentions to be absent from work for a range of reasons. These reasons for absence were categorized as "being absent because of external pressure or commitment" (ABCo) and "being absent by choice" (ABCh). The Big Five personality factors were found to be unrelated to objective uncertified absence records and unrelated to ABCo. Three of the Big Five were related to ABCh: Agreeableness was negatively related, whereas Extraversion and Openness showed positive correlations. It was concluded that the results should be viewed tentatively, but that this study may provide a useful framework for conceptualizing the association of personality with uncertified absence.

Relevance: 90.00%

Abstract:

Computer-Based Learning systems of one sort or another have been in existence for almost 20 years, but they have yet to achieve real credibility within Commerce, Industry or Education. A variety of reasons could be postulated for this, typically:
- cost
- complexity
- inefficiency
- inflexibility
- tedium

Obviously different systems deserve different levels and types of criticism, but it remains true that Computer-Based Learning (CBL) is falling significantly short of its potential. Experience of a small but highly successful CBL system within a large, geographically distributed industry (the National Coal Board) prompted an investigation into currently available packages, the original intention being to purchase the most suitable software and run it on existing computer hardware, alongside existing software systems. It became apparent that none of the available CBL packages were suitable, and a decision was taken to develop an in-house Computer-Assisted Instruction system according to the following criteria:
- cheap to run;
- easy to author course material;
- easy to use;
- requires no computing knowledge to use (as either an author or student);
- efficient in the use of computer resources;
- has a comprehensive range of facilities at all levels.

This thesis describes the initial investigation, the resultant observations, and the design, development and implementation of the SCHOOL system. One of the principal characteristics of SCHOOL is that it uses a hierarchical database structure for the storage of course material, thereby inherently providing a great deal of the power, flexibility and efficiency originally required. Trials of the SCHOOL system on IBM 303X series equipment are also detailed, along with proposed and current development work on what is essentially an operational CBL system within a large-scale industrial environment.
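The hierarchical storage of course material is the design feature the abstract credits for much of SCHOOL's power and flexibility. As a rough illustration of the idea (not SCHOOL's actual schema, which is not reproduced here), a minimal sketch of a hierarchical course tree:

```python
# Minimal sketch of hierarchical course-material storage, in the spirit of
# the database design described above. The node structure is hypothetical,
# not SCHOOL's actual schema.
from dataclasses import dataclass, field

@dataclass
class Node:
    title: str
    content: str = ""
    children: list["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

    def walk(self, depth: int = 0):
        """Depth-first traversal: yields (depth, node) in lesson order."""
        yield depth, self
        for child in self.children:
            yield from child.walk(depth + 1)

course = Node("Mining Safety")
unit = course.add(Node("Ventilation"))
unit.add(Node("Airflow basics", content="Frame 1 text..."))
unit.add(Node("Gas monitoring", content="Frame 2 text..."))

for depth, node in course.walk():
    print("  " * depth + node.title)
```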

Relevance: 90.00%

Abstract:

This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for measuring the minute magnetic flux variations at the scalp that result from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action, directly measuring neuronal activity via the resulting magnetic field fluctuations, MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, its progress hindered by a variety of factors.

Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands, the so-called alpha, delta, beta, etc. bands commonly referred to in both academic and lay publications. Other efforts have centred on generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which produce the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions.

A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to show that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings.

Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratio that is obtained. Because the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are necessarily extremely sensitive. The unfortunate side-effect is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp the signals of interest. This problem is commonly addressed by averaging over a large number of recordings, but this has notable drawbacks: in particular, it is difficult to synchronise the high-frequency activity which might be of interest, and such signals are often cancelled out by the averaging process. Further problems are the high cost and low portability of state-of-the-art multichannel machines, with the result that the use of MEG has hitherto been restricted to large institutions able to afford the costs associated with the procurement and maintenance of these machines.
In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas, from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
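A standard way to act on the state-space view described above, reconstructing an unobservable state from a single measured channel, is time-delay embedding. The following is a minimal sketch using a synthetic signal in place of real MEG data; the embedding dimension and lag are illustrative and would normally be chosen by methods such as false nearest neighbours and mutual information.

```python
# Minimal sketch of time-delay embedding, a standard state-space
# reconstruction for single-channel time series; synthetic data stands in
# for real MEG recordings.
import numpy as np

def delay_embed(x: np.ndarray, dim: int, lag: int) -> np.ndarray:
    """Embed a 1-D series into `dim`-dimensional delay vectors with delay `lag`."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

t = np.linspace(0, 100, 5000)
rng = np.random.default_rng(1)
signal = np.sin(t) + 0.5 * np.sin(2.3 * t) + 0.1 * rng.standard_normal(t.size)

states = delay_embed(signal, dim=3, lag=10)
print(states.shape)  # (4980, 3): one reconstructed state vector per time step
```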

Relevance: 90.00%

Abstract:

This research examines and explains the links between safety culture and communication. Safety culture is a concept that has gained prominence in recent years, but little applied research has been conducted to investigate its meaning in 'real life' settings. This research focused on a Train Operating Company undergoing change in the move towards privatisation; these changes were evident in the management of safety, in the organisation of the industry and internally in the company's management. The Train Operating Company's management took steps to improve its safety culture and communications through the development of a cascade communication structure. The research framework employed a qualitative methodology to investigate the effect of the new system on safety culture. The research found that communications in the organisation failed to be effective for a number of reasons, both cultural and logistical. The cultural problems related to a lack of trust in the organisation on the part of both the management and the workforce, the perception of communications as management propaganda, and asyntonic communications between those involved, while the logistical problems related to the inherent difficulties of communicating over a geographically distributed network. An organisational learning framework was used to explain the results: it is postulated that one of the principal reasons why change, whether to the safety culture or to communications, did not occur was the organisation's inability to learn. The research has also shown the crucial importance of trust between members of the organisation, as its absence was one of the fundamental reasons why the safety culture did not change and why safety management systems were not fully implemented. This is consistent with the notion of mutual trust in the HSC (1993) definition of safety culture, and the research has highlighted the relevance of trust to safety culture and its importance for organisational change.

Relevance: 90.00%

Abstract:

An international round robin study of the stability of fast pyrolysis bio-oil was undertaken, with fifteen laboratories in five countries contributing. Two bio-oil samples were distributed to the laboratories for stability testing and further analysis. The stability test was defined in a method provided with the bio-oil samples, with viscosity measurement as a key input: the change in viscosity of a sealed sample of bio-oil held for 24 h at 80 °C was the defining element of stability. Subsequent analyses included ultimate analysis, density, moisture, ash, filterable solids, TAN/pH determination, and gel permeation chromatography. The results showed that kinematic viscosity measurement was more widely conducted and more reproducibly performed than dynamic viscosity measurement. The variation in the results of the stability test was large, and a number of reasons for the variation were identified. The subsequent analyses proved to be at the level of reproducibility found in earlier round robins on bio-oil analysis. Clearly, the analyses were more straightforward and reproducible with a bio-oil sample low in filterable solids (0.2%) than with one with a higher (2%) solids loading. These results can be helpful in setting standards for the use of bio-oil, which is just coming into the marketplace. © 2012 American Chemical Society.
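For concreteness, the stability metric defined above can be expressed as the relative change in kinematic viscosity across the accelerated-ageing step; a minimal sketch follows, with illustrative numbers rather than data from the round robin.

```python
# Minimal sketch of the stability metric described above: the relative change
# in kinematic viscosity after 24 h sealed at 80 degrees C. All numbers are
# illustrative, not round robin data.
def viscosity_change(initial_cst: float, aged_cst: float) -> float:
    """Percent increase in kinematic viscosity over the accelerated-ageing test."""
    return 100.0 * (aged_cst - initial_cst) / initial_cst

before = 45.0  # cSt, fresh sample (illustrative)
after = 63.0   # cSt, after the 24 h / 80 C hold (illustrative)
print(f"Viscosity increase: {viscosity_change(before, after):.1f}%")
```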

Relevance: 90.00%

Abstract:

The article looks first at the nature of the relations between Germany and the CEE countries a decade after the accession of the CEE countries to the EU. The relations are characterised as normalised and intensive, with diverse levels of closeness and co-operation reflecting conceptual and ideological compatibilities and differences. Next, the article focuses on the German attitude to the euro zone crisis. Germany has become a hegemon in the rescue effort aimed at the stabilisation and economic invigoration of the euro zone. However, German hegemony has developed by default, not by design: its leading position comes with considerable political and financial costs. Germany moved to centre stage and took the position of a reluctant hegemon, yet the German role is contested internationally (it lacks the support of the French government in key areas) as well as internally (particularly by the Federal Constitutional Court and the Bundesbank). The article argues that this new situation makes German-CEE relations increasingly relevant for both sides. German leadership of an EU increasingly split along the north-south divide requires backing from the Northern group of countries, to which the CEE countries in general belong. For a number of reasons, the CEE countries pursue three distinctive strategies of co-operation with Germany in European politics. Military co-operation, which has so far remained rather limited, may also receive new impetus given the financial austerity. © 2013 The Regents of the University of California.

Relevance: 90.00%

Abstract:

Timing jitter is a major factor limiting the performance of any high-speed, long-haul data transmission system. It arises for a number of reasons, such as interaction with accumulated spontaneous emission, inter-symbol interference (ISI) and electrostriction. Some effects causing timing jitter can be reduced by means of nonlinear filtering, using, for example, a nonlinear optical loop mirror (NOLM) [1]. The NOLM has been shown to reduce timing jitter by suppressing the ASE and by stabilising the pulse duration [2, 3]. In this paper, we investigate the dynamics of timing jitter in a 2R-regenerated system, nonlinearly guided by NOLMs, at bit rates of 10, 20, 40 and 80 Gbit/s. The transmission performance of an equivalent non-regenerated (generic) system is taken as a reference.
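As background to the spontaneous-emission-induced jitter mentioned above, a minimal Monte Carlo sketch of the Gordon-Haus mechanism: random frequency kicks imparted by noise at each amplifier are converted by dispersion into timing shifts whose variance grows roughly cubically with distance. All parameters are in arbitrary units and are not those of the system studied in the paper.

```python
# Minimal Monte Carlo sketch of Gordon-Haus timing jitter: noise at each
# amplifier kicks the pulse's centre frequency; dispersion converts the
# accumulated frequency offset into a timing shift. Variance grows ~ z^3.
# All parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n_spans, n_trials = 50, 20000
sigma_f = 1.0     # rms frequency kick per amplifier (arb. units)
dispersion = 1.0  # timing shift per unit frequency offset per span (arb.)

kicks = rng.normal(0.0, sigma_f, size=(n_trials, n_spans))
freq_offset = np.cumsum(kicks, axis=1)                 # frequency after each span
timing = dispersion * np.cumsum(freq_offset, axis=1)   # time shift accumulates

variance = timing.var(axis=0)
# Cubic growth: variance at span 40 should be roughly (40/20)^3 = 8x that at span 20.
print(variance[39] / variance[19])
```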

Relevance: 90.00%

Abstract:

Despite Government investment in flood defence schemes, many properties remain at high risk of flooding, and a substantial portion of these properties are business establishments. Flooding can have serious consequences for businesses, including damage to property and stock, being out of business for a considerable period and, ultimately, business failure. Recent flood events, such as those in 2007 and 2009 that affected many parts of the UK, have helped to establish the true costs of flooding to businesses. This greater understanding of the risks has heightened the need for business owners to adapt their businesses to the threat of future flooding. Government policy has now shifted away from investment in engineered flood defences towards encouraging the uptake of property-level flood resistance and resilience measures by businesses. However, implementing such adaptation strategies remains a challenge for a range of reasons. A review of the current state of property-level flood risk adaptation by UK businesses is presented, drawing on the extant literature. Barriers that may hinder the uptake of property-level adaptation by businesses are identified, and drivers that may enhance uptake and help overcome these barriers are discussed. It is concluded that the professions of the construction sector have the potential to contribute to the adaptation of business properties and thereby to the flood resilience of businesses at risk of flooding.

Relevance: 80.00%

Abstract:

Purpose – Increasing turnover of frontline staff in call centres is detrimental to the delivery of quality service to customers. This paper presents the context for the rapid growth of the business process outsourcing (BPO) sector in India and addresses a critical issue faced by call centre organisations in this sector: high employee turnover.
Design/methodology/approach – Following a triangulation approach, two separate empirical investigations were conducted to examine various aspects of the high labour turnover rates in the Indian call centre sector. Study one examines the research issue via 51 in-depth interviews in as many units. Study two reports the results of a questionnaire survey of 204 frontline agents across 11 call centres regarding employee turnover.
Findings – The research reveals a range of reasons, from monotonous work, a stressful work environment, adverse working conditions and a lack of career development opportunities to better job opportunities elsewhere, which emerge as the key causes of the rising attrition rates in the Indian call centre industry.
Research limitations/implications – The research suggests that several issues need to be handled carefully by the management of call centres in India to overcome the problem of increasing employee turnover, and that this also demands support from the Indian government.
Originality/value – The study untangles the issues underlying a key problem in the call centre industry, namely employee turnover, in the Indian context. Adopting an internal marketing approach, it provides useful information for both academics and practitioners, suggests internal marketing interventions, and identifies avenues for future research to combat the problem of employee turnover.

Relevance: 80.00%

Abstract:

Over the last 30 years, the field of problem structuring methods (PSMs) has been pioneered by a handful of 'gurus', the most visible of whom have contributed their own viewpoints to this special issue. As this generation slowly retires, it is opportune to survey the field and their legacy. We focus on the progress the community made up to 2000, as work that started later is ongoing and its impact on the field will probably only become apparent in 5-10 years' time. We believe that up to 2000, research into PSMs was stagnating, partly because few new researchers were penetrating what we call the 'grass-roots community': the community which takes an active role in developing the theory and application of problem structuring. Evidence for this stagnation, or lack of development, is that in 2000 many PSMs still relied heavily on the same basic methods proposed by their originators nearly 30 years earlier, with perhaps only the addition of supporting computer software as a sign of development. Furthermore, no new methods had been integrated into the literature, which suggests that revolutionary development, at least by academics, had stalled. We are pleased to report that the papers in this double special issue on PSMs suggest this trend is over: new authors report new PSMs and extend existing PSMs in new directions. Despite these recent developments of the methods, it is important to examine why the apparent stagnation took place. In the following sections, we identify and elaborate a number of reasons for it. We also consider the trends, challenges and opportunities that the PSM community will continue to face. Our aim is to evaluate the pre-2000 PSM community in order to encourage its revolutionary development post-2006 and to offer directions for the long-term sustainability of the field.

Relevance: 80.00%

Abstract:

This thesis examines the reasons for Cadburys' move from a city centre site to a greenfield site at Bournville in 1879, and the subsequent development of the factory and the Bournville community. The founding of the Bournville Village Trust by George Cadbury is discussed in relation to the Garden City movement. The welfare and personnel management policies which Cadburys adopted in the 1900s are considered in relation to welfarism in general, especially in the United States. The extent to which the idea of a 'Quaker employer' can explain Cadburys' policies is questioned both methodologically and empirically. The early use of scientific management at Bournville is described and related to Edward Cadbury's writings on the subject. Finally, the institution of a Works Council Scheme in 1918 is described and its uses are discussed. It is concluded that Cadburys instituted a new factory system in this period, consisting of a synthesis of ideas borrowed from elsewhere, and that for a variety of reasons Cadburys was an appropriate site for their implementation.

Relevance: 80.00%

Abstract:

This paper investigates the decisions that companies make when choosing the source of their manufacturing technology. It builds on previous research that has identified differences between the practices of US and Japanese manufacturing companies. A structured study of manufacturing technology sourcing practices at 14 US-based manufacturing companies is described. This research has confirmed a trend among the companies studied towards acquiring manufacturing technology from sources external to their organisations; however, no formal processes are used to form these sourcing policies. When a rationale for this behaviour was sought, companies gave a series of reasons concerned with business focus, the efficiency of technology acquisition, and the extent, defence and support of their manufacturing capabilities.

Relevance: 80.00%

Abstract:

Today, very few would doubt that there are plenty of reasons to liken Weber’s and Foucault’s theories of power. Nevertheless, their respective works have divergent ethical and ontological preoccupations which should be reconsidered. This paper explores Foucault’s account of a historical episode in Discipline and Punish and Weber’s theory of life spheres, uncovering evidence that there is a need to reassess the conceptual bridges which have been built so far. The exploration reveals a radical difference between a monological theory of power (Foucault) and a multidimensional approach to power (Weber). Yet by unbridging the two thinkers and focusing on other aspects of their theories along with their ideas about power, we also find that alternative links between the two frameworks may offer a more promising critical theory.

Relevance: 80.00%

Abstract:

The management and sharing of complex data, information and knowledge is a fundamental and growing concern in the water industry and other industries, for a variety of reasons. For example, the risks and uncertainties associated with climate and other changes require knowledge in order to prepare for a range of future scenarios and potential extreme events. Formal ways of establishing and managing knowledge can deliver efficiencies in its acquisition, structuring and filtering, so as to provide only the essential aspects of the knowledge really needed. Ontologies are a key technology for this knowledge management, but their construction is a considerable overhead on any knowledge management programme. Hence, current computer science research is investigating the automatic generation of ontologies from documents using text mining and natural language techniques. As an example of this, results from the application of the Text2Onto tool to stakeholder documents for a project on sustainable water cycle management in new developments are presented. It is concluded that by adopting ontological representations sooner, rather than later, in an analytical process, decision makers will be able to make better use of highly knowledgeable systems containing automated services, ensuring that sustainability considerations are included.
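As a toy illustration of the document-to-concepts step described above (far simpler than what Text2Onto actually does, which includes linguistic analysis and learned relations), here is a minimal sketch of frequency-based extraction of candidate ontology concepts; the stop-word list and documents are illustrative.

```python
# Minimal sketch of extracting candidate ontology concepts from documents by
# term frequency, loosely analogous to (but much simpler than) tools like
# Text2Onto. Stop-word list and documents are illustrative.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "of", "and", "to", "in", "for", "is", "are"}

def concept_candidates(documents: list[str], top_n: int = 5) -> list[tuple[str, int]]:
    """Rank non-stop-word terms by corpus frequency as candidate concepts."""
    counts: Counter[str] = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z]+", doc.lower())
        counts.update(t for t in tokens if t not in STOP_WORDS)
    return counts.most_common(top_n)

docs = [
    "Water cycle management requires knowledge of rainfall and drainage.",
    "Sustainable drainage reduces flood risk in new developments.",
]
print(concept_candidates(docs))
```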