934 results for Metaphysics of reasons
Abstract:
Firms began outsourcing information system functions soon after the inception of electronic computing. Extant research has concentrated on large organizations and large-valued outsourcing contracts from a variety of industries. Smaller-sized firms are inherently different from their large counterparts. These differences between small and large firms could lead to different information technology/information system (IT/IS) items being outsourced and to different outsourcing agreements governing these arrangements. This research explores the outsourcing practices of very small to medium-sized manufacturing organizations. The in-depth case studies explored not only the extent to which different firms engaged in outsourcing but also the nuances of their outsourcing arrangements. The results reveal that all six firms tended to outsource the same sorts of functions. Definite differences existed, however, in the strategies adopted for the functions they outsourced. These differences arose for a variety of reasons, including size, locality, and holding company influences. The very small and small manufacturing firms tended to make outsourcing purchases on an ad hoc basis, with little reliance on legal advice. In contrast, the medium-sized firms often took a more planned approach and sought legal advice more often. Interestingly, not one of the six firms outsourced any of its transaction processing. These findings give very small, small, and medium-sized manufacturing firms the opportunity to compare their practices against those of other firms of similar size.
Abstract:
Uncertified absence from work has traditionally been difficult to link to personality. The present paper argues that personality is best conceptualized as influencing an individual’s intention to be absent from work for reasons that are within their control. This was investigated in an employed community sample of 128 individuals, who used a self-report measure to express their future intentions to be absent from work for several reasons. These reasons for absence were categorized as “being absent because of external pressure or commitment” (ABCo) and “being absent by choice” (ABCh). The Big Five personality factors were found to be unrelated to objective uncertified absence records and unrelated to ABCo. Three of the Big Five were related to ABCh: Agreeableness was negatively related to ABCh, whereas Extraversion and Openness demonstrated a positive correlation. It was concluded that the results should be viewed tentatively, but that this study may provide a useful framework for conceptualizing the association of personality with uncertified absence.
Abstract:
Computer-Based Learning systems of one sort or another have been in existence for almost 20 years, but they have yet to achieve real credibility within Commerce, Industry or Education. A variety of reasons could be postulated for this, typically:
- cost
- complexity
- inefficiency
- inflexibility
- tedium
Obviously different systems deserve different levels and types of criticism, but it still remains true that Computer-Based Learning (CBL) is falling significantly short of its potential. Experience of a small but highly successful CBL system within a large, geographically distributed industry (the National Coal Board) prompted an investigation into currently available packages, the original intention being to purchase the most suitable software and run it on existing computer hardware, alongside existing software systems. It became apparent that none of the available CBL packages were suitable, and a decision was taken to develop an in-house Computer-Assisted Instruction system according to the following criteria:
- cheap to run;
- easy to author course material;
- easy to use;
- requires no computing knowledge to use (as either an author or a student);
- efficient in the use of computer resources;
- has a comprehensive range of facilities at all levels.
This thesis describes the initial investigation, the resultant observations, and the design, development and implementation of the SCHOOL system. One of the principal characteristics of SCHOOL is that it uses a hierarchical database structure for the storage of course material, thereby inherently providing a great deal of the power, flexibility and efficiency originally required. Trials using the SCHOOL system on IBM 303X series equipment are also detailed, along with proposed and current development work on what is essentially an operational CBL system within a large-scale industrial environment.
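The abstract notes that SCHOOL stores course material in a hierarchical database, but its actual schema is not given here. Purely as an illustrative sketch, the Python below models course material as a tree of courses, modules and lessons, the general shape such a hierarchy implies; all names are hypothetical.

```python
# Illustrative sketch only: the SCHOOL system's actual schema is not
# described in the abstract. This models course material as a tree,
# the general shape a hierarchical database implies.
from dataclasses import dataclass, field

@dataclass
class Node:
    """A unit of course material: a course, module, lesson, or frame."""
    title: str
    content: str = ""
    children: list["Node"] = field(default_factory=list)

    def add(self, child: "Node") -> "Node":
        self.children.append(child)
        return child

    def walk(self, depth: int = 0):
        """Depth-first traversal, the order a student would progress in."""
        yield depth, self
        for child in self.children:
            yield from child.walk(depth + 1)

course = Node("Mining Safety")          # hypothetical course
module = course.add(Node("Ventilation"))
module.add(Node("Lesson 1", "Airflow fundamentals..."))

for depth, node in course.walk():
    print("  " * depth + node.title)
```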
Abstract:
This thesis presents the results from an investigation into the merits of analysing Magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of the methods for measuring minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands that are commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear. This is despite a variety of reasons which suggest that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to show that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the Earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks. In particular, it is difficult to synchronise high-frequency activity which might be of interest, and often these signals will be cancelled out by the averaging process. Other problems that have been encountered are the high costs and low portability of state-of-the-art multichannel machines. The result is that the use of MEG has hitherto been restricted to large institutions which can afford the high costs associated with the procurement and maintenance of these machines.
In this project, we seek to address these issues by working almost exclusively with single-channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas, from financial time series modelling to the analysis of sunspot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
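The abstract does not spell out the thesis's algorithms, but a standard first step in the dynamical-systems treatment it describes is to reconstruct a state space from a single-channel recording by time-delay (Takens) embedding. The sketch below is a generic illustration of that step, assuming only a NumPy array of samples; the synthetic trace is a stand-in, not real MEG data.

```python
# Sketch of a standard dynamical-systems step implied by the abstract
# (the thesis's exact methods are not specified here): reconstructing
# a state space from a single-channel recording by time-delay embedding.
import numpy as np

def delay_embed(signal: np.ndarray, dim: int, tau: int) -> np.ndarray:
    """Return the (n_points, dim) matrix of delay vectors
    [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]."""
    n = len(signal) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("signal too short for this dim/tau")
    return np.column_stack([signal[i * tau : i * tau + n] for i in range(dim)])

# Toy stand-in for a single-channel, unaveraged MEG trace.
rng = np.random.default_rng(0)
t = np.arange(5000)
x = np.sin(0.07 * t) + 0.3 * rng.standard_normal(len(t))

states = delay_embed(x, dim=5, tau=10)
print(states.shape)  # (4960, 5): one 5-dimensional state per time point
```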
Abstract:
This research examines and explains the links between safety culture and communication. Safety culture is a concept that has gained prominence in recent years, but there has been little applied research investigating the meaning of the concept in 'real life' settings. This research focused on a Train Operating Company undergoing change in the move towards privatisation. These changes were evident in the management of safety, the organisation of the industry and, internally, in the company's management. The Train Operating Company's management took steps to improve its safety culture and communications through the development of a cascade communication structure. The research framework employed a qualitative methodology in order to investigate the effect of the new system on safety culture. The research found that communications in the organisation failed to be effective for a number of reasons, including both cultural and logistical problems. The cultural problems related to a lack of trust in the organisation by the management and the workforce, the perception of communications as management propaganda, and asyntonic communications between those involved, whilst the logistical problems related to the inherent difficulties of communicating over a geographically distributed network. An organisational learning framework was used to explain the results. It is postulated that one of the principal reasons why change, either to the safety culture or to communications, did not occur was the organisation's inability to learn. The research has also shown the crucial importance of trust between the members of the organisation, as this was one of the fundamental reasons why the safety culture did not change and why safety management systems were not fully implemented. This is consistent with the notion of mutual trust in the HSC (1993) definition of safety culture. The research has thus highlighted the relevance of trust to safety culture and its importance for organisational change.
Abstract:
An international round robin study of the stability of fast pyrolysis bio-oil was undertaken, with fifteen laboratories in five countries contributing. Two bio-oil samples were distributed to the laboratories for stability testing and further analysis. The stability test was defined in a method provided with the bio-oil samples, with viscosity measurement as a key input: the change in viscosity of a sealed sample of bio-oil held for 24 h at 80 °C was the defining element of stability. Subsequent analyses included ultimate analysis, density, moisture, ash, filterable solids, TAN/pH determination, and gel permeation chromatography. The results showed that kinematic viscosity measurement was more widely conducted and more reproducibly performed than dynamic viscosity measurement. The variation in the results of the stability test was large, and a number of reasons for the variation were identified. The subsequent analyses proved to be at the level of reproducibility found in earlier round robins on bio-oil analysis. Clearly, the analyses were more straightforward and reproducible with a bio-oil sample low in filterable solids (0.2%) than with one with a higher (2%) solids loading. These results can be helpful in setting standards for the use of bio-oil, which is just coming into the marketplace. © 2012 American Chemical Society.
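The abstract defines stability through the change in viscosity of a sealed sample held for 24 h at 80 °C but does not give the reporting formula; a common convention, assumed here purely for illustration, is the relative (percent) change in kinematic viscosity:

```python
# Sketch of the stability index the test implies: relative change in
# kinematic viscosity after 24 h at 80 degC. The abstract does not give
# the exact reporting formula, so percent change is an assumption.
def viscosity_change_percent(nu_initial: float, nu_aged: float) -> float:
    """Percent increase in kinematic viscosity (e.g. cSt at 40 degC)
    of the aged sample relative to the fresh sample."""
    return 100.0 * (nu_aged - nu_initial) / nu_initial

# Hypothetical numbers for illustration only.
print(viscosity_change_percent(45.0, 63.0))  # 40.0 -> a less stable oil
```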
Abstract:
The article looks first into the nature of relations between Germany and the CEE countries a decade after the accession of the CEE countries to the EU. The relations are characterised as normalised and intensive, with diverse levels of closeness and co-operation reflecting conceptual and ideological compatibilities and differences. Next, the article focuses on the German attitude to the euro zone crisis. Germany has become a hegemon in the rescue effort aimed at the stabilisation and economic invigoration of the euro zone. However, German hegemony has developed by default, not by design: its leading position is linked with considerable political and financial costs. Germany moved to centre stage and took the position of a reluctant hegemon. However, the German role is contested internationally (it lacks the support of the French government in key areas) as well as internally (particularly by the Federal Constitutional Court and the Bundesbank). The article argues that the new situation makes German-CEE relations increasingly relevant for both sides. German leadership of an EU increasingly split along the north-south divide requires backing from the Northern group of countries, to which the CEE in general belongs. For a number of reasons, the CEE countries implement three distinct strategies of co-operation with Germany in European politics. Military co-operation, which has remained rather limited so far, may also receive new impetus, given the financial austerity. © 2013 The Regents of the University of California.
Abstract:
Timing jitter is a major factor limiting the performance of any high-speed, long-haul data transmission system. It arises for a number of reasons, such as interaction with accumulated amplified spontaneous emission (ASE), inter-symbol interference (ISI), and electrostriction. Some effects causing timing jitter can be reduced by means of nonlinear filtering, using, for example, a nonlinear optical loop mirror (NOLM) [1]. The NOLM has been shown to reduce timing jitter by suppressing ASE and by stabilising the pulse duration [2, 3]. In this paper, we investigate the dynamics of timing jitter in a 2R-regenerated system, nonlinearly guided by NOLMs, at bit rates of 10, 20, 40, and 80 Gbit/s. Transmission performance of an equivalent non-regenerated (generic) system is taken as a reference.
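The paper's simulation code is not given here, but timing jitter is conventionally quantified as the RMS deviation of pulse arrival times from their bit-slot centres; the sketch below illustrates that measure on synthetic arrivals at the 10 Gbit/s bit period (all numbers are toy values).

```python
# Minimal sketch of how timing jitter is commonly quantified: the
# standard deviation of pulse arrival times about their bit-slot
# centres. This is an illustration, not the paper's simulation code.
import numpy as np

def timing_jitter(arrival_times: np.ndarray, bit_period: float) -> float:
    """RMS deviation of pulse arrivals from the centre of each bit slot."""
    slots = np.round(arrival_times / bit_period)   # nearest slot index
    offsets = arrival_times - slots * bit_period   # deviation per pulse
    return float(np.std(offsets))

# Toy example: 10 Gbit/s => 100 ps bit period, 1 ps RMS jitter injected.
rng = np.random.default_rng(1)
T = 100e-12
arrivals = np.arange(1000) * T + rng.normal(0.0, 1e-12, 1000)
print(f"jitter = {timing_jitter(arrivals, T) * 1e12:.2f} ps")
```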
Abstract:
Despite Government investment in flood defence schemes, many properties remain at high risk of flooding, and a substantial portion of these properties are business establishments. Flooding can have serious consequences for businesses, including damage to property and stock, being out of business for a considerable period, and ultimately business failure. Recent flood events, such as those in 2007 and 2009 that affected many parts of the UK, have helped to establish the true costs of flooding to businesses. This greater understanding of the risks has heightened the need for business owners to adapt their businesses to the threat of future flooding. Government policy has now shifted away from investment in engineered flood defences towards encouraging the uptake of property-level flood resistance and resilience measures by businesses. However, implementing such adaptation strategies remains a challenge for a range of reasons. A review of the current state of property-level flood risk adaptation of UK businesses is presented, drawing on the extant literature. Barriers that may hinder the uptake of property-level adaptation by businesses are revealed, and drivers that may enhance uptake and effectively overcome these barriers are also discussed. It is concluded that the professions of the construction sector have the potential to contribute towards the adaptation of business properties, and thereby to the flood resilience of businesses at risk of flooding.
Abstract:
As traffic congestion worsens and new roadway construction is severely constrained by the limited availability of land, the high cost of land acquisition, and communities' opposition to the building of major roads, new solutions have to be sought to either make roadway use more efficient or reduce travel demand. There is general agreement that travel demand is affected by land use patterns. However, traditional aggregate four-step models, which are presently the prevailing modeling approach, assume when estimating trip generation that traffic conditions will not affect people's decisions on whether to make a trip. Existing survey data indicate, however, that trip rates differ across geographic areas. The reasons for such differences have not been carefully studied, and attempts to quantify the influence of land use on travel demand beyond employment, households, and their characteristics have been of limited use to the traditional four-step models. There may be a number of reasons for this: the representation of the influence of land use on travel demand is aggregate rather than explicit, and land use variables such as density and mix, and accessibility as measured by travel time and congestion, have not been adequately considered. This research employs the artificial neural network (ANN) technique to investigate the potential effects of land use and accessibility on trip productions. Sixty-two variables that may potentially influence trip production are studied, including demographic, socioeconomic, land use, and accessibility variables. Different ANN architectures are tested. Sensitivity analysis of the models shows that land use does have an effect on trip production, as does traffic condition. The ANN models are compared with linear regression models and cross-classification models using the same data. The results show that the ANN models outperform the linear regression and cross-classification models in terms of root mean square error (RMSE). Future work may focus on finding a representation of traffic condition based on existing network data and population data that would be available when the variables are needed for prediction.
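The thesis's exact ANN architectures are not specified in the abstract, so the following is only a hedged sketch of the kind of comparison it describes: a small feed-forward network against linear regression on 62 candidate predictors, scored by RMSE. The scikit-learn tooling and the synthetic data are assumptions, not the original setup.

```python
# Sketch of the kind of comparison the abstract describes: a small
# feed-forward ANN versus linear regression on trip-production data,
# scored by RMSE. Tooling (scikit-learn) and data are illustrative
# assumptions; the thesis's actual architectures are not given here.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 62))  # 62 candidate predictors, synthetic
y = 2.0 * X[:, 0] + np.tanh(X[:, 1]) + rng.normal(0.0, 0.5, 500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def rmse(model) -> float:
    """Fit on the training split and return test-set RMSE."""
    pred = model.fit(X_tr, y_tr).predict(X_te)
    return float(np.sqrt(np.mean((pred - y_te) ** 2)))

print("linear RMSE:", rmse(LinearRegression()))
print("ANN RMSE:   ", rmse(MLPRegressor(hidden_layer_sizes=(16,),
                                        max_iter=2000, random_state=0)))
```

On synthetic data with a nonlinear term like this, the ANN would be expected to edge out the linear model, which mirrors the direction of the result the abstract reports.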
Abstract:
The following paper examines Walter Benjamin’s reflection on the category of “redemption”, mainly developed in the theses On the Concept of History. To this end, we first try to reconstruct Benjamin’s critique of “fate” as it unfolds during the 1920s in the fields of law, economy and, especially, history. The critique of the expiatory logic of “fate” – developed in essays such as Fate and Character, Critique of Violence and Capitalism as Religion – then allows us to disclose the “dialectical” structure of redemption, whereby Benjamin mobilizes his earlier theory of knowledge against the doctrine of progress.
Abstract:
Jean-Luc Marion’s phenomenology of givenness constitutes one of the most outstanding attempts within contemporary philosophical thought to set up a universal theory of the phenomenologically given as a whole. The aim of the present study is to apply the main categories of this phenomenological theory of the gift to the singular type of phenomenon represented by the pure, indeterminate and anonymous being to which Emmanuel Levinas refers by the name of il y a (“there is”) in his early writings (and also subsequently). This means examining the multiple specific modes of givenness proper to the impersonal “there is”, as well as its paradoxical relationship with both the donor and the receiver of such a gift, in order to show the possibility of a “third way” of phenomenological investigation: a way equally distant from the traditional Western concept of Being as “stable presence” and from Levinas’ proposal to substitute ethics for ontology as “first philosophy”.
Abstract:
Principal attrition is a national problem, particularly in large urban school districts. Research confirms that schools serving high proportions of children living in poverty have the most difficulty attracting and retaining competent school leaders. Principals at the helm of high-poverty schools have a turnover rate higher than the national average of three to four years, and their schools have higher rates of teacher attrition. This leadership turnover has a fiscal impact on districts and negatively affects student achievement. Research identifies a myriad of reasons why administrators leave the role of principal: some leave the position for retirement; some exit because of the difficulty of the role and a lack of support; and some simply leave for other opportunities within and outside of the profession altogether. As expectations for both teacher and learner performance drive the national education agenda, understanding how to keep effective principals in their jobs is critical. This study examined the factors that principals in a large urban district identified as potentially affecting their decisions to stay in the position. The study utilized a multi-dimensional, web-based questionnaire to examine principals’ perceptions regarding contributing factors that impact tenure. Results indicated that:
• having a quality teaching staff and establishing a positive work-life balance were important stay factors for principals;
• having an effective supervisor and collegial support from other principals were helpful supports; and
• having adequate resources, time for long-term planning, and teacher support and resources were critical working conditions.
Taken together, these were the most frequently cited factors that would keep principals in their positions. The results were used to create a framework that may serve as a guide for addressing principal retention.
Abstract:
Background: Community participation has become an integral part of many areas of public policy over the last two decades. For a variety of reasons, ranging from concerns about social cohesion and unrest to perceived failings in public services, governments in the UK and elsewhere have turned to communities as both a site of intervention and a potential solution. In contemporary policy, the shift to community is exemplified by the UK Government’s Big Society/Localism agenda and the Scottish Government’s emphasis on Community Empowerment. Through such policies, communities have been increasingly encouraged to help themselves in various ways, to work with public agencies in reshaping services, and to become more engaged in the democratic process. These developments have led some theorists to argue that responsibilities are being shifted from the state onto communities, representing a new form of 'government through community' (Rose, 1996; Imrie and Raco, 2003). Despite this policy development, there is surprisingly little evidence which demonstrates the outcomes of the different forms of community participation. This study attempts to address this gap in two ways. Firstly, it explores the ways in which community participation policy in Scotland and England is playing out in practice. Secondly, it assesses the outcomes of different forms of community participation taking place within these broad policy contexts. Methodology: The study employs an innovative combination of the two main theory-based evaluation methodologies, Theories of Change (ToC) and Realist Evaluation (RE), building on ideas generated by earlier applications of each approach (Blamey and Mackenzie, 2007). ToC methodology is used to analyse the national policy frameworks and the general approach of community organisations in six case studies, three in Scotland and three in England. The local evidence from the community organisations’ theories of change is then used to analyse and critique the assumptions which underlie the Localism and Community Empowerment policies. Alongside this, across the six case studies, a RE approach is utilised to examine the specific mechanisms which operate to deliver outcomes from community participation processes, and to explore the contextual factors which influence their operation. Given the innovative methodological approach, the study also engages in some focused reflection on the practicality and usefulness of combining ToC and RE approaches. Findings: The case studies provide significant evidence of the outcomes that community organisations can deliver through directly providing services or facilities, and through influencing public services. Important contextual factors in both countries include particular strengths within communities and positive relationships with at least part of the local state, although this often exists in parallel with elements of conflict. Notably, this evidence suggests that the idea of responsibilisation needs to be examined in a more nuanced fashion, incorporating issues of risk and power, as well as the active agency of communities and the local state. Thus communities may sometimes willingly take on responsibility in return for power, although this may also engender significant risk, with the balance between these three elements being significantly mediated by local government.
The evidence also highlights the impacts of austerity on community participation, with cuts to local government budgets in particular increasing the degree of risk and responsibility for communities and reducing opportunities for power. Furthermore, the case studies demonstrate the importance of inequalities within and between communities, operating through a socio-economic gradient in community capacity. This has the potential to make community participation policy regressive, as more affluent communities are more able to take advantage of additional powers, and local authorities have less resource to support the capacity of more disadvantaged communities. For Localism in particular, the findings suggest that some of the ‘new community rights’ may provide opportunities for communities to gain power and generate positive social outcomes. However, the English case studies also highlight the substantial risks involved and the extent to which such opportunities are being undermined by austerity. The case studies suggest that cuts to local government budgets have the potential to undermine some aspects of Localism almost entirely, and that the very limited interest in inequalities means that Localism may be both ‘empowering the powerful’ (Hastings and Matthews, 2014) and further disempowering the powerless. For Community Empowerment, the study demonstrates the ways in which community organisations can gain power and deliver positive social outcomes within the broad policy framework. However, whilst Community Empowerment is ostensibly less regressive, there are still significant challenges to be addressed. In particular, the case studies highlight significant constraints on the notion that communities can ‘choose their own level of empowerment’, and the assumption of partnership working between communities and the local state needs to take into account the evidence of very mixed relationships in practice. Most importantly, whilst austerity has had more limited impacts on local government in Scotland so far, the projected cuts in this area may leave Community Empowerment vulnerable to the dangers of regressive impact highlighted for Localism. Methodologically, the study shows that ToC and RE can be practically applied together and that there may be significant benefits to the combination. ToC offers a productive framework for policy analysis, and combining this with data derived from local ToCs provides a powerful lens through which to examine and critique the aims and assumptions of national policy. ToC models also provide a useful framework within which to identify specific causal mechanisms using RE methodology, and, again, the data from local ToC work can enable significant learning about ‘what works for whom in what circumstances’ (Pawson and Tilley, 1997).
Abstract:
Objective: The purpose of this study was to develop and test the psychometric properties of a Mealtime Interaction Clinical Observation Tool (MICOT) that could be used to facilitate assessment and behavioural intervention in childhood feeding difficulties. Methods: Thematic analysis of four focus groups with feeding and behaviour experts identified the content and structure of the MICOT. Following refinement, inter-rater reliability was tested between three healthcare professionals. Results: Six themes were identified for the MICOT, which utilises a traffic-light system to identify areas of strength and areas for intervention. Despite poor inter-rater reliability, for which a number of reasons are postulated, some correlation between the psychologists’ ratings was evident. Healthcare professionals liked the tool and reported that it could have good clinical utility. Conclusion: The study provides a promising first version of a clinical observation tool that facilitates assessment and behavioural intervention in childhood feeding difficulties.
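The abstract does not state which inter-rater statistic was used; one common choice for categorical ratings such as the MICOT's traffic-light codes is Cohen's kappa computed pairwise across raters, sketched below with invented ratings purely for illustration.

```python
# Sketch of one common inter-rater reliability statistic: pairwise
# Cohen's kappa over traffic-light ratings. The abstract does not say
# which statistic the study used, and these ratings are invented.
from itertools import combinations
from sklearn.metrics import cohen_kappa_score

ratings = {  # hypothetical ratings of 8 mealtime items by 3 raters
    "rater_a": ["green", "amber", "red", "green", "amber", "green", "red", "amber"],
    "rater_b": ["green", "red",   "red", "amber", "amber", "green", "red", "green"],
    "rater_c": ["amber", "amber", "red", "green", "red",   "green", "red", "amber"],
}

# Kappa near 1 indicates strong agreement beyond chance; near 0, none.
for (name1, r1), (name2, r2) in combinations(ratings.items(), 2):
    print(f"{name1} vs {name2}: kappa = {cohen_kappa_score(r1, r2):.2f}")
```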