9 results for National Science Foundation (U.S.). Research Applied to National Needs Program.
in Aston University Research Archive
Abstract:
Risk and knowledge are two components of business management that have so far been studied almost independently. This is especially true where risk management (RM) is conceived mainly in financial terms, as, for example, in the financial institutions sector. Financial institutions are affected by internal and external changes, with consequent accommodation to new business models, new regulations and new global competition that includes large new players. These changes induce financial institutions to develop different methodologies for managing risk, such as the enterprise risk management (ERM) approach, in order to adopt a holistic view of risk management and, consequently, to deal with different types of risk, levels of risk appetite, and policies in risk management. However, the methodologies for analysing risk do not explicitly include knowledge management (KM). This research examines the potential relationships between KM and two RM concepts: perceived quality of risk control and perceived value of ERM. To fulfil the objective of identifying how KM concepts can have a positive influence on some RM concepts, a literature review of KM and its processes and of RM and its processes was performed. From this literature review, eight hypotheses were formulated and analysed using a classification into people, process and technology variables. The data for this research were gathered from a survey administered to risk management employees in financial institutions; 121 responses were analysed. The data were analysed using multivariate techniques, specifically stepwise regression analysis. The results showed that the perceived quality of risk control is significantly associated with four variables: perceived quality of risk knowledge sharing, perceived quality of communication among people, web channel functionality, and risk management information system functionality.
However, the relationships of the KM variables to the perceived value of ERM could not be identified because of the low explanatory power of the models describing these relationships. The analysis nevertheless reveals important insights into the potential for KM to support RM, such as: the better the adoption of KM people and technology actions, the better the perceived quality of risk control. Equally, the results suggest that the quality of risk control and the benefits of ERM follow different patterns, given that the two concepts are uncorrelated and the KM variables influence each of them differently. The ERM scenario is different from that of risk control because ERM, as a response to RM failures and to new regulation in financial institutions, has led organizations to adopt new processes, technologies, and governance models. Thus, the search for factors influencing the perceived value of ERM implementation needs additional analysis, because improvements in individual RM processes do not have the same effect on the perceived value of ERM. Based on these model results and the literature review, the basis of the ERKMAS (Enterprise Risk Knowledge Management System) is presented.
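The stepwise regression used in the study can be sketched in a few lines. Below is a minimal forward-selection example on synthetic survey-style data; the variable names mirror the four significant predictors reported above, but the data, coefficients, and AIC-based selection rule are illustrative assumptions, not the study's actual procedure or results.

```python
import numpy as np

def aic(rss, n, k):
    # Akaike information criterion for a Gaussian linear model with k predictors
    return n * np.log(rss / n) + 2 * (k + 1)

def forward_stepwise(X, y, names):
    """Greedy forward selection: repeatedly add the predictor that most lowers AIC."""
    n = len(y)
    selected, remaining = [], list(range(X.shape[1]))
    current = np.column_stack([np.ones(n)])          # start with intercept only
    beta, *_ = np.linalg.lstsq(current, y, rcond=None)
    best_aic = aic(np.sum((y - current @ beta) ** 2), n, 0)
    improved = True
    while improved and remaining:
        improved = False
        scores = []
        for j in remaining:
            cand = np.column_stack([current, X[:, j]])
            b, *_ = np.linalg.lstsq(cand, y, rcond=None)
            rss = np.sum((y - cand @ b) ** 2)
            scores.append((aic(rss, n, len(selected) + 1), j))
        score, j = min(scores)
        if score < best_aic:                         # keep the variable only if AIC improves
            best_aic, improved = score, True
            selected.append(j)
            remaining.remove(j)
            current = np.column_stack([current, X[:, j]])
    return [names[j] for j in selected]

# Synthetic survey-style data: only the first two predictors actually matter here
rng = np.random.default_rng(0)
X = rng.normal(size=(121, 4))                        # 121 responses, as in the study
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.3, size=121)
names = ["knowledge_sharing", "communication", "web_channel", "rmis"]
print(forward_stepwise(X, y, names))
```

With this synthetic data the procedure retains the two predictors that genuinely drive the response and discards noise variables, which is the behaviour a stepwise analysis of the survey data relies on.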
Abstract:
This paper is drawn from the use of data envelopment analysis (DEA) in helping a Portuguese bank to manage the performance of its branches. The bank wanted to set targets for the branches on variables such as growth in the number of clients and growth in funds deposited. Such variables can take positive and negative values but, with few exceptions, traditional DEA models have hitherto been restricted to non-negative data. We report on the development of a model to handle unrestricted data in a DEA framework and illustrate the use of this model on data from the bank concerned.
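One way such unrestricted data can be handled is with a range-directional, variable-returns-to-scale DEA model, in which each unit is projected toward the sample's ideal point; because the direction vector is built from data ranges, it stays non-negative even when inputs or outputs are negative. The sketch below is an illustration of that general idea, not necessarily the exact model the paper developed; the toy branch data are invented, and `scipy` is assumed for the linear programming step.

```python
import numpy as np
from scipy.optimize import linprog

def rdm_efficiency(X, Y, o):
    """Range-directional DEA (VRS) inefficiency for unit o.
    X: (n_units, n_inputs), Y: (n_units, n_outputs); entries may be negative.
    Returns beta >= 0; beta = 0 means unit o lies on the efficient frontier."""
    n = X.shape[0]
    Rx = X[o] - X.min(axis=0)              # input ranges (always >= 0)
    Ry = Y.max(axis=0) - Y[o]              # output ranges (always >= 0)
    # decision variables: [beta, lambda_1 .. lambda_n]
    c = np.r_[-1.0, np.zeros(n)]           # linprog minimises, so maximise beta via -beta
    A_ub = np.vstack([
        np.c_[Rx.reshape(-1, 1), X.T],     # sum(l*x) + beta*Rx <= x_o
        np.c_[Ry.reshape(-1, 1), -Y.T],    # -sum(l*y) + beta*Ry <= -y_o
    ])
    b_ub = np.r_[X[o], -Y[o]]
    A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)   # VRS: sum(lambda) = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1), method="highs")
    return -res.fun

# Toy branch data: one input (staff), one output (growth in funds, can be negative)
X = np.array([[3.0], [5.0], [4.0]])
Y = np.array([[-2.0], [6.0], [1.0]])
for o in range(3):
    print(o, round(rdm_efficiency(X, Y, o), 3))
```

Here branches 0 and 1 define the frontier (beta = 0), while branch 2 is inefficient even though one of the output values is negative, which a conventional DEA model could not process directly.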
Abstract:
People and their performance are key to an organization's effectiveness. This review describes an evidence-based framework of the links between some key organizational influences and staff performance, health and well-being. This preliminary framework integrates management and psychological approaches, with the aim of assisting future explanation, prediction and organizational change. Health care is taken as the focus of this review, as there are concerns internationally about health care effectiveness. The framework considers empirical evidence for links between the following organizational levels:
1. Context (organizational culture and inter-group relations; resources, including staffing; physical environment)
2. People management (HRM practices and strategies; job design, workload and teamwork; employee involvement and control over work; leadership and support)
3. Psychological consequences for employees (health and stress; satisfaction and commitment; knowledge, skills and motivation)
4. Employee behaviour (absenteeism and turnover; task and contextual performance; errors and near misses)
5. Organizational performance; patient care.
This review contributes to an evidence base for policies and practices of people management and performance management. Its usefulness will depend on future empirical research, using appropriate research designs, sufficient study power and measures that are reliable and valid.
Abstract:
The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
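The cumulative cubic cost model referenced above expresses cumulative project cost as a cubic function of elapsed time, which can be fitted by ordinary least squares. The sketch below fits such a cubic to a synthetic normalised cost profile and uses it to forecast mid-project spend; the profile and noise level are invented, and this illustrates the baseline cubic approach rather than the improved forecasting techniques the study developed.

```python
import numpy as np

# Synthetic normalised cost profile: proportion of total cost vs proportion of
# duration elapsed. The underlying curve 3t^2 - 2t^3 is a smooth S-curve
# passing through (0, 0) and (1, 1), typical of cumulative project spend.
t = np.linspace(0.1, 1.0, 10)
rng = np.random.default_rng(1)
observed = 3 * t**2 - 2 * t**3 + rng.normal(scale=0.01, size=t.size)

# Cubic-in-time model of cumulative cost, constrained through the origin:
# y = a*t + b*t^2 + c*t^3, fitted by linear least squares.
A = np.column_stack([t, t**2, t**3])
coef, *_ = np.linalg.lstsq(A, observed, rcond=None)

# Forecast cumulative cost at mid-duration from the fitted curve
t_mid = 0.5
forecast = float(coef @ np.array([t_mid, t_mid**2, t_mid**3]))
print(round(forecast, 3))
```

Fitting a fixed cubic form in this way is what imposes the "severe limitations" the abstract mentions: a single family of curves cannot follow every project's spend pattern, which is the gap the study's new forecasting techniques address.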
Abstract:
‘New Approach’ Directives now govern the health and safety of most products, whether destined for workplace or domestic use. These Directives have been enacted into UK law by various specific legislation relating principally to work equipment, machinery and consumer products. This research investigates whether the risk assessment approach used to ensure the safety of machinery may be applied to consumer products. Crucially, consumer products are subject to the Consumer Protection Act (CPA) 1987, which makes no direct reference to “assessing risk”. This contrasts with the law governing the safety of products used in the workplace, where risk assessment underpins the approach. New Approach Directives are supported by European harmonised standards and, in the case of machinery, further supported by the risk assessment standard EN 1050. The system regulating consumer product safety is discussed, its key elements identified and a graphical model produced. This model incorporates such matters as conformity assessment, the system of regulation, and near-miss and accident reporting. A key finding of the research is that New Approach Directives share a common feature of specifying essential performance requirements, which provide a hazard prompt-list that can form the basis for a risk assessment (the hazard identification stage). Drawing upon 272 prosecution cases, with thirty examples examined in detail, this research provides evidence that, despite the high degree of regulation, unsafe consumer products still find their way onto the market. The research presents a number of risk assessment tools to help Trading Standards Officers (TSOs) prioritise their work, both at the initial inspection stage and when dealing with subsequent enforcement action.
Abstract:
The Alborz Mountain range separates the northern part of Iran from the southern part. It also isolates a narrow coastal strip to the south of the Caspian Sea from the Central Iran plateau. Communication between the south and north until the 1950s was via two roads and one rail link. In 1963 work was completed on a major access road via the Haraz Valley (the most physically hostile area in the region). From the beginning the road was plagued by accidents resulting from unstable slopes on either side of the valley. Heavy casualties persuaded the government to undertake major engineering works to eliminate "black spots" and make the road safe. However, despite substantial and prolonged expenditure the problems were not solved, and casualties increased steadily with the growth of traffic using the road. Another road was built to bypass the Haraz road and opened to traffic in 1983. But closure of the Haraz road was still impossible because of the growth of settlements along the route and the need for access to other installations such as the Lar Dam. The aim of this research was to explore the possibility of applying Landsat MSS imagery to locating black spots along the road and the associated instability problems. Landsat data had not previously been applied to highway engineering problems in the study area. Aerial photographs are generally better than satellite images for detailed mapping, but Landsat images are superior for reconnaissance and adequate for mapping at the 1:250,000 scale. The broad overview and lack of distortion in the Landsat imagery make the images ideal for structural interpretation. The results of Landsat digital image analysis showed that certain rock types and structural features can be delineated and mapped. The most unstable areas, comprising steep slopes free of vegetation cover, can be identified using image processing techniques. Structural lineaments revealed by the image analysis led to improved results (delineation of unstable features).
Damavand Quaternary volcanics were found to be the dominant rock type along a 40 km stretch of the road. These rock types are inherently unstable and partly responsible for the difficulties along the road. For more detailed geological and morphological interpretation, a sample of small subscenes was selected and analysed. A specially developed image analysis package was designed at Aston for use on a non-specialised computing system. Using this package, a new method for image classification was developed, allowing accurate delineation of the critical features of the study area.
Abstract:
In this paper, we discuss some practical implications for implementing adaptable network algorithms applied to non-stationary time series problems. Two real world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters, to non-linear models which adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. The results of our experiments show that there are advantages to be gained in tracking real world non-stationary data through the use of more complex adaptive models.
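The RAN growth rule whose novelty criterion the paper probes can be sketched as follows: a new RBF unit is allocated only when the input is far from all existing centres and the prediction error is large, so the novelty thresholds directly control how fast the model order grows. This is a simplified, Platt-style illustration with invented parameter values, fixed widths, and plain LMS updates in place of the extended Kalman filter used in the paper.

```python
import numpy as np

class TinyRAN:
    """Minimal resource-allocating RBF network (Platt-style growth rule)."""
    def __init__(self, eps=0.5, e_min=0.1, width=0.5, lr=0.05):
        self.eps, self.e_min, self.width, self.lr = eps, e_min, width, lr
        self.centres, self.weights = [], []
        self.bias = 0.0

    def _phi(self, x):
        return np.array([np.exp(-np.sum((x - c) ** 2) / self.width ** 2)
                         for c in self.centres])

    def predict(self, x):
        if not self.centres:
            return self.bias
        return float(self._phi(x) @ np.array(self.weights) + self.bias)

    def observe(self, x, y):
        err = y - self.predict(x)
        dist = min((np.linalg.norm(x - c) for c in self.centres), default=np.inf)
        if dist > self.eps and abs(err) > self.e_min:
            self.centres.append(np.array(x, float))   # novelty: allocate a new unit
            self.weights.append(err)
        else:
            # otherwise adapt existing parameters (LMS here; the paper uses an EKF)
            if self.centres:
                phi = self._phi(x)
                self.weights = list(np.array(self.weights) + self.lr * err * phi)
            self.bias += self.lr * err

# Model order grown online on sin(x) for two distance thresholds eps:
rng = np.random.default_rng(2)
xs = rng.uniform(0, 2 * np.pi, size=200)
counts = {}
for eps in (0.3, 1.5):
    net = TinyRAN(eps=eps)
    for x in xs:
        net.observe(np.array([x]), np.sin(x))
    counts[eps] = len(net.centres)
print(counts)
```

Even in this toy setting the final model order depends strongly on the novelty threshold: the tight threshold allocates several times as many units as the loose one on identical data, mirroring the sensitivity reported in the paper.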
Abstract:
Potentiostatically induced current transients obtained on a range of reinforced concrete specimens were analysed to give estimates of the polarisation resistance and interfacial capacitance. The polarisation resistance was compared with the values obtained using more conventional DC methods of analysis and, while it was consistently lower, it was within the error normally attributed to the polarisation resistance method of corrosion rate determination. The interfacial capacitance values determined increased from 0.44 F m⁻² for passive steel (polarisation resistance of 132 Ω m²) to 26.5 F m⁻² for active steel (polarisation resistance of 0.34 Ω m²). This has a dominant effect on the time required for potentiostatically induced current transients to reach a steady state, with a longer time being required by actively corroding steel. By contrast, the potential decay time constants describing galvanostatically or coulostatically induced potential transients decrease with an increase in corrosion rate; values less than 25 s for active specimens and greater than 40 s for passive specimens were determined in this work. © 1997 Elsevier Science Ltd.
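The reported decay time constants are consistent with idealising the steel/concrete interface as a parallel RC element, whose decay time constant is τ ≈ Rp × C. The abstract does not state this model explicitly, so the check below is an assumption, but it uses only the resistance and capacitance values quoted above.

```python
# Decay time constant of a parallel RC interface model: tau = Rp * C
rp_passive, c_passive = 132.0, 0.44    # ohm m^2 and F m^-2, from the abstract
rp_active, c_active = 0.34, 26.5

tau_passive = rp_passive * c_passive   # ~58 s, consistent with "> 40 s" reported
tau_active = rp_active * c_active      # ~9 s, consistent with "< 25 s" reported
print(round(tau_passive, 1), round(tau_active, 2))
```

The product reproduces the reported trend: the passive specimen's time constant (~58 s) exceeds 40 s, while the active specimen's (~9 s) falls below 25 s, because the fall in Rp with corrosion outweighs the rise in C.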
Abstract:
This article uses a research project into the online conversations of sex offenders and the children they abuse to further the arguments for the acceptability of experimental work as a research tool for linguists. The research reported here contributes to the growing body of work within linguistics that has found experimental methods to be useful in answering questions about representation and constraints on linguistic expression (Hemforth 2013). The wider project examines online identity assumption in online paedophile activity and the policing of such activity, and involves dealing with the linguistic analysis of highly sensitive sexual grooming transcripts. Within the linguistics portion of the project, we examine theories of idiolect and identity through analysis of the ‘talk’ of perpetrators of online sexual abuse, and of the undercover officers that must assume alternative identities in order to investigate such crimes. The essential linguistic question in this article is methodological and concerns the applicability of experimental work to exploration of online identity and identity disguise. Although we touch on empirical questions, such as the sufficiency of linguistic description that will enable convincing identity disguise, we do not explore the experimental results in detail. In spite of the preference within a range of discourse analytical paradigms for ‘naturally occurring’ data, we argue that not only does the term prove conceptually problematic, but in certain contexts, and particularly in the applied forensic context described, a rejection of experimentally elicited data would limit the possible types and extent of analyses. Thus, it would restrict the contribution that academic linguistics can make in addressing a serious social problem.