38 results for many-objective problems
Abstract:
Purpose - To consider the role of technology in knowledge management in organizations, both actual and desired. Design/methodology/approach - Facilitated, computer-supported group workshops were conducted with 78 people from ten different organizations. The objective of each workshop was to review the current state of knowledge management in that organization and develop an action plan for the future. Findings - Only three organizations had adopted a strongly technology-based "solution" to knowledge management problems, and these followed three substantially different routes. There was a clear emphasis on the use of general information technology tools to support knowledge management activities, rather than the use of tools specific to knowledge management. Research limitations/implications - Further research is needed to help organizations make best use of generally available software such as intranets and e-mail for knowledge management. Many issues, especially human ones, relate to the implementation of any technology. Participation was restricted to organizations that wished to produce an action plan for knowledge management. The findings may therefore represent only "average" organizations, not the very best practice. Practical implications - Each organization must resolve four tensions: between the quantity and quality of information/knowledge, between centralized and decentralized organization, between head office and organizational knowledge, and between "push" and "pull" processes. Originality/value - Although it is the group rather than an individual that determines what counts as knowledge, hardly any previous studies of knowledge management have collected data in a group context.
Abstract:
Purpose – The purpose of this paper is to challenge the assumption that process losses of individuals working in teams are unavoidable. The paper aims to challenge this assumption on the basis of social identity theory and recent research. Design/methodology/approach – The approach adopted in this paper is to review the mainstream literature providing strong evidence for motivation problems of individuals working in groups. Based on more recent literature, innovative ways to overcome these problems are discussed. Findings – A social identity-based analysis and recent findings summarized in this paper show that social loafing can be overcome and that even motivation gains in group work can be expected when groups are important for the individual group members' self-concepts. Practical implications – The paper provides human resource professionals and front-line managers with suggestions as to how individual motivation and performance might be increased when working in teams. Originality/value – The paper contributes to the literature by challenging the existing approach to reducing social loafing, i.e. individualizing workers as much as possible, and proposes a team-based approach instead to overcome motivation problems.
Abstract:
This article provides an in-depth examination of how the existence of "informal" work practices in a particular location, namely the higher education system in Ukraine, impacts the everyday lives of a specific population group, namely students. The article provides a comprehensive overview of corruption in post-Soviet education systems, suggesting some historical and contemporary reasons for its current scale. In particular, the article focuses on students' experiences of corruption, first exploring the difficulties that many individuals face when trying to gain access to the higher education system, and second outlining students' experiences while progressing through the course. The article's concluding section examines the wider outcome of these processes for Ukrainian society as a whole, as well as the impacts on individual students.
Abstract:
Pyrolysis is one of several thermochemical technologies that convert solid biomass into more useful and valuable bio-fuels. Pyrolysis is thermal degradation in the complete or partial absence of oxygen. Under carefully controlled conditions, solid biomass can be converted to a liquid known as bio-oil in 75% yield on dry feed. Bio-oil can be used as a fuel but has the drawback of a high oxygen content due to the presence of a complex mixture of molecular fragments of cellulose, hemicellulose and lignin polymers. Bio-oil also presents a number of problems in use, including high initial viscosity, instability resulting in increased viscosity or phase separation, and high solids content. Much effort has been spent on upgrading bio-oil into a more usable liquid fuel, either by modifying the liquid or by major chemical and catalytic conversion to hydrocarbons. The primary objective of this work was to improve oil stability, which was explored in three ways. The first was to determine the effect of feed moisture content on bio-oil stability. The second was to try to improve bio-oil stability by partially oxygenated pyrolysis. The third was to improve stability by co-pyrolysis with methanol. The project was carried out on an existing laboratory pyrolysis reactor system, which suited the project well and required little redesign or modification. During the final stages of the project, it was found that the temperature of the condenser in the product collection system had a marked impact on pyrolysis liquid stability; this is discussed in this work and further recommendations are given. The quantity of water coming from the feedstock and the pyrolysis reaction is important to liquid stability. In the present work the feedstock moisture content was varied and pyrolysis experiments were carried out over a range of temperatures. The quality of the bio-oil produced was measured as water content, initial viscosity and stability. The results showed that moderate feedstock moisture (7.3-12.8%) led to more stable bio-oil. One of the drawbacks of bio-oil is its instability, due to the unstable oxygenated chemicals it contains. Catalytic hydrotreatment of the oil and zeolite cracking of pyrolysis vapour have been discussed by many researchers; these processes are intended to eliminate oxygen from the bio-oil. In this work an alternative approach, partially oxygenated pyrolysis, was introduced to reduce oil instability by oxidising the unstable oxygenated chemicals in the bio-oil. The results showed that liquid stability was improved by oxygen addition during the pyrolysis of beech wood, with an optimum air factor of about 0.09-0.15. Methanol as a post-production additive to bio-oil has been studied by many researchers, and the most effective results came from adding methanol to the oil just after production. Co-pyrolysis of spruce wood with methanol was undertaken in the present work, and it was found that methanol improved liquid stability as a co-pyrolysis solvent but was no more effective than when used as a post-production additive.
Abstract:
SINNMR (Sonically Induced Narrowing of the Nuclear Magnetic Resonance spectra of solids) is a novel technique that is being developed to enable the routine study of solids by nuclear magnetic resonance spectroscopy. SINNMR aims to narrow the broad resonances that are characteristic of solid-state NMR by inducing rapid incoherent motion of solid particles suspended in a support medium, using high-frequency ultrasound in the range 2-10 MHz. The width of the normally broad resonances from solids is due to incomplete averaging of several components of the total spin Hamiltonian, caused by the restrictions placed on molecular motion within a solid. At present Magic Angle Spinning (MAS) NMR is the classical solid-state technique used to reduce line broadening, but this has associated problems, not least of which is the appearance of many spinning sidebands that confuse the spectra. It is hoped that SINNMR will offer a simple alternative, particularly as it does not produce spinning sidebands. The fundamental question of whether the use of ultrasound within a cryo-magnet will cause quenching has been investigated with success: even under the most extreme conditions of power, frequency and irradiation time, the magnet does not quench. The objective of this work is to design and construct a SINNMR probe for use in a superconducting cryo-magnet NMR spectrometer. A cell for such a probe has been constructed and incorporated into an adapted high-resolution broadband probe. It has been shown that the cell is capable of causing cavitation, up to 10 MHz, by running a series of ultrasonic reactions within it and observing the reaction products. It was found that the ultrasound was heating the sample to unacceptable temperatures, and this necessitated the incorporation of temperature stabilisation devices. Work has been performed on the narrowing of the solid-state 23Na spectrum of tri-sodium phosphate using high-frequency ultrasound. Work has also been completed on the signal enhancement and T1 reduction of a liquid mixture and a pure compound using ultrasound. Some preliminary "bench" experiments have been completed on a novel ultrasonic device designed to help minimise sample heating. The concept involves passing the ultrasound through a temperature-stabilised, liquid-filled funnel with a drum skin on the end that enables the passage of ultrasound into the sample. Bench experiments have shown that acoustic attenuation is low and that cavitation in the liquid beyond the device is still possible.
Abstract:
A method has been constructed for the solution of a wide range of chemical plant simulation models, including differential equations and optimization. Double orthogonal collocation on finite elements is applied to convert the model into an NLP problem that is solved either by the VF13AD package, based on successive quadratic programming, or by the GRG2 package, based on the generalized reduced gradient method. This approach is termed the simultaneous optimization and solution strategy. The objective functional can contain integral terms. The state and control variables can have time delays. Equalities and inequalities containing state and control variables can be included in the model, as well as algebraic equations and inequalities. The maximum number of independent variables is 2; problems containing 3 independent variables can be transformed into problems having 2 independent variables using finite differencing. The maximum number of NLP variables and constraints is 1500. The method is also suitable for solving ordinary and partial differential equations. The state functions are approximated by a linear combination of Lagrange interpolation polynomials. The control function can either be approximated by a linear combination of Lagrange interpolation polynomials or by a piecewise constant function over finite elements. The number of internal collocation points can vary between finite elements. The residual error is evaluated at arbitrarily chosen equidistant grid-points, enabling the user to check the accuracy of the solution between the collocation points, at which the residual is zero by construction. The solution functions can be tabulated. There is an option to use control vector parameterization to solve optimization problems containing initial value ordinary differential equations; this approach should be used when there are many differential equations or when the upper integration limit is to be selected optimally. The portability of the package has been addressed by converting it from VAX FORTRAN 77 to IBM PC FORTRAN 77 and to SUN SPARC 2000 FORTRAN 77. Computer runs have shown that the method can reproduce optimization problems published in the literature. The GRG2 and VF13AD packages, integrated into the optimization package, proved to be robust and reliable. The package contains an executive module, a module performing control vector parameterization and 2 nonlinear problem solver modules, GRG2 and VF13AD. There is a stand-alone module that converts the differential-algebraic optimization problem into a nonlinear programming problem.
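To illustrate the general idea behind the simultaneous optimization and solution strategy described above, the sketch below sets up a toy optimal control problem (minimise the integral of x^2 + u^2 subject to dx/dt = u, x(0) = 1) by collocating the state and control on Lagrange interpolation polynomials and handing the resulting NLP to an SQP solver. This is a minimal illustration only, not the thesis package: it uses SciPy's SLSQP in place of VF13AD/GRG2, a single finite element, and trapezoidal quadrature, all of which are assumptions made here for brevity.

import numpy as np
from scipy.optimize import minimize

# Single finite element on [0, 1]: the first node carries the initial
# condition, the interior nodes are shifted Gauss points, the last node
# is the element endpoint.
tau = np.array([0.0, 0.1127016654, 0.5, 0.8872983346, 1.0])
n = len(tau)

def lagrange_deriv_matrix(t):
    """D[i, j] = derivative of the j-th Lagrange basis polynomial at node t[i]."""
    m = len(t)
    D = np.zeros((m, m))
    for j in range(m):
        for i in range(m):
            if i == j:
                D[i, j] = sum(1.0 / (t[i] - t[k]) for k in range(m) if k != j)
            else:
                num = np.prod([t[i] - t[k] for k in range(m) if k not in (i, j)])
                den = np.prod([t[j] - t[k] for k in range(m) if k != j])
                D[i, j] = num / den
    return D

D = lagrange_deriv_matrix(tau)

# Trapezoidal quadrature weights over the nodes (stands in for the
# integral term of the objective functional).
w = np.zeros(n)
w[:-1] += np.diff(tau) / 2.0
w[1:] += np.diff(tau) / 2.0

def objective(z):
    x, u = z[:n], z[n:]
    return w @ (x**2 + u**2)

def defects(z):
    # Collocation residuals of dx/dt = u at every node after the first,
    # plus the initial condition x(0) = 1.
    x, u = z[:n], z[n:]
    return np.concatenate(([x[0] - 1.0], (D @ x - u)[1:]))

result = minimize(objective, np.zeros(2 * n), method="SLSQP",
                  constraints={"type": "eq", "fun": defects})
x_opt, u_opt = result.x[:n], result.x[n:]
print("states:", x_opt)
print("controls:", u_opt)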
Abstract:
This study aims to investigate to what extent the views of the managers of the enterprises to be privatized are a barrier to the smooth implementation of privatization, as opposed to other problems. Accordingly, the research tackles two main issues: identification and analysis of the major problems encountered in the implementation of the Egyptian privatization programme, and at which level these problems exist, while proposing different approaches to tackle them; and the views of public sector top and middle-level managers regarding the main issues of privatization. The study relies upon a literature survey, interviews with stakeholders, a survey of managers' attitudes and several illustrative case studies. A model of "good practice" for the smooth and effective implementation of privatization has been designed. Practice in Egypt has then been studied and compared with the "good practice" model. A lack of strictness and firmness in implementing the announced privatization programme has been found to be a characteristic of Egyptian practice. This is partly attributable to the inadequacy of the programme and partly to the different obstacles to implementation. The main obstacles are the doubtful desirability of privatization on the part of managers at different implementation levels, resistance of stakeholders, inadequacy of the legal framework governing privatization, redundant labour, lack of an efficient monitoring system allowing for accountability, inefficient marketing of privatization, ineffective communication, insufficient information at different levels, and problems related to valuation and selling procedures. A large part of the thesis is concerned with SOE (State Owned Enterprise) managers' attitudes towards, and understanding of, privatization (appraised through surveys). Although most managers have stated their acceptance of privatization, many of their responses show that they do not accept selling SOEs. They understand privatization to include enterprise reform and restructuring, changing procedures and giving more authority to company executives, but not necessarily selling SOEs. The majority of managers still see many issues that have to be addressed for smooth implementation of privatization, e.g. insufficiency of information, incompleteness of the legal framework, restructuring and labour problems. The main contribution to knowledge of this thesis is the study of the problems of implementing privatization in developing countries, especially managers' resistance to privatization as a major change, partly because of the threat it poses and partly because of a lack of understanding of privatization and of the implications of operating private businesses. A programme of persuading managers and offsetting the unfavourable effects is recommended as an outcome of the study. Five keywords for the national Index to Theses are: Egypt, privatization, implementation of privatization, problems of implementing privatization and managers' attitudes towards privatization.
Abstract:
Many manufacturing companies have long endured the problems associated with the presence of 'islands of automation'. Due to rapid computerisation, 'islands' such as Computer-Aided Design (CAD), Computer-Aided Manufacturing (CAM), Flexible Manufacturing Systems (FMS) and Material Requirements Planning (MRP) have emerged, and with a lack of co-ordination, often lead to inefficient performance of the overall system. The main objective of Computer-Integrated Manufacturing (CIM) technology is to form a cohesive network between these islands. Unfortunately, a commonly used approach, the centralised system approach, has imposed major technical constraints and design complications on development strategies. As a consequence, small companies have experienced difficulties in participating in CIM technology. The research described in this thesis has aimed to examine alternative approaches to CIM system design. Through research and experimentation, the cellular system approach, which has existed in the form of manufacturing layouts, has been found to simplify the complexity of an integrated manufacturing system, leading to better control and far higher system flexibility. Based on the cellular principle, some central management functions have also been distributed to smaller cells within the system; this concept is known as distributed planning and control. Through the development of an embryo cellular CIM system, the influence of both the cellular principle and the distribution methodology has been evaluated. Based on the evidence obtained, it has been concluded that the distributed planning and control methodology can greatly enhance cellular features within an integrated system. Both the cellular system approach and the distributed control concept will therefore make significant contributions to the design of future CIM systems, particularly systems designed with respect to small company requirements.
Abstract:
Particle impacts are of fundamental importance in many areas and there has been a renewed interest in research on particle impact problems. A comprehensive investigation of particle impact problems, using finite element (FE) methods, is presented in this thesis. The capability of FE procedures for modelling particle impacts is demonstrated by excellent agreement between FE analysis results and previous theoretical, experimental and numerical results. For normal impacts of elastic particles, it is found that the energy loss due to stress wave propagation is negligible if the wave can reflect more than three times during the impact, for which Hertz theory provides a good prediction of impact behaviour provided that the contact deformation is sufficiently small. For normal impacts of plastic particles, the energy loss due to stress wave propagation is also generally negligible, so that the energy loss is mainly due to plastic deformation. Finite-deformation plastic impact is addressed in this thesis, so that plastic impacts can be categorised into elastic-plastic impact and finite-deformation plastic impact. Criteria for the onset of finite-deformation plastic impact are proposed in terms of impact velocity and material properties. It is found that the coefficient of restitution depends mainly upon the ratio of impact velocity to yield velocity, Vni/Vy0, for elastic-plastic impacts, but is proportional to [(Vni/Vy0)(Y/E*)]^(-1/2) for finite-deformation plastic impacts, where Y/E* is the representative yield strain. A theoretical model for elastic-plastic impacts is also developed and compares favourably with FEA and previous experimental results. The effect of work hardening is also investigated.
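For clarity, the proportionality quoted above for finite-deformation plastic impacts can be written as a display equation (symbols as used in the abstract; interpreting Vy0 as the velocity at which yield initiates and Y/E* as the representative yield strain):

e \;\propto\; \left[\,\frac{V_{ni}}{V_{y0}}\cdot\frac{Y}{E^{*}}\,\right]^{-1/2}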
Abstract:
This study is concerned with quality and productivity aspects of traditional house building. The research focuses on these issues by concentrating on the services and finishing stages of the building process; these are work stages which have not been fully investigated in previous productivity-related studies. The primary objective of the research is to promote an integrated design and construction led approach to traditional house building, based on an original concept of 'development cycles'. This process involves the following: site monitoring; the analysis of work operations; implementing design and construction changes founded on unique information collected during site monitoring; and subsequent re-monitoring to measure and assess the effect of change. A volume house building firm has been involved in this applied research and has allowed access to its sites for production monitoring purposes. The firm also assisted in design detailing for a small group of 'experimental' production houses where various design and construction changes were implemented. Results from the collaborative research have shown certain quality and productivity improvements to be possible using this approach, albeit on a limited scale at this early experimental stage. The improvements have been possible because an improved activity sampling technique, developed for and employed by the study, has been able to describe why many quality- and productivity-related problems occur during site building work. Experience derived from the research has shown the following attributes to be important: positive attitudes towards innovation; effective communication; careful planning and organisation; and good coordination and control at site level. These are all essential aspects of quality-led management and determine to a large extent the overall success of this approach. Future work recommendations must include a more widespread use of innovative practices so that further design and construction modifications can be made. By doing this, productivity can be improved, cost savings made and better quality afforded.
Abstract:
This study concerns the application of a model of effective interpersonal relationships to problems arising from staff assessment at I.C.I. Ltd. Corporate Laboratory between 1972 and 1974. In collaboration with academic and industrial supervisors, the study commenced with a survey of management and supervisor opinions about the effectiveness of current staff (work) relationships, with particular reference to the problem of recognising and developing creative potential. This survey emphasised a need to improve the relationships between staff in the staff assessment context. A survey of research into creativity emphasised the importance of the interpersonal environment for obtaining creative behaviour in an organisational context. A further survey of theories of how interpersonal behaviour relates to personal creativity (therapeutic psychology) provided a model of effective interpersonal behaviour (Carkhuff, 1969) that could be applied to the organisational context of staff assessment. The objective of the project was redefined as a need to improve the conditions of interpersonal behaviour in relation to certain (career development) problems arising from staff assessment practices. In order to demonstrate the application of the model of effective interpersonal behaviour, the research student recorded interviews between himself and members of staff designed to develop and operate the dimensions of the model. Different samples of staff were used to develop the 'facilitative' and the 'action-oriented' dimensions of behaviour, and then for the operation of a helping programme (based on vocational guidance tests). These interactions have been analysed according to the scales of measurement in the model, and the results are presented in case study form in this thesis. At each stage of the project, results and conclusions were presented to the sponsoring organisation (e.g. the industrial supervisor) in order to assess their (subjective) opinion of relevance to the organisation. Finally, recommendations on further actions towards general improvement of work relationships in the laboratory were presented in a brief report to the sponsor.
Abstract:
Objective - To review and summarise published data on medication errors in older people with mental health problems. Methods - A systematic review was conducted to identify studies that investigated medication errors in older people with mental health problems. MEDLINE, EMBASE, PHARMLINE, COCHRANE COLLABORATION and PsycINFO were searched electronically. Any studies identified were scrutinized for further references. The title, abstract or full text of each was systematically reviewed for relevance. Results - Data were extracted from eight studies. In total, information about 728 errors (459 administration, 248 prescribing, 7 dispensing, 12 transcribing, 2 unclassified) was available. The dataset related almost exclusively to inpatients, frequently involved non-psychotropics, and the majority of the errors were not serious. Conclusions - Due to methodological issues it was impossible to calculate overall error rates. Future research should concentrate on serious errors within community settings, and clarify potential risk factors.
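As a quick worked breakdown of the error counts quoted above (the counts are taken directly from the abstract; the snippet itself is purely illustrative):

# Breakdown of the 728 medication errors reported across the eight studies.
errors = {"administration": 459, "prescribing": 248, "dispensing": 7,
          "transcribing": 12, "unclassified": 2}
total = sum(errors.values())  # 728
for category, count in errors.items():
    print(f"{category}: {count} ({100 * count / total:.1f}%)")
# Administration errors alone account for roughly 63% of the total.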
Abstract:
The traditional method of classifying neurodegenerative diseases is based on the original clinico-pathological concept, supported by 'consensus' criteria and data from molecular pathological studies. This review discusses, first, current problems in classification resulting from the coexistence of different classificatory schemes, the presence of disease heterogeneity and multiple pathologies, the use of 'signature' brain lesions in diagnosis, and the existence of pathological processes common to different diseases. Second, three models of neurodegenerative disease are proposed: (1) that distinct diseases exist ('discrete' model), (2) that relatively distinct diseases exist but exhibit overlapping features ('overlap' model), and (3) that distinct diseases do not exist and neurodegenerative disease is a 'continuum' in which there is continuous variation in clinical/pathological features from one case to another ('continuum' model). Third, to distinguish between these models, the distribution of the most important molecular 'signature' lesions across the different diseases is reviewed. Such lesions often have poor 'fidelity', i.e., they are not unique to individual disorders but are distributed across many diseases, consistent with the overlap or continuum models. Fourth, the question of whether the current classificatory system should be rejected is considered, and three alternatives are proposed, viz., objective classification, classification for convenience (a 'dissection'), or analysis as a continuum.
Abstract:
Differential clinical diagnosis of the parkinsonian syndromes, viz., Parkinson's disease (PD), progressive supranuclear palsy (PSP), dementia with Lewy bodies (DLB), and multiple system atrophy (MSA), can be difficult. Eye movement problems, however, are a chronic complication of many of these disorders and may be a useful aid to diagnosis. Hence, the presence in PSP of vertical supranuclear gaze palsy, fixation instability, lid retraction, blepharospasm, and apraxia of eyelid opening and closing is useful in separating PD from PSP. Moreover, atypical features of PSP include slowing of upward saccades, moderate slowing of downward saccades, the presence of a full range of voluntary vertical eye movements, a curved trajectory of oblique saccades, and absence of square-wave jerks. Downgaze palsy is probably the most useful diagnostic clinical symptom of PSP. By contrast, DLB patients are specifically impaired in both reflexive and saccadic execution and in the performance of more complex saccadic eye movement tasks. Problems in convergence in DLB are also followed by akinesia and rigidity. Abnormal ocular fixation may occur in a significant proportion of MSA patients, along with excessive square-wave jerks, a mild supranuclear gaze palsy, gaze-evoked nystagmus, positioning down-beat nystagmus, mild-to-moderate saccadic hypometria, impaired smooth pursuit movements, and reduced vestibulo-ocular reflex (VOR) suppression. There may be considerable overlap between the eye movement problems characteristic of the various parkinsonian disorders, but, taken together with other signs and symptoms, they can be a useful aid in differential diagnosis, especially in the separation of PD and PSP.
Abstract:
Visual field assessment is a core component of glaucoma diagnosis and monitoring, and the Standard Automated Perimetry (SAP) test is still considered the gold standard of visual field assessment. Although SAP is a subjective assessment with many pitfalls, it is used routinely in the diagnosis of visual field loss in glaucoma. The multifocal visual evoked potential (mfVEP) is a newly introduced method for assessing the visual field objectively. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to those found by standard SAP visual field assessment, while others were less informative and needed further adjustment and research. In this study, we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. OBJECTIVES: The purpose of this study is to examine the effectiveness of a new analysis method for the multifocal visual evoked potential (mfVEP) when it is used for the objective assessment of the visual field in glaucoma patients, compared with the gold standard technique. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey visual field (HFA 24-2) tests and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was performed using the new analysis protocol, the Hemifield Sector Analysis (HSA) protocol; analysis of the HFA was performed using the standard grading system. RESULTS: Analysis of the mfVEP results showed a statistically significant difference between the three groups in the mean signal-to-noise ratio (SNR) (ANOVA, p<0.001, 95% CI). Differences between the superior and inferior hemifields were statistically significant in all 11/11 sectors in the glaucoma patient group (t-test, p<0.001), partially significant in 5/11 sectors in the glaucoma suspect group (t-test, p<0.01), and not significant for most sectors in the normal group (only 1/11 sectors significant; t-test, p<0.9). The sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86% respectively, while for glaucoma suspects they were 89% and 79%. DISCUSSION: The results showed that the new analysis protocol was able to confirm existing field defects detected by standard HFA and to differentiate between the three study groups, with a clear distinction between normal subjects and patients with suspected glaucoma; the distinction between normal subjects and glaucoma patients was especially clear and significant. CONCLUSION: The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. The sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes associated with glaucomatous field loss.
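The following sketch illustrates, in outline, the kind of computation the Hemifield Sector Analysis (HSA) protocol implies: paired comparison of signal-to-noise ratios between matching superior and inferior hemifield sectors, followed by standard sensitivity/specificity calculations against the clinical (HFA) classification. Function names, data layout and the significance threshold are assumptions for illustration, not the study's actual implementation.

import numpy as np
from scipy import stats

def asymmetric_sector_count(snr_superior, snr_inferior, alpha=0.01):
    """Count how many of the 11 sector pairs show a significant
    superior/inferior SNR difference (paired t-test across subjects).
    snr_superior, snr_inferior: arrays of shape (n_subjects, 11)."""
    significant = 0
    for s in range(snr_superior.shape[1]):
        _, p = stats.ttest_rel(snr_superior[:, s], snr_inferior[:, s])
        if p < alpha:
            significant += 1
    return significant

def sensitivity_specificity(test_positive, disease_present):
    """Standard 2x2 confusion-table definitions behind figures such as the
    97% sensitivity / 86% specificity quoted in the abstract.
    Both arguments are boolean arrays over subjects."""
    tp = np.sum(test_positive & disease_present)
    fn = np.sum(~test_positive & disease_present)
    tn = np.sum(~test_positive & ~disease_present)
    fp = np.sum(test_positive & ~disease_present)
    return tp / (tp + fn), tn / (tn + fp)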