833 results for Work processes
Abstract:
The aim of this work was to develop a generic methodology for evaluating and selecting, at the conceptual design phase of a project, the best process technology for natural gas conditioning. A generic approach would be simple, require less time, and give a better understanding of why one process is to be preferred over another. Such a methodology would be useful in evaluating existing, novel and hybrid technologies. However, to date no information is available in the published literature on such a generic approach to gas processing. It is believed that the generic methodology presented here is the first available for choosing the best or cheapest method of separation for natural gas dew-point control. Process cost data are derived from evaluations carried out by the vendors. These evaluations are then modelled using a steady-state simulation package, and from the results of the modelling the cost data received are correlated with the design or sizing parameters. This allows comparisons between different process systems to be made in terms of the overall process. The generic methodology is based on the concept of a Comparative Separation Cost, which takes into account the efficiency of each process, the value of its products, and the associated costs. To illustrate the general applicability of the methodology, three different cases suggested by BP Exploration are evaluated. This work has shown that it is possible to identify the most competitive process operations at the conceptual design phase and to illustrate why one process has an advantage over another. Furthermore, the same methodology has been used to identify and evaluate hybrid processes; it has been determined here that in some cases they offer substantial advantages over the separate process techniques.
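The Comparative Separation Cost lends itself to a short numerical sketch. The formula below is a hypothetical reading of the concept (annualised cost divided by efficiency-weighted product value); the thesis' actual definition, cost data and process names are not reproduced here, so all figures are invented for illustration.

```python
# Hypothetical sketch of a Comparative Separation Cost (CSC) ranking.
# Assumes CSC = annualised cost / (separation efficiency * product value);
# the thesis' own definition and vendor cost data are not used here.

def comparative_separation_cost(capex, opex, efficiency, product_value,
                                annualisation=0.15):
    """Lower CSC = more competitive process (illustrative only)."""
    annual_cost = capex * annualisation + opex
    return annual_cost / (efficiency * product_value)

# Invented candidate processes for a dew-point control duty.
processes = {
    "Joule-Thomson":      comparative_separation_cost(5e6, 1.2e6, 0.85, 8e6),
    "turbo-expander":     comparative_separation_cost(8e6, 0.9e6, 0.95, 8e6),
    "membrane/JT hybrid": comparative_separation_cost(6e6, 1.0e6, 0.90, 8e6),
}
best = min(processes, key=processes.get)
```

With these invented numbers the hybrid ranks best, echoing the abstract's finding that hybrid schemes can outperform the separate techniques.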
Abstract:
The aim of this work was to synthesise a series of hydrophilic derivatives of cis-1,2-dihydroxy-3,5-cyclohexadiene (cis-DHCD) and copolymerise them with 2-hydroxyethyl methacrylate (HEMA), to produce a completely new range of hydrogel materials. It is theorised that hydrogels incorporating such derivatives of cis-DHCD will exhibit good strength and elasticity in addition to good water-binding ability. The synthesis of derivatives was attempted by both enzymatic and chemical methods. Enzymatic synthesis involved the transesterification of cis-DHCD with a number of trichloro- and trifluoroethyl esters, using porcine pancreatic lipase to catalyse the reaction in organic solvent. Cyclohexanol was used in initial studies to assess the viability of enzyme-catalysed reactions. Chemical synthesis involved the epoxidation of a number of unsaturated carboxylic acids and the subsequent reaction of these epoxy acids with cis-DHCD in DCC/DMAP-catalysed esterifications. The silylation of cis-DHCD using TBDCS and BSA was also studied. The rate of aromatisation of cis-DHCD at room temperature was studied in order to assess its stability, and 1H NMR studies were undertaken to determine the conformations adopted by derivatives of cis-DHCD. The copolymerisation of diepoxybutanoate, diepoxyundecanoate, dibutenoate and silyl-protected derivatives of cis-DHCD with HEMA, to produce a new group of hydrogels, was investigated. The EWC and mechanical properties of these hydrogels were measured, and DSC was used to determine the amount of freezing and non-freezing water in the membranes. The effect on EWC of opening the epoxide rings of the comonomers was also investigated.
Abstract:
Clinical dextran is used as a blood volume expander. The British Pharmacopeia (BP) specification for this product requires the amount of dextran below 12,000 MW and above 98,000 MW to be strictly controlled. Dextran is presently fractionated industrially using ethanol precipitation. The aim of this work was to develop an ultrafiltration system which could replace the present industrial process. Initially these molecular weight (MW) bands were removed using batch ultrafiltration. A large number of membranes were tested. The correct BP specification could be achieved using these membranes but there was a significant loss of saleable material. To overcome this problem a four stage ultrafiltration cascade (UFC) was used. This work is the first known example of a UFC being used to remove both the high and low MW dextran. To remove the high MW material it was necessary to remove 90% of the MW distribution and retain the remaining 10%. The UFC significantly reduced the amount of dialysate required. To achieve the correct specification below 12,000 MW, the UFC required only 2.5 - 3.0 diavolumes while the batch system required 6 - 7. The UFC also improved the efficiency of the fractionation process. The UFC could retain up to 96% of the high MW material while the batch system could only retain 82.5% using the same number of diavolumes. On average the UFC efficiency was approximately 10% better than the equivalent batch system. The UFC was found to be more predictable than the industrial process and the specification of the final product was easier to control. The UFC can be used to improve the fractionation of any polymer and also has several other potential uses including enzyme purification. A dextransucrase bioreactor was also developed. This preliminary investigation highlighted the problems involved with the development of a successful bioreactor for this enzyme system.
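The diavolume counts quoted above can be related through the standard constant-volume diafiltration washout equation. The sketch below uses that textbook relation with illustrative sieving coefficients, not the thesis' membrane data:

```python
import math

# Constant-volume diafiltration washout: the fraction of a permeating
# solute left after N diavolumes is exp(-N * S), where S is the observed
# sieving coefficient (S = 1 for a solute that passes freely).
# Numbers below are illustrative, not the thesis' membrane data.

def fraction_remaining(n_diavolumes, sieving=1.0):
    return math.exp(-n_diavolumes * sieving)

# With an ideal membrane, ~4.6 diavolumes wash out 99% of the low-MW
# fraction; real sieving coefficients below 1 push this number up, which
# is one reason the batch system needed 6-7 diavolumes to reach spec.
low_mw_left_ideal = fraction_remaining(4.6)               # ~1% remaining
low_mw_left_batch = fraction_remaining(6.5, sieving=0.7)  # ~6-7 diavolumes
```

A multi-stage cascade improves on the batch figures because each stage re-processes the retentate, so the same specification is reached with fewer diavolumes of dialysate overall.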
Abstract:
Tear component deposition onto contact lenses is termed 'spoilation' and occurs through the interaction of synthetic polymers with their biological fluid environment. Spoilation phenomena alter the physico-chemical properties of hydrophilic contact lenses, diminishing the optical properties of the lens and causing discomfort and complications for the wearer. Eventually these alterations render the lens unwearable. The primary aim of this interdisciplinary study was to develop analytical techniques capable of analysing the minute quantities of biological deposition involved, in particular the lipid fraction. Prior to this work such techniques were unavailable for single contact lenses. It is envisaged that these investigations will further the understanding of this biological interfacial conversion. Two main analytical techniques were developed: a high performance liquid chromatography (HPLC) technique and fluorescence spectrofluorimetry. The HPLC method allows analysis of a single contact lens and provided previously unavailable information about variations in the lipid profiles of deposited contact lenses and patient tear films. Fluorescence spectrofluorimetry is a sensitive, non-destructive technique for observing changes in the fluorescence intensity of biological components on contact lenses. The progression of tear material deposition can be monitored and assessed for both in vivo and in vitro spoiled lenses using this technique. An improved in vitro model which is comparable to tears and chemically mimics ocular spoilation was also developed. This model allows the controlled study of extrinsic factors and hydrogel compositions. These studies show that unsaturated tear lipids, probably unsaturated fatty acids, are involved in the interfacial conversion of hydrogel lenses, rendering them incompatible with the ocular microenvironment. Lipid interaction with the lens surface then facilitates secondary deposition of other tear components.
Interaction, exchange and immobilisation (by polymerisation) of the lipid layer appear to occur before the final and rapid growth of more complex, insoluble discrete deposits, sometimes called 'white spots'.
Abstract:
Prior research suggests management can employ cognitively demanding job attributes to promote employee creativity. However, it is not clear what specific type of cognitive demand is particularly important for creativity, what processes underpin the relationship between demanding job conditions and creativity, and what factors lead to employee perceptions of demanding job attributes. This research sets out to address these issues by examining: (i) problem-solving demand (PSD), a specific type of cognitive demand, and the processes that link PSD to creativity, and (ii) antecedents of PSD. Based on social cognitive theory, PSD was hypothesized to be positively related to creativity through the motivational mechanism of creative self-efficacy. However, the relationship between PSD and creative self-efficacy was hypothesized to be contingent on levels of intrinsic motivation. The social information processing perspective and the job crafting model were used to identify antecedents of PSD. Consequently, two social-contextual factors (supervisor developmental feedback and job autonomy) and one individual factor (proactive personality) were hypothesized to be precursors to PSD perceptions. The theorized model was tested with data obtained from a sample of 270 employees and their supervisors from 3 organisations in the People's Republic of China. Regression results revealed that PSD was positively related to creativity, but this relationship was partially mediated by creative self-efficacy. Additionally, intrinsic motivation moderated the relationship between PSD and creative self-efficacy such that the relationship was stronger for individuals high rather than low in intrinsic motivation. The findings represent a productive first step in identifying a specific cognitive demand that is conducive to employee creativity.
In addition, the findings contribute to the literature by identifying a psychological mechanism that may link cognitively demanding job attributes and creativity.
Abstract:
The aim of this research was to investigate the integration of computer-aided drafting and finite-element analysis in a linked computer-aided design procedure and to develop the necessary software. The Bézier surface patch was used for surface representation to bridge the gap between the rather separate fields of drafting and finite-element analysis, because the surfaces are defined by analytical functions which allow systematic and controlled variation of the shape and provide continuous derivatives up to any required degree. The objectives of this research were achieved by establishing: (i) a package which interprets the engineering drawings of plate and shell structures and prepares the Bézier net necessary for surface representation; (ii) a general-purpose stand-alone meshed-surface modelling package for surface representation of plates and shells using the Bézier surface patch technique; (iii) a translator which adapts the geometric description of plate and shell structures as given by the meshed-surface modeller to the form needed by the finite-element analysis package. The translator was extended to suit fan impellers by taking advantage of their sectorial symmetry. The linking processes were carried out for simple test structures and for simplified and actual fan impellers to verify the flexibility and usefulness of the linking technique adopted. Finite-element results for thin plate and shell structures showed excellent agreement with those obtained by other investigators, while results for the simplified and actual fan impellers also showed good agreement with those obtained in an earlier investigation where the finite-element analysis input data were manually prepared. Some extensions of this work are also discussed.
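A Bézier surface patch of the kind used by the meshed-surface modeller can be evaluated directly from its Bernstein form. The minimal sketch below (with an invented control net) shows the analytical function that supplies the continuous derivatives mentioned above:

```python
# Minimal evaluation of a Bézier surface patch in Bernstein form.
# The control net here is invented for illustration.

from math import comb

def bernstein(n, i, t):
    """Bernstein basis polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_patch(net, u, v):
    """Evaluate a patch from an (n+1) x (m+1) grid of (x, y, z) points."""
    n, m = len(net) - 1, len(net[0]) - 1
    point = [0.0, 0.0, 0.0]
    for i in range(n + 1):
        for j in range(m + 1):
            w = bernstein(n, i, u) * bernstein(m, j, v)
            for k in range(3):
                point[k] += w * net[i][j][k]
    return tuple(point)

# A flat 2x2 net gives a bilinear patch: the centre is the average
# of the four corners.
net = [[(0, 0, 0), (0, 1, 0)],
       [(1, 0, 0), (1, 1, 0)]]
centre = bezier_patch(net, 0.5, 0.5)  # -> (0.5, 0.5, 0.0)
```

Because the patch is polynomial in u and v, derivatives of any order are available in closed form, which is what makes the representation convenient for mesh generation.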
Abstract:
This thesis focused on applying theoretical models of synchronization to cortical dynamics as measured by magnetoencephalography (MEG). Dynamical systems theory was used both in identifying relevant variables for brain coordination and in devising methods for their quantification. We presented a method for studying interactions of linear and chaotic neuronal sources using MEG beamforming techniques, and showed that such sources can be accurately reconstructed in terms of their location, temporal dynamics and possible interactions. Synchronization in low-dimensional nonlinear systems was studied to explore specific correlates of functional integration and segregation. In the case of interacting dissimilar systems, the relevant coordination phenomena involved generalized and phase synchronization, which were often intermittent. Spatially extended systems were then studied. For locally coupled dissimilar systems, as in the case of cortical columns, clustering behaviour occurred. Synchronized clusters emerged at different frequencies and their boundaries were marked by oscillation death. The macroscopic mean field revealed sharp spectral peaks at the frequencies of the clusters and broader spectral drops at their boundaries. These results question existing models of Event Related Synchronization and Desynchronization. We re-examined the concept of the steady-state evoked response following an AM stimulus, and showed that very little of the variability in the AM following response could be accounted for by system noise. We presented a methodology for detecting local and global nonlinear interactions from MEG data in order to account for the residual variability. We found cross-hemispheric nonlinear interactions of ongoing cortical rhythms concurrent with the stimulus, and interactions of these rhythms with the AM following responses.
Finally, we hypothesized that holistic spatial stimuli would be accompanied by the emergence of clusters in primary visual cortex, resulting in frequency-specific MEG oscillations. Indeed, we found different frequency distributions in induced gamma oscillations for different spatial stimuli, which is suggestive of temporal coding of these spatial stimuli. Further, we addressed the bursting character of these oscillations, which is suggestive of intermittent nonlinear dynamics. However, we did not observe the characteristic -3/2 power-law scaling in the distribution of interburst intervals. Moreover, this distribution was only seldom significantly different from the one obtained in surrogate data, where nonlinear structure was destroyed. In conclusion, the work presented in this thesis suggests that advances in dynamical systems theory, in conjunction with developments in magnetoencephalography, may facilitate a mapping between levels of description in the brain. This may potentially represent a major advance in neuroscience.
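The phase synchronization between dissimilar coupled systems described in this abstract can be sketched in its simplest form: two coupled phase oscillators of Kuramoto type. All parameters below are illustrative, not taken from the thesis:

```python
import math

# Two coupled phase oscillators (Kuramoto form). The phase difference
# obeys d(delta)/dt = (w2 - w1) - 2K sin(delta): the pair phase-locks
# when |w2 - w1| <= 2K, and drifts (no synchronization) otherwise.
# Frequencies and coupling strengths are illustrative.

def simulate(coupling, w1=10.0, w2=10.5, dt=1e-3, steps=20_000):
    th1 = th2 = 0.0
    for _ in range(steps):
        d1 = w1 + coupling * math.sin(th2 - th1)
        d2 = w2 + coupling * math.sin(th1 - th2)
        th1 += d1 * dt
        th2 += d2 * dt
    return th2 - th1  # phase difference after 20 time units

locked   = simulate(coupling=1.0)  # 2K = 2.0 > 0.5: phases lock
unlocked = simulate(coupling=0.1)  # 2K = 0.2 < 0.5: phases drift
```

In the locked case the phase difference settles near asin(0.25) ≈ 0.25 rad; in the drifting case it grows without bound, and near the locking threshold the drift becomes intermittent, the regime the abstract highlights.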
Abstract:
Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature.
It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory and data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to the sensitivity decrement. Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
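The separation of perceptual sensitivity from response criterion described above follows the standard signal detection (decision theory) indices. The sketch below computes d' and the criterion from hypothetical hit and false-alarm rates, illustrating the reported pattern of stable sensitivity but a stricter criterion late in the work period:

```python
from statistics import NormalDist

# Standard signal detection indices: d' separates perceptual sensitivity
# from the response criterion c. Hit/false-alarm rates are invented.

def sdt_indices(hit_rate, fa_rate):
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)            # sensitivity
    criterion = -(z(hit_rate) + z(fa_rate)) / 2   # response bias
    return d_prime, criterion

# Early vs late in the watch: fewer hits AND fewer false alarms late,
# i.e. roughly constant d' but a stricter criterion - the pattern the
# thesis reports for most monitoring tasks.
d_early, c_early = sdt_indices(0.80, 0.20)
d_late,  c_late  = sdt_indices(0.69, 0.12)
```

The key point is that a drop in detections alone does not imply a sensitivity decrement: when false alarms fall in step, d' is unchanged and only the criterion has shifted.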
Abstract:
In 1974 Dr D M Bramwell published his research work at the University of Aston, part of which was the establishment of an elemental work study database covering drainage construction. The Transport and Road Research Laboratory decided to extend that work as part of their continuing research programme into the design and construction of buried pipelines by placing a research contract with Bryant Construction. This research may be considered under two broad categories. In the first, site studies were undertaken to validate and extend the database. The studies showed good agreement with the existing data, with the exception of the excavation, trench shoring and pipelaying data, which were amended to incorporate new construction plant and methods. An interactive on-line computer system for drainage estimating was developed. This system stores the elemental data, synthesizes the standard time of each drainage operation, and is used to determine the required resources and construction method for the total drainage activity. The remainder of the research was into the general topic of construction efficiency. An on-line command-driven computer system was produced. This system uses a stochastic simulation technique, based on distributions of site efficiency measurements, to evaluate the effects of varying performance levels. The analysis of this performance data quantifies the variability inherent in construction and demonstrates how some of this variability can be reconciled by considering the characteristics of a contract. A long-term trend of decreasing efficiency with contract duration was also identified. The results obtained from the simulation suite were compared with site records collected from current contracts. This showed that the approach gives comparable answers, but that these are greatly affected by the site performance parameters.
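A stochastic efficiency simulation of the kind described can be sketched as follows; the efficiency distribution, working day and standard-hours figures are invented for illustration and are not the contract data used in the research:

```python
import random

# Stochastic simulation of gang performance: sample a daily efficiency
# from a distribution and accumulate elapsed time until the standard
# hours of work are complete. All parameters are illustrative.

random.seed(1)

def contract_duration(standard_hours, n_runs=5000):
    """Mean elapsed hours to complete a given standard-hours workload."""
    durations = []
    for _ in range(n_runs):
        worked = elapsed = 0.0
        while worked < standard_hours:
            eff = random.gauss(0.85, 0.10)    # daily efficiency sample
            eff = min(max(eff, 0.4), 1.2)     # clamp to a plausible range
            worked += 8 * eff                 # productive hours this day
            elapsed += 8                      # elapsed hours this day
        durations.append(elapsed)
    return sum(durations) / n_runs

mean_hours = contract_duration(400)  # 400 standard hours of drainage work
```

Repeating the run while varying the efficiency distribution shows how performance-level assumptions feed directly into predicted contract durations, which is the comparison the simulation suite makes against site records.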
Abstract:
This thesis is an exploration of the social and political processes involved in the introduction of new technology to the shopfloor. Through a series of case studies of applications of microelectronics to batch manufacture, it attempts to uncover the ways in which the values and interests of managers, engineers, workers and others profoundly influence the choice and use of technology, and thus the work organisation which emerges. Previous analyses have tended to treat new technology as if it had "impacts" on work organisation - especially skills - which are inevitable in particular technical and economic circumstances. It is in opposition to this view that technical change is here treated as a matter for social choice and political negotiation, the various interested parties to the change being shown to attempt to incorporate their own interests into the technical and social organisation of work. Section one provides the relevant background to the case studies by summarising and criticising previous theoretical and empirical work in the area. The inadequacies of this work for our concerns are drawn out, and the need for detailed studies of the political aspects of technical change is justified. The case studies are presented in section two as a set of "episodes" of innovation, and section three analyses the empirical findings. The innovations are compared and contrasted in order to illustrate the social and political dynamics involved in the various stages of the innovation process. Finally some comments are made on policy issues for which the research has important implications.
Abstract:
This paper presents the first part of a study of the combustion processes in an industrial radiant tube burner (RTB). The RTB is typically used in heat-treating furnaces. The work was initiated because of the need for improvements in burner lifetime and performance. The present paper is concerned with the flow of combustion air; a future paper will address the combusting flow. A detailed three-dimensional computational fluid dynamics model of the burner was developed and validated against experimental air flow velocity measurements obtained with a split-film probe. Satisfactory agreement was achieved using the k-ε turbulence model. Various features along the air inlet passage were subsequently analysed. The effectiveness of the air recuperator swirler was found to be significantly compromised by the need for a generous assembly tolerance. Also, a substantial circumferential flow maldistribution introduced by the swirler is effectively removed by the positioning of a constriction in the downstream passage.
Abstract:
The focus of this paper is on the doctoral research training experienced by one of the authors and the ways in which the diverse linguistic and disciplinary perspectives of her two supervisors (co-authors of this paper) mediated the completion of her study. The doctoral candidate is a professional translator/interpreter and translation teacher. The paper describes why and how she identified her research area and then focused on the major research questions in collaboration with her two supervisors, who brought their differing perspectives from the field of linguistics to this translation research, even though they are not translators by profession or disciplinary background and do not speak Korean. In addition, the discussion considers the focus, purpose and theoretical orientation of the research itself (which addressed questions of readability in translated English-Korean texts through detailed analysis of a corpus and implications for professional translator training) as well as the supervisory and conceptual processes and practices involved. The authors contend that doctoral research of this kind can be seen as a mutual learning process and that inter-disciplinary research can make a contribution not only to the development of rigorous research in the field of translation studies but also to the other disciplinary fields involved.
Abstract:
This article categorises manufacturing strategy design processes and presents the characteristics of resulting strategies. This work will therefore assist practitioners to appreciate the implications of planning activities. The article presents a framework for classifying manufacturing strategy processes and the resulting strategies. Each process and respective strategy is then considered in detail. In this consideration the preferred approach is presented for formulating a world class manufacturing strategy. Finally, conclusions and recommendations for further work are given.
Abstract:
Leadership categorisation theory suggests that followers rely on a hierarchical cognitive structure in perceiving leaders and the leadership process, consisting of three levels: superordinate, basic and subordinate. The predominant view is that followers rely on Implicit Leadership Theories (ILTs) at the basic level in making judgments about managers. The thesis examines whether this presumption is true by proposing and testing two competing conceptualisations; namely, the congruence between basic-level ILTs (general leader) and perceptions of the actual manager, and between subordinate-level ILTs (job-specific leader) and perceptions of the actual manager. The conceptualisation at the job-specific level builds on the context-related assertions of the ILT explanatory models: leadership categorisation, information processing and connectionist network theories. Further, the thesis addresses the effects of ILT congruence at the group level. The hypothesised model suggests that Leader-Member Exchange (LMX) acts as a mediator between ILT congruence and outcomes. Three studies examined the proposed model. The first was cross-sectional, with 175 students reporting on work experience during a 1-year industrial placement. The second was longitudinal, with a sample of 343 students engaging in a business simulation in groups with formal leadership. The final study was a cross-sectional survey in several organisations with a sample of 178. A novel approach was taken to congruence analysis: the hypothesised models were tested using Latent Congruence Modelling (LCM), which accounts for measurement error and overcomes the majority of limitations of traditional approaches.
The first two studies confirm the traditional theorised view that employees rely on basic-level ILTs in making judgments about their managers with important implications, and show that LMX mediates the relationship between ILT congruence and work-related outcomes (performance, job satisfaction, well-being, task satisfaction, intragroup conflict, group satisfaction, team realness, team-member exchange, group performance). The third study confirms this with conflict, well-being, self-rated performance and commitment as outcomes.
Abstract:
Biomass-To-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called "second generation biofuels" that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies had been compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. production cost). This was the first time that an uncertainty analysis had been included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification, due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis.
Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if government tax take was reduced by approximately 33% or a subsidy of £55/t dry biomass was available, transport biofuels could be competitive with conventional fuels. Large scale biofuel production may be possible in the long term through subsidies, fuels price rises and legislation.
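A Monte Carlo uncertainty analysis of a production-cost model like the one described can be sketched as follows. The input distributions and the cost relation are illustrative assumptions, not the thesis' IPSEpro/Excel model:

```python
import random

# Monte Carlo sketch of production-cost uncertainty for a BTL plant.
# Each run samples uncertain inputs from triangular distributions and
# evaluates a toy cost model; all figures are invented for illustration.

random.seed(42)

def production_cost():
    capex  = random.triangular(250e6, 400e6, 320e6)  # installed cost, GBP
    price  = random.triangular(60, 110, 80)          # biomass, GBP/dry t
    feed   = 500e3                                   # biomass feed, t/yr
    output = random.triangular(90e3, 130e3, 110e3)   # liquid fuel, t/yr
    annual = 0.13 * capex + feed * price             # annualised cost/yr
    return annual / output                           # GBP per tonne of fuel

samples = sorted(production_cost() for _ in range(10_000))
p10, p50, p90 = (samples[int(len(samples) * q)] for q in (0.10, 0.50, 0.90))
```

The spread between the 10th and 90th percentiles, rather than a single point estimate, is what such an analysis reports; it shows which production-cost conclusions survive the uncertainty in the inputs.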