21 results for Traditional irrigation system
in Aston University Research Archive
Abstract:
This study was conceived with the primary objective of identifying and evaluating the financial aspects of the transformation in country/company relations in the international oil industry from the traditional concessionary system to a system of governmental participation in the ownership and operation of oil concessions. The emphasis of the inquiry was placed on assembling a case study of the oil exploitation arrangements of Libya. Through a comprehensive review of the literature, the sociopolitical factors surrounding the international oil business were identified and examined in an attempt to assess their influence on contractual arrangements and, in particular, to gauge the impact of any induced contractual changes on the revenue benefit accruing to the host country from its oil operations. Comparative analyses were made in the study to examine the viability of the Libyan participation deals both as an investment proposal and as a system of conducting oil activities in the country. The analysis was carried out in the light of specific hypotheses to assess the relative impact of the participation scheme, in comparison with the alternative concessionary model, on the net revenue accruing to the government from oil operations, and the relative effect on the level of research and development within the industry. A discounted cash flow analysis was conducted to measure inputs and outputs of the comparative models and judge their revenue benefits. An empirical analysis was then carried out to detect any significant behavioural changes in the exploration and development effort associated with the different oil exploitation systems. The results of the investigation of revenues support the argument that the mere introduction of the participation system did not result in a significant revenue benefit to the host government. Although government revenue increased significantly in the period following the emergence of the participation agreements, this increase was mainly due to socio-economic factors other than the participation scheme. At the same time, the empirical results show an association between the participation scheme and a decline in the oil industry's research and development efforts.
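The discounted cash flow comparison at the centre of the study can be illustrated with a minimal sketch. All cash-flow figures and the discount rate below are hypothetical illustrations, not the study's data.

```python
# Minimal sketch of a discounted cash flow (DCF) comparison of two
# contractual models. All figures are hypothetical, not the study's data.

def npv(rate, cash_flows):
    """Net present value of annual cash flows; cash_flows[t] occurs t years from now."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical net government revenue streams (arbitrary monetary units).
concession_revenue    = [-100, 40, 45, 50, 50, 50]   # traditional concession
participation_revenue = [-120, 45, 50, 55, 55, 55]   # participation scheme

discount_rate = 0.10  # assumed cost of capital

print(f"Concession NPV:    {npv(discount_rate, concession_revenue):8.1f}")
print(f"Participation NPV: {npv(discount_rate, participation_revenue):8.1f}")
```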
Abstract:
DUE TO COPYRIGHT RESTRICTIONS ONLY AVAILABLE FOR CONSULTATION AT ASTON UNIVERSITY LIBRARY AND INFORMATION SERVICES WITH PRIOR ARRANGEMENT One of the current research trends in Enterprise Resource Planning (ERP) involves examining the critical factors for its successful implementation. However, such research is limited to system implementation and does not focus on the flexibility of ERP to respond to changes in business. Therefore, this study explores a combination system, made up of an ERP and informality, intended to provide organisations with efficient and flexible performance simultaneously. In addition, this research analyses the benefits and challenges of using the system. The research was based on socio-technical system (STS) theory, which contains two dimensions: 1) a technical dimension, which evaluates the performance of the system; and 2) a social dimension, which examines the impact of the system on an organisation. A mixed-method approach has been followed in this research. The qualitative part aims to understand the constraints of using a single ERP system, and to define a new system corresponding to these problems. To achieve this goal, four Chinese companies operating in different industries were studied, all of which faced challenges in using an ERP system due to complexity and uncertainty in their business environments. The quantitative part contains a discrete-event simulation study that is intended to examine the impact on operational performance when a company implements the hybrid system in a real-life situation. Moreover, this research conducts a further qualitative case study to better understand the influence of the system in an organisation. The empirical aspect of the study reveals that an ERP with pre-determined business activities cannot react promptly to unanticipated changes in a business. Incorporating informality into an ERP makes it possible to react to different situations by using different procedures that are based on the practical knowledge of frontline employees. Furthermore, the simulation study shows that the combination system can achieve a balance between efficiency and flexibility. Unlike existing research, which emphasises a continuous improvement in the IT functions of an enterprise system, this research contributes a theoretical definition of a new system, which has mixed performance and contains both the formal practices embedded in an ERP and informal activities based on human knowledge. It supports both cost-efficiency in executing business transactions and flexibility in coping with business uncertainty. This research also indicates risks of using the system, such as using an ERP with limited functions; a high cost of performing informally; and low system acceptance, owing to a shift in organisational culture. With respect to practical contribution, this research suggests that companies can choose the most suitable enterprise-system approach in accordance with their operational strategies. The combination system can be implemented in a company that needs to handle medium levels of volume and variety. By contrast, the traditional ERP system is better suited to a company operating in a high-volume market, while an informal system is more suitable for a firm requiring a high level of variety.
Abstract:
Purpose - To provide an example of the use of system dynamics within the context of a discrete-event simulation study. Design/methodology/approach - A discrete-event simulation study of a production-planning facility in a gas cylinder-manufacturing plant is presented. The case study evidence incorporates questionnaire responses from sales managers involved in the order-scheduling process. Findings - As the project progressed it became clear that, although the discrete-event simulation would meet the objectives of the study in a technical sense, the organizational problem of "delivery performance" would not be solved by the discrete-event simulation study alone. The case shows how the qualitative outcomes of the discrete-event simulation study led to an analysis using the system dynamics technique. The system dynamics technique was able to model the decision-makers in the sales and production process and provide a deeper understanding of the performance of the system. Research limitations/implications - The case study describes a traditional discrete-event simulation study which incorporated an unplanned investigation using system dynamics. Further case studies using a planned approach to considering organizational issues in discrete-event simulation studies are required. Clarifying the role of qualitative data in discrete-event simulation studies, together with the use of supplementary tools that incorporate organizational aspects, may then help generate a methodology for discrete-event simulation that incorporates human aspects and so improve its relevance for decision making. Practical implications - It is argued that system dynamics can provide a useful addition to the toolkit of the discrete-event simulation practitioner in helping them incorporate a human aspect in their analysis. Originality/value - Helps decision makers gain a broader perspective on the tools available to them by showing the use of system dynamics to supplement the use of discrete-event simulation. © Emerald Group Publishing Limited.
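A minimal stock-and-flow sketch can show the kind of feedback loop a system dynamics model of order scheduling captures. The structure and all parameters here are illustrative assumptions, not the case-study model.

```python
# Minimal stock-and-flow sketch of an order-scheduling feedback loop,
# integrated by Euler's method. Structure and parameters are illustrative
# assumptions, not those of the case-study model.

ORDER_RATE = 10.0        # orders arriving per day (assumed constant)
TARGET_BACKLOG = 20.0    # backlog the decision-makers aim to hold
ADJUST_TIME = 5.0        # days over which capacity decisions respond
DT = 0.25                # integration step (days)

backlog = 40.0           # initial order backlog (the stock)
capacity = 8.0           # initial production rate (orders per day)

for _ in range(int(60 / DT)):            # simulate 60 days
    # Decision rule: raise or lower capacity to close the backlog gap.
    capacity += DT * (backlog - TARGET_BACKLOG) / ADJUST_TIME
    capacity = max(capacity, 0.0)
    # Stock equation: the backlog integrates inflow minus outflow.
    backlog += DT * (ORDER_RATE - capacity)
    backlog = max(backlog, 0.0)

print(f"Final backlog: {backlog:.1f} orders, capacity: {capacity:.1f}/day")
```

The delayed decision rule is what produces the oscillating delivery performance that a purely discrete-event view of the facility would not explain.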
Abstract:
A novel quasidistributed in-fiber Bragg grating (FBG) temperature sensor system has been developed for temperature profiling in vivo in the human body for medical applications, e.g., hyperthermia treatment. This paper describes the operating principle of FBG temperature sensors and the design of the sensor head. High-resolution detection of the wavelength shifts induced by temperature changes is achieved using drift-compensated interferometric detection, while the return signals from the FBG sensor array are demultiplexed with a simple monochromator which offers crosstalk-free wavelength-division multiplexing (WDM). A "strain-free" probe is designed by enclosing the FBG sensor array in a protection sleeve. A four-sensor FBG system is demonstrated, and the experimental results are in good agreement with those obtained by traditional electrical thermocouple sensors. A resolution of 0.1°C and an accuracy of ±0.2°C over a temperature range of 30-60°C have been achieved, which meet established medical requirements.
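The wavelength-to-temperature conversion such a sensor relies on can be sketched as follows. The sensitivity used is a typical textbook figure for FBGs near 1550 nm, assumed for illustration; the abstract does not give the system's calibration constants.

```python
# Sketch of the wavelength-shift-to-temperature conversion underlying an
# FBG sensor. The ~10 pm/°C sensitivity near 1550 nm is a typical textbook
# figure assumed for illustration, not the paper's calibration.

REF_WAVELENGTH_NM = 1550.000   # Bragg wavelength at the reference temperature
REF_TEMPERATURE_C = 30.0       # reference temperature
SENSITIVITY_NM_PER_C = 0.010   # ~10 pm/°C thermal sensitivity (assumed)

def temperature_from_shift(measured_wavelength_nm: float) -> float:
    """For a strain-free grating the Bragg shift is, to first order,
    linear in temperature: d_lambda = k * dT."""
    shift_nm = measured_wavelength_nm - REF_WAVELENGTH_NM
    return REF_TEMPERATURE_C + shift_nm / SENSITIVITY_NM_PER_C

# Resolving 0.1 °C then means resolving ~1 pm shifts, which is why
# drift-compensated interferometric detection is needed.
print(f"{temperature_from_shift(1550.150):.1f} °C")   # -> 45.0 °C
```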
Abstract:
The IRDS standard is an international standard produced by the International Organisation for Standardisation (ISO). In this work the process for producing standards in formal standards organisations, for example the ISO, and in more informal bodies, for example the Object Management Group (OMG), is examined. This thesis examines previous models and classifications of standards, which are then combined to produce a new classification, and the IRDS standard is placed in a class in the new model as a reference anticipatory standard. Anticipatory standards are standards developed ahead of the technology in an attempt to guide the market. The diffusion of the IRDS is traced over a period of eleven years. The economic conditions which affect the diffusion of standards are examined, particularly those which prevail in compatibility markets such as the IT and ICT markets. Additionally, the consequences of the introduction of gateway or converter devices into a market where a standard has not yet been established are examined. The IRDS standard did not have an installed base, and this hindered its diffusion. The thesis concludes that the IRDS standard was overtaken by new developments such as object-oriented technologies and middleware, partly because of the slow process of developing standards in traditional organisations which operate on a consensus basis, and partly because the IRDS standard lacked an installed base. The rise and proliferation of middleware products also resulted in exchange mechanisms becoming dominant rather than repository solutions. The research method used in this work is a longitudinal study of the development and diffusion of the ISO/IEC IRDS standard. The research is regarded as a single case study and follows the interpretative epistemological point of view.
Abstract:
The sectoral and occupational structure of Britain and West Germany has changed increasingly over the last fifty years from a manual, manufacturing-based one to a non-manual, service-sector-based one, with a trend towards more managerial and fewer menial occupations. Britain employs a higher proportion of its population in the service sector than in manufacturing compared to West Germany, except in retailing, where West Germany employs twice as many people as Britain. Retailing is a stable sector of the economy in terms of employment, but the requirements of the workforce have changed in line with changes in the industry in both countries. School leavers in the two countries, faced with the same options (further education, training schemes or employment), have opted for these options in different proportions: young Germans are staying longer in education before embarking on training, and young Britons are now less likely to go straight into employment than ten years ago. Training is becoming more accepted as the normal route into employment, with government policy leading the way but public opinion still slow to respond. This study investigates how vocational training has adapted to the changing requirements of industry, often determined by technological advancement. In some areas, e.g. manufacturing industry, the changes have been radical; in others, such as retailing, they have not, although skill requirements, not necessarily influenced by technology, have changed. Social-communicative skills, frequently not even considered skills and therefore not included in training, are coming to the forefront. Vocational training has adapted differently in the two countries: in West Germany on the basis of an established, over-defined system, and in Britain on the basis of an out-dated, ill-defined and almost non-existent one. In retailing, German school leavers opt for two- or three-year apprenticeships, whereas British school leavers are offered employment with or without formalised training. The publicly held view of the occupation of sales assistant is one of low-level skill, low intellectual demands and a job anyone can do. The traditional skills - product knowledge, selling and social-communicative skills - have steadily been eroded. In the last five years retailers have recognised that a return to customer service, utilising the traditional skills, would be required of their staff if they were to remain competitive; this requires training. The German retail training system responded by adapting its training regulations in a long consultative process, whereas the British experimented with YTS, a formalised nationwide training scheme that was a new departure. The thesis evaluates the changes in these regulations. The case studies in four retail outlets demonstrate that it is indeed product knowledge, selling and social-communicative skills which are fundamental to being a successful and contented sales assistant in either country. When these skills are recognised and taught well and systematically, the foundations for career development in retailing are laid, in a labour market which is continually looking for better-qualified workers. Training, when planned and conducted professionally, is appreciated by staff and customers and is of benefit to the company. In British retailing not enough systematic training to recognisable standards is carried out, whereas in West Germany a training structure is in place on which to build, one better prepared to show innovative potential. In Britain, the reputation of the individual company therefore plays the greater role, which does not ensure a national provision of good training in retailing.
Abstract:
The absence of a definitive approach to the design of manufacturing systems signifies the importance of a control mechanism to ensure the timely application of relevant design techniques. To provide effective control, design development needs to be continually assessed in relation to the required system performance, which can only be achieved analytically through computer simulation: the only technique capable of accurately replicating the highly complex and dynamic interrelationships inherent within manufacturing facilities and realistically predicting system behaviour. Owing to these unique capabilities, the application of computer simulation should support and encourage a thorough investigation of all alternative designs, allowing attention to focus specifically on critical design areas and enabling continuous assessment of system evolution. To achieve this, system analysis needs to be efficient in terms of data requirements and both speed and accuracy of evaluation. To provide an effective control mechanism, a hierarchical or multi-level modelling procedure has therefore been developed, specifying the appropriate degree of evaluation support necessary at each phase of design. An underlying assumption of the proposal is that evaluation is quick, easy and allows models to expand in line with design developments. However, current approaches to computer simulation are entirely inappropriate for supporting such hierarchical evaluation. Implementation of computer simulation through traditional approaches is typically characterized by a requirement for very specialist expertise, a lengthy model development phase and a correspondingly high expenditure, resulting in very little, and rather inappropriate, use of the technique. Simulation, when used, is generally applied only to check or verify a final design proposal; rarely is its full potential utilized to aid, support or complement the manufacturing system design procedure. To implement the proposed modelling procedure, the concept of a generic simulator was therefore adopted, as such systems require no specialist expertise, instead facilitating quick and easy model creation, execution and modification through simple data inputs. Previous generic simulators have tended to be too restricted, lacking the necessary flexibility to be generally applicable to manufacturing systems. Development of the ATOMS manufacturing simulator, however, has proven that such systems can be relevant to a wide range of applications, besides verifying the benefits of multi-level modelling.
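The kernel that any such generic simulator is built on is a future-event list ordered by time. A minimal sketch, for a single machine with a queue, is shown below; it is an illustration of the discrete-event principle, not the ATOMS simulator itself, and all timing parameters are assumptions.

```python
# Minimal discrete-event simulation kernel: a future-event list ordered by
# time, applied to one machine fed by a queue. Illustrative sketch only;
# arrival and processing times are assumed values.
import heapq
import random

random.seed(1)
SIM_END = 480.0                      # minutes in one shift

events = [(0.0, "arrival")]          # future-event list: (time, kind)
queue = 0
machine_busy = False
completed = 0

while events:
    time, kind = heapq.heappop(events)
    if time > SIM_END:
        break
    if kind == "arrival":
        queue += 1                   # part joins the queue
        heapq.heappush(events, (time + random.expovariate(1 / 10), "arrival"))
    else:                            # "finish": machine completes a part
        machine_busy = False
        completed += 1
    if queue and not machine_busy:   # start the next part if possible
        queue -= 1
        machine_busy = True
        heapq.heappush(events, (time + random.expovariate(1 / 8), "finish"))

print(f"Parts completed in one shift: {completed}, left in queue: {queue}")
```

A generic simulator wraps exactly this loop behind data inputs (machine counts, routings, cycle times), which is what allows models to be created and modified without specialist expertise.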
Abstract:
This thesis starts with a literature review, outlining the major issues identified in the literature concerning virtual manufacturing enterprise (VME) transformation. It then details the research methodology used - a systematic approach for empirical research. Next, based on the conceptual framework proposed, this thesis builds three modules to form a reference model, with the purpose of clarifying the important issues relevant to transforming a traditional manufacturing company into a VME. The first module proposes a mechanism of VME transformation - operating along the VME metabolism. The second module builds a management function within a VME to ensure the proper operation of this mechanism. This function helps identify six areas closely related to VME transformation: lean manufacturing; competency protection; internal operation performance measurement; alliance performance measurement; knowledge management; and alliance decision making. The third module builds on these and proposes an alliance performance measurement system comprising 14 categories of performance indicators. An analysis template for alliance decision making is also proposed and integrated into the first module. To validate these three modules, seven manufacturing organisations (five in China and two in the UK) were investigated, and these field case studies are analysed in this thesis. The evidence found in these organisations, together with the evidence collected from the literature, including both researcher views and literature case studies, provides triangulated support for the findings. In addition, this thesis identifies the strength and weakness patterns of the manufacturing companies within the theoretical niche of this research, and clarifies the relationships among some major research areas from the perspective of virtual manufacturing. Finally, the research findings are summarised, together with their theoretical and practical implications. Research limitations and recommendations for future work conclude this thesis.
Abstract:
This thesis presents the results of an investigation into the merits of analysing magnetoencephalographic (MEG) data in the context of dynamical systems theory. MEG is the study both of methods for the measurement of minute magnetic flux variations at the scalp, resulting from neuro-electric activity in the neocortex, and of the techniques required to process and extract useful information from these measurements. As a result of its unique mode of action - directly measuring neuronal activity via the resulting magnetic field fluctuations - MEG possesses a number of useful qualities which could potentially make it a powerful addition to any brain researcher's arsenal. Unfortunately, MEG research has so far failed to fulfil its early promise, being hindered in its progress by a variety of factors. Conventionally, the analysis of MEG has been dominated by the search for activity in certain spectral bands - the so-called alpha, delta, beta, etc. bands commonly referred to in both academic and lay publications. Other efforts have centred upon generating optimal fits of "equivalent current dipoles" that best explain the observed field distribution. Many of these approaches carry the implicit assumption that the dynamics which result in the observed time series are linear, despite a variety of reasons to suspect that nonlinearity might be present in MEG recordings. By using methods that allow for nonlinear dynamics, the research described in this thesis avoids these restrictive linearity assumptions. A crucial concept underpinning this project is the belief that MEG recordings are mere observations of the evolution of the true underlying state, which is unobservable and is assumed to reflect some abstract brain cognitive state. Further, we maintain that it is unreasonable to expect these processes to be adequately described in the traditional way: as a linear sum of a large number of frequency generators. One of the main objectives of this thesis is to demonstrate that much more effective and powerful analysis of MEG can be achieved if one assumes the presence of both linear and nonlinear characteristics from the outset. Our position is that the combined action of a relatively small number of these generators, coupled with external and dynamic noise sources, is more than sufficient to account for the complexity observed in MEG recordings. Another problem that has plagued MEG researchers is the extremely low signal-to-noise ratios that are obtained. As the magnetic flux variations resulting from actual cortical processes can be extremely minute, the measuring devices used in MEG are, necessarily, extremely sensitive. The unfortunate side-effect of this is that even commonplace phenomena such as the earth's geomagnetic field can easily swamp signals of interest. This problem is commonly addressed by averaging over a large number of recordings. However, this has a number of notable drawbacks: in particular, it is difficult to synchronise the high-frequency activity which might be of interest, and often these signals are cancelled out by the averaging process. Other problems that have been encountered are the high cost and low portability of state-of-the-art multichannel machines, with the result that the use of MEG has hitherto been restricted to large institutions able to afford their procurement and maintenance.
In this project, we seek to address these issues by working almost exclusively with single channel, unaveraged MEG data. We demonstrate the applicability of a variety of methods originating from the fields of signal processing, dynamical systems, information theory and neural networks, to the analysis of MEG data. It is noteworthy that while modern signal processing tools such as independent component analysis, topographic maps and latent variable modelling have enjoyed extensive success in a variety of research areas from financial time series modelling to the analysis of sun spot activity, their use in MEG analysis has thus far been extremely limited. It is hoped that this work will help to remedy this oversight.
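The basic dynamical-systems step for working with a single unaveraged channel is delay-coordinate embedding, which reconstructs a state-space picture from one observable. A minimal sketch follows; the signal is synthetic, and in practice the embedding dimension and lag would be chosen with tools such as false nearest neighbours and mutual information.

```python
# Sketch of delay-coordinate embedding: reconstructing state-space vectors
# from a single-channel recording. The signal here is synthetic; dim and
# lag values are illustrative assumptions.
import numpy as np

def delay_embed(x: np.ndarray, dim: int, lag: int) -> np.ndarray:
    """Return the matrix of delay vectors [x(t), x(t+lag), ..., x(t+(dim-1)*lag)]."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

# Synthetic stand-in for a single-channel trace: a small number of
# frequency generators plus dynamic noise, as argued above.
t = np.arange(0, 60, 0.01)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 23 * t)
signal += 0.2 * np.random.default_rng(0).standard_normal(len(t))

vectors = delay_embed(signal, dim=5, lag=7)
print(vectors.shape)   # each row approximates a point in the reconstructed state space
```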
Abstract:
The development of a system that integrates reverse osmosis (RO) with a horticultural greenhouse has been advanced through laboratory experiments. In this concept, intended for the inland desalination of brackish groundwater in dry areas, the RO concentrate will be reduced in volume by passing it through the evaporative cooling pads of the greenhouse. The system will be powered by solar photovoltaics (PV). Using a solar array simulator, we have verified that the RO can operate with varying power input and recovery rates to meet the water demands for irrigation and cooling of a greenhouse in north-west India. Cooling requires ventilation by a fan which has also been built, tested and optimised with a PV module outdoors. Results from the experiments with these two subsystems (RO and fan) are compared to theoretical predictions to reach conclusions about energy usage, sizing and cost. For example, the optimal sizing for the RO system is 0.12–1.3 m2 of PV module per m2 of membrane, depending on feed salinity. For the fan, the PV module area equals that of the fan aperture. The fan consumes <30 J of electrical energy per m3 of air moved, which is 3 times less than that of standard fans. The specific energy consumption of the RO, at 1–2.3 kWh per m3, is comparable to that reported by others. Now that the subsystems have been verified, the next step will be to integrate and test the whole system in the field.
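A back-of-envelope check shows where a figure in the quoted 1–2.3 kWh per m3 range comes from. The pressure, pump efficiency and recovery below are assumed illustrative values, not figures from the paper.

```python
# Back-of-envelope specific energy consumption (SEC) for brackish-water RO.
# All values are illustrative assumptions, not the paper's data.

FEED_PRESSURE_PA = 12e5      # assumed feed pressure (12 bar)
PUMP_EFFICIENCY = 0.6        # assumed wire-to-water pump efficiency
RECOVERY = 0.5               # permeate volume / feed volume

# Energy per m3 of permeate, ignoring energy recovery devices:
# SEC = p_feed / (efficiency * recovery)
sec_joules = FEED_PRESSURE_PA / (PUMP_EFFICIENCY * RECOVERY)
sec_kwh = sec_joules / 3.6e6
print(f"SEC ≈ {sec_kwh:.2f} kWh per m3 of permeate")   # ≈ 1.1 kWh/m3
```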
Abstract:
In many areas of northern India, salinity renders groundwater unsuitable for drinking and even for irrigation. Though membrane treatment can be used to remove the salt, there are drawbacks to this approach, e.g. (1) depletion of the groundwater due to over-abstraction, (2) saline contamination of surface water and soil caused by concentrate disposal, and (3) high electricity usage. To address these issues, a system is proposed in which a photovoltaic-powered reverse osmosis (RO) unit is used to irrigate a greenhouse (GH) in a stand-alone arrangement. The concentrate from the RO is supplied to an evaporative cooling system, thus reducing its volume so that finally it can be evaporated in a pond to a solid for safe disposal. Based on typical meteorological data for Delhi, calculations based on mass and energy balance are presented to assess the sizing and cost of the system. It is shown that solar radiation, freshwater output and evapotranspiration demand are readily matched due to the approximately linear relation among these variables. The demand for concentrate varies independently, however, thus favouring the use of a variable-recovery arrangement. Though enough water may be harvested from the GH roof to provide year-round irrigation, this would require considerable storage; some practical options for storage tanks are discussed. An alternative use of rainwater is in misting to reduce peak temperatures in the summer. An example optimised design provides internal temperatures below 30°C (monthly average daily maxima) for 8 months of the year and costs about €36,000 for the whole system with a GH floor area of 1000 m2. Further work is needed to assess technical risks relating to scale deposition in the membrane and evaporative pads, and to develop a business model that will allow such a project to succeed in the Indian rural context.
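The mass-balance reasoning behind the rainwater-harvesting conclusion can be sketched in a few lines. The rainfall, runoff coefficient and crop demand below are illustrative assumptions, not the paper's Delhi data.

```python
# Sketch of the annual water mass balance: can roof-harvested rain cover
# the irrigation demand? All figures are illustrative assumptions.

ROOF_AREA_M2 = 1000.0          # greenhouse roof area, as in the example design
ANNUAL_RAIN_MM = 700.0         # assumed annual rainfall
RUNOFF_COEFF = 0.9             # fraction of rainfall actually captured
IRRIGATION_MM_PER_DAY = 2.0    # assumed crop water demand over the floor area

harvest_m3 = ROOF_AREA_M2 * (ANNUAL_RAIN_MM / 1000.0) * RUNOFF_COEFF
demand_m3 = ROOF_AREA_M2 * (IRRIGATION_MM_PER_DAY / 1000.0) * 365

print(f"Annual harvest: {harvest_m3:.0f} m3, demand: {demand_m3:.0f} m3")
# Totals are roughly matched, but rain falls mainly in the monsoon while
# irrigation demand is year-round: that timing mismatch is what drives
# the considerable storage requirement noted above.
```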
Abstract:
We propose a scheme for multilevel (nine or more levels) amplitude regeneration based on a nonlinear optical loop mirror (NOLM) and demonstrate through numerical modeling its efficiency and cascadability on circular 16-, 64-, and 256-symbol constellations. We show that amplitude noise is efficiently suppressed. The design is flexible and enables variation of the number of levels and their positioning, and the scheme is compatible with phase regenerators. Moreover, compared to the traditional single-NOLM configuration, new features, such as a reduced and sign-varying power-dependent phase shift, become available. The model is simple to implement, as it requires only two couplers in addition to the traditional NOLM, and offers a vast range of optimization parameters. © 2014 Optical Society of America.
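For orientation, the textbook power transfer function of the single NOLM that this two-coupler scheme extends can be sketched as follows; the coupling ratio, nonlinearity and loop length are illustrative assumptions.

```python
# Textbook power transfer function of a standard single NOLM, the building
# block the proposed scheme extends. Parameter values are illustrative.
import numpy as np

ALPHA = 0.45          # coupler power-splitting ratio (assumed)
GAMMA = 2.0           # fibre nonlinearity, 1/(W km) (assumed)
LOOP_KM = 1.0         # loop length (assumed)

def nolm_transmission(p_in_w: np.ndarray) -> np.ndarray:
    """Transmitted power of a standard NOLM.

    The counter-propagating waves acquire a power-dependent phase
    difference dphi = (1 - 2*alpha) * gamma * L * P_in, giving
    P_out = P_in * (1 - 2*alpha*(1 - alpha)*(1 + cos(dphi))).
    """
    dphi = (1 - 2 * ALPHA) * GAMMA * LOOP_KM * p_in_w
    return p_in_w * (1 - 2 * ALPHA * (1 - ALPHA) * (1 + np.cos(dphi)))

p_in = np.linspace(0.0, 60.0, 7)        # input peak powers in W
for p, q in zip(p_in, nolm_transmission(p_in)):
    print(f"P_in = {p:5.1f} W -> P_out = {q:5.2f} W")
```

The plateaus of this transfer curve are what squeeze amplitude noise toward fixed levels; multilevel regeneration requires engineering several such plateaus, which is what the additional couplers provide.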
Abstract:
Synchronous reluctance motors (SynRMs) are gaining popularity in industrial drives due to their permanent-magnet-free design, competitive performance, and robustness. This paper studies the power losses in a 90-kW converter-fed SynRM drive by a calorimetric method, in comparison with the traditional input-output method. After the converter and the motor were measured simultaneously in separate chambers, the converter was installed inside the large chamber next to the motor and the total drive-system losses were obtained using one chamber. The uncertainty of both measurement methods is analyzed and discussed.
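The two measurement principles being compared reduce to two short calculations. All numerical values below are illustrative assumptions, not the paper's data.

```python
# Sketch of the two loss-measurement principles compared above.
# All numerical values are illustrative assumptions.

# Input-output method: losses are the small difference of two large
# measured powers, so the power-measurement uncertainties dominate.
p_electrical_in_kw = 92.0        # measured electrical input power (assumed)
p_mechanical_out_kw = 88.5       # measured shaft output power (assumed)
loss_io_kw = p_electrical_in_kw - p_mechanical_out_kw

# Calorimetric method: losses are inferred directly from the heat carried
# away by the coolant flowing through the chamber: P = q_v * rho * c_p * dT.
COOLANT_FLOW_M3S = 0.28          # air flow through the chamber (assumed)
RHO_AIR = 1.2                    # air density, kg/m3
CP_AIR = 1005.0                  # specific heat of air, J/(kg K)
temp_rise_k = 10.4               # measured air temperature rise (assumed)
loss_cal_kw = COOLANT_FLOW_M3S * RHO_AIR * CP_AIR * temp_rise_k / 1000.0

print(f"Input-output loss: {loss_io_kw:.2f} kW")
print(f"Calorimetric loss: {loss_cal_kw:.2f} kW")
```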
Abstract:
The computational mechanics approach has been applied to the orientational behavior of water molecules in a molecular dynamics simulated water–Na+ system. The distinctively different statistical complexity of water molecules in the bulk and in the first solvation shell of the ion is demonstrated. It is shown that the molecules undergo more complex orientational motion when surrounded by other water molecules than when constrained by the electric field of the ion. However, the spatial coordinates of the oxygen atom show the opposite behavior: complexity is higher for the solvation-shell molecules. New information about the dynamics of water molecules in the solvation shell is provided, additional to that given by traditional methods of analysis.
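The computational-mechanics idea can be illustrated in simplified form: discretize a trajectory into symbols, group histories that predict statistically similar futures into causal states, and take the entropy of the state distribution as the statistical complexity. The sketch below is a toy CSSR-like estimator on synthetic data, not the study's method; history length and merge tolerance are assumptions.

```python
# Toy estimator of statistical complexity in the computational-mechanics
# sense, on a synthetic symbol sequence. Illustrative sketch only.
import math
import random
from collections import Counter, defaultdict

random.seed(0)
# Synthetic binary sequence (stand-in for a discretized orientational
# trajectory). IID data should collapse to a single causal state.
seq = "".join(random.choice("01") for _ in range(20000))

L = 3                                  # history length (assumed)
futures = defaultdict(Counter)         # history -> counts of next symbol
for i in range(L, len(seq)):
    futures[seq[i - L:i]][seq[i]] += 1

def next_prob(hist):                   # P(next = '1' | history)
    c = futures[hist]
    return c["1"] / (c["0"] + c["1"])

# Merge histories whose predictive distributions agree within a tolerance;
# each group approximates a causal state.
states = []
for hist in sorted(futures):
    for state in states:
        if abs(next_prob(hist) - next_prob(state[0])) < 0.05:
            state.append(hist)
            break
    else:
        states.append([hist])

# Statistical complexity: entropy of the causal-state distribution.
total = sum(sum(c.values()) for c in futures.values())
c_mu = 0.0
for state in states:
    p = sum(sum(futures[h].values()) for h in state) / total
    c_mu -= p * math.log2(p)
print(f"{len(states)} causal states, complexity ≈ {c_mu:.2f} bits")
```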