37 results for reference price system
in Aston University Research Archive
Abstract:
The Digital Observatory for Protected Areas (DOPA) has been developed to support the European Union’s efforts in strengthening our capacity to mobilize and use biodiversity data, information and forecasts so that they are readily accessible to policymakers, managers, experts and other users. Conceived as a set of web-based services, DOPA provides a broad set of free and open source tools to assess, monitor and even forecast the state of and pressure on protected areas at local, regional and global scale. DOPA Explorer 1.0 is a web-based interface available in four languages (EN, FR, ES, PT) providing simple means to explore the nearly 16,000 protected areas that are at least as large as 100 km². Distinguishing between terrestrial, marine and mixed protected areas, DOPA Explorer 1.0 can help end users to identify those with most unique ecosystems and species, and assess the pressures they are exposed to because of human development. Recognized by the UN Convention on Biological Diversity (CBD) as a reference information system, DOPA Explorer is based on the best global data sets available and provides means to rank protected areas at the country and ecoregion levels. Inversely, DOPA Explorer indirectly highlights the protected areas for which information is incomplete. We finally invite the end-users of DOPA to engage with us through the proposed communication platforms to help improve our work to support the safeguarding of biodiversity.
Abstract:
The Joint Research Centre (JRC) of the European Commission has developed, in consultation with many partners, the DOPA as a global reference information system to support decision making on protected areas (PAs) and biodiversity conservation. The DOPA brings together the World Database on Protected Areas with other reference datasets on species, habitats, ecoregions, threats and pressures, to deliver critical indicators at country level and PA level that can inform gap analyses, PA planning and reporting. These indicators are especially relevant to Aichi Targets 11 and 12, and have recently contributed to CBD country dossiers and capacity building on these targets. DOPA also includes eConservation, a new module that provides a means to share and search information on conservation projects, and thus allows users to see “who is doing what where”. So far over 5000 projects from the World Bank, GEF, CEPF, EU LIFE Programme, CBD LifeWeb Initiative and others have been included, and these projects can be searched in an interactive mapping interface based on criteria such as location, objectives, timeframe, budget, the organizations involved, target species etc. This seminar will provide an introduction to DOPA and eConservation, highlight how these services are used by the CBD and others, and include ample time for discussion.
Abstract:
This report presents the results of testing of the Metris iGPS system performed by the National Physical Laboratory (NPL) and the University of Bath (UoB), with the assistance of Metris and Airbus, at Airbus Broughton in March 2008. The aim of the test was to determine the performance capability of the iGPS coordinate metrology system by comparison with a reference measurement system based on multilateration implemented using laser trackers. A network of reference points was created using SMR nests fixed to the ground and above ground level on various stands. The reference points were spread out within the measurement volume of approximately 10 m × 10 m × 2 m. The coordinates of each reference point were determined by the laser tracker survey using multilateration. The expanded uncertainty (k=2) in the relative position of these reference coordinates was estimated to be of the order of 10 µm in x, y and z. A comparison between the iGPS system and the reference system showed that for the test setup, the iGPS system was able to determine lengths up to 12 m with an uncertainty of 170 µm (k=2) and coordinates with an uncertainty of 120 µm in x and y and 190 µm in z (k=2).
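The expanded uncertainties quoted above use a coverage factor k = 2, which corresponds to roughly 95 % confidence for a normally distributed error. As a minimal illustration of the convention only (the NPL analysis combines many more uncertainty contributions than repeatability), the following sketch expands the standard uncertainty of the mean of repeated readings:

```python
import statistics

def expanded_uncertainty(readings, k=2.0):
    """Expanded uncertainty of the mean of repeated readings.

    The standard uncertainty is the sample standard deviation divided by
    sqrt(n); multiplying by the coverage factor k (k=2 here) gives an
    interval with roughly 95 % coverage for normally distributed errors.
    """
    u = statistics.stdev(readings) / len(readings) ** 0.5
    return k * u
```

For example, five readings of a coordinate scattered by 0.1 units around their mean give an expanded uncertainty of about 0.14 units at k = 2.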
Abstract:
This study has been conceived with the primary objective of identifying and evaluating the financial aspects of the transformation in country/company relations of the international oil industry from the traditional concessionary system to the system of governmental participation in the ownership and operation of oil concessions. The emphasis of the inquiry was placed on assembling a case study of the oil exploitation arrangements of Libya. Through a comprehensive review of the literature, the sociopolitical factors surrounding the international oil business were identified and examined in an attempt to see their influence on contractual arrangements and particularly to gauge the impact of any induced contractual changes on the revenue benefit accruing to the host country from its oil operations. Some comparative analyses were made in the study to examine the viability of the Libyan participation deals both as an investment proposal and as a system of conducting oil activities in the country. The analysis was carried out in the light of specific hypotheses to assess the relative impact of the participation scheme in comparison with the alternative concessionary model on the net revenue accruing to the government from oil operations and the relative effect on the level of research and development within the industry. A discounted cash flow analysis was conducted to measure inputs and outputs of the comparative models and judge their revenue benefits. Then an empirical analysis was carried out to detect any significant behavioural changes in the exploration and development effort associated with the different oil exploitation systems. Results of the investigation of revenues support the argument that the mere introduction of the participation system has not resulted in a significant revenue benefit to the host government.
Though there has been a significant increase in government revenue, associated with the period following the emergence of the participation agreements, this increase was mainly due to socio-economic factors other than the participation scheme. At the same time the empirical results have shown an association of the participation scheme with a decline of the oil industry's research and development efforts.
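The discounted cash flow comparison the study performs can be illustrated with a net present value function. The cash-flow streams below are purely hypothetical numbers, chosen only to show the mechanics of comparing a concessionary stream against a participation stream with an upfront buy-in:

```python
def npv(cash_flows, rate):
    """Net present value of annual cash flows.

    cash_flows[t] is received at the end of year t; year 0 is undiscounted.
    """
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical government-revenue streams (illustrative values only):
concession = [0, 60, 60, 60]       # steady take under a concession
participation = [-40, 80, 80, 80]  # upfront buy-in, larger later shares
```

Comparing `npv(concession, r)` with `npv(participation, r)` at a chosen discount rate r is the form of test the study applies; the ranking can flip as r rises, since the participation stream is back-loaded.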
Abstract:
This paper details work carried out to verify the dimensional measurement performance of the Indoor GPS (iGPS) system; a network of Rotary-Laser Automatic Theodolites (R-LATs). Initially tests were carried out to determine the angular uncertainties on an individual R-LAT transmitter-receiver pair. A method is presented of determining the uncertainty of dimensional measurement for a three dimensional coordinate measurement machine. An experimental procedure was developed to compare three dimensional coordinate measurements with calibrated reference points. The reference standard used to calibrate these reference points was a fringe counting interferometer with the multilateration technique employed to establish three dimensional coordinates. This is an extension of the established technique of comparing measured lengths with calibrated lengths. The method was found to be practical and able to establish that the expanded uncertainty of the basic iGPS system was approximately 1 mm at a 95% confidence level. Further tests carried out on a highly optimized version of the iGPS system have shown that the coordinate uncertainty can be reduced to 0.25 mm at a 95% confidence level.
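The verification logic described, comparing measured lengths against calibrated reference lengths and asking whether the deviations fall within the claimed expanded uncertainty, can be sketched as follows (illustrative only; the paper's full analysis also propagates the uncertainty of the interferometric reference itself):

```python
def length_errors(measured, calibrated):
    """Deviations of measured lengths from calibrated reference lengths."""
    return [m - c for m, c in zip(measured, calibrated)]

def within_expanded_uncertainty(measured, calibrated, U):
    """True if every deviation lies inside the expanded uncertainty U.

    For the basic iGPS system above, U would be about 1 mm at the 95 %
    confidence level; for the optimized version, about 0.25 mm.
    """
    return all(abs(e) <= U for e in length_errors(measured, calibrated))
```

So a set of length measurements in millimetres passes the 1 mm check but can fail a tighter 0.5 mm one.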
Abstract:
A sieve plate distillation column has been constructed and interfaced to a minicomputer with the necessary instrumentation for dynamic, estimation and control studies with special bearing on low-cost and noise-free instrumentation. A dynamic simulation of the column with a binary liquid system has been compiled using deterministic models that include fluid dynamics via Brambilla's equation for tray liquid holdup calculations. The simulation predictions have been tested experimentally under steady-state and transient conditions. The simulator's predictions of the tray temperatures have shown reasonably close agreement with the measured values under steady-state conditions and in the face of a step change in the feed rate. A method of extending linear filtering theory to highly nonlinear systems with very nonlinear measurement functional relationships has been proposed and tested by simulation on binary distillation. The simulation results have proved that the proposed methodology can overcome the typical instability problems associated with the Kalman filters. Three extended Kalman filters have been formulated and tested by simulation. The filters have been used to refine a much simplified model sequentially and to estimate parameters such as the unmeasured feed composition using information from the column simulation. It is first assumed that corrupted tray composition measurements are made available to the filter and then corrupted tray temperature measurements are accessed instead. The simulation results have demonstrated the powerful capability of the Kalman filters to overcome the typical hardware problems associated with the operation of on-line analyzers in relation to distillation dynamics and control by, in effect, replacing them. A method of implementing estimator-aided feedforward (EAFF) control schemes has been proposed and tested by simulation on binary distillation.
The results have shown that the EAFF scheme provides much better control and energy conservation than the conventional feedback temperature control in the face of a sustained step change in the feed rate or multiple changes in the feed rate, composition and temperature. Further extensions of this work are recommended as regards simulation, estimation and EAFF control.
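A scalar Kalman filter estimating a constant unmeasured quantity (such as the feed composition in the abstract) from noisy measurements shows the recursion underlying the estimators; this is only a sketch of the linear scalar case, not the extended Kalman filters on a nonlinear column model that the thesis develops:

```python
def kalman_constant(measurements, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter estimating a constant unmeasured quantity.

    State model: x_k = x_{k-1} (a constant, no process noise).
    r is the measurement-noise variance; x0, p0 are the prior estimate
    and its variance. Returns the list of estimates after each update.
    """
    x, p = x0, p0
    history = []
    for z in measurements:
        k = p / (p + r)      # Kalman gain
        x = x + k * (z - x)  # measurement update
        p = (1.0 - k) * p    # covariance update
        history.append(x)
    return history
```

Fed a stream of noisy readings around a true value, the estimate converges toward that value and the gain shrinks as confidence accumulates, which is why such a filter can, in effect, replace a noisy on-line analyzer.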
Abstract:
Background: The importance of appropriate normalization controls in quantitative real-time polymerase chain reaction (qPCR) experiments has become more apparent as the number of biological studies using this methodology has increased. In developing a system to study gene expression from transiently transfected plasmids, it became clear that normalization using chromosomally encoded genes is not ideal, as it does not take into account the transfection efficiency and the significantly lower expression levels of the plasmids. We have developed and validated a normalization method for qPCR using a co-transfected plasmid. Results: The best chromosomal gene for normalization in the presence of the transcriptional activators used in this study, cadmium, dexamethasone, forskolin and phorbol 12-myristate 13-acetate was first identified. qPCR data was analyzed using geNorm, Normfinder and BestKeeper. Each software application was found to rank the normalization controls differently with no clear correlation. Including a co-transfected plasmid encoding the Renilla luciferase gene (Rluc) in this analysis showed that its calculated stability was not as good as the optimised chromosomal genes, most likely as a result of the lower expression levels and transfection variability. Finally, we validated these analyses by testing two chromosomal genes (B2M and ActB) and a co-transfected gene (Rluc) under biological conditions. When analyzing co-transfected plasmids, Rluc normalization gave the smallest errors compared to the chromosomal reference genes. Conclusions: Our data demonstrates that transfected Rluc is the most appropriate normalization reference gene for transient transfection qPCR analysis; it significantly reduces the standard deviation within biological experiments as it takes into account the transfection efficiencies and has easily controllable expression levels.
This improves reproducibility, data validity and most importantly, enables accurate interpretation of qPCR data. © 2010 Jiwaji et al; licensee BioMed Central Ltd.
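The normalization itself is simple arithmetic on quantification-cycle (Ct) values; a common form is the 2^-ΔCt method, shown here with Rluc as the reference gene. The Ct values are hypothetical, and equal amplification efficiencies for target and reference are assumed:

```python
def relative_expression(ct_target, ct_reference):
    """Relative expression by the 2^-dCt method.

    Each extra Ct cycle means roughly half as much starting template
    (assuming 100 % amplification efficiency), so normalizing subtracts
    the reference-gene Ct (here the co-transfected Rluc) from the target Ct.
    """
    return 2.0 ** -(ct_target - ct_reference)
```

A target crossing threshold two cycles after Rluc thus has one quarter of the reference's relative expression, and variation in transfection efficiency cancels because it shifts both Ct values together.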
Abstract:
This paper will seek to explicate the changes in the New Zealand health sector informed by the concepts of problematization, inscription and the construction of networks (Callon, 1986; Latour, 1987, 1993). This will involve applying a framework of interpretation based on the concepts of Latour's sociology of translation. Material on problematization and inscription will be incorporated into the paper in order to provide an explanatory frame of reference which will enable us to make sense of the processes of change in the New Zealand health sector. The sociology of translation will be used to explain the processes which underlie the changes and will be used to capture effects, such as changes in policy and structure, producing new networks within which 'allies' could be enrolled in support of the health reforms.
Abstract:
After its privatization in 1989, the water and sewerage industry of England and Wales faced a new regulatory régime and implemented a substantial capital investment program aimed at improving water and environmental standards. A new RPI + K regulatory pricing system was designed to compensate the industry for its increased capital costs, encourage increased efficiency, and maintain fair prices for customers. This paper evaluates how successful privatization and the resulting system of economic regulation have been. Estimates of productivity growth, derived with quality adjusted output indices, suggest that despite reductions in labor usage, total factor productivity growth has not improved since privatization. Moreover, total price performance indices reveal that increases in output prices have outstripped increases in input costs, a trend which is largely responsible for the increase in economic profits that has occurred since privatization.
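The headline comparisons reduce to simple index ratios: productivity growth compares a (quality-adjusted) output quantity index with an input quantity index, and the total price performance index compares output price rises with input cost rises. A minimal sketch with hypothetical index values, not the paper's full index-number methodology:

```python
def tfp_growth(output_index, input_index):
    """Total factor productivity growth, as a fraction.

    Ratio of a (quality-adjusted) output quantity index to an input
    quantity index, minus one: positive means outputs grew faster
    than inputs.
    """
    return output_index / input_index - 1.0

def price_performance(output_price_index, input_price_index):
    """Total price performance, as a fraction.

    Positive values mean output prices outstripped input costs, the
    pattern the paper links to rising economic profits.
    """
    return output_price_index / input_price_index - 1.0
```

For instance, output prices up 30 % against input costs up 10 % over the same period give a positive price performance of about 18 %.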
Abstract:
The research carried out in this thesis was mainly concerned with the effects of large induction motors and their transient performance in power systems. Computer packages using the three phase co-ordinate frame of reference were developed to simulate the induction motor transient performance. A technique using matrix algebra was developed to allow extension of the three phase co-ordinate method to analyse asymmetrical and symmetrical faults on both sides of the three phase delta-star transformer which is usually required when connecting large induction motors to the supply system. System simulation, applying these two techniques, was used to study the transient stability of a power system. The response of a typical system, loaded with a group of large induction motors, two three-phase delta-star transformers, a synchronous generator and an infinite system was analysed. The computer software developed to study this system has the advantage that different types of fault at different locations can be studied by simple changes in input data. The research also involved investigating the possibility of using different integration routines such as Runge-Kutta-Gill, Runge-Kutta-Fehlberg and the Predictor-Corrector methods. The investigation enables the reduction of computation time, which is necessary when solving the induction motor equations expressed in terms of the three phase variables. The outcome of this investigation was utilised in analysing an introductory model (containing only minimal control action) of an isolated system having a significant induction motor load compared to the size of the generator energising the system.
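Of the integration routines named above, the classical 4th-order Runge-Kutta step is the simplest to sketch; Runge-Kutta-Gill and Runge-Kutta-Fehlberg are variants with different coefficient tableaus (Fehlberg additionally embeds an error estimate for step-size control, which this sketch omits):

```python
def rk4_step(f, t, y, h):
    """One classical 4th-order Runge-Kutta step for dy/dt = f(t, y).

    Four slope evaluations are blended with weights 1, 2, 2, 1 to give
    a local truncation error of order h**5.
    """
    k1 = f(t, y)
    k2 = f(t + h / 2.0, y + h / 2.0 * k1)
    k3 = f(t + h / 2.0, y + h / 2.0 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
```

Integrating dy/dt = y from y(0) = 1 over [0, 1] in ten steps reproduces e to about six figures, which illustrates why higher-order steps cut computation time relative to smaller-step low-order methods.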
Abstract:
Groupe Spécial Mobile (GSM) has been developed as the pan-European second generation of digital mobile systems. GSM operates in the 900 MHz frequency band and employs digital technology instead of the analogue technology of its predecessors. Digital technology enables the GSM system to operate in much smaller zones in comparison with the analogue systems. The GSM system will offer greater roaming facilities to its subscribers, extended throughout the countries that have installed the system. The GSM system could be seen as a further enhancement to European integration. GSM has adopted a contention-based protocol for multipoint-to-point transmission. In particular, the slotted-ALOHA medium access protocol is used to coordinate the transmission of the channel request messages between the scattered mobile stations. Collisions still happen when more than one mobile station having the same random reference number attempts to transmit on the same time-slot. In this research, a modified version of this protocol has been developed in order to reduce the number of collisions and hence increase the random access channel throughput compared to the existing protocol. The performance evaluation of the protocol has been carried out using simulation methods. Due to the growing demand for mobile radio telephony as well as for data services, optimal usage of the scarce radio spectrum is becoming increasingly important. In this research, a protocol has been developed whereby the number of transmitted information packets over the GSM system is increased without any additional increase of the allocated radio spectrum. Simulation results are presented to show the improvements achieved by the proposed protocol. Cellular mobile radio networks commonly respond to an increase in the service demand by using smaller coverage areas. As a result, the volume of the signalling exchanges increases.
In this research, a proposal for interconnecting the various entities of the mobile radio network over the future broadband networks based on the IEEE 802.6 Metropolitan Area Network (MAN) is outlined. Simulation results are presented to show the benefits achieved by interconnecting these entities over the broadband networks.
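The contention behaviour described above, where a slot is wasted whenever more than one station transmits in it, is easy to reproduce in a small Monte Carlo model. This sketch simulates generic slotted ALOHA with a fixed per-slot transmit probability; it is not the modified GSM random-access protocol the thesis develops:

```python
import random

def slotted_aloha_throughput(n_stations, p, n_slots, seed=0):
    """Simulate slotted ALOHA and return the fraction of successful slots.

    Each of n_stations transmits in a slot independently with probability
    p; a slot succeeds only if exactly one station transmits (two or more
    collide, zero leaves the slot idle).
    """
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_slots):
        transmitters = sum(rng.random() < p for _ in range(n_stations))
        if transmitters == 1:
            successes += 1
    return successes / n_slots
```

The simulated throughput tracks the analytic value n·p·(1-p)^(n-1), which peaks near 1/e for large populations; a modified protocol earns its keep by pushing the collision rate below this baseline.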
Abstract:
The present work describes the development of a proton induced X-ray emission (PIXE) analysis system, especially designed and built for routine quantitative multi-elemental analysis of a large number of samples. The historical and general developments of the analytical technique and the physical processes involved are discussed. The philosophy, design, constructional details and evaluation of a versatile vacuum chamber, an automatic multi-sample changer, an on-demand beam pulsing system and ion beam current monitoring facility are described. The system calibration using thin standard foils of Si, P, S, Cl, K, Ca, Ti, V, Fe, Cu, Ga, Ge, Rb, Y and Mo was undertaken at proton beam energies of 1 to 3 MeV in steps of 0.5 MeV energy and compared with theoretical calculations. An independent calibration check using bovine liver Standard Reference Material was performed. The minimum detectable limits have been experimentally determined at detector positions of 90° and 135° with respect to the incident beam for the above range of proton energies as a function of atomic number Z. The system has detection limits of typically 10⁻⁷ to 10⁻⁹ g for elements 14
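Detection limits of the order quoted follow from the usual counting-statistics criterion: an element counts as detectable once its peak rises about three standard deviations above the background under the peak. A sketch with hypothetical counts and sensitivity, not the thesis's measured values:

```python
def minimum_detectable_mass(background_counts, sensitivity):
    """Minimum detectable mass by the 3*sqrt(background) criterion.

    background_counts: counts in the background under the peak region.
    sensitivity: calibration factor in counts per gram of the element.
    The Poisson standard deviation of the background is its square root,
    so the smallest detectable peak is three such deviations.
    """
    return 3.0 * background_counts ** 0.5 / sensitivity
```

With 400 background counts and a sensitivity of 6 × 10⁸ counts per gram, the limit comes out at 10⁻⁷ g, the top of the quoted range.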
Abstract:
Coleridge, looking back at the end of the ‘long eighteenth century’, remarked that the whole of natural philosophy had been ‘electrified’ by advances in the understanding of electrical phenomena. In this paper I trace the way in which these advances affected contemporary ‘neurophysiology.’ At the beginning of the long eighteenth century, neurophysiology (in spite of Swammerdam’s and Glisson’s demonstrations to the contrary) was still understood largely in terms of hollow nerves and animal spirits. At the end of that period the researches of microscopists and electricians had convinced most medical men that the old understanding had to be replaced. Walsh, Patterson, John Hunter and others had described the electric organs of electric fish. Gray and Nollet had demonstrated that electricity was not merely static, but flowed. Franklin had alerted the world to atmospheric electricity. Galvani’s frog experiments were widely known. Volta had invented his ‘pile.’ But did ‘animal electricity’ exist and was it identical to the electricity physicists studied in the inanimate world? Was the brain a gland, as Malpighi’s researches seemed to confirm, and did it secrete electricity into the nervous system? The Monros (primus and secundus), William Cullen, Luigi Galvani, Alessandro Volta, Erasmus Darwin, Luigi Rolando and François Baillarger all had their own ideas. This paper reviews these ‘long-eighteenth century’ controversies with special reference to the Edinburgh medical school and the interaction between neurophysiology and physics.
Abstract:
From an examination of the literature relating to the catalytic steam reforming of hydrocarbons, it is concluded that the kinetics of high pressure reforming, particularly steam-methane reforming, has received relatively little attention. Therefore because of the increasing availability of natural gas in the U.K., this system was considered worthy of investigation. An examination of the thermodynamics relating to the equilibria of steam-hydrocarbon reforming is described. The reactions most likely to have influence over the process are established and from these a computer program was written to calculate equilibrium compositions. A means of presenting such data in a graphical form for ranges of the operating variables is given, and also an operating chart which may be used to quickly check feed ratios employed on a working naphtha reforming plant is presented. For the experimental kinetic study of the steam-methane system, cylindrical pellets of ICI 46-1 nickel catalyst were used in the form of a rod catalyst. The reactor was of the integral type and a description is given with the operating procedures and analytical method used. The experimental work was divided into two parts, qualitative and quantitative. In the qualitative study the various reaction steps are examined in order to establish which one is rate controlling. It is concluded that the effects of film diffusion resistance within the conditions employed are negligible. In the quantitative study it was found that at 250 psig and 650 °C the steam-methane reaction is much slower than the CO shift reaction and is rate controlling. Two rate mechanisms and accompanying kinetic rate equations are derived, both of which represent 'chemical' steps in the reaction and are considered of equal merit. However the possibility of a dual control involving 'chemical' and pore diffusion resistances is also expressed.
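Calculating equilibrium compositions of the kind the program produced amounts to solving the equilibrium-constant expression for the extent of reaction. For the single reforming reaction CH4 + H2O ⇌ CO + 3H2 (ignoring the shift reaction, with a feed of 1 mol CH4 plus a chosen steam ratio, and an arbitrary illustrative value of K), simple bisection suffices:

```python
def reforming_extent(K, P, steam_ratio, tol=1e-10):
    """Equilibrium extent x of CH4 + H2O <=> CO + 3 H2, by bisection.

    Feed is 1 mol CH4 and steam_ratio mol H2O at total pressure P
    (in whatever units K was evaluated for). At extent x the mole
    numbers are CH4: 1-x, H2O: s-x, CO: x, H2: 3x, total 1+s+2x,
    so the pressure quotient rises monotonically from 0 to infinity
    as x runs over (0, min(1, s)), guaranteeing a bracketed root.
    """
    def f(x):
        total = 1.0 + steam_ratio + 2.0 * x
        q = (x * (3.0 * x) ** 3) / ((1.0 - x) * (steam_ratio - x))
        return q * (P / total) ** 2 - K

    lo, hi = 1e-12, min(1.0, steam_ratio) - 1e-12
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

At K = 1 and unit pressure with a 3:1 steam ratio the conversion lands around 80 %; sweeping K (i.e. temperature) and P over ranges of operating variables yields exactly the kind of graphical equilibrium data the abstract describes.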
Abstract:
The IRDS standard is an international standard produced by the International Organisation for Standardisation (ISO). In this work the process for producing standards in formal standards organisations, for example the ISO, and in more informal bodies, for example the Object Management Group (OMG), is examined. This thesis examines previous models and classifications of standards. The previous models and classifications are then combined to produce a new classification. The IRDS standard is then placed in a class in the new model as a reference anticipatory standard. Anticipatory standards are standards which are developed ahead of the technology in order to attempt to guide the market. The diffusion of the IRDS is traced over a period of eleven years. The economic conditions which affect the diffusion of standards are examined, particularly the economic conditions which prevail in compatibility markets such as the IT and ICT markets. Additionally the consequences of the introduction of gateway or converter devices into a market where a standard has not yet been established is examined. The IRDS standard did not have an installed base and this hindered its diffusion. The thesis concludes that the IRDS standard was overtaken by new developments such as object oriented technologies and middleware. This was partly because of the slow development process of developing standards in traditional organisations which operate on a consensus basis and partly because the IRDS standard did not have an installed base. Also the rise and proliferation of middleware products resulted in exchange mechanisms becoming dominant rather than repository solutions. The research method used in this work is a longitudinal study of the development and diffusion of the ISO/IEC IRDS standard. The research is regarded as a single case study and follows the interpretative epistemological point of view.