942 results for Logging
Abstract:
The objective of this study was to design, construct, commission and operate a laboratory-scale gasifier system that could be used to investigate the parameters that influence the gasification process. The gasifier is of the open-core variety and is fabricated from 7.5 cm bore quartz glass tubing. Gas cleaning is by a centrifugal contacting scrubber, with the product gas being flared. The system employs an on-line dedicated gas analysis system, monitoring the levels of H2, CO, CO2 and CH4 in the product gas. The gas composition data, as well as the gas flowrate, temperatures throughout the system and pressure data, are recorded using a BBC microcomputer-based data-logging system. Ten runs have been performed using the system, of which six were predominantly commissioning runs. The main emphasis in the commissioning runs was placed on the gas clean-up, the product gas cleaning and the reactor bed temperature measurement. The reaction was observed to occur in a narrow band, about 3 to 5 particle diameters thick. Initially the fuel was pyrolysed, with the volatiles produced being combusted and providing the energy to drive the process; the char product was then gasified by reaction with the pyrolysis gases. Normally the gasifier is operated with the reaction zone supported on a bed of char, although it has been operated for short periods without a char bed. At steady state the depth of char remains constant, but by adjusting the air inlet rate it has been shown that the depth of char can be increased or decreased. Increasing the depth of the char bed has been shown to effect some improvement in the product gas quality.
Abstract:
Computer integrated monitoring is a very large area of engineering in which on-line, real-time data acquisition with the aid of sensors solves many problems in the manufacturing industry, in contrast to the old method of data logging followed by graphical analysis. The raw data collected in this way is, however, useless in the absence of a proper computerized management system. In the past, the transfer of data between the management and shop-floor processes has been impossible unless all the computers in the system were totally compatible with each other. This limits the efficiency of such systems because they become governed by the limitations of the computers. General Motors of the U.S.A. has recently started research on a new standard called the Manufacturing Automation Protocol (MAP), which is expected to allow data transfer between different types of computers; it is still at an early stage of development and is currently very expensive. This research programme shows how such a shop-floor data acquisition system and a complete management system running on entirely different computers can be integrated into a single system, achieving data transfer using a cheaper but superior alternative to MAP. Standard communication character sets and hardware, such as ASCII and UARTs, are used, yet the technique is powerful enough that totally incompatible computers are shown to run different programs (in different languages) simultaneously while receiving data from each other and processing it in their own CPUs with no human intervention.
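As an illustration of the kind of character-based serial link this abstract describes, the sketch below is a hypothetical example (not the author's code) using Python and the pyserial package: readings are framed as plain ASCII text over a UART so that any machine with a serial port can parse them, regardless of CPU or operating system. Port names and the frame format are assumptions.

```python
# Hypothetical sketch: ASCII-over-UART exchange between otherwise
# incompatible machines, in the spirit of the approach described above.
# Requires the pyserial package (imported as "serial").
import serial


def send_reading(port: str, tag: str, value: float) -> None:
    """Frame a reading as plain ASCII text and send it over a UART."""
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        # A simple "TAG=VALUE\n" frame: any machine that can read ASCII
        # characters from a serial line can parse this, whatever its CPU.
        link.write(f"{tag}={value:.3f}\n".encode("ascii"))


def receive_reading(port: str) -> tuple[str, float] | None:
    """Read one ASCII frame from the UART and decode it."""
    with serial.Serial(port, baudrate=9600, timeout=1) as link:
        line = link.readline().decode("ascii", errors="ignore").strip()
        if "=" not in line:
            return None          # incomplete or malformed frame
        tag, value = line.split("=", 1)
        return tag, float(value)
```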
Abstract:
A mathematical model has been developed for predicting the spectral distribution of solar radiation incident on a horizontal surface. The solar spectrum in the wavelength range 0.29 to 4.0 micrometres has been divided into 144 intervals. Two variables in the model are the atmospheric water vapour content and the atmospheric turbidity. After allowing for absorption and scattering in the atmosphere, the spectral intensities of the direct and diffuse components of radiation are computed. When the predicted radiation levels are compared with the measured values for the total radiation and the values measured with the glass filters RG715, RG630 and OG530, close agreement (± 5%) has been achieved under clear-sky conditions. A solar radiation measuring facility, close to the centre of Birmingham, has been set up utilising a microcomputer-based data-logging system. A suite of computer programs in the BASIC programming language has been developed and extensively tested for solar radiation data logging, analysis and plotting. Two commonly used instruments, the Eppley PSP pyranometer and the Kipp and Zonen CM5 pyranometer, have been compared under different experimental conditions. Three models for computing the inclined-plane irradiation, using total and diffuse radiation on a horizontal surface, have been tested for Birmingham. The anisotropic all-sky model, proposed by Klucher, provides good agreement between the measured and the predicted radiation levels. Measurements of solar spectral distribution, using glass filters, are also reported for a number of inclined surfaces facing south.
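The band-by-band attenuation that such spectral models apply can be summarised in the generic form below. This is a sketch only: the specific transmittance parameterisations used in the thesis are not given in the abstract, so the symbols and functional forms here are assumptions.

```latex
% Generic direct-beam spectral attenuation for band i (assumed form):
% I_{0,i}: extraterrestrial spectral irradiance, m: relative air mass,
% \theta_z: solar zenith angle, \beta: turbidity, w: precipitable water;
% \tau_{R}, \tau_{a}, \tau_{w}, \tau_{g}: band transmittances for Rayleigh
% scattering, aerosols, water vapour and mixed-gas absorption.
\[
  I_{b,i} \;=\; I_{0,i}\,
    \tau_{R,i}(m)\,\tau_{a,i}(m,\beta)\,
    \tau_{w,i}(m,w)\,\tau_{g,i}(m)\,
    \cos\theta_z
\]
% Total irradiance on the horizontal surface: sum of direct and diffuse
% contributions over the 144 wavelength intervals of width \Delta\lambda_i.
\[
  I_{\text{total}} \;=\; \sum_{i=1}^{144}
    \bigl(I_{b,i} + I_{d,i}\bigr)\,\Delta\lambda_i
\]
```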
Abstract:
The aim of this Interdisciplinary Higher Degrees project was the development of a high-speed method of photometrically testing vehicle headlamps, based on the use of image processing techniques, for Lucas Electrical Limited. Photometric testing involves measuring the illuminance produced by a lamp at certain points in its beam distribution. Headlamp performance is best represented by an iso-lux diagram, showing illuminance contours, produced from a two-dimensional array of data. Conventionally, the tens of thousands of measurements required are made using a single stationary photodetector and a two-dimensional mechanical scanning system which enables a lamp's horizontal and vertical orientation relative to the photodetector to be changed. Even using motorised scanning and computerised data-logging, the data acquisition time for a typical iso-lux test is about twenty minutes. A detailed study was made of the concept of using a video camera and a digital image processing system to scan and measure a lamp's beam without the need for the time-consuming mechanical movement. Although the concept was shown to be theoretically feasible, and a prototype system designed, it could not be implemented because of the technical limitations of commercially available equipment. An alternative high-speed approach was developed, however, and a second prototype system designed. The proposed arrangement again uses an image processing system, but in conjunction with a one-dimensional array of photodetectors and a one-dimensional mechanical scanning system in place of a video camera. This system can be implemented using commercially available equipment and, although not entirely eliminating the need for mechanical movement, greatly reduces the amount required, resulting in a predicted data acquisition time of about twenty seconds for a typical iso-lux test. As a consequence of the work undertaken, the company initiated an £80,000 programme to implement the system proposed by the author.
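For illustration only (this is not the project's implementation), an iso-lux diagram of the kind described can be rendered from a two-dimensional illuminance array in a few lines of Python with NumPy and Matplotlib; the array layout, angle grids and contour levels below are hypothetical.

```python
# Hypothetical sketch: rendering an iso-lux diagram from a 2-D array of
# illuminance measurements (lux), one value per (horizontal, vertical) angle.
import numpy as np
import matplotlib.pyplot as plt


def plot_iso_lux(illuminance: np.ndarray,
                 h_deg: np.ndarray, v_deg: np.ndarray) -> None:
    """Draw illuminance contours over horizontal/vertical beam angles.

    `illuminance` is expected with shape (len(v_deg), len(h_deg)).
    """
    levels = [1, 2, 5, 10, 20, 50]           # example lux levels only
    cs = plt.contour(h_deg, v_deg, illuminance, levels=levels)
    plt.clabel(cs, inline=True, fontsize=8)  # label each contour in lux
    plt.xlabel("Horizontal angle (degrees)")
    plt.ylabel("Vertical angle (degrees)")
    plt.title("Iso-lux diagram")
    plt.show()
```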
Abstract:
A combination of experimental methods was applied at a clogged, horizontal subsurface flow (HSSF) municipal wastewater tertiary treatment wetland (TW) in the UK, to quantify the extent of surface and subsurface clogging which had resulted in undesirable surface flow. The three-dimensional hydraulic conductivity profile was determined using a purpose-made device which recreates the constant-head permeameter test in situ. The hydrodynamic pathways were investigated by performing dye tracing tests with Rhodamine WT and a novel multi-channel, data-logging, flow-through fluorimeter which allows synchronous measurements to be taken from a matrix of sampling points. Hydraulic conductivity varied in all planes, with the lowest measurement of 0.1 m d⁻¹ corresponding to the surface layer at the inlet, and the maximum measurement of 1550 m d⁻¹ located at a 0.4 m depth at the outlet. According to the dye tracing results, the region where the overland flow ceased received five times the average flow, which then vertically short-circuited below the rhizosphere. The tracer breakthrough curve obtained from the outlet showed that this preferential flow-path accounted for approximately 80% of the overall flow and arrived 8 h before a distinctly separate secondary flow-path. The overall volumetric efficiency of the clogged system was 71%, and the hydrology was simulated using a dual-path, dead-zone storage model. It is concluded that uneven inlet distribution, continuous surface loading and high rhizosphere resistance are responsible for the clog formation observed in this system. The average inlet hydraulic conductivity was 2 m d⁻¹, suggesting that current European design guidelines, which predict that the system will reach an equilibrium hydraulic conductivity of 86 m d⁻¹, do not adequately describe the hydrology of mature systems.
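For reference, the constant-head permeameter relation that such an in-situ device reproduces follows directly from Darcy's law; the symbols below are the standard ones, not taken from the paper.

```latex
% Constant-head permeameter (Darcy's law):
% Q: steady volumetric flow rate through the sample,
% A: cross-sectional area of the test cell,
% L: sample length over which the head loss \Delta h is imposed.
\[
  K \;=\; \frac{Q\,L}{A\,\Delta h}
\]
% K has units of velocity; multiplying a value in m\,s^{-1} by
% 86\,400\ \text{s\,d}^{-1} gives the m\,d^{-1} figures quoted above.
```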
Abstract:
Most empirical work in economic growth assumes either a Cobb–Douglas production function expressed in logs or a log-approximated constant elasticity of substitution (CES) specification. Estimates from each are likely biased due to logging the model, and the latter can also suffer from approximation bias. We illustrate this with a successful replication of Masanjala and Papageorgiou (The Solow model with CES technology: nonlinearities and parameter heterogeneity, Journal of Applied Econometrics 2004; 19: 171–201) and then estimate both models in levels to avoid these biases. Our estimation in levels gives results in line with conventional wisdom.
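To make the source of the problem concrete, the two specifications can be written as below. This is a standard textbook rendering, not reproduced from the paper: taking logs changes what the error term and the conditional expectation refer to, so estimates of the logged model need not recover the parameters of the model in levels, and the log-approximated CES adds a further Taylor-approximation error.

```latex
% Cobb--Douglas in levels and its logged form (error term \varepsilon_i):
\[
  Y_i = A\,K_i^{\alpha} L_i^{1-\alpha} e^{\varepsilon_i}
  \quad\Longrightarrow\quad
  \ln Y_i = \ln A + \alpha \ln K_i + (1-\alpha)\ln L_i + \varepsilon_i
\]
% CES production function; its log form is usually estimated only after a
% Taylor approximation around \rho = 0, which introduces approximation bias:
\[
  Y_i = A\bigl[\delta K_i^{-\rho} + (1-\delta) L_i^{-\rho}\bigr]^{-1/\rho}
        e^{\varepsilon_i}
\]
% Because E[\ln Y_i] \neq \ln E[Y_i] (Jensen's inequality), fitting the
% logged model and fitting the model in levels are not equivalent exercises.
```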
Abstract:
Evaluations of semantic search systems are generally small scale and ad hoc due to the lack of appropriate resources such as test collections, agreed performance criteria and independent judgements of performance. By analysing our work in building and evaluating semantic tools over the last five years, we conclude that the growth of the semantic web led to an improvement in the available resources and the consequent robustness of performance assessments. We propose two directions for continuing evaluation work: the development of extensible evaluation benchmarks and the use of logging parameters for evaluating individual components of search systems.
Abstract:
Research on the simulation and implementation of membrane systems is very recent. The literature frequently gathers new publications on software/hardware, data structures and algorithms for implementing P system evolution. In this context, this work presents a framework whose goal is to make the tasks of researchers in this field easier. It establishes the set of cooperating classes that form a reusable, flexible design that can be customized with new data structures and algorithms. Moreover, it includes customizable services for correcting, monitoring and logging the evolution, and for editing, recovering, automatically generating, persisting and visualizing P systems.
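A minimal sketch of the kind of cooperating-class design the abstract describes, with the evolution algorithm and the logging service as interchangeable components. The class and method names are hypothetical, not the framework's actual API, and Python is used here purely for illustration.

```python
# Hypothetical sketch of a pluggable P-system simulation framework:
# data structures, evolution algorithms and logging services can be
# swapped in without changing the driver code.
from abc import ABC, abstractmethod


class Membrane:
    """Toy membrane: a multiset of objects plus nested child membranes."""
    def __init__(self, objects: dict[str, int], children=None):
        self.objects = objects
        self.children = children or []


class EvolutionAlgorithm(ABC):
    @abstractmethod
    def step(self, skin: Membrane) -> Membrane:
        """Apply one evolution step and return the new configuration."""


class EvolutionLogger(ABC):
    @abstractmethod
    def record(self, step_no: int, skin: Membrane) -> None:
        """Persist or display the configuration after a step."""


class Simulator:
    """Driver that cooperates with whatever algorithm/logger it is given."""
    def __init__(self, algorithm: EvolutionAlgorithm, logger: EvolutionLogger):
        self.algorithm = algorithm
        self.logger = logger

    def run(self, skin: Membrane, steps: int) -> Membrane:
        for step_no in range(1, steps + 1):
            skin = self.algorithm.step(skin)   # customizable evolution
            self.logger.record(step_no, skin)  # customizable logging
        return skin
```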
Abstract:
This paper presents an effective decision-making system for leak detection based on multiple generalized linear models and clustering techniques. The training data for the proposed decision system are obtained from an experimental, fully operational pipeline distribution system. The system is also equipped with data logging for three variables: inlet pressure, outlet pressure, and outlet flow. The experimental setup is designed so that multiple operational conditions of the distribution system, including multiple pressures and flows, can be obtained. We then statistically tested and showed that the pressure and flow variables can be used as a leak signature under the designed multi-operational conditions. It is then shown that detecting leaks by training and testing the proposed multi-model decision system with prior data clustering, under multi-operational conditions, produces better recognition rates than training based on a single-model approach. This decision system is then equipped with the estimation of confidence limits, and a method is proposed for using these confidence limits to obtain more robust leakage recognition results.
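A minimal sketch of the multi-model idea, assuming scikit-learn and made-up variable names: the operating data are first clustered on the three logged variables, and a separate classifier (logistic regression, one member of the generalized linear model family) is then trained for each cluster and used for samples routed to that cluster. This is an illustration of the structure, not the paper's exact models.

```python
# Hypothetical sketch: cluster the operating conditions first, then fit one
# model per cluster, and route each new sample to its cluster's model.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression


def fit_multi_model(X: np.ndarray, y: np.ndarray, n_clusters: int = 3):
    """X columns: inlet pressure, outlet pressure, outlet flow; y: leak flag."""
    clusterer = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(X)
    models = {}
    for c in range(n_clusters):
        mask = clusterer.labels_ == c
        # One GLM-style classifier per operating condition (cluster).
        models[c] = LogisticRegression(max_iter=1000).fit(X[mask], y[mask])
    return clusterer, models


def predict_leak(clusterer, models, X_new: np.ndarray) -> np.ndarray:
    """Assign each sample to a cluster, then use that cluster's model."""
    clusters = clusterer.predict(X_new)
    return np.array([models[c].predict(x.reshape(1, -1))[0]
                     for c, x in zip(clusters, X_new)])
```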
Abstract:
This paper describes the use of Bluetooth and Java-based technologies in developing a multi-player mobile game in ubiquitous computing, which strongly depends on automatic contextual reconfiguration and context-triggered actions. Our investigation focuses on an extended form of ubiquitous computing which game software developers utilize to develop games for players. We have developed an experimental ubiquitous computing application that provides context-aware services to the game server and game players in a mobile distributed computing system. Contextual services provide useful information in a context-aware system. However, designing a context-aware game is still a daunting task, and much theoretical and practical research remains to be done to reach the ubiquitous computing era. In this paper, we present the overall architecture and discuss, in detail, the implementation steps taken to create a Bluetooth- and Java-based context-aware game. We develop a multi-player game server and prepare the client and server code in ubiquitous computing, providing adaptive routines to handle connection information requests, logging, and context formatting and delivery for automatic contextual reconfiguration and context-triggered actions.
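The original work uses Java and Bluetooth; the hypothetical sketch below uses plain Python, for consistency with the other examples in this listing, simply to illustrate what context-triggered actions look like on the server side. The field names, trigger conditions and logging call are all invented for illustration.

```python
# Hypothetical sketch: a server keeps each player's last reported context,
# logs every update, and fires registered actions whenever an update
# matches a trigger (automatic contextual reconfiguration).
from typing import Callable


class ContextAwareServer:
    def __init__(self):
        self.contexts: dict[str, dict] = {}   # player id -> latest context
        self.triggers: list[tuple[Callable[[dict], bool],
                                  Callable[[str, dict], None]]] = []

    def on_context(self, predicate, action) -> None:
        """Register a context-triggered action."""
        self.triggers.append((predicate, action))

    def update_context(self, player_id: str, context: dict) -> None:
        """Log the new context and fire any matching triggers."""
        self.contexts[player_id] = context
        print(f"[log] {player_id}: {context}")   # stand-in for real logging
        for predicate, action in self.triggers:
            if predicate(context):
                action(player_id, context)


# Example: reconfigure the game when a player drops out of radio range.
server = ContextAwareServer()
server.on_context(lambda ctx: ctx.get("in_range") is False,
                  lambda pid, ctx: print(f"pausing {pid}: out of range"))
server.update_context("player1", {"in_range": False, "zone": "lobby"})
```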
Abstract:
This article provides a comprehensive review of the literature on the theoretical aspects of sustainable consumption. The conditions for consumers’ social responsibility and the formation of environmentally conscious behavior patterns will also be discussed, along with possible methods for motivating behavioral changes. The authors have completed a primary research study with the purpose of surveying environmentally conscious consumption patterns in Hungary. They also examined how the provision of appropriate information and the raising of awareness might encourage sustainable consumption. According to their findings, the respondents’ knowledge of environmentally conscious behavior was rather limited, and reinforcement was needed in identifying appropriate activity alternatives. This paper provides a summary of the qualitative research phase, which employed in-depth interviews, logging and focus groups. The consecutive application of these methods enabled the authors to keep track of the process and the consequences of raising awareness.
Abstract:
Rainforests are situated at low latitudes, where forests enjoy steady and strong radiation. Biodiversity in rainforests is very high, for historical and climatic reasons. The number of species is very high and tends to increase with precipitation and decrease with seasonality. Disturbance, soil fertility and forest stature also influence species richness, and the high turnover of species contributes to diversity. Field observations and studies have revealed that large-scale deforestation could alter the regional and global climate significantly. Deforestation alters the surface albedo, which leads to climate change. Regional land use contributes to climate change through the surface-energy budget, as well as the carbon cycle. Forest fragmentation, logging, overhunting, fire and expanding agriculture threaten biodiversity. The area covered by rainforest has shrunk significantly in recent decades. It is hard to protect the forests because of the growing demand for agricultural land and forest-derived products. Most measures have proved ineffective at slowing the destruction; hence, more forest will be lost in the future. Conservationists should also take secondary forests into consideration, because their biodiversity can be high enough to make them worth protecting.
Abstract:
The Internet has become an integral part of our nation’s critical socio-economic infrastructure. With its heightened use and growing complexity, however, organizations are at greater risk of cyber crimes. To aid in the investigation of crimes committed on or via the Internet, a network forensics analysis tool pulls together the needed digital evidence. It provides a platform for performing deep network analysis by capturing, recording and analyzing network events to find the source of a security attack or other information security incidents. Existing network forensics work has mostly focused on the Internet and fixed networks. But the exponential growth and use of wireless technologies, coupled with their unprecedented characteristics, necessitates the development of new network forensic analysis tools. This dissertation fostered the emergence of a new research field in cellular and ad-hoc network forensics. It was one of the first works to identify this problem and offer fundamental techniques and tools that laid the groundwork for future research. In particular, it introduced novel methods to record network incidents and to report logged incidents. For recording incidents, location is considered essential to documenting network incidents. However, in network topology spaces, location cannot be measured due to the absence of a ‘distance metric’. Therefore, a novel solution was proposed to label the locations of nodes within network topology spaces, and then to authenticate the identity of nodes in ad hoc environments. For reporting logged incidents, a novel technique based on Distributed Hash Tables (DHT) was adopted. Although the direct use of DHTs for reporting logged incidents would result in uncontrollably recursive traffic, a new mechanism was introduced that overcomes this recursion. These logging and reporting techniques aided forensics over cellular and ad-hoc networks, which in turn increased the ability to track and trace attacks to their source. These techniques are a starting point for further research and development that would result in equipping future ad hoc networks with forensic components to complement existing security mechanisms.
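The basic DHT idea behind such incident reporting can be sketched as follows; this is a hypothetical, simplified illustration (hash an incident to a key, store the report at the node responsible for that key), not the dissertation's actual protocol or its anti-recursion mechanism.

```python
# Hypothetical sketch of DHT-style incident reporting: each logged incident
# is hashed to a key on an identifier ring, and the node whose identifier
# succeeds that key stores the report.
import hashlib


def dht_key(data: str, bits: int = 32) -> int:
    """Map arbitrary data to a point on the DHT's identifier ring."""
    return int(hashlib.sha1(data.encode()).hexdigest(), 16) % (2 ** bits)


def responsible_node(key: int, node_ids: list[int], bits: int = 32) -> int:
    """Pick the node that succeeds the key on the identifier ring."""
    ring = 2 ** bits
    return min(node_ids, key=lambda n: (n - key) % ring)


def report_incident(store: dict[int, list[str]], node_ids: list[int],
                    incident: str) -> int:
    """Store an incident report at its responsible node; return that node."""
    node = responsible_node(dht_key(incident), node_ids)
    store.setdefault(node, []).append(incident)
    return node
```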
Abstract:
Significant improvements have been made in estimating gross primary production (GPP), ecosystem respiration (R), and net ecosystem production (NEP) from diel, “free-water” changes in dissolved oxygen (DO). Here we evaluate some of the assumptions and uncertainties that are still embedded in the technique and provide guidelines on how to estimate reliable metabolic rates from high-frequency sonde data. True whole-system estimates are often not obtained because measurements reflect an unknown zone of influence which varies over space and time. A minimum logging frequency of 30 min was sufficient to capture metabolism at the daily time scale. Higher sampling frequencies capture additional pattern in the DO data, primarily related to physical mixing. Causes behind the often large daily variability are discussed and evaluated for an oligotrophic and a eutrophic lake. Despite a 3-fold higher day-to-day variability in absolute GPP rates in the eutrophic lake, both lakes required at least 3 sonde days per week for GPP estimates to be within 20% of the weekly average. A sensitivity analysis evaluated uncertainties associated with DO measurements, piston velocity (k), and the assumption that daytime R equals nighttime R. In low productivity lakes, uncertainty in DO measurements and piston velocity strongly impacts R but has no effect on GPP or NEP. Lack of accounting for higher R during the day underestimates R and GPP but has no effect on NEP. We finally provide suggestions for future research to improve the technique.
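The free-water bookkeeping that this technique rests on can be written in a commonly used form, shown below; the symbols are the standard ones rather than taken from the paper.

```latex
% Free-water metabolism from high-frequency DO records (standard form):
% \Delta DO/\Delta t: rate of change of dissolved oxygen in the mixed layer,
% k: piston velocity, DO_{sat}: saturation concentration,
% z_{mix}: mixed-layer depth.
\[
  \frac{\Delta \mathrm{DO}}{\Delta t}
    \;=\; \mathrm{GPP} \;-\; R
    \;-\; \frac{k\,(\mathrm{DO}-\mathrm{DO_{sat}})}{z_{mix}}
\]
% At night GPP = 0, so R is estimated from the night-time record after
% correcting for air--water exchange; daytime GPP then follows from the
% daytime balance (with NEP = GPP - R), which is where the assumption that
% daytime R equals night-time R enters.
```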
Abstract:
Non-Destructive Testing (NDT) of deep foundations has become an integral part of the industry's standard manufacturing processes. It is not unusual for the evaluation of the integrity of the concrete to include the measurement of ultrasonic wave speeds. Numerous methods have been proposed that use the propagation speed of ultrasonic waves to check the integrity of concrete in drilled shaft foundations. All such methods evaluate the integrity of the concrete inside the cage and between the access tubes. The integrity of the concrete outside the cage remains to be considered in order to determine the location of the border between the concrete and the soil and hence obtain the diameter of the drilled shaft. It is also economical to devise a methodology for obtaining the diameter of the drilled shaft using the Cross-Hole Sonic Logging (CSL) system. Such a methodology can be performed with the CSL equipment following the CSL tests used to check the integrity of the inside concrete, thus allowing the drilled shaft diameter to be determined without having to set up another NDT device. The proposed new method is based on the installation of galvanized tubes outside the shaft, across from each inside tube, and on performing the CSL test between the inside and outside tubes. From the experimental work performed, a model is developed to evaluate the relationship between the thickness of the concrete and the ultrasonic wave properties using signal processing. The experimental results show that there is a direct correlation between the concrete thickness outside the cage and the maximum amplitude of the received signal obtained from frequency-domain data. This study demonstrates how this new method of measuring the diameter of drilled shafts during construction using NDT overcomes the limitations of currently used methods. In another part of the study, a new method is proposed to visualize and quantify the extent and location of defects. It is based on a color change in the frequency amplitude of the signal recorded by the receiver probe at the location of defects, and it is called Frequency Tomography Analysis (FTA). Time-domain data are transferred to frequency-domain data for the signals propagated between tubes using the Fast Fourier Transform (FFT), and the distribution of the FTA is then evaluated. This method is employed after CSL has determined the high probability of an anomaly in a given area and is applied to improve location accuracy and to further characterize the feature. The technique has very good resolution and clarifies the exact depth location of any void or defect along the length of the drilled shaft for voids inside the cage. The last part of the study also evaluates the effect of voids inside and outside the reinforcement cage, and of corrosion in the longitudinal bars, on the strength and axial load capacity of drilled shafts. The objective is to quantify the extent of loss in axial strength and stiffness of drilled shafts due to the presence of different types of symmetric voids and corrosion throughout their lengths.
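A minimal sketch, assuming NumPy and made-up variable names, of the frequency-domain step this study builds on: each received time-domain CSL trace is transformed with an FFT and reduced to its peak spectral amplitude, the quantity the experiments relate to concrete thickness outside the cage and that the FTA maps over depth. This is an illustration of the signal-processing step only, not the study's calibrated model.

```python
# Hypothetical sketch: convert a received time-domain CSL trace to the
# frequency domain and keep its peak spectral amplitude; stacking these
# values over depth gives a simple amplitude-vs-depth profile of the kind
# FTA colour-maps.
import numpy as np


def peak_spectral_amplitude(trace: np.ndarray, fs: float) -> tuple[float, float]:
    """Return (peak amplitude, frequency at the peak) for one trace.

    `trace` is the sampled receiver signal; `fs` is the sampling rate in Hz.
    """
    spectrum = np.abs(np.fft.rfft(trace))            # single-sided amplitude
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    i = int(np.argmax(spectrum))
    return float(spectrum[i]), float(freqs[i])


def amplitude_profile(traces: np.ndarray, fs: float) -> np.ndarray:
    """Peak spectral amplitude for each depth (one trace per row)."""
    return np.array([peak_spectral_amplitude(t, fs)[0] for t in traces])
```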