536 results for Industrial automation techniques
Understanding the mechanisms of graft union formation in Solanaceae plants using in vitro techniques
Abstract:
Bat researchers currently use a variety of techniques that transform echolocation calls into audible frequencies and allow the spectral content of a signal to be viewed and analyzed. All techniques have limitations, and an understanding of how each works and of its effect on the signal being analyzed is vital for correct interpretation. The three most commonly used techniques for transforming the frequencies of a call are heterodyne, frequency division, and time expansion. Three techniques for viewing the spectral content of a signal are zero-crossing, Fourier analysis, and instantaneous frequency analysis. It is important for bat researchers to be familiar with the advantages and disadvantages of each technique.
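To make the distinction concrete, the sketch below contrasts zero-crossing analysis with Fourier analysis on a synthetic frequency-modulated call; the sample rate, sweep parameters and windowing are illustrative assumptions, not values from any particular detector.

```python
import numpy as np

fs = 500_000                      # sample rate (Hz), illustrative assumption
t = np.arange(0, 0.005, 1 / fs)   # 5 ms synthetic call
# Synthetic FM sweep from 80 kHz down to 40 kHz, loosely mimicking a bat call
call = np.sin(2 * np.pi * (80_000 * t - 4_000_000 * t**2))

# Zero-crossing analysis: estimate frequency from the spacing of successive
# upward zero crossings (one estimate per cycle, no amplitude information).
crossings = np.where(np.diff(np.signbit(call).astype(int)) == -1)[0]
zc_freqs = fs / np.diff(crossings)

# Fourier analysis: a single windowed FFT gives the overall spectral content
# of the call, trading time resolution for frequency resolution.
spectrum = np.abs(np.fft.rfft(call * np.hanning(len(call))))
freqs = np.fft.rfftfreq(len(call), 1 / fs)

print(f"zero-crossing estimates: {zc_freqs.min():.0f}-{zc_freqs.max():.0f} Hz")
print(f"FFT peak at {freqs[spectrum.argmax()]:.0f} Hz")
```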
Abstract:
Background: Conventional biodiesel production relies on trans-esterification of lipids extracted from vegetable crops. However, the use of valuable vegetable food stocks as raw material makes biodiesel production economically unfeasible. Used cooking oil is a finite resource and requires extra downstream processing, which affects both the amount of biodiesel that can be produced and the economics of the process. Lipids extracted from microalgae are considered an alternative raw material for biodiesel production, primarily because of the fast growth rate of these species in a simple aquaculture environment. However, the dilute nature of microalgae culture places a huge economic burden on the dewatering process, especially at industrial scale. This study explores the performance and economic viability of chemical flocculation and tangential flow filtration (TFF) for the dewatering of Tetraselmis suecica microalgae culture. Results: TFF concentrates the microalgae feedstock up to 148 times while consuming 2.06 kWh m-3 of energy, whereas flocculation consumes 14.81 kWh m-3 to concentrate the microalgae up to 357 times. Economic evaluation demonstrates that even though TFF has a higher initial capital investment than polymer flocculation, the payback period for TFF at the upper extreme of microalgae revenue is ∼1.5 years, while that of flocculation is ∼3 years. Conclusion: These results illustrate that improved dewatering levels can be achieved more economically by employing TFF. The performance of these two techniques is also compared with other dewatering techniques.
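As a rough illustration of the comparison made above, the snippet below tabulates the reported energy use and concentration factors and computes a simple undiscounted payback period; the capital, revenue and operating figures are hypothetical placeholders chosen only to reproduce payback periods of the same order, not costs taken from the study.

```python
# Energy figures and concentration factors come from the abstract; capital
# costs, operating costs and revenue below are hypothetical placeholders.
techniques = {
    "TFF":          {"energy_kwh_m3": 2.06,  "conc_factor": 148},
    "flocculation": {"energy_kwh_m3": 14.81, "conc_factor": 357},
}

def payback_years(capital, annual_revenue, annual_opex):
    """Simple (undiscounted) payback period."""
    return capital / (annual_revenue - annual_opex)

for name, d in techniques.items():
    print(f"{name}: {d['energy_kwh_m3']} kWh/m3 to concentrate {d['conc_factor']}x")

# Illustrative payback comparison with made-up cash flows:
print(payback_years(capital=150_000, annual_revenue=140_000, annual_opex=40_000))  # ~1.5 years
print(payback_years(capital=60_000,  annual_revenue=60_000,  annual_opex=40_000))  # ~3 years
```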
Abstract:
Current developments in gene medicine and vaccination studies utilize plasmid DNA (pDNA) as the vector. For this reason, there has been an increasing trend towards larger doses of pDNA in human trials: from 100-1000 μg in 2002 to 500-5000 μg in 2005. The increasing demand for pDNA has created the need to revolutionize current production levels under optimum economy. In this work, different standard media (LB, TB and SOC) for culturing recombinant Escherichia coli DH5α harbouring pUC19 were compared to a medium optimised for pDNA production. Lab-scale fermentations using the standard media showed that the highest pDNA volumetric and specific yields were for TB (11.4 μg/ml and 6.3 μg/mg dry cell mass, respectively) and the lowest were for LB (2.8 μg/ml and 3.3 μg/mg dry cell mass, respectively). A fourth medium, PDMR, designed by modifying a stoichiometrically formulated medium with an optimised carbon source concentration and carbon-to-nitrogen ratio, displayed pDNA volumetric and specific yields of 23.8 μg/ml and 11.2 μg/mg dry cell mass, respectively. However, it is the economic advantage of the optimised medium that makes it so attractive. Keeping all variables constant except the medium and using LB as the base scenario (100 medium cost [MC] units/mg pDNA), the optimised PDMR medium yielded pDNA at a cost of only 27 MC units/mg pDNA. These results show that greater amounts of pDNA can be obtained more economically, with minimal extra effort, simply by using a medium optimised for pDNA production.
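The relative-cost comparison can be expressed as medium cost per milligram of pDNA, normalised so that LB equals 100 MC units/mg. In the sketch below the volumetric yields come from the abstract, while the per-litre medium prices are hypothetical placeholders used only to show how the normalisation works.

```python
# Volumetric yields (from the abstract) and hypothetical medium prices.
yields_ug_per_ml = {"LB": 2.8, "TB": 11.4, "PDMR": 23.8}   # μg pDNA per ml culture
medium_price = {"LB": 1.0, "TB": 1.6, "PDMR": 2.3}         # hypothetical cost per litre, LB = 1

def mc_units_per_mg(medium):
    """Medium cost per mg pDNA, normalised so LB = 100 MC units/mg."""
    cost_per_mg = medium_price[medium] / yields_ug_per_ml[medium]
    baseline = medium_price["LB"] / yields_ug_per_ml["LB"]
    return 100 * cost_per_mg / baseline

for m in yields_ug_per_ml:
    print(f"{m}: {mc_units_per_mg(m):.0f} MC units/mg pDNA")
```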
Abstract:
Magnetic resonance is a well-established tool for structural characterisation of porous media. Features of pore-space morphology can be inferred from NMR diffusion-diffraction plots or the time-dependence of the apparent diffusion coefficient. Diffusion NMR signal attenuation can be computed from the restricted diffusion propagator, which describes the distribution of diffusing particles for a given starting position and diffusion time. We present two techniques for efficient evaluation of restricted diffusion propagators for use in NMR porous-media characterisation. The first is the Lattice Path Count (LPC). Its physical essence is that the restricted diffusion propagator connecting points A and B in time t is proportional to the number of distinct length-t paths from A to B. By using a discrete lattice, the number of such paths can be counted exactly. The second technique is the Markov transition matrix (MTM). The matrix represents the probabilities of jumps between every pair of lattice nodes within a single timestep. The propagator for an arbitrary diffusion time can be calculated as the appropriate matrix power. For periodic geometries, the transition matrix needs to be defined only for a single unit cell. This makes MTM ideally suited for periodic systems. Both LPC and MTM are closely related to existing computational techniques: LPC, to combinatorial techniques; and MTM, to the Fokker-Planck master equation. The relationship between LPC, MTM and other computational techniques is briefly discussed in the paper. Both LPC and MTM perform favourably compared to Monte Carlo sampling, yielding highly accurate and almost noiseless restricted diffusion propagators. Initial tests indicate that their computational performance is comparable to that of finite element methods. Both LPC and MTM can be applied to complicated pore-space geometries with no analytic solution. We discuss the new methods in the context of diffusion propagator calculation in porous materials and model biological tissues.
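A minimal sketch of the MTM idea, assuming a 1-D pore of discrete lattice nodes with reflecting walls: the single-timestep transition matrix is built once, and the propagator for n timesteps is obtained as the n-th matrix power applied to the starting distribution. The node count, hop probability and diffusion time are illustrative choices, not parameters from the paper.

```python
import numpy as np

N = 50                      # lattice nodes across the pore (illustrative)
p_hop = 0.5                 # probability of hopping to each neighbour per timestep

# Single-timestep, column-stochastic transition matrix; jumps that would leave
# the pore are reflected back, which keeps probability on the edge nodes.
M = np.zeros((N, N))
for i in range(N):
    left, right = max(i - 1, 0), min(i + 1, N - 1)
    M[left, i] += p_hop
    M[right, i] += p_hop

# Propagator for n timesteps = n-th matrix power applied to the start state.
n_steps = 200
start = np.zeros(N)
start[N // 2] = 1.0                              # particle starts at the pore centre
propagator = np.linalg.matrix_power(M, n_steps) @ start

print(propagator.sum())     # total probability is conserved (≈ 1)
```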
Abstract:
Industrial control systems (ICS) have been moving from dedicated communications to switched and routed corporate networks, making it increasingly likely that these devices are exposed to the Internet. Many ICS have been designed with few or weak security features, making them vulnerable to attack. Recently, several tools have been developed that can scan the entire Internet, including ZMap, Masscan and Shodan. However, little in-depth analysis has been done to compare these Internet-wide scanning techniques, and few Internet-wide scans have been conducted targeting ICS devices and protocols. In this paper we present a taxonomy of Internet-wide scanning, a comparison of three popular network scanning tools, and a framework for conducting Internet-wide scans.
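For context, the snippet below shows a passive alternative to running an active scan: querying the Shodan search API for a count of hosts exposing the standard Modbus/TCP port. The API key is a placeholder, and the query is only an illustration of the kind of ICS-focused measurement discussed above.

```python
import shodan

api = shodan.Shodan("YOUR_API_KEY")      # placeholder key
result = api.count("port:502")           # count only; no banners are downloaded
print(f"Hosts exposing Modbus/TCP (port 502): {result['total']}")
```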
Abstract:
Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods that have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study applies several clustering techniques (K-means, PAM, CLARA and SOM) to PNSD data in order to identify and apply the optimum technique to data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and silhouette width validation values; the K-means technique was found to perform best, with five clusters being the optimum. Therefore, five clusters were identified within the data using the K-means technique. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins: regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. Together, these two techniques can greatly assist researchers in analysing PNSD data for characterisation and source apportionment purposes.
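A minimal sketch of the cluster-number selection step described above, using K-means and the silhouette width on synthetic spectra; the synthetic data, bin count and range of candidate K values are assumptions for illustration, and the Dunn index, PAM, CLARA and SOM comparisons are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Synthetic stand-in for PNSD observations: five well-separated "source" modes.
rng = np.random.default_rng(0)
n_bins = 40                                   # size bins per spectrum (illustrative)
spectra = np.vstack([
    rng.normal(loc=mu, scale=0.5, size=(200, n_bins))
    for mu in (0.0, 2.0, 4.0, 6.0, 8.0)
])

# Compare candidate cluster counts by silhouette width; the best K should
# emerge near five for this synthetic example.
for k in range(2, 8):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(spectra)
    print(k, round(silhouette_score(spectra, labels), 3))
```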
Abstract:
The construction industry is a crucial component of the Hong Kong economy, and the safety and efficiency of workers are two of its main concerns. The current approach to training workers relies primarily on instilling practice and experience in conventional teacher-apprentice settings on and off site. Both have their limitations, however: on-site training is very inefficient and interferes with progress on site, while off-site training provides little opportunity to develop the practical skills and awareness needed through hands-on experience. A more effective way is to train workers in safety awareness and efficient working using current novel information technologies. This paper describes a new and innovative prototype system – the Proactive Construction Management System (PCMS) – to train precast installation workers to be highly productive while being fully aware of the hazards involved. PCMS uses Chirp-Spread-Spectrum-based (CSS) real-time location technology and Unity3D-based data visualisation technology to track construction resources (people, equipment, materials, etc.) and provide real-time feedback and post-event visualisation analysis in a training environment. A trial of a precast facade installation on a real site demonstrates the benefits gained by PCMS in comparison with equivalent training using conventional methods. It is concluded that, although the study is based on the specific industrial conditions found in Hong Kong construction projects, PCMS may well attract wider interest and use in the future.
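As a rough illustration of the kind of real-time check a location-tracking training system could run, the sketch below flags tracked workers that come too close to a tracked hazard; the class names, coordinates and distance threshold are hypothetical and are not taken from the actual PCMS implementation.

```python
from dataclasses import dataclass
from math import dist

@dataclass
class TrackedResource:
    name: str
    x: float          # metres, site coordinate frame (hypothetical)
    y: float

def proximity_alerts(workers, hazards, min_distance_m=5.0):
    """Yield (worker, hazard) pairs closer than the allowed separation."""
    for w in workers:
        for h in hazards:
            if dist((w.x, w.y), (h.x, h.y)) < min_distance_m:
                yield w.name, h.name

workers = [TrackedResource("installer-1", 12.0, 3.0)]
hazards = [TrackedResource("suspended-facade-panel", 14.5, 4.0)]
print(list(proximity_alerts(workers, hazards)))
```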
Abstract:
Hot metal carriers (HMCs) are large forklift-type vehicles used to move molten metal in aluminum smelters. This paper reports on field experiments that demonstrate that HMCs can operate autonomously and in particular can use vision as a primary sensor to locate the load of aluminum. We present our complete system but focus on the vision system elements and also detail experiments demonstrating reliable operation of the materials handling task. Two key experiments are described, lasting 2 and 5 h, in which the HMC traveled 15 km in total and handled the load 80 times.