955 results for High frequency


Relevance:

60.00%

Publisher:

Abstract:

This paper presents an ultrasonic velocity measurement method used to investigate the possible effects of high-voltage, high-frequency pulsed power on the elasticity of cortical bone material. Before applying a pulsed power signal to live bone, it is essential to determine, non-destructively, the parameters at which pulsed power can safely be applied to bone. Therefore, the possible changes in cortical bone material elasticity due to a specified pulsed power excitation have been investigated. A controllable positive buck-boost converter with adjustable output voltage and frequency was used to generate high-voltage pulses (500 V magnitude at 10 kHz). To determine bone elasticity, ultrasonic velocity measurements were conducted on two groups of cortical bone samples: a control group (kept in the same environmental conditions but not exposed to pulsed power) and a group exposed to pulsed power. Young's modulus of the cortical bone samples was determined and compared before and after applying the pulsed power signal. After applying the high-voltage pulses, no significant variation in the elastic properties of the cortical bone specimens was found compared with the control group. The results show that pulsed power with the nominated parameters can be applied to cortical bone tissue without any considerable negative effect on the elasticity of the bone material.
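To illustrate how elasticity is obtained from an ultrasonic transit-time measurement, the sketch below applies the thin-rod relation E = ρv²; the function name and the thickness, transit-time and density values are illustrative assumptions, not data from the paper.

```python
# Minimal sketch (not from the paper): estimating Young's modulus from an ultrasonic
# pulse transit time, assuming the thin-rod relation E = rho * v**2.
# Sample thickness, transit time, and density values below are illustrative only.

def youngs_modulus(thickness_m: float, transit_time_s: float, density_kg_m3: float) -> float:
    """Return Young's modulus (Pa) from ultrasonic velocity v = thickness / transit time."""
    velocity = thickness_m / transit_time_s          # longitudinal wave speed (m/s)
    return density_kg_m3 * velocity ** 2             # thin-rod approximation

# Illustrative cortical-bone-like numbers (hypothetical):
E_before = youngs_modulus(5e-3, 1.30e-6, 1900.0)
E_after = youngs_modulus(5e-3, 1.31e-6, 1900.0)
print(f"E before: {E_before / 1e9:.1f} GPa, E after: {E_after / 1e9:.1f} GPa")
```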

Relevance:

60.00%

Publisher:

Abstract:

In natural waterways and estuaries, an understanding of turbulent mixing is critical to knowledge of sediment transport, stormwater runoff during flood events, and the release of nutrient-rich wastewater into ecosystems. In the present study, field measurements were conducted in a small subtropical estuary with a micro-tidal range and semi-diurnal tides during king tide conditions, i.e., the largest tidal range of both 2009 and 2010. The turbulent velocity measurements were performed continuously at high frequency (50 Hz) for 60 h. Two acoustic Doppler velocimeters (ADVs) sampled simultaneously in the middle estuarine zone, and a third ADV was deployed in the upper estuary for 12 h only. The results provided a unique characterisation of the turbulence in both the middle and upper estuarine zones under king tide conditions. The present observations showed some marked differences between king tide and neap tide conditions. During the king tide, tidal forcing was the dominant water exchange and circulation mechanism in the estuary. In contrast, long-term oscillations linked with internal and external resonance played a major role in the turbulent mixing during neap tides. The data set further showed that the upper estuarine zone was markedly less affected by the spring tide range: the flow motion remained slow, but the turbulent velocity data were affected by the propagation of a transient front during the very early flood tide motion at the sampling site.
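As an illustration of how a high-frequency ADV record is typically separated into mean and turbulent components, the sketch below performs a simple Reynolds decomposition on a synthetic 50 Hz velocity record; the 60 s averaging window and the synthetic signal are assumptions, not the authors' processing chain.

```python
# Minimal sketch (assumption, not the authors' method): Reynolds decomposition of a
# 50 Hz ADV velocity record into a slowly varying mean and a turbulent fluctuation,
# using a moving-average low-pass filter.
import numpy as np

fs = 50.0                                   # sampling rate (Hz)
t = np.arange(0, 600, 1 / fs)               # 10 minutes of synthetic data
u = 0.3 * np.sin(2 * np.pi * t / 300) + 0.05 * np.random.randn(t.size)  # streamwise velocity (m/s)

window = int(fs * 60)                       # 60 s averaging window (a common, but arbitrary, choice)
u_mean = np.convolve(u, np.ones(window) / window, mode="same")
u_prime = u - u_mean                        # turbulent fluctuation

print("turbulence intensity u'_rms =", np.std(u_prime))
```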

Relevance:

60.00%

Publisher:

Abstract:

Forecasts of volatility and correlation are important inputs into many practical financial problems. Broadly speaking, there are two ways of generating forecasts of these variables. Firstly, time-series models apply a statistical weighting scheme to historical measurements of the variable of interest. The alternative methodology extracts forecasts from the market-traded prices of option contracts. An efficient options market should be able to produce superior forecasts, as it utilises a larger information set comprising not only historical information but also the market equilibrium expectations of options market participants. While much research has been conducted into the relative merits of these approaches, this thesis extends the literature along several lines through three empirical studies. Firstly, it is demonstrated that there are statistically significant benefits to taking the volatility risk premium into account when using implied volatility for univariate volatility forecasting. Secondly, high-frequency option-implied measures are shown to produce superior forecasts of the stochastic component of intraday volatility, which in turn lead to superior forecasts of total intraday volatility. Finally, realised and option-implied measures of equicorrelation are shown to dominate measures based on daily returns.
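As a minimal sketch of the kind of high-frequency measure referred to above, the function below computes daily realised volatility from intraday prices by summing squared log returns; the pandas layout (a price series with a DatetimeIndex) is an assumption for illustration.

```python
# Minimal sketch (assumption: intraday prices in a pandas Series indexed by timestamp).
# Realised variance is the sum of squared intraday log returns; its square root is a
# high-frequency estimate of daily volatility of the kind compared against
# option-implied measures.
import numpy as np
import pandas as pd

def realised_volatility(prices: pd.Series) -> pd.Series:
    """Daily realised volatility from an intraday price series."""
    log_ret = np.log(prices).diff().dropna()
    rv = log_ret.pow(2).groupby(log_ret.index.date).sum()   # realised variance per day
    return np.sqrt(rv)
```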

Relevance:

60.00%

Publisher:

Abstract:

A breaker restrike is an abnormal arcing phenomenon that can lead to breaker failure. Such a failure eventually interrupts the transmission and distribution of the electricity supply until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on the measurement and interpretation of restrikes produced during the switching of capacitor banks and shunt reactor banks in power systems. In 2008, a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the radiometric measurement method is limited by a band-limited frequency response and by its amplitude determination. Current restrike detection methods and algorithms require wide-bandwidth current transformers and high-voltage dividers. A restrike switch model using the Alternative Transients Program (ATP) and wavelet transforms to support diagnostics is proposed, so that restrike phenomena become the basis of a new diagnostic process for online interrupter monitoring using measurements, ATP and wavelet transforms. This research project investigates the restrike switch model parameter 'A', the dielectric voltage gradient, for normal and slowed contact-opening velocities, together with the escalation voltages; these can be used as a diagnostic tool for vacuum circuit breakers (CBs) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load, at current quenching or chopping, a transient voltage develops across the contact gap. The dielectric strength of the gap should rise quickly enough to withstand this transient voltage; if it does not, the gap flashes over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap, i.e., the points at which the gap voltage has reached a value that exceeds the dielectric strength of the gap. This research shows that a change in the contact-opening velocity of the vacuum CB produces a corresponding change in the slope of the escalation voltage envelope. To investigate the diagnostic process, an ATP restrike switch model was modified with a contact-opening velocity computation for restrike waveform signature analyses, along with experimental investigations. This also enhanced a mathematical CB model with an empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if the measured and simulated waveforms show similar restrike waveform signatures. The restrike switch model is applied to: computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using wavelet transforms. A simulated high-frequency nozzle current magnitude was applied to an equation (derived from the literature) that calculates the life extension of the interrupter of an SF6 high-voltage CB. The restrike waveform signatures for medium- and high-voltage CBs identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using MATLAB software for automatic detection.
Experimental investigation of a 12 kV vacuum CB was carried out for parameter determination, and a passive antenna calibration was also successfully developed for field implementation. The degradation features were evaluated from the experiments with a predictive interpretation technique, and the subsequent simulation indicates that the voltage drop related to the slow contact-opening velocity gives a measure of the degree of contact degradation. A predictive interpretation technique is a computer-modelling approach for assessing switching-device performance that allows one to vary a single parameter at a time, which is often difficult to do experimentally because of the variable contact-opening velocity. The significance of this thesis is a non-intrusive method, developed using measurements, ATP and wavelet transforms, for predicting and interpreting breaker restrike risk. Measurements on high-voltage circuit breakers can identify degradation that could interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for monitoring restrike phenomena developed by this research will form part of a diagnostic process that will be valuable for detecting breaker stresses relating to interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
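The following sketch illustrates the general idea of wavelet-based transient detection on a simulated voltage waveform; it is not the thesis' detection algorithm, and the sampling rate, wavelet ('db4'), decomposition depth and threshold are illustrative assumptions.

```python
# Minimal sketch (assumptions throughout): flag a high-frequency burst (restrike-like
# transient) in a simulated voltage waveform using discrete wavelet detail coefficients.
import numpy as np
import pywt

fs = 1_000_000                                    # 1 MHz sampling rate (assumed)
t = np.arange(0, 0.02, 1 / fs)
v = np.sin(2 * np.pi * 50 * t) + 0.005 * np.random.randn(t.size)   # 50 Hz voltage + noise
v[10_000:10_100] += 0.8 * np.sin(2 * np.pi * 300_000 * t[:100])    # injected high-frequency burst

coeffs = pywt.wavedec(v, "db4", level=3)
d1 = coeffs[-1]                                   # finest-scale detail coefficients
threshold = 5 * np.median(np.abs(d1)) / 0.6745    # robust (MAD-based) noise threshold
burst_idx = np.flatnonzero(np.abs(d1) > threshold)
if burst_idx.size:
    # each level-1 detail coefficient spans roughly 2 samples
    print(f"transient detected near t = {burst_idx[0] * 2 / fs:.4f} s")
```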

Relevance:

60.00%

Publisher:

Abstract:

Structural health monitoring (SHM) refers to the procedures used to assess the condition of structures so that their performance can be monitored and any damage detected early. Early detection of damage and appropriate retrofitting help prevent failure of the structure, save money spent on maintenance or replacement, and ensure the structure operates safely and efficiently throughout its intended life. Although visual inspection and other techniques, such as vibration-based methods, are available for the SHM of structures such as bridges, the acoustic emission (AE) technique is an attractive option and its use is increasing. AE waves are high-frequency stress waves generated by the rapid release of energy from localised sources within a material, such as crack initiation and growth. The AE technique involves recording these waves by means of sensors attached to the surface and then analysing the signals to extract information about the nature of the source. High sensitivity to crack growth, the ability to locate the source, its passive nature (no energy needs to be supplied from outside, as energy from the damage source itself is utilised) and the possibility of real-time monitoring (detecting a crack as it occurs or grows) are some of the attractive features of the AE technique. In spite of these advantages, challenges still exist in using the AE technique for monitoring applications, especially in the analysis of recorded AE data, as large volumes of data are usually generated during monitoring. The need for effective data analysis can be linked with three main aims of monitoring: (a) accurately locating the source of damage; (b) identifying and discriminating signals from different sources of acoustic emission; and (c) quantifying the level of damage of the AE source for severity assessment. In the AE technique, the location of the emission source is usually calculated using the arrival times and velocities of the AE signals recorded by a number of sensors. But complications arise because AE waves can travel through a structure in a number of different modes that have different velocities and frequencies. Hence, to accurately locate a source it is necessary to identify the modes recorded by the sensors. This study proposed and tested the use of time-frequency analysis tools, such as the short-time Fourier transform, to identify the modes, and the use of the velocities of these modes to achieve very accurate results. Further, this study explored the possibility of reducing the number of sensors needed for data capture by using the velocities of the modes captured by a single sensor for source localization. A major problem in the practical use of the AE technique is the presence of AE sources other than cracks, such as rubbing and impacts between different components of a structure. These spurious AE signals often mask the signals from crack activity; hence, discriminating between signals to identify their sources is very important. This work developed a model that uses different signal processing tools, such as cross-correlation, magnitude-squared coherence and the energy distribution in different frequency bands, as well as modal analysis (comparing the amplitudes of identified modes), to accurately differentiate signals from different simulated AE sources. Quantification tools to assess the severity of damage sources are highly desirable in practical applications.
Although different damage quantification methods have been proposed for the AE technique, not all have achieved universal acceptance or been found suitable for all situations. The b-value analysis, which involves the study of the distribution of the amplitudes of AE signals, and its modified form (known as improved b-value analysis) were investigated for their suitability for damage quantification in ductile materials such as steel. This was found to give encouraging results for the analysis of laboratory data, thereby extending the possibility of its use for real-life structures. By addressing these primary issues, it is believed that this thesis has helped improve the effectiveness of the AE technique for the structural health monitoring of civil infrastructure such as bridges.
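For the amplitude-distribution analysis mentioned above, the sketch below shows one common way to estimate an AE b-value with a maximum-likelihood (Aki-Utsu style) estimator, converting peak amplitudes in dB to an AE magnitude of A/20; the completeness threshold and synthetic amplitudes are assumptions, and the thesis' improved b-value procedure differs in detail.

```python
# Minimal sketch (assumption, not the thesis' exact procedure): a maximum-likelihood
# b-value estimate from AE hit peak amplitudes given in dB.
import numpy as np

def ae_b_value(amplitudes_db: np.ndarray, completeness_db: float) -> float:
    """b-value of the amplitude distribution above a completeness threshold."""
    a = amplitudes_db[amplitudes_db >= completeness_db]
    magnitudes = a / 20.0                                  # AE "magnitude" convention
    return np.log10(np.e) / (magnitudes.mean() - completeness_db / 20.0)

# Illustrative synthetic amplitudes (dB): exponentially distributed above 40 dB.
rng = np.random.default_rng(0)
amps = 40.0 + rng.exponential(scale=10.0, size=2000)
print("estimated b-value:", round(ae_b_value(amps, 40.0), 2))
```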

Relevance:

60.00%

Publisher:

Abstract:

The study presented in this paper reviewed 9,358 accidents that occurred in the U.S. construction industry between 2002 and 2011, in order to understand the relationships between risk factors and injury severity (e.g. fatalities, hospitalized injuries, or non-hospitalized injuries) and to develop a strategic prevention plan to reduce the likelihood of fatalities where an accident is unavoidable. The study specifically aims to: (1) verify the relationships among risk factors, accident types, and injury severity; (2) determine the significant risk factors associated with each accident type that are highly correlated with injury severity; and (3) analyze the impact of the identified key factors on accident and fatality occurrence. The analysis results showed that safety managers' roles are critical to reducing human-related risks, particularly misjudgement of hazardous situations, through safety training and education, appropriate use of safety devices and proper safety inspection. For environment-related factors, however, the dominant risk factors differed depending on the accident type. The outcomes of this study will assist safety managers in understanding the nature of construction accidents and planning strategic risk mitigation, by prioritizing high-frequency risk factors to effectively control accident occurrence and manage the likelihood of fatal injuries on construction sites.
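As a simple illustration of the kind of relationship check described above (not the study's actual analysis or data), the sketch below tests whether injury severity is independent of accident type using a chi-square test on a made-up contingency table.

```python
# Illustrative only: the counts below are invented, not taken from the study.
import numpy as np
from scipy.stats import chi2_contingency

# rows: accident types (e.g. fall, struck-by); columns: fatality, hospitalized, non-hospitalized
table = np.array([[120, 340, 210],
                  [ 60, 150, 330]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p:.3g}, dof = {dof}")
```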

Relevance:

60.00%

Publisher:

Abstract:

A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high-frequency condition monitoring (CM) techniques and for low-speed machine applications, since the combination of a high sampling frequency and a low rotating speed generally leads to unwieldy data sizes. The effectiveness of the algorithm was evaluated and tested on four data sets in the study. One data set was extracted from the condition monitoring signal of a practical industrial application. Another was acquired from a low-speed machine test rig in the laboratory. The other two were computer-simulated bearing defect signals containing either a single bearing defect or multiple defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all the data sets used in this work, even when a large down-sampling ratio is used (i.e., 500-times down-sampling). In contrast, down-sampling with the conventional signal processing technique eliminates useful and critical information, such as bearing defect frequencies, when the same down-sampling ratio is employed. Noise and artificial frequency components are also introduced by the conventional down-sampling technique, thus limiting its usefulness for machine condition monitoring applications.
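A minimal sketch of the block-peak idea behind the PHDS algorithm is shown below; the published algorithm may differ in detail (for example, in how negative peaks are retained), and the signal parameters are illustrative.

```python
# Minimal sketch of the block-peak idea behind PHDS (the published algorithm may differ).
import numpy as np

def peak_hold_downsample(x: np.ndarray, ratio: int) -> np.ndarray:
    """Down-sample x by `ratio`, keeping the largest-magnitude sample of each block."""
    n = (x.size // ratio) * ratio
    blocks = x[:n].reshape(-1, ratio)
    idx = np.abs(blocks).argmax(axis=1)                # position of the peak in each block
    return blocks[np.arange(blocks.shape[0]), idx]     # peak value, sign preserved

# Example: a 500x down-sample of a bearing-like signal with sharp defect impulses.
fs = 50_000
t = np.arange(0, 10, 1 / fs)
x = 0.1 * np.random.randn(t.size)
x[::5_000] += 5.0                                      # periodic impulses (defect hits)
y = peak_hold_downsample(x, 500)
print(x.size, "->", y.size, "samples; impulses retained:", int((y > 2).sum()))
```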

Relevance:

60.00%

Publisher:

Abstract:

We used in vivo (biological), in silico (computational structure prediction), and in vitro (model sequence folding) analyses of single-stranded DNA sequences to show that nucleic acid folding conservation is the selective principle behind a high-frequency single-nucleotide reversion observed in a three-nucleotide mutated motif of the Maize streak virus replication associated protein (Rep) gene. In silico and in vitro studies showed that the three-nucleotide mutation adversely affected Rep nucleic acid folding, and that the single-nucleotide reversion [C(601)A] restored wild-type-like folding. In vivo support came from infecting maize with mutant viruses: those with Rep genes containing nucleotide changes predicted to restore a wild-type-like fold [A(601)/G(601)] preferentially accumulated over those predicted to fold differently [C(601)/T(601)], which frequently reverted to A(601) and displaced the original population. We propose that the selection of native nucleic acid folding is an epigenetic effect, which might have broad implications in the evolution of plants and their viruses.

Relevance:

60.00%

Publisher:

Abstract:

One aim of experimental economics is to better understand human economic decision making. Early research on the ultimatum bargaining game (Gueth et al., 1982) revealed that motives other than pure monetary reward play a role. Neuroeconomic research has introduced the recording of physiological observations as signals of emotional responses. In this study, we apply heart rate variability (HRV) measurement technology to explore the behaviour and physiological reactions of proposers and responders in the ultimatum bargaining game. Since this technology is small and non-intrusive, we are able to run the experiment in a standard experimental economics setup. We show that low offers by a proposer cause signs of mental stress in both the proposer and the responder, as both exhibit high ratios of low-frequency to high-frequency activity in the HRV spectrum.
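As a sketch of how the low-to-high-frequency ratio is commonly computed from HRV data (an assumption about the general approach, not the authors' exact pipeline), the function below resamples an RR-interval series onto an even grid and integrates a Welch power spectrum over the conventional LF (0.04–0.15 Hz) and HF (0.15–0.4 Hz) bands.

```python
# Minimal sketch (assumption, not the authors' pipeline): LF/HF ratio of an RR-interval
# (tachogram) series via a Welch power spectrum over the conventional bands.
import numpy as np
from scipy.signal import welch
from scipy.interpolate import interp1d

def lf_hf_ratio(rr_s: np.ndarray, fs_resample: float = 4.0) -> float:
    t = np.cumsum(rr_s)                                    # beat times (s)
    grid = np.arange(t[0], t[-1], 1 / fs_resample)         # even 4 Hz grid
    rr_even = interp1d(t, rr_s)(grid)
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs_resample, nperseg=256)
    lf = pxx[(f >= 0.04) & (f < 0.15)].sum()               # band powers share the same
    hf = pxx[(f >= 0.15) & (f < 0.40)].sum()               # frequency step, so it cancels
    return lf / hf

# Synthetic RR intervals around 0.8 s with 0.25 Hz (HF) and 0.1 Hz (LF) modulation:
t_beat = np.cumsum(np.full(600, 0.8))
rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.25 * t_beat) + 0.02 * np.sin(2 * np.pi * 0.1 * t_beat)
print("LF/HF =", round(lf_hf_ratio(rr), 2))
```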

Relevance:

60.00%

Publisher:

Abstract:

Wheel–rail interaction is one of the most important research topics in railway engineering, involving track impact response, track vibration and track safety. Track structure failures caused by wheel–rail impact forces can lead to significant economic loss for track owners through damage to the rails and to the sleepers beneath. Wheel–rail impact forces occur because of imperfections in the wheels or rails, such as wheel flats, irregular wheel profiles, rail corrugations and differences in the heights of rails connected at a welded joint. A wheel flat can cause a large dynamic impact force as well as high-frequency forced vibration, both of which can damage the track structure. In the present work, a three-dimensional (3-D) finite element (FE) model for the impact analysis induced by a wheel flat is developed using the finite element analysis (FEA) software package ANSYS and validated against a previously validated simulation. The effect of wheel flats on impact forces is thoroughly investigated. It is found that the presence of a wheel flat significantly increases the dynamic impact force on both the rail and the sleeper, and that the impact force increases monotonically with the size of the wheel flat. The relationships between the impact force and the wheel flat size obtained from this finite element analysis are important for track engineers seeking to improve the design and maintenance of the track system.

Relevance:

60.00%

Publisher:

Abstract:

This paper analyses the effect of winding structure on reducing the capacitive coupling that appears in planar magnetic elements at high frequencies. Capacitive coupling arises between the conductive layers of planar transformers, resulting in high current spikes and, consequently, high power dissipation. Using finite element analysis, the equivalent capacitive coupling of the magnetic elements is calculated for different planar winding structures. Finally, a new winding structure with minimal capacitive coupling is introduced for planar magnetic elements and is verified by simulation and experiment.
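For orientation, the sketch below gives a first-order parallel-plate estimate of the capacitance between two overlapping winding layers; the geometry and material values are hypothetical, and the finite element analysis used in the paper additionally captures fringing fields and the actual winding layout.

```python
# Illustrative only: parallel-plate estimate C = eps0 * eps_r * A / d for two
# overlapping planar winding layers (hypothetical geometry and materials).
EPS0 = 8.854e-12                        # F/m

def interlayer_capacitance(overlap_area_m2: float, eps_r: float, spacing_m: float) -> float:
    return EPS0 * eps_r * overlap_area_m2 / spacing_m

# Hypothetical values: 4 cm^2 overlap, FR4 (eps_r ~ 4.4), 0.2 mm prepreg spacing.
print(f"C ~ {interlayer_capacitance(4e-4, 4.4, 0.2e-3) * 1e12:.0f} pF")
```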

Relevance:

60.00%

Publisher:

Abstract:

In the last decade, smartphones have gained widespread usage. Since the advent of online application stores, hundreds of thousands of applications have become instantly available to millions of smartphone users. Within the Android ecosystem, application security is governed by digital signatures and a list of coarse-grained permissions. However, this mechanism is not fine-grained enough to give the user sufficient control over an application's activities. The result is abuse of highly sensitive private information, such as phone numbers, without the user's knowledge. We show that there is a high frequency of privacy leaks even among widely popular applications. Together with the fact that the majority of users are not proficient in computer security, this presents a challenge to the engineers developing security solutions for the platform. Our contribution is twofold: first, we propose a service that assesses Android Market applications via static analysis and provides detailed but readable reports to the user. Second, we describe a means to mitigate security and privacy threats by automatically reverse-engineering and refactoring binary application packages according to the user's security preferences.
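As a toy illustration of one static check of the kind such a service might perform (not the authors' implementation), the sketch below lists the permissions declared in a decoded AndroidManifest.xml and flags a few that are commonly tied to private data; the "sensitive" set is an illustrative assumption.

```python
# Toy illustration (not the authors' service): flag declared permissions in a
# decoded (plain-text) AndroidManifest.xml. The SENSITIVE set is an assumption.
import xml.etree.ElementTree as ET

ANDROID_NS = "{http://schemas.android.com/apk/res/android}"
SENSITIVE = {"android.permission.READ_PHONE_STATE",
             "android.permission.READ_CONTACTS",
             "android.permission.ACCESS_FINE_LOCATION"}

def flag_permissions(manifest_path: str) -> list[str]:
    root = ET.parse(manifest_path).getroot()
    declared = [p.get(f"{ANDROID_NS}name") for p in root.iter("uses-permission")]
    return [p for p in declared if p in SENSITIVE]

# flag_permissions("AndroidManifest.xml")  # e.g. ['android.permission.READ_PHONE_STATE']
```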

Relevance:

60.00%

Publisher:

Abstract:

This study explored the dynamic performance of an innovative Hybrid Composite Floor Plate System (HCFPS), composed of a polyurethane (PU) core, outer layers of glass-fibre reinforced cement (GRC), and steel laminates in the tensile regions, using experimental testing and finite element (FE) modelling. Experimental testing included heel-impact and walking tests on 3200 mm span HCFPS panels. FE models of the HCFPS were developed using the FE program ABAQUS and validated against the experimental results. The HCFPS is a lightweight, high-frequency floor system with an excellent damping ratio of 5% (bare floor) owing to the central PU core. Parametric studies were conducted using the validated FE models to investigate the dynamic response of the HCFPS and to identify the characteristics that influence its acceleration response under human-induced vibration in service. This vibration performance was compared with recommended perceptibility limits. The findings show that the HCFPS can be used in residential and office buildings as a lightweight floor system that does not exceed the perceptibility thresholds for human-induced vibration.
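For context on what a "high-frequency floor system" means, the sketch below evaluates the fundamental frequency of a simply supported floor strip from the standard beam formula; the span matches the tested panels, but the flexural rigidity and mass values are hypothetical, not HCFPS section properties.

```python
# Minimal sketch (hypothetical EI and mass, not HCFPS section properties):
# f1 = (pi / (2 L^2)) * sqrt(EI / m) for a simply supported strip, the usual first
# check on whether a floor is a "high-frequency" (roughly > 8-10 Hz) system.
import math

def fundamental_frequency(span_m: float, EI_Nm2: float, mass_per_m: float) -> float:
    return (math.pi / (2 * span_m ** 2)) * math.sqrt(EI_Nm2 / mass_per_m)

print(f"f1 ~ {fundamental_frequency(3.2, 1.2e6, 60.0):.1f} Hz")   # 3.2 m span, assumed EI and mass
```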

Relevance:

60.00%

Publisher:

Abstract:

Texture enhancement is an important component of image processing, with extensive applications in science and engineering. The quality of medical images, quantified using the texture of the images, plays a significant role in the routine diagnosis performed by medical practitioners. Previously, image texture enhancement was performed using classical integer-order differential mask operators. More recently, first-order fractional differential operators have been used to enhance images. Experiments show that the use of fractional differentials not only maintains the low-frequency contour features in the smooth areas of an image, but also nonlinearly enhances the edges and textures corresponding to high-frequency image components. However, whilst these methods perform well in particular cases, they are not routinely useful across all applications. To this end, we applied the second-order Riesz fractional differential operator to improve upon existing texture-enhancement approaches. Compared with the classical integer-order differential mask operators and other fractional differential operators, our new algorithms provide higher signal-to-noise ratios, leading to superior image quality.
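The sketch below illustrates a Riesz-type fractional differential applied in the frequency domain, where (-Δ)^(α/2) corresponds to multiplication by |k|^α, blended back into the image as a high-boost filter; this is a frequency-domain illustration of the idea, not the paper's spatial mask operators, and the order and weight are assumptions.

```python
# Minimal sketch (assumption: frequency-domain implementation, not the paper's masks):
# apply a Riesz-type fractional differential of order alpha via the 2-D FFT and blend
# it back to boost high-frequency texture.
import numpy as np

def riesz_fractional_enhance(img: np.ndarray, alpha: float = 1.5, weight: float = 0.02) -> np.ndarray:
    ky = np.fft.fftfreq(img.shape[0])[:, None]
    kx = np.fft.fftfreq(img.shape[1])[None, :]
    k_alpha = (kx ** 2 + ky ** 2) ** (alpha / 2.0)           # |k|^alpha symbol
    frac = np.real(np.fft.ifft2(np.fft.fft2(img) * k_alpha))
    return img + weight * frac                               # high-boost blend

# Example on a synthetic textured image:
rng = np.random.default_rng(1)
image = rng.random((128, 128))
enhanced = riesz_fractional_enhance(image)
print("input std:", round(float(image.std()), 4), "enhanced std:", round(float(enhanced.std()), 4))
```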

Relevance:

60.00%

Publisher:

Abstract:

Background: Measurement accuracy is critical for biomechanical gait assessment. Very few studies have determined the accuracy of common clinical rearfoot variables between cameras with different collection frequencies. Research question: What is the measurement error for common rearfoot gait parameters when using a standard 30 Hz digital camera compared with a 100 Hz camera? Type of study: Descriptive. Methods: 100 footfalls were recorded from 10 subjects (10 footfalls per subject) running on a treadmill at 2.68 m/s. A high-speed digital timer, accurate to within 1 ms, served as an external reference. Markers were placed along the vertical axis of the heel counter and the long axis of the shank. 2D coordinates for the four markers were determined from heel strike to heel lift. Variables of interest included time of heel strike (THS), time of heel lift (THL), time to maximum eversion (TMax), and maximum rearfoot eversion angle (EvMax). Results: The THS difference was 29.77 ms (±8.77), the THL difference was 35.64 ms (±6.85), and the TMax difference was 16.50 ms (±2.54). These temporal values represent differences equal to 11.9%, 14.3%, and 6.6% of the stance phase of running gait, respectively. The EvMax difference was 1.02 degrees (±0.46). Conclusions: A 30 Hz camera is accurate, compared with a high-frequency camera, in determining TMax and EvMax during clinical gait analysis. However, relatively large differences for the THS and THL variables, in excess of 12% of the stance phase of gait, were measured.