141 results for High Frequency Structure Simulator (HFSS)
Abstract:
A breaker restrike is an abnormal arcing phenomenon that can lead to breaker failure. Such a failure interrupts the transmission and distribution of the electricity supply until the breaker is replaced. Before 2008, there was little evidence in the literature of monitoring techniques based on the measurement and interpretation of restrikes produced during switching of capacitor banks and shunt reactor banks in power systems. In 2008, a non-intrusive radiometric restrike measurement method and a restrike hardware detection algorithm were developed by M.S. Ramli and B. Kasztenny. However, the radiometric measurement method is limited by a band-limited frequency response and by limitations in amplitude determination. Current restrike detection methods and algorithms require the use of wide-bandwidth current transformers and high-voltage dividers. A restrike switch model using the Alternative Transient Program (ATP) and Wavelet Transforms to support diagnostics is proposed. Restrike phenomena thereby become the basis of a new diagnostic process, using measurements, ATP and Wavelet Transforms for online interrupter monitoring. This research project investigates the restrike switch model parameter 'A' (dielectric voltage gradient) for normal and slowed contact opening velocities and the associated escalation voltages, which can be used as a diagnostic tool for a vacuum circuit-breaker (CB) at service voltages between 11 kV and 63 kV. During current interruption of an inductive load, at current quenching or chopping, a transient voltage is developed across the contact gap. The dielectric strength of the gap must rise quickly enough to withstand this transient voltage; if it does not, the gap will flash over, resulting in a restrike. A straight line is fitted through the voltage points at flashover of the contact gap, i.e. the points at which the gap voltage has reached a value that exceeds the dielectric strength of the gap. This research shows that a change in the contact opening velocity of the vacuum CB produces a corresponding change in the slope of the gap escalation voltage envelope. To investigate the diagnostic process, an ATP restrike switch model was modified with contact opening velocity computation for restrike waveform signature analyses, along with experimental investigations. The work also enhanced a mathematical CB model with an empirical dielectric model for SF6 (sulphur hexafluoride) CBs at service voltages above 63 kV and a generalised dielectric curve model for 12 kV CBs. A CB restrike can be predicted if the measured and simulated waveforms show similar restrike waveform signatures. The restrike switch model is applied to computer simulations as virtual experiments, including predicting breaker restrikes; estimating the remaining interrupter life of SF6 puffer CBs; checking system stresses; assessing point-on-wave (POW) operations; and developing a restrike detection algorithm using Wavelet Transforms. A simulated high-frequency nozzle current magnitude was applied to an equation (derived from the literature) that calculates the life extension of the interrupter of an SF6 high-voltage CB. The restrike waveform signatures for medium- and high-voltage CBs identify possible failure mechanisms such as delayed opening, degraded dielectric strength and improper contact travel. The simulated and measured restrike waveform signatures are analysed using Matlab software for automatic detection.
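The slope-based diagnostic described above (a straight line fitted through the gap voltages at successive flashovers) can be illustrated with a short sketch that compares the escalation-voltage envelope slope for a normal and a slowed contact opening. This is only a minimal illustration of the fitting step, not the thesis's ATP model; the flashover times and voltages below are hypothetical.

```python
import numpy as np

def escalation_envelope_slope(times_s, flashover_voltages_kv):
    """Fit a straight line through the gap voltages recorded at each
    flashover instant and return its slope (kV per ms). A change in
    contact opening velocity shows up as a change in this slope."""
    slope, _intercept = np.polyfit(np.asarray(times_s) * 1e3,   # convert to ms
                                   np.asarray(flashover_voltages_kv), 1)
    return slope

# Hypothetical flashover points for a normal and a slowed opening.
t = np.array([0.5e-3, 1.0e-3, 1.5e-3, 2.0e-3, 2.5e-3])   # s after contact parting
v_normal = np.array([4.0, 8.1, 12.3, 16.0, 20.2])         # kV
v_slow   = np.array([2.1, 4.0, 6.2, 8.1, 10.0])           # kV

print(f"normal opening: {escalation_envelope_slope(t, v_normal):.1f} kV/ms")
print(f"slowed opening: {escalation_envelope_slope(t, v_slow):.1f} kV/ms")
```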
An experimental investigation of 12 kV vacuum CB diagnostics was carried out for parameter determination, and a passive antenna calibration was also successfully developed with applications for field implementation. Degradation features were also evaluated with a predictive interpretation technique from the experiments, and the subsequent simulation indicates that the drop in voltage is related to the slowed opening velocity of the mechanism, giving a measure of the degree of contact degradation. A predictive interpretation technique is a computer-modelling approach for assessing switching device performance that allows one parameter to be varied at a time; this is often difficult to do experimentally because of the variable contact opening velocity. The significance of this thesis outcome is that it is a non-intrusive method, developed using measurements, ATP and Wavelet Transforms, to predict and interpret breaker restrike risk. The measurements on high-voltage circuit-breakers can identify degradation that can interrupt the distribution and transmission of an electricity supply system. It is hoped that the techniques for the monitoring of restrike phenomena developed by this research will form part of a diagnostic process that will be valuable for detecting breaker stresses relating to interrupter lifetime. Suggestions for future research, including a field implementation proposal to validate the restrike switch model for ATP system studies and the hot dielectric strength curve model for SF6 CBs, are given in Appendix A.
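As a rough illustration of how a Wavelet Transform can flag restrike-like transients in a breaker voltage or current record, the sketch below thresholds the fine-scale detail coefficients of a discrete wavelet decomposition. It is an assumed outline using PyWavelets, not the detection algorithm developed in the thesis; the wavelet choice ('db4'), decomposition level and threshold factor are illustrative.

```python
import numpy as np
import pywt

def detect_transients(signal, fs, wavelet="db4", level=4, k=6.0):
    """Flag samples whose finest-scale wavelet detail magnitude exceeds
    k times a MAD-based noise estimate, a simple surrogate for the
    high-frequency bursts that accompany a restrike."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    d1 = coeffs[-1]                          # finest-scale detail coefficients
    thresh = k * np.median(np.abs(d1)) / 0.6745
    hits = np.where(np.abs(d1) > thresh)[0]
    # Each level-1 detail coefficient spans roughly 2 samples of the input.
    return hits * 2 / fs                     # approximate event times in seconds

# Hypothetical 1 MHz record: 50 Hz component, weak noise, two injected bursts.
fs = 1_000_000
t = np.arange(0, 0.02, 1 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.01 * np.random.default_rng(0).standard_normal(t.size)
for t0 in (0.004, 0.012):
    idx = (t > t0) & (t < t0 + 50e-6)
    x[idx] += 0.5 * np.sin(2 * np.pi * 300_000 * t[idx])

print(detect_transients(x, fs))
```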
Abstract:
The study presented in this paper reviewed 9,358 accidents which occurred in the U.S. construction industry between 2002 and 2011, in order to understand the relationships between risk factors and injury severity (e.g. fatalities, hospitalized injuries, or non-hospitalized injuries) and to develop a strategic prevention plan to reduce the likelihood of fatalities where an accident is unavoidable. The study specifically aims to: (1) verify the relationships among risk factors, accident types, and injury severity, (2) determine significant risk factors associated with each accident type that are highly correlated to injury severity, and (3) analyze the impact of the identified key factors on accident and fatality occurrence. The analysis results showed that safety managers' roles are critical to reducing human-related risks, particularly misjudgement of hazardous situations, through safety training and education, appropriate use of safety devices and proper safety inspection. However, for environment-related factors, the dominant risk factors differed depending on the accident type. The outcomes of this study will assist safety managers to understand the nature of construction accidents and to plan strategic risk mitigation by prioritizing high frequency risk factors, thereby effectively controlling accident occurrence and managing the likelihood of fatal injuries on construction sites.
Abstract:
A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high frequency Condition Monitoring (CM) techniques and for low speed machine applications, since the combination of a high sampling frequency and a low rotating speed will generally lead to a large, unwieldy data size. The effectiveness of the algorithm was evaluated and tested on four sets of data in the study. One set of data was extracted from the condition monitoring signal of a practical industry application. Another set of data was acquired from a low speed machine test rig in the laboratory. The other two sets of data were computer-simulated bearing defect signals having either a single or multiple bearing defects. The results disclose that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all the data sets used in this work, even when a large down-sample ratio was used (i.e., 500 times down-sampled). In contrast, the down-sampling process using the conventional down-sample technique in signal processing eliminates useful and critical information, such as bearing defect frequencies in a signal, when the same down-sample ratio is employed. Noise and artificial frequency components were also induced by the conventional down-sample technique, thus limiting its usefulness for machine condition monitoring applications.
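The abstract does not give the algorithm's internals, but its name suggests a block-wise peak hold: split the record into blocks whose length equals the down-sample ratio and keep the extreme value of each block, so short-lived bearing-defect impulses survive the reduction. The sketch below is an assumed reading of that idea, not the authors' published implementation; the signal and the 500x ratio are illustrative.

```python
import numpy as np

def peak_hold_downsample(x, ratio):
    """Block-wise peak-hold down-sampling: keep the sample of largest
    magnitude in each block of `ratio` consecutive samples, so that
    impulsive defect content is preserved (sign included)."""
    x = np.asarray(x, dtype=float)
    n = (len(x) // ratio) * ratio            # drop the incomplete last block
    blocks = x[:n].reshape(-1, ratio)
    idx = np.argmax(np.abs(blocks), axis=1)
    return blocks[np.arange(blocks.shape[0]), idx]

# Hypothetical vibration record: weak noise plus sparse defect impulses.
rng = np.random.default_rng(0)
sig = 0.05 * rng.standard_normal(1_000_000)
sig[::5000] += 1.0                           # periodic bearing-defect impulses

reduced = peak_hold_downsample(sig, 500)     # 500x smaller, peaks retained
print(len(sig), "->", len(reduced), "max:", reduced.max())
```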
Abstract:
One of the next great challenges of cell biology is the determination of the enormous number of protein structures encoded in genomes. In recent years, advances in electron cryo-microscopy and high-resolution single particle analysis have developed to the point where they now provide a methodology for high resolution structure determination. Using this approach, images of randomly oriented single particles are aligned computationally to reconstruct 3-D structures of proteins and even whole viruses. One of the limiting factors in obtaining high-resolution reconstructions is obtaining a large enough representative dataset (>100,000 particles). Traditionally, particles have been picked manually, which is an extremely labour-intensive process. The problem is made especially difficult by the low signal-to-noise ratio of the images. This paper describes the development of automatic particle picking software, which has been tested with both negatively stained and cryo-electron micrographs. This algorithm has been shown to be capable of selecting most of the particles, with few false positives. Further work will involve extending the software to detect differently shaped and oriented particles.
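The abstract does not specify the picking algorithm, so the sketch below shows only a common baseline for the task: cross-correlate a reference template with the micrograph and keep well-separated correlation peaks as candidate particle positions. This is not necessarily the authors' method; the disc template, thresholds and synthetic micrograph are assumptions.

```python
import numpy as np
from skimage.feature import match_template, peak_local_max

def pick_particles(micrograph, template, min_distance=20, threshold=0.4):
    """Template cross-correlation picking: score every position, then keep
    well-separated peaks above an absolute correlation threshold."""
    score = match_template(micrograph, template, pad_input=True)
    return peak_local_max(score, min_distance=min_distance,
                          threshold_abs=threshold)

# Hypothetical low-SNR micrograph: noisy background plus three disc-shaped particles.
rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, (512, 512))
yy, xx = np.mgrid[-10:11, -10:11]
disc = (yy**2 + xx**2 <= 10**2).astype(float)
for cy, cx in [(100, 120), (300, 250), (400, 60)]:
    img[cy - 10:cy + 11, cx - 10:cx + 11] += 2.0 * disc

print(pick_particles(img, disc))     # (row, col) coordinates of candidates
```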
Abstract:
One aim of experimental economics is to try to better understand human economic decision making. Early research on the ultimatum bargaining game (Gueth et al., 1982) revealed that motives other than pure monetary reward play a role. Neuroeconomic research has introduced the recording of physiological observations as signals of emotional responses. In this study, we apply heart rate variability (HRV) measuring technology to explore the behaviour and physiological reactions of proposers and responders in the ultimatum bargaining game. Since this technology is small and non-intrusive, we are able to run the experiment in a standard experimental economics setup. We show that low offers by a proposer cause signs of mental stress in both the proposer and the responder, as both exhibit high ratios of low to high frequency activity in the HRV spectrum.
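The LF/HF measure referred to above can be computed from the inter-beat (RR) interval series via a spectral estimate; the sketch below uses Welch's method over the conventional LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) bands. This is a generic illustration rather than the specific pipeline used in the study; the interpolation rate, band edges and synthetic RR series are assumptions.

```python
import numpy as np
from scipy.signal import welch

def lf_hf_ratio(rr_intervals_s, fs_interp=4.0):
    """LF/HF ratio of heart rate variability from RR intervals (seconds):
    resample the irregular RR series to a uniform grid, estimate the PSD
    with Welch's method, and integrate the LF and HF bands."""
    t_beats = np.cumsum(rr_intervals_s)
    t_uniform = np.arange(t_beats[0], t_beats[-1], 1.0 / fs_interp)
    rr_uniform = np.interp(t_uniform, t_beats, rr_intervals_s)
    f, pxx = welch(rr_uniform - rr_uniform.mean(), fs=fs_interp, nperseg=256)
    lf = np.trapz(pxx[(f >= 0.04) & (f < 0.15)], f[(f >= 0.04) & (f < 0.15)])
    hf = np.trapz(pxx[(f >= 0.15) & (f < 0.40)], f[(f >= 0.15) & (f < 0.40)])
    return lf / hf

# Hypothetical 5-minute RR series: ~0.8 s beats with respiratory (HF) modulation.
rng = np.random.default_rng(2)
n = 375
beat_times = np.cumsum(np.full(n, 0.8))
rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.25 * beat_times) + 0.005 * rng.standard_normal(n)
print(f"LF/HF ratio: {lf_hf_ratio(rr):.2f}")
```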
Abstract:
In the last decade, smartphones have gained widespread usage. Since the advent of online application stores, hundreds of thousands of applications have become instantly available to millions of smartphone users. Within the Android ecosystem, application security is governed by digital signatures and a list of coarse-grained permissions. However, this mechanism is not fine-grained enough to provide the user with a sufficient means of control over the applications' activities. The result is abuse of highly sensitive private information, such as phone numbers, without users' notice. We show that there is a high frequency of privacy leaks even among widely popular applications. Together with the fact that the majority of users are not proficient in computer security, this presents a challenge to the engineers developing security solutions for the platform. Our contribution is twofold: first, we propose a service which is able to assess Android Market applications via static analysis and provide detailed but readable reports to the user. Second, we describe a means to mitigate security and privacy threats by automated reverse-engineering and refactoring of binary application packages according to the users' security preferences.
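The proposed service itself is not reproduced here; as a toy illustration of the kind of readable report the abstract describes, the sketch below compares an application's declared permissions and statically detected sensitive API uses against a user's security preferences. All names and data structures are hypothetical, although READ_PHONE_STATE and TelephonyManager.getLine1Number are real Android identifiers.

```python
from dataclasses import dataclass, field

@dataclass
class AppAnalysis:
    """Hypothetical result of a static analysis pass over an APK."""
    name: str
    permissions: set = field(default_factory=set)
    sensitive_calls: set = field(default_factory=set)   # APIs reaching private data

def report(app: AppAnalysis, user_denied: set) -> str:
    """Produce a short, readable report of the findings the user cares about."""
    lines = [f"Report for {app.name}"]
    for perm in sorted(app.permissions & user_denied):
        lines.append(f"  WARNING: requests a permission the user wants denied: {perm}")
    for call in sorted(app.sensitive_calls):
        lines.append(f"  NOTE: statically reachable sensitive API: {call}")
    return "\n".join(lines)

app = AppAnalysis(
    name="com.example.flashlight",
    permissions={"READ_PHONE_STATE", "INTERNET", "CAMERA"},
    sensitive_calls={"TelephonyManager.getLine1Number"},
)
print(report(app, user_denied={"READ_PHONE_STATE", "READ_CONTACTS"}))
```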
Abstract:
This study explored the dynamic performance of an innovative Hybrid Composite Floor Plate System (HCFPS), composed of a Polyurethane (PU) core, outer layers of Glass-fibre Reinforced Cement (GRC) and steel laminates at the tensile regions, using experimental testing and Finite Element (FE) modelling. Experimental testing included heel impact and walking tests for 3200 mm span HCFPS panels. FE models of the HCFPS were developed using the FE program ABAQUS and validated against experimental results. HCFPS is a light-weight, high frequency floor system with an excellent damping ratio of 5% (bare floor) due to the central PU core. Parametric studies were conducted using the validated FE models to investigate the dynamic response of the HCFPS and to identify characteristics that influence the acceleration response under human-induced vibration in service. This vibration performance was compared with recommended acceptable perceptibility limits. The findings of this study show that the HCFPS can be used in residential and office buildings as a light-weight floor system which does not exceed the perceptible thresholds due to human-induced vibrations.
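A rough sense of how the reported 5% damping limits the response after a heel impact can be gained from a single-degree-of-freedom sketch of the decaying acceleration. The natural frequency (taken here as 12 Hz to represent a "high frequency" floor) and the initial impulse are hypothetical and are not values from the paper.

```python
import numpy as np

def heel_impact_acceleration(fn_hz, zeta, v0, t):
    """Free-vibration acceleration (m/s^2) of an SDOF floor model after an
    impulse imparting initial velocity v0 (m/s), using the closed-form
    underdamped solution and a = -2*zeta*wn*v - wn^2*x."""
    wn = 2 * np.pi * fn_hz
    wd = wn * np.sqrt(1 - zeta**2)
    env = np.exp(-zeta * wn * t)
    x = (v0 / wd) * env * np.sin(wd * t)
    v = v0 * env * (np.cos(wd * t) - (zeta * wn / wd) * np.sin(wd * t))
    return -2 * zeta * wn * v - wn**2 * x

t = np.linspace(0.0, 1.0, 2000)
acc = heel_impact_acceleration(fn_hz=12.0, zeta=0.05, v0=0.01, t=t)
print(f"peak acceleration: {np.max(np.abs(acc)) / 9.81 * 100:.2f} %g")
```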
Abstract:
Texture enhancement is an important component of image processing, with extensive application in science and engineering. The quality of medical images, quantified using the texture of the images, plays a significant role in the routine diagnosis performed by medical practitioners. Previously, image texture enhancement was performed using classical integral order differential mask operators. Recently, first order fractional differential operators were implemented to enhance images. Experiments conclude that the use of the fractional differential not only maintains the low frequency contour features in the smooth areas of the image, but also nonlinearly enhances edges and textures corresponding to high-frequency image components. However, whilst these methods perform well in particular cases, they are not routinely useful across all applications. To this end, we applied the second order Riesz fractional differential operator to improve upon existing approaches to texture enhancement. Compared with the classical integral order differential mask operators and other fractional differential operators, our new algorithms provide higher signal-to-noise values, which leads to superior image quality.
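Fractional-order enhancement of this kind can be sketched in the frequency domain, where a Riesz-type fractional derivative of order alpha acts (up to sign) as multiplication by |omega|^alpha; adding a scaled copy of that component back to the image boosts high-frequency texture while leaving smooth regions largely intact. This is a generic frequency-domain sketch under that standard identification, not the specific second-order Riesz operators proposed in the paper; the order, gain and test image are illustrative.

```python
import numpy as np

def fractional_texture_enhance(img, alpha=1.5, gain=2.0):
    """Frequency-domain sketch of fractional-order texture enhancement:
    add a scaled |omega|**alpha-filtered component back to the image."""
    img = np.asarray(img, dtype=float)
    fy = np.fft.fftfreq(img.shape[0])
    fx = np.fft.fftfreq(img.shape[1])
    wy, wx = np.meshgrid(fy, fx, indexing="ij")
    omega = np.sqrt(wx**2 + wy**2)                      # radial frequency magnitude
    frac = np.fft.ifft2(np.fft.fft2(img) * omega**alpha).real
    return np.clip(img + gain * frac, 0, 255)

# Hypothetical test image: smooth ramp (low frequency) plus a fine checkerboard texture.
y, x = np.mgrid[0:128, 0:128]
img = 0.5 * x + 20 * ((x // 2 + y // 2) % 2)
enhanced = fractional_texture_enhance(img)
print(img.std(), enhanced.std())                        # texture contrast increases
```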
Abstract:
Background: Measurement accuracy is critical for biomechanical gait assessment. Very few studies have determined the accuracy of common clinical rearfoot variables between cameras with different collection frequencies. Research question: What is the measurement error for common rearfoot gait parameters when using a standard 30Hz digital camera compared to a 100Hz camera? Type of study: Descriptive. Methods: 100 footfalls were recorded from 10 subjects (10 footfalls per subject) running on a treadmill at 2.68m/s. A high-speed digital timer, accurate to within 1ms, served as an external reference. Markers were placed along the vertical axis of the heel counter and the long axis of the shank. 2D coordinates for the four markers were determined from heel strike to heel lift. Variables of interest included time of heel strike (THS), time of heel lift (THL), time to maximum eversion (TMax), and maximum rearfoot eversion angle (EvMax). Results: The THS difference was 29.77ms (+/- 8.77), the THL difference was 35.64ms (+/- 6.85), and the TMax difference was 16.50ms (+/- 2.54). These temporal values represent differences equal to 11.9%, 14.3%, and 6.6% of the stance phase of running gait, respectively. The EvMax difference was 1.02 degrees (+/- 0.46). Conclusions: A 30Hz camera is accurate, compared to a high-frequency camera, in determining TMax and EvMax during a clinical gait analysis. However, relatively large differences, in excess of 12% of the stance phase of gait, were measured for the THS and THL variables.
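The percentages quoted above follow directly from the millisecond differences once a stance-phase duration is fixed; working backwards from the reported figures implies a stance time of roughly 250 ms, and the snippet below reproduces the conversion under that assumption.

```python
# Convert the reported timing differences (ms) into percent of stance phase,
# assuming the ~250 ms stance duration implied by the paper's own percentages.
stance_ms = 250.0
differences_ms = {"THS": 29.77, "THL": 35.64, "TMax": 16.50}
for name, d in differences_ms.items():
    print(f"{name}: {d:.2f} ms = {100 * d / stance_ms:.1f}% of stance")
```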
Abstract:
Purpose: UC (urothelial carcinoma) is a disease of the entire urothelium, characterized by multiplicity and multifocality. The clonal relationship among multiple UCs has implications regarding adjuvant chemotherapy. It has been investigated in studies of chromosomal alteration and single gene mutation. However, these genetic changes can occur in unrelated tumors under similar carcinogenic selection pressures. Tumors with high microsatellite instability (MSI) have numerous DNA mutations, of which many provide no selection benefit. While these tumors represent an ideal model for studying UC clonality, their low frequency has prevented their previous investigation. Materials and Methods: We investigated 32 upper and lower urinary tract UCs with high MSI and 4 non-UC primary cancers in 9 patients. We used the high frequency and specificity of individual DNA mutations in these tumors (MSI at 17 loci) and the early timing of epigenetic events (methylation of 7 gene promoters) to investigate tumor clonality. Results: Molecular alterations varied among tumors from different primary organs, but they appeared related in the UCs of all 9 patients. While 7 patients had a high degree of concordance among UCs, in 2 the UCs shared only a few similar alterations. Genetic and epigenetic abnormalities were frequently found in normal urothelial samples. Conclusions: Multiple UCs in each patient appeared to arise from a single clone. The molecular order of tumor development differed from the timing of clinical presentation and suggested that residual malignant cells persist in the urinary tract despite apparently curative surgery. These cells lead to subsequent tumor relapse, and new methods are required to detect and eradicate them.
Abstract:
Intra-host sequence data from RNA viruses have revealed the ubiquity of defective viruses in natural viral populations, sometimes at surprisingly high frequency. Although defective viruses have long been known to laboratory virologists, their relevance in clinical and epidemiological settings has not been established. The discovery of long-term transmission of a defective lineage of dengue virus type 1 (DENV-1) in Myanmar, first seen in 2001, raised important questions about the emergence of transmissible defective viruses and their role in viral epidemiology. By combining phylogenetic analyses and dynamical modelling, we investigate how evolutionary and ecological processes at the intra-host and inter-host scales shaped the emergence and spread of the defective DENV-1 lineage. We show that this lineage of defective viruses emerged between June 1998 and February 2001, and that the defective virus was transmitted primarily through co-transmission with the functional virus to uninfected individuals. We provide evidence that, surprisingly, this co-transmission route has a higher transmission potential than transmission of functional dengue viruses alone. Consequently, we predict that the defective lineage should increase overall incidence of dengue infection, which could account for the historically high dengue incidence reported in Myanmar in 2001-2002. Our results show the unappreciated potential for defective viruses to impact the epidemiology of human pathogens, possibly by modifying the virulence-transmissibility trade-off, or to emerge as circulating infections in their own right. They also demonstrate that interactions between viral variants, such as complementation, can open new pathways to viral emergence.
Abstract:
The price formation of financial assets is a complex process. It extends beyond the standard economic paradigm of supply and demand to the understanding of the dynamic behavior of price variability, the price impact of information, and the implications of the trading behavior of market participants for prices. In this thesis, I study aggregate market and individual asset volatility, liquidity dimensions, and causes of mispricing for US equities over a recent sample period. How are volatility forecasts modeled, what determines intradaily jumps and changes in intradaily volatility, and what drives the premium of traded equity indexes? Are they induced, for example, by the information content of lagged volatility and return parameters, or by macroeconomic news, changes in liquidity and volatility? Besides satisfying our intellectual curiosity, answers to these questions are of direct importance to investors developing trading strategies, policy makers evaluating macroeconomic policies, and arbitrageurs exploiting mispricing in exchange-traded funds. Results show that the leverage effect and lagged absolute returns improve forecasts of the continuous components of daily realized volatility as well as jumps. Implied volatility does not subsume the information content of lagged returns in forecasting realized volatility and its components. The reported results are linked to the heterogeneous market hypothesis and demonstrate the validity of extending the hypothesis to returns. Depth shocks, signed order flow, the number of trades, and resiliency are the most important determinants of intradaily volatility. In contrast, spread shocks and resiliency are predictive of signed intradaily jumps. There are fewer macroeconomic news announcement surprises that cause extreme price movements or jumps than there are surprises that elevate intradaily volatility. Finally, the premium of exchange-traded funds is significantly associated with momentum in net asset value and a number of liquidity parameters, including the spread, traded volume, and illiquidity. The mispricing of industry exchange-traded funds suggests that limits to arbitrage are driven by potential illiquidity.
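One standard way to separate the continuous and jump components of daily realized volatility, in the spirit of the decomposition studied here, is to compare realized variance with bipower variation. The sketch below shows that textbook split (in the style of Barndorff-Nielsen and Shephard) on synthetic intradaily returns; it is illustrative rather than a reproduction of the thesis's models.

```python
import numpy as np

def realized_variance(returns):
    """Realized variance: sum of squared intradaily returns."""
    return np.sum(returns**2)

def bipower_variation(returns):
    """Bipower variation, robust to jumps: (pi/2) * sum |r_t||r_{t-1}|."""
    return (np.pi / 2) * np.sum(np.abs(returns[1:]) * np.abs(returns[:-1]))

def jump_component(returns):
    """Non-negative jump contribution to daily realized variance."""
    return max(realized_variance(returns) - bipower_variation(returns), 0.0)

# Hypothetical day of 5-minute returns (78 per day) with one injected jump.
rng = np.random.default_rng(3)
r = 0.001 * rng.standard_normal(78)
r[40] += 0.01                        # a single intradaily jump

rv, bv = realized_variance(r), bipower_variation(r)
print(f"RV={rv:.6e}  BV={bv:.6e}  jump part={jump_component(r):.6e}")
```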
Abstract:
This article covers lymphoproliferative disorders in patients with primary or acquired immunodeficiencies. Primary immunodeficiencies include Ataxia Telangiectasia and X-linked disorders such as Wiskott-Aldrich syndrome. Acquired immunodeficiencies predominantly occur in the setting of infection with the Human Immunodeficiency Virus or arise following immunosuppressive therapy administered after organ transplantation. The rising incidence of HIV throughout the world and the dramatic increase in transplant surgery since the 1990s suggest that these lymphomas will remain an important health problem. Evidence for lymphoma developing as a result of treatment with methotrexate or Tumour Necrosis Factor antagonists for autoimmune conditions will also be reviewed. The lymphoproliferations that occur with immunodeficiency are extremely heterogeneous. In part this reflects the diversity of the causal immune defect. The most striking clinical characteristic is the high frequency of extranodal disease. Frequently, these lymphomas are driven by viruses such as Epstein-Barr virus (EBV), although the lack of EBV in a proportion indicates that alternative pathways must also be involved in the pathogenesis. Lastly, discussion will centre on mechanisms utilized by lymphomas in the immunodeficient, as these may have applications to lymphomas in the "immunocompetent" by serving as a paradigm for the altered immunoregulatory environment present in many lymphoma subtypes.
Abstract:
Nitrous oxide emissions from soil are known to be spatially and temporally volatile. Reliable estimation of emissions over a given time and space depends on measuring with sufficient intensity, but deciding on the number of measuring stations and the frequency of observation can be vexing. The question also arises of whether low frequency manual observations provide results comparable to high frequency automated sampling. Data collected from a replicated field experiment were studied intensively with the intention of giving some statistically robust guidance on these issues. In the experiment, nitrous oxide soil-to-air flux was monitored within 10 m by 2.5 m plots by automated closed chambers at an average sampling interval of 3 h and by manual static chambers at an average sampling interval of three days, over sixty days. Observed trends in flux over time from the static chambers were mostly within the auto chamber bounds of experimental error. Cumulative nitrous oxide emissions as measured by each system were also within error bounds. Under the temporal response pattern in this experiment, no significant loss of information was observed after culling the data to simulate results under various low frequency scenarios. Within the confines of this experiment, observations from the manual chambers were not spatially correlated above distances of 1 m. Statistical power was therefore found to improve with increased replicates per treatment or chambers per replicate. Careful after-action review of experimental data can deliver savings for future work.
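The culling exercise described above can be mimicked in a few lines: integrate the high frequency (3-hourly) flux series to a cumulative emission, subsample it to a three-day manual schedule, integrate again, and compare. The synthetic flux trace below is hypothetical and only illustrates the bookkeeping, not the experiment's data.

```python
import numpy as np

def cumulative_emission(times_h, flux):
    """Trapezoidal integration of a flux time series (units: flux * hours)."""
    return np.trapz(flux, times_h)

# Hypothetical 60-day N2O flux record at a 3 h sampling interval:
# a noisy baseline with a short emission pulse after a (simulated) rain event.
t = np.arange(0, 60 * 24, 3.0)                        # hours
flux = 5 + 2 * np.random.default_rng(4).standard_normal(len(t))
flux += 80 * np.exp(-((t - 500) / 48.0) ** 2)         # emission pulse

auto_total = cumulative_emission(t, flux)

# Cull to a three-day manual schedule (one observation every 72 h).
step = int(72 / 3)
manual_total = cumulative_emission(t[::step], flux[::step])

print(f"auto chambers (3 h):   {auto_total:.0f}")
print(f"manual chambers (3 d): {manual_total:.0f}  "
      f"({100 * manual_total / auto_total:.1f}% of the 3 h estimate)")
```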
Abstract:
An ironless motor for use as a direct wheel drive is presented. The motor is intended for use in a lightweight (600kg), low drag, series hybrid commuter vehicle under development at The University of Queensland. The vehicle will utilise these ironless motors in each of its rear wheels, with each motor producing a peak torque output of 500Nm and a maximum rotational speed of 1500rpm. The axial flux motor consists of twin ironless litz wire stators with a central magnetic ring and simplified Halbach magnet arrays on either side. A small amount of iron is used to support the outer Halbach arrays and to improve the peak magnetic flux density. Ducted air cooling is used to remove heat from the motor and will allow for a continuous torque rating of 250Nm. Ironless machines have previously been shown to be effective in high speed, high frequency applications (above 1000Hz). They are generally regarded as non-optimal for low speed applications, as iron cores allow for better magnet utilisation and do not significantly increase the weight of a machine. However, ironless machines can also be effective in applications where the average torque requirement is much lower than the peak torque requirement, such as in some vehicle drive applications. The low spinning losses of ironless machines are shown to result in very high energy throughput efficiency over a wide range of vehicle driving cycles.