939 results for FREQUENCY APPROACH


Relevance:

30.00%

Publisher:

Abstract:

The task-based approach involves identifying all the tasks performed in each workplace in order to refine the exposure characterization. Its starting point is the recognition that only a more detailed and comprehensive understanding of tasks makes it possible to understand the exposure scenario in detail. It also allows the identification of the most suitable risk management measures. The approach can likewise be used when workplace surfaces must be identified for sampling chemicals for which the dermal route is the most important exposure route. In this case, detailed observation of task performance makes it possible to identify the surfaces that workers contact most frequently and that may be contaminated. This study aimed to identify the surfaces to sample when performing an occupational exposure assessment of antineoplastic agents; surface selection was based on the task-based approach.

Relevance:

30.00%

Publisher:

Abstract:

Terahertz (THz) technology has been generating a lot of interest because of the potential applications for systems working in this frequency range. However, to fully achieve this potential, effective and efficient ways of generating controlled signals in the terahertz range are required. Devices that exhibit negative differential resistance (NDR) in a region of their current-voltage (I-V) characteristics have been used in circuits for the generation of radio frequency signals. Of these NDR devices, resonant tunneling diode (RTD) oscillators, with their ability to oscillate in the THz range, are considered among the most promising solid-state sources for terahertz signal generation at room temperature. There are, however, limitations and challenges with these devices, notably the inherently low output power of RTD oscillators, usually in the range of microwatts (µW) when milliwatts (mW) are desired. At the device level, parasitic oscillations caused by the biasing line inductance when the device is biased in the NDR region prevent accurate device characterisation, which in turn prevents device modelling for computer simulations.

This thesis describes work on the I-V characterisation of tunnel diode (TD) and RTD devices (fabricated by Dr. Jue Wang), and the radio frequency (RF) characterisation and small-signal modelling of RTDs. The thesis also describes the design and measurement of hybrid TD oscillators for higher output power, and the design and measurement of a planar Yagi antenna (fabricated by Khalid Alharbi) for THz applications.

To enable oscillation-free current-voltage characterisation of tunnel diodes, a commonly employed method is to connect a suitable resistor across the device to make the total differential resistance in the NDR region positive. However, this approach is not without problems, as the value of the resistor has to satisfy certain conditions or bias oscillations will still be present in the NDR region of the measured I-V characteristics. The method is also difficult to use for RTDs fabricated on wafer, owing to discrepancies between the designed and actual resistance values of resistors fabricated with thin-film technology. In this work, pulsed DC rather than static DC measurements during device characterisation were shown to give accurate characteristics in the NDR region without the need for a stabilisation resistor. This approach allows direct, oscillation-free characterisation of devices. Experimental results show that the I-V characteristics of tunnel diodes and RTD devices can be measured free of bias oscillations in the NDR region.

A new power-combining topology to address the low output power of TD and RTD oscillators is also presented. The design employs two oscillators biased separately, with the combined output power of both collected at a single load. Compared to previous approaches, this method keeps the frequency of oscillation of the combined oscillators the same as that of a single oscillator. Experimental results with a hybrid circuit using two tunnel diode oscillators, compared with a single-oscillator design with similar component values, show that the coupled oscillators produce double the RF output power of the single oscillator. This topology can be scaled to higher (up to terahertz) frequencies in the future by using RTD oscillators.

Finally, a broadband Yagi antenna suitable for wireless communication at terahertz frequencies is presented. The return loss of the antenna showed that its bandwidth is larger than the measured range (140-220 GHz). A new method was used to characterise the radiation pattern of the antenna in the E-plane; this was carried out on-wafer, and the measured radiation pattern showed good agreement with the simulated pattern. In summary, this work makes important contributions to the accurate characterisation and modelling of TDs and RTDs, to circuit-based techniques for power combining of high-frequency TD or RTD oscillators, and to antennas suitable for on-chip integration with high-frequency oscillators.
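
As a hedged illustration of the stabilisation-resistor condition described above: for a shunt resistor R across the device, the total differential conductance in the NDR region is dI/dV + 1/R, which stays positive only if R < 1/|dI/dV| at every NDR bias point. The Python sketch below estimates that bound from an I-V curve; the toy curve and the helper name are illustrative assumptions, not data or code from the thesis.

    import numpy as np

    def max_stabilising_resistor(v, i):
        # Differential conductance dI/dV along the measured I-V curve.
        g = np.gradient(i, v)
        ndr = g < 0                        # bias points inside the NDR region
        if not ndr.any():
            return None                    # no NDR region in this sweep
        # With a shunt resistor R, G_total = dI/dV + 1/R > 0 requires
        # R < 1/|dI/dV| everywhere in the NDR region, so the binding
        # constraint comes from the steepest negative slope.
        return 1.0 / np.abs(g[ndr]).max()

    # Toy N-shaped I-V curve with an NDR region between ~0.33 V and ~1 V.
    v = np.linspace(0.0, 1.2, 241)
    i = 0.05 * (v - 2.0 * v**2 + v**3)
    print(max_stabilising_resistor(v, i))  # largest usable shunt resistance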

Relevance:

30.00%

Publisher:

Abstract:

International audience

Relevance:

30.00%

Publisher:

Abstract:

Statistical approaches to study extreme events require, by definition, long time series of data. In many scientific disciplines, these series are often subject to variations at different temporal scales that affect the frequency and intensity of their extremes. Therefore, the assumption of stationarity is violated and alternative methods to conventional stationary extreme value analysis (EVA) must be adopted. Using the example of environmental variables subject to climate change, in this study we introduce the transformed-stationary (TS) methodology for non-stationary EVA. This approach consists of (i) transforming a non-stationary time series into a stationary one, to which the stationary EVA theory can be applied, and (ii) reverse transforming the result into a non-stationary extreme value distribution. As a transformation, we propose and discuss a simple time-varying normalization of the signal and show that it enables a comprehensive formulation of non-stationary generalized extreme value (GEV) and generalized Pareto distribution (GPD) models with a constant shape parameter. A validation of the methodology is carried out on time series of significant wave height, residual water level, and river discharge, which show varying degrees of long-term and seasonal variability. The results from the proposed approach are comparable with the results from (a) a stationary EVA on quasi-stationary slices of non-stationary series and (b) the established method for non-stationary EVA. However, the proposed technique comes with advantages in both cases. For example, in contrast to (a), the proposed technique uses the whole time horizon of the series for the estimation of the extremes, allowing for a more accurate estimation of large return levels. Furthermore, with respect to (b), it decouples the detection of non-stationary patterns from the fitting of the extreme value distribution. As a result, the steps of the analysis are simplified and intermediate diagnostics are possible. In particular, the transformation can be carried out by means of simple statistical techniques such as low-pass filters based on the running mean and the standard deviation, and the fitting procedure is a stationary one with a few degrees of freedom and is easy to implement and control. An open-source MATLAB toolbox has been developed to cover this methodology, which is available at https://github.com/menta78/tsEva/ (Mentaschi et al., 2016).
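
Step (i) of the TS methodology can be sketched with exactly the simple statistics mentioned above, a running mean and running standard deviation. This is a minimal Python illustration, not the tsEva implementation; the window length and the edge handling via convolution are assumptions.

    import numpy as np

    def ts_transform(x, window):
        # Time-varying normalization: standardise the series with a
        # running mean and a running standard deviation.
        x = np.asarray(x, dtype=float)
        kernel = np.ones(window) / window
        mu = np.convolve(x, kernel, mode='same')               # running mean
        sigma = np.sqrt(np.convolve((x - mu)**2, kernel, mode='same'))
        return (x - mu) / sigma, mu, sigma

    # Stationary EVA (e.g. a GEV or GPD fit) is then applied to the
    # standardised series; step (ii) back-transforms the result, so a
    # stationary return level z becomes z(t) = mu(t) + sigma(t) * z.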

Relevance:

30.00%

Publisher:

Abstract:

Cardiovascular diseases (CVDs) have reached epidemic proportions in the US and worldwide, with serious consequences in terms of human suffering and economic impact. More than one third of American adults suffer from CVDs. The total direct and indirect costs of CVDs are more than $500 billion per year. Therefore, there is an urgent need to develop noninvasive diagnostic methods, to design minimally invasive assist devices, and to develop economical and easy-to-use monitoring systems for cardiovascular diseases. In order to achieve these goals, it is necessary to gain a better understanding of the subsystems that constitute the cardiovascular system. The aorta is one of these subsystems, and its role in cardiovascular functioning has been underestimated. Traditionally, the aorta and its branches have been viewed as resistive conduits connected to an active pump (the left ventricle of the heart). However, this perception fails to explain many observed physiological results. My goal in this thesis is to demonstrate the subtle but important role of the aorta as a system, with a focus on the wave dynamics in the aorta.

The operation of a healthy heart is based on an optimized balance between its pumping characteristics and the hemodynamics of the aorta and vascular branches. This delicate balance between the aorta and heart can be impaired by aging, smoking, or disease. The heart generates pulsatile flow that produces pressure and flow waves as it enters the compliant aorta. These aortic waves propagate and reflect from reflection sites (bifurcations and tapering). They can act constructively and assist the blood circulation, or they may act destructively, promoting disease or initiating sudden cardiac death. These waves also carry information about diseases of the heart, vascular disease, and the coupling of heart and aorta. In order to elucidate the role of the aorta as a dynamic system, the interplay between the dominant wave-dynamic parameters is investigated in this study. These parameters are the heart rate, the aortic compliance (wave speed), and the locations of reflection sites. Both computational and experimental approaches have been used in this research. In some cases, the results are further explained using theoretical models.

The main findings of this study are as follows: (i) developing a physiologically realistic outflow boundary condition for blood flow modeling in a compliant vasculature; (ii) demonstrating that pulse pressure as a single index cannot predict the true level of pulsatile workload on the left ventricle; (iii) proving that there is an optimum heart rate in which the pulsatile workload of the heart is minimized and that the optimum heart rate shifts to a higher value as aortic rigidity increases; (iv) introducing a simple bio-inspired device for correction and optimization of aortic wave reflection that reduces the workload on the heart; (v) deriving a non-dimensional number that can predict the optimum wave dynamic state in a mammalian cardiovascular system; (vi) demonstrating that waves can create a pumping effect in the aorta; (vii) introducing a system parameter and a new medical index, Intrinsic Frequency, that can be used for noninvasive diagnosis of heart and vascular diseases; and (viii) proposing a new medical hypothesis for sudden cardiac death in young athletes.

Relevance:

30.00%

Publisher:

Abstract:

Doctorate in Economics

Relevance:

30.00%

Publisher:

Abstract:

Recent advances in mobile phone cameras have poised them to overtake compact hand-held cameras as the consumer's preferred camera option. Along with advances in the number of pixels, motion-blur removal, face-tracking, and noise-reduction algorithms play significant roles in the internal processing of these devices. An undesired effect of severe noise reduction is the loss of texture (i.e. low-contrast fine details) from the original scene. Current established methods for resolution measurement fail to accurately portray the texture loss incurred in a camera system. The development of an accurate objective method to quantify the texture-preservation or texture-reproduction capability of a camera device is therefore important. The 'Dead Leaves' target has been used extensively as a method to measure the modulation transfer function (MTF) of cameras that employ highly non-linear noise-reduction methods. This stochastic model consists of a series of overlapping circles with radii r distributed as r⁻³ and uniformly distributed gray levels, which gives an accurate model of occlusion in a natural setting and hence mimics a natural scene. The target can be used to model texture transfer through a camera system when a natural scene is captured. In the first part of our study we identify various factors that affect the MTF measured using the 'Dead Leaves' chart, including variations in illumination, distance, exposure time, and ISO sensitivity, among others. We discuss the main differences between this method and existing resolution-measurement techniques and identify its advantages. In the second part of this study, we propose an improvement to the current texture-MTF measurement algorithm. High-frequency residual noise in the processed image contains the same frequency content as fine texture detail and is sometimes reported as such, leading to inaccurate results. A wavelet-thresholding-based denoising technique is used to model the noise present in the final captured image. This updated noise model is then used to calculate an accurate texture MTF. We present comparative results for both algorithms under various image-capture conditions.
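
The stochastic model is simple enough to sketch. The Python snippet below renders a minimal Dead Leaves image: disk radii are drawn from a power-law r⁻³ density by inverse-transform sampling, centres and gray levels are uniform, and later disks occlude earlier ones. The image size, radius bounds, and disk count are arbitrary illustrative choices, not the parameters of any standardised chart.

    import numpy as np

    def dead_leaves(size=512, n_disks=5000, r_min=2.0, r_max=100.0, seed=0):
        rng = np.random.default_rng(seed)
        img = np.full((size, size), 0.5)
        yy, xx = np.mgrid[0:size, 0:size]
        # Inverse-transform sampling of radii from p(r) ~ r^-3 on
        # [r_min, r_max]; for exponent 3 the CDF uses a = 1 - 3 = -2.
        a = -2.0
        u = rng.random(n_disks)
        radii = (r_min**a + u * (r_max**a - r_min**a)) ** (1.0 / a)
        for r in radii:
            cx, cy = rng.uniform(0, size, 2)
            gray = rng.random()   # uniformly distributed gray level
            # Later disks overwrite earlier ones, modelling occlusion.
            img[(xx - cx)**2 + (yy - cy)**2 <= r * r] = gray
        return img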

Relevance:

30.00%

Publisher:

Abstract:

This study computed trends in extreme precipitation events in Florida for 1950-2010. Hourly aggregated rainfall data from 24 stations of the National Climatic Data Center were analyzed to derive time series of extreme rainfall for 12 durations, ranging from 1 hour to 7 days. The non-parametric Mann-Kendall test and the Theil-Sen approach were applied to detect the significance of trends in annual maximum rainfall, the number of above-threshold events, and the average magnitude of above-threshold events for four common analysis periods. A Trend-Free Pre-Whitening (TFPW) approach was applied to remove serial correlations, and a bootstrap resampling approach was used to detect the field significance of trends. The results for annual maximum rainfall revealed dominant increasing trends at the 0.10 statistical significance level, especially for hourly events in the longer periods and daily events in the recent period. The number of above-threshold events exhibited strong decreasing trends for hourly durations in all time periods.
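
A minimal Python sketch of the two trend statistics named above; it omits the tie correction and the TFPW step, so it illustrates the tests rather than reproducing the authors' exact procedure.

    import numpy as np
    from scipy.stats import norm

    def mann_kendall(x):
        # S counts concordant minus discordant pairs; under the null
        # hypothesis of no trend, S is asymptotically normal.
        x = np.asarray(x, dtype=float)
        n = len(x)
        s = sum(np.sign(x[j] - x[i])
                for i in range(n - 1) for j in range(i + 1, n))
        var_s = n * (n - 1) * (2 * n + 5) / 18.0
        z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
        return s, 2.0 * (1.0 - norm.cdf(abs(z)))  # S and two-sided p-value

    def theil_sen_slope(x):
        # Robust trend magnitude: the median of all pairwise slopes.
        n = len(x)
        return float(np.median([(x[j] - x[i]) / (j - i)
                                for i in range(n - 1)
                                for j in range(i + 1, n)]))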

Relevance:

30.00%

Publisher:

Abstract:

The frequency, time, and places of charging and discharging have a critical impact on the Quality of Experience (QoE) of using Electric Vehicles (EVs). EV charging and discharging scheduling schemes should consider both the QoE of using an EV and the load capacity of the power grid. In this paper, we design a traveling-plan-aware scheduling scheme for EV charging in the driving pattern and a cooperative EV charging and discharging scheme in the parking pattern to improve the QoE of using an EV and enhance the reliability of the power grid. For traveling-plan-aware scheduling, the assignment of EVs to Charging Stations (CSs) is modeled as a many-to-one matching game and the Stable Matching Algorithm (SMA) is proposed. For cooperative EV charging and discharging in the parking pattern, the electricity exchange between charging EVs and discharging EVs in the same parking lot is formulated as a many-to-many matching model with ties, and we develop the Pareto Optimal Matching Algorithm (POMA). Simulation results indicate that the SMA significantly improves the average system utility for EV charging in the driving pattern, and that the POMA increases the amount of electricity offloaded from the grid, which helps enhance the reliability of the power grid.
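
The abstract does not spell out the SMA itself, but the underlying many-to-one matching can be sketched with generic EV-proposing deferred acceptance, which always yields a stable matching. The preference lists, capacities, and function name below are illustrative assumptions, not the paper's algorithm.

    from collections import defaultdict

    def deferred_acceptance(ev_prefs, cs_prefs, capacity):
        # ev_prefs: {ev: ranked list of CSs}; cs_prefs: {cs: ranked list
        # of EVs}; capacity: {cs: max number of EVs it can serve}.
        rank = {cs: {ev: k for k, ev in enumerate(p)}
                for cs, p in cs_prefs.items()}
        next_choice = {ev: 0 for ev in ev_prefs}
        matched = defaultdict(list)
        free = list(ev_prefs)
        while free:
            ev = free.pop()
            if next_choice[ev] >= len(ev_prefs[ev]):
                continue                   # EV has exhausted its list
            cs = ev_prefs[ev][next_choice[ev]]
            next_choice[ev] += 1
            matched[cs].append(ev)
            if len(matched[cs]) > capacity[cs]:
                # Over capacity: the CS rejects its least-preferred EV.
                worst = max(matched[cs], key=lambda e: rank[cs][e])
                matched[cs].remove(worst)
                free.append(worst)
        return dict(matched)

    print(deferred_acceptance(
        {'ev1': ['cs1', 'cs2'], 'ev2': ['cs1'], 'ev3': ['cs1', 'cs2']},
        {'cs1': ['ev2', 'ev1', 'ev3'], 'cs2': ['ev1', 'ev3']},
        {'cs1': 1, 'cs2': 2}))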

Relevance:

30.00%

Publisher:

Abstract:

The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time delay interferometry (TDI) observables, which have to be generated before any weak-signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because of the multiple occurrences of the same noises in the different raw data streams. Originally, these observables were generated manually, starting with LISA as a simple stationary array and then adjusting to incorporate the antenna's motion. However, none of the observables survived the flexing of the arms, in that they did not lead to cancellation with the same structure. The principal component approach, presented by Romano and Woan, is another way of handling these noises; it simplifies the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which appears in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that performing an eigendecomposition of this matrix produces two distinct sets of eigenvalues, distinguishable by the absence of laser frequency noise from one set. Transforming the raw data using the corresponding eigenvectors also produces data free from the laser frequency noises. This result led to the idea that the principal components may actually be time delay interferometry observables, since they produce the same outcome, that is, data free from laser frequency noise. The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to that using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. For testing the connection between the principal components and the TDI observables, a 10 × 10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. Results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables, and therefore analysis using principal components should give the same results as analysis using the traditional observables. This was proven by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables.
This method fails if the eigenvalues that are free from laser frequency noises are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, arm lengths, and noise variances. Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which appear in the covariance matrix and, in our toy model investigations, did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix is destroyed, which affects any computation methods that take advantage of this structure. Separating the two sets of data for the analysis was not necessary, because the laser frequency noises are very large compared to the photodetector noises, which resulted in a significant reduction of the data containing them after the matrix inversion. In the frequency domain, the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing it to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and non-stationarity do not show up, because of the summation in the Fourier transform.
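
The eigendecomposition argument can be reproduced in a toy Python model (not the thesis's 10 × 10 integer matrix, whose entries are not given here, and ignoring the time shifts of the real array): channels sharing one large common "laser" noise yield a covariance matrix with one large eigenvalue, and projecting onto the small-eigenvalue eigenvectors cancels the common noise, analogous to the laser-noise-free observables.

    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, n_chan = 100_000, 3

    # Each channel sees the same large 'laser' noise plus its own small
    # 'photodetector' noise (variances are illustrative, not LISA values).
    laser = 100.0 * rng.standard_normal(n_samples)
    data = np.stack([laser + rng.standard_normal(n_samples)
                     for _ in range(n_chan)], axis=1)

    cov = np.cov(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    print(eigvals)                           # two values near 1, one near 3e4

    # Combinations along the small-eigenvalue eigenvectors are free of the
    # common laser noise.
    clean = data @ eigvecs[:, :2]
    print(clean.var(axis=0))                 # ~1: laser noise cancelled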

Relevance:

30.00%

Publisher:

Abstract:

Endemic zoonotic diseases remain a serious but poorly recognised problem in affected communities in developing countries. Despite the overall burden of zoonoses on human and animal health, information about their impacts in endemic settings is lacking and most of these diseases continue to be neglected. The non-specific clinical presentation of these diseases has been identified as a major challenge in their identification (even with good laboratory diagnosis) and control. The signs in animals and symptoms in humans are easily confused with those of other, non-zoonotic diseases, leading to widespread misdiagnosis in areas where diagnostic capacity is limited. The communities most affected by these diseases live in close proximity with the animals they depend on for their livelihoods, which further complicates the understanding of the epidemiology of zoonoses. This thesis reviews the pattern of reporting of zoonotic pathogens that cause febrile illness in malaria-endemic countries, and evaluates the recognition of animal associations among other risk factors in the transmission and management of zoonoses. The findings of the review chapter were further investigated through a laboratory study of risk factors for bovine leptospirosis and an analysis of exposure patterns of livestock coxiellosis in the subsequent chapters. A review was undertaken of 840 articles that were part of a larger review of zoonotic pathogens that cause human fever. The review process involved three main steps: filtering and reference classification; identification of abstracts that describe risk factors; and data extraction and summary analysis. Abstracts of the 840 references were transferred into a Microsoft Excel spreadsheet, where several subsets of abstracts were generated using filters and text searches to classify the content of each abstract. Data were then extracted and summarised to describe geographical patterns of the pathogens reported and to determine how frequently animal-related risk factors were considered among studies that investigated risk factors for zoonotic pathogen transmission. Subsequently, a seroprevalence study of bovine leptospirosis in northern Tanzania was undertaken in the second chapter of this thesis. The study involved screening serum samples, obtained from an abattoir survey and a cross-sectional study (Bacterial Zoonoses Project), for antibodies against Leptospira serovar Hardjo. The data were analysed using generalised linear mixed models (GLMMs) to identify risk factors for cattle infection. The final chapter analysed Q fever data, also obtained from the Bacterial Zoonoses Project, to determine exposure patterns across livestock species using GLMMs. Leptospira spp. (10.8%, 90/840) and Rickettsia spp. (10.7%, 86/840) were identified as the most frequently reported zoonotic pathogens causing febrile illness, while Rabies virus (0.4%, 3/840) and Francisella spp. (0.1%, 1/840) were the least reported across malaria-endemic countries. The majority of the pathogens were reported in Asia, and the frequency of reporting appears to be higher in areas where outbreaks are most often reported. It was also observed that animal-related risk factors are not often considered among other risk factors for zoonotic pathogens that cause human fever in malaria-endemic countries.
The seroprevalence study indicated that Leptospira serovar Hardjo is widespread in the cattle population of northern Tanzania, and that animal husbandry system and age are the two most important risk factors influencing seroprevalence. Cattle in pastoral systems and adult cattle were significantly more likely to be seropositive than non-pastoral and young animals, respectively, while there was no significant effect of cattle breed or sex. Exposure patterns of Coxiella burnetii appear different for each livestock species. While risk factors were identified for goats (animal husbandry system, age, and sex) and sheep (animal husbandry system and sex), none were identified for cattle. In addition, there was no evidence of a significant influence of mixed livestock-keeping on animal coxiellosis. Zoonotic agents that cause human fever are common in developing countries. The role of animals in the transmission of zoonotic pathogens that cause febrile illness is not fully recognised and appreciated. Since Leptospira spp. and C. burnetii are among the most frequently reported pathogens causing human fever across malaria-endemic countries, and are also prevalent in livestock populations, control and preventive measures that recognise animals as a source of infection would be very important, especially in livestock-keeping communities where people live in close proximity with their animals.
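
A minimal sketch of a logistic GLMM of the kind described, with fixed effects for husbandry system and age class and a random intercept for the sampling unit, using statsmodels' variational-Bayes mixed GLM. The data file, column names, and grouping variable are hypothetical, since the abstract does not specify the exact model structure.

    import pandas as pd
    from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

    # Hypothetical serology table: one row per animal, with a binary
    # 'seropositive' outcome and candidate risk factors as columns.
    df = pd.read_csv("lepto_serology.csv")

    model = BinomialBayesMixedGLM.from_formula(
        "seropositive ~ husbandry + age_class",  # fixed-effect risk factors
        {"village": "0 + C(village)"},           # random intercept per village
        df,
    )
    result = model.fit_vb()                      # variational Bayes fit
    print(result.summary())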

Relevance:

30.00%

Publisher:

Abstract:

An integrative multidisciplinary approach was used to delimit boundaries among cryptic species within the Anastrepha fraterculus complex in Brazil. Sexual compatibility, courtship and sexual acoustic behaviour, female morphometric variability, variation in the mitochondrial gene COI, and the presence of Wolbachia were compared among A. fraterculus populations from the Southern (Vacaria, Pelotas, Bento Gonçalves, São Joaquim) and Southeastern (Piracicaba) regions of Brazil. Our results suggest full mating compatibility among A. fraterculus populations from the Southern region and partial pre-zygotic reproductive isolation of these populations when compared with the population from the Southeastern region. A. fraterculus populations from the two regions differed in the frequency of courtship displays and in aspects of the calling phase and the mounting acoustic signal. Morphometric analysis showed differences between Southern and Southeastern region samples. All populations analyzed were infected with Wolbachia. The trees generated from the COI sequencing data are broadly congruent with the behavioural and morphometric data, with the exception of one Southern population. The likely mechanisms by which A. fraterculus populations might have diverged are discussed in detail based on behavioural, morphometric, molecular genetic, and biogeographical studies.