187 results for Realized volatility
Abstract:
This thesis deals with the problem of instantaneous frequency (IF) estimation of sinusoidal signals. This topic plays a significant role in signal processing and communications. Depending on the type of signal, two major approaches are considered. For IF estimation of single-tone or digitally modulated sinusoidal signals (such as frequency-shift-keying signals) the approach of digital phase-locked loops (DPLLs) is considered; this is Part I of the thesis. For FM signals the approach of time-frequency analysis is considered; this is Part II.

In Part I we utilize sinusoidal DPLLs with a non-uniform sampling scheme, as this type is widely used in communication systems. The digital tanlock loop (DTL) introduced significant advantages over other existing DPLLs, and over the last ten years many efforts have been made to improve its performance. However, this loop and all of its modifications utilize a Hilbert transformer (HT) to produce a signal-independent 90-degree phase-shifted version of the input signal. The HT can be realized only approximately, using a finite impulse response (FIR) digital filter; this realization introduces further complexity in the loop, in addition to approximations and frequency limitations on the input signal. We seek to avoid the practical difficulties associated with the conventional tanlock scheme while keeping its advantages. A time delay is utilized in place of the HT to produce a signal-dependent phase shift, giving rise to the time-delay digital tanlock loop (TDTL). Fixed-point theorems are used to analyze the behavior of the new loop. As such, the TDTL combines the two major approaches in DPLLs: the non-linear approach of the sinusoidal DPLL based on fixed-point analysis, and the linear tanlock approach based on arctan phase detection. The TDTL preserves the main advantages of the DTL despite its reduced structure. An application of the TDTL to FSK demodulation is also considered. The idea of replacing the HT with a time delay may be of interest in other signal processing systems; hence we analyze and compare the behavior of the HT and the time delay in the presence of additive Gaussian noise, and on that basis analyze the behavior of the first- and second-order TDTLs in additive Gaussian noise.
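To illustrate the core idea, here is a minimal numerical sketch (not the thesis's loop equations; all parameter values are assumptions) contrasting the HT's signal-independent 90-degree shift with the signal-dependent shift produced by a time delay in an arctan phase detector:

```python
import numpy as np

# Minimal sketch: contrast the signal-independent 90-degree shift of an
# (idealized) Hilbert transformer with the signal-dependent shift obtained
# from a simple time delay. Sampling rate and tone frequency are assumed.
fs = 10_000.0            # sampling rate (Hz), illustrative
f0 = 100.0               # input tone frequency (Hz), illustrative
t = np.arange(0, 0.1, 1 / fs)
x = np.cos(2 * np.pi * f0 * t + 0.3)

# DTL-style detector: the HT yields a 90-degree shifted copy at any
# frequency, so atan2 recovers the instantaneous phase directly.
x_ht = np.sin(2 * np.pi * f0 * t + 0.3)      # ideal HT output (sketch)
phase_dtl = np.arctan2(x_ht, x)

# TDTL-style detector: a delay tau shifts the phase by 2*pi*f0*tau, i.e.
# the shift depends on the (unknown) input frequency. A delay near a
# quarter period of the nominal frequency approximates 90 degrees.
tau_samples = int(round(fs / (4 * f0)))       # quarter-period delay
x_delayed = np.roll(x, tau_samples)
phase_tdtl = np.arctan2(x_delayed, x)

# The two detectors agree once the delay line is filled.
print(np.allclose(phase_dtl[tau_samples:], phase_tdtl[tau_samples:], atol=0.05))
```

When the input frequency drifts from its nominal value the delay no longer yields exactly 90 degrees; this signal dependence is what the thesis analyzes with fixed-point methods.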
Since DPLLs need time for locking, they are normally not efficient in tracking the continuously changing frequencies of non-stationary signals, i.e. signals with time-varying spectra. Non-stationary signals are important in both synthetic and real-life applications; an example is the frequency-modulated (FM) signals widely used in communication systems. Part II of this thesis is dedicated to the IF estimation of non-stationary signals. For such signals the classical spectral techniques break down, due to the time-varying nature of their spectra, and more advanced techniques must be utilized. For IF estimation of non-stationary signals there are two major approaches: parametric and non-parametric. We choose the non-parametric approach, which is based on time-frequency analysis; it is computationally less expensive and more effective in dealing with multicomponent signals, which are the main aim of this part of the thesis. A time-frequency distribution (TFD) of a signal is a two-dimensional transformation of the signal to the time-frequency domain, and multicomponent signals can be identified by multiple energy peaks in that domain.

Many real-life and synthetic signals are of multicomponent nature, yet there is little in the literature concerning IF estimation of such signals; this is why we concentrate on multicomponent signals in Part II. An adaptive algorithm for IF estimation using quadratic time-frequency distributions is analyzed, and a class of time-frequency distributions more suitable for this purpose is proposed. The kernels of this class are time-only (one-dimensional), rather than time-lag (two-dimensional) kernels; hence the class has been named the T-class. If the parameters of these TFDs are properly chosen, they are more efficient than the existing fixed-kernel TFDs in terms of resolution (energy concentration around the IF) and artifact reduction. The T-distributions have been used in the adaptive IF algorithm and proved efficient in tracking rapidly changing frequencies. They also enable direct amplitude estimation for the components of a multicomponent signal.
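As a rough illustration of peak-based IF estimation from a TFD, the sketch below uses a plain spectrogram rather than the adaptive T-class distributions described above; the test signal and all parameters are invented for the example:

```python
import numpy as np
from scipy.signal import stft

# Generic sketch of peak-based IF estimation from a TFD (here an ordinary
# spectrogram, not the adaptive T-class distributions of the thesis).
fs = 1_000.0
t = np.arange(0, 2, 1 / fs)
# Linear FM test signal: IF sweeps 50 Hz -> 150 Hz over 2 s.
x = np.cos(2 * np.pi * (50 * t + 25 * t ** 2))

f, tt, Z = stft(x, fs=fs, nperseg=256, noverlap=224)
if_estimate = f[np.argmax(np.abs(Z), axis=0)]    # ridge = peak per time slice
if_true = 50 + 50 * tt                           # derivative of the phase / (2*pi)
print(np.max(np.abs(if_estimate[2:-2] - if_true[2:-2])))  # a few Hz of error
```

For a multicomponent signal one would pick several peaks per time slice instead of a single argmax; better-concentrated TFDs make those peaks easier to separate.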
Abstract:
Long-term loss of soil C stocks under conventional tillage (CT) and accrual of soil C following adoption of no-tillage (NT) have been well documented. No-tillage use is spreading, but it is common to occasionally till within a no-till regime, or to regularly alternate between till and no-till practices within a rotation of different crops. Short-term studies indicate that substantial amounts of C can be lost from the soil immediately following a tillage event, but few field studies have investigated the impact of infrequent tillage on soil C stocks. How much of the C sequestered under no-tillage is likely to be lost if the soil is tilled? What are the longer-term impacts of continued infrequent tillage? If producers are to be compensated for sequestering C in soil following adoption of conservation tillage practices, the impacts of infrequent tillage need to be quantified. A few studies have examined the short-term impacts of tillage on soil C, and several have investigated the impacts of adopting continuous no-tillage. We present: (1) results from a modeling study carried out to address these questions more broadly than the published literature allows, (2) a review of the literature examining the short-term impacts of tillage on soil C, (3) a review of published studies on the physical impacts of tillage, and (4) a synthesis of these components to assess how infrequent tillage impacts soil C stocks and how changes in tillage frequency could impact soil C stocks and C sequestration. Results indicate that soil C declines significantly following even one tillage event (1–11% of soil C lost). Longer-term losses increase as the frequency of tillage increases. Model analyses indicate that cultivating and ripping are less disruptive than moldboard plowing: soil C for those treatments averages just 6% less than under continuous NT, compared with 27% less for CT. Most (80%) of the soil C gains of NT can be realized when NT is coupled with biannual cultivating or ripping.
Abstract:
In the superannuation/pension industry, ordinary investors entrust their retirement savings to the trustees of the superannuation plan. Investors rely on the trustees to ensure that ethical business and risk management practices are implemented to protect their retirement savings. Governance practices ensure the monitoring of ethical risk management (Drennan, 2004). The Australian superannuation industry presents a unique scenario: legislation requires employers to contribute a minimum of 9% of the employee's wage to retirement savings, yet there are no legislated governance standards, although there are standards of recommended governance practice. In this paper, we examine the level of voluntary adoption of governance practices by the trustees of Australian public sector and industry superannuation funds. We also assess whether superannuation governance practices are associated with performance and the volatility/riskiness of returns. Survey results show that the majority of superannuation plans adopt recommended governance practices, supporting the concept of ethical management of members' retirement savings. The examination of governance principles that impact returns and risk shows that board size and regular review of conflicts are positively associated with return. Superannuation plans with higher volatility in returns meet more frequently.
Abstract:
The holistic conception of the troika, as described in the first chapter, centres on the relationship between the implicit and explicit teaching of values, the nurturing of the specific dimensions of quality teaching, and the opportunity to ‘walk the talk’ of the values education program through aspects such as practical citizenship (Lovat, Toomey, Clement, Crotty & Nielsen, 2009). It is proposed in this chapter that the conception can be realized through the embedding of Philosophy in the Classroom within pre-service teaching programs. The troika, a Russian sleigh drawn by three horses, only functions well when there is complete synergy and balance between all three horses. Philosophy in the Classroom is a scaffold for ensuring that all three elements of the troika, namely quality teaching, values education, and service learning in the form of education for citizenship, exist within the classroom to achieve optimal learning, growth and wellbeing for all students. For this to be more widely accomplished, the chapter describes Philosophy in the Classroom and discusses how it constitutes a successful synergy and balance of the troika for effective teaching. It then proposes how it might be embedded into pre-service teacher education.
Abstract:
Many developing countries are afflicted by persistent inequality in the distribution of income. While a growing body of literature emphasizes differential fertility as a channel through which income inequality persists, this paper investigates differential child mortality – differences in the incidence of child mortality across socioeconomic groups – as a critical link in this regard. Using evidence from cross-country data to evaluate this linkage, we find that differential child mortality serves as a stronger channel than differential fertility in the transmission of income inequality over time. We use random effects and generalized estimating equations techniques to account for temporal correlation within countries. The results are robust to the use of an alternate definition of fertility that reflects parental preference for children instead of realized fertility.
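As a hedged illustration of the estimation strategy, the sketch below fits a generalized estimating equations model with an exchangeable working correlation to synthetic country-year data; the variable names, data, and specification are invented for the example and are not taken from the paper:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative GEE sketch: an exchangeable working correlation accounts
# for temporal correlation of repeated observations within countries.
rng = np.random.default_rng(0)
n_countries, n_years = 40, 5
df = pd.DataFrame({
    "country": np.repeat(np.arange(n_countries), n_years),
    "diff_mortality": rng.normal(size=n_countries * n_years),
    "diff_fertility": rng.normal(size=n_countries * n_years),
})
# Synthetic outcome: inequality driven more by differential mortality.
df["gini"] = (0.5 * df["diff_mortality"] + 0.2 * df["diff_fertility"]
              + rng.normal(scale=0.5, size=len(df)))

model = smf.gee("gini ~ diff_mortality + diff_fertility",
                groups="country", data=df,
                cov_struct=sm.cov_struct.Exchangeable(),
                family=sm.families.Gaussian())
print(model.fit().summary())
```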
Abstract:
In this paper, we present the design and construction of a prototype target tracking system. The experimental setup consists of three main modules: one for moving the object, one for detecting its motion, and one for tracking it. The mechanism for moving the object includes the object itself and two stepper motors with their driving and control circuitry. Detection of the object's motion is realized by a photo-switch array. The tracking mechanism consists of a laser beam and two DC servomotors with their associated circuitry. The control algorithm is a standard fuzzy logic controller. The system is designed to operate in two modes, such that the roles of target and tracker can be interchanged. Experimental results indicate that the fuzzy controller is capable of controlling the system in both modes.
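For readers unfamiliar with the technique, here is a minimal Mamdani-style fuzzy controller sketch; the membership functions, rules, and units are invented for illustration and are not the paper's rule base:

```python
import numpy as np

# Minimal Mamdani-style fuzzy controller sketch (illustrative only).
# Input: angular error between laser pointing and target (degrees).
# Output: servo speed command (arbitrary units).

def tri(x, a, b, c):
    """Triangular membership function with vertices a <= b <= c."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def fuzzy_speed(error):
    u = np.linspace(-10, 10, 201)            # output universe
    # Fuzzify the error into three sets.
    neg  = tri(error, -30, -15, 0)
    zero = tri(error, -5, 0, 5)
    pos  = tri(error, 0, 15, 30)
    # Rules: negative error -> negative speed; zero -> zero; positive -> positive.
    agg = np.maximum.reduce([
        np.minimum(neg,  tri(u, -10, -6, -2)),   # rule strength clips output set
        np.minimum(zero, tri(u, -2, 0, 2)),
        np.minimum(pos,  tri(u, 2, 6, 10)),
    ])
    return np.sum(u * agg) / (np.sum(agg) + 1e-12)  # centroid defuzzification

print(fuzzy_speed(12.0))   # positive error -> positive speed command
```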
Abstract:
The influence of biogenic particle formation on climate is a well-recognised phenomenon. To understand the mechanisms underlying biogenic particle formation, it is of utmost importance to determine the chemical composition of the new particles and therefore the species that drive particle production. Due to the very small amount of mass involved, indirect approaches are frequently used to infer the composition. We present here the results of such an indirect approach, simultaneously measuring the volatile and hygroscopic properties of newly formed particles in a forest environment. It is shown that the particles are composed of both sulphates and organics, with the amount of the sulphate component strongly depending on the available gas-phase sulphuric acid, and the organic components having the same volatility and hygroscopicity as the photooxidation products of a monoterpene such as α-pinene. Our findings are consistent with a two-step process: nucleation and cluster formation, followed by simultaneous growth through condensation of sulphates and organics that takes the particles to climatically relevant sizes.
Abstract:
Recent studies have detected a dominant accumulation mode (~100 nm) in the Sea Spray Aerosol (SSA) number distribution, and there is evidence to suggest that particles in this mode are composed primarily of organics. To investigate this hypothesis we conducted experiments on NaCl, artificial SSA and natural SSA particles with a Volatility-Hygroscopicity Tandem Differential Mobility Analyser (VH-TDMA). NaCl particles were atomiser-generated, and a bubble generator was constructed to produce artificial and natural SSA particles. Natural seawater samples for use in the bubble generator were collected from biologically active, terrestrially affected coastal water in Moreton Bay, Australia. Differences in the VH-TDMA-measured volatility curves of artificial and natural SSA particles were used to investigate and quantify the organic fraction of natural SSA particles. Hygroscopic growth factor (HGF) data, also obtained by the VH-TDMA, were used to confirm the conclusions drawn from the volatility data. Both datasets indicated that the organic fraction of our natural SSA particles evaporated in the VH-TDMA over the temperature range 170–200°C. The organic volume fraction for 71–77 nm natural SSA particles was 8±6%. The organic volume fraction did not vary significantly with water residence time in the bubble generator (40 s to 24 h) or with SSA particle diameter in the range 38–173 nm. At room temperature we measured shape- and Kelvin-corrected HGFs at 90% RH of 2.46±0.02 for NaCl, 2.35±0.02 for artificial SSA, and 2.26±0.02 for natural SSA particles. Overall, these results suggest that the natural accumulation-mode SSA particles produced in these experiments contained only a minor organic fraction, which had little effect on hygroscopic growth. Our measurement of 8±6% is an order of magnitude below two previous measurements of the organic fraction in SSA particles of comparable sizes. We stress that our results were obtained using coastal seawater and cannot necessarily be applied on a regional or global ocean scale. Nevertheless, considering the order-of-magnitude discrepancy between this and previous studies, further research with independent measurement techniques and a variety of different seawaters is required to better quantify how much organic material is present in accumulation-mode SSA.
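As a rough sketch of how a volatility-based organic fraction can be derived, the calculation below compares the volume fraction remaining after heating for an inorganic reference against a natural sample; all diameters are hypothetical and this is not the paper's exact analysis:

```python
# Illustrative calculation only: infer an organic volume fraction from the
# extra volume that evaporates from natural SSA relative to artificial
# (inorganic) SSA over the same heating step. Diameters are hypothetical.
def volume_fraction_remaining(d_heated_nm: float, d_initial_nm: float) -> float:
    return (d_heated_nm / d_initial_nm) ** 3

vfr_artificial = volume_fraction_remaining(70.0, 74.0)   # inorganic reference
vfr_natural = volume_fraction_remaining(68.0, 74.0)      # natural sample
organic_fraction = vfr_artificial - vfr_natural           # extra evaporated volume
print(round(organic_fraction, 3))                         # on the order of 0.07
```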
Abstract:
A 4-week intensive measurement campaign was conducted in March–April 2007 at Agnes Water, a remote coastal site on the east coast of Australia. A Volatility-Hygroscopicity Tandem Differential Mobility Analyser (VH-TDMA) was used to investigate changes in the hygroscopic properties of ambient particles as volatile components were progressively evaporated. Nine out of 18 VH-TDMA volatility scans detected internally mixed multi-component particles in the nucleation and Aitken modes in clean marine air. Evaporation of a volatile, organic-like component in the VH-TDMA caused significant increases in particle hygroscopicity. In three scans the increase in hygroscopicity was so large that it could only be explained by an increase in the absolute volume of water uptake by the particle residuals, and not merely an increase in their relative hygroscopicity. This indicates the presence of organic components that were suppressing the hygroscopic growth of mixed particles on the timescale of humidification in the VH-TDMA (6.5 s). This observation was supported by ZSR calculations for one scan, which showed that the measured growth factors of mixed particles were up to 18% below those predicted assuming independent water uptake of the individual particle components. The observed suppression of water uptake could be due to a reduced rate of hygroscopic growth caused by the presence of organic films, or to organic-inorganic interactions in solution droplets that had a negative effect on hygroscopicity.
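For context, a ZSR-style prediction assumes each component takes up water independently, so the mixed-particle growth factor is the volume-weighted mean of the cubed component growth factors. A minimal sketch with invented component values:

```python
# ZSR-style prediction assuming independent water uptake: the mixed-particle
# growth factor is the cube root of the volume-weighted mean of the cubed
# component growth factors. Component values below are illustrative only.
def zsr_growth_factor(volume_fractions, growth_factors):
    assert abs(sum(volume_fractions) - 1.0) < 1e-9
    return sum(e * g ** 3
               for e, g in zip(volume_fractions, growth_factors)) ** (1.0 / 3.0)

# e.g. 60% sulphate-like (HGF 1.7) and 40% organic-like (HGF 1.1) by volume:
predicted = zsr_growth_factor([0.6, 0.4], [1.7, 1.1])
measured = 0.82 * predicted   # an 18% shortfall, as in the largest case reported
print(round(predicted, 3), round(measured, 3))
```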
Abstract:
Short-term traffic flow data are characterized by rapid and dramatic fluctuations, reflecting the frequent congestion in the lane and exhibiting strongly nonlinear behaviour. Traffic state estimation based on data gained from electronic sensors is critical for intelligent traffic management and control. In this paper, a solution to freeway traffic estimation in Beijing is proposed using a particle filter based on a macroscopic traffic flow model, which estimates both traffic density and speed. The particle filter is a nonlinear prediction method with clear advantages for traffic flow prediction. However, as the sampling period increases, the traffic state curve becomes more volatile, which affects prediction accuracy and makes forecasting more difficult. In this paper, the particle filter model is applied to estimate short-term traffic flow. A numerical study is conducted on Beijing freeway data with a sampling period of 2 min. The relatively high accuracy of the results indicates the superiority of the proposed model.
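To make the method concrete, here is a generic bootstrap particle filter sketch on a synthetic scalar density signal; the paper's actual macroscopic model, joint density-speed state, and Beijing data are not reproduced, and all parameters are assumptions:

```python
import numpy as np

# Generic bootstrap particle filter sketch for a scalar traffic density
# state; synthetic data stand in for the sensor measurements.
rng = np.random.default_rng(1)
n_particles, n_steps = 500, 60
true_density = 30 + 10 * np.sin(np.linspace(0, 3, n_steps))    # veh/km, synthetic
obs = true_density + rng.normal(scale=2.0, size=n_steps)        # noisy sensor data

particles = rng.uniform(10, 60, n_particles)
estimates = []
for z in obs:
    particles += rng.normal(scale=1.5, size=n_particles)        # propagate (random walk)
    w = np.exp(-0.5 * ((z - particles) / 2.0) ** 2)             # Gaussian likelihood
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)             # resample
    particles = particles[idx]
    estimates.append(particles.mean())

print(np.mean(np.abs(np.array(estimates) - true_density)))     # mean absolute error
```

A real implementation would replace the random-walk propagation with the macroscopic traffic flow model's density and speed dynamics.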
Abstract:
Globally, teaching has become more complex and more challenging over recent years, with new and increased demands being placed on teachers by students, their families, governments and wider society. Teachers work with more diverse communities in times characterised by volatility, uncertainty and moral ambiguity. Societal, political, economic and cultural shifts have transformed the contexts in which teachers work and have redefined the ways in which teachers interact with students. This qualitative study uses phenomenographic methods to explore the nature of pedagogic teacher-student interactions. The data analysis reveals five qualitatively different ways in which teachers experience pedagogic engagements with students. The resultant categories of description range from information providing, with teachers viewed as transmitters of a body of knowledge, through to mentoring, in which teachers are perceived as significant others in the lives of students, with their influence extending beyond the walls of the classroom and beyond the years of schooling. The paper concludes by arguing that if teachers are to prepare students for the challenges and opportunities of changing times, teacher education programs need to consider ways to facilitate the development of mentoring capacities in new teachers.
Abstract:
The network has emerged as a contemporary worldwide phenomenon, culturally manifested as a consequence of globalization and the knowledge economy. It is in this context that the internet revolution has prompted a radical re-ordering of social and institutional relations and the associated structures, processes and places which support them. Within the duality of virtual space and the augmentation of traditional notions of physical place, new organizational structures pose fresh challenges for the design professions. Technological developments increasingly permit communication anytime and anywhere, and provide the opportunity for both synchronous and asynchronous collaboration. The resultant ecology formed through the network enterprise is an often convoluted and complex world in which designers are forced to consider the relevance and meaning of this new context. The roles of technology and of space are thus intertwined in the relation between the network and the individual workplace. This paper explores a way to inform the interior design process for contemporary workplace environments. It reports on both theoretical and practical outcomes through an Australia-wide case study of three collaborating, yet independent, business entities. It further suggests a link between workplace design and successful business innovation realized between partnering organizations in Great Britain. The evidence presented indicates that, for architects and interior designers, the scope of the problem has widened, the depth of knowledge required to provide solutions has increased, and the rules of engagement need to change. The ontological and epistemological positions adopted in the study enabled the spatial dimensions to be examined from both within and beyond the confines of a traditional design-only viewpoint. Importantly, it highlights the significance of trans-disciplinary collaboration in dealing with the multiple layers and complexity of the contemporary social and business world, from both a research and a practice perspective.
Abstract:
A Wireless Sensor Network (WSN) is a set of sensors that are integrated with a physical environment. These sensors are small in size and capable of sensing physical phenomena and processing them. They communicate in a multihop manner, due to their short radio range, to form an ad hoc network capable of reporting network activities to a data collection sink. Recent advances in WSNs have led to several new promising applications, including habitat monitoring, military target tracking, natural disaster relief, and health monitoring. A current sensor node, such as the MICA2, uses a 16-bit, 8 MHz Texas Instruments MSP430 micro-controller with only 10 KB RAM, 128 KB program space, and 512 KB external flash memory to store measurement data, and is powered by two AA batteries. Due to these unique specifications, and a lack of tamper-resistant hardware, devising security protocols for WSNs is complex.

Previous studies show that data transmission consumes much more energy than computation. Data aggregation can greatly reduce this consumption by eliminating redundant data. However, aggregators are under the threat of various types of attacks; among them, node compromise is usually considered one of the most challenging for the security of WSNs. In a node compromise attack, an adversary physically tampers with a node in order to extract its cryptographic secrets. This attack can be very harmful, depending on the security architecture of the network: for example, when an aggregator node is compromised, it is easy for the adversary to change the aggregation result and inject false data into the WSN.

The contributions of this thesis to the area of secure data aggregation are manifold. Firstly, we define security for data aggregation in WSNs; in contrast with existing secure data aggregation definitions, the proposed definition covers the unique characteristics of WSNs. Secondly, we analyze the relationship between the security services and adversarial models considered in existing secure data aggregation, in order to provide a general framework of required security services. Thirdly, we analyze existing cryptographic-based and reputation-based secure data aggregation schemes; this analysis covers the security services provided by these schemes and their robustness against attacks. Fourthly, we propose a robust reputation-based secure data aggregation scheme for WSNs. This scheme minimizes the use of heavy cryptographic mechanisms, and its security advantages are realized by integrating aggregation functionalities with: (i) a reputation system, (ii) estimation theory, and (iii) a change detection mechanism. We show that this addition helps defend against most of the security attacks discussed in this thesis, including the On-Off attack. Finally, we propose a secure key management scheme to distribute essential pairwise and group keys among the sensor nodes. The design idea of the proposed scheme is the combination of Lamport's reverse hash chain with the usual (forward) hash chain, to provide both past and future key secrecy. The proposal avoids delivering the whole value of a new group key during a group key update; instead, only half of the value is transmitted from the network manager to the sensor nodes. In this way, the compromise of a pairwise key alone does not lead to the compromise of the group key. The new pairwise key in our scheme is determined by Diffie-Hellman based key agreement.
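The two-chain idea can be sketched as follows; this is an illustrative construction under assumed details, not the thesis's exact scheme. The manager's reverse chain is disclosed one preimage per epoch and authenticated against a public anchor, while each node evolves a forward chain locally; the epoch key mixes both, so old keys cannot be recovered from new ones (past secrecy) and future disclosures cannot be predicted (future secrecy):

```python
import hashlib

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Illustrative sketch only: combine a Lamport-style reverse hash chain,
# disclosed newest-commitment-first and verified by hashing forward, with
# an ordinary forward hash chain evolved locally by each node.
N = 10
seed = b"network-manager-secret"
reverse_chain = [seed]
for _ in range(N):                     # manager precomputes seed..H^N(seed)
    reverse_chain.append(H(reverse_chain[-1]))
reverse_chain.reverse()                # reverse_chain[0] = H^N(seed), public anchor

forward_key = H(b"initial-group-secret")
anchor = reverse_chain[0]
for epoch in range(1, N + 1):
    disclosed = reverse_chain[epoch]   # manager reveals the next preimage
    assert H(disclosed) == anchor      # node authenticates it against the anchor
    anchor = disclosed
    forward_key = H(forward_key)       # node evolves its forward chain locally
    group_key = H(forward_key + disclosed)   # epoch key mixes both chains

print("final epoch key:", group_key.hex()[:16], "...")
```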
Abstract:
This paper proposes a novel approach for identifying risks in executable business processes and detecting them at run time. The approach considers risks in all phases of the business process management lifecycle, and is realized via a distributed, sensor-based architecture. At design time, sensors are defined to specify risk conditions which, when fulfilled, are likely indicators that faults will occur. Both historical and current execution data can be used to compose such conditions. At run time, each sensor independently notifies a sensor manager when a risk is detected. In turn, the sensor manager interacts with the monitoring component of a process automation suite to report the results to the user, who may take remedial action. The proposed architecture has been implemented in the YAWL system and its performance has been evaluated in practice.
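A minimal sketch of the sensor/manager interaction under assumed interfaces (this is not YAWL's API; the sensor name, condition, and data fields are invented):

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Architecture sketch only: each sensor holds a risk condition over process
# execution data and independently notifies a central sensor manager when
# the condition holds.

@dataclass
class RiskSensor:
    name: str
    condition: Callable[[Dict], bool]   # over current + historical data

class SensorManager:
    def __init__(self):
        self.sensors: List[RiskSensor] = []

    def register(self, sensor: RiskSensor):
        self.sensors.append(sensor)

    def poll(self, execution_data: Dict):
        for s in self.sensors:
            if s.condition(execution_data):
                # In the described architecture this would be reported to the
                # user via the process automation suite's monitoring component.
                print(f"risk detected by sensor '{s.name}'")

manager = SensorManager()
manager.register(RiskSensor(
    "overdue-approval",   # hypothetical risk: task takes twice the historical mean
    lambda d: d["elapsed_minutes"] > d["historical_mean_minutes"] * 2,
))
manager.poll({"elapsed_minutes": 95, "historical_mean_minutes": 40})
```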