409 results for Dunkl Transform


Relevance:

10.00%

Publisher:

Abstract:

A "self-exciting" market is one in which the probability of observing a crash increases in response to the occurrence of a crash. In essence, an initial crash weakens the system to some extent, making subsequent crashes more likely. This thesis investigates whether equity markets possess this property. A self-exciting extension of the well-known jump-based Bates (1996) model serves as the workhorse model, and a particle-filtering algorithm is used to facilitate estimation by maximum likelihood. The estimation method is developed so that option prices are easily included in the dataset, leading to higher-quality estimates. Equilibrium arguments are used to price the risks associated with the time-varying crash probability and, in turn, to motivate a risk-neutral system for use in option pricing. The option pricing function for the model is obtained via widely used Fourier techniques. An application to S&P500 index returns and a panel of S&P500 index option prices reveals evidence of self-excitation.
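The self-exciting mechanism can be illustrated with a minimal Hawkes-process simulation: each crash raises the crash intensity, which then decays back toward its baseline. This is a reduced-form sketch with hypothetical parameters, not the thesis's jump-diffusion extension of the Bates (1996) model.

```python
import math
import random

def simulate_hawkes(lam0, alpha, beta, horizon, seed=0):
    """Ogata thinning for a Hawkes process with exponential kernel.

    Intensity: lam(t) = lam0 + sum_i alpha * exp(-beta * (t - t_i)).
    Each accepted event (a 'crash') raises the intensity, making further
    events more likely -- the self-exciting property described above."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while t < horizon:
        # Between events the intensity only decays, so the current
        # intensity is a valid upper bound for the thinning step.
        lam_bar = lam0 + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)
        if t >= horizon:
            break
        lam_t = lam0 + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:
            events.append(t)  # accepted crash time
    return events

# Hypothetical parameters; alpha/beta < 1 keeps the process stable.
crashes = simulate_hawkes(lam0=0.5, alpha=0.8, beta=1.0, horizon=100.0, seed=1)
```

With these parameters the expected event count is roughly lam0 / (1 - alpha/beta) per unit time, so crashes arrive in clusters rather than at the baseline Poisson rate.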

Relevance:

10.00%

Publisher:

Abstract:

The most integrated approach toward understanding the multiple molecular events and mechanisms by which cancer may develop is the application of gene expression profiling using microarray technologies. As molecular alterations in breast cancer are complex and involve cross-talk between multiple cellular signalling pathways, microarray technology provides a means of capturing and comparing the expression patterns of the entire genome across multiple samples in a high-throughput manner. Since the development of microarray technologies, together with advances in RNA extraction methodologies, gene expression studies have revolutionised the means by which genes suitable as targets for drug development and individualised cancer treatment can be identified. Since the mid-1990s, expression microarrays have been extensively applied to the study of cancer, and no cancer type has seen as much genomic attention as breast cancer. The most active area of breast cancer genomics has been the clarification and interpretation of gene expression patterns that unite both biological and clinical aspects of tumours. It is hoped that one day molecular profiling will transform diagnosis and therapeutic selection in human breast cancer toward more individualised regimes. Here, we review a number of prominent microarray profiling studies focussed on human breast cancer and examine their strengths, their limitations, clinical implications including prognostic relevance and gene signature significance, along with potential improvements for the next generation of microarray studies.

Relevance:

10.00%

Publisher:

Abstract:

The mean action time is the mean of a probability density function that can be interpreted as a critical time, which is a finite estimate of the time taken for the transient solution of a reaction-diffusion equation to effectively reach steady state. For high-variance distributions, the mean action time under-approximates the critical time since it neglects to account for the spread about the mean. We can improve our estimate of the critical time by calculating the higher moments of the probability density function, called the moments of action, which provide additional information regarding the spread about the mean. Existing methods for calculating the nth moment of action require the solution of n nonhomogeneous boundary value problems which can be difficult and tedious to solve exactly. Here we present a simplified approach using Laplace transforms which allows us to calculate the nth moment of action without solving this family of boundary value problems and also without solving for the transient solution of the underlying reaction-diffusion problem. We demonstrate the generality of our method by calculating exact expressions for the moments of action for three problems from the biophysics literature. While the first problem we consider can be solved using existing methods, the second problem, which is readily solved using our approach, is intractable using previous techniques. The third problem illustrates how the Laplace transform approach can be used to study coupled linear reaction-diffusion equations.
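The Laplace-transform route described above rests on the standard identity that the nth raw moment of a density equals the nth derivative of its Laplace transform at the origin, up to sign. The following sketch demonstrates that identity symbolically for a simple hypothetical density, not for one of the three reaction-diffusion problems treated in the text.

```python
import sympy as sp

t, s, lam = sp.symbols('t s lam', positive=True)

# Hypothetical probability density f(t) for illustration only:
f = lam * sp.exp(-lam * t)

# Laplace transform F(s) = integral_0^oo exp(-s*t) f(t) dt
F = sp.laplace_transform(f, t, s, noconds=True)

def moment(n):
    # n-th raw moment of f: M_n = (-1)**n * d^n F / ds^n evaluated at s = 0,
    # obtained without computing any inverse transform.
    return sp.simplify((-1) ** n * sp.diff(F, s, n).subs(s, 0))

mean_action_time = moment(1)      # first moment: the mean action time
second_moment = moment(2)         # higher moments quantify the spread
```

For this density the first moment is 1/lam and the second is 2/lam**2, recovered purely by differentiating F(s) at s = 0; this is the mechanism that lets the moments of action be computed without solving the family of boundary value problems.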

Relevance:

10.00%

Publisher:

Abstract:

We demonstrate a rapid synthesis of gold nanoparticles using hydroquinone as a reducing agent under acidic conditions, without the need for precursor seed particles. The nanoparticle formation process is facilitated by the addition of NaOH to a solution containing HAuCl4 and hydroquinone to locally change the pH; this enhances the reducing capability of hydroquinone to form gold nucleation centres, after which further growth of gold can take place through an autocatalytic mechanism. The stability of the nanoparticles is highly dependent on the initial solution pH and on the concentrations of both the added NaOH and the hydroquinone present in solution. The gold nanoparticles were characterized by UV–visible spectroscopy, transmission electron microscopy, Fourier transform infrared spectroscopy, atomic force microscopy, dynamic light scattering, and zeta potential measurements. It was found that, under optimal conditions, stable aqueous suspensions of 20 nm diameter nanoparticles can be achieved, where benzoquinone, the oxidized product of hydroquinone, acts as a capping agent preventing nanoparticle aggregation.

Relevance:

10.00%

Publisher:

Abstract:

The spontaneous reaction between microrods of the organic semiconductor copper 7,7,8,8-tetracyanoquinodimethane (CuTCNQ) and [AuBr4]− ions in an aqueous environment is reported. The reaction is redox in nature and proceeds via a complex galvanic replacement mechanism, wherein the surface of the CuTCNQ microrods is replaced with metallic gold nanoparticles. Unlike previous reactions reported in acetonitrile, the galvanic replacement reaction in aqueous solution proceeds via an entirely different mechanism: a cyclical process, involving continuous regeneration of the CuTCNQ consumed, runs in parallel with the galvanic replacement itself. As a result, the driving force of the galvanic replacement reaction in aqueous medium depends largely on the availability of [AuBr4]− ions during the reaction. This study therefore highlights the importance of choosing an appropriate solvent for galvanic replacement reactions, as the solvent can significantly impact the reaction mechanism. The reaction progress at different gold salt concentrations was monitored using Fourier transform infrared (FT-IR), Raman, and X-ray photoelectron spectroscopy (XPS), as well as XRD and EDX analysis, and SEM imaging. The CuTCNQ/Au nanocomposites were also investigated for their potential photocatalytic properties, wherein the destruction of the organic dye Congo red in a simulated solar light environment was found to be largely dependent on the degree of gold nanoparticle surface coverage. The approach reported here opens up new possibilities for decorating metal–organic charge transfer complexes with a host of metals, leading to potentially novel applications in catalysis and sensing.

Relevance:

10.00%

Publisher:

Abstract:

This thesis analyses the performance bounds of amplify-and-forward relay channels, which are becoming increasingly popular in wireless communication applications. The statistics of the cascaded Nakagami-m fading model, a major obstacle in evaluating the outage of wireless networks, are analysed using the Mellin transform. Furthermore, upper and lower bounds for the ergodic capacity of the slotted amplify-and-forward relay channel, for a finite and an infinite number of relays, are derived using random matrix theory. The results obtained will enable wireless network designers to optimize network resources, benefiting consumers.
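The cascaded fading model above can be made concrete with a Monte Carlo sketch: the end-to-end gain of a two-hop link is modelled as the product of two independent Nakagami-m fades, and the outage probability is the fraction of realisations whose received SNR falls below a threshold. This is a simplified numerical stand-in with hypothetical parameters; the thesis derives the exact statistics analytically via the Mellin transform.

```python
import numpy as np

def nakagami_samples(m, omega, n, rng):
    # A Nakagami-m amplitude is the square root of a Gamma(m, omega/m)
    # distributed power variable.
    return np.sqrt(rng.gamma(shape=m, scale=omega / m, size=n))

def cascaded_outage(m1, m2, snr_db, gamma_th_db, n=200_000, seed=0):
    """Monte Carlo outage probability of a cascaded (double) Nakagami-m
    channel: P(received SNR < threshold)."""
    rng = np.random.default_rng(seed)
    # End-to-end amplitude: product of the two hop fades (unit mean power).
    h = nakagami_samples(m1, 1.0, n, rng) * nakagami_samples(m2, 1.0, n, rng)
    snr = 10 ** (snr_db / 10) * h**2
    return float(np.mean(snr < 10 ** (gamma_th_db / 10)))

# Hypothetical link: m = 2 on both hops, 5 dB outage threshold.
p10 = cascaded_outage(2.0, 2.0, snr_db=10.0, gamma_th_db=5.0)
p20 = cascaded_outage(2.0, 2.0, snr_db=20.0, gamma_th_db=5.0)
```

As expected, raising the transmit SNR from 10 dB to 20 dB lowers the estimated outage probability; the analytical Mellin-transform approach replaces this sampling with closed-form statistics of the product distribution.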

Relevance:

10.00%

Publisher:

Abstract:

The notion of plaintext awareness (PA) has many applications in public key cryptography: it offers unique, stand-alone security guarantees for public key encryption schemes, has been used as a sufficient condition for proving indistinguishability against adaptive chosen-ciphertext attacks (IND-CCA), and can be used to construct privacy-preserving protocols such as deniable authentication. Unlike many other security notions, plaintext awareness is very fragile when it comes to differences between the random oracle and standard models; for example, many implications involving PA in the random oracle model are not valid in the standard model, and vice versa. Similarly, strategies for proving PA of schemes in one model cannot be adapted to the other. Existing research addresses PA in detail only in the public key setting. This paper gives the first formal exploration of plaintext awareness in the identity-based setting and, as initial work, proceeds in the random oracle model. The focus lies mainly on identity-based key encapsulation mechanisms (IB-KEMs), for which the paper presents the first definitions of plaintext awareness, highlights the role of PA in proof strategies of IND-CCA security, and explores relationships between PA and other security properties. On the practical side, our work offers the first highly efficient, general approach for building IB-KEMs that are simultaneously plaintext-aware and IND-CCA-secure. Our construction is inspired by the Fujisaki-Okamoto (FO) transform, but demands weaker and more natural properties of its building blocks. This result comes from a new look at the notion of γ-uniformity that was inherent in the original FO transform. We show that for IB-KEMs (and PK-KEMs), this assumption can be replaced with a weaker computational notion, which is in fact implied by one-wayness. Finally, we give the first concrete IB-KEM scheme that is PA and IND-CCA-secure by applying our construction to a popular IB-KEM and optimizing it for better performance.
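For readers unfamiliar with the FO transform that inspires the construction, the following toy sketch shows its classical derandomize-and-recheck pattern in the public-key setting: encapsulation derives its randomness from the message, and decapsulation re-encrypts to verify the ciphertext. It uses a hashed-ElGamal-style toy PKE over deliberately insecure parameters; every name here is ours, and this is the textbook FO pattern, not the paper's weaker-assumption variant or its identity-based construction.

```python
import hashlib
import secrets

# Toy group: prime modulus 2**255 - 19 with generator 2. Illustrative
# only -- NOT secure parameters or a properly chosen subgroup.
P = 2**255 - 19
G = 2

def H(*parts):
    h = hashlib.sha256()
    for part in parts:
        h.update(part)
    return h.digest()

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    return pow(G, sk, P), sk

def enc(pk, m, r):
    # Hashed-ElGamal-style encryption, deterministic once r is fixed:
    # c = (g^r, m XOR H(pk^r))
    c1 = pow(G, r, P)
    pad = H(pow(pk, r, P).to_bytes(32, 'big'))
    return c1, bytes(a ^ b for a, b in zip(m, pad))

def encaps(pk):
    m = secrets.token_bytes(32)
    r = int.from_bytes(H(b'derand', m), 'big')  # FO: randomness from m
    return enc(pk, m, r), H(b'key', m)          # session key K = KDF(m)

def decaps(sk, pk, c):
    c1, c2 = c
    pad = H(pow(c1, sk, P).to_bytes(32, 'big'))
    m = bytes(a ^ b for a, b in zip(c2, pad))
    r = int.from_bytes(H(b'derand', m), 'big')
    if enc(pk, m, r) != c:                      # FO re-encryption check
        return None                             # reject invalid ciphertexts
    return H(b'key', m)
```

The re-encryption check is what rejects ciphertexts that were not honestly formed, which is the intuition behind using FO-style transforms to argue plaintext awareness and IND-CCA security.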

Relevance:

10.00%

Publisher:

Abstract:

Current housing design and construction practices do not meet the needs of many people with disability and older people, and limit their inclusion and participation in community and family life. In spite of a decade of advocacy for regulation of access within residential environments, the Australian government has opted for a voluntary approach in which the housing industry takes responsibility. Housing industry leaders have indicated that they are willing to transform their established practice if it makes good business sense to do so, and if there is demand from home buyers. To date, there has been minimal demand. In 2010, housing industry and community leaders formalised this commitment in an agreement, called Livable Housing Design, to transform housing design and construction practices, with a target of all new housing providing minimal access by 2020. This paper reports on a study which examined the assumption behind the Livable Housing Design agreement: that individuals in the housing industry will respond voluntarily and take responsibility for the provision of inclusive housing. From interviews with developers, designers and builders in Brisbane, Queensland, the study found a complex picture of competing demands and responsibilities. Instead of changing their design and construction practices voluntarily to meet the future needs of users over the life of housing, they are more likely to focus on their immediate contractual obligations and to maintain the status quo. Contrary to the view of the government and industry leaders, participants identified that an external regulatory framework would be required if Livable Housing Design's 2020 goal were to be met.

Relevance:

10.00%

Publisher:

Abstract:

Efficient transformation of barley cv. Schooner was achieved using Agrobacterium delivery, hygromycin or bialaphos selection, and embryogenic callus. Using this system, transgenic plants were generated that contained either the green fluorescent protein gene or transgenes derived from barley yellow dwarf virus (BYDV) and cereal yellow dwarf virus (CYDV). Many of these plants contained 1-3 transgene copies that were inherited in a simple Mendelian manner. Some plants containing BYDV- and/or CYDV-derived transgenes showed reduced virus symptoms and rates of viral replication when challenged with the appropriate virus. The ability to transform Schooner is a significant advance for the Australian barley industry, as this elite malting variety has been the most widely grown barley variety in eastern Australia for the last 15 years.

Relevance:

10.00%

Publisher:

Abstract:

The Australian Bone Marrow Donor Registry (ABMDR) is a publicly funded company that is part of an international network facilitating unrelated bone marrow transplantation. This role means that the ABMDR has access to a large biospecimen repository, making it a highly valuable research resource. Recognising the potential value of these biospecimens for research purposes, the ABMDR is in the process of determining whether, and how, to share its biospecimens with other biobanks. While this would undoubtedly be of value to the scientific community, and ultimately to the wider community, it would also inevitably transform the role of an institution whose primary role is therapeutic, and would compromise the degree of control that a custodian has over donated material. This article describes the challenges confronting the ABMDR, and organisations like it, in balancing their duties to donors, patients, researchers and the general public. These problems have led inevitably to the use of "property" rights language in the discussion of these issues, but notions of gift, ownership, trusteeship and transfer might also be considered.

Relevance:

10.00%

Publisher:

Abstract:

This paper explores the potential for online video as a mechanism to transform the ways students learn, drawing on research findings, user experience and usage data from surveys and trials of patron-driven acquisition undertaken collaboratively by Queensland University of Technology, La Trobe University and Kanopy.

Relevance:

10.00%

Publisher:

Abstract:

The diagnostics of mechanical components operating in transient conditions is still an open issue, in both research and industry. Indeed, the signal processing techniques developed to analyse stationary data are not applicable, or suffer a loss of effectiveness, when applied to signals acquired in transient conditions. In this paper, a suitable and original signal processing tool (named EEMED), which can be used for mechanical component diagnostics under any operating condition and noise level, is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), together with the analytical approach of the Hilbert transform. The proposed tool supplies diagnostic information on the basis of experimental vibrations measured in transient conditions. The tool was originally developed to detect localized faults on bearings installed in high speed train traction equipment, and it detects faults in non-stationary conditions more effectively than signal processing tools based on spectral kurtosis or envelope analysis, which have until now been the benchmark for bearing diagnostics.
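To make the benchmark concrete: classical envelope analysis demodulates the vibration signal via the magnitude of its analytic signal (the Hilbert-transform stage that EEMED also builds on), so periodic fault impacts appear as a peak in the envelope spectrum. This sketch applies only that stage to synthetic data with a hypothetical 100 Hz fault frequency; it is not the EEMED tool, which additionally combines EMD and MED.

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic bearing-fault-like signal: a 3 kHz structural resonance
# excited by impacts repeating at a hypothetical 100 Hz fault frequency.
fs, T = 20_000, 2.0
t = np.arange(0, T, 1 / fs)
fault_f = 100.0
impacts = np.zeros_like(t)
impacts[:: int(fs / fault_f)] = 1.0
# Each impact rings the resonance with an exponential decay (10 ms kernel).
ring = np.exp(-800 * t[:200]) * np.sin(2 * np.pi * 3000.0 * t[:200])
x = np.convolve(impacts, ring, mode="same")
x += 0.1 * np.random.default_rng(0).standard_normal(t.size)

# Envelope analysis: the analytic signal's magnitude demodulates the
# bursts; the spectrum of the envelope reveals the repetition frequency.
env = np.abs(hilbert(x))
env -= env.mean()                      # drop the DC component
spec = np.abs(np.fft.rfft(env))
freqs = np.fft.rfftfreq(env.size, 1 / fs)
mask = freqs > 20.0                    # ignore the near-DC region
peak = freqs[mask][np.argmax(spec[mask])]
```

On this stationary synthetic signal the envelope spectrum peaks at the fault repetition frequency; the point of the paper is that under transient (non-stationary) conditions this simple approach degrades, motivating the data-adaptive EEMED combination.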

Relevance:

10.00%

Publisher:

Abstract:

Diagnostics is based on the characterization of mechanical system condition and allows early detection of a possible fault. Signal processing is an approach widely used in diagnostics, since it allows directly characterizing the state of the system. Several types of advanced signal processing techniques have been proposed in recent decades and added to more conventional ones. Seldom are these techniques able to handle non-stationary operation. Diagnostics of roller bearings is no exception. In this paper, a new vibration signal processing tool, able to perform roller bearing diagnostics under any working condition and noise level, is developed on the basis of two data-adaptive techniques, Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), coupled by means of the Hilbert transform. The effectiveness of the new signal processing tool is proven by means of experimental data measured on a test rig that employs high-power, industrial-size components.

Relevance:

10.00%

Publisher:

Abstract:

The signal processing techniques developed for the diagnostics of mechanical components operating in stationary conditions are often not applicable, or suffer a loss of effectiveness, when applied to signals measured in transient conditions. In this chapter, an original signal processing tool is developed by exploiting data-adaptive techniques such as Empirical Mode Decomposition and Minimum Entropy Deconvolution, together with the analytical approach of the Hilbert transform. The tool has been developed to detect localized faults on bearings of traction systems of high speed trains, and it detects faults in non-stationary conditions more effectively than signal processing tools based on envelope analysis or spectral kurtosis, which have until now been the benchmark for bearing diagnostics.

Relevance:

10.00%

Publisher:

Abstract:

Basing signature schemes on strong lattice problems has been a long-standing open issue. Today, two families of lattice-based signature schemes are known: those based on the hash-and-sign construction of Gentry et al., and Lyubashevsky's schemes, which are based on the Fiat-Shamir framework. In this paper we show for the first time how to adapt the schemes of Lyubashevsky to the ring signature setting. In particular, we transform the scheme of ASIACRYPT 2009 into a ring signature scheme with strong security properties in the random oracle model. Anonymity is ensured in the sense that signatures of different users are within negligible statistical distance even under full key exposure. In fact, the scheme satisfies a notion stronger than the classical full key exposure setting: even if the keypair of the signing user is adversarially chosen, the statistical distance between signatures of different users remains negligible. Regarding unforgeability, the best lattice-based ring signature schemes provide either unforgeability against arbitrary chosen-subring attacks or against insider corruption in log-sized rings. In this paper we present two variants of our scheme. In the basic one, unforgeability is ensured in those two settings. By increasing signature and key sizes by a factor k (typically 80-100), we provide a variant in which unforgeability is ensured against insider corruption attacks for arbitrary rings. The technique used is quite general and can be adapted to other existing schemes.