895 results for Cochlear filter bank


Relevance:

20.00%

Publisher:

Abstract:

Bank of England notes of £20 denomination have been studied using infrared spectroscopy in order to generate a method to identify forged notes. An aim of this work was to develop a non-destructive method, so that a small, compact Fourier transform infrared (FT-IR) spectrometer could be used by bank workers, police departments or others, such as shop assistants, to identify forged notes in a non-lab setting. The ease of use of the instrument, together with its relatively low cost, is key to this method. The presence of a peak at 1400 cm−1, arising from an asymmetric stretching band (νasym) in the blank paper section of a forged note, proved to be a successful indicator of the note’s illegality for the notes that we studied. Moreover, differences between the spectra of forged and genuine £20 notes were observed in the ν(OH) (ca. 3500 cm−1), ν(CH) (ca. 2900 cm−1) and ν(CO) (ca. 1750 cm−1) regions of the IR spectrum recorded for the polymer film covering the holographic strip. In cases where these simple tests fail, we have shown how an infrared microscope can be used to further differentiate genuine and forged banknotes by producing infrared maps of selected areas of the note, contrasting inks with background paper.
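
As a crude illustration of the screening logic described above, the Python sketch below flags a note as suspect when a pronounced absorption band appears near 1400 cm−1 in a blank-paper spectrum. The arrays, window width and threshold are hypothetical placeholders, not values taken from the study.

    import numpy as np

    def suspect_band(wavenumber, absorbance, centre=1400.0, window=25.0, threshold=0.05):
        """Return True if the mean absorbance within `window` cm^-1 of `centre`
        exceeds a crude median baseline by more than `threshold`.
        All numerical values are illustrative, not taken from the paper."""
        band = (wavenumber > centre - window) & (wavenumber < centre + window)
        baseline = np.median(absorbance)
        return absorbance[band].mean() - baseline > threshold

    # Hypothetical usage with a spectrum exported from an FT-IR instrument:
    # wn, ab = np.loadtxt("blank_paper_spectrum.csv", delimiter=",", unpack=True)
    # print("suspect" if suspect_band(wn, ab) else "no 1400 cm-1 band detected")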

Relevance:

20.00%

Publisher:

Abstract:

Particle filters are fully non-linear data assimilation techniques that aim to represent the probability distribution of the model state given the observations (the posterior) by a number of particles. In high-dimensional geophysical applications, the number of particles required by the sequential importance resampling (SIR) particle filter to capture the high-probability region of the posterior is too large to be usable. However, particle filters can be formulated using proposal densities, which give greater freedom in how particles are sampled and allow for a much smaller number of particles. Here a particle filter is presented which uses the proposal density to ensure that all particles end up in the high-probability region of the posterior probability density function. This gives rise to the possibility of non-linear data assimilation in high-dimensional systems. The formulation is compared to the optimal proposal density particle filter and the implicit particle filter, both of which also utilise a proposal density. We show that when observations are available every time step, both of those schemes will be degenerate when the number of independent observations is large, unlike the new scheme. The sensitivity of the new scheme to its parameter values is explored theoretically and demonstrated using the Lorenz (1963) model.
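
For readers unfamiliar with the baseline scheme whose degeneracy is discussed above, the Python sketch below implements a standard sequential importance resampling (SIR) step for a generic scalar model. The model, observation operator (identity) and noise levels are placeholders; the proposal-density modifications introduced in the paper are deliberately not included.

    import numpy as np

    rng = np.random.default_rng(0)

    def sir_step(particles, obs, model, obs_std, model_std):
        """One SIR particle-filter cycle: forecast, weight, resample."""
        # Forecast each particle and add stochastic model error.
        particles = model(particles) + rng.normal(0.0, model_std, particles.shape)
        # Importance weights from a Gaussian observation likelihood (identity obs operator).
        w = np.exp(-0.5 * ((obs - particles) / obs_std) ** 2)
        w /= w.sum()
        # Multinomial resampling according to the weights.
        idx = rng.choice(len(particles), size=len(particles), p=w)
        return particles[idx]

    # Toy usage: a linear autoregressive "model" and one noisy observation.
    model = lambda x: 0.9 * x
    particles = rng.normal(0.0, 1.0, 100)
    particles = sir_step(particles, obs=0.5, model=model, obs_std=0.2, model_std=0.1)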

Relevance:

20.00%

Publisher:

Abstract:

The time discretization in weather and climate models introduces truncation errors that limit the accuracy of the simulations. Recent work has yielded a method for reducing the amplitude errors in leapfrog integrations from first-order to fifth-order. This improvement is achieved by replacing the Robert–Asselin filter with the Robert–Asselin–Williams (RAW) filter and using a linear combination of the unfiltered and filtered states to compute the tendency term. The purpose of the present paper is to apply the composite-tendency RAW-filtered leapfrog scheme to semi-implicit integrations. A theoretical analysis shows that the stability and accuracy are unaffected by the introduction of the implicitly treated mode. The scheme is tested in semi-implicit numerical integrations of both a simple nonlinear stiff system and a medium-complexity atmospheric general circulation model, and yields substantial improvements in both cases. We conclude that the composite-tendency RAW-filtered leapfrog scheme is suitable for use in semi-implicit integrations.
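
As background to the scheme being extended, the Python sketch below integrates a scalar ODE with a RAW-filtered leapfrog step. The filter parameters nu and alpha are typical illustrative choices, and the composite-tendency refinement (evaluating the tendency on a linear combination of the filtered and unfiltered states) is only indicated by a comment, since its blending weight is not reproduced here.

    import numpy as np

    def raw_leapfrog(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
        """Leapfrog integration of dx/dt = f(x) with the RAW filter.
        nu is the Robert-Asselin filter strength and alpha the Williams
        parameter (alpha = 1 recovers the classical Robert-Asselin filter)."""
        x_prev_f = x0                      # filtered state at step n-1
        x_curr = x0 + dt * f(x0)           # first step taken with forward Euler
        out = [x0, x_curr]
        for _ in range(nsteps - 1):
            # A composite-tendency variant would evaluate f on a blend of
            # x_curr and its filtered value rather than on x_curr alone.
            x_next = x_prev_f + 2.0 * dt * f(x_curr)
            d = 0.5 * nu * (x_prev_f - 2.0 * x_curr + x_next)
            x_prev_f = x_curr + alpha * d          # filtered current state
            x_curr = x_next + (alpha - 1.0) * d    # RAW correction to the new state
            out.append(x_curr)
        return np.array(out)

    # Toy usage: exponential decay dx/dt = -x.
    traj = raw_leapfrog(lambda x: -x, x0=1.0, dt=0.1, nsteps=100)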

Relevance:

20.00%

Publisher:

Abstract:

Currently, infrared filters for astronomical telescopes and satellite radiometers are based on multilayer thin-film stacks of alternating high- and low-refractive-index materials. However, the choice of suitable layer materials is limited, and this constrains the filter performance that can be achieved. The ability to design materials with arbitrary refractive index allows filter performance to be greatly increased, but also increases the complexity of the design. Here a differential algorithm is used to optimise the design of filters with arbitrary refractive indices, and materials are then designed to these specifications as mono-materials with sub-wavelength structures using Bruggeman’s effective material approximation (EMA).
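
For context, the two-phase Bruggeman approximation defines the effective permittivity of a sub-wavelength mixture implicitly; the Python sketch below solves it numerically for an illustrative fill fraction and pair of permittivities (the values and materials are placeholders, not those designed in the study).

    import numpy as np
    from scipy.optimize import brentq

    def bruggeman_eps(eps1, eps2, f):
        """Effective permittivity e of a two-phase mixture with fill fraction f
        of phase 1, from the Bruggeman relation:
        f*(eps1 - e)/(eps1 + 2e) + (1 - f)*(eps2 - e)/(eps2 + 2e) = 0."""
        g = lambda e: (f * (eps1 - e) / (eps1 + 2 * e)
                       + (1 - f) * (eps2 - e) / (eps2 + 2 * e))
        lo, hi = min(eps1, eps2), max(eps1, eps2)
        return brentq(g, lo, hi)   # the physical root lies between the two permittivities

    # Illustrative values only: a high-index phase structured into a low-index host.
    eps_eff = bruggeman_eps(eps1=11.7, eps2=1.0, f=0.3)
    n_eff = np.sqrt(eps_eff)       # effective refractive index of the structured layer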

Relevance:

20.00%

Publisher:

Abstract:

Understanding the performance of banks is of the utmost importance due to the impact the sector may have on economic growth and financial stability. Residential mortgage loans constitute a large proportion of the portfolio of many banks and are one of the key assets in the determination of their performance. Using a dynamic panel model, we analyse the impact of residential mortgage loans on bank profitability and risk, based on a sample of 555 banks in the European Union (EU-15) over the period from 1995 to 2008. We find that an increase in residential mortgage loans seems to improve banks’ performance in terms of both profitability and credit risk in good, pre-financial-crisis market conditions. These findings may help explain why banks rush to lend to property during booms: doing so has a positive effect on their performance. The results also show that credit risk and profitability are lower during the upturn in the residential property cycle.
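
Schematically, a dynamic panel specification of the kind described might take the following form (the variable set, notation and estimator are illustrative, not those reported in the study):

    \pi_{it} = \alpha\,\pi_{i,t-1} + \beta\,\mathit{RML}_{it} + \gamma' X_{it} + \eta_i + \varepsilon_{it},

where \pi_{it} is the profitability of bank i in year t, \mathit{RML}_{it} the share of residential mortgage loans in its portfolio, X_{it} a vector of bank-level and macroeconomic controls, \eta_i a bank fixed effect and \varepsilon_{it} the error term; an analogous equation with a credit-risk measure as the dependent variable would complete the pair.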

Relevance:

20.00%

Publisher:

Abstract:

For certain observing types, such as those that are remotely sensed, the observation errors are correlated and these correlations are state- and time-dependent. In this work, we develop a method for diagnosing and incorporating spatially correlated and time-dependent observation error in an ensemble data assimilation system. The method combines an ensemble transform Kalman filter with a diagnostic that uses statistical averages of background and analysis innovations to estimate the observation error covariance matrix. To evaluate the performance of the method, we perform identical twin experiments using the Lorenz ’96 and Kuramoto–Sivashinsky models. Using our approach, a good approximation to the true observation error covariance can be recovered in cases where the initial estimate of the error covariance is incorrect. Spatial observation error covariances whose true correlation length scale changes slowly in time can also be captured. We find that using the estimated correlated observation error in the assimilation improves the analysis.
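
A minimal Python sketch of an innovation-statistics estimate of the observation error covariance, in the spirit of the diagnostic described above, is given below; the departure arrays are placeholders gathered from a hypothetical assimilation run.

    import numpy as np

    def estimate_R(d_background, d_analysis):
        """Estimate the observation error covariance from time series of
        background departures d_b = y - H(x_b) and analysis departures
        d_a = y - H(x_a), each of shape (ntimes, nobs), as the time average
        of the outer product d_a d_b^T."""
        ntimes = d_background.shape[0]
        R = sum(np.outer(d_analysis[t], d_background[t]) for t in range(ntimes)) / ntimes
        return 0.5 * (R + R.T)   # symmetrise the sample estimate

    # Hypothetical usage with departures collected over an assimilation experiment:
    # R_est = estimate_R(d_b, d_a)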

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses ECG signal classification after parametrizing the ECG waveforms in the wavelet domain. Signal decomposition using perfect-reconstruction quadrature mirror filter banks can provide a very parsimonious representation of ECG signals. In the current work, the filter parameters are adjusted by a numerical optimization algorithm in order to minimize a cost function associated with the filter cut-off sharpness. The goal is to achieve a better compromise between frequency selectivity and time resolution at each decomposition level than standard orthogonal filter banks, such as those of the Daubechies and Coiflet families. Our aim is to optimally decompose the signals in the wavelet domain so that they can subsequently be used as inputs for training a neural network classifier.
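
As a point of reference for the optimised filter banks described above, a fixed orthogonal wavelet decomposition of an ECG segment can be computed with PyWavelets; the signal below is a synthetic placeholder, and the Daubechies filter shown is exactly the kind of standard filter bank the paper aims to improve upon.

    import numpy as np
    import pywt

    # Synthetic placeholder "ECG" segment; a real application would use
    # recorded heartbeats sampled at a fixed rate.
    t = np.linspace(0.0, 1.0, 512)
    ecg = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)

    # Four-level decomposition with a fixed Daubechies-4 filter bank.
    coeffs = pywt.wavedec(ecg, 'db4', level=4)

    # The concatenated coefficients could serve as the feature vector fed to
    # a neural network classifier, as in the approach described above.
    features = np.concatenate(coeffs)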

Relevance:

20.00%

Publisher:

Abstract:

While the private sector has long been in the vanguard of shaping and managing urban environs, under the New Labour government business actors were also heralded as key agents in the delivery of sustainable places. Policy interventions, such as Business Improvement Districts (BIDs), saw business-led local partnerships positioned as key drivers in the production of economically, socially and environmentally sustainable urban communities. This research considers how one business-led body, South Bank Employer’s Group (SBEG), has inserted itself into, and influenced, local (re)development trajectories. Interview, observational and archival data are used to explore how, in a neighbourhood noted for its turbulent and conflictual development past, SBEG has led on a series of regeneration programmes that it asserts will create a “better South Bank for all”. A belief in consensual solutions underscored New Labour’s urban agenda and cast regeneration as a politically neutral process in which different stakeholders can reach mutually beneficial solutions (Southern, 2001). For authors such as Mouffe (2005), the search for consensus represents a move towards a ‘post-political’ approach to governing in which the (necessarily) antagonistic nature of the political is denied. The research utilises writings on the ‘post-political’ condition to frame an empirical exploration of regeneration at the neighbourhood level. It shows how SBEG has brokered a consensual vision of regeneration with the aim of overriding past disagreements about local development. While this may be seen as an attempt to enact what Honig (1993: 3) calls the ‘erasure of resistance from political orderings’ by assuming control of regeneration agendas (see also Baeten, 2009), the research shows that ‘resistances’ to SBEG’s activities continue to be expressed in a series of ways. These resistances suggest that, while increasingly ‘post-political’ in character, local place shaping continues to evidence what Massey (2005: 10) calls the ‘space of loose ends and missing links’ from which political activity can, at least potentially, emerge.

Relevance:

20.00%

Publisher:

Abstract:

The time discretization in weather and climate models introduces truncation errors that limit the accuracy of the simulations. Recent work has yielded a method for reducing the amplitude errors in leapfrog integrations from first-order to fifth-order. This improvement is achieved by replacing the Robert–Asselin filter with the Robert–Asselin–Williams (RAW) filter and using a linear combination of unfiltered and filtered states to compute the tendency term. The purpose of the present article is to apply the composite-tendency RAW-filtered leapfrog scheme to semi-implicit integrations. A theoretical analysis shows that the stability and accuracy are unaffected by the introduction of the implicitly treated mode. The scheme is tested in semi-implicit numerical integrations in both a simple nonlinear stiff system and a medium-complexity atmospheric general circulation model and yields substantial improvements in both cases. We conclude that the composite-tendency RAW-filtered leapfrog scheme is suitable for use in semi-implicit integrations.

Relevance:

20.00%

Publisher:

Abstract:

In general, particle filters need large numbers of model runs in order to avoid filter degeneracy in high-dimensional systems. The recently proposed, fully nonlinear equivalent-weights particle filter overcomes this requirement by replacing the standard model transition density with two different proposal transition densities. The first proposal density is used to relax all particles towards the high-probability regions of state space as defined by the observations. The crucial second proposal density is then used to ensure that the majority of particles have equivalent weights at observation time. Here, the performance of the scheme is explored in a simplified ocean model with a high-dimensional state of 65,500 variables. The success of the equivalent-weights particle filter in matching the true model state is shown using the mean of just 32 particles in twin experiments. It is of particular significance that this remains true even as the number and spatial variability of the observations are changed. The results from rank histograms are less easy to interpret and can be influenced considerably by the parameter values used. This article also explores the sensitivity of the performance of the scheme to the chosen parameter values, and the effect of using different model error parameters in the truth compared with the ensemble model runs.
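
Since rank histograms are one of the diagnostics mentioned, the Python sketch below shows how such a histogram is typically assembled from an ensemble and a truth series; the array shapes and names are placeholders.

    import numpy as np

    def rank_histogram(ensemble, truth):
        """Counts of the rank of the truth within the sorted ensemble at each
        verification instance. `ensemble` has shape (ntimes, nmembers) and
        `truth` has shape (ntimes,); a flat histogram suggests a reliable ensemble."""
        ranks = (ensemble < truth[:, None]).sum(axis=1)
        nmembers = ensemble.shape[1]
        counts, _ = np.histogram(ranks, bins=np.arange(nmembers + 2) - 0.5)
        return counts

    # Hypothetical usage with a 32-member ensemble and the twin-experiment truth:
    # counts = rank_histogram(ens, truth)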

Relevance:

20.00%

Publisher:

Abstract:

The disadvantage of the majority of data assimilation schemes is the assumption that the conditional probability density function of the state of the system given the observations [the posterior probability density function (PDF)] is distributed either locally or globally as a Gaussian. The advantage, however, is that through various mechanisms they ensure initial conditions that are predominantly in linear balance, and therefore spurious gravity wave generation is suppressed. The equivalent-weights particle filter is a data assimilation scheme that allows for a representation of a potentially multimodal posterior PDF. It does this via proposal densities that lead to extra terms being added to the model equations, which means that the advantage of the traditional data assimilation schemes, in generating predominantly balanced initial conditions, is no longer guaranteed. This paper looks in detail at the impact the equivalent-weights particle filter has on dynamical balance and gravity wave generation in a primitive equation model. The primary conclusions are that (i) provided the model error covariance matrix imposes geostrophic balance, each additional term required by the equivalent-weights particle filter is also geostrophically balanced; (ii) the relaxation term required to ensure the particles are in the locality of the observations has little effect on gravity waves and actually induces a reduction in gravity wave energy if sufficiently large; and (iii) the equivalent-weights term, which leads to the particles having equivalent significance in the posterior PDF, produces a change in gravity wave energy comparable to the stochastic model error. Thus, the scheme does not produce significant spurious gravity wave energy and so has potential for use in real high-dimensional geophysical applications.
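
For reference, the geostrophic balance against which the additional terms are assessed can be written on an f-plane (standard sign convention) as

    f u_g = -\frac{1}{\rho}\frac{\partial p}{\partial y}, \qquad f v_g = \frac{1}{\rho}\frac{\partial p}{\partial x},

so a model error covariance that imposes this relationship between the velocity and pressure increments keeps each additional particle-filter term balanced, as stated in conclusion (i).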

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the behaviour of residential property and examines the linkages between house price dynamics and bank herding behaviour. The analysis presents evidence that irrational behaviour may have played a significant role in several countries, including the United Kingdom, Spain, Denmark, Sweden and Ireland. In addition, we provide evidence indicative of herding behaviour in the European residential mortgage loan market. Granger causality tests indicate that price dynamics not justified by fundamentals contributed to herding by lenders, and that this behaviour was a response by the banks as a group to common information on residential property assets. In contrast, in Germany, Portugal and Austria, residential property prices were largely explained by fundamentals. Furthermore, these countries show no evidence of either irrational price bubbles or herd behaviour in the mortgage market; Granger causality tests indicate that the two variables are independent.
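
A minimal sketch of a pairwise Granger causality test of the kind reported here, using statsmodels on two hypothetical series (house price growth and a herding measure), might look as follows; the data, lag order and variable names are placeholders.

    import numpy as np
    import pandas as pd
    from statsmodels.tsa.stattools import grangercausalitytests

    # Hypothetical quarterly series standing in for house price growth and a
    # bank herding measure; real data would replace these random draws.
    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "house_price_growth": rng.normal(size=80),
        "herding_measure": rng.normal(size=80),
    })

    # Test whether herding_measure Granger-causes house_price_growth
    # (statsmodels treats the first column as the dependent variable).
    results = grangercausalitytests(df[["house_price_growth", "herding_measure"]], maxlag=4)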

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates whether bank integration, measured by cross-border bank flows, can capture the co-movements across housing markets in developed countries, using a spatial dynamic panel model. The transmission can occur through a global banking channel in which global banks intermediate wholesale funding to local banks. Changes in financial conditions are passed across borders through the banks’ balance-sheet exposure to credit, currency, maturity and funding risks, resulting in house price spillovers. While controlling for country-level and global factors, we find significant co-movement across the housing markets of countries with proportionally high bank integration. Bank integration captures house price co-movements better than other measures of economic integration. Once we account for bank exposure, other spatial linkages traditionally used to account for return co-movements across regions (such as trade, foreign direct investment, portfolio investment and geographic proximity) become insignificant. Moreover, we find that the co-movement across housing markets decreases for countries with less developed mortgage markets, characterized by fixed-rate mortgage contracts, low loan-to-value limits and no mortgage equity withdrawal.
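
Schematically, a spatial dynamic panel specification of the kind described might be written as follows (the notation and variable set are illustrative, not those of the study):

    \Delta HP_{it} = \lambda\,\Delta HP_{i,t-1} + \rho \sum_{j \neq i} w_{ij}\,\Delta HP_{jt} + \gamma' X_{it} + \eta_i + \varepsilon_{it},

where \Delta HP_{it} is house price growth in country i, the spatial weights w_{ij} are constructed from bilateral cross-border bank flows, X_{it} collects country-level and global controls and \eta_i is a country effect.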

Relevance:

20.00%

Publisher:

Abstract:

This paper draws on a study of the politics of development planning in London’s South Bank to examine wider trends in the governance of contemporary cities. It assesses the impacts and outcomes of so-called new localist reforms and argues that we are witnessing two principal trends. First, governance processes are increasingly dominated by anti-democratic development machines, characterized by new assemblages of public- and private-sector experts. These machines reflect and reproduce a type of development politics in which there is a greater emphasis on a pragmatic realism and a politics of delivery. Second, the presence of these machines is having a significant impact on the politics of planning. Democratic engagement is not seen as the basis for new forms of localism and community control. Instead, it is presented as a potentially disruptive force that needs to be managed by a new breed of skilled private-sector consultant. The paper examines these wider shifts in urban politics before focusing on the connections between emerging development machines and local residential and business communities. It ends by highlighting some of the wider implications of change for democratic modes of engagement and nodes of resistance in urban politics.

Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the use of a particle filter for data assimilation with a full-scale coupled ocean–atmosphere general circulation model. Synthetic twin experiments are performed to assess the performance of the equivalent-weights filter in such a high-dimensional system. Artificial two-dimensional sea surface temperature fields are used as daily observational data. Results are presented for different values of the free parameters in the method. Filter performance is measured using root mean square errors, trajectories of individual model variables and rank histograms. Filter degeneracy is not observed, and the performance of the filter is shown to depend on the ability to maintain maximum spread in the ensemble.
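
The root mean square error diagnostic reported here is simply the RMSE of the ensemble mean against the known truth of the twin experiment; a short Python sketch with placeholder array shapes is given below.

    import numpy as np

    def ensemble_mean_rmse(ensemble, truth):
        """RMSE of the ensemble mean against the truth at each time.
        `ensemble` has shape (ntimes, nmembers, nstate), `truth` (ntimes, nstate)."""
        err = ensemble.mean(axis=1) - truth
        return np.sqrt((err ** 2).mean(axis=1))

    # Hypothetical usage, where `truth` is the model run from which the
    # synthetic sea surface temperature observations were generated:
    # rmse_series = ensemble_mean_rmse(ens, truth)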