14 results for Optimal frame-level timing estimator

em Aston University Research Archive


Relevance: 100.00%

Abstract:

We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, the two most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of this novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, using nonlinearity management considerations, we showed that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved at a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. We therefore implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We have accurately computed marginal probability density functions for soliton parameters by numerical modelling of the Fokker-Planck equation using the MMC simulation technique. Moreover, applying the MMC method we have studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We have demonstrated that in such systems an analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty. We also present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.
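The multicanonical idea described in this abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the observable (a sum of Gaussian samples), the bin layout and the iteration counts are all invented for illustration, in the spirit of Berg-style histogram flattening, which lets a Metropolis walk visit rare bins far in the distribution tails.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 16                                  # toy "system": N independent Gaussian noise samples
edges = np.linspace(-20.0, 20.0, 81)    # histogram bins for the observable
logw = np.zeros(len(edges) - 1)         # log bias weights, refined each iteration

def observable(x):
    return x.sum()                      # statistic whose tail distribution we want

def bin_of(s):
    return int(np.clip(np.searchsorted(edges, s) - 1, 0, len(logw) - 1))

x = rng.normal(size=N)
for it in range(8):                     # MMC iterations: biased sampling, then flattening
    bias = logw.copy()                  # bias in force during this pass
    hist = np.zeros(len(logw))
    b = bin_of(observable(x))
    for _ in range(20000):              # Metropolis walk with target pi(x)*exp(-bias[bin])
        xp = x.copy()
        xp[rng.integers(N)] = rng.normal()   # resample one component from the prior
        bp = bin_of(observable(xp))
        if np.log(rng.random()) < bias[b] - bias[bp]:
            x, b = xp, bp
        hist[b] += 1
    logw = bias + np.log(np.maximum(hist, 1.0))  # penalise well-visited bins
    logw -= logw.max()                           # fix the overall scale

# Re-weight the final pass back to the true density: pi(bin) ~ hist * exp(bias)
p = hist * np.exp(bias)
p /= p.sum() * np.diff(edges)
```

Plain Monte Carlo with the same budget would almost never visit the outer bins; the bias weights make the walk roughly uniform over bins, and the re-weighting step recovers the tiny tail probabilities.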

Relevance: 30.00%

Abstract:

We present a framework for calculating globally optimal parameters, within a given time frame, for on-line learning in multilayer neural networks. We demonstrate the capability of this method by computing optimal learning rates in typical learning scenarios. A similar treatment allows one to determine the relevance of related training algorithms based on modifications to the basic gradient descent rule as well as to compare different training methods.
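A toy numerical analogue of this programme, with details not taken from the paper: train a linear student on a random teacher by on-line gradient descent for a fixed time frame of T examples, and select the constant learning rate that minimises the final generalization error. The dimension, time frame and candidate rates below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 50, 2000                              # input dimension, time frame (examples)
w_star = rng.normal(size=N) / np.sqrt(N)     # teacher weights

def final_gen_error(eta):
    """Generalization error after T on-line SGD steps at constant rate eta."""
    w = np.zeros(N)                          # student
    for _ in range(T):
        x = rng.normal(size=N)
        err = w @ x - w_star @ x             # single-example residual
        w -= eta * err * x / N
    d = w - w_star
    return 0.5 * d @ d                       # E_x[0.5*(w.x - w*.x)^2] = 0.5|d|^2 for unit-variance inputs

etas = [0.05, 0.2, 0.5, 1.0, 2.0]
errors = [final_gen_error(e) for e in etas]
best = etas[int(np.argmin(errors))]
```

The paper's variational treatment finds the globally optimal (generally time-dependent) rate analytically; this grid search only illustrates the objective being optimised.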

Relevance: 30.00%

Abstract:

A method for calculating the globally optimal learning rate in on-line gradient-descent training of multilayer neural networks is presented. The method is based on a variational approach which maximizes the decrease in generalization error over a given time frame. We demonstrate the method by computing optimal learning rates in typical learning scenarios. The method can also be employed when different learning rates are allowed for different parameter vectors as well as to determine the relevance of related training algorithms based on modifications to the basic gradient descent rule.

Relevance: 30.00%

Abstract:

In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) type neurons (logical threshold units) subject to stochastic forcing; the second is (in a rate-coding paradigm) a population of neurons that each display Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise, while in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture, and hence the code that maximises the MI, is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases with a decrease in noise level. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.
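The MI-versus-noise computation for the MP array can be sketched numerically. This is not the paper's optimisation over heterogeneous parameters: it assumes a homogeneous pool (one shared threshold, chosen here arbitrarily at zero) and simply evaluates I(X; spike count) at a few noise levels.

```python
import numpy as np
from math import comb, erf

rng = np.random.default_rng(2)
N, theta = 5, 0.0                            # pool size, shared firing threshold

Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / 2**0.5)))   # standard normal CDF

def mutual_info(sigma, n_x=4000):
    """I(X; count) in bits: N MP units, common input X~N(0,1), iid noise N(0, sigma^2)."""
    xs = rng.normal(size=n_x)                # samples of the common input
    q = Phi((xs - theta) / sigma)            # P(unit fires | x)
    ks = np.arange(N + 1)
    C = np.array([comb(N, k) for k in ks], float)
    pkx = C * q[:, None]**ks * (1.0 - q[:, None])**(N - ks)   # count|x is Binomial(N, q(x))
    pk = pkx.mean(axis=0)                    # marginal count distribution
    H_k = -np.sum(pk * np.log2(pk + 1e-300))
    H_k_x = -np.mean(np.sum(pkx * np.log2(pkx + 1e-300), axis=1))
    return H_k - H_k_x

mis = {s: mutual_info(s) for s in (0.1, 0.5, 1.0, 2.0)}
```

With identical thresholds the count is binomial given the input, so both entropies are cheap to evaluate; the paper's question is how MI changes once the thresholds themselves are free parameters.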

Relevance: 30.00%

Abstract:

This thesis examines the dynamics of firm-level financing and investment decisions for six Southeast Asian countries. The study provides empirical evidence on the impacts of changes in firm-level financing decisions during the period of financial liberalization by considering the debt and equity financing decisions of a set of non-financial firms. The empirical results show that firms in Indonesia, Pakistan, and South Korea adjust faster than firms in the other countries studied to attain optimal debt and equity ratios in response to banking sector and stock market liberalization. In addition, contrary to the widely held belief that firms adjust their financial ratios to industry levels, the results indicate that industry factors do not significantly affect the speed of capital structure adjustment. This study also shows that non-linear estimation methods are more appropriate than linear methods for capturing changes in capital structure. The empirical results also show that international stock market integration of these countries has significantly reduced the equity risk premium as well as the firm-level cost of equity capital. Thus stock market liberalization is associated with a decrease in firms' cost of equity capital. Developments in the securities markets infrastructure have also reduced the cost of equity capital. However, with increased integration there is the possibility of capital outflows from the emerging markets, which might reverse the pattern of decreasing cost of capital in these markets.
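The "speed of adjustment" notion used above comes from the partial-adjustment model of capital structure. A minimal simulated sketch (the adjustment speed, target ratio and noise scale are made-up numbers, not estimates from the thesis) shows how the speed parameter is recovered by regressing the change in the debt ratio on the gap to target.

```python
import numpy as np

rng = np.random.default_rng(3)
T, lam, d_star = 200, 0.4, 0.5          # periods, true adjustment speed, target debt ratio

# Partial-adjustment model: D_t - D_{t-1} = lam * (D* - D_{t-1}) + e_t
D = np.empty(T)
D[0] = 0.1
for t in range(1, T):
    D[t] = D[t-1] + lam * (d_star - D[t-1]) + rng.normal(scale=0.02)

# OLS of the change on the lagged gap recovers lam (here D* is known;
# in practice the target is itself estimated from firm characteristics)
dy = D[1:] - D[:-1]
gap = d_star - D[:-1]
lam_hat = float((gap @ dy) / (gap @ gap))
```

A speed near 1 means the firm closes almost the whole gap each period; the thesis's cross-country comparison is a comparison of such estimated speeds.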

Relevance: 30.00%

Abstract:

Nonlinearity management is explored as a complete tool to obtain maximum transmission reach in a WDM fiber transmission system, making it possible to optimize multiple system parameters, including optimal dispersion pre-compensation, with fast simulations based on the continuous-wave approximation.

Relevance: 30.00%

Abstract:

In the wake of the global financial crisis, several macroeconomic contributions have highlighted the risks of excessive credit expansion. In particular, too much finance can have a negative impact on growth. We examine the microeconomic foundations of this argument, positing a non-monotonic relationship between leverage and firm-level productivity growth in the spirit of the trade-off theory of capital structure. A threshold regression model estimated on a sample of Central and Eastern European countries confirms that TFP growth increases with leverage until the latter reaches a critical threshold beyond which leverage lowers TFP growth. This estimate can provide guidance to firms and policy makers on identifying "excessive" leverage. We find similar non-monotonic relationships between leverage and proxies for firm value. Our results are a first step in bridging the gap between the literature on optimal capital structure and the wider macro literature on the finance-growth nexus. © 2012 Elsevier Ltd.
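The threshold-regression step can be sketched on simulated data. This is not the paper's estimator or dataset: it assumes a single known functional form (piecewise-linear in leverage) and invented coefficients, and finds the threshold by grid search over the sum of squared errors.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 600
lev = rng.uniform(0, 1, n)                   # leverage ratio (illustrative)
tau_true = 0.55
# Simulated "TFP growth": rises with leverage up to tau, slope drops beyond it
g = 0.04*lev - 0.10*np.clip(lev - tau_true, 0, None) + rng.normal(scale=0.01, size=n)

def sse_at(tau):
    """SSE of the piecewise-linear fit with a kink at tau."""
    X = np.column_stack([np.ones(n), lev, np.clip(lev - tau, 0, None)])
    beta, *_ = np.linalg.lstsq(X, g, rcond=None)
    r = g - X @ beta
    return r @ r

grid = np.linspace(0.2, 0.8, 61)
tau_hat = grid[int(np.argmin([sse_at(t) for t in grid]))]
```

The estimated threshold is the candidate kink that best explains the data; the sign flip of the slope above it is what the paper interprets as "excessive" leverage.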

Relevance: 30.00%

Abstract:

Despite recent research on time (e.g. Hedaa & Törnroos, 2001), consideration of the time dimension in data collection, analysis and interpretation in research in supply networks is, to date, still limited. Drawing on a body of literature from organization studies, and empirical findings from a six-year action research programme and a related study of network learning, we reflect on time, timing and timeliness in interorganizational networks. The empirical setting is supply networks in the English health sector wherein we identify and elaborate various issues of time, within the case and in terms of research process. Our analysis is wide-ranging and multi-level, from the global (e.g. identifying the notion of life cycles) to the particular (e.g. different cycle times in supply, such as daily for deliveries and yearly for contracts). We discuss the ‘speeding up’ of inter-organizational ‘e’ time and tensions with other time demands. In closing the paper, we relate our conclusions to the future conduct of the research programme and supply research more generally, and to the practice of managing supply (in) networks.

Relevance: 30.00%

Abstract:

The ability to hear a target signal over background noise is an important aspect of efficient hearing in everyday situations. This mechanism depends on binaural hearing whenever there are differences in the inter-aural timing of inputs from the noise and the signal. Impairments in binaural hearing may underlie some auditory processing disorders, for example temporal-lobe epilepsies. The binaural masking level difference (BMLD) measures the advantage in detecting a tone whose inter-aural phase differs from that of the masking noise. BMLDs are typically estimated psychophysically, but this is challenging in children or those with cognitive impairments. The aim of this doctorate is to design a passive measure of the BMLD using magnetoencephalography (MEG) and to test it in adults, children and patients with different types of epilepsy. The stimulus consists of Gaussian background noise with 500-Hz tones presented binaurally, either in-phase or 180° out-of-phase between the ears. Source modelling provides the N1m amplitude for the in-phase and out-of-phase tones, representing the extent of signal perception over the background noise. The passive BMLD stimulus is successfully used as a measure of binaural hearing in participants who would otherwise be unable to undertake a psychophysical task.
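The two stimulus conditions described above (tone in-phase versus 180° out-of-phase across ears, with the same noise in both ears) are straightforward to generate. A minimal sketch, with arbitrary sample rate and amplitudes not taken from the thesis:

```python
import numpy as np

fs, dur, f_tone = 48000, 1.0, 500.0          # sample rate (Hz), duration (s), tone (Hz)
rng = np.random.default_rng(5)
t = np.arange(int(fs * dur)) / fs

noise = rng.normal(scale=0.1, size=t.size)   # diotic masking noise (identical in both ears)
tone = 0.05 * np.sin(2 * np.pi * f_tone * t) # 500-Hz probe tone

# In-phase condition: tone identical at both ears
left_s0,  right_s0  = noise + tone, noise + tone
# 180° out-of-phase condition: tone inverted in one ear, noise unchanged
left_spi, right_spi = noise + tone, noise - tone
```

The BMLD is the detection advantage for the out-of-phase condition; in the MEG design the N1m amplitude difference between the two conditions plays the role of the psychophysical threshold difference.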

Relevance: 30.00%

Abstract:

The re-entrant flow shop scheduling problem (RFSP) is regarded as NP-hard and has attracted the attention of both researchers and industry. Current approaches attempt to minimize the makespan of the RFSP without considering the interdependency between the resource constraints and the re-entrant probability. This paper proposes a multi-level genetic algorithm (GA) that includes the co-related re-entrant possibility and production mode in a multi-level chromosome encoding. A repair operator is incorporated in the algorithm to revise infeasible solutions by resolving resource conflicts. With the objective of minimizing the makespan, ANOVA is used to fine-tune the parameter settings of the GA. Experiments show that the proposed approach is more effective at finding near-optimal schedules than a simulated annealing algorithm, for both small-size and large-size problems. © 2013 Published by Elsevier Ltd.
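The GA machinery underneath such approaches can be sketched for the plain (non-re-entrant) permutation flow shop. This is not the paper's multi-level encoding or repair operator: it is a basic permutation GA with order crossover, swap mutation and elitism on invented processing times, minimizing the makespan.

```python
import numpy as np

rng = np.random.default_rng(6)
J, M = 8, 3                                  # jobs, machines
proc = rng.integers(1, 10, size=(J, M))      # processing times (job x machine)

def makespan(order):
    done = np.zeros(M)                       # completion time of the last job per machine
    for j in order:
        prev = 0.0
        for m in range(M):
            prev = max(done[m], prev) + proc[j, m]
            done[m] = prev
    return done[-1]

def order_crossover(a, b):
    i, k = sorted(rng.choice(J, 2, replace=False))
    child = [-1] * J
    child[i:k] = list(a[i:k])                # keep a slice of parent a
    fill = [x for x in b if x not in child]  # fill the rest in parent b's order
    for idx in range(J):
        if child[idx] == -1:
            child[idx] = fill.pop(0)
    return np.array(child)

pop = [rng.permutation(J) for _ in range(40)]
for gen in range(60):
    pop.sort(key=makespan)
    elite, children = pop[:10], []           # elitist selection
    while len(children) < 30:
        a, b = elite[rng.integers(10)], elite[rng.integers(10)]
        c = order_crossover(a, b)
        if rng.random() < 0.3:               # swap mutation
            i, k = rng.choice(J, 2, replace=False)
            c[i], c[k] = c[k], c[i]
        children.append(c)
    pop = elite + children

best = min(pop, key=makespan)
```

The permutation encoding keeps every chromosome feasible by construction; in the re-entrant, resource-constrained setting of the paper this no longer holds, which is why a dedicated repair operator is needed.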

Relevance: 30.00%

Abstract:

Nonlinearity management is explored as a complete tool to obtain maximum transmission reach in a WDM fiber transmission system, making it possible to optimize multiple system parameters, including optimal dispersion pre-compensation, with fast simulations based on the continuous-wave approximation. © 2006 Optical Society of America.
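The propagation model underlying this kind of system optimisation can be sketched with a standard split-step Fourier solver for the scalar, normalised NLSE. This is not the paper's continuous-wave-approximation optimiser; parameters are in soliton units and chosen only so the result is easy to check (a fundamental soliton propagates unchanged).

```python
import numpy as np

# Normalised NLSE: i dA/dz = (beta2/2) d2A/dt2 - gamma |A|^2 A
nt, T = 1024, 100.0
t = (np.arange(nt) - nt // 2) * (T / nt)
w = 2 * np.pi * np.fft.fftfreq(nt, d=T / nt)      # angular frequency grid
beta2, gamma = -1.0, 1.0                          # anomalous dispersion, Kerr coefficient
dz, nz = 0.01, 500                                # step size, number of steps (total z = 5)

def step(A, dz):
    half = np.exp(0.5j * beta2 * w**2 * (dz / 2)) # half linear step (frequency domain)
    A = np.fft.ifft(half * np.fft.fft(A))
    A = A * np.exp(1j * gamma * np.abs(A)**2 * dz)  # full nonlinear step (time domain)
    return np.fft.ifft(half * np.fft.fft(A))

# Dispersion pre-compensation would amount to one extra phase factor
# exp(-0.5j * beta2_pre * w**2 * L_pre) applied to A before the loop,
# and scanning L_pre is the kind of parameter sweep the paper accelerates.
A = 1.0 / np.cosh(t)            # fundamental soliton: stationary for beta2=-1, gamma=1
E0 = np.sum(np.abs(A)**2)
for _ in range(nz):
    A = step(A, dz)
```

The symmetric split keeps the scheme second-order accurate in dz, and both sub-steps are pure phase multiplications, so the discrete pulse energy is conserved to rounding error.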

Relevance: 30.00%

Abstract:

The original contribution of this work is threefold. Firstly, this thesis develops a critical perspective on current evaluation practice for business support, with a focus on the timing of evaluation. The time frame generally applied in business support policy evaluation is limited to one to two, seldom three, years post-intervention. This is despite calls for long-term impact studies by various authors concerned about time lags before effects are fully realised. This desire for long-term evaluation runs counter to the requirements of policy-makers and funders, who seek quick results. Also, current 'best practice' frameworks do not refer to timing or its implications, and data availability affects the ability to undertake long-term evaluation. Secondly, this thesis provides methodological value for follow-up and similar studies by linking scheme-beneficiary data with official performance datasets; data availability problems are thus avoided through the use of secondary data. Thirdly, this thesis builds the evidence base through a longitudinal impact study of small business support in England covering seven years of post-intervention data. This illustrates the variability of results across evaluation periods, and the value of using multiple years of data for a robust understanding of support impact. For survival, the impact of assistance is found to be immediate, but limited. Concerning growth, significant impact centres on a two-to-three-year period post-intervention for the linear selection and quantile regression models: positive for employment and turnover, negative for productivity. Attribution of impact may present a problem for subsequent periods. The results clearly support the argument for the use of longitudinal data and analysis, and for a greater appreciation by evaluators of the time factor. This analysis recommends a time frame of four to five years post-intervention for the evaluation of soft business support.

Relevance: 30.00%

Abstract:

The original contribution of this work is threefold. Firstly, this thesis develops a critical perspective on current evaluation practice for business support, with a focus on the timing of evaluation. The time frame generally applied in business support policy evaluation is limited to one to two, seldom three, years post-intervention. This is despite calls for long-term impact studies by various authors concerned about time lags before effects are fully realised. This desire for long-term evaluation runs counter to the requirements of policy-makers and funders, who seek quick results. Also, current 'best practice' frameworks do not refer to timing or its implications, and data availability affects the ability to undertake long-term evaluation. Secondly, this thesis provides methodological value for follow-up and similar studies by linking scheme-beneficiary data with official performance datasets; data availability problems are thus avoided through the use of secondary data. Thirdly, this thesis builds the evidence base through a longitudinal impact study of small business support in England covering seven years of post-intervention data. This illustrates the variability of results across evaluation periods, and the value of using multiple years of data for a robust understanding of support impact. For survival, the impact of assistance is found to be immediate, but limited. Concerning growth, significant impact centres on a two-to-three-year period post-intervention for the linear selection and quantile regression models: positive for employment and turnover, negative for productivity. Attribution of impact may present a problem for subsequent periods. The results clearly support the argument for the use of longitudinal data and analysis, and for a greater appreciation by evaluators of the time factor. This analysis recommends a time frame of four to five years post-intervention for the evaluation of soft business support.

Relevance: 30.00%

Abstract:

Supply chain operations directly affect service levels. Decisions on amending facilities are generally made on the basis of overall cost, leaving out the efficiency of each unit. By decomposing the supply chain superstructure, efficiency analysis of the facilities (warehouses or distribution centers) that serve customers can be easily implemented. With the proposed algorithm, the selection of a facility is based on service-level maximization and not just cost minimization, as the analysis filters all feasible solutions using the Data Envelopment Analysis (DEA) technique. Through multiple iterations, solutions are filtered via DEA and only the efficient ones are selected, leading to cost minimization. In this work, the problem of optimal supply chain network design is addressed with a DEA-based algorithm. A Branch and Efficiency (B&E) algorithm is deployed for the solution of this problem. In this DEA approach, each solution (a potentially installed warehouse, plant, etc.) is treated as a Decision Making Unit and is thus characterized by inputs and outputs. Through additional constraints named "efficiency cuts", the algorithm selects only efficient solutions, providing better objective function values. The applicability of the proposed algorithm is demonstrated through illustrative examples.
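The DEA scoring at the heart of this approach can be sketched with the classical input-oriented CCR envelopment model, solved as a small linear program. This is not the paper's B&E algorithm (the "efficiency cuts" inside a branch-and-bound search are not reproduced); the facility data below are made up, with two inputs and one output per candidate facility.

```python
import numpy as np
from scipy.optimize import linprog

# Toy data: 5 candidate facilities (DMUs), 2 inputs (cost, staff), 1 output (demand served)
X = np.array([[4., 3.], [7., 3.], [8., 1.], [4., 2.], [10., 5.]])
Y = np.array([[1.], [1.], [1.], [1.], [1.]])

def ccr_efficiency(o):
    """Input-oriented CCR score of DMU o: min theta such that some non-negative
    combination of DMUs uses at most theta * inputs(o) and produces at least outputs(o)."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.r_[1.0, np.zeros(n)]                    # variables: [theta, lam_1..lam_n]
    A_in = np.hstack([-X[[o]].T, X.T])             # sum_j lam_j x_ji - theta x_oi <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])    # -sum_j lam_j y_jr <= -y_or
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n,
                  method="highs")
    return res.fun

effs = [round(ccr_efficiency(o), 3) for o in range(len(X))]
```

A score of 1.0 marks an efficient facility (on the frontier); scores below 1.0 indicate by how much all inputs could be radially shrunk. In the B&E scheme, only candidates passing such an efficiency screen survive into the cost-minimizing network design.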