974 results for Bit Error Rate


Relevance: 20.00%

Abstract:

This paper studies the rate of convergence of an appropriate discretization scheme for the solution of the McKean-Vlasov equation introduced by Bossy and Talay. More specifically, we consider approximations of the distribution and of the density of the solution of the stochastic differential equation associated to the McKean-Vlasov equation. The scheme adopted here is a mixed one: Euler/weakly interacting particle system. If $n$ is the number of weakly interacting particles and $h$ is the uniform step in the time discretization, we prove that the rate of convergence of the distribution functions of the approximating sequence, in the $L^1(\Omega\times \Bbb R)$ norm and in the sup norm, is of the order of $\frac 1{\sqrt n} + h$, while for the densities it is of the order $h + \frac 1{\sqrt{nh}}$. This result is obtained by carefully employing techniques of Malliavin Calculus.
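
The mixed Euler/weakly interacting particle scheme can be sketched in a few lines. The mean-reverting drift, unit diffusion, and parameter values below are illustrative assumptions, not the coefficients studied in the paper; the measure dependence of the McKean-Vlasov drift is realized through the empirical mean of the particle cloud.

```python
import numpy as np

def particle_euler(b, sigma, x0, T, n, steps, rng=None):
    """Euler scheme for a system of n weakly interacting particles.

    The law of the McKean-Vlasov solution is replaced by the empirical
    measure of the particles: b and sigma act on the whole cloud.
    """
    rng = np.random.default_rng() if rng is None else rng
    h = T / steps                       # uniform time step
    x = np.full(n, float(x0))           # all particles start at x0
    for _ in range(steps):
        dw = rng.normal(0.0, np.sqrt(h), size=n)  # Brownian increments
        x = x + b(x) * h + sigma(x) * dw          # Euler update
    return x

# Illustrative coefficients: drift pulls each particle toward the
# empirical mean of the cloud, diffusion is constant.
b = lambda x: np.mean(x) - x
sigma = lambda x: np.ones_like(x)
particles = particle_euler(b, sigma, x0=0.0, T=1.0, n=1000,
                           steps=100, rng=np.random.default_rng(0))

# The empirical distribution function approximates the law of the solution.
F_hat = lambda t: float(np.mean(particles <= t))
```

The two error sources in the paper's rates correspond to `steps` (the time step $h = T/\text{steps}$) and `n` (the particle count).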

Relevance: 20.00%

Abstract:

This article studies the effects of interest rate restrictions on loan allocation. The British government tightened the usury laws in 1714, reducing the maximum permissible interest rate from 6% to 5%. A sample of individual loan transactions reveals that average loan size and minimum loan size increased strongly, while access to credit worsened for those with little social capital. Collateralised credits, which had accounted for a declining share of total lending, returned to their former role of prominence. Our results suggest that the usury laws distorted credit markets significantly; we find no evidence that they offered a form of Pareto-improving social insurance.

Relevance: 20.00%

Abstract:

This work is part of a project studying the performance of model-based estimators in a small area context. We have chosen a simple statistical application in which we estimate the growth rate of occupation for several regions of Spain. We compare three estimators: the direct one, based on straightforward results from the survey (which is unbiased), and a third one which is based on a statistical model and minimizes the mean square error.

Relevance: 20.00%

Abstract:

Given $n$ independent replicates of a jointly distributed pair $(X,Y)\in {\cal R}^d \times {\cal R}$, we wish to select from a fixed sequence of model classes ${\cal F}_1, {\cal F}_2, \ldots$ a deterministic prediction rule $f: {\cal R}^d \to {\cal R}$ whose risk is small. We investigate the possibility of empirically assessing the {\em complexity} of each model class, that is, the actual difficulty of the estimation problem within each class. The estimated complexities are in turn used to define an adaptive model selection procedure, which is based on complexity penalized empirical risk.

The available data are divided into two parts. The first is used to form an empirical cover of each model class, and the second is used to select a candidate rule from each cover based on empirical risk. The covering radii are determined empirically to optimize a tight upper bound on the estimation error. An estimate is chosen from the list of candidates in order to minimize the sum of class complexity and empirical risk. A distinguishing feature of the approach is that the complexity of each model class is assessed empirically, based on the size of its empirical cover.

Finite sample performance bounds are established for the estimates, and these bounds are applied to several non-parametric estimation problems. The estimates are shown to achieve a favorable tradeoff between approximation and estimation error, and to perform as well as if the distribution-dependent complexities of the model classes were known beforehand. In addition, it is shown that the estimate can be consistent, and even possess near-optimal rates of convergence, when each model class has an infinite VC or pseudo dimension. For regression estimation with squared loss we modify our estimate to achieve a faster rate of convergence.
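
The final selection step, minimizing the sum of class complexity and empirical risk over the list of candidates, is simple to state in code. The candidate triples below are hypothetical placeholders; the cover-based complexity computation itself is not reproduced here.

```python
def select(candidates):
    """Pick the candidate minimizing empirical risk + complexity penalty.

    Each candidate is (rule, empirical_risk, complexity), one per model
    class; the complexity term would be derived from the size of the
    class's empirical cover.
    """
    return min(candidates, key=lambda c: c[1] + c[2])

# Three hypothetical candidate rules: richer classes fit better but pay
# a larger complexity penalty.
candidates = [("f1", 0.30, 0.01), ("f2", 0.20, 0.05), ("f3", 0.15, 0.20)]
best = select(candidates)   # "f2": 0.25 beats 0.31 and 0.35
```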

Relevance: 20.00%

Abstract:

The resting metabolic rate (RMR) and body composition of 130 obese and nonobese prepubertal children, aged 6 to 10 years, were assessed by indirect calorimetry and skinfold thickness, respectively. The mean (+/- SD) RMR was 4619 +/- 449 kJ.day-1 (164 +/- 31 kJ.kg body weight-1 x day-1) in the 62 boys and 4449 +/- 520 kJ.day-1 (147 +/- 32 kJ.kg body weight-1 x day-1) in the 68 girls. Fat-free mass was the best single predictor of RMR (R2 = 0.64; p < 0.001). Step-down multiple regression analysis, with independent variables such as age, gender, weight, and height, allowed several RMR predictive equations to be developed. The equation for boys is: RMR (kJ.day-1) = 1287 + 28.6 x Weight (kg) + 23.6 x Height (cm) - 69.1 x Age (yr) (R2 = 0.58; p < 0.001). The equation for girls is: RMR (kJ.day-1) = 1552 + 35.8 x Weight (kg) + 15.6 x Height (cm) - 36.3 x Age (yr) (R2 = 0.69; p < 0.001). Comparison between the measured RMR and that predicted by currently used formulas showed that most of these equations tended to overestimate the RMR of both genders, especially in overweight children.
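
The two predictive equations can be applied directly; the function below is a straightforward transcription of the formulas reported above, and the child in the example is hypothetical.

```python
def predicted_rmr(sex, weight_kg, height_cm, age_yr):
    """Resting metabolic rate in kJ/day from the study's regression equations."""
    if sex == "boy":
        return 1287 + 28.6 * weight_kg + 23.6 * height_cm - 69.1 * age_yr
    if sex == "girl":
        return 1552 + 35.8 * weight_kg + 15.6 * height_cm - 36.3 * age_yr
    raise ValueError("sex must be 'boy' or 'girl'")

# A hypothetical 8-year-old weighing 28 kg and measuring 130 cm:
rmr_boy = predicted_rmr("boy", 28, 130, 8)    # 4603.0 kJ/day
rmr_girl = predicted_rmr("girl", 28, 130, 8)  # 4292.0 kJ/day
```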

Relevance: 20.00%

Abstract:

The efficacy of treatments for osteoporosis does not become evident when evaluated by fracture incidence (FI). Vertebral FI decreased in all controlled studies on calcitonin, but not significantly. Small sample sizes and short periods of treatment may have masked a possible therapeutic benefit, but longer, controlled studies with sodium fluoride or etidronate in larger groups of patients also failed to show a decrease in FI. The present analysis of nine published therapeutic studies that report the FI per year and the initial prevalence of vertebral fractures examines whether the initial prevalence of fractures has an effect on the subsequent incidence of new fractures and whether therapeutic effects have to be evaluated as a function of the initial prevalence of fractures. Bearing in mind the differences in roentgenological evaluation and in the size and quality of the various studies, the analysis revealed (1) that in the control groups there was a higher FI in patients with more than three vertebral fractures at baseline (estimated odds ratio (OR) = 49, p = 0.011); (2) that a similar trend, although not statistically significant, was observed in treated patients; and (3) that the groups of control patients treated for more than 1 year generally showed an increase in FI beyond the first year, whereas the reverse was true in treated patients. In conclusion, failure to allow for the initial prevalence of vertebral fractures at the individual level in therapeutic trials of calcitonin to treat osteoporosis and prevent new fractures might have contributed to the absence of a demonstrable benefit of the treatment in those studies.

Relevance: 20.00%

Abstract:

Constant interest rate (CIR) projections are often criticized on the grounds that they are inconsistent with the existence of a unique equilibrium in a variety of forward-looking models. This note shows how to construct CIR projections that are not subject to that criticism, using a standard New Keynesian model as a reference framework.

Relevance: 20.00%

Abstract:

We lay out a small open economy version of the Calvo sticky price model, and show how the equilibrium dynamics can be reduced to a simple representation in domestic inflation and the output gap. We use the resulting framework to analyze the macroeconomic implications of three alternative rule-based policy regimes for the small open economy: domestic inflation and CPI-based Taylor rules, and an exchange rate peg. We show that a key difference among these regimes lies in the relative amount of exchange rate volatility that they entail. We also discuss a special case for which domestic inflation targeting constitutes the optimal policy, and where a simple second-order approximation to the utility of the representative consumer can be derived and used to evaluate the welfare losses associated with the suboptimal rules.
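
A Taylor rule of the kind compared here can be written generically as $i_t = \rho + \phi_\pi \pi_t + \phi_y x_t$. The coefficient values in the sketch below are Taylor's conventional illustrative choices, not the paper's calibration; whether $\pi_t$ is domestic or CPI inflation distinguishes two of the three regimes.

```python
def taylor_rule(inflation, output_gap, rho=0.01, phi_pi=1.5, phi_y=0.5):
    """Generic interest rate rule reacting to inflation and the output gap.

    Coefficients are illustrative defaults (phi_pi = 1.5, phi_y = 0.5),
    not the calibration used in the paper.
    """
    return rho + phi_pi * inflation + phi_y * output_gap

# 2% inflation and a 1% positive output gap:
rate = taylor_rule(0.02, 0.01)   # 0.01 + 1.5*0.02 + 0.5*0.01 = 0.045
```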

Relevance: 20.00%

Abstract:

Dipteran larvae were collected from rabbit (Oryctolagus cuniculus L.) carcasses during the four seasons of 2005 in Rio Grande do Sul, the southernmost state of Brazil. Following collection from the carcasses, the larvae were fed ground beef at ambient temperatures, and the development of each species under these conditions was estimated. The most abundant species in the carcasses were Lucilia eximia (Wiedemann) and Chrysomya albiceps (Wiedemann) (Calliphoridae), and they were found in all seasons. The data were fitted to a linear model that describes the relationship between temperature and linear development rate. These two species are primary forensic indicators in southern Brazil. Other species such as Hemilucilia semidiaphana (Rondani) (Calliphoridae), Synthesiomyia nudiseta (Wulp), Muscina stabulans (Fallén) (Muscidae), and Fannia pusio (Wiedemann) (Fanniidae) were forensically less important because they occurred in high frequency only in certain seasons and during the first days of carcass decomposition.
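
The linear temperature/development-rate fit mentioned above can be sketched as follows. The data pairs are made-up placeholders, not the study's measurements, and "development rate" is taken as the reciprocal of development time.

```python
import numpy as np

# Hypothetical (temperature in C, development rate in 1/day) pairs;
# the study's actual measurements are not reproduced here.
temps = np.array([15.0, 20.0, 25.0, 30.0])
rates = np.array([0.04, 0.07, 0.10, 0.13])

# Fit rate = intercept + slope * T by ordinary least squares
slope, intercept = np.polyfit(temps, rates, 1)

# Lower developmental threshold: the temperature at which the fitted
# rate reaches zero, T0 = -intercept / slope
t0 = -intercept / slope
```

A fit like this is what lets larval age (and hence a minimum postmortem interval) be estimated from the ambient temperature record.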

Relevance: 20.00%

Abstract:

This paper proposes a dynamic framework to study the timing of balance of payments crises. The model incorporates two main ingredients: (i) investors have private information; (ii) investors interact in a dynamic setting, weighing the high returns on domestic assets against the incentives to pull out before the devaluation. The model shows that the presence of disaggregated information delays the onset of BOP crises, giving rise to discrete devaluations. It also shows that high interest rates can be effective in delaying and possibly avoiding the abandonment of the peg. The optimal policy is to raise interest rates sharply as fundamentals become very weak. However, this policy is time inconsistent, suggesting a role for commitment devices such as currency boards or IMF pressure.

Relevance: 20.00%

Abstract:

Specialized glucosensing neurons are present in the hypothalamus, some of which neighbor the median eminence, where the blood-brain barrier has been reported to be leaky. A leaky blood-brain barrier implies high tissue glucose levels and obviates a role for endothelial glucose transporters in the control of hypothalamic glucose concentration, which is important in understanding the mechanisms of glucose sensing. We therefore addressed the question of blood-brain barrier integrity at the hypothalamus for glucose transport by examining the brain tissue-to-plasma glucose ratio in the hypothalamus relative to other brain regions. We also examined glycogenolysis in the hypothalamus because its occurrence is unlikely in the potential absence of a hypothalamus-blood interface. Across all regions the concentration of glucose was comparable at a given plasma glucose concentration and was a near-linear function of plasma glucose. At steady state, hypothalamic glucose concentration was similar to the extracellular hypothalamic glucose concentration reported by others. Hypothalamic glycogen fell at a rate of approximately 1.5 micromol/g/h and remained present in substantial amounts. We conclude, for the hypothalamus, a putative primary site of brain glucose sensing, that the rate-limiting step for glucose transport into brain cells is at the blood-hypothalamus interface, and that glycogenolysis is consistent with a substantial blood-to-intracellular glucose concentration gradient.

Relevance: 20.00%

Abstract:

Analyzes the physiological state of phytoplankton in coastal waters near Peru.

Relevance: 20.00%

Abstract:

This paper offers empirical evidence that a country's choice of exchange rate regime can have a significant impact on its medium-term rate of productivity growth. Moreover, the impact depends critically on the country's level of financial development, its degree of market regulation, and its distance from the global technology frontier. We illustrate how each of these channels may operate in a simple stylized growth model in which real exchange rate uncertainty exacerbates the negative investment effects of domestic credit market constraints. The empirical analysis is based on an 83-country data set spanning the years 1960-2000. Our approach delivers results that are in striking contrast to the vast existing empirical exchange rate literature, which largely finds the effects of exchange rate volatility on real activity to be relatively small and insignificant.

Relevance: 20.00%

Abstract:

We study model selection strategies based on penalized empirical loss minimization. We point out a tight relationship between error estimation and data-based complexity penalization: any good error estimate may be converted into a data-based penalty function, and the performance of the estimate is governed by the quality of the error estimate. We consider several penalty functions, involving error estimates on independent test data, empirical VC dimension, empirical VC entropy, and margin-based quantities. We also consider the maximal difference between the error on the first half of the training data and the second half, and the expected maximal discrepancy, a closely related capacity estimate that can be calculated by Monte Carlo integration. Maximal discrepancy penalty functions are appealing for pattern classification problems, since their computation is equivalent to empirical risk minimization over the training data with some labels flipped.
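
The maximal-discrepancy penalty can be illustrated on a deliberately tiny finite class. The threshold classifiers and toy data below are our own illustrative choices; the maximum is taken by direct enumeration rather than via the flipped-labels equivalence mentioned above, which matters only when the class is too large to enumerate.

```python
import numpy as np

def max_discrepancy(X, y, thresholds):
    """Maximal discrepancy over a small class of threshold classifiers
    f_t(x) = 1{x >= t}: split the data into two halves and take the
    largest gap between the empirical errors on the two halves."""
    n = len(X) // 2
    X1, y1 = X[:n], y[:n]
    X2, y2 = X[n:2 * n], y[n:2 * n]
    best = 0.0
    for t in thresholds:
        err1 = np.mean((X1 >= t).astype(int) != y1)  # first-half error
        err2 = np.mean((X2 >= t).astype(int) != y2)  # second-half error
        best = max(best, abs(err1 - err2))
    return best

# Toy data: two halves of four points each, with labels differing at
# one point, so no threshold can fit both halves equally well.
X = np.array([0, 1, 2, 3, 0, 1, 2, 3])
y = np.array([0, 0, 1, 1, 0, 1, 1, 1])
penalty = max_discrepancy(X, y, thresholds=[1, 2, 3])  # 0.25
```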

Relevance: 20.00%

Abstract:

Summary points:

- The bias introduced by random measurement error will be different depending on whether the error is in an exposure variable (risk factor) or an outcome variable (disease).
- Random measurement error in an exposure variable will bias the estimates of regression slope coefficients towards the null.
- Random measurement error in an outcome variable will instead increase the standard error of the estimates and widen the corresponding confidence intervals, making results less likely to be statistically significant.
- Increasing sample size will help minimise the impact of measurement error in an outcome variable, but will only make estimates more precisely wrong when the error is in an exposure variable.
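
These points can be checked with a small simulation; the linear model, noise variances, and sample size below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
x = rng.normal(0.0, 1.0, n)                # true exposure
y = 2.0 * x + rng.normal(0.0, 1.0, n)      # true outcome; true slope = 2

def ols_slope(a, b):
    """Slope of the least-squares regression of b on a."""
    return np.cov(a, b, ddof=1)[0, 1] / np.var(a, ddof=1)

# Error in the exposure attenuates the slope towards the null by the
# factor var(x) / (var(x) + var(error)) = 0.5 here:
x_noisy = x + rng.normal(0.0, 1.0, n)
slope_exposure = ols_slope(x_noisy, y)     # about 1.0, not 2.0

# Error in the outcome leaves the slope unbiased, just less precise:
y_noisy = y + rng.normal(0.0, 1.0, n)
slope_outcome = ols_slope(x, y_noisy)      # still about 2.0
```

Rerunning with a larger `n` shrinks the scatter of both estimates, but `slope_exposure` converges to the attenuated value 1.0, not to 2.0: more data makes the exposure-error estimate more precisely wrong, exactly as the last summary point states.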