47 results for growing parameters
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
This work is focused on the study of the fine speckle contrast present in planar view observations of matched and mismatched InGaAs layers grown by molecular beam epitaxy on InP substrates. Our results provide experimental evidence of the evolution of this fine structure with the mismatch, layer thickness, and growth temperature. The correlation of the influence of all these parameters on the appearance of the contrast modulation points to the development of the fine structure during the growth. Moreover, as growth proceeds, this structure shows a dynamic behavior which depends on the intrinsic layer substrate stress.
Abstract:
The speed of virus spread in growing plaques predicted by classical models is greater than that measured experimentally. There is a widespread belief that this discrepancy is due to biological factors. Here we show that the observed speeds can be satisfactorily predicted by a purely physical model that takes into account the delay time due to virus reproduction inside infected cells. No free or adjustable parameters are used.
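A minimal numerical sketch of the mechanism (not the paper's model; D, a, tau, and the grid are illustrative choices): a Fisher-type front whose growth term sees the population a delay tau in the past travels measurably slower than the classical front.

```python
# Minimal sketch: a reproduction delay tau slows a Fisher-type front.
# Illustrates the mechanism only; D, a, tau and the grid are arbitrary.
import numpy as np

def front_speed(tau, D=1.0, a=1.0, L=200.0, nx=2000, T=60.0, dt=0.002):
    dx = L / nx
    x = np.linspace(0.0, L, nx)
    u = np.where(x < 5.0, 1.0, 0.0)            # initial inoculum on the left
    nlag = int(round(tau / dt))
    history = [u.copy()] * (nlag + 1)          # buffer of past profiles
    times, positions, t = [], [], 0.0
    for step in range(int(T / dt)):
        u_lag = history[0]                     # profile at time t - tau
        lap = np.zeros_like(u)
        lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
        u = np.clip(u + dt * (D * lap + a * u_lag * (1.0 - u)), 0.0, 1.0)
        history.pop(0)
        history.append(u.copy())
        t += dt
        if step % 500 == 0 and t > T / 2:      # sample after transients
            times.append(t)
            positions.append(x[np.argmax(u < 0.5)])   # front location
    return np.polyfit(times, positions, 1)[0]  # slope = front speed

print("speed, tau = 0.0:", front_speed(0.0))   # close to 2*sqrt(D*a) = 2
print("speed, tau = 1.5:", front_speed(1.5))   # noticeably slower
```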
Abstract:
We study collusive behaviour in experimental duopolies that compete in prices under dynamic demand conditions. In one treatment the demand grows at a constant rate; in the other it declines at another constant rate. The rates are chosen so that the demand path in one treatment is exactly the time-reverse of the path in the other. We use a box-design demand function so that there are no issues of finding and co-ordinating on the collusive price. Contrary to game-theoretic reasoning, our results show that collusion is significantly greater when the demand shrinks than when it grows. We conjecture that the prospect of rapidly declining profit opportunities exerts a disciplining effect on firms that facilitates collusion and discourages deviation.
Abstract:
This paper analyses the theoretical relevance of the dynamic aspects of growth for the discussion of the observed positive correlation between per capita real income and real exchange rates. To this end, we develop a simple exogenous growth model in which the internal, external, and intertemporal equilibrium conditions of a typical macroeconomic model are imposed, the last of these through the inclusion of a balanced growth path for foreign asset accumulation. The main result is that, under these conditions, the relationship defended by the Balassa-Samuelson hypothesis is no longer so straightforward. In our approach, this bilateral relationship depends on a parameter measuring thriftiness in the economy. Therefore, the probability of ending up with a positive relationship between growth and real exchange rates, as classical economic theory predicts, is higher when the economy is able to maintain a minimum saving ratio. Moreover, given that our model considers a simple Keynesian consumption function, some explosive paths are also possible.
Abstract:
Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below detection limit (nondetects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, best results are obtained when imputations are made using the distribution best fitting the readings above detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
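As a toy illustration of the imputation-plus-bootstrap idea, the sketch below fits a lognormal to readings above a detection limit, imputes the nondetects from the truncated fit, and bootstraps a distributional parameter. Everything here is hypothetical (synthetic data; the ilr transformation and the spatial/stochastic-simulation step are omitted), so it sketches the general approach rather than the paper's workflow.

```python
# Sketch: impute nondetects from the distribution fitted to readings above
# the detection limit, then bootstrap a parameter. Synthetic data; the ilr
# transformation and the spatial component are omitted for brevity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true = rng.lognormal(mean=-2.0, sigma=0.8, size=500)  # synthetic concentrations
dl = np.quantile(true, 0.25)                          # detection limit (25% nondetects)
detected = true[true >= dl]
n_nd = int((true < dl).sum())                         # number of censored readings

# Fit a lognormal to the detected readings (truncation ignored for brevity).
shape, _, scale = stats.lognorm.fit(detected, floc=0)
fitted = stats.lognorm(shape, loc=0, scale=scale)

def impute(k):
    """Draw k values from the fitted law, truncated to lie below dl."""
    u = rng.uniform(0.0, fitted.cdf(dl), size=k)
    return fitted.ppf(u)

boot = []
for _ in range(2000):                                 # bootstrap the geometric mean
    completed = np.concatenate([rng.choice(detected, size=detected.size),
                                impute(n_nd)])
    boot.append(stats.gmean(completed))
ci_lo, ci_hi = np.quantile(boot, [0.025, 0.975])
print(f"geometric mean: {np.mean(boot):.4f}  (95% CI {ci_lo:.4f}-{ci_hi:.4f})")
print(f"true geometric mean: {stats.gmean(true):.4f}")
```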
Abstract:
Melanins are a heterogeneous group of polymers produced by enzymatic reactions in plant tissues that contain phenolic or polyphenolic compounds. Recent studies have uncovered some beneficial health properties of melanins, such as antioxidant, anti-inflammatory, immunological, and anti-tumour properties. Thus, not only should their removal be examined, but their addition to newly created functional foods could also be considered. Accordingly, the kinetic mechanism of melanogenesis must be known before any possible industrial use. A kinetic model has been developed to explain melanin formation from L-tyrosine using polyphenol oxidase from Agaricus bisporus, monitoring the absorbance of the solution. This expression describes melanin formation as a function of reaction time and yields some important parameters that define the product, such as the extinction coefficient. The absorbance begins to grow after a lag period during which colourless intermediate products are formed. The extinction coefficient of the resulting products is not a constant value, because it depends on the conditions of each experiment. Tyrosinase had a weaker catalytic effect on L-tyrosine (the first reaction it catalyses) than on L-DOPA (the second reaction).
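A minimal kinetic sketch of the lag phase described above, assuming a two-step series mechanism (substrate to colourless intermediate to melanin) with hypothetical rate constants; this illustrates how such a lag arises, and is not the paper's fitted model.

```python
# Sketch: a two-step series mechanism (substrate -> colourless intermediate
# -> melanin) reproduces the lag phase seen in the absorbance trace.
# Rate constants and extinction coefficient are illustrative, not fitted.
import numpy as np
from scipy.integrate import solve_ivp

k1, k2, eps = 0.05, 0.01, 3.0e3   # 1/min, 1/min, M^-1 cm^-1 (hypothetical)

def rhs(t, y):
    s, i, m = y                    # substrate, intermediate, melanin (M)
    return [-k1 * s, k1 * s - k2 * i, k2 * i]

sol = solve_ivp(rhs, (0.0, 300.0), [1e-3, 0.0, 0.0], dense_output=True)
t = np.linspace(0.0, 300.0, 7)
absorbance = eps * sol.sol(t)[2]   # Beer-Lambert, 1 cm path length
for ti, a in zip(t, absorbance):
    print(f"t = {ti:5.0f} min   A = {a:.3f}")
# A(t) stays near zero at first (the lag) while the colourless intermediate
# accumulates, then rises as the intermediate is converted to melanin.
```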
Abstract:
The emergence of uncorrelated growing networks is proved when nodes are removed either uniformly or under the preferential survival rule recently observed in the evolution of the World Wide Web. To this end, the rate equation for the joint probability of degrees is derived, and stationary symmetrical solutions are obtained by passing to the continuum limit. When uniformly random removal of extant nodes and linear preferential attachment of new nodes are at work, we prove that the only stationary solution corresponds to uncorrelated networks for any removal rate r ∈ (0,1). In the more general case of preferential survival of nodes, uncorrelated solutions are also obtained. These results generalize the uncorrelatedness displayed by the (undirected) Barabási-Albert network model to models with uniformly random and selective (against low degrees) removal of nodes.
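A quick simulation consistent with this result (a sketch with illustrative parameters, not the paper's analytical derivation): grow a network by linear preferential attachment while removing extant nodes uniformly at random, then check that the degree assortativity is close to zero, the signature of an uncorrelated network.

```python
# Sketch: preferential-attachment growth with uniform node removal at rate r;
# degree assortativity near zero indicates an uncorrelated network.
# Parameters (m, r, n_steps) are illustrative.
import random
import networkx as nx

def grow_with_removal(n_steps=20000, m=2, r=0.3, seed=1):
    random.seed(seed)
    g = nx.complete_graph(m + 1)
    next_id = m + 1
    for _ in range(n_steps):
        if (random.random() < r and g.number_of_nodes() > m + 1
                and g.number_of_edges() > m):
            g.remove_node(random.choice(list(g.nodes)))   # uniform removal
        else:
            nodes, degs = zip(*g.degree())                # pick targets ∝ degree
            targets = set()
            while len(targets) < m:
                targets.add(random.choices(nodes, weights=degs, k=1)[0])
            g.add_node(next_id)
            g.add_edges_from((next_id, t) for t in targets)
            next_id += 1
    return g

g = grow_with_removal()
print("nodes:", g.number_of_nodes())
print("degree assortativity: %.3f" % nx.degree_assortativity_coefficient(g))
```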
Abstract:
We present a continuum formalism for modeling growing random networks under addition and deletion of nodes, based on a differential mass-balance equation. As examples of its applicability, we obtain new results on the degree distribution of growing networks with uniform attachment and deletion of nodes, and complete some recent results on growing networks with preferential attachment and uniform removal.
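Schematically, a balance of this kind can be written as below; this is a generic sketch of the continuum approach, not necessarily the paper's exact equation (the symbols b, k_0, and r are illustrative).

```latex
\frac{\partial n(k,t)}{\partial t}
  = -\frac{\partial}{\partial k}\bigl[w(k,t)\,n(k,t)\bigr]
    + b\,\delta(k - k_0)
    - r\,\frac{n(k,t)}{N(t)}
```

Here n(k,t)dk counts nodes with degree in [k, k+dk], w(k,t) is the mean degree-growth rate of a node of degree k (encoding the attachment rule and the degree losses caused by neighbours' deletion), the delta term injects new nodes with k_0 initial links, and the last term removes nodes uniformly at rate r.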
Abstract:
The view of the 1870-1913 expanding European economy as providing increasing welfare to everybody has been challenged by many, then and now. We focus on the amazing growth that was experienced, its diffusion and its sources, in the context of the permanent competition among European nation states. During 1870-1913 the globalized European economy reached a "silver age". GDP growth was quite rapid (2.15% per annum) and diffused all over Europe. Even discounting the high rate of population growth (1.06%), per capita growth was left at a respectable 1.08%. Income per capita was rising in every country, and the rates of improvement were quite similar. This was a major achievement after two generations of highly localized growth, both geographically and socially. Growth was based on the increased use of labour and capital, but a good part of it (73 per cent for the weighted average of the best-documented European countries) came out of total factor productivity: efficiency gains resulting from not-well-specified ultimate sources of growth. This proportion suggests that the European economy was growing at full capacity, at its production frontier; it would have been very difficult to improve its performance. Within Europe, convergence was limited, and it was only in motion after 1900. What happened was more the end of the era of big divergence than an era of convergence.
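For the record, the per capita figure follows from compounding the two rates rather than simple subtraction:

```latex
g_{pc} = \frac{1 + g_{GDP}}{1 + g_{pop}} - 1
       = \frac{1.0215}{1.0106} - 1 \approx 0.0108 = 1.08\% \text{ per annum}
```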
Abstract:
In many areas of economics there is a growing interest in how expertise and preferences drive individual and group decision making under uncertainty. Increasingly, we wish to estimate such models to quantify which of these drive decision making. In this paper we propose a new channel through which we can empirically identify expertise and preference parameters by using variation in decisions over heterogeneous priors. Relative to existing estimation approaches, our "Prior-Based Identification" extends the possible environments which can be estimated, and also substantially improves the accuracy and precision of estimates in those environments which can be estimated using existing methods.
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
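To make the objects concrete, here is a sketch comparing the L1 error of a fixed-bandwidth Gaussian estimate with a simple nearest-neighbour ("balloon") variable-bandwidth variant on a synthetic two-component density; the counterexample densities constructed in the paper are, of course, far more delicate.

```python
# Sketch: L1 error of a fixed-bandwidth Gaussian KDE vs a nearest-neighbour
# ("balloon") variable-bandwidth variant on a synthetic density.
# h and k are illustrative choices; this is not the paper's counterexample.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 500
x = np.concatenate([rng.normal(0.0, 1.0, n // 2), rng.normal(6.0, 0.3, n // 2)])
grid = np.linspace(-5.0, 9.0, 2000)
true = 0.5 * stats.norm.pdf(grid, 0.0, 1.0) + 0.5 * stats.norm.pdf(grid, 6.0, 0.3)

def kde_fixed(h):
    z = (grid[:, None] - x[None, :]) / h
    return stats.norm.pdf(z).sum(axis=1) / (n * h)

def kde_balloon(k=30):
    d = np.abs(grid[:, None] - x[None, :])
    h = np.maximum(np.sort(d, axis=1)[:, k], 1e-3)   # local bandwidth h(x):
    return stats.norm.pdf(d / h[:, None]).sum(axis=1) / (n * h)  # k-th NN distance

def l1(f):                                           # integral of |f - true|
    return np.abs(f - true).sum() * (grid[1] - grid[0])

print("L1 error, fixed h = 0.3  :", l1(kde_fixed(0.3)))
print("L1 error, balloon, k = 30:", l1(kde_balloon()))
```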
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: a) those that use weights that involve area-specific estimates of bias and variance; and b) those that use weights that involve a common variance and a common squared-bias estimate for all the areas. We assess their precision and discuss alternatives to optimizing composite estimation in applications.
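A bare-bones sketch of a type-a) composite estimator on synthetic fixed-effect areas (all numbers hypothetical; the NISS register setting is not reproduced): the weight is built from area-specific variance and squared-bias estimates.

```python
# Sketch of a composite small-area estimator: a weighted combination of a
# direct estimator (area sample mean) and an indirect, synthetic estimator
# (overall mean), weighted by estimated variance and squared bias.
# Synthetic fixed-effect areas; purely illustrative.
import numpy as np

rng = np.random.default_rng(42)
area_means = np.array([10.0, 11.5, 9.0, 14.0, 10.5])   # fixed area effects
n_a = np.array([8, 15, 5, 30, 12])                      # area sample sizes

samples = [rng.normal(mu, 4.0, n) for mu, n in zip(area_means, n_a)]
direct = np.array([s.mean() for s in samples])          # direct estimator
indirect = np.concatenate(samples).mean()               # synthetic estimator

var_direct = np.array([s.var(ddof=1) / len(s) for s in samples])
bias2 = (direct - indirect) ** 2                        # crude squared-bias estimate

# Weight minimizing the estimated MSE of the linear combination:
w = bias2 / (bias2 + var_direct)
composite = w * direct + (1 - w) * indirect

for a in range(len(area_means)):
    print(f"area {a}: direct={direct[a]:6.2f}  composite={composite[a]:6.2f}  "
          f"w={w[a]:.2f}  truth={area_means[a]:.1f}")
```

Small areas (low n_a, hence high direct-estimator variance) get pulled toward the overall mean, while well-sampled areas keep weights near their direct estimates.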
Abstract:
Many dynamic revenue management models divide the sale period into a finite number of periods T and assume, invoking a fine-enough grid of time, that each period sees at most one booking request. These Poisson-type assumptions restrict the variability of the demand in the model, but researchers and practitioners have been willing to overlook this for the benefit of tractability. In this paper, we criticize this model from another angle. Estimating the discrete finite-period model poses problems of indeterminacy and non-robustness: arbitrarily fixing T leads to arbitrary control values, while estimating T from data adds a further layer of indeterminacy. To counter this, we first propose an alternative finite-population model that avoids the problem of fixing T and allows a wider range of demand distributions, while retaining the useful marginal-value properties of the finite-period model. The finite-population model still requires jointly estimating the market size and the parameters of the customer purchase model without observing no-purchases. Estimation of market size when no-purchases are unobservable has rarely been attempted in the marketing or revenue management literature. Indeed, we point out that it is akin to the classical statistical problem of estimating the parameters of a binomial distribution with unknown population size and success probability, and hence likely to be challenging. However, when the purchase probabilities are given by a functional form such as a multinomial-logit model, we propose an estimation heuristic that exploits the specification of the functional form, the variety of the offer sets in a typical RM setting, and qualitative knowledge of arrival rates. Finally, we perform simulations to show that the estimator is very promising in obtaining unbiased estimates of population size and the model parameters.
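The classical problem mentioned above can be made concrete with a small profile-likelihood sketch for a binomial with unknown N and p (synthetic data; this is not the paper's MNL-based heuristic).

```python
# Sketch of the classical problem the abstract points to: MLE for a binomial
# with unknown population size N and success probability p, via a profile
# likelihood over N. Illustrates why the problem is hard (the profile is
# typically very flat in N); it is not the paper's MNL-based heuristic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
N_true, p_true = 120, 0.15
x = rng.binomial(N_true, p_true, size=25)      # observed purchase counts

def profile_loglik(N):
    p_hat = min(x.mean() / N, 1.0)             # MLE of p for this N
    return stats.binom.logpmf(x, N, p_hat).sum()

grid = np.arange(x.max(), 1000)
ll = np.array([profile_loglik(N) for N in grid])
N_hat = grid[ll.argmax()]
print("N_hat =", N_hat, " p_hat = %.3f" % (x.mean() / N_hat))
# The log-likelihood barely changes for large N (N * p stays nearly
# constant), which is the indeterminacy discussed in the abstract.
```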