69 results for Technological parameters
Abstract:
This project aims to identify and analyse the effects of the introduction of the Internet in Catalan schools (primary and secondary education). The objective is to show how the network is used in this field and to what extent it contributes to the emergence, in schools, of a new culture adapted to the needs of the network society. To this end, the project deploys its lines of analysis to examine the process of incorporation of the Internet mainly in three directions: pedagogical practice, the forms of organisation and management of schools, and their links with the community and the territory. This research has been carried out by the ENS (Education and Network Society) research group. From a comparative perspective, the group's work aims to contribute, on the basis of empirical data, to interpreting the transformation of non-university education within the parameters established by today's society.
Abstract:
Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below detection limit (nondetects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, the best results are obtained when imputations are made using the distribution best fitting the readings above the detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
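As a concrete illustration of the transformation this abstract relies on, here is a minimal sketch of the isometric logratio (ilr) transform. The sequential pivot basis shown here is one common choice of orthonormal basis; the paper may use a different one, and the example composition is purely illustrative:

```python
import numpy as np

def ilr(x):
    """Isometric logratio transform of a D-part composition.

    Maps a composition from the simplex to R^(D-1) using the sequential
    (pivot) orthonormal basis: coordinate i compares the geometric mean
    of the first i parts against part i+1.
    """
    x = np.asarray(x, dtype=float)
    D = x.shape[-1]
    z = np.empty(D - 1)
    for i in range(1, D):
        gm = np.exp(np.mean(np.log(x[:i])))   # geometric mean of first i parts
        z[i - 1] = np.sqrt(i / (i + 1)) * np.log(gm / x[i])
    return z

# A 3-part composition (e.g. element concentrations, closed to sum 1):
comp = np.array([0.70, 0.20, 0.10])
print(ilr(comp))  # two real coordinates, safe for ordinary statistics
```

Because ilr coordinates depend only on ratios of parts, they are invariant to the closure constant, which is what lets real-space techniques (here, the bootstrap) be applied legitimately.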
Abstract:
This paper analyses the effects that technological changes in agriculture would have on environmental, social and economic indicators. Specifically, our study focuses on two alternative technological improvements: the modernization of water transportation systems versus an increase in the total factor productivity of agriculture. Using a computable general equilibrium model for the Catalan economy, our results suggest that a water policy that leads to greater economic efficiency is not necessarily optimal if we consider social or environmental criteria. Moreover, improving environmental sustainability depends less on the type of technological change than on the institutional framework in which technological change occurs.
Keywords: agricultural technological changes, computable general equilibrium model, economic impact, water policy
Abstract:
The information and communication technologies (ICT) sectors are in a process of technological convergence. Determinant factors in this process are the liberalisation of the telecommunications markets and technological change. Many firms are engaged in a process of mergers and alliances to position themselves in this new framework. Technological and demand uncertainties are very important. Our objective in this paper is to study the economic determinants of the strategies of the firms. With this aim, we review some key technological and demand aspects. We shed some light on the strategic motivations of the firms by establishing a parallel with the evolution of the retailing sector.
Abstract:
Human arteries affected by atherosclerosis are characterized by altered wall viscoelastic properties. The possibility of noninvasively assessing arterial viscoelasticity in vivo would significantly contribute to the early diagnosis and prevention of this disease. This paper presents a noniterative technique to estimate the viscoelastic parameters of a vascular-wall Zener model. The approach requires the simultaneous measurement of flow variations and wall displacements, which can be provided by suitable ultrasound Doppler instruments. Viscoelastic parameters are estimated by fitting the theoretical constitutive equations to the experimental measurements using an ARMA parameter approach. The accuracy and sensitivity of the proposed method are tested using reference data generated by numerical simulations of arterial pulsation, in which the physiological conditions and the viscoelastic parameters of the model can be suitably varied. The estimated values quantitatively agree with the reference values, showing that the only parameter affected by changing the physiological conditions is viscosity, whose relative error remains about 27% even when a poor signal-to-noise ratio is simulated. Finally, the feasibility of the method is illustrated through three measurements made at different flow regimes on a cylindrical vessel phantom, yielding a mean parameter estimation error of 25%.
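The non-iterative character of such a fit can be illustrated with a toy sketch: the Zener (standard linear solid) constitutive law sigma + a*dsigma/dt = b*eps + c*deps/dt is linear in its parameters, so they can be recovered by a single least-squares solve on discretized signals. This is only a schematic analogue of the ARMA approach described in the abstract, not the paper's procedure; all signals, parameter values and names are illustrative:

```python
import numpy as np

def fit_zener(t, stress, strain):
    """Non-iterative least-squares fit of the Zener model
    stress + a*d(stress)/dt = b*strain + c*d(strain)/dt."""
    dsig = np.gradient(stress, t)
    deps = np.gradient(strain, t)
    # stress = -a*dsig + b*strain + c*deps  ->  solve for (a, b, c)
    A = np.column_stack([-dsig, strain, deps])
    coef, *_ = np.linalg.lstsq(A, stress, rcond=None)
    return coef  # (a, b, c)

# Synthetic data from known parameters, to check recovery:
t = np.linspace(0.0, 1.0, 2000)
strain = 0.01 * np.sin(2 * np.pi * 5 * t)       # 5 Hz pulsation, illustrative
a_true, b_true, c_true = 0.02, 3.0, 0.05
deps = np.gradient(strain, t)
sigma = np.zeros_like(t)
for k in range(1, len(t)):                      # forward-Euler integration
    dt = t[k] - t[k - 1]
    sigma[k] = sigma[k - 1] + dt * (b_true * strain[k - 1]
                                    + c_true * deps[k - 1]
                                    - sigma[k - 1]) / a_true
print(fit_zener(t, sigma, strain))              # approximately (0.02, 3.0, 0.05)
```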
Abstract:
In this work we propose a new automatic methodology for computing accurate digital elevation models (DEMs) in urban environments from low-baseline stereo pairs that shall be available in the future from a new kind of earth observation satellite. This setting makes both views of the scene very similar, thus avoiding occlusions and illumination changes, which are the main disadvantages of the commonly accepted large-baseline configuration. There still remain two crucial technological challenges: (i) precisely estimating DEMs with strong discontinuities and (ii) providing a statistically proven result, automatically. The first is solved here by a piecewise affine representation that is well adapted to man-made landscapes, whereas the application of computational Gestalt theory introduces reliability and automation. In fact, this theory allows us to reduce the number of parameters to be adjusted and to control the number of false detections. This leads to the selection of a suitable segmentation into affine regions (whenever possible) by a novel and completely automatic perceptual grouping method. It also allows us to discriminate, for example, vegetation-dominated regions, where such an affine model does not apply and a more classical correlation technique should be preferred. In addition, we propose an extension of the classical "quantized" Gestalt theory to continuous measurements, thus combining its reliability with the precision of variational robust estimation and fine interpolation methods that are necessary in the low-baseline case. Such an extension is very general and will be useful for many other applications as well.
Abstract:
Over the past two decades, technological progress has been biased towards making skilled labor more productive. The evidence for this finding is based on the persistent parallel increase in the skill premium and the supply of skilled workers. What are the implications of skill-biased technological change for the business cycle? To answer this question, we use the CPS outgoing rotation groups to construct quarterly series for the price and quantity of skill. The unconditional correlation of the skill premium with the cycle is zero. However, using a structural VAR with long run restrictions, we find that technology shocks substantially increase the premium. Investment-specific technology shocks are not skill-biased and our findings suggest that capital and skill are (mildly) substitutable in aggregate production.
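A minimal sketch of identification through long-run restrictions, in the Blanchard-Quah spirit this abstract alludes to, on simulated data. The bivariate VAR(1), its matrices, and the sample are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + u_t, cov(u_t) = Sigma
A = np.array([[0.5, 0.1],
              [0.0, 0.3]])
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
L = np.linalg.cholesky(Sigma)
T = 20_000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + L @ rng.standard_normal(2)

# OLS estimate of A and the reduced-form shock covariance:
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
U = Y - X @ A_hat.T
Sigma_hat = U.T @ U / len(U)

# Long-run identification: choose S with u_t = S * eps_t such that the
# long-run impact matrix F = (I - A)^{-1} S is lower triangular, i.e. the
# second shock has no long-run effect on the first variable.
IminusA = np.eye(2) - A_hat
C = np.linalg.solve(IminusA, np.linalg.solve(IminusA, Sigma_hat.T).T)
F = np.linalg.cholesky(C)   # lower-triangular long-run impact matrix
S = IminusA @ F             # contemporaneous impact of structural shocks
```

By construction S @ S.T reproduces the reduced-form covariance, which is the standard check that the identification is internally consistent.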
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable-bandwidth kernel estimates where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
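For readers unfamiliar with the object of study, here is a minimal sketch of a location-dependent (variable-bandwidth, balloon-type) kernel estimate next to the standard fixed-bandwidth one. The Gaussian kernel and the particular h(x) are illustrative choices, not taken from the paper:

```python
import numpy as np

def kde(x_eval, data, h):
    """Fixed-bandwidth Gaussian kernel density estimate at points x_eval."""
    u = (x_eval[:, None] - data[None, :]) / h
    return np.mean(np.exp(-0.5 * u**2), axis=1) / (h * np.sqrt(2 * np.pi))

def variable_kde(x_eval, data, h_of_x):
    """Balloon-type estimate: the bandwidth h(x) depends on the
    evaluation point x rather than being a single constant."""
    h = h_of_x(x_eval)[:, None]
    u = (x_eval[:, None] - data[None, :]) / h
    return np.mean(np.exp(-0.5 * u**2), axis=1) / (h[:, 0] * np.sqrt(2 * np.pi))

rng = np.random.default_rng(0)
data = rng.standard_normal(1000)
xs = np.linspace(-3, 3, 61)
f_fixed = kde(xs, data, h=0.3)
# e.g. narrower bandwidth near the mode, wider in the tails:
f_var = variable_kde(xs, data, lambda x: 0.2 + 0.1 * np.abs(x))
```

The paper's negative result concerns exactly this extra freedom: however one picks h(x) from the data, some density makes the L1 error arbitrarily worse than the optimum.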
Abstract:
Most methods for small-area estimation are based on composite estimators derived from design- or model-based methods. A composite estimator is a linear combination of a direct and an indirect estimator with weights that usually depend on unknown parameters which need to be estimated. Although model-based small-area estimators are usually based on random-effects models, the assumption of fixed effects is at face value more appropriate. Model-based estimators are justified by the assumption of random (interchangeable) area effects; in practice, however, areas are not interchangeable. In the present paper we empirically assess the quality of several small-area estimators in the setting in which the area effects are treated as fixed. We consider two settings: one that draws samples from a theoretical population, and another that draws samples from an empirical population of a labor force register maintained by the National Institute of Social Security (NISS) of Catalonia. We distinguish two types of composite estimators: a) those that use weights that involve area-specific estimates of bias and variance; and b) those that use weights that involve a common variance and a common squared bias estimate for all the areas. We assess their precision and discuss alternatives to optimizing composite estimation in applications.
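A hedged sketch of the generic composite estimator this abstract describes: a weighted combination of a direct estimator (unbiased but high variance) and an indirect estimator (biased but stable), using the classical weight approximation w = MSE_ind / (MSE_dir + MSE_ind), which ignores any covariance between the two. The numbers are illustrative only:

```python
# Composite small-area estimator: w * direct + (1 - w) * indirect,
# with w chosen so that a noisier direct estimate gets less weight.
def composite(direct, indirect, var_direct, mse_indirect):
    w = mse_indirect / (var_direct + mse_indirect)
    return w * direct + (1.0 - w) * indirect, w

# Illustrative area: noisy direct estimate 12.4 (variance 4.0),
# synthetic indirect estimate 10.8 (estimated MSE 1.0):
est, w = composite(direct=12.4, indirect=10.8, var_direct=4.0, mse_indirect=1.0)
print(round(est, 2), round(w, 2))  # → 11.12 0.2
```

The two estimator types the paper compares differ only in whether var_direct and mse_indirect are estimated per area or pooled across all areas.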
Abstract:
Over the past two decades, technological progress in the United States has been biased towards skilled labor. What does this imply for business cycles? We construct a quarterly skill premium from the CPS and use it to identify skill-biased technology shocks in a VAR with long-run restrictions. Hours fall in response to skill-biased technology shocks, indicating that at least part of the technology-induced fall in total hours is due to a compositional shift in labor demand. Skill-biased technology shocks have no effect on the relative price of investment, suggesting that capital and skill are not complementary in aggregate production.
Abstract:
We use a simulation model to study how the diversification of electricity generation portfolios influences wholesale prices. We find that technological diversification generally leads to lower market prices, but that the relationship is mediated by the supply-to-demand ratio. In each demand case there is a threshold where pivotal dynamics change. Pivotal dynamics pre- and post-threshold are the cause of non-linearities in the influence of diversification on market prices. The findings are robust to our choice of behavioural parameters and match closed-form solutions where these are available.