60 results for SPATIALLY-RESOLVED CATHODOLUMINESCENCE
Abstract:
Empirical studies on the determinants of industrial location typically use variables measured at the available administrative level (municipalities, counties, etc.). However, this amounts to assuming that the effects these determinants may have on the location process do not extend beyond the geographical limits of the selected site. We address the validity of this assumption by comparing results from standard count data models with those obtained by calculating the geographical scope of the spatially varying explanatory variables using a wide range of distances and alternative spatial autocorrelation measures. Our results reject the usual practice of using administrative records as covariates without making some kind of spatial correction.
Keywords: industrial location, count data models, spatial statistics
JEL classification: C25, C52, R11, R30
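The spatial correction described above (letting a covariate's influence reach beyond the administrative unit in which it is recorded) can be illustrated with a simple distance-decay smoothing of the covariate. The sketch below is illustrative only: the 1/d weighting, the cutoff distance and the variable names are assumptions, not the specification estimated in the paper.

    import numpy as np

    def spatially_smoothed_covariate(x, coords, cutoff_km, decay=1.0):
        """Replace each municipality's value x[i] by a distance-weighted
        average of x over all municipalities within cutoff_km.
        coords holds projected (easting, northing) pairs in km."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        smoothed = np.empty(n)
        for i in range(n):
            d = np.hypot(coords[:, 0] - coords[i, 0], coords[:, 1] - coords[i, 1])
            w = np.zeros(n)
            mask = (d > 0) & (d <= cutoff_km)
            w[mask] = 1.0 / d[mask] ** decay
            w[i] = 1.0  # own-municipality value keeps full weight
            smoothed[i] = np.sum(w * x) / np.sum(w)
        return smoothed

The count-data regression can then be re-estimated with the smoothed covariate for a grid of cutoff distances and compared with the uncorrected specification, in the spirit of the exercise described in the abstract.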
Abstract:
Weak solutions of the spatially inhomogeneous (diffusive) Aizenman-Bak model of coagulation-breakup within a bounded domain with homogeneous Neumann boundary conditions are shown to converge, in the fast reaction limit, towards local equilibria determined by their mass. Moreover, this mass is the solution of a nonlinear diffusion equation whose nonlinearity depends on the (size-dependent) diffusion coefficient. Initial data are assumed to have integrable zero order moment and square integrable first order moment in size, and finite entropy. In contrast to our previous result [CDF2], we are able to show the convergence without assuming uniform bounds from above and below on the number density of clusters.
Abstract:
We re-examine the literature on mobile termination in the presence of network externalities. Externalities arise when firms discriminate between on- and off-net calls or when subscription demand is elastic. This literature predicts that profit decreases and consumer surplus increases in the termination charge in a neighborhood of the termination cost. This creates a puzzle, since in reality we see regulators worldwide pushing termination rates down while being opposed by network operators. We show that this puzzle is resolved when consumers' expectations are assumed passive but required to be fulfilled in equilibrium (as defined by Katz and Shapiro, AER 1985), instead of being rationally responsive to non-equilibrium prices, as assumed until now.
Abstract:
High-performance computing is currently used in a multitude of scientific fields, where the various problems under study are solved by means of parallel/distributed applications. These applications require great computing power, whether because of the complexity of the problems or because of the need to handle real-time situations. The resources and high computational capacity of the parallel systems on which these applications run must therefore be exploited in order to obtain good performance. However, achieving this performance for an application running on a given system is a hard task that requires a high degree of expertise, especially for applications with dynamic behaviour or when heterogeneous systems are used. In these cases, automatic and dynamic performance improvement of the applications is currently regarded as the best approach to performance analysis. This research work falls within this field of study, and its main objective is to dynamically tune, by means of MATE (Monitoring, Analysis and Tuning Environment), an MPI application used in high-performance computing that follows a Master/Worker paradigm. The tuning techniques integrated in MATE were developed from the study of a performance model that reflects the bottlenecks characteristic of Master/Worker applications: load balancing and the number of workers. Running the chosen application under the dynamic control of MATE and of the implemented tuning strategy made it possible to observe how the application's behaviour adapted to the current conditions of the system on which it runs, thereby improving its performance.
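As a rough illustration of the kind of decision such a tuning strategy makes, the sketch below picks a worker count that balances computation against per-worker overhead. It is a deliberately simplified, assumed cost model for exposition only; it is not MATE's actual performance model, API, or tuning logic.

    def suggest_num_workers(total_work_time, per_worker_overhead, max_workers):
        """Pick the worker count n minimising an assumed cost model
        T(n) = total_work_time / n + per_worker_overhead * n, i.e. perfectly
        divisible work plus a linear communication/management overhead.
        Both inputs would come from runtime monitoring of the application."""
        best_n, best_time = 1, float("inf")
        for n in range(1, max_workers + 1):
            t = total_work_time / n + per_worker_overhead * n
            if t < best_time:
                best_n, best_time = n, t
        return best_n

Re-evaluating this choice periodically with freshly measured times is one simple way to let the worker count track changing system conditions, as described in the abstract.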
Abstract:
Concerns about the clustering of retail industries and professional services in main streets have traditionally been the public-interest rationale for supporting distance regulations. Although many geographic restrictions have been removed, deregulation has hinged mostly on theoretical results about the natural tendency of outlets to differentiate spatially. Empirical evidence has so far offered mixed results. Using the case of the deregulation of pharmacy establishment in a region of Spain, we show empirically how pharmacy locations scatter, and that there is no rationale for distance regulation apart from the underlying private interest of a very few incumbents.
Gaussian estimates for the density of the non-linear stochastic heat equation in any space dimension
Abstract:
In this paper, we establish lower and upper Gaussian bounds for the probability density of the mild solution to the stochastic heat equation with multiplicative noise and in any space dimension. The driving perturbation is a Gaussian noise which is white in time with some spatially homogeneous covariance. These estimates are obtained using tools of the Malliavin calculus. The most challenging part is the lower bound, which is obtained by adapting a general method developed by Kohatsu-Higa to the underlying spatially homogeneous Gaussian setting. Both lower and upper estimates have the same form: a Gaussian density with a variance which is equal to that of the mild solution of the corresponding linear equation with additive noise.
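For orientation, the equation in question has the following general form (standard notation, not a quotation from the paper; the precise hypotheses on the coefficient and on the noise are those stated there):

\[
\frac{\partial u}{\partial t}(t,x) \;=\; \Delta u(t,x) \;+\; \sigma\bigl(u(t,x)\bigr)\,\dot W(t,x),
\qquad t\in(0,T],\ x\in\mathbb{R}^d,
\]

where the noise \(\dot W\) is white in time and spatially homogeneous in space. Writing \(\Phi_t\) for the variance of the mild solution of the corresponding linear equation with additive noise (\(\sigma\equiv 1\)), the bounds described in the abstract are, schematically, of the Gaussian shape

\[
\frac{C_1}{\sqrt{\Phi_t}}\,\exp\!\Bigl(-\frac{|y-m|^{2}}{C_2\,\Phi_t}\Bigr)
\;\le\; p_{t,x}(y) \;\le\;
\frac{C_3}{\sqrt{\Phi_t}}\,\exp\!\Bigl(-\frac{|y-m|^{2}}{C_4\,\Phi_t}\Bigr),
\]

with constants and centring \(m\) as specified in the paper.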
Abstract:
This article analyses how agglomeration economies shaped the location decisions of new manufacturing start-ups in Catalan municipalities in 2001-2005. We estimate whether the locations of new firms are spatially autocorrelated and whether this phenomenon is industry-specific. Our aim is to estimate the geographical scope of agglomeration economies on firm entries. The data set comes from a compulsory register of manufacturing establishments (REIC: Catalan Manufacturing Establishments Register).
JEL classification: R1, R3
Keywords: firm location; spatial autocorrelation
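The abstract does not name the autocorrelation statistic used; as one common illustration, Moran's I for counts of firm entries across municipalities can be computed as in the following sketch (the variable names and the row-standardised weight matrix are assumptions):

    import numpy as np

    def morans_i(y, W):
        """Moran's I for a vector y (e.g. firm entries per municipality) and a
        spatial weight matrix W (n x n, zero diagonal, typically row-standardised)."""
        z = np.asarray(y, dtype=float) - np.mean(y)
        n = len(z)
        return (n / W.sum()) * (z @ W @ z) / (z @ z)

Values near zero suggest spatial randomness; positive values indicate that municipalities with many entries tend to neighbour other municipalities with many entries.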
Abstract:
This paper tries to resolve some of the main shortcomings in the empirical literature on location decisions for new plants, i.e. spatial effects and overdispersion. Spatial effects are omnipresent, being a source of overdispersion in the data as well as a factor shaping the functional relationship between the variables that explain a firm's location decisions. Using count data models, empirical researchers have dealt with overdispersion and excess zeros through extensions of the Poisson regression model. This study aims to take this a step further by adopting Bayesian methods and models in order to tackle the excess of zeros, spatial and non-spatial overdispersion, and spatial dependence simultaneously. Data for Catalonia are used and location determinants are analysed to that end. The results show that spatial effects are decisive. Additionally, overdispersion is decomposed into an unstructured iid effect and a spatially structured effect.
Keywords: Bayesian Analysis, Spatial Models, Firm Location
JEL Classification: C11, C21, R30
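A specification of the kind described here, a count model whose overdispersion is split into an unstructured and a spatially structured component, can be written schematically as follows. This is a generic BYM-type formulation given only for illustration, not necessarily the exact model estimated in the paper:

\[
y_i \sim \mathrm{Poisson}(\mu_i), \qquad
\log \mu_i = \mathbf{x}_i^{\top}\boldsymbol{\beta} + v_i + u_i,
\]

with \(v_i \sim \mathcal{N}(0,\sigma_v^2)\) i.i.d. (unstructured, non-spatial overdispersion) and \(u_i\) given a spatially structured prior such as the intrinsic CAR prior

\[
u_i \mid u_{-i} \sim \mathcal{N}\!\Bigl(\frac{1}{n_i}\sum_{j\sim i} u_j,\ \frac{\sigma_u^2}{n_i}\Bigr),
\]

where \(j\sim i\) runs over the neighbours of area \(i\) and \(n_i\) is their number; a zero-inflation component can be layered on top to absorb the excess zeros mentioned in the abstract.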
Abstract:
Climate change in the 21st century is a reality; there is abundant scientific evidence indicating that the warming of the climate system is unequivocal. Nevertheless, there are also many uncertainties about the impacts that this global climate change may bring. The objective of this project is to study the possible future evolution of three climate variables, namely the near-surface diurnal temperature range (DTR), the near-surface mean temperature (MT) and monthly precipitation (PL_mes), and to assess the exposure that different land covers and different biogeographical regions of the European continent may experience under these possible patterns of change. To this end, Global Climate Models, which project climate variables and thus allow the possible future climate to be anticipated, have been used. Using the Tetyn software application, the climate parameters were extracted from the data sets of the Tyndall Centre for Climate Change Research for the future (TYN SC) and for the past (CRU TS). The variables obtained were processed with geographic information system (GIS) tools to obtain the patterns of change of the variables for each land cover. The results show that there is great variability among the different climate models and scenarios considered, and that it increases over time; this highlights the uncertainty associated with climate modelling, with the generation of emission scenarios and with the dynamic, non-deterministic nature of the climate system. In general, however, they show that glaciers will be one of the land covers most exposed to climate change, and the Mediterranean one of the most vulnerable regions.
Abstract:
Report for the scientific sojourn at UC Berkeley, USA, from March until July 2008. This document starts by surveying the literature on economic federalism and relating it to network industries. The insights and some new developments (which focus on the role of interjurisdictional externalities, multiple objectives and investment incentives) are used to analyze regulatory arrangements in telecommunications and energy in the EU and the US. In the long history of vertically integrated monopolies in telecommunications and energy, there was a historical trend to move regulation up in the vertical structure of government, at least from the local level to the state or nation-state level. This move alleviated the pressure on regulators to renege on the commitment not to expropriate sunk investments, although it did not eliminate the practice of taxation by regulation that was the result of multiple interest group action. Although central or federal policy making is more focused and specialized and makes it difficult for more interest groups to organize, it is not clear that under all conditions central powers will not be associated with underinvestment. When technology makes the introduction of competition in some segments possible, the possibilities for organizing the institutional architecture of regulation expand. The central level may focus on structural regulation, and the location of behavioral regulation of the remaining monopolists may be resolved in a cooperative way or concentrated at the level where the relevant spillovers are internalized.
Abstract:
This article provides a theoretical and empirical analysis of a firm's optimal R&D strategy choice. In this paper a firm's R&D strategy is assumed to be endogenous and allowed to depend on both internal firm characteristics and external factors. Firms choose between two strategies: either they engage in R&D, or they abstain from own R&D and imitate the outcomes of innovators. In the theoretical model this yields three types of equilibria, in which either all firms innovate, some firms innovate and others imitate, or no firm innovates. Firms' equilibrium strategies crucially depend on external factors. We find that the efficiency of intellectual property rights protection positively affects firms' incentives to engage in R&D, while competitive pressure has a negative effect. In addition, smaller firms are found to be more likely to become imitators when the product is homogeneous and the level of spillovers is high. These results are supported by empirical evidence for German firms from the manufacturing and services sectors. Regarding social welfare, our results indicate that strengthening intellectual property protection can have an ambiguous effect. In markets characterized by a high rate of innovation, a reduction of intellectual property rights protection can discourage innovative performance substantially. However, a reduction of patent protection can also increase social welfare because it may induce imitation. This indicates that policy issues such as the optimal length and breadth of patent protection cannot be resolved without taking into account specific market and firm characteristics.
Journal of Economic Literature Classification Numbers: C35, D43, L13, L22, O31
Keywords: Innovation; imitation; spillovers; product differentiation; market competition; intellectual property rights protection
Abstract:
In the logistics of Vall Companys Grup, a fundamental problem to solve is the geographical location of all the elements involved in the logistics, such as factories, farms, trucks and warehouses, and the study of their interrelations in order to make optimal use of the available resources. To solve this problem, the appropriate tool would be a Geographic Information System (GIS).
Abstract:
Machiavelli is a very old and popular game from the company Avalon Hill that has not yet been implemented as a computer game. It is well suited as a multiplayer game over the Internet, since it is a turn-based game in which players write their orders and all of them are resolved at once. To implement this idea, the most suitable platform for the game is a browser plus a server. This allows playing from anywhere (all that is needed is an Internet connection and a web browser).
Abstract:
Emergent molecular measurement methods, such as DNA microarray, qRT-PCR, and many others, offer tremendous promise for the personalized treatment of cancer. These technologies measure the amount of specific proteins, RNA, DNA or other molecular targets from tumor specimens with the goal of "fingerprinting" individual cancers. Tumor specimens are heterogeneous; an individual specimen typically contains unknown amounts of multiple tissue types. Thus, the measured molecular concentrations result from an unknown mixture of tissue types, and must be normalized to account for the composition of the mixture.

For example, a breast tumor biopsy may contain normal, dysplastic and cancerous epithelial cells, as well as stromal components (fatty and connective tissue) and blood and lymphatic vessels. Our diagnostic interest focuses solely on the dysplastic and cancerous epithelial cells. The remaining tissue components serve to "contaminate" the signal of interest. The proportion of each of the tissue components changes as a function of patient characteristics (e.g., age), and varies spatially across the tumor region. Because each of the tissue components produces a different molecular signature, and the amount of each tissue type is specimen dependent, we must estimate the tissue composition of the specimen and adjust the molecular signal for this composition.

Using the idea of a chemical mass balance, we consider the total measured concentrations to be a weighted sum of the individual tissue signatures, where the weights are determined by the relative amounts of the different tissue types. We develop a compositional source apportionment model to estimate the relative amounts of tissue components in a tumor specimen. We then use these estimates to infer the tissue-specific concentrations of key molecular targets for sub-typing individual tumors. We anticipate these specific measurements will greatly improve our ability to discriminate between different classes of tumors, and allow more precise matching of each patient to the appropriate treatment.
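The chemical mass-balance idea, in which measured totals are a weighted sum of tissue signatures, can be illustrated with a small non-negative least-squares sketch. The signature matrix, the NNLS estimator and the function names below are illustrative assumptions, not the authors' actual apportionment model.

    import numpy as np
    from scipy.optimize import nnls

    def estimate_tissue_weights(measured, signatures):
        """measured: length-p vector of total concentrations for p molecular targets.
        signatures: p x k matrix whose column j is the signature of tissue type j.
        Returns non-negative mixing weights normalised to sum to one."""
        w, _ = nnls(signatures, measured)
        return w / w.sum() if w.sum() > 0 else w

    def target_tissue_signal(measured, signatures, weights, target):
        """Subtract the estimated contribution of the non-target tissues to
        approximate the signal of the tissue of diagnostic interest."""
        others = np.delete(np.arange(signatures.shape[1]), target)
        contamination = signatures[:, others] @ weights[others]
        return measured - contamination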
Abstract:
Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capabilities of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below the detection limit (non-detects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large source of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, the best results are obtained when imputations are made using the distribution that best fits the readings above the detection limit, and it exposes the problems of other, more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
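For reference, the isometric logratio (ilr) transformation mentioned above maps a D-part composition from the simplex to (D-1)-dimensional real space. One standard way to write it (the particular orthonormal basis is a free choice) is:

\[
\operatorname{clr}(\mathbf{x}) = \Bigl(\ln\frac{x_1}{g(\mathbf{x})},\ \ldots,\ \ln\frac{x_D}{g(\mathbf{x})}\Bigr)^{\!\top},
\qquad
\operatorname{ilr}(\mathbf{x}) = V^{\top}\operatorname{clr}(\mathbf{x}),
\]

where \(g(\mathbf{x})\) is the geometric mean of the D parts and \(V\) is a \(D\times(D-1)\) matrix whose orthonormal columns span the subspace \(\{\mathbf{z}\in\mathbb{R}^{D} : \sum_i z_i = 0\}\). Since the logratios require strictly positive parts, values below the detection limit must be imputed in the original concentrations before transforming, which is why the choice of imputation strategy discussed above matters.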