16 results for Complex polymerization method
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The studies of Giacomo Becattini on the notion of the "Marshallian industrial district" have led to a revolution in the field of economic development around the world. This paper offers an interpretation of the methodology adopted by Becattini, whose roots are clearly Marshallian: Becattini proposes a return to economics as a complex social science that operates in historical time. We adopt a Schumpeterian approach to method in economic analysis in order to highlight the similarities between Marshall's and Becattini's approaches. Finally, the paper uses the distinction between logical time, real time and historical time, which enables us to study the "localized" economic process in a Becattinian way.
Abstract:
Realistic rendering of animations is known to be an expensive task when physically based global illumination methods are used to improve illumination detail. This paper presents an acceleration technique for computing animations in radiosity environments. The technique is based on an interpolation approach that exploits temporal coherence in radiosity. A fast global Monte Carlo pre-processing step is introduced into the computation of the animated sequence to select important frames; these are fully computed and used as a basis for interpolating the whole sequence. The approach is completely view-independent: once the illumination is computed, it can be visualized by any animated camera. Results show significant speed-ups, indicating that the technique is an interesting alternative to deterministic methods for computing non-interactive radiosity animations of moderately complex scenes.
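The key-frame interpolation idea can be sketched in a few lines. This is a simplified illustration, not the paper's implementation: `interpolate_radiosity` is a hypothetical name, and real key frames would be per-patch radiosity vectors produced by the global illumination pass rather than hand-written lists.

```python
def interpolate_radiosity(key_frames, key_times, t):
    """Linearly interpolate per-patch radiosity at time t between two
    fully computed key frames (a toy stand-in for the paper's
    temporal-coherence interpolation)."""
    for i in range(len(key_times) - 1):
        t0, t1 = key_times[i], key_times[i + 1]
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            b0, b1 = key_frames[i], key_frames[i + 1]
            return [(1 - w) * a + w * b for a, b in zip(b0, b1)]
    raise ValueError("t outside key-frame range")

# Two key frames of per-patch radiosity, interpolated halfway between them.
frames = [[1.0, 0.0, 2.0], [3.0, 4.0, 2.0]]
mid = interpolate_radiosity(frames, [0.0, 1.0], 0.5)
```

Because the interpolated frames reuse the fully computed key frames, only a small fraction of the sequence requires a complete radiosity solution.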
Abstract:
We present a method to compute, quickly and efficiently, the mutual information achieved by an IID (independent identically distributed) complex Gaussian signal on a block Rayleigh-faded channel without side information at the receiver. The method accommodates both scalar and MIMO (multiple-input multiple-output) settings. Operationally, this mutual information represents the highest spectral efficiency that can be attained using Gaussian codebooks. Examples are provided that illustrate the loss in spectral efficiency caused by fast fading and how that loss is amplified when multiple transmit antennas are used. These examples are further enriched by comparisons with the channel capacity under perfect channel-state information at the receiver, and with the spectral efficiency attained by pilot-based transmission.
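As a point of reference, the perfect-CSI benchmark mentioned in the comparisons can be estimated by simple Monte Carlo. This sketches only that benchmark (the ergodic capacity of a scalar Rayleigh channel with receiver CSI), not the paper's no-side-information mutual-information method; the sample count and seed are arbitrary choices.

```python
import math
import random

def rayleigh_capacity_csi(snr_db, n=200000, seed=1):
    """Monte Carlo estimate of the ergodic capacity (bits/s/Hz) of a
    scalar Rayleigh-faded channel with perfect receiver CSI.
    The channel power gain |h|^2 is exponentially distributed."""
    random.seed(seed)
    snr = 10.0 ** (snr_db / 10.0)
    total = 0.0
    for _ in range(n):
        gain = random.expovariate(1.0)      # |h|^2 ~ Exp(1) under Rayleigh fading
        total += math.log2(1.0 + snr * gain)
    return total / n

cap = rayleigh_capacity_csi(10.0)   # at 10 dB SNR, close to 2.9 bits/s/Hz
```

The gap between this curve and the no-CSI mutual information is exactly the loss the paper's examples quantify.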
Abstract:
In this paper a method for extracting semantic information from online music discussion forums is proposed. The semantic relations are inferred from the co-occurrence of musical concepts in forum posts, using network analysis. The method starts by defining a dictionary of common music terms in an art music tradition. It then creates a complex network representation of the online forum by matching this dictionary against the forum posts. Once the complex network is built, we can study different network measures, including node relevance, node co-occurrence and term relations via semantically connecting words. Moreover, we can detect communities of concepts inside the forum posts; the rationale is that some music terms are more related to each other than to other terms. All in all, this methodology allows us to obtain meaningful and relevant information from forum discussions.
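The dictionary-matching and co-occurrence step can be sketched as follows. This is a minimal illustration with whitespace tokenization and invented example posts; the actual method works on full forum threads and applies network analysis (relevance measures, community detection) on top of the resulting graph.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_network(posts, dictionary):
    """Build a weighted co-occurrence graph: dictionary terms become
    nodes, and an edge weight counts the posts in which two terms
    appear together (a simplification of the paper's construction)."""
    edges = Counter()
    freq = Counter()
    for post in posts:
        terms = sorted({w for w in post.lower().split() if w in dictionary})
        for a, b in combinations(terms, 2):   # pairs in sorted order
            edges[(a, b)] += 1
        for t in terms:
            freq[t] += 1
    return edges, freq

# Invented toy posts and a three-term dictionary.
posts = ["the raga was in teental",
         "raga bhairavi uses teental cycle",
         "bhairavi is a morning raga"]
terms = {"raga", "teental", "bhairavi"}
edges, freq = cooccurrence_network(posts, terms)
```

Edge weights then feed directly into the network measures the abstract mentions, such as node relevance or community detection among concepts.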
Abstract:
Most studies analysing the impact of infrastructure on regional growth show a positive relationship between the two variables. However, the public capital elasticity estimated in a Cobb-Douglas function, which is the most common specification in these works, is sometimes too large to be credible, so the results have been partly discounted. In the present paper, we provide new evidence on the real link between public capital and productivity for the Spanish regions in the period 1964-1991. Firstly, we find that the association between the two variables is weaker when controlling for regional effects, with industry being the sector that reaps the most benefit from an increase in infrastructure endowment. Secondly, the rigidity of the Cobb-Douglas function is overcome by using the variable expansion method. The expanded functional form reveals both the absence of a direct effect of infrastructure and the fact that the link between infrastructure and growth depends on the level of the existing stock (threshold level) and on how infrastructure is located relative to other factors. Finally, we analyse the importance of the spatial dimension of infrastructure impact, due to spillover effects. In this sense, the paper provides evidence of spatial autocorrelation processes that may invalidate previous results.
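The public capital elasticity at issue is the slope of a log-linearized Cobb-Douglas fit. The toy sketch below shows that estimation step only, with a single regressor and synthetic data; the paper's actual panel regressions add private inputs, regional effects and the variable expansion method, all omitted here.

```python
import math

def cobb_douglas_elasticity(output, public_capital):
    """OLS slope of ln(output) on ln(public capital): the public-capital
    elasticity in a log-linearized Cobb-Douglas Y = A * G^beta."""
    x = [math.log(g) for g in public_capital]
    y = [math.log(q) for q in output]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# Synthetic regions generated from Y = 2 * G^0.3, so the true elasticity is 0.3.
G = [1.0, 2.0, 4.0, 8.0]
Y = [2.0 * g ** 0.3 for g in G]
beta = cobb_douglas_elasticity(Y, G)
```

The abstract's point is that such a `beta` estimated without regional controls tends to be implausibly large, which motivates the richer specifications.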
Abstract:
The thermal behavior of complex packages such as multichip modules (MCMs) is usually studied by measuring the so-called thermal impedance response, that is, the transient temperature after a power step. From the analysis of this signal, the thermal frequency response can be estimated and, consequently, compact thermal models may be extracted. We present a method to obtain an estimate of the time-constant distribution underlying the observed transient. The method is based on an iterative deconvolution that produces an approximation to the time-constant spectrum while preserving a convenient convolution form. The method is applied to the thermal response of a microstructure analyzed by the finite element method, as well as to the measured thermal response of a transistor array integrated circuit (IC) in an SMD package.
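The forward model behind the time-constant spectrum is a sum of first-order exponentials (a Foster RC ladder). A minimal sketch of that forward model, with invented R and tau values, is shown below; the paper's contribution is the inverse step, the iterative deconvolution that recovers the (R, tau) spectrum from a measured curve.

```python
import math

def thermal_step_response(t, resistances, taus):
    """Transient thermal impedance after a power step for a discrete
    time-constant spectrum: Zth(t) = sum_i R_i * (1 - exp(-t / tau_i))."""
    return sum(r * (1.0 - math.exp(-t / tau))
               for r, tau in zip(resistances, taus))

R = [5.0, 10.0]       # K/W per time-constant bin (toy values)
tau = [1e-3, 1e-1]    # seconds (toy values)
z_steady = thermal_step_response(10.0, R, tau)   # t >> all taus: sum of R_i
```

Sampling this response on a logarithmic time axis is what gives the measured transient its convenient convolution form with the time-constant spectrum.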
Abstract:
We discuss reality conditions and the relation between spacetime diffeomorphisms and gauge transformations in Ashtekar's complex formulation of general relativity. We produce a general theoretical framework for the stabilization algorithm for the reality conditions, which is different from Dirac's method of stabilization of constraints. We solve the problem of the projectability of the diffeomorphism transformations from configuration-velocity space to phase space, linking them to the reality conditions. We construct the complete set of canonical generators of the gauge group in the phase space, which includes all the gauge variables. This result proves that the canonical formalism has all the gauge structure of the Lagrangian theory, including the time diffeomorphisms.
Abstract:
We propose a method to display full complex Fresnel holograms by adding the information displayed on two analogue ferroelectric liquid crystal spatial light modulators, one working in real-only configuration and the other in imaginary-only mode. The Fresnel holograms are computed by backpropagating an object to a selected distance with the Fresnel transform. Then, by displaying the real and imaginary parts on the respective panels, the object is reconstructed at that distance from the modulators by simple propagation of light. We present simulation results that take into account the specifications of the modulators, as well as optical results. We have also studied the quality of reconstructions using only real, imaginary, amplitude or phase information. Although the real and imaginary reconstructions look acceptable at certain distances, full complex reconstruction is always better and is required when arbitrary distances are used.
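The two-panel scheme can be sketched with a toy one-dimensional Fresnel kernel. This is a direct-summation illustration on a five-sample field with arbitrary geometry; real holograms would be computed with FFT-based Fresnel transforms at the modulators' actual pixel pitch and resolution.

```python
import cmath
import math

def fresnel_propagate(field, dx, wavelength, z):
    """Toy direct-summation 1-D Fresnel diffraction (constant prefactors
    dropped): u(x) = sum_j u0(x_j) * exp(i*pi*(x - x_j)^2 / (lambda*z)) * dx.
    A negative z backpropagates, as used to compute the hologram."""
    n = len(field)
    xs = [(j - n // 2) * dx for j in range(n)]
    return [sum(u0 * cmath.exp(1j * math.pi * (x - xj) ** 2
                               / (wavelength * z)) * dx
                for u0, xj in zip(field, xs))
            for x in xs]

# Hologram of a point object, backpropagated behind the display plane.
obj = [0, 0, 1, 0, 0]
holo = fresnel_propagate(obj, 1e-5, 633e-9, -2e-4)

# Two-panel display: one SLM shows the real part, the other the imaginary
# part; their optical addition restores the full complex hologram.
re_panel = [complex(h.real, 0.0) for h in holo]
im_panel = [complex(0.0, h.imag) for h in holo]
```

The decomposition into the two panels is exact, which is why the added fields reconstruct the object on propagation while either panel alone only approximates it.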
Abstract:
Background: Plant hormones play a pivotal role in several physiological processes during a plant's life cycle, from germination to senescence, and the determination of endogenous hormone concentrations is essential to elucidate the role of a particular hormone in any physiological process. The availability of a sensitive and rapid method to quantify multiple classes of hormones simultaneously will greatly facilitate the investigation of the signaling networks that control specific developmental pathways and physiological responses. Because hormones are present at very low concentrations in plant tissues (10⁻⁹ M to 10⁻⁶ M) and have diverse chemistries, the development of a high-throughput and comprehensive method for their determination is challenging. Results: The present work reports a rapid, specific and sensitive method using ultrahigh-performance liquid chromatography coupled to electrospray ionization tandem mass spectrometry (UPLC/ESI-MS/MS) to quantify, within six minutes, the major hormones found in plant tissues, including auxins, cytokinins, gibberellins, abscisic acid, 1-aminocyclopropane-1-carboxylic acid (the ethylene precursor), jasmonic acid and salicylic acid. Sample preparation, extraction procedures and UPLC-MS/MS conditions were optimized for the determination of all plant hormones and are summarized in a schematic extraction diagram for the analysis of small amounts of plant material without time-consuming additional steps such as purification, sample drying or re-suspension. Conclusions: This new method is applicable to the analysis of dynamic changes in endogenous hormone concentrations in complex tissues, for studying plant developmental processes or plant responses to biotic and abiotic stresses. As an example, a hormone profile is obtained from leaves of the aromatic plant Rosmarinus officinalis exposed to salt stress.
Abstract:
A new aggregation method for decision making is presented, using induced aggregation operators and the index of maximum and minimum level. Its main advantage is that it can assess complex reordering processes in the aggregation that represent complex attitudinal characters of the decision maker, such as psychological or personal factors. A wide range of properties and particular cases of this new approach are studied. A further generalization using hybrid averages and immediate weights is also presented. The key advantage of this approach over the previous model is that the weighted average and the ordered weighted average can be used in the same formulation. Thus, we are able to consider the subjective attitude and the degree of optimism of the decision maker in the decision process. The paper ends with an application to a decision making problem based on the use of assignment theory.
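The building block of such operators, the induced ordered weighted average (IOWA), is easy to state in code. This is a generic IOWA sketch with invented numbers, not the paper's full operator with the index of maximum and minimum level: the arguments are reordered by a separate order-inducing variable rather than by their own magnitude.

```python
def iowa(pairs, weights):
    """Induced ordered weighted average: each argument carries an
    order-inducing value; arguments are ranked by that value (descending)
    and then combined with the positional weights."""
    ranked = sorted(pairs, key=lambda p: p[0], reverse=True)
    return sum(w * a for w, (_, a) in zip(weights, ranked))

# (inducing value, argument) pairs: the inducing values, e.g. expert
# reliability scores, drive the reordering (toy numbers).
pairs = [(7, 0.6), (9, 0.2), (5, 0.9)]
w = [0.5, 0.3, 0.2]
score = iowa(pairs, w)
```

When the inducing value is taken to be the argument itself, this reduces to the ordinary OWA; choosing the weights shapes the modeled degree of optimism of the decision maker.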
Abstract:
Increasing anthropogenic pressures urge enhanced knowledge and understanding of the current state of marine biodiversity. This baseline information is pivotal to explore present trends, detect future modifications and propose adequate management actions for marine ecosystems. Coralligenous outcrops are a highly diverse and structurally complex deep-water habitat faced with major threats in the Mediterranean Sea. Despite their ecological, aesthetic and economic value, coralligenous biodiversity patterns are still poorly understood. There is currently no single sampling method that has been demonstrated to be sufficiently representative to ensure adequate community assessment and monitoring in this habitat. We therefore propose a rapid, non-destructive protocol for biodiversity assessment and monitoring of coralligenous outcrops that provides good estimates of their structure and species composition, based on photographic sampling and the determination of presence/absence of macrobenthic species. We used an extensive photographic survey, covering several spatial scales (100s of m to 100s of km) within the NW Mediterranean and including 2 different coralligenous assemblages: the Paramuricea clavata assemblage (PCA) and the Corallium rubrum assemblage (CRA). This approach allowed us to determine the minimal sampling area for each assemblage (5000 cm² for PCA and 2500 cm² for CRA). In addition, we conclude that 3 replicates provide an optimal sampling effort to maximize the number of species recorded and to assess the main biodiversity patterns of the studied assemblages in variability studies requiring replicates. We contend that the proposed sampling approach provides a valuable tool for management and conservation planning, monitoring and research programs focused on coralligenous outcrops, and is potentially also applicable to other benthic ecosystems.
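The choice of replicate number rests on a species accumulation argument: pooling successive photographic quadrats until few new species are added. A toy sketch of that computation, with an invented three-quadrat presence/absence dataset, is given below; the study's actual conclusion (3 replicates) comes from its own survey data.

```python
def cumulative_richness(replicates):
    """Species accumulation curve: distinct species recorded after
    pooling the first k photographic quadrats, from per-quadrat
    presence/absence sets."""
    seen = set()
    curve = []
    for quadrat in replicates:
        seen |= set(quadrat)
        curve.append(len(seen))
    return curve

# Invented presence/absence records for three quadrats.
quadrats = [{"P. clavata", "C. rubrum"},
            {"C. rubrum", "bryozoan A"},
            {"P. clavata", "bryozoan A"}]
curve = cumulative_richness(quadrats)
```

Where the curve flattens, additional replicates add little richness, which is the criterion behind an "optimal sampling effort".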
Abstract:
We analyze the process of informational exchange through complex networks by measuring network efficiencies. Aiming to study nonclustered systems, we propose a modification of this measure on the local level. We apply this method to an extension of the class of small worlds that includes declustered networks and show that they are locally quite efficient, although their clustering coefficient is practically zero. Unweighted systems with small-world and scale-free topologies are shown to be both globally and locally efficient. Our method is also applied to characterize weighted networks. In particular we examine the properties of underground transportation systems of Madrid and Barcelona and reinterpret the results obtained for the Boston subway network.
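The global efficiency measure used here has a standard definition: the average of inverse shortest-path distances over all node pairs. A self-contained BFS sketch for unweighted graphs is shown below; the paper's local-level modification and the weighted (transportation network) case are not reproduced.

```python
from collections import deque

def global_efficiency(adj):
    """Global efficiency of an unweighted graph given as an adjacency
    dict: E = (1 / (N*(N-1))) * sum over ordered pairs of 1/d(i, j),
    with 1/infinity = 0 for disconnected pairs."""
    nodes = list(adj)
    n = len(nodes)
    total = 0.0
    for s in nodes:
        dist = {s: 0}
        queue = deque([s])
        while queue:                      # breadth-first search from s
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for t, d in dist.items() if t != s)
    return total / (n * (n - 1))

# A 4-node path graph 0-1-2-3: efficiency is the mean of 1/1, 1/2, 1/3, ...
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
eff = global_efficiency(path)
```

The local variant replaces the whole graph by each node's neighbourhood subgraph, which is where the paper's modification for nonclustered systems enters.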
Abstract:
The most suitable method for estimating size diversity is investigated. Size diversity is computed on the basis of the Shannon diversity expression adapted for continuous variables such as size; it takes the form of an integral involving the probability density function (pdf) of the size of the individuals. Different approaches to estimating the pdf are compared: parametric methods, which assume that the data come from a particular family of pdfs, and nonparametric methods, in which the pdf is estimated by some kind of local evaluation. Exponential, generalized Pareto, normal and log-normal distributions were used to generate simulated samples from parameters estimated on real samples. The nonparametric methods include discrete computation of data histograms based on size intervals and continuous kernel estimation of the pdf. The kernel approach gives an accurate estimation of size diversity, whilst parametric methods are only useful when the reference distribution has a shape similar to the real one. Special attention is given to data standardization. Division of the data by the sample geometric mean is proposed as the most suitable standardization method, which has additional advantages: the same size diversity value is obtained whether original sizes or log-transformed data are used, and size measurements of different dimensionality (lengths, areas, volumes or biomasses) may be compared directly after the simple addition of ln k, where k is the dimensionality (1, 2 or 3, respectively). Thus, kernel estimation, after standardization by division by the sample geometric mean, emerges as the most reliable and generalizable method of size diversity evaluation.
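The recommended pipeline (geometric-mean standardization, then a Gaussian kernel estimate of the pdf, then the continuous Shannon integral H = -∫ p(x) ln p(x) dx) can be sketched directly. The bandwidth, grid resolution and sample values below are arbitrary choices for illustration, not the paper's settings.

```python
import math

def size_diversity(sizes, bandwidth=0.5, grid=400):
    """Continuous Shannon size diversity: standardize sizes by the sample
    geometric mean, estimate the pdf with a Gaussian kernel, and integrate
    -p(x) * ln p(x) numerically over a grid covering the data."""
    gm = math.exp(sum(math.log(s) for s in sizes) / len(sizes))
    z = [s / gm for s in sizes]                  # geometric-mean standardized
    lo = min(z) - 4 * bandwidth
    hi = max(z) + 4 * bandwidth
    dx = (hi - lo) / grid
    norm = 1.0 / (len(z) * bandwidth * math.sqrt(2 * math.pi))
    h = 0.0
    for i in range(grid):                        # midpoint-rule integration
        x = lo + (i + 0.5) * dx
        p = norm * sum(math.exp(-0.5 * ((x - zi) / bandwidth) ** 2)
                       for zi in z)
        if p > 0:
            h -= p * math.log(p) * dx
    return h

sizes = [1.0, 1.5, 2.0, 3.0, 5.0]    # arbitrary example sizes
H = size_diversity(sizes)
```

Because the standardization divides by the geometric mean, rescaling all sizes by a common factor (for example, changing measurement units) leaves H unchanged, which is the invariance property the abstract highlights.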