Abstract:
Essay written by Shaelyne Johnson, an undergraduate student of Global Studies at the University of California-Santa Barbara, during her internship at CEO-UAB in the 2008/2009 academic year. She compares the organisational structure, goals and objectives of the institutions of the Olympic Movement and of European integration, in order to identify connections between the two movements brought about by globalization. The paper begins with an introduction to today's changing world, followed by an overview of the structural similarities in the historical development of these two parallel movements; before concluding, new means for international relations are considered. This document is available in English through the digital library at the CEO-UAB Portal of Olympic Studies and the digital repository RECERCAT.
Abstract:
We study how unionisation affects competitive selection between heterogeneous firms when wage negotiations can occur at the firm or at the profit-centre level. With productivity specific wages, an increase in union power has: (i) a selection-softening; (ii) a counter-competitive; (iii) a wage-inequality; and (iv) a variety effect. In a two-country asymmetric setting, stronger unions soften competition for domestic firms and toughen it for exporters. With profit-centre bargaining, we show how trade liberalisation can affect wage inequality among identical workers both across firms (via its effects on competitive selection) and within firms (via wage discrimination across destination markets).
Abstract:
We re-examine the dynamics of returns and dividend growth within the present-value framework of stock prices. We find that the finite-sample order of integration of returns is approximately equal to the order of integration of the first-differenced price-dividend ratio. As such, the traditional return forecasting regressions based on the price-dividend ratio are invalid. Moreover, the nonstationary long-memory behaviour of the price-dividend ratio induces antipersistence in returns. This suggests that expected returns should be modelled as an ARFIMA process, and we show that this improves the forecasting ability of the present-value model both in-sample and out-of-sample.
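The ARFIMA mechanics referred to above rest on the fractional-differencing filter (1 − L)^d with non-integer d; antipersistence corresponds to a negative d for returns. A minimal sketch of that filter, with illustrative inputs not taken from the paper:

```python
# Sketch of the fractional-differencing filter (1 - L)^d that underlies an
# ARFIMA model. Illustrative only: the series and the value of d below are
# hypothetical, not estimates from the paper.

def frac_diff_weights(d, n):
    """First n weights of the binomial expansion of (1 - L)^d:
    w_0 = 1, w_k = w_{k-1} * (k - 1 - d) / k."""
    w = [1.0]
    for k in range(1, n):
        w.append(w[-1] * (k - 1 - d) / k)
    return w

def frac_diff(series, d):
    """Apply (1 - L)^d to a series, truncating the expansion at the
    start of the sample."""
    w = frac_diff_weights(d, len(series))
    return [sum(w[k] * series[t - k] for k in range(t + 1))
            for t in range(len(series))]

# d = 1 recovers the ordinary first difference as a sanity check;
# an antipersistent return series would instead use some d < 0.
print(frac_diff([1.0, 2.0, 3.0, 4.0], 1.0))
```

For d = 1 the weights collapse to (1, −1, 0, …), so the filter reproduces ordinary differencing; non-integer d spreads weight over all lags, which is what gives long-memory behaviour.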
Abstract:
The remarkable increases in trade flows and in migratory flows of highly educated people are two important features of globalization in recent decades. This paper extends a two-country model of inter- and intraindustry trade to a rich environment featuring technological differences, skill differences and the possibility of international labor mobility. The model is used to explain the patterns of trade and migration as countries remove barriers to trade and to labor mobility. We parameterize the model to match the features of the Western and Eastern European members of the EU and first analyze the effects of the trade liberalization which occurred between 1989 and 2004, and then the gains and losses from the migration which is expected to occur if legal barriers to labor mobility are substantially reduced. The lower barriers to migration would result in significant migration of skilled workers from Eastern European countries. Interestingly, this would not only benefit the migrants and most Western European workers but, via trade, it would also benefit the workers remaining in Eastern Europe. Key Words: Skilled Migration, Gains from Variety, Real Wages, Eastern-Western Europe. JEL Codes: F12, F22, J61.
Abstract:
Multisensory interactions are a fundamental feature of brain organization. Principles governing multisensory processing have been established by varying stimulus location, timing and efficacy independently. Determining whether and how such principles operate when stimuli vary dynamically in their perceived distance (as when looming/receding) provides an assay for synergy among the above principles and also means for linking multisensory interactions between rudimentary stimuli with higher-order signals used for communication and motor planning. Human participants indicated movement of looming or receding versus static stimuli that were visual, auditory, or multisensory combinations while 160-channel EEG was recorded. Multivariate EEG analyses and distributed source estimations were performed. Nonlinear interactions between looming signals were observed at early poststimulus latencies (∼75 ms) in analyses of voltage waveforms, global field power, and source estimations. These looming-specific interactions positively correlated with reaction time facilitation, providing direct links between neural and performance metrics of multisensory integration. Statistical analyses of source estimations identified looming-specific interactions within the right claustrum/insula extending inferiorly into the amygdala and also within the bilateral cuneus extending into the inferior and lateral occipital cortices. Multisensory effects common to all conditions, regardless of perceived distance and congruity, followed (∼115 ms) and manifested as faster transition between temporally stable brain networks (vs summed responses to unisensory conditions). We demonstrate the early-latency, synergistic interplay between existing principles of multisensory interactions. Such findings change the manner in which to model multisensory interactions at neural and behavioral/perceptual levels. 
We also provide neurophysiologic backing for the notion that looming signals receive preferential treatment during perception.
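The nonlinear interactions described above are assessed against an additive model: the response to the combined audio-visual stimulus is compared with the sum of the two unisensory responses. A minimal sketch of that criterion, using made-up amplitudes rather than the study's EEG data:

```python
# Sketch of the additive-model criterion for detecting nonlinear
# multisensory interactions: response to the combined audio-visual (AV)
# stimulus versus the sum of the auditory-alone (A) and visual-alone (V)
# responses. The amplitude values below are invented for illustration.

def interaction(av, a, v):
    """Pointwise nonlinear interaction term AV - (A + V). Positive values
    indicate a super-additive response, negative values a sub-additive
    one; zero means purely linear summation."""
    return [av_t - (a_t + v_t) for av_t, a_t, v_t in zip(av, a, v)]

# Hypothetical evoked amplitudes at a few post-stimulus latencies:
a = [0.25, 0.5]
v = [0.25, 0.5]
av = [0.75, 1.5]
print(interaction(av, a, v))  # positive terms -> super-additive response
```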
Abstract:
Although combination chemotherapy has been shown to be more effective than single agents in advanced esophagogastric cancer, the better response rates have not fulfilled their promise, as overall survival times with the best combinations still range between 8 and 11 months. So far, the development of targeted therapies and their integration into treatment concepts lags somewhat behind that in other gastrointestinal diseases. This review therefore summarizes recent advances in the development of targeted therapies in advanced esophagogastric cancer. The majority of agents tested were angiogenesis inhibitors or agents targeting the epidermal growth factor receptors EGFR1 and HER2. For trastuzumab and bevacizumab, phase III trial results have been presented recently. While the addition of trastuzumab to cisplatin/5-fluoropyrimidine-based chemotherapy results in a clinically relevant and statistically significant survival benefit in HER2-positive patients, the benefit of adding bevacizumab to chemotherapy was not significant. Thus, all patients with metastatic disease should have their tumors tested for HER2 status. Trastuzumab in combination with cisplatin/5-fluoropyrimidine-based chemotherapy is the new standard of care for patients with HER2-positive advanced gastric cancer.
Abstract:
The objective of this work is to present a multitechnique approach to define the geometry, the kinematics, and the failure mechanism of a retrogressive large landslide (upper part of the La Valette landslide, South French Alps) by combining airborne and terrestrial laser scanning data with ground-based seismic tomography data. The advantage of combining different methods is to constrain the geometrical and failure mechanism models by integrating different sources of information. Because of a high point density at the ground surface (4.1 points m⁻²), a small laser footprint (0.09 m) and an accurate three-dimensional positioning (0.07 m), airborne laser scanning data are well suited for analyzing morphological structures at the surface. Seismic tomography surveys (P-wave and S-wave velocities) may highlight the presence of low-seismic-velocity zones that characterize the presence of dense fracture networks at the subsurface. The surface displacements measured from the terrestrial laser scanning data over a period of 2 years (May 2008–May 2010) allow one to quantify the landslide activity in the direct vicinity of the identified discontinuities. Significant subsidence of the crown area, with an average subsidence rate of 3.07 m year⁻¹, is determined. The displacement directions indicate that the retrogression is controlled structurally by the preexisting discontinuities. A conceptual structural model is proposed to explain the failure mechanism and the retrogressive evolution of the main scarp. Uphill, the crown area is affected by planar sliding included in a deeper wedge failure system constrained by two preexisting fractures. Downhill, the landslide body acts as a buttress for the upper part. Consequently, the progression of the landslide body downhill allows the development of dip-slope failures, and coherent blocks start sliding along planar discontinuities.
The volume of the failed mass in the crown area is estimated at 500,000 m³ with the sloping local base level method.
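The sloping local base level (SLBL) method mentioned above estimates the failed volume as the space between the present topography and an iteratively lowered basal surface. A hedged one-dimensional sketch of that idea, with a hypothetical profile and tolerance (the real method operates on a DEM in two dimensions):

```python
# Illustrative 1-D sketch of the sloping local base level (SLBL) concept:
# each interior node of a topographic profile is iteratively lowered to the
# mean of its neighbours (minus a tolerance) until the surface is convex,
# and the failed volume is the area between the two surfaces. The profile,
# tolerance and cell width below are hypothetical, not La Valette data.

def slbl_1d(z, tol=0.0, max_iter=10000):
    """Iteratively lower interior nodes until no node lies above the mean
    of its neighbours minus tol. Endpoints are fixed."""
    base = list(z)
    for _ in range(max_iter):
        changed = False
        for i in range(1, len(base) - 1):
            candidate = 0.5 * (base[i - 1] + base[i + 1]) - tol
            if candidate < base[i]:
                base[i] = candidate
                changed = True
        if not changed:
            break
    return base

def failed_area(z, base, dx=1.0):
    """Cross-sectional area (per unit width) between topography and base."""
    return sum((zi - bi) * dx for zi, bi in zip(z, base))

profile = [10.0, 14.0, 12.0, 15.0, 11.0]  # hypothetical elevations (m)
base = slbl_1d(profile)
print(failed_area(profile, base))
```

With a zero tolerance the basal surface converges to the straight line between the fixed endpoints; a positive tolerance lets the base sag below it, mimicking a curved failure surface.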
Abstract:
"Vegeu el resum a l'inici del document del fitxer adjunt."
Abstract:
Integration of kDNA sequences into the host-cell genome, shown by PCR amplification with primers to the conserved Trypanosoma cruzi kDNA minicircle sequence, was confirmed by Southern hybridization with specific probes. The cells containing the integrated kDNA sequences were then perpetuated as transfected macrophage subclonal lines. The kDNA-transfected macrophages expressed membrane antigens that were recognized by antibodies in a panel of sera from ten patients with chronic Chagas disease. These antigens, barely expressed in the membrane of uninfected control macrophage clonal lines, were recognized neither by factors in the sera of control, non-chagasic subjects nor by those in the chagasic sera. This finding suggests the presence of an autoimmune antibody in the chagasic sera that recognizes autoantigens in the membrane of T. cruzi kDNA-transfected macrophage subclonal lines.
Abstract:
Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending corresponding approaches beyond the local scale still represents a major challenge, yet is critically important for the development of reliable groundwater flow and contaminant transport models. To address this issue, I have developed a hydrogeophysical data integration technique based on a two-step Bayesian sequential simulation procedure that is specifically targeted towards larger-scale problems. The objective is to simulate the distribution of a target hydraulic parameter based on spatially exhaustive, but poorly resolved, measurements of a pertinent geophysical parameter and locally highly resolved, but spatially sparse, measurements of the considered geophysical and hydraulic parameters. To this end, my algorithm links the low- and high-resolution geophysical data via a downscaling procedure before relating the downscaled regional-scale geophysical data to the high-resolution hydraulic parameter field. I first illustrate the application of this novel data integration approach to a realistic synthetic database consisting of collocated high-resolution borehole measurements of the hydraulic and electrical conductivities and spatially exhaustive, low-resolution electrical conductivity estimates obtained from electrical resistivity tomography (ERT). The overall viability of this method is tested and verified by performing and comparing flow and transport simulations through the original and simulated hydraulic conductivity fields.
The corresponding results indicate that the proposed data integration procedure does indeed allow for obtaining faithful estimates of the larger-scale hydraulic conductivity structure and reliable predictions of the transport characteristics over medium- to regional-scale distances. The approach is then applied to a corresponding field scenario consisting of collocated high-resolution measurements of the electrical conductivity, as measured using a cone penetrometer testing (CPT) system, and the hydraulic conductivity, as estimated from electromagnetic flowmeter and slug test measurements, in combination with spatially exhaustive low-resolution electrical conductivity estimates obtained from surface-based electrical resistivity tomography (ERT). The corresponding results indicate that the newly developed data integration approach is indeed capable of adequately capturing both the small-scale heterogeneity and the larger-scale trend of the prevailing hydraulic conductivity field. The results also indicate that this novel data integration approach is remarkably flexible and robust and hence can be expected to be applicable to a wide range of geophysical and hydrological data at all scale ranges. In the second part of my thesis, I evaluate in detail the viability of sequential geostatistical resampling as a proposal mechanism for Markov Chain Monte Carlo (MCMC) methods applied to high-dimensional geophysical and hydrological inverse problems, in order to allow for a more accurate and realistic quantification of the uncertainty associated with the thus inferred models. Focusing on a series of pertinent crosshole georadar tomographic examples, I investigate two classes of geostatistical resampling strategies with regard to their ability to efficiently and accurately generate independent realizations from the Bayesian posterior distribution.
The corresponding results indicate that, despite its popularity, sequential resampling is rather inefficient at drawing independent posterior samples for realistic synthetic case studies, notably for the practically common and important scenario of pronounced spatial correlation between model parameters. To address this issue, I have developed a new gradual-deformation-based perturbation approach, which is flexible with regard to the number of model parameters as well as the perturbation strength. Compared to sequential resampling, this newly proposed approach was proven to be highly effective in decreasing the number of iterations required for drawing independent samples from the Bayesian posterior distribution.
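The gradual-deformation perturbation described above can be sketched as follows: a current model realization is mixed with an independent draw from the same Gaussian prior so that the proposal again honors the prior, while a mixing angle tunes the perturbation strength. Field size, seed, and angle below are illustrative choices, not values from the thesis:

```python
import math
import random

# Hedged sketch of a gradual-deformation proposal for MCMC. A current
# realization of a standard Gaussian prior is combined with an independent
# realization of the same prior; since cos^2(theta) + sin^2(theta) = 1,
# the proposal again follows the prior (for correlated fields, the same
# identity preserves the prior covariance). theta -> 0 gives tiny steps,
# theta = pi/2 an independent redraw.

def gradual_deformation(current, theta, rng=random):
    """Propose m' = m*cos(theta) + u*sin(theta), with u an independent
    standard Gaussian draw of the same dimension as m."""
    c, s = math.cos(theta), math.sin(theta)
    return [c * m + s * rng.gauss(0.0, 1.0) for m in current]

rng = random.Random(42)
model = [rng.gauss(0.0, 1.0) for _ in range(5)]   # toy 5-cell "field"
proposal = gradual_deformation(model, theta=0.3, rng=rng)
print(proposal)
```

In an actual sampler this proposal would be embedded in a Metropolis-Hastings accept/reject step against the data likelihood; the single parameter theta then plays the role of the perturbation strength discussed above.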