995 results for Global Fortune 500
Abstract:
International competitiveness ultimately depends upon the linkages between a firm’s unique, idiosyncratic capabilities (firm-specific advantages, FSAs) and its home country assets (country-specific advantages, CSAs). In this paper, we present a modified FSA/CSA matrix building upon the FSA/CSA matrix (Rugman 1981). We relate this to the diamond framework for national competitiveness (Porter 1990), and the double diamond model (Rugman and D’Cruz 1993). We provide empirical evidence to demonstrate the merits and usefulness of the modified FSA/CSA matrix using the Fortune Global 500 firms. We examine the FSAs based on the geographic scope of sales and CSAs that can lead to national, home region, and global competitiveness. Our empirical analysis suggests that the world’s largest 500 firms have increased their firm-level international competitiveness. However, much of this is still being achieved within their home region. In other words, international competitiveness is a regional not a global phenomenon. Our findings have significant implications for research and practice. Future research in international marketing should take into account the multi-faceted nature of FSAs and CSAs across different levels. For MNE managers, our study provides useful insights for strategic marketing planning and implementation.
Abstract:
Objective: The purpose of this study was to compare 2 different interventions, global postural reeducation (GPR) and static stretching exercises (SS), in the treatment of women with temporomandibular disorders (TMDs). Methods: A total of 28 subjects with TMDs were randomized into 2 treatment groups: GPR, in which therapy involved global muscle chain stretching, or SS, with conventional static stretching; only 24 completed the study. Eight weekly treatment sessions lasting 40 minutes each were performed. Assessments were conducted at baseline, immediately after the end of treatment, and 2 months later. Measurements included pain intensity at the temporomandibular joint, headache, cervicalgia, teeth clenching, ear symptoms, restricted sleep, and difficulty with mastication, using a visual analogue scale. In addition, electromyographic activity and pain thresholds were measured at the masseter, anterior temporalis, sternocleidomastoid, and upper trapezius muscles. Two-way analysis of variance with the Tukey post hoc test was used for between-group comparisons. The significance level was .05. Results: Comparing the pain assessments using the visual analogue scale, no significant differences were seen, with the exception of severity of headaches at the end of treatment (GPR, 3.92 +/- 2.98 cm; SS, 1.64 +/- 1.66 cm; P < .024). In addition, no significant differences were seen for pain thresholds or for electromyographic activity (P > .05). Conclusions: For the subjects in this study, both GPR and SS were similarly effective for the treatment of TMDs with a muscular component. They equally reduced pain intensity, increased pain thresholds, and decreased electromyographic activity. (J Manipulative Physiol Ther 2010;33:500-507)
Abstract:
The main purpose of this study is to analyse the changes caused by the global financial crisis in the influence of board characteristics on corporate outcomes, in terms of corporate performance, corporate risk-taking, and earnings management. The sample comprises S&P 500 listed firms over 2002-2008. The study reveals that environmental conditions call for different behaviour from directors to fulfil their responsibilities, and it suggests changes in normative and voluntary guidelines for improving good practice in the boardroom.
Abstract:
The main objectives of this study are the characterization of one of the extrusion lines at Cabelte, namely the extrusion line designated EP5, composed of two extruders. The aim is to determine energy and process indicators and to optimize energy consumption, with respect to both the energy consumed and the thermal losses of this line. To monitor the EP5 extrusion line, a central measuring device was installed in the line's main switchboard. For the auxiliary extruder, however, measurements were taken with a clamp ammeter and a phase meter. Trials were also carried out to assess the quantity of material processed, using a weighing device (a gravimetric doser fitted to the extruders). Temperature measurements for the thermal-loss calculations of the main extruder and for the characterization of the plastic materials were taken with a digital thermometer. Throughput tests were performed on the auxiliary and main extruders, and the variation of the power factor as a function of screw speed was studied. From the end user's perspective, optimization for the rational use of energy lies in reducing the electricity bill. That bill depends not only on the amount of energy consumed but also on when it is used, since the cost of electricity depends strongly on the period in which it is consumed. A different production-planning methodology, scheduling the manufacture of the cables with the highest specific cost during the hours with the lowest energy cost, would reduce specific costs by 18.7% under the summer tariff schedule and by 20.4% under the winter schedule.
The sheathing materials used (PE and PVC) directly influence energy costs, since polyethylene (PE) always shows higher enthalpy values (0.317 kWh/kg and 0.281 kWh/kg) and requires higher working temperatures than polyvinyl chloride (PVC) (0.141 kWh/kg and 0.124 kWh/kg). Specific consumption tends to decrease as screw speed increases, up to an optimum speed, beyond which the trend reverses. For the two extruders studied, cos φ always increases with increasing screw speed. This study made it possible to assess the optimal conditions for the cable-sheathing process so as to minimize energy consumption. Reducing all kinds of waste (over-consumption, purge waste) is a management priority that combines effectiveness with efficiency, and is a fundamental tool for securing the company's future. The average power factor measured for line EP5 (0.38) is extremely low; besides the reactive energy it implies and the economic cost inherent to it, it constrains future expansions. The power factor can be corrected by installing a 500 kVAr capacitor bank. Considering the new tariff system applied to reactive energy, this yields a saving of 36,167.4 EUR/year, with a payback period of 0.37 years (4.5 months). This measure also implies a 6.5% annual reduction in the quantity of CO2 emitted. Quantifying thermal losses is important, since only then can courses of action be defined to increase energy efficiency. Without in-depth knowledge of the processes and correct methodologies there can be no efficient solutions, so it is essential to measure before implementing any management measure.
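The savings and payback figures quoted above can be sanity-checked with simple arithmetic. A minimal sketch follows; note that the abstract does not state the capacitor bank's cost, so the investment is inferred here from the reported saving and payback, and the PE/PVC comparison simply averages the two enthalpy values quoted for each material.

```python
# Back-of-the-envelope check of the EP5 reactive-energy correction figures.
# The investment amount is NOT given in the abstract; it is inferred here
# from the stated annual saving and payback period.

annual_saving_eur = 36167.4   # reported saving from the 500 kVAr capacitor bank
payback_years = 0.37          # reported payback period

implied_investment = annual_saving_eur * payback_years
print(f"implied investment: {implied_investment:,.0f} EUR")
print(f"payback in months: {payback_years * 12:.1f}")  # ~4.4, quoted as 4.5

# Enthalpy figures quoted for the sheathing materials (kWh/kg):
pe_enthalpy = (0.317, 0.281)
pvc_enthalpy = (0.141, 0.124)
ratio = sum(pe_enthalpy) / sum(pvc_enthalpy)
print(f"PE requires roughly {ratio:.1f}x the specific energy of PVC")
```

Consistent with the abstract, the implied capacitor-bank investment is on the order of 13,000 EUR, and PE needs a little over twice the specific energy of PVC.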
Abstract:
Although communication sciences in Portugal already have a history of almost 35 years, beginning very tentatively in 1979 with the introduction of communication teaching in Portuguese universities, the challenges we face remain very demanding. And never as today have we run the risk of regression. Over the past two decades we have created research centres and launched specialized journals. More than two hundred professors and researchers have earned doctorates, and close to 500 students are currently enrolled in doctoral programmes. We have come a long way, which has allowed us to dream of a large scientific community. The pace of growth we have experienced will not be repeated in the coming years. The factors that weaken us are known to all. Political and financial disinvestment in the social sciences and humanities, and the apparent lack of interest in scientific output in the Portuguese language, together with the economic crisis that stifles the dream of pursuing further studies, are today causes for concern and a threat to the standards of quality, originality and scope finally achieved. Internationalization is today the watchword, as are creating jobs and producing socio-economic impact in the region and the country. Even if perhaps understandable, these requirements for validating the research we do ignore the timescales proper to knowledge in this field, where the effects of our work become apparent only in the medium term. Equally misguided, for the same reason, are the quality criteria that push us toward production metrics incompatible with the nature of the communication sciences. The theme of the VIII Sopcom Congress is not unrelated to these challenges. "Global communication, culture and technology" is the motto for reflecting, in times of crisis, on this culture of internationalization and on the technologies available for promoting that ideal of global communication.
These, then, are the three weapons with which we take the field in one more contribution to the consolidation of communication sciences in Portugal. We count on more than 200 papers, distributed across dozens of parallel sessions, and on eight talks in four plenary sessions. Joining this scientific programme are five invited speakers from Spain, France and Brazil: Margarita Ledo, Miquel de Moragas, Alain Kiyindou, Antônio Hohlfeldt and Muniz Sodré. Finally, we hope to welcome for the first time many of the new researchers who have joined Sopcom. In the scientific sessions, as in the coffee breaks and the official dinner, it is the spirit of a young and bold group that we hope to cultivate. May this congress be for all a moment of encounter, of sharing, and also of hope in solidarity among peers, which is to say in the association of interests, initiatives and achievements. In communication, that is the soul of the science we do. From the foreword, Moisés de Lemos Martins
Abstract:
The aim of this thesis is to investigate mean return spillovers as well as volatility spillovers from the S&P 500 stock index in the USA to selected stock markets in the emerging economies of Eastern Europe between 2002 and 2014. The sample period is divided into smaller subsamples, which makes it possible to account for different market conditions as well as the unification of the world's capital markets during the financial crisis. Bivariate VAR(1) models are used to analyze the mean return spillovers, while the volatility linkages are analyzed using bivariate BEKK-GARCH(1,1) models. The results show persistent volatility clustering within the S&P 500 as well as some statistically significant spillovers of both return and volatility from the S&P 500 to the Eastern European emerging stock markets. Moreover, some of the results indicate that the volatility spillovers have increased over time, suggesting a unification of global stock markets.
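The mean-return spillover the thesis estimates can be illustrated with a single VAR(1)-style equation fitted by ordinary least squares: today's emerging-market return is regressed on its own lag and on the lagged S&P 500 return, and the cross coefficient measures the spillover. This is a minimal sketch on synthetic data with a built-in spillover of 0.3, not the thesis's actual estimation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily returns: a "US" series and an "emerging" series whose
# mean depends on the previous day's US return (true spillover = 0.3).
T = 2000
us = rng.normal(0, 1, T)
em = np.empty(T)
em[0] = rng.normal()
for t in range(1, T):
    em[t] = 0.1 * em[t - 1] + 0.3 * us[t - 1] + rng.normal(0, 1)

# One equation of a bivariate VAR(1), estimated by OLS:
#   em[t] = c + a * em[t-1] + b * us[t-1] + e[t]
X = np.column_stack([np.ones(T - 1), em[:-1], us[:-1]])
y = em[1:]
c, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"own lag a ~ {a:.2f}, spillover b ~ {b:.2f}")
```

A significantly positive b is the "mean return spillover" of the abstract; capturing volatility spillovers additionally requires modelling the conditional covariance, which is what the BEKK-GARCH(1,1) specification adds.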
Abstract:
The 21st century has brought new challenges for forest management at a time when globalization in world trade is increasing and global climate change is becoming increasingly apparent. In addition to the various goods and services, such as food, feed, timber or biofuels, that they provide to humans, forest ecosystems are a large store of terrestrial carbon and account for a major part of the carbon exchange between the atmosphere and the land surface. Depending on the stage of the ecosystems and/or the management regime, forests can be either sinks or sources of carbon. At the global scale, rapid economic development and a growing world population have raised much concern over the use of natural resources, especially forest resources. The challenging question is how the global demands for forest commodities can be satisfied in an increasingly globalised economy, and where they could potentially be produced. For this purpose, wood demand estimates need to be integrated into a framework that can adequately handle the competition for land between major land-use options such as residential land or agricultural land. This thesis is organised in accordance with the requirements to integrate the simulation of forest changes based on wood extraction into an existing framework for global land-use modelling called LandSHIFT. Accordingly, the following focal points for research have been identified: (1) a review of existing global-scale economic forest sector models; (2) simulation of global wood production under selected scenarios; (3) simulation of global vegetation carbon yields; and (4) the implementation of a land-use allocation procedure to simulate the impact of wood extraction on forest land cover.
Modelling the spatial dynamics of forests on the global scale requires two important inputs: (1) simulated long-term wood demand data to determine future roundwood harvests in each country and (2) the changes in the spatial distribution of woody biomass stocks to determine how much of the resource is available to satisfy the simulated wood demands. First, three global timber market models are reviewed and compared in order to select a suitable economic model to generate wood demand scenario data for the forest sector in LandSHIFT. The comparison indicates that the ‘Global Forest Products Model’ (GFPM) is most suitable for obtaining projections on future roundwood harvests for further study with the LandSHIFT forest sector. Accordingly, the GFPM is adapted and applied to simulate wood demands for the global forestry sector conditional on selected scenarios from the Millennium Ecosystem Assessment and the Global Environmental Outlook until 2050. Secondly, the Lund-Potsdam-Jena (LPJ) dynamic global vegetation model is utilized to simulate the change in potential vegetation carbon stocks for the forested locations in LandSHIFT. The LPJ data is used in collaboration with spatially explicit forest inventory data on aboveground biomass to allocate the demands for raw forest products and identify locations of deforestation. Using the previous results as an input, a methodology to simulate the spatial dynamics of forests based on wood extraction is developed within the LandSHIFT framework. The land-use allocation procedure specified in the module translates the country level demands for forest products into woody biomass requirements for forest areas, and allocates these on a five arc minute grid. In a first version, the model assumes only actual conditions through the entire study period and does not explicitly address forest age structure. 
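The core of such a land-use allocation procedure is translating a country-level demand into harvests on individual grid cells. The sketch below is purely illustrative (it is not the LandSHIFT implementation, whose suitability ranking is richer): it assumes a simple greedy rule that harvests the highest-biomass cells first until the demand is met.

```python
# Illustrative sketch of allocating a country-level roundwood demand
# across grid cells, harvesting the highest-biomass cells first.
# The greedy ranking rule is an assumption for illustration only.

def allocate_wood_demand(cells, demand):
    """cells: list of (cell_id, available_biomass) pairs;
    demand: total biomass required, in the same units.
    Returns a dict {cell_id: harvested_amount}."""
    harvest = {}
    remaining = demand
    for cell_id, biomass in sorted(cells, key=lambda c: -c[1]):
        if remaining <= 0:
            break
        take = min(biomass, remaining)
        harvest[cell_id] = take
        remaining -= take
    return harvest

cells = [("A", 40.0), ("B", 100.0), ("C", 25.0)]
print(allocate_wood_demand(cells, 120.0))  # {'B': 100.0, 'A': 20.0}
```

In the thesis's setup, the demand side of this function comes from the GFPM projections and the biomass side from LPJ carbon stocks combined with forest inventory data, with the allocation resolved on a five arc minute grid.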
Although the module is at a very preliminary stage of development, it already captures the effects of important drivers of land-use change such as cropland and urban expansion. As a first plausibility test, the module's performance is evaluated under three forest management scenarios; it responds to changing inputs in an expected and consistent manner. The entire methodology is applied in an exemplary scenario analysis for India. Several future research priorities remain, in particular the incorporation of plantation establishment, the issue of age-structure dynamics, and the implementation of a new technology-change factor in the GFPM to allow the substitution of raw wood products (especially fuelwood) by other non-wood products.
Abstract:
Introduction: comparative genomic hybridization is a technique that allows the exploration of chromosomal abnormalities. Its usefulness in the diagnostic workup of patients with global developmental delay or a dysmorphic phenotype, however, has not been explored through a systematic review of the literature. Methods: a systematic review of the literature was performed. Controlled, quasi-experimental, cohort, case-control, cross-sectional and descriptive studies published in English and Spanish between 2000 and 2013 were included. The evidence was analysed with both a qualitative and a quantitative approach, and the risk of bias of the included studies was assessed. Results: 4 studies meeting the inclusion criteria were included. The prevalence of chromosomal abnormalities in children with global developmental delay was between 6% and 13%. The technique made it possible to identify abnormalities that had not been detected by karyotyping. Conclusions: comparative genomic hybridization is a useful technique in the diagnostic workup of children with global developmental delay and a dysmorphic phenotype, and it detects more abnormalities than karyotyping.
Abstract:
This paper describes the most notable contributions of some of the representative management gurus of East and West. In the West, there is the legacy of Henry Ford, Philip Kotler, Frederick Winslow Taylor, Henry Fayol, Michael Porter, Peter Drucker and Steve Jobs. In the East, the gurus are Akio Morita, Edwards Deming, Kaoru Ishikawa, the Toyoda family, Masaaki Imai and Taiichi Ohno. On this basis, comparisons are drawn between the management trends of each culture and among the gurus. Finally, important aspects of McDonald's and Samsung are discussed, in terms of their management models and their adaptation to a globalized world.
Abstract:
This case study aims to analyse the extent to which the commercial dynamics of Chinese oil diplomacy have turned Ecuador into a strategic partner for the PRC. Oil as an energy source is essential for carrying out industrialization processes and sustaining the economic growth of the Asian lion, which is why its pursuit has become a central issue on the foreign policy agenda. Ecuador, the South American country with the third-largest oil reserves after Venezuela and Brazil, has become a zone of influence for the PRC, and contracts for the sale of oil have been signed through state-owned oil companies. Although the bilateral relationship is asymmetric, the study seeks to establish whether Ecuador is a strategic partner in the region.
Abstract:
Leadership has been defined in different ways by hundreds of authors, depending on the context in which they study the concept. None of these definitions is wrong, but some have gained greater prominence owing to the different factors society faces. In recent years, countries have opened up to different markets, allowing them to remove existing political, economic and cultural barriers. This has forced leaders to reconsider how organizations are led and directed. This is just one example of what has reshaped the concept of leadership, adding the new challenges that leaders face. This undergraduate thesis studies what is considered one of the greatest challenges of the twentieth and twenty-first centuries: globalization. This phenomenon has brought the world closer together through the exchange of information, goods, services, knowledge and, above all, culture, achieved through new technologies, new communication and transport services, science and industrial progress. The new leader must break through national barriers and open up to foreign markets, and to do so must possess certain characteristics that make it possible to understand different markets and the people in them. This thesis identifies what are considered the main characteristics of a global leader, drawn from research by various authors and from previous studies.
Abstract:
The impact of selected observing systems on forecast skill is explored using the European Centre for Medium-Range Weather Forecasts (ECMWF) 40-yr reanalysis (ERA-40) system. Analyses have been produced for a surface-based observing system typical of the period prior to 1945/1950, a terrestrial-based observing system typical of the period 1950-1979, and a satellite-based observing system consisting of surface pressure and satellite observations. Global prediction experiments have been undertaken using these analyses, which are available every 6 h, as initial states for the boreal winters of 1990/1991 and 2000/2001 and the summer of 2000, using a more recent version of the ECMWF model. The results show that for 500-hPa geopotential height, as a representative field, the terrestrial system in the Northern Hemisphere extratropics is only slightly inferior to the control system, which makes use of all observations for the analysis, and is also more accurate than the satellite system. There are indications that the skill of the terrestrial system worsens slightly and the satellite system improves somewhat between 1990/1991 and 2000/2001. The forecast skill in the Southern Hemisphere is dominated by the satellite information, and this dominance is larger in the latter period. The overall skill is only slightly worse than that of the Northern Hemisphere. In the tropics (20°S-20°N), using the wind at 850 and 250 hPa as representative fields, the information content in the terrestrial and satellite systems is almost equal and complementary. The surface-based system has very limited skill, restricted to the lower troposphere of the Northern Hemisphere. Predictability calculations show a potential for a further increase in predictive skill of 1-2 d in the extratropics of both hemispheres, but a potential for a major improvement of many days in the tropics.
As well as the Eulerian perspective of predictability, the storm tracks have been calculated from all experiments and validated for the extratropics to provide a Lagrangian perspective.
Abstract:
Shelf and coastal seas are regions of exceptionally high biological productivity, high rates of biogeochemical cycling and immense socio-economic importance. They are, however, poorly represented by the present generation of Earth system models, both in terms of resolution and process representation. Hence, these models cannot be used to elucidate the role of the coastal ocean in global biogeochemical cycles and the effects global change (both direct anthropogenic and climatic) are having on them. Here, we present a system for simulating all the coastal regions around the world (the Global Coastal Ocean Modelling System) in a systematic and practical fashion. It is based on automatically generating multiple nested model domains, using the Proudman Oceanographic Laboratory Coastal Ocean Modelling System coupled to the European Regional Seas Ecosystem Model. Preliminary results from the system are presented. These demonstrate the viability of the concept, and we discuss the prospects for using the system to explore key areas of global change in shelf seas, such as their role in the carbon cycle and climate change effects on fisheries.
Abstract:
Under global warming, the predicted intensification of the global freshwater cycle will modify the net freshwater flux at the ocean surface. Since the freshwater flux maintains ocean salinity structures, changes to the density-driven ocean circulation are likely. A modified ocean circulation could further alter the climate, potentially allowing rapid changes, as seen in the past. The relevant feedback mechanisms and timescales are poorly understood in detail, however, especially at low latitudes where the effects of salinity are relatively subtle. In an attempt to resolve some of these outstanding issues, we present an investigation of the climate response of the low-latitude Pacific region to changes in freshwater forcing. Initiated from the present-day thermohaline structure, a control run of a coupled ocean-atmosphere general circulation model is compared with a perturbation run in which the net freshwater flux is prescribed to be zero over the ocean. Such an extreme experiment helps to elucidate the general adjustment mechanisms and their timescales. The atmospheric greenhouse gas concentrations are held constant, and we restrict our attention to the adjustment of the upper 1,000 m of the Pacific Ocean between 40°N and 40°S, over 100 years. In the perturbation run, changes to the surface buoyancy, near-surface vertical mixing and mixed-layer depth are established within 1 year. Subsequently, relative to the control run, the surface of the low-latitude Pacific Ocean in the perturbation run warms by an average of 0.6°C, and the interior cools by up to 1.1°C, after a few decades. This vertical re-arrangement of the ocean heat content is shown to be achieved by a gradual shutdown of the heat flux due to isopycnal (i.e. along surfaces of constant density) mixing, the vertical component of which is downwards at low latitudes. This heat transfer depends crucially upon the existence of density-compensating temperature and salinity gradients on isopycnal surfaces. 
The timescale of the thermal changes in the perturbation run is therefore set by the timescale for the decay of isopycnal salinity gradients in response to the eliminated freshwater forcing, which we demonstrate to be around 10-20 years. Such isopycnal heat flux changes may play a role in the response of the low-latitude climate to a future accelerated freshwater cycle. Specifically, the mechanism appears to represent a weak negative sea surface temperature feedback, which we speculate might partially shield from view the anthropogenically-forced global warming signal at low latitudes. Furthermore, since the surface freshwater flux is shown to play a role in determining the ocean's thermal structure, it follows that evaporation and/or precipitation biases in general circulation models are likely to cause sea surface temperature biases.
Abstract:
Simulations of the last 500 yr carried out using the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3) with anthropogenic and natural (solar and volcanic) forcings have been analyzed. Global-mean surface temperature change during the twentieth century is well reproduced. Simulated contributions to global-mean sea level rise during recent decades due to thermal expansion (the largest term) and to mass loss from glaciers and ice caps agree within uncertainties with observational estimates of these terms, but their sum falls short of the observed rate of sea level rise. This discrepancy has been discussed by previous authors; a completely satisfactory explanation of twentieth-century sea level rise is lacking. The model suggests that the apparent onset of sea level rise and glacier retreat during the first part of the nineteenth century was due to natural forcing. The rate of sea level rise was larger during the twentieth century than during the previous centuries because of anthropogenic forcing, but decreasing natural forcing during the second half of the twentieth century tended to offset the anthropogenic acceleration in the rate. Volcanic eruptions cause rapid falls in sea level, followed by recovery over several decades. The model shows substantially less decadal variability in sea level and its thermal expansion component than twentieth-century observations indicate, either because it does not generate sufficient ocean internal variability, or because the observational analyses overestimate the variability.