73 results for non-global solution
Abstract:
Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e., data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship, with a stronger fit for the spatially aggregated NDVI products (R2 = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R2 = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions. The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
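As a hedged illustration of the two variogram properties this abstract relies on, the sketch below computes an empirical semivariogram for a 2-D NDVI grid and fits an exponential model whose parameters stand in for the sill and a length-scale metric. The smoothed random field, the row-wise lag computation, and the choice of an exponential model are illustrative assumptions, not the authors' actual procedure.

```python
# Minimal sketch (not the authors' code): empirical semivariogram of a
# 2-D NDVI-like grid, plus an exponential model fit whose parameters
# play the roles of the sill and a mean length-scale metric.
import numpy as np
from scipy.ndimage import uniform_filter
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
ndvi = uniform_filter(rng.random((200, 200)), size=9)  # stand-in NDVI scene

def semivariogram(z, max_lag=40):
    """Empirical gamma(h) along image rows for integer pixel lags h."""
    lags = np.arange(1, max_lag + 1)
    gamma = [0.5 * np.mean((z[:, h:] - z[:, :-h]) ** 2) for h in lags]
    return lags, np.array(gamma)

def exp_model(h, sill, a):
    """Exponential variogram model; the practical range is about 3*a."""
    return sill * (1.0 - np.exp(-h / a))

lags, gamma = semivariogram(ndvi)
(sill, a), _ = curve_fit(exp_model, lags, gamma,
                         p0=[gamma.max(), 5.0], bounds=(0, np.inf))
print(f"sill = {sill:.5f}, length scale a = {a:.2f} pixels")
```

Block-averaging the scene to coarser pixels and refitting would trace out the decay of spatial variability with pixel size (data regularization) that the abstract describes.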
Abstract:
Pollination is an essential process in the sexual reproduction of seed plants and a key ecosystem service to human welfare. Animal pollinators decline as a consequence of five major global change pressures: climate change, landscape alteration, agricultural intensification, non-native species, and spread of pathogens. These pressures, which differ in their biotic or abiotic nature and their spatiotemporal scales, can interact in nonadditive ways (synergistically or antagonistically), but are rarely considered together in studies of pollinator and/or pollination decline. Management actions aimed at buffering the impacts of a particular pressure could thereby prove ineffective if another pressure is present. Here, we focus on empirical evidence of the combined effects of global change pressures on pollination, highlighting gaps in current knowledge and future research needs.
Abstract:
We establish Maximum Principles which apply to vectorial approximate minimizers of general integral functionals of the Calculus of Variations. Our main result is a version of the Convex Hull Property. The primary advance compared to results already existing in the literature is that we have dropped the quasiconvexity assumption on the integrand in the gradient term. The lack of weak lower semicontinuity is compensated by introducing a nonlinear convergence technique, based on the approximation of the projection onto a convex set by reflections and on the invariance of the integrand in the gradient term under the Orthogonal Group. Maximum Principles are implied for the relaxed solution in the case of non-existence of minimizers and for minimizing solutions of the Euler–Lagrange system of PDEs.
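For orientation, the Convex Hull Property mentioned above is commonly stated as follows (a schematic formulation under standard notation; the paper's precise hypotheses on the integrand are as described in the abstract):

```latex
% Schematic statement: for an (approximate) minimizer
% u : \Omega \subseteq \mathbb{R}^n \to \mathbb{R}^N of
% E(u) = \int_\Omega f(\nabla u)\, dx, the image of u stays in the
% closed convex hull of its boundary values.
\[
  u(\Omega) \subseteq \overline{\operatorname{co}}\,\bigl( u(\partial\Omega) \bigr).
\]
```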
Abstract:
The role of atmospheric general circulation model (AGCM) horizontal resolution in representing the global energy budget and hydrological cycle is assessed, with the aim of improving the understanding of model uncertainties in simulating the hydrological cycle. We use two AGCMs from the UK Met Office Hadley Centre: HadGEM1-A at resolutions ranging from 270 to 60 km, and HadGEM3-A ranging from 135 to 25 km. The models exhibit a stable hydrological cycle, although one too intense compared to reanalyses and observations. This over-intensity is explained by excess surface shortwave radiation, a common error in general circulation models (GCMs), and is insensitive to resolution. However, as resolution is increased, precipitation decreases over the ocean and increases over the land. This is associated with an increase in atmospheric moisture transport from ocean to land, which shifts the partitioning of moisture fluxes contributing to precipitation over land from local toward non-local moisture sources. The results start to converge at 60-km resolution, which underlines the excessive reliance of the mean hydrological cycle on physical parametrization (local unresolved processes) versus model dynamics (large-scale resolved processes) in the coarser HadGEM1 and HadGEM3 configurations. This finding may be valid for other GCMs, underlining the need to analyze other families of GCMs as such ranges of horizontal resolution become available. Our finding supports the hypothesis that heterogeneity in model parametrization is one of the underlying causes of model disagreement in the Coupled Model Intercomparison Project (CMIP) exercises.
Abstract:
Global wetlands are believed to be climate sensitive, and are the largest natural emitters of methane (CH4). Increased wetland CH4 emissions could act as a positive feedback to future warming. The Wetland and Wetland CH4 Inter-comparison of Models Project (WETCHIMP) investigated our present ability to simulate large-scale wetland characteristics and corresponding CH4 emissions. To ensure inter-comparability, we used a common experimental protocol, driving all models with the same climate and carbon dioxide (CO2) forcing datasets. The WETCHIMP experiments were conducted for model equilibrium states as well as transient simulations covering the last century. Sensitivity experiments investigated model response to changes in selected forcing inputs (precipitation, temperature, and atmospheric CO2 concentration). Ten models participated, covering the spectrum from simple to relatively complex, including models tailored either for regional or global simulations. The models also varied in their methods for calculating wetland size and location: some simulated wetland area prognostically, while others relied on remotely sensed inundation datasets or took an approach intermediate between the two. Four major conclusions emerged from the project. First, the suite of models demonstrates extensive disagreement in simulated wetland areal extent and CH4 emissions, in both space and time. Simple metrics of wetland area, such as the latitudinal gradient, show large variability, principally between models that use inundation dataset information and those that independently determine wetland area. Agreement between the models improves for zonally summed CH4 emissions, but large variation between the models remains. For annual global CH4 emissions, the models vary by ±40% of the all-model mean (190 Tg CH4 yr−1). Second, all models show a strong positive response to increased atmospheric CO2 concentrations (857 ppm) in both CH4 emissions and wetland area. In response to increasing global temperatures (a spatially uniform +3.4 °C), the models on average decreased wetland area and CH4 fluxes, primarily in the tropics, but the magnitude and sign of the response varied greatly. Models were least sensitive to increased global precipitation (a spatially uniform +3.9%), with a consistent small positive response in CH4 fluxes and wetland area. Results from the 20th century transient simulation show that interactions between climate forcings could have strong non-linear effects. Third, we presently lack wetland methane observation datasets adequate for evaluating model fluxes at a spatial scale comparable to model grid cells (commonly 0.5°). This limitation severely restricts our ability to model global wetland CH4 emissions with confidence. Our simulated wetland extents are also difficult to evaluate due to extensive disagreements between wetland mapping and remotely sensed inundation datasets. Fourth, the large range in predicted CH4 emission rates leads to the conclusion that there is both substantial parameter and structural uncertainty in large-scale CH4 emission models, even after uncertainties in wetland areas are accounted for.
Abstract:
Purpose – This paper extends the growing debate about the role of international experience gained through mechanisms other than standard expatriation packages, in particular through the use of short-term assignments. It explores the different forms of short-term assignments (project work, commuter assignments, virtual international working, and development assignments) and the different sets of positive and negative implications these can have for the company and the individuals concerned. The integration-differentiation debate is reflected here, as elsewhere in IHRM, with the company moving towards greater centralization and control of its use of these assignments. Design/methodology/approach – Since the research is exploratory, we adopted a qualitative approach to gain a more in-depth understanding of the realities the corporations and the assignees are facing. The study was implemented through a single case study setting in which the data were collected by interviewing (n=20) line managers, human resource management (HRM) staff, and assignees themselves. In addition, corporate documentation and other materials were reviewed. Findings – The present case study provides evidence about the characteristics of short-term assignments as well as on the management of such assignments. The paper identifies various benefits and challenges involved in the use of short-term assignments from the perspectives of both the company and the assignees. Furthermore, the findings support the view that the recent increase in the popularity of short-term assignments has not been matched by the development of HRM policies for such assignments. Research limitations/implications – As a single case study, limitations in the generalizability of the findings should be kept in mind. More large-scale research evidence is needed around different forms of international assignments beyond standard expatriation in order to fully capture the realities faced by international HRM specialists. Practical implications – The paper identifies many challenges but also benefits of using short-term assignments, and reports in-depth findings on the HR development needs that organizations face when expanding the use of such assignments. Originality/value – Empirical research on short-term assignments is still very limited. The paper therefore provides much-needed in-depth evidence on why such assignments are used, what challenges are involved in their use, and what kinds of HR development needs they entail.
Abstract:
We explore the large spatial variation in the relationship between population density and burned area, using continental-scale Geographically Weighted Regression (GWR) based on 13 years of satellite-derived burned area maps from the Global Fire Emissions Database (GFED) and human population density from the Gridded Population of the World (GPW 2005). Significant relationships are observed over 51.5% of the global land area, and the area affected varies from continent to continent: population density has a significant impact on fire over most of Asia and Africa but is important in explaining fire over < 22% of Europe and Australia. Increasing population density is associated with both increases and decreases in fire. The nature of the relationship depends on land use: increasing population density is associated with increased burned area in rangelands but with decreased burned area in croplands. Overall, the relationship between population density and burned area is non-monotonic: burned area initially increases with population density and then decreases when population density exceeds a threshold, and these thresholds vary regionally. Our study contributes to an improved understanding of how human activities relate to burned area, and should contribute to better estimates of atmospheric emissions from biomass burning.
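To make the GWR idea concrete, the sketch below fits a locally weighted regression of burned area on population density at a single target cell using a Gaussian distance kernel. The synthetic data, the bandwidth, and the log transform are illustrative assumptions; this is not the GFED/GPW processing chain.

```python
# Minimal sketch (illustrative only): one Geographically Weighted
# Regression fit at a single target location, with weights that decay
# with distance from that location.
import numpy as np

rng = np.random.default_rng(1)
coords = rng.uniform(0, 100, size=(500, 2))          # grid-cell centroids (km)
pop = rng.lognormal(2.0, 1.0, size=500)              # population density
burned = 0.3 * np.log1p(pop) + rng.normal(0, 0.1, 500)  # toy burned-area signal

def gwr_fit_at(target, coords, x, y, bandwidth=15.0):
    """Weighted least squares (intercept + slope) at one target location."""
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-0.5 * (d / bandwidth) ** 2)          # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)  # local [intercept, slope]

beta = gwr_fit_at(np.array([50.0, 50.0]), coords, np.log1p(pop), burned)
print("local slope of burned area on log population density:", beta[1])
```

Repeating the fit at every grid cell yields the spatially varying coefficients whose sign changes (rangelands versus croplands) the abstract reports.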
Abstract:
The problem of heat conduction in one-dimensional piecewise homogeneous composite materials is examined by providing an explicit solution of the one-dimensional heat equation in each domain. The location of the interfaces is known, but neither the temperature nor the heat flux is prescribed there. Instead, the physical assumption of their continuity at the interfaces is the only condition imposed. The problem of two semi-infinite domains and that of two finite-sized domains are examined in detail. We also indicate how to extend the solution method to the setting of one finite-sized domain surrounded on both sides by semi-infinite domains, and to that of three finite-sized domains.
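For concreteness, the continuity conditions referred to above take the following standard form at a single interface x = ℓ between materials 1 and 2 (notation assumed here: u_j are the temperatures, α_j the diffusivities, k_j the conductivities):

```latex
% Heat equation in each homogeneous piece, with continuity of temperature
% and of heat flux at the interface x = \ell (the only interface conditions):
\[
  \partial_t u_j = \alpha_j\, \partial_{xx} u_j \quad \text{in } \Omega_j \ (j = 1, 2),
\]
\[
  u_1(\ell, t) = u_2(\ell, t), \qquad
  k_1\, \partial_x u_1(\ell, t) = k_2\, \partial_x u_2(\ell, t).
\]
```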
Abstract:
Although there is strong policy interest in the impacts corresponding to different degrees of climate change, there is so far little consistent empirical evidence of the relationship between climate forcing and impact. This is because the vast majority of impact assessments use emissions-based scenarios with associated socio-economic assumptions, and it is not feasible to infer impacts at other temperature changes by interpolation. This paper presents an assessment of the global-scale impacts of climate change in 2050 corresponding to defined increases in global mean temperature, using spatially explicit impact models representing the water resources, river flooding, coastal, agriculture, ecosystem, and built environment sectors. Pattern-scaling is used to construct climate scenarios associated with specific changes in global mean surface temperature, and a relationship between temperature and sea level is used to construct sea level rise scenarios. Climate scenarios are constructed from 21 climate models to give an indication of the uncertainty between forcing and response. The analysis shows that there is considerable uncertainty in the impacts associated with a given increase in global mean temperature, due largely to uncertainty in the projected regional change in precipitation. This has important policy implications. For some sectors there is evidence of a non-linear relationship between global mean temperature change and impact, due to the changing relative importance of temperature and precipitation change. In the socio-economic sectors considered here, the relationships are reasonably consistent between socio-economic scenarios if impacts are expressed in proportional terms, but there can be large differences in absolute terms. There are a number of caveats to the approach, including the use of pattern-scaling to construct scenarios, the use of one impact model per sector, and the sensitivity of the shape of the relationships between forcing and response to the definition of the impact indicator.
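The pattern-scaling step mentioned above is conventionally written as follows (a schematic form under assumed notation, not an equation quoted from the paper): the local change in a climate variable is approximated as a model-derived, per-degree spatial pattern multiplied by the global mean temperature change.

```latex
% Pattern scaling (schematic): \Delta V is the local change in a climate
% variable at location x and time t, \Delta \bar{T}(t) the change in
% global mean surface temperature, and P(x) the per-degree response
% pattern diagnosed from a climate model simulation.
\[
  \Delta V(x, t) \approx P(x)\, \Delta \bar{T}(t).
\]
```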
Abstract:
Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm in which the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can either be found in real-world distributed applications or be induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error, which allows a further reduction of the communication costs.
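To make the scalability bottleneck concrete, the sketch below shows the straightforward parallel k-means formulation the abstract refers to, with the per-iteration global reduction spelled out. It is a minimal illustration assuming mpi4py and synthetic data; the paper's proposed relaxed formulation is not reproduced here.

```python
# Minimal sketch (assumes mpi4py is available): naive parallel k-means,
# whose per-step global reduction is the scalability bottleneck the
# paper aims to relax.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rng = np.random.default_rng(comm.Get_rank())
local_x = rng.random((1000, 2))               # this rank's share of the data
k = 4
centroids = np.array([[0.2, 0.2], [0.8, 0.2], [0.2, 0.8], [0.8, 0.8]])

for _ in range(10):
    # Assign each local point to its nearest centroid.
    d = np.linalg.norm(local_x[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    # Local partial sums and counts per cluster ...
    sums = np.zeros((k, 2))
    counts = np.zeros(k)
    for j in range(k):
        sums[j] = local_x[labels == j].sum(axis=0)
        counts[j] = (labels == j).sum()
    # ... combined by the global reduction that limits scalability.
    sums = comm.allreduce(sums, op=MPI.SUM)
    counts = comm.allreduce(counts, op=MPI.SUM)
    centroids = sums / np.maximum(counts, 1)[:, None]
```

With a non-uniform (spatially partitioned) data distribution, most centroids attract points from only a few ranks, which is what allows the global allreduce to be replaced by cheaper, more localized communication.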
Abstract:
This paper provides a high-level overview of E-UTRAN interworking and interoperability with existing Third Generation Partnership Project (3GPP) and non-3GPP wireless networks. E-UTRAN access networks (LTE and LTE-A) are currently the latest technologies for 3GPP evolution, specified in Release 8, 9, and beyond. These technologies promise higher throughput and lower latency while also reducing the cost of delivering services to meet subscriber demands. 3GPP offers a direct transition path from current 3GPP UTRAN/GERAN networks to LTE, including seamless handover. Interworking between E-UTRAN and other wireless networks is an option that allows operators to maximize the life of their existing network components before a complete transition to truly 4G networks. Network convergence, backward compatibility, and interoperability are regarded as the next major challenge in the evolution and integration of mobile wireless communications. In this paper, interworking and interoperability between the E-UTRAN Evolved Packet Core (EPC) architecture and 3GPP, 3GPP2, and IEEE-based networks are clearly explained. How the EPC is designed to deliver multimedia and facilitate interworking is also explained, and the seamless handover needed to perform this interworking efficiently is described briefly. This study shows that interoperability and interworking between existing networks and E-UTRAN are highly recommended as an interim solution before the transition to full 4G. Furthermore, wireless operators should develop a clear interoperability and interworking plan for their existing networks before deciding to migrate completely to LTE. Interworking not only provides communication between different wireless networks; in many scenarios it also adds technical enhancements to one or both environments.
South Korean MNEs' international HRM approach: hybridization of global standards and local practices
Abstract:
This paper analyses the international Human Resource Management (HRM) approaches of Korean Multinational Enterprises (MNEs). Through a study of nine major Korean MNEs' approaches to subsidiary HRM, it is argued that the firms pursue hybridization through a blending of localization and global standardization across detailed elements in five broad HRM practice areas. Local discretion is allowed where it does not run counter to global HRM system requirements, and “global best practices” are used as the template for the global standardization of selected HRM elements. This strategic orientation appears to be part of a deliberate response to the “liabilities of origin” borne by firms from non-dominant economies.
Abstract:
Purpose – The purpose of this paper is to highlight the serious limitations of neo-liberal capitalism and to urge a shift to socialized capital before further economic deterioration leads to a succession of global conflicts. Design/methodology/approach – This conceptual paper adopts a macro perspective in presenting an argument on how global financial market integration and capital flow liberalization have led to inadequate market and corporate governance measures. The argument is couched in selected literature and is preceded by a proposed solution – the requirement for socialized capital. The nature of socialized capital is analysed, and the questions that require attention if a paradigm shift from neo-liberal capitalism is to take place are identified. Findings – The need to urgently shift to a new philosophy of capitalism is overwhelming. It is emphasized that capital needs to adopt a socialized identity, supported by investment horizons of 30 years or more. It is argued that non-market intervention (e.g. by the state, NGOs, and civil society) is critical in setting appropriate frameworks within which socialized capital can operate. Research limitations/implications – This is a theoretical paper, which raises questions that require transparent, public debate. Originality/value – The paper presents the case for a fundamental reconsideration of present-day markets, the role of capital, and the influence of elites in determining the public good.
Abstract:
Much has been written about Wall Street and the global financial crisis (GFC). From a fraudulent derivatives market to a contestable culture of banking bonuses, culpability has been examined within the frames of American praxis, namely that of American exceptionalism. This study begins with an exploratory analysis of non-US voices concerning the nature of the causes of the GFC. The analysis provides glimpses of the globalized extent of assumptions shared, but not debated, within the globalization convergence of financial markets as the neo-liberal project. Practical and paradigmatic tensions are revealed in a London-based set of views articulated by senior executives of financial service organizations, the outcome of which is not overly optimistic for any significant change in praxis in the immediate future.
Abstract:
With an aging global population, the number of people living with a chronic illness is expected to increase significantly by 2050. If left unmanaged, chronic illness leads to serious health complications, resulting in poor patient quality of life and a costly time bomb for care providers. If effectively managed, patients with chronic illness tend to live richer and healthier lives, resulting in a less costly total care solution. This chapter considers literature from the areas of technology acceptance and care self-management, which aims to alleviate symptoms and/or the reasons for non-acceptance of care, and thus minimise the risk of long-term complications, which in turn reduces the chance of spiralling health expenditure. By bringing together these areas, the chapter highlights where self-management is failing, so that changes can be made to care in advance of health deterioration.