807 results for decentralised data fusion framework
Abstract:
Insect pollination benefits over three quarters of the world's major crops. There is growing concern that observed declines in pollinators may impact on production and revenues from animal-pollinated crops. Knowing the distribution of pollinators is therefore crucial for estimating their availability to pollinate crops; however, in general, we have an incomplete knowledge of where these pollinators occur. We propose a method to predict geographical patterns of pollination service to crops, novel in two elements: the use of pollinator records rather than expert knowledge to predict pollinator occurrence, and the inclusion of the managed pollinator supply. We integrated a maximum entropy species distribution model (SDM) with an existing pollination service model (PSM) to derive the availability of pollinators for crop pollination. We used nation-wide records of wild and managed pollinators (honey bees) as well as agricultural data from Great Britain. We first calibrated the SDM on a representative sample of bee and hoverfly crop pollinator species, evaluating the effects of different settings on model performance and on its capacity to identify the most important predictors. The importance of the different predictors was better resolved by SDMs derived from simpler functions, with consistent results for bees and hoverflies. We then used the species distributions from the calibrated model to predict pollination service of wild and managed pollinators, using field beans as a test case. The PSM allowed us to spatially characterize the contribution of wild and managed pollinators and also identify areas potentially vulnerable to low pollination service provision, which can help direct local-scale interventions. This approach can be extended to investigate geographical mismatches between crop pollination demand and the availability of pollinators, resulting from environmental change or policy scenarios.
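The coupling described above can be sketched in a few lines (a minimal illustration with made-up rasters and an arbitrary relative weighting, not the actual SDM/PSM used in the study):

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Made-up rasters on a common grid: SDM occurrence probabilities for wild
# pollinators and managed honey-bee hive densities (hypothetical values).
rng = np.random.default_rng(0)
wild_occurrence = rng.random((100, 100))             # P(occurrence) per cell
managed_hives = rng.poisson(0.2, (100, 100)).astype(float)

# Crude pollination-service proxy: pollinators serve fields within a
# foraging window, approximated here by a 5x5 moving-window mean.
wild_supply = uniform_filter(wild_occurrence, size=5)
managed_supply = uniform_filter(managed_hives, size=5)

# Arbitrary relative weight for the managed component.
service = wild_supply + 0.5 * managed_supply

# Flag the lowest-service decile as candidates for local interventions.
low_service = service < np.quantile(service, 0.1)
print(low_service.sum(), "cells flagged for intervention")
```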
Abstract:
A framework for understanding the complexity of cancer development was established by Hanahan and Weinberg in their definition of the hallmarks of cancer. In this review, we consider the evidence that parabens can enable development in human breast epithelial cells of 4/6 of the basic hallmarks, 1/2 of the emerging hallmarks and 1/2 of the enabling characteristics. Hallmark 1: parabens have been measured as present in 99% of human breast tissue samples, possess oestrogenic activity and can stimulate sustained proliferation of human breast cancer cells at concentrations measurable in the breast. Hallmark 2: parabens can inhibit the suppression of breast cancer cell growth by hydroxytamoxifen, and through binding to the oestrogen-related receptor gamma (ERRγ) may prevent its deactivation by growth inhibitors. Hallmark 3: in the 10 nM to 1 μM range, parabens give a dose-dependent evasion of apoptosis in high-risk donor breast epithelial cells. Hallmark 4: long-term exposure (>20 weeks) to parabens leads to increased migratory and invasive activity in human breast cancer cells, properties which are linked to the metastatic process. Emerging hallmark: methylparaben has been shown in human breast epithelial cells to increase mTOR, a key regulator of energy metabolism. Enabling characteristic: parabens can cause DNA damage at high concentrations in the short term, but more work is needed to investigate long-term low doses of mixtures. The ability of parabens to enable multiple cancer hallmarks in human breast epithelial cells provides grounds for regulatory review of the implications of the presence of parabens in human breast tissue.
Abstract:
This article shows how one can formulate the representation problem starting from Bayes’ theorem. The purpose of this article is to raise awareness of the formal solutions, so that approximations can be placed in a proper context. The representation errors appear in the likelihood, and the different possibilities for the representation of reality in model and observations are discussed, including nonlinear representation probability density functions. Specifically, the assumptions needed in the usual procedure to add a representation error covariance to the error covariance of the observations are discussed, and it is shown that, when several sub-grid observations are present, their mean still has a representation error; so-called ‘superobbing’ does not resolve the issue. Connection is made to the off-line or on-line retrieval problem, providing a new simple proof of the equivalence of assimilating linear retrievals and original observations. Furthermore, it is shown how nonlinear retrievals can be assimilated without loss of information. Finally, we discuss how errors in the observation operator model can be treated consistently in the Bayesian framework, connecting to previous work in this area.
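A minimal sketch of the formal setting, assuming Gaussian, mutually independent instrument and representation errors (the notation H, R_o, R_r is ours, introduced for illustration):

```latex
% Bayes' theorem for state x and observation y:
p(x \mid y) \;\propto\; p(y \mid x)\, p(x)
% Observation model with instrument error and representation error:
y = H(x) + \epsilon_o + \epsilon_r, \qquad
\epsilon_o \sim \mathcal{N}(0, R_o), \quad \epsilon_r \sim \mathcal{N}(0, R_r)
% If the two errors are mutually independent and Gaussian, the likelihood is
p(y \mid x) = \mathcal{N}\!\bigl(H(x),\; R_o + R_r\bigr)
```

The last line is the usual recipe of adding a representation-error covariance to the instrument-error covariance; the article's point is that this form only follows under such independence and Gaussianity assumptions, which appear in the likelihood rather than in the prior.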
Abstract:
In the resource-based view, organisations are represented by the sum of their physical, human and organisational assets, resources and capabilities. Operational capabilities maintain the status quo and allow an organisation to execute its existing business. Dynamic capabilities, by contrast, allow an organisation to change this status quo, including changing the operational capabilities themselves. Competitive advantage, in this context, is an effect of continuously developing and reconfiguring these firm-specific assets through dynamic capabilities. Deciding where and how to source the core operational capabilities is a key success factor. Furthermore, developing its dynamic capabilities allows an organisation to effectively manage change in its operational capabilities. Many organisations are asserted to have a high dependency on - as well as a high benefit from - the use of information technology (IT), making it a crucial and overarching resource. Furthermore, the IT function is assigned the role of a change enabler, and so IT sourcing affects the capability of managing business change. IT sourcing means that organisations need to decide how to source their IT capabilities. Outsourcing parts of the IT function will also outsource some of the IT capabilities and therefore some of the business capabilities. As a result, IT sourcing has an impact on the organisation's capabilities and consequently on business success. Finally, a turbulent and fast-moving business environment challenges organisations to manage business change effectively and efficiently. Our research builds on the existing theory of dynamic and operational capabilities by considering the interdependencies between the dynamic capabilities of business change and IT sourcing. Further, it examines the decision-making oversight of these areas as implemented through IT governance. We introduce a new conceptual framework derived from the existing theory and extended through an illustrative case study conducted in a German bank. Under a philosophical paradigm of constructivism, we collected data from eight semi-structured interviews and used additional sources of evidence in the form of annual accounts, strategy papers and IT benchmark reports. We applied an Interpretative Phenomenological Analysis (IPA), from which the superordinate themes for our tentative capabilities framework emerged. An understanding of these interdependencies enables scholars and professionals to improve business success by effectively managing business change and evaluating the impact of IT sourcing decisions on the organisation's operational and dynamic capabilities.
Abstract:
The Environmental Data Abstraction Library (EDAL) provides a modular data management library for bringing new and diverse data types together for visualisation within numerous software packages, including the ncWMS viewing service, which already has very wide international uptake. The structure of EDAL is presented along with examples of its use to compare satellite, model and in situ data types within the same visualisation framework. We emphasize the value of this capability for cross-calibration of datasets and evaluation of model products against observations, including preparation for data assimilation.
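As a generic illustration of the kind of model-versus-in-situ collocation such a framework supports (plain Python expressing the concept, not EDAL's actual Java API; all data here are fabricated):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Fabricated gridded model field on a regular lat/lon grid.
lat = np.linspace(-90.0, 90.0, 181)
lon = np.linspace(-180.0, 179.0, 360)
model_sst = 15.0 + 10.0 * np.cos(np.radians(lat))[:, None] * np.ones((1, 360))

# Interpolator lets us evaluate the model field at arbitrary locations.
interp = RegularGridInterpolator((lat, lon), model_sst)

# Fabricated in-situ observations: (lat, lon, value) triples.
obs = np.array([[50.7, -1.3, 12.1],
                [35.0, 140.0, 21.4]])
model_at_obs = interp(obs[:, :2])

# Collocation differences: the raw material for evaluating model products
# against observations within one framework.
print("model - obs:", model_at_obs - obs[:, 2])
```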
Abstract:
Land cover maps at different resolutions and mapping extents contribute to modeling and support decision-making processes. Because land cover affects and is affected by climate change, it is listed among the 13 terrestrial essential climate variables. This paper describes the generation of a land cover map for Latin America and the Caribbean (LAC) for the year 2008. It was developed in the framework of the project Latin American Network for Monitoring and Studying of Natural Resources (SERENA), which has been developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLaTIF). The SERENA land cover map for LAC integrates: 1) the local expertise of SERENA network members to generate the training and validation data, 2) a methodology for land cover mapping based on decision trees using MODIS time series, and 3) class membership estimates to account for pixel heterogeneity issues. The discrete SERENA land cover product, derived from class memberships, yields an overall accuracy of 84% and includes an additional layer representing the estimated per-pixel confidence. The study demonstrates in detail the use of class memberships to better estimate the area of scarce classes with a scattered spatial distribution. The land cover map is already available as a printed wall map and will be released in digital format in the near future. The SERENA land cover map was produced with a legend and classification strategy similar to those used by the North American Land Change Monitoring System (NALCMS) to generate a land cover map of the North American continent, which will allow the two maps to be combined to generate consistent data across the Americas, facilitating continental monitoring and modeling.
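The class-membership idea for area estimation can be shown with a small sketch (our illustration, not the SERENA processing chain; the pixel size and membership values are made up):

```python
import numpy as np

pixel_area_km2 = 0.25          # made-up pixel size for illustration
# memberships: (n_pixels, n_classes); each row sums to 1
memberships = np.array([
    [0.6, 0.3, 0.1],
    [0.4, 0.5, 0.1],
    [0.2, 0.2, 0.6],
])

# Hard (discrete) estimate: each pixel counted wholly as its argmax class.
hard_labels = memberships.argmax(axis=1)
hard_area = np.bincount(hard_labels, minlength=3) * pixel_area_km2

# Soft estimate: sum the membership fractions per class, so a scarce,
# scattered class present as a minority in many pixels is still counted.
soft_area = memberships.sum(axis=0) * pixel_area_km2

print(hard_area)   # [0.25 0.25 0.25]
print(soft_area)   # [0.3  0.25 0.2 ]
```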
Abstract:
We make use of the Skyrme effective nuclear interaction within the time-dependent Hartree-Fock framework to assess the effect of inclusion of the tensor terms of the Skyrme interaction on the fusion window of the ¹⁶O–¹⁶O reaction. We find that the lower fusion threshold, around the barrier, is quite insensitive to these details of the force, but the higher threshold, above which the nuclei pass through each other, changes by several MeV between different tensor parametrisations. The results suggest that eventually fusion properties may become part of the evaluation or fitting process for effective nuclear interactions.
Abstract:
This paper describes the hydrochemistry of a lowland, urbanised river-system, The Cut in England, using in situ sub-daily sampling. The Cut receives effluent discharges from four major sewage treatment works serving around 190,000 people. These discharges consist largely of treated water, originally abstracted from the River Thames and returned via the water supply network, substantially increasing the natural flow. The hourly water quality data were supplemented by weekly manual sampling with laboratory analysis to check the hourly data and measure further determinands. Mean phosphorus and nitrate concentrations were very high, breaching standards set by EU legislation. Though 56% of the catchment area is agricultural, the hydrochemical dynamics were significantly impacted by effluent discharges which accounted for approximately 50% of the annual P catchment input loads and, on average, 59% of river flow at the monitoring point. Diurnal dissolved oxygen data demonstrated high in-stream productivity. From a comparison of high frequency and conventional monitoring data, it is inferred that much of the primary production was dominated by benthic algae, largely diatoms. Despite the high productivity and nutrient concentrations, the river water did not become anoxic and major phytoplankton blooms were not observed. The strong diurnal and annual variation observed showed that assessments of water quality made under the Water Framework Directive (WFD) are sensitive to the time and season of sampling. It is recommended that specific sampling time windows be specified for each determinand, and that WFD targets should be applied in combination to help identify periods of greatest ecological risk.
Abstract:
We systematically compare the performance of ETKF-4DVAR, 4DVAR-BEN and 4DENVAR with respect to two traditional methods (4DVAR and ETKF) and an ensemble transform Kalman smoother (ETKS) on the Lorenz 1963 model. We specifically investigated this performance with increasing nonlinearity and using a quasi-static variational assimilation algorithm as a comparison. Using the analysis root mean square error (RMSE) as a metric, these methods have been compared considering (1) assimilation window length and observation interval size and (2) ensemble size, to investigate the influence of hybrid background error covariance matrices and nonlinearity on the performance of the methods. For short assimilation windows with close-to-linear dynamics, all hybrid methods show an improvement in RMSE compared to the traditional methods. For long assimilation window lengths, in which nonlinear dynamics are substantial, the variational framework can have difficulties finding the global minimum of the cost function, so we explore a quasi-static variational assimilation (QSVA) framework. Of the hybrid methods, it is seen that, under certain parameters, hybrid methods which do not use a climatological background error covariance do not need QSVA to perform accurately. Generally, results show that the ETKS, and the hybrid methods that do not use a climatological background error covariance matrix combined with QSVA, outperform all other methods due to the full flow dependency of the background error covariance matrix, which also allows for the most nonlinearity.
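For context, the Lorenz 1963 testbed and the analysis RMSE metric can be sketched in a few lines (standard parameter values; the "analysis" array is a placeholder standing in for the output of any of the assimilation methods compared):

```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz63(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classic Lorenz (1963) system with the standard parameter values."""
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Generate a "truth" trajectory for an identical-twin style comparison.
sol = solve_ivp(lorenz63, (0.0, 10.0), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0.0, 10.0, 1001))
truth = sol.y.T                                   # shape (1001, 3)

# Placeholder "analysis": truth plus noise, standing in for a DA result.
rng = np.random.default_rng(1)
analysis = truth + rng.normal(scale=0.1, size=truth.shape)

rmse = np.sqrt(np.mean((analysis - truth) ** 2))  # analysis RMSE metric
print(f"analysis RMSE: {rmse:.3f}")
```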
Abstract:
Operational forecasting centres are currently developing data assimilation systems for coupled atmosphere-ocean models. Strongly coupled assimilation, in which a single assimilation system is applied to a coupled model, presents significant technical and scientific challenges. Hence weakly coupled assimilation systems are being developed as a first step, in which the coupled model is used to compare the current state estimate with observations, but corrections to the atmosphere and ocean initial conditions are then calculated independently. In this paper we provide a comprehensive description of the different coupled assimilation methodologies in the context of four-dimensional variational assimilation (4D-Var) and use an idealised framework to assess the expected benefits of moving towards coupled data assimilation. We implement an incremental 4D-Var system within an idealised single-column atmosphere-ocean model. The system has the capability to run strongly coupled, weakly coupled, and uncoupled atmosphere-only or ocean-only assimilations, thus allowing a systematic comparison of the different strategies for treating the coupled data assimilation problem. We present results from a series of identical twin experiments devised to investigate the behaviour and sensitivities of the different approaches. Overall, our study demonstrates the potential benefits that may be expected from coupled data assimilation. When compared to uncoupled initialisation, coupled assimilation is able to produce more balanced initial analysis fields, thus reducing initialisation shock and its impact on the subsequent forecast. Single-observation experiments demonstrate how coupled assimilation systems are able to pass information between the atmosphere and ocean and therefore use near-surface data to greater effect. We show that much of this benefit may also be gained from a weakly coupled assimilation system, but that this can be sensitive to the parameters used in the assimilation.
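As a reference for what an incremental 4D-Var system minimises, a standard form of the cost function is (textbook notation, not quoted from the paper):

```latex
J(\delta x_0) \;=\; \tfrac{1}{2}\,\delta x_0^{\mathrm{T}} \mathbf{B}^{-1} \delta x_0
\;+\; \tfrac{1}{2} \sum_{i=0}^{N}
  \bigl(\mathbf{H}_i \mathbf{M}_i \,\delta x_0 - d_i\bigr)^{\mathrm{T}}
  \mathbf{R}_i^{-1}
  \bigl(\mathbf{H}_i \mathbf{M}_i \,\delta x_0 - d_i\bigr)
```

Here δx₀ is the increment to the background state, M_i and H_i the tangent-linear model and observation operators, d_i the innovations, and B and R_i the background and observation error covariances. In a strongly coupled system B contains atmosphere-ocean cross-covariances, so an ocean observation can correct the atmospheric state within the assimilation itself; in a weakly coupled system B is block-diagonal and the two increments are computed independently, matching the description above.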
Abstract:
The evaluation of forecast performance plays a central role both in the interpretation and use of forecast systems and in their development. Different evaluation measures (scores) are available, often quantifying different characteristics of forecast performance. The properties of several proper scores for probabilistic forecast evaluation are contrasted and then used to interpret decadal probability hindcasts of global mean temperature. The Continuous Ranked Probability Score (CRPS), the Proper Linear (PL) score, and I.J. Good’s logarithmic score (also referred to as Ignorance) are compared; although information from all three may be useful, the logarithmic score has an immediate interpretation and, unlike the other two, is not insensitive to forecast busts. Neither CRPS nor PL is local; this is shown to produce counter-intuitive evaluations by CRPS. Benchmark forecasts from empirical models like Dynamic Climatology place the scores in context. Comparing scores for forecast systems based on physical models (in this case HadCM3, from the CMIP5 decadal archive) against such benchmarks is more informative than comparing forecast systems based on similar physical simulation models only with each other. It is shown that a forecast system based on HadCM3 outperforms Dynamic Climatology in decadal global mean temperature hindcasts; Dynamic Climatology previously outperformed a forecast system based upon HadGEM2, and reasons for these results are suggested. Forecasts of aggregate data (5-year means of global mean temperature) are, of course, narrower than forecasts of annual averages due to the suppression of variance; while the average “distance” between the forecasts and a target may be expected to decrease, little if any discernible improvement in probabilistic skill is achieved.
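For reference, two of the scores discussed can be written as follows, where y is the verifying observation and p and F are the forecast density and cumulative distribution function (standard definitions, not quoted from the paper):

```latex
\mathrm{IGN}(p, y) \;=\; -\log_2 p(y), \qquad
\mathrm{CRPS}(F, y) \;=\; \int_{-\infty}^{\infty}
  \bigl(F(x) - \mathbf{1}\{x \ge y\}\bigr)^{2}\, dx
```

Ignorance depends only on the probability assigned to the outcome that actually occurred, which makes it a local score; CRPS integrates over the entire forecast distribution, which is the non-locality the paper links to counter-intuitive evaluations.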
Abstract:
A method is proposed for merging different nadir-sounding climate data records using measurements from high-resolution limb sounders to provide a transfer function between the different nadir measurements. The two nadir-sounding records need not be overlapping so long as the limb-sounding record bridges between them. The method is applied to global-mean stratospheric temperatures from the NOAA Climate Data Records based on the Stratospheric Sounding Unit (SSU) and the Advanced Microwave Sounding Unit-A (AMSU), extending the SSU record forward in time to yield a continuous data set from 1979 to present, and providing a simple framework for extending the SSU record into the future using AMSU. SSU and AMSU are bridged using temperature measurements from the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS), which is of high enough vertical resolution to accurately represent the weighting functions of both SSU and AMSU. For this application, a purely statistical approach is not viable since the different nadir channels are not sufficiently linearly independent, statistically speaking. The near-global-mean linear temperature trends for extended SSU for 1980–2012 are −0.63 ± 0.13, −0.71 ± 0.15 and −0.80 ± 0.17 K decade⁻¹ (95 % confidence) for channels 1, 2 and 3, respectively. The extended SSU temperature changes are in good agreement with those from the Microwave Limb Sounder (MLS) on the Aura satellite, with both exhibiting a cooling trend of ~0.6 ± 0.3 K decade⁻¹ in the upper stratosphere from 2004 to 2012. The extended SSU record is found to be in agreement with high-top coupled atmosphere–ocean models over the 1980–2012 period, including the continued cooling over the first decade of the 21st century.
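The bridging idea can be made concrete with the standard weighting-function picture (schematic, in our notation: w_c is a channel's weighting function and T_MIPAS(z) the limb-retrieved temperature profile):

```latex
T_c \;=\; \int_{0}^{\infty} w_c(z)\, T_{\mathrm{MIPAS}}(z)\, dz,
\qquad c \in \{\text{SSU channels},\ \text{AMSU channels}\}
```

Because the same vertically resolved profile can be convolved with both the SSU and the AMSU weighting functions, the relation between corresponding channels can be estimated even without a direct overlap period between the two nadir records.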
Abstract:
We propose a geoadditive negative binomial model (Geo-NB-GAM) for regional count data that allows us to address simultaneously some important methodological issues, such as spatial clustering, nonlinearities, and overdispersion. This model is applied to the study of location determinants of inward greenfield investments that occurred during 2003–2007 in 249 European regions. After presenting the data set and showing the presence of overdispersion and spatial clustering, we review the theoretical framework that motivates the choice of the location determinants included in the empirical model, and we highlight some reasons why the relationship between some of the covariates and the dependent variable might be nonlinear. The subsequent section first describes the solutions proposed by previous literature to tackle spatial clustering, nonlinearities, and overdispersion, and then presents the Geo-NB-GAM. The empirical analysis shows the good performance of Geo-NB-GAM. Notably, the inclusion of a geoadditive component (a smooth spatial trend surface) permits us to control for spatial unobserved heterogeneity that induces spatial clustering. Allowing for nonlinearities reveals, in keeping with theoretical predictions, that the positive effect of agglomeration economies fades as the density of economic activities reaches some threshold value. However, no matter how dense the economic activity becomes, our results suggest that congestion costs never overcome positive agglomeration externalities.
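Schematically, the model described above has the following structure (our notation, with s_i the spatial coordinates of region i and θ the negative binomial dispersion parameter):

```latex
y_i \sim \mathrm{NB}(\mu_i, \theta), \qquad
\log \mu_i \;=\; \beta_0 \;+\; \sum_{j} f_j(x_{ij}) \;+\; f_{\mathrm{spat}}(s_i)
```

The dispersion parameter absorbs overdispersion relative to the Poisson, the smooth terms f_j capture nonlinear covariate effects such as the fading agglomeration effect, and the smooth spatial trend surface f_spat controls for the spatially clustered unobserved heterogeneity.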
Abstract:
This paper investigates the potential of fusion at the normalisation/segmentation level, prior to feature extraction. While there are several biometric fusion methods at the data/feature level, score level and rank/decision level, combining raw biometric signals, scores, or ranks/decisions, this type of fusion is still in its infancy. However, the increasing demand to allow for more relaxed and less invasive recording conditions, especially for on-the-move iris recognition, suggests that fusion at this very low level merits further investigation. This paper focuses on the approach of multi-segmentation fusion for iris biometric systems, investigating the benefit of combining the segmentation results of multiple normalisation algorithms, using four methods from two different public iris toolkits (USIT, OSIRIS) on the public CASIA and IITD iris datasets. Evaluations based on recognition accuracy and ground truth segmentation data indicate high sensitivity with regard to the type of errors made by segmentation algorithms.
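One simple instance of segmentation-level fusion is a pixel-wise majority vote over the binary iris masks produced by different segmentation algorithms (our illustrative sketch of the general idea, not the paper's specific method):

```python
import numpy as np

def fuse_masks(masks):
    """Pixel-wise majority vote over equally shaped binary masks.

    masks: list of uint8/bool arrays (1 = iris, 0 = non-iris).
    A pixel is labelled iris if at least half the algorithms agree.
    """
    stack = np.stack([m.astype(np.uint8) for m in masks])  # (n, H, W)
    votes = stack.sum(axis=0)
    return (votes * 2 >= stack.shape[0]).astype(np.uint8)

# Example: three 4x4 masks from hypothetical segmenters.
m1 = np.zeros((4, 4), np.uint8); m1[1:3, 1:3] = 1
m2 = np.zeros((4, 4), np.uint8); m2[1:4, 1:3] = 1
m3 = np.zeros((4, 4), np.uint8); m3[0:3, 1:4] = 1
print(fuse_masks([m1, m2, m3]))
```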
Abstract:
Purpose: We study particular structural and organisational factors affecting the formality of human resource management (HRM) practices in small and medium-sized enterprises (SMEs) in South-Eastern European (SEE) post-communist countries, in particular Serbia, Romania, Bulgaria and the Former Yugoslav Republic of Macedonia (FYROM), in order to understand the antecedents of formalisation in such settings.
Design/methodology/approach: Adopting a quantitative approach, this study analyses data gathered through a survey of 168 managers of SMEs from throughout the region.
Findings: The results show that HRM in SMEs in the SEE region can be understood through a three-fold framework comprising the degree of internationalisation, the sector, and the organisational size of SMEs. These three factors positively affect the level of HRM formalisation in SEE SMEs. These findings are further attributed to the particular political and economic context of the post-communist SEE region.
Research limitations/implications: Although specific criteria were set for SME selection, we do not suggest that the study reflects a representative picture of the SEE region because we used a purposive sampling methodology.
Practical implications: This article provides useful insights into the factors which influence HRM in SMEs in a particular context. The findings can help business owners and managers understand how HRM can be applied in smaller organisations, particularly in post-communist SEE business contexts.
Originality/value: HRM in SMEs in this region has hardly been studied at all despite its importance. Therefore, this exploratory research seeks to expand knowledge of the application of HRM in SMEs in SEE countries, whose business environments are dominated by different dynamics from western European ones.