962 results for data availability


Relevance: 60.00%

Abstract:

In crop insurance, the accuracy with which the insurer quantifies the actual risk depends heavily on the availability of actual yield data. Crop models can be valuable tools for generating data on expected yields for risk assessment when no historical records are available. However, selecting a crop model for a specific objective, location and implementation scale is a difficult task. A look inside the different crop and soil modules to understand how outputs are obtained can facilitate model choice. The objectives of this paper were (i) to assess the usefulness of crop models for crop insurance analysis and design and (ii) to select the most suitable crop model for drought risk assessment in semi-arid regions of Spain. For that purpose, first, a pre-selection of crop models simulating wheat yield under rainfed growing conditions at the field scale was made, and second, four selected models (AquaCrop, CERES-Wheat, CropSyst and WOFOST) were compared in terms of modelling approaches, process descriptions and model outputs. Outputs of the four models for the simulation of winter wheat growth are comparable when water is not limiting, but differences are larger when simulating yields under rainfed conditions. These differences in rainfed yields are mainly related to the dissimilar simulated soil water availability and the assumed linkages with dry matter formation. We concluded that for the simulation of winter wheat growth at the field scale in such semi-arid conditions, CERES-Wheat and CropSyst are preferred. WOFOST is a satisfactory compromise between data availability and complexity when detailed soil data are limited. AquaCrop integrates physiological processes into a few representative parameters, thus reducing the number of inputs, which is an advantage when observed data are scarce. However, the high sensitivity of this model to low water availability limits its use in the region considered. Rather than using ensembles of crop models, we recommend that efforts be concentrated on selecting or rebuilding a model whose approaches better describe the agronomic conditions of the region in which it will be applied. The use of methodologies as complex as crop models involves numerous sources of uncertainty, but these models remain the best tools available to gain insight into such complex agronomic systems.
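
The divergence under rainfed conditions comes down to how each model links simulated soil water availability to dry matter and yield. As a minimal illustration of that kind of linkage (not the formulation of any of the four models compared here), the sketch below applies the FAO-33 yield-response-factor relation, in which relative yield loss is proportional to the relative evapotranspiration deficit; the yield, ET and Ky values are illustrative assumptions.

```python
# Minimal sketch: FAO-33 style water-limitation linkage (illustrative only;
# none of AquaCrop, CERES-Wheat, CropSyst or WOFOST uses exactly this form).

def water_limited_yield(y_potential, eta, etc, ky=1.15):
    """Relative yield loss proportional to relative ET deficit:
       1 - Ya/Yp = Ky * (1 - ETa/ETc)."""
    if etc <= 0:
        raise ValueError("ETc must be positive")
    deficit = max(0.0, 1.0 - eta / etc)
    return y_potential * max(0.0, 1.0 - ky * deficit)

# Example: potential winter wheat yield of 6 t/ha, seasonal ETa/ETc = 0.6
print(water_limited_yield(6.0, eta=300.0, etc=500.0))  # ~3.24 t/ha
```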

Relevance: 60.00%

Abstract:

Funding: Funded by the Scottish Government’s Rural and Environment Science and Analytical Services Division (RESAS, Theme 7: Diet and Health). The funder had no role in study design, data collection and analysis, decision to publish, or preparation of this manuscript. Data Availability: All relevant data are owned by the Aberdeen Maternity and Neonatal Databank. Interested parties may request access to the data by following the instructions at http://www.abdn.ac.uk/iahs/research/obsgynae/amnd/access.php.

Relevance: 60.00%

Abstract:

We thank Karim Gharbi and Urmi Trivedi for their assistance with RNA sequencing, carried out in the GenePool genomics facility (University of Edinburgh). We also thank Susan Fairley and Eduardo De Paiva Alves (Centre for Genome Enabled Biology and Medicine, University of Aberdeen) for help with the initial bioinformatics analysis. We thank Aaron Mitchell for kindly providing the ALS3 mutant, Julian Naglik for the gift of TR146 cells, and Jon Richardson for technical assistance. We thank the Genomics and Bioinformatics core of the Faculty of Health Sciences for Next Generation Sequencing and Bioinformatics support, the Information and Communication Technology Office at the University of Macau for providing access to a High Performance Computer, and Jacky Chan and William Pang for their expert support on the High Performance Computer. Finally, we thank Amanda Veri for generating CaLC2928. M.D.L. is supported by a Sir Henry Wellcome Postdoctoral Fellowship (Wellcome Trust 096072), R.A.F. by a Wellcome Trust-Massachusetts Institute of Technology (MIT) Postdoctoral Fellowship, L.E.C. by a Canada Research Chair in Microbial Genomics and Infectious Disease and by Canadian Institutes of Health Research Grants MOP-119520 and MOP-86452, A.J.P.B. by the UK Biotechnology and Biological Sciences Research Council (BB/F00513X/1) and by the European Research Council (ERC-2009-AdG-249793-STRIFE), K.H.W. by the Science and Technology Development Fund of Macau S.A.R. (FDCT) (085/2014/A2) and the Research and Development Administrative Office of the University of Macau (SRG2014-00003-FHS), and R.T.W. by the Burroughs Wellcome Fund and NIH R15AO094406. Data availability: RNA-sequencing data sets are available at ArrayExpress (www.ebi.ac.uk) under accession code E-MTAB-4075. ChIP-seq data sets are available at the NCBI SRA database (http://www.ncbi.nlm.nih.gov) under accession code SRP071687. The authors declare that all other data supporting the findings of this study are available within the article and its supplementary information files, or from the corresponding author upon request.

Relevance: 60.00%

Abstract:

At the height of the financial crisis, the Western welfare state prevented a repeat of the Great Depression. But there were also suggestions that social policy had contributed to the crisis, particularly by promoting households' access to credit in pursuit of welfare goals. Others claim that it was the withdrawal of state welfare that led to the disaster. Against this background, we propose a systematic way of assessing the relationship between financial markets and public welfare provision. We use structural vector autoregression to establish the causal link and its direction. Two hypotheses about this relationship can be inferred from the literature. First, the notion that welfare states 'decommodify' livelihoods, or that there is an equity-efficiency tradeoff, suggests that welfare states substitute to varying degrees for the insurance and savings products offered by financial markets. By contrast, welfare states may support private interests selectively and/or help markets for household finance function better; the nexus would then be one of complementarity. Our empirical strategy is to spell out the causal mechanisms that could account for a substitutive or complementary relationship and then to see whether advanced econometric techniques find evidence for either of these mechanisms in six OECD countries. We find complementarity between public welfare (spending and tax subsidies) and life insurance markets for four of our six countries, notably even for the United States. Substitution between welfare and finance is the more plausible interpretation for France and the Netherlands, which is surprising. Data availability prevents us from testing the implications for the welfare state's contribution to the crisis directly, but our findings suggest that the welfare state cannot generally be blamed for the financial crisis.
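
The identification in the paper relies on structural VAR; as a rough sketch of the kind of reduced-form analysis underlying it, the snippet below fits a two-variable VAR on hypothetical annual series of public welfare spending and life-insurance premiums and runs a Granger-causality test in each direction. The column names and data file are placeholders, and the authors' actual structural identification scheme is not reproduced here.

```python
# Sketch: reduced-form VAR with Granger-causality tests between public welfare
# spending and life-insurance premiums (column names and data are hypothetical).
import pandas as pd
from statsmodels.tsa.api import VAR

df = pd.read_csv("welfare_finance.csv", index_col="year")  # placeholder file
data = df[["welfare_spending", "life_insurance_premiums"]].diff().dropna()

model = VAR(data)
results = model.fit(maxlags=4, ic="aic")  # lag order chosen by AIC

# Does welfare spending help predict insurance premiums, and vice versa?
print(results.test_causality("life_insurance_premiums", ["welfare_spending"]).summary())
print(results.test_causality("welfare_spending", ["life_insurance_premiums"]).summary())
```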

Relevance: 60.00%

Abstract:

Against the background of high and volatile natural resource prices, uncertain supply prospects, reindustrialization attempts and environmental damage related to resource use, resource efficiency has evolved into a highly debated proposal among academia, policy makers, firms and international financial institutions (IFIs). In 2011, the European Union (EU) declared resource efficiency one of the seven flagship initiatives of its Europe 2020 strategy. This paper contributes to the discussion by assessing its key initiative, the Roadmap to a Resource Efficient Europe (EC 2011 571), along two streams of evaluation. In a first step, resource efficiency is linked to two theoretical frameworks regarding sustainability: (i) the sustainability triangle (consisting of economic, social and ecological dimensions) and (ii) balanced sustainability (combining weak and strong sustainability). Both frameworks are then used to assess the degree to which the Roadmap follows the concept of sustainability. It can be concluded that it partially respects the sustainability triangle as well as balanced sustainability, primarily lacking a social dimension. In a second step, following Steger and Bleischwitz (2009), the impact of resource efficiency on competitiveness as advocated in the Roadmap is empirically evaluated. An Arellano–Bond dynamic panel data model reveals no robust impact of resource efficiency on competitiveness in the EU between 2004 and 2009 – a puzzling result. Further empirical research and enhanced data availability are needed to better understand the impacts of resource efficiency on competitiveness at the macroeconomic, microeconomic and industry levels. In that regard, strengthening the methodologies behind resource indicators seems essential. Last but certainly not least, political will is required to achieve the transition of the EU economy to a resource-efficient future.
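
For reference, the Arellano–Bond (difference GMM) estimator used in this step has the following generic textbook form; the variables are generic (y a competitiveness measure, x including resource efficiency), not the paper's exact specification.

```latex
% Generic Arellano–Bond (difference GMM) setup, textbook form
\begin{align}
  y_{it} &= \alpha\, y_{i,t-1} + \beta' x_{it} + \eta_i + \varepsilon_{it},\\
  \Delta y_{it} &= \alpha\, \Delta y_{i,t-1} + \beta' \Delta x_{it} + \Delta\varepsilon_{it}
    \qquad \text{(first differencing removes the fixed effect } \eta_i\text{)},\\
  \mathbb{E}\!\left[y_{i,t-s}\,\Delta\varepsilon_{it}\right] &= 0
    \quad \text{for } s \ge 2 \qquad \text{(GMM moment conditions from lagged levels).}
\end{align}
```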

Relevance: 60.00%

Abstract:

The Balkan Vegetation Database (BVD; GIVD ID: EU-00-019; http://www.givd.info/ID/EU-00-019) is a regional database that consists of phytosociological relevés from different vegetation types in six countries on the Balkan Peninsula (Albania, Bosnia and Herzegovina, Bulgaria, Kosovo, Montenegro and Serbia). Currently, it contains 9,580 relevés, most of which (78%) are geo-referenced. The database includes digitized relevés from the literature (79%) and unpublished data (21%). Herein we present descriptive statistics about the relevés' attribute information. We developed rules that govern the database, covering data provision, types of data-availability regimes, data requests and terms of use, authorship, and relationships with other databases. The database offers an extensive overview of studies at the local, regional and SE European levels, including information about flora, vegetation and habitats.

Relevance: 60.00%

Abstract:

Identifying cloud interference in satellite-derived data is a critical step toward developing useful remotely sensed products. Most MODIS land products use a combination of the MODIS (MOD35) cloud mask and the 'internal' cloud mask of the surface reflectance product (MOD09) to mask clouds, but there has been little discussion of how these masks differ globally. We calculated global mean cloud frequency for both products for 2009 and found that inflated proportions of observations were flagged as cloudy in the Collection 5 MOD35 product. These erroneously categorized areas were spatially and environmentally non-random and usually occurred over high-albedo land-cover types (such as grassland and savanna) in several regions around the world. Additionally, we found that spatial variability in the processing path applied in the Collection 5 MOD35 algorithm affects the likelihood of a cloudy observation by up to 20% in some areas. These factors result in abrupt transitions in recorded cloud frequency across land-cover and processing-path boundaries, impeding their use for fine-scale spatially contiguous modeling applications. We show that together these artifacts have resulted in significantly decreased and spatially biased data availability for Collection 5 MOD35-derived composite MODIS land products such as land surface temperature (MOD11) and net primary productivity (MOD17). Finally, we compare our results to mean cloud frequency in the new Collection 6 MOD35 product and find that land-cover artifacts have been reduced but not eliminated. Collection 6 thus increases data availability for some regions and land-cover types in MOD35-derived products, but practitioners need to consider how the remaining artifacts might affect their analyses.
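
A minimal sketch of the comparison described here, assuming the MOD35 and MOD09-internal cloud flags have already been decoded into boolean observation stacks (the decoding of the bit-packed QA layers is omitted; the array names and random placeholder data are assumptions, not the authors' processing chain).

```python
# Sketch: per-pixel annual cloud frequency for two cloud masks and their difference.
# `mod35_cloudy` and `mod09_internal_cloudy` are assumed boolean arrays of shape
# (n_observations, n_rows, n_cols), already decoded from the bit-packed QA layers.
import numpy as np

rng = np.random.default_rng(0)
mod35_cloudy = rng.random((365, 100, 100)) < 0.45        # placeholder data
mod09_internal_cloudy = rng.random((365, 100, 100)) < 0.30

freq_mod35 = mod35_cloudy.mean(axis=0)          # fraction of obs flagged cloudy
freq_mod09 = mod09_internal_cloudy.mean(axis=0)
difference = freq_mod35 - freq_mod09            # positive where MOD35 flags more

print("mean MOD35 cloud frequency:", freq_mod35.mean())
print("mean MOD09 internal cloud frequency:", freq_mod09.mean())
print("largest local disagreement:", np.abs(difference).max())
```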

Relevance: 60.00%

Abstract:

Hydrographers have traditionally referred to the nearshore area as the "white ribbon" because of the challenges associated with collecting elevation data in this highly dynamic transitional zone between terrestrial and marine environments. Accordingly, available information in this zone is typically characterised by a range of datasets from disparate sources. In this paper we propose a framework to 'fill' the white ribbon area of a coral reef system by integrating multiple elevation and bathymetric datasets, acquired by a suite of remote-sensing technologies, into a seamless digital elevation model (DEM). The integrated datasets include field-collected GPS elevation points, terrestrial and bathymetric LiDAR, single- and multibeam bathymetry, nautical chart depths and empirically derived bathymetry estimates from optical remote sensing imagery. The proposed framework ranks data reliability internally, thereby avoiding the requirement to quantify absolute error, and results in a high-resolution, seamless product. Nested within this approach is an effective, spatially explicit technique for improving the accuracy of bathymetry estimates derived empirically from optical satellite imagery by modelling the spatial structure of the residuals. The approach was applied to data collected on and around Lizard Island in northern Australia. Collectively, the framework holds promise for filling the white ribbon zone in coastal areas characterised by similar data availability scenarios. The seamless DEM is referenced horizontally to MGA Zone 55 (GDA 1994) and vertically to mean sea level (MSL), and has a spatial resolution of 20 m.
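
A minimal sketch of the reliability-ranking idea, under assumptions not taken from the paper: each dataset is rasterised to a common grid with NaN where it has no coverage, and every cell takes its value from the most reliable layer available there. The layer names, ordering and placeholder values are illustrative.

```python
# Sketch: merge elevation layers by a predefined reliability ranking. Each layer
# is a 2-D array on a common grid with NaN where it has no data; layer order in
# `layers` runs from most to least reliable (the ordering here is an assumption).
import numpy as np

def merge_by_reliability(layers):
    """For every cell, take the value from the most reliable layer that covers it."""
    merged = np.full(layers[0].shape, np.nan)
    for layer in layers:                        # most reliable first
        fill = np.isnan(merged) & ~np.isnan(layer)
        merged[fill] = layer[fill]
    return merged

# Placeholder layers: GPS points > LiDAR > multibeam > satellite-derived bathymetry
grid = (500, 500)
gps, lidar, multibeam, sdb = (np.full(grid, np.nan) for _ in range(4))
sdb[:] = -5.0                                   # satellite estimate everywhere
lidar[:250, :] = -3.0                           # LiDAR covers the northern half
dem = merge_by_reliability([gps, lidar, multibeam, sdb])
print(np.nanmean(dem))
```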

Relevance: 60.00%

Abstract:

Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. The lack of a standardized measurement framework to permit comparisons across diseases, injuries and risk factors, and the failure to systematically evaluate data quality, have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence, the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) that simultaneously accounts for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and region, of over 100 diseases and injuries. National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
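
For reference, the common metric mentioned here combines fatal and non-fatal outcomes; in its simplest undiscounted, age-weight-free form the calculation is:

```latex
% DALY in its simplest undiscounted, age-weight-free form
\begin{align}
  \text{DALY} &= \text{YLL} + \text{YLD},\\
  \text{YLL}  &= N \times L
    \qquad \text{(deaths $\times$ standard life expectancy at age of death)},\\
  \text{YLD}  &= I \times DW \times L_d
    \qquad \text{(incident cases $\times$ disability weight $\times$ average duration)}.
\end{align}
```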

Relevance: 60.00%

Abstract:

This thesis presents the formal definition of a novel Mobile Cloud Computing (MCC) extension of the Networked Autonomic Machine (NAM) framework, a general-purpose conceptual tool for describing large-scale distributed autonomic systems. The introduction of autonomic policies in the MCC paradigm has proved to be an effective technique for increasing the robustness and flexibility of MCC systems. In particular, autonomic policies based on continuous resource and connectivity monitoring help automate context-aware decisions for computation offloading. We have also provided NAM with a formalization in terms of a transformational operational semantics in order to fill the gap between its existing Java implementation, NAM4J, and its conceptual definition. Moreover, we have extended NAM4J by adding several components for managing large-scale autonomic distributed environments. In particular, the middleware allows for the implementation of peer-to-peer (P2P) networks of NAM nodes, and NAM mobility actions have been implemented to enable the migration of code, execution state and data. Within NAM4J, we have designed and developed a component, denoted as the context bus, which is particularly useful in collaborative applications: if replicated on each peer, it instantiates a virtual shared channel that allows nodes to notify and be notified about context events. Regarding autonomic policy management, we have provided NAM4J with a rule engine whose purpose is to allow a system to autonomously determine when offloading is convenient. We have also equipped NAM4J with trust and reputation management mechanisms to make the middleware suitable for applications in which such aspects are of great interest. To this end, we have designed and implemented a distributed framework, denoted as DARTSense, in which no central server is required, as reputation values are stored and updated by participants in a subjective fashion. We have also surveyed the literature on MCC systems. The analysis pointed out that existing MCC models focus on mobile devices and treat the Cloud as a system with unlimited resources. To help fill this gap, we defined a modeling and simulation framework for the design and analysis of MCC systems that encompasses both sides, and we implemented a modular and reusable simulator of the model. We have applied the NAM principles to two application scenarios. First, we defined a hybrid P2P/cloud approach in which components and protocols are autonomically configured according to specific target goals, such as cost-effectiveness, reliability and availability. Merging the P2P and cloud paradigms brings together the advantages of both: high availability, provided by the Cloud, and low cost, obtained by exploiting inexpensive peer resources. As an example, we have shown how the proposed approach can be used to design NAM-based collaborative storage systems that rely on an autonomic policy to decide how to distribute data chunks among peers and the Cloud, according to cost-minimization and data-availability goals. As a second application, we defined an autonomic architecture for decentralized urban participatory sensing (UPS) that bridges sensor networks and mobile systems to improve effectiveness and efficiency. The developed application allows users to retrieve and publish different types of sensed information using the features provided by NAM4J's context bus. Trust and reputation are managed through the DARTSense mechanisms, and the application includes an autonomic policy that detects areas with few contributors and tries to recruit new providers by migrating the code needed for sensing through NAM mobility actions.
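
As an illustration of the kind of rule such an offloading engine might evaluate (a minimal sketch; the names, thresholds and cost model are assumptions, not NAM4J's actual rule-engine interface), offloading can be decided by comparing the estimated local execution time against remote execution plus transfer time, gated by the battery level.

```python
# Sketch of a context-aware offloading rule (hypothetical names and thresholds;
# not the NAM4J rule engine's actual interface).
from dataclasses import dataclass

@dataclass
class Context:
    local_time_s: float        # estimated time to run the task locally
    remote_time_s: float       # estimated time to run it on the cloud/peer
    payload_mb: float          # data to transfer for offloading
    bandwidth_mbps: float      # current uplink bandwidth
    battery_pct: float         # remaining battery

def should_offload(ctx: Context, battery_threshold: float = 30.0) -> bool:
    if ctx.bandwidth_mbps <= 0:
        return False                            # no connectivity: run locally
    transfer_s = (ctx.payload_mb * 8) / ctx.bandwidth_mbps
    remote_total = ctx.remote_time_s + transfer_s
    # Offload when it is faster, or when battery is low and it is not much slower.
    if remote_total < ctx.local_time_s:
        return True
    return ctx.battery_pct < battery_threshold and remote_total < 1.5 * ctx.local_time_s

print(should_offload(Context(10.0, 2.0, 5.0, 20.0, 80.0)))  # True: remote is faster
```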

Relevance: 60.00%

Abstract:

This paper reports on an assessment of an ongoing 6-Sigma program conducted within a UK-based (US-owned) automotive company. It gives an overview of the management of the 6-Sigma programme and the in-house methodology used. The analysis pays particular attention to the financial impact of individual projects. Three projects, chosen from the hundreds that have been completed, are discussed in detail, including which specific techniques were used and how financially successful the projects were. Commentary is also given on the effectiveness of the overall program, along with a critique of how the implementation of 6-Sigma could be managed more effectively in the future. This discussion focuses in particular on issues such as project selection and scoping, financial evaluation and data availability, organisational awareness, commitment and involvement, middle management support, functional variation, and maintaining momentum during the rollout of a lengthy program.

Relevance: 60.00%

Abstract:

The value of knowing about data availability and system accessibility is analyzed through theoretical models of Information Economics. When a user places an inquiry for information, it is important for the user to learn whether the system is inaccessible or the data are unavailable, rather than receiving no response at all. In reality, the system can produce various outcomes: nothing is displayed to the user (e.g., a traffic light that does not operate, a browser that keeps loading, a telephone that is not answered); random noise is displayed (e.g., a traffic light that shows random signals, a browser that returns disorderly results, an automatic voice message that does not clarify the situation); or a special signal indicates that the system is not operating (e.g., a blinking amber light indicating that the traffic light is down, a browser responding that the site is unavailable, a voice message regretting that the service is not available). This article develops a model to assess the value of such information for the user by employing the information structure model prevailing in Information Economics. Examples related to data accessibility in centralized and distributed systems are provided for illustration.
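
A minimal sketch of the underlying idea in standard decision-theoretic terms (the states, actions, payoffs and probabilities below are illustrative assumptions, not the article's actual information-structure model): the value of a status signal is the expected payoff when the user can condition the action on the signal, minus the best expected payoff achievable without it.

```python
# Sketch: expected value of a status signal (illustrative numbers).
# States: system "up" or "down"; actions: "wait" for a response or "use_fallback".
p_up = 0.7
payoff = {                      # payoff[action][state]
    "wait":         {"up": 10.0, "down": -5.0},
    "use_fallback": {"up":  4.0, "down":  4.0},
}

def expected(action, p):        # expected payoff of an action given P(up) = p
    return p * payoff[action]["up"] + (1 - p) * payoff[action]["down"]

# Without a signal: pick the single best action under uncertainty.
value_without = max(expected(a, p_up) for a in payoff)

# With a perfect status signal: pick the best action in each revealed state.
value_with = (p_up * max(payoff[a]["up"] for a in payoff)
              + (1 - p_up) * max(payoff[a]["down"] for a in payoff))

print("value of the status signal:", value_with - value_without)  # 2.7 here
```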

Relevance: 60.00%

Abstract:

Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, the Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied by place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework, which locates disadvantage in old age within political and ideological structures. They also point to the pervasiveness and persistence of gender inequality, as argued by feminist theories of aging. Based on the results of the study, it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to help decision-makers determine where to target their efforts as they seek to address social vulnerability in old age. Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprising eleven unweighted indicators that were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
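
A minimal sketch of the aggregation scheme described in the last paragraph (z-scored indicators, unweighted within domains, three equally weighted domains). The indicator names, their domain assignments and the input file are hypothetical; the ESVI's actual indicators are defined in the study.

```python
# Sketch: composite index built from z-scored indicators aggregated into three
# equally weighted domains (column names and domain assignments are hypothetical).
import pandas as pd

df = pd.read_csv("census_indicators.csv")      # placeholder file, one row per person

domains = {                                    # hypothetical indicator -> domain mapping
    "human":    ["no_schooling", "chronic_illness", "disability"],
    "material": ["no_pension", "poor_housing", "no_utilities", "unemployed"],
    "social":   ["lives_alone", "no_children_nearby", "widowed", "no_community_ties"],
}

domain_scores = {}
for name, cols in domains.items():
    z = (df[cols] - df[cols].mean()) / df[cols].std()   # z-score normalization
    domain_scores[name] = z.mean(axis=1)                # unweighted within the domain

esvi = pd.concat(domain_scores, axis=1).mean(axis=1)    # equal weights across domains
print(esvi.describe())
```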

Relevance: 60.00%

Abstract:

Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off-site or into on-site buildings. The decision to reduce PCC duration requires applying a performance-based methodology to Florida landfills. PCC should be based on whether the landfill is a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of an appropriate PCC period for Florida's landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, Davie Landfill was identified as the case-study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data on leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater relative to maximum contaminant levels (MCLs). In addition, future gas quantity was projected. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment. These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions were proposed. Based on the PCC performance results integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above MCLs and surveys of cap integrity should continue. The parameters that drive longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
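
A minimal sketch of the two screening steps described here, under assumptions not taken from the study: landfill gas generation projected with a generic first-order decay model (the form underlying US EPA's LandGEM), and groundwater parameters screened against their MCLs. The decay constants, waste mass and concentrations are placeholders, not Davie Landfill data.

```python
# Sketch: first-order decay gas projection and MCL screening (placeholder values).
import math

def annual_methane_m3(waste_mass_mg, years_since_placement, k=0.04, l0=100.0):
    """Generic first-order decay model: Q = k * L0 * M * exp(-k * t).
    k [1/yr] and L0 [m3 CH4 / Mg waste] are assumed defaults, not site values."""
    return k * l0 * waste_mass_mg * math.exp(-k * years_since_placement)

# Gas projection for a single waste batch placed 25 years ago
print(f"projected CH4: {annual_methane_m3(500_000, 25):,.0f} m3/yr")

# MCL screening for monitored groundwater parameters (hypothetical measurements)
mcl_ug_per_l = {"benzene": 5.0, "arsenic": 10.0}
measured = {"benzene": 1.2, "arsenic": 14.0}
for analyte, value in measured.items():
    status = "above MCL - continue monitoring" if value > mcl_ug_per_l[analyte] else "below MCL"
    print(f"{analyte}: {value} ug/L ({status})")
```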