960 results for Data Availability


Relevance:

60.00%

Publisher:

Abstract:

Hydrographers have traditionally referred to the nearshore area as the "white ribbon" area due to the challenges associated with the collection of elevation data in this highly dynamic transitional zone between terrestrial and marine environments. Accordingly, available information in this zone is typically characterised by a range of datasets from disparate sources. In this paper we propose a framework to 'fill' the white ribbon area of a coral reef system by integrating multiple elevation and bathymetric datasets, acquired by a suite of remote-sensing technologies, into a seamless digital elevation model (DEM). A range of datasets are integrated, including field-collected GPS elevation points, terrestrial and bathymetric LiDAR, single- and multibeam bathymetry, nautical chart depths and empirically derived bathymetry estimates from optical remote-sensing imagery. The proposed framework ranks data reliability internally, thereby avoiding the requirement to quantify absolute error, and results in a high-resolution, seamless product. Nested within this approach is an effective, spatially explicit technique for improving the accuracy of bathymetry estimates derived empirically from optical satellite imagery by modelling the spatial structure of the residuals. The approach was applied to data collected on and around Lizard Island in northern Australia. Collectively, the framework holds promise for filling the white ribbon zone in coastal areas characterised by similar data availability scenarios. The seamless DEM is referenced to the MGA Zone 55 - GDA 1994 horizontal coordinate system and the mean sea level (MSL) vertical datum, and has a spatial resolution of 20 m.
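The residual-modelling step described above can be sketched as follows. This is an illustrative stand-in, not the paper's implementation: plain inverse-distance weighting replaces the paper's spatial-structure model, and the control points are invented.

```python
# Illustrative sketch: correct an empirically derived bathymetry estimate with
# residuals (observed minus empirical depth) interpolated from control points.
# Inverse-distance weighting stands in for the paper's residual spatial model.

def idw(x, y, pts, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from [(px, py, value), ...]."""
    num = den = 0.0
    for px, py, v in pts:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0:
            return v                       # exactly on a control point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

def correct_depth(x, y, empirical_depth, control_pts):
    """control_pts: [(px, py, residual)] with residual = observed - empirical."""
    return empirical_depth + idw(x, y, control_pts)

# Residuals at three hypothetical sonar control points (metres)
controls = [(0.0, 0.0, 0.5), (10.0, 0.0, -0.3), (0.0, 10.0, 0.1)]
corrected = correct_depth(1.0, 1.0, empirical_depth=-7.2, control_pts=controls)
```

The correction pulls the imagery-derived depth toward nearby direct observations, which is the intuition behind modelling the spatial structure of residuals.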

Relevance:

60.00%

Publisher:

Abstract:

Reliable, comparable information about the main causes of disease and injury in populations, and how these are changing, is a critical input for debates about priorities in the health sector. Traditional sources of information about the descriptive epidemiology of diseases, injuries and risk factors are generally incomplete, fragmented and of uncertain reliability and comparability. Lack of a standardized measurement framework to permit comparisons across diseases and injuries, as well as risk factors, and failure to systematically evaluate data quality have impeded comparative analyses of the true public health importance of various conditions and risk factors. As a consequence the impact of major conditions and hazards on population health has been poorly appreciated, often leading to a lack of public health investment. Global disease and risk factor quantification improved dramatically in the early 1990s with the completion of the first Global Burden of Disease Study. For the first time, the comparative importance of over 100 diseases and injuries, and ten major risk factors, for global and regional health status could be assessed using a common metric (Disability-Adjusted Life Years) which simultaneously accounted for both premature mortality and the prevalence, duration and severity of the non-fatal consequences of disease and injury. As a consequence, mental health conditions and injuries, for which non-fatal outcomes are of particular significance, were identified as being among the leading causes of disease/injury burden worldwide, with clear implications for policy, particularly prevention. A major achievement of the Study was the complete global descriptive epidemiology, including incidence, prevalence and mortality, by age, sex and Region, of over 100 diseases and injuries. 
National applications, further methodological research and an increase in data availability have led to improved national, regional and global estimates for 2000, but substantial uncertainty around the disease burden caused by major conditions, including HIV, remains. The rapid implementation of cost-effective data collection systems in developing countries is a key priority if global public policy to promote health is to be more effectively informed.
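The DALY metric described above combines premature mortality with the prevalence, duration and severity of non-fatal outcomes. A minimal sketch of the arithmetic, with a hypothetical condition and invented figures rather than GBD estimates:

```python
# DALY = YLL (years of life lost to premature mortality)
#      + YLD (years lived with disability), the common metric of the GBD Study.
# All numbers below are hypothetical, for illustration only.

def yll(deaths, std_life_expectancy_at_death):
    """Years of life lost: deaths times remaining standard life expectancy."""
    return deaths * std_life_expectancy_at_death

def yld(incident_cases, disability_weight, avg_duration_years):
    """Years lived with disability: cases times severity weight times duration."""
    return incident_cases * disability_weight * avg_duration_years

def daly(deaths, life_exp, cases, dw, duration):
    return yll(deaths, life_exp) + yld(cases, dw, duration)

# Hypothetical condition: 1,000 deaths at ~30 years remaining life expectancy,
# 50,000 non-fatal cases with disability weight 0.2 lasting 5 years on average
burden = daly(deaths=1_000, life_exp=30, cases=50_000, dw=0.2, duration=5)
```

Because YLD enters on the same scale as YLL, conditions with mostly non-fatal outcomes (mental health, injuries) register substantial burden, which is exactly the Study's headline finding.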

Relevance:

60.00%

Publisher:

Abstract:

This thesis presents the formal definition of a novel Mobile Cloud Computing (MCC) extension of the Networked Autonomic Machine (NAM) framework, a general-purpose conceptual tool which describes large-scale distributed autonomic systems. The introduction of autonomic policies in the MCC paradigm has proved to be an effective technique to increase the robustness and flexibility of MCC systems. In particular, autonomic policies based on continuous resource and connectivity monitoring help automate context-aware decisions for computation offloading. We have also provided NAM with a formalization in terms of a transformational operational semantics in order to fill the gap between its existing Java implementation NAM4J and its conceptual definition. Moreover, we have extended NAM4J by adding several components with the purpose of managing large scale autonomic distributed environments. In particular, the middleware allows for the implementation of peer-to-peer (P2P) networks of NAM nodes. Moreover, NAM mobility actions have been implemented to enable the migration of code, execution state and data. Within NAM4J, we have designed and developed a component, denoted as context bus, which is particularly useful in collaborative applications in that, if replicated on each peer, it instantiates a virtual shared channel allowing nodes to notify and get notified about context events. Regarding the autonomic policies management, we have provided NAM4J with a rule engine, whose purpose is to allow a system to autonomously determine when offloading is convenient. We have also provided NAM4J with trust and reputation management mechanisms to make the middleware suitable for applications in which such aspects are of great interest. To this purpose, we have designed and implemented a distributed framework, denoted as DARTSense, where no central server is required, as reputation values are stored and updated by participants in a subjective fashion. 
We have also investigated the literature regarding MCC systems. The analysis pointed out that all MCC models focus on mobile devices and treat the Cloud as a system with unlimited resources. To help fill this gap, we defined a modeling and simulation framework for the design and analysis of MCC systems that encompasses both the mobile and the Cloud side. We have also implemented a modular and reusable simulator of the model. We have applied the NAM principles to two different application scenarios. First, we have defined a hybrid P2P/cloud approach where components and protocols are autonomically configured according to specific target goals, such as cost-effectiveness, reliability and availability. Merging the P2P and cloud paradigms brings together the advantages of both: high availability, provided by the Cloud presence, and low cost, obtained by exploiting inexpensive peer resources. As an example, we have shown how the proposed approach can be used to design NAM-based collaborative storage systems relying on an autonomic policy to decide how to distribute data chunks among peers and the Cloud, according to cost minimization and data availability goals. As a second application, we have defined an autonomic architecture for decentralized urban participatory sensing (UPS) which bridges sensor networks and mobile systems to improve effectiveness and efficiency. The developed application allows users to retrieve and publish different types of sensed information by using the features provided by NAM4J's context bus. Trust and reputation are managed through the DARTSense mechanisms. The application also includes an autonomic policy that detects areas characterized by few contributors and tries to recruit new providers by migrating the code necessary for sensing, through NAM mobility actions.
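An autonomic offloading rule of the kind the thesis describes (continuous resource and connectivity monitoring feeding a rule engine) can be sketched as below. The cost model, thresholds and function names are my illustrative assumptions, not NAM4J's actual API.

```python
# Minimal sketch of a context-aware computation-offloading policy: offload
# when the remote path (payload transfer + remote execution) is expected to
# beat local execution, and only while connectivity is available.

def should_offload(local_exec_s, remote_exec_s, payload_mb,
                   bandwidth_mbps, link_up):
    """Decide offloading from monitored estimates (all values hypothetical)."""
    if not link_up:
        return False                               # no connectivity: run locally
    transfer_s = payload_mb * 8 / bandwidth_mbps   # MB -> megabits -> seconds
    return transfer_s + remote_exec_s < local_exec_s

# Hypothetical readings from resource/connectivity monitoring
decision = should_offload(local_exec_s=12.0, remote_exec_s=2.0,
                          payload_mb=10.0, bandwidth_mbps=20.0, link_up=True)
```

Re-evaluating such a rule as monitored values change is what makes the policy autonomic: the same task may be offloaded on a fast link and kept local on a slow one.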

Relevance:

60.00%

Publisher:

Abstract:

This paper reports on an assessment of an ongoing 6-Sigma program conducted within a UK-based (US-owned) automotive company. It gives an overview of the management of the 6-Sigma programme and the in-house methodology used. The analysis given in the paper pays particular attention to the financial impact that individual projects have had. Three projects, chosen from the hundreds that have been completed, are discussed in detail, including which specific techniques were used and how financially successful the projects were. Commentary is also given on the effectiveness of the overall program, along with a critique of how the implementation of 6-Sigma could be more effectively managed in the future. This discussion particularly focuses upon issues such as: project selection and scoping, financial evaluation and data availability, organisational awareness, commitment and involvement, middle management support, functional variation, and maintaining momentum during the rollout of a lengthy program.
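The paper's own project figures are not reproduced here; as generic context for how 6-Sigma projects are typically tracked, the defects-per-million-opportunities (DPMO) arithmetic behind the methodology is a one-liner. All numbers below are invented.

```python
# Generic 6-Sigma bookkeeping (not data from the paper): DPMO scales the
# observed defect rate to one million opportunities.

def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities for a process sample."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Hypothetical line: 25 defects across 5,000 units with 4 opportunities each
level = dpmo(defects=25, units=5_000, opportunities_per_unit=4)
```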

Relevance:

60.00%

Publisher:

Abstract:

The value of knowing about data availability and system accessibility is analyzed through theoretical models of Information Economics. When a user places an inquiry for information, it is important for the user to learn whether the system is inaccessible or the data are unavailable, rather than to receive no response at all. In reality, the system can produce various outcomes: nothing is displayed to the user (e.g., a traffic light that does not operate, a browser that keeps loading, a telephone that is not answered); random noise is displayed (e.g., a traffic light that displays random signals, a browser that returns disorderly results, an automatic voice message that does not clarify the situation); or a special signal indicates that the system is not operating (e.g., a blinking amber light indicating that the traffic light is down, a browser responding that the site is unavailable, a voice message regretting that the service is not available). This article develops a model to assess the value of this information for the user by employing the information structure model prevailing in Information Economics. Examples related to data accessibility in centralized and in distributed systems are provided for illustration.
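The intuition can be made concrete with a toy expected-cost calculation (my own numbers and cost model, not the article's information-structure formalism): an explicit "system down" signal has value because it lets the user skip a fruitless wait.

```python
# Toy model: with probability p_down the system is down. An informed user
# (explicit unavailability signal) pays only the fallback cost; an uninformed
# user first waits out a timeout, then falls back. All costs are hypothetical.

def expected_cost(p_down, wait_cost, fallback_cost, informed):
    """Expected cost of an inquiry under the toy model above."""
    if informed:
        return p_down * fallback_cost
    return p_down * (wait_cost + fallback_cost)

p = 0.1                                    # probability the system is down
value_of_signal = (expected_cost(p, wait_cost=30.0, fallback_cost=5.0, informed=False)
                   - expected_cost(p, wait_cost=30.0, fallback_cost=5.0, informed=True))
```

The difference in expected cost is the value of the signal, which is the quantity the article's model prices more generally.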

Relevance:

60.00%

Publisher:

Abstract:

Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, the Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework, which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality as argued by feminist theories of aging. Based on the results of the study it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has proved useful as a tool for theoretical analysis and has demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age.
Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, comprising eleven unweighted indicators, which were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
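The index construction just described (z-score normalization, unweighted indicators averaged within domains, three equally weighted domains linearly aggregated) can be sketched directly. Indicator values below are hypothetical, not census data.

```python
# Sketch of the ESVI-style aggregation: z-score normalize indicators, then
# average within each of three equally weighted domains and across domains.
from statistics import mean, pstdev

def zscores(values):
    """Standardize a list of raw indicator values across the population."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

def esvi(domains):
    """domains: three lists of z-scored indicator values for one person."""
    return mean(mean(d) for d in domains)

# Hypothetical respondent: z-scored indicators per domain
# (human, material, social resources)
score = esvi([[0.5, -0.2], [1.1, 0.3, 0.4], [-0.6]])
```

Equal domain weights mean a person's score is driven by domain averages, so a single extreme indicator cannot dominate the composite.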

Relevance:

60.00%

Publisher:

Abstract:

Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires applying a performance-based methodology to Florida landfills. PCC should be based on whether the landfill poses a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of an appropriate PCC period for Florida's landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, Davie Landfill was identified as a case study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data for leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater relative to the maximum contaminant level (MCL). In addition, the future gas quantity was projected. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment.
These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions were proposed. Based on the results of the PCC performance evaluation integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above the MCL and surveying of cap integrity should be continued. The parameters that cause longer monitoring periods can be eliminated for future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
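A trend projection of the kind described (pollutant levels in leachate checked against the MCL) can be illustrated with a first-order-decay assumption. The decay model, rate constant and concentrations below are invented, not Davie Landfill data.

```python
# Illustrative projection: if a leachate pollutant decays roughly first-order,
# c(t) = c0 * exp(-k t), the years until it falls below its MCL are closed-form.
import math

def years_to_mcl(c0, mcl, k):
    """Years until c(t) = c0 * exp(-k t) drops below the MCL (0 if already below)."""
    if c0 <= mcl:
        return 0.0
    return math.log(c0 / mcl) / k

# Hypothetical: 0.050 mg/L today, MCL 0.010 mg/L, fitted decay rate 0.08 / year
t = years_to_mcl(c0=0.050, mcl=0.010, k=0.08)
```

A projection like this is what lets individual parameters be retired from monitoring once they are credibly below the MCL, while slower-decaying parameters keep their monitoring requirement.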

Relevance:

60.00%

Publisher:

Abstract:

This research study was designed to examine the relationship between globalization, as measured by the KOF index, its related forces (economic, political, cultural and technological), and the public provision of higher education. This study is important since globalization is increasingly being associated with changes in critical aspects of higher education. The public provision of education was measured by government expenditure and by educational outcomes, that is, participation, gender equity and attainment. The study utilized a non-experimental quantitative research design. Data collected from secondary sources for 139 selected countries were analyzed. The countries were geographically distributed and included both developed and developing countries. The choice of countries for inclusion in the study was based on data availability. The data, sourced from international organizations such as the United Nations and the World Bank, were examined for different time periods using five-year averages. The period covered was 1970 to 2009. The relationship between globalization and the higher education variables was examined using cross-sectional regression analysis while controlling for economic, political and demographic factors. The major findings of the study are as follows. Of the two spending models, only one revealed a significant relationship between globalization and education, with R² values ranging from .222 to .448 over the period. This relationship was, however, negative, indicating that as globalization increased, spending on higher education declined. For the education outcomes models, the relationship was not significant. Among the sub-indices of globalization, only the political dimension showed significance, as shown in the spending model. Political globalization was significant for six periods, with R² values ranging from .31 to .52. The study concluded that the results are mixed for both the spending and outcome models.
It also found no robust effects of globalization on government education provision. This finding is not surprising given the existing literature, which reports mixed results on the social impact of globalization.
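The core estimation idea (cross-sectional regression of an education variable on the KOF index) can be sketched in miniature. The data points are fabricated for illustration, and the real study also includes economic, political and demographic controls that are omitted here.

```python
# Minimal one-regressor OLS sketch of the study's cross-sectional setup:
# spending on higher education regressed on a globalization index.

def ols(xs, ys):
    """Simple OLS: returns (intercept, slope, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return intercept, slope, 1 - ss_res / ss_tot

kof = [40.0, 55.0, 60.0, 72.0, 85.0]   # hypothetical KOF globalization scores
spend = [1.2, 1.0, 0.9, 0.8, 0.6]      # hypothetical higher-ed spending, % GDP
_, slope, r2 = ols(kof, spend)
```

A negative fitted slope with a sizeable R² is the shape of the study's significant spending result; the outcomes models, by contrast, showed no significant relationship.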

Relevance:

60.00%

Publisher:

Abstract:

Unequal improvements in processor and I/O speeds have caused many applications, such as databases and operating systems, to become increasingly I/O-bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors were proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we also study traditional mirroring to provide a common basis for comparison.
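The two drawbacks of traditional mirroring can be shown with a back-of-the-envelope load model (my illustration, not the thesis' study): reads can be served by either disk, but writes must hit both, and on failure the survivor takes everything.

```python
# Toy per-disk load model for a two-disk mirror: reads split across disks,
# writes are duplicated on both, and failure mode routes all I/O to one disk.

def per_disk_load(read_rate, write_rate, disks_alive=2):
    """I/O operations per second each surviving disk must serve."""
    if disks_alive == 2:
        return read_rate / 2 + write_rate    # reads shared, writes duplicated
    return read_rate + write_rate            # lone survivor takes everything

normal = per_disk_load(read_rate=100.0, write_rate=40.0)
degraded = per_disk_load(read_rate=100.0, write_rate=40.0, disks_alive=1)
```

The jump from `normal` to `degraded` is the failure-mode load-balancing problem that interleaved declustering targets, and the write term that never halves is the write penalty that distorted mirrors target.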

Relevance:

60.00%

Publisher:

Abstract:

The Surface Ocean CO2 Atlas (SOCAT) is a synthesis of quality-controlled fCO2 (fugacity of carbon dioxide) values for the global surface oceans and coastal seas with regular updates. Version 3 of SOCAT has 14.5 million fCO2 values from 3646 data sets covering the years 1957 to 2014. This latest version has an additional 4.4 million fCO2 values relative to version 2 and extends the record from 2011 to 2014. Version 3 also significantly increases the data availability for 2005 to 2013. SOCAT has an average of approximately 1.2 million surface water fCO2 values per year for the years 2006 to 2012. The quality and documentation of the data have improved. A new feature is the data set quality control (QC) flag of E for data from alternative sensors and platforms. The accuracy of surface water fCO2 has been defined for all data set QC flags. Automated range checking has been carried out for all data sets during their upload into SOCAT. The upgrade of the interactive Data Set Viewer allows better interrogation of the SOCAT data collection and rapid creation of high-quality figures for scientific presentations. Automated data upload has been launched for version 4 and will enable more frequent SOCAT releases in the future. High-profile scientific applications of SOCAT include quantification of the ocean sink for atmospheric carbon dioxide and its long-term variation, detection of ocean acidification, as well as evaluation of coupled-climate and ocean-only biogeochemical models. Users of SOCAT data products are urged to acknowledge the contribution of data providers, as stated in the SOCAT Fair Data Use Statement. This living data publication documents changes in the methods and data sets used in this new version of the SOCAT data collection compared with previous publications of this data collection (Pfeil et al., 2013; Sabine et al., 2013; Bakker et al., 2014).
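An automated range check of the kind applied at upload can be sketched as follows. The plausibility bounds below are illustrative assumptions, not SOCAT's actual QC criteria.

```python
# Hedged sketch of an upload-time range check: flag surface-water fCO2 values
# (in µatm) falling outside an assumed plausible range. Bounds are invented.

def range_check(fco2_uatm, low=0.0, high=1000.0):
    """Return the values that fail the range check and need review."""
    return [v for v in fco2_uatm if not (low <= v <= high)]

suspect = range_check([315.2, 402.7, -5.0, 389.1, 2400.0])
```

Flagging rather than silently dropping suspect values keeps the decision with the data-set QC process, consistent with SOCAT's use of explicit QC flags.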

Relevance:

60.00%

Publisher:

Abstract:

Prior finance literature lacks a comprehensive analysis of the microstructure characteristics of U.S. futures markets due to the lack of data availability. Utilizing a unique data set for five different futures contracts, this dissertation fills this gap in the finance literature. In three essays, price discovery, resiliency and the components of bid-ask spreads in electronic futures markets are examined. In order to provide a comprehensive and robust analysis, both the moderately volatile pre-crisis period and the volatile crisis period are included in the analysis. The first essay, entitled “Price Discovery and Liquidity Characteristics for U.S. Electronic Futures and ETF Markets”, explores the price discovery process in U.S. futures and ETF markets. Hasbrouck’s information share method is applied to futures and ETF instruments. The information share results show that futures markets dominate the price discovery process. The results on the factors that affect the price discovery process show that when volatility increases, the price leadership of futures markets declines. Furthermore, when the relative size of the bid-ask spread in one market increases, its information share decreases. The second essay, entitled “The Resiliency of Large Trades for U.S. Electronic Futures Markets”, examines the effects of large trades in futures markets. How quickly prices and liquidity recover after large trades is an important characteristic of financial markets. The price effects of large trades are greater during the crisis period than during the pre-crisis period. Furthermore, relative to the pre-crisis period, during the crisis period it takes more trades until liquidity returns to pre-block-trade levels. The third essay, entitled “Components of Quoted Bid-Ask Spreads in U.S. Electronic Futures Markets”, investigates the bid-ask spread components in futures markets. The components of bid-ask spreads are one of the most important subjects of microstructure studies.
Utilizing Huang and Stoll’s (1997) method, the third essay of this dissertation provides the first analysis of the components of quoted bid-ask spreads in U.S. electronic futures markets. The results show that order-processing cost is the largest component of bid-ask spreads, followed by inventory holding costs. During the crisis period, market makers increase bid-ask spreads due to increasing inventory holding and adverse selection risks.
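As background for the decomposition idea, a much-simplified spread split can be illustrated as below. This is not Huang and Stoll's full (1997) estimator, which is a trade-indicator regression model; it is the textbook effective/realized-spread identity, with invented prices.

```python
# Simplified spread decomposition: the effective half-spread splits into a
# realized part (order processing / inventory) and an adverse-selection part
# measured by the post-trade midquote move. All prices are hypothetical.

def spread_components(trade_price, midquote, midquote_later, buy):
    """Returns (effective_half_spread, realized_half_spread, adverse_selection)."""
    sign = 1 if buy else -1
    effective = sign * (trade_price - midquote)        # cost paid at the trade
    realized = sign * (trade_price - midquote_later)   # what the dealer keeps
    return effective, realized, effective - realized   # info revealed by trade

# Hypothetical buyer-initiated futures trade; midquote drifts up afterwards
eff, real, adv = spread_components(trade_price=100.05, midquote=100.00,
                                   midquote_later=100.03, buy=True)
```

In this toy example most of the half-spread is adverse selection; the dissertation's finding is that in U.S. electronic futures markets the order-processing component actually dominates.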

Relevance:

60.00%

Publisher:

Abstract:

Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging because of reinforcing feedbacks between multiple drivers. We conducted semistructured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. The scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision-making for risk management. It helped identify key forcing variables and generated insights into the potential risks and trade-offs of different strategies. The “Hands-off” scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production under drought conditions. The “Fire management” scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the “Fire suppression” scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a “boundary object” to facilitate collaboration and the integration of different perceptions of fire in the region.
This approach also has the potential to inform decisions in other dynamic frontier landscapes around the world that are facing increased risk of large wildfires.
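A single FCM update step, the mechanism underlying this kind of scenario modelling, can be sketched as follows. The three-concept map (drought, wildfire risk, agricultural production) and its weights are invented for illustration, not the study's elicited maps.

```python
# Minimal fuzzy-cognitive-map step: each concept's next activation is a
# sigmoid-squashed weighted sum of the activations of its causal neighbours.
import math

def fcm_step(state, weights, lam=1.0):
    """state: activation per concept in [0,1]; weights[i][j]: influence of i on j."""
    n = len(state)
    nxt = []
    for j in range(n):
        total = state[j] + sum(weights[i][j] * state[i] for i in range(n))
        nxt.append(1.0 / (1.0 + math.exp(-lam * total)))   # sigmoid squash
    return nxt

# Concepts: 0 drought, 1 wildfire risk, 2 agricultural production
W = [[0.0, 0.8, -0.4],    # drought raises fire risk and hurts production
     [0.0, 0.0, -0.6],    # wildfire hurts production
     [0.0, 0.0, 0.0]]
state = fcm_step([1.0, 0.5, 0.5], W)
```

Iterating such steps until the activations settle is how scenarios like "Hands-off" or "Fire management" are compared: each scenario clamps or reweights drivers and the equilibrium activations are read off as outcomes.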

Relevance:

60.00%

Publisher:

Abstract:

It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and the best way to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health has been reflected in the increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity and not taking into account the complex dynamics associated with interactions between different mycotoxins, or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow in order to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards generating knowledge to support a more accurate human risk assessment and risk management. For this purpose, a physiologically based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment.
On the other hand, the application of predictive mathematical models to estimate the combined effects of mycotoxins from in vitro toxicity studies will also be discussed. Results from a recent Portuguese project that explored the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial to improving data quality and contributing to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
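The dose-additivity baseline that current mixture risk assessment relies on can be illustrated with a hazard-index style calculation. Exposure and reference values below are invented for illustration, not regulatory figures, and the calculation deliberately ignores the interactive effects the presentation argues should also be considered.

```python
# Hazard-index sketch of concentration addition for co-occurring mycotoxins:
# sum each toxin's exposure-to-reference-dose ratio; a total above 1 flags
# potential concern under the additivity assumption (interactions ignored).

def hazard_index(exposures, reference_doses):
    """Sum of exposure / reference-dose ratios across co-occurring toxins."""
    return sum(e / rd for e, rd in zip(exposures, reference_doses))

# Hypothetical daily intakes vs. tolerable daily intakes (µg/kg bw/day)
hi = hazard_index(exposures=[0.4, 0.1, 0.05],
                  reference_doses=[1.0, 0.2, 0.1])
```

Note how the mixture can exceed the concern threshold even though every individual ratio is below 1, which is the core argument for assessing co-exposure rather than single toxins.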