962 results for data availability


Relevance: 60.00%

Abstract:

This research study was designed to examine the relationship between globalization as measured by the KOF index, its related forces (economic, political, cultural and technological), and the public provision of higher education. This study is important since globalization is increasingly being associated with changes in critical aspects of higher education. The public provision of education was measured by government expenditure and by educational outcomes, that is, participation, gender equity and attainment. The study utilized a non-experimental quantitative research design. Data collected from secondary sources for 139 selected countries were analyzed. The countries were geographically distributed and included both developed and developing countries. The choice of countries for inclusion in the study was based on data availability. The data, which were sourced from international organizations such as the United Nations and the World Bank, were examined for different time periods using five-year averages. The period covered was 1970 to 2009. The relationship between globalization and the higher education variables was examined using cross-sectional regression analysis while controlling for economic, political and demographic factors. The major findings of the study are as follows. Of the two spending models, only one revealed a significant relationship between globalization and education, with R² values ranging from .222 to .448 over the period. This relationship was, however, negative, indicating that as globalization increased, spending on higher education declined. For the education outcomes models, the relationship was not significant. Among the sub-indices of globalization, only the political dimension showed significance, as seen in the spending model. Political globalization was significant for six periods, with R² values ranging from .31 to .52. The study concluded that the results are mixed for both the spending and outcome models. It also found no robust effects of globalization on government education provision. This finding is not surprising given the existing literature, which reports mixed results on the social impact of globalization.
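
As an illustration of the cross-sectional regression approach described above, the following minimal sketch fits an OLS model of higher education spending on a globalization score with economic, political and demographic controls. The variable names and data are invented placeholders, not the study's specification or data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical cross-sectional data: one row per country for one five-year period.
rng = np.random.default_rng(0)
n = 139
df = pd.DataFrame({
    "kof_index": rng.uniform(20, 90, n),           # globalization (KOF) score
    "gdp_per_capita": rng.lognormal(9, 1, n),      # economic control
    "polity_score": rng.integers(-10, 11, n),      # political control
    "pop_share_15_24": rng.uniform(0.1, 0.25, n),  # demographic control
})
# Outcome: government spending on higher education (% of GDP), invented here.
df["he_spending"] = (
    1.2 - 0.005 * df["kof_index"] + 0.1 * np.log(df["gdp_per_capita"])
    + rng.normal(0, 0.3, n)
)

X = sm.add_constant(df[["kof_index", "gdp_per_capita", "polity_score", "pop_share_15_24"]])
model = sm.OLS(df["he_spending"], X).fit()
print(model.summary())  # inspect the sign of the KOF coefficient and the R-squared
```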

Relevance: 60.00%

Abstract:

Subtitle D of the Resource Conservation and Recovery Act (RCRA) requires a post-closure period of 30 years for non-hazardous wastes in landfills. Post-closure care (PCC) activities under Subtitle D include leachate collection and treatment, groundwater monitoring, inspection and maintenance of the final cover, and monitoring to ensure that landfill gas does not migrate off site or into on-site buildings. The decision to reduce PCC duration requires exploration of a performance-based methodology for Florida landfills. PCC should be based on whether the landfill poses a threat to human health or the environment. Historically, no risk-based procedure has been available to establish an early end to PCC. Landfill stability depends on a number of factors, including variables that relate to operations both before and after the closure of a landfill cell. Therefore, PCC decisions should be based on location-specific factors, operational factors, design factors, post-closure performance, end use, and risk analysis. The question of an appropriate PCC period for Florida's landfills requires in-depth case studies focusing on the analysis of performance data from closed landfills in Florida. Based on data availability, the Davie Landfill was identified as the case study site for a case-by-case analysis of landfill stability. The performance-based PCC decision system developed by Geosyntec Consultants was used to assess site conditions and project PCC needs. The available data for leachate and gas quantity and quality, groundwater quality, and cap conditions were evaluated. The quality and quantity data for leachate and gas were analyzed to project the levels of pollutants in leachate and groundwater with reference to the maximum contaminant level (MCL). In addition, the projected gas quantity was estimated. A set of contaminants (including metals and organics) detected in groundwater was identified for health risk assessment. These contaminants were selected based on their detection frequency and levels in leachate and groundwater, and on their historical and projected trends. During the evaluations, a range of discrepancies and problems related to data collection and documentation were encountered, and possible solutions were proposed. Based on the results of the PCC performance evaluation integrated with risk assessment, future PCC monitoring needs and sustainable waste management options were identified. According to these results, landfill gas monitoring can be terminated, while leachate and groundwater monitoring for parameters above the MCL and surveying of cap integrity should be continued. The parameters that lead to longer monitoring periods can be eliminated in future sustainable landfills. In conclusion, the 30-year PCC period can be reduced for some landfill components based on their potential impacts on human health and the environment (HH&E).
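
A minimal sketch of the screening step described above, comparing monitored concentrations against maximum contaminant levels (MCLs) to decide which parameters still require monitoring. The parameters and values are illustrative only and are not taken from the Davie Landfill data.

```python
# Hypothetical screening of monitored groundwater parameters against MCLs (mg/L).
# All values below are illustrative placeholders.
mcl = {"arsenic": 0.010, "benzene": 0.005, "iron": 0.3}
latest_groundwater = {"arsenic": 0.004, "benzene": 0.007, "iron": 0.8}

still_monitor = [p for p, c in latest_groundwater.items() if c > mcl[p]]
can_terminate = [p for p, c in latest_groundwater.items() if c <= mcl[p]]

print("continue monitoring:", still_monitor)  # parameters above their MCL
print("candidates to drop:", can_terminate)   # parameters at or below their MCL
```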

Relevance: 60.00%

Abstract:

Unequal improvements in processor and I/O speeds have made many applications, such as databases and operating systems, increasingly I/O bound. Many schemes, such as disk caching and disk mirroring, have been proposed to address the problem. In this thesis we focus only on disk mirroring. In disk mirroring, a logical disk image is maintained on two physical disks, allowing a single disk failure to be transparent to application programs. Although disk mirroring improves data availability and reliability, it has two major drawbacks. First, writes are expensive because both disks must be updated. Second, load balancing during failure-mode operation is poor because all requests are serviced by the surviving disk. Distorted mirrors was proposed to address the write problem, and interleaved declustering to address the load-balancing problem. In this thesis we perform a comparative study of these two schemes under various operating modes. In addition, we study traditional mirroring to provide a common basis for comparison.
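
A toy sketch of the mirrored-write path described above: every logical write must update both physical copies (the write penalty), while reads can be served by either copy in normal mode but only by the surviving disk after a failure. This is a generic illustration of basic mirroring, not the distorted mirrors or interleaved declustering schemes themselves.

```python
class MirroredDisk:
    """Toy logical disk backed by two physical copies (dicts of block -> data)."""

    def __init__(self):
        self.primary = {}
        self.secondary = {}

    def write(self, block: int, data: bytes) -> None:
        # The write penalty of mirroring: both copies must be updated.
        self.primary[block] = data
        self.secondary[block] = data

    def read(self, block: int, prefer_secondary: bool = False) -> bytes:
        # Reads can be balanced across the two copies in normal mode;
        # after a disk failure, all reads fall on the surviving copy.
        source = self.secondary if prefer_secondary else self.primary
        return source[block]


disk = MirroredDisk()
disk.write(0, b"hello")
assert disk.read(0) == disk.read(0, prefer_secondary=True) == b"hello"
```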

Relevance: 60.00%

Abstract:

The Surface Ocean CO2 Atlas (SOCAT) is a synthesis of quality-controlled fCO2 (fugacity of carbon dioxide) values for the global surface oceans and coastal seas, with regular updates. Version 3 of SOCAT has 14.5 million fCO2 values from 3646 data sets covering the years 1957 to 2014. This latest version has an additional 4.4 million fCO2 values relative to version 2 and extends the record from 2011 to 2014. Version 3 also significantly increases the data availability for 2005 to 2013. SOCAT has an average of approximately 1.2 million surface water fCO2 values per year for the years 2006 to 2012. Quality and documentation of the data have improved. A new feature is the data set quality control (QC) flag of E for data from alternative sensors and platforms. The accuracy of surface water fCO2 has been defined for all data set QC flags. Automated range checking has been carried out for all data sets during their upload into SOCAT. The upgrade of the interactive Data Set Viewer allows better interrogation of the SOCAT data collection and rapid creation of high-quality figures for scientific presentations. Automated data upload has been launched for version 4 and will enable more frequent SOCAT releases in the future. High-profile scientific applications of SOCAT include quantification of the ocean sink for atmospheric carbon dioxide and its long-term variation, detection of ocean acidification, and evaluation of coupled-climate and ocean-only biogeochemical models. Users of SOCAT data products are urged to acknowledge the contribution of data providers, as stated in the SOCAT Fair Data Use Statement. This living data publication documents changes in the methods and data sets used in this new version of the SOCAT data collection compared with previous publications (Pfeil et al., 2013; Sabine et al., 2013; Bakker et al., 2014).
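
As a rough illustration of the automated range checking mentioned above, the sketch below flags surface-water fCO2 values that fall outside an accepted interval at upload time. The bounds used here are placeholders, not SOCAT's actual QC thresholds.

```python
# Illustrative range check for surface-water fCO2 values (µatm) on upload.
# The bounds below are placeholders, not SOCAT's actual acceptance criteria.
FCO2_MIN, FCO2_MAX = 50.0, 1000.0

def flag_out_of_range(fco2_values):
    """Return indices of values falling outside the accepted fCO2 range."""
    return [i for i, v in enumerate(fco2_values) if not (FCO2_MIN <= v <= FCO2_MAX)]

sample = [312.4, 401.9, 7.2, 385.0, 1523.8]
print(flag_out_of_range(sample))  # -> [2, 4]
```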

Relevance: 60.00%

Abstract:

Prior finance literature lacks a comprehensive analysis of the microstructure characteristics of U.S. futures markets due to a lack of data availability. Utilizing a unique data set for five different futures contracts, this dissertation fills this gap in the finance literature. In three essays, price discovery, resiliency, and the components of bid-ask spreads in electronic futures markets are examined. To provide a comprehensive and robust analysis, both a moderately volatile pre-crisis period and a volatile crisis period are included. The first essay, entitled “Price Discovery and Liquidity Characteristics for U.S. Electronic Futures and ETF Markets,” explores the price discovery process in U.S. futures and ETF markets. Hasbrouck’s information share method is applied to futures and ETF instruments. The information share results show that futures markets dominate the price discovery process. The results on the factors that affect the price discovery process show that when volatility increases, the price leadership of futures markets declines. Furthermore, when the relative size of the bid-ask spread in one market increases, its information share decreases. The second essay, entitled “The Resiliency of Large Trades for U.S. Electronic Futures Markets,” examines the effects of large trades in futures markets. How quickly prices and liquidity recover after large trades is an important characteristic of financial markets. The price effects of large trades are greater during the crisis period than during the pre-crisis period. Furthermore, relative to the pre-crisis period, during the crisis period it takes more trades until liquidity returns to pre-block-trade levels. The third essay, entitled “Components of Quoted Bid-Ask Spreads in U.S. Electronic Futures Markets,” investigates the bid-ask spread components in futures markets. The components of bid-ask spreads are one of the most important subjects of microstructure studies. Utilizing Huang and Stoll’s (1997) method, the third essay of this dissertation provides the first analysis of the components of quoted bid-ask spreads in U.S. electronic futures markets. The results show that order processing cost is the largest component of bid-ask spreads, followed by inventory holding costs. During the crisis period, market makers increase bid-ask spreads due to increased inventory holding and adverse selection risks.
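
One of the liquidity measures referred to above, the relative size of the bid-ask spread, can be computed from quote data as in the following minimal sketch; the quotes are hypothetical and the calculation is a generic illustration rather than the dissertation's procedure.

```python
import pandas as pd

# Hypothetical best bid/ask quotes for a futures contract.
quotes = pd.DataFrame({
    "bid": [99.50, 99.75, 100.00],
    "ask": [99.75, 100.00, 100.50],
})

# Relative quoted spread: (ask - bid) / midpoint, reported here in basis points.
mid = (quotes["ask"] + quotes["bid"]) / 2
quotes["rel_spread_bps"] = (quotes["ask"] - quotes["bid"]) / mid * 1e4
print(quotes)
```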

Relevance: 60.00%

Abstract:

Over the last two decades social vulnerability has emerged as a major area of study, with increasing attention to the study of vulnerable populations. Generally, the elderly are among the most vulnerable members of any society, and widespread population aging has led to greater focus on elderly vulnerability. However, the absence of a valid and practical measure constrains the ability of policy-makers to address this issue in a comprehensive way. This study developed a composite indicator, the Elderly Social Vulnerability Index (ESVI), and used it to undertake a comparative analysis of the availability of support for elderly Jamaicans based on their access to human, material and social resources. The results of the ESVI indicated that while the elderly are more vulnerable overall, certain segments of the population appear to be at greater risk. Females had consistently lower scores than males, and the oldest-old had the highest scores of all groups of older persons. Vulnerability scores also varied according to place of residence, with more rural parishes having higher scores than their urban counterparts. These findings support the political economy framework, which locates disadvantage in old age within political and ideological structures. The findings also point to the pervasiveness and persistence of gender inequality, as argued by feminist theories of aging. Based on the results of the study, it is clear that there is a need for policies that target specific population segments, in addition to universal policies that could make the experience of old age less challenging for the majority of older persons. Overall, the ESVI has displayed usefulness as a tool for theoretical analysis and demonstrated its potential as a policy instrument to assist decision-makers in determining where to target their efforts as they seek to address the issue of social vulnerability in old age. Data for this study came from the 2001 population and housing census of Jamaica, with multiple imputation for missing data. The index was derived from the linear aggregation of three equally weighted domains, composed of eleven unweighted indicators that were normalized using z-scores. Indicators were selected based on theoretical relevance and data availability.
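
A minimal sketch of the index construction described above: z-score normalization of indicators, unweighted averaging within each domain, and equal-weight linear aggregation of the three domains. The column names and values below are hypothetical, not the census indicators actually used for the ESVI.

```python
import pandas as pd

# Hypothetical indicator data for a few older persons; the actual ESVI uses
# eleven indicators from the 2001 Jamaican census grouped into three domains.
df = pd.DataFrame({
    "years_schooling": [3, 8, 12],   # "human resources" domain (hypothetical)
    "chronic_illness": [1, 0, 0],
    "household_assets": [2, 6, 9],   # "material resources" domain (hypothetical)
    "dwelling_quality": [1, 3, 4],
    "lives_alone": [1, 0, 0],        # "social resources" domain (hypothetical)
    "union_status": [0, 1, 1],
})
domains = {
    "human": ["years_schooling", "chronic_illness"],
    "material": ["household_assets", "dwelling_quality"],
    "social": ["lives_alone", "union_status"],
}

z = (df - df.mean()) / df.std(ddof=0)                  # z-score normalization
domain_scores = pd.DataFrame({d: z[cols].mean(axis=1)  # unweighted indicators per domain
                              for d, cols in domains.items()})
esvi = domain_scores.mean(axis=1)                      # equal-weight linear aggregation
print(esvi)
```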

Relevance: 60.00%

Abstract:

Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging because of reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered a possible failure to respond in time to the emergent risk. This approach showed great potential to support decision making for risk management. It helped identify key forcing variables and generate insights into potential risks and trade-offs of different strategies. The “Hands-off” scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production under drought conditions. The “Fire management” scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production than the “Fire suppression” scenario. The findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a “boundary object” to facilitate collaboration and integration of different perceptions of fire in the region. This approach also has the potential to inform decisions in other dynamic frontier landscapes around the world that are facing an increased risk of large wildfires.
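
A minimal sketch of the inference step that typically underlies FCM modelling: concept activations are repeatedly propagated through a signed causal weight matrix and squashed, and scenarios are explored by clamping driver concepts. The concepts and weights below are invented for illustration and are not the maps elicited in the study.

```python
import numpy as np

# Toy fuzzy cognitive map: concepts and signed causal weights (invented values).
concepts = ["drought", "uncontrolled_burning", "fire_management", "wildfire_risk"]
W = np.array([
    # drought  burning  mgmt   risk   (effect of the row concept on each column concept)
    [0.0,      0.3,     0.0,   0.5],   # drought
    [0.0,      0.0,     0.0,   0.6],   # uncontrolled burning
    [0.0,     -0.5,     0.0,  -0.4],   # fire management
    [0.0,      0.0,     0.0,   0.0],   # wildfire risk (outcome concept)
])

def squash(x):
    return 1.0 / (1.0 + np.exp(-x))  # common sigmoid transfer function

def run_scenario(clamped, steps=20):
    """Iterate concept activations; 'clamped' fixes driver concepts at each step."""
    a = np.full(len(concepts), 0.5)
    for _ in range(steps):
        a = squash(a @ W)
        for name, value in clamped.items():
            a[concepts.index(name)] = value
    return dict(zip(concepts, a.round(2)))

print(run_scenario({"drought": 0.9, "fire_management": 0.1}))  # "hands-off" style scenario
print(run_scenario({"drought": 0.9, "fire_management": 0.9}))  # active management scenario
```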

Relevance: 60.00%

Abstract:

It is nowadays recognized that the risk of human co-exposure to multiple mycotoxins is real. In recent years, a number of studies have approached the issue of co-exposure and the best way to develop a more precise and realistic assessment. Likewise, the growing concern about the combined effects of mycotoxins and their potential impact on human health has been reflected in an increasing number of toxicological studies on the combined toxicity of these compounds. Nevertheless, risk assessment of these toxins still follows the conventional paradigm of single exposure and single effects, incorporating only the possibility of additivity and not taking into account the complex dynamics associated with interactions between different mycotoxins or between mycotoxins and other food contaminants. Considering that risk assessment is intimately related to the establishment of regulatory guidelines, once the risk assessment is completed, an effort to reduce or manage the risk should follow in order to protect public health. Risk assessment of combined human exposure to multiple mycotoxins thus poses several challenges to scientists, risk assessors and risk managers, and opens new avenues for research. This presentation aims to give an overview of the different challenges posed by the likelihood of human co-exposure to mycotoxins and the possibility of interactive effects occurring after absorption, towards knowledge generation to support a more accurate human risk assessment and risk management. For this purpose, a physiologically based framework that includes knowledge on the bioaccessibility, toxicokinetics and toxicodynamics of multiple toxins is proposed. Regarding exposure assessment, the need for harmonized food consumption data, the availability of multi-analyte methods for mycotoxin quantification, the management of left-censored data and the use of probabilistic models will be highlighted, in order to develop a more precise and realistic exposure assessment. On the other hand, the application of predictive mathematical models to estimate the combined effects of mycotoxins from in vitro toxicity studies will also be discussed. Results from a recent Portuguese project aimed at exploring the toxic effects of mixtures of mycotoxins in infant foods and their potential health impact will be presented as a case study, illustrating the different aspects of risk assessment highlighted in this presentation. Further studies on hazard and exposure assessment of multiple mycotoxins, using harmonized approaches and methodologies, will be crucial to improving data quality and contributing to holistic risk assessment and risk management strategies for multiple mycotoxins in foodstuffs.
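
Two of the exposure-assessment ingredients mentioned above, the handling of left-censored occurrence data and probabilistic (Monte Carlo) exposure estimation, are sketched below with invented numbers; the LOD/2 substitution is a common convention used here only for illustration, not necessarily the treatment adopted in the work presented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mycotoxin occurrence data (µg/kg); None marks left-censored
# results below the limit of detection (LOD).
lod = 0.5
occurrence = [1.2, None, 0.8, None, 2.1, 0.6]

# Simple substitution treatment of censored values (LOD/2, a common convention).
conc = np.array([c if c is not None else lod / 2 for c in occurrence])

# Probabilistic exposure: resample concentrations and daily consumption rates.
n_sim = 10_000
sampled_conc = rng.choice(conc, size=n_sim, replace=True)           # µg/kg food
consumption = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n_sim)  # g food/day per kg bw
exposure = sampled_conc * consumption / 1000.0                      # µg/kg bw per day

print(f"median exposure: {np.median(exposure):.3f} µg/kg bw/day")
print(f"95th percentile: {np.percentile(exposure, 95):.3f} µg/kg bw/day")
```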

Relevance: 60.00%

Abstract:

The 2010 biodiversity target agreed by signatories to the Convention on Biological Diversity directed the attention of conservation professionals toward the development of indicators with which to measure changes in biological diversity at the global scale. We considered why global biodiversity indicators are needed, what characteristics successful global indicators have, and how existing indicators perform. Because monitoring could absorb a large proportion of the funds available for conservation, we believe indicators should be linked explicitly to monitoring objectives, and decisions about which monitoring schemes deserve funding should be informed by predictions of the value of such schemes to decision making. We suggest that raising awareness among the public and policy makers, auditing management actions, and informing policy choices are the most important global monitoring objectives. Using four well-developed indicators of biological diversity (extent of forests, coverage of protected areas, Living Planet Index, Red List Index) as examples, we analyzed the characteristics needed for indicators to meet these objectives. We recommend that conservation professionals improve on existing indicators by eliminating spatial biases in data availability, filling gaps in information about ecosystems other than forests, and improving understanding of the way indicators respond to policy changes. Monitoring is not an end in itself, and we believe it is vital that the ultimate objectives of global monitoring of biological diversity inform the development of new indicators.

Relevance: 40.00%

Abstract:

A feeding strategy model, using stomach content and resource availability data, is proposed as a modification of Costello (1990) and Amundsen et al. (1996). Incorporating a feeding electivity index (E) in place of prey-specific abundance reflects the importance of resource availability in prey selection, as well as the predator's ability to specialize on, generalize over, or avoid particular prey items at the individual and population levels.
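
The abstract does not state which electivity formulation is used; assuming an Ivlev-type index, E = (r - p) / (r + p), where r is the proportion of a prey item in the diet and p its proportion in the environment, a minimal sketch is:

```python
def electivity(r: float, p: float) -> float:
    """Ivlev-style electivity index for one prey type (an assumed formulation).

    r: proportion of the prey item in the diet (stomach contents)
    p: proportion of the same prey item in the environment (resource availability)
    Returns a value in [-1, 1]: positive = selection, negative = avoidance.
    """
    if r + p == 0:
        return 0.0
    return (r - p) / (r + p)

# Example: a prey item making up 40% of the diet but only 10% of available prey.
print(electivity(0.4, 0.1))  # 0.6 -> strongly selected
```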

Relevance: 40.00%

Abstract:

A modelling scheme is described which uses satellite-retrieved sea-surface temperature and chlorophyll-a to derive monthly zooplankton biomass estimates in the eastern North Atlantic; this forms part of a bio-physical model of inter-annual variations in the growth and survival of larvae and post-larvae of mackerel (Scomber scombrus). The temperature and chlorophyll data are first used to model copepod (Calanus) egg production rates. Egg production is then converted to available food using distribution data from the Continuous Plankton Recorder (CPR) Survey, the observed population biomass per unit daily egg production, and the proportion of the larval mackerel diet comprising Calanus. Results are validated against field observations of zooplankton biomass. The principal benefit of the modelling scheme is the ability to use the combination of broad-scale coverage and fine-scale temporal and spatial variability of satellite data as driving forces in the model; its weaknesses are the simplicity of the egg production model and the broad-scale generalizations assumed in the raising factors used to convert egg production to biomass.
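
A minimal sketch of the conversion chain described above, with invented placeholder numbers (the paper's raising factors come from CPR distribution data and field observations); the final diet-fraction step is shown under one plausible reading, scaling Calanus biomass up to total available larval food.

```python
# Invented placeholder values; the paper's raising factors are not reproduced here.
egg_production = 12.0        # modelled Calanus egg production (arbitrary units)
biomass_per_egg = 0.8        # observed population biomass per unit daily egg production
calanus_diet_fraction = 0.6  # assumed proportion of the larval mackerel diet that is Calanus

calanus_biomass = egg_production * biomass_per_egg
# One plausible reading of the final step: scale Calanus biomass up to total
# available larval food by the fraction of the diet that Calanus represents.
available_food = calanus_biomass / calanus_diet_fraction
print(f"Calanus biomass: {calanus_biomass:.2f}, available food: {available_food:.2f}")
```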

Relevance: 40.00%

Abstract:

Failures are the norm rather than the exception in cloud computing environments. To improve system availability, replicating popular data to multiple suitable locations is advisable, as users can then access the data from a nearby site. This is, however, not the case when replicas must be kept as a fixed number of copies at several locations. How to decide a reasonable number of replicas and the right locations for them has become a challenge in cloud computing. In this paper, a dynamic data replication strategy is put forward, together with a brief survey of replication strategies suitable for distributed computing environments. It includes: 1) analyzing and modeling the relationship between system availability and the number of replicas; 2) evaluating and identifying popular data and triggering a replication operation when the data's popularity passes a dynamic threshold; 3) calculating a suitable number of copies to meet a reasonable system byte effective rate requirement and placing replicas among data nodes in a balanced way; and 4) designing the dynamic data replication algorithm for a cloud. Experimental results demonstrate the efficiency and effectiveness of the improvements brought by the proposed strategy in a cloud.
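
A minimal sketch of item 1 above: if each node holding a replica is available independently with probability p, the data are unavailable only when every copy is down, so the smallest replica count meeting a target availability A is ceil(log(1 - A) / log(1 - p)). The independence assumption and the variable names are illustrative, not the paper's exact model.

```python
import math

def replicas_needed(node_availability: float, target_availability: float) -> int:
    """Smallest replica count n with 1 - (1 - p)**n >= target,
    assuming independent node failures (a simplifying assumption)."""
    p_fail = 1.0 - node_availability
    return math.ceil(math.log(1.0 - target_availability) / math.log(p_fail))

def availability(node_availability: float, n_replicas: int) -> float:
    """System availability of data kept on n independent replicas."""
    return 1.0 - (1.0 - node_availability) ** n_replicas

print(replicas_needed(0.95, 0.9999))    # -> 4 replicas for this example
print(round(availability(0.95, 3), 6))  # availability with 3 copies
```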

Relevance: 40.00%

Abstract:

Includes bibliography