962 results for data availability


Relevance: 70.00%

Abstract:

The original contribution of this work is threefold. Firstly, this thesis develops a critical perspective on current evaluation practice of business support, with a focus on the timing of evaluation. The general time frame applied for business support policy evaluation is limited to one to two, seldom three, years post-intervention. This is despite calls for long-term impact studies by various authors concerned about time lags before effects are fully realised. This desire for long-term evaluation conflicts with the requirements of policy-makers and funders, who seek quick results. Also, current ‘best practice’ frameworks do not refer to timing or its implications, and data availability limits the ability to undertake long-term evaluation. Secondly, this thesis provides methodological value for follow-up and similar studies by linking scheme-beneficiary data with official performance datasets; data availability problems are thus avoided through the use of secondary data. Thirdly, this thesis builds the evidence base through a longitudinal impact study of small business support in England covering seven years of post-intervention data. This illustrates the variability of results across evaluation periods, and the value of using multiple years of data for a robust understanding of support impact. For survival, the impact of assistance is found to be immediate but limited. Concerning growth, significant impact centres on a two-to-three-year period post-intervention for the linear selection and quantile regression models – positive for employment and turnover, negative for productivity. Attribution of impact may present a problem for subsequent periods. The results clearly support the argument for the use of longitudinal data and analysis, and for a greater appreciation by evaluators of the time factor. This analysis recommends a time frame of four to five years post-intervention for soft business support evaluation.


Relevance: 70.00%

Abstract:

This research presents several components encompassing the scope of data partitioning and replication management in a distributed GIS database. Modern Geographic Information Systems (GIS) databases are often large and complicated, so data partitioning and replication management problems need to be addressed in the development of an efficient and scalable solution. Part of the research is to study the patterns of geographical raster data processing and to propose algorithms that improve the availability of such data. These algorithms and approaches target the granularity of geographic data objects as well as data partitioning in geographic databases, to achieve high data availability and Quality of Service (QoS) in distributed data delivery and processing. To achieve this goal, a dynamic, real-time approach for mosaicking digital images of different temporal and spatial characteristics into tiles is proposed. This dynamic approach reuses digital images upon demand and generates mosaicked tiles only for the required region, according to the user's requirements such as resolution, temporal range, and target bands, to reduce redundancy in storage and to utilize available computing and storage resources more efficiently. Another part of the research pursued methods for efficiently acquiring GIS data from external heterogeneous databases and Web services, as well as end-user GIS data delivery enhancements, automation, and 3D virtual reality presentation. Vast numbers of computing, network, and storage resources available on the Internet sit idle or underutilized. The proposed "Crawling Distributed Operating System" (CDOS) approach employs such resources and creates benefits for the hosts that lend their CPU, network, and storage resources to be used in a GIS database context. The results of this dissertation demonstrate effective ways to develop a highly scalable GIS database.
The approach developed in this dissertation resulted in the creation of the TerraFly GIS database, which is used by the US government, researchers, and the general public to facilitate Web access to remotely-sensed imagery and GIS vector information.
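The idea of generating tiles only for the required region can be sketched as follows: given a requested bounding box and a fixed tile size, compute just the grid tiles that intersect the request. This is only an illustration of on-demand tiling — the function name and the simple origin-anchored square-tile scheme are assumptions, not TerraFly's actual implementation.

```python
import math

def tiles_for_region(min_x, min_y, max_x, max_y, tile_size):
    """Return (col, row) indices of all tiles intersecting the bounding box.

    Assumes a grid of square tiles anchored at the origin; the real
    tiling scheme used by the dissertation is not described in the
    abstract, so this only illustrates limiting work to the request.
    """
    first_col = math.floor(min_x / tile_size)
    last_col = math.ceil(max_x / tile_size) - 1
    first_row = math.floor(min_y / tile_size)
    last_row = math.ceil(max_y / tile_size) - 1
    return [(c, r)
            for r in range(first_row, last_row + 1)
            for c in range(first_col, last_col + 1)]

# A 250x250 request over 100-unit tiles touches a 3x3 block of tiles,
# so only 9 tiles need to be mosaicked, not the whole dataset.
print(len(tiles_for_region(0, 0, 250, 250, 100)))  # → 9
```

Cached tiles outside the requested region are never generated, which is the storage- and compute-saving behaviour the abstract describes.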

Relevance: 70.00%

Abstract:

In today’s big data world, data is being produced in massive volumes, at great velocity, and from a variety of sources such as mobile devices, sensors, a plethora of small devices hooked to the Internet (the Internet of Things), social networks, communication networks, and many others. Interactive querying and large-scale analytics are being increasingly used to derive value out of this big data. A large portion of this data is stored and processed in the Cloud due to the several advantages the Cloud provides, such as scalability, elasticity, availability, low cost of ownership and overall economies of scale. There is thus a growing need for large-scale cloud-based data management systems that can support real-time ingest, storage and processing of large volumes of heterogeneous data. However, in the pay-as-you-go Cloud environment, the cost of analytics can grow linearly with the time and resources required. Reducing the cost of data analytics in the Cloud thus remains a primary challenge. In my dissertation research, I have focused on building efficient and cost-effective cloud-based data management systems for different application domains that are predominant in cloud computing environments. In the first part of my dissertation, I address the problem of reducing the cost of transactional workloads on relational databases to support database-as-a-service in the Cloud. The primary challenges in supporting such workloads include choosing how to partition the data across a large number of machines, minimizing the number of distributed transactions, providing high data availability, and tolerating failures gracefully.
I have designed, built and evaluated SWORD, an end-to-end scalable online transaction processing system that utilizes workload-aware data placement and replication to minimize the number of distributed transactions, and that incorporates a suite of novel techniques to significantly reduce the overheads incurred both during the initial placement of data and during query execution at runtime. In the second part of my dissertation, I focus on sampling-based progressive analytics as a means to reduce the cost of data analytics in the relational domain. Sampling has traditionally been used by data scientists to get progressive answers to complex analytical tasks over large volumes of data. Typically, this involves manually extracting samples of increasing data size (progressive samples) for exploratory querying. This provides the data scientists with user control, repeatable semantics, and result provenance. However, such solutions result in tedious workflows that preclude the reuse of work across samples. On the other hand, existing approximate query processing systems report early results, but do not offer the above benefits for complex ad-hoc queries. I propose a new progressive data-parallel computation framework, NOW!, that provides support for progressive analytics over big data. In particular, NOW! enables progressive relational (SQL) query support in the Cloud using unique progress semantics that allow efficient and deterministic query processing over samples, providing meaningful early results and provenance to data scientists. NOW! enables the provision of early results using significantly fewer resources, thereby enabling a substantial reduction in the cost incurred during such analytics. Finally, I propose NSCALE, a system for efficient and cost-effective complex analytics on large-scale graph-structured data in the Cloud.
The system is based on the key observation that a wide range of complex analysis tasks over graph data require processing and reasoning about a large number of multi-hop neighborhoods or subgraphs in the graph; examples include ego network analysis, motif counting in biological networks, finding social circles in social networks, personalized recommendations, link prediction, etc. These tasks are not well served by existing vertex-centric graph processing frameworks, whose computation and execution models limit the user program to directly accessing the state of a single vertex, resulting in high execution overheads. Further, the lack of support for extracting the relevant portions of the graph that are of interest to an analysis task and loading them onto distributed memory leads to poor scalability. NSCALE allows users to write programs at the level of neighborhoods or subgraphs rather than at the level of vertices, and to declaratively specify the subgraphs of interest. It enables the efficient distributed execution of these neighborhood-centric complex analysis tasks over large-scale graphs, while minimizing resource consumption and communication cost, thereby substantially reducing the overall cost of graph data analytics in the Cloud. The results of our extensive experimental evaluation of these prototypes with several real-world data sets and applications validate the effectiveness of our techniques, which provide orders-of-magnitude reductions in the overheads of distributed data querying and analysis in the Cloud.
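The progressive-sampling workflow that the abstract describes — early approximate answers over progressively larger samples, with work reused across samples — can be illustrated with a minimal sketch. The prefix-of-a-shuffle sampling scheme and function name here are illustrative assumptions, not NOW!'s actual progress semantics.

```python
import random

def progressive_mean(data, steps, seed=42):
    """Yield (sample_size, estimate) over progressively larger samples.

    Each step extends the previous sample (a prefix of one shuffled
    copy of the data), so the running sum is reused rather than
    recomputed from scratch at every sample size.
    """
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    running_sum = 0.0
    taken = 0
    for size in steps:
        while taken < size:
            running_sum += shuffled[taken]
            taken += 1
        yield size, running_sum / taken

# Early estimates of the mean arrive long before the full scan finishes;
# the final step over all 10,000 values gives the exact answer, 5000.5.
data = list(range(1, 10001))
for size, est in progressive_mean(data, [100, 1000, 10000]):
    print(size, est)
```

The cost saving comes from the caller being free to stop after an early step once the estimate is good enough, having touched only a fraction of the data.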

Relevance: 70.00%

Abstract:

Due to the sensitive nature of patient data, the secondary use of electronic health records (EHR) is restricted in scientific research and product development. Such restrictions aim to preserve the privacy of the respective patients by limiting the availability and variety of sensitive patient data. Current limitations do not correspond to the actual needs of potential secondary users. In this thesis, the secondary use of Finnish and Swedish EHR data is explored for the purpose of enhancing the availability of such data for clinical research and product development. The EHR-related procedures and technologies involved are analysed to identify the issues limiting the secondary use of patient data. Successful secondary use of patient data increases the data's value. To explore the identified circumstances, a case study of potential secondary users and use intentions regarding EHR data was carried out in Finland and Sweden. Data collection for the case study was performed using semi-structured interviews; in total, 14 Finnish and Swedish experts representing scientific research, health management, and business were interviewed. The motivation for the interviews was to evaluate the protection of EHR data used for secondary purposes. The efficiency of implemented procedures and technologies was analysed in terms of data availability and privacy preservation. The results of the case study show that the factors affecting EHR availability fall into three categories: management of patient data, preservation of patients' privacy, and potential secondary users. Identified issues regarding data management included laborious and inconsistent data request procedures and the role and effect of external service providers. Based on the study findings, two approaches enabling the secondary use of EHR data are identified: data alteration and a protected processing environment.
Data alteration increases the availability of relevant EHR data but decreases its value. The protected processing environment approach restricts the number of potential users and use intentions while providing more valuable data content.

Relevance: 60.00%

Abstract:

The aim of the study is to identify the opportunities and challenges a local government public asset manager is most likely to deal with when adopting an appropriate public asset management framework, especially in developing countries. To achieve this aim, the study employs a case study in Indonesia, collecting data through interviews, document analysis and observations in South Sulawesi Province. The study concludes that there are significant opportunities and challenges that local governments in developing countries, especially Indonesia, might need to manage if they are to apply a public asset management framework appropriately. The opportunities are: more effective and efficient local government; accountable and auditable local government organisations; an increased local government portfolio; up-to-date information for decision makers in local government; and improved quality of public services. On the other hand, there are also challenges: local governments have no clear legal and institutional framework to support the application of asset management; the non-profit principle of public assets; cross-jurisdictional issues in public asset management; the complexity of public organisation objectives; and the availability of the data required for managing public property. The study only covers the conditions of developing countries, with Indonesia as an example, and cannot represent the situation of all local governments in the world. Further study to develop an asset management system applicable to all local governments in developing countries is urgently needed. Findings from this study will provide useful input for policy makers, scholars and asset management practitioners in developing an asset management framework for more efficient and effective local governments.

Relevance: 60.00%

Abstract:

This paper sets out to examine, from published literature and crash data analyses, whether alcohol in bicycle crashes is an issue about which we should be concerned. It discusses factors that have the potential to increase the number of bicycle crashes in which alcohol is involved (such as growth in the size and diversity of the cyclist population, and the balance and coordination demands of cycling) and factors which may reduce the importance of alcohol in bicycle crashes (such as time of day factors and child riders). It also examines data availability issues that contribute to difficulties in determining the true magnitude of the issue. Methods: This paper reviews previous research and reports analyses of data from Queensland, Australia, that examine the role of alcohol in Police-reported road crashes. In Queensland it is an offence to ride a bicycle or drive a motor vehicle with a BAC exceeding 0.05% (or lower for novice and professional drivers). Results: In the five years 2003-2007, alcohol was reported as involved in 165 bicycle crashes (4%). The bicycle rider was coded as “under the influence” or “over the prescribed BAC limit” in 15 single-unit crashes (12%). In multi-vehicle bicycle crashes, alcohol involvement was reported for 16 cyclists (0.4%) and 110 operators of other vehicles (3%). Additional analyses, including characteristics of the cyclist crashes involving alcohol and the importance of missing data, are discussed in the paper. Conclusion: The increase in participation in cycling and the vulnerability of cyclists to injuries support the need to examine the role of alcohol in bicycle crashes. Current data suggest that alcohol on the part of the vehicle driver is a larger concern than alcohol on the part of the cyclist, but improvements in data collection are needed before more precise conclusions can be drawn.

Relevance: 60.00%

Abstract:

Background: Diabetic foot complications are recognised as the most common reason for diabetes-related hospitalisation and lower extremity amputations. Multi-faceted strategies to reduce diabetic foot hospitalisation and amputation rates have been successful. However, most diabetic foot ulcers are managed in ambulatory settings, where data availability is poor and studies are limited. The project aimed to develop and evaluate strategies to improve the management of diabetic foot complications in three diverse ambulatory settings and measure the subsequent impact on hospitalisation and amputation. Methods: Multifaceted strategies were implemented in 2008, including: multi-disciplinary teams, clinical pathways and training, clinical indicators, telehealth support and surveys. A retrospective audit of consecutive patient records from July 2006 – June 2007 determined baseline clinical indicators (n = 101). A clinical pathway teleform was implemented as a clinical record and clinical indicator analyser in all sites in 2008 (n = 327) and followed up in 2009 (n = 406). Results: Prior to the intervention, clinical pathways were not used and multi-disciplinary teams were limited. There was an absolute improvement in treating according to risk of 15% in 2009, and in surveillance of the high-risk population of 34% and 19% in 2008 and 2009 respectively (p < 0.001). Improvements of 13 – 66% (p < 0.001) were recorded in 2008 for individual clinical activities, reaching performance > 92% in perfusion, ulcer depth, infection assessment and management, offloading and education. Hospitalisation impacts included reductions of up to 64% in amputation rates per 100,000 population (p < 0.001) and of 24% in average length of stay (p < 0.001). Conclusion: These findings support the use of multi-faceted strategies in diverse ambulatory services to standardise practice, improve the management of diabetic foot complications and positively impact hospitalisation outcomes.
As of October 2010, these strategies had been rolled out to over 25 ambulatory sites, representing 66% of Queensland Health districts, managing 1,820 patients and 13,380 occasions of service, including 543 healed ulcer patients. It is expected that this number will rise dramatically as an incentive payment for the use of the teleform is expanded.

Relevance: 60.00%

Abstract:

During the last several decades, the quality of natural resources and their services have been exposed to significant degradation from increased urban populations combined with the sprawl of settlements, development of transportation networks and industrial activities (Dorsey, 2003; Pauleit et al., 2005). As a result of this environmental degradation, a sustainable framework for urban development is required to provide the resilience of natural resources and ecosystems. Sustainable urban development refers to the management of cities with adequate infrastructure to support the needs of its population for the present and future generations as well as maintain the sustainability of its ecosystems (UNEP/IETC, 2002; Yigitcanlar, 2010). One of the important strategic approaches for planning sustainable cities is ‘ecological planning’. Ecological planning is a multi-dimensional concept that aims to preserve biodiversity richness and ecosystem productivity through the sustainable management of natural resources (Barnes et al., 2005). As stated by Baldwin (1985, p.4), ecological planning is the initiation and operation of activities to direct and control the acquisition, transformation, disruption and disposal of resources in a manner capable of sustaining human activities with a minimum disruption of ecosystem processes. Therefore, ecological planning is a powerful method for creating sustainable urban ecosystems. In order to explore the city as an ecosystem and investigate the interaction between the urban ecosystem and human activities, a holistic urban ecosystem sustainability assessment approach is required. Urban ecosystem sustainability assessment serves as a tool that helps policy and decision-makers in improving their actions towards sustainable urban development.
There are several methods used in urban ecosystem sustainability assessment, among which sustainability indicators and composite indices are the most commonly used tools for assessing the progress towards sustainable land use and urban management. Currently, a variety of composite indices are available to measure sustainability at the local, national and international levels. However, the main conclusion drawn from the literature review is that they are too broad to be applied to assess local and micro-level sustainability, and no benchmark value exists for most of the indicators due to limited data availability and non-comparable data across countries. Mayer (2008, p. 280) supports this view by stating "as different as the indices may seem, many of them incorporate the same underlying data because of the small number of available sustainability datasets". Mori and Christodoulou (2011) also argue that this relative evaluation and comparison brings along biased assessments, as data only exists for some entities, which also means excluding many nations from evaluation and comparison. Thus, there is a need for developing an accurate and comprehensive micro-level urban ecosystem sustainability assessment method. In order to develop such a model, it is practical to adopt an approach that uses a method to utilise indicators for collecting data, designate certain threshold values or ranges, perform a comparative sustainability assessment via indices at the micro-level, and aggregate these assessment findings to the local level. Hereby, through this approach and model, it is possible to produce sufficient and reliable data to enable comparison at the local level, and provide useful results to inform the local planning, conservation and development decision-making process to secure sustainable ecosystems and urban futures.
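The aggregation approach described above — indicator values normalised against designated threshold ranges and combined into a composite score — can be sketched as follows. The indicator names, values and threshold ranges below are hypothetical placeholders, not the actual MUSIX indicators or benchmarks.

```python
def normalise(value, worst, best):
    """Map an indicator value onto [0, 1], where 1 means the 'best'
    threshold is reached. Works whether higher or lower raw values
    are better, because 'worst' and 'best' set the direction."""
    score = (value - worst) / (best - worst)
    return max(0.0, min(1.0, score))

def composite_index(indicators, thresholds):
    """Equal-weight mean of normalised indicator scores.

    `indicators` maps name -> observed value; `thresholds` maps
    name -> (worst, best). Equal weighting is an assumption made
    here for simplicity.
    """
    scores = [normalise(indicators[name], *thresholds[name])
              for name in indicators]
    return sum(scores) / len(scores)

# Hypothetical example: pollution is 'reversed' (lower raw value is better).
thresholds = {"hydrology": (0, 100), "ecology": (0, 100), "pollution": (100, 0)}
observed = {"hydrology": 60, "ecology": 80, "pollution": 30}
print(round(composite_index(observed, thresholds), 3))  # → 0.7
```

Clamping to [0, 1] is what makes designated thresholds act as benchmarks: a value beyond the 'best' threshold cannot inflate the index, and one beyond the 'worst' cannot drive it negative.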
To advance research in this area, this study investigated the environmental impacts of an existing urban context by using a composite index, with the aim of identifying the interaction between urban ecosystems and human activities in the context of environmental sustainability. In this respect, this study developed a new comprehensive urban ecosystem sustainability assessment tool entitled the ‘Micro-level Urban-ecosystem Sustainability IndeX’ (MUSIX). The MUSIX model is an indicator-based indexing model that investigates the factors affecting urban sustainability in a local context. The model outputs provide local and micro-level sustainability reporting guidance to help policy-making concerning environmental issues. A multi-method research approach, based on both quantitative and qualitative analysis, was employed in the construction of the MUSIX model. First, qualitative research was conducted through an interpretive and critical literature review to develop a theoretical framework and select indicators. Afterwards, quantitative research was conducted through statistical and spatial analyses in data collection, processing and model application. The MUSIX model was tested in four pilot study sites selected from the Gold Coast City, Queensland, Australia. The model results assessed the sustainability performance of current urban settings with reference to six main issues of urban development: (1) hydrology, (2) ecology, (3) pollution, (4) location, (5) design, and (6) efficiency. For each category, a set of core indicators was assigned which are intended to: (1) benchmark the current situation, strengths and weaknesses, (2) evaluate the efficiency of implemented plans, and (3) measure the progress towards sustainable development.
While the indicator set of the model provided specific information about the environmental impacts in the area at the parcel scale, the composite index score provided general information about the sustainability of the area at the neighbourhood scale. Finally, in light of the model findings, integrated ecological planning strategies were developed to guide the preparation and assessment of development and local area plans in conjunction with the Gold Coast Planning Scheme, which establishes regulatory provisions to achieve ecological sustainability through the formulation of place codes, development codes, constraint codes and other assessment criteria that provide guidance for best practice development solutions. These strategies can be summarised as follows:
• Establishing hydrological conservation through sustainable stormwater management in order to preserve the Earth’s water cycle and aquatic ecosystems;
• Providing ecological conservation through sustainable ecosystem management in order to protect biological diversity and maintain the integrity of natural ecosystems;
• Improving environmental quality through developing pollution prevention regulations and policies in order to promote high quality water resources, clean air and enhanced ecosystem health;
• Creating sustainable mobility and accessibility through designing better local services and walkable neighbourhoods in order to promote safe environments and healthy communities;
• Sustainable design of the urban environment through climate responsive design in order to increase the efficient use of solar energy to provide thermal comfort; and
• Use of renewable resources through creating efficient communities in order to provide long-term management of natural resources for the sustainability of future generations.

Relevance: 60.00%

Abstract:

Floods are among the most devastating events affecting primarily tropical, archipelagic countries such as the Philippines. With current predictions of climate change set to include rising sea levels, intensification of typhoon strength and a general increase in mean annual precipitation throughout the Philippines, it has become paramount to prepare for the future so that the increased flood risk does not translate into more economic and human loss. Field work and data gathering were done within the framework of an internship at the former German Technical Cooperation (GTZ), in cooperation with the Local Government Unit of Ormoc City, Leyte, the Philippines, in order to develop a dynamic computer-based flood model for the basin of the Pagsangaan River. To this end, different geo-spatial analysis tools such as PCRaster and ArcGIS, hydrological analysis packages and basic engineering techniques were assessed and implemented. The aim was to develop a dynamic flood model and to use the development process to determine the required data, their availability, and their impact on the results, as a case study for flood early warning systems in the Philippines. The hope is that such projects can help to reduce flood risk by including the results of worst-case scenario analyses and current climate change predictions in city planning for municipal development, monitoring strategies and early warning systems. The project was developed using a 1D-2D coupled model in SOBEK (the Deltares hydrological modelling software package) and was also used as a case study to analyze and understand the influence of different factors such as land use, schematization, time step size and tidal variation on the flood characteristics. Several sources of relevant satellite data were compared, such as Digital Elevation Models (DEMs) from ASTER and SRTM data, as well as satellite rainfall data from the GIOVANNI server (NASA) and field gauge data.
Different methods were used in the attempt to partially calibrate and validate the model, and finally to simulate and study two climate change scenarios based on scenario A1B predictions. It was observed that large areas currently considered not prone to floods will become low flood risk areas (0.1-1 m water depth). Furthermore, larger sections of the floodplains upstream of the Lilo-an’s Bridge will become moderate flood risk areas (1-2 m water depth). The flood hazard maps created during the development of the present project will be presented to the LGU, and the model will be used by GTZ’s Local Disaster Risk Management Department to create a larger set of possible flood-prone areas related to rainfall intensity and to study possible improvements to the current early warning system and monitoring of the basin section belonging to Ormoc City. Recommendations about further enhancement of the geo-hydro-meteorological data to improve the model’s accuracy, mainly in areas of interest, will also be presented to the LGU.
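Turning simulated water depths into the hazard classes used above (0.1-1 m low risk, 1-2 m moderate risk) is a simple post-processing step when producing flood hazard maps. A sketch, with the two class boundaries taken from the text and everything else — the function name, the "none" class below 0.1 m, and the "high" class above 2 m — assumed for illustration:

```python
def flood_risk_class(depth_m):
    """Classify a simulated water depth (metres) into hazard classes.

    The 0.1-1 m ('low') and 1-2 m ('moderate') bands come from the
    study's description; 'none' and 'high' are assumed extensions
    covering the remaining depth ranges.
    """
    if depth_m < 0.1:
        return "none"
    if depth_m <= 1.0:
        return "low"
    if depth_m <= 2.0:
        return "moderate"
    return "high"

# Classify a few sample cell depths from a hypothetical model run.
depths = [0.05, 0.4, 1.5, 2.8]
print([flood_risk_class(d) for d in depths])
# → ['none', 'low', 'moderate', 'high']
```

Applied cell by cell to a rasterised model output, this kind of thresholding is what converts a depth grid into a categorical hazard map for planners.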

Relevance: 60.00%

Abstract:

Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited and often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm, using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR, assuming probability-distributed data, provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly adopted in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which is primarily influenced by surface characteristics and rainfall intensity.
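The contrast the study draws between OLSR and WLSR can be seen in the estimator itself: WLS minimises a weighted sum of squared residuals, so observations assumed to have larger variance count for less. A minimal single-predictor sketch — the wash-off-style data and weights below are synthetic, and the study's Bayesian/Gibbs sampling machinery is not reproduced here:

```python
def weighted_least_squares(x, y, w):
    """Closed-form weighted least-squares fit of y = a + b*x.

    With all weights equal this reduces to ordinary least squares.
    """
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = (sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y))
         / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x)))
    a = ybar - b * xbar
    return a, b

# Synthetic data following y = 2 + 3x, with one noisy observation at x = 5.
x = [1, 2, 3, 4, 5]
y = [5, 8, 11, 14, 40]                                   # last point should be 17
ols_a, ols_b = weighted_least_squares(x, y, [1, 1, 1, 1, 1])
wls_a, wls_b = weighted_least_squares(x, y, [1, 1, 1, 1, 0.05])  # down-weight it
print(round(ols_b, 2), round(wls_b, 2))  # WLS slope is much closer to 3
```

Down-weighting the suspect observation pulls the slope back toward the underlying relationship, which is the intuition behind using WLSR when measurement error varies across wash-off observations.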

Relevance: 60.00%

Abstract:

The Wet Tropics region has a unique water asset and is also considered a priority region for the improvement of water quality entering the Great Barrier Reef, due to a combination of high rainfall, intensive agricultural use, urban areas and the proximity of valuable reef assets to the coast. Agricultural activities are one of many identified threats to water quality and water flows in the Wet Tropics, in terms of sediment- and pollutant-related water quality decline. Information describing the current state of agricultural management practices across the region is patchy at best. Based on the best available information on agricultural management practices in the Wet Tropics in 2008, it is clear that opportunities exist to improve nutrient, sediment and pesticide management practice to reduce the impact on the water asset and the Great Barrier Reef. Based on current understandings of practices and the relationship between practices and reef water quality, the greatest opportunities for improved water quality are as follows:
· nutrients – correct rate and placement of fertilisers;
· pesticides – improved weed control planning, herbicide rates and calibration practice; and
· soil and sediment – implementation of new farming system practices.
The 2008-09 Reef Rescue program sought to accelerate the rate of adoption of improved management practices and, through Terrain, invested $6.8M in the 2008-09 year in:
· landholder water quality improvement incentive payments;
· cross-regional catchment repair of wetlands and riparian lands in areas of high sediment or nutrient loss; and
· partnerships in the region to lever resources and support for on-ground practice change.
The program delivered $3,021,999 in on-ground incentives to landholders in the Wet Tropics to improve farm practices from D or C level to B or A level.
The landholder Water Quality Incentives Grants program received 300 individual applications for funding and funded 143 individual landholders to implement practice change across 36,098 ha of farm land. It is estimated that the Reef Rescue program facilitated practice change across 21% of the cane industry and 20% of the banana industry. The program levered an additional $2,441,166 in landholder cash contributions and a further $907,653 in non-cash in-kind contributions, bringing the total project value of the landholder grants program in the Wet Tropics to $6,370,819. Most funded projects targeted multiple water quality objectives, with a focus on nutrient and sediment reduction. Of the 143 projects funded, 115 addressed nutrient management either as the primary focus or in combination with strategies that targeted other water quality objectives. Overall, 82 projects addressed two or more water quality targets. Forty-five percent of incentive funds were allocated to new farming system practices (direct-drill legumes, zonal tillage equipment, permanent beds, min-till planting equipment, GPS units, laser levelling), followed by 24% allocated to subsurface fertiliser applicators (subsurface application of fertiliser using a stool splitter or beside the stool, at the correct Six Easy Steps rate). As a result, Terrain estimates that the incentive grants achieved considerable reductions in nitrogen, phosphorus, sediment and pesticide loads. The program supported nutrient management training of 167 growers managing farms covering over 20% of the area harvested in 2008, and of 18 industry advisors and resellers. This resulted in 115 growers (155 farms) developing nutrient management plans. The program also supported Integrated Weed Management training of 80 growers managing farms covering 8% of the area harvested in 2008, and of 6 industry advisors and resellers.
This report, which draws on the best available Reef Rescue Management Monitoring, Evaluation, Reporting, and Improvement (MERI) information to evaluate program performance and impact on water quality outcomes, is the first in a series of annual reports that will assess and evaluate the impact of the Reef Rescue program on agricultural practices and water quality outcomes. The assessment is predominantly focused on the cane industry because of data availability. In the next stage, efforts will expand to:
· improve practice data for the banana and grazing industries;
· gain a better understanding of water quality trends and the factors influencing them in the Wet Tropics; in particular, work will focus on linking the results of the Paddock to Reef monitoring program and practice change data to assess program impact;
· enhance estimations of the impact of practice change on pollutant loads from agricultural land use;
· gain a better understanding of the extent of ancillary practice change (change not directly funded) resulting from Reef Rescue training/education/communication programs; and
· provide a better understanding of the economic cost of practice change across the Wet Tropics region.
From an ecological perspective, water quality trends and the factors that may be contributing to change require further investigation. There is a critical need to work towards an enhanced understanding of the link between catchment land management practice change and reef water quality, so that reduced nutrient, sediment and pesticide discharge to the Great Barrier Reef can be quantified. This will also assist with future prioritisation of grant money to agricultural industries, catchments and sub-catchments. From a social perspective, the program has delivered significant water quality benefits from landholder education and training.
It is believed that these activities are giving landholders the information and tools to implement further lasting change in their production systems and, in doing so, creating a change in attitude that is supportive and inclusive of Natural Resource Management (NRM). The program in the Wet Tropics has also considerably strengthened institutional partnerships for NRM, particularly between NRM, industry and extension organisations. As a result of the Reef Rescue program, all institutions are actively working together to collectively improve water quality. The Reef Rescue program is improving water quality entering the Great Barrier Reef Lagoon by catalysing substantial activity in the Wet Tropics region to improve land management practices and reduce the water quality impact of agricultural landscapes. The solid institutional partnerships between the regional body, industry, catchment and government organisations have been fundamental to the successful delivery of the landholder grant and catchment rehabilitation programs. Landholders have generally had a positive perception of and reaction to the program, its intent, and the practical, focused nature of grant-based support. Demand for the program was extremely high in 2008-09 and is expected to increase in 2009-10.

Relevância:

60.00%

Publicador:

Resumo:

Determination of sequence similarity is a central issue in computational biology, a problem addressed primarily through BLAST, an alignment-based heuristic which has underpinned much of the analysis and annotation of the genomic era. Despite their success, alignment-based approaches scale poorly with increasing data set size, and are not robust under structural sequence rearrangements. Successive waves of innovation in sequencing technologies – so-called Next Generation Sequencing (NGS) approaches – have led to an explosion in data availability, challenging existing methods and motivating novel approaches to sequence representation and similarity scoring, including the adaptation of methods from other domains such as information retrieval. In this work, we investigate locality-sensitive hashing of sequences through binary document signatures, applying the method to a bacterial protein classification task. Here, the goal is to predict the gene family to which a given query protein belongs. Experiments carried out on a pair of small but biologically realistic datasets (the full protein repertoires of families of Chlamydia and Staphylococcus aureus genomes respectively) show that a measure of similarity obtained by locality-sensitive hashing gives highly accurate results while offering a number of avenues that will lead to substantial performance improvements over BLAST.
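The idea of reducing a sequence to a binary signature and scoring similarity without alignment can be sketched as follows. This is an illustrative SimHash-style construction over protein k-mers, not the authors' actual method; the sequences, signature width and k-mer length are hypothetical.

```python
# Illustrative sketch (not the paper's implementation): locality-sensitive
# binary signatures over protein k-mers, with similarity scored by
# Hamming distance between signatures.
import hashlib

def kmers(seq, k=3):
    """All overlapping k-mers of a sequence."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def signature(seq, bits=64, k=3):
    """SimHash-style signature: each k-mer's hash casts +/-1 votes per bit;
    the sign of the vote total sets that bit of the signature."""
    counts = [0] * bits
    for kmer in kmers(seq, k):
        h = int.from_bytes(hashlib.md5(kmer.encode()).digest()[:8], "big")
        for b in range(bits):
            counts[b] += 1 if (h >> b) & 1 else -1
    return sum(1 << b for b in range(bits) if counts[b] > 0)

def hamming(a, b):
    """Number of differing bits between two signatures."""
    return bin(a ^ b).count("1")

# Similar sequences share most k-mers, so their signatures stay close;
# unrelated sequences disagree on roughly half the bits on average.
s1 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"
s2 = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVA"   # single substitution
s3 = "GATTACAGATTACAGATTACAGATTACAGATTA"   # unrelated
d_close = hamming(signature(s1), signature(s2))
d_far = hamming(signature(s1), signature(s3))
print(d_close < d_far)
```

The appeal for large NGS datasets is that signatures are fixed-width bit vectors, so comparison is a constant-time XOR and popcount regardless of sequence length, in contrast to alignment's dependence on both sequence lengths.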

Relevância:

60.00%

Publicador:

Resumo:

The 2010 biodiversity target agreed by signatories to the Convention on Biological Diversity directed the attention of conservation professionals toward the development of indicators with which to measure changes in biological diversity at the global scale. We considered why global biodiversity indicators are needed, what characteristics successful global indicators have, and how existing indicators perform. Because monitoring could absorb a large proportion of funds available for conservation, we believe indicators should be linked explicitly to monitoring objectives, and decisions about which monitoring schemes deserve funding should be informed by predictions of the value of such schemes to decision making. We suggest that raising awareness among the public and policy makers, auditing management actions, and informing policy choices are the most important global monitoring objectives. Using four well-developed indicators of biological diversity (extent of forests, coverage of protected areas, Living Planet Index, Red List Index) as examples, we analyzed the characteristics needed for indicators to meet these objectives. We recommend that conservation professionals improve on existing indicators by eliminating spatial biases in data availability, filling gaps in information about ecosystems other than forests, and improving understanding of the way indicators respond to policy changes. Monitoring is not an end in itself, and we believe it is vital that the ultimate objectives of global monitoring of biological diversity inform development of new indicators. ©2010 Society for Conservation Biology.

Relevância:

60.00%

Publicador:

Resumo:

The ability to estimate the expected Remaining Useful Life (RUL) is critical to reduce maintenance costs, operational downtime and safety hazards. In most industries, reliability analysis is based on Reliability Centred Maintenance (RCM) and lifetime distribution models. In these models, the lifetime of an asset is estimated using failure time data; however, statistically sufficient failure time data are often difficult to attain in practice due to fixed time-based replacement and the small population of identical assets. When condition indicator data are available in addition to failure time data, one of the alternative approaches to the traditional reliability models is Condition-Based Maintenance (CBM). Covariate-based hazard modelling is one CBM approach. There are a number of covariate-based hazard models; however, few studies have evaluated the performance of these models in asset life prediction across various condition indicators and levels of data availability. This paper reviews two covariate-based hazard models, the Proportional Hazard Model (PHM) and the Proportional Covariate Model (PCM). To assess these models' performance, the expected RUL is compared to the actual RUL. Outcomes demonstrate that both models achieve convincingly good results in RUL prediction; however, PCM has a smaller absolute prediction error. In addition, PHM shows an over-smoothing tendency compared to PCM when condition data change suddenly. Moreover, the case studies show that PCM is not biased in the case of small sample sizes.
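The core of covariate-based hazard modelling can be sketched numerically. The example below uses a PHM with a Weibull baseline hazard scaled by a condition covariate (all parameter values are hypothetical, chosen only for illustration), and obtains the expected RUL at a given age by numerically integrating the conditional survival function.

```python
# Minimal numerical sketch (hypothetical parameters) of a Proportional
# Hazard Model: Weibull baseline hazard scaled by exp(beta * z), where z
# is a condition covariate such as a vibration level.
import math

SHAPE, SCALE, BETA = 2.5, 1000.0, 0.4     # illustrative Weibull PHM parameters

def cum_hazard(t, z):
    """Cumulative hazard H(t, z) = (t / scale)^shape * exp(beta * z)."""
    return (t / SCALE) ** SHAPE * math.exp(BETA * z)

def survival(t, z):
    """Survival function S(t, z) = exp(-H(t, z))."""
    return math.exp(-cum_hazard(t, z))

def expected_rul(age, z, dt=1.0, horizon=10000.0):
    """E[T - age | T > age] = (integral of S from age to infinity) / S(age),
    approximated with a midpoint rule up to a finite horizon."""
    s_age = survival(age, z)
    t, total = age, 0.0
    while t < horizon:
        total += survival(t + dt / 2.0, z) * dt
        t += dt
    return total / s_age

# A worse condition reading (larger covariate) inflates the hazard and
# therefore shortens the expected remaining life at the same age.
rul_good = expected_rul(age=500.0, z=0.0)
rul_bad = expected_rul(age=500.0, z=2.0)
print(rul_good > rul_bad)
```

In a full PHM application, beta would be estimated from historical failure times and covariate histories rather than fixed; the expected RUL would then be recomputed each time a new condition reading arrives.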