Abstract:
Long-term systematic population monitoring data sets are rare but are essential for identifying changes in species abundance. In contrast, community groups and natural history organizations have collected many species lists. These represent a large, untapped source of information on changes in abundance but are generally considered of little value. The major problem with using species lists to detect population changes is that the amount of effort used to obtain the list is often uncontrolled and usually unknown. It has been suggested that the number of species on a list, the "list length," can be used as a measure of effort. This paper significantly extends the utility of Franklin's approach using Bayesian logistic regression. We demonstrate the value of List Length Analysis to model changes in species prevalence (i.e., the proportion of lists on which the species occurs) using bird lists collected by a local bird club over 40 years around Brisbane, southeast Queensland, Australia. We estimate the magnitude and certainty of change for 269 bird species and calculate the probabilities that there have been declines and increases of given magnitudes. List Length Analysis confirmed suspected species declines and increases. This method is an important complement to systematically designed intensive monitoring schemes and provides a means of utilizing data that may otherwise be deemed useless. The results of List Length Analysis can be used for targeting species of conservation concern for listing purposes or for more intensive monitoring. While Bayesian methods are not essential for List Length Analysis, they can offer more flexibility in interrogating the data and are able to provide a range of parameters that are easy to interpret and can facilitate conservation listing and prioritization. © 2010 by the Ecological Society of America.
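The core idea of List Length Analysis — modelling the probability that a species appears on a list as a function of list length and time — can be sketched as a simple logistic model. The coefficients below are hypothetical, purely for illustration; in the paper they are fitted by Bayesian logistic regression:

```python
from math import exp, log

def prevalence(list_length, year, b0=-2.0, b1=0.8, b2=-0.02):
    """Probability that a species appears on a list, modelled as
        logit(p) = b0 + b1*log(L) + b2*(year - 1970)
    where L is list length. All coefficients here are hypothetical.
    A negative year coefficient (b2) corresponds to a declining species."""
    eta = b0 + b1 * log(list_length) + b2 * (year - 1970)
    return 1.0 / (1.0 + exp(-eta))

# Longer lists imply more effort, hence a higher detection probability.
print(prevalence(10, 1970), prevalence(100, 1970))
```

The list-length term absorbs variation in observer effort, so the year term can be read as a change in prevalence rather than a change in effort.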
Abstract:
Almost 10 years ago, Pullin and Knight (2001) called for an "effectiveness revolution in conservation" to be enabled by the systematic evaluation of evidence for conservation decision making. Drawing from the model used in clinical medicine, they outlined the concept of "evidence-based conservation," in which existing information, or evidence, from relevant and rigorous research is compiled and analyzed in a systematic manner to inform conservation actions (Cochrane 1972). The promise of evidence-based conservation has generated significant interest; 25 systematic reviews have been completed since 2004 and dozens are underway (Collaboration for Environmental Evidence 2010). However, we argue that an "effectiveness revolution" (Pullin & Knight 2001) in conservation will not be possible unless mechanisms are devised for incorporating the growing evidence base into decision frameworks. For conservation professionals to accomplish the missions of their organizations, they must demonstrate that their actions actually achieve objectives (Pullin & Knight 2009). Systematic evaluation provides a framework for objectively evaluating the effectiveness of actions. To leverage the benefit of these evaluations, we need resource-allocation systems that are responsive to their outcomes. The allocation of conservation resources is often the product of institutional priorities or reliance on intuition (Sutherland et al. 2004; Pullin & Knight 2005; Cook et al. 2010). We highlight the NICE technology-appraisal process because it provides an example of formal integration of systematic-evidence evaluation with provision of guidance for action. The transparent process, which clearly delineates costs and benefits of each alternative action, could also provide the public with new insight into the environmental effects of different decisions.
This insight could stimulate a wider discussion about investment in conservation by demonstrating how changes in funding might affect the probability of achieving conservation objectives. ©2010 Society for Conservation Biology
Abstract:
The 2010 biodiversity target agreed by signatories to the Convention on Biological Diversity directed the attention of conservation professionals toward the development of indicators with which to measure changes in biological diversity at the global scale. We considered why global biodiversity indicators are needed, what characteristics successful global indicators have, and how existing indicators perform. Because monitoring could absorb a large proportion of funds available for conservation, we believe indicators should be linked explicitly to monitoring objectives, and decisions about which monitoring schemes deserve funding should be informed by predictions of the value of such schemes to decision making. We suggest that raising awareness among the public and policy makers, auditing management actions, and informing policy choices are the most important global monitoring objectives. Using four well-developed indicators of biological diversity (extent of forests, coverage of protected areas, Living Planet Index, Red List Index) as examples, we analyzed the characteristics needed for indicators to meet these objectives. We recommend that conservation professionals improve on existing indicators by eliminating spatial biases in data availability, filling gaps in information about ecosystems other than forests, and improving understanding of the way indicators respond to policy changes. Monitoring is not an end in itself, and we believe it is vital that the ultimate objectives of global monitoring of biological diversity inform development of new indicators. ©2010 Society for Conservation Biology.
Abstract:
In ecosystems driven by water availability, plant community dynamics depend on complex interactions between vegetation, hydrology, and human water resources use. Along ephemeral rivers—where water availability is erratic—vegetation and people are particularly vulnerable to changes in each other's water use. Sensible management requires that water supply be maintained for people, while preserving ecosystem health. Meeting such requirements is challenging because of the unpredictable water availability. We applied information gap decision theory to an ecohydrological system model of the Kuiseb River environment in Namibia. Our aim was to identify the robustness of ecosystem and water management strategies to uncertainties in future flood regimes along ephemeral rivers. We evaluated the trade-offs between alternative performance criteria and their robustness to uncertainty to account for both (i) human demands for water supply and (ii) the risk of species extinction caused by water mining. Increasing uncertainty of flood regime parameters reduced the performance under both objectives. Remarkably, the ecological objective (species coexistence) was more sensitive to uncertainty than the water supply objective. However, within each objective, the relative performance of different management strategies was insensitive to uncertainty. The 'best' management strategy was one tuned to the competitive species interactions in the Kuiseb environment. It regulates the biomass of the strongest competitor and, thus, at the same time decreases transpiration, thereby increasing groundwater storage and reducing pressure on less dominant species. This robust mutually acceptable strategy enables species persistence without markedly reducing the water supply for humans. This study emphasises the utility of ecohydrological models for resource management of water-controlled ecosystems.
Although trade-offs were identified between alternative performance criteria and their robustness to uncertain future flood regimes, management strategies were identified that help to secure an ecologically sustainable water supply.
Abstract:
Knowledge of the pollutant build-up process is a key requirement for developing stormwater pollution mitigation strategies. In this context, process variability is a concept which needs to be understood in depth. Analysis of particulate build-up on three road surfaces in an urban catchment confirmed that particles <150µm and >150µm have characteristically different build-up patterns, and these patterns are consistent over different field conditions. Three theoretical build-up patterns were developed based on the size-fractionated particulate build-up patterns, and these patterns explain the variability in particle behavior and the variation in particle-bound pollutant load and composition over the antecedent dry period. Behavioral variability of particles <150µm was found to exert the most significant influence on the build-up process variability. As characterization of process variability is particularly important in stormwater quality modeling, it is recommended that the influence of behavioral variability of particles <150µm on pollutant build-up be specifically addressed. This would eliminate model deficiencies in the replication of the build-up process and facilitate accounting for the inherent process uncertainty, thereby enhancing water quality predictions.
Abstract:
Wound healing and tumour growth involve collective cell spreading, which is driven by individual motility and proliferation events within a population of cells. Mathematical models are often used to interpret experimental data and to estimate the parameters so that predictions can be made. Existing methods for parameter estimation typically assume that these parameters are constants and often ignore any uncertainty in the estimated values. We use approximate Bayesian computation (ABC) to estimate the cell diffusivity, D, and the cell proliferation rate, λ, from a discrete model of collective cell spreading, and we quantify the uncertainty associated with these estimates using Bayesian inference. We use a detailed experimental data set describing the collective cell spreading of 3T3 fibroblast cells. The ABC analysis is conducted for different combinations of initial cell densities and experimental times in two separate scenarios: (i) where collective cell spreading is driven by cell motility alone, and (ii) where collective cell spreading is driven by combined cell motility and cell proliferation. We find that D can be estimated precisely, with a small coefficient of variation (CV) of 2–6%. Our results indicate that D appears to depend on the experimental time, which is a feature that has been previously overlooked. Assuming that the values of D are the same in both experimental scenarios, we use the information about D from the first experimental scenario to obtain reasonably precise estimates of λ, with a CV between 4 and 12%. Our estimates of D and λ are consistent with previously reported values; however, our method is based on a straightforward measurement of the position of the leading edge whereas previous approaches have involved expensive cell counting techniques. Additional insights gained using a fully Bayesian approach justify the computational cost, especially since it allows us to accommodate information from different experiments in a principled way.
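The ABC approach described above can be sketched as a rejection sampler. The forward model below is a deliberately simplified toy (leading-edge displacement growing like √(Dt)), not the discrete spreading model used in the paper, and the prior bounds, tolerance, and units are assumptions:

```python
import random

def simulate(D, times):
    """Toy forward model: leading-edge displacement growing like sqrt(D*t).
    This stands in for the discrete spreading simulations used in the paper."""
    return [2.0 * (D * t) ** 0.5 for t in times]

def abc_reject(observed, times, prior=(0.0, 2000.0), eps=50.0, n=10_000, seed=1):
    """Rejection ABC: draw D from a uniform prior, run the forward model,
    and keep draws whose simulated output lies within eps of the data."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n):
        D = rng.uniform(*prior)
        sim = simulate(D, times)
        dist = sum((s - o) ** 2 for s, o in zip(sim, observed)) ** 0.5
        if dist < eps:
            accepted.append(D)
    return accepted

times = [12, 24, 36, 48]            # observation times (illustrative, hours)
true_D = 900.0                      # illustrative diffusivity, µm²/h
observed = simulate(true_D, times)  # synthetic "data"
post = abc_reject(observed, times)
est = sum(post) / len(post)         # posterior mean estimate of D
print(est)
```

Because only leading-edge positions enter the distance, this mirrors the paper's point that a straightforward edge measurement, rather than cell counting, can carry the inference.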
Abstract:
In this paper we present an update on our novel visualization technologies for cellular immune interactions, from both large-scale spatial and temporal perspectives. We do so with a primary motive: to present a visually and behaviourally realistic environment to the community of experimental biologists and physicians such that their knowledge and expertise may be more readily integrated into the model creation and calibration process. Visualization aids understanding, as we rely on visual perception to make crucial decisions. For example, with our initial model, we can visualize the dynamics of an idealized lymphatic compartment, with antigen presenting cells (APC) and cytotoxic T lymphocyte (CTL) cells. The visualization technology presented here offers the researcher the ability to start, pause, zoom in, zoom out and navigate in three dimensions through an idealized lymphatic compartment.
Abstract:
Over the last few years, investigations of human epigenetic profiles have identified key elements of change to be histone modifications, stable and heritable DNA methylation, and chromatin remodeling. These factors determine gene expression levels and characterise conditions leading to disease. In order to extract information embedded in long DNA sequences, data mining and pattern recognition tools are widely used, but efforts have been limited to date with respect to analyzing epigenetic changes and their role as catalysts in disease onset. Useful insight, however, can be gained by investigation of associated dinucleotide distributions. The focus of this paper is to explore specific dinucleotide frequencies across defined regions within the human genome, and to identify new patterns between epigenetic mechanisms and DNA content. Signal processing methods, including Fourier and wavelet transforms, are employed and principal results are reported.
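As a minimal illustration of the preprocessing step involved, the overlapping dinucleotide frequencies of a sequence (CpG being the pair most directly tied to DNA methylation) can be computed as below; the Fourier and wavelet analysis itself is omitted:

```python
from collections import Counter

def dinucleotide_freqs(seq):
    """Relative frequency of each overlapping dinucleotide in a DNA sequence."""
    seq = seq.upper()
    pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
    counts = Counter(pairs)
    total = len(pairs)
    return {pair: n / total for pair, n in counts.items()}

freqs = dinucleotide_freqs("ACGCGCGT")
print(freqs["CG"])  # 3 of the 7 overlapping pairs are CpG
```

Sliding such a window along a genomic region yields the per-region frequency signals to which transforms can then be applied.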
Abstract:
Aims: We assessed the diagnostic performance of z-scores to define a significant delta cardiac troponin (cTn) in a cohort of patients with well-defined clinical outcomes. Methods: We calculated z-scores, which are dependent on the analytical precision and biological variation, to report changes in cTn. We compared the diagnostic performances of a relative delta (%Δ), actual delta (Δ), and z-scores in 762 emergency department patients with symptoms of suspected acute coronary syndrome. cTn was measured with sensitive cTnI (Beckman Coulter), highly sensitive cTnI (Abbott), and highly sensitive cTnT (Roche) assays. Results: Receiver operating characteristic analysis showed no statistically significant differences in the areas under the curve (AUC) of z-scores and Δ, with both superior to %Δ for all three assays (p < 0.001). The AUCs of z-scores measured with the Abbott hs-cTnI (0.955) and Roche hs-cTnT (0.922) assays were comparable to Beckman Coulter cTnI (0.933) (p = 0.272 and 0.640, respectively). The individualized Δ cut-off values required to emulate a z-score of 1.96 were: Beckman Coulter cTnI 30 ng/l, Abbott hs-cTnI 20 ng/l, and Roche hs-cTnT 7 ng/l. Conclusions: z-scores allow the use of a single cut-off value at all cTn levels, for both cTnI and cTnT and for sensitive and highly sensitive assays, with comparable diagnostic performances. This strategy of reporting significant changes as z-scores may obviate the need for the empirical development of assay-specific cut-off rules to define significant troponin changes.
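A z-score of this kind combines analytical and within-subject biological variation. The sketch below uses the standard reference-change-value formulation with hypothetical troponin values and CVs; the paper's exact computation may differ:

```python
from math import sqrt

def delta_z(x1, x2, cv_a, cv_i):
    """z-score for a serial change from x1 to x2 (same units), given
    analytical (cv_a) and within-subject biological (cv_i) coefficients
    of variation in percent. |z| > 1.96 suggests the change exceeds what
    analytical plus biological variation alone would explain (~95% level)."""
    pct_change = 100.0 * (x2 - x1) / x1
    return pct_change / (sqrt(2.0) * sqrt(cv_a ** 2 + cv_i ** 2))

# Hypothetical serial troponin pair: 10 -> 20 ng/l with CVa = 10%, CVi = 15%
print(round(delta_z(10, 20, 10, 15), 2))
```

Because the denominator is fixed per assay, a single z cut-off (e.g. 1.96) can stand in for assay-specific absolute delta rules, which is the strategy the abstract advocates.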
Abstract:
This thesis examines the short-term changes occurring in a number of the eye's structures during reading tasks, and explores how these changes differ between normal eyes and those with short-sightedness (myopia). This research revealed changes in the shape and thickness of a number of the eye's structures during near work, and aspects of these changes showed differences associated with myopia. These findings have potentially important implications for our understanding of the role of near work in the development and progression of myopia.
Abstract:
Background: It is important for nutrition intervention in malnourished patients to be guided by accurate evaluation and detection of small changes in the patient's nutrition status over time. However, the current Subjective Global Assessment (SGA) is not able to detect changes over a short period of time. The aim of the study was to determine whether the 7-point SGA is more time-sensitive to nutrition changes than the conventional SGA. Methods: In this prospective study, 67 adult inpatients assessed as malnourished using both the 7-point SGA and conventional SGA were recruited. Each patient received nutrition intervention and was followed up post-discharge. Patients were reassessed using both tools at 1, 3 and 5 months from baseline assessment. Results: It took a significantly shorter time to see a one-point change using the 7-point SGA compared with the conventional SGA (median: 1 month vs. 3 months, p = 0.002). The likelihood of at least a one-point change was 6.74 times greater for the 7-point SGA than the conventional SGA after controlling for age, gender and medical specialties (odds ratio = 6.74, 95% CI 2.88-15.80, p < 0.001). Fifty-six percent of patients who had no change in SGA score had changes detected using the 7-point SGA. The level of agreement was 100% (k = 1, p < 0.001) between the 7-point SGA and 3-point SGA, and 83% (k = 0.726, p < 0.001) between two blinded assessors for the 7-point SGA. Conclusion: The 7-point SGA is more time-sensitive in its response to nutrition changes than the conventional SGA. It can be used to guide nutrition intervention for patients.
Abstract:
Underground transport tunnels are vulnerable to blast events. This paper develops and applies a fully coupled technique combining Smoothed Particle Hydrodynamics (SPH) and Finite Element methods to investigate the blast response of segmented bored tunnels. Findings indicate that several bolts failed in the longitudinal direction due to redistribution of blast loading to adjacent tunnel rings. The tunnel segments respond as arch mechanisms in the transverse direction and suffered damage mainly due to high bending stresses. The novel information from the present study will enable safer designs of buried tunnels and provide a benchmark reference for future developments in this area.