987 results for environment measurements
Abstract:
Compared with cavity-nesting owls, only a few studies, mostly in temperate Europe, have examined the breeding and diet of long-eared owls (Asio otus), possibly because the nests of the latter are difficult to reach. Here we studied a population of long-eared owls in a semi-arid region of Israel during 2006-2009, monitoring the diet of breeding owls and of owls at a communal roost every two to three months. The studied owls produced more young than reported for most European populations. Diet was not associated with the owls' breeding parameters, whereas laying date was negatively correlated with both clutch size and number of nestlings. The diet at nests contained more social voles (Microtus socialis) and fewer birds and house mice (Mus musculus) than that of adults at the roosts. The diet and breeding of long-eared owls in Israel thus differ from those in Europe, with birds and mice, in addition to voles, comprising an important part of the diet.
Abstract:
Canadian healthcare is changing. Over the past decade, the Health Care in Canada Survey (HCIC) has annually measured the reactions of the public and professional stakeholders to many of these change forces. In HCIC 2008, for the first time, the public's perception of their health status and all stakeholders' views of the burden and effective management of chronic diseases were sought. Overall, Canadians perceive themselves as healthy, with 84% of adults reporting good-to-excellent health. However, good health decreased with age as the occurrence of chronic illness rose, from 12% in the age group 18-24 to 65% in the population ≥65 years. More than 70% of all stakeholders were strongly or somewhat supportive of implementing coordinated care, or disease management programs, to improve the care of patients with chronic illnesses. Concordant support was also expressed for key disease management components, including coordinated interventions to improve home, community and self-care; increased wellness promotion; and increased use of clinical measurements and feedback to all stakeholders. However, there were also important areas of non-concordance. For example, the public and doctors consistently expressed less support than other stakeholders for the value of team care, including the use of non-physician professionals to provide patient care; increased patient involvement in decision-making; and the use of electronic health records to facilitate communication. Actual participation in disease management programs averaged 34% for professionals and 25% for the public. We conclude that chronic diseases are common, age-related and burdensome in Canada. Disease management, or coordinated intervention often delivered by teams, is also relatively common, despite its less-than-universal acceptance by all stakeholders. Further insights are needed, particularly into the variable perceptions of the value and efficacy of team-delivered healthcare and its important components.
Abstract:
BACKGROUND: Hyperoxaluria is a major risk factor for kidney stone formation. Although urinary oxalate measurement is part of every basic stone risk assessment, there is no standardized method for this measurement. METHODS: Urine samples from 24-h urine collections covering a broad range of oxalate concentrations were aliquoted and sent, in duplicate, to six blinded international laboratories for oxalate, sodium and creatinine measurement. In a second set of experiments, ten pairs of native urine and urine spiked with 10 mg/L of oxalate were sent for oxalate measurement. Three laboratories used a commercially available oxalate oxidase kit, two laboratories used a high-performance liquid chromatography (HPLC)-based method and one laboratory used both methods. RESULTS: Intra-laboratory reliability for oxalate measurement, expressed as the intraclass correlation coefficient (ICC), varied between 0.808 [95% confidence interval (CI): 0.427-0.948] and 0.998 (95% CI: 0.994-1.000), with lower values for HPLC-based methods. Acidification of urine samples prior to analysis led to significantly higher oxalate concentrations. The ICC for inter-laboratory reliability varied between 0.745 (95% CI: 0.468-0.890) and 0.986 (95% CI: 0.967-0.995). Recovery of the 10 mg/L oxalate-spiked samples varied between 8.7 ± 2.3 and 10.7 ± 0.5 mg/L. Overall, HPLC-based methods showed more variability than the oxalate oxidase kit-based methods. CONCLUSIONS: Significant variability was noted in the quantification of urinary oxalate concentration by different laboratories, which may partially explain the differences in hyperoxaluria prevalence reported in the literature. Our data stress the need for standardization of the method of oxalate measurement.
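The spike-recovery check described above amounts to a simple difference between paired measurements; a minimal sketch (the numerical values below are illustrative, not the study's data):

```python
# Spike-recovery check for urinary oxalate: each laboratory measures a
# native urine sample and the same sample spiked with 10 mg/L oxalate;
# the recovered spike is the difference between the two measurements.
# All concentrations below are hypothetical illustrations.

SPIKE_MG_PER_L = 10.0

def spike_recovery(native_mg_l, spiked_mg_l):
    """Recovered spike concentration (mg/L)."""
    return spiked_mg_l - native_mg_l

def recovery_percent(native_mg_l, spiked_mg_l, spike=SPIKE_MG_PER_L):
    """Recovery as a percentage of the nominal spike."""
    return 100.0 * spike_recovery(native_mg_l, spiked_mg_l) / spike

# A laboratory reading 18.5 mg/L on a sample whose native measurement was
# 10.0 mg/L has recovered 8.5 mg/L of the 10 mg/L spike, i.e. 85%.
```

Reported recoveries near the nominal 10 mg/L (here, between 8.7 and 10.7 mg/L on average) indicate how close each method comes to measuring the added oxalate in full.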
Abstract:
Glioma cell lines are an important tool for research in basic and translational neuro-oncology. Documentation of their genetic identity has become a requirement for scientific journals and grant applications, to exclude cross-contamination and misidentification that lead to misinterpretation of results. Here, we report the standard 16-marker short tandem repeat (STR) DNA fingerprints for a panel of 39 widely used glioma cell lines as a reference. Comparison of the fingerprints among themselves and with the large DSMZ database, comprising 9-marker STR profiles for 2278 cell lines, uncovered 3 misidentified cell lines and confirmed previously known cross-contaminations. Furthermore, 2 glioma cell lines exhibited identity scores of 0.8, which is proposed as the cutoff for detecting cross-contamination. Additional characteristics, comprising lack of a B-raf mutation in one line and a similarity score of 1 with the original tumor tissue in the other, excluded cross-contamination. Subsequent simulation procedures suggested that, when using DNA fingerprints comprising only 9 STR markers, the commonly used similarity score of 0.8 is not sufficiently stringent to unambiguously determine the origin. DNA fingerprints are confounded by frequent genetic alterations in cancer cell lines, particularly loss of heterozygosity, which reduce the informativeness of STR markers and, thereby, the overall power of distinction. The similarity score depends on the number of markers measured; thus, more markers or additional cell line characteristics, such as information on specific mutations, may be necessary to clarify the origin.
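Similarity scores of the kind discussed above are commonly computed as the fraction of shared alleles between two STR profiles (a Masters-style score is one widespread choice; the abstract does not specify the exact formula used). A minimal sketch, with hypothetical marker names and profiles rather than the reported panel:

```python
# Masters-style STR similarity: twice the number of shared alleles divided
# by the total number of alleles in both profiles, over markers typed in
# both. The marker names and allele calls below are hypothetical examples.

def str_similarity(profile_a, profile_b):
    """Profiles map marker name -> set of alleles; returns score in [0, 1]."""
    common_markers = profile_a.keys() & profile_b.keys()
    shared = sum(len(profile_a[m] & profile_b[m]) for m in common_markers)
    total = sum(len(profile_a[m]) + len(profile_b[m]) for m in common_markers)
    return 2.0 * shared / total if total else 0.0

line1 = {"TH01": {6, 9.3}, "D5S818": {11, 12}, "TPOX": {8}}
line2 = {"TH01": {6, 9.3}, "D5S818": {11}, "TPOX": {8, 11}}

# 4 shared alleles, 10 alleles in total -> score 0.8, exactly the
# commonly used cross-contamination cutoff mentioned in the text.
score = str_similarity(line1, line2)
```

With few markers, loss of heterozygosity (a lost allele shrinks a marker's set) shifts this score substantially, which is why the text argues that 9 markers and a 0.8 cutoff may not be stringent enough.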
Abstract:
Redshifts for 100 galaxies in 10 clusters of galaxies are presented, based on data obtained between March 1984 and March 1985 from Calar Alto, La Palma, and ESO, and on data from Mauna Kea. Data for individual galaxies are given, and the accuracy of the velocities from the four instruments is discussed. Comparison with published data shows the present velocities to be shifted by +4.0 km/s on average, with a standard deviation of the differences of 89.7 km/s, consistent with the rms errors of the redshift measurements, which range from 50 to 100 km/s.
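The comparison with published data reduces to statistics on paired velocity differences (present minus published). A minimal sketch with made-up velocities in km/s, not the survey's data:

```python
# Mean offset and scatter of paired velocity differences, the two numbers
# quoted in such comparisons (+4.0 km/s and 89.7 km/s in the text above).
# The velocities below are hypothetical illustrations.

from statistics import mean, stdev

def velocity_offset(present_kms, published_kms):
    """Return (mean difference, sample std of differences) in km/s."""
    diffs = [p - q for p, q in zip(present_kms, published_kms)]
    return mean(diffs), stdev(diffs)

offset, scatter = velocity_offset(
    [15100.0, 15210.0, 15190.0],   # present measurements
    [15100.0, 15200.0, 15200.0],   # published values
)
```

A scatter comparable to the quadrature sum of the individual measurement errors (here 50-100 km/s per instrument) indicates no unexplained disagreement beyond the stated uncertainties.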
Abstract:
Relationships between porosity and hydraulic conductivity tend to be strongly scale- and site-dependent and are thus very difficult to establish. As a result, hydraulic conductivity distributions inferred from geophysically derived porosity models must be calibrated using some measurement of aquifer response. This type of calibration is potentially very valuable as it may allow for transport predictions within the considered hydrological unit at locations where only geophysical measurements are available, thus reducing the number of well tests required and thereby the costs of management and remediation. Here, we explore this concept through a series of numerical experiments. Considering the case of porosity characterization in saturated heterogeneous aquifers using crosshole ground-penetrating radar and borehole porosity log data, we use tracer test measurements to calibrate a relationship between porosity and hydraulic conductivity that allows the best prediction of the observed hydrological behavior. To examine the validity and effectiveness of the obtained relationship, we examine its performance at alternate locations not used in the calibration procedure. Our results indicate that this methodology allows us to obtain remarkably reliable hydrological predictions throughout the considered hydrological unit based on the geophysical data only. This was also found to be the case when significant uncertainty was considered in the underlying relationship between porosity and hydraulic conductivity.
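The calibrate-then-predict logic described above can be sketched with a Kozeny-Carman-type power law, a common parameterization of the porosity-conductivity link. The functional form and the numbers below are illustrative assumptions, not the study's actual calibrated relationship:

```python
# Kozeny-Carman-type sketch of a porosity-to-hydraulic-conductivity
# relation: calibrate the coefficient at one well using a tracer-derived
# K estimate, then predict K elsewhere from geophysical porosity alone.
# Functional form and values are illustrative, not the study's.

def kc_conductivity(phi, c):
    """K = c * phi^3 / (1 - phi)^2  (m/s)."""
    return c * phi**3 / (1.0 - phi)**2

def calibrate_c(phi_obs, k_obs):
    """Solve for the coefficient c from one observed (porosity, K) pair."""
    return k_obs * (1.0 - phi_obs)**2 / phi_obs**3

# Calibration well: radar-derived porosity 0.30, tracer-derived K 5e-4 m/s.
c = calibrate_c(0.30, 5.0e-4)

# Prediction at a location with only geophysical data (porosity 0.35).
k_predicted = kc_conductivity(0.35, c)
```

Checking such a relationship at locations withheld from the calibration, as the abstract describes, is what establishes whether the geophysics-only predictions are reliable across the hydrological unit.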
Abstract:
Weather radar observations are currently the most reliable method for remote sensing of precipitation. However, a number of factors affect the quality of radar observations and may seriously limit automated quantitative applications of radar precipitation estimates, such as those required in Numerical Weather Prediction (NWP) data assimilation or in hydrological models. In this paper, a technique to correct two different problems typically present in radar data is presented and evaluated. The aspects dealt with are non-precipitating echoes - caused either by permanent ground clutter or by anomalous propagation of the radar beam (anaprop echoes) - and topographical beam blockage. The correction technique is based on the computation of realistic beam propagation trajectories from recent radiosonde observations, instead of assuming standard radio propagation conditions. The correction consists of three steps: 1) calculation of a Dynamic Elevation Map, which provides the minimum clutter-free antenna elevation for each pixel within the radar coverage; 2) correction for residual anaprop by checking the vertical reflectivity gradients within the radar volume; and 3) topographical beam blockage estimation and correction using a geometric optics approach. The technique is evaluated with four case studies in the Po Valley region (N Italy) using a C-band Doppler radar and a network of raingauges providing hourly precipitation measurements. The case studies cover different seasons, different radio propagation conditions, and both stratiform and convective precipitation events. After applying the proposed correction, comparison of the radar precipitation estimates with raingauges indicates a general reduction in both the root mean squared error and the fractional error variance, indicating the efficiency and robustness of the procedure. Moreover, the technique is not computationally expensive, so it seems well suited to implementation in an operational environment.
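The standard-propagation assumption that the radiosonde-based trajectories replace is usually expressed with the textbook 4/3-effective-earth-radius formula for beam height. A minimal sketch of that standard-case geometry (the constants are the usual textbook ones; the example range and elevation are ours, not the paper's):

```python
# Beam-centre height above the antenna under standard refraction,
# using the common 4/3-effective-earth-radius model. Anaprop occurs
# precisely when the real atmosphere departs from this assumption.

import math

def beam_height_km(range_km, elev_deg):
    """Height (km) of the beam centre above the antenna at a given
    slant range (km) and antenna elevation angle (degrees)."""
    re = (4.0 / 3.0) * 6371.0  # effective earth radius, km
    theta = math.radians(elev_deg)
    return math.sqrt(range_km**2 + re**2
                     + 2.0 * range_km * re * math.sin(theta)) - re

# At 100 km range and 0.5 deg elevation the beam centre sits roughly
# 1.5 km above the antenna, which is why low-elevation clutter and
# terrain blockage matter so much for quantitative estimates.
h = beam_height_km(100.0, 0.5)
```

Replacing this fixed geometry with trajectories computed from recent radiosonde profiles, as the paper does, lets the minimum clutter-free elevation vary with the actual refractive conditions.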
Abstract:
This manual captures the experience of practitioners in the Iowa Department of Transportation’s (Iowa DOT’s) Office of Location and Environment (OLE). It also documents the need for coordinated project development efforts during the highway project planning, or location study, phase and engineering design. The location study phase establishes:
* The definition of, and need for, the highway improvement project
* The range of alternatives and many key attributes of the project’s design
* The recommended alternative, its impacts, and the agreed-to conditions for project approval
The location study process involves developing engineering alternatives, collecting engineering and environmental data, and completing design refinements to accomplish functional designs. The items above also embody the basic content required for projects compliant with the National Environmental Policy Act (NEPA) of 1969, which directs federal agencies to use a systematic, interdisciplinary approach during the planning process whenever proposed actions (or “projects”) have the potential for environmental impacts. In doing so, NEPA requires coordination with stakeholders, review, comment, and public disclosure. Are location studies and environmental studies more about the process or the documents? If properly conducted, they concern both: unbiased and reasonable processes with quality and timely documents. In essence, every project is a story that needs to be told. Engineering and environmental regulations and guidance, as documented in this manual, will help project staff and managers become better storytellers.
Abstract:
The activity of radiopharmaceuticals in nuclear medicine is measured before patient injection with radionuclide calibrators. In Switzerland, the general requirements for quality control are defined in a federal ordinance and a directive of the Federal Office of Metrology (METAS), which require each instrument to be verified. A set of three gamma sources (Co-57, Cs-137 and Co-60) is used to verify the response of radionuclide calibrators across the gamma energy range of their use. A beta source, a mixture of (90)Sr and (90)Y in secular equilibrium, is used as well. Manufacturers are responsible for the calibration factors. The main goal of the study was to monitor the validity of the calibration factors by using two sources: a (90)Sr/(90)Y source and a (18)F source. The three types of commercial radionuclide calibrators tested do not have a calibration factor for the mixture, but only for (90)Y. Activity measurements of a (90)Sr/(90)Y source with the (90)Y calibration factor are therefore corrected for the extra contribution of (90)Sr. The value of the correction factor was found to be 1.113, whereas Monte Carlo simulations of the radionuclide calibrators estimate the correction factor to be 1.117. Measurements with (18)F sources in a specific geometry were also performed. Since this radionuclide is widely used in Swiss hospitals equipped with PET and PET-CT, the metrology of (18)F is very important. The (18)F response normalized to the (137)Cs response shows that the difference from a reference value does not exceed 3% for the three types of radionuclide calibrators.
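Applying the correction described above is a single division: a reading of the (90)Sr/(90)Y source made with the (90)Y calibration factor over-reads by the (90)Sr contribution, and the factor removes it. The 1.113 and 1.117 values are from the text; the example reading is hypothetical:

```python
# Removing the extra 90Sr contribution from a 90Sr/90Y reading made with
# the 90Y calibration factor. Correction factors are from the study;
# the reading value is a hypothetical example.

SR_CORRECTION_MEASURED = 1.113  # determined experimentally in this work
SR_CORRECTION_MC = 1.117        # Monte Carlo estimate for comparison

def corrected_y90_activity(reading_mbq, factor=SR_CORRECTION_MEASURED):
    """90Y activity (MBq) after removing the 90Sr over-response."""
    return reading_mbq / factor

# A raw reading of 111.3 MBq corresponds to 100 MBq of 90Y.
activity = corrected_y90_activity(111.3)

# Measured and simulated factors agree to within about 0.4%.
relative_disagreement = (SR_CORRECTION_MC - SR_CORRECTION_MEASURED) / SR_CORRECTION_MEASURED
```

The close agreement between the measured and Monte Carlo factors is what supports the validity of the manufacturer-supplied (90)Y calibration when used on the equilibrium mixture.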
Abstract:
The Rebuild Iowa Agriculture and Environment Task Force respectfully submits its report to the Rebuild Iowa Advisory Commission (RIAC) for consideration of the impacts of the tornadoes, storms, high winds, and flooding affecting Iowa’s agriculture sector and environment. The Task Force was required to address very complex and multi-faceted issues. Understanding that there were a broad range of immediate concerns, as well as critical issues that need to be addressed in the future, the Task Force structured its work in two sessions. To better address the issues and priorities of the Task Force, this report categorizes the issues as agriculture, conservation, environment, and livestock.
Abstract:
Despite numerous discussions, workshops, reviews and reports about responsible development of nanotechnology, information describing the health and environmental risks of engineered nanoparticles or nanomaterials is severely lacking and thus insufficient for completing rigorous risk assessment of their use. However, since preliminary scientific evaluations indicate that there are reasonable suspicions that activities involving nanomaterials might have damaging effects on human health, the precautionary principle must be applied. Public and private institutions as well as industries have the duty to adopt preventive and protective measures proportionate to the risk intensity and the desired level of protection. In this work, we present a practical, 'user-friendly' procedure for university-wide safety and health management of nanomaterials, developed as a multi-stakeholder effort (government, accident insurance, researchers and experts in occupational safety and health). The process starts with a schematic decision tree that classifies the nano laboratory into three hazard classes, similar to a control banding approach (from Nano 3, the highest hazard, to Nano 1, the lowest). Classifying laboratories into risk classes would require considering actual or potential exposure to the nanomaterial as well as statistical data on the health effects of exposure. Because these data (as well as exposure limits for each individual material) are not available, risk classes could not be determined. For each hazard level we then provide a list of required risk mitigation measures (technical, organizational and personal). The target 'users' of this safety and health methodology are researchers and safety officers. They can rapidly assess the precautionary hazard class of their activities and the corresponding adequate safety and health measures. We succeeded in convincing scientists engaged in nano-activities that adequate safety measures and management promote innovation and discovery by ensuring a safe environment, even in the case of very novel products. The proposed measures are seen not as constraints but as a support to their research. This methodology is being implemented at the Ecole Polytechnique de Lausanne in over 100 research labs dealing with nanomaterials. In our opinion, it would be useful to other research and academic institutions as well.
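The three-band outcome of the decision tree described above can be sketched as a simple banding function. The questions used here are hypothetical placeholders (the actual tree's criteria are in the published methodology, not this summary); only the Nano 1-3 banding follows the text:

```python
# Control-banding sketch: classify a nano laboratory into one of three
# precautionary hazard bands. The two yes/no criteria below are invented
# placeholders standing in for the published decision tree's questions.

def nano_hazard_class(can_become_airborne, large_quantities_handled):
    """Return the precautionary hazard band for a laboratory's activity."""
    if can_become_airborne and large_quantities_handled:
        return "Nano 3"   # highest hazard: strictest mitigation measures
    if can_become_airborne or large_quantities_handled:
        return "Nano 2"
    return "Nano 1"       # lowest hazard
```

Each band then maps to a fixed list of technical, organizational and personal mitigation measures, which is what makes the scheme usable without quantitative exposure data.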
Abstract:
Intensification of agricultural production without sound management and regulation can lead to severe environmental problems, as in Western Santa Catarina State, Brazil, where intensive swine production has caused large accumulations of manure and, consequently, water pollution. Natural resource scientists are asked by decision-makers for advice on management and regulatory decisions. Distributed environmental models are useful tools, since they can be used to explore the consequences of various management practices. However, in many areas of the world, quantitative data for model calibration and validation are lacking. The data-intensive distributed environmental model AgNPS was applied in a data-poor environment, the upper catchment (2,520 ha) of the Ariranhazinho River, near the city of Seara, in Santa Catarina State. Steps included data preparation, cell size selection, sensitivity analysis, model calibration and application to different management scenarios. The model was calibrated based on a best guess for model parameters and on a pragmatic sensitivity analysis. The parameters were adjusted to match model outputs (runoff volume, peak runoff rate and sediment concentration) closely with the sparse observed data. A modelling grid cell resolution of 150 m yielded appropriate results at acceptable computational cost. The rainfall-runoff response of the AgNPS model was calibrated using three separate rainfall ranges (< 25, 25-60, > 60 mm). Predicted sediment concentrations were consistently six to ten times higher than observed, probably due to sediment trapping along vegetated channel banks. Predicted N and P concentrations in stream water ranged from just below to well above regulatory norms. Expert knowledge of the area, in addition to experience reported in the literature, was able to compensate in part for limited calibration data. Several scenarios (actual, recommended and excessive manure applications, and point-source pollution from swine operations) could be compared with the model, using relative ranking rather than quantitative predictions.
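The range-dependent calibration mentioned above can be sketched as a lookup keyed on event rainfall. Only the three ranges (< 25, 25-60, > 60 mm) come from the text; the parameter-set labels are hypothetical placeholders:

```python
# Select which calibrated parameter set applies to a rainfall event,
# mirroring the three-range calibration described in the text. The
# parameter-set names are invented placeholders.

def calibration_set(rainfall_mm):
    """Return the calibrated parameter set for an event's rainfall depth."""
    if rainfall_mm < 25.0:
        return "low-range parameters"
    if rainfall_mm <= 60.0:
        return "mid-range parameters"
    return "high-range parameters"
```

Splitting the calibration by rainfall depth is a pragmatic way to capture the nonlinear runoff response when observed events are too few to fit a single continuous relationship.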