986 results for Estimating Site Occupancy


Relevance:

30.00%

Publisher:

Abstract:

Zero energy buildings (ZEB) and zero energy homes (ZEH) are currently a hot topic globally for policy makers (what are the benefits and costs?), designers (how do we design them?), the construction industry (can we build them?), marketers (will consumers buy them?) and researchers (do they work, and what are the implications?). This paper presents initial findings from actual measured data from a 9-star (as built), off-ground detached family home constructed in south-east Queensland in 2008. The integrated systems approach to the design of the house is analysed against each of its three main goals: maximising the thermal performance of the building envelope, minimising energy demand whilst maintaining energy service levels, and implementing a multi-pronged low-carbon approach to energy supply. The performance outcomes of each of these stages are evaluated against definitions of Net Zero Carbon / Net Zero Emissions (Site and Source) and Net Zero Energy (onsite generation vs. primary energy imports). The paper concludes with a summary of the multiple benefits of combining very high efficiency building envelopes with diverse energy management strategies: a robustness, resilience, affordability and autonomy not generally seen in housing.

Relevance:

30.00%

Publisher:

Abstract:

The health impacts of exposure to ambient temperature have been drawing increasing attention from the environmental health research community, government, society, industries, and the public. Case-crossover and time series models are most commonly used to examine the effects of ambient temperature on mortality. However, some key methodological issues remain to be addressed. For example, few studies have used spatiotemporal models to assess the effects of spatially resolved temperature on mortality. Few studies have used a case-crossover design to examine the delayed (distributed lag) and non-linear relationship between temperature and mortality. Also, little evidence is available on the effects of temperature changes on mortality, or on differences in heat-related mortality over time. This thesis aimed to address the following research questions: 1. How can the case-crossover design be combined with distributed lag non-linear models? 2. Is there any significant difference in effect estimates between time series and spatiotemporal models? 3. How can the effects on mortality of temperature changes between neighbouring days be assessed? 4. Do temperature effects on mortality change over time?

To combine the case-crossover design with the distributed lag non-linear model, datasets of deaths, weather conditions (minimum, mean, and maximum temperature, and relative humidity), and air pollution were acquired for Tianjin, China, for the years 2005 to 2007. I demonstrated how to combine the case-crossover design with a distributed lag non-linear model. This allows the case-crossover design to estimate the non-linear and delayed effects of temperature whilst controlling for seasonality. There was a consistent U-shaped relationship between temperature and mortality. Cold effects were delayed by 3 days and persisted for 10 days. Hot effects were acute, lasted for three days, and were followed by mortality displacement for non-accidental, cardiopulmonary, and cardiovascular deaths. Mean temperature was a better predictor of mortality (based on model fit) than maximum or minimum temperature.

It is still unclear whether spatiotemporal models using spatial temperature exposure produce better estimates of mortality risk than time series models that use a single site's temperature or temperature averaged over a network of sites. Daily mortality data were obtained for 163 locations across Brisbane city, Australia, from 2000 to 2004. Ordinary kriging was used to interpolate spatial temperatures across the city based on 19 monitoring sites. A spatiotemporal model was used to examine the impact of spatial temperature on mortality. A time series model was used to assess the effects on mortality of a single site's temperature and of temperature averaged from 3 monitoring sites. Squared Pearson scaled residuals were used to check model fit. The results show that even though the spatiotemporal model gave a better fit than the time series models, the two approaches gave similar effect estimates. Time series analyses using temperature from a single monitoring site, or the average of multiple sites, were as good at estimating the association between temperature and mortality as the spatiotemporal model.

A time series Poisson regression model was used to estimate the association between temperature change and mortality in summer in Brisbane, Australia, during 1996–2004 and Los Angeles, United States, during 1987–2000. Temperature change was calculated as the current day's mean temperature minus the previous day's mean. In Brisbane, a drop of more than 3 °C between days was associated with relative risks (RRs) of 1.16 (95% confidence interval (CI): 1.02, 1.31) for non-external mortality (NEM), 1.19 (95% CI: 1.00, 1.41) for NEM in females, and 1.44 (95% CI: 1.10, 1.89) for NEM in those aged 65–74 years. An increase of more than 3 °C was associated with RRs of 1.35 (95% CI: 1.03, 1.77) for cardiovascular mortality and 1.67 (95% CI: 1.15, 2.43) for people aged < 65 years. In Los Angeles, only a drop of more than 3 °C was significantly associated with mortality: RRs of 1.13 (95% CI: 1.05, 1.22) for total NEM, 1.25 (95% CI: 1.13, 1.39) for cardiovascular mortality, and 1.25 (95% CI: 1.14, 1.39) for people aged ≥ 75 years. In both cities, there were joint effects of temperature change and mean temperature on NEM. A change in temperature of more than 3 °C, whether positive or negative, has an adverse impact on mortality, even after controlling for mean temperature.

I also examined the variation in the effects of high temperatures on elderly mortality (age ≥ 75 years) by year, city and region for 83 large US cities between 1987 and 2000. High temperature days were defined as two or more consecutive days with temperatures above the 90th percentile for each city during each warm season (May 1 to September 30). The mortality risk for high temperatures was decomposed into a "main effect" due to high temperatures, using a distributed lag non-linear function, and an "added effect" due to consecutive high temperature days. I pooled yearly effects across regions, and overall effects at both regional and national levels. The effects of high temperature (both main and added) on elderly mortality varied greatly by year, city and region. Years with higher heat-related mortality were often followed by years with relatively lower mortality. Understanding this variability in the effects of high temperatures is important for the development of heat-warning systems.

In conclusion, this thesis makes contributions in several respects. The case-crossover design was combined with a distributed lag non-linear model to assess the effects of temperature on mortality in Tianjin; this allows the case-crossover design to flexibly estimate the non-linear and delayed effects of temperature. Both extreme cold and high temperatures increased the risk of mortality in Tianjin. Time series models using a single site's temperature, or temperature averaged from several sites, can be used to examine the effects of temperature on mortality. Temperature change, whether a large drop or a large rise, increases the risk of mortality. The effect of high temperature on mortality is highly variable from year to year.
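As a concrete illustration of the temperature-change exposure defined above (current day's mean minus the previous day's mean, with ±3 °C as the threshold of interest), the sketch below builds that variable and fits a plain Poisson regression. It is a minimal stand-in, not the thesis's actual model: the data are invented, and the seasonal smooth terms, distributed lags, and covariates of a real time series analysis are omitted.

```python
# Sketch: temperature-change exposure and a bare-bones Poisson model.
# Data are invented; a real analysis would control for season, trend,
# humidity and air pollution, as described in the abstract.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "deaths": [42, 38, 45, 51, 40, 39, 44, 48, 41, 43],
    "tmean":  [24.1, 27.6, 23.9, 28.3, 24.0, 23.5, 27.1, 23.2, 26.8, 24.4],
})

# Temperature change = today's mean minus yesterday's mean.
df["tchange"] = df["tmean"].diff()
df["big_drop"] = (df["tchange"] < -3).astype(int)
df["big_rise"] = (df["tchange"] > 3).astype(int)

model = smf.glm(
    "deaths ~ big_drop + big_rise + tmean",   # joint effect with mean temp
    data=df.dropna(),
    family=sm.families.Poisson(),
).fit()
print(model.summary())
```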

Relevance:

30.00%

Publisher:

Abstract:

Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies, without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N=2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses, as well as a mixture of the two - combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.
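For readers unfamiliar with the mechanics, the standard fixed-effect inverse-variance pooling that underlies a meta-analysis of per-site estimates looks like the sketch below. The numbers are invented for illustration and are not ENIGMA-DTI results.

```python
# Sketch: fixed-effect inverse-variance meta-analysis of per-site
# heritability estimates. Values are illustrative only.
import numpy as np

h2 = np.array([0.55, 0.62, 0.48, 0.70, 0.59])   # per-site estimates
se = np.array([0.08, 0.10, 0.12, 0.09, 0.07])   # their standard errors

w = 1.0 / se**2                        # inverse-variance weights
pooled = np.sum(w * h2) / np.sum(w)    # weighted mean across sites
pooled_se = np.sqrt(1.0 / np.sum(w))   # standard error of the pooled estimate
print(f"pooled h2 = {pooled:.3f} +/- {pooled_se:.3f}")
```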

Relevance:

30.00%

Publisher:

Abstract:

To quantify the impact that planting indigenous trees and shrubs in mixed communities (environmental plantings) has on net sequestration of carbon and other environmental or commercial benefits, precise and unbiased estimates of biomass are required. Because these plantings comprise several species, estimating their biomass through allometric relationships is challenging. We explored methods to accurately estimate biomass by harvesting 3139 trees and shrubs from 22 plantings, and collating similar datasets from earlier studies, in non-arid (>300 mm rainfall year⁻¹) regions of southern and eastern Australia. Site- and species-specific allometric equations were developed, as were three types of generalised, multi-site allometric equations based on categories of species and growth habits: (i) species-specific, (ii) genus and growth-habit, and (iii) universal growth-habit irrespective of genus. Biomass was measured at plot level at eight contrasting sites to test how accurately the different classes of allometric equation predicted above-ground biomass in tonnes of dry matter per hectare. A finer-scale analysis tested their performance at the individual-tree level across a wider range of sites. Although the percentage error in prediction could be high at a given site (up to 45%), it was relatively low (<11%) when generalised allometric predictions of biomass were used to make regional- or estate-level estimates across a range of sites. Precision, and thus accuracy, increased slightly with the specificity of the allometry. Including site-specific factors in generic equations improved the efficiency of prediction of above-ground biomass by as much as 8%. Site- and species-specific equations are the most accurate for site-based predictions. The generic allometric equations developed here, particularly the generic species-specific equations, can be confidently applied to provide regional- or estate-level estimates of above-ground biomass and carbon. © 2013 Elsevier B.V.
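Allometric biomass equations of the kind developed here are typically power laws, B = a·D^b, fitted by linear regression in log-log space. The following sketch shows that fit on invented data; the study's actual predictor variables (e.g. stem diameter, height) and coefficients are not reproduced here.

```python
# Sketch: fitting a power-law allometric equation B = a * D^b by
# log-log linear regression. Data are illustrative, not from the study.
import numpy as np

diameter = np.array([2.1, 3.5, 5.0, 7.2, 10.4, 14.8])    # stem diameter (cm)
biomass  = np.array([1.3, 4.1, 9.8, 24.0, 61.5, 142.0])  # dry mass (kg)

# polyfit returns [slope, intercept] for a degree-1 fit.
b, ln_a = np.polyfit(np.log(diameter), np.log(biomass), 1)
a = np.exp(ln_a)
print(f"B = {a:.3f} * D^{b:.3f}")

# Predict above-ground biomass for a new tree (noting that
# back-transforming from log space normally needs a bias correction).
print(a * 8.0**b)
```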

Relevance:

30.00%

Publisher:

Abstract:

This research develops a new technique for site characterization in a three-dimensional domain. Site characterization is a fundamental task in geotechnical engineering practice, as well as a very challenging process, with the ultimate goal of estimating soil properties, based on limited tests, at any half-space subsurface point in a site. In this research, the sandy site at the Texas A&M University National Geotechnical Experimentation Site is selected as an example for developing the new technique, which is based on Artificial Neural Network (ANN) technology. A sequential approach is used to demonstrate the applicability of ANNs to site characterization. To verify its robustness, the proposed technique is compared with other commonly used approaches to site characterization. In addition, an artificial site is created, in which soil property values at any half-space point are assumed, so that predicted values can be compared directly with their corresponding actual values as a means of validation. Since the three-dimensional model can estimate the soil property at any location in a site, it has many potential applications, especially where the soil properties within a zone, rather than at a single point, are of interest. Examples of soil properties of zonal interest include soil type classification and liquefaction potential evaluation. The present study therefore also addresses this type of application, based on a site in Taiwan that liquefied during the 1999 Chi-Chi earthquake.
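A minimal version of the core idea, interpolating a soil property from 3-D position with a small feed-forward network, can be sketched as below. The architecture, inputs and data are illustrative assumptions, not the ones used in the thesis.

```python
# Sketch: a small neural network interpolating a soil property (e.g. CPT
# tip resistance) from 3-D position. Purely illustrative; the thesis's
# actual network architecture and inputs are not specified here.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.uniform(0, 50, size=(200, 3))             # x, y, depth of soundings (m)
y = 5 + 0.3 * X[:, 2] + rng.normal(0, 0.5, 200)   # synthetic depth-dependent property

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0),
)
model.fit(X, y)

# Estimate the property at an arbitrary half-space point.
print(model.predict([[25.0, 25.0, 12.5]]))
```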

Relevance:

30.00%

Publisher:

Abstract:

Seismic site classifications are used to represent site effects when estimating hazard parameters (response spectral ordinates) at the soil surface. They have generally been carried out using the average shear wave velocity and/or standard penetration test N-values of the top 30 m of soil, following the recommendations of the National Earthquake Hazards Reduction Program (NEHRP) or the International Building Code (IBC). The site classification system in the NEHRP and the IBC is based on studies carried out in the United States, where soil layers extend up to several hundred meters before reaching any distinct soil-bedrock interface, and may not be directly applicable to other regions, especially those with shallow geological deposits. This paper investigates the influence of rock depth on site classes based on the recommendations of the NEHRP and the IBC. For this study, soil sites with a wide range of average shear wave velocities (or standard penetration test N-values) were collected from different parts of Australia, China, and India. Shear wave velocities of the rock layers underlying the soil were also collected, at depths from a few meters to 180 m. It is shown that a site classification system based on the top 30 m of soil often assigns stiffer site classes to soil sites with shallow rock depths (less than 25 m below the soil surface). A new site classification system based on average soil thickness down to engineering bedrock is proposed, which is considered more representative for soil sites in shallow-bedrock regions. Response spectral ordinates, amplification factors, and site periods estimated using one-dimensional shear wave analysis that accounts for the depth of engineering bedrock differ from those obtained by considering only the top 30 m of soil.
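The 30-m averaging that the NEHRP/IBC classes rest on is the travel-time average Vs30 = 30 / Σ(d_i / Vs_i) over layers of thickness d_i and velocity Vs_i. The sketch below computes it for an invented shallow-bedrock profile and contrasts it with the average taken over the soil column alone, which is the adjustment the proposed classification system is driving at.

```python
# Sketch: time-averaged shear wave velocity over the top 30 m (Vs30).
# Layer data are invented to mimic a shallow-bedrock site.
import numpy as np

thickness = np.array([4.0, 8.0, 10.0, 8.0])     # layer thicknesses (m), sum = 30
vs        = np.array([180., 240., 360., 760.])  # shear wave velocities (m/s)

vs30 = thickness.sum() / np.sum(thickness / vs)
print(f"Vs30 = {vs30:.0f} m/s")

# The paper's point: if engineering bedrock is reached at, say, 12 m,
# averaging over a fixed 30 m mixes soil and rock and can push the site
# into a stiffer class; averaging over the soil column alone differs.
soil = slice(0, 2)  # first two layers = 12 m of soil above bedrock
vs_soil = thickness[soil].sum() / np.sum(thickness[soil] / vs[soil])
print(f"Vs over soil column = {vs_soil:.0f} m/s")
```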

Relevance:

30.00%

Publisher:

Abstract:

Nearshore 0-group western Baltic cod are frequently caught as bycatch in the commercial pound net fishery. Pound net fishermen from the Danish isles of Funen and Lolland and the German isle of Fehmarn recorded their catches of small cod between September and December 2008. Abundance patterns were analysed, particularly with respect to the influence of abiotic factors (hydrography, meteorology) and differences between sampling sites. Catch per unit effort (CPUE) differed by site and location, with the highest CPUE at Lolland. Correlations between catch and wind/currents were generally weak; however, wind direction and current speed appeared to affect catch rates. Finally, an algorithm was developed to calculate an index of western Baltic cod recruitment success based on the preceding analyses.
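CPUE is simply catch divided by fishing effort; a minimal per-site computation is sketched below on invented records. The abstract does not describe the recruitment-index algorithm itself, so nothing beyond raw CPUE is attempted here.

```python
# Sketch: catch per unit effort (CPUE) by site. CPUE = catch / effort is
# the standard definition; all values below are invented for illustration.
import pandas as pd

records = pd.DataFrame({
    "site":   ["Funen", "Funen", "Lolland", "Lolland", "Fehmarn"],
    "catch":  [12, 7, 55, 48, 9],          # number of 0-group cod
    "effort": [3.0, 2.0, 4.0, 3.5, 2.5],   # pound-net days (assumed unit)
})
records["cpue"] = records["catch"] / records["effort"]
print(records.groupby("site")["cpue"].mean())
```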

Relevance:

30.00%

Publisher:

Abstract:

Estimating the fundamental matrix (F), which encodes the epipolar geometry between a pair of images or video frames, is a basic step for a wide variety of vision-based functions used in construction operations, such as camera-pair calibration, automatic progress monitoring, and 3D reconstruction. Currently, robust methods (e.g., SIFT + normalized eight-point algorithm + RANSAC) are widely used in the construction community for this purpose. Although they provide acceptable accuracy, the significant computational time they require impedes their adoption in real-time applications, especially the analysis of video data at many frames per second. To overcome this limitation, this paper presents, and evaluates the accuracy of, a solution that finds F by combining two fast and consistent methods: SURF for selecting a robust set of point correspondences, and the normalized eight-point algorithm. The solution is tested extensively on construction site image pairs featuring changes in viewpoint, scale, illumination, and rotation, as well as moving objects. The results demonstrate that the method is usable in real-time applications (5 image pairs per second at a resolution of 640 × 480) involving scenes of the built environment.
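A sketch of the described pipeline using OpenCV is given below. Note that SURF ships with the opencv-contrib package and is disabled in some builds for patent reasons (cv2.SIFT_create() is a drop-in substitute), that cv2.FM_8POINT invokes OpenCV's normalized eight-point solver without RANSAC (which is what makes this variant fast), and that the file names are placeholders.

```python
# Sketch: SURF correspondences + normalized eight-point algorithm for F.
import cv2
import numpy as np

img1 = cv2.imread("site_view1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("site_view2.jpg", cv2.IMREAD_GRAYSCALE)

surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp1, des1 = surf.detectAndCompute(img1, None)
kp2, des2 = surf.detectAndCompute(img2, None)

# Match descriptors and keep unambiguous matches (Lowe's ratio test).
matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = []
for pair in matcher.knnMatch(des1, des2, k=2):
    if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
        matches.append(pair[0])
assert len(matches) >= 8, "eight-point algorithm needs at least 8 matches"

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# FM_8POINT: normalized eight-point algorithm, no RANSAC iterations.
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)
print(F)
```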

Relevance:

30.00%

Publisher:

Abstract:

Co-occurrence of HIV and substance abuse is associated with poor outcomes for HIV-related health and substance use. Integration of substance use and medical care holds promise for HIV patients, yet few integrated treatment models have been reported. Most of the reported models lack data on treatment outcomes in diverse settings. This study examined the substance use outcomes of an integrated treatment model for patients with both HIV and substance use at three different clinics. Sites differed by type and degree of integration, with one integrated academic medical center, one co-located academic medical center, and one co-located community health center. Participants (n=286) received integrated substance use and HIV treatment for 12 months and were interviewed at 6-month intervals. We used linear generalized estimating equation regression analysis to examine changes in Addiction Severity Index (ASI) alcohol and drug severity scores. To test whether our treatment was differentially effective across sites, we compared a full model including site by time point interaction terms to a reduced model including only site fixed effects. Alcohol severity scores decreased significantly at 6 and 12 months. Drug severity scores decreased significantly at 12 months. Once baseline severity variation was incorporated into the model, there was no evidence of variation in alcohol or drug score changes by site. Substance use outcomes did not differ by age, gender, income, or race. This integrated treatment model offers an option for treating diverse patients with HIV and substance use in a variety of clinic settings. Studies with control groups are needed to confirm these findings.
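A linear GEE of the kind described can be fitted with statsmodels, as sketched below on synthetic long-format data; the column names, covariates and correlation structure are illustrative assumptions, not the study's exact specification.

```python
# Sketch: linear GEE for repeated ASI severity scores, clustered by
# participant. Data are synthetic; one row per participant per interview.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 90
df = pd.DataFrame({
    "subject_id": np.repeat(np.arange(n), 3),
    "month": np.tile([0, 6, 12], n),   # baseline, 6- and 12-month interviews
    "site": np.repeat(rng.choice(
        ["integrated", "colocated_amc", "colocated_chc"], n), 3),
})
# ASI-style severity score (0-1) that declines over follow-up.
df["asi_alcohol"] = (0.45 - 0.012 * df["month"]
                     + rng.normal(0, 0.12, len(df))).clip(0, 1)

# Reduced model: site fixed effects only. The paper's test compares this
# with a full model adding C(month):C(site) interaction terms.
model = smf.gee(
    "asi_alcohol ~ C(month) + C(site)",
    groups="subject_id",
    data=df,
    cov_struct=sm.cov_struct.Exchangeable(),
    family=sm.families.Gaussian(),
).fit()
print(model.summary())
```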

Relevance:

30.00%

Publisher:

Abstract:

Computational molecular docking of piperonyl acid piperidide (BDP) and some of its analogs, already known as ampakines, was conducted to estimate their possible binding to the AMPA-receptor glutamate domains in the cyclothiazide-binding area, and to support the further design of new structures maximally complementary to the receptor. On the basis of the docking results, it can be suggested that the binding site of the BDP analogs (amides of benzodioxane-6-carboxylic and piperonyl acids) is located in the cyclothiazide-binding pocket of the AMPA receptor. It is shown that the formation of protein-ligand complexes between the AMPA receptor and benzodioxane-6-carboxylic and piperonyl acid derivatives, as with cyclothiazide, involves interactions with Ser497 and Leu751, residues whose significance is confirmed by site-directed mutagenesis.

Relevance:

30.00%

Publisher:

Abstract:

A benefit function transfer obtains estimates of willingness-to-pay (WTP) for the evaluation of a given policy at a site by combining existing information from different study sites. This has the advantage that more efficient estimates are obtained, but it relies on the assumption that the heterogeneity between sites is appropriately captured in the benefit transfer model. A more expensive alternative for estimating WTP is to analyze only data from the policy site in question, ignoring information from other sites. We exploit the fact that these two choices can be viewed as a model selection problem, and extend the set of models to allow for the hypothesis that the benefit function is applicable only to a subset of sites. We show how Bayesian model averaging (BMA) techniques can be used to optimally combine information from all models. The Bayesian algorithm searches for the set of sites that can form the basis for estimating a benefit function, and reveals whether such information can be transferred to new sites for which only a small data set is available. We illustrate the method with a sample of 42 forests from the U.K. and Ireland. We find that BMA benefit function transfer produces reliable estimates and can increase the information content of a small sample by about a factor of eight when the forest is 'poolable'. © 2008 Elsevier Inc. All rights reserved.
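The paper's full Bayesian algorithm is beyond a short example, but the weighting step at the heart of BMA can be illustrated with the common BIC approximation to posterior model probabilities, as sketched below; the candidate models, BIC values and WTP estimates are invented.

```python
# Sketch: Bayesian model averaging over candidate "pooling" models using
# the BIC approximation to posterior model probabilities. The `bics`
# would come from fitting each candidate WTP model; values are invented.
import numpy as np

bics = np.array([412.3, 409.8, 415.1])   # one BIC per candidate model
wtp  = np.array([23.5, 21.9, 26.0])      # each model's WTP estimate (GBP)

delta = bics - bics.min()
weights = np.exp(-0.5 * delta)
weights /= weights.sum()                 # approximate posterior model probabilities

wtp_bma = np.sum(weights * wtp)          # model-averaged WTP
print(weights, wtp_bma)
```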

Relevance:

30.00%

Publisher:

Abstract:

A series of shell middens and miscellaneous habitation sites, located in a dune system in west County Galway, have been exposed and are slowly disappearing through wind, wave and surface erosion. In 1992 a project was initiated to record, sample and date some of these sites, and the radiocarbon results demonstrated that activity in the area spanned the Early Bronze Age through to the Iron Age and into the early and post-medieval periods. This preliminary fieldwork was succeeded by the excavation of three of the better-preserved sites: a Bronze Age midden in 1994 and two early medieval sites (the subject of this paper) in 1997. The medieval sites dated to the late-seventh to ninth centuries AD and were represented by a sub-circular stone hut with a hearth and the charred remains of a more ephemeral wooden tent-like structure. The discovery of a bronze penannular brooch of ninth/tenth-century date at the latter site would suggest that the settlements are not the remains of transient, impoverished peoples of the lower classes of society, eking out a living along the coast. The calcareous sands ensured good preservation of organic remains: fish and mammal bones, charred cereal grains, seeds and seaweed, and marine molluscs. Analyses of these indicated exploitation of marine resources but were otherwise comparable with the diet and economy represented by assemblages from established contemporary site types of the period. Unlike raths, crannógs and monastic settlements, however, the volume of material represented at the Galway sites was slight, despite the excellent preservation conditions. A range of seasonal indicators also suggested temporary habitation: probable late-spring/summer occupation of the stone hut site and autumnal occupancy of the second, less substantial site. It is suggested that the machair plain, beside which the dunes are located, was most probably the attraction for settlers to the area, and was exploited as an alternative pasture for the seasonal grazing of livestock.

Relevance:

30.00%

Publisher:

Abstract:

Accurate estimation of the soil water balance (SWB) is important for a number of applications (e.g. environmental, meteorological, agronomic and hydrological). The objective of this study was to develop and test techniques for estimating soil water fluxes and SWB components (particularly infiltration, evaporation and drainage below the root zone) from soil water records. The work presented here is based on profile soil moisture data measured using dielectric methods, at 30-min resolution, at an experimental site with different vegetation covers (barley, sunflower and bare soil). Estimates of infiltration were derived by assuming that observed gains in profile soil water content during rainfall were due to infiltration. Inaccuracies related to diurnal fluctuations in the dielectric-based soil water records were resolved by filtering the data with adequate threshold values. Inconsistencies caused by the redistribution of water after rain events were corrected by allowing for a redistribution period before computing water gains. Estimates of evaporation and drainage were derived from water losses above and below the deepest zero flux plane (ZFP), respectively. The evaporation estimates for the sunflower field were compared with evaporation data obtained with an eddy covariance (EC) system located elsewhere in the field. The EC estimate of total evaporation for the growing season was about 25% larger than that derived from the soil water records. This was consistent with differences in crop growth (based on direct measurements of biomass, and field mapping of vegetation using laser altimetry) between the EC footprint and the area of the field used for soil moisture monitoring. Copyright (c) 2007 John Wiley & Sons, Ltd.
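The infiltration logic described above (count only storage gains that occur during rain, after filtering sensor noise) can be sketched in a few lines; the threshold value, the series, and the column layout are illustrative assumptions, not the paper's calibrated choices.

```python
# Sketch: infiltration as the positive change in profile water storage
# during rainfall, from 30-min records. The 0.2 mm noise threshold is
# an assumed stand-in for the paper's filtering of diurnal fluctuations.
import pandas as pd

idx = pd.date_range("2007-05-01", periods=8, freq="30min")
storage = pd.Series([120.0, 120.1, 123.4, 126.0, 126.2, 126.1, 126.0, 125.8],
                    index=idx)  # profile water storage (mm), assumed input
rain = pd.Series([0.0, 0.0, 4.0, 3.0, 0.2, 0.0, 0.0, 0.0],
                 index=idx)     # rainfall per interval (mm), assumed input

gain = storage.diff()
noise_threshold = 0.2  # mm per 30 min (assumed)
infiltration = gain.where((gain > noise_threshold) & (rain > 0), 0.0).sum()
print(f"event infiltration ~ {infiltration:.1f} mm")
```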

Relevance:

30.00%

Publisher:

Abstract:

Critical loads are the basis for policies controlling emissions of acidic substances in Europe and elsewhere. They are assessed by several elaborate and ingenious models, each of which requires many parameters, and they have to be applied on a spatially-distributed basis. Often the values of the input parameters are poorly known, calling into question the validity of the calculated critical loads. This paper attempts to quantify the uncertainty in the critical loads due to this "parameter uncertainty", using examples from the UK. Models used for calculating critical loads for deposition of acidity and nitrogen in forest and heathland ecosystems were tested at four contrasting sites. Uncertainty was assessed by Monte Carlo methods. Each input parameter or variable was assigned a value, range and distribution in as objective a fashion as possible. Each model was run 5000 times at each site using parameters sampled from these input distributions, and output distributions of various critical load parameters were calculated. The results were surprising. Confidence limits of the calculated critical loads were typically considerably narrower than those of most of the input parameters, perhaps due to a "compensation of errors" mechanism. The range of possible critical load values at a given site is, however, rather wide, and the tails of the distributions are typically long. The deposition reductions required for a high level of confidence that the critical load is not exceeded are thus likely to be large; the implication for pollutant regulation is that requiring a high probability of non-exceedance is likely to carry high costs. The relative contribution of the input variables to critical load uncertainty varied from site to site: any input variable could be important, so it was not possible to identify variables as likely targets for research into narrowing uncertainties. Sites where a number of good measurements of input parameters were available had lower uncertainties, so in situ measurement could be a valuable way of reducing critical load uncertainty at particularly valuable or disputed sites. From a restricted number of samples, uncertainties in heathland critical loads appear comparable to those for coniferous forest, and uncertainties in nutrient nitrogen critical loads to those for acidity. It was important to include correlations between input variables in the Monte Carlo analysis, but the choice of statistical distribution type was of lesser importance. Overall, the analysis provides objective support for the continued use of critical loads in policy development. (c) 2007 Elsevier B.V. All rights reserved.
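The Monte Carlo procedure is conceptually simple: draw each input from its assigned distribution, run the model, and summarize the output distribution. The sketch below does this for a simplified mass-balance critical load with invented distributions; the paper's actual models and UK input data are considerably more elaborate.

```python
# Sketch: Monte Carlo propagation of input uncertainty through a
# simplified mass-balance critical load. Distributions are invented
# stand-ins, not the paper's UK inputs.
import numpy as np

rng = np.random.default_rng(42)
n = 5000  # runs per site, matching the paper

bc_w   = rng.lognormal(np.log(0.5), 0.3, n)  # base cation weathering (keq/ha/yr)
bc_u   = rng.normal(0.2, 0.05, n)            # base cation uptake
anc_le = rng.normal(0.3, 0.10, n)            # critical ANC leaching

cl_acidity = bc_w - bc_u + anc_le            # simplified critical load of acidity

lo, med, hi = np.percentile(cl_acidity, [2.5, 50, 97.5])
print(f"critical load: median {med:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```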

Relevance:

30.00%

Publisher:

Abstract:

Net solar radiation at the ground surface is the energy that drives physical and chemical processes there. In this paper, multi-spectral data from Landsat-5 TM, topographic data from a gridded digital elevation model, field measurements, and the atmospheric model LOWTRAN 7 are used to estimate surface net solar radiation over the FIFE site. First, an improved method is presented and used to calculate the total surface incoming radiation. Surface albedo is then obtained by integrating surface reflectance factors derived from the Landsat-5 TM data. Finally, surface net solar radiation is calculated by subtracting the surface upwelling radiation from the total surface incoming radiation.
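The final step reduces to one line per pixel: net solar radiation is the incoming flux minus the reflected (upwelling) part, i.e. R_net = (1 - albedo) * R_in. A minimal array sketch, with invented values, is below.

```python
# Sketch: per-pixel net solar radiation from incoming flux and albedo,
# both assumed to have been derived earlier in the workflow.
import numpy as np

incoming = np.array([[820.0, 805.0], [790.0, 810.0]])  # W m^-2, illustrative
albedo   = np.array([[0.18, 0.20], [0.22, 0.19]])      # surface albedo

net_solar = incoming * (1.0 - albedo)  # subtract the reflected (upwelling) part
print(net_solar)
```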