Abstract:
The indoor air quality (IAQ) in buildings is currently assessed by measuring pollutants during building operation and comparing the results with air quality standards. Current practice at the design stage tries to minimise the potential indoor air quality impacts of new building materials and contents by selecting low-emission materials. However, low-emission materials are not always available, and even when they are used, the aggregate pollutant concentrations arising from them are generally overlooked. This paper presents an innovative tool for estimating indoor air pollutant concentrations at the design stage, based on emissions over time from large-area building materials, furniture and office equipment. The estimator considers volatile organic compounds, formaldehyde and airborne particles from indoor materials and office equipment, as well as the contribution of outdoor urban air pollutants as affected by urban location and ventilation system filtration. The estimated pollutant concentrations apply to a single, fully mixed and ventilated zone in an office building, with acceptable levels derived from Australian and international health-based standards. The model acquires its dimensional data for the indoor spaces from a 3D CAD model via IFC files, and its emission data from a building products/contents emissions database. This paper describes the underlying approach to estimating indoor air quality and discusses the benefits of such an approach for designers and the occupants of buildings.
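The estimator itself is not reproduced in the abstract, but the single, fully mixed, ventilated zone it describes implies a standard steady-state mass balance. A minimal sketch follows; the function and parameter names are illustrative assumptions, not the tool's actual API:

```python
def steady_state_concentration(emission_rate_ug_h, volume_m3, ach,
                               outdoor_conc_ug_m3=0.0, filter_efficiency=0.0):
    """Steady-state pollutant concentration (ug/m3) in a single,
    fully mixed, ventilated zone.

    emission_rate_ug_h : total emission rate of all indoor sources (ug/h)
    ach                : air changes per hour (ventilation flow / zone volume)
    filter_efficiency  : fraction of the outdoor pollutant removed by
                         ventilation-system filtration (0 to 1)
    """
    ventilation_m3_h = ach * volume_m3
    # Indoor sources dilute into the ventilation airflow
    indoor_from_sources = emission_rate_ug_h / ventilation_m3_h
    # Outdoor air enters after filtration
    indoor_from_outdoor = outdoor_conc_ug_m3 * (1.0 - filter_efficiency)
    return indoor_from_sources + indoor_from_outdoor

# Example: a 120 m3 office at 0.5 ACH with 3000 ug/h of formaldehyde
# emitted by materials and negligible outdoor formaldehyde -> 50 ug/m3
c = steady_state_concentration(3000.0, 120.0, 0.5)
```

The same balance extends to particles from outdoor urban air: with a 50 %-efficient supply filter and 40 ug/m3 outdoors, the outdoor contribution alone is 20 ug/m3.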
Abstract:
Off-site manufacture (OSM) offers numerous benefits to all parties in the construction process. The uptake of OSM in Australia has, however, been limited. This limited uptake corresponds to similar trends in the UK and US, although the level of OSM there appears to be increasing. This project undertook three workshops — one each in Victoria, Queensland and Western Australia — and 18 interviews with key stakeholders to help identify the general benefits of, and barriers to, OSM uptake in the Australian construction industry. Seven case studies were also undertaken, involving construction projects that used OSM and ranging from civil projects through to residential. Each of these case studies was analysed to identify what worked and what did not, and to suggest the lessons to be learned from each project.
Abstract:
Much has been written on Off-site Manufacture (OSM) in construction, particularly regarding the perceived benefits of, and barriers to, its implementation. However, very little is understood about the state of OSM in the Australian construction industry. A ‘scoping study’ has recently been undertaken to determine the ‘state of the art’ of OSM in Australia. This involved several industry workshops, interviews and case studies across four major states of Australia. The study surveyed a range of suppliers across the construction supply chain, incorporating the civil, commercial and housing segments of the market. It revealed that skills shortages and a lack of adequate OSM knowledge are generally the greatest issues facing OSM in Australia. The drivers and constraints that emerged from the research were, in large measure, consistent with those found in the US and UK, although some Australian anomalies are evident, such as the geographical disparity of markets. A comparative analysis with similar studies in the UK and US is reported, illustrating both the drivers and the constraints confronting the industry in Australia. Future OSM uptake is, however, dependent on many factors, not least of which is a better understanding of the construction process and its associated costs.
Abstract:
In 2004, in response to society's increasing demands for restrictions on overloading in Anhui, a comprehensive provincial overload transportation survey was developed, with the support of the World Bank, to evaluate the current state of overloading and the efficiency of enforcement. A total of six site surveys were conducted in the Hefei, Fuyang, Luan, Wuhu, Huainan and Huangshan areas, each covering four main content areas: traffic volume, axle load, freight information and registration information. Statistical analysis of the survey data led to the following conclusions: vehicle overloading is currently a widespread and serious problem on arterial highways in Anhui. Traffic loads have far exceeded the design load capacity of the highways and have caused prevalent premature pavement damage, especially for rigid pavements. Overloaded trucks in fact operate unimpeded in highway freight transportation, owing to disordered overload-enforcement strategies and deficient inspection technologies.
Abstract:
The Internet theoretically enables marketers to personalize a Web site to an individual consumer. This article examines optimal Web-site design from the perspectives of personality trait theory and resource-matching theory. The influence of two traits relevant to Internet Web-site processing—sensation seeking and need for cognition—was studied in the context of resource matching and different levels of Web-site complexity. Data were collected at two points in time: personality-trait data first, followed by a laboratory experiment using constructed Web sites. Results reveal that (a) subjects prefer Web sites of a medium level of complexity, rather than high or low complexity; (b) high sensation seekers prefer complex visual designs, and low sensation seekers simple visual designs, both in Web sites of medium complexity; and (c) high need-for-cognition subjects evaluated Web sites with high verbal and low visual complexity more favourably.
Abstract:
Total cross sections for neutron scattering from nuclei, with energies ranging from 10 to 600 MeV and from many nuclei spanning the mass range from 6Li to 238U, have been analyzed using a simple three-parameter functional form. The calculated cross sections are compared with results obtained using microscopic (g-folding) optical potentials as well as with experimental data. The functional form reproduces those total cross sections very well. When allowance is made for Ramsauer-like effects in the scattering, the required parameters of the functional form vary smoothly with energy and target mass; they too can be represented by functions of energy and mass.
Abstract:
The interaction of quercetin, which is a bioflavonoid, with bovine serum albumin (BSA) was investigated under pseudo-physiological conditions by the application of UV–vis spectrometry, spectrofluorimetry and cyclic voltammetry (CV). These studies indicated a cooperative interaction between the quercetin–BSA complex and warfarin, which produced a ternary complex, quercetin–BSA–warfarin. It was found that both quercetin and warfarin were located in site I. However, the spectra of these three components overlapped, and the chemometrics method of multivariate curve resolution-alternating least squares (MCR-ALS) was applied to resolve them. The resolved spectra of quercetin–BSA and warfarin agreed well with their measured spectra, and importantly, the spectrum of the quercetin–BSA–warfarin complex was extracted. These results allowed the rationalization of the behaviour of the overlapping spectra. At lower concentrations ([warfarin] < 1 × 10^−5 mol L^−1), most of the site marker reacted with the quercetin–BSA, but free warfarin was present at higher concentrations. Interestingly, the ratio between quercetin–BSA and warfarin was found to be 1:2, suggesting a quercetin–BSA–(warfarin)2 complex, and the estimated equilibrium constant was 1.4 × 10^11 M^−2. The results suggest that at low concentrations, warfarin binds at the high-affinity sites (HAS), while low-affinity binding sites (LAS) are occupied at higher concentrations.
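The reported 1:2 stoichiometry and equilibrium constant imply K = [QB(W)2]/([QB][W]^2). As a simplified illustration (assuming free warfarin is in excess, so its concentration can be treated as fixed), the fraction of quercetin–BSA converted to the ternary complex can be computed directly from the study's estimate of K:

```python
K = 1.4e11  # M^-2, equilibrium constant estimated in the study

def fraction_ternary(free_warfarin_M, K=K):
    """Fraction of quercetin-BSA present as quercetin-BSA-(warfarin)2,
    assuming the free warfarin concentration is effectively constant.

    K * [W]^2 is the ratio [ternary]/[binary]; the bound fraction is
    x / (1 + x) with x = K * [W]^2.
    """
    x = K * free_warfarin_M ** 2
    return x / (1.0 + x)

# Near the concentration threshold quoted in the abstract (1e-5 mol/L),
# the binary complex is almost fully converted to the ternary complex.
f = fraction_ternary(1e-5)
```

This back-of-envelope check is consistent with the abstract's observation that free warfarin only appears at higher concentrations.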
Abstract:
In today's fiercely competitive product market, product warranty has begun to play an important role. The warranty period offered by manufacturers and dealers has been progressively increasing since the beginning of the 20th century. Currently, a large number of products are sold with long-term warranty policies in the form of extended warranties, warranties for used products, service contracts and lifetime warranty policies. Lifetime warranties are a relatively new concept. Modelling failures during the warranty period, and the costs of such policies, is complex because the lifespan covered by these policies is not well defined, and it is often difficult to specify life measures over the longer coverage period, owing to usage patterns, the maintenance activities undertaken and cost uncertainties over the period. This paper focuses on defining lifetime, developing lifetime warranty policies, and developing models for predicting failures and estimating costs for lifetime warranty policies.
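The paper's own models are not given in the abstract. As a hedged sketch of the kind of calculation involved, a repairable product under minimal repair is often modelled as a nonhomogeneous Poisson process with a power-law intensity; the parameter names below (eta, beta, cost_per_repair) are illustrative assumptions, not the paper's notation:

```python
def expected_failures_power_law(T_years, eta, beta):
    """Expected number of failures over (0, T] under an NHPP with
    power-law intensity lambda(t) = (beta/eta) * (t/eta)**(beta - 1).

    The cumulative intensity integrates to (T/eta)**beta.
    beta > 1 models wear-out (increasing failure intensity).
    """
    return (T_years / eta) ** beta

def expected_warranty_cost(T_years, eta, beta, cost_per_repair):
    """Expected repair cost over a coverage period of T years,
    assuming a constant average cost per minimal repair."""
    return cost_per_repair * expected_failures_power_law(T_years, eta, beta)

# Example: a 20-year coverage horizon, scale eta = 10 years, shape
# beta = 1.5, and an average repair cost of 100 currency units
cost = expected_warranty_cost(20.0, 10.0, 1.5, 100.0)
```

For a true lifetime policy the horizon T itself is uncertain, which is precisely the definitional difficulty the paper addresses; this sketch assumes a fixed horizon.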
Abstract:
The international focus on embracing daylighting for energy-efficient lighting, and the corporate sector's indulgence in the perception of workplace and work-practice “transparency”, have spurred an increase in highly glazed commercial buildings. This in turn has renewed issues of visual comfort and daylight-derived glare for occupants. In order to ascertain evidence of, or predict the risk of, these events, appraisals of these complex visual environments require detailed information on the luminances present in an occupant's field of view. Conventional luminance meters are an expensive and time-consuming means of achieving these results: creating a luminance map of an occupant's visual field with such a meter requires too many individual measurements to be a practical measurement technique. The application of digital cameras as luminance measurement devices has solved this problem. With high dynamic range imaging, a single digital image can be created to provide luminances on a pixel-by-pixel level within the broad field of view afforded by a fish-eye lens: virtually replicating an occupant's visual field and providing rapid yet detailed luminance information for the entire scene. With proper calibration, relatively inexpensive digital cameras can be successfully applied to the task of luminance measurement, placing them in the realm of tools that any lighting professional should own. This paper discusses how a digital camera can become a luminance measurement device and then presents an analysis of results obtained from post-occupancy measurements from building assessments conducted by the Mobile Architecture Built Environment Laboratory (MABEL) project. This discussion leads to the important realisation that placing such tools in the hands of lighting professionals internationally will provide new opportunities for the lighting community in terms of research on critical issues in lighting, such as daylight glare and visual quality and comfort.
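The abstract does not give the calibration procedure, but a common convention (used by the Radiance lighting-simulation system) converts merged HDR radiance channels to photopic luminance with a weighted RGB sum scaled by the 179 lm/W luminous-efficacy factor. A sketch under that assumption, with the calibration factor to be fitted against a spot luminance meter:

```python
def pixel_luminance(r, g, b, calibration=179.0):
    """Approximate luminance (cd/m2) of one HDR pixel from its linear
    radiance channels, using the Radiance-style weighting
    L = k * (0.265 R + 0.670 G + 0.065 B).

    `calibration` defaults to the nominal 179 lm/W factor but in practice
    is refined per camera/lens against luminance-meter readings.
    """
    return calibration * (0.265 * r + 0.670 * g + 0.065 * b)

# A neutral pixel with unit radiance in all channels maps to 179 cd/m2;
# applying this per pixel over a fish-eye image yields a luminance map
# of the occupant's visual field.
L = pixel_luminance(1.0, 1.0, 1.0)
```

The weighting reflects the eye's greater sensitivity to the green channel, which is why the G coefficient dominates.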
Abstract:
Engineering assets are often complex systems. In a complex system, components often have failure interactions that lead to interactive failures. A system with interactive failures may have an increased failure probability. Hence, interactive failures may need to be taken into account when designing and maintaining complex engineering systems. To address this issue, Sun et al. have developed an analytical model for interactive failures. In this model, the degree of interaction between two components is represented by interactive coefficients. To use this model for failure analysis, the related interactive coefficients must be estimated. However, methods for estimating the interactive coefficients have not been reported. To fill this gap, this paper presents five methods for estimating the interactive coefficients: a probabilistic method, a failure-data-based analysis method, a laboratory experimental method, a failure-interaction-mechanism-based method and an expert estimation method. Examples are given to demonstrate the applications of the proposed methods. Comparisons among these methods are also presented.
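The abstract does not reproduce Sun et al.'s model, but in interactive-failure models of this kind each component's effective failure rate is typically its independent rate plus interactive-coefficient-weighted contributions from the other components. One way to realise the coupled form is as a linear system; the function name and matrix convention below are assumptions for illustration:

```python
import numpy as np

def interactive_hazards(independent_hazards, theta):
    """Solve the coupled relation lambda = lambda_I + Theta @ lambda
    for the effective failure rates of all components.

    independent_hazards : per-component independent failure rates
    theta               : theta[i][j] is the interactive coefficient of
                          component j acting on component i (zero diagonal)

    Rearranging gives (I - Theta) @ lambda = lambda_I, solved directly.
    """
    lam_I = np.asarray(independent_hazards, dtype=float)
    Theta = np.asarray(theta, dtype=float)
    n = lam_I.size
    return np.linalg.solve(np.eye(n) - Theta, lam_I)

# Two components with independent rates 0.01 and 0.02 per year and
# modest mutual interaction: both effective rates exceed the
# independent ones, illustrating the increased failure probability.
lam = interactive_hazards([0.01, 0.02], [[0.0, 0.1], [0.2, 0.0]])
```

This shows why estimating the coefficients matters: even small off-diagonal theta values inflate every component's effective rate.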
Abstract:
Adiabatic compression testing of components in gaseous oxygen is a test method that is utilized worldwide and is commonly required to qualify a component for ignition tolerance under its intended service. This testing is required by many industry standards organizations and government agencies; however, a thorough evaluation of the test parameters and test-system influences on the thermal energy produced during the test has not yet been performed. This paper presents a background for adiabatic compression testing and discusses an approach to estimating potential differences in the thermal profiles produced by different test laboratories. A “Thermal Profile Test Fixture” (TPTF) is described that is capable of measuring and characterizing the thermal energy of a typical pressure shock produced by any test system. The test systems at Wendell Hull & Associates, Inc. (WHA) in the USA and at the BAM Federal Institute for Materials Research and Testing in Germany are compared in this manner, and some of the data obtained are presented. The paper also introduces a new way of comparing the test method to idealized processes in order to perform system-by-system comparisons. Thus, the paper introduces an “Idealized Severity Index” (ISI) of the thermal energy to characterize a rapid pressure surge. From the TPTF data a “Test Severity Index” (TSI) can also be calculated, so that the thermal energies developed by different test systems can be compared to each other and to the ISI for the equivalent isentropic process. Finally, a “Service Severity Index” (SSI) is introduced to characterize the thermal energy of actual service conditions. This paper is the second in a series of publications planned on the subject of adiabatic compression testing.
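The "equivalent isentropic process" referenced for the ISI is the textbook upper bound on compression heating. A minimal sketch of that idealized calculation (the index definitions themselves are the paper's and are not reproduced here):

```python
def isentropic_final_temperature(T1_K, p1, p2, gamma=1.40):
    """Ideal final temperature of a gas compressed isentropically
    from pressure p1 to p2 (any consistent pressure units).

    T2 = T1 * (p2/p1)**((gamma - 1)/gamma)
    gamma ~ 1.40 is the ideal-gas specific-heat ratio for oxygen
    at moderate temperatures.
    """
    return T1_K * (p2 / p1) ** ((gamma - 1.0) / gamma)

# Example: oxygen pressure-shocked from 1 bar and 293 K to 250 bar.
# The idealized end temperature exceeds 1000 K, which is why real
# test-system losses (the subject of the TPTF comparisons) matter.
T2 = isentropic_final_temperature(293.0, 1.0, 250.0)
```

Real pressure surges are slower and lossier than the isentropic ideal, which is precisely the gap the TSI-versus-ISI comparison is designed to quantify.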
Abstract:
This study investigates the short-run dynamics and long-run equilibrium relationship between residential electricity demand and the factors influencing demand - per capita income, the price of electricity, the price of kerosene oil and the price of liquefied petroleum gas - using annual data for Sri Lanka for the period 1960-2007. The study uses unit root, cointegration and error-correction models. The long-run demand elasticities with respect to income, own price and the price of kerosene oil (a substitute) were estimated to be 0.78, -0.62 and 0.14 respectively. The short-run elasticities for the same variables were estimated to be 0.32, -0.16 and 0.10 respectively. Liquefied petroleum (LP) gas is a substitute for electricity only in the short run, with an elasticity of 0.09. The main findings of the paper support the following: (1) increasing the price of electricity is not the most effective tool to reduce electricity consumption; (2) existing subsidies on electricity consumption can be removed without reducing government revenue; (3) the long-run income elasticity of demand shows that any future increase in household incomes is likely to significantly increase the demand for electricity; and (4) any power generation plans that consider only current per capita consumption and population growth should be revised to take into account potential future income increases, in order to avoid power shortages in the country.
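The study's elasticities come from cointegration and error-correction estimation, which is not reproduced here. As a much simpler illustration of why log-log coefficients are read as elasticities, the sketch below recovers a known income elasticity from synthetic annual data by ordinary least squares (the data and the OLS shortcut are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
true_income_elasticity = 0.78  # long-run value reported in the study

# Synthetic annual series: log demand driven by log income plus noise.
# In a log-log model, the slope on log income is the income elasticity.
log_income = np.linspace(6.0, 8.0, 48)
log_demand = (1.0 + true_income_elasticity * log_income
              + rng.normal(0.0, 0.02, 48))

# OLS with an intercept column
X = np.column_stack([np.ones_like(log_income), log_income])
beta, *_ = np.linalg.lstsq(X, log_demand, rcond=None)
income_elasticity = beta[1]
```

An OLS regression in levels like this ignores the nonstationarity that motivates the paper's unit-root and cointegration machinery; it only demonstrates the elasticity interpretation of the coefficient.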