Abstract:
This chapter analyses the copyright law framework needed to ensure open access to outputs of the Australian academic and research sector, such as journal articles and theses. It provides an overview of the new knowledge landscape, the principles of copyright law, the concept of open access to knowledge, the recently developed open content models of copyright licensing, and the challenges faced in providing greater access to knowledge and research outputs.
Abstract:
Porosity is one of the key parameters of the macroscopic structure of porous media, generally defined as the ratio of the free space (the volume of air) within the material to the total volume of the material. Porosity is determined by measuring the skeletal volume and the envelope volume. The solid displacement method is an inexpensive and easy way to determine the envelope volume of a sample with an irregular shape. In this method, glass beads are generally used as the displacement solid because of their uniform size, compactness and flow properties. However, beads that are smaller than the surface pore openings can enter open pores whose diameter is larger than the beads. Although extensive research has been carried out on porosity determination using the displacement method, no study adequately reports micro-level observation of the sample during measurement. This study set out to assess the accuracy of the solid displacement method for bulk density measurement of dried foods through micro-level observation. Solid displacement measurements were conducted using a cylindrical plastic vial and 57 µm glass beads in order to measure the bulk density of apple slices at different moisture contents. A scanning electron microscope (SEM), a profilometer and ImageJ software were used to investigate the penetration of glass beads into the surface pores during the determination of the porosity of dried food. A helium pycnometer was used to measure the particle density of the sample. Results show that a significant number of pores were large enough to allow the glass beads to enter, causing some erroneous results. It was also found that coating the dried sample with an appropriate coating material prior to measurement can resolve this problem.
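For concreteness, the following is a minimal sketch of how porosity follows from the two measurements described above: bulk density from mass and envelope volume (solid displacement) and particle density from mass and skeletal volume (helium pycnometry). The example values are illustrative only and are not data from the study.

```python
# Minimal sketch: porosity from the two density measurements described above.
# Example values are illustrative only, not data from the study.

def bulk_density(mass_g: float, envelope_volume_cm3: float) -> float:
    """Bulk density from sample mass and envelope volume (solid displacement)."""
    return mass_g / envelope_volume_cm3

def particle_density(mass_g: float, skeletal_volume_cm3: float) -> float:
    """Particle density from sample mass and skeletal volume (helium pycnometry)."""
    return mass_g / skeletal_volume_cm3

def porosity(rho_bulk: float, rho_particle: float) -> float:
    """Porosity = fraction of the envelope volume occupied by air."""
    return 1.0 - rho_bulk / rho_particle

if __name__ == "__main__":
    mass = 2.5        # g, illustrative apple-slice sample
    v_envelope = 4.0  # cm^3, from glass-bead displacement
    v_skeletal = 1.8  # cm^3, from helium pycnometer
    rho_b = bulk_density(mass, v_envelope)
    rho_p = particle_density(mass, v_skeletal)
    print(f"porosity = {porosity(rho_b, rho_p):.3f}")
```

Under these definitions, bead intrusion into surface pores would understate the envelope volume, overstate the bulk density and hence understate the porosity, which is the kind of error the coating step is intended to avoid.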
Abstract:
The Australian climate is highly suitable for using outdoor air for free building cooling. In order to evaluate the suitability of a hybrid cooler for specific applications, a pre-design climate assessment tool has been developed and is presented in this paper. In addition to the local climate, the comfort zone proposed by the ASHRAE handbook and the specific design of the building and operation of the hybrid cooler, the tool also considers the possible influence of environmental factors (e.g. air humidity and air velocity) and personal factors (e.g. activity level and clothing insulation) on occupants' thermal comfort. It is demonstrated that, given climatic data for a particular location and the associated design data for a specific application, the developed climate assessment tool can not only sort outdoor air conditions into the different process regions but also project them onto the psychrometric chart. It can also be used to estimate the hours spent in each operational mode under various climate conditions and to summarise them in a results table.
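The following is a minimal sketch of the kind of hour-by-hour classification such a tool performs. The region names and threshold values are hypothetical placeholders, not the boundaries used by the tool or by the ASHRAE comfort zone.

```python
# Hypothetical sketch of sorting hourly outdoor conditions into process regions.
# Thresholds and region names are placeholders, not the tool's actual boundaries.
from collections import Counter

def classify_hour(dry_bulb_c: float, rel_humidity: float) -> str:
    """Assign one outdoor-air hour to an operational mode (illustrative rules only)."""
    if 18.0 <= dry_bulb_c <= 26.0 and 0.3 <= rel_humidity <= 0.6:
        return "free cooling"          # outdoor air delivered directly
    if dry_bulb_c > 26.0 and rel_humidity < 0.4:
        return "evaporative cooling"   # hot and dry: evaporative stage
    if dry_bulb_c > 26.0:
        return "mechanical cooling"    # hot and humid: refrigerative backup
    return "heating / no cooling"

def mode_hours(hourly_weather):
    """Summarise annual hours per mode, analogous to the tool's results table."""
    return Counter(classify_hour(t, rh) for t, rh in hourly_weather)

# Example with three illustrative hours: (dry-bulb temperature in C, relative humidity fraction)
print(mode_hours([(22.0, 0.5), (30.0, 0.3), (31.0, 0.7)]))
```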
Abstract:
Internal heat sources not only consume energy directly through their operation (e.g. lighting), but also contribute to building cooling or heating loads, which indirectly changes building cooling and heating energy. Using building simulation techniques, this paper investigates the influence of building internal load densities on the energy and thermal performance of air conditioned office buildings in Australia. Case studies for air conditioned office buildings in major Australian capital cities are presented. It is found that with a decrease of internal load density in lighting and/or plug load, both the building cooling load and total energy use can be significantly reduced. Their effect on reducing overheating hours depends on the local climate. In particular, it is found that if the building total internal load density is reduced from the base case of "medium" to "extra-low", the building total energy use under the future 2070 high scenario can be reduced by up to 89 to 120 kWh/m² per annum and the overheating problem could be completely avoided. It is suggested that the reduction in building internal load densities could be adopted as one of the adaptation strategies for buildings in the face of future global warming.
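As a minimal illustration of how internal load density translates into annual internal heat gain per unit floor area, the sketch below converts lighting and plug load densities into kWh/m² per annum. The density values and operating hours are assumptions for illustration only, not figures from the case studies, and the result excludes the indirect HVAC interaction the simulations capture.

```python
# Illustrative only: converting internal load densities (W/m2) into annual
# internal heat gains (kWh/m2 per annum). Values are assumptions, not study data.

def annual_internal_gain(lighting_w_per_m2: float,
                         plug_w_per_m2: float,
                         operating_hours_per_year: float = 2600.0) -> float:
    """Annual internal heat gain per unit floor area, in kWh/m2."""
    total_w_per_m2 = lighting_w_per_m2 + plug_w_per_m2
    return total_w_per_m2 * operating_hours_per_year / 1000.0

# Hypothetical "medium" vs "extra-low" internal load densities
medium = annual_internal_gain(lighting_w_per_m2=11.0, plug_w_per_m2=15.0)
extra_low = annual_internal_gain(lighting_w_per_m2=5.0, plug_w_per_m2=5.0)
print(f"medium:    {medium:.0f} kWh/m2 per annum")
print(f"extra-low: {extra_low:.0f} kWh/m2 per annum")
print(f"direct reduction: {medium - extra_low:.0f} kWh/m2 per annum (before HVAC interaction)")
```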
Abstract:
Experimental work can be conducted either in the laboratory or at a field site. Generally, laboratory experiments are carried out in an artificial setting under a highly controlled environment. By contrast, field experiments often take place in a natural setting, subject to the influence of many uncontrolled factors. It is therefore necessary to carefully assess the possible limitations and appropriateness of an experiment before embarking on it. In this paper, a case study of field monitoring of the energy performance of air conditioners is presented. Significant challenges facing the experimental work are described, and lessons learnt from this case study are discussed. In particular, it was found that on-going analysis of the monitoring data and the correction of abnormal issues are two of the keys to a successful field test program. It was also shown that the installation of monitoring systems could have a significant impact on the accuracy of the data being collected. Before a monitoring system is set up to collect data, it is recommended that an initial analysis of sample monitored data be conducted to make sure that the data can achieve the expected precision. Where inherent errors are unavoidably introduced by the installation of the field monitoring systems, appropriate remediation may need to be developed and implemented to improve the accuracy of the estimated results.
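The following is a minimal sketch of the kind of on-going monitoring-data check described above, flagging out-of-range readings and logging gaps. The field names, expected ranges and gap threshold are illustrative assumptions, not the monitoring setup used in the case study.

```python
# Illustrative on-going data check for field monitoring records.
# Field names, valid ranges and the gap threshold are assumptions, not the study's setup.
from datetime import datetime, timedelta

VALID_RANGES = {"power_kW": (0.0, 15.0), "supply_air_C": (5.0, 45.0)}
MAX_GAP = timedelta(minutes=30)

def flag_abnormal(records):
    """Yield human-readable flags for out-of-range values and logging gaps."""
    previous_time = None
    for rec in records:
        t = rec["timestamp"]
        if previous_time is not None and t - previous_time > MAX_GAP:
            yield f"{t}: data gap of {t - previous_time} since last record"
        previous_time = t
        for field, (low, high) in VALID_RANGES.items():
            value = rec.get(field)
            if value is None or not (low <= value <= high):
                yield f"{t}: {field}={value} outside expected range [{low}, {high}]"

records = [
    {"timestamp": datetime(2024, 1, 1, 10, 0), "power_kW": 3.2, "supply_air_C": 12.0},
    {"timestamp": datetime(2024, 1, 1, 11, 0), "power_kW": -0.4, "supply_air_C": 13.5},
]
for flag in flag_abnormal(records):
    print(flag)
```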
Abstract:
Ever-growing populations in cities are associated with a major increase in road vehicles and air pollution. Overall high levels of urban air pollution have been shown to pose a significant risk to city dwellers. However, the impacts of very high but temporally and spatially restricted pollution, and thus exposure, are still poorly understood. Conventional approaches to air quality monitoring are based on networks of static and sparse measurement stations. However, these are prohibitively expensive for capturing spatio-temporal heterogeneity and identifying pollution hotspots, which is required for the development of robust real-time strategies for exposure control. Current progress in low-cost micro-scale sensing technology is radically changing the conventional approach by allowing real-time information to be gathered in a finely distributed, capillary form. But the question remains whether there is value in the less accurate data these sensors generate. This article illustrates the drivers behind the current rise in the use of low-cost sensors for air pollution management in cities, whilst addressing the major challenges for their effective implementation.
Abstract:
A number of online algorithms have been developed that have small additional loss (regret) compared to the best "shifting expert". In this model, there is a set of experts and the comparator is the best partition of the trial sequence into a small number of segments, where the expert with the smallest loss is chosen in each segment. The regret is typically defined for worst-case data / loss sequences. There has been a recent surge of interest in online algorithms that combine good worst-case guarantees with much improved performance on easy data. A practically relevant class of easy data is the case where the loss of each expert is iid and there is a gap between the mean losses of the best and second-best experts. In the full information setting, the FlipFlop algorithm by De Rooij et al. (2014) combines the best of the iid-optimal Follow-The-Leader (FL) and the worst-case-safe Hedge algorithms, whereas in the bandit information case SAO by Bubeck and Slivkins (2012) competes with the iid-optimal UCB and the worst-case-safe EXP3. We ask the same questions for the shifting expert problem. First, what are simple and efficient algorithms for the shifting experts problem when the loss sequence in each segment is iid with respect to a fixed but unknown distribution? Second, how can the performance of such algorithms on easy data be efficiently united with worst-case robustness? A particularly intriguing open problem is the case when the comparator shifts within a small subset of experts from a large set, under the assumption that the losses in each segment are iid.
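For concreteness, the following is a minimal sketch of the two baseline full-information strategies named above, Follow-The-Leader and Hedge, run on a shared loss sequence with an iid gap between expert means. The learning rate and loss distributions are illustrative assumptions, and this is not the FlipFlop combination or a shifting-expert algorithm itself.

```python
# Minimal sketch of Follow-The-Leader (FTL) and Hedge for prediction with expert
# advice (full information). Learning rate and loss values are illustrative only.
import math
import random

def ftl_choice(cumulative_losses):
    """Follow-The-Leader: play the expert with the smallest cumulative loss."""
    return min(range(len(cumulative_losses)), key=lambda i: cumulative_losses[i])

def hedge_weights(cumulative_losses, eta):
    """Hedge: exponential weights over experts, normalised to a distribution."""
    w = [math.exp(-eta * loss) for loss in cumulative_losses]
    total = sum(w)
    return [x / total for x in w]

def run(num_experts=3, horizon=1000, eta=0.1, seed=0):
    rng = random.Random(seed)
    means = [0.3, 0.5, 0.5]          # expert 0 is best: iid losses with a mean gap
    cum = [0.0] * num_experts
    ftl_loss = hedge_loss = 0.0
    for _ in range(horizon):
        losses = [min(1.0, max(0.0, rng.gauss(m, 0.1))) for m in means]
        ftl_loss += losses[ftl_choice(cum)]
        p = hedge_weights(cum, eta)
        hedge_loss += sum(pi * li for pi, li in zip(p, losses))
        cum = [c + l for c, l in zip(cum, losses)]
    best = min(cum)
    print(f"FTL regret:   {ftl_loss - best:.1f}")
    print(f"Hedge regret: {hedge_loss - best:.1f}")

run()
```

On such easy (iid, gapped) data FTL quickly locks onto the best expert, whereas Hedge's exponential weighting sacrifices some of that performance in exchange for its worst-case guarantee; combining the two behaviours is exactly what FlipFlop does in the non-shifting setting.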
Abstract:
Background: An increase in bicycle commuting participation may improve public health and reduce traffic congestion in cities. Information on air pollution exposure (such as perception, symptoms and risk management) contributes to the responsible promotion of bicycle commuting participation. Methods: To determine perceptions of, symptoms of, and willingness to adopt specific risk management strategies for exposure to air pollution, a questionnaire-based cross-sectional investigation was conducted with adult bicycle commuters (n = 153; age = 41 ± 11 yr; 28% female). Results: The frequency of acute respiratory signs and symptoms was higher in the in-commute and post-commute periods than in the pre-commute period (p < 0.05); the association was stronger for participants with respiratory disorders than for healthy participants, and for female than for male participants. The perception (although not the signs or symptoms) of in-commute exposure to air pollution was positively associated with the estimated level of in-commute proximity to motorised traffic. The majority of participants indicated a willingness (which varied with health status and gender) to adopt risk management strategies (with certain practical features) if shown to be appropriate and effective. Conclusions: While acute signs and symptoms of air pollution exposure are associated with bicycle commuting, and more so in susceptible individuals, there is willingness to manage exposure risk by adopting effective strategies with desirable features.
Abstract:
Ultrafine particles are particles less than 0.1 micrometres (µm) in diameter. Due to their very small size they can penetrate deep into the lungs and potentially cause more damage than larger particles. The Ultrafine Particles from Traffic Emissions and Children's Health (UPTECH) study is the first Australian epidemiological study to assess the effects of ultrafine particles on children's health in general and on the peripheral airways in particular. The study is being conducted in Brisbane, Australia. Continuous indoor and outdoor air pollution monitoring was conducted within each of the twenty-five participating school campuses to measure particulate matter, including in the ultrafine size range, and gases. Respiratory health effects were evaluated by conducting the following tests on participating children at each school: spirometry, the forced oscillation technique (FOT) and the multiple breath nitrogen washout test (MBNW) (to assess airway function), fraction of exhaled nitric oxide (FeNO, to assess airway inflammation), blood cotinine levels (to assess exposure to second-hand tobacco smoke), and serum C-reactive protein (CRP) levels (to measure systemic inflammation). A pilot study was conducted prior to commencing the main study to assess the feasibility and reliability of measurement of some of the clinical tests proposed for the main study. Air pollutant exposure measurements were not included in the pilot study.
Abstract:
The drying of grapes is a more complex process than the dehydration of other agricultural materials because a pretreatment operation is required prior to drying. Grape drying to produce raisins is a very slow process, due to the peculiar structure of the grape peel, which is covered by a waxy layer. Its removal has so far been carried out using several chemical pre-treatments. However, these cause heterogeneity in the removal of the waxes and create microscopic cracks. In this paper an abrasive pretreatment for enhancing the drying rate while preserving the grape samples is proposed. Two cultivars of grape were investigated: Regina white grape and Red Globe red grape. The drying kinetics of untreated and treated samples were studied using a convective oven at 50 °C. Fruit quality parameters such as sugar and organic acid contents, shrinkage, texture, peel damage (by SEM analysis) and rehydration capacity were studied to evaluate the effectiveness of the abrasive pretreatment on raisins. The abrasive pretreatment contributed to reducing both drying time and rehydration time. The treated and untreated dried grapes differed significantly (p < 0.05) in sugar and tartaric acid content. By contrast, no significant differences (at p < 0.05) in malic and citric acid content or in texture properties were observed between untreated and treated samples.
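The following is a minimal sketch of how drying kinetics of this kind are commonly summarised with a simple thin-layer (Lewis) model, MR = exp(-kt), where a larger drying constant k corresponds to faster drying. The moisture ratio values below are invented for illustration and are not measurements from the study.

```python
# Illustrative fit of a simple thin-layer (Lewis) drying model MR(t) = exp(-k t).
# Moisture values are invented for illustration, not data from the study.
import math

def fit_drying_constant(times_h, moisture_ratios):
    """Least-squares fit of k in MR = exp(-k t) via the linearised form ln(MR) = -k t."""
    num = sum(t * (-math.log(mr)) for t, mr in zip(times_h, moisture_ratios))
    den = sum(t * t for t in times_h)
    return num / den

# Hypothetical moisture ratios (dimensionless) at drying times in hours
times = [0.0, 2.0, 4.0, 6.0, 8.0]
mr_untreated = [1.00, 0.82, 0.66, 0.54, 0.44]
mr_treated = [1.00, 0.70, 0.50, 0.35, 0.25]

k_u = fit_drying_constant(times[1:], mr_untreated[1:])  # skip t = 0, where ln(MR) = 0
k_t = fit_drying_constant(times[1:], mr_treated[1:])
print(f"k untreated: {k_u:.3f} 1/h, k treated: {k_t:.3f} 1/h")
```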
Abstract:
This paper addresses the development of trust in the use of Open Data through the incorporation of appropriate authentication and integrity parameters, for use by end-user Open Data application developers, in an architecture for trustworthy Open Data services. The advantage of this architecture is that it is far more scalable and is not another certificate-based hierarchy, with the associated problems of certificate revocation management. With the use of a Public File, if a key is compromised it is a simple matter for the single responsible entity to replace the key pair with a new one and re-perform the data file signing process. Under the proposed architecture, the Open Data environment does not interfere with the internal security schemes that might be employed by the entity. However, the architecture incorporates, when needed, parameters from the entity, e.g. the person who authorised publishing the data as Open Data, at the time that datasets are created or added.
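The following is a minimal sketch of the sign-then-verify pattern such an architecture relies on, using an Ed25519 key pair from the Python cryptography package. The key handling, Public File lookup and dataset format are placeholders, not the paper's actual scheme.

```python
# Minimal sketch of signing an Open Data file and verifying it against a published
# public key. Key management and the Public File lookup are placeholders, not the
# paper's actual scheme. Requires the 'cryptography' package.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Publisher side: the responsible entity signs the dataset bytes.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()   # this key would be published in the Public File
dataset = b"station,date,pm25\nCBD,2024-01-01,8.3\n"
signature = private_key.sign(dataset)

# Consumer side: an application fetches the public key from the Public File
# and checks integrity and authenticity before using the data.
try:
    public_key.verify(signature, dataset)
    print("dataset verified: published by the claimed entity and unmodified")
except InvalidSignature:
    print("verification failed: dataset altered or key has been replaced")
```

If a key is compromised, the publishing entity would generate a new key pair, replace the entry in the Public File and re-sign its data files, which is the recovery path the abstract describes.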
Abstract:
Background: Climate change may affect mortality associated with air pollutants, especially fine particulate matter (PM2.5) and ozone (O3). Such projection studies involve complicated modelling approaches with uncertainties. Objectives: We conducted a systematic review of research and methods for projecting future PM2.5- and O3-related mortality, to identify the uncertainties and the optimal approaches for handling them. Methods: A literature search was conducted in October 2013 using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search was limited to peer-reviewed journal articles published in English from January 1980 to September 2013. Discussion: Fifteen studies fulfilled the inclusion criteria. Most studies reported that an increase in climate change-induced PM2.5 and O3 may result in an increase in mortality. However, little research has been conducted in developing countries with high emissions and dense populations. Additionally, health effects induced by PM2.5 may dominate over those caused by O3, but projection studies of PM2.5-related mortality are fewer than those of O3-related mortality. There is considerable variation in the approaches used in scenario-based projection research, which makes it difficult to compare results. Multiple scenarios, models and downscaling methods have been used to reduce uncertainties. However, few studies have discussed what the main sources of uncertainty are and which uncertainties could be most effectively reduced. Conclusions: Projecting air pollution-related mortality requires a systematic consideration of assumptions and uncertainties, which will significantly aid policymakers in efforts to manage the potential impacts of PM2.5 and O3 on mortality in the context of climate change.
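As a minimal sketch, the log-linear health impact function commonly used in such projections takes the form excess deaths = y0 * (1 - exp(-beta * delta_C)) * Pop. The coefficient, baseline mortality rate, population and concentration change below are illustrative assumptions, not values from the reviewed studies.

```python
# Illustrative log-linear health impact function often used in air pollution
# mortality projections. All numbers are assumptions, not values from the review.
import math

def attributable_deaths(beta: float, delta_c: float,
                        baseline_rate: float, population: float) -> float:
    """Excess deaths = y0 * (1 - exp(-beta * delta_C)) * Pop."""
    return baseline_rate * (1.0 - math.exp(-beta * delta_c)) * population

# Hypothetical inputs: concentration-response coefficient per microgram/m3, a
# climate-driven rise in annual PM2.5, a baseline mortality rate and a population.
beta_pm25 = 0.006           # per microgram/m3 (illustrative)
delta_pm25 = 2.0            # microgram/m3 increase under a climate scenario
baseline_mortality = 0.008  # deaths per person-year
population = 2_000_000

print(f"projected excess deaths per year: "
      f"{attributable_deaths(beta_pm25, delta_pm25, baseline_mortality, population):.0f}")
```

Each input carries its own uncertainty (emission scenario, downscaled concentration change, concentration-response coefficient, baseline rate and population projection), which is why the review emphasises identifying which source of uncertainty dominates.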