66 results for Radioactive wastes.
Abstract:
Domestic food wastage is a growing problem for the environment and food security. Some causes of domestic food waste are attributed to consumers’ behaviours during food purchasing, storage and consumption, such as excessive food purchases and stockpiling in storage. Recent efforts in human-computer interaction research have examined ways of influencing consumer behaviour. The outcomes have led to a number of interventions that assist users with performing everyday tasks. The Internet Fridge is an example of such an intervention. However, pioneering technologies frequently confront barriers that restrict their future impact in the marketplace, which has prompted investigations into the effectiveness of behaviour-changing interventions used to encourage more sustainable practices. In this paper, we investigate and compare the effectiveness of two interventions that encourage behaviour change: FridgeCam and the Colour Code Project. We use FridgeCam to examine how improving a consumer’s food supply knowledge can reduce food stockpiling. We use the Colour Code Project to examine how improving consumer awareness of food location can encourage consumption of forgotten foods. We explore opportunities to integrate these interventions into commercially available technologies, such as the Internet Fridge, to: (i) increase the technology’s benefit and value to users, and (ii) promote reduced domestic food wastage. We conclude that interventions improving consumer food supply and location knowledge can promote behaviours that reduce domestic food waste over the longer term. The implications of this research present new opportunities for existing and future technologies to play a key role in reducing domestic food waste.
Abstract:
A novel differential pulse voltammetry (DPV) method was developed for the simultaneous analysis of herbicides in water. A mixture of four herbicides (atrazine, simazine, propazine and terbuthylazine) was analyzed simultaneously, and the complex, overlapping DPV voltammograms were resolved by several chemometric methods, namely partial least squares (PLS), principal component regression (PCR) and principal component–artificial neural networks (PC–ANN). The complex profiles of the voltammograms collected from a synthetic set of samples were best resolved with the PC–ANN method, which also gave the best predictions of the analyte concentrations (%RPET = 6.1 and average %Recovery = 99.0). The new method was also used for the analysis of real samples, and the results obtained compared well with those from the GC-MS technique. These findings suggest that the novel method is a viable alternative to other commonly used methods such as GC, HPLC and GC-MS.
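To give a flavour of the kind of multivariate calibration described above, the sketch below fits a PLS model to simulated, overlapping voltammetric peaks for four analytes using scikit-learn. It is not the authors' code: the peak positions, peak widths, noise level, concentration ranges and error measures are invented for illustration only (PLS is shown rather than the PC–ANN model that performed best in the study).

```python
# Illustrative sketch only: resolving simulated overlapping DPV peaks of four
# analytes with partial least squares (PLS) regression via scikit-learn.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
potentials = np.linspace(-1.2, -0.8, 200)        # hypothetical potential axis (V)
peak_centres = [-1.12, -1.05, -0.98, -0.91]      # hypothetical peak potentials

def voltammogram(conc, noise=0.02):
    """Sum of Gaussian peaks, one per analyte, scaled by concentration."""
    signal = sum(c * np.exp(-((potentials - p) / 0.04) ** 2)
                 for c, p in zip(conc, peak_centres))
    return signal + rng.normal(0, noise, potentials.size)

# Synthetic calibration set: random concentrations of the four analytes.
C = rng.uniform(0.1, 1.0, size=(60, 4))
X = np.array([voltammogram(c) for c in C])

X_tr, X_te, C_tr, C_te = train_test_split(X, C, test_size=0.25, random_state=1)
pls = PLSRegression(n_components=6).fit(X_tr, C_tr)
C_pred = pls.predict(X_te)

# Overall relative prediction error and mean recovery on the held-out set
# (analogous in spirit, not identical, to the %RPET and %Recovery figures quoted).
rpe = 100 * np.sqrt(((C_pred - C_te) ** 2).sum() / (C_te ** 2).sum())
recovery = 100 * (C_pred / C_te).mean()
print(f"relative prediction error: {rpe:.1f}%   mean recovery: {recovery:.1f}%")
```

The same simulated data could be fed to PCR or to a neural network on principal-component scores to mimic the PC–ANN comparison reported in the abstract.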
Abstract:
Throughout their lives, people are exposed to the pollutants present in indoor air. Recently, Electronic Nicotine Delivery Systems, better known as electronic cigarettes, have been widely commercialized: they deliver particles into the lungs of users, but a “second-hand smoke” equivalent has yet to be associated with this indoor source. On the other hand, the naturally occurring radioactive gas radon represents a significant risk for lung cancer, and the cumulative action of these two agents could be worse than that of either agent alone. In order to investigate the interaction between radon progeny and second-hand aerosol from different types of cigarettes, a designed experimental study was carried out by generating aerosol from e-cigarette vaping as well as from second-hand traditional smoke inside a walk-in radon chamber at the National Institute of Ionizing Radiation Metrology (INMRI) of Italy. In this chamber, the radon present in the air enters naturally from the floor and ambient conditions are controlled. To characterize the sidestream smoke emitted by cigarettes, condensation particle counters and a scanning mobility particle sizer were used. Radon concentration in the air was measured with an Alphaguard ionization chamber, whereas radon decay products in the air were measured with the Tracelab BWLM Plus-2S Radon daughter Monitor. An increase in the Potential Alpha-Energy Concentration (PAEC), due to radon decay products attaching to the aerosol, was found at higher particle number concentrations. This varied from 7.47 ± 0.34 MeV L−1 to 12.6 ± 0.26 MeV L−1 (69%) for the e-cigarette. In the case of the traditional cigarette, and at the same radon concentration, the increase was from 14.1 ± 0.43 MeV L−1 to 18.6 ± 0.19 MeV L−1 (31%). The equilibrium factor also increases, from 23.4% ± 1.11% to 29.5% ± 0.26% and from 30.9% ± 1.0% to 38.1% ± 0.88% for the e-cigarette and the traditional cigarette, respectively. These increases persist long after combustion ends, prolonging the exposure risk.
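The percentage increases quoted in parentheses above follow directly from the reported PAEC and equilibrium-factor values; the short sketch below simply reproduces that arithmetic. The numbers are taken from the abstract, and the equilibrium factor is not recomputed from the radon concentration (which is not given here).

```python
# Reproducing the relative increases reported for PAEC and the equilibrium factor.
def percent_increase(before, after):
    """Relative increase, in percent, between two reported values."""
    return 100 * (after - before) / before

# PAEC in MeV L^-1, as quoted in the abstract.
print(f"e-cigarette PAEC increase:  {percent_increase(7.47, 12.6):.1f}%")   # ~69%
print(f"traditional PAEC increase:  {percent_increase(14.1, 18.6):.1f}%")   # ~32% (quoted as 31%)

# Equilibrium factor (%), before vs. after aerosol generation, also from the abstract.
print(f"e-cigarette F increase:     {percent_increase(23.4, 29.5):.1f}%")
print(f"traditional F increase:     {percent_increase(30.9, 38.1):.1f}%")
```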
Abstract:
This paper presents an approach, based on the Lean production philosophy, for rationalising the processes involved in producing specification documents for construction projects. Current construction literature erroneously depicts the process of creating construction specifications as a linear one. This traditional understanding of the specification process often culminates in process wastes. On the contrary, the evidence suggests that, though generalised, the activities involved in producing specification documents are nonlinear. Drawing on the outcome of participant observation, this paper presents an optimised approach for representing construction specifications. Consequently, the actors typically involved in producing specification documents are identified, the processes suitable for automation are highlighted, and the central role of tacit knowledge is integrated into a conceptual template of construction specifications. By applying the transformation, flow, value (TFV) theory of Lean production, the paper argues that value creation can be realised by eliminating the wastes associated with the traditional preparation of specification documents, with a view to integrating specifications into digital models such as Building Information Models (BIM). Therefore, the paper presents the TFV theory as a method for optimising current approaches to generating construction specifications, based on a revised specification-writing model.
Abstract:
Species distribution modelling (SDM) typically analyses species’ presence together with some form of absence information. Ideally, absences comprise observations or are inferred from comprehensive sampling. When such information is not available, pseudo-absences are often generated from the background locations within the study region of interest containing the presences, or else absence is implied through the comparison of presences to the whole study region, as is the case in Maximum Entropy (MaxEnt) or Poisson point process modelling. However, the choice of which absence information to include can be both challenging and highly influential on SDM predictions (e.g. Oksanen and Minchin, 2002). In practice, the use of pseudo- or implied absences often leads to an imbalance where absences far outnumber presences. This leaves analysis highly susceptible to ‘naughty noughts’: absences that occur beyond the envelope of the species, which can exert strong influence on the model and its predictions (Austin and Meyers, 1996). Also known as ‘excess zeros’, naughty noughts can be estimated via an overall proportion in simple hurdle or mixture models (Martin et al., 2005). However, absences, especially those that occur beyond the species envelope, can often be more diverse than presences. Here we consider an extension to excess zero models. The two-stage approach first exploits the compartmentalisation provided by classification trees (CTs) (as in O’Leary, 2008) to identify multiple sources of naughty noughts and simultaneously delineate several species envelopes. Then SDMs can be fit separately within each envelope, and for this stage we examine both CTs (as in Falk et al., 2014) and the popular MaxEnt (Elith et al., 2006). We introduce a wider range of model performance measures to improve treatment of naughty noughts in SDM. We retain an overall measure of model performance, the area under the Receiver Operating Characteristic (ROC) curve (AUC), but focus on its constituent measures, the false negative rate (FNR) and false positive rate (FPR), and how these relate to the threshold in the predicted probability of presence that delimits predicted presence from absence. We also propose error rates more relevant to users of predictions: the false omission rate (FOR), the chance that a predicted absence corresponds to (and hence wastes) an observed presence, and the false discovery rate (FDR), reflecting those predicted (or potential) presences that correspond to absences. A high FDR may be desirable since it could help target future search efforts, whereas a zero or low FOR is desirable since it indicates that none of the (often valuable) presences have been ignored in the SDM. For illustration, we chose Bradypus variegatus, a species previously used as an exemplar species for MaxEnt, proposed by Phillips et al. (2006). We used CTs to increasingly refine the species envelope, starting with the whole study region (E0) and eliminating more and more potential naughty noughts (E1–E3). When combined with an SDM fit within the species envelope, the best CT SDM had similar AUC and FPR to the best MaxEnt SDM, but otherwise performed better. The FNR and FOR were greatly reduced, suggesting that CTs handle absences better. Interestingly, MaxEnt predictions showed low discriminatory performance, with the most common predicted probability of presence falling in the same range (0.00–0.20) for both true absences and presences.
In summary, this example shows that SDMs can be improved by introducing an initial hurdle to identify naughty noughts and partition the envelope before applying SDMs. This improvement was barely detectable via AUC and FPR, yet clearly visible in FOR, FNR, and the comparison of the predicted probability-of-presence distributions for presences and absences.
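Because the abstract leans on several confusion-matrix rates (FPR, FNR, FOR, FDR), a minimal sketch of how they are computed from observed presence/absence and predicted probabilities may help. The threshold, class balance and toy data below are invented for illustration and are not the Bradypus variegatus data used in the study.

```python
# Minimal sketch: confusion-matrix error rates discussed in the abstract,
# computed from observed presence/absence and predicted probabilities of presence.
import numpy as np

def sdm_error_rates(y_obs, p_pred, threshold=0.5):
    """y_obs: 1 = observed presence, 0 = absence; p_pred: predicted probability."""
    y_hat = (p_pred >= threshold).astype(int)
    tp = np.sum((y_hat == 1) & (y_obs == 1))
    fp = np.sum((y_hat == 1) & (y_obs == 0))
    fn = np.sum((y_hat == 0) & (y_obs == 1))
    tn = np.sum((y_hat == 0) & (y_obs == 0))
    return {
        "FPR": fp / (fp + tn),  # false positive rate: absences predicted as presences
        "FNR": fn / (fn + tp),  # false negative rate: presences predicted as absences
        "FOR": fn / (fn + tn),  # false omission rate: predicted absences that were presences
        "FDR": fp / (fp + tp),  # false discovery rate: predicted presences that were absences
    }

# Toy example with far more absences than presences (the typical SDM imbalance).
rng = np.random.default_rng(42)
y_obs = np.r_[np.ones(50), np.zeros(950)].astype(int)
p_pred = np.where(y_obs == 1,
                  rng.uniform(0.2, 0.9, 1000),   # presences: generally higher scores
                  rng.uniform(0.0, 0.6, 1000))   # absences: generally lower scores
print(sdm_error_rates(y_obs, p_pred, threshold=0.4))
```

Sweeping the threshold argument traces out the ROC curve behind the AUC, while FOR and FDR track the user-oriented errors the abstract emphasises.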
Abstract:
Biorefineries, co-producing fuels, green chemicals and bio-products, offer great potential for enhancing agricultural value and developing new industries in the bioeconomy. Biomass biorefineries aim to convert agricultural crops and wastes, through biochemical and enzymatic processes, into low-cost fermentable sugars and other products which serve as platforms for value-adding. Through subsequent fermentation or chemical synthesis, these bio-based platforms can be converted into fuels including ethanol and butanol, oils, organic acids such as lactic and levulinic acids, and polymer precursors. Other biorefinery products can include food and animal feeds, plastics, fibre products and resins. In 2014, QUT commissioned a study from Deloitte Access Economics and Correlli Consulting to assess the potential future economic value of tropical biorefineries to Queensland. This paper will report on the outcomes of this study and address the opportunities available for tropical biorefineries to contribute to the future profitability and sustainability of tropical agricultural industries in Queensland and more broadly across northern Australia.