989 results for ink reduction software
Abstract:
This paper evaluates and compares the performance of a solar desiccant-evaporative cooling (SDEC) system against a conventional variable air volume (VAV) reference system for a typical office building in all 8 Australian capital cities. A simulation model of the building is developed using the whole-building simulation software EnergyPlus. The performance indicators for the comparison are system coefficient of performance (COP), annual primary energy consumption, annual energy savings, and annual CO2 emissions reduction. The simulation results show that Darwin is the most favourable location for SDEC system applications, with annual energy savings of 557 GJ and a CO2 emissions reduction of 121 tonnes; the maximum system COP is 7. For other climate zones such as Canberra, Hobart and Melbourne, the SDEC system is not as energy efficient as the conventional VAV system.
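The bookkeeping behind these performance indicators is simple: system COP is useful cooling output over primary energy input, and the savings are the gap between the VAV baseline and the SDEC system. A minimal Python sketch follows, with placeholder figures chosen to land on the Darwin numbers quoted above rather than taken from the paper's simulation output:

```python
# System COP, annual primary energy savings, and CO2 reduction for the
# SDEC-vs-VAV comparison. All inputs are illustrative placeholders.

def system_cop(cooling_output_gj: float, primary_energy_gj: float) -> float:
    """Coefficient of performance = useful cooling / primary energy input."""
    return cooling_output_gj / primary_energy_gj

def annual_savings(vav_energy_gj: float, sdec_energy_gj: float,
                   emission_factor_t_per_gj: float) -> tuple[float, float]:
    """Energy saved by SDEC relative to the VAV baseline, plus the
    corresponding CO2 reduction via an assumed grid emission factor."""
    saved_gj = vav_energy_gj - sdec_energy_gj
    return saved_gj, saved_gj * emission_factor_t_per_gj

# Placeholder consumption figures and emission factor (not the paper's data).
saved, co2 = annual_savings(vav_energy_gj=800.0, sdec_energy_gj=243.0,
                            emission_factor_t_per_gj=0.217)
print(f"savings: {saved:.0f} GJ/yr, CO2 reduction: {co2:.0f} t/yr")
```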
Abstract:
Bien Hoa Airbase was one of the bulk storage and supply facilities for defoliants during the Vietnam War. Environmental and biological samples taken around the airbase have elevated levels of dioxin. In 2007, a pre-intervention knowledge, attitude and practice (KAP) survey of local residents living in Trung Dung and Tan Phong wards was undertaken regarding appropriate strategies to reduce dioxin exposure. A risk reduction programme was implemented in 2008, and post-intervention KAP surveys were undertaken in 2009 and 2013 to evaluate the longer-term impacts. Quantitative assessment was undertaken via a KAP survey in 2013 among 600 local residents randomly selected from the two intervention wards and one control ward (Buu Long). Eight in-depth interviews and two focus group discussions were also undertaken for qualitative assessment. Most programme activities had ceased and dioxin risk communication activities had not been integrated into local routine health education programmes; however, the main results generally persisted and were better than those in Buu Long. In total, 48.2% of households undertook measures to prevent exposure, higher than in the pre- and post-intervention surveys (25.8% and 39.7%) and the control ward (7.7%). Migration and the sensitive nature of dioxin issues were the main challenges for the programme's sustainability.
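The ward-to-ward comparison above is a two-proportion comparison. A minimal sketch of the corresponding test is below; the abstract does not state which test the study used, and the per-group sample sizes are assumed round numbers, not the study's exact counts:

```python
# Two-proportion z-test comparing households taking preventive measures in
# the 2013 intervention wards (48.2%) vs the control ward (7.7%). The test
# choice and the sample sizes (400 and 200) are assumptions.
from statsmodels.stats.proportion import proportions_ztest

counts = [round(0.482 * 400), round(0.077 * 200)]  # ~193 and ~15 households
nobs = [400, 200]
stat, pval = proportions_ztest(counts, nobs)
print(f"z = {stat:.2f}, p = {pval:.2e}")
```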
Abstract:
Mechanical flexibility is considered an asset in consumer electronics and next-generation electronic systems. Printed and flexible electronic devices could be embedded into clothing or other surfaces at home or in the office, or into many products such as low-cost sensors integrated in transparent and flexible surfaces. In this context, inks based on graphene and related two-dimensional materials (2DMs) are gaining increasing attention owing to their exceptional (opto)electronic, electrochemical and mechanical properties. The current limitation lies in the solvents needed to provide stable dispersions of graphene and 2DMs while meeting the fluidic requirements for printing: these solvents are in general not environmentally benign and have high boiling points. Non-toxic, low-boiling-point solvents do not possess the rheological properties (i.e., surface tension, viscosity and density) required for the solution processing of graphene and 2DMs. Such solvents (e.g., water, alcohols) require the addition of stabilizing agents such as polymers or surfactants to disperse graphene and 2DMs, which, however, unavoidably corrupt their properties, preventing their use in the target application. Here, we demonstrate a viable strategy to tune the fluidic properties of water/ethanol mixtures (low-boiling-point solvents) to first effectively exfoliate graphite and then disperse graphene flakes to formulate graphene-based inks. We demonstrate that such inks can be used to print conductive stripes (sheet resistance of ~13 kΩ/□) on flexible substrates (polyethylene terephthalate), moving a step forward towards the realization of graphene-based printed electronic devices.
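The quoted sheet resistance translates to the resistance of a printed track via the usual squares rule, R = Rs × L/W. A quick sketch with illustrative stripe dimensions; only the ~13 kΩ/□ figure comes from the abstract:

```python
# Resistance of a rectangular printed track from sheet resistance:
# R = Rs * (L / W), where L/W is the number of "squares". Dimensions are
# illustrative assumptions.

def stripe_resistance(sheet_res_ohm_per_sq: float,
                      length_mm: float, width_mm: float) -> float:
    return sheet_res_ohm_per_sq * (length_mm / width_mm)

r = stripe_resistance(13e3, length_mm=20.0, width_mm=1.0)
print(f"20 mm x 1 mm stripe: {r / 1e3:.0f} kOhm")  # 20 squares -> 260 kOhm
```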
Abstract:
Nepal, as a consequence of its geographical location and changing climate, faces frequent threats of natural disasters. According to the World Bank's 2005 Natural Disasters Hotspots report, Nepal is ranked the 11th most vulnerable country to earthquake risk and 30th to flood risk. GeoHazards International (2011) has classified Kathmandu as one of the world's most earthquake-vulnerable cities. In the last four decades more than 32,000 people in Nepal have lost their lives, and annual monetary losses are estimated at more than US$15 million. This review identifies gaps in knowledge and progress towards implementation of the post-Hyogo Framework for Action. Nepal has identified priority areas: community resilience, sustainable development and climate-change-induced disaster risk reduction. However, one gap between policy and action lies in Nepal's ability to act effectively in accordance with an appropriate framework for media activities. Supporting media agencies include the Press Council, the Federation of Nepalese Journalists, Nepal Television, Radio Nepal, the Telecommunications Authority and community-based organizations. The challenge lies in further strengthening traditional and new media to undertake systematic work supported by government bodies and the Nepal Risk Reduction Consortium (NRRC). Within this context, the ideal role for the media is a proactive one, in which journalists pay attention to a range of appropriate angles or frames when preparing and disseminating information. It is important to develop policy for effective information collection, sharing and dissemination in collaboration with telecommunications providers, media and journalists. The aim of this paper is to describe developments in disaster management in Nepal and their implications for media management. This study provides lessons for government, community and the media to help improve the framing of disaster messages. Significantly, the research highlights the prominence that should be given to floods, landslides, lightning and earthquakes.
Abstract:
This research explored how small and medium enterprises can achieve success with software as a service (SaaS) applications from the cloud. Based on an empirical investigation of six growth-oriented, early-technology-adopting small and medium enterprises, the study proposes a model of SaaS success for small and medium enterprises with two approaches: one for basic and one for advanced benefits. The basic model explains the effective use of SaaS for achieving informational and transactional benefits. The advanced model explains the enhanced use of SaaS for achieving strategic and transformational benefits. Both models explicate the information systems capabilities and organizational complementarities needed for achieving success with SaaS.
Abstract:
Ce1-xSnxO2 (x = 0.1-0.5) solid solutions and their Pd-substituted analogue have been prepared by a single-step solution combustion method using a tin oxalate precursor. The compounds were characterized by X-ray diffraction (XRD), X-ray photoelectron spectroscopy (XPS), transmission electron microscopy (TEM), and H2 temperature-programmed reduction (TPR) studies. The cubic fluorite structure remained intact up to 50% Sn substitution in CeO2, and the compounds were stable up to 700 °C. The oxygen storage capacity of Ce1-xSnxO2 was found to be much higher than that of Ce1-xZrxO2 due to accessible Ce4+/Ce3+ and Sn4+/Sn2+ redox couples at temperatures between 200 and 400 °C. Pd2+ ions in Ce0.78Sn0.2Pd0.02O2-δ are highly ionic, and the lattice oxygen of this catalyst is highly labile, leading to low-temperature CO to CO2 conversion. The rate of CO oxidation was 2 μmol g-1 s-1 at 50 °C. NO reduction by CO was observed with 70% N2 selectivity at ~200 °C, and with 100% N2 selectivity below 260 °C at 1000-5000 ppm NO. Thus, Pd2+-substituted Ce1-xSnxO2 is a superior catalyst to Pd2+ in CeO2, Ce1-xZrxO2, and Ce1-xTixO2 for low-temperature exhaust applications due to the involvement of the Sn2+/Sn4+ redox couple along with the Pd2+/Pd0 and Ce4+/Ce3+ couples.
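The conversion and selectivity figures follow the definitions commonly used for NO + CO chemistry, in which each N2 (or N2O) molecule accounts for two converted NO molecules. A sketch under those assumed definitions, which the abstract does not spell out, with illustrative concentrations:

```python
# NO conversion and N2 selectivity under common definitions for NO + CO
# chemistry. Both the definitions and the outlet concentrations below are
# assumptions, not the paper's data.

def no_conversion(no_in_ppm: float, no_out_ppm: float) -> float:
    return (no_in_ppm - no_out_ppm) / no_in_ppm

def n2_selectivity(no_in_ppm: float, no_out_ppm: float, n2_out_ppm: float) -> float:
    """Fraction of converted NO that ends up as N2 rather than N2O."""
    return 2.0 * n2_out_ppm / (no_in_ppm - no_out_ppm)

# Illustrative: 1000 ppm NO feed, 300 ppm remaining, 245 ppm N2 formed.
print(f"X(NO) = {no_conversion(1000, 300):.0%}, "
      f"S(N2) = {n2_selectivity(1000, 300, 245):.0%}")
```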
Abstract:
This study investigated driving reduction in a diverse sample of 229 male and female older drivers aged 70 years and above in Queensland, Australia. The study sought to determine whether differences existed between male and female older drivers in regard to driving patterns, and to identify factors predictive of driving reduction in female versus male older drivers. Participants provided information on their health, self-reported driving patterns, driving perceptions, alternative transport options, and feedback. Overall, females were more likely to avoid challenging situations but less likely to reduce their driving when compared to males. Self-rated health and driving confidence were significant predictors of driving reduction among females. For males, driving importance was the only significant predictor of driving reduction in this sample. This study indicates the need for longitudinal research on the process of driving reduction and on whether the planning process for driving cessation differs between females and males.
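Predictor analyses of this kind are typically run as logistic regressions on a binary driving-reduction outcome. A hedged sketch with hypothetical file and column names; the abstract does not give the study's actual model specification:

```python
# Logistic regression of a binary "reduced driving" outcome on the
# predictors named above. File name and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("older_drivers.csv")          # hypothetical dataset
females = df[df["sex"] == "F"]
model = smf.logit("reduced_driving ~ self_rated_health + driving_confidence",
                  data=females).fit()
print(model.summary())
```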
Abstract:
Bug fixing is a highly cooperative work activity in which developers, testers, product managers and other stakeholders collaborate using a bug tracking system. In the context of Global Software Development (GSD), where software development is distributed across different geographical locations, we focus on understanding the role of bug trackers in supporting software bug fixing activities. We carried out small-scale ethnographic fieldwork in a software product team distributed between Finland and India at a multinational engineering company. Using semi-structured interviews and in-situ observations of 16 bug cases, we show that the bug tracker 1) supported the information needs of different stakeholders, 2) established common ground, and 3) reinforced issues related to ownership, performance and power. Consequently, we provide implications for design around these findings.
Abstract:
Background: Inflammation and biomechanical factors have been associated with the development of vulnerable atherosclerotic plaques. Lipid-lowering therapy has been shown to be effective in stabilizing them by reducing plaque inflammation. Its effect on arterial wall strain, however, remains unknown. The aim of the present study was to investigate the role of high- and low-dose lipid-lowering therapy using an HMG-CoA reductase inhibitor, atorvastatin, on arterial wall strain. Methods and Results: Forty patients with carotid stenosis >40% were successfully followed up during the Atorvastatin Therapy: Effects on Reduction Of Macrophage Activity (ATHEROMA; ISRCTN64894118) Trial. All patients had plaque inflammation as shown by intraplaque accumulation of ultrasmall superparamagnetic particles of iron oxide on magnetic resonance imaging at baseline. Structural analysis was performed and the change in strain was compared between high- and low-dose statin at 0 and 12 weeks. There was no significant difference in strain between the 2 groups at baseline (P=0.6). At 12 weeks, the maximum strain was significantly lower in the 80-mg group than in the 10-mg group (0.085±0.033 vs. 0.169±0.084; P=0.001). A significant reduction (26%) of maximum strain was observed in the 80-mg group at 12 weeks (0.018±0.02; P=0.01). Conclusions: Aggressive lipid-lowering therapy is associated with a significant reduction in arterial wall strain. The reduction in biomechanical strain may be associated with reductions in plaque inflammatory burden.
Abstract:
Objectives: The aim of this study was to evaluate the effects of low-dose (10 mg) and high-dose (80 mg) atorvastatin on carotid plaque inflammation as determined by ultrasmall superparamagnetic iron oxide (USPIO)-enhanced carotid magnetic resonance imaging (MRI). The hypothesis was that treatment with 80 mg atorvastatin would demonstrate quantifiable changes in USPIO-enhanced MRI-defined inflammation within the first 3 months of therapy. Background: Preliminary studies indicate that USPIO-enhanced MRI can identify macrophage infiltration in human carotid atheroma in vivo and hence may be a surrogate marker of plaque inflammation. Methods: Forty-seven patients with carotid stenosis >40% on duplex ultrasonography and who demonstrated intraplaque accumulation of USPIO on MRI at baseline were randomly assigned in a balanced, double-blind manner to either 10 or 80 mg atorvastatin daily for 12 weeks. Baseline statin therapy was equivalent to 10 mg of atorvastatin or less. The primary end point was change from baseline in signal intensity (ΔSI) on USPIO-enhanced MRI in carotid plaque at 6 and 12 weeks. Results: Twenty patients completed 12 weeks of treatment in each group. A significant reduction from baseline in USPIO-defined inflammation was observed in the 80-mg group at both 6 weeks (ΔSI 0.13; p = 0.0003) and at 12 weeks (ΔSI 0.20; p < 0.0001). No difference was observed with the low-dose regimen. The 80-mg atorvastatin dose significantly reduced total cholesterol by 15% (p = 0.0003) and low-density lipoprotein cholesterol by 29% (p = 0.0001) at 12 weeks. Conclusions: Aggressive lipid-lowering therapy over a 3-month period is associated with significant reduction in USPIO-defined inflammation. USPIO-enhanced MRI methodology may be a useful imaging biomarker for the screening and assessment of therapeutic response to "anti-inflammatory" interventions in patients with atherosclerotic lesions. (Effects of Atorvastatin on Macrophage Activity and Plaque Inflammation Using Magnetic Resonance Imaging [ATHEROMA]; NCT00368589).
Abstract:
This paper presents an overview of the issues in precisely defining, specifying and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theories have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model helpful in clarifying this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software "bugs", the failure history of the software system in the various phases of its lifecycle, reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process are all considered in varying degrees of detail. We also discuss the notion of software fault tolerance, methods of achieving it, and the status of other measures of software dependability such as maintainability, availability and safety.
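As one concrete instance of the reliability-growth and residual-error estimation problems such surveys cover, consider the Goel-Okumoto NHPP model: fit the mean-value function μ(t) = a(1 − e^(−bt)) to cumulative failure counts, and a − μ(T) estimates the errors remaining after testing time T. This is an illustrative model choice, not necessarily one the paper covers, and the failure data below are invented for the example:

```python
# Fit the Goel-Okumoto NHPP model mu(t) = a * (1 - exp(-b*t)) to cumulative
# failure counts; a - mu(T) then estimates remaining errors.
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    return a * (1.0 - np.exp(-b * t))

# Illustrative data: weeks of testing vs cumulative failures observed.
t = np.array([1, 2, 3, 4, 5, 6, 7, 8], dtype=float)
cum_failures = np.array([12, 21, 28, 33, 37, 40, 42, 43], dtype=float)

(a, b), _ = curve_fit(goel_okumoto, t, cum_failures, p0=(50.0, 0.3))
remaining = a - goel_okumoto(t[-1], a, b)
print(f"estimated total errors a = {a:.1f}, remaining = {remaining:.1f}")
```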
Abstract:
Birch reduction and reductive methylation of some substituted naphthoic acids have been examined. The factors influencing the mechanism of the reduction process are discussed. Some of the reduced naphthoic acids are useful synthons for synthesis.
Abstract:
Reducing unwanted trawl bycatch is actively encouraged in Australia, particularly in prawn trawl fisheries. We tested the performance of a Bycatch Reduction Device, the Yarrow Fisheye, during two periods of commercial fishing operations in Australia's Northern Prawn Fishery, by comparing the catches of paired treatment and control nets. We compared the catch weights of the small fish and invertebrate bycatch, and of the commercially important tiger prawns, from 42 trawls in 2002. The Yarrow Fisheye reduced the weight of small bycatch by a mean of 22.7%, with no loss of tiger prawns. We also compared the numbers of seasnakes caught in 41 and 72 trawls during the spring trawling seasons of 2004 and 2005, respectively. The Yarrow Fisheye reduced these catches by a mean of 43.3%. Flume-tank tests of the Yarrow Fisheye showed that the device created a slow water-flow region extending over 2 m downstream from its position in the net, close to where the catch accumulates. Finfish and seasnakes may be exploiting this slow water-flow region to escape via the eye. Although the reductions in fish and seasnake bycatch were excellent, we think they could be further improved by relating differences in fisheye position and localised water displacement to design and rigging changes.
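The paired-net comparison reduces to averaging per-trawl percent reductions, 1 − treatment/control. A minimal sketch with illustrative catch weights, not the fishery's data:

```python
# Mean percent bycatch reduction from paired treatment/control trawls.
# Catch weights (kg of small bycatch) are illustrative placeholders.
import numpy as np

control = np.array([120.0, 95.0, 140.0, 80.0, 110.0])   # standard net
treatment = np.array([90.0, 75.0, 105.0, 65.0, 85.0])   # net with Fisheye

pct_reduction = (1.0 - treatment / control) * 100.0
print(f"mean reduction = {pct_reduction.mean():.1f}% "
      f"(range {pct_reduction.min():.1f}-{pct_reduction.max():.1f}%)")
```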
Abstract:
The literature contains many examples of digital procedures for the analytical treatment of electroencephalograms, but there is as yet no standard by which those techniques may be judged or compared. This paper proposes one method of generating an EEG, based on a computer program implementing Zetterberg's simulation. It is assumed that the statistical properties of an EEG may be represented by stationary processes having rational transfer functions, realised by a system of software filters and random number generators. The model represents neither the neurological mechanisms responsible for generating the EEG nor any particular type of EEG record; transient phenomena such as spikes, sharp waves and alpha bursts are also excluded. The basis of the program is a valid 'partial' statistical description of the EEG; that description is then used to produce a digital representation of a signal which, if plotted sequentially, might or might not by chance resemble an EEG; that is unimportant. What is important is that the statistical properties of the series remain those of a real EEG; it is in this sense that the output is a simulation of the EEG. There is considerable flexibility in the form of the output, i.e. its alpha, beta and delta content, which may be selected by the user, the same selected parameters always producing the same statistical output. The filtered outputs from the random number sequences may be scaled to provide realistic power distributions in the accepted EEG frequency bands and then summed to create a digital output signal, the 'stationary EEG'. It is suggested that the simulator might act as a test input to digital analytical techniques for the EEG, enabling at least a substantial part of those techniques to be compared and assessed in an objective manner. The equations necessary to implement the model are given. The program has been run on a DEC1090 computer but is suitable for any microcomputer having more than 32 kBytes of memory; the execution time required to generate a 25 s simulated EEG is in the region of 15 s.
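In the spirit of the simulator described above, a compact sketch: independent Gaussian noise sequences are band-pass filtered, scaled to target band amplitudes, and summed into a 'stationary EEG'. Butterworth filters stand in for Zetterberg's rational transfer functions, and the sampling rate, band edges and relative amplitudes are illustrative assumptions:

```python
# 'Stationary EEG' via band-limited filtered noise: per-band Gaussian noise
# is band-pass filtered, scaled, and summed. Filter type and all parameters
# are illustrative stand-ins for the original model's transfer functions.
import numpy as np
from scipy.signal import butter, lfilter

FS = 256  # sampling rate in Hz (assumed)
BANDS = {"delta": (0.5, 4.0, 1.0),   # (low Hz, high Hz, relative amplitude)
         "alpha": (8.0, 13.0, 2.0),
         "beta":  (13.0, 30.0, 0.5)}

def stationary_eeg(duration_s: float, seed: int = 0) -> np.ndarray:
    """Same seed -> same output, mirroring the deterministic original."""
    rng = np.random.default_rng(seed)
    n = int(duration_s * FS)
    out = np.zeros(n)
    for lo, hi, gain in BANDS.values():
        b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
        out += gain * lfilter(b, a, rng.standard_normal(n))
    return out

eeg = stationary_eeg(25.0)  # 25 s of simulated 'stationary EEG'
```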