679 results for Produce


Relevance: 10.00%

Abstract:

Semi-conducting phase I CuTCNQ (TCNQ = 7,7,8,8-tetracyanoquinodimethane), which is of considerable interest as a switching device for memory storage materials, can be electrocrystallized from CH3CN via two distinctly different pathways when TCNQ is reduced to TCNQ˙− in the presence of [Cu(MeCN)4]+. The first pathway, identified in earlier studies, occurs at potentials where TCNQ is reduced to TCNQ˙− and involves a nucleation–growth mechanism at preferred sites on the electrode to produce arrays of well-separated, large, branched, needle-shaped phase I CuTCNQ crystals. The second pathway, now identified at more negative potentials, generates much smaller needle-shaped phase I CuTCNQ crystals. These electrocrystallize on parts of the surface not occupied in the initial process and give rise to film-like characteristics. This process is attributed to the reduction of Cu+[(TCNQ˙−)(TCNQ)] or a stabilised film of TCNQ via a solid–solid conversion process, which also involves ingress of Cu+ via a nucleation–growth mechanism. The CuTCNQ surface area coverage is extensive since it occurs at all areas of the electrode and not just at the defect sites that dominate crystal formation in the first pathway. Infrared spectra, X-ray diffraction, surface plasmon resonance, quartz crystal microbalance, scanning electron microscopy and optical image data all confirm that two distinctly different pathways are available to produce the kinetically favoured and more highly conducting phase I CuTCNQ solid, rather than the phase II material.

Relevance: 10.00%

Abstract:

We demonstrate a simple electrochemical route, requiring neither a capping agent nor prior modification of the electrode surface, to produce uniformly sized gold nanospikes that are predominantly oriented in the {111} crystal plane and exhibit promising electrocatalytic and SERS properties.

Relevance: 10.00%

Abstract:

Stress corrosion cracking (SCC) is a well-known form of environmental attack in low carat gold jewellery. It is desirable to have a quick, easy and cost-effective way to detect SCC in alloys and prevent them from being used and later failing in their application. A facile chemical method to investigate SCC of 9 carat gold alloys is demonstrated. It involves a simple application of tensile stress to a wire sample in a corrosive environment such as 1–10% FeCl3, which induces failure in less than 5 minutes. In this study, three quaternary (Au, Ag, Cu and Zn) 9 carat gold alloy compositions were investigated for their resistance to SCC, and the relationship between time to failure and processing conditions was studied. It is envisaged that the use of such a rapid and facile screening procedure at the production stage may readily identify alloy treatments that produce jewellery that will be susceptible to SCC in its lifetime.

Relevance: 10.00%

Abstract:

Climate change is expected to be one of the biggest global health threats in the 21st century. In response to changes in climate and associated extreme events, public health adaptation has become imperative. This thesis examined several key issues in this emerging research field. The thesis aimed to identify the climate-health (particularly temperature-health) relationships, then develop quantitative models that can be used to project future health impacts of climate change, and therefore help formulate adaptation strategies for dealing with climate-related health risks and reducing vulnerability. The research questions addressed by this thesis were: (1) What are the barriers to public health adaptation to climate change? What are the research priorities in this emerging field? (2) What models and frameworks can be used to project future temperature-related mortality under different climate change scenarios? (3) What is the actual burden of temperature-related mortality? What are the impacts of climate change on the future burden of disease? and (4) Can we develop public health adaptation strategies to manage the health effects of temperature in response to climate change?

Using a literature review, I discussed how public health organisations should implement and manage the process of planned adaptation. This review showed that public health adaptation can operate at two levels: building adaptive capacity and implementing adaptation actions. However, there are constraints and barriers to adaptation arising from uncertainty, cost, technological limits, institutional arrangements, deficits of social capital, and individual perception of risks. The opportunities for planning and implementing public health adaptation are reliant on effective strategies to overcome likely barriers. I proposed that high priority should be given to multidisciplinary research on the assessment of potential health effects of climate change, projections of future health impacts under different climate and socio-economic scenarios, identification of health co-benefits of climate change policies, and evaluation of cost-effective public health adaptation options.

Heat-related mortality is the most direct and highly significant potential climate change impact on human health. I thus conducted a systematic review of research and methods for projecting future heat-related mortality under different climate change scenarios. The review showed that climate change is likely to result in a substantial increase in heat-related mortality. Projecting heat-related mortality requires an understanding of historical temperature-mortality relationships, and consideration of future changes in climate, population and acclimatisation. Further research is needed to provide a stronger theoretical framework for mortality projections, including a better understanding of socioeconomic development, adaptation strategies, land-use patterns, air pollution and mortality displacement.

Most previous studies were designed to examine temperature-related excess deaths or mortality risks. However, if most temperature-related deaths occur in the very elderly, who have only a short life expectancy, then the burden of temperature on mortality would have less public health importance. To guide policy decisions and resource allocation, it is desirable to know the actual burden of temperature-related mortality. To achieve this, I used years of life lost to provide a new measure of the health effects of temperature.

I conducted a time-series analysis to estimate years of life lost associated with changes in season and temperature in Brisbane, Australia. I also projected the future temperature-related years of life lost attributable to climate change. This study showed that the association between temperature and years of life lost was U-shaped, with increased years of life lost on cold and hot days. The temperature-related years of life lost will worsen greatly if future climate change goes beyond a 2 °C increase and there is no adaptation to higher temperatures.

The excess mortality during prolonged extreme temperatures is often greater than that predicted using the smoothed temperature-mortality association. This is because sustained periods of extreme temperatures produce an extra effect beyond that predicted by daily temperatures. To better estimate the burden of extreme temperatures, I estimated their effects on years of life lost due to cardiovascular disease using data from Brisbane, Australia. The results showed that the association between daily mean temperature and years of life lost due to cardiovascular disease was U-shaped, with the lowest years of life lost at 24 °C (the 75th percentile of daily mean temperature in Brisbane), rising progressively as temperatures become hotter or colder. There were significant added effects of heat waves, but no added effects of cold spells.

Finally, public health adaptation to hot weather is necessary and pressing. I discussed how to manage the health effects of temperature, especially in the context of climate change. Strategies to minimise the health effects of high temperatures and climate change fall into two categories: reducing heat exposure and managing the health effects of high temperatures. However, policy decisions need information on specific adaptations, together with their expected costs and benefits. Therefore, more research is needed to evaluate cost-effective adaptation options.

In summary, this thesis adds to the large body of literature on the impacts of temperature and climate change on human health. It improves our understanding of the temperature-health relationship, and how this relationship will change as temperatures increase. Although the research is limited to one city, which restricts the generalisability of the findings, the methods and approaches developed in this thesis will be useful to other researchers studying temperature-health relationships and climate change impacts. The results may be helpful for decision-makers who develop public health adaptation strategies to minimise the health effects of extreme temperatures and climate change.
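To illustrate the kind of time-series analysis described above, the sketch below fits a simple quadratic regression of daily years of life lost on daily mean temperature, with crude season and trend controls, to recover a U-shaped association. The file name, column names and model form are assumptions for illustration only; the actual thesis models (e.g., smoothed temperature-mortality associations) are more sophisticated.

```python
# Minimal sketch: estimating a U-shaped temperature-YLL association from
# daily time-series data. File name, column names and the quadratic form
# are illustrative assumptions, not the thesis's actual specification.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: 'date', 'yll' (daily years of life lost) and
# 'tmean' (daily mean temperature, degrees C).
df = pd.read_csv("brisbane_daily.csv", parse_dates=["date"])
df["doy"] = df["date"].dt.dayofyear        # crude seasonal control
df["year"] = df["date"].dt.year            # long-term trend control
df["tmean_sq"] = df["tmean"] ** 2          # quadratic term approximates the U-shape

model = smf.ols("yll ~ tmean + tmean_sq + C(year) + doy", data=df).fit()
print(model.summary())

# Temperature at which predicted YLL is lowest (vertex of the parabola);
# YLL rises on days colder or hotter than this optimum.
b1, b2 = model.params["tmean"], model.params["tmean_sq"]
print("Estimated minimum-YLL temperature:", -b1 / (2 * b2))
```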

Relevance: 10.00%

Abstract:

Application of 'advanced analysis' methods suitable for non-linear analysis and design of steel frame structures permits direct and accurate determination of ultimate system strengths, without resort to simplified elastic methods of analysis and semi-empirical specification equations. However, the application of advanced analysis methods has previously been restricted to steel frames comprising only compact sections that are not influenced by the effects of local buckling. A research project has been conducted with the aim of developing concentrated plasticity methods suitable for practical advanced analysis of steel frame structures comprising non-compact sections. A primary objective was to produce a comprehensive range of new distributed plasticity analytical benchmark solutions for verification of the concentrated plasticity methods. A distributed plasticity model was developed using shell finite elements to explicitly account for the effects of gradual yielding and spread of plasticity, initial geometric imperfections, residual stresses and local buckling deformations. The model was verified by comparison with large-scale steel frame test results and a variety of existing analytical benchmark solutions. This paper presents a description of the distributed plasticity model and details of the verification study.

Relevance: 10.00%

Abstract:

Over the past few decades a major paradigm shift has occurred in the conceptualisation of chronic pain as a complex multidimensional phenomenon. Yet, pain experienced by individuals with a primary disability continues to be understood largely from a traditional biomedical model, despite its inherent limitations. This is reflected in the body of literature on the topic, which is primarily driven by positivist assumptions and the search for etiologic pain mechanisms. Conversely, little is known about the experiences of, and meanings attributed to, disability-related pain. Thus the purpose of this paper is to discuss the use of focus group methodology in elucidating the meanings and experiences of this population. Here, a distinction is made between the method of the focus group and focus group research as methodology. Typically, the focus group is presented as a seemingly atheoretical method of research. Drawing on research undertaken on the impact of chronic pain in people with multiple sclerosis, this paper seeks to theorise the focus group in arguing the methodological congruence of focus group research and the study of pain experience. It is argued that the contributions of group interaction and shared experiences in focus group discussions produce data and insights less accessible through more structured research methods. It is concluded that a biopsychosocial perspective of chronic pain may only ever be appreciated when the person-in-context is the unit of investigation.

Relevance: 10.00%

Abstract:

This study explores the accuracy and valuation implications of the application of a comprehensive list of equity multiples in the takeover context. Motivating the study is the prevalent use of equity multiples in practice, the observed long-run underperformance of acquirers following takeovers, and the scarcity of multiples-based research in the merger and acquisition setting. In exploring the application of equity multiples in this context, three research questions are addressed: (RQ1) how accurate are equity multiples; (RQ2) which equity multiples are more accurate in valuing the firm; and (RQ3) which equity multiples are associated with greater misvaluation of the firm. Following a comprehensive review of the extant multiples-based literature, it is hypothesised that the accuracy of multiples in estimating stock market prices in the takeover context will rank as follows (from best to worst): (1) forecasted earnings multiples, (2) multiples closer to bottom-line earnings, (3) multiples based on Net Cash Flow from Operations (NCFO) and trading revenue. The relative inaccuracies in multiples are expected to flow through to equity misvaluation (as measured by the ratio of estimated market capitalisation to residual income value, or P/V). Accordingly, it is hypothesised that greater overvaluation will be exhibited for multiples based on Trading Revenue, NCFO, Book Value (BV) and earnings before interest, tax, depreciation and amortisation (EBITDA) versus multiples based on bottom-line earnings; and that multiples based on Intrinsic Value will display the least overvaluation. The hypotheses are tested using a sample of 147 acquirers and 129 targets involved in Australian takeover transactions announced between 1990 and 2005. The results show that, first, the majority of computed multiples examined exhibit valuation errors within 30 percent of stock market values. Second, and consistent with expectations, the results provide support for the superiority of multiples based on forecasted earnings in valuing targets and acquirers engaged in takeover transactions. Although a gradual improvement in estimating stock market values is not entirely evident when moving down the Income Statement, historical earnings multiples perform better than multiples based on Trading Revenue or NCFO. Third, while multiples based on forecasted earnings have the highest valuation accuracy, they, along with Trading Revenue multiples for targets, produce the most overvalued valuations for acquirers and targets. Consistent with predictions, greater overvaluation is exhibited for multiples based on Trading Revenue for targets, and NCFO and EBITDA for both acquirers and targets. Finally, as expected, multiples based on Intrinsic Value (along with BV) are associated with the least overvaluation. Given the widespread usage of valuation multiples in takeover contexts, these findings offer a unique insight into their relative effectiveness. Importantly, the findings add to the growing body of valuation accuracy literature, especially within Australia, and should assist market participants to better understand the relative accuracy and misvaluation consequences of various equity multiples used in takeover documentation and assist them in subsequent investment decision-making.
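To make the notion of valuation accuracy concrete, the sketch below applies a peer-median price-to-earnings multiple to a hypothetical target and computes the percentage deviation of the implied value from the observed market capitalisation. The numbers and the error convention are illustrative assumptions, not this study's exact specification.

```python
# Minimal sketch of multiples-based valuation accuracy, using hypothetical
# numbers. A peer median price-to-earnings (P/E) multiple is applied to the
# target's earnings, and the valuation error is the deviation of the implied
# value from the observed market value.
from statistics import median

# Hypothetical comparable firms: (market capitalisation, net earnings), in $m.
peers = [(900, 75), (1200, 96), (650, 59), (1500, 110)]
peer_pe = median(cap / earnings for cap, earnings in peers)

target_earnings = 80       # target's earnings, $m (hypothetical)
target_market_cap = 1050   # observed market capitalisation, $m (hypothetical)

implied_value = peer_pe * target_earnings
valuation_error = (implied_value - target_market_cap) / target_market_cap

print(f"Peer median P/E: {peer_pe:.2f}")
print(f"Implied equity value: ${implied_value:.0f}m")
# Errors within +/-30% of market value are the benchmark cited in the abstract.
print(f"Valuation error: {valuation_error:+.1%}")
```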

Relevance: 10.00%

Abstract:

A method of producing porous complex oxides includes the steps of providing a mixture of (a) precursor elements suitable to produce the complex oxide, or (b) one or more precursor elements suitable to produce particles of the complex oxide and one or more metal oxide particles; and (c) a particulate carbon-containing pore-forming material selected to provide pore sizes in the range of 7–250 nm, and treating the mixture to (i) form the porous complex oxide, in which two or more of the precursor elements from (a) above, or one or more of the precursor elements and one or more of the metals in the metal oxide particles from (b) above, are incorporated into a phase of the complex metal oxide and the complex metal oxide has grain sizes in the range of 1–150 nm, and (ii) remove the pore-forming material under conditions such that the porous structure and composition of the complex oxide are substantially preserved. The method may be used to produce nonrefractory metal oxides as well. The mixture further includes a surfactant or a polymer.

Relevance: 10.00%

Abstract:

Public policymakers are caught in a dilemma: there is a growing list of urgent issues to address, at the same time that public expenditure is being cut. Adding to this dilemma is a system of government designed in the 19th century and competing theories of policymaking dating back to the 1950s. The interlinked problems of disaster risk management and climate change adaptation are cases in point. As the climate changes, there will be more frequent, intense and/or prolonged disasters such as floods and bushfires. Clearly a well-integrated, whole-of-government response is needed, but how might this be achieved? Further, how could academic research contribute to resolving this dilemma in a way that would produce something of theoretical interest as well as practical outcomes for policymakers? These are the questions addressed by our research via a comparative analysis of the 2009 Victorian bushfires, the 2011 Perth Hills bushfires, and the 2011 Brisbane floods. Our findings suggest that there is a need to: improve community engagement and communication; refocus attention on resilience; improve interagency communication and collaboration; and develop institutional arrangements that support continual improvement and policy learning. These findings have implications for all areas of public policy theory and practice.

Relevance: 10.00%

Abstract:

The 2 hour game jam was performed as part of the State Library of Queensland 'Garage Gamer' series of events, summer 2013, at the SLQ exhibition. An aspect of the exhibition was the series of 'Level Up' game nights. We hosted the first of these - under the auspices of brIGDA, Game On. It was a party - but the focal point of the event was a live-streamed 2 hour game jam.

Game jams have become popular amongst the game development and design community in recent years, particularly with the growth of the Global Game Jam, a yearly event which brings thousands of game makers together across different sites in different countries. Other established jams take place online, for example the Ludum Dare challenge, which has been running since 2002. Other challenges follow the same model in more intimate circumstances, and it is now common to find institutions and groups holding their own small local game-making jams. There are variations around the format (some jams are more competitive than others, for example), but a common aspect is the creation of an intense creative crucible centred around teamwork and 'accelerated game development'. Works (games) produced during these intense events often display more experimental qualities than those undertaken as commercial projects. In part this is because the typical jam is started with a conceptual design brief, perhaps a single word, or in the case of the specific game jam described in this paper, three words. Teams have to envision the challenge key word/s as a game design using whatever skills and technologies they can and produce a finished working game in the time given.

Game jams thus provide design researchers with extraordinary fodder, and recent years have also seen a number of projects which seek to illuminate the design process as seen in these events. For example, Gaydos, Harris and Martinez discuss the opportunity of the jam to expose students to principles of design process and design spaces (2011). Rouse muses on the game jam 'as radical practice' and a 'corrective to game creation as it is normally practiced'. His observations about his own experience in a jam emphasise the same artistic endeavour forefronted earlier, where the experience is about creation that is divorced from the instrumental motivations of commercial game design (Rouse 2011) and where the focus is on process over product.

Other participants remark on the social milieu of the event as a critical factor and the collaborative opportunity as a rich site to engage participants in design processes (Shin et al., 2012). Shin et al. are particularly interested in the notion of the site of the process and the ramifications of participants being in the same location. They applaud the more localized event where there is an emphasis on local participation and collaboration. For other commentators, it is specifically the social experience in the place of the jam that is the most important aspect (see Keogh 2011), not the material site but rather the physical embodied experience of 'being there' and being part of the event. Participants talk about game jams they have attended in a similar manner to those observations made by Dourish, where the experience is layered on top of the physical space of the event (Dourish 2006). It is as if the event has taken on qualities of place where we find echoes of Tuan's description of a particular site having an aura of history that makes it a very different place, redolent and evocative (Tuan 1977).
The 2 hour game jam held during the SLQ Garage Gamer program was all about social experience.

Relevance: 10.00%

Abstract:

Piezoelectric composites comprising an active phase of ferroelectric ceramic and a polymer matrix have recently attracted interest for numerous sensing applications. However, it remains a major challenge to further improve their electromechanical response for advanced applications such as precision control and monitoring systems. Here we investigated the incorporation of graphene platelets (GnPs) and multi-walled carbon nanotubes (MWNTs), each with various weight fractions, into PZT (lead zirconate titanate)/epoxy composites to produce three-phase nanocomposites. The nanocomposite films show markedly improved piezoelectric coefficients and electromechanical responses (50%), in addition to an enhancement of ~200% in stiffness. Carbon nanomaterials strengthened the impact of the electric field on the PZT particles by appropriately raising the electrical conductivity of the epoxy. GnPs proved far more promising than MWNTs in improving the poling behavior and dynamic response. The superior dynamic sensitivity of the GnP-reinforced composite may be caused by GnPs' high load transfer efficiency arising from their two-dimensional geometry and good compatibility with the matrix. Reduced acoustic impedance mismatch resulting from the improved thermal conductance may also contribute to the higher sensitivity of the GnP-reinforced composite. This research points to the potential of employing GnPs to develop highly sensitive piezoelectric composites for sensing applications.

Relevance: 10.00%

Abstract:

Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates energy consumption and generation data of the building into an interactive simulation, which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and negative impact on ecosystems, with both local and global consequences.

The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualisation with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that 'display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy'. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that have shared interests in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including 'connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.' Consequently, Froehlich (2010) and his colleagues also use the term 'Eco-feedback technology' to describe the same field. 'Green IT' is another variation, which Tomlinson (2010) describes as a 'field at the juncture of two trends… the growing concern over environmental issues' and 'the use of digital tools and techniques for manipulating information.' The ECOS Project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved.

The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modeling. From a mathematical perspective, the simulation can be divided into two models, which interact and compete for balance – the comfort of ECOS' virtual denizens and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically measures relating to human comfort. This provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirements are ascertained, they are balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover its energy requirements. The relative amount of energy produced by wind and solar can be calculated by considering, in the case of solar for example, the size of the panel and the amount of solar radiation it is receiving at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API.
Some of these variables can be altered by the user, allowing them to attempt to optimize the health of the building. The variables that can be changed are the budget allocated to green energy sources such as the solar panels and wind generator, and the air-conditioning setting that controls the internal building temperature. These variables influence the energy inputs and outputs, modeled on the real energy usage statistics drawn from the SEC data provided by the building managers.
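As a rough illustration of the balance described above, the sketch below estimates heating/cooling demand from the gap between the outdoor temperature and the thermostat setting and compares it with crude estimates of solar and wind generation. All function names, coefficients and simplified relationships are assumptions for illustration; the actual ECOS simulation and SEC building data are considerably more detailed.

```python
# Minimal sketch of an ECOS-style energy balance, with made-up coefficients.
# Demand grows with the gap between outdoor temperature and the thermostat;
# supply is a crude function of solar irradiance and wind speed.

def heating_cooling_demand_kw(outdoor_c: float, thermostat_c: float,
                              kw_per_degree: float = 12.0) -> float:
    """Estimated power needed to close the indoor/outdoor temperature gap."""
    return abs(outdoor_c - thermostat_c) * kw_per_degree

def solar_generation_kw(irradiance_w_m2: float, panel_area_m2: float,
                        efficiency: float = 0.18) -> float:
    """Rough PV output from panel area, irradiance and module efficiency."""
    return irradiance_w_m2 * panel_area_m2 * efficiency / 1000.0

def wind_generation_kw(wind_speed_m_s: float, rated_kw: float = 20.0,
                       rated_speed_m_s: float = 12.0) -> float:
    """Crude cubic power curve, capped at the turbine's rated output."""
    return min(rated_kw, rated_kw * (wind_speed_m_s / rated_speed_m_s) ** 3)

# Hypothetical conditions (in practice these would come from a live weather feed).
outdoor_c, thermostat_c = 32.0, 23.0
irradiance, panel_area = 750.0, 400.0   # W/m^2, m^2
wind_speed = 6.0                        # m/s

demand = heating_cooling_demand_kw(outdoor_c, thermostat_c)
supply = solar_generation_kw(irradiance, panel_area) + wind_generation_kw(wind_speed)
print(f"Demand {demand:.1f} kW, green supply {supply:.1f} kW, "
      f"balance {supply - demand:+.1f} kW")
```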

Relevance: 10.00%

Abstract:

BACKGROUND. The authors compared gemcitabine and carboplatin (GC) with mitomycin, ifosfamide, and cisplatin (MIC) or mitomycin, vinblastine, and cisplatin (MVP) in patients with advanced non-small cell lung carcinoma (NSCLC). The primary objective was survival. Secondary objectives were time to disease progression, response rates, evaluation of toxicity, disease-related symptoms, World Health Organization performance status (PS), and quality of life (QoL). METHODS. Three hundred seventy-two chemotherapy-naïve patients with International Staging System Stage III/IV NSCLC who were ineligible for curative radiotherapy or surgery were randomized to receive either 4 cycles of gemcitabine (1000 mg/m2 on Days 1, 8, and 15) plus carboplatin (area under the serum concentration-time curve, 5; given on Day 1) every 4 weeks (the GC arm) or MIC/MVP every 3 weeks (the MIC/MVP arm). RESULTS. There was no significant difference in median survival (248 days in the MIC/MVP arm vs. 236 days in the GC arm) or time to progression (225 days in the MIC/MVP arm vs. 218 days in the GC arm) between the 2 treatment arms. The 2-year survival rate was 11.8% in the MIC/MVP arm and 6.9% in the GC arm. The 1-year survival rate was 32.5% in the MIC/MVP arm and 33.2% in the GC arm. In the MIC/MVP arm, 33% of patients responded (4 complete responses [CRs] and 57 partial responses [PRs]), whereas in the GC arm, 30% of patients responded (3 CRs and 54 PRs). Nonhematologic toxicity was comparable for patients with Grade 3-4 symptoms, except there was more alopecia among patients in the MIC/MVP arm. GC appeared to produce more hematologic toxicity and necessitated more transfusions. There was no difference in performance status, disease-related symptoms, or QoL between patients in the two treatment arms. Fewer inpatient stays for complications were required with GC. CONCLUSIONS. The results of the current study failed to demonstrate any difference in efficacy between the newer regimen of GC and the older regimens of MIC and MVP. © 2003 American Cancer Society.

Relevance: 10.00%

Abstract:

This paper reflects on the wider potential of digital narratives as a useful tool for social work practitioners. Despite the multiple points of connection between narrative approaches and social work, the influence of narratives on practice remains limited. A case study of a digital storytelling (DST) process employed in a research project with a small group of lone mothers from refugee backgrounds is used to trigger discussion of broader applications of DST as part of everyday social work practice. The use of DST acknowledged women’s capacities for self-representation and agency, in line with participatory and strengths-based approaches inherent in contemporary social work. The benefits of using DST with lone mothers from refugee backgrounds illustrate how this method can act as a pathway to produce counter-narratives, both at the individual and broader community levels. Documenting life stories digitally provides the opportunity to construct narratives about experiences of relocation and settlement as tools for social advocacy, which can assist social workers to ensure meaningful outcomes for service-users. These propositions can serve to inform social work practices with people from refugee backgrounds and address some of the intricacies of working in diverse and challenging contexts.

Relevance: 10.00%

Abstract:

Light trapping, due to the embedding of metallic nanoparticles, has been shown to be beneficial for improving photoabsorption in organic solar cells. Researchers in the plasmonics and organic photovoltaics fields are working together to improve the absorption of sunlight and the photon–electron coupling to boost the performance of the devices. Recent advances in the field of plasmonics for organic solar cells focus on the incorporation of gold nanoparticles. This article reviews the different methods to produce and embed gold nanoparticles into organic solar cells. In particular, concentration, size and geometry of gold nanoparticles are key factors that directly influence the light absorption in the devices. It is shown that a careful choice of size, concentration and location of gold nanoparticles in the device results in an enhancement of the power conversion efficiencies when compared to standard organic solar cell devices. Our latest results on gold nanoparticles embedded in organic solar cell devices are included. We demonstrate that embedded gold nanoparticles, created by depositing and annealing a gold film on a transparent electrode, generate a plasmonic effect which can be exploited to increase the power conversion efficiency of a bulk heterojunction solar cell by up to 10%.