Abstract:
Purpose: The therapeutic ratio for ionising radiation treatment of tumours is a trade-off between normal tissue side-effects and tumour control. Application of a radioprotector to normal tissue can reduce side-effects. Here we study the effects of a new radioprotector on the cellular response to radiation. Methylproamine is a DNA-binding radioprotector which, on the basis of published pulse radiolysis studies, acts by repair of transient radiation-induced oxidative species on DNA. To substantiate this hypothesis, we studied protection by methylproamine at both the clonogenic survival endpoint and radiation-induced DNA damage, assessed by γH2AX (histone H2AX phosphorylation at serine 139) focus formation. Materials and methods: The human keratinocyte cell line FEP1811 was used to study clonogenic survival and the yield of γH2AX foci following irradiation (137Cs γ-rays) of cells exposed to various concentrations of methylproamine. Uptake of methylproamine into cell nuclei was measured in parallel. Results: The extent of radioprotection at the clonogenic survival endpoint increased with methylproamine concentration up to a maximum dose modification factor (DMF) of 2.0 at 10 μM. At least 0.1 fmole/nucleus of methylproamine is required to achieve a substantial level of radioprotection (DMF of 1.3), with maximum protection (DMF of 2.0) achieved at 0.23 fmole/nucleus. The γH2AX focus yield per cell nucleus 45 min after irradiation decreased with drug concentration, with a DMF of 2.5 at 10 μM. Conclusions: These results are consistent with the hypothesis that radioprotection by methylproamine is mediated by attenuation of the extent of initial DNA damage.
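The dose modification factor quoted above is the ratio of doses needed, with and without the protector, to produce the same surviving fraction. A minimal sketch of how a DMF can be computed from fitted linear-quadratic survival curves; the parameter values below are purely illustrative, not the study's fitted data:

```python
import math

def dose_for_survival(surv_frac, alpha, beta):
    """Invert the linear-quadratic model S = exp(-(a*D + b*D^2))
    to find the dose D giving a target surviving fraction."""
    target = -math.log(surv_frac)  # a*D + b*D^2 = target
    return (-alpha + math.sqrt(alpha**2 + 4 * beta * target)) / (2 * beta)

def dmf(surv_frac, alpha_ctrl, beta_ctrl, alpha_drug, beta_drug):
    """Dose modification factor: dose with protector divided by dose
    without, at the same surviving fraction."""
    return (dose_for_survival(surv_frac, alpha_drug, beta_drug)
            / dose_for_survival(surv_frac, alpha_ctrl, beta_ctrl))

# Illustrative LQ parameters only (protector halves alpha, quarters beta,
# which corresponds to a dose-modifying factor of exactly 2):
print(round(dmf(0.1, alpha_ctrl=0.30, beta_ctrl=0.030,
                alpha_drug=0.15, beta_drug=0.0075), 2))
```

A dose-modifying (rather than dose-additive) protector shifts the whole survival curve by a constant dose ratio, which is why a single DMF summarises the effect across survival levels.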
Abstract:
Software as a Service (SaaS) can provide significant benefits to small and medium enterprises (SMEs) thanks to advantages such as ease of access, 24/7 availability, and utility pricing. However, the SaaS delivery model often rests on the assumption that SMEs will interact directly with the SaaS vendor and use a self-service approach. In practice, we see the rise of SaaS intermediaries who can support SMEs with sourcing and leveraging SaaS. This paper reports on the roles of intermediaries and how they support SMEs in using SaaS. We conducted an empirical study of two SaaS intermediaries and analysed their business models, in particular their value propositions. We identified orientation (technology or customer) and alignment (operational or strategic) as themes for understanding their roles. The contributions of this paper include: (1) the identification and description of SaaS intermediaries for SMEs based on an empirical study, and (2) an understanding of the different roles of SaaS intermediaries, in particular a more basic role based on technology orientation and operational alignment, and a more value-adding role based on customer orientation and strategic alignment. We propose that SaaS intermediaries can address SMEs' SaaS adoption and implementation challenges by playing the basic role, and can also aim to support SMEs in creating business value with SaaS-based solutions by playing the value-adding role.
Abstract:
Exploiting metal-free catalysts for the oxygen reduction reaction (ORR) and understanding their catalytic mechanisms are vital for the development of fuel cells (FCs). Our study has demonstrated that in-plane heterostructures of graphene and boron nitride (G/BN) can serve as an efficient metal-free catalyst for the ORR, in which the C-N interfaces of the G/BN heterostructures act as reactive sites. The formation of water at the heterointerface is both energetically and kinetically favorable via a four-electron pathway. Moreover, the water formed can be easily released from the heterointerface, and the catalytically active sites can be regenerated for the next reaction. Since G/BN heterostructures with controlled domain sizes have been successfully synthesized in recent reports (e.g. Nat. Nanotechnol., 2013, 8, 119), our results highlight the great potential of such heterostructures as a promising metal-free catalyst for the ORR in FCs.
Abstract:
Clear-fell harvest of forests concerns many wildlife biologists because of the loss of vital resources, such as roosts or nests, and the effects on population viability. However, the actual impact has not been quantified. Using New Zealand long-tailed bats (Chalinolobus tuberculatus) as a model species, we investigated the impacts of clear-fell logging on bats in plantation forest. C. tuberculatus roosts within the oldest stands in plantation forest, so roost availability was likely to decrease as harvest operations occurred. We predicted that post-harvest: (1) roosting range sizes would be smaller, (2) fewer roosts would be used, and (3) colony size would be smaller. We captured and radiotracked C. tuberculatus to day-roosts in Kinleith Forest, an exotic plantation forest, over three southern hemisphere summers (Season 1, October 2006–March 2007; Season 2, November 2007–March 2008; and Season 3, November 2008–March 2009). Individual roosting ranges (100% MCPs) post-harvest were smaller than those in areas that had not been harvested, and declined in area over the 3 years. Following harvest, bats used fewer roosts than bats in areas that had not been harvested. Over the 3 years, 20.7% of known roosts were lost: 14.5% to forestry operations and 6.2% to natural tree fall. Median colony size was 4.0 bats (IQR = 2.0–8.0) and declined during the study, probably because of locally high levels of roost loss. Post-harvest colonies were smaller than colonies in areas that had not been harvested. Together, these results suggest the impact of clear-fell harvest on long-tailed bat populations is negative.
Abstract:
Defectivity has historically been identified as a leading technical roadblock to the implementation of nanoimprint lithography for semiconductor high-volume manufacturing. The lack of confidence in nanoimprint's ability to meet defect requirements originates in part from the industry's past experiences with 1× lithography and the shortage of end-user-generated defect data. SEMATECH has therefore initiated a defect assessment aimed at addressing these concerns. The goal is to determine whether nanoimprint, specifically Jet and Flash Imprint Lithography (J-FIL) from Molecular Imprints, is capable of meeting semiconductor industry defect requirements. At this time, several cycles of learning have been completed in SEMATECH's defect assessment, with promising results. J-FIL process random defectivity of <0.1 def/cm2 has been demonstrated using a 120 nm half-pitch template, providing proof of concept that a low-defect nanoimprint process is possible. Template defectivity has also improved significantly, as shown by a pre-production grade template at 80 nm pitch. Cycles of learning continue on feature sizes down to 22 nm. © 2011 SPIE.
Abstract:
A modification to the PVA-FX hydrogel, whereby the chelating agent, xylenol orange, was partially bonded to the gelling agent, poly-vinyl alcohol, resulted in an 8% reduction in post-irradiation Fe3+ diffusion, adding approximately 1 hour to the useful time span between irradiation and readout. This xylenol orange functionalised poly-vinyl alcohol hydrogel had an OD dose sensitivity of 0.014 Gy−1 and a diffusion rate of 0.133 mm2 h−1. As this partial bonding yields only an incremental improvement, it is proposed that more efficient methods of bonding xylenol orange to poly-vinyl alcohol be investigated to further reduce diffusion in Fricke gels.
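The practical cost of ion diffusion is spatial blurring of the recorded dose image between irradiation and readout. A rough sketch of that effect, using the reported diffusion rate and the standard one-dimensional diffusion relation sqrt(2·D·t) for RMS displacement (the relation is textbook diffusion physics, not a formula from the paper):

```python
import math

D = 0.133  # reported Fe3+ diffusion rate, mm^2/h

def rms_blur_mm(hours, diffusivity=D):
    """One-dimensional RMS displacement sqrt(2*D*t): a rough measure of
    how far the dose distribution blurs in the wait before readout."""
    return math.sqrt(2 * diffusivity * hours)

# Blur grows with the square root of the delay before readout:
for t in (1, 2, 4):
    print(f"{t} h -> {rms_blur_mm(t):.3f} mm")
```

On this estimate the blur after 4 hours is twice that after 1 hour, which is why even the modest 1-hour extension of the readout window reported above is useful for preserving spatial dose resolution.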
Abstract:
This thesis examines the existing frameworks for energy management in the brewing industry and details the design, development and implementation of a new framework at a modern brewery. The aim of the research was to develop an energy management framework that identifies opportunities in a systematic manner using Systems Engineering concepts and principles. This work led to the Sustainable Energy Management Framework (SEMF). Using the SEMF approach, one of Australia's largest breweries has achieved the number 1 world ranking for water use in the production of beer, improved its KPIs, and sustained the energy management improvements implemented over the past 15 years. The framework can be adapted to other manufacturing industries in the Australian context and is considered a new concept and a potentially important tool for energy management.
Abstract:
Critical to the research of urban morphologists is the availability of historical records that document the urban transformation of the study area. However, thus far little work has been done towards an empirical approach to the validation of archival data in this field. Outlined in this paper, therefore, is a new methodology for validating the accuracy of archival records and mapping data, accrued through the process of urban morphological research, so as to establish a reliable platform from which analysis can proceed. The paper particularly addresses the problems of inaccuracies in existing curated historical information, as well as errors in archival research by student assistants, which together give rise to unacceptable levels of uncertainty in the documentation. The paper discusses the problems relating to the reliability of historical information, demonstrates the importance of data verification in urban morphological research, and proposes a rigorous method for objective testing of collected archival data through the use of qualitative data analysis software.
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing both the likelihood of a process fault occurring and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process, and whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action to perform which minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between risks across different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system, and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that process instances executed concurrently complete with significantly fewer faults and with lower fault severities when the recommendations provided by our recommendation system are taken into account.
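The cross-instance step described above is at heart an assignment problem: each available resource is matched to a pending task so that total predicted risk is minimal. A toy illustration with hypothetical resource names, task names and risk scores (and a brute-force search standing in for the paper's integer linear program):

```python
from itertools import permutations

# Hypothetical predicted risk of assigning resource r to task t,
# e.g. as read off decision trees mined from past execution logs.
risk = {
    ("alice", "approve_claim"): 0.10, ("alice", "assess_damage"): 0.40,
    ("bob",   "approve_claim"): 0.30, ("bob",   "assess_damage"): 0.15,
}

def best_assignment(resources, tasks, risk):
    """Try every one-to-one assignment of resources to tasks and keep
    the one with the lowest total predicted risk (ILP stand-in,
    tractable only for toy instance sizes)."""
    best, best_cost = None, float("inf")
    for perm in permutations(resources, len(tasks)):
        cost = sum(risk[(r, t)] for r, t in zip(perm, tasks))
        if cost < best_cost:
            best, best_cost = list(zip(perm, tasks)), cost
    return best, best_cost

plan, total = best_assignment(["alice", "bob"],
                              ["approve_claim", "assess_damage"], risk)
print(plan, round(total, 2))
```

A real deployment would solve this with an ILP solver, since brute force grows factorially with the number of resources, but the objective being minimised is the same.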
Abstract:
The characterisation of facial expression through landmark-based analysis methods such as FACEM (Pilowsky & Katsikitis, 1994) has a variety of uses in psychiatric and psychological research. In these systems, important structural relationships are extracted from images of facial expressions by the analysis of a pre-defined set of feature points. These relationship measures may then be used, for instance, to assess the degree of variability and similarity between different facial expressions of emotion. FaceXpress is a multimedia software suite that provides a generalised workbench for landmark-based facial emotion analysis and stimulus manipulation. It is a flexible tool that is designed to be specialised at runtime by the user. While FaceXpress has been used to implement the FACEM process, it can also be configured to support any other similar, arbitrary system for quantifying human facial emotion. FaceXpress also implements an integrated set of image processing tools and specialised tools for facial expression stimulus production including facial morphing routines and the generation of expression-representative line drawings from photographs.
Abstract:
Neu-Model, an ongoing project aimed at developing a neural simulation environment that is extremely computationally powerful and flexible, is described. It is shown that the use of good Software Engineering techniques in Neu-Model’s design and implementation is resulting in a high performance system that is powerful and flexible enough to allow rigorous exploration of brain function at a variety of conceptual levels.
Abstract:
Non-thermal plasma (NTP) is a promising candidate for controlling engine exhaust emissions. Plasma is known as the fourth state of matter, in which electrons and positive ions co-exist. Both gaseous and particle emissions in diesel exhaust undergo chemical changes when they are exposed to plasma. In this project, diesel particulate matter (DPM) mitigation from actual diesel exhaust using NTP technology has been studied. The effect of plasma, not only on PM mass but also on PM size distribution, the physico-chemical structure of PM, and PM removal mechanisms, has been investigated. It was found that NTP technology can significantly reduce both PM mass and number. However, under some circumstances particles can be formed by nucleation. The energy required to create the plasma with the current technology is higher than the benchmark set by technologies commonly used in the automotive industry. Further research will enable the mechanism of particle formation to be understood and the energy consumption to be optimised.
Abstract:
Multidimensional data have attracted increasing attention from researchers seeking to build better recommender systems in recent years. Additional metadata provides algorithms with more detail for better understanding the interaction between users and items. While neighbourhood-based Collaborative Filtering (CF) approaches and latent factor models tackle this task effectively in various ways, they each utilise only a partial structure of the data. In this paper, we delve into the different types of relations in the data to understand the interaction between users and items more holistically. We propose a generic multidimensional CF fusion approach for top-N item recommendation. The proposed approach is capable of incorporating not only localised user-user and item-item relations but also the latent interaction between all dimensions of the data. Experimental results show significant improvements by the proposed approach in terms of recommendation accuracy.
Abstract:
User profiling is the process of constructing user models which represent the personal characteristics and preferences of customers. User profiles play a central role in many recommender systems. Recommender systems recommend items to users based on user profiles, where the items can be any objects the users are interested in, such as documents, web pages, books, movies, etc. In recent years, multidimensional data have attracted more and more attention for creating better recommender systems, from both academia and industry. Additional metadata provides algorithms with more details for better understanding the interactions between users and items. However, most existing user/item profiling techniques for multidimensional data analyze the data by splitting the multidimensional relations, which loses the information carried by the multidimensionality. In this paper, we propose a user profiling approach using a tensor reduction algorithm, which we show is based on a Tucker2 model. The proposed profiling approach incorporates the latent interactions between all dimensions into user profiles, which significantly benefits the quality of neighborhood formation. We further propose to integrate the profiling approach into neighborhood-based collaborative filtering recommender algorithms. Experimental results show significant improvements in terms of recommendation accuracy.
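Once each user is summarised by a profile vector (here, the tensor-derived profiles the paper describes), neighbourhood formation reduces to ranking users by profile similarity. A minimal sketch with hypothetical low-dimensional profiles and cosine similarity, a common choice in neighbourhood-based CF (the paper's own similarity measure and profile dimensionality may differ):

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length profile vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def neighbourhood(target, profiles, k=2):
    """Return the k users whose profile vectors are most similar
    to the target user's profile."""
    scored = [(cosine(profiles[target], vec), user)
              for user, vec in profiles.items() if user != target]
    return [user for _, user in sorted(scored, reverse=True)[:k]]

# Hypothetical profiles in a 3-dimensional latent space (illustration only):
profiles = {
    "u1": [1.0, 0.2, 0.0],
    "u2": [0.9, 0.3, 0.1],
    "u3": [0.0, 0.1, 1.0],
}
print(neighbourhood("u1", profiles, k=1))
```

The quality claim in the abstract amounts to saying that profiles built from the full tensor place genuinely like-minded users closer together in this ranking than profiles built from split two-dimensional slices.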
Abstract:
The development of microfinance in Vietnam since the 1990s has coincided with remarkable progress in poverty reduction. Numerous descriptive studies have illustrated that microfinance is an effective tool for eradicating poverty in Vietnam, but evidence from quantitative studies is mixed. This study contributes to the literature by providing new evidence on the impact of microfinance on poverty reduction in Vietnam, using repeated cross-sectional data from the Vietnam Living Standards Survey (VLSS) over the period 1992–2010. Our results show that micro-loans contribute significantly to household consumption.