11 results for Cleaning
in CentAUR: Central Archive University of Reading - UK
Abstract:
Nitrogen trifluoride (NF3) is an industrial gas used in the semiconductor industry as a plasma etchant and chamber cleaning gas. NF3 is an alternative to other potent greenhouse gases and its usage has increased markedly over the last decade. In recognition of its increased relevance, and to aid planning of future usage, we report an updated radiative efficiency and global warming potential for NF3. Laboratory measurements give an integrated absorption cross section of 7.04 × 10⁻¹⁷ cm² molecule⁻¹ cm⁻¹ over the spectral region 200–2000 cm⁻¹. The radiative efficiency is calculated to be 0.21 W m⁻² ppbv⁻¹ and the 100-year GWP, relative to carbon dioxide, is 17200. These values are approximately 60% higher than previously published estimates, primarily reflecting the higher infrared absorption cross sections reported here.
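For orientation, the 100-year GWP quoted above follows the standard IPCC-style definition; the expression below is the textbook form on a per-kilogram basis, not reproduced from the paper itself:

```latex
\mathrm{GWP}_{100}(\mathrm{NF_3})
= \frac{\mathrm{AGWP}_{100}(\mathrm{NF_3})}{\mathrm{AGWP}_{100}(\mathrm{CO_2})}
= \frac{\int_0^{100\,\mathrm{yr}} A_{\mathrm{NF_3}}\, e^{-t/\tau_{\mathrm{NF_3}}}\,\mathrm{d}t}
       {\int_0^{100\,\mathrm{yr}} A_{\mathrm{CO_2}}\, R_{\mathrm{CO_2}}(t)\,\mathrm{d}t}
```

Here A denotes radiative efficiency per unit mass (the per-ppbv value quoted above must first be converted to a per-kilogram basis), τ_NF3 the atmospheric lifetime of NF3, and R_CO2(t) the CO2 impulse-response function.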
Abstract:
Surfactin is a bacterial lipopeptide produced by Bacillus subtilis and is a powerful surfactant, with antiviral, antibacterial and antitumor properties. The recovery and purification of surfactin from complex fermentation broths is a major obstacle to its commercialization; therefore, a two-step membrane filtration process was developed using a lab-scale tangential flow filtration (TFF) unit with 10 kDa MWCO regenerated cellulose (RC) and polyethersulfone (PES) membranes at three different transmembrane pressures (TMP) of 1.5 bar, 2.0 bar and 2.5 bar. Two modes of filtration were studied, with and without cleaning of the membranes prior to the second ultrafiltration step. In the first ultrafiltration step (UF-1), surfactin was effectively retained by the membranes above its critical micelle concentration (CMC); subsequently, in UF-2, the retentate micelles were disrupted by addition of a 50% (v/v) methanol solution to allow recovery of surfactin in the permeate. The main protein contaminants were effectively retained by the membrane in UF-2. Permeate flux and the rejection coefficients (R) of surfactin and protein were measured during the filtrations. Overall, the three TMPs applied had no significant effect on the filtrations, and PES was the more suitable membrane for selectively separating surfactin from the fermentation broth, achieving high recovery and purity. In addition, this two-step UF process is scalable to larger sample volumes without affecting the original functionality of surfactin, although membrane permeability can be affected by exposure to the methanolic solution used in UF-2.
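The rejection coefficient referred to above is conventionally the observed rejection; the abstract does not state the exact form used, so the standard expression is given here only for reference:

```latex
R = 1 - \frac{C_{\mathrm{permeate}}}{C_{\mathrm{feed}}}
```

where C_permeate and C_feed are the solute concentrations in the permeate and on the feed/retentate side; R close to 1 indicates near-complete retention (as for micellar surfactin in UF-1), while R close to 0 indicates free passage (as sought for surfactin in UF-2).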
Abstract:
One of the most vexing issues for analysts and managers of property companies across Europe has been the existence and persistence of deviations of the Net Asset Values (NAV) of property companies from their market capitalisation. The issue has clear links to similar discounts and premiums in closed-end funds. The closed-end fund puzzle is regarded as an important unsolved problem in financial economics, undermining theories of market efficiency and the Law of One Price; consequently, it has generated a huge body of research. Although it can be tempting to focus on the particular inefficiencies of real estate markets when attempting to explain deviations from NAV, the closed-end fund discount puzzle indicates that divergences between underlying asset values and market capitalisation are not a ‘pure’ real estate phenomenon. When examining potential explanations, two recurring factors stand out in the closed-end fund literature as often undermining the economic rationale for a discount: the existence of premiums, and cross-sectional and periodic fluctuations in the level of discount/premium. These need to be borne in mind when considering potential explanations for real estate markets. There are two approaches to investigating the discount to net asset value in closed-end funds: the ‘rational’ approach and the ‘noise trader’ or ‘sentiment’ approach. The ‘rational’ approach hypothesizes that the discount to net asset value results from company-specific factors such as management quality, tax liability and the type of stocks held by the fund. Despite the intuitive appeal of the ‘rational’ approach to closed-end fund discounts, these studies have not successfully explained the variance in closed-end fund discounts or why the discount to net asset value in closed-end funds varies so much over time. The variation over time in the average sector discount is a feature not only of closed-end funds but also of property companies. This paper analyses changes in the deviations from NAV for UK property companies between 2000 and 2003. The paper presents a new way to study the phenomenon, ‘cleaning’ out the gearing effect by introducing a new way of calculating the discount itself, which we call the “ungeared discount”. It is calculated by assuming that a firm issues new equity to repurchase its outstanding debt without any change on the asset side. In this way the discount does not depend on an accounting effect, and the analysis should better explain the effect of other independent variables.
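A minimal sketch of what such an ungeared discount can look like, assuming debt is notionally retired at its balance-sheet value and the asset side is left unchanged (the paper's exact formula is not reproduced in the abstract):

```latex
D_{\text{geared}} = \frac{\mathrm{NAV} - \mathrm{MC}}{\mathrm{NAV}},
\qquad
D_{\text{ungeared}}
= \frac{(\mathrm{NAV} + \mathrm{Debt}) - (\mathrm{MC} + \mathrm{Debt})}{\mathrm{NAV} + \mathrm{Debt}}
= \frac{\mathrm{NAV} - \mathrm{MC}}{\mathrm{NAV} + \mathrm{Debt}}
```

where MC is market capitalisation: issuing new equity equal to the outstanding debt and using it to repurchase that debt leaves the numerator unchanged but scales the denominator up to gross asset value, so the measure no longer mechanically amplifies with gearing.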
Abstract:
The application of automatic segmentation methods to lesion detection is desirable. However, such methods are restricted by intensity similarities between lesioned and healthy brain tissue. Using multi-spectral magnetic resonance imaging (MRI) modalities may overcome this problem, but it is not always practicable. In this article, a lesion detection approach requiring a single MRI modality is presented, which is an improved method based on a recent publication. This new method assumes that low similarity should be found in the regions of lesions when the likeness between an intensity-based fuzzy segmentation and location-based tissue probabilities is measured. The use of a normalized similarity measurement enables the current method to fine-tune the threshold for lesion detection, thus maximizing the possibility of reaching high detection accuracy. Importantly, an extra cleaning step is included in the current approach which removes enlarged ventricles from the detected lesions. A performance investigation using simulated lesions demonstrated that not only were the majority of lesions well detected but normal tissues were also identified effectively. Tests on images acquired from stroke patients further confirmed the strength of the method in lesion detection. Compared with the previous version, the current approach showed higher sensitivity in detecting small lesions and produced fewer false positives around the ventricles and the edge of the brain.
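A hedged sketch of the kind of pipeline the abstract describes, assuming the fuzzy segmentation and tissue probability maps are already available as NumPy arrays; the actual similarity measure, threshold selection and ventricle mask used in the paper are not specified in the abstract, so the names and steps below are illustrative only.

```python
import numpy as np

def detect_lesions(fuzzy_seg, tissue_prob, ventricle_mask, threshold=0.3):
    """Illustrative lesion detection by low similarity between an
    intensity-based fuzzy segmentation and location-based tissue
    probabilities (both arrays of shape (X, Y, Z, n_tissues))."""
    # Voxel-wise similarity: here, a simple normalised inner product
    # between the two tissue-membership vectors (an assumption; the
    # paper's exact measure is not given in the abstract).
    num = np.sum(fuzzy_seg * tissue_prob, axis=-1)
    denom = (np.linalg.norm(fuzzy_seg, axis=-1) *
             np.linalg.norm(tissue_prob, axis=-1) + 1e-12)
    similarity = num / denom            # normalised to [0, 1]

    candidate = similarity < threshold  # low similarity -> candidate lesion

    # Extra "cleaning" step: remove voxels belonging to (enlarged)
    # ventricles from the detected lesions.
    lesions = candidate & ~ventricle_mask.astype(bool)
    return lesions
```

Because the similarity map is normalised, the threshold can be tuned on a common scale across subjects, which is the property the abstract attributes to the normalized measurement.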
Abstract:
Background: Microarray-based comparative genomic hybridisation (CGH) experiments have been used to study numerous biological problems, including understanding genome plasticity in pathogenic bacteria. Typically such experiments produce large data sets that are difficult for biologists to handle. Although there are some programmes available for interpreting bacterial transcriptomics data, and CGH microarray data for examining genetic stability in oncogenes, there are none specifically designed to understand the mosaic nature of bacterial genomes. Consequently a bottleneck still persists in the accurate processing and mathematical analysis of these data. To address this shortfall we have produced a simple and robust CGH microarray data analysis process, which may be automated in the future, to understand bacterial genomic diversity. Results: The process involves five steps: cleaning, normalisation, estimating gene presence and absence or divergence, validation, and analysis of data from the test strain against three reference strains simultaneously. Each stage of the process is described, and we have compared a number of methods available for characterising bacterial genomic diversity and for calculating the cut-off between gene presence and absence or divergence, showing that a simple dynamic approach using a kernel density estimator performed better than both the established methods and a more sophisticated mixture modelling technique. We have also shown that current methods commonly used for CGH microarray analysis in tumour and cancer cell lines are not appropriate for analysing our data. Conclusion: After carrying out the analysis and validation for three sequenced Escherichia coli strains, CGH microarray data from 19 E. coli O157 pathogenic test strains were used to demonstrate the benefits of applying this simple and robust process to CGH microarray studies using bacterial genomes.
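A minimal sketch of the kernel-density-estimator idea for choosing the presence/absence cut-off, assuming the per-gene log-ratios are in a NumPy array; the paper's bandwidth, normalisation and decision rule are not given in the abstract, so taking the deepest valley of the estimated density is purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde
from scipy.signal import find_peaks

def kde_cutoff(log_ratios, grid_points=512):
    """Illustrative dynamic cut-off between 'present' and
    'absent/divergent' genes: the deepest valley of a kernel density
    estimate of the log-ratio distribution (an assumption; the paper's
    exact rule is not stated in the abstract)."""
    kde = gaussian_kde(log_ratios)
    grid = np.linspace(log_ratios.min(), log_ratios.max(), grid_points)
    density = kde(grid)

    # Valleys of the density are peaks of its negative.
    valleys, _ = find_peaks(-density)
    if len(valleys) == 0:
        # Unimodal density: fall back to the median as a neutral choice.
        return float(np.median(log_ratios))
    # Take the deepest valley as the presence/absence cut-off.
    return float(grid[valleys[np.argmin(density[valleys])]])
```

Genes whose log-ratio falls below the returned cut-off would be scored as absent or divergent; because the cut-off is derived from each array's own density, it adapts dynamically to per-hybridisation variation.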
Abstract:
Using simulation methods, we studied the adsorption of binary CO2-CH4 mixtures on various CH4-preadsorbed carbonaceous materials (e.g., triply periodic carbon minimal surfaces, slit-shaped carbon micropores, and Harris's virtual porous carbons) at 293 K. Regardless of the differences in micropore geometry, a two-stage mechanism of CH4 displacement from carbon nanospaces by coadsorbed CO2 is proposed. In the first stage, the coadsorbed CO2 molecules induce an enhancement of the adsorbed amount of CH4. In the second stage, the stronger affinity of CO2 for flat/curved graphitic surfaces, together with CO2-CO2 interactions, causes the displacement of CH4 molecules from the carbonaceous materials. The operating conditions for CO2-induced cleaning of the adsorbed phase of the CH4 mixture component depend strongly on the size of the carbon micropores, but, in general, the enhanced adsorption field in narrow carbon ultramicropores facilitates the nonreactive displacement of CH4 by coadsorbed CO2. This is because in narrow carbon ultramicropores the equilibrium CO2/CH4 selectivity (i.e., preferential adsorption of CO2) increases significantly. In wider micropores the adsorption field (i.e., the overall surface energy) is very similar for both CO2 and CH4, which decreases the preferential adsorption of CO2. This suppresses the displacement of CH4 by coadsorbed CO2 and assists further adsorption of CH4 from the bulk mixture (i.e., CO2/CH4 mixing in the adsorbed phase).
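For reference, the equilibrium CO2/CH4 selectivity discussed above is conventionally defined from the adsorbed- and bulk-phase compositions (a standard definition, not quoted from the paper):

```latex
S_{\mathrm{CO_2}/\mathrm{CH_4}}
= \frac{x_{\mathrm{CO_2}} / x_{\mathrm{CH_4}}}{y_{\mathrm{CO_2}} / y_{\mathrm{CH_4}}}
```

where x and y are the mole fractions in the adsorbed phase and in the bulk gas, respectively; S > 1 indicates preferential adsorption of CO2, and the abstract's argument is that S becomes large in narrow ultramicropores, which is what drives the displacement of CH4.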
Abstract:
The practice of partial depopulation or ‘thinning’, i.e. the early removal of a proportion of birds from a commercial broiler flock, is a reported risk factor for Campylobacter colonization of the residual birds because of the difficulty of maintaining biosecurity during the process. Therefore, the effect of this practice was studied in detail for 51 target flocks, each at a different growing farm belonging to one of seven major poultry companies throughout the United Kingdom. On 21 of these farms, the target flock was already colonized by Campylobacter and at slaughter all cecal samples examined were positive, with a mean of log10 8 cfu/g. A further 27 flocks became positive within 2–6 days of the start of thinning and had similarly high levels of cecal carriage at slaughter. Just prior to the thinning process, Campylobacter could frequently be isolated from the farm driveways, transport vehicles, equipment and personnel. Strains from seven such farms on which flocks became colonized after thinning were examined by PFGE typing. The study demonstrated an association between strains occurring at specific sampling sites and those isolated subsequently from the thinned flocks. There were also indications that particular strains had spread from one farm to another when the farms were jointly company-owned and served by the same bird-catching teams and/or vehicles. The results highlighted the need for better hygiene control in relation to catching equipment and personnel, and more effective cleaning and disinfection of vehicles and bird-transport crates.
Abstract:
The effect of a prolonged period of strongly northward Interplanetary Magnetic Field (IMF) on the high-latitude F-region is studied using data from the EISCAT Common Programme Zero mode of operation on 11–12 August 1982. The analysis of the raw autocorrelation functions is kept to the directly derived parameters Ne, Te, Ti and velocity, and limits are defined for the errors introduced by assumptions about ion composition and by changes in the transmitted power and system constant. Simple data-cleaning criteria are employed to eliminate problems due to coherent signals and large background noise levels. The observed variations in plasma densities, temperatures and velocities are interpreted in terms of supporting data from ISEE-3 and local riometers and magnetometers. Both field-aligned and field-perpendicular plasma flows at Tromsø showed effects of the northward IMF: convection was slow and irregular and field-aligned flow profiles were characteristic of steady-state polar wind outflow with flux of order 10¹² m⁻² s⁻¹. This period followed a strongly southward IMF which had triggered a substorm. The substorm gave enhanced convection, with a swing to equatorward flow and large (5 × 10¹² m⁻² s⁻¹), steady-state field-aligned fluxes, leading to the possibility of O+ escape into the magnetosphere. The apparent influence of the IMF over both field-perpendicular and field-aligned flows is explained in terms of the cross-cap potential difference and the location of the auroral oval.
Abstract:
Assessments of the effects of climate change, water resource availability and water deprivation in West Africa have not frequently considered the positive contribution to be derived from collecting and reusing water for domestic purposes. Where the originating water is taken from a clean water source and has been used once for washing or bathing, it is commonly called “greywater”. Greywater is a prolific resource that is generated wherever people live. Treated greywater can be used for domestic cleaning, for flushing toilets where appropriate, for washing cars, sometimes for watering kitchen gardens, and for clothes washing prior to rinsing. Therefore, a large theoretical potential exists to increase total water resource availability if greywater were to be widely reused. Locally treated greywater reduces the distribution network requirement, lowers construction effort and cost and, wherever possible, minimises the associated carbon footprint. Such locally treated greywater offers significant practical opportunities for increasing the total available water resources at a local level. The reuse of treated greywater is one important action that will help to mitigate the declining availability of clean water supplies in some areas, and the mitigation expected to be required in future aligns well with WHO/UNICEF (2012) aspirations. The evaluation of potential opportunities for prioritising greywater systems to support water reuse takes into account the availability of water resources, water use indicators and published estimates in order to understand typical patterns of water demand. The approach supports the acquisition of knowledge about local conditions for building capacity for greywater reuse, an understanding of the systems that are most likely to encourage greywater reuse, and the practices and future actions needed to stimulate greywater infrastructure planning, design and implementation. Although reuse might be considered to increase the uncertainty of achieving a specified quality of water supply, robust methods and technologies are available for local treatment. Resource strategies for greywater reuse have the potential to consistently improve water efficiency and availability in water-impoverished and water-stressed regions of Ghana and West Africa. In this paper, untreated greywater is referred to as “greywater” and treated greywater as “treated greywater”.
Abstract:
This paper presents a novel mobile sink area allocation scheme for consumer-based mobile robotic devices, with a proven application to robotic vacuum cleaners. In a home or office environment, rooms are physically separated by walls, and an automated robotic cleaner cannot decide which room to move to in order to perform the cleaning task. Likewise, state-of-the-art cleaning robots do not move to other rooms without direct human interference. In a smart home monitoring system, sensor nodes may be deployed to monitor each separate room. In this work, a quad-tree based data gathering scheme is proposed whereby the mobile sink physically moves through every room and logically links all the separated sub-networks together. The proposed scheme sequentially collects data from the monitored environment and transmits the information back to a base station. Based on the sensor nodes' information, the base station can command a cleaning robot to move to a specific location in the home environment. The quad-tree based data gathering scheme minimizes the data gathering tour length and time through the efficient allocation of data gathering areas. A calculated shortest-path data gathering tour can be efficiently allocated to the robotic cleaner so that it completes the cleaning task within a minimum time period. Simulation results show that the proposed scheme can effectively allocate and control the cleaning area for the robot vacuum cleaner without any direct interference from the consumer. The performance of the proposed scheme is then validated with a set of practical sequential data gathering tours in a typical office/home environment.
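A minimal sketch of the flavour of scheme described, assuming the monitored floor plan is a known rectangle; the quad-tree subdivision depth and the actual tour-planning algorithm used in the paper are not given in the abstract, so the greedy nearest-neighbour tour below is purely illustrative.

```python
import math

def quadtree_cells(xmin, ymin, xmax, ymax, depth):
    """Recursively split the monitored area into quad-tree cells and
    return their centres (illustrative; the paper's subdivision
    criterion is not specified in the abstract)."""
    if depth == 0:
        return [((xmin + xmax) / 2, (ymin + ymax) / 2)]
    xm, ym = (xmin + xmax) / 2, (ymin + ymax) / 2
    cells = []
    for (x0, y0, x1, y1) in [(xmin, ymin, xm, ym), (xm, ymin, xmax, ym),
                             (xmin, ym, xm, ymax), (xm, ym, xmax, ymax)]:
        cells += quadtree_cells(x0, y0, x1, y1, depth - 1)
    return cells

def nearest_neighbour_tour(start, points):
    """Greedy shortest-path heuristic for the mobile sink's data
    gathering tour over the cell centres (a stand-in for the paper's
    tour calculation)."""
    tour, current, remaining = [start], start, list(points)
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        tour.append(nxt)
        current = nxt
    return tour

# Example: a 10 m x 10 m flat split to depth 2 (16 cells), toured from the dock.
cells = quadtree_cells(0, 0, 10, 10, depth=2)
tour = nearest_neighbour_tour((0.0, 0.0), cells)
```

The quad-tree allocation gives the sink one representative visit point per sub-area, and the tour over those points is what bounds the data gathering (and hence cleaning) time; a production system would replace the greedy heuristic with the paper's own tour optimisation.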