Abstract:
This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, such data are traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, this data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text-mining and decision-tree based method for automatic conversion of human-readable free-text microarray data annotations into a categorised format. Data comparability, and minimisation of the systematic measurement errors characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and help from another purpose-built sample ontology.
A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these sets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression on a global level.
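As a minimal illustration of the exploratory steps named above, the following sketch runs principal component analysis (via SVD) and average-linkage hierarchical clustering on a simulated two-group expression matrix. The data, group structure, and parameters are illustrative assumptions, not the thesis's actual pipeline or dataset.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Toy expression matrix: 20 samples x 50 genes, two simulated sample groups
group_a = rng.normal(0.0, 1.0, size=(10, 50))
group_b = rng.normal(3.0, 1.0, size=(10, 50))
X = np.vstack([group_a, group_b])

# PCA via SVD of the mean-centred matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                      # sample coordinates in PC space
explained = S**2 / np.sum(S**2)     # variance ratio per component

# Hierarchical clustering (average linkage) on the first two PCs
Z = linkage(scores[:, :2], method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print(explained[0], labels)
```

With two well-separated simulated groups, the first principal component captures most of the variance and the two-cluster cut recovers the group structure.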
Abstract:
In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images, in which classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability; earlier frameworks are lacking in this regard. The overall contribution is two-fold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with, which allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented.
For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs. We also ask if accuracy versus effort trade-offs can be controlled after training. For another example, regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner. We then ask if problem-specific organization is necessary.
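The per-input control of accuracy versus effort described above can be sketched as a two-stage cascade: a cheap stage handles confident inputs and delegates the rest to a costlier stage, with a confidence threshold as the trade-off knob. The stage rules, the threshold, and the toy 1-D inputs below are illustrative assumptions, not the thesis's actual models.

```python
# A minimal sketch of per-input effort control: a cheap stage classifies
# confident inputs and delegates uncertain ones to a costlier stage.
# The threshold `tau` trades accuracy against effort.

def cheap_stage(x):
    """Fast rule: sign of x, with |x| as a confidence proxy."""
    score = x                        # signed margin from a boundary at 0.0
    return int(score > 0), abs(score)

def costly_stage(x, boundary=0.1):
    """Slower, more accurate rule with a better-placed boundary."""
    return int(x > boundary)

def cascade(x, tau):
    label, conf = cheap_stage(x)
    if conf >= tau:                  # confident: stop early, save effort
        return label, "cheap"
    return costly_stage(x), "costly" # delegate the hard input

xs = [-2.0, 0.05, 1.5, 0.12]
results = [cascade(x, tau=0.5) for x in xs]
effort = sum(1 for _, stage in results if stage == "costly")
print(results, effort)
```

Raising `tau` sends more inputs to the costly stage (more accuracy, more effort); lowering it lets the cheap stage decide more often. This is one simple instance of the delegation and confidence-modeling questions the abstract raises.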
Abstract:
An important question which has to be answered in evaluating the suitability of a microcomputer for a control application is the time it would take to execute the specified control algorithm. In this paper, we present a method of obtaining closed-form formulas to estimate this time. These formulas are applicable to control algorithms in which arithmetic operations and matrix manipulations dominate. The method does not require writing detailed programs for implementing the control algorithm. Using this method, the execution times of a variety of control algorithms on a range of 16-bit mini- and recently announced microcomputers are calculated. The formulas have been verified independently by an analysis program, which computes the execution time bounds of control algorithms coded in Pascal when they are run on a specified micro- or minicomputer.
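The closed-form idea can be sketched as follows: express the control algorithm's cost as operation counts and multiply by per-operation times for a target processor. The instruction times below are hypothetical placeholders, not measurements from the paper's 16-bit machines.

```python
# A minimal sketch: estimate execution time from arithmetic operation
# counts, without writing the control program itself.

def matvec_op_counts(n):
    """Operation counts for an n x n matrix-vector product."""
    return {"mul": n * n, "add": n * (n - 1)}

def execution_time(op_counts, op_times_us):
    """Estimated time in microseconds: sum(count * per-op time)."""
    return sum(op_counts[op] * op_times_us[op] for op in op_counts)

# Hypothetical per-operation times (microseconds) for some 16-bit CPU
op_times = {"mul": 25.0, "add": 5.0}

# State-feedback update u = -K x with a 4-state, 2-input system:
# a 2x4 matrix times a 4-vector => 8 multiplies, 6 additions
counts = {"mul": 2 * 4, "add": 2 * (4 - 1)}
t_us = execution_time(counts, op_times)
print(t_us)
```

The same counting argument extends to other matrix manipulations, which is why closed-form formulas suffice for algorithms dominated by arithmetic operations.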
Abstract:
Photography is now a highly automated activity where people enjoy phototaking by pointing and pressing a button. While this liberates people from having to interact with the processes of photography, e.g., controlling the parameters of the camera or printing images in the darkroom, we argue that an engagement with such processes can in fact enrich people's experience of phototaking. Drawing from fieldwork with members of a film-based photography club, we found that people who engage deeply with the various processes of phototaking experienced photography richly and meaningfully. Being able to participate fully in the entire process gave them a sense of achievement over the final result. Having the opportunity to engage with the process also allowed them to learn and hone their photographic skills. Through this understanding, we can imagine future technologies that enrich experiences of photography through providing the means to interact with photographic processes in new ways.
Abstract:
The change in energy during hydrogen abstraction by ketones is estimated for different electronic states as a function of the intermolecular orbital overlap employing perturbation theory. The results suggest that ketones preferentially undergo the in-plane reaction and abstract a hydrogen atom in their triplet nπ* state. For ketones where the triplet ππ* state lies below the triplet nπ* state, hydrogen abstraction can take place in the ππ* state owing to the crossing of the zero order reaction surfaces of the nπ* and ππ* states.
Abstract:
In the field of workplace air quality, measuring and analyzing the size distribution of airborne particles to identify their sources and apportion their contribution has become widely accepted; however, the driving factors that influence this parameter, particularly for nanoparticles (< 100 nm), have not been thoroughly determined. Identification of driving factors and, in turn, general trends in the size distribution of emitted particles would facilitate the prediction of nanoparticles' emission behavior and significantly contribute to their exposure assessment. In this study, a comprehensive analysis of particle number size distribution data, with a particular focus on the ultrafine size range of synthetic clay particles emitted from a jet milling machine, was conducted using the multi-lognormal fitting method. The results showed a relatively high contribution of nanoparticles to the emissions in many of the tested cases, and also that both the surface treatment and the feed rate of the machine are significant factors influencing the size distribution of the emitted particles of this size. In particular, applying surface treatments and increasing the machine feed rate have a similar effect of reducing the size of the particles; however, no general trend was found in the variation of the size distribution across different surface treatments and feed rates. The findings of our study demonstrate that for this process and other activities, where no general trend is found in the size distribution of the emitted airborne particles due to dissimilar effects of the driving factors, each case must be treated separately in terms of workplace exposure assessment and regulation.
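The multi-lognormal fitting method mentioned above can be sketched by fitting a sum of lognormal modes to a number size distribution dN/dlogDp; each mode has a number concentration N, a geometric mean diameter Dg and a geometric standard deviation sg. The synthetic two-mode data and all parameter values below are illustrative assumptions, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def lognormal_mode(dp, N, dg, sg):
    # One lognormal mode of dN/dlogDp, evaluated at diameters dp (nm)
    return (N / (np.sqrt(2 * np.pi) * np.log10(sg))
            * np.exp(-(np.log10(dp) - np.log10(dg)) ** 2
                     / (2 * np.log10(sg) ** 2)))

def two_mode(dp, N1, dg1, sg1, N2, dg2, sg2):
    return lognormal_mode(dp, N1, dg1, sg1) + lognormal_mode(dp, N2, dg2, sg2)

dp = np.logspace(1, 3, 60)                      # 10 nm .. 1000 nm
true = (1e4, 40.0, 1.6, 3e3, 300.0, 1.8)        # ultrafine + larger mode
y = two_mode(dp, *true)                         # synthetic "measurement"

popt, _ = curve_fit(two_mode, dp, y, p0=(8e3, 50, 1.5, 2e3, 250, 1.7))
# Ultrafine-mode share of total number concentration
ultrafine_share = popt[0] / (popt[0] + popt[3])
print(popt[1], ultrafine_share)
```

With real measurements one would fit noisy channel data and inspect each recovered mode's Dg to judge the nanoparticle contribution, as the study does for the jet-milling emissions.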
Abstract:
Background Poor clinical handover has been associated with inaccurate clinical assessment and diagnosis, delays in diagnosis and test ordering, medication errors and decreased patient satisfaction in the acute care setting. Research on the handover process in the residential aged care sector is very limited. Purpose The aims of this study were to: (i) develop an in-depth understanding of the handover process in aged care by mapping all the key activities and their information dynamics; (ii) identify gaps in information exchange in the handover process and analyze implications for resident safety; and (iii) develop practical recommendations on how information communication technology (ICT) can improve the process and resident safety. Methods The study was undertaken at a large metropolitan facility in NSW with more than 300 residents and a staff including 55 registered nurses (RNs) and 146 assistants in nursing (AINs). A total of 3 focus groups, 12 interviews and 3 observation sessions were conducted over the period from July to October 2010. Process mapping was undertaken by translating the qualitative data via a five-category code book that was developed prior to the analysis. Results Three major sub-processes were identified and mapped: Handover Process (HOP) I, “Information gathering by RN”; HOP II, “Preparation of preliminary handover sheet”; and HOP III, “Execution of handover meeting”. Inefficiencies identified in relation to the handover include duplication of information, utilization of multiple communication modes and information sources, and lack of standardization. Conclusion By providing a robust process model of handover, this study has made two critical contributions to research in aged care: (i) a means to identify important, possibly suboptimal practices; and (ii) valuable evidence to plan and improve ICT implementation in residential aged care.
The mapping of this process enabled analysis of gaps in information flow and potential impacts on resident safety. In addition, it offers the basis for further studies into a process that, despite its importance for securing resident safety and continuity of care, lacks research.
Abstract:
Background Medication safety is a pressing concern for residential aged care facilities (RACFs). Retrospective studies in RACF settings identify inadequate communication between RACFs, doctors, hospitals and community pharmacies as the major cause of medication errors. The existing literature offers limited insight into the gaps in the existing information exchange process that may lead to medication errors. The aim of this research was to explicate the cognitive distribution that underlies RACF medication ordering and delivery, in order to identify gaps in medication-related information exchange which lead to medication errors in RACFs. Methods The study was undertaken in three RACFs in Sydney, Australia. Data were generated through ethnographic field work over a period of five months (May–September 2011). Triangulated analysis of the data primarily focused on examining the transformation and exchange of information between different media across the process. Results The findings of this study highlight the extensive scope and intense nature of information exchange in RACF medication ordering and delivery. Rather than attributing error to individual care providers, the explication of distributed cognition processes enabled the identification of gaps in three information exchange dimensions which potentially contribute to the occurrence of medication errors, namely: (1) the design of medication charts, which complicates order processing and record keeping; (2) the lack of coordination mechanisms between participants, which results in misalignment of local practices; and (3) reliance on restricted-bandwidth communication channels, mainly telephone and fax, which complicates the information processing requirements. The study demonstrates how the identification of these gaps enhances understanding of medication errors in RACFs.
Conclusions Application of the theoretical lens of distributed cognition can assist in enhancing our understanding of medication errors in RACFs through identification of gaps in information exchange. Understanding the dynamics of the cognitive process can inform the design of interventions to manage errors and improve residents’ safety.
Abstract:
Composting refers to the aerobic degradation of organic material and is one of the main waste treatment methods used in Finland for treating separated organic waste. The composting process converts organic waste to a humus-like end product which can be used to increase the organic matter in agricultural soils, in gardening, or in landscaping. Microbes play a key role as degraders during the composting process, and the microbiology of composting has been studied for decades, but there are still open questions regarding the microbiota in industrial composting processes. It is known that with traditional, culturing-based methods only a small fraction, below 1%, of the species in a sample is normally detected. In recent years an immense diversity of bacteria, fungi and archaea has been found to occupy many different environments. Therefore, the methods for characterising microbes constantly need to be developed further. In this thesis the presence of fungi and bacteria in full-scale and pilot-scale composting processes was characterised by cloning and sequencing. Several clone libraries were constructed and altogether nearly 6000 clones were sequenced. The microbial communities detected in this study were found to differ from the compost microbes observed in previous research with cultivation-based methods or with molecular methods in smaller-scale processes, although there were similarities as well. The bacterial diversity was high: based on non-parametric coverage estimations, the number of bacterial operational taxonomic units (OTUs) in certain stages of composting was over 500. Sequences similar to Lactobacillus and Acetobacteria were frequently detected in the early stages of drum composting. In the tunnel stages of composting the bacterial community comprised Bacillus, Thermoactinomyces, Actinobacteria and Lactobacillus.
The fungal diversity was found to be high, and phylotypes similar to yeasts were abundant in the full-scale drum and tunnel processes. In addition to phylotypes similar to Candida, Pichia and Geotrichum, moulds from the genera Thermomyces and Penicillium were observed in the tunnel stages of composting. Zygomycetes were detected in the pilot-scale composting processes and in the compost piles. In some of the samples a few abundant phylotypes present in the clone libraries masked the rare ones. The rare phylotypes were of interest, and a method for collecting them from clone libraries for sequencing was developed. With negative selection of the abundant phylotypes, the rare ones were picked from the clone libraries. Thus 41% of the clones in the studied clone libraries were sequenced. Since microbes play a central role in composting and in many other biotechnological processes, rapid methods for characterisation of microbial diversity would be of value, both scientifically and commercially. Current methods, however, lack sensitivity and specificity and are therefore under development. Microarrays have been used in microbial ecology for a decade to study the presence or absence of certain microbes of interest in a multiplex manner. The sequence database collected in this thesis was used as the basis for probe design and microarray development. The enzyme-assisted detection method, the ligation-detection-reaction (LDR) based microarray, was adapted for species-level detection of microbes characteristic of each stage of the composting process. With the use of a specially designed control probe it was established that a species-specific probe can detect target DNA representing as little as 0.04% of total DNA in a sample. The developed microarray can be used to monitor composting processes or the hygienisation of the compost end product. A large compost microbe sequence dataset was collected and analysed in this thesis.
The results provide valuable information on microbial community composition during industrial scale composting processes. The microarray method was developed based on the sequence database collected in this study. The method can be utilised in following the fate of interesting microbes during composting process in an extremely sensitive and specific manner. The platform for the microarray is universal and the method can easily be adapted for studying microbes from environments other than compost.
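The non-parametric coverage estimation mentioned above can be sketched with Good's coverage, C = 1 − F1/n (F1 = number of singleton OTUs, n = total clones), and the Chao1 richness estimate, S_chao1 = S_obs + F1²/(2·F2). The toy clone counts below are illustrative, not the thesis's clone-library data.

```python
from collections import Counter

def coverage_and_chao1(otu_counts):
    """Good's coverage and Chao1 richness from clones-per-OTU counts."""
    n = sum(otu_counts)
    freq = Counter(otu_counts)          # how many OTUs have k clones
    f1, f2 = freq.get(1, 0), freq.get(2, 0)
    s_obs = len(otu_counts)
    good = 1.0 - f1 / n                 # fraction of clones in seen OTUs
    chao1 = s_obs + (f1 * f1) / (2.0 * f2) if f2 else float(s_obs)
    return good, chao1

counts = [12, 9, 7, 5, 3, 2, 2, 1, 1, 1]   # clones observed per OTU
good, chao1 = coverage_and_chao1(counts)
print(good, chao1)
```

Estimators of this family underlie statements such as the abstract's ">500 OTUs in certain stages": the observed richness is extrapolated from how many phylotypes were seen only once or twice.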
Abstract:
The synthesis of a wide range of ferrocene-derived, sulfur-linked mono- and disubstituted Michael adducts and conjugates mediated by benzyltriethylammonium tetrathiomolybdate (1) in a tandem process is reported. A new route to access acryloylferrocene (4) and 1,1'-diacryloylferrocene (5) is discussed. Conjugation of amino acids to ferrocene is established via their N and C termini and also via side chains, employing conjugate addition as the key step to furnish mono- and divalent conjugates. This methodology has also been extended to access several ferrocene-carbohydrate conjugates. The electrochemical behavior of selected ferrocene conjugates was studied by cyclic voltammetry.
Abstract:
A mathematical model is developed to simulate oxygen consumption, heat generation and cell growth in solid state fermentation (SSF). Fungal growth on the solid substrate particles results in an increase of the cell-film thickness around the particles. The model incorporates this increase in biofilm size, which leads to a decrease in the porosity of the substrate bed and in the diffusivity of oxygen in the bed. The model also takes into account the effect of steric hindrance limitations in SSF. The growth of cells around a single particle and the resulting expansion of the biofilm around the particle are analyzed for simplified zero- and first-order oxygen consumption kinetics. Under conditions of zero-order kinetics, the model predicts an upper limit on cell density. The model simulations for a packed bed of solid particles in a tray bioreactor show distinct limitations on growth due to the simultaneous heat and mass transport phenomena accompanying the solid state fermentation process. The extent of limitation due to heat and/or mass transport phenomena is analyzed during different stages of fermentation. It is expected that the model will lead to a better understanding of the transport processes in SSF and will therefore assist in the optimal design of bioreactors for SSF.
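The coupled growth, oxygen, and heat balances described above can be sketched as a small ODE system: logistic biomass growth slowed by a Monod oxygen term, oxygen consumed in proportion to growth, and heat released with growth (adiabatic, no cooling term, so temperatures run high). All parameter values and units are illustrative assumptions, not the paper's model or its diffusion and steric-hindrance terms.

```python
from scipy.integrate import solve_ivp

mu_max, x_max = 0.25, 30.0     # 1/h; g biomass per kg substrate
k_o2, y_xo = 0.5, 2.0          # Monod constant (% O2); g O2 per g biomass
y_q, cp = 15.0, 2.5            # kJ per g biomass; kJ/(kg K)

def rhs(t, y):
    x, o2, _ = y
    # Growth limited both by crowding (logistic) and by oxygen (Monod)
    mu = mu_max * (1 - x / x_max) * o2 / (k_o2 + o2)
    dx = mu * x
    do2 = -y_xo * dx           # oxygen consumed with growth
    dT = (y_q / cp) * dx       # adiabatic heating
    return [dx, do2, dT]

# Initial biomass 0.5, oxygen 21 (%), temperature 30 C; simulate 72 h
sol = solve_ivp(rhs, (0, 72), [0.5, 21.0, 30.0])
x_end, o2_end, temp_end = sol.y[:, -1]
print(x_end, o2_end, temp_end)
```

Because the oxygen supply here is finite, growth stalls when O2 is depleted at a biomass below the logistic ceiling x_max, a toy analogue of the mass-transport limitation the model analyzes; the zero-order upper limit on cell density would instead come from diffusion through the biofilm.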
Abstract:
Cytomegalovirus (CMV) is a major cause of morbidity, costs and even mortality in organ transplant recipients. CMV may also enhance the development of chronic allograft nephropathy (CAN), which is the most important cause of graft loss after kidney transplantation. The evidence for the role of CMV in chronic allograft nephropathy is somewhat limited, and controversial results have also been reported. The aim of this study was to investigate the role of CMV in the pathogenesis of CAN. Material for this study was available from altogether 70 kidney transplant recipients who received a kidney transplant between 1992 and 2000. CMV infection was diagnosed with the pp65 antigenemia test or by viral culture from blood, urine, or both. CMV proteins were demonstrated in the kidney allograft biopsies by immunohistochemistry and CMV DNA by in situ hybridization. Cytokines, adhesion molecules, and growth factors were demonstrated in allograft biopsies by immunohistochemistry, and in urinary samples by ELISA methods. CMV proteins were detectable in the 6-month protocol biopsies from 18/41 recipients with evidence of CMV infection. In the histopathological analysis of the 6-month protocol biopsies, the presence of CMV in the allograft together with a previous history of acute rejection episodes was associated with increased arteriosclerotic changes in small arterioles. In urinary samples collected during CMV infection, excretion of TGF-β was significantly increased. In recipients with increased urinary excretion of TGF-β, increased interstitial fibrosis was recorded in the 6-month protocol biopsies. In biopsies taken after an active CMV infection, CMV persisted in the kidney allograft in 17/48 recipients, as CMV DNA or antigens were detected in the biopsies more than 2 months after the last positive finding in blood or urine.
This persistence was associated with increased expression of TGF-β, PDGF, and ICAM-1 and with increased vascular changes in the allografts. Graft survival and graft function one and two years after transplantation were reduced in recipients with persistent intragraft CMV. Persistent intragraft CMV infection was also a risk factor for reduced graft survival in Cox regression analysis, and an independent risk factor for poor graft function one and two years after transplantation in logistic regression analysis. In conclusion, these results show that persistent intragraft CMV infection is detrimental to kidney allografts, causing increased expression of growth factors and increased vascular changes, leading to reduced graft function and survival. Effective prevention, diagnosis and treatment of CMV infections may be a major factor in improving the long-term survival of kidney allografts.
Abstract:
Business process models have become an effective way of examining business practices to identify areas for improvement. While common information-gathering approaches are generally efficacious, they can be quite time-consuming and carry the risk of introducing inaccuracies when information is forgotten or incorrectly interpreted by analysts. In this study, the potential of a role-playing approach to process elicitation and specification has been examined. This method allows stakeholders to enter a virtual world and role-play actions similarly to how they would in reality. As actions are completed, a model is automatically developed, removing the need for stakeholders to learn and understand a modelling grammar. An empirical investigation comparing both the modelling outputs and participant behaviour of this virtual world role-play elicitor with an S-BPM process modelling tool found that while the modelling approaches of the two groups varied greatly, the virtual world elicitor may not only improve both the number of individual process task steps remembered and the correctness of task ordering, but also reduce the time required for stakeholders to model a process view.
Abstract:
Accurate estimations of the water balance are needed in semi-arid and sub-humid tropical regions, where water resources are scarce compared to water demand. Evapotranspiration plays a major role in this context, and the difficulty of quantifying it precisely leads to major uncertainties in groundwater recharge assessment, especially in forested catchments. In this paper, we propose to assess the importance of the deep unsaturated regolith and of water uptake by deep tree roots in the groundwater recharge process by using a lumped conceptual model (COMFORT). The model is calibrated using 5 years of hydrological monitoring of an experimental watershed under dry deciduous forest in South India (Mule Hole watershed). The model was able to simulate the stream discharge as well as the contrasted behaviour of the groundwater table along the hillslope. The water balance simulated for a 32-year climatic time series displayed large year-to-year variability, with alternating dry and wet phases over a period of approximately 14 years. On average, input from rainfall was 1090 mm year⁻¹ and evapotranspiration was about 900 mm year⁻¹, of which 100 mm year⁻¹ was taken up from the deep saprolite horizons. The stream flow was 100 mm year⁻¹ while the groundwater underflow was 80 mm year⁻¹. The simulation results suggest that (i) deciduous trees can take up a significant amount of water from the deep regolith, (ii) this uptake, combined with the spatial variability of regolith depth, can account for the variable lag time between drainage events and groundwater rise observed for the different piezometers, and (iii) the water table response to recharge is buffered by the long vertical travel time through the deep vadose zone, which constitutes a major water reservoir. This study stresses the importance of long-term observations for the understanding of hydrological processes in tropical forested ecosystems.
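The lumped conceptual idea can be sketched as a chain of "bucket" stores in the spirit of the model described above: rain fills a soil store, trees transpire first from the soil and then from a deep regolith store via deep roots, and only overflow of the deep store becomes groundwater recharge. The store capacities, the linear drainage rule, and the toy monthly forcing are illustrative assumptions, not the calibrated COMFORT model.

```python
def step(rain, pet, soil, deep, soil_cap=150.0, deep_cap=400.0):
    """One time step (mm): returns updated stores and the fluxes."""
    soil += rain
    runoff = max(0.0, soil - soil_cap)   # saturation excess to stream
    soil = min(soil, soil_cap)
    # Transpire from the soil first, then let deep roots take the rest
    et_soil = min(pet, soil)
    soil -= et_soil
    et_deep = min(pet - et_soil, deep)
    deep -= et_deep
    # Drainage from the soil tops up the deep store; overflow recharges
    drain = 0.1 * soil                   # simple linear drainage
    soil -= drain
    deep += drain
    recharge = max(0.0, deep - deep_cap)
    deep = min(deep, deep_cap)
    return soil, deep, runoff, recharge, et_soil + et_deep

soil, deep = 50.0, 380.0
totals = {"runoff": 0.0, "recharge": 0.0, "et": 0.0}
# Toy monthly forcing (mm): a wet monsoon followed by a dry season
rains = [220, 180, 150, 60, 10, 0, 0, 0, 5, 20, 90, 160]
pets = [80, 80, 90, 100, 110, 120, 120, 110, 100, 90, 80, 80]
for rain, pet in zip(rains, pets):
    soil, deep, runoff, recharge, et = step(rain, pet, soil, deep)
    for key, val in (("runoff", runoff), ("recharge", recharge), ("et", et)):
        totals[key] += val
print(round(totals["et"]), round(totals["recharge"]))
```

Even in this toy version, the deep store sustains dry-season transpiration and buffers recharge, which is the qualitative mechanism the paper invokes for the lagged, damped water table response.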