46 results for Constituent process


Relevance: 20.00%

Abstract:

This thesis studies the human gene expression space using high-throughput gene expression data from DNA microarrays. In molecular biology, high-throughput techniques allow numerical measurement of the expression of tens of thousands of genes simultaneously. In a single study, this data is traditionally obtained from a limited number of sample types with a small number of replicates. For organism-wide analysis, such data has been largely unavailable and the global structure of the human transcriptome has remained unknown. This thesis introduces a human transcriptome map of different biological entities and an analysis of its general structure. The map is constructed from gene expression data from the two largest public microarray data repositories, GEO and ArrayExpress. The creation of this map contributed to the development of ArrayExpress by identifying and retrofitting previously unusable and missing data and by improving access to its data. It also contributed to the creation of several new tools for microarray data manipulation and to the establishment of data exchange between GEO and ArrayExpress. The data integration for the global map required the creation of a new large ontology of human cell types, disease states, organism parts and cell lines. The ontology was used in a new text mining and decision tree based method for automatically converting human-readable free-text microarray data annotations into a categorised format. Data comparability, and the minimisation of the systematic measurement errors characteristic of each laboratory in this large cross-laboratory integrated dataset, were ensured by computing a range of microarray data quality metrics and excluding incomparable data. The structure of the global map of human gene expression was then explored by principal component analysis and hierarchical clustering, using heuristics and a further purpose-built sample ontology. A preface and motivation for the construction and analysis of a global map of human gene expression is given by the analysis of two microarray datasets of human malignant melanoma. The analysis of these datasets incorporates an indirect comparison of statistical methods for finding differentially expressed genes and points to the need to study gene expression at a global level.
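As a minimal sketch of the kind of exploratory analysis described above (principal component analysis followed by hierarchical clustering of samples), the snippet below uses a randomly generated placeholder expression matrix with samples as rows and genes as columns; the library choices (scikit-learn, SciPy), component counts and cluster numbers are illustrative assumptions, not taken from the thesis.

```python
# Sketch: PCA + hierarchical clustering of a gene expression matrix.
# `expr` is a placeholder for real log-scale microarray data (samples x genes).
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
expr = rng.normal(size=(60, 5000))          # placeholder data, 60 samples x 5000 genes

# Reduce dimensionality: keep the components explaining most variance.
pca = PCA(n_components=10)
scores = pca.fit_transform(expr)            # samples projected onto principal components
print("explained variance ratios:", pca.explained_variance_ratio_.round(3))

# Hierarchical clustering of samples in the reduced space.
Z = linkage(scores, method="average", metric="euclidean")
labels = fcluster(Z, t=5, criterion="maxclust")   # cut the tree into 5 sample groups
print("cluster sizes:", np.bincount(labels)[1:])
```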

Relevance: 20.00%

Abstract:

In visual object detection and recognition, classifiers have two interesting characteristics: accuracy and speed. Accuracy depends on the complexity of the image features and classifier decision surfaces. Speed depends on the hardware and the computational effort required to use the features and decision surfaces. When attempts to increase accuracy lead to increases in complexity and effort, it is necessary to ask how much we are willing to pay for increased accuracy. For example, if increased computational effort implies quickly diminishing returns in accuracy, then those designing inexpensive surveillance applications cannot aim for maximum accuracy at any cost. It becomes necessary to find trade-offs between accuracy and effort. We study efficient classification of images depicting real-world objects and scenes. Classification is efficient when a classifier can be controlled so that the desired trade-off between accuracy and effort (speed) is achieved and unnecessary computations are avoided on a per-input basis. A framework is proposed for understanding and modeling efficient classification of images. Classification is modeled as a tree-like process. In designing the framework, it is important to recognize what is essential and to avoid structures that are narrow in applicability. Earlier frameworks are lacking in this regard. The overall contribution is twofold. First, the framework is presented, subjected to experiments, and shown to be satisfactory. Second, certain unconventional approaches are experimented with. This allows the separation of the essential from the conventional. To determine whether the framework is satisfactory, three categories of questions are identified: trade-off optimization, classifier tree organization, and rules for delegation and confidence modeling. Questions and problems related to each category are addressed and empirical results are presented. For example, related to trade-off optimization, we address the problem of computational bottlenecks that limit the range of trade-offs, and we ask whether accuracy-versus-effort trade-offs can be controlled after training. As another example, regarding classifier tree organization, we first consider the task of organizing a tree in a problem-specific manner and then ask whether problem-specific organization is necessary.
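To make the accuracy-versus-effort trade-off concrete, the hedged sketch below delegates from a cheap classifier to a more expensive one only when the cheap classifier's confidence falls below a threshold; the threshold then controls the trade-off after training. The classifiers, the synthetic data and the confidence rule are placeholder assumptions, not the framework proposed in the thesis.

```python
# Sketch: two-stage delegation controlled by a confidence threshold.
# A cheap model answers when confident; otherwise the costly model is consulted.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=40, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

cheap = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
costly = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

def cascade_predict(X, threshold):
    proba = cheap.predict_proba(X)
    confident = proba.max(axis=1) >= threshold      # per-input decision to stop early
    pred = proba.argmax(axis=1)
    if (~confident).any():                          # delegate only the uncertain inputs
        pred[~confident] = costly.predict(X[~confident])
    effort = (~confident).mean()                    # fraction sent to the costly stage
    return pred, effort

for t in (0.6, 0.8, 0.95):                          # raising t buys accuracy with effort
    pred, effort = cascade_predict(X_te, t)
    print(f"threshold={t:.2f}  accuracy={(pred == y_te).mean():.3f}  delegated={effort:.2f}")
```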

Relevance: 20.00%

Abstract:

This work is concerned with presenting a modified theoretical approach to the study of centre-periphery relations in the Russian Federation. In the widely accepted scientific discourse, the Russian federal system under the Yeltsin Administration (1991-2000) was asymmetrical, largely owing to the varying amount of structural autonomy distributed among the federation's 89 constituent units. While providing an improved understanding of which political and socio-economic structures contributed to federal asymmetry, it is felt that the associated large-N studies have underemphasised the role played by actor agency in re-shaping Russian federal institutions. The main task of this thesis is to reintroduce and re-emphasise the importance of actor agency as a major contributing element of institutional change in the Russian federal system. By focusing on the strategic agency of regional elites simultaneously within regional and federal contexts, the thesis adopts the position that political, ethnic and socio-economic structural factors alone cannot fully determine the extent to which regional leaders were successful in their pursuit of economic and political pay-offs from the institutionally weakened federal centre. Furthermore, this work hypothesises that under conditions of federal institutional uncertainty, it is the ability of regional leaders to simultaneously interpret various mutable structural conditions and translate them into plausible strategies which accounts for the regions' ability to extract variable amounts of economic and political pay-offs from the Russian federal system. The thesis finds that while the hypothesis is accurate in its theoretical assumptions, several key conclusions provide paths for further inquiry beyond the initial research question. First, without reliable information or stable institutions to guide their actions, both regional and federal elites were forced into ad hoc decision-making in order to maintain their core strategic focus: political survival. Second, instead of attributing asymmetry exclusively to either actor agency or structural factors, the empirical data show that agency and structures interact symbiotically in the strategic formulation process, thus accounting for the sub-optimal nature of several of the actions taken in the adopted cases. Third, as actor agency and structural factors mutate over time, so, too, do the perceived pay-offs from elite competition. In the case of the Russian federal system, the stronger the federal centre became, the less likely it was that regional leaders could extract the high degree of economic and political pay-offs that they had clamoured for earlier in the Yeltsin period. Finally, traditional approaches to the study of federal systems which focus on institutions as measures of federalism are not fully applicable in the Russian case, precisely because the institutions themselves were a secondary point of contention between competing elites. Institutional equilibria between the regions and Moscow were struck only when highly personalised elite preferences were satisfied. The Russian federal system is therefore the product of short-term institutional solutions suited to elite survival strategies developed under conditions of economic, political and social uncertainty.

Relevance: 20.00%

Abstract:

Composting refers to the aerobic degradation of organic material and is one of the main waste treatment methods used in Finland for treating separated organic waste. The composting process converts organic waste into a humus-like end product which can be used to increase the organic matter in agricultural soils, in gardening, or in landscaping. Microbes play a key role as degraders during the composting process, and the microbiology of composting has been studied for decades, but there are still open questions regarding the microbiota in industrial composting processes. It is known that with traditional, culturing-based methods only a small fraction, below 1%, of the species in a sample is normally detected. In recent years an immense diversity of bacteria, fungi and archaea has been found to occupy many different environments. Therefore, the methods for characterising microbes constantly need to be developed further. In this thesis the presence of fungi and bacteria in full-scale and pilot-scale composting processes was characterised by cloning and sequencing. Several clone libraries were constructed and altogether nearly 6000 clones were sequenced. The microbial communities detected in this study were found to differ from the compost microbes observed in previous research with cultivation-based methods or with molecular methods applied to smaller-scale processes, although there were similarities as well. The bacterial diversity was high. Based on non-parametric coverage estimations, the number of bacterial operational taxonomic units (OTUs) in certain stages of composting was over 500. Sequences similar to Lactobacillus and Acetobacteria were frequently detected in the early stages of drum composting. In the tunnel stages of composting the bacterial community comprised Bacillus, Thermoactinomyces, Actinobacteria and Lactobacillus. The fungal diversity was also found to be high, and phylotypes similar to yeasts were abundant in the full-scale drum and tunnel processes. In addition to phylotypes similar to Candida, Pichia and Geotrichum, moulds from the genera Thermomyces and Penicillium were observed in the tunnel stages of composting. Zygomycetes were detected in the pilot-scale composting processes and in the compost piles. In some of the samples a few abundant phylotypes in the clone libraries masked the rare ones. The rare phylotypes were of interest, and a method was developed for collecting them from clone libraries for sequencing: by negative selection of the abundant phylotypes, the rare ones were picked from the clone libraries, so that 41% of the clones in the studied libraries were sequenced. Since microbes play a central role in composting and in many other biotechnological processes, rapid methods for the characterisation of microbial diversity would be of value, both scientifically and commercially. Current methods, however, lack sensitivity and specificity and are therefore under development. Microarrays have been used in microbial ecology for a decade to study the presence or absence of microbes of interest in a multiplex manner. The sequence database collected in this thesis was used as the basis for probe design and microarray development. An enzyme-assisted detection method, the ligation detection reaction (LDR) based microarray, was adapted for species-level detection of microbes characteristic of each stage of the composting process.
With the use of a specially designed control probe it was established that a species-specific probe can detect target DNA representing as little as 0.04% of the total DNA in a sample. The developed microarray can be used to monitor composting processes or the hygienisation of the compost end product. A large compost microbe sequence dataset was collected and analysed in this thesis. The results provide valuable information on microbial community composition during industrial-scale composting processes. The microarray method developed on the basis of this sequence database can be used to follow the fate of microbes of interest during the composting process in an extremely sensitive and specific manner. The platform for the microarray is universal, and the method can easily be adapted for studying microbes from environments other than compost.
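The non-parametric coverage estimations mentioned above are typically based on the counts of phylotypes observed once and twice in a clone library. The sketch below shows two standard estimators of this kind (bias-corrected Chao1 richness and Good's coverage) applied to a hypothetical vector of OTU abundances; the numbers and function names are illustrative and are not drawn from the thesis data or methods.

```python
# Sketch: non-parametric estimates from OTU abundance counts in a clone library.
import numpy as np

def chao1(counts):
    """Bias-corrected Chao1 richness: observed OTUs plus a singleton/doubleton correction."""
    counts = np.asarray(counts)
    s_obs = (counts > 0).sum()
    f1 = (counts == 1).sum()          # OTUs seen exactly once
    f2 = (counts == 2).sum()          # OTUs seen exactly twice
    return s_obs + f1 * (f1 - 1) / (2 * (f2 + 1))

def goods_coverage(counts):
    """Good's coverage: estimated fraction of the community captured by the library."""
    counts = np.asarray(counts)
    return 1.0 - (counts == 1).sum() / counts.sum()

otu_counts = [120, 55, 30, 7, 3, 2, 2, 1, 1, 1, 1]   # hypothetical clone counts per OTU
print("observed OTUs:", sum(c > 0 for c in otu_counts))
print("Chao1 estimate:", round(chao1(otu_counts), 1))
print("Good's coverage:", round(goods_coverage(otu_counts), 3))
```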

Relevance: 20.00%

Abstract:

Cytomegalovirus (CMV) is a major cause of morbidity, costs and even mortality in organ transplant recipients. CMV may also enhance the development of chronic allograft nephropathy (CAN), which is the most important cause of graft loss after kidney transplantation. The evidence for the role of CMV in chronic allograft nephropathy is somewhat limited, and controversial results have also been reported. The aim of this study was to investigate the role of CMV in the pathogenesis of CAN. Material for this study was available from altogether 70 kidney transplant recipients who received a kidney transplant between 1992 and 2000. CMV infection was diagnosed with the pp65 antigenemia test or by viral culture from blood, urine, or both. CMV proteins were demonstrated in the kidney allograft biopsies by immunohistochemistry and CMV DNA by in situ hybridization. Cytokines, adhesion molecules, and growth factors were demonstrated in allograft biopsies by immunohistochemistry and in urinary samples by ELISA methods. CMV proteins were detectable in the 6-month protocol biopsies from 18/41 recipients with evidence of CMV infection. In the histopathological analysis of the 6-month protocol biopsies, the presence of CMV in the allograft together with a previous history of acute rejection episodes was associated with increased arteriosclerotic changes in small arterioles. In urinary samples collected during CMV infection, excretion of TGF-β was significantly increased. In recipients with increased urinary excretion of TGF-β, increased interstitial fibrosis was recorded in the 6-month protocol biopsies. In biopsies taken after an active CMV infection, CMV persisted in the kidney allograft in 17/48 recipients, as CMV DNA or antigens were detected in the biopsies more than 2 months after the last positive finding in blood or urine. This persistence was associated with increased expression of TGF-β, PDGF, and ICAM-1 and with increased vascular changes in the allografts. Graft survival and graft function one and two years after transplantation were reduced in recipients with persistent intragraft CMV. Persistent intragraft CMV infection was also a risk factor for reduced graft survival in Cox regression analysis, and an independent risk factor for poor graft function one and two years after transplantation in logistic regression analysis. In conclusion, these results show that persistent intragraft CMV infection is detrimental to kidney allografts, causing increased expression of growth factors and increased vascular changes, leading to reduced graft function and survival. Effective prevention, diagnosis and treatment of CMV infections may be a major factor in improving the long-term survival of kidney allografts.
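A minimal sketch of the type of survival analysis mentioned above (Cox regression of graft survival with a persistent-infection indicator) is given below, assuming a small synthetic data frame; the column names, the lifelines library, and the toy data-generating assumption are all illustrative and are not taken from the study.

```python
# Sketch: Cox proportional hazards model for graft survival with a CMV persistence flag.
# All data below are synthetic placeholders, not clinical data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 60
persistent_cmv = rng.integers(0, 2, n)
acute_rejection = rng.integers(0, 2, n)
# Toy assumption: shorter graft survival, on average, when CMV persists in the graft.
time = rng.exponential(scale=5.0 / (1.0 + persistent_cmv), size=n)
event = (time < 4.0).astype(int)                 # graft loss observed within follow-up
followup = np.clip(time, 0.05, 4.0)              # censor follow-up at 4 years

df = pd.DataFrame({
    "followup_years": followup,
    "graft_loss": event,
    "persistent_cmv": persistent_cmv,
    "acute_rejection": acute_rejection,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="graft_loss")
cph.print_summary()                              # hazard ratios, e.g. for persistent_cmv
```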

Relevance: 20.00%

Abstract:

New stars form in dense interstellar clouds of gas and dust called molecular clouds. The actual sites where the process of star formation takes place are the dense clumps and cores deeply embedded in molecular clouds. The details of the star formation process are complex and not completely understood. Thus, determining the physical and chemical properties of molecular cloud cores is necessary for a better understanding of how stars are formed. Some of the main features of the origin of low-mass stars, like the Sun, are already relatively well-known, though many details of the process are still under debate. The mechanism through which high-mass stars form, on the other hand, is poorly understood. Although it is likely that the formation of high-mass stars shares many properties similar to those of low-mass stars, the very first steps of the evolutionary sequence are unclear. Observational studies of star formation are carried out particularly at infrared, submillimetre, millimetre, and radio wavelengths. Much of our knowledge about the early stages of star formation in our Milky Way galaxy is obtained through molecular spectral line and dust continuum observations. The continuum emission of cold dust is one of the best tracers of the column density of molecular hydrogen, the main constituent of molecular clouds. Consequently, dust continuum observations provide a powerful tool to map large portions across molecular clouds, and to identify the dense star-forming sites within them. Molecular line observations, on the other hand, provide information on the gas kinematics and temperature. Together, these two observational tools provide an efficient way to study the dense interstellar gas and the associated dust that form new stars. The properties of highly obscured young stars can be further examined through radio continuum observations at centimetre wavelengths. For example, radio continuum emission carries useful information on conditions in the protostar+disk interaction region where protostellar jets are launched. In this PhD thesis, we study the physical and chemical properties of dense clumps and cores in both low- and high-mass star-forming regions. The sources are mainly studied in a statistical sense, but also in more detail. In this way, we are able to examine the general characteristics of the early stages of star formation, cloud properties on large scales (such as fragmentation), and some of the initial conditions of the collapse process that leads to the formation of a star. The studies presented in this thesis are mainly based on molecular line and dust continuum observations. These are combined with archival observations at infrared wavelengths in order to study the protostellar content of the cloud cores. In addition, centimetre radio continuum emission from young stellar objects (YSOs; i.e., protostars and pre-main sequence stars) is studied in this thesis to determine their evolutionary stages. 
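The statement that cold dust continuum emission traces the column density of molecular hydrogen is commonly quantified with the standard optically thin dust emission relation; the form below is a common textbook expression, included as a hedged illustration rather than the specific formulation used in the thesis.

```latex
% Column density of H2 from optically thin dust continuum emission (standard form):
%   S_nu      : flux density of the dust emission in the beam
%   Omega     : solid angle of the beam
%   mu        : mean molecular weight per H2 molecule (about 2.8)
%   m_H       : mass of the hydrogen atom
%   kappa_nu  : dust opacity per unit mass
%   B_nu(T_d) : Planck function at the dust temperature T_d
\[
  N(\mathrm{H_2}) \;=\; \frac{S_\nu}{\Omega \, \mu \, m_{\mathrm{H}} \, \kappa_\nu \, B_\nu(T_{\mathrm{d}})}
\]
```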
The main results of this thesis are as follows: i) filamentary and sheet-like molecular cloud structures, such as infrared dark clouds (IRDCs), are likely to be caused by supersonic turbulence, but their fragmentation at the scale of cores could be due to gravo-thermal instability; ii) the core evolution in the Orion B9 star-forming region appears to be dynamic, and the role played by slow ambipolar diffusion in the formation and collapse of the cores may not be significant; iii) the study of the R CrA star-forming region suggests that the centimetre radio emission properties of a YSO are likely to change with its evolutionary stage; iv) the IRDC G304.74+01.32 contains candidate high-mass starless cores which may represent the very first steps of high-mass star and star cluster formation; v) SiO outflow signatures are seen in several high-mass star-forming regions, which suggests that high-mass stars form in a similar way to their low-mass counterparts, i.e., via disk accretion. The results presented in this thesis provide constraints on the initial conditions and early stages of both low- and high-mass star formation. In particular, this thesis presents several observational results on the early stages of clustered star formation, which is the dominant mode of star formation in our Galaxy.

Relevance: 20.00%

Abstract:

The ProFacil model is a generic process model, defined as a framework model, showing the links between the facilities management process and the building end user’s business process. The purpose of using the model is to support more detailed process modelling. The model has been developed using the IDEF0 modelling method. The ProFacil model describes business activities from a generalized point of view as management, support, and core processes and their relations. The model defines basic activities in the provision of a facility. Examples of these activities are “operate facilities”, “provide new facilities”, “provide re-build facilities”, “provide maintained facilities” and “perform dispose of facilities”. These are all generic activities providing a basis for further specialisation of company-specific FM activities and their tasks. A facilitator can establish a specialized process model by using the ProFacil model and interacting with company experts to describe their company’s specific processes. These modelling seminars or interviews are conducted informally, supported by the high-level process model as a common reference.
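As a hedged illustration of how the generic activities listed above could serve as a basis for company-specific specialisation, the sketch below represents generic activities and their specialised sub-activities as a simple data structure; the class design, the management/support/core labels assigned to each example, and the specialised tasks are invented for illustration and are not part of the ProFacil model itself.

```python
# Sketch: generic FM activities specialised into company-specific tasks.
from dataclasses import dataclass, field

@dataclass
class Activity:
    name: str
    kind: str                                   # "management", "support" or "core" (illustrative)
    sub_activities: list = field(default_factory=list)

    def specialise(self, task_name):
        """Attach a company-specific task under a generic activity."""
        task = Activity(task_name, self.kind)
        self.sub_activities.append(task)
        return task

# Generic activities named in the abstract; their classification here is a guess.
generic = [Activity("operate facilities", "core"),
           Activity("provide new facilities", "core"),
           Activity("provide maintained facilities", "support")]

# Hypothetical company-specific specialisation captured during a modelling seminar.
generic[0].specialise("monitor HVAC performance weekly")
generic[2].specialise("schedule preventive maintenance of elevators")

for a in generic:
    print(a.name, "->", [s.name for s in a.sub_activities])
```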

Relevance: 20.00%

Abstract:

A model of the information and material activities that comprise the overall construction process is presented, using the SADT activity modelling methodology. The basic model is further refined into a number of generic information-handling activities, such as the creation of new information, information search and retrieval, information distribution, and person-to-person communication. The viewpoint could be described as information logistics. This model is then combined with a more traditional building process model, consisting of phases such as design and construction. The resulting two-dimensional matrix can be used for positioning different types of generic IT tools or construction-specific applications. The model can thus provide a starting point for a discussion of the application of information and communication technology in construction and for measurements of the impacts of IT on the overall process and its related costs.
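The two-dimensional matrix described above can be pictured as a mapping from (building process phase, generic information activity) pairs to the IT tools positioned in that cell. The sketch below, with invented phase and tool names, is only meant to make that idea concrete and is not taken from the paper.

```python
# Sketch: positioning IT tools in a phase x information-activity matrix.
from collections import defaultdict

phases = ["design", "construction"]                      # traditional process phases
info_activities = ["information creation", "information search and retrieval",
                   "information distribution", "person-to-person communication"]

matrix = defaultdict(list)                               # (phase, activity) -> tools
matrix[("design", "information creation")].append("CAD system")
matrix[("design", "information distribution")].append("document management system")
matrix[("construction", "person-to-person communication")].append("mobile messaging")

# Print the matrix row by row; empty cells mark combinations with no positioned tool.
for phase in phases:
    for activity in info_activities:
        tools = matrix.get((phase, activity), [])
        print(f"{phase:12s} | {activity:35s} | {', '.join(tools) or '-'}")
```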

Relevance: 20.00%

Abstract:

The Industry Foundation Classes (IFC) file format is one of the most complex and ambitious IT standardization projects currently being undertaken in any industry, focusing on the development of an open and neutral standard for exchanging building model data. Scientific literature related to the IFC standard has so far been predominantly technical; research looking at the IFC standard from an industry standardization perspective could offer valuable new knowledge for both theory and practice. This paper proposes the use of IT standardization and IT adoption theories, supported by studies done within construction IT, to lay a theoretical foundation for further empirical analysis of the standardization process of the IFC file format.

Relevance: 20.00%

Abstract:

There has been a demand for uniform CAD standards in the construction industry ever since the large-scale introduction of computer-aided design systems in the late 1980s. While some standards have been widely adopted without much formal effort, other standards have failed to gain support even though considerable resources have been allocated for the purpose. Establishing a standard for building information modeling has been one particularly active area of industry development and scientific interest within recent years. In this paper, four different standards are discussed as cases: the IGES and DXF/DWG standards for representing the graphics in 2D drawings, the ISO 13567 standard for the structuring of building information on layers, and the IFC standard for building product models. Based on a literature study combined with two qualitative interview studies with domain experts, a process model is proposed to describe and interpret the contrasting histories of past CAD standardisation processes.

Relevance: 20.00%

Abstract:

This study contributes to our knowledge of how information contained in financial statements is interpreted and priced by the stock market in two respects. First, the empirical findings indicate that investors interpret some of the information contained in new financial statements in the context of the information in prior financial statements. Second, two central hypotheses offered in the earlier literature to explain the significant connection between publicly available financial statement information and future abnormal returns, namely that the signals proxy for risk and that the information is priced with a delay, are evaluated using a new methodology. It is found that, for some financial statement signals, this significant connection can be explained by the signals proxying for risk, and for other signals by the information contained in them being priced with a delay.