975 results for atmospheric chemistry, cloud processing, clustering


Relevance: 30.00%

Abstract:

PURPOSE: To objectively characterize different heart tissues from functional and viability images provided by composite-strain-encoding (C-SENC) MRI. MATERIALS AND METHODS: C-SENC is a new MRI technique for simultaneously acquiring cardiac functional and viability images. In this work, an unsupervised multi-stage fuzzy clustering method is proposed to identify different heart tissues in the C-SENC images. The method is based on the sequential application of the fuzzy c-means (FCM) and iterative self-organizing data (ISODATA) clustering algorithms. The proposed method is tested on simulated heart images and on images from nine patients with and without myocardial infarction (MI). The resulting clustered images are compared with MRI delayed-enhancement (DE) viability images for determining MI, and Bland-Altman analysis is conducted between the two methods. RESULTS: Normal myocardium, infarcted myocardium, and blood are correctly identified using the proposed method. The clustered images correctly identified 90 ± 4% of the pixels defined as infarct in the DE images. In addition, 89 ± 5% of the pixels defined as infarct in the clustered images were also defined as infarct in the DE images. The Bland-Altman results show no bias between the two methods in identifying MI. CONCLUSION: The proposed technique allows for objectively identifying different heart tissues, which would be potentially important for clinical decision-making in patients with MI.
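As a rough illustration of the first stage, here is a plain-NumPy sketch of the standard fuzzy c-means update rules. This is generic textbook FCM, not the authors' multi-stage C-SENC pipeline, and the subsequent ISODATA splitting/merging stage is omitted:

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means: returns cluster centers and the fuzzy
    membership matrix U (each row of U sums to 1)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # normalize initial memberships
    p = 2.0 / (m - 1.0)
    for _ in range(max_iter):
        Um = U ** m
        # Centers are membership-weighted means of the samples.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Distances of every sample to every center.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                  # avoid division by zero
        inv = d ** -p
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U
```

In the paper's setting, `X` would hold per-pixel feature vectors from the functional and viability images, and the resulting memberships would then be refined by ISODATA.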

Relevance: 30.00%

Abstract:

Proper division plane positioning is essential to achieve faithful DNA segregation and to control daughter cell size, positioning, or fate within tissues. In Schizosaccharomyces pombe, division plane positioning is controlled positively by export of the division plane positioning factor Mid1/anillin from the nucleus and negatively by the Pom1/DYRK (dual-specificity tyrosine-regulated kinase) gradients emanating from the cell tips. Through an unknown mechanism, Pom1 restricts to the cell middle the cortical cytokinetic ring precursor nodes organized by the SAD-like kinase Cdr2 and Mid1/anillin. In this study, we show that Pom1 modulates Cdr2 association with membranes by phosphorylation of a basic region that cooperates with the lipid-binding KA-1 domain. Pom1 also inhibits the Cdr2 interaction with Mid1, reducing its clustering ability, possibly by down-regulation of Cdr2 kinase activity. We propose that the dual regulation exerted by Pom1 on Cdr2 prevents Cdr2 assembly into stable nodes in the cell tip region, where the Pom1 concentration is high, which ensures proper positioning of cytokinetic ring precursors at the cell's geometrical center and robust, accurate division plane positioning.

Relevance: 30.00%

Abstract:

Remote sensing image processing is nowadays a mature research area. The techniques developed in the field allow many real-life applications with great societal value. For instance, urban monitoring, fire detection, or flood prediction can have a great impact on economic and environmental issues. To attain such objectives, remote sensing has turned into a multidisciplinary field of science that embraces physics, signal theory, computer science, electronics, and communications. From a machine learning and signal/image processing point of view, all the applications are tackled under specific formalisms, such as classification and clustering, regression and function approximation, image coding, restoration and enhancement, source unmixing, data fusion, or feature selection and extraction. This paper serves as a survey of methods and applications, and reviews the latest methodological advances in remote sensing image processing.

Relevance: 30.00%

Abstract:

T cell receptor (TCR-CD3) triggering involves both receptor clustering and conformational changes at the cytoplasmic tails of the CD3 subunits. The mechanism by which TCRαβ ligand binding confers conformational changes to CD3 is unknown. By using well-defined ligands, we showed that induction of the conformational change requires both multivalent engagement and the mobility restriction of TCR-CD3 imposed by the plasma membrane. The conformational change is elicited by cooperative rearrangements of two TCR-CD3 complexes and does not require accompanying changes in the structure of the TCRαβ ectodomains. This conformational change at CD3 reverts upon ligand dissociation and is required for T cell activation. Thus, our permissive geometry model provides a molecular mechanism that rationalizes how the information of ligand binding to TCRαβ is transmitted to the CD3 subunits and to the intracellular signaling machinery.

Relevance: 30.00%

Abstract:

A discussion is presented of daytime sky imaging and techniques that may be applied to the analysis of full-color sky images to infer cloud macrophysical properties. Descriptions of two different types of sky-imaging systems developed by the authors are presented, one of which has been developed into a commercially available instrument. Retrievals of fractional sky cover from automated processing methods are compared to human retrievals, both from direct observations and from visual analyses of sky images. Although some uncertainty exists in fractional sky cover retrievals from sky images, this uncertainty is no greater than that attached to human observations for the commercially available sky-imager retrievals. Thus, the application of automatic digital image processing techniques to sky images is a useful method to complement, or even replace, traditional human observations of sky cover and, potentially, cloud type. Additionally, the possibilities for inferring other cloud parameters, such as cloud brokenness and solar obstruction, further enhance the usefulness of sky imagers.
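The abstract does not state which retrieval algorithm the authors use, but a common baseline for fractional sky cover from full-color sky images is thresholding the red-to-blue pixel ratio (clear sky scatters blue strongly, so its ratio is low; clouds are nearly gray). The sketch below assumes that technique, and the threshold value is illustrative rather than calibrated:

```python
import numpy as np

def fractional_sky_cover(rgb, threshold=0.6):
    """Classify each pixel as cloud if its red/blue ratio exceeds `threshold`,
    then return the cloudy fraction of the image. The ratio test and the
    default threshold are an illustrative baseline, not the paper's method."""
    r = rgb[..., 0].astype(float)
    b = rgb[..., 2].astype(float)
    ratio = r / np.maximum(b, 1.0)   # guard against division by zero
    cloud = ratio > threshold
    return cloud.mean()
```

In practice such a threshold would be tuned against human observations, which is essentially the comparison the paper reports.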

Relevance: 30.00%

Abstract:

This paper reports the results of a three-year study of the effectiveness of mini-projects in a first-year laboratory course in chemistry at a Scottish university. A mini-project is a short, practical problem whose solution requires the application of the knowledge and skills developed in previously completed set experiments. A number of recommendations are made about the most appropriate ways of introducing mini-projects into undergraduate laboratory courses. The main hypothesis of this study concerned the value of mini-projects in laboratory courses, formulated within the context of Information Processing Theory.

Relevance: 30.00%

Abstract:

From an analysis of a learning model based on the theory of information processing, four hypotheses were developed for improving the design of laboratory courses. Three of these hypotheses concerned specific procedures to minimise the load on students' working memories (or working spaces), and the fourth concerned the value of mini-projects in enhancing meaningful learning of the knowledge and skills underpinning the set experiments. A three-year study of a first-year undergraduate chemistry laboratory course at a Scottish university was carried out to test these four hypotheses. This paper reports the results of the study relevant to the three hypotheses about the burden on students' working spaces. It was predicted from the learning model that the load on students' working spaces should be reduced by appropriate changes to the written instructions and the laboratory organisation, and by the introduction of prelab work and prelab training in laboratory techniques. It was concluded from research conducted over the three-year period that all these hypothesised changes were effective both in reducing the load on students' working spaces and in improving students' attitudes to the laboratory course.

Relevance: 30.00%

Abstract:

Video transcoding refers to the process of converting a digital video from one format into another. It is a compute-intensive operation; therefore, transcoding a large number of simultaneous video streams requires a large amount of computing resources. Moreover, to handle different load conditions in a cost-efficient manner, the video transcoding service should be dynamically scalable. Infrastructure as a Service (IaaS) clouds currently offer computing resources, such as virtual machines, under the pay-per-use business model. Thus, IaaS clouds can be leveraged to provide a cost-efficient, dynamically scalable video transcoding service. To use computing resources efficiently in a cloud computing environment, cost-efficient virtual machine provisioning is required to avoid over-utilization and under-utilization of virtual machines. This thesis presents proactive virtual machine resource allocation and de-allocation algorithms for video transcoding in cloud computing. Since users' requests for videos may change at different times, a check is required to see whether the current computing resources are adequate for the video requests. Therefore, work on admission control is also provided. In addition to admission control, temporal resolution reduction is used to avoid jitter in a video. Furthermore, in a cloud computing environment such as Amazon EC2, computing resources are more expensive than storage resources. Therefore, to avoid repeating transcoding operations, a transcoded video needs to be stored for a certain time. Storing all videos for the same amount of time is not cost-efficient either, because popular transcoded videos have a high access rate while unpopular transcoded videos are rarely accessed. This thesis provides a cost-efficient computation and storage trade-off strategy, which stores videos in the video repository as long as it is cost-efficient to store them.
This thesis also proposes video segmentation strategies for bit-rate-reduction and spatial-resolution-reduction video transcoding. The proposed strategies are evaluated using a video transcoder based on the message passing interface (MPI), which uses a coarse-grained parallel processing approach in which the video is segmented at the group-of-pictures level.
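The computation-storage trade-off described above reduces to a break-even rule: keep a cached transcoded video while the transcoding cost it is expected to save exceeds its storage cost. The sketch below illustrates that rule with a simple linear cost model; the function and parameter names are assumptions for illustration, not the thesis's actual strategy:

```python
def keep_in_storage(transcode_cost, storage_cost_per_hour,
                    expected_requests_per_hour):
    """Keep the cached transcoded video while the compute cost it avoids per
    hour (expected requests/hour x cost of one transcode) exceeds the hourly
    cost of storing it. Linear-cost sketch under assumed names, not the
    thesis's exact algorithm."""
    avoided_cost_per_hour = expected_requests_per_hour * transcode_cost
    return avoided_cost_per_hour > storage_cost_per_hour

def break_even_request_rate(transcode_cost, storage_cost_per_hour):
    """Request rate below which deleting the video and re-transcoding on
    demand is cheaper than keeping it stored."""
    return storage_cost_per_hour / transcode_cost
```

A popularity estimate (the request rate) drives the decision, which matches the observation above that popular and unpopular videos should not be stored for the same amount of time.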

Relevance: 30.00%

Abstract:

This doctoral thesis describes the development work performed on the leach and purification sections of the electrolytic zinc plant in Kokkola to increase the efficiency of these two stages, and thus the competitiveness of the plant. Since metallic zinc is a typical bulk product, improving the competitiveness of a plant is mostly a matter of decreasing unit costs. The problems in the leaching were the low recovery of valuable metals from raw materials, and the fact that the available technology offered only complicated and expensive processes to overcome this problem. In the purification, the main problem was the consumption of zinc powder - up to four to six times the stoichiometric demand. This reduced the capacity of the plant, as this zinc is re-circulated through the electrolysis, which is the absolute bottleneck in a zinc plant. Low selectivity gave low-grade, low-value precipitates for further processing to metallic copper, cadmium, cobalt and nickel. Knowledge of the underlying chemistry was poor, and process interruptions causing losses of zinc production were frequent. Studies on leaching comprised the kinetics of ferrite leaching and jarosite precipitation, as well as the stability of jarosite in acidic plant solutions. A breakthrough came with the finding that jarosite could precipitate under conditions where ferrite would leach satisfactorily. Based on this discovery, a one-step process for the treatment of ferrite was developed. In the plant, the new process almost doubled the recovery of zinc from ferrite in the same equipment in which the two-step jarosite process had been operated. In a later expansion of the plant, investment savings were substantial compared to other available technologies. In the solution purification, the key finding was that Co, Ni, and Cu formed specific arsenides in the “hot arsenic zinc dust” step. This was utilized for the development of a three-step purification stage based on fluidized bed technology in all three steps, i.e. the removal of Cu, Co and Cd. Both precipitation rates and selectivity increased, which strongly decreased the zinc powder consumption through substantially suppressed hydrogen gas evolution. Better selectivity improved the value of the precipitates: cadmium, which caused environmental problems in the copper smelter, was reduced from the 1-3% normally reported down to 0.05%, and a cobalt cake with 15% Co was easily produced in laboratory experiments in the cobalt removal. The zinc powder consumption in the plant for a solution containing Cu, Co, Ni and Cd (1000, 25, 30 and 350 mg/l, respectively) was around 1.8 g/l, i.e. only 1.4 times the stoichiometric demand - about a 60% saving in powder consumption. Two processes for direct leaching of the concentrate under atmospheric conditions were developed, one of which was implemented in the Kokkola zinc plant. Compared to the existing pressure leach technology, savings were obtained mostly in investment. The scientific basis for the most important processes and process improvements is given in the doctoral thesis. This includes mathematical modeling and thermodynamic evaluation of the experimental results and the hypotheses developed. Five of the processes developed in this research and development program were implemented in the plant and are still operated. Even though these processes were developed with a focus on the plant in Kokkola, they can also be implemented at low cost in most zinc plants globally, and thus have great significance for the development of the electrolytic zinc process in general.
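The reported powder consumption can be sanity-checked against the stated solution composition under a simplified 1:1 molar cementation assumption (each mole of impurity metal displaced by one mole of zinc, using standard molar masses). This back-of-the-envelope check ignores the arsenide chemistry and hydrogen evolution discussed above, yet reproduces the quoted 1.4x figure:

```python
# Simplified 1:1 molar cementation check: each mole of Cu, Co, Ni or Cd in
# solution is displaced by one mole of zinc (M_Zn = 65.38 g/mol). Standard
# molar masses; arsenide formation and hydrogen evolution are ignored.
MOLAR_MASS = {"Cu": 63.55, "Co": 58.93, "Ni": 58.69, "Cd": 112.41}  # g/mol
CONC_MG_L = {"Cu": 1000, "Co": 25, "Ni": 30, "Cd": 350}             # mg/l

mmol_per_l = sum(CONC_MG_L[m] / MOLAR_MASS[m] for m in MOLAR_MASS)
zn_stoich_g_per_l = mmol_per_l * 65.38 / 1000   # stoichiometric Zn demand
ratio = 1.8 / zn_stoich_g_per_l                 # reported plant consumption
# zn_stoich_g_per_l is about 1.29 g/l, so 1.8 g/l is about 1.4x stoichiometric
```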

Relevance: 30.00%

Abstract:

The amount of biological data has grown exponentially in recent decades. Modern biotechnologies, such as microarrays and next-generation sequencing, are capable of producing massive amounts of biomedical data in a single experiment. As the amount of data is rapidly growing, there is an urgent need for reliable computational methods for analyzing and visualizing it. This thesis addresses this need by studying how to efficiently and reliably analyze and visualize high-dimensional data, especially data obtained from gene expression microarray experiments. First, we study ways to improve the quality of microarray data by replacing (imputing) the missing data entries with estimated values. Missing value imputation is commonly used to make originally incomplete data complete, thus making it easier to analyze with statistical and computational methods. Our novel approach was to use curated external biological information as a guide for the missing value imputation. Secondly, we studied the effect of missing value imputation on downstream data analysis methods such as clustering. We compared multiple recent imputation algorithms on 8 publicly available microarray data sets. It was observed that missing value imputation is indeed a rational way to improve the quality of biological data. The research revealed differences between the clustering results obtained with different imputation methods. On most data sets, the simple and fast k-NN imputation was good enough, but there was also a need for more advanced imputation methods, such as Bayesian Principal Component Analysis (BPCA). Finally, we studied the visualization of biological network data. Biological interaction networks are examples of the outcome of multiple biological experiments, such as those using gene microarray techniques. Such networks are typically very large and highly connected, so there is a need for fast algorithms that produce visually pleasing layouts. A computationally efficient way to produce layouts of large biological interaction networks was developed. The algorithm uses multilevel optimization within the regular force-directed graph layout algorithm.
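As a concrete illustration of the simplest method mentioned above, here is a minimal generic k-NN imputation sketch (not the thesis's algorithm, which additionally uses curated external biological information): each row's missing entries are filled with the mean of its k nearest complete rows, with distances computed over the columns both rows observe:

```python
import numpy as np

def knn_impute(X, k=3):
    """Fill NaNs in each row with the mean of the k nearest complete rows.
    Distance is Euclidean over the row's observed columns. A generic sketch
    that assumes at least k fully observed rows exist."""
    X = X.astype(float).copy()
    complete = X[~np.isnan(X).any(axis=1)]      # fully observed rows
    for i in range(X.shape[0]):
        miss = np.isnan(X[i])
        if not miss.any():
            continue
        obs = ~miss
        d = np.sqrt(((complete[:, obs] - X[i, obs]) ** 2).sum(axis=1))
        nearest = complete[np.argsort(d)[:k]]
        X[i, miss] = nearest[:, miss].mean(axis=0)
    return X
```

For gene expression matrices, rows would be genes (or samples) and the thesis's comparison pits this baseline against model-based methods such as BPCA.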

Relevance: 30.00%

Abstract:

The goal of most clustering algorithms is to find the optimal number of clusters (i.e., the fewest clusters). However, the analysis of molecular conformations of biological macromolecules obtained from computer simulations may benefit from a larger array of clusters. The Self-Organizing Map (SOM) clustering method has the advantage of generating large numbers of clusters, but often gives ambiguous results. In this work, SOMs have been shown to be reproducible when the same conformational dataset is independently clustered multiple times (~100), with the help of Cramér's V index (C_v). The ability of C_v to determine which SOMs are reproduced is generalizable across different SOM source codes. The conformational ensembles produced from MD (molecular dynamics) and REMD (replica exchange molecular dynamics) simulations of the pentapeptide Met-enkephalin (MET) and the 34-amino-acid protein human Parathyroid Hormone (hPTH) were used to evaluate SOM reproducibility. The training length for the SOM has a large impact on reproducibility. Analysis of MET conformational data definitively determined that toroidal SOMs cluster data better than bordered maps, because toroidal maps have no edge effect. For the MATLAB source code, it was determined that the learning rate function should be LINEAR with an initial learning rate factor of 0.05, and that the SOM should be trained with a sequential algorithm. The trained SOMs can be used as a supervised classifier for another dataset. The toroidal 10×10 hexagonal SOMs produced by the MATLAB program for hPTH conformational data produced three sets of reproducible clusters (27%, 15%, and 13% of 100 independent runs), which find partitionings similar to those of smaller 6×6 SOMs. The χ^2 values produced as part of the C_v calculation were used to locate clusters with identical conformational memberships on independently trained SOMs, even those with different dimensions. The χ^2 values could also relate the different SOM partitionings to each other.
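Cramér's V for comparing two clusterings of the same conformations can be computed from the χ² statistic of their contingency table, V = sqrt(χ² / (n · (min(r, c) − 1))). The following is a generic implementation of that standard formula, not the study's actual source code:

```python
import numpy as np

def cramers_v(labels_a, labels_b):
    """Cramér's V between two cluster assignments of the same samples.
    V = 1 means the partitions are identical up to relabeling; V = 0 means
    they are independent. Assumes each partition has at least two clusters."""
    a_vals, a_idx = np.unique(labels_a, return_inverse=True)
    b_vals, b_idx = np.unique(labels_b, return_inverse=True)
    table = np.zeros((len(a_vals), len(b_vals)))
    np.add.at(table, (a_idx, b_idx), 1)          # contingency table
    n = table.sum()
    expected = np.outer(table.sum(axis=1), table.sum(axis=0)) / n
    chi2 = ((table - expected) ** 2 / expected).sum()
    return np.sqrt(chi2 / (n * (min(table.shape) - 1)))
```

Applied to two independently trained SOMs, labels would be the winning map node of each conformation; the per-cell χ² contributions are what the study uses to match clusters across maps.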

Relevance: 30.00%

Abstract:

Naively perceived, the process of evolution is a succession of duplication events and gradual mutations in the genome that lead to changes in the functions and interactions of the proteome. The family of Ras-like guanosine triphosphate hydrolases (GTPases) is a good working model for understanding this fundamental phenomenon, because this protein family contains a limited number of members that differ in functionality and interactions. Overall, we wish to understand how single mutations in GTPases affect cell morphology, and the extent of their impact on asynchronous populations. My master's work aims to classify, in a meaningful way, different phenotypes of the yeast Saccharomyces cerevisiae through the analysis of several morphological criteria of strains expressing mutated and native GTPases. Our approach, based on microscopy and bioinformatic analysis of DIC (differential interference contrast) images, distinguishes the phenotypes of native cells from those of mutants. This method enabled automated detection and characterization of the mutant phenotypes associated with the over-expression of constitutively active GTPases. The constitutively active GTPase mutants Cdc42 Q61L, Rho5 Q91H, Ras1 Q68L, and Rsr1 G12V were successfully analyzed. Indeed, implementing different clustering algorithms makes it possible to analyze data that combine the morphological measurements of native and mutant populations. Our results show that the Fuzzy C-Means algorithm achieves an effective partitioning of native and mutant cells, in which the different cell types are classified according to several cell shape factors obtained from the DIC images. This analysis shows that the Cdc42 Q61L, Rho5 Q91H, Ras1 Q68L, and Rsr1 G12V mutations induce amorphous, elongated, round, and large phenotypes, respectively, represented by distinct shape-factor vectors. These distinctions are observed in different proportions (mutant morphology / native morphology) within the mutant populations. The development of new automated methods for the morphological analysis of native and mutant cells is extremely useful for the study of the GTPase family and of the specific residues that dictate their functions and interaction networks. We can now envisage producing GTPase mutants that invert their function by targeting divergent residues; the functional substitution is then detected at the morphological level thanks to our new quantitative strategy. This type of analysis can also be transposed to other protein families and contribute significantly to the field of evolutionary biology.

Relevance: 30.00%

Abstract:

Pig farming is a major source of antibiotic discharge into the environment through the spreading, on agricultural fields, of manure that contains large quantities of these molecules. These biologically active molecules have been shown to have toxic impacts on ecosystems. They are also suspected of causing health problems and of contributing to bacterial resistance, which can lead to infections in humans that are difficult to treat. Monitoring these substances in the environment is therefore necessary. Many analytical methods have been proposed in the scientific literature to survey these compounds in several types of matrices. However, few of these methods allow the analysis of these contaminants in matrices derived from intensive livestock farming. Moreover, the available analytical methods are often prone to false positives, given the complexity of the matrices studied and the equipment used, and often do not take metabolites and degradation products into account. Finally, the detection levels reached with these methods are sometimes no longer up to date, given the evolution of analytical chemistry and mass spectrometry. With this in mind, new analytical methods were developed to detect and quantify antibiotics in matrices derived from intensive pig farming, with the aim of proposing sensitive, selective, and robust alternative approaches for quantifying these molecules. A first analytical method was developed based on an alternative sample-introduction technique, using a laser diode thermal desorption interface equipped with an atmospheric pressure ionization source, coupled to tandem mass spectrometry. The objective is to provide faster analysis while reaching concentration levels suited to the matrix under study. This analytical technique, coupled with efficient sample preparation, allowed the analysis of several veterinary antibiotics of different classes in manure samples with short analysis times. The detection limits reached are between 2.5 and 8.3 µg kg-1 and are comparable to those obtainable with liquid chromatography in a similar matrix. To analyze a series of tetracyclines simultaneously, a second analytical method using liquid chromatography coupled to high-resolution mass spectrometry (HRMS) was proposed. The use of HRMS was motivated by the fact that this technique is less prone to false positives than the traditional triple quadrupole. Detection limits between 1.5 and 3.6 µg kg-1 were reached in manure samples using a fragmentation analysis mode. Targeted quantification methods are an interesting approach when the presence of contaminants in a sample is suspected. However, contaminants not included in such a targeted method cannot be detected, even at high concentrations. In this context, a non-targeted analytical method was developed to screen for veterinary pharmaceuticals in agricultural effluents using high-resolution mass spectrometry and a versatile polymeric SPE cartridge. This method enabled the identification of antibiotics and pharmaceuticals commonly used in pig farming. Most analytical methods available in the literature focus on the parent compounds, but not on their degradation by-products. The approach used in the second analytical method was therefore extended and applied to other classes of antibiotics to measure the concentrations of several antibiotic residues in the soils and drainage water of an experimental agricultural field. The field soils contained a mixture of antibiotics and their related degradation products at concentrations measured up to 1020 µg kg-1. Some of these compounds travelled through the field's drainage water, where concentrations of up to 3200 ng L-1 were recorded.

Relevance: 30.00%

Abstract:

This doctoral thesis addresses the growing concern about the significant changes in climatic and weather patterns due to aerosol loading that have taken place in the Indo-Gangetic Plain (IGP), which includes most of the northern Indian region. The study region comprises major industrial cities in India (New Delhi, Kanpur, Allahabad, Jamshedpur and Kolkata). The northern and central parts of India are among the most densely populated areas in the world and contain some of the most intensively farmed land. Rapid population growth and urbanization have resulted in an abrupt increase in aerosol concentrations in recent years. The IGP has a major source of coal; therefore most industries, including numerous coal-fired thermal power plants, are located around this region, and they inject copious amounts of aerosols into the atmosphere. Moreover, the transport of dust aerosols from arid locations is prevalent during the dry months, which increases the aerosol loading in the atmosphere. The topography of the region is also ideal for the accumulation of aerosols: it is bounded by the Himalayas in the north, the Thar Desert in the west, the Vindhyan range in the south, and the Brahmaputra ridge in the east. During the non-monsoon months (October to May) the weather in the region is dry with very little rainfall, and surface winds are weak most of the time. The aerosols that reach the region by long-distance transport and from regional sources accumulate under these favourable conditions. The increase in aerosol concentration, arising from a complex combination of aerosol transport and anthropogenic factors mixed with contributions from natural sources, alters the optical properties and the lifetime of clouds in the region. The associated perturbations in the radiative balance have a significant impact on meteorological parameters, and this in turn determines the precipitation-forming process. Therefore, any change in weather that disturbs the normal hydrological pattern is alarming from a socio-economic point of view. Hence, the main focus of this work is to determine the variation in the transport and distribution of aerosols in the region and to understand the interaction of these aerosols with meteorological parameters and cloud properties.

Relevance: 30.00%

Abstract:

Ozone present in the atmosphere not only absorbs biologically harmful ultraviolet radiation but is also an important ingredient of the climate system. The radiative absorption properties of ozone make it a determining factor in the structure of the atmosphere. Ozone in the troposphere has many negative impacts on humans and other living beings. Another significant aspect is the absorption of outgoing infrared radiation by ozone, through which it acts as a greenhouse gas. The variability of ozone in the atmosphere involves many interconnections with incoming and outgoing radiation, temperature, circulation, etc. Hence, ozone forms an important part of chemistry-climate as well as radiative transfer models. This aspect also makes the quantification of ozone more important. The discovery of the Antarctic ozone hole, and of the role of anthropogenic activities in causing it, made it possible to plan and implement the necessary preventive measures. Continuous monitoring of ozone is also necessary to identify the effect of these preventive steps. The reactions involved in the formation and destruction of ozone are influenced significantly by temperature fluctuations in the atmosphere; on the other hand, variations in ozone can change the temperature structure of the atmosphere. The Indian subcontinent is a region of large weather and climate variability, which is evident from the large interannual variability of the monsoon system over the region. Nearly half of the Indian region lies in the tropics. Most ozone is formed in the tropical region and transported to higher latitudes. The formation and transport of ozone can be influenced by changes in solar radiation and various atmospheric circulation features. In addition, industrial activity and vehicular traffic are high owing to the large population, which may increase the production of tropospheric ozone, itself a greenhouse gas. Hence it becomes necessary to monitor atmospheric ozone over this region. This study probes the spatial distribution and temporal evolution of ozone over the Indian subcontinent and discusses the contributing atmospheric parameters.