900 results for Algorithm design and analysis


Relevance: 100.00%

Abstract:

In their article - An Analysis of Stock Market Performance: The Dow Jones Industrial Average and the Three Top Performing Lodging Firms 1982 – 1988 - N. H. Ringstrom, Professor, and Elisa S. Moncarz, Associate Professor, School of Hospitality Management at Florida International University, state at the outset: “An interesting comparison can be made between the Dow Jones Industrial Average and the three top performing, publicly held lodging firms which had $100 million or more in annual lodging revenues. The authors provide that analytical comparison with Prime Motor Inns Inc., the Marriott Corporation, and Hilton Hotels Corporation.” “Based on a criterion of size, only those with $100 million in annual lodging revenues or more resulted in the inclusion of the following six major hotel firms: Prime Motor Inns, Inc., Marriott Corporation, Hilton Hotels Corporation, Ramada Inc., Holiday Corporation and La Quinta Motor Inns, Inc.,” say Professors Ringstrom and Moncarz in framing a discussion grounded in the years 1982 to 1988. The article examines each company’s fiscal and Dow Jones performance for the years in question and presents a detailed analysis of that performance, including graphic analysis. A fairly thorough knowledge of stock market and fiscal evaluation criteria helps in digesting this material. The Ringstrom and Moncarz analysis of Prime Motor Inns Incorporated occupies the first seven pages of the article by itself. Marriott Corporation also occupies a prominent position in the discussion. “Marriott, a giant in the hospitality industry, is huge and continuing to grow. Its 1987 sales were more than $6.5 billion, and its employees numbered over 200,000 individuals, which place Marriott among the 10 largest private employers in the country,” write Ringstrom and Moncarz, framing Marriott as a significant financial player.
“The firm has a fantastic history of growth over the past 60 years, starting in May 1927 with a nine-seat A & W Root Beer stand in Washington, D.C.,” offer the authors, opening Marriott’s portion of the discussion with a brief history lesson. The Marriott firm was officially incorporated as Hot Shoppes Inc. in 1929. As the thesis statement suggests, the performance of these hospitality giants is compared and contrasted directly with the Dow Jones Industrial Average. The authors offer reasons and empirical data to explain the distinctions, which would be difficult to do without delving deeply into corporate financial history, and they willingly do so to help the reader understand the growth, as well as some of the setbacks, of these hospitality-based juggernauts. Ringstrom and Moncarz conclude the article with an extensive overview and analysis of Hilton Hotels Corporation’s performance for the period, perhaps the most fiscally dynamic of the firms presented. “It is interesting to note that Hilton Hotels Corporation maintained a very strong financial position with relatively little debt during the years 1982-1988…the highest among all companies in the study,” the authors note.

Relevance: 100.00%

Abstract:

Accurately assessing the extent of myocardial tissue injury induced by myocardial infarction (MI) is critical to the planning and optimization of MI patient management. With this in mind, this study investigated the feasibility of using combined fluorescence and diffuse reflectance spectroscopy to characterize a myocardial infarct at different stages of its development. An animal study was conducted using twenty male Sprague-Dawley rats with MI. In vivo fluorescence spectra at 337 nm excitation and diffuse reflectance between 400 nm and 900 nm were measured from the heart using a portable fiber-optic spectroscopic system. Spectral acquisition was performed on (1) the normal heart region, (2) the region immediately surrounding the infarct, and (3) the infarcted region, at one, two, three and four weeks into MI development. The spectral data were divided into six subgroups according to the histopathological features associated with various degrees of severity of myocardial tissue injury and various stages of post-infarction myocardial tissue remodeling. Various data processing and analysis techniques were employed to recognize the representative spectral features corresponding to the histopathological features associated with myocardial infarction. The identified spectral features were then used in discriminant analysis to evaluate their effectiveness in classifying tissue injuries induced by MI. It was observed that MI induced significant alterations (p < 0.05) in the diffuse reflectance spectra, especially between 450 nm and 600 nm, from myocardial tissue within the infarcted and surrounding regions. In addition, MI induced a significant elevation in fluorescence intensities at 400 nm and 460 nm from myocardial tissue in the same regions. The extent of these spectral alterations was related to the duration of the infarction.
Using the spectral features identified, an effective tissue injury classification algorithm was developed which produced a satisfactory overall classification result (87.8%). The findings of this research support the concept that optical spectroscopy represents a useful tool to non-invasively determine the in vivo pathophysiological features of a myocardial infarct and its surrounding tissue, thereby providing valuable real-time feedback to surgeons during various surgical interventions for MI.
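The classification step described above can be illustrated with a minimal nearest-centroid sketch: each tissue class is summarized by a centroid of discriminating spectral features (here, reflectance in the 450-600 nm band and fluorescence at 400 nm and 460 nm), and a new spectrum is assigned to the closest class. The class names, feature choices, and all numbers below are illustrative assumptions, not the study's actual data or discriminant model.

```python
# Nearest-centroid classification of spectral feature vectors.
# Centroid values are made-up placeholders for illustration only.
import math

CENTROIDS = {
    "normal":  (0.62, 0.10, 0.08),   # (reflectance 450-600 nm, fluor. 400 nm, fluor. 460 nm)
    "border":  (0.48, 0.22, 0.18),
    "infarct": (0.35, 0.40, 0.33),
}

def classify(features):
    """Assign a feature vector to the class with the nearest centroid."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda c: dist(features, CENTROIDS[c]))

print(classify((0.36, 0.41, 0.30)))  # closest to the "infarct" centroid
```

A real discriminant analysis would learn class boundaries from labeled spectra rather than use fixed centroids, but the assignment logic is the same in spirit.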

Relevance: 100.00%

Abstract:

Mass spectrometry measures the mass of ions according to their mass-to-charge ratio. The technique is used in many fields and can analyze complex mixtures. Imaging mass spectrometry (IMS), a branch of mass spectrometry, allows ions to be analyzed across a surface while preserving the spatial organization of the detected ions. To date, the most studied samples in IMS are plant or animal tissue sections. Among the molecules commonly analyzed by IMS, lipids have attracted great interest. Lipids are involved in disease and in the normal functioning of cells; they form the cell membrane and play several roles, such as regulating cellular events. Given the involvement of lipids in biology and the ability of MALDI IMS to analyze them, we developed analytical strategies for sample handling and for the analysis of large lipid data sets. Lipid degradation is very important in the food industry, and the lipids of tissue sections are likewise at risk of degrading. Their degradation products can introduce artifacts into the IMS analysis, and the loss of lipid species can harm the accuracy of abundance measurements. Since oxidized lipids are also important mediators in the development of several diseases, their faithful preservation becomes critical. In multi-institutional studies, where samples are often transported from one site to another, adapted and validated protocols and measurements of degradation are necessary.
Our main results are as follows: a time-dependent increase of oxidized phospholipids and lysophospholipids under ambient conditions, a decrease in the presence of lipids bearing unsaturated fatty acids, and an inhibitory effect on these phenomena when sections are stored cold under N2. At ambient temperature and atmosphere, phospholipids are oxidized on a time scale typical of a normal IMS preparation (~30 minutes), and they are broken down into lysophospholipids on a time scale of several days. Validating a sample-handling method becomes all the more important when a larger number of samples must be analyzed. Atherosclerosis is a cardiovascular disease caused by the accumulation of cellular material on the arterial wall. Because atherosclerosis is a three-dimensional (3D) phenomenon, serial 3D IMS becomes useful, on the one hand because it can localize molecules along the full length of an atheromatous plaque, and on the other because it can identify molecular mechanisms of plaque development or rupture. Serial 3D IMS faces specific challenges, many of which relate simply to 3D reconstruction and to real-time interpretation of the molecular reconstruction. With these goals in mind, and using lipid IMS to study atherosclerotic plaques from a human carotid artery and a mouse model of atherosclerosis, we developed open-source methods for reconstructing IMS data in 3D. Our methodology provides a means of obtaining high-quality visualizations and demonstrates a strategy for rapid interpretation of 3D IMS data through multivariate segmentation. The analysis of aortas from a mouse model was the starting point for method development, as these are better-controlled samples.
By correlating data acquired in positive and negative ionization modes, 3D IMS demonstrated an accumulation of phospholipids in the aortic sinuses. In addition, AgLDI IMS revealed differential localization of free fatty acids, cholesterol, cholesteryl esters, and triglycerides. Multivariate segmentation of the lipid signals from IMS analysis of a human carotid artery showed a molecular histology correlated with the degree of arterial stenosis. This research helps to better understand the biological complexity of atherosclerosis and may eventually predict the development of certain clinical cases. Colorectal cancer liver metastasis (CRCLM) is the metastatic disease of primary colorectal cancer, one of the most common cancers in the world. The evaluation and prognosis of CRCLM tumors are performed by histopathology, with a margin of error. We used lipid IMS to identify the histological compartments of CRCLM and extract their lipid signatures. By exploiting these molecular signatures, we were able to establish a quantitative, objective histopathological score that correlates with prognosis. Moreover, by dissecting the lipid signatures, we identified individual lipid species that discriminate between the different histologies of CRCLM and could potentially be used as biomarkers for determining response to therapy. Specifically, we found a series of plasmalogens and sphingolipids that distinguish two different types of necrosis (infarct-like necrosis and usual necrosis, ILN and UN, respectively). ILN is associated with response to chemotherapeutic treatments, whereas UN is associated with normal tumor function.
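The multivariate segmentation strategy mentioned above can be sketched as clustering of pixel intensity vectors, here with a toy k-means on two synthetic "tissue regions". In real 3D IMS pipelines the vectors contain thousands of m/z channels and more elaborate multivariate methods are applied, so everything below is an illustrative assumption, not the thesis's actual pipeline.

```python
# Toy k-means: pixels with similar (2-channel) intensity profiles receive
# the same cluster, mimicking segmentation of IMS images into regions.
import random

def kmeans(points, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each pixel to its nearest center (squared distance)
            i = min(range(k), key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            groups[i].append(p)
        # recompute each center as the mean of its group
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

# Two well-separated synthetic "regions" of pixel spectra.
pixels = [(1.0, 0.1), (0.9, 0.2), (1.1, 0.0),
          (0.1, 1.0), (0.2, 0.9), (0.0, 1.1)]
print(sorted(kmeans(pixels, k=2)))
```

With well-separated profiles the two centers converge to the per-region means, which is the behavior segmentation relies on.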

Relevance: 100.00%

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevance: 100.00%

Abstract:

Hyperspectral sensors are being developed for remote sensing applications. These sensors produce huge data volumes which require fast processing and analysis tools. Vertex component analysis (VCA) has become a very useful tool to unmix hyperspectral data. It has been successfully used to determine endmembers and unmix large hyperspectral data sets without any a priori knowledge of the constituent spectra. Compared with other geometric-based approaches, VCA is an efficient method from the computational point of view. In this paper we introduce new developments for VCA: 1) a new signal subspace identification method (HySime) is applied to infer the signal subspace where the data set lives; this step also infers the number of endmembers present in the data set; 2) after projection of the data set onto the signal subspace, the algorithm iteratively projects the data set onto several directions orthogonal to the subspace spanned by the endmembers already determined. The new endmember signature corresponds to the extreme of these projections. The capability of VCA to unmix large hyperspectral scenes (real or simulated) with low computational complexity is also illustrated.
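Step 2) above, iteratively projecting away the subspace spanned by the endmembers already found and taking the extreme of what remains as the next endmember, can be sketched as follows. This is a simplified orthogonal-residual variant on toy 3-band data, not the full VCA algorithm (which, among other things, projects onto random orthogonal directions and works inside the HySime-identified subspace).

```python
# Greedy endmember extraction by orthogonal projection: at each step,
# project out the span of the chosen endmembers and pick the pixel with
# the largest remaining energy. Toy 3-band pixels; pure pixels are the
# geometric extremes and should be recovered.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project_out(v, basis):
    # subtract v's components along an orthonormal basis (Gram-Schmidt step)
    for b in basis:
        c = dot(v, b)
        v = tuple(x - c * y for x, y in zip(v, b))
    return v

def normalize(v):
    n = dot(v, v) ** 0.5
    return tuple(x / n for x in v)

def pick_endmembers(pixels, n):
    basis, endmembers = [], []
    for _ in range(n):
        def residual_energy(p):
            r = project_out(p, basis)
            return dot(r, r)
        e = max(pixels, key=residual_energy)   # extreme of the projection
        endmembers.append(e)
        basis.append(normalize(project_out(e, basis)))
    return endmembers

# Three pure spectra plus two mixed pixels.
pixels = [(0.5, 0.5, 0.0), (1.0, 0.0, 0.0), (0.3, 0.3, 0.4),
          (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
print(pick_endmembers(pixels, 3))  # → [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)]
```

Mixed pixels never win because a convex mixture always has less residual energy than the purest pixel in any direction, which is the geometric intuition VCA exploits.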

Relevance: 100.00%

Abstract:

Carbonic anhydrases are enzymes found ubiquitously across organisms; they catalyze the hydration of carbon dioxide to form bicarbonate and a proton, and the reverse reaction. They are crucial in respiration, bone resorption, pH regulation, ion transport, and photosynthesis in plants. Of the five classes of carbonic anhydrase (α, β, γ, δ, ζ), this study focused on the α carbonic anhydrases. This class of CAs comprises 16 subfamilies in mammals, including 3 catalytically inactive enzymes known as Carbonic Anhydrase Related Proteins; their inactivity is due to the loss of one or more histidine residues in the active site. The aim of this thesis was an evolutionary analysis of carbonic anhydrase sequences from organisms spanning back to the Cambrian age, carried out in two phases. The first phase was sequence collection, drawing on many biological sequence databases; its scope included sequence alignments and analysis of the sequences, both manually and in automated form using several analysis tools. The second phase was phylogenetic analysis and exploration of the subcellular location of the proteins, which was key to the evolutionary analysis. Through these methods the desired result was accomplished: certain thought-provoking sequences were encountered and analyzed thoroughly, and the phylogenetics yielded interesting results that bolster previous findings as well as new ones, laying the bedrock for future, more intensive studies.

Relevance: 100.00%

Abstract:

The challenge of detecting a change in the distribution of data is a sequential decision problem that is relevant to many engineering solutions, including quality control and machine and process monitoring. This dissertation develops techniques for exact solution of change-detection problems with discrete time and discrete observations. Change-detection problems are classified as Bayes or minimax based on the availability of information on the change-time distribution. A Bayes optimal solution uses prior information about the distribution of the change time to minimize the expected cost, whereas a minimax optimal solution minimizes the cost under the worst-case change-time distribution. Both types of problems are addressed. The most important result of the dissertation is the development of a polynomial-time algorithm for the solution of important classes of Markov Bayes change-detection problems. Existing techniques for epsilon-exact solution of partially observable Markov decision processes have complexity exponential in the number of observation symbols. A new algorithm, called constellation induction, exploits the concavity and Lipschitz continuity of the value function, and has complexity polynomial in the number of observation symbols. It is shown that change-detection problems with a geometric change-time distribution and identically- and independently-distributed observations before and after the change are solvable in polynomial time. Also, change-detection problems on hidden Markov models with a fixed number of recurrent states are solvable in polynomial time. A detailed implementation and analysis of the constellation-induction algorithm are provided. Exact solution methods are also established for several types of minimax change-detection problems. Finite-horizon problems with arbitrary observation distributions are modeled as extensive-form games and solved using linear programs. 
Infinite-horizon problems with linear penalty for detection delay and identically- and independently-distributed observations can be solved in polynomial time via epsilon-optimal parameterization of a cumulative-sum procedure. Finally, the properties of policies for change-detection problems are described and analyzed. Simple classes of formal languages are shown to be sufficient for epsilon-exact solution of change-detection problems, and methods for finding minimally sized policy representations are described.
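The cumulative-sum procedure mentioned above can be sketched in a few lines: the statistic accumulates log-likelihood ratios of the post-change versus pre-change observation distributions, resets at zero, and declares a change once it crosses a threshold. The Gaussian model, threshold, and data below are illustrative assumptions, not the dissertation's parameterization.

```python
# Minimal CUSUM for a mean shift between two Gaussian distributions.
def cusum(observations, mu0, mu1, sigma=1.0, h=5.0):
    """Return the index at which a change is declared, or None."""
    s = 0.0
    for t, x in enumerate(observations):
        # log-likelihood ratio of N(mu1, sigma) vs N(mu0, sigma)
        llr = ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)
        s = max(0.0, s + llr)   # reset at zero: the CUSUM recursion
        if s > h:
            return t
    return None

# Synthetic stream whose mean shifts from 0 to 2 at index 10.
data = [0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.1, -0.3, 0.0, 0.1,
        2.1, 1.8, 2.2, 1.9, 2.0, 2.1]
print(cusum(data, mu0=0.0, mu1=2.0))  # → 12
```

The detection delay (here 2 samples past the change) trades off against false alarms through the threshold h, which is the quantity the epsilon-optimal parameterization tunes.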

Relevance: 100.00%

Abstract:

Betanodavirus infections have a significant impact on aquaculture sectors in Australia through direct losses and trade restrictions. The giant grouper, Epinephelus lanceolatus, is a high-value, fast-growing species with significant aquaculture potential. Following reports of subacute to chronic mortalities from a commercial aquaculture facility in northern Queensland, viral nervous necrosis in the affected fish was confirmed using RT-qPCR, followed by virus isolation using the SSN-1 cell line. The RNA1 and RNA2 segments were sequenced and the nucleotide sequences were compared with betanodavirus sequences from GenBank. Phylogenetic analysis revealed that both sequences clustered with those of the red spotted grouper nervous necrosis virus genotype and showed high sequence identity to virus sequences affecting other grouper species. This is the first report confirming betanodavirus infection in E. lanceolatus from Australia, with successful isolation of the virus in a cell culture system and analysis of nearly full-length RNA1 and RNA2 sequences.

Relevance: 100.00%

Abstract:

End-customer quality complaints directly related to automotive components have shown a negative trend at the European level for the entire automotive industry. This research therefore concentrates on the most important items of the Pareto chart, seeking to understand the failure types and mechanisms involved and the link between design and process parameters, and concludes with the development of a tool long desired by the company that hosted this project - a European methodology for classifying terminal defects - together with a list of real improvement opportunities based on measurement and analysis of actual data. Through this terminal-defect classification methodology, considered a valuable asset, all other companies of the YAZAKI group will be able to characterize terminals as brittle or ductile and thus set in motion, more efficiently, the different internal procedures for safeguarding the components, improving manufacturing efficiency. From brief observation alone, nothing can be said in an absolute sense about the failure causes: base materials, design, handling during manufacture and storage, and the cold work performed by plastic deformation all play an important role. It was expected, however, that the failure was due to a combination of factors rather than a single cause. To acquire greater knowledge about this problem, unexplored by the company before this study began, a thorough review of the existing literature on the subject was conducted, real production sites were visited, and actual parts were tested in a laboratory environment.
To answer many of the major questions raised throughout the investigation, theoretical concepts from the literature review were used extensively, with a view to understanding the relationships between the parameters concerned. It should be noted that technical studies on copper and its alloys are genuinely hard to find, and not all the desirable information is available. This investigation was performed as a YAZAKI Europe Limited company project and as a Master's thesis for Instituto Superior de Engenharia do Porto, conducted over 9 months in 2012/2013.

Relevance: 100.00%

Abstract:

This dissertation research points out major challenging problems with current Knowledge Organization (KO) systems, such as subject gateways or web directories: (1) the current systems use traditional knowledge organization systems based on controlled vocabulary which is not very well suited to web resources, and (2) information is organized by professionals not by users, which means it does not reflect intuitively and instantaneously expressed users’ current needs. In order to explore users’ needs, I examined social tags which are user-generated uncontrolled vocabulary. As investment in professionally-developed subject gateways and web directories diminishes (support for both BUBL and Intute, examined in this study, is being discontinued), understanding characteristics of social tagging becomes even more critical. Several researchers have discussed social tagging behavior and its usefulness for classification or retrieval; however, further research is needed to qualitatively and quantitatively investigate social tagging in order to verify its quality and benefit. This research particularly examined the indexing consistency of social tagging in comparison to professional indexing to examine the quality and efficacy of tagging. The data analysis was divided into three phases: analysis of indexing consistency, analysis of tagging effectiveness, and analysis of tag attributes. Most indexing consistency studies have been conducted with a small number of professional indexers, and they tended to exclude users. Furthermore, the studies mainly have focused on physical library collections. This dissertation research bridged these gaps by (1) extending the scope of resources to various web documents indexed by users and (2) employing the Information Retrieval (IR) Vector Space Model (VSM) - based indexing consistency method since it is suitable for dealing with a large number of indexers. 
As a second phase, an analysis of tagging effectiveness, in terms of tagging exhaustivity and tag specificity, was conducted to ameliorate the drawbacks of consistency analysis based only on quantitative measures of vocabulary matching. Finally, to investigate tagging patterns and behaviors, a content analysis of tag attributes was conducted based on the FRBR model. The findings revealed greater consistency across all subjects among taggers than between the two groups of professionals. Examination of the exhaustivity and specificity of social tags provided insights into particular characteristics of tagging behavior and its variation across subjects. To further investigate the quality of tags, a Latent Semantic Analysis (LSA) was conducted to determine the extent to which tags are conceptually related to professionals’ keywords; tags of higher specificity tended to have higher semantic relatedness to professionals’ keywords, leading to the conclusion that a term’s power as a differentiator is related to its semantic relatedness to documents. The findings on tag attributes identified important bibliographic attributes of tags beyond describing the subjects or topics of a document, and showed that tags have essential attributes matching those defined in FRBR. Furthermore, in specific subject areas, taggers exhibited different tagging behaviors, with distinctive features and tendencies on web documents characterizing digital heterogeneous media resources.
These results have led to the conclusion that there should be an increased awareness of diverse user needs by subject in order to improve metadata in practical applications. This dissertation research is the first necessary step to utilize social tagging in digital information organization by verifying the quality and efficacy of social tagging. This dissertation research combined both quantitative (statistics) and qualitative (content analysis using FRBR) approaches to vocabulary analysis of tags which provided a more complete examination of the quality of tags. Through the detailed analysis of tag properties undertaken in this dissertation, we have a clearer understanding of the extent to which social tagging can be used to replace (and in some cases to improve upon) professional indexing.
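The VSM-based indexing-consistency measure used in the first phase can be sketched as a cosine similarity between two indexers' term vectors; the raw-count weighting and the example terms below are illustrative assumptions, not the dissertation's actual weighting scheme or data.

```python
# Cosine-based indexing consistency: each indexer's terms become a
# weighted vector (raw counts here), and consistency is the cosine of
# the angle between the two vectors.
import math
from collections import Counter

def cosine_consistency(terms_a, terms_b):
    va, vb = Counter(terms_a), Counter(terms_b)
    shared = set(va) & set(vb)
    num = sum(va[t] * vb[t] for t in shared)
    den = math.sqrt(sum(c * c for c in va.values())) * \
          math.sqrt(sum(c * c for c in vb.values()))
    return num / den if den else 0.0

tagger = ["python", "tutorial", "programming"]          # hypothetical tags
professional = ["python", "programming", "education"]   # hypothetical keywords
print(round(cosine_consistency(tagger, professional), 3))  # → 0.667
```

Unlike simple overlap ratios, the vector formulation extends naturally to weighted terms and to averaging over many indexers, which is why it suits large indexer populations.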

Relevance: 100.00%

Abstract:

Ancient starch analysis is a microbotanical method in which starch granules are extracted from archaeological residues and the botanical source is identified. The method is an important addition to established palaeoethnobotanical research, as it can reveal ancient microremains of starchy staples such as cereal grains and seeds. In addition, starch analysis can detect starch originating from underground storage organs, which are rarely discovered using other methods. Because starch is tolerant of acidic soils, unlike most organic matter, starch analysis can be successful in northern boreal regions. Starch analysis has potential in the study of cultivation, plant domestication, wild plant usage and tool function, as well as in locating activity areas at sites and discovering human impact on the environment. The aim of this study was to experiment with the starch analysis method in Finnish and Estonian archaeology by building a starch reference collection from cultivated and native plant species, by developing sampling, measuring and analysis protocols, by extracting starch residues from archaeological artefacts and soils, and by identifying their origin. The purpose of this experiment was to evaluate the suitability of the method for the study of subsistence strategies in prehistoric Finland and Estonia. A total of 64 archaeological samples were analysed from four Late Neolithic sites in Finland and Estonia, with radiocarbon dates ranging between 2904 calBC and 1770 calBC. The samples yielded starch granules, which were compared with the starch reference collection and descriptions in the literature. Cereal-type starch was identified from the Finnish Kiukainen culture site and from the Estonian Corded Ware site. The samples from the Finnish Corded Ware site yielded underground storage organ starch, which may be the first evidence of the use of rhizomes as food in Finland. No cereal-type starch was observed. 
Although the sample sets were limited, the experiment confirmed that starch granules have been preserved well in the archaeological material of Finland and Estonia, and that differences between subsistence patterns, as well as evidence of cultivation and wild plant gathering, can be discovered using starch analysis. By collecting large sample sets and addressing the three most important issues – preventing contamination, collecting adequate references and understanding taphonomic processes – starch analysis can substantially contribute to research on ancient subsistence in Finland and Estonia.



The kinematic structure of planar mechanisms concerns the attributes determined exclusively by the joining pattern among the links forming a mechanism. System group classification is central to the kinematic structure and consists of determining a sequence of kinematically and statically independent simple chains, which form a modular basis for the kinematic and force analysis of the mechanism. This article presents a novel graph-based algorithm for the structural analysis of planar mechanisms with closed-loop kinematic structure, which determines a sequence of modules (Assur groups) representing the topology of the mechanism. A computational complexity analysis and a proof of correctness of the implemented algorithm are provided, and a case study illustrates the results of the devised method.
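The module decomposition described above can be sketched as a graph-peeling procedure: links become nodes, joints become edges, and the simplest Assur groups (class-II dyads: two links attached to the rest of the mechanism by exactly three joints) are removed iteratively until only the frame and the driven links remain. The sketch below is a hypothetical illustration of this idea, not the article's algorithm; the names `mobility` and `peel_dyads`, the revolute-joints-only assumption, and the four-bar example are all assumptions.

```python
# Sketch: dyad (class-II Assur group) peeling on a planar linkage graph,
# assuming single-DOF (revolute) joints only.
from collections import defaultdict

def mobility(n_links, n_joints):
    # Grübler/Kutzbach criterion for planar mechanisms with 1-DOF joints
    return 3 * (n_links - 1) - 2 * n_joints

def peel_dyads(joints, driven):
    """Repeatedly remove dyads: pairs of links joined to each other and
    each attached to the remaining mechanism by one more joint
    (i.e., both links have degree 2 in the current graph)."""
    adj = defaultdict(set)
    for a, b in joints:
        adj[a].add(b)
        adj[b].add(a)
    links = set(adj) - {'ground'} - set(driven)
    sequence = []
    changed = True
    while changed and links:
        changed = False
        for u in sorted(links):
            for v in sorted(adj[u] & links):
                if len(adj[u]) == 2 and len(adj[v]) == 2:
                    sequence.append((u, v))
                    for w in (u, v):        # detach the dyad's two links
                        for x in adj[w]:
                            adj[x].discard(w)
                        del adj[w]
                    links -= {u, v}
                    changed = True
                    break
            if changed:
                break
    return sequence  # modules in the order they can be analyzed (reversed for assembly)
```

For a four-bar linkage driven at the crank, the single dyad (coupler, rocker) is peeled off, reproducing the familiar result that the four-bar consists of the driven crank plus one class-II module.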


The question of why most health policies do not achieve their intended results continues to receive considerable attention in the literature, in light of the recognized gap between policy as intent and policy as practice, which calls for substantial research to understand the factors that improve policy implementation. Although there is substantial work explaining why policies achieve or fail to achieve their intended outcomes, there are few case studies illustrating how to analyze policies from a methodological perspective. In this article, we report and discuss how a mixed qualitative research method was applied to analyze maternal and child health policies in Malawi. For the purposes of this article, we do not report research findings; instead we focus our discussion on the methodology of the study and draw lessons for policy analysis research. We base our discussion on our experiences from a study in which we analyzed maternal and child health policies in Malawi over the period from 1964 to 2008. Noting the multifaceted nature of maternal and child health policies, we adopted a mixed qualitative research method in which a number of data collection methods were employed. This approach allowed us to capture different perspectives on maternal and child health policies in Malawi and to offset the weaknesses of each individual method, especially with respect to data validity. The research suggested that the multidimensional nature of maternal and child health policies, like that of other health policies, calls for a combination of research designs as well as a variety of methods of data collection and analysis. In addition, we suggest that, as an emerging research field, health policy analysis will benefit most from case study designs because they provide rich experience of the actual policy context.


Background: Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process and linking different analysis modules together under a single interface would simplify many microarray analysis tasks. Results: We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion: ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.
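Consensus clustering, one of the ensemble techniques mentioned above, can be illustrated with a minimal sketch: several clustering runs vote through a co-association matrix, and sample pairs that land in the same cluster in a majority of runs are merged into the final groups. This is a generic, pure-Python illustration of the principle, not ArrayMining.net's implementation; the toy 1-D "expression" values, the helper `kmeans_1d`, and the 0.5 majority threshold are assumptions for illustration.

```python
# Sketch: consensus clustering over repeated k-means runs via a
# co-association matrix (pure Python, toy 1-D data).
import random

def kmeans_1d(xs, k, seed):
    """Basic k-means on a list of floats; one base clusterer of the ensemble."""
    rng = random.Random(seed)
    centers = rng.sample(xs, k)
    for _ in range(20):
        labels = [min(range(k), key=lambda j: abs(x - centers[j])) for x in xs]
        for j in range(k):
            members = [x for x, lab in zip(xs, labels) if lab == j]
            if members:
                centers[j] = sum(members) / len(members)
    return labels

def consensus_clusters(xs, k=2, runs=10, threshold=0.5):
    n = len(xs)
    co = [[0] * n for _ in range(n)]          # co-association counts
    for seed in range(runs):
        labels = kmeans_1d(xs, k, seed)
        for i in range(n):
            for j in range(n):
                if labels[i] == labels[j]:
                    co[i][j] += 1
    # merge pairs that co-cluster in a majority of runs (union-find)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if co[i][j] / runs > threshold:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(sorted(g) for g in groups.values())
```

The same voting scheme extends directly to heterogeneous base clusterers or to runs on different normalized data sets, which is the cross-study setting the abstract describes.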