936 results for Best available techniques


Relevance:

30.00%

Publisher:

Abstract:

Introduction: Coordination through CVHL/BVCS gives Canadian health libraries access to information technology they could not offer individually, thereby enhancing the library services offered to Canadian health professionals. An example is the portal being developed. Portal best practices are of increasing interest (usability.gov; Wikipedia portals; JISC subject portal project; Stanford clinical portals), but conclusive research is not yet available. This paper will identify best practices for a portal bringing together knowledge for Canadian health professionals supported through a network of libraries.

Description: The portal for Canadian health professionals will include capabilities such as:
• Authentication
• Question referral
• Specialist “branch libraries”
• Integration of commercial resources, web resources and health systems data
• Cross-resource search engine
• Infrastructure to enable links from EHR and decision support systems
• Knowledge translation tools, such as highlighting of best evidence

Best practices will be determined by studying the capabilities of existing portals, including those of consortia/networks and individual institutions, and through a literature review.

Outcomes: Best practices in portals will be reviewed. The collaboratively developed Virtual Library, currently the heart of cvhl.ca, is a unique database collecting high-quality, free web documents and sites relevant to Canadian health care. The evident strengths of the Virtual Library will be discussed in light of best practices.

Discussion: Identification of best practices will support cost-benefit analysis of options and provide direction for CVHL/BVCS. Open discussion with stakeholders (libraries and professionals), informed by this review, will lead to adoption of the best technical solutions supporting Canadian health libraries and their users.

Relevance:

30.00%

Publisher:

Abstract:

Fuchs endothelial corneal dystrophy (FECD) is a disease of the corneal endothelium. Its pathogenesis is poorly understood, and no medical treatment is effective. The only existing treatment is surgical and consists of replacing the pathological endothelium with a healthy endothelium from eye-bank corneas. Surgical treatment, however, carries a 10% rate of immunological rejection. Experimental models are therefore needed to better understand this disease and to develop alternative treatments. The general aim of this thesis was to develop an experimental model of FECD using tissue engineering. This was accomplished in three steps.

1) First, the corneal endothelium was reconstructed by tissue engineering using cultured endothelial cells from patients with FECD, and this model was characterized in vitro. Briefly, FECD corneal endothelial cells were isolated from Descemet membranes harvested during corneal transplants. Cells at the second or third passage were then seeded on a previously decellularized human cornea. After 2 weeks of culture, the tissue-engineered FECD corneal endothelia (n = 6) were evaluated using histology, transmission electron microscopy and immunostaining of various proteins. The tissue-engineered FECD corneal endothelia formed a monolayer of polygonal cells well adhered to the Descemet membrane. Immunostaining demonstrated the presence of proteins important for corneal endothelial function, such as Na+-K+/ATPase α1 and Na+/HCO3-, as well as a weak and uniform expression of clusterin.

2) Two surgical techniques (DSAEK, for "Descemet stripping automated endothelial keratoplasty", and penetrating keratoplasty) were compared for corneal transplantation in the feline animal model. The parameters compared included surgical challenges and clinical outcomes. DSAEK proved difficult to perform in the feline model, with rapid fibrin formation observed in all DSAEK cases (n = 5).

3) Finally, the in vivo functionality of the tissue-engineered FECD corneal endothelia was evaluated (n = 7). In vivo evaluations included transparency, pachymetry and optical coherence tomography. Post-mortem evaluations included endothelial cell morphometry, transmission electron microscopy and immunostaining of function-related proteins. After transplantation, pachymetry progressively decreased and transparency progressively increased. Seven days after transplantation, 6 of the 7 grafts were clear. Transmission electron microscopy showed sub-endothelial fibrillar material in all tissue-engineered FECD grafts. The tissue-engineered endothelia also expressed the Na+-K+/ATPase and Na+/HCO3- proteins.

In summary, this thesis demonstrates that corneal endothelial cells from late-stage FECD can be used to reconstruct a corneal endothelium by tissue engineering. Penetrating keratoplasty was shown to be the most suitable procedure for transplanting these reconstructed tissues into the eye of the feline animal model. The restoration of corneal thickness and transparency demonstrates that the reconstructed FECD grafts are functional in vivo. These new FECD models demonstrate a rehabilitation of FECD cells, making it possible to use tissue engineering to reconstruct functional endothelia from dystrophic cells. Potential applications are numerous, including pathophysiological and pharmacological studies.

Relevance:

30.00%

Publisher:

Abstract:

The statistical analyses in this thesis were performed using the MatchIt package, available in the R statistical analysis environment.
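For context: MatchIt's default procedure is 1:1 greedy nearest-neighbour matching on a propensity score estimated by logistic regression. As a language-neutral illustration only (this is not the thesis's code, and all variable and function names below are hypothetical), the core of that default procedure can be sketched as:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def nearest_neighbor_match(X, treated):
    """1:1 greedy nearest-neighbour matching on the propensity score,
    without replacement -- a sketch of MatchIt's default strategy."""
    # propensity score: P(treated | covariates) via logistic regression
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
    t_idx = np.flatnonzero(treated == 1)
    c_idx = list(np.flatnonzero(treated == 0))
    pairs = []
    for t in t_idx:
        # pick the remaining control whose score is closest to this treated unit
        j = int(np.argmin(np.abs(ps[c_idx] - ps[t])))
        pairs.append((t, c_idx.pop(j)))
    return pairs
```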

Relevance:

30.00%

Publisher:

Abstract:

Network design problems have received particular attention and have been widely studied owing to their many applications in different fields, such as transportation and telecommunications. In this thesis we address the network design problem with capacity-expansion costs. It consists of installing a set of facilities on a network in order to satisfy demand while respecting capacity constraints, each arc being allowed to carry several facilities. The objective is to minimize the variable costs of transporting products and the fixed costs of installing facilities or expanding their capacity. The method we propose to solve this problem is based on integer linear programming techniques, notably column generation and cutting planes. These methods are embedded in a general branch-and-bound algorithm based on the linear relaxation. We tested our method on four groups of instances of different sizes and compared it with CPLEX, one of the best solvers for optimization problems, as well as with an existing method from the literature combining exact and heuristic methods. Our method outperformed both, especially on very large instances.
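The abstract does not state the thesis's exact formulation; as a point of reference, a standard arc-based formulation of multicommodity capacitated network design with multiple installable facilities per arc reads as follows, where x_{ij}^k is the flow of commodity k on arc (i,j), y_{ij} is the integer number of facilities installed on that arc, c_{ij}^k is the unit flow cost, f_{ij} the fixed cost per facility, u_{ij} the capacity each facility provides, and d_i^k the net supply of commodity k at node i:

```latex
\begin{aligned}
\min \quad & \sum_{k}\sum_{(i,j)} c_{ij}^{k}\, x_{ij}^{k} \;+\; \sum_{(i,j)} f_{ij}\, y_{ij} \\
\text{s.t.} \quad & \sum_{j:(i,j)} x_{ij}^{k} \;-\; \sum_{j:(j,i)} x_{ji}^{k} \;=\; d_{i}^{k} && \forall\, i,\ k \\
& \sum_{k} x_{ij}^{k} \;\le\; u_{ij}\, y_{ij} && \forall\, (i,j) \\
& x_{ij}^{k} \ge 0, \qquad y_{ij} \in \mathbb{Z}_{+} && \forall\, (i,j),\ k
\end{aligned}
```

Column generation prices out flow variables on demand, and cutting planes strengthen the weak linear relaxation induced by the capacity constraints.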

Relevance:

30.00%

Publisher:

Abstract:

Despite technological progress and our growing pharmaceutical and medical knowledge, drug development remains a difficult, expensive, long and very risky process. This process deserves improvement to facilitate the development of new treatments. To that end, this thesis aims to demonstrate the usefulness of advanced pharmacokinetic (PK) principles and tools, both current and new. These tools serve to answer important questions efficiently during drug development, thereby saving time and costs.

The first part of the thesis concerns the use of modeling and simulation, and the creation of a new model, to establish bioequivalence between two formulations of sodium ferric gluconate complex in sucrose solution for injection. Compared with current methods, the proposed approach frees itself from several assumptions and requires less data. The technique is scientifically robust while saving time and costs. Thus, although developed for generic products, it may also prove useful in the development of innovative and "biosimilar" molecules.

The second part describes the use of modeling to better understand and quantify the factors influencing the PK and pharmacodynamics (PD) of a new therapeutic protein, pegloticase. The analysis showed that no dose adjustment was necessary, and these results are included in the official product monograph. Thanks to modeling, important questions about the dosing of a drug could be answered without new studies or additional evaluations in patients; the use of this tool therefore reduced expenses without prolonging development. The model developed in this analysis could serve to better understand other therapeutic proteins, including their immunogenic properties.

The last part demonstrates the usefulness of modeling and simulation in selecting dosing regimens of an antibiotic (TP-434) for a Phase 2 study. Data from Phase 1 studies were modeled as they became available, in order to build a model describing the pharmacokinetic profile of TP-434. This modeling process exemplified the exploratory and confirmatory cycles described by Sheiner. Then, based on the PK/PD relationships of an antibiotic of the same class, simulations were performed with the final PK model to propose new dosing regimens likely to be effective in patients, even before conducting studies. This rational approach led to the use of dosing regimens with an increased likelihood of efficacy, without unnecessary dosing of patients, and spared costly additional studies or cohorts that would have prolonged development. This analysis is the first to demonstrate the application of these techniques in selecting antibiotic doses for a Phase 2 study.

In conclusion, this research demonstrates that advanced PK tools such as modeling and simulation, together with the development of new models, can answer essential questions during drug development efficiently, and often more robustly, while reducing costs and saving time.
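To make the notion of PK modeling and simulation concrete, here is a minimal sketch of a repeated-dose simulation under a generic one-compartment model with first-order absorption (the Bateman equation); the parameter values are illustrative assumptions only and bear no relation to the products studied in the thesis:

```python
import numpy as np
import matplotlib.pyplot as plt

# Illustrative one-compartment parameters (assumed, not from the thesis):
dose, ka, ke, V = 500.0, 1.0, 0.2, 30.0   # mg, 1/h, 1/h, L
F, tau, n_doses = 0.9, 12.0, 6            # bioavailability, interval (h), doses

t = np.linspace(0, n_doses * tau + 24, 1000)
conc = np.zeros_like(t)
for i in range(n_doses):
    td = t - i * tau          # time since the i-th dose
    active = td > 0
    # Bateman equation, summed by superposition over repeated doses
    conc[active] += (F * dose * ka / (V * (ka - ke))) * (
        np.exp(-ke * td[active]) - np.exp(-ka * td[active])
    )

plt.plot(t, conc)
plt.xlabel("time (h)")
plt.ylabel("concentration (mg/L)")
plt.show()
```

Dosing-regimen questions are then explored by re-running such simulations over candidate doses and intervals and comparing the resulting exposure against a PK/PD target.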

Relevance:

30.00%

Publisher:

Abstract:

The use of subjective measures in epidemiology has intensified recently, notably with the increasingly explicit desire to integrate subjects' perceptions of their health into the study of diseases and the evaluation of interventions. Psychometrics encompasses the statistical methods used to construct questionnaires and to analyze the resulting data. The aim of this thesis was to explore various methodological problems raised by the use of psychometric techniques in epidemiology. Three empirical studies are presented, concerning: 1) the validation phase of the instrument: the objective was to develop, using simulated data, a sample-size calculation tool for scale validation in psychiatry; 2) the mathematical properties of the resulting measure: the objective was to compare the performance of a questionnaire's minimal clinically important difference computed on cohort data, either within classical test theory (CTT) or within item response theory (IRT); 3) its use in a longitudinal design: the objective was to compare, using simulated data, the performance of a statistical method for analyzing the longitudinal evolution of a subjective phenomenon measured with CTT or IRT, in particular when some of the items available for measurement differed at each time point. Finally, directed acyclic graphs were used to discuss, in light of the results of these three studies, the notion of information bias when subjective measures are used in epidemiology.
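To fix ideas (the abstract does not specify which IRT model was used; the two-parameter logistic model is shown here as a common choice), the two frameworks model the data at different levels: CTT models the observed total score, while IRT models each item response through a latent trait:

```latex
\text{CTT:}\quad X = T + E
\qquad\qquad
\text{IRT (2PL):}\quad
P(X_{ij} = 1 \mid \theta_i) = \frac{1}{1 + \exp\!\big(-a_j(\theta_i - b_j)\big)}
```

where X is the observed score, T the true score and E the measurement error; and where θ_i is the latent trait of subject i, a_j the discrimination and b_j the difficulty of item j.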

Relevance:

30.00%

Publisher:

Abstract:

Internship report presented to the Faculty of Medicine in partial fulfillment of the requirements for the degree of Master of Applied Science (M.Sc.A.) in Biomedical Engineering, Clinical Engineering option.

Relevance:

30.00%

Publisher:

Abstract:

Sharing information with those who need it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that well-organized schemes for retrieval, and also for discovery, are imperative. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron.

The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the required insights. This is manifested in the Election Counting and Reporting Software (ECRS) system, a software system that is essentially distributed in nature, designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports.

Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them prone to collapse when minor faults occur. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture. The results of the experiments conducted, and their analysis, show that such an architecture helps keep the different components of the software intact and impermeable to internal or external faults.

The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. A second empirical study investigated which of the two prominent markup languages, HTML and XML, is better suited for the incorporation of infotrons. A comparative study of 200 documents in HTML and XML was undertaken; the result was in favor of XML.

The concepts of the infotron and the infotron dictionary were then applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and distills the information required to satisfy the needs of the information discoverer from the documents available to it (its information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interact with the multiple infotron dictionaries maintained in the system.

To demonstrate the working of IDS, and to achieve discovery without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, leading to an information discovery service. IDLIS shows IDS in action and proves that any legacy system can be effectively augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
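The thesis does not reproduce the internals of IDS here; purely as a toy illustration of the clue-driven lookup idea (every name and data structure below is hypothetical), an infotron dictionary can be thought of as mapping a conceptual clue to its signal terms, against which documents in the information space are scored:

```python
# Toy sketch: an "infotron dictionary" maps a conceptual clue to the terms
# that signal it; documents are ranked by how many of those terms they contain.
# All names and data are hypothetical, not taken from the thesis.
infotron_dictionary = {
    "election_result": {"constituency", "votes", "margin", "elected"},
    "book_lending":    {"borrower", "due", "issued", "return"},
}

def discover(clue, documents):
    terms = infotron_dictionary[clue]
    scored = [(sum(t in doc.lower() for t in terms), doc) for doc in documents]
    # keep only documents that matched at least one signal term, best first
    return [doc for score, doc in sorted(scored, reverse=True) if score > 0]

docs = ["Votes counted; the candidate was elected with a margin of 4000.",
        "Book issued to borrower; return is due next week."]
print(discover("election_result", docs))
```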

Relevance:

30.00%

Publisher:

Abstract:

Ship recycling is considered the best means to dispose of an obsolete ship. The current state of the art of technology, combined with the global maritime industrial sector's demand for sustainable development, has transformed the erstwhile 'ship breaking' scrap business into a modern industry that dismantles ships and recycles/reuses the dismantled products in a supply chain for the pre-owned product market, following the principles of recycling. Industries will have to formulate a set of best practices and blend them with their engineering activities to produce better-quality products, improve productivity and achieve improved performance with respect to sustainable development. Improved performance by industries in a sustainable development perspective is accomplished only by implementing the 4E principles, i.e., eco-friendliness, engineering efficiency, energy conservation and ergonomics, in their core operations. The present study makes a comprehensive investigation into various ship recycling operations in order to formulate a set of best practices.

Being the ultimate life cycle stage of a ship, ship recycling activities incorporate certain commercial procedures well in advance to facilitate the dismantling and recycling/reusing of the various parts of the vessel. Thorough knowledge of these background procedures is essential for examining and understanding the industrial business operations associated with ship recycling. As a first step, the practices followed in merchant shipping operations regarding the decision to decommission have been reviewed and made available in the thesis. Brief descriptions of the positioning methods and the important preparations for the most feasible ship recycling method, i.e., the beach method, are provided as part of the background information. Available sources of guidelines, codes and rules & regulations for ship recycling have been compiled and included in the discussion.

A very brief summary of practices in the major ship recycling destinations has been prepared and listed to provide an overview of global ship recycling activities. The present status of ship recycling, treated as a full-fledged engineering industry, has been brought out to establish the need for developing best practices. The major engineering attributes of the ship as a unique engineering product, and the significant factors influencing her life cycle stage operations, have been studied and added to the information base on ship recycling. The role of the ship recycling industry as an important player in global sustainable development efforts has been reviewed by analysing the benefits of ship recycling. A brief synopsis of the state of the art of ship recycling in the major international ship recycling centres has also been incorporated into the backdrop knowledgebase generated on ship recycling processes.

Publications available in this field have been reviewed and classified into five subject categories, viz., infrastructure for recycling yards and methods of dismantling; rules regarding ship recycling activities; environmental and safety aspects of ship recycling; the role of naval architects and ship classification societies; and the application of information technology and demand forecasting. The inferences from the literature survey have been summarised and recorded. Noticeable observations among them include the need to create a comprehensive knowledgebase on ship recycling and implement it effectively in the industry, and the insignificant involvement of naval architects and shipbuilding engineers in the ship recycling industry. These two important inferences, and the message they convey, are addressed with due importance in the subsequent part of the present study.

As part of the study, the importance of demand forecasting in ship recycling has been introduced and presented. A sample input of ship recycling data for the implementation of computer-based methods of demand forecasting is presented in this section of the thesis.

The interdisciplinary nature of the engineering processes involved in ship recycling has been identified as one of the important features of this industry. The present study has identified more than a dozen major stakeholders in ship recycling, each with their own interests and roles. It has also been observed that most ship recycling activity is carried out in South East Asian countries, where beach-based ship recycling is done in yards without proper infrastructure support. A model of beach-based ship recycling has been developed, and the roles, responsibilities and mutual interactions of the elements of the system have been documented as part of the study.

Subsequently, the need for a wide knowledgebase on ship recycling activities, as pointed out by the literature survey, has been addressed. The information base and the sources of expertise required to build a broad knowledgebase on ship recycling operations have been identified and tabulated. Eleven important ship recycling processes have been identified, and the steps involved in these processes have been examined and addressed in detail. Based on these findings, a detailed sequential disassembly process plan for ship recycling has been prepared and charted.

Having established the need for best practices in ship recycling, the present study identifies the development of a user-friendly expert system for the ship recycling process as one of the constituents of the proposed best practices. A user-friendly expert system has been developed for beach-based ship recycling processes, named the Ship Recycling Recommender (SRR). Two important functions of SRR, the first for the 'Administrators', the stakeholders at the helm of ship recycling affairs, and the second for the 'Users', the stakeholders who execute the actual dismantling, are presented by highlighting the steps involved in the execution of the software. The important outputs generated, i.e., recommended practices for ship dismantling processes and safe-handling information on materials present onboard, are presented with the help of ship recycling reports generated by the expert system. A brief account of the necessity of having a ship recycling work content estimation as part of the best practices is presented in the study, supported by a detailed work estimation schedule included as one of the appendices.

As mentioned earlier, a definite lack of involvement of naval architects has been observed in the development of methodologies for improving the status of the ship recycling industry. The present study puts forward a holistic approach that reviews ship recycling not simply as the end-of-life activity of all 'time-expired' vessels, but as a focal point integrating all life cycle activities. A new engineering design philosophy targeting sustainable development of the marine industrial domain, named design for ship recycling, has been identified, formulated and presented. A new model of the ship life cycle has been proposed by adding a few stages to the traditional life cycle, after analysing their critical role in accomplishing a clean and safe end of life and partial dismantling of ships. Two applications of design for ship recycling, viz., the recyclability of ships and their products, and the allotment of a Green Safety Index for ships, are presented as part of the implementation of the philosophy in actual practice.

Relevance:

30.00%

Publisher:

Abstract:

The present study aimed at the utilisation of microbial organisms for the production of good-quality chitin and chitosan. The three strains used for the study were Lactobacillus plantarum, Lactobacillus brevis and Bacillus subtilis. These strains were selected on the basis of their acid-producing ability, which lowers the pH of the fermenting substrate to prevent spoilage and thus causes demineralisation of the shell. Besides, the proteolytic enzymes in these strains act on the proteinaceous covering of the shrimp and thus cause deproteinisation of the shrimp shell waste. Thus the two processes involved in chitin production can be effected to a certain extent using bacterial fermentation of shrimp shell.

Optimisation parameters such as fermentation period, quantity of inoculum, type of sugar and concentration of sugar for fermentation with the three different strains were studied. For these, parameters like pH, total titrable acidity (TTA), changes in sugar concentration, changes in microbial count and sensory changes were monitored.

The fermentation study with Lactobacillus plantarum was continued with 20% w/v jaggery broth for 15 days. The inoculum prepared yielded a cell concentration of approximately 10^8 CFU/ml. In the present study, lactic acid and dilute hydrochloric acid were used for initial pH adjustment because, without adjusting the initial pH, it took more than 5 hours for the lactic acid bacteria to convert glucose to lactic acid, and during this delay spoilage occurred due to putrefying enzymes active at neutral or higher pH. During the fermentation study, pH first decreased, in correspondence with an increase in TTA values, a clear indication of acid production by the strain. This trend continued while the proteolytic activity showed an increasing trend. When the available sugar source started depleting, proteolytic activity also decreased and pH increased. This was clearly reflected in the sensory evaluation results. Lactic acid treated samples showed a greater extent of demineralisation and deproteinisation at the end of the fermentation study than hydrochloric acid treated samples. This can be attributed to the effect of the strong hydrochloric acid on the initial microbial count, which directly affects the fermentation process. At the end of fermentation, about 76.5% of ash was removed in lactic acid treated samples and 71.8% in hydrochloric acid treated samples; 72.8% of protein was removed in lactic acid treated samples and 70.6% in hydrochloric acid treated samples. The residual protein and ash in the fermented residue were reduced to permissible limits by treatment with 0.8 N HCl and 1 M NaOH. Characteristics of the chitin, such as chitin content, ash content, protein content and degree of N-acetylation, were studied. Quality characteristics such as viscosity, degree of deacetylation and molecular weight of the prepared chitosan were also compared. The chitosan samples prepared from lactic acid treated material showed higher viscosity than HCl treated samples, but the degree of deacetylation was higher in HCl treated samples than in lactic acid treated ones. Characteristics of the protein liquor obtained, such as its biogenic composition, amino acid composition, total volatile base nitrogen and alpha-amino nitrogen, were also studied to find out its suitability as an animal feed supplement.

Optimisation of fermentation parameters for the Lactobacillus brevis fermentation study was also conducted and the parameters were standardised. A detailed fermentation study was then done in 20% w/v jaggery broth for 17 days. The effect of the two different acid treatments (mild HCl and lactic acid) used for initial pH adjustment on chitin production was also studied. In this study too, the trends of changes in pH, sugar concentration and microbial count were similar to those in the Lactobacillus plantarum studies. At the end of fermentation, the residual protein in the samples was only 32.48% in HCl treated samples and 31.85% in lactic acid treated samples; the residual ash content was about 33.68% in HCl treated ones and 32.52% in lactic acid treated ones. The fermented residue was converted to chitin with good characteristics by treatment with 1.2 M NaOH and 1 N HCl. Characteristics of the chitin samples prepared were studied; the extent of N-acetylation, assessed from the FTIR spectrum, was about 84% in HCl treated chitin and 85% in lactic acid treated chitin. Chitosan was prepared from these samples by the usual chemical method, and its extent of solubility, degree of deacetylation, viscosity and molecular weight were studied. The viscosity and molecular weight of the samples prepared were comparatively lower than those of the chitosan prepared by Lactobacillus plantarum fermentation. Characteristics of the protein liquor obtained were analysed to determine its quality and its suitability as an animal feed supplement.

Another strain used for the study was Bacillus subtilis, and fermentation was carried out in 20% w/v jaggery broth for 15 days. Bacillus subtilis was found to be more efficient than the Lactobacillus species for deproteinisation and demineralisation, mainly due to the difference in the proteolytic nature of the strains. About 84% of protein and 72% of ash were removed at the end of fermentation. Considering the statistical significance (P

Relevance:

30.00%

Publisher:

Abstract:

The medical field requires fast, simple and noninvasive diagnostic techniques. Several such methods are available and possible because of the growth of technology that provides the necessary means of collecting and processing signals. The present thesis details work done in the field of voice signals. New methods of analysis, such as nonlinear dynamics, have been developed to understand the complexity of voice signals and to explore their dynamic nature. The purpose of this thesis is to characterise the complexity of pathological voice signals against healthy ones, and to differentiate stuttering signals from healthy signals. The efficiency of various acoustic as well as nonlinear time-series methods is analysed. Three groups of samples are used: healthy individuals, subjects with vocal pathologies, and stuttering subjects. Individual vowels and continuous speech data for the utterance of the Malayalam sentence "iruvarum changatimaranu" (in English, "Both are good friends") were recorded using a microphone. The recorded audio was converted to digital signals and subjected to analysis.

Acoustic perturbation measures such as fundamental frequency (F0), jitter, shimmer and zero crossing rate (ZCR) were computed, and nonlinear measures such as the maximum Lyapunov exponent (λmax), correlation dimension (D2), Kolmogorov entropy (K2), and a new entropy measure, viz. permutation entropy (PE), were evaluated for all three groups of subjects. Permutation entropy is a nonlinear complexity measure which can efficiently distinguish the regular and complex nature of any signal, and which extracts information about a change in the dynamics of the process by indicating a sudden change in its value. The results show that nonlinear dynamical methods are a suitable technique for voice signal analysis, owing to the chaotic component of the human voice. Permutation entropy is well suited due to its sensitivity to uncertainties, since the pathologies are characterised by an increase in signal complexity and unpredictability. Pathological groups have higher entropy values compared to the normal group, while the stuttering signals have lower entropy values compared to the normal signals.

PE is effective in characterising the level of improvement after two weeks of speech therapy in the case of stuttering subjects, and in characterising the dynamical difference between healthy and pathological subjects. This suggests that PE can improve and complement the voice analysis methods currently available to clinicians. The work establishes the application of the simple, inexpensive and fast algorithm of PE for diagnosis in vocal disorders and stuttering subjects.
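Part of PE's appeal as a "simple, inexpensive and fast" algorithm is how little code it takes. A minimal sketch (the embedding dimension m and delay tau are free parameters; the values below are illustrative, not those of the thesis):

```python
import math
import numpy as np

def permutation_entropy(signal, m=3, tau=1, normalize=True):
    """Permutation entropy of a 1-D signal (Shannon entropy of ordinal patterns)."""
    x = np.asarray(signal, dtype=float)
    n_patterns = len(x) - (m - 1) * tau
    counts = {}
    for i in range(n_patterns):
        # ordinal pattern: rank order of the m delayed samples
        pattern = tuple(np.argsort(x[i : i + m * tau : tau]))
        counts[pattern] = counts.get(pattern, 0) + 1
    p = np.array(list(counts.values()), dtype=float) / n_patterns
    pe = -np.sum(p * np.log(p))
    # normalizing by log(m!) scales the value into [0, 1]
    return pe / math.log(math.factorial(m)) if normalize else pe

# white noise is maximally complex; a pure tone is highly regular
rng = np.random.default_rng(0)
print(permutation_entropy(rng.standard_normal(5000)))           # close to 1
print(permutation_entropy(np.sin(np.linspace(0, 100, 5000))))   # much lower
```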

Relevance:

30.00%

Publisher:

Abstract:

Sonar signal processing comprises a large number of signal processing algorithms implementing functions such as target detection, localisation, classification, tracking and parameter estimation. Current implementations of these functions rely on conventional techniques largely based on Fourier methods, primarily meant for stationary signals. Interestingly, the signals received by sonar sensors are often non-stationary, and hence processing methods capable of handling the non-stationarity will fare better than Fourier-transform-based methods.

Time-frequency methods (TFMs) are known as some of the best DSP tools for non-stationary signal processing, with which one can analyse signals in the time and frequency domains simultaneously. But, other than the STFT, TFMs have been largely limited to academic research because of the complexity of the algorithms and the limitations of computing power. With the availability of fast processors, many applications of TFMs have been reported in the fields of speech and image processing and in biomedical applications, but not many in sonar processing. A structured effort to fill this lacuna, by exploring the potential of TFMs in sonar applications, is the net outcome of this thesis. To this end, four TFMs have been explored in detail, viz. the Wavelet Transform, the Fractional Fourier Transform, the Wigner-Ville Distribution and the Ambiguity Function, and their potential in implementing five major sonar functions has been demonstrated with very promising results. What has been conclusively brought out in this thesis is that there is no "one best TFM" for all applications, but there is "one best TFM" for each application. Accordingly, the TFM has to be adapted and tailored in many ways in order to develop specific algorithms for each of the applications.
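Of the four TFMs explored, the Wigner-Ville distribution is the most direct to write down. The following is a simplified discrete sketch (not the thesis's implementation): for each time index it forms the instantaneous autocorrelation and takes its FFT over the lag variable:

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(signal):
    """Simplified discrete Wigner-Ville distribution (rows: frequency, cols: time)."""
    x = hilbert(np.asarray(signal, dtype=float))  # analytic signal
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        tau_max = min(n, N - 1 - n)
        taus = np.arange(-tau_max, tau_max + 1)
        # instantaneous autocorrelation at time n, Hermitian in the lag
        kernel = x[n + taus] * np.conj(x[n - taus])
        padded = np.zeros(N, dtype=complex)
        padded[taus % N] = kernel           # wrap negative lags for the FFT
        W[:, n] = np.fft.fft(padded).real   # real because the kernel is Hermitian
    return W

# a linear chirp (a non-stationary test signal) concentrates its energy
# along its instantaneous frequency in the time-frequency plane
t = np.linspace(0, 1, 256, endpoint=False)
W = wigner_ville(np.cos(2 * np.pi * (20 * t + 40 * t ** 2)))
```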

Relevance:

30.00%

Publisher:

Abstract:

After skin cancer, breast cancer accounts for the second greatest number of cancer diagnoses in women. The etiologies of breast cancer are currently unknown, and there is no generally accepted therapy for preventing it; therefore, the best way to improve the prognosis for breast cancer is early detection and treatment. Computer-aided detection (CAD) systems for detecting masses or microcalcifications in mammograms have already been used and proven to be a potentially powerful tool, so radiologists are attracted by the effectiveness of the clinical application of CAD systems. Fractal geometry is well suited for describing the complex physiological structures that defy traditional Euclidean geometry, which is based on smooth shapes. The major contributions of this research include the development of:
• A new fractal feature to accurately classify mammograms into normal and abnormal, the latter (i) with masses (benign or malignant) or (ii) with microcalcifications (benign or malignant)
• A novel fast fractal modeling method to identify the presence of microcalcifications, by fractal modeling of mammograms and then subtracting the modeled image from the original mammogram

The performance of these methods was evaluated using different standard statistical analysis methods. The results obtained indicate that the developed methods are highly beneficial for assisting radiologists in making diagnostic decisions. The mammograms for the study were obtained from two online databases, namely MIAS (Mammographic Image Analysis Society) and DDSM (Digital Database for Screening Mammography).
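The abstract does not define its new fractal feature, but the classical box-counting estimate of fractal dimension, sketched below, is the usual starting point for features of this kind (this is an illustration, not the thesis's method):

```python
import numpy as np

def box_counting_dimension(mask):
    """Box-counting fractal dimension of a 2-D binary mask (assumed non-empty)."""
    img = np.asarray(mask, dtype=bool)
    sizes, counts = [], []
    s = min(img.shape) // 2
    while s >= 2:
        # count boxes of side s that contain at least one foreground pixel
        count = 0
        for i in range(0, img.shape[0], s):
            for j in range(0, img.shape[1], s):
                if img[i:i + s, j:j + s].any():
                    count += 1
        sizes.append(s)
        counts.append(count)
        s //= 2
    # dimension estimate = slope of log(count) against log(1/size)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```

Smooth boundaries give a dimension near 1, while the rough, space-filling boundaries typical of malignant structures push the estimate higher, which is what makes such features discriminative.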

Relevance:

30.00%

Publisher:

Abstract:

Magnetic Resonance Imaging (MRI) is a multi-sequence medical imaging technique in which stacks of images are acquired with different tissue contrasts. Simultaneous observation and quantitative analysis of normal brain tissues and small abnormalities across these many different sequences is a great challenge in clinical applications. Multispectral MRI analysis can simplify the job considerably by combining any number of available co-registered sequences in a single suite. However, the poor performance of the multispectral system with conventional image classification and segmentation methods makes it inappropriate for clinical analysis. Recent works in multispectral brain MRI analysis have attempted to resolve this issue through improved feature extraction approaches, such as transform-based methods, fuzzy approaches, algebraic techniques and so forth. Transform-based feature extraction methods like Independent Component Analysis (ICA) and its extensions have been used effectively in recent studies to improve the performance of multispectral brain MRI analysis. However, these global transforms were found to be inefficient and inconsistent in identifying less frequently occurring features, such as small lesions, in large amounts of MR data. The present thesis focuses on improving ICA-based feature extraction techniques to enhance the performance of multispectral brain MRI analysis. Methods using spectral clustering and wavelet transforms are proposed to resolve the inefficiency of ICA in identifying small abnormalities, and the problems due to ICA over-completeness. The effectiveness of the new methods in brain tissue classification and segmentation is confirmed by a detailed quantitative and qualitative analysis with synthetic and clinical data, both normal and abnormal. In comparison with conventional classification techniques, the proposed algorithms provide better performance in the classification of normal brain tissues and of significant small abnormalities.
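As a rough illustration of the transform-based feature extraction this line of work builds on (not the thesis's pipeline; the function and variable names are hypothetical), ICA can be applied voxel-wise across co-registered sequences:

```python
import numpy as np
from sklearn.decomposition import FastICA

def ica_feature_maps(volumes):
    """Voxel-wise ICA across co-registered MR sequences (e.g. T1, T2, FLAIR).

    volumes: list of equally shaped arrays; returns one component map per source.
    """
    shape = volumes[0].shape
    # one row per voxel, one column per sequence
    X = np.stack([v.ravel() for v in volumes], axis=1).astype(float)
    ica = FastICA(n_components=len(volumes), whiten="unit-variance",
                  random_state=0)
    S = ica.fit_transform(X)                 # independent components per voxel
    return [s.reshape(shape) for s in S.T]   # back to image geometry
```

The thesis's point is precisely that such a global decomposition can miss rare classes like small lesions, motivating its spectral-clustering and wavelet refinements.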

Relevance:

30.00%

Publisher:

Abstract:

Knowledge discovery in databases is the non-trivial process of identifying valid, novel, potentially useful and ultimately understandable patterns in data. The term data mining refers to the process of performing exploratory analysis on the data and building models from it. To infer patterns from data, data mining involves different approaches such as association rule mining, classification techniques and clustering techniques. Among the many data mining techniques, clustering plays a major role, since it helps to group related data for assessing properties and drawing conclusions. Most clustering algorithms act on a dataset with a uniform format, since the similarity or dissimilarity between the data points is a significant factor in finding the clusters. If a dataset consists of mixed attributes, i.e., a combination of numerical and categorical variables, a preferred approach is to convert the different formats into a uniform format. This research study explores various techniques for converting mixed data sets to a numerical equivalent, so as to equip them for statistical and similar algorithms. The results of clustering mixed-category data after conversion to a numeric data type are demonstrated using a crime data set. The thesis also proposes an extension to the well-known algorithm for handling mixed data types, to deal with data sets having only categorical data; the proposed conversion has been validated on a data set corresponding to breast cancer. Moreover, another issue with the clustering process is the visualization of the output. Different geometric techniques such as scatter plots or projection plots are available, but none of these techniques displays results projected over the whole database; rather, they demonstrate attribute-pairwise analysis.
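A minimal sketch of the convert-then-cluster approach the thesis explores (the crime data set itself is not reproduced here; this toy frame merely stands in for it): scale the numeric columns, one-hot encode the categorical ones into a uniform numeric matrix, then apply a standard algorithm such as k-means:

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# toy mixed-attribute data set (numeric + categorical), illustrative only
df = pd.DataFrame({
    "age":    [25, 31, 58, 45, 22, 60],
    "income": [40, 52, 61, 58, 39, 66],
    "region": ["north", "north", "south", "south", "north", "south"],
})

# convert to a uniform numeric format: scale numeric columns,
# one-hot encode the categorical one
prep = ColumnTransformer([
    ("num", StandardScaler(), ["age", "income"]),
    ("cat", OneHotEncoder(), ["region"]),
])
pipe = make_pipeline(prep, KMeans(n_clusters=2, n_init=10, random_state=0))
print(pipe.fit_predict(df))   # cluster label per record
```

One-hot encoding implicitly fixes the distance contribution of a categorical mismatch, which is exactly the kind of modelling choice the conversion techniques studied in the thesis have to make explicit.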