931 results for Non line of sight (NLOS)


Relevance:

100.00%

Publisher:

Abstract:

Thesis carried out in collaboration with the Department of Neuroscience and Pharmacology of the University of Copenhagen, Denmark.

Relevance:

100.00%

Publisher:

Abstract:

In machine learning, the field concerned with using data to learn solutions to the problems we want to delegate to machines, the Artificial Neural Network (ANN) model is a valuable tool. It was invented nearly sixty years ago, and yet it is still the subject of active research today. Recently, with deep learning, it has improved the state of the art in many application areas such as computer vision, speech processing and natural language processing. The ever-growing amount of available data and improvements in computer hardware have made it easier to train high-capacity models such as deep ANNs. However, difficulties inherent to training such models, such as local minima, still have a significant impact. Deep learning therefore seeks solutions, either by regularizing or by easing optimization; unsupervised pre-training and the "Dropout" technique are examples. The first two works presented in this thesis follow this line of research. The first studies the vanishing/exploding gradient problems in deep architectures. It shows that simple choices, such as the activation function or the initialization of the network weights, have a large influence, and we propose normalized initialization to ease learning. The second focuses on the choice of activation function and presents the rectifier, or rectified linear unit. This study was the first to emphasize piecewise-linear activation functions for deep neural networks in supervised learning; today, this type of activation function is an essential component of deep neural networks. The last two works presented focus on applications of ANNs to natural language processing. The first addresses domain adaptation for sentiment analysis using Denoising Auto-Encoders, and it is still the state of the art today. The second deals with learning from multi-relational data using an energy-based model that can be applied to the word-sense disambiguation task.
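
As background for the two techniques named above, here is a minimal sketch of normalized ("Glorot") weight initialization and the rectifier activation as they are commonly defined; the code is illustrative and not taken from the thesis.

```python
import numpy as np

def normalized_init(n_in, n_out, rng=np.random.default_rng(0)):
    # Normalized initialization: draw weights uniformly from
    # [-sqrt(6/(n_in+n_out)), +sqrt(6/(n_in+n_out))] so that activation and
    # gradient variances stay roughly constant from layer to layer.
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

def relu(x):
    # The rectifier (rectified linear unit): a piecewise-linear activation, max(0, x).
    return np.maximum(0.0, x)

# Forward pass through one hidden layer under these two choices (shapes are arbitrary).
x = np.random.randn(1, 784)
h = relu(x @ normalized_init(784, 256))
```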

Relevance:

100.00%

Publisher:

Abstract:

Positron emission tomography (PET) is a molecular imaging modality that uses radiotracers labeled with positron-emitting isotopes to quantify and probe biological and physiological processes. It is currently used mainly in oncology, but increasingly also in cardiology, neurology and pharmacology. It is a modality that is intrinsically capable of providing functional information on cellular metabolism with better sensitivity. Its main limitations are low spatial resolution and a lack of quantification accuracy. To overcome these limitations, which are an obstacle to broadening the clinical applications of PET, new acquisition systems are equipped with a large number of small detectors with better detection performance, and image reconstruction is performed with iterative stochastic algorithms better suited to low-statistics acquisitions. As a result, reconstruction times have become too long for clinical use. To reduce this time, the acquisition data are compressed and accelerated, generally less accurate, versions of the iterative stochastic algorithms are used. The performance gains from increasing the number of detectors are therefore limited by computing-time constraints. To break out of this loop and allow the use of robust reconstruction algorithms, much work has been done to accelerate these algorithms on high-performance GPU (Graphics Processing Units) computing devices. In this work, we joined this effort of the scientific community to develop and bring into clinical use powerful reconstruction algorithms that improve spatial resolution and quantification accuracy in PET. We first developed strategies to accelerate on GPUs the reconstruction of PET images from list-mode acquisition data. List mode offers many advantages over reconstruction from sinograms; among others, it allows motion correction and time-of-flight (TOF) information to be incorporated easily and accurately to improve quantification accuracy, and it allows spatio-temporal basis functions to be used for 4D reconstruction in order to estimate the kinetic parameters of metabolism accurately. However, this mode is rarely used in the clinic, and it is mostly used to estimate the standardized uptake value (SUV), a semi-quantitative measure that limits the functional character of PET. Our contributions are the following:
- The development of a new strategy to accelerate on GPUs the 3D LM-OSEM (List-Mode Ordered-Subset Expectation-Maximization) algorithm, including the computation of the sensitivity matrix incorporating the patient attenuation factors and the detector normalization coefficients. The computation time obtained is not only compatible with clinical use of 3D LM-OSEM algorithms, but also makes it possible to envision fast reconstructions for advanced PET applications such as real-time dynamic studies and parametric image reconstruction directly from the acquisition data.
- The development and GPU implementation of the Multigrid/Multiframe approach to accelerate the LMEM (List-Mode Expectation-Maximization) algorithm. The objective is a new strategy to accelerate the reference LMEM algorithm, which is convergent and powerful but has the drawback of converging very slowly. The results obtained open the way to near-real-time reconstructions, both for examinations involving large amounts of acquisition data and for gated dynamic acquisitions.
Moreover, in the clinic, quantification is often performed from sinogram acquisition data that are generally compressed, and previous work has shown that this approach to accelerating reconstruction reduces quantification accuracy and degrades spatial resolution. For this reason, we parallelized and implemented on GPU the AW-LOR-OSEM (Attenuation-Weighted Line-of-Response OSEM) algorithm, a version of 3D OSEM that reconstructs from sinograms without data compression while incorporating the attenuation and normalization corrections in the sensitivity matrices. We compared two implementation approaches: in the first, the system matrix (SM) is computed on the fly during reconstruction, while the second uses a precomputed SM with better accuracy. The results show that the first implementation offers computational efficiency about twice that of the second, and the reported reconstruction times are compatible with clinical use of both strategies.
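
For orientation, the list-mode EM update that LM-OSEM accelerates by splitting the events into ordered subsets can be written in the following generic form (the notation is standard, not taken from the thesis):

```latex
% lambda_j : activity in voxel j;  s_j : sensitivity of voxel j, including the
% attenuation factors and detector normalization coefficients mentioned above;
% a_{ij}   : probability that an event detected on line-of-response i originated in voxel j.
\lambda_j^{(n+1)} \;=\; \frac{\lambda_j^{(n)}}{s_j}
  \sum_{i \in \text{events}} \frac{a_{ij}}{\sum_k a_{ik}\,\lambda_k^{(n)}},
\qquad
s_j \;=\; \sum_{\text{all LORs } i} a_{ij}.
```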

Relevance:

100.00%

Publisher:

Abstract:

The present study examined the antiulcer effect of glucosamine on the mucosal antioxidant defense system in ibuprofen-induced peptic ulcer in male albino rats. The results indicate that oral pre-administration of chitosan and glucosamine maintains near-normal activities of the mucosal antioxidant enzymes and the level of GSH (glutathione), which protect the mucosa against oxidative damage by decreasing lipid peroxidation and strengthening the mucosal barrier, the first line of defense against exogenous ulcerogenic agents. The study indicates that oral pre-treatment with chitosan and glucosamine can prevent ibuprofen-induced peptic ulcer in rats. It can also be concluded that co-administration of chitosan and glucosamine can effectively prevent isoniazid- and rifampicin-induced hepatotoxicity in rats; comparatively, chitosan gave better results than glucosamine in alleviating the hepatic disorders.

Relevance:

100.00%

Publisher:

Abstract:

Chit funds contribute to the value of financial markets in India, particularly in Kerala. Chit finance, with its unique features, is of great significance as a combined savings-and-borrowing avenue. The present study, entitled "A Study of Chit Finance in Kerala with Special Emphasis on Kerala State Financial Enterprises Ltd.", examines the socio-economic aspects of chit schemes run by private chit funds, KSFE, co-operatives and informal chit funds. The study attempts to find the reasons for the growing popularity of chit funds as savings-cum-borrowing avenues even in the presence of various other avenues of saving and borrowing, and to understand how chit subscribers utilize the funds. The objectives of the study are to examine the trends and pattern of growth of chit funds in the formal sector in Kerala, the performance of KSFE as the only public-sector chit fund company in India, and the preference for joining chit funds, and to estimate the cost of and return on chit funds. The chit fund, an indigenous financial instrument, is complementary to modern financial techniques of saving and borrowing. KSFE is the dominant foreman in the chit business in Kerala, but its weaknesses result in the non-attainment of certain objectives. Driven by the growing trend of privatization, KSFE needs to be innovative and competitive. It is also necessary that KSFE continue its leadership role by being more effective as a harbinger of efficiency, professionalism and good governance in the chit fund industry. The growth and development of the chit business, while protecting the interests of both subscribers and foremen, will therefore be most beneficial for any growing economy.
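
For readers unfamiliar with the mechanics behind the cost-and-return estimates mentioned above, the basic chit arithmetic runs as follows; every figure in this sketch (chit value, commission rate, winning discount) is hypothetical and not taken from the study.

```python
# Illustrative chit arithmetic for one month; all figures are hypothetical.
chit_value = 100_000.0                     # total pot (Rs.)
members = 20                               # one member is "prized" each month
installment = chit_value / members         # 5,000 per member per month
foreman_commission = 0.05 * chit_value     # commission assumed at 5% of chit value
auction_discount = 25_000.0                # amount forgone by the winning bidder

prize_money = chit_value - auction_discount                    # 75,000 to the winner
dividend = (auction_discount - foreman_commission) / members   # 1,000 credited to every member
net_installment = installment - dividend                       # 4,000 effective outgo next month

print(prize_money, dividend, net_installment)
```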

Relevance:

100.00%

Publisher:

Abstract:

Aquaculture is one of the prime catalysts for the socio-economic development of the Indian economy, contributing to the nation's food and nutritional security, export earnings, income and employment generation. This study evaluates the role of extension activities in the development of aquaculture in Kerala. It was conducted to examine how the Kerala fisheries department offered extension services to aquaculture farmers in the freshwater and brackish-water sectors of the state through agencies such as the Fish Farmers Development Agency (FFDA) and the Brackish Water Fish Farmers Development Agency (BFFDA). The study covers three categories of respondents: freshwater beneficiary farmers, brackish-water beneficiary farmers and fisheries extension officers. The main aim of the thesis is to explore the responses of local producers to the extension programmes of the state, with special reference to the aquaculture sector of Kerala, India. The most important technical constraint faced by the freshwater farmers was lack of knowledge, followed by non-availability of quality seed; in the case of brackish-water farming, it was disease infection followed by lack of knowledge. The overall activities of the department of fisheries were rated 'fairly good', indicating the need for improvements in the delivery of extension services to various target groups. The state fisheries department is already moving towards evolving such modes of extension activity through community participation.

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents the results of research on the inherent powers of the High Court in criminal jurisdiction. The criminal justice system in India recognizes inherent powers only of the High Court. As far as the theory and philosophy of inherent powers are concerned, the distinction between civil and criminal law is of very little consequence. In formulating the research programme, the confusion created by the concept of inherent powers and its application by the High Court forms the central point. How fully the concept is understood, how correctly the power is used, how far it has enhanced the rationale of the administration of criminal justice, what its importance is, and how inherent power can earn a permanent status in the province of criminal jurisprudence are the themes of this study. The precipitation of new dimensions is the yardstick by which to acknowledge the inherent powers of the High Court and the Supreme Court, and it is of immediate value in the criminal justice system. The study concludes that the innovativeness provided by the inherent powers has helped the administration of justice draw inspiration from the Constitution. A jurisprudence of inherent powers has developed through the wielding of those powers by the Supreme Court and the High Court. This research work emphasizes unravelling the mystery in jurisprudence caused by the operation of the concept of inherent powers; its significance is all the more relevant when the power is exercised in the administration of criminal justice. The application or non-application of inherent powers in a given case tells upon the maturity and perfection of the standard of justice.

Relevance:

100.00%

Publisher:

Abstract:

We propose a short-range generalization of the p-spin interaction spin-glass model. The model is well suited to test the idea that an entropy collapse underlies the dynamical singularity encountered in structural glasses. The model is studied in three dimensions through Monte Carlo simulations, which reveal fragile-glass behavior with stretched exponential relaxation and super-Arrhenius growth of the relaxation time. Our data favor a Vogel-Fulcher behavior of the relaxation time, related to an entropy collapse at the Kauzmann temperature. We, however, encounter difficulties analogous to those found in experimental systems when extrapolating thermodynamical data to low temperatures. We study the spin-glass susceptibility, investigating the behavior of the correlation length in the system. We find that the increase of the relaxation time is accompanied by a very slow growth of the correlation length. We discuss the scaling properties of off-equilibrium dynamics in the glassy regime, finding qualitative agreement with mean-field theory.
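
The two functional forms referred to above are the standard ones; written out with generic fit parameters (not the values reported in the paper):

```latex
% Stretched-exponential (Kohlrausch) relaxation and Vogel-Fulcher growth of the
% relaxation time; beta (0 < beta < 1), tau_0, A and T_0 are fit parameters, with T_0
% identified with the Kauzmann temperature in the entropy-collapse picture.
C(t) \sim \exp\!\left[-\,(t/\tau)^{\beta}\right],
\qquad
\tau(T) \sim \tau_0 \exp\!\left[\frac{A}{T - T_0}\right].
```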

Relevance:

100.00%

Publisher:

Abstract:

Solid electrolytes for applications like chemical sensing, energy storage and energy conversion have been actively investigated and developed since the early sixties. Despite their immense potential, solid-state protonic conductors have been neglected in comparison with the great interest shown in other ionic conductors such as lithium- and silver-ion conductors; the non-availability of good, stable protonic conductors could be partly the reason for this situation. Although organic solids are better known for their electrically insulating character, ionic conductors of organic origin constitute a recent addition to the class of ionic conductors, and detailed studies on such conductors are scarce. The last decade has also witnessed an unprecedented boom in research on organic "conducting polymers". These newly devised materials show conductivity spanning from the insulating to the metallic regime, which can be manipulated by appropriate chemical treatment, and they find applications in devices ranging from rechargeable batteries to "smart windows". This thesis mainly deals with the synthesis and investigation of the electrical properties of (i) certain organic protonic conductors derived from ethylenediamine and (ii) substituted polyanilines.

Relevance:

100.00%

Publisher:

Abstract:

Urban development has exerted immense pressure on wetlands. Urban areas are normally centres of commercial activity and continue to attract migrants in large numbers in search of employment. As a result, habitations keep coming up in natural areas and flood plains. This is happening in various Indian cities and towns, where large habitations are coming up in low-lying areas, often encroaching even on drainage channels; in some cases houses are constructed on top of nallahs and drains. In the case of Kochi the situation is even worse, as the base of the urban development itself stands on a completely reclaimed island, and the topography and geology demanded further reclamation of land as the city developed into an agglomerative cluster. Cochin is a coastal settlement interspersed with a large backwater system and fringed on the eastern side by laterite-capped low hills, from which a number of streams drain into the backwater system. The ridge line of the eastern low hills provides a well-defined watershed delimiting the Cochin basin, which helps to confine the environmental parameters within a physical limit. The obvious conclusion is that, if physiography alone is considered, the western flatland is ideal for urban development. However, this would result in serious environmental deterioration, since the flatland consists mainly of wetland, and making land available would require large-scale filling of these wetlands, which include shallow mangrove-fringed water sheets, paddy fields, Pokkali fields and the estuary. The urban boundaries of Cochin are expanding fast, with a consequent over-stretching of the existing fabric of basic amenities and services. Urbanization leads to the transformation of agricultural land into built-up areas, with concomitant problems of water supply, drainage, garbage and sewage disposal. Many of the environmental problems of Cochin are hydrologic in origin, such as water-logging and floods, sedimentation and pollution in the water bodies, and shoreline erosion.

Relevance:

100.00%

Publisher:

Abstract:

The great potential for the culture of non-penaeid prawns, especially Macrobrachium rosenbergii, in the brackish and low-saline areas of the Indian coastal zone has not yet been fully exploited, due to the non-availability of healthy seed in adequate numbers, and that too in the appropriate period. In spite of the setting up of several prawn hatcheries around the country to satiate the ever-growing demand for seed of the giant freshwater prawn, the supply still remains far below the requirement, mainly due to mortality of the larvae at different stages of the larval cycle. In a larval rearing system of Macrobrachium rosenbergii, members of the family Vibrionaceae were found to be the dominant flora, and this was especially pronounced during times of mortality. However, to develop any sort of prophylactic or therapeutic measure, the pathogenic strains have to be segregated from the lot, and this is not possible unless they are clustered based on the principles of numerical taxonomy. It is with these objectives and requirements that the present work was carried out, involving phenotypic characterization of the isolates belonging to the family Vibrionaceae, working out their numerical taxonomy, determination of the mole % G+C ratio, segregation of the pathogenic strains, and screening of antibiotics as therapeutics for times of emergency.
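
As an illustration of the numerical-taxonomy step mentioned above, binary phenotypic profiles can be clustered from pairwise similarity coefficients; the profile matrix below is invented, and only standard SciPy routines are used.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

# Rows = isolates, columns = phenotypic tests coded 1 (positive) / 0 (negative).
# These profiles are invented for illustration, not data from the study.
profiles = np.array([
    [1, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [0, 0, 1, 0, 1, 0],
    [0, 1, 1, 0, 1, 0],
])

# On binary data the Hamming distance equals 1 minus the simple-matching
# similarity coefficient used in classical numerical taxonomy.
dist = pdist(profiles, metric="hamming")
# UPGMA (average-linkage) clustering; cut the dendrogram at >= 60% similarity.
phena = fcluster(linkage(dist, method="average"), t=0.4, criterion="distance")
print(phena)
```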

Relevance:

100.00%

Publisher:

Abstract:

The FT-IR spectrum of quinoline-2-carbaldehyde benzoyl hydrazone (HQb·H2O) was recorded and analyzed; the synthesis and crystal-structure data are also described. The vibrational wavenumbers were examined theoretically with the Gaussian03 package of programs at the HF/6-31G(d) and B3LYP/6-31G(d) levels of theory. The data obtained from the vibrational wavenumber calculations are used to assign the bands observed in the infrared spectrum of the studied molecule. The first hyperpolarizability, infrared intensities and Raman activities are reported. The calculated first hyperpolarizability is comparable with reported values for similar derivatives, making the compound an attractive object for future studies of non-linear optics. The geometrical parameters of the title compound obtained from XRD studies are in agreement with the calculated values. The changes in the C–N bond lengths suggest an extended π-electron delocalization over the quinoline and hydrazone moieties, which is responsible for the non-linearity of the molecule.
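
The total first hyperpolarizability quoted in such studies is normally assembled from the Cartesian tensor components of the calculation; the standard expression (not specific to this compound) is:

```latex
% beta_ijk are the Cartesian components of the first-hyperpolarizability tensor
% obtained from the HF or B3LYP calculation.
\beta_x = \beta_{xxx} + \beta_{xyy} + \beta_{xzz}, \quad
\beta_y = \beta_{yyy} + \beta_{yxx} + \beta_{yzz}, \quad
\beta_z = \beta_{zzz} + \beta_{zxx} + \beta_{zyy},
\qquad
\beta_{\mathrm{tot}} = \sqrt{\beta_x^{2} + \beta_y^{2} + \beta_z^{2}}.
```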

Relevance:

100.00%

Publisher:

Abstract:

Severe local storms, including tornadoes, damaging hail and wind gusts, frequently occur over the eastern and northeastern states of India during the pre-monsoon season (March-May). Forecasting thunderstorms is one of the most difficult tasks in weather prediction, due to their rather small spatial and temporal extent and the inherent non-linearity of their dynamics and physics. In this paper, sensitivity experiments are conducted with the WRF-NMM model to test the impact of convective parameterization schemes on the simulation of severe thunderstorms that occurred over Kolkata on 20 May 2006 and 21 May 2007, and the model results are validated against observations. In addition, a simulation without any convective parameterization scheme was performed for each case to determine whether the model could resolve the convection explicitly. A statistical analysis based on mean absolute error, root mean square error and correlation coefficient is performed to compare the simulated and observed data for the different convective schemes. The study shows that the prediction of thunderstorm-affected parameters is sensitive to the convective scheme. The Grell-Devenyi cloud ensemble convective scheme simulated the thunderstorm activity well in terms of timing, intensity and region of occurrence compared with the other convective schemes and the explicit run.
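
The three verification scores used above are computed in the usual way; a minimal sketch with made-up values (the arrays are illustrative, not model output):

```python
import numpy as np

def verify(simulated, observed):
    # Standard point-verification scores for comparing model output with observations.
    sim, obs = np.asarray(simulated, float), np.asarray(observed, float)
    mae = np.mean(np.abs(sim - obs))            # mean absolute error
    rmse = np.sqrt(np.mean((sim - obs) ** 2))   # root mean square error
    corr = np.corrcoef(sim, obs)[0, 1]          # Pearson correlation coefficient
    return mae, rmse, corr

# Example with invented surface temperatures (deg C) at successive hours.
print(verify([29.5, 31.2, 27.8, 25.1], [30.0, 30.6, 28.4, 24.7]))
```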

Relevance:

100.00%

Publisher:

Abstract:

The present study is an attempt to highlight the problem of typographical errors in OPACs. Errors made while typing catalogue entries, as well as while importing bibliographic records from other libraries, go unnoticed by librarians, resulting in the non-retrieval of available records and affecting the quality of OPACs. This paper follows previous research on the topic, mainly by Jeffrey Beall and Terry Ballard. The word "management" was chosen from the list of likely-to-be-misspelled words identified by previous research. It was found that the word is wrongly entered in several forms in local, national and international OPACs, justifying Ballard's observation that typos occur almost everywhere. Though many corrective measures have been proposed and are in use, the study asserts that human effort is needed to get rid of the problem. The paper is also an invitation to library professionals and system designers to devise a strategy to solve the issue.
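
One simple way to flag the kind of typo discussed above is to scan index terms for strings lying within a small edit distance of the intended word; the sketch below uses a plain Levenshtein implementation and an invented word list rather than any particular library system's API.

```python
def levenshtein(a: str, b: str) -> int:
    # Classic dynamic-programming edit distance between two strings.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Flag index terms that are probably typos of "management" (sample terms invented).
terms = ["management", "managment", "mangement", "measurement", "managemnet"]
suspects = [t for t in terms if t != "management" and levenshtein(t, "management") <= 2]
print(suspects)   # ['managment', 'mangement', 'managemnet']
```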