999 results for semantic processing
Abstract:
Kerala was the pioneer in modern seafood processing and exporting, but the industry now faces many problems due to low productivity and deterioration in the quality of the products. Only about 17% of the installed freezing capacity in the seafood processing industry was reported to be utilised during 1979-80. The prices of the export commodities are decided by the buyers based on international supply and demand patterns and on the strength or weakness of the dollar and yen. The only way to increase the profitability of the processors is to reduce the cost of production to the extent possible. Individual processors find it difficult to continue in this field due to low productivity and quality problems. The main objectives of the research are to find out how production is being managed in the seafood processing (freezing) industry in Kerala and the reasons for the low productivity and poor quality of the products. The study includes a detailed analysis of the location of the factories; layout; purchase, production and storage patterns; production planning and scheduling; work measurement of the processing of important products; quality control and inspection; and the management information system.
Abstract:
In the present work, the author has designed and developed both types of solar air heaters, namely porous and nonporous collectors. The developed solar air heaters were subjected to different air mass flow rates in order to standardize the flow per unit area of the collector. Much attention was given to investigating the performance of solar air heaters fitted with baffles. The output obtained from the experiments on pilot models also helped in the installation of a solar air heating system for industrial drying applications. Apart from these, various types of solar dryers for small and medium scale drying applications were also developed. The feasibility of a 'latent heat thermal energy storage system' based on a Phase Change Material was also studied. The application of a solar greenhouse for drying industrial effluent was analyzed in the present study, and a solar greenhouse was developed. The effectiveness of Computational Fluid Dynamics (CFD) in the field of solar air heaters was also analyzed. The thesis is divided into eight chapters.
Abstract:
Use of short fibers as reinforcing fillers in rubber composites is on an increasing trend. They are popular owing to the possibility of obtaining anisotropic properties, ease of processing and economy. In the preparation of these composites, short fibers are incorporated on two-roll mixing mills or in internal mixers, a highly energy-intensive and time-consuming process. This calls for developing less energy-intensive and less time-consuming processes for the incorporation and distribution of short fibers in the rubber matrix. One such method is to incorporate the fibers at the latex stage. The present study primarily aims to optimize the preparation of short fiber-natural rubber composites by latex stage compounding and to evaluate the resulting composites in terms of mechanical, dynamic mechanical and thermal properties. A synthetic fiber (nylon) and a natural fiber (coir) are used to evaluate the advantages of processing through the latex stage. To extract the full reinforcing potential of the coir fibers, the macro fibers are converted to micro fibers through chemical and mechanical means. The thesis is presented in 7 chapters.
Abstract:
This thesis investigated the potential use of Linear Predictive Coding in speech communication applications. A Modified Block Adaptive Predictive Coder is developed, which reduces the computational burden and complexity without sacrificing speech quality, as compared to the conventional adaptive predictive coding (APC) system. For this, modified evaluation methods have been evolved. This method differs from the usual APC system in that the difference between the true and the predicted value is not transmitted. This allows the replacement of the high-order predictor in the transmitter section of a predictive coding system by a simple delay unit, which makes the transmitter quite simple. Also, the block length used in the processing of the speech signal is adjusted relative to the pitch period of the signal being processed, rather than choosing a constant length as hitherto done by other researchers. The efficiency of the newly proposed coder has been supported with results of computer simulation using real speech data. Three methods for voiced/unvoiced/silent/transition classification have been presented. The first one is based on energy, zero-crossing rate and the periodicity of the waveform. The second method uses the normalised correlation coefficient as the main parameter, while the third method utilizes a pitch-dependent correlation factor. The third algorithm, which gives the minimum error probability, has been chosen in a later chapter to design the modified coder. The thesis also presents a comparative study between the autocorrelation and the covariance methods used in the evaluation of the predictor parameters. It has been shown that the autocorrelation method is superior to the covariance method with respect to filter stability and also in an SNR sense, though the increase in gain is only small.
The Modified Block Adaptive Coder switches from pitch prediction to spectrum prediction when the speech segment changes from a voiced or transition region to an unvoiced region. The experiments conducted in coding, transmission and simulation used speech samples from Malayalam and English phrases. Proposals for a speaker recognition system and a phoneme identification system have also been outlined towards the end of the thesis.
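The first classification method described above rates each speech frame by its short-time energy and zero-crossing rate. A minimal sketch of that idea in Python (the thresholds are illustrative assumptions, not values from the thesis, which additionally uses waveform periodicity):

```python
import numpy as np

def classify_frame(frame, energy_thresh=0.01, zcr_thresh=0.25):
    """Label a speech frame as 'silent', 'voiced' or 'unvoiced' from its
    short-time energy and zero-crossing rate (thresholds hypothetical)."""
    frame = np.asarray(frame, dtype=float)
    energy = np.mean(frame ** 2)
    # Zero-crossing rate: fraction of adjacent sample pairs whose sign flips.
    zcr = np.mean(np.abs(np.diff(np.sign(frame)))) / 2.0
    if energy < energy_thresh:
        return "silent"
    # Voiced speech: high energy, low ZCR; unvoiced (fricative): high ZCR.
    return "unvoiced" if zcr > zcr_thresh else "voiced"
```

In a full classifier these per-frame decisions would be smoothed over time and combined with a periodicity measure to detect transition regions.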
Abstract:
Interfacing of various subjects generates new fields of study and research that help advance human knowledge. One of the latest such fields is Neurotechnology, which is an effective amalgamation of neuroscience, physics, biomedical engineering and computational methods. Neurotechnology provides a platform for physicists, neurologists and engineers to interact and to break down methodology- and terminology-related barriers. Advancements in computational capability and the wider scope of applications of nonlinear dynamics and chaos in complex systems have enhanced the study of neurodynamics. However, there is a need for an effective dialogue among physicists, neurologists and engineers. Applications of computer-based technology in the field of medicine through signal and image processing, the creation of clinical databases for helping clinicians, etc. are widely acknowledged. Such synergic effects between widely separated disciplines may help in enhancing the effectiveness of existing diagnostic methods. One of the recent methods in this direction is the analysis of the electroencephalogram with the help of methods from nonlinear dynamics. This thesis is an effort to understand the functional aspects of the human brain by studying the electroencephalogram. The algorithms and other related methods developed in the present work can be interfaced with a digital EEG machine to unfold the information hidden in the signal. Ultimately this can be used as a diagnostic tool.
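One widely used nonlinear-dynamics measure for EEG signals is sample entropy, which quantifies the irregularity of a time series. The sketch below is a generic illustration of this class of measure, not the specific algorithms developed in the thesis (which the abstract does not name):

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D signal: lower values indicate a more
    regular (predictable) signal. r is a tolerance as a fraction of
    the signal's standard deviation (common convention, here assumed)."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_count(mm):
        # All overlapping template vectors of length mm.
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between every pair of templates.
        d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
        # Pairs within tolerance, excluding trivial self-matches.
        return (d <= tol).sum() - len(t)

    b, a = match_count(m), match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

A regular signal such as a sinusoid yields a markedly lower sample entropy than white noise of the same length, which is the kind of contrast such measures exploit when comparing EEG states.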
Abstract:
Broiler chicken is gaining popularity among the consumers of India. Since poultry is recognised as a leading food vehicle for Salmonella contamination, the prevalence and distribution of Salmonella serotypes in broiler chickens and in the processing environments of retail outlets have been studied. In the present study, 214 samples of broiler chicken and 311 environmental samples from cages were analysed for the presence of Salmonella. Of the various body parts of live chicken analysed, prevalence varied from 1.4% in the cloaca to 6.9% in the crop region. Environmental samples from the cages showed a higher prevalence of Salmonella, ranging from 0 to 16.67%. Apart from Salmonella enteritidis, which was the predominant Salmonella serotype in the chickens as well as in the environmental samples, other serotypes such as S. bareilly, S. cerro, S. mbandaka and S. molade were also encountered. The results of the research call for strict hygiene standards for retail broiler chicken processing outlets.
Abstract:
Several oral vaccination studies have been undertaken to evoke a better protection against white spot syndrome virus (WSSV), a major shrimp pathogen. Formalin-inactivated virus and WSSV envelope protein VP28 were suggested as candidate vaccine components, but their uptake mechanism upon oral delivery was not elucidated. In this study the fate of these components and of live WSSV, orally intubated into black tiger shrimp (Penaeus monodon), was investigated by immunohistochemistry, employing antibodies specific for VP28 and haemocytes. The midgut has been identified as the most prominent site of WSSV uptake and processing. The truncated recombinant VP28 (rec-VP28), formalin-inactivated virus (IVP) and live WSSV follow an identical uptake route, suggested to be receptor-mediated endocytosis, which starts with adherence of luminal antigens at the apical layers of the gut epithelium. Processing of internalized antigens is performed in endo-lysosomal compartments, leading to the formation of supra-nuclear vacuoles. However, the majority of WSSV antigens escape these compartments and are transported to the inter-cellular space via transcytosis. Accumulation of the transcytosed antigens in the connective tissue initiates aggregation and degranulation of haemocytes. Finally, the antigens exiting the midgut seem to reach the haemolymph. The nearly identical uptake pattern of the different WSSV antigens suggests that receptors on the apical membrane of shrimp enterocytes recognize rec-VP28 efficiently. Hence the truncated VP28 can be considered suitable for oral vaccination, provided digestion in the foregut can be bypassed.
Abstract:
This paper compares statistical techniques of paraphrase identification to a semantic technique of paraphrase identification. The statistical techniques used for comparison are word-set and word-order based methods, whereas the semantic technique used is the WordNet similarity matrix method described by Stevenson and Fernando in [3].
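A word-set based statistical method of the kind compared here can be as simple as the Jaccard overlap of the two sentences' word sets. The sketch below is a minimal illustration under that assumption; the paper's exact formulation and the threshold value are not given in the abstract:

```python
def word_set_similarity(s1, s2):
    """Jaccard overlap of the word sets of two sentences: a minimal
    word-set based similarity (illustrative, not the paper's exact method)."""
    a, b = set(s1.lower().split()), set(s2.lower().split())
    return len(a & b) / len(a | b) if (a | b) else 1.0

def is_paraphrase(s1, s2, threshold=0.5):
    """Pairs scoring above the threshold are labelled paraphrases;
    the threshold here is a hypothetical choice."""
    return word_set_similarity(s1, s2) >= threshold
```

A word-order based method would additionally compare the positions of shared words, and the WordNet approach replaces exact word matches with semantic similarity scores between word pairs.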
Abstract:
Rays, belonging to the class Elasmobranchii, constitute a major fishery in many states in India such as Tamil Nadu, Gujarat, Andhra Pradesh, Kerala and Maharashtra. The estimated landings are 21,700 tonnes per annum. Even though the meat of rays is nutritious and free from bones and spines, there is little demand for the fresh meat due to its high urea content. The landings are mainly used for salt curing, which fetches only very low prices for the producers. Urea nitrogen constituted the major component (50.8%) of the non-protein nitrogen of the meat. An attempt has been made to standardize processing steps to reduce the urea levels in the meat before freezing, using simple techniques such as dipping the fillets in stagnant chilled water, dipping in chilled running water and dipping in stirred chilled running water. It was found that dipping in stirred running water for two hours reduced the urea level of the meat by 62%. The yield of the lateral fin fillets and caudal fin fillets varies with the size of the ray. The drip loss during frozen storage was found to be greater in samples frozen after urea removal by the stirred-running-water method. The samples treated in stagnant chilled water had the lowest drip loss. The total nitrogen was highest in samples treated in stagnant chilled water and lowest in the samples treated in stirred running water. The overall acceptability was high in the case of samples treated with stirred running water and frozen stored.
Abstract:
Semantic Web: Software agents on the Semantic Web may use a commonly agreed service language, which enables coordination between agents and proactive delivery of learning materials in the context of actual problems. The vision is that each user has his or her own personalized agent that communicates with other agents.
Abstract:
The date palm Phoenix dactylifera has played an important role in the day-to-day life of people for the last 7000 years. Today the worldwide production, utilization and industrialization of dates are continuously increasing, since date fruits have earned great importance in human nutrition owing to their rich content of essential nutrients. Tons of date palm fruit wastes are discarded daily by the date processing industries, leading to environmental problems. Wastes such as date pits represent an average of 10% of the date fruits. Thus, there is an urgent need to find suitable applications for this waste. In spite of several studies on date palm cultivation, their utilization and the scope for utilizing date fruit in therapeutic applications, very few reviews are available, and they are limited to the chemistry and pharmacology of date fruits and to the phytochemical composition, nutritional significance and potential health benefits of date fruit consumption. In this context, the present review discusses the prospects of valorization of these date fruit processing by-products and wastes, employing fermentation and enzyme processing technologies, towards total utilization of this valuable commodity for the production of biofuels, biopolymers, biosurfactants, organic acids, antibiotics, industrial enzymes and other possible industrial chemicals.
Abstract:
Sensitisation of natural rubber latex by the addition of a small quantity of an anionic surfactant prior to the addition of a coacervant results in quick coagulation. The natural rubber prepared by this novel coagulation method shows improved raw rubber characteristics, better cure characteristics in gum and carbon black filled compounds, and improved mechanical properties as compared to conventionally coagulated natural rubber. Compounds based on dried masterbatches prepared by the incorporation of fluffy carbon black into different forms of soap-sensitised natural rubber latices, such as fresh latex, preserved field latex, centrifuged latex and a blend of preserved field latex and skim latex, show improved cure characteristics and vulcanizate properties as compared to an equivalent conventional dry rubber-fluffy carbon black based compound. The latex masterbatch based vulcanizates show a higher level of crosslinking and better dispersion of filler. Vulcanizates based on fresh natural rubber latex-dual filler masterbatches containing a blend of carbon black and silica, prepared by the modified coagulation process, show very good mechanical and dynamic properties that could be correlated to a low rolling resistance. The carbon black/silica/nanoclay tri-filler-fresh natural rubber latex masterbatch based vulcanizates show improved mechanical properties as the proportion of nanoclay is increased up to 5 phr. The fresh natural rubber latex based carbon black-silica masterbatch/polybutadiene blend vulcanizates show superior mechanical and dynamic properties as compared to equivalent compound vulcanizates prepared from dry natural rubber-filler (conventional dry mix)/polybutadiene blends.
Abstract:
The technology of service-oriented architectures (SOA) inspires grand visions in industry as well as in research. It has proven to be the ideal solution for environments in which IT requirements change rapidly. Today's IT systems must allow management tasks such as software installation, adaptation or replacement without significantly disrupting ongoing operation. Service-oriented architectures, in which software components are made available in the form of services, provide the necessary flexibility. A service offers local as well as remote applications access to its functionality via its interface. In the following we consider only those service-oriented architectures in which services can be dynamically discovered, bound, composed, negotiated and adapted at runtime. An application can work with different services, for example when services fail or when a new service better meets the application's requirements. One of our basic assumptions is therefore that both the service supply and the demand side are variable. Service-oriented architectures carry particular weight in the implementation of business processes. Under the Enterprise Integration Architecture paradigm, individual work steps are implemented as services and a business process is executed as a workflow of services. Such a service composition is also called an orchestration. Especially for so-called B2B (business-to-business) integration, services are the proven means of supporting communication beyond company boundaries. Services are here typically realized as Web Services, which are orchestrated by means of BPEL4WS. XML-based messaging and the HTTP protocol ensure compatibility between heterogeneous systems and transparency of the message traffic.
Providers of these services expect a high benefit from making them public. On the one hand, they hope for increased integration of their services into software processes; on the other, they count on new software being developed on the basis of their services. In the future, hundreds of such services will be available, and it will be hard for developers to find suitable service offers. The ADDO project has achieved important results in this area. In the course of the project it was shown that the use of semantic specifications makes it possible to automatically screen services with respect to both their functional and their non-functional properties, in particular quality of service, and to bind them into service aggregates [15]. To this end, ontology schemata [10, 16], matching algorithms [16, 9] and tools were developed and implemented as a framework [16]. The quality-of-service matching algorithm developed within this framework supports the automatic negotiation of contracts for service use, for example in order to integrate fee-based services. ADDO provides an approach for creating templates for service aggregates in BPEL4WS that are managed automatically at runtime. The approach proved its effectiveness at the international Web Service Challenge 2006 competition in San Francisco: the semantic service composition algorithm developed for ADDO took first place. The algorithm makes it possible to select a suitable subset from a very large number of offered services, to combine these services into service aggregates and thereby to provide the functionality of a given requested service. Further results of the ADDO project have been published at international workshops and conferences [12, 11].
Abstract:
This report gives a detailed discussion of the system, algorithms, and techniques that we have applied in order to solve the Web Service Challenges (WSC) of the years 2006 and 2007. These international contests are focused on semantic web service composition. In each challenge of the contests, a repository of web services is given. The input and output parameters of the services in the repository are annotated with semantic concepts. A query to a semantic composition engine contains a set of available input concepts and a set of wanted output concepts. In order to employ an offered service for a requested role, the concepts of the input parameters of the offered operations must be more general than requested (contravariance). In contrast, the concepts of the output parameters of the offered service must be more specific than requested (covariance). The engine should respond to a query by providing a valid composition as fast as possible. We discuss three different methods for web service composition: an uninformed search in the form of an IDDFS (iterative-deepening depth-first search) algorithm, a greedy informed search based on heuristic functions, and a multi-objective genetic algorithm.
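The contravariant/covariant matching rule described above, together with a naive forward-chaining composition over it, can be sketched as follows. The tiny concept taxonomy and the service names are hypothetical illustrations, and this greedy loop is far simpler than the contest engines the report discusses:

```python
# Hypothetical child -> parent taxonomy of semantic concepts.
SUPER = {"Sedan": "Car", "Car": "Vehicle"}

def ancestors(c):
    """The concept itself plus all of its superconcepts."""
    out = {c}
    while c in SUPER:
        c = SUPER[c]
        out.add(c)
    return out

def applicable(available, needed_inputs):
    """Contravariance: each input the service needs must equal an
    available concept or be one of its generalisations."""
    return all(any(n in ancestors(a) for a in available)
               for n in needed_inputs)

def satisfied(available, goals):
    """Covariance: each goal must be covered by an available concept
    that is equal to or more specific than it."""
    return all(any(g in ancestors(a) for a in available) for g in goals)

def compose(available, goals, services):
    """Greedy forward chaining: fire any applicable service, add its
    outputs, and repeat until every goal concept is covered.
    services maps name -> (input_concepts, output_concepts)."""
    available, plan = set(available), []
    while not satisfied(available, goals):
        for name, (ins, outs) in services.items():
            if name not in plan and applicable(available, ins):
                available |= set(outs)
                plan.append(name)
                break
        else:
            return None  # no applicable service left: composition fails
    return plan
```

For example, with a service that identifies a `Sedan` from a `Plate` and one that locates any `Vehicle`, a query from `{Plate}` to `{Position}` chains both services, since `Sedan` is accepted where a `Vehicle` is required. The contest methods (IDDFS, heuristic search, genetic algorithm) explore this same match relation far more systematically.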