22 results for Need of automated grading

at Cochin University of Science


Relevance:

100.00%

Publisher:

Abstract:

Image processing has been a challenging, multidisciplinary research area for decades, with continuing improvements in its various branches, especially medical imaging. The healthcare industry has benefited greatly from advances in image processing techniques for the efficient management of large volumes of clinical data. The popularity and growth of the field attract researchers from many disciplines, including computer science and medical science, because of its applicability to the real world; computer science, in turn, is becoming an important driving force for the further development of the medical sciences. The objective of this study is to apply the basic concepts of medical image processing to develop methods and tools that assist clinicians. The work is motivated by clinical applications of digital mammograms and placental sonograms, and it uses real medical images to propose a method intended to assist radiologists in the diagnostic process. The study covers two domains of pattern recognition: classification and content-based retrieval. Mammogram images of breast cancer patients and placental images are used. Cancer is a disaster to the human race, and accurate characterisation of images by simple, user-friendly computer-aided diagnosis techniques helps radiologists detect cancers at an early stage. Breast cancer, the major cause of cancer death in women, can be fully cured if detected early. Studies of placental characteristics and abnormalities are important in foetal monitoring, and the diagnostic variability in sonographic examination of the placenta can be reduced by detailed placental texture analysis focused on placental grading. The work therefore aims at early breast cancer detection and placental maturity analysis. This dissertation is a stepping stone in combining the application domains of healthcare and technology.
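The abstract does not spell out the thesis's feature set, so the following sketch only illustrates the content-based retrieval idea in general terms: represent each image as a feature vector and retrieve the database images nearest to the query. The 8-dimensional "texture features" below are random placeholders, not the thesis's descriptors.

```python
import numpy as np

def retrieve_similar(query_features, database_features, k=3):
    """Return indices of the k database images whose feature
    vectors are closest (Euclidean distance) to the query."""
    dists = np.linalg.norm(database_features - query_features, axis=1)
    return np.argsort(dists)[:k]

# Hypothetical 8-dimensional feature vectors for 100 images.
rng = np.random.default_rng(0)
db = rng.random((100, 8))
# Query: a slightly perturbed copy of image 42's features.
query = db[42] + rng.normal(0, 0.01, 8)

nearest = retrieve_similar(query, db, k=3)
```

In a real CAD pipeline the feature extraction step (e.g. texture statistics over a region of interest) does the heavy lifting; the retrieval itself is just a nearest-neighbour search like this.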

Relevance:

100.00%

Publisher:

Abstract:

The study of variable stars is an important topic in modern astrophysics. Since the advent of powerful telescopes and high-resolution CCDs, variable star data have been accumulating on the order of petabytes, and this volume of data calls for automated methods as well as human experts. This thesis is devoted to the analysis of astronomical time series data on variable stars and hence belongs to the interdisciplinary field of astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and it has various causes. In some cases the variation is due to internal thermonuclear processes; such stars are known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables are further grouped into pulsating variables, eruptive variables and flare stars, while extrinsic variables are grouped into eclipsing binaries and chromospherically active stars. Pulsating variables are classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira and other types. The eruptive or cataclysmic variables, such as novae and supernovae, occur rarely and are not periodic phenomena; most of the other variations are periodic in nature. Variable stars can be observed in many ways, including photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series is folded on a period, the plot of apparent magnitude against phase is known as a phased light curve, whose unique shape is characteristic of each type of variable star.
One way to identify and classify a variable star is for an expert to inspect the phased light curve visually. For many years, automated algorithms have also been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into stages such as observation, data reduction, data analysis, modelling and classification. Modelling helps to determine the short-term and long-term behaviour, to construct theoretical models (for example, the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires determining basic parameters such as period, amplitude and phase, along with other derived parameters. Of these, period is the most important, since a wrong period leads to sparse light curves and misleading information. Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps: ground-based observations are affected by daylight and weather, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA Gaia, LSST and CRTS, provide variable star time series data even though variable star observation is not their primary goal.
The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data, and most of these surveys release their data to the public for further analysis. Many period search algorithms exist for astronomical time series analysis. They can be classified into parametric methods, which assume some underlying distribution for the data, and non-parametric methods, which assume no statistical model such as a Gaussian. Many parametric methods are based on variants of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection has several causes: power leakage to other frequencies due to the finite total interval, finite sampling interval and finite amount of data; aliasing due to regular sampling; spurious periods due to long gaps; and power flow to harmonic frequencies, an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series remains a difficult problem for huge databases subjected to automation. As Matthew Templeton of the AAVSO states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state that "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
It would benefit the variable star astronomical community if basic parameters such as period, amplitude and phase could be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories of four popular period search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. To classify newly discovered variable stars and enter them in the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability must be quantified in terms of variable star parameters.
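Of the period search methods named above, Phase Dispersion Minimisation is the easiest to sketch from its published description: fold the time series at a trial period, bin the phases, and score the period by the ratio of within-bin variance to overall variance. The sketch below runs on synthetic data with an assumed true period of 2.5 days; it is not the thesis's survey data or its modified cubic spline method.

```python
import numpy as np

def pdm_theta(t, mag, period, n_bins=10):
    """Stellingwerf's theta statistic: ratio of the pooled variance
    within phase bins to the overall variance. A small theta means
    the folded light curve is smooth, i.e. the trial period is good."""
    phase = (t % period) / period
    overall_var = np.var(mag, ddof=1)
    bin_idx = np.minimum((phase * n_bins).astype(int), n_bins - 1)
    num, den = 0.0, 0
    for b in range(n_bins):
        m = mag[bin_idx == b]
        if len(m) > 1:
            num += np.var(m, ddof=1) * (len(m) - 1)
            den += len(m) - 1
    return (num / den) / overall_var

# Synthetic, unevenly sampled sinusoidal light curve, true period 2.5 d.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 400))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 2.5) + rng.normal(0, 0.02, 400)

trial_periods = np.linspace(1.0, 5.0, 2000)
thetas = [pdm_theta(t, mag, p) for p in trial_periods]
best_period = trial_periods[int(np.argmin(thetas))]
```

Note how the uneven sampling poses no difficulty for PDM, which is one reason non-parametric methods suit astronomical time series with daylight and weather gaps.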

Relevance:

100.00%

Publisher:

Abstract:

The study is a close scrutiny of the process of investigation of offences in India, along with an analysis of the powers and functions of the investigating agency. Offences prejudicial to the sovereignty, integrity and security of the nation, or to its friendly relations with foreign states, are generally called offences against national security. Being prejudicial to the very existence of the nation and its legal system, such offences are heinous and terrible. As early as 1971 the Law Commission of India had pointed out the need to treat offences relating to national security, and their perpetrators, on a totally different procedural footing, and recommended that all offences in this category be brought under the purview of a single enactment so that they could be confronted effectively. The discrepancies in, and inadequacies of, the criminal justice system in India, insofar as they relate to the investigation of offences against national security, are examined, and reforms are suggested. The quality of criminal justice is closely linked with the calibre of the prosecution system, and many acquittals in the courts can be ascribed not only to poor investigation but also to poor prosecution.

Relevance:

100.00%

Publisher:

Abstract:

The present study concerns the development of bioreactors for nitrifying water in closed-system hatcheries of penaeid and non-penaeid prawns. The work is an attempt to cater to the aquaculture industry's need for treatment and remediation of ammonia and nitrite in penaeid and non-penaeid hatcheries by developing nitrifying bacterial consortia allochthonous to the particular environment under consideration and immobilising them on appropriately designed support materials configured as reactors. Ammonia toxicity is the major limiting factor in penaeid and non-penaeid hatchery systems, causing lethal and sublethal effects on larvae depending on the pH. The aquaculture industry has a pressing need for a user-friendly and economically viable technology for the removal of ammonia that can be integrated into existing hatchery designs without major changes or modifications. The only option available now is biological filters, through which water is circulated so that ammonia is oxidised to nitrate via nitrite by a group of chemolithotrophs known as nitrifying bacteria. Two types of bioreactor have been designed and developed: an in situ stringed bed suspended bioreactor (SBSBR), designed for use in larval rearing tanks to remove ammonia and nitrite continuously during rearing, and an ex situ packed bed bioreactor (PBBR), to be used for nitrifying freshly collected seawater and spent water. By employing the two reactors together, both penaeid and non-penaeid larval rearing systems can be made closed recirculating systems, at least for a season. A survey of the literature revealed that the in situ stringed bed suspended bioreactor developed here is unique in its design, fabrication and mode of application.

Relevance:

100.00%

Publisher:

Abstract:

The aim of the present work was to automate the CSP process, to deposit and characterise CuInS2/In2S3 layers using this system, and to fabricate devices from these films. An automated spray system for the deposition of compound semiconductor thin films was designed and developed to eliminate the manual labour involved in spraying and to facilitate standardisation of the method. The system was designed so that parameters such as spray rate, movement of the spray head, duration of spray, substrate temperature, carrier gas pressure and height of the spray head above the substrate could be varied. Using this system, binary, ternary and quaternary films could be deposited successfully. The second part of the work deals with the deposition and characterisation of CuInS2 and In2S3 layers. For the CuInS2 absorbers, the effects of different preparation conditions and post-deposition treatments on the optoelectronic, morphological and structural properties were investigated. Preparation conditions and post-deposition treatments were found to play a crucial role in controlling the properties of the films, and the studies clarified how variation of the spray parameters tailors the properties of the absorber layer. These results were subsequently used in the device fabrication process. The effects of copper incorporation in In2S3 films were investigated to determine how diffusion of Cu from CuInS2 into In2S3 affects the properties at the junction; a regular variation of the optoelectronic properties with increasing copper concentration was observed. Devices were fabricated on ITO-coated glass using CuInS2 as the absorber and In2S3 as the buffer layer, with silver as the top electrode. Stable devices could be deposited over an area of 0.25 cm2, even though the efficiency obtained was not high; with the manual spray system, devices of only 0.01 cm2 could be achieved.
Automation thus helped in obtaining repeatable results over larger areas than with the manual unit. Diffusing silver into the cells before coating the electrodes resulted in better collection of carriers. From this work it was seen that the CuInS2/In2S3 junction deposited by the automated spray process has the potential to achieve high efficiencies.

Relevance:

100.00%

Publisher:

Abstract:

Extensive use of the Internet, coupled with the marvellous growth in e-commerce and m-commerce, has created a huge demand for information security. The Secure Socket Layer (SSL) protocol, the most widely used security protocol on the Internet, meets this demand by providing protection against eavesdropping, tampering and forgery. The cryptographic algorithms RC4 and HMAC have been used to achieve security services such as confidentiality and authentication in SSL, but recent attacks against RC4 and HMAC have raised questions about the confidence that can be placed in these algorithms. Hence two novel cryptographic algorithms, MAJE4 and MACJER-320, have been proposed as substitutes for them. The focus of this work is to demonstrate the performance of these new algorithms and to suggest them as dependable alternatives for the security services needed in SSL. The performance evaluation has been carried out by practical implementation.
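The proposed MACJER-320 itself is not reproduced in this abstract, so the sketch below instead uses the standard HMAC construction from Python's `hmac` module to illustrate the message-authentication role such algorithms play in SSL; the key and message are made up for the example.

```python
import hmac
import hashlib

secret_key = b"session-key-negotiated-by-ssl"   # hypothetical shared key
message = b"GET /account/balance HTTP/1.1"

# Sender computes a MAC tag over the message with the shared key.
tag = hmac.new(secret_key, message, hashlib.sha256).hexdigest()

def verify(key, msg, received_tag):
    """Receiver recomputes the tag and compares in constant time;
    any tampering with the message changes the tag."""
    expected = hmac.new(key, msg, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, received_tag)

ok = verify(secret_key, message, tag)          # authentic message
bad = verify(secret_key, b"GET /admin", tag)   # altered message fails
```

The constant-time comparison (`compare_digest`) matters in practice: a naive `==` can leak timing information about how many leading bytes of the tag match.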

Relevance:

100.00%

Publisher:

Abstract:

Sharing information with those in need of it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that well-organised schemes for retrieval, and also for discovery, are imperative. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal, so that a proper mechanism for information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the required insights. This is manifested in the Election Counting and Reporting Software (ECRS) system, an essentially distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them liable to collapse on the occurrence of minor faults. This is resolved with the help of the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture. The results of the experiments conducted, and their analysis, show that such an architecture helps keep the different components of the software intact and insulated from internal or external faults.
The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated the introduction of the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. A second empirical study examined which of the two prominent markup languages, HTML and XML, is better suited for the incorporation of infotrons; a comparative study of 200 documents in HTML and XML came out in favour of XML. The concepts of the infotron and the infotron dictionary were then applied to implement an Information Discovery System (IDS). IDS is essentially a system that starts with the infotron(s) supplied as clue(s) and brews the information required to satisfy the information discoverer's need from the documents available at its disposal (its information space). The components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly; many subsystems interact with the multiple infotron dictionaries maintained in the system. To demonstrate the working of IDS, and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, leading to an information discovery service. IDLIS demonstrates IDS in action.
IDLIS proves that any legacy system can be augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
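The infotron and infotron dictionary are concepts specific to this thesis, so the sketch below only illustrates the general mechanism the abstract describes: a dictionary maps clue terms to the documents that mention them, and discovery intersects the document sets for the supplied clues. All names here are hypothetical.

```python
# A toy "clue dictionary": each clue term maps to the set of
# documents (by id) in the information space that mention it.
clue_dictionary = {
    "election": {"doc1", "doc3"},
    "counting": {"doc1", "doc2"},
    "report":   {"doc2", "doc3"},
}

def discover(clues, dictionary):
    """Return the documents matching ALL supplied clue terms."""
    sets = [dictionary.get(c, set()) for c in clues]
    if not sets:
        return set()
    result = sets[0]
    for s in sets[1:]:
        result = result & s
    return result

matches = discover(["election", "counting"], clue_dictionary)
```

A wrapper like IDLIS would build such a dictionary from the legacy system's databases and answer discovery queries without modifying the underlying system.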

Relevance:

100.00%

Publisher:

Abstract:

The research work was carried out to study the impact of ISO 9001:2000 in selected organisations in Kerala spread over various types of activity, such as fabrication of aerospace hardware, glassware, construction and healthcare, encompassing government and private enterprises, public sector undertakings, small and medium scale industries, and research and development establishments. The ambience, work culture and collaboration prevalent in these organisations varied with the environments in which they operate. Fifty percent of the organisations selected for the study had held ISO 9001:2000 certification for seven years or more. The study invoked interest in the respondents once a brief explanation of its purpose and need was given to them prior to the survey, and there was total cooperation from the management and employees of all the organisations. Personal discussions were held with senior management to draw their full support and involvement in the study.

Relevance:

100.00%

Publisher:

Abstract:

Ship recycling is considered the best means of disposing of an obsolete ship. The current state of the art of technology, combined with the global maritime industry's demands for sustainable development, has transformed the erstwhile 'ship breaking' scrap business into a modern industry that dismantles ships and recycles or reuses the dismantled products in the supply chain of the pre-owned product market, following the principles of recycling. Industries have to formulate a set of best practices and blend them with their engineering activities to produce better-quality products, improve productivity and achieve the improved performance demanded by sustainable development. Improved performance in a sustainable development perspective is accomplished only by implementing the 4E principles, i.e., eco-friendliness, engineering efficiency, energy conservation and ergonomics, in core operations. The present study is a comprehensive investigation of the various ship recycling operations, aimed at formulating a set of best practices. Being the ultimate life cycle stage of a ship, ship recycling incorporates certain commercial procedures well in advance to facilitate the dismantling and recycling or reuse of various parts of the vessel, and thorough knowledge of these background procedures is essential for examining and understanding the associated industrial business operations. As a first step, the practices followed in merchant shipping operations regarding the decision to decommission have been studied and made available in the thesis. Brief descriptions of the positioning methods and the important preparations for the most feasible ship recycling method, i.e., the beach method, are provided as part of the background information.
Available sources of guidelines, codes, rules and regulations for ship recycling have been compiled and included in the discussion. A very brief summary of practices at the major ship recycling destinations has been prepared to provide an overview of global ship recycling activity. The present status of ship recycling, treated as a full-fledged engineering industry, is brought out to establish the need to develop best practices. The major engineering attributes of the ship as a unique engineering product, and the significant factors influencing her life cycle stage operations, have been studied and added to the information base on ship recycling. The role of the ship recycling industry as an important player in global sustainable development efforts is reviewed by analysing the benefits of ship recycling, and a brief synopsis of the state of the art at the major international ship recycling centres is also incorporated into this backdrop. Publications in the field have been reviewed and classified into subject categories, viz., infrastructure for recycling yards and methods of dismantling; rules regarding ship recycling activities; environmental and safety aspects of ship recycling; the role of naval architects and ship classification societies; application of information technology; and demand forecasting. The inferences from the literature survey have been summarised and recorded. Notable among them are the need for a comprehensive knowledgebase on ship recycling, effectively implemented in the industry, and the insignificant involvement of naval architects and shipbuilding engineers in the ship recycling industry.
These two important inferences, and the message they convey, are addressed with due importance in the subsequent part of the study. The importance of demand forecasting in ship recycling is introduced and presented, and a sample input of ship recycling data for computer-based demand forecasting is given in this section of the thesis. The interdisciplinary nature of the engineering processes involved in ship recycling is identified as one of the important features of the industry; the study identifies more than a dozen major stakeholders in ship recycling, each with their own interests and roles. It is also observed that most ship recycling is carried out in South East Asian countries, where beach-based ship recycling is done in yards without proper infrastructure support. A model of beach-based ship recycling has been developed, and the roles, responsibilities and mutual interactions of the elements of the system have been documented as part of the study. Subsequently, the need for a wide knowledgebase on ship recycling activities, as pointed out by the literature survey, is addressed: the information base and sources of expertise required to build it have been identified and tabulated. Eleven important ship recycling processes have been identified, and the steps involved in each have been examined and addressed in detail. Based on these findings, a detailed sequential disassembly process plan for ship recycling has been prepared and charted. Having established the need for best practices in ship recycling, the study identifies the development of a user-friendly expert system for the ship recycling process as one of the constituents of the proposed best practices.
A user-friendly expert system, named the Ship Recycling Recommender (SRR), has been developed for beach-based ship recycling processes. Two important functions of SRR are presented by walking through the steps of the software: one for the 'Administrators', the stakeholders at the helm of ship recycling affairs, and one for the 'Users', the stakeholders who execute the actual dismantling. The important outputs generated, i.e., recommended practices for ship dismantling processes and safe-handling information on materials present onboard, are presented with the help of ship recycling reports generated by the expert system. A brief account of the necessity of ship recycling work content estimation as part of the best practices is given, supported by a detailed work estimation schedule in one of the appendices. As mentioned earlier, a definite lack of involvement of naval architects has been observed in the development of methodologies for improving the status of the ship recycling industry. The present study puts forward a holistic approach that reviews ship recycling not simply as the end-of-life activity of 'time expired' vessels, but as a focal point integrating all life cycle activities. A new engineering design philosophy targeting sustainable development of the marine industrial domain, named design for ship recycling, has been identified, formulated and presented. A new model of the ship life cycle has been proposed by adding a few stages to the traditional life cycle, after analysing their critical role in accomplishing the clean and safe end of life and partial dismantling of ships. Two applications of design for ship recycling, viz., the recyclability of ships and their products, and the allotment of a Green Safety Index for ships, are presented as part of the implementation of the philosophy in actual practice.
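SRR itself is not published in this abstract, so the sketch below only illustrates the general shape of such a recommender: a small rule base mapping materials found onboard to handling recommendations, queried to build a report. Every rule shown is a hypothetical placeholder, not SRR's actual knowledge base.

```python
# Minimal rule-based recommender in the spirit of a ship-dismantling
# expert system: material -> (hazard category, recommended handling).
rules = {
    "asbestos insulation": ("hazardous", "wet removal by trained crew; sealed disposal"),
    "waste oil":           ("hazardous", "pump out and store before hot work begins"),
    "steel plating":       ("recyclable", "cut into transportable blocks for re-rolling"),
    "timber fittings":     ("reusable",  "dismantle intact for the second-hand market"),
}

def recommend(material):
    """Look up the handling recommendation for a material found onboard."""
    category, action = rules.get(material, ("unknown", "refer to a specialist"))
    return f"{material}: [{category}] {action}"

# A report for an inventory of materials, one line per item.
report = [recommend(m) for m in ["asbestos insulation", "steel plating", "paint chips"]]
```

A production system would add rule chaining, an inference engine and an editable knowledge base for the 'Administrator' role, but the lookup-and-report core is the same.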

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents a detailed account of a cost-effective approach towards enhanced production of alkaline protease at profitable levels using different fermentation designs employing cheap agro-industrial residues. It covers the optimisation of process parameters for the production of a thermostable alkaline protease by Vibrio sp. V26 under solid-state, submerged and biphasic fermentation, production of the enzyme using cell immobilisation technology, and application of the crude enzyme to the deproteinisation of crustacean waste. The investigation suggests an economical route to improved production of alkaline protease at gainful levels using different fermentation designs and inexpensive agro-industrial residues. Moreover, the use of agro-industrial and other solid waste substrates for fermentation provides a way of conserving already dwindling global energy resources. Another route to economically feasible production is immobilisation, which avoids the wasteful expense of continually growing microorganisms. The high protease-producing potential of the organism under study supports its exploitation in the utilisation and management of wastes; however, strain improvement studies to produce high-yielding variants, using mutagens or gene transfer, are required before recommending it to industry. Industries all over the world have made several attempts to exploit the microbial diversity of this planet. For sustainable development it is essential to discover, develop and defend this natural prosperity, and the industrial development of any country depends critically on intellectual and financial investment in this area. The need of the hour is to harness the beneficial uses of microbes for maximum utilisation of natural resources and technological yields.
Owing to the multitude of applications in a variety of industrial sectors, there has always been an increasing demand for novel producers and sources of alkaline proteases, as well as for innovative methods of production at a commercial level. This investigation is a humble endeavour in that direction and bequeaths hope and inspiration for inventions to follow.

Relevance:

100.00%

Publisher:

Abstract:

Artificial neural networks (ANNs) are relatively new computational tools that have found extensive use in solving many complex real-world problems. This paper describes how an ANN can be used to identify the spectral lines of elements. The spectral lines of cadmium (Cd), calcium (Ca), iron (Fe), lithium (Li), mercury (Hg), potassium (K) and strontium (Sr) in the visible range are chosen for the investigation. A unique feature of the technique is that it uses the whole spectrum in the visible range instead of individual spectral lines. The spectrum of a sample taken with a spectrometer contains both genuine peaks and spurious peaks, and it is a tedious task to identify these peaks in order to determine the elements present in the sample. The ANN's capability of retrieving the original data from a noisy spectrum is also explored, and the need for sufficient training data to obtain accurate results is emphasised. Two networks are examined: one trained on all spectral lines and the other on the persistent lines only. The network trained on all spectral lines is found to be superior in analysing the spectrum, even in a noisy environment.
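The trained networks and measured spectra are not reproduced in this abstract. As a simplified stand-in for the whole-spectrum approach, the sketch below matches a noisy synthetic spectrum against reference spectra by correlating over the full wavelength grid rather than locating individual peaks; the line positions are invented, not real element lines.

```python
import numpy as np

wavelengths = np.linspace(400, 700, 600)   # visible range, nm

def spectrum(line_positions, width=1.5):
    """Synthesize a spectrum as a sum of Gaussian emission peaks."""
    s = np.zeros_like(wavelengths)
    for p in line_positions:
        s += np.exp(-0.5 * ((wavelengths - p) / width) ** 2)
    return s

# Hypothetical reference line lists (NOT real element wavelengths).
references = {
    "element_A": spectrum([420.0, 515.0, 640.0]),
    "element_B": spectrum([455.0, 560.0]),
    "element_C": spectrum([430.0, 610.0, 690.0]),
}

# Observed sample: element_B's spectrum buried in measurement noise.
rng = np.random.default_rng(7)
observed = references["element_B"] + rng.normal(0, 0.05, wavelengths.size)

def identify(obs, refs):
    """Pick the reference whose whole spectrum correlates best with obs."""
    scores = {name: float(np.corrcoef(obs, ref)[0, 1]) for name, ref in refs.items()}
    return max(scores, key=scores.get)

best = identify(observed, references)
```

Because the comparison spans the full grid, spurious isolated peaks contribute little to the score, which mirrors the robustness-to-noise argument made for the whole-spectrum ANN.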

Relevance:

100.00%

Publisher:

Abstract:

The Internet has today become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, and critical information is routinely transferred through it. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders, and the whole development in this area can come to nothing if fool-proof security of the data is not ensured. It is hence a challenge before the professional community to develop systems that ensure the security of data sent through the Internet.

Stream ciphers, hash functions and message authentication codes play vital roles in providing security services such as confidentiality, integrity and authentication of data sent through the Internet. Several such popular and dependable techniques have been in wide use for quite a long time, but this long-term exposure makes them vulnerable to successful or near-successful attacks. Hence there is a need to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms used in this area, with a focus on identifying the properties that impart security. Using the insight derived from these studies, new algorithms were designed. Their performance was then studied, and necessary modifications were made, to yield an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320.

Detailed analysis and comparison with the existing popular schemes were also carried out to establish the security levels. The Secure Socket Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been in use for achieving security services such as confidentiality and authentication in SSL/TLS, but recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies on the performance of these new algorithms show that they are dependable alternatives.
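The division of labour among these primitives can be illustrated with a generic sketch. The keystream generator and MAC below are ordinary stand-ins built from SHA-256 (the thesis's own MAJE4 and MACJER-320 designs are not reproduced here): the stream cipher XORs the plaintext with a key- and nonce-derived keystream for confidentiality, and a MAC over the ciphertext provides integrity and authentication.

```python
import hashlib
import hmac

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 in counter mode. A stand-in for a real
    stream cipher, used here only to show the XOR-encryption pattern."""
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    ks = keystream(key, nonce, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

decrypt = encrypt  # XOR with the same keystream inverts itself

def seal(enc_key, mac_key, nonce, msg):
    """Encrypt, then authenticate the ciphertext with a MAC
    (HMAC-SHA256 here, standing in for MACJER-320)."""
    ct = encrypt(enc_key, nonce, msg)
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return ct, tag

def open_sealed(enc_key, mac_key, nonce, ct, tag):
    """Verify the MAC before decrypting; reject any tampered message."""
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed")
    return decrypt(enc_key, nonce, ct)

ct, tag = seal(b"k" * 16, b"m" * 16, b"n" * 8, b"confidential data")
```

Using separate keys for encryption and authentication, and MAC-ing the ciphertext rather than the plaintext, are the standard choices in such encrypt-then-MAC constructions.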

Resumo:

This work is a study of the legal control of the fishing industry in Kerala. Fishery and fishery-related legislations are examined in the light of scientific opinion and judicial decisions. The work is divided into five Parts. The thrust of the study is on the success of legislative measures in attempting to achieve socio-economic justice for the fishermen community. Fishing is more an avocation than an industry: it is basically the avocation of the artisanal or traditional fishermen who depend on it for their livelihood, while as an 'industry' it is a generator of employment, income and wealth. The modern tendency in national legislations is to integrate legal provisions relating to EEZ fisheries into the general fisheries legislation. Chartered fishing was introduced by the Central Government during 1977-78 to establish the abundance and distribution of fishery resources in the Indian EEZ, for transfer of technology and for related purposes. Going by the provisions of Articles 61 and 62 of the U.N. Convention on the Law of the Sea, 1982, foreign fishing need be permitted in our EEZ only if there is any surplus left after meeting our national requirements. Conservation of the renewable fishery resources should start with identification of the species, their habitats, feeding and breeding patterns, and their classification and characteristics. Fishing patterns and their impact on different species and areas also require examination and investigation. It was at the instance of the Central Government that the Kerala Marine Fishing Regulation Act, 1980 was passed, and it was out of concern for our traditional fishermen that the Governments in power in Kerala resorted to the appointment of Commission after Commission to enquire into the problems of resource management and conservation of the resources.

The implementation of the recommendations of these Commissions is the need of the times. General infrastructure has increased to a certain extent in the fishery villages, but this is more the result of the development efforts of the State than of increased earnings from fishing. Fisherwomen are still unable to enjoy the status and role expected of them in society and in the family. Around 120 million people around the world are economically dependent on fisheries. In developing countries like India, small-scale fishers are also the primary suppliers of fish, particularly for local consumption; a most important role of the fisheries sector is as a source of domestically produced food, and fish as a food item is nutritious and of great medicinal value. Consumers in our country face a dramatic rise in fish prices as our 'fishing industry' is linked with lucrative markets in industrial countries. The autonomy of the States should be maintained to the extent possible with the help and co-operation of the Centre, and regional co-operation of the coastal States inter se and with the Centre should be achieved under the leadership of the Centre in matters of regional concern. At the national level, a fisheries management policy and plan should be framed in conformity with the national economic policies and plans, while keeping pace with local and regional needs and priorities. Any such policy, plan and legislation should strive to achieve sustainability of the resources as well as support to the subsistence sector.

Resumo:

The need for miniaturization in the present-day communication industry is challenging. In the present scenario, printed antenna technology is highly suitable for wireless communication due to its low profile and other desirable radiation characteristics. Simple monopole-type antennas are being superseded by compact small antennas in present-day mobile communication applications. Coplanar waveguides (CPWs) are printed on one side of a dielectric substrate. CPWs have attracted the attention of antenna designers due to their excellent properties, such as ease of integration with MMICs, low cost, wide bandwidth, flexibility towards multiband operation, low radiation leakage and low dispersion. The requirement of omnidirectional coverage, light weight and low cost makes these CPW-fed antennas good candidates for wireless applications. The main focus of the thesis is the study of the coplanar waveguide transmission line. Rigorous investigations were performed on both the ground plane and the signal strip of a coplanar waveguide transmission line to create effective radiation characteristics. A good amount of work has been done to transform the CPW line into antennas suitable for mobile phone applications.
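For context, the basic design parameters of the CPW line studied above can be estimated from the standard quasi-static conformal-mapping result (a textbook formula, not a method claimed by the thesis), using Hilberg's closed-form approximation for the elliptic-integral ratio K(k)/K(k'). The substrate value below is an illustrative FR-4-like assumption.

```python
import math

def cpw_parameters(strip_w: float, gap: float, eps_r: float):
    """Quasi-static CPW on an infinitely thick substrate (no bottom ground):
    returns (effective permittivity, characteristic impedance in ohms).
    strip_w is the centre-strip width and gap the strip-to-ground slot,
    in the same units (only their ratio matters)."""
    k = strip_w / (strip_w + 2 * gap)      # geometry ratio a/b
    kp = math.sqrt(1 - k * k)              # complementary modulus k'

    def K_ratio(k, kp):
        """K(k)/K(k') via Hilberg's closed-form approximation."""
        if k <= 1 / math.sqrt(2):
            return math.pi / math.log(2 * (1 + math.sqrt(kp)) / (1 - math.sqrt(kp)))
        return math.log(2 * (1 + math.sqrt(k)) / (1 - math.sqrt(k))) / math.pi

    eps_eff = (eps_r + 1) / 2              # fields split between air and substrate
    z0 = 30 * math.pi / math.sqrt(eps_eff) / K_ratio(k, kp)
    return eps_eff, z0

# Hypothetical line: 3 mm strip, 0.3 mm gaps, FR-4-like substrate.
eps_eff, z0 = cpw_parameters(strip_w=3.0, gap=0.3, eps_r=4.4)
```

Widening the gaps raises the impedance and narrowing them lowers it, which is the basic knob a designer turns when matching a CPW feed to 50 ohms.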

Resumo:

Reducing fishing pressure in coastal waters is the need of the day in the Indian marine fisheries sector, which is fast changing from a mere vocational activity to a capital-intensive industry. This requires continuous monitoring of resource exploitation through a scientifically acceptable methodology: data on the production of each species stock, the number and characteristics of the fishing gears of the fleet, various biological characteristics of each stock, the impact of fishing on the environment and the role of fishery-independent factors in availability and abundance. Besides this, there are issues relating to capabilities in stock assessment, taxonomy research, biodiversity, conservation and fisheries management. Generation of a reliable database over a fixed time frame, and its analysis and interpretation, are necessary before drawing conclusions on the stock size, maximum sustainable yield and maximum economic yield, and before implementing various fishing regulatory measures. India, being a signatory to several treaties and conventions, is obliged to assess the exploited stocks and manage them at sustainable levels. Besides, the nation is bound by its obligation of protein food security to its people and livelihood security to those engaged in marine fishing related activities. There are also regional variabilities in fishing technology and fishery resources. All these make it mandatory for India to continue and strengthen its marine capture fisheries research in general and deep-sea fisheries research in particular. Against this background, an attempt is made to strengthen deep-sea fish biodiversity studies, to generate data on the distribution, abundance and catch per unit effort of the fishery resources available beyond 200 m depth in the EEZ off the southwest coast of India, and to unravel some aspects of the life history traits of potentially important non-conventional fish species inhabiting depths beyond 200 m.

This study was carried out as part of the Project on Stock Assessment and Biology of Deep Sea Fishes of the Indian EEZ (MoES, Govt. of India).
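The stock-assessment quantities mentioned above (maximum sustainable yield and sustainable catch levels) can be illustrated with the classical Schaefer surplus-production model, a standard textbook tool rather than a method claimed by this study; the growth rate, carrying capacity and catch figures below are hypothetical.

```python
def surplus_production(B: float, r: float, K: float) -> float:
    """Schaefer model: annual surplus production of a stock of biomass B,
    with intrinsic growth rate r and carrying capacity K."""
    return r * B * (1 - B / K)

def schaefer_reference_points(r: float, K: float):
    """Classical reference points: MSY = rK/4, taken at biomass
    B_MSY = K/2 with fishing mortality F_MSY = r/2."""
    return r * K / 4, K / 2, r / 2

# Hypothetical stock: r = 0.4 per year, K = 100,000 tonnes.
msy, b_msy, f_msy = schaefer_reference_points(r=0.4, K=100_000)

# A constant catch below MSY lets biomass settle at a sustainable level,
# while a catch above MSY would drive the stock down. Simple projection:
B = 60_000.0
for _ in range(50):                       # 50 annual time steps
    B += surplus_production(B, 0.4, 100_000) - 9_000   # catch 9,000 t < MSY
```

This is the simplest of the surplus-production models; real assessments of the kind the study supports fit such models (or age-structured ones) to catch-per-unit-effort series before recommending regulatory measures.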