975 results for Flooding problem in the fields


Relevance: 100.00%

Abstract:

In this article, we report the preparation of conducting natural rubber (NR) with polyaniline (Pani). NR was made into a conductive material by compounding NR with Pani in powder form. NR latex was made conductive by the in situ polymerization of aniline in the presence of NR latex. Different compositions of Pani-NR semi-interpenetrating networks were prepared, and the dielectric properties of all of the samples were determined at microwave frequencies. The cavity perturbation technique was used for this study, with an HP8510 vector network analyzer and a rectangular cavity resonator. Measurements covered the S band (2-4 GHz). Thermal studies were also carried out with thermogravimetric analysis and differential scanning calorimetry.
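The cavity perturbation technique infers complex permittivity from the shift in resonant frequency and quality factor when a small sample is inserted into the cavity. A minimal sketch of the standard first-order perturbation relations (the function name and all numerical values below are illustrative assumptions, not figures from the study):

```python
def permittivity_from_shift(f_c, f_s, q_c, q_s, v_c, v_s):
    """Real and imaginary relative permittivity from the standard
    first-order cavity-perturbation relations.

    f_c, q_c : resonant frequency and Q of the empty cavity
    f_s, q_s : resonant frequency and Q with the sample inserted
    v_c, v_s : cavity and sample volumes
    """
    eps_real = 1.0 + (f_c - f_s) * v_c / (2.0 * f_s * v_s)
    eps_imag = (v_c / (4.0 * v_s)) * (1.0 / q_s - 1.0 / q_c)
    return eps_real, eps_imag

# Illustrative S-band numbers (~3 GHz) only; not measured values.
eps_r, eps_i = permittivity_from_shift(
    f_c=3.00e9, f_s=2.97e9, q_c=2000.0, q_s=1500.0,
    v_c=1.0e-4, v_s=1.0e-7)
print(eps_r, eps_i)
```

The downward frequency shift on sample insertion drives the real part above 1, while the drop in Q (added dielectric loss) gives the imaginary part.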

Relevance: 100.00%

Abstract:

In a recent paper [Phys. Rev. B 50, 3477 (1994)], P. Fratzl and O. Penrose present the results of a Monte Carlo simulation of the spinodal decomposition problem (phase separation) using the vacancy dynamics mechanism. They observe that the t^(1/3) growth regime is reached faster than when using the standard Kawasaki dynamics. In this Comment we provide a simple explanation for the phenomenon based on the role of interface diffusion, which they claim is irrelevant to the observed behavior.
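For context, the t^(1/3) coarsening regime is usually identified by fitting a power-law exponent to domain-size data in log-log coordinates. A minimal sketch of that fit on synthetic data obeying ideal scaling (names and values are illustrative; this is not the authors' analysis code):

```python
import math

def growth_exponent(times, sizes):
    """Least-squares slope of log(size) vs log(time); for coarsening
    data this estimates the exponent n in R(t) ~ t**n."""
    xs = [math.log(t) for t in times]
    ys = [math.log(r) for r in sizes]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# Synthetic domain sizes with ideal t^(1/3) scaling.
ts = [10.0 * k for k in range(1, 50)]
rs = [2.0 * t ** (1.0 / 3.0) for t in ts]
print(growth_exponent(ts, rs))
```

On real simulation data the fitted exponent approaches 1/3 only at late times, which is why how quickly that regime is reached is the point at issue in the Comment.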

Relevance: 100.00%

Abstract:

Information and communication technologies (ICTs) are the tools that underpin the emerging "Knowledge Society". Exchange of information or knowledge between people and through networks of people has always taken place, but ICT has radically changed the magnitude of this exchange, so factors such as the timeliness of information and patterns of information dissemination have become more important than ever. Since information and knowledge are so vital for all-round human development, libraries and institutions that manage these resources are invaluable, and library and information centres therefore have a key role in the acquisition, processing, preservation and dissemination of information and knowledge. In the modern context, libraries provide services based on different types of documents: manuscripts, printed materials, digital resources, etc. At the same time, the acquisition, access, processing and servicing of these resources have become more complicated than ever before. ICT has been instrumental in extending libraries beyond the physical walls of a building and in helping users navigate and analyse tremendous amounts of knowledge with a variety of digital tools. Modern libraries are thus increasingly being redefined as places offering unrestricted access to information in many formats and from many sources.

The research was conducted in the university libraries of Kerala State, India. It was found that, even though information resources are flooding in the world over and several technologies have emerged to manage the situation and provide effective services to clientele, most of the university libraries in Kerala were unable to exploit these technologies to the maximum. Though the libraries have automated many of their functions, a wide gap prevails between the possible services and the services actually provided.

There are many good examples worldwide of the application of ICTs in libraries to maximize services, and many such libraries have adopted the principles of re-engineering and re-defining as a management strategy. Hence this study examined how effectively modern ICTs have been adopted in our libraries for maximizing the efficiency of operations and services, and whether the principles of re-engineering and re-defining can be applied towards this. Data were collected from library users (students as well as faculty), library professionals and university librarians using structured questionnaires. This was supplemented by observation of the working of the libraries; discussions and interviews with the different types of users and staff; a review of the literature; and personal observation of the organizational set-up, management practices, functions, facilities, resources, and users' utilization of information resources and facilities in the university libraries in Kerala. Statistical techniques such as percentage, mean, weighted mean, standard deviation, correlation and trend analysis were used to analyse the data. All the libraries could exploit only a very few of the possibilities of modern ICTs and hence could not achieve effective Universal Bibliographic Control or the desired efficiency and effectiveness in services; because of this, users as well as professionals are dissatisfied.

Major areas needing special attention include functional effectiveness in the acquisition, access and processing of information resources in various formats; development and maintenance of OPAC and WebOPAC; digital document delivery to remote users; Web-based clearing of library counter services and resources; development of full-text databases, digital libraries and institutional repositories; consortia-based operations for e-journals and databases; user education and information literacy; professional development with stress on ICTs; network administration and website maintenance; and marketing of information. Finance, the level of ICT knowledge among library staff, professional dynamism and leadership, the vision and support of administrators and policy makers, and the prevailing educational set-up and social environment in the state are some of the major hurdles to the university libraries in Kerala reaping the maximum possibilities of ICTs. The principles of Business Process Re-engineering were found suitable for re-structuring and re-defining the operations and service systems of the libraries. Most of the conventional departments or divisions in the university libraries were functioning as watertight compartments, and their existing management systems were too rigid to adopt the principles of change management; hence a thorough re-structuring of the divisions was indicated. Consortia-based activities and the pooling and sharing of information resources were advocated to meet the varied needs of users on the main and off-campus centres of the universities, in affiliated colleges and at remote stations. A uniform staff policy similar to that prevailing in CSIR, DRDO, ISRO, etc. was proposed by the study, not only for the university libraries in Kerala but for the entire country. Restructuring of LIS education and the integrated, planned development of school, college, research and public library systems were also justified for reaping the maximum benefits of modern ICTs.

Relevance: 100.00%

Abstract:

This thesis is entitled "Bayesian Inference in Exponential and Pareto Populations in the Presence of Outliers". Its main theme is a set of estimation problems treated in the Bayesian approach, falling under the general category of accommodation procedures for analysing Pareto data containing outliers. Chapter II considers the problem of estimating the parameters of the classical Pareto distribution specified by its density function. Chapter IV discusses the estimation of (1.19) when the sample contains a known number of outliers under three different data-generating mechanisms, among them the exchangeable model. Chapter V addresses the prediction of a future observation based on a random sample that contains one contaminant. Chapter VI is devoted to estimation problems concerning the exponential parameters under a k-outlier model.
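For orientation, the simplest Bayesian analysis for Pareto data takes the scale parameter as known and places a conjugate Gamma prior on the shape; accommodation procedures for outliers build on this kind of machinery. A minimal sketch of the textbook conjugate case without outliers (the function name, priors and numbers are illustrative assumptions, not the thesis's models):

```python
import math, random

def pareto_shape_posterior(data, sigma, a0=1.0, b0=1.0):
    """With the scale sigma known, a Gamma(a0, b0) prior on the Pareto
    shape alpha is conjugate: the posterior is Gamma(a0 + n, b0 + T),
    where T = sum(log(x_i / sigma)).  Returns the posterior mean."""
    n = len(data)
    t = sum(math.log(x / sigma) for x in data)
    return (a0 + n) / (b0 + t)

# Synthetic Pareto sample via the inverse CDF: X = sigma * U**(-1/alpha).
random.seed(0)
alpha_true, sigma = 3.0, 1.0
data = [sigma * random.random() ** (-1.0 / alpha_true) for _ in range(5000)]
print(pareto_shape_posterior(data, sigma))
```

With a clean sample the posterior mean sits near the true shape; an outlier model such as the exchangeable one modifies the likelihood so that a few contaminated observations do not drag this estimate away.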

Relevance: 100.00%

Abstract:

The thesis is entitled "Studies on Thermal Structure in the Seas Around India". An attempt is made in this study to document the observed variability of thermal structure, on both seasonal and short-term scales, in the eastern Arabian Sea and southwestern Bay of Bengal, using spatial and time-series data sets from a reasonably strong data base. The present study has certain limitations. The mean temperatures are based on an uneven distribution of data in space and time. Some of the areas, although having full annual coverage, do not have adequate data for some months, and some portions of the area under study have data gaps. The consistency and coherence of the internal wave characteristics could not be examined owing to the non-availability of adequate data sets, and for the same reason the influence of generating mechanisms other than winds and tides on the observed internal wave fields could not be ascertained. However, comprehensive and intensive data collection can overcome these limitations. The deployment of moored buoys with arrays of sensors at different depths at some important locations for about 5 to 10 years can provide intensive and extensive data sets. Such a strong data base would make it possible to address the short-term and seasonal variability of the thermal field and to understand in detail the individual and collective influences of the various physical and dynamical mechanisms responsible for such variability.

Relevance: 100.00%

Abstract:

Of the several physical processes occurring in the sea, vertical motions have special significance because of their marked effects on the oceanic environment. Upwelling is the process whereby subsurface layers of the sea move up towards the surface; the reverse process, whereby surface water sinks to subsurface depths, is called sinking. Upwelling is a very conspicuous feature along the west coasts of continents and in equatorial regions, though it also occurs along certain east coasts of continents and in other regions. The thesis is an outcome of investigations carried out by the author on upwelling and sinking off the west and east coasts of India. The aim of the study is to determine the actual period and duration of upwelling and sinking, their driving mechanisms, various associated features and the factors that affect these processes. This is achieved by analysing the temperature and density fields off the west and east coasts of India, and further conclusions are drawn from the divergence field of surface currents, wind stress and sea-level variations.

Relevance: 100.00%

Abstract:

The thesis begins with a review of the basic elements of the general theory of relativity (GTR), which forms the basis for the theoretical interpretation of observations in cosmology. The first chapter also discusses the standard model in cosmology, namely the Friedmann model, with its predictions and problems, and gives a brief discussion of fractals and the inflation of the early universe. The second chapter formulates a new, stochastic approach to cosmology. In this model, the dynamics of the early universe is described by a set of non-deterministic, Langevin-type equations, and the solutions are derived using the Fokker-Planck formalism. Here we demonstrate how the problems with the standard model can be eliminated by introducing the idea of stochastic fluctuations in the early universe. Many recent observations indicate that the present universe may be approximated by a many-component fluid, and we assume that only the total energy density is conserved. This, in turn, leads to energy transfer between the different components of the cosmic fluid, and fluctuations in such energy transfer can induce fluctuations in the mean value of the factor w in the equation of state p = wρ, resulting in a fluctuating expansion rate for the universe. The third chapter discusses the stochastic evolution of the cosmological parameters in the early universe using the new approach. The penultimate chapter is about refinements to be made in the present model by means of a new deterministic model. The concluding chapter presents a discussion of other problems with conventional cosmology, such as the fractal correlation of the galactic distribution, for which the author attempts an explanation using the stochastic approach.
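As a minimal illustration of the Langevin/Fokker-Planck machinery mentioned above (not the thesis's cosmological equations), a generic Langevin equation such as the Ornstein-Uhlenbeck process dX = -θX dt + σ dW can be integrated with the Euler-Maruyama scheme, and the sampled variance compared with the stationary variance σ²/(2θ) obtained from the corresponding Fokker-Planck equation (all names and parameter values are illustrative):

```python
import math, random

def simulate_ou(theta, sigma, x0, dt, n_steps, rng):
    """Euler-Maruyama integration of the Langevin equation
    dX = -theta * X dt + sigma dW (Ornstein-Uhlenbeck process)."""
    x = x0
    path = [x]
    for _ in range(n_steps):
        x += -theta * x * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

rng = random.Random(42)
path = simulate_ou(theta=1.0, sigma=0.5, x0=0.0, dt=0.01,
                   n_steps=200000, rng=rng)

# Fokker-Planck stationary variance: sigma**2 / (2 * theta) = 0.125.
tail = path[len(path) // 2:]          # discard the transient half
mean = sum(tail) / len(tail)
var = sum(v * v for v in tail) / len(tail) - mean ** 2
print(var)
```

The agreement between the simulated and analytic variance is the basic consistency check one would also apply to more elaborate stochastic models.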

Relevance: 100.00%

Abstract:

Heavy metal contamination of the environment may lead to bioaccumulation and, in turn, biomagnification. Hence cheaper and more effective technologies are needed to protect precious natural resources and biological life. A suitable technique is one that meets the technical and environmental criteria for a particular remediation problem; it should be site-specific owing to spatial and climatic variations, and it may not be economically feasible everywhere. The search for newer technologies for environmental therapy involving the removal of toxic metals from wastewaters has directed attention to adsorption, based on the metal-binding capacities of various adsorbent materials. The present study therefore aims to identify and evaluate the most current mathematical formulations describing sorption processes. Although a vast amount of research has been carried out in the area of metal removal by adsorption using activated carbon, few specific research data are available from different scientific institutions. The present work highlights the seasonal and spatial variations in the distribution of some selected heavy metals among the various geochemical phases of the Cochin estuarine system, and also examines an environmental therapeutic/remedial approach using adsorption on activated charcoal and chitosan to reduce, and thereby control, metallic pollution. The thesis is organized in seven chapters with further subdivisions. The first chapter is introductory, stating the necessity of reducing or preventing water pollution because of its hazardous impact on the environment and the health of living organisms, drawing on a careful review of the literature relevant to the present study. It provides a concise description of the study area, its geology and general hydrology, and also sets out the major objectives and scope of the present study.
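Among the mathematical formulations describing sorption processes, the Langmuir isotherm q = q_max·K·C / (1 + K·C) is a standard example, and it is commonly fitted through its linearised form C/q = 1/(q_max·K) + C/q_max. A minimal sketch of that fit on synthetic data (the function name, units and numbers are illustrative, not the study's measurements):

```python
def langmuir_fit(c, q):
    """Fit the Langmuir isotherm q = qmax*K*C / (1 + K*C) by least
    squares on the linearised form C/q = 1/(qmax*K) + C/qmax."""
    xs = c
    ys = [ci / qi for ci, qi in zip(c, q)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    qmax = 1.0 / slope          # slope = 1/qmax
    k = slope / intercept       # intercept = 1/(qmax*K)
    return qmax, k

# Synthetic equilibrium data from a known isotherm
# (qmax = 2.0 mg/g, K = 0.5 L/mg; illustrative values).
cs = [0.5, 1.0, 2.0, 4.0, 8.0, 16.0]
qs = [2.0 * 0.5 * c / (1.0 + 0.5 * c) for c in cs]
print(langmuir_fit(cs, qs))
```

With noise-free synthetic data the fit recovers the generating parameters exactly; with real sorption data the quality of the linear fit is itself a check on whether the Langmuir form is appropriate.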

Relevance: 100.00%

Abstract:

The Arabian Sea is an area of complex air-sea interaction processes with seasonally reversing monsoons. The associated thermohaline variability in the upper layers appears to control the large-scale monsoon flow, which is not yet completely understood. The variability in the thermohaline fields is known to occur in the temporal domain on scales ranging from intra-diurnal to inter-annual, and in the spatial domain on scales from a few tens of kilometres to a few thousand kilometres. Although the surface temperature of the Arabian Sea has been routinely measured both by conventional instruments and by satellites, the corresponding information on the subsurface thermohaline field is very sparse owing to the lack of adequate measurements. In such cases numerical models offer promise in providing information on subsurface features, given an initial thermohaline field and surface heat flux boundary conditions. This thesis is an outcome of investigations into various aspects of the thermohaline variability on different time scales. In addition to describing the mean annual cycle, the one-dimensional numerical models of Miller (1976) and Price et al. (1986) are used to simulate the observed mixed-layer characteristics at selected locations in the Arabian Sea on time scales ranging from intra-diurnal to synoptic under variable atmospheric forcing.

Relevance: 100.00%

Abstract:

Urban developments have exerted immense pressure on wetlands. Urban areas are normally centres of commercial activity and continue to attract migrants in large numbers from different areas in search of employment. As a result, habitations keep coming up in natural areas and flood plains. This is happening in various Indian cities and towns, where large habitations are appearing in low-lying areas, often encroaching even on drainage channels; in some cases, houses are constructed on top of nallahs and drains. In the case of Kochi the situation is even worse, as the base of the urban development itself stands on a completely reclaimed island, and the topography and geology demanded further reclamation of land as the city developed as an agglomerative cluster. Cochin is a coastal settlement interspersed with a large backwater system and fringed on the eastern side by laterite-capped low hills, from which a number of streams drain into the backwater system. The ridge line of the eastern low hills provides a well-defined watershed delimiting the Cochin basin, which helps to confine the environmental parameters within a physical limit. This leads to the obvious conclusion that, if physiography alone is considered, the western flatland is ideal for urban development. However, such development would result in serious environmental deterioration, as this flatland comprises mainly wetlands, and making land available would require large-scale filling of these wetlands, which include shallow mangrove-fringed water sheets, paddy fields, Pokkali fields, the estuary, etc. The urban boundaries of Cochin are expanding fast, with a consequent over-stretching of the existing fabric of basic amenities and services. Urbanisation leads to the transformation of agricultural land into built-up areas, with concomitant problems regarding water supply, drainage, and garbage and sewage disposal. Many of the environmental problems of Cochin are hydrologic in origin: water-logging and floods, sedimentation and pollution in the water bodies, and shoreline erosion.

Relevance: 100.00%

Abstract:

Interfacings of various subjects generate new fields of study and research that help to advance human knowledge. One of the latest such fields is neurotechnology, an effective amalgamation of neuroscience, physics, biomedical engineering and computational methods. Neurotechnology provides a platform for physicists, neurologists and engineers to interact and to break down methodology- and terminology-related barriers. Advancements in computational capability and the wider scope of applications of nonlinear dynamics and chaos in complex systems have enhanced the study of neurodynamics; however, there is still a need for an effective dialogue among physicists, neurologists and engineers. The application of computer-based technology in medicine through signal and image processing, the creation of clinical databases to help clinicians, and similar efforts are widely acknowledged, and such synergic effects between widely separated disciplines may help enhance the effectiveness of existing diagnostic methods. One recent method in this direction is the analysis of the electroencephalogram with methods from nonlinear dynamics. This thesis is an effort to understand the functional aspects of the human brain by studying the electroencephalogram. The algorithms and other related methods developed in the present work can be interfaced with a digital EEG machine to unfold the information hidden in the signal; ultimately this can be used as a diagnostic tool.

Relevance: 100.00%

Abstract:

The evolution of wireless sensor network technology has enabled us to develop advanced systems for real-time monitoring. In the present scenario, wireless sensor networks are increasingly being used for precision agriculture. Their advantages in agriculture are distributed data collection and monitoring, and the monitoring and control of climate, irrigation and nutrient supply, thereby decreasing the cost of production and increasing its efficiency. This paper describes the development and deployment of a wireless sensor network for crop monitoring in the paddy fields of Kuttanad, a region of Kerala, a state in southern India.

Relevance: 100.00%

Abstract:

Treating e-mail filtering as a binary text classification problem, researchers have applied several statistical learning algorithms to e-mail corpora with promising results. This paper examines the performance of a Naive Bayes classifier using different approaches to feature selection and tokenization on different e-mail corpora.
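As a sketch of the approach described, a multinomial Naive Bayes classifier with whitespace tokenization and Laplace (add-one) smoothing can be written in a few lines. The corpus and all names below are toy examples, not the corpora or feature-selection schemes evaluated in the paper:

```python
import math
from collections import Counter

def tokenize(text):
    """Simplest possible tokenizer: lowercase whitespace split."""
    return text.lower().split()

def train_nb(docs, labels):
    """Multinomial Naive Bayes: per-class token counts and priors."""
    counts = {c: Counter() for c in set(labels)}
    priors = Counter(labels)
    vocab = set()
    for doc, lab in zip(docs, labels):
        toks = tokenize(doc)
        counts[lab].update(toks)
        vocab.update(toks)
    return counts, priors, vocab, len(docs)

def classify(text, counts, priors, vocab, n_docs):
    """Pick the class maximizing log P(c) + sum log P(token|c),
    with Laplace smoothing over the vocabulary."""
    best, best_lp = None, -math.inf
    for c in priors:
        total = sum(counts[c].values())
        lp = math.log(priors[c] / n_docs)
        for tok in tokenize(text):
            lp += math.log((counts[c][tok] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Toy corpus (illustrative only).
spam = ["win money now", "free prize claim now", "cheap money offer"]
ham = ["meeting agenda attached", "lunch tomorrow with the team",
       "project status report"]
model = train_nb(spam + ham, ["spam"] * 3 + ["ham"] * 3)
print(classify("claim free money", *model))
print(classify("team meeting report", *model))
```

Swapping `tokenize` for a different tokenizer, or restricting `vocab` to a selected feature subset, is exactly the kind of variation whose effect the paper studies.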

Relevance: 100.00%

Abstract:

The problem of using information available from one variable X to make inference about another variable Y is classical in many physical and social sciences. In statistics this is often done via regression analysis, where the mean response is used to model the data. One stipulates the model Y = µ(X) + ε. Here µ(x) is the mean response at the predictor variable value X = x, and ε = Y - µ(X) is the error. In classical regression analysis both X and Y are observable, and one then proceeds to make inference about the mean response function µ(X). In practice there are numerous examples where X is not available, but a variable Z is observed which provides an estimate of X. As an example, consider the herbicide study of Rudemo et al. [3], in which a nominal measured amount Z of herbicide was applied to a plant but the actual amount X absorbed by the plant is unobservable. As another example, from Wang [5], an epidemiologist studies the severity of a lung disease, Y, among the residents of a city in relation to the amount of certain air pollutants. The amount of air pollutants Z can be measured at certain observation stations in the city, but the actual exposure of the residents to the pollutants, X, is unobservable and may vary randomly from the Z-values. In both cases X = Z + error; this is the so-called Berkson measurement error model. In the more classical measurement error model one observes an unbiased estimator W of X and stipulates the relation W = X + error. An example of this model occurs when assessing the effect of nutrition X on a disease: measuring nutrition intake precisely within 24 hours is almost impossible. There are many similar examples in agricultural and medical studies; see, e.g., Carroll, Ruppert and Stefanski [1] and Fuller [2], among others.

In this talk we shall address the question of fitting a parametric model to the regression function µ(X) in the Berkson measurement error model: Y = µ(X) + ε, X = Z + η, where η and ε are random errors with E(ε) = 0, X and η are d-dimensional, and Z is the observable d-dimensional random variable.
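A small simulation makes the structure of the Berkson model concrete: Z is observed, X = Z + η is not, and Y = µ(X) + ε. For a linear µ, the regression of Y on Z remains unbiased because E[η | Z] = 0, a well-known property of the Berkson (as opposed to the classical) error model. The function name and all parameter values below are illustrative:

```python
import random

def simulate_berkson(mu, n, sd_eta, sd_eps, rng):
    """Draw (Z, Y) pairs from the Berkson model:
    Z observed, X = Z + eta unobserved, Y = mu(X) + eps."""
    data = []
    for _ in range(n):
        z = rng.uniform(0.0, 10.0)
        x = z + rng.gauss(0.0, sd_eta)   # true covariate, never observed
        y = mu(x) + rng.gauss(0.0, sd_eps)
        data.append((z, y))
    return data

rng = random.Random(1)
data = simulate_berkson(lambda x: 2.0 + 3.0 * x, 20000, 1.0, 0.5, rng)

# Ordinary least squares of Y on the *observed* Z.
zs = [z for z, _ in data]
ys = [y for _, y in data]
mz, my = sum(zs) / len(zs), sum(ys) / len(ys)
b = (sum((z - mz) * (y - my) for z, y in data)
     / sum((z - mz) ** 2 for z in zs))
a = my - b * mz
print(a, b)
```

For a linear mean response the recovered intercept and slope are close to the true (2, 3); for nonlinear µ the naive regression on Z is biased, which is what motivates the fitting problem addressed in the talk.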