15 results for "concept éthique dense"
at Cochin University of Science & Technology
Abstract:
The present study deals with chaos and fractals in general topological spaces. Chaos theory originated with the work of Edward Lorenz; the phenomenon by which order turns into disorder is known as chaos. The theory of fractals has its origin in the framework of Benoit Mandelbrot in 1977; fractals are irregular objects. In this study, various properties of topological entropy in chaotic spaces, including hyperspaces, are examined. Topological entropy is a measure of the complexity of a space and allows different chaotic spaces to be compared. The concept of fractals cannot be extended directly to general topological spaces, since it involves Hausdorff dimensions. The relations between Hausdorff dimension and packing dimension are investigated. Regular sets, which were defined in ℝⁿ by K. Falconer using Hausdorff measures, are defined here in metric spaces using packing measures. Some properties of self-similar sets and partial self-similar sets are studied. A directed graph can be associated with each partial self-similar set, and the dimension properties of partial self-similar sets are studied using this graph. Super self-similar sets are introduced as a generalization of self-similar sets, and it is proved that chaotic self-similar sets are dense in the hyperspace. The study concludes with relationships between different kinds of dimension and fractals, defining regular sets through packing dimension in the same way that K. Falconer defined regular sets through Hausdorff dimension, and establishing various properties of such regular sets.
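As a reminder of the notions this abstract builds on, the standard definitions (following K. Falconer's treatment) of the s-dimensional Hausdorff measure and the Hausdorff dimension can be stated as:

```latex
\[
\mathcal{H}^{s}(F)=\lim_{\delta\to 0}\,
\inf\Bigl\{\sum_{i}|U_{i}|^{s}\;:\;
\{U_{i}\}\ \text{is a countable }\delta\text{-cover of }F\Bigr\},
\qquad
\dim_{H}F=\inf\{\,s\ge 0:\mathcal{H}^{s}(F)=0\,\}.
\]
```

The packing dimension is defined analogously from packing measures, and the two always satisfy \(\dim_{H}F\le\dim_{P}F\), which is the starting point for the relations between the dimensions that the thesis investigates.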
Abstract:
School of Legal Studies, Cochin University of Science & Technology
Abstract:
Sharing information with those who need it has always been an idealistic goal of networked environments. With the proliferation of computer networks, information is so widely distributed among systems that well-organized schemes for retrieval and discovery are imperative. This thesis investigates the problems associated with such schemes and suggests a software architecture aimed at achieving meaningful discovery. The use of information elements as a modelling base for efficient information discovery in distributed systems is demonstrated with the aid of a novel conceptual entity called the infotron. The investigations focus on distributed systems and their associated problems. The study was directed towards identifying a suitable software architecture and incorporating it in an environment where information growth is phenomenal and a proper mechanism for carrying out information discovery becomes feasible. An empirical study undertaken with the aid of an election database of geographically distributed constituencies provided the required insights. This is manifested in the Election Counting and Reporting Software (ECRS) system, an essentially distributed software system designed to prepare reports for district administrators about the election counting process and to generate other miscellaneous statutory reports. Most distributed systems of the nature of ECRS possess a "fragile architecture" that makes them liable to collapse when minor faults occur. This is resolved with the proposed penta-tier architecture, which places five different technologies at the different tiers of the architecture. The results of the experiments conducted, and their analysis, show that such an architecture helps keep the different components of the software insulated from internal and external faults.
The architecture thus evolved needed a mechanism to support information processing and discovery, which necessitated introducing the novel concept of infotrons. Further, when a computing machine has to perform any meaningful extraction of information, it is guided by what is termed an infotron dictionary. A second empirical study examined which of the two prominent markup languages, HTML and XML, is better suited for incorporating infotrons; a comparative study of 200 documents in HTML and XML came out in favour of XML. The concepts of the infotron and the infotron dictionary were then applied to implement an Information Discovery System (IDS). IDS is a system that starts from the infotron(s) supplied as clue(s) and distils the information required to satisfy the discoverer's need from the documents at its disposal (its information space). The various components of the system and their interactions follow the penta-tier architectural model and can therefore be considered fault-tolerant. IDS is generic in nature, and its characteristics and specifications were drawn up accordingly. Many subsystems interacted with the multiple infotron dictionaries maintained in the system. To demonstrate IDS at work, and to discover information without modifying a typical Library Information System (LIS), an Information Discovery in Library Information System (IDLIS) application was developed. IDLIS is essentially a wrapper for the LIS, which maintains all the databases of the library. The purpose was to demonstrate that the functionality of a legacy system can be enhanced by augmenting it with IDS, yielding an information discovery service. IDLIS demonstrates IDS in action.
IDLIS shows that any legacy system can be effectively augmented with IDS to provide the additional functionality of an information discovery service. Possible applications of IDS and the scope for further research in the field are also covered.
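The abstract does not reproduce the infotron data structures, but the clue-driven lookup it describes can be pictured with a purely hypothetical sketch; the dictionary layout and function names below are illustrative assumptions, not taken from IDS itself:

```python
from typing import Dict, List, Set

def discover(clues: List[str],
             infotron_dict: Dict[str, Set[str]]) -> Set[str]:
    """Return the documents associated with every supplied clue term,
    i.e. the intersection of the per-clue document sets."""
    sets = [infotron_dict.get(clue.lower(), set()) for clue in clues]
    if not sets:
        return set()
    result = sets[0].copy()
    for s in sets[1:]:
        result &= s
    return result
```

In this toy picture each dictionary maps a clue term to the set of documents it indexes; supplying several clues narrows the information space, roughly as the thesis describes IDS "brewing" information from the documents at its disposal.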
Abstract:
In this paper, a time-series complexity analysis of dense-array electroencephalogram (EEG) signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity of signals recorded from systems that range from the purely deterministic to the purely stochastic. The analysis is conducted with the objective of gaining insight into complexity variations related to changing brain dynamics, for EEG recorded in three conditions: a passive, eyes-closed state; a mental arithmetic task; and the same mental task carried out after a physically exerting exercise. The statistic proves to be a robust quantifier of complexity suited to short physiological signals such as the EEG, and it points to the specific brain regions that exhibit lowered complexity during the mental-task state compared with the passive, relaxed state. For the mental tasks carried out before and after the physical exercise, the statistic can detect the variations introduced by the intermediate fatigue-inducing exercise period. This enhances its utility in detecting subtle changes in brain state, with wider scope for application in EEG-based brain studies.
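Sample Entropy itself is simple to state: SampEn(m, r) = -ln(A/B), where B is the number of template pairs of length m that match within tolerance r, and A is the number that still match when extended to length m + 1. A minimal illustrative implementation follows; it is a simplified variant that compares all templates and takes r as a fraction of the signal's standard deviation, so the parameter handling is an assumption, not the paper's exact procedure:

```python
import math

def sample_entropy(series, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts pairs of length-m templates matching
    within tolerance r * std(series); A counts pairs still matching at
    length m + 1. Lower values indicate a more regular signal."""
    n = len(series)
    mean = sum(series) / n
    sd = (sum((x - mean) ** 2 for x in series) / n) ** 0.5
    tol = r * sd

    def matches(length):
        temps = [series[i:i + length] for i in range(n - length + 1)]
        return sum(
            1
            for i in range(len(temps))
            for j in range(i + 1, len(temps))
            if max(abs(a - b) for a, b in zip(temps[i], temps[j])) <= tol
        )

    a, b = matches(m + 1), matches(m)
    return -math.log(a / b) if a and b else float("inf")
```

A strictly alternating signal yields a value near zero, while an irregular one yields a much larger value, mirroring the lowered-complexity regions the study reports for task states.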
Abstract:
The growth in traffic and in maintenance expenditure creates an urgent need for better, longer-lasting, and more efficient roads that prevent or minimize bituminous pavement distresses. Many of the principal pavement distresses initiate or increase in severity because of the presence of water. On Kerala highways, where traditional dense-graded mixtures are used for the surface courses, the major distress is moisture-induced damage. Stone Matrix Asphalt (SMA) mixtures provide a durable surface course, and the proven field performance of a test track at Delhi recommends SMA as the right choice to sustain severe climatic and heavy traffic conditions. The concept of SMA is, however, not widely adopted in India, and its application is very limited, mainly owing to the lack of proper specifications. This research studies the influence of additives on the characteristics of SMA mixtures and proposes an ideal surface course for pavements. The additives used in this investigation are coir, sisal and banana fibres (natural fibres), waste plastics (a waste material) and polypropylene (a polymer). A preliminary investigation is conducted to characterize the materials used in the study. The Marshall test is conducted to optimize the SMA mixtures (a control mixture without additives and stabilized mixtures with additives). Indirect tensile strength, compressive strength, triaxial strength and drain-down sensitivity tests are conducted to study the engineering properties of the stabilized mixtures, and the performance of all stabilized mixtures is compared with the control mixture and among themselves. A statistical analysis (SPSS package ver. 16) is performed to establish the findings of the study.
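Of the tests listed, the indirect tensile strength test has a particularly simple evaluation: for a cylindrical specimen loaded across its diameter, ITS = 2P / (π t d), with peak load P, specimen thickness t and diameter d. A small sketch follows; the 101.6 mm x 63.5 mm specimen size is the usual Marshall geometry, assumed here rather than quoted from the thesis:

```python
import math

def indirect_tensile_strength(peak_load_n, thickness_mm, diameter_mm):
    """ITS = 2P / (pi * t * d); with P in newtons and dimensions in mm,
    the result is in MPa (N/mm^2)."""
    return 2 * peak_load_n / (math.pi * thickness_mm * diameter_mm)

# e.g. a hypothetical 10 kN failure load on a standard Marshall specimen:
its = indirect_tensile_strength(10_000, 63.5, 101.6)  # about 0.99 MPa
```

The same strength value computed for the control and stabilized mixtures is what makes the additive comparison in the study quantitative.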
Abstract:
Effective solids-liquid separation is the basic concept of any wastewater treatment system. Biological treatment methods employ microorganisms to treat wastewater. The conventional activated sludge process (ASP) poses the problem of poor settleability and hence requires a large footprint. Biogranulation is an effective biotechnological process that can overcome the drawbacks of conventional ASP to a great extent. Aerobic granulation represents an innovative cell-immobilization strategy in biological wastewater treatment: aerobic granules are self-immobilized microbial aggregates cultivated in sequencing batch reactors (SBRs). Aerobic granules have several advantages over conventional activated sludge flocs, such as a dense and compact microbial structure, good settleability and high biomass retention. For cells in a culture to aggregate, a number of conditions have to be satisfied, so aerobic granulation is affected by many operating parameters. The organic loading rate (OLR) is one influencing parameter, since it helps to enrich different bacterial species and influences the size and settling ability of the granules. Hydrodynamic shear force, caused by aeration and measured as superficial upflow air velocity (SUAV), has a strong influence and is hence used to control the granulation process. Settling time (ST) and volume exchange ratio (VER) are two further key factors, which can be considered selection pressures responsible for aerobic granulation based on the concept of minimal settling velocity. These four parameters (OLR, SUAV, ST and VER) were therefore selected as the major influencing parameters, and their influence on aerobic granulation was investigated in this work.
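Two of the four parameters have simple operational definitions worth making explicit: OLR is the mass of substrate (as COD) fed per unit reactor volume per day, and VER is the fraction of the working volume replaced in each SBR cycle. A small sketch, with example figures that are hypothetical rather than the study's operating values:

```python
def organic_loading_rate(feed_m3_per_day, cod_kg_per_m3, reactor_volume_m3):
    """OLR = Q * S0 / V, in kg COD per m^3 of reactor per day."""
    return feed_m3_per_day * cod_kg_per_m3 / reactor_volume_m3

def volume_exchange_ratio(volume_decanted_m3, working_volume_m3):
    """VER: fraction of the working volume withdrawn (and refilled)
    in each sequencing-batch cycle."""
    return volume_decanted_m3 / working_volume_m3

# e.g. a 4 L lab SBR fed 4 L/day of 2 kg COD/m^3 wastewater,
# decanting 2 L per cycle:
olr = organic_loading_rate(0.004, 2.0, 0.004)   # 2.0 kg COD m^-3 d^-1
ver = volume_exchange_ratio(0.002, 0.004)       # 0.5
```

SUAV is similarly the aeration flow rate divided by the reactor's cross-sectional area, and ST is set directly by the cycle schedule, which is why all four can be imposed as controlled selection pressures.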
Abstract:
(in Hindi)