905 results for Modern western city, alternative community, spirituality, man mass, self-sustainability


Relevance:

30.00%

Publisher:

Abstract:

In the wake of the 2008 global financial crisis and of the various factors that led to it, it is surprising that an ethical and just alternative for a profitable and stable finance still does not exist (or at least remains overlooked). Having decided to contribute to the centuries-old field of Shari'a commercial law, we were surprised by the discovery of the problem at the origin of this thesis. In France, we followed interesting doctrinal debates whose conclusions align with the general Western perception of the nature of Islamic finance, whether regarding finance based on Shari'a law or the requirements for its full introduction into the French legal system of the time. This initial interest in Islamic finance then led to an interest in the ethical and justice-related advantages of Shari'a commercial law as a whole, which is at the heart of this thesis. In today's world of commerce and finance, transactions are marked by excessive risk-taking and a speculative spirit akin to gambling, leading to colossal losses. Worse still, these losses are then transferred to the community. Consequently, are there ethical and legal precepts, principles or rules that can provide some form of security and protection in today's financial markets? Is this achievable? This thesis argues that the richness of Islamic jurisprudence, and of its rules whose advantages have not yet been fully grasped and regenerated in response to today's new challenges, can still continually provide solutions and reform financial products so as to reflect principles of justice and fairness.
In this process, new light will be shed on certain already known topics as part of the intended contribution of this thesis, though this will not be its main objective.

Relevance:

30.00%

Publisher:

Abstract:

This thesis analyzes the articulation of the experience of the loss of experience brought about by the dictatorships and post-dictatorships in Chile and Argentina in Lumpérica (1983) by Diamela Eltit, Los planetas (1999) by Sergio Chejfec, and Los rubios (2003) by Albertina Carri. The last military dictatorships in the Latin American Southern Cone imposed, or prepared the ground for, the implantation of a new neoliberal order that intensified under the subsequent democratic regimes. During this transition, the state terrorism through which the military governments sought to eliminate any form of resistance to the reconfiguration of society required for the implementation of neoliberal policies gave rise to an unprecedented experience that is difficult to communicate. Moreover, both the dictatorships and the post-dictatorial democracies put in place mechanisms for forgetting the past, whether through repression, political consensus or the mass media. It is in this context that experience disappears. The inquiry into the experience of the loss of experience rests mainly on two theoretical axes: the concept of the transition from the State to the Market in the Latin American Southern Cone developed by intellectuals such as Willy Thayer, Idelber Avelar and Brett Levinson, among others, as well as Sergio Rojas's reflections on the dictatorial experience and Walter Benjamin's on the crisis of experience in modernity. The first chapter, devoted to Lumpérica, interprets the nocturnal ritual in which the protagonist, a woman called L. Iluminada, seduces the novel's male protagonist, an electric billboard called "el luminoso" that projects advertising messages in the middle of a public square in Santiago, so that it wounds her and marks her skin, as the staging of a "photographic desire" to keep an imprint of the transition that other media tend to erase.
The second chapter deals with the figuration of excess in Los planetas and analyzes how writing, photography and urban space, by taking on the function of supplements for the voice and presence of M, abducted and disappeared during the Argentine dictatorship, account for the experience of the loss of the experience of a plenitude. After setting out the role of toys in the controversy generated by Los rubios, the third chapter analyzes how Carri's film on the memory of her disappeared parents transmits the experience of the post-dictatorial generations and confronts the legacy of the past through play.

Relevance:

30.00%

Publisher:

Abstract:

In recent decades, the morphological changes of Iranian houses, the arrival of artificial lighting, and insufficient awareness of the value of daylight for occupant well-being have led to a decline in the use of daylight in contemporary Iranian dwellings. Consequently, the level of occupant well-being has decreased, which can be correlated with the reduced use of daylight. Considering traditional Iranian architecture and the importance of daylight in traditional dwellings, this research studies the use of daylight in traditional dwellings and explores how extrapolating these techniques to contemporary houses could increase the use of daylight and thereby improve occupant well-being. A literature review, a survey of Iranian experts and a case study of traditional courtyard houses in the city of Kashan provided the data needed for this research. Within the research context, the city of Kashan was chosen in particular for its intact historical fabric. Daylight analysis was carried out with simulation software for three courtyard houses in Kashan sharing the same winter-living-room characteristics. The study focuses on daylight analysis in winter living rooms because of the priority that emerged from the expert survey and the literature review. The results show that extrapolating traditional daylighting techniques to modern dwellings can be considered an alternative design option, one that can optimize the use of daylight and consequently improve occupant well-being.
The approach used in this research provided an opportunity to study the architecture of the past and to assess its importance more precisely. This research thus contributes to defining a model that draws lessons from the past to solve present-day problems.

Relevance:

30.00%

Publisher:

Abstract:

This thesis begins as a simple question in response to the model of the "perfect flâneur" that Baudelaire elaborated in Le peintre de la vie moderne (1853): can a flâneur be imperfect? I suggest three possible interpretations of the word "imperfect". First, it allows the flâneur to be taken out of the strict context of nineteenth-century Paris and permits imperfect translations of the figure into other contexts. Second, the flâneur wanders in the "imperfect" dimension of fictional imagination, a dimension comparable to the anamorphic image of the skull in Holbein's painting The Ambassadors. Finally, by reference to the imperfect tense, the "imperfect flâneur" can recall the antiheroic figure of the human being whose existence is banal and unfinished, like the phrase "there was". These three visions contribute to the reinterpretation of the flâneur in the context of the late twentieth century. My hypothesis is that the flâneur's urban experience and flânerie are possible only if one admits to being imperfect, accepts one's imperfections, and is not surprised by them. Four studies of contemporary novels and their respective cities form the main chapters. The first studies Montreal in Robert Majzels's City of Forgetting. I examine the ways in which the homeless characters can be considered as occupying (or failing to occupy) contemporary Montreal while they are themselves displaced. The second chapter focuses on Rohinton Mistry's Bombay in A Fine Balance. My study here addresses the question of hospitality in relation to the sheltering and "un-sheltering" of strangers in the city. The third chapter takes us to Hong Kong with XiXi's Feituzhen series, in which I argue that the special method of hopscotch appears as a unique form of imperfect flânerie. The fourth chapter studies Istanbul through Orhan Pamuk's The Black Book.
Inspired by Edward Said's notions of "beginning", my argument is built on the following question: how and when does a narration begin? In lieu of a conclusion, I have imagined a conversation between the author of this thesis and the imperfect flâneur characters present in the various chapters.

Relevance:

30.00%

Publisher:

Abstract:

The main source of protein for human and animal consumption is the agricultural sector, where production is vulnerable to diseases, fluctuations in climatic conditions and deteriorating hydrological conditions due to water pollution. Single Cell Protein (SCP) production has therefore evolved as an excellent alternative. Among all sources of microbial protein, yeast has attained global acceptability and has been preferred for SCP production. The screening and evaluation of nutritional and other culture variables of microorganisms are very important in the development of a bioprocess for SCP production. The application of statistical experimental design in bioprocess development can result in improved product yields, reduced process variability, closer conformance of the output response to target requirements, and reduced development time and overall cost. The present work was undertaken to develop a bioprocess technology for the mass production of a marine yeast, Candida sp. S27. Yeasts isolated from the offshore waters of the southwest coast of India and maintained in the Microbiology Laboratory were subjected to various tests for the selection of a potent strain for biomass production. The selected marine yeast was identified based on ITS sequencing, and biochemical/nutritional characterization of Candida sp. S27 was carried out. Using Response Surface Methodology (RSM), the process parameters (pH, temperature and salinity) were optimized. For mass production of yeast biomass, a chemically defined medium (Barnett and Ingram, 1955) and a crude medium (molasses-yeast extract) were optimized using RSM. Scale-up of biomass production was done in a bench-top fermenter using these two optimized media. The comparative efficacy of the defined and crude media was estimated, besides nutritional evaluation of the biomass produced with these two optimized media.
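The core of the RSM optimization step described above, fitting a second-order model to observed responses at coded factor levels and solving for the stationary point, can be sketched as follows. This is a generic illustration, not the thesis's actual design or data: the design points, the synthetic yield response and the noise level are all invented for demonstration.

```python
import numpy as np

def quadratic_features(X):
    """Expand [x1, x2, x3] into full second-order model terms."""
    x1, x2, x3 = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([
        np.ones(len(X)),             # intercept
        x1, x2, x3,                  # linear terms
        x1 * x2, x1 * x3, x2 * x3,   # interaction terms
        x1**2, x2**2, x3**2,         # pure quadratic terms
    ])

rng = np.random.default_rng(0)
# Coded factor levels (-1, 0, +1) for pH, temperature, salinity.
levels = np.array([-1.0, 0.0, 1.0])
X = np.array([[a, b, c] for a in levels for b in levels for c in levels])

# Synthetic "biomass yield" with a true optimum near (0.2, -0.3, 0.1).
y = (10 - (X[:, 0] - 0.2)**2 - (X[:, 1] + 0.3)**2 - (X[:, 2] - 0.1)**2
     + rng.normal(0, 0.05, len(X)))

# Least-squares fit of the second-order response surface.
beta, *_ = np.linalg.lstsq(quadratic_features(X), y, rcond=None)

# Stationary point: set the gradient to zero and solve the linear system.
b_lin = beta[1:4]
Q = np.array([
    [2 * beta[7], beta[4], beta[5]],
    [beta[4], 2 * beta[8], beta[6]],
    [beta[5], beta[6], 2 * beta[9]],
])
x_opt = np.linalg.solve(Q, -b_lin)
print("coded optimum (pH, temperature, salinity):", np.round(x_opt, 2))
```

In an actual study the coded optimum would be converted back to real pH, temperature and salinity values, and the fit would be checked with ANOVA and lack-of-fit tests before scale-up.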

Relevance:

30.00%

Publisher:

Abstract:

Modern computer systems are plagued with stability and security problems: applications lose data, web servers are hacked, and systems crash under heavy load. Many of these problems or anomalies arise from rare program behavior caused by attacks or errors. A substantial percentage of web-based attacks are due to buffer overflows, and many methods have been devised to detect and prevent the anomalous situations that arise from them. The current state of the art in anomaly detection systems is relatively primitive and depends mainly on static code checking to handle buffer overflow attacks; for protection, stack guards and heap guards are also in wide use. This dissertation proposes an anomaly detection system based on the frequencies of system calls in the system call trace. System call traces represented as frequency sequences are profiled using sequence sets. A sequence set is identified by the starting sequence and the frequencies of specific system calls. The deviation of the current input sequence from the corresponding normal profile in the frequency pattern of system calls is computed and expressed as an anomaly score. A simple Bayesian model is used for accurate detection. Experimental results are reported which show that the frequency of system calls, represented using sequence sets, captures the normal behavior of programs under normal conditions of usage. This captured behavior allows the system to detect anomalies with a low rate of false positives. Data are presented which show that a Bayesian network over frequency variations responds effectively to induced buffer overflows. It can also help administrators detect deviations in program flow introduced by errors.
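The frequency-based idea can be made concrete with a minimal sketch: build a normal profile of relative system-call frequencies from clean traces, then score a new trace by its deviation from that profile. The traces, the profile construction and the plain absolute-deviation score below are invented for illustration; the dissertation's sequence sets and Bayesian model are more elaborate than this.

```python
from collections import Counter

def frequency_profile(traces):
    """Average relative frequency of each system call over normal traces."""
    totals = Counter()
    for trace in traces:
        counts = Counter(trace)
        n = len(trace)
        for call, c in counts.items():
            totals[call] += c / n
    return {call: s / len(traces) for call, s in totals.items()}

def anomaly_score(trace, profile, eps=1e-6):
    """Sum of absolute deviations from the profile's relative frequencies.

    Calls never seen in the profile contribute their full frequency, so
    unusual calls (e.g. execve after a buffer overflow) raise the score.
    """
    counts = Counter(trace)
    n = len(trace)
    calls = set(profile) | set(counts)
    return sum(abs(counts.get(c, 0) / n - profile.get(c, eps)) for c in calls)

# Normal behaviour: mostly read/write with occasional open/close.
normal = [["open", "read", "read", "write", "close"] * 20 for _ in range(10)]
profile = frequency_profile(normal)

ok = anomaly_score(["open", "read", "read", "write", "close"] * 20, profile)
attack = anomaly_score(["open", "read", "execve", "setuid", "execve"] * 20, profile)
print(f"normal score {ok:.3f}  attack score {attack:.3f}")
```

A real detector would compare the score against a threshold learned from normal runs, or feed the per-call deviations into a Bayesian model as the dissertation does, rather than hard-coding a cutoff.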

Relevance:

30.00%

Publisher:

Abstract:

This study examines the environment of deposition and the lateral variation in texture, mineralogy and geochemistry of the Ashtamudy lake sediments. While the heavy mineral and clay mineral investigations enable us to decipher the nature, texture and source of the sediments, the organic matter and carbonate contents and the geochemical analysis of major and minor elements help establish their distribution and concentration with regard to the various physico-chemical processes operating in the lake. The study of trace elements holds prime importance in this work, since their concentrations can be used to outline the extent of contaminated bottom area, as well as the sources and dispersal paths of discharged pollutants. In short, this study brings out a vivid picture of the mineralogy and geochemistry of the lake sediments in different environments, viz., the freshwater, brackish-water and marine environments that are confined to the eastern, central and western parts of the lake respectively. For a better understanding and expression of the analytical results, the lake has been divided into three zones: the eastern, central and western parts.

Relevance:

30.00%

Publisher:

Abstract:

Urban developments have exerted immense pressure on wetlands. Urban areas are normally centers of commercial activity and continue to attract large numbers of migrants in search of employment. As a result, habitations keep coming up in natural areas and flood plains. This is happening in various Indian cities and towns, where large habitations are coming up in low-lying areas, often encroaching even on drainage channels; in some cases, houses are constructed on top of nallahs and drains. In the case of Kochi the situation is even worse, as the base of the urban development itself stands on a completely reclaimed island. The topography and geology also demanded more reclamation of land as the city developed as an agglomerative cluster. Cochin is a coastal settlement interspersed with a large backwater system and fringed on the eastern side by laterite-capped low hills from which a number of streams drain into the backwaters. The ridge line of the eastern low hills provides a well-defined watershed delimiting the Cochin basin, which helps to confine the environmental parameters within a physical limit. This leads to the obvious conclusion that, if physiography alone is considered, the western flatland is ideal for urban development. However, this would result in serious environmental deterioration, as the flatland comprises mainly wetland, and making land available would require large-scale filling of these wetlands, which include shallow mangrove-fringed water sheets, paddy fields, Pokkali fields, the estuary, etc. The urban boundaries of Cochin are expanding fast, with a consequent over-stretching of the existing fabric of basic amenities and services. Urbanisation leads to the transformation of agricultural land into built-up areas, with concomitant problems of water supply, drainage, garbage and sewage disposal.
Many of the environmental problems of Cochin are hydrologic in origin: water-logging and floods, sedimentation and pollution in the water bodies, and shoreline erosion.

Relevance:

30.00%

Publisher:

Abstract:

Giant freshwater prawn, Macrobrachium rosenbergii (de Man), is an important commercial species with considerable export value, ideal for cultivation under low-saline conditions and in freshwater zones (Kurup 1994). However, despite more than a decade of research on its larval production systems, vibriosis still hampers seed production, resulting in high mortality rates. Among the different species of vibrios, Vibrio alginolyticus has been isolated frequently from diseased shrimp as the aetiological agent of vibriosis and has been described as a principal pathogen of both penaeids and non-penaeids (Lightner 1988; Baticados, Cruz-Lacierda, de la Cruz, Duremdez-Fernandez, Gacutan, Lavilla-Pitogo & Lio-Po 1990; Mohney, Lightner & Bell 1994; Lee, Yu, Chen, Yang & Liu 1996). Vibrio fluvialis, V. alginolyticus, V. cholerae non-O1 (Fujioka & Greco 1984), Aeromonas liquifaciens and V. anguillarum (Colorni 1985) have been isolated from the larvae of M. rosenbergii. A profound relationship between the abundance of members of the family Vibrionaceae and larval mortality (Singh 1990), and the predominance of Vibrio in eggs, larvae and post-larvae of M. rosenbergii (Hameed, Rahaman, Alagan & Yoganandhan 2003), have been reported. The present paper reports the isolation, characterization, pathogenicity and antibiotic sensitivity of V. alginolyticus associated with M. rosenbergii larvae during an occurrence of severe mass mortality at the ninth larval stage.

Relevance:

30.00%

Publisher:

Abstract:

In this computerized, globalised and Internet-connected world, our computers collect various types of information about every human being and store them in files hidden deep on the hard drive. Files like the cache, browser history and other temporary Internet files can be used to store sensitive information such as logins and passwords, names, addresses, and even credit card numbers. A hacker can get at this information by wrongful means and share it with someone else, or can install malicious software on your computer that will extract your sensitive and secret information. Identity theft poses a very serious problem to everyone today. If you have a driver's licence, a bank account, a computer, a ration card number, a PAN card number, an ATM card or simply a social security number, you are more than at risk: you are a target. Whether you are new to the idea of identity theft or have some unanswered questions, the overview below should bring you up to speed. Identity theft is a term used to refer to fraud that involves pretending to be someone else in order to steal money or obtain other benefits. It is a serious crime, increasing at a tremendous rate all over the world since the evolution of the Internet. There is widespread agreement that identity theft causes financial damage to consumers, lending institutions, retail establishments, and the economy as a whole. Surprisingly, there is little good public information available about the scope of the crime and the actual damages it inflicts. Accounts of identity theft in recent mass media and in film or literature have centered on the exploits of 'hackers', variously lauded or reviled, who are depicted as cleverly subverting corporate firewalls or other data protection defenses to gain unauthorized access to credit card details, personnel records and other information. Reality is more complicated, with electronic identity fraud taking a range of forms.
The impact of those forms is not necessarily quantifiable as a financial loss; it can involve intangible damage to reputation, time spent dealing with disinformation, and exclusion from particular services because a stolen name has been used improperly. Overall, we can consider electronic networks an enabler of identity theft, with the thief, for example, gaining information online for action offline, or finding online the basis for theft or other injury. As Fisher pointed out, "These new forms of high-tech identity and securities fraud pose serious risks to investors and brokerage firms across the globe." I am myself a victim of identity theft, and as a victim I felt the need to create awareness among computer and Internet users, particularly youngsters, in India. Nearly 70 per cent of India's population lives in villages, and the Government of India has already started providing computer and Internet facilities even to remote villages through various rural development and rural upliftment programmes. Highly educated people, established companies and world-famous financial institutions are becoming victims of identity theft. The question here is how vulnerable the illiterate and innocent rural people are if they are suddenly exposed to a new device through which someone can extract and exploit their personal data without their knowledge. In this research work an attempt has been made to bring out the real problems associated with identity theft in developed countries from an economist's point of view.

Relevance:

30.00%

Publisher:

Abstract:

Kochi, the commercial capital of Kerala, South India, and the second most important city on the western coast after Mumbai, is a land with a wide variety of residential environments. Due to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choices of technology and public apathy, the present pattern of the city can be classified as haphazard growth, with the typical problems characteristic of unplanned urban development, especially in the case of solid waste management. To secure better living conditions for ourselves and our future generations, we must know where we are now and how far we need to go: each individual must calculate how much nature we use and compare it to how much nature we have available. This can be achieved by applying the concept of the ecological footprint. Ecological footprint analysis (EFA) is a quantitative tool that represents, in spatial terms, the ecological load imposed on the earth by humans. The aim of applying EFA to Kochi city is to quantify the consumption and waste generation of a population and to compare it with the existing biocapacity. By quantifying the ecological footprint we can formulate strategies to reduce the footprint and thereby achieve sustainable living. The paper discusses the various footprint components of Kochi city and analyses in detail the waste footprint of the residential areas using a waste footprint analyzer. An attempt is also made to suggest some waste footprint reduction strategies, thereby making the city sustainable as far as solid waste management is concerned.

Relevance:

30.00%

Publisher:

Abstract:

Kochi, the commercial capital of Kerala and the second most important city on the western coast of India after Mumbai, is a land with a wide variety of residential environments. The present pattern of the city can be classified as haphazard growth, with the typical problems characteristic of unplanned urban development. This trend can be ascribed to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choices of technology and public apathy. Ecological footprint analysis (EFA) is a quantitative tool that represents, in spatial terms, the ecological load imposed on the earth by humans. This paper analyses the scope of EFA as a sustainable environmental management tool for Kochi city.

Relevance:

30.00%

Publisher:

Abstract:

In the past, natural resources were plentiful and people were scarce, but the situation is rapidly reversing. Our challenge is to find a way to balance human consumption with nature's limited productivity in order to ensure that our communities are sustainable locally, regionally and globally. Kochi, the commercial capital of Kerala, South India, and the second most important city on the western coast after Mumbai, is a land with a wide variety of residential environments. Due to rapid population growth, changing lifestyles, food habits and living standards, institutional weaknesses, improper choices of technology and public apathy, the present pattern of the city can be classified as haphazard growth, with the typical problems characteristic of unplanned urban development. Ecological Footprint Analysis (EFA) is a physical accounting method, developed by William Rees and M. Wackernagel, that focuses on land appropriation using land as its "currency". It provides a means of measuring and communicating human-induced environmental impacts upon the planet. The aim of applying EFA to Kochi city is to quantify the consumption and waste generation of a population and to compare it with the existing biocapacity. By quantifying the ecological footprint we can formulate strategies to reduce the footprint and thereby achieve sustainable living. In this paper, an attempt is made to explore ecological footprint analysis as a tool and to calculate and analyse the ecological footprint of the residential areas of Kochi city. The paper also discusses and analyses the waste footprint of the city, and suggests strategies to reduce the footprint, thereby making the city sustainable.
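The footprint arithmetic that EFA rests on, converting each consumption item into the bioproductive area required to supply it and expressing the total in global hectares, can be sketched in a few lines. All quantities, yields, equivalence factors and the biocapacity figure below are invented placeholders for illustration, not measured Kochi data.

```python
# item: (annual per-capita consumption, world-average yield per hectare,
#        equivalence factor for that land type) -- all values hypothetical
consumption = {
    "food_grains_kg": (150.0, 2700.0, 2.5),   # cropland
    "firewood_kg":    (300.0, 1800.0, 1.3),   # forest land
    "fish_kg":        (25.0,  30.0,   0.4),   # fishing grounds
}

def footprint_gha(items):
    """Per-capita footprint in global hectares:
    sum over items of (consumption / yield) * equivalence factor."""
    return sum(qty / yld * eq for qty, yld, eq in items.values())

ef = footprint_gha(consumption)
biocapacity = 1.6  # assumed available global hectares per capita
print(f"footprint {ef:.2f} gha vs biocapacity {biocapacity} gha")
print("ecological deficit" if ef > biocapacity else "ecological reserve")
```

Comparing the summed footprint against the available biocapacity is exactly the deficit-or-reserve judgment the paper draws for the city; a full account would also include built-up land, energy land and the waste footprint components.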

Relevance:

30.00%

Publisher:

Abstract:

The study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes, and this huge amount of data demands many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series, and hence belongs to the interdisciplinary field of astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are known as intrinsic variables. In other cases it is due to external processes, like eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars, while extrinsic variables are grouped into eclipsing binary stars and chromospheric stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve, whose unique shape is a characteristic of each type of variable star.
One way to identify the type of a variable star and to classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages: observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of the basic parameters period, amplitude and phase, as well as some derived parameters. Of these, period is the most important, since wrong periods lead to sparse light curves and misleading information. Time series analysis is the application of mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps, owing to the daily variation of daylight and the weather conditions for ground-based observations; observations from space may suffer from the impact of cosmic-ray particles. Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data, and most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, like the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong detection of the period can occur for several reasons, such as power leakage to other frequencies caused by the finite total interval, the finite sampling interval and the finite amount of data. Another problem is aliasing, due to the influence of regular sampling; spurious periods also appear because of long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of the huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
It will benefit the variable star astronomical community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases like the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
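To make the period-search problem concrete, here is a small phase-folding and phase-dispersion search in the spirit of Stellingwerf's PDM; the thesis's modified cubic-spline method is not reproduced here. The unevenly sampled synthetic light curve, the 0.73-day period and the noise level are all invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)
true_period = 0.73  # days (invented)
t = np.sort(rng.uniform(0, 20, 300))             # uneven time sampling
mag = 12.0 + 0.4 * np.sin(2 * np.pi * t / true_period) \
      + rng.normal(0, 0.02, t.size)              # magnitudes with noise

def phase_fold(t, period):
    """Fold observation times on a trial period into phases in [0, 1)."""
    return (t / period) % 1.0

def phase_dispersion(t, mag, period, nbins=10):
    """Within-bin variance of the phased light curve, normalised by the
    overall variance; a correct trial period gives a small value."""
    phase = phase_fold(t, period)
    bins = np.minimum((phase * nbins).astype(int), nbins - 1)
    s2 = 0.0
    for b in range(nbins):
        m = mag[bins == b]
        if m.size > 1:
            s2 += m.var() * m.size
    return s2 / (mag.size * mag.var())

# Grid search over trial periods; the deepest minimum marks the period.
trial_periods = np.linspace(0.2, 2.0, 4000)
theta = np.array([phase_dispersion(t, mag, p) for p in trial_periods])
best = trial_periods[np.argmin(theta)]
print(f"recovered period: {best:.3f} d (true {true_period} d)")
```

The failure modes the text describes show up directly in this statistic: a long gap or a harmonic of the true period produces secondary minima in the dispersion curve, which is why automated pipelines must still confirm the deepest minimum before cataloguing a period.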

Relevance:

30.00%

Publisher:

Abstract:

Research on soil fertility management in sub-Saharan Africa has lately been criticized for largely ignoring farmers' management strategies and their underlying principles. To fill this gap in knowledge, detailed interviews were conducted with 108 farm households about their rationale in managing the soil fertility of 307 individual fields in the agro-pastoral village territory of Chikal in western Niger. To amplify the farmers' information on manuring and corralling practices, repeated measurements of the applied amounts of manure were carried out within six 1-km² monitoring areas from February to October 1998. The interviews revealed that only 2% of the fields were completely fallowed for a period of 1-15 years, but 40% of the fields were at least partially fallowed. Mulching of crop residues was mainly practiced to fight wind erosion but was restricted to 36% of the surveyed fields, given the alternative use of straw as livestock feed. Manure application and livestock corralling, the most effective tools for enhancing soil fertility, were targeted to less than 30% of the surveyed fields. The application of complete fallow and of manuring and corralling practices was strongly related to the households' endowment with resources, especially land and livestock. Within particular fields, measures were mainly applied to spots of poor soil fertility, while the restoration of the productivity of hard pans was of secondary importance. Given the limited spatial coverage of indigenous soil fertility measures and their strong dependence on farmers' wealth, supplementary strategies are needed to restrict the decline of soil fertility in the drought-prone areas of Niger with their heavily weathered soils.