12 results for inter-disciplinary

at Cochin University of Science


Relevance:

60.00%

Publisher:

Abstract:

Econometrics is a young science. It developed during the twentieth century, from the mid-1930s onwards, and primarily after World War II. Econometrics is the unification of statistical analysis, economic theory and mathematics. The history of econometrics can be traced to the use of statistical and mathematical analysis in economics. The most prominent contributions during the initial period can be seen in the works of Tinbergen and Frisch, and also in that of Haavelmo from the 1940s through the mid-1950s. From the rudimentary application of statistics to economic data, such as the use of the laws of error through the development of least squares by Legendre, Laplace, and Gauss, the discipline of econometrics later witnessed the applied works of Edgeworth and Mitchell. A very significant milestone in its evolution has been the work of Tinbergen, Frisch, and Haavelmo in their development of multiple regression and correlation analysis. They used these techniques to test different economic theories using time series data. Even though some predictions based on econometric methodology might have gone wrong, the sound scientific nature of the discipline cannot be ignored. This is reflected in the economic rationale underlying any econometric model, and in the statistical and mathematical reasoning behind the various inferences drawn. The relevance of econometrics as an academic discipline assumes high significance in this context. Because of the inter-disciplinary nature of econometrics (a unification of Economics, Statistics and Mathematics), the subject can be taught in all these broad areas, notwithstanding the fact that most often only Economics students are offered the subject, as those of other disciplines might not have an adequate Economics background to understand it. In fact, econometrics is quite relevant even for technical courses (like Engineering), business management courses (like MBA), professional accountancy courses, etc. It is even more relevant for research students of the various social sciences, commerce and management. In the ongoing scenario of globalization and economic deregulation, there is a need to give added thrust to the academic discipline of econometrics in higher education, across the various social science streams, commerce, management, professional accountancy, etc. Accordingly, the analytical ability of students can be sharpened and their ability to look into socio-economic problems with a mathematical approach improved, enabling them to derive scientific inferences and solutions to such problems. The utmost significance of hands-on practical training in the use of computer-based econometric packages, especially at the post-graduate and research levels, needs to be pointed out here. Mere learning of econometric methodology or the underlying theories alone would not have much practical utility for students in their future careers, whether in academics, industry, or in practice. This paper seeks to trace the historical development of econometrics and to study the current status of econometrics as an academic discipline in higher education. Besides, the paper looks into the problems faced by teachers in teaching econometrics, and those of students in learning the subject, including effective application of the methodology in real-life situations. Accordingly, the paper offers some meaningful suggestions for the effective teaching of econometrics in higher education.
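The least-squares technique of Legendre, Laplace and Gauss mentioned in the abstract remains the workhorse of applied econometrics. As a minimal sketch, the following fits a simple regression line by ordinary least squares; the income-consumption figures are invented purely for illustration and are not drawn from the paper.

```python
def ols_simple(x, y):
    """Fit y = a + b*x by ordinary least squares; return (intercept, slope)."""
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
    sxx = sum((xi - mean_x) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Hypothetical data: household consumption against income.
income = [100, 120, 140, 160, 180]
consumption = [80, 95, 105, 120, 130]
a, b = ols_simple(income, consumption)
print(f"intercept = {a:.2f}, slope = {b:.3f}")
```

In classroom practice, students would run such a regression in an econometric package rather than hand-coding the estimator, but the underlying computation is the one shown.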

Relevance:

60.00%

Publisher:

Abstract:

The study of variable stars is an important topic of modern astrophysics. After the invention of powerful telescopes and high-resolution CCDs, variable star data have been accumulating on the order of petabytes. This huge amount of data needs many automated methods as well as human experts. This thesis is devoted to data analysis on variable stars' astronomical time series data and hence belongs to the inter-disciplinary topic of Astrostatistics. For an observer on earth, stars that show a change in apparent brightness over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermo-nuclear processes, and such stars are generally known as intrinsic variables; in other cases it is due to some external process, like an eclipse or rotation, and the stars are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospherical stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena. Most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data, which contain time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as the light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is a characteristic of each type of variable star.
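Folding a time series on a trial period, as described above, is a one-line computation: each observation time is reduced to a phase in [0, 1). A minimal sketch (the function name and sample times are illustrative, not from the thesis):

```python
import math

def fold(times, period):
    """Fold observation times on a trial period, returning phases in [0, 1).
    Plotting magnitude against these phases gives the phased light curve."""
    return [math.fmod(t, period) / period for t in times]

# Unevenly spaced observation times (days), folded on a 1.25-day trial period.
times = [0.0, 0.4, 1.1, 2.9, 3.75]
print(fold(times, 1.25))
```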
One way to identify the type of a variable star and to classify it is for an expert to visually inspect the phased light curve. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into different stages, like observation, data reduction, data analysis, modeling and classification. The modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g., the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties like mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters like period, amplitude and phase, as well as some other derived parameters. Of these, period is the most important parameter, since wrong periods can lead to sparse light curves and misleading information. Time series analysis is a method of applying mathematical and statistical tests to data, to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic ray particles. Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis. There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as the Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, like the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be brought under automation, none of the methods stated above can fully recover the true periods. The wrong detection of a period can arise for several reasons, such as power leakage to other frequencies, which is due to the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem in the case of huge databases subjected to automation. As Matthew Templeton of the AAVSO states, “Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial”. Derekas et al. (2007) and Deb et al. (2010) state, “The processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification”.
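Of the methods listed, Phase Dispersion Minimisation is the easiest to sketch: fold the data on a trial period, bin the phases, and compare the pooled within-bin variance to the overall variance; the true period minimises this ratio. The following is a simplified illustration of the statistic, not the thesis's implementation (the synthetic sinusoid and all names are ours):

```python
import math

def pdm_theta(times, mags, period, nbins=10):
    """Simplified PDM statistic (after Stellingwerf 1978): the ratio of the
    pooled within-bin variance of the phased light curve to the overall
    variance. Smaller values indicate a better trial period."""
    n = len(mags)
    mean = sum(mags) / n
    total_var = sum((m - mean) ** 2 for m in mags) / (n - 1)
    bins = [[] for _ in range(nbins)]
    for t, m in zip(times, mags):
        phase = math.fmod(t, period) / period
        bins[min(int(phase * nbins), nbins - 1)].append(m)
    ss, dof = 0.0, 0
    for b in bins:
        if len(b) > 1:
            bm = sum(b) / len(b)
            ss += sum((m - bm) ** 2 for m in b)
            dof += len(b) - 1
    return (ss / dof) / total_var

# Synthetic sinusoidal light curve with a true period of 2.5 days.
times = [0.37 * i for i in range(60)]
mags = [math.sin(2 * math.pi * t / 2.5) for t in times]
print(pdm_theta(times, mags, 2.5), pdm_theta(times, mags, 1.7))
```

Scanning this statistic over a grid of trial periods and taking the minimum is the essence of the method; the real algorithm adds overlapping bin covers and significance testing.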
It will be beneficial for the variable star astronomical community if basic parameters, such as period, amplitude and phase, can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, the strengths and weaknesses of these methods are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the “General Catalogue of Variable Stars” or other databases like the “Variable Star Index”, the characteristics of the variability have to be quantified in terms of variable star parameters.

Relevance:

20.00%

Publisher:

Abstract:

This research was undertaken with the objective of studying software development project risk, risk management, project outcomes and their inter-relationships in the Indian context. Validated instruments were used to measure risk, risk management and project outcome in software development projects undertaken in India. A second-order factor model was developed for risk with five first-order factors. Risk management was also identified as a second-order construct, with four first-order factors. These structures were validated using confirmatory factor analysis. Variation in risk across categories of selected organization/project characteristics was studied through a series of one-way ANOVA tests. A regression model was developed for each of the risk factors by linking it to the risk management factors and project/organization characteristics. Similarly, regression models were developed for the project outcome measures, linking them to the risk factors. Integrated models linking risk factors, risk management factors and project outcome measures were tested through structural equation modeling. The quality of the software developed was seen to have a positive relationship with risk management and a negative relationship with risk. The other outcome variables, namely time overrun and cost overrun, had strong positive relationships with risk. Risk management did not have a direct effect on the overrun variables; rather, risk was seen to act as an intervening variable between risk management and the overrun variables.
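The one-way ANOVA used to compare risk across organization/project categories reduces to an F statistic formed from between-group and within-group variation. A minimal sketch with made-up risk scores (the groups and figures are hypothetical, not the study's data):

```python
def one_way_anova_f(groups):
    """F statistic for one-way ANOVA: between-group mean square divided by
    within-group mean square, for a list of numeric groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical risk scores for three project-size categories.
small, medium, large = [1, 2, 3], [2, 3, 4], [5, 6, 7]
print(one_way_anova_f([small, medium, large]))
```

The F value is then compared against the F distribution with (k-1, n-k) degrees of freedom to judge whether risk differs significantly across categories.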

Relevance:

20.00%

Publisher:

Abstract:

Inter-digital capacitive electrodes working as electric field sensors have been developed for touch panel applications. Evaluation circuits that convert variations in the electric fields of such sensors into computer-compatible data are commercially available. We report the development of an inter-digital capacitive electrode working as a sensitive pressure sensor in the range 0-120 kPa. Essentially, it is a touch/proximity sensor converted into a pressure sensor, with a suitable elastomer buffer medium acting as the pressure transmitter. The performance of the sensor has been evaluated and is reported. Such sensors can be made very economical in comparison to existing pressure sensors. Moreover, they can be conveniently fabricated into sensor arrays comprising a number of sensors for distributed pressure sensing applications, such as in biomedical systems.
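An evaluation circuit of the kind mentioned above typically reports a capacitance-related reading, which must then be mapped to pressure through calibration. A minimal sketch of a linear least-squares calibration (all readings and the linear form are hypothetical assumptions for illustration; a real elastomer-buffered sensor may well need a nonlinear fit):

```python
def fit_linear(readings, pressures):
    """Least-squares fit pressure ≈ a + b * reading; return (a, b)."""
    n = len(readings)
    mx = sum(readings) / n
    my = sum(pressures) / n
    b = sum((x - mx) * (y - my) for x, y in zip(readings, pressures)) / \
        sum((x - mx) ** 2 for x in readings)
    a = my - b * mx
    return a, b

def pressure_from_reading(reading, a, b):
    """Convert a raw sensor reading to pressure (kPa) using the calibration."""
    return a + b * reading

# Hypothetical calibration points: (raw reading, applied pressure in kPa).
readings = [10.0, 20.0, 30.0, 40.0]
pressures = [0.0, 40.0, 80.0, 120.0]
a, b = fit_linear(readings, pressures)
print(pressure_from_reading(25.0, a, b))
```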

Relevance:

20.00%

Publisher:

Abstract:

The present study, entitled ‘Inter-State Variations in Manufacturing Productivity and Technological Changes in India’, covers a period of 38 years, from 1960 to 1998-99. The study is mainly based on ASI data. It starts with a discussion of the major facilitating factors of industrialization, namely historical forces, public policy and infrastructure facilities. These are discussed in greater detail in the context of our discussion of Perroux's (1998) ‘growth pole’ and ‘development pole’, Hirschman's (1958) ‘industrial centers’ and Myrdal's ‘spread effect’. Most of the existing literature more or less agrees that the process of industrialization has not been uniform across the Indian states. There has been a decline in inter-state industrial disparities over time; this aspect is dealt with at some length in the third chapter. An important element that deserves detailed attention is intra-regional differences in industrialisation. Regional industrialisation implies the emergence of a few focal points and industrial regions. Calcutta, Bombay and Madras were the initial focal points. Later, other centers like Bangalore, Amritsar and Ahmedabad emerged as nodal points in other states. All major states have such focal points. The analysis made in the third chapter shows that industrial activities generally converge on one or two focal points, and industrial regions have emerged out of these focal points in almost all states. One of the general features of these complexes and regions is that they accommodate approximately 50 to 75 percent of the total industrial units and workers in the state. Such convergence goes hand in glove with urbanization. It was further seen that intra-regional industrial disparity comes down in industrial states like Maharashtra, Gujarat and Uttar Pradesh.

Relevance:

20.00%

Publisher:

Abstract:

The overall objective of the study is to examine whether the tribal communities in Kerala can be considered a coherent group in terms of selected indicators of development, by focusing on nine major tribal communities. The study also aims to bring out the inter-community differences, if any, in aspects of the livelihood options and education levels of the tribal communities in Kerala.

Relevance:

20.00%

Publisher:

Abstract:

This study is an attempt to situate the quality of life and standard of living of local communities in ecotourism destinations, alongside their perceptions of forest conservation and the satisfaction level of the local community. 650 EDC/VSS members from Kerala, demarcated into three zones, constitute the data source. Four variables have been considered for evaluating the quality of life of the stakeholders of the ecotourism sites, which are then funnelled through the income-education spectrum for hypothesizing within the SLI framework. Zone-wise analysis of the community members working in the tourism sector shows that they have benefited from tourism development in the region, as they have obtained both employment and secure livelihood options. Most of the quality-of-life indicators of the community in the ecotourism centres show a promising position. The community's perception does not indicate any negative impact on the environment or on their local culture.

Relevance:

20.00%

Publisher:

Abstract:

Heavy metals in the surface sediments of two coastal ecosystems of Cochin, southwest India, were assessed. The study intends to evaluate the degree of anthropogenic influence on heavy metal concentrations in the sediments of the mangrove and adjacent estuarine stations using the enrichment factor and the geoaccumulation index. The inverse relationship of Cd and Zn with texture in the mangrove sediments suggested anthropogenic enrichment of these metals in the mangrove systems. In the estuarine sediments, the absence of any significant correlation of the heavy metals with other sedimentary parameters, together with their strong interdependence, revealed the possibility that the input is not through natural weathering processes. The analysis of the enrichment factor indicated minor enrichment for Pb and Zn in the mangrove sediments, while extremely severe enrichment for Cd, moderate enrichment for Zn and minor enrichment for Pb were observed in the estuarine system. The geoaccumulation index exhibited very low values for all metals except Zn, indicating that the sediments of the mangrove ecosystem are unpolluted to moderately polluted by anthropogenic activities. However, a very strongly polluted condition for Cd and a moderately polluted condition for Zn were evident in the estuarine sediments.
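The two indices used above have standard definitions: the enrichment factor compares a metal/reference-element ratio in the sample to the same ratio in background material, and the geoaccumulation index is Igeo = log2(Cn / (1.5 * Bn)). A minimal sketch of both computations (the reference element and all concentration values below are hypothetical placeholders; the study's actual background values are not reproduced here):

```python
import math

def enrichment_factor(c_metal, c_ref, bg_metal, bg_ref):
    """EF = (metal / reference element) in the sample divided by the same
    ratio in the background; EF near 1 suggests a crustal origin."""
    return (c_metal / c_ref) / (bg_metal / bg_ref)

def geoaccumulation_index(c_metal, bg_metal):
    """Igeo = log2(Cn / (1.5 * Bn)); the factor 1.5 allows for natural
    fluctuations in background concentrations."""
    return math.log2(c_metal / (1.5 * bg_metal))

# Hypothetical sediment sample: Zn normalised against a reference element.
print(enrichment_factor(150.0, 5.0, 60.0, 6.0))
print(geoaccumulation_index(150.0, 60.0))
```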

Relevance:

20.00%

Publisher:

Abstract:

In many situations probability models are more realistic than deterministic models. Several phenomena occurring in physics are studied as random phenomena changing with time and space. Stochastic processes originated from the needs of physicists. Let X(t) be a random variable, where t is a parameter assuming values from a set T. Then the collection of random variables {X(t), t ∈ T} is called a stochastic process. We denote the state of the process at time t by X(t), and the collection of all possible values X(t) can assume is called the state space.
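A simple random walk is perhaps the most familiar concrete instance of this definition: the parameter set is T = {0, 1, 2, ...} and the state space is the set of integers. The simulation below is our illustration, not part of the text:

```python
import random

def random_walk(steps, seed=0):
    """Simulate a simple random walk X(t): start at 0 and move +1 or -1
    with equal probability at each step. The parameter set is
    T = {0, 1, ..., steps}; the state space is the set of integers."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(steps):
        x += rng.choice([-1, 1])
        path.append(x)
    return path

print(random_walk(10))
```

Note that X(t) always has the same parity as t, a simple property of this particular state space that the assertions below check.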