931 results for parametric oscillators and amplifiers


Relevance: 30.00%

Abstract:

The service quality of any sector has two major aspects, namely technical and functional. Technical quality can be attained by maintaining the technical specifications decided by the organization. Functional quality refers to the manner in which a service is delivered to the customer, which can be assessed from customer feedback. A field survey was conducted based on the management tool SERVQUAL, by designing 28 constructs under 7 dimensions of service quality. Stratified sampling techniques were used to obtain 336 valid responses, and the gap scores between expectations and perceptions were analyzed using statistical techniques to identify the weakest dimension. To assess the technical aspect of availability, six months of live outage data from base transceiver stations were collected. Statistical and exploratory techniques were used to model the network performance. The failure patterns were modeled as competing-risk models, and the probability distributions of service outage and restoration were parameterized. Since the availability of a network is a function of the reliability and maintainability of its network elements, any service provider who wishes to keep up service level agreements on availability should be aware of the variability of these elements and the effects of their interactions. The availability variations were studied by designing a discrete-event simulation model with probabilistic input parameters. The probability distribution parameters derived from the live data analysis were used to design experiments that define the availability domain of the network under consideration. The availability domain can be used as a reference for planning and implementing maintenance activities. A new metric is proposed which incorporates a consistency index along with key service parameters and can be used to compare the performance of different service providers. The developed tool can be used for reliability analysis of mobile communication systems and assumes greater significance in the wake of mobile number portability. It also makes possible a relative measure of the effectiveness of different service providers.
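As a rough illustration of the kind of discrete-event availability simulation described above (a minimal sketch, not the study's tool), the following Python snippet alternates failure and restoration times drawn from assumed probability distributions and estimates availability over a six-month horizon; the Weibull and lognormal parameters are placeholders, not values fitted from the live outage data.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_availability(horizon_h=24 * 30 * 6,
                          ttf_shape=1.2, ttf_scale=300.0,   # assumed Weibull time-to-failure (h)
                          ttr_mu=1.0, ttr_sigma=0.8,        # assumed lognormal time-to-restore (h)
                          n_runs=1000):
    """Monte Carlo estimate of availability = uptime / total observation time."""
    availabilities = []
    for _ in range(n_runs):
        t, uptime = 0.0, 0.0
        while t < horizon_h:
            ttf = ttf_scale * rng.weibull(ttf_shape)   # time until the next failure
            ttr = rng.lognormal(ttr_mu, ttr_sigma)     # time to restore service
            uptime += min(ttf, horizon_h - t)          # only count uptime inside the horizon
            t += ttf + ttr
        availabilities.append(uptime / horizon_h)
    return np.mean(availabilities), np.percentile(availabilities, [5, 95])

mean_a, band = simulate_availability()
print(f"mean availability ~ {mean_a:.4f}, 5-95% band ~ {band}")
```

Running the experiment over grids of distribution parameters, as described above, sweeps out an availability domain for the network under the assumed variability.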

Relevance: 30.00%

Abstract:

The study of simple chaotic maps for non-equilibrium processes in statistical physics has been one of the central themes in the theory of chaotic dynamical systems. Recently, many works have been carried out on deterministic diffusion in spatially extended one-dimensional maps. This can be related to real physical systems such as Josephson junctions in the presence of microwave radiation and parametrically driven oscillators. Transport due to chaos is an important problem in Hamiltonian dynamics as well. A recent approach is to evaluate the exact diffusion coefficient in terms of the periodic orbits of the system in the form of cycle expansions. However, chaotic motion in such spatially extended maps has two complementary aspects: diffusion and intermittency. These are related to the time evolution of the probability density function, which is approximately Gaussian by the central limit theorem. It is noticed that the characteristic function method introduced by Fujisaka and his co-workers is a very powerful tool for analysing both these aspects of chaotic motion. The theory based on the characteristic function actually provides a thermodynamic formalism for chaotic systems. It can also be applied to other types of chaos-induced diffusion, such as the one arising in the statistics of trajectory separation. It was noted that there is a close connection between the cycle expansion technique and the characteristic function method, and it was found that this connection can be exploited to enhance the applicability of the cycle expansion technique. In this way, we found that cycle expansions can be used to analyse the probability density function in chaotic maps. In our research studies we have successfully applied the characteristic function method and the cycle expansion technique to analyse some chaotic maps. In this connection, we introduced two classes of chaotic maps with variable shape by generalizing two types of maps well known in the literature.
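For orientation, deterministic diffusion in a spatially extended map can also be estimated directly from an ensemble, without cycle expansions. The sketch below uses a standard piecewise-linear lifted map of the Klages-Dorfman type (a textbook example, not necessarily one of the variable-shape maps introduced in this work) and measures the diffusion coefficient from the mean squared displacement.

```python
import numpy as np

def lifted_map_step(x, a=3.0):
    """One step of a lifted piecewise-linear map of slope a.
    On the unit cell: y -> a*y for y < 1/2 and y -> a*y + 1 - a for y >= 1/2;
    the lift M(n + y) = n + M(y) lets trajectories jump between cells."""
    n = np.floor(x)
    y = x - n
    y_next = np.where(y < 0.5, a * y, a * y + 1.0 - a)
    return n + y_next

def diffusion_coefficient(a=3.0, n_traj=5000, n_steps=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.random(n_traj)              # ensemble started inside the central cell
    x0 = x.copy()
    for _ in range(n_steps):
        x = lifted_map_step(x, a)
    msd = np.mean((x - x0) ** 2)        # mean squared displacement
    return msd / (2.0 * n_steps)        # D ~ <(x_t - x_0)^2> / (2 t)

print(f"estimated D ~ {diffusion_coefficient():.4f}")
```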

Relevance: 30.00%

Abstract:

Vibrational overtone spectroscopy of molecules containing X-H oscillators (X = C, N, O, ...) has become an effective tool for the study of molecular structure, dynamics, inter- and intramolecular interactions, conformational aspects and substituent effects in aliphatic and aromatic compounds. In the present work, we studied the NIR overtone spectra of some liquid-phase organic compounds. The analysis of the CH, NH and OH overtones yielded important structural information about these systems. In an attempt to obtain information on electronic energy levels, we studied the pulsed Nd:YAG laser-induced fluorescence spectra of certain organic compounds. The pulsed laser Raman spectra of some organic compounds were also studied. The novel high-resolution technique of near-infrared tunable diode laser absorption spectroscopy (TDLAS) was used to record the rotational structure of the second OH overtone spectrum of 2-propanol. The spectral features corresponding to the different molecular conformations could be identified from the high-resolution spectrum. The work described in this thesis is divided into five chapters.
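For reference, overtone analyses of X-H oscillators are usually carried out within the local-mode (anharmonic oscillator) picture; the standard relations below are quoted as background, not as results of this thesis:

$$\Delta E_{0\to v} = v\,\tilde{\omega}_e - (v^2+v)\,\tilde{\omega}_e\tilde{x}_e, \qquad \frac{\Delta E_{0\to v}}{v} = (\tilde{\omega}_e - \tilde{\omega}_e\tilde{x}_e) - v\,\tilde{\omega}_e\tilde{x}_e,$$

so a Birge-Sponer plot of $\Delta E_{0\to v}/v$ against $v$ gives the mechanical frequency $\tilde{\omega}_e$ and the anharmonicity $\tilde{\omega}_e\tilde{x}_e$ of the local X-H oscillator from the measured overtone band positions.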

Relevance: 30.00%

Abstract:

There has been much research on analyzing various forms of competing risks data. Nevertheless, there are several occasions in survival studies where the existing models and methodologies are inadequate for the analysis of competing risks data. The identifiability problem and various types of censoring induce more complications in the analysis of competing risks data than in classical survival analysis. Parametric models are not adequate for the analysis of competing risks data, since the assumptions about the underlying lifetime distributions may not hold well. Motivated by this, in the present study we develop some new inference procedures, which are completely distribution-free, for the analysis of competing risks data.
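As a small illustration of distribution-free analysis of competing risks data (the standard cause-specific cumulative incidence estimate of Aalen-Johansen type, not the new procedures developed in this study), one can estimate the probability of failing from a particular cause as sketched below; the data arrays are hypothetical.

```python
import numpy as np

def cumulative_incidence(times, causes, cause_of_interest):
    """Nonparametric (Aalen-Johansen) cumulative incidence for one failure cause.
    cause == 0 denotes right-censored observations."""
    order = np.argsort(times)
    times, causes = times[order], causes[order]
    n = len(times)

    surv = 1.0          # overall Kaplan-Meier survival just before the current time
    cif = 0.0
    at_risk = n
    out_t, out_cif = [], []

    for t in np.unique(times):
        mask = times == t
        d_any = np.sum(mask & (causes != 0))              # failures from any cause at t
        d_k = np.sum(mask & (causes == cause_of_interest))
        if at_risk > 0:
            cif += surv * d_k / at_risk                   # cause-k hazard weighted by survival
            surv *= 1.0 - d_any / at_risk                 # update overall survival
        at_risk -= np.sum(mask)                           # drop failures and censorings at t
        out_t.append(t)
        out_cif.append(cif)
    return np.array(out_t), np.array(out_cif)

# hypothetical data: event time and cause (0 = censored, 1 or 2 = competing causes)
t = np.array([2.0, 3.5, 3.5, 5.0, 6.1, 7.3, 8.0, 9.2])
c = np.array([1,   0,   2,   1,   1,   0,   2,   1])
print(cumulative_incidence(t, c, cause_of_interest=1))
```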

Relevance: 30.00%

Abstract:

The present work is an attempt to explain particle production in the early universe. We argue that nonzero values of the stress-energy tensor evaluated in a squeezed vacuum state can be due to particle production, and this supports the concept of particle production from zero-point quantum fluctuations. In the present calculation we use the squeezed coherent state introduced by Fan and Xiao [7]. The vacuum expectation values of the stress-energy tensor defined prior to any dynamics in the background gravitational field give all the information about particle production. Squeezing of the vacuum is achieved by means of the background gravitational field, which plays the role of a parametric amplifier [8]. The present calculation shows that the vacuum expectation values of the energy density and pressure contain terms in addition to the classical zero-point energy terms. The calculation of the particle production probability shows that the probability increases as the squeezing parameter increases, reaches a maximum value, and then decreases.
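For orientation, the standard single-mode squeezed-vacuum relations (quoted as background, not the specific expressions derived in this work) are

$$\langle \hat{n} \rangle = \sinh^2 r, \qquad P_{2n} = \frac{(2n)!}{2^{2n}(n!)^2}\,\frac{\tanh^{2n} r}{\cosh r}, \qquad P_{2n+1} = 0,$$

where $r$ is the squeezing parameter. For fixed $n \ge 1$ the pair-production probability $P_{2n}$ first rises with $r$ and then falls, the same qualitative behaviour of the production probability reported above.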


Relevance: 30.00%

Abstract:

A simple and inexpensive power supply suitable for characteristic studies of a klystron is described. The circuit is a modified form of the high-voltage adjustable power supply based on the LM317. It provides the necessary cavity and repeller voltages over a wide range, with good regulation. The system is protected against short circuits and is ideally suited for laboratory experiments with reflex klystrons.
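For reference, the output of an LM317-based adjustable regulator is set by its feedback divider according to the standard datasheet relation (the resistor names follow the usual datasheet convention rather than this particular circuit):

$$V_{\mathrm{out}} = V_{\mathrm{ref}}\left(1 + \frac{R_2}{R_1}\right) + I_{\mathrm{ADJ}}\,R_2, \qquad V_{\mathrm{ref}} \approx 1.25\ \mathrm{V},\ \ I_{\mathrm{ADJ}} \sim 50\ \mu\mathrm{A}.$$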

Relevance: 30.00%

Abstract:

The study of variable stars is an important topic of modern astrophysics. With powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data requires many automated methods as well as human experts. This thesis is devoted to the analysis of variable star astronomical time series data and hence belongs to the interdisciplinary field of Astrostatistics.

For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric variables. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables, such as novae and supernovae, occur rarely and are not periodic phenomena; most of the other variations are periodic in nature.

Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as its light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is characteristic of each type of variable star, and one way to identify and classify a variable star is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers.

Research on variable stars can be divided into stages such as observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine short-term and long-term behaviour, to construct theoretical models (for example, the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, along with some derived parameters. Of these, the period is the most important, since wrong periods lead to sparse light curves and misleading information.

Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behavior. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of big gaps. For ground-based observations this is due to daily varying daylight and weather conditions, while observations from space may suffer from the impact of cosmic-ray particles.

Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data. Most of these surveys release their data to the public for further analysis.

There exist many period search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection can be due to several reasons, such as power leakage to other frequencies caused by the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amount of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification". It would benefit the variable star astronomical community if basic parameters such as period, amplitude and phase were obtained more accurately when huge time series databases are subjected to automation. In the present thesis work, the theories of four popular period search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For classifying newly discovered variable stars and entering them in the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
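As a concrete illustration of one of the non-parametric approaches mentioned above, the following Python sketch implements a bare-bones Phase Dispersion Minimisation search in the spirit of Stellingwerf (1978), without the refinements introduced in this thesis; the trial-period grid, bin count and synthetic light curve are arbitrary choices.

```python
import numpy as np

def pdm_statistic(t, mag, period, n_bins=10):
    """PDM theta statistic: pooled within-bin variance of the phase-folded light
    curve divided by the overall variance. Small theta indicates a good period."""
    phase = (t / period) % 1.0
    overall_var = np.var(mag, ddof=1)
    num, dof = 0.0, 0
    for b in range(n_bins):
        in_bin = (phase >= b / n_bins) & (phase < (b + 1) / n_bins)
        n_b = np.sum(in_bin)
        if n_b > 1:
            num += np.var(mag[in_bin], ddof=1) * (n_b - 1)
            dof += n_b - 1
    return (num / dof) / overall_var if dof > 0 else np.inf

def pdm_search(t, mag, trial_periods, n_bins=10):
    thetas = np.array([pdm_statistic(t, mag, p, n_bins) for p in trial_periods])
    return trial_periods[np.argmin(thetas)], thetas

# hypothetical unevenly sampled sinusoidal light curve with true period 2.5 d
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 400))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 2.5) + rng.normal(0, 0.02, t.size)
best_p, _ = pdm_search(t, mag, np.linspace(2.0, 3.0, 2000))
print(f"best period ~ {best_p:.4f} d")
```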

Relevance: 30.00%

Abstract:

The main instrument used in psychological measurement is the self-report questionnaire. One of its major drawbacks, however, is its susceptibility to response biases. A known strategy to control these biases has been the use of so-called ipsative items. Ipsative items require the respondent to make between-scale comparisons within each item, and the selected option determines to which scale the weight of the answer is attributed. Consequently, in questionnaires consisting only of ipsative items, every respondent is allotted an equal amount, i.e. the total score, which each respondent can distribute differently over the scales. This type of response format therefore yields data that can be considered compositional from its inception. Methodologically oriented psychologists have heavily criticized this item format, since the resulting data are also marked by the associated unfavourable statistical properties. Nevertheless, clinicians have kept using these questionnaires to their satisfaction. This investigation therefore aims to evaluate both positions and addresses the similarities and differences between the two data collection methods. The ultimate objective is to formulate a guideline on when to use which type of item format. The comparison is based on data obtained with both an ipsative and a normative version of three psychological questionnaires, which were administered to 502 first-year students in psychology according to a balanced within-subjects design. Previous research only compared the direct ipsative scale scores with the derived ipsative scale scores. The use of compositional data analysis techniques also enables one to compare derived normative score ratios with direct normative score ratios. The addition of the second comparison not only offers the advantage of a better-balanced research strategy. In principle it also allows for parametric testing in the evaluation
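Compositional data analysis of ipsative scores typically works with log-ratios. The sketch below shows a centred log-ratio (clr) transform applied to hypothetical ipsative scale scores (not data from this study), after which between-scale ratios can be compared on an unconstrained scale.

```python
import numpy as np

def clr(scores, pseudocount=0.5):
    """Centred log-ratio transform of compositional (ipsative) scores.
    A small pseudocount guards against zero allocations before taking logs."""
    x = np.asarray(scores, dtype=float) + pseudocount
    logx = np.log(x)
    return logx - logx.mean(axis=-1, keepdims=True)

# hypothetical ipsative scores of three respondents over four scales,
# each row summing to the same fixed total (here 30 points)
ipsative = np.array([[10, 8, 7, 5],
                     [15, 5, 5, 5],
                     [ 6, 6, 9, 9]])
print(clr(ipsative))
```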

Relevance: 30.00%

Abstract:

There is hardly a case in exploration geology where the studied data do not include below-detection-limit and/or zero values, and since most geological data follow lognormal distributions, these "zero data" represent a mathematical challenge for interpretation. We need to start by recognizing that there are zero values in geology. For example, the amount of quartz in a foyaite (nepheline syenite) is zero, since quartz cannot co-exist with nepheline. Another common essential zero is a north azimuth, although we can always replace that zero with the value 360°. These are known as "essential zeros", but what can we do with "rounded zeros" that result from values below the detection limit of the equipment? Amalgamation, e.g. adding Na2O and K2O as total alkalis, is one solution, but sometimes we need to differentiate between sodic and potassic alteration. Pre-classification into groups requires a good knowledge of the distribution of the data and of the geochemical characteristics of the groups, which is not always available. Setting the zero values equal to the detection limit of the equipment used will generate spurious distributions, especially in ternary diagrams. The same will occur if we replace the zero values by a small amount using non-parametric or parametric techniques (imputation). The method we propose takes into consideration the well-known relationships between some elements. For example, in copper porphyry deposits there is always a good direct correlation between copper and molybdenum values, but while copper will always be above the limit of detection, many of the molybdenum values will be "rounded zeros". So we take the lower quartile of the real molybdenum values, establish a regression equation with copper, and then estimate the "rounded" zero values of molybdenum from their corresponding copper values. The method can be applied to any type of data, provided we first establish their correlation dependency. One of the main advantages of this method is that we do not obtain a fixed value for the "rounded zeros", but one that depends on the value of the other variable. Key words: compositional data analysis, treatment of zeros, essential zeros, rounded zeros, correlation dependency
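A minimal sketch of the regression idea described above, with made-up copper/molybdenum assays: the regression is fitted on the lower quartile of the quantified Mo values, as in the text, and the below-detection Mo values are then predicted from their Cu values. The log-log form of the fit is an assumption for illustration.

```python
import numpy as np

def impute_rounded_zeros(cu, mo, detection_limit):
    """Estimate below-detection ('rounded zero') Mo values from correlated Cu values.
    Fits a log-log regression on the lower quartile of the quantified Mo samples,
    then predicts Mo for the censored samples from their Cu values."""
    cu, mo = np.asarray(cu, float), np.asarray(mo, float)
    quantified = mo >= detection_limit
    censored = ~quantified

    # lower quartile of the quantified Mo values (closest analogues of the censored ones)
    q1 = np.quantile(mo[quantified], 0.25)
    fit_mask = quantified & (mo <= q1)

    # simple least-squares fit of log(Mo) on log(Cu)
    slope, intercept = np.polyfit(np.log(cu[fit_mask]), np.log(mo[fit_mask]), 1)

    mo_imputed = mo.copy()
    mo_imputed[censored] = np.exp(intercept + slope * np.log(cu[censored]))
    return mo_imputed

# hypothetical assay data (ppm); 0.0 marks Mo below the detection limit of 2 ppm
cu = np.array([1200, 800, 1500, 400, 300, 2500, 950, 620, 510, 700, 350, 280])
mo = np.array([  30,  12,   45, 0.0, 0.0,   80,  18,   8,   6,  10, 0.0,   5])
print(impute_rounded_zeros(cu, mo, detection_limit=2.0))
```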

Relevance: 30.00%

Abstract:

In this paper a colour texture segmentation method which unifies region and boundary information is proposed. The algorithm uses a coarse detection of the perceptual (colour and texture) edges of the image to adequately place and initialise a set of active regions. The colour texture of each region is modelled by combining non-parametric kernel density estimation (which captures the colour behaviour) with classical co-occurrence matrix based texture features. Region information is thereby defined, and accurate boundary information can be extracted to guide the segmentation process. Regions then concurrently compete for the image pixels in order to segment the whole image, taking both information sources into account. Furthermore, experimental results are presented which demonstrate the performance of the proposed method.
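As an illustration of the classical co-occurrence matrix texture features mentioned above (a generic implementation, not the authors' code), the sketch below builds a grey-level co-occurrence matrix for one displacement and derives contrast and homogeneity from it.

```python
import numpy as np

def cooccurrence_matrix(img, levels=8, dx=1, dy=0):
    """Grey-level co-occurrence matrix for displacement (dx, dy) >= 0, normalised
    to a joint probability table. img is expected to hold values in [0, 255]."""
    q = (img.astype(np.int64) * levels) // 256     # quantise to `levels` grey levels
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    src = q[0:h - dy, 0:w - dx]                    # reference pixel
    dst = q[dy:h, dx:w]                            # neighbour at displacement (dx, dy)
    np.add.at(glcm, (src.ravel(), dst.ravel()), 1.0)
    return glcm / glcm.sum()

def texture_features(glcm):
    """Contrast and homogeneity of a normalised co-occurrence matrix."""
    i, j = np.indices(glcm.shape)
    contrast = np.sum(glcm * (i - j) ** 2)
    homogeneity = np.sum(glcm / (1.0 + np.abs(i - j)))
    return contrast, homogeneity

rng = np.random.default_rng(0)
patch = rng.integers(0, 256, size=(64, 64))        # hypothetical grey-level patch
print(texture_features(cooccurrence_matrix(patch)))
```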

Relevance: 30.00%

Abstract:

Considering the difficulty of insulin dosage selection and the problem of hyper- and hypoglycaemia episodes in type 1 diabetes, dosage-aid systems appear tremendously helpful for these patients. A model-based approach to this problem must unavoidably consider uncertainty sources such as the large intra-patient variability and food intake. This work addresses the prediction of glycaemia for a given insulin therapy in the face of parametric and input uncertainty, by means of modal interval analysis. As a result, a band containing all possible glucose excursions experienced by the patient under the given uncertainty is obtained. From it, a safer prediction of possible hyper- and hypoglycaemia episodes can be calculated.
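As a toy illustration of propagating parametric uncertainty into a glucose band (using plain interval arithmetic rather than the modal intervals of the paper, and a made-up one-step glucose update rather than a validated physiological model), consider the following sketch; all parameter ranges are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return Interval(min(p), max(p))

# made-up one-step update: G_next = G + dt * (k_meal * carbs - s_i * insulin),
# with uncertain insulin sensitivity s_i and carbohydrate impact k_meal
def glucose_band(g0, carbs, insulin, steps=6, dt=0.5):
    s_i = Interval(1.5, 2.5)        # assumed insulin sensitivity range (mg/dL per U per h)
    k_meal = Interval(0.08, 0.12)   # assumed carbohydrate impact range (mg/dL per g per h)
    g = Interval(g0, g0)
    dt_i = Interval(dt, dt)
    band = [g]
    for _ in range(steps):
        g = g + dt_i * (k_meal * Interval(carbs, carbs) - s_i * Interval(insulin, insulin))
        band.append(g)
    return band

for k, g in enumerate(glucose_band(g0=140.0, carbs=60.0, insulin=4.0)):
    print(f"t = {k * 0.5:.1f} h: glucose in [{g.lo:.1f}, {g.hi:.1f}] mg/dL")
```

The resulting envelope is the kind of band from which possible hyper- and hypoglycaemia episodes can be read off conservatively.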

Relevance: 30.00%

Abstract:

In this paper, the robustness of parametric systems is analyzed using a new approach to interval mathematics called Modal Interval Analysis. Modal intervals are an interval extension that, unlike classic intervals, recovers some of the properties required by a numerical system. Modal Interval Analysis not only simplifies the computation of interval functions but also allows a semantic interpretation of their results. Necessary, sufficient and, in some cases, necessary and sufficient conditions for robust performance are presented.

Relevance: 30.00%

Abstract:

The use of fibre-reinforced polymer (FRP) bars is proposed as an effective alternative to traditional steel-reinforced concrete structures, which suffer corrosion in aggressive environments. The acceptance of these materials in the construction industry is conditioned on an understanding of their structural behaviour. This work studies the bond behaviour between FRP bars and concrete through two experimental programmes. The first comprises the characterisation of the bond between FRP bars and concrete by means of pull-out tests, and the second studies the cracking process of concrete ties reinforced with GFRP bars by means of direct tension tests. The work concludes with the development of a numerical model for simulating the behaviour of reinforced concrete elements under tensile loading. The model's flexibility makes it a suitable tool for carrying out a parametric study of the variables that influence the cracking process.
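For context, the bond characterised in pull-out tests is often summarised, on its ascending branch, by a BPE/Eligehausen-type bond-slip law (quoted here as a common modelling choice, not necessarily the law adopted in this work):

$$\tau(s) = \tau_{m}\left(\frac{s}{s_{m}}\right)^{\alpha}, \qquad 0 \le s \le s_{m},$$

where $\tau_m$ is the peak bond stress, $s_m$ the corresponding slip, and $\alpha$ a fitting parameter obtained from the measured pull-out curves.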

Relevance: 30.00%

Abstract:

This thesis focuses on Computer Vision and, more specifically, on image segmentation, which is one of the basic stages of image analysis and consists of dividing the image into a set of visually distinct and uniform regions according to their intensity, colour or texture. A strategy based on the complementary use of region and boundary information during the segmentation process is proposed; this integration alleviates some of the basic problems of traditional segmentation. The boundary information first allows the number of regions present in the image to be identified and a seed to be placed inside each of them, with the aim of statistically modelling the characteristics of the regions and thereby defining the region information. This information, together with the boundary information, is used to define an energy function that expresses the properties required of the desired segmentation: uniformity inside the regions and contrast with neighbouring regions at the boundaries. A set of active regions then starts growing, competing for the pixels of the image, with the aim of optimising the energy function or, in other words, finding the segmentation that best fits the requirements expressed in that function. Finally, the whole process is embedded in a pyramidal structure, which allows the segmentation result to be refined progressively and its computational cost to be reduced. The strategy has been extended to the texture segmentation problem, which involves some basic considerations such as modelling the regions from a set of texture features and extracting the boundary information when texture is present in the image. Finally, the extension to image segmentation taking into account both colour and texture properties has been carried out. In this respect, the joint use of non-parametric density estimation techniques for the description of colour, and of texture features based on the co-occurrence matrix, has been proposed to model the image regions adequately and completely. The proposal has been evaluated objectively and compared with different integration techniques using synthetic images. In addition, experiments with real images have been included, with very positive results.
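A generic way to write the kind of region-plus-boundary energy described above (an illustrative form, not the exact functional of the thesis) is

$$E(\{R_i\}) \;=\; \alpha \sum_i \int_{R_i} -\log p_i\!\left(I(\mathbf{x})\right)\, d\mathbf{x} \;+\; (1-\alpha) \sum_i \oint_{\partial R_i} \bigl(1 - b(\mathbf{s})\bigr)\, d\mathbf{s},$$

where $p_i$ is the statistical colour-texture model of region $R_i$, $b(\mathbf{s}) \in [0,1]$ is the boundary (edge) strength along the region contour, and $\alpha$ weights region against boundary information; minimising $E$ favours regions whose interiors match their models and whose contours lie on strong perceptual edges, which is what the competing active regions seek.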