956 results for "search for primary scenes technique"


Relevance: 30.00%

Abstract:

Review

Relevance: 30.00%

Abstract:

Photoplethysmography (PPG) is a simple and inexpensive optical technique that can be used to detect blood volume changes in the microvascular bed of tissue. Interest in the technique has resurged in recent years, driven by the demand for low-cost, simple and portable technology for primary care and community-based clinical settings, the wide availability of small, low-cost semiconductor components, and advances in computer-based pulse wave analysis techniques. The present research work deals with the design of a PPG sensor for recording blood volume pulse signals and with selected cardiovascular studies based on these signals. The interaction of light with tissue, the early and recent history of PPG, instrumentation, the measurement protocol and pulse wave analysis are also discussed in this study. The effects of aging, mild cold exposure and variation in body posture on the PPG signal have been studied experimentally.
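
As an illustration of the kind of computer-based pulse wave analysis mentioned above, the following is a minimal sketch (not the thesis's actual processing chain) that estimates heart rate from a raw PPG trace by band-pass filtering and systolic-peak detection; the cut-off frequencies and minimum peak spacing are assumed, illustrative values.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def estimate_heart_rate(ppg, fs):
    """Estimate heart rate (bpm) from a raw PPG trace (illustrative sketch)."""
    # Band-pass 0.5-8 Hz: keeps the cardiac pulse, removes baseline wander
    # and high-frequency noise (assumed cut-offs).
    b, a = butter(3, [0.5, 8.0], btype="band", fs=fs)
    filtered = filtfilt(b, a, ppg)
    # Systolic peaks at least 0.4 s apart, i.e. heart rates below 150 bpm.
    peaks, _ = find_peaks(filtered, distance=int(0.4 * fs))
    ibi = np.diff(peaks) / fs                    # inter-beat intervals (s)
    return 60.0 / np.mean(ibi) if ibi.size else float("nan")
```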

Relevance: 30.00%

Abstract:

With the advent of satellite communication and radio astronomy, the need for large and efficient reflector antennas triggered widespread investigation of reflector feed design techniques. The major improvements sought were the reduction of spill-over and cross-polarisation losses and the enhancement of aperture efficiency. The search for such a feed culminated in the corrugated horn. The main idea behind the present work is to use H-plane sectoral horns fitted with corrugated flanges as feeds for a paraboloid and to see how the secondary pattern of the reflector antenna varies with different parameters of the feed. An offset paraboloid is used as the secondary reflector in order to avoid the adverse effect of aperture blocking by the feed horn structure on the secondary radiation pattern. The measurements were repeated for three different H-plane sectoral horns with the same set of corrugated flanges at various X-band frequencies. The following parameters of the whole system are studied: (a) beam shaping, (b) gain, (c) variation of VSWR and (d) cross-polarisation.
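
For context on how aperture efficiency translates into reflector gain, a minimal sketch follows using the standard circular-aperture relation G = η(πD/λ)²; the diameter, frequency and efficiency values are illustrative assumptions, not figures from the study.

```python
import numpy as np

def paraboloid_gain_dbi(diameter_m, freq_hz, aperture_efficiency):
    """Gain of a circular-aperture reflector: G = eta * (pi * D / lambda)**2."""
    wavelength = 3e8 / freq_hz
    gain = aperture_efficiency * (np.pi * diameter_m / wavelength) ** 2
    return 10 * np.log10(gain)

# Illustrative X-band example (values assumed, not from the study):
print(paraboloid_gain_dbi(1.2, 10e9, 0.75))      # about 40.7 dBi
```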

Relevance: 30.00%

Abstract:

The need for improved feed systems for the large reflector antennas employed in radio astronomy and satellite tracking spurred interest in horn antenna research in the 1960s. The major requirements were to reduce spill-over and cross-polarisation losses and to enhance the aperture efficiency to the order of about 75-80%. The search for such a feed culminated in the corrugated horn. The corrugated horn triggered widespread interest and enthusiasm, and a large amount of work [32, 34, 49, 50, 52, 53, 58, 65, 75, 79] has already been done on this type of antenna. The properties of corrugated surfaces have been investigated in detail. It was strongly felt that the flange technique and the use of corrugated surfaces could be merged to obtain the advantages of both; this is the idea behind the present work. Corrugations are made on the surface of the flange elements and the effects of the various corrugation parameters are studied. By varying the flange parameters, a substantial amount of data was collected and analysed to ascertain the effects of corrugated flanges. The measurements were repeated at various frequencies in the X- and S-bands. The following parameters of the system were studied: (a) beam shaping, (b) gain, (c) variation of VSWR and (d) the possibility of obtaining circularly polarised radiation from the flanged horn. A theoretical explanation of the effects of corrugated flanges is attempted on the basis of line-source theory. Even though this theory uses a simplified model for the calculation of radiation patterns, fairly good agreement between the computed patterns and the experimental results is observed.
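
Since the theoretical explanation is based on line-source theory, the following is a minimal sketch of the normalised far-field pattern of a uniformly illuminated line source, E(u) = sin(πu)/(πu) with u = (L/λ)·sin θ; the aperture length is an assumed, illustrative value and this is not the author's computation.

```python
import numpy as np

def line_source_pattern(length_in_wavelengths, theta_deg):
    """Normalised far-field pattern of a uniformly illuminated line source:
    E(u) = sin(pi*u)/(pi*u), with u = (L/lambda) * sin(theta)."""
    u = length_in_wavelengths * np.sin(np.radians(theta_deg))
    return np.abs(np.sinc(u))                    # np.sinc(x) = sin(pi*x)/(pi*x)

# Illustrative 4-wavelength aperture, pattern over +/-90 degrees, in dB.
angles = np.linspace(-90.0, 90.0, 361)
pattern_db = 20 * np.log10(np.maximum(line_source_pattern(4.0, angles), 1e-6))
```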

Relevance: 30.00%

Abstract:

This paper investigates certain methods of training adopted in a Statistical Machine Translation (SMT) system from English to Malayalam. In English-Malayalam SMT, the word-to-word translation is determined by training on the parallel corpus. Our primary goal is to improve the alignment model by reducing the number of possible alignments of all sentence pairs present in the bilingual corpus. Incorporating morphological information into the parallel corpus with the help of a parts-of-speech tagger has brought about better training results with improved accuracy.
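
By way of illustration of alignment-model training on a parallel corpus, the sketch below implements a plain IBM Model 1 EM estimator of word translation probabilities; it is a generic textbook baseline, not the paper's POS-informed system, and the corpus format, example tokens and variable names are assumptions.

```python
from collections import defaultdict

def train_ibm_model1(parallel_corpus, iterations=10):
    """EM training of word translation probabilities t(f | e) from
    (source_tokens, target_tokens) sentence pairs."""
    target_vocab = {f for _, tgt in parallel_corpus for f in tgt}
    t = defaultdict(lambda: 1.0 / len(target_vocab))    # uniform start
    for _ in range(iterations):
        count = defaultdict(float)    # expected pair counts c(f, e)
        total = defaultdict(float)    # expected source counts c(e)
        for src, tgt in parallel_corpus:
            src = ["NULL"] + src      # NULL source word for unaligned targets
            for f in tgt:
                norm = sum(t[(f, e)] for e in src)
                for e in src:
                    frac = t[(f, e)] / norm
                    count[(f, e)] += frac
                    total[e] += frac
        for (f, e), c in count.items():
            t[(f, e)] = c / total[e]
    return t

# Toy usage with placeholder sentence pairs (real training would use the
# POS-tagged English-Malayalam corpus described in the paper):
corpus = [(["the", "house"], ["veedu"]), (["the", "book"], ["pustakam"])]
probs = train_ibm_model1(corpus)
```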

Relevance: 30.00%

Abstract:

The study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating at the petabyte scale. This huge amount of data requires many automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary field of astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermonuclear processes, and such stars are generally known as intrinsic variables; in other cases it is due to external processes, such as eclipses or rotation, and these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric variables. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most of the other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as the light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as the phased light curve. The unique shape of the phased light curve is characteristic of each type of variable star. One way to identify the type of a variable star and to classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into stages such as observation, data reduction, data analysis, modelling and classification. Modelling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, as well as other derived parameters. Of these, the period is the most important, since a wrong period leads to sparse light curves and misleading information. Time series analysis is the application of mathematical and statistical tests to data in order to quantify the variation, understand the nature of the time-varying phenomenon, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. For ground-based observations this is due to the day-night cycle and weather conditions, while observations from space may suffer from the impact of cosmic ray particles.
Many large-scale astronomical surveys, such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS, provide variable star time series data, even though their primary intention is not variable star observation. The Center for Astrostatistics at Pennsylvania State University was established to help the astronomical community with statistical tools for harvesting and analysing archival data, and most of these surveys release their data to the public for further analysis. There exist many period-search algorithms for astronomical time series analysis, which can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and the Significant Spectrum (SigSpec) method by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection can arise from several causes, such as power leakage to other frequencies due to the finite total interval, the finite sampling interval and the finite amount of data. Another problem is aliasing, which is due to the influence of regular sampling. Spurious periods also appear due to long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton (AAVSO) states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state that "the processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification". It would benefit the variable star community if basic parameters such as period, amplitude and phase could be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theory of four popular period-search methods is studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
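
As one example of the non-parametric period-search methods named above, the following is a minimal sketch of a Stellingwerf-style Phase Dispersion Minimisation statistic; the bin count and trial-period grid are assumed, illustrative choices, and this is not the thesis's modified cubic spline method.

```python
import numpy as np

def pdm_theta(time, mag, trial_period, n_bins=10):
    """Stellingwerf PDM statistic: pooled within-bin variance of the
    phase-folded light curve divided by the overall variance."""
    phase = (time / trial_period) % 1.0
    overall_var = np.var(mag, ddof=1)
    pooled, dof = 0.0, 0
    for b in range(n_bins):
        in_bin = mag[(phase >= b / n_bins) & (phase < (b + 1) / n_bins)]
        if in_bin.size > 1:
            pooled += (in_bin.size - 1) * np.var(in_bin, ddof=1)
            dof += in_bin.size - 1
    return (pooled / dof) / overall_var if dof else np.inf

def pdm_best_period(time, mag, trial_periods, n_bins=10):
    """Return the trial period that minimises the PDM statistic."""
    thetas = np.array([pdm_theta(time, mag, p, n_bins) for p in trial_periods])
    return trial_periods[np.argmin(thetas)]
```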

Relevance: 30.00%

Abstract:

Stimuli outside classical receptive fields have been shown to exert significant influence over the activities of neurons in primary visual cortex. We propose that these contextual influences are used for pre-attentive visual segmentation, in a new framework called segmentation without classification. This means that segmentation of an image into regions occurs without classification of features within a region or comparison of features between regions. This segmentation framework is simpler than previous computational approaches, making it implementable by V1 mechanisms, though higher-level visual mechanisms are needed to refine its output. However, it easily handles a class of segmentation problems that are tricky for conventional methods. The cortex computes global region boundaries by detecting the breakdown of homogeneity or translation invariance in the input, using local intra-cortical interactions mediated by horizontal connections. The difference between contextual influences near and far from region boundaries makes neural activities near region boundaries higher than elsewhere, making the boundaries more salient for perceptual pop-out. This proposal is implemented in a biologically based model of V1 and demonstrated using examples of texture segmentation and figure-ground segregation. The model performs segmentation in exactly the same neural circuit that solves the dual problem of contour enhancement, as suggested by experimental observations. Its behaviour is compared with psychophysical and physiological data on segmentation, contour enhancement and contextual influences. We discuss the implications of segmentation without classification and the predictions of our V1 model, and relate it to other phenomena such as asymmetry in visual search.
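
To make the "breakdown of homogeneity" idea concrete, the toy sketch below flags image locations where local feature statistics change, without labelling the regions themselves; it is only an illustration of the principle, not the paper's V1 circuit with horizontal connections, and the window size is an assumed value.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def boundary_saliency(feature_map, window=9):
    """Toy 'segmentation without classification': score each location by how
    strongly the local mean feature differs on its two sides, so high values
    gather near region boundaries without any region labelling."""
    local_mean = uniform_filter(np.asarray(feature_map, dtype=float), size=window)
    shift = window // 2
    saliency = np.zeros_like(local_mean)
    for axis in (0, 1):                          # vertical and horizontal cuts
        ahead = np.roll(local_mean, -shift, axis=axis)
        behind = np.roll(local_mean, +shift, axis=axis)
        saliency = np.maximum(saliency, np.abs(ahead - behind))
    return saliency                              # border wrap-around ignored for brevity
```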

Relevance: 30.00%

Abstract:

The work carried out focuses on the search for modifying agents for cellulosic fibres capable of reducing the polarity of the alcohol functions in their structure through the formation of ester functions. The jute fibres are reacted in a closed system under a nitrogen atmosphere in order to avoid side reactions that are not of interest. Obtaining the desired results is tied to the experimental conditions applied during the reactions; the different variables chosen facilitate, to a greater or lesser degree, the reaction between the coupling-agent molecules and the cellulose involved. Much of the attention is centred on the study of the modification step, above all on the effectiveness of the added reagents in reacting with the hydroxyl groups. Once the experiments carried out with both oleoyl chloride and methacrylic anhydride were compared, it was concluded that most of the conditions tested allow sufficiently significant degrees of modification to be reached. The exception arises when the temperature used is 20 °C; the same conditions that work at 60 °C then lead to unsatisfactory results. The reactivity of the two coupling agents used was not the same: the same type of experimental conditions led to quite different values. Regarding the determination of the optimal parameters, it is concluded that the variables with which the degree of modification improves considerably are: a temperature of 60 °C, 10% catalyst with respect to the amount of oleoyl chloride or anhydride added, a stoichiometric OH:reagent ratio of 1:1 and 40 mL of solvent. Once modified, the fibre is reacted with styrene monomer. It is found that the degree of polymerisation follows the previously obtained degree of modification: the fewer free alcohol functions, the greater the interaction with the styrene monomer. The initial properties of the fibre do not correspond to those obtained after the treatment; the increased resistance to attack by microorganisms and to moisture absorption is explained by the reduced presence of polar alcohol functions and by the layer of styrene polymerised through bonding with the double bonds introduced by the coupling agents. Regarding the two most frequently used characterisation techniques, elemental analysis allows precise quantification of the reaction of the fibre with the coupling agents and of the subsequent reaction of the modified fibre with the styrene monomer. Characterisation by infrared spectroscopy allows the reactivity of oleoyl chloride to be verified qualitatively, and that of methacrylic anhydride both qualitatively and quantitatively, with the alcohol functions of the cellulose present in the jute fibres. The most characteristic peaks are used to evaluate the reactivity of the carbonyl function of the modifying reagent with the cellulosic structure and of the double bond of the modified cellulose with the polymer matrix.

Relevance: 30.00%

Abstract:

In this paper, we present a distributed computing framework for problems characterized by a highly irregular search tree, for which no reliable workload prediction is available. The framework is based on a peer-to-peer computing environment and dynamic load balancing. The system allows for dynamic resource aggregation, does not depend on any specific meta-computing middleware and is suitable for large-scale, multi-domain, heterogeneous environments such as computational Grids. Dynamic load balancing policies based on global statistics are known to provide optimal load balancing performance, while randomized techniques provide high scalability. The proposed method combines both advantages by adopting distributed job pools and a randomized polling technique. The framework has been successfully adopted in a parallel search algorithm for subgraph mining and evaluated on a molecular compounds dataset. The parallel application has shown good scalability and close-to-linear speedup in a distributed network of workstations.
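
As a toy illustration of distributed job pools combined with randomized polling, the sketch below runs thread-based "peers" that steal work from randomly chosen victims when idle; the termination rule, job representation and class names are assumptions and do not reflect the framework's actual Grid implementation.

```python
import random
import threading
from collections import deque

class Peer(threading.Thread):
    """Toy worker: local job pool plus randomized polling of other peers
    (steals half of a randomly chosen victim's pending jobs when idle)."""

    def __init__(self, peers, results):
        super().__init__(daemon=True)
        self.peers, self.results = peers, results
        self.pool, self.lock = deque(), threading.Lock()

    def push(self, job):
        with self.lock:
            self.pool.append(job)

    def _poll_random_peer(self):
        victim = random.choice([p for p in self.peers if p is not self])
        with victim.lock:
            stolen = [victim.pool.pop() for _ in range(len(victim.pool) // 2)]
        with self.lock:
            self.pool.extend(stolen)

    def run(self):
        idle_polls = 0
        while idle_polls < 20:                    # crude termination for the demo
            with self.lock:
                job = self.pool.popleft() if self.pool else None
            if job is not None:
                idle_polls = 0
                self.results.append(job * job)    # stand-in for a search-tree node
                continue
            self._poll_random_peer()
            with self.lock:
                idle_polls = 0 if self.pool else idle_polls + 1

# Toy usage: all jobs start on one peer and spread by stealing.
results, peers = [], []
peers.extend(Peer(peers, results) for _ in range(4))
for job in range(100):
    peers[0].push(job)
for p in peers:
    p.start()
for p in peers:
    p.join()
```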

Relevance: 30.00%

Abstract:

A technique is presented for locating and tracking objects in cluttered environments. Agents are randomly distributed across the image and subsequently grouped around targets. Each agent uses a weightless neural network and a histogram intersection technique to score its location. The system has been used to locate and track a head in 320 × 240 resolution video at up to 15 fps.
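
A minimal sketch of the histogram-intersection score an agent might use to rate its current location is given below (Swain-and-Ballard-style matching); the normalisation step and the function names are assumptions, and the weightless neural network component is not shown.

```python
import numpy as np

def histogram_intersection(model_hist, patch_hist):
    """Similarity in [0, 1]: sum of bin-wise minima of two normalised histograms."""
    model_hist = model_hist / model_hist.sum()
    patch_hist = patch_hist / patch_hist.sum()
    return float(np.minimum(model_hist, patch_hist).sum())

# An agent would score its current image patch against the target model,
# e.g. score = histogram_intersection(target_hist, patch_hist_at(agent_xy)),
# where patch_hist_at is a hypothetical helper that bins the local colours.
```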

Relevance: 30.00%

Abstract:

In this paper we present a connectionist search technique, Stochastic Diffusion Search (SDS), capable of rapidly locating a specified pattern in a noisy search space. In operation, SDS finds the position of the pre-specified pattern or, if it does not exist, its best instantiation in the search space. This is achieved via parallel exploration of the whole search space by an ensemble of agents searching in a competitive, cooperative manner. We prove mathematically the convergence of Stochastic Diffusion Search: SDS converges to a statistical equilibrium when it locates the best instantiation of the object in the search space. Experiments presented in this paper indicate the high robustness of SDS and show good scalability with problem size. The convergence characteristics of SDS make it a fully adaptive algorithm and suggest applications in dynamically changing environments.
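
The following is a minimal sketch of Stochastic Diffusion Search applied to locating a substring in a longer text: each agent holds a candidate offset, the test phase checks one randomly chosen character of the model, and the diffusion phase lets inactive agents copy hypotheses from randomly polled active agents. The agent count and iteration budget are illustrative assumptions.

```python
import random

def sds_search(text, model, n_agents=100, iterations=200):
    """Minimal Stochastic Diffusion Search for the best match of `model`
    (a short string) within `text` (a long string)."""
    max_offset = len(text) - len(model)
    hyps = [random.randint(0, max_offset) for _ in range(n_agents)]
    active = [False] * n_agents
    for _ in range(iterations):
        # Test phase: each agent checks one randomly chosen model character.
        for i, off in enumerate(hyps):
            j = random.randrange(len(model))
            active[i] = (text[off + j] == model[j])
        # Diffusion phase: inactive agents poll a random agent and either
        # copy its hypothesis (if active) or re-sample a new random offset.
        for i in range(n_agents):
            if not active[i]:
                k = random.randrange(n_agents)
                hyps[i] = hyps[k] if active[k] else random.randint(0, max_offset)
    # The largest cluster of agents marks the best instantiation found.
    return max(set(hyps), key=hyps.count)

# Example: best_offset = sds_search("the quick brown fox", "brown")
```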

Relevance: 30.00%

Abstract:

Stochastic Diffusion Search is an efficient probabilistic best-fit search technique, capable of transformation-invariant pattern matching. Although inherently parallel in operation, it is difficult to implement efficiently in hardware as it requires full inter-agent connectivity. This paper describes a lattice implementation which, while qualitatively retaining the properties of the original algorithm, restricts connectivity, enabling simpler implementation on parallel hardware. Diffusion times are examined for different network topologies, ranging from ordered lattices through small-world networks to random graphs.
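
To indicate how the diffusion phase of the sketch above could be restricted to a lattice, the snippet below replaces the global random poll with a poll over ring-lattice neighbours, with an optional small-world style long-range shortcut; the topology parameters are illustrative assumptions, not those of the paper.

```python
import random

def ring_neighbours(i, n_agents, k=2):
    """Indices of the k nearest neighbours on each side of agent i on a ring."""
    return [(i + d) % n_agents for d in range(-k, k + 1) if d != 0]

def lattice_poll(i, n_agents, k=2, p_rewire=0.1):
    """Pick a polling partner from the ring lattice, taking a long-range
    shortcut to any other agent with probability p_rewire."""
    if random.random() < p_rewire:
        return random.choice([j for j in range(n_agents) if j != i])
    return random.choice(ring_neighbours(i, n_agents, k))

# In the diffusion phase of the earlier SDS sketch, replace
#   k = random.randrange(n_agents)
# with
#   k = lattice_poll(i, n_agents)
```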

Relevance: 30.00%

Abstract:

Literacy as a social practice is integrally linked with social, economic and political institutions and processes. As such, it has a material base which is fundamentally constituted in power relations. Literacy is therefore interwoven with the text and context of everyday living, in which multi-levelled meanings are organically produced at both the individual and the societal level. This paper argues that if language thus mediates social reality, then it follows that literacy, defined as a social practice, cannot really be addressed as a reified, neutral activity; rather, it should take account of the social, cultural and political processes in which literacy practices are embedded. Drawing on the work of key writers within the field, the paper foregrounds the primary role of the state in defining the forms and levels of literacy required and made available at particular moments within society. In a case study of the social construction of literacy meanings in pre-revolutionary Iran, it explores the view that the discourse about societal literacy levels has historically constituted a key terrain in which the struggle for control over meaning has taken place. This struggle, it is argued, sets the interest of the state in maintaining ideological and political control over the production of knowledge within the culture and society against the needs identified by the individual for personal development, empowerment and liberation. In an overall sense, the paper examines existing theoretical perspectives on societal literacy programmes in terms of the scope they provide for analyses that encompass the multi-levelled power relations shaping and influencing dominant discourses on the relative value of literacy for both the individual and society.

Relevance: 30.00%

Abstract:

Searching for and mapping the physical extent of unmarked graves using geophysical techniques has proven difficult in many cases. The success of individual geophysical techniques for detecting graves has to be assessed on a site-by-site basis. Significantly, detection of graves often results from measured contrasts that are linked to the background soils rather than to the type of archaeological feature associated with the grave. It is evident that the investigation of buried remains should be considered within a 3D space, as the burial environment can vary considerably through the grave. Within this paper, we demonstrate the need for a multi-method survey strategy to investigate unmarked graves, as applied at a "planned" but unmarked pauper's cemetery. The outcome of this case study provides new insights into the strategy required at such sites. Perhaps the most significant conclusion is that unmarked graves are best understood in terms of characterization rather than identification. In this paper, we argue for a methodological approach that, while following the current trend of using multiple techniques, is fundamentally dependent on a structured approach to the analysis of the data. The ramifications of this case study illustrate the necessity of an integrated strategy to provide a more holistic understanding of unmarked graves, which may help in the management of these unseen but important aspects of our heritage. It is concluded that the search for graves is still a current debate and one that will be resolved by methodological rather than technique-based arguments.

Relevance: 30.00%

Abstract:

Attention is a critical mechanism for visual scene analysis. By means of attention, it is possible to break down the analysis of a complex scene into the analysis of its parts through a selection process. Empirical studies demonstrate that attentional selection is conducted on visual objects as a whole. We present a neurocomputational model of object-based selection in the framework of oscillatory correlation. By segmenting an input scene and integrating the segments with their conspicuity obtained from a saliency map, the model selects salient objects rather than salient locations. The proposed system is composed of three modules: a saliency map providing saliency values of image locations, image segmentation for breaking the input scene into a set of objects, and object selection, which allows one object of the scene to be selected at a time. This object selection system has been applied to real gray-level and color images, and the simulation results show the effectiveness of the system. (C) 2010 Elsevier Ltd. All rights reserved.
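
A minimal sketch of the final selection step, picking the segment with the highest mean conspicuity from a saliency map and a label image, is given below; the paper's actual mechanism is oscillatory correlation, so this mean-saliency rule is a simplification and the background-label convention is an assumption.

```python
import numpy as np

def select_most_salient_object(saliency_map, segment_labels, background_label=0):
    """Assign each segment the mean saliency of its pixels and return the
    label and binary mask of the most conspicuous segment."""
    labels = [l for l in np.unique(segment_labels) if l != background_label]
    conspicuity = {l: float(saliency_map[segment_labels == l].mean()) for l in labels}
    winner = max(conspicuity, key=conspicuity.get)
    return winner, segment_labels == winner
```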