957 results for conditional random field


Relevance: 30.00%

Abstract:

Typical properties of sparse random matrices over finite (Galois) fields are studied, in the limit of large matrices, using techniques from the physics of disordered systems. For the case of a finite field GF(q) with prime order q, we present results for the average kernel dimension, average dimension of the eigenvector spaces and the distribution of the eigenvalues. The number of matrices for a given distribution of entries is also calculated for the general case. The significance of these results to error-correcting codes and random graphs is also discussed.
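
A minimal numerical sketch (not the statistical-physics calculation above): it estimates the average kernel dimension of sparse random matrices over GF(q) by Gaussian elimination modulo a prime q. The matrix size, the Poisson row-weight model and the mean row weight are illustrative choices, not taken from the paper.

```python
import numpy as np

def rank_gf_q(A, q):
    """Rank of matrix A over GF(q) (q prime) by Gaussian elimination mod q."""
    A = A.copy() % q
    rows, cols = A.shape
    rank = 0
    for col in range(cols):
        # find a pivot in this column at or below row `rank`
        pivot = next((r for r in range(rank, rows) if A[r, col] % q != 0), None)
        if pivot is None:
            continue
        A[[rank, pivot]] = A[[pivot, rank]]
        inv = pow(int(A[rank, col]), q - 2, q)   # modular inverse via Fermat's little theorem
        A[rank] = (A[rank] * inv) % q
        for r in range(rows):
            if r != rank and A[r, col]:
                A[r] = (A[r] - A[r, col] * A[rank]) % q
        rank += 1
        if rank == rows:
            break
    return rank

def sparse_random_matrix(n, q, mean_row_weight, rng):
    """n x n matrix with on average mean_row_weight nonzero GF(q) entries per row."""
    A = np.zeros((n, n), dtype=np.int64)
    for i in range(n):
        k = rng.poisson(mean_row_weight)
        cols = rng.choice(n, size=min(k, n), replace=False)
        A[i, cols] = rng.integers(1, q, size=len(cols))
    return A

rng = np.random.default_rng(0)
q, n, c, trials = 3, 200, 2.0, 50                 # illustrative values
kernel_dims = [n - rank_gf_q(sparse_random_matrix(n, q, c, rng), q) for _ in range(trials)]
print(f"average kernel dimension / n ≈ {np.mean(kernel_dims) / n:.3f}")
```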

Relevance: 30.00%

Abstract:

Properties of computing Boolean circuits composed of noisy logical gates are studied using the statistical physics methodology. A formula-growth model that gives rise to random Boolean functions is mapped onto a spin system, which facilitates the study of their typical behavior in the presence of noise. Bounds on their performance, derived in the information theory literature for specific gates, are straightforwardly retrieved, generalized and identified as the corresponding macroscopic phase transitions. The framework is employed for deriving results on error-rates at various function-depths and function sensitivity, and their dependence on the gate-type and noise model used. These are difficult to obtain via the traditional methods used in this field.
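
The following is a toy Monte Carlo sketch, not the spin-system mapping used above: it estimates how the output error rate of a balanced formula of ε-noisy NAND gates grows with formula depth. The gate type, noise level and depth range are illustrative.

```python
import random

def noisy_nand(a, b, eps, rng):
    """NAND gate whose output bit is flipped with probability eps."""
    out = not (a and b)
    return (not out) if rng.random() < eps else out

def noisy_formula(depth, leaf_value, eps, rng):
    """Balanced binary NAND formula of given depth; noiseless leaves hold leaf_value."""
    if depth == 0:
        return leaf_value
    a = noisy_formula(depth - 1, leaf_value, eps, rng)
    b = noisy_formula(depth - 1, leaf_value, eps, rng)
    return noisy_nand(a, b, eps, rng)

def ideal_formula(depth, leaf_value):
    """Same formula with noiseless gates, used as the reference output."""
    if depth == 0:
        return leaf_value
    v = ideal_formula(depth - 1, leaf_value)
    return not (v and v)

rng = random.Random(1)
eps, trials = 0.02, 2000                    # illustrative gate noise level and sample size
for depth in range(1, 9):
    target = ideal_formula(depth, True)
    errors = sum(noisy_formula(depth, True, eps, rng) != target for _ in range(trials))
    print(f"depth {depth}: error rate ≈ {errors / trials:.3f}")
```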

Relevance: 30.00%

Abstract:

The ERS-1 Satellite was launched in July 1991 by the European Space Agency into a polar orbit at about 800 km, carrying a C-band scatterometer. A scatterometer measures the amount of backscatter microwave radiation reflected by small ripples on the ocean surface induced by sea-surface winds, and so provides instantaneous snap-shots of wind flow over large areas of the ocean surface, known as wind fields. Inherent in the physics of the observation process is an ambiguity in wind direction; the scatterometer cannot distinguish whether the wind is blowing toward or away from the sensor. This ambiguity implies that there is a one-to-many mapping between scatterometer data and wind direction. Current operational methods for wind field retrieval are based on the retrieval of wind vectors from satellite scatterometer data, followed by a disambiguation and filtering process that relies on numerical weather prediction models. The wind vectors are retrieved by the local inversion of a forward model, mapping scatterometer observations to wind vectors, and minimising a cost function in scatterometer measurement space. This thesis applies a pragmatic Bayesian solution to the problem. The likelihood is a combination of conditional probability distributions for the local wind vectors given the scatterometer data. The prior distribution is a vector Gaussian process that provides the geophysical consistency for the wind field. The wind vectors are retrieved directly from the scatterometer data by using mixture density networks, a principled method for modelling multi-modal conditional probability density functions. The complexity of the mapping and the structure of the conditional probability density function are investigated. A hybrid mixture density network, which incorporates the knowledge that the conditional probability distribution of the observation process is predominantly bi-modal, is developed. The optimal model, which generalises across a swathe of scatterometer readings, is better on key performance measures than the current operational model. Wind field retrieval is approached from three perspectives. The first is a non-autonomous method that confirms the validity of the model by retrieving the correct wind field 99% of the time from a test set of 575 wind fields. The second technique takes the maximum a posteriori (MAP) wind field of the posterior distribution as the prediction. For the third technique, Markov chain Monte Carlo (MCMC) methods were employed to estimate the mass associated with the significant modes of the posterior distribution, and the prediction is based on the mode carrying the greatest mass. General methods for sampling from multi-modal distributions were benchmarked against a specific MCMC transition kernel designed for this problem. The general methods proved unsuitable for this application due to their computational expense. On a test set of 100 wind fields the MAP estimate correctly retrieved 72 wind fields, whilst the sampling method correctly retrieved 73 wind fields.
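
As a toy illustration of the difference between the MAP prediction and the mode-mass prediction discussed above (and not the thesis's mixture density network or MCMC machinery), the sketch below builds a hand-made bimodal posterior over wind direction and compares the highest point of the density with the peak of the mode that carries more probability mass, computed by simple numerical integration. All component weights, means and widths are invented.

```python
import numpy as np

# Toy bimodal posterior over wind direction (degrees), standing in for the
# ambiguous scatterometer likelihood combined with a smoothness prior.
theta = np.linspace(0.0, 360.0, 3601)

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# A sharp but light mode near 80 deg and a broad but heavy mode near 260 deg.
posterior = 0.45 * gauss(theta, 80.0, 6.0) + 0.55 * gauss(theta, 260.0, 18.0)
posterior /= np.trapz(posterior, theta)          # normalise numerically

# MAP estimate: the single highest point of the density.
theta_map = theta[np.argmax(posterior)]

# Mode-mass estimate: split at the valley between the modes, integrate the mass
# under each side, and report the peak of the heavier mode.
mid = (theta > 80.0) & (theta < 260.0)
valley = theta[mid][np.argmin(posterior[mid])]
left = theta <= valley
left_mass = np.trapz(posterior[left], theta[left])
right_mass = 1.0 - left_mass
heavier = left if left_mass > right_mass else ~left
theta_mass = theta[heavier][np.argmax(posterior[heavier])]

print(f"MAP direction:          {theta_map:6.1f} deg  (sharp mode)")
print(f"heaviest-mode direction:{theta_mass:6.1f} deg  (mass {max(left_mass, right_mass):.2f})")
```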

Relevance: 30.00%

Abstract:

We consider the random input problem for a nonlinear system modeled by the integrable one-dimensional self-focusing nonlinear Schrödinger equation (NLSE). We concentrate on the properties obtained from the direct scattering problem associated with the NLSE. We discuss some general issues regarding soliton creation from random input. We also study the averaged spectral density of random quasilinear waves generated in the NLSE channel for two models of the disordered input field profile. The first model is symmetric complex Gaussian white noise and the second is a real dichotomous (telegraph) process. For the former model, a closed-form expression for the averaged spectral density is obtained, while for the dichotomous real input we present the small-noise perturbative expansion for the same quantity. For the dichotomous input, we also obtain the distribution of the minimal pulse width required for soliton generation. The obtained results can be applied to a multitude of problems including random nonlinear Fraunhofer diffraction, transmission properties of randomly apodized long-period fiber Bragg gratings, and the propagation of incoherent pulses in optical fibers.
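
A numerical sketch related to the direct scattering problem mentioned above (not the paper's analytical treatment): it evaluates the Zakharov-Shabat scattering coefficients a(ξ) and b(ξ) of the focusing NLSE for realisations of a complex Gaussian noise input, using a piecewise-constant transfer-matrix method, and averages the reflection coefficient |b/a|² over realisations as a crude estimate of the spectral density of the quasilinear waves. The grid, noise strength and ensemble size are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def scattering_coeffs(q, dt, t0, xi):
    """Zakharov-Shabat coefficients a(xi), b(xi) of the focusing NLSE, treating the
    potential q as piecewise constant on segments of width dt (transfer-matrix method)."""
    v = np.array([np.exp(-1j * xi * t0), 0.0 + 0.0j])      # plane-wave state at t -> -inf
    for qn in q:
        k = np.sqrt(xi ** 2 + abs(qn) ** 2)
        s = np.sinc(k * dt / np.pi) * dt                    # sin(k*dt)/k, safe at k = 0
        M = np.array([[-1j * xi, qn], [-np.conj(qn), 1j * xi]])
        v = (np.cos(k * dt) * np.eye(2) + s * M) @ v        # exp(dt*M), since M @ M = -k^2 * I
    t_end = t0 + len(q) * dt
    a = v[0] * np.exp(1j * xi * t_end)
    b = v[1] * np.exp(-1j * xi * t_end)
    return a, b                                             # check: |a|^2 + |b|^2 = 1 for real xi

T, N, D, n_real = 10.0, 200, 0.05, 20    # support, grid points, noise strength, realisations
dt, t0 = T / N, -T / 2
xis = np.linspace(-5.0, 5.0, 41)

spectrum = np.zeros_like(xis)
for _ in range(n_real):
    # discretised complex Gaussian "white" noise: variance D/dt per sample
    q = np.sqrt(D / (2 * dt)) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
    for i, xi in enumerate(xis):
        a, b = scattering_coeffs(q, dt, t0, xi)
        spectrum[i] += abs(b / a) ** 2
spectrum /= n_real
print("averaged |b/a|^2 at xi = 0:", round(float(spectrum[len(xis) // 2]), 4))
```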

Relevance: 30.00%

Abstract:

PURPOSE: To validate a new miniaturised, open-field wavefront device developed so that it can be attached to an ophthalmic surgical microscope or slit lamp. SETTING: Solihull Hospital and Aston University, Birmingham, UK. DESIGN: Comparative non-interventional study. METHODS: The dynamic range of the Aston Aberrometer was assessed using a calibrated model eye. The validity of the Aston Aberrometer was compared to a conventional desk-mounted Shack-Hartmann aberrometer (Topcon KR1W) by measuring the refractive error and higher-order aberrations of 75 dilated eyes with both instruments in random order. The Aston Aberrometer measurements were repeated five times to assess intra-session repeatability. Data were converted to vector form for analysis. RESULTS: The Aston Aberrometer had a large dynamic range of at least +21.0 D to -25.0 D. It gave similar measurements to the conventional aberrometer for mean spherical equivalent (mean difference ± 95% confidence interval: 0.02 ± 0.49 D; correlation: r=0.995, p<0.001), astigmatic components (J0: 0.02 ± 0.15 D; r=0.977, p<0.001; J45: 0.03 ± 0.28; r=0.666, p<0.001) and higher-order aberration RMS (0.02 ± 0.20 D; r=0.620, p<0.001). Intraclass correlation coefficients for intra-session repeatability of the Aston Aberrometer were excellent (spherical equivalent = 1.000, p<0.001; astigmatic components J0 = 0.998, p<0.001, J45 = 0.980, p<0.01; higher-order aberration RMS = 0.961, p<0.001). CONCLUSIONS: The Aston Aberrometer gives valid and repeatable measures of refractive error and higher-order aberrations over a large range. As it can measure continuously, it can provide surgeons with direct feedback on the optical status of the visual system during intraocular lens implantation and corneal surgery.
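
The "vector form" used for such analyses is typically the power-vector representation of a sphero-cylindrical refraction (mean spherical equivalent M plus the Jackson cross-cylinder terms J0 and J45). A minimal sketch of that standard conversion, with illustrative values rather than study data:

```python
import math

def power_vector(sphere, cylinder, axis_deg):
    """Convert a sphero-cylindrical refraction (S, C, axis) to the power vector
    (M, J0, J45): mean spherical equivalent plus two Jackson cross-cylinder terms."""
    ax = math.radians(axis_deg)
    M = sphere + cylinder / 2.0
    J0 = -(cylinder / 2.0) * math.cos(2.0 * ax)
    J45 = -(cylinder / 2.0) * math.sin(2.0 * ax)
    return M, J0, J45

# Example: -2.00 / -1.50 x 180 (values purely illustrative)
M, J0, J45 = power_vector(-2.00, -1.50, 180.0)
print(f"M = {M:+.2f} D, J0 = {J0:+.2f} D, J45 = {J45:+.2f} D")
```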

Relevance: 30.00%

Abstract:

The noise properties of supercontinuum generation continue to be a subject of wide interest within both pure and applied physics. Aside from immediate applications in supercontinuum source development, detailed studies of supercontinuum noise mechanisms have attracted interdisciplinary attention because of links with extreme instabilities in other physical systems, especially the infamous and destructive oceanic rogue waves. But the instabilities inherent in supercontinuum generation can also be interpreted in terms of natural links with the general field of random processes, and this raises new possibilities for applications in areas such as random number generation. In this contribution we will describe recent work where we interpret supercontinuum intensity and phase fluctuations in this way.
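
One simple, purely illustrative way to turn shot-to-shot intensity fluctuations into bits, not necessarily the scheme pursued in this work, is to threshold the record at its median and then apply von Neumann debiasing to remove residual bias:

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for a record of shot-to-shot supercontinuum intensities: a skewed,
# positive random sample (the long-tailed statistics here are purely illustrative).
intensity = rng.lognormal(mean=0.0, sigma=1.0, size=20000)

# 1) Threshold at the sample median to obtain raw (possibly biased) bits.
raw_bits = (intensity > np.median(intensity)).astype(int)

# 2) Von Neumann debiasing: take non-overlapping pairs, map 01 -> 0 and 10 -> 1,
#    and discard concordant pairs.
pairs = raw_bits[: len(raw_bits) // 2 * 2].reshape(-1, 2)
keep = pairs[:, 0] != pairs[:, 1]
bits = pairs[keep, 0]                       # first element of each discordant pair

print(f"kept {bits.size} bits out of {raw_bits.size} raw samples, "
      f"fraction of ones = {bits.mean():.3f}")
```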

Relevance: 30.00%

Abstract:

The concept of random lasers, which exploit multiple scattering of photons in an amplifying disordered medium in order to generate coherent light without a traditional laser resonator, has attracted a great deal of attention in recent years. This research area lies at the interface of the fundamental theory of disordered systems and laser science. The idea was originally proposed in the context of astrophysics in the 1960s by V.S. Letokhov, who studied scattering with "negative absorption" in interstellar molecular clouds. Research on random lasers has since developed into a mature experimental and theoretical field. A simple design of such lasers would be promising for potential applications. However, in traditional random lasers the properties of the output radiation are typically characterized by complex features in the spatial, spectral and time domains, making them less attractive than standard laser systems in terms of practical applications. Recently, an interesting and novel type of one-dimensional random laser that operates in a conventional telecommunication fibre without any pre-designed resonator mirrors, the random distributed feedback fibre laser, was demonstrated. The positive feedback required for laser generation in random fibre lasers is provided by Rayleigh scattering from the inhomogeneities of the refractive index that are naturally present in silica glass. In the proposed laser concept, the randomly backscattered light is amplified through the Raman effect, providing distributed gain over distances of up to 100 km. Although the effective reflection due to Rayleigh scattering is extremely small (~0.1%), the lasing threshold may be exceeded when a sufficiently large distributed Raman gain is provided. Such a random distributed feedback fibre laser has a number of interesting and attractive features. The fibre waveguide geometry provides transverse confinement, and effectively one-dimensional random distributed feedback leads to the generation of a stationary near-Gaussian beam with a narrow spectrum. A random distributed feedback fibre laser has efficiency and performance that are comparable to, and even exceed, those of similar conventional fibre lasers. The key features of the generated radiation of random distributed feedback fibre lasers include a stationary narrow-band continuous modeless spectrum that is free of mode competition, nonlinear power broadening, and an output beam with a Gaussian profile in the fundamental transverse mode (generated in both single-mode and multi-mode fibres). This review presents the current status of research in the field of random fibre lasers and shows their potential and prospects. We start with an introductory overview of conventional distributed feedback lasers and traditional random lasers to set the stage for the discussion of random fibre lasers. We then present a theoretical analysis and experimental studies of various random fibre laser configurations, including widely tunable, multi-wavelength, narrow-band generation, and random fibre lasers operating in different spectral bands in the 1-1.6 μm range. Then we discuss existing and future applications of random fibre lasers, including telecommunications and long-reach distributed sensor systems. A theoretical description of random lasers is very challenging and is strongly linked with the theory of disordered systems and kinetic theory.
We outline two key models governing the generation of random fibre lasers: the average power balance model and the nonlinear Schrödinger equation based model. Recently invented random distributed feedback fibre lasers represent a new and exciting field of research that brings together such diverse areas of science as laser physics, the theory of disordered systems, fibre optics and nonlinear science. Stable random generation in optical fibre opens up new possibilities for research on wave transport and localization in disordered media. We hope that this review will provide background information for research in various fields and will stimulate cross-disciplinary collaborations on random fibre lasers. © 2014 Elsevier B.V.

Relevance: 30.00%

Abstract:

We provide a theoretical explanation of the results on the intensity distributions and correlation functions obtained from a random-beam speckle field in nonlinear bulk waveguides reported in the recent publication by Bromberg et al. [Nat. Photonics 4, 721 (2010)]. We study both the focusing and defocusing cases and in the limit of small speckle size (short-correlated disordered beam) provide analytical asymptotes for the intensity probability distributions at the output facet. Additionally we provide a simple relation between the speckle sizes at the input and output of a focusing nonlinear waveguide. The results are of practical significance for nonlinear Hanbury Brown and Twiss interferometry in both optical waveguides and Bose-Einstein condensates. © 2012 American Physical Society.

Relevance: 30.00%

Abstract:

Random fiber lasers blend together attractive features of traditional random lasers, such as low cost and simplicity of fabrication, with high-performance characteristics of conventional fiber lasers, such as good directionality and high efficiency. Low coherence of random lasers is important for speckle-free imaging applications. The random fiber laser with distributed feedback proposed in 2010 led to a quickly developing class of light sources that utilize inherent optical fiber disorder in the form of the Rayleigh scattering and distributed Raman gain. The random fiber laser is an interesting and practically important example of a photonic device based on exploitation of optical medium disorder. We provide an overview of recent advances in this field, including high-power and high-efficiency generation, spectral and statistical properties of random fiber lasers, nonlinear kinetic theory of such systems, and emerging applications in telecommunications and distributed sensing.

Relevance: 30.00%

Abstract:

This study was an evaluation of a Field Project Model Curriculum and its impact on achievement, attitude toward science, attitude toward the environment, self-concept, and academic self-concept with at-risk eleventh and twelfth grade students. One hundred eight students were pretested and posttested on the Piers-Harris Children's Self-Concept Scale, PHCSC (1985); the Self-Concept as a Learner Scale, SCAL (1978); the Marine Science Test, MST (1987); the Science Attitude Inventory, SAI (1970); and the Environmental Attitude Scale, EAS (1972). Using a stratified random design, students were randomly assigned, stratified by sex and stanine level, to three treatment groups. Group one received the field project method, group two received the field study method, and group three received the field trip method. All three groups followed the marine biology course content as specified by Florida Student Performance Objectives and Frameworks. The intervention occurred over ten months, with each group participating in outside-of-classroom activities on a trimonthly basis. Analysis of covariance procedures were used to determine treatment effects. F-ratios, p-levels and t-tests at p < .0062 (.05/8) indicated that a significant difference existed among the three treatment groups. Findings indicated that groups one and two were significantly different from group three, with group one displaying significantly higher results than group two. There were no significant differences between males and females in performance on the five dependent variables. The tenets underlying environmental education are congruent with the recommendations toward the reform of science education. These include a value analysis approach, inquiry methods, and critical thinking strategies that are applied to environmental issues.
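
A minimal sketch of the stratified random assignment described above, with strata defined by sex and stanine and three treatments; the student roster below is synthetic.

```python
import random
from collections import defaultdict

random.seed(4)

# Synthetic roster: (student_id, sex, stanine) -- purely illustrative data.
students = [(i, random.choice(["F", "M"]), random.randint(1, 9)) for i in range(108)]

# Group students into strata defined by sex x stanine, then deal each stratum
# out to the three treatments in a random order so the groups stay balanced.
strata = defaultdict(list)
for sid, sex, stanine in students:
    strata[(sex, stanine)].append(sid)

treatments = ["field project", "field study", "field trip"]
assignment = {}
for members in strata.values():
    random.shuffle(members)
    start = random.randrange(3)               # random rotation so remainders spread evenly
    for k, sid in enumerate(members):
        assignment[sid] = treatments[(start + k) % 3]

counts = {t: sum(1 for v in assignment.values() if v == t) for t in treatments}
print(counts)   # roughly 36 / 36 / 36 when stratum sizes allow
```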

Relevance: 30.00%

Abstract:

Water-alternating-gas (WAG) is an enhanced oil recovery method combining the improved macroscopic sweep of water flooding with the improved microscopic displacement of gas injection. The optimal design of the WAG parameters is usually based on numerical reservoir simulation via trial and error, limited by the reservoir engineer’s availability. Employing optimisation techniques can guide the simulation runs and reduce the number of function evaluations. In this study, robust evolutionary algorithms are utilized to optimise hydrocarbon WAG performance in the E-segment of the Norne field. The first objective function is selected to be the net present value (NPV) and two global semi-random search strategies, a genetic algorithm (GA) and particle swarm optimisation (PSO) are tested on different case studies with different numbers of controlling variables which are sampled from the set of water and gas injection rates, bottom-hole pressures of the oil production wells, cycle ratio, cycle time, the composition of the injected hydrocarbon gas (miscible/immiscible WAG) and the total WAG period. In progressive experiments, the number of decision-making variables is increased, increasing the problem complexity while potentially improving the efficacy of the WAG process. The second objective function is selected to be the incremental recovery factor (IRF) within a fixed total WAG simulation time and it is optimised using the same optimisation algorithms. The results from the two optimisation techniques are analyzed and their performance, convergence speed and the quality of the optimal solutions found by the algorithms in multiple trials are compared for each experiment. The distinctions between the optimal WAG parameters resulting from NPV and oil recovery optimisation are also examined. This is the first known work optimising over this complete set of WAG variables. The first use of PSO to optimise a WAG project at the field scale is also illustrated. Compared to the reference cases, the best overall values of the objective functions found by GA and PSO were 13.8% and 14.2% higher, respectively, if NPV is optimised over all the above variables, and 14.2% and 16.2% higher, respectively, if IRF is optimised.
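
A generic particle swarm optimisation sketch over bounded, normalised control variables; the objective below is a smooth multimodal placeholder, since the study's actual objectives (NPV or incremental recovery) require running the reservoir simulator for the Norne E-segment, which is not reproduced here. Swarm size, inertia and acceleration coefficients are illustrative defaults.

```python
import numpy as np

rng = np.random.default_rng(5)

def placeholder_objective(x):
    """Stand-in for NPV(WAG controls): smooth and multimodal, to be maximised.
    A real evaluation would run the reservoir simulator for these controls."""
    return -np.sum((x - 0.6) ** 2) + 0.1 * np.sum(np.cos(8 * np.pi * x))

def pso_maximise(f, lower, upper, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5):
    """Plain global-best particle swarm optimisation with reflecting box bounds."""
    dim = len(lower)
    x = rng.uniform(lower, upper, size=(n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lower, upper)
        vals = np.array([f(p) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest, pbest_val.max()

# Hypothetical normalised WAG controls, e.g. injection rates, cycle ratio, cycle time, ...
lower, upper = np.zeros(6), np.ones(6)
best_x, best_val = pso_maximise(placeholder_objective, lower, upper)
print("best controls:", np.round(best_x, 3), " objective:", round(float(best_val), 4))
```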

Relevance: 30.00%

Abstract:

An object-based image analysis (OBIA) approach was used to create a habitat map of the Lizard Reef. Briefly, georeferenced dive and snorkel photo-transect surveys were conducted at different locations surrounding Lizard Island, Australia. For the surveys, a snorkeler or diver swam over the bottom at a depth of 1-2 m in the lagoon, One Tree Beach and Research Station areas, and at 7 m depth in Watson's Bay, while taking photos of the benthos at a set height using a standard digital camera and towing a surface-float GPS logging its track every five seconds. The camera lens provided a 1.0 m x 1.0 m footprint, at 0.5 m height above the benthos. Horizontal distance between photos was estimated by fin kicks, and corresponded to a surface distance of approximately 2.0 - 4.0 m. The coordinates of each benthic photo were approximated from the photo timestamp and the GPS timestamps, using GPS Photo Link Software (www.geospatialexperts.com): the coordinates of each photo were interpolated from the GPS fixes logged a set time before and after the photo was captured. Dominant benthic or substrate cover type was assigned to each photo by placing 24 points randomly over each image using the Coral Point Count with Excel extensions (CPCe) program (Kohler and Gill, 2006). Each point was then assigned a dominant cover type using a benthic cover type classification scheme containing nine first-level categories - seagrass high (>=70%), seagrass moderate (40-70%), seagrass low (<= 30%), coral, reef matrix, algae, rubble, rock and sand. Benthic cover composition summaries of each photo were generated automatically in CPCe. The resulting benthic cover data for each photo were linked to the GPS coordinates, saved as an ArcMap point shapefile, and projected to Universal Transverse Mercator WGS84 Zone 56 South. The OBIA class assignment followed a hierarchical scheme based on membership rules with levels for "reef", "geomorphic zone" and "benthic community" (above).
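
A minimal sketch of the two bookkeeping steps described above: interpolating each photo's position from the GPS track by timestamp (the job done by GPS Photo Link), and reducing the point labels of one photo to a dominant cover class by majority vote. The track, photo times and labels below are made up.

```python
from collections import Counter

import numpy as np

# Synthetic GPS track: one fix every 5 seconds (time in seconds, lon/lat in degrees).
track_t = np.arange(0, 300, 5, dtype=float)
track_lon = 145.45 + 1e-5 * track_t            # illustrative slow drift
track_lat = -14.68 + 5e-6 * track_t

def photo_position(photo_time):
    """Interpolate photo coordinates from the bracketing GPS fixes by timestamp."""
    lon = np.interp(photo_time, track_t, track_lon)
    lat = np.interp(photo_time, track_t, track_lat)
    return lon, lat

def dominant_cover(point_labels):
    """Majority vote over the 24 random CPCe point labels of one photo."""
    return Counter(point_labels).most_common(1)[0][0]

photo_times = [12.0, 63.5, 121.0]
point_labels = [
    ["coral"] * 10 + ["sand"] * 8 + ["algae"] * 6,
    ["seagrass high"] * 18 + ["sand"] * 6,
    ["rubble"] * 10 + ["sand"] * 8 + ["rock"] * 6,
]
for t_photo, labels in zip(photo_times, point_labels):
    lon, lat = photo_position(t_photo)
    print(f"t={t_photo:6.1f}s  ({lon:.6f}, {lat:.6f})  dominant: {dominant_cover(labels)}")
```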

Relevance: 30.00%

Abstract:

We analyze the far-field intensity distribution of binary phase gratings whose strips present certain randomness in their height. A statistical analysis based on the mutual coherence function is done in the plane just after the grating. Then, the mutual coherence function is propagated to the far field and the intensity distribution is obtained. Generally, the intensity of the diffraction orders decreases in comparison to that of the ideal perfect grating. Several important limit cases, such as low- and high-randomness perturbed gratings, are analyzed. In the high-randomness limit, the phase grating is equivalent to an amplitude grating plus a “halo.” Although these structures are not purely periodic, they behave approximately as a diffraction grating.
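
A brute-force numerical check of the qualitative behaviour described above: one realisation of a binary phase grating whose strip phases carry Gaussian errors (standing in for the random heights), with |FFT|² used as a proxy for the far-field intensity. The drop of the diffraction-order intensities relative to the ideal grating can be read off the output; grating size and error level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

# Grating: each period is split into two strips with nominal phases 0 and pi;
# every strip's phase is perturbed by a Gaussian error of rms `sigma_phi`.
n_periods, samples_per_strip, sigma_phi = 256, 16, 0.6
nominal = np.tile(np.repeat([0.0, np.pi], samples_per_strip), n_periods)
errors = np.repeat(sigma_phi * rng.standard_normal(2 * n_periods), samples_per_strip)
field_random = np.exp(1j * (nominal + errors))
field_ideal = np.exp(1j * nominal)

def order_intensities(field, n_orders=3):
    """Normalised far-field intensity at diffraction orders 0..n_orders (|FFT|^2 proxy)."""
    spec = np.abs(np.fft.fft(field)) ** 2 / len(field) ** 2
    return [spec[m * n_periods] for m in range(n_orders + 1)]   # order m sits at bin m*n_periods

print("ideal  orders 0..3:", np.round(order_intensities(field_ideal), 4))
print("random orders 0..3:", np.round(order_intensities(field_random), 4))
```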

Relevance: 30.00%

Abstract:

In this work, we obtain analytical expressions for the near- and far-field diffraction of random Ronchi diffraction gratings in which the slits of the grating are randomly displaced around their periodic positions. We show theoretically that randomness in the positions of the slits produces a decrease in the contrast of the self-images in the near field, and even their disappearance for high randomness levels. On the other hand, it cancels high-order harmonics in the far field, leaving only a few central diffraction orders. Numerical simulations by means of the Rayleigh–Sommerfeld diffraction formula are performed in order to corroborate the analytical results. These results are of interest for industrial and technological applications where manufacturing errors need to be considered.
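
A small angular-spectrum simulation illustrating the near-field effect described above: the contrast of the Talbot self-image of a Ronchi grating, with and without random displacements of the slit centres. This is an independent numerical check with illustrative parameters, not the Rayleigh–Sommerfeld code of the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# 50% duty-cycle (Ronchi) amplitude grating, slit centres jittered around their
# periodic positions.  Lengths in micrometres; all values illustrative.
wavelength, period, n_periods, dx = 0.633, 50.0, 64, 0.25
x = np.arange(int(n_periods * period / dx)) * dx
z_talbot = 2 * period ** 2 / wavelength               # paraxial self-image (Talbot) distance

def grating(jitter_rms):
    t = np.zeros_like(x)
    for m in range(n_periods):
        centre = (m + 0.5) * period + jitter_rms * rng.standard_normal()
        t[np.abs(x - centre) < period / 4] = 1.0       # slit of width period/2
    return t

def propagate(u0, z):
    """1-D angular-spectrum propagation over distance z (evanescent components dropped)."""
    fx = np.fft.fftfreq(u0.size, d=dx)
    kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1.0 / wavelength ** 2 - fx ** 2))
    filt = np.where(fx ** 2 <= 1.0 / wavelength ** 2, np.exp(1j * kz * z), 0.0)
    return np.fft.ifft(np.fft.fft(u0) * filt)

def self_image_contrast(t):
    """Fundamental fringe amplitude of the Talbot self-image, relative to its mean."""
    I = np.abs(propagate(t.astype(complex), z_talbot)) ** 2
    spec = np.abs(np.fft.fft(I))
    return 2 * spec[n_periods] / spec[0]

print("ideal grating   :", round(self_image_contrast(grating(0.0)), 3))
print("jittered (8 um) :", round(self_image_contrast(grating(8.0)), 3))
```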

Relevance: 30.00%

Abstract:

The power-law size distributions obtained experimentally for neuronal avalanches are important evidence of criticality in the brain. This evidence is supported by the fact that a critical branching process exhibits the same exponent, τ ~ 3/2. Models at criticality have been employed to mimic avalanche propagation and explain the statistics observed experimentally. However, a crucial aspect of neuronal recordings has been almost completely neglected in the models: undersampling. While in a typical multielectrode array hundreds of neurons are recorded, in the same area of neuronal tissue tens of thousands of neurons can be found. Here we investigate the consequences of undersampling in models with three different topologies (two-dimensional, small-world and random network) and three different dynamical regimes (subcritical, critical and supercritical). We found that undersampling modifies avalanche size distributions, extinguishing the power laws observed in critical systems. Distributions from subcritical systems are also modified, but the shape of the undersampled distributions is more similar to that of a fully sampled system. Undersampled supercritical systems can recover the general characteristics of the fully sampled version, provided that enough neurons are measured. Undersampling in two-dimensional and small-world networks leads to similar effects, while the random network is insensitive to sampling density due to the lack of a well-defined neighborhood. We conjecture that neuronal avalanches recorded from local field potentials avoid undersampling effects due to the nature of this signal, but the same does not hold for spike avalanches. We conclude that undersampled branching-process-like models in these topologies fail to reproduce the statistics of spike avalanches.
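
A minimal sketch of the undersampling effect on a plain critical branching process (not the network models studied above): avalanches are generated with branching ratio 1, then each spike is kept only with a small probability, and the full and undersampled size statistics are compared. The avalanche count, size cap and sampling fraction are illustrative.

```python
import numpy as np

rng = np.random.default_rng(8)

def avalanche_size(branching_ratio, max_size=10 ** 5):
    """Total number of activations in one avalanche of a Galton-Watson process
    (Poisson offspring with the given mean), started by a single spike."""
    active, total = 1, 1
    while active > 0 and total < max_size:
        active = rng.poisson(branching_ratio * active)
        total += active
    return total

n_avalanches, sampled_fraction = 20000, 0.05
sizes = np.array([avalanche_size(1.0) for _ in range(n_avalanches)])      # critical: m = 1
# Undersampling: each spike is recorded only with probability `sampled_fraction`.
observed = rng.binomial(sizes, sampled_fraction)
observed = observed[observed > 0]                                         # unseen avalanches vanish

for label, s in [("full", sizes), ("undersampled", observed)]:
    big = (s >= 100).mean()
    print(f"{label:>12}: mean size {s.mean():8.1f}, P(size >= 100) = {big:.4f}")
```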