953 results for statistical narrow band model
Abstract:
Using magnetoencephalography, we studied the spatiotemporal properties of cortical responses in terms of event-related synchronization and event-related desynchronization to a range of stripe patterns in subjects with no neurological disorders. These stripes are known for their tendency to induce a range of abnormal sensations, such as illusions, nausea, dizziness, headache and attacks of pattern-sensitive epilepsy. The optimal stimulus must have specific physical properties, and maximum abnormalities occur at specific spatial frequency and contrast. Despite individual differences in the severity of discomfort experienced, psychophysical studies have shown that most observers experience some degree of visual anomaly on viewing such patterns. In a separate experiment, subjects reported the incidence of illusions and discomfort to each pattern. We found maximal cortical power in the gamma range (30-60 Hz) confined to the region of the primary visual cortex in response to patterns of 2-4 cycles per degree, peaking at 3 cycles per degree. This coincides with the peak of mean illusions and discomfort, also maximal for patterns of 2-4 cycles per degree. We show that gamma band activity in V1 is a narrow band function of spatial frequency. We hypothesize that the intrinsic properties of gamma oscillations may underlie visual discomfort and play a role in the onset of seizures.
Abstract:
A formalism recently introduced by Prugel-Bennett and Shapiro uses the methods of statistical mechanics to model the dynamics of genetic algorithms. To show that the technique is of more general interest than the test cases they considered, in this paper it is applied to the subset sum problem, which is a combinatorial optimization problem with a strongly non-linear energy (fitness) function and many local minima under single spin flip dynamics. It is a problem which exhibits an interesting dynamics, reminiscent of stabilizing selection in population biology. The dynamics are solved under certain simplifying assumptions and are reduced to a set of difference equations for a small number of relevant quantities. The quantities used are the population's cumulants, which describe its shape, and the mean correlation within the population, which measures the microscopic similarity of population members. Including the mean correlation allows a better description of the population than the cumulants alone would provide and represents a new and important extension of the technique. The formalism includes finite population effects and describes problems of realistic size. The theory is shown to agree closely with simulations of a real genetic algorithm, and the mean best energy is accurately predicted.
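As a concrete illustration of the quantities the theory tracks, the sketch below runs a minimal genetic algorithm on a toy subset sum instance and measures the population's mean pairwise correlation alongside its best energy. The rank selection and mutation scheme is a generic stand-in for illustration, not the specific dynamics analysed in the abstract.

```python
import random

def subset_sum_energy(bits, weights, target):
    """Energy (negated fitness): |sum of chosen weights - target|."""
    return abs(sum(w for b, w in zip(bits, weights) if b) - target)

def mean_correlation(pop):
    """Mean pairwise overlap of population members (bits mapped to +/-1)."""
    n, L = len(pop), len(pop[0])
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += sum((2 * a - 1) * (2 * b - 1)
                         for a, b in zip(pop[i], pop[j])) / L
            pairs += 1
    return total / pairs

def ga_step(pop, weights, target, mu=0.01):
    """One generation: rank selection (lower energy first) then mutation."""
    energies = [subset_sum_energy(p, weights, target) for p in pop]
    ranked = [p for _, p in sorted(zip(energies, pop), key=lambda t: t[0])]
    parents = ranked[:len(pop) // 2] * 2  # best half reproduces twice
    return [[b ^ (random.random() < mu) for b in p] for p in parents]

random.seed(0)
L, N = 20, 40
weights = [random.randint(1, 50) for _ in range(L)]
target = sum(weights) // 2
pop = [[random.randint(0, 1) for _ in range(L)] for _ in range(N)]
for _ in range(30):
    pop = ga_step(pop, weights, target)
best = min(subset_sum_energy(p, weights, target) for p in pop)
```

Tracking `mean_correlation(pop)` over generations shows the population becoming microscopically similar as selection narrows it, which is the extra information the cumulants alone miss.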
Abstract:
In an attempt to clarify the behaviour of semiconductor field emitters, the properties of a narrow band gap material were investigated. A retarding potential analyser was built and tested using a tungsten emitter. The energy distributions of electrons emitted from single crystals of lead telluride (band gap 0.3 eV) and gallium phosphide (band gap 2.26 eV) were measured. The halfwidths of the distributions are discussed with respect to the relevant parameters of the materials. Methods of tip preparation had to be developed. The halfwidth of the energy distribution of electrons field-emitted from carbon fibres was measured to be 0.21 ± 0.01 eV. A mechanism explaining the long lifetime of the emitters in poor vacua is proposed.
Abstract:
The objective of this study was to design, construct, commission and operate a laboratory scale gasifier system that could be used to investigate the parameters that influence the gasification process. The gasifier is of the open-core variety and is fabricated from 7.5 cm bore quartz glass tubing. Gas cleaning is by a centrifugal contacting scrubber, with the product gas being flared. The system employs an on-line dedicated gas analysis system, monitoring the levels of H2, CO, CO2 and CH4 in the product gas. The gas composition data, as well as the gas flowrate, temperatures throughout the system and pressure data, are recorded using a BBC microcomputer based data-logging system. Ten runs have been performed using the system, of which six were predominantly commissioning runs. The main emphasis in the commissioning runs was placed on the gas clean-up, the product gas cleaning and the reactor bed temperature measurement. The reaction was observed to occur in a narrow band, about 3 to 5 particle diameters thick. Initially the fuel was pyrolysed, with the volatiles produced being combusted and providing the energy to drive the process, and then the char product was gasified by reaction with the pyrolysis gases. Normally, the gasifier is operated with the reaction zone supported on a bed of char, although it has been operated for short periods without a char bed. At steady state the depth of char remains constant, but by adjusting the air inlet rate it has been shown that the depth of char can be increased or decreased. It has been shown that increasing the depth of the char bed effects some improvement in the product gas quality.
Abstract:
We propose a novel recursive-algorithm-based maximum a posteriori probability (MAP) detector for spectrally-efficient coherent wavelength division multiplexing (CoWDM) systems, and investigate its performance in a 1-bit/s/Hz on-off keyed (OOK) system limited by optical signal-to-noise ratio. The proposed method decodes each sub-channel using the signal levels not only of the particular sub-channel but also of its adjacent sub-channels, and can therefore effectively compensate deterministic inter-sub-channel crosstalk as well as inter-symbol interference arising from narrow-band filtering and chromatic dispersion (CD). Numerical simulation of a five-channel OOK-based CoWDM system at 10 Gbit/s per channel using either direct or coherent detection shows that the MAP decoder can eliminate the need for phase control of each optical carrier (which is required in a conventional CoWDM system), and greatly relaxes the spectral design of the demultiplexing filter at the receiver. It also significantly improves the back-to-back sensitivity and CD tolerance of the system.
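The detection idea can be sketched under an assumed toy signal model in which each sub-channel leaks a fixed fraction of its level into its nearest neighbours. The exhaustive search over five-bit patterns below is a stand-in for the paper's recursive algorithm; the crosstalk coefficient and noise level are hypothetical.

```python
import itertools
import random

def expected_levels(bits, xtalk):
    """Deterministic signal model: each sub-channel leaks a fraction
    `xtalk` of its level into its nearest neighbours."""
    n = len(bits)
    return [bits[k]
            + xtalk * (bits[k - 1] if k > 0 else 0)
            + xtalk * (bits[k + 1] if k < n - 1 else 0)
            for k in range(n)]

def map_detect(received, xtalk):
    """MAP detection for equiprobable bits and Gaussian noise reduces to
    picking the bit pattern whose expected levels are closest (in squared
    distance) to the received levels -- here by exhaustive search."""
    n = len(received)
    best, best_cost = None, float('inf')
    for bits in itertools.product((0, 1), repeat=n):
        exp = expected_levels(bits, xtalk)
        cost = sum((r - e) ** 2 for r, e in zip(received, exp))
        if cost < best_cost:
            best, best_cost = list(bits), cost
    return best

random.seed(1)
xtalk = 0.3
sent = [1, 0, 1, 1, 0]
rx = [v + random.gauss(0, 0.05) for v in expected_levels(sent, xtalk)]
decoded = map_detect(rx, xtalk)
```

Because the decision uses neighbouring sub-channel levels jointly, the deterministic crosstalk is folded into the signal model rather than treated as noise, which is the essence of the compensation described above.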
Abstract:
The nonlinear optical loop mirror (NOLM) requires breaking the loop symmetry to enable the counter-propagating pulses to acquire a differential π phase shift. This is achieved with either an asymmetric fused fibre coupler at the input or the inclusion of an asymmetrically located gain or loss element within the loop. By introducing a frequency-selective loss element, nonlinear switching may be confined to a narrow band of wavelengths, or to multiple wavelengths. This configuration may have applications in time-wavelength demultiplexing. We demonstrate this technique of bandpass switching in the soliton regime using a fibre Bragg grating reflector as the wavelength-dependent loss.
Abstract:
Photonic signal processing is used to implement common mode signal cancellation across a very wide bandwidth utilising phase modulation of radio frequency (RF) signals onto a narrow linewidth laser carrier. RF spectra were observed using narrow-band, tunable optical filtering with a scanning Fabry-Perot etalon. Thus functions conventionally performed using digital signal processing techniques in the electronic domain have been replaced by analog techniques in the photonic domain. This technique was able to observe simultaneous cancellation of signals across a bandwidth of 1400 MHz, limited only by the free spectral range of the etalon. © 2013 David M. Benton.
Abstract:
A new approach to locating gas and vapor plumes is proposed that is entirely passive. By modulating the transmission waveband of a narrow-band filter, an intensity modulation is established that allows regions of an image to be identified as containing a specific gas with absorption characteristics aligned with the filter. A system built from readily available components was constructed to identify regions of NO. Initial results show that this technique was able to distinguish an absorption cell containing NO gas in a test scene. © 2012 Society of Photo-Optical Instrumentation Engineers (SPIE).
Abstract:
The Resource Space Model is a data model that can effectively and flexibly manage the digital resources in a cyber-physical system from multidimensional and hierarchical perspectives. This paper focuses on constructing a resource space automatically. We propose a framework that organizes a set of digital resources according to different semantic dimensions, combining human background knowledge from WordNet and Wikipedia. The construction process includes four steps: extracting candidate keywords, building semantic graphs, detecting semantic communities and generating the resource space. An unsupervised statistical topic model (Latent Dirichlet Allocation) is applied to extract candidate keywords for the facets. To better interpret the meanings of the facets found by LDA, we map the keywords to Wikipedia concepts, calculate word relatedness using WordNet's noun synsets and construct the corresponding semantic graphs. Semantic communities are then identified by the Girvan-Newman (GN) algorithm. After extracting candidate axes based on the Wikipedia concept hierarchy, the final axes of the resource space are sorted and picked out through three different ranking strategies. The experimental results demonstrate that the proposed framework can organize resources automatically and effectively. ©2013 Published by Elsevier Ltd. All rights reserved.
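The middle two steps of the pipeline, building a semantic graph from word relatedness and splitting it into communities, can be sketched as follows. The relatedness scores are supplied directly rather than computed from WordNet, and connected components after thresholding stand in for the GN algorithm; the keywords and scores are made up for illustration.

```python
from collections import defaultdict

def build_semantic_graph(relatedness, threshold=0.5):
    """Keep an edge between two keywords when their relatedness score
    (e.g. a WordNet-based measure, here given directly) meets the threshold."""
    graph = defaultdict(set)
    for (a, b), score in relatedness.items():
        if score >= threshold:
            graph[a].add(b)
            graph[b].add(a)
    return graph

def communities(graph, nodes):
    """Connected components via depth-first search, a simple stand-in
    for Girvan-Newman community detection."""
    seen, out = set(), []
    for start in nodes:
        if start in seen:
            continue
        comp, stack = set(), [start]
        while stack:
            node = stack.pop()
            if node in comp:
                continue
            comp.add(node)
            stack.extend(graph.get(node, ()))
        seen |= comp
        out.append(sorted(comp))
    return out

rel = {("car", "engine"): 0.8, ("engine", "wheel"): 0.7,
       ("apple", "pear"): 0.9, ("car", "apple"): 0.1}
g = build_semantic_graph(rel)
comms = communities(g, ["car", "engine", "wheel", "apple", "pear"])
```

Each resulting community groups keywords that are mutually related, which is the raw material from which the framework's facets and axes are then derived.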
Abstract:
The future northward expansion of the arthropod vectors of leishmaniasis caused by climate change appears to be an important veterinary and medical problem. Our aim was to build and evaluate a Climate Envelope Model (CEM) to assess the potential effects of climate change on eight European sandfly species. The studied species (Phlebotomus ariasi Tonn., P. neglectus Tonn., P. papatasi Scop., P. perfiliewi Parrot, P. perniciosus Newst., P. sergenti Parrot, P. similis Perfiliev and P. tobbi Adler, Theodor et Lourie) are important vectors of the parasite Leishmania infantum or other Leishmania species. The projections were based on the REMO regional climate model with a European domain. The climate data were available in a 25 km resolution grid for the reference period (1961-90) and two future periods (2011-40, 2041-70). The regional climate model was based on the IPCC SRES A1B scenario. Three types of climatic parameters were used for every month (averaged over the 30-year periods). The model was supported by the VBORNET digital area database (distribution maps), ESRI ArcGIS 10 software's Spatial Analyst module (modeling environment) and PAST (calibration of the model with statistical methods). Iterative model evaluation was done by summarizing two types of model errors based on an aggregated distribution. The results show that the best model performance is achieved by trimming 5 percentiles from each extreme of the mean temperature, 2 percentiles from each extreme of the minimum temperature, and 0 percentiles from the minimum and 8 percentiles from the maximum of the precipitation.
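The percentile-trimming step at the heart of a climate envelope can be sketched as follows: collect a climate parameter's values at the cells where a species occurs, drop a few percentiles from each tail, and treat the remaining range as the envelope. The data and helper functions are illustrative, not the CEM calibration itself.

```python
def trimmed_envelope(values, lower_pct, upper_pct):
    """Climate envelope after trimming `lower_pct` percentiles from the
    bottom and `upper_pct` from the top of the presence-cell distribution."""
    vals = sorted(values)
    n = len(vals)
    lo = vals[min(n - 1, int(n * lower_pct / 100))]
    hi = vals[max(0, n - 1 - int(n * upper_pct / 100))]
    return lo, hi

def suitable(cell_value, envelope):
    """A grid cell is climatically suitable when its value falls inside
    the trimmed envelope."""
    lo, hi = envelope
    return lo <= cell_value <= hi

# toy example: mean temperatures observed in 100 presence cells
temps = list(range(100))
env = trimmed_envelope(temps, 5, 5)  # 5 percentiles dropped from each tail
```

Trimming the tails discards the most extreme (and least reliable) presence records, which is why the evaluation above searches over how many percentiles to drop for each climate parameter.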
Abstract:
Historic changes in water-use management in the Florida Everglades have caused the quantity of freshwater inflow to Florida Bay to decline by approximately 60% while altering its timing and spatial distribution. Two consequences have been (1) increased salinity throughout the bay, including occurrences of hypersalinity, coupled with a decrease in salinity variability, and (2) change in benthic habitat structure. Restoration goals have been proposed to return the salinity climates (salinity and its variability) of Florida Bay to more estuarine conditions through changes in upstream water management, thereby returning seagrass species cover to a more historic state. To assess the potential for meeting those goals, we used two modeling approaches and long-term monitoring data. First, we applied the hydrological mass balance model FATHOM to predict salinity climate changes in sub-basins throughout the bay in response to a broad range of freshwater inflow from the Everglades. Second, because seagrass species exhibit different sensitivities to salinity climates, we used the FATHOM-modeled salinity climates as input to a statistical discriminant function model that associates eight seagrass community types with water quality variables including salinity, salinity variability, total organic carbon, total phosphorus, nitrate, and ammonium, as well as sediment depth and light reaching the benthos. Salinity climates in the western sub-basins bordering the Gulf of Mexico were insensitive to even the largest (5-fold) modeled increases in freshwater inflow. However, the north, northeastern, and eastern sub-basins were highly sensitive to freshwater inflow and responded to comparatively small increases with decreased salinity and increased salinity variability. The discriminant function model predicted increased occurrences of Halodule wrightii communities and decreased occurrences of Thalassia testudinum communities in response to the more estuarine salinity climates.
The shift in community composition represents a return to the historically observed state and suggests that restoration goals for Florida Bay can be achieved through restoration of freshwater inflow from the Everglades.
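A hydrological mass balance of the kind FATHOM implements can be sketched for a single basin: freshwater inflow dilutes the basin while exchange with the Gulf pulls salinity toward seawater. This is a drastically simplified, hypothetical box model (constant volume, made-up rates), not the FATHOM model itself.

```python
def salinity_step(S, V, q_fresh, q_exchange, S_sea=35.0, dt=1.0):
    """One time step of a single-basin salt mass balance.

    S          current salinity (psu)
    V          basin volume (m^3), held constant (outflow balances inflow)
    q_fresh    freshwater inflow rate (m^3 per step)
    q_exchange water exchange rate with the Gulf (m^3 per step)
    """
    salt = S * V
    salt += q_exchange * dt * (S_sea - S)   # mixing with Gulf seawater
    salt -= S * q_fresh * dt                # fresh inflow displaces bay water
    return salt / V

# hypersaline starting condition relaxing under restored freshwater inflow
S = 45.0
for _ in range(200):
    S = salinity_step(S, V=1e9, q_fresh=5e6, q_exchange=2e6)
```

Even this toy model reproduces the qualitative behaviour described above: basins with large exchange with the Gulf are buffered near seawater salinity, while basins dominated by the freshwater term respond strongly to changes in inflow.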
Abstract:
The work presented in this dissertation is focused on applying engineering methods to develop and explore probabilistic survival models for the prediction of decompression sickness in US Navy divers. Mathematical modeling, computational model development, and numerical optimization techniques were employed to formulate and evaluate the predictive quality of models fitted to empirical data. In Chapters 1 and 2 we present general background information relevant to the development of probabilistic models applied to predicting the incidence of decompression sickness. The remainder of the dissertation introduces techniques developed in an effort to improve the predictive quality of probabilistic decompression models and to reduce the difficulty of model parameter optimization.
The first project explored seventeen variations of the hazard function using a well-perfused parallel compartment model. Models were parametrically optimized using the maximum likelihood technique. Model performance was evaluated using both classical statistical methods and model selection techniques based on information theory. Optimized model parameters were overall similar to those of previously published models. Results indicated that the best-performing models used a novel hazard function definition that included both ambient pressure scaling and individually fitted compartment exponent scaling terms.
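The maximum likelihood fitting of a probabilistic survival model can be sketched in miniature: a one-parameter integrated hazard, toy exposure data and a coarse grid search stand in for the compartment models and optimizers described here. All numbers below are hypothetical.

```python
import math

def p_dcs(exposure, gain):
    """Survival-model probability of DCS: P = 1 - exp(-integrated hazard).
    Here the integrated hazard is simply `gain` times the exposure dose."""
    return 1.0 - math.exp(-gain * exposure)

def log_likelihood(gain, exposures, outcomes):
    """Log-likelihood of observed hit/miss outcomes under the model."""
    ll = 0.0
    for x, hit in zip(exposures, outcomes):
        p = p_dcs(x, gain)
        ll += math.log(p if hit else 1.0 - p)
    return ll

# toy dive trial data: (exposure dose, DCS observed?)
exposures = [0.1, 0.5, 1.0, 2.0, 3.0, 0.2, 1.5, 2.5]
outcomes  = [0,   0,   1,   1,   1,   0,   0,   1]

# maximum likelihood by coarse grid search over the single parameter
gains = [g / 100 for g in range(1, 300)]
best_gain = max(gains, key=lambda g: log_likelihood(g, exposures, outcomes))
```

Real decompression models replace the linear integrated hazard with compartment gas kinetics and fit many parameters at once, but the objective, maximizing the likelihood of the observed incidence data, is the same.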
We developed ten pharmacokinetic compartmental models that included explicit delay mechanics to determine if predictive quality could be improved through the inclusion of material transfer lags. A fitted discrete delay parameter augmented the inflow to the compartment systems from the environment. Based on the observation that symptoms are often reported after risk accumulation begins for many of our models, we hypothesized that the inclusion of delays might improve correlation between the model predictions and observed data. Model selection techniques identified two models as having the best overall performance, but comparison to the best-performing model without delay, and model selection using our best identified no-delay pharmacokinetic model, both indicated that the delay mechanism was not statistically justified and did not substantially improve model predictions.
Our final investigation explored parameter bounding techniques to identify parameter regions in which statistical model failure cannot occur. Statistical model failure occurs when a model predicts zero probability of a diver experiencing decompression sickness for an exposure that is known to produce symptoms. Using a metric related to the instantaneous risk, we successfully identify regions where model failure will not occur and locate the boundaries of those regions using a root bounding technique. Several models are used to demonstrate the techniques, which may be employed to reduce the difficulty of model optimization in future investigations.
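The root bounding idea can be illustrated with a toy instantaneous-risk function: bisection locates the parameter value at which predicted risk for a symptom-producing exposure first vanishes, i.e. the boundary of the failure-free region. The threshold-style hazard below is hypothetical, chosen only so the boundary is known.

```python
def instantaneous_risk(pressure_ratio, threshold):
    """Toy hazard: risk accrues only above a supersaturation threshold."""
    return max(0.0, pressure_ratio - threshold)

def failure_boundary(pressure_ratio, lo=0.0, hi=10.0, tol=1e-6):
    """Bisection (root bounding) for the largest threshold parameter that
    still yields nonzero risk for a known symptom-producing exposure.
    Thresholds above this boundary predict zero probability of DCS and
    would constitute statistical model failure."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if instantaneous_risk(pressure_ratio, mid) > 0.0:
            lo = mid   # still inside the safe (nonzero-risk) region
        else:
            hi = mid   # already in the failure region
    return 0.5 * (lo + hi)

boundary = failure_boundary(1.8)  # exposure known to produce symptoms
```

Constraining the optimizer to thresholds below `boundary` guarantees the likelihood stays finite on every observed incident, which is what makes the subsequent parameter optimization easier.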
A New Method for Modeling Free Surface Flows and Fluid-structure Interaction with Ocean Applications
Abstract:
The computational modeling of ocean waves and ocean-faring devices poses numerous challenges. Among these are the need to stably and accurately represent both the fluid-fluid interface between water and air as well as the fluid-structure interfaces arising between solid devices and one or more fluids. As techniques are developed to stably and accurately balance the interactions between fluid and structural solvers at these boundaries, a similarly pressing challenge is the development of algorithms that are massively scalable and capable of performing large-scale three-dimensional simulations on reasonable time scales. This dissertation introduces two separate methods for approaching this problem, with the first focusing on the development of sophisticated fluid-fluid interface representations and the second focusing primarily on scalability and extensibility to higher-order methods.
We begin by introducing the narrow-band gradient-augmented level set method (GALSM) for incompressible multiphase Navier-Stokes flow. This is the first use of the high-order GALSM for a fluid flow application, and its reliability and accuracy in modeling ocean environments is tested extensively. The method demonstrates numerous advantages over the traditional level set method, among these a heightened conservation of fluid volume and the representation of subgrid structures.
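The core trick of a narrow-band level set method, updating the level set function only in a band of cells near the interface, can be sketched in one dimension. This is a generic first-order upwind illustration with made-up parameters, not the GALSM itself (which additionally evolves gradient information for higher-order accuracy).

```python
def advect_narrow_band(phi, velocity, dx, dt, band):
    """One upwind advection step applied only inside the narrow band
    |phi| < band; points outside the band keep their old values."""
    n = len(phi)
    new = phi[:]
    for i in range(1, n - 1):
        if abs(phi[i]) >= band:
            continue  # outside the narrow band: no work done
        # first-order upwind gradient, biased against the flow direction
        if velocity > 0:
            grad = (phi[i] - phi[i - 1]) / dx
        else:
            grad = (phi[i + 1] - phi[i]) / dx
        new[i] = phi[i] - dt * velocity * grad
    return new

dx, dt, v, band = 0.1, 0.05, 1.0, 0.3
# 1D signed distance function with the interface (zero crossing) at x = 1.0
phi = [i * dx - 1.0 for i in range(21)]
phi2 = advect_narrow_band(phi, v, dx, dt, band)
```

Restricting updates to the band reduces the cost per step from the full grid to a thin shell around the interface, which is what makes level set methods affordable for large 3D ocean simulations.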
Next, we present a finite-volume algorithm for solving the incompressible Euler equations in two and three dimensions in the presence of a flow-driven free surface and a dynamic rigid body. In this development, the chief concerns are efficiency, scalability, and extensibility (to higher-order and truly conservative methods). These priorities informed a number of important choices: The air phase is substituted by a pressure boundary condition in order to greatly reduce the size of the computational domain, a cut-cell finite-volume approach is chosen in order to minimize fluid volume loss and open the door to higher-order methods, and adaptive mesh refinement (AMR) is employed to focus computational effort and make large-scale 3D simulations possible. This algorithm is shown to produce robust and accurate results that are well-suited for the study of ocean waves and the development of wave energy conversion (WEC) devices.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
In the area exposed to strong wave action, the associations to be noted are: 1) the association with Bangia sp., which is seasonal and located at the level of Brachytrichia maculans; 2) the association with Chthamalus stellatus, which is very well developed above and below the oysters' level; 3) the association with Dermonema frappieri, which forms a narrow band above the Chnoospora minima band. At the same level are located the associations with Ectocarpus braeviarticulatus, with Centroceras clavulatum, with Gymnogongrus sereneii and with Chaetomorpha antennina. The infralittoral level does not present any conspicuous differences.