30 results for Fermi-density distribution with two parameters
Abstract:
Our understanding of early spatial vision owes much to contrast masking and summation paradigms. In particular, the deep region of facilitation at low mask contrasts is thought to indicate a rapidly accelerating contrast transducer (e.g. a square-law or greater). In experiment 1, we tapped an early stage of this process by measuring monocular and binocular thresholds for patches of 1 cycle deg-1 sine-wave grating. Threshold ratios were around 1.7, implying a nearly linear transducer with an exponent around 1.3. With this form of transducer, two previous models (Legge, 1984 Vision Research 24 385 - 394; Meese et al, 2004 Perception 33 Supplement, 41) failed to fit the monocular, binocular, and dichoptic masking functions measured in experiment 2. However, a new model with two stages of divisive gain control fits the data very well. Stage 1 incorporates nearly linear monocular transducers (to account for the high level of binocular summation and slight dichoptic facilitation), and monocular and interocular suppression (to fit the profound dichoptic masking). Stage 2 incorporates steeply accelerating transduction (to fit the deep regions of monocular and binocular facilitation), and binocular summation and suppression (to fit the monocular and binocular masking). With all model parameters fixed from the discrimination thresholds, we examined the slopes of the psychometric functions. The monocular and binocular slopes were steep (Weibull β ≈ 3-4) at very low mask contrasts and shallow (β ≈ 1.2) at all higher contrasts, as predicted by all three models. The dichoptic slopes were steep (β ≈ 3-4) at very low contrasts, and very steep (β > 5.5) at high contrasts (confirming Meese et al, loc. cit.). A crucial new result was that intermediate dichoptic mask contrasts produced shallow slopes (β ≈ 2).
Only the two-stage model predicted the observed pattern of slope variation, so providing good empirical support for a two-stage process of binocular contrast transduction. [Supported by EPSRC GR/S74515/01]
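A minimal numerical sketch of such a two-stage divisive gain-control architecture is given below. All exponents and saturation constants here are illustrative assumptions, not the fitted parameter values from the study:

```python
def stage1(c_own, c_other, m=1.3, s=1.0):
    # Stage 1: nearly linear monocular transducer (exponent ~1.3) with
    # divisive monocular and interocular suppression (illustrative constants)
    return c_own**m / (s + c_own + c_other)

def binocular_response(c_left, c_right, p=8.0, q=6.5, z=0.01):
    # Stage 2: binocular summation of the two stage-1 outputs, followed by
    # steeply accelerating transduction with divisive suppression
    b = stage1(c_left, c_right) + stage1(c_right, c_left)
    return b**p / (z + b**q)
```

With these assumed parameters, the response to a matched binocular pair exceeds the response to either monocular component alone, which is the qualitative behaviour the model uses to capture binocular summation.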
Abstract:
This thesis is a study of three techniques to improve the performance of some standard forecasting models, with application to energy demand and prices. We focus on forecasting demand and price one day ahead. First, the wavelet transform was used as a pre-processing procedure with two approaches: multicomponent forecasts and direct forecasts. We empirically compared these approaches and found that the former consistently outperformed the latter. Second, adaptive models were introduced to continuously update model parameters in the testing period by combining filters with standard forecasting methods. Among these adaptive models, the adaptive LR-GARCH model was proposed for the first time in this thesis. Third, with regard to the noise distributions of the dependent variables in the forecasting models, we used either Gaussian or Student-t distributions. This thesis proposed a novel algorithm to infer the parameters of Student-t noise models. The method is an extension of earlier work for models that are linear in their parameters to the non-linear multilayer perceptron, and it therefore broadens the range of models that can use a Student-t noise distribution. These techniques cannot stand alone, so they were combined with standard forecasting models: the multilayer perceptron, radial basis functions, linear regression, and linear regression with GARCH. These techniques and forecasting models were applied to two datasets from the UK energy markets: daily electricity demand (which is stationary) and gas forward prices (non-stationary). The results showed that these techniques provided good improvements in prediction performance.
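As a sketch of the kind of Student-t noise-model inference described above, the snippet below fits a linear model by directly maximising the Student-t likelihood (this stands in for the thesis's algorithm; the synthetic data and all parameter values are purely illustrative):

```python
import numpy as np
from scipy import optimize, stats

# Synthetic regression data with heavy-tailed (Student-t) noise
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = 2.0 * x + 1.0 + stats.t.rvs(df=3, scale=0.1, size=x.size, random_state=rng)

def nll(params):
    # Negative log-likelihood of a linear model under Student-t noise;
    # scale and degrees of freedom are optimised on a log scale for positivity
    a, b, log_scale, log_df = params
    resid = y - (a * x + b)
    return -np.sum(stats.t.logpdf(resid, df=np.exp(log_df),
                                  scale=np.exp(log_scale)))

a0, b0 = np.polyfit(x, y, 1)   # ordinary least-squares starting point
res = optimize.minimize(nll, x0=[a0, b0, np.log(0.2), np.log(5.0)],
                        method="Nelder-Mead",
                        options={"maxiter": 10000, "fatol": 1e-10, "xatol": 1e-10})
a_hat, b_hat = res.x[:2]
```

The heavy-tailed likelihood down-weights outliers relative to a Gaussian fit, which is the practical motivation for Student-t noise models in forecasting.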
Abstract:
We investigate the feasibility of simultaneously suppressing amplification noise and nonlinearity, which represent the most fundamental limiting factors in modern optical communication. To accomplish this task we developed a general design optimisation technique based on the concepts of noise and nonlinearity management. We demonstrate the efficiency of the novel approach by applying it to the design optimisation of transmission lines with periodic dispersion compensation using Raman and hybrid Raman-EDFA amplification. Moreover, we showed, using nonlinearity management considerations, that the optimal performance in high bit-rate dispersion-managed fibre systems with hybrid amplification is achieved for a certain amplifier spacing, which differs from the commonly known optimal noise performance corresponding to fully distributed amplification. Complete knowledge of the signal statistics, required for an accurate estimation of the bit error rate (BER), is crucial for modern transmission links with strong inherent nonlinearity. Therefore, we implemented the advanced multicanonical Monte Carlo (MMC) method, acknowledged for its efficiency in estimating distribution tails. We accurately computed marginal probability density functions for soliton parameters by numerically modelling the Fokker-Planck equation using the MMC simulation technique. Moreover, applying the MMC method, we studied the BER penalty caused by deviations from the optimal decision level in systems employing in-line 2R optical regeneration. We demonstrated that in such systems the analytical linear approximation that better fits the central part of the regenerator's nonlinear transfer function produces a more accurate approximation of the BER and BER penalty.
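The reweighting idea that makes biased-sampling methods such as MMC efficient at probing distribution tails can be illustrated with a much simpler importance-sampling sketch (this is not the MMC algorithm itself; the Gaussian target, the proposal distribution and the tail threshold are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
# Draw from a proposal biased toward the tail region of interest, N(4, 1)
samples = rng.normal(loc=4.0, scale=1.0, size=n)
# Reweight each sample by w = p(x) / q(x) for target N(0, 1), proposal N(4, 1)
log_w = -0.5 * samples**2 + 0.5 * (samples - 4.0)**2
w = np.exp(log_w)
# Weighted indicator gives an unbiased estimate of the tail probability P(X > 4)
tail_prob = np.mean(w * (samples > 4.0))
```

The exact value is P(X > 4) ≈ 3.17e-5 for a standard Gaussian; naive Monte Carlo would need millions of samples to see even a handful of tail events, whereas the biased-and-reweighted estimator resolves it accurately with far fewer.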
We present a statistical analysis of the RZ-DPSK optical signal at a direct-detection receiver with Mach-Zehnder interferometer demodulation.
Abstract:
In this paper we consider the optimisation of Shannon mutual information (MI) in the context of two model neural systems. The first is a stochastic pooling network (population) of McCulloch-Pitts (MP) type neurons (logical threshold units) subject to stochastic forcing; the second is (in a rate coding paradigm) a population of neurons that each displays Poisson statistics (the so-called 'Poisson neuron'). The mutual information is optimised as a function of a parameter that characterises the 'noise level': in the MP array this parameter is the standard deviation of the noise; in the population of Poisson neurons it is the window length used to determine the spike count. In both systems we find that the emergent neural architecture and, hence, the code that maximises the MI is strongly influenced by the noise level. Low noise levels lead to a heterogeneous distribution of neural parameters (diversity), whereas medium to high noise levels result in the clustering of neural parameters into distinct groups that can be interpreted as subpopulations. In both cases the number of subpopulations increases with a decrease in noise level. Our results suggest that subpopulations are a generic feature of an information-optimal neural population.
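For a single McCulloch-Pitts unit, the dependence of mutual information on the noise level can be computed exactly from the binary channel formula; the threshold and noise values below are illustrative, not those of the paper:

```python
import numpy as np
from scipy.stats import norm

def mi_threshold_unit(theta, sigma):
    # Mutual information (bits) between a binary input x in {0, 1}
    # (equiprobable) and the output y = 1 if x + Gaussian noise > theta
    p_fire = [norm.sf(theta - x, scale=sigma) for x in (0.0, 1.0)]  # P(y=1 | x)

    def h(p):  # binary entropy in bits
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    p_marginal = 0.5 * (p_fire[0] + p_fire[1])
    return h(p_marginal) - 0.5 * (h(p_fire[0]) + h(p_fire[1]))

# MI for increasing noise standard deviations (threshold midway between inputs)
mis = [mi_threshold_unit(0.5, s) for s in (0.1, 0.5, 2.0)]
```

For this single unit MI simply falls as the noise grows; the paper's point is the population-level question of how the neural parameters should rearrange themselves to maximise MI at each noise level.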
Abstract:
We report observations of the diffraction pattern resulting when a nematic liquid crystal is illuminated with two equal-power, high-intensity beams of light from an Ar+ laser. The time evolution of the pattern is followed from the initial production of higher diffraction orders to a final striking display arising as a result of the self-diffraction of the two incident beams. The experimental results are described with good approximation by a model assuming a phase distribution at the output plane of the liquid crystal in the form of the sum of a Gaussian and a sinusoid.
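The final claim can be sketched numerically: the far-field (Fraunhofer) pattern is, up to scaling, the squared modulus of the Fourier transform of the output field, here with an assumed Gaussian-plus-sinusoid phase profile (all widths and modulation depths below are illustrative, not fitted to the experiment):

```python
import numpy as np

# Transverse coordinate across the illuminated aperture (arbitrary units)
x = np.linspace(-20.0, 20.0, 4096)
amplitude = np.exp(-x**2 / 50.0)                       # Gaussian beam envelope
# Phase at the liquid-crystal output plane: Gaussian plus sinusoidal grating term
phase = 3.0 * np.exp(-x**2 / 50.0) + 1.5 * np.sin(2 * np.pi * x / 5.0)
field = amplitude * np.exp(1j * phase)
# Far-field intensity ~ |FFT of the output field|^2
far_field = np.abs(np.fft.fftshift(np.fft.fft(field)))**2
```

The sinusoidal phase term splits the light into discrete diffraction orders (with Bessel-function weights), which is the multi-order structure the abstract describes.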
Abstract:
A procedure has been developed which measures the settling velocity distribution of particles within a complete sewage sample. The development of the test method included observations of particle and liquid interaction using both synthetic media and sewage. Comparison studies with two other currently used settling velocity test procedures were undertaken. The method is suitable for use with either DWF or storm sewage. Information relating to the catchment characteristics of 35 wastewater treatment works was collected from the privatised water companies in England and Wales. 29 of these sites were used in an experimental programme to determine the settling velocity grading of 33 sewage samples. The collected data were analysed in an attempt to relate the settling velocity distribution to the characteristics of the contributing catchment. Statistical analysis of the catchment data and the measured settling velocity distributions was undertaken. A curve-fitting exercise was performed using an S-shaped curve with the same physical characteristics as the settling velocity distributions. None of these analyses found evidence that the settling velocity distribution of sewage had a significant relationship with the chosen catchment characteristics, particularly the catchment area. The regression equations produced from the statistical analysis therefore cannot be used to assist in the design of separation devices. However, a grading curve envelope was produced, the limits of which were clearly defined for the measured data set. The present empirical approach to settling tank design cannot at present be improved upon by considering the variation in catchment parameters. This study has provided a basis for future research into settling velocity measurement and should be of benefit to future workers in this field.
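As a sketch of the S-shaped curve-fitting exercise, the snippet below fits a logistic-in-log-velocity curve to a cumulative settling velocity grading (both the functional form and the data points are illustrative assumptions, not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical cumulative grading: fraction of particle mass with settling
# velocity below v (velocities in mm/s)
v = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
frac = np.array([0.05, 0.12, 0.30, 0.52, 0.74, 0.91, 0.97])

def s_curve(v, v50, b):
    # Logistic S-curve in log velocity: tends to 0 as v -> 0 and to 1 as
    # v -> infinity, matching the physical limits of a cumulative grading
    return 1.0 / (1.0 + (v50 / v)**b)

(v50, b), _ = curve_fit(s_curve, v, frac, p0=[1.0, 1.0])
```

The fitted `v50` is the median settling velocity and `b` controls the spread of the grading, which is the kind of two-parameter summary a grading curve envelope can be built around.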
Abstract:
Efforts to address the problems of literacy are often focused on developing countries. However, functional illiteracy is a challenge encountered by up to 50% of adults in developed countries. In this paper we reflect on the challenges we faced in trying to design and study the use of a mobile application to support adult literacy with two user groups: adults enrolled in literacy classes, and carpenters without a high school education enrolled in an essential skills program. We also elaborate on aspects of the evaluations that are specific to a marginalized, functionally illiterate group in a developed country - aspects that are less frequently present in similar studies of mobile literacy support technologies in developing countries. We conclude by presenting the lessons learnt from our evaluations and the impact of the studies' specific challenges on the outcome and uptake of such mobile assistive technologies in providing practical support to low-literacy adults in conjunction with literacy and essential skills training.
Abstract:
As a discipline, supply chain management (SCM) has traditionally been concerned primarily with the procurement, processing, movement and sale of physical goods. However, an important class of products has emerged - digital products - which cannot be described as physical, as they do not obey commonly understood physical laws. They do not possess mass or volume, and they require no energy in their manufacture or distribution. With the Internet, they can be distributed at speeds unimaginable in the physical world, and every copy produced is a 100% perfect duplicate of the original. Furthermore, the ease with which digital products can be replicated has few analogues in the physical world. This paper assesses the effect of non-physicality on one such product - software - in relation to the practice of SCM. It explores the challenges that arise when managing the software supply chain and how practitioners are addressing these challenges. Using a two-pronged exploratory approach that examines the literature around software management as well as direct interviews with software distribution practitioners, a number of key challenges associated with software supply chains are uncovered, along with responses to these challenges. This paper proposes a new model for software supply chains that takes into account the non-physicality of the product being delivered. Central to this model are the replacement of physical flows with flows of intellectual property, the growing importance of innovation over duplication, and the increased centrality of the customer in the entire process. Hybrid physical/digital supply chains are discussed and a framework for practitioners concerned with software supply chains is presented.
Abstract:
Purpose: Current panretinal laser photocoagulation parameters are based on the Diabetic Retinopathy Study, which used exposures of 0.1 - 0.5 s to achieve moderate-intensity retinal burns. Unfortunately, many patients find these settings painful. We wanted to investigate whether reducing the exposure time and increasing the power to give the same endpoint is more comfortable and equally effective. Methods: 20 patients having panretinal photocoagulation for the first time underwent random allocation to two forms of laser treatment: half of the retinal area scheduled for treatment was treated with a green YAG laser using conventional parameters (exposure time 0.1 s, power density sufficient to produce visible grey-white burns; treatment A). The other half was treated with a shorter exposure of 0.02 s (treatment B). All patients were asked to evaluate the severity of pain on a visual analogue scale ranging from 0 to 10 (0 = no pain, 10 = most severe pain). All patients were masked as to the type of treatment, and the order of carrying out the two treatments on each patient was randomised. Fundus photographs were taken of each hemifundus to confirm treatment. Results: Of the 20 patients, 17 had proliferative diabetic retinopathy, 2 had ischaemic central retinal vein occlusion and one had ocular ischaemic syndrome. The average pain response to treatment A was 5.11 on the visual analogue scale with a mean power of 0.178 W; the average pain response to treatment B was 1.40 with a mean power of 0.489 W. Short-exposure laser burns were significantly less painful (P < 0.001). Conclusion: Shortening the exposure time with increased power is more comfortable for patients undergoing panretinal photocoagulation than conventional parameters.
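The reported paired comparison can be illustrated with a nonparametric test on synthetic scores; the individual values below are hypothetical numbers generated around the reported means, not the study data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 20
# Hypothetical paired pain scores on the 0-10 visual analogue scale,
# generated around the reported means (~5.1 for A, ~1.4 for B)
pain_a = np.clip(rng.normal(5.1, 1.5, n), 0, 10)
pain_b = np.clip(rng.normal(1.4, 1.0, n), 0, 10)
# Wilcoxon signed-rank test for paired within-patient comparisons
stat, p_value = stats.wilcoxon(pain_a, pain_b)
```

A paired signed-rank test is a natural choice here because each patient received both treatments, so each acts as their own control.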
Abstract:
Batch-mode reverse osmosis (batch-RO) operation is considered a promising desalination method due to its low energy requirement compared to other RO system arrangements. To improve and predict batch-RO performance, studies on concentration polarization (CP) are carried out. The Kimura-Sourirajan mass-transfer model is applied and validated by experimentation with two different spiral-wound RO elements. Explicit analytical Sherwood correlations are derived from the experimental results. For batch-RO operation, a new genetic-algorithm method is developed to estimate the Sherwood correlation parameters, taking into account the effects of variation in the operating parameters. Analytical procedures are presented, and mass transfer coefficient models are then developed for the different operation processes, i.e., batch RO and continuous RO. The CP-related energy loss in batch-RO operation is quantified based on the resulting relationship between feed flow rates and mass transfer coefficients. It is found that CP increases energy consumption in batch-RO by about 25% compared to the ideal case in which CP is absent. For the continuous RO process, the derived Sherwood correlation predicts CP accurately. In addition, we determined the optimum feed flow rate of our batch-RO system.
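A sketch of fitting Sherwood-correlation parameters with an evolutionary optimiser is shown below; SciPy's differential evolution stands in for the genetic algorithm, and the correlation form Sh = a·Re^b·Sc^c and all data are assumed for illustration:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Synthetic calibration data generated from an assumed correlation
# Sh = 0.2 * Re^0.6 * Sc^0.33 with 1% multiplicative noise
rng = np.random.default_rng(3)
reynolds = rng.uniform(100, 2000, 40)
schmidt = rng.uniform(400, 800, 40)
sh_obs = 0.2 * reynolds**0.6 * schmidt**0.33 * (1 + 0.01 * rng.standard_normal(40))

def sse(params):
    # Sum of squared errors between observed and predicted Sherwood numbers
    a, b, c = params
    return np.sum((sh_obs - a * reynolds**b * schmidt**c)**2)

result = differential_evolution(sse,
                                bounds=[(0.01, 2.0), (0.1, 1.0), (0.1, 1.0)],
                                seed=0, tol=1e-12)
a_hat, b_hat, c_hat = result.x
```

Population-based optimisers like this are convenient here because the correlation is nonlinear in its parameters and the search only needs objective values, not gradients.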
Abstract:
For more than a century it has been known that the eye is not a perfect optical system, but rather a system that suffers from aberrations beyond conventional prescriptive descriptions of defocus and astigmatism. Whereas traditional refraction attempts to describe the error of the eye with only two parameters, namely sphere and cylinder, measurements of wavefront aberrations depict the optical error with many more parameters. What remains questionable is the impact these additional parameters have on visual function. Some authors have argued that higher-order aberrations have a considerable effect on visual function and in certain cases this effect is significant enough to induce amblyopia. This has been referred to as ‘higher-order aberration-associated amblyopia’. In such cases, correction of higher-order aberrations would not restore visual function. Others have reported that patients with binocular asymmetric aberrations display an associated unilateral decrease in visual acuity and, if the decline in acuity results from the aberrations alone, such subjects may have been erroneously diagnosed as amblyopes. In these cases, correction of higher-order aberrations would restore visual function. This refractive entity has been termed ‘aberropia’. In order to investigate these hypotheses, the distribution of higher-order aberrations in strabismic, anisometropic and idiopathic amblyopes, and in a group of visual normals, was analysed both before and after wavefront-guided laser refractive correction. 
The results show: (i) there is no significant asymmetry in higher-order aberrations between amblyopic and fixing eyes prior to laser refractive treatment; (ii) the mean magnitude of higher-order aberrations is similar within the amblyopic and visually normal populations; (iii) a significant improvement in visual acuity can be realised for adult amblyopic patients utilising wavefront-guided laser refractive surgery, and a modest increase in contrast sensitivity was observed for the amblyopic eye of anisometropes following treatment; and (iv) an overall trend towards increased higher-order aberrations following wavefront-guided laser refractive treatment was observed for both visually normal and amblyopic eyes. In conclusion, while the data do not provide any direct evidence for the concepts of either ‘aberropia’ or ‘higher-order aberration-associated amblyopia’, it is clear that gains in visual acuity and contrast sensitivity may be realised following laser refractive treatment of the amblyopic adult eye. Possible mechanisms by which these gains are realised are discussed.
Abstract:
Optical fibre based sensors are transforming industry by permitting monitoring in hitherto inaccessible environments, or measurement approaches that cannot be reproduced using conventional electronic sensors. A multitude of techniques have been developed to render the fibres sensitive to a wide range of parameters including temperature, strain, pressure (static and dynamic), acceleration, rotation, gas type, and specific biochemical species. Constructed entirely of glass or polymer material, optical fibre devices such as fibre gratings offer low loss, dielectric construction, small size, multiplexing capability, and so on [1-3]. In this paper, the authors present the latest developing industrial applications using polymer optical fibre (POF) devices, comparing their performance with silica optical fibre devices. The authors address two pressing commercial requirements. The first concerns the monitoring of fuel level in civil aircraft. There is a strong motivation in the aerospace industry to move away from electrical sensors, especially in the fuel system. This is driven by the need to eliminate potential ignition hazards, the desire to reduce cabling weight and the need to mitigate the effects of lightning strikes in aircraft where the conventional metallic skin is increasingly being replaced by composite materials. In this case, the authors have developed pressure sensors based on a diaphragm in which a polymer fibre Bragg grating (POFBG) has been embedded [3]. These devices provide high pressure sensitivity, enabling level measurement in the mm range. The authors have also developed an approach incorporating several such sensors which can compensate for temperature drifts and is insensitive to fluid density. Compared with silica fibre-based sensors, their performance is greatly enhanced. Initial results have attracted the interest of Airbus in the UK, which is keen to explore the potential of optical technology in commercial aircraft.
The second concerns the monitoring of acoustic signals and vibration in the subsea environment, for applications in geophysical surveying and security (detection of unwanted craft or personnel). There is strong motivation to move away from electrical sensors due to the bulk of the sensor and associated cabling, and the impossibility of monitoring over large distances without electrical amplification. Optical approaches such as optical hydrophones [5] offer a means of overcoming these difficulties. In collaboration with Kongsberg of Norway, the authors will exploit the sensitivity improvements made possible by using POF instead of silica fibre. These improvements arise from the much more compliant nature of POF compared to silica fibre (Young's modulus of 3 GPa vs 72 GPa, respectively). Essentially, although the strain sensitivities of silica and polymer FBGs are very similar, this renders the POF much more sensitive to the applied stress resulting from acoustic signals or vibration. An alternative way of viewing this is that the POF is better impedance-matched to the surrounding environment (water for the intended applications): although its impedance is higher than that of water, it is nearly an order of magnitude smaller than that of silica. Finally, other future industrial applications will be presented and discussed, showing the vast range of optical fibre devices in sensing applications.
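The compliance argument reduces to simple arithmetic: for the same applied stress, strain scales inversely with Young's modulus, so the quoted moduli imply that POF strains roughly 24 times more than silica fibre:

```python
# Young's moduli quoted in the text (GPa)
E_POF = 3.0
E_SILICA = 72.0
# For equal applied stress, strain = stress / E, so the ratio of strains is
strain_ratio = E_SILICA / E_POF   # POF strains ~24x more than silica
```

Since the strain sensitivities of the gratings themselves are similar, this strain ratio translates directly into the acoustic-pressure sensitivity advantage of the POF device.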
Abstract:
From 1992 to 2012, 4.4 billion people were affected by disasters, with almost 2 trillion USD in damages and 1.3 million people killed worldwide. The increasing threat of disasters stresses the need to provide solutions for the challenges faced by disaster managers, such as the logistical deployment of the resources required to provide relief to victims. The location of emergency facilities, stock prepositioning, evacuation, inventory management, resource allocation, and relief distribution have been identified as directly impacting the relief provided to victims during a disaster. Managing these factors appropriately is critical to reduce suffering. Disaster management commonly attracts several organisations working alongside each other and sharing resources to cope with an emergency. Coordinating these agencies is a complex task, but there is little research considering multiple organisations, and none actually optimising the number of actors required to avoid shortages and convergence. The aim of this research is to develop a system for disaster management based on a combination of optimisation techniques and geographical information systems (GIS) to aid multi-organisational decision-making. An integrated decision system was created comprising a cartographic model implemented in GIS to discard floodable facilities, combined with two models focused on optimising the decisions regarding the location of emergency facilities, stock prepositioning, the allocation of resources and relief distribution, along with the number of actors required to perform these activities. Three in-depth case studies in Mexico were conducted, gathering information from different organisations. The cartographic model proved to reduce the risk of selecting unsuitable facilities. The preparedness and response models showed the capacity to optimise the decisions and the number of organisations required for logistical activities, pointing towards an excess of actors involved in all cases.
The system as a whole demonstrated its capacity to provide integrated support for disaster preparedness and response, and revealed room for improvement for Mexican organisations in flood management.
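The relief-distribution component of such a system can be sketched as a small transportation-style linear programme (the depots, communities, costs and quantities below are toy numbers for illustration, not data from the case studies):

```python
import numpy as np
from scipy.optimize import linprog

# Toy relief-distribution problem: 2 depots supply 3 affected communities
cost = np.array([[4.0, 6.0, 9.0],      # cost per unit shipped from depot 0
                 [5.0, 3.0, 7.0]])     # cost per unit shipped from depot 1
supply = np.array([50.0, 60.0])
demand = np.array([30.0, 40.0, 40.0])  # total demand equals total supply

c = cost.ravel()                       # decision variables x[d, k], flattened
A_ub = np.zeros((2, 6))                # each depot ships at most its stock
A_ub[0, :3] = 1.0
A_ub[1, 3:] = 1.0
A_eq = np.zeros((3, 6))                # each community receives exactly its need
for k in range(3):
    A_eq[k, k] = 1.0
    A_eq[k, 3 + k] = 1.0
res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand,
              bounds=(0, None))
```

Real preparedness/response models layer facility-location and actor-count decisions on top of this kind of flow problem, but the transportation core is the same.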
Abstract:
A compact and low-cost fiber sensor based on a single multimode microfiber with Fresnel reflection is proposed and demonstrated for the simultaneous measurement of refractive index (RI) and temperature. The sensor is fabricated in two simple steps: fiber tapering followed by fiber end-face cleaving. The reflection spectrum is an intensity-modulated interference spectrum, as the tapered fiber generates the interference pattern and the cleaved end face provides the intensity modulation. By demodulating the fringe power and free spectral range (FSR) of the spectrum, RI sensitivities of -72.247 dB/RIU and 68.122 nm/RIU, as well as temperature sensitivities of 0.0283 dB/°C and -17 pm/°C, are obtained. Furthermore, the sensing scheme offers the feasibility of constructing an even more compact sensing probe for dual-parameter measurement, which has great potential in bio/chemical detection.
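Dual-parameter demodulation from the two quoted sensitivities amounts to inverting a 2x2 sensitivity matrix; the measured changes below are hypothetical values for illustration:

```python
import numpy as np

# Sensitivity matrix built from the quoted coefficients.
# Rows: (fringe power change in dB, FSR shift in nm)
# Columns: (refractive index in RIU, temperature in degC)
K = np.array([[-72.247, 0.0283],
              [68.122, -0.017]])   # -17 pm/degC = -0.017 nm/degC

# Hypothetical measured changes: -0.5 dB in fringe power, +0.4 nm in FSR
delta = np.array([-0.5, 0.4])
# Solve K @ [d_ri, d_temp] = delta for the two parameter changes
d_ri, d_temp = np.linalg.solve(K, delta)
```

Because the two readouts respond to RI and temperature with very different ratios, the matrix is well-conditioned and the two effects can be separated from a single spectrum.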