883 results for Chebyshev and Binomial Distributions


Relevance: 100.00%

Abstract:

Error rates of a Boolean perceptron with threshold and either spherical or Ising constraint on the weight vector are calculated for storing patterns from biased input and output distributions, derived within a one-step replica symmetry breaking (RSB) treatment. For an unbiased output distribution and non-zero stability of the patterns, we find a critical load, αp, above which two solutions to the saddle-point equations appear: one with higher free energy and zero threshold, and a dominant solution with non-zero threshold. We examine this second-order phase transition and the dependence of αp on the required pattern stability, κ, for both one-step RSB and replica symmetry (RS) in the spherical case, and for one-step RSB in the Ising case.
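As a minimal numerical sketch of the quantities this abstract refers to (not the replica calculation itself), the following draws biased ±1 patterns, imposes the spherical constraint on a random weight vector, and measures the pattern stabilities and the resulting error rate. All sizes, biases, and the threshold value are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 100          # weights and patterns (illustrative sizes)
m_in, m_out = 0.2, 0.0   # input bias and (unbiased) output bias
theta = 0.1              # perceptron threshold (assumed value)

# Biased +-1 inputs: P(x=+1) = (1+m_in)/2 ; unbiased +-1 outputs
X = np.where(rng.random((P, N)) < (1 + m_in) / 2, 1.0, -1.0)
y = np.where(rng.random(P) < (1 + m_out) / 2, 1.0, -1.0)

w = rng.standard_normal(N)
w *= np.sqrt(N) / np.linalg.norm(w)   # spherical constraint ||w||^2 = N

# Pattern stabilities kappa_mu = y_mu (w.x_mu - N*theta) / sqrt(N);
# a pattern is stored when its stability is positive
kappa = y * (X @ w - N * theta) / np.sqrt(N)
error_rate = np.mean(kappa < 0)       # fraction of misclassified patterns
print(error_rate)
```

The critical load αp itself would come from solving the saddle-point equations, which this sketch does not attempt.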

Relevance: 100.00%

Abstract:

The use of MS imaging (MSI) to resolve the spatial and pharmacodynamic distributions of compounds in tissues is emerging as a powerful tool for pharmacological research. Unlike established imaging techniques, only limited a priori knowledge is required and no extensive manipulation (e.g., radiolabeling) of drugs is necessary prior to dosing. MS provides highly multiplexed detection, making it possible to identify compounds, their metabolites and other changes in biomolecular abundances directly off tissue sections in a single pass. This can be employed to obtain near cellular, or potentially subcellular, resolution images. Consideration of technical limitations that affect the process is required, from sample preparation through to analyte ionization and detection. The techniques have only recently been adapted for imaging and novel variations to the established MSI methodologies will further enhance the application of MSI for pharmacological research.

Relevance: 100.00%

Abstract:

Molecular dynamics simulations were carried out for Si/Ge axial nanowire heterostructures using modified embedded-atom method (MEAM) potentials. A Si–Ge MEAM interatomic cross potential was developed based on available experimental data and was used for these studies. The atomic distortions and strain distributions near the Si/Ge interfaces are predicted for nanowires with their axes oriented along the [111] direction. The cases of 10 and 25 nm diameter Si/Ge biwires and of 25 nm diameter Si/Ge/Si axial heterostructures with a 1 nm thick Ge disk were studied. Substantial distortions in the height of the atoms adjacent to the interface were found for the biwires but not for the Ge disks. Strains as high as 3.5% were found for the Ge disk, and values of 2%–2.5% were found at the Si and Ge interfacial layers in the biwires. Deformation potential theory was used to estimate the influence of the strains on the band gap, and reductions in band gap to as small as 40% of bulk values are predicted for the Ge disks. Localized regions of increased strain and resulting energy minima were also found within the Si/Ge biwire interfaces, with the larger effects on the Ge side of the interface. The regions of strain maxima near and within the interfaces are anticipated to be useful for tailoring band gaps and producing quantum confinement of carriers. These results suggest that nanowire heterostructures provide greater design flexibility in band structure modification than is possible with planar layer growth.
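A back-of-envelope illustration of the deformation-potential estimate mentioned above: the band-gap shift is approximated as linear in strain, dEg ≈ a·ε. The deformation-potential value and the one-parameter linear form are assumptions for illustration, not parameters taken from these simulations:

```python
# Crude deformation-potential estimate of a strain-induced band-gap shift.
# Both numbers below are assumed for illustration only.
E_g_bulk_Ge = 0.66   # eV, approximate bulk Ge band gap
a_def = -10.0        # eV, assumed effective deformation potential
strain = 0.035       # 3.5% strain, as reported for the Ge disk

dE = a_def * strain                       # linear band-gap shift
E_g_strained = max(E_g_bulk_Ge + dE, 0.0)
print(E_g_strained, E_g_strained / E_g_bulk_Ge)
```

With these assumed numbers the strained gap comes out at roughly half the bulk value, the same regime as the reductions reported above.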

Relevance: 100.00%

Abstract:

The gamma-rays produced by the inelastic scattering of 14 MeV neutrons in fusion reactor materials have been studied using a gamma-ray spectrometer employing a sodium iodide scintillation detector. The source neutrons are produced by the T(d,n)4He reaction using the SAMES accelerator at the University of Aston in Birmingham. To eliminate the large gamma-ray background and the spurious signal arising from the sensitivity of the sodium iodide detector to neutrons, the gamma-ray detector is heavily shielded and is used together with a discrimination system based on the associated particle time-of-flight method. The instant of production of a source neutron is determined by detecting the associated alpha-particle, enabling discrimination between neutrons and gamma-rays by their different flight times. The electronic system used for measuring the time of flight of the neutrons and gamma-rays over the fixed flight path is described. The materials studied in this work were lithium and lead, because of their importance as fuel-breeding and shielding materials in conceptual fusion reactor designs. Several sample thicknesses were studied to determine the multiple scattering effects. The observed gamma-ray spectra from each sample, at several scattering angles in the angular range 0°–90°, enabled absolute differential gamma-ray production cross-sections and angular distributions of the resolved gamma-rays from lithium to be measured and compared with published data. For the lead sample, the absolute differential gamma-ray production cross-sections for discrete 1 MeV ranges and the angular distributions were measured. The measured angular distributions of the present work, and those on iron from previous work, are compared with the predictions of the Monte Carlo programme M.O.R.S.E. Good agreement was obtained between the experimental results and the theoretical predictions.
In addition, an empirical relation has been constructed which describes the multiple scattering effects by a single parameter and is capable of predicting the gamma-ray production cross-sections for these materials to an accuracy of ±25%.
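The associated-particle time-of-flight discrimination rests on 14 MeV neutrons being far slower than gamma-rays over the fixed flight path. A short sketch, with an assumed 1 m flight path (illustrative, not the thesis geometry), shows the timing separation available:

```python
import math

C = 2.998e8       # speed of light, m/s
MN_MEV = 939.565  # neutron rest-mass energy, MeV
T_MEV = 14.0      # neutron kinetic energy, MeV
L = 1.0           # assumed flight path, m (illustrative)

# Relativistic neutron speed from its kinetic energy
gamma_factor = 1.0 + T_MEV / MN_MEV
beta = math.sqrt(1.0 - 1.0 / gamma_factor**2)

t_neutron = L / (beta * C) * 1e9   # neutron flight time, ns
t_gamma = L / C * 1e9              # gamma flight time, ns
print(t_neutron - t_gamma)         # separation exploited for discrimination
```

A separation of this order (tens of nanoseconds per metre) is what lets the electronics gate gamma events apart from neutron events.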

Relevance: 100.00%

Abstract:

This thesis describes the procedure and results from four years research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique VERT (Venture Evaluation and Review Technique) was used to model the pre-tender costs of public health, heating ventilating, air-conditioning, fire protection, lifts and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data was updated and adjusted using mechanical and electrical pre-tender cost indices and location, selection of contractor, contract sum, height and site condition factors. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From this data alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost significant items were isolated for closer examination. 
The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
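A minimal Monte Carlo sketch in the spirit of the VERT analysis: element costs are drawn from the distribution types named above (constant, uniform, normal, beta) and summed over many trials, giving a cost distribution from which risk measures can be read. All element names and parameters below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20_000   # Monte Carlo trials

# Hypothetical element costs (in thousands of pounds), one distribution
# type per element; parameters are invented for illustration.
heating = rng.normal(120, 15, N)      # normal
electrical = rng.uniform(80, 110, N)  # uniform
lifts = 60 + 40 * rng.beta(2, 5, N)   # beta, scaled to [60, 100]
fire = np.full(N, 25.0)               # constant

total = heating + electrical + lifts + fire

# Cumulative frequency distribution of total services cost; the
# 90th-percentile cost is one measure of estimating risk
costs = np.sort(total)
p90 = costs[int(0.9 * N)]
print(costs.mean(), p90)
```

Comparing such distributions for alternative design solutions is what lets the quantity surveyor judge the degree of risk attached to each estimate.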

Relevance: 100.00%

Abstract:

This thesis deals with the problems associated with the planning and control of production, with particular reference to a small aluminium die casting company. The main problem areas were identified as: (a) a need to be able to forecast the customers' demands upon the company's facilities; (b) a need to produce a manufacturing programme in which the output of the foundry (or die casting section) was balanced with the available capacity in the machine shop; and (c) the need to ensure that the resultant system gave the company's operating budget a reasonable chance of being achieved. At the commencement of the research work the major customers were members of the automobile industry and had their own system of forecasting, from which they issued manufacturing schedules to their component suppliers. The errors in the forecast were analysed and their distributions noted. Using these distributions, the customer's forecast could be modified so that his final demand was met with a known degree of confidence. Before a manufacturing programme could be developed, the actual manufacturing system had to be reviewed, and it was found that, as with many small companies, there was a remarkable lack of formal control and written data. Relevant data regarding the components and the manufacturing process therefore had to be collected and analysed. The foundry process was fixed, but the secondary machining operations were analysed by a technique similar to Component Flow Analysis, and as a result the machines were arranged in a series of flow lines. A system of manual production control was proposed and, for comparison, a local computer bureau was approached and a system proposed incorporating the production of additional management information. The two systems are compared, their relative merits discussed, and a proposal made for implementation.
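The forecast-modification step can be sketched as follows: given a sample of historical forecast errors, the schedule is raised by the error quantile corresponding to the required confidence level. The error sample and figures below are synthetic stand-ins, not the company's data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Historical forecast errors (actual demand minus customer schedule);
# this synthetic sample stands in for the analysed distributions.
errors = rng.normal(5.0, 20.0, 500)

forecast = 1000      # customer's scheduled quantity (hypothetical)
confidence = 0.95    # required probability of meeting final demand

# Raise the schedule by the error quantile so that final demand is
# met with the chosen degree of confidence
adjustment = np.quantile(errors, confidence)
planned_output = forecast + adjustment
print(planned_output)
```

Any empirical error distribution works here; the normal sample is only a placeholder for whatever shape the analysis revealed.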

Relevance: 100.00%

Abstract:

Substantial altimetry datasets collected by different satellites have only become available during the past five years, but the future will bring a variety of new altimetry missions, both parallel and consecutive in time. The characteristics of each produced dataset vary with the different orbital heights and inclinations of the spacecraft, as well as with the technical properties of the radar instrument. An integral analysis of datasets with different properties offers advantages both in terms of data quantity and data quality. This thesis is concerned with the development of the means for such integral analysis, in particular for dynamic solutions in which precise orbits for the satellites are computed simultaneously. The first half of the thesis discusses the theory and numerical implementation of dynamic multi-satellite altimetry analysis. The most important aspect of this analysis is the application of dual-satellite altimetry crossover points as a bi-directional tracking data type in simultaneous orbit solutions. The central problem is that the spatial and temporal distributions of the crossovers are in conflict with the time-organised nature of traditional solution methods. Their application to the adjustment of the orbits of both satellites involved in a dual crossover therefore requires several fundamental changes to the classical least-squares prediction/correction methods. The second part of the thesis applies the developed numerical techniques to the problems of precise orbit computation and gravity field adjustment, using the altimetry datasets of ERS-1 and TOPEX/Poseidon. Although the two datasets can be considered less compatible than those of planned future satellite missions, the obtained results adequately illustrate the merits of a simultaneous solution technique.
In particular, the geographically correlated orbit error is partially observable from a dataset consisting of crossover differences between two sufficiently different altimetry datasets, while being unobservable from the analysis of altimetry data of both satellites individually. This error signal, which has a substantial gravity-induced component, can be employed advantageously in simultaneous solutions for the two satellites in which also the harmonic coefficients of the gravity field model are estimated.
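A toy illustration of why dual-satellite crossovers observe orbit error: at a crossing point the static sea surface cancels in the height difference, leaving the combined radial orbit errors of the two satellites. The numbers are invented, not ERS-1/TOPEX values:

```python
# At a dual-satellite crossover, both altimeters measure the same sea
# surface, so the height difference isolates the orbit-error signal.
# All values below are illustrative.
ssh_true = 1.50        # m, sea surface height at the crossover point

orbit_err_sat1 = 0.12  # m, radial orbit error of satellite 1
orbit_err_sat2 = -0.07 # m, radial orbit error of satellite 2

h1 = ssh_true + orbit_err_sat1  # height as measured by satellite 1
h2 = ssh_true + orbit_err_sat2  # height as measured by satellite 2

# The crossover difference cancels the (static) sea surface and leaves
# the bi-directional orbit-error observable used in the simultaneous
# solution for both orbits
xover = h1 - h2
print(xover)
```

This cancellation is also why the geographically correlated part of the error, invisible to each satellite alone, becomes partially observable when two sufficiently different orbits are differenced.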

Relevance: 100.00%

Abstract:

The development of more realistic constitutive models for granular media, such as sand, requires ingredients which take into account the internal micro-mechanical response to deformation. Unfortunately, at present, very little is known about these mechanisms, and it is therefore instructive to find out more about the internal nature of granular samples by conducting suitable tests. In contrast to physical testing, the method of investigation used in this study employs the Distinct Element Method. This is a computer-based, iterative, time-dependent technique that allows the deformation of granular assemblies to be numerically simulated. By making assumptions regarding contact stiffnesses, each individual contact force can be calculated and, by resolution, the particle centroid forces obtained. Dividing the particle forces by the respective particle masses gives accelerations, from which particle centroid velocities and displacements are obtained by numerical integration. The Distinct Element Method is incorporated into a computer program, 'Ball'. This program is effectively a numerical apparatus: it forms a logical housing for the method, handles data input and output, and provides testing control. Using this numerical apparatus, tests have been carried out on disc assemblies, revealing many new and interesting observations regarding the micromechanical behaviour. In order to relate the observed microscopic mechanisms of deformation to the flow of the granular system, two separate approaches have been used. Firstly, a constitutive model has been developed which describes the yield function, flow rule and translation rule for regular assemblies of spheres and discs when subjected to coaxial deformation. Secondly, statistical analyses have been carried out using data extracted from the simulation tests.
These analyses define and quantify granular structure and then show how the force and velocity distributions use the structure to produce the corresponding stress and strain-rate tensors.
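The force-to-displacement chain described above (contact force, then centroid acceleration, then velocity and displacement by explicit time integration) can be sketched for a single particle in contact. The linear contact law, stiffness, mass, and time step are illustrative assumptions, not values from the program 'Ball':

```python
import numpy as np

k = 1.0e5     # assumed linear contact stiffness, N/m
mass = 0.01   # particle mass, kg
dt = 1.0e-5   # time step, s

pos = np.array([0.0, 0.0])    # particle centroid position, m
vel = np.array([0.0, 0.0])    # particle centroid velocity, m/s
overlap = 1.0e-4              # fixed contact overlap with a neighbour, m
normal = np.array([1.0, 0.0]) # contact normal direction

for _ in range(100):
    force = k * overlap * normal  # contact force from assumed stiffness
    acc = force / mass            # centroid acceleration
    vel = vel + acc * dt          # explicit integration: velocity
    pos = pos + vel * dt          # explicit integration: displacement

print(pos, vel)
```

In a full distinct-element cycle the overlap would itself be updated from the new positions each step; holding it fixed here keeps the sketch to the integration chain alone.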

Relevance: 100.00%

Abstract:

Purpose - To develop a non-invasive method for quantification of blood and pigment distributions across the posterior pole of the fundus from multispectral images, using a computer-generated reflectance model of the fundus. Methods - A computer model was developed to simulate light interaction with the fundus at different wavelengths. The distribution of macular pigment (MP) and retinal haemoglobins in the fundus was obtained by comparing the model predictions with multispectral image data at each pixel. Fundus images were acquired from 16 healthy subjects from various ethnic backgrounds, and parametric maps showing the distribution of MP and of retinal haemoglobins throughout the posterior pole were computed. Results - The relative distributions of MP and retinal haemoglobins in the subjects were successfully derived from multispectral images acquired at wavelengths 507, 525, 552, 585, 596, and 611 nm, provided certain conditions were met and eye movement between exposures was minimal. Recovery of other fundus pigments was not feasible, and further development of the imaging technique and refinement of the software are necessary to realise the full potential of multispectral retinal image analysis. Conclusion - The distributions of MP and retinal haemoglobins obtained in this preliminary investigation are in good agreement with published data on normal subjects. The ongoing development of the imaging system should allow absolute parameter values to be computed. A further study will investigate subjects with known pathologies to determine the effectiveness of the method as a screening and diagnostic tool.
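A toy version of the per-pixel inversion: a Beer-Lambert-style forward model predicts reflectance at the six wavelengths from two pigment concentrations, which are then recovered by least squares on the log-reflectance. The extinction spectra below are invented placeholders, not the published fundus model:

```python
import numpy as np

wavelengths = [507, 525, 552, 585, 596, 611]        # nm, as used above
ext_mp = np.array([0.9, 0.7, 0.3, 0.1, 0.05, 0.02]) # macular pigment (assumed)
ext_hb = np.array([0.2, 0.4, 0.8, 0.9, 0.6, 0.3])   # haemoglobin (assumed)

# Simulate one pixel's reflectance from known "true" concentrations
c_true = np.array([0.5, 1.2])                       # MP and Hb
reflect = np.exp(-(ext_mp * c_true[0] + ext_hb * c_true[1]))

# The model is linear in log-reflectance, so recover the concentrations
# at this pixel by solving A @ c = -log(R) in the least-squares sense
A = np.column_stack([ext_mp, ext_hb])
c_fit, *_ = np.linalg.lstsq(A, -np.log(reflect), rcond=None)
print(c_fit)
```

Repeating this fit at every pixel is what produces the parametric maps of MP and haemoglobin distribution.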

Relevance: 100.00%

Abstract:

Social media data are produced continuously by a large and uncontrolled number of users. The dynamic nature of such data requires the sentiment and topic analysis model to be dynamically updated as well, capturing the most recent language use of sentiments and topics in text. We propose a dynamic Joint Sentiment-Topic model (dJST) which allows the detection and tracking of views of current and recurrent interests and of shifts in topic and sentiment. Both topic and sentiment dynamics are captured by assuming that the current sentiment-topic-specific word distributions are generated according to the word distributions at previous epochs. We study three different ways of accounting for such dependency information: (1) a sliding-window model, where the current sentiment-topic word distributions depend on the previous sentiment-topic-specific word distributions in the last S epochs; (2) a skip model, where historical sentiment-topic word distributions are considered by skipping some epochs in between; and (3) a multiscale model, where previous long- and short-timescale distributions are taken into consideration. We derive efficient online inference procedures to sequentially update the model with newly arrived data, and show the effectiveness of our proposed model on Mozilla add-on reviews crawled between 2007 and 2011.
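The sliding-window dependency can be sketched as follows: the Dirichlet prior for the current epoch's sentiment-topic word distribution is a weighted combination of the word distributions from the last S epochs. The window length, weights, and prior strength below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

V, S = 5, 3   # vocabulary size and window length (toy values)
rng = np.random.default_rng(3)

# Word distributions from the previous S epochs (each row sums to 1)
history = rng.dirichlet(np.ones(V), size=S)
weights = np.array([0.5, 0.3, 0.2])  # more recent epochs weighted higher

# Evolving Dirichlet prior for the current epoch: a weighted mix of
# the historical word distributions, scaled by an assumed strength
concentration = 50.0
alpha = concentration * weights @ history

# The new epoch's sentiment-topic word distribution is then drawn
# from this history-dependent prior
phi_t = rng.dirichlet(alpha)
print(phi_t)
```

The skip and multiscale variants differ only in which epochs contribute to `history` and how the weights are formed.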

Relevance: 100.00%

Abstract:

A statistical approach to numerically evaluate transmission distances in optical communication systems was described. The proposed systems were subject to strong patterning effects and strong intersymbol interference. The dependence of transmission distance on the total number of bits was described, and Gaussian (normal) distributions were used to derive the error probability.
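A minimal sketch of the Gaussian error-probability calculation: given the means and standard deviations of the received mark and space levels, the bit error rate follows from the Q factor. The level statistics below are illustrative, not from the study:

```python
import math

# Assumed received-level statistics (arbitrary units)
mu1, sigma1 = 1.0, 0.15  # mean and std of received "1" (mark)
mu0, sigma0 = 0.0, 0.10  # mean and std of received "0" (space)

# Q factor and the corresponding Gaussian bit error probability
Q = (mu1 - mu0) / (sigma1 + sigma0)
ber = 0.5 * math.erfc(Q / math.sqrt(2.0))
print(Q, ber)
```

Patterning effects would make the mark/space statistics pattern-dependent, which is where the bit-count dependence discussed above enters.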

Relevance: 100.00%

Abstract:

The target of no-reference (NR) image quality assessment (IQA) is to establish a computational model to predict the visual quality of an image. The existing prominent method is based on natural scene statistics (NSS): it uses the joint and marginal distributions of wavelet coefficients for IQA. However, this method is only applicable to JPEG2000-compressed images. Since the wavelet transform fails to capture the directional information of images, an improved NSS model is established using contourlets. In this paper, the contourlet transform is applied to the NSS of images, and the relationship between contourlet coefficients is represented by their joint distribution. The statistics of contourlet coefficients serve to indicate variation in image quality. In addition, an image-dependent threshold is adopted to reduce the effect of image content on the statistical model. Finally, image quality is evaluated by nonlinearly combining the extracted features from each subband. Our algorithm is trained and tested on the LIVE database II. Experimental results demonstrate that the proposed algorithm is superior to the conventional NSS model and can be applied to different distortions.
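The marginal-statistics idea behind NSS-based IQA can be sketched without a contourlet implementation: transform coefficients of natural images are heavy-tailed, and many distortions make them more Gaussian, so a shape statistic such as kurtosis separates the two. Below, Laplacian and Gaussian samples stand in for pristine and distorted subband coefficients:

```python
import numpy as np

rng = np.random.default_rng(4)

def kurtosis(c):
    """Fourth standardised moment of a coefficient sample."""
    c = c - c.mean()
    return np.mean(c**4) / np.var(c)**2

# Stand-ins: heavy-tailed (Laplacian-like) coefficients for a natural
# image, near-Gaussian coefficients for a distorted one
natural_like = rng.laplace(0.0, 1.0, 50_000)
distorted_like = rng.normal(0.0, 1.0, 50_000)

print(kurtosis(natural_like), kurtosis(distorted_like))
```

A real NR-IQA feature vector would collect such statistics (and joint parent-child statistics) per contourlet subband before the nonlinear combination step.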

Relevance: 100.00%

Abstract:

This paper describes the use of the Business Process Execution Language for Web Services (BPEL4WS/BPEL) for managing scientific workflows. This work is the result of our attempt to adopt a Service Oriented Architecture in order to perform Web-services-based simulation of metal vapor lasers. Scientific workflows can be more demanding in their requirements than business processes. In the context of addressing these requirements, the features of the BPEL4WS specification are discussed; it is widely regarded as the de facto standard for orchestrating Web services for business workflows. A typical use case, calculation of the electric field potential and intensity distributions, is discussed as an example of building a BPEL process to perform a distributed simulation constructed from loosely coupled services.

Relevance: 100.00%

Abstract:

Multiple transformative forces target marketing, many of which derive from new technologies that allow us to sample thinking in real time (i.e., brain imaging) or to look at large aggregations of decisions (i.e., big data). There has been an inclination to refer to the intersection of these technologies with the general topic of marketing as "neuromarketing". There has not, however, been a serious effort to frame neuromarketing, which is the goal of this paper. Neuromarketing can be compared to neuroeconomics: neuroeconomics is generally focused on how individuals make "choices" and on representing distributions of choices, whereas neuromarketing focuses on how a distribution of choices can be shifted or "influenced", which can occur at multiple "scales" of behavior (e.g., individual, group, or market/society). Given that influence can affect choice through many cognitive modalities, and not just through the valuation of choice options, a science of influence also implies a need to develop a model of cognitive function integrating attention, memory, and reward/aversion function. The paper concludes with a brief description of three domains of neuromarketing application for studying influence, and their caveats.

Relevance: 100.00%

Abstract:

We present, for the first time, a detailed investigation of the impact of second-order co-propagating Raman pumping on long-haul 100G WDM DP-QPSK coherent transmission of up to 7082 km using Raman fibre laser based configurations. Signal power and noise distributions along the fibre were characterised, both numerically and experimentally, for each pumping scheme. Based on these pumping schemes, the Q factor penalties versus co-pump power ratios were experimentally measured and quantified. A significant Q factor penalty of up to 4.15 dB was observed after 1666 km using symmetric bidirectional pumping, compared with counter-pumping only. Our results show that whilst co-pumping minimises the intra-cavity signal power variation and amplification noise, the Q factor penalty with co-pumping was too great for any advantage to be seen. The relative intensity noise (RIN) characteristics of the induced fibre laser and of the output signal, and the intra-cavity RF spectra of the fibre laser, are also presented. We attribute the Q factor degradation to RIN-induced penalty, with RIN being transferred from the first-order fibre laser and the second-order co-pump to the signal. More importantly, two different fibre lasing regimes contributed to the amplification: random distributed feedback lasing when using counter-pumping only, and conventional Fabry-Perot cavity lasing when using any bidirectional pumping scheme. This also results in significantly different performances, due to the different cavity lengths of these two classes of laser.