24 results for Chebyshev And Binomial Distributions

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

The last decade has seen a considerable increase in the application of quantitative methods to the study of histological sections of brain tissue, especially in the study of neurodegenerative disease. These disorders are characterised by the deposition and aggregation of abnormal or misfolded proteins in the form of extracellular protein deposits such as senile plaques (SP) and intracellular inclusions such as neurofibrillary tangles (NFT). Quantifying brain lesions and studying the relationships between lesions and normal anatomical features of the brain, including neurons, glial cells, and blood vessels, have become important methods of elucidating disease pathogenesis. This review describes measures of the abundance of a histological feature, such as density, frequency, and 'load', and the sampling methods by which quantitative measures can be obtained, including plot/quadrat sampling, transect sampling, and the point-quarter method. In addition, methods for determining the spatial pattern of a histological feature, i.e., whether the feature is distributed randomly, regularly, or aggregated into clusters, are described. These methods include the use of the Poisson and binomial distributions, pattern analysis by regression, Fourier analysis, and methods based on mapped point patterns. Finally, the statistical methods available for studying the degree of spatial correlation between pathological lesions and neurons, glial cells, and blood vessels are described.
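
Where the review turns to the Poisson distribution as a test of whether a feature is random, regular, or clustered, the usual starting point is the variance-to-mean ratio (index of dispersion) of quadrat counts. The sketch below is a minimal illustration of that idea in Python, using hypothetical counts; it is not taken from the review itself.

```python
# Minimal sketch (not the review's code): the variance-to-mean ratio of quadrat
# counts compared against the Poisson expectation of 1 via a chi-square statistic.
import numpy as np
from scipy import stats

def index_of_dispersion(counts):
    """counts: number of lesions (e.g. SP or NFT) observed in each quadrat."""
    counts = np.asarray(counts, dtype=float)
    n = counts.size
    mean, var = counts.mean(), counts.var(ddof=1)
    vmr = var / mean                      # >1 clustered, <1 regular, ~1 random
    chi2 = vmr * (n - 1)                  # ~chi-square with n-1 df under Poisson
    p = 2 * min(stats.chi2.cdf(chi2, n - 1), stats.chi2.sf(chi2, n - 1))
    return vmr, p

# Hypothetical quadrat counts along a transect of cortex:
vmr, p = index_of_dispersion([0, 2, 1, 7, 0, 0, 5, 1, 0, 6])
print(f"variance/mean = {vmr:.2f}, two-sided p = {p:.3f}")
```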

Relevance:

100.00%

Publisher:

Abstract:

The size frequency distributions of β-amyloid (Aβ) and prion protein (PrPsc) deposits were studied in Alzheimer's disease (AD) and the variant form of Creutzfeldt-Jakob disease (vCJD), respectively. All size distributions were unimodal and positively skewed. Aβ deposits reached a greater maximum size and their distributions were significantly less skewed than those of the PrPsc deposits. All distributions were approximately log-normal in shape, but only the diffuse PrPsc deposits did not deviate significantly from a log-normal model. There were fewer of the larger classic Aβ deposits than predicted, and the florid PrPsc deposits occupied a more restricted size range than predicted by a log-normal model. Hence, Aβ deposits exhibit greater growth than the corresponding PrPsc deposits. Surface diffusion may be particularly important in determining the growth of the diffuse PrPsc deposits. In addition, there are factors limiting the maximum size of the Aβ and florid PrPsc deposits.
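
As a rough illustration of the kind of analysis described, the sketch below fits a log-normal model to a size frequency distribution and tests the deviation from it; the simulated diameters and the use of scipy are assumptions for the example, not the authors' data or procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical deposit diameters (micrometres); real data would be measured
# from stained sections of AD or vCJD tissue.
diameters = rng.lognormal(mean=2.5, sigma=0.6, size=400)

shape, loc, scale = stats.lognorm.fit(diameters, floc=0)   # fit a log-normal
# Note: the KS p-value is only approximate when the parameters are estimated
# from the same sample.
ks_stat, p_value = stats.kstest(diameters, 'lognorm', args=(shape, loc, scale))

print(f"fitted sigma = {shape:.2f}, median = {scale:.1f} um")
print(f"KS p-value = {p_value:.3f} (small p => deviates from log-normal)")
print(f"sample skewness = {stats.skew(diameters):.2f}")
```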

Relevance:

100.00%

Publisher:

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach places greater emphasis on the potential pollution sources than the traditional approach, in which assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated according to the distribution of historically occurring events or an a priori probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as the risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and was automated using a DOS batch processing file. GIS software was employed to produce the input files and to present the results. The functionalities of the method, as well as its sensitivities to the model grid sizes, contaminant loading rates, length of stress periods, and the historical frequencies of occurrence of pollution events, were evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of the applications of the method are presented in the form of tables, graphs and spatial maps. Varying the model grid size has no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents is also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appear to increase with increasing grid size. The contaminant plume also migrates faster with coarse grids than with finer grids. The number of daily contaminant source terms generated, and consequently the total contaminant mass within the aquifer, increases non-linearly with the frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and quantitatively presented as risk maps. This capability to combine the risk to a groundwater feature from numerous potential pollution sources is a major asset of the method and a significant advantage over contemporary risk and vulnerability methods.
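
The core Monte Carlo step described here, a source term generated whenever a random number falls below the cell's probability of pollution occurrence, and risk counted as the fraction of realisations exceeding a threshold, can be sketched as follows. The cell probabilities, loading distribution and stand-in transport model are assumptions for illustration, not the thesis's MODFLOW-2000/MT3DMS/FORTRAN implementation.

```python
import numpy as np

rng = np.random.default_rng(42)

n_realisations = 1000
n_cells = 50                      # active model cells (hypothetical)
p_event = np.full(n_cells, 0.02)  # daily probability of a pollution event per cell
threshold = 250.0                 # user-defined chloride concentration limit (mg/l)

def transport_model(source_mass):
    """Stand-in for the transport step: maps source terms at the cells to a
    concentration at one monitoring borehole (purely illustrative)."""
    weights = np.linspace(1.0, 0.1, n_cells)     # nearer cells contribute more
    return float(source_mass @ weights)

exceedances = 0
for _ in range(n_realisations):
    events = rng.random(n_cells) < p_event       # which cells release today
    source_mass = np.where(events, rng.lognormal(5.0, 1.0, n_cells), 0.0)
    if transport_model(source_mass) > threshold:
        exceedances += 1

print(f"risk of exceeding {threshold} mg/l: {exceedances / n_realisations:.3f}")
```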

Relevance:

100.00%

Publisher:

Abstract:

This work presents a two-dimensional risk assessment method based on the quantification of the probability of occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation, whereby synthetic contaminant source terms are generated according to the distribution of historically occurring pollution events or an a priori probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as the risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. This capability to generate synthetic pollution events from numerous potential sources of pollution, based on the historical frequency of their occurrence, proved to be a great asset of the method and a significant advantage over contemporary methods.
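
The "sources all occurring by chance together" scenario is, at its simplest, a binomial question: given independent per-source occurrence probabilities, how likely is it that several release on the same day? The sketch below illustrates that calculation with assumed figures, not values from the paper.

```python
from math import comb

def prob_at_least_k(p, n, k):
    """P(at least k of n independent sources, each with daily probability p,
    occur on the same day) -- a simple binomial tail."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(prob_at_least_k(p=0.02, n=10, k=2))   # ~0.016 for 2 or more of 10 sources
```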

Relevance:

100.00%

Publisher:

Abstract:

Biomass-to-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called "second generation biofuels" that, unlike first generation biofuels, can make use of a wider range of biomass feedstocks than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes: Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies had been compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model affects its output (i.e. production cost). This was the first time that an uncertainty analysis had been included in a published techno-economic assessment of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification because of the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
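
The Monte Carlo uncertainty analysis mentioned here has a simple structure: sample the uncertain cost-model inputs from assumed distributions, propagate each sample through the cost calculation, and read off percentiles of the production cost. The sketch below shows that structure only; every number in it (capital cost range, feedstock price, conversion efficiency, capital charge) is an illustrative assumption, not a figure from the thesis.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 10_000

# Sampled input parameters (all ranges assumed for illustration):
capital_cost  = rng.triangular(300e6, 350e6, 450e6, n)   # GBP
feedstock_gbp = rng.normal(55, 10, n)                    # GBP per dry tonne
efficiency    = rng.uniform(0.40, 0.50, n)               # biomass-to-fuel, LHV basis

annual_feed_t = 1.0e6                                    # dry tonnes per year
fuel_out_gj   = annual_feed_t * 18.0 * efficiency        # ~18 GJ/t dry biomass
annual_cost   = 0.12 * capital_cost + feedstock_gbp * annual_feed_t  # capital charge + feed

production_cost = annual_cost / fuel_out_gj              # GBP per GJ of fuel
print(f"median = {np.median(production_cost):.1f} GBP/GJ, "
      f"90% interval = {np.percentile(production_cost, 5):.1f}"
      f"-{np.percentile(production_cost, 95):.1f} GBP/GJ")
```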

Relevance:

100.00%

Publisher:

Abstract:

The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contains some uncertainty, often due to incomplete knowledge; meaningful processing of this data requires these uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular, we adopt a Bayesian perspective and focus on the case where the inputs/outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions, encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can be easily extended. Universal Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics, such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
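
To make the "soft-typed" idea concrete, the sketch below assembles an XML element for a Gaussian distribution whose meaning is carried by a dictionary URI rather than a hard-coded element type. The element and attribute names and the URI are hypothetical placeholders; they are not taken from the actual UncertML schema or GML dictionaries.

```python
# Illustrative only: hypothetical element/attribute names, NOT the real UncertML schema.
import xml.etree.ElementTree as ET

def gaussian_element(mean, variance,
                     definition="http://example.org/uncertml/dict#Gaussian"):
    # Soft-typed element: its semantics come from the dictionary URI it links to.
    dist = ET.Element("Distribution", {"definition": definition})
    ET.SubElement(dist, "parameter", {"name": "mean"}).text = str(mean)
    ET.SubElement(dist, "parameter", {"name": "variance"}).text = str(variance)
    return dist

print(ET.tostring(gaussian_element(12.3, 0.25), encoding="unicode"))
```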

Relevance:

100.00%

Publisher:

Abstract:

Threshold stress intensity values, ranging from ∼6 to 16 MN m^-3/2, can be obtained in powder-formed Nimonic AP1 by changing the microstructure. The threshold and low crack growth rate behaviour at room temperature of a number of widely differing AP1 microstructures, with both 'necklace' and fully recrystallized grain structures of various sizes and uniform and bimodal γ′-distributions, has been investigated. The results indicate that grain size is an important microstructural parameter which can control threshold behaviour, with the value of threshold stress intensity increasing with increasing grain size, but that the γ′-distribution is also important. In this Ni-base alloy, as in many others, near-threshold fatigue crack growth occurs in a crystallographic manner along {111} planes. This is due to the development of a dislocation structure involving persistent slip bands on {111} planes in the plastic zone, caused by the presence of ordered shearable precipitates in the microstructure. However, as the stress intensity range is increased, a striated growth mode takes over. The results presented show that this transition from faceted to striated growth is associated with a sudden increase in crack propagation rate and occurs when the size of the reverse plastic zone at the crack tip becomes equal to the grain size, independent of any other microstructural variables.
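
The transition criterion, reverse plastic zone equal to the grain size, can be illustrated with a standard Irwin-type estimate of the cyclic plastic zone, r_c ≈ (1/π)(ΔK/2σ_y)². The sketch below is only that textbook estimate with an assumed yield strength; it is not the expression or the data used in the paper.

```python
# Back-of-envelope estimate: setting the reverse plastic zone size equal to the
# grain size d gives a transition stress intensity range dK = 2*sigma_y*sqrt(pi*d).
# Yield strength and grain sizes below are assumed, not taken from the paper.
import math

def transition_delta_k(grain_size_m, sigma_y_pa):
    """Delta-K (Pa*sqrt(m)) at which the reverse plastic zone equals the grain size."""
    return 2.0 * sigma_y_pa * math.sqrt(math.pi * grain_size_m)

sigma_y = 1000e6                      # ~1000 MPa yield strength (assumed)
for d_um in (10, 50, 100):            # grain sizes in micrometres
    dk = transition_delta_k(d_um * 1e-6, sigma_y)
    print(f"grain size {d_um:3d} um -> transition dK ~ {dk / 1e6:.1f} MPa*sqrt(m)")
```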

Relevance:

100.00%

Publisher:

Abstract:

Emulsions and microcapsules are typical structures in various dispersion formulations for pharmaceutical, food, personal care and household care applications. Precise control over the size and size distribution of emulsion droplets and microcapsules is important for effective use and delivery of active components and for better product quality. Many emulsification technologies have been developed to meet different formulation and processing requirements. Among them, membrane and microfluidic emulsification are emerging technologies with the ability to manufacture droplets precisely, in a drop-by-drop manner, to prescribed sizes and size distributions with lower energy consumption. This paper reviews the fundamental science and engineering aspects of emulsification, membrane and microfluidic emulsification technologies, and their use for the precision manufacture of emulsions for intensified processing. Generic application examples are given for single and double emulsions and microcapsules with different structural features. © 2013 The Society of Powder Technology Japan. Published by Elsevier B.V.

Relevance:

100.00%

Publisher:

Abstract:

Error rates of a Boolean perceptron with threshold and either spherical or Ising constraint on the weight vector are calculated for storing patterns from biased input and output distributions derived within a one-step replica symmetry breaking (RSB) treatment. For unbiased output distribution and non-zero stability of the patterns, we find a critical load, α_p, above which two solutions to the saddlepoint equations appear; one with higher free energy and zero threshold and a dominant solution with non-zero threshold. We examine this second-order phase transition and the dependence of α_p on the required pattern stability, κ, for both one-step RSB and replica symmetry (RS) in the spherical case and for one-step RSB in the Ising case.
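
For orientation, the sketch below evaluates the classic replica-symmetric Gardner capacity of a spherical perceptron at stability κ (unbiased patterns, zero threshold), α_c(κ) = [∫_{-κ}^{∞} Dt (t+κ)²]⁻¹. This is only the familiar RS baseline; the paper's one-step RSB calculation with a threshold and biased distributions is a different and harder computation.

```python
# RS Gardner capacity for the spherical perceptron (baseline only, not the
# paper's one-step RSB result): alpha_c(kappa) = 1 / int_{-kappa}^{inf} Dt (t+kappa)^2,
# where Dt is the standard Gaussian measure.
import numpy as np
from scipy import integrate

def gardner_alpha_c(kappa):
    integrand = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi) * (t + kappa)**2
    val, _ = integrate.quad(integrand, -kappa, np.inf)
    return 1.0 / val

for kappa in (0.0, 0.5, 1.0):
    print(f"kappa = {kappa:.1f}  ->  alpha_c(RS) = {gardner_alpha_c(kappa):.3f}")
```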

Relevance:

100.00%

Publisher:

Abstract:

The use of MS imaging (MSI) to resolve the spatial and pharmacodynamic distributions of compounds in tissues is emerging as a powerful tool for pharmacological research. Unlike established imaging techniques, MSI requires only limited a priori knowledge and no extensive manipulation (e.g., radiolabeling) of drugs prior to dosing. MS provides highly multiplexed detection, making it possible to identify compounds, their metabolites and other changes in biomolecular abundances directly off tissue sections in a single pass. This can be employed to obtain images at near-cellular, or potentially subcellular, resolution. Consideration of the technical limitations that affect the process is required, from sample preparation through to analyte ionization and detection. The techniques have only recently been adapted for imaging, and novel variations on the established MSI methodologies will further enhance the application of MSI to pharmacological research.

Relevance:

100.00%

Publisher:

Abstract:

Molecular dynamics simulations were carried out for Si/Ge axial nanowire heterostructures using modified embedded atom method (MEAM) potentials. A Si–Ge MEAM interatomic cross potential was developed based on available experimental data and was used for these studies. The atomic distortions and strain distributions near the Si/Ge interfaces are predicted for nanowires with their axes oriented along the [111] direction. The cases of 10 and 25 nm diameter Si/Ge biwires and of 25 nm diameter Si/Ge/Si axial heterostructures with a 1 nm thick Ge disk were studied. Substantial distortions in the height of the atoms adjacent to the interface were found for the biwires but not for the Ge disks. Strains as high as 3.5% were found for the Ge disk, and values of 2%–2.5% were found at the Si and Ge interfacial layers in the biwires. Deformation potential theory was used to estimate the influence of the strains on the band gap, and reductions in band gap to as small as 40% of bulk values are predicted for the Ge disks. Localized regions of increased strain and resulting energy minima were also found within the Si/Ge biwire interfaces, with the larger effects on the Ge side of the interface. The regions of strain maxima near and within the interfaces are anticipated to be useful for tailoring band gaps and producing quantum confinement of carriers. These results suggest that nanowire heterostructures provide greater design flexibility in band structure modification than is possible with planar layer growth.
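
A quick consistency check on the strain magnitudes quoted here is the Si/Ge lattice mismatch, which sets the upper bound on the misfit strain. The sketch below computes it from standard room-temperature lattice constants; it is an illustrative check, not part of the paper's MEAM analysis.

```python
# Si/Ge lattice mismatch from bulk lattice constants (a_Si ~ 5.431 A, a_Ge ~ 5.658 A).
a_si, a_ge = 5.431, 5.658                 # angstroms, room-temperature values
misfit = (a_ge - a_si) / a_si
print(f"Si/Ge lattice misfit ~ {misfit:.1%}")   # ~4.2%; the reported 2-3.5% strains
                                                # imply partial elastic relaxation
                                                # in the nanowire geometry
```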

Relevance:

100.00%

Publisher:

Abstract:

The gamma-rays produced by the inelastic scattering of 14 MeV neutrons in fusion reactor materials have been studied using a gamma-ray spectrometer employing a sodium iodide scintillation detector. The source neutrons are produced by the T(d,n)4He reaction using the SAMES accelerator at the University of Aston in Birmingham. In order to eliminate the large gamma-ray background and the neutron signal arising from the sensitivity of the sodium iodide detector to neutrons, the gamma-ray detector is heavily shielded and is used together with a particle discrimination system based on the associated particle time-of-flight method. The instant of production of a source neutron is determined by detecting the associated alpha-particle, enabling discrimination between neutrons and gamma-rays by their different flight times. The electronic system used for measuring the time of flight of the neutrons and gamma-rays over the fixed flight path is described. The materials studied in this work were Lithium and Lead because of their importance as fuel-breeding and shielding materials in conceptual fusion reactor designs. Several sample thicknesses were studied to determine the multiple scattering effects. The observed gamma-ray spectra from each sample at several scattering angles in the angular range 0° - 90° enabled absolute differential gamma-ray production cross-sections and angular distributions of the resolved gamma-rays from Lithium to be measured and compared with published data. For the Lead sample, the absolute differential gamma-ray production cross-sections for discrete 1 MeV ranges and the angular distributions were measured. The measured angular distributions of the present work, and those for Iron from previous work, are compared with the predictions of the Monte Carlo programme M.O.R.S.E. Good agreement was obtained between the experimental results and the theoretical predictions. In addition, an empirical relation has been constructed which describes the multiple scattering effects by a single parameter and is capable of predicting the gamma-ray production cross-sections for these materials to an accuracy of ±25%.
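
The discrimination principle rests on the very different flight times of neutrons and gamma-rays over a fixed path. The sketch below computes both for a 14 MeV neutron, using an assumed 1 m flight path rather than the actual geometry of the experiment.

```python
# Time-of-flight separation of 14 MeV neutrons and prompt gamma-rays over an
# assumed 1 m flight path (path length is illustrative, not the thesis's setup).
import math

C = 2.998e8                 # speed of light, m/s
M_N = 939.565               # neutron rest mass energy, MeV

def neutron_speed(kinetic_mev):
    gamma = 1.0 + kinetic_mev / M_N
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return beta * C

path = 1.0                                  # metres (assumed flight path)
t_gamma = path / C
t_neutron = path / neutron_speed(14.0)
print(f"gamma:   {t_gamma * 1e9:5.2f} ns")
print(f"neutron: {t_neutron * 1e9:5.2f} ns  (separation ~ {(t_neutron - t_gamma) * 1e9:.1f} ns)")
```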

Relevance:

100.00%

Publisher:

Abstract:

This thesis describes the procedure and results from four years of research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lift and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which previously had defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data was updated and adjusted using mechanical and electrical pre-tender cost indices and location, selection of contractor, contract sum, height and site condition factors. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables and a network of the interrelationships between services components provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From this data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
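
The Monte Carlo core of such a model is straightforward to sketch: each services element cost is drawn from its assigned distribution and the totals accumulate into a cumulative frequency distribution. The element names, cost ranges and distribution parameters below are invented for illustration and bear no relation to the thesis's data or to the VERT software itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000

def beta_cost(low, high, a=2.0, b=4.0, size=n):
    """Beta-distributed element cost rescaled onto [low, high] (GBP)."""
    return low + (high - low) * rng.beta(a, b, size)

# Hypothetical services elements, each with its own cost distribution:
elements = {
    "heating & ventilating": beta_cost(180e3, 320e3),
    "electrical":            rng.normal(250e3, 25e3, n),
    "lifts":                 rng.uniform(90e3, 140e3, n),
    "fire protection":       beta_cost(40e3, 90e3),
}
total = sum(elements.values())

for q in (10, 50, 90):
    print(f"{q:2d}th percentile of total services cost: {np.percentile(total, q):,.0f} GBP")
```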

Relevance:

100.00%

Publisher:

Abstract:

This thesis deals with the problems associated with the planning and control of production, with particular reference to a small aluminium die casting company. The main problem areas were identified as: (a) a need to be able to forecast the customers' demands upon the company's facilities; (b) a need to produce a manufacturing programme in which the output of the foundry (or die casting section) was balanced with the available capacity in the machine shop; and (c) the need to ensure that the resultant system gave the company's operating budget a reasonable chance of being achieved. At the commencement of the research work the major customers were members of the automobile industry and had their own systems of forecasting, from which they issued manufacturing schedules to their component suppliers. The errors in the forecasts were analysed and their distributions noted. Using these distributions, the customer's forecast could be modified to enable the final demand to be met with a known degree of confidence. Before a manufacturing programme could be developed, the actual manufacturing system had to be reviewed, and it was found that, as with many small companies, there was a remarkable lack of formal control and written data. Relevant data regarding the components and the manufacturing process therefore had to be collected and analysed. The foundry process was fixed, but the secondary machining operations were analysed by a technique similar to Component Flow Analysis and, as a result, the machines were arranged in a series of flow lines. A system of manual production control was proposed and, for comparison, a local computer bureau was approached and a system incorporating the production of additional management information was proposed. These systems are compared, their relative merits discussed, and a proposal made for implementation.
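
The forecast modification described here amounts to adding a margin taken from the empirical distribution of past forecast errors so that demand is covered with a chosen confidence. The sketch below illustrates that idea with invented error data; it is not the company's or the thesis's actual procedure.

```python
import numpy as np

# Hypothetical historical forecast errors: actual demand minus scheduled quantity.
errors = np.array([120, -80, 45, 200, -30, 95, 160, -10, 60, 130])

def adjusted_forecast(customer_schedule, confidence=0.90):
    """Raise the schedule by the error quantile needed to cover demand with the
    requested confidence (empirical quantile of past errors)."""
    margin = np.quantile(errors, confidence)
    return customer_schedule + max(margin, 0.0)

print(adjusted_forecast(5000))    # plan for ~5000 plus the 90th-percentile error
```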

Relevance:

100.00%

Publisher:

Abstract:

Substantial altimetry datasets collected by different satellites have only become available during the past five years, but the future will bring a variety of new altimetry missions, both parallel and consecutive in time. The characteristics of each produced dataset vary with the different orbital heights and inclinations of the spacecraft, as well as with the technical properties of the radar instrument. An integral analysis of datasets with different properties offers advantages in terms of both data quantity and data quality. This thesis is concerned with the development of the means for such integral analysis, in particular for dynamic solutions in which precise orbits for the satellites are computed simultaneously. The first half of the thesis discusses the theory and numerical implementation of dynamic multi-satellite altimetry analysis. The most important aspect of this analysis is the application of dual-satellite altimetry crossover points as a bi-directional tracking data type in simultaneous orbit solutions. The central problem is that the spatial and temporal distributions of the crossovers are in conflict with the time-organised nature of traditional solution methods. Their application to the adjustment of the orbits of both satellites involved in a dual crossover therefore requires several fundamental changes to the classical least-squares prediction/correction methods. The second part of the thesis applies the developed numerical techniques to the problems of precise orbit computation and gravity field adjustment, using the altimetry datasets of ERS-1 and TOPEX/Poseidon. Although the two datasets can be considered less compatible than those of planned future satellite missions, the obtained results adequately illustrate the merits of a simultaneous solution technique. In particular, the geographically correlated orbit error is partially observable from a dataset consisting of crossover differences between two sufficiently different altimetry datasets, while being unobservable from the analysis of altimetry data of each satellite individually. This error signal, which has a substantial gravity-induced component, can be employed advantageously in simultaneous solutions for the two satellites in which the harmonic coefficients of the gravity field model are also estimated.
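
The reason dual-satellite crossovers demand a simultaneous solution is that each crossover difference depends on the orbit-error parameters of both satellites at once. The toy sketch below shows a single least-squares adjustment of once-per-revolution radial error terms for two satellites from simulated crossover differences; the error model, period and noise level are assumptions, not the thesis's formulation.

```python
# Toy model: each crossover difference d = h_A(t1) - h_B(t2) depends on the
# radial orbit-error parameters of BOTH satellites, so one least-squares
# solution adjusts them simultaneously.
import numpy as np

# Radial orbit error per satellite: dr(t) = a*cos(w t) + b*sin(w t)  (2 parameters each)
w = 2 * np.pi / 6028.0                       # assumed orbital period ~100 min (rad/s)

def design_row(t_a, t_b):
    return [np.cos(w * t_a), np.sin(w * t_a),      # satellite A parameters
            -np.cos(w * t_b), -np.sin(w * t_b)]    # satellite B parameters

rng = np.random.default_rng(3)
times_a = rng.uniform(0, 86400, 200)         # crossover epochs on each orbit (s)
times_b = rng.uniform(0, 86400, 200)
true_x = np.array([0.10, -0.03, -0.08, 0.04])          # metres (assumed)
A = np.array([design_row(ta, tb) for ta, tb in zip(times_a, times_b)])
d = A @ true_x + rng.normal(0, 0.03, 200)    # simulated crossover differences + 3 cm noise

x_hat, *_ = np.linalg.lstsq(A, d, rcond=None)
print(np.round(x_hat, 3))                    # both satellites' terms recovered at once
```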