941 results for Chebyshev And Binomial Distributions


Relevance:

100.00%

Publisher:

Abstract:

We give a selective review of quantum mechanical methods for calculating and characterizing resonances in small molecular systems, with an emphasis on recent progress in Chebyshev and Lanczos iterative methods. Two archetypal molecular systems are discussed: isolated resonances in HCO, which exhibit regular mode and state specificity, and overlapping resonances in strongly bound HO2, which exhibit irregular and chaotic behavior. Recent progress in calculations of resonances for non-zero total angular momentum J, including parallel computing models, is also covered, and future directions in this field are discussed.
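
A minimal sketch of the Chebyshev three-term recursion at the heart of such iterative methods may help fix ideas. The Hamiltonian below is a toy random symmetric matrix, not a molecular one, and resonance extraction (absorbing potentials, filter diagonalisation) is beyond this fragment; all names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
H = rng.normal(size=(n, n))
H = (H + H.T) / 2                          # toy Hermitian "Hamiltonian"

emin, emax = np.linalg.eigvalsh(H)[[0, -1]]
Hs = (2 * H - (emax + emin) * np.eye(n)) / (emax - emin)  # spectrum scaled into [-1, 1]

psi0 = rng.normal(size=n)
psi0 /= np.linalg.norm(psi0)

# Three-term recurrence: T_{k+1}(Hs)|psi0> = 2 Hs T_k(Hs)|psi0> - T_{k-1}(Hs)|psi0>
t_prev, t_cur = psi0, Hs @ psi0
moments = [psi0 @ t_prev, psi0 @ t_cur]    # Chebyshev moments C_k = <psi0|T_k(Hs)|psi0>
for _ in range(498):
    t_prev, t_cur = t_cur, 2 * (Hs @ t_cur) - t_prev
    moments.append(psi0 @ t_cur)

# A cosine transform of these moments recovers the spectral density of H.
print(f"computed {len(moments)} Chebyshev moments; C_0 = {moments[0]:.3f}")
```

The appeal of the recursion, and the reason it scales to large molecular Hamiltonians, is that it needs only matrix-vector products and two stored vectors.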

Relevance:

100.00%

Publisher:

Abstract:

It remains unclear whether genetic variants in SNCA (the alpha-synuclein gene) alter risk for sporadic Parkinson's disease (PD). The polymorphic mixed sequence repeat (NACP-Rep1) in the promoter region of SNCA has previously been examined as a potential susceptibility factor for PD, with conflicting results. We report genotype and allele distributions at this locus from 369 PD cases and 370 control subjects of European Australian ancestry, with alleles designated -1, 0, +1, +2, and +3 as previously described. The allele designated 0 was less common in Australian cases than in controls (OR = 0.80, 95% CI 0.62-1.03). A combined analysis including all previously published ancestral European Rep1 data yielded a highly significant association between the 0 allele and a reduced risk for PD (OR = 0.79, 95% CI 0.70-0.89, p = 0.0001). Further study should now examine this interesting and biologically plausible genetic association in detail.
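
For readers unfamiliar with the statistics quoted above, the sketch below works through an odds-ratio and 95% confidence-interval calculation of the kind behind "OR = 0.80, 95% CI 0.62-1.03". The 2x2 counts are hypothetical, chosen only to give a similar odds ratio; they are not the study's data.

```python
import math

a, b = 150, 219   # cases: with / without the 0 allele (illustrative counts)
c, d = 170, 200   # controls: with / without the 0 allele (illustrative counts)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)        # standard error of log(OR)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR = 0.81, 95% CI 0.60-1.08
```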

Relevance:

100.00%

Publisher:

Abstract:

Using imaging from the Hubble Space Telescope, we derive surface brightness profiles for ultracompact dwarfs in the Fornax Cluster and for the nuclei of dwarf elliptical galaxies in the Virgo Cluster. Ultracompact dwarfs are more extended and have higher surface brightnesses than typical dwarf nuclei, while the luminosities, colors, and sizes of the nuclei are closer to those of Galactic globular clusters. This calls into question the production of ultracompact dwarfs via threshing, whereby the lower surface brightness envelope of a dwarf elliptical galaxy is removed by tidal processes, leaving behind a bare nucleus. Threshing may still be a viable model if the relatively bright Fornax ultracompact dwarfs considered here are descended from dwarf elliptical galaxies whose nuclei are at the upper end of their luminosity and size distributions.

Relevance:

100.00%

Publisher:

Abstract:

We present optical, near-IR, and radio follow-up of 16 Swift bursts, including our discovery of nine afterglows and a redshift determination for three. These observations, supplemented by data from the literature, provide an afterglow recovery rate of 52% in the optical/near-IR, much higher than in previous missions (BeppoSAX, HETE-2, INTEGRAL, and IPN). The optical/near-IR afterglows of Swift events are on average 1.8 mag fainter at t = 12 hr than those of previous missions. The X-ray afterglows are similarly fainter than those of pre-Swift bursts. In the radio the limiting factor is the VLA threshold, and the detection rate for Swift bursts is similar to that for past missions. The redshift distribution of pre-Swift bursts peaked at z ~ 1, whereas the six Swift bursts with measured redshifts are distributed evenly between 0.7 and 3.2. From these results we conclude that (1) the pre-Swift distributions were biased in favor of bright events and low-redshift events, (2) the higher sensitivity and accurate positions of Swift result in a better representation of the true burst redshift and brightness distributions (which are higher and dimmer, respectively), and (3) ~10% of the bursts are optically dark, as a result of a high redshift and/or dust extinction. We remark that the apparent lack of low-redshift, low-luminosity Swift bursts and the lower event rate than prelaunch estimates (90 vs. 150 per year) are the result of a threshold that is similar to that of BATSE. In view of these inferences, afterglow observers may find it advisable to make significant changes in follow-up strategies of Swift events. The faintness of the afterglows means that large telescopes should be employed as soon as the burst is localized. Sensitive observations in RIz and near-IR bands will be needed to discriminate between a typical z ~ 2 burst with modest extinction and a high-redshift event. Radio observations will be profitable for a small fraction (~10%) of events. Finally, we suggest that a search for bright host galaxies in untriggered BAT localizations may increase the chance of finding nearby low-luminosity GRBs.

Relevance:

100.00%

Publisher:

Abstract:

Recent analyses assert that large marine vertebrates such as marine mammals are now 'functionally or entirely extinct in most coastal ecosystems'. Moreton Bay is a large diverse marine ecosystem bordering the fastest growing area in Australia. The human population is over 1.6 million and increasing yearly by between 10% and 13%, with resultant impacts upon the adjoining marine environment. Nonetheless, significant populations of three species of marine mammals are resident within Moreton Bay and a further 14 species are seasonal or occasional visitors. This paper reviews the current and historical distributions and abundance of these species in the context of the current management regime and suggests initiatives to increase the resilience of marine mammal populations to the changes wrought by the burgeoning human population in coastal environments.

Relevance:

100.00%

Publisher:

Abstract:

We present a spectroscopic survey of almost 15 000 candidate intermediate-redshift luminous red galaxies (LRGs) brighter than i = 19.8, observed with 2dF on the Anglo-Australian Telescope. The targets were selected photometrically from the Sloan Digital Sky Survey (SDSS) and lie along two narrow equatorial strips covering 180 deg². Reliable redshifts were obtained for 92 per cent of the targets and the selection is very efficient: over 90 per cent have 0.45 < z < 0.8. More than 80 per cent of the ~11 000 red galaxies have pure absorption-line spectra consistent with a passively evolving old stellar population. The redshift, photometric and spatial distributions of the LRGs are described. The 2SLAQ data will be released publicly from mid-2006, providing a powerful resource for observational cosmology and the study of galaxy evolution.

Relevance:

100.00%

Publisher:

Abstract:

Finite mixture models are being increasingly used to model the distributions of a wide variety of random phenomena. While normal mixture models are often used to cluster data sets of continuous multivariate data, a more robust clustering can be obtained by considering the t mixture model-based approach. Mixtures of factor analyzers enable model-based density estimation to be undertaken for high-dimensional data where the number of observations n is not very large relative to their dimension p. As the approach using the multivariate normal family of distributions is sensitive to outliers, it is more robust to adopt the multivariate t family for the component error and factor distributions. The computational aspects associated with robustness and high dimensionality in these approaches to cluster analysis are discussed and illustrated.
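
As a concrete illustration of the robustness mechanism, here is a minimal EM sketch for a mixture of multivariate t distributions with fixed degrees of freedom (the full approach also estimates the degrees of freedom and, for high dimensions, uses factor-analytic covariance structures; this fragment is not the authors' software). Outlying points receive small weights u and so barely perturb the component means and covariances.

```python
import numpy as np
from scipy.stats import multivariate_t

def fit_t_mixture(X, K=2, nu=4.0, n_iter=100, seed=0):
    """EM for a K-component multivariate t mixture with fixed df = nu."""
    n, p = X.shape
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)
    mu = X[rng.choice(n, K, replace=False)].copy()
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(p) for _ in range(K)])
    for _ in range(n_iter):
        # E-step: responsibilities tau and robustness weights u
        dens = np.column_stack([
            pi[k] * multivariate_t.pdf(X, loc=mu[k], shape=Sigma[k], df=nu)
            for k in range(K)])
        tau = dens / dens.sum(axis=1, keepdims=True)
        # M-step: down-weighted updates (small u for outlying points)
        for k in range(K):
            diff = X - mu[k]
            maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma[k]), diff)
            u = (nu + p) / (nu + maha)
            w = tau[:, k] * u
            mu[k] = w @ X / w.sum()
            diff = X - mu[k]
            Sigma[k] = (w[:, None] * diff).T @ diff / tau[:, k].sum() + 1e-6 * np.eye(p)
            pi[k] = tau[:, k].mean()
    return pi, mu, Sigma

# Two clean clusters plus gross outliers: the t mixture recovers the means.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (150, 2)),
               rng.normal(6, 1, (150, 2)),
               rng.uniform(-20, 20, (10, 2))])
pi, mu, Sigma = fit_t_mixture(X)
print("mixing proportions:", pi.round(2))
print("component means:\n", mu.round(2))
```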

Relevance:

100.00%

Publisher:

Abstract:

The thesis presents a two-dimensional Risk Assessment Method (RAM) in which the assessment of risk to groundwater resources incorporates both the quantification of the probability of occurrence of contaminant source terms and the assessment of the resultant impacts. The approach emphasizes a greater dependency on the potential pollution sources, rather than the traditional approach in which assessment is based mainly on intrinsic geo-hydrologic parameters. The risk is calculated using Monte Carlo simulation, whereby random pollution events are generated with the same distribution as historically occurring events or an a priori potential probability distribution. Integrated mathematical models then simulate contaminant concentrations at predefined monitoring points within the aquifer. The spatial and temporal distributions of the concentrations are calculated from repeated realisations, and the number of times a user-defined concentration magnitude is exceeded is quantified as risk. The method was set up by integrating MODFLOW-2000, MT3DMS and a FORTRAN-coded risk model, and automated using a DOS batch-processing file. GIS software was employed to produce the input files and to present the results. The functionality of the method, as well as its sensitivity to model grid sizes, contaminant loading rates, lengths of stress periods, and the historical frequencies of occurrence of pollution events, was evaluated using hypothetical scenarios and a case study. Chloride-related pollution sources were compiled and used as indicative potential contaminant sources for the case study. At any active model cell, if a randomly generated number is less than the probability of pollution occurrence, the risk model generates a synthetic contaminant source term as an input to the transport model. The results of applying the method are presented in the form of tables, graphs and spatial maps. Varying the model grid sizes indicated no significant effect on the simulated groundwater head. The simulated frequency of daily occurrence of pollution incidents was also independent of the model dimensions. However, the simulated total contaminant mass generated within the aquifer, and the associated volumetric numerical error, appeared to increase with increasing grid size. The migration of the contaminant plume also advanced faster with coarse grid sizes than with finer ones. The number of daily contaminant source terms generated, and consequently the total mass of contaminant within the aquifer, increased in non-linear proportion to the increasing frequency of occurrence of pollution events. The risk of pollution from a number of sources all occurring by chance together was evaluated and presented quantitatively as risk maps. This capability to combine the risk to a groundwater feature from numerous potential sources of pollution proved to be a great asset of the method, and a large benefit over contemporary risk and vulnerability methods.
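
The event-generation and exceedance-counting logic described above can be summarised in a short sketch. Everything here is a hypothetical stand-in: the thesis couples the risk model to MODFLOW-2000/MT3DMS, whereas this fragment replaces the transport simulation with a toy linear operator, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n_cells, n_points, n_real = 50, 5, 2000
p_event = rng.uniform(0.0, 0.1, n_cells)        # per-cell daily probability of a pollution event
load = 100.0                                    # contaminant mass released per event
A = rng.uniform(0.0, 0.02, (n_points, n_cells)) # toy stand-in for the transport model
threshold = 5.0                                 # user-defined concentration magnitude

exceedances = np.zeros(n_points)
for _ in range(n_real):
    # A cell releases a source term when a random number falls below its
    # probability of pollution occurrence, as described above.
    events = rng.random(n_cells) < p_event
    conc = A @ (events * load)                  # simulated concentration at monitoring points
    exceedances += conc > threshold

risk = exceedances / n_real                     # fraction of realisations exceeding the threshold
print("risk at monitoring points:", risk.round(3))
```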

Relevance:

100.00%

Publisher:

Abstract:

This work presents a two-dimensional risk assessment method based on the quantification of the probability of occurrence of contaminant source terms, as well as the assessment of the resultant impacts. The risk is calculated using Monte Carlo simulation, whereby synthetic contaminant source terms are generated with the same distribution as historically occurring pollution events or an a priori potential probability distribution. The spatial and temporal distributions of the generated contaminant concentrations at pre-defined monitoring points within the aquifer are then simulated from repeated realisations using integrated mathematical models. The number of times user-defined ranges of concentration magnitudes are exceeded is quantified as risk. The utility of the method was demonstrated using hypothetical scenarios, and the risk of pollution from a number of sources all occurring by chance together was evaluated. The results are presented in the form of charts and spatial maps. The generated risk maps show the risk of pollution at each observation borehole, as well as the trends within the study area. This capability to generate synthetic pollution events from numerous potential sources, based on the historical frequency of their occurrence, proved to be a great asset of the method and a large benefit over contemporary methods.

Relevance:

100.00%

Publisher:

Abstract:

Biomass-To-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called “second generation biofuels” that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies had been compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model affects its output (i.e. production cost). This was the first time that an uncertainty analysis had been included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification, owing to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
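
The flavour of such an uncertainty analysis is easy to convey in code. The sketch below propagates triangular input distributions through a deliberately simplified production-cost formula; the actual cost model was built in MS Excel, and every distribution, parameter and formula here is illustrative rather than taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Illustrative triangular distributions (min, mode, max) for uncertain inputs
capital = rng.triangular(180e6, 220e6, 300e6, n)   # plant capital cost, GBP
feed_price = rng.triangular(40, 55, 80, n)         # biomass price, GBP per t dry
fuel_out = rng.triangular(90e3, 110e3, 120e3, n)   # fuel output, t per year

crf = 0.13          # capital recovery factor (hypothetical)
feed_per_t = 5.0    # t dry biomass per t fuel (hypothetical)

# Toy annualised production cost per tonne of fuel
cost = crf * capital / fuel_out + feed_price * feed_per_t

print(f"production cost: median {np.median(cost):.0f} GBP/t, "
      f"90% interval [{np.percentile(cost, 5):.0f}, {np.percentile(cost, 95):.0f}] GBP/t")
```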

Relevance:

100.00%

Publisher:

Abstract:

The Semantic Web relies on carefully structured, well-defined data to allow machines to communicate and understand one another. In many domains (e.g. geospatial) the data being described contain some uncertainty, often due to incomplete knowledge; meaningful processing of these data requires the uncertainties to be carefully analysed and integrated into the process chain. Currently, within the Semantic Web there is no standard mechanism for interoperable description and exchange of uncertain information, which renders the automated processing of such information implausible, particularly where error must be considered and captured as it propagates through a processing sequence. In particular we adopt a Bayesian perspective and focus on the case where the inputs/outputs are naturally treated as random variables. This paper discusses a solution to the problem in the form of the Uncertainty Markup Language (UncertML). UncertML is a conceptual model, realised as an XML schema, that allows uncertainty to be quantified in a variety of ways, i.e. as realisations, statistics and probability distributions. UncertML is based upon a soft-typed XML schema design that provides a generic framework from which any statistic or distribution may be created. Making extensive use of Geography Markup Language (GML) dictionaries, UncertML provides a collection of definitions for common uncertainty types. Containing both written descriptions and mathematical functions, encoded as MathML, the definitions within these dictionaries provide a robust mechanism for defining any statistic or distribution and can be easily extended. Uniform Resource Identifiers (URIs) are used to introduce semantics to the soft-typed elements by linking to these dictionary definitions. The INTAMAP (INTeroperability and Automated MAPping) project provides a use case for UncertML. This paper demonstrates how observation errors can be quantified using UncertML and wrapped within an Observations & Measurements (O&M) Observation. The interpolation service uses the information within these observations to influence the prediction outcome. The output uncertainties may be encoded in a variety of UncertML types, e.g. a series of marginal Gaussian distributions, a set of statistics such as the first three marginal moments, or a set of realisations from a Monte Carlo treatment. Quantifying and propagating uncertainty in this way allows such interpolation results to be consumed by other services. This could form part of a risk management chain or a decision support system, and ultimately paves the way for complex data processing chains in the Semantic Web.
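
To make the soft-typing idea concrete, the fragment below builds a schematic XML element in which a generic Distribution is given meaning by a URI pointing at a dictionary definition. The element names and the dictionary URI are hypothetical illustrations of the design, not the normative UncertML schema.

```python
import xml.etree.ElementTree as ET

# Soft-typed element: semantics come from the 'definition' URI, which
# would resolve to a dictionary entry describing a Gaussian distribution.
dist = ET.Element("Distribution",
                  definition="http://example.org/dictionary/gaussian")
for name, value in [("mean", "12.4"), ("variance", "0.9")]:
    param = ET.SubElement(dist, "parameter", name=name)
    param.text = value

print(ET.tostring(dist, encoding="unicode"))
```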

Relevance:

100.00%

Publisher:

Abstract:

Threshold stress intensity values, ranging from ~6 to 16 MN m^-3/2, can be obtained in powder-formed Nimonic AP1 by changing the microstructure. The threshold and low crack growth rate behaviour at room temperature of a number of widely differing AP1 microstructures, with both ‘necklace’ and fully recrystallized grain structures of various sizes and uniform and bimodal γ′-distributions, have been investigated. The results indicate that grain size is an important microstructural parameter which can control threshold behaviour, with the value of threshold stress intensity increasing with increasing grain size, but that the γ′-distribution is also important. In this Ni-base alloy, as in many others, near-threshold fatigue crack growth occurs in a crystallographic manner along {111} planes. This is due to the development of a dislocation structure involving persistent slip bands on {111} planes in the plastic zone, caused by the presence of ordered shearable precipitates in the microstructure. However, as the stress intensity range is increased, a striated growth mode takes over. The results presented show that this transition from faceted to striated growth is associated with a sudden increase in crack propagation rate and occurs when the size of the reverse plastic zone at the crack tip becomes equal to the grain size, independent of any other microstructural variables.
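
The transition criterion lends itself to a back-of-envelope check. Using one common Irwin-type estimate of the reverse cyclic plastic zone, r_c ≈ (1/2π)(ΔK/2σ_y)², the sketch below inverts r_c = grain size for the transition ΔK; the yield stress and grain size are illustrative values, not taken from the paper.

```python
import math

sigma_y = 1000e6     # yield stress, Pa (illustrative for a Ni-base superalloy)
grain_size = 20e-6   # grain size, m (illustrative)

# Invert r_c = (1 / (2 pi)) * (dK / (2 sigma_y))**2 = grain_size for dK
dK = 2 * sigma_y * math.sqrt(2 * math.pi * grain_size)
print(f"transition stress intensity range ~ {dK / 1e6:.0f} MN m^-3/2")
```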

Relevance:

100.00%

Publisher:

Abstract:

Emulsions and microcapsules are typical structures in various dispersion formulations for pharmaceutical, food, personal care and household care applications. Precise control over the size and size distribution of emulsion droplets and microcapsules is important for effective use and delivery of active components and for better product quality. Many emulsification technologies have been developed to meet different formulation and processing requirements. Among them, membrane and microfluidic emulsification are emerging technologies able to manufacture droplets precisely, in a drop-by-drop manner, to prescribed sizes and size distributions with lower energy consumption. This paper reviews the fundamental science and engineering aspects of emulsification, membrane and microfluidic emulsification technologies, and their use for the precision manufacture of emulsions for intensified processing. Generic application examples are given for single and double emulsions and for microcapsules with different structural features.

Relevance:

100.00%

Publisher:

Abstract:

Hydroperiod, or the distribution, duration and timing of flooding, affects both plant and animal distributions. The Florida Everglades is currently undergoing restoration that will result in altered hydroperiods. This study was conducted in Everglades National Park to document the variability in periphyton community structure and function between long and short hydroperiod Everglades marshes. Periphyton is an important primary producer and food resource in the Everglades, and is also involved in marl soil formation and nutrient cycling. Although periphyton is an important component of the Everglades landscape, little is known about periphyton structural-functional variation between hydroperiods.

For this study, diatoms, as well as fresh algal slides of diatoms, cyanobacteria and green algae, were identified and enumerated. Soil and water column nutrients were compared between short and long hydroperiod marshes. Short and long hydroperiod algal periphyton mat productivity rates were compared using BOD incubations. Experimental manipulations were performed to determine the effects of desiccation duration and rewetting on periphyton productivity, community structure, and nutrient flux.

Variation in periphyton community structure was significantly greater between hydroperiods than within hydroperiods. Short and long hydroperiod periphyton mats contain the same algal species; it is their distribution and abundance that vary between hydroperiods. Long hydroperiod mats have greater diatom abundance, while short hydroperiod mats have greater relative abundance of filamentous cyanobacteria.

Long hydroperiod mats had greater net primary production (NPP) than short hydroperiod mats. Short hydroperiod mats respond to rewetting more rapidly than do long hydroperiod mats; dry short hydroperiod mats became net primary producers within 24 hours of rehydration. Increasing desiccation duration led to greater cyanobacterial abundance in long hydroperiod mats and decreased diatom abundance in both long and short hydroperiod mats.

Relevance:

100.00%

Publisher:

Abstract:

Extensive data sets on water quality and seagrass distributions in Florida Bay have been assembled under complementary, but independent, monitoring programs. This paper presents the landscape-scale results from these monitoring programs and outlines a method for exploring the relationships between two such data sets. Seagrass species occurrence and abundance data were used to define eight benthic habitat classes from 677 sampling locations in Florida Bay. Water quality data from 28 monitoring stations spread across the Bay were used to construct a discriminant function model that assigned a probability of a given benthic habitat class occurring for a given combination of water quality variables. Mean salinity, salinity variability, the amount of light reaching the benthos, sediment depth, and mean nutrient concentrations were important predictor variables in the discriminant function model. Using a cross-validated classification scheme, this discriminant function identified the most likely benthic habitat type as the actual habitat type in most cases. The model predicted that the distribution of benthic habitat types in Florida Bay would likely change if water quality and water delivery were changed by human engineering of freshwater discharge from the Everglades. Specifically, an increase in the seasonal delivery of freshwater to Florida Bay should cause an expansion of seagrass beds dominated by Ruppia maritima and Halodule wrightii at the expense of the Thalassia testudinum-dominated community that now occurs in northeast Florida Bay. These statistical techniques should prove useful for predicting landscape-scale changes in community composition in diverse systems where communities are in quasi-equilibrium with environmental drivers.
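
A minimal sketch of this kind of analysis, on synthetic data with stand-in variable names (not the monitoring data themselves), is shown below: a linear discriminant function is cross-validated and then used to assign each station a probability of each benthic habitat class.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 677   # number of sampling locations, as above

# Stand-ins for the predictors named above: mean salinity, salinity
# variability, benthic light, sediment depth, mean nutrient concentration
X = rng.normal(size=(n, 5))
y = rng.integers(0, 8, size=n)   # eight benthic habitat classes

lda = LinearDiscriminantAnalysis()
print("cross-validated accuracy:", cross_val_score(lda, X, y, cv=5).mean().round(3))

lda.fit(X, y)
print("habitat class probabilities, first station:",
      lda.predict_proba(X[:1]).round(3))
```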