965 results for Product State Distributions
Abstract:
Planktic foraminifera are heterotrophic mesozooplankton of global marine abundance. The position of planktic foraminifers in the marine food web differs from that of other protozoans, ranging above the base of heterotrophic consumers. Being secondary producers with an omnivorous diet ranging from algae to small metazoans, planktic foraminifers are not limited to a single food source and are assumed to occur at a balanced abundance reflecting the overall marine biological productivity at a regional scale. We have calculated the assemblage carbon biomass from data on standing stocks between the sea surface and 2500 m water depth, based on 754 protein-biomass data of 21 planktic foraminifer species and morphotypes, produced with a newly developed method to analyze the protein biomass of single planktic foraminifer specimens. Samples include symbiont-bearing and symbiont-barren species characteristic of surface and deep-water habitats. Conversion factors between individual protein-biomass and assemblage-biomass are calculated for test sizes between 72 and 845 µm (minimum diameter). The calculated assemblage biomass data presented here include 1057 sites and water depth intervals. Although the regional coverage of the database is limited to the North Atlantic, Arabian Sea, Red Sea, and Caribbean, our data include a wide range of oligotrophic to eutrophic waters covering six orders of magnitude of assemblage biomass. A first-order estimate of the global planktic foraminifer biomass from average standing stocks (>125 µm) ranges from 8.5 to 32.7 Tg C yr⁻¹ (i.e. 0.008-0.033 Gt C yr⁻¹), and might be more than three times as high when the entire fauna, including neanic and juvenile individuals, is counted, adding up to 25-100 Tg C yr⁻¹. However, this is a first estimate of regional planktic-foraminifer assemblage-biomass (PFAB) extrapolated to the global scale, and future estimates based on larger data sets might considerably deviate from the one presented here.
This paper is supported by, and a contribution to the Marine Ecosystem Data project (MAREDAT).
A Product-Service System (PSS) is an integrated combination of products and services. This Western concept embraces a service-led competitive strategy, environmental sustainability, and the basis to differentiate from competitors who simply offer lower priced products. This paper aims to report the state-of-the-art of PSS research by presenting a clinical review of literature currently available on this topic. The literature is classified and the major outcomes of each study are addressed and analysed. On this basis, this paper defines the PSS concept, reports on its origin and features, gives examples of applications along with potential benefits and barriers to adoption, summarizes available tools and methodologies, and identifies future research challenges.
Abstract:
This dissertation presents a study of the D(e,e′p)n reaction carried out at the Thomas Jefferson National Accelerator Facility (Jefferson Lab) for a set of fixed values of four-momentum transfer Q² = 2.1 and 0.8 (GeV/c)² and for missing momenta pm ranging from pm = 0.03 to pm = 0.65 GeV/c. The analysis resulted in the determination of absolute D(e,e′p)n cross sections as a function of the recoiling neutron momentum and its scattering angle with respect to the momentum transfer vector q. The angular distribution was compared to various modern theoretical predictions that also included final state interactions (FSI). The data confirmed the theoretical prediction of a strong anisotropy of final state interaction contributions at Q² of 2.1 (GeV/c)², while at the lower Q² value the anisotropy was much less pronounced. At Q² of 0.8 (GeV/c)², theories show a large disagreement with the experimental results. The experimental momentum distribution of the bound proton inside the deuteron has been determined for the first time at a set of fixed neutron recoil angles. The momentum distribution is directly related to the ground state wave function of the deuteron in momentum space. The high momentum part of this wave function plays a crucial role in understanding the short-range part of the nucleon-nucleon force. At Q² = 2.1 (GeV/c)², the momentum distribution determined at small neutron recoil angles is much less affected by FSI compared to a recoil angle of 75°. In contrast, at Q² = 0.8 (GeV/c)² there seems to be no region with reduced FSI for larger missing momenta. Besides the statistical errors, systematic errors of about 5-6 % were included in the final results in order to account for normalization uncertainties and uncertainties in the determination of kinematic variables. The measurements were carried out using electron beam energies of 2.8 and 4.7 GeV with beam currents between 10 and 100 µA.
The scattered electrons and the ejected protons originated from a 15 cm long liquid deuterium target and were detected in coincidence with the two high resolution spectrometers of Hall A at Jefferson Lab.
Abstract:
The MAREDAT atlas covers 11 types of plankton, ranging in size from bacteria to jellyfish. Together, these plankton groups determine the health and productivity of the global ocean and play a vital role in the global carbon cycle. Working within a uniform and consistent spatial and depth grid (map) of the global ocean, the researchers compiled thousands to tens of thousands of data points to identify regions of plankton abundance and scarcity, as well as areas of data abundance and scarcity. At many of the grid points, the MAREDAT team accomplished the difficult conversion from abundance (numbers of organisms) to biomass (carbon mass of organisms). The MAREDAT atlas provides an unprecedented global data set for ecological and biochemical analysis and modeling, as well as a clear mandate for compiling additional existing data and for focusing future data-gathering efforts on key groups in key areas of the ocean. This is a gridded data product about diazotrophic organisms. There are 6 variables. Each variable is gridded on a dimension of 360 (longitude) × 180 (latitude) × 33 (depth) × 12 (month). The first group of 3 variables are: (1) number of biomass observations, (2) biomass, and (3) special nifH-gene-based biomass. The second group of 3 variables is the same as the first group except that it only grids non-zero data. We have constructed a database on diazotrophic organisms in the global pelagic upper ocean by compiling more than 11,000 direct field measurements comprising 3 sub-databases: (1) nitrogen fixation rates, (2) cyanobacterial diazotroph abundances from cell counts, and (3) cyanobacterial diazotroph abundances from qPCR assays targeting nifH genes. Biomass conversion factors are estimated based on cell sizes to convert abundance data to diazotrophic biomass. Data are assigned to 3 groups: Trichodesmium, unicellular diazotrophic cyanobacteria (groups A, B and C when applicable), and heterocystous cyanobacteria (Richelia and Calothrix).
Total nitrogen fixation rates and diazotrophic biomass are calculated by summing the values from all the groups. Some of the nitrogen fixation rates are whole-seawater measurements and are used as total nitrogen fixation rates. Both volumetric and depth-integrated values were reported. Depth-integrated values are also calculated for those vertical profiles with values at 3 or more depths.
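The abundance-to-biomass conversion and the depth-integration step described above can be sketched as follows; the conversion factor and the profile values are hypothetical placeholders for illustration, not numbers from the database:

```python
def abundance_to_biomass(cells, carbon_per_cell):
    """Scale a cell abundance by a cell-size-based carbon conversion
    factor (the factor used here is purely illustrative)."""
    return cells * carbon_per_cell

def depth_integrate(depths_m, values):
    """Trapezoidal depth integration of a volumetric profile; the
    database computes this only for profiles with 3 or more depths."""
    if len(depths_m) < 3:
        raise ValueError("need values at 3 or more depths")
    total = 0.0
    for i in range(len(depths_m) - 1):
        # area of one trapezoid between consecutive sampling depths
        total += 0.5 * (values[i] + values[i + 1]) * (depths_m[i + 1] - depths_m[i])
    return total
```

For example, a profile sampled at 0, 10, and 20 m with volumetric values 2, 4, and 6 integrates to 80 in the corresponding per-area units.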
Abstract:
Phase-type distributions represent the time to absorption for a finite state Markov chain in continuous time, generalising the exponential distribution and providing a flexible and useful modelling tool. We present a new reversible jump Markov chain Monte Carlo scheme for performing a fully Bayesian analysis of the popular Coxian subclass of phase-type models; the convenient Coxian representation involves fewer parameters than a more general phase-type model. The key novelty of our approach is that we model covariate dependence in the mean whilst using the Coxian phase-type model as a very general residual distribution. Such incorporation of covariates into the model has not previously been attempted in the Bayesian literature. A further novelty is that we also propose a reversible jump scheme for investigating structural changes to the model brought about by the introduction of Erlang phases. Our approach addresses more questions of inference than previous Bayesian treatments of this model and is automatic in nature. We analyse an example dataset comprising lengths of hospital stays of a sample of patients collected from two Australian hospitals to produce a model for a patient's expected length of stay which incorporates the effects of several covariates. This leads to interesting conclusions about what contributes to length of hospital stay with implications for hospital planning. We compare our results with an alternative classical analysis of these data.
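The Coxian structure described above, a chain of transient phases each of which either passes the process on to the next phase or absorbs it, can be illustrated with a minimal sampling sketch; the rates and continuation probabilities below are arbitrary illustrative values, not parameters estimated in the paper:

```python
import random

def sample_coxian(rates, cont_probs, rng=random):
    """Draw one sample from a Coxian phase-type distribution.

    rates[i]      : exponential exit rate of phase i
    cont_probs[i] : probability of continuing to phase i + 1 rather
                    than absorbing (the last entry must be 0.0)
    """
    total = 0.0
    for lam, p in zip(rates, cont_probs):
        total += rng.expovariate(lam)   # sojourn time in the current phase
        if rng.random() >= p:           # absorb instead of moving on
            break
    return total
```

A single phase with continuation probability 0 reduces to an exponential distribution, and equal rates with all intermediate continuation probabilities equal to 1 give an Erlang distribution, the special case whose introduction the paper's second reversible jump scheme investigates.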
Abstract:
In many product categories of durable goods such as TVs, PCs, and DVD players, the largest component of sales is generated by consumers replacing existing units. Aggregate sales models proposed by diffusion-of-innovation researchers for the replacement component of sales have incorporated several different replacement distributions, such as the Rayleigh, Weibull, Truncated Normal and Gamma. Although these alternative replacement distributions have been tested using both time-series sales data and individual-level actuarial "life-tables" of replacement ages, there is no consensus on which distributions are more appropriate to model replacement behaviour. In the current study we are motivated to develop a new "modified gamma" distribution for two reasons. First, we recognise that replacements have two fundamentally different drivers: those forced by failure, and early, discretionary replacements. The replacement distribution for each of these drivers is expected to be quite different. Second, we observed a poor fit of other distributions to our empirical data. We conducted a survey of 8,077 households to empirically examine models of replacement sales for six electronic consumer durables: TVs, VCRs, DVD players, digital cameras, and personal and notebook computers. These data allow us to construct individual-level "life-tables" for replacement ages. We demonstrate that the new modified gamma model fits the empirical data better than existing models for all six products, using both a primary and a hold-out sample.
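The two-driver reasoning above can be made concrete with a generic mixture density: one component for failure-forced replacements and one for early discretionary ones. The choice of gamma components, the weight, and the parameter values below are hypothetical illustrations, not the paper's "modified gamma" specification:

```python
import math

def gamma_pdf(x, shape, scale):
    """Density of a Gamma(shape, scale) distribution at x >= 0."""
    return x ** (shape - 1) * math.exp(-x / scale) / (math.gamma(shape) * scale ** shape)

def replacement_pdf(x, w_fail, fail=(5.0, 1.5), disc=(2.0, 1.0)):
    """Two-driver mixture of replacement ages (in years, say):
    a failure-forced component with weight w_fail, centred later,
    plus an early discretionary component with weight 1 - w_fail.
    All shape/scale parameters here are made-up illustrations."""
    return w_fail * gamma_pdf(x, *fail) + (1.0 - w_fail) * gamma_pdf(x, *disc)
```

Because each component is a proper density, the mixture integrates to one for any weight in [0, 1]; fitting would then estimate the weight and component parameters from life-table data.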
Abstract:
The uniformization method (also known as randomization) is a numerically stable algorithm for computing transient distributions of a continuous time Markov chain. When the solution is needed after a long run or when the convergence is slow, the uniformization method involves a large number of matrix-vector products. Despite this, the method remains very popular due to its ease of implementation and its reliability in many practical circumstances. Because calculating the matrix-vector product is the most time-consuming part of the method, overall efficiency in solving large-scale problems can be significantly enhanced if the matrix-vector product is made more economical. In this paper, we incorporate a new relaxation strategy into the uniformization method to compute the matrix-vector products only approximately. We analyze the error introduced by these inexact matrix-vector products and discuss strategies for refining the accuracy of the relaxation while reducing the execution cost. Numerical experiments drawn from computer systems and biological systems are given to show that significant computational savings are achieved in practical applications.
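A minimal sketch of the plain (non-relaxed) uniformization iteration described above, assuming a small dense generator matrix Q; the paper's relaxation strategy, which makes the matrix-vector products inexact, is not reproduced here:

```python
import numpy as np

def uniformization(Q, p0, t, tol=1e-12):
    """Transient distribution p(t) = p0 expm(Q t) of a CTMC with
    generator Q (rows summing to zero), via uniformization."""
    Q = np.asarray(Q, dtype=float)
    v = np.asarray(p0, dtype=float).copy()
    lam = -Q.diagonal().min()           # uniformization rate >= max_i |Q_ii|
    if lam == 0.0:                      # Q == 0: the chain never moves
        return v
    P = np.eye(Q.shape[0]) + Q / lam    # stochastic matrix of uniformized DTMC
    weight = np.exp(-lam * t)           # Poisson(lam * t) pmf at k = 0
    acc = weight                        # Poisson mass accumulated so far
    result = weight * v
    k = 0
    while acc < 1.0 - tol and k < 100_000:   # cap is a safety guard
        k += 1
        v = v @ P                       # the dominating matrix-vector product
        weight *= lam * t / k           # Poisson pmf recursion
        result += weight * v
        acc += weight
    return result
```

Note that for large lam * t the leading weight exp(-lam * t) underflows, which is why production implementations truncate the Poisson weights on both the left and the right rather than starting at k = 0.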
Abstract:
Biochemical reactions underlying genetic regulation are often modelled as a continuous-time, discrete-state Markov process, and the evolution of the associated probability density is described by the so-called chemical master equation (CME). However, the CME is typically difficult to solve, since the state space involved can be very large or even countably infinite. Recently, a finite state projection (FSP) method that truncates the state space was suggested and shown to be effective in an example of a model of the Pap-pili epigenetic switch. However, in this example both the model and the final time at which the solution was computed were relatively small. Presented here is a Krylov FSP algorithm based on a combination of state-space truncation and inexact matrix-vector product routines. This allows larger-scale models to be studied and solutions for larger final times to be computed in a realistic execution time. Additionally, the new method computes the solution at intermediate times at virtually no extra cost, since it is derived from Krylov-type methods for computing matrix exponentials. For the purpose of comparison, the new algorithm is applied to the model of the Pap-pili epigenetic switch, where the original FSP was first demonstrated. The method is also applied to a more sophisticated model of regulated transcription. Numerical results indicate that the new approach is significantly faster and extendable to larger biological models.
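A toy illustration of the state-space truncation at the heart of FSP, using a pure-birth (Poisson) process instead of the Pap-pili model and a plain Taylor series in place of the Krylov machinery; everything below is a simplified sketch under those assumptions, not the paper's algorithm:

```python
import math
import numpy as np

def fsp_solve(A, p0, t, terms=60):
    """Approximate p(t) = p0 expm(A t) on a truncated state space with a
    plain Taylor series (adequate only for small ||A|| * t).  The value
    1 - p(t).sum() is the FSP certificate: probability lost to truncation."""
    v = np.asarray(p0, dtype=float).copy()
    result = v.copy()
    for k in range(1, terms):
        v = (v @ A) * (t / k)      # next Taylor term, kept as a vector
        result += v
    return result, 1.0 - result.sum()

# Truncated CME of a pure-birth (Poisson) process with rate lam:
lam, N = 1.0, 20
A = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    A[i, i] = -lam                 # rate of leaving state i
    if i < N:
        A[i, i + 1] = lam          # birth i -> i + 1 (dropped at the boundary)
p0 = np.zeros(N + 1)
p0[0] = 1.0
p, eps = fsp_solve(A, p0, 1.0)     # exact answer: p[k] = Poisson(1) pmf
```

Here the truncated matrix A is a proper sub-generator (the outflow from state N has nowhere to go), so probability leaks out of the projection and eps bounds the resulting error, which is the mechanism the FSP exploits.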
Abstract:
The electron Volt Spectrometer (eVS) is an inverse geometry filter difference spectrometer that has been optimised to measure the single-atom properties of condensed matter systems using a technique known as Neutron Compton Scattering (NCS) or Deep Inelastic Neutron Scattering (DINS). The spectrometer utilises the high flux of epithermal neutrons produced by the ISIS neutron spallation source, enabling the direct measurement of atomic momentum distributions and ground state kinetic energies. In this paper the procedure used to calibrate the spectrometer is described, including details of the method used to determine detector positions and neutron flight path lengths, as well as the determination of the instrument resolution. Example measurements on three different samples (ZrH2, 4He and Sn) are shown, demonstrating the self-consistency of the calibration procedure.
Abstract:
Learning is most effective when intrinsically motivated through personal interest and situated in a supportive socio-cultural context. This paper reports on findings from a study that explored implications for the design of interactive learning environments through 18 months of ethnographic observations of people's interactions at "Hack The Evening" (HTE). HTE is a meetup group initiated at the State Library of Queensland in Brisbane, Australia, dedicated to providing visitors with opportunities for connected learning in relation to hacking, making and do-it-yourself technology. The results provide insights into factors that contributed to HTE as a social, interactive and participatory environment for learning: knowledge is created and co-created through uncoordinated interactions among participants who come from a diversity of backgrounds, skills and areas of expertise. The insights also reveal challenges and barriers that the HTE group faced in regard to connected learning. Four dimensions of design opportunities are presented to overcome those challenges and barriers and to improve connected learning in library buildings and other free-choice learning environments that seek to embody a more interactive and participatory culture among their users. The insights are relevant for librarians as well as designers, managers and decision makers of other interactive and free-choice learning environments.