925 results for exponential sums


Relevance:

10.00%

Abstract:

The combined action of the plant-derived volatile, S-carvone, and mild heat treatment on the food-borne pathogen, Listeria monocytogenes, was evaluated. The viability of exponential phase cultures grown at 8 °C could be reduced by 1·3 log units after exposure to S-carvone (5 mmol l−1) for 30 min at 45 °C, while individual treatment with S-carvone or exposure to 45 °C for 30 min did not result in a loss in viability. Other plant-derived volatiles, namely carvacrol, cinnamaldehyde, thymol and decanal, were also found to reduce the viability of L. monocytogenes in combination with the same mild heat treatment at concentrations of 1·75 mmol l−1, 2·5 mmol l−1, 1·5 mmol l−1 and 2 mmol l−1, respectively. These findings show that essential oil compounds can play an important role in minimally processed foods, and can be used in the concept of Hurdle Technology to reduce the intensity of heat treatment or other individual hurdles.
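A side note on the arithmetic: a reduction of 1·3 log units corresponds to a surviving fraction of 10^(−1.3), i.e. roughly 5% of the initial viable count. A minimal sketch of the conversion (the helper name is ours; only the 1.3 figure comes from the abstract):

```python
def survival_fraction(log_reduction):
    """Convert a log10 reduction in viable count to the surviving fraction."""
    return 10 ** (-log_reduction)

# 1.3 log units, as reported for S-carvone (5 mmol/l) combined with 30 min at 45 C
frac = survival_fraction(1.3)
print(f"surviving fraction: {frac:.3f}")  # ~0.050, i.e. about 95% of cells killed
```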

Relevance:

10.00%

Abstract:

Gamow's explanation of the exponential decay law uses complex 'eigenvalues' and exponentially growing 'eigenfunctions'. This raises the question of how Gamow's description fits into the quantum mechanical description of nature, which is based on real eigenvalues and square-integrable wavefunctions. Observing that the time evolution of any wavefunction is given by its expansion in generalized eigenfunctions, we answer this question in a straightforward manner that is accessible to graduate students and specialists alike. Moreover, the presentation is well suited for use in physics lectures.
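The relation at issue can be sketched schematically (our notation, not the paper's derivation): a Gamow state with complex energy $E = E_0 - i\Gamma/2$ evolves as

```latex
\psi(t) = e^{-iEt/\hbar}\,\psi(0)
        = e^{-iE_0 t/\hbar}\, e^{-\Gamma t/2\hbar}\,\psi(0),
\qquad
|\psi(t)|^2 = e^{-\Gamma t/\hbar}\,|\psi(0)|^2,
```

so the survival probability decays exponentially with lifetime $\tau = \hbar/\Gamma$, even though such a state is not square integrable.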

Relevance:

10.00%

Abstract:

Type III secretion systems of enteric bacteria enable translocation of effector proteins into host cells. Secreted proteins of verotoxigenic Escherichia coli O157 strains include components of a translocation apparatus, EspA, -B, and -D, as well as "effectors" such as the translocated intimin receptor (Tir) and the mitochondrion-associated protein (Map). This research has investigated the regulation of LEE4 translocon proteins, in particular EspA. EspA filaments could not be detected on the bacterial cell surface when E. coli O157:H7 was cultured in M9 minimal medium but were expressed from only a proportion of the bacterial population when cultured in minimal essential medium modified with 25 mM HEPES. The highest proportions of EspA-filamented bacteria were detected in late exponential phase, after which filaments were lost rapidly from the bacterial cell surface. Our previous research had shown that human and bovine E. coli O157:H7 strains exhibit marked differences in EspD secretion levels. Here it is demonstrated that the proportion of the bacterial population expressing EspA filaments was associated with the level of EspD secretion. The ability of individual bacteria to express EspA filaments was not controlled at the level of LEE1-4 operon transcription, as demonstrated by using both beta-galactosidase and green fluorescent protein (GFP) promoter fusions. All bacteria, whether expressing EspA filaments or not, showed equivalent levels of GFP expression when LEE1-4 translational fusions were used. Despite this, the LEE4-espADB mRNA was more abundant in populations with a high proportion of nonsecreting bacteria (low secretors) than in populations with a high proportion of secreting and therefore filamented bacteria (high secretors).
This research demonstrates that while specific environmental conditions are required to induce LEE1-4 expression, a further checkpoint exists before EspA filaments are produced on the bacterial surface and secretion of effector proteins occurs. This checkpoint in E. coli O157:H7 translocon expression is controlled by a posttranscriptional mechanism acting on LEE4-espADB mRNA. The heterogeneity in EspA filamentation could arise from phase-variable expression of regulators that control this posttranscriptional mechanism.

Relevance:

10.00%

Abstract:

The ability of Escherichia coli O157:H7 to colonize the intestinal epithelia is dependent on the expression of intimin and other adhesins. The chromosome of E. coli O157:H7 carries two loci encoding long polar fimbriae (LPF). These fimbriae mediate adherence to epithelial cells and are associated with colonization of the intestine. To better define the conditions controlling their expression and their role in colonization of an animal model, we investigated the environmental cues that promote expression of the lpf genes and the contribution of E. coli O157:H7 LPF to intestinal colonization of lambs. We found that expression of lpf1 was regulated in response to growth phase, osmolarity, and pH; that lpf2 transcription was stimulated during late exponential growth and iron depletion; and that LPF impacts the ability of E. coli O157:H7 to persist in the intestine of infected 6-week-old lambs.

Relevance:

10.00%

Abstract:

This paper evaluates the relationship between the cloud modification factor (CMF) in the ultraviolet erythemal range and the cloud optical depth (COD) retrieved from the Aerosol Robotic Network (AERONET) "cloud mode" algorithm under overcast cloudy conditions (confirmed with sky images) at Granada, Spain, mainly for non-precipitating, overcast and relatively homogeneous water clouds. The empirical CMF showed a clear exponential dependence on experimental COD values, decreasing approximately from 0.7 for COD = 10 to 0.25 for COD = 50. In addition, these COD measurements were used as input to the libRadtran radiative transfer code, allowing the simulation of CMF values for the selected overcast cases. The modeled CMF exhibited a dependence on COD similar to the empirical CMF, but the modeled values presented a strong underestimation with respect to the empirical factors (mean bias of 22 %). To explain this high bias, an exhaustive comparison between modeled and experimental UV erythemal irradiance (UVER) data was performed. The comparison revealed that the radiative transfer simulations were 8 % higher than the observations for clear-sky conditions. The rest of the bias (~14 %) may be attributed to the substantial underestimation of modeled UVER with respect to experimental UVER under overcast conditions, although the correlation between the two datasets was high (R2 ~ 0.93). A sensitivity test showed that the main factor responsible for this underestimation is the experimental AERONET COD used as input in the simulations, which is retrieved from zenith radiances in the visible range. Accordingly, effective COD values in the erythemal interval were derived from an iteration procedure based on searching for the best match between modeled and experimental UVER values for each selected overcast case. These effective COD values were smaller than the AERONET COD data in about 80 % of the overcast cases, with a mean relative difference of 22 %.
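The quoted endpoints pin down a simple exponential model. As an illustration only (the two-parameter form CMF = a·exp(−b·COD) and all names below are our assumption, not the paper's actual fit), the values 0.7 at COD = 10 and 0.25 at COD = 50 from the abstract give:

```python
import math

def fit_exponential(cod1, cmf1, cod2, cmf2):
    """Solve CMF(COD) = a * exp(-b * COD) exactly through two (COD, CMF) points."""
    b = math.log(cmf1 / cmf2) / (cod2 - cod1)
    a = cmf1 * math.exp(b * cod1)
    return a, b

# Endpoint values quoted in the abstract; the model itself is illustrative.
a, b = fit_exponential(10, 0.70, 50, 0.25)

def cmf(cod):
    return a * math.exp(-b * cod)

print(f"a = {a:.3f}, b = {b:.4f}, CMF(30) = {cmf(30):.2f}")
```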

Relevance:

10.00%

Abstract:

The detection of long-range dependence in time series analysis is an important task to which this paper contributes by showing that, whilst the theoretical definition of a long-memory (or long-range dependent) process is based on the autocorrelation function, it is not possible for long memory to be identified using the sum of the sample autocorrelations, as usually defined. The reason for this is that the sample sum is a predetermined constant for any stationary time series, a result that is independent of the sample size. Diagnostic or estimation procedures, such as those in the frequency domain, that embed this sum are equally open to this criticism. We develop this result in the context of long memory, extending it to the implications for the spectral density function and the variance of partial sums of a stationary stochastic process. The results are further extended to higher-order sample autocorrelations and the bispectral density. The corresponding result is that the sum of the third-order sample (auto)bicorrelations at lags h, k ≥ 1 is also a predetermined constant, different from that in the second-order case, for any stationary time series of arbitrary length.
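The predetermined-constant result is easy to check numerically: with the usual estimator, the sample autocorrelations r_1, ..., r_{n−1} of any series of length n sum to exactly −1/2, because their numerators sum to minus half the centred sum of squares. A minimal sketch (variable names ours):

```python
import random

def sample_autocorrelations(x):
    """Sample autocorrelations r_1, ..., r_{n-1} with the usual estimator."""
    n = len(x)
    mean = sum(x) / n
    dev = [v - mean for v in x]
    denom = sum(d * d for d in dev)
    return [sum(dev[t] * dev[t + h] for t in range(n - h)) / denom
            for h in range(1, n)]

random.seed(0)
x = [random.gauss(0, 1) for _ in range(200)]
total = sum(sample_autocorrelations(x))
print(total)  # -0.5 up to floating-point error, whatever the series
```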

Relevance:

10.00%

Abstract:

How fast can a mammal evolve from the size of a mouse to the size of an elephant? Achieving such a large transformation calls for major biological reorganization. Thus, the speed at which this occurs has important implications for extensive faunal changes, including adaptive radiations and recovery from mass extinctions. To quantify the pace of large-scale evolution we developed a metric, clade maximum rate, which represents the maximum evolutionary rate of a trait within a clade. We applied this metric to body mass evolution in mammals over the last 70 million years, during which multiple large evolutionary transitions occurred in oceans and on continents and islands. Our computations suggest that it took a minimum of 1.6, 5.1, and 10 million generations for terrestrial mammal mass to increase 100-, 1,000-, and 5,000-fold, respectively. Values for whales were about half as long (1.1, 3, and 5 million generations), perhaps due to the reduced mechanical constraints of living in an aquatic environment. When differences in generation time are considered, we find an exponential increase in maximum mammal body mass during the 35 million years following the Cretaceous–Paleogene (K–Pg) extinction event. Our results also indicate a basic asymmetry in macroevolution: very large decreases (such as extreme insular dwarfism) can happen at more than 10 times the rate of increases. Our findings allow more rigorous comparisons of microevolutionary and macroevolutionary patterns and processes. Keywords: haldanes, biological time, scaling, pedomorphosis

Relevance:

10.00%

Abstract:

In this paper, various types of fault detection methods for fuel cells are compared, for example, those that use a model-based approach, a data-driven approach, or a combination of the two. The potential advantages and drawbacks of each method are discussed and comparisons between methods are made. In particular, classification algorithms are investigated, which separate a data set into classes or clusters based on some prior knowledge or measure of similarity. Specifically, the application of classification methods to vectors of currents reconstructed by magnetic tomography, or to vectors of magnetic field measurements directly, is explored. Bases are simulated using the finite integration technique (FIT) and regularization techniques are employed to overcome ill-posedness. Fisher's linear discriminant is used to illustrate these concepts. Numerical experiments show that the ill-posedness of the magnetic tomography problem is a part of the classification problem on magnetic field measurements as well. This is independent of the particular working mode of the cell but is influenced by the type of faulty behavior that is studied. The numerical results demonstrate the ill-posedness through the exponential decay behavior of the singular values for three examples of fault classes.
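As a toy illustration of the discriminant mentioned above (the synthetic two-dimensional data and all names are ours; the paper applies the method to high-dimensional magnetic-field measurement vectors), Fisher's linear discriminant projects the data onto the direction maximising between-class scatter relative to within-class scatter:

```python
import numpy as np

def fisher_discriminant(X1, X2):
    """Fisher's linear discriminant: direction w = Sw^{-1} (m1 - m2)
    and the midpoint decision threshold along w."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Pooled within-class scatter matrix.
    Sw = (np.cov(X1, rowvar=False) * (len(X1) - 1)
          + np.cov(X2, rowvar=False) * (len(X2) - 1))
    w = np.linalg.solve(Sw, m1 - m2)
    threshold = w @ (m1 + m2) / 2
    return w, threshold

rng = np.random.default_rng(0)
X1 = rng.normal([0.0, 0.0], 0.5, size=(50, 2))  # e.g. "normal operation" class
X2 = rng.normal([3.0, 3.0], 0.5, size=(50, 2))  # e.g. one fault class
w, c = fisher_discriminant(X1, X2)
accuracy = (np.sum(X1 @ w > c) + np.sum(X2 @ w < c)) / 100
print(f"training accuracy: {accuracy:.2f}")
```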

Relevance:

10.00%

Abstract:

The bifidobacterial β-galactosidase (BbgIV) was produced in E. coli DH5α at 37 and 30 °C in a 5 L bioreactor under varied conditions of dissolved oxygen (dO2) and pH. The yield of soluble BbgIV was significantly (P < 0.05) increased once the dO2 dropped to 0–2% and remained at such low values during the exponential phase. Limited dO2 significantly (P < 0.05) increased the plasmid copy number and decreased the cells' growth rate. Consequently, the BbgIV yield increased to its maximum (71–75 mg per g dry cell weight), which represented 20–25% of the total soluble proteins in the cells. In addition, the specific activity and catalytic efficiency of BbgIV were significantly (P < 0.05) enhanced under limited dO2 conditions. This was concomitant with a change in the enzyme secondary structure, suggesting a link between the enzyme's structure and function. The knowledge generated from this work is important for producing BbgIV as a biocatalyst for the development of a cost-effective process for the synthesis of prebiotic galactooligosaccharides from lactose.

Relevance:

10.00%

Abstract:

In the first part of this paper (Ulbrich et al. 2003), we gave a description of the August 2002 rainfall events and the resultant floods, in particular of the flood wave of the River Elbe. The extreme precipitation sums observed in the first half of the month were primarily associated with two rainfall episodes. The first episode occurred on 6/7 August 2002. The main rainfall area was situated over Lower Austria, the south-western part of the Czech Republic and south-eastern Germany. A severe flash flood was produced in the Lower Austrian Waldviertel ('forest quarter'). The second episode on 11–13 August 2002 most severely affected the Erz Mountains and western parts of the Czech Republic. During this second episode 312 mm of rain was recorded between 0600 GMT on 12 August and 0600 GMT on 13 August at the Zinnwald weather station in the Erz Mountains, which is a new 24-hour record for Germany. The flash floods resulting from this rainfall episode and the subsequent Elbe flood produced the most expensive weather-related catastrophe in Europe in recent decades. In this part of the paper we discuss the meteorological conditions and physical mechanisms leading to the two main events. Similarities to the conditions that led to the recent summer floods of the River Oder in 1997 and the River Vistula in 2001 will be shown. This will lead us to a consideration of trends in extreme rainfall over Europe which are found in numerical simulations of anthropogenic climate change.

Relevance:

10.00%

Abstract:

Background: Psychotic phenomena appear to form a continuum with normal experience and beliefs, and may build on common emotional interpersonal concerns. Aims: We tested predictions that paranoid ideation is exponentially distributed and hierarchically arranged in the general population, and that persecutory ideas build on more common cognitions of mistrust, interpersonal sensitivity and ideas of reference. Method: Items were chosen from the Structured Clinical Interview for DSM-IV Axis II Disorders (SCID-II) questionnaire and the Psychosis Screening Questionnaire in the second British National Survey of Psychiatric Morbidity (n = 8580), to test a putative hierarchy of paranoid development using confirmatory factor analysis, latent class analysis and factor mixture modelling analysis. Results: Different types of paranoid ideation ranged in frequency from less than 2% to nearly 30%. Total scores on these items followed an almost perfect exponential distribution (r = 0.99). Our four a priori first-order factors were corroborated (interpersonal sensitivity; mistrust; ideas of reference; ideas of persecution). These mapped onto four classes of individual respondents: a rare, severe, persecutory class with high endorsement of all item factors, including persecutory ideation; a quasi-normal class with infrequent endorsement of interpersonal sensitivity, mistrust and ideas of reference, and no ideas of persecution; and two intermediate classes, characterised respectively by relatively high endorsement of items relating to mistrust and to ideas of reference. Conclusions: The paranoia continuum has implications for the aetiology, mechanisms and treatment of psychotic disorders, while confirming the lack of a clear distinction from normal experiences and processes.

Relevance:

10.00%

Abstract:

In this paper we propose and analyze a hybrid $hp$ boundary element method for the solution of problems of high frequency acoustic scattering by sound-soft convex polygons, in which the approximation space is enriched with oscillatory basis functions which efficiently capture the high frequency asymptotics of the solution. We demonstrate, both theoretically and via numerical examples, exponential convergence with respect to the polynomial order, and we provide rigorous error estimates for our approximations to the solution and to the far field pattern, in which the dependence on the frequency of all constants is explicit. Importantly, these estimates prove that, to achieve any desired accuracy in the computation of these quantities, it is sufficient to increase the number of degrees of freedom in proportion to the logarithm of the frequency as the frequency increases, in contrast to the at least linear growth required by conventional methods.

Relevance:

10.00%

Abstract:

We study the influence of the intrinsic curvature on the large-time behaviour of the heat equation in a tubular neighbourhood of an unbounded geodesic in a two-dimensional Riemannian manifold. Since we consider killing boundary conditions, there is always an exponential-type decay for the heat semigroup. We show that this exponential-type decay is slower for positively curved manifolds compared to the flat case. As the main result, we establish a sharp extra polynomial-type decay for the heat semigroup on negatively curved manifolds compared to the flat case. The proof employs the existence of Hardy-type inequalities for the Dirichlet Laplacian in the tubular neighbourhoods on negatively curved manifolds and the method of self-similar variables and weighted Sobolev spaces for the heat equation.
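Schematically (our notation), with $E_1$ denoting the lowest Dirichlet eigenvalue of the cross-section, the contrast between the flat and negatively curved cases can be written as

```latex
\|e^{t\Delta_D}\|_{L^2 \to L^2} \;\sim\; e^{-E_1 t}
\quad \text{(flat case)},
\qquad
\|e^{t\Delta_D}\|_{L^2 \to L^2} \;\sim\; t^{-\gamma}\, e^{-E_1 t}
\quad \text{(negative curvature, some } \gamma > 0\text{)},
```

where the extra polynomial factor $t^{-\gamma}$ is the sharp additional decay established in the paper; the exact value of $\gamma$ depends on the setting and is not reproduced here.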

Relevance:

10.00%

Abstract:

In this article, we investigate how the choice of the attenuation factor in an extended version of Katz centrality influences the centrality of the nodes in evolving communication networks. For given snapshots of a network, observed over a period of time, recently developed communicability indices aim to identify the best broadcasters and listeners (receivers) in the network. Here we explore the attenuation factor constraint, in relation to the spectral radius (the largest eigenvalue) of the network at any point in time, and its computation in the case of large networks. We compare three different communicability measures: standard, exponential, and relaxed (where the spectral radius bound on the attenuation factor is relaxed and the adjacency matrix is normalised, in order to maintain the convergence of the measure). Furthermore, using a vitality-based measure of both standard and relaxed communicability indices, we look at ways of establishing the most important individuals for broadcasting and receiving of messages related to community bridging roles. We compare these measures with the scores produced by an iterative version of the PageRank algorithm and illustrate our findings with three examples of real-life evolving networks: the MIT reality mining data set, consisting of daily communications between 106 individuals over the period of one year; a UK Twitter mentions network, constructed from the direct tweets between 12.4k individuals during one week; and a subset of the Enron email data set.
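The attenuation-factor constraint can be made concrete: Katz-type centrality x = (I − αAᵀ)⁻¹·1 converges only when α is below the reciprocal of the spectral radius ρ(A). A minimal sketch on a toy graph (the graph and the choice α = 0.85/ρ are illustrative, not from the article):

```python
import numpy as np

def katz_centrality(A, alpha):
    """Katz centrality x = (I - alpha * A^T)^{-1} 1; needs alpha < 1/rho(A)."""
    rho = max(abs(np.linalg.eigvals(A)))
    if alpha >= 1 / rho:
        raise ValueError("attenuation factor must be below 1/spectral radius")
    n = A.shape[0]
    return np.linalg.solve(np.eye(n) - alpha * A.T, np.ones(n))

# Undirected path graph 0-1-2-3: the two middle nodes should score highest.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
rho = max(abs(np.linalg.eigvals(A)))
x = katz_centrality(A, 0.85 / rho)
print(np.round(x, 3))
```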

Relevance:

10.00%

Abstract:

We study the solutions of the Smoluchowski coagulation equation with a regularization term which removes clusters from the system when their mass exceeds a specified cutoff size, M. We focus primarily on collision kernels which would exhibit an instantaneous gelation transition in the absence of any regularization. Numerical simulations demonstrate that for such kernels with monodisperse initial data, the regularized gelation time decreases as M increases, consistent with the expectation that the gelation time is zero in the unregularized system. This decrease appears to be a logarithmically slow function of M, indicating that instantaneously gelling kernels may still be justifiable as physical models despite the fact that they are highly singular in the absence of a cutoff. We also study the case when a source of monomers is introduced in the regularized system. In this case a stationary state is reached. We present a complete analytic description of this regularized stationary state for the model kernel, K(m1, m2) = max{m1, m2}^ν, which gels instantaneously when M→∞ if ν > 1. The stationary cluster size distribution decays as a stretched exponential for small cluster sizes and crosses over to a power law decay with exponent ν for large cluster sizes. The total particle density in the stationary state slowly vanishes as [(ν−1) log M]^(−1/2) when M→∞. The approach to the stationary state is nontrivial: oscillations about the stationary state emerge from the interplay between the monomer injection and the cutoff, M, and these oscillations decay very slowly when M is large. A quantitative analysis of these oscillations is provided for the addition model, which describes the situation in which clusters can only grow by absorbing monomers.