943 results for Time-Fractional Diffusion-Wave Problem


Relevance:

30.00%

Publisher:

Abstract:

(1) In the period 1965/77, fertilizer consumption in Brazil increased nearly fifteenfold, from circa 200,000 tons of N + P2O5 + K2O to 3 million tons; during the fifteen years from 1950 to 1964, usage of the primary macronutrients had only doubled. (2) Several explanations are given for this remarkable increase, namely: an experimental background that supplied data for recommendations on rates, timing and type of application; a favourable government policy on minimum prices and rural credit; the capacity of the industry to meet the demand of the fertilizer market; and an adequate mechanism for the diffusion of the practice of fertilizer use to the farmer. (3) The extension work, which has caused a permanent change in attitudes towards fertilization, was carried out in the traditional way by salesmen supported by a technical staff, as well as by agronomists of the official services. (4) Two new programs were started and conducted within a rather short time, both emphasizing the relatively new technology of fertilizer use. (5) The first program, conducted in the southern part of the country, extended laboratory and greenhouse work, supplemented by a few field trials, to small landowners - the so-called "operação tatú" (operation armadillo). (6) The second program, covering a larger problem area in the Northeast and in Central Brazil, began directly in the field with thousands of demonstrations and simple experiments carried out with the participation of local people, whose involvement was essential for the success of the initiative; in this case the official extension services, both foreign and national sources of funds, and universities participated under the leadership of the Brazilian Association for the Diffusion of Fertilizers (ANDA). (7) It is felt that the experience thus gained in Brazil could be useful to other countries under similar conditions.

Relevance:

30.00%

Publisher:

Abstract:

R. P. Boas found necessary and sufficient conditions for a function to belong to a Lipschitz class. His results show that the conditions on the sine and cosine coefficients for membership in Lip α (0 < α < 1) are the same, but for Lip 1 they differ. His results were later generalized by many authors by generalizing the condition on the majorant of the modulus of continuity. The aim of this paper is to obtain Boas-type theorems for generalized Lipschitz classes. To define the generalized Lipschitz classes we use the concept of the modulus of smoothness of fractional order.
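
For readers unfamiliar with the terminology, the sketch below recalls the classical Lipschitz class and the general shape of a "generalized" Lipschitz class defined through a majorant; the precise fractional-order modulus of smoothness used in the paper may differ in detail.

```latex
% Classical Lipschitz class, 0 < \alpha \le 1:
\[
  f \in \operatorname{Lip}\alpha
  \iff
  \sup_{x}\,|f(x+h)-f(x)| = O\!\left(|h|^{\alpha}\right), \quad h \to 0 .
\]
% Generalized Lipschitz class defined through a majorant \omega
% (here written with a modulus of smoothness \omega_{\beta} of
% fractional order \beta, as in the paper):
\[
  f \in \operatorname{Lip}(\omega)
  \iff
  \omega_{\beta}(f,\delta) = O\!\left(\omega(\delta)\right), \quad \delta \to 0^{+} .
\]
```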

Relevance:

30.00%

Publisher:

Abstract:

The Whitehead minimization problem consists in finding a minimum-size element in the automorphic orbit of a word, a cyclic word or a finitely generated subgroup in a free group of finite rank. We give the first fully polynomial algorithm to solve this problem, that is, an algorithm that is polynomial both in the length of the input word and in the rank of the free group. Earlier algorithms had an exponential dependence on the rank of the free group. It follows that the primitivity problem – deciding whether a word is an element of some basis of the free group – and the free factor problem can also be solved in polynomial time.

Relevance:

30.00%

Publisher:

Abstract:

We present a new method for the lysis of single cells in continuous flow, in which cells are sequentially trapped, lysed and released in an automatic process. Using optimized frequencies, dielectrophoretic trapping allows cells to be exposed in a reproducible way to high electric fields for long durations, thereby giving good control over the lysis parameters. In situ evaluation of cytosol extraction from single Chinese hamster ovary (CHO) cells was carried out by monitoring the out-diffusion of fluorescent molecules at different voltage amplitudes. A diffusion model is proposed that correlates this out-diffusion with the total area of the created pores, which depends on the potential drop across the cell membrane; the model thus enables evaluation of the total pore area in the membrane. After lysis, dielectrophoretic trapping is no longer effective because of the reduced conductivity inside the cells, leading to cell release. The trapping time is linked to the time required for cytosol extraction and can thus provide additional validation of effective cytosol extraction for non-fluorescent cells. Furthermore, the application of a single voltage for both trapping and lysis yields a fully automatic process comprising cell trapping, lysis and release, allowing the device to be operated in continuous flow without human intervention.
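
To illustrate how an out-diffusion model of this kind can be used, the sketch below fits a simple first-order efflux model (quasi-steady-state Fick's law through the pores) to a fluorescence-decay trace and backs out a total pore area. The model form, parameter names and numerical values are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

# Assumed model (not the paper's exact one): intracellular fluorophore
# concentration decays as C(t) = C0 * exp(-t / tau) with
#   tau = V * d / (D * A_pore),
# where V is the cell volume, d the membrane thickness, D the fluorophore
# diffusivity in the pores, and A_pore the total created pore area.

def total_pore_area(t, fluorescence, V, d, D):
    """Estimate total pore area from a normalized fluorescence decay trace."""
    y = np.log(fluorescence / fluorescence[0])
    k = -np.polyfit(t, y, 1)[0]          # decay rate 1/tau [1/s], log-linear fit
    tau = 1.0 / k
    return V * d / (D * tau)

# Hypothetical numbers for a CHO-sized cell (all values are assumptions)
V = 4.0 / 3.0 * np.pi * (7.5e-6) ** 3    # volume of a ~7.5 um radius cell [m^3]
d = 5e-9                                 # membrane thickness [m]
D = 4e-10                                # small-molecule diffusivity [m^2/s]
t = np.linspace(0.0, 60.0, 61)           # seconds
trace = np.exp(-t / 20.0)                # synthetic decay with tau = 20 s
print(f"estimated total pore area: {total_pore_area(t, trace, V, d, D):.3e} m^2")
```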

Relevance:

30.00%

Publisher:

Abstract:

We propose a non-equidistant Q (rate) matrix formula and an adaptive numerical algorithm for a continuous-time Markov chain that approximates jump-diffusions with affine or non-affine functional specifications. Our approach also accommodates state-dependent jump intensity and jump distribution, a flexibility that is very hard to achieve with other numerical methods. The Kolmogorov-Smirnov test shows that the proposed Markov chain transition density converges to the one given by the likelihood expansion formula of Ait-Sahalia (2008). We provide numerical examples for European stock option pricing under the models of Black and Scholes (1973), Merton (1976) and Kou (2002).
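
The sketch below shows one standard way to build a generator (Q) matrix on a non-equidistant grid for the pure diffusion part, with central differencing for the diffusion term and upwinding for the drift so that off-diagonal rates stay nonnegative. It is a generic construction under stated assumptions, not the authors' specific formula, and it omits the jump component and boundary treatment.

```python
import numpy as np

def diffusion_generator(x, mu, sigma):
    """Q matrix of a CTMC approximating dX = mu(X) dt + sigma(X) dW on a
    non-equidistant grid x. Generic textbook-style construction; boundary
    rows are left as absorbing states for simplicity."""
    n = len(x)
    Q = np.zeros((n, n))
    for i in range(1, n - 1):
        hm = x[i] - x[i - 1]                 # left spacing
        hp = x[i + 1] - x[i]                 # right spacing
        s2 = sigma(x[i]) ** 2
        m = mu(x[i])
        # central difference for diffusion, upwind for drift:
        # both off-diagonal rates are then nonnegative
        q_up = s2 / (hp * (hm + hp)) + max(m, 0.0) / hp
        q_dn = s2 / (hm * (hm + hp)) + max(-m, 0.0) / hm
        Q[i, i + 1] = q_up
        Q[i, i - 1] = q_dn
        Q[i, i] = -(q_up + q_dn)
    return Q

# Example: geometric Brownian motion on a grid refined near the spot price
x = np.unique(np.concatenate([np.linspace(50.0, 150.0, 81),
                              np.linspace(90.0, 110.0, 41)]))
Q = diffusion_generator(x, mu=lambda s: 0.05 * s, sigma=lambda s: 0.2 * s)
print(Q.shape, Q.sum(axis=1).max())          # every row of a generator sums to zero
```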

Relevance:

30.00%

Publisher:

Abstract:

The monetary policy reaction function of the Bank of England is estimated by the standard GMM approach and the ex-ante forecast method developed by Goodhart (2005), with particular attention to the horizons for inflation and output at which each approach gives the best fit. The horizons for the ex-ante approach are much closer to what is implied by the Bank’s view of the transmission mechanism, while the GMM approach produces an implausibly slow adjustment of the interest rate, and suffers from a weak instruments problem. These findings suggest a strong preference for the ex-ante approach.
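
For concreteness, a forward-looking reaction function of the kind typically estimated in this literature is sketched below; the exact specification, horizons and smoothing structure used in the paper may differ.

```latex
% Forward-looking Taylor-type rule with interest-rate smoothing:
% i_t is the policy rate, \pi_{t+k} inflation k periods ahead,
% y_{t+q} the output gap q periods ahead.
\[
  i_t = \rho\, i_{t-1}
        + (1-\rho)\left[\alpha + \beta\, E_t \pi_{t+k} + \gamma\, E_t y_{t+q}\right]
        + \varepsilon_t .
\]
% Roughly speaking, GMM replaces the expectations with realized values and
% lagged instruments, while the ex-ante approach uses the central bank's
% own forecasts at horizons k and q.
```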

Relevance:

30.00%

Publisher:

Abstract:

Several types of drugs currently used in clinical practice were screened in vitro for their potentiation of the antifungal effect of the fungistatic agent fluconazole (FLC) on Candida albicans. These drugs included inhibitors of multidrug efflux transporters, antimicrobial agents, antifungal agents, and membrane-active compounds with no antimicrobial activity, such as antiarrhythmic agents, proton pump inhibitors, and platelet aggregation inhibitors. Among the drugs tested in an agar disk diffusion assay, cyclosporine (Cy), which had no intrinsic antifungal activity, showed a potent antifungal effect in combination with FLC. In a checkerboard microtiter plate format, however, the MIC of FLC, as classically defined by the NCCLS recommendations, was unchanged when FLC and Cy were combined. Nevertheless, if a different reading endpoint was chosen, corresponding to the minimal fungicidal concentration needed to decrease viable counts by at least 3 logs relative to the growth control, the combination was synergistic (fractional inhibitory concentration index of <1). This endpoint fits the definition of MIC-0 (optically clear wells) and reflects the absence of the trailing effect, which results from residual growth at FLC concentrations greater than the MIC. The MIC-0 values of FLC and Cy tested alone in C. albicans were >32 and >10 µg/ml, respectively, and decreased to 0.5 and 0.625 µg/ml when the two drugs were combined. The combination of 0.625 µg of Cy per ml with supra-MICs of FLC resulted in a potent antifungal effect in time-kill curve experiments. This effect was fungicidal or fungistatic, depending on the C. albicans strain used. Since the Cy concentration effective in vitro is achievable in vivo, the combination of this agent with FLC represents an attractive prospect for the development of new management strategies for candidiasis.
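
As a quick check of the synergy claim, the snippet below computes the fractional inhibitory concentration (FIC) index from the MIC-0 endpoints quoted in the abstract, using the usual definition FIC = MIC(A in combination)/MIC(A alone) + MIC(B in combination)/MIC(B alone). The off-scale "alone" values are taken at their reported bounds, which is an assumption and makes the result an upper-bound-style illustration.

```python
# FIC index from the MIC-0 endpoints quoted in the abstract (µg/ml).
# ">32" and ">10" are taken at the bound itself; the true index would be
# even smaller, so the conclusion (index < 1) is unaffected.
flc_alone, cy_alone = 32.0, 10.0        # reported as >32 and >10 µg/ml
flc_combo, cy_combo = 0.5, 0.625        # µg/ml in combination

fic_index = flc_combo / flc_alone + cy_combo / cy_alone
print(f"FIC index = {fic_index:.3f}")   # ~0.078, well below the <1 synergy cutoff
```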

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND AND AIMS: In critically ill patients, fractional hepatic de novo lipogenesis increases in proportion to carbohydrate administration during isoenergetic nutrition. In this study, we sought to determine whether this increase may be a consequence of continuous enteral nutrition and bed rest. We therefore measured fractional hepatic de novo lipogenesis in a group of 12 healthy subjects during near-continuous oral feeding (hourly isoenergetic meals of a liquid formula containing 55% carbohydrate). In eight subjects, near-continuous enteral nutrition and bed rest were applied over a 10 h period; in the other four subjects, this was extended to 34 h. Fractional hepatic de novo lipogenesis was measured by infusing 13C-labeled acetate and monitoring VLDL-13C-palmitate enrichment with mass isotopomer distribution analysis. Fractional hepatic de novo lipogenesis was 3.2% (range 1.5-7.5%) in the eight subjects after 10 h of near-continuous nutrition and 1.6% (range 1.3-2.0%) in the four subjects after 34 h of near-continuous nutrition and bed rest. This indicates that continuous nutrition and physical inactivity do not increase hepatic de novo lipogenesis. Fractional hepatic de novo lipogenesis previously reported in critically ill patients under similar nutritional conditions (9.3%; range 5.3-15.8%) was markedly higher than in healthy subjects (P<0.001). These data from healthy subjects indicate that the increased fractional hepatic de novo lipogenesis observed in critically ill patients is attributable to critical illness itself rather than to continuous nutrition or bed rest.

Relevance:

30.00%

Publisher:

Abstract:

Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data, based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. The evaluated methods encompass classical techniques well known in the literature, such as diffusion tensor, Q-ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing, and brand new approaches proposed for the first time at this contest. To compare the methods quantitatively under controlled conditions, two datasets with known ground truth were synthetically generated, and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm under varying experimental conditions and highlights the strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in choosing the most adequate technique for their studies.
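
A minimal sketch of the kind of per-voxel angular-accuracy metric described here: estimated fiber directions are matched to the ground truth and the mean angular error over matched pairs is reported. The pairing strategy and function names are illustrative assumptions, not the challenge's official scoring code.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def angular_error_deg(u, v):
    """Angle between two fiber directions in degrees, ignoring sign
    (fiber orientations have antipodal symmetry)."""
    c = abs(np.dot(u, v)) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def voxel_score(estimated, truth):
    """Optimally pair estimated and true fibers, then return
    (correct number of populations?, mean angular error over the pairs)."""
    cost = np.array([[angular_error_deg(e, t) for t in truth] for e in estimated])
    rows, cols = linear_sum_assignment(cost)
    return len(estimated) == len(truth), cost[rows, cols].mean()

# Illustrative voxel with two crossing fiber populations
truth = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
estimated = [np.array([0.99, 0.05, 0.0]), np.array([0.05, 0.99, 0.02])]
print(voxel_score(estimated, truth))
```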

Relevance:

30.00%

Publisher:

Abstract:

We propose a mixed finite element method for a class of nonlinear diffusion equations, based on their interpretation as gradient flows in optimal transportation metrics. We introduce an appropriate linearization of the optimal transport problem, which leads to a mixed symmetric formulation. For a certain class of problems, this formulation preserves the maximum principle for both the semi-discrete and the fully discrete scheme. In addition, solutions of the mixed formulation maintain exponential convergence in the relative entropy towards the steady state in the case of a nonlinear Fokker-Planck equation with uniformly convex potential. We demonstrate the behavior of the proposed scheme with 2D simulations of the porous medium equation and of blow-up questions in the Patlak-Keller-Segel model.
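
For reference, the two model problems mentioned at the end can be written in standard form as follows (normalizations may differ from those used in the paper):

```latex
% Porous medium equation (a Wasserstein gradient flow of an internal energy):
\[
  \partial_t \rho = \Delta \rho^{m}, \qquad m > 1 .
\]
% Parabolic-elliptic Patlak-Keller-Segel model for chemotaxis, whose
% solutions may blow up in finite time for sufficiently large mass:
\[
  \partial_t \rho = \Delta \rho - \nabla \cdot (\rho\, \nabla c),
  \qquad -\Delta c = \rho .
\]
```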

Relevance:

30.00%

Publisher:

Abstract:

This paper is the first to examine the implications of switching to part-time (PT) work for women's subsequent earnings trajectories, distinguishing by type of contract: permanent or fixed-term. Using a rich longitudinal Spanish data set from the Social Security records of over 76,000 prime-aged women strongly attached to the Spanish labor market, we find that PT work aggravates the segmentation of the labor market insofar as there is a PT pay penalty, and this penalty is larger and more persistent for women with fixed-term contracts. The paper discusses problems arising in the empirical estimation (including a problem not discussed in the literature up to now: the differential measurement error of the LHS variable by PT status) and how to address them. It concludes with policy implications relevant for Continental Europe and its dual structure of employment protection.

Relevance:

30.00%

Publisher:

Abstract:

We present two new stabilized high-resolution numerical methods, one for the convection-diffusion-reaction (CDR) equation and one for the Helmholtz equation. The work embarks upon an a priori analysis of consistency recovery procedures for some stabilization methods belonging to the Petrov-Galerkin framework. It was found that the use of some standard practices (e.g. M-matrix theory) for the design of essentially non-oscillatory numerical methods is not feasible when consistency recovery methods are employed; hence, with respect to convective stabilization, such recovery methods are not preferred. Next, we present the design of a high-resolution Petrov-Galerkin (HRPG) method for the 1D CDR problem. The problem is studied from a fresh point of view, including practical implications for the formulation of the maximum principle, M-matrix theory, monotonicity and total variation diminishing (TVD) finite volume schemes. The method follows the line of earlier methods that may be viewed as an upwinding operator plus a discontinuity-capturing operator, and some remarks are made on its extension to multiple dimensions. We then present a new numerical scheme for the Helmholtz equation that yields quasi-exact solutions. The focus is on approximating the solution to the Helmholtz equation in the interior of the domain using compact stencils. Piecewise linear/bilinear polynomial interpolation is considered on a structured mesh/grid. The only a priori requirement is a mesh/grid resolution of at least eight elements per wavelength, and no stabilization parameters are involved in the definition of the scheme. The scheme consists of taking the average of the equation stencils obtained by the standard Galerkin finite element method and the classical finite difference method. Dispersion analysis in 1D and 2D illustrates the quasi-exact properties of this scheme. Finally, some remarks are made on the extension of the scheme to unstructured meshes by designing a method within the Petrov-Galerkin framework.
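
To make the "average of the two stencils" idea concrete, the sketch below solves a 1D Helmholtz problem with Dirichlet data using the average of the linear-FEM and second-order-FDM equation stencils on a uniform grid. This is a deliberately simplified 1D reading of the construction (uniform grid, homogeneous right-hand side), not the authors' code.

```python
import numpy as np

def helmholtz_avg_stencil(k, n):
    """Solve u'' + k^2 u = 0 on [0, 1], u(0) = 0, u(1) = sin(k), using the
    average of the linear-FEM and central-FDM equation stencils.
    The exact solution is u(x) = sin(k x)."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    # Laplacian part of both stencils: [1, -2, 1] / h^2.
    # Mass (k^2) part: FDM gives [0, 1, 0], linear FEM gives [1, 4, 1] / 6;
    # averaging yields [1/12, 10/12, 1/12].
    lap = 1.0 / h**2
    lo = lap + k**2 / 12.0
    di = -2.0 * lap + k**2 * 10.0 / 12.0
    hi = lap + k**2 / 12.0
    A = np.zeros((n - 1, n - 1))
    b = np.zeros(n - 1)
    for i in range(n - 1):
        A[i, i] = di
        if i > 0:
            A[i, i - 1] = lo
        if i < n - 2:
            A[i, i + 1] = hi
    # Dirichlet data u(0) = 0, u(1) = sin(k) moved to the right-hand side
    b[-1] -= hi * np.sin(k)
    u = np.linalg.solve(A, b)
    return x, np.concatenate([[0.0], u, [np.sin(k)]])

k = 20.0                         # ~8 elements per wavelength already at n ≈ 26
x, u = helmholtz_avg_stencil(k, n=64)
print("max error vs exact sin(kx):", np.max(np.abs(u - np.sin(k * x))))
```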

Relevance:

30.00%

Publisher:

Abstract:

The purpose of this study was to determine the prognostic accuracy of perfusion computed tomography (CT), performed at the time of emergency room admission, in acute stroke patients. Accuracy was determined by comparison of perfusion CT with delayed magnetic resonance (MR) and by monitoring the evolution of each patient's clinical condition. Twenty-two acute stroke patients underwent perfusion CT covering four contiguous 10 mm slices on admission, as well as delayed MR, performed after a median interval of 3 days after emergency room admission. Eight were treated with thrombolytic agents. Infarct size on the admission perfusion CT was compared with that on the delayed diffusion-weighted MR (DWI-MR), chosen as the gold standard. Delayed magnetic resonance angiography and perfusion-weighted MR were used to detect recanalization. A potential recuperation ratio, defined as PRR = penumbra size/(penumbra size + infarct size) on the admission perfusion CT, was compared with the evolution in each patient's clinical condition, defined by the National Institutes of Health Stroke Scale (NIHSS). In the 8 cases with arterial recanalization, the size of the cerebral infarct on the delayed DWI-MR was larger than or equal to that of the infarct on the admission perfusion CT, but smaller than or equal to that of the ischemic lesion on the admission perfusion CT; and the observed improvement in the NIHSS correlated with the PRR (correlation coefficient = 0.833). In the 14 cases with persistent arterial occlusion, infarct size on the delayed DWI-MR correlated with ischemic lesion size on the admission perfusion CT (r = 0.958). In all 22 patients, the admission NIHSS correlated with the size of the ischemic area on the admission perfusion CT (r = 0.627). Based on these findings, we conclude that perfusion CT allows the accurate prediction of the final infarct size and the evaluation of clinical prognosis for acute stroke patients at the time of emergency evaluation. It may also provide information about the extent of the penumbra. Perfusion CT could therefore be a valuable tool in the early management of acute stroke patients.
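
The potential recuperation ratio used here is a simple ratio of lesion volumes measured on the admission perfusion CT; a minimal computation of the definition given in the abstract (the input numbers are made up for illustration):

```python
def potential_recuperation_ratio(penumbra_ml, infarct_ml):
    """PRR = penumbra / (penumbra + infarct), both measured on the
    admission perfusion CT (definition taken from the abstract)."""
    return penumbra_ml / (penumbra_ml + infarct_ml)

# hypothetical admission measurements, in millilitres
print(potential_recuperation_ratio(penumbra_ml=45.0, infarct_ml=15.0))  # 0.75
```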

Relevance:

30.00%

Publisher:

Abstract:

In this study I try to explain the systemic problem of the low economic competitiveness of nuclear energy for the production of electricity by carrying out a biophysical analysis of its production process. Given that neither econometric approaches nor one-dimensional methods of energy analysis are effective, I introduce the concept of biophysical explanation as a quantitative analysis capable of handling the inherent ambiguity associated with the concept of energy. In particular, the quantities of energy considered relevant for the assessment can only be measured and aggregated after agreeing on a pre-analytical definition of a grammar characterizing a given set of finite transformations. Using this grammar, it becomes possible to provide a biophysical explanation for the low economic competitiveness of nuclear energy in the production of electricity. When comparing the various unit operations of the process of producing electricity with nuclear energy to the analogous unit operations of the process of producing fossil energy, we see that the phases of the two processes are the same; the only difference lies in the characteristics of the heat-generation phase, which are completely different in the two systems. Since the cost of production of fossil energy provides the baseline of economic competitiveness of electricity, the (lack of) economic competitiveness of electricity produced from nuclear energy can be studied by comparing the biophysical costs associated with the different unit operations taking place in nuclear and fossil power plants when generating process heat or net electricity. In particular, the analysis focuses on the fossil-fuel and labor requirements of the phases that nuclear plants and fossil energy plants have in common: (i) mining; (ii) refining/enriching; (iii) generating heat/electricity; (iv) handling the pollution/radioactive wastes. This approach makes it possible to explain the systemic low economic competitiveness of nuclear energy in the production of electricity in terms of: (i) its dependence on oil, which limits its possible role as a carbon-free alternative; (ii) the choices made in relation to its fuel cycle, especially whether or not it includes reprocessing operations; (iii) the unavoidable uncertainty in the definition of the characteristics of its process; (iv) its large inertia (lack of flexibility) due to issues of time scale; and (v) its low power level.