982 results for semi-recursive method


Relevance:

30.00%

Publisher:

Abstract:

The present work deals with investigations on some technologically important polymer nanocomposite films and semi-crystalline polypyrrole films. The work presented in the thesis deals with the realization of novel polymer nanocomposites with enhanced functionalities and prospects of applications in fields related to nanophotonics. The development of inorganic/polymer nanocomposites is a rapidly expanding multidisciplinary research area with profound industrial applications. The incorporation of suitable inorganic nanoparticles can endow the resulting nanocomposites with excellent electrical, optical and mechanical properties. The first chapter gives a general introduction to nanotechnology, nanocomposites and conducting polymers. It also emphasizes the significance of ZnO among other semiconductor materials, which forms the inorganic filler in the polymer nanocomposites of the present study. This chapter also gives general ideas on the properties and applications of conducting polymers, with special reference to polypyrrole. The objectives of the present investigations are also clearly addressed in this chapter. The second chapter deals with the theoretical aspects and details of all the experimental techniques used in the present work for the synthesis of polymer nanocomposites and polypyrrole samples and their various characterizations. Chapter 3 is based on the preparation and properties of ZnO/polystyrene nanocomposite film samples. The optical properties of these nanocomposite films are discussed in detail. Chapter 4 deals with the detailed investigations on the dependence of the optical properties of ZnO/PS nanocomposite films on the size of the nanostructured ZnO filler material. The excellent UV shielding properties of these nanocomposite films form the highlight of this chapter. Chapter 5 gives a detailed analysis of the nonlinear optical properties of ZnO/PS nanocomposite films using the Z-scan technique. The effect of the ZnO particle size in the composite films on the nonlinear properties is discussed. The present study involves two phases of research activities. In the first phase, the linear and nonlinear optical properties of ZnO/polymer nanocomposites are investigated in detail. The second phase of work is centered on the synthesis and related studies of highly crystalline polypyrrole films. In the present study, nanosized ZnO is synthesized using a wet chemical method at two different temperatures.

Relevance:

30.00%

Publisher:

Abstract:

The identification of chemical mechanisms that can exhibit oscillatory phenomena in reaction networks is currently of intense interest. In particular, the parametric question of the existence of Hopf bifurcations has gained increasing popularity due to its relation to the oscillatory behavior around fixed points. However, the detection of oscillations in high-dimensional systems and systems with constraints by the available symbolic methods has proven to be difficult. New, efficient methods are therefore required to tackle the complexity caused by the high dimensionality and non-linearity of these systems. In this thesis, we mainly present efficient algorithmic methods to detect Hopf bifurcation fixed points in (bio)chemical reaction networks with symbolic rate constants, thereby yielding information about the oscillatory behavior of the networks. The methods use representations of the systems in convex coordinates that arise from stoichiometric network analysis. One of the methods, called HoCoQ, reduces the problem of determining the existence of Hopf bifurcation fixed points to a first-order formula over the ordered field of the reals that can then be solved using computational-logic packages. The second method, called HoCaT, uses ideas from tropical geometry to formulate a more efficient method that is incomplete in theory but works very well for the attempted high-dimensional models involving more than 20 chemical species. The instability of reaction networks may lead to oscillatory behavior. We therefore investigate some criteria for their stability using convex coordinates and quantifier elimination techniques. We also study Muldowney's extension of the classical Bendixson-Dulac criterion for excluding periodic orbits to higher dimensions for polynomial vector fields, and we discuss the use of simple conservation constraints and of parametric constraints for describing simple convex polytopes on which periodic orbits can be excluded by Muldowney's criteria. All developed algorithms have been integrated into a common software framework called PoCaB (platform to explore biochemical reaction networks by algebraic methods), allowing for automated computation workflows from the problem descriptions. PoCaB also contains a database for the algebraic entities computed from the models of chemical reaction networks.
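
A minimal, hypothetical illustration of the kind of symbolic condition involved (not the HoCoQ or HoCaT algorithms themselves): for a three-species system with symbolic rate constants, a Hopf bifurcation at a steady state requires the characteristic polynomial lam^3 + a1*lam^2 + a2*lam + a3 of the Jacobian to satisfy a1*a2 - a3 = 0 with a2 > 0. The toy network below is an assumption chosen only to show the computation.

```python
# A hypothetical toy network; the rate laws are illustrative, not taken from the thesis.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', positive=True)
k1, k2, k3, k4 = sp.symbols('k1 k2 k3 k4', positive=True)

# Mass-action right-hand side f(x; k) of a toy 3-species system.
f = sp.Matrix([
    k1 - k2 * x1 * x2,
    k2 * x1 * x2 - k3 * x2 * x3,
    k3 * x2 * x3 - k4 * x3,
])

lam = sp.symbols('lam')
J = f.jacobian([x1, x2, x3])
p = J.charpoly(lam).as_expr()          # monic: lam**3 + a1*lam**2 + a2*lam + a3
a1, a2, a3 = p.coeff(lam, 2), p.coeff(lam, 1), p.coeff(lam, 0)

# Necessary condition for a pair of purely imaginary eigenvalues (Hopf):
# a1*a2 - a3 = 0 together with a2 > 0, to be evaluated at a steady state.
hopf_condition = sp.simplify(a1 * a2 - a3)
print(hopf_condition)
```

In the methods described above, conditions of this type are formulated in convex (stoichiometric) coordinates and handed to quantifier-elimination or tropical-geometry machinery rather than inspected by hand.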

Relevance:

30.00%

Publisher:

Abstract:

We study four measures of problem instance behavior that might account for the observed differences in interior-point method (IPM) iterations when these methods are used to solve semidefinite programming (SDP) problem instances: (i) an aggregate geometry measure related to the primal and dual feasible regions (aspect ratios) and norms of the optimal solutions, (ii) the Renegar condition measure C(d) of the data instance, (iii) a measure of the near-absence of strict complementarity of the optimal solution, and (iv) the level of degeneracy of the optimal solution. We compute these measures for the SDPLIB suite problem instances and measure the correlation between these measures and IPM iteration counts (with the instances solved using the software SDPT3) when the measures have finite values. Our conclusions are roughly as follows: the aggregate geometry measure is highly correlated with IPM iterations (CORR = 0.896), and is a very good predictor of IPM iterations, particularly for problem instances with solutions of small norm and aspect ratio. The condition measure C(d) is also correlated with IPM iterations, but less so than the aggregate geometry measure (CORR = 0.630). The near-absence of strict complementarity is weakly correlated with IPM iterations (CORR = 0.423). The level of degeneracy of the optimal solution is essentially uncorrelated with IPM iterations.
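
A minimal sketch of the correlation computation itself, with hypothetical numbers standing in for a behavioral measure and the corresponding SDPT3 iteration counts:

```python
# Hypothetical data for illustration only; not values from the SDPLIB study.
import numpy as np

geometry_measure = np.array([2.1, 3.4, 1.7, 4.8, 2.9, 3.9])  # e.g. log of an aggregate geometry measure
ipm_iterations   = np.array([14,  22,  12,  30,  18,  25])   # IPM iteration counts

# Pearson correlation coefficient, as in the CORR values quoted above.
corr = np.corrcoef(geometry_measure, ipm_iterations)[0, 1]
print(f"CORR = {corr:.3f}")
```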

Relevance:

30.00%

Publisher:

Abstract:

Quantitatively assessing the importance or criticality of each link in a network is of practical value to operators, as it can help them to increase the network's resilience, provide more efficient services, or improve some other aspect of the service. Betweenness is a graph-theoretical measure of centrality that can be applied to communication networks to evaluate link importance. However, as we illustrate in this paper, the basic definition of betweenness centrality produces inaccurate estimates, as it does not take into account some aspects relevant to networking, such as the heterogeneity in link capacity or the difference between node pairs in their contribution to the total traffic. A new algorithm for discovering link centrality in transport networks is proposed in this paper. It requires only static or semi-static network and topology attributes, and yet produces estimates of good accuracy, as verified through extensive simulations. Its potential value is demonstrated by an example application, in which the simple shortest-path routing algorithm is improved in such a way that it outperforms other more advanced algorithms in terms of blocking ratio.
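
A minimal sketch of the underlying idea (not the paper's algorithm): classical edge betweenness treats all node pairs alike, whereas a traffic-aware variant can weight each pair's contribution by its demand. The small topology and the demand values below are assumptions for illustration, and each demand is routed on a single shortest path.

```python
# Requires networkx; the graph and demand matrix are hypothetical.
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from([
    ("a", "b", 1.0), ("b", "c", 1.0), ("a", "c", 2.5),
    ("c", "d", 1.0), ("b", "d", 2.0),
])

# Classical betweenness: every node pair contributes equally.
plain = nx.edge_betweenness_centrality(G, weight="weight")

# Traffic-weighted link centrality: each pair's single shortest path
# contributes its demand volume to the links it traverses.
demand = {("a", "d"): 10.0, ("a", "c"): 2.0, ("b", "d"): 1.0}  # hypothetical traffic matrix
weighted = {e: 0.0 for e in G.edges()}
for (s, t), volume in demand.items():
    path = nx.shortest_path(G, s, t, weight="weight")
    for u, v in zip(path, path[1:]):
        edge = (u, v) if (u, v) in weighted else (v, u)
        weighted[edge] += volume

print(plain)
print(weighted)
```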

Relevance:

30.00%

Publisher:

Abstract:

The sustainability of cereal/legume intercropping was assessed by monitoring trends in grain yield, soil organic C (SOC) and soil extractable P (Olsen method) measured over 13 years at a long-term field trial on a P-deficient soil in semi-arid Kenya. Goat manure was applied annually for 13 years at 0, 5 and 10 t ha^-1; trends in grain yield were not identifiable because of season-to-season variation. SOC and Olsen P increased for the first seven years of manure application and then remained constant. The residual effect of manure applied for four years only lasted another seven to eight years when assessed by yield, SOC and Olsen P. Mineral fertilizers provided the same annual rates of N and P as 5 t ha^-1 of manure and initially gave the same yield as manure, declining after nine years to about 80%. Therefore, manure applications could be made intermittently and nutrient requirements topped up with fertilizers. Grain yields for sorghum with continuous manure were described well by correlations with rainfall and manure input only if data were excluded for seasons with over 500 mm of rainfall. A comprehensive simulation model should correctly describe crop losses caused by excess water.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a new method for the assessment of palaeohydrology through the Holocene. A palaeoclimate model was linked with a hydrological model, using a weather generator to correct bias in the rainfall estimates, to simulate the changes in the flood frequency and the groundwater response through the late Pleistocene and Holocene for the Wadi Faynan in southern Jordan, a site considered internationally important due to its rich archaeological heritage spanning the Pleistocene and Holocene. This is the first study to describe the hydrological functioning of the Wadi Faynan, a meso-scale (241 km²) semi-arid catchment, setting this description within the framework of contemporary archaeological investigations. Historic meteorological records were collated and supplemented with new hydrological and water quality data. The modelled outcomes indicate that environmental changes, such as deforestation, had a major impact on the local water cycle and this amplified the effect of the prevailing climate on the flow regime. The results also show that increased rainfall alone does not necessarily imply better conditions for farming and highlight the importance of groundwater. The discussion focuses on the utility of the method and the importance of the local hydrology to the sustained settlement of the Wadi Faynan through pre-history and history.
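
A minimal sketch of rainfall bias correction in its simplest form, a per-month multiplicative scaling of modelled rainfall against observations. This only illustrates the general idea; the paper couples a palaeoclimate model to a hydrological model via a weather generator, which is richer than this, and all series below are synthetic.

```python
# Synthetic monthly rainfall series for illustration only.
import numpy as np

rng = np.random.default_rng(0)
months = np.tile(np.arange(12), 30)             # 30 years of monthly data
observed = rng.gamma(2.0, 20.0, months.size)    # hypothetical observed rainfall (mm)
modelled = rng.gamma(2.0, 15.0, months.size)    # hypothetical modelled rainfall (mm)

# Per-month scaling factors estimated from the calibration period.
factors = np.array([
    observed[months == m].mean() / modelled[months == m].mean()
    for m in range(12)
])

corrected = modelled * factors[months]          # bias-corrected rainfall series
print(factors.round(2))
```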

Relevance:

30.00%

Publisher:

Abstract:

We study boundary value problems posed in a semistrip for the elliptic sine-Gordon equation, which is the paradigm of an elliptic integrable PDE in two variables. We use the method introduced by one of the authors, which provides a substantial generalization of the inverse scattering transform and can be used for the analysis of boundary as opposed to initial-value problems. We first express the solution in terms of a 2 by 2 matrix Riemann-Hilbert problem whose "jump matrix" depends on both the Dirichlet and the Neumann boundary values. For a well-posed problem one of these boundary values is an unknown function. This unknown function is characterised in terms of the so-called global relation, but in general this characterisation is nonlinear. We then concentrate on the case that the prescribed boundary conditions are zero along the unbounded sides of the semistrip and constant along the bounded side. This corresponds to a case of the so-called linearisable boundary conditions; however, a major difficulty for this problem is the existence of non-integrable singularities of the function q_y at the two corners of the semistrip; these singularities are generated by the discontinuities of the boundary condition at these corners. Motivated by the recent solution of the analogous problem for the modified Helmholtz equation, we introduce an appropriate regularisation which overcomes this difficulty. Furthermore, by mapping the basic Riemann-Hilbert problem to an equivalent modified Riemann-Hilbert problem, we show that the solution can be expressed in terms of a 2 by 2 matrix Riemann-Hilbert problem whose jump matrix depends explicitly on the width of the semistrip L, on the constant value d of the solution along the bounded side, and on the residues at the given poles of a certain spectral function denoted by h. The determination of the function h remains open.
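
For reference, in the normalisation commonly used the elliptic sine-Gordon equation referred to above reads (the paper's precise scaling is not reproduced here):

```latex
\[
  q_{xx} + q_{yy} = \sin q .
\]
```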

Relevance:

30.00%

Publisher:

Abstract:

Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
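
A minimal sketch of the two-stage idea on a toy model (a normal sample with unknown mean; the model, feature choices and acceptance threshold are assumptions for illustration, not the paper's examples): a pilot run regresses the parameter on data features, and the fitted regression then serves as the summary statistic inside rejection ABC.

```python
# Toy semi-automatic-style ABC sketch; all settings are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_obs = 50
theta_true = 1.3
observed = rng.normal(theta_true, 1.0, n_obs)

def simulate(theta):
    return rng.normal(theta, 1.0, n_obs)

def features(data):
    # Simple data features fed to the regression.
    return np.array([data.mean(), np.median(data), data.std()])

# Stage 1: pilot simulations, then regress theta on features to approximate
# the posterior mean E[theta | data]; the fit defines the summary statistic.
pilot_thetas = rng.uniform(-5, 5, 2000)
X = np.array([features(simulate(t)) for t in pilot_thetas])
X = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X, pilot_thetas, rcond=None)
summary = lambda data: np.concatenate([[1.0], features(data)]) @ beta

# Stage 2: rejection ABC using the learned summary statistic.
s_obs = summary(observed)
proposals = rng.uniform(-5, 5, 20000)
dist = np.array([abs(summary(simulate(t)) - s_obs) for t in proposals])
accepted = proposals[dist <= np.quantile(dist, 0.01)]
print(accepted.mean(), accepted.std())
```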

Relevance:

30.00%

Publisher:

Abstract:

Many applications, such as intermittent data assimilation, lead to a recursive application of Bayesian inference within a Monte Carlo context. Popular data assimilation algorithms include sequential Monte Carlo methods and ensemble Kalman filters (EnKFs). These methods differ in the way Bayesian inference is implemented. Sequential Monte Carlo methods rely on importance sampling combined with a resampling step, while EnKFs utilize a linear transformation of Monte Carlo samples based on the classic Kalman filter. While EnKFs have proven to be quite robust even for small ensemble sizes, they are not consistent since their derivation relies on a linear regression ansatz. In this paper, we propose another transform method, which does not rely on any a priori assumptions on the underlying prior and posterior distributions. The new method is based on solving an optimal transportation problem for discrete random variables. © 2013, Society for Industrial and Applied Mathematics
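
A minimal sketch of such a transform step in one dimension, in the spirit of the optimal-transport transform described above rather than a reproduction of it (the Gaussian toy likelihood, ensemble size and LP formulation below are assumptions for illustration):

```python
# Toy one-dimensional ensemble transform via a discrete optimal transport LP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
M = 20
x = rng.normal(0.0, 1.0, M)             # prior ensemble (uniform weights 1/M)
y, obs_var = 1.0, 0.5 ** 2              # observation and its error variance

# Importance weights from the likelihood, normalised to sum to one.
w = np.exp(-0.5 * (y - x) ** 2 / obs_var)
w /= w.sum()

# Discrete optimal transport: minimise sum_ij t_ij (x_i - x_j)^2 subject to
# row sums w_i (prior member i carries posterior mass w_i) and column sums
# 1/M (each analysis member receives equal mass).
cost = (x[:, None] - x[None, :]) ** 2
A_eq, b_eq = [], []
for i in range(M):                      # row-sum constraints
    row = np.zeros((M, M)); row[i, :] = 1.0
    A_eq.append(row.ravel()); b_eq.append(w[i])
for j in range(M):                      # column-sum constraints
    col = np.zeros((M, M)); col[:, j] = 1.0
    A_eq.append(col.ravel()); b_eq.append(1.0 / M)

res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=(0, None), method="highs")
T = res.x.reshape(M, M)

# Each analysis member is a weighted average of prior members.
x_analysis = M * T.T @ x
print(x.mean(), (w * x).sum(), x_analysis.mean())
```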

Relevance:

30.00%

Publisher:

Abstract:

The time discretization in weather and climate models introduces truncation errors that limit the accuracy of the simulations. Recent work has yielded a method for reducing the amplitude errors in leapfrog integrations from first-order to fifth-order. This improvement is achieved by replacing the Robert–Asselin filter with the RAW filter and using a linear combination of the unfiltered and filtered states to compute the tendency term. The purpose of the present paper is to apply the composite-tendency RAW-filtered leapfrog scheme to semi-implicit integrations. A theoretical analysis shows that the stability and accuracy are unaffected by the introduction of the implicitly treated mode. The scheme is tested in semi-implicit numerical integrations in both a simple nonlinear stiff system and a medium-complexity atmospheric general circulation model, and yields substantial improvements in both cases. We conclude that the composite-tendency RAW-filtered leapfrog scheme is suitable for use in semi-implicit integrations.
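
A minimal sketch of a RAW-filtered leapfrog integration of the oscillation equation dz/dt = i*omega*z. It illustrates the filter itself only; the composite-tendency modification and the implicitly treated mode discussed above are not included, and the parameter values are typical choices rather than the paper's.

```python
# Plain RAW-filtered leapfrog on the oscillation equation; illustrative values.
import numpy as np

omega, dt, nsteps = 1.0, 0.2, 200
nu, alpha = 0.2, 0.53                      # filter strength and RAW parameter

tendency = lambda z: 1j * omega * z

z_prev = 1.0 + 0.0j                        # z(0)
z_now = z_prev * np.exp(1j * omega * dt)   # exact value at t = dt to start leapfrog

for _ in range(nsteps):
    z_next = z_prev + 2.0 * dt * tendency(z_now)     # leapfrog step
    d = 0.5 * nu * (z_prev - 2.0 * z_now + z_next)   # filter displacement
    z_prev = z_now + alpha * d                       # RAW: filter the current level...
    z_now = z_next + (alpha - 1.0) * d               # ...and partially the new level

t_end = (nsteps + 1) * dt
print(abs(z_now), abs(np.exp(1j * omega * t_end)))   # amplitude vs exact (= 1)
```

With alpha = 1 the scheme reduces to the classical Robert–Asselin filter, which damps the amplitude; alpha near 0.53 largely removes that damping.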

Relevance:

30.00%

Publisher:

Abstract:

Due to their broad differentiation potential and their persistence into adulthood, human neural crest-derived stem cells (NCSCs) harbour great potential for autologous cellular therapies, which include the treatment of neurodegenerative diseases and replacement of complex tissues containing various cell types, as in the case of musculoskeletal injuries. The use of serum-free approaches often results in insufficient proliferation of stem cells, while the use of foetal calf serum entails xenogeneic medium components. Thus, there is much need for alternative cultivation strategies. In this study we describe for the first time a novel, human blood plasma based semi-solid medium for the cultivation of human NCSCs. We cultivated human neural crest-derived inferior turbinate stem cells (ITSCs) within a blood plasma matrix, where they revealed higher proliferation rates compared to a standard serum-free approach. The three-dimensionality of the matrix was investigated using helium ion microscopy. ITSCs grew within the matrix, as revealed by laser scanning microscopy. Genetic stability and maintenance of stemness characteristics were assured in 3D-cultivated ITSCs, as demonstrated by an unchanged expression profile and the capability for self-renewal. ITSCs pre-cultivated in the 3D matrix differentiated efficiently into ectodermal and mesodermal cell types, particularly including osteogenic cell types. Furthermore, ITSCs cultivated as described here could be easily infected with lentiviruses directly in the substrate for potential tracing or gene-therapeutic approaches. Taken together, the use of human blood plasma as an additive for a completely defined medium points towards a personalisable and autologous cultivation of human neural crest-derived stem cells under clinical-grade conditions.

Relevance:

30.00%

Publisher:

Abstract:

Time discretization in weather and climate models introduces truncation errors that limit the accuracy of the simulations. Recent work has yielded a method for reducing the amplitude errors in leapfrog integrations from first-order to fifth-order. This improvement is achieved by replacing the Robert–Asselin filter with the Robert–Asselin–Williams (RAW) filter and using a linear combination of unfiltered and filtered states to compute the tendency term. The purpose of the present article is to apply the composite-tendency RAW-filtered leapfrog scheme to semi-implicit integrations. A theoretical analysis shows that the stability and accuracy are unaffected by the introduction of the implicitly treated mode. The scheme is tested in semi-implicit numerical integrations in both a simple nonlinear stiff system and a medium-complexity atmospheric general circulation model and yields substantial improvements in both cases. We conclude that the composite-tendency RAW-filtered leapfrog scheme is suitable for use in semi-implicit integrations.

Relevance:

30.00%

Publisher:

Abstract:

We discuss the estimation of the expected value of quality-adjusted survival, based on multistate models. We generalize earlier work by allowing the sojourn times in the health states to be non-identically distributed, for a given vector of covariates. Approaches based on semiparametric and parametric (exponential and Weibull distributions) methodologies are considered. A simulation study is conducted to evaluate the performance of the proposed estimator, and the jackknife resampling method is used to estimate the variance of this estimator. An application to a real data set is also included.
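
A minimal sketch of the jackknife variance step for a generic estimator (here simply the mean of hypothetical survival times; the multistate quality-adjusted survival estimator itself is not reproduced):

```python
# Leave-one-out jackknife variance estimate for an arbitrary estimator.
import numpy as np

rng = np.random.default_rng(3)
data = rng.exponential(2.0, 40)      # hypothetical survival times

def estimator(sample):
    return sample.mean()             # stand-in for the estimator of interest

n = len(data)
theta_full = estimator(data)
leave_one_out = np.array([estimator(np.delete(data, i)) for i in range(n)])

# Jackknife variance: (n - 1)/n * sum of squared deviations of the
# leave-one-out estimates from their mean.
jackknife_var = (n - 1) / n * np.sum((leave_one_out - leave_one_out.mean()) ** 2)
print(theta_full, jackknife_var)
```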

Relevance:

30.00%

Publisher:

Abstract:

We present a variable time step, fully adaptive in space, hybrid method for the accurate simulation of incompressible two-phase flows in the presence of surface tension in two dimensions. The method is based on the hybrid level set/front-tracking approach proposed in [H. D. Ceniceros and A. M. Roma, J. Comput. Phys., 205, 391-400, 2005]. Geometric, interfacial quantities are computed from front-tracking via the immersed-boundary setting, while the signed distance (level set) function, which is evaluated fast and to machine precision, is used as a fluid indicator. The surface tension force is obtained by employing the mixed Eulerian/Lagrangian representation introduced in [S. Shin, S. I. Abdel-Khalik, V. Daru and D. Juric, J. Comput. Phys., 203, 493-516, 2005], whose success in greatly reducing parasitic currents has been demonstrated. The use of our accurate fluid indicator together with effective Lagrangian marker control enhances this parasitic current reduction by several orders of magnitude. To resolve sharp gradients and salient flow features accurately and efficiently, we employ dynamic, adaptive mesh refinement. This spatial adaptation is used in concert with a dynamic control of the distribution of the Lagrangian nodes along the fluid interface and a variable time step, linearly implicit time integration scheme. We present numerical examples designed to test the capabilities and performance of the proposed approach as well as three applications: the long-time evolution of a fluid interface undergoing Rayleigh-Taylor instability, an example of ascending bubble dynamics, and a drop impacting on a free interface whose dynamics we compare with both existing numerical and experimental data.
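
A minimal sketch of one ingredient only, the use of a signed-distance (level set) function as a fluid indicator: material properties are blended across the interface with a smoothed Heaviside function. The circular interface, grid size and density values below are assumptions for illustration; the paper's hybrid level set/front-tracking machinery is not reproduced.

```python
# Blending fluid properties across an interface with a smoothed Heaviside
# of the signed-distance function; all values are illustrative.
import numpy as np

def smoothed_heaviside(phi, eps):
    """0 in fluid 1, 1 in fluid 2, smooth transition over |phi| < eps."""
    return np.where(phi < -eps, 0.0,
           np.where(phi > eps, 1.0,
                    0.5 * (1.0 + phi / eps + np.sin(np.pi * phi / eps) / np.pi)))

# Signed distance to a circular interface of radius 0.25 centred in [0, 1]^2.
n = 128
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
phi = np.sqrt((x - 0.5) ** 2 + (y - 0.5) ** 2) - 0.25

eps = 1.5 / n                          # smoothing width of a few grid cells
H = smoothed_heaviside(phi, eps)
rho1, rho2 = 1000.0, 1.0               # hypothetical fluid densities
rho = rho1 + (rho2 - rho1) * H         # blended density field
print(rho.min(), rho.max())
```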

Relevance:

30.00%

Publisher:

Abstract:

Work presented at the XXXV CNMAC, Natal-RN, 2014.