956 results for Local Galerkin method


Relevance:

30.00%

Publisher:

Abstract:

In this work, a p-adaptation (modification of the polynomial order) strategy based on the minimization of the truncation error is developed for high-order discontinuous Galerkin methods. The truncation error is approximated by means of a truncation error (tau-)estimation procedure and enables the identification of mesh regions that require adaptation. Three truncation error estimation approaches are developed, termed a posteriori, quasi-a priori, and quasi-a priori corrected. Fine solutions, obtained by enriching the polynomial order, are required to solve the numerical problem with adequate accuracy. Of the three estimation methods, the first requires time-converged fine solutions, while the last two rely on non-converged solutions, which leads to faster computations. Based on these truncation error estimation methods, algorithms for mesh adaptation were designed and tested. First, an isotropic adaptation approach is presented, which leads to equally distributed polynomial orders in the different coordinate directions. This first implementation is improved by incorporating a method to extrapolate the truncation error, which results in a significant reduction of computational cost. Second, the employed high-order method permits the spatial decoupling of the estimated errors and thereby enables anisotropic p-adaptation. The incorporation of anisotropic features leads to meshes with different polynomial orders in the different coordinate directions, so that flow features related to the geometry are better resolved. These adaptations result in a significant reduction of degrees of freedom and computational cost, with the amount of improvement depending on the test case. Finally, the anisotropic approach is extended with error extrapolation, which leads to an even greater reduction in computational cost. These strategies are verified and compared in terms of accuracy and computational cost for the Euler and compressible Navier-Stokes equations. The main result is that the two quasi-a priori methods achieve a significant reduction in computational cost compared with uniform polynomial enrichment: for a viscous boundary layer flow, we obtain speedups of 6.6 and 7.6 for the quasi-a priori and quasi-a priori corrected approaches, respectively.
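As a rough illustration of the loop described above, here is a minimal sketch of the isotropic variant in Python. The `mesh`, `solve_dg`, and `estimate_truncation_error` names are hypothetical stand-ins for a real DG solver and the tau-estimation procedure, not the authors' implementation.

```python
# Minimal sketch of isotropic, truncation-error-driven p-adaptation.
# solve_dg and estimate_truncation_error are hypothetical stand-ins.
def p_adapt_isotropic(mesh, p_init, tau_max, p_max):
    """Raise the polynomial order element by element until the estimated
    truncation error of every element falls below tau_max."""
    orders = {elem: p_init for elem in mesh.elements}
    while True:
        solution = solve_dg(mesh, orders)   # converged or non-converged solve
        tau = estimate_truncation_error(mesh, orders, solution)
        flagged = [e for e in mesh.elements
                   if tau[e] > tau_max and orders[e] < p_max]
        if not flagged:
            return orders, solution
        for e in flagged:      # isotropic: same order in every direction
            orders[e] += 1
```

The anisotropic variant described above would instead keep one order per coordinate direction and flag each direction independently from the spatially decoupled error estimates.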

Relevance:

30.00%

Publisher:

Abstract:

The distribution of optimal local alignment scores of random sequences plays a vital role in evaluating the statistical significance of sequence alignments. These scores can be well described by an extreme-value distribution. The distribution’s parameters depend upon the scoring system employed and the random letter frequencies; in general they cannot be derived analytically, but must be estimated by curve fitting. For obtaining accurate parameter estimates, a form of the recently described ‘island’ method has several advantages. We describe this method in detail, and use it to investigate the functional dependence of these parameters on finite-length edge effects.
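As a rough illustration of the fitting step, the sketch below estimates the extreme-value parameters from island peak scores. The formulas are the standard maximum-likelihood estimates for an exponentially decaying tail of integer-valued scores; the paper's exact procedure and corrections may differ.

```python
import math

def fit_island_gumbel(island_scores, c, search_area):
    """Estimate lambda and K from island peak scores at least as large as a
    cutoff c, observed over a given search area (illustrative sketch)."""
    s = [x for x in island_scores if x >= c]
    n = len(s)
    # ML estimate of lambda for a geometrically decaying score tail
    lam = math.log(1.0 + n / sum(x - c for x in s))
    # K from the density of islands with score >= c per unit search area
    K = n * math.exp(lam * c) / search_area
    return lam, K
```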

Relevance:

30.00%

Publisher:

Abstract:

The Internet has created new opportunities for librarians to present literature search results to clinicians. In order to take full advantage of these opportunities, libraries need to create locally maintained bibliographic databases. A simple method of creating a local bibliographic database and publishing it on the Web is described. The method uses off-the-shelf software and requires minimal programming. A hedge search strategy for outcome studies of clinical process interventions is created, and Ovid is used to search MEDLINE. The search results are saved and imported into EndNote libraries. The citations are modified, exported to a Microsoft Access database, and published on the Web. Clinicians can use a Web browser to search the database. The bibliographic database contains 13,803 MEDLINE citations of outcome studies. Most searches take between four and ten seconds and retrieve between ten and 100 citations. The entire cost of the software is under $900. Locally maintained bibliographic databases can be created easily and inexpensively. They significantly extend the evidence-based health care services that libraries can offer to clinicians.
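A minimal sketch of the core idea, a locally maintained and searchable citation table, is shown below. It substitutes SQLite for the Microsoft Access database used in the paper, and the schema and function names are illustrative assumptions, not the authors' setup.

```python
import sqlite3

def build_citation_db(path, citations):
    """Load (pmid, title, journal, year) tuples exported from a MEDLINE
    search into a local SQLite table (illustrative stand-in for Access)."""
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS citation
                   (pmid TEXT PRIMARY KEY, title TEXT,
                    journal TEXT, year INTEGER)""")
    con.executemany("INSERT OR REPLACE INTO citation VALUES (?, ?, ?, ?)",
                    citations)
    con.commit()
    return con

def search_citations(con, term):
    """Case-insensitive title search of the kind a web form might issue."""
    return con.execute(
        "SELECT pmid, title FROM citation WHERE title LIKE ?",
        (f"%{term}%",)).fetchall()
```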

Relevance:

30.00%

Publisher:

Abstract:

A fast marching level set method is presented for monotonically advancing fronts, which leads to an extremely fast scheme for solving the Eikonal equation. Level set methods are numerical techniques for computing the position of propagating fronts. They rely on an initial value partial differential equation for a propagating level set function and use techniques borrowed from hyperbolic conservation laws. Topological changes, corner and cusp development, and accurate determination of geometric properties such as curvature and normal direction are naturally obtained in this setting. This paper describes a particular case of such methods for interfaces whose speed depends only on local position. The technique works by coupling work on entropy conditions for interface motion, the theory of viscosity solutions for Hamilton-Jacobi equations, and fast adaptive narrow band level set methods. The technique is applicable to a variety of problems, including shape-from-shading problems, lithographic development calculations in microchip manufacturing, and arrival time problems in control theory.
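A minimal sketch of the fast marching idea on a 2-D grid follows: grid points are accepted (frozen) in increasing arrival-time order via a heap, and each neighbour is updated with a first-order upwind solve of the Eikonal equation |grad T| F = 1. Unit grid spacing and the first-order update are illustrative choices, not the paper's exact scheme.

```python
import heapq

def fast_marching(speed, sources):
    """speed: 2-D array of positive front speeds F; sources: seed cells.
    Returns arrival times T with T = 0 at the sources."""
    ny, nx = len(speed), len(speed[0])
    INF = float("inf")
    T = [[INF] * nx for _ in range(ny)]
    heap = []
    for (i, j) in sources:
        T[i][j] = 0.0
        heapq.heappush(heap, (0.0, i, j))
    frozen = set()
    while heap:
        t, i, j = heapq.heappop(heap)
        if (i, j) in frozen:
            continue
        frozen.add((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < ny and 0 <= nj < nx and (ni, nj) not in frozen:
                # smallest upwind neighbour values in x and y
                tx = min(T[ni][nj - 1] if nj > 0 else INF,
                         T[ni][nj + 1] if nj < nx - 1 else INF)
                ty = min(T[ni - 1][nj] if ni > 0 else INF,
                         T[ni + 1][nj] if ni < ny - 1 else INF)
                h = 1.0 / speed[ni][nj]        # unit grid spacing
                a, b = sorted((tx, ty))
                if b - a < h:                  # both upwind values usable
                    tn = 0.5 * (a + b + (2 * h * h - (b - a) ** 2) ** 0.5)
                else:                          # one-sided update
                    tn = a + h
                if tn < T[ni][nj]:
                    T[ni][nj] = tn
                    heapq.heappush(heap, (tn, ni, nj))
    return T
```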

Relevance:

30.00%

Publisher:

Abstract:

The role of cAMP subcellular compartmentation in the beta-adrenergic stimulation of cardiac L-type calcium current (ICa) was investigated using whole-cell patch-clamp recording combined with a double capillary for extracellular microperfusion. Frog ventricular cells were sealed at both ends to two patch-clamp pipettes and positioned approximately halfway between the mouths of two capillaries separated by a thin 5-micron wall. ICa could be inhibited in either half of the cell by omitting Ca2+ from the corresponding solution. Exposing half of the cell to a saturating concentration of isoprenaline (ISO, 1 microM) produced a non-maximal increase in ICa (347 ± 70%; n = 4), since a subsequent application of ISO to the other part induced an additional effect of similar amplitude, reaching a 673 ± 130% increase. However, half-cell exposure to forskolin (FSK, 30 microM) induced a maximal stimulation of ICa (561 ± 55%; n = 4). This effect was not the result of adenylyl cyclase activation due to FSK diffusion into the non-exposed part of the cell. To determine the distant effects of ISO and FSK on ICa, the drugs were applied in a zero-Ca solution; adding Ca2+ to the drug-containing solutions then allowed us to record their local effects. Dose-response curves for the local and distant effects of ISO and FSK on ICa were used as an index of cAMP concentration changes near the sarcolemma. We found that, in the part of the cell exposed to the drugs, ISO induced a cAMP concentration close to the Ca2+ channels 40-fold higher than in the rest of the cell, whereas for FSK the ratio was only 4-fold. cAMP compartmentation was greatly reduced after inhibition of phosphodiesterase activity with 3-isobutyl-methylxanthine, suggesting the colocalization of enzymes involved in the cAMP cascade. We conclude that beta-adrenergic receptors are functionally coupled to nearby Ca2+ channels via local elevations of cAMP.

Relevance:

30.00%

Publisher:

Abstract:

The helix-coil transition equilibrium of polypeptides in aqueous solution was studied by molecular dynamics simulation. The peptide growth simulation method was introduced to generate dynamic models of polypeptide chains in a statistical (random) coil or an alpha-helical conformation. The key element of this method is to build up a polypeptide chain during the course of a molecular transformation simulation, successively adding whole amino acid residues to the chain in a predefined conformation state (e.g., alpha-helical or statistical coil). Thus, oligopeptides of the same length and composition, but having different conformations, can be incrementally grown from a common precursor, and their relative conformational free energies can be calculated as the difference between the free energies for growing the individual peptides. This affords a straightforward calculation of the Zimm-Bragg sigma and s parameters for helix initiation and helix growth. The calculated sigma and s parameters for the polyalanine alpha-helix are in reasonable agreement with the experimental measurements. The peptide growth simulation method is an effective way to study quantitatively the thermodynamics of local protein folding.
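To make the final step concrete, here is a sketch of how growth free energies translate into the Zimm-Bragg parameters via the Boltzmann relation; the free energy inputs are hypothetical, and the exact mapping used in the paper may differ in detail.

```python
import math

R = 1.987e-3  # gas constant in kcal/(mol*K)

def zimm_bragg_parameters(dG_growth, dG_init, T=300.0):
    """dG_growth: extra free energy (kcal/mol) of adding one residue
    helically rather than as coil onto an existing helix; dG_init: further
    penalty for the helix-nucleating residue. Illustrative mapping only."""
    s = math.exp(-dG_growth / (R * T))    # helix growth parameter
    sigma = math.exp(-dG_init / (R * T))  # helix initiation parameter
    return sigma, s

# e.g. zimm_bragg_parameters(-0.1, 2.5) -> (sigma ~ 1.5e-2, s ~ 1.18)
```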

Relevance:

30.00%

Publisher:

Abstract:

This mixed-method study aimed to redress the gap in the literature on academic service-learning partnerships, especially in Eastern settings. It utilized Enos and Morton's (2003) theoretical framework to explore these partnerships at the American University in Cairo (AUC). Seventy-nine community partners, administrators, faculty members, and students from a diverse range of age, citizenship, racial, educational, and professional backgrounds participated in the study. Qualitative interviews were conducted with members of these four groups, and a survey with both closed-ended and open-ended questions administered to students yielded 61 responses. Qualitative analyses revealed that the primary motivators for partners' engagement in service-learning partnerships included contributing to the community, enhancing students' learning and growth, and achieving the civic mission of the University. These partnerships were characterized by short-term relationships, with partners aspiring to progress toward long-term commitments. The challenges to these partnerships included issues pertaining to the institution, partnering organizations, culture, politics, pedagogy, students, and faculty members. Key strategies for improving these partnerships included institutionalizing service-learning in the University and cultivating an institutional culture supportive of community engagement. Quantitative analyses showed statistically significant relationships between students' scores on the Community Awareness and Interpersonal Effectiveness scales and their overall participation in community service activities inside and outside the classroom, as well as a statistically significant difference in Community Awareness scores by the department offering the service-learning course. The study's outcomes underscore the role of the local culture in shaping service-learning partnerships, as well as the role of both curricular and extracurricular activities in boosting students' awareness of their community and their interpersonal effectiveness. Cultivating a culture of community engagement and building support mechanisms for engaged scholarship are among the critical steps required of public policy-makers in Egypt to promote service-learning in Egyptian higher education. Institutionalizing service-learning partnerships at AUC and enhancing the visibility of these partnerships on campus and in the community are essential to the future growth of these collaborations. Future studies should explore factors affecting community partners' satisfaction with these partnerships, top-down and bottom-up support for service-learning, the value of reflection to faculty members, and the influence of students' economic backgrounds on their involvement in service-learning partnerships.

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a new approach to the delineation of local labour markets based on evolutionary computation. The main objective is the regionalisation of a given territory into functional regions based on commuting flows. According to the relevant literature, such regions are defined so that (a) their boundaries are rarely crossed in daily journeys to work, and (b) a high degree of intra-area movement exists. Our proposal merges municipalities into functional regions by maximizing a fitness function that measures aggregate intra-region interaction under constraints of inter-region separation and minimum size. Real results are presented based on the latest database from the Census of Population in the Region of Valencia. A comparison between the results obtained with the most widely used official method (that of the British Travel-to-Work Areas) and those from our approach is also presented, showing important improvements in terms of both the number of distinct market areas identified that meet the statistical criteria and the degree of aggregate intra-market interaction.
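A sketch of such a fitness evaluation is given below. The self-containment and minimum-size constraints are of the generic kind used in the functional-region literature; the paper's exact fitness function and thresholds may differ.

```python
def fitness(partition, flows, min_size, min_self_containment):
    """partition: dict municipality -> region id; flows[(a, b)]: number of
    commuters living in a and working in b. Returns aggregate intra-region
    interaction, or 0.0 for partitions violating the constraints (sketch)."""
    total_intra = 0
    for r in set(partition.values()):
        members = {m for m, reg in partition.items() if reg == r}
        residents = sum(f for (a, b), f in flows.items() if a in members)
        workers = sum(f for (a, b), f in flows.items() if b in members)
        intra = sum(f for (a, b), f in flows.items()
                    if a in members and b in members)
        if residents < min_size or workers < min_size:
            return 0.0                       # region too small
        if min(intra / residents, intra / workers) < min_self_containment:
            return 0.0                       # insufficient self-containment
        total_intra += intra
    return total_intra
```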

Relevance:

30.00%

Publisher:

Abstract:

Given a territory composed of basic geographical units, the delineation of local labour market areas (LLMAs) can be seen as a problem in which those units are grouped subject to multiple constraints. In previous research, standard genetic algorithms were not able to find valid solutions, and a specific evolutionary algorithm was developed. The inclusion of multiple ad hoc operators allowed the algorithm to find better solutions than those of a widely used greedy method. However, the percentage of invalid solutions was still very high. In this paper we improve that evolutionary algorithm through the inclusion of (i) a reparation process that allows every invalid individual to fulfil the constraints and contribute to the evolution, and (ii) a hill-climbing optimisation procedure applied to each generated individual by means of an appropriate reassignment of some of its constituent units. We compare the results of both techniques against the previous results and those of a greedy method.
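In outline, the two new ingredients might look as follows; `is_valid_region`, `fitness`, and `neighbours` are hypothetical helpers, and the reassignment rules are illustrative rather than the paper's exact operators.

```python
def repair(partition, flows, is_valid_region):
    """Dissolve invalid regions by moving each of their units to the valid
    region that receives most of the unit's outgoing commuters."""
    invalid = {r for r in set(partition.values())
               if not is_valid_region(r, partition, flows)}
    for unit, region in list(partition.items()):
        if region in invalid:
            links = {}
            for (a, b), f in flows.items():
                if a == unit and partition.get(b) not in invalid:
                    links[partition[b]] = links.get(partition[b], 0) + f
            if links:
                partition[unit] = max(links, key=links.get)
    return partition

def hill_climb(partition, fitness, neighbours):
    """Greedy optimisation: keep reassigning single units to neighbouring
    regions while the fitness of the individual improves."""
    improved = True
    while improved:
        improved = False
        for unit in list(partition):
            best, base = partition[unit], fitness(partition)
            for r in neighbours(unit, partition):
                partition[unit] = r
                if fitness(partition) > base:
                    best, base, improved = r, fitness(partition), True
            partition[unit] = best
    return partition
```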

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we demonstrate the use of a video camera for measuring the frequency of small-amplitude vibration movements. The method is based on image acquisition and multilevel thresholding; it requires only a video camera with a sufficiently high acquisition rate, with no need for targets or auxiliary laser beams. Our proposal is accurate and robust. We demonstrate the technique with a pocket camera recording low-resolution videos with AVI-JPEG compression, measuring different objects that vibrate parallel or perpendicular to the optical sensor. Despite the low resolution and the noise, we are able to measure the main vibration modes of a tuning fork, a loudspeaker, and a bridge. Results compare well with design parameters and with measurements from alternative devices.
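The processing chain can be illustrated as follows. Note that the paper uses multilevel thresholding; this sketch substitutes a single crude global threshold and centroid tracking as an illustrative simplification.

```python
import numpy as np

def vibration_frequency(frames, fps):
    """frames: iterable of 2-D greyscale arrays from the video; fps: camera
    acquisition rate. Returns the dominant vibration frequency in Hz."""
    positions = []
    for frame in frames:
        mask = frame > frame.mean() + frame.std()   # crude threshold
        ys, xs = np.nonzero(mask)
        positions.append(xs.mean() if xs.size else np.nan)
    x = np.asarray(positions)
    x = np.nan_to_num(x - np.nanmean(x))            # detrend, drop gaps
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fps)
    return freqs[1:][spectrum[1:].argmax()]         # skip the DC bin
```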

Relevance:

30.00%

Publisher:

Abstract:

We consider quasi-Newton methods for generalized equations in Banach spaces under metric regularity and give a sufficient condition for q-linear convergence. Then we show that the well-known Broyden update satisfies this sufficient condition in Hilbert spaces. We also establish various modes of q-superlinear convergence of the Broyden update under strong metric subregularity, metric regularity and strong metric regularity. In particular, we show that the Broyden update applied to a generalized equation in Hilbert spaces satisfies the Dennis–Moré condition for q-superlinear convergence. Simple numerical examples illustrate the results.
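For the classical smooth special case (an equation F(x) = 0 with no set-valued part), the Broyden update takes the familiar rank-one form sketched below; this is background illustration, not the Banach or Hilbert space setting of the paper.

```python
import numpy as np

def broyden(F, x0, B0, tol=1e-10, max_iter=100):
    """Good Broyden method for F(x) = 0: B is a Jacobian surrogate updated
    by a rank-one correction after each quasi-Newton step."""
    x, B = np.asarray(x0, float), np.asarray(B0, float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)     # quasi-Newton step: B s = -F(x)
        x_new = x + s
        y = F(x_new) - Fx
        # Broyden update: B+ = B + (y - B s) s^T / (s^T s)
        B = B + np.outer(y - B @ s, s) / (s @ s)
        x = x_new
    return x

# e.g. broyden(lambda x: x**2 - 2, [1.0], [[2.0]]) -> approx [1.41421356]
```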

Relevance:

30.00%

Publisher:

Abstract:

In recent times the Douglas–Rachford algorithm has been observed empirically to solve a variety of nonconvex feasibility problems including those of a combinatorial nature. For many of these problems current theory is not sufficient to explain this observed success and is mainly concerned with questions of local convergence. In this paper we analyze global behavior of the method for finding a point in the intersection of a half-space and a potentially non-convex set which is assumed to satisfy a well-quasi-ordering property or a property weaker than compactness. In particular, the special case in which the second set is finite is covered by our framework and provides a prototypical setting for combinatorial optimization problems.
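The iteration itself has a compact generic form. The sketch below applies it to the setting named above, a half-space and a finite (hence potentially nonconvex) set, with illustrative projector implementations; it demonstrates the scheme, not the paper's analysis.

```python
import numpy as np

def douglas_rachford(x0, proj_A, proj_B, iters=1000, tol=1e-9):
    """x+ = x + P_B(2 P_A(x) - x) - P_A(x); returns a (near-)feasible point."""
    x = np.asarray(x0, float)
    for _ in range(iters):
        a = proj_A(x)
        b = proj_B(2 * a - x)
        if np.linalg.norm(b - a) < tol:
            return a                     # a lies in A and (nearly) in B
        x = x + b - a
    return proj_A(x)

def proj_halfspace(n, c):
    """Projector onto the half-space {z : <n, z> <= c}."""
    n = np.asarray(n, float)
    return lambda z: z - max(0.0, (n @ z - c) / (n @ n)) * n

def proj_finite(points):
    """Projector onto a finite set: the nearest of the given points."""
    P = np.asarray(points, float)
    return lambda z: P[np.argmin(((P - z) ** 2).sum(axis=1))]
```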

Relevance:

30.00%

Publisher:

Abstract:

In this short review, we provide some new insights into the material synthesis and characterization of modern multi-component superconducting oxides. Two different approaches, the high-pressure, high-temperature method and ceramic combinatorial chemistry, are reported and applied to several typical examples. First, we highlight the key role of extreme conditions in the growth of Fe-based superconductors, where careful control of the composition-structure relation is vital for understanding the microscopic physics. The availability of high-quality LnFeAsO (Ln = lanthanide) single crystals with substitution of O by F, Sm by Th, Fe by Co, and As by P allowed us to measure intrinsic and anisotropic superconducting properties such as Hc2 and Jc. Furthermore, we demonstrate that combinatorial ceramic chemistry is an efficient way to search for new superconducting compounds. A single-sample synthesis concept based on multi-element ceramic mixtures can produce a variety of local products; such a system requires local probe analyses and separation techniques to identify compounds of interest. We present the results obtained from random mixtures of Ca, Sr, Ba, La, Zr, Pb, Tl, Y, Bi, and Cu oxides reacted under different conditions. By adding Zr and removing Tl, Y, and Bi, bulk superconductivity was enhanced up to about 122 K.

Relevance:

30.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

30.00%

Publisher:

Abstract:

Mode of access: Internet.