914 results for Monotonic interpolation


Relevance: 10.00%

Abstract:

The structural stability of small nonstoichiometric CdS nanoclusters between the zincblende and wurtzite structures has been investigated using first-principles density functional calculations. Our study shows that the relative stability of these two structures depends sensitively on whether the surface is S-terminated or Cd-terminated. The associated band gap also exhibits non-monotonic behavior as a function of cluster size. Our findings may shed light on contradictory reports of experimentally observed structures of CdS nanoclusters in the literature.

Relevance: 10.00%

Abstract:

The structure and function of northern ecosystems are strongly influenced by climate change and variability and by human-induced disturbances. The projected global change is likely to have a pronounced effect on the distribution and productivity of different species, generating large changes in the equilibrium at the tree-line. In turn, movement of the tree-line and the redistribution of species produce feedback to both the local and the regional climate. This research was initiated with the objectives of examining the influence of natural conditions on the small-scale spatial variation of climate in Finnish Lapland and of studying the interaction and feedback mechanisms in the climate-disturbances-vegetation system near the climatological border of the boreal forest. The high-resolution (1 km) spatial variation of climate parameters over northern Finland was determined by applying the Kriging interpolation method, which takes into account the effect of external forcing variables, i.e., geographical coordinates, elevation, and sea and lake coverage. Of all the natural factors shaping the climate, geographical position, local topography and altitude proved to be the determining ones. Spatial analyses of temperature- and precipitation-derived parameters based on a 30-year dataset (1971-2000) provide a detailed description of the local climate. Maps of the mean, maximum and minimum temperatures, the frost-free period and the growing season indicate that the most favourable thermal conditions exist in the south-western part of Lapland, around large water bodies and in the Kemijoki basin, while the coldest regions are in highland and fell Lapland. The distribution of precipitation is predominantly longitudinally dependent, but with the definite influence of local features.
The impact of human-induced disturbances, i.e., forest fires, on the local climate and its implication for forest recovery near the northern timberline was evaluated in the Tuntsa area of eastern Lapland, damaged by a widespread forest fire in 1960 and suffering repeatedly failed vegetation recovery since then. Direct measurements of the local climate and simulated heat and water fluxes indicated the development of a more severe climate and physical conditions on the fire-disturbed site. Removal of the original, predominantly Norway spruce and downy birch vegetation and its substitution by tundra vegetation has generated increased wind velocity and reduced snow accumulation, associated with a large variation in soil temperature and moisture and deep soil frost. The changed structural parameters of the canopy have altered the energy fluxes, reducing them over the tundra vegetation. The altered surface and soil conditions, as well as the resulting severe local climate, have negatively affected seedling growth and survival, leading to more unfavourable conditions for the reproduction of boreal vegetation and thereby causing deviations in the regional position of the timberline. However, it should be noted that other factors, such as an inadequate seed source or seedbed, the poor quality of the soil and the intensive logging of damaged trees, could also exacerbate the poor tree regeneration. In spite of the failed forest recovery at Tuntsa, the position and composition of the timberline and tree-line in Finnish Lapland may also benefit from present and future changes in climate. The already observed and projected increases in temperature, the prolonged growing season, as well as changes in the precipitation regime, foster tree growth and new regeneration, resulting in an advance of the timberline and tree-line northward and upward.
This shift in the distribution of vegetation might be decelerated or even halted by local topoclimatic conditions and by the expected increase in the frequency of disturbances.
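The Kriging step described above can be sketched in miniature. The following is a minimal ordinary-kriging example in Python (numpy only), assuming an exponential variogram with illustrative sill and range parameters; the study's actual method additionally incorporated external drift variables (coordinates, elevation, sea and lake coverage), which this sketch omits.

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=50.0):
    """Exponential variogram model: gamma(h) = sill * (1 - exp(-h / rng))."""
    return sill * (1.0 - np.exp(-h / rng))

def ordinary_kriging(xy, z, xy0, sill=1.0, rng=50.0):
    """Ordinary-kriging estimate of z at location xy0 from observations (xy, z)."""
    n = len(z)
    # Pairwise distances between observation points
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # Bordered kriging system: the extra row/column enforces sum(weights) = 1
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, sill, rng)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - xy0, axis=1), sill, rng)
    w = np.linalg.solve(A, b)[:n]   # kriging weights
    return float(w @ z)

# Hypothetical station coordinates (km) and mean temperatures (degrees C)
stations = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0], [45.0, 35.0]])
temps = np.array([-1.2, -0.4, -2.1, -1.6])
estimate = ordinary_kriging(stations, temps, np.array([20.0, 20.0]))
```

With no nugget effect, the estimator reproduces the observed values exactly at the station locations, and the unit-sum constraint on the weights makes it unbiased for a constant field.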

Relevance: 10.00%

Abstract:

Historical sediment nutrient concentrations and heavy-metal distributions were studied in five embayments in the Gulf of Finland and an adjacent lake. The main objective of the study was to examine the response of these water bodies to temporal changes in human activities. Sediment cores were collected from the sites and dated using 210Pb and 137Cs. The cores were analyzed for total carbon (TC), total nitrogen (TN), total phosphorus (TP), organic phosphorus (OP), inorganic phosphorus (IP), biogenic silica (BSi), loss on ignition (LOI), grain size, Cu, Zn, Al, Fe, Mn, K, Ca, Mg and Na. Principal component analysis (PCA) was used to summarize the trends in the geochemical variables and to compare trends between the different sites. The links between catchment land use and the sediment geochemical data were studied using the multivariate technique of redundancy analysis (RDA). Human activities produce marked geochemical variations in coastal sediments. These variations and signals are often challenging to interpret due to the various sedimentological and post-depositional factors affecting the sediment profiles. In general, the sites studied here show significant upcore increases in sedimentation rates and in TP and TN concentrations. Cu, which is considered a good indicator of anthropogenic influence, also showed clear increases from 1850 towards the top parts of the cores. Based on the RDA analysis, the sediments in the least disturbed embayments, with high forest cover, are dominated by the lithogenic indicators Fe, K, Al and Mg. In embayments close to urban settlement, the sediments have high Cu concentrations and a high sediment Fe/Mn ratio. This study suggests that sediment accumulation rates vary significantly from site to site and that the overall sedimentation can be linked to the geomorphology and basin bathymetry, which appear to be the major factors governing sedimentation rates; i.e.
a high sediment accumulation rate is not characteristic of either urban or rural sites. The geochemical trends are strongly site-specific and depend on the local geochemical background, basin characteristics and anthropogenic metal and nutrient loading. Of the studied geochemical indicators, OP shows the least monotonic trends across the studied sites. When compared to other available data, OP seems to be the most reliable geochemical indicator describing the trophic development of the study sites, whereas Cu and Zn appear to be good indicators of anthropogenic influence. As sedimentation environments, estuarine and marine sites are more complex than lacustrine basins, with multiple sources of sediment input and more energetic conditions in the former. The crucial differences between lacustrine and estuarine/coastal sedimentation environments are mostly related to Fe. P sedimentation is largely governed by Fe redox reactions in estuarine environments. In freshwaters, the presence of Fe is clearly linked to the sedimentation of other lithogenic metals, and therefore P sedimentation and preservation have a more direct linkage to organic matter sedimentation.
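As a generic illustration of the PCA step, making no assumptions about the actual geochemical dataset, a principal component analysis can be written in a few lines via the singular value decomposition of the standardized variable matrix:

```python
import numpy as np

def pca(X, k=2):
    """PCA via SVD: returns sample scores, variable loadings, and the
    proportion of variance explained by each component."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize the variables
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]        # sample coordinates on the first k PCs
    loadings = Vt[:k].T              # variable contributions to each PC
    explained = s ** 2 / np.sum(s ** 2)
    return scores, loadings, explained

# Toy matrix: rows = sediment core slices, columns = e.g. Cu, Zn, Fe, Mn
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))
X[:, 1] += 0.8 * X[:, 0]             # correlate two "metals" so PC1 captures both
scores, loadings, explained = pca(X)
```

The `explained` vector is what justifies summarizing many geochemical variables by a couple of components: the components are ordered by the variance they account for.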

Relevance: 10.00%

Abstract:

A finite element model for the analysis of laminated composite cylindrical shells with through cracks is presented. The analysis takes into account anisotropic elastic behaviour, bending-extensional coupling and transverse shear deformation effects. The proposed finite element model is based on the approach of dividing a cracked configuration into triangular singular elements around the crack tip, with adjoining quadrilateral regular elements. The parabolic isoparametric cylindrical shell elements (both singular and regular) used in this model employ independent displacement and rotation interpolation in the shell middle surface. The numerical comparisons support the conclusion that the proposed model yields accurate stress intensity factors from a relatively coarse mesh. Through the analysis of a pressurised fibre composite cylindrical shell with an axial crack, the effect of material orthotropy on the crack-tip stress intensity factors is shown to be quite significant.

Relevance: 10.00%

Abstract:

Whether a statistician wants to complement a probability model for observed data with a prior distribution and carry out fully probabilistic inference, or to base the inference on the likelihood function alone, may be a fundamental question in theory, but in practice it may well be of less importance if the likelihood contains much more information than the prior. Maximum likelihood inference can be justified as a Gaussian approximation at the posterior mode, using flat priors. However, in situations where the parametric assumptions of standard statistical models would be too rigid, more flexible model formulation, combined with fully probabilistic inference, can be achieved using hierarchical Bayesian parametrization. This work comprises five articles, all of which apply probability modeling to various problems involving incomplete observation. Three of the papers apply maximum likelihood estimation and two of them hierarchical Bayesian modeling. Because maximum likelihood may be presented as a special case of Bayesian inference, but not the other way round, in the introductory part of this work we present a framework for probability-based inference using only Bayesian concepts. We also re-derive some results presented in the original articles using the toolbox developed herein, to show that they are also justifiable under this more general framework. Here the assumption of exchangeability and de Finetti's representation theorem are applied repeatedly to justify the use of standard parametric probability models with conditionally independent likelihood contributions. It is argued that this same reasoning can also be applied under sampling from a finite population. The main emphasis here is on probability-based inference under incomplete observation due to study design. This is illustrated using a generic two-phase cohort sampling design as an example.
The alternative approaches presented for the analysis of such a design are full likelihood, which utilizes all observed information, and conditional likelihood, which is restricted to a completely observed set, conditioning on the rule that generated that set. Conditional likelihood inference is also applied to a joint analysis of prevalence and incidence data, a situation subject to both left censoring and left truncation. Other topics covered are model uncertainty and causal inference using posterior predictive distributions. We formulate a non-parametric monotonic regression model for one or more covariates together with a Bayesian estimation procedure, and apply the model in the context of optimal sequential treatment regimes, demonstrating that inference based on posterior predictive distributions is also feasible in this case.
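The non-parametric monotonic regression mentioned above is fitted with a Bayesian procedure in the original work; as a simpler frequentist counterpart for a single covariate, a least-squares monotonic fit can be computed with the classical pool-adjacent-violators algorithm (PAVA), sketched below.

```python
def pava(y):
    """Pool Adjacent Violators: the least-squares nondecreasing fit to y."""
    blocks = []  # each block is [mean, weight], in left-to-right order
    for v in y:
        blocks.append([float(v), 1.0])
        # Merge backwards while the last two blocks violate monotonicity
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2 = blocks.pop()
            m1, w1 = blocks.pop()
            w = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / w, w])
    fit = []
    for mean, weight in blocks:
        fit.extend([mean] * int(weight))
    return fit
```

For example, `pava([1, 3, 2, 4])` pools the violating pair (3, 2) into its mean 2.5, yielding a nondecreasing step function.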

Relevance: 10.00%

Abstract:

Curved hollow bars of laminated anisotropic construction are used as structural members in many industries. They are used to save weight without loss of stiffness in comparison with solid sections. This paper presents the details of the development of the stiffness matrices of laminated anisotropic curved hollow bars under line-member assumptions for two typical sections, circular and square. These are 16 d.o.f. elements that use one-dimensional first-order Hermite interpolation polynomials to describe the assumed displacement state. Problems for which analytical or other solutions are available are first solved using these elements, and good agreement was found between the results. In order to show the capability of the element, application is made to carbon-fibre-reinforced plastic layered anisotropic curved hollow bars.
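As a sketch of the interpolation scheme named above: first-order Hermite polynomials interpolate both nodal values and nodal slopes, which is what gives a line element a C1-continuous displacement field. A minimal evaluation on a normalized element coordinate xi in [0, 1] (the element length is assumed absorbed into the slope terms):

```python
def hermite_basis(xi):
    """Cubic (first-order) Hermite shape functions on xi in [0, 1]:
    N1, N3 weight the nodal values; N2, N4 weight the nodal slopes."""
    N1 = 1 - 3 * xi**2 + 2 * xi**3
    N2 = xi - 2 * xi**2 + xi**3
    N3 = 3 * xi**2 - 2 * xi**3
    N4 = -xi**2 + xi**3
    return N1, N2, N3, N4

def hermite_interp(xi, v1, s1, v2, s2):
    """Displacement at xi from nodal values (v1, v2) and slopes (s1, s2)."""
    N1, N2, N3, N4 = hermite_basis(xi)
    return N1 * v1 + N2 * s1 + N3 * v2 + N4 * s2
```

At xi = 0 and xi = 1 the interpolant reproduces the nodal values exactly, and a linear field (matching values and unit slopes) is reproduced everywhere.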

Relevance: 10.00%

Abstract:

Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and invent cures to problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem - searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. A classical example is that of genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine - i.e., they should also hold in future data. This is an important distinction from traditional association rules, which - in spite of their name and a similar appearance to dependency rules - do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for the rules with statistical significance measures. Another important objective is to search for only non-redundant rules, which express the real causes of dependence without any occasional extra factors. The extra factors do not add any new information on the dependence, but can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of all possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that the traditional pruning techniques do not work.
As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention, which enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test. It can easily handle even the densest data sets with 10000-20000 attributes. Still, the results are globally optimal, which is a remarkable improvement over the existing solutions. In practice, this means that the user does not have to worry about whether the dependencies hold in future data, or whether the data still contains better, but undiscovered, dependencies.
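To make the significance measure concrete, here is a self-contained one-sided Fisher's exact test for a single rule X->A, computed from the 2x2 contingency table of X against A. This is illustrative only: evaluating such a measure for one rule is easy, and the contribution of the work above is making it feasible over an exponentially large rule space.

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]],
    where a = rows with both X and A, b = X without A, c = A without X,
    d = neither: P(co-occurrence count >= a) under independence."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    # Hypergeometric tail: sum the probabilities of tables at least as extreme
    return sum(comb(col1, k) * comb(n - col1, row1 - k)
               for k in range(a, min(row1, col1) + 1)) / comb(n, row1)

# A strong positive dependency: X and A co-occur in 40 of 100 rows,
# against an expected 25 under independence
p = fisher_exact_one_sided(40, 10, 10, 40)
```

The smaller the p-value, the stronger the evidence that the rule reflects a genuine dependency rather than a chance co-occurrence.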

Relevance: 10.00%

Abstract:

We propose a new type of high-order element that incorporates mesh-free Galerkin formulations into the framework of the finite element method. Traditional polynomial interpolation is replaced by mesh-free interpolations in the present high-order elements, and the strain smoothing technique is used for the integration of the governing equations based on smoothing cells. The properties of the high-order elements, which are influenced by the basis function of the mesh-free interpolations and by the boundary nodes, are discussed through numerical examples. We find that the basis function has a significant influence on the computational accuracy and the upper-lower bounds of the energy norm, while the strain smoothing technique retains the softening phenomenon. This new type of high-order element shows good performance when quadratic basis functions are used in the mesh-free interpolations, and the present elements prove advantageous in adaptive mesh and node refinement schemes. Furthermore, the elements are less sensitive to element quality because they use mesh-free interpolations and obey the Weakened Weak (W2) formulation as introduced in [3, 5].
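Mesh-free interpolation itself is easy to sketch. The following Gaussian radial-basis-function interpolant in 1-D is a generic illustration of shape functions built from scattered nodes without a mesh; it is not the specific smoothed Galerkin formulation of the paper, and the shape parameter `eps` is an assumption.

```python
import numpy as np

def rbf_interpolate(nodes, values, x, eps=1.0):
    """Mesh-free 1-D interpolation with Gaussian radial basis functions:
    solve for nodal weights, then evaluate the interpolant at x."""
    d = np.abs(nodes[:, None] - nodes[None, :])
    A = np.exp(-(eps * d) ** 2)          # interpolation matrix phi(|xi - xj|)
    w = np.linalg.solve(A, values)       # nodal weights
    phi = np.exp(-(eps * np.abs(x - nodes)) ** 2)
    return float(phi @ w)

# Scattered "nodes" carrying sampled field values
nodes = np.array([0.0, 1.0, 2.0, 3.0])
values = np.array([0.0, 1.0, 4.0, 9.0])
```

Like any interpolation (as opposed to approximation) scheme, it reproduces the nodal values exactly, while the basis choice controls behavior between nodes, mirroring the paper's observation that the basis function drives accuracy.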

Relevance: 10.00%

Abstract:

This paper presents a finite element analysis of laminated anisotropic beams of bimodulus materials. The finite element has 16 d.o.f. and uses a displacement field in terms of first-order Hermite interpolation polynomials. As the neutral-axis position may change from point to point along the length of the beam, an iterative procedure is employed to determine the location of the zero-strain points along the length. Using this element, some problems of laminated beams of bimodulus materials are solved for concentrated loads/moments perpendicular and parallel to the layering planes, as well as for combined loads.
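The iterative location of the zero-strain (neutral) axis can be illustrated for a single rectangular bimodulus section: with different moduli in tension and compression, the axis depth c must satisfy axial-force balance, which a simple bisection finds. This is a hedged 1-D sketch of the idea, not the beam element's actual procedure.

```python
def neutral_axis_depth(h, E_t, E_c, tol=1e-12):
    """Bisection for the neutral-axis depth c (measured from the compression
    face) of a rectangular bimodulus section in pure bending.  Axial force
    balance requires E_c * c**2 == E_t * (h - c)**2."""
    def net_force(c):
        # Proportional to the axial resultant; decreasing in c
        return E_t * (h - c) ** 2 - E_c * c ** 2
    lo, hi = 0.0, h          # net_force(0) > 0 > net_force(h)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_force(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For this simple case the closed form is c = h * sqrt(E_t) / (sqrt(E_t) + sqrt(E_c)); the iteration matters in the laminated beam, where no such closed form is available at each point along the length.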

Relevance: 10.00%

Abstract:

Real-time scheduling algorithms, such as Rate Monotonic and Earliest Deadline First, guarantee that calculations are performed within a pre-defined time. As many real-time systems operate on limited battery power, these algorithms have been enhanced with power-aware properties. In this thesis, 13 power-aware real-time scheduling algorithms for processor, device and system-level use are explored.

Relevance: 10.00%

Abstract:

As editors of the book Lilavati's Daughters: The Women Scientists of India, reviewed by Asha Gopinathan (Nature 460, 1082; 2009), we would like to elaborate on the background to its title. Lilavati was a mathematical treatise of the twelfth century, composed by the mathematician and astronomer Bhaskaracharya (1114–85) — also known as Bhaskara II — who was a teacher of repute and author of several other texts. The name Lilavati, which literally means 'playful', is a surprising title for an early scientific book. Some of the mathematical problems posed in the book are in verse form, and are addressed to a girl, the eponymous Lilavati. However, there is little real evidence concerning Lilavati's historicity. Tradition holds that she was Bhaskaracharya's daughter and that he wrote the treatise to console her after an accident that left her unable to marry. But this could be a later interpolation, as the idea was first mentioned in a Persian commentary. An alternative view has it that Lilavati was married at an inauspicious time and was widowed shortly afterwards. Other sources have implied that Lilavati was Bhaskaracharya's wife, or even one of his students — raising the possibility that women in parts of the Indian subcontinent could have participated in higher education as early as eight centuries ago. However, given that Bhaskara was a poet and pedagogue, it is also possible that he chose to address his mathematical problems to a doe-eyed girl simply as a whimsical and charming literary device.

Relevance: 10.00%

Abstract:

The integration of stochastic wind power has accentuated the challenge of power system stability assessment. Since the power system is a time-variant system under wind generation fluctuations, pure time-domain simulations can hardly provide real-time stability assessment. As a result, the worst-case scenario is simulated to give a very conservative assessment of system transient stability. In this study, a probabilistic contingency analysis through a stability measure method is proposed to provide a less conservative contingency analysis that covers 5-min wind fluctuations and a successive fault. This probabilistic approach estimates the transfer limit of a critical line for a given fault with stochastic wind generation and active control devices in a multi-machine system. The approach achieves a lower computation cost and improved accuracy using a new stability measure and polynomial interpolation, and is feasible for online contingency analysis.
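The role of polynomial interpolation here can be illustrated as follows: rather than simulating every candidate operating point, a stability margin is sampled at a few transfer levels and a low-order polynomial locates the zero crossing, i.e. the transfer limit. This is a sketch on synthetic data with an assumed margin curve, not the paper's actual stability measure.

```python
import numpy as np

# Hypothetical stability margins sampled at a few transfer levels (MW);
# generated here from an assumed smooth margin curve whose limit is 650 MW
transfers = np.array([200.0, 400.0, 600.0, 800.0])
margins = 1.0 - (transfers / 650.0) ** 2   # positive = stable

# Fit a low-order polynomial through the sampled points ...
coeffs = np.polyfit(transfers, margins, deg=2)

# ... and take the transfer limit as the positive zero crossing of the fit,
# avoiding a full time-domain simulation at every candidate transfer level
roots = np.roots(coeffs)
limit = min(r.real for r in roots if abs(r.imag) < 1e-9 and r.real > 0)
```

Only four simulated points are needed to recover the limit (650 MW by construction here), which is where the computational saving over exhaustive simulation comes from.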

Relevance: 10.00%

Abstract:

Gadolinium strontium manganite single crystals of the composition Gd0.5Sr0.5MnO3 were grown using the optical float-zone method. We report here the magnetic and magnetotransport properties of these crystals. A large magnetoresistance of ~10^9% was observed at 45 K under the application of a 110 kOe field. We have observed notable thermomagnetic anomalies, such as open hysteresis loops across the broadened first-order transition between the charge-ordered insulator and the ferromagnetic metallic phase, while traversing the magnetic field-temperature (H-T) plane isothermally or isomagnetically. In order to discern the cause of these observed anomalies, the H-T phase diagram for Gd0.5Sr0.5MnO3 is formulated using the magnetization-field (M-H), magnetization-temperature (M-T) and resistance-temperature (R-T) measurements. The temperature dependence of the critical field (i.e. H-up, the field required for transformation to the ferromagnetic metallic phase) is non-monotonic. We note that the non-monotonic variation of the supercooling limit is anomalous according to the classical concepts of the first-order phase transition. Accordingly, H-up values below ~20 K are unsuitable to represent the supercooling limit. It is possible that the nature of the metastable states responsible for the observed open hysteresis loops is different from that of the supercooled ones.

Relevance: 10.00%

Abstract:

Past studies that have compared LBB-stable discontinuous- and continuous-pressure finite element formulations on a variety of problems have concluded that both methods yield solutions of comparable accuracy, and that the choice of interpolation is dictated by which of the two is more efficient. In this work, we show that using discontinuous-pressure interpolations can yield inaccurate solutions at large times on a class of transient problems, while the continuous-pressure formulation yields solutions that are in good agreement with the analytical solution.

Relevance: 10.00%

Abstract:

Technological development of fast multi-sectional, helical computed tomography (CT) scanners has allowed the use of computed tomography perfusion (CTp) and angiography (CTA) in evaluating acute ischemic stroke. This study focuses on new multidetector computed tomography techniques, namely whole-brain and first-pass CT perfusion, plus CTA of the carotid arteries. Whole-brain CTp data are acquired during slow infusion of contrast material to achieve a constant contrast concentration in the cerebral vasculature. From these data, quantitative maps of perfused cerebral blood volume (pCBV) are constructed. The probability curve of cerebral infarction as a function of normalized pCBV was determined in patients with acute ischemic stroke. Normalized pCBV, expressed as a percentage of contralateral normal brain pCBV, was determined in the infarction core and in regions just inside and outside the boundary between infarcted and noninfarcted brain. The corresponding probabilities of infarction were 0.99, 0.96, and 0.11, R² was 0.73, and the differences in perfusion between the core and the inner and outer bands were highly significant. Thus, a probability-of-infarction curve can help predict the likelihood of infarction as a function of percentage normalized pCBV. First-pass CT perfusion is based on continuous cine imaging over a selected brain area during a bolus injection of contrast. During its first passage, the contrast material compartmentalizes in the intravascular space, resulting in transient tissue enhancement. Functional maps such as cerebral blood flow (CBF), cerebral blood volume (CBV) and mean transit time (MTT) are then constructed. We compared the effects of three different iodine concentrations (300, 350, or 400 mg/mL) on the peak enhancement of normal brain tissue, artery and vein, stratified by region-of-interest (ROI) location, in 102 patients within 3 hours of stroke onset.
Monotonically increasing peak opacification with increasing iodine concentration was evident at all ROI locations, suggesting that CTp evaluation of patients with acute stroke is best performed with the highest available concentration of contrast agent. In another study, we investigated whether lesion volumes on CBV, CBF, and MTT maps within 3 hours of stroke onset predict the final infarct volume, and whether all these parameters are needed for triage to intravenous recombinant tissue plasminogen activator (IV-rtPA). The effect of IV-rtPA on the affected brain was also investigated by measuring the salvaged tissue volume in patients receiving IV-rtPA and in controls. CBV lesion volume did not necessarily represent dead tissue. MTT lesion volume alone can serve to identify the upper size limit of the abnormally perfused brain, and patients receiving IV-rtPA salvaged more brain than did controls. Carotid CTA was compared with carotid DSA in the grading of stenosis in patients with stroke symptoms. In CTA, the grade of stenosis was determined by means of axial source and maximum intensity projection (MIP) images, as well as by a semiautomatic vessel analysis. CTA provides an adequate, less invasive alternative to conventional DSA, although it tends to underestimate clinically relevant grades of stenosis.
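The probability-of-infarction curve described above maps normalized pCBV to a probability between 0 and 1, and a logistic function is the natural shape for such a curve. The sketch below uses purely illustrative parameters: the midpoint and slope are assumptions for demonstration, not the values fitted in the study.

```python
import math

def p_infarct(pcbv_pct, midpoint=70.0, slope=0.25):
    """Hypothetical logistic model of infarction probability as a function of
    normalized pCBV (% of contralateral normal brain).  Lower perfusion means
    higher probability; midpoint and slope are illustrative assumptions."""
    return 1.0 / (1.0 + math.exp(slope * (pcbv_pct - midpoint)))
```

Severely hypoperfused tissue (low normalized pCBV) maps to probabilities near 1 and normally perfused tissue to probabilities near 0, reproducing the qualitative core/inner-band/outer-band contrast reported above.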