898 results for Maximum Power Point Tracking algorithms


Relevance:

30.00%

Abstract:

Palaeodata in synthesis form are needed as benchmarks for the Palaeoclimate Modelling Intercomparison Project (PMIP). Advances since the last synthesis of terrestrial palaeodata from the last glacial maximum (LGM) call for a new evaluation, especially of data from the tropics. Here pollen, plant-macrofossil, lake-level, noble gas (from groundwater) and δ¹⁸O (from speleothems) data are compiled for 18±2 ka (¹⁴C), 32°N–33°S. The reliability of the data was evaluated using explicit criteria, and some types of data were re-analysed using consistent methods in order to derive a set of mutually consistent palaeoclimate estimates of mean temperature of the coldest month (MTCO), mean annual temperature (MAT), plant-available moisture (PAM) and runoff (P−E). Cold-month temperature (MTCO) anomalies from plant data range from −1 to −2 K near sea level in Indonesia and the S Pacific, through −6 to −8 K at many high-elevation sites, to −8 to −15 K in S China and the SE USA. MAT anomalies from groundwater or speleothems seem more uniform (−4 to −6 K), but the data are as yet sparse; a clear divergence between MAT and cold-month estimates from the same region is seen only in the SE USA, where cold-air advection is expected to have enhanced cooling in winter. Regression of all cold-month anomalies against site elevation yielded an estimated average cooling of −2.5 to −3 K at modern sea level, increasing to ≈−6 K by 3000 m. However, Neotropical sites showed larger-than-average sea-level cooling (−5 to −6 K) and a non-significant elevation effect, whereas W and S Pacific sites showed much less sea-level cooling (−1 K) and a stronger elevation effect. These findings support the inference that tropical sea-surface temperatures (SSTs) were lower than the CLIMAP estimates, but they limit the plausible average tropical sea-surface cooling, and they support the existence of CLIMAP-like geographic patterns in SST anomalies. Trends of PAM and lake levels indicate wet LGM conditions in the W USA and at the highest elevations, with generally dry conditions elsewhere. These results suggest a colder-than-present ocean surface producing a weaker hydrological cycle, more arid continents, and arguably steeper-than-present terrestrial lapse rates. Such linkages are supported by recent observations on freezing-level height and tropical SSTs; moreover, simulations of "greenhouse" and LGM climates point to several possible feedback processes by which low-level temperature anomalies might be amplified aloft.
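
The elevation regression at the heart of these estimates is simple to reproduce in outline. A minimal sketch in Python with synthetic site data; the coefficients are assumptions chosen to match the quoted ranges, not the paper's compilation:

```python
import numpy as np

# Synthetic site anomalies (assumed, matching the quoted ranges):
# cooling ~ -2.75 K at sea level, steepening to ~ -6 K by 3000 m.
rng = np.random.default_rng(0)
elev_m = rng.uniform(0, 3500, 40)                 # site elevations (m)
anomaly_K = -2.75 - 1.1e-3 * elev_m + rng.normal(0, 1.0, elev_m.size)

slope, intercept = np.polyfit(elev_m, anomaly_K, 1)
print(f"sea-level cooling ~ {intercept:.1f} K")
print(f"elevation effect ~ {slope * 1000:.2f} K per km")
print(f"predicted cooling at 3000 m ~ {intercept + slope * 3000:.1f} K")
```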

Relevance:

30.00%

Abstract:

Mineral dust aerosols in the atmosphere have the potential to affect the global climate by influencing the radiative balance of the atmosphere and the supply of micronutrients to the ocean. Ice and marine sediment cores indicate that dust deposition from the atmosphere was at some locations 2–20 times greater during glacial periods, raising the possibility that mineral aerosols might have contributed to climate change on glacial-interglacial time scales. To address this question, we have used linked terrestrial biosphere, dust source, and atmospheric transport models to simulate the dust cycle in the atmosphere for current and last glacial maximum (LGM) climates. We obtain a 2.5-fold higher dust loading in the entire atmosphere and a twenty-fold higher loading at high latitudes at the LGM relative to the present. Comparisons to a compilation of atmospheric dust deposition flux estimates for the LGM and present in marine sediment and ice cores show that the simulated flux ratios are broadly in agreement with observations; differences suggest where further improvements in the simple dust model could be made. The simulated increase in high-latitude dustiness depends on the expansion of unvegetated areas, especially in the high latitudes and in central Asia, caused by a combination of increased aridity and low atmospheric [CO₂]. The existence of these dust source areas at the LGM is supported by pollen data and loess distribution in the northern continents. These results point to a role for vegetation feedbacks, including climate effects and physiological effects of low [CO₂], in modulating the atmospheric distribution of dust.

Relevance:

30.00%

Abstract:

Using monthly time-series data for 1999–2013, the paper shows that markets for agricultural commodities provide a yardstick for real purchasing power, and thus a reference point for the real value of fiat currencies. The daily need for each adult to consume about 2800 food calories is universal; data from FAO food balance sheets confirm that the world basket of food consumed daily is stable in comparison to the volatility of currency exchange rates, and so the replacement cost of food consumed provides a consistent indicator of economic value. Food commodities are storable for short periods but ultimately perishable, which exerts continual pressure for markets to clear in the short term; moreover, food calories can be obtained from a very large range of foodstuffs, so most households are able to use arbitrage to select a near-optimal weighting of the quantities they purchase. The paper proposes an original method to enable a standard of value to be established, definable in physical units on the basis of actual worldwide consumption of food goods, with an illustration of the method.
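
The valuation idea reduces to simple arithmetic: price a fixed daily food-energy basket in local currency and invert. A minimal sketch with a hypothetical basket and placeholder prices (none of these figures come from the paper):

```python
DAILY_KCAL = 2800  # approximate adult daily requirement cited in the abstract

# hypothetical basket: (food group, share of daily calories, local price per kcal)
basket = [
    ("cereals", 0.50, 0.00020),
    ("oils",    0.20, 0.00035),
    ("sugar",   0.10, 0.00025),
    ("other",   0.20, 0.00060),
]

def daily_basket_cost(basket, kcal=DAILY_KCAL):
    """Local-currency cost of one day's food energy at current prices."""
    return sum(share * kcal * price for _, share, price in basket)

cost = daily_basket_cost(basket)
print(f"cost of {DAILY_KCAL} kcal: {cost:.2f} local currency units")
# real value of one currency unit, expressed in day-baskets of food it buys
print(f"purchasing power: {1 / cost:.3f} day-baskets per unit")
```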

Relevance:

30.00%

Abstract:

Algorithms for computer-aided diagnosis of dementia based on structural MRI have demonstrated high performance in the literature, but are difficult to compare as different data sets and methodology were used for evaluation. In addition, it is unclear how the algorithms would perform on previously unseen data, and thus, how they would perform in clinical practice when there is no real opportunity to adapt the algorithm to the data at hand. To address these comparability, generalizability and clinical applicability issues, we organized a grand challenge that aimed to objectively compare algorithms based on a clinically representative multi-center data set. Using clinical practice as the starting point, the goal was to reproduce the clinical diagnosis. Therefore, we evaluated algorithms for multi-class classification of three diagnostic groups: patients with probable Alzheimer's disease, patients with mild cognitive impairment and healthy controls. The diagnosis based on clinical criteria was used as reference standard, as it was the best available reference despite its known limitations. For evaluation, a previously unseen test set was used consisting of 354 T1-weighted MRI scans with the diagnoses blinded. Fifteen research teams participated with a total of 29 algorithms. The algorithms were trained on a small training set (n = 30) and optionally on data from other sources (e.g., the Alzheimer's Disease Neuroimaging Initiative, the Australian Imaging Biomarkers and Lifestyle flagship study of aging). The best performing algorithm yielded an accuracy of 63.0% and an area under the receiver-operating-characteristic curve (AUC) of 78.8%. In general, the best performances were achieved using feature extraction based on voxel-based morphometry or a combination of features that included volume, cortical thickness, shape and intensity. The challenge is open for new submissions via the web-based framework: http://caddementia.grand-challenge.org.
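
For readers who want to mirror the evaluation protocol, the sketch below computes the two challenge metrics, accuracy and multi-class AUC, with scikit-learn on synthetic stand-in data (the classifier and features are placeholders, not any participant's method):

```python
# Illustrative sketch (synthetic data, not the challenge set): a
# three-class classifier for AD / MCI / control, evaluated with
# accuracy and a one-vs-rest multi-class AUC.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(354, 20))      # stand-in for image-derived features
y = rng.integers(0, 3, size=354)    # 0=AD, 1=MCI, 2=control (random here)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

acc = accuracy_score(y_te, clf.predict(X_te))
auc = roc_auc_score(y_te, clf.predict_proba(X_te), multi_class="ovr")
print(f"accuracy={acc:.3f}  multi-class AUC={auc:.3f}")
```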

Relevance:

30.00%

Abstract:

This article proposes a systematic approach to determining the most suitable analogue-redesign method for forward-type converters under digital voltage-mode control. The aim of the method is to achieve the highest phase margin at the particular switching and crossover frequencies chosen by the designer. It is shown that at crossover frequencies that are high relative to the switching frequency, controllers designed using backward integration have the largest phase margin, whereas at crossover frequencies that are low relative to the switching frequency, controllers designed using bilinear integration with pre-warping have the largest phase margin. An algorithm has been developed to determine the crossover frequency at which the recommended discretisation method changes. An accurate model of the power stage is used for simulation, and experimental results are collected from a buck converter. The performance of the digital controllers is compared to that of the equivalent analogue controller in both simulation and experiment, with excellent agreement between the two. This work provides a concrete example that allows academics and engineers to choose a discretisation method systematically.
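
The trade-off the paper formalizes can be seen directly by discretising the same compensator both ways and comparing the phase each method preserves at the intended crossover. A minimal sketch with scipy.signal; the compensator, switching frequency and crossover below are assumed example values, not the paper's design:

```python
import numpy as np
from scipy import signal

fsw = 100e3                  # switching frequency (assumed), Hz
T = 1.0 / fsw                # controller sample period
fc = 10e3                    # target crossover frequency (assumed), Hz
w0 = 2 * np.pi * fc

# example analogue PI-type compensator C(s) = (s + wz) / s (assumed)
wz = 2 * np.pi * 1e3
num, den = [1.0, wz], [1.0, 0.0]

# backward integration (backward Euler / backward difference)
bd, ad, _ = signal.cont2discrete((num, den), T, method="backward_diff")

# bilinear (Tustin) with pre-warping at w0: equivalent to Tustin with
# an effective sample rate fs_eff = w0 / (2 * tan(w0 * T / 2))
bt, at = signal.bilinear(num, den, fs=w0 / (2 * np.tan(w0 * T / 2)))

for name, b, a in [("backward", bd[0], ad), ("prewarped Tustin", bt, at)]:
    _, h = signal.freqz(b, a, worN=[w0 * T])   # evaluate at crossover
    print(f"{name}: phase at crossover = {np.degrees(np.angle(h[0])):.2f} deg")
```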

Relevance:

30.00%

Abstract:

Fire activity has varied globally and continuously since the last glacial maximum (LGM) in response to long-term changes in global climate and shorter-term regional changes in climate, vegetation, and human land use. We have synthesized sedimentary charcoal records of biomass burning since the LGM and present global maps showing changes in fire activity for time slices during the past 21,000 years (as differences in charcoal accumulation values compared to pre-industrial). There is strong broad-scale coherence in fire activity after the LGM, but spatial heterogeneity in the signals increases thereafter. In North America, Europe and southern South America, charcoal records indicate less-than-present fire activity during the deglacial period, from 21,000 to ∼11,000 cal yr BP. In contrast, the tropical latitudes of South America and Africa show greater-than-present fire activity from ∼19,000 to ∼17,000 cal yr BP, and most sites from Indochina and Australia show greater-than-present fire activity from 16,000 to ∼13,000 cal yr BP. Many sites indicate greater-than-present or near-present activity during the Holocene, with the exception of eastern North America and eastern Asia from 8,000 to ∼3,000 cal yr BP, Indonesia and Australia from 11,000 to 4,000 cal yr BP, and southern South America from 6,000 to 3,000 cal yr BP, where fire activity was less than present. Regional coherence in the patterns of change in fire activity was evident throughout the post-glacial period. These complex patterns can largely be explained in terms of large-scale climate controls modulated by local changes in vegetation and fuel load.

Relevance:

30.00%

Abstract:

Verbal communication is essential to human society and civilization. Non-verbal communication, on the other hand, is used more widely, not only by humans but also by other animals, and its information content is estimated to be even larger than that of verbal communication. Among forms of non-verbal communication, mutual motion is the simplest and easiest to study experimentally and analytically. We measured the power spectrum of hand velocity under various conditions and clarified the following points about the feedback and feed-forward mechanisms, as basic knowledge for understanding the conditions of good communication.
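
As a point of reference for the measurement itself, the power spectrum of a velocity trace is typically estimated with Welch's method. A minimal sketch on a synthetic signal (the sampling rate and signal content are assumptions, not the study's recordings):

```python
import numpy as np
from scipy.signal import welch

fs = 200.0                                  # assumed sampling rate, Hz
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
# stand-in hand velocity: slow voluntary motion plus broadband noise
velocity = np.sin(2 * np.pi * 1.5 * t) + 0.5 * rng.normal(size=t.size)

freqs, psd = welch(velocity, fs=fs, nperseg=1024)   # power spectral density
print(f"dominant frequency ~ {freqs[np.argmax(psd)]:.2f} Hz")
```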

Relevance:

30.00%

Abstract:

Industrial robotic manipulators can be found in most factories today. Their tasks are accomplished by actively moving, placing and assembling parts. This movement is facilitated by actuators that apply a torque in response to a command signal. The presence of friction, and possibly backlash, has motivated the development of sophisticated compensation and control methods to achieve the desired performance, be that accurate motion tracking, fast movement or contact with the environment. This thesis presents a dual-drive actuator design that is capable of physically linearising friction and hence eliminating the need for complex compensation algorithms. A number of mathematical models are derived that allow the actuator dynamics to be simulated. The actuator may be constructed using geared DC motors, in which case the benefit of torque magnification is retained while the increased non-linear friction effects are also linearised. An additional benefit of the actuator is the high-quality, low-latency output position signal provided by differencing the two drive positions. Due to this, and the linearised nature of friction, the actuator is well suited to low-velocity, stop-start applications, micro-manipulation and even hard-contact tasks. There are, however, disadvantages to the design. When idle, the device consumes power while many single-drive actuators do not. The complexity of the models also means that parameterisation is difficult, and the management of start-up conditions still poses a challenge.

Relevance:

30.00%

Abstract:

In this paper we propose a new lifetime distribution which can handle bathtub-shaped, unimodal, increasing and decreasing hazard rate functions. The model has three parameters and generalizes the exponential power distribution proposed by Smith and Bain (1975) with the inclusion of an additional shape parameter. The maximum likelihood estimation procedure is discussed. A small-scale simulation study examines the performance of the likelihood ratio statistics under small and moderate sample sizes. Three real datasets illustrate the methodology. (C) 2010 Elsevier B.V. All rights reserved.
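
Since the abstract does not give the new three-parameter density, the sketch below fits the two-parameter Smith and Bain (1975) exponential power base model by maximum likelihood; the generalization would add one more shape parameter to the same procedure. Synthetic data, assumed true parameters:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, t):
    """Negative log-likelihood of the Smith-Bain exponential power model,
    with survival S(t) = exp(1 - exp((t/alpha)^beta))."""
    alpha, beta = params
    if alpha <= 0 or beta <= 0:
        return np.inf
    u = (t / alpha) ** beta
    logf = np.log(beta / alpha) + (beta - 1) * np.log(t / alpha) + u + 1 - np.exp(u)
    return -np.sum(logf)

rng = np.random.default_rng(0)
# synthetic lifetimes by inverse transform of F(t) = 1 - exp(1 - exp((t/a)^b))
a_true, b_true = 2.0, 1.5
uu = rng.uniform(size=500)
t = a_true * np.log(1 - np.log(1 - uu)) ** (1 / b_true)

res = minimize(neg_log_lik, x0=[1.0, 1.0], args=(t,), method="Nelder-Mead")
print("MLE (alpha, beta):", res.x)
```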

Relevance:

30.00%

Abstract:

We introduce in this paper a new class of discrete generalized nonlinear models to extend the binomial, Poisson and negative binomial models to cope with count data. This class includes important cases such as log-nonlinear models; logit, probit and negative binomial nonlinear models; and generalized Poisson and generalized negative binomial regression models, among others, enabling a wide range of models to be fitted to count data. We derive an iterative process for fitting these models by maximum likelihood and discuss inference on the parameters. The usefulness of the new class of models is illustrated with an application to a real data set. (C) 2008 Elsevier B.V. All rights reserved.
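
As one concrete member of this class, the sketch below fits a Poisson nonlinear regression by direct maximization of the likelihood; the mean function and data are illustrative assumptions, and the paper's own iterative scheme is not reproduced here:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

def mu(beta, x):
    """Nonlinear mean function (assumed form): mu = b0 * exp(b1 * x)."""
    return beta[0] * np.exp(beta[1] * x)

def neg_log_lik(beta, x, y):
    """Poisson negative log-likelihood with nonlinear mean."""
    m = mu(beta, x)
    if np.any(m <= 0):
        return np.inf
    return -np.sum(y * np.log(m) - m - gammaln(y + 1))

rng = np.random.default_rng(0)
x = rng.uniform(0, 2, 200)
y = rng.poisson(mu([1.5, 0.8], x))          # synthetic counts

res = minimize(neg_log_lik, x0=[1.0, 0.5], args=(x, y), method="Nelder-Mead")
print("estimated (b0, b1):", res.x)
```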

Relevance:

30.00%

Abstract:

The representation of interfaces by means of the algebraic moving-least-squares (AMLS) technique is addressed. This technique, in which the interface is represented by an unconnected set of points, is interesting for evolving fluid interfaces since there is no surface connectivity. The position of the surface points can thus be updated without concerns about the quality of any surface triangulation. We introduce a novel AMLS technique especially designed for evolving-interface applications, which we denote RAMLS (for Robust AMLS). The main advantages with respect to previous AMLS techniques are increased robustness, computational efficiency, and freedom from user-tuned parameters. Further, we propose a new front-tracking method based on the Lagrangian advection of the unconnected point set that defines the RAMLS surface. We assume that a background Eulerian grid is defined with some grid spacing h. The advection of the point set makes the surface evolve in time. The point cloud can be regenerated at any time (in particular, we regenerate it each time step) by intersecting the gridlines with the evolved surface, which guarantees that the density of points on the surface is always well balanced. The intersection algorithm is essentially a ray-tracing algorithm, well studied in computer graphics, in which a line (ray) is traced so as to detect all intersections with a surface. The tracing of each gridline is independent and can thus be performed in parallel. Several tests are reported, assessing first the accuracy of the proposed RAMLS technique and then of the front-tracking method based on it. Comparison with previous Eulerian, Lagrangian and hybrid techniques encourages further development of the proposed method for fluid mechanics applications. (C) 2008 Elsevier Inc. All rights reserved.
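
The regeneration step is easy to illustrate in two dimensions: trace each gridline, detect sign changes of an implicit representation of the surface, and place a point at each crossing. A minimal sketch with an implicit circle standing in for the RAMLS surface (the real method intersects gridlines with the MLS surface itself):

```python
import numpy as np

def phi(x, y):
    """Implicit surface: circle of radius 0.6 (assumed test shape)."""
    return x**2 + y**2 - 0.6**2

h = 0.05                                  # background grid spacing
g = np.arange(-1.0, 1.0 + h, h)

points = []
for y0 in g:                              # trace horizontal gridlines
    vals = phi(g, y0)
    for i in np.flatnonzero(np.sign(vals[:-1]) != np.sign(vals[1:])):
        # linear interpolation of the crossing along the gridline
        t = vals[i] / (vals[i] - vals[i + 1])
        points.append((g[i] + t * h, y0))
# vertical gridlines would be traced the same way, independently
print(f"regenerated {len(points)} surface points")
```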

Relevance:

30.00%

Abstract:

Purpose - The purpose of this paper is to develop a novel unstructured simulation approach for injection molding processes described by the Hele-Shaw model. Design/methodology/approach - The scheme involves dual dynamic meshes with active and inactive cells determined from an initial background pointset. The quasi-static pressure solution in each timestep for this evolving unstructured mesh system is approximated using a control volume finite element method formulation coupled to a corresponding modified volume of fluid method. The flow is considered to be isothermal and non-Newtonian. Findings - Supporting numerical tests and performance studies for polystyrene described by Carreau, Cross, Ellis and Power-law fluid models are conducted. Results for the present method are shown to be comparable to those from other methods for both Newtonian fluid and polystyrene fluid injected in different mold geometries. Research limitations/implications - With respect to the methodology, the background pointset infers a mesh that is dynamically reconstructed here, and there are a number of efficiency issues and improvements that would be relevant to industrial applications. For instance, one can use the pointset to construct special bases and invoke a so-called "meshless" scheme using the basis. This would require some interesting strategies to deal with the dynamic point enrichment of the moving front that could benefit from the present front treatment strategy. There are also issues related to mass conservation and fill-time errors that might be addressed by introducing suitable projections. The general question of "rate of convergence" of these schemes requires analysis. Numerical results here suggest first-order accuracy and are consistent with the approximations made, but theoretical results are not available yet for these methods. Originality/value - This novel unstructured simulation approach involves dual meshes with active and inactive cells determined from an initial background pointset: local active dual patches are constructed "on-the-fly" for each "active point" to form a dynamic virtual mesh of active elements that evolves with the moving interface.
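
For reference, the generalized-Newtonian viscosity laws named in the findings have standard closed forms. A short sketch with illustrative parameter values (not the paper's polystyrene fits):

```python
import numpy as np

def power_law(gdot, K=1e3, n=0.4):
    """Power-law: eta = K * gdot^(n-1)."""
    return K * gdot ** (n - 1)

def carreau(gdot, eta0=1e4, eta_inf=0.0, lam=0.1, n=0.4):
    """Carreau: eta = eta_inf + (eta0-eta_inf) * (1 + (lam*gdot)^2)^((n-1)/2)."""
    return eta_inf + (eta0 - eta_inf) * (1 + (lam * gdot) ** 2) ** ((n - 1) / 2)

def cross(gdot, eta0=1e4, lam=0.1, n=0.4):
    """Cross: eta = eta0 / (1 + (lam*gdot)^(1-n))."""
    return eta0 / (1 + (lam * gdot) ** (1 - n))

gdot = np.logspace(-1, 4, 6)     # shear rates, 1/s
for name, f in [("power-law", power_law), ("Carreau", carreau), ("Cross", cross)]:
    print(name, np.round(f(gdot), 2))
```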

Relevance:

30.00%

Abstract:

We investigate the combined influence of quenched randomness and dissipation on a quantum critical point with O(N) order-parameter symmetry. Utilizing a strong-disorder renormalization group, we determine the critical behavior in one space dimension exactly. For super-ohmic dissipation, we find a Kosterlitz-Thouless type transition with conventional (power-law) dynamical scaling. The dynamical critical exponent depends on the spectral density of the dissipative baths. We also discuss the Griffiths singularities, and we determine observables.

Relevance:

30.00%

Abstract:

In this paper we present a novel approach for multispectral image contextual classification by combining iterative combinatorial optimization algorithms. The pixel-wise decision rule is defined using a Bayesian approach that combines two MRF models: a Gaussian Markov Random Field (GMRF) for the observations (likelihood) and a Potts model for the a priori knowledge, to regularize the solution in the presence of noisy data. Hence, the classification problem is stated within a Maximum a Posteriori (MAP) framework. To approximate the MAP solution we apply several combinatorial optimization methods with multiple simultaneous initializations, making the solution less sensitive to the initial conditions and reducing both computational cost and time in comparison to Simulated Annealing, which is often infeasible in many real image-processing applications. The Markov Random Field model parameters are estimated by the Maximum Pseudo-Likelihood (MPL) approach, avoiding manual adjustment of the regularization parameters. Asymptotic evaluations assess the accuracy of the proposed parameter estimation procedure. To test and evaluate the proposed classification method, we adopt metrics for quantitative performance assessment (Cohen's Kappa coefficient), allowing a robust and accurate statistical analysis. The obtained results clearly show that combining sub-optimal contextual algorithms significantly improves the classification performance, indicating the effectiveness of the proposed methodology. (C) 2010 Elsevier B.V. All rights reserved.
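
One of the sub-optimal optimizers typically combined in such schemes is ICM (iterated conditional modes). A minimal sketch of ICM sweeps for a MAP labelling under a Gaussian likelihood and a Potts prior, on a synthetic image with assumed parameters; this illustrates the energy being minimized, not the authors' full multi-initialization procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
means = np.array([0.0, 1.0, 2.0])           # class means (assumed)
sigma, beta = 0.5, 1.0                      # noise std, Potts regularization

truth = (np.arange(32)[:, None] // 11 + np.arange(32)[None, :] // 11) % 3
img = means[truth] + rng.normal(0, sigma, truth.shape)

labels = np.argmin((img[..., None] - means) ** 2, axis=-1)  # ML initialization
for _ in range(10):                         # ICM sweeps (interior pixels only)
    for i in range(1, 31):
        for j in range(1, 31):
            nbrs = [labels[i-1, j], labels[i+1, j], labels[i, j-1], labels[i, j+1]]
            # posterior energy = Gaussian likelihood term + Potts disagreement term
            e = ((img[i, j] - means) ** 2) / (2 * sigma**2) \
                + beta * np.array([sum(k != n for n in nbrs) for k in range(3)])
            labels[i, j] = int(np.argmin(e))
print("pixel agreement with truth:", np.mean(labels == truth))
```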

Relevance:

30.00%

Abstract:

For a fixed family F of graphs, an F-packing in a graph G is a set of pairwise vertex-disjoint subgraphs of G, each isomorphic to an element of F. Finding an F-packing that maximizes the number of covered edges is a natural generalization of the maximum matching problem, which is just the case F = {K_2}. In this paper we provide new approximation algorithms and hardness results for the K_r-packing problem, where K_r denotes the family {K_2, K_3, ..., K_r}. We show that already for r = 3 the K_r-packing problem is APX-complete, and, in fact, we show that it remains so even for graphs with maximum degree 4. On the positive side, we give an approximation algorithm with approximation ratio at most 2 for every fixed r. For r = 3, 4, 5 we obtain better approximations. For r = 3 we obtain a simple 3/2-approximation, achieving a known ratio that follows from a more involved algorithm of Halldorsson. For r = 4, we obtain a (3/2 + ε)-approximation, and for r = 5 we obtain a (25/14 + ε)-approximation. (C) 2008 Elsevier B.V. All rights reserved.
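
To make the objective concrete, the sketch below runs a naive greedy heuristic for F = {K_2, K_3}: pack vertex-disjoint triangles first, then edges, counting covered edges. This only illustrates the packing notion; it is not the paper's approximation algorithm and carries no ratio guarantee.

```python
def greedy_k3_packing(n, edges):
    """Greedy: vertex-disjoint triangles first, then a maximal matching."""
    adj = {v: set() for v in range(n)}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    used, covered = set(), 0
    for u, v in edges:                      # try to grow each edge into a K_3
        if u in used or v in used:
            continue
        for w in (adj[u] & adj[v]) - used:
            used |= {u, v, w}
            covered += 3                    # a triangle covers 3 edges
            break
    for u, v in edges:                      # leftover edges: maximal matching
        if u not in used and v not in used:
            used |= {u, v}
            covered += 1
    return covered

# two triangles joined by a bridge: the heuristic covers all 6 triangle edges
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (4, 5), (3, 5)]
print("edges covered:", greedy_k3_packing(6, edges))
```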