865 results for Maximum power point tracking


Relevance: 30.00%

Abstract:

A method of automatically identifying and tracking polar-cap plasma patches, utilising data inversion and feature-tracking methods, is presented. A well-established and widely used 4-D ionospheric imaging algorithm, the Multi-Instrument Data Assimilation System (MIDAS), inverts slant total electron content (TEC) data from ground-based Global Navigation Satellite System (GNSS) receivers to produce images of the free electron distribution in the polar-cap ionosphere. These are integrated to form vertical TEC maps. A flexible feature-tracking algorithm, TRACK, previously used extensively in meteorological storm-tracking studies is used to identify and track maxima in the resulting 2-D data fields. Various criteria are used to discriminate between genuine patches and "false-positive" maxima such as the continuously moving day-side maximum, which results from the Earth's rotation rather than plasma motion. Results for a 12-month period at solar minimum, when extensive validation data are available, are presented. The method identifies 71 separate structures consistent with patch motion during this time. The limitations of solar minimum and the consequent small number of patches make climatological inferences difficult, but the feasibility of the method for patches larger than approximately 500 km in scale is demonstrated and a larger study incorporating other parts of the solar cycle is warranted. Possible further optimisation of discrimination criteria, particularly regarding the definition of a patch in terms of its plasma concentration enhancement over the surrounding background, may improve results.
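
As a rough illustration of the patch-identification step, the sketch below finds local maxima in a synthetic 2-D vertical-TEC map and keeps only those exceeding a threshold enhancement over the surrounding background, in the spirit of the discrimination criteria described above. It is a toy stand-in, not the MIDAS inversion or the TRACK algorithm; the window sizes and the `min_ratio` threshold are invented.

```python
import numpy as np

def find_patches(tec, min_ratio=2.0):
    """Locate local maxima in a 2-D vertical-TEC map and keep only those
    whose value exceeds `min_ratio` times the median of a wider local
    background window. A toy stand-in for the MIDAS + TRACK pipeline."""
    rows, cols = tec.shape
    patches = []
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = tec[i - 1:i + 2, j - 1:j + 2]
            if tec[i, j] < window.max():
                continue  # not a local maximum of its 3x3 neighbourhood
            # background: median over an 11x11 region around the candidate
            lo_i, hi_i = max(i - 5, 0), min(i + 6, rows)
            lo_j, hi_j = max(j - 5, 0), min(j + 6, cols)
            background = np.median(tec[lo_i:hi_i, lo_j:hi_j])
            if background > 0 and tec[i, j] / background >= min_ratio:
                patches.append((i, j))
    return patches

# synthetic map: flat background with one strong enhancement
field = np.ones((40, 40))
field[20, 20] = 5.0
print(find_patches(field))  # -> [(20, 20)]
```

A flat field yields no detections, since every point ties its neighbourhood maximum but fails the background-enhancement test; this mirrors the role of the enhancement criterion in rejecting "false-positive" maxima.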

Relevance: 30.00%

Abstract:

Palaeodata in synthesis form are needed as benchmarks for the Palaeoclimate Modelling Intercomparison Project (PMIP). Advances since the last synthesis of terrestrial palaeodata from the last glacial maximum (LGM) call for a new evaluation, especially of data from the tropics. Here pollen, plant-macrofossil, lake-level, noble gas (from groundwater) and δ18O (from speleothems) data are compiled for 18±2 ka (14C), 32 °N–33 °S. The reliability of the data was evaluated using explicit criteria and some types of data were re-analysed using consistent methods in order to derive a set of mutually consistent palaeoclimate estimates of mean temperature of the coldest month (MTCO), mean annual temperature (MAT), plant available moisture (PAM) and runoff (P-E). Cold-month temperature (MTCO) anomalies from plant data range from −1 to −2 K near sea level in Indonesia and the S Pacific, through −6 to −8 K at many high-elevation sites to −8 to −15 K in S China and the SE USA. MAT anomalies from groundwater or speleothems seem more uniform (−4 to −6 K), but the data are as yet sparse; a clear divergence between MAT and cold-month estimates from the same region is seen only in the SE USA, where cold-air advection is expected to have enhanced cooling in winter. Regression of all cold-month anomalies against site elevation yielded an estimated average cooling of −2.5 to −3 K at modern sea level, increasing to ≈−6 K by 3000 m. However, Neotropical sites showed larger-than-average sea-level cooling (−5 to −6 K) and a non-significant elevation effect, whereas W and S Pacific sites showed much less sea-level cooling (−1 K) and a stronger elevation effect. These findings support the inference that tropical sea-surface temperatures (SSTs) were lower than the CLIMAP estimates, but they limit the plausible average tropical sea-surface cooling, and they support the existence of CLIMAP-like geographic patterns in SST anomalies.
Trends of PAM and lake levels indicate wet LGM conditions in the W USA, and at the highest elevations, with generally dry conditions elsewhere. These results suggest a colder-than-present ocean surface producing a weaker hydrological cycle, more arid continents, and arguably steeper-than-present terrestrial lapse rates. Such linkages are supported by recent observations on freezing-level height and tropical SSTs; moreover, simulations of “greenhouse” and LGM climates point to several possible feedback processes by which low-level temperature anomalies might be amplified aloft.
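
The elevation regression can be illustrated with a minimal least-squares fit. The anomaly values below are hypothetical stand-ins chosen to resemble the quoted ranges (≈−3 K at sea level, ≈−6 K by 3000 m), not the actual site data.

```python
# Ordinary least-squares fit of cold-month temperature anomaly (K) against
# site elevation (m). The data points are illustrative, not the real sites.
def ols(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
            / sum((xi - mx) ** 2 for xi in x)
    intercept = my - slope * mx
    return intercept, slope

elev = [0, 500, 1000, 1500, 2000, 2500, 3000]          # m above sea level
anom = [-2.8, -3.3, -3.9, -4.3, -4.9, -5.4, -6.0]      # hypothetical K
a, b = ols(elev, anom)
# sea-level anomaly and extra cooling accumulated by 3000 m
print(round(a, 2), round(b * 3000, 2))  # -> -2.79 -3.17
```

The intercept estimates the sea-level anomaly and the slope times 3000 m the additional high-elevation cooling, which is how the "steeper-than-present lapse rate" inference is framed.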

Relevance: 30.00%

Abstract:

Mineral dust aerosols in the atmosphere have the potential to affect the global climate by influencing the radiative balance of the atmosphere and the supply of micronutrients to the ocean. Ice and marine sediment cores indicate that dust deposition from the atmosphere was at some locations 2–20 times greater during glacial periods, raising the possibility that mineral aerosols might have contributed to climate change on glacial-interglacial time scales. To address this question, we have used linked terrestrial biosphere, dust source, and atmospheric transport models to simulate the dust cycle in the atmosphere for current and last glacial maximum (LGM) climates. We obtain a 2.5-fold higher dust loading in the entire atmosphere and a twenty-fold higher loading in high latitudes, in LGM relative to present. Comparisons to a compilation of atmospheric dust deposition flux estimates for LGM and present in marine sediment and ice cores show that the simulated flux ratios are broadly in agreement with observations; differences suggest where further improvements in the simple dust model could be made. The simulated increase in high-latitude dustiness depends on the expansion of unvegetated areas, especially in the high latitudes and in central Asia, caused by a combination of increased aridity and low atmospheric [CO2]. The existence of these dust source areas at the LGM is supported by pollen data and loess distribution in the northern continents. These results point to a role for vegetation feedbacks, including climate effects and physiological effects of low [CO2], in modulating the atmospheric distribution of dust.

Relevance: 30.00%

Abstract:

Using monthly time-series data for 1999–2013, the paper shows that markets for agricultural commodities provide a yardstick for real purchasing power, and thus a reference point for the real value of fiat currencies. The daily need for each adult to consume about 2800 food calories is universal; data from FAO food balance sheets confirm that the world basket of food consumed daily is non-volatile in comparison to the volatility of currency exchange rates, and so the replacement cost of food consumed provides a consistent indicator of economic value. Food commodities are storable for short periods, but ultimately perishable, and this exerts continual pressure for markets to clear in the short term; moreover, food calories can be obtained from a very large range of foodstuffs, and so most households are able to use arbitrage to select a near-optimal weighting of quantities purchased. The paper proposes an original method to enable a standard of value to be established, definable in physical units on the basis of actual worldwide consumption of food goods, with an illustration of the method.
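
A minimal sketch of the proposed yardstick, under invented numbers: price a fixed 2800-kcal daily basket in a given currency and use its replacement cost as the unit of value. The basket shares and per-kcal prices below are hypothetical, not taken from the paper or from FAO data.

```python
# Sketch of the calorie-basket standard of value. All figures are
# hypothetical placeholders for the FAO-derived basket in the paper.
DAILY_KCAL = 2800

# share of daily calories supplied by each commodity (sums to 1.0)
basket = {"wheat": 0.5, "rice": 0.3, "maize": 0.2}

def basket_cost(prices_per_kcal):
    """Replacement cost of one day's calories in a given currency,
    given the local price per kcal of each basket commodity."""
    return DAILY_KCAL * sum(share * prices_per_kcal[c]
                            for c, share in basket.items())

# hypothetical local prices, currency units per kcal
cost = basket_cost({"wheat": 0.0004, "rice": 0.0006, "maize": 0.0003})
print(round(cost, 2))  # -> 1.23
```

Comparing `basket_cost` across currencies at prevailing local prices would give the exchange-rate-independent value comparison the paper argues for.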

Relevance: 30.00%

Abstract:

This article proposes a systematic approach to determine the most suitable analogue redesign method to be used for forward-type converters under digital voltage mode control. The focus of the method is to achieve the highest phase margin at the particular switching and crossover frequencies chosen by the designer. It is shown that at high crossover frequencies with respect to switching frequency, controllers designed using backward integration have the largest phase margin, whereas at low crossover frequencies with respect to switching frequency, controllers designed using bilinear integration with pre-warping have the largest phase margin. An algorithm has been developed to determine the frequency of the crossing point where the recommended discretisation method changes. An accurate model of the power stage is used for simulation and experimental results from a Buck converter are collected. The performance of the digital controllers is compared to that of the equivalent analogue controller both in simulation and experiment. Excellent agreement between the simulation and experimental results is demonstrated. This work provides a concrete example to allow academics and engineers to systematically choose a discretisation method.
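
A minimal sketch of the underlying mechanism, assuming a plain integrator 1/s rather than a full controller: backward (Euler) integration introduces a phase lead of ωT/2 that grows with the crossover-to-switching-frequency ratio, while the bilinear (Tustin) mapping with pre-warping reproduces the continuous-time phase exactly at the pre-warp frequency. The paper's crossover-dependent recommendation involves full controllers and delays, which this toy does not model; the frequencies below are illustrative.

```python
import cmath, math

def phase_deg(H):
    return math.degrees(cmath.phase(H))

def backward_integrator(w, T):
    """Backward-Euler discretisation of 1/s: H(z) = T*z / (z - 1)."""
    z = cmath.exp(1j * w * T)
    return T * z / (z - 1)

def tustin_prewarp_integrator(w, T, wp):
    """Bilinear (Tustin) discretisation of 1/s, pre-warped at wp:
    s -> (wp / tan(wp*T/2)) * (z - 1) / (z + 1)."""
    z = cmath.exp(1j * w * T)
    k = wp / math.tan(wp * T / 2)
    return (z + 1) / (k * (z - 1))

fs = 100e3                     # illustrative switching/sampling frequency
T = 1 / fs
for fc in (5e3, 20e3):         # low and high crossover relative to fs
    wc = 2 * math.pi * fc
    print(f"fc = {fc/1e3:.0f} kHz: backward "
          f"{phase_deg(backward_integrator(wc, T)):.1f} deg, "
          f"Tustin+pre-warp "
          f"{phase_deg(tustin_prewarp_integrator(wc, T, wc)):.1f} deg")
```

At fc = 20 kHz the backward integrator shows −54° (a 36° lead over the analogue −90°), while the pre-warped Tustin integrator stays at exactly −90°; the lead shrinks to 9° at fc = 5 kHz, which is why the preferred mapping changes with the crossover frequency.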

Relevance: 30.00%

Abstract:

Fire activity has varied globally and continuously since the last glacial maximum (LGM) in response to long-term changes in global climate and shorter-term regional changes in climate, vegetation, and human land use. We have synthesized sedimentary charcoal records of biomass burning since the LGM and present global maps showing changes in fire activity for time slices during the past 21,000 years (as differences in charcoal accumulation values compared to pre-industrial). There is strong broad-scale coherence in fire activity after the LGM, but spatial heterogeneity in the signals increases thereafter. In North America, Europe and southern South America, charcoal records indicate less-than-present fire activity during the deglacial period, from 21,000 to ∼11,000 cal yr BP. In contrast, the tropical latitudes of South America and Africa show greater-than-present fire activity from ∼19,000 to ∼17,000 cal yr BP and most sites from Indochina and Australia show greater-than-present fire activity from 16,000 to ∼13,000 cal yr BP. Many sites indicate greater-than-present or near-present activity during the Holocene with the exception of eastern North America and eastern Asia from 8,000 to ∼3,000 cal yr BP, Indonesia and Australia from 11,000 to 4,000 cal yr BP, and southern South America from 6,000 to 3,000 cal yr BP where fire activity was less than present. Regional coherence in the patterns of change in fire activity was evident throughout the post-glacial period. These complex patterns can largely be explained in terms of large-scale climate controls modulated by local changes in vegetation and fuel load.
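
A toy version of the anomaly calculation can clarify how "differences compared to pre-industrial" are formed: each time slice of a charcoal record is expressed relative to its pre-industrial mean, scaled by the record's standard deviation. The base-period window and the record values below are invented, not from the synthesis.

```python
# Express a single charcoal record as standardized anomalies relative to a
# pre-industrial base period (here 0-0.2 ka, an assumed window).
def anomalies(ages_ka, charcoal, base=(0.0, 0.2)):
    base_vals = [c for a, c in zip(ages_ka, charcoal) if base[0] <= a <= base[1]]
    base_mean = sum(base_vals) / len(base_vals)
    mean = sum(charcoal) / len(charcoal)
    sd = (sum((c - mean) ** 2 for c in charcoal) / len(charcoal)) ** 0.5
    return [(c - base_mean) / sd for c in charcoal]

ages = [0.1, 3.0, 6.0, 11.0, 17.0, 21.0]     # thousands of cal yr BP
counts = [4.0, 5.0, 6.0, 3.0, 2.0, 1.0]      # invented accumulation values
zs = anomalies(ages, counts)
print([round(z, 2) for z in zs])
```

In this invented record the LGM slice (21 ka) falls below the pre-industrial base, i.e. a "less-than-present" anomaly of the kind mapped for the deglacial period.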

Relevance: 30.00%

Abstract:

Verbal communication is essential to human society and civilization. Non-verbal communication, on the other hand, is used more widely, not only by humans but also by other kinds of animals, and its information content is estimated to be even larger than that of verbal communication. Among forms of non-verbal communication, mutual motion is the simplest and the easiest to study experimentally and analytically. We measured the power spectrum of hand velocity under various conditions and clarified the following points about the feed-back and feed-forward mechanisms, as basic knowledge for understanding the conditions of good communication.
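
The power-spectrum measurement can be sketched with a plain discrete Fourier transform applied to a velocity record; the synthetic 2 Hz "hand velocity" signal below is an assumption for illustration, not the authors' data or analysis code.

```python
import cmath, math

def power_spectrum(x, dt):
    """Discrete power spectrum via a plain DFT (O(N^2), fine for short
    records). Returns (frequency_Hz, power) pairs up to Nyquist."""
    n = len(x)
    spec = []
    for k in range(n // 2 + 1):
        coeff = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        spec.append((k / (n * dt), abs(coeff) ** 2 / n))
    return spec

# synthetic "hand velocity": a 2 Hz oscillation sampled at 100 Hz for 2 s
dt = 0.01
signal = [math.sin(2 * math.pi * 2.0 * t * dt) for t in range(200)]
peak_freq = max(power_spectrum(signal, dt), key=lambda fp: fp[1])[0]
print(peak_freq)  # peak at ~2 Hz
```

A dominant spectral peak at the oscillation frequency is exactly the kind of feature such an analysis would compare across experimental conditions.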

Relevance: 30.00%

Abstract:

In this paper we propose a new lifetime distribution which can handle bathtub-shaped, unimodal, increasing and decreasing hazard rate functions. The model has three parameters and generalizes the exponential power distribution proposed by Smith and Bain (1975) with the inclusion of an additional shape parameter. The maximum likelihood estimation procedure is discussed. A small-scale simulation study examines the performance of the likelihood ratio statistics under small and moderate sample sizes. Three real datasets illustrate the methodology.
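
The abstract does not give the three-parameter form, so as a hedged illustration of the maximum likelihood procedure the sketch below fits the two-parameter Smith–Bain (1975) exponential power distribution, F(t) = 1 − exp(1 − exp((t/α)^β)), which the model generalizes. The crude grid search stands in for a proper optimizer.

```python
import math, random

def ep_loglik(data, alpha, beta):
    """Log-likelihood of the Smith-Bain exponential power distribution,
    F(t) = 1 - exp(1 - exp((t/alpha)^beta))."""
    ll = 0.0
    for t in data:
        x = (t / alpha) ** beta
        ll += math.log(beta / alpha) + (beta - 1) * math.log(t / alpha) \
              + x + 1 - math.exp(x)
    return ll

def ep_sample(n, alpha, beta, rng):
    # inverse-CDF sampling: t = alpha * (ln(1 - ln(1 - u)))^(1/beta)
    return [alpha * math.log(1 - math.log(1 - rng.random())) ** (1 / beta)
            for _ in range(n)]

def ep_mle_grid(data, alphas, betas):
    # crude grid-search MLE, adequate for an illustration
    return max(((a, b) for a in alphas for b in betas),
               key=lambda ab: ep_loglik(data, *ab))

rng = random.Random(42)
data = ep_sample(1000, alpha=2.0, beta=1.5, rng=rng)
grid = [1.0 + 0.1 * i for i in range(21)]
a_hat, b_hat = ep_mle_grid(data, grid, grid)
print(a_hat, b_hat)  # close to the true (2.0, 1.5)
```

Replacing the grid with a gradient-based optimizer, and the density with the three-parameter generalization, would reproduce the estimation setting of the paper.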

Relevance: 30.00%

Abstract:

We introduce in this paper a new class of discrete generalized nonlinear models to extend the binomial, Poisson and negative binomial models to cope with count data. This class of models includes some important models such as log-nonlinear models, logit, probit and negative binomial nonlinear models, generalized Poisson and generalized negative binomial regression models, among other models, which enables the fitting of a wide range of models to count data. We derive an iterative process for fitting these models by maximum likelihood and discuss inference on the parameters. The usefulness of the new class of models is illustrated with an application to a real data set.
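
The iterative maximum likelihood process can be illustrated with the simplest member of such a class: Fisher scoring (iteratively reweighted least squares) for a log-linear Poisson regression. This is a standard textbook sketch, not the paper's more general algorithm, and the simulated data are invented.

```python
import numpy as np

def poisson_irls(X, y, iters=25):
    """Fit a log-linear Poisson regression by Fisher scoring (IRLS),
    the kind of iterative ML procedure used for such model classes."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        mu = np.exp(X @ beta)            # current fitted means
        W = mu                           # Poisson working weights
        z = X @ beta + (y - mu) / mu     # working response
        # weighted least-squares step: (X'WX) beta = X'W z
        XtW = X.T * W
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

# recover known coefficients from simulated counts
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(5000), rng.normal(size=5000)])
true_beta = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = poisson_irls(X, y)
print(beta_hat)  # close to [0.5, 0.8]
```

Extending the linear predictor `X @ beta` to a nonlinear function of the parameters, with the corresponding Jacobian in the weighted step, gives the nonlinear count-model fitting the abstract describes.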

Relevance: 30.00%

Abstract:

The representation of interfaces by means of the algebraic moving-least-squares (AMLS) technique is addressed. This technique, in which the interface is represented by an unconnected set of points, is interesting for evolving fluid interfaces since there is no surface connectivity. The position of the surface points can thus be updated without concerns about the quality of any surface triangulation. We introduce a novel AMLS technique especially designed for evolving-interfaces applications that we denote RAMLS (for Robust AMLS). The main advantages with respect to previous AMLS techniques are: increased robustness, computational efficiency, and being free of user-tuned parameters. Further, we propose a new front-tracking method based on the Lagrangian advection of the unconnected point set that defines the RAMLS surface. We assume that a background Eulerian grid is defined with some grid spacing h. The advection of the point set makes the surface evolve in time. The point cloud can be regenerated at any time (in particular, we regenerate it each time step) by intersecting the gridlines with the evolved surface, which guarantees that the density of points on the surface is always well balanced. The intersection algorithm is essentially a ray-tracing algorithm, well studied in computer graphics, in which a line (ray) is traced so as to detect all intersections with a surface. Also, the tracing of each gridline is independent and can thus be performed in parallel. Several tests are reported, assessing first the accuracy of the proposed RAMLS technique, and then of the front-tracking method based on it. Comparison with previous Eulerian, Lagrangian and hybrid techniques encourages further development of the proposed method for fluid mechanics applications.
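
A toy analogue of the point-cloud regeneration step: intersect the gridlines of a uniform background grid with a surface and collect the crossings as the new, well-balanced point set. Here the surface is an implicit circle and the crossings are refined by bisection; the real method traces rays against a point-based RAMLS surface, not an implicit function.

```python
def regenerate_points(phi, lo, hi, h):
    """Intersect the gridlines of a uniform grid on [lo, hi]^2 with the
    zero level of phi by detecting sign changes along each line and
    refining each crossing with bisection. A sketch of the gridline /
    surface intersection step, not the actual RAMLS ray tracer."""
    def bisect(f, a, b, n=40):
        for _ in range(n):
            m = 0.5 * (a + b)
            (a, b) = (a, m) if f(a) * f(m) <= 0 else (m, b)
        return 0.5 * (a + b)

    pts = []
    ticks = [lo + i * h for i in range(int((hi - lo) / h) + 1)]
    for c in ticks:                      # horizontal and vertical gridlines
        for t0, t1 in zip(ticks, ticks[1:]):
            if phi(t0, c) * phi(t1, c) < 0:     # crossing on line y = c
                pts.append((bisect(lambda x: phi(x, c), t0, t1), c))
            if phi(c, t0) * phi(c, t1) < 0:     # crossing on line x = c
                pts.append((c, bisect(lambda y: phi(c, y), t0, t1)))
    return pts

circle = lambda x, y: x * x + y * y - 1.0       # unit circle
cloud = regenerate_points(circle, -1.5, 1.5, 0.25)
# every regenerated point should lie on the circle to high accuracy
print(len(cloud), max(abs(circle(x, y)) for x, y in cloud))
```

Each gridline is processed independently, which is exactly the property the abstract exploits for parallel tracing.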

Relevance: 30.00%

Abstract:

Purpose - The purpose of this paper is to develop a novel unstructured simulation approach for injection molding processes described by the Hele-Shaw model. Design/methodology/approach - The scheme involves dual dynamic meshes with active and inactive cells determined from an initial background pointset. The quasi-static pressure solution in each timestep for this evolving unstructured mesh system is approximated using a control volume finite element method formulation coupled to a corresponding modified volume of fluid method. The flow is considered to be isothermal and non-Newtonian. Findings - Supporting numerical tests and performance studies for polystyrene described by Carreau, Cross, Ellis and Power-law fluid models are conducted. Results for the present method are shown to be comparable to those from other methods for both Newtonian fluid and polystyrene fluid injected in different mold geometries. Research limitations/implications - With respect to the methodology, the background pointset infers a mesh that is dynamically reconstructed here, and there are a number of efficiency issues and improvements that would be relevant to industrial applications. For instance, one can use the pointset to construct special bases and invoke a so-called "meshless" scheme using the basis. This would require some interesting strategies to deal with the dynamic point enrichment of the moving front that could benefit from the present front treatment strategy. There are also issues related to mass conservation and fill-time errors that might be addressed by introducing suitable projections. The general question of "rate of convergence" of these schemes requires analysis. Numerical results here suggest first-order accuracy and are consistent with the approximations made, but theoretical results are not available yet for these methods.
Originality/value - This novel unstructured simulation approach involves dual meshes with active and inactive cells determined from an initial background pointset: local active dual patches are constructed "on-the-fly" for each "active point" to form a dynamic virtual mesh of active elements that evolves with the moving interface.
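
Of the shear-thinning laws named in the Findings, the Carreau model is easy to state concretely. The sketch below evaluates it at a few shear rates; the parameter values are illustrative placeholders, not the polystyrene fit used in the paper.

```python
def carreau_viscosity(shear_rate, eta0, eta_inf, lam, n):
    """Carreau model: eta = eta_inf + (eta0 - eta_inf) *
    (1 + (lam * gamma)^2)^((n - 1)/2). With n < 1 the viscosity falls
    from the zero-shear plateau eta0 toward eta_inf as shear increases."""
    return eta_inf + (eta0 - eta_inf) * \
        (1 + (lam * shear_rate) ** 2) ** ((n - 1) / 2)

# an illustrative shear-thinning melt (parameters are assumptions)
for g in (1e-3, 1.0, 1e3):
    print(g, carreau_viscosity(g, eta0=1e4, eta_inf=1.0, lam=1.0, n=0.3))
```

In the Hele-Shaw pressure solve, a viscosity function like this one enters the flow conductance integrated across the gap, which is why the choice among Carreau, Cross, Ellis and Power-law models changes the computed fill pattern.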

Relevance: 30.00%

Abstract:

We investigate the combined influence of quenched randomness and dissipation on a quantum critical point with O(N) order-parameter symmetry. Utilizing a strong-disorder renormalization group, we determine the critical behavior in one space dimension exactly. For super-ohmic dissipation, we find a Kosterlitz-Thouless type transition with conventional (power-law) dynamical scaling. The dynamical critical exponent depends on the spectral density of the dissipative baths. We also discuss the Griffiths singularities, and we determine observables.

Relevance: 30.00%

Abstract:

We present a variable time step, fully adaptive in space, hybrid method for the accurate simulation of incompressible two-phase flows in the presence of surface tension in two dimensions. The method is based on the hybrid level set/front-tracking approach proposed in [H. D. Ceniceros and A. M. Roma, J. Comput. Phys., 205, 391-400, 2005]. Geometric, interfacial quantities are computed from front-tracking via the immersed-boundary setting while the signed distance (level set) function, which is evaluated fast and to machine precision, is used as a fluid indicator. The surface tension force is obtained by employing the mixed Eulerian/Lagrangian representation introduced in [S. Shin, S. I. Abdel-Khalik, V. Daru and D. Juric, J. Comput. Phys., 203, 493-516, 2005], whose success in greatly reducing parasitic currents has been demonstrated. The use of our accurate fluid indicator together with effective Lagrangian marker control enhances this parasitic current reduction by several orders of magnitude. To resolve sharp gradients and salient flow features accurately and efficiently we employ dynamic, adaptive mesh refinements. This spatial adaption is used in concert with a dynamic control of the distribution of the Lagrangian nodes along the fluid interface and a variable time step, linearly implicit time integration scheme. We present numerical examples designed to test the capabilities and performance of the proposed approach as well as three applications: the long-time evolution of a fluid interface undergoing Rayleigh-Taylor instability, an example of bubble ascending dynamics, and a drop impacting on a free interface whose dynamics we compare with both existing numerical and experimental data.
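
The front-tracking side of such hybrids represents the interface as a chain of Lagrangian markers, from which a distance function can be evaluated. The sketch below computes the (unsigned) distance from a point to a closed marker chain; it is a minimal stand-in for the fast, machine-precision signed-distance evaluation the method uses, and the circular marker set is an invented test case.

```python
import math

def distance_to_markers(p, markers):
    """Unsigned distance from point p to a closed chain of Lagrangian
    markers, taken as the distance to the nearest chain segment."""
    def seg_dist(p, a, b):
        ax, ay = a; bx, by = b; px, py = p
        vx, vy = bx - ax, by - ay
        # clamp the projection parameter onto the segment
        t = max(0.0, min(1.0, ((px - ax) * vx + (py - ay) * vy)
                              / (vx * vx + vy * vy)))
        cx, cy = ax + t * vx, ay + t * vy
        return math.hypot(px - cx, py - cy)
    return min(seg_dist(p, markers[i], markers[(i + 1) % len(markers)])
               for i in range(len(markers)))

# markers on a unit circle; the distance from the origin should be ~1
circle = [(math.cos(2 * math.pi * k / 200), math.sin(2 * math.pi * k / 200))
          for k in range(200)]
print(distance_to_markers((0.0, 0.0), circle))  # ~0.9999
```

Adding a sign (inside vs. outside the chain) turns this into the fluid indicator role played by the level set function in the hybrid scheme.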

Relevance: 30.00%

Abstract:

In this paper we introduce the Weibull power series (WPS) class of distributions, which is obtained by compounding Weibull and power series distributions, where the compounding procedure follows the same way previously carried out by Adamidis and Loukas (1998). This new class of distributions has as a particular case the two-parameter exponential power series (EPS) class of distributions (Chahkandi and Ganjali, 2009), which contains several lifetime models such as the exponential geometric (Adamidis and Loukas, 1998), exponential Poisson (Kus, 2007) and exponential logarithmic (Tahmasbi and Rezaei, 2008) distributions. The hazard function of our class can be increasing, decreasing and upside-down bathtub shaped, among others, while the hazard function of an EPS distribution is only decreasing. We obtain several properties of the WPS distributions, such as moments, order statistics, estimation by maximum likelihood and inference for large samples. Furthermore, the EM algorithm is also used to determine the maximum likelihood estimates of the parameters, and we discuss maximum entropy characterizations under suitable constraints. Special distributions are studied in some detail. Applications to two real data sets are given to show the flexibility and potentiality of the new class of distributions.
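
The compounding construction can be made concrete by direct simulation of one special case: take N from a geometric distribution (a power series distribution) and the lifetime as the minimum of N Weibull variables, giving a Weibull-geometric sample. This sketch illustrates the construction only; the parameter values and the interpretation of the check at the end are assumptions for the demo.

```python
import math, random

def weibull_geometric_sample(n, shape, scale, p, rng):
    """Draw from the Weibull-geometric special case of the WPS class by
    direct compounding: N ~ geometric on {1, 2, ...} (P(N=1) = 1-p),
    then T = min(X_1, ..., X_N) with X_i ~ Weibull(shape, scale)."""
    out = []
    for _ in range(n):
        N = 1
        while rng.random() < p:      # geometric number of latent risks
            N += 1
        out.append(min(scale * (-math.log(1 - rng.random())) ** (1 / shape)
                       for _ in range(N)))
    return out

rng = random.Random(7)
sample = weibull_geometric_sample(5000, shape=2.0, scale=1.0, p=0.5, rng=rng)
# compounding with min() shifts mass toward small lifetimes relative to a
# single Weibull(2, 1), whose mean is Gamma(1.5) ~ 0.886
print(sum(sample) / len(sample))
```

The same construction with N drawn from other power series distributions (Poisson, logarithmic, binomial) generates the other members of the class.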

Relevance: 30.00%

Abstract:

I consider the case for genuinely anonymous web searching. Big data seems to have it in for privacy. The story is well known, particularly since the dawn of the web. Vastly more personal information, monumental and quotidian, is gathered than in the pre-digital days. Once gathered it can be aggregated and analyzed to produce rich portraits, which in turn permit unnerving prediction of our future behavior. The new information can then be shared widely, limiting prospects and threatening autonomy. How should we respond? Following Nissenbaum (2011) and Brunton and Nissenbaum (2011 and 2013), I will argue that the proposed solutions—consent, anonymity as conventionally practiced, corporate best practices, and law—fail to protect us against routine surveillance of our online behavior. Brunton and Nissenbaum rightly maintain that, given the power imbalance between data holders and data subjects, obfuscation of one’s online activities is justified. Obfuscation works by generating “misleading, false, or ambiguous data with the intention of confusing an adversary or simply adding to the time or cost of separating good data from bad,” thus decreasing the value of the data collected (Brunton and Nissenbaum, 2011). The phenomenon is as old as the hills. Natural selection evidently blundered upon the tactic long ago. Take a savory butterfly whose markings mimic those of a toxic cousin. From the point of view of a would-be predator the data conveyed by the pattern is ambiguous. Is the bug lunch or potential last meal? In the light of the steep costs of a mistake, the savvy predator goes hungry. Online obfuscation works similarly, attempting for instance to disguise the surfer’s identity (Tor) or the nature of her queries (Howe and Nissenbaum 2009). Yet online obfuscation comes with significant social costs. First, it implies free riding. 
If I’ve installed an effective obfuscating program, I’m enjoying the benefits of an apparently free internet without paying the costs of surveillance, which are shifted entirely onto non-obfuscators. Second, it permits sketchy actors, from child pornographers to fraudsters, to operate with near impunity. Third, online merchants could plausibly claim that, when we shop online, surveillance is the price we pay for convenience. If we don’t like it, we should take our business to the local brick-and-mortar and pay with cash. Brunton and Nissenbaum have not fully addressed the last two costs. Nevertheless, I think the strict defender of online anonymity can meet these objections. Regarding the third, the future doesn’t bode well for offline shopping. Consider music and books. Intrepid shoppers can still find most of what they want in a book or record store. Soon, though, this will probably not be the case. And then there are those who, for perfectly good reasons, are sensitive about doing some of their shopping in person, perhaps because of their weight or sexual tastes. I argue that consumers should not have to pay the price of surveillance every time they want to buy that catchy new hit, that New York Times bestseller, or a sex toy.