77 results for Stand-Alone and Grid Connected PV applications


Relevance:

100.00%

Publisher:

Abstract:

As terabyte datasets become the norm, the focus has shifted away from our ability to produce and store ever larger amounts of data and onto its utilization. It is becoming increasingly difficult to gain meaningful insights into the data produced, and many forms of the data we currently produce do not fit easily into traditional visualization methods. This paper presents a novel visualization technique based on the concept of a Data Forest. Our Data Forest has been designed to use virtual reality (VR) as its presentation method; VR is a natural medium for investigating large datasets. Our approach can easily be adapted for use in a variety of settings, from a stand-alone single-user environment to large multi-user collaborative environments. A test application using multi-dimensional data is presented to demonstrate the concepts involved.

Relevance:

100.00%

Publisher:

Abstract:

Our eyes are input sensors that provide our brains with streams of visual data. They have evolved to be extremely efficient, constantly darting to and fro to rapidly build up a picture of the salient entities in a viewed scene. These actions are almost subconscious. However, they can provide telling signs of how the brain is decoding the visuals and can indicate emotional responses before the viewer becomes aware of them. In this paper we discuss a method of tracking a user's eye movements and using these to calculate their gaze within an immersive virtual environment. We investigate how these gaze patterns can be captured and used to identify viewed virtual objects, and discuss how this can serve as a natural method of interacting with the virtual environment. We describe a flexible tool that has been developed to achieve this, and detail initial validating applications that prove the concept.
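
The abstract does not give implementation details of the gaze-to-object mapping; as a rough illustration of the idea only, the sketch below casts a gaze ray from a tracked eye position and direction and tests it against bounding spheres of scene objects to decide which virtual object is being viewed. All identifiers and numbers here are hypothetical, not taken from the tool described.

```python
# Illustrative sketch only: identify which virtual object a gaze ray hits.
# Object and parameter names are hypothetical, not from the paper's tool.
import numpy as np

def gaze_target(eye_pos, gaze_dir, objects):
    """Return the nearest object whose bounding sphere the gaze ray intersects.

    eye_pos  : (3,) array, eye position in world coordinates
    gaze_dir : (3,) array, gaze direction (from eye tracker plus head pose)
    objects  : list of (name, centre (3,), radius) tuples
    """
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_t = None, np.inf
    for name, centre, radius in objects:
        oc = np.asarray(centre) - eye_pos
        t = np.dot(oc, gaze_dir)              # distance along the ray to closest approach
        if t < 0:
            continue                          # object is behind the viewer
        d2 = np.dot(oc, oc) - t * t           # squared distance from ray to sphere centre
        if d2 <= radius ** 2 and t < best_t:  # ray passes within the bounding sphere
            best, best_t = name, t
    return best

# Example: viewer at the origin looking down +z, two objects in the scene.
objects = [("tree", (0.1, 0.0, 5.0), 0.5), ("rock", (2.0, 0.0, 5.0), 0.5)]
print(gaze_target(np.zeros(3), np.array([0.0, 0.0, 1.0]), objects))  # -> "tree"
```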

Relevance:

100.00%

Publisher:

Abstract:

The diagnosis of thalassaemia in archaeological populations has long been hindered by a lack of pathognomonic features and by the non-specific nature of cribra orbitalia and porotic hyperostosis. Clinical research, however, has highlighted more specific diagnostic criteria for thalassaemia major and intermedia based on changes to the thorax ('rib-within-a-rib' and costal osteomas). A recent re-examination of 364 child skeletons from Romano-British Poundbury Camp, Dorset, revealed children with general 'wasting' of the bones and three children who demonstrated a variety of severe lesions (e.g. zygomatic bone and rib hypertrophy, porotic hyperostosis, rib lesions, osteopenia and pitted diaphyseal shafts) that are inconsistent with dietary deficiency alone and more consistent with a diagnosis of genetic anaemia. Two of these children displayed rib lesions typical of those seen in modern cases of thalassaemia. The children of Poundbury Camp represent the first cases of genetic anaemia identified in a British archaeological population. As thalassaemia is a condition strongly linked to Mediterranean communities, the presence of this condition in a child from England, found within a mausoleum, suggests that the child was born to wealthy immigrant parents living in this small Roman settlement in Dorset. This paper explores the diagnostic criteria for genetic anaemia in the archaeological literature and what its presence in ancient populations can contribute to our knowledge of past human migration.

Relevance:

100.00%

Publisher:

Abstract:

A predictability index, defined as the ratio of the variance of the optimal prediction to the variance of the original time series, was introduced by Granger and Anderson (1976) and Bhansali (1989). A new, simplified algorithm for estimating the predictability index is introduced, and the new estimator is shown to be a simple and effective tool in applications of predictability ranking and as an aid in the preliminary analysis of time series. The relationship between the predictability index and the position of the poles and the lag p of a time series that can be modelled as an AR(p) process is also investigated. The effectiveness of the algorithm is demonstrated using numerical examples, including an application to stock prices.
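
The abstract does not spell out the simplified estimator; purely as an illustration of the definition quoted above, the sketch below fits an AR(p) model by least squares and estimates the ratio of the variance of the one-step optimal prediction to the variance of the series (for a well-fitted model this is approximately 1 minus the ratio of the innovation variance to the series variance). This is a textbook-style estimate, not the paper's algorithm.

```python
# Illustrative sketch only: estimate the predictability index of a series as
# var(optimal one-step prediction) / var(series), using an AR(p) fit by least
# squares.  This follows the quoted definition, not the paper's simplified algorithm.
import numpy as np

def predictability_index(y, p):
    """Estimate var(prediction) / var(series) for an AR(p) model fitted to y."""
    y = np.asarray(y, dtype=float)
    y = y - y.mean()
    # Design matrix: row t contains the lagged values y[t-1], ..., y[t-p].
    X = np.column_stack([y[p - k - 1:len(y) - k - 1] for k in range(p)])
    target = y[p:]
    coeffs, *_ = np.linalg.lstsq(X, target, rcond=None)
    y_hat = X @ coeffs                       # optimal one-step predictions
    # Approximately equal to 1 - var(innovations) / var(series).
    return np.var(y_hat) / np.var(target)

# Example: an AR(1) with coefficient 0.9 is far more predictable than one with 0.2.
rng = np.random.default_rng(0)
for phi in (0.9, 0.2):
    e = rng.standard_normal(5000)
    y = np.zeros(5000)
    for t in range(1, 5000):
        y[t] = phi * y[t - 1] + e[t]
    print(phi, round(predictability_index(y, p=1), 3))   # roughly 0.81 and 0.04
```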

Relevance:

100.00%

Publisher:

Abstract:

Purpose – Computed tomography (CT) for 3D reconstruction entails a huge number of coplanar fan-beam projections for each of a large number of 2D slice images, and excessive radiation intensities and dosages. For some applications its rate of throughput is also inadequate. A technique for overcoming these limitations is outlined.

Design/methodology/approach – A novel method to reconstruct 3D surface models of objects is presented, using, typically, ten 2D projective images. These images are generated by relative motion between this set of objects and a set of ten fan-beam X-ray sources and sensors, with their viewing axes suitably distributed in 2D angular space.

Findings – The method entails a radiation dosage several orders of magnitude lower than CT, and requires far less computational power. Experimental results are given to illustrate the capability of the technique.

Practical implications – The substantially lower cost of the method and, more particularly, its dramatically lower irradiation make it relevant to many applications precluded by current techniques.

Originality/value – The method can be used in many applications such as aircraft hold-luggage screening and 3D industrial modelling and measurement, and it should also have important applications in medical diagnosis and surgery.

Relevance:

100.00%

Publisher:

Abstract:

A neural network was used to map three PID operating regions for a two-input, two-output steam generator system. After being trained on the PID controllers corresponding to each control region, the network was used in stand-alone feedforward operation to control the whole operating range of the process. The network inputs are the plant error signals, their integral, their derivative and a four-error delay train.
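
The abstract only lists the controller inputs; as a rough sketch under assumed details (sampling scheme, identifiers, sizes), the code below shows how such a feedforward controller's input vector might be assembled from the two error signals, their integrals, their derivatives and a four-sample delay train of past errors. This is illustrative, not the authors' implementation.

```python
# Illustrative sketch only: build the input vector of a feedforward neural-network
# controller from two plant error signals, their integrals, their derivatives and a
# 4-sample delay train of past errors.  Names and sizes are assumptions.
import numpy as np
from collections import deque

class ControllerInputs:
    def __init__(self, n_errors=2, n_delays=4, dt=1.0):
        self.dt = dt
        self.integral = np.zeros(n_errors)
        self.prev_error = np.zeros(n_errors)
        self.delays = deque([np.zeros(n_errors)] * n_delays, maxlen=n_delays)

    def step(self, error):
        """Return the network input vector for the current error sample."""
        error = np.asarray(error, dtype=float)
        self.integral += error * self.dt                  # running integral of each error
        derivative = (error - self.prev_error) / self.dt  # finite-difference derivative
        delayed = np.concatenate(list(self.delays))       # the 4 most recent past errors
        self.delays.append(error.copy())
        self.prev_error = error
        return np.concatenate([error, self.integral, derivative, delayed])

inputs = ControllerInputs()
x = inputs.step([0.5, -0.2])   # e.g. level and pressure errors of the steam generator
print(x.shape)                 # (2 + 2 + 2 + 4*2,) = (14,)
```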

Relevance:

100.00%

Publisher:

Abstract:

Interchange reconnection at the Sun, that is, reconnection between a doubly-connected field loop and a singly-connected or open field line that extends to infinity, has important implications for the heliospheric magnetic flux budget. Recent work on the topic is reviewed, with emphasis on two aspects. The first is a possible heliospheric signature of interchange reconnection at the coronal hole boundary, where open fields meet closed loops. The second concerns the means by which the heliospheric magnetic field strength reached record lows during the recent solar minimum period. A new implication of this work is that interchange reconnection may be responsible for the puzzling, occasional coincidence of the heliospheric current sheet and the interface between fast and slow flow in the solar wind.

Relevance:

100.00%

Publisher:

Abstract:

We review the sea-level and energy budgets together from 1961, using recent and updated estimates of all terms. From 1972 to 2008, the observed sea-level rise (1.8 ± 0.2 mm yr−1 from tide gauges alone and 2.1 ± 0.2 mm yr−1 from a combination of tide gauges and altimeter observations) agrees well with the sum of contributions (1.8 ± 0.4 mm yr−1) in magnitude, and both show similar increases in the rate of rise during the period. The largest contributions come from ocean thermal expansion (0.8 mm yr−1) and the melting of glaciers and ice caps (0.7 mm yr−1), with Greenland and Antarctica contributing about 0.4 mm yr−1. The cryospheric contributions increase through the period (particularly in the 1990s), but the thermosteric contribution increases less rapidly. We include an improved estimate of aquifer depletion (0.3 mm yr−1), partially offsetting the retention of water in dams and giving a total terrestrial storage contribution of −0.1 mm yr−1. Ocean warming (90% of the total of the Earth's energy increase) continues through to the end of the record, in agreement with continued greenhouse gas forcing. The aerosol forcing, inferred as a residual in the atmospheric energy balance, is estimated as −0.8 ± 0.4 W m−2 for the 1980s and early 1990s. It increases in the late 1990s, as is required for consistency with little surface warming over the last decade. This increase is likely at least partially related to substantial increases in aerosol emissions from developing nations and moderate volcanic activity.
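
As a quick consistency check of the figures quoted above (all values are copied from the abstract; the net terrestrial storage term is already given there as −0.1 mm yr−1), the sketch below simply sums the individual contributions and compares them with the tide-gauge estimate of the observed rise.

```python
# Quick arithmetic check of the 1972-2008 sea-level budget terms quoted in the
# abstract (mm per year).  Values are copied from the text, nothing is new.
contributions = {
    "ocean thermal expansion": 0.8,
    "glaciers and ice caps": 0.7,
    "Greenland and Antarctica": 0.4,
    "terrestrial storage (dams minus aquifer depletion)": -0.1,
}
total = sum(contributions.values())
observed_tide_gauges = 1.8
print(f"sum of contributions:   {total:.1f} mm/yr")   # 1.8 mm/yr
print(f"observed (tide gauges): {observed_tide_gauges:.1f} mm/yr")
```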

Relevance:

100.00%

Publisher:

Abstract:

The success of any diversification strategy depends upon the quality of the estimated correlation between assets. It is well known, however, that there is a tendency for the average correlation among assets to increase when the market falls and vice versa. Thus, assuming that the correlation between assets is constant over time seems unrealistic. Nonetheless, these changes in the correlation structure as a consequence of changes in the market's return suggest that correlation shifts can be modelled as a function of the market return. This is the idea behind the model of Spurgin et al. (2000), which models the beta, or systematic risk, of the asset as a function of the returns in the market. This approach offers particular attractions to fund managers as it suggests ways by which they can adjust their portfolios to benefit from changes in overall market conditions. In this paper the Spurgin et al. (2000) model is applied to 31 real estate market segments in the UK using monthly data over the period 1987:1 to 2000:12. The results show that a number of market segments display significant negative correlation shifts, while others show significantly positive correlation shifts. Using this information, fund managers can make strategic and tactical portfolio allocation decisions based on expectations of market volatility alone, and so achieve greater portfolio performance overall, especially during different phases of the real estate cycle.
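
The abstract does not reproduce the Spurgin et al. (2000) specification; purely to illustrate the general idea of a beta that depends on the market return, the sketch below fits a dual-beta regression with separate slopes for up- and down-market months, so the difference between the two slopes plays the role of a beta/correlation shift. The exact functional form used in the paper may well differ.

```python
# Illustrative sketch only: estimate separate betas for up- and down-market
# months, with their difference acting as a simple "beta shift" measure.  This
# is a generic dual-beta regression, not necessarily the Spurgin et al. (2000)
# specification applied in the paper.
import numpy as np

def beta_shift(segment_returns, market_returns):
    r = np.asarray(segment_returns, dtype=float)
    m = np.asarray(market_returns, dtype=float)
    down = (m < 0).astype(float)
    # Regressors: intercept, market return, extra market slope in down months.
    X = np.column_stack([np.ones_like(m), m, m * down])
    coeffs, *_ = np.linalg.lstsq(X, r, rcond=None)
    alpha, beta_up, shift = coeffs
    return beta_up, beta_up + shift, shift   # up-market beta, down-market beta, shift

# Example with simulated monthly returns where downside beta is higher.
rng = np.random.default_rng(1)
m = rng.normal(0.005, 0.04, 168)                          # 14 years of monthly market returns
r = 0.6 * m + 0.5 * np.minimum(m, 0) + rng.normal(0, 0.01, 168)
print(np.round(beta_shift(r, m), 2))                       # roughly (0.6, 1.1, 0.5)
```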

Relevance:

100.00%

Publisher:

Abstract:

Integrated simulation models can be useful tools in farming systems research. This chapter reviews three commonly used approaches: linear programming, system dynamics and agent-based models. Applications of each approach are presented and strengths and drawbacks discussed. We argue that, despite some challenges, mainly related to the integration of different approaches, model validation and the representation of human agents, integrated simulation models contribute important insights to the analysis of farming systems. They help to unravel the complex and dynamic interactions and feedbacks among bio-physical, socio-economic and institutional components across scales and levels in farming systems. In addition, they can provide a platform for integrative research and can support transdisciplinary research by functioning as learning platforms in participatory processes.
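
None of the three approaches is specified in detail in the abstract; purely as an illustration of the simplest of them, the sketch below sets up a toy farm-level linear program (two crops, land and labour constraints) with scipy. All coefficients, crop names and limits are invented for the example, not taken from the chapter.

```python
# Toy farm-planning linear program, purely illustrative: choose hectares of two
# crops to maximise gross margin subject to land and labour limits.  All numbers
# are invented for the example.
from scipy.optimize import linprog

gross_margin = [-600.0, -450.0]   # EUR/ha for wheat, barley (negated because linprog minimises)
A_ub = [[1.0, 1.0],               # land:   wheat_ha + barley_ha            <= 100 ha
        [12.0, 8.0]]              # labour: 12 h/ha wheat + 8 h/ha barley   <= 1000 h
b_ub = [100.0, 1000.0]

res = linprog(gross_margin, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
wheat_ha, barley_ha = res.x
print(f"wheat {wheat_ha:.1f} ha, barley {barley_ha:.1f} ha, margin {-res.fun:.0f} EUR")
```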

Relevance:

100.00%

Publisher:

Abstract:

Limnologists had an early preoccupation with lake classification. It gave a necessary structure to the many chemical and biological observations that were beginning to form the basis of one of the earliest truly environmental sciences. August Thienemann was the doyen of such classifiers, and his concept, with Einar Naumann, of oligotrophic and eutrophic lakes remains central to the world-view that limnologists still have. Classification fell into disrepute, however, as it became clear that there would always be lakes that deviated from the prescriptions that the classifiers made for them. Continua became the de rigueur concept, and lakes were seen as varying along many chemical, biological and geographic axes. Modern limnologists are comfortable with this concept. That all lakes are different guarantees an indefinite future for limnological research. For those who manage lakes and the landscapes in which they are set, however, it is not very useful. There may be as many as 300 000 standing water bodies in England and Wales alone, and perhaps as many again in Scotland. More than 80 000 are sizable (> 1 ha). Some classification scheme is needed to cope with these numbers and, as human impacts on them increase, a system of assessing and monitoring change must be built into such a scheme. Although ways of classifying and monitoring running waters are well developed in the UK, the same is not true of standing waters. Sufficient understanding of what determines the nature and functioning of lakes exists to create a system that has intellectual credibility as well as practical usefulness. This paper outlines the thinking behind a system that will be workable on a north European basis and presents some early results.

Relevance:

100.00%

Publisher:

Abstract:

Ruminants harbour both O157:H7 and non-O157 attaching and effacing Escherichia coli (AEEC) strains, but to date only non-O157 AEEC have been shown to induce attaching and effacing lesions in naturally infected animals. However, O157 may induce lesions in deliberate oral inoculation studies, and persistence is considered dependent upon the bacterially encoded locus of enterocyte effacement. In concurrent infections in ruminants it is unclear whether non-O157 AEEC contribute either positively or negatively to the persistence of E. coli O157:H7. To investigate this, and prior to animal studies, E. coli O157:H7 NCTC 12900, a non-toxigenic strain that persists in conventionally reared sheep, and non-toxigenic AEEC O26:K60 isolates of sheep origin were tested for adherence to HEp-2 tissue culture alone and in competition with one another. Applied together, both strains adhered in similar numbers, but lower than when either was applied separately. Pre-incubation of the tissue culture with either strain reduced significantly (P < 0.05) the extent of adherence of the strain applied second. It was particularly noticeable that AEEC O26, when applied first, reduced adherence and inhibited microcolony formation of E. coli O157:H7, as demonstrated by confocal microscopy. The possibility that prior colonisation of a ruminant by non-O157 AEEC such as O26 may antagonise O157 colonisation and persistence in ruminants is discussed.

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVES: The prediction of protein structure and the precise understanding of protein folding and unfolding processes remain among the greatest challenges in structural biology and bioinformatics. Computer simulations based on molecular dynamics (MD) are at the forefront of the effort to gain a deeper understanding of these complex processes. Currently, these MD simulations are usually on the order of tens of nanoseconds, generate a large amount of conformational data and are computationally expensive. More and more groups run such simulations and generate a myriad of data, which raises new challenges in managing and analyzing the data. Because of the vast range of proteins researchers want to study and simulate, the computational effort needed to generate data, the large data volumes involved, and the different types of analyses scientists need to perform, it is desirable to provide a public repository allowing researchers to pool and share protein unfolding data.

METHODS: To adequately organize, manage, and analyze the data generated by unfolding simulation studies, we designed a data warehouse system that is embedded in a grid environment to facilitate the seamless sharing of available computer resources and thus enable many groups to share complex molecular dynamics simulations on a more regular basis.

RESULTS: To gain insight into the conformational fluctuations and stability of the monomeric forms of the amyloidogenic protein transthyretin (TTR), molecular dynamics unfolding simulations of the monomer of human TTR have been conducted. Trajectory data and metadata of the wild-type (WT) protein and the highly amyloidogenic variant L55P-TTR represent the test case for the data warehouse.

CONCLUSIONS: Web and grid services, especially pre-defined data mining services that can run on or 'near' the data repository of the data warehouse, are likely to play a pivotal role in the analysis of molecular dynamics unfolding data.

Relevance:

100.00%

Publisher:

Abstract:

A new rheology that explicitly accounts for the subcontinuum anisotropy of the sea ice cover is implemented into the Los Alamos sea ice model. This is in contrast to all models of sea ice included in global circulation models, which use an isotropic rheology. The model contains one new prognostic variable, the local structure tensor, which quantifies the degree of anisotropy of the sea ice, and two parameters that set the time scale of the evolution of this tensor. The anisotropic rheology provides a subcontinuum description of the mechanical behavior of sea ice and accounts for a continuum-scale stress with a large shear-to-compression ratio and a tensile stress component. Results over the Arctic of a stand-alone version of the model are presented, and anisotropic model sensitivity runs are compared with a reference elasto-visco-plastic simulation. Under realistic forcing, sea ice quickly becomes highly anisotropic over large length scales, as is observed from satellite imagery. The influence of the new rheology on the state and dynamics of the sea ice cover is discussed. Our reference anisotropic run reveals that the new rheology leads to a substantial change in the spatial distribution of ice thickness and ice drift relative to the reference standard visco-plastic isotropic run, with ice thickness regionally increased by more than 1 m and ice speed reduced by up to 50%.
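
The abstract does not give the evolution equation of the structure tensor; purely as an illustrative sketch of the idea of a prognostic anisotropy variable governed by relaxation time scales, the code below relaxes a 2×2 unit-trace structure tensor toward an assumed target orientation on one time scale and back toward isotropy on another. The target tensor, both time scales and the time step are placeholders, not the model's actual parameterisation.

```python
# Purely illustrative: evolve a 2x2 "structure tensor" A (unit trace) measuring how
# anisotropic the ice cover is.  A relaxes toward a target orientation on time scale
# tau_align and toward isotropy on tau_iso; all values are placeholders, not the
# Los Alamos sea ice model parameterisation.
import numpy as np

ISOTROPIC = 0.5 * np.eye(2)            # A = I/2 means no preferred direction

def step_structure_tensor(A, A_target, dt, tau_align=1e4, tau_iso=1e5):
    """One forward-Euler step of dA/dt = (A_target - A)/tau_align + (I/2 - A)/tau_iso."""
    dA = (A_target - A) / tau_align + (ISOTROPIC - A) / tau_iso
    A = A + dt * dA
    return A / np.trace(A)             # keep unit trace

def anisotropy(A):
    """Eigenvalue difference: 0 = isotropic, 1 = perfectly aligned."""
    w = np.linalg.eigvalsh(A)
    return w[1] - w[0]

# Example: assumed forcing that favours alignment along the x-direction.
A = ISOTROPIC.copy()
A_target = np.diag([1.0, 0.0])
for _ in range(200):
    A = step_structure_tensor(A, A_target, dt=600.0)   # 600 s time step
print(round(anisotropy(A), 2))                          # anisotropy grows from 0 toward 1
```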