872 results for: Elements, High Throughput Data, electrophysiology, data processing, real-time analysis
Abstract:
Providing homeowners with real-time feedback on their electricity consumption through a dedicated display device has been shown to reduce consumption by approximately 6-10%. However, recent advances in smart grid technology have enabled larger sample sizes and more representative sample selection and recruitment methods for display trials. Analysing these factors using data from current studies, this paper argues that a realistic, large-scale conservation effect from feedback is in the range of 3-5%. Subsequent analysis shows that providing real-time feedback may not be a cost-effective strategy for reducing carbon emissions in Australia, but that it may deliver additional benefits such as customer retention and peak-load shifting.
Abstract:
The Solar TErrestrial RElations Observatory (STEREO) provides high cadence and high resolution images of the structure and morphology of coronal mass ejections (CMEs) in the inner heliosphere. CME directions and propagation speeds have often been estimated through the use of time-elongation maps obtained from the STEREO Heliospheric Imager (HI) data. Many of these CMEs have been identified by citizen scientists working within the SolarStormWatch project ( www.solarstormwatch.com ) as they work towards providing robust real-time identification of Earth-directed CMEs. The wide field of view of HI allows scientists to directly observe the two-dimensional (2D) structures, while the relative simplicity of time-elongation analysis means that it can be easily applied to many such events, thereby enabling a much deeper understanding of how CMEs evolve between the Sun and the Earth. For events with certain orientations, both the rear and front edges of the CME can be monitored at varying heliocentric distances (R) between the Sun and 1 AU. Here we take four example events that have measurable position angle widths and were identified by the citizen scientists. These events were chosen for the clarity of their structure within the HI cameras and their long track lengths in the time-elongation maps. We show a linear dependency on R for the growth of the radial width (W) and the 2D aspect ratio (χ) of these CMEs, which are measured out to ≈ 0.7 AU. From a linear best fit to the average of the four CMEs, we obtained the relationships W = 0.14R + 0.04 for the radial width and χ = 2.5R + 0.86 for the aspect ratio (W and R in units of AU).
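The two linear fits reported above can be turned into a quick numerical check. The coefficients are exactly those quoted in the abstract; the function names below are illustrative only:

```python
def cme_width(r_au):
    """Average radial width W (AU) of the four studied CMEs at
    heliocentric distance r_au, from the fit W = 0.14 R + 0.04."""
    return 0.14 * r_au + 0.04

def cme_aspect_ratio(r_au):
    """Average 2D aspect ratio chi at heliocentric distance r_au,
    from the fit chi = 2.5 R + 0.86."""
    return 2.5 * r_au + 0.86

# At the outer limit of the measurements (~0.7 AU):
w = cme_width(0.7)            # 0.14 * 0.7 + 0.04 = 0.138 AU
chi = cme_aspect_ratio(0.7)   # 2.5 * 0.7 + 0.86 = 2.61
```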
Abstract:
This paper presents practical approaches to the problem of sample size re-estimation in clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where limited survival experience is available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates. Copyright © 2012 John Wiley & Sons, Ltd.
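For context, under proportional hazards the required number of events (rather than patients) drives power, which is why re-estimation hinges on the accumulating survival experience. The standard Schoenfeld approximation for the events requirement can be sketched as follows; this is a generic benchmark calculation, not the paper's specific re-estimation procedure, and the default parameter values are illustrative:

```python
from math import ceil, log
from statistics import NormalDist

def required_events(hr, alpha=0.05, power=0.9, alloc=0.5):
    """Schoenfeld approximation: number of events needed to detect a
    hazard ratio `hr` in a two-arm trial, with two-sided level `alpha`,
    the given power, and fraction `alloc` allocated to one arm."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    return ceil((z_a + z_b) ** 2 / (alloc * (1 - alloc) * log(hr) ** 2))

# e.g. HR = 0.75, two-sided alpha = 0.05, 90% power, 1:1 allocation
d = required_events(0.75)
```

A blinded review then asks whether the trial, given observed pooled event rates, will accrue this many events on schedule.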
Abstract:
Advances in hardware and software over the past decade have made it possible to capture, record and process fast data streams at large scale. The research area of data stream mining has emerged as a consequence of these advances, in order to cope with the real-time analysis of potentially large and changing data streams. Examples of data streams include Google searches, credit card transactions, telemetric data and data from continuous chemical production processes. In some cases the data can be processed in batches by traditional data mining approaches. However, some applications require the data to be analysed in real time, as soon as they are captured: for example, when the data stream is infinite, fast changing, or simply too large to be stored. One of the most important data mining techniques on data streams is classification. This involves training the classifier on the data stream in real time and adapting it to concept drift. Most data stream classifiers are based on decision trees. However, it is well known in the data mining community that there is no single optimal algorithm: an algorithm may work well on one or several datasets but badly on others. This paper introduces eRules, a new rule-based adaptive classifier for data streams, based on an evolving set of rules. eRules induces a set of rules that is constantly evaluated and adapted to changes in the data stream by adding new rules and removing old ones. It differs from the more popular decision tree based classifiers in that it tends to leave data instances unclassified rather than force a classification that could be wrong. The ongoing development of eRules aims to improve its accuracy further through dynamic parameter setting, which will also address the problem of changing feature domain values.
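The abstain-rather-than-misclassify behaviour described above can be sketched in a few lines. This is a minimal illustrative model of a rule-based adaptive stream classifier with hypothetical class and parameter names, not the authors' eRules implementation:

```python
class Rule:
    """A propositional rule: a conjunction of (feature -> allowed interval)
    conditions predicting a single class label."""
    def __init__(self, conditions, label):
        self.conditions = conditions   # e.g. {"x": (0.0, 0.5)}
        self.label = label
        self.hits = 0
        self.misses = 0

    def covers(self, instance):
        return all(lo <= instance[f] <= hi
                   for f, (lo, hi) in self.conditions.items())

    def accuracy(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0


class StreamRuleClassifier:
    """Predict with matching rules, abstain when none match, and prune
    rules whose accuracy degrades under concept drift."""
    def __init__(self, min_accuracy=0.6, min_evals=10):
        self.rules = []
        self.min_accuracy = min_accuracy
        self.min_evals = min_evals

    def predict(self, instance):
        for rule in self.rules:
            if rule.covers(instance):
                return rule.label
        return None   # abstain rather than force a possibly wrong class

    def update(self, instance, label):
        for rule in self.rules:
            if rule.covers(instance):
                if rule.label == label:
                    rule.hits += 1
                else:
                    rule.misses += 1
        # remove rules invalidated by concept drift
        self.rules = [r for r in self.rules
                      if r.hits + r.misses < self.min_evals
                      or r.accuracy() >= self.min_accuracy]


clf = StreamRuleClassifier()
clf.rules.append(Rule({"x": (0.0, 0.5)}, "A"))
clf.predict({"x": 0.3})   # -> "A"
clf.predict({"x": 0.9})   # -> None: no rule covers this instance
```

In the real algorithm, new rules are also induced online from a buffer of unclassified instances; that induction step is omitted here.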
Abstract:
Despite being generally perceived as detrimental to the cardiovascular system, testosterone has marked beneficial vascular effects; most notably it acutely and directly causes vasodilatation. Indeed, men with hypotestosteronaemia can present with myocardial ischemia and angina which can be rapidly alleviated by infusion of testosterone. To date, however, in vitro studies have failed to provide a convincing mechanism to account for this clinically important effect. Here, using whole-cell patch-clamp recordings to measure current flow through recombinant human L-type Ca2+ channel alpha(1C) subunits (Ca(v)1.2), we demonstrate that testosterone inhibits such currents in a concentration-dependent manner. Importantly, this occurs over the physiological range of testosterone concentrations (IC50 34 nM), and is not mimicked by the metabolite 5alpha-androstan-17beta-ol-3-one (DHT), nor by progesterone or estradiol, even at high (10 microM) concentration. L-type Ca2+ channels in the vasculature are also important clinical targets for vasodilatory dihydropyridines. A single point mutation (T1007Y) almost completely abolishes nifedipine sensitivity in our recombinant expression system. Crucially, the same mutation renders the channels insensitive to testosterone. Our data strongly suggest, for the first time, the molecular requirements for testosterone binding to L-type Ca2+ channels, thereby supporting its beneficial role as an endogenous Ca2+ channel antagonist in the treatment of cardiovascular disease.
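The reported concentration dependence (IC50 = 34 nM) can be illustrated with a simple one-site inhibition model. The IC50 is the value quoted in the abstract; the Hill coefficient of 1 is an assumption for illustration, not a value from the study:

```python
def fractional_block(conc_nm, ic50_nm=34.0, hill=1.0):
    """Fraction of L-type Ca2+ current blocked at a given testosterone
    concentration (nM), assuming a one-site Hill inhibition model.
    ic50_nm = 34 nM is the abstract's reported value; hill = 1 is an
    illustrative assumption."""
    return conc_nm ** hill / (conc_nm ** hill + ic50_nm ** hill)

half = fractional_block(34.0)    # 0.5 at the IC50, by construction
high = fractional_block(340.0)   # ~0.91 at tenfold the IC50
```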
Abstract:
This paper compares the performance of artificial neural networks (ANNs) with that of the modified Black model in both pricing and hedging Short Sterling options. Using high frequency data, standard and hybrid ANNs are trained to generate option prices. The hybrid ANN is significantly superior to both the modified Black model and the standard ANN in pricing call and put options. Hedge ratios for hedging Short Sterling options positions using Short Sterling futures are produced using the standard and hybrid ANN pricing models, the modified Black model, and also standard and hybrid ANNs trained directly on the hedge ratios. The performance of hedge ratios from ANNs directly trained on actual hedge ratios is significantly superior to those based on a pricing model, and to the modified Black model.
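As background, Short Sterling options are options on interest rate futures, for which the Black (1976) model is the standard benchmark that a "modified Black model" presumably builds on (the abstract does not specify the modification). A minimal sketch of the unmodified Black-76 price and hedge ratio:

```python
from math import exp, log, sqrt
from statistics import NormalDist

N = NormalDist().cdf   # standard normal CDF

def black76_call(f, k, sigma, t, r):
    """Black (1976) price of a European call on a futures contract with
    futures price f, strike k, volatility sigma, time to expiry t (years)
    and continuously compounded rate r."""
    d1 = (log(f / k) + 0.5 * sigma ** 2 * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return exp(-r * t) * (f * N(d1) - k * N(d2))

def black76_call_delta(f, k, sigma, t, r):
    """Hedge ratio (delta) with respect to the futures price: the number
    of futures contracts needed to hedge a short call position."""
    d1 = (log(f / k) + 0.5 * sigma ** 2 * t) / (sigma * sqrt(t))
    return exp(-r * t) * N(d1)
```

The comparison in the paper is between hedge ratios like `black76_call_delta` (from a pricing model) and hedge ratios that an ANN learns directly from data.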
Abstract:
Drought is a global problem that has far-reaching impacts, especially on vulnerable populations in developing regions. This paper highlights the need for a Global Drought Early Warning System (GDEWS), the elements that constitute its underlying framework (GDEWF) and the recent progress made towards its development. Many countries lack drought monitoring systems, as well as the capacity to respond via appropriate political, institutional and technological frameworks, and these have inhibited the development of integrated drought management plans or early warning systems. The GDEWS will provide a source of drought tools and products via the GDEWF for countries and regions to develop tailored drought early warning systems for their own users. A key goal of a GDEWS is to maximize the lead time for early warning, allowing drought managers and disaster coordinators more time to put mitigation measures in place to reduce the vulnerability to drought. To address this, the GDEWF will take both a top-down approach to provide global real-time drought monitoring and seasonal forecasting, and a bottom-up approach that builds upon existing national and regional systems to provide continental to global coverage. A number of challenges must be overcome, however, before a GDEWS can become a reality, including the lack of in-situ measurement networks and modest seasonal forecast skill in many regions, and the lack of infrastructure to translate data into usable information. A set of international partners, through a series of recent workshops and evolving collaborations, has made progress towards meeting these challenges and developing a global system.
Abstract:
The development of effective environmental management plans and policies requires a sound understanding of the driving forces involved in shaping and altering the structure and function of ecosystems. However, driving forces, especially anthropogenic ones, are defined and operate at multiple administrative levels, which do not always match ecological scales. This paper presents an innovative methodology for analysing drivers of change by developing a typology of scale sensitivity of drivers that classifies and describes the way they operate across multiple administrative levels. Scale sensitivity varies considerably among drivers, which can be classified into five broad categories depending on the response of ‘evenness’ and ‘intensity change’ when moving across administrative levels. Indirect drivers tend to show low scale sensitivity, whereas direct drivers show high scale sensitivity, as they operate in a non-linear way across the administrative scale. Thus policies addressing direct drivers of change, in particular, need to take scale into consideration during their formulation. Moreover, such policies must have a strong spatial focus, which can be achieved either by encouraging local–regional policy making or by introducing high flexibility in (inter)national policies to accommodate increased differentiation at lower administrative levels. High quality data are available for several drivers; however, the availability of consistent data at all levels for non-anthropogenic drivers is a major constraint to mapping and assessing their scale sensitivity. This lack of data may hinder effective policy making for environmental management, since it restricts the ability to fully account for scale sensitivity of natural drivers in policy design.
Abstract:
Analysis of human behaviour through visual information has been a highly active research topic in the computer vision community. This was previously achieved via images from a conventional camera, but recently depth sensors have made a new type of data available. This survey starts by explaining the advantages of depth imagery, then describes the new sensors that are available to obtain it. In particular, the Microsoft Kinect has made high-resolution real-time depth cheaply available. The main published research on the use of depth imagery for analysing human activity is reviewed. Much of the existing work focuses on body part detection and pose estimation. A growing research area addresses the recognition of human actions. The publicly available datasets that include depth imagery are listed, as are the software libraries that can acquire it from a sensor. This survey concludes by summarising the current state of work on this topic, and pointing out promising future research directions.
Abstract:
Considerable efforts are currently invested into the setup of a Global Climate Observing System (GCOS) for monitoring climate change over the coming decades, which is of high relevance given concerns about increasing human influence. A promising potential contribution to the GCOS is a suite of spaceborne Global Navigation Satellite System (GNSS) occultation sensors for global long-term monitoring of atmospheric change in temperature and other variables with high vertical resolution and accuracy. Besides its great importance with respect to climate change, the provision of high quality data is essential for the improvement of numerical weather prediction and for reanalysis efforts. We review the significance of GNSS radio occultation sounding in the climate observations context. In order to investigate the climate change detection capability of GNSS occultation sensors, we are currently performing an end-to-end GNSS occultation observing system simulation experiment over the 25-year period 2001 to 2025. We report on this integrated analysis, which involves in a realistic manner all aspects, from modeling the atmosphere, via generating a significant set of simulated measurements, to an objective statistical analysis and assessment of 2001–2025 temporal trends.
Abstract:
The possibility of using a time sequence of surface pressure observations in four-dimensional data assimilation is investigated. It is shown that a linear multilevel quasi-geostrophic model can be updated successfully with surface data alone, provided the number of time levels is at least as large as the number of vertical levels. It is further demonstrated that current statistical analysis procedures are very inefficient at assimilating surface observations, and it is shown by numerical experiments that the vertical interpolation must be carried out using the structure of the most dominant baroclinic mode in order to obtain a satisfactory updating. Different possible ways towards finding a practical solution are discussed.
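The stated requirement, at least as many time levels as vertical levels, can be read as an observability condition for a linear system: observing one scalar (surface pressure) per time gives one row of the stacked observability matrix per time level, and full rank needs at least as many rows as state components. A toy check, where the 3-level dynamics matrix is hypothetical and chosen only to couple the levels:

```python
# For a linear model x_{k+1} = A x_k observed only at the surface through
# y_k = C x_k, the initial state is recoverable exactly when the stacked
# observability matrix [C; CA; CA^2; ...] reaches rank n (the number of
# vertical levels).

def mat_mul(A, B):
    """Plain-Python matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def rank(M, eps=1e-10):
    """Matrix rank by Gaussian elimination with partial pivoting."""
    M = [row[:] for row in M]
    r = 0
    for c in range(len(M[0])):
        pivot = max(range(r, len(M)), key=lambda i: abs(M[i][c]))
        if abs(M[pivot][c]) < eps:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(len(M)):
            if i != r:
                f = M[i][c] / M[r][c]
                M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
        if r == len(M):
            break
    return r

def observability_rank(A, C, m):
    """Rank of [C; CA; ...; CA^(m-1)] for m observation times."""
    rows, block = [], C
    for _ in range(m):
        rows.extend(block)
        block = mat_mul(block, A)
    return rank(rows)

A = [[0.9, 0.1, 0.0],
     [0.1, 0.8, 0.1],
     [0.0, 0.1, 0.9]]   # hypothetical dynamics coupling 3 vertical levels
C = [[1.0, 0.0, 0.0]]   # observe the surface level only

observability_rank(A, C, 2)   # 2 < 3: two observation times are not enough
observability_rank(A, C, 3)   # 3: with n time levels the state is recoverable
```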
Abstract:
This paper introduces the Baltex research programme and summarizes associated numerical modelling work undertaken during the last five years. The research has broadly managed to clarify the main mechanisms determining the water and energy cycle in the Baltic region, such as the strong dependence upon the large-scale atmospheric circulation. It has further been shown that the Baltic Sea has a positive water balance, albeit with large interannual variations. The focus of the modelling studies has been the use of limited area models at ultra-high resolution, driven by boundary conditions from global models or from reanalysis data sets. The programme has further initiated a comprehensive integration of atmospheric, land surface and hydrological modelling, incorporating snow, sea ice and special lake models. Other aspects of the programme include process studies such as the role of deep convection, air-sea interaction and the handling of land surface moisture. Studies have also been undertaken to investigate synoptic and sub-synoptic events over the Baltic region, thus exploring the role of transient weather systems in the hydrological cycle. A special aspect has been the strong interest and commitment of the meteorological and hydrological services, because of the potentially large societal benefits of operational applications of the research. As a result of this interest, special attention has been paid to data-assimilation aspects and to the use of new types of data such as SSM/I, GPS measurements and digital radar. A series of high resolution data sets is being produced; one of these, a 1/6 degree daily precipitation climatology for the years 1996–1999, is a unique contribution. The specific research achievements presented in this volume of Meteorology and Atmospheric Physics are the result of a cooperative venture between 11 European research groups supported under the EU Framework programmes.
Abstract:
The Sea and Land Surface Temperature Radiometer (SLSTR) is a nine-channel visible and infrared high precision radiometer designed to provide climate data on global sea and land surface temperatures. The SLSTR payload is destined to fly on Sentinel-3, the Ocean and Medium-Resolution Land Mission of the ESA/EU Global Monitoring for Environment and Security (GMES) Programme, to measure sea and land temperature and topography for near real-time environmental and atmospheric climate monitoring of the Earth. In this paper we describe the optical layout of the infrared optics in the instrument, the spectral thin-film multilayer design, and the system channel throughput analysis for the combined interference filter and dichroic beamsplitter coatings used to discriminate wavelengths at 3.74, 10.85 and 12.0 μm. The rationale for the selection of thin-film materials, the deposition technique, and environmental testing, inclusive of humidity, thermal cycling and ionizing radiation testing, are also described.
Abstract:
With the introduction of new observing systems based on asynoptic observations, the analysis problem has changed in character. In the near future we may expect that a considerable part of meteorological observations will be unevenly distributed in four dimensions, i.e. three dimensions in space and one in time. The term analysis, or objective analysis in meteorology, means the process of interpolating observed meteorological quantities from unevenly distributed locations to a network of regularly spaced grid points. Necessitated by the requirement of numerical weather prediction models to solve the governing finite difference equations on such a grid lattice, objective analysis is a three-dimensional (or mostly two-dimensional) interpolation technique. As a consequence of the structure of the conventional synoptic network, with separated data-sparse and data-dense areas, four-dimensional analysis has in fact been used intensively for many years. Weather services have thus based their analyses not only on synoptic data at the time of the analysis and on climatology, but also on the fields predicted from the previous observation hour and valid at the time of the analysis. The inclusion of the time dimension in objective analysis will be called four-dimensional data assimilation. From one point of view it seems possible to apply the conventional technique to the new data sources by simply reducing the time interval in the analysis-forecasting cycle. This could in fact be justified for the conventional observations as well. We have fairly good coverage of surface observations 8 times a day, and several upper air stations make radiosonde and radiowind observations 4 times a day. With a 3-hour step in the analysis-forecasting cycle instead of the 12 hours most often applied, we may without any difficulty treat all observations as synoptic. No observation would then be more than 90 minutes off time, and even during strong transient motion the observations would fall within a horizontal mesh of 500 km × 500 km.