992 results for Trend detection


Relevance: 100.00%

Abstract:

The present study performs a spatial and temporal trend analysis of annual, monthly and seasonal maximum and minimum temperatures (t(max), t(min)) in India. Recent trends in annual, monthly, winter, pre-monsoon, monsoon and post-monsoon extreme temperatures (t(max), t(min)) have been analyzed for three time slots, viz. 1901-2003, 1948-2003 and 1970-2003. For this purpose, time series of extreme temperatures for India as a whole and for seven homogeneous regions, viz. Western Himalaya (WH), Northwest (NW), Northeast (NE), North Central (NC), East coast (EC), West coast (WC) and Interior Peninsula (IP), are considered. Rigorous trend detection analysis has been exercised using a variety of non-parametric methods that account for the effect of serial correlation. During the last three decades, a minimum temperature trend is present for all India as well as in all temperature-homogeneous regions, either at the annual level or in at least one season (winter, pre-monsoon, monsoon, post-monsoon). The results agree with the earlier observation that the trend in minimum temperature is significant over India in the last three decades (Kothawale et al., 2010). The sequential MK test reveals that most of the trends in both maximum and minimum temperature, whether annual or seasonal, began after 1970. (C) 2012 Elsevier B.V. All rights reserved.
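The non-parametric Mann-Kendall test used above can be sketched as follows. This is an illustrative minimal version: the serial-correlation corrections the authors apply (e.g. pre-whitening) are omitted, and the variance formula assumes no ties.

```python
import math

def mann_kendall(x):
    """Classical Mann-Kendall trend test (no serial-correlation correction).

    Returns (S, Z): the MK statistic S and the normal-approximation Z score.
    """
    n = len(x)
    # S counts concordant minus discordant pairs
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    # Variance of S under the null hypothesis (no-ties formula)
    var_s = n * (n - 1) * (2 * n + 5) / 18
    # Continuity-corrected Z score
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

For a strictly increasing series, S equals the number of pairs and Z is strongly positive; |Z| > 1.96 indicates a significant trend at the 5% level.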

Relevance: 100.00%

Abstract:

As the number of resources on the web exceeds by far the number of documents one can track, it becomes increasingly difficult to remain up to date on one's own areas of interest. The problem becomes more severe with the growing fraction of multimedia data, from which it is difficult to extract any conceptual description of content. One way to overcome this problem is social bookmark tools, which are rapidly emerging on the web. In such systems, users set up lightweight conceptual structures called folksonomies, thus overcoming the knowledge acquisition bottleneck. As more and more people participate in the effort, the use of a common vocabulary becomes more and more stable. We present an approach for discovering topic-specific trends within folksonomies. It is based on a differential adaptation of the PageRank algorithm to the triadic hypergraph structure of a folksonomy. The approach allows for any kind of data, as it does not rely on the internal structure of the documents. In particular, this makes it possible to consider different data types in the same analysis step. We run experiments on a large-scale real-world snapshot of a social bookmarking system.
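The differential-PageRank idea — rank once with a topic preference vector, subtract an unbiased baseline run — can be sketched on an ordinary graph. This is a toy illustration only: the paper works on the triadic tag-user-resource hypergraph, and the graph, damping factor and weights below are made up.

```python
def pagerank(adj, pref=None, d=0.7, iters=100):
    """Power iteration with an optional preference (personalization) vector."""
    nodes = list(adj)
    n = len(nodes)
    pref = pref or {v: 1.0 / n for v in nodes}
    rank = dict(pref)
    for _ in range(iters):
        new = {}
        for v in nodes:
            # mass flowing in from neighbours, plus the preference term
            inflow = sum(rank[u] / len(adj[u]) for u in nodes if v in adj[u])
            new[v] = d * inflow + (1 - d) * pref[v]
        rank = new
    return rank

def differential_rank(adj, topic_pref):
    """Differential weight: topic-biased rank minus unbiased baseline rank."""
    base = pagerank(adj)
    biased = pagerank(adj, pref=topic_pref)
    return {v: biased[v] - base[v] for v in adj}
```

Nodes favoured by the topic preference (directly or through links) get a positive differential weight, which is what surfaces topic-specific trends.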

Relevance: 100.00%

Abstract:

Total ozone trends are typically studied using linear regression models that assume a first-order autoregression of the residuals [so-called AR(1) models]. We consider total ozone time series over 60°S–60°N from 1979 to 2005 and show that most latitude bands exhibit long-range correlated (LRC) behavior, meaning that ozone autocorrelation functions decay by a power law rather than exponentially as in AR(1). At such latitudes the uncertainties of total ozone trends are greater than those obtained from AR(1) models, and the expected time required to detect ozone recovery is correspondingly longer. We find no evidence of LRC behavior in southern middle- and high-subpolar latitudes (45°–60°S), where the long-term ozone decline attributable to anthropogenic chlorine is greatest. We thus confirm an earlier prediction based on an AR(1) analysis that this region (especially the highest latitudes, and especially the South Atlantic) is the optimal location for the detection of ozone recovery, with a statistically significant ozone increase attributable to chlorine likely to be detectable by the end of the next decade. In northern middle and high latitudes, on the other hand, there is clear evidence of LRC behavior. This increases the uncertainties on the long-term trend attributable to anthropogenic chlorine by about a factor of 1.5 and lengthens the expected time to detect ozone recovery by a similar amount (from ∼2030 to ∼2045). If the long-term changes in ozone are instead fit by a piecewise-linear trend rather than by stratospheric chlorine loading, then the strong decrease of northern middle- and high-latitude ozone during the first half of the 1990s and its subsequent increase during the second half of the 1990s projects more strongly onto the trend and makes a smaller contribution to the noise. This both increases the trend and weakens the LRC behavior at these latitudes, to the extent that ozone recovery (according to this model, and in the sense of a statistically significant ozone increase) is already on the verge of being detected. The implications of this rather controversial interpretation are discussed.
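The AR(1) versus LRC distinction above is about how the autocorrelation decays: under AR(1), rho(k) = phi**k (exponential), whereas LRC means rho(k) falls off roughly as a power law k**(-gamma). A minimal sketch of the empirical autocorrelation one would compare against the AR(1) prediction (illustrative, not the paper's estimator):

```python
def acf(x, lag):
    """Sample autocorrelation of a series at a given lag."""
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[t] - mean) * (x[t + lag] - mean) for t in range(n - lag))
    return cov / var

def ar1_acf(phi, lag):
    """Theoretical AR(1) autocorrelation: rho(k) = phi**k."""
    return phi ** lag
```

Setting phi = acf(x, 1) and comparing acf(x, k) against ar1_acf(phi, k) at large lags reveals the slower-than-exponential decay that signals LRC behavior.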

Relevance: 60.00%

Abstract:

The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task, owing to the complex behaviour of constituents in natural streams, the variability of water flows and, often, a limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates, and the resultant power to detect trends, can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature. (C) 2014 Elsevier B.V. All rights reserved.
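The compounding-errors structure is specific to this paper; as a generic illustration of point (i) — modelling temporally correlated residuals in a load-versus-flow regression — here is a Cochrane-Orcutt sketch for a single predictor. Variable names are hypothetical and the procedure is a textbook stand-in, not the authors' model.

```python
def ols(x, y):
    """Simple least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def cochrane_orcutt(x, y, iters=20):
    """Refit (a, b) after quasi-differencing with the residual lag-1
    autocorrelation rho, so AR(1) errors no longer distort the fit."""
    a, b = ols(x, y)
    rho = 0.0
    for _ in range(iters):
        resid = [yi - a - b * xi for xi, yi in zip(x, y)]
        rho = sum(resid[t] * resid[t - 1] for t in range(1, len(resid))) / \
              sum(r * r for r in resid)
        # quasi-difference the data and refit
        xs = [x[t] - rho * x[t - 1] for t in range(1, len(x))]
        ys = [y[t] - rho * y[t - 1] for t in range(1, len(y))]
        a_star, b = ols(xs, ys)
        a = a_star / (1 - rho)
    return a, b, rho
```

The estimated rho quantifies the temporal correlation; ignoring it leaves the slope roughly unbiased but understates its uncertainty, which is exactly what weakens trend-detection power.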

Relevance: 60.00%

Abstract:

The Mann–Kendall non-parametric test was employed for observational trend detection in monthly, seasonal and annual precipitation of five meteorological subdivisions of Central Northeast India (CNE India) for different 30-year normal periods (NP), viz. 1889–1918 (NP1), 1919–1948 (NP2), 1949–1978 (NP3) and 1979–2008 (NP4). The trends of maximum and minimum temperatures were also investigated. The slopes of the trend lines were determined using the method of least-squares linear fitting. Morlet wavelet analysis was applied to monthly rainfall during June–September, total rainfall during the monsoon season and annual rainfall to determine the periodicity and to test its significance using the power spectrum method. The inferences drawn from the analyses will be helpful to policy managers, planners and agricultural scientists in working out irrigation and water management options under various possible climatic eventualities for the region. The long-term (1889–2008) mean annual rainfall of CNE India is 1,195.1 mm, with a standard deviation of 134.1 mm and a coefficient of variation of 11%. There is a significant decreasing trend of 4.6 mm/year for Jharkhand and 3.2 mm/year for CNE India. Since rice is the important kharif crop (May–October) in this region, the decreasing trend of rainfall during the month of July may delay or affect the transplanting/vegetative phase of the crop, and assured irrigation is very much needed to tackle drought situations. During the month of December, all the meteorological subdivisions except Jharkhand show a significant decreasing trend of rainfall during the recent normal period NP4. The decrease of rainfall during December may hamper the sowing of wheat, which is the important rabi crop (November–March) in most parts of this region. Maximum temperature shows a significant rising trend of 0.008°C/year (at the 0.01 level) during the monsoon season and 0.014°C/year (at the 0.01 level) during the post-monsoon season over the period 1914–2003. The annual maximum temperature also shows a significant increasing trend of 0.008°C/year (at the 0.01 level) during the same period. Minimum temperature shows a significant rising trend of 0.012°C/year (at the 0.01 level) during the post-monsoon season and a significant falling trend of 0.002°C/year (at the 0.05 level) during the monsoon season. A significant 4–8-year peak periodicity band has been noticed during September over Western UP, and a 30–34-year periodicity has been observed during July over the Bihar subdivision. However, as far as CNE India as a whole is concerned, no significant periodicity has been noticed in any of the time series.
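The abstract reports least-squares trend slopes; a common non-parametric companion to the Mann-Kendall test is Sen's slope estimator, sketched here as an alternative way to obtain magnitudes such as mm/year or °C/year (an illustration, not the authors' method):

```python
from statistics import median

def sen_slope(x):
    """Sen's slope estimator: the median of all pairwise slopes
    of a series sampled at unit time steps."""
    n = len(x)
    slopes = [
        (x[j] - x[i]) / (j - i)
        for i in range(n - 1)
        for j in range(i + 1, n)
    ]
    return median(slopes)
```

Because it takes the median of pairwise slopes, the estimate is robust to outliers that would pull an ordinary least-squares fit.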

Relevance: 60.00%

Abstract:

At Ny-Ålesund (78.9° N), Svalbard, surface measurements of up- and downward shortwave and longwave radiation have been operated since August 1992 within the framework of the Baseline Surface Radiation Network (BSRN), complemented by surface and upper-air meteorology since August 1993. These long-term observations form the basis for a climatological presentation of the surface radiation data. Over the 21-year observation period, ongoing changes in the Arctic climate system are reflected. In particular, the observations indicate a strong seasonality of surface warming and related changes in different radiation parameters. The annual mean temperature at Ny-Ålesund has risen by +1.3 ± 0.7 K per decade, with a maximum seasonal increase during the winter months of +3.1 ± 2.6 K per decade. At the same time, winter is also the season with the largest long-term changes in radiation, featuring an increase of +15.6 ± 11.6 W/m² per decade in downward longwave radiation. Furthermore, changes in reflected solar radiation during the months of snowmelt indicate an earlier onset of the warm season by about one week compared to the beginning of the observations. The online available dataset of Ny-Ålesund surface radiation measurements provides a valuable data source for the validation of satellite instruments and climate models.

Relevance: 60.00%

Abstract:

Climate variability and change have generated great concern worldwide; global warming is one of the major issues, and it may be affecting the availability of water resources in irrigated perimeters. In the semiarid region of Northeastern Brazil drought is known to predominate, but little is known about trends in climate series of combined water loss by evaporation and transpiration (evapotranspiration). Therefore, this study aimed to analyze whether there is evidence of an increase and/or decrease in the regime of reference evapotranspiration (ETo) on monthly, annual and interdecadal scales in the irrigated hub towns of Juazeiro, BA (9°24'S, 40°26'W, 375.5 m) and Petrolina, PE (9°09'S, 40°22'W, 376 m), this being the main objective of the analysis. Daily meteorological data for the period from 01/01/1976 to 31/12/2014 were provided by EMBRAPA Semiárido, and daily ETo was estimated using the standard Penman-Monteith method (EToPM) as parameterized by Smith (1991). Other, more simplified estimation methods were calculated and compared with EToPM, namely: Solar Radiation (EToRS), Linacre (EToL), Hargreaves and Samani (EToHS) and the Class A pan method (EToTCA). The main statistical analyses were non-parametric tests of homogeneity (run test), trend (Mann-Kendall), magnitude of the trend (Sen) and onset of the trend (Mann-Whitney). The statistical significance level adopted was 5% and/or 1%. Analysis of variance (ANOVA) was used to detect whether there is a significant difference between interdecadal means. For comparison between the ETo methods, the correlation test (r), Student's t test and Tukey's test at the 5% significance level were used. Finally, the Willmott et al. (1985) statistics were used to evaluate the concordance index and the performance of the simplified methods compared to the standard method. The main results show a decrease in the EToPM time series in the irrigated areas of Juazeiro, BA and Petrolina, PE, significant at 1% and 5% respectively, with an annual magnitude of -14.5 mm (Juazeiro) and -7.7 mm (Petrolina) and a trend onset in 1996. The methods with the best agreement with EToPM were EToRS, with very good performance in both locations, followed by EToL, with good performance in Juazeiro and median performance in Petrolina. EToHS had the worst (bad) performance in both locations. It is suggested that this decrease in EToPM may be associated with the increase in irrigated agricultural areas and the construction of the Sobradinho lake upstream of the perimeters.
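The Willmott et al. (1985) concordance index used above to rank the simplified ETo methods against EToPM is straightforward to compute; a minimal sketch:

```python
def willmott_d(obs, pred):
    """Willmott index of agreement d between observed and predicted series.

    d = 1 for a perfect match and 0 when the prediction is no better
    than the observed mean.
    """
    mo = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - mo) + abs(o - mo)) ** 2 for o, p in zip(obs, pred))
    return 1 - num / den
```

Performance labels such as "very good" or "median" are conventionally assigned from banded ranges of an index combining d with the correlation coefficient.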

Relevance: 60.00%

Abstract:

The ontology engineering research community has focused for many years on supporting the creation, development and evolution of ontologies. Ontology forecasting, which aims at predicting semantic changes in an ontology, represents instead a new challenge. In this paper, we contribute to this novel endeavour by focusing on the task of forecasting semantic concepts in the research domain. Indeed, ontologies representing scientific disciplines contain only research topics that are already popular enough to be selected by human experts or automatic algorithms. They are thus unfit to support tasks which require the ability to describe and explore the forefront of research, such as trend detection and horizon scanning. We address this issue by introducing the Semantic Innovation Forecast (SIF) model, which predicts new concepts of an ontology at time t + 1, using only data available at time t. Our approach relies on lexical innovation and adoption information extracted from historical data. We evaluated the SIF model on a very large dataset consisting of over one million scientific papers belonging to the Computer Science domain: the outcomes show that the proposed approach offers a competitive boost in mean average precision-at-ten compared to the baselines when forecasting over 5 years.

Relevance: 40.00%

Abstract:

Filtering methods are explored for removing noise from data while preserving the sharp edges that may indicate a trend shift in gas turbine measurements. Linear filters are found to have problems removing noise while preserving features in the signal. The nonlinear hybrid median filter is found to accurately reproduce the root signal from noisy data. Simulated faulty data and fault-free gas path measurement data are passed through median filters, and health residuals for the data set are created. The health residual is a scalar norm of the gas path measurement deltas and is used to partition the faulty engine from the healthy engine using fuzzy sets. The fuzzy detection system is developed and tested with noisy data and with filtered data. Tests with simulated fault-free and faulty data show that fuzzy trend-shift detection based on filtered data is very accurate, with no false alarms and negligible missed alarms.
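The hybrid median filter in the paper is more elaborate (it medians over several sub-windows), but a plain sliding-window median already shows the edge-preserving behaviour described — it removes impulsive noise while leaving step changes intact, which is exactly what a linear (mean) filter cannot do. A minimal sketch:

```python
from statistics import median

def median_filter(x, window=3):
    """Sliding-window median; removes spikes but preserves step edges
    that a moving-average filter would blur."""
    h = window // 2
    return [
        median(x[max(0, i - h): i + h + 1])
        for i in range(len(x))
    ]
```

A single-sample spike vanishes, while a genuine step (a candidate trend shift) passes through unchanged.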

Relevance: 40.00%

Abstract:

BACKGROUND: To compare the ability of Glaucoma Progression Analysis (GPA) and Threshold Noiseless Trend (TNT) programs to detect visual-field deterioration.

METHODS: Patients with open-angle glaucoma followed for a minimum of 2 years with a minimum of seven reliable visual fields were included. Progression was assessed subjectively by four masked glaucoma experts and compared with GPA and TNT results. Each case was judged to be stable, deteriorated or suspicious of deterioration.

RESULTS: A total of 56 eyes of 42 patients were followed, with a mean of 7.8 (SD 1.0) tests over an average of 5.5 (1.04) years. Interobserver agreement on detecting progression was good (mean kappa = 0.57). Progression was detected in 10-19 eyes by the experts, in six by GPA and in 24 by TNT. Using the consensus expert opinion as the gold standard (four clinicians detected progression), the GPA sensitivity and specificity were 75% and 83%, respectively, while the TNT sensitivity and specificity were 100% and 77%, respectively.

CONCLUSION: TNT showed greater concordance with the experts than GPA in the detection of visual-field deterioration. GPA showed a high specificity but lower sensitivity, mainly detecting cases of high focality and pronounced mean defect slopes.
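The sensitivity and specificity figures above follow directly from a 2×2 confusion table against the expert gold standard. A sketch with hypothetical counts (the counts below are made up purely to illustrate the calculation, not taken from the study):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)
```

With, say, 3 true positives, 1 false negative, 10 true negatives and 2 false positives, this yields a GPA-like 75% sensitivity and about 83% specificity.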

Relevance: 30.00%

Abstract:

Data preprocessing is widely recognized as an important stage in anomaly detection. This paper reviews the data preprocessing techniques used by anomaly-based network intrusion detection systems (NIDS), concentrating on which aspects of the network traffic are analyzed, and what feature construction and selection methods have been used. Motivation for the paper comes from the large impact data preprocessing has on the accuracy and capability of anomaly-based NIDS. The review finds that many NIDS limit their view of network traffic to the TCP/IP packet headers. Time-based statistics can be derived from these headers to detect network scans, network worm behavior, and denial of service attacks. A number of other NIDS perform deeper inspection of request packets to detect attacks against network services and network applications. More recent approaches analyze full service responses to detect attacks targeting clients. The review covers a wide range of NIDS, highlighting which classes of attack are detectable by each of these approaches. Data preprocessing is found to predominantly rely on expert domain knowledge for identifying the most relevant parts of network traffic and for constructing the initial candidate set of traffic features. On the other hand, automated methods have been widely used for feature extraction to reduce data dimensionality, and feature selection to find the most relevant subset of features from this candidate set. The review shows a trend toward deeper packet inspection to construct more relevant features through targeted content parsing. These context-sensitive features are required to detect current attacks.

Relevance: 30.00%

Abstract:

Cognitive radio is an emerging technology proposing the concept of dynamic spectrum access as a solution to the looming problem of spectrum scarcity caused by the growth in wireless communication systems. Under the proposed concept, non-licensed, secondary users (SU) can access spectrum owned by licensed, primary users (PU) so long as interference to the PU is kept minimal. Spectrum sensing is a crucial task in cognitive radio whereby the SU senses the spectrum to detect the presence or absence of any PU signal. Conventional spectrum sensing assumes the PU signal is 'stationary' and remains in the same activity state during the sensing cycle, while an emerging trend models the PU as 'non-stationary', undergoing state changes. Existing studies have focused on the non-stationary PU during the transmission period; however, very little research has considered the impact on spectrum sensing when the PU is non-stationary during the sensing period. The concept of PU duty cycle is developed as a tool to analyse the performance of spectrum sensing detectors when detecting non-stationary PU signals. New detectors are also proposed to optimise detection with respect to the duty cycle exhibited by the PU. This research consists of two major investigations. The first stage investigates the impact of duty cycle on the performance of existing detectors and the extent of the problem in existing studies. The second stage develops new detection models and frameworks to ensure the integrity of spectrum sensing when detecting non-stationary PU signals. The first investigation demonstrates that the conventional signal model formulated for a stationary PU does not accurately reflect the behaviour of a non-stationary PU; therefore the performance calculated and assumed to be achievable by the conventional detector does not reflect the actual performance achieved. Through analysing the statistical properties of duty cycle, performance degradation is shown to be a problem that cannot be easily neglected in existing sensing studies when the PU is modelled as non-stationary. The second investigation presents detectors that are aware of the duty cycle exhibited by a non-stationary PU. A two-stage detection model is proposed to improve the detection performance and robustness to changes in duty cycle; this detector is most suitable for applications that require long sensing periods. A second detector, the duty-cycle-based energy detector, is formulated by integrating the distribution of duty cycle into the test statistic of the energy detector and is suitable for short sensing periods. The decision threshold is optimised with respect to the traffic model of the PU, hence the proposed detector can calculate average detection performance that reflects realistic results. A detection framework for the application of spectrum sensing optimisation is proposed to provide clear guidance on the constraints on the sensing and detection model. Following this framework ensures that the signal model accurately reflects practical behaviour while the detection model implemented is also suitable for the desired detection assumption. Based on this framework, a spectrum sensing optimisation algorithm is further developed to maximise sensing efficiency for a non-stationary PU. New optimisation constraints are derived to account for any PU state changes within the sensing cycle while implementing the proposed duty-cycle-based detector.
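The duty-cycle-aware detectors are the thesis's own contribution and are not reproduced here; the classical energy detector they extend can be sketched as follows (the threshold factor is illustrative — in practice it is set from the noise statistics to meet a target false-alarm probability):

```python
def energy_detect(samples, noise_var, factor=1.5):
    """Classical energy detector: declare a PU present when the average
    sample energy exceeds a multiple of the noise variance."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > factor * noise_var
```

The duty-cycle problem arises because this statistic averages over the whole sensing window: if the PU is active for only part of it, the accumulated energy is diluted and the detector under-reports PU presence.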

Relevance: 30.00%

Abstract:

Before the age of 75 years, approximately 10% of women will be diagnosed with breast cancer, one of the most common malignancies and a leading cause of death among women. The objective of this study was to determine whether expression of the nuclear receptor coactivators 1 and 3 (NCoA1 and NCoA3) varied across breast cancer grades. RNA was extracted from 25 breast tumours and transcribed into cDNA, which underwent semi-quantitative polymerase chain reaction, normalised using 18S. Analysis indicated an expression change for NCoA1 across cancer grades and in estrogen receptor alpha negative tissue (P = 0.028 and 0.001, respectively). NCoA1 expression increased in grade 3 and estrogen receptor alpha negative tumours compared to controls. NCoA3 showed a similar, but not significant, trend with grade and a non-significant decrease in estrogen receptor alpha negative tissues. Expression of NCoA1 in late-stage and estrogen receptor alpha negative breast tumours may have implications for breast cancer treatment, particularly in the area of manipulation of hormone signalling systems in advanced tumours.

Relevance: 30.00%

Abstract:

Crashes on motorways contribute a significant proportion (40-50%) of non-recurrent motorway congestion. Hence, reducing crashes will help address congestion issues (Meyer, 2008). Crash likelihood estimation studies commonly focus on traffic conditions in a short time window around the time of the crash, while longer-term pre-crash traffic flow trends are neglected. In this paper we show, through data mining techniques, that a relationship between pre-crash traffic flow patterns and crash occurrence on motorways exists, and that this knowledge has the potential to improve the accuracy of existing models and opens the path for new development approaches. The data for the analysis were extracted from records collected between 2007 and 2009 on the Shibuya and Shinjuku lines of the Tokyo Metropolitan Expressway in Japan. The dataset includes a total of 824 rear-end and sideswipe crashes that were matched with traffic flow data from the hour prior to the crash using an incident detection algorithm. Traffic flow trends (traffic speed/occupancy time series) revealed that crashes could be clustered with regard to the dominant traffic flow pattern prior to the crash. Using the k-means clustering method allowed the crashes to be clustered based on their flow trends rather than their distance. Four major trends were found in the clustering results. Based on these findings, crash likelihood estimation algorithms can be fine-tuned to the monitored traffic flow conditions with a sliding window of 60 minutes to increase the accuracy of the results and minimize false alarms.
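Clustering pre-crash flow profiles with k-means can be sketched as below. This is a minimal illustration with made-up speed profiles and naive first-k initialisation; the study's feature construction and initialisation may differ.

```python
def kmeans(profiles, k, iters=20):
    """Plain k-means on equal-length time-series profiles (squared
    Euclidean distance), with naive first-k initialisation."""
    centers = [list(p) for p in profiles[:k]]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        # assign each profile to its nearest centre
        groups = [[] for _ in range(k)]
        for p in profiles:
            j = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute centres as coordinate-wise means
        centers = [
            [sum(vals) / len(g) for vals in zip(*g)] if g else centers[j]
            for j, g in enumerate(groups)
        ]
    return centers, groups
```

Applied to, say, one group of falling-speed profiles and one group of steady-speed profiles, the resulting cluster centres correspond to the dominant pre-crash flow trends described in the abstract.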