953 results for Trend Analysis


Relevance: 100.00%

Abstract:

This article is based on research into the United States club industry conducted over the four-year period 2003-2006. Twenty ratios were reported, covering the five general classes of financial ratios. The ratio results suggested that 2003 was a banner year for the club industry.

Relevance: 70.00%

Abstract:

Water temperature measurements from Wivenhoe Dam offer a unique opportunity for studying fluctuations of temperatures in a subtropical dam as a function of time and depth. Cursory examination of the data indicates a complicated structure across both time and depth. We propose simplifying the task of describing these data by breaking the time series at each depth into physically meaningful components that individually capture daily, subannual, and annual (DSA) variations. Precise definitions for each component are formulated in terms of a wavelet-based multiresolution analysis. The DSA components are approximately pairwise uncorrelated within a given depth and between different depths. They also satisfy an additive property in that their sum is exactly equal to the original time series. Each component is based upon a set of coefficients that decomposes the sample variance of each time series exactly across time and that can be used to study both time-varying variances of water temperature at each depth and time-varying correlations between temperatures at different depths. Each DSA component is amenable to studying a certain aspect of the relationship between the series at different depths. The daily component is in general weakly correlated between depths, including those that are adjacent to one another. The subannual component quantifies seasonal effects and in particular isolates phenomena associated with the thermocline, thus simplifying its study across time. The annual component can be used for trend analysis. The descriptive analysis provided by the DSA decomposition is a useful precursor to a more formal statistical analysis.
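The additive property described above can be illustrated with a toy decomposition. The sketch below is a pure-Python stand-in that uses centered moving averages rather than the wavelet-based multiresolution analysis of the paper; the window lengths (`daily_window`, `annual_window`) and the hourly toy series are illustrative assumptions. It shows that the three components reconstruct the original series exactly:

```python
import math

def moving_average(x, window):
    """Centered moving average; the window shrinks near the edges."""
    n, half = len(x), window // 2
    return [sum(x[max(0, i - half):min(n, i + half + 1)]) /
            (min(n, i + half + 1) - max(0, i - half)) for i in range(n)]

def dsa_decompose(series, daily_window, annual_window):
    """Split a series into daily, subannual and annual components.
    The telescoping construction makes the three components sum
    exactly to the original series."""
    smooth_daily = moving_average(series, daily_window)
    smooth_annual = moving_average(series, annual_window)
    daily = [x - s for x, s in zip(series, smooth_daily)]
    subannual = [s - a for s, a in zip(smooth_daily, smooth_annual)]
    annual = smooth_annual
    return daily, subannual, annual

# Hourly toy series: a daily cycle superimposed on a slow trend.
series = [math.sin(2 * math.pi * i / 24) + 0.005 * i for i in range(240)]
daily, subannual, annual = dsa_decompose(series, 24, 96)
recon_error = max(abs(d + s + a - x)
                  for d, s, a, x in zip(daily, subannual, annual, series))
```

Because `daily + subannual + annual` telescopes back to the original values, `recon_error` is zero up to floating-point rounding, mirroring the exact additivity of the DSA components.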

Relevance: 70.00%

Abstract:

The present study performs a spatial and temporal trend analysis of annual, monthly and seasonal maximum and minimum temperatures (t(max), t(min)) in India. Recent trends in annual, monthly, winter, pre-monsoon, monsoon and post-monsoon extreme temperatures (t(max), t(min)) have been analyzed for three time slots, viz. 1901-2003, 1948-2003 and 1970-2003. For this purpose, time series of extreme temperatures for India as a whole and for seven homogeneous regions, viz. Western Himalaya (WH), Northwest (NW), Northeast (NE), North Central (NC), East coast (EC), West coast (WC) and Interior Peninsula (IP), are considered. Rigorous trend detection analysis has been carried out using a variety of non-parametric methods that account for the effect of serial correlation. During the last three decades, a minimum temperature trend is present for all India as well as in all temperature-homogeneous regions of India, at either the annual or a seasonal level (winter, pre-monsoon, monsoon, post-monsoon). The results agree with the earlier observation that the trend in minimum temperature is significant in the last three decades over India (Kothawale et al., 2010). The sequential Mann-Kendall (MK) test reveals that most of the trends in both maximum and minimum temperature, at either the annual or a seasonal level, began after 1970. (C) 2012 Elsevier B.V. All rights reserved.
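A minimal version of the (non-sequential) Mann-Kendall test used in analyses like this can be sketched as follows. This simplified form ignores ties and the serial-correlation corrections the study applies, and the toy series is invented:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: returns the S statistic, the
    normal-approximation Z score and a two-sided p-value.
    No tie or autocorrelation correction is applied."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j.
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity correction pulls S one unit toward zero.
    z = 0.0 if s == 0 else (s - (1 if s > 0 else -1)) / math.sqrt(var_s)
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return s, z, p

# Toy warming series: a clear upward trend gives a tiny p-value.
s, z, p = mann_kendall([10 + 0.1 * t for t in range(30)])
```

For a strictly increasing series every pair is concordant, so S equals n(n-1)/2 and the test rejects the no-trend null decisively.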

Relevance: 70.00%

Abstract:

Trend analysis is widely used for detecting changes in hydrological data. Parametric methods employ pre-specified models and associated tests to assess significance, whereas non-parametric methods generally apply rank tests to the data. Neither approach is suitable for exploratory analysis, because parametric models impose a particular, perhaps unsuitable, form of trend, while testing may confirm that a trend is present but does not describe its form. This paper describes semi-parametric approaches to trend analysis using local likelihood fitting of annual maximum and partial duration series and illustrates their application to the exploratory analysis of changes in extremes in sea level and river flow data. Bootstrap methods are used to quantify the variability of estimates.
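As an illustration of the bootstrap idea mentioned above, the sketch below resamples residuals around an ordinary least-squares line to obtain a percentile confidence interval for a trend slope. This is a simplified stand-in: the paper bootstraps local-likelihood fits of extreme-value series, which is more involved, and the annual-maximum toy data here are invented:

```python
import random

def ols_slope_intercept(y):
    """Least-squares fit of y against the time index 0..n-1."""
    n = len(y)
    t_mean, y_mean = (n - 1) / 2.0, sum(y) / n
    num = sum((t - t_mean) * (v - y_mean) for t, v in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    b = num / den
    return b, y_mean - b * t_mean

def bootstrap_slope_ci(y, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap CI for the slope, resampling residuals
    (assumes roughly iid errors around a linear trend)."""
    rng = random.Random(seed)
    b, a = ols_slope_intercept(y)
    resid = [v - (a + b * t) for t, v in enumerate(y)]
    slopes = sorted(
        ols_slope_intercept([a + b * t + rng.choice(resid)
                             for t in range(len(y))])[0]
        for _ in range(n_boot))
    return (b, slopes[int(alpha / 2 * n_boot)],
            slopes[int((1 - alpha / 2) * n_boot) - 1])

# Toy annual-maximum series with a genuine upward trend.
y = [20 + 0.5 * t + ((-1) ** t) * 0.8 for t in range(40)]
slope, lo, hi = bootstrap_slope_ci(y)
```

Here the interval excludes zero, so the upward trend survives the resampling uncertainty.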

Relevance: 70.00%

Abstract:

Objective: To compare trends in breast cancer mortality within three pairs of neighbouring European countries in relation to implementation of screening. Design: Retrospective trend analysis.
Setting: Three country pairs (Northern Ireland (United Kingdom) v Republic of Ireland, the Netherlands v Belgium and Flanders (Belgian region south of the Netherlands), and Sweden v Norway).
Data sources: WHO mortality database on cause of death and data sources on mammography screening, cancer treatment, and risk factors for breast cancer mortality.
Main outcome measures: Changes in breast cancer mortality calculated from linear regressions of log transformed, age adjusted death rates. Joinpoint analysis was used to identify the year when trends in mortality for all ages began to change.
Results: From 1989 to 2006, deaths from breast cancer decreased by 29% in Northern Ireland and by 26% in the Republic of Ireland; by 25% in the Netherlands, by 20% in Belgium and by 25% in Flanders; and by 16% in Sweden and by 24% in Norway. The time trend and year of downward inflexion were similar between Northern Ireland and the Republic of Ireland and between the Netherlands and Flanders. In Sweden, mortality rates have steadily decreased since 1972, with no downward inflexion until 2006. The countries of each pair had similar healthcare services and prevalence of risk factors for breast cancer mortality but differing implementation of mammography screening, with a gap of about 10-15 years.
Conclusions: The contrast between the time differences in implementation of mammography screening and the similarity in reductions in mortality between the country pairs suggest that screening did not play a direct part in the reductions in breast cancer mortality.
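The log-linear regression step reported above has a simple core: the slope of log(rate) against year converts directly to an annual percent change. A minimal sketch, using hypothetical death rates rather than the study's data:

```python
import math

def annual_percent_change(rates):
    """Fit log(rate) = a + b * year by least squares and return the
    annual percent change, 100 * (exp(b) - 1)."""
    logs = [math.log(r) for r in rates]
    n = len(logs)
    t_mean, l_mean = (n - 1) / 2.0, sum(logs) / n
    b = (sum((t - t_mean) * (v - l_mean) for t, v in enumerate(logs)) /
         sum((t - t_mean) ** 2 for t in range(n)))
    return 100.0 * (math.exp(b) - 1.0)

# Hypothetical age-adjusted death rates falling 2% per year over
# 18 years (1989-2006 in the study's framing).
rates = [30.0 * 0.98 ** t for t in range(18)]
apc = annual_percent_change(rates)
```

On these noiseless rates the fit recovers the generating decline exactly (an APC of -2%); joinpoint analysis extends this idea by letting the slope change at estimated inflexion years.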

Relevance: 70.00%

Abstract:

The forest has a crucial ecological role, and continuing forest loss can have colossal effects on the environment. As Armenia is one of the least forested countries in the world, this problem is all the more critical. Continuous forest disturbance, mainly caused by illegal logging that started in the early 1990s, has done huge damage to the forest ecosystem, decreasing forest productivity and making more areas vulnerable to erosion. Another problem for the Armenian forest is the lack of continuous monitoring and the absence of accurate estimates of the level of cutting in some years. To gain insight into the forest and its disturbances over a long period of time, we used Landsat TM/ETM+ images. We used the Google Earth Engine JavaScript API, an online tool enabling access to and analysis of a great amount of satellite imagery. To overcome the data availability problems caused by the gap in the Landsat series in 1988-1998, extensive cloud cover in the study area and missing scan lines, we used pixel-based compositing for the temporal window of leaf-on vegetation (June to late September). Subsequently, pixel-based linear regression analyses were performed. Vegetation indices derived from the 10 biannual composites for the years 1984-2014 were used for trend analysis. To derive disturbances in forests only, a forest cover layer was aggregated and the original composites were masked. We found that around 23% of the forest was disturbed during the study period.
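The pixel-based linear regression step can be sketched in miniature. The pure-Python stand-in below (the actual study ran in Google Earth Engine on Landsat-derived vegetation indices) computes a least-squares slope per pixel across a stack of composites; strongly negative slopes flag candidate disturbance pixels. The grid, values and threshold are invented:

```python
def pixel_trend_slopes(composites):
    """composites: list of same-sized 2-D grids (one per epoch) of a
    vegetation index. Returns a grid of per-pixel OLS slopes."""
    n = len(composites)
    t_mean = (n - 1) / 2.0
    den = sum((t - t_mean) ** 2 for t in range(n))
    rows, cols = len(composites[0]), len(composites[0][0])
    slopes = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [composites[t][r][c] for t in range(n)]
            v_mean = sum(vals) / n
            slopes[r][c] = sum((t - t_mean) * (v - v_mean)
                               for t, v in enumerate(vals)) / den
    return slopes

# Five toy NDVI composites on a 2x2 grid; pixel (0, 0) is being logged
# (index drops 0.1 per epoch) while the other pixels stay constant.
composites = [[[0.8 - 0.1 * t, 0.60], [0.60, 0.50]] for t in range(5)]
slopes = pixel_trend_slopes(composites)
```

Thresholding the slope grid (for example, flagging slopes below -0.05 per epoch) then yields a candidate disturbance mask.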

Relevance: 70.00%

Abstract:

Internet Telephony (VoIP) is changing the telecommunication industry. Often free of charge, VoIP is becoming more and more popular amongst users. Large software companies have entered the market and invest heavily in it. In 2011, for instance, Microsoft bought Skype for 8.5bn USD. This trend increasingly impacts the incumbent telecommunication operators, who see their main source of revenue, classic telephony, under siege and disappearing. The thesis at hand develops a most-likely scenario to determine how VoIP will evolve further and predicts, based on a ten-year forecast, the impact it will have on the players in the telecommunication industry. The paper presents a model combining Rogers' diffusion and Christensen's innovation research. The model aims to explain the past evolution of VoIP and to isolate the factors that determine the further diffusion of the innovation. Interviews with industry experts serve to assess how the identified factors are evolving. Two propositions are offered. First, VoIP operators are becoming more important in international, corporate, and mobile telephony. End-to-end VoIP (IP2IP) will exhibit strong growth rates and increasingly cannibalize the telephony revenues of the classic operators. Second, fixed-line telephony in SMEs and at home will continue to be dominated by the incumbents. Yet, as prices for telephony fall towards zero, they too will implement IP2IP in order to save costs. By 2022, up to 90% of calls will be IP2IP. The author recommends that the incumbents and VoIP operators proactively face the change, rethink their business strategies, and even be open to cooperation.

Relevance: 70.00%

Abstract:

Rain acidity may be ascribed to emissions from power station stacks, as well as emissions from other industry, biomass burning, maritime influences, agricultural influences, etc. Rain quality data are available for 30 sites in the South African interior, some from as early as 1985 and covering up to 14 rainfall seasons, while others have only relatively short records. The article examines trends over time in the raw and volume-weighted concentrations of the parameters measured, separately for each of the sites for which sufficient data are available. The main thrust, however, is to examine the inter-relationship structure between the concentrations within each rain event (unweighted data), separately for each site, and to examine whether these inter-relationships have changed over time. The rain events at individual sites can be characterized by approximately eight combinations of rainfall parameters (or rain composition signatures), and these are common to all sites. Some sites have more events from one signature than another, but there appear to be no signatures unique to a single site. Analysis via factor and cluster analysis, with a correspondence analysis of the results, also aids interpretation of the patterns. This spatio-temporal analysis, performed by pooling all rain event data irrespective of site or time period, results in nine combinations of rainfall parameters being sufficient to characterize the rain events. The sites and rainfall seasons show patterns in these combinations of parameters, with some combinations appearing more frequently during certain rainfall seasons. In particular, the presence of the combination of low acetate and formate with high magnesium appears to be increasing in the later rainfall seasons, as does this combination together with calcium, sodium, chloride, potassium and fluoride. As expected, sites close together exhibit similar signatures. Copyright © 2002 John Wiley & Sons, Ltd.
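The grouping of rain events into a handful of composition signatures can be illustrated with a toy clustering. The sketch below is a plain k-means with deterministic first-k initialization, not the factor/cluster/correspondence-analysis workflow of the article, and the two-ion event vectors are invented:

```python
def kmeans(points, k, iters=100):
    """Plain k-means on tuples; initialized from the first k points,
    so callers should order data with distinct points up front."""
    def nearest(p, centers):
        return min(range(len(centers)),
                   key=lambda j: sum((a - b) ** 2
                                     for a, b in zip(p, centers[j])))
    centers = [tuple(p) for p in points[:k]]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[nearest(p, centers)].append(p)
        new = [tuple(sum(p[d] for p in g) / len(g)
                     for d in range(len(g[0]))) if g else centers[j]
               for j, g in enumerate(groups)]
        if new == centers:
            break
        centers = new
    return centers, [nearest(p, centers) for p in points]

# Invented (sulphate, magnesium) vectors for six rain events,
# ordered so the first two points lie in different clusters.
events = [(0.10, 0.20), (5.00, 5.10), (0.20, 0.10),
          (0.15, 0.15), (5.20, 4.90), (5.10, 5.00)]
centers, labels = kmeans(events, 2)
```

Each cluster center plays the role of a composition "signature", and the label counts per site and season would correspond to the signature frequencies discussed in the article.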

Relevance: 70.00%

Abstract:

The first manuscript, entitled "Time-Series Analysis as Input for Clinical Predictive Modeling: Modeling Cardiac Arrest in a Pediatric ICU" lays out the theoretical background for the project. There are several core concepts presented in this paper. First, traditional multivariate models (where each variable is represented by only one value) provide single point-in-time snapshots of patient status: they are incapable of characterizing deterioration. Since deterioration is consistently identified as a precursor to cardiac arrests, we maintain that the traditional multivariate paradigm is insufficient for predicting arrests. We identify time series analysis as a method capable of characterizing deterioration in an objective, mathematical fashion, and describe how to build a general foundation for predictive modeling using time series analysis results as latent variables. Building a solid foundation for any given modeling task involves addressing a number of issues during the design phase. These include selecting the proper candidate features on which to base the model, and selecting the most appropriate tool to measure them. We also identified several unique design issues that are introduced when time series data elements are added to the set of candidate features. One such issue is in defining the duration and resolution of time series elements required to sufficiently characterize the time series phenomena being considered as candidate features for the predictive model. Once the duration and resolution are established, there must also be explicit mathematical or statistical operations that produce the time series analysis result to be used as a latent candidate feature. In synthesizing the comprehensive framework for building a predictive model based on time series data elements, we identified at least four classes of data that can be used in the model design. 
The first two classes are shared with traditional multivariate models: multivariate data and clinical latent features. Multivariate data is represented by the standard one value per variable paradigm and is widely employed in a host of clinical models and tools. These are often represented by a number present in a given cell of a table. Clinical latent features are derived, rather than directly measured, data elements that more accurately represent a particular clinical phenomenon than any of the directly measured data elements in isolation. The second two classes are unique to the time series data elements. The first of these is the raw data elements. These are represented by multiple values per variable, and constitute the measured observations that are typically available to end users when they review time series data. These are often represented as dots on a graph. The final class of data results from performing time series analysis. This class of data represents the fundamental concept on which our hypothesis is based. The specific statistical or mathematical operations are up to the modeler to determine, but we generally recommend that a variety of analyses be performed in order to maximize the likelihood that a representation of the time series data elements is produced that is able to distinguish between two or more classes of outcomes. The second manuscript, entitled "Building Clinical Prediction Models Using Time Series Data: Modeling Cardiac Arrest in a Pediatric ICU" provides a detailed description, start to finish, of the methods required to prepare the data, build, and validate a predictive model that uses the time series data elements determined in the first paper. One of the fundamental tenets of the second paper is that manual implementations of time series based models are infeasible due to the relatively large number of data elements and the complexity of preprocessing that must occur before data can be presented to the model.
Each of the seventeen steps is analyzed from the perspective of how it may be automated, when necessary. We identify the general objectives and available strategies of each of the steps, and we present our rationale for choosing a specific strategy for each step in the case of predicting cardiac arrest in a pediatric intensive care unit. Another issue brought to light by the second paper is that the individual steps required to use time series data for predictive modeling are more numerous and more complex than those used for modeling with traditional multivariate data. Even after complexities attributable to the design phase (addressed in our first paper) have been accounted for, the management and manipulation of the time series elements (the preprocessing steps in particular) are issues that are not present in a traditional multivariate modeling paradigm. In our methods, we present the issues that arise from the time series data elements: defining a reference time; imputing and reducing time series data in order to conform to a predefined structure that was specified during the design phase; and normalizing variable families rather than individual variable instances. The final manuscript, entitled: "Using Time-Series Analysis to Predict Cardiac Arrest in a Pediatric Intensive Care Unit" presents the results that were obtained by applying the theoretical construct and its associated methods (detailed in the first two papers) to the case of cardiac arrest prediction in a pediatric intensive care unit. Our results showed that utilizing the trend analysis from the time series data elements reduced the number of classification errors by 73%. The area under the Receiver Operating Characteristic curve increased from a baseline of 87% to 98% by including the trend analysis. 
In addition to the performance measures, we were also able to demonstrate that adding raw time series data elements without their associated trend analyses improved classification accuracy as compared to the baseline multivariate model, but diminished classification accuracy as compared to when just the trend analysis features were added (i.e., without adding the raw time series data elements). We believe this phenomenon was largely attributable to overfitting, which is known to increase as the ratio of candidate features to class examples rises. Furthermore, although we employed several feature reduction strategies to counteract the overfitting problem, they failed to improve the performance beyond that which was achieved by exclusion of the raw time series elements. Finally, our data demonstrated that pulse oximetry and systolic blood pressure readings tend to start diminishing about 10-20 minutes before an arrest, whereas heart rates tend to diminish rapidly less than 5 minutes before an arrest.
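A toy version of the core idea, a time-series trend used as a latent feature and scored with the area under the ROC curve, can be sketched as follows. The vital-sign windows, the slope feature and the resulting AUC are illustrative only, not the models or data of the manuscripts:

```python
def window_slope(window):
    """Least-squares slope of a vital-sign window: a simple latent
    'trend' feature summarizing deterioration."""
    n = len(window)
    t_mean, v_mean = (n - 1) / 2.0, sum(window) / n
    return (sum((t - t_mean) * (v - v_mean)
                for t, v in enumerate(window)) /
            sum((t - t_mean) ** 2 for t in range(n)))

def roc_auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney relation: the probability that a
    positive case outscores a negative one (ties count 1/2)."""
    wins = sum(1.0 if p > q else 0.5 if p == q else 0.0
               for p in pos_scores for q in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Invented blood-pressure windows: arrest cases drift downward over
# the window, while the stable controls do not.
arrest = [[100 - 2 * i for i in range(10)], [95 - 3 * i for i in range(10)]]
stable = [[98 + 0.1 * (-1) ** i for i in range(10)], [101.0] * 10]
pos = [-window_slope(w) for w in arrest]   # deterioration score
neg = [-window_slope(w) for w in stable]
auc = roc_auc(pos, neg)
```

On this cleanly separable toy data the trend feature discriminates perfectly; the manuscripts' point is that such trend features added discrimination a point-in-time snapshot could not.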

Relevance: 60.00%

Abstract:

Polybrominated diphenyl ethers (PBDEs) are lipophilic, persistent pollutants found worldwide in environmental and human samples. Exposure pathways for PBDEs remain unclear but may include food, air and dust. The aim of this study was to conduct an integrated assessment of PBDE exposure and human body burden using 10 matched samples of human milk, indoor air and dust collected in 2007–2008 in Brisbane, Australia. In addition, a temporal analysis was conducted, comparing the results of the current study with PBDE concentrations in human milk collected in 2002–2003 from the same region. PBDEs were detected in all matrices, and the median concentrations of BDEs -47 and -209 in human milk, air and dust were: 4.2 and 0.3 ng/g lipid; 25 and 7.8 pg/m3; and 56 and 291 ng/g dust, respectively. Significant correlations were observed between the concentrations of BDE-99 in air and human milk (r = 0.661, p = 0.038) and BDE-153 in dust and BDE-183 in human milk (r = 0.697, p = 0.025). These correlations do not suggest causal relationships: there is no hypothesis that can be offered to explain why BDE-153 in dust and BDE-183 in milk are correlated. The fact that so few correlations were found in the data could be a function of the small sample size, or because additional factors, such as sources of exposure not considered or measured in the study, might be important in explaining exposure to PBDEs. There was a slight decrease in PBDE concentrations from 2002–2003 to 2007–2008, but this may be due to sampling and analytical differences. Overall, average PBDE concentrations from these individual samples were similar to results from pooled human milk collected in Brisbane in 2002–2003, indicating that pooling may be an efficient, cost-effective strategy for assessing PBDE concentrations on a population basis. The results of this study were used to estimate an infant's daily PBDE intake via inhalation, dust ingestion and human milk consumption.
Differences in PBDE intake of individual congeners from the different matrices were observed. Specifically, as the level of bromination increased, the contribution of PBDE intake decreased via human milk and increased via dust. As the impacts of the ban of the lower brominated (penta- and octa-BDE) products become evident, an increased use of the higher brominated deca-BDE product may result in dust making a greater contribution to infant exposure than it does currently. To better understand human body burden, further research is required into the sources and exposure pathways of PBDEs and metabolic differences influencing an individual's response to exposure. In addition, temporal trend analysis is necessary with continued monitoring of PBDEs in the human population as well as in the suggested exposure matrices of food, dust and air.
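The intake estimation step combines each matrix's concentration with an assumed consumption rate. The sketch below uses the BDE-47 medians quoted above; the exposure factors (milk intake, milk lipid fraction, inhalation rate, dust ingestion rate) are illustrative assumptions, not the values used in the study:

```python
def infant_daily_intake(milk_ng_per_g_lipid, milk_g_per_day, lipid_fraction,
                        air_pg_per_m3, air_m3_per_day,
                        dust_ng_per_g, dust_g_per_day):
    """Total daily intake of one congener (ng/day) summed over the
    milk-consumption, inhalation and dust-ingestion pathways."""
    via_milk = milk_ng_per_g_lipid * milk_g_per_day * lipid_fraction
    via_air = air_pg_per_m3 * air_m3_per_day / 1000.0  # pg -> ng
    via_dust = dust_ng_per_g * dust_g_per_day
    return via_milk + via_air + via_dust

# BDE-47 medians from the study; exposure factors are assumed:
# 800 g milk/day at 3.5% lipid, 4.5 m3 air/day, 0.05 g dust/day.
intake_bde47 = infant_daily_intake(4.2, 800.0, 0.035, 25.0, 4.5, 56.0, 0.05)
```

With these assumed factors, milk dominates the BDE-47 intake (about 117.6 of roughly 120.5 ng/day), consistent with the observation that milk's share falls, and dust's rises, for more heavily brominated congeners.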

Relevance: 60.00%

Abstract:

Emergency Health Services (EHS), encompassing hospital-based Emergency Departments (ED) and pre-hospital ambulance services, are a significant and high-profile component of Australia’s health care system, and congestion of these, evidenced by physical overcrowding and prolonged waiting times, is causing considerable community and professional concern. This concern relates not only to Australia’s capacity to manage daily health emergencies but also to its ability to respond to major incidents and disasters. EHS congestion is a result of the combined effects of increased demand for emergency care, increased complexity of acute health care, and blocked access to ongoing care (e.g. inpatient beds). Despite this conceptual understanding, there is a lack of robust evidence to explain the factors driving increased demand, or how demand contributes to congestion, and public policy responses have therefore relied upon limited or unsound information. The Emergency Health Services Queensland (EHSQ) research program proposes to determine the factors influencing the growing demand for emergency health care and to establish options for alternative service provision that may safely meet patients’ needs. The EHSQ study is funded by the Australian Research Council (ARC) through its Linkage Program and is supported financially by the Queensland Ambulance Service (QAS). This monograph is part of a suite of publications based on the research findings that examines the existing literature and current operational context. Literature was sourced using standard search approaches and a range of databases, as well as a selection of articles cited in the reviewed literature. Public sources including the Australian Institute of Health and Welfare (AIHW), the Council of Ambulance Authorities (CAA) annual reports, the Australian Bureau of Statistics (ABS) and the Department of Health and Ageing (DoHA) were examined for trend data across Australia.

Relevance: 60.00%

Abstract:

Purpose – In the 21st century, as knowledge, technology and education are widely accepted to play key roles in local economic development, the importance of making space and place for knowledge production is on the rise, prompting many city administrations and urban policy-makers worldwide to restructure their cities to become highly competitive and creative. Consequently, this has led to a new type of city form, the knowledge city, and a new approach to its development, knowledge-based urban development. In this context, the knowledge-based foundations of universities are regarded as one of the key elements of knowledge-based urban development and knowledge city formation, owing to their ability to provide a strong platform for knowledge generation, marketing and transfer. This paper aims to investigate the role and importance of universities and their knowledge-based foundations, in the context of developing countries and particularly Malaysia, in building prosperous knowledge cities in the era of the knowledge economy. Design/Methodology/Approach – The main methodological techniques employed in this research include: a thorough review of the literature on the role of universities in the spatial and socio-economic development of cities; a best practice analysis and policy review of urban and regional development policies targeting the use of university clusters to leverage knowledge-based development; and a case study in Malaysia with a review of various policy documents and strategic plans of the local universities and local and state authorities, interviews with key actors, and a trend analysis of local socio-economic and spatial changes. Originality/Value – This paper reports the findings of pioneering research examining the role and impact of universities and their knowledge-based foundations, in the context of Malaysia, in building knowledge cities in the era of the knowledge economy.
By undertaking a case study investigation in Bandar Seri Iskandar, a newly emerging Malaysian knowledge city located in Perak, Malaysia, the paper sheds light on an important issue of the 21st century: how universities contribute to the knowledge-based development of cities. Practical Implications – Universities, with their rich knowledge-based foundations, are increasingly being recognised as knowledge hubs, exercising a strong influence on the intellectual vitality of the cities where they are embedded. This paper reveals that universities, in joint action with business and society at large, are a necessary prerequisite for constructing and maintaining knowledge societies and, therefore, for building prosperous knowledge cities. In light of the literature and case findings, the paper examines the contribution of the knowledge-based foundations of universities to knowledge city formation and provides generic recommendations for cities and regions seeking knowledge city transformation.

Relevance: 60.00%

Abstract:

Technological growth in the 21st century is exponential. Simultaneously, understanding of the associated risk, uncertainty and user acceptance is scattered, calling for appropriate study of people accepting controversial technology (PACT). The Internet and the services around it, such as the World Wide Web, e-mail, instant messaging and social networking, are increasingly becoming important in many aspects of our lives. Sharing medical and personal health information over the Internet is controversial and demands validity, usability and acceptance. While some literature suggests that the Internet enhances positive interactions between patients and physicians, other studies establish the opposite, in particular the associated risk. In recent years the Internet has attracted considerable attention as a means to improve health and health care delivery. However, it is not clear how widespread the use of the Internet for health care really is or what impact it has on health care utilisation. Estimates of the impact of Internet usage vary widely across locations, both locally and globally. As a result, an estimate (or prediction) of Internet use and its effects in Medical Informatics related decision-making is impractical. This opens up research issues on validating and accepting Internet usage when designing and developing appropriate policies and processes for Medical Informatics, Health Informatics and/or e-Health related protocols. Access to and availability of data on Internet usage for Medical Informatics related activities are limited. This paper presents a trend analysis of the growth of Internet usage in medical informatics related activities. To perform the analysis, data were extracted from publications in ERA (Excellence in Research for Australia) ranked "A" and "A*" journals and from reports in the authenticated public domain. The study is limited to analyses of Internet usage trends in the United States, Italy, France and Japan.
Projected trends and their influence on the field of medical informatics are reviewed and discussed. The study clearly indicates a trend of patients becoming active consumers of health information rather than passive recipients.

Relevance: 60.00%

Abstract:

The need for strong science, technology and innovation linkages between Higher Education Institutions (HEIs) and industry is pivotal for middle-income countries in their endeavor to enhance human capital for socioeconomic development. Currently, university-industry partnerships are at an infant stage in the Sri Lankan higher education context. Technological maturity and effective communication skills are contributing factors to an efficient graduate profile. Expanding internship programs, in particular for STEM disciplines, provides work experience to students and would strengthen the relevance of higher education programs. This study reports historical overviews of and current trends in STEM education in Sri Lanka. Emphasis is drawn to recent technological and higher education curricular reforms. Data from the last 10 years were extracted from the higher education sector and Ministry of Higher Education policy portfolios. Associations and trend analyses of sector growth were compared with STEM existence, mergers and predicted augmentations. Results were depicted and summarised based on STEM streams and disciplines. It was observed that STEM augmentation in the Sri Lankan higher education context is growing at a slow but steady pace. Further analysis with other sectors, in particular industry information, would be useful and a worthwhile exercise.

Relevance: 60.00%

Abstract:

Structural equation modeling (SEM) is a versatile multivariate statistical technique, and its applications have been increasing since its introduction in the 1980s. This paper provides a critical review of 84 articles using SEM to address construction-related problems over the period 1998–2012, drawn from, but not limited to, seven top construction research journals. A yearly publication trend analysis shows that SEM applications have been accelerating over time. However, there are inconsistencies among the various recorded applications, and several recurring problems exist. The important issues that need to be considered in research design, model development and model evaluation are examined and discussed in detail with reference to current applications. A particularly important issue concerns construct validity. Relevant topics for efficient research design also include longitudinal versus cross-sectional studies, mediation and moderation effects, sample size issues and software selection. A guideline framework is provided to help future researchers in construction SEM applications.