953 results for "Historical data usage"


Relevance: 100.00%

Publisher:

Abstract:

Transportation Systems Center, Cambridge, Mass.

Relevance: 100.00%

Publisher:

Abstract:

National Highway Traffic Safety Administration, Office of Research and Development, Washington, D.C.

Relevance: 100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance: 100.00%

Publisher:

Abstract:

Chiefly tables.

Relevance: 100.00%

Publisher:

Abstract:

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation's highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida, was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, the Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, uses of historical data, and selections of traffic flow parameters. To assess the performance of the different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models achieved a DR between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. In addition, DWT was found to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights into the design of AID for arterial street applications.
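
The following is a minimal illustrative sketch, not the dissertation's actual model: it applies DWT-based smoothing and min-max normalization to synthetic detector data, appends the previous interval's features (the "historical data usage" design choice), trains a small feed-forward ANN, and reports DR and FAR. The data, feature layout, wavelet choice, and network size are all assumptions made for illustration.

```python
# Minimal sketch (not the dissertation's model): a feed-forward ANN fed with
# DWT-smoothed, normalized detector data, evaluated by DR and FAR.
import numpy as np
import pywt                                    # Discrete Wavelet Transform
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def dwt_denoise(signal, wavelet="db4", level=2):
    """Smooth a detector time series by zeroing the detail coefficients."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    coeffs[1:] = [np.zeros_like(c) for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

# Synthetic stand-in for simulated detector output:
# columns = speed, volume, occupancy at the mid-block and upstream detectors.
n = 2048
X_raw = rng.normal(size=(n, 6))
incident = rng.random(n) < 0.05                # ~5% of intervals contain an incident
X_raw[incident, 0] -= 3.0                      # incidents depress speed most strongly

# DWT smoothing per feature, then min-max normalization to [0, 1].
X = np.column_stack([dwt_denoise(X_raw[:, j]) for j in range(X_raw.shape[1])])
X = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-9)

# Append the previous interval's features ("historical data usage").
X_hist = np.hstack([X[1:], X[:-1]])
y = incident[1:]

split = int(0.7 * len(y))
model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_hist[:split], y[:split])
pred = model.predict(X_hist[split:])
truth = y[split:]

dr = (pred & truth).sum() / max(truth.sum(), 1)        # detection rate
far = (pred & ~truth).sum() / max((~truth).sum(), 1)   # false alarm rate
print(f"DR = {dr:.2%}, FAR = {far:.2%}")
```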

Relevance: 100.00%

Publisher:

Abstract:

With recent advances in remote sensing processing technology, it has become more feasible to begin analyzing the enormous historical archive of remotely sensed data. These historical data provide valuable information on a wide variety of topics that can influence the lives of millions of people if processed correctly and in a timely manner. One such field of benefit is landslide mapping and inventory. These data provide a historical reference for those who live near high-risk areas so that future disasters may be avoided. In order to map landslides properly from remotely sensed imagery, an optimum method must first be determined. Historically, mapping has been attempted using pixel-based methods such as unsupervised and supervised classification. These methods are limited in that they characterize an image only spectrally, based on individual pixel values. This yields results that are prone to false positives and often fail to form meaningful objects. Recently, several reliable methods of Object Oriented Analysis (OOA) have been developed that utilize a full range of spectral, spatial, textural, and contextual parameters to delineate regions of interest. A comparison of these two approaches on a historical dataset of the landslide-affected city of San Juan La Laguna, Guatemala, demonstrates the benefits of OOA methods over unsupervised classification. Overall accuracies of 96.5% and 94.3% and F-scores of 84.3% and 77.9% were achieved for the OOA and unsupervised classification methods, respectively. The larger difference in F-score is a result of the low precision of unsupervised classification, caused by poor false-positive removal, the greatest shortcoming of that method.
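
As an illustration of how the reported metrics relate to a pixel-wise comparison of a classified landslide map against reference data, the short sketch below computes overall accuracy, precision, recall, and F-score from a confusion matrix. The maps are synthetic placeholders, not the San Juan La Laguna dataset.

```python
# Illustrative computation of the evaluation metrics reported above (overall
# accuracy and F-score) from a pixel-wise confusion matrix; the maps here are
# synthetic stand-ins, not the actual classification results.
import numpy as np

rng = np.random.default_rng(1)
reference = rng.random((500, 500)) < 0.03        # "true" landslide pixels (~3%)
classified = reference.copy()
flip = rng.random(reference.shape) < 0.02        # simulate classification errors
classified ^= flip

tp = np.sum(classified & reference)              # true positives
fp = np.sum(classified & ~reference)             # false positives
fn = np.sum(~classified & reference)             # false negatives
tn = np.sum(~classified & ~reference)            # true negatives

overall_accuracy = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)                       # low precision = many false positives
recall = tp / (tp + fn)
f_score = 2 * precision * recall / (precision + recall)

print(f"OA = {overall_accuracy:.1%}, precision = {precision:.1%}, "
      f"recall = {recall:.1%}, F-score = {f_score:.1%}")
```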

Relevance: 90.00%

Publisher:

Abstract:

The introduction of conservation practices in degraded agricultural land generally restores soil quality, especially by increasing soil organic matter. This aspect of soil organic C (SOC) dynamics under distinct cropping and management systems can be conveniently analyzed with ecosystem models such as the Century Model. In this study, Century was used to simulate SOC stocks in farm fields of the Ibiruba region of north-central Rio Grande do Sul state in Southern Brazil. The region, where soils are predominantly Oxisols, was originally covered with subtropical woodlands and grasslands. SOC dynamics were simulated with a general scenario developed from historical data on soil management and cropping systems, beginning with the onset of agriculture in 1900. From 1993 to 2050, two contrasting scenarios based on no-tillage soil management were established: the status quo scenario, with crops and agricultural inputs as currently practiced in the region, and the high-biomass scenario, with an increased frequency of corn in the cropping system resulting in about 80% more biomass added to soils. Century simulations were in close agreement with SOC stocks measured in 2005 in the Oxisols with a finer-textured surface horizon originally under woodlands. However, simulations in the Oxisols with a loamy surface horizon under woodlands and in the grassland soils were not as accurate. SOC stocks decreased by 44% to 50% in fields originally under woodland and by 20% to 27% in fields under grasslands after the introduction of annual grain crops with intensive tillage and harrowing operations. The adoption of conservation practices in the 1980s led to a stabilization of SOC stocks followed by a partial recovery of native stocks. Simulations to 2050 indicate that maintaining the status quo would allow SOC stocks to recover to 81% to 86% of the native stocks under woodland and to 80% to 91% of the native stocks under grasslands. Adoption of the high-biomass scenario would result in stocks of 75% to 95% of the original stocks under woodlands and 89% to 102% under grasslands by 2050. These simulation outcomes underline the importance of cropping systems yielding higher biomass to further increase SOC content in these Oxisols. This application of the Century Model could reproduce general trends of SOC loss and recovery in the Oxisols of the Ibiruba region. Additional calibration and validation should be conducted before extensive usage of Century as a support tool for soil carbon sequestration projects in this and other regions can be recommended. (C) 2009 Elsevier B.V. All rights reserved.
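
For readers unfamiliar with scenario-based SOC projection, the toy sketch below shows the general workflow of projecting a stock forward under two contrasting input scenarios. It is a single-pool, first-order model, not the multi-pool Century Model, and all rate constants, inputs, and stocks are arbitrary round numbers, so its outputs do not reproduce the percentages reported above.

```python
# A deliberately simple single-pool carbon balance, NOT the Century model:
# SOC(t+1) = SOC(t) + h * input - k * SOC(t). It only illustrates how two
# input scenarios (status quo vs. higher-biomass cropping) are projected
# forward; every number below is a made-up round figure.
def simulate_soc(soc0, annual_c_input, humification=0.2, decay=0.02, years=45):
    """Project a SOC stock (Mg C/ha) under a constant annual C input (Mg C/ha/yr)."""
    soc = soc0
    for _ in range(years):
        soc += humification * annual_c_input - decay * soc
    return soc

native_stock = 80.0                    # hypothetical native SOC stock, Mg C/ha
current_stock = 0.55 * native_stock    # after decades of intensive tillage

status_quo = simulate_soc(current_stock, annual_c_input=6.5)
high_biomass = simulate_soc(current_stock, annual_c_input=11.7)  # ~80% more biomass

for name, stock in [("status quo", status_quo), ("high biomass", high_biomass)]:
    print(f"{name:12s}: {stock:5.1f} Mg C/ha ({stock / native_stock:.0%} of native)")
```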

Relevance: 90.00%

Publisher:

Abstract:

Ecological extinction caused by overfishing precedes all other pervasive human disturbance to coastal ecosystems, including pollution, degradation of water quality, and anthropogenic climate change. Historical abundances of large consumer species were fantastically large in comparison with recent observations. Paleoecological, archaeological, and historical data show that time lags of decades to centuries occurred between the onset of overfishing and consequent changes in ecological communities, because unfished species of similar trophic level assumed the ecological roles of overfished species until they too were overfished or died of epidemic diseases related to overcrowding. Retrospective data not only help to clarify underlying causes and rates of ecological change, but they also demonstrate achievable goals for restoration and management of coastal ecosystems that could not even be contemplated based on the limited perspective of recent observations alone.

Relevance: 90.00%

Publisher:

Abstract:

Introduction: This paper deals with the increasing use of corporate merger and acquisition strategies within the pharmaceutical industry. The aim is to identify the triggers of this business phenomenon and its immediate impact on the financial outcomes of two powerful biopharmaceutical corporations, Pfizer and GlaxoSmithKline, which were sampled because of their successful use of the tactics in question. Materials and Methods: In order to create an overview of their development through mergers and acquisitions, the historical data of the two corporations were consulted on their official websites. The most relevant events were then associated with corresponding information from the corporations' financial reports and statements, obtained from web-based financial data providers. Results and Discussions: In the past few decades, Pfizer and GlaxoSmithKline have purchased or merged with various companies in order to monopolize new markets, diversify their product and service portfolios, and survive and surpass competitors. The consequences proved to be positive, although this approach requires substantial capital availability. Conclusions: The results reveal that, as far as the two sampled companies are concerned, acquisitions and mergers are reactions to the pressure of a highly competitive environment. Moreover, the continuous diversification of the market's needs is also a consistent motive. However, the prevalence and prominence of merger and acquisition strategies are conditioned by the tender offer, the announcing firm's caliber, research and development status, and other factors determined by the internal and external actors of the market.

Relevance: 90.00%

Publisher:

Abstract:

Digital Businesses have become a major driver of economic growth and have seen an explosion of new startups. At the same time, the sector also includes mature enterprises that have become global giants in a relatively short period of time. Digital Businesses have unique characteristics that make running and managing a Digital Business much different from running a traditional offline business. Digital businesses respond to online users who are highly interconnected and networked. This enables a rapid flow of word of mouth, at a pace far greater than ever envisioned for traditional products and services. The relatively low cost of adding incremental users has led to a variety of innovations in the pricing of digital products, including various forms of free and freemium pricing models. This thesis explores the unique characteristics and complexities of Digital Businesses and their implications for the design of Digital Business Models and Revenue Models. The thesis proposes an Agent Based Modeling Framework that can be used to develop Simulation Models that capture the complex dynamics of Digital Businesses and the interactions between users of a digital product. Such Simulation Models can be used for a variety of purposes, such as simple forecasting, analysing the impact of market disturbances, analysing the impact of changes in pricing models, and optimising pricing for maximum revenue generation or for a balance between growth in usage and revenue generation. These models can be developed for a mature enterprise with a long historical record of user growth as well as for early-stage enterprises without much historical data. Through three case studies, the thesis demonstrates the applicability of the Framework and its potential applications.
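
A minimal agent-based sketch of the dynamics described above is given below: agents adopt a freemium product through word of mouth over a random contact network, and a fraction of free users convert to a paid tier each period. The network size, adoption and conversion probabilities, and price are illustrative assumptions, not parameters from the thesis or its case studies.

```python
# Toy agent-based model of word-of-mouth adoption and freemium conversion.
# All parameters are illustrative assumptions.
import random

random.seed(42)

N = 5000                      # potential users (agents)
AVG_CONTACTS = 8              # contacts per agent
P_ADOPT = 0.04                # chance an exposed non-user adopts per period
P_CONVERT = 0.02              # chance a free user upgrades to paid per period
PRICE = 5.0                   # monthly price of the premium tier

contacts = [random.sample(range(N), AVG_CONTACTS) for _ in range(N)]
state = ["none"] * N          # "none" -> "free" -> "paid"
for seed_user in random.sample(range(N), 25):
    state[seed_user] = "free"                 # early adopters

for month in range(1, 25):
    exposed = set()
    for agent, s in enumerate(state):
        if s != "none":                       # existing users spread word of mouth
            exposed.update(contacts[agent])
    for agent in exposed:
        if state[agent] == "none" and random.random() < P_ADOPT:
            state[agent] = "free"
    for agent in range(N):
        if state[agent] == "free" and random.random() < P_CONVERT:
            state[agent] = "paid"
    free, paid = state.count("free"), state.count("paid")
    if month % 6 == 0:
        print(f"month {month:2d}: users={free + paid:5d}, "
              f"paid={paid:4d}, monthly revenue=${paid * PRICE:,.0f}")
```

Varying the pricing parameters or the network structure in such a model is one way to explore the forecasting and pricing-optimization uses the abstract mentions.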

Relevance: 90.00%

Publisher:

Abstract:

In this paper, we analyze the behavior of real interest rates over the long run using historical data for nine developed economies, to assess the extent to which the recent decline observed in most advanced countries is at odds with past data, as suggested by the Secular Stagnation hypothesis. Using data going back to 1703 and performing stationarity and structural break tests, we find that the recent decline in interest rates is not explained by a structural break in the time series. Our results also show that considering long-run data leads to different conclusions than using short-run data alone.
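
The sketch below illustrates the two kinds of tests mentioned, on a synthetic stand-in series rather than the authors' 1703-onward data: an Augmented Dickey-Fuller test for a unit root and a simple Chow-type F test for a structural break in the mean at an arbitrarily chosen candidate year.

```python
# Stationarity and structural-break checks on a synthetic long-run series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(7)
years = np.arange(1703, 2016)
# Stationary AR(1) around a constant mean, standing in for a real-rate series.
r = np.empty(len(years))
r[0] = 3.0
for t in range(1, len(years)):
    r[t] = 0.7 * r[t - 1] + 0.9 + rng.normal(scale=1.0)

# 1) Stationarity: the ADF null hypothesis is a unit root.
adf_stat, p_value, *_ = adfuller(r)
print(f"ADF statistic = {adf_stat:.2f}, p-value = {p_value:.3f}")

# 2) Structural break in the mean at a candidate year (1980 chosen arbitrarily):
#    F-test comparing a single-mean model with separate pre/post means.
k = np.searchsorted(years, 1980)
rss_pooled = np.sum((r - r.mean()) ** 2)
rss_split = np.sum((r[:k] - r[:k].mean()) ** 2) + np.sum((r[k:] - r[k:].mean()) ** 2)
f_stat = (rss_pooled - rss_split) / (rss_split / (len(r) - 2))
print(f"Chow-type F statistic at 1980 = {f_stat:.2f}")
```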

Relevance: 90.00%

Publisher:

Abstract:

In recent years, thin whitetopping has evolved into a viable rehabilitation technique for deteriorated asphalt cement concrete (ACC) pavements. Numerous projects have been constructed and tested, allowing researchers to identify the elements that contribute to a project's success. These elements include surface preparation, overlay thickness, synthetic fiber reinforcement usage, joint spacing, and joint sealing. Although the main factors affecting thin whitetopping performance have been identified by previous research, questions remained as to the optimum design incorporating these variables. The objective of this research is to investigate the interaction between these variables over time. Laboratory testing and field testing were planned in order to accomplish the research objective. Laboratory testing involved shear testing of the bond between the portland cement concrete (PCC) overlay and the ACC surface. Field testing involved falling weight deflectometer deflection responses, measurement of joint faulting and joint opening, and visual distress surveys on the 9.6-mile project. The project was located on Iowa Highway 13, extending north from the city of Manchester, Iowa, to Iowa Highway 3 in Delaware County. Variables investigated included ACC surface preparation, PCC thickness, synthetic fiber reinforcement usage, and joint spacing. This report documents the planning, equipment selection, construction, field changes, and construction concerns of the project, which was built in 2002. The data from this research could be combined with historical data to develop a design specification for the construction of thin, unbonded overlays.