825 results for network traffic analysis


Relevance: 30.00%

Abstract:

Traditional crash prediction models, such as generalized linear regression models, cannot account for the multilevel data structure that pervades crash data. Disregarding possible within-group correlations can produce models with unreliable and biased estimates of the unknowns. This study proposes a multilevel hierarchy, viz. (Geographic region level – Traffic site level – Traffic crash level – Driver-vehicle unit level – Vehicle-occupant level) × Time level, to establish a general form of multilevel data structure in traffic safety analysis. To properly model the potential cross-group heterogeneity arising from this multilevel data structure, a framework of Bayesian hierarchical models that explicitly specifies the multilevel structure and correctly yields parameter estimates is introduced and recommended. The proposed method is illustrated with an individual-severity analysis of intersection crashes using Singapore crash records. The study demonstrates the importance of accounting for within-group correlations and shows the flexibility and effectiveness of the Bayesian hierarchical method in modeling the multilevel structure of traffic crash data.
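
As a rough illustration of the kind of model this abstract describes — not the authors' actual specification — the sketch below fits a two-level Bayesian hierarchical logit with a random intercept per traffic site, using PyMC on simulated data. All variable names, priors and the simulated covariate are assumptions for the example.

```python
# Minimal sketch, assuming PyMC is available: hierarchical logit for crash severity
# with a random intercept per traffic site. Data are simulated for illustration.
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
n_sites, n_obs = 20, 400
site = rng.integers(0, n_sites, n_obs)           # group index (traffic site)
speed = rng.normal(0, 1, n_obs)                   # standardized covariate (hypothetical)
true_site_eff = rng.normal(0, 0.8, n_sites)
p = 1 / (1 + np.exp(-(-1.0 + 0.6 * speed + true_site_eff[site])))
severe = rng.binomial(1, p)                       # 1 = severe injury

with pm.Model():
    sigma_site = pm.HalfNormal("sigma_site", 1.0)        # between-site variation
    u = pm.Normal("u", 0.0, sigma_site, shape=n_sites)   # site random intercepts
    beta0 = pm.Normal("beta0", 0.0, 5.0)
    beta_speed = pm.Normal("beta_speed", 0.0, 5.0)
    eta = beta0 + beta_speed * speed + u[site]
    pm.Bernoulli("severe", logit_p=eta, observed=severe)
    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```

The key design point the abstract argues for is visible in the model: the site-level intercepts induce correlation among crashes observed at the same site, which a flat (non-hierarchical) logit would ignore.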

Relevance: 30.00%

Abstract:

Decellularization of tissues can provide a unique biological environment for regenerative medicine applications only if minimal disruption of their microarchitecture is achieved during the decellularization process. The goal is to keep the structural integrity of such a construct as functional as the tissues from which it was derived. In this work, cartilage-on-bone laminates were decellularized using enzymatic, non-ionic and ionic protocols. The study investigated the effects of the decellularization process on the microarchitecture of the cartilaginous extracellular matrix (ECM), determining the extent to which each process deteriorated the structural organization of the network. High-resolution microscopy was used to capture cross-sectional images of samples before and after treatment. The variation in microarchitecture was then analysed using a well-defined fast Fourier transform image processing algorithm. Statistical analysis of the results revealed significant alterations among the protocols (p < 0.05). Ranked by their effectiveness in disrupting ECM integrity, the treatments were ordered: Trypsin > SDS > Triton X-100.
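
Purely as an illustration of the kind of analysis described — not the study's own code — the sketch below uses a 2-D fast Fourier transform to estimate the dominant texture orientation of a cross-section image. The synthetic striped image, the 5-degree angle bins and the peak-picking rule are all assumptions.

```python
# Minimal sketch, assuming NumPy only: dominant texture orientation via the 2-D FFT.
import numpy as np

def dominant_orientation(img):
    # Subtract the mean so the DC component does not dominate the spectrum.
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    y, x = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    angles = np.degrees(np.arctan2(y, x)) % 180      # spectral angle per frequency bin
    bins = np.arange(0, 181, 5)                      # 5-degree angular bins
    energy, _ = np.histogram(angles, bins=bins, weights=power)
    return bins[np.argmax(energy)]                   # angle of peak spectral energy

# Synthetic striped texture: stripes run along ~135 degrees, so the spectral
# peak (perpendicular to the stripes) is expected near 45 degrees.
yy, xx = np.mgrid[0:256, 0:256]
img = np.sin(2 * np.pi * (xx + yy) / 16.0)
print(dominant_orientation(img))
```

Comparing such orientation histograms before and after each protocol is one plausible way to quantify how strongly a treatment disrupts the fibrous organization.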

Relevance: 30.00%

Abstract:

Tissue-specific extracellular matrix (ECM) is known to be an ideal bioscaffold to inspire the future of regenerative medicine. It holds the secret of how nature has developed such an organization of molecules into a unique functional complexity. This work exploited an innovative image processing algorithm and high-resolution microscopy, together with mechanical analysis, to establish a correlation between the gradient organization of cartilaginous ECM and its anisotropic biomechanical response. This was hypothesized to be a reliable determinant of how microarchitecture interrelates with biomechanical properties. The Hough-Radon transform of ECM cross-section images revealed its conformational variation from the tangential interface down to the subchondral region. As the orientation varied layer by layer, the anisotropic mechanical response varied accordingly. Although the results were in good agreement (Kendall's tau-b > 90%), there was evidence suggesting that the alignment of the fibrous network, specifically in the middle zone, is not as random as previously thought.
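
As a toy illustration of the reported rank correlation — not the study's data — Kendall's tau-b between a depth-wise orientation profile and a matching layer-wise stiffness profile can be computed with SciPy; the numbers below are invented.

```python
# Minimal sketch, assuming SciPy: rank agreement between orientation and stiffness per layer.
from scipy.stats import kendalltau

orientation_deg = [12, 25, 38, 55, 71, 84]        # hypothetical dominant angle per layer
modulus_mpa = [0.6, 0.9, 1.4, 2.1, 3.0, 3.8]      # hypothetical layer stiffness
tau_b, p_value = kendalltau(orientation_deg, modulus_mpa)   # tau-b is the default variant
print(f"Kendall tau-b = {tau_b:.2f} (p = {p_value:.3f})")
```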

Relevance: 30.00%

Abstract:

With the increasing rate of shipping traffic, the risk of collisions in busy and congested port waters is expected to rise. However, because of low collision frequencies, it is difficult to analyze such risk in a statistically sound manner. This study examines the occurrence of traffic conflicts in order to understand the characteristics of vessels involved in navigational hazards. A binomial logit model was employed to evaluate the association of vessel attributes and kinematic conditions with conflict severity levels. Results show a positive association between conflict risk and vessels of small gross tonnage, overall length, height and draft. Conflicts involving a pair of dynamic vessels sailing at low speeds show similar effects.
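
A hedged sketch of the modelling step described above — a binomial logit of conflict severity on vessel attributes — using statsmodels on simulated data. The covariates, their coding and the coefficients used to simulate the outcome are assumptions, not the study's.

```python
# Minimal sketch, assuming statsmodels: binomial logit of conflict severity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
gross_tonnage = rng.lognormal(8, 1, n)        # illustrative vessel attributes
length_m = rng.normal(150, 40, n)
speed_kn = rng.gamma(3, 2, n)
X = sm.add_constant(np.column_stack([np.log(gross_tonnage), length_m, speed_kn]))

# Simulated binary outcome: 1 = serious conflict (coefficients are invented).
logit = 2.5 - 0.3 * np.log(gross_tonnage) - 0.005 * length_m - 0.05 * speed_kn
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = sm.Logit(y, X).fit(disp=False)
print(model.summary(xname=["const", "log_gt", "length_m", "speed_kn"]))
```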

Relevance: 30.00%

Abstract:

The traffic conflict technique (TCT) is a powerful technique applied in road traffic safety assessment as a surrogate for traditional accident data analysis, overcoming the conceptual and implementation weaknesses of accident statistics. Although the technique has been applied effectively in road traffic, it has not been well practised in marine traffic, even though that traffic system has distinct advantages in terms of having a monitoring system. This monitoring system can provide navigational information, as well as other geometric information about ships, for a larger study area over a longer time period. However, before implementing the TCT in the marine traffic system, it should be examined critically to suit the complex nature of that system. This paper examines the suitability of the TCT for marine traffic and proposes a framework for a follow-up comprehensive conflict study.

Relevance: 30.00%

Abstract:

The skyrocketing trend of social media on the Internet is greatly altering analytical Customer Relationship Management (CRM). Against this backdrop, the purpose of this paper is to advance the conceptual design of Business Intelligence (BI) systems with data identified from social networks. We develop an integrated social network data model based on an in-depth analysis of Facebook. The data model can inform the design of data warehouses in order to offer new opportunities for CRM analyses, leading to a more consistent and richer picture of customers' characteristics, needs, wants, and demands. Four major contributions are offered. First, Social CRM and Social BI are introduced as emerging fields of research. Second, we develop a conceptual data model to identify and systematize the data available on online social networks. Third, based on the identified data, we design a multidimensional data model as an early contribution to the conceptual design of Social BI systems and demonstrate its application by developing management reports in a retail scenario. Fourth, intellectual challenges for advancing Social CRM and Social BI are discussed.
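
Purely illustrative of the multidimensional-model idea — not the authors' schema — the sketch below sets up a minimal star schema (fact = social interaction; dimensions = user, time, content) in SQLite and runs one CRM-style report. Every table and column name is an assumption.

```python
# Minimal sketch, assuming only the Python standard library: a toy Social BI star schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_user    (user_id INTEGER PRIMARY KEY, age_band TEXT, country TEXT);
CREATE TABLE dim_time    (time_id INTEGER PRIMARY KEY, day TEXT, month TEXT, year INTEGER);
CREATE TABLE dim_content (content_id INTEGER PRIMARY KEY, type TEXT, brand TEXT);
CREATE TABLE fact_interaction (
    user_id INTEGER REFERENCES dim_user(user_id),
    time_id INTEGER REFERENCES dim_time(time_id),
    content_id INTEGER REFERENCES dim_content(content_id),
    likes INTEGER, comments INTEGER, shares INTEGER
);
""")

# Example management report: total engagement per brand and month.
report = conn.execute("""
SELECT c.brand, t.month, SUM(f.likes + f.comments + f.shares) AS engagement
FROM fact_interaction f
JOIN dim_content c ON c.content_id = f.content_id
JOIN dim_time t    ON t.time_id    = f.time_id
GROUP BY c.brand, t.month;
""").fetchall()
print(report)
```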

Relevance: 30.00%

Abstract:

With increasing rate of shipping traffic, the risk of collisions in busy and congested port waters is likely to rise. However, due to low collision frequencies in port waters, it is difficult to analyze such risk in a sound statistical manner. A convenient approach of investigating navigational collision risk is the application of the traffic conflict techniques, which have potential to overcome the difficulty of obtaining statistical soundness. This study aims at examining port water conflicts in order to understand the characteristics of collision risk with regard to vessels involved, conflict locations, traffic and kinematic conditions. A hierarchical binomial logit model, which considers the potential correlations between observation-units, i.e., vessels, involved in the same conflicts, is employed to evaluate the association of explanatory variables with conflict severity levels. Results show higher likelihood of serious conflicts for vessels of small gross tonnage or small overall length. The probability of serious conflict also increases at locations where vessels have more varied headings, such as traffic intersections and anchorages; becoming more critical at night time. Findings from this research should assist both navigators operating in port waters as well as port authorities overseeing navigational management.
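
The distinctive feature here — two vessels sharing one conflict — can be captured with a random intercept per conflict. A minimal PyMC sketch on invented data (names, priors and the simulated covariate are assumptions, not the study's model):

```python
# Minimal sketch, assuming PyMC: hierarchical binomial logit with a shared conflict-level intercept.
import numpy as np
import pymc as pm

rng = np.random.default_rng(7)
n_conflicts = 150
conflict_id = np.repeat(np.arange(n_conflicts), 2)   # two vessels per conflict
log_gt = rng.normal(8, 1, 2 * n_conflicts)           # illustrative covariate (log gross tonnage)
u_true = rng.normal(0, 0.7, n_conflicts)
p = 1 / (1 + np.exp(-(1.5 - 0.25 * log_gt + u_true[conflict_id])))
serious = rng.binomial(1, p)                          # 1 = serious conflict

with pm.Model():
    sigma_c = pm.HalfNormal("sigma_c", 1.0)
    u = pm.Normal("u", 0.0, sigma_c, shape=n_conflicts)   # conflict effect shared by both vessels
    b0 = pm.Normal("b0", 0.0, 5.0)
    b_gt = pm.Normal("b_gt", 0.0, 5.0)
    pm.Bernoulli("serious", logit_p=b0 + b_gt * log_gt + u[conflict_id],
                 observed=serious)
    idata = pm.sample(1000, tune=1000)
```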

Relevance: 30.00%

Abstract:

Navigational collisions are one of the major safety concerns in many seaports. Addressing this concern requires a comprehensive and structured method of collision risk management. Traditionally, management of port water collision risks has relied on historical collision data. However, this collision-data-based approach is hampered by several shortcomings: the randomness and rarity of collision occurrence, which leave too few samples for sound statistical analysis; its limited ability to explain collision causation; and its reactive approach to safety. A promising alternative that overcomes these shortcomings is the navigational traffic conflict technique, which uses traffic conflicts in place of collision data. This paper proposes a collision risk management method built on the principles of this technique. The method allows safety analysts to diagnose safety deficiencies proactively and, consequently, has great potential for managing collision risks in a fast, reliable and efficient manner.

Relevance: 30.00%

Abstract:

Particles emitted by vehicles are known to cause detrimental health effects, with their size and oxidative potential among the main factors responsible. Understanding the relationship between traffic composition and both the physical characteristics and oxidative potential of particles is therefore critical. To contribute to the limited knowledge base in this area, we investigated this relationship in a 4.5 km road tunnel in Brisbane, Australia. On-road concentrations of ultrafine particles (<100 nm, UFPs), fine particles (PM2.5), CO, CO2 and particle-associated reactive oxygen species (ROS) were measured using vehicle-based mobile sampling. UFPs were measured using a condensation particle counter and PM2.5 with a DustTrak aerosol photometer. A new profluorescent nitroxide probe, BPEAnit, was used to determine ROS levels. Comparative measurements were also performed on an above-ground road to assess the role of emission dilution in the parameters measured. The profile of UFP and PM2.5 concentration with distance through the tunnel was determined and showed relationships with both road gradient and tunnel ventilation. ROS levels in the tunnel were high compared with an open road with similar traffic characteristics, which was attributed to the substantial difference in estimated emission dilution ratios on the two roadways. Principal component analysis (PCA) revealed that the levels of pollutants and ROS were generally better correlated with total traffic count than with traffic composition (i.e. diesel- and gasoline-powered vehicles). A possible reason for the lack of correlation with heavy-duty vehicles (HDV), which have previously been shown to be strongly associated with UFPs in particular, was the low absolute numbers encountered during sampling; this may have made their contribution to in-tunnel pollution largely indistinguishable from that of the total vehicle volume. For ROS, the stronger association observed with HDV and gasoline vehicles combined (total traffic count) than with either considered individually may signal a role for the interaction of their emissions as a determinant of on-road ROS in this pilot study. If further validated, this should not be overlooked in studies of on- or near-road particle exposure and its potential health effects.
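
A hedged sketch, not the study's analysis: principal component analysis of co-measured pollutant and traffic variables with scikit-learn, on invented data, to see which variables load together. The variable set, averaging period and simulated relationships are all assumptions.

```python
# Minimal sketch, assuming scikit-learn: PCA on standardized pollutant/traffic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 200                                   # hypothetical one-minute averages through the tunnel
total_traffic = rng.poisson(60, n).astype(float)
hdv_count = rng.poisson(4, n).astype(float)
ufp = 2e4 * total_traffic + rng.normal(0, 2e5, n)     # invented relationships
pm25 = 0.4 * total_traffic + rng.normal(0, 5, n)
ros = 0.02 * total_traffic + 0.05 * hdv_count + rng.normal(0, 0.5, n)

X = np.column_stack([total_traffic, hdv_count, ufp, pm25, ros])
Z = StandardScaler().fit_transform(X)                 # standardize before PCA
pca = PCA(n_components=2).fit(Z)
print("explained variance ratio:", pca.explained_variance_ratio_)
print("component loadings:\n", pca.components_)
```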

Relevance: 30.00%

Abstract:

The monitoring sites comprising a state of the environment (SOE) network must be carefully selected to ensure that they are representative of the broader resource. Hierarchical cluster analysis (HCA) is a data-driven technique that can potentially be employed to assess the representativeness of an SOE monitoring network. The objective of this paper is to explore the use of HCA for assessing the representativeness of the New Zealand National Groundwater Monitoring Programme (NGMP), which comprises 110 monitoring sites across the country.
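
An illustrative sketch of hierarchical cluster analysis with SciPy — not the NGMP workflow itself — clustering sites on standardized hydrochemistry-style variables (invented here) and counting how many sites fall in each cluster; the number of variables, the Ward linkage and the eight-cluster cut are assumptions.

```python
# Minimal sketch, assuming SciPy: agglomerative (hierarchical) clustering of monitoring sites.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(5)
sites = rng.normal(size=(110, 6))                          # invented site-by-variable matrix
sites = (sites - sites.mean(axis=0)) / sites.std(axis=0)   # standardize each variable

dist = pdist(sites, metric="euclidean")
Z = linkage(dist, method="ward")                 # build the cluster tree
labels = fcluster(Z, t=8, criterion="maxclust")  # cut the tree into 8 clusters
print(np.bincount(labels)[1:])                   # number of sites per cluster
```

A network in which every cluster contains at least one monitoring site is one simple, data-driven indication of representativeness.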

Relevance: 30.00%

Abstract:

New substation automation applications, such as sampled value process buses and synchrophasors, require sampling accuracy of 1 µs or better. The Precision Time Protocol (PTP), IEEE Std 1588, achieves this level of performance and integrates well into Ethernet based substation networks. This paper takes a systematic approach to the performance evaluation of commercially available PTP devices (grandmaster, slave, transparent and boundary clocks) from a variety of manufacturers. The "error budget" is set by the performance requirements of each application. The "expenditure" of this error budget by each component is valuable information for a system designer. The component information is used to design a synchronization system that meets the overall functional requirements. The quantitative performance data presented shows that this testing is effective and informative. Results from testing PTP performance in the presence of sampled value process bus traffic demonstrate the benefit of a "bottom up" component testing approach combined with "top down" system verification tests. A test method that uses a precision Ethernet capture card, rather than dedicated PTP test sets, to determine the Correction Field Error of transparent clocks is presented. This test is particularly relevant for highly loaded Ethernet networks with stringent timing requirements. The methods presented can be used for development purposes by manufacturers, or by system integrators for acceptance testing. A sampled value process bus was used as the test application for the systematic approach described in this paper. The test approach was applied, components were selected, and the system performance verified to meet the application's requirements. Systematic testing, as presented in this paper, is applicable to a range of industries that use, rather than develop, PTP for time transfer.
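
A back-of-the-envelope illustration of the "error budget" bookkeeping described above; the individual device contributions below are assumptions chosen for the example, not measured values.

```python
# Minimal sketch: summing hypothetical worst-case time-error contributions, in nanoseconds,
# against a 1 microsecond sampling-accuracy requirement.
budget_ns = 1000
contributions_ns = {
    "grandmaster clock": 100,
    "transparent clock 1 (correction field error)": 50,
    "transparent clock 2 (correction field error)": 50,
    "slave clock": 200,
    "network asymmetry allowance": 300,
}
spent = sum(contributions_ns.values())
status = "OK" if spent <= budget_ns else "over budget"
print(f"spent {spent} ns of {budget_ns} ns budget ({status}, margin {budget_ns - spent} ns)")
```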

Relevance: 30.00%

Abstract:

The Texas Department of Transportation (TxDOT) is concerned about the widening gap between pavement preservation needs and available funding. The TxDOT Austin District Pavement Engineer (DPE) has therefore investigated methods to strategically allocate available pavement funding to projects that improve the overall performance of the District and Texas highway systems. The primary objective of the study presented in this paper is to develop a network-level project screening and ranking method that supports development of the Austin District four-year pavement management plan. The study developed candidate project selection and ranking algorithms that evaluate the pavement condition of each candidate project using data from the Pavement Management Information System (PMIS) database and incorporate insights from Austin District pavement experts, and it implemented the developed method and supporting algorithms. This process previously required weeks to complete but now takes about 10 minutes, including data preparation and running the analysis algorithm, which enables the Austin DPE to devote more time and resources to conducting field visits, performing project-level evaluation and testing candidate projects. The case study results showed that the proposed method assisted the DPE in evaluating and prioritizing projects and in allocating funds to the right projects at the right time.
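
A toy sketch of the kind of screening-and-ranking step described above — a weighted composite of condition and traffic indicators per candidate section — with made-up data and weights; it is not TxDOT's actual algorithm, and the section names, indicator scales and weights are all hypothetical.

```python
# Minimal sketch in plain Python: rank candidate sections by a weighted priority score.
# Condition indicators are on a 0-100 scale (higher = better); ADT = average daily traffic.
candidates = [
    {"section": "Section A", "distress": 62, "ride": 70, "adt": 41000},
    {"section": "Section B", "distress": 48, "ride": 55, "adt": 36000},
    {"section": "Section C", "distress": 80, "ride": 85, "adt": 9000},
]
weights = {"distress": 0.5, "ride": 0.3, "adt": 0.2}
max_adt = max(c["adt"] for c in candidates)

def priority(c):
    """Higher score = higher treatment priority (worse condition, more traffic)."""
    need = weights["distress"] * (100 - c["distress"]) + weights["ride"] * (100 - c["ride"])
    exposure = weights["adt"] * 100 * c["adt"] / max_adt
    return need + exposure

for c in sorted(candidates, key=priority, reverse=True):
    print(f'{c["section"]}: priority score {priority(c):.1f}')
```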

Relevance: 30.00%

Abstract:

Findings from an online survey conducted by Queensland University of Technology (QUT) show that Australia suffers from a lack of data on trip generation for use in Traffic Impact Assessments (TIAs). In addition, the independent variables currently used for trip generation estimation do not produce robust outcomes. It is also challenging in Australian TIA studies to account for the impact of a new development on public and active transport, as well as the effect of trip chaining behaviour. With this background in mind, research is being undertaken by QUT to develop a combined model of trip generation and mode choice that accounts for trip chaining effects. It is expected that the model will provide transferable outcomes, as it is developed from socio-demographic parameters. Child care centres within the Brisbane area have been nominated for model development. At the time of writing, the project is in the data collection phase. Findings from the pilot survey on capturing trip chaining and mode choice information reveal that the questionnaire captures the required information at an acceptable level. The results also indicate that several centres within an area should be surveyed in order to provide sufficient data for trip chaining and modal split analysis.

Relevance: 30.00%

Abstract:

Safety at Railway Level Crossings (RLXs) is an important issue within the Australian transport system. Crashes at RLXs involving road vehicles in Australia are estimated to cost $10 million each year. Such crashes are mainly due to human factors; unintentional errors contribute to 46% of all fatal collisions and are far more common than deliberate violations. This suggests that innovative interventions targeting drivers are particularly promising for improving RLX safety. In recent years there has been rapid development of a variety of affordable technologies which can be used to increase drivers' risk awareness around crossings. To date, no research has evaluated the potential effects of such technologies at RLXs in terms of safety, traffic and acceptance of the technology. Integrating driving and traffic simulations is a safe and affordable approach for evaluating these effects. The methodology will be implemented in a driving simulator, where we recreate realistic driving scenarios with typical road environments and realistic traffic. This paper presents a methodology for comprehensively evaluating the potential benefits and negative effects of such interventions: it assesses driver awareness at RLXs, and driver distraction and workload when using the technology. Subjective assessments of the perceived usefulness and ease of use of the technology are obtained from standard questionnaires. Driving simulation will provide a model of driving behaviour at RLXs, which will be used to estimate the effects of such new technology on a road network featuring RLXs for different market penetrations using a traffic simulation. This methodology can assist in evaluating future safety interventions at RLXs.