13 results for large transportation network

at University of Queensland eSpace - Australia


Relevance: 40.00%

Abstract:

We carried out a retrospective review of the videoconference activity records in a university-run hospital telemedicine studio. Usage records describing videoconferencing activity in the telemedicine studio were compared with the billing records provided by the telecommunications company. During a seven-month period there were 211 entries in the studio log: 108 calls made from the studio and 103 calls made from a far-end location. We found that only 103 of the 195 calls reported by the telecommunications company were recorded in the usage log. The remaining 92 calls were probably unrecorded for one of several reasons: failed calls (a large number of the unrecorded calls, 57%, lasted for less than 2 min, with a median of 1.6 min); incorrectly recorded call origin (calls logged as initiated from the far end when they were actually initiated from the studio); and human error. Our study showed that manual recording of videoconference activity may not accurately reflect the actual activity taking place. Those responsible for recording and analysing videoconference activity, particularly in large telemedicine networks, should do so with care.
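The log-versus-billing comparison described above is essentially a record-reconciliation task. A minimal sketch, assuming calls are matched by start time within a tolerance window; the five-minute window, function names, and sample timestamps are all illustrative assumptions, not details from the study:

```python
from datetime import datetime, timedelta

def reconcile(log_calls, billed_calls, tolerance=timedelta(minutes=5)):
    """Match manually logged call start times to billing-record start times.

    Returns (matched pairs, log entries with no billing record,
    billed calls missing from the log).
    """
    unmatched_billed = list(billed_calls)
    matched = []
    for start in log_calls:
        # Find the closest remaining billing record within the tolerance window.
        best = min(unmatched_billed, key=lambda b: abs(b - start), default=None)
        if best is not None and abs(best - start) <= tolerance:
            matched.append((start, best))
            unmatched_billed.remove(best)
    matched_starts = {m[0] for m in matched}
    unmatched_log = [s for s in log_calls if s not in matched_starts]
    return matched, unmatched_log, unmatched_billed

# Two logged calls versus three billed calls: one billed call goes unlogged,
# mirroring the kind of discrepancy the study found.
log = [datetime(2004, 3, 1, 9, 0), datetime(2004, 3, 1, 14, 30)]
billing = [datetime(2004, 3, 1, 9, 2), datetime(2004, 3, 1, 11, 15),
           datetime(2004, 3, 1, 14, 31)]
matched, missing_from_billing, missing_from_log = reconcile(log, billing)
```

The greedy nearest-match strategy is a simplification; a production reconciliation would also compare call direction and duration.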

Relevance: 30.00%

Abstract:

A test of the ability of a probabilistic neural network to classify deposits into types on the basis of deposit tonnage and average Cu, Mo, Ag, Au, Zn, and Pb grades is conducted. The purpose is to examine whether this type of system might serve as a basis for integrating geoscience information available in large mineral databases to classify sites by deposit type. The benefits of proper classification of many sites in large regions are relatively rapid identification of terranes permissive for deposit types and recognition of specific sites perhaps worthy of further exploration. Total tonnages and average grades of 1,137 well-explored deposits identified in published grade and tonnage models representing 13 deposit types were used to train and test the network. Tonnages were transformed by logarithms and grades by square roots to reduce the effects of skewness. All values were scaled by subtracting the variable's mean and dividing by its standard deviation. Half of the deposits were selected randomly for training the probabilistic neural network and the other half were used for independent testing. Tests were performed with a probabilistic neural network employing a Gaussian kernel and separate sigma weights for each class (type) and each variable (grade or tonnage). Deposit types were selected to challenge the neural network: for many types, tonnages or average grades are significantly different from other types, but individual deposits may plot in the grade and tonnage space of more than one type. Porphyry Cu, porphyry Cu-Au, and porphyry Cu-Mo types have similar tonnages and relatively small differences in grades. Redbed Cu deposits typically have tonnages that could be confused with those of porphyry Cu deposits, and they also contain Cu and, in some situations, Ag. Cyprus and kuroko massive sulfide types have about the same tonnages and Cu, Zn, Ag, and Au grades. Polymetallic vein, sedimentary exhalative Zn-Pb, and Zn-Pb skarn types contain many of the same metals.
Sediment-hosted Au, Comstock Au-Ag, and low-sulfide Au-quartz vein types are principally Au deposits with differing amounts of Ag. Given the intent to test the neural network under the most difficult conditions, an overall 75% agreement between the experts and the neural network is considered excellent. Among the largest classification errors, skarn Zn-Pb and Cyprus massive sulfide deposits were classed by the neural network as kuroko massive sulfides (24% and 63% error, respectively). Another large error was the classification of 92% of porphyry Cu-Mo deposits as porphyry Cu deposits. Most of the larger classification errors involve 25 or fewer training deposits, suggesting that some errors might be the result of small sample size. About 91% of the gold deposit types were classed properly, and 98% of porphyry Cu deposits were classed as some type of porphyry Cu deposit. An experienced economic geologist would not make many of the classification errors that were made by the neural network, because the geologic settings of deposits would be used to reduce errors. In a separate test, the probabilistic neural network correctly classed 93% of 336 deposits in eight deposit types when trained with the presence or absence of 58 minerals and six generalized rock types. The overall success rate of the probabilistic neural network when trained on tonnage and average grades would probably be more than 90% with additional information on the presence of a few rock types.
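The classifier described is a Parzen-window density estimator: each class's density at a query point is the average of Gaussian kernels centered on its training deposits, and the class with the highest density wins. A minimal sketch, assuming for brevity a single sigma per class rather than the paper's separate sigma per class and per variable; the toy training points are invented and stand in for already log/sqrt-transformed, standardized (tonnage, grade) values:

```python
import math

def pnn_classify(x, training, sigma):
    """Probabilistic neural network (Gaussian-kernel Parzen classifier).

    training: dict mapping class label -> list of feature vectors.
    sigma: dict mapping class label -> kernel width for that class.
    Returns the label whose average kernel density at x is highest.
    """
    def density(label):
        pts, s = training[label], sigma[label]
        total = 0.0
        for p in pts:
            d2 = sum((a - b) ** 2 for a, b in zip(x, p))
            total += math.exp(-d2 / (2.0 * s * s))
        return total / len(pts)
    return max(training, key=density)

# Toy example: two "deposit types" separated in (log tonnage, sqrt grade) space.
train = {
    "porphyry Cu": [(2.0, 0.7), (2.2, 0.8), (1.9, 0.6)],
    "kuroko":      [(0.5, 1.5), (0.6, 1.4), (0.4, 1.6)],
}
widths = {"porphyry Cu": 0.3, "kuroko": 0.3}
label = pnn_classify((2.1, 0.75), train, widths)
```

Because the per-class sigmas act as smoothing parameters, tuning them (as the paper does per variable) is what lets overlapping types such as the porphyry Cu variants be separated at all.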

Relevance: 30.00%

Abstract:

Rocks used as construction aggregate in temperate climates deteriorate to differing degrees because of repeated freezing and thawing. The magnitude of the deterioration depends on the rock's properties. Aggregate, including crushed carbonate rock, is required to have minimum geotechnical qualities before it can be used in asphalt and concrete. In order to reduce the chances of premature and expensive repairs, extensive freeze-thaw tests are conducted on potential construction rocks. These tests typically involve 300 freeze-thaw cycles and can take four to five months to complete. Less time-consuming tests that (1) predict durability as well as the extended freeze-thaw test or (2) reduce the number of rocks subject to the extended test could save considerable amounts of money. Here we use a probabilistic neural network to try to predict durability, as determined by the freeze-thaw test, from four rock properties measured on 843 limestone samples from the Kansas Department of Transportation. Modified freeze-thaw tests and the less time-consuming specific gravity (dry), specific gravity (saturated), and modified absorption tests were conducted on each sample. Durability factors of 95 or more, as determined from the extensive freeze-thaw tests, are viewed as acceptable; rocks with values below 95 are rejected. If only the modified freeze-thaw test is used to predict which rocks are acceptable, about 45% are misclassified. When 421 randomly selected samples and all four standardized and scaled variables were used to train a probabilistic neural network, the rate of misclassification of 422 independent validation samples dropped to 28%. The network was trained so that each class (group) and each variable had its own coefficient (sigma). In an attempt to reduce errors further, an additional class was added to the training data to represent durability values greater than 84 and less than 98, resulting in only 11% of the samples being misclassified.
About 43% of the test data was classed by the neural net into the middle group; these rocks should be subject to full freeze-thaw tests. Thus, use of the probabilistic neural network would mean that the extended test would need to be applied to only 43% of the samples, and 11% of the rocks classed as acceptable would fail early.
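The three-class scheme amounts to a classifier with an abstain option: the overlapping middle band (above 84 and below 98) is exactly the region where the cheap tests cannot separate pass from fail, so those samples go to the full test. A sketch of one reading of that decision rule; giving the middle band priority over the accept/reject thresholds is an assumption of this sketch:

```python
def classify_durability(durability_factor):
    """Route a sample using the abstract's thresholds: acceptable at a
    durability factor of 95 or more, rejected below 95, with the added
    middle band (84-98 exclusive) sent to the extended 300-cycle test.
    The band deliberately overlaps both outcomes; that overlap is why
    those samples need the full freeze-thaw test to be decided.
    """
    if 84 < durability_factor < 98:
        return "run full freeze-thaw test"
    return "accept" if durability_factor >= 95 else "reject"
```

In practice the network predicts the class from the four cheap measurements rather than from the (unknown) durability factor itself; the rule above only illustrates how the bands partition the outcome space.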

Relevance: 30.00%

Abstract:

We present 547 optical redshifts obtained for galaxies in the region of the Horologium-Reticulum supercluster (HRS) using the Six-degree Field (6dF) multifiber spectrograph on the UK Schmidt Telescope at the Anglo-Australian Observatory. The HRS covers an area of more than 12 degrees x 12 degrees on the sky, centered at approximately alpha = 03h 19m, delta = -50 degrees 02'. Our 6dF observations concentrate on the intercluster regions of the HRS, from which we describe four primary results. First, the HRS spans at least the redshift range from 17,000 to 22,500 km/s. Second, the overdensity of galaxies in the intercluster regions of the HRS in this redshift range is estimated to be 2.4, corresponding to a density contrast delta-rho/rho-bar of about 1.4. Third, we find a systematic trend of increasing redshift along a southeast-northwest spatial axis in the HRS, in that the mean redshift of HRS members increases by more than 1500 km/s from southeast to northwest over a 12-degree region. Fourth, the HRS is bimodal in redshift, with a separation of about 2500 km/s (35 Mpc) between the higher and lower redshift peaks. This is particularly evident once the above spatial-redshift trend is fitted and removed. In short, the HRS appears to consist of two components in redshift space, each exhibiting a similar systematic spatial-redshift trend along a southeast-northwest axis. Lastly, we compare these results with the Shapley supercluster and find similar properties and large-scale features.
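The two quoted numbers are linked by the definition of density contrast: an overdensity expressed as the ratio rho/rho-bar = 2.4 corresponds to a contrast delta-rho/rho-bar = rho/rho-bar - 1 = 1.4. A one-line check:

```python
def overdensity_contrast(density_ratio):
    """Convert a density ratio rho / rho_bar into the density contrast
    delta_rho / rho_bar = rho / rho_bar - 1, the relation connecting the
    abstract's overdensity of 2.4 to its contrast of about 1.4."""
    return density_ratio - 1.0

delta = overdensity_contrast(2.4)
```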

Relevance: 30.00%

Abstract:

Nitric oxide (NO) plays a controversial role in the pathophysiology of sepsis and septic shock. Its vasodilatory effects are well known, but it also has pro- and anti-inflammatory properties, assumes crucial importance in antimicrobial host defense, may act as an oxidant as well as an antioxidant, and is said to be a vital poison for the immune and inflammatory network. Large amounts of NO and peroxynitrite are responsible for hypotension, vasoplegia, cellular suffocation, apoptosis, lactic acidosis, and ultimately multiorgan failure. Therefore, NO synthase (NOS) inhibitors were developed to reverse the deleterious effects of NO. Studies using these compounds have not met with uniform success, however, and a trial using the nonselective NOS inhibitor N(G)-methyl-L-arginine hydrochloride was terminated prematurely because of increased mortality in the treatment arm despite improved shock resolution. Thus, the issue of NOS inhibition in sepsis remains a matter of debate. Several publications have emphasized the differences in clinical applicability between data obtained from unresuscitated, hypodynamic rodent models using a pretreatment approach and data from resuscitated, hyperdynamic models in higher-order species using posttreatment approaches. Therefore, the present review focuses on clinically relevant large-animal studies of endotoxin- or living bacteria-induced, hyperdynamic models of sepsis that integrate standard day-to-day resuscitative care measures.

Relevance: 30.00%

Abstract:

The Great Barrier Reef Marine Park, an area almost the size of Japan, has a new network of no-take areas that significantly improves the protection of biodiversity. The new marine park zoning implements, in a quantitative manner, many of the theoretical design principles discussed in the literature. For example, the new network of no-take areas has at least 20% protection per bioregion, minimum levels of protection for all known habitats and for special or unique features, and minimum sizes for no-take areas of at least 10 or 20 km across at the smallest diameter. Overall, more than 33% of the Great Barrier Reef Marine Park is now in no-take areas (previously 4.5%). The steps leading to this outcome were to clarify to the interested public why the existing level of protection was inadequate; detail the conservation objectives of establishing new no-take areas; work with relevant and independent experts to define, and contribute to, the best scientific process to deliver on the objectives; describe the biodiversity (e.g., map bioregions); define the operational principles needed to achieve the objectives; invite community input on all of the above; gather and layer the data in round-table discussions; report the degree of achievement of the principles for various options of no-take areas; and determine how to address negative impacts. Some of the key success factors in this case have global relevance and include focusing initial communication on the problem to be addressed; applying the precautionary principle; using independent experts; facilitating input to decision making; conducting extensive and participatory consultation; having an existing marine park that encompassed much of the ecosystem; having legislative power under federal law; developing high-level support; ensuring agency priority and ownership; and being able to address the issue of displaced fishers.

Relevance: 30.00%

Abstract:

In Australia more than 300 vertebrates, including 43 insectivorous bat species, depend on hollows in habitat trees for shelter, and many species use a network of multiple trees as roosts. We used roost-switching data on white-striped freetail bats (Tadarida australis; Microchiroptera: Molossidae) to construct a network representation of day roosts in suburban Brisbane, Australia. Bats were caught from a communal roost tree with a roosting group of several hundred individuals and released with transmitters. Each roost used by the bats represented a node in the network, and the movements of bats between roosts formed the links between nodes. Despite differences in gender and reproductive stage, the bats exhibited the same behavior throughout three radiotelemetry periods and over 500 bat-days of radio tracking: each roosted in separate roosts, switched roosts very infrequently, and associated with other bats only at the communal roost. This network resembled a scale-free network, in which the distribution of the number of links from each roost followed a power law. Despite being spread over a large geographic area (>200 km^2), each roost was connected to the others by fewer than three links. One roost (the hub, or communal roost) defined the architecture of the network because it had the most links. That the network showed scale-free properties has profound implications for the management of the habitat trees of this roosting group. Scale-free networks provide high tolerance against stochastic events such as random roost removals but are susceptible to the selective removal of hub nodes.
Network analysis is a useful tool for understanding the structural organization of habitat tree usage; it allows an informed judgment of the relative importance of individual trees and hence the derivation of appropriate management decisions. Conservation planners and managers should emphasize the differential importance of habitat trees and think of them as being analogous to vital service centers in human societies.
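The robustness asymmetry described above (tolerant of random roost loss, fragile to hub loss) can be demonstrated on a hypothetical star-shaped roost network like the one the bats formed around their communal roost; the node names and topology below are illustrative, not the study's data:

```python
def components(adj):
    """Count connected components of an undirected graph given as an
    adjacency dict {node: set of neighbours}."""
    seen, count = set(), 0
    for start in adj:
        if start in seen:
            continue
        count += 1
        stack = [start]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(adj[n] - seen)
    return count

def remove_node(adj, node):
    """Return a copy of the graph with one node (and its links) removed."""
    return {n: (nbrs - {node}) for n, nbrs in adj.items() if n != node}

# Hypothetical hub-dominated roost network: the communal roost links
# five solitary day roosts, mirroring the topology the study describes.
roosts = {"hub": {"r1", "r2", "r3", "r4", "r5"}}
for r in list(roosts["hub"]):
    roosts[r] = {"hub"}

intact = components(roosts)                        # one connected network
after_leaf = components(remove_node(roosts, "r1"))  # random roost lost: still connected
after_hub = components(remove_node(roosts, "hub"))  # hub lost: network shatters
```

Removing a random day roost leaves the network whole, while removing the hub fragments it completely, which is the management point the abstract draws.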

Relevance: 30.00%

Abstract:

Topological measures of large-scale complex networks are applied to a specific artificial regulatory network model created through a whole-genome duplication and divergence mechanism. This class of networks shares topological features with natural transcriptional regulatory networks. Specifically, these networks display scale-free and small-world topology and possess subgraph distributions similar to those of natural networks. Thus, the topologies inherent in natural networks may be in part due to their method of creation rather than being exclusively shaped by subsequent evolution under selection. The evolvability of the dynamics of these networks is also examined by evolving networks in simulation to obtain three simple types of output dynamics. The networks obtained from this process show a wide variety of topologies and numbers of genes, indicating that it is relatively easy to evolve these classes of dynamics in this model. (c) 2006 Elsevier Ireland Ltd. All rights reserved.
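A duplication-and-divergence growth process can be sketched in a few lines. Note the hedges: the paper's model duplicates the whole genome at once, whereas this sketch duplicates one node at a time (the more common textbook variant); `p_keep`, the seed graph, and the forced parent link are all assumptions made to keep the example small and connected:

```python
import random

def duplication_divergence(n_final, p_keep=0.5, seed=1):
    """Grow an undirected network by duplication and divergence: copy a
    randomly chosen existing node, then keep each inherited link only with
    probability p_keep (link loss models divergence). A link back to the
    parent is always kept so the sketch stays connected.
    Returns an adjacency dict {node: set of neighbours}."""
    rng = random.Random(seed)
    adj = {0: {1}, 1: {0}}                 # minimal two-node seed graph
    while len(adj) < n_final:
        new = len(adj)
        parent = rng.choice(list(adj))
        inherited = {nb for nb in adj[parent] if rng.random() < p_keep}
        inherited.add(parent)              # divergence never severs the parent link here
        adj[new] = set(inherited)
        for nb in inherited:
            adj[nb].add(new)
    return adj

net = duplication_divergence(30)
```

Because duplicates inherit their parent's links, well-connected nodes gain neighbours faster, which is one intuition for why such growth alone can yield the scale-free, small-world features the abstract reports.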

Relevance: 30.00%

Abstract:

With rapid advances in video processing technologies and ever-faster increases in network bandwidth, the popularity of video content publishing and sharing has made similarity search an indispensable operation for retrieving videos of user interest. Video similarity is usually measured by the percentage of similar frames shared by two video sequences, and each frame is typically represented as a high-dimensional feature vector. Unfortunately, the high complexity of video content poses three major challenges for fast retrieval: (a) effective and compact video representations, (b) efficient similarity measurements, and (c) efficient indexing on the compact representations. In this paper, we propose a number of methods to achieve fast similarity search for very large video databases. First, each video sequence is summarized into a small number of clusters, each of which contains similar frames and is represented by a novel compact model called the Video Triplet (ViTri). A ViTri models a cluster as a tightly bounded hypersphere described by its position, radius, and density. ViTri similarity is measured by the volume of intersection between two hyperspheres multiplied by the minimal density, i.e., the estimated number of similar frames shared by two clusters. The total number of similar frames is then estimated to derive the overall similarity between two video sequences, so the time complexity of the video similarity measure can be greatly reduced. To further reduce the number of similarity computations on ViTris, we introduce a new one-dimensional transformation technique which rotates and shifts the original axis system using PCA in such a way that the original inter-distance between two high-dimensional vectors can be maximally retained after mapping. An efficient B+-tree is then built on the transformed one-dimensional values of the ViTris' positions.
Such a transformation enables the B+-tree to achieve its optimal performance by quickly filtering out a large portion of non-similar ViTris. Our extensive experiments on real, large video datasets demonstrate the effectiveness of our proposals, which significantly outperform existing methods.
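The pruning step rests on a simple guarantee: projecting onto a unit axis is a contraction, so any ViTri whose true distance to a query is within r must also project within r of the query's projection, and everything outside that 1-D window can be discarded safely. A sketch of the idea, using a sorted list with `bisect` in place of a B+-tree and a fixed unit direction in place of the fitted first principal component (both stand-in assumptions):

```python
import bisect

def build_index(positions, direction):
    """Project ViTri centre positions onto one axis and sort them,
    returning (sorted projection keys, ViTri ids in the same order)."""
    keyed = sorted((sum(p * d for p, d in zip(pos, direction)), i)
                   for i, pos in enumerate(positions))
    return [k for k, _ in keyed], [i for _, i in keyed]

def candidates(keys, ids, query_pos, direction, radius):
    """Return ids whose projections fall within `radius` of the query's
    projection; since projection onto a unit vector never increases
    distance, no true neighbour within `radius` is ever pruned."""
    q = sum(p * d for p, d in zip(query_pos, direction))
    lo = bisect.bisect_left(keys, q - radius)
    hi = bisect.bisect_right(keys, q + radius)
    return set(ids[lo:hi])

# Four toy ViTri centres in 2-D; the "axis" is just the first coordinate.
positions = [(0.0, 0.0), (0.1, 0.1), (5.0, 5.0), (9.0, 1.0)]
direction = (1.0, 0.0)     # unit vector standing in for the PCA component
keys, ids = build_index(positions, direction)
near_origin = candidates(keys, ids, (0.0, 0.0), direction, 1.0)
```

PCA is chosen precisely to make this filter tight: aligning the axis with the direction of greatest variance keeps projected distances close to true distances, so fewer false candidates survive the range scan.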

Relevance: 30.00%

Abstract:

This paper presented a novel approach to developing car-following models using reactive agent techniques for mapping perceptions to actions. The results showed that the model outperformed the Gipps and psychophysical families of car-following models. The standing of this work is highlighted by its acceptance and publication in the proceedings of the International IEEE Conference on Intelligent Transportation Systems (ITS), which is now recognised as the premier international conference on ITS. The paper acceptance rate at this conference was 67 percent. The standing of this paper is also evidenced by its listing in international databases such as Ei Inspec and IEEE Xplore, as well as in Google Scholar. Dr Dia co-authored this paper with his PhD student Sakda Panwai.
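A reactive agent in this setting maps perceptions directly to actions without an internal world model. The following is a deliberately simple stimulus-response sketch of that idea, not the paper's model: the gains, desired gap, and acceleration limits are invented parameters, and the paper's agents learn their perception-to-action mapping rather than using fixed linear gains:

```python
def reactive_acceleration(gap, rel_speed, desired_gap=20.0,
                          k_gap=0.1, k_speed=0.5, a_max=2.0):
    """Map the agent's perceptions (gap to the leader in metres, relative
    speed in m/s, negative when closing) straight to an action
    (acceleration in m/s^2), clamped to physically plausible limits."""
    a = k_gap * (gap - desired_gap) + k_speed * rel_speed
    return max(-a_max, min(a_max, a))
```

Closing fast on a short gap produces hard braking, while a large gap at matched speed produces acceleration, which is the qualitative behaviour any car-following rule must reproduce.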

Relevance: 30.00%

Abstract:

Retrieving large amounts of information over wide-area networks, including the Internet, is problematic due to latency of response, lack of direct memory access to data-serving resources, and fault-tolerance issues. This paper describes a design pattern for handling the results of queries that return large amounts of data. Typically these queries are made by a client process across a wide-area network (or the Internet), with one or more middle tiers, to a relational database residing on a remote server. The solution involves implementing a combination of data-retrieval strategies, including the use of iterators for traversing data sets while providing an appropriate level of abstraction to the client, double-buffering of data subsets, multi-threaded data retrieval, and query slicing. This design has recently been implemented and incorporated into the framework of a commercial software product developed at Oracle Corporation.
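Two of the listed strategies, iterators and query slicing, can be combined in a short sketch: the client iterates over rows one at a time while slices are fetched lazily behind the scenes. The `fetch_slice(offset, limit)` callback stands in for the remote query tier; its name and signature are assumptions, and double-buffering and multi-threaded prefetch are omitted for brevity:

```python
def sliced_results(fetch_slice, slice_size=3):
    """Generator hiding query slicing behind a plain iterator, the level
    of abstraction the pattern gives the client. fetch_slice(offset,
    limit) must return at most `limit` rows starting at `offset`."""
    offset = 0
    while True:
        rows = fetch_slice(offset, slice_size)
        if not rows:
            return                 # empty slice: end of result set
        yield from rows            # client consumes one row at a time
        if len(rows) < slice_size:
            return                 # short slice: server has no more rows
        offset += slice_size

# Fake remote table standing in for the database tier.
TABLE = list(range(10))
def fetch_slice(offset, limit):
    return TABLE[offset:offset + limit]

collected = list(sliced_results(fetch_slice))
```

A double-buffered variant would issue the fetch for slice n+1 on a worker thread while the client consumes slice n, hiding the wide-area round-trip latency the paper targets.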

Relevance: 30.00%

Abstract:

A pilot accident and emergency teleconsulting service was established in Scotland, based at the accident and emergency department of the main hospital in Aberdeen. There were three peripheral sites in rural Grampian (Peterhead, Turriff and Huntly) and one in the Shetland Isles. The videoconferencing equipment was connected by ISDN at 384 kbit/s. During the 15 months of the study, 1998 videoconference calls were made, of which 402 (20%) were calls to the accident and emergency department for clinical consultations. The majority of the clinical calls (95%) were made between 09:00 and 17:00, and more than 90% were completed within 20 min. During the majority of calls (87%) one or more X-ray images were transmitted. The majority of patients (89%) received treatment without transportation to the main centre in Aberdeen. The present study demonstrated that accident and emergency teleconsultations can be technically reliable, effective in reducing the number of patient transfers, and acceptable to the referring clinicians. As a result, approximately £1.5 million has been made available by the government to develop a national system for Scotland.