953 results for shortest paths
Abstract:
This paper proposes a practical approach to enhancing Quality of Service (QoS) routing by providing alternative or repair paths in the event of a breakage of a working path. The proposed scheme guarantees that every Protected Node (PN) is connected to a multi-repair path, such that no further failure or breakage of single or double repair paths can cause any simultaneous loss of connectivity between an ingress node and an egress node. Links to be protected in an MPLS network are predefined, and a Label Switched Path (LSP) request involves the establishment of a working path. The use of multi-protection paths permits the formation of numerous protection paths, allowing greater flexibility. Our analysis examined several methods, including single, double and multi-repair routes and the prioritization of signals along the protected paths, to improve QoS and throughput and to reduce the cost of protection path placement, delay, congestion and collision. The results indicated that creating multi-repair paths and prioritizing packets reduces delay and increases throughput: the delays at the ingress/egress LSPs were low compared to when the signals had not been classified. The proposed scheme therefore provides a means to improve QoS in path restoration in MPLS using available network resources. Prioritizing the packets in the data plane revealed that the amount of traffic transmitted using medium- and low-priority Label Switched Paths (LSPs) has no impact on the explicit rate of the high-priority LSP, thereby eliminating the knock-on effect.
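The routing step behind protection-path placement can be illustrated with a small sketch: compute a working path by shortest-path search, then compute a repair path over the same topology with the working path's links blocked, so that no single link failure severs both. This is a generic link-disjoint shortest-path illustration, not the paper's protection scheme; the graph, costs and function names are hypothetical.

```python
import heapq

def dijkstra(graph, src, dst, blocked=frozenset()):
    """Shortest path by link cost, skipping any (undirected) edge in `blocked`."""
    dist = {src: 0}
    prev = {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            if (u, v) in blocked or (v, u) in blocked:
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1]

def working_and_repair(graph, ingress, egress):
    """Working LSP plus a link-disjoint repair LSP, if one exists."""
    working = dijkstra(graph, ingress, egress)
    used = {(a, b) for a, b in zip(working, working[1:])}
    repair = dijkstra(graph, ingress, egress, blocked=used)
    return working, repair
```

On a topology where ingress A reaches egress D both via B-C and via E, the repair path is forced onto the E route once the working links are excluded, so the two LSPs share no link.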
Abstract:
Distributed multimedia supports a symbiotic infotainment duality, i.e. the ability to transfer information to the user yet also provide the user with a level of satisfaction. As multimedia is ultimately produced for the education and/or enjoyment of viewers, the user's perspective on presentation quality is surely of equal importance to objective Quality of Service (QoS) technical parameters in defining distributed multimedia quality. In order to measure the user perspective of multimedia video quality extensively, we introduce an extended model of distributed multimedia quality that segregates quality into three discrete levels: the network-level, the media-level and the content-level, using two distinct quality perspectives: the user-perspective and the technical-perspective. Since experimental questionnaires do not provide continuous monitoring of user attention, eye tracking was used in our study to provide a better understanding of the role that the human element plays in the reception, analysis and synthesis of multimedia data. Results showed that video content adaptation results in disparity in user video eye-paths when: i) no single/obvious point of focus exists; or ii) the point of attention changes dramatically. Accordingly, appropriate technical- and user-perspective parameter adaptation is implemented for all quality abstractions of our model, i.e. the network-level (via simulated delay and jitter), the media-level (via a technical- and user-perspective manipulated region-of-interest attentive display) and the content-level (via display type and video clip type). Our work has shown that user perception of distributed multimedia quality cannot be captured by purely technical-perspective QoS parameter adaptation.
Abstract:
In terrestrial television transmission, multiple paths of various lengths can occur between the transmitter and the receiver. Such paths arise from reflections off objects outside the direct transmission path. The multipath signals arriving at the receiver are all detected along with the intended signal, causing time-displaced replicas called 'ghosts' to appear on the television picture. With an increasing number of people living in built-up areas, ghosting is becoming commonplace and deghosting increasingly important. This thesis uses a deterministic time-domain approach to deghosting, resulting in a simple solution to the problem of removing ghosts. A new video detector is presented which reduces the synchronous detector local oscillator phase error, caused by any practical size of ghost, to a lower level than has previously been achieved. With the new detector, dispersion of the video signal is minimised and a known closed-form time-domain description of the individual ghost components within the detected video is subsequently obtained. Developed from mathematical descriptions of the detected video, a new deghoster filter structure is presented which is capable of removing both the in-phase (I) and the phase-quadrature (Q) ghost signals induced by VSB operation. The new deghoster filter requires much less hardware than any previous deghoster capable of removing both I and Q ghost components. A new channel identification algorithm was also developed, based upon simple correlation techniques, to find the delay and complex amplitude characteristics of individual ghosts. The result of the channel identification is then passed to the new I and Q deghoster filter for ghost cancellation. Five papers have been published from the research work performed for this thesis.
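The correlation-based channel identification described above can be sketched in miniature: cross-correlate the received signal with a known reference and read a ghost's delay and relative amplitude from the largest off-zero correlation peak. This is a minimal, real-valued, single-ghost illustration under assumed names, not the thesis's algorithm, which also recovers complex (I/Q) amplitudes.

```python
def cross_correlate(y, x, max_lag):
    """Correlation of received signal y against reference x for lags 0..max_lag."""
    return [sum(y[n] * x[n - lag]
                for n in range(lag, min(len(y), len(x) + lag)))
            for lag in range(max_lag + 1)]

def identify_ghost(y, x, max_lag):
    """Estimate the delay and relative amplitude of a single ghost.

    Assumes y = x + a * delay(x, d): the lag-0 correlation peak is the direct
    signal, and the largest remaining peak locates the ghost.
    """
    r = cross_correlate(y, x, max_lag)
    energy = sum(v * v for v in x)   # reference signal energy, normalises amplitude
    d = max(range(1, max_lag + 1), key=lambda lag: abs(r[lag]))
    return d, r[d] / energy
```

A reference with small autocorrelation sidelobes (e.g. a Barker-7 code) keeps the ghost peak unambiguous; with a synthetic echo at delay 3 and amplitude 0.5 the routine recovers (3, 0.5).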
Abstract:
The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited to distributed memory systems with reliable interconnection networks. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or by high latency in communication paths. This work proposes a fully decentralised algorithm (Epidemic K-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art distributed K-Means algorithms based on sampling methods. The experimental analysis confirms that the proposed algorithm is a practical and accurate distributed K-Means implementation for networked systems of very large and extreme scale.
Abstract:
The paper presents an overview of dynamic systems with inherent delays in both feedforward and feedback paths, and of how the performance of such systems can be affected by those delays. The authors concentrate on visually guided systems, where the behaviour of the system is largely dependent on the results of the vision sensors, with particular reference to active robot heads (real-time gaze control). We show how the performance of such systems can deteriorate substantially in the presence of unknown and/or variable delays. A considered choice of system architecture, however, allows the performance of active vision systems to be optimised with respect to the delays present in the system.
Abstract:
This paper explores the impact of the re-introduction of access restrictions to forests in Tanzania through participatory forest management (PFM), which has excluded villagers from forests to which they have traditionally, albeit illegally, had access to collect non-timber forest products (NTFPs). Motivated by our fieldwork, and using a spatial–temporal model, we focus on the paths of forest degradation and regeneration and on villagers' utility before and after an access restriction is introduced. Our paper illustrates a number of key points for policy makers. First, the benefits of forest conservation tend to be greatest in the first few periods after an access restriction is introduced, after which overall forest quality often declines. Second, villagers may displace their NTFP collection into more distant forests that may have been completely protected by distance alone before access to a closer forest was restricted. Third, permitting villagers to collect limited amounts of NTFPs for a fee, or alternatively fining villagers caught collecting illegally in the protected forest, and returning the fee or fine revenue to the villagers, can improve both forest quality and villagers' livelihoods.
Abstract:
The school subject of Art and the profession of the primary school teacher are gendered female, and both are considered low status within the field of Education and in other professional areas of society. A number of sociological studies have examined the impact of gendered socialisation and habitus on females' career choices, and various educational initiatives have been put in place over the years to encourage females to select subjects and/or pursue career paths normally associated with males. Yet Art and primary school teaching continue to be a popular choice with middle-class girls. Based on a critical ethnographic study of female BAED Art students who are training to be primary school teachers, this study examines the many factors, historical and contemporary, that have shaped and continue to shape the subjectivities of females and frame their aspirations and ambitions. Within this discourse, significant aspects of the history of Art and Art Education that have contributed to and influenced the construction of the female artist, and their consequent impact on artistically talented females' personal identity as artists, are also examined.
Abstract:
BACKGROUND: Sex differences are present in many neuropsychiatric conditions that affect emotion and approach-avoidance behavior. One potential mechanism underlying such observations is testosterone in early development. Although much is known about the effects of testosterone in adolescence and adulthood, little is known in humans about how testosterone in fetal development influences later neural sensitivity to valenced facial cues and approach-avoidance behavioral tendencies. METHODS: With functional magnetic resonance imaging we scanned 25 children, aged 8-11 years, while they viewed happy, fearful, neutral, or scrambled faces. Fetal testosterone (FT) was measured via amniotic fluid sampled between 13 and 20 weeks' gestation. Behavioral approach-avoidance tendencies were measured via parental report on the Sensitivity to Punishment and Sensitivity to Rewards questionnaire. RESULTS: Increasing FT predicted enhanced selectivity for positively compared with negatively valenced facial cues in reward-related regions such as the caudate, putamen, and nucleus accumbens, but not the amygdala. Statistical mediation analyses showed that increasing FT predicts increased behavioral approach tendencies by biasing the caudate, putamen, and nucleus accumbens, but not the amygdala, to be more responsive to positively compared with negatively valenced cues. In contrast, FT was not predictive of behavioral avoidance tendencies, through either direct or neurally mediated paths. CONCLUSIONS: This work suggests that testosterone acts in humans as a fetal programming mechanism on the reward system and influences behavioral approach tendencies later in life. As a mechanism influencing atypical development, FT might be important across a range of neuropsychiatric conditions that asymmetrically affect the sexes, the reward system, emotion processing, and approach behavior.
Abstract:
The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. The lack of scalable and fault tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (EpidemicK-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against the state of the art sampling methods and shows that the proposed method overcomes the limitations of the sampling-based approaches for skewed clusters distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.
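The abstract does not spell out how EpidemicK-Means aggregates without global communication, but one common epidemic technique it alludes to is gossip averaging: each node computes per-cluster (sum, count) statistics on its local data, random pairs of nodes repeatedly average their statistics, and every node then divides the averaged sums by the averaged counts to obtain (approximately) the global centroids. The sketch below illustrates that idea under assumed names; it is not the published algorithm.

```python
import random

def local_stats(points, centroids):
    """Per-cluster (sum, count) over one node's local data."""
    k, dim = len(centroids), len(centroids[0])
    sums = [[0.0] * dim for _ in range(k)]
    counts = [0] * k
    for p in points:
        j = min(range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
        counts[j] += 1
        for d in range(dim):
            sums[j][d] += p[d]
    return sums, counts

def gossip_average(states, rounds, rng):
    """Pairwise-averaging gossip: a random pair averages its state each round."""
    n = len(states)
    for _ in range(rounds):
        i, j = rng.sample(range(n), 2)
        avg = [(a + b) / 2 for a, b in zip(states[i], states[j])]
        states[i], states[j] = avg, list(avg)
    return states

def epidemic_kmeans_step(node_data, centroids, rounds=200, seed=0):
    """One K-Means iteration without global communication (gossip averaging)."""
    rng = random.Random(seed)
    k, dim = len(centroids), len(centroids[0])
    # Flatten each node's (sums, counts) into one vector for gossiping.
    states = []
    for points in node_data:
        sums, counts = local_stats(points, centroids)
        states.append([v for s in sums for v in s] + [float(c) for c in counts])
    gossip_average(states, rounds, rng)
    # Every node now approximates the network-wide mean state; read node 0.
    st = states[0]
    new = []
    for j in range(k):
        cnt = st[k * dim + j]
        new.append([st[j * dim + d] / cnt if cnt > 0 else centroids[j][d]
                    for d in range(dim)])
    return new
```

Because the ratio of averaged sums to averaged counts equals the ratio of global totals, every node converges to the same centroids the centralised update would produce, which matches the abstract's claim of approximating the ideal centralised solution as closely as desired.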
Abstract:
We evaluated the accuracy of six watershed models of nitrogen export in streams (kg km⁻² yr⁻¹) developed for use in large watersheds and representing various empirical and quasi-empirical approaches described in the literature. These models differ in their methods of calibration and have varying levels of spatial resolution and process complexity, which potentially affect the accuracy (bias and precision) of the model predictions of nitrogen export and of source contributions to export. Using stream monitoring data and detailed estimates of the natural and cultural sources of nitrogen for 16 watersheds in the northeastern United States (drainage sizes = 475 to 70,000 km²), we assessed the accuracy of the model predictions of total nitrogen and nitrate-nitrogen export. The model validation included the use of an error modeling technique to identify biases caused by model deficiencies in quantifying nitrogen sources and the biogeochemical processes affecting the transport of nitrogen in watersheds. Most models predicted stream nitrogen export to within 50% of the measured export in a majority of the watersheds. Prediction errors were negatively correlated with cultivated land area, indicating that the watershed models tended to overpredict export in less agricultural, more forested watersheds and to underpredict it in more agricultural basins. The magnitude of these biases differed appreciably among the models. Models having more detailed descriptions of nitrogen sources, of land and water attenuation of nitrogen, and of water flow paths were found to have considerably lower bias and higher precision in their predictions of nitrogen export.
Abstract:
Department of Health staff wished to use systems modelling to discuss acute patient flows with groups of NHS staff. The aim was to assess the usefulness of system dynamics (SD) in a healthcare context and to elicit proposals concerning ways of improving the patient experience. Since time restrictions excluded simulation modelling, a hybrid approach using stock/flow symbols from SD was created. Initial interviews and hospital site visits generated a series of stock/flow maps. A ‘Conceptual Framework’ was then created to introduce the mapping symbols and to generate a series of questions about different patient paths and what might speed or slow patient flows. These materials formed the centre of three workshops for NHS staff. The participants were able to propose ideas for improving patient flows, and the elicited data were subsequently employed to create a finalised suite of maps of a general acute hospital. The maps and ideas were communicated back to the Department of Health and subsequently assisted the work of the NHS Modernisation Agency.
Abstract:
There is general agreement across the world that human-made climate change is a serious global problem, although there are still some sceptics who challenge this view. Research in organization studies on the topic is relatively new. Much of this research, however, is instrumental and managerialist in its focus on ‘win-win’ opportunities for business or in its treatment of climate change as just another corporate social responsibility (CSR) exercise. In this paper, we suggest that climate change is not just an environmental problem requiring technical and managerial solutions; it is a political issue where a variety of organizations – state agencies, firms, industry associations, NGOs and multilateral organizations – engage in contestation as well as collaboration over the issue. We discuss the strategic, institutional and political economy dimensions of climate change and develop a socioeconomic regimes approach as a synthesis of these different theoretical perspectives. Given the urgency of the problem and the need for a rapid transition to a low-carbon economy, there is a pressing need for organization scholars to develop a better understanding of apathy and inertia in the face of the current crisis and to identify paths toward transformative change. The seven papers in this special issue address these areas of research and examine strategies, discourses, identities and practices in relation to climate change at multiple levels.
Abstract:
The plume of Ice Shelf Water (ISW) flowing into the Weddell Sea over the Filchner sill contributes to the formation of Antarctic Bottom Water. The Filchner overflow is simulated using a hydrostatic, primitive equation, three-dimensional ocean model with a 0.5–2 Sv ISW influx above the Filchner sill. The best fit to mooring temperature observations is found with influxes of 0.5 and 1 Sv, below a previous estimate of 1.6 ± 0.5 Sv based on sparse mooring velocities. The plume first moves north over the continental shelf and then turns west along the slope of the continental shelf break, where it breaks up into subplumes and domes, some of which then move downslope. Other subplumes run into the eastern submarine ridge and propagate along the ridge downslope in a chaotic manner. The next (western) ridge is crossed by the plume through several paths. Despite a number of discrepancies with observational data, the model reproduces many attributes of the flow. In particular, we argue that the temporal variability shown by the observations can largely be attributed to the unstable structure of the flow, where the temperature fluctuations are determined by the motion of the domes past the moorings. Our sensitivity studies show that while thermobaricity plays a role, its effect is small for the flows considered. Smoothing the ridges out demonstrates that their presence strongly affects the plume shape around the ridges. An increase in the bottom drag or viscosity slows the plume down, and hence thickens and widens it.
Abstract:
In the stratosphere, chemical tracers are drawn systematically from the equator to the pole. This observed Brewer–Dobson circulation is driven by wave drag, which in the stratosphere arises mainly from the breaking and dissipation of planetary-scale Rossby waves. While the overall sense of the circulation follows from fundamental physical principles, a quantitative theoretical understanding of the connection between wave drag and Lagrangian transport is limited to linear, small-amplitude waves. However, planetary waves in the stratosphere generally grow to a large amplitude and break in a strongly nonlinear fashion. This paper addresses the connection between stratospheric wave drag and Lagrangian transport in the presence of strong nonlinearity, using a mechanistic three-dimensional primitive equations model together with offline particle advection. Attention is deliberately focused on a weak forcing regime, such that sudden warmings do not occur and a quasi-steady state is reached, in order to examine this question in the cleanest possible context. Wave drag is directly linked to the transformed Eulerian mean (TEM) circulation, which is often used as a surrogate for mean Lagrangian motion. The results show that the correspondence between the TEM and mean Lagrangian velocities is quantitatively excellent in regions of linear, nonbreaking waves (i.e., outside the surf zone), where streamlines are not closed. Within the surf zone, where streamlines are closed and meridional particle displacements are large, the agreement between the vertical components of the two velocity fields is still remarkably good, especially wherever particle paths are coherent so that diabatic dispersion is minimized. However, in this region the meridional mean Lagrangian velocity bears little relation to the meridional TEM velocity, and reflects more the kinematics of mixing within and across the edges of the surf zone. 
The results from the mechanistic model are compared with those from the Canadian Middle Atmosphere Model to test the robustness of the conclusions.
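For reference, the transformed Eulerian mean (TEM) residual velocities discussed above are conventionally defined (in the standard Andrews–McIntyre formulation; the notation here is the textbook one and not necessarily the paper's) as:

```latex
\bar{v}^{\,*} = \bar{v} - \frac{1}{\rho_{0}}\,\frac{\partial}{\partial z}\!\left(\rho_{0}\,\frac{\overline{v'\theta'}}{\bar{\theta}_{z}}\right),
\qquad
\bar{w}^{\,*} = \bar{w} + \frac{1}{a\cos\varphi}\,\frac{\partial}{\partial\varphi}\!\left(\cos\varphi\,\frac{\overline{v'\theta'}}{\bar{\theta}_{z}}\right)
```

where overbars denote zonal means, primes eddy departures, \(\theta\) potential temperature, \(\rho_{0}(z)\) a reference density profile, \(a\) the planetary radius and \(\varphi\) latitude. It is these residual velocities that the study compares against the mean Lagrangian motion inside and outside the surf zone.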
Abstract:
Possible changes in the frequency and intensity of windstorms under future climate conditions during the 21st century are investigated based on an ECHAM5 GCM multi-scenario ensemble. The intensity of a storm is quantified by the associated estimated loss, derived using an empirical model. The geographical focus is ‘Core Europe’, which comprises countries of Western Europe. Possible changes in losses are analysed by comparing ECHAM5 GCM data for recent (20C, 1960 to 2000) and future climate conditions (B1, A1B, A2; 2060 to 2100), each with 3 ensemble members. Changes are quantified using both rank statistics and return periods (RPs) estimated by fitting an extreme value distribution to potential storm losses using the peak-over-threshold method. The estimated losses for ECHAM5 20C and reanalysis events show similar statistical features in terms of return periods. Under future climate conditions, all climate scenarios show an increase in both the frequency and the magnitude of potential losses caused by windstorms in Core Europe. Future losses double the highest ECHAM5 20C loss are identified for some countries. While positive changes in ranking are significant for many countries and multiple scenarios, significantly shorter RPs are mostly found under the A2 scenario for return levels corresponding to 20 yr losses or less. The emergence time of the statistically significant changes in loss varies from 2027 to 2100. These results imply an increased risk of windstorm-associated losses, which can be largely attributed to changes in the meteorological severity of the events. Additionally, factors such as changes in cyclone paths and in the location of the wind signatures relative to highly populated areas are also important in explaining the changes in estimated losses.
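The peak-over-threshold return-period machinery behind these loss estimates can be sketched in miniature. The abstract does not specify the extreme value family used, so this illustration takes the exponential excess model (a generalized Pareto with shape zero), whose maximum-likelihood scale is simply the mean excess over the threshold; all names and numbers are hypothetical.

```python
import math

def fit_pot_exponential(losses, threshold, years):
    """Peak-over-threshold fit with an exponential excess model (GPD, shape = 0).

    `losses` are event losses over a record of length `years`; the MLE scale
    is the mean excess over the threshold, and `rate` is exceedances per year.
    """
    excesses = [x - threshold for x in losses if x > threshold]
    scale = sum(excesses) / len(excesses)
    rate = len(excesses) / years
    return scale, rate

def return_period(loss, threshold, scale, rate):
    """Expected number of years between events exceeding `loss`."""
    p_exceed = math.exp(-(loss - threshold) / scale)   # P(X > loss | X > threshold)
    return 1.0 / (rate * p_exceed)

def return_level(period, threshold, scale, rate):
    """Loss exceeded on average once per `period` years (inverse of the above)."""
    return threshold + scale * math.log(rate * period)
```

Fitting the model separately to 20C and scenario losses and comparing `return_period` at a fixed return level is the kind of comparison the study reports: a shorter return period under a future scenario means the same loss is exceeded more often.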