971 results for Obstacle Avoidance


Relevance: 10.00%

Abstract:

Postgraduate Project/Dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Dental Medicine

Relevance: 10.00%

Abstract:

Dissertation presented to Universidade Fernando Pessoa in partial fulfilment of the requirements for the degree of Master in Educational Sciences: Special Education, with specialisation in the Cognitive and Motor Domain

Relevance: 10.00%

Abstract:

The best-effort nature of the Internet poses a significant obstacle to the deployment of many applications that require guaranteed bandwidth. In this paper, we present a novel approach that enables two edge/border routers, which we call Internet Traffic Managers (ITMs), to use an adaptive number of TCP connections to set up a tunnel of desirable bandwidth between them. The number of TCP connections that comprise this tunnel is elastic in the sense that it increases/decreases in tandem with competing cross traffic to maintain a target bandwidth. An origin ITM would then schedule incoming packets from an application requiring guaranteed bandwidth over that elastic tunnel. Unlike many proposed solutions that aim to deliver soft QoS guarantees, our elastic-tunnel approach does not require any support from core routers (as with IntServ and DiffServ); it is scalable in the sense that core routers do not have to maintain per-flow state (as with IntServ); and it is readily deployable within a single ISP or across multiple ISPs. To evaluate our approach, we develop a flow-level control-theoretic model to study the transient behavior of established elastic TCP-based tunnels. The model captures the effect of cross-traffic connections on our bandwidth allocation policies. Through extensive simulations, we confirm the effectiveness of our approach in providing soft bandwidth guarantees. We also outline our kernel-level ITM prototype implementation.
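The tunnel-sizing idea in this abstract can be made concrete with a toy model. Under idealized TCP fair sharing, N tunnel connections competing with M cross-traffic connections on a bottleneck of capacity C together receive roughly N·C/(N+M), so the origin ITM can grow or shrink N to hold a target rate. A minimal sketch (the fair-share model and function names are illustrative assumptions, not the paper's actual controller):

```python
def fair_share_throughput(capacity, n_tunnel, n_cross):
    """Aggregate tunnel throughput under idealized TCP fair sharing:
    each of the n_tunnel + n_cross connections gets an equal share."""
    return n_tunnel * capacity / (n_tunnel + n_cross)

def tunnel_size(capacity, n_cross, target_bw, max_conns=1000):
    """Smallest number of tunnel connections whose combined fair
    share meets target_bw against n_cross competing connections."""
    n = 1
    while fair_share_throughput(capacity, n, n_cross) < target_bw and n < max_conns:
        n += 1
    return n

# As cross traffic rises from 4 to 9 flows, the tunnel grows elastically.
print(tunnel_size(100.0, 4, 50.0))  # 4 connections suffice: 4*100/8 = 50
print(tunnel_size(100.0, 9, 50.0))  # 9 needed: 9*100/18 = 50
```

The elasticity is exactly this feedback: as M changes, the smallest adequate N changes with it.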

Relevance: 10.00%

Abstract:

The quality of available network connections can often have a large impact on the performance of distributed applications. For example, document transfer applications such as FTP, Gopher and the World Wide Web suffer increased response times as a result of network congestion. For these applications, the document transfer time is directly related to the available bandwidth of the connection. Available bandwidth depends on two things: 1) the underlying capacity of the path from client to server, which is limited by the bottleneck link; and 2) the amount of other traffic competing for links on the path. If measurements of these quantities were available to the application, the current utilization of connections could be calculated. Network utilization could then be used as a basis for selection from a set of alternative connections or servers, thus providing reduced response time. Such a dynamic server selection scheme would be especially important in a mobile computing environment in which the set of available servers is frequently changing. In order to provide these measurements at the application level, we introduce two tools: bprobe, which provides an estimate of the uncongested bandwidth of a path; and cprobe, which gives an estimate of the current congestion along a path. These two measures may be used in combination to provide the application with an estimate of available bandwidth between server and client thereby enabling application-level congestion avoidance. In this paper we discuss the design and implementation of our probe tools, specifically illustrating the techniques used to achieve accuracy and robustness. We present validation studies for both tools which demonstrate their reliability in the face of actual Internet conditions; and we give results of a survey of available bandwidth to a random set of WWW servers as a sample application of our probe technique. 
We conclude with descriptions of other applications of our measurement tools, several of which are currently under development.
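The arithmetic behind the two probes is simple: bprobe estimates the uncongested (bottleneck) capacity of the path and cprobe the competing traffic; their difference is the available bandwidth and their ratio the utilization. A hedged sketch (function names and units are illustrative, not the tools' real interfaces):

```python
def available_bandwidth(capacity_mbps, cross_traffic_mbps):
    """Available bandwidth = bottleneck capacity (bprobe's estimate)
    minus competing cross traffic (cprobe's estimate)."""
    return max(0.0, capacity_mbps - cross_traffic_mbps)

def utilization(capacity_mbps, cross_traffic_mbps):
    """Fraction of the bottleneck consumed by competing traffic."""
    return cross_traffic_mbps / capacity_mbps

# A fast but congested path can offer less than a slow, idle one.
print(available_bandwidth(10.0, 8.5))  # 1.5 Mb/s left on the fast path
print(available_bandwidth(5.0, 1.0))   # 4.0 Mb/s left on the slow path
```

With these two numbers per candidate path, an application can avoid congestion at its own level, without router support.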

Relevance: 10.00%

Abstract:

Replication is a commonly proposed solution to problems of scale associated with distributed services. However, when a service is replicated, each client must be assigned a server. Prior work has generally assumed that assignment to be static. In contrast, we propose dynamic server selection, and show that it enables application-level congestion avoidance. To make dynamic server selection practical, we demonstrate the use of three tools. In addition to direct measurements of round-trip latency, we introduce and validate two new tools: bprobe, which estimates the maximum possible bandwidth along a given path; and cprobe, which estimates the current congestion along a path. Using these tools we demonstrate dynamic server selection and compare it to previous static approaches. We show that dynamic server selection consistently outperforms static policies by as much as 50%. Furthermore, we demonstrate the importance of each of our tools in performing dynamic server selection.
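The selection policy itself reduces to a comparison: estimate each candidate server's response time from measured round-trip latency plus document size over available bandwidth, and pick the minimum. A minimal sketch (the candidate list and the additive cost model are illustrative assumptions, not the paper's exact policy):

```python
def est_transfer_time(doc_bytes, rtt_s, avail_bw_bytes_per_s):
    """Crude response-time model: one round trip to request the
    document plus its transmission time at the available bandwidth."""
    return rtt_s + doc_bytes / avail_bw_bytes_per_s

def select_server(doc_bytes, candidates):
    """candidates maps server -> (rtt_s, avail_bw_bytes_per_s);
    returns the server with the smallest estimated transfer time."""
    return min(candidates,
               key=lambda s: est_transfer_time(doc_bytes, *candidates[s]))

servers = {
    "near-but-congested": (0.01, 20_000),   # 10 ms RTT, 20 kB/s free
    "far-but-idle":       (0.20, 500_000),  # 200 ms RTT, 500 kB/s free
}
print(select_server(1_000_000, servers))  # large doc -> "far-but-idle"
print(select_server(1_000, servers))      # tiny doc -> "near-but-congested"
```

This is why latency alone is not enough: for large transfers the bandwidth term dominates, which is where bprobe and cprobe earn their keep.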

Relevance: 10.00%

Abstract:

As the commoditization of sensing, actuation and communication hardware increases, so does the potential for dynamically tasked sense and respond networked systems (i.e., Sensor Networks or SNs) to replace existing disjoint and inflexible special-purpose deployments (closed-circuit security video, anti-theft sensors, etc.). While various solutions have emerged to many individual SN-centric challenges (e.g., power management, communication protocols, role assignment), perhaps the largest remaining obstacle to widespread SN deployment is that those who wish to deploy, utilize, and maintain a programmable Sensor Network lack the programming and systems expertise to do so. The contributions of this thesis center on the design, development and deployment of the SN Workbench (snBench). snBench embodies an accessible, modular programming platform coupled with a flexible and extensible run-time system that, together, support the entire life-cycle of distributed sensory services. As it is impossible to find a one-size-fits-all programming interface, this work advocates the use of tiered layers of abstraction that enable a variety of high-level, domain specific languages to be compiled to a common (thin-waist) tasking language; this common tasking language is statically verified and can be subsequently re-translated, if needed, for execution on a wide variety of hardware platforms. 
snBench provides: (1) a common sensory tasking language (Instruction Set Architecture) powerful enough to express complex SN services, yet simple enough to be executed by highly constrained resources with soft, real-time constraints, (2) a prototype high-level language (and corresponding compiler) to illustrate the utility of the common tasking language and the tiered programming approach in this domain, (3) an execution environment and a run-time support infrastructure that abstract a collection of heterogeneous resources into a single virtual Sensor Network, tasked via this common tasking language, and (4) novel formal methods (i.e., static analysis techniques) that verify safety properties and infer implicit resource constraints to facilitate resource allocation for new services. This thesis presents these components in detail, as well as two specific case-studies: the use of snBench to integrate physical and wireless network security, and the use of snBench as the foundation for semester-long student projects in a graduate-level Software Engineering course.

Relevance: 10.00%

Abstract:

The majority of the traffic (bytes) flowing over the Internet today has been attributed to the Transmission Control Protocol (TCP). This strong presence of TCP has recently spurred further investigations into its congestion avoidance mechanism and its effect on the performance of short and long data transfers. At the same time, the rising interest in enhancing Internet services while keeping the implementation cost low has led to several service-differentiation proposals. In such service-differentiation architectures, much of the complexity is placed only in access routers, which classify and mark packets from different flows. Core routers can then allocate enough resources to each class of packets so as to satisfy delivery requirements, such as predictable (consistent) and fair service. In this paper, we investigate the interaction among short and long TCP flows, and how TCP service can be improved by employing a low-cost service-differentiation scheme. Through control-theoretic arguments and extensive simulations, we show the utility of isolating TCP flows into two classes based on their lifetime/size, namely one class of short flows and another of long flows. With such class-based isolation, short and long TCP flows have separate service queues at routers. This protects each class of flows from the other as they possess different characteristics, such as burstiness of arrivals/departures and congestion/sending window dynamics. We show the benefits of isolation, in terms of better predictability and fairness, over traditional shared queueing systems with both tail-drop and Random-Early-Drop (RED) packet dropping policies. 
The proposed class-based isolation of TCP flows has several advantages: (1) the implementation cost is low since it only requires core routers to maintain per-class (rather than per-flow) state; (2) it promises to be an effective traffic engineering tool for improved predictability and fairness for both short and long TCP flows; and (3) stringent delay requirements of short interactive transfers can be met by increasing the amount of resources allocated to the class of short flows.
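The isolation scheme described above amounts to a two-queue classifier at the router: flows below a size/lifetime cutoff feed a "short" queue, the rest a "long" queue, so the router keeps only per-class state. A minimal sketch (the 20-packet cutoff and the class API are illustrative assumptions, not the paper's parameters):

```python
from collections import deque

SHORT_CUTOFF_PKTS = 20  # hypothetical size threshold separating classes

class ClassBasedRouter:
    """Per-class (not per-flow) state: just two FIFO service queues."""
    def __init__(self):
        self.queues = {"short": deque(), "long": deque()}

    def classify(self, flow_size_pkts):
        return "short" if flow_size_pkts <= SHORT_CUTOFF_PKTS else "long"

    def enqueue(self, packet, flow_size_pkts):
        self.queues[self.classify(flow_size_pkts)].append(packet)

r = ClassBasedRouter()
r.enqueue("GET /index.html", flow_size_pkts=5)    # short web transfer
r.enqueue("bulk-chunk-17", flow_size_pkts=4000)   # long bulk transfer
print(len(r.queues["short"]), len(r.queues["long"]))
```

Because each queue can be provisioned separately, bursty short flows no longer share a drop tail with window-limited long flows.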

Relevance: 10.00%

Abstract:

The Border Gateway Protocol (BGP) is the current inter-domain routing protocol used to exchange reachability information between Autonomous Systems (ASes) in the Internet. BGP supports policy-based routing which allows each AS to independently adopt a set of local policies that specify which routes it accepts and advertises from/to other networks, as well as which route it prefers when more than one route becomes available. However, independently chosen local policies may cause global conflicts, which result in protocol divergence. In this paper, we propose a new algorithm, called Adaptive Policy Management Scheme (APMS), to resolve policy conflicts in a distributed manner. Akin to distributed feedback control systems, each AS independently classifies the state of the network as either conflict-free or potentially-conflicting by observing its local history only (namely, route flaps). Based on the degree of measured conflicts (policy conflict-avoidance vs. -control mode), each AS dynamically adjusts its own path preferences—increasing its preference for observably stable paths over flapping paths. APMS also includes a mechanism to distinguish route flaps due to topology changes, so as not to confuse them with those due to policy conflicts. A correctness and convergence analysis of APMS based on the substability property of chosen paths is presented. Implementation in the SSF network simulator is performed, and simulation results for different performance metrics are presented. The metrics capture the dynamic performance (in terms of instantaneous throughput, delay, routing load, etc.) of APMS and other competing solutions, thus exposing the often neglected aspects of performance.
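The local adaptation step of such a scheme can be rendered in a few lines: each AS watches its own route-flap history and shifts preference away from flapping paths toward stable ones. A hedged sketch (the flap threshold and step size are illustrative assumptions, not APMS's actual parameters):

```python
def adjust_preferences(prefs, flap_counts, flap_threshold=3, step=1):
    """Return updated path preferences: paths that flapped at least
    flap_threshold times lose `step` preference; stable paths gain it."""
    return {
        path: pref - step if flap_counts.get(path, 0) >= flap_threshold
        else pref + step
        for path, pref in prefs.items()
    }

prefs = {"via-AS7": 100, "via-AS9": 100}
flaps = {"via-AS7": 5}                   # via-AS7 has been flapping
print(adjust_preferences(prefs, flaps))  # {'via-AS7': 99, 'via-AS9': 101}
```

Because each AS acts only on its local history, no global coordination is needed; the flap-vs-topology distinction mentioned in the abstract would gate which flaps feed `flap_counts`.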

Relevance: 10.00%

Abstract:

Imprisonment is the most severe penalty utilised by the criminal courts in Ireland. In recent decades the prison population has grown significantly despite both official and public calls to reduce the use of the sanction. Two other sanctions are available to the Irish sentencer which may be used as a direct and comparable sentence in lieu of a term of imprisonment, namely the community service order and the suspended sentence. The community service order remains under-utilised as an alternative to the custodial sentence. The suspended sentence is used quite liberally, but its function may be more closely related to the aim of deterrence than to avoiding the use of the custodial sentence. Thus the aim of decarceration may not be optimally served in practice when either sanction is utilised. The decarcerative effect of either sanction is largely dependent upon the specific purpose which judges invest in the sanction. Judges may also be inhibited in the use of either sanction if they lack confidence that the sentence will be appropriately monitored and executed. The purpose of this thesis is to examine the role of the community service order and the suspended sentence in Irish sentencing practice. Although community service and the suspended sentence present primarily as alternatives to the custodial sentence, the manner in which judges utilise or fail to utilise the sanctions may differ significantly from this primary manifestation. The study therefore examines judges' cognitions and expectations of both sanctions to explore their underlying purposes and to reveal how judges use the sanctions in practice. To access this previously undisclosed information a number of methodologies were deployed. An extensive literature review was conducted to delineate the purpose and functionality of both sanctions. 
Quantitative data were gathered by sampling suspended and part-suspended sentences, where deficiencies were apparent, to show the actual frequency of use of that sanction. Qualitative methodologies were used by way of focus groups and semi-structured interviews of judges at all jurisdictional levels to elucidate the purposes of both sanctions. These methods allowed a deeper investigation of the factors which may promote or inhibit such usage. The relative under-utilisation of the community service order as an alternative to the custodial sentence may in part be explained by a reluctance by some judges to equate it with a real custodial sentence. For most judges who use the sanction, particularly at summary level, community service serves a decarcerative function. The suspended sentence continues to be used extensively. It operates partly as a decarcerative penalty, but in practice the purpose of deterrence may overtake its theoretical purpose, namely the avoidance of custody. Despite ongoing criticism of executive agencies such as the Probation Service and the Prosecution in the supervision of such penalties, both sanctions continue to be used. Engagement between the criminal justice actors may facilitate better outcomes in the use of either sanction. The purposes for which both sanctions are deployed find their meaning essentially in the practices of the judges themselves, as opposed to any statutory or theoretical claims upon their use or purpose.

Relevance: 10.00%

Abstract:

A method is presented for converting unstructured program schemas to strictly equivalent structured form. The predicates of the original schema are left intact, with structuring being achieved by the duplication of the original decision vertices without the introduction of compound predicate expressions, or where possible by function duplication alone. It is shown that structured schemas must have at least as many decision vertices as the original unstructured schema, and must have more when the original schema contains branches out of decision constructs. The structuring method allows the complete avoidance of function duplication, but only at the expense of decision vertex duplication. It is shown that structured schemas have greater space-time requirements in general than their equivalent optimal unstructured counterparts and at best have the same requirements.
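A standard illustration of the trade-off is the "loop-and-a-half" schema, whose exit test sits mid-body. Two structured renderings are sketched below, one paying with function duplication and one with an extra decision via an auxiliary flag; both use hypothetical functions h, g and predicate p and are illustrative only, not the paper's construction:

```python
# Unstructured schema (branch out of the loop body):
#   L: x = h(x); if p(x) goto DONE; x = g(x); goto L

def structured_by_function_duplication(x, h, g, p):
    """Structured form that duplicates the function vertex h."""
    x = h(x)
    while not p(x):
        x = g(x)
        x = h(x)  # duplicated occurrence of h
    return x

def structured_by_extra_decision(x, h, g, p):
    """Structured form that avoids function duplication by paying
    with an additional decision (a flag), not a compound predicate."""
    done = False
    while not done:
        x = h(x)
        if p(x):
            done = True
        else:
            x = g(x)
    return x

h = lambda x: x + 1   # illustrative function vertices
g = lambda x: x * 2
p = lambda x: x > 10  # illustrative predicate
print(structured_by_function_duplication(0, h, g, p))  # 15
print(structured_by_extra_decision(0, h, g, p))        # 15
```

Both variants compute the same result as the unstructured schema, demonstrating the equivalence that the structuring method guarantees while exposing its space-time cost.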

Relevance: 10.00%

Abstract:

Coeliac disease is one of the most common food intolerances worldwide and at present the gluten free diet remains the only suitable treatment. A market overview conducted as part of this thesis on nutritional and sensory quality of commercially available gluten free breads and pasta showed that improvements are necessary. Many products show strong off-flavors, poor mouthfeel and reduced shelf-life. Since the life-long avoidance of the cereal protein gluten means a major change to the diet, it is important to also consider the nutritional value of products intending to replace staple foods such as bread or pasta. This thesis addresses this issue by characterising available gluten free cereal and pseudocereal flours to facilitate a better raw material choice. It was observed that especially quinoa, buckwheat and teff are high in essential nutrients, such as protein, minerals and folate. In addition the potential of functional ingredients such as inulin, β-glucan, HPMC and xanthan to improve loaf quality was evaluated. Results show that these ingredients can increase loaf volume and reduce crumb hardness as well as rate of staling, but that the effect diverges strongly depending on the bread formulation used. Furthermore, fresh egg pasta formulations based on teff and oat flour were developed. The resulting products were characterised regarding sensory and textural properties as well as in vitro digestibility. Scanning electron and confocal laser scanning microscopy were used throughout the thesis to visualise structural changes occurring during baking and pasta making.

Relevance: 10.00%

Abstract:

Aim: To develop and evaluate the psychometric properties of an instrument for the measurement of self-neglect (SN). Conceptual Framework: An elder self-neglect (ESN) conceptual framework guided the literature review and scale development. The framework has two key dimensions, physical/psycho-social and environmental, and seven sub-dimensions which are representative of the factors that can contribute to intentional and unintentional SN. Methods: A descriptive cross-sectional design was adopted to achieve the research aim. The study was conducted in two phases. Phase 1 involved the development of the questionnaire content and structure. Phase 2 focused on establishing the psychometric properties of the instrument. Content validity was established by a panel of 8 experts and piloted with 9 health and social care professionals. The instrument was subsequently posted with a stamped addressed envelope to 566 health and social care professionals who met specific eligibility criteria across the four HSE areas. A total of 341 questionnaires were returned, a response rate of 60%, and 305 (50%) completed responses were included in exploratory factor analysis (EFA). Item and factor analyses were performed to elicit the instrument's underlying factor structure and establish preliminary construct validity. Findings: Item and factor analyses resulted in a logically coherent, 37-item, five-factor solution, explaining 55.6% of the cumulative variance. The factors were labelled: ‘Environment’, ‘Social Networks’, ‘Emotional and Behavioural Liability’, ‘Health Avoidance’ and ‘Self-Determinism’. The factor loadings were >0.40 for all items on each of the five subscales. Preliminary construct validity was supported by the findings. Conclusion: The main outcome of this research is a 37-item Self-Neglect (SN-37) measurement instrument that was developed by EFA and underpinned by an ESN conceptual framework. Preliminary psychometric evaluation of the instrument is promising. 
Future work should be directed at establishing the construct and criterion related validity of the instrument.

Relevance: 10.00%

Abstract:

Currently, the sole strategy for managing food hypersensitivity involves strict avoidance of the trigger. Several alternate strategies for the treatment of food allergies are currently under study. Also being explored is the process of eliminating allergenic proteins from crop plants. Legumes are a rich source of protein and are an essential component of the human diet. Unfortunately, legumes, including soybean and peanut, are also common sources of food allergens. Four protein families and superfamilies account for the majority of legume allergens, which include storage proteins of seeds (cupins and prolamins), profilins, and the larger group of pathogenesis-related proteins. Two strategies have been used to produce hypoallergenic legume crops: (1) germplasm lines are screened for the absence or reduced content of specific allergenic proteins and (2) genetic transformation is used to silence native genes encoding allergenic proteins. Both approaches have been successful in producing cultivars of soybeans and peanuts with reduced allergenic proteins. However, it is unknown whether the cultivars are actually hypoallergenic to those with sensitivity. This review describes efforts to produce hypoallergenic cultivars of soybean and peanut and discusses the challenges that need to be overcome before such products could be available in the marketplace.

Relevance: 10.00%

Abstract:

Recent evidence that echinoids of the genus Echinometra have moderate visual acuity that appears to be mediated by their spines screening off-axis light suggests that the urchin Strongylocentrotus purpuratus, with its higher spine density, may have even more acute spatial vision. We analyzed the movements of 39 specimens of S. purpuratus after they were placed in the center of a featureless tank containing a round, black target that had an angular diameter of 6.5 deg. or 10 deg. (solid angles of 0.01 sr and 0.024 sr, respectively). An average orientation vector for each urchin was determined by testing the animal four times, with the target placed successively at bearings of 0 deg., 90 deg., 180 deg. and 270 deg. (relative to magnetic east). The urchins showed no significant unimodal or axial orientation relative to any non-target feature of the environment or relative to the changing position of the 6.5 deg. target. However, the urchins were strongly axially oriented relative to the changing position of the 10 deg. target (mean axis from -1 to 179 deg.; 95% confidence interval +/- 12 deg.; P<0.001, Moore's non-parametric Hotelling's test), with 10 of the 20 urchins tested against that target choosing an average bearing within 10 deg. of either the target center or its opposite direction (two would be expected by chance). In addition, the average length of the 20 target-normalized bearings for the 10 deg. target (each the vector sum of the bearings for the four trials) was far higher than would be expected by chance (P<10^-10; Monte Carlo simulation), showing that each urchin, whether it moved towards or away from the target, did so with high consistency. These results strongly suggest that S. purpuratus detected the 10 deg. target, responding either by approaching it or fleeing it. Given that the urchins did not appear to respond to the 6.5 deg. target, it is likely that the 10 deg. target was close to the minimum detectable size for this species. 
Interestingly, measurements of the spine density of the regions of the test that faced horizontally predicted a similar visual resolution (8.3+/-0.5 deg. for the interambulacrum and 11+/-0.54 deg. for the ambulacrum). The function of this relatively low, but functional, acuity - on par with that of the chambered Nautilus and the horseshoe crab - is unclear but, given the bimodal response, is likely to be related to both shelter seeking and predator avoidance.