936 results for mandatory access control framework


Relevance: 30.00%

Abstract:

This thesis presents a framework for aggregated congestion management of TCP flows and shows how to integrate such an approach into an existing TCP protocol stack. The thesis also presents an initial Linux implementation of this congestion management scheme, together with a performance evaluation in the ns simulator.
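
A minimal sketch of the general idea, assuming (as in Congestion-Manager-style designs) that concurrent flows to the same destination share one AIMD congestion window; the class and constants below are illustrative and are not the thesis's actual Linux implementation:

    class AggregateCongestionState:
        """Toy aggregated congestion manager: flows to the same host share
        a single AIMD congestion window instead of probing the path
        independently (illustrative sketch only)."""

        def __init__(self, mss=1460):
            self.mss = mss
            self.cwnd = 10 * mss          # shared congestion window, in bytes
            self.flows = set()

        def register(self, flow_id):
            self.flows.add(flow_id)

        def allowance(self, flow_id):
            # each registered flow receives an equal share of the aggregate window
            return self.cwnd // max(len(self.flows), 1)

        def on_ack(self, acked_bytes):
            # additive increase, roughly one MSS per window's worth of ACKed data
            self.cwnd += self.mss * acked_bytes // self.cwnd

        def on_loss(self):
            # multiplicative decrease shared by every flow in the aggregate
            self.cwnd = max(self.cwnd // 2, 2 * self.mss)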

Relevance: 30.00%

Abstract:

Many people suffer from conditions that lead to deterioration of motor control and make access to the computer using traditional input devices difficult. In particular, they may lose control of hand movement to the extent that the standard mouse cannot be used as a pointing device. Most current alternatives use markers or specialized hardware to track and translate a user's movement to pointer movement. These approaches may be perceived as intrusive, for example, wearable devices. Camera-based assistive systems that use visual tracking of features on the user's body often require cumbersome manual adjustment. This paper introduces an enhanced computer vision based strategy where features, for example on a user's face, viewed through an inexpensive USB camera, are tracked and translated to pointer movement. The main contributions of this paper are (1) enhancing a video-based interface with a mechanism for mapping feature movement to pointer movement, which allows users to navigate to all areas of the screen even with very limited physical movement, and (2) providing a customizable, hierarchical navigation framework for human-computer interaction (HCI). This framework provides effective use of the vision-based interface system for accessing multiple applications in an autonomous setting. Experiments with several users show the effectiveness of the mapping strategy and its usage within the application framework as a practical tool for desktop users with disabilities.
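
As one concrete (hypothetical) instance of such a mapping, a relative, gain-scaled scheme lets small feature displacements cover the whole screen while a dead zone suppresses tracking jitter; the function below is an illustrative sketch, not the paper's actual algorithm:

    def feature_to_pointer(feature_xy, prev_feature_xy, pointer_xy,
                           gain=8.0, dead_zone=1.5, screen=(1920, 1080)):
        """Relative mapping from tracked-feature motion to pointer motion:
        displacements are amplified by `gain`, sub-`dead_zone` jitter is
        ignored, and the result is clamped to the screen (sketch only)."""
        dx = feature_xy[0] - prev_feature_xy[0]
        dy = feature_xy[1] - prev_feature_xy[1]
        dx = 0.0 if abs(dx) < dead_zone else dx
        dy = 0.0 if abs(dy) < dead_zone else dy
        x = min(max(pointer_xy[0] + gain * dx, 0), screen[0] - 1)
        y = min(max(pointer_xy[1] + gain * dy, 0), screen[1] - 1)
        return x, y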

Relevance: 30.00%

Abstract:

This article describes neural network models for adaptive control of arm movement trajectories during visually guided reaching and, more generally, a framework for unsupervised real-time error-based learning. The models clarify how a child, or untrained robot, can learn to reach for objects that it sees. Piaget has provided basic insights with his concept of a circular reaction: As an infant makes internally generated movements of its hand, the eyes automatically follow this motion. A transformation is learned between the visual representation of hand position and the motor representation of hand position. Learning of this transformation eventually enables the child to accurately reach for visually detected targets. Grossberg and Kuperstein have shown how the eye movement system can use visual error signals to correct movement parameters via cerebellar learning. Here it is shown how endogenously generated arm movements lead to adaptive tuning of arm control parameters. These movements also activate the target position representations that are used to learn the visuo-motor transformation that controls visually guided reaching. The AVITE model presented here is an adaptive neural circuit based on the Vector Integration to Endpoint (VITE) model for arm and speech trajectory generation of Bullock and Grossberg. In the VITE model, a Target Position Command (TPC) represents the location of the desired target. The Present Position Command (PPC) encodes the present hand-arm configuration. The Difference Vector (DV) population continuously computes the difference between the PPC and the TPC. A speed-controlling GO signal multiplies DV output. The PPC integrates the (DV)·(GO) product and generates an outflow command to the arm. Integration at the PPC continues at a rate dependent on GO signal size until the DV reaches zero, at which time the PPC equals the TPC. The AVITE model explains how self-consistent TPC and PPC coordinates are autonomously generated and learned. Learning of AVITE parameters is regulated by activation of a self-regulating Endogenous Random Generator (ERG) of training vectors. Each vector is integrated at the PPC, giving rise to a movement command. The generation of each vector induces a complementary postural phase during which ERG output stops and learning occurs. Then a new vector is generated and the cycle is repeated. This cyclic, biphasic behavior is controlled by a specialized gated dipole circuit. ERG output autonomously stops in such a way that, across trials, a broad sample of workspace target positions is generated. When the ERG shuts off, a modulator gate opens, copying the PPC into the TPC. Learning of a transformation from TPC to PPC occurs using the DV as an error signal that is zeroed due to learning. This learning scheme is called a Vector Associative Map, or VAM. The VAM model is a general-purpose device for autonomous real-time error-based learning and performance of associative maps. The DV stage serves the dual function of reading out new TPCs during performance and reading in new adaptive weights during learning, without a disruption of real-time operation. VAMs thus provide an on-line unsupervised alternative to the off-line properties of supervised error-correction learning algorithms. VAMs and VAM cascades for learning motor-to-motor and spatial-to-motor maps are described. VAM models and Adaptive Resonance Theory (ART) models exhibit complementary matching, learning, and performance properties that together provide a foundation for designing a total sensory-cognitive and cognitive-motor autonomous system.
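
A minimal numerical sketch of the VITE kinematics described above (Euler integration; the rate constant, GO value, and time step are illustrative assumptions, not parameters from the article):

    import numpy as np

    def vite_reach(tpc, ppc0, go=1.0, alpha=5.0, dt=0.01, steps=1000):
        """DV relaxes toward TPC - PPC; the PPC integrates the (DV)*(GO)
        product until the DV reaches zero, at which point PPC equals TPC."""
        tpc = np.asarray(tpc, dtype=float)
        ppc = np.asarray(ppc0, dtype=float)
        dv = np.zeros_like(ppc)
        path = [ppc.copy()]
        for _ in range(steps):
            dv += dt * alpha * (-dv + tpc - ppc)   # difference-vector dynamics
            ppc += dt * go * dv                    # outflow command integration
            path.append(ppc.copy())
        return np.array(path)

    # Example: drive a two-component arm command from (0, 0) toward (1.0, 0.5).
    trajectory = vite_reach(tpc=[1.0, 0.5], ppc0=[0.0, 0.0])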

Relevance: 30.00%

Abstract:

Environmental Control Systems (ECS) enable people with high cervical Spinal Cord Injury (high SCI) to control and access everyday electronic devices. In Ireland, however, access for those who might benefit from ECS is limited. This study used a qualitative approach to explore the insider experience of an ECS starter-pack developed by the author, an occupational therapist. The primary research questions, namely what it is really like to live with ECS and what it means to live with ECS, were explored using a phenomenological methodology conducted in three phases. In Phase 1, fifteen people with high SCI met twice in four focus groups to discuss experiences and expectations of ECS. Thematic analysis (Krueger & Casey, 2000), influenced by the psychological phenomenological approach (Creswell, 1998), yielded three categories of rich, practical, phenomenological findings: ECS usage and utility; ECS expectations; and the meaning of living with ECS. Phase 1 findings informed Phase 2, which consisted of the development of a generic electronic assistive technology pack (GrEAT) that included commercially available components as well as short instructional videos and an information booklet. This second phase culminated in a one-person, three-week pilot trial. Phase 3 involved a six-person, eight-week trial of the GrEAT, followed by individual in-depth interviews. Interpretative Phenomenological Analysis (IPA; Smith, Larkin & Flowers, 2009), aided by the computer software ATLAS.ti and iMindmap, guided data analysis and identification of themes. Getting used to ECS, experienced as both a hassle and engaging, resulted in participants being able to 'take back a little of what you have lost', which involved both feeling enabled and reclaiming a little doing. The findings of this study provide substantial insights into what it is like to live with ECS and the meanings attributed to that experience. Several practical, real-world implications are discussed.

Relevance: 30.00%

Abstract:

Although Common Pool Resources (CPRs) make up a significant share of total income for rural households in Ethiopia and elsewhere in the developing world, limited access to these resources and environmental degradation threaten local livelihoods. As a result, the issues of management and governance of CPRs, and how to prevent their over-exploitation, are of great importance for development policy. This study examines the current state and dynamics of CPRs and the overall resource governance system of the Lake Tana sub-basin. The research employed a modified form of the Institutional Analysis and Development (IAD) framework. The framework integrates the concept of Socio-Ecological Systems (SES) and Interactive Governance (IG) perspectives, where social actors, institutions, the politico-economic context, discourses and ecological features across governance and government levels were considered. It has been observed that overexploitation, degradation and encroachment of CPRs have increased dramatically, and this threatens the sustainability of the Lake Tana ecosystem. The stakeholder analysis reveals that there are multiple stakeholders with diverse interests in and power over CPRs. The analysis of institutional arrangements reveals that the existing formal rules and regulations governing access to and control over CPRs could not be implemented and were not effective in legally binding and governing CPR users' behavior at the operational level. The study also shows that a top-down and non-participatory process of policy formulation, law-making and decision making overlooks local contexts (local knowledge and informal institutions). The outcomes of examining the participation of local resource users, as an alternative to a centralized, command-and-control, hierarchical approach to resource management and governance, call for a fundamental shift in CPR use, management and governance to facilitate the participation of stakeholders in decision making. Therefore, establishing a multi-level stakeholder governance system, as an institutional structure and process, is necessary to sustain stakeholder participation in decision making regarding CPR use, management and governance.

Relevance: 30.00%

Abstract:

This paper develops a framework for estimating household preferences for school and neighborhood attributes in the presence of sorting. It embeds a boundary discontinuity design in a heterogeneous residential choice model, addressing the endogeneity of school and neighborhood characteristics. The model is estimated using restricted-access Census data from a large metropolitan area, yielding a number of new results. First, households are willing to pay less than 1 percent more in house prices - substantially lower than previous estimates - when the average performance of the local school increases by 5 percent. Second, much of the apparent willingness to pay for more educated and wealthier neighbors is explained by the correlation of these sociodemographic measures with unobserved neighborhood quality. Third, neighborhood race is not capitalized directly into housing prices; instead, the negative correlation of neighborhood percent black and housing prices is due entirely to the fact that blacks live in unobservably lower-quality neighborhoods. Finally, there is considerable heterogeneity in preferences for schools and neighbors, with households preferring to self-segregate on the basis of both race and education. © 2007 by The University of Chicago. All rights reserved.
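
A stylized version of the boundary-discontinuity idea (notation introduced here for illustration, not taken from the paper): houses on opposite sides of a school-attendance boundary are compared while a boundary fixed effect absorbs the neighborhood unobservables they share,

    \ln p_{ib} = \alpha\,\mathrm{score}_{s(i)} + X_i'\beta + \theta_b + \varepsilon_{ib},

where p_{ib} is the price of house i near boundary b, score_{s(i)} is the average performance of its assigned school, X_i collects house and neighborhood sociodemographic controls, and \theta_b is the boundary fixed effect. The paper embeds this design in a heterogeneous residential choice model rather than estimating it as a stand-alone hedonic regression.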

Relevance: 30.00%

Abstract:

Autophagy has been predominantly studied as a nonselective self-digestion process that recycles macromolecules and produces energy in response to starvation. However, autophagy independent of nutrient status has long been known to exist. Recent evidence suggests that this form of autophagy enforces intracellular quality control by selectively disposing of aberrant protein aggregates and damaged organelles--common denominators in various forms of neurodegenerative diseases. By definition, this form of autophagy, termed quality-control (QC) autophagy, must be different from nutrient-regulated autophagy in substrate selectivity, regulation and function. We have recently identified the ubiquitin-binding deacetylase, HDAC6, as a key component that establishes QC. HDAC6 is not required for autophagy activation per se; rather, it is recruited to ubiquitinated autophagic substrates where it stimulates autophagosome-lysosome fusion by promoting F-actin remodeling in a cortactin-dependent manner. Remarkably, HDAC6 and cortactin are dispensable for starvation-induced autophagy. These findings reveal that autophagosomes associated with QC are molecularly and biochemically distinct from those associated with starvation autophagy, thereby providing a new molecular framework to understand the emerging complexity of autophagy and therapeutic potential of this unique machinery.

Relevance: 30.00%

Abstract:

BACKGROUND: Integrated vector management (IVM) is increasingly being recommended as an option for sustainable malaria control. However, many malaria-endemic countries lack a policy framework to guide and promote the approach. The objective of the study was to assess knowledge and perceptions in relation to current malaria vector control policy and IVM in Uganda, and to make recommendations for consideration during future development of a specific IVM policy. METHODS: The study used a structured questionnaire to interview 34 individuals working at technical or policy-making levels in the health, environment, agriculture and fisheries sectors. Specific questions on IVM focused on the following key elements of the approach: integration of chemical and non-chemical interventions of vector control; evidence-based decision making; inter-sectoral collaboration; capacity building; legislation; advocacy and community mobilization. RESULTS: All participants were familiar with the term IVM and knew various conventional malaria vector control (MVC) methods. Only 75% thought that Uganda had an MVC policy. Eighty percent (80%) felt there was inter-sectoral collaboration towards IVM, but that it was poor due to financial constraints, difficulties in involving all possible sectors and political differences. The health, environment and agricultural sectors were cited as key areas requiring cooperation in order for IVM to succeed. Sixty-seven percent (67%) of participants responded that communities were actively being involved in MVC, while 48% felt that the use of research results for evidence-based decision making was inadequate or poor. A majority of the participants felt that malaria research in Uganda was rarely used to facilitate policy changes. Suggestions by participants for the formulation of a specific and effective IVM policy included revising the MVC policy and IVM-related policies in other sectors into a single, unified IVM policy, and using legislation to enforce IVM in development projects. CONCLUSION: Integrated management of malaria vectors in Uganda remains an underdeveloped component of malaria control policy. Cooperation between the health and other sectors needs strengthening, and funding for MVC needs to be increased, in order to develop and effectively implement an appropriate IVM policy. Continuous engagement of communities by government, as well as monitoring and evaluation of vector control programmes, will be crucial for sustaining IVM in the country.

Relevance: 30.00%

Abstract:

Addressing global fisheries overexploitation requires a better understanding of how small-scale fishing communities in developing countries limit access to fishing grounds. We analyze the performance of a system based on individual licenses and of a common property-rights regime in their ability to generate incentives for self-governance and conservation of fishery resources. Using a qualitative before-after-control-impact approach, we compare two neighbouring fishing communities in the Gulf of California, Mexico. Both were initially governed by the same permit system, are situated in the same ecosystem, use similar harvesting technology, and have overharvested similar species. One community changed to a common property-rights regime, enabling the emergence of access controls and avoiding overexploitation of benthic resources, while the other community still relies on the permit system. We discuss the roles played by power, institutions, and socio-historical and biophysical factors in the development of access controls. © 2012 The Author(s).

Relevance: 30.00%

Abstract:

This chapter presents a model averaging approach in the M-open setting, using sample re-use methods to approximate the predictive distribution of future observations. It first reviews the standard M-closed Bayesian Model Averaging approach and decision-theoretic methods for producing inferences and decisions. It then reviews model selection from the M-complete and M-open perspectives, before formulating a Bayesian solution to model averaging in the M-open perspective. It constructs optimal weights for M-open Model Averaging (MOMA) using a decision-theoretic framework, where models are treated as part of the 'action space' rather than as unknown states of nature. Using 'incompatible' retrospective and prospective models for data from a case-control study, the chapter demonstrates that MOMA gives better predictive accuracy than the proxy models. It concludes with open questions and future directions.
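
One way to read the construction (a stacking-style sketch in notation introduced here; the chapter's exact decision-theoretic weights may differ): the M-open predictive distribution is a simplex-weighted mixture of the candidate models' predictives, with the weights chosen by sample re-use rather than by posterior model probabilities,

    \hat p(y_{\text{new}} \mid y) = \sum_{k=1}^{K} w_k\, p_k(y_{\text{new}} \mid y),
    \qquad
    \hat w = \arg\max_{w \in \Delta_K} \sum_{i=1}^{n} \log \sum_{k=1}^{K} w_k\, p_k(y_i \mid y_{-i}),

where p_k(y_i | y_{-i}) is model k's leave-one-out predictive density for observation i and \Delta_K is the probability simplex; posterior model probabilities are avoided because they rest on the M-closed assumption that the true model is among the candidates.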

Relevance: 30.00%

Abstract:

The increasing complexity of new manufacturing processes and the continuously growing range of fabrication options mean that critical decisions about the insertion of new technologies must be made as early as possible in the design process. Mitigating the technology risks under limited knowledge is a key factor and a major requirement for securing a successful development of the new technologies. In order to address this challenge, a risk mitigation methodology that incorporates both qualitative and quantitative analysis is required. This paper outlines the methodology being developed under a major UK grand challenge project, 3D-Mintegration. The main focus is on identifying the risks through identification of the product key characteristics using a product breakdown approach. The assessment of the identified risks uses quantification and prioritisation techniques to evaluate and rank them. Traditional statistical process control, based on process capability and six sigma concepts, is applied to measure the process capability with respect to the risks that have been identified. This paper also details a numerical approach that can be used to undertake risk analysis. This methodology is based on a computational framework in which modelling and statistical techniques are integrated. An example of a modelling and simulation technique is also given using focused ion beam, which is among the manufacturing processes investigated in the project.
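
To make the process-capability step concrete, a minimal sketch (the specification limits, sample data, and the Cpk >= 1.5 rule of thumb are generic six sigma conventions, not values from the project):

    import statistics

    def process_capability(samples, lsl, usl):
        """Cp and Cpk for a key product characteristic, given measured
        samples and lower/upper specification limits (lsl, usl)."""
        mu = statistics.mean(samples)
        sigma = statistics.stdev(samples)              # sample standard deviation
        cp = (usl - lsl) / (6 * sigma)                 # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)    # capability allowing for off-centring
        return cp, cpk

    # Example: a micro-feature width specified as 10 +/- 0.5 um.
    cp, cpk = process_capability([10.1, 9.9, 10.2, 10.0, 9.8, 10.1], lsl=9.5, usl=10.5)
    # A six sigma process (with the conventional 1.5-sigma shift) targets Cpk >= 1.5.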

Relevance: 30.00%

Abstract:

Orthogonal frequency division multiplexing (OFDM) is becoming a fundamental technology in future-generation wireless communications. Call admission control is an effective mechanism to guarantee resilient, efficient services with quality-of-service (QoS) support in wireless mobile networks. In this paper, we present several call admission control algorithms for OFDM-based wireless multiservice networks. Call connection requests are differentiated into narrow-band calls and wide-band calls. For either class of calls, the traffic process is characterized as a batch arrival process, since each call may request multiple subcarriers to satisfy its QoS requirement. The batch size is a random variable following a probability mass function (PMF) with a realistic maximum value. In addition, the service times for wide-band and narrow-band calls differ. We then perform a tele-traffic queueing analysis for OFDM-based wireless multiservice networks and develop formulae for the key performance metrics, call blocking probability and bandwidth utilization. Numerical investigations are presented to demonstrate the interaction between key parameters and performance metrics. The performance tradeoff among different call admission control algorithms is discussed. Moreover, the analytical model has been validated by simulation. The methodology as well as the results provides an efficient tool for planning next-generation OFDM-based broadband wireless access systems.
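
For the blocking-probability metric, a minimal sketch of a standard multi-rate Erlang loss calculation (the Kaufman-Roberts recursion) for two call classes sharing a pool of subcarriers; this ignores the paper's batch-arrival generality and its specific admission rules, so it illustrates the kind of metric computed rather than the paper's model:

    def kaufman_roberts(capacity, classes):
        """Per-class blocking probabilities in a multi-rate loss system.
        `classes` is a list of (offered_load_erlangs, subcarriers_per_call)."""
        q = [0.0] * (capacity + 1)
        q[0] = 1.0
        for j in range(1, capacity + 1):
            q[j] = sum(a * b * q[j - b] for a, b in classes if j >= b) / j
        g = sum(q)                              # normalisation constant
        p = [x / g for x in q]                  # occupancy distribution
        # a class is blocked when fewer than its required subcarriers are free
        return [sum(p[capacity - b + 1:]) for _, b in classes]

    # Example: 64 subcarriers; narrow-band calls need 1, wide-band calls need 4.
    narrow_blocking, wide_blocking = kaufman_roberts(64, [(20.0, 1), (8.0, 4)])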

Relevance: 30.00%

Abstract:

This paper is part of a major project about the Northern Cape Land Reform and Advocacy (NCLRA) programme being implemented by FARM-Africa* in South Africa. The NCLRA programme had initiated a financial mechanism to help poor communities to get access to finance and training in order to enable them to make better use of their newly-acquired land. One prominent aspect of the programme is the implementation of Livestock Banks, or the use of animals as financial products. The paper provides an analytical framework with which to evaluate the effectiveness of Livestock Banks in the poor communities of the Northern Cape in South Africa. It focuses on the design, implementation and future of Livestock Banks. The paper argues that Livestock Banks need to be reformed and enhanced if they are to continue to play a key role in the goal of creating financial and economic value in Africa, particularly when the primary objective is simultaneously to help reduce poverty. [Note]*FARM-Africa (Food & Agricultural Research Management) is a registered UK charity organisation and a company limited by guarantee in England and Wales no. 01926828.

Relevance: 30.00%

Abstract:

The extent and gravity of the environmental degradation of the water resources in Dhaka due to untreated industrial waste is not fully recognised in international discourse. Pollution levels affect vast numbers of people, but the poor and the vulnerable are the worst affected. For example, rice productivity, the mainstay of poor farmers in the Dhaka watershed, has declined by 40% over a period of ten years. The study found significant correlations between water pollution and diseases such as jaundice, diarrhoea and skin problems. It was reported that the cost of treating one episode of skin disease could be as high as 29% of the weekly earnings of some of the poorest households. The dominant approach to dealing with pollution in small and medium-sized enterprises (SMEs) is technocratic. Given the magnitude of the problem, this paper argues that to control industrial pollution by SMEs and to enhance their compliance it is necessary to move from the technocratic approach to one which can also address the wider institutional and attitudinal issues. Underlying this shift is the need to adopt an appropriate methodology. Multi-stakeholder analysis enables an understanding of the actors, their influence, their capacity to participate in or oppose change, and the existing and embedded incentive structures which allow them to pursue interests that are generally detrimental to the environmental good. This enabled core and supporting strategies to be developed around three types of actors in industrial pollution: (i) principal actors, who directly contribute to industrial pollution; (ii) stakeholders who exacerbate the situation; and (iii) potential actors in mitigation. Within a carrot-and-stick framework, the strategies aim to improve environmental governance and transparency, set up a package of incentives for industry, and increase public awareness.