923 results for Territorial planning
Abstract:
This paper describes BUILD, a computer program which generates plans for building specified structures out of simple objects such as toy blocks. A powerful heuristic control structure enables BUILD to use a number of sophisticated construction techniques in its plans. Among these are the incorporation of pre-existing structure into the final design, pre-assembly of movable sub-structures on the table, and use of the extra blocks as temporary supports and counterweights in the course of construction. BUILD does its planning in a modeled 3-space in which blocks of various shapes and sizes can be represented in any orientation and location. The modeling system can maintain several world models at once, and contains modules for displaying states, testing them for inter-object contact and collision, and for checking the stability of complex structures involving frictional forces. Various alternative approaches are discussed, and suggestions are included for the extension of BUILD-like systems to other domains. Also discussed are the merits of BUILD's implementation language, CONNIVER, for this type of problem solving.
Abstract:
The motion planning problem is of central importance to the fields of robotics, spatial planning, and automated design. In robotics we are interested in the automatic synthesis of robot motions, given high-level specifications of tasks and geometric models of the robot and obstacles. The Mover's problem is to find a continuous, collision-free path for a moving object through an environment containing obstacles. We present an implemented algorithm for the classical formulation of the three-dimensional Mover's problem: given an arbitrary rigid polyhedral moving object P with three translational and three rotational degrees of freedom, find a continuous, collision-free path taking P from some initial configuration to a desired goal configuration. This thesis describes the first known implementation of a complete algorithm (at a given resolution) for the full six-degree-of-freedom Mover's problem. The algorithm transforms the six-degree-of-freedom planning problem into a point navigation problem in a six-dimensional configuration space (called C-Space). The C-Space obstacles, which characterize the physically unachievable configurations, are directly represented by six-dimensional manifolds whose boundaries are five-dimensional C-surfaces. By characterizing these surfaces and their intersections, collision-free paths may be found by the closure of three operators which (i) slide along 5-dimensional intersections of level C-Space obstacles; (ii) slide along 1- to 4-dimensional intersections of level C-surfaces; and (iii) jump between 6-dimensional obstacles. Implementing the point navigation operators requires solving fundamental representational and algorithmic questions: we derive new structural properties of the C-Space constraints and show how to construct and represent C-surfaces and their intersection manifolds.
A definition and new theoretical results are presented for a six-dimensional C-Space extension of the generalized Voronoi diagram, called the C-Voronoi diagram, whose structure we relate to the C-surface intersection manifolds. The representations and algorithms we develop impact many geometric planning problems, and extend to Cartesian manipulators with six degrees of freedom.
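The core reduction above, from moving a rigid body to navigating a point through C-Space obstacles, can be illustrated with a toy sketch. This is not the thesis's algorithm (which works with 5-dimensional C-surfaces at a given resolution); it is a minimal resolution-complete search over a discretized 2-D configuration grid, assuming obstacle cells are already marked:

```python
from collections import deque

def cspace_path(grid, start, goal):
    """BFS over a discretized configuration space.

    grid[i][j] == 1 marks a C-Space obstacle cell (a physically
    unachievable configuration); 0 cells are free. Returns a list of
    grid cells from start to goal, or None if no collision-free path
    exists at this resolution.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    parent = {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:   # walk parents back to start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        i, j = cell
        for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
            if 0 <= ni < rows and 0 <= nj < cols \
                    and grid[ni][nj] == 0 and (ni, nj) not in parent:
                parent[(ni, nj)] = cell
                frontier.append((ni, nj))
    return None
```

The full six-degree-of-freedom problem replaces this grid with manifolds bounded by C-surfaces, but the planning question is the same: point navigation through the free portion of configuration space.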
Abstract:
The problem of achieving conjunctive goals has been central to domain-independent planning research; the nonlinear constraint-posting approach has been most successful. Previous planners of this type have been complicated, heuristic, and ill-defined. I have combined and distilled the state of the art into a simple, precise, implemented algorithm (TWEAK) which I have proved correct and complete. I analyze previous work on domain-independent conjunctive planning; in retrospect it becomes clear that all conjunctive planners, linear and nonlinear, work the same way. The efficiency of these planners depends on the traditional add/delete-list representation for actions, which drastically limits their usefulness. I present theorems that suggest that efficient general purpose planning with more expressive action representations is impossible, and suggest ways to avoid this problem.
Abstract:
Robots must successfully plan and execute tasks in the presence of uncertainty. Uncertainty arises from errors in modeling, sensing, and control. Planning in the presence of uncertainty constitutes one facet of the general motion planning problem in robotics. This problem is concerned with the automatic synthesis of motion strategies from high-level task specifications and geometric models of environments. In order to develop successful motion strategies, it is necessary to understand the effect of uncertainty on the geometry of object interactions. Object interactions, both static and dynamic, may be represented in geometrical terms. This thesis investigates geometrical tools for modeling and overcoming uncertainty. The thesis describes an algorithm for computing backprojections of desired task configurations. Task goals and motion states are specified in terms of a moving object's configuration space. Backprojections specify regions in configuration space from which particular motions are guaranteed to accomplish a desired task. The backprojection algorithm considers surfaces in configuration space that facilitate sliding towards the goal, while avoiding surfaces on which motions may prematurely halt. In executing a motion for a backprojection region, a plan executor must be able to recognize that a desired task has been accomplished. Since sensors are subject to uncertainty, recognition of task success is not always possible. The thesis considers the structure of backprojection regions and of task goals that ensures goal recognizability. The thesis also develops a representation of friction in configuration space, in terms of a friction cone analogous to the real-space friction cone. The friction cone provides the backprojection algorithm with a geometrical tool for determining points at which motions may halt.
Abstract:
This publication describes several models capable of simulating the behavior and fate of pesticides and other contaminants, and highlights how these models can be made more effective when the ability to handle the spatial dimension is added to them.
Abstract:
Bradshaw, K. & Urquhart, C. (2005). Theory and practice in strategic planning for health information systems. In: D. Wainwright (Ed.), UK Academy for Information Systems 10th conference 2005, 22-24 March 2005 (CD-ROM). Newcastle upon Tyne: Northumbria University.
Abstract:
The future of theology libraries is far from clear. Since the nineteenth century, theology libraries have evolved to support the work of theological education. This article briefly reviews the development of theology libraries in North America and examines the contextual changes impacting theology libraries today. Three significant factors that will shape theology libraries in the coming decade are collaborative models of pedagogy and scholarship, globalization and rapid changes in information technology, and changes in the nature of scholarly publishing including the digitization of information. A large body of research is available to assist those responsible for guiding the direction of theology libraries in the next decade, but there are significant gaps in what we know about the impact of technology on how people use information that must be filled in order to provide a solid foundation for planning.
Abstract:
In this paper we discuss a new type of query in Spatial Databases, called the Trip Planning Query (TPQ). Given a set of points P in space, where each point belongs to a category, and given two points s and e, TPQ asks for the best trip that starts at s, passes through exactly one point from each category, and ends at e. An example of a TPQ is when a user wants to visit a set of different places and at the same time minimize the total travelling cost, e.g. what is the shortest travelling plan for me to visit an automobile shop, a CVS pharmacy outlet, and a Best Buy shop along my trip from A to B? The trip planning query is an extension of the well-known TSP problem and therefore is NP-hard. The difficulty of this query lies in the existence of multiple choices for each category. In this paper, we first study fast approximation algorithms for the trip planning query in a metric space, assuming that the data set fits in main memory, and give a theoretical analysis of their approximation bounds. Then, the trip planning query is examined for data sets that do not fit in main memory and must be stored on disk. For the disk-resident data, we consider two cases. In one case, we assume that the points are located in Euclidean space and indexed with an R-tree. In the other case, we consider the problem of points that lie on the edges of a spatial network (e.g. a road network), where the distance between two points is defined using the shortest distance over the network. Finally, we give an experimental evaluation of the proposed algorithms using synthetic data sets generated on real road networks.
Abstract:
Since Wireless Sensor Networks (WSNs) are subject to failures, fault-tolerance becomes an important requirement for many WSN applications. Fault-tolerance can be enabled in different areas of WSN design and operation, including the Medium Access Control (MAC) layer and the initial topology design. To be robust to failures, a MAC protocol must be able to adapt to traffic fluctuations and topology dynamics. We design ER-MAC, which can switch from energy-efficient operation in normal monitoring to reliable and fast delivery for emergency monitoring, and vice versa. It can also prioritise high-priority packets and guarantee fair packet delivery from all sensor nodes. Topology design supports fault-tolerance by ensuring that there are alternative acceptable routes to data sinks when failures occur. We provide solutions for four topology planning problems: Additional Relay Placement (ARP), Additional Backup Placement (ABP), Multiple Sink Placement (MSP), and Multiple Sink and Relay Placement (MSRP). Our solutions use a local search technique based on Greedy Randomized Adaptive Search Procedures (GRASP). GRASP-ARP deploys relays for (k,l)-sink-connectivity, where each sensor node must have k vertex-disjoint paths of length ≤ l. To count how many disjoint paths a node has, we propose Counting-Paths. GRASP-ABP deploys fewer relays than GRASP-ARP by focusing only on the most important nodes – those whose failure has the worst effect. To identify such nodes, we define Length-constrained Connectivity and Rerouting Centrality (l-CRC). Greedy-MSP and GRASP-MSP place minimal cost sinks to ensure that each sensor node in the network is double-covered, i.e. has two length-bounded paths to two sinks. Greedy-MSRP and GRASP-MSRP deploy sinks and relays with minimal cost to make the network double-covered and non-critical, i.e. all sensor nodes must have length-bounded alternative paths to sinks when an arbitrary sensor node fails.
We then evaluate the fault-tolerance of each topology in data gathering simulations using ER-MAC.
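The (k,l)-connectivity requirement, k vertex-disjoint paths of length at most l, can be estimated with a simple greedy routine: repeatedly find a shortest path by BFS, count it if short enough, then delete its internal vertices. This is a heuristic sketch in the spirit of Counting-Paths, not the thesis's algorithm:

```python
from collections import deque

def count_disjoint_paths(adj, src, dst, max_len):
    """Greedy lower-bound count of vertex-disjoint src-dst paths of
    hop length <= max_len in a directed graph given as an adjacency
    dict. Each accepted path's internal vertices are removed before
    searching for the next one."""
    blocked = set()
    count = 0
    while True:
        parent = {src: None}
        q = deque([src])
        while q and dst not in parent:          # BFS for a shortest path
            u = q.popleft()
            for v in adj.get(u, ()):
                if v not in parent and v not in blocked:
                    parent[v] = u
                    q.append(v)
        if dst not in parent:
            return count
        path = []
        v = dst
        while v is not None:
            path.append(v)
            v = parent[v]
        if len(path) - 1 > max_len:             # shortest path too long: stop
            return count
        count += 1
        blocked.update(path[1:-1])              # block internal vertices only
```

Since BFS always returns a shortest remaining path, once that path exceeds the length bound no shorter one exists and the search can stop; greedy vertex removal, however, may undercount, which is why it is only a sketch of the idea.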
Abstract:
Background: The Early Development Instrument (EDI) is a population-level measure of five developmental domains at school-entry age. The overall aim of this thesis was to explore the potential of the EDI as an indicator of early development in Ireland. Methods: A cross-sectional study was conducted in 47 primary schools in 2011 using the EDI and a linked parental questionnaire. EDI (teacher-completed) scores were calculated for 1,344 children in their first year of full-time education. Those scoring in the lowest 10% of the sample population in one or more domains were deemed to be 'developmentally vulnerable'. Scores were correlated with contextual data from the parental questionnaire and with indicators of area and school-level deprivation. Rasch analysis was used to determine the validity of the EDI. Results: Over one quarter (27.5%) of all children in the study were developmentally vulnerable. Individual characteristics associated with increased risk of vulnerability were being male; under 5 years old; and having English as a second language. Adjusted for these demographics, low birth weight, poor parent/child interaction and mother's lower level of education showed the most significant odds ratios for developmental vulnerability. Vulnerability did not follow the area-level deprivation gradient as measured by a composite index of material deprivation. Children considered by the teacher to be in need of assessment also had lower scores, which were not significantly different from those of children with a clinical diagnosis of special needs. All domains showed at least reasonable fit to the Rasch model, supporting the validity of the instrument. However, there was a need for further refinement of the instrument in the Irish context. Conclusion: This thesis provides a unique snapshot of early development in Ireland. The EDI and linked parental questionnaires are promising indicators of the extent, distribution and determinants of developmental vulnerability.
Abstract:
Submission on behalf of UCC to the Government Consultation on the White paper on Irish Aid
Abstract:
A growing number of software development projects successfully exhibit a mix of agile and traditional software development methodologies. Many of these mixed methodologies are organization specific and tailored to a specific project. Our objective in this research-in-progress paper is to develop an artifact that can guide the development of such a mixed methodology. Using control theory, we design a process model that provides theoretical guidance to build a portfolio of controls that can support the development of a mixed methodology for software development. Controls, embedded in methods, provide a generalizable and adaptable framework for project managers to develop their mixed methodology specific to the demands of the project. A research methodology is proposed to test the model. Finally, future directions and contributions are discussed.
Abstract:
BACKGROUND: Outpatient palliative care, an evolving delivery model, seeks to improve continuity of care across settings and to increase access to services in hospice and palliative medicine (HPM). It can provide a critical bridge between inpatient palliative care and hospice, filling the gap in community-based supportive care for patients with advanced life-limiting illness. Low capacities for data collection and quantitative research in HPM have impeded assessment of the impact of outpatient palliative care. APPROACH: In North Carolina, a regional database for community-based palliative care has been created through a unique partnership between a HPM organization and academic medical center. This database flexibly uses information technology to collect patient data, entered at the point of care (e.g., home, inpatient hospice, assisted living facility, nursing home). HPM physicians and nurse practitioners collect data; data are transferred to an academic site that assists with analyses and data management. Reports to community-based sites, based on data they provide, create a better understanding of local care quality. CURRENT STATUS: The data system was developed and implemented over a 2-year period, starting with one community-based HPM site and expanding to four. Data collection methods were collaboratively created and refined. The database continues to grow. Analyses presented herein examine data from one site and encompass 2572 visits from 970 new patients, characterizing the population, symptom profiles, and change in symptoms after intervention. CONCLUSION: A collaborative regional approach to HPM data can support evaluation and improvement of palliative care quality at the local, aggregated, and statewide levels.
Abstract:
PURPOSE: To demonstrate the feasibility of using a knowledge base of prior treatment plans to generate new prostate intensity modulated radiation therapy (IMRT) plans. Each new case would be matched against others in the knowledge base. Once the best match is identified, that clinically approved plan is used to generate the new plan. METHODS: A database of 100 prostate IMRT treatment plans was assembled into an information-theoretic system. An algorithm based on mutual information was implemented to identify similar patient cases by matching 2D beam's eye view projections of contours. Ten randomly selected query cases were each matched with the most similar case from the database of prior clinically approved plans. Treatment parameters from the matched case were used to develop new treatment plans. Differences in the dose-volume histograms between the new and the original treatment plans were analyzed. RESULTS: On average, the new knowledge-based plan achieves planning target volume coverage very comparable to that of the original plan, to within 2% as evaluated for D98, D95, and D1. Similarly, the dose to the rectum and dose to the bladder are also comparable to the original plan. For the rectum, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are 1.8% +/- 8.5%, -2.5% +/- 13.9%, and -13.9% +/- 23.6%, respectively. For the bladder, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are -5.9% +/- 10.8%, -12.2% +/- 14.6%, and -24.9% +/- 21.2%, respectively. A negative percentage difference indicates that the new plan has greater dose sparing as compared to the original plan. CONCLUSIONS: The authors demonstrate a knowledge-based approach of using prior clinically approved treatment plans to generate clinically acceptable treatment plans of high quality.
This semiautomated approach has the potential to improve the efficiency of the treatment planning process while ensuring that high quality plans are developed.
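The mutual-information similarity measure used for case matching can be sketched generically. The code below computes MI from the joint histogram of two equally sized label images; it illustrates the measure itself, not the paper's beam's-eye-view matching pipeline:

```python
import math
from collections import Counter

def mutual_information(img_a, img_b):
    """Mutual information between two equally sized 2-D label images.

    Builds the joint histogram of co-located pixel values and sums
    p(a,b) * log(p(a,b) / (p(a) * p(b))), in nats. High MI means the
    two images' pixel values predict each other well.
    """
    pairs = [(a, b) for ra, rb in zip(img_a, img_b) for a, b in zip(ra, rb)]
    n = len(pairs)
    joint = Counter(pairs)                 # joint histogram p(a, b)
    pa = Counter(a for a, _ in pairs)      # marginal p(a)
    pb = Counter(b for _, b in pairs)      # marginal p(b)
    mi = 0.0
    for (a, b), c in joint.items():
        pab = c / n
        mi += pab * math.log(pab * n * n / (pa[a] * pb[b]))
    return mi
```

Identical images yield MI equal to their entropy, while statistically independent images yield zero, which is what makes MI usable as a ranking score for retrieving the most similar prior case.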
Abstract:
PURPOSE: To investigate the dosimetric effects of adaptive planning on lung stereotactic body radiation therapy (SBRT). METHODS AND MATERIALS: Forty of 66 consecutive lung SBRT patients were selected for a retrospective adaptive planning study. CBCT images acquired at each fraction were used for treatment planning. Adaptive plans were created using the same planning parameters as the original CT-based plan, with the goal of achieving a comparable conformality index (CI). For each patient, 2 cumulative plans, a nonadaptive plan (PNON) and an adaptive plan (PADP), were generated and compared for the following organs at risk (OARs): cord, esophagus, chest wall, and the lungs. Dosimetric comparison was performed between PNON and PADP for all 40 patients. Correlations were evaluated between changes in dosimetric metrics induced by adaptive planning and potential impacting factors, including tumor-to-OAR distances (dT-OAR), initial internal target volume (ITV1), ITV change (ΔITV), and effective ITV diameter change (ΔdITV). RESULTS: Thirty-four (85%) patients showed ITV decrease and 6 (15%) patients showed ITV increase throughout the course of lung SBRT. Percentage ITV change ranged from -59.6% to 13.0%, with a mean (±SD) of -21.0% (±21.4%). Averaged over all patients, PADP resulted in significantly (P = .000 to .045) lower values for all dosimetric metrics. ΔdITV/dT-OAR was found to correlate with changes in dose to 5 cc (ΔD5cc) of the esophagus (r=0.61) and dose to 30 cc (ΔD30cc) of the chest wall (r=0.81). Strong correlations between ΔdITV/dT-OAR and ΔD30cc of the chest wall were found for peripheral (r=0.81) and central (r=0.84) tumors, respectively. CONCLUSIONS: Dosimetric effects of adaptive lung SBRT planning depend upon target volume changes and tumor-to-OAR distances. Adaptive lung SBRT can potentially reduce dose to adjacent OARs if patients present large tumor volume shrinkage during the treatment.
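The DxCC metrics compared above (e.g. D5cc of the esophagus) and the percentage differences reported can be sketched generically. This is a minimal dose-volume-histogram computation over per-voxel doses, assuming uniform voxel volume; it is not the study's planning system:

```python
import math

def dose_to_volume_cc(doses, voxel_cc, volume_cc):
    """D_Vcc: minimum dose received by the hottest `volume_cc` of a
    structure, from per-voxel doses with uniform voxel volume
    `voxel_cc`. E.g. volume_cc=5 gives D5cc."""
    ranked = sorted(doses, reverse=True)          # hottest voxels first
    k = max(1, math.ceil(volume_cc / voxel_cc))   # voxels covering volume_cc
    return ranked[min(k, len(ranked)) - 1]

def pct_change(adaptive, nonadaptive):
    """Percentage difference between plans; a negative value means the
    adaptive plan delivers less dose (greater sparing) than the
    non-adaptive one."""
    return 100.0 * (adaptive - nonadaptive) / nonadaptive
```

Comparing such metrics between PNON and PADP per organ at risk is what yields the sign convention used in the results: negative changes favor the adaptive plan.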