1000 results for Planning modalities


Relevance: 20.00%

Abstract:

In this paper we discuss a new type of query in Spatial Databases, called the Trip Planning Query (TPQ). Given a set of points P in space, where each point belongs to a category, and given two points s and e, TPQ asks for the best trip that starts at s, passes through exactly one point from each category, and ends at e. An example of a TPQ is when a user wants to visit a set of different places while minimizing the total travelling cost, e.g. what is the shortest travelling plan for me to visit an automobile shop, a CVS pharmacy outlet, and a Best Buy shop along my trip from A to B? The trip planning query is an extension of the well-known TSP problem and is therefore NP-hard. The difficulty of this query lies in the existence of multiple choices for each category. In this paper, we first study fast approximation algorithms for the trip planning query in a metric space, assuming that the data set fits in main memory, and give a theoretical analysis of their approximation bounds. Then, the trip planning query is examined for data sets that do not fit in main memory and must be stored on disk. For the disk-resident data, we consider two cases. In one case, we assume that the points are located in Euclidean space and indexed with an R-tree. In the other case, we consider points that lie on the edges of a spatial network (e.g. a road network), where the distance between two points is defined as the shortest distance over the network. Finally, we give an experimental evaluation of the proposed algorithms using synthetic data sets generated on real road networks.
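For intuition, the following Python sketch shows one of the simplest heuristics for a TPQ in the plane: repeatedly hop to the nearest point of any still-uncovered category, then finish at the end point. The data layout and function names are illustrative assumptions; this is not the paper's approximation algorithms and carries none of their proven bounds.

```python
import math

def greedy_trip(start, end, categories):
    """Greedy heuristic for a Trip Planning Query in the plane.

    `categories` maps category name -> list of (x, y) candidate points.
    At each step, visit the nearest point from any uncovered category,
    then finish at `end`.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    current = start
    remaining = dict(categories)   # categories still to be covered
    trip, total = [start], 0.0
    while remaining:
        # Pick the globally nearest point among all uncovered categories.
        cat, point = min(
            ((c, p) for c, pts in remaining.items() for p in pts),
            key=lambda cp: dist(current, cp[1]),
        )
        total += dist(current, point)
        trip.append(point)
        current = point
        del remaining[cat]
    total += dist(current, end)
    trip.append(end)
    return trip, total

# Example: three shop categories between s=(0,0) and e=(10,0).
shops = {
    "automobile":  [(2, 3), (8, 1)],
    "pharmacy":    [(4, -1), (9, 4)],
    "electronics": [(6, 2)],
}
route, cost = greedy_trip((0, 0), (10, 0), shops)
```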

Relevance: 20.00%

Abstract:

An active, attentionally-modulated recognition architecture is proposed for object recognition and scene analysis. The proposed architecture forms part of navigation and trajectory planning modules for mobile robots. Key characteristics of the system include movement planning and execution based on environmental factors and internal goal definitions. Real-time implementation of the system is based on a space-variant representation of the visual field, as well as an optimal visual processing scheme utilizing separate and parallel channels for the extraction of boundaries and stimulus qualities. A spatial and temporal grouping module (VWM) allows for scene scanning, multi-object segmentation, and featural/object priming. VWM is used to modulate a trajectory formation module capable of redirecting the focus of spatial attention. Finally, an object recognition module based on adaptive resonance theory is interfaced through VWM to the visual processing module. The system is capable of using information from different modalities to disambiguate sensory input.
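As a rough illustration of how the described modules could be wired together, the Python stub below sketches the dataflow only: visual processing feeds VWM, which modulates trajectory formation (attention shifts) and an ART-style recognizer. All class names, interfaces, and placeholder computations are assumptions made for illustration, not the paper's implementation.

```python
class VisualProcessing:
    """Parallel channels: boundary extraction and stimulus-quality features."""
    def process(self, frame):
        boundaries = [px for px in frame if px > 0.5]   # stand-in boundary channel
        qualities = [px * 0.5 for px in frame]          # stand-in feature channel
        return boundaries, qualities

class VWM:
    """Spatial/temporal grouping: holds candidate object groups and priming."""
    def group(self, boundaries, qualities):
        self.groups = [(boundaries, qualities)]         # single-group placeholder
        return self.groups

class TrajectoryFormation:
    """Redirects the focus of spatial attention based on VWM content."""
    def next_fixation(self, groups):
        return 0 if groups else None                    # index of group to attend

class ARTRecognizer:
    """Object recognition stand-in, interfaced through VWM."""
    def recognize(self, group):
        return "object" if group else "unknown"

# One pass of the loop: sense -> group -> shift attention -> recognize.
frame = [0.1, 0.7, 0.9, 0.3]
vp, vwm, traj, art = VisualProcessing(), VWM(), TrajectoryFormation(), ARTRecognizer()
b, q = vp.process(frame)
groups = vwm.group(b, q)
focus = traj.next_fixation(groups)
label = art.recognize(groups[focus] if focus is not None else None)
```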

Relevance: 20.00%

Abstract:

Since Wireless Sensor Networks (WSNs) are subject to failures, fault-tolerance becomes an important requirement for many WSN applications. Fault-tolerance can be enabled in different areas of WSN design and operation, including the Medium Access Control (MAC) layer and the initial topology design. To be robust to failures, a MAC protocol must be able to adapt to traffic fluctuations and topology dynamics. We design ER-MAC, which can switch from energy-efficient operation in normal monitoring to reliable and fast delivery for emergency monitoring, and vice versa. It can also prioritise high-priority packets and guarantee fair packet delivery from all sensor nodes. Topology design supports fault-tolerance by ensuring that there are alternative acceptable routes to data sinks when failures occur. We provide solutions for four topology planning problems: Additional Relay Placement (ARP), Additional Backup Placement (ABP), Multiple Sink Placement (MSP), and Multiple Sink and Relay Placement (MSRP). Our solutions use a local search technique based on Greedy Randomized Adaptive Search Procedures (GRASP). GRASP-ARP deploys relays for (k,l)-sink-connectivity, where each sensor node must have k vertex-disjoint paths of length ≤ l to the sinks. To count how many disjoint paths a node has, we propose Counting-Paths. GRASP-ABP deploys fewer relays than GRASP-ARP by focusing only on the most important nodes – those whose failure has the worst effect. To identify such nodes, we define Length-constrained Connectivity and Rerouting Centrality (l-CRC). Greedy-MSP and GRASP-MSP place minimal-cost sinks to ensure that each sensor node in the network is double-covered, i.e. has two length-bounded paths to two sinks. Greedy-MSRP and GRASP-MSRP deploy sinks and relays with minimal cost to make the network double-covered and non-critical, i.e. all sensor nodes must have length-bounded alternative paths to sinks when an arbitrary sensor node fails. We then evaluate the fault-tolerance of each topology in data gathering simulations using ER-MAC.
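The following Python sketch shows the generic GRASP pattern (greedy randomized construction followed by local search) that placement procedures like GRASP-ARP/ABP/MSP/MSRP build on. The cost and feasibility callbacks, parameter names, and the simple drop-based local search are placeholders; the thesis's actual (k,l)-connectivity test (Counting-Paths) and neighbourhood moves are abstracted away.

```python
import random

def grasp(candidates, cost, is_feasible, iterations=50, alpha=0.3, seed=0):
    """Generic GRASP skeleton: randomized greedy construction + local search.

    candidates        - possible relay/sink locations
    cost(sol)         - objective to minimize (e.g. deployment cost)
    is_feasible(sol)  - e.g. a (k,l)-sink-connectivity check (abstracted here)
    """
    rng = random.Random(seed)
    best = None

    for _ in range(iterations):
        # --- Greedy randomized construction ---
        solution, remaining = [], list(candidates)
        while not is_feasible(solution) and remaining:
            ranked = sorted(remaining, key=lambda c: cost(solution + [c]))
            rcl = ranked[:max(1, int(alpha * len(ranked)))]  # restricted candidate list
            choice = rng.choice(rcl)
            solution.append(choice)
            remaining.remove(choice)

        # --- Local search: try dropping redundant placements ---
        improved = True
        while improved:
            improved = False
            for r in list(solution):
                trial = [x for x in solution if x != r]
                if is_feasible(trial) and cost(trial) < cost(solution):
                    solution, improved = trial, True

        if is_feasible(solution) and (best is None or cost(solution) < cost(best)):
            best = solution
    return best

# Toy usage: pick the fewest "relays" whose ids sum to at least 10.
best = grasp(candidates=list(range(1, 8)), cost=len,
             is_feasible=lambda sol: sum(sol) >= 10)
```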

Relevance: 20.00%

Abstract:

Background: The Early Development Instrument (EDI) is a population-level measure of five developmental domains at school-entry age. The overall aim of this thesis was to explore the potential of the EDI as an indicator of early development in Ireland. Methods: A cross-sectional study was conducted in 47 primary schools in 2011 using the EDI and a linked parental questionnaire. EDI (teacher-completed) scores were calculated for 1,344 children in their first year of full-time education. Those scoring in the lowest 10% of the sample population in one or more domains were deemed to be 'developmentally vulnerable'. Scores were correlated with contextual data from the parental questionnaire and with indicators of area- and school-level deprivation. Rasch analysis was used to determine the validity of the EDI. Results: Over one quarter (27.5%) of all children in the study were developmentally vulnerable. Individual characteristics associated with increased risk of vulnerability were being male, being under 5 years old, and having English as a second language. Adjusted for these demographics, low birth weight, poor parent/child interaction and mother's lower level of education showed the most significant odds ratios for developmental vulnerability. Vulnerability did not follow the area-level deprivation gradient as measured by a composite index of material deprivation. Children considered by the teacher to be in need of assessment also had lower scores, which were not significantly different from those of children with a clinical diagnosis of special needs. All domains showed at least reasonable fit to the Rasch model, supporting the validity of the instrument. However, there was a need for further refinement of the instrument in the Irish context. Conclusion: This thesis provides a unique snapshot of early development in Ireland. The EDI and linked parental questionnaires are promising indicators of the extent, distribution and determinants of developmental vulnerability.
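As an illustration of the cutoff rule described above, the sketch below flags children scoring in the lowest 10% of the sample on one or more domains. The data are synthetic and the function is illustrative only; it is not the thesis's analysis code.

```python
import numpy as np

def vulnerability_flags(scores, cutoff=10):
    """Flag children scoring in the lowest `cutoff` percent on any EDI domain.

    `scores` is an (n_children, n_domains) array of domain scores; the 10th
    percentile is computed per domain over the sample, mirroring the
    'lowest 10% in one or more domains' rule described in the abstract.
    """
    thresholds = np.percentile(scores, cutoff, axis=0)   # one cutoff per domain
    return (scores <= thresholds).any(axis=1)            # vulnerable in >= 1 domain

# Example with 5 synthetic domains for 200 children (illustrative only).
rng = np.random.default_rng(0)
scores = rng.normal(size=(200, 5))
print(vulnerability_flags(scores).mean())  # proportion flagged as vulnerable
```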

Relevance: 20.00%

Abstract:

Submission on behalf of UCC to the Government Consultation on the White Paper on Irish Aid

Relevance: 20.00%

Abstract:

A growing number of software development projects successfully exhibit a mix of agile and traditional software development methodologies. Many of these mixed methodologies are organization specific and tailored to a specific project. Our objective in this research-in-progress paper is to develop an artifact that can guide the development of such a mixed methodology. Using control theory, we design a process model that provides theoretical guidance to build a portfolio of controls that can support the development of a mixed methodology for software development. Controls, embedded in methods, provide a generalizable and adaptable framework for project managers to develop their mixed methodology specific to the demands of the project. A research methodology is proposed to test the model. Finally, future directions and contributions are discussed.
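To make the idea of a "portfolio of controls" concrete, here is a small illustrative Python data structure that maps control modes from control theory (e.g. outcome, behaviour, clan) to the agile or traditional practices that enact them. The class names and the specific mappings are hypothetical examples, not the artifact proposed in the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Control:
    """A single control: its mode and the method element that enacts it."""
    mode: str        # e.g. "outcome", "behaviour", "clan", "self"
    mechanism: str   # the agile or traditional practice carrying the control

@dataclass
class ControlPortfolio:
    """Portfolio of controls a project manager assembles for a mixed methodology."""
    controls: list[Control] = field(default_factory=list)

    def add(self, mode: str, mechanism: str) -> None:
        self.controls.append(Control(mode, mechanism))

    def by_mode(self, mode: str) -> list[str]:
        return [c.mechanism for c in self.controls if c.mode == mode]

# Illustrative mix of agile and plan-driven mechanisms.
portfolio = ControlPortfolio()
portfolio.add("outcome", "sprint review against release burndown")   # agile
portfolio.add("behaviour", "stage-gate sign-off on requirements")    # traditional
portfolio.add("clan", "daily stand-up and shared definition of done")
print(portfolio.by_mode("outcome"))
```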

Relevance: 20.00%

Abstract:

PURPOSE: Mammography is known to be one of the most difficult radiographic exams to interpret. Mammography has important limitations, including the superposition of normal tissue that can obscure a mass, chance alignment of normal tissue that can mimic a true lesion, and the inability to derive volumetric information. It has been shown that stereomammography can overcome these deficiencies by showing that layers of normal tissue lie at different depths. If standard stereomammography (i.e., a single stereoscopic pair consisting of two projection images) can significantly improve lesion detection, how will multiview stereoscopy (MVS), where many projection images are used, compare to mammography? The aim of this study was to assess the relative performance of MVS compared to mammography for breast mass detection. METHODS: The MVS image sets consisted of the 25 raw projection images acquired over an arc of approximately 45 degrees using a Siemens prototype breast tomosynthesis system. The mammograms were acquired using a commercial Siemens FFDM system. The raw data were taken from both of these systems for 27 cases, and realistic simulated mass lesions were added to duplicates of the 27 images at the same local contrast. The images with lesions (27 mammography and 27 MVS) and the images without lesions (27 mammography and 27 MVS) were then postprocessed to provide comparable and representative image appearance across the two modalities. All 108 image sets were shown to five full-time breast imaging radiologists in random order on a state-of-the-art stereoscopic display. The observers were asked to give a confidence rating for each image (0 for lesion definitely not present, 100 for lesion definitely present). The ratings were then compiled and processed using ROC and variance analysis. RESULTS: The mean AUC for the five observers was 0.614 +/- 0.055 for mammography and 0.778 +/- 0.052 for multiview stereoscopy. The difference of 0.164 +/- 0.065 was statistically significant with a p-value of 0.0148. CONCLUSIONS: The differences in the AUCs and the p-value suggest that multiview stereoscopy has a statistically significant advantage over mammography in the detection of simulated breast masses. This highlights the dominance of anatomical noise compared to quantum noise for breast mass detection. It also shows that significant lesion detection can be achieved with MVS without any of the artifacts associated with tomosynthesis.
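For reference, the empirical AUC used in reader studies of this kind can be computed directly from the 0-100 confidence ratings as a Mann-Whitney statistic. The sketch below uses synthetic ratings and is not the study's analysis code.

```python
import numpy as np

def auc_from_ratings(ratings_with_lesion, ratings_without_lesion):
    """Empirical ROC AUC from confidence ratings.

    Equivalent to the Mann-Whitney U statistic: the probability that a
    lesion-present image receives a higher rating than a lesion-absent one
    (ties counted as 1/2).
    """
    pos = np.asarray(ratings_with_lesion, dtype=float)
    neg = np.asarray(ratings_without_lesion, dtype=float)
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)

# Illustrative ratings for one reader and one modality (27 + 27 images).
rng = np.random.default_rng(1)
with_lesion = rng.integers(40, 101, size=27)
without_lesion = rng.integers(0, 70, size=27)
print(auc_from_ratings(with_lesion, without_lesion))
```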

Relevance: 20.00%

Abstract:

BACKGROUND: Outpatient palliative care, an evolving delivery model, seeks to improve continuity of care across settings and to increase access to services in hospice and palliative medicine (HPM). It can provide a critical bridge between inpatient palliative care and hospice, filling the gap in community-based supportive care for patients with advanced life-limiting illness. Low capacities for data collection and quantitative research in HPM have impeded assessment of the impact of outpatient palliative care. APPROACH: In North Carolina, a regional database for community-based palliative care has been created through a unique partnership between an HPM organization and an academic medical center. This database flexibly uses information technology to collect patient data, entered at the point of care (e.g., home, inpatient hospice, assisted living facility, nursing home). HPM physicians and nurse practitioners collect data; data are transferred to an academic site that assists with analyses and data management. Reports to community-based sites, based on data they provide, create a better understanding of local care quality. CURRENT STATUS: The data system was developed and implemented over a 2-year period, starting with one community-based HPM site and expanding to four. Data collection methods were collaboratively created and refined. The database continues to grow. Analyses presented herein examine data from one site and encompass 2572 visits from 970 new patients, characterizing the population, symptom profiles, and change in symptoms after intervention. CONCLUSION: A collaborative regional approach to HPM data can support evaluation and improvement of palliative care quality at the local, aggregated, and statewide levels.

Relevance: 20.00%

Abstract:

PURPOSE: To demonstrate the feasibility of using a knowledge base of prior treatment plans to generate new prostate intensity modulated radiation therapy (IMRT) plans. Each new case would be matched against others in the knowledge base. Once the best match is identified, that clinically approved plan is used to generate the new plan. METHODS: A database of 100 prostate IMRT treatment plans was assembled into an information-theoretic system. An algorithm based on mutual information was implemented to identify similar patient cases by matching 2D beam's eye view projections of contours. Ten randomly selected query cases were each matched with the most similar case from the database of prior clinically approved plans. Treatment parameters from the matched case were used to develop new treatment plans. Differences in the dose-volume histograms between the new and the original treatment plans were analyzed. RESULTS: On average, the new knowledge-based plan achieves planning target volume coverage comparable to that of the original plan, to within 2% as evaluated for D98, D95, and D1. Similarly, the dose to the rectum and the dose to the bladder are also comparable to the original plan. For the rectum, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are 1.8% +/- 8.5%, -2.5% +/- 13.9%, and -13.9% +/- 23.6%, respectively. For the bladder, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are -5.9% +/- 10.8%, -12.2% +/- 14.6%, and -24.9% +/- 21.2%, respectively. A negative percentage difference indicates that the new plan has greater dose sparing than the original plan. CONCLUSIONS: The authors demonstrate a knowledge-based approach that uses prior clinically approved treatment plans to generate clinically acceptable treatment plans of high quality. This semiautomated approach has the potential to improve the efficiency of the treatment planning process while ensuring that high-quality plans are developed.
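The matching step relies on a mutual-information similarity score between 2D projections; a minimal NumPy version is sketched below. The bin count and toy usage are assumptions, and this is not the authors' implementation.

```python
import numpy as np

def mutual_information(image_a, image_b, bins=32):
    """Mutual information between two equally sized 2D projection images.

    Builds a joint intensity histogram and computes
    MI = sum p(a,b) * log( p(a,b) / (p(a) p(b)) ),
    the kind of similarity score used to compare beam's eye view projections.
    """
    joint, _, _ = np.histogram2d(image_a.ravel(), image_b.ravel(), bins=bins)
    pab = joint / joint.sum()
    pa = pab.sum(axis=1, keepdims=True)
    pb = pab.sum(axis=0, keepdims=True)
    nonzero = pab > 0
    return float((pab[nonzero] * np.log(pab[nonzero] / (pa @ pb)[nonzero])).sum())

# Toy example on two random 64x64 "projections"; a real matcher would sum the
# score over all beam angles and pick the database case with the highest total.
rng = np.random.default_rng(0)
a = rng.random((64, 64))
print(mutual_information(a, a))                      # self-similarity (high)
print(mutual_information(a, rng.random((64, 64))))   # independent images (near 0)
```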

Relevance: 20.00%

Abstract:

PURPOSE: To investigate the dosimetric effects of adaptive planning on lung stereotactic body radiation therapy (SBRT). METHODS AND MATERIALS: Forty of 66 consecutive lung SBRT patients were selected for a retrospective adaptive planning study. CBCT images acquired at each fraction were used for treatment planning. Adaptive plans were created using the same planning parameters as the original CT-based plan, with the goal of achieving a comparable conformality index (CI). For each patient, two cumulative plans, a nonadaptive plan (PNON) and an adaptive plan (PADP), were generated and compared for the following organs at risk (OARs): cord, esophagus, chest wall, and the lungs. A dosimetric comparison was performed between PNON and PADP for all 40 patients. Correlations were evaluated between changes in dosimetric metrics induced by adaptive planning and potential impacting factors, including tumor-to-OAR distances (dT-OAR), initial internal target volume (ITV1), ITV change (ΔITV), and effective ITV diameter change (ΔdITV). RESULTS: Thirty-four (85%) patients showed an ITV decrease and 6 (15%) patients showed an ITV increase over the course of lung SBRT. Percentage ITV change ranged from -59.6% to 13.0%, with a mean (±SD) of -21.0% (±21.4%). Averaged over all patients, PADP resulted in significantly (P=0 to .045) lower values for all dosimetric metrics. ΔdITV/dT-OAR was found to correlate with changes in dose to 5 cc (ΔD5cc) of the esophagus (r=0.61) and dose to 30 cc (ΔD30cc) of the chest wall (r=0.81). Strong correlations between ΔdITV/dT-OAR and ΔD30cc of the chest wall were found for both peripheral (r=0.81) and central (r=0.84) tumors. CONCLUSIONS: The dosimetric effects of adaptive lung SBRT planning depend on target volume changes and tumor-to-OAR distances. Adaptive lung SBRT can potentially reduce dose to adjacent OARs if the tumor volume shrinks substantially during treatment.
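The reported correlations are ordinary Pearson coefficients between per-patient metrics; a minimal sketch with hypothetical numbers is shown below for concreteness (the values are placeholders, not data from the study).

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired metric arrays."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    return float((x * y).sum() / np.sqrt((x**2).sum() * (y**2).sum()))

# Illustrative use: correlate the ratio of effective ITV diameter change to
# tumor-to-OAR distance against the change in chest wall D30cc per patient.
ratio = np.array([0.05, 0.12, 0.30, 0.08, 0.22])    # hypothetical ΔdITV / dT-OAR
delta_d30cc = np.array([0.4, 1.1, 3.0, 0.7, 2.1])   # hypothetical ΔD30cc (Gy)
print(pearson_r(ratio, delta_d30cc))
```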

Relevance: 20.00%

Abstract:

Computed tomography (CT) is one of the most valuable modalities for in vivo imaging because it is fast, high-resolution, cost-effective, and non-invasive. Moreover, CT is heavily used not only in the clinic (for both diagnostics and treatment planning) but also in preclinical research as micro-CT. Although CT is inherently effective for lung and bone imaging, soft tissue imaging requires the use of contrast agents. For small animal micro-CT, nanoparticle contrast agents are used in order to avoid rapid renal clearance. A variety of nanoparticles have been used for micro-CT imaging, but the majority of research has focused on the use of iodine-containing nanoparticles and gold nanoparticles. Both nanoparticle types can act as highly effective blood pool contrast agents or can be targeted using a wide variety of targeting mechanisms. CT imaging can be further enhanced by adding spectral capabilities to separate multiple co-injected nanoparticles in vivo. Spectral CT, using both energy-integrating and energy-resolving detectors, has been used with multiple contrast agents to enable functional and molecular imaging. This review focuses on new developments for in vivo small animal micro-CT using novel nanoparticle probes applied in preclinical research.
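As an illustration of how spectral CT can separate two co-injected agents, the sketch below performs a per-voxel two-material decomposition from attenuation measurements at two energy bins by solving a 2x2 linear system. The basis attenuation values and the toy data are placeholders, not calibrated values from the review.

```python
import numpy as np

def two_material_decomposition(mu_low, mu_high, basis):
    """Separate two co-injected contrast materials from dual-energy CT data.

    `mu_low`, `mu_high` are attenuation images at the low/high energy bins.
    `basis` is a 2x2 matrix of per-unit-concentration attenuation values:
        [[mu_iodine_low,  mu_gold_low ],
         [mu_iodine_high, mu_gold_high]]
    Solving the 2x2 system per voxel gives a concentration map for each agent.
    """
    measurements = np.stack([mu_low.ravel(), mu_high.ravel()])   # (2, n_voxels)
    concentrations = np.linalg.solve(basis, measurements)        # (2, n_voxels)
    iodine_map = concentrations[0].reshape(mu_low.shape)
    gold_map = concentrations[1].reshape(mu_low.shape)
    return iodine_map, gold_map

# Toy example on a 4x4 image with an assumed (uncalibrated) basis matrix.
basis = np.array([[4.0, 6.0],
                  [2.0, 5.0]])
rng = np.random.default_rng(2)
low, high = rng.random((4, 4)), rng.random((4, 4))
iodine, gold = two_material_decomposition(low, high, basis)
```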

Relevance: 20.00%

Abstract:

Evolving family structure and economic conditions may affect individuals' ability and willingness to plan for future long-term care (LTC) needs. We applied life course constructs to analyze focus group data from a study of family decision making about LTC insurance. Participants described how past exposure to caregiving motivated them to engage in LTC planning; in contrast, child rearing discouraged LTC planning. Perceived institutional and economic instability drove individuals to regard financial LTC planning as either a wise precaution or another risk. Perceived economic instability also shaped opinions that adult children are ill-equipped to support parents' LTC. Despite concerns about the viability of social insurance programs, some participants described strategies to maximize gains from them. Changing norms around aging and family roles also affected expectations of an active older age, innovative LTC options, and limitations to adult children's involvement. Understanding life course context can inform policy efforts to encourage LTC planning.

Relevance: 20.00%

Abstract:

In order to develop a strategic plan that will guide their priorities and resource allocation for 2018-2021, North Carolina Sea Grant has implemented a multi-stage process designed to increase stakeholder engagement and to better assess and serve the coastal priorities of North Carolinians. This project explores strengths and potential areas for improvement within NC Sea Grant’s planning process with a specific focus on maximizing stakeholder engagement. By interviewing staff, observing focus groups, and creating a survey instrument for public distribution, we developed a set of recommendations highlighting the ways that NC Sea Grant can better facilitate inclusion of stakeholder, public, and staff input in its strategic planning process, such as holding some stakeholder events outside of typical business hours and discussing ways to incorporate diversity into the strategic plan.

Relevance: 20.00%

Abstract:

As announced in the November 2000 issue of MathStats&OR [1], one of the projects supported by the Maths, Stats & OR Network funds is an international survey of research into pedagogic issues in statistics and OR. I am taking the lead on this and report here on the progress that has been made during the first year. A paper giving some background to the project and describing initial thinking on how it might be implemented was presented at the 53rd session of the International Statistical Institute in Seoul, Korea, in August 2001 in a session on The future of statistics education research [2]. It sounded easy. I considered that I was something of an expert on surveys, having lectured on the topic for many years and having helped students and others who were doing surveys, particularly with the design of their questionnaires. Surely all I had to do was to draft a few questions, send them electronically to colleagues in statistical education who would be only too happy to respond, and summarise their responses? I should have learnt from my experience of advising all those students who thought that doing a survey was easy and to whom I had to explain that their ideas were too ambitious. There are several inter-related stages in survey research, and it is important to think about these before rushing into the collection of data. In the case of the survey in question, this planning stage revealed several challenges. Surveys are usually done for a purpose, so even before planning how to do them, it is advisable to think about the final product and the dissemination of results. This is the route I followed.

Relevance: 20.00%

Abstract:

There is now a broad scientific consensus that the global climate is changing in ways that are likely to have a profound impact on human society and the natural environment over the coming decades. The challenge for Facilities Managers is to ensure that business continuity plans acknowledge the potential for such events and have contingencies in place to ensure that their organisation can recover from an extreme weather event in a timely fashion. This paper will review current literature and theories pertinent to extreme weather events and business continuity planning; consider issues of risk; identify the key drivers that need to be considered by Facilities Managers in preparing contingency/disaster recovery plans; and identify gaps in knowledge (understanding and toolkits) that need to be addressed. The paper will also briefly outline a 3-year research project underway in the UK to address these issues.