893 results for Multiple methods framework
Abstract:
A new theoretical framework for tracer methods, based on mass conservation, is proposed in the present contribution. The model is applicable to both artificial and natural tracers. It can be used to calculate the spatial distribution patterns of the sediment transport rate, thus providing independent information and verification for results derived from empirical formulae. The calculation proceeds as follows: first, tracer concentration and topographic maps at two times are obtained. Then, the spatial and temporal changes in concentration and seabed elevation are calculated, and the required boundary conditions are determined from field observations (such as flow and bedform migration measurements). Finally, based upon eqs. (1) and (13), the transport rate is calculated and expressed as a function of position over the study area. Further, appropriate modifications may allow the model to handle tracers whose density and grain size distribution differ from those of the bulk sediment.
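For orientation, a minimal sketch of the kind of mass-conservation balance such a framework rests on (assuming a single active mixing layer and a tracer transported at the same rate as the bulk sediment; the paper's own eqs. (1) and (13) are not reproduced here) is

\[
\frac{\partial (C\,\delta)}{\partial t} + \nabla \cdot \left( C\,\mathbf{q}_s \right) = 0 ,
\]

where \(C\) is the tracer concentration in the mixing layer of thickness \(\delta\) and \(\mathbf{q}_s\) is the volumetric sediment transport rate per unit width. Given the observed changes in \(C\) and seabed elevation, together with boundary conditions from the field measurements, a relation of this form can be inverted for the spatial distribution of \(\mathbf{q}_s\).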
Abstract:
The work reported here lies in the area of overlap between artificial intelligence and software engineering. As research in artificial intelligence, it is a step towards a model of problem solving in the domain of programming. In particular, this work focuses on the routine aspects of programming which involve the application of previous experience with similar programs. I call this programming by inspection. Programming is viewed here as a kind of engineering activity. Analysis and synthesis by inspection are a prominent part of expert problem solving in many other engineering disciplines, such as electrical and mechanical engineering. The notion of inspection methods in programming developed in this work is motivated by similar notions in other areas of engineering. This work is also motivated by current practical concerns in the area of software engineering. The inadequacy of current programming technology is universally recognized. Part of the solution to this problem will be to increase the level of automation in programming. I believe that the next major step in the evolution of more automated programming will be interactive systems which provide a mixture of partially automated program analysis, synthesis and verification. One such system being developed at MIT, called the programmer's apprentice, is the immediate intended application of this work. This report concentrates on the knowledge base of the programmer's apprentice, which takes the form of a taxonomy of commonly used algorithms and data structures. To the extent that a programmer is able to construct and manipulate programs in terms of the forms in such a taxonomy, he may relieve himself of many details and generally raise the conceptual level of his interaction with the system, as compared with present-day programming environments. Also, since it is practical to expend a great deal of effort pre-analyzing the entries in a library, the difficulty of verifying the correctness of programs constructed this way is correspondingly reduced. The feasibility of this approach is demonstrated by the design of an initial library of common techniques for manipulating symbolic data. This document also reports on the further development of a formalism called the plan calculus for specifying computations in a programming-language-independent manner. This formalism combines both data and control abstraction in a uniform framework that has facilities for representing multiple points of view and side effects.
Abstract:
This study describes the relocation experiences of older people moving to supported housing in Scotland, focusing on the nature of support. A mixed-methods design was used. Phase one involved a Scottish cross-sectional survey of all people aged 65 and over moving into Coburg (Scotland) Housing Association supported accommodation during the first six months of 2008. A total of 122 respondents were included in the survey (59% response rate). People moved locally, at advanced ages and with moderate disability levels, to achieve more manageable housing and support, suggesting ‘assistance migration’. Expectations were high, with many seeing the move as a new start in life, and generally positive views of moving were reported. In Phase two, five in-depth multiple-perspective longitudinal case studies were conducted to explore the experience of relocation into supported housing. In each case, an older person, her primary carer and the housing manager (all women) were interviewed over six months following relocation. Analysis was undertaken using a thematic framework approach (Ritchie et al., 2003). Findings suggested that the older women acted with agency to adapt to their new lives, recreating ‘normality’ through organising space and routines. It is argued that returning to normality formed the overarching objective of the older women as they sought to feel ‘in place’. Responsibilities for meeting assistance needs were often implicit, contested and shifting, leading to fragile, uncertain and transitory arrangements. Drawing on recent advances in developmental psychology, it is argued that the ‘longings’ of older people, and of others, to achieve an optimal life can motivate actions such as relocation. Yet personal ‘longings’ can be prioritised differently and may result in disputes over goal setting and the ways needs are met. Further, utopian ideals must be reconciled with the reality of daily life. Policy makers and practitioners could adopt broader, dignity-based objectives to assist older people in identifying ways of aiding such reconciliation.
Abstract:
The advent of virtualization and cloud computing technologies necessitates the development of effective mechanisms for the estimation and reservation of resources needed by content providers to deliver large numbers of video-on-demand (VOD) streams through the cloud. Unfortunately, capacity planning for the QoS-constrained delivery of a large number of VOD streams is inherently difficult, as VBR encoding schemes exhibit significant bandwidth variability. In this paper, we present a novel resource management scheme to make such allocation decisions using a mixture of per-stream reservations and an aggregate reservation, shared across all streams to accommodate peak demands. The shared reservation provides capacity slack that enables statistical multiplexing of peak rates while assuring analytically bounded frame-drop probabilities, which can be adjusted by trading off buffer space (and consequently delay) against bandwidth. Our two-tiered bandwidth allocation scheme enables the delivery of any set of streams with less bandwidth (or, equivalently, with higher link utilization) than state-of-the-art deterministic smoothing approaches. The algorithm underlying our proposed framework uses three per-stream parameters and is linear in the number of servers, making it particularly well suited for use in an on-line setting. We present results from extensive trace-driven simulations, which confirm the efficiency of our scheme, especially for small buffer sizes and delay bounds, and which underscore the significant realizable bandwidth savings, typically yielding losses that are an order of magnitude or more below our analytically derived bounds.
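The paper's actual algorithm (three per-stream parameters, linear in the number of servers) is not reproduced in this abstract; the following is only a minimal Python sketch of the general two-tier idea it describes: each stream gets a dedicated reservation at a sustainable rate, and one shared slack pool, sized from the aggregate of per-interval excesses, absorbs peak excursions through statistical multiplexing. All function names, quantiles and traces below are illustrative assumptions.

```python
import numpy as np

def two_tier_allocation(stream_rates, per_stream_q=0.95, slack_q=0.999):
    """Illustrative two-tier reservation (not the paper's algorithm):
    a dedicated per-stream reservation plus one shared slack pool."""
    # Tier 1: dedicated reservation per stream at a sustainable (non-peak) rate.
    dedicated = [float(np.quantile(r, per_stream_q)) for r in stream_rates]
    # Per-interval excess above each stream's dedicated share, summed over streams.
    excess = sum(np.maximum(r - d, 0.0) for r, d in zip(stream_rates, dedicated))
    # Tier 2: shared slack sized on the aggregate excess; multiplexing keeps this
    # far below the sum of individual peak rates.
    shared = float(np.quantile(excess, slack_q))
    return dedicated, shared

# Usage with synthetic VBR-like traces (Mbit/s per interval).
rng = np.random.default_rng(0)
traces = [rng.gamma(shape=2.0, scale=1.5, size=10_000) for _ in range(3)]
dedicated, shared = two_tier_allocation(traces)
naive_peak = sum(t.max() for t in traces)  # per-stream peak-rate provisioning
print(f"two-tier total: {sum(dedicated) + shared:.2f}  vs  sum of peaks: {naive_peak:.2f}")
```

The multiplexing gain appears as the gap between the two printed totals; tightening the slack quantile trades bandwidth against the (here unmodelled) frame-drop probability.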
Abstract:
Creativity is often defined as developing something novel that fits its context and has value. To achieve this, the creative process itself has gained increasing attention as organizational leaders seek competitive advantages through developing new products, services, processes, or business models. In this paper, we explore the notion that the creative process includes a series of “filters”, or ways of processing information, as a critical component. We use the metaphor of coffee making and filters because many of our examples come from Vietnam, which is one of the world’s top coffee exporters and has created a coffee culture rivaling those of many other countries. We begin with a brief review of the creative process and its connection to information processing, propose a tentative framework for integrating the two ideas, and provide examples of how it might work. We close with implications for further practical and theoretical directions for this idea.
Abstract:
BACKGROUND: Sensor-augmented pump therapy (SAPT) integrates real-time continuous glucose monitoring (RT-CGM) with continuous subcutaneous insulin infusion (CSII) and offers an alternative to multiple daily injections (MDI). Previous studies provide evidence that SAPT may improve clinical outcomes among people with type 1 diabetes. Sensor-Augmented Pump Therapy for A1c Reduction (STAR) 3 is a multicenter randomized controlled trial comparing the efficacy of SAPT to that of MDI in subjects with type 1 diabetes. METHODS: Subjects were randomized to either continue with MDI or transition to SAPT for 1 year. Subjects in the MDI cohort were allowed to transition to SAPT for 6 months after completion of the study. SAPT subjects who completed the study were also allowed to continue for 6 months. The primary end point was the difference between treatment groups in the change in hemoglobin A1c (HbA1c) percentage from baseline to 1 year of treatment. Secondary end points included the percentage of subjects with HbA1c ≤7% and without severe hypoglycemia, as well as the area under the curve of time spent in normal glycemic ranges. Tertiary end points included the percentage of subjects with HbA1c ≤7%, key safety end points, user satisfaction, and responses on standardized assessments. RESULTS: A total of 495 subjects were enrolled, and baseline characteristics were similar between the SAPT and MDI groups. Study completion is anticipated in June 2010. CONCLUSIONS: Results of this randomized controlled trial should help establish whether an integrated RT-CGM and CSII system benefits patients with type 1 diabetes more than MDI.
Abstract:
Purpose: Environmental turbulence, including rapid changes in technology and markets, has resulted in the need for new approaches to performance measurement and benchmarking. There is a need for studies that attempt to measure and benchmark upstream, leading or developmental aspects of organizations. Therefore, the aim of this paper is twofold: first, to conduct an in-depth case analysis of lead performance measurement and benchmarking, leading to the further development of a conceptual model derived from the extant literature and initial survey data; and second, to outline future research agendas that could further develop the framework and the subject area.
Design/methodology/approach: A multiple case analysis involving repeated in-depth interviews with managers in organisational areas of upstream influence in the case organisations.
Findings: It was found that the effect of external drivers for lead performance measurement and benchmarking was mediated by organisational context factors such as the level of progression in business improvement methods. Moreover, although the business improvement methods used for this purpose were typical, their legitimation had been extended beyond their original purpose through the development of bespoke sets of lead measures.
Practical implications: Examples of methods and lead measures are given that can be used by organizations in developing a programme of lead performance measurement and benchmarking.
Originality/value: There is a paucity of in-depth studies relating to the theory and practice of lead performance measurement and benchmarking in organisations.
Abstract:
We describe a lightweight prototype framework (LIBERO) designed for experimentation with behavioural skeletons: components implementing a well-known parallelism exploitation pattern together with a rule-based autonomic manager taking care of some non-functional feature related to the pattern computation. LIBERO supports multiple autonomic managers within the same behavioural skeleton, each taking care of a different non-functional concern. We introduce LIBERO, built on plain Java and JBoss, and discuss how multiple managers may be coordinated to achieve a common goal using a two-phase coordination protocol developed in earlier work. We present experimental results that demonstrate how the prototype may be used to investigate autonomic management of multiple, independent concerns. © 2011 Springer-Verlag Berlin Heidelberg.
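LIBERO itself is built on plain Java and JBoss, and its actual API is not shown in the abstract; the sketch below is only a language-agnostic toy (written in Python for brevity) of the general idea of several autonomic managers, each owning one non-functional concern, agreeing on a single action through a propose-then-commit (two-phase) step. All class names, rules and the priority-based conflict resolution are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    manager: str
    action: str  # e.g. "add_worker", "remove_worker", "noop"

class AutonomicManager:
    """One manager per non-functional concern (e.g. performance, power).
    Names and rules are illustrative, not LIBERO's actual interface."""
    def __init__(self, name, rule):
        self.name, self.rule = name, rule

    def propose(self, metrics):
        # Phase 1: each manager independently proposes an adaptation action.
        return Proposal(self.name, self.rule(metrics))

    def commit(self, action):
        # Phase 2: every manager applies (or records) the agreed action.
        print(f"{self.name}: committing '{action}'")

def coordinate(managers, metrics):
    """Toy two-phase coordination: gather proposals, resolve conflicts with a
    fixed priority order, then have all managers commit the agreed action."""
    proposals = [m.propose(metrics) for m in managers]
    priority = {"remove_worker": 2, "add_worker": 1, "noop": 0}
    agreed = max(proposals, key=lambda p: priority[p.action]).action
    for m in managers:
        m.commit(agreed)
    return agreed

# Usage: a performance manager and a power manager observing the same metrics.
perf = AutonomicManager("performance", lambda m: "add_worker" if m["service_time"] > 1.0 else "noop")
power = AutonomicManager("power", lambda m: "remove_worker" if m["utilisation"] < 0.3 else "noop")
coordinate([perf, power], {"service_time": 1.4, "utilisation": 0.2})
```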
Integrating Multiple Point Statistics with Aerial Geophysical Data to assist Groundwater Flow Models
Abstract:
Accounting for heterogeneity has advanced significantly in statistical research, primarily within the framework of stochastic analysis and the development of multiple-point statistics (MPS). Among MPS techniques, the direct sampling (DS) method is tested to determine its ability to delineate heterogeneity from aerial magnetics data in a regional sandstone aquifer intruded by low-permeability volcanic dykes in Northern Ireland, UK. The use of two two-dimensional bivariate training images aids in creating spatial probability distributions of heterogeneities of hydrogeological interest, despite relatively ‘noisy’ magnetics data (i.e. data including hydrogeologically irrelevant urban noise and regional geologic effects). These distributions are incorporated into a hierarchy system in which previously published density function and upscaling methods are applied to derive regional distributions of the equivalent hydraulic conductivity tensor K. Several K models, determined from several stochastic realisations of MPS dyke locations, are computed within groundwater flow models and evaluated by comparing modelled heads with field observations. Results show a significant improvement in model calibration when compared to a simplistic homogeneous and isotropic aquifer model that does not account for the dyke occurrence evidenced by airborne magnetic data. The best model is obtained when normal and reverse polarity dykes are computed separately within MPS simulations and when a probability threshold of 0.7 is applied. The presented stochastic approach also provides an improvement when compared to a previously published deterministic anisotropic model based on the unprocessed (i.e. noisy) airborne magnetics. This demonstrates the potential of coupling MPS with airborne geophysical data for regional groundwater modelling.
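As an illustration only, the Python sketch below shows how a probability threshold like the 0.7 cut-off mentioned above could turn an MPS-derived dyke-probability map into a conductivity grid for a flow model. The conductivity values, grid and probability map are placeholders, not values from the study, and the real workflow (training images, DS simulation, density functions and tensor upscaling) is not reproduced.

```python
import numpy as np

# Placeholder conductivities (m/s); assumed values, not from the study.
K_SANDSTONE = 1e-4   # permeable host aquifer (assumed)
K_DYKE = 1e-9        # low-permeability volcanic dyke (assumed)

def k_field_from_probability(p_dyke, threshold=0.7):
    """Assign dyke conductivity to cells whose MPS-derived dyke probability
    meets the threshold; all other cells keep the host-rock value."""
    return np.where(p_dyke >= threshold, K_DYKE, K_SANDSTONE)

# Usage with a synthetic probability map standing in for averaged DS realisations.
rng = np.random.default_rng(1)
p_map = rng.random((100, 100))
k_field = k_field_from_probability(p_map)
print(f"dyke cells: {np.mean(k_field == K_DYKE):.1%} of the grid")
```

Each stochastic realisation of dyke locations would yield one such K field, which is then passed to the groundwater flow model and judged against observed heads.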
Abstract:
BACKGROUND: Healthcare integration is a priority in many countries, yet there remains little direction on how to systematically evaluate this construct to inform further development. The examination of community-based palliative care networks provides an ideal opportunity for the advancement of integration measures, in consideration of how fundamental provider cohesion is to effective care at end of life.
AIM: This article presents a variable-oriented analysis from a theory-based case study of a palliative care network to help bridge the knowledge gap in integration measurement.
DESIGN: Data from a mixed-methods case study were mapped to a conceptual framework for evaluating integrated palliative care and a visual array depicting the extent of key factors in the represented palliative care network was formulated.
SETTING/PARTICIPANTS: The study included data from 21 palliative care network administrators, 86 healthcare professionals, and 111 family caregivers, all from an established palliative care network in Ontario, Canada.
RESULTS: The framework used to guide this research proved useful in assessing qualities of integration and functioning in the palliative care network. The resulting visual array of elements illustrates that while this network performed relatively well at the multiple levels considered, room for improvement exists, particularly in terms of interventions that could facilitate the sharing of information.
CONCLUSION: This study, along with the other evaluative examples mentioned, represents an important initial attempt at empirically and comprehensively examining network-integrated palliative care and healthcare integration in general.
Abstract:
Master's dissertation, Water and Coastal Management, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2010