890 results for Context Model
Abstract:
Thesis (Ed.D.)--University of Washington, 2016-06
Abstract:
We study the exact solution for a two-mode model describing coherent coupling between atomic and molecular Bose-Einstein condensates (BEC), in the context of the Bethe ansatz. By combining an asymptotic and numerical analysis, we identify the scaling behaviour of the model and determine the zero-temperature expectation value for the coherence and average atomic occupation. The threshold coupling for production of the molecular BEC is identified as the point at which the energy gap is minimum. Our numerical results indicate a parity effect for the energy gap between ground and first excited state depending on whether the total atomic number is odd or even. The numerical calculations for the quantum dynamics reveal a smooth transition from the atomic to the molecular BEC.
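For orientation, a commonly studied two-mode atom-molecule Hamiltonian has the form sketched below; the detuning delta and coupling Omega are generic symbols, not taken from the abstract, so this is only an illustrative example of the class of model described:

    H = \delta\, b^{\dagger} b + \Omega \left( a^{\dagger} a^{\dagger} b + b^{\dagger} a a \right),
    \qquad N = a^{\dagger} a + 2\, b^{\dagger} b \quad \text{(conserved)},

where a annihilates an atom and b a molecule; conservation of the total atom number N is what makes an odd/even parity effect in the spectrum possible.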
Abstract:
Predictive testing is one of the new genetic technologies which, in conjunction with developing fields such as pharmacogenomics, promises many benefits for preventive and population health. Understanding how individuals appraise and make genetic test decisions is increasingly relevant as the technology expands. Lay understandings of genetic risk and test decision-making, located within holistic life frameworks including family or kin relationships, may vary considerably from clinical representations of these phenomena. The predictive test for Huntington's disease (HD), whilst specific to a single-gene, serious, mature-onset but currently untreatable disorder, is regarded as a model in this context. This paper reports upon a qualitative Australian study which investigated predictive test decision-making by individuals at risk for HD, the contexts of their decisions and the appraisals which underpinned them. In-depth interviews were conducted in Australia with 16 individuals at 50% risk for HD, with variation across testing decisions, gender, age and selected characteristics. Findings suggested predictive testing was regarded as a significant life decision with important implications for self and others, while the right not to know genetic status was staunchly and unanimously defended. Multiple contexts of reference were identified within which test decisions were located, including intra- and inter-personal frameworks, family history and experience of HD, and temporality. Participants used two main criteria in appraising test options: perceived value of, or need for, the test information, for self and/or significant others, and degree to which such information could be tolerated and managed, short and long-term, by self and/or others. Selected moral and ethical considerations involved in decision-making are examined, as well as the clinical and socio-political contexts in which predictive testing is located. The paper argues that psychosocial vulnerabilities generated by the availability of testing technologies and exacerbated by policy imperatives towards individual responsibility and self-governance should be addressed at broader societal levels. (C) 2003 Elsevier Science Ltd. All rights reserved.
Abstract:
This study examined the utility of a stress/coping model in explaining adaptation in two groups of people at-risk for Huntington's Disease (HD): those who have not approached genetic testing services (non-testees) and those who have engaged a testing service (testees). The aims were (1) to compare testees and non-testees on stress/coping variables, (2) to examine relations between adjustment and the stress/coping predictors in the two groups, and (3) to examine relations between the stress/coping variables and testees' satisfaction with their first counselling session. Participants were 44 testees and 40 non-testees who completed questionnaires which measured the stress/coping variables: adjustment (global distress, depression, health anxiety, social and dyadic adjustment), genetic testing concerns, testing context (HD contact, experience, knowledge), appraisal (control, threat, self-efficacy), coping strategies (avoidance, self-blame, wishful thinking, seeking support, problem solving), social support and locus of control. Testees also completed a genetic counselling session satisfaction scale. As expected, non-testees reported lower self-efficacy and control appraisals, higher threat and passive avoidant coping than testees. Overall, results supported the hypothesis that within each group poorer adjustment would be related to higher genetic testing concerns, contact with HD, threat appraisals, passive avoidant coping and external locus of control, and lower levels of positive experiences with HD, social support, internal locus of control, self-efficacy, control appraisals, problem solving, emotional approach and seeking social support coping. Session satisfaction scores were positively correlated with dyadic adjustment, problem solving and positive experience with HD, and inversely related to testing concerns, and threat and control appraisals. Findings support the utility of the stress/coping model in explaining adaptation in people who have decided not to seek genetic testing for HD and those who have decided to engage a genetic testing service.
Abstract:
Cued recall with an extralist cue poses a challenge for contemporary memory theory in that there is a need to explain how episodic and semantic information are combined. A parallel activation and intersection approach proposes one such means by assuming that an experimental cue will elicit its preexisting semantic network and a context cue will elicit a list memory. These 2 sources of information are then combined by focusing on information that is common to the 2 sources. Two key predictions of that approach are examined: (a) Combining semantic and episodic information can lead to item interactions and false memories, and (b) these effects are limited to memory tasks that involve an episodic context cue. Five experiments demonstrate such item interactions and false memories in cued recall but not in free association. Links are drawn between the use of context in this setting and in other settings.
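As a rough illustration of the parallel-activation-and-intersection idea (the items and association strengths below are invented, and this toy sketch is not the authors' model), an extralist cue's pre-existing semantic associates can be intersected with the list memory elicited by the context cue:

    # Toy sketch of parallel activation and intersection in extralist cued recall.
    # All items and association strengths are invented for illustration.
    semantic = {                                              # cue -> pre-existing associates
        "bark": {"dog": 0.6, "tree": 0.4, "loud": 0.2},
    }
    list_memory = {"dog": 1.0, "chair": 1.0, "loud": 1.0}     # items activated by the context cue

    def intersect(cue):
        """Keep only items active in both sources, weighting each by the
        product of its semantic and episodic activation."""
        combined = {item: s * list_memory[item]
                    for item, s in semantic[cue].items() if item in list_memory}
        total = sum(combined.values()) or 1.0
        return {item: w / total for item, w in combined.items()}

    print(intersect("bark"))   # {'dog': 0.75, 'loud': 0.25}

On this reading, the combination step only applies when an episodic context cue is present, which is consistent with the abstract's report that the effects appear in cued recall but not in free association.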
Abstract:
This paper focuses on measuring the extent to which market power has been exercised in a recently deregulated electricity generation sector. Our study emphasises the need to consider the concept of market power in a long-run dynamic context. A market power index is constructed focusing on differences between actual market returns and long-run competitive returns, estimated using a programming model devised by the authors. The market power implications of hedge contracts are briefly considered. The state of Queensland, Australia, is used as a context for the analysis. The results suggest that generators have exercised significant market power since deregulation.
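The abstract does not give the exact index construction; as a purely hypothetical illustration of an index built on the gap between actual market returns and an estimated long-run competitive benchmark, one might write:

    # Hypothetical illustration only: the paper derives its competitive benchmark
    # from a programming model of the generation sector, not reproduced here.
    def market_power_index(actual_return, competitive_return):
        """0 when actual returns equal the competitive benchmark; approaches 1
        as the mark-up over competitive returns grows."""
        return (actual_return - competitive_return) / actual_return

    print(market_power_index(actual_return=120.0, competitive_return=90.0))   # 0.25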
Abstract:
Social entrepreneurship is an emerging area of investigation within the entrepreneurship and not-for-profit marketing literatures. A review of the literature emerging from a number of domains reveals that it is fragmented and that there is no coherent theoretical framework. In particular, current conceptualizations of social entrepreneurship fail to adequately consider the unique characteristics of social entrepreneurs and the context within which they must operate. Using grounded theory method and drawing on nine in-depth case studies of social entrepreneurial not-for-profit organizations, this paper addresses this research gap and develops a bounded multidimensional model of social entrepreneurship. Implications for social entrepreneurship theory, management practice, and policy directions are discussed.
Abstract:
The Gauss-Marquardt-Levenberg (GML) method of computer-based parameter estimation, in common with other gradient-based approaches, suffers from the drawback that it may become trapped in local objective function minima, and thus report optimized parameter values that are not, in fact, optimized at all. This can seriously degrade its utility in the calibration of watershed models where local optima abound. Nevertheless, the method also has advantages, chief among these being its model-run efficiency, and its ability to report useful information on parameter sensitivities and covariances as a by-product of its use. It is also easily adapted to maintain this efficiency in the face of potential numerical problems (that adversely affect all parameter estimation methodologies) caused by parameter insensitivity and/or parameter correlation. This paper presents two algorithmic enhancements to the GML method that retain its strengths, but which overcome its weaknesses in the face of local optima. Using the first of these methods, an intelligent search for better parameter sets is conducted in parameter subspaces of decreasing dimensionality when progress of the parameter estimation process is slowed, either by numerical instability incurred through problem ill-posedness or by the encounter of a local objective function minimum. The second methodology minimizes the chance of successive GML parameter estimation runs finding the same objective function minimum by starting successive runs at points that are maximally removed from previous parameter trajectories. As well as enhancing the ability of a GML-based method to find the global objective function minimum, the latter technique can also be used to find the locations of many non-global optima (should they exist) in parameter space. This can provide a useful means of inquiring into the well-posedness of a parameter estimation problem, and of detecting the presence of bimodal parameter and predictive probability distributions. The new methodologies are demonstrated by calibrating a Hydrological Simulation Program-FORTRAN (HSPF) model against a time series of daily flows. Comparison with the SCE-UA method in this calibration context demonstrates a high level of comparative model-run efficiency for the new method. (c) 2006 Elsevier B.V. All rights reserved.
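The second enhancement lends itself to a compact sketch: choose each new starting point to be maximally removed from all points visited on previous parameter trajectories. The max-min-distance rule below is illustrative only and is not claimed to be the authors' implementation:

    import numpy as np

    def pick_restart(candidates, visited):
        """candidates: (n, d) array of feasible starting points;
        visited: (m, d) array of points on previous parameter trajectories.
        Returns the candidate whose nearest visited point is farthest away."""
        dists = np.linalg.norm(candidates[:, None, :] - visited[None, :, :], axis=2)
        nearest = dists.min(axis=1)             # distance to the closest visited point
        return candidates[np.argmax(nearest)]   # maximise that minimum distance

    rng = np.random.default_rng(0)
    visited = rng.uniform(0.0, 1.0, size=(50, 3))       # points from earlier estimation runs
    candidates = rng.uniform(0.0, 1.0, size=(200, 3))   # feasible new starting points
    print(pick_restart(candidates, visited))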
Abstract:
There is growing interest in the use of context-awareness as a technique for developing pervasive computing applications that are flexible, adaptable, and capable of acting autonomously on behalf of users. However, context-awareness introduces a variety of software engineering challenges. In this paper, we address these challenges by proposing a set of conceptual models designed to support the software engineering process, including context modelling techniques, a preference model for representing context-dependent requirements, and two programming models. We also present a software infrastructure and software engineering process that can be used in conjunction with our models. Finally, we discuss a case study that demonstrates the strengths of our models and software engineering approach with respect to a set of software quality metrics.
Abstract:
Context information is used by pervasive networking and context-aware programs to adapt intelligently to different environments and user tasks. As the context information is potentially sensitive, it is often necessary to provide privacy protection mechanisms for users. These mechanisms are intended to prevent breaches of user privacy through unauthorised context disclosure. To be effective, such mechanisms should not only support user specified context disclosure rules, but also the disclosure of context at different granularities. In this paper we describe a new obfuscation mechanism that can adjust the granularity of different types of context information to meet disclosure requirements stated by the owner of the context information. These requirements are specified using a preference model we developed previously and have since extended to provide granularity control. The obfuscation process is supported by our novel use of ontological descriptions that capture the granularity relationship between instances of an object type.
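As a toy sketch of granularity-based obfuscation (the location hierarchy, example values and disclosure rules below are invented; the paper captures granularity relationships with ontological descriptions rather than a hard-coded list):

    LOCATION_LEVELS = ["room", "building", "campus", "city"]          # fine -> coarse

    location = {"room": "78-204", "building": "78", "campus": "St Lucia", "city": "Brisbane"}
    disclosure_rules = {"colleague": "building", "stranger": "city"}  # owner's preferences

    def obfuscate(context, requester):
        """Release the context only at the granularity the owner allows for
        this requester, dropping all finer-grained detail."""
        allowed = disclosure_rules.get(requester, "city")
        idx = LOCATION_LEVELS.index(allowed)
        return {level: context[level] for level in LOCATION_LEVELS[idx:]}

    print(obfuscate(location, "colleague"))   # building, campus and city only
    print(obfuscate(location, "stranger"))    # city only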
Abstract:
Pervasive computing applications must be engineered to provide unprecedented levels of flexibility in order to reconfigure and adapt in response to changes in computing resources and user requirements. To meet these challenges, appropriate software engineering abstractions and infrastructure are required as a platform on which to build adaptive applications. In this paper, we demonstrate the use of a disciplined, model-based approach to engineer a context-aware Session Initiation Protocol (SIP) based communication application. This disciplined approach builds on our previously developed conceptual models and infrastructural components, which support the description, acquisition, management and exploitation of arbitrary types of context and user preference information, enabling adaptation to context changes.
Abstract:
Context-aware applications rely on implicit forms of input, such as sensor-derived data, in order to reduce the need for explicit input from users. They are especially relevant for mobile and pervasive computing environments, in which user attention is at a premium. To support the development of context-aware applications, techniques for modelling context information are required. These must address a unique combination of requirements, including the ability to model information supplied by both sensors and people, to represent imperfect information, and to capture context histories. As the field of context-aware computing is relatively new, mature solutions for context modelling do not exist, and researchers rely on information modelling solutions developed for other purposes. In our research, we have been using a variant of Object-Role Modeling (ORM) to model context. In this paper, we reflect on our experiences and outline some research challenges in this area.
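To give a rough flavour of the modelling requirements listed above (sensed versus user-supplied facts, imperfect information, and context histories), a context fact might carry fields like the following; the field names are invented for illustration, and ORM itself is a graphical fact-oriented notation rather than code:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class ContextFact:
        subject: str           # e.g. "alice"
        fact_type: str         # e.g. "located_at"
        value: str             # e.g. "building 78"
        source: str            # "sensor" or "person" (sensed vs supplied)
        confidence: float      # imperfect information, 0.0 .. 1.0
        observed_at: datetime  # retained facts form a context history

    history = [
        ContextFact("alice", "located_at", "building 78", "sensor", 0.8,
                    datetime(2006, 5, 1, 9, 0)),
        ContextFact("alice", "activity", "in a meeting", "person", 1.0,
                    datetime(2006, 5, 1, 9, 5)),
    ]

    # Most recent location fact for alice, if any:
    latest = max((f for f in history if f.subject == "alice" and f.fact_type == "located_at"),
                 key=lambda f: f.observed_at, default=None)
    print(latest)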
Abstract:
The immaturity of the field of context-aware computing means that little is known about how to incorporate appropriate personalisation mechanisms into context-aware applications. One of the main challenges is how to elicit and represent complex, context-dependent requirements, and then use the resulting representations within context-aware applications to support decision-making processes. In this paper, we characterise several approaches to personalisation of context-aware applications and introduce our research on personalisation using a novel preference model.
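One way to picture a preference model for context-dependent requirements is as rules that pair a condition on the current context with a rating for a candidate behaviour; the rule format and ratings below are invented for illustration and are not the paper's model:

    # Toy sketch: context-dependent preferences as (condition, behaviour, rating) rules.
    preferences = [
        (lambda ctx: ctx.get("activity") == "in a meeting", "divert_to_voicemail",  0.9),
        (lambda ctx: ctx.get("activity") == "in a meeting", "ring_loudly",         -0.8),
        (lambda ctx: ctx.get("location") == "home",          "ring_loudly",          0.5),
    ]

    def rank(candidates, ctx):
        """Score each candidate behaviour by summing the ratings of all rules
        whose condition holds in the current context."""
        scores = {c: 0.0 for c in candidates}
        for condition, behaviour, rating in preferences:
            if behaviour in scores and condition(ctx):
                scores[behaviour] += rating
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

    print(rank(["divert_to_voicemail", "ring_loudly"], {"activity": "in a meeting"}))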
Abstract:
Pervasive computing applications must be sufficiently autonomous to adapt their behaviour to changes in computing resources and user requirements. This capability is known as context-awareness. In some cases, context-aware applications must be implemented as autonomic systems which are capable of dynamically discovering and replacing context sources (sensors) at run-time. Unlike other types of application autonomy, this kind of dynamic reconfiguration has not been sufficiently investigated yet by the research community. However, application-level context models are becoming common, in order to ease programming of context-aware applications and support evolution by decoupling applications from context sources. We can leverage these context models to develop general (i.e., application-independent) solutions for dynamic, run-time discovery of context sources (i.e., context management). This paper presents a model and architecture for a reconfigurable context management system that supports interoperability by building on emerging standards for sensor description and classification.
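A minimal sketch of decoupling applications from concrete context sources might look like the registry below; the class, descriptions and matching rule are invented for illustration, whereas the paper's architecture builds on emerging standards for sensor description and classification:

    class ContextSourceRegistry:
        """Tracks currently available context sources so that applications
        bind to a context type, never to a specific sensor."""
        def __init__(self):
            self._sources = []                      # list of (description, read_fn)

        def register(self, description, read_fn):
            self._sources.append((description, read_fn))

        def deregister(self, read_fn):
            self._sources = [(d, f) for d, f in self._sources if f is not read_fn]

        def resolve(self, required_type):
            """Return any available source providing the required context type."""
            for description, read_fn in self._sources:
                if description.get("provides") == required_type:
                    return read_fn
            return None

    registry = ContextSourceRegistry()
    registry.register({"provides": "location", "sensor": "wifi"}, lambda: "building 78")
    source = registry.resolve("location")
    print(source() if source else "no location source available")
    # If the wifi sensor later disappears, deregister it and resolve() will fall
    # back to whichever location source is registered at that time.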
Abstract:
In this paper, we present a framework for pattern-based model evolution approaches in the MDA context. In the framework, users define patterns using a pattern modeling language designed to describe software design patterns, and can then use these patterns as rules to evolve their models. Design model evolution takes place in two steps. The first step is a binding process of selecting a pattern and defining where and how to apply it in the model. The second step is an automatic model transformation that actually evolves the model according to the binding information and the pattern rule. The pattern modeling language is defined in terms of a MOF-based role metamodel, implemented using an existing modeling framework (EMF), and incorporated as a plugin to the Eclipse modeling environment. The model evolution process is also implemented as an Eclipse plugin. Together, these two plugins provide an integrated framework in which patterns can be defined and validated, and models evolved using patterns, within a single modeling environment.