994 results for CONCEPTUAL DESCRIPTION
Abstract:
This concluding essay discusses some crucial methodological issues raised by other papers in this issue. It also suggests directions for further conceptual development concerning qualitative research on chronic illness.
Abstract:
The performance of exchange and correlation (xc) functionals of the generalized gradient approximation (GGA) type and of the meta-GGA type in the calculation of chemical reactions is related to topological features of the electron density, which, in turn, are connected to the orbital structure of chemical bonds within Kohn-Sham (KS) theory. Seventeen GGA and meta-GGA xc functionals are assessed for 15 hydrogen abstraction reactions and 3 symmetrical $S_\mathrm{N}2$ reactions. Systems that are problematic for standard GGAs characteristically have enhanced values of the dimensionless gradient argument $s_\sigma^2$, with local maxima in the bonding region. The origin of this topological feature is the occupation of valence KS orbitals with an antibonding or essentially nonbonding character. The local enhancement of $s_\sigma^2$ yields exchange-correlation energies that are too negative with standard GGAs for the transition state of the $S_\mathrm{N}2$ reaction, which leads to reduced calculated reaction barriers. The unwarranted localization of the effective xc hole of the standard GGAs, i.e., the nondynamical correlation that is built into them but is spurious in this case, exerts its effect through their $s_\sigma^2$ dependence. Barriers are improved for xc functionals with the exchange functional OPTX as the exchange component, which has a modified dependence on $s_\sigma^2$. Standard GGAs also underestimate the barriers for the hydrogen abstraction reactions. In this case the barriers are improved by correlation functionals, such as the Laplacian-dependent LAP3 functional, which has a modified dependence on the Coulomb correlation of opposite- and like-spin electrons. The best overall performance is established for OLAP3, the combination of OPTX and LAP3.
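For orientation, the quantities at play can be written in the standard spin-resolved GGA form (a textbook convention assumed here, not spelled out in the abstract):

\[
E_x^{\mathrm{GGA}} = -\tfrac{3}{2}\left(\tfrac{3}{4\pi}\right)^{1/3} \sum_\sigma \int \rho_\sigma^{4/3}\, F_x\!\left(s_\sigma^2\right)\, \mathrm{d}\mathbf{r},
\qquad
s_\sigma = \frac{|\nabla \rho_\sigma|}{2\,(6\pi^2)^{1/3}\,\rho_\sigma^{4/3}},
\]

where $F_x$ is the enhancement factor of the functional ($F_x \equiv 1$ recovers LDA exchange). In this form, a local enhancement of $s_\sigma^2$ in the bonding region feeds directly into $F_x$ and deepens the (negative) exchange energy, which is the mechanism the abstract describes.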
Abstract:
Improvement in the quality of end-of-life (EOL) care is a priority health care issue, since serious deficiencies in quality of care have been reported across care settings. Increasing pressure is now focused on Canadian health care organizations to be accountable for the quality of palliative and EOL care delivered. Numerous domains of quality EOL care upon which to build accountability frameworks have now been published, some derived from the patient/family perspective. There is a need to reach common ground on the domains of quality EOL care valued by patients and families in order to develop consistent performance measures and set priorities for health care improvement. This paper describes a meta-synthesis study to develop a common conceptual framework of quality EOL care integrating attributes of quality valued by patients and their families. © 2005 Centre for Bioethics, IRCM.
Abstract:
The operation of supply chains (SCs) has for many years been focused on efficiency, leanness and responsiveness. This has resulted in reduced slack in operations, compressed cycle times, increased productivity and minimised inventory levels along the SC. Combined with tight tolerance settings for the realisation of logistics and production processes, this has led to SC performances that are frequently not robust. SCs are becoming increasingly vulnerable to disturbances, which can decrease the competitive power of the entire chain in the market. Moreover, in the case of food SCs non-robust performances may ultimately result in empty shelves in grocery stores and supermarkets.
The overall objective of this research is to contribute to Supply Chain Management (SCM) theory by developing a structured approach to assess SC vulnerability, so that robust performances of food SCs can be assured. We also aim to help companies in the food industry to evaluate their current state of vulnerability, and to improve their performance robustness through a better understanding of vulnerability issues. The following research questions (RQs) stem from these objectives:
RQ1: What are the main research challenges related to (food) SC robustness?
RQ2: What are the main elements that have to be considered in the design of robust SCs and what are the relationships between these elements?
RQ3: What is the relationship between the contextual factors of food SCs and the use of disturbance management principles?
RQ4: How to systematically assess the impact of disturbances in (food) SC processes on the robustness of (food) SC performances?
To answer these RQs we used different methodologies, both qualitative and quantitative. For each question, we conducted a literature survey to identify gaps in existing research and define the state of the art of knowledge on the related topics. For the second and third RQs, we conducted both exploration and testing on selected case studies. Finally, to obtain more detailed answers to the fourth question, we used simulation modelling and scenario analysis for vulnerability assessment.
Main findings are summarised as follows.
Based on an extensive literature review, we answered RQ1. The main research challenges were related to the need to define SC robustness more precisely, to identify and classify disturbances and their causes in the context of the specific characteristics of SCs, and to provide a systematic overview of (re)design strategies that may improve SC robustness. We also found that it is useful to be able to discriminate between varying degrees of SC vulnerability and to find a measure that quantifies the extent to which a company or SC shows robust performances when exposed to disturbances.
To address RQ2, we define SC robustness as the degree to which a SC shows an acceptable performance in (each of) its Key Performance Indicators (KPIs) during and after an unexpected event that caused a disturbance in one or more logistics processes. Based on the SCM literature we identified the main elements needed to achieve robust performances and structured them together to form a conceptual framework for the design of robust SCs. We then explained the logic of the framework and elaborated on each of its main elements: the SC scenario, SC disturbances, SC performance, sources of food SC vulnerability, and redesign principles and strategies.
Based on three case studies, we answered RQ3. Our major findings show that the contextual factors have a consistent relationship to Disturbance Management Principles (DMPs). The product and SC environment characteristics are contextual factors that are hard to change and these characteristics initiate the use of specific DMPs as well as constrain the use of potential response actions. The process and the SC network characteristics are contextual factors that are easier to change, and they are affected by the use of the DMPs. We also found a notable relationship between the type of DMP likely to be used and the particular combination of contextual factors present in the observed SC.
To address RQ4, we presented a new method for vulnerability assessments, the VULA method. The VULA method helps to identify how much a company is underperforming on a specific KPI in the case of a disturbance, how often this would happen and how long it would last. It ultimately informs the decision maker about whether process redesign is needed and what kind of redesign strategies should be used in order to increase the SC’s robustness. The VULA method is demonstrated in the context of a meat SC using discrete-event simulation. The case findings show that performance robustness can be assessed for any KPI using the VULA method.
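To make the idea concrete, the following is a minimal sketch of how the three VULA questions (how much, how often, how long) could be computed for one KPI from a simulated time series. It is not the thesis's implementation; the function, threshold, and sample data are illustrative assumptions.

```python
# Minimal sketch of a VULA-style robustness assessment for one KPI.
# All names, thresholds, and data are illustrative, not the thesis's code.

def assess_kpi_robustness(kpi_series, acceptable_min):
    """Scan a simulated KPI time series and summarise underperformance
    episodes: how deep (magnitude), how often (frequency), how long (duration)."""
    episodes = []            # (start_index, length, worst_shortfall) per episode
    start, worst = None, 0.0
    for t, value in enumerate(kpi_series):
        shortfall = acceptable_min - value
        if shortfall > 0:                      # KPI below the acceptable level
            if start is None:
                start, worst = t, shortfall    # a new episode begins
            else:
                worst = max(worst, shortfall)
        elif start is not None:                # the episode just ended
            episodes.append((start, t - start, worst))
            start, worst = None, 0.0
    if start is not None:                      # series ended mid-episode
        episodes.append((start, len(kpi_series) - start, worst))
    return {
        "frequency": len(episodes),
        "mean_duration": sum(e[1] for e in episodes) / len(episodes) if episodes else 0.0,
        "worst_shortfall": max((e[2] for e in episodes), default=0.0),
    }

# Example: daily service level (%) from a discrete-event simulation run,
# with 95% taken as the acceptable minimum for this KPI.
service_level = [97, 96, 93, 90, 96, 97, 94, 92, 95, 98]
print(assess_kpi_robustness(service_level, acceptable_min=95))
```

In the thesis, such statistics are derived from discrete-event simulation output for each KPI and disturbance scenario; the sketch above only mirrors the structure of that assessment.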
To sum up the project, all findings were incorporated within an integrated framework for designing robust SCs. The integrated framework consists of the following steps: 1) Description of the SC scenario and identification of its specific contextual factors; 2) Identification of disturbances that may affect KPIs; 3) Definition of the relevant KPIs and identification of the main disturbances through assessment of the SC performance robustness (i.e. application of the VULA method); 4) Identification of the sources of vulnerability that may (strongly) affect the robustness of performances and eventually increase the vulnerability of the SC; 5) Identification of appropriate preventive or disturbance impact reductive redesign strategies; 6) Alteration of SC scenario elements as required by the selected redesign strategies, followed by repetition of the VULA method for the KPIs defined in Step 3.
Contributions of this research are listed as follows. First, we have identified emerging research areas: SC robustness and its counterpart, vulnerability. Second, we have developed a definition of SC robustness, operationalized it, and identified and structured the relevant elements for the design of robust SCs in the form of a research framework. With this research framework, we contribute to a better understanding of the concepts of vulnerability and robustness and related issues in food SCs. Third, we identified the relationship between contextual factors of food SCs and specific DMPs used to maintain robust SC performances: characteristics of the product and the SC environment influence the selection and use of DMPs; processes and SC networks are influenced by DMPs. Fourth, we developed specific metrics for vulnerability assessments, which serve as a basis of a VULA method. The VULA method investigates different measures of the variability of both the duration of impacts from disturbances and the fluctuations in their magnitude.
With this project, we also hope to have delivered practical insights into food SC vulnerability. First, the integrated framework for the design of robust SCs can be used to guide food companies in successful disturbance management. Second, empirical findings from case studies lead to the identification of changeable characteristics of SCs that can serve as a basis for assessing where to focus efforts to manage disturbances. Third, the VULA method can help top management to get more reliable information about the “health” of the company.
The two most important research opportunities are: First, there is a need to extend and validate our findings related to the research framework and contextual factors through further case studies related to other types of (food) products and other types of SCs. Second, there is a need to further develop and test the VULA method, for example by using other indicators and statistical measures for disturbance detection and SC improvement, and by defining the most appropriate KPI to represent the robustness of a complete SC. We hope this thesis invites other researchers to pick up these challenges and help us further improve the robustness of (food) SCs.
Abstract:
According to the World Health Organization, the patient and family should be viewed as the "unit of care" when palliative care is required. Therefore family caregivers should receive optimal supportive care from health professionals. However, the impact of supporting a dying relative is frequently described as having negative physical and psychosocial sequelae. Furthermore, family caregivers consistently report unmet needs, and few rigorously tested supportive interventions have been published. In addition, comprehensive conceptual frameworks to navigate the family caregiver experience and guide intervention development are lacking. This article draws on Lazarus and Folkman's seminal work on the transactional stress and coping framework to present a conceptual model specific to family caregivers of patients receiving palliative care. A comprehensive account of key variables to aid understanding of the family caregiver experience and intervention design is provided.
Abstract:
Prior research has argued that use of optional properties in conceptual models results in loss of information about the semantics of the domains represented by the models. Empirical research undertaken to date supports this argument. Nevertheless, no systematic analysis has been done of whether use of optional properties is always problematic. Furthermore, prior empirical research might have deliberately or unwittingly employed models where use of optionality always causes problems. Accordingly, we examine analytically whether use of optional properties is always problematic. We employ our analytical results to inform the design of an experiment where we systematically examined the impact of optionality on users’ ability to understand domains represented by different types of conceptual models. We found evidence that use of optionality undermines users’ ability to understand the domain represented by a model but that this effect weakens when use of mandatory properties to replace optional properties leads to more-complex models.
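As a brief illustration of the trade-off the study examines (the domain and names here are hypothetical, not taken from the paper's experimental materials), an optional property can be replaced by mandatory properties on subtypes, which makes the semantics explicit but enlarges the model:

```python
from dataclasses import dataclass
from typing import Optional

# Variant 1: one entity with an optional property. Compact, but a reader
# cannot tell which employees have a commission rate, or why.
@dataclass
class Employee:
    name: str
    commission_rate: Optional[float] = None  # only meaningful for some employees

# Variant 2: optionality replaced by a mandatory property on a subtype.
# The semantics are now explicit (only salespeople carry a commission rate),
# at the cost of a larger, more complex model.
@dataclass
class Staff:
    name: str

@dataclass
class Salesperson(Staff):
    commission_rate: float  # mandatory for this subtype
```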
Abstract:
Bayesian probabilistic analysis offers a new approach to characterize semantic representations by inferring the most likely feature structure directly from the patterns of brain activity. In this study, infinite latent feature models (ILFMs) [1] are used to recover the semantic features that give rise to the brain activation vectors when people think about properties associated with 60 concrete concepts. The semantic features recovered by the ILFM are consistent with the human ratings of the shelter, manipulation, and eating factors that were recovered by a previous factor analysis. Furthermore, different areas of the brain encode different perceptual and conceptual features. This neurally inspired semantic representation is consistent with some existing conjectures regarding the role of different brain areas in processing different semantic and perceptual properties. © 2012 Springer-Verlag.
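For context, the linear-Gaussian infinite latent feature model of [1] is conventionally written as follows (a standard formulation; the abstract itself does not spell out the model):

\[
\mathbf{Y} = \mathbf{Z}\mathbf{A} + \mathbf{E}, \qquad
\mathbf{Z} \sim \mathrm{IBP}(\alpha), \qquad
A_{kj} \sim \mathcal{N}(0, \sigma_A^2), \qquad
E_{ij} \sim \mathcal{N}(0, \sigma_n^2),
\]

where $\mathbf{Y}$ holds the activation vectors (one row per concept), $\mathbf{Z}$ is a binary matrix indicating which latent semantic features each concept possesses, drawn from the Indian Buffet Process so the number of features need not be fixed in advance, and $\mathbf{A}$ maps features to activation patterns.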
Abstract:
The cost-effectiveness of novel interventions in the treatment of cancer is well researched; however, relatively little attention is paid to the cost of many aspects of routine care. Oesophageal cancer is the ninth most common cancer in the UK and the sixth most common cause of cancer death. It usually presents late and has a poor prognosis. The hospital costs incurred by oesophageal cancer patients diagnosed in Northern Ireland in 2005 (n = 198) were determined by review of medical records. The average cost of hospital care per patient in the 12 months from presentation was £7847. Variations in total hospital costs by age at diagnosis, gender, cancer stage, histological type, mortality at 1 year, co-morbidity count and socio-economic status were analysed using multiple regression analyses. Higher costs were associated with earlier stages of cancer, and cancer stage remained a significant predictor of costs after controlling for cancer type, patient age and mortality at 1 year. Thus, although early detection of cancer usually improves survival, it would mean increased costs in the first year. Deprivation achieved borderline significance, with those from more deprived areas having lower resource consumption relative to the more affluent. © 2013 John Wiley & Sons Ltd.
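A sketch of the kind of multiple regression described (the data and column names below are synthetic stand-ins, not the study's dataset or specification):

```python
# Hypothetical illustration of regressing total hospital cost on patient
# factors, as in the abstract. Data are synthetic; only the structure matters.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "total_cost":    [9200, 7100, 6400, 8800, 5900, 7600, 8100, 6900],
    "age":           [61, 72, 68, 55, 79, 66, 70, 74],
    "stage":         ["I", "II", "III", "I", "IV", "II", "III", "IV"],
    "died_1yr":      [0, 0, 1, 0, 1, 0, 0, 1],
    "comorbidities": [1, 2, 0, 1, 3, 2, 1, 2],
})

# Cancer stage enters as a categorical predictor, controlling for age,
# 1-year mortality and co-morbidity count.
model = smf.ols("total_cost ~ C(stage) + age + died_1yr + comorbidities",
                data=df).fit()
print(model.params)  # stage coefficients capture cost variation by stage
```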
Abstract:
A core activity in information systems development involves understanding the conceptual model of the domain that the information system supports. Any conceptual model is ultimately created using a conceptual-modeling (CM) grammar. Accordingly, just as high-quality conceptual models facilitate high-quality systems development, high-quality CM grammars facilitate high-quality conceptual modeling. This paper seeks to provide a new perspective on improving the quality of CM grammar semantics. For the past twenty years, the leading approach to this topic has drawn on ontological theory. However, the ontological approach captures just half of the story: it needs to be coupled with a logical approach. We show how ontological quality and logical quality interrelate, and we outline three contributions of a logical approach: the ability to see familiar conceptual-modeling problems in simpler ways, the illumination of new problems, and the ability to prove the benefit of modifying CM grammars.
Abstract:
A reduced-density-operator description is developed for coherent optical phenomena in many-electron atomic systems, utilizing a Liouville-space, multiple-mode Floquet–Fourier representation. The Liouville-space formulation provides a natural generalization of the ordinary Hilbert-space (Hamiltonian) R-matrix-Floquet method, which has been developed for multi-photon transitions and laser-assisted electron–atom collision processes. In these applications, the R-matrix-Floquet method has been demonstrated to be capable of providing an accurate representation of the complex, multi-level structure of many-electron atomic systems in bound, continuum, and autoionizing states. The ordinary Hilbert-space (Hamiltonian) formulation of the R-matrix-Floquet method has been implemented in highly developed computer programs, which can provide a non-perturbative treatment of the interaction of a classical, multiple-mode electromagnetic field with a quantum system. This quantum system may correspond to a many-electron, bound atomic system and a single continuum electron. However, including pseudo-states in the expansion of the many-electron atomic wave function can provide a representation of multiple continuum electrons. The 'dressed' many-electron atomic states thereby obtained can be used in a realistic non-perturbative evaluation of the transition probabilities for an extensive class of atomic collision and radiation processes in the presence of intense electromagnetic fields. In order to incorporate environmental relaxation and decoherence phenomena, we propose to utilize the ordinary Hilbert-space (Hamiltonian) R-matrix-Floquet method as a starting-point for a Liouville-space (reduced-density-operator) formulation. To illustrate how the Liouville-space R-matrix-Floquet formulation can be implemented for coherent atomic radiative processes, we discuss applications to electromagnetically induced transparency, as well as to related pump–probe optical phenomena, and also to the unified description of radiative and dielectronic recombination in electron–ion beam interactions and high-temperature plasmas.
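Schematically (a textbook form of the equation of motion, not the paper's full Liouville-space Floquet–Fourier derivation), the reduced-density-operator description propagates

\[
\frac{\partial \hat{\rho}}{\partial t} = -\frac{i}{\hbar}\left[\hat{H}(t), \hat{\rho}\right] + \hat{\hat{R}}\,\hat{\rho},
\]

where $\hat{\rho}$ is the reduced density operator of the atomic system, $\hat{H}(t)$ includes the multiple-mode electromagnetic-field interaction treated non-perturbatively, and the relaxation superoperator $\hat{\hat{R}}$ incorporates environmental relaxation and decoherence. Setting $\hat{\hat{R}} = 0$ recovers the ordinary Hilbert-space (Hamiltonian) dynamics underlying the R-matrix-Floquet method.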