91 results for operational semantics
Abstract:
Background Adherence to chronic therapy is a key determinant of patient health outcomes in chronic disease. However, only about 50% of patients adhere to chronic therapy. One of the challenges in promoting adherence is having an accurate understanding of adherence rates and of the factors that contribute to non-adherence. Many measures are available to assess patient medication adherence. Aim of the review This review aims to present the commonly used indirect methods available for measuring medication adherence in routine healthcare and research studies. Method A literature review on medication adherence measures in patient populations with chronic conditions taking chronic medications was conducted through Medline (2003-2013). A complementary manual search of references cited in the retrieved studies was performed to identify additional studies. Results Of the 238 initial Medline search results, 57 full texts were retrieved. Forty-seven articles were included as a result of the manual search. The adherence measures identified were: self-report (reported in 50 publications), electronic measures (33), pharmacy refills and claims data (26) and pill counts (25). Patient self-report, electronic measures, and pharmacy refill and claims data were the most commonly used measures of adherence in research, routine practice, and epidemiological and intervention studies. These methods, together with their strengths and limitations, are described in this paper. Conclusion A multitude of indirect measures of adherence exist in the literature; however, there is no gold standard for measuring adherence to medications. Triangulation of methods increases the validity and reliability of the adherence data collected. 
To strengthen the adherence data collected and allow for comparison of data, future research and practice interventions should use an internationally accepted, operational standardized definition of medication adherence and clearly describe the medication adherence methods used.
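As an illustration of one indirect adherence measure named above (pharmacy refill and claims data), here is a minimal sketch of the proportion-of-days-covered (PDC) calculation. The function name and refill-record format are assumptions for illustration, not taken from the review:

```python
from datetime import date, timedelta

def proportion_of_days_covered(fills, period_start, period_end):
    """Proportion of days covered (PDC) from pharmacy refill records.

    `fills` is a list of (dispense_date, days_supply) pairs; overlapping
    supplies are pushed forward, a common convention for PDC."""
    covered = set()
    for dispense_date, days_supply in sorted(fills):
        day = max(dispense_date, period_start)
        remaining = days_supply
        while remaining > 0 and day <= period_end:
            if day not in covered:   # push overlapping supply forward
                covered.add(day)
                remaining -= 1
            day += timedelta(days=1)
    total_days = (period_end - period_start).days + 1
    return len(covered) / total_days
```

A PDC of 0.8 or higher is often used as an adherence cut-off in refill-based studies.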
Abstract:
A traditional photonic-force microscope (PFM) produces huge data sets that require tedious numerical analysis. In this paper, we propose instead an analog signal processor to attain real-time capabilities while retaining the richness of traditional PFM data. Our system is devoted to intracellular measurements and is fully interactive through the use of a haptic joystick. Using our specialized analog hardware along with a dedicated algorithm, we can extract the full 3D stiffness matrix of the optical trap in real time, including the off-diagonal cross-terms. Our system is also capable of simultaneously recording data for subsequent offline analysis. This allows us to check that a good correlation exists between the classical analysis of stiffness and our real-time measurements. We monitor the PFM beads using an optical microscope. The force-feedback mechanism of the haptic joystick helps us interactively guide the bead inside living cells and collect information from its (possibly anisotropic) environment. The instantaneous stiffness measurements are also displayed in real time on a graphical user interface. The whole system has been built and is operational; here we present early results that confirm the consistency of the real-time measurements with offline computations.
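The real-time extraction described above is analog, but the offline check can be sketched numerically. Assuming equipartition holds for a bead in a harmonic trap, the full 3x3 stiffness matrix, cross-terms included, follows from the inverse of the position covariance matrix. This is a generic equipartition sketch, not the authors' algorithm:

```python
import numpy as np

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stiffness_matrix(positions, temperature=300.0):
    """Estimate the 3x3 trap stiffness matrix from sampled bead positions.

    By equipartition, the position covariance C of a bead in a harmonic
    trap satisfies K = kB * T * C^-1, so the off-diagonal cross-terms of K
    follow directly from position cross-correlations.
    `positions` is an (N, 3) array of bead coordinates in metres."""
    cov = np.cov(positions, rowvar=False)  # 3x3 position covariance
    return K_B * temperature * np.linalg.inv(cov)
```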
Abstract:
Harmonisation of analytical results is investigated in this study as an alternative to the restrictive approach of harmonising analytical methods, which is currently recommended to enable the exchange of information and thereby support the fight against illicit drug trafficking. Indeed, the main goal of this study is to demonstrate that a common database can be fed by a range of different analytical methods, whatever the differences in analytical parameters between them. For this purpose, a methodology was developed that makes it possible to estimate, and even optimise, the similarity of results coming from different analytical methods. In particular, the possibility of introducing chemical profiles obtained with Fast GC-FID into a GC-MS database is studied in this paper. Using this methodology, the similarity of results coming from different analytical methods can be objectively assessed, and the practical utility of database sharing between these methods can be evaluated, depending on the profiling purpose (evidential vs. operational tool). This methodology can be regarded as a relevant approach for feeding a database with results from different analytical methods, and it puts in doubt the necessity of analysing all illicit drug seizures in one single laboratory or of implementing analytical methods harmonisation in each participating laboratory.
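The abstract does not specify the similarity measure used, so as an assumed illustration, here is one common way to compare chemical profiles across instruments (e.g. Fast GC-FID vs. GC-MS): cosine similarity over the target compounds the profiles share:

```python
import math

def cosine_similarity(profile_a, profile_b):
    """Cosine similarity between two chemical profiles.

    Each profile is a dict mapping target compound -> relative peak area,
    so profiles produced by different analytical methods can be compared
    on a common set of compounds."""
    keys = set(profile_a) | set(profile_b)
    a = [profile_a.get(k, 0.0) for k in keys]
    b = [profile_b.get(k, 0.0) for k in keys]
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0
```

Because cosine similarity is scale-invariant, a constant response-factor difference between two instruments does not affect the score.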
Abstract:
1. As trees in a given cohort progress through ontogeny, many individuals die. This risk of mortality is unevenly distributed across species because of processes such as habitat filtering, interspecific competition and negative density dependence. Here, we predict and test the patterns that such ecological processes should inscribe on both species and phylogenetic diversity as plants recruit from saplings to the canopy. 2. We compared the species and phylogenetic diversity of sapling and tree communities at two sites in French Guiana. We surveyed 2084 adult trees in four 1-ha tree plots and 943 saplings in sixteen 16-m² subplots nested within the tree plots. Species diversity was measured using Fisher's alpha (species richness) and Simpson's index (species evenness). Phylogenetic diversity was measured using Faith's phylogenetic diversity (phylogenetic richness) and Rao's quadratic entropy index (phylogenetic evenness). The phylogenetic diversity indices were inferred using four phylogenetic hypotheses: two based on rbcLa plastid DNA sequences obtained from the inventoried individuals with different branch lengths, a global phylogeny available from the Angiosperm Phylogeny Group, and a combination of both. 3. Taxonomic identification of the saplings was performed by combining morphological and DNA barcoding techniques using three plant DNA barcodes (psbA-trnH, rpoC1 and rbcLa). DNA barcoding enabled us to improve species assignment and to assign unidentified saplings to molecular operational taxonomic units. 4. Species richness was similar between saplings and trees, but in about half of our comparisons, species evenness was higher in trees than in saplings. This suggests that negative density dependence plays an important role during the sapling-to-tree transition. 5. Phylogenetic richness increased between saplings and trees in about half of the comparisons. 
Phylogenetic evenness increased significantly between saplings and trees in a few cases (4 out of 16) and only with the most resolved phylogeny. These results suggest that negative density dependence operates largely independently of the phylogenetic structure of communities. 6. Synthesis. By contrasting species richness and evenness across size classes, we suggest that negative density dependence drives shifts in composition during the sapling-to-tree transition. In addition, we found little evidence for a change in phylogenetic diversity across age classes, suggesting that the observed patterns are not phylogenetically constrained.
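The two evenness measures named above are straightforward to compute from abundance data. A minimal sketch of the Gini-Simpson form of Simpson's index and of Rao's quadratic entropy, which reduces to Gini-Simpson when all pairwise distances equal 1 (the input format is an assumption for illustration):

```python
def gini_simpson(abundances):
    """Gini-Simpson index 1 - sum(p_i^2): the probability that two
    randomly drawn individuals belong to different species."""
    total = sum(abundances)
    return 1.0 - sum((n / total) ** 2 for n in abundances)

def rao_q(abundances, distances):
    """Rao's quadratic entropy: expected pairwise (e.g. phylogenetic)
    distance between two randomly drawn individuals.  With d_ij = 1 for
    all i != j it reduces to the Gini-Simpson index."""
    total = sum(abundances)
    p = [n / total for n in abundances]
    return sum(p[i] * p[j] * distances[i][j]
               for i in range(len(p)) for j in range(len(p)))
```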
Abstract:
BACKGROUND: Disease-management programs may enhance the quality of care provided to patients with chronic diseases, such as chronic obstructive pulmonary disease (COPD). The aim of this systematic review was to assess the effectiveness of COPD disease-management programs. METHODS: We conducted a computerized search of MEDLINE, EMBASE, CINAHL, PsycINFO, and the Cochrane Library (CENTRAL) for studies evaluating interventions meeting our operational definition of disease management: patient education, 2 or more different intervention components, 2 or more health care professionals actively involved in patients' care, and an intervention lasting 12 months or more. Programs conducted in hospital only and those targeting patients receiving palliative care were excluded. Two reviewers evaluated 12,749 titles and fully reviewed 139 articles; among these, data from 13 studies were included and extracted. Clinical outcomes considered were all-cause mortality, lung function, exercise capacity (walking distance), health-related quality of life, symptoms, COPD exacerbations, and health care use. A meta-analysis of exercise capacity and all-cause mortality was performed using random-effects models. RESULTS: The studies included were 9 randomized controlled trials, 1 controlled trial, and 3 uncontrolled before-after trials. Results indicate that the disease-management programs studied significantly improved exercise capacity (32.2 m, 95% confidence interval [CI], 4.1-60.3), decreased the risk of hospitalization, and moderately improved health-related quality of life. All-cause mortality did not differ between groups (pooled odds ratio 0.84, 95% CI, 0.54-1.40). CONCLUSION: COPD disease-management programs modestly improved exercise capacity, health-related quality of life, and hospital admissions, but not all-cause mortality. Future studies should explore the specific elements or characteristics of these programs that bring the greatest benefit.
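The random-effects pooling mentioned in the methods is commonly done with the DerSimonian-Laird estimator; the sketch below is a generic illustration of that estimator, not the authors' code:

```python
import math

def dersimonian_laird(log_ors, variances):
    """Pool per-study log odds ratios with a DerSimonian-Laird
    random-effects model; returns (pooled OR, 95% CI lower, upper)."""
    k = len(log_ors)
    w = [1.0 / v for v in variances]                 # fixed-effect weights
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, log_ors)) / sw
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, log_ors))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c) if c > 0 else 0.0  # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return math.exp(pooled), math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se)
```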
Abstract:
This paper presents a pilot project to reinforce participatory practices in standardization. The INTERNORM project, funded by the University of Lausanne, Switzerland, aims to create an interactive knowledge center based on the sharing of academic skills and the experience accumulated by civil society, especially consumer associations, environmental associations and trade unions, to strengthen the participatory process of standardization. The first objective of the project is action-oriented: INTERNORM provides a common knowledge pool supporting the participation of civil society actors in international standard-setting activities by bringing them together with academic experts in working groups and by providing logistic and financial support for their participation in meetings of national and international technical committees. The second objective of the project is analytical: the standardization action initiated through INTERNORM provides a research field for a better understanding of the participatory dynamics underpinning international standardization. The paper presents three incentives that explain civil society (non-)involvement in standardization and that go beyond conventional resource-based hypotheses: an operational incentive, related to the use of standards in the selective goods provided by associations to their membership; a thematic incentive, provided by the setting of priorities by strategic committees created in some standardization organizations; and a rhetorical incentive, related to the discursive resource that civil society concerns offer to the different stakeholders.
Abstract:
The effective dose delivered to the patient was determined, by modeling, for 257 types of examinations covering the different modalities of diagnostic and interventional radiology. The basic operational dosimetric quantities considered were obtained from the parameters of the examinations on the basis of dosimetric models. These models required a precise characterization of each examination. The operational dosimetric quantities were converted into doses to organs and effective doses using appropriate conversion factors. The determination of the collective effective dose to the Swiss population requires a number of corrections to account for the variability of several parameters: sensitivity of the detection system, age, gender, and build of the patient. The use of various dosimetric models is illustrated in this paper for a limited number of examination types covering the different radiological modalities, for which the established typical effective doses are given. With regard to individual doses, the study indicated that the average effective doses per type of examination can be classified into three levels: (a) the weakly irradiating examinations (less than 0.1 mSv), which represent 78% of the examinations and 4% of the collective dose, (b) the moderately irradiating examinations (between 0.1 mSv and 10 mSv), which represent 21% of the examinations and 72% of the collective dose, (c) the strongly irradiating examinations (more than 10 mSv), which represent 1% of the examinations and 24% of the collective dose.
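The three dose levels reported above, and the collective-dose aggregation over examination types (count times average dose), can be expressed as a trivial sketch; the function names and input format are illustrative assumptions:

```python
def dose_level(effective_dose_msv):
    """Classify an examination into the three levels used in the study."""
    if effective_dose_msv < 0.1:
        return "weakly irradiating"
    if effective_dose_msv <= 10:
        return "moderately irradiating"
    return "strongly irradiating"

def collective_dose(exams):
    """Collective effective dose in person-Sv from (number of
    examinations, average effective dose in mSv) pairs."""
    return sum(n * d for n, d in exams) / 1000.0
```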
Abstract:
The competitiveness of businesses is increasingly dependent on their electronic networks with customers, suppliers, and partners. While the strategic and operational impact of external integration and IOS adoption has been extensively studied, much less attention has been paid to the organizational and technical design of electronic relationships. The objective of our longitudinal research project is the development of a framework for understanding and explaining B2B integration. Drawing on existing literature and empirical cases, we present a reference model (a classification scheme for B2B integration). The reference model comprises technical, organizational, and institutional levels to reflect the multiple facets of B2B integration. In this paper we investigate the current state of electronic collaboration in global supply chains, focusing on the technical view. Using an in-depth case analysis, we identify five integration scenarios. In the subsequent confirmatory phase of the research, we analyse 112 real-world company cases to validate these five integration scenarios. Our research advances and deepens existing studies by developing a B2B reference model that reflects the current state of practice and is independent of specific implementation technologies. In the next stage of the research, the emerging reference model will be extended to create an assessment model for analysing the maturity level of a given company in a specific supply chain.
Abstract:
Achieving a high degree of dependability in complex macro-systems is challenging. Because of the large number of components and the numerous independent teams involved, an overview of global system performance is usually lacking to support both design and operation adequately. A functional failure mode, effects and criticality analysis (FMECA) approach is proposed to address the dependability optimisation of large and complex systems. The basic inductive FMECA model has been enriched to include considerations such as operational procedures, alarm systems, environmental and human factors, as well as operation in degraded mode. Its implementation on a commercial software tool allows active linking between the functional layers of the system and facilitates data processing and retrieval, which makes it possible to contribute actively to system optimisation. The proposed methodology has been applied to optimise dependability in a railway signalling system. Signalling systems are a typical example of large complex systems made of multiple hierarchical layers. The proposed approach appears appropriate for assessing the global risk and availability level of the system as well as for identifying its vulnerabilities. This enriched FMECA approach makes it possible to overcome some of the limitations and pitfalls previously reported with classical FMECA approaches.
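A common criticality metric in FMECA-style analyses is the risk priority number (RPN = severity x occurrence x detectability). The abstract does not state which criticality metric the authors used, so the ranking sketch below is a generic illustration with invented ratings:

```python
def rank_failure_modes(modes):
    """Rank failure modes by risk priority number (RPN = severity x
    occurrence x detectability), a common FMECA criticality metric.
    `modes` maps a failure-mode name to its (S, O, D) ratings."""
    rpn = {name: s * o * d for name, (s, o, d) in modes.items()}
    return sorted(rpn.items(), key=lambda item: item[1], reverse=True)
```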
Abstract:
Medicine counterfeiting is a crime that has increased in recent years and now involves the whole world. Health and economic repercussions have led pharmaceutical industries and agencies to develop many measures to protect genuine medicines and differentiate them from counterfeits. Detecting counterfeits is chemically relatively simple for specialists, but much more information can be gained from the analyses in a forensic intelligence perspective. Analytical data can feed criminal investigation and law enforcement by detecting and helping to understand the criminal phenomenon. Profiling seizures using chemical and packaging data constitutes a strong way to detect organised production and industrialised forms of criminality, and is the focus of this paper. Thirty-three seizures of a commonly counterfeited type of capsule were studied. The results of the packaging and chemical analyses were gathered within an organised database. Strong linkage was found between the seizures at the different production steps, indicating the presence of a main counterfeit network dominating the market. The interpretation of the links together with circumstantial data provided information about the production and distribution of counterfeits coming from this network. This forensic intelligence perspective has the potential to be generalised to other types of products. It may be the only reliable approach to further the understanding of the organised crime phenomenon behind counterfeiting and to enable efficient strategic and operational decision making in an attempt to dismantle counterfeit networks.
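The linkage step described above can be sketched generically: pairs of seizures whose profile similarity exceeds a threshold are linked, and chained links are grouped into candidate networks with a union-find. The threshold, data layout, and grouping rule are assumptions for illustration, not the paper's method:

```python
def link_seizures(similarities, threshold):
    """Group seizures into candidate production networks: any pair whose
    profile similarity reaches `threshold` is linked, and links are
    chained so each group is a connected component.
    `similarities` maps (seizure_a, seizure_b) -> similarity score."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:          # path halving
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for (a, b), score in similarities.items():
        root_a, root_b = find(a), find(b)
        if score >= threshold:
            parent[root_a] = root_b

    groups = {}
    for node in parent:
        groups.setdefault(find(node), set()).add(node)
    return list(groups.values())
```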
Abstract:
The choice to adopt risk-sensitive measurement approaches for operational risks: the case of the Advanced Measurement Approach under the Basel II New Capital Accord. This paper investigates the choice of operational risk approach under Basel II requirements and whether the adoption of advanced risk measurement approaches allows banks to save capital. Among the three possible approaches for operational risk measurement, the Advanced Measurement Approach (AMA) is the most sophisticated and requires the use of historical loss data, the application of statistical tools, and the engagement of highly qualified staff. Our results provide evidence that the adoption of AMA is contingent on the availability of bank resources and prior experience in risk-sensitive operational risk measurement practices. Moreover, banks that choose AMA exhibit low capital requirements and, as a result, might gain a competitive advantage over banks that opt for less sophisticated approaches. - Internal Risk Controls and their Impact on Bank Solvency. Recent cases in the financial sector have shown the importance of risk management controls for risk taking and firm performance. Despite advances in the design and implementation of risk management mechanisms, there is little research on their impact on the behavior and performance of firms. Based on data from a sample of 88 banks covering the period between 2004 and 2010, we provide evidence that internal risk controls affect the solvency of banks. In addition, our results show that the level of internal risk controls leads to a higher degree of solvency in banks with a major shareholder, in contrast to widely held banks. However, the relationship between internal risk controls and bank solvency is negatively affected by BHC growth strategies and external restrictions on bank activities, while higher regulatory requirements for bank capital positively moderate this relationship. 
- The Impact of the Sophistication of Risk Measurement Approaches under Basel II on Bank Holding Companies Value Previous research showed the importance of external regulation on banks' behavior. Some inefficient standards may accentuate risk-taking in banks and provoke a financial crisis. Despite the growing literature on the potential effects of Basel II rules, there is little empirical research on the efficiency of risk-sensitive capital measurement approaches and their impact on bank profitability and market valuation. Based on data from a sample of 66 banks covering the period between 2008 and 2010, we provide evidence that prudential ratios computed under Basel II standards predict the value of banks. However, this relation is contingent on the degree of sophistication of risk measurement approaches that banks apply. Capital ratios are effective in predicting bank market valuation when banks adopt the advanced approaches to compute the value of their risk-weighted assets.
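AMA implementations commonly rest on a loss distribution approach. As a hedged illustration (not taken from the thesis), here is a Monte-Carlo sketch with Poisson loss frequency and lognormal severity, taking a high quantile of the simulated annual aggregate loss as the capital charge; all parameters are invented:

```python
import math
import random

def _poisson(rng, lam):
    """Knuth's Poisson sampler (adequate for the small rates used here)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def ama_capital_charge(lam, mu, sigma, quantile=0.999, n_sims=20000, seed=1):
    """Loss-distribution sketch of an AMA-style operational risk capital
    charge: annual loss frequency ~ Poisson(lam), single-loss severity
    ~ LogNormal(mu, sigma); the charge is the `quantile` point of the
    simulated annual aggregate loss distribution."""
    rng = random.Random(seed)
    annual = []
    for _ in range(n_sims):
        n_events = _poisson(rng, lam)
        annual.append(sum(rng.lognormvariate(mu, sigma) for _ in range(n_events)))
    annual.sort()
    return annual[int(quantile * n_sims)]
```

The 99.9% quantile over a one-year horizon mirrors the soundness standard Basel II sets for AMA internal models.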
Abstract:
The distribution of plants along environmental gradients is constrained by abiotic and biotic factors. Cumulative evidence attests to the impact of biotic factors on plant distributions, but only a few studies discuss the role of belowground communities. Soil fungi, in particular, are thought to play an important role in how plant species assemble locally into communities. We first review existing evidence, and then test the effect of the number of soil fungal operational taxonomic units (OTUs) on plant species distributions using a recently collected dataset of plant and metagenomic information on soil fungi in the Western Swiss Alps. Using species distribution models (SDMs), we investigated whether the distribution of individual plant species is correlated with the number of OTUs of two important soil fungal classes known to interact with plants: the Glomeromycetes, which are obligate symbionts of plants, and the Agaricomycetes, which may be facultative plant symbionts, pathogens, or wood decayers. We show that including fungal richness information in models of plant species distributions improves predictive accuracy. The number of fungal OTUs is especially correlated with the distribution of high-elevation plant species. We suggest that high-elevation soils show greater variation in fungal assemblages, which may in turn affect plant turnover among communities. We finally discuss how to move beyond correlative analyses, through the design of field experiments manipulating plant and fungal communities along environmental gradients.
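As a toy, entirely invented illustration of the central claim (adding fungal OTU richness to an SDM improves predictive accuracy), compare a majority-class baseline with a one-split rule on richness; the data and the split rule are made up and stand in for a fitted SDM:

```python
def accuracy(predictions, labels):
    """Fraction of correct presence/absence predictions."""
    return sum(p == y for p, y in zip(predictions, labels)) / len(labels)

# Invented toy data: presence/absence of a high-elevation plant species and
# the number of soil fungal OTUs recorded in each plot.
otu_richness = [12, 15, 40, 45, 11, 50, 14, 48]
presence     = [0,  0,  1,  1,  0,  1,  0,  1]

# Baseline "model": always predict the majority class (absent).
baseline = [0] * len(presence)

# "SDM with fungal information": a one-split rule on OTU richness.
with_fungi = [1 if r > 30 else 0 for r in otu_richness]
```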
Abstract:
Due to the existence of free software and pedagogical guides, the use of data envelopment analysis (DEA) has become increasingly democratized in recent years. Nowadays, it is quite usual for practitioners and decision makers with little or no knowledge of operational research to run their own efficiency analyses themselves. Within DEA, several alternative models allow for an environment adjustment. Five alternative models, each of them easily accessible to and achievable by practitioners and decision makers, are applied to the empirical case of the 90 primary schools of the State of Geneva, Switzerland. As the State of Geneva practices an upstream positive-discrimination policy towards disadvantaged schools, this empirical case is particularly appropriate for an environment adjustment. The majority of the alternative DEA models deliver divergent results. This is a matter of concern for applied researchers and a matter of confusion for practitioners and decision makers. From a political standpoint, these diverging results could lead to different, potentially opposite, decisions depending on the model on which they are based.
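In the degenerate single-input, single-output case, the constant-returns-to-scale DEA score reduces to each unit's output/input ratio relative to the best observed ratio. The sketch below uses that special case purely for illustration; it is not one of the five environment-adjusted models compared in the paper:

```python
def ccr_efficiency(units):
    """Efficiency scores for the single-input, single-output case, where
    the CCR (constant returns to scale) DEA score reduces to each unit's
    output/input ratio divided by the best observed ratio.
    `units` maps a school name to (input, output)."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}
```

The general multi-input, multi-output case instead requires solving one linear program per unit, which is where the alternative environment-adjustment models diverge.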