936 results for Interdisciplinary approach to knowledge


Relevance:

100.00%

Publisher:

Abstract:

Background: 29 autoimmune diseases, including Rheumatoid Arthritis, gout, Crohn's Disease, and Systemic Lupus Erythematosus, affect 7.6-9.4% of the population. While effective therapy is available, many patients do not follow treatment or use medications as directed. Digital health and Web 2.0 interventions have demonstrated much promise in increasing medication and treatment adherence, but to date many Internet tools have proven disappointing. In fact, most digital interventions continue to suffer from high attrition in patient populations, are burdensome for healthcare professionals, and have relatively short life spans.

Objective: Digital health tools have traditionally centered on the transformation of existing interventions (such as diaries, trackers, stage-based or cognitive behavioral therapy programs, coupons, or symptom checklists) into electronic format. Advanced digital interventions have also incorporated attributes of Web 2.0 such as social networking, text messaging, and the use of video. Despite these efforts, there has been little measurable impact on non-adherence for illnesses that require medical interventions, and research must look to other strategies or development methodologies. As a first step in investigating the feasibility of developing such a tool, the objective of the current study is to systematically rate factors of non-adherence that have been reported in past research studies.

Methods: Grounded Theory, recognized as a rigorous method that facilitates the emergence of new themes through systematic analysis, data collection, and coding, was used to analyze quantitative, qualitative, and mixed-method studies addressing the following autoimmune diseases: Rheumatoid Arthritis, gout, Crohn's Disease, Systemic Lupus Erythematosus, and inflammatory bowel disease. Studies were included only if they contained primary data addressing the relationship with non-adherence.
Results: Out of the 27 included studies, four non-modifiable and 11 modifiable risk factors were discovered. Over one third of the articles identified the following risk factors as common contributors to medication non-adherence (percent of studies reporting): patients not understanding treatment (44%), side effects (41%), age (37%), dose regimen (33%), and perceived medication ineffectiveness (33%). An unanticipated finding that emerged was the need for risk stratification tools (81%) with patient-centric approaches (67%).

Conclusions: This study systematically identifies and categorizes medication non-adherence risk factors in select autoimmune diseases. Findings indicate that patients' understanding of their disease and of the role of medication is paramount. An unexpected finding was that the majority of research articles called for the creation of tailored, patient-centric interventions that dispel personal misconceptions about disease, pharmacotherapy, and how the body responds to treatment. To our knowledge, these interventions do not yet exist in digital format. Rather than adopting a systems-level approach, digital health programs should focus on cohorts with heterogeneous needs and develop tailored interventions based on individual non-adherence patterns.
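The reported frequencies are shares of the 27 included studies mentioning each factor. A minimal sketch of this tally, with study counts chosen only to reproduce the percentages quoted in the abstract (the counts themselves are illustrative, not taken from the paper):

```python
# Each risk factor maps to the number of the 27 studies reporting it.
# Counts are illustrative back-calculations from the quoted percentages.
N_STUDIES = 27
reports = {
    "not understanding treatment": 12,
    "side effects": 11,
    "age": 10,
    "dose regimen": 9,
    "perceived ineffectiveness": 9,
}

# Percentage of studies reporting each factor, rounded as in the abstract.
percentages = {f: round(100 * n / N_STUDIES) for f, n in reports.items()}
for factor, pct in sorted(percentages.items(), key=lambda kv: -kv[1]):
    print(f"{pct}%  {factor}")
```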


Promoting the inclusion of students with disabilities in e-learning systems has brought many challenges for researchers and educators. The use of synchronous communication tools such as interactive whiteboards has been regarded as an obstacle to inclusive education. In this paper, we propose an inclusive approach that provides blind students with the possibility of participating in live learning sessions with whiteboard software. The approach is based on the provision of accessible textual descriptions by a live mediator. With the accessible descriptions, students are able to navigate through the elements and explore the content of the class using screen readers. The method used for this study consisted of the implementation of a software prototype within a virtual learning environment and a case study with the participation of a blind student in a live distance class. The results from the case study show that this approach can be very effective and may be a starting point for providing blind students with resources they had previously been deprived of. The proof of concept implemented shows that many further possibilities may be explored to enhance the interaction of blind users with educational content in whiteboards, and further pedagogical approaches can be investigated from this proposal. (C) 2009 Elsevier Ltd. All rights reserved.
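The core idea is that each whiteboard element carries a mediator-written textual description that a screen reader can traverse sequentially. A minimal sketch of such a data model (class and method names here are hypothetical, not the paper's implementation):

```python
# Hypothetical data model: whiteboard elements paired with textual
# descriptions written by a live mediator, exposed as a navigable sequence
# whose current item would be handed to a screen reader.
class AccessibleWhiteboard:
    def __init__(self):
        self._elements = []   # list of (element_type, description) pairs
        self._cursor = -1

    def add_element(self, element_type, description):
        """Called by the live mediator to attach an accessible description."""
        self._elements.append((element_type, description))

    def next_description(self):
        """Advance to the next element and return its spoken text."""
        if self._cursor < len(self._elements) - 1:
            self._cursor += 1
        etype, text = self._elements[self._cursor]
        return f"{etype}: {text}"

board = AccessibleWhiteboard()
board.add_element("diagram", "A triangle with vertices labeled A, B and C")
board.add_element("text", "Pythagorean theorem: a squared plus b squared equals c squared")
print(board.next_description())  # first element's description
```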


Two fundamental processes usually arise in the production planning of many industries. The first consists of deciding how many final products of each type to produce in each period of a planning horizon, the well-known lot sizing problem. The other consists of cutting raw materials in stock in order to produce smaller parts used in the assembly of final products, the well-studied cutting stock problem. In this paper, the decision variables of these two problems are made dependent on each other in order to obtain a globally optimal solution. Setups that are typically present in lot sizing problems are relaxed together with the integer frequencies of cutting patterns in the cutting problem. Therefore, a large-scale linear optimization problem arises, which is solved exactly by a column generation technique. It is worth noting that this new combined problem still takes into account the trade-off between storage costs (for final products and parts) and trim losses (in the cutting process). We present several sets of computational tests, analyzed over three different scenarios. The results show that, by combining the problems and using an exact method, it is possible to obtain significant gains compared to the usual industrial practice, which solves them in sequence. (C) 2010 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
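Column generation alternates between a restricted master LP and a pricing subproblem that proposes new cutting patterns. A minimal sketch of the pricing step alone, under assumed data: given dual values from the master LP for each ordered piece size, it finds the pattern fitting one stock roll that maximizes total dual value (an unbounded knapsack solved by dynamic programming). The roll width and duals below are made-up illustrative numbers, not the paper's instances:

```python
# Pricing subproblem of cutting-stock column generation: an unbounded
# knapsack over piece widths, maximizing the sum of master-LP dual values.
def price_pattern(roll_width, piece_widths, duals):
    dp = [0.0] * (roll_width + 1)       # dp[c] = best dual value at capacity c
    choice = [None] * (roll_width + 1)  # piece picked at each capacity
    for c in range(1, roll_width + 1):
        for j, w in enumerate(piece_widths):
            if w <= c and dp[c - w] + duals[j] > dp[c]:
                dp[c] = dp[c - w] + duals[j]
                choice[c] = j
    # Reconstruct the pattern (piece counts) from the DP choices.
    pattern, c = [0] * len(piece_widths), roll_width
    while c > 0 and choice[c] is not None:
        pattern[choice[c]] += 1
        c -= piece_widths[choice[c]]
    return pattern, dp[roll_width]

pattern, value = price_pattern(10, [3, 4, 5], [0.30, 0.45, 0.55])
# value > 1 means the pattern has negative reduced cost and enters the master LP
print(pattern, value)
```

In a full implementation this loop would repeat until no pattern prices out, with the master LP re-solved after each new column.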


Component-based software engineering has recently emerged as a promising solution to the development of system-level software. Unfortunately, current approaches are limited to specific platforms and domains. This lack of generality is particularly problematic, as it prevents knowledge sharing and generally drives development costs up. In the past, we developed a generic approach to component-based software engineering for system-level software called OpenCom. In this paper, we present OpenComL, an instantiation of OpenCom for Linux environments, and show how it can be profiled to meet the needs of a range of system-level software in such environments. For this, we demonstrate its application to constructing a programmable router platform and a middleware for parallel environments.


This paper presents an automatic method to detect and classify weathered aggregates by assessing changes in color and texture. The method allows the extraction of aggregate features from images and their automatic classification based on surface characteristics. The concept of entropy is used to extract features from digital images. An analysis of the use of this concept is presented, and two classification approaches, based on neural network architectures, are proposed. The classification performance of the proposed approaches is compared to the results obtained by other algorithms commonly considered for classification purposes. The obtained results confirm that the presented method strongly supports the detection of weathered aggregates.
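The entropy feature in question is typically Shannon entropy computed over the gray-level histogram of an image region: a smooth, uniform surface yields low entropy, while a textured, weathered one yields high entropy. A minimal sketch on toy pixel lists (the paper's actual feature-extraction pipeline is richer than this):

```python
# Shannon entropy (in bits) of the pixel-intensity distribution of a patch.
import math
from collections import Counter

def shannon_entropy(pixels):
    counts = Counter(pixels)
    total = len(pixels)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

flat = [0] * 8            # uniform patch: no texture, zero entropy
mixed = [0, 255] * 4      # two gray levels, 50/50: one bit of entropy
print(shannon_entropy(flat), shannon_entropy(mixed))
```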


Automatic summarization of texts is now crucial for several information retrieval tasks owing to the huge amount of information available in digital media, which has increased the demand for simple, language-independent extractive summarization strategies. In this paper, we employ concepts and metrics of complex networks to select sentences for an extractive summary. The graph or network representing one piece of text consists of nodes corresponding to sentences, while edges connect sentences that share common meaningful nouns. Because various metrics could be used, we developed a set of 14 summarizers, generically referred to as CN-Summ, employing network concepts such as node degree, length of shortest paths, d-rings and k-cores. An additional summarizer was created which selects the highest ranked sentences in the 14 systems, as in a voting system. When applied to a corpus of Brazilian Portuguese texts, some CN-Summ versions performed better than summarizers that do not employ deep linguistic knowledge, with results comparable to state-of-the-art summarizers based on expensive linguistic resources. The use of complex networks to represent texts appears therefore as suitable for automatic summarization, consistent with the belief that the metrics of such networks may capture important text features. (c) 2008 Elsevier Inc. All rights reserved.
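A minimal sketch of the degree-based variant of this family of summarizers: sentences are nodes, an edge links two sentences sharing a common word, and the highest-degree sentences form the extract. Here, shared words of length at least five stand in for the "meaningful nouns" of the paper, which relies on proper linguistic preprocessing:

```python
# Degree-based extractive summarization over a sentence co-occurrence graph.
def summarize_by_degree(sentences, k):
    # crude stand-in for noun extraction: lowercase words of length >= 5
    words = [{w.lower() for w in s.split() if len(w) >= 5} for s in sentences]
    degree = [0] * len(sentences)
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            if words[i] & words[j]:        # sentences share a "meaningful" word
                degree[i] += 1
                degree[j] += 1
    ranked = sorted(range(len(sentences)), key=lambda i: -degree[i])
    return [sentences[i] for i in sorted(ranked[:k])]  # keep original order

sents = [
    "The network model ranks sentences by connectivity.",
    "A network connects sentences that share nouns.",
    "The model relies on node degree.",
    "Completely unrelated filler appears here.",
]
print(summarize_by_degree(sents, 1))
```

The other CN-Summ variants swap node degree for shortest-path lengths, d-rings, or k-cores while keeping the same graph construction.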


Alzheimer's disease is an ultimately fatal neurodegenerative disease, and BACE-1 has become an attractive validated target for its therapy, with more than a hundred crystal structures deposited in the PDB. In the present study, we present a new methodology that integrates ligand-based methods with structural information derived from the receptor. 128 BACE-1 inhibitors recently disclosed by GlaxoSmithKline R&D were selected specifically because the crystal structures of 9 of these compounds complexed to BACE-1, as well as five closely related analogs, have been made available. A new fragment-guided approach was designed to incorporate this wealth of structural information into a CoMFA study, and the methodology was systematically compared to other popular approaches, such as docking, for generating a molecular alignment. The influence of the partial charges calculation method was also analyzed. Several consistent and predictive models are reported, including one with r² = 0.88, q² = 0.69 and r²pred = 0.72. The models obtained with the new methodology performed consistently better than those obtained by other methodologies, particularly in terms of external predictive power. The visual analyses of the contour maps in the context of the enzyme drew attention to a number of possible opportunities for the development of analogs with improved potency. These results suggest that 3D-QSAR studies may benefit from the additional structural information added by the presented methodology.
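The external predictive power statistic r²pred quoted above is conventionally computed as one minus the ratio of squared prediction errors on the test set to squared deviations of the test observations from the mean of the training-set activities. A minimal sketch with made-up toy numbers, not the paper's data:

```python
# r2_pred = 1 - PRESS / SD, the standard external-validation statistic in
# 3D-QSAR: PRESS sums squared test-set errors, SD sums squared deviations of
# test observations from the training-set mean activity.
def r2_pred(y_train, y_test, y_hat):
    mean_train = sum(y_train) / len(y_train)
    press = sum((yt - yh) ** 2 for yt, yh in zip(y_test, y_hat))
    sd = sum((yt - mean_train) ** 2 for yt in y_test)
    return 1 - press / sd

print(r2_pred([1.0, 2.0, 3.0], [2.0, 4.0], [2.0, 3.0]))
```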


Measurement error models often arise in epidemiological and clinical research. Usually, in this setup it is assumed that the latent variable has a normal distribution. However, the normality assumption may not always hold. The skew-normal/independent family is a class of asymmetric, thick-tailed distributions that includes the skew-normal distribution as a special case. In this paper, we explore the use of skew-normal/independent distributions as a robust alternative in the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal distribution, the skew-t distributions, the skew-slash distributions, and the skew contaminated normal distributions. The methods developed are illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.
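The base case of this family, the skew-normal SN(ξ, ω, α), admits the standard stochastic representation X = ξ + ω(δ|Z₀| + √(1−δ²) Z₁) with δ = α/√(1+α²), which makes it easy to simulate. A minimal sampling sketch (simulation only; the paper's Bayesian fitting machinery is not shown):

```python
# Sample from the skew-normal SN(xi, omega, alpha) via the |Z0| representation
# and check the draws against the closed-form mean xi + omega*delta*sqrt(2/pi).
import math
import random

def rskewnormal(xi, omega, alpha, rng):
    delta = alpha / math.sqrt(1 + alpha * alpha)
    z0, z1 = rng.gauss(0, 1), rng.gauss(0, 1)
    return xi + omega * (delta * abs(z0) + math.sqrt(1 - delta * delta) * z1)

rng = random.Random(0)
draws = [rskewnormal(0.0, 1.0, 3.0, rng) for _ in range(100_000)]
mean = sum(draws) / len(draws)
theory = (3.0 / math.sqrt(10.0)) * math.sqrt(2.0 / math.pi)
print(round(mean, 3), round(theory, 3))
```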


This paper reports an expert system (SISTEMAT) developed for the structural determination of diverse chemical classes of natural products, including lignans, based mainly on 13C NMR and 1H NMR data of these compounds. The system is composed of five programs that analyze specific data for a lignan and show a skeleton probability for the compound. At the end of the analyses, the results are grouped, the global probability is computed, and the most probable skeleton is exhibited to the user. SISTEMAT properly predicted the skeletons of 80% of the 30 lignans tested, demonstrating its value in shortening the course of structural elucidation.
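One way to picture the "global probability" step is to combine the per-skeleton probabilities returned by each analysis program and renormalize. The sketch below multiplies them naive-Bayes style, which is an assumption for illustration only; SISTEMAT's actual combination rule may differ, and the skeleton names and probabilities are made up:

```python
# Hypothetical combination of per-program skeleton probabilities into a
# single ranking (multiply and renormalize; illustrative rule and data).
def combine(program_outputs):
    skeletons = program_outputs[0].keys()
    scores = {s: 1.0 for s in skeletons}
    for probs in program_outputs:
        for s in skeletons:
            scores[s] *= probs[s]
    total = sum(scores.values())
    return {s: v / total for s, v in scores.items()}

outputs = [  # made-up outputs from three of the five analysis programs
    {"dibenzylbutane": 0.6, "furofuran": 0.3, "aryltetralin": 0.1},
    {"dibenzylbutane": 0.5, "furofuran": 0.4, "aryltetralin": 0.1},
    {"dibenzylbutane": 0.7, "furofuran": 0.2, "aryltetralin": 0.1},
]
global_prob = combine(outputs)
best = max(global_prob, key=global_prob.get)
print(best, round(global_prob[best], 3))
```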


The substitution of petroleum-based fuels with those from renewable sources has gained momentum worldwide. A UV-vis experiment for the quantitative analysis of biofuels (bioethanol or biodiesel) in (petroleum-based) diesel oil has been developed. Before the experiment, students were given a quiz on biofuels, and then they were asked to suggest a suitable UV-vis experiment for the quantification of biofuels in diesel oil. After discussing the results of the quiz, the experiment was conducted. This included the determination of λmax of the medium-dependent, that is, solvatochromic, visible absorption band of the probe 2,6-bis[4-(tert-butyl)phenyl]-4-{2,4,6-tris[4-(tert-butyl)phenyl]pyridinium-1-yl}phenolate as a function of fuel composition. The students appreciated that the subject was linked to a daily situation and that they were asked to suggest the experiment. This experiment served to introduce the phenomena of solvation and solvatochromism.
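Quantification with a solvatochromic probe amounts to fitting a calibration curve of λmax against biofuel content for standards, then inverting it for an unknown sample. A minimal least-squares sketch, assuming a linear response; all numbers below are illustrative, not measured values from the experiment:

```python
# Linear calibration of probe lambda_max (nm) vs. biodiesel content (% v/v),
# then inversion of the fitted line to quantify an unknown sample.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

biodiesel_pct = [0.0, 5.0, 10.0, 20.0]     # standards (% v/v in diesel)
lambda_max = [620.0, 618.0, 616.0, 612.0]  # made-up linear response (nm)
slope, intercept = fit_line(biodiesel_pct, lambda_max)
unknown = (615.0 - intercept) / slope      # invert for a sample at 615 nm
print(slope, intercept, unknown)
```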


The literature on the welfare costs of inflation universally assumes that the many-person household can be treated as a single economic agent. This paper explores what the heterogeneity of the agents in a household might imply for such welfare analyses. First, we show that allowing for a single-unit or for a multi-unit transacting technology impacts the money demand function and, therefore, the welfare costs of inflation. Second, we derive sufficient conditions that make the welfare assessments which depart directly from the knowledge of the money demand function (as in Lucas (2000)) robust under this alternative setting. Third, we compare our general-equilibrium measure with Bailey's (1956) partial-equilibrium one.


The literature on the welfare costs of inflation universally assumes that the many-person household can be treated as a single economic agent. This paper explores what the heterogeneity of the agents in a household might imply for such welfare analyses. First, we show that allowing for a one-person or for a many-person transacting technology impacts the money demand function and, therefore, the welfare costs of inflation. Second, more importantly, we derive sufficient conditions under which welfare assessments which depart directly from the knowledge of the money demand function (as in Lucas (2000)) are robust (invariant) under the number of persons considered in the household. Third, we show that Bailey's (1956) partial-equilibrium measure of the welfare costs of inflation can be obtained as a first-order approximation of the general-equilibrium welfare measure derived in this paper using a many-person transacting technology.
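Bailey's partial-equilibrium measure is the consumer-surplus area under the money demand curve between a zero nominal rate and the rate i, net of interest actually paid on balances. A minimal numerical sketch using a Cagan-style demand m(i) = A·exp(−ξi), an assumed functional form chosen only because its welfare integral has a closed form to check against (it is not the specification of this paper):

```python
# Bailey's welfare cost: integral of m(x) from 0 to i, minus i*m(i),
# computed by midpoint quadrature and checked against the closed form.
import math

def bailey_welfare(A, xi, i, steps=100_000):
    h = i / steps
    area = sum(A * math.exp(-xi * (k + 0.5) * h) * h for k in range(steps))
    return area - i * A * math.exp(-xi * i)

A, xi, i = 0.3, 7.0, 0.10            # illustrative demand parameters
numeric = bailey_welfare(A, xi, i)
closed = A * (1 - math.exp(-xi * i)) / xi - i * A * math.exp(-xi * i)
print(round(numeric, 6), round(closed, 6))
```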


Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)


We present analytical and numerical results for the specific heat and susceptibility amplitude ratios in parallel plate geometries. The results are derived using field-theoretic techniques suitable to describe the system in the bulk limit, i.e., (L/ξ±)≫ 1, where L is the distance between the plates and ξ± is the correlation length above (+) and below (-) the bulk critical temperature. Advantages and drawbacks of our method are discussed in the light of other approaches previously reported in the literature.