866 results for "A network is to improve health and reduce health inequalities through information exchange"


Relevance:

100.00%

Publisher:

Abstract:

Objective. The general aim of this article is to describe the state of the art of biocompatibility testing for dental materials, and to present new strategies for improving operative dentistry techniques and the biocompatibility of dental materials as they relate to their interaction with the dentin-pulp complex. Methods. The literature was reviewed, focusing on articles related to biocompatibility testing, the dentin-pulp complex, and new strategies and materials for operative dentistry. For this purpose, the PubMed database was searched, covering 118 articles published in English from 1939 to 2014. Data concerning the types of biological tests and the standardization of the in vitro and in vivo protocols employed to evaluate the cytotoxicity and biocompatibility of dental materials were also obtained from the US Food and Drug Administration (FDA), the International Organization for Standardization (ISO) and the American National Standards Institute (ANSI). Results. While there is an ongoing search for feasible molecular strategies to direct the repair or regeneration of the structures that form the oral tissues, professionals must master the clinical therapies available at present. In turn, these techniques must be applied based on knowledge of the morphological and physiological characteristics of the tissues involved, as well as the physical, mechanical and biological properties of the biomaterials recommended for each specific situation. Thus, particularly within modern esthetic restorative dentistry, the use of minimally invasive operative techniques, together with dental materials whose excellent properties have been scientifically proven by clinical and laboratory studies, must be routine for dentists. This professional and responsible attitude will certainly increase the possibility of achieving clinical success, benefiting patients and dentists alike. Significance. This article provides a general and critical view of the relations that permeate the interaction between dental materials and the dentin-pulp complex, and establishes real possibilities and strategies that favor the biocompatibility of present and new products used in dentistry, which will certainly benefit clinicians and their patients. (C) 2014 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

The objective of this article is to discuss the meanings that health professionals and patients in treatment attribute to obesity. The research consisted of a qualitative health survey, based on in-depth interviews with patients and professionals at an outpatient clinic of the University Hospital in Barcelona, Spain. We discuss the concept of obesity, the meanings of diagnoses, the singularities involved in managing treatment, and the process of becoming ill, all in the light of a socioculturally oriented anthropology of health. Obesity is usually seen by professionals as a risk-factor disease. For patients, the incorporation of this rationality is gradual and is mixed with other meanings attributed to being overweight or obese that have been developed throughout life. A patient's autonomy in choosing to be fat, or obese, and to adhere to treatment is defined as a process that requires support, in order to arrive at joint proposals for the care of these problems.

Relevance:

100.00%

Publisher:

Abstract:

Connectivity is the basic requirement for the proper operation of any wireless network. In a mobile wireless sensor network, it is a challenge for applications and protocols to deal with connectivity problems, as links may go up and down frequently. In these scenarios, knowledge of a node's remaining connectivity time could both improve the performance of protocols (e.g. handoff mechanisms) and conserve scarce node resources (CPU, bandwidth and energy) by preventing unfruitful transmissions. This paper provides a solution, called the Genetic Machine Learning Algorithm (GMLA), to forecast the remaining connectivity time in mobile environments. It consists of combining classifier systems with a Markov chain model of the RF link quality. The main advantage of using an evolutionary approach is that the Markov model parameters can be discovered on the fly, making it possible to cope with unknown environments and mobility patterns. Simulation results show that the proposal is a very suitable solution, as it outperforms similar approaches.
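The Markov-chain side of this idea can be sketched as follows: model the RF link quality as a chain whose absorbing state is "link down" and compute the expected number of steps until absorption. The state set and transition probabilities below are illustrative placeholders; in GMLA they would be discovered on the fly by the classifier system.

```python
# Sketch: expected remaining connectivity time from a Markov-chain link model.

def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination (small dense systems)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * p for a, p in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Transient link-quality states: 0=GOOD, 1=FAIR, 2=BAD; absorbing state: DOWN.
# Q[i][j] = per-step probability of moving from transient state i to j
# (each row's missing mass is the probability of the link going down).
Q = [[0.90, 0.08, 0.01],
     [0.10, 0.80, 0.08],
     [0.01, 0.15, 0.70]]

# Expected steps to absorption: t = (I - Q)^-1 * 1
I_minus_Q = [[(1.0 if i == j else 0.0) - Q[i][j] for j in range(3)]
             for i in range(3)]
t = solve(I_minus_Q, [1.0, 1.0, 1.0])
print(t)  # expected remaining connectivity (in steps) for each current state
```

A protocol could use `t[state]` to decide, for instance, whether a pending transmission is worth starting or a handoff should be triggered instead.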

Relevance:

100.00%

Publisher:

Abstract:

Patients must be provided with information about therapeutic possibilities, including the risks, benefits, prognosis and costs of each possible and indicated alternative; this is an ethical and legal requirement. However, health professionals hold the clinical, technical and scientific knowledge and determine what information will (or will not) be provided. The patient then decides whether to undergo a treatment, giving his or her free and informed consent on the basis of the data presented. Unfortunately, some professionals may not provide all the information necessary for making an informed decision or, after obtaining the patient's consent, may provide information that causes the patient to give up the treatment initially accepted. Such information, if relevant and not a supervening fact, should have been provided initially. Moreover, the information may not be entirely true, leading the patient, for instance, to decide on the basis of inadequately presented risks. Craniofacial rehabilitation of the temporomandibular joint (TMJ) by means of TMJ prostheses is indicated in many situations. Often, patients in need of such prostheses have esthetic and functional problems, and expectations for rehabilitation run high. This work presents a case and discusses the ethical and legal issues involved, including the liability arising from providing partial and inadequate information to a patient.

Relevance:

100.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

100.00%

Publisher:

Abstract:

Transmission expansion planning (TEP) is a classic problem in electric power systems. In the current optimization models used to approach the TEP problem, new transmission lines and two-winding transformers are commonly the only candidate solutions. In practice, however, planners have resorted to non-conventional solutions such as network reconfiguration and/or the repowering of existing network assets (lines or transformers). These non-conventional solutions are currently not included in the classic mathematical models of the TEP problem. This paper presents the modeling of the necessary equations, using linear expressions, to include non-conventional candidate solutions in the disjunctive linear model of the TEP problem. The resulting model is a mixed-integer linear programming problem, which guarantees convergence to the optimal solution by means of available classical optimization tools. The proposed model is implemented in the AMPL modeling language and solved using the CPLEX optimizer. The Garver test system, the IEEE 24-bus system, and a Colombian system are used to demonstrate that the use of non-conventional candidate solutions can reduce the investment costs of the TEP problem. (C) 2015 Elsevier Ltd. All rights reserved.
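As a toy illustration of why adding non-conventional candidates can lower investment cost, the brute-force sketch below picks the cheapest set of investments (a new line vs. repowering the existing asset) that meets demand on a single corridor. All capacities and costs are invented placeholders; the paper's actual formulation is a full disjunctive MILP over a network, solved with AMPL/CPLEX rather than enumeration.

```python
from itertools import product

# Toy single-corridor TEP: move `demand` MW from bus 1 to bus 2 at minimum
# investment cost. "repower" is a non-conventional candidate: it upgrades the
# existing line instead of building a new circuit.
demand = 220.0             # MW required
existing_capacity = 100.0  # MW on the existing line

candidates = {             # name: (added capacity MW, investment cost)
    "new_line": (150.0, 30.0),
    "repower":  (130.0, 12.0),
}

best = None
names = list(candidates)
for choice in product([0, 1], repeat=len(names)):
    picked = [n for n, x in zip(names, choice) if x]
    cap = existing_capacity + sum(candidates[n][0] for n in picked)
    cost = sum(candidates[n][1] for n in picked)
    if cap >= demand and (best is None or cost < best[0]):
        best = (cost, picked)

print(best)  # -> (12.0, ['repower'])
```

With only the conventional candidate available, the plan would cost 30; allowing repowering cuts the investment to 12, which is the effect the paper demonstrates on the Garver, IEEE 24-bus and Colombian systems.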

Relevance:

100.00%

Publisher:

Abstract:

Biodegradable polymers are starting to be introduced as raw materials in the food-packaging market. Nevertheless, their price is very high. Starch, a fully biodegradable and bio-derived polymer, is a very interesting alternative due to its very low price. However, the use of starch as the polymer matrix for the production of rigid food packaging, such as trays, is limited by its poor mechanical properties, high hydrophilicity and high density. This work presents two strategies to overcome the poor mechanical properties of starch: first, the plasticization of starch with several amounts of glycerol to produce thermoplastic starch (TPS), and second, the production of biocomposites by reinforcing TPS with promising fibers such as barley straw and grape waste. The mechanical properties obtained are compared with the values predicted by models used in the field of composites: the law of mixtures, Kerner-Nielsen and Halpin-Tsai models. To evaluate whether the materials developed are suitable for the production of food-packaging trays, the TPS-based materials with the best mechanical properties were compared with commercial grades of oil-based polymers, polypropylene (PP) and polyethylene terephthalate (PET), and a biodegradable polymer, polylactic acid (PLA).
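For reference, two of the composite-stiffness models named above, the law of mixtures and Halpin-Tsai, can be sketched as below. The moduli and fiber fractions are hypothetical placeholders, not the TPS/fiber values measured in this work.

```python
# Composite modulus predictions: law of mixtures vs. Halpin-Tsai.

def law_of_mixtures(E_f, E_m, V_f):
    """Upper-bound (Voigt) estimate: linear blend of fiber and matrix moduli."""
    return V_f * E_f + (1.0 - V_f) * E_m

def halpin_tsai(E_f, E_m, V_f, xi=2.0):
    """Halpin-Tsai estimate; xi is a shape factor (e.g. twice the fiber
    aspect ratio for longitudinal stiffness of short fibers)."""
    eta = (E_f / E_m - 1.0) / (E_f / E_m + xi)
    return E_m * (1.0 + xi * eta * V_f) / (1.0 - eta * V_f)

E_fiber, E_matrix = 20.0, 0.5   # GPa (hypothetical straw fiber vs. TPS matrix)
for V_f in (0.1, 0.2, 0.3):
    lom = law_of_mixtures(E_fiber, E_matrix, V_f)
    ht = halpin_tsai(E_fiber, E_matrix, V_f)
    print(f"V_f={V_f:.1f}: law of mixtures={lom:.2f} GPa, Halpin-Tsai={ht:.2f} GPa")
```

The law of mixtures is an upper bound, so measured short-fiber composite moduli typically fall closer to the Halpin-Tsai curve.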

Relevance:

100.00%

Publisher:

Abstract:

Tuberculosis is one of the leading causes of morbidity and mortality worldwide. Current treatment faces several challenges, including multidrug resistance, extensive drug resistance and HIV co-infection. Problems related to patients, treatment and the health care system also contribute negatively to this scenario. This review summarizes the main obstacles to treating tuberculosis and discusses several strategies to improve treatment.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Low-frequency repetitive transcranial magnetic stimulation (rTMS) of the unaffected hemisphere can enhance function of the paretic hand in patients with mild motor impairment. The effects of low-frequency rTMS to the contralesional motor cortex at an early stage of mild to severe hemiparesis after stroke are unknown. In this pilot, randomized, double-blind clinical trial we compared the effects of low-frequency rTMS and sham rTMS, as add-on therapies to customary outpatient rehabilitation, in 30 patients with mild to severe hand paresis within 5-45 days after ischemic stroke. The primary feasibility outcome was compliance with the interventions. The primary safety outcome was the proportion of intervention-related adverse events. Performance of the paretic hand in the Jebsen-Taylor test and pinch strength were secondary outcomes. Outcomes were assessed at baseline, after ten sessions of treatment administered over 2 weeks, and at 1 month after the end of treatment. Baseline clinical features were comparable across groups. For the primary feasibility outcome, compliance with treatment was 100% in the active group and 94% in the sham group. There were no serious intervention-related adverse events. There were significant improvements in performance in the Jebsen-Taylor test (mean, 12.3% at 1 month after treatment) and in pinch force (mean, 0.5 N) in the active group, but not in the sham group. Low-frequency rTMS to the contralesional motor cortex early after stroke is feasible, safe and potentially effective for improving function of the paretic hand in patients with mild to severe hemiparesis. These promising results will be valuable for designing larger randomized clinical trials.

Relevance:

100.00%

Publisher:

Abstract:

Vaquero AR, Ferreira NE, Omae SV, Rodrigues MV, Teixeira SK, Krieger JE, Pereira AC. Using gene-network landscape to dissect genotype effects of TCF7L2 genetic variant on diabetes and cardiovascular risk. Physiol Genomics 44: 903-914, 2012. First published August 7, 2012; doi:10.1152/physiolgenomics.00030.2012. The single nucleotide polymorphism (SNP) within the TCF7L2 gene, rs7903146, is, to date, the most significant genetic marker associated with Type 2 diabetes mellitus (T2DM) risk. Nonetheless, its functional role in disease pathology is poorly understood. The aim of the present study was to investigate, in vascular smooth muscle cells from 92 patients undergoing aortocoronary bypass surgery, the contribution of this SNP to T2DM using expression-level and expression-correlation comparison approaches, which were visually represented as gene interaction networks. Initially, the expression levels of 41 genes (seven TCF7L2 splice forms and 40 other T2DM-relevant genes) were compared between the rs7903146 wild-type (CC) and T2DM-risk (CT + TT) genotype groups. Next, we compared the expression correlation patterns of these 41 genes between groups to observe whether the relationships between genes were different. Five TCF7L2 splice forms and nine genes showed significant expression differences between groups. The RXR alpha gene showed the most distinct expression correlation pattern with the other genes. Therefore, T2DM risk alleles appear to influence the expression of TCF7L2 splice forms in vascular smooth muscle cells, and the RXR alpha gene emerges as a candidate treatment target for risk reduction in individuals at high risk of developing T2DM, especially those harboring TCF7L2 risk genotypes.
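The expression-correlation comparison at the core of this approach can be sketched as follows: compute a gene-gene Pearson correlation within each genotype group and compare the two coefficients. The data below are simulated for illustration only (hypothetical "gene A"/"gene B" values, not the study's measurements).

```python
import random

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

random.seed(1)
# Simulated expression of two genes in 46 CC carriers and 46 CT+TT carriers:
cc_geneA = [random.gauss(5, 1) for _ in range(46)]
cc_geneB = [a + random.gauss(0, 0.3) for a in cc_geneA]   # coupled in CC
risk_geneA = [random.gauss(5, 1) for _ in range(46)]
risk_geneB = [random.gauss(5, 1) for _ in range(46)]      # decoupled in CT+TT

r_cc = pearson(cc_geneA, cc_geneB)
r_risk = pearson(risk_geneA, risk_geneB)
print(f"r(CC)={r_cc:.2f}, r(CT+TT)={r_risk:.2f}, delta={r_cc - r_risk:.2f}")
```

Repeating this over all gene pairs and drawing an edge wherever the correlation differs markedly between groups yields the kind of gene interaction network in which RXR alpha stood out.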

Relevance:

100.00%

Publisher:

Abstract:

Propolis is a polyphenol-rich resinous substance extensively used to improve health and prevent diseases. The effects of polyphenols from different sources of propolis on atherosclerotic lesions and on inflammatory and angiogenic factors were investigated in LDL receptor gene knockout (LDLr-/-) mice. The animals received a cholesterol-enriched diet to induce initial atherosclerotic lesions (IALs) or advanced atherosclerotic lesions (AALs). The IAL or AAL animals were divided into three groups, each receiving polyphenols from either green, red or brown propolis (250 mg/kg per day) by gavage. After 4 weeks of polyphenol treatment, the animals were sacrificed and their blood was collected for lipid profile analysis. The atheromatous lesions at the aortic root were also analyzed for gene expression of inflammatory and angiogenic factors by quantitative real-time polymerase chain reaction and immunohistochemistry. All three polyphenol extracts improved the lipid profile and decreased the atherosclerotic lesion area in IAL animals. However, only polyphenols from red propolis induced favorable changes in the lipid profiles and reduced the lesion areas in AAL mice. In the IAL groups, VCAM, MCP-1, FGF, PDGF, VEGF, PECAM and MMP-9 gene expression was down-regulated, while the metalloproteinase inhibitor TIMP-1 gene was up-regulated by all polyphenol extracts. In contrast, for advanced lesions, only the polyphenols from red propolis induced the down-regulation of CD36 and the up-regulation of HO-1 and TIMP-1 when compared to polyphenols from the other two types of propolis. In conclusion, polyphenols from propolis, particularly red propolis, are able to reduce atherosclerotic lesions through mechanisms including the modulation of inflammatory and angiogenic factors. (C) 2012 Elsevier Inc. All rights reserved.

Relevance:

100.00%

Publisher:

Abstract:

The scaling down of transistor technology allows microelectronics manufacturers such as Intel and IBM to build ever more sophisticated systems on a single microchip. Classical interconnection solutions based on shared buses or direct connections between the modules of the chip are becoming obsolete, as they struggle to sustain the increasingly tight bandwidth and latency constraints that these systems demand. The most promising solution for future chip interconnects is the Network on Chip (NoC). NoCs are networks composed of routers and channels used to interconnect the different components installed on a single microchip. Examples of advanced processors based on NoC interconnects are the IBM Cell processor, composed of eight CPUs and installed in the Sony PlayStation 3, and the Intel Teraflops project, composed of 80 independent (simple) microprocessors. On-chip integration is becoming popular not only in the Chip Multi Processor (CMP) research area but also in the wider and more heterogeneous world of Systems on Chip (SoC). SoCs comprise all the electronic devices that surround us, such as cell phones, smartphones, home embedded systems, automotive systems, set-top boxes, etc. SoC manufacturers such as ST Microelectronics, Samsung and Philips, as well as universities such as the University of Bologna, M.I.T. and Berkeley, are all proposing proprietary frameworks based on NoC interconnects. These frameworks help engineers switch design methodology and speed up the development of new NoC-based systems on chip. In this thesis we first present an introduction to CMP and SoC interconnection networks. Then, focusing on SoC systems, we propose:

• a detailed, simulation-based analysis of the Spidergon NoC, an ST Microelectronics solution for SoC interconnects. The Spidergon NoC differs from many classical solutions inherited from the parallel computing world. We analyze this NoC topology and its routing algorithms in detail, and furthermore propose Equalized, a new routing algorithm designed to optimize the use of the network's resources while also increasing its performance;
• a methodology flow, based on modified publicly available tools, that can be combined to design, model and analyze any kind of System on Chip;
• a detailed analysis of an ST Microelectronics proprietary transport-level protocol that the author of this thesis helped to develop;
• a comprehensive, simulation-based comparison of different network interface designs proposed by the author and the researchers at the AST lab, in order to integrate shared-memory and message-passing based components on a single System on Chip;
• a powerful and flexible solution to the timing closure exception issue in the design of synchronous Networks on Chip. Our solution is based on relay-station repeaters and reduces the power and area demands of NoC interconnects while also reducing their buffer needs;
• a solution to simplify the design of NoCs while also increasing their performance and reducing their power and area consumption. We propose to replace complex and slow virtual-channel-based routers with multiple, flexible, small Multi Plane routers. This solution allows us to reduce the area and power dissipation of any NoC while also increasing its performance, especially when resources are reduced.

This thesis was written in collaboration with the Advanced System Technology laboratory in Grenoble, France, and the Computer Science Department at Columbia University in the City of New York.
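The Spidergon topology connects each of an even number of nodes to its two ring neighbors and to the diametrically opposite node. As a sketch of routing on it, the function below uses the common Across-First heuristic (take the across link when the destination is more than a quarter of the ring away); this is a textbook scheme for illustration, not the Equalized algorithm proposed in the thesis.

```python
# Routing on a Spidergon topology: ring links plus an "across" link to the
# diametrically opposite node.

def spidergon_route(src, dst, n):
    """Return the list of nodes visited from src to dst (Across-First)."""
    assert n % 2 == 0, "Spidergon requires an even node count"
    path = [src]
    cur = src
    while cur != dst:
        cw = (dst - cur) % n           # clockwise ring distance
        ccw = (cur - dst) % n          # counterclockwise ring distance
        if min(cw, ccw) > n // 4:      # far away: jump across the chip first
            cur = (cur + n // 2) % n
        elif cw <= ccw:
            cur = (cur + 1) % n        # one hop clockwise
        else:
            cur = (cur - 1) % n        # one hop counterclockwise
        path.append(cur)
    return path

print(spidergon_route(0, 9, 16))   # across to 8, then one ring hop: [0, 8, 9]
print(spidergon_route(0, 2, 16))   # pure ring route: [0, 1, 2]
```

Each across jump shrinks the remaining ring distance to at most n/4, so any pair of nodes is reached in at most one across hop plus at most n/4 ring hops.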

Relevance:

100.00%

Publisher:

Abstract:

Forecasting the time, location, nature and scale of volcanic eruptions is one of the most urgent tasks of modern applied volcanology. The reliability of probabilistic forecasting procedures is strongly related to the reliability of the input information provided, which implies objective criteria for interpreting the historical and monitoring data. For this reason, both detailed analysis of past data and more basic research into the processes of volcanism are fundamental tasks of a continuous information-gain process; in this way the precursors of eruptions can be better interpreted in terms of their physical meaning, with correlated uncertainties. This should lead to better predictions of the nature of eruptive events. In this work we study different problems associated with long- and short-term eruption forecasting. First, we discuss different approaches to the analysis of the eruptive history of a volcano, most of them generally applied for long-term eruption forecasting purposes; furthermore, we present a model based on the characteristics of a Brownian passage-time process to describe recurrent eruptive activity, and apply it to long-term, time-dependent eruption forecasting (Chapter 1). Conversely, in an effort to define further monitoring parameters as input data for short-term eruption forecasting in probabilistic models (such as the Bayesian Event Tree for eruption forecasting, BET_EF), we analyze some characteristics of the typical seismic activity recorded at active volcanoes; in particular, we use methodologies that may be applied to analyze long-period (LP) events (Chapter 2) and volcano-tectonic (VT) seismic swarms (Chapter 3); our analyses are in general oriented toward tracking phenomena that can provide information about magmatic processes.
Finally, we discuss some possible ways to integrate the results presented in Chapters 1 (for long-term EF), 2 and 3 (for short-term EF) in the BET_EF model (Chapter 4).
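The Brownian passage-time (inverse Gaussian) renewal model mentioned above has density f(t; μ, α) = sqrt(μ / (2π α² t³)) · exp(−(t − μ)² / (2 μ α² t)), where μ is the mean repose time and α the aperiodicity. The sketch below evaluates its density and hazard rate; the μ and α values are illustrative placeholders, not fitted to any volcano.

```python
import math

# Brownian passage-time (inverse Gaussian) model for time-dependent forecasting.

def bpt_pdf(t, mu, alpha):
    """BPT density: mu = mean repose time, alpha = aperiodicity."""
    return (math.sqrt(mu / (2 * math.pi * alpha**2 * t**3))
            * math.exp(-(t - mu)**2 / (2 * mu * alpha**2 * t)))

def bpt_hazard(t, mu, alpha, dt=0.01):
    """Hazard rate h(t) = f(t) / (1 - F(t)), with F by midpoint integration."""
    ts = [dt * (i + 0.5) for i in range(int(t / dt))]
    F = sum(bpt_pdf(s, mu, alpha) * dt for s in ts)
    return bpt_pdf(t, mu, alpha) / (1.0 - F)

mu, alpha = 50.0, 0.5  # mean repose 50 yr, moderate aperiodicity (hypothetical)
for t in (10.0, 50.0, 100.0):
    print(f"t={t:>5.0f} yr  pdf={bpt_pdf(t, mu, alpha):.5f}"
          f"  hazard={bpt_hazard(t, mu, alpha):.5f}")
```

Unlike a Poisson (exponential) model, the BPT hazard is near zero right after an eruption and rises as the repose approaches μ, which is what makes the forecast time-dependent.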

Relevance:

100.00%

Publisher:

Abstract:

This thesis presents several data processing and compression techniques capable of addressing the strict requirements of wireless sensor networks. After a general overview of sensor networks, the energy problem is introduced, dividing the different energy-reduction approaches according to the subsystem they try to optimize. To manage the complexity brought by these techniques, a quick overview of the most common middlewares for WSNs is given, describing in detail SPINE2, a framework for data processing in the node environment. The focus then shifts to in-network aggregation techniques, used to reduce the data sent by the network nodes and prolong the network lifetime as much as possible. Among the several techniques, the most promising approach is Compressive Sensing (CS). To investigate this technique, a practical implementation of the algorithm is compared against a simpler aggregation scheme, deriving a mixed algorithm able to successfully reduce power consumption. The analysis then moves from compression implemented on single nodes to CS for signal ensembles, trying to exploit the correlations among sensors and nodes to improve compression and reconstruction quality. The two main techniques for signal ensembles, Distributed CS (DCS) and Kronecker CS (KCS), are introduced and compared against a common set of data gathered by real deployments. The best trade-off between reconstruction quality and power consumption is then investigated. The use of CS is also addressed when the signal of interest is sampled at a sub-Nyquist rate, evaluating the reconstruction performance. Finally, group-sparsity CS (GS-CS) is compared to another well-known technique for the reconstruction of signals from a highly sub-sampled version. These two frameworks are again compared against a real data set, and an insightful analysis of the trade-off between reconstruction quality and lifetime is given.
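The key idea behind CS, that a sparse signal can be recovered from far fewer measurements than its length, can be sketched in a few lines. The example below recovers a 1-sparse signal from 16 random measurements with a single matching-pursuit step; it is purely illustrative (invented sizes and values), whereas practical deployments use iterative solvers such as OMP or l1 minimization over multi-sparse signals.

```python
import random

random.seed(7)
N, M = 64, 16                       # signal length, number of measurements
x = [0.0] * N
x[23] = 3.5                         # 1-sparse signal: one active component

# Random +/-1 (Bernoulli) sensing matrix and compressed measurements y = A x.
# A sensor node would transmit only the M values of y instead of all N samples.
A = [[random.choice((-1.0, 1.0)) for _ in range(N)] for _ in range(M)]
y = [sum(A[i][j] * x[j] for j in range(N)) for i in range(M)]

# Matching pursuit, one iteration: pick the column most correlated with y,
# then estimate its amplitude by least squares.
corr = [sum(A[i][j] * y[i] for i in range(M)) for j in range(N)]
k = max(range(N), key=lambda j: abs(corr[j]))
amp = corr[k] / sum(A[i][k] ** 2 for i in range(M))

print(f"recovered index={k}, amplitude={amp:.2f}")
```

The compression ratio here is N/M = 4: the node sends 16 numbers, and the sink recovers the original 64-sample signal exactly because it is sparse.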