999 results for "Basic problematization units"


Relevance: 20.00%

Abstract:

The topic of cardiorespiratory interactions is of extreme importance to the practicing intensivist. It also has a reputation for being intellectually challenging, due in part to the enormous volume of relevant, at times contradictory literature. Another source of difficulty is the need to simultaneously consider the interrelated functioning of several organ systems (not necessarily limited to the heart and lung), in other words, to adopt a systemic (as opposed to analytic) point of view. We believe that the proper understanding of a few simple physiological concepts is of great help in organizing knowledge in this field. The first part of this review will be devoted to demonstrating this point. The second part, to be published in a coming issue of Intensive Care Medicine, will apply these concepts to clinical situations. We hope that this text will be of some use, especially to intensivists in training, to demystify a field that many find intimidating.

Relevance: 20.00%

Abstract:

Audit report of the financial statements of the governmental activities, the business-type activities, the aggregate discretely presented component units, each major fund, and the aggregate remaining fund information of the State of Iowa as of and for the year ended June 30, 2014.

Relevance: 20.00%

Abstract:

Gait analysis methods that estimate spatiotemporal measures from two, three or four gyroscopes attached to the lower limbs have been discussed in the literature. The most common approach to reducing the number of sensing units is to simplify the underlying biomechanical gait model. In this study, we propose a novel method based on predicting the movements of the thighs from the movements of the shanks. Datasets from three previous studies were used. Data from the first study (ten healthy subjects and ten with Parkinson's disease) were used to develop and calibrate a system with only two gyroscopes attached to the shanks. Data from the two other studies (36 subjects with hip replacement, seven subjects with coxarthrosis, and eight control subjects) were used for comparison with the other methods and for assessment of error against a motion capture system. Results show that the stride-length estimation errors relative to motion capture were similar for the system with four gyroscopes and our new method based on two gyroscopes (-0.8 ± 6.6 cm versus 3.8 ± 6.6 cm). An alternative with three sensing units did not show better results (error: -0.2 ± 8.4 cm). Finally, a fourth method that also used two units, but with a simpler gait model, had the highest bias relative to the reference (error: -25.6 ± 7.6 cm). We conclude that it is feasible to estimate the movements of the thighs from the movements of the shanks, reducing the number of sensing units needed for ambulatory gait analysis from four to two.
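The abstract's error summary (mean ± standard deviation of the estimate-minus-reference difference, in cm) can be sketched as follows; the helper function and the stride values are illustrative, not the study's data or code.

```python
# Bias and precision of stride-length estimates against a motion-capture
# reference: mean error and sample standard deviation of the error, the
# summary form reported in the abstract (e.g. -0.8 +/- 6.6 cm).

def error_stats(estimated_cm, reference_cm):
    """Return (mean error, sample SD of error) of estimates vs. reference."""
    errors = [e - r for e, r in zip(estimated_cm, reference_cm)]
    n = len(errors)
    mean = sum(errors) / n
    var = sum((x - mean) ** 2 for x in errors) / (n - 1)
    return mean, var ** 0.5

# Hypothetical strides (cm): gyroscope estimate vs. motion capture
estimated = [140.2, 135.0, 150.8, 142.1]
reference = [141.0, 134.0, 149.5, 143.0]
bias, sd = error_stats(estimated, reference)
```

A small bias with a larger SD, as in the abstract, indicates a method that is accurate on average but variable stride to stride.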

Relevance: 20.00%

Abstract:

BACKGROUND: Basic calcium phosphate (BCP) crystals are commonly found in osteoarthritis (OA) and are associated with cartilage destruction. BCP crystals induce catabolic responses in vitro, with the production of metalloproteases and inflammatory cytokines such as interleukin-1 (IL-1). In vivo, IL-1 production induced by BCP crystals is both dependent on and independent of the NLRP3 inflammasome. We aimed to clarify (1) the role of BCP crystals in cartilage destruction and (2) the role of IL-1 and the NLRP3 inflammasome in cartilage degradation related to BCP crystals.
METHODOLOGY/PRINCIPAL FINDINGS: Synovial membranes isolated from OA knees were analysed by Alizarin red staining and FTIR. Pyrogen-free BCP crystals were injected into the right knees of WT, NLRP3-/-, ASC-/-, IL-1α-/- and IL-1β-/- mice, and PBS was injected into the left knees. To assess the role of IL-1, WT mice were treated with intraperitoneal injections of anakinra, the recombinant IL-1 receptor antagonist (IL-1Ra), or PBS. Articular destruction was studied at days 4, 17 and 30, assessing synovial inflammation, proteoglycan loss and chondrocyte apoptosis. BCP crystals were frequently found in OA synovial membranes, including in low-grade OA. BCP crystals injected into murine knee joints provoked synovial inflammation characterized by synovial macrophage infiltration that persisted at day 30; cartilage degradation, as evidenced by loss of proteoglycan staining by Safranin-O and concomitant expression of VDIPEN epitopes; and increased chondrocyte apoptosis. BCP crystal-induced synovitis was totally independent of IL-1α and IL-1β signalling, and no alterations of inflammation were observed in mice deficient in components of the NLRP3 inflammasome, IL-1α or IL-1β. Similarly, treatment with anakinra did not prevent BCP crystal effects. In vitro, BCP crystals elicited enhanced transcription of matrix-degrading and pro-inflammatory genes in macrophages.
CONCLUSIONS/SIGNIFICANCE: Intra-articular BCP crystals can elicit synovial inflammation and cartilage degradation, suggesting that BCP crystals have a direct pathogenic role in OA. These effects are independent of IL-1 and the NLRP3 inflammasome.

Relevance: 20.00%

Abstract:

Objectives: We tested the effects of three forms of basic calcium phosphate (BCP) crystals (octacalcium phosphate (OCP), carbonate-substituted apatite (CA) and hydroxyapatite (HA)) on IL-1β secretion by monocytes and macrophages. The requirement for the NALP3 inflammasome and the TLR2 and TLR4 receptors in this acute response was analyzed.

Relevance: 20.00%

Abstract:

Regulatory T cells control immune responses to self- and foreign-antigens and play a major role in maintaining the balance between immunity and tolerance. This article reviews recent key developments in the field of CD4+CD25+Foxp3+ regulatory T (TREG) cells. It presents their characteristics and describes their range of activity and mechanisms of action. Some models of diseases triggered by the imbalance between TREG cells and effector pathogenic T cells are described and their potential therapeutic applications in humans are outlined.

Relevance: 20.00%

Abstract:

The demyelinative potential of the cytokines interleukin-1 alpha (IL-1 alpha), interferon-gamma (IFN-gamma), and tumor necrosis factor-alpha (TNF-alpha) has been investigated in myelinating aggregate brain cell cultures. Treatment of myelinated cultures with these cytokines resulted in a reduction in myelin basic protein (MBP) content. This effect was additively increased by anti-myelin/oligodendrocyte glycoprotein (alpha-MOG) in the presence of complement. Qualitative immunocytochemistry demonstrated that peritoneal macrophages, added to the fetal telencephalon cell suspensions at the start of the culture period, successfully integrated into aggregate cultures. Supplementing the macrophage component of the cultures in this fashion resulted in increased accumulation of MBP. The effect of IFN-gamma on MBP content of cultures was not affected by the presence of macrophages in increased numbers.

Relevance: 20.00%

Abstract:

Soluble MHC-peptide complexes, commonly known as tetramers, allow the detection and isolation of antigen-specific T cells. Although other types of soluble MHC-peptide complexes have been introduced, the most commonly used MHC class I staining reagents are those originally described by Altman and Davis. As these reagents have become an essential tool for T cell analysis, it is important to have a large repertoire of such reagents to cover a broad range of applications in cancer research and clinical trials. Our tetramer collection currently comprises 228 human and 60 mouse tetramers and new reagents are continuously being added. For the MHC II tetramers, the list currently contains 21 human (HLA-DR, DQ and DP) and 5 mouse (I-A(b)) tetramers. Quantitative enumeration of antigen-specific T cells by tetramer staining, especially at low frequencies, critically depends on the quality of the tetramers and on the staining procedures. For conclusive longitudinal monitoring, standardized reagents and analysis protocols need to be used. This is especially true for the monitoring of antigen-specific CD4+ T cells, as there are large variations in the quality of MHC II tetramers and staining conditions. This commentary provides an overview of our tetramer collection and indications on how tetramers should be used to obtain optimal results.

Relevance: 20.00%

Abstract:

The Self Instructional Math course book is designed to provide basic math knowledge for those involved in the planning, design, and construction of highways. It was developed to allow the student to take the course with minimal supervision, at times the work schedule allows. The first version of the course was developed in the early 1970s and, due to its popularity, was revised in the early 1990s to reflect changes in highway construction math needs. The anticipated move to metric (Système International, SI) measurements by the highway industry has necessitated changing the math course problem values to metric units. The course includes the latest Iowa DOT policy information on the selection and use of metric values for highway design and construction. Each unit of the book contains instructional information, section quizzes and a comprehensive examination. All problem values are expressed in metric rather than dual (English and SI) units. The appendix contains useful conversion factors to assist the reader in making the change to metric.
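Since the book's appendix is a table of conversion factors, the kind of English-to-SI conversion it supports can be sketched in a few lines; the factors below are the standard exact definitions, and the lane-width example is hypothetical, not taken from the course.

```python
# Standard exact english-to-SI conversion factors (by definition).
FT_TO_M = 0.3048      # 1 foot = 0.3048 m
IN_TO_MM = 25.4       # 1 inch = 25.4 mm
MI_TO_KM = 1.609344   # 1 mile = 1.609344 km

def feet_to_meters(feet):
    """Convert a length in feet to meters."""
    return feet * FT_TO_M

# Hypothetical example: a 12-foot lane width in meters
lane_width_m = feet_to_meters(12)  # 3.6576 m
```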

Relevance: 20.00%

Abstract:

The objective of this work was to evaluate the use of correlations of basic density and pulp yield with some chemical parameters, in order to differentiate a homogeneous eucalyptus tree population in terms of its potential for pulp production or other technological applications. Basic density and kraft pulp yield were determined for 120 Eucalyptus globulus trees, and the values were plotted as frequency distributions. Homogenized samples from the first and fourth density quartiles and the first and fourth yield quartiles were submitted to total phenols, total sugars and methoxyl group analysis. Syringyl/guaiacyl (S/G) and syringaldehyde/vanillin (S/V) ratios were determined on the kraft lignins from wood of the same quartiles. The results show the similarity between samples from the high-density and low-yield quartiles, both with lower S/G (3.88-4.12) and S/V (3.99-4.09) ratios and higher total phenols (13.3-14.3 g gallic acid kg-1). Woods from the high-yield quartile are statistically distinguished from all the others by their higher S/G (5.15) and S/V (4.98) ratios and lower total phenols (8.7 g gallic acid kg-1). The methoxyl group and total sugars parameters are better suited to distinguishing wood samples with lower density.
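The quartile-based grouping described above (first versus fourth quartile of density or yield) can be sketched as follows; the function and the density values are illustrative, not the study's data or analysis code.

```python
# Split a set of measurements into its first- and fourth-quartile groups,
# as done for basic density and kraft pulp yield before chemical analysis.

def quartile_groups(values):
    """Return (first-quartile items, fourth-quartile items) by rank."""
    ranked = sorted(values)
    q = len(ranked) // 4
    return ranked[:q], ranked[-q:]

# Hypothetical basic densities (g/cm3) for eight trees
densities = [0.48, 0.52, 0.55, 0.50, 0.60, 0.58, 0.45, 0.62]
low_density, high_density = quartile_groups(densities)
```

The chemical analyses can then be run on the two extreme groups, which is where the abstract's S/G, S/V and total-phenol contrasts come from.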

Relevance: 20.00%

Abstract:

Polyhydroxyalkanoates (PHAs) are polyesters of hydroxyacids naturally synthesized in bacteria as a carbon reserve. PHAs have properties of biodegradable thermoplastics and elastomers, and their synthesis in crop plants is seen as an attractive system for the sustained production of large amounts of polymers at low cost. A variety of PHAs having different physical properties have now been synthesized in a number of transgenic plants, including Arabidopsis thaliana, rape and corn. This has been accomplished through the creation of novel metabolic pathways in the cytoplasm, plastid or peroxisome of plant cells. Beyond its impact in biotechnology, PHA production in plants can also be used to study some fundamental aspects of plant metabolism. Synthesis of PHA can be used both as an indicator and a modulator of the carbon flux to pathways competing for common substrates, such as acetyl-coenzyme A in fatty acid biosynthesis or 3-hydroxyacyl-coenzyme A in fatty acid degradation. Synthesis of PHAs in plant peroxisomes has been used to demonstrate changes in the flux of fatty acids to the beta-oxidation cycle in transgenic plants and mutants affected in lipid biosynthesis, as well as to study the pathway of degradation of unusual fatty acids.

Relevance: 20.00%

Abstract:

Manufactured nanoparticles are being introduced into industrial processes, but they are suspected of causing negative health effects similar to those of ambient particles. Poor knowledge of the scale of this introduction has so far precluded a global risk analysis. In 2006, a targeted telephone survey among Swiss companies (1) showed the usage of nanoparticles in a few selected companies but did not provide data from which to extrapolate to the totality of the Swiss workforce. To gain this information, a stratified representative questionnaire survey among 1,626 Swiss companies was conducted in 2007. Data were collected on the number of potentially exposed persons in the companies and on their protection strategies. The response rate was 58.3%. The study estimated that 586 companies (95% confidence interval: 145 to 1,027) use nanoparticles in Switzerland. An estimated 1,309 workers (1,073 to 1,545) work in the same room as a nanoparticle application. Personal protection was shown to be the predominant type of protective measure. Companies starting production with nanomaterials need to consider incorporating protective measures into their plans. This will not only benefit workers' health but will also likely increase the competitiveness of the companies. Technical and organisational protective measures are not only more cost-effective in the long term but are also easier to control. Guidelines may have to be designed specifically for different industrial applications, including fields outside nanotechnology, and adapted to companies of all sizes.
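The step from a company sample to a population total with a 95% confidence interval can be sketched with a simple normal-approximation estimator; this illustrates the statistical idea only, not the study's stratified estimator, and all numbers below are hypothetical.

```python
import math

def extrapolate_total(n_sample, n_positive, population):
    """Scale a sample proportion up to a population total, with a 95%
    confidence interval from the normal approximation to the binomial."""
    p = n_positive / n_sample
    se = math.sqrt(p * (1 - p) / n_sample)
    estimate = population * p
    half_width = 1.96 * population * se
    return estimate, estimate - half_width, estimate + half_width

# Hypothetical: 9 of 948 responding companies use nanoparticles,
# extrapolated to an assumed population of 60,000 companies
est, low, high = extrapolate_total(948, 9, 60000)
```

A stratified design such as the one used in the study would compute this per stratum (e.g. company size and sector) and sum the stratum totals, which is why the published interval is not reproducible from a single proportion.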

Relevance: 20.00%

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies

The thesis is devoted to the analysis, modelling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence mainly concerned with the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In short, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modelling tools. They can find solutions to classification, regression and probability density modelling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features ("geo-features"). They are well suited to be implemented as predictive engines in decision support systems for environmental data mining, including pattern recognition, modelling and prediction as well as automatic data mapping. They are competitive with geostatistical models in low-dimensional geographical spaces but are indispensable in high-dimensional geo-feature spaces.

The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from a theoretical description of the concepts to software implementation. The main algorithms and models considered are the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF) and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression and density estimation.

Exploratory data analysis (EDA) is the initial and a very important part of any data analysis. In this thesis, the concepts of exploratory spatial data analysis (ESDA) are considered using both a traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relations between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations and helps to detect spatial patterns described by at least two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualisation properties.

An important part of the thesis deals with a current hot topic, the automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model for this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where it significantly outperformed all other approaches, especially under emergency conditions.

The thesis consists of four chapters: theory, applications, software tools and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office, developed over the last 15 years and used both in many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and in fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals; soil type and hydrogeological unit classification; decision-oriented mapping with uncertainties; and natural hazard (landslide, avalanche) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were also developed, with a user-friendly and easy-to-use interface.
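A GRNN, the model proposed in the thesis for automatic mapping, is essentially Nadaraya-Watson kernel regression: the prediction at a query location is a Gaussian-kernel-weighted average of the training values, with a single bandwidth sigma typically tuned by cross-validation. A minimal sketch with illustrative data (not the SIC 2004 dataset):

```python
import math

def grnn_predict(query, train_xy, train_z, sigma):
    """GRNN prediction at `query`: Gaussian-kernel weighted average of
    the training values, with kernel bandwidth `sigma`."""
    num = den = 0.0
    for (x, y), z in zip(train_xy, train_z):
        d2 = (query[0] - x) ** 2 + (query[1] - y) ** 2
        w = math.exp(-d2 / (2 * sigma ** 2))
        num += w * z
        den += w
    return num / den

# Four hypothetical measurements at the corners of a unit square
points = [(0, 0), (1, 0), (0, 1), (1, 1)]
values = [1.0, 2.0, 2.0, 3.0]
# The centre is equidistant from all four points, so the prediction
# is their plain mean
z_hat = grnn_predict((0.5, 0.5), points, values, sigma=0.5)  # 2.0
```

As sigma shrinks, the prediction tends towards the value of the nearest training point; as it grows, towards the global mean, which is the single smoothing trade-off that makes the GRNN attractive for automatic mapping.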

Relevance: 20.00%

Abstract:

The Iowa Department of Education surveyed Iowa's 15 community colleges to gain information about each institution's basic skill assessment requirements for placement into courses and programs. The survey asked which basic skill assessment(s) each institution uses, whether developmental course placement is mandatory, and what scores students need to obtain to avoid being required or urged to take developmental courses in math, science, and reading. Additionally, staff members at each college were asked what the testing requirements are for students enrolled full time in high school who are taking community college classes.