758 results for aggregation of knowledge
Abstract:
Cover crop characterization may allow comparing the suitability of different species to provide ecological services such as erosion control, nutrient recycling or fodder production. Different techniques to characterize the plant canopy were studied under field conditions in order to establish a methodology for measuring and comparing the canopies of the most common cover crops.
A field trial was established in Madrid (central Spain) to determine the relationship between leaf area index (LAI) and ground cover (GC) in a grass, a legume and a crucifer crop. Twelve plots were sown with either barley (Hordeum vulgare L.), vetch (Vicia sativa L.), or rape (Brassica napus L.). On 10 sampling dates the LAI (both direct and LAI-2000 estimations), the fraction of intercepted photosynthetically active radiation (FIPAR) and the GC were measured. A two-year field experiment (October-April) was established in the same location to evaluate different species (Hordeum vulgare L., Secale cereale L., ×Triticosecale Wittm., Sinapis alba L., Vicia sativa L.) and cultivars (20) according to their suitability for use as cover crops. GC was monitored through digital image analysis on 21 and 22 sampling dates, and biomass was measured 8 and 10 times, respectively, in each season. A Gompertz model characterized ground cover until the decay observed after frosts, while biomass was fitted to Gompertz, logistic and linear-exponential equations. At the end of the experiment the C, N and fiber (neutral detergent, acid detergent and lignin) contents, as well as the N fixed by the legumes, were determined. Multicriteria decision analysis (MCDA) was applied in order to rank the species and cultivars according to their suitability to perform as cover crops in four different modalities: cover crop, catch crop, green manure and fodder. Intercropping legumes with non-legumes may affect the root growth and N uptake of both components of the mixture. Knowledge of how specific root systems affect the growth of the individual species is useful for understanding the interactions in intercrops as well as for planning cover cropping strategies. In a third trial, rhizotron studies were combined with root extraction and species identification by microscopy, and with studies of growth, N uptake and 15N uptake from deeper soil layers. Root interactions in growth and N foraging were studied for two of the best-ranked cultivars from the previous study: a barley (Hordeum vulgare L. cv. Hispanic) and a vetch (Vicia sativa L. cv. Aitana). N was added at 0 (N0), 50 (N1) and 150 (N2) kg N ha-1. In the first study, linear and quadratic models fitted the relationship between GC and LAI well for all of the crops, but in the grass they reached a plateau at LAI > 4. Before reaching full cover, the slope of the linear relationship between both variables was within the range 0.025-0.030. The LAI-2000 readings were linearly correlated with the LAI but tended to overestimate it. Corrections based on the clumping effect reduced the root mean square error of the LAI estimated from the LAI-2000 readings from 1.2 to less than 0.5 for the crucifer and the legume, but were not effective for barley. For this reason, only GC and biomass were measured in the subsequent studies. In the second experiment, the grasses reached the highest ground cover (83-99%) and biomass (1226-1928 g m-2) at the end of the experiment. The grasses also had the highest C/N ratio (27-39) and dietary fiber content (53-60%) and the lowest residue quality (~68%). The mustard presented high GC, biomass and N uptake in the warmer year, similar to the grasses, but low fodder capability in both years. The vetch presented the lowest N uptake (2.4-0.7 g N m-2), owing to N fixation (9.8-1.6 g N m-2), and low biomass accumulation. The thermal time until reaching 30% ground cover was a good indicator of fast-covering species.
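As a rough illustration of the curve fitting described above, the sketch below fits a Gompertz model to ground-cover data and inverts it to obtain the thermal time to 30% cover, the early-vigor indicator mentioned in the abstract. It is a minimal sketch, not the thesis's code: the observations, starting values and parameterization GC(t) = A*exp(-exp(-k*(t - t0))) are all assumptions.

    # Illustrative sketch (hypothetical data): fit a Gompertz ground-cover
    # curve and solve for the thermal time at which GC reaches 30%.
    import numpy as np
    from scipy.optimize import curve_fit

    def gompertz(tt, A, k, t0):
        # A: asymptotic ground cover (%), k: rate, t0: inflection point
        return A * np.exp(-np.exp(-k * (tt - t0)))

    # Hypothetical thermal time (degree-days) vs ground cover (%) data
    tt = np.array([100, 200, 300, 400, 500, 600, 800, 1000, 1200, 1400])
    gc = np.array([2, 6, 15, 30, 48, 63, 82, 90, 93, 94])

    (A, k, t0), _ = curve_fit(gompertz, tt, gc, p0=[95, 0.005, 450])

    # Invert the fitted curve: tt = t0 - ln(-ln(GC/A)) / k
    tt30 = t0 - np.log(-np.log(30.0 / A)) / k
    print(f"Thermal time to 30% ground cover: {tt30:.0f} degree-days")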
Quantifying these variables revealed variability among the species and provided information for further decisions on cover crop selection and management. Aggregating these variables through utility functions allowed ranking species and cultivars for each usage. Grasses were the most suitable for the cover crop, catch crop and fodder uses, while the vetches were the best as green manures. The mustard attained high ranks as a cover and catch crop in the first season, but fell in the second owing to its poor performance in cold winters. Hispanic was the most suitable barley cultivar as a cover and catch crop, and Albacete as fodder. The triticale Titania attained the highest rank as a cover crop, catch crop and fodder. The vetches Aitana and BGE014897 showed good aptitude as green manures and catch crops. MCDA allowed comparison among species and cultivars and may provide relevant information for cover crop selection and management. In the rhizotron study the intercrop and the barley attained higher root intensity (RI) and root depth (RD) than the vetch, with values around 150 crosses m-1 and 1.4 m respectively, compared to 50 crosses m-1 and 0.9 m for the vetch. In the deep soil layers, the intercrop showed slightly larger RI values than the sole-cropped barley. The barley and the intercrop had larger root length density (RLD) values (200-600 m m-3) than the vetch (25-130 m m-3) at 0.8-1.2 m depth. The topsoil N supply did not show a clear effect on RI, RD or RLD; however, increasing topsoil N favored the proliferation of vetch roots in the intercrop in the deep soil layers, with the barley/vetch root ratio falling from 25 at N0 to 5 at N2. The N uptake of the barley was enhanced in the intercrop at the expense of the vetch (from ~100 to ~200 mg plant-1). The intercropped barley roots also took up more labeled nitrogen from the deep layers (0.6 mg 15N plant-1) than the sole-cropped barley roots (0.3 mg 15N plant-1).
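The aggregation of criteria through utility functions that underpins these rankings can be sketched compactly. A minimal sketch follows, assuming simple min-max utilities and a weighted sum; the species scores, criteria and weights are invented placeholders, not the thesis's data or its actual utility functions.

    # Toy MCDA ranking: rescale each criterion to [0, 1] and aggregate
    # with assumed weights. All numbers are hypothetical.
    raw = {  # species: (ground cover %, biomass g m-2, N uptake g N m-2)
        "barley":  (95, 1500, 8.0),
        "rye":     (90, 1800, 7.5),
        "mustard": (85, 1400, 7.0),
        "vetch":   (70,  600, 1.5),
    }
    weights = (0.3, 0.3, 0.4)  # assumed importance per criterion

    cols = list(zip(*raw.values()))
    lo, hi = [min(c) for c in cols], [max(c) for c in cols]

    def utility(values):
        # Min-max utility per criterion (higher assumed better),
        # then a weighted sum across criteria.
        return sum(w * (v - l) / (h - l)
                   for w, v, l, h in zip(weights, values, lo, hi))

    for sp in sorted(raw, key=lambda s: utility(raw[s]), reverse=True):
        print(f"{sp:8s} utility = {utility(raw[sp]):.2f}")

A different weight vector per modality (cover crop, catch crop, green manure, fodder) would yield the per-use rankings the abstract describes.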
Abstract:
Traditionally, the use of data analysis techniques has been one of the main ways of discovering the knowledge hidden in large amounts of data collected by experts in different domains. Visualization techniques have also been used to enhance and facilitate this process. However, there are serious limitations in the process of knowledge acquisition, as it is often a slow, tedious and frequently fruitless process, owing to the difficulty human beings have in understanding large datasets. Another major drawback, rarely considered by the experts who analyze large datasets, is the involuntary degradation to which they subject the data during analysis tasks, prior to drawing the final conclusions. Degradation means that the data can lose part of their original properties; it is usually caused by improper data reduction, which alters the original nature of the data and often leads to erroneous interpretations and conclusions that could have serious implications. Furthermore, this fact gains transcendental importance when the data belong to the medical or biological domain and people's lives depend on the final decision-making, which is sometimes conducted improperly. This is the motivation of this thesis, which proposes a new visual framework, called MedVir, that combines the power of advanced visualization and data mining techniques to try to solve these major problems in the process of discovering valid information. The main objective is to make the process of knowledge acquisition that experts face when working with large datasets in different domains easier, more understandable, more intuitive and faster. To achieve this, first, a strong reduction in the size of the data is carried out in order to make the data easier for the expert to manage, while preserving their original properties intact as far as possible. Then, effective visualization techniques are used to represent the resulting data, allowing the expert to interact easily and intuitively with them and to carry out different data analysis tasks, thereby visually stimulating their capacity for comprehension. The underlying objective is therefore to abstract the expert, as far as possible, from the complexity of the original data by presenting a more understandable version, thus facilitating and accelerating the task of knowledge discovery. MedVir has been successfully applied, among other fields, to magnetoencephalography (MEG), specifically to Traumatic Brain Injury (TBI) rehabilitation prediction. The results obtained demonstrate the effectiveness of the framework in accelerating and facilitating the process of knowledge discovery on real-world datasets.
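The abstract does not detail MedVir's internals, so the following is only a generic sketch of the reduce-then-visualize idea it describes: an aggressive dimensionality reduction followed by a 2-D view the expert can inspect. The library choices (scikit-learn, matplotlib) and the synthetic data are assumptions.

    # Generic reduce-then-visualize sketch (not MedVir itself).
    import matplotlib.pyplot as plt
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    # Stand-in for a large expert dataset (e.g., MEG features):
    # 200 samples x 300 variables, with a binary outcome label.
    X = rng.normal(size=(200, 300))
    y = rng.integers(0, 2, size=200)
    X[y == 1] += 0.8  # inject separable structure for the demo

    # Strong size reduction: keep only two principal components.
    X2 = PCA(n_components=2).fit_transform(X)

    plt.scatter(X2[:, 0], X2[:, 1], c=y, cmap="coolwarm", alpha=0.7)
    plt.xlabel("PC 1")
    plt.ylabel("PC 2")
    plt.title("2-D view of a high-dimensional dataset")
    plt.show()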
Abstract:
Effective automatic summarization usually requires simulating human reasoning, such as abstraction or relevance reasoning. In this paper we describe a solution for this type of reasoning in the particular case of surveillance of the behavior of a dynamic system using sensor data. The paper first presents the approach, describing the required types of knowledge and a possible representation. This includes knowledge about the system's structure, behavior, interpretation and saliency. The paper then shows the inference algorithm, which produces a summarization tree by exploiting the physical characteristics of the system. Finally, the paper illustrates how the method is used for the automatic generation of behavior summaries in an application for basin surveillance in the presence of river floods.
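As a hedged sketch of the kind of inference described, the fragment below condenses river-gauge readings into a small summary by keeping only salient episodes, i.e., readings that exceed an alert threshold. The data model, threshold and names are invented for illustration; the paper's actual knowledge representation is richer (structure, behavior, interpretation and saliency knowledge).

    # Toy behavior-summarization sketch (hypothetical data and rules).
    from collections import defaultdict

    readings = [  # (gauge, hour, water level in m)
        ("gauge_A", 0, 1.2), ("gauge_A", 1, 2.9), ("gauge_A", 2, 3.4),
        ("gauge_B", 0, 0.8), ("gauge_B", 1, 0.9), ("gauge_B", 2, 1.0),
    ]
    ALERT_LEVEL = 2.5  # assumed saliency threshold

    salient = defaultdict(list)  # gauge -> salient (hour, level) pairs
    for gauge, hour, level in readings:
        if level > ALERT_LEVEL:  # saliency filter
            salient[gauge].append((hour, level))

    # One summary line per gauge with salient behavior.
    for gauge, episodes in salient.items():
        hours = [h for h, _ in episodes]
        peak = max(l for _, l in episodes)
        print(f"{gauge}: above {ALERT_LEVEL} m during hours "
              f"{min(hours)}-{max(hours)}, peak {peak} m")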
Abstract:
According to the PMBOK (Project Management Body of Knowledge), project management is “the application of knowledge, skills, tools, and techniques to project activities to meet the project requirements” [1]. Project Management has proven to be one of the most important disciplines in determining the success of any project [2][3][4]. Given that many of the activities covered by this discipline can be said to be “horizontal” across domains, the importance of knowing its concepts and practices becomes even more obvious. Projects in the domain of Software Engineering are no exception to the great influence of Project Management on their success. The critical role that this discipline plays in the industry can be put in numbers. A report by McKinsey & Co [4] shows that establishing programs for teaching critical project management skills can improve project performance in time and cost. As an example, the report states: “One defense organization used these programs to train several waves of project managers and leaders who together administered a portfolio of more than 1,000 capital projects ranging in size from $100,000 to $500 million. Managers who successfully completed the training were able to cut costs on most projects by between 20 and 35 percent. Over time, the organization expects savings of about 15 percent of its entire baseline spending”. In a white paper by the PMI (Project Management Institute) about the value of project management [5], it is stated that: “Leading organizations across sectors and geographic borders have been steadily embracing project management as a way to control spending and improve project results”. According to the research made by the PMI for the paper, after the economic crisis “Executives discovered that adhering to project management methods and strategies reduced risks, cut costs and improved success rates—all vital to surviving the economic crisis”. In every elite company, proper execution of the project management discipline has become a must. Several members of the software industry have put effort into finding ways of assuring high-quality results from projects; many standards, best practices, methodologies and other resources have been produced by experts from different fields of expertise. In industry and the academic community there is continuous research on how to better teach software engineering together with project management [4][6]. For the general practices of Project Management, the PMI produced a guide to the knowledge that any project manager should have in their toolbox to lead any kind of project; this guide is called the PMBOK. On the side of best practices and required knowledge for the Software Engineering discipline, the IEEE (Institute of Electrical and Electronics Engineers) developed the SWEBOK (Software Engineering Body of Knowledge) in collaboration with software industry experts and academic researchers, introducing into the guide much of the knowledge expected of a software engineer with five years of experience [7]. The SWEBOK also covers management from the perspective of a software project. This thesis was developed to provide guidance to practitioners and members of the academic community about project management applied to software engineering.
The way this thesis derives useful information for practitioners is to take an industry-approved guide for software engineering professionals, the SWEBOK, and compare its content to that of the PMBOK. After comparing the contents of the SWEBOK and the PMBOK, what is found missing in the SWEBOK is used to give recommendations on how to enrich the project management skills of a software engineering professional. Recommendations for members of the academic community, on the other hand, are given taking into account the GSwE2009 (Graduate Software Engineering 2009) standard [8]. GSwE2009 is often used as a main reference for software engineering master's programs [9]. The standard is mostly based on the content of the SWEBOK, plus some content considered to reinforce software engineering education. Given the similarities between the SWEBOK and the GSwE2009, the results of comparing the SWEBOK and the PMBOK are also considered valid for enriching what the GSwE2009 proposes. In the end, the recommendations for practitioners are thus also useful to the academic community and its strategies for teaching project management in the context of software engineering.
Abstract:
In Alzheimer disease (AD) the microtubule-associated protein tau is redistributed exponentially into paired helical filaments (PHFs) forming neurofibrillary tangles, which correlate with pyramidal cell destruction and dementia. Amorphous neuronal deposits and PHFs in AD are characterized by aggregation through the repeat domain and C-terminal truncation at Glu-391 by endogenous proteases. We show that a similar proteolytically stable complex can be generated in vitro following the self-aggregation of tau protein through a high-affinity binding site in the repeat domain. Once started, tau capture can be propagated by seeding the further accumulation of truncated tau in the presence of proteases. We have identified a nonneuroleptic phenothiazine previously used in man (methylene blue, MB), which reverses the proteolytic stability of protease-resistant PHFs by blocking the tau-tau binding interaction through the repeat domain. Although MB is inhibitory at a higher concentration than may be achieved clinically, the tau-tau binding assay was used to identify desmethyl derivatives of MB that have Ki values in the nanomolar range. Neuroleptic phenothiazines are inactive. Tau aggregation inhibitors do not affect the tau-tubulin interaction, which also occurs through the repeat domain. Our findings demonstrate that biologically selective pharmaceutical agents could be developed to facilitate the proteolytic degradation of tau aggregates and prevent the further propagation of tau capture in AD.
Abstract:
Atrial natriuretic peptide (ANP) is a 28-aa peptide hormone secreted predominantly from atrial cardiocytes. ANP is first synthesized in the form of a 126-aa precursor (proANP) which is targeted to dense core granules of the regulated secretory pathway. ProANP is stored until the cell receives a signal that triggers the processing and release of the mature peptide (regulated secretion). Various models have been proposed to explain the targeting of selected proteins to the regulated secretory pathway, including specific "sorting receptors" and calcium-mediated aggregation. As potential calcium binding regions had previously been reported in the profragment of ANP, the current study was undertaken in an effort to determine the relationship between the ability of ANP to enter the regulated secretory pathway and its calcium-mediated aggregation. Deletion and site-directed mutagenesis of selected regions of the prosegment demonstrates that acidic amino acids at positions 23 and 24 are critical for both regulated secretion of proANP from transfected AtT-20 cells and calcium-mediated aggregation of purified recombinant proANP in vitro. These results demonstrate that the ability of certain proteins to enter secretory granules is directly linked to their calcium-mediated aggregation.
Abstract:
The progress toward single-dose vaccines has been limited by the poor solid-state stability of vaccine antigens within controlled-release polymers, such as poly(lactide-co-glycolide). For example, herein we report that lyophilized tetanus toxoid aggregates during incubation at 37 degrees C and elevated humidity, i.e., conditions relevant to its release from such systems. The mechanism and extent of this aggregation are dependent on the moisture level in the solid protein, with maximum aggregation observed at intermediate moisture contents. The main aggregation pathway is consistent with formaldehyde-mediated cross-linking, where reactive electrophiles created and stored in the vaccine upon formalinization (exposure to formaldehyde during vaccine preparation) react with nucleophiles of a second vaccine molecule to form intermolecular cross-links. This process is inhibited by the following: (i) succinylating the vaccine to block reactive amino groups; (ii) treating the vaccine with sodium cyanoborohydride, which presumably reduces Schiff bases and some other electrophiles created upon formalinization; and (iii) addition of low-molecular-weight excipients, particularly sorbitol. The moisture-induced aggregation of another formalinized vaccine, diphtheria toxoid, is also retarded by succinylation, suggesting the generality of this mechanism for formalinized vaccines. Hence, mechanistic stability studies of the type described herein may be important for the development of effective single-dose vaccines.
Abstract:
NACP, a 140-amino acid presynaptic protein, is the precursor of NAC [the non-amyloid beta/A4 protein (A beta) component of Alzheimer disease (AD) amyloid], a peptide isolated from and immunologically localized to brain amyloid of patients afflicted with AD. NACP produced in Escherichia coli bound to A beta peptides, the major component of AD amyloid. NACP bound to A beta 1-38 and A beta 25-35 immobilized on nitrocellulose but did not bind to A beta 1-28 on the filter under the same conditions. NACP binding to A beta 1-38 was abolished by addition of A beta 25-35 but not by A beta 1-28, suggesting that the hydrophobic region of the A beta peptide is critical to this binding. NACP-112, a shorter splice variant of NACP containing the NAC sequence, bound to A beta, but NACP delta, a deletion mutant of NACP lacking the NAC domain, did not bind A beta 1-38. Furthermore, binding between NACP-112 and A beta 1-38 was decreased by addition of peptide Y, a peptide that covers the last 15 residues of NAC. In an aqueous solution, A beta 1-38 aggregation was observed when NACP was also present in an incubation mixture at a ratio of 1:125 (NACP/A beta), whereas A beta 1-38 alone or NACP alone did not aggregate under the same conditions, suggesting that the formation of a complex between A beta and NACP may promote aggregation of A beta. Thus, NACP can bind A beta peptides through the specific sequence and can promote A beta aggregation, raising the possibility that NACP may play a role in the development of AD amyloid.
Abstract:
The accumulation of microtubule-associated protein tau into fibrillar aggregates is the hallmark of Alzheimer’s disease and other neurodegenerative disorders, collectively referred to as tauopathies. Fibrils can propagate from one cell to the next and spread throughout the brain. However, a study shows that only small aggregates can be taken up by cultured neuronal cells. The mechanisms that lead to the breakage of fibrils into smaller fragments remain unknown. In yeast, the AAA+ chaperone HSP104 processes the reactivation of protein aggregates and is responsible for fragmentation of fibrils. This study focused on investigating the effects of molecular chaperones on tau fibrils and using HSP104 as a model system to test whether we can monitor fibril fracturing. The assays used to detect the chaperone’s actions on tau utilized acrylodan fluorescence, thioflavin T fluorescence, and sedimentation. Tau fibrils were either formed with a cofactor, heparin, to accelerate assembly or without a cofactor. In the process of investigating the effects of HSP104 on tau fibrils, this study established an assay to determine the effects of breakage on the seeding properties of tau fibrils. Our findings demonstrated that the sonication of tau fibrils produces smaller fragments (seeds) that accelerate the conversion of monomeric tau into fibrils. The use of this assay with HSP104 provided evidence that HSP104 inhibits the elongation of tau fibrils. Indeed, HSP104 inhibits the aggregation of soluble tau into aggregates. However, tau fibril breakage and dissociation were not observed with HSP104, either alone or in combination with co-chaperones (HSP70 and HSP40). Our findings provide insights into the seeding properties of tau fibrils, and suggest that fragmentation is a critical part of tau assembly. This knowledge should be valuable for understanding tau fibril aggregation and propagation in the brain, which is necessary to identify new treatments for neurodegenerative diseases.
Abstract:
This is a guide to developing a theoretical framework for any field of knowledge. It is a rational and organized way to put together everything that is known or has been written about an issue or a problem.
Abstract:
"This report addresses the potential benefits of municipal aggregation of retail electric customers as a means for customers to benefit from the Electric Service Customer Choice and Rate Relief Law of 1997 (Public Act 90-561), referred to in this report as the Customer Choice Law. This report was authorized by the General Assembly on June 26, 2002, in Public Act 92-0585."--P. ii.
Abstract:
In 2003 there was an increase in the use of pulmonary artery catheters in Australia from 12,000 to 16,000 units in intensive care and peri-operative care. This survey of intensive care nurses in five intensive care units in Queensland addressed knowledge of the use, safety and complications of the pulmonary artery catheter, using a previously validated 31-question multiple-choice survey. One hundred and thirty-nine questionnaires were completed, a response rate of 46%. The mean score was 13.3 (standard deviation 4.2) out of a possible 31 (42.8% correct), with a range of 4 to 25. Scores were significantly higher in participants with more ICU experience, a higher nursing grade, a higher self-assessed level of knowledge and greater frequency of PAC supervision. There was no significant correlation between total score and hospital- or university-based education, or between total score and public or private hospital participants. Fifty-one per cent were unable to correctly identify the significant pressure change as the catheter is advanced from the right ventricle to the pulmonary artery.
Abstract:
The WSIS is centrally interested in knowledge and has defined for itself a mission that is broadly humanitarian. Its development ‘talk’ is, rightly, replete with notions of equity, preserving culture, justice, human rights and so on. In incorporating such issues into knowledge society and economy discussions, WSIS has adopted a different posture towards knowledge than is seen in dominant discourses. This study analyses the dominant knowledge discourse using a large corpus of knowledge-related policy documents, discourse theory and an interrelational understanding of knowledge. I show that it is important to understand this dominant knowledge discourse because of its capacity to limit thought and action in relation to its central topic, knowledge. The results of this study demonstrate that the dominant knowledge discourse is technocratic, frequently insensitive to the humane mission at the core of the WSIS, and is based on a partial understanding of what knowledge is and how knowledge systems work. Moreover, I show that knowledge is inherently political, that the dominant knowledge discourse is politically oriented towards the concerns of business and technology, but that an emancipatory politics of knowledge is possible.
Abstract:
This case study concentrates on the extent of knowledge among the Australian public of Australia's tropical bird species, and their willingness to support their conservation. In order to place this issue in context, we provide background information on the status of Australian bird species, focusing attention on species that occur in tropical Australia. Then, using questionnaire survey results, we consider the hypothesis that the public's support for the conservation of different bird species depends on their understanding of the species' existence and status. Based on results from a sample of residents in Brisbane, Queensland, we found that knowledge of bird species that occur exclusively in the Australian tropics (including tropical Queensland) was very poor compared with that of those occurring in the Brisbane area that are relatively common. Experimental results indicated that when respondents in the sample had an option to allocate A$1,000 between 10 bird species listed in the survey, they allocated more funds to the better-known and more common species, unless they were provided with balanced information about all the selected species. With balanced information, the average allocation to bird species confined mostly to the Australian tropics, particularly those threatened, increased. This demonstrates the conservation implications of information provision about bird species. The results showed that public education can play a crucial role in attempts to conserve bird species that are poorly known and threatened.
Abstract:
Knowledge sharing is an essential component of effective knowledge management. However, evaluation apprehension, or the fear that your work may be critiqued, can inhibit knowledge sharing. Using the general framework of social exchange theory, we examined the effects of evaluation apprehension and the perceived benefit of knowledge sharing (such as enhanced reputation) on employees' knowledge sharing intentions in two contexts: interpersonal (i.e., by direct contact between two employees) and database (i.e., via repositories). Evaluation apprehension was negatively associated with knowledge sharing intentions in both contexts, while perceived benefit was only positively associated with knowledge sharing intentions in the database context. Moreover, compared to the interpersonal context, evaluation apprehension was higher and knowledge sharing lower in the database context. Finally, the negative effects of evaluation apprehension upon knowledge sharing intentions were worse when perceived benefits were low compared to when perceived benefits were high.