925 results for Domain-specific languages engineering


Relevance: 30.00%

Abstract:

The glycerolipid/free fatty acid (GL/FFA) cycle is a key metabolic pathway linking glucose and fatty-acid metabolism; it comprises two metabolic processes, lipogenesis and lipolysis. The GL/FFA cycle, in particular the lipolysis of triglycerides, generates various signalling molecules that regulate insulin secretion in pancreatic beta cells and non-shivering thermogenesis in adipocytes. The lipolysis-derived lipids involved in this process are currently poorly characterized. Triglyceride hydrolysis in β cells is carried out by the successive actions of adipose triglyceride lipase, which produces diacylglycerol, then hormone-sensitive lipase, which produces monoacylglycerol (MAG), and finally MAG lipase (MAGL), which releases glycerol and fatty acids. Classical MAGL is expressed at very low levels in beta cells, and this study showed that MAG hydrolysis in β cells is carried out mainly by the newly identified α/β-Hydrolase Domain-6 (ABHD6). Inhibiting ABHD6 with its specific inhibitor WWL70 leads to intracellular accumulation of long-chain saturated 1-MAGs, accompanied by increased glucose-stimulated insulin secretion (GSIS). Lowering MAG levels by overexpressing ABHD6 in the INS832/13 beta-cell line reduces GSIS, whereas raising MAG levels by ABHD6 knockdown enhances GSIS. Acute exposure to exogenous monoacylglycerols stimulates insulin secretion in a dose-dependent manner and restores GSIS suppressed by the lipase inhibitor orlistat.
In addition, mice with whole-body inactivation of the ABHD6 gene (ABHD6-KO) and mice with β-cell-specific ABHD6 inactivation show enhanced GSIS, and their islets display increased monoacylglycerol production and insulin secretion in response to glucose. ABHD6 inhibition in diabetic mice (a model induced by low-dose streptozotocin) restores GSIS and improves glucose tolerance. Moreover, the results show that MAGs not only enhance GSIS but also potentiate insulin secretion induced by free fatty acids and by various agents and hormones, without altering glucose oxidation and utilization or fatty-acid oxidation. We demonstrated that MAG binds and activates the vesicle-priming protein Munc13-1, thereby inducing insulin exocytosis. On the basis of these observations, we propose that saturated 1-MAG acts as a metabolic coupling factor regulating insulin secretion and that ABHD6 is a negative modulator of insulin secretion. Beyond its role in beta cells, ABHD6 is also highly expressed in adipocytes, and its level rises with obesity. Mice globally lacking ABHD6 and fed a high-fat diet (HFD) show a modest decrease in food intake, reduced body-weight gain and fasting glycemia, improved glucose tolerance and insulin sensitivity, and increased locomotor activity. ABHD6-KO mice also display increased energy expenditure and cold-induced thermogenesis. Consistent with this, these mice show elevated UCP1 levels in white and brown adipocytes, indicating browning of white adipocytes.
The browning phenotype is reproduced in mice by chronic treatment with either WWL70 (an ABHD6 inhibitor) or antisense oligonucleotides targeting ABHD6. White and brown adipose tissues isolated from ABHD6-KO mice show markedly elevated levels of 1-MAG, but not of 2-MAG. Raising MAG levels, whether by exogenous in vitro administration of 1-MAG or by ABHD6 inhibition or genetic deletion, drives browning of white adipocytes. Further evidence indicates that 1-MAGs can transactivate PPARα and PPARγ, and that the browning effect induced by WWL70 or exogenous MAG is abolished by PPARα and PPARγ antagonists. In vivo administration of the PPARα antagonist GW6471 to ABHD6-KO mice partially reverses the effects of ABHD6 inactivation on body-weight gain and abolishes the increases in thermogenesis, white-adipose-tissue browning, and fatty-acid oxidation in brown adipose tissue. Taken together, these observations indicate that ABHD6 regulates not only insulin and glucose homeostasis but also energy homeostasis and adipose-tissue function. Thus, 1-MAG not only acts as a metabolic coupling factor regulating insulin secretion by activating Munc13-1 in beta cells, but also drives browning of white adipocytes and improves brown-fat function through activation of PPARα and PPARγ. These results indicate that ABHD6 is a promising target for the development of therapies against obesity, type 2 diabetes, and the metabolic syndrome.

Relevance: 30.00%

Abstract:

There are many ways to generate geometrical models for numerical simulation, and most of them start with a segmentation step to extract the boundaries of the regions of interest. This paper presents an algorithm to generate a patient-specific three-dimensional geometric model, based on a tetrahedral mesh, without an initial extraction of contours from the volumetric data. Using the information directly available in the data, such as gray levels, we built a metric to drive a mesh adaptation process. The metric is used to specify the size and orientation of the tetrahedral elements everywhere in the mesh. Our method, which produces anisotropic meshes, gives good results with synthetic and real MRI data. The resulting model quality has been evaluated qualitatively and quantitatively by comparing it with an analytical solution and with a segmentation made by an expert. Results show that, in 90% of the cases, our method gives meshes as good as or better than those of a similar isotropic method, based on the accuracy of the volume reconstruction for a given mesh size. Moreover, a comparison of the Hausdorff distances between the adapted meshes of both methods and ground-truth volumes shows that our method decreases reconstruction errors faster. Copyright © 2015 John Wiley & Sons, Ltd.
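The Hausdorff distance used above to compare reconstructions with ground-truth volumes can be sketched in a few lines. This is a generic illustration on small 2-D point sets, not the paper's implementation (which compares adapted meshes against segmented volumes):

```python
import numpy as np

def hausdorff(A, B):
    """Symmetric Hausdorff distance between point sets A (n,d) and B (m,d)."""
    # full pairwise Euclidean distance matrix (fine for small sets)
    D = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    d_ab = D.min(axis=1).max()  # farthest that any point of A lies from B
    d_ba = D.min(axis=0).max()  # farthest that any point of B lies from A
    return max(d_ab, d_ba)

# toy example: a square's corners vs. the same corners plus one midpoint
A = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
B = np.array([[0.0, 0.0], [0.5, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
print(hausdorff(A, B))  # → 0.5 (the extra point (0.5, 0) is 0.5 from A's nearest corner)
```

For large meshes the O(nm) distance matrix is impractical; a k-d tree nearest-neighbour query is the usual remedy.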

Relevance: 30.00%

Abstract:

In the twentieth century, industry expanded and technology grew with it. This led to collective effort and thinking directed at controlling work-related hazards and accidents; safety management thus developed into an important part of industrial management. While considerable research on safety management in industry has been reported from various parts of the world, literature from India is scarce. It is reasonable to expect that a clear understanding of the critical safety management practices and their relationships with accident rates and management-system certifications would help in the development and implementation of safety management systems. In the first phase of the research, a set of six critical safety management practices was identified from a thorough review of the prescriptive, practitioner, conceptual, and empirical literature. An instrument for measuring the level of these practices was developed, and data were collected through a questionnaire survey in the chemical/process industry. The instrument was empirically validated using the Confirmatory Factor Analysis (CFA) approach. In the second step, the predictive validity of the safety management practices and their relationship with self-reported accident rates and management-system certifications were investigated using ANOVA. The ANOVA results show significant differences in the identified safety management practices across these groups. The determinants of safety performance were investigated using multiple regression analysis, and the inter-relationships between safety management practices, determinants of safety performance, and components of safety performance were investigated with the help of structural equation modeling. Further investigation of the engineering and construction industries reveals that safety climate factors are not stable across industries.
However, some factors are found to be common to industries irrespective of type. This study identifies the critical safety management practices in the major-accident-hazard chemical/process industry from the perspective of employees, and the findings empirically support the necessity of obtaining safety-specific management-system certifications.
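The kind of ANOVA comparison described above, testing whether practice levels differ between groups, can be sketched as a one-way test. The group labels, score scale, and numbers below are invented for the illustration and are not the study's data:

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
# hypothetical mean safety-practice scores (1-5 scale) for plants grouped
# by self-reported accident rate; purely illustrative values
low_accident  = rng.normal(4.2, 0.3, 30)
mid_accident  = rng.normal(3.8, 0.3, 30)
high_accident = rng.normal(3.3, 0.3, 30)

# one-way ANOVA: do the group means differ more than within-group noise allows?
F, p = f_oneway(low_accident, mid_accident, high_accident)
print(F, p)  # a small p-value indicates significant between-group differences
```

A significant F would then justify follow-up analyses such as the regression and structural-equation models the study uses.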

Relevance: 30.00%

Abstract:

In the present work, the structural, optical, and electrical properties of indium sulfide are tuned by specific and controlled doping. Silver, tin, copper, and chlorine were used as the doping elements. The In2S3 thin films for this study were prepared using the simple and low-cost Chemical Spray Pyrolysis (CSP) technique. This technique is adaptable to large-area deposition of thin films in any required shape and facilitates doping and variation of the atomic ratio. It involves spraying a solution, usually aqueous, containing soluble salts of the constituents of the desired compound onto a heated substrate. The doping process was optimized for different doping concentrations. By optimizing the doping conditions, we tuned the structural, optical, and electrical properties of the indium sulfide thin films, making them perform as an ideal buffer layer.

Relevance: 30.00%

Abstract:

Data mining is one of the most active research areas today, with a wide variety of applications in everyday life. It is concerned with finding interesting hidden patterns in a large historical database. For example, from a sales database one can mine a pattern such as "people who buy magazines tend to buy newspapers as well"; from the sales point of view, the advantage is that these items can be placed together in the shop to increase sales. In this research work, data mining is applied to the domain of placement-chance prediction, since making a wise career decision is crucial for every student. In India, technical manpower analysis is carried out by an organization named the National Technical Manpower Information System (NTMIS), established in 1983-84 by India's Ministry of Education & Culture. The NTMIS comprises a lead centre in the IAMR, New Delhi, and 21 nodal centres located in different parts of the country. The Kerala State Nodal Centre is located at Cochin University of Science and Technology; it collects placement information by sending postal questionnaires to graduated students on a regular basis. From the raw data available at the nodal centre, a historical database was prepared. Each record in this database includes entrance-rank range, reservation category, sector, sex, and engineering branch. For each such combination of attributes, the corresponding placement chance is computed and stored in the database. From these data, various popular data mining models are built and tested; the models can be used to predict the most suitable branch for a new student with a given combination of criteria.
A detailed performance comparison of the various data mining models is also carried out. This research work proposes a combination of data mining models, namely a hybrid stacking ensemble, for better predictions. Strategies are also proposed for predicting the overall absorption rate for the various branches, as well as the time it takes for all students of a particular branch to be placed. Finally, this research work puts forward a new data mining algorithm, C4.5*stat, for numeric data sets, which is shown to have competitive accuracy on the standard UCI benchmark data sets, and proposes an optimization strategy called parameter tuning to improve the standard C4.5 algorithm. In summary, this research work spans all four dimensions of a typical data mining study: application to a domain, development of classifier models, optimization, and ensemble methods.
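A stacking ensemble of the kind proposed can be sketched with scikit-learn. The placement database is not publicly available, so a bundled data set stands in, and the choice of base learners and meta-learner is illustrative only:

```python
from sklearn.datasets import load_breast_cancer   # stand-in for the placement records
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# level-0 learners produce cross-validated predictions that become
# the input features of the level-1 meta-learner
stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(random_state=0)),
                ("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
                ("nb", GaussianNB())],
    final_estimator=LogisticRegression(max_iter=1000),
)
stack.fit(X_tr, y_tr)
acc = stack.score(X_te, y_te)
print(round(acc, 3))
```

The heterogeneity of the base learners is the point of stacking: the meta-learner learns where each base model is reliable.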

Relevance: 30.00%

Abstract:

Fine (approximately 18 nm) particles of nickel ferrite were synthesized by the sol-gel technique, and their structural properties were evaluated by X-ray diffraction. Neoprene-based rubber ferrite composites were prepared by incorporating these nickel ferrite powders into the rubber matrix according to a specific recipe. The cure characteristics were analyzed, and the samples were molded into particular shapes whose properties were determined according to ASTM standards. Magnetization studies were carried out using a vibrating sample magnetometer. This study indicates that neoprene rubber-based flexible magnets with the desired magnetic properties and appropriate mechanical properties can be prepared by incorporating an adequate amount of nanoscale nickel ferrite particles within the rubber matrix.

Relevance: 30.00%

Abstract:

The objective of this study was to develop an internet-based seminar framework applicable to landscape architecture education. This process was accompanied by various aims. The basic expectation was to retain the main characteristics of landscape architecture education in the online format as well. Beyond that, four further objectives were anticipated: (1) training of competences for virtual teamwork, (2) fostering intercultural competence, (3) creation of equal opportunities for education through internet-based open access, and (4) synergy effects and learning processes across institutional boundaries. This work started with the hypothesis that these four expected advantages would compensate for the additional organisational effort caused by the online delivery of the seminars and thus lead to a sustainable integration of this new learning mode into landscape architecture curricula. This rationale was followed by a presentation of four areas of knowledge to which the seminar development was directly related: (1) landscape architecture as a subject and its pedagogy, (2) general learning theories, (3) developments in the ICT sector, and (4) wider societal driving forces such as global citizenship and the increase of open educational resources. The research design took the shape of a pedagogical action research cycle. This approach was constructive: the author herself teaches international landscape architecture students, so the model could be applied directly in practice. Seven online seminars were implemented in the period from 2008 to 2013, and this experience represents the core of this study. The seminars were conducted with varying themes, while their pedagogy, organisation, and technological tools remained largely identical.
The research design is further based on three levels of observation: (1) the seminar design on the basis of theory and methods from the learning sciences, in particular educational constructivism, (2) the seminar evaluation, and (3) the evaluation of the seminars' long-term impact. The seminar model itself consists of four elements: (1) the taxonomy of learning objectives, (2) ICT tools and their application and pedagogy, (3) process models, and (4) the case study framework. The seminar framework was followed by the presentation of the evaluation findings. The major findings of this study can be summed up as follows: implementing online seminars across educational and national boundaries was possible both in terms of organisation and technology. In particular, a high level of cultural diversity among the seminar participants was definitely achieved. However, there were also obvious obstacles, primarily competing study commitments and incompatible schedules among the students attending from different academic programmes, partly even in different time zones. Both factors had a negative impact on individual and working-group performance. With respect to the technical framework, it can be concluded that the majority of the participants were able to use the tools either directly without any problem or after overcoming some smaller problems. The seminar wiki was also used intensively for completing the seminar assignments. However, too little genuinely collaborative text production was observed, which could be improved by changing the requirements for the collaborative task. Two different process models were applied for guiding the collaboration of the small groups, and both were in general successful. However, it must be said that even though the students were able to follow the collaborative task and to co-construct and compare case studies, most of them were not able to synthesize the knowledge they had compiled.
This means that the area of consideration often remained at the level of the case, and further reflection, generalisation, and critique were largely missing. This shows that the seminar model needs better ways of triggering knowledge building and critical reflection. A more differentiated group-building strategy was also suggested for future seminars. A comparison of pre- and post-seminar concept maps showed that an increase in factual and conceptual knowledge at the individual level was widely recognizable. The evaluation of the case studies (the major seminar output) also revealed that the students developed in both the factual and the conceptual knowledge domains. Their self-assessment of individual learning development likewise showed that the highest consensus was achieved in the field of subject-specific knowledge. The participants were much more doubtful about their progress in generic competences such as analysis, communication, and organisation. However, 50% of the participants confirmed that they perceived individual development in all competence areas the survey asked about. Have the four additional targets been met? Concerning the competences for working in a virtual team, it can be concluded that the vast majority were able to use the internet-based tools and to work with them in a target-oriented way. However, there were obvious differences in the intensity and activity of participation, due to both external and personal factors. A very positive aspect is the achievement of a high level of cultural diversity, which supported the participants' intercultural competence. Learning from group members was obviously a success factor for the working groups. Regarding improved accessibility of educational opportunities, it became clear that a significant number of participants were not able to go abroad during their studies for financial or personal reasons.
They confirmed that the online seminar was, to some extent, a compensation for not having been able to study abroad. Inter-institutional learning and synergy were achieved insofar as many teachers from different countries contributed individual lectures. However, those teachers hardly ever followed more than one session, so the learning effect remained largely within the seminar learning group. Looking back at the research design, it can be said that the pedagogical action research cycle was an appropriate and valuable approach allowing for strong interaction between theory and practice. However, more external peer evaluation, particularly of the participants' products, would have been valuable.

Relevance: 30.00%

Abstract:

Developments in mammalian cell culture and recombinant technology have allowed the production of recombinant proteins for use as human therapeutics. Mammalian cell culture is typically operated at the physiological temperature of 37 °C. However, recent research has shown that the use of low-temperature conditions (30–33 °C) as a platform for cell culture results in changes in cell characteristics, such as increased specific productivity and extended periods of cell viability, that can potentially improve the production of recombinant proteins. Furthermore, many recent reports have focused on low-temperature culture of Chinese hamster ovary (CHO) cells, one of the principal cell lines used in industrial production of recombinant proteins. Exposure to low ambient temperature exerts an external stress on all living cells and elicits a cellular response. This cold-stress response has been observed in bacteria, plants, and mammals, and is regulated at the gene level. The genes and molecular mechanisms involved in the cold-stress response in prokaryotes and plants have been well studied, and various reports detail the modification of cold-stress genes to improve the characteristics of bacterial or plant cells at low temperatures. However, there is very limited information on mammalian cold-stress genes or the pathways governing the mammalian cold-stress response. This project seeks to investigate and characterise cold-stress genes that are differentially expressed during low-temperature culture of CHO cells, and to relate them to the various changes in cell characteristics observed in such cultures. The gene information can then be used to modify CHO cell lines for improved performance in the production of recombinant proteins.

Relevance: 30.00%

Abstract:

Synechocystis PCC 6803 is a photosynthetic bacterium that has the potential to make bioproducts from carbon dioxide and light. Biochemical production from photosynthetic organisms is attractive because it replaces the typical bioprocessing steps of crop growth, milling, and fermentation with a one-step photosynthetic process. However, low yields and slow growth rates limit the economic potential of such endeavors. Rational metabolic engineering methods are hindered by limited cellular knowledge and inadequate models of Synechocystis. Instead, inverse metabolic engineering, a scheme based on combinatorial gene searches that does not require detailed cellular models but can exploit sequence data and existing molecular-biology techniques, was used to find genes that (1) improve the production of the biopolymer poly-3-hydroxybutyrate (PHB) and (2) increase the growth rate. A fluorescence-activated cell sorting assay was developed to screen for clones producing high levels of PHB. Separately, serial sub-culturing was used to select clones with improved growth rate. Novel gene knockouts were identified that increase PHB production, and others that increase the specific growth rate. These improvements make this system more attractive for industrial use and demonstrate the power of inverse metabolic engineering to identify novel phenotype-associated genes in poorly understood systems.
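The specific growth rate that serial sub-culturing selects for is the standard exponential-phase quantity μ = ln(N₂/N₁)/Δt. A minimal sketch, with illustrative optical-density numbers (not measurements from this work):

```python
import math

def specific_growth_rate(n_start, n_end, hours):
    """Exponential-phase specific growth rate mu = ln(N2/N1) / dt, per hour."""
    return math.log(n_end / n_start) / hours

# illustrative numbers only: optical density doubling from 0.2 to 0.4 in 12 h
mu = specific_growth_rate(0.2, 0.4, 12.0)
print(round(mu, 4))                # ln(2)/12 ≈ 0.0578 per hour
print(round(math.log(2) / mu, 1))  # implied doubling time: 12.0 h
```

Clones with the largest μ outgrow the rest over repeated sub-cultures, which is what makes the selection work without any cellular model.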

Relevance: 30.00%

Abstract:

Experimental and epidemiological studies demonstrate that fetal growth restriction and low birth weight enhance the risk of chronic diseases in adulthood. Derangements in tissue-specific epigenetic programming of fetal and placental tissues are a suggested mechanism, of which DNA methylation is best understood. DNA methylation profiling in human tissue is mostly performed on DNA from white blood cells. The objective of this study was to assess DNA methylation profiles of IGF2 DMR and H19 in DNA derived from four tissues of the newborn. From 6 newborns we obtained DNA from fetal placental tissue (n = 5), umbilical cord CD34+ hematopoietic stem cells (HSC) and CD34- mononuclear cells (MNC) (n = 6), and umbilical cord Wharton's jelly (n = 5). HSC were isolated using magnetic-activated cell separation. DNA methylation of the imprinted fetal growth genes IGF2 DMR and H19 was measured in all tissues using quantitative mass spectrometry. ANOVA testing showed tissue-specific differences in DNA methylation of IGF2 DMR (p = 0.002) and H19 (p = 0.001), mainly due to a higher methylation of IGF2 DMR in Wharton's jelly (mean 0.65, sd 0.14) and a lower methylation of H19 in placental tissue (mean 0.25, sd 0.02) compared to the other tissues. This study demonstrates the feasibility of assessing differential tissue-specific DNA methylation. Although the results have to be confirmed in larger samples, our approach opens opportunities to investigate epigenetic profiles as an underlying mechanism of associations between pregnancy exposures, pregnancy outcomes, and disease risks in later life.

Relevance: 30.00%

Abstract:

Sepsis is a generalized inflammatory response of the organism induced by damage, generally caused by an infectious agent. The pathogen most frequently associated with this condition is Staphylococcus aureus, which induces apoptosis in endothelial cells through the production of ceramide. A protective effect of activated protein C (APC) in sepsis has been described, along with its association with reduced apoptosis of endothelial cells. In this work, the activation of the kinases AKT, ASK1, SAPK/JNK, and p38 was analyzed in a model of endothelial apoptosis using Western blotting and ELISA. Endothelial cells (EA.hy926) were treated with C2-ceramide (130 μM) in the presence of chemical inhibitors of each of these kinases and of APC. Cell survival in the presence of the chemical inhibitors and APC was evaluated through activation assays of caspases 3, 7, and 9, which verified apoptotic cell death. The results show that ceramide reduces AKT activation and increases the activation of the kinases ASK1, SAPK/JNK, and p38, whereas APC exerts the opposite effect. Additionally, thioredoxin was found to increase AKT activation/phosphorylation, while the p38 kinase induces AKT dephosphorylation.

Relevance: 30.00%

Abstract:

The thesis L’ús dels clítics pronominals del català i la seva adquisició per parlants de romanès i de tagal [The use of pronominal clitics in Catalan and their acquisition by Romanian and Tagalog speakers] analyzes the mechanisms of transfer from the L1 in the process of acquisition of Catalan (L2) in two groups of learners, one of which has Romanian and the other Tagalog as their native language. Our study lends support to the idea of transfer from the L1 to a second language in general, and, in particular, within the process of acquisition of pronominal clitics from a Romance language (Catalan). The results show that the differences between the two groups are statistically significant and are attributable to the characteristics of the L1. Moreover, starting from a detailed description of the grammar of pronominal clitics in the three languages involved, we define the specific grammatical aspects of the Tagalog and Romanian languages that can have an influence on certain productions and on certain errors in the use of pronominal clitics in Catalan, within the process of acquisition of this Romance language as L2. In the theoretical domain, we started from studies on functional markedness to determine four reference terms that allowed us to carry out a systematized study of the difficulties in acquisition of the use of Catalan clitic pronouns according to their complexity and their degree of grammaticalization.

Relevance: 30.00%

Abstract:

Wastewater management is a complex task. Many polluting substances are known, but many remain unknown, and their individual or collective effects are difficult to predict. Identifying and evaluating the environmental impacts resulting from the interaction between natural and social systems is a multi-criteria matter. Environmental managers need tools to support their diagnoses in order to solve environmental problems. The contributions of this research work are twofold: first, it proposes an agent-based modelling approach to conceptualize and integrate all the elements that are directly or indirectly involved in wastewater management; second, it proposes an argumentation-based framework that allows the agents to reason effectively. The thesis includes several real examples showing how a framework based on arguing agents can accommodate different interests and different perspectives, and can therefore help build a more informed and effective dialogue and better describe the interactions between the agents. This document first describes the context under study, scaling the global problem of river-basin management down to the management of the urban wastewater system, specifically the industrial-discharge scenario. The system is then analyzed by describing interacting agents. Finally, some prototypes capable of reasoning and deliberation are described, based on non-monotonic logic and a declarative language (answer set programming). It is worth noting that this thesis links two disciplines: environmental engineering (specifically wastewater management) and computer science (specifically artificial intelligence), thus contributing the multidisciplinarity required to tackle the problem under study.
Environmental engineering provides the domain knowledge, while computer science allows us to structure and specify that knowledge.

Relevance: 30.00%

Abstract:

ABSTRACT: Knowledge has always existed, even in a latent state, conditioned somewhere and merely awaiting a means (an opportunity) to manifest itself. Knowledge is doubly a phenomenon of consciousness: it proceeds from consciousness at a given moment of its life and history, and it ends only in consciousness, perfecting and enriching it. Knowledge is therefore in constant change. Knowledge Management is a relatively recent topic, and at first it was strongly associated with Information Technology as a means of collecting, processing, and storing ever larger amounts of information. For some years now, Information Technology has played an extremely important role in organizations: it was initially adopted to automate the operational processes that support their day-to-day activities, and in recent times it has evolved rapidly within organizations. All knowledge, even the least relevant to a given business area, is fundamental to supporting the decision-making process. To achieve better performance and surpass the goals they initially set themselves, organizations tend to equip themselves with more and better Information Systems, and to use the various methodologies and technologies available today. Consequently, in recent years many organizations have shown a crucial need to integrate all of their information, which is dispersed across their constituent departments. For top managers (but also for other employees) to have timely access to pertinent, truthful, and reliable information about the business of the organization they represent, they need access to good Information Technology Systems, so that they can act more effectively and efficiently in decision making, extracting the maximum possible value from the information and thus achieving better levels of organizational success. Business Intelligence systems and their associated Information Technologies likewise use the data existing in organizations to provide relevant information for decision making. But to reach such satisfactory levels, organizations need human resources, for how can they be competitive without skilled workers? Hence the need for organizations to recruit what are nowadays called "Knowledge Workers": individuals able to interpret information within a specific domain. They detect problems and identify alternatives and, with their knowledge and discernment, work to solve those problems, considerably helping the organizations they represent. Using Knowledge Engineering methodologies and technologies such as modeling, they create and manage a history of knowledge, including tacit knowledge, about the organization's various business areas, which can be made explicit in abstract models that other workers with equivalent levels of competence can easily understand and interpret.

Relevância:

30.00%

Publicador:

Resumo:

The chess endgame is increasingly being seen through the lens of, and therefore effectively defined by, a data ‘model’ of itself. It is vital that such models are clearly faithful to the reality they purport to represent. This paper examines that issue and systems engineering responses to it, using the chess endgame as the exemplar scenario. A structured survey has been carried out of the intrinsic challenges and complexity of creating endgame data by reviewing the past pattern of errors during work in progress, surfacing in publications and occurring after the data was generated. Specific measures are proposed to counter observed classes of error-risk, including a preliminary survey of techniques for using state-of-the-art verification tools to generate EGTs that are correct by construction. The approach may be applied generically beyond the game domain.
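The endgame tables (EGTs) discussed above are typically built by backward induction from terminal positions. As a minimal illustration of the idea (not the paper's own method, and for a toy game rather than chess), the sketch below builds a complete win/loss table for the three-stone subtraction game, where the player to move takes 1, 2, or 3 stones and the player who cannot move loses:

```python
# Illustrative sketch only: an "endgame table" built by backward induction
# for the subtraction game (a Nim variant), standing in for the far larger
# retrograde-analysis computations used for real chess EGTs.

def build_table(max_n):
    """Return table where table[n] is True iff the player to move wins
    with n stones remaining."""
    table = [False] * (max_n + 1)  # table[0] = False: no move available, loss
    for n in range(1, max_n + 1):
        # A position is winning iff at least one move reaches a losing position.
        table[n] = any(not table[n - k] for k in (1, 2, 3) if k <= n)
    return table

table = build_table(10)
# Known result for this game: multiples of 4 are losses for the side to move.
assert [n for n in range(11) if not table[n]] == [0, 4, 8]
```

Because every entry is derived only from already-computed successor entries, a table built this way is correct by construction relative to the game rules encoded in the move generator, which is the property the verification techniques surveyed in the paper aim to guarantee at chess scale.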