Abstract:
Fluid handling systems such as pump and fan systems are found to have significant potential for energy efficiency improvements. To realize this energy-saving potential, easily implementable methods are needed to monitor the system output, because information is needed to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable-speed drives have proven able to give information on the system output without additional metering; however, the current model-based methods may not be usable or sufficiently accurate in the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
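The abstract does not name the operating point estimation methods it combines, but a widely used drive-based estimator of this family is the QP-curve method, which infers flow from the drive's internal shaft-power and speed estimates via the pump affinity laws. A minimal sketch, with invented nominal-speed curve data purely for illustration:

```python
import numpy as np

# Assumed example pump curves at nominal speed (not from the paper):
N0 = 1450.0                                    # nominal speed [rpm]
Q0 = np.array([0, 10, 20, 30, 40, 50.0])       # flow [l/s]
P0 = np.array([2.0, 3.1, 4.0, 4.7, 5.2, 5.5])  # shaft power [kW]
H0 = np.array([32, 31, 29, 26, 21, 15.0])      # head [m]

def qp_estimate(p_meas, n):
    """Estimate flow and head from shaft power p_meas [kW] at speed n [rpm],
    using the QP-curve method with the pump affinity laws
    (P ~ n^3, Q ~ n, H ~ n^2)."""
    k = N0 / n
    p_at_n0 = p_meas * k**3                 # scale measured power to nominal speed
    q_at_n0 = np.interp(p_at_n0, P0, Q0)    # invert the (monotonic) QP curve
    q = q_at_n0 / k                         # scale flow back to actual speed
    h = np.interp(q_at_n0, Q0, H0) / k**2   # head from the QH curve, rescaled
    return q, h
```

No flow meter is needed: the drive already estimates shaft power and speed internally, which is the "no additional metering" point made in the abstract.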
Abstract:
Emerging technologies have recently challenged libraries to reconsider their role as mere mediators between collections, researchers, and wider audiences (Sula, 2013), and libraries, especially nationwide institutions like national libraries, have not always managed to face the challenge (Nygren et al., 2014). In the Digitization Project of Kindred Languages, the National Library of Finland has become a node that connects the partners to interact and work towards shared goals and objectives. In this paper, I will draw a picture of the crowdsourcing methods that have been established during the project to support both linguistic research and lingual diversity. The National Library of Finland has been executing the Digitization Project of Kindred Languages since 2012. The project seeks to digitize and publish approximately 1,200 monograph titles and more than 100 newspaper titles in various, and in some cases endangered, Uralic languages. Once the digitization has been completed in 2015, the Fenno-Ugrica online collection will consist of 110,000 monograph pages and around 90,000 newspaper pages, to which all users will have open access regardless of their place of residence. The majority of the digitized literature was originally published in the 1920s and 1930s in the Soviet Union, the genesis and consolidation period of these literary languages. This was the era when many Uralic languages were converted into media of popular education, enlightenment, and dissemination of information pertinent to the developing political agenda of the Soviet state. The 'deluge' of popular literature in the 1920s and 1930s suddenly challenged the lexical and orthographic norms of the limited ecclesiastical publications from the 1880s onward. Newspapers were now written in orthographies and in word forms that the locals would understand. Textbooks were written to address the separate needs of both adults and children. New concepts were introduced in the language.
This was the beginning of a renaissance and period of enlightenment (Rueter, 2013). Linguistically oriented readers will also find writings to their delight, especially lexical items specific to a given publication and orthographically documented specifics of phonetics. The project is financially supported by the Kone Foundation in Helsinki and is part of the Foundation's Language Programme. One of the key objectives of the Kone Foundation Language Programme is to support a culture of openness and interaction in linguistic research, but also to promote citizen science as a tool for the participation of the language community in research. In addition to sharing this aspiration, our objective within the Language Programme is to make sure that old and new corpora in Uralic languages are made available for the open and interactive use of the academic community as well as the language societies. Wordlists are available in 17 languages, but without tokenization, lemmatization, and so on. This approach was verified with the scholars, and we consider the wordlists to be raw data for linguists. Our data are used for creating morphological analyzers and online dictionaries at the universities of Helsinki and Tromsø, for instance. In order to reach these targets, we will produce not only the digitized materials but also development tools to support linguistic research and citizen science. The Digitization Project of Kindred Languages is thus linked with research on language technology. The mission is to improve the usage and usability of digitized content. During the project, we have advanced methods to refine the raw data for further use, especially in linguistic research. How does the library meet these objectives, which appear to lie beyond its traditional playground?
The written materials from this period are a gold mine, so how could we retrieve these hidden treasures of languages from a stack that contains more than 200,000 pages of literature in various Uralic languages? The problem is that the machine-encoded (OCRed) text often contains too many mistakes to be used as such in research, so the mistakes in OCRed texts must be corrected. To enhance the OCRed texts, the National Library of Finland developed an open-source OCR editor that enables the editing of machine-encoded text for the benefit of linguistic research. It was necessary to implement this tool, since these rare and peripheral prints often include long-perished characters, which are sadly neglected by modern OCR software developers but belong to the historical context of the kindred languages and are thus an essential part of the linguistic heritage (van Hemel, 2014). Our crowdsourcing application is essentially an editor for the ALTO XML format. It consists of a back-end for managing users, permissions, and files, communicating through a REST API with a front-end interface, that is, the actual editor for correcting the OCRed text. The enhanced XML files can be retrieved from the Fenno-Ugrica collection for further purposes. Could the crowd do this work to support academic research? The challenge in crowdsourcing lies in its nature. The targets in traditional crowdsourcing have often been split into several microtasks that do not require any special skills from the anonymous people, a faceless crowd. This way of crowdsourcing may produce quantitative results, but from the research point of view there is a danger that the needs of linguists are not met. A further remarkable downside is the lack of a shared goal or social affinity: there is no reward in the traditional methods of crowdsourcing (de Boer et al., 2012).
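In ALTO XML, recognized words are stored in the CONTENT attribute of String elements, so a correction made in the editor amounts to rewriting that attribute. A minimal sketch of the kind of correction the editor enables; the word pair and the stripped-down ALTO fragment are invented for illustration (real ALTO files carry layout and coordinate attributes omitted here):

```python
import xml.etree.ElementTree as ET

ALTO_NS = "http://www.loc.gov/standards/alto/ns-v2#"

# Hypothetical one-word ALTO fragment with an OCR error ("kiijan").
ALTO = f"""<alto xmlns="{ALTO_NS}">
  <Layout><Page><PrintSpace><TextBlock><TextLine>
    <String CONTENT="kiijan"/>
  </TextLine></TextBlock></PrintSpace></Page></Layout>
</alto>"""

def correct_word(alto_xml, wrong, right):
    """Replace a misrecognized word in the CONTENT attribute of every
    matching ALTO <String> element, as an editor back-end might do."""
    root = ET.fromstring(alto_xml)
    for s in root.iter(f"{{{ALTO_NS}}}String"):
        if s.get("CONTENT") == wrong:
            s.set("CONTENT", right)
    return ET.tostring(root, encoding="unicode")
```

Because the correction only touches the CONTENT attributes, the enhanced file keeps its full layout structure and remains usable downstream in the Fenno-Ugrica collection.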
There has also been criticism that the digital humanities make the humanities too data-driven and oriented towards quantitative methods, losing the values of critical qualitative methods (Fish, 2012). On top of that, the downsides of traditional crowdsourcing become more evident when you leave the Anglophone world. Our potential crowd is geographically scattered across Russia. This crowd is linguistically heterogeneous, speaking 17 different languages. In many cases the languages are close to extinction or longing for revitalization, and the native speakers do not always have Internet access, so an open call for crowdsourcing would not have produced satisfactory results for linguists. Thus, one has to identify carefully the potential niches in which to complete the needed tasks. When using the help of a crowd in a project that aims to support both linguistic research and the survival of endangered languages, the approach has to be a different one. In nichesourcing, the tasks are distributed amongst a small crowd of citizen scientists (communities). Although communities provide smaller pools to draw resources from, their specific richness in skill suits the complex tasks with high-quality product expectations found in nichesourcing. Communities have a purpose and identity, and their regular interaction engenders social trust and reputation. These communities can correspond to research needs more precisely (de Boer et al., 2012). Instead of repetitive and rather trivial tasks, we are trying to utilize the knowledge and skills of citizen scientists to provide qualitative results. In nichesourcing, we hand out assignments that precisely fill the gaps in linguistic research. A typical task would be editing and collecting words in those fields of vocabulary where the researchers require more information; for instance, there is a lack of Hill Mari words and terminology in anatomy.
We have digitized books in medicine, and we could try to track the words related to human organs by assigning citizen scientists to edit and collect words with the OCR editor. From the nichesourcing perspective, it is essential that altruism plays a central role when the language communities are involved. In nichesourcing, our goal is to reach a certain level of interplay, where the language communities would benefit from the results. For instance, the corrected words in Ingrian will be added to an online dictionary, which is made freely available to the public, so the society can benefit, too. This objective of interplay can be understood as an aspiration to support the endangered languages and the maintenance of lingual diversity, but also as a servant of 'two masters': research and society.
Abstract:
Please consult the paper edition of this thesis. It is available on the 5th Floor of the Library at Call Number: Z 9999 E38 K535 2008.
Abstract:
This thesis contributes to a general theory of project design. Situated within a demand shaped by the stakes of sustainable development, the main objective of this research is to contribute a theoretical model of design that makes it possible to better situate the use of tools and standards for assessing the sustainability of a project. The fundamental principles of these normative instruments are analyzed along four dimensions: ontological, methodological, epistemological, and teleological. Indicators of certain counterproductive effects linked, in particular, to the application of these standards confirm the need for a theory of qualitative judgment. Our main hypothesis builds on the conceptual framework offered by the notion of the 'precautionary principle', whose first formulations date back to the early 1970s and which aimed precisely to remedy the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of the classical models of design theory (design thinking), it focuses on the evolution of the ways in which sustainability has been taken into account. From this perspective, we observe that theories of 'green design' dating from the early 1960s, as well as theories of 'ecological design' from the 1970s and 1980s, eventually converged with the more recent theories of 'sustainable design' from the early 1990s onward. The various approaches to the 'precautionary principle' are then examined from the standpoint of project sustainability. Standard risk-assessment methods are compared with approaches based on the precautionary principle, revealing certain limits in the design of a project.
A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched out. This model offers a global vision for judging a project that incorporates principles of sustainable development, and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of 'prudence' as it was historically used to guide architectural judgment. What, then, of the challenges posed by the judgment of architectural projects amid the rise of standardized assessment methods (e.g., Leadership in Energy and Environmental Design; LEED)? The thesis proposes a reinterpretation of design theory as formulated by Donald A. Schön as a way of taking assessment tools such as LEED into account. This exercise, however, reveals an epistemological obstacle that must be addressed in a reformulation of the model. In line with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED standardized sustainability assessment method. A preliminary series of 'tensions' is identified in the process of designing and judging the projects. These tensions are then categorized into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualization (analogical/logical); (2) uncertainty (epistemological/methodological); (3) comparability (interpretive/analytical); and (4) proposition (universality/contextual relevance).
These conceptual tensions are regarded as vectors correlating with the theoretical model, which they help to enrich without constituting validations in the positivist sense of the term. These confrontations with reality allow a better definition of the epistemological obstacle identified earlier. This thesis thus highlights the generally underestimated impacts of environmental standardization on the process of designing and judging projects. It takes as its example, in a non-restrictive way, the examination of Canadian architecture competitions for public buildings. The conclusion underscores the need for a new form of 'reflexive prudence' as well as a more critical use of current sustainability assessment tools. It calls for an instrumentation founded on global integration rather than on the opposition of environmental approaches.
Abstract:
Post-transcriptional modifications of messenger RNA (mRNA), such as alternative splicing, play an important role in regulating embryonic development, cell function, and immunity. New evidence reveals that alternative splicing is also involved in regulating the maturation and activation of cells of the hematopoietic system. The factor hnRNP L has been identified as the main regulator of alternative splicing of the gene encoding the CD45 receptor in vitro. CD45 is a tyrosine phosphatase expressed by all cells of the hematopoietic system that controls the development and activation of T lymphocytes. First, we studied the function of hnRNP L in T-lymphocyte development and in CD45 mRNA splicing in vivo, using mice in which the hnRNP L gene was deleted specifically in T cells. Deletion of hnRNP L in thymocytes results in aberrant expression of the various CD45 isoforms, with a predominance of the CD45RA isoform, which is normally absent in the thymus. One consequence of hnRNP L deletion is decreased thymic cellularity caused by a partial block of pre-T cell development at the DN4 stage. This reduction in thymic cell numbers is not linked to increased cell death. Rather, hnRNP L-deficient thymocytes show increased proliferation compared with wild-type thymocytes, due to hyperactivation of the kinases Lck, Erk1/2, and Akt. In addition, deletion of hnRNP L in the thymus causes a loss of peripheral T cells. In vitro experiments suggest that this loss is mainly due to a defect in the migration of hnRNP L-deficient thymocytes from the thymus to the periphery in response to chemokines.
Alternative splicing of CD45 cannot explain this phenotype, but target identification by RNA-Seq revealed a role for hnRNP L in regulating the alternative splicing of factors involved in actin polymerization. Second, we studied the role of hnRNP L in hematopoiesis, using mice in which hnRNP L deletion was specific to hematopoietic cells in fetal livers and bone marrow. Ablation of hnRNP L reduces the number of progenitor cells, including common lymphoid progenitors (CLPs), myeloid progenitors (CMPs, GMPs), and megakaryocyte-erythroid progenitors (MEPs), and causes a loss of mature hematopoietic cells. In contrast to multipotent progenitor cells (MPPs), which are affected in the absence of hnRNP L, the hematopoietic stem cell (HSC) population is not reduced and proliferates more than control cells. However, HSCs lacking hnRNP L are positive for Annexin V and express CD95, suggesting pronounced cell death. As with thymocytes, RNA-Seq analysis of fetal livers revealed various hnRNP L target genes belonging to categories related to cell death, the DNA damage response, and cell adhesion, all of which may explain the phenotype of cells that do not express the hnRNP L gene. These results suggest that hnRNP L and alternative splicing are essential for maintaining the differentiation potential of hematopoietic stem cells and their functional integrity. HnRNP L is also crucial for T-cell development, through the regulation of CD45 splicing, as well as for T-cell migration.
Abstract:
One of the fastest expanding areas of computer exploitation is embedded systems, whose prime function is not computing, but which nevertheless require information processing in order to carry out their prime function. Advances in hardware technology have made multi-microprocessor systems a viable alternative to uniprocessor systems in many embedded application areas. This thesis reports the results of investigations carried out on multi-microprocessors oriented towards embedded applications, with a view to enhancing throughput and reliability. An ideal controller for multiprocessor operation is developed which would smooth the sharing of routines and enable more powerful and efficient code/data interchange. Results of a performance evaluation are appended. A typical application scenario is presented, which calls for classifying tasks based on characteristic features that were identified. The different classes are introduced along with a partitioned storage scheme. A theoretical analysis is also given. A review of schemes available for reducing disc access time is carried out and a new scheme is presented. This is found to speed up database transactions in embedded systems. The significance of software maintenance and adaptation in such applications is highlighted. A novel scheme of providing a maintenance folio for system firmware is presented, along with experimental results. Processing reliability can be enhanced if a facility exists to check whether a particular instruction in a stream is appropriate. The likelihood of occurrence of a particular instruction can be judged more reliably if the number of instructions in the set is small. A new organisation is derived to form the basis for further work. Some early results that will help steer the course of the work are presented.
Abstract:
We extend the relativistic mean-field theory model of Sugahara and Toki by adding new couplings suggested by modern effective field theories. An improved set of parameters is developed with the goal of testing the ability of models based on effective field theory to describe the properties of finite nuclei and, at the same time, to be consistent with the trends of Dirac-Brueckner-Hartree-Fock calculations at densities away from the saturation region. We compare our calculations with other relativistic nuclear force parameter sets for various nuclear phenomena.
Abstract:
The antikaon optical potential in hot and dense nuclear matter is studied within the framework of a coupled-channel self-consistent calculation taking, as the bare meson-baryon interaction, the meson-exchange potential of the Jülich group. Typical conditions found in heavy-ion collisions at GSI are explored. As in the case of zero temperature, the angular momentum components larger than L=0 contribute significantly to the finite-temperature antikaon optical potential at finite momentum. It is found that the particular treatment of the medium effects has a strong influence on the behavior of the antikaon potential with temperature. Our self-consistent model, in which antikaons and pions are dressed in the medium, gives a moderately temperature-dependent antikaon potential which remains attractive at GSI temperatures, contrary to what one finds if only nuclear Pauli blocking effects are included.
Abstract:
The basic idea behind improving local food security rests on two paths: first, accessibility (price, stock) and second, availability (quantity and biodiversity); both are prerequisites for the provision of nutrients and a continuous food supply from locally available resources. The objectives of this thesis are to investigate whether indigenous knowledge still plays an important role in traditional farming in Minangkabau culture, thus supporting local food security; whether indigenous knowledge still plays a role in Minangkabau food culture, linked to the matrilineal role, and leads to sound nutrition; whether marantau influences traditional farming and food culture among the Minangkabau; and whether the local government plays a role in changing traditional farming systems and food culture. Furthermore, this thesis examines whether education and gender play a role in changing the traditional farming system and food culture, and whether the mass media affect traditional farming systems and food culture for the Minangkabau. The study was conducted at four locations in West Sumatera: Nagari Ulakan (NU) (coastal area), Nagari Aia Batumbuak (NAB) (hilly area), Nagari Padang Laweh Malalo (NPLM) (lake area), and Nagari Pandai Sikek (NPS) (hilly area). Annual rainfall ranged from 1400 to 4800 mm, with fertile soils. Data were collected using PRA (Participatory Rural Appraisal) to investigate indigenous knowledge (IK) and its interactions, combined with in-depth interviews, life histories, a survey using a semi-structured questionnaire, pictures, mapping, and expert interviews. The data were collected from June to September 2009 and in June 2010. The materials were: a map of the area, lists of names, questionnaires, a voice recorder, notebooks, and a digital camera. The sampling method was snowball sampling, which yielded both qualitative and quantitative data. For qualitative data, ethnography and life history were used.
For quantitative data, a statistical survey with a semi-structured questionnaire was used; 50 respondents per site participated voluntarily. Data were analyzed using MAXQDA 10 and the F4 audio analysis software (created and developed at Philipps University Marburg). The data were clustered based on causality. The results show the role of IK in the TFS (traditional farming system) at NPLM, which has higher food-crop biodiversity than the other three sites even though it has relatively similar temperature and rainfall. This high food-crop biodiversity is due to the awareness of local people, who realized that they live in an unfavourable climate and topography and are therefore better prepared for any changes that may occur. Carbohydrate intake is 100% from rice, even though different staple crops are grown; most people said in the interviews that, for them, not eating rice is like not really eating. In addition, mothers still play an important role in kitchen activities, but when agricultural income is low, mothers have to decide whether to change the meals or to feel insecure about their food supply. Marantau yields a positive impact through the remittances it provides for investment in the farm; on the other hand, it leaves fewer workers for agriculture and therefore has a negative impact on the transfer of IK. The investigation showed that the local government runs a PTS (Padi Tanam Sabatang) programme, which still does not guarantee that farmers obtain sufficient revenue from their land. The low agricultural income leads to a situation of potential food insecurity. Education is equal among men and women, but in some cases women tend to leave school earlier because of arranged marriages or the distance of the school from their homes. Men predominantly work in agriculture and fishing, while women work in the kitchen. In NAB, even though women work on farmland, they earn less than men.
Weaving (in NPS) and kitchen activity are recognized as women's work, which also supports the household income. The mass media have not, to date, brought about any changes in the TFS or food culture. The traditional farming system has changed because of intensive agricultural extension, which has introduced new methods of agriculture over the last three decades (since the 1980s). There is no evidence that people want to change their food habits because of the mass media, apart from the lapau activity, which offers them more food choices instead of traditional meals prepared at home. The recommendations of this thesis are: 1) Empowerment of farmers, concerning the self-sufficient supply of manure, cooperative seed, and sustainable farm management. Farmers should know where they stand in their state of knowledge, so that they can use their local wisdom while still drawing on new sources of knowledge. Farmers should learn to forecast supply and demand prior to harvest. Farm-management guidelines are needed; these can draw on both local wisdom and modern knowledge. 2) Increase of non-agricultural income. Increasing non-agricultural income is strongly recommended; remittances can be invested in non-agricultural jobs. 3) Empowerment of the mother. The mother plays an important role in farm-to-fork activities and can be an initiator and promoter of cultivating spices in the backyard. Nutritional knowledge can be improved through information and informal public education via arisan ibu-ibu and lapau activities. The challenges in applying these recommendations are: 1) the gap between institutions and organizations of local government, since more than one institution is involved in food-security policy; and 2) the need for training and facilities for field extension agriculture (FEA), because the rapidly changing interaction between local government and farmers depends on this agency.
Abstract:
The interaction of short, intense laser pulses with atoms and molecules produces a multitude of highly nonlinear processes requiring a non-perturbative treatment. A detailed study of these highly nonlinear processes by numerically solving the time-dependent Schrödinger equation becomes a daunting task when the number of degrees of freedom is large, and the coupling between the electronic and nuclear degrees of freedom further aggravates the computational problems. In the present work we show that the time-dependent Hartree (TDH) approximation, which neglects correlation effects, gives an unreliable description of the system dynamics both in the absence and in the presence of an external field. A theoretical framework is required that treats the electrons and nuclei on an equal footing and fully quantum mechanically. To address this issue we discuss two approaches, namely multicomponent density functional theory (MCDFT) and the multiconfiguration time-dependent Hartree (MCTDH) method, that go beyond the TDH approximation and describe the correlated electron-nuclear dynamics accurately. In the MCDFT framework, where the time-dependent electronic and nuclear densities are the basic variables, we discuss an algorithm to calculate the exact Kohn-Sham (KS) potentials for small model systems. By simulating the photodissociation process in a model hydrogen molecular ion, we show that the exact KS potentials contain all the many-body effects and give insight into the system dynamics. In the MCTDH approach, the wave function is expanded as a sum of products of single-particle functions (SPFs). The MCTDH method is able to describe electron-nuclear correlation effects, as the SPFs and the expansion coefficients evolve in time and give an accurate description of the system dynamics.
We show that the MCTDH method is suitable to study a variety of processes such as the fragmentation of molecules, high-order harmonic generation, the two-center interference effect, and the lochfrass effect. We discuss these phenomena in a model hydrogen molecular ion and a model hydrogen molecule. Inclusion of absorbing boundaries in the mean-field approximation and its consequences are discussed using the model hydrogen molecular ion. To this end, two types of calculations are considered: (i) a variational approach with a complex absorbing potential included in the full many-particle Hamiltonian and (ii) an approach in the spirit of time-dependent density functional theory (TDDFT), including complex absorbing potentials in the single-particle equations. It is elucidated that for small grids the TDDFT approach is superior to the variational approach.
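The MCTDH expansion described above can be written out explicitly. For $f$ degrees of freedom $q_1,\dots,q_f$, with $n_\kappa$ time-dependent SPFs for degree of freedom $\kappa$, the ansatz is

```latex
\Psi(q_1,\dots,q_f,t)
  = \sum_{j_1=1}^{n_1}\cdots\sum_{j_f=1}^{n_f}
    A_{j_1\cdots j_f}(t)\,
    \prod_{\kappa=1}^{f}\varphi_{j_\kappa}^{(\kappa)}(q_\kappa,t)
```

Both the expansion coefficients $A_{j_1\cdots j_f}$ and the SPFs $\varphi_{j_\kappa}^{(\kappa)}$ are propagated in time. The TDH approximation is the special case $n_\kappa = 1$, a single Hartree product, which is why it cannot capture the electron-nuclear correlation effects discussed here.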
Abstract:
From the Cold War to the present day, the Korean peninsula has been a region shaken by political, economic, and ideological interests. This situation calls for an analysis of the configuration of, and the changes in, the relationship between the current powers, China and the United States, in light of a North Korean nuclear program that affects South Korea, and of the definition of the interests of Beijing and Washington.
Abstract:
Malignant gliomas represent one of the most aggressive forms of tumors of the central nervous system (CNS). According to the World Health Organization (WHO) classification of brain tumors, astrocytomas have been categorized into four grades, determined by the underlying pathology. Malignant (or high-grade) gliomas thus include anaplastic glioma (grade III) as well as glioblastoma multiforme (GBM, grade IV), the latter being the most aggressive, with the worst prognosis (1). The therapeutic management of CNS tumors is based on surgery, radiotherapy, and chemotherapy, depending on the characteristics of the tumor, the clinical stage, and the patient's age (2), (3); however, none of the standard treatments is completely safe and compatible with an acceptable quality of life (3), (4). In general, chemotherapy is the first option for disseminated tumors, such as invasive glioblastoma and high-risk or multiply metastatic medulloblastoma, but the prognosis for these patients is very poor (2), (3). Only new targeted therapies (2), such as anti-angiogenic therapies (4) or gene therapies, show real benefit in limited groups of patients with known specific molecular defects (4). It is therefore necessary to develop new pharmacological therapies to attack brain tumors. Malignant gliomas are frequently chemoresistant, and this resistance appears to depend on at least two mechanisms. The first is the poor penetration of many anticancer drugs across the blood-brain barrier (BBB), the blood-cerebrospinal fluid barrier (BCSFB), and the blood-tumor barrier (BTB).
This resistance arises from the interaction of the drug with various ABC (ATP-binding cassette) drug-efflux transporters, or pumps, that are overexpressed in the endothelial or epithelial cells of these barriers. The second mechanism is that the ABC efflux transporters of the tumor cells themselves confer a phenotype known as multidrug resistance (MDR), which is characteristic of several solid tumors. This phenotype is also present in CNS tumors, and its role in gliomas is under investigation (5). Drug delivery across the BBB is therefore one of the central problems of targeted therapy. Recent studies have shown that some of the small molecules used in these therapies are substrates of P-glycoprotein (Pgp), as well as of other efflux pumps such as the multidrug resistance-related proteins (MRPs) or the breast cancer resistance protein (BCRP), which prevent drugs of this type from reaching the tumor (1). One substrate of Pgp and BCRP is DOXOrubicin (DOXO), an anticancer drug that is highly effective against brain tumor cells in vitro but whose clinical use is limited by poor delivery across the BBB and by the intrinsic resistance of the tumors. Furthermore, BBB cells and brain tumor cells also carry surface proteins, such as the low-density lipoprotein receptor (LDLR), that could be exploited as a therapeutic target in the BBB and in brain tumors.
The importance of this study thus lies in generating therapeutic strategies that promote drug passage across the blood-brain and blood-tumor barriers, while also identifying the cellular mechanisms that increase ABC transporter expression so that these can be used as therapeutic targets. This study demonstrated that a new "Trojan horse" strategy, in which the drug DOXOrubicin is enclosed within a liposome, shields the drug from recognition by the ABC transporters of both the BBB and the tumor cells. The design of the liposome exploited the cells' LDLR receptor, ensuring entry across the BBB and into the tumor cells through endocytosis. This mechanism was combined with the use of statins (anticholesterol drugs), which increased LDLR expression and reduced ABC transporter activity through nitration of the transporters, thereby increasing the efficiency of our Trojan horse. We therefore demonstrated that this new strategy, a formulation termed ApolipoDOXO combined with statins, promotes drug delivery across the BBB, overcoming tumor resistance and reducing the dose-dependent side effects of DOXOrubicin. Moreover, the "Trojan horse" strategy is a new therapeutic approach that could increase the efficacy of different drugs in various brain tumors, and it remains highly efficient even in the hypoxic environment characteristic of cancer cells, in which expression of the Pgp transporter was found to be increased.
Given the relationship between several signaling pathways recognized as modulators of Pgp activity, this study presents not only the Trojan horse strategy but also a second therapeutic proposal based on the combination of temozolomide (TMZ) and DOXOrubicin. This strategy showed that temozolomide crosses the BBB because it acts on the Wnt/GSK3/β-catenin signaling pathway, which modulates Pgp expression. TMZ was shown to reduce Wnt3 protein and mRNA levels, supporting the hypothesis that, by lowering transcription of the Wnt3 gene in BBB cells, the drug increases pathway activation, phosphorylating β-catenin, reducing nuclear β-catenin, and thus reducing its binding to the mdr1 gene promoter. Based on these results, the study identified three basic mechanisms governing ABC transporter expression, each linked to one of the strategies employed. The first was the use of statins, which led to nitration of the transporters and reduced their activity via the NFκB transcription factor pathway. The second was the use of temozolomide, which methylates the Wnt3 gene, reducing the activity of the β-catenin signaling pathway and thereby lowering Pgp expression. The third was the identification of the RhoA/RhoA kinase axis as a modulator of the (non-canonical) GSK3/β-catenin pathway: RhoA kinase promoted activation of the PTB1 protein, which, by phosphorylating GSK3, induced phosphorylation of β-catenin, leading to its degradation by the proteasome, preventing its binding to the mdr1 promoter, and thus reducing Pgp expression.
In conclusion, the strategies proposed in this work increased tumor cell cytotoxicity by enhancing the permeability not only of the blood-brain barrier but also of the tumor barrier itself. The "Trojan horse" strategy could likewise prove useful for treating other diseases of the central nervous system. These studies further indicate that identifying the mechanisms underlying ABC transporter expression could be a key tool in the development of new anticancer therapies.
Resumo:
Agriculture and industrialization have caused a significant increase in the number of ammonium-rich environments. The presence of nitrogen compounds reduces water quality, causing toxicity problems, degrading the environment, and even affecting human health. Nitrification has consequently become a global process affecting the nitrogen cycle of the biosphere. Ammonia-oxidizing bacteria (AOB) are responsible for the oxidation of ammonium to nitrite and play an essential role in the nitrogen cycle. The first ammonia oxidizers were isolated at the end of the 19th century, but their slow growth and the difficulty of culturing them meant that a thorough knowledge of this bacterial group was not achieved until the 1980s, with the first studies based on the 16S rDNA gene. Databases now contain a multitude of entries with sequences corresponding to AOB. The aim of this work was to find, develop, and evaluate useful and reliable tools for the study of AOB in environmental samples. We first describe the use of fluorescence in situ hybridization (FISH), applying probes targeting the 16S rRNA of AOB. FISH allowed us to detect and count this bacterial group; however, the method could not detect new sequences, so a new tool was needed. To that end, we applied the sequence of the Nso1225 probe in a PCR. Specifically amplifying a fragment of the AOB 16S rDNA provided a new molecular tool for detecting the presence and diversity of these bacteria in natural environments. Nevertheless, some sequences belonging to non-ammonia-oxidizing bacteria of the β subgroup of the Proteobacteria were also obtained with this technique.
A further drawback of using 16S rDNA as a marker is the impossibility of simultaneously detecting AOB belonging to the β and γ subgroups of the Proteobacteria. The amoA gene, which encodes subunit A of the enzyme ammonia monooxygenase (AMO), was at the time widely used as a marker for AOB detection. In this work we also describe the use of this marker on samples from an SBR reactor. It allowed us to identify AOB sequences in the sample, but the need to detect amoA through cloning makes this marker too time-consuming for use in microbial ecology studies involving many samples. Moreover, some authors have reported obtaining non-AOB sequences when using amoA in a PCR-DGGE protocol. In order to obtain a fast and rigorous tool for detecting and identifying AOB, we developed a new primer set targeting the amoB gene, which encodes the transmembrane subunit of the AMO enzyme. This gene proved to be a good molecular marker for AOB, offering high specificity, sensitivity, and reliability regardless of phylogenetic affiliation. We also present an RT-PCR analysis based on detection of the amoB gene for quantifying the genus Nitrosococcus. The newly designed primer set allows a highly specific and sensitive enumeration of all known γ-Nitrosococcus. Finally, we carried out a polygenic study comparing and evaluating the amoA, amoB, and 16S rDNA markers, and constructed a combined phylogenetic tree. We conclude that amoB is a suitable marker for the detection and identification of AOB in environmental samples, while also providing consistent groupings when making phylogenetic inferences.
On the other hand, the full-length 16S rDNA gene sequence is recommended as a marker for studies with taxonomic and phylogenetic aims when working with pure cultures of AOB.
Resumo:
A new control paradigm for Brain Computer Interfaces (BCIs) is proposed. BCIs provide a direct communication channel from the brain to a computer, giving individuals with motor disabilities an additional means of communicating with and controlling their external environment. Traditional BCI control paradigms extract a control signal using motor imagery, frequency rhythm modification, or the Event Related Potential (ERP). A new control paradigm based on speech imagery is first proposed. In addition, a unique system for identifying correlations between components of the EEG and target events is proposed and introduced.
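The abstract does not specify how correlations between EEG components and target events are computed; as a minimal illustrative sketch (not the authors' system), each component's time course can be scored by its Pearson correlation with a binary event-indicator signal. The data here are simulated, and all names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_components = 1000, 4

# Binary indicator: 1 where a target event occurs, 0 elsewhere (simulated).
events = (rng.random(n_samples) < 0.1).astype(float)

# Simulated EEG component time courses; component 0 partly tracks the events.
components = rng.standard_normal((n_samples, n_components))
components[:, 0] += 2.0 * events

def event_correlations(components, events):
    """Pearson correlation of each component with the event indicator."""
    return np.array([np.corrcoef(components[:, i], events)[0, 1]
                     for i in range(components.shape[1])])

scores = event_correlations(components, events)
best = int(np.argmax(np.abs(scores)))  # component most related to the events
```

In this toy setup, the component carrying the injected event signal receives a markedly higher score than the pure-noise components, which is the kind of ranking such a system would need to produce.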
Resumo:
1. The feeding rates of many predators and parasitoids exhibit type II functional responses, with a decelerating rate of increase towards an asymptotic value as the density of their prey or hosts increases. Holling's disc equation describes such relationships and predicts that the asymptotic feeding rate at high prey densities is set by handling time, while the rate at which feeding rate increases with prey density is determined by searching efficiency. Searching efficiency and handling time are also parameters in other models of the functional response. Models which incorporate functional responses in order to predict the effects of food shortage thus rely on a clear understanding and accurate quantification of searching efficiency and handling time. 2. Blackbirds Turdus merula exhibit a type II functional response and use pause-travel foraging, a technique in which animals search for prey while stationary and then move to capture them. Pause-travel foraging allows accurate direct measurement of feeding rate, searching efficiency, and handling time. We use blackbirds as a model species to: (i) compare observed measures of both searching efficiency and handling time with those estimated by statistically fitting the disc equation to the observed functional response; and (ii) investigate alternative measures of searching efficiency derived by the established method, in which the search area is assumed to be circular, and by a new method that we propose, in which it is not. 3. We find that the disc equation can adequately explain the functional response of blackbirds feeding on artificial prey. However, this depends critically upon how searching efficiency is measured. Two variations on the previous method of measuring search area (a component of searching efficiency) overestimated searching efficiency, and hence predicted feeding rates higher than those observed.
Two variations of our alternative approach produced lower estimates of searching efficiency, closer to that obtained by fitting the disc equation, and hence predicted feeding rate more accurately. Our study shows the limitations of the previous method of measuring searching efficiency and describes a new method for measuring it more accurately.
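The behaviour described above, an asymptotic feeding rate set by handling time and an initial slope set by searching efficiency, can be sketched with Holling's disc equation. The parameter values below are illustrative, not the study's fitted blackbird estimates:

```python
# Holling's type II disc equation: feeding rate as a function of prey
# density N, searching efficiency a, and handling time h.

def disc_equation(N, a, h):
    """Feeding rate f(N) = a*N / (1 + a*h*N)."""
    return a * N / (1.0 + a * h * N)

a, h = 0.5, 2.0       # searching efficiency and handling time (arbitrary units)
asymptote = 1.0 / h   # predicted maximum feeding rate at high prey density

# Feeding rate rises with prey density but saturates towards 1/h = 0.5.
rates = [disc_equation(N, a, h) for N in (1, 10, 100, 1000)]
```

Because the asymptote `1/h` depends only on handling time, an overestimate of searching efficiency `a` inflates the predicted feeding rate at low and intermediate prey densities, which is exactly the failure mode the study reports for the earlier measurement method.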