923 results for Complex non-linear paradigm, Non-linearity


Relevance: 50.00%

Abstract:

Background: Frailty in older adults is a multifactorial syndrome defined by low metabolic reserve, decreased resistance to stressors, and difficulty in maintaining organic homeostasis due to the cumulative decline of multiple physiological systems. The relationship between frailty and cognition remains unclear, and studies on Mini-Mental State Examination (MMSE) performance and frailty are scarce. The objective was to examine the association between frailty and cognitive functioning as assessed by the MMSE and its subdomains. Methods: A cross-sectional population-based study (FIBRA) was carried out in Ermelino Matarazzo, a poor subdistrict of the city of São Paulo, Brazil. Participants were 384 community-dwelling older adults, aged 65 years and older, who completed the MMSE and a protocol to assess the frailty criteria described in the Cardiovascular Health Study (CHS). Results: Frail older adults had significantly worse performance on the MMSE (p < 0.001 for total score). Linear regression analyses showed that the MMSE total score was influenced by age (p < 0.001), education (p < 0.001), family income (p < 0.001), and frailty status (p < 0.036). Being frail was most strongly associated with worse scores in Time Orientation (p < 0.004) and Immediate Memory (p < 0.001). Conclusions: Our data suggest that being frail is associated with worse cognitive performance as assessed by the MMSE. It is recommended that the assessment of frail older adults include an investigation of their cognitive status.

Relevance: 50.00%

Abstract:

Abstract Background The mitochondrial DNA of kinetoplastid flagellates is distinctive in the eukaryotic world due to its massive size, complex form and large sequence content. Comprised of catenated maxicircles that contain rRNA and protein-coding genes and thousands of heterogeneous minicircles encoding small guide RNAs, the kinetoplast network has evolved along with an extreme form of mRNA processing in the form of uridine insertion and deletion RNA editing. Many maxicircle-encoded mRNAs cannot be translated without this post-transcriptional sequence modification. Results We present the complete sequence and annotation of the Trypanosoma cruzi maxicircles for the CL Brener and Esmeraldo strains. Gene order is syntenic with Trypanosoma brucei and Leishmania tarentolae maxicircles. The non-coding components have strain-specific repetitive regions and a variable region that is unique for each strain with the exception of a conserved sequence element that may serve as an origin of replication, but shows no sequence identity with L. tarentolae or T. brucei. Alternative assemblies of the variable region demonstrate intra-strain heterogeneity of the maxicircle population. The extent of mRNA editing required for particular genes approximates that seen in T. brucei. Extensively edited genes were more divergent among the genera than non-edited and rRNA genes. Esmeraldo contains a unique 236-bp deletion that removes the 5'-ends of ND4 and CR4 and the intergenic region. Esmeraldo shows additional insertions and deletions outside of areas edited in other species in ND5, MURF1, and MURF2, while CL Brener has a distinct insertion in MURF2. Conclusion The CL Brener and Esmeraldo maxicircles represent two of three previously defined maxicircle clades and promise utility as taxonomic markers. Restoration of the disrupted reading frames might be accomplished by strain-specific RNA editing. Elements in the non-coding region may be important for replication, transcription, and anchoring of the maxicircle within the kinetoplast network.

Relevance: 50.00%

Abstract:

Several theories of tidal evolution, beginning with the one developed by Darwin in the nineteenth century, are based on the figure of equilibrium of the tidally deformed body. Frequently the adopted figure is a Jeans prolate spheroid. In some cases, however, rotation is important and Roche ellipsoids are used. The main limitations of these models are that (a) they refer to homogeneous bodies and (b) the rotation axis is assumed perpendicular to the plane of the orbit. This communication presents several results in which these hypotheses are relaxed. Concerning non-homogeneity, the results presented refer initially to bodies formed by N homogeneous layers; we study the non-sphericity of each layer and relate it to the density distribution. The result is similar to the Clairaut figure of equilibrium often used in planetary sciences, but takes the tidal deformation fully into account. The case of a rotation axis not perpendicular to the orbital plane is much more complex, and the study has so far been restricted to homogeneous bodies.
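As a rough illustration of the layered approach, the classical (purely rotational) Clairaut equation relates the flattening of each internal equipotential layer to the density profile; the sketch below integrates it for a hypothetical two-layer body. The densities, core radius and normalisation are invented for the example, and the tidal extension discussed above is not included.

    import numpy as np
    from scipy.integrate import solve_ivp, trapezoid

    # Hypothetical two-layer body (dense core + lighter mantle); all values are
    # illustrative and in arbitrary units (total radius normalised to 1).
    R_CORE, RHO_CORE, RHO_MANTLE = 0.5, 3.0, 1.0

    def rho(r):
        return RHO_CORE if r < R_CORE else RHO_MANTLE

    def mean_rho(r):
        # Mean density inside radius r: (3 / r^3) * integral_0^r rho(s) s^2 ds
        s = np.linspace(0.0, r, 400)
        return 3.0 / r**3 * trapezoid([rho(x) * x**2 for x in s], s)

    def clairaut(r, y):
        # Classical Clairaut equation for the flattening eps(r) of each layer:
        #   r^2 eps'' + 6 (rho / mean_rho) (r eps' + eps) - 6 eps = 0
        eps, deps = y
        ratio = rho(r) / mean_rho(r)
        return [deps, (6.0 * eps - 6.0 * ratio * (r * deps + eps)) / r**2]

    # Regular solution near the centre: eps finite, eps' -> 0; the absolute scale
    # would be fixed afterwards by the surface condition (rotation/tide strength).
    sol = solve_ivp(clairaut, (1e-4, 1.0), [1.0, 0.0], max_step=0.01, rtol=1e-8)
    print("central / surface flattening:", sol.y[0, 0] / sol.y[0, -1])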

Relevance: 50.00%

Abstract:

The dengue virus (DENV) non-structural 1 (NS1) protein plays a critical role in viral RNA replication and occupies a central position in DENV pathogenesis. DENV NS1 is a glycoprotein expressed in infected mammalian cells as soluble monomers that dimerize in the lumen of the endoplasmic reticulum; NS1 is subsequently transported to the cell surface, where it remains membrane-associated or is secreted into the extracellular milieu as a hexameric complex. During the last three decades, the DENV NS1 protein has also been intensively investigated as a potential target for vaccines and antiviral drugs. In addition, NS1 is the major diagnostic marker for dengue infection. This review highlights some important issues regarding the role of NS1 in DENV pathogenesis and its biotechnological applications, both as a target for the development of safe and effective vaccines and antiviral drugs and as a tool for the generation of accurate diagnostic methods.

Relevance: 50.00%

Abstract:

This work provides a step forward in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we brought the reader through the fundamental notions of probability and stochastic processes, stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focused on fractional Brownian motion (fBm) and its discrete-time increment process, fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modelling and estimation of long memory, or long-range dependence (LRD). Time series exhibiting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We studied LRD in depth, giving many real-data examples, providing statistical analyses and introducing parametric estimation methods. Then, we introduced the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modelling systems with long-memory properties. After having introduced the basic concepts, we provided many examples and applications. For instance, we investigated the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems deviating from the classical exponential Debye pattern. Then, we focused on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations have been obtained by using fractional integrals and derivatives of distributed orders. In order to find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduced and studied the generalized grey Brownian motion (ggBm), which is a parametric class of H-sssi processes whose marginal probability density functions evolve in time according to a partial integro-differential equation of fractional type. The ggBm is, of course, non-Markovian. Throughout the work we have remarked many times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focused on a subclass made up of processes with stationary increments. The ggBm has been defined canonically in the so-called grey noise space. However, we have been able to provide a characterization that is independent of the underlying probability space. We also pointed out that the generalized grey Brownian motion is a direct generalization of a Gaussian process; in particular, it generalizes both Brownian motion and fractional Brownian motion. Finally, we introduced and analysed a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We started from the forward drift equation, which has been made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation has been interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then considered the subordinated process Y(t)=X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation which involves the same memory kernel K(t). We developed several applications and derived the exact solutions. Moreover, we considered different stochastic models for the given equations, providing path simulations.
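As a concrete illustration of a long-range-dependent process of the kind discussed above, the following sketch (not taken from the thesis) generates fractional Gaussian noise from its exact autocovariance via a Cholesky factorization and then recovers the Hurst exponent with the aggregated-variance method; the sample size, the value H = 0.8 and the block sizes are arbitrary choices.

    import numpy as np

    def fgn_cholesky(n, H, rng):
        """Exact fractional Gaussian noise of length n with Hurst exponent H."""
        k = np.arange(n)
        # Autocovariance of fGn: gamma(k) = 0.5 (|k+1|^2H - 2|k|^2H + |k-1|^2H)
        gamma = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H)
                       + np.abs(k - 1) ** (2 * H))
        cov = gamma[np.abs(k[:, None] - k[None, :])]     # Toeplitz covariance matrix
        return np.linalg.cholesky(cov) @ rng.standard_normal(n)

    def hurst_aggregated_variance(x, block_sizes):
        """Estimate H from the scaling Var(X^(m)) ~ m^(2H-2) of block means."""
        log_m, log_v = [], []
        for m in block_sizes:
            blocks = x[: len(x) // m * m].reshape(-1, m).mean(axis=1)
            log_m.append(np.log(m))
            log_v.append(np.log(blocks.var()))
        slope, _ = np.polyfit(log_m, log_v, 1)
        return 1.0 + slope / 2.0

    rng = np.random.default_rng(0)
    x = fgn_cholesky(2048, H=0.8, rng=rng)
    print("estimated H:", hurst_aggregated_variance(x, [4, 8, 16, 32, 64]))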

Relevance: 50.00%

Abstract:

Self-organisation is increasingly being regarded as an effective approach to tackling the complexity of modern systems. The self-organisation approach allows the development of systems that exhibit complex dynamics and adapt to environmental perturbations without requiring complete knowledge of the future surrounding conditions. However, the development of self-organising systems (SOS) is driven by principles different from those of traditional software engineering. For instance, engineers typically design systems by combining smaller elements, where the composition rules depend on the reference paradigm but typically produce predictable results. Conversely, SOS display non-linear dynamics, which can hardly be captured by deterministic models, and, although robust with respect to external perturbations, are quite sensitive to changes in inner working parameters. In this thesis, we describe methodological aspects concerning the early design stage of SOS built on the multiagent paradigm: in particular, we refer to the A&A metamodel, where MAS are composed of agents and artefacts, i.e. environmental resources. Then, we describe an architectural pattern extracted from a recurrent solution in designing self-organising systems: this pattern is based on a MAS environment formed by artefacts, modelling non-proactive resources, and environmental agents acting on artefacts so as to enable self-organising mechanisms. In this context, we propose a scientific approach for the early design stage of the engineering of self-organising systems: the process is iterative, and each cycle is articulated in four stages: modelling, simulation, formal verification, and tuning. During the modelling phase we mainly rely on the existence of a self-organising strategy observed in Nature and, hopefully, encoded as a design pattern. Simulations of an abstract system model are used to drive design choices until the required quality properties are obtained, thus providing guarantees that the subsequent design steps will lead to a correct implementation. However, system analysis based exclusively on simulation results does not provide sound guarantees for the engineering of complex systems: to this purpose, we envision the application of formal verification techniques, specifically model checking, in order to exactly characterise the system behaviours. During the tuning stage, parameters are tweaked in order to meet the target global dynamics and feasibility constraints. In order to evaluate the methodology, we analysed several systems; in this thesis we describe only three of them, i.e. the most representative ones for each of the three years of the PhD course. We analyse each case study using the presented method and describe the formal tools and techniques exploited.

Relevance: 50.00%

Abstract:

Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing, at runtime, typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a basic point of the scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers, and as a result most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models thus becomes fundamental for comparing and evaluating the methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is at least clear that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent system community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions - entities of the environment encapsulating some functions - and topology abstractions - entities of the environment that represent its (either logical or physical) spatial structure. In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology, where environment abstractions and layering principles are exploited for engineering multi-agent systems.

Relevance: 50.00%

Abstract:

This work deals with some classes of linear second-order partial differential operators with non-negative characteristic form and underlying non-Euclidean structures. These structures are determined by families of locally Lipschitz-continuous vector fields in R^N, generating metric spaces of Carnot-Carathéodory type. The Carnot-Carathéodory metric related to a family {Xj}j=1,...,m is the control distance obtained by minimizing the time needed to go from one point to another along piecewise trajectories of the vector fields. We are mainly interested in the cases in which a Sobolev-type inequality holds with respect to the X-gradient, and/or the X-control distance is doubling with respect to the Lebesgue measure in R^N. This study is divided into three parts (each corresponding to a chapter), and the subject of each one is a class of operators that includes the class of the subsequent one. In the first chapter, after recalling "X-ellipticity" and related concepts introduced by Kogoj and Lanconelli in [KL00], we show a maximum principle for linear second-order differential operators for which we only assume a Sobolev-type inequality together with a summability condition on the lower-order terms. Adding some crucial hypotheses on the measure and on the vector fields (doubling property and Poincaré inequality), we are able to obtain some Liouville-type results. This chapter is based on the paper [GL03] by Gutiérrez and Lanconelli. In the second chapter we treat some ultraparabolic equations on Lie groups. In this case R^N is the support of a Lie group and, moreover, we require that the vector fields satisfy left invariance. After recalling some results of Cinti [Cin07] about this class of operators and the associated potential theory, we prove a scalar convexity for mean-value operators of L-subharmonic functions, where L is our differential operator. In the third chapter we prove a necessary and sufficient condition of regularity of boundary points for the Dirichlet problem on an open subset of R^N related to a sub-Laplacian. On a Carnot group we give the essential background for this type of operator, and we introduce the notion of "quasi-boundedness". Then we show the strict relationship between this notion, the fundamental solution of the given operator, and the regularity of boundary points.
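For reference, the control distance mentioned above admits the following standard textbook formulation (not quoted from the thesis): a curve is admissible when its velocity is a combination of the fields X_j with coefficient vector of norm at most one, and the distance is the shortest time needed to join the two points along admissible curves.

    \[
    d_X(x,y)=\inf\Bigl\{\,T>0:\ \exists\,\gamma\in\mathrm{AC}\bigl([0,T];\mathbb{R}^N\bigr),\ \gamma(0)=x,\ \gamma(T)=y,\
    \dot\gamma(t)=\sum_{j=1}^{m}a_j(t)\,X_j(\gamma(t)),\ \sum_{j=1}^{m}a_j(t)^{2}\le 1\ \text{for a.e. } t\,\Bigr\}.
    \]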

Relevance: 50.00%

Abstract:

The present thesis has as its object of study a project called "Education towards Peace and No Violence", developed in the town of Altinópolis, in the interior of the state of São Paulo, Brazil. Based on the analyses and studies carried out, it was possible to identify the difficulties in the implementation of the project, as well as the positive results evidenced by the reduction in reports of violence in the town. Through careful research of the historical facts, the progress of education over the years, particularly in Brazil, is demonstrated. In addition, a panoramic view of the studies and programmes for peace developed in North America and Western Europe is presented. Education for peace, based on the holistic concept of human development, emerged as a new educational paradigm that works as a crucial instrument in the process of building a peaceful, more humane society grounded in social justice.

Relevance: 50.00%

Abstract:

Ensemble forecasting is a methodology to deal with uncertainties in numerical wind prediction. In this work we propose to apply ensemble methods to the adaptive wind forecasting model presented in previous work. The wind field forecasting is based on a mass-consistent model and a log-linear wind profile, using as input data the forecast wind resulting from Harmonie, a non-hydrostatic dynamic model used experimentally at AEMET with promising results. The mass-consistent model parameters are estimated by using genetic algorithms. The mesh is generated using the meccano method and adapted to the geometry…
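As a small illustration of what a log-linear wind profile provides, the sketch below extrapolates a reference wind speed to other heights using the standard surface-layer form with a linear stability correction; the roughness length, stability parameter and reference values are invented for the example and are not taken from the model described above.

    import numpy as np

    KAPPA = 0.4  # von Karman constant

    def log_linear_wind(z, u_ref, z_ref, z0=0.05, beta_over_L=0.0):
        """Wind speed at height z given a reference speed u_ref measured at z_ref.

        Log-linear profile: u(z) = (u*/kappa) * (ln(z/z0) + beta_over_L * z),
        where beta_over_L lumps the stability correction (0 gives the neutral,
        purely logarithmic profile). All numeric values are illustrative guesses.
        """
        # Friction velocity recovered from the reference measurement.
        u_star = KAPPA * u_ref / (np.log(z_ref / z0) + beta_over_L * z_ref)
        return (u_star / KAPPA) * (np.log(z / z0) + beta_over_L * z)

    # Example: extrapolate a 10 m observation of 6 m/s to 40 m and 80 m.
    for z in (40.0, 80.0):
        print(f"{z:.0f} m: {log_linear_wind(z, u_ref=6.0, z_ref=10.0):.2f} m/s")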

Relevance: 50.00%

Abstract:

Human reasoning is a fascinating and complex cognitive process that can be applied in different research areas such as philosophy, psychology, law and finance. Unfortunately, developing supporting software (for those different areas) able to cope with such complex reasoning is difficult and requires a suitable abstract logical formalism. In this thesis we aim to develop a program whose job is to evaluate a theory (a set of rules) with respect to a Goal and to provide results such as "the Goal is derivable from the KB (of the theory)". In order to achieve this goal we need to analyse different logics and choose the one that best meets our needs. In logic, usually, we try to determine whether a given conclusion is logically implied by a set of assumptions T (the theory). However, when we deal with logic programming we need an efficient algorithm in order to find such implications. In this work we use a logic rather similar to human logic. Indeed, human reasoning requires an extension of first-order logic able to reach a conclusion depending on premises that are not definitely true and belong to an incomplete set of knowledge. Thus, we implemented a defeasible logic framework able to manipulate defeasible rules. Defeasible logic is a non-monotonic logic designed by Nute for efficient defeasible reasoning (see Chapter 2). These kinds of applications are useful in the legal domain, especially if they offer an implementation of an argumentation framework that provides a formal modelling of the game. Roughly speaking, if the theory is the set of laws, a keyclaim is the conclusion that one of the parties wants to prove (and the other wants to defeat), and we add dynamic assertion of rules, namely facts put forward by the parties, then we can play an argumentative challenge between two players and decide whether the conclusion is provable or not, depending on the different strategies performed by the players. Implementing a game model requires a further meta-interpreter able to evaluate the defeasible logic framework; indeed, according to Gödel's theorem (see page 127), we cannot evaluate the meaning of a language using the tools provided by the language itself, but we need a meta-language able to manipulate the object language. Thus, rather than a simple meta-interpreter, we propose a Meta-level containing different Meta-evaluators. The first has been explained above, the second one is needed to perform the game model, and the last one is used to change the game execution and tree derivation strategies.
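To give a flavour of how a defeasible conclusion can be evaluated against a theory, here is a deliberately tiny Python sketch (the thesis itself builds a Prolog-style meta-interpreter; this is not its code). It handles only facts, defeasible rules and a superiority relation, ignores strict rules, defeaters and cyclic theories, and uses an invented example theory.

    # Toy sketch of defeasible provability, loosely in the spirit of Nute's
    # defeasible logic: facts, defeasible rules and a superiority relation.

    def neg(lit):
        return lit[1:] if lit.startswith("~") else "~" + lit

    def provable(lit, facts, rules, sup, seen=frozenset()):
        """Is lit defeasibly provable? (acyclic theories only)"""
        if lit in facts:
            return True
        if lit in seen:                      # naive guard against looping
            return False
        seen = seen | {lit}

        def applicable(r):
            return all(provable(b, facts, rules, sup, seen) for b in r["body"])

        supporters = [r for r in rules if r["head"] == lit and applicable(r)]
        if not supporters or neg(lit) in facts:
            return False
        attackers = [r for r in rules if r["head"] == neg(lit) and applicable(r)]
        # Every applicable attacking rule must be beaten by some applicable supporter.
        return all(any((s["id"], a["id"]) in sup for s in supporters) for a in attackers)

    # Classic example: birds usually fly, penguins usually do not, penguins win.
    rules = [
        {"id": "r1", "body": ["bird"], "head": "flies"},
        {"id": "r2", "body": ["penguin"], "head": "~flies"},
    ]
    facts = {"bird", "penguin"}
    sup = {("r2", "r1")}                     # (superior rule, inferior rule)
    print(provable("flies", facts, rules, sup))    # -> False
    print(provable("~flies", facts, rules, sup))   # -> True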

Relevance: 50.00%

Abstract:

The thesis studies the economic and financial conditions of Italian households, using microeconomic data from the Survey on Household Income and Wealth (SHIW) over the period 1998-2006. It develops along two lines of enquiry. First, it studies the determinants of households' holdings of assets and liabilities and estimates their degree of correlation. After a review of the literature, it estimates two non-linear multivariate models of the interactions between assets and liabilities with repeated cross-sections. Second, it analyses households' financial difficulties. It defines a quantitative measure of financial distress and tests, by means of non-linear dynamic probit models, whether the probability of experiencing financial difficulties is persistent over time. Chapter 1 provides a critical review of the theoretical and empirical literature on the estimation of asset and liability holdings, on their interactions and on households' net wealth. The review stresses the fact that a large part of the literature explains households' debt holdings as a function, among others, of net wealth, an assumption that runs into possible endogeneity problems. Chapter 2 defines two non-linear multivariate models to study the interactions between assets and liabilities held by Italian households. Estimation refers to a pooling of cross-sections of the SHIW. The first model is a bivariate tobit that estimates the factors affecting assets and liabilities and their degree of correlation, with results coherent with theoretical expectations. To tackle the presence of non-normality and heteroskedasticity in the error term, which generate inconsistent tobit estimators, semi-parametric estimates are provided that confirm the results of the tobit model. The second model is a quadrivariate probit on three different assets (safe, risky and real) and total liabilities; the results show the expected patterns of interdependence suggested by theoretical considerations. Chapter 3 reviews the methodologies for estimating non-linear dynamic panel data models, drawing attention to the problems that must be dealt with to obtain consistent estimators. Specific attention is given to the initial-conditions problem raised by the inclusion of the lagged dependent variable in the set of explanatory variables. The advantage of using dynamic panel data models lies in the fact that they make it possible to account simultaneously for true state dependence, via the lagged variable, and for unobserved heterogeneity, via the specification of individual effects. Chapter 4 applies the models reviewed in Chapter 3 to analyse the financial difficulties of Italian households, using information on net wealth as provided in the panel component of the SHIW. The aim is to test whether households persistently experience financial difficulties over time. A thorough discussion is provided of the alternative approaches proposed in the literature (subjective/qualitative indicators versus quantitative indexes) to identify households in financial distress. Households in financial difficulties are identified as those holding amounts of net wealth lower than the value corresponding to the first quartile of the net wealth distribution. Estimation is conducted via four different methods: the pooled probit model, the random-effects probit model with exogenous initial conditions, the Heckman model and the recently developed Wooldridge model. Results obtained from all estimators support the hypothesis of true state dependence and show that, in line with the literature, the less sophisticated models, namely the pooled and exogenous models, over-estimate such persistence.
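As a rough illustration of the dynamic probit setup with Wooldridge-type initial-conditions controls, the sketch below estimates a pooled probit of current financial distress on its lag, the initial-period value and household covariates. The file name, the variable names and the pooled-probit shortcut (rather than the full random-effects estimator) are assumptions made for the example, not the thesis's actual specification.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical panel: one row per household-wave with a binary distress flag.
    # Column names (hh_id, wave, distress, income, age) are invented placeholders.
    df = pd.read_csv("shiw_panel.csv").sort_values(["hh_id", "wave"])

    # Lagged dependent variable: the "true state dependence" channel.
    df["distress_lag"] = df.groupby("hh_id")["distress"].shift(1)

    # Wooldridge-style initial-conditions controls: first-period outcome and
    # within-household mean of the time-varying covariate.
    df["distress_0"] = df.groupby("hh_id")["distress"].transform("first")
    df["income_mean"] = df.groupby("hh_id")["income"].transform("mean")

    est = df.dropna(subset=["distress_lag"])
    X = sm.add_constant(est[["distress_lag", "distress_0", "income", "income_mean", "age"]])
    fit = sm.Probit(est["distress"], X).fit(disp=False)
    print(fit.summary())
    # A positive, significant coefficient on distress_lag is read as evidence of
    # persistence; distress_0 and income_mean absorb part of the unobserved
    # heterogeneity, in the spirit of Wooldridge's initial-conditions approach.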

Relevance: 50.00%

Abstract:

This doctoral thesis investigates the behaviour of complex fluids under shear, in particular the influence of shear flows on structure formation. To this end, a model of such fluids is devised and used within molecular dynamics simulations. First, the equilibrium properties of this model are investigated; among other things, the location of the order-disorder transition from the isotropic to the lamellar phase of the dimers is determined. The influence of shear flows on this lamellar phase is then investigated and compared with analytical theories. Shearing a lamellar phase in parallel orientation induces a reorientation of the director along the flow direction. This causes a reduction in layer thickness with increasing shear rate and, above a threshold value, leads to undulations. Comparable behaviour is also found in lamellar systems that are stretched along the director; however, the type of bifurcation is found to differ in the two cases. Under shear, a transition from lamellae in parallel orientation to perpendicular orientation is found, and the shear stress in the perpendicular orientation is observed to be lower than in the parallel one. Under certain conditions this leads to the appearance of shear bands, which is also observed in the simulations. With a simple model it has thus been possible to reproduce many aspects of the behaviour of complex fluids; structure formation evidently depends only to a limited extent on the local properties of the molecules.

Relevance: 50.00%

Abstract:

The hepatitis C virus (HCV) is an enveloped RNA virus of the family Flaviviridae. Its genome encodes a polyprotein of about 3000 amino acids, which is cleaved co- and post-translationally into its functional units. One of these viral proteins is NS5A, a heavily phosphorylated protein carrying an amphipathic α-helix at its amino terminus, which is responsible for the membrane association of NS5A. The role that phosphorylation plays in the function of the protein, and indeed the function NS5A performs at all, is still unclear. Observations suggest a role for NS5A in the resistance of infected cells to interferon-alpha. Furthermore, NS5A is thought to participate in RNA replication as a component of the membrane-bound HCV replicase complex. The goal of this doctoral thesis was to investigate the function of NS5A in RNA replication. To this end, a series of phosphorylation-site mutants was generated and examined for replication competence and phosphorylation status. We found that certain serine substitutions in the centre of NS5A led to increased RNA replication accompanied by reduced NS5A hyperphosphorylation. We further studied the influence of mutations in the amino-terminal amphipathic α-helix of NS5A on RNA replication as well as on the phosphorylation and subcellular localization of the protein. We found that minor structural changes of the amphipathic helix led to an altered subcellular localization of NS5A, which was accompanied by reduced or completely inhibited RNA replication. In addition, the structural changes interfered with the hyperphosphorylation of the protein, suggesting that the amphipathic helix is an important structural component of the protein, essential for its correct folding and phosphorylation. As further aspects, the trans-complementation capability of the various viral components of the HCV replicase complex was investigated, and cellular interaction partners of NS5A were identified. In summary, the results of this doctoral thesis show that NS5A plays an important role in RNA replication, a function that is probably regulated via the phosphorylation state of the protein.

Relevance: 50.00%

Abstract:

Recent transcriptome-wide analyses have revealed extensive transcription of non-coding RNAs (ncRNAs), whose functions, however, remain largely unknown. In this work it was shown that high doses of camptothecin (CPT), an antitumour drug that inhibits Top1, increase the transcription of two antisense ncRNAs at the 5' and 3' ends of the HIF-1α gene locus (5'aHIF-1α and 3'aHIF-1α, respectively) and decrease the levels of the HIF-1α mRNA itself. The effects of the treatment are Top1-dependent, whereas they do not depend on DNA damage at the replication fork or on the checkpoints activated by DNA damage. The ncRNAs are activated in response to different types of stress; 5'aHIF-1α is about 10 kb long and carries both a 5' cap and 3' polyadenylation (from the literature, 3'aHIF-1α is known to be a 1.7 kb transcript with neither a 5' cap nor polyadenylation). Intracellular localization analyses showed that both are nuclear transcripts. In particular, 5'aHIF-1α co-localizes with proteins of the nuclear pore complex, suggesting a possible role as a mediator of exchanges across the nuclear membrane. Transcription of the two ncRNAs was also demonstrated in human kidney tumour tissues, pointing to possible roles in cancer development. It is also known from the literature that low doses of CPT under hypoxic conditions decrease HIF-1α protein levels. After showing in several cell lines that the two ncRNAs mentioned above could not be responsible for this effect, we studied the changes of the entire miRNome under the new experimental conditions. In this way we discovered that miR-X appears to be the molecular mediator of HIF-1α down-regulation after treatment with low doses of CPT under hypoxia. Taken together, these results suggest that the transcription factor HIF-1α is finely regulated by non-coding RNAs induced by DNA damage.