Abstract:
Composite plants consisting of a wild-type shoot and a transgenic root are frequently used for functional genomics in legume research. Although transformation of roots using Agrobacterium rhizogenes leads to morphologically normal roots, the question arises as to whether such roots interact with arbuscular mycorrhizal (AM) fungi in the same way as wild-type roots. To address this question, roots transformed with a vector containing the fluorescence marker DsRed were used to analyse AM in terms of mycorrhization rate, morphology of fungal and plant subcellular structures, and transcript and secondary metabolite accumulation. Mycorrhization rate, appearance, and developmental stages of arbuscules were identical in both types of roots. Using Mt16kOLI1Plus microarrays, transcript profiling of mycorrhizal roots showed that 222 genes were induced at least 2-fold and 73 genes repressed to less than half their expression, most of them already described as AM-regulated in the same direction in wild-type roots. To verify this, typical AM marker genes were analysed by quantitative reverse transcription-PCR, which revealed equal transcript accumulation in transgenic and wild-type roots. Regarding secondary metabolites, several isoflavonoids and apocarotenoids, all known to accumulate in mycorrhizal wild-type roots, were found to be up-regulated in mycorrhizal compared with non-mycorrhizal transgenic roots. This set of data reveals a substantial similarity in mycorrhization of transgenic and wild-type roots of Medicago truncatula, validating the use of composite plants for studying AM-related effects.
Abstract:
Editing a literary magazine offers us a cultural space where our ideas and aesthetics can be expressed collectively and therefore be heard more effectively. This informs and frames our own writing by increasing our confidence in our own unusual voices. The sense of belonging Brand creates further breaks down the isolation of the writing life. The internationalism of Brand reinforces our own cultural identities as non-English writers. However, acting as a facilitator of others’ creativity can sometimes dissipate or even deplete creative energy. Editing and teaching can take over your writing to the point of annihilation. Further, in terms of external perceptions, you run the risk of disappearing as a writer. We shall look at how this can happen and explore ways to prevent it, e.g. by keeping the boundaries firm and clear.
Abstract:
Clinical and in vitro studies suggest that subchondral bone sclerosis due to abnormal osteoblasts (Ob) is involved in the progression of osteoarthritis (OA). Human OA Ob isolated from sclerotic subchondral bone show an altered phenotype, a reduced level of canonical Wnt/β-catenin signalling, and reduced in vitro mineralization. Two non-canonical pathways, Wnt/PKC and Wnt/PCP, have also been described in the literature; however, no study has addressed these two pathways in OA Ob. These pathways are activated when a non-canonical Wnt ligand such as Wnt-5a binds a Wnt receptor coupled to non-canonical co-receptors. This triggers, for the Wnt/PKC-Ca2+ and Wnt/PCP pathways respectively, phosphorylation of PKC (p-PKC) and phosphorylation of JNK (p-JNK), acting on downstream targets. We wanted to determine whether alterations of the non-canonical Wnt pathways could be observed in OA Ob. We prepared primary human subchondral osteoblast cultures from tibial plateaus of OA patients undergoing total knee arthroplasty, as well as from tibial plateaus collected at autopsy from "normal" patients. The expression of genes involved in the Wnt/PKC and Wnt/PCP pathways was evaluated by RT-qPCR, and protein production by Western blot, together with p-PKC and p-JNK levels and the activity of the NFAT and AP-1 factors used by these two pathways. Alkaline phosphatase activity (ALPase) and osteocalcin (OC) levels were evaluated by substrate hydrolysis and ELISA, respectively. Mineralization was evaluated by Alizarin red staining. Our results show that Wnt-5a expression and production were increased in OA Ob compared with normal Ob, and that LGR5 was significantly more elevated.
Moreover, LGR5 expression is directly regulated by stimulation or reduction of Wnt-5a, at both the mRNA and protein levels. Furthermore, Wnt-5a stimulated JNK and PKC phosphorylation as well as NFAT and AP-1 activity. Mineralization levels, ALPase activity and OC secretion were also affected by changes in Wnt-5a levels. These results suggest that Wnt-5a, which is increased in OA Ob, can stimulate the non-canonical Wnt pathways and affect the phenotype and mineralization of human OA Ob.
Abstract:
The aim of this study is to evaluate sex-related differences in right ventricular (RV) function, assessed with cardiac magnetic resonance imaging, in patients with stable non-ischaemic dilated cardiomyopathy. Mean age was 60.9 ± 12.2 years. Men presented higher levels of haemoglobin and white blood cell counts than women, and performed better in cardiopulmonary stress testing. A total of 24 patients (12 women) presented severe left ventricular (LV) systolic dysfunction, 32 (13 women) moderate and 15 (8 women) mild LV systolic dysfunction. In the group with severe LV systolic dysfunction, average right ventricular ejection fraction (RVEF) was normal in women (52 ± 4%), whereas it was reduced in men (39 ± 3%; p = 0.035). Only one woman (8%) had severe RV systolic dysfunction (RVEF < 35%) compared with 6 men (50%; p < 0.001). In patients with moderate and mild LV dysfunction, the mean RVEF was normal in both men and women. In the 14 healthy volunteers, the lowest value of RVEF was 48%, and mean RVEF was normal in women (56 ± 2%) and in men (51 ± 1%; p = 0.08). In patients with dilated cardiomyopathy, RV systolic dysfunction is found mainly in male patients with severe LV systolic dysfunction.
Abstract:
This thesis is concerned with change point analysis for time series, i.e. with the detection of structural breaks in time-ordered, random data. This long-standing research field has regained popularity over the last few years and, like statistical analysis in general, is still undergoing a transformation towards high-dimensional problems. We focus on the fundamental »change in the mean« problem and provide extensions of the classical non-parametric Darling-Erdős-type cumulative sum (CUSUM) testing and estimation theory within high-dimensional Hilbert space settings. In the first part we contribute to (long run) principal component based testing methods for Hilbert space valued time series under a rather broad (abrupt, epidemic, gradual, multiple) change setting and under dependence. For the dependence structure we consider either traditional m-dependence assumptions or more recently developed m-approximability conditions which cover, e.g., MA, AR and ARCH models. We derive Gumbel and Brownian bridge type approximations of the distribution of the test statistic under the null hypothesis of no change, and consistency conditions under the alternative. A new formulation of the test statistic using projections on subspaces allows us to simplify the standard proof techniques and to weaken common assumptions on the covariance structure. Furthermore, we propose to adjust the principal components by an implicit estimation of a (possible) change direction. This approach adds flexibility to projection based methods, weakens typical technical conditions and provides better consistency properties under the alternative. In the second part we contribute to estimation methods for common changes in the means of panels of Hilbert space valued time series. We analyze weighted CUSUM estimates within a recently proposed »high-dimensional low sample size (HDLSS)« framework, where the sample size is fixed but the number of panels increases.
We derive sharp conditions on »pointwise asymptotic accuracy« or »uniform asymptotic accuracy« of those estimates in terms of the weighting function. In particular, we prove that a covariance-based correction of Darling-Erdős-type CUSUM estimates is required to guarantee uniform asymptotic accuracy under moderate dependence conditions within panels, and that these conditions are fulfilled, e.g., by any MA(1) time series. As a counterexample we show that for AR(1) time series close to the non-stationary case the dependence is too strong and uniform asymptotic accuracy cannot be ensured. Finally, we conduct simulations to demonstrate that our results are practically applicable and that our methodological suggestions are advantageous.
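The classical univariate »change in the mean« CUSUM scan that this theory generalizes can be sketched as follows. This is a minimal finite-sample illustration, not the Hilbert-space or weighted Darling-Erdős version studied in the thesis; the function and variable names are ours:

```python
import numpy as np

def cusum_change_point(x):
    """Classical CUSUM scan for a single change in the mean.

    Returns the maximal (normalized, absolute) CUSUM statistic and the
    index at which it is attained, i.e. the change-point estimate.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = np.arange(1, n)                  # candidate change points 1..n-1
    s_k = np.cumsum(x)[:-1]              # partial sums S_k = x_1 + ... + x_k
    stat = np.abs(s_k - k / n * x.sum()) / np.sqrt(n)
    return stat.max(), int(k[np.argmax(stat)])

# A mean shift of size 1.5 after observation 200:
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 200), rng.normal(1.5, 1.0, 200)])
stat, khat = cusum_change_point(x)       # khat should land near 200
```

Under the null hypothesis of no change, suitably weighted versions of this maximum converge to extreme-value (Gumbel) or Brownian bridge functionals, which is the univariate shadow of the approximations derived above.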
Abstract:
Combined media on photographic paper. 297" x 55", Jargomatique Series.
Abstract:
Metadata that is associated with either an information system or an information object for purposes of description, administration, legal requirements, technical functionality, use and usage, and preservation plays a critical role in ensuring the creation, management, preservation, use and re-use of trustworthy materials, including records. Recordkeeping metadata, of which one key type is archival description, plays a particularly important role in documenting the reliability and authenticity of records and recordkeeping systems, as well as the various contexts (legal-administrative, provenancial, procedural, documentary, and technical) within which records are created and kept as they move across space and time. In the digital environment, metadata is also the means by which it is possible to identify how record components – those constituent aspects of a digital record that may be managed, stored and used separately by the creator or the preserver – can be reassembled to generate an authentic copy of a record, or reformulated per a user’s request as a customized output package. Issues relating to the creation, capture, management and preservation of adequate metadata are, therefore, integral to any research study addressing the reliability and authenticity of digital entities, regardless of the community, sector or institution within which they are being created. The InterPARES 2 Description Cross-Domain Group (DCD) examined the conceptualization, definitions, roles, and current functionality of metadata and archival description in terms of requirements generated by InterPARES 1.
Because of the needs to communicate the work of InterPARES in a meaningful way not only across other disciplines, but also across different archival traditions; to interface with, evaluate and inform existing standards, practices and other research projects; and to ensure interoperability across the three focus areas of InterPARES 2, the Description Cross-Domain Group also addressed its research goals with reference to wider thinking about, and developments in, recordkeeping and metadata. InterPARES 2 addressed not only records, however, but a range of digital information objects (referred to as “entities” by InterPARES 2, but not to be confused with the term “entities” as used in metadata and database applications) that are the products and by-products of government, scientific and artistic activities carried out using dynamic, interactive or experiential digital systems. The nature of these entities was determined through a diplomatic analysis undertaken as part of extensive case studies of digital systems conducted by the InterPARES 2 Focus Groups. This diplomatic analysis established whether the entities identified during the case studies were records, non-records that nevertheless raised important concerns relating to reliability and authenticity, or “potential records.” To be determined to be records, the entities had to meet the criteria outlined by archival theory – they had to have a fixed documentary format and stable content. It was not sufficient that they be considered to be, or treated as, records by the creator. “Potential records” is a new construct indicating that a digital system has the potential to create records upon demand, but does not actually fix and set aside records in the normal course of business.
The work of the Description Cross-Domain Group, therefore, addresses the metadata needs for all three categories of entities. Finally, since “metadata” as a term is used today so ubiquitously, and in so many different ways by different communities, that it is in peril of losing any specificity, part of the work of the DCD sought to name and type categories of metadata. It also addressed incentives for creators to generate appropriate metadata, as well as issues associated with the retention, maintenance and eventual disposition of the metadata that aggregates around digital entities over time.
Abstract:
The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time delay interferometry (TDI) observables, which have to be generated before any weak signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because of the multiple occurrences of the same noises in the different raw data. Originally, these observables were manually generated starting with LISA as a simple stationary array and then adjusted to incorporate the antenna's motions. However, none of the observables survived the flexing of the arms, in that they did not lead to cancellation with the same structure. The principal component approach, presented by Romano and Woan, is another way of handling these noises that simplified the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix which occurs in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that performing an eigendecomposition of this matrix produced two distinct sets of eigenvalues that can be distinguished by the absence of laser frequency noise from one set. The transformation of the raw data using the corresponding eigenvectors also produced data free from the laser frequency noises. This result led to the idea that the principal components may actually be time delay interferometry observables, since they produced the same outcome, that is, data that are free from laser frequency noise.
The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that the data analysis using them is equivalent to that using the traditional observables and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. For testing the connection between the principal components and the TDI observables, a 10 × 10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. Results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables, and therefore analysis using principal components should give the same results as that using the traditional observables. This was proven by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables. This method fails if the eigenvalues that are free from laser frequency noises are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, arm lengths and noise variances.
Preliminary results of the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths which will appear in the covariance matrix and, from our toy model investigations, this did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix will be destroyed, which will affect any computation methods that take advantage of this structure. In terms of separating the two sets of data for the analysis, this was not necessary because the laser frequency noises are very large compared to the photodetector noises, which resulted in a significant reduction in the data containing them after the matrix inversion. In the frequency domain the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing it to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and non-stationarity do not show up because of the summation in the Fourier transform.
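The core mechanism — a dominant noise shared by all readings producing one large covariance eigenvalue, with the noise-free combinations appearing as the remaining eigenvectors — can be illustrated with a toy model. This is a hedged sketch assuming three channels corrupted by a single common noise; it is not the actual LISA measurement model or the Sagnac construction:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
laser = rng.normal(0.0, 100.0, n)        # one large noise common to all channels
readout = rng.normal(0.0, 1.0, (3, n))   # small independent secondary noises
y = laser + readout                      # three toy raw data streams

cov = np.cov(y)                          # 3x3 inter-channel covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

# Two small eigenvalues correspond to combinations orthogonal to the common
# noise direction (1,1,1); the third, huge eigenvalue carries the laser noise.
z = eigvecs.T @ y                        # principal-component data streams
```

Projecting the raw data onto the eigenvectors of the small eigenvalues (`z[0]`, `z[1]` here) yields streams whose variance is set by the secondary noises alone, mirroring how the laser-free eigenvalue set behaves in the text.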
Abstract:
This dissertation describes the new compositional system introduced by Scriabin in 1909–1910, focusing on Feuillet d’Album op. 58, Poème op. 59, nº1, Prélude op. 59, nº2 and Promethée op. 60. Based upon exhaustive pitch and formal analysis, the present study (a) argues for the inexistence of non-functional pitches in all analysed works, (b) shows that transpositional procedures have structural consequences on the “basic chord”, and (c) for the first time advances an explanation of the intrinsic relation between the sonata form and the slow Luce line in Promethée op. 60; RESUMO: Under the title “Alexander Scriabin: the definition of a new sound space in the crisis of Tonality”, this thesis describes the new compositional system introduced by Scriabin in 1909–1910, taking as its point of departure the study of Feuillet d’Album op. 58, Poème op. 59, nº1, Prélude op. 59, nº2 and Promethée op. 60. Based on an exhaustive analysis of pitch and form, the study (a) concludes that non-functional pitches are absent from all the works analysed, (b) shows that transpositional procedures have structural consequences on the “basic chord”, and (c) for the first time explains the formal structure of Promethée op. 60 through the intrinsic relation between its sonata form and the slow Luce line.
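The structural consequences of transposition on a single referential sonority can be sketched in pitch-class terms. A minimal illustration, assuming the six-note “mystic” chord commonly associated with Promethée op. 60 — identifying it with the dissertation’s own “basic chord” is our assumption, and the names below are ours:

```python
# Pitch-class sketch of transpositional invariance. The six-note chord
# below is the "mystic" chord commonly associated with Promethee op. 60;
# equating it with the dissertation's "basic chord" is an assumption.
MYSTIC = {0, 6, 10, 4, 9, 2}          # C, F#, Bb, E, A, D as pitch classes

def transpose(pcs, t):
    """Transpose a set of pitch classes by t semitones (mod 12)."""
    return {(p + t) % 12 for p in pcs}

# Common tones retained under each of the 12 transpositions: a structural
# fingerprint of the chord (6 at T0, 4 at every even transposition level).
common_tones = {t: len(MYSTIC & transpose(MYSTIC, t)) for t in range(12)}
```

The uneven common-tone profile shows concretely why transposing such a chord is not structurally neutral: some transposition levels preserve most of the sonority while others almost completely renew it.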
Abstract:
The language connectome was investigated in vivo using multimodal non-invasive quantitative MRI. In PPA patients (n=18) recruited by the IRCCS ISNB, Bologna, cortical thickness measures showed a predominant reduction in the left hemisphere (p<0.005) with respect to matched healthy controls (HC) (n=18), and an accuracy of 86.1% in discrimination from Alzheimer’s disease patients (n=18). The left temporal and para-hippocampal gyri significantly correlated (p<0.01) with language fluency. In PPA patients (n=31) recruited by the Northwestern University, Chicago, DTI measures were longitudinally evaluated (2-year follow-up) under the supervision of Prof. M. Catani, King’s College London. Significant differences with matched HC (n=27) were found, tract-localized at baseline and widespread at follow-up. Language assessment scores correlated with arcuate (AF) and uncinate (UF) fasciculi DTI measures. In left-ischaemic stroke patients (n=16) recruited by the NatBrainLab, King’s College London, language recovery was longitudinally evaluated (6-month follow-up). Using arterial spin labelling imaging, a significant correlation (p<0.01) was found between language recovery and cerebral blood flow asymmetry in middle cerebral artery perfusion, towards the right. In HC (n=29) recruited by the DIBINEM Functional MR Unit, University of Bologna, an along-tract algorithm suitable for different tractography methods was developed, using the Laplacian operator. A higher AF connectivity of the left superior temporal gyrus and precentral operculum was found (Talozzi L et al., 2018), as well as lateralized UF projections towards the left dorsal orbital cortex. In HC (n=50) recruited from the Human Connectome Project, a new tractography-driven approach was developed for left association fibres, using principal component analysis. The first component discriminated cortical areas typically connected by the AF, suggesting a good discrimination of cortical areas sharing a similar connectivity pattern.
The evaluation of morphological, microstructural and metabolic measures could be used as in-vivo biomarkers to monitor language impairment related to neurodegeneration, or as a surrogate of cognitive rehabilitation/interventional treatment efficacy.
Abstract:
Holograms have been an integral part of pop culture since the 1950s, so much so that hearing about them today no longer causes a stir. On the practical side, however, in-depth research aimed at actually producing them has only been carried out in recent years. Among the devices currently on the market, few are noteworthy, and they suffer from numerous limitations, because it is very difficult to design a system that can illuminate specific points in three-dimensional space for long periods. This thesis illustrates the working principles and the design of a new device, different from those built so far, which exploits the spontaneous decay of rubidium atoms excited by two suitably crossed laser beams. At the crossing point, visible light at 420 nm is produced. With a suitable system of mirrors that rapidly moves the intersection point of the two beams, it is possible to create a true three-dimensional hologram visible from almost any angle.
Abstract:
The goal of this simulation thesis is to present a tool for studying and eliminating various numerical problems observed while analyzing the behavior of the MIND cable during fast voltage polarity reversal. The tool is built in the MATLAB environment, where several simulations were run to achieve oscillation-free results. This thesis adds to earlier research on HVDC cables subjected to polarity reversals. Initially, the code performs numerical simulations to analyze the electric field and charge density behavior of a MIND cable for specific scenarios: before, during, and after polarity reversal. However, the primary goal is to remove numerical oscillations from the charge density profile. The generated code is notable for its use of the Arithmetic Mean Approach and the Non-Uniform Field Approach for filtering and minimizing oscillations even under time and temperature variations.
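The arithmetic-mean idea can be sketched outside MATLAB as a centered moving average applied to an oscillatory profile. This is a hedged stand-in for the thesis's filtering step: the synthetic profile and function names are ours, and the actual implementation may differ:

```python
import numpy as np

def arithmetic_mean_filter(profile, window=3):
    """Centered moving-average (arithmetic mean) smoothing of a 1-D profile.

    Edge values are replicated so the output has the same length as the input.
    """
    kernel = np.ones(window) / window
    padded = np.pad(profile, window // 2, mode="edge")
    return np.convolve(padded, kernel, mode="valid")

# Synthetic stand-in for a charge density profile: a smooth trend plus a
# grid-scale spurious oscillation of the kind the thesis aims to remove.
x = np.linspace(0.0, 1.0, 200)
smooth = np.exp(-x)
noisy = smooth + 0.05 * np.cos(80 * np.pi * x)
filtered = arithmetic_mean_filter(noisy, window=5)
```

When the averaging window matches the period of the spurious oscillation, the mean nearly cancels it while barely distorting the underlying trend — the same trade-off that governs the choice of averaging stencil in the actual simulations.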
Abstract:
Slot and van Emde Boas' Invariance Thesis states that a time (respectively, space) cost model is reasonable for a computational model C if there are mutual simulations between Turing machines and C such that the overhead is polynomial in time (respectively, linear in space). The rationale is that under the Invariance Thesis, complexity classes such as LOGSPACE, P and PSPACE become robust, i.e. machine-independent. In this dissertation, we want to find out whether it is possible to define a reasonable space cost model for the lambda-calculus, the paradigmatic model for functional programming languages. We start by considering an unusual evaluation mechanism for the lambda-calculus, based on Girard's Geometry of Interaction, that was conjectured to be the key ingredient to obtain a space reasonable cost model. By a fine complexity analysis of this scheme, based on new variants of non-idempotent intersection types, we disprove this conjecture. Then, we change the target of our analysis. We consider a variant of Krivine's abstract machine, a standard evaluation mechanism for the call-by-name lambda-calculus, optimized for space complexity and implemented without any pointers. A fine analysis of the execution of (a refined version of) the encoding of Turing machines into the lambda-calculus allows us to conclude that the space consumed by this machine is indeed a reasonable space cost model. In particular, for the first time we are able to measure also sub-linear space complexities. Moreover, we transfer this result to the call-by-value case. Finally, we also provide an intersection type system that characterizes compositionally this new reasonable space measure. This is done through a minimal, yet non-trivial, modification of the original de Carvalho type system.
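The textbook call-by-name Krivine machine that the dissertation's variant optimizes can be sketched as follows. This is a naive environment-based version with de Bruijn indices; the space-efficient, pointer-free variant analyzed in the dissertation is substantially different:

```python
# Terms in de Bruijn notation: ("var", n), ("lam", body), ("app", fun, arg).
# A closure is (term, env); an environment is a cons list (closure, rest).
def krivine(term):
    """Reduce a closed term to weak head normal form, call-by-name."""
    env, stack = None, []
    while True:
        tag = term[0]
        if tag == "app":                 # push the argument as an unevaluated closure
            stack.append((term[2], env))
            term = term[1]
        elif tag == "lam":
            if not stack:                # weak head normal form reached
                return term, env
            term = term[1]
            env = (stack.pop(), env)     # bind the argument closure
        else:                            # ("var", n): walk n links in the environment
            e = env
            for _ in range(term[1]):
                e = e[1]
            term, env = e[0]

I = ("lam", ("var", 0))                  # identity
K = ("lam", ("lam", ("var", 1)))         # discards its second argument
delta = ("lam", ("app", ("var", 0), ("var", 0)))
omega = ("app", delta, delta)            # a term with no normal form

# K I omega reduces to I: call-by-name never evaluates the discarded omega.
result = krivine(("app", ("app", K, I), omega))
```

The machine never copies terms, only pushes closures, which is exactly why its space usage is a natural candidate cost measure; the dissertation's contribution is showing that a pointer-free refinement of this discipline makes that measure reasonable, down to sub-linear space.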