913 results for Independent entity


Relevance: 10.00%

Abstract:

The project was developed in three parts: the analysis of p63 isoforms in breast tumours; the study of intra-tumour heterogeneity in metaplastic breast carcinoma; and the analysis of oncocytic breast carcinoma. p63 is a sequence-specific DNA-binding factor, a homologue of the tumour suppressor and transcription factor p53. The human p63 gene is composed of 15 exons, and transcription can occur from two distinct promoters: the transactivating isoforms (TAp63) are generated by a promoter upstream of exon 1, while an alternative promoter located in intron 3 leads to the expression of N-terminally truncated isoforms (ΔNp63). It has been demonstrated that anti-p63 antibodies decorate the majority of squamous cell carcinomas of different organs; moreover, breast tumours with myoepithelial differentiation show nuclear p63 expression. Two new isoforms have been described with the same sequence as TAp63 and ΔNp63 but lacking exon 4: d4TAp63 and ΔNp73L, respectively. The purpose of the study was to investigate the molecular expression of N-terminal p63 isoforms in benign and malignant breast tissues. In the present study, 40 specimens from normal breast, benign lesions, DIN/DCIS, and invasive carcinomas were analysed by immunohistochemistry and RT-PCR (Reverse Transcriptase-PCR) in order to disclose the patterns of p63 expression. We observed that the full-length isoforms can be detected in non-neoplastic and neoplastic lesions, while the short isoforms are present only in the neoplastic cells of invasive carcinomas. Metaplastic carcinomas of the breast are a heterogeneous group of neoplasms which exhibit varied patterns of metaplasia and differentiation. The existence of non-modal populations harbouring distinct genetic aberrations may explain the phenotypic diversity observed within a given tumour. Intra-tumour morphological heterogeneity is not uncommon in breast cancer and can often be appreciated in metaplastic breast carcinomas. The aim of this study was to determine the existence of intra-tumour genetic heterogeneity in metaplastic breast cancers and whether areas with distinct morphological features in a given tumour might be underpinned by distinct patterns of genetic aberrations. 47 cases of metaplastic breast carcinoma were retrieved; of these, 9 had areas of sufficient dimensions to be independently microdissected. Our results indicate that at least some breast cancers are composed of multiple non-modal populations of clonally related cells, and provide direct evidence that at least some types of metaplastic breast cancer are composed of multiple non-modal clones harbouring distinct genetic aberrations. Oncocytic tumours represent a distinctive set of lesions with the typical granular cytoplasmic eosinophilia of the neoplastic cells. Only rare examples of oncocytic breast carcinoma have been reported in the literature, and its incidence is probably underestimated. In this study we analysed 33 cases of invasive oncocytic carcinoma of the breast, selected according to morphological and immunohistochemical criteria. These tumours were morphologically classified and studied by immunohistochemistry and aCGH. We concluded that oncocytic breast carcinoma is a morphological entity with distinctive ultrastructural and histological features; immunohistochemically it is characterized by a luminal profile, it has a frequency of 19.8%, it has no distinctive clinical features and, at the molecular level, it shows a specific constellation of genetic aberrations.

Relevance: 10.00%

Abstract:

Smart Environments are currently considered a key factor in connecting the physical world with the information world. A Smart Environment can be defined as the combination of a physical environment, an infrastructure for data management (called a Smart Space), a collection of embedded systems gathering heterogeneous data from the environment, and a connectivity solution to convey these data to the Smart Space. With this vision, any application that takes advantage of the environment could be devised without needing to access it directly, since all information is stored in the Smart Space in an interoperable format. Moreover, according to this vision, for each entity populating the physical environment, i.e. users, objects, devices, environments, the following questions arise: "Who?", i.e. which entities should be identified? "Where?", i.e. where are such entities located in physical space? and "What?", i.e. which attributes and properties of the entities should be stored in the Smart Space in a machine-understandable format, in the sense that their meaning has to be explicitly defined and all the data should be linked together so that they can be automatically retrieved by interoperable applications. Starting from this, location detection is a necessary step in the creation of Smart Environments. If the addressed entity is a user and the environment is a generic environment, a meaningful way to assign the position is through a Pedestrian Tracking System. In this work two solutions for this type of system are proposed and compared; one of them has been studied and developed in all its aspects during the doctoral period. The work also investigates the problem of creating and managing the Smart Environment. The proposed solution is to create, by means of natural interactions, links between objects and between objects and their environment, through the use of specific devices, i.e. Smart Objects.
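
As an illustration of how an entity's "Who / Where / What" description might be stored in a Smart Space in a machine-understandable, linked format, the following is a minimal Python sketch using subject-predicate-object triples; the SmartSpace class and the vocabulary (isA, hasLocation, partOf) are hypothetical and are not the infrastructure actually developed in this work.

```python
# Minimal sketch of a Smart Space storing "Who / Where / What" facts as linked triples.
# The vocabulary (isA, hasLocation, partOf) and the class below are illustrative only.

class SmartSpace:
    def __init__(self):
        self.triples = set()          # (subject, predicate, object) facts

    def insert(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        # Return every triple matching the given pattern (None acts as a wildcard).
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

space = SmartSpace()
# "Who": the entity and its type
space.insert("user:alice", "isA", "Person")
# "Where": a position assigned, for instance, by a pedestrian tracking system
space.insert("user:alice", "hasLocation", "room:lab-2")
# "What": attributes in machine-understandable form, linked to other entities
space.insert("room:lab-2", "partOf", "building:engineering")

# Any interoperable application can retrieve the facts without accessing the environment directly.
print(space.query(subject="user:alice"))
```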

Relevance: 10.00%

Abstract:

Recently, in most industrial automation processes an ever increasing degree of automation has been observed. This increase is motivated by the growing demand for systems with high performance in terms of quality of the products/services generated, productivity, efficiency and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, buy boxed products such as food or cigarettes, and so on. Another indication of their complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machines. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; in particular, a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often this is the case in large-scale systems, organized in a modular and distributed manner. Even if the success of a modern AMS, from a functional and behavioural point of view, is still to be attributed to the design choices made in the definition of the mechanical structure and of the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called upon to perform other main functions, such as: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies according to the different productive needs and to the different operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the operator in charge of the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; managing in real time information on diagnostics, as a support for the maintenance operations of the machine. The facilities that designers can directly find on the market, in terms of software component libraries, provide in fact adequate support as regards the implementation of either top-level or bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices.
What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs. Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. Traditionally, in the field of analog and digital control, design and verification through formal and simulation tools have been adopted for a long time, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction between functions and implementations, or between functional architectures and technological architectures and platforms, is considered. Probably this difference is due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in the so-called object-oriented methodologies. Industrial automation has lately been receiving this approach, as testified by the IEC 61131-3 and IEC 61499 standards, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. During the last years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, in complex systems fault occurrences increase.
This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS, together with reliable mechanical elements, an increasing number of electronic devices are also present, which are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance. The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is that of preventing faults and eventually reconfiguring the control system so that faults are tolerated. On this topic, an important contribution to the formal verification of logic control, fault diagnosis and fault tolerant control comes from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is discussed. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5, while Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approach presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
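
To make the discrete-event flavour of logic control more concrete, here is a minimal Python sketch of a component modelled as a finite-state automaton with an observable fault event and a recovery transition. It only illustrates the Discrete Event Systems viewpoint discussed above; the states, events and transition table are hypothetical and are not the Generalized Actuator or Generalized Device actually defined in Chapters 3 and 4.

```python
# Illustrative discrete-event model of a simple actuator-like component.
# States, events and the fault/recovery transitions are hypothetical, for exposition only.

class DiscreteEventComponent:
    TRANSITIONS = {
        ("idle",   "start"): "moving",
        ("moving", "done"):  "idle",
        ("moving", "fault"): "faulty",   # observable fault event, useful for diagnosis
        ("faulty", "reset"): "idle",     # recovery / reconfiguration
    }

    def __init__(self):
        self.state = "idle"
        self.trace = []                  # observed event sequence

    def step(self, event):
        key = (self.state, event)
        if key not in self.TRANSITIONS:
            raise ValueError(f"event '{event}' not enabled in state '{self.state}'")
        self.trace.append(event)
        self.state = self.TRANSITIONS[key]
        return self.state

component = DiscreteEventComponent()
for ev in ["start", "fault", "reset", "start", "done"]:
    print(ev, "->", component.step(ev))
```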

Relevance: 10.00%

Abstract:

Ontology design and population - core aspects of semantic technologies - have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, thus speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as Named Entity Recognition, Entity Resolution, Taxonomy and Relation Extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20] or, more recently, Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability of logically understanding the structure of discourse [7]. In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
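
The following toy Python sketch illustrates the general shape of such a pipeline — entity recognition followed by relation extraction, emitting triples that could populate an ontology. The gazetteer, the single relation pattern and the output vocabulary are invented for illustration and do not reproduce the system developed in this work.

```python
import re

# Toy entity gazetteer and a single relation pattern; purely illustrative.
ENTITIES = {"Rome": "City", "Italy": "Country", "Tiber": "River"}

def recognize_entities(sentence):
    # Naive gazetteer-based named entity recognition.
    return [(name, etype) for name, etype in ENTITIES.items() if name in sentence]

def extract_triples(sentence):
    triples = []
    # Naive relation pattern: "<X> is the capital of <Y>" -> (X, capitalOf, Y)
    m = re.search(r"(\w+) is the capital of (\w+)", sentence)
    if m:
        triples.append((m.group(1), "capitalOf", m.group(2)))
    # Type assertions derived from entity recognition
    for name, etype in recognize_entities(sentence):
        triples.append((name, "rdf:type", etype))
    return triples

print(extract_triples("Rome is the capital of Italy"))
# [('Rome', 'capitalOf', 'Italy'), ('Rome', 'rdf:type', 'City'), ('Italy', 'rdf:type', 'Country')]
```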

Relevance: 10.00%

Abstract:

While the use of distributed intelligence has been incrementally spreading in the design of a great number of intelligent systems, the field of Artificial Intelligence in Real Time Strategy (RTS) games has remained a mostly centralized environment. Although turn-based games have attained AIs of world-class level, the fast-paced nature of RTS games has proven to be a significant obstacle to the quality of their AIs. Chapter 1 introduces RTS games, describing their characteristics, mechanics and elements. Chapter 2 introduces Multi-Agent Systems and the use of the Beliefs-Desires-Intentions abstraction, analysing the possibilities offered by self-computing properties. Chapter 3 analyses the current state of AI development in RTS games, highlighting the struggles of the gaming industry to produce valuable AIs: the focus on improving the multiplayer experience has gravely impacted the quality of the AIs, leaving them with serious flaws that impair their ability to challenge and entertain players. Chapter 4 explores different aspects of AI development for RTS games, evaluating the potential strengths and weaknesses of an agent-based approach and analysing which aspects can benefit the most compared with centralized AIs. Chapter 5 describes a generic agent-based framework for RTS games where every game entity becomes an agent, each with its own knowledge and set of goals. Different aspects of the game, like economy, exploration and warfare, are also analysed, and some agent-based solutions are outlined. The possible exploitation of self-computing properties to efficiently organize the agents' activity is then inspected. Chapter 6 presents the design and implementation of an AI for an existing Open Source game in beta development stage: 0 A.D., a historical RTS game on ancient warfare which features a modern graphical engine and evolved mechanics. The entities in the conceptual framework are implemented in a new agent-based platform, called ABot, seamlessly nested inside the existing game engine and widely described in Chapters 7, 8 and 9. Chapters 10 and 11 cover the design and realization of a new agent-based language useful for defining behavioural modules for the agents in ABot, paving the way for a wider spectrum of contributors. Chapter 12 analyses the outcome of tests meant to evaluate strategies, realism and pure performance, and conclusions and future work are finally drawn in Chapter 13.
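
A minimal Python sketch of the agent abstraction described above, in which each game entity carries its own beliefs and goals and deliberates locally in a Beliefs-Desires-Intentions style. The entity type, the beliefs and the deliberation rule are hypothetical and much simpler than the ABot platform itself.

```python
# Illustrative BDI-style agent for an RTS game entity; names and rules are hypothetical.

class UnitAgent:
    def __init__(self, name):
        self.name = name
        self.beliefs = {"enemy_visible": False, "resources_nearby": True}
        self.desires = ["gather_resources", "defend_base", "explore"]
        self.intention = None

    def perceive(self, observation):
        # Update local beliefs from the agent's own (partial) view of the game world.
        self.beliefs.update(observation)

    def deliberate(self):
        # Commit to one intention among the desires, given the current beliefs.
        if self.beliefs.get("enemy_visible"):
            self.intention = "defend_base"
        elif self.beliefs.get("resources_nearby"):
            self.intention = "gather_resources"
        else:
            self.intention = "explore"
        return self.intention

worker = UnitAgent("worker_01")
worker.perceive({"enemy_visible": True})
print(worker.name, "->", worker.deliberate())   # worker_01 -> defend_base
```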

Relevance: 10.00%

Abstract:

An important property for devices is the charge-carrier mobility of discotic organic materials such as hexa-peri-hexabenzocoronenes. A close relation exists between the degree of columnar self-arrangement of the molecules and their mobilities. In a first step, the induction of a higher order via hydrogen bonding was considered, mainly aimed at improving the intracolumnar stacking of the materials. For the analysis a broad range of methods was used, including differential scanning calorimetry (DSC), wide-angle X-ray scattering (WAXS), solid-state NMR spectroscopy and scanning tunneling microscopy (STM). Indeed, a specific influence of the hydrogen bonds could be identified, although in several cases at the cost of a severe reduction in solubility and processability. This effect was dampened by the addition of a long alkyl chain next to the hydrogen-bonding functional group, which resulted in an improved columnar arrangement while retaining processability. In contrast to the aforementioned example of inducing a higher intracolumnar order by hydrogen bonding, the focus was also set on larger aromatic systems. The charge-carrier mobility is also closely related to the size of the aromatic core, and larger π-areas are expected to lead to improved mobilities. For photovoltaic applications a high extinction coefficient over a broad range of the spectrum is favourable, which can also be achieved by enlarging the aromatic core component. In addition, the stronger π-interactions between the aromatic core components should yield an improved columnar stability and order. However, the strengthening of the π-interactions between the aromatic core components led to a reduction of the solubility and the processability due to the stronger aggregation of the molecules. This required the introduction of efficiently solubilizing features, in the form of long alkyl chains in the corona of the aromatic entity, in combination with a distortion of the aromatic core moiety by bulky tert-butyl groups. With this approach, not only did the processing and purification of the materials with standard laboratory techniques become possible, but also the first structure-rich UV/vis and resolved 1H-NMR spectra for an aromatic system twice as large as hexa-peri-hexabenzocoronene were recorded. The bulk properties in an extruded fiber as well as on the surface showed a columnar self-assembly, including a phase in which a homeotropic alignment on a substrate was observed, which makes the material an interesting candidate for future applications in electronic devices.

Relevance: 10.00%

Abstract:

Over the last two decades in particular, an entity now termed Non-alcoholic Fatty Liver Disease (NAFLD) has emerged in hepatology alongside the previously known causes of liver disease, to the point of becoming, through successive scientific findings, the prevalent cause of hepatopathy, particularly in Western, industrialized countries. Over the same years, another complex clinical problem known as the Metabolic Syndrome has gradually taken shape through its multiple correlations with the prevalent causes of morbidity and mortality in our setting, from diabetes to cardiovascular disease and, not least, NAFLD itself. The purpose of the study presented in this thesis was precisely to re-evaluate, within the Italian territory, the prevalence of liver disease attributable in particular to NAFLD and its association with the Metabolic Syndrome.

Relevance: 10.00%

Abstract:

The study carried out during the doctoral programme focused on the evaluation and monitoring of the different thermo-oxidative degradations in frying oils. To achieve this objective, a screening of the main oils on the Italian market was performed, followed by the formulation of two blends of vegetable oils, which were subjected to two experimental plans of controlled, standardized frying in the laboratory, followed by two frying plans carried out in two different real settings, namely a canteen and a restaurant. Each of the two blends was compared with two reference oils. To this end, the fatty acid profile, the oxidative and hydrolytic stability, the smoke point, the polar compounds, the total tocopherol content and the volatile compounds were determined both on the raw oils and on the oils subjected to the different frying processes. The study identified one of the formulated blends as a valid alternative to palm oil, which is widely used for frying foods, and provided more precise indications on the type and extent of the modifications that occur during frying, depending on the conditions employed.

Relevance: 10.00%

Abstract:

Hepatitis B x protein (HBx) is a non-structural, multifunctional protein of the hepatitis B virus (HBV) that modulates a variety of host processes. Due to its transcriptional activity, which is able to alter the expression of growth-control genes, it has been implicated in hepatocarcinogenesis. Increased expression of HBx has been reported in liver tissue samples of hepatocellular carcinoma (HCC), and a specific anti-HBx immune response can be detected in the peripheral blood of patients with chronic HBV. However, its role and extent have not yet been clarified. Thus, we performed a cross-sectional analysis of the anti-HBx specific T cell response in HBV-infected patients at different stages of disease. A total of 70 HBV-infected subjects were evaluated: 15 affected by chronic hepatitis (CH; median age 45 yrs), 14 by cirrhosis (median age 55 yrs), 11 with dysplastic nodules (median age 64 yrs), 15 with HCC (median age 60 yrs), and 15 with IC (median age 53 yrs). All patients were infected by virus genotype D with different levels of HBV viremia, and most of them (91%) were HBeAb positive. The HBx-specific T cell response was evaluated by anti-Interferon (IFN)-gamma Elispot assay after in vitro stimulation of peripheral blood mononuclear cells, using 20 overlapping synthetic peptides covering the whole HBx protein sequence. HBx-specific IFN-gamma-secreting T cells were found in 6 out of 15 patients with chronic hepatitis (40%), 3 out of 14 with cirrhosis (21%), 5 out of 11 with cirrhosis with macronodules (54%), and 10 out of 15 HCC patients (67%). The number of responding patients was significantly higher in HCC than in IC (p=0.02) and cirrhosis (p=0.02). A central region of the X protein, between peptides 86 and 88, was preferentially recognized. The HBx response did not correlate with clinical features of the disease (AFP, MELD). The HBx-specific T-cell response seems to increase with progression of the disease, being increased in subjects with dysplastic or neoplastic lesions, and may represent an additional tool to monitor patients at high risk of developing HCC.

Relevance: 10.00%

Abstract:

In this thesis a web-based system for the configuration of three-dimensional mechanical models was developed. The whole software is based on a multi-tier architecture. The back-end exposes RESTful services that allow querying a database containing the model registry and interacting with the SolidWorks 3D CAD. The front-end consists of two HTML pages designed as SPAs (Single Page Applications), one for the administrator and one for the end user; they are responsible for the asynchronous calls to the services, the automatic updating of the interface, and the interaction with three-dimensional images.
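
As an illustration of the kind of RESTful service such a back-end might expose, here is a minimal Flask sketch returning the registry entry for a model, to be consumed asynchronously by the SPA front-end. The route, the fields and the in-memory data are hypothetical and do not reflect the actual implementation or the SolidWorks integration.

```python
# Minimal sketch of a RESTful back-end for a model registry (hypothetical routes and fields).
from flask import Flask, jsonify, abort

app = Flask(__name__)

# Stand-in for the database holding the model registry.
MODELS = {
    "gearbox-01": {"name": "Gearbox", "parameters": {"ratio": 3.5, "material": "steel"}},
}

@app.route("/api/models/<model_id>", methods=["GET"])
def get_model(model_id):
    model = MODELS.get(model_id)
    if model is None:
        abort(404)
    return jsonify(model)   # consumed by asynchronous calls from the SPA front-end

if __name__ == "__main__":
    app.run(debug=True)
```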

Relevance: 10.00%

Abstract:

The last decades have witnessed significant and rapid progress in polymer chemistry and molecular biology. The invention of PCR and advances in the automated solid-phase synthesis of DNA have made this biological entity broadly available to all researchers across the biological and chemical sciences. Thanks to the development of a variety of polymerization techniques, macromolecules can be synthesized with predetermined molecular weights and excellent structural control. In recent years these two exciting areas of research converged to generate a new type of nucleic acid hybrid material, consisting of oligodeoxynucleotides and organic polymers. By conjugating these two classes of materials, DNA block copolymers are generated, exhibiting engineered material properties that cannot be realized with polymers or nucleic acids alone. Different synthetic strategies based on grafting-onto routes, in solution or on solid support, were developed, which afforded DNA block copolymers with hydrophilic, hydrophobic and thermoresponsive organic polymers in good yields. Besides the preparation of DNA block copolymers with a relatively short DNA segment, it was also demonstrated how these bioorganic polymers can be synthesized with large DNA blocks (>1000 bases) by applying the polymerase chain reaction. Amphiphilic DNA block copolymers, which were synthesized in a fully automated fashion in a DNA synthesizer, self-assemble into well-defined nanoparticles. Hybridization of spherical micelles with long DNA templates that encode several times the sequence of the micelle corona induced a transformation into rod-like micelles. The Watson-Crick motif aligned the hydrophobic polymer segments along the DNA double helix, which resulted in selective dimer formation. Even the length of the resulting nanostructures could be precisely adjusted by the number of nucleotides of the templates. In addition to changing the structural properties of DNA-b-PPO micelles, these materials were applied as 3D nanoscopic scaffolds for organic reactions. The DNA strands of the corona were organized by the hydrophobic interactions of the organic polymer segments in such a fashion that several DNA-templated organic reactions proceeded in a sequence-specific manner, either at the surface of the micelles or at the interface between the biological and the organic polymer blocks. The yields of reactions employing the micellar template were equivalent to or better than those of existing template architectures. Aside from its physical properties and the morphologies achieved, an important requirement for a new biomaterial is its biocompatibility and interaction with living systems, i.e. human cells. The toxicity of the nanoparticles was analyzed by a cell proliferation assay. Motivated by the non-toxic nature of the amphiphilic DNA block copolymers, these nanoobjects were employed as drug delivery vehicles to target an anticancer drug to tumor tissue. The micelles obtained from DNA block copolymers were easily functionalized with targeting units by hybridization. This facile route allowed studying the effect of the amount of targeting units on the targeting efficacy. By varying the site of functionalization, i.e. 5' or 3', the effect of having the targeting unit at the periphery of the micelle or in its core was studied. Additionally, these micelles were loaded with an anticancer drug, doxorubicin, and then applied to tumor cells. The viability of the cells was assessed in the presence and absence of the targeting unit. It was demonstrated that tumor cells bearing folate receptors showed a high mortality when the targeting unit was attached to the nanocarrier.

Relevance: 10.00%

Abstract:

Spinal cord injury (SCI) results not only in paralysis, but is also associated with a range of autonomic dysregulation that can interfere with cardiovascular, bladder, bowel, temperature, and sexual function. The extent of the autonomic dysfunction is related to the level and severity of the injury to descending autonomic (sympathetic) pathways. For many years there was limited awareness of these issues, and the attention given to them by the scientific and medical community was scarce. Yet, even though a new system to document the impact of SCI on autonomic function has recently been proposed, the current standard of assessment of SCI (the American Spinal Injury Association (ASIA) examination) evaluates motor and sensory pathways, but not the severity of injury to autonomic pathways. Besides the severe impact on quality of life, autonomic dysfunction in persons with SCI is associated with an increased risk of cardiovascular disease and mortality. Therefore, obtaining information regarding autonomic function in persons with SCI is pivotal, and clinical examinations and laboratory evaluations to detect the presence of autonomic dysfunction and quantify its severity are mandatory. Furthermore, previous studies demonstrated that there is an intimate relationship between the autonomic nervous system and sleep from anatomical, physiological, and neurochemical points of view. However, even though previous epidemiological studies demonstrated that sleep problems are common in spinal cord injury, so far only limited polysomnographic (PSG) data are available. Finally, until now, the circadian and state-dependent autonomic regulation of blood pressure (BP), heart rate (HR) and body core temperature (BcT) had never been assessed in SCI patients. The aim of the current study was to establish the association between the autonomic control of cardiovascular function and thermoregulation, sleep parameters, and increased cardiovascular risk in SCI patients.

Relevance: 10.00%

Abstract:

A new method for the quantitative determination and spatial imaging of pyruvate in cryosections of tissue samples, based on a pyruvate-dependent bioluminescence reaction, was developed. Biochemical reactions were coupled in such a way that visible light was produced in proportion to the pyruvate content. A highly significant positive correlation between the two parameters allowed a calibration with defined pyruvate contents and thus quantification in unknown samples. The detection limit was 0.04 pmol pyruvate, with a resolution of 0.02 µmol/g. The bioluminescence method was validated against other methods, achieving a recovery with a concentration-dependent deviation of ≤ 15%. A major advantage of the new method over previous procedures for pyruvate detection is the acquisition of measurements from defined histological tissue areas. This is made possible by computer-controlled superposition of the metabolite distributions with section images from structural stains and by interactive "optical microdissection" of the tissue sections. A further benefit of the method is its optional combination with the bioluminescence technique for other metabolites. An exact superposition of two metabolite images from immediately consecutive tissue sections thus allows a correlative co-localization analysis of both metabolites. The result can be documented as "pixel-to-pixel" correlations, and in addition a lactate/pyruvate ratio can be calculated and displayed for every pixel as a measure of the redox state of the tissue. This yields, for example, spatial L/P distributions (L/P maps). Such "redox imaging" by mapping the L/P quotient has so far not been possible with any other method. While the development of the pyruvate assay was a core task of this work, another essential part consisted in the practical application of the new method in experimental tumour research. Measurements on eight different lines of human HNSCC xenografts (n = 70 tumours) yielded a mean pyruvate content of 1.24 ± 0.20 µmol/g. In six human biopsies of the same tumour entity, an average pyruvate content of 0.41 ± 0.09 µmol/g was measured. In the xenografts, a significant positive correlation was found between the sum of lactate and pyruvate, or the L/P ratio, and radiosensitivity, whereby both the L/P ratio and the sum of lactate and pyruvate were largely determined by lactate. The relationship between the metabolites and radiosensitivity can be explained by their antioxidative properties. Since the redox state of the cell can be critical for the efficiency of ROS-inducing therapeutic approaches, such as irradiation or certain chemotherapeutic agents, the determination of the L/P ratio as a prognostic factor could allow predictive statements about the sensitivity to such treatments.
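
A minimal NumPy sketch of the pixel-wise lactate/pyruvate ("L/P") mapping described above, assuming two already co-registered metabolite images; the array values and the zero-handling threshold are invented for illustration and are not data from the study.

```python
import numpy as np

def lp_ratio_map(lactate_img, pyruvate_img, min_pyruvate=1e-6):
    """Pixel-wise lactate/pyruvate ratio as a surrogate for the tissue redox state."""
    lactate = np.asarray(lactate_img, dtype=float)
    pyruvate = np.asarray(pyruvate_img, dtype=float)
    ratio = np.full_like(lactate, np.nan)        # NaN where no ratio can be computed
    valid = pyruvate > min_pyruvate              # avoid division by (near-)zero pixels
    ratio[valid] = lactate[valid] / pyruvate[valid]
    return ratio

# Toy example on two co-registered 3x3 "metabolite images" (values in µmol/g, invented)
lactate = np.array([[8.0, 6.5, 0.0], [5.2, 7.1, 4.3], [9.0, 3.3, 2.1]])
pyruvate = np.array([[1.2, 0.9, 0.0], [0.7, 1.1, 0.5], [1.3, 0.4, 0.3]])
print(lp_ratio_map(lactate, pyruvate))
```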

Relevance: 10.00%

Abstract:

Introduction: microscopic colitis, otherwise known as collagenous colitis and lymphocytic colitis, comprises chronic inflammatory disorders of the colon that cause diarrhoea and most frequently affect elderly women and subjects on drug therapy. In recent years their incidence appears to have increased in several Western countries, but their prevalence in Italy is still uncertain. Aim: the present prospective, multicentre study was designed to assess the prevalence of microscopic colitis in patients undergoing colonoscopy for chronic non-bloody diarrhoea. Patients and methods: from May 2010 to September 2010, all adult subjects referred to two centres in the Milan metropolitan area for a total colonoscopy were consecutively enrolled. In subjects with chronic non-bloody diarrhoea, multiple biopsies were taken in the ascending colon, sigmoid colon and rectum, as well as from any macroscopic lesions. Results: of the 8008 colonoscopies examined, 265 were performed for chronic diarrhoea; among these, 8 had incomplete information and 52 had endoscopic findings consistent with other intestinal disorders (i.e. IBD, tumours, diverticulitis). 205 colonoscopies were essentially negative, 175 of which had adequate microscopic sampling (M:F = 70:105; median age 61 years). Histological analysis documented 38 new cases of microscopic colitis (M:F = 14:24; median age 67.5 years): 27 collagenous colitis (M:F = 10:17; median age 69 years) and 11 lymphocytic colitis (M:F = 4:7; median age 66 years). In another 25 cases, microscopic alterations lacking sufficient criteria for a diagnosis of microscopic colitis were observed. Conclusions: in the present study, microscopic analysis of the colon identified microscopic colitis in 21.7% of subjects with chronic non-bloody diarrhoea and a negative total colonoscopy. Microscopic study of the colon is therefore a fundamental step for the correct diagnostic work-up of chronic diarrhoea, especially after 60 years of age. Large prospective, multicentre studies will be needed to clarify the role and weight of the risk factors associated with these disorders.