Abstract:
Glaucoma comprises a heterogeneous group of ocular diseases whose pathogenesis is characterized by a slow, progressive loss of retinal ganglion cells and their axons. In recent years, the involvement of autoreactive antibodies has been increasingly discussed in the context of glaucoma pathogenesis. One focus of this work was the comparison of such autoantibody reactions in the serum and aqueous humor samples of individual glaucoma patients, in order to clarify to what extent the immunoreactivities of these two body fluids agree and whether there is evidence of local antibody production in the immune-privileged eye. Using an established protein microarray method, the immune reactions against 40 different antigens, such as heat shock proteins and neuronal structural proteins, were examined. The results showed that the detected autoantibody reactions agree between the two body fluids for more than 80% of the antigens examined. This illustrates that the antibody-based immunological processes in the eye and aqueous humor closely resemble those of the serum, despite the eye's separation from the bloodstream by the blood-retina barrier. Only isolated evidence of local antibody production in the eye was found, which underlines the relevance of the detected serum antibody reactions for glaucoma.
A further focus of the work was the detection of possible changes in protein expression in the retinae and serum samples of glaucoma patients that potentially contribute to the neurodegenerative processes of glaucoma pathogenesis. To enable the analysis of specific protein expression, the antibody microarray technique was established and applied to this question. The abundances of complement proteins, cytokines and heat shock proteins, as well as of various neuronal structural proteins, were examined. Serum and retina samples from glaucoma patients served as sample material and were compared with those from healthy subjects. The analysis revealed that, in addition to the increased expression of complement proteins in the retina (e.g., C3, C6), an elevated concentration of these proteins is also present in the serum of glaucoma patients, where they may likewise play a role in the disease. Similar observations were made for various cytokines, such as TNF-α, IFN-γ and IL-1β, which were detected at increased levels in the examined retinae of glaucoma subjects and partly also in the patients' serum samples. The increased production of cytokines in the retina is probably due to the activation of glial cells, an event for which numerous indications were found in this work. Glial activation is presumably triggered by apoptotic processes in the retina, but possibly also by preceding complement activation. In addition, further expression differences of various retinal proteins in glaucoma patients were identified by a mass spectrometric method. These changes, such as reduced amounts of ROS-eliminating proteins like superoxide dismutase and peroxiredoxin-2, most likely promote or intensify the neurodegenerative processes in the retina of glaucoma patients.
To what extent the factors examined are causally involved in the neurodegenerative processes remains unresolved; however, their sheer number underlines the need to view the cause of glaucoma as a complex interaction of different components rather than as a single dysregulated mechanism.
Abstract:
From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences and to a proliferation of new databases in which to store them. Three computational approaches are presented here that analyse this massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA) due to incomplete determination of its 5' end sequence. An extension of the mRNA 5' coding region was identified in 477 human loci, out of all known human mRNAs analysed, using an automated expressed sequence tag (EST)-based approach. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses codon bias, the phenomenon in which distinct synonymous codons are used with different frequencies, and, following integration with a gene expression profile, estimates the total number of codons present across all the expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach is used to study the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium in this network, but further analyses and experiments are needed.
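As a rough, hypothetical illustration of the "codonome" computation described above (weighting each gene's codon counts by its expression level and summing over all expressed genes), the following Python sketch uses made-up gene names, sequences and expression values; it is not the pipeline developed in the thesis.

```python
from collections import Counter

def codon_counts(cds: str) -> Counter:
    """Count codons in a coding sequence (length assumed to be a multiple of 3)."""
    usable = len(cds) - len(cds) % 3
    return Counter(cds[i:i + 3] for i in range(0, usable, 3))

def codonome(cds_by_gene: dict, expression: dict) -> Counter:
    """Expression-weighted codon usage summed over all expressed genes."""
    total = Counter()
    for gene, cds in cds_by_gene.items():
        level = expression.get(gene, 0.0)
        for codon, n in codon_counts(cds).items():
            total[codon] += n * level
    return total

# Toy usage with made-up gene names, sequences and expression levels
genes = {"GENE_A": "ATGGCTGCTTAA", "GENE_B": "ATGGAAGAATAA"}
expr = {"GENE_A": 10.0, "GENE_B": 2.5}
print(codonome(genes, expr).most_common(3))
```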
Abstract:
This dissertation deals with the design and characterization of novel reconfigurable silicon-on-insulator (SOI) devices to filter and route optical signals on-chip. Design is carried out through circuit simulations based on basic circuit elements (Building Blocks, BBs) in order to prove the feasibility of an approach that moves the design of Photonic Integrated Circuits (PICs) toward the system level. CMOS compatibility and large integration scale make SOI one of the most promising materials for realizing PICs. The concepts of generic foundries and BB-based circuit simulation are emerging as a solution to reduce costs and increase circuit complexity. To validate the BB-based approach, some of the most important BBs are developed first. A novel tunable coupler is also presented and demonstrated to be a valuable alternative to known solutions. Two novel multi-element PICs are then analysed: a narrow-linewidth single-mode resonator and a passband filter with widely tunable bandwidth. Extensive circuit simulations are carried out to determine their performance, taking fabrication tolerances into account. The first PIC is based on two Grating Assisted Couplers in a ring resonator (RR) configuration. It is shown that a trade-off between performance, resonance bandwidth and device footprint has to be made. The device could be employed to realize reconfigurable add-drop de/multiplexers; however, sensitivity to fabrication tolerances and spurious effects is observed. The second PIC is based on an unbalanced Mach-Zehnder interferometer loaded with two RRs. Overall good performance and robustness to fabrication tolerances and nonlinear effects confirm its applicability to the realization of flexible optical systems. Simulated and measured device behaviour is shown to be in agreement, thus demonstrating the viability of a BB-based approach to the design of complex PICs.
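To illustrate the building-block idea in its simplest form, the sketch below cascades the 2x2 transfer matrices of two elementary blocks (a directional coupler and a phase section) to reproduce the textbook Mach-Zehnder response. It is not the simulator used in the dissertation, and the splitting ratio and phase sweep are arbitrary assumptions.

```python
import numpy as np

def coupler(power_coupling):
    """Lossless 2x2 directional-coupler transfer matrix."""
    t, k = np.sqrt(1 - power_coupling), np.sqrt(power_coupling)
    return np.array([[t, -1j * k], [-1j * k, t]])

def phase_section(phi_upper, phi_lower=0.0):
    """Two uncoupled waveguides, each adding an optical phase."""
    return np.diag([np.exp(-1j * phi_upper), np.exp(-1j * phi_lower)])

# Unbalanced Mach-Zehnder interferometer: coupler -> differential phase -> coupler
phases = np.linspace(0, 2 * np.pi, 201)
bar_transmission = [abs((coupler(0.5) @ phase_section(p) @ coupler(0.5))[0, 0]) ** 2
                    for p in phases]
print(f"bar-port transmission ranges from {min(bar_transmission):.2f} "
      f"to {max(bar_transmission):.2f}")
```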
Abstract:
Among all possible realizations of quark and antiquark assemblies, the nucleon (the proton and the neutron) is the most stable of all hadrons and has consequently been the subject of intensive studies. Its mass, shape, radius and more complex representations of its internal structure have been measured for several decades using different probes. The proton (spin 1/2) is described by the electric GE and magnetic GM form factors, which characterise its internal structure. The simplest way to measure the proton form factors consists in measuring the angular distribution of electron-proton elastic scattering, accessing the so-called space-like region where q2 < 0. Using the crossed channel antiproton proton <--> e+e-, one accesses another kinematical region, the so-called time-like region where q2 > 0. However, due to the antiproton proton <--> e+e- threshold q2th, only the kinematical domain q2 > q2th > 0 is available. To access the unphysical region, one may use the antiproton proton --> pi0 e+e- reaction, where the pi0 takes away part of the system energy, allowing q2 to be varied between q2th and almost 0. This thesis aims to show the feasibility of such measurements with the PANDA detector, which will be installed on the new high-intensity antiproton ring at the FAIR facility in Darmstadt. To describe the antiproton proton --> pi0 e+e- reaction, a Lagrangian-based approach is developed. The 5-fold differential cross section is determined and related to linear combinations of hadronic tensors. Under the assumption of one-nucleon exchange, the hadronic tensors are expressed in terms of the two complex proton electromagnetic form factors. An extraction method is developed which provides access to the proton electromagnetic form factor ratio R = |GE|/|GM| and, for the first time in an unpolarized experiment, to the cosine of the phase difference. Such measurements have never been performed in the unphysical region up to now. Extensive simulations were performed to show how the ratio R and the cosine can be extracted from the positron angular distribution. Furthermore, a model is developed for the antiproton proton --> pi0 pi+ pi- background reaction, considered the most dangerous one. The background-to-signal cross section ratio was estimated under different combinations of cuts on the particle identification information from the different detectors and on the kinematic fits. The background contribution can be reduced to the percent level or even less, with a corresponding signal efficiency ranging from a few % to 30%. The precision of the determination of the ratio R and of the cosine is estimated from the expected counting rates via a Monte Carlo method. Part of this thesis is also dedicated to more technical work: the study of the electromagnetic calorimeter prototype and the determination of its resolution.
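For orientation, the textbook one-photon-exchange formula for the two-body channel shows how the ratio R enters an unpolarized angular distribution; the pi0 e+e- analysis of the thesis generalizes this to a 5-fold differential cross section, which is not reproduced here.

```latex
% Textbook one-photon-exchange angular distribution for \bar{p}p -> e^+ e^-
% (two-body case), shown only to illustrate how R enters an unpolarized
% angular distribution; the \bar{p}p -> \pi^0 e^+ e^- analysis uses a
% 5-fold differential cross section instead.
\frac{d\sigma}{d\cos\theta} \;\propto\;
  \left(1 + \cos^2\theta\right) |G_M|^2
  \;+\; \frac{1}{\tau}\,\sin^2\theta\, |G_E|^2,
\qquad
\tau = \frac{q^2}{4 M_p^2},
\qquad
R = \frac{|G_E|}{|G_M|}.
```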
Abstract:
The thesis explores Latin American bioethical thought and its evolution, assessing its links, intentional and otherwise, with the more established and generally dominant bioethical positions on the international scene. The discussion investigates the development of bioethics in the Latin American subcontinent, highlighting the development of the moral theories that define the interpretive horizon of the contemporary debate, as well as the peculiarities of South American bioethical scholarship and the casuistry typical of these territories. To understand the characteristics of a bioethical thought typical of a developing region, the thesis presents two specific case studies: (A) the Peruvian project Propuesta de incorporación del enfoque intercultural y de representantes indígenas en los comités de ética en países multiculturales; and (B) biomedicine and xenotransplantation in Mexico. The analysis of the case studies serves to bring out the constitutive features of Latin American thought and the casuistry typical of the region. Latin American bioethics, which developed roughly twenty years later than in the birthplaces of the discipline (Great Britain and the USA), claims the need for an autochthonous bioethical debate and practice, capable of embodying concepts and principles that can be articulated within the experiences and concrete needs that take shape in its cultural and socio-economic reality. For this reason it departs from the so-called classical bioethical theories, advocating a bioethics of contestation constitutively bound to the language of human rights. The thesis also proposes using the findings of the case studies to develop a reflection on the modalities and some outcomes of the globalization of bioethics. The analysis draws on tools and categories from sociology, economics and politics in order to highlight the various constitutive components and critical aspects of globalized bioethics.
Abstract:
This work focuses on the analysis of sea-level change over the last century, based mainly on instrumental observations. Individual components of sea-level change are investigated at both global and regional scales. Some of the geophysical processes responsible for current sea-level change, such as glacial isostatic adjustment and present-day melting of terrestrial ice sources, have been modeled and compared with observations. A new value of global mean sea-level change based on tide gauge observations has been independently assessed at 1.5 mm/year, using corrections for glacial isostatic adjustment obtained with different models as a criterion for tide gauge selection. The long-wavelength spatial variability of the main components of sea-level change has been investigated by means of traditional and new spectral methods. Complex non-linear trends and abrupt sea-level variations shown by tide gauge records have been addressed by applying different approaches to regional case studies. The Ensemble Empirical Mode Decomposition technique has been used to analyse tide gauge records from the Adriatic Sea to ascertain the existence of cyclic sea-level variations. An early-warning approach has been adopted to detect tipping points in sea-level records of the North East Pacific and their relationship with oceanic modes. Global sea-level projections to the year 2100 have been obtained by a semi-empirical approach based on the artificial neural network method. In addition, a model-based approach has been applied to the case of the Mediterranean Sea, obtaining sea-level projections to the year 2050.
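A minimal sketch of the kind of decomposition applied to the Adriatic tide-gauge records, assuming the open-source PyEMD package ("EMD-signal" on PyPI); the monthly series below is synthetic and the parameters are illustrative, not those used in the thesis.

```python
import numpy as np
from PyEMD import EEMD   # "EMD-signal" package on PyPI (assumed available)

# Synthetic monthly "tide-gauge" record: trend + seasonal cycle + noise
t = np.arange(0, 60, 1.0 / 12.0)                    # 60 years, monthly sampling
sea_level = (1.5 * t                                # linear trend, mm/yr
             + 30.0 * np.sin(2 * np.pi * t)         # annual cycle, mm
             + 5.0 * np.random.randn(t.size))       # noise, mm

eemd = EEMD(trials=100)                             # ensemble of noise-assisted EMDs
imfs = eemd.eemd(sea_level, t)                      # intrinsic mode functions
print(f"{imfs.shape[0]} IMFs extracted; the last one approximates the long-term trend")
```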
Abstract:
Amphiphilic peptides, Pro-Glu-(Phe-Glu)n-Pro, Pro-Asp-(Phe-Asp)n-Pro and Phe-Glu-(Phe-Glu)n-Phe, can be constructed from n alternating sequences of hydrophobic and hydrophilic amino acids such that they assemble into monolayers at the air-water interface. In biological systems, structures at the organic-aqueous interface can serve as a matrix for the crystallization of hydroxyapatite, a process that can be exploited for the treatment of osteoporosis. In the present work, computer simulations were used to investigate the structures and the underlying interactions that govern the aggregation of these peptides at the microscopic level. Atomistic molecular dynamics simulations of individual peptide strands show that they readily arrange at the air-water interface and are able to fold into β-turns, even for relatively short peptide lengths (n = 2). Rare events such as these conformational changes require the use of advanced sampling techniques; here, replica exchange molecular dynamics was used to investigate the influence of the peptide sequence. The simulation results showed that peptides with shorter acidic side chains (Asp vs. Glu) adopted more extended conformations than those with longer side chains, which were able to reach the proline termini. Furthermore, the proline termini (Pro vs. Phe) proved necessary to maintain two-dimensional order within the aggregates. The peptide Pro-Asp-(Phe-Asp)n-Pro, which combines both of these features, shows the most ordered behavior and a small backbone twist, and is able to stabilize the formed aggregates through hydrogen bonds between the acidic side chains; it is therefore best suited for aggregation. This was further supported by assessing the stability of peptide aggregates set up close to experimental conditions, as well as the propensity of individual peptides to self-assemble from initially disordered configurations.
Since atomistic simulations are limited to small system sizes and relatively short time scales, a coarse-grained model was developed so that self-assembly can be studied on a larger scale. Because self-assembly at the interface is of interest, existing coarse-graining methods were extended to derive non-bonded potentials for inhomogeneous systems. The developed method is analogous to iterative Boltzmann inversion, but builds the update for the interaction potential from the radial distribution function in a slab geometry and from the widths of the slab and of the interface. In this way, a compromise between the local liquid structure and the thermodynamic properties of the interface can be achieved. The new method was demonstrated for a water slab and a methanol slab in vacuum, as well as for a single benzene molecule at the vacuum-water interface, an application of particular relevance in biology, where the thermodynamic and interfacial behavior often has to be preserved in addition to the structural properties of the system. Based on this, a coarse-grained model was parameterized via a fragment approach and the affinity of the peptide to the vacuum-water interface was tested. Although the individual fragments reproduced both the structure and the probability distributions at the interface, the peptide as a whole diffused away from the interface. However, reparameterizing the non-bonded interactions for one of the backbone fragments in a trimer caused the peptide to remain at the interface. This suggests that chain connectivity plays an important role in the behavior of the peptide at the interface.
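The standard iterative Boltzmann inversion update that the slab-geometry method builds on can be sketched as follows; the damping factor, temperature and radial-distribution curves are illustrative assumptions, and the interface-width weighting described above is not reproduced.

```python
import numpy as np

kB_T = 2.494  # kJ/mol at ~300 K

def ibi_update(U_current, g_current, g_target, damping=0.2, eps=1e-8):
    """One IBI step: U_{i+1}(r) = U_i(r) + damping * kT * ln(g_i(r) / g_target(r)).

    All arguments are tabulated on the same r grid; eps avoids log(0).
    """
    correction = kB_T * np.log((g_current + eps) / (g_target + eps))
    return U_current + damping * correction

# Toy usage with made-up RDF curves on a common r grid (nm)
r = np.linspace(0.3, 1.5, 121)
U0 = np.zeros_like(r)
g_target = np.exp(-((r - 0.9) ** 2) / 0.02)   # made-up target RDF shape
g_now = np.exp(-((r - 1.0) ** 2) / 0.02)      # made-up current RDF shape
U1 = ibi_update(U0, g_now, g_target)
```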
Abstract:
Adhesion, immune evasion and invasion are key determinants of bacterial pathogenesis. Pathogenic bacteria possess a wide variety of surface-exposed and secreted proteins which allow them to adhere to tissues, escape the immune system and spread throughout the human body. Extensive contacts between the human and bacterial extracellular proteomes therefore take place at the host-pathogen interface at the protein level. Recent research has emphasized the importance of a global and deeper understanding of the molecular mechanisms which underlie bacterial immune evasion and pathogenesis. Through the use of a large-scale, unbiased, protein microarray-based approach and of large libraries of purified human and bacterial proteins, novel host-pathogen interactions were identified. This approach was first applied to Staphylococcus aureus, the cause of a wide variety of diseases ranging from skin infections to endocarditis and sepsis. The screening led to the identification of several novel interactions between the human and the S. aureus extracellular proteomes. The interaction between the S. aureus immune evasion protein FLIPr (formyl-peptide receptor like-1 inhibitory protein) and the human complement component C1q, key players in the offense-defense interplay, was characterized using label-free techniques and functional assays. The same approach was also applied to Neisseria meningitidis, a major cause of bacterial meningitis and fulminant sepsis worldwide. The screening led to the identification of several potential human receptors for the neisserial adhesin A (NadA), an important adhesion protein and key determinant of meningococcal interactions with the human host at various stages. The interaction between NadA and human LOX-1 (oxidized low-density lipoprotein receptor) was confirmed using label-free technologies and cell binding experiments in vitro. Taken together, these two examples provide concrete insights into S. aureus and N. meningitidis pathogenesis, and identify protein microarrays coupled with appropriate validation methodologies as a powerful large-scale tool for host-pathogen interaction studies.
Abstract:
Movement analysis carried out in laboratory settings is a powerful but costly solution, since it requires dedicated instrumentation, space and personnel. Recently, new technologies such as magnetic and inertial measurement units (MIMUs) have become widely accepted as tools for the assessment of human motion in clinical and research settings. They are relatively easy to use and potentially suitable for estimating gait kinematic features, including spatio-temporal parameters. The objective of this thesis is the development and clinical testing of robust MIMU-based methods for assessing gait spatio-temporal parameters, applicable across a number of different pathological gait patterns. First, considering the need for a solution as unobtrusive as possible, the validity of a single-unit approach was explored: a comparative evaluation of the performance of various methods reported in the literature for estimating gait temporal parameters from a single unit attached to the trunk was carried out, first in normal gait and then in different pathological gait conditions. The second part of the research then addressed the development of new methods for estimating gait spatio-temporal parameters using shank-worn MIMUs in different groups of pathological subjects. In addition to the conventional gait parameters, new methods for estimating changes in the direction of progression were explored. Finally, a new hardware solution and an associated methodology for estimating the inter-foot distance during walking were proposed. The technical validation of the proposed methods at different walking speeds and along different paths against a gold standard showed that the use of two MIMUs attached to the lower limbs, combined with a robust method, guarantees much higher accuracy in determining gait spatio-temporal parameters. In conclusion, the proposed methods can be reliably applied to various abnormal gaits, in some cases with a level of accuracy comparable to that obtained in normal gait.
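As a simplified illustration of shank-worn inertial sensing (not one of the methods developed in the thesis), stride times can be estimated by detecting the prominent mid-swing peaks in the sagittal-plane angular velocity; the signal and thresholds below are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                                     # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
# Toy sagittal-plane shank angular velocity with one prominent peak per stride
gyro_sagittal = 3.0 * np.maximum(0.0, np.sin(2 * np.pi * 0.9 * t)) ** 4   # rad/s

peaks, _ = find_peaks(gyro_sagittal, height=1.0, distance=int(0.5 * fs))
stride_times = np.diff(peaks) / fs             # seconds between consecutive mid-swings
print(f"{len(peaks)} mid-swing peaks detected, "
      f"mean stride time {stride_times.mean():.2f} s")
```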
Abstract:
Every year, thousands of surgical treatments are performed in order to repair or, where possible, completely replace organs or tissues affected by degenerative diseases. Patients with these kinds of illnesses wait a long time for a donor able to replace the damaged organ or tissue at short notice. The lack of biological alternatives to conventional surgical treatments such as autografts, allografts and xenografts has led researchers from different fields to collaborate in finding innovative solutions. This research gave rise to a new discipline that merges molecular biology, biomaterials, engineering, biomechanics and, more recently, design and architecture knowledge. This discipline is named Tissue Engineering (TE) and represents a step forward towards substitutive and regenerative medicine. One of the major challenges of TE is to design and develop, using a biomimetic approach, an artificial 3D anatomical scaffold suitable for the adhesion of cells that can proliferate and differentiate in response to the biological and biophysical stimuli of the specific tissue to be replaced. Nowadays, powerful instruments allow increasingly accurate and well-defined analyses of patients who need more precise diagnoses and treatments. Starting from patient-specific information provided by CT (Computed Tomography), micro-CT and MRI (Magnetic Resonance Imaging), an image-based approach can be used to reconstruct the site to be replaced. With the aid of recent Additive Manufacturing techniques, which allow three-dimensional objects to be printed with sub-millimetric precision, it is now possible to exercise almost complete control over the parametric characteristics of the scaffold: this is the way to achieve correct cellular regeneration. In this work we focus on the branch of TE known as Bone TE, in which bone is the main subject. Bone TE combines the osteoconductive and morphological aspects of the scaffold, whose main properties are pore diameter, structural porosity and interconnectivity. Achieving the ideal values of these parameters is the main goal of this work: we create a simple and interactive biomimetic design process, based on 3D CAD modeling and generative algorithms, that provides a way to control the main properties and to create a structure morphologically similar to cancellous bone. Two different typologies of scaffold are compared: the first is based on Triply Periodic Minimal Surfaces (T.P.M.S.), whose basic crystalline geometries are nowadays used for Bone TE scaffolding; the second is based on Voronoi diagrams, which are more often used in the design of decorations and jewellery for their capacity to decompose and tessellate a volumetric space with a heterogeneous spatial distribution (frequent in nature). We show how to manipulate the main properties (pore diameter, structural porosity and interconnectivity) of the TE-oriented scaffold design through the implementation of generative algorithms: "bringing nature back to nature".
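As a minimal sketch of the T.P.M.S. idea, the following code samples a gyroid level-set on a voxel grid and estimates scaffold porosity from the solid-voxel fraction; the unit-cell size and wall-thickness threshold are illustrative assumptions rather than values used in this work.

```python
import numpy as np

n = 80                                          # voxels per edge
cell = 2 * np.pi                                # one gyroid unit cell
x, y, z = np.meshgrid(*[np.linspace(0, cell, n)] * 3, indexing="ij")

# Gyroid level-set: sin(x)cos(y) + sin(y)cos(z) + sin(z)cos(x)
gyroid = (np.sin(x) * np.cos(y)
          + np.sin(y) * np.cos(z)
          + np.sin(z) * np.cos(x))

thickness = 0.6                                 # level-set threshold -> wall thickness
solid = np.abs(gyroid) < thickness              # sheet-gyroid scaffold voxels
porosity = 1.0 - solid.mean()
print(f"estimated porosity: {porosity:.2f}")
```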
Abstract:
With the discovery that DNA can be successfully recovered from museum collections, a new source of genetic information has become available to extend our understanding of the evolutionary history of species. However, historical specimens are often mislabeled or carry incorrect information about their origin, so accurate identification of specimens is essential. Due to the highly damaged nature of ancient DNA, many pitfalls exist and particular precautions need to be taken in order to perform genetic analyses. In this study we analyze 208 historical remains of pelagic fishes collected at the beginning of the 20th century. By adapting existing protocols, usually applied to human remains, we managed to successfully retrieve valuable genetic material from almost all of the examined samples using a guanidine and silica-column-based approach. The combined use of two mitochondrial markers, cytochrome oxidase 1 (mtDNA COI) and the control region (mtDNA CR), and the nuclear marker first internal transcribed spacer (ITS1) allowed us to identify the majority of the examined specimens using traditional PCR and Sanger sequencing techniques. The primers designed to amplify heavily degraded DNA have great potential for future use, both in ancient and in modern investigations. The methodologies developed in this study can in fact be applied to other ancient fish specimens as well as to cooked or canned samples.
Abstract:
Human activities strongly influence environmental processes, and as human domination increases, biodiversity progressively declines in ecosystems worldwide. High genetic and phenotypic variability ensures the functionality and stability of ecosystem processes through time and increases the resilience and adaptive capacity of populations and communities, while a reduction in functional diversity leads to a decreased ability to respond to a changing environment. Pollution is becoming one of the major threats to aquatic ecosystems, and pharmaceutical and personal care products (PPCPs) in particular are a relatively new group of environmental contaminants suspected to have adverse effects on aquatic organisms. There is still a lack of knowledge about the responses of communities to complex chemical mixtures in the environment. We used an individual-trait-based approach to assess the response of a phytoplankton community in a scenario of combined pollution and environmental change (a steady increase in temperature). We manipulated individual-level trait diversity directly (by filtering out size classes) and indirectly (through exposure to a PPCP mixture), and studied how the reduction in trait diversity affected community structure, biomass production and the ability of the community to track a changing environment. We found that exposure to PPCPs slows down the ability of the community to respond to increasing temperature. Our study also highlights how physiological responses (induced by PPCP exposure) are important for ecosystem processes: although from an ecological point of view the experimental communities converged to a similar structure, they were functionally different.
Abstract:
We have developed a haptic-based approach for retraining interjoint coordination following stroke, called time-independent functional training (TIFT), and implemented this mode in the ARMin III robotic exoskeleton. The ARMin III robot was developed by Drs. Robert Riener and Tobias Nef at the Swiss Federal Institute of Technology Zurich (Eidgenössische Technische Hochschule Zürich, or ETH Zurich), in Zurich, Switzerland. In the TIFT mode, the robot maintains arm movements within the proper kinematic trajectory via haptic walls at each joint. These walls focus training on interjoint coordination with highly intuitive real-time feedback of performance: arm movements advance within the trajectory only if their coordination is correct. In initial testing, 37 nondisabled subjects received a single session of learning of a complex pattern. Subjects were randomized to TIFT, to visual demonstration, or to moving along with the robot as it moved through the pattern (time-dependent [TD] training). We examined visual demonstration to separate the effects of action observation on motor learning from the effects of the two haptic guidance methods. During these training trials, TIFT subjects reduced error and the interaction forces between the robot and the arm, while TD subject performance did not change. All groups showed significant learning of the trajectory during unassisted recall trials, but we observed no difference in learning between groups, possibly because this learning task is dominated by vision. Further testing in stroke populations is warranted.
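A haptic wall can be sketched as a one-sided spring-damper that acts only when the joint leaves its corridor; the gains below are made-up values, not the ARMin III/TIFT controller parameters.

```python
def haptic_wall_torque(q, q_min, q_max, q_dot=0.0, stiffness=50.0, damping=1.0):
    """Restoring torque (N*m) applied only when the joint angle q (rad) leaves [q_min, q_max]."""
    if q > q_max:
        return -stiffness * (q - q_max) - damping * q_dot
    if q < q_min:
        return -stiffness * (q - q_min) - damping * q_dot
    return 0.0   # inside the corridor the robot stays transparent

print(haptic_wall_torque(0.62, 0.0, 0.60, q_dot=0.1))   # small corrective torque
```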
Abstract:
Bone morphogenetic proteins (BMPs) have to be applied at high concentrations to stimulate bone healing. Their limited therapeutic efficacy may be due to the local presence of BMP antagonists such as Noggin; inhibiting BMP antagonists is therefore an attractive therapeutic option. We hypothesized that the engineered BMP2 variant L51P stimulates osteoinduction by antagonizing the Noggin-mediated inhibition of BMP2. Primary murine osteoblasts (OB) were treated with L51P, BMP2, and Noggin. OB proliferation and differentiation were quantified with XTT and alkaline phosphatase (ALP) assays. BMP receptor-dependent intracellular signaling in OB was evaluated with Smad and p38 MAPK phosphorylation assays. BMP2, Noggin, BMP receptor Ia/Ib/II, osteocalcin, and ALP mRNA expression was analyzed with real-time PCR. L51P stimulated OB differentiation by blocking the Noggin-mediated inhibition of BMP2. L51P did not induce OB differentiation directly and did not activate BMP receptor-dependent intracellular signaling via the Smad pathway. Treatment of OB cultures with BMP2, but not with L51P, resulted in increased expression of ALP, BMP2, and Noggin mRNA. By inhibiting the BMP antagonist Noggin, L51P enhances BMP2 activity and stimulates osteoinduction without exhibiting a direct osteoinductive function. Indirect osteoinduction with L51P appears advantageous compared with osteoinduction by BMP2, as BMP2 stimulates the expression of Noggin and thereby self-limits its own osteoinductive activity. Treatment with L51P is the first protein-based approach available to augment BMP2-induced bone regeneration through inhibition of BMP antagonists. The described strategy may help to decrease the amounts of exogenous BMPs currently required to stimulate bone healing.
Abstract:
BACKGROUND: In order to optimise the cost-effectiveness of active surveillance to substantiate freedom from disease, a new approach using targeted sampling of farms was developed and applied, using infectious bovine rhinotracheitis (IBR) and enzootic bovine leucosis (EBL) in Switzerland as examples. Relevant risk factors (RF) for the introduction of IBR and EBL into Swiss cattle farms were identified and their relative risks defined based on a literature review and expert opinion. A quantitative model based on the scenario tree method was subsequently used to calculate the required sample size of a targeted sampling approach (TS) for a given sensitivity. We compared this sample size with that of a stratified random sample (sRS) with regard to efficiency. RESULTS: The required sample sizes to substantiate disease freedom were 1,241 farms for IBR and 1,750 farms for EBL to detect a 0.2% herd prevalence with 99% sensitivity. Using conventional sRS, the required sample sizes were 2,259 farms for IBR and 2,243 for EBL. Even considering the additional administrative expenses required for planning the TS, the risk-based approach was more cost-effective than an sRS (a 40% reduction in the full survey costs for IBR and 8% for EBL) due to the considerable reduction in sample size. CONCLUSIONS: As the model depends on RF selected through literature review and was parameterised with values estimated by experts, it is subject to some degree of uncertainty. Nevertheless, this approach provides the veterinary authorities with a promising tool for future cost-effective sampling designs.
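For reference, the conventional sample-size formula for simple random sampling to substantiate freedom from disease is n = ln(1 − SSe) / ln(1 − P*·Se); the sketch below reproduces the order of magnitude of the sRS figures quoted above and does not implement the scenario-tree model used for targeted sampling.

```python
import math

def random_sample_size(target_sensitivity, design_prevalence, herd_test_sensitivity=1.0):
    """Herds to sample so that disease at the design prevalence is detected
    with the target surveillance sensitivity (simple random sampling)."""
    return math.ceil(math.log(1.0 - target_sensitivity)
                     / math.log(1.0 - design_prevalence * herd_test_sensitivity))

# 99% surveillance sensitivity, 0.2% herd-level design prevalence
print(random_sample_size(0.99, 0.002))   # ~2300 herds with a perfect herd-level test
```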