895 results for Customer-value based approach
Abstract:
Purpose - The purpose of this paper is twofold: to analyze the computational complexity of the cogeneration design problem, and to present an expert system to solve the proposed problem, comparing such an approach with the traditional search methods available. Design/methodology/approach - The complexity of the cogeneration problem is analyzed through a transformation from the well-known knapsack problem. Both problems are formulated as decision problems, and it is proven that the cogeneration problem is NP-complete. Thus, several search approaches, such as population heuristics and dynamic programming, could be used to solve the problem. Alternatively, a knowledge-based approach is proposed by presenting an expert system and its knowledge representation scheme. Findings - The expert system was executed on two case studies. In the first, a cogeneration plant must meet power, steam, chilled-water and hot-water demands. The expert system presented two different solutions based on high-complexity thermodynamic cycles. In the second case study the plant must meet only power and steam demands. The system presented three different solutions, one of which had never been considered before by our consulting expert. Originality/value - The expert system approach is not a "blind" method, i.e. it generates solutions based on actual engineering knowledge instead of the search strategies of traditional methods. This means the system is able to explain its choices, making the design rationale for each solution available. This is the main advantage of the expert system approach over traditional search methods. On the other hand, the expert system quite likely does not provide an actual optimal solution; all it can provide is one or more acceptable solutions.
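For reference, a minimal LaTeX statement of the knapsack decision problem that serves as the source of the reduction; the paper's exact mapping onto cogeneration design components is not given in the abstract, so only the standard formulation is shown.

```latex
% KNAPSACK (decision version): the NP-complete source problem of the reduction.
% Instance: item values v_i, weights w_i, capacity W, target value V.
% Question: is there a subset S of the items such that
\[
\exists\, S \subseteq \{1,\dots,n\}:\qquad
\sum_{i \in S} w_i \;\le\; W
\quad\text{and}\quad
\sum_{i \in S} v_i \;\ge\; V\,?
\]
```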
Abstract:
Registration is a necessarily sophisticated evaluation process applied to vertebrate pesticide products. Although conducted to minimize any potential impacts upon public health, the environment and food production, the all-encompassing process of registration can stifle innovation. Vertebrate pesticides are rarely used to control pest animals in food crops. In contrast to agrochemicals, relatively small amounts of vertebrate pesticides are used (<0.1%), usually in solid or paste baits, and generally by discrete application methods rather than by broad-scale spray applications. We present a hierarchy, or sliding scale, of typical data requirements relative to application techniques, to help clarify an evolving science-based approach that focuses on requiring data to address key scientific questions while allowing waivers where additional data have minor value. Such an approach will facilitate the development and delivery of increasingly humane, species-targeted, low-residue pesticides in the New World, along with the phasing out of less desirable chemicals that continue to be used for lack of alternatives.
Abstract:
Land development in the vicinity of airports often leads to land use that can attract birds hazardous to aviation operations. For this reason, certain forms of land use have traditionally been discouraged within prescribed distances of Canadian airports. However, this often leads to an unrealistic prohibition of land use in the vicinity of airports located in urban settings. Furthermore, it is often unclear whether the desired safety goals have been achieved. This paper describes a model that was created to assist in the development of zoning regulations for a future airport site in Canada. The framework links land use to bird-related safety risks and aircraft operations by categorizing the predictable relationships between: (i) different land uses found in urbanized and urbanizing settings near airports; (ii) bird species; and (iii) the different safety risks to aircraft during various phases of flight. The latter is assessed relative to the runway approach and departure paths. Bird species are ranked to reflect the potential severity of an impact with an aircraft (using bird weight, flocking characteristics, and flight behaviours). These criteria are then employed to chart bird-related safety risks relative to runway reference points. Each form of land use is categorized to reflect the degree to which it attracts hazardous bird species. From this information, hazard and risk matrices have been developed and applied to the future airport setting, thereby providing risk-based guidance on appropriate land uses, ranging from prohibited to acceptable. The framework has subsequently been applied to an existing Canadian airport, and is currently being adapted for national application. It provides a risk-based, science-based approach that offers municipalities and property owners flexibility in managing the aviation risks related to their land use.
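The abstract does not publish the ranking formulas, so the Python sketch below is a hypothetical illustration only: it folds the three stated severity criteria (bird weight, flocking characteristics, flight behaviour) into a rank and crosses it with a land-use attractiveness score, in the spirit of the hazard and risk matrices described. All names, weights and thresholds are invented.

```python
# Hypothetical sketch of a bird-hazard risk matrix; the real model's
# weights and categories are not given in the abstract.

def species_severity(weight_kg: float, flocking: bool, crosses_approach: bool) -> int:
    """Rank 1 (low) .. 4 (high) from weight, flocking and flight behaviour."""
    score = 1
    if weight_kg > 1.8:       # heavier birds do more damage on impact
        score += 1
    if flocking:              # flocks raise multiple-strike probability
        score += 1
    if crosses_approach:      # activity on approach/departure paths
        score += 1
    return score

def land_use_risk(attractiveness: int, severity: int) -> str:
    """Cross attractiveness (1..4) with severity (1..4) into guidance."""
    product = attractiveness * severity
    if product >= 12:
        return "prohibited"
    if product >= 6:
        return "conditional"
    return "acceptable"

# Example: a highly attractive land use (score 4) near a flocking,
# approach-crossing species of about 1.1 kg.
print(land_use_risk(attractiveness=4,
                    severity=species_severity(1.1, flocking=True,
                                              crosses_approach=True)))
# -> "prohibited"
```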
Abstract:
Rare variants are becoming the new candidates in the search for genetic variants that predispose individuals to a phenotype of interest. Their low prevalence in a population requires the development of dedicated detection and analytical methods. A family-based approach could greatly enhance their detection and interpretation, because rare variants are nearly family-specific. In this report, we test several distinct approaches for analyzing the information provided by rare and common variants and how they can be effectively used to pinpoint putative candidate genes for follow-up studies. The analyses were performed on the mini-exome data set provided by Genetic Analysis Workshop 17. Eight approaches were tested, four using the trait's heritability estimates and four using QTDT models. These methods had their sensitivity, specificity, and positive and negative predictive values compared in light of the simulation parameters. Our results highlight important limitations of current methods for dealing with rare and common variants: all methods presented reduced specificity and were consequently prone to false-positive associations. Methods analyzing common-variant information showed enhanced sensitivity compared with rare-variant methods. Furthermore, our limited knowledge of the use of biological databases for gene annotations, possibly for use as covariates in regression models, imposes a barrier to further research.
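The four comparison criteria named in the abstract reduce to confusion-matrix arithmetic; a minimal Python sketch, assuming causal genes are the positives and flagged genes the predictions (counts are invented):

```python
def screening_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Confusion-matrix metrics used to compare gene-flagging methods."""
    return {
        "sensitivity": tp / (tp + fn),   # causal genes correctly flagged
        "specificity": tn / (tn + fp),   # non-causal genes correctly passed over
        "ppv":         tp / (tp + fp),   # flagged genes that are truly causal
        "npv":         tn / (tn + fn),   # unflagged genes truly non-causal
    }

# Example: a method with low specificity produces many false positives,
# which is the failure mode the study reports.
print(screening_metrics(tp=8, fp=120, tn=300, fn=4))
```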
Abstract:
Abstract Background Over the last few years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks have been designed around the abstractions provided by this new paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring two main technical skills: (i) knowing the syntax details of the programming language employed to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only be initiated once the development process reaches the implementation phase, preventing it from starting earlier. Method In order to solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be automatically generated. Results We also present the results of two comparative experiments using two versions of a persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions. The results show an improvement of 97% in productivity; however, little difference was perceived regarding the effort of maintaining the resulting application. Conclusion Using the approach presented here, we conclude the following: (i) it is possible to automate the instantiation of CFs, and (ii) developer productivity improves as long as a model-based instantiation approach is used.
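The abstract does not show the generated reuse code, so the following Python sketch only illustrates the general idea of model-driven instantiation: a filled-in Reuse Model (RM) rendered into reuse code through a template. The field names and the generated syntax are hypothetical, not the approach's actual notation.

```python
# Hypothetical illustration of model-driven framework instantiation:
# a filled-in Reuse Model (RM) is rendered into reuse code via a template.
# Field names and the generated syntax are invented for this sketch.
from string import Template

RM = {  # values an application engineer would fill in
    "application_class": "Customer",
    "persistent_fields": ["name", "email"],
}

TEMPLATE = Template(
    "// generated reuse code (hypothetical syntax)\n"
    "bind persistence to $application_class {\n"
    "$fields"
    "}\n"
)

def generate_reuse_code(rm: dict) -> str:
    fields = "".join(f"    persist field {f};\n" for f in rm["persistent_fields"])
    return TEMPLATE.substitute(application_class=rm["application_class"],
                               fields=fields)

print(generate_reuse_code(RM))
```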
Abstract:
Facial expression recognition is one of the most challenging research areas in the image recognition field and has been actively studied since the 1970s. For instance, smile recognition has been studied because the smile is considered an important facial expression in human communication, and it is therefore likely to be useful for human–machine interaction. Moreover, if a smile can be detected and its intensity estimated, it will raise the possibility of new applications in the future.
Abstract:
This paper is a proposal for teaching pragmatics following a corpus-based approach. Corpora have had a major impact on how linguistics is approached today. However, the teaching of linguistics is still traditional in its scope and stands apart from the growing tendency to incorporate authentic samples into the theoretical classroom, so lecturers perpetuate the presentation of the same canonical examples students may find in their textbooks or in other introductory monographs. Our view is that using corpus linguistics, especially corpora freely available on the World Wide Web, will result in a more engaging and fresh look at the course of Pragmatics, while promoting early research in students. This way, they learn the concepts but, most importantly, also how to later identify pragmatic phenomena in real text. Here, we address the methodology, presenting clear examples of corpus-based pragmatic activities, and one clear result is that students also learn how to be autonomous in their analysis of data. In our proposal, we move from more controlled tasks to autonomy. This proposal focuses on students enrolled in the course Pragmática de la Lengua inglesa, currently part of the curriculum in Lenguas Modernas, Universidad de Las Palmas de Gran Canaria.
Abstract:
The construction and use of multimedia corpora has been advocated for a while in the literature as one of the expected future application fields of Corpus Linguistics. This research project represents a pioneering experience aimed at applying a data-driven methodology to the study of the field of AVT, similarly to what has been done in the last few decades in the macro-field of Translation Studies. This research was based on the experience of Forlixt 1, the Forlì Corpus of Screen Translation, developed at the University of Bologna's Department of Interdisciplinary Studies in Translation, Languages and Culture. In fact, in order to quantify strategies of linguistic transfer of an AV product, we need to take into consideration not only the linguistic aspect of such a product but all the meaning-making resources deployed in the filmic text. Given that one major benefit of Forlixt 1 is the combination of audiovisual and textual data, this corpus allows the user to access primary data for scientific investigation, and thus no longer rely on pre-processed material such as traditional annotated transcriptions. Based on this rationale, the first chapter of the thesis sets out to illustrate the state of the art of research in the disciplinary fields involved. The primary objective was to underline the main repercussions on multimedia texts resulting from the interaction of a double support, audio and video, and, accordingly, on the procedures, means and methods adopted in their translation. Drawing on previous research in semiotics and film studies, the relevant codes at work in the visual and acoustic channels were outlined. Subsequently, we concentrated on the analysis of the verbal component and on the peculiar characteristics of filmic orality as opposed to spontaneous dialogic production. In the second part, an overview of the main AVT modalities was presented (dubbing, voice-over, interlinguistic and intralinguistic subtitling, audio-description, etc.) in order to define the different technologies, processes and professional qualifications that this umbrella term presently includes. The second chapter focuses diachronically on the contribution of various theories to the application of Corpus Linguistics' methods and tools to the field of Translation Studies (i.e. Descriptive Translation Studies, Polysystem Theory). In particular, we discussed how the use of corpora can help reduce the gap between qualitative and quantitative approaches. Subsequently, we reviewed the tools traditionally employed by Corpus Linguistics for the construction of traditional "written language" corpora, to assess whether and how they can be adapted to meet the needs of multimedia corpora. In particular, we reviewed existing speech and spoken corpora, as well as multimedia corpora specifically designed to investigate translation. The third chapter reviews Forlixt 1's main development steps, from a technical (IT design principles, data query functions) and methodological point of view, laying down extensive scientific foundations for the annotation methods adopted, which presently encompass categories of a pragmatic, sociolinguistic, linguacultural and semiotic nature. Finally, we describe the main query tools (free search, guided search, advanced search and combined search) and the main intended uses of the database from a pedagogical perspective.
The fourth chapter lists the specific compilation criteria retained, as well as statistics for the two sub-corpora, presenting data broken down by language pair (French-Italian and German-Italian) and genre (film comedies, television soap operas and crime series). Next, we concentrate on the discussion of the results obtained from the analysis of summary tables reporting the frequency of categories applied to the French-Italian sub-corpus. The detailed observation of the distribution of categories identified in the original and dubbed corpus allowed us to empirically confirm some of the theories put forward in the literature, notably concerning the nature of the filmic text, the dubbing process and the features of Italian dubbed language. This was possible by looking into some of the most problematic aspects, such as the rendering of sociolinguistic variation. The corpus also allowed us to consider hitherto neglected aspects, such as pragmatic, prosodic, kinetic, facial and semiotic elements, and their combination. At the end of this first exploration, some specific observations concerning possible macrotranslation trends were made for each type of sub-genre considered (cinematic and TV genres). On the grounds of this first quantitative investigation, the fifth chapter set out to examine the data further by applying ad hoc models of analysis. Given the virtually infinite number of combinations of the categories adopted, and of the latter with searchable textual units, three qualitative and quantitative methods were designed, each concentrating on a particular translation dimension of the filmic text. The first was the cultural dimension, which focused specifically on the rendering of selected cultural references and on the investigation of recurrent translation choices and strategies justified on the basis of the occurrence of specific clusters of categories. The second analysis was conducted on the linguistic dimension, exploring the occurrence of phrasal verbs in the Italian dubbed corpus and ascertaining the influence of possible semiotic traits, such as gestures and facial expressions, on the adoption of related translation strategies. Finally, the main aim of the third study was to verify whether, under which circumstances, and through which modality graphic and iconic elements were translated into Italian from an original corpus of both German and French films. After reviewing the main translation techniques at work, an exhaustive account of possible causes for their non-translation was also provided. By way of conclusion, the discussion of the results obtained from the distribution of annotation categories on the French-Italian corpus, as well as the application of specific models of analysis, allowed us to underline possible advantages and drawbacks of adopting a corpus-based approach to AVT studies. Even though possible updates and improvements were proposed to help solve some of the problems identified, it is argued that the added value of Forlixt 1 lies ultimately in having created a valuable instrument, making it possible to carry out empirically sound contrastive studies that may be usefully replicated on different language pairs and several types of multimedia texts. Furthermore, multimedia corpora can also play a crucial role in L2 and translation teaching, two disciplines in which their use still lacks systematic investigation.
Abstract:
We propose an extension of the approach of Klüppelberg and Kuhn (2009) for inference on second-order structure moments. As in Klüppelberg and Kuhn (2009), we adopt a copula-based approach instead of assuming a normal distribution for the variables, thus relaxing the equality-in-distribution assumption. A new copula-based estimator for structure moments is investigated. The methodology of Klüppelberg and Kuhn (2009) is also extended by considering the copulas associated with the family of Eyraud-Farlie-Gumbel-Morgenstern distribution functions (Kotz, Balakrishnan, and Johnson, 2000, Equation 44.73). Finally, a comprehensive simulation study and an application to real financial data are performed in order to compare the different approaches.
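For reference, the bivariate member of the Eyraud-Farlie-Gumbel-Morgenstern family has the standard form below; the multivariate version used in the paper and its parameter constraints are not reproduced in the abstract.

```latex
% Bivariate EFGM copula, dependence parameter \theta \in [-1, 1]:
\[
C_{\theta}(u,v) \;=\; u\,v\,\bigl[\,1 + \theta\,(1-u)(1-v)\,\bigr],
\qquad u,v \in [0,1].
\]
```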
Abstract:
From the late 1980s, the automation of sequencing techniques and the spread of computers gave rise to a flourishing number of new molecular structures and sequences and to a proliferation of new databases in which to store them. Three computational approaches are presented here, each able to analyse the massive amount of publicly available data in order to answer important biological questions. The first strategy studies the incorrect assignment of the first AUG codon in a messenger RNA (mRNA), due to the incomplete determination of its 5' end sequence. An extension of the mRNA 5' coding region was identified in 477 human loci, out of all known human mRNAs analysed, using an automated expressed sequence tag (EST)-based approach. Proof-of-concept confirmation was obtained by in vitro cloning and sequencing for GNB2L1, QARS and TDP2, and the consequences for functional studies are discussed. The second approach analyses codon bias, the phenomenon in which distinct synonymous codons are used with different frequencies, and, following integration with a gene expression profile, estimates the total number of codons present across all the expressed mRNAs (named here the "codonome value") in a given biological condition. Systematic analyses across different pathological and normal human tissues and multiple species show a surprisingly tight correlation between the codon bias and the codonome bias. The third approach is used to study the expression of genes implicated in human autism spectrum disorder (ASD). ASD-implicated genes sharing microRNA response elements (MREs) for the same microRNA are co-expressed in brain samples from healthy and ASD-affected individuals. The differential expression of a recently identified long non-coding RNA, which has four MREs for the same microRNA, could disrupt the equilibrium of this network, but further analyses and experiments are needed.
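As a sketch of what the abstract calls the "codonome value", the Python fragment below totals codon usage across mRNAs weighted by expression level; the actual estimator's normalization and data sources are not given in the abstract, and the sequences here are toy data.

```python
from collections import Counter

def codon_counts(cds: str) -> Counter:
    """Count codons in a coding sequence (length must be a multiple of 3)."""
    assert len(cds) % 3 == 0
    return Counter(cds[i:i + 3] for i in range(0, len(cds), 3))

def codonome(transcripts: dict[str, str], expression: dict[str, float]) -> Counter:
    """Total codon usage across expressed mRNAs, weighted by expression level."""
    total: Counter = Counter()
    for gene, cds in transcripts.items():
        level = expression.get(gene, 0.0)
        for codon, n in codon_counts(cds).items():
            total[codon] += n * level
    return total

# Toy example with two short, invented CDSs and expression levels.
cds = {"geneA": "ATGGCTGCTTAA", "geneB": "ATGGCCTAA"}
expr = {"geneA": 10.0, "geneB": 2.0}
print(codonome(cds, expr).most_common(3))
```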
Abstract:
The aim of this dissertation is the experimental characterization and quantitative description of the hybridization of complementary nucleic acid strands with surface-bound capture molecules, for the development of integrated biosensors. In contrast to solution-based methods, microarray substrates allow many nucleic acid combinations to be investigated in parallel. The actin gene from different plant species, universally expressed in eukaryotes, was used as a biologically relevant evaluation system. This test system makes it possible to characterize closely related plant species on the basis of small differences in gene sequence (SNPs). Building on this well-studied model of a house-keeping gene, a comprehensive microarray system was realized, consisting of short and long oligonucleotides (with incorporated LNA molecules), cDNAs, and DNA and RNA targets. This yielded a test system with high signal intensities, optimized for online measurement. Based on the results, the entire signal path from nucleic acid concentration to digital value was modeled. The insights into the kinetics and thermodynamics of hybridization gained from the development and the experiments are summarized in three publications, which form the backbone of this dissertation. The first publication describes the improvement in reproducibility and specificity of microarray results through online measurement of kinetics and thermodynamics, compared with endpoint-based measurements on standard microarrays. Two algorithms were developed to evaluate the huge amounts of data: a reaction-kinetic modeling of the isotherms and a description of the melting transition based on Fermi-Dirac statistics. These algorithms are described in the second publication. By realizing identical sequences in the chemically distinct nucleic acids (DNA, RNA and LNA), it is possible to investigate defined differences in the conformation of the ribose ring and the C5 methyl group of the pyrimidines. The competitive interaction of these different nucleic acids of identical sequence, and its effects on kinetics and thermodynamics, is the subject of the third publication. Beyond the molecular-biological and technological development in sensing hybridization reactions of surface-bound nucleic acid molecules, the automated evaluation and modeling of the resulting data volumes, and the improved quantitative description of the kinetics and thermodynamics of these reactions, the results contribute to a better understanding of the physico-chemical structure of the most elementary biological molecule and of its still incompletely understood specificity.
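The abstract names a Fermi-Dirac-statistics description of the melting transition without giving the formula; one plausible minimal form, with the hybridized fraction playing the role of the occupation number, is the logistic curve below (the dissertation's exact parameterization may differ).

```latex
% Fermi-Dirac-type melting curve: fraction of hybridized duplexes
% as a function of temperature T, melting temperature T_m, width w:
\[
\theta(T) \;=\; \frac{1}{1 + \exp\!\bigl[(T - T_m)/w\bigr]},
\]
% so that \theta(T_m) = 1/2 and w sets the sharpness of the transition.
```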
Abstract:
In chronic myeloid leukemia (CML) and Philadelphia-positive (Ph+) acute lymphoblastic leukemia (ALL) patients resistant to tyrosine kinase inhibitors (TKIs), BCR-ABL kinase domain (KD) mutation status is an essential component of the therapeutic decision algorithm. The recent development of the Ultra-Deep Sequencing (UDS) approach has opened the way to a more accurate characterization of the mutant clones surviving TKIs, combining assay sensitivity and throughput. We set up and validated a UDS-based approach for BCR-ABL KD mutation screening in order to i) resolve qualitatively and quantitatively the complexity and the clonal structure of the mutated populations surviving TKIs; ii) study the dynamics of expansion of mutated clones in relation to TKI therapy; and iii) assess whether UDS allows more sensitive detection of emerging clones harboring critical 2GTKI-resistant mutations predicting an impending relapse, earlier than Sanger sequencing (SS). UDS was performed on a Roche GS Junior instrument, according to an amplicon sequencing design and protocol set up and validated in the framework of the IRON-II (Interlaboratory Robustness of Next-Generation Sequencing) International consortium. Samples from CML and Ph+ ALL patients who had developed resistance to one or multiple TKIs, collected at regular time-points during treatment, were selected for this study. Our results indicate the technical feasibility, accuracy and robustness of our UDS-based BCR-ABL KD mutation screening approach. UDS provided a more accurate picture of BCR-ABL KD mutation status, both in terms of the presence/absence of mutations and in terms of clonal complexity, and showed that the BCR-ABL KD mutations detected by SS are only the "tip of the iceberg". In addition, UDS may reliably pick up 2GTKI-resistant mutations earlier than SS in a significantly greater proportion of patients. The enhanced sensitivity, as well as the possibility of identifying low-level mutations, points to the UDS-based approach as an ideal alternative to conventional sequencing for BCR-ABL KD mutation screening in TKI-resistant Ph+ leukemia patients.
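No algorithmic detail is given in the abstract; the Python sketch below shows only the generic logic of deep-sequencing mutation calling, flagging variants whose read fraction clears a threshold far below the roughly 20% detection limit commonly quoted for Sanger sequencing. Thresholds and counts are invented.

```python
# Generic illustration of low-level mutation calling from read counts;
# thresholds and data are invented, not taken from the study.

SANGER_LIMIT = 0.20   # detection limit commonly quoted for Sanger sequencing
UDS_LIMIT = 0.01      # illustrative deep-sequencing threshold

def call_mutations(read_counts: dict[str, int], coverage: int,
                   threshold: float) -> dict[str, float]:
    """Return mutations whose read fraction reaches the given threshold."""
    return {mut: n / coverage
            for mut, n in read_counts.items()
            if n / coverage >= threshold}

counts = {"T315I": 48, "F317L": 900, "E255K": 7}   # hypothetical amplicon data
coverage = 4000

print("Sanger-visible:", call_mutations(counts, coverage, SANGER_LIMIT))
print("UDS-visible:  ", call_mutations(counts, coverage, UDS_LIMIT))
# T315I at ~1.2% is picked up by deep sequencing but invisible to Sanger.
```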
Abstract:
This work focuses on the analysis of sea-level change over the last century, based mainly on instrumental observations. For this period, the individual components of sea-level change are investigated at both global and regional scales. Some of the geophysical processes responsible for current sea-level change, such as glacial isostatic adjustment and currently melting terrestrial ice sources, have been modeled and compared with observations. A new value of global mean sea-level change, based on tide gauge observations, has been independently assessed at 1.5 mm/year, using corrections for glacial isostatic adjustment obtained with different models as a criterion for tide gauge selection. The long-wavelength spatial variability of the main components of sea-level change has been investigated by means of traditional and new spectral methods. Complex non-linear trends and abrupt sea-level variations shown by tide gauge records have been addressed by applying different approaches to regional case studies. The Ensemble Empirical Mode Decomposition technique has been used to analyse tide gauge records from the Adriatic Sea to ascertain the existence of cyclic sea-level variations. An early-warning approach has been adopted to detect tipping points in sea-level records of the North East Pacific and their relationship with oceanic modes. Global sea-level projections to the year 2100 have been obtained by a semi-empirical approach based on the artificial neural network method. In addition, a model-based approach has been applied to the case of the Mediterranean Sea, obtaining sea-level projections to the year 2050.
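As an illustration of the decomposition step, a minimal sketch using the third-party PyEMD package (published on PyPI as EMD-signal) to split a synthetic monthly tide-gauge series into intrinsic mode functions plus a residual trend; the package choice and the data are assumptions, not the thesis's actual toolchain.

```python
# Sketch: Ensemble Empirical Mode Decomposition of a tide-gauge record.
# Assumes the third-party PyEMD package (pip install EMD-signal);
# the data here are synthetic, not the Adriatic records used in the thesis.
import numpy as np
from PyEMD import EEMD

rng = np.random.default_rng(0)
t = np.arange(1200) / 12.0                       # 100 years, monthly
sea_level = (1.5 * t                             # secular trend, mm/yr
             + 40 * np.sin(2 * np.pi * t)        # annual cycle, mm
             + 10 * rng.standard_normal(t.size)) # noise

eemd = EEMD(trials=100)                          # ensemble of noise realizations
imfs = eemd.eemd(sea_level, t)                   # intrinsic mode functions
residue = sea_level - imfs.sum(axis=0)           # slowly varying trend left over

print(f"{imfs.shape[0]} IMFs extracted; residual range "
      f"{residue.min():.1f}..{residue.max():.1f} mm")
```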
Abstract:
Patients suffering from osteosarcoma are currently treated with intravenously administered chemotherapeutic agents after tumor resection, which is often accompanied by severe side effects and a delayed bone-healing process. Moreover, relapses frequently occur due to neoplastic cells remaining at the tumor resection site. Successful bone regeneration and control of the cancer cells remaining in the tissue pose a challenge for tissue engineering after bone loss through tumor removal. In this respect, the use of hydroxyapatite as a bone substitute material in combination with cyclodextrin as a drug carrier appears promising. Chemotherapeutic agents can be bound to the biomaterial and released directly at the tumor bed over an extended period in order to eliminate remaining neoplastic cells. Locally applied chemotherapy has several advantages, including a direct cytotoxic effect on local cells and a reduction of severe side effects. This study was conducted to evaluate the functionality of such a drug delivery system and to develop tissue engineering strategies intended to promote the bone-healing process and, in particular, vascularization. The results show that not only cancer cells are affected by chemotherapeutic treatment. Primary endothelial cells such as HUVECs showed high sensitivity to cisplatin and doxorubicin. Both drugs triggered a tumor-suppressive signal in HUVECs through the upregulation of p53 and p21. In addition, hypoxia appears to have a chemotherapeutic influence, since treating sensitive HUVECs with hypoxia protected the cells from cytotoxicity. This chemo-protective effect appeared to act far less on cancer cell lines. These results could represent a possible chemotherapeutic strategy to improve the effect of targeted drug delivery on cancer cells while sparing healthy cells. Successful integration of a drug delivery system, combined with a biomaterial for stabilization and regeneration, could give healthy endothelial cells the opportunity to proliferate and form blood vessels while remaining cancer cells are eliminated. Since the process of bone tissue remodeling is accompanied by a severe impairment of the patient's quality of life, accelerating the postoperative healing process is one of the goals of tissue engineering. The formation of blood vessels is indispensable for the successful integration of a bone graft into the tissue. An extensively developed blood vessel system is therefore desirable for an improved healing process in clinical application. Previous experiments show that the use of co-cultures of human primary osteoblasts (pOB) and human outgrowth endothelial cells (OEC) is successful with regard to the formation of stable vessel-like structures in vitro, which could also be efficiently integrated into the microvascular system in vivo. This approach could be used to produce pre-vascularized constructs that promote the bone-healing process after implantation.
In addition, the co-culture system represents an excellent in vitro model for identifying and analyzing factors that are strongly involved in the processes of bone healing and angiogenesis. Macrophages are known to play a decisive role in inflammation-induced angiogenesis. In this context, this study highlights the positive influence of THP-1-derived macrophages in co-culture with pOB and OEC. The results showed that the use of macrophages as an inflammatory stimulus in the established co-culture system led to a pro-angiogenic activation of the OEC, resulting in a significantly increased formation of blood-vessel-like structures in vitro. Furthermore, the analysis of factors that play an important role in inflammation-induced angiogenesis revealed a clear upregulation of VEGF, inflammatory cytokines and adhesion molecules, which ultimately contribute to enhanced vascularization. These results are attributed to the influence of macrophages and could be used in tissue engineering in the future to accelerate the healing process and thereby improve the clinical situation of patients. Moreover, the combination of co-culture-based approaches for bone tissue engineering with a biomaterial-based drug delivery system could find clinical application, linking the elimination of remaining cancer cells with the promotion of bone regeneration.
Abstract:
Currently, a variety of linear and nonlinear measures is in use to investigate spatiotemporal interrelation patterns of multivariate time series. Whereas the former are by definition insensitive to nonlinear effects, the latter detect both nonlinear and linear interrelation. In the present contribution we employ a uniform surrogate-based approach, which is capable of disentangling interrelations that significantly exceed random effects from interrelations that significantly exceed linear correlation. The bivariate version of the proposed framework is explored using a simple model that allows separate tuning of the coupling and the nonlinearity of the interrelation. To demonstrate the applicability of the approach to multivariate real-world time series, we investigate resting-state functional magnetic resonance imaging (rsfMRI) data of two healthy subjects as well as intracranial electroencephalograms (iEEG) of two epilepsy patients with focal-onset seizures. The main findings are that, for our rsfMRI data, interrelations can be described by linear cross-correlation. Rejection of the null hypothesis of linear iEEG interrelation occurs predominantly for epileptogenic tissue as well as during epileptic seizures.
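A minimal Python sketch of the surrogate logic: rotating both channels' Fourier phases by the same random amounts preserves the cross-spectrum, giving a null of purely linear interrelation, while independent rotations give a null of no interrelation at all. The statistic and toy data are illustrative; the paper's uniform framework is more general.

```python
# Sketch of bivariate surrogate testing with phase-randomized surrogates;
# statistic, data and thresholds are illustrative, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def ft_surrogates(x, y, shared_phases):
    """Phase-randomized surrogate pair.

    shared_phases=True rotates both spectra by the SAME random phases,
    preserving the cross-spectrum (null: linear interrelation only);
    shared_phases=False rotates them independently (null: no interrelation).
    """
    n = len(x)
    common = rng.uniform(0, 2 * np.pi, n // 2 + 1)
    out = []
    for s in (x, y):
        ph = common.copy() if shared_phases else rng.uniform(0, 2 * np.pi, n // 2 + 1)
        ph[0] = 0.0                      # keep the mean real
        if n % 2 == 0:
            ph[-1] = 0.0                 # keep the Nyquist bin real
        out.append(np.fft.irfft(np.fft.rfft(s) * np.exp(1j * ph), n))
    return out

def sq_corr(a, b):
    """Correlation of squared, centered signals: sensitive to nonlinear coupling."""
    return abs(np.corrcoef((a - a.mean())**2, (b - b.mean())**2)[0, 1])

def surrogate_test(x, y, statistic, shared_phases, n_surr=99):
    """One-sided rank p-value of the statistic against the surrogate null."""
    s0 = statistic(x, y)
    null = [statistic(*ft_surrogates(x, y, shared_phases)) for _ in range(n_surr)]
    return s0, (1 + sum(s >= s0 for s in null)) / (n_surr + 1)

# Toy pair: purely nonlinear coupling, invisible to linear cross-correlation.
x = rng.standard_normal(2048)
y = x**2 + 0.5 * rng.standard_normal(2048)
print(surrogate_test(x, y, sq_corr, shared_phases=True))   # small p: rejects
```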