916 results for Data Mining and its Application
Abstract:
This work focuses on the development of high-quality nanoporous 1D photonic crystals – so-called Bragg stacks – made by spin-coating SiO2 and TiO2 nanoparticles approximately 25 nm in size, leaving interparticle voids large enough for reactive species to infiltrate. Accordingly, the first part of this work describes the synthesis of well-dispersed TiO2 nanoparticles in this size range (the corresponding SiO2 nanoparticles are commercially available). In the second part, a protocol was developed to prepare nanoporous Bragg stacks of up to 12 bilayers with high quality and precision. Tailor-made Bragg stacks were prepared for different applications, such as (i) a surface-emitting feedback laser with a FWHM of only 6 nm and (ii) an electrochromic device whose absorption can be reversibly switched by an external electrical bias independently of the Bragg reflection. In the last chapter, the approach to 1D photonic crystals is transferred to 1D phononic crystals. Contrast in the modulus is achieved by spin-coating SiO2 and PMMA as the high- and low-modulus materials. This system showed a band gap at fg = 12.6 GHz with a width of Δfg = 4.5 GHz.
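For orientation, the spectral position of the reflectance peak of such a Bragg stack at normal incidence follows the standard first-order Bragg condition (a textbook relation, not a result quoted from this abstract), where n_H, n_L and d_H, d_L denote the effective refractive indices and thicknesses of the high- and low-index layers:

    % First-order Bragg condition for a periodic two-layer stack at normal incidence
    \lambda_{\mathrm{Bragg}} = 2\,(n_H d_H + n_L d_L)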
Abstract:
Data deduplication describes a class of approaches that reduce the storage capacity needed to store data or the amount of data that has to be transferred over a network. These approaches detect coarse-grained redundancies within a data set, e.g. a file system, and remove them.

One of the most important applications of data deduplication are backup storage systems, where these approaches are able to reduce the storage requirements to a small fraction of the logical backup data size. This thesis introduces multiple new extensions of so-called fingerprinting-based data deduplication. It starts with the presentation of a novel system design, which allows using a cluster of servers to perform exact data deduplication with small chunks in a scalable way.

Afterwards, a combination of compression approaches for an important, but often overlooked, data structure in data deduplication systems, so-called block and file recipes, is introduced. Using these compression approaches, which exploit unique properties of data deduplication systems, the size of these recipes can be reduced by more than 92% in all investigated data sets. As file recipes can occupy a significant fraction of the overall storage capacity of data deduplication systems, this compression enables significant savings.

A technique to increase the write throughput of data deduplication systems, based on the aforementioned block and file recipes, is introduced next. The novel Block Locality Caching (BLC) uses properties of block and file recipes to overcome the chunk lookup disk bottleneck of data deduplication systems, which limits either their scalability or their throughput. The presented BLC overcomes the disk bottleneck more efficiently than existing approaches. Furthermore, it is shown to be less prone to aging effects.

Finally, it is investigated whether large HPC storage systems exhibit redundancies that can be found by fingerprinting-based data deduplication. Over 3 PB of HPC storage data from different data sets have been analyzed. In most data sets, between 20 and 30% of the data can be classified as redundant. According to these results, future work should further investigate how data deduplication can be integrated into HPC storage systems.

This thesis presents important novel work in different areas of data deduplication research.
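To make the idea of fingerprinting-based deduplication concrete, here is a minimal Python sketch (illustrative only: production systems use content-defined chunking rather than fixed blocks, and keep the fingerprint index on disk, which is exactly the bottleneck BLC addresses). Each unique chunk is stored once under its fingerprint, and a file is represented by its recipe of fingerprints:

    import hashlib

    CHUNK_SIZE = 4096   # toy fixed-size chunking; real systems chunk by content

    chunk_store = {}    # fingerprint -> chunk data, each unique chunk stored once

    def deduplicate(data: bytes) -> list[str]:
        """Split data into chunks and return its 'file recipe':
        the ordered list of chunk fingerprints."""
        recipe = []
        for i in range(0, len(data), CHUNK_SIZE):
            chunk = data[i:i + CHUNK_SIZE]
            fp = hashlib.sha256(chunk).hexdigest()  # the chunk fingerprint
            if fp not in chunk_store:               # chunk lookup: the disk bottleneck
                chunk_store[fp] = chunk             # store only previously unseen chunks
            recipe.append(fp)
        return recipe

    def restore(recipe: list[str]) -> bytes:
        """Reassemble the original data from its recipe."""
        return b"".join(chunk_store[fp] for fp in recipe)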
Abstract:
Antibody-based cancer therapies have been successfully introduced into the clinic and have emerged as among the most promising therapeutics in oncology. The limiting factor in the development of therapeutic antibodies is the identification of tumor-associated antigens. PLAC1, the placenta-specific protein 1, was identified for the first time by the group of Prof. Sahin as such a tumor-specific antigen. Within this work, PLAC1 was characterized using a variety of biochemical methods. The protein expression profile, the cellular localization, the conformational state and especially the interaction partners of PLAC1 and its function in cancer were analyzed. Analysis of the protein expression profile of PLAC1 in normal human tissue confirms the published RT-PCR data: except for placenta, no PLAC1 expression was detectable in any normal human tissue. Beyond that, increased PLAC1 expression was detected in several cancer cell lines derived from trophoblastic, breast and pancreatic lineages, emphasizing its properties as a tumor-specific antigen.

The cellular localization studies revealed that PLAC1 contains a functional signal peptide which directs the propeptide to the endoplasmic reticulum (ER) and results in the secretion of PLAC1 via the secretory pathway. Although PLAC1 does not exhibit a distinct transmembrane domain, no unbound protein was detectable in the cell culture supernatant of overexpressing cells; by selective isolation of different cellular compartments, however, PLAC1 was clearly enriched in the membrane fraction. Using size exclusion chromatography, PLAC1 was characterized as a highly aggregating protein that forms a network of high-molecular-weight multimers, held together by a mixture of non-covalent as well as covalent interactions. These interactions were formed by PLAC1 with itself and probably with other cellular components and proteins. Consequently, PLAC1 localizes outside the cell, where it is associated with the membrane, forming a stable extracellular coat-like structure.

The first mechanistic hint of how PLAC1 promotes cancer cell proliferation was obtained by identifying the fibroblast growth factor FGF7 as a specific interaction partner of PLAC1. Moreover, it was clearly shown that PLAC1 as well as FGF7 bind to heparin, a glycosaminoglycan of the ECM that is also involved in FGF signaling. The participation of PLAC1 in this pathway was confirmed by co-localizing PLAC1, FGF7 and the FGF7-specific receptor FGFR2IIIb and by identifying the formation of a trimeric complex of the three. This trimeric complex in particular revealed the role of PLAC1: binding of PLAC1 together with FGF7 leads to the activation of the intracellular tyrosine kinase of the FGFR2IIIb receptor and mediates the direct phosphorylation of the AKT kinase. In the absence of PLAC1, no FGF7-mediated phosphorylation of AKT was observed. Consequently, the function of PLAC1 was clarified: PLAC1 acts as a co-factor that stimulates proliferation via the FGF7-FGFR2 signaling pathway.

Altogether, these novel biochemical findings underline that the placenta-specific protein PLAC1 could be a new target for cancer immunotherapy, especially considering its potential applicability for antibody therapy in tumor patients.
Abstract:
Food items and nematode parasites were identified from the stomachs of 42 individuals of Phocoena phocoena, 6 of Lagenorhynchus acutus and 8 of L. albirostris stranded off the coastal waters of northern Scotland between 2004 and 2014. Post-mortem examinations revealed heavy parasitic worm burdens. Four nematode species complexes, Anisakis spp., Contracaecum spp., Pseudoterranova spp., and Hysterothylacium spp., were recorded. The data on the presence of anisakid species in cetaceans revealed a significant relationship between the presence of Hysterothylacium and the month of host stranding, suggesting a decrease in larval H. aduncum abundance between April and August due to a seasonal effect related to prey availability. Similarly, the parasite burden of all anisakid genera was related to the year fraction of stranding, with a statistically significant relationship found only for L. albirostris, showing an increase between April and October. This finding is explained by seasonality in the occurrence of white-beaked dolphins, with a peak during August, which might be related to movements of shared prey species and competition with other species (Tursiops truncatus). Geographical differences were observed in the number of parasites of all anisakid species, which was highest in cetaceans from the east area and lowest on the north coast. The number of parasites also increased significantly with the length of the animal and over the years, but with a significant seasonal pattern only for P. phocoena. Regarding diet composition, using a data set consisting of 34 harbour porpoises and 1 Atlantic white-sided dolphin, we found a positive association between parasite number and the cephalopod genus Alloteuthis. This higher level of parasite infection associated with squid from this area is probably due to the quantitative distribution of infective forms in squid prey, the abundance of the final host, and the age or size at maturity of the squid.
Abstract:
Nitric oxide (NO) and nitrogen dioxide (NO2) play an important role in the ability of the atmosphere to cleanse itself. These trace gases govern the photochemical production of ozone (O3) and influence the abundance of hydroxyl (OH) and nitrate (NO3) radicals. During daytime, when sufficient solar radiation and ozone are present, NO and NO2 are in a fast photochemical equilibrium, the photostationary state. The sum of NO and NO2 is therefore referred to as NOx. Previous studies of the photostationary state of NOx comprise measurements at a wide variety of sites, ranging from cities (characterized by heavy air pollution) to remote regions (characterized by low air pollution). While the photochemical cycling of NO and NO2 is fundamentally understood under conditions of elevated NOx concentrations, there are significant gaps in the understanding of the underlying cycling processes in rural and remote regions characterized by lower NOx concentrations. These gaps could be caused by NO2 measurement interferences, in particular for indirect detection methods, which can be affected by artifacts. At very low NOx concentrations, and when NO2 measurement interferences can be excluded, it is frequently concluded that these gaps in understanding are linked to the existence of an "unknown oxidant". In this work, the photostationary state of NOx is analyzed with the aim of investigating the potential existence of hitherto unknown processes. A gas analyzer for the direct measurement of atmospheric NO2 by laser-induced fluorescence (LIF), GANDALF, was newly developed and deployed for field measurements for the first time during the PARADE 2011 campaign. The PARADE measurements took place in summer 2011 at a rural site in Germany. Extensive NO2 measurements using different techniques (DOAS, CLD and CRD) enabled a detailed and successful intercomparison of GANDALF with the other NO2 measurement techniques. Further relevant trace gases and meteorological parameters were measured in order to study the photostationary state of NOx in this environment, based on the NO2 measurements with GANDALF. During PARADE, moderate NOx mixing ratios (10^2 - 10^4 pptv) were observed at the site. Mixing ratios of biogenic volatile organic compounds (BVOCs) from the surrounding, mainly coniferous, forest were of the order of 10^2 pptv. The characteristics of the photostationary state of NOx at low NOx mixing ratios (10 - 10^3 pptv) were investigated at a second site, in a boreal forest, during the HUMPPA-COPEC 2010 campaign, conducted in summer 2010 at the SMEAR II station in Hyytiälä, southern Finland. This work compares the characteristics of the photostationary state of NOx in the two forest environments.
Furthermore, the comprehensive data set, which includes measurements of trace gases relevant to radical chemistry (OH, HO2) as well as of total OH reactivity, makes it possible to test and improve the current understanding of NOx photochemistry using a box model constrained by the measured data. Although NOx concentrations were lower and BVOC concentrations higher during HUMPPA-COPEC 2010 than during PARADE 2011, the cycling of NO and NO2 is fundamentally understood in both cases. The analysis of the photostationary state of NOx at the two very different sites shows that potentially unknown processes are present in neither case. The current representation of NOx chemistry was simulated for HUMPPA-COPEC 2010 using the chemical mechanism MIM3*. The simulation results are consistent with the calculations based on the photostationary state of NOx.
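For reference, the photostationary state analyzed in such studies is conventionally quantified by the Leighton ratio, a standard textbook relation (the specific formulation used in the thesis is not given in the abstract): a value near unity means the NO-NO2 partitioning is fully explained by O3, while values above unity have been interpreted as evidence for an unknown oxidant.

    % Leighton ratio: j(NO2) is the NO2 photolysis frequency,
    % k_{NO+O3} the rate coefficient of the NO + O3 reaction
    \varphi = \frac{j(\mathrm{NO_2})\,[\mathrm{NO_2}]}{k_{\mathrm{NO+O_3}}\,[\mathrm{NO}]\,[\mathrm{O_3}]}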
Abstract:
In many industries, for example the automotive industry, digital mock-ups are used to verify the design and the function of a product on a virtual prototype. One use case is checking the safety clearances of individual components, the so-called distance analysis. Engineers determine for certain components whether, at rest as well as during a motion, they maintain a prescribed safety distance to the surrounding components. If components fall below the safety distance, their shape or position must be changed. For this, it is important to know exactly which regions of the components violate the safety distance.

In this work we present a solution for the real-time computation of all regions between two geometric objects that fall below the safety distance. Each object is given as a set of primitives (e.g., triangles). For every point in time at which a transformation is applied to one of the objects, we compute the set of all primitives falling below the safety distance and call it the set of all tolerance-violating primitives. We present a complete solution, which can be divided into the following three major topics.

In the first part of this work we investigate algorithms that check for two triangles whether they are tolerance-violating. We present several approaches for triangle-triangle tolerance tests and show that dedicated tolerance tests are considerably faster than the distance computations used so far. The focus of our work is the development of a novel tolerance test that operates in dual space. In all our benchmarks for computing all tolerance-violating primitives, our dual-space approach consistently proves to be the fastest.

The second part of this work deals with data structures and algorithms for the real-time computation of all tolerance-violating primitives between two geometric objects. We develop a combined data structure consisting of a flat hierarchical data structure and several uniform grids. To guarantee efficient running times, it is particularly important to take the required safety distance into account in the design of the data structures and the query algorithms. We present solutions that quickly determine the set of primitive pairs to be tested. In addition, we develop strategies for recognizing primitives as tolerance-violating without computing an expensive primitive-primitive tolerance test. In our benchmarks we show that our solutions are able to compute, in real time, all tolerance-violating primitives between two complex geometric objects, each consisting of many hundreds of thousands of primitives.

In the third part we present a novel, memory-optimized data structure for managing the cell contents of the previously used uniform grids. We call this data structure Shrubs. Previous approaches to the memory optimization of uniform grids mainly rely on hashing methods, which, however, do not reduce the memory consumption of the cell contents. In our application, neighboring cells often have similar contents.
Exploiting these redundant cell contents, our approach is able to losslessly compress the memory footprint of the cell contents of a uniform grid to one fifth of its former size and to decompress it at run time.

Finally, we show how our solution for computing all tolerance-violating primitives can be applied in practice. Beyond pure distance analysis, we show applications to various path-planning problems.
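As a rough illustration of such a distance-aware query (a simplified point-based sketch in Python; the thesis' dual-space triangle test and the Shrubs structure are more involved), a uniform grid whose cell size matches the safety distance restricts the exact tolerance tests to neighboring cells:

    import math
    from collections import defaultdict

    def tolerance_violating_pairs(points_a, points_b, safety_distance):
        """Return all pairs (i, j) with |points_a[i] - points_b[j]| < safety_distance.
        Primitives are reduced to 3-D points here; real systems test triangles."""
        cell = safety_distance  # cell size >= safety distance: only neighbor cells can violate
        grid = defaultdict(list)
        for j, (x, y, z) in enumerate(points_b):
            grid[(int(x // cell), int(y // cell), int(z // cell))].append(j)

        pairs = []
        for i, (x, y, z) in enumerate(points_a):
            cx, cy, cz = int(x // cell), int(y // cell), int(z // cell)
            for dx in (-1, 0, 1):          # gather candidates from the 27 neighboring cells
                for dy in (-1, 0, 1):
                    for dz in (-1, 0, 1):
                        for j in grid[(cx + dx, cy + dy, cz + dz)]:
                            bx, by, bz = points_b[j]
                            if math.dist((x, y, z), (bx, by, bz)) < safety_distance:
                                pairs.append((i, j))  # exact primitive-primitive tolerance test
        return pairs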
Abstract:
In the present work we focused on three different aspects of Leishmania infection. We characterized the cell-death process of apoptosis in the parasites (1), investigated the suitability of macrophages and dendritic cells as host cells for the development of the parasites (2), and analyzed the consequences of infection for the development of an adaptive immune response in the human system (3). Central to this project was the hypothesis that apoptotic Leishmania exploit the autophagy machinery of their host cells in order to reduce T-cell-mediated killing of the parasites.

We defined an apoptotic Leishmania population characterized by a rounded morphology and the expression of phosphatidylserine on the parasite surface. The apoptotic parasites were also in the sub-G1 phase and contained less, fragmented DNA, which was detected by TUNEL assay. The interaction of the parasites with human macrophages and dendritic cells showed that anti-inflammatory macrophages were more susceptible to infection than pro-inflammatory macrophages or dendritic cells. Interestingly, however, the most efficient conversion to the disease-causing amastigote life form was observed in dendritic cells. Since both macrophages and dendritic cells are antigen-presenting cells, this could lead to the activation of T cells of the adaptive immune system. Indeed, T-cell proliferation was observed during Leishmania infection. We found that the proliferating T cells were CD3+CD4+ T cells, which surprisingly turned out to be Leishmania-specific CD45RO+ memory T cells. This was unexpected, since prior contact of the donors with Leishmania is considered unlikely. In the presence of apoptotic parasites, significantly weaker T-cell proliferation was observed with macrophages, but not with dendritic cells. Since T-cell proliferation has a negative effect on parasite survival, the lowest survival rates were found in dendritic cells. Within both cell types the parasites resided in the phagosome, which, however, carried the autophagy marker LC3 only in macrophages. Chemical induction of autophagy, like the presence of apoptotic parasites, led to strongly reduced T-cell proliferation and, accordingly, to higher parasite survival.

In summary, our data show that apoptosis occurs in unicellular organisms. During infection, both macrophages and dendritic cells can be infected with Leishmania, and the adaptive immune system can be activated. The T-cell proliferation induced after infection of macrophages is reduced in the presence of apoptotic parasites, which is why macrophages, compared with dendritic cells, are the more suitable host cells for Leishmania. To this end, the parasites abuse the autophagy machinery of the macrophages as an escape strategy to evade the adaptive immune system and thus secure the survival of the overall population. These results explain the advantage of apoptosis in unicellular organisms and show that the autophagy machinery may serve as a potential therapeutic target for the treatment of leishmaniasis.
Abstract:
Due to its environmental, safety, health and socio-economic impacts, marine litter has been recognized as a 21st-century global challenge and has been included in Descriptor 10 of the EU MSFD. Owing to its morphological features and anthropogenic pressures, the Adriatic Sea is very sensitive to the accumulation of debris, but data are inconsistent and fragmented. This thesis, in the framework of the DeFishGear project, intends to assess marine litter on beaches and on the seafloor in the western Adriatic Sea, and to test whether debris ingestion by fish occurs. Three beaches were sampled during two surveys in 2015. Benthic litter monitoring was carried out in FAO GSA17 during fall 2014, using a rapido trawl. Litter ingestion was investigated through gut content analysis of 260 fish belonging to 8 commercial species collected in the western Gulf of Venice. Average litter density on beaches was 1.5 items/m2 during spring and decreased to 0.8 items/m2 in summer. Litter composition was heterogeneous and varied among sites, even if no significant differences were found. Most of the debris consisted of plastic sheets, fragments, polystyrene pieces, mussel nets and cotton bud sticks, showing that the sources are many and include aquaculture, land-based activities and local beach users. The average density of benthic litter was 913 items/km2 (82 kg/km2). Plastic dominated in terms of both number and weight, and consisted mainly of bags, sheets and mussel nets. The highest density was found close to the coast, and the sources driving the major differences in litter distribution were mussel farms and shipping lanes. Litter ingestion occurred in 47% of the examined fish and mainly consisted of fibers. Among the species, S. pilchardus swallowed almost all debris categories. These findings may provide a baseline for setting the measures necessary to manage and minimize marine litter in the western Adriatic region and to protect aquatic life from plastic pollution, also accounting for the possible implications for human health.
Abstract:
Properdin, a serum glycoprotein, is an important component of innate immunity and the only known positive regulator of complement, acting as an initiation point for alternative pathway activation. Since properdin is X-linked, we hypothesized that it may play a modulatory role in the pathogenesis of viral wheeze in children, which tends to be more common and more severe in boys. We aimed to determine properdin levels in a community-based paediatric sample and to assess whether levels of properdin were associated with childhood wheeze phenotypes and atopy. We studied 137 school-children aged 8-12 yrs, a nested sample from a cohort study. Properdin was measured by a commercial enzyme-linked immunosorbent assay. We assessed wheeze by questionnaire, validated it by a nurse-led interview, and performed skin prick tests and a methacholine challenge in all children. Forty children (29%) reported current wheeze. Serum properdin levels ranged between 18 and 40 microg/ml. Properdin was not associated with age, gender, atopy, bronchial responsiveness, current wheeze (neither the viral wheeze nor the multiple-trigger wheeze phenotype) or severity of wheeze, but was slightly lower in South Asian (median 21.8 microg/ml) compared with white children (23.3 microg/ml; p = 0.006). Our data make it unlikely that properdin deficiency is common in healthy children or that levels of properdin are a major risk factor for wheeze or atopy.
Abstract:
Instrumental daily temperature series are often affected by inhomogeneities. Several methods are available for their correction at monthly and annual scales, whereas few exist for daily data. Here, an improved version of the higher-order moments (HOM) method, the higher-order moments for autocorrelated data (HOMAD), is proposed. HOMAD addresses the main weaknesses of HOM, namely data autocorrelation and the subjective choice of regression parameters. Simulated series are used for the comparison of the two methodologies. The results show that HOMAD outperforms HOM for small samples. Additionally, three daily temperature time series from stations in the eastern Mediterranean are used to show the impact of homogenization procedures on trend estimation and the assessment of extremes. HOMAD provides an improved correction of daily temperature time series and further supports the use of corrected daily temperature time series prior to climate change assessment.
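To illustrate the flavor of moment-based adjustment (a deliberately simplified two-moment sketch in Python; the actual HOM/HOMAD procedure fits higher-order moments against a reference station and, in HOMAD, additionally accounts for autocorrelation), an inhomogeneous segment can be mapped onto the statistics of the homogeneous segment:

    import numpy as np

    def two_moment_adjust(candidate, reference):
        """Rescale 'candidate' (e.g. daily temperatures before a break) so that its
        mean and standard deviation match those of 'reference' (the homogeneous
        segment). A toy stand-in for HOM/HOMAD, which also corrects higher moments."""
        candidate = np.asarray(candidate, dtype=float)
        reference = np.asarray(reference, dtype=float)
        z = (candidate - candidate.mean()) / candidate.std(ddof=1)
        return z * reference.std(ddof=1) + reference.mean()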
Abstract:
Urban agriculture is a phenomenon that can be observed world-wide, particularly in cities of developing countries. It contributes significantly to food security and food safety and has sustained the livelihoods of urban and peri-urban low-income dwellers in developing countries for many years. Population increase due to rural-urban migration and natural growth, as well as formal and informal urbanisation, compete with urban farming for available space and scarce water resources. A multitemporal and multisensoral urban change analysis covering a period of 25 years (1982-2007) was performed in order to measure and visualise the urban expansion along the Kizinga and Mzinga valleys in the south of Dar es Salaam. Air photos and VHR satellite data were analysed using a combination of anisotropic textural measures and spectral information. The study revealed that unplanned built-up area is expanding continuously, while vegetation cover and agricultural land decline at a fast rate. The validation showed that the overall classification accuracy varied depending on the database. The extracted built-up areas were used for visual interpretation and mapping purposes and served as an information source for another research project. The maps visualise an urban congestion and expansion of nearly 18% of the total analysed area that took place in the Kizinga valley between 1982 and 2007. The same development can be observed in the less developed and more remote Mzinga valley between 1981 and 2002. Both areas underwent fast changes, where land prices still tend to go up and an influx of people from both rural and urban areas continuously increases the density, with the consequence of growing multiple land-use interests.
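As a sketch of what a directional textural measure can look like (a minimal Python illustration; the specific anisotropic measures and classification workflow of the study are not detailed in the abstract), a grey-level co-occurrence contrast can be computed per displacement direction, so that comparing several offsets (0, 45, 90, 135 degrees) yields a simple anisotropic texture description:

    import numpy as np

    def glcm_contrast(image, offset):
        """Grey-level co-occurrence contrast for one displacement vector (dy, dx):
        the mean squared grey-level difference of all pixel pairs at that offset.
        'image' is a 2-D integer array, e.g. a panchromatic air photo band."""
        dy, dx = offset
        h, w = image.shape
        a = image[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
        b = image[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
        return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

    # Example: contrast along four directions for an 8-bit image patch
    # patch = np.random.randint(0, 256, size=(64, 64))
    # [glcm_contrast(patch, o) for o in [(0, 1), (1, 1), (1, 0), (1, -1)]]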
Abstract:
Toll-like receptors (TLRs) are key receptors of the innate immune system and are expressed on immune and nonimmune cells. They are activated by both pathogen-associated molecular patterns and endogenous ligands. Activation of TLRs culminates in the release of proinflammatory cytokines and chemokines, and in apoptosis. Ischaemia and ischaemia/reperfusion (I/R) injury are associated with significant inflammation and tissue damage. There is emerging evidence to suggest that TLRs are involved in mediating ischaemia-induced damage in several organs. Critical limb ischaemia (CLI) is the most severe form of peripheral arterial disease (PAD) and is associated with skeletal muscle damage and tissue loss; however, its pathophysiology is poorly understood. This paper will outline the evidence implicating TLRs in the pathophysiology of cerebral, renal, hepatic, myocardial, and skeletal muscle ischaemia and I/R injury, and discuss preliminary data alluding to the potential role of TLRs in the pathophysiology of skeletal muscle damage in CLI.
Abstract:
Higher education has a responsibility to educate a democratic citizenry, and recent research indicates civic engagement is on the decline in the United States. Through a mixed-methods approach, I demonstrate that well-structured short-term international service-learning programming has the potential to develop college students' civic identities. Quantitative analysis of questionnaire data, collected from American college students immediately prior to their participation in a short-term service-learning experience in Northern Ireland and again upon their return to the United States, revealed increases in civic accountability, political efficacy, justice-oriented citizenship, and service-learning. Subsequent qualitative analysis of interview transcripts, student journals, and field notes suggested that facilitated critical reflection before, during, and after the experience promoted transformational learning. Emergent themes included: (a) responsibilities to others, (b) the value of international service-learning, (c) cross-pollination of ideas, (d) stepping outside the daily routine to facilitate divergent thinking, and (e) the necessity of precursory thinking for sustaining transformations in thinking. The first theme, responsibilities to others, was further divided into subthemes of thinking beyond oneself, raising awareness of responsibility to others, and voting responsibly.
Abstract:
In recent history, there has been a trend of increasing partisan polarization throughout most of the American political system. Some of the impacts of this polarization are obvious; however, there is reason to believe that we miss some of the indirect effects of polarization. Accompanying the trend of increased polarization has been an increase in the contentiousness of the Supreme Court confirmation process. I believe that these two trends are related. Furthermore, I argue that these trends have an impact on judicial behavior. This is an issue worth exploring, since the Supreme Court is the most isolated branch of the federal government. The Constitution structured the Supreme Court to ensure that it was as isolated as possible from short-term political pressures and interests. This study attempts to show how it may be possible that those goals are no longer being fully achieved. My first hypothesis in this study is that increases in partisan polarization are a direct cause of the increase in the level of contention during the confirmation process. I then hypothesize that the more contention a justice faces during his or her confirmation process, the more ideologically extreme that justice will then vote on the bench. This means that a nominee appointed by a Republican president will tend to vote even more conservatively than was anticipated following a contentious confirmation process, and vice versa for Democratic appointees. In order to test these hypotheses, I developed a data set for every Supreme Court nominee dating back to President Franklin D. Roosevelt's appointments (1937). With this data set, I ran a series of regression models to analyze these relationships. Statistically speaking, the results support my first hypothesis in a fairly robust manner. My regression results for my second hypothesis indicate that the trend I am looking for is present for Republican nominees. For Democratic nominees, the impacts are less robust. Nonetheless, as the results will show, contention during the confirmation process does seem to have some impact on judicial behavior. Following my quantitative analysis, I analyze a series of case studies. These case studies serve to provide tangible examples of these statistical trends as well as to explore what else may be going on during the confirmation process and subsequent judicial decision-making. I use Justices Stevens, Rehnquist, and Alito as the subjects for these case studies. These cases will show that the trends described above do seem to be identifiable at the level of an individual case. These studies further help to indicate other potential impacts on judicial behavior. For example, following Justice Rehnquist's move from Associate to Chief Justice, we see a marked change in his behavior. Overall, this study serves as a means of analyzing some of the more indirect impacts of partisan polarization in modern politics. Further, the study offers a means of exploring some of the possible constraints (both conscious and subconscious) that Supreme Court justices may feel while they decide how to cast a vote in a particular case. Given the wide-reaching implications of Supreme Court decisions, it is important to try to grasp a full view of how these decisions are made.