761 results for "deep and surface approaches to learning"


Relevance:

100.00%

Abstract:

Exposure to PM10 and PM2.5 (particulate matter with aerodynamic diameter smaller than 10 μm and 2.5 μm, respectively) is associated with a range of adverse health effects, including cancer and pulmonary and cardiovascular diseases. Surface characteristics (chemical reactivity, surface area) are considered of prime importance for understanding the mechanisms which lead to harmful effects. A hypothetical mechanism to explain these adverse effects is the ability of components (organics, metal ions) adsorbed on these particles to generate Reactive Oxygen Species (ROS), and thereby to cause oxidative stress in biological systems (Donaldson et al., 2003). ROS can attack almost any cellular structure, such as DNA or the cellular membrane, leading to the formation of a wide variety of degradation products which can be used as biomarkers of oxidative stress. The aim of the present research project is to test whether there is a correlation between exposure to Diesel Exhaust Particulate (DEP) and oxidative stress status. For that purpose, a survey was conducted in real occupational situations where workers were exposed to DEP (bus depots). Different exposure variables were considered: particle number, size distribution and surface area (SMPS); particulate mass, PM2.5 and PM4 (gravimetry); elemental and organic carbon (coulometry); total adsorbed heavy metals, i.e. iron, copper and manganese (atomic absorption); and surface functional groups present on the aerosols (Knudsen flow reactor). Several biomarkers of oxidative stress (8-hydroxy-2'-deoxyguanosine and several aldehydes) were determined in either urine or serum of volunteers. Results obtained during the sampling campaign in several bus depots indicated that the occupational exposure to particulates in these places was rather low (40-50 μg/m³ for PM4). Bimodal size distributions were generally observed (modes at 5 μm and <1 μm). Surface characteristics of PM4 varied strongly depending on the bus depot; they were usually characterized by a high carbonyl and low acidic site content. Among the different biomarkers analyzed within the framework of this study, mean urinary levels of 8-hydroxy-2'-deoxyguanosine increased significantly (p<0.05) over two consecutive days of exposure for non-smoking workers. On the other hand, no statistically significant differences were observed for serum levels of hexanal, nonanal and 4-hydroxy-nonenal (p>0.05). Biomarker levels will be compared to exposure variables to gain a better understanding of the relation between particulate characteristics and the formation of ROS by-products. This project is financed by the Swiss State Secretariat for Education and Research. It is conducted within the framework of the COST Action 633 "Particulate Matter - Properties Related to Health Effects".

Relevance:

100.00%

Abstract:

Within the ENCODE Consortium, GENCODE aimed to accurately annotate all protein-coding genes, pseudogenes, and noncoding transcribed loci in the human genome through manual curation and computational methods. Annotated transcript structures were assessed, and less well-supported loci were systematically and experimentally validated. Predicted exon-exon junctions were evaluated by RT-PCR amplification followed by a highly multiplexed sequencing readout, a method we called RT-PCR-seq. Seventy-nine percent of all assessed junctions were confirmed by this evaluation procedure, demonstrating the high quality of the GENCODE gene set. RT-PCR-seq was also efficient for screening gene models predicted using the Human Body Map (HBM) RNA-seq data. We validated 73% of these predictions, thus confirming 1168 novel genes, mostly noncoding, which will further complement the GENCODE annotation. Our novel experimental validation pipeline is extremely sensitive, far more so than unbiased transcriptome profiling through RNA sequencing, which is becoming the norm. For example, exon-exon junctions unique to GENCODE-annotated transcripts are five times more likely to be corroborated with our targeted approach than with extensive human transcriptome profiling. Data sets such as the HBM and ENCODE RNA-seq data fail to sample low-expressed transcripts. Our targeted RT-PCR-seq approach also has the advantage of identifying novel exons of known genes, as we discovered unannotated exons in ~11% of assessed introns. We thus estimate that at least 18% of known loci have yet-unannotated exons. Our work demonstrates that cataloging all of the genic elements encoded in the human genome will necessitate a coordinated effort between unbiased and targeted approaches, such as RNA-seq and RT-PCR-seq.

Relevance:

100.00%

Abstract:

Polyunsaturated aldehydes (PUAs) are organic compounds mainly produced by diatoms after cell wounding. These compounds are increasingly reported as teratogenic for grazer species and deleterious for phytoplanktonic species, but information is still scarce regarding the concentration ranges and composition of PUAs in the open ocean. In this study, we analyzed the spatial distribution and the types of aldehydes produced by the large-sized (>10 μm) phytoplankton at the Atlantic Ocean surface. Analyses were conducted on PUAs released after mechanical disruption of the phytoplankton cells, referred to here as potential PUAs (pPUAs). Results show the ubiquitous presence of pPUAs in the open ocean, including upwelling areas as well as oligotrophic gyres. Total pPUA concentrations ranged from zero to 4.18 pmol from the cells contained in 1 L of seawater. The PUAs identified were heptadienal, octadienal and decadienal, with heptadienal being the most common (detected at 79% of all stations). PUA amount and composition across the Atlantic Ocean were mainly related to the nitrogen:phosphorus ratio, suggesting nutrient-driven mechanisms of PUA production. Extending the range of trophic conditions considered by adding data reported for productive coastal waters, we found a pattern of PUA variation in relation to trophic status.

Relevance:

100.00%

Abstract:

Learning object repositories are a basic building block of virtual learning environments used for content management. Nevertheless, learning objects have special characteristics that make traditional solutions for content management ineffective. In particular, browsing and searching for learning objects cannot be based on the typical authoritative metadata used for describing content, such as author, title or publication date, among others. We propose to build a social layer on top of a learning object repository, providing final users with additional services for describing, rating and curating learning objects from a teaching perspective. All these interactions among users, services and resources can be captured and further analyzed, so both browsing and searching can be personalized according to the user profile and the educational context, helping users to find the most valuable resources for their learning process. In this paper we propose to use reputation schemes and collaborative filtering techniques for improving the user interface of a DSpace-based learning object repository.
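As an illustration of the collaborative filtering component, the sketch below ranks unseen learning objects for a user by item-item cosine similarity over a user-item rating matrix. It is a minimal, generic Python sketch (NumPy assumed), not the DSpace integration described in the paper; the function names and toy data are hypothetical.

```python
import numpy as np

def item_similarity(ratings):
    """Cosine similarity between learning objects from a user-item
    rating matrix (rows = users, cols = objects, 0 = unrated)."""
    norms = np.linalg.norm(ratings, axis=0)
    norms[norms == 0] = 1.0                    # avoid division by zero
    sim = (ratings.T @ ratings) / np.outer(norms, norms)
    np.fill_diagonal(sim, 0.0)                 # ignore self-similarity
    return sim

def recommend(ratings, user, k=3):
    """Score unrated objects for one user by similarity-weighted ratings."""
    sim = item_similarity(ratings)
    scores = sim @ ratings[user]
    scores[ratings[user] > 0] = -np.inf        # hide already-rated objects
    return np.argsort(scores)[::-1][:k]

# Toy example: 4 users x 5 learning objects, ratings 1-5 (0 = none).
R = np.array([[5, 3, 0, 1, 0],
              [4, 0, 0, 1, 2],
              [1, 1, 0, 5, 0],
              [0, 0, 5, 4, 0]], dtype=float)
print(recommend(R, user=0))
```

In practice the "ratings" would come from the social layer's rating and curation events, and reputation weights could scale each user's row before computing similarities.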

Relevance:

100.00%

Abstract:

Today, most software development teams use free and open source software (FOSS) components, because they increase the speed and quality of development. Many open source components are the de facto standard in their category. However, FOSS comes with licensing restrictions, and corporate organizations usually maintain a list of allowed and forbidden licenses. But how do you enforce this policy? How can you make sure that ALL files in your source depot either belong to you or fit your licensing policy? A first, preventive approach is to train the development team and raise its awareness of these licensing issues. Depending on the size of the team, this may be costly, but it is necessary. However, it does not ensure that a single individual will not commit a forbidden icon or library and thereby jeopardize the legal status of the whole release... if not the company, since software is becoming more and more a critical asset. Another approach is to verify what is included in the source repository and check whether it belongs to the open-source world. This can be done on-the-fly, whenever a new file is added to the source depot. It can also be part of the release process, as a verification step before publishing the release. In both cases, there are tools and databases to automate the detection process. We will present the various options regarding FOSS detection, how this process can be integrated into the "software factory", and how the results can be displayed in a usable and efficient way.
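A minimal sketch of the on-the-fly verification idea, assuming a policy expressed as forbidden SPDX-style license identifiers matched in file headers. Real detection tools compare code snippets against open-source databases rather than just header strings, so this is only a crude first layer; the identifiers, extensions and function names here are hypothetical examples.

```python
import pathlib
import re

# Hypothetical policy: license identifiers the organization forbids.
FORBIDDEN = re.compile(r"GPL-3\.0|AGPL-3\.0|SSPL", re.IGNORECASE)

def scan_repo(root="."):
    """Flag source files whose header mentions a forbidden license."""
    hits = []
    for path in pathlib.Path(root).rglob("*"):
        if path.suffix not in {".c", ".h", ".py", ".js", ".java"}:
            continue
        try:
            head = path.read_text(errors="ignore")[:2000]  # headers only
        except OSError:
            continue
        if FORBIDDEN.search(head):
            hits.append(path)
    return hits

for f in scan_repo():
    print(f"forbidden license marker in {f}")
```

Such a check could run as a pre-commit hook (the on-the-fly case) or as a release gate, with a dedicated detection database replacing the naive regular expression.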

Relevance:

100.00%

Abstract:

A recently developed technique, polarimetric radar interferometry, is applied to tackle the problem of the detection of buried objects embedded in surface clutter. An experiment with a fully polarimetric radar in an anechoic chamber has been carried out using different frequency bands and baselines. The processed results show the ability of this technique to detect buried plastic mines and to measure their depth. This technique enables the detection of plastic mines even if their backscatter response is much lower than that of the surface clutter.

Relevance:

100.00%

Abstract:

This paper analyzes the asymptotic performance of maximum likelihood (ML) channel estimation algorithms in wideband code division multiple access (WCDMA) scenarios. We concentrate on systems with periodic spreading sequences (period larger than or equal to the symbol span) where the transmitted signal contains a code-division-multiplexed pilot for channel estimation purposes. First, the asymptotic covariances of the training-only, semi-blind conditional maximum likelihood (CML) and semi-blind Gaussian maximum likelihood (GML) channel estimators are derived. Then, these formulas are further simplified assuming randomized spreading and training sequences under the approximation of high spreading factors and a high number of codes. The results provide a useful tool to describe the performance of the channel estimators as a function of basic system parameters such as the number of codes, spreading factors, or traffic-to-training power ratio.
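For orientation, the training-only case reduces to least squares on the known pilot: modeling the received block as \(y = Sh + n\), with \(S\) the known training convolution matrix and \(n\) circular complex Gaussian noise of variance \(\sigma^2\), the standard result (a textbook baseline, not the paper's semi-blind CML/GML expressions) is

\[
\hat{h} = (S^{H} S)^{-1} S^{H} y, \qquad
\operatorname{Cov}(\hat{h}) = \sigma^{2} (S^{H} S)^{-1}.
\]

The semi-blind estimators analyzed in the paper improve on this by also exploiting the structure of the unknown traffic codes.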

Relevance:

100.00%

Abstract:

We propose an innovative, integrated, cost-effective health system to combat major non-communicable diseases (NCDs), including cardiovascular, chronic respiratory, metabolic, rheumatologic and neurologic disorders and cancers, which together are the predominant health problem of the 21st century. This proposed holistic strategy involves comprehensive patient-centered integrated care and multi-scale, multi-modal and multi-level systems approaches to tackle NCDs as a common group of diseases. Rather than studying each disease individually, it will take into account their intertwined gene-environment, socio-economic interactions and co-morbidities that lead to individual-specific complex phenotypes. It will implement a road map for predictive, preventive, personalized and participatory (P4) medicine based on a robust and extensive knowledge management infrastructure that contains individual patient information. It will be supported by strategic partnerships involving all stakeholders, including general practitioners associated with patient-centered care. This systems medicine strategy, which will take a holistic approach to disease, is designed to allow the results to be used globally, taking into account the needs and specificities of local economies and health systems.

Relevance:

100.00%

Abstract:

Today two largely new approaches are available for the treatment of clinical hypertension. First, captopril, an orally active angiotensin converting enzyme inhibitor, makes possible chronic blockade of the renin-angiotensin system. This compound, given alone or in combination with a diuretic, normalizes the blood pressure of most hypertensive patients. Unfortunately, because captopril may induce serious adverse effects the use of this inhibitor must be restricted to patients with high blood pressure refractory to conventional antihypertensive drugs. Second, compounds such as verapamil and nifedipine are capable of producing a marked vasodilating effect by inhibiting the entry of calcium into the vascular smooth muscle cells. However, the role of calcium channel blockers in the treatment of hypertensive disease awaits more precise definition.

Relevance:

100.00%

Abstract:

BACKGROUND: The diagnosis of malignant hematologic diseases has become increasingly complex during the last decade. It is based on the interpretation of results from different laboratory analyses, which range from microscopy to gene expression profiling. Recently, a method for the analysis of RNA phenotypes has been developed, the nCounter technology (Nanostring® Technologies), which allows for simultaneous quantification of hundreds of RNA molecules in biological samples. We evaluated this technique in a Swiss multi-center study on eighty-six samples from acute leukemia patients. METHODS: mRNA and protein profiles were established for normal peripheral blood and bone marrow samples. Signal intensities of the various tested antigens with surface expression were similar to those found in previously performed Affymetrix microarray analyses. Acute leukemia samples were analyzed for a set of twenty-two validated antigens, and the Pearson correlation coefficient between nCounter and flow cytometry results was calculated. RESULTS: Highly significant correlation values between 0.40 and 0.97 were found for the twenty-two antigens tested. A second correlation analysis performed on a per-sample basis resulted in concordant results between flow cytometry and nCounter for 44-100% of the antigens tested (mean = 76%), depending on the number of blasts present in a sample, the homogeneity of the blast population, and the type of leukemia (AML or ALL). CONCLUSIONS: The nCounter technology allows for fast and easy generation of an mRNA profile from hematologic samples. This technology has the potential to become a valuable tool for the diagnosis of acute leukemias, in addition to multi-color flow cytometry.
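The per-antigen analysis described above amounts to computing a Pearson correlation between paired measurements of the same samples; a minimal Python sketch (the toy values are invented for illustration, not study data):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired measurements for one antigen across six samples:
# nCounter mRNA counts vs. flow-cytometry surface-marker intensities.
ncounter = np.array([120.0, 85.0, 410.0, 33.0, 250.0, 97.0])
flow_mfi = np.array([1.8, 1.1, 5.2, 0.4, 3.6, 1.5])

r, p = pearsonr(ncounter, flow_mfi)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```

Repeating this per antigen gives the distribution of r values (0.40-0.97 in the study), and transposing the loop over samples gives the per-sample concordance analysis.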

Relevance:

100.00%

Abstract:

Palinspastic reconstructions offer an ideal framework for geological, geographical, oceanographic and climatological studies. As historians of the Earth, "reconstructors" try to decipher its past. Ever since learning that the continents move, geologists have tried to retrace their distribution through the ages. Although Wegener's view of continental motion was revolutionary at the beginning of the 20th century, we have known since the early 1960s that continents do not drift aimlessly among the oceans: they belong to a larger ensemble combining continental and oceanic crust, the tectonic plates. Unfortunately, for historical as well as technical reasons, this idea still does not receive sufficient attention within the reconstruction community. Nevertheless, we are convinced that by applying specific methods and principles it is possible to escape the traditional "Wegenerian" approach and, at last, reach true plate tectonics. The main aim of this study is to defend this point of view by presenting, in all necessary detail, our tools and methods. Starting from the paleomagnetic and paleogeographic data classically used for reconstructions, we developed a new methodology that places the tectonic plates and their kinematics at the heart of the problem. Using continental assemblies (referred to as "key assemblies") as anchor points distributed over the whole span of our study (from the Eocene back to the Cambrian), we develop geodynamic scenarios leading from one to the next, from the past towards the present. In between, the lithospheric plates are progressively reconstructed by adding or removing oceanic material (symbolized by synthetic isochrons) to or from the continents. Except during collisions, plates are moved as single rigid entities. The only evolving elements are the plate boundaries, which are preserved through time, follow a consistent geodynamic evolution, and always form an interconnected network in space. This "dynamic plate boundaries" approach integrates multiple factors, among them plate buoyancy, spreading rates at ridges, subsidence patterns, stratigraphic and paleobiogeographic data, as well as major tectonic and magmatic events. It thus offers good control on plate kinematics and provides strong constraints for the model.
This multi-source approach requires efficient data organization and management. Prior to this study, the sheer mass of necessary data had become an almost insurmountable obstacle. GIS (Geographic Information Systems) and geodatabases are informatics tools specifically devoted to the storage, management and analysis of spatially referenced data and their attributes. By developing the PaleoDyn database in ArcGIS, we converted this mass of scattered data into valuable geodynamic information that is easily accessible for the creation of reconstructions. At the same time, with specially developed tools, we both facilitated the reconstruction work (task automation) and improved the model by greatly strengthening the kinematic control of plate motions through plate velocity models. Based on the 340 newly defined terranes, we developed a set of 35 reconstructions, each associated with its own velocity model. With this unique dataset we can now address major issues of modern geology, such as global sea-level variations and climate change. We began with another major, and not definitively resolved, question of modern tectonics: the mechanisms driving plate motions. We observed that, throughout Earth's history, plate rotation poles (which describe plate motions across Earth's surface) tend to cluster along a band running from the northern Pacific through northern South America, the central Atlantic, northern Africa and central Asia up to Japan. Essentially, this distribution means that plates tend to flee this median plane. In the absence of an unidentified methodological bias, we interpret this phenomenon as a possible secular influence of the Moon on plate motions. The oceanic realm is the cornerstone of our model, and we took particular care to reconstruct it in detail. In this model, the oceanic crust is preserved from one reconstruction to the next; the crustal material is symbolized by synthetic isochrons of known age. We also reconstruct the margins (active or passive), the mid-ocean ridges and the intra-oceanic subduction zones. Using this very detailed dataset, we developed unique 3-D bathymetric models offering far better precision than previously existing ones.
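The kinematic core of such reconstructions is rotating plate material about Euler poles on the sphere. A minimal, generic sketch of that single operation in Python (Rodrigues' rotation formula; this is not the PaleoDyn/ArcGIS tooling, and the example values are hypothetical):

```python
import numpy as np

def rotate(lat, lon, pole_lat, pole_lon, angle_deg):
    """Rotate a point on the unit sphere about an Euler pole
    (Rodrigues' rotation formula); all angles in degrees."""
    def unit(la, lo):
        la, lo = np.radians([la, lo])
        return np.array([np.cos(la) * np.cos(lo),
                         np.cos(la) * np.sin(lo),
                         np.sin(la)])
    p, k = unit(lat, lon), unit(pole_lat, pole_lon)
    a = np.radians(angle_deg)
    r = (p * np.cos(a) + np.cross(k, p) * np.sin(a)
         + k * np.dot(k, p) * (1 - np.cos(a)))
    return np.degrees(np.arcsin(r[2])), np.degrees(np.arctan2(r[1], r[0]))

# Rotate a point at (0N, 20E) by 30 degrees about a pole at (60N, 10E).
print(rotate(0.0, 20.0, 60.0, 10.0, 30.0))
```

Applying such finite rotations per plate and per time step, with the rotation rate given by the velocity models, is what moves the rigid plates from one reconstruction to the next.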

Relevance:

100.00%

Abstract:

The parameter setting of a differential evolution algorithm must meet several requirements: efficiency, effectiveness, and reliability. Problems vary, and the solution of a particular problem can be represented in different ways. An algorithm most efficient in dealing with a particular representation may be less efficient in dealing with other representations. The development of differential evolution-based methods contributes substantially to research on evolutionary computing and global optimization in general. The objective of this study is to investigate the differential evolution algorithm, the intelligent adjustment of its control parameters, and its application. In the thesis, the differential evolution algorithm is first examined using different parameter settings and test functions. Fuzzy control is then employed to make the control parameters adaptive, based on the optimization process and expert knowledge. The developed algorithms are applied to training radial basis function networks for function approximation, with possible variables including the centers, widths, and weights of the basis functions, and with control parameters either kept fixed or adjusted by the fuzzy controller. After the influence of the control variables on the performance of the differential evolution algorithm was explored, an adaptive version of the differential evolution algorithm was developed and differential evolution-based radial basis function network training approaches were proposed. Experimental results showed that the performance of the differential evolution algorithm is sensitive to parameter setting, and the best setting was found to be problem-dependent. The fuzzy adaptive differential evolution algorithm relieves the user of the burden of parameter setting and performs better than versions using all-fixed parameters. Differential evolution-based approaches are effective for training Gaussian radial basis function networks.
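For reference, a minimal DE/rand/1/bin implementation in Python with the three classic control parameters (population size NP, scale factor F, crossover rate CR) kept fixed; the fuzzy-adaptive variant studied in the thesis would adjust F and CR online instead. This is a generic sketch, not the thesis code:

```python
import numpy as np

def differential_evolution(f, bounds, np_=20, F=0.8, CR=0.9, gens=200, seed=0):
    """Minimize f over box bounds (shape (d, 2)) with DE/rand/1/bin."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    d = len(lo)
    pop = lo + rng.random((np_, d)) * (hi - lo)      # initial population
    fit = np.array([f(x) for x in pop])
    for _ in range(gens):
        for i in range(np_):
            # Mutation: base vector plus scaled difference of two others.
            r1, r2, r3 = rng.choice([j for j in range(np_) if j != i],
                                    3, replace=False)
            mutant = np.clip(pop[r1] + F * (pop[r2] - pop[r3]), lo, hi)
            # Binomial crossover, guaranteeing at least one mutant gene.
            cross = rng.random(d) < CR
            cross[rng.integers(d)] = True
            trial = np.where(cross, mutant, pop[i])
            # Greedy selection.
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = fit.argmin()
    return pop[best], fit[best]

# Example: minimize the sphere function in 5 dimensions.
x, fx = differential_evolution(lambda x: np.sum(x**2),
                               np.array([[-5.0, 5.0]] * 5))
print(x, fx)
```

The sensitivity reported in the abstract shows up directly here: changing F or CR can markedly alter convergence on the same test function, which is the motivation for making them adaptive.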

Relevance:

100.00%

Abstract:

Quality inspection and assurance is a very important step when today's products are sold to markets. As products are produced in vast quantities, the interest in automating quality inspection tasks has increased correspondingly. Quality inspection tasks usually require the detection of deficiencies, defined as irregularities in this thesis. Objects containing regular patterns appear quite frequently in certain industries and sciences, e.g. half-tone raster patterns in the printing industry, crystal lattice structures in solid-state physics, and solder joints and components in the electronics industry. In this thesis, the problem of regular patterns and irregularities is described in analytical form and three different detection methods are proposed. All the methods are based on the ability of the Fourier transform to represent regular information compactly. The Fourier transform enables the separation of the regular and irregular parts of an image, but the three methods presented are shown to differ in generality and computational complexity. The need to detect fine and sparse details is common in quality inspection tasks, e.g. locating small fractures in components in the electronics industry or detecting tearing in paper samples in the printing industry. In this thesis, a general definition of such details is given by defining sufficient statistical properties in the histogram domain. The analytical definition allows a quantitative comparison of methods designed for detail detection. Based on the definition, the use of existing thresholding methods is shown to be well motivated. A comparison of thresholding methods shows that minimum error thresholding outperforms the other standard methods. The results are successfully applied to a paper printability and runnability inspection setup. Missing dots in a repeating raster pattern are detected from Heliotest strips and small surface defects from IGT picking papers.
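The common idea behind the three methods, separating the regular and irregular parts of an image via the Fourier transform, can be sketched in a few lines of Python: suppress the dominant spectral peaks (which encode the repeating pattern) and inspect the residual. This is a generic illustration of the principle, not one of the thesis's specific methods, and the 1% peak fraction is an arbitrary illustrative choice:

```python
import numpy as np

def irregularity_map(image, peak_fraction=0.01):
    """Suppress the strongest Fourier components (the regular pattern)
    and return the residual image, where irregularities stand out."""
    F = np.fft.fft2(image)
    mag = np.abs(F)
    # Zero out the dominant peaks; note this also removes the DC term,
    # i.e. the mean brightness, which is harmless for a residual map.
    thresh = np.quantile(mag, 1.0 - peak_fraction)
    F_res = np.where(mag >= thresh, 0.0, F)
    return np.abs(np.fft.ifft2(F_res))
```

On a half-tone raster, the raster itself concentrates in a few strong peaks and is removed, while a missing dot survives in the residual and can then be found by thresholding.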

Relevance:

100.00%

Abstract:

PURPOSE: This study aimed to determine the neuro-mechanical and metabolic adjustments in the lower limbs induced by the running anaerobic sprint test (the so-called RAST). METHODS: Eight professional football players performed 6 × 35 m sprints interspersed with 10 s of active recovery on artificial turf with their football shoes. Sprinting mechanics (plantar pressure insoles), root mean square activity of the vastus lateralis (VL), rectus femoris (RF), and biceps femoris (BF) muscles (surface electromyography, EMG) and VL muscle oxygenation (near-infrared spectroscopy) were monitored continuously. RESULTS: Sprint time, contact time and total stride duration increased from the first to the last repetition (+17.4, +20.0 and +16.6 %; all P < 0.05), while flight time and stride length remained constant. Stride frequency (-13.9 %; P < 0.001) and vertical stiffness decreased (-27.2 %; P < 0.001) across trials. Root mean square EMG activities of RF and BF (-18.7 and -18.1 %; P < 0.01 and 0.001, respectively), but not VL (-1.2 %; P > 0.05), decreased over sprint repetitions and were correlated with the increase in running time (r = -0.82 and -0.90; both P < 0.05). Together with a better maintenance of RF and BF muscles activation levels over sprint repetitions, players with a better repeated-sprint performance (lower cumulated times) also displayed faster muscle de- (during sprints) and re-oxygenation (during recovery) rates (r = -0.74 and -0.84; P < 0.05 and 0.01, respectively). CONCLUSION: The repeated anaerobic sprint test leads to substantial alterations in stride mechanics and leg-spring behaviour. Our results also strengthen the link between repeated-sprint ability and the change in neuromuscular activation as well as in muscle de- and re-oxygenation rates.
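For reference, the vertical stiffness whose 27% decrease is reported above is conventionally defined as peak vertical ground-reaction force divided by the downward displacement of the centre of mass during contact (the general definition; the study's exact estimation method from plantar-pressure and timing data may differ):

\[
k_{\mathrm{vert}} = \frac{F_{z,\max}}{\Delta z}
\]

A longer contact time with unchanged flight time, as observed across the sprint repetitions, implies a larger \(\Delta z\) for a similar peak force, hence the drop in \(k_{\mathrm{vert}}\).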

Relevance:

100.00%

Abstract:

Plants have the ability to use the composition of incident light as a cue to adapt their development and growth to the environment. Arabidopsis thaliana, as well as many crops, is best adapted to sunny habitats. When subjected to shade, these plants exhibit a variety of physiological responses collectively called the shade avoidance syndrome (SAS). It includes increased growth of the hypocotyl and petioles, a decreased growth rate of the cotyledons, and reduced branching and crop yield. These responses are mainly mediated by the phytochrome photoreceptors, which exist either in an active, far-red light (FR) absorbing isoform or an inactive, red light (R) absorbing isoform. In direct sunlight, the R to FR light (R/FR) ratio is high and converts the phytochromes into their physiologically active state. The phytochromes interact with downstream transcription factors such as the PHYTOCHROME INTERACTING FACTORS (PIFs), which are subsequently degraded. Light filtered through a canopy is strongly depleted in R, which results in a low R/FR ratio and renders the phytochromes inactive. Protein levels of the downstream transcription factors are then stabilized, which initiates the expression of shade-induced genes such as HFR1, PIL1 or ATHB-2. In my thesis, I investigated transcriptional responses mediated by the SAS in whole Arabidopsis seedlings. Using microarray and chromatin immunoprecipitation data, we identified genome-wide PIF4- and PIF5-dependent shade-regulated genes as well as putative direct target genes of PIF5. This revealed evidence for a direct regulatory link between phytochrome signaling and the growth-promoting phytohormone auxin (IAA) at the levels of biosynthesis, transport and signaling. Subsequently, it was shown that free IAA levels are upregulated in response to shade. Shade-induced auxin production is assumed to take place predominantly in the cotyledons of seedlings, implying that IAA is subsequently transported basipetally to the hypocotyl, where it enhances elongation growth. The importance of auxin transport for these growth responses has been established by chemical and genetic approaches. To gain a better understanding of the spatio-temporal transcriptional regulation by shade-induced auxin, in a second project I generated an organ-specific high-throughput dataset focusing on the cotyledons and hypocotyl of young Arabidopsis seedlings. Interestingly, the two organs show opposite growth regulation by shade. I first investigated the spatial transcriptional regulation of auxin-responsive genes, in order to determine to what extent gene expression patterns can be explained by the hypothesized movement of auxin from the cotyledons to the hypocotyl in shade. The analysis suggests that several genes are indeed regulated according to our prediction, while others are regulated in a more complex manner. In addition, analysis of the gene families of auxin biosynthesis and transport components led to the identification of family members essential for shade-induced growth responses, which were subsequently confirmed experimentally. Finally, the analysis of expression patterns identified several candidate genes which may explain aspects of the opposite growth responses of the two organs.