990 results for Acquisition system


Relevance: 30.00%

Abstract:

The overall system is designed to permit automatic collection of delamination field data for bridge decks. In addition to measuring and recording the data in the field, the system provides for transferring the recorded data to a personal computer for processing and plotting. This permits rapid turnaround from data collection to a finished plot of the results, in a fraction of the time previously required for manual analysis of the analog data captured on a strip-chart recorder. In normal operation, the Delamtect provides an analog voltage on each of two channels, proportional to the extent of any delamination. These voltages are recorded on a strip chart for later visual analysis. An event-marker voltage, produced by a momentary push button on the handle, is also provided by the Delamtect and recorded on a third channel of the analog recorder.
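The abstract describes digitizing a two-channel analog record plus an event-marker channel and plotting it on a PC. As a rough illustration of that post-processing step, here is a minimal Python sketch; the CSV column names, the 2.5 V marker threshold, and the file layout are assumptions, not the system's actual data format.

```python
import csv
import matplotlib.pyplot as plt

# Hypothetical record layout: a distance stamp plus the two Delamtect
# channel voltages and the event-marker voltage described above.
def read_delamtect_log(path):
    distance, ch1, ch2, events = [], [], [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            distance.append(float(row["distance_m"]))
            ch1.append(float(row["channel1_v"]))
            ch2.append(float(row["channel2_v"]))
            # Treat any marker voltage above 2.5 V as a pressed button.
            events.append(float(row["marker_v"]) > 2.5)
    return distance, ch1, ch2, events

def plot_delamination(path):
    d, ch1, ch2, events = read_delamtect_log(path)
    plt.plot(d, ch1, label="channel 1")
    plt.plot(d, ch2, label="channel 2")
    for x, ev in zip(d, events):
        if ev:
            plt.axvline(x, linestyle="--", alpha=0.3)  # event marker
    plt.xlabel("distance along deck (m)")
    plt.ylabel("Delamtect output (V)")
    plt.legend()
    plt.show()
```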

Relevance: 30.00%

Abstract:

This paper proposes to enrich RBMT dictionaries with Named Entities (NEs) automatically acquired from Wikipedia. The method is applied to the Apertium English-Spanish system and its performance compared to that of Apertium with and without hand-tagged NEs. The system with automatic NEs outperforms the one without NEs, while results vary when compared to a system with hand-tagged NEs (results are comparable for Spanish to English but slightly worse for English to Spanish). Apart from that, adding automatic NEs decreases the number of unknown terms by more than 10%.
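As a rough sketch of what generating such dictionary entries could look like, the snippet below emits simplified Apertium-style bilingual entries from NE pairs of the kind that can be harvested from Wikipedia inter-language links. The pairs are illustrative sample data, and the entry structure is schematic rather than the paper's actual pipeline.

```python
import xml.etree.ElementTree as ET

# Hypothetical NE pairs, as they might be harvested from
# Wikipedia inter-language links.
ne_pairs = [
    ("Danube", "Danubio"),
    ("Geneva", "Ginebra"),
]

def bidix_entry(en, es):
    # Schematic Apertium-style entry: <e><p><l>EN</l><r>ES</r></p></e>.
    # Real .dix entries additionally encode blanks (<b/>) in multiword
    # NEs and part-of-speech symbols (<s n="np"/>), omitted here.
    e = ET.Element("e")
    p = ET.SubElement(e, "p")
    ET.SubElement(p, "l").text = en
    ET.SubElement(p, "r").text = es
    return e

section = ET.Element("section", id="ne", type="standard")
for en, es in ne_pairs:
    section.append(bidix_entry(en, es))
print(ET.tostring(section, encoding="unicode"))
```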

Relevance: 30.00%

Abstract:

This work had two primary objectives: 1) to produce a working prototype for automated printability assessment and 2) to perform a study of available machine vision and other necessary hardware solutions. The three printability testing methods considered in this work (IGT Picking, Heliotest, and mottling) have several different requirements, and the task was to produce a single automated testing system suitable for all of them. A system was designed and built, and its performance was tested using the Heliotest. Working prototypes are important tools for implementing theoretical methods in practical systems and for testing and demonstrating the methods in real-life conditions. The system was found to be sufficient for the Heliotest method. Further testing and possible modifications related to the other two test methods were left for future work. A short study of available systems and solutions concerning image acquisition for machine vision was performed. The theoretical part of this study covers lighting systems, optical systems, and image acquisition tools, mainly cameras, along with the underlying physical aspects of each.

Relevance: 30.00%

Abstract:

Members of the TCF/LEF (T cell factor / lymphoid enhancer factor) family of DNA-binding factors play important roles during embryogenesis, in the establishment and/or maintenance of self-renewing tissues such as the immune system, and in malignant transformation. Specifically, it has been shown that TCF-1 is required for T cell development. A role for LEF-1 became apparent in mice harboring two hypomorphic TCF-1 alleles and consequently expressing low levels of TCF-1. Here we show that NK cell development is similarly regulated by redundant functions of TCF-1 and LEF-1, whereby TCF-1 contributes significantly more to NK cell development than LEF-1. Despite this role in NK cell development, LEF-1 is not required for the establishment of a repertoire of MHC class I-specific Ly49 receptors on NK cells. The proper formation of this repertoire depends to a large extent on TCF-1. These findings suggest common and distinct functions of TCF-1 and LEF-1 during lymphocyte development.

Relevance: 30.00%

Abstract:

Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements and we illustrate a data flux proposal.
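As a generic illustration of the kind of pipelined, concurrent design the abstract refers to, the sketch below chains processing stages through bounded queues. The stage names loosely echo the modules listed above; everything else is an assumption for illustration, not the actual Gaia payload design.

```python
import queue
import threading

# Toy pipeline: each stage runs concurrently and hands work to the
# next through a bounded queue, so data acquisition is not blocked
# by slower downstream processing during short bursts.
def stage(inbox, outbox, work):
    while True:
        item = inbox.get()
        if item is None:          # sentinel: shut the stage down
            if outbox is not None:
                outbox.put(None)
            break
        result = work(item)
        if outbox is not None:
            outbox.put(result)

acq_q = queue.Queue(maxsize=64)   # acquisition -> star-data processing
out_q = queue.Queue(maxsize=64)   # star-data processing -> data handling

stages = [
    threading.Thread(target=stage, args=(acq_q, out_q, lambda x: x * 2)),
    threading.Thread(target=stage, args=(out_q, None, print)),
]
for t in stages:
    t.start()
for sample in range(5):           # stand-in for instrument read-out
    acq_q.put(sample)
acq_q.put(None)
for t in stages:
    t.join()
```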

Relevance: 30.00%

Abstract:

An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this test site, covering an area of about 1 km².
In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun, combined with real-time quality control of the navigation using differential GPS (dGPS) on board and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm. Whereas the single 48-channel streamer system of Survey I required extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, these allow the determination of streamer feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail-line interval of Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone, which is perpendicular to the in-line direction. The data from Survey I show some reflection discontinuity as a result of insufficiently accurate navigation and positioning and the resulting binning errors. Aliasing observed in the 3-D migration was due to insufficient lateral sampling combined with the relatively high-frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber, bubble-canceling air gun for Survey II. A 15/15 Mini G.I air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m. While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array); otherwise, the resulting stacks differ for opposite shooting directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and complemented by two computer programs that convert the unconventional navigation data to industry standards. Processing included geometry assignment, trace editing, bin harmonization (to compensate for uneven fold due to boat and streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stacking, and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e., on a total of 600 spectra.
According to this velocity analysis, interval velocities range from 1450 to 1650 m/s in the unconsolidated sediments and from 1650 to 3000 m/s in the consolidated sediments. The delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse, and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows the application of such sophisticated techniques even to high-resolution seismic surveys. In general, the adaptation of the 3-D marine seismic reflection method, which to date has been used almost exclusively by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.
Seismic reflection is a method of subsurface investigation with very high resolving power. It consists of sending vibrations into the ground and recording the waves that reflect off geological discontinuities at different depths and then return to the surface, where they are recorded. The signals collected give information not only on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. For example, in the case of sedimentary rocks, seismic reflection profiles make it possible to determine their mode of deposition, their possible deformations or breaks, and hence their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection data were acquired along individual profiles that provide a two-dimensional image of the subsurface. The images thus obtained are only partially accurate, since they do not account for the three-dimensional nature of geological structures. In recent decades, three-dimensional (3-D) seismics has brought new impetus to the study of the subsurface. While it is now fully mastered for imaging large geological structures both on land and at sea, its adaptation to the lacustrine or fluvial scale has so far been the subject of only rare studies. This thesis work consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes. It is therefore smaller, lighter to deploy, and above all produces final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution on the order of ten meters, the instrument developed in this work can resolve details on the order of one meter. The new system relies on the ability to record seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each.
To obtain 3-D data, it is essential to position the instruments on the water (the source and the receivers of the seismic waves) with high precision. Software was developed specifically to control navigation and trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This makes it possible to position the instruments with an accuracy of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the Paudèze fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km². The seismic recordings were then processed to turn them into interpretable images, applying a 3-D processing sequence specially adapted to our data, notably with regard to positioning. After processing, the data reveal several main seismic facies, corresponding notably to the lacustrine sediments (Holocene), the glacio-lacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone, and the Subalpine Molasse south of that zone. The detailed 3-D geometry of the faults is visible on vertical and horizontal seismic sections. The excellent quality of the data and the interpretation of several horizons and fault surfaces show the potential of this technique for small-scale three-dimensional investigations, opening the way to its application in environmental and civil engineering domains.
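The 1.25 m by 3.75 m binning described above amounts to assigning each source-receiver midpoint to a cell of a regular grid before stacking. Below is a minimal sketch of that assignment, assuming midpoints are already expressed in in-line/cross-line survey coordinates; the coordinate frame and fold counting are illustrative, not the thesis's actual implementation.

```python
from collections import defaultdict

BIN_INLINE = 1.25   # m, in-line bin size (Survey II)
BIN_XLINE = 3.75    # m, cross-line bin size (Survey II)

def bin_index(midpoint_x, midpoint_y):
    """Map a source-receiver midpoint (survey coordinates, m)
    to its (in-line, cross-line) bin indices."""
    return int(midpoint_x // BIN_INLINE), int(midpoint_y // BIN_XLINE)

def compute_fold(midpoints):
    """Count traces per bin; uneven counts are what the bin
    harmonization step later compensates for."""
    fold = defaultdict(int)
    for x, y in midpoints:
        fold[bin_index(x, y)] += 1
    return fold

# Midpoint of a source at (0, 0) and a receiver at (10.0, 1.5):
sx, sy, rx, ry = 0.0, 0.0, 10.0, 1.5
print(bin_index((sx + rx) / 2, (sy + ry) / 2))   # -> (4, 0)
```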

Relevance: 30.00%

Abstract:

This thesis examines the history and evolution of information system process innovation (ISPI) processes (adoption, adaptation, and unlearning) within information system development (ISD) work in an internal information system (IS) department and in two IS software house organisations in Finland over a 43-year period. The study offers insights into the influential actors and their dependencies in decisions over ISPIs. The research uses a qualitative approach, and the methodology involves describing the ISPI processes, how the actors searched for ISPIs, and how the relationships between the actors changed over time. The existing theories were evaluated using conceptual models of the ISPI processes based on the innovation literature in the IS area. The main focus of the study was to observe changes in the main ISPI processes over time. The main contribution of the thesis is a new theory, where "theory" should be understood as 1) a new conceptual framework of the ISPI processes, and 2) new ISPI concepts and categories and the relationships between the ISPI concepts inside the ISPI processes. The study gives a comprehensive and systematic account of the history and evolution of the ISPI processes; reveals the factors that affected ISPI adoption; studies ISPI knowledge acquisition, information transfer, and adaptation mechanisms; and reveals the mechanisms affecting ISPI unlearning, the changes in the ISPI processes, and the diverse actors involved in the processes. The results show that both the internal IS department and the two IS software houses sought opportunities to improve their technical skills and career paths, and this created an innovative culture. When new technology generations come to the market, the platform systems need to be renewed, and therefore the organisations invest in ISPIs in cycles. The extent of internal learning and experimentation was greater than that of external knowledge acquisition. Until the outsourcing event (1984), decision-making was centralised and the internal IS department was very influential over ISPIs. After outsourcing, decision-making became distributed between the two IS software houses, the IS client, and its internal IT department. The IS client wanted to ensure that information systems would serve the business of the company and thus wanted to co-operate closely with the software organisations.

Relevance: 30.00%

Abstract:

PURPOSE: All methods presented to date to map both conductivity and permittivity rely on multiple acquisitions to compute quantitatively the magnitude of the radiofrequency transmit field, B1+. In this work, we propose a method to compute both conductivity and permittivity based solely on relative receive coil sensitivities (B1-), which can be obtained in a single measurement without the need to explicitly perform transmit/receive phase separation or to make assumptions regarding those phases. THEORY AND METHODS: To demonstrate the validity and the noise sensitivity of our method, we used electromagnetic finite-difference simulations of a 16-channel transceiver array. To experimentally validate our methodology at 7 Tesla, multi-compartment phantom data were acquired using a standard 32-channel receive coil system and two-dimensional (2D) and 3D gradient echo acquisitions. The reconstructed electric properties were correlated to those measured using dielectric probes. RESULTS: The method was demonstrated both in simulations and in phantom data, with correlations to both the modeled and bench measurements being close to identity. The noise properties were modeled and understood. CONCLUSION: The proposed methodology allows quantitative determination of the electrical properties of a sample using any MR contrast, with the only constraints being the need for 4 or more receive coils and high SNR. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
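For background on how electrical properties can be extracted from a field map at all, the sketch below applies the textbook homogeneous-Helmholtz relation, lap(B)/B = -mu0*w^2*eps + i*mu0*w*sigma, to a complex field map. This is a simplified illustration of the general principle, not the receive-sensitivity-only reconstruction the paper proposes.

```python
import numpy as np

MU0 = 4e-7 * np.pi     # vacuum permeability (H/m)
EPS0 = 8.854e-12       # vacuum permittivity (F/m)

def helmholtz_ept(b1, voxel_m, larmor_hz):
    """Estimate conductivity (S/m) and relative permittivity from a
    complex B1 field map via the homogeneous Helmholtz equation.
    Assumes locally homogeneous tissue and a noise-free map; real
    reconstructions need careful regularization of the Laplacian."""
    w = 2 * np.pi * larmor_hz
    lap = sum(np.gradient(np.gradient(b1, voxel_m, axis=a),
                          voxel_m, axis=a) for a in range(b1.ndim))
    ratio = lap / b1
    sigma = ratio.imag / (MU0 * w)
    eps_r = -ratio.real / (MU0 * EPS0 * w**2)
    return sigma, eps_r
```

At 7 Tesla the proton Larmor frequency is roughly 298 MHz, so `larmor_hz` would be about `298e6` for data like that in the abstract.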

Relevance: 30.00%

Abstract:

Sales configurators are essential tools for companies that offer complicated, case-specifically crafted products to customers. The most sophisticated of them are able to design an entire end product on the fly according to given constraints, calculate a price for the offer, and move the order into production. This thesis covers a sales configurator acquisition project in a large industrial company that offers cranes to its customers. The study spans the preliminary stages of a large-scale software purchase project, starting from the specification of the problem domain and ending with the presentation of the most viable software solution that fulfils the requirements for the new system. The project consists of mapping the usage environment and use cases, and collecting the requirements expected of the new system. The collected requirements involve fitting the new sales system into the enterprise application infrastructure, mitigating the risks involved in the project, and specifying new features for the application while preserving all of the valued features of the old sales system currently used in the company. The collected requirements were presented to a number of sales software vendors, who were asked to provide solution suggestions that would fulfil all the demands. All of the received solution proposals were evaluated to determine the most feasible ones, and the construction of the evaluation criteria was itself part of the study. The final outcome of this study is a short-list of the most feasible sales configurator solutions, together with a description of how the software purchase process in large enterprises works and which aspects deserve attention in large projects of a similar kind.

Relevance: 30.00%

Abstract:

Cold storage contributes largely to the quality and longevity of stored products. In recent years, the study of control strategies has intensified, with the aim of decreasing temperature fluctuation inside the storage chamber and reducing electric power consumption. This study developed a system for data acquisition and process control, written in the LabVIEW language, to be applied to the cooling system of a 30 m³ refrigerating chamber. The instrumentation and the application developed supported scientific experiments that studied the dynamic behavior of the refrigeration system and compared the performance of the control strategies, both in terms of the controlled temperature and of electricity consumption. The system was used to test on-off, PID, and fuzzy control strategies. Regarding power consumption, the fuzzy controller showed the best result, saving 10% compared with the other tested strategies.
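To make the comparison concrete, here is a minimal discrete PID temperature loop of the kind such a study would implement. The gains, setpoint, and toy chamber model are placeholders, not values from the study, which used LabVIEW rather than Python.

```python
class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, setpoint, dt):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.setpoint, self.dt = setpoint, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, measurement):
        error = self.setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Placeholder values: 0 degC setpoint, 1 s sample time, toy chamber model.
pid = PID(kp=2.0, ki=0.1, kd=0.5, setpoint=0.0, dt=1.0)
temp = 10.0                                 # initial chamber temperature, degC
for _ in range(5):
    cooling = max(0.0, -pid.update(temp))   # compressor effort is non-negative
    temp += 0.2 - 0.05 * cooling            # ambient heat gain minus cooling
    print(f"{temp:.2f}")
```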

Relevance: 30.00%

Abstract:

The aim of this study was to use digital images acquired by cameras attached to a helium balloon to detect variation in the nutritional status of Brachiaria decumbens. The treatments consisted of five nitrogen doses (0, 50, 100, 150, and 200 kg ha-1) with six replications each, evaluated in a completely randomized statistical design. A remote sensing system composed of digital cameras and microcomputers was used for image acquisition, and a helium balloon lifted the cameras to heights of 15, 20, 25, and 30 m. A portable chlorophyll meter and analyses of leaf nitrogen content were used for comparison with the data obtained by the remote sensing system. Data were acquired in two phases, under different climatic conditions. At the end of each phase, dry matter production was measured. Three vegetation indices were used to evaluate the detection of the different nutritional statuses. All three indices were able to detect the effects of the N doses, and the indices constructed with the green spectral band proved the more efficient.
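As an illustration of a green-band vegetation index of the kind the study favors, the sketch below computes GNDVI, (NIR - Green) / (NIR + Green), per pixel from co-registered near-infrared and green channels. The index choice and array layout are assumptions, since the abstract does not name the three indices used.

```python
import numpy as np

def gndvi(nir, green, eps=1e-9):
    """Green Normalized Difference Vegetation Index, computed per
    pixel from co-registered NIR and green reflectance arrays."""
    nir = nir.astype(float)
    green = green.astype(float)
    return (nir - green) / (nir + green + eps)

# Toy 2x2 example: higher values indicate a more vigorous canopy.
nir = np.array([[0.6, 0.5], [0.4, 0.3]])
green = np.array([[0.1, 0.15], [0.2, 0.25]])
print(gndvi(nir, green).round(2))
```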

Relevance: 30.00%

Abstract:

The thesis develops guidelines for the implementation of a health and safety management system according to the OHSAS 18001 standard, together with a threat analysis, a project proposal schedule, future system quality improvements, and an evaluation of the organizational change. The theoretical part clarifies the definition of occupational health and safety, its management system, the OHSAS 18001 standard, and the integrated management system composed of the ISO 14001, ISO 9001, and OHSAS 18001 standards. The literature covers such important aspects as the human factor, organizational policies, possible benefits, threats, organizational safety culture, Deming's quality improvement cycle, system implementation, maintenance, and cost matters. The empirical part demonstrates the real-life situation using Andritz Pulp & Paper Oy as a case study. Prior to the thesis proposal, the Andritz Group is analysed, including its separate business areas, acquisition and integration strategies, the current status of its health and safety management, and the parallel experiences of its largest business area, Andritz Hydro. The proposal aims at improving the current health and safety system for permanent and sub-contracted employees at Andritz Pulp & Paper, both in Finland and in various projects globally.

Relevance: 30.00%

Abstract:

Asthma and allergy are common diseases and their prevalence is increasing. One of the hypotheses that explains this trend is exposure to inhalable chemicals such as traffic-related air pollution. Epidemiological research supports this theory, as a correlation between environmental chemicals and allergic respiratory diseases has been found. In addition to ambient airborne particles, one may be exposed to engineered nanosized materials that are actively produced due to their favorable physico-chemical properties compared to their bulk-size counterparts. On the cellular level, improper activity of T helper (Th) cells has been connected to allergic reactions. Th cells can differentiate into functionally different effector subsets, which are identified according to their characteristic cytokine profiles, resulting in a specific ability to communicate with other cells. Th2 cells activate humoral immunity and stimulate the eradication of extracellular pathogens. However, a persistent predominance of Th2 cells is involved in the development of a number of allergic diseases. The cytokine environment at the time of antigen recognition is the major factor determining the polarization of a naïve Th cell. Th2 cell differentiation is initiated by IL4, which signals via the transcription factor STAT6. Although the importance of this pathway has been evaluated in mouse studies, the signaling components involved have been largely unknown. The aim of this thesis was to identify molecules that are under the control of IL4 and STAT6 in Th cells. This was done using system-level analysis of STAT6 target genes at the genome, mRNA, and protein levels, resulting in the identification of various genes not previously connected to Th2 cell phenotype acquisition. In the study, STAT6-mediated primary and secondary target genes were dissected from each other, and detailed transcriptional kinetics of the Th2 cell polarization of naïve human CD4+ T cells were collected. Integration of these data revealed the hierarchy of molecular events that mediates differentiation towards the Th2 cell phenotype. In addition, the results highlighted the importance of exploiting proteomics tools to complement studies on STAT6 target genes identified through transcriptional profiling. In the last subproject, the effects of exposure to ZnO and TiO2 nanoparticles were analyzed in the Jurkat T cell line and in primary human monocyte-derived macrophages and dendritic cells to evaluate their toxicity and potential to cause inflammation. Identification of ZnO-induced gene expression showed that the same nanoparticles may elicit markedly distinct responses in different cell types, underscoring the need for unbiased profiling of the target genes and pathways affected. The results gave additional proof that the cellular response to nanosized ZnO is due to leached Zn2+ ions. The approach used in the ZnO and TiO2 nanoparticle study demonstrated the value of assessing nanoparticle responses through toxicogenomics. The increased knowledge of Th2 cell signaling will hopefully reveal new therapeutic nodes and eventually improve our possibilities to prevent and tackle allergic inflammatory diseases.

Relevance: 30.00%

Abstract:

Weed mapping is a useful tool for site-specific herbicide applications. The objectives of this study were (1) to determine the percentage of land area covered by weeds in no-till and conventionally tilled fields of common bean using digital image processing and geostatistics, and (2) to compare two types of cameras. Two digital cameras (color and infrared) and a differential GPS were affixed to a center pivot structure for image acquisition. Sample field images were acquired in a regular grid pattern, and the images were processed to estimate the percentage of weed cover. After calculating the georeferenced weed percentage values, maps were constructed using geostatistical techniques. Based on the results, color images are recommended for mapping the percentage of weed cover in no-till systems, while infrared images are recommended for weed mapping in conventional tillage systems.
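Percent weed cover from a color image is typically estimated by classifying vegetation pixels and dividing by the total pixel count; below is a minimal sketch using the excess-green index (2g - r - b) with a fixed threshold. The index, threshold, and image layout are illustrative assumptions, as the abstract does not specify the segmentation method used.

```python
import numpy as np

def percent_weed_cover(rgb, threshold=0.05):
    """Estimate the percentage of vegetation pixels in an RGB image
    (H x W x 3, values in [0, 1]) using the excess-green index
    ExG = 2g - r - b on chromaticity-normalized channels."""
    total = rgb.sum(axis=2) + 1e-9
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2 * g - r - b
    return 100.0 * np.mean(exg > threshold)

# Toy image: left half greenish (vegetation), right half brownish (soil).
img = np.zeros((10, 10, 3))
img[:, :5] = [0.2, 0.6, 0.2]
img[:, 5:] = [0.5, 0.4, 0.3]
print(percent_weed_cover(img))   # -> 50.0
```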