998 results for source parameters
Abstract:
Growth experiments showed that adenine and hypoxanthine can be used as nitrogen sources by several strains of K. pneumoniae under aerobic conditions. The assimilation of all the nitrogen atoms from these purines indicates that the catabolic pathway is complete and proceeds past allantoin. Here we identify the genetic system responsible for the oxidation of hypoxanthine to allantoin in K. pneumoniae. The hpx cluster consists of seven genes, for which an organization in four transcriptional units, hpxDE, hpxR, hpxO and hpxPQT, is proposed. The proteins involved in the oxidation of hypoxanthine (HpxDE) or uric acid (HpxO) did not display any similarity to other reported enzymes known to catalyze these reactions, but instead are similar to oxygenases acting on aromatic compounds. Expression of the hpx system is activated by nitrogen limitation and by the presence of specific substrates, with hpxDE and hpxPQT controlled by both signals. Nitrogen control of hpxPQT transcription, which depends on σ54, is mediated by the Ntr system. In contrast, neither NtrC nor NAC is involved in the nitrogen control of hpxDE, which depends on σ70 for transcription. Activation of these operons by the specific substrates is also mediated by different effectors and regulatory proteins. Induction of hpxPQT requires uric acid formation, whereas expression of hpxDE is induced by the presence of hypoxanthine through the regulatory protein HpxR. This LysR-type regulator binds to a TCTGC-N4-GCAAA site in the intergenic hpxD-hpxR region. When bound to this site for hpxDE activation, HpxR negatively controls its own transcription.
Abstract:
Characterizing geological features and structures in three dimensions on inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to carry out investigations aimed at mapping geological contacts and building stratigraphic and fold models. Detailed 3D data, such as LiDAR point clouds, make it possible to study accurately the hazard processes and the structure of geological features, in particular on vertical and overhanging rock slopes. 3D geological models therefore have great potential for application to a wide range of geological investigations, both in research and in applied projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral / hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modelling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remote sensing data. During the past ten years several large rockfall events have occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky Highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain settlements and roads exposed to high risk. It is necessary to understand the main factors that destabilize rocky outcrops even where inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to improve the forecasting of potential future landslides, it is crucial to understand the evolution of rock slope stability.
Defining the areas theoretically most prone to rockfalls can be particularly useful for simulating trajectory profiles and for generating hazard maps, which are the basis for land use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. I then developed a model to compute failure mechanisms on terrestrial point clouds in order to assess rockfall susceptibility at the cliff scale. Similar procedures were already available to evaluate rockfall susceptibility based on aerial digital elevation models; this new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The computed most probable rockfall source areas in granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared with inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because of its particularly intense rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its significant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modelling.
In particular, I conducted a back-analysis of the large rockfall event of 2005 (265,000 m³) by integrating field observations of joint conditions, characteristics of the fracturing pattern and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique for obtaining vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley) we built a georeferenced vertical map of the main plutonic rocks, which was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of unstable rock compartments. Finally, the following points summarize the main outputs of my research: the new model for computing failure mechanisms and rockfall susceptibility with 3D point clouds allows accurate definition of the most probable rockfall source areas at the cliff scale; the analysis of the rock bridges at the Dru shows the potential of integrating detailed fracture measurements into geomechanical models of rock mass stability; the correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geological structures; and the integration of these results on rock mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
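As a hedged illustration of the kind of per-facet test such a point-cloud susceptibility model can apply (the thesis model itself is more elaborate and also uses the mapped joint sets), the sketch below flags a local best-fit plane as kinematically prone to planar sliding when its dip exceeds an assumed friction angle; the normal vectors and the 35° threshold are invented for illustration:

```python
import math

def planar_sliding_susceptible(normal, friction_angle_deg=35.0):
    """Flag a local best-fit plane as kinematically prone to planar sliding.

    `normal` is the unit outward normal of the plane fitted to a point-cloud
    neighbourhood; the facet is flagged when its dip exceeds an assumed
    friction angle (illustrative threshold only).
    """
    nx, ny, nz = normal
    dip = math.degrees(math.acos(abs(nz)))  # angle between plane and horizontal
    return dip > friction_angle_deg

# A facet dipping ~80 degrees is flagged; a horizontal one is not.
steep = planar_sliding_susceptible((math.sin(math.radians(80)), 0.0,
                                    math.cos(math.radians(80))))
flat = planar_sliding_susceptible((0.0, 0.0, 1.0))
```

In practice such a test would be run on the normals of every local plane fitted to the terrestrial point cloud, which is what allows susceptibility to be mapped even on overhanging faces that aerial DEMs cannot represent.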
-- Characterizing the geology of inaccessible rock walls in 3D is a necessary step for assessing natural hazards such as rockfalls and rockslides, but also for building stratigraphic models or models of folded structures. 3D geological models have great potential for application to a wide range of geological work, in research but also in applied projects such as mines, tunnels or reservoirs. Recent developments in ground-based remote sensing tools (LiDAR, photogrammetry and multispectral / hyperspectral imagery) are revolutionizing the acquisition of geomorphological and geological information. Consequently, there is great potential for improving the modelling of geological objects, as well as of failure mechanisms and stability conditions, by integrating detailed remotely acquired data. To increase the possibilities of forecasting future rockfalls, it is fundamental to understand the current evolution of rock wall stability. Defining the zones theoretically most prone to rockfalls can be very useful for simulating block propagation trajectories and for producing hazard maps, which form the basis of land use planning in mountain regions. The most important questions to answer in order to estimate rockfall hazard are: Where are the most probable sources of future rockfalls located? How frequently will these events occur? I therefore characterized the fracture networks in the field and with LiDAR point clouds. I then developed a model to compute failure mechanisms directly on the point clouds in order to evaluate rockfall susceptibility at the scale of the wall.
The most probable rockfall source zones in the granitic walls of Yosemite Valley and the Mont-Blanc massif were computed and then compared with event inventories to verify the methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on wall stability. The impact of rock bridge degradation on the stability of large rock compartments in the west face of the Petit Dru was evaluated using finite element modelling. In particular, I analysed the large rockfall of 2005 (265,000 m³), which removed the entire south-west pillar. Into the model I integrated observations of joint conditions, the characteristics of the fracture network and the results of geomechanical tests on the intact rock. These analyses improved the estimation of the parameters that influence the stability of rock compartments and served to define probable volumes for future rockfalls. The point clouds obtained with the terrestrial laser scanner were also successfully used to produce 3D geological maps, using the intensity of the reflected signal. Another technique for obtaining geological maps of vertical zones consists of combining a LiDAR mesh with a 2D geological map. At El Capitan (Yosemite Valley) we were able to georeference a vertical map of the main plutonic rocks, which I then used to study the reasons for the preferential erosion of certain zones of the wall. Further efforts to quantify the erosion rate were made at Monte Generoso (Ticino, Switzerland), where I tried to improve the estimate of long-term erosion by taking into account the volumes of unstable rock compartments.
Integrating these results on rock mass fracturing and composition with existing methods improves the assessment of rockfall hazard and increases the possibilities for interpreting the evolution of rock walls.
Abstract:
Innovations have recently become increasingly important sources of competitiveness for national economies and for the regions within them. Innovation activity has been identified as one of the most central factors in competition between companies and between regions. This study examines whether innovative companies have performed better than other companies in the same industry, and what role the regional innovation environment has played in their success or lack of it. The study first reviews the topic through theory and earlier research results. The empirical part covers a total of 36 companies from Etelä-Savo, operating in seven different industries. An in-depth interview was conducted on site at each company. Company performance was measured using indicators of growth, profitability and capital structure. The results obtained from these indicators were deepened with the interview findings. The study found that the innovative companies had performed slightly better than the other companies, but the difference between the comparison groups was not large. It was also observed that innovativeness does not guarantee good performance; rather, it mainly provides an opportunity for exceptional success. As an innovation environment, Etelä-Savo was considered supportive of companies' innovation activity, although room for improvement was also found. The most important development targets for the regional innovation environment were the functioning of public innovation services and of cooperation between companies, and the creation of an innovative milieu in the region.
Abstract:
The present work is part of a larger project whose purpose is to qualify Flash memory for automotive applications using a standardized test and measurement flow. High memory reliability and data retention are the most critical parameters in this application. The current work covers the functional tests and the data retention test. The purpose of the data retention test is to obtain the data retention parameters of the designed memory, i.e. the maximum time of information storage at specified conditions without critical charge leakage. For this purpose the charge leakage from the cells, which results in a decrease of the cells' threshold voltage, was measured after long-time high-temperature treatment at several temperatures. The amount of charge lost at each temperature was used to calculate the Arrhenius constant and the activation energy of the discharge process. With these data, the discharge of the cells at different temperatures over long periods can be predicted and the probability of data loss after years of operation can be calculated. The memory chips investigated in this work were 0.035 μm CMOS Flash memory test chips, designed for further use in systems-on-chip for automotive electronics.
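The Arrhenius extrapolation described above can be sketched as follows; the bake temperatures and charge-loss rates are invented for illustration (real inputs come from the retention measurements), and only the fitting and extrapolation steps are shown:

```python
import math

k = 8.617e-5  # Boltzmann constant in eV/K

# Illustrative (not measured) charge-loss rates after bakes at three
# temperatures; in the actual test these come from threshold-voltage shifts.
T = [423.0, 448.0, 473.0]            # bake temperatures in K (150/175/200 C)
rate = [2.0e-5, 9.5e-5, 3.8e-4]      # charge-loss rate, arbitrary units/h

# Arrhenius law: rate = A * exp(-Ea / (k*T)), so ln(rate) is linear in 1/T;
# fit a line to the Arrhenius plot by least squares.
x = [1.0 / t for t in T]
y = [math.log(r) for r in rate]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
slope = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
Ea = -slope * k                      # activation energy in eV
lnA = ybar - slope * xbar            # ln of the Arrhenius prefactor

# Extrapolate the discharge rate to an operating temperature of 398 K (125 C)
rate_op = math.exp(lnA + slope / 398.0)
```

With these invented inputs the fit gives an activation energy around 1 eV, a typical order of magnitude for Flash charge loss, and the extrapolated rate at operating temperature is much lower than any of the bake rates.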
Abstract:
Exposing the human bronchial epithelial cell line BEAS-2B to the nitric oxide (NO) donor sodium 1-(N,N-diethylamino)diazen-1-ium-1,2-diolate (DEA/NO) at an initial concentration of 0.6 mM, while generating superoxide ion at the rate of 1 microM/min with the hypoxanthine/xanthine oxidase (HX/XO) system, induced C:G-->T:A transition mutations in codon 248 of the p53 gene. This pattern of mutagenicity was not seen by 'fish-restriction fragment length polymorphism/polymerase chain reaction' (fish-RFLP/PCR) on exposure to DEA/NO alone; exposure to HX/XO, however, led to various mutations, suggesting that co-generation of NO and superoxide was responsible for inducing the observed point mutation. DEA/NO potentiated the ability of HX/XO to induce lipid peroxidation as well as DNA single- and double-strand breaks under these conditions, while 0.6 mM DEA/NO in the absence of HX/XO had no significant effect on these parameters. The results show that a point mutation seen at high frequency in certain common human tumors can be induced by simultaneous exposure to reactive oxygen species and an NO source.
Abstract:
This chapter presents possible uses and examples of Monte Carlo methods for the evaluation of uncertainties in the field of radionuclide metrology. The method is already well documented in GUM Supplement 1, but here we present a more restrictive approach, in which the quantities of interest calculated by the Monte Carlo method are estimators of the expectation and standard deviation of the measurand, and the Monte Carlo method is used to propagate the uncertainties of the input parameters through the measurement model. This approach is illustrated by an example of the activity calibration of a ¹⁰³Pd source by liquid scintillation counting and by the calculation of a linear regression on experimental data points. An electronic supplement presents some algorithms that may be used to generate random numbers with various statistical distributions, for the implementation of this Monte Carlo calculation method.
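A minimal sketch of this restricted Monte Carlo approach, using a hypothetical activity model A = R / (eps * m) with invented input distributions (the chapter's actual calibration model and values are more detailed):

```python
import random
import statistics

random.seed(42)
N = 50_000

# Hypothetical measurement model for an activity calibration by liquid
# scintillation counting: A = R / (eps * m), with counting rate R,
# detection efficiency eps and source mass m. All numbers are invented.
samples = []
for _ in range(N):
    R = random.gauss(1250.0, 15.0)    # counts/s, Gaussian uncertainty
    eps = random.uniform(0.92, 0.98)  # efficiency, rectangular distribution
    m = random.gauss(0.100, 0.001)    # source mass in g, Gaussian uncertainty
    samples.append(R / (eps * m))     # propagate through the model

A_hat = statistics.fmean(samples)     # estimator of the expectation
u_A = statistics.stdev(samples)       # estimator of the standard deviation
```

The histogram of `samples` approximates the distribution of the measurand; in this restricted approach only its mean and standard deviation are reported, rather than coverage intervals from the full distribution as in GUM Supplement 1.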
Abstract:
The CORNISH project is the highest resolution radio continuum survey of the Galactic plane to date. It is the 5 GHz radio continuum part of a series of multi-wavelength surveys that focus on the northern GLIMPSE region (10° < l < 65°), observed by the Spitzer satellite in the mid-infrared. Observations with the Very Large Array in B and BnA configurations have yielded a 1.5'' resolution Stokes I map with a root mean square noise level better than 0.4 mJy beam⁻¹. Here we describe the data-processing methods and data characteristics, and present a new, uniform catalog of compact radio emission. This includes an implementation of automatic deconvolution that provides much more reliable imaging than standard CLEANing. A rigorous investigation of the noise characteristics and reliability of source detection has been carried out. We show that the survey is optimized to detect emission on size scales up to 14'' and that for unresolved sources the catalog is more than 90% complete at a flux density of 3.9 mJy. We have detected 3062 sources above a 7σ detection limit and present their ensemble properties. The catalog is highly reliable away from regions containing poorly sampled extended emission, which comprise less than 2% of the survey area. Imaging problems have been mitigated by down-weighting the shortest spacings and by flagging potential artifacts via a rigorous manual inspection with reference to the Spitzer infrared data. We present images of the most common source types found: H II regions, planetary nebulae, and radio galaxies. The CORNISH data and catalog are available online at http://cornish.leeds.ac.uk.
Abstract:
Rapid manufacturing is an advanced manufacturing technology that produces a part layer by layer. This paper presents experimental work carried out to investigate the effects of scan speed, layer thickness, and build direction on the following part features: dimensional error, surface roughness, and mechanical properties, for DMLS with DS H20 powder and SLM with CL 20 powder (1.4404/AISI 316L). The findings were evaluated using analysis of variance (ANOVA). According to the experimental results, build direction has a significant effect on part quality in terms of dimensional error and surface roughness; for the SLM process, however, it has no influence on mechanical properties. These results help industry estimate part quality and mechanical properties before parts are produced by additive manufacturing with iron-based powders.
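As a hedged sketch of the kind of one-way ANOVA comparison used in such studies, the function below computes the F statistic from scratch; the surface-roughness values for three build directions are invented for illustration, not the paper's measurements:

```python
def one_way_anova_F(*groups):
    """F statistic for a one-way ANOVA: ratio of between-group to
    within-group mean squares."""
    k = len(groups)                              # number of groups
    n = sum(len(g) for g in groups)              # total observations
    grand = sum(sum(g) for g in groups) / n      # grand mean
    means = [sum(g) / len(g) for g in groups]    # group means
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented surface-roughness values (Ra, um) for three build directions
ra_0 = [9.1, 8.7, 9.4, 9.0]
ra_45 = [12.3, 11.8, 12.9, 12.1]
ra_90 = [15.2, 14.7, 15.9, 15.0]
F = one_way_anova_F(ra_0, ra_45, ra_90)  # a large F suggests direction matters
```

An F value far above 1 (compared against the F distribution with k-1 and n-k degrees of freedom) indicates that the factor, here build direction, has a statistically significant effect.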
Abstract:
This paper presents a prototype of an interactive web-GIS tool for risk analysis of natural hazards, in particular floods and landslides, based on open-source geospatial software and technologies. The aim of the presented tool is to assist experts (risk managers) in analysing the impacts and consequences of a given hazard event in a considered region, providing an essential input to the decision-making process in the selection of risk management strategies by responsible authorities and decision makers. The tool is based on the Boundless (OpenGeo Suite) framework and its client-side environment for prototype development, and it is one of the main modules of a web-based collaborative decision support platform for risk management. Within this platform, users can import the maps and information needed to analyse areas at risk. Based on the provided information and parameters, loss scenarios (amount of damage and number of fatalities) of a hazard event are generated on the fly and visualized interactively within the web-GIS interface of the platform. The annualized risk is calculated by combining the resultant loss scenarios for different return periods of the hazard event. The application of this prototype is demonstrated using a regional data set from one of the case study sites of the Marie Curie ITN CHANGES project, the Fella River in northeastern Italy.
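The annualized-risk combination described above can be sketched as a numerical integration of loss over annual exceedance frequency; the return periods and losses below are invented placeholders, not CHANGES project data:

```python
# Invented loss scenarios: (return period in years, loss in kEUR)
scenarios = [(30, 120.0), (100, 850.0), (300, 2400.0)]

# Annual exceedance frequency of each scenario is 1 / return period
freq = [1.0 / t for t, _ in scenarios]
loss = [l for _, l in scenarios]

# Trapezoidal integration of loss over decreasing exceedance frequency,
# giving the expected annual loss contributed by the modelled event range
annualized_risk = sum(
    0.5 * (loss[i] + loss[i + 1]) * (freq[i] - freq[i + 1])
    for i in range(len(scenarios) - 1)
)
```

Adding scenarios with more return periods refines the integral; events rarer or more frequent than the modelled range are by construction excluded from this estimate.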
Abstract:
Exposure to certain fungal particles and bacteria present in aerosols of the indoor environment has been associated with the development or exacerbation of respiratory conditions such as asthma, allergic rhinitis and aspergillosis (1-4). The main identified reservoir for airborne bacteria in this environment is the occupants themselves, whereas that of fungal particles is the outdoor environment or, when conditions allow, the indoor environment (5-7). Nevertheless, the nature and size of these fungal particles, and the impact of human occupancy on these parameters, have been little explored. The articles discussed in this note address precisely these aspects and illustrate the importance of taking them into account when assessing the risk of exposure to microorganisms in the indoor environment. The study by Hospodsky et al. (2014) provides quantitative information on the level of bacterial and fungal particle emission resulting from human occupancy of healthy indoor environments. The study by Afanou et al. (2014) shows the complexity of the fungal particles that can be generated in the indoor environment, with different mould species contributing in different proportions to the number of submicron particles through fragments of spores or hyphae.
Abstract:
BACKGROUND: Reference intervals for many laboratory parameters determined in 24-h urine collections are either not publicly available, or are based on small numbers, not sex specific, or not derived from a representative sample. METHODS: Osmolality and concentrations or enzymatic activities of sodium, potassium, chloride, glucose, creatinine, citrate, cortisol, pancreatic α-amylase, total protein, albumin, transferrin, immunoglobulin G, α1-microglobulin, α2-macroglobulin, as well as porphyrins and their precursors (δ-aminolevulinic acid and porphobilinogen), were determined in 241 24-h urine samples of a population-based cohort of asymptomatic adults (121 men and 120 women). For 16 of these 24 parameters, creatinine-normalized ratios were calculated based on 24-h urine creatinine. The reference intervals for these parameters were calculated according to the CLSI C28-A3 statistical guidelines. RESULTS: In contrast to most published reference intervals, which do not stratify for sex, reference intervals of 12 of 24 laboratory parameters in 24-h urine collections and of eight of 16 parameters as creatinine-normalized ratios differed significantly between men and women. For six parameters calculated as 24-h urine excretion and four parameters calculated as creatinine-normalized ratios, no reference intervals had been published before. For some parameters we found significant and relevant deviations from previously reported reference intervals, most notably for 24-h urine cortisol in women. Ten 24-h urine parameters showed weak or moderate sex-specific correlations with age. CONCLUSIONS: By applying up-to-date analytical methods and clinical chemistry analyzers to 24-h urine collections from a large population-based cohort, we provide the most comprehensive set to date of sex-specific reference intervals calculated according to CLSI guidelines for parameters determined in 24-h urine collections.
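A minimal sketch of the nonparametric rank method commonly used under CLSI C28-A3 to compute a central 95% reference interval (the interpolation convention shown is one common variant; the study's exact procedure may differ):

```python
def reference_interval(values, lower=2.5, upper=97.5):
    """Nonparametric reference interval via interpolated rank percentiles."""
    xs = sorted(values)
    n = len(xs)

    def percentile(p):
        r = p / 100.0 * (n + 1)          # rank on the 1..n scale
        i = min(max(int(r), 1), n)       # lower order statistic (clamped)
        j = min(i + 1, n)                # upper order statistic
        frac = r - int(r)                # interpolate between the two
        return xs[i - 1] + frac * (xs[j - 1] - xs[i - 1])

    return percentile(lower), percentile(upper)

# With n = 120 observations (e.g. the 120 women in the cohort) the 2.5th
# and 97.5th percentile ranks fall at 3.025 and 117.975
lo, hi = reference_interval(range(1, 121))
```

The guideline's usual minimum of about 120 reference subjects per partition is precisely what makes these interpolated ranks fall inside the observed data.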
Abstract:
Objects' borders are readily perceived despite absent contrast gradients, e.g. due to poor lighting or occlusion. In humans, a visual evoked potential (VEP) correlate of illusory contour (IC) sensitivity, the "IC effect", has been identified with an onset at ~90 ms and generators within the bilateral lateral occipital cortices (LOC). The IC effect is observed across a wide range of stimulus parameters, though until now it has always involved high-contrast achromatic stimuli. Whether IC perception and its brain mechanisms differ as a function of the type of stimulus cue remains unknown. Resolving this will provide insight into whether there is a unique solution, or multiple solutions, to how the brain binds spatially fractionated information into a cohesive perception. Here, participants discriminated IC from no-contour (NC) control stimuli that comprised either low-contrast achromatic stimuli or isoluminant chromatic-contrast stimuli (presumably biasing processing to the magnocellular and parvocellular pathways, respectively) on separate blocks of trials. Behavioural analyses revealed that ICs were readily perceived independently of the stimulus cue, i.e. when defined by either chromatic or luminance contrast. VEPs were analysed within an electrical neuroimaging framework and revealed a generally similar timing of IC effects across both stimulus contrasts (i.e. at ~90 ms). Additionally, an overall phase shift of the VEP on the order of ~30 ms was consistently observed in response to chromatic vs. luminance contrast, independently of the presence or absence of ICs. Critically, topographic differences in the IC effect were observed over the ~110-160 ms period; different configurations of intracranial sources contributed to IC sensitivity as a function of stimulus contrast. Distributed source estimations localized these differences to the LOC as well as to V1/V2.
The present data expand current models by demonstrating the existence of multiple, cue-dependent circuits in the brain for generating perceptions of illusory contours.
Abstract:
The evaluation of forensic evidence can occur at any level within the hierarchy of propositions, depending on the question being asked and on the amount and type of information taken into account in the evaluation. Commonly, DNA evidence is reported given propositions that address the sub-source level in the hierarchy, which deals only with the possibility that a nominated individual is a source of the DNA in a trace (or a contributor to the DNA in the case of a mixed trace). We explore the use of information obtained from examinations, presumptive and discriminating tests for body fluids, DNA concentrations and some case circumstances within a Bayesian network, in order to assist courts that have to consider propositions at source level. We use a scenario in which the presence of blood is of interest as an exemplar, and consider how DNA profiling results and the potential for laboratory error can be taken into account. We finish with examples of how the results of these reports could be presented in court using either numerical values or verbal descriptions of the results.
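As a hedged sketch of how such findings can be combined numerically (a full Bayesian network does not assume independence as this toy calculation does), the following computes a likelihood ratio for a source-level proposition from invented rates for a presumptive blood test, a random match probability and a laboratory false-positive probability:

```python
# Invented illustrative rates, not validated forensic figures
p_pos_if_blood = 0.99       # presumptive test sensitivity for blood
p_pos_if_not_blood = 0.05   # presumptive test false-positive rate
random_match_prob = 1e-9    # profile probability for an unrelated person
p_lab_error = 1e-4          # probability of a false match through lab error

# LR for the body-fluid component of the source-level proposition
lr_fluid = p_pos_if_blood / p_pos_if_not_blood

# LR for the DNA component, letting either coincidence or laboratory error
# explain a reported match when the suspect is not the source
p_match_given_hd = random_match_prob + (1 - random_match_prob) * p_lab_error
lr_dna = 1.0 / p_match_given_hd

# Naive combination assuming the two findings are independent
lr_combined = lr_fluid * lr_dna
```

Note how the assumed laboratory error rate, not the random match probability, dominates the denominator of the DNA component; taking such error into account is exactly why the combined likelihood ratio is capped far below one over the match probability.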