998 results for large segmental defects


Relevance:

20.00%

Publisher:

Abstract:

In this thesis, a production tester for link cards was designed and implemented. The link card is part of the readout system of the Compact Muon Solenoid, an experiment at the Large Hadron Collider particle accelerator being built at CERN. The task of the link card is to convert a parallel LVDS signal into a serial optical signal. During testing, the tester and the link card are placed in a crate, so the tester's connectors must be identical to those of the link card. The tester's outputs are the link card's inputs and vice versa, so during a test, programmable FPGA devices can be used to send a signal from one card to the other. The FPGA on the receiving card can then check whether the data arrived unchanged. The test is controlled from a computer, on which the user issues the command to start the test and to which the results are ultimately reported. The test results are also shown on the tester's LEDs. Because no link cards were available, the tester could not be tested in its final application. However, the functionality of the tester could largely be verified, and the expected results were obtained.
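
The data-integrity principle behind the tester (send a known pattern across the link, compare it word by word at the receiver) can be illustrated with a short sketch. This is a minimal software analogy of the FPGA-based check, under the assumption of an error-free channel; the function names are hypothetical, and the thesis itself implements this in FPGA logic rather than software.

```python
import random

def make_test_pattern(n_words: int, seed: int = 42) -> list[int]:
    """Generate a reproducible pseudo-random pattern of 16-bit words."""
    rng = random.Random(seed)
    return [rng.getrandbits(16) for _ in range(n_words)]

def verify_link(sent: list[int], received: list[int]) -> bool:
    """Pass only if every word arrived unchanged, mirroring the
    receiving FPGA's comparison of incoming data against the
    expected pattern."""
    return len(sent) == len(received) and all(
        s == r for s, r in zip(sent, received)
    )

pattern = make_test_pattern(1024)
# In the real tester the pattern crosses the LVDS-to-optical link;
# here we model an error-free channel by copying the data.
received = list(pattern)
print("Link OK" if verify_link(pattern, received) else "Link FAILED")
```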

Relevance:

20.00%

Publisher:

Abstract:

UPM-Kymmene Oyj's PK6 paper machine at Jämsänkoski has suffered from occasional bottom breaks on large rolls. The aim of this Master's thesis was to improve the control of super-jumbo roll winding on a VariTop centre winder built by Jagenberg. The literature part examines roll formation and various winding theories. Factors affecting the windability of paper, the causes of winding-induced defects, and the advantages of a soft winding drum compared with a hard one are also reviewed. In the experimental part, trial runs were carried out on a production slitter-winder. Based on the results of these trials, the slitter's running recipes were updated to correspond to the optimal roll structure. The advantages of a soft winding drum over a hard one were investigated in trial runs on a Voith pilot slitter in Krefeld, Germany. The results showed that, when winding with the same running recipe, the soft drum gave the finished roll a higher hardness. The effects of profile defects in the paper were also mitigated when a soft winding drum was used. Grinding of the inner surface of the cores, caused by slipping of the chucks, has occurred with large jumbo rolls. This is most likely due to the increased stress on the inner surface of the core, a consequence of the growth in roll size. Possible solutions to this problem are lengthening the chuck and reducing the clearance between the chuck and the core. The performance of the cores in winding was also tested, and they were found to meet the quality values specified by the suppliers. Static and kinetic friction are known to have a major influence on the formation of winding-induced defects. Paper samples of different grades were sent to UPM-Kymmene Oyj's research centre in Lappeenranta for friction measurements. The results revealed a significant difference between static and kinetic friction for the catalogue grade; for the standard SC grade the difference was clearly smaller. This may be one reason for the bottom-break problems occurring with catalogue grades.

Relevance:

20.00%

Publisher:

Abstract:

The aim of this thesis was to study the international partnerships of small and medium-sized (SME) software companies. The main objective was to find ways in which SME software companies could become strategic partners in the partner programmes of large international companies. A further objective was to examine how commitment between the partners could be strengthened so that SME software companies could gain real added value and international growth through partner programmes. The thesis is divided into a theoretical and an empirical part. The theoretical part focuses on high-technology marketing in the software industry and on international partnerships. The partner programmes of large companies have not previously been studied from the perspective of Finnish SME software companies, which justifies the empirical study. The empirical study was conducted as a qualitative case study, with semi-structured interviews as the research method. The results show that attaining the position of a strategic partner is a long and demanding journey for SMEs. The partner programmes of large international companies are usually complex, and gaining real added value through a partner programme demands considerable resources from SMEs. For SMEs to attain and retain the position of a strategic partner in a partner programme, active, daily interaction between the partners is required. In particular, close personal relationships with the right key individuals are a necessity. Close contacts enable SMEs to bypass, at least in part, the bureaucracy of the partner programme, which increases trust and commitment in the partnership and promotes international growth and business success.

Relevance:

20.00%

Publisher:

Abstract:

AIMS: To study weight, length, body composition, sleeping energy expenditure (SEE), and respiratory quotient (RQ) at birth and at 5 mo of age in both adequate-for-gestational-age (AGA) and large-for-gestational-age (LGA) subjects; to compare the changes in body weight and body composition adjusting for gender, age, SEE, RQ and several maternal factors; to investigate the contribution of initial SEE and RQ to changes in body weight and body composition. METHODS: Sixty-nine neonates were recruited among term infants in the University Hospital of Verona, Italy. Forty-nine subjects participated until follow-up. At birth and follow-up, weight and length were measured and arm-fat area and arm-muscle area were calculated from triceps and subscapular skinfolds. SEE and RQ were measured by indirect calorimetry. RESULTS: At birth, weight, length, arm-muscle and arm-fat areas were significantly higher in LGA subjects than in AGA subjects. Weight status, SEE and RQ at birth did not explain the relative weight change after adjusting for gestational weight, placental weight, age at follow-up and gender. Arm-fat area and weight/length ratio at birth were negatively associated with relative changes in body weight after adjusting for the above variables (p < 0.05). CONCLUSION: Early growth from birth to 5 mo of life is significantly affected by body size and adiposity at birth. Fatter newborns had a slower growth rate than thinner newborns.
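
The covariate adjustment described above amounts to a multiple linear regression of relative weight change on the birth measures. A minimal sketch with the statsmodels formula API follows; the data file and all column names are hypothetical placeholders for the study's measurements, not the authors' actual dataset.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical column names standing in for the study's variables.
df = pd.read_csv("infant_growth.csv")  # assumed data file

model = smf.ols(
    "rel_weight_change ~ arm_fat_area + weight_length_ratio"
    " + gestational_weight + placental_weight + age_followup + C(sex)",
    data=df,
).fit()

# A negative, significant coefficient on arm_fat_area corresponds to
# the reported finding that fatter newborns grew more slowly.
print(model.summary())
```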

Relevance:

20.00%

Publisher:

Abstract:

Proteins PRPF31, PRPF3 and PRPF8 (RP-PRPFs) are ubiquitously expressed components of the spliceosome, a macromolecular complex that processes nearly all pre-mRNAs. Although these spliceosomal proteins are conserved in eukaryotes and are essential for survival, heterozygous mutations in human RP-PRPF genes lead to retinitis pigmentosa, a hereditary disease restricted to the eye. Using cells from patients with 10 different mutations, we show that all clinically relevant RP-PRPF defects affect the stoichiometry of spliceosomal small nuclear RNAs (snRNAs), the protein composition of tri-small nuclear ribonucleoproteins and the kinetics of spliceosome assembly. These mutations cause inefficient splicing in vitro and affect constitutive splicing ex vivo by impairing the removal of at least 9% of endogenously expressed introns. Alternative splicing choices are also affected when RP-PRPF defects are present. Furthermore, we show that the steady-state levels of snRNAs and processed pre-mRNAs are highest in the retina, indicating a particularly elevated splicing activity. Our results suggest a role for PRPF defects in the etiology of PRPF-linked retinitis pigmentosa, which appears to be a truly systemic splicing disease. Although these mutations cause widespread and important splicing defects, they are likely tolerated by the majority of human tissues but are critical for retinal cell survival.

Relevance:

20.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, large or potentially large populations may also need active management to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop theoretical models to predict large population dynamics, as well as software tools to assess the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has experienced a spectacular increase since its reintroduction in Switzerland at the beginning of the 20th century, was used as the paradigm species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package, named SIM-Ibex, allowing census data maintenance, automated parameter assessment, and the tuning and simulation of culling strategies. However, population dynamics are driven not only by demographic factors but also by dispersal and the colonisation of new areas. Habitat suitability and obstacles to dispersal therefore also had to be modelled. Thus a software package named Biomapper was developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute niche marginality and specialisation factors from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and result validation and further processing; a module also allows the mapping of dispersal barriers and corridors. The application domain of the ENFA was then explored by means of a simulated species distribution. Compared with a commonly used habitat suitability method, the Generalised Linear Model (GLM), the ENFA proved better suited to spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with the demands of landscape realism and the technical constraints of large population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density dependence and stochasticity. A software tool named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g. computed by SIM-Ibex) and a habitat suitability map (e.g. computed by Biomapper); and (2) running simulations. It allows the study of the spread of an invading species across a complex landscape made of areas of varying suitability and dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by wildlife managers and government inspectors to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Likewise, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. As these software tools were designed to build a complex, realistic model from raw data, and as they benefit from an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology might also be addressed by these approaches.
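
The local dynamics model combines a Leslie matrix with density dependence, environmental stochasticity and culling. Below is a minimal sketch of that combination, assuming invented demographic rates, a logistic-style density feedback and a fixed quota; it illustrates the modelling approach, not SIM-Ibex's actual parameterisation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-age-class Leslie matrix (fecundities on the first
# row, survival probabilities on the subdiagonal).
L = np.array([
    [0.0, 0.8, 0.5],
    [0.7, 0.0, 0.0],
    [0.0, 0.9, 0.0],
])
K = 500.0                           # carrying capacity (assumed)
n = np.array([100.0, 60.0, 40.0])   # initial age structure

for year in range(50):
    # Density dependence: scale vital rates down as N approaches K.
    density_factor = max(0.0, 1.0 - n.sum() / K)
    # Environmental stochasticity: lognormal noise for good/bad years.
    noise = rng.lognormal(mean=0.0, sigma=0.1)
    n = (L * (0.5 + 0.5 * density_factor) * noise) @ n
    # Culling: remove a fixed quota, spread over the age classes.
    quota = 10.0
    n = np.maximum(n - quota * n / max(n.sum(), 1e-9), 0.0)

print(f"Population after 50 years: {n.sum():.0f}")
```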

Relevance:

20.00%

Publisher:

Abstract:

At least since the Great Depression, explaining why there are business fluctuations has been one of the biggest challenges that the science of economics has had to face. The hope is that if we could better understand recessions, then we could also be more successful in overcoming them. This dissertation consists of three papers that are part of the general endeavor of economists to understand these fluctuations. The first paper discusses, for a particular model, whether a result related to fluctuations would still hold if time were modeled as continuous rather than discrete. The two other papers focus on price stickiness. The second paper discusses why, after a large devaluation, prices of non-tradables may change by only a small amount in comparison to the magnitude of the devaluation. The third paper examines price adjustment in a model in which information is imperfect and it is costly to change prices.

Relevance:

20.00%

Publisher:

Abstract:

This thesis supplements the systematic approach to competitive intelligence and competitor analysis by introducing an information-processing perspective on management of the competitive environment and the competitors therein. The cognitive questions connected to the intelligence process, and the means that organizational actors use in sharing information, are discussed. The ultimate aim has been to deepen knowledge of the different intraorganizational processes that a corporate organization uses to manage and exploit the vast amount of competitor information it receives from the environment. Competitor information and competitive knowledge management is examined as a process in which organizational actors identify and perceive the competitive environment by using cognitive simplification, make interpretations resulting in learning, and finally utilize competitor information and competitive knowledge in their work processes. The sharing of competitor information and competitive knowledge is facilitated by intraorganizational networks that evolve as a means of developing a shared, organizational-level knowledge structure and of ensuring that the right information is in the right place at the right time. This thesis approaches competitor information and competitive knowledge management both theoretically and empirically. Based on the conceptual framework developed by theoretical elaboration, further understanding of the studied phenomena is sought through an empirical study. The empirical research was carried out in a multinationally operating forest industry company. This thesis makes some preliminary suggestions for improving the competitive intelligence process. It is concluded that managing competitor information and competitive knowledge is not simply a question of managing information flow or improving the sophistication of competitor analysis; the crucial question is rather how to improve the cognitive capabilities connected to identifying and interpreting the competitive environment, and how to increase learning. It is argued that competitive intelligence cannot be treated as an organizational function or assigned solely to a specialized intelligence unit.

Relevance:

20.00%

Publisher:

Abstract:

It is commonly observed that complex fabricated structures subject to fatigue loading fail at the welded joints. Some problems can be corrected by proper detail design, but fatigue performance can also be improved using post-weld improvement methods. In general, improvement methods can be divided into two main groups: weld geometry modification methods and residual stress modification methods. The former remove weld toe defects and/or reduce the stress concentration, while the latter introduce compressive stress fields in the area where fatigue cracks are likely to initiate. Ultrasonic impact treatment (UIT) is a novel post-weld treatment method that both influences the residual stress distribution and improves the local geometry of the weld. The structural fatigue strength of non-load-carrying attachments in the as-welded condition has been experimentally compared to the structural fatigue strength of ultrasonic impact treated welds. Longitudinal attachment specimens made of two thicknesses of steel S355 J0 were tested to determine the efficiency of ultrasonic impact treatment. Treated welds were found to have about 50% greater structural fatigue strength when the slope of the S-N curve is three. High mean stress fatigue testing based on the Ohta method decreased the degree of weld improvement by only 19%. This indicated that the method could also be applied to large fabricated structures operating under high reactive residual stresses equilibrated within the volume of the structure. The thickness of the specimens had no significant effect on the structural fatigue strength; the fatigue class difference between the 5 mm and 8 mm specimens was only 8%. It was hypothesized that the UIT method added a significant crack initiation period to the total fatigue life of the welded joints. Crack initiation life was estimated by a local strain approach. Material parameters were defined using a modified Uniform Material Law developed in Germany. Finite element analysis and X-ray diffraction were used to define, respectively, the stress concentration and the mean stress. The theoretical fatigue life showed good accuracy compared with the experimental fatigue tests. The predictive behaviour of the local strain approach combined with the Uniform Material Law was excellent for the joint types and conditions studied in this work.
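
The quoted strength improvement translates directly into life through the S-N relation N = 2·10^6 (FAT/Δσ)^m with slope m = 3, as used in the IIW fatigue class concept. The short sketch below shows why a 50% gain in fatigue strength at the same stress range means roughly a 3.4-fold life extension; the FAT values are illustrative assumptions, not the thesis results.

```python
def cycles_to_failure(stress_range_mpa: float, fat_class_mpa: float,
                      slope: float = 3.0) -> float:
    """IIW-style S-N curve: FAT is the stress range giving a life of
    two million cycles; life scales with (FAT / stress_range)^slope."""
    return 2e6 * (fat_class_mpa / stress_range_mpa) ** slope

stress = 100.0          # applied stress range, MPa (illustrative)
fat_as_welded = 80.0    # assumed as-welded fatigue class
fat_treated = 1.5 * fat_as_welded  # ~50% improvement reported for UIT

n_aw = cycles_to_failure(stress, fat_as_welded)
n_uit = cycles_to_failure(stress, fat_treated)
print(f"Life improvement factor: {n_uit / n_aw:.2f}")  # 1.5**3 = 3.38
```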

Relevance:

20.00%

Publisher:

Abstract:

Quality inspection and assurance is a very important step when today's products are sold to markets. As products are produced in vast quantities, the interest in automating quality inspection tasks has increased correspondingly. Quality inspection tasks usually require the detection of deficiencies, defined as irregularities in this thesis. Objects containing regular patterns appear quite frequently in certain industries and sciences, e.g. half-tone raster patterns in the printing industry, crystal lattice structures in solid-state physics, and solder joints and components in the electronics industry. In this thesis, the problem of regular patterns and irregularities is described in analytical form and three different detection methods are proposed. All the methods are based on the ability of the Fourier transform to represent regular information compactly. The Fourier transform enables the separation of the regular and irregular parts of an image, but the three methods presented are shown to differ in generality and computational complexity. The need to detect fine and sparse details is common in quality inspection tasks, e.g. locating small fractures in components in the electronics industry or detecting tearing in paper samples in the printing industry. In this thesis, a general definition of such details is given by defining sufficient statistical properties in the histogram domain. The analytical definition allows a quantitative comparison of methods designed for detail detection. Based on the definition, the use of existing thresholding methods is shown to be well motivated. A comparison of thresholding methods shows that minimum error thresholding outperforms the other standard methods. The results are successfully applied to a paper printability and runnability inspection setup: missing dots are detected from the repeating raster pattern of Heliotest strips, and small surface defects from IGT picking papers.
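
The core idea, separating the regular and irregular parts of an image in the Fourier domain, can be sketched in a few lines: the periodic pattern concentrates into strong spectral peaks, so suppressing those peaks and transforming back leaves mostly the irregularities. This is a generic illustration of the principle, not a reproduction of the thesis's three methods; the peak-selection rule here is a simple magnitude quantile, an assumption of the sketch.

```python
import numpy as np

def irregularity_map(image: np.ndarray, keep_fraction: float = 0.01) -> np.ndarray:
    """Suppress the strongest spectral components (the regular pattern)
    and return the residual image, which highlights irregularities."""
    spectrum = np.fft.fft2(image)
    magnitude = np.abs(spectrum)
    # Everything above the (1 - keep_fraction) magnitude quantile is
    # treated as part of the regular pattern and zeroed out.
    cutoff = np.quantile(magnitude, 1.0 - keep_fraction)
    spectrum_irregular = np.where(magnitude >= cutoff, 0.0, spectrum)
    return np.real(np.fft.ifft2(spectrum_irregular))

# Synthetic example: a regular raster with one missing dot.
x, y = np.meshgrid(np.arange(128), np.arange(128))
raster = (np.sin(2 * np.pi * x / 8) * np.sin(2 * np.pi * y / 8) > 0.5).astype(float)
raster[60:68, 60:68] = 0.0  # simulated missing dot
residual = irregularity_map(raster)
defect_stronger = np.abs(residual[60:68, 60:68]).max() > np.abs(residual[:16, :16]).max()
print("Residual peaks at the defect:", defect_stronger)
```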

Relevance:

20.00%

Publisher:

Abstract:

The future of high-technology welded constructions will be characterised by higher-strength materials and improved weld quality with respect to fatigue resistance. The expected implementation of high-quality, high-strength steel welds will require that more attention be given to the issues of crack initiation and mechanical mismatching. Experiments and finite element analyses were performed within the framework of continuum damage mechanics to investigate the effect of mismatching of welded joints on void nucleation and coalescence during monotonic loading. It was found that the damage of undermatched joints mainly occurred in the sandwich layer, and the damage resistance of the joints decreases as the sandwich layer width decreases. The damage of overmatched joints mainly occurred in the base metal adjacent to the sandwich layer, and the damage resistance of the joints increases as the sandwich layer width decreases. The mechanisms of initiation of the micro voids/cracks were found to be cracking of the inclusions or of the embrittled second phase, and debonding of the inclusions from the matrix. Experimental fatigue crack growth rate testing showed that the fatigue life of undermatched central crack panel specimens is longer than that of overmatched and even-matched specimens. Further investigation by elastic-plastic finite element analysis indicated that fatigue crack closure, which originated from the inhomogeneous yielding adjacent to the crack tip, played an important role in fatigue crack propagation. The applicability of the J-integral concept to the mismatched specimens with crack extension under cyclic loading was assessed. The concept of fatigue class used by the International Institute of Welding was introduced in the parametric numerical analysis of several welded joints. The effect of weld geometry and load condition on the fatigue strength of ferrite-pearlite steel joints was systematically evaluated based on linear elastic fracture mechanics. Joint types included lap joints, angle joints and butt joints. Various combinations of tensile and bending loads were considered during the evaluation, with emphasis on the existence of both root and toe cracks. For a lap joint with a small lack of penetration, a reasonably large weld leg and a smaller flank angle are recommended in engineering practice in order to achieve higher fatigue strength. It was found that the fatigue strength of the angle joint depended strongly on the location and orientation of the pre-existing crack-like welding defects, even if the joint was welded with full penetration. It is commonly believed that double-sided butt welds can have significantly higher fatigue strength than single-sided welds, but fatigue crack initiation and propagation can originate from the weld root if the welding procedure results in partial penetration. It is clearly shown that the fatigue strength of the butt joint can be improved remarkably by ensuring full penetration. Nevertheless, increasing the fatigue strength of a butt joint by increasing the size of the weld is an uneconomical alternative.
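
A linear elastic fracture mechanics evaluation of this kind rests on integrating a crack growth law such as the Paris law, da/dN = C (ΔK)^m with ΔK = Y Δσ √(πa). The sketch below integrates it numerically from an initial root or toe defect to a final crack size; the material constants and geometry factor are illustrative placeholders in the typical range for steel, not values from the thesis.

```python
import math

def paris_life(delta_sigma_mpa: float, a0_m: float, af_m: float,
               c: float = 3e-13, m: float = 3.0, y: float = 1.12,
               steps: int = 10000) -> float:
    """Cycles to grow a crack from a0 to af under the Paris law
    da/dN = C * (Y * dsigma * sqrt(pi * a))**m (units: MPa, metres)."""
    cycles = 0.0
    da = (af_m - a0_m) / steps
    a = a0_m
    for _ in range(steps):
        delta_k = y * delta_sigma_mpa * math.sqrt(math.pi * a)  # MPa*sqrt(m)
        cycles += da / (c * delta_k ** m)
        a += da
    return cycles

# Illustrative: a 0.2 mm toe defect grown to 5 mm at a 100 MPa range.
print(f"{paris_life(100.0, 0.2e-3, 5.0e-3):.3g} cycles")
```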

Relevance:

20.00%

Publisher:

Abstract:

To date, published studies of alluvial bar architecture in large rivers have been restricted mostly to case studies of individual bars and single locations. Relatively little is known about how the depositional processes and sedimentary architecture of kilometre-scale bars vary within a multi-kilometre reach or over several hundreds of kilometres downstream. This study presents Ground Penetrating Radar and core data from 11 kilometre-scale bars from the Rio Parana, Argentina. The investigated bars are located between 30 km upstream and 540 km downstream of the Rio Parana - Rio Paraguay confluence, where a significant volume of fine-grained suspended sediment is introduced into the network. Bar-scale cross-stratified sets, with lengths and widths up to 600 m and thicknesses up to 12 m, enable the distinction of large river deposits from stacked deposits of smaller rivers, but are only present in half the surface area of the bars. Up to 90% of bar-scale sets are found on top of finer-grained ripple-laminated bar-trough deposits. Bar-scale sets make up as much as 58% of the volume of the deposits in small, incipient mid-channel bars, but this proportion decreases significantly with increasing age and size of the bars. Contrary to what might be expected, a significant proportion of the sedimentary structures found in the Rio Parana is similar in scale to those found in much smaller rivers. In other words, large river deposits are not always characterized by big structures that allow a simple interpretation of river scale. However, the large scale of the depositional units in big rivers causes small-scale structures, such as ripple sets, to be grouped into thicker cosets, which indicate river scale even when no obvious large-scale sets are present. The results also show that the composition of bars differs between the studied reaches upstream and downstream of the confluence with the Rio Paraguay. Relative to other controls on downstream fining, the tributary input of fine-grained suspended material from the Rio Paraguay causes a marked change in the composition of the bar deposits. Compared to the upstream reaches, the sedimentary architecture of the downstream reaches in the top ca 5 m of mid-channel bars shows: (i) an increase in the abundance and thickness (up to metre-scale) of laterally extensive (hundreds of metres) fine-grained layers; (ii) an increase in the percentage of deposits comprised of ripple sets (to >40% in the upper bar deposits); and (iii) an increase in bar-trough deposits and a corresponding decrease in bar-scale cross-strata (<10%). The thalweg deposits of the Rio Parana are composed of dune sets, even directly downstream from the Rio Paraguay where the upper channel deposits are dominantly fine-grained. Thus, the change in sedimentary facies due to a tributary point-source of fine-grained sediment is primarily expressed in the composition of the upper bar deposits.

Relevance:

20.00%

Publisher:

Abstract:

Hydrograph convolution is a product of tributary inputs from across the watershed. The time-space distribution of precipitation, the biophysical processes that control the conversion of precipitation to runoff, and channel flow conveyance processes are heterogeneous, and different areas respond to rainfall in different ways. We take a subwatershed approach to this and account for tributary flow magnitude, relative timing, and sequencing. We hypothesize that, as the scale of the watershed increases, we may start to see systematic differences in subwatershed hydrological response. We test this hypothesis for a large flood (T > 100 years) in a large watershed in northern England. We undertake a sensitivity analysis of the effects of changing subwatershed hydrological response using a hydraulic model. Delaying upstream tributary peak flow timing to make it asynchronous with downstream subwatersheds reduced flood magnitude. However, significant hydrograph adjustment in any one subwatershed was needed for meaningful reductions in stage downstream, although smaller adjustments in multiple tributaries resulted in comparable impacts. For larger hydrograph adjustments, the effect of changing the timing of two tributaries together was lower than the effect of changing each one separately. For smaller adjustments, synergy between two subwatersheds meant that the effect of changing them together could be greater than the sum of the parts. Thus, this work shows that while the effects of modifying biophysical catchment properties diminish with scale due to dilution effects, their impact on the relative timing of tributaries may, if applied in the right locations, be an important element of flood management.
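
The timing effect can be illustrated with a toy superposition experiment: shift one tributary hydrograph relative to another and watch the combined peak change. The study itself used a full hydraulic model; this sketch assumes simple additive routing and synthetic Gaussian hydrographs, so it only illustrates the sensitivity being tested, not the paper's model.

```python
import numpy as np

t = np.arange(0, 96, 0.25)  # hours

def tributary_hydrograph(peak_m3s: float, t_peak_h: float, spread_h: float) -> np.ndarray:
    """Synthetic Gaussian-shaped storm hydrograph."""
    return peak_m3s * np.exp(-0.5 * ((t - t_peak_h) / spread_h) ** 2)

main = tributary_hydrograph(400.0, 36.0, 8.0)
for lag in (0.0, 6.0, 12.0):
    trib = tributary_hydrograph(250.0, 36.0 + lag, 6.0)
    combined_peak = (main + trib).max()
    print(f"tributary lag {lag:4.1f} h -> downstream peak {combined_peak:6.1f} m3/s")
# Delaying the tributary peak desynchronises the inputs and lowers
# the combined downstream peak, mirroring the reported sensitivity.
```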

Relevance:

20.00%

Publisher:

Abstract:

Microphthalmia with linear skin defects (MLS) syndrome is an X-linked male-lethal disorder also known as MIDAS (microphthalmia, dermal aplasia, and sclerocornea). Additional clinical features include neurological and cardiac abnormalities. MLS syndrome is genetically heterogeneous given that heterozygous mutations in HCCS or COX7B have been identified in MLS-affected females. Both genes encode proteins involved in the structure and function of complexes III and IV, which form the terminal segment of the mitochondrial respiratory chain (MRC). However, not all individuals with MLS syndrome carry a mutation in either HCCS or COX7B. The majority of MLS-affected females have severe skewing of X chromosome inactivation, suggesting that mutations in HCCS, COX7B, and other as-yet-unidentified X-linked gene(s) cause selective loss of cells in which the mutated X chromosome is active. By applying whole-exome sequencing and filtering for X-chromosomal variants, we identified a de novo nonsense mutation in NDUFB11 (Xp11.23) in one female individual and a heterozygous 1-bp deletion in a second individual, her asymptomatic mother, and an affected aborted fetus of the subject's mother. NDUFB11 encodes one of 30 poorly characterized supernumerary subunits of NADH:ubiquinone oxidoreductase, known as complex I (cI), the first and largest enzyme of the MRC. By shRNA-mediated NDUFB11 knockdown in HeLa cells, we demonstrate that NDUFB11 is essential for cI assembly and activity as well as cell growth and survival. These results demonstrate that X-linked genetic defects leading to the complete inactivation of complex I, III, or IV underlie MLS syndrome. Our data reveal an unexpected role of cI dysfunction in a developmental phenotype, further underscoring the existence of a group of mitochondrial diseases associated with neurocutaneous manifestations.

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an approach based on the saddle-point approximation to study the equilibrium interactions between small molecules and macromolecules with a large number of sites. For this case, the application of the Darwin–Fowler method results in very simple expressions for the stoichiometric equilibrium constants and their corresponding free energies in terms of integrals of the binding curve plus a correction term which depends on the first derivatives of the binding curve at the points corresponding to an integer value of the mean occupation number. These expressions are simplified when the number of sites tends to infinity, providing an interpretation of the binding curve in terms of the stoichiometric stability constants. The formalism presented is applied to some simple complexation models, obtaining good values for the free energies involved. When heterogeneous complexation is assumed, simple expressions are obtained to relate the macroscopic description of the binding, given by the stoichiometric constants, with the microscopic description in terms of the intrinsic stability constants or the affinity spectrum. © 1999 American Institute of Physics.
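
The Darwin–Fowler step behind such expressions can be sketched as follows: the stoichiometric constants are the coefficients of the binding polynomial, extracted by a contour integral and evaluated at a saddle point. This is the generic form of the method, stated here for orientation only; the paper's specific correction terms are not reproduced.

```latex
% Binding polynomial with stoichiometric constants \beta_n:
%   \Xi(\lambda) = \sum_{n=0}^{N} \beta_n \lambda^n .
% Darwin--Fowler: recover \beta_n as a contour integral and evaluate
% it by steepest descent at the saddle point \lambda_s:
\[
\beta_n \;=\; \frac{1}{2\pi i} \oint
        \frac{\Xi(\lambda)}{\lambda^{\,n+1}}\,\mathrm{d}\lambda
   \;\approx\; \frac{\Xi(\lambda_s)}{\lambda_s^{\,n}\,
        \sqrt{2\pi\,\partial_{\ln\lambda}^{2}\ln\Xi(\lambda_s)}},
\qquad
n \;=\; \left.\frac{\partial \ln \Xi}{\partial \ln \lambda}
        \right|_{\lambda_s}.
\]
% The saddle condition identifies \lambda_s as the free-ligand
% activity at which the binding curve (mean occupation) equals the
% integer n, which is how the binding curve enters the expressions
% for the stoichiometric constants.
```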