940 results for quality function development


Relevance:

30.00%

Publisher:

Abstract:

Jasmonates are ubiquitous oxylipin-derived phytohormones that are essential in the regulation of many development, growth and defence processes. Across the plant kingdom, jasmonates act as elicitors of the production of bioactive secondary metabolites that serve in defence against attackers. Knowledge of the conserved jasmonate perception and early signalling machineries is increasing, but the downstream mechanisms that regulate defence metabolism remain largely unknown. Here we show that, in the legume Medicago truncatula, jasmonate recruits the endoplasmic-reticulum-associated degradation (ERAD) quality control system to manage the production of triterpene saponins, widespread bioactive compounds that share a biogenic origin with sterols. An ERAD-type RING membrane-anchor E3 ubiquitin ligase is co-expressed with saponin synthesis enzymes to control the activity of 3-hydroxy-3-methylglutaryl-CoA reductase (HMGR), the rate-limiting enzyme in the supply of the ubiquitous terpene precursor isopentenyl diphosphate. Thus, unrestrained bioactive saponin accumulation is prevented and plant development and integrity are secured. This control apparatus is equivalent to the ERAD system that regulates sterol synthesis in yeasts and mammals but uses distinct E3 ubiquitin ligases, of the HMGR degradation 1 (HRD1) type, to direct the destruction of HMGR. Hence, the general principles for the management of sterol and triterpene saponin biosynthesis are conserved across eukaryotes but can be controlled by divergent regulatory cues.

Relevance:

30.00%

Publisher:

Abstract:

Current trends such as globalization, the turbulence of our environment, rising living standards, the growing need for security, and the pace of technological development all underline the need to anticipate change. To remain competitive, companies must collect, analyze, and exploit business information that supports them in anticipating the actions of authorities, competitors, and customers. Innovation and the development of new concepts, the assessment of competitors' actions, and customer needs, among other things, require anticipatory evaluation. Weak signals play a central role in how organizations prepare for future events. The purpose of this thesis is to build and deepen the understanding and management of weak signals, and to develop a conceptual and practical approach for promoting anticipatory activity. The classification of weak-signal types is based on their properties with respect to time, strength, and integration into the business. The different types of weak signals, with their characteristics, set the boundary conditions for collecting quality factors and, from there, for developing a quality system and a tool based on a mathematical model. The quality factors of weak signals have been gathered from all areas of the weak-signal concept. Analyzed and targeted quality variables make it possible to develop pre-analysis and ICT tools based on the use of a mathematical model. To achieve the objectives of the thesis, a literature study on Business Intelligence was first conducted. The weak-signal process and system are based on the assembled Business Intelligence system. Business integration and the development of a systematic method were examined as the key development areas. Collecting weak-signal methods and definitions, and integrating them into the defined process, create the foundation of the new concept, to which the typology and quality factors are linked. To enable the examination of practical operation and deployment, a Business Intelligence market survey (n=156) was carried out, together with a summary of other available market surveys. In-depth interviews (n=21) verified the validity of the qualitative analysis. In addition, four practical projects were analyzed, and their conclusions were linked to the development of the new concept. The process can be divided into two categories: companies' market signals with a one-year anticipation horizon, and public-sector network projects developing a foresight structure for anticipatory activity 7-15 years ahead. The study was limited mainly to the area of external information. IT tools and the development of the final quality system were left outside its scope. The development of the weak-signals concept, the objective of this thesis, met the expectations set for it. The systematic examination and development of weak signals can be advanced by exploiting Business Intelligence practice. Business Intelligence is used to support business planning in large companies. However, organizations have generally not exploited anticipatory weak-signal activity based on qualitative analysis. Realizing the benefits of integrating external and internal information, and of a systematic approach, in support of SMEs requires significant public-sector investment in the form of funding and development support. Indeed, foresight has produced numerous public-sector reports, but no practical implementations.
On the other hand, the analyzed cases show that organizations do not necessarily need a dedicated project manager to develop business support. However, the right person must be found to take business responsibility and to commit to the matter.
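
To make the typing idea concrete, here is a minimal sketch of how weak signals might be classified by the three properties the thesis names (time, strength, business integration). All attribute scales, thresholds and type labels are illustrative assumptions, not the thesis's actual model:

```python
# Illustrative sketch of weak-signal typing by time, strength and
# business integration. Scales and thresholds are invented assumptions.
from dataclasses import dataclass

@dataclass
class WeakSignal:
    description: str
    horizon_years: float   # how far ahead the signal points
    strength: float        # 0.0 (faint) .. 1.0 (strong)
    integration: float     # 0.0 (detached from business) .. 1.0 (fully linked)

def classify(signal: WeakSignal) -> str:
    """Assign an illustrative type label from the three properties."""
    if signal.horizon_years >= 7:
        return "public-sector foresight (7-15 year structure)"
    if signal.strength < 0.3 and signal.integration < 0.3:
        return "emerging signal - collect more quality factors"
    return "market signal (about 1 year anticipation)"

sig = WeakSignal("competitor files unusual patent", 1.0, 0.4, 0.6)
print(classify(sig))
```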

Relevance:

30.00%

Publisher:

Abstract:

The central and peripheral nervous systems are involved in multiple age-dependent neurological deficits that are often attributed to alterations in the function of myelinating glial cells. However, the molecular events that underlie the age-related decline of glial cell function are unknown. We used Schwann cells as a model to study biological processes affected in glial cells by aging. We comprehensively profiled gene expression of the Schwann cell-rich mouse sciatic nerve throughout life, from the day of birth until senescence (840 days of age). We combined the aging data with microarray transcriptional data obtained using nerves isolated from the Schwann cell-specific neuropathy-inducing mutants MPZCre/+/Lpin1fE2-3/fE2-3, MPZCre/+/ScapfE1/fE1 and Pmp22-null mice. The majority of age-related transcripts were also affected in the analyzed mouse models of neuropathy (54.4%) and in development (59.5%), indicating a high degree of overlap in the implicated molecular pathways. We observed that, compared to peripheral nerve development, the dynamically changing expression profiles in aging have the opposite (anticorrelated) orientation, while they copy the orientation of transcriptional changes observed in the analyzed neuropathy models. Subsequent clustering and biological annotation of dynamically changing transcripts revealed that the processes most significantly deregulated in aging include the inflammatory/immune response and lipid biosynthesis/metabolism. Importantly, the changes in these pathways were also observed in the myelinated, oligodendrocyte-rich optic nerves of aged mice, albeit with lower magnitude. This observation suggests that similar biological processes are affected in aging glial cells of the central and peripheral nervous systems, although with different dynamics. Our data, which provide the first comprehensive comparison of molecular changes in glial cells across three distinct biological conditions comprising development, aging and disease, not only give new insight into the molecular alterations underlying aging of the nervous system but also identify target pathways for potential therapeutic approaches to prevent or delay complications associated with age-related and inherited forms of neuropathy.

Relevance:

30.00%

Publisher:

Abstract:

This thesis searches for a correlation between results obtained through software measurement and the defects found in the software. Existing software products are used as the test group. The work investigates whether, by using software metrics, it would have been possible to locate the problem areas of the software and thus obtain valuable information for software development. Measurement could be used for better allocation of resources in code reviews, code integration, system testing, and scheduling; with measurement, these tasks would have more information for targeting resources. Various software products are used as the test group. Common to all these products are their successive releases: when a new release is made, the previous release is used as a base on which new source code is developed. Software measurement must therefore be able to distinguish the source code of the previous release from the new source code. The software metrics used in this work are common and widely used in software engineering to measure various properties of source code that are assumed to affect defect-proneness. The purpose of this work is to study the usability of these software metrics in the software environments that serve as the test group. The practical part of the work succeeded in finding a correlation between some software metrics and defects, while other metrics did not give convincing results. Using software metrics, it appears possible to identify the defect-prone parts of a program and thus improve the efficiency of software development. The use of software metrics in product development is justifiable, and with their help it might be possible to influence the quality of the software in future releases.
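
As a rough illustration of the kind of analysis described above, the following sketch correlates per-module code metrics with defect counts using a rank correlation; the metrics chosen (cyclomatic complexity, LOC) and all numbers are invented for illustration and are not the thesis's data:

```python
# Correlate per-module source-code metrics with defect counts.
# Data is made up; rank correlation suits skewed metric distributions.
from scipy.stats import spearmanr

# One row per module: (cyclomatic complexity, lines of code, defects found)
modules = [
    (4, 120, 0), (12, 450, 3), (25, 900, 7),
    (7, 200, 1), (18, 640, 2), (30, 1100, 9),
]

complexity = [m[0] for m in modules]
loc        = [m[1] for m in modules]
defects    = [m[2] for m in modules]

for name, metric in [("cyclomatic complexity", complexity), ("LOC", loc)]:
    rho, p = spearmanr(metric, defects)
    print(f"{name}: rho={rho:.2f}, p={p:.3f}")
```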

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Developing and updating high-quality guidelines requires substantial time and resources. To reduce duplication of effort and enhance efficiency, we developed a process for guideline adaptation and assessed initial perceptions of its feasibility and usefulness. METHODS: Based on preliminary developments and empirical studies, a series of meetings with guideline experts was organised to define a process for guideline adaptation (ADAPTE) and to develop a manual and a toolkit made available on a website (http://www.adapte.org). Potential users, guideline developers and implementers, were invited to register and to complete a questionnaire evaluating their perception of the proposed process. RESULTS: The ADAPTE process consists of three phases (set-up, adaptation, finalisation), 9 modules and 24 steps. The adaptation phase involves identifying specific clinical questions; searching for, retrieving and assessing available guidelines; and preparing the draft adapted guideline. Among 330 registered individuals (46 countries), 144 completed the questionnaire. A majority found the ADAPTE process clear (78%), comprehensive (69%) and feasible (60%), and the manual useful (79%). However, 21% found the ADAPTE process complex, and 44% feared that they would not find appropriate, high-quality source guidelines. DISCUSSION: A comprehensive framework for guideline adaptation has been developed to meet the challenges of timely guideline development and implementation. The ADAPTE process generated considerable interest among guideline developers and implementers. The majority perceived the ADAPTE process to be feasible, useful and likely to improve methodological rigour and guideline quality. However, some de novo development may be needed if no high-quality guideline exists for a given topic.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To investigate the effects of neonatal hypoglycemia on physical growth and neurocognitive function. STUDY DESIGN: Systematic detection of hypoglycemia (<2.6 mmol/L or 47 mg/dL) was carried out in 85 small-for-gestational-age preterm neonates. Prospective serial evaluations of physical growth and psychomotor development were performed. Retrospectively, infants were grouped according to their glycemic status. RESULTS: The incidence of hypoglycemia was 72.9%. Infants with repeated episodes of hypoglycemia had significantly reduced head circumferences and lower scores in specific psychometric tests at 3.5 years of age. Hypoglycemia also caused reduced head circumferences at 18 months and lower psychometric scores at 5 years of age. Infants with moderate recurrent hypoglycemia had lower scores at 3.5 and 5 years of age than infants who had a single severe hypoglycemic episode. CONCLUSION: Recurrent episodes of hypoglycemia were strongly correlated with persistent neurodevelopmental and physical growth deficits up to 5 years of age. Recurrent hypoglycemia was also a better predictor of long-term effects than the severity of a single hypoglycemic episode. Therefore, repeated blood glucose monitoring and rapid treatment, even for mild hypoglycemia, are recommended for small-for-gestational-age infants in the neonatal period.

Relevance:

30.00%

Publisher:

Abstract:

Despite the increasing understanding of the relationships between institutions and entrepreneurship, the influence of the quality of government institutions on entrepreneurship is less often addressed. This paper focuses on this critical determinant of entrepreneurship in developing and developed countries. Drawing on institutional theory, we hypothesize and empirically assess the role of the quality of institutions in entrepreneurial activity. We examine how the quality of government institutions influences the rate of necessity-based entrepreneurial activity across countries and over time, using a cross-sectional time-series approach on data from the Global Entrepreneurship Monitor (GEM) database covering the years 2001–2011. Our results suggest that higher economic development associated with better-quality institutions reduces the prevalence of necessity-based entrepreneurship. Our findings imply that developing countries must organize their government functions rationally, seek to remove unnecessary barriers and controls that hamper entrepreneurial activity, and decrease political instability.

Relevance:

30.00%

Publisher:

Abstract:

We propose a new approach and related indicators for globally distributed software support and development, based on a 3-year process improvement project in a globally distributed engineering company. The company develops, delivers and supports a complex software system with tailored hardware components and unique end-customer installations. By applying domain knowledge from operations management on lead time reduction and its multiple benefits to process performance, the workflows of globally distributed software development and multitier support processes were measured and monitored throughout the company. The results show that global end-to-end process visibility and centrally managed reporting at all levels of the organization catalyzed a change process toward significantly better performance. With the new performance indicators based on lead times and their variation, together with fixed control procedures, the case company was able to report faster bug-fixing cycle times, improved response times and generally better customer satisfaction in its global operations. In all, lead times to implement new features and to respond to customer issues and requests were reduced by 50%.
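
As an illustration of lead-time-based indicators of the kind the paper describes, the sketch below computes the mean and variation of bug-fixing lead times and flags cases that exceed a fixed control limit; the data and the 3-sigma rule are illustrative assumptions, not the company's actual procedure:

```python
# Monitor lead times and their variation against a fixed control limit.
# Data and the 3-sigma limit are invented for illustration.
import statistics

bug_fix_lead_times_days = [12, 9, 15, 11, 30, 8, 14, 10, 13, 27]

mean = statistics.mean(bug_fix_lead_times_days)
std = statistics.stdev(bug_fix_lead_times_days)
upper_control_limit = mean + 3 * std

print(f"mean: {mean:.1f} d, std dev: {std:.1f} d, UCL: {upper_control_limit:.1f} d")
for i, lt in enumerate(bug_fix_lead_times_days, start=1):
    if lt > upper_control_limit:
        print(f"case {i}: {lt} d exceeds control limit -> investigate")
```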

Relevance:

30.00%

Publisher:

Abstract:

This paper reviews the literature on managerially actionable new product development (NPD) success factors and summarises the field in a classic managerial framework. Because of the varying quality, breadth and scope of the field, the review only covers post-1980 studies of tangible product development that meet a rigorous scientific standard. Success is interpreted as commercial success. The field has gained insight into a broad set of factors that vary in scope, abstraction and context. The main areas that contribute to NPD success are top management support, exhibited through resource allocation and communication of the strategic importance of NPD in the organisation. The right projects need to be selected for investment at the beginning of the process and should be aligned with the organisation's internal competencies and the external environment. The NPD process should use cross-functional teams and competent project champions. Marketing research competency is crucial, as an understanding of the market, customers and competitors is repeatedly highlighted. Product launch competency was also consistently shown to be important. In terms of controlling the NPD process, strict project gates are required to maintain control.

Relevance:

30.00%

Publisher:

Abstract:

An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this same test site, covering an area of about 1 km2.
In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun, combined with real-time control of navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm. Whereas the single 48-channel streamer system of Survey I required extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, they allow the determination of feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail-line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and subsequent binning errors. Aliasing observed in the 3-D migration was due to insufficient lateral sampling combined with the relatively high-frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber, bubble-canceling air gun for Survey II. A 15/15 Mini G.I air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m. While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array); otherwise, the resulting stacks differ for opposite shooting directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and complemented by two computer programs that convert the unconventional navigation data to industry standards. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stack and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e. on a total of 600 spectra.
According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. The delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows the application of such sophisticated techniques even to high-resolution seismic surveys. In general, the adaptation of the 3-D marine seismic reflection method, which to date has been used almost exclusively by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.
Seismic reflection is a method of investigating the subsurface with very high resolving power. It consists of sending vibrations into the ground and recording the waves that reflect off geological discontinuities at different depths and then return to the surface, where they are recorded. The signals collected in this way not only provide information on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. For example, in the case of sedimentary rocks, seismic reflection profiles make it possible to determine their mode of deposition, their possible deformations or fractures, and thus their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection was carried out along profiles that provide a two-dimensional image of the subsurface. The images obtained in this way are only partially accurate, since they do not take into account the three-dimensional nature of geological structures. In recent decades, three-dimensional (3-D) seismics has brought new life to the study of the subsurface. Although it is now well mastered for imaging large geological structures both onshore and offshore, its adaptation to the lacustrine or fluvial scale has so far been the subject of only rare studies. This thesis work consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes. It is therefore smaller, lighter to deploy and, above all, produces final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution of the order of ten metres, the instrument developed in this work can resolve details of the order of one metre. The new system is based on the ability to record seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and the receivers of the seismic waves) with great precision. Software was specially developed to control navigation and trigger the shots of the seismic source using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This makes it possible to position the instruments with an accuracy of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the 'La Paudèze' fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km2. The seismic recordings were then processed to transform them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, particularly as regards positioning. After processing, the data reveal several main seismic facies corresponding in particular to lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible on vertical and horizontal seismic sections. The excellent quality of the data and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, opening the way to its application in environmental and civil engineering.
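
As a small illustration of the binning step described above, the following sketch assigns a source-receiver midpoint to a 1.25 m × 3.75 m bin; the coordinates are assumed to be already expressed in the survey's in-line/cross-line frame, and the positions are invented:

```python
# Assign a CMP (source-receiver midpoint) to a 3-D bin of
# 1.25 m (in-line) x 3.75 m (cross-line). Positions are illustrative.
def bin_index(midpoint_x, midpoint_y, dx_inline=1.25, dy_crossline=3.75):
    """Return (in-line, cross-line) bin indices for a CMP coordinate in metres."""
    return int(midpoint_x // dx_inline), int(midpoint_y // dy_crossline)

def midpoint(src, rcv):
    """CMP position: halfway between source and receiver."""
    return ((src[0] + rcv[0]) / 2.0, (src[1] + rcv[1]) / 2.0)

src = (100.0, 52.0)   # shot position from onboard dGPS
rcv = (62.5, 55.0)    # receiver position from streamer-end dGPS
print(bin_index(*midpoint(src, rcv)))   # -> (65, 14)
```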

Relevance:

30.00%

Publisher:

Abstract:

Woven monofilament, multifilament, and spun yarn filter media have long been the standard media in liquid filtration equipment. While the energy for a solid-liquid separation process is determined by the engineering work, it is the interface between the slurry and the equipment - the filter medium - that greatly affects the performance characteristics of the unit operation. Those skilled in the art are well aware that a poorly designed filter medium may endanger the whole operation, whereas a well-performing filter medium can make the operation smooth and economical. As mineral and pulp producers seek to produce ever finer and more refined fractions of their products, it is becoming increasingly important to be able to dewater slurries with average particle sizes around 1 µm using conventional, high-capacity filtration equipment. Furthermore, the surface properties of the media must not allow sticky and adhesive particles to adhere to the media. The aim of this thesis was to test how the dirt-repellency, electrical resistance and high-pressure filtration performance of selected woven filter media can be improved by modifying the fabric or yarn with coating, chemical treatment and calendering. The results achieved by chemical surface treatments clearly show that the surface properties of woven media can be modified to achieve lower electrical resistance and improved dirt-repellency. The main challenge with the chemical treatments is abrasion resistance and, while the experimental results indicate that the treatment is sufficiently permanent to resist standard weathering conditions, it may still prove inadequately durable in actual use. From the pressure filtration studies in this work, it seems clear that conventional woven multifilament fabrics still perform surprisingly well against coated media in terms of filtrate clarity and cake build-up. Especially in cases where the feed slurry concentration was low and the pressures moderate, the conventional media seemed to outperform the coated media. In cases where the feed slurry concentration was high, the tightly woven media performed well against the monofilament reference fabrics, but seemed to do worse than some of the coated media. This result is somewhat surprising in that the high initial specific resistance of the coated media would suggest that such media blind more easily than plain woven media. The results indicate, however, that it is actually the woven media that gradually clog during the course of filtration. In conclusion, there appears to be a pressure limit above which the woven media lose their capacity to keep solid particles from penetrating the structure. This finding suggests that for extreme pressures the only foreseeable solution is coated fabrics supported by a woven fabric strong enough to hold the structure together. Having said that, the high-pressure filtration process seems to follow somewhat different laws than more conventional processes. Based on the results, it may well be that the role of the cloth is above all to support the cake, and the main performance-determining factor is a long lifetime. Measuring the pore size distribution with a commercially available porometer gives a fairly accurate picture of the pore size distribution of a fabric, but fails to give insight into which of the pore sizes is the most important in determining the flow through the fabric.
Historically, air (and sometimes water) permeability measures have been the standard for evaluating media filtration performance, including particle retention. Permeability, however, is a function of a multitude of variables and does not directly allow estimation of the effective pore size. In this study, a new method for estimating the effective pore size and open pore area in a densely woven multifilament fabric was developed. The method combines a simplified equation for the electrical resistance of the fabric with the Hagen-Poiseuille flow equation to estimate the effective pore size of a fabric and the total open area of its pores. The results are validated by comparison with the measured values of the largest pore size (bubble point) and the average pore size. The results show good correlation with measured values. However, the measured and estimated values tend to diverge in high-weft-density fabrics. This phenomenon is thought to result from the more tortuous flow path of denser fabrics, and could most probably be remedied by using a different value for the tortuosity factor.
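
To illustrate the flow-equation half of this estimate, the sketch below solves the Hagen-Poiseuille equation for an effective pore radius given a measured flow rate and pressure drop, treating the fabric as a bundle of parallel capillaries; the pore count (which the thesis's method derives from the electrical-resistance model) and all numbers are illustrative assumptions:

```python
# Effective pore radius from Hagen-Poiseuille flow through n parallel
# capillaries of length L (fabric thickness). All values are invented.
import math

def effective_pore_radius(Q, dP, n_pores, mu, L):
    """Hagen-Poiseuille: Q = n * pi * r^4 * dP / (8 * mu * L), solved for r."""
    return (8.0 * mu * L * Q / (n_pores * math.pi * dP)) ** 0.25

Q = 2.0e-6        # water flow through the sample, m^3/s
dP = 10_000.0     # pressure drop, Pa
n_pores = 5.0e6   # open pores in the sample (assumed, e.g. from resistance model)
mu = 1.0e-3       # water viscosity, Pa*s
L = 0.5e-3        # fabric thickness, m

r = effective_pore_radius(Q, dP, n_pores, mu, L)
print(f"effective pore radius ~ {r * 1e6:.1f} micrometres")
```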

Relevance:

30.00%

Publisher:

Abstract:

The fiber recovery process is an essential part of the modern paper mill. It creates the basis for the mill's internal recirculation of its most important raw materials, water and fiber. It is normally also the starting point for further treatment of wastewater and, if it works efficiently, it offers an excellent basis for minimizing effluents. This dissertation offers two different approaches to the subject. Firstly, a novel save-all disc filter feeding system is developed and presented. This so-called precoat method is tested both in the laboratory and at full scale. At laboratory scale it clearly outperforms the traditional method when low-freeness pulps are used as the sweetener stock. The full-scale application still needs some development work before it can be implemented in paper mills. Secondly, the operating environment of the save-all disc filter is studied, mostly under laboratory conditions. The focus of this study is on cases where low-freeness pulps are used as the sweetener stock of the save-all filter. The effects of CSF value, pressure drop, suspension consistency and retention chemicals on the quantity and quality of the filtrate were studied, as was the filtration resistance of the low-freeness pulps.
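
Filtration resistance of the kind studied here is commonly estimated from constant-pressure filtration data via the classical t/V-versus-V analysis, in which the slope of the line yields the specific cake resistance. The sketch below shows that standard calculation with invented numbers; it is one plausible reading of the measurement, not the dissertation's actual procedure:

```python
# Classical constant-pressure filtration analysis: t/V = slope*V + intercept,
# where slope = mu*alpha*c / (2*A^2*dP). All data below is invented.

# (time s, cumulative filtrate volume m^3) from a constant-pressure test
data = [(30, 0.8e-3), (90, 1.5e-3), (180, 2.2e-3), (300, 2.9e-3)]

A = 0.01       # filter area, m^2
dP = 1.0e5     # applied pressure difference, Pa
mu = 1.0e-3    # filtrate viscosity, Pa*s
c = 5.0        # deposited solids per unit filtrate volume, kg/m^3

# Least-squares slope of t/V versus V
xs = [V for _, V in data]
ys = [t / V for t, V in data]
n = len(data)
sx, sy = sum(xs), sum(ys)
slope = (n * sum(x * y for x, y in zip(xs, ys)) - sx * sy) / \
        (n * sum(x * x for x in xs) - sx * sx)

alpha = slope * 2.0 * A**2 * dP / (mu * c)   # specific cake resistance, m/kg
print(f"specific cake resistance ~ {alpha:.2e} m/kg")
```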

Relevance:

30.00%

Publisher:

Abstract:

The parameter setting of a differential evolution algorithm must meet several requirements: efficiency, effectiveness, and reliability. Problems vary, and the solution of a particular problem can be represented in different ways; an algorithm that is most efficient with one representation may be less efficient with others. The development of differential evolution-based methods contributes substantially to research on evolutionary computing and global optimization in general. The objective of this study is to investigate the differential evolution algorithm, the intelligent adjustment of its control parameters, and its applications. In the thesis, the differential evolution algorithm is first examined using different parameter settings and test functions. Fuzzy control is then employed to make the control parameters adaptive, based on the optimization process and expert knowledge. The developed algorithms are applied to training radial basis function networks for function approximation, with the possible variables including the centers, widths, and weights of the basis functions, and with the control parameters either kept fixed or adjusted by the fuzzy controller. After the influence of the control variables on the performance of the differential evolution algorithm was explored, an adaptive version of the algorithm was developed and differential evolution-based radial basis function network training approaches were proposed. Experimental results showed that the performance of the differential evolution algorithm is sensitive to parameter setting, and the best setting was found to be problem dependent. The fuzzy adaptive differential evolution algorithm relieves the user of the burden of parameter setting and performs better than versions using fixed parameters. Differential evolution-based approaches are effective for training Gaussian radial basis function networks.
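
For reference, the sketch below implements the classic DE/rand/1/bin scheme that such studies build on, with the two control parameters (scale factor F and crossover rate CR) kept fixed; a fuzzy-adaptive variant like the thesis's would adjust F and CR during the run. The objective function and all settings are illustrative:

```python
# Classic DE/rand/1/bin differential evolution with fixed F and CR.
import random

def sphere(x):
    # Stand-in test objective: minimum 0 at the origin.
    return sum(v * v for v in x)

def differential_evolution(f, dim=5, pop_size=30, F=0.5, CR=0.9,
                           bounds=(-5.0, 5.0), generations=200, seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fitness = [f(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation (rand/1): combine three distinct random individuals.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # Binomial crossover: mix mutant and target, forcing one mutant gene.
            j_rand = rng.randrange(dim)
            trial = [
                min(hi, max(lo, pop[a][j] + F * (pop[b][j] - pop[c][j])))
                if (rng.random() < CR or j == j_rand) else pop[i][j]
                for j in range(dim)
            ]
            # Greedy selection: the trial replaces the target if it is no worse.
            ft = f(trial)
            if ft <= fitness[i]:
                pop[i], fitness[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fitness[i])
    return pop[best], fitness[best]

best_x, best_f = differential_evolution(sphere)
print(f"best f = {best_f:.3e}")
```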

Relevance:

30.00%

Publisher:

Abstract:

In the era of fast product development and customized product requirements, the concept of the product platform has proven its power in practice. The product platform approach has enabled companies to increase the speed of product introductions while simultaneously benefiting from efficiency and effectiveness in development and production activities. Product platforms are technological bases that can be used to develop several derivative products, and hence differentiation can be pushed closer to product introduction. Product platform development has some specific features that differ somewhat from the development of single products. The time horizon is longer, since a product platform's life cycle is longer than an individual product's. The long time horizon also implies higher market risks, and the use of new technologies increases the technological risks involved. The end-customer interface may be far away, but there is no lack of needs aimed at product platforms; in fact, product platform development is very much a balancing act between the varying needs set by the derivative products. This dissertation concentrated on product platform development from the internal product lines' perspective in a single case. Altogether six product platform development factors were identified: 'Strategic and business fit of product platform', 'Project communication and deliverables', 'Cooperation with product platform development', 'Innovativeness of product platform architecture and features', 'Reliability and quality of product platform', and 'Promised schedules and final product platform meeting the needs'. Of the six factors, three were found to influence overall satisfaction quite strongly, namely 'Strategic and business fit of product platform', 'Reliability and quality of product platform', and 'Promised schedules and final product platform meeting the needs'. Hence, these three factors might be the ones a new product platform development unit should concentrate on first in order to satisfy its closest customers, the product lines. 'Project communication and deliverables' and 'Innovativeness of product platform architecture and features' were weaker contributors to overall satisfaction. Overall, the factors explained the product lines' satisfaction with product platform development quite well. In the course of the research, several interesting aspects of the very nature of product platform development were found. The long time horizon of product platform development caused challenges in the area of strategic fit: a conflict between short-term requirements and long-term needs. The fact that a product platform was used as the basis of several derivative products resulted in varying needs, and hence in varying matches between those needs and the strategies. The opinion that releases for the larger product lines were given higher priority makes an interesting contribution to the strategy literature on power and politics. The varying needs of the product lines, their relative strengths, and the large number of concurrent releases all placed demands on prioritization. Hence, the research showed the complicated nature of product platform development in the case unit: the very nature of product platform development may be its strength (gaining efficiency and effectiveness in product development and product launches) but also its biggest challenge (developing products to meet several needs).
As a single case study, the results of this research are not directly generalizable to all product platform development activities. Instead, the research serves best as a starting point for additional research, and gives some insights into the factors and challenges of one product development unit.