988 results for peaks-over-threshold


Relevance:

20.00%

Publisher:

Abstract:

Lactose, or milk sugar, is the most important constituent of the milk produced by most mammals. It is separated from whey, cheese and milk. Lactose is used as a raw material in many different products in the food and pharmaceutical industries; in the pharmaceutical industry, for example, it serves as a filler in tablets. By oxidizing lactose it is possible to produce lactobionic acid, 2-keto-lactobionic acid and lactulose. Lactobionic acid is used in the manufacture of biodegradable surfaces and cosmetic products, as well as in preservation solutions for internal organs, where it prevents tissue damage caused by oxygen radicals. In this work, lactose was oxidized to lactobionic acid in a stirred laboratory-scale batch reactor using palladium on activated carbon as the catalyst. In a few experiments, bismuth, which slows down catalyst deactivation, was used as a catalyst promoter. The aim of the work was to obtain more information on the kinetics of lactose oxidation. The selectivity of the oxidation of lactose to lactobionic acid was found to be affected by, among other things, the reaction temperature, pressure, pH and the amount of catalyst used. Recycling the catalyst between experiments gave better conversions, selectivities and yields. The best experimental results were obtained by oxidizing with synthetic air at a temperature of 60 °C and a pressure of 1 bar. In these experiments the pH was adjusted manually, so it did not remain at the desired value at all times. The conversion of lactose was at best 95%. The relative selectivity of lactobionic acid was 100% and the relative yield 100%. Mathematical modelling of the kinetics was carried out with the Modest software using the measurement data obtained from the experiments; the software was used to estimate parameters and to obtain a mathematical model of the reactor. Kinetic modelling was also performed for lactose oxidation experiments carried out in a shaker reactor, in which the pH was kept at the desired value at all times by means of in-situ titration. The work also investigated the possibility of using monolith catalysts in the lactose oxidation reaction.

Relevance:

20.00%

Publisher:

Abstract:

OBJECTIVES: This study aims to evaluate the prevalence of rheumatic diseases in the elderly and its evolution over time. METHODS: We present a systematic international literature review of the prevalence of rheumatic diseases in the elderly and its evolution over time. RESULTS: The estimated current prevalence of rheumatic diseases among people aged 65 and over varies between 41% and 53%, and is similar to estimated prevalence rates in studies performed before 1990 (35-55%). The prevalence is high and seems to increase rapidly with age. Furthermore, women suffer more frequently from rheumatic diseases than men. CONCLUSION: The selected studies used a large range of methods, making comparisons difficult. However, estimates of the prevalence of rheumatic diseases in the elderly appear to be homogeneous across countries and stable since 1980.

Relevance:

20.00%

Publisher:

Abstract:

Posaconazole (POS) is a new antifungal agent for prevention and therapy of mycoses in immunocompromised patients. Variable POS pharmacokinetics after oral dosing may influence efficacy: a trough threshold of 0.5 μg/ml has recently been proposed. Measurement of POS plasma concentrations by complex chromatographic techniques may thus contribute to optimizing prevention and management of life-threatening infections. No microbiological analytical method is available. The objective of this study was to develop and validate a new simplified ultra-performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) method and a sensitive bioassay for quantification of POS over the clinical plasma concentration range. The UPLC-MS/MS equipment consisted of a triple quadrupole mass spectrometer, an electrospray ionization (ESI) source, and a C18 analytical column. The Candida albicans POS-hypersusceptible mutant (MIC of 0.002 μg/ml) Δcdr1 Δcdr2 Δflu Δmdr1 Δcan, constructed by targeted deletion of multidrug efflux transporter and calcineurin genes, was used for the bioassay. POS was extracted from plasma by protein precipitation with acetonitrile-methanol (75%/25%, vol/vol). Reproducible standard curves were obtained over the ranges 0.014 to 12 μg/ml (UPLC-MS/MS) and 0.028 to 12 μg/ml (bioassay). Intra- and interrun accuracy levels were 106% ± 2% and 103% ± 4% for UPLC-MS/MS and 102% ± 8% and 104% ± 1% for the bioassay, respectively. The intra- and interrun coefficients of variation were 7% ± 4% and 7% ± 3% for UPLC-MS/MS and 5% ± 3% and 4% ± 2% for the bioassay, respectively. An excellent correlation between POS plasma concentrations measured by UPLC-MS/MS and bioassay was found (concordance, 0.96). In 26 hemato-oncological patients receiving oral POS, 27/69 (39%) trough plasma concentrations were lower than 0.5 μg/ml.
The UPLC-MS/MS method and sensitive bioassay offer alternative tools for accurate and precise quantification of the plasma concentrations in patients receiving oral posaconazole.
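The accuracy and coefficient-of-variation figures quoted in the abstract follow from the standard definitions used in bioanalytical method validation. A small sketch with hypothetical replicate values (not data from the study):

```python
from statistics import mean, stdev

def accuracy_and_cv(measured, nominal):
    """Accuracy (%) = mean measured / nominal concentration * 100;
    CV (%) = sample standard deviation / mean * 100."""
    m = mean(measured)
    return 100.0 * m / nominal, 100.0 * stdev(measured) / m
```

Intra-run figures would use replicates from one run; inter-run figures pool replicates across runs.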

Relevance:

20.00%

Publisher:

Abstract:

Temperature reconstructions for recent centuries are the basis of estimates of the natural variability in the climate system before and during the onset of anthropogenic perturbation. Here we present, for the first time, an independent and physically based reconstruction of mean annual temperature over the past half millennium obtained from groundwater in France. The reconstructed noble gas temperature (NGT) record suggests cooler-than-present climate conditions throughout the 16th-19th centuries. Periods of warming occur in the 17th-18th and 20th centuries, while cooling is reconstructed in the 19th century. Noticeable agreement with other temperature records is demonstrated. Deuterium excess varies in parallel with the NGT and indicates variation in the seasonality of the aquifer recharge, whereas high excess air in groundwater indicates periods with strong oscillations of the water table.

Relevance:

20.00%

Publisher:

Abstract:

Our main aim in this report is to use Next Generation SDH to solve the problems associated with new telecom services. We have analyzed the different services and identified some drawbacks that can be seen as hindrances to supporting these services. In this thesis we first give an overview of legacy SDH technology and of how Next Generation SDH came into effect, overcoming the drawbacks of legacy SDH. Our main concern throughout the report is how Next Generation SDH can be used to provide quality telecommunication services. In the section dealing with telecommunication services over Next Generation SDH, we consider how Ethernet services can be transported over Next Generation SDH and what the benefits are to the customer and the service provider of using Next Generation SDH as a carrier. We also examine how ATM services can be improved through Next Generation SDH. Finally, towards the end, some possible future work in this area is identified.

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND AND OBJECTIVE: The Lausanne Stroke Registry has included, since 1979, all patients admitted to the Department of Neurology of Lausanne University Hospital with the diagnosis of a first clinical stroke. Using the Lausanne Stroke Registry, we aimed to determine trends in risk factors, causes, localization and in-hospital mortality over 25 years in hospitalized stroke patients. METHODS: We assessed temporal trends in stroke patient characteristics across the following consecutive periods: 1979-1987, 1988-1995 and 1996-2003. Age-adjusted cardiovascular risk factors, etiologies, stroke localizations and mortality were compared between the three periods. RESULTS: Overall, 5,759 patients were included. Age was significantly different among the analyzed periods (p < 0.001), showing an increasing proportion of older patients over time. After adjustment for age, hypercholesterolemia increased (p < 0.001), as opposed to cigarette smoking (p < 0.001), hypertension (p < 0.001) and diabetes and hyperglycemia (p < 0.001). In patients with ischemic strokes, there were significant changes in the distribution of causes, with an increase in cardioembolic strokes (p < 0.001), and in the localization of strokes, with an increase in entire middle cerebral artery (MCA) and posterior circulation strokes together with a decrease in superficial MCA strokes (p < 0.001). In patients with hemorrhagic strokes, thalamic localizations increased, whereas the proportion of striatocapsular hemorrhages decreased (p = 0.022). Except in the older patient group, the mortality rate decreased. CONCLUSIONS: This study shows major trends in the characteristics of stroke patients admitted to a department of neurology over a 25-year time span, which may result from referral biases, the development of acute stroke management and possibly the evolution of cerebrovascular risk factors.

Relevance:

20.00%

Publisher:

Abstract:

The increase in total health care expenditures in France can be explained by three distinct factors: the purely demographic effect (namely, the increase in the proportion of elderly people, given that health expenditure is an increasing function of age); changes in morbidity at a given age; and changes in practices for a given age and morbidity level (e.g. technological progress). The aim of this paper is basically to disentangle, evaluate and interpret the respective effects of these three factors. [Extract from the introduction, p. 3]
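As a rough illustration of the decomposition logic, the demographic effect can be isolated with a shift-share calculation: change the age structure while holding age-specific spending at base-year values. The numbers and the simplified two-way split below are mine (the paper additionally separates morbidity from practice effects, which requires morbidity-specific data):

```python
def expenditure(shares, costs):
    """Per-capita expenditure from age-group population shares and
    age-specific per-capita spending."""
    return sum(s * c for s, c in zip(shares, costs))

def demographic_effect(shares0, shares1, costs0):
    """Pure demographic effect: new age structure, base-year age-specific
    costs. The remaining change bundles morbidity and practice effects."""
    return expenditure(shares1, costs0) - expenditure(shares0, costs0)
```

For example, shifting the elderly share from 50% to 75% with base-year costs of 100 and 400 per head raises per-capita spending by 75 through demography alone.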

Relevance:

20.00%

Publisher:

Abstract:

Accurate perception of the temporal order of sensory events is a prerequisite in numerous functions ranging from language comprehension to motor coordination. We investigated the spatio-temporal brain dynamics of auditory temporal order judgment (aTOJ) using electrical neuroimaging analyses of auditory evoked potentials (AEPs) recorded while participants completed a near-threshold task requiring spatial discrimination of left-right and right-left sound sequences. AEPs to sound pairs modulated topographically as a function of aTOJ accuracy over the 39-77 ms post-stimulus period, indicating the engagement of distinct configurations of brain networks during early auditory processing stages. Source estimations revealed that accurate and inaccurate performance were linked to activity within bilateral posterior sylvian regions (PSR). However, activity within left, but not right, PSR predicted behavioral performance, suggesting that left PSR activity during early encoding phases of pairs of auditory spatial stimuli appears critical for the perception of their order of occurrence. Correlation analyses of source estimations further revealed that activity between left and right PSR was significantly correlated in the inaccurate but not the accurate condition, indicating that aTOJ accuracy depends on the functional decoupling between homotopic PSR areas. These results support a model of temporal order processing wherein behaviorally relevant temporal information--i.e. a temporal 'stamp'--is extracted within the early stages of cortical processing within left PSR but critically modulated by inputs from right PSR. We discuss our results with regard to current models of temporal order processing, namely gating and latency mechanisms.

Relevance:

20.00%

Publisher:

Abstract:

Molecular docking software packages are among the important tools of modern drug development pipelines. The promising achievements of the last 10 years emphasize the need for further improvement, as reflected by several recent publications (Leach et al., J Med Chem 2006, 49, 5851; Warren et al., J Med Chem 2006, 49, 5912). Our initial approach, EADock, showed a good performance in reproducing the experimental binding modes for a set of 37 different ligand-protein complexes (Grosdidier et al., Proteins 2007, 67, 1010). This article presents recent improvements to the scoring and sampling aspects over the initial implementation, as well as a new seeding procedure based on the detection of cavities, opening the door to blind docking with EADock. These enhancements were validated on 260 complexes taken from the high-quality Ligand Protein Database [LPDB, (Roche et al., J Med Chem 2001, 44, 3592)]. Two issues were identified: first, the quality of the initial structures cannot be assumed, and a manual inspection and/or a search of the literature are likely to be required to achieve the best performance. Second, the description of interactions involving metal ions still has to be improved. Nonetheless, a remarkable success rate of 65% was achieved in a large-scale blind docking assay, when considering only the top-ranked binding mode and a success threshold of 2 Å RMSD to the crystal structure. When looking at the five top-ranked binding modes, the success rate increases to 76%. In a standard local docking assay, success rates of 75% and 83% were obtained, considering only the top-ranked binding mode or the five top-ranked binding modes, respectively.
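The 2 Å success criterion above refers to the root-mean-square deviation between predicted and crystallographic ligand coordinates. A minimal sketch of that metric (assuming identical atom ordering and pre-aligned poses; this is not EADock's internal code):

```python
from math import sqrt

def rmsd(coords_a, coords_b):
    """RMSD between two equally ordered 3-D coordinate sets,
    without superposition (poses assumed already in the same frame)."""
    n = len(coords_a)
    s = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
            for (ax, ay, az), (bx, by, bz) in zip(coords_a, coords_b))
    return sqrt(s / n)
```

A docked pose would then count as a success when `rmsd(pose, crystal) <= 2.0`.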

Relevance:

20.00%

Publisher:

Abstract:

Background: Intravenous thrombolysis with alteplase for ischemic stroke is fixed at a maximal dose of 90 mg for safety reasons. Little is known about the clinical outcomes of stroke patients weighing >100 kg, who may benefit less from thrombolysis due to this dose limitation. Methods: Prospective data on 1,479 consecutive stroke patients treated with intravenous alteplase in six Swiss stroke units were analyzed. Presenting characteristics and the frequency of favorable outcomes, defined as a modified Rankin scale (mRS) score of 0 or 1, good outcomes (mRS score 0-2), mortality and symptomatic intracranial hemorrhage (SICH) were compared between patients weighing >100 kg and those weighing ≤100 kg. Results: Compared to their counterparts (n = 1,384, mean body weight 73 kg), patients weighing >100 kg (n = 95, mean body weight 108 kg) were younger (61 vs. 67 years, p < 0.001), were more frequently male (83 vs. 60%, p < 0.001) and more frequently suffered from diabetes mellitus (30 vs. 13%, p < 0.001). Compared with patients weighing ≤100 kg, patients weighing >100 kg had similar rates of favorable outcomes (45 vs. 48%, p = 0.656), good outcomes (58 vs. 64%, p = 0.270), mortality (17 vs. 12%, p = 0.196) and SICH risk (1 vs. 5%, p = 0.182). After multivariable adjustment, body weight >100 kg was strongly associated with mortality (p = 0.007) and poor outcome (p = 0.007). Conclusion: Our data do not suggest a reduced likelihood of favorable outcomes in patients weighing >100 kg treated with the current dose regimen. The association of body weight >100 kg with mortality and poor outcome, however, demands further large-scale studies to replicate our findings and to explore the underlying mechanisms.

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE This prospective multicenter phase III study compared the efficacy and safety of a triple combination (bortezomib-thalidomide-dexamethasone [VTD]) versus a dual combination (thalidomide-dexamethasone [TD]) in patients with multiple myeloma (MM) progressing or relapsing after autologous stem-cell transplantation (ASCT). PATIENTS AND METHODS Overall, 269 patients were randomly assigned to receive bortezomib (1.3 mg/m² intravenous bolus) or no bortezomib for 1 year, in combination with thalidomide (200 mg per day orally) and dexamethasone (40 mg orally once a day on 4 days once every 3 weeks). Bortezomib was administered on days 1, 4, 8, and 11 with a 10-day rest period (day 12 to day 21) for eight cycles (6 months), and then on days 1, 8, 15, and 22 with a 20-day rest period (day 23 to day 42) for four cycles (6 months). RESULTS Median time to progression (primary end point) was significantly longer with VTD than TD (19.5 v 13.8 months; hazard ratio, 0.59; 95% CI, 0.44 to 0.80; P = .001), the complete response plus near-complete response rate was higher (45% v 25%; P = .001), and the median duration of response was longer (17.2 v 13.4 months; P = .03). The 24-month survival rate was in favor of VTD (71% v 65%; P = .093). Grade 3 peripheral neuropathy was more frequent with VTD (29% v 12%; P = .001) as were the rates of grades 3 and 4 infection and thrombocytopenia. CONCLUSION VTD was more effective than TD in the treatment of patients with MM with progressive or relapsing disease post-ASCT but was associated with a higher incidence of grade 3 neurotoxicity.

Relevance:

20.00%

Publisher:

Abstract:

Left rostral dorsal premotor cortex (rPMd) and supramarginal gyrus (SMG) have been implicated in the dynamic control of actions. In 12 right-handed healthy individuals, we applied 30 min of low-frequency (1 Hz) repetitive transcranial magnetic stimulation (rTMS) over left rPMd to investigate the involvement of left rPMd and SMG in the rapid adjustment of actions guided by visuospatial cues. After rTMS, subjects underwent functional magnetic resonance imaging while making spatially congruent button presses with the right or left index finger in response to a left- or right-sided target. Subjects were asked to covertly prepare motor responses as indicated by a directional cue presented 1 s before the target. On 20% of trials, the cue was invalid, requiring subjects to readjust their motor plan according to the target location. Compared with sham rTMS, real rTMS increased the number of correct responses in invalidly cued trials. After real rTMS, task-related activity of the stimulated left rPMd showed increased task-related coupling with activity in ipsilateral SMG and the adjacent anterior intraparietal area (AIP). Individuals who showed a stronger increase in left-hemispheric premotor-parietal connectivity also made fewer errors on invalidly cued trials after rTMS. The results suggest that rTMS over left rPMd improved the ability to dynamically adjust visuospatial response mapping by strengthening left-hemispheric connectivity between rPMd and the SMG-AIP region. These results support the notion that left rPMd and SMG-AIP contribute toward dynamic control of actions and demonstrate that low-frequency rTMS can enhance functional coupling between task-relevant brain regions and improve some aspects of motor performance.

Relevance:

20.00%

Publisher:

Abstract:

An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this same test site, covering an area of about 1 km². In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun, in combination with real-time control of navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm.
Whereas the single 48-channel streamer system of Survey I required extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, they allow the determination of feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and subsequent binning errors. Aliasing observed in the 3-D migration was due to insufficient lateral sampling combined with the relatively high-frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber, bubble-canceling air gun for Survey II. A 15/15 Mini G.I. air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m.
While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array); otherwise, the resulting stacks are different for the opposite directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and complemented by two computer programs that convert the unconventional navigation data to industry standards. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stack and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e. on a total of 600 spectra. According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. Delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows application of such sophisticated techniques even to high-resolution seismic surveys.
In general, the adaptation of the 3-D marine seismic reflection method, which to date has almost exclusively been used by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.

Seismic reflection is a method for investigating the subsurface with very high resolving power. It consists of sending vibrations into the ground and recording the waves that are reflected by geological discontinuities at different depths and then travel back up to the surface, where they are recorded. The signals collected in this way provide information not only on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. For example, in the case of sedimentary rocks, seismic reflection profiles make it possible to determine their mode of deposition, their possible deformations or fractures, and hence their tectonic history. Seismic reflection is the principal method of oil exploration. For a long time, seismic reflection data were acquired along profiles that provide a two-dimensional image of the subsurface. The images obtained in this way are only partially accurate, since they do not take into account the three-dimensional nature of geological structures. Over the past few decades, three-dimensional (3-D) seismics has brought new life to the study of the subsurface. While it is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to the lacustrine or fluvial scale has so far been the subject of only a few studies. This thesis consisted in developing a seismic acquisition system similar to that used for offshore oil prospecting, but adapted to lakes.
It is therefore smaller, lighter to deploy and, above all, yields final images of much higher resolution. Whereas the oil industry often settles for a resolution on the order of ten metres, the instrument developed in this work makes it possible to see details on the order of one metre. The new system is based on the ability to record seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and the receivers of the seismic waves) with great precision. Software was specially developed to control navigation and trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This makes it possible to position the instruments with a precision on the order of 20 cm. To test our system, we chose an area on Lac Léman, near the city of Lausanne, crossed by the "La Paudèze" fault, which separates the Molasse du Plateau and Molasse Subalpine units. Two 3-D seismic surveys were carried out there over an area of about 1 km². The seismic records were then processed to turn them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, in particular with regard to positioning. After processing, the data reveal several main seismic facies, corresponding in particular to the lacustrine sediments (Holocene), the glacio-lacustrine sediments (Pleistocene), the Molasse du Plateau, the Molasse Subalpine of the fault zone and the Molasse Subalpine south of this zone. The detailed 3-D geometry of the faults is visible on the vertical and horizontal seismic sections.
The excellent quality of the data and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, which opens the way to its application in the environmental and civil-engineering domains.
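One routine step in the 3-D positioning-aware processing described above is assigning each recorded trace to a common-midpoint (CMP) bin computed from the dGPS positions of the source and the receiver. The sketch below is a hedged illustration of that idea only; the bin size, coordinates and function name are assumptions for illustration and are not taken from the thesis.

```python
# Hypothetical sketch: CMP binning from source/receiver positions.
# Coordinates are in metres in a local planar grid; with metre-scale
# target resolution, bins a few metres wide would be typical.

def cmp_bin(source_xy, receiver_xy, origin_xy=(0.0, 0.0), bin_size=3.0):
    """Return the (ix, iy) index of the bin containing the source-receiver midpoint."""
    mx = (source_xy[0] + receiver_xy[0]) / 2.0   # midpoint easting
    my = (source_xy[1] + receiver_xy[1]) / 2.0   # midpoint northing
    return (int((mx - origin_xy[0]) // bin_size),
            int((my - origin_xy[1]) // bin_size))

# A shot at (100, 200) recorded at (130, 212) has its midpoint at (115, 206),
# which falls in bin (38, 68) for 3 m bins anchored at the origin.
print(cmp_bin((100.0, 200.0), (130.0, 212.0)))  # (38, 68)
```

Traces sharing a bin are later stacked, which is why the roughly 20 cm dGPS positioning accuracy mentioned above matters: positioning errors comparable to the bin size would smear reflections across neighbouring bins.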


Adaptation of Kumar's algorithm for solving systems of equations with Toeplitz matrices over the reals to finite fields, in O(n log n) time.
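To make the objects in this abstract concrete, the sketch below builds a Toeplitz matrix (constant along each diagonal) over a finite field GF(p) and solves a linear system in it. This is plain O(n³) Gaussian elimination modulo a prime, not Kumar's O(n log n) superfast solver; it only illustrates what "a Toeplitz system over a finite field" means, and all names in it are my own.

```python
# Illustration only: Toeplitz system over GF(p) solved by naive
# Gaussian elimination mod p (NOT the O(n log n) algorithm of the abstract).

def toeplitz(first_col, first_row):
    """Build an n x n Toeplitz matrix from its first column and first row."""
    n = len(first_col)
    return [[first_col[i - j] if i >= j else first_row[j - i]
             for j in range(n)] for i in range(n)]

def solve_mod_p(A, b, p):
    """Solve A x = b over GF(p) by Gauss-Jordan elimination with row pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]      # augmented matrix
    for col in range(n):
        pivot = next(r for r in range(col, n) if M[r][col] % p != 0)
        M[col], M[pivot] = M[pivot], M[col]
        inv = pow(M[col][col], p - 2, p)              # inverse by Fermat's little theorem
        M[col] = [v * inv % p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col] % p:
                f = M[r][col]
                M[r] = [(v - f * w) % p for v, w in zip(M[r], M[col])]
    return [row[n] for row in M]

p = 7
A = toeplitz([1, 2, 3], [1, 4, 5])                    # each diagonal is constant
x = [2, 0, 5]
b = [sum(a * xi for a, xi in zip(row, x)) % p for row in A]
assert solve_mod_p(A, b, p) == x
```

The point of a superfast solver such as the one the abstract adapts is that a Toeplitz matrix is determined by only 2n - 1 entries, so the system can be solved far below the cubic cost shown here.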


1. Introduction
"The one that has compiled ... a database, the collection, securing the validity or presentation of which has required an essential investment, has the sole right to control the content over the whole work or over either a qualitatively or quantitatively substantial part of the work both by means of reproduction and by making them available to the public" (Finnish Copyright Act, section 49). These are the laconic words that implemented the much-awaited and hotly debated European Community Directive on the legal protection of databases, the EDD, into Finnish copyright legislation in 1998. Now, in 2005, after more than half a decade of domestic implementation, the proper meaning and construction of the convoluted qualitative criteria that the current legislation employs as a prerequisite for database protection remain uncertain, both in Finland and within the European Union. Further, this opaque pan-European instrument has the potential of bringing about a number of far-reaching economic and cultural ramifications, which have remained largely uncharted or unobserved. Thus the task of understanding this particular, and currently peculiarly European, new intellectual property regime is twofold: first, to understand the mechanics and functioning of the EDD, and second, to realise the potential and risks inherent in the new legislation in its economic, cultural and societal dimensions.

2. Subject-matter of the study: basic issues
The first part of the task mentioned above is straightforward: questions such as what is meant by the key concepts triggering the functioning of the EDD, such as the presentation of independent information, what constitutes an essential investment in acquiring data, and when the reproduction of a given database reaches, either qualitatively or quantitatively, the threshold of substantiality before the right-holder of a database can avail himself of the remedies provided by the statutory framework, remain unclear and call for careful analysis. As for the second task, it is already obvious that the practical importance of the legal protection provided by the database right is increasing rapidly. The accelerating transformation of information into digital form is an existing fact, not merely a reflection of the shape of things to come. To take a simple example, the digitisation of a map, traditionally in paper format and protected by copyright, can give the consumer markedly easier and faster access to the wanted material, and the price can be, depending on the current state of the marketplace, cheaper than that of the traditional form, or even free where public lending libraries provide access to the information online. This also makes it possible for authors and publishers to make available and sell their products to markedly larger, international markets, while production and distribution costs can be kept to a minimum thanks to the new electronic production, marketing and distribution mechanisms, to mention a few. The troublesome side for authors and publishers is the vastly enhanced potential for illegal copying by electronic means, producing numerous virtually identical copies at speed.
The fear of illegal copying can lead to stark technical protection that in turn can dampen the demand for information goods and services and, furthermore, efficiently hamper the right of access to materials lawfully available in electronic form, and thus weaken access to information, education and the cultural heritage of a nation or nations, a condition precedent for a functioning democracy.

3. Particular issues in the digital economy and information networks
All that is said above applies a fortiori to databases. As a result of the ubiquity of the Internet and the pending breakthrough of the mobile Internet, peer-to-peer networks, and local and wide area networks, a rapidly increasing amount of information not protected by traditional copyright, such as various lists, catalogues and tables, previously protected in part by the old section 49 of the Finnish Copyright Act, is available free or for consideration on the Internet; by the same token, and importantly, numerous databases are compiled in order to enable the marketing, tendering and selling of products and services in the above-mentioned networks. Databases and the information embedded in them constitute a pivotal element in virtually any commercial operation, including product and service development, scientific research and education. A poignant but not immediately obvious example is a database consisting of the physical coordinates of a selected group of customers, for marketing purposes, through cellular phones, laptops and various handheld or vehicle-based devices connected online. These practical needs call for answers to the plethora of questions already outlined above: has the collection and securing of the validity of this information required an essential investment? What qualifies as a quantitatively or qualitatively significant investment?
According to the Directive, a database comprises works, information and other independent materials which are arranged in a systematic or methodical way and are individually accessible by electronic or other means. Under what circumstances, then, are materials regarded as arranged in a systematic or methodical way? Only once the protected elements of a database are established does the question concerning the scope of protection become acute. In the digital context, the traditional notions of reproduction and of making digital materials available to the public seem to fit ill, or lead to interpretations that are at variance with the analogue domain as regards lawful and unlawful uses of information. This may well interfere with, or rework, the way in which commercial and other operators have to establish themselves and function in the existing value networks of information products and services.

4. International sphere
After the expiry of the implementation period for the European Community Directive on the legal protection of databases, the goals of the Directive must have been consolidated into the domestic legislation of the current twenty-five Member States of the European Union. On the one hand, these fundamental questions readily imply that the problems related to the correct construction of the Directive underlying the domestic legislation transcend national boundaries. On the other hand, disputes arising on account of the implementation and interpretation of the Directive at the European level attract significance domestically. Consequently, guidelines on the correct interpretation of the Directive importing practical, business-oriented solutions may well have application at the European level. This underlines the exigency of a thorough analysis of the implications of the meaning and potential scope of database protection in Finland and the European Union.
This position has to be contrasted with the larger, international sphere, which in early 2005 differs markedly from the European Union stance, directly having a negative effect on international trade, particularly in digital content. A particular case in point is the USA, a database producer primus inter pares, which does not, at least not yet, have a sui generis database regime or its kin, while both the political and the academic discourse on the matter abounds.

5. The objectives of the study
The background outlined above, with its several open issues, calls for a detailed study of the following questions:
- What is a database at law, and when is a database protected by intellectual property rights, particularly by the European database regime? What is the international situation?
- How is a database protected, and what is its relation to other intellectual property regimes, particularly in the digital context?
- What opportunities and threats does the current protection present to creators, users and society as a whole, including its commercial and cultural implications?
- The difficult question of the relation between database protection and the protection of factual information as such.

6. Disposition
The study, in purporting to analyse and cast light on the questions above, is divided into three main parts. The first part introduces the political and rational background and the subsequent legislative evolution of European database protection, reflected against the international backdrop on the issue; an introduction to databases, originally a vehicle of modern computing and information and communication technology, is also incorporated. The second part sets out the chosen and existing two-tier model of database protection, reviewing both its copyright and sui generis facets in detail, together with the emergent application of the machinery in a real-life societal and, particularly, commercial context.
Furthermore, a general outline of copyright, relevant in the context of copyright databases, is provided. For purposes of further comparison, a chapter on the precursor of the sui generis database right, the Nordic catalogue rule, also ensues. The third and final part analyses the positive and negative impact of the database protection system and attempts to scrutinise the implications further into the future, with some caveats and tentative recommendations, in particular as regards the convoluted issue of the IPR protection of information per se, a new tenet in the domain of copyright and related rights.