980 results for single case study


Relevance:

100.00%

Publisher:

Abstract:

After a rockfall event, a typical post-event survey includes qualitative volume estimation, trajectory mapping and determination of the departure zones; quantitative measurements, however, are not usually made. Additional quantitative information could help determine the spatial occurrence of rockfall events and quantify their size. Seismic measurements are suitable for detection purposes since they are non-invasive and relatively inexpensive, and seismic techniques can also provide important information on rockfall size and the location of impacts. On 14 February 2007 the Avalanche Group of the University of Barcelona recorded the seismic data generated by an artificially triggered rockfall at the Montserrat massif (near Barcelona, Spain), carried out in order to purge a slope. Two three-component seismic stations were deployed about 200 m from the explosion point that triggered the rockfall, and seismic signals and video images were obtained simultaneously. The initial volume of the rockfall was estimated at 75 m³ from laser scanner data analysis. After the explosion, dozens of boulders ranging from 10⁻⁴ to 5 m³ in volume impacted on the ground at different locations. The blocks fell onto a terrace 120 m below the release zone, and the impact generated a small continuous mass movement composed of a mixture of rocks, sand and dust that ran down the slope and reached the road 60 m below. Time-domain, time-frequency and particle motion analyses of the seismic records were performed, and the seismic energy was estimated. The results are as follows: (1) a rockfall event generates seismic signals with specific characteristics in the time domain; (2) the seismic signals generated by the mass movement show a time-frequency evolution different from that of other seismogenic sources (e.g. earthquakes, explosions or a single rock impact), a feature that could be used for detection purposes; (3) particle motion analysis shows that locating the rock impacts using two stations is feasible; (4) the feasibility and validity of seismic methods for the detection, localization and size determination of rockfall events are confirmed.
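A minimal Python sketch of the kind of time-frequency and energy analysis mentioned above; the trace, sampling rate and all names are illustrative placeholders rather than the study's actual data or code.

```python
# Minimal sketch: time-frequency view of a seismic trace, of the kind used to
# separate rockfall signatures from other sources. Assumes a 1-D NumPy array
# `trace` sampled at `fs` Hz; all names and values are illustrative.
import numpy as np
from scipy.signal import spectrogram

fs = 250.0                           # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)         # 60 s synthetic record
trace = np.random.randn(t.size)      # placeholder for a real vertical component

f, seg_t, Sxx = spectrogram(trace, fs=fs, nperseg=256, noverlap=192)

# Crude energy proxy: sum of squared amplitudes times the sample interval,
# proportional to radiated seismic energy up to site and propagation factors.
energy_proxy = np.sum(trace ** 2) / fs
dominant_freq = f[Sxx.argmax(axis=0)]   # dominant frequency in each time window
print(f"first dominant frequencies: {np.round(dominant_freq[:5], 1)} Hz")
print(f"energy proxy: {energy_proxy:.2f} (arbitrary units)")
```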

Relevance:

100.00%

Publisher:

Abstract:

The present study explores the statistical properties of a randomization test based on the random assignment of the intervention point in a two-phase (AB) single-case design. The focus is on randomization distributions constructed from the values of the test statistic for all possible random assignments and used to obtain p-values. The shape of those distributions is investigated for each specific data division defined by the moment at which the intervention is introduced. A second aim of the study was to test the detection of nonexistent effects (i.e., the production of false alarms) in autocorrelated data series, in which the assumption of exchangeability between observations may be untenable. In this way, nominal and empirical Type I error rates could be compared in order to obtain evidence on the statistical validity of the randomization test for each individual data division. The results suggest that when either of the two phases has considerably fewer measurement times, Type I errors may be too probable and, hence, the decision-making process carried out by applied researchers may be jeopardized.
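A minimal sketch, in Python, of the randomization test logic described above; the series, the observed intervention point and the minimum phase length are invented for illustration.

```python
# Minimal sketch of the randomization test for a two-phase (AB) design:
# the intervention point is treated as randomly assigned, the test statistic
# (here the B-minus-A mean difference) is recomputed for every admissible
# split, and the p-value is the proportion of splits at least as extreme as
# the observed one. Data and the minimum phase length are illustrative.
import numpy as np

y = np.array([3, 4, 3, 5, 4, 6, 7, 8, 7, 9, 8, 9], dtype=float)
observed_split = 6          # intervention actually introduced before observation 7
min_phase = 3               # smallest phase length allowed by the design

def mean_diff(series, split):
    return series[split:].mean() - series[:split].mean()

splits = range(min_phase, len(y) - min_phase + 1)
distribution = np.array([mean_diff(y, s) for s in splits])
observed = mean_diff(y, observed_split)

p_value = np.mean(distribution >= observed)   # one-tailed: an increase is expected
print(f"observed difference = {observed:.2f}, p = {p_value:.3f}")
```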

Relevance:

100.00%

Publisher:

Abstract:

Effect size indices are indispensable for carrying out meta-analyses and can also be seen as an alternative for making decisions about the effectiveness of a treatment in an individual applied study. The desirable features of procedures for quantifying the magnitude of an intervention effect include educational/clinical meaningfulness, ease of calculation, insensitivity to autocorrelation, and low false alarm and miss rates. Three effect size indices related to visual analysis are compared according to these criteria. The comparison is made by means of data sets with known parameters: degree of serial dependence, presence or absence of general trend, and changes in level and/or slope. The percentage of nonoverlapping data showed the highest discrimination between data sets with and without an intervention effect. When autocorrelation or trend is present, the percentage of data points exceeding the median may be a better option for quantifying the effectiveness of a psychological treatment.
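A minimal sketch of two of the overlap-based indices discussed above, the percentage of nonoverlapping data (PND) and the percentage of data points exceeding the median (PEM); the data are invented and a therapeutic increase is assumed.

```python
# Minimal sketch of two overlap-based effect size indices:
# PND (percentage of treatment points exceeding the best baseline point) and
# PEM (percentage of treatment points exceeding the baseline median).
import numpy as np

baseline = np.array([2.0, 3.0, 2.5, 3.5, 3.0])
treatment = np.array([4.0, 5.0, 3.0, 6.0, 5.5, 6.5])

pnd = 100 * np.mean(treatment > baseline.max())       # sensitive to a single extreme baseline point
pem = 100 * np.mean(treatment > np.median(baseline))  # more robust to outliers and trend

print(f"PND = {pnd:.1f}%  PEM = {pem:.1f}%")
```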

Relevance:

100.00%

Publisher:

Abstract:

Visual inspection remains the most frequently applied method for detecting treatment effects in single-case designs. The advantages and limitations of visual inference are discussed here in relation to other procedures for assessing intervention effectiveness. The first part of the paper reviews previous research on visual analysis, paying special attention to the validation of visual analysts' decisions, inter-judge agreement, and false alarm and omission rates. The most relevant factors affecting visual inspection (i.e., effect size, autocorrelation, data variability, and analysts' expertise) are highlighted and incorporated into an empirical simulation study with the aim of providing further evidence about the reliability of visual analysis. Our results concur with previous studies that have reported a relationship between serial dependence and increased Type I error rates. Participants with greater experience appeared to be more conservative and used more consistent criteria when assessing graphed data. Nonetheless, the decisions made by both professionals and students did not sufficiently match the simulated data features, and we also found low intra-judge agreement, suggesting that visual inspection should be complemented by other methods when assessing treatment effectiveness.
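A minimal sketch of how AB series with known features (serial dependence, level change) can be simulated for a study of this kind; all parameter values are illustrative assumptions, not the study's actual simulation settings.

```python
# Minimal sketch of simulating graphed AB data with known features:
# an AR(1) error process plus an optional level change at the intervention
# point. Parameter values are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def simulate_ab(n_a=10, n_b=10, phi=0.3, level_change=1.0, sd=1.0):
    """Return an AB series with lag-1 autocorrelation phi and a level shift."""
    n = n_a + n_b
    e = np.empty(n)
    e[0] = rng.normal(scale=sd)
    for t in range(1, n):                      # AR(1): e_t = phi * e_{t-1} + noise
        e[t] = phi * e[t - 1] + rng.normal(scale=sd)
    effect = np.r_[np.zeros(n_a), np.full(n_b, level_change)]
    return e + effect

series = simulate_ab()
print(np.round(series, 2))
```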

Relevance:

100.00%

Publisher:

Abstract:

If single-case experimental designs are to be used to establish guidelines for evidence-based interventions in clinical and educational settings, numerical values that reflect treatment effect sizes are required. The present study compares four recently developed procedures for quantifying the magnitude of intervention effect using data with known characteristics. Monte Carlo methods were used to generate AB design data with potential confounding variables (serial dependence, linear and curvilinear trend, and heteroscedasticity between phases) and two types of treatment effect (level and slope change). The results suggest that data features are important for choosing the appropriate procedure and, thus, inspecting the graphed data visually is a necessary initial stage. In the presence of serial dependence or a change in data variability, the Nonoverlap of All Pairs (NAP) and the Slope and Level Change (SLC) were the only techniques of the four examined that performed adequately. Introducing a data correction step into NAP renders it unaffected by linear trend, as is also the case for the Percentage of Nonoverlapping Corrected Data and SLC. The performance of these techniques indicates that professionals' judgments concerning treatment effectiveness can be readily complemented by both visual and statistical analyses. A flowchart to guide the selection of techniques according to the data characteristics identified by visual inspection is provided.
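A minimal sketch of the Nonoverlap of All Pairs (NAP) index named above, together with a simple baseline-trend removal step that stands in, as an assumption, for the data correction the abstract refers to.

```python
# Minimal sketch of NAP: every baseline-treatment pair is compared, ties count
# as half an overlap, and NAP is the proportion of pairs showing improvement.
# The detrending step below (baseline trend extrapolated and subtracted) is an
# illustrative assumption, not the study's exact correction procedure.
import numpy as np

def nap(baseline, treatment):
    pairs = [(a, b) for a in baseline for b in treatment]
    wins = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return wins / len(pairs)

def detrend_by_baseline(baseline, treatment):
    """Remove the baseline linear trend, extrapolated into the treatment phase."""
    x_a = np.arange(len(baseline))
    slope, intercept = np.polyfit(x_a, baseline, 1)
    x_b = np.arange(len(baseline), len(baseline) + len(treatment))
    return baseline - (slope * x_a + intercept), treatment - (slope * x_b + intercept)

a = np.array([2.0, 2.5, 3.0, 3.5])          # baseline with an upward trend
b = np.array([4.0, 4.5, 5.0, 5.5, 6.0])     # treatment phase continuing that trend
a_c, b_c = detrend_by_baseline(a, b)
print(f"NAP raw = {nap(a, b):.2f}, NAP on detrended data = {nap(a_c, b_c):.2f}")
```

On these invented data the raw NAP of 1.00 drops to 0.50 after the baseline trend is removed, illustrating why trend control matters for overlap indices.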

Relevance:

100.00%

Publisher:

Abstract:

The present study focuses on single-case data analysis, specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique that provides similar information. The comparison is carried out in the context of generated data representing a variety of patterns (i.e., independent measurements, different serial dependence underlying processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that the two techniques perform adequately for a wide range of conditions and that researchers can use both of them with some confidence. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to applied researchers to help them choose among the plurality of single-case data analysis techniques.
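A minimal sketch of a regression-based quantification of level and slope change in an AB series; ordinary least squares is used here for brevity, whereas the study's regression procedure is based on generalized least squares, so this illustrates the design matrix rather than the exact method. Data are invented.

```python
# Minimal sketch: the design matrix codes time, phase and a phase-by-time
# interaction, so the last two coefficients estimate level and slope change.
# OLS is used for simplicity; a GLS variant would also model the serial
# dependence of the errors.
import numpy as np

y = np.array([3, 3.5, 4, 4.2, 4.1, 6, 6.5, 7.2, 7.8, 8.4])   # AB series (illustrative)
n_a = 5                                                       # baseline length
n = len(y)
time = np.arange(n)
phase = (time >= n_a).astype(float)
time_in_b = np.where(phase == 1, time - n_a, 0.0)

X = np.column_stack([np.ones(n), time, phase, time_in_b])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, baseline_slope, level_change, slope_change = coef
print(f"level change = {level_change:.2f}, slope change = {slope_change:.2f}")
```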

Relevance:

100.00%

Publisher:

Abstract:

Increasing anthropogenic pressures urge enhanced knowledge and understanding of the current state of marine biodiversity. This baseline information is pivotal for exploring present trends, detecting future modifications and proposing adequate management actions for marine ecosystems. Coralligenous outcrops are a highly diverse and structurally complex deep-water habitat faced with major threats in the Mediterranean Sea. Despite its ecological, aesthetic and economic value, coralligenous biodiversity patterns are still poorly understood. There is currently no single sampling method that has been demonstrated to be sufficiently representative to ensure adequate community assessment and monitoring in this habitat. We therefore propose a rapid, non-destructive protocol for biodiversity assessment and monitoring of coralligenous outcrops that provides good estimates of their structure and species composition, based on photographic sampling and the determination of presence/absence of macrobenthic species. We used an extensive photographic survey covering several spatial scales (100s of m to 100s of km) within the NW Mediterranean and including two different coralligenous assemblages: the Paramuricea clavata assemblage (PCA) and the Corallium rubrum assemblage (CRA). This approach allowed us to determine the minimal sampling area for each assemblage (5000 cm² for PCA and 2500 cm² for CRA). In addition, we conclude that three replicates provide an optimal sampling effort to maximize the number of species recorded and to assess the main biodiversity patterns of the studied assemblages in variability studies requiring replicates. We contend that the proposed sampling approach provides a valuable tool for management and conservation planning, monitoring and research programs focused on coralligenous outcrops, and is potentially applicable in other benthic ecosystems as well.
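A minimal sketch of the species-accumulation reasoning that underlies choosing a minimal sampling area and number of replicates from presence/absence photo quadrats; the data matrix is randomly generated and only illustrates the procedure, not the study's protocol.

```python
# Minimal sketch: cumulative species richness over photographic quadrats
# (presence/absence matrix), averaged over random orderings; the sampling
# effort is judged by where the curve levels off. Data are placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_quadrats, n_species = 12, 40
# presence/absence matrix: rows = quadrats, columns = macrobenthic species
pa = rng.random((n_quadrats, n_species)) < 0.25

def accumulation_curve(matrix, n_orders=200):
    """Mean cumulative species richness over random orderings of quadrats."""
    curves = []
    for _ in range(n_orders):
        order = rng.permutation(matrix.shape[0])
        seen = np.cumsum(matrix[order], axis=0) > 0
        curves.append(seen.sum(axis=1))
    return np.mean(curves, axis=0)

curve = accumulation_curve(pa)
print(np.round(curve, 1))   # look for the quadrat count where the curve plateaus
```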

Relevance:

100.00%

Publisher:

Abstract:

Aquatic nymphs of mayflies (Ephemeroptera) colonize all types of freshwaters throughout the world and are extensively used as bio-indicators of water quality. Rhithrogena (Heptageniidae) is the second most species-rich genus of mayflies, and several European species have restricted distributions in sensitive Alpine environments and are therefore of conservation interest. The European Rhithrogena species are arranged into easily identifiable "species groups". However, despite their ecological and conservation importance, ambiguous morphological differences among many species suggest that the current taxonomy may not accurately reflect their evolutionary diversity. Moreover, no information is available about their relationships, origin, timing of speciation or the mechanisms promoting their successful diversification in the Alps. We first examined the species status of ca. 50% of European Rhithrogena diversity using a widespread sampling scheme of Alpine populations that included 22 type localities, applying a general mixed Yule-coalescent (GMYC) model analysis to one standard mitochondrial marker (cox1) and one newly developed nuclear marker. We observed significant clustering of cox1 sequences into 31 GMYC species, and our results strongly suggest the presence of both cryptic diversity and taxonomic oversplitting in Rhithrogena. Phylogenetic analyses recovered four of the six recognized species groups in our samples as monophyletic. The DNA taxonomy developed here lays the groundwork for a future revision of this important but cryptic genus in Europe. We then conducted a species-level, multiple-gene phylogenetic study of European Rhithrogena. Data from three nuclear and two mitochondrial loci were broadly congruent, and species-level relationships were well resolved within most species groups in a combined analysis. In the absence of external calibration points such as fossils, we applied a standard insect molecular clock hypothesis to our mitochondrial data, suggesting an origin of Alpine Rhithrogena at the Oligocene/Miocene boundary. Our results highlight the preponderant role that Quaternary glaciations played in their diversification, promoting speciation of at least half of the current diversity in the Alps. Madagascar's biodiversity and endemism are among the most extraordinary and endangered in the world. This includes the island's freshwater biodiversity, although detailed knowledge of the diversity, endemism, and biogeographic origin of freshwater invertebrates is lacking. Many mayfly species are thought to be restricted to single river basins (microendemic species) in forested areas, making them particularly sensitive to habitat reduction and degradation.
The Heptageniidae are practically unknown in Madagascar except for two described species, Afronurus matitensis and Compsoneuria josettae. Both genera have a disjunct distribution in Africa, Madagascar and Southeast Asia, and a complex taxonomic status still in flux. The standard approach to understanding their diversity, endemism and origin would require extensive field sampling on several continents and years of taxonomic work. Here we circumvent this using museum collections and freshly collected individuals in a combined approach of DNA taxonomy and phylogeny. The cox1-based GMYC analysis revealed 14 putative species on Madagascar, 70% of which are potentially microendemic. A phylogenetic analysis that included African and Asian species and data from two mitochondrial and four nuclear loci indicated that the Malagasy Heptageniidae are monophyletic and sister to African Compsoneuria. The observed monophyly and high microendemism highlight their conservation importance. Our results also underline the important role that museum collections can play in molecular studies, especially in critically endangered biodiversity hotspots like Madagascar.

Relevance:

100.00%

Publisher:

Abstract:

This case study deals with rock face monitoring in an urban area using a terrestrial laser scanner (TLS). The pilot study area is an almost vertical, fifty-meter-high cliff, on top of which the village of Castellfollit de la Roca is located. Rockfall activity is currently causing a retreat of the rock face, which may endanger the houses located at its edge. The TLS datasets consist of high-density 3-D point clouds acquired from five stations, nine times over a time span of 22 months (from March 2006 to January 2008). Change detection, i.e. the identification of rockfalls, was performed through a sequential comparison of datasets. Two types of mass movement were detected in the monitoring period: (a) detachment of single basaltic columns, with magnitudes below 1.5 m³, and (b) detachment of groups of columns, with magnitudes of 1.5 to 150 m³. Furthermore, the historical record revealed (c) the occurrence of slab failures with magnitudes higher than 150 m³. Displacements of a likely slab failure were measured, suggesting an apparently stationary stage. Even though failures are clearly episodic, our results, together with the study of the historical record, enabled us to estimate a mean detachment of material of 46 to 91.5 m³ per year. The application of TLS considerably improved our understanding of the rockfall phenomena in the study area.
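A minimal sketch of the sequential point-cloud comparison underlying TLS change detection; the clouds, noise level and threshold are synthetic, and a real workflow would first co-register the two epochs.

```python
# Minimal sketch: for each point of the later epoch, find its distance to the
# nearest point of the earlier epoch and flag distances above a threshold as
# candidate detachment areas. All values are illustrative.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
epoch_1 = rng.uniform(0, 50, size=(20000, 3))            # reference point cloud (m)
epoch_2 = epoch_1 + rng.normal(0, 0.01, epoch_1.shape)   # later epoch, mostly unchanged
epoch_2[:200, 1] -= 0.8                                  # simulate a local face retreat

tree = cKDTree(epoch_1)
dist, _ = tree.query(epoch_2, k=1)

threshold = 0.05                                         # metres, above instrument noise
changed = dist > threshold
print(f"{changed.sum()} points above threshold "
      f"({100 * changed.mean():.2f}% of the cloud)")
```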

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to examine and explain complex business relationships between a supplier and its customers: how they develop and what processes they go through if a key account relationship runs into difficulties. A further aim was to find reasons why purchasing behaviour has changed and whether this is a worldwide phenomenon or merely a single case in the paper industry. The study also sought to identify the triggers that push a key account relationship into a state of upheaval. The research approach is a qualitative case study. The primary empirical material was collected by interviewing UPM-Kymmene management, purchasing director X of a paper purchasing organization and the former purchasing director of customer Y. The thesis is not confidential; for this reason the customers' names cannot be published, since UPM-Kymmene required that the work contain no information from which a reader could identify customer X or Y. In conclusion, the supplier is advised to monitor and understand possible triggers and warning signals in order to prevent future upheavals in its business relationships and to manage its key accounts better. The main triggers are reduced open communication, the buyer's radical cost savings, reduced exchange of information and changes in the buyer's management, as these create uncertainty for the supplier as well as for the buyer.

Relevance:

100.00%

Publisher:

Abstract:

An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds, and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys were conducted over this test site, covering an area of about 1 km².
In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun, in combination with real-time control of navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm. Whereas the single 48-channel streamer system of Survey I required extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, they allow the determination of feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and subsequent binning errors. Observed aliasing in the 3-D migration was due to insufficient lateral sampling combined with the relatively high-frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber bubble-canceling air gun for Survey II. A 15/15 Mini G.I air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines a penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m. While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array); otherwise, the resulting stacks differ for opposite shooting directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and was complemented by two computer programs that format the unconventional navigation data to industry standards. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stack and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e. on a total of 600 spectra.
According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. Delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows the application of such sophisticated techniques even to high-resolution seismic surveys. In general, the adaptation of the 3-D marine seismic reflection method, which to date has been used almost exclusively by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.
Seismic reflection is a method for investigating the subsurface with very high resolving power. It consists of sending vibrations into the ground and recording the waves that are reflected at geological discontinuities at different depths and then travel back to the surface, where they are recorded. The signals collected in this way provide information on the nature and geometry of the layers present and also allow a geological interpretation of the subsurface. In the case of sedimentary rocks, for example, seismic reflection profiles make it possible to determine their mode of deposition, their possible deformation or faulting, and hence their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection data were acquired along individual profiles that provide a two-dimensional image of the subsurface. Such images are only partially accurate, since they do not take into account the three-dimensional nature of geological structures. Over the past few decades, three-dimensional (3-D) seismics has brought new impetus to the study of the subsurface. While it is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to the lacustrine or fluvial scale has so far been the subject of only a few studies. This thesis consisted in developing a seismic acquisition system similar to the one used for offshore petroleum prospecting, but adapted to lakes: it is smaller, lighter to deploy and, above all, delivers final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution of the order of ten metres, the instrument developed in this work can resolve details of the order of one metre. The new system relies on recording seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and the receivers of the seismic waves) with great precision. Software was developed specifically to control navigation and to trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer, which makes it possible to position the instruments with an accuracy of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the "La Paudèze" fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km². The seismic recordings were then processed to turn them into interpretable images, using a 3-D processing sequence specifically adapted to our data, notably with respect to positioning. After processing, the data reveal several main seismic facies corresponding notably to lacustrine sediments (Holocene), glacio-lacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible on vertical and horizontal seismic sections. The excellent data quality and the interpretation of several horizons and fault surfaces demonstrate the potential of this technique for small-scale three-dimensional investigations, opening the way to its application in environmental and civil engineering contexts.
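A minimal sketch of the CMP binning step with the survey's stated bin size (1.25 m in-line by 3.75 m cross-line); the source and receiver positions are synthetic and assumed to be already expressed in in-line/cross-line coordinates, so this only illustrates the geometry assignment idea.

```python
# Minimal sketch: source-receiver midpoints are assigned to a regular grid
# with the survey's bin size, and the fold is the number of midpoints per cell.
import numpy as np

rng = np.random.default_rng(7)
sources = rng.uniform(0, 1000, size=(5000, 2))          # (in-line, cross-line) in metres
receivers = sources + rng.uniform(-40, 40, size=sources.shape)

midpoints = 0.5 * (sources + receivers)
bin_size = np.array([1.25, 3.75])                       # in-line, cross-line (m)
bin_idx = np.floor(midpoints / bin_size).astype(int)

# Fold per bin = number of midpoints falling in each grid cell
_, fold = np.unique(bin_idx, axis=0, return_counts=True)
print(f"bins occupied: {fold.size}, mean fold: {fold.mean():.1f}, max fold: {fold.max()}")
```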

Relevance:

100.00%

Publisher:

Abstract:

Electrical impedance tomography (EIT) is a non-invasive imaging technique that can measure cardiac-related intra-thoracic impedance changes. EIT-based cardiac output estimation relies on the assumption that the amplitude of the impedance change in the ventricular region is representative of stroke volume (SV). However, other factors such as heart motion can significantly affect this ventricular impedance change. In the present case study, a magnetic resonance imaging-based dynamic bio-impedance model fitting the morphology of a single male subject was built. Simulations were performed to evaluate the contribution of heart motion and its influence on EIT-based SV estimation. Myocardial deformation was found to be the main contributor to the ventricular impedance change (56%). However, motion-induced impedance changes showed a strong correlation (r = 0.978) with left ventricular volume. We explained this by the quasi-incompressibility of blood and myocardium. As a result, EIT achieved excellent accuracy in estimating a wide range of simulated SV values (error distribution of 0.57 ± 2.19 ml (1.02 ± 2.62%) and correlation of r = 0.996 after a two-point calibration was applied to convert impedance values to millilitres). As the model was based on one single subject, the strong correlation found between motion-induced changes and ventricular volume remains to be verified in larger datasets.
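A minimal sketch of a two-point calibration converting an impedance-change amplitude to stroke volume in millilitres; all numbers are synthetic placeholders, not the study's simulation outputs.

```python
# Minimal sketch: two reference stroke volumes fix a linear mapping from the
# ventricular impedance-change amplitude (arbitrary units) to millilitres,
# which is then applied to the remaining beats. Values are illustrative.
import numpy as np

rng = np.random.default_rng(3)
true_sv = np.linspace(40, 120, 30)                                     # simulated stroke volumes (ml)
impedance_amp = 0.8 * true_sv + 5 + rng.normal(0, 1.5, true_sv.size)   # a.u.

# Two-point calibration: use the smallest and largest reference beats
i_lo, i_hi = true_sv.argmin(), true_sv.argmax()
gain = (true_sv[i_hi] - true_sv[i_lo]) / (impedance_amp[i_hi] - impedance_amp[i_lo])
offset = true_sv[i_lo] - gain * impedance_amp[i_lo]

estimated_sv = gain * impedance_amp + offset
error = estimated_sv - true_sv
print(f"error = {error.mean():.2f} ± {error.std():.2f} ml, "
      f"r = {np.corrcoef(estimated_sv, true_sv)[0, 1]:.3f}")
```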

Relevance:

100.00%

Publisher:

Abstract:

The present study evaluates the performance of four methods for estimating regression coefficients used to make statistical decisions regarding intervention effectiveness in single-case designs. Ordinary least squares estimation is compared to two correction techniques dealing with general trend and to one eliminating autocorrelation whenever it is present. Type I error rates and statistical power are studied for experimental conditions defined by the presence or absence of treatment effect (change in level or in slope), general trend, and serial dependence. The results show that empirical Type I error rates do not approximate the nominal ones in the presence of autocorrelation or general trend when ordinary or generalized least squares are applied. The techniques controlling for trend show lower false alarm rates but prove to be insufficiently sensitive to existing treatment effects. Consequently, the use of the statistical significance of regression coefficients for detecting treatment effects is not recommended for short data series.
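A minimal sketch of how an empirical Type I error rate can be estimated for an OLS-based test on autocorrelated null series (no true effect); the AR(1) parameter, series length and number of replicates are illustrative assumptions.

```python
# Minimal sketch: many AR(1) null series are generated, an OLS phase dummy is
# tested at alpha = .05, and the rejection rate is compared with the nominal
# rate. Positive autocorrelation typically inflates the empirical rate.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

def ar1(n, phi, sd=1.0):
    e = np.zeros(n)
    for t in range(1, n):
        e[t] = phi * e[t - 1] + rng.normal(scale=sd)
    return e

def ols_phase_pvalue(y, n_a):
    n = len(y)
    X = np.column_stack([np.ones(n), (np.arange(n) >= n_a).astype(float)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / (n - 2)
    se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
    t_stat = beta[1] / se
    return 2 * stats.t.sf(abs(t_stat), df=n - 2)

rejections = np.mean([ols_phase_pvalue(ar1(20, phi=0.3), n_a=10) < 0.05
                      for _ in range(2000)])
print(f"empirical Type I error rate: {rejections:.3f} (nominal 0.05)")
```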

Relevance:

100.00%

Publisher:

Abstract:

Following protection measures implemented since the 1970s, large carnivores are currently increasing in number and returning to areas from which they were absent for decades or even centuries. Monitoring programmes for these species rely extensively on non-invasive sampling and genotyping. However, attempts to connect results of such studies at larger spatial or temporal scales often suffer from the incompatibility of genetic markers implemented by researchers in different laboratories. This is particularly critical for long-distance dispersers, revealing the need for harmonized monitoring schemes that would enable the understanding of gene flow and dispersal dynamics. Based on a review of genetic studies on grey wolves Canis lupus from Europe, we provide an overview of the genetic markers currently in use, and identify opportunities and hurdles for studies based on continent-scale datasets. Our results highlight an urgent need for harmonization of methods to enable transnational research based on data that have already been collected, and to allow these data to be linked to material collected in the future. We suggest timely standardization of newly developed genotyping approaches, and propose that action is directed towards the establishment of shared single nucleotide polymorphism panels, next-generation sequencing of microsatellites, a common reference sample collection and an online database for data exchange. Enhanced cooperation among genetic researchers dealing with large carnivores in consortia would facilitate streamlining of methods, their faster and wider adoption, and production of results at the large spatial scales that ultimately matter for the conservation of these charismatic species.

Relevance:

100.00%

Publisher:

Abstract:

Leadership is essential for the effectiveness of teams and the organizations they are part of. The challenges facing organizations today require an exhaustive review of the strategic role of leadership. In this context, it is necessary to explore new types of leadership capable of providing an effective response to new needs. Present-day situations, characterized by complexity and ambiguity, make it difficult for an external leader to perform all leadership functions successfully. Likewise, knowledge-based work requires that professional groups be given sufficient autonomy to perform leadership functions. This study focuses on shared leadership in the team context. Shared leadership is seen as an emergent team property resulting from the distribution of leadership influence across multiple team members. It entails sharing power and influence broadly among the team members rather than centralizing them in the hands of a single individual who acts in the clear role of a leader. By identifying the team itself as a key source of influence, this study points to the relational nature of leadership as a social construct, where leadership is seen as a social process of relating that is co-constructed by several team members. Based on recent theoretical developments concerned with relational, practice-based and constructionist approaches to the study of leadership processes, this thesis proposes studying leadership interactions, working processes and practices with a focus on the construction of direction, alignment and commitment. During the research process, critical events, activities, working processes and practices of a case team were examined and analyzed with the grounded theory approach in terms of shared leadership. There are a variety of components to this complex process and a multitude of factors that may influence the development of shared leadership. The study suggests that the development of shared leadership is a common sense-making process consisting of four overlapping dimensions (individual, social, structural, and developmental) for the team to work with. For shared leadership to emerge, the members of the team must offer leadership services, and the team as a whole must be willing to rely on leadership by multiple team members. For these individual and collective behaviors to occur, the team members must believe that offering influence to and accepting it from fellow team members are welcome and constructive actions. Leadership emerges when people with differing world views use dialogue and collaborative learning to create spaces where a shared common purpose can be achieved while a diversity of perspectives is preserved and valued. This study also suggests that this process can be supported by different kinds of meaning-making and process tools. Leadership, then, does not reside in a person or in a role, but in the social system. The framework built here integrates the different dimensions of shared leadership and describes their relationships. In this way, the findings of this study contribute to the understanding of what constitutes essential aspects of shared leadership in the team context, which can be of theoretical value in advancing the adoption and development of shared leadership. In the real world, teams and organizations can create conditions to foster and facilitate the process.
We should encourage leaders and team members to approach leadership as a collective effort that the team can be prepared for, so that the response is rapid and efficient.