952 results for single channel algorithm


Relevância: 30.00%

Resumo:

ABSTRACT: The Amazon várzeas are an important component of the Amazon biome, but anthropic and climatic impacts have been leading to forest loss and interruption of essential ecosystem functions and services. The objectives of this study were to evaluate the capability of the Landsat-based Detection of Trends in Disturbance and Recovery (LandTrendr) algorithm to characterize changes in várzea forest cover in the Lower Amazon, and to analyze the potential of spectral and temporal attributes to classify forest loss as either natural or anthropogenic. We used a time series of 37 Landsat TM and ETM+ images acquired between 1984 and 2009. We used the LandTrendr algorithm to detect forest cover change and the attributes of "start year", "magnitude", and "duration" of the changes, as well as "NDVI at the end of series". Detection was restricted to areas identified as having forest cover at the start and/or end of the time series. We used the Support Vector Machine (SVM) algorithm to classify the extracted attributes, differentiating between anthropogenic and natural forest loss. Detection reliability was consistently high for change events along the Amazon River channel, but variable for changes within the floodplain. Spectral-temporal trajectories faithfully represented the nature of changes in floodplain forest cover, corroborating field observations. We estimated anthropogenic forest losses to be larger (1,071 ha) than natural losses (884 ha), with a global classification accuracy of 94%. We conclude that the LandTrendr algorithm is a reliable tool for studies of forest dynamics throughout the floodplain.
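The attribute-based separation of anthropogenic versus natural loss described above can be sketched with a toy linear classifier. The study used a Support Vector Machine; here a minimal perceptron stands in for any linear separator, and the attribute values (spectral magnitude, duration in years) are invented purely for illustration.

```python
# Illustrative sketch (not the study's code): separating change events into
# anthropogenic (+1) vs natural (-1) classes from LandTrendr-style
# attributes.  The paper used an SVM; a perceptron stands in here, and the
# toy (magnitude, duration_years) values are invented.

events = [
    ((0.65, 1), +1), ((0.70, 2), +1), ((0.60, 1), +1),   # abrupt, strong loss
    ((0.20, 8), -1), ((0.25, 10), -1), ((0.15, 7), -1),  # gradual, weaker loss
]

def train_perceptron(data, epochs=200, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) matches the labels."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            if y * (w[0] * x1 + w[1] * x2 + b) <= 0:  # misclassified -> update
                w[0] += lr * y * x1
                w[1] += lr * y * x2
                b += lr * y
    return w, b

w, b = train_perceptron(events)

def predict(x1, x2):
    return +1 if w[0] * x1 + w[1] * x2 + b > 0 else -1
```

On linearly separable toy data like this, the perceptron is guaranteed to converge; a real SVM additionally maximizes the separation margin.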

Relevância: 30.00%

Resumo:

Advances in computing power today come from the parallelization of processing, enabled by the features of the new hardware architectures. Using this hardware appropriately accelerates the execution of algorithms (programs). However, properly converting an algorithm into its parallel form is complex, and that parallel form is, in turn, specific to each type of parallel hardware. The most common general-purpose processors today are multicore parallel processors, also called Symmetric Multi-Processors (SMP). It is now hard to find a desktop processor without some form of SMP parallelism, and the development trend is toward processors with an ever greater number of cores. Graphics Processing Units (GPUs), in turn, have developed their computing power by integrating multiple processing units, to the point that boards capable of 200 to 400 parallel processing threads are now easy to find. These processors are very fast and specialized for the task they were designed for, mainly video processing. However, since this kind of processing has much in common with scientific computing, these devices have been reoriented under the name General Purpose Graphics Processing Unit (GPGPU). Unlike the SMP processors mentioned above, GPGPUs are not general-purpose devices: their general use is complicated by the limited amount of memory available on each board and by the type of parallel processing required for their use to be productive.
Field Programmable Gate Arrays (FPGA) are programmable logic devices capable of performing large numbers of operations in parallel, implementing hardware logic with low latency, high parallelism, and deep pipelines; they can therefore be used to implement specific algorithms that need to run at very high speed. Their drawback is the complexity of programming and testing (debugging) the algorithm instantiated on the device, which is typically time-consuming. Given this diversity of parallel processors, our work aims to analyze the specific characteristics of each of them and their impact on the structure of algorithms, so that their use yields processing performance commensurate with the number of resources employed, and so that they can be combined in a mutually beneficial way. Specifically, starting from the characteristics of the hardware, we determine the properties a parallel algorithm must have in order to be accelerated; the characteristics of the parallel algorithms will in turn determine which of these new types of hardware is most suitable for their implementation. In particular, we consider the degree of data dependence, the need for synchronization during parallel processing, the size of the data to be processed, and the complexity of parallel programming on each type of hardware.
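The data-dependence criterion discussed above can be sketched with a toy example: an element-wise operation whose iterations are independent parallelizes directly, while a loop-carried dependence (here, a prefix sum) cannot be mapped naively onto parallel threads. All names are ours, for illustration only.

```python
# Toy illustration of the data-dependence criterion discussed in the text.
from concurrent.futures import ThreadPoolExecutor

data = list(range(1, 9))

# Case 1: element-wise square -- iterations are independent, so the work
# can be distributed across SMP cores, GPGPU threads, or FPGA pipelines.
def square(x):
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    squares = list(pool.map(square, data))

# Case 2: prefix sum -- each iteration reads the result of the previous
# one (a loop-carried dependence), so a naive parallel mapping is not
# possible and synchronization or algorithmic restructuring is required.
prefix = []
acc = 0
for x in data:
    acc += x          # depends on the value produced by the previous step
    prefix.append(acc)
```

In practice, parallel prefix-sum algorithms do exist, but they require restructuring the computation (e.g. tree-based scans) rather than a direct per-element mapping, which is exactly the kind of hardware-driven transformation the text discusses.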

Relevância: 30.00%

Resumo:

Introduction: Non-invasive brain imaging techniques often contrast experimental conditions across a cohort of participants, obfuscating distinctions in individual performance and brain mechanisms that are better characterised by the inter-trial variability. To overcome such limitations, we developed topographic analysis methods for single-trial EEG data [1]. Previous single-trial approaches were typically based on time-frequency analysis of single-electrode data or single independent components. The method's efficacy is demonstrated for event-related responses to environmental sounds, hitherto studied at an average event-related potential (ERP) level. Methods: Nine healthy subjects participated in the experiment. Auditory meaningful sounds of common objects were used for a target detection task [2]. In each block, subjects were asked to discriminate target sounds, which were living or man-made auditory objects. Continuous 64-channel EEG was acquired during the task. Two datasets were considered for each subject, comprising the single trials of the two conditions, living and man-made. The analysis comprised two steps. In the first step, a mixture of Gaussians analysis [3] provided representative topographies for each subject. In the second step, conditional probabilities for each Gaussian provided statistical inference on the structure of these topographies across trials, time, and experimental conditions. A similar analysis was conducted at the group level. Results: The occurrence of each map is structured in time and consistent across trials, at both the single-subject and group levels. Conducting separate analyses of ERPs at single-subject and group levels, we could quantify the consistency of identified topographies and their time course of activation within and across participants as well as experimental conditions. A general agreement was found with previous analyses at the average ERP level.
Conclusions: This novel approach to single-trial analysis promises to have an impact on several domains. In clinical research, it gives the possibility to statistically evaluate single-subject data, an essential tool for analysing patients with specific deficits and impairments and their deviation from normative standards. In cognitive neuroscience, it provides a novel tool for understanding behaviour and brain activity interdependencies at both the single-subject and group levels. In basic neurophysiology, it provides a new representation of ERPs and promises to cast light on the mechanisms of their generation and inter-individual variability.
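The second analysis step described above assigns each single-trial topography a conditional probability of belonging to each Gaussian template map. A minimal sketch of that posterior computation follows; the 3-electrode maps, priors, and shared isotropic variance are our invented illustration, not the study's fitted values.

```python
import math

# Minimal sketch of the conditional-probability step above: the posterior
# p(map | x) that a single-trial voltage topography x was generated by each
# Gaussian template.  Electrode count, template maps, priors, and the
# isotropic-variance assumption are ours, for illustration.

templates = {            # mean topography of each Gaussian (3 electrodes)
    "map_A": [1.0, 0.0, -1.0],
    "map_B": [-1.0, 0.5, 0.5],
}
priors = {"map_A": 0.5, "map_B": 0.5}
sigma2 = 0.5             # shared isotropic variance

def posterior(x):
    """p(map | x) for each template, via Bayes' rule."""
    lik = {}
    for name, mu in templates.items():
        d2 = sum((xi - mi) ** 2 for xi, mi in zip(x, mu))
        lik[name] = priors[name] * math.exp(-d2 / (2 * sigma2))
    z = sum(lik.values())
    return {name: v / z for name, v in lik.items()}

p = posterior([0.9, 0.1, -0.8])   # a trial resembling map_A
```

Accumulating these posteriors across trials and time points yields the statistical inference on map occurrence that the abstract describes.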

Relevância: 30.00%

Resumo:

In all current clinical guidelines, dihydropyridine calcium channel blockers (CCBs) belong to the recommended first-line antihypertensive drugs to treat essential hypertension. Several recent large clinical trials have confirmed their efficacy not only in lowering blood pressure but also in reducing cardiovascular morbidity and mortality in hypertensive patients with a normal or high cardiovascular risk profile. In clinical trials such as ALLHAT, VALUE or ASCOT, an amlodipine-based therapy was at least as effective, if not slightly superior, in lowering blood pressure and sometimes more effective in preventing target organ damage than blood pressure lowering strategies based on the use of diuretics, beta-blockers and blockers of the renin-angiotensin system. One of the main clinical side effects of the first- and second-generation CCBs, including amlodipine, is the development of peripheral edema. The incidence of leg edema can be markedly reduced by combining the CCB with a blocker of the renin-angiotensin system. This strategy has now led to the development of several fixed-dose combinations of amlodipine and angiotensin II receptor antagonists. Another alternative to lower the incidence of edema is to use CCBs of the third generation, such as lercanidipine. Indeed, although no major clinical trials have been conducted with this compound, clinical studies have shown that lercanidipine and amlodipine have comparable antihypertensive efficacy, but with significantly less peripheral edema in patients receiving lercanidipine. In some countries, lercanidipine is now available in a single-pill combination with an ACE inhibitor, thereby further improving its efficacy and tolerability profile.

Relevância: 30.00%

Resumo:

Diabetes mellitus (DM) is a major cause of peripheral neuropathy. More than 220 million people worldwide suffer from type 2 DM, which will, in approximately half of them, lead to the development of diabetic peripheral neuropathy (DPN). While of significant medical importance, the pathophysiological changes present in DPN are still poorly understood. To gain more insight into the DPN associated with type 2 DM, we used a rodent model of this form of diabetes, the db/db mouse. During in-vivo conduction velocity studies on these animals, we observed multiple spiking following a single stimulation. This prompted us to evaluate the excitability properties of db/db peripheral nerves. Ex-vivo electrophysiological evaluation revealed a significant increase in the excitability of db/db sciatic nerves. While the shape and kinetics of the compound action potential of db/db nerves were the same as for control nerves, we observed an increase in the after-hyperpolarization phase (AHP) under diabetic conditions. Using pharmacological inhibitors, we demonstrated that both the peripheral nerve hyperexcitability (PNH) and the increased AHP were mostly mediated by the decreased activity of Kv1 channels. Importantly, we corroborated these data at the molecular level. We observed a strong reduction of Kv1.2 channel presence in the juxtaparanodal regions of teased fibers in db/db mice as compared to control mice. Quantification of the amount of both Kv1.2 isoforms in DRG neurons and in the endoneurial compartment of the peripheral nerve by Western blotting revealed that less mature Kv1.2 was integrated into the axonal membranes at the juxtaparanodes. Our observation that the peripheral nerve hyperexcitability present in db/db mice is at least in part a consequence of changes in potassium channel distribution suggests that the same mechanism also mediates PNH in diabetic patients.

Relevância: 30.00%

Resumo:

Auditory evoked potentials are informative of intact cortical functions of comatose patients. The integrity of auditory functions evaluated using mismatch negativity paradigms has been associated with their chances of survival. However, because auditory discrimination is assessed at various delays after coma onset, it is still unclear whether this impairment depends on the time of the recording. We hypothesized that impairment in auditory discrimination capabilities is indicative of coma progression, rather than of the comatose state itself and that rudimentary auditory discrimination remains intact during acute stages of coma. We studied 30 post-anoxic comatose patients resuscitated from cardiac arrest and five healthy, age-matched controls. Using a mismatch negativity paradigm, we performed two electroencephalography recordings with a standard 19-channel clinical montage: the first within 24 h after coma onset and under mild therapeutic hypothermia, and the second after 1 day and under normothermic conditions. We analysed electroencephalography responses based on a multivariate decoding algorithm that automatically quantifies neural discrimination at the single patient level. Results showed high average decoding accuracy in discriminating sounds both for control subjects and comatose patients. Importantly, accurate decoding was largely independent of patients' chance of survival. However, the progression of auditory discrimination between the first and second recordings was informative of a patient's chance of survival. A deterioration of auditory discrimination was observed in all non-survivors (equivalent to 100% positive predictive value for survivors). We show, for the first time, evidence of intact auditory processing even in comatose patients who do not survive and that progression of sound discrimination over time is informative of a patient's chance of survival. 
Tracking auditory discrimination in comatose patients could provide new insight into the chance of awakening, in a quantitative and automatic fashion, during the early stages of coma.
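The prognostic rule described above (improvement of decoding accuracy between the two recordings predicts survival) and its positive predictive value can be sketched as follows. The per-patient accuracy values below are invented for illustration; only the rule and the PPV formula follow the text.

```python
# Sketch of the prognostic rule above: a patient whose auditory decoding
# accuracy improves from the first to the second recording is predicted to
# survive.  Accuracy values are invented; only the rule and PPV formula
# follow the text.

patients = [
    # (accuracy recording 1, accuracy recording 2, survived)
    (0.60, 0.68, True),
    (0.55, 0.61, True),
    (0.62, 0.52, False),   # deterioration -> non-survivor
    (0.58, 0.49, False),
]

predicted_survivors = [p for p in patients if p[1] > p[0]]   # improvement
true_positives = sum(1 for p in predicted_survivors if p[2])
ppv = true_positives / len(predicted_survivors)
```

With the abstract's finding that all non-survivors deteriorated, no non-survivor falls among the predicted survivors, which is what yields the reported 100% PPV.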

Relevância: 30.00%

Resumo:

Like numerous torrents in mountainous regions, the Illgraben creek (canton of Wallis, SW Switzerland) produces several debris flows almost every year. The total area of the active catchment is only 4.7 km², but large events ranging from 50'000 to 400'000 m³ are common (Zimmermann 2000). Consequently, the pathway of the main channel often changes suddenly. A single event can, for instance, fill the whole river bed and dig new several-metre-deep channels somewhere else (Bardou et al. 2003). Quantifying both the rhythm and the magnitude of these changes is very important to assess the variability of the bed's cross section and long profile. These parameters are indispensable for numerical modelling, as they should be considered as initial conditions. To monitor the channel evolution, an Optech ILRIS 3D terrestrial laser scanner (LIDAR) was used. LIDAR makes it possible to build a complete, high-precision 3D model of the channel and its surroundings by scanning it from different viewpoints. The 3D data are processed and interpreted with the software Polyworks from Innovmetric Software Inc. Sequential 3D models allow for the determination of the variation in the bed's cross section and long profile. These data will afterwards be used to quantify the erosion and the deposition in the torrent reaches. To complete the chronological evolution of the landforms, precise digital terrain models, obtained by high-resolution photogrammetry based on old aerial photographs, will be used. A 500 m long section of the Illgraben channel was scanned on 18 August 2005 and on 7 April 2006. These two data sets permit identification of the changes in the channel that occurred during the winter season. An upcoming scanning campaign in September 2006 will allow for the determination of the changes during this summer. Preliminary results show huge variations in the pathway of the Illgraben channel, as well as important vertical and lateral erosion of the river bed.
Here we present the results for a river bank on the left (north-western) flank of the channel (Figure 1). For the August 2005 model the scans from 3 viewpoints were superposed, whereas the April 2006 3D image was obtained by combining 5 separate scans. The bank was eroded essentially on its left part (by up to 6.3 m), where it is hit by the river and the debris flows (Figures 2 and 3). A debris cone has also formed (Figure 3), which suggests that a part of the bank erosion is due to shallow landslides. They probably occur when river erosion creates an undercut slope. These geometrical data allow for the monitoring of the alluvial dynamics (i.e. aggradation and degradation) on different time scales and of the influence of debris flow occurrence on these changes. Finally, the resistance against erosion of the bed's cross section and long profile will be analysed to assess the variability of these two key parameters. This information may then be used in debris flow simulation.
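The sequential-model comparison described above amounts to differencing two gridded elevation models of the channel: negative cell changes accumulate into an erosion volume and positive ones into a deposition volume. The tiny 3x3 grids and 1 m² cell size below are invented for illustration.

```python
# Sketch of sequential-DEM differencing: subtracting two gridded elevation
# models (m) of the channel yields per-cell change, from which erosion and
# deposition volumes follow.  The 3x3 grids and 1 m^2 cells are invented.

cell_area = 1.0  # m^2 per grid cell

dem_2005 = [[10.0, 10.0, 10.0],
            [ 9.0,  8.5,  9.0],
            [ 9.5,  9.0,  9.5]]
dem_2006 = [[10.0,  9.0, 10.0],
            [ 9.0,  8.0,  9.2],
            [ 9.5,  9.0,  9.5]]

erosion = deposition = 0.0
for r05, r06 in zip(dem_2005, dem_2006):
    for z05, z06 in zip(r05, r06):
        dz = z06 - z05
        if dz < 0:
            erosion += -dz * cell_area      # material removed (m^3)
        else:
            deposition += dz * cell_area    # material added (m^3)
```

Real terrestrial-laser-scanner point clouds must first be co-registered and gridded (the step done in Polyworks in the study) before such a per-cell budget is meaningful.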

Relevância: 30.00%

Resumo:

The rate-limiting step of dietary calcium absorption in the intestine requires the brush border calcium entry channel TRPV6. The TRPV6 gene was completely sequenced in 170 renal calcium stone patients. The frequency of an ancestral TRPV6 haplotype consisting of three non-synonymous polymorphisms (C157R, M378V, M681T) was significantly higher (P = 0.039) in calcium stone formers (8.4%; derived = 502, ancestral = 46) compared to non-stone-forming individuals (5.4%; derived = 645, ancestral = 37). Mineral metabolism was investigated on four different calcium regimens: (i) free-choice diet, (ii) low calcium diet, (iii) fasting and (iv) after a 1 g oral calcium load. When patients homozygous for the derived haplotype were compared with heterozygous patients, no differences were found with respect to the plasma concentrations of 1,25-vitamin D, PTH and calcium, and the urinary excretion of calcium. In one stone-forming patient, the ancestral haplotype was found to be homozygous. This patient had absorptive hypercalciuria. We therefore expressed the ancestral protein (157R+378V+681T) in Xenopus oocytes and found a significantly enhanced calcium permeability when tested by a (45)Ca(2+) uptake assay (7.11 +/- 1.93 versus 3.61 +/- 1.01 pmol/min/oocyte for ancestral versus derived haplotype, P < 0.01). These results suggest that the ancestral gain-of-function haplotype in TRPV6 plays a role in calcium stone formation in certain forms of absorptive hypercalciuria.
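The reported association can be checked directly from the allele counts given above (stone formers: 46 ancestral / 502 derived; controls: 37 ancestral / 645 derived). A Pearson chi-square test on the 2x2 table, with one degree of freedom and assuming no continuity correction (our assumption; the paper does not state the exact test), reproduces the reported P = 0.039.

```python
import math

# Check of the reported association from the counts in the abstract.
table = [[46, 502],    # stone formers: ancestral, derived alleles
         [37, 645]]    # controls:      ancestral, derived alleles

row = [sum(r) for r in table]
col = [sum(c) for c in zip(*table)]
n = sum(row)

chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row[i] * col[j] / n
        chi2 += (table[i][j] - expected) ** 2 / expected

# Survival function of a chi-square with 1 df: P = erfc(sqrt(chi2 / 2)).
p_value = math.erfc(math.sqrt(chi2 / 2))
```

The computed statistic is about 4.26, giving P ≈ 0.039, consistent with the abstract.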

Relevância: 30.00%

Resumo:

We propose and validate a multivariate classification algorithm for characterizing changes in human intracranial electroencephalographic data (iEEG) after learning motor sequences. The algorithm is based on a Hidden Markov Model (HMM) that captures spatio-temporal properties of the iEEG at the level of single trials. Continuous iEEG was acquired during two sessions (one before and one after a night of sleep) in two patients with depth electrodes implanted in several brain areas. They performed a visuomotor sequence (serial reaction time task, SRTT) using the fingers of their non-dominant hand. Our results show that the decoding algorithm correctly classified single iEEG trials from the trained sequence as belonging to either the initial training phase (day 1, before sleep) or a later consolidated phase (day 2, after sleep), whereas it failed to do so for trials belonging to a control condition (pseudo-random sequence). Accurate single-trial classification was achieved by taking advantage of the distributed pattern of neural activity. However, across all contacts, the hippocampus contributed most to the classification accuracy in both patients, as did one fronto-striatal contact in one patient. Together, these human intracranial findings demonstrate that a multivariate decoding approach can detect learning-related changes at the level of single-trial iEEG. Because it allows an unbiased identification of the brain sites contributing to a behavioral effect (or experimental condition) at the single-subject level, this approach could be usefully applied to assess the neural correlates of other complex cognitive functions in patients implanted with multiple electrodes.
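The classification principle described above, deciding which phase a single trial belongs to by comparing its likelihood under two trained HMMs, can be sketched with a minimal example. The study models continuous multichannel iEEG; here two small discrete HMMs with hand-set parameters stand in for the fitted models, and all parameter values are invented.

```python
import math

# Illustrative sketch of HMM-based single-trial classification: a trial is
# assigned to the phase whose HMM gives it the higher log-likelihood.
# Discrete observations and hand-set parameters stand in for the paper's
# fitted continuous models.

pi = [0.5, 0.5]                        # initial state probabilities (shared)
A = [[0.9, 0.1], [0.1, 0.9]]           # state transition matrix (shared)
B_train = [[0.8, 0.2], [0.6, 0.4]]     # emissions: "training phase" HMM
B_consol = [[0.2, 0.8], [0.4, 0.6]]    # emissions: "consolidated phase" HMM

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence (scaled forward)."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    z = sum(alpha)
    loglik = math.log(z)
    alpha = [a / z for a in alpha]
    for t in range(1, len(obs)):
        alpha = [sum(alpha[r] * A[r][s] for r in range(n)) * B[s][obs[t]]
                 for s in range(n)]
        z = sum(alpha)                 # rescale to avoid underflow
        loglik += math.log(z)
        alpha = [a / z for a in alpha]
    return loglik

def classify(obs):
    lt = forward_loglik(obs, pi, A, B_train)
    lc = forward_loglik(obs, pi, A, B_consol)
    return "training phase" if lt > lc else "consolidated phase"
```

In the study the two HMMs would be fitted to the day-1 and day-2 trials respectively; the decision rule, the comparison of forward log-likelihoods, is the same.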

Relevância: 30.00%

Resumo:

For single-user MIMO communication with uncoded and coded QAM signals, we propose bit and power loading schemes that rely only on channel distribution information at the transmitter. To that end, we develop the relationship between the average bit error probability at the output of a ZF linear receiver and the bit rates and powers allocated at the transmitter. This relationship, and the fact that a ZF receiver decouples the MIMO parallel channels, allow leveraging bit loading algorithms already existing in the literature. We solve the dual problems of bit rate maximization and power minimization and present performance results that illustrate the gains of the proposed scheme with respect to a non-optimized transmission.
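Because the ZF receiver decouples the MIMO channel into parallel subchannels, classic greedy bit-loading algorithms apply directly. The sketch below is a Hughes-Hartogs-style greedy allocation, not the paper's exact scheme: it uses the standard gap approximation, where carrying b bits on subchannel i costs P_i(b) = (2^b - 1) * gap / gain_i, and repeatedly places the next bit on the subchannel where it is cheapest. Gains, gap, and target rate are invented.

```python
# Hughes-Hartogs-style greedy bit loading over ZF-decoupled subchannels
# (illustrative; not the paper's exact scheme).  Power model per the gap
# approximation: P_i(b) = (2**b - 1) * gap / gains[i].

gains = [4.0, 2.0, 1.0, 0.5]   # effective subchannel power gains (assumed)
gap = 1.0                       # SNR gap for the target error probability
target_bits = 8                 # total bits per channel use to allocate

bits = [0] * len(gains)
power = [0.0] * len(gains)
for _ in range(target_bits):
    # incremental power needed to add one more bit on each subchannel
    deltas = [(2 ** (bits[i] + 1) - 2 ** bits[i]) * gap / gains[i]
              for i in range(len(gains))]
    i = deltas.index(min(deltas))   # the cheapest next bit wins
    bits[i] += 1
    power[i] += deltas[i]
```

As expected, stronger subchannels end up carrying more bits ([4, 3, 1, 0] for these gains), which is the power-minimizing allocation for the target rate under this model; the dual rate-maximization problem runs the same greedy loop against a power budget instead of a bit target.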

Relevância: 30.00%

Resumo:

Introduction: The primary somatosensory cortex (SI) contains Brodmann areas (BA) 1, 2, 3a, and 3b. Research in non-human primates showed that BAs 3b, 1, and 2 each contain one full representation of the hand, with separate representations for each finger. This research also showed that the finger representation in BA3b has a larger and clearer finger somatotopy than BA1 and BA2. Although several efforts to map finger somatotopy in SI by fMRI have been made at 1.5 and 3T, these studies have yielded variable results and were not able to detect single-subject finger somatotopy, probably due to the limited spatial extent of the cortical areas representing a digit (close to the resolution in most fMRI experiments), complications in acquiring consistent maps for individual subjects (Schweizer et al 2008), or inter-individual variability in sulcal anatomy impeding group studies. Here, we used 7T fMRI to investigate finger somatotopy in SI, some of its functional characteristics, and its reproducibility. Methods: Eight right-handed male subjects were scanned on a 7T scanner (Siemens Medical, Germany) with an 8-channel Tx/Rx rf-coil (Rapid Biomedical, Germany). fMRI data at 1.3x1.3x1.3 mm3 resolution were acquired using a sinusoidal readout EPI sequence (Speck et al, 2008) with FOV=210mm, TE/TR=27ms/2.5s, GRAPPA=2. Each volume contained 28 transverse slices covering SI. A single EPI volume with 64 slices was acquired to aid coregistration. 1x1x1 mm3 anatomical data were acquired using the MP2RAGE sequence (Marques et al, 2009; TE/TR/TI1,2/TRmprage=2.63ms/7.2ms/0.9,3.2s/5s). Subjects were positioned supine in the scanner with their right arm resting comfortably against the magnet bore. An experimenter was positioned at the entrance of the bore, where he could easily reach and successively stroke the two distal phalanges of each digit. The order of stroked digits was D1 (thumb)-D3-D5-D2-D4, with 20s ON and 10s OFF alternated.
This sequence was repeated four times per run and two functional runs were acquired per subject. Realignment, smoothing (FWHM 2 mm), coregistration of the anatomical to the fMRI data, and calculation of t-statistics were done using SPM8. An SI mask was obtained via an F-contrast (p<0.001) over all digits. Within the mask, voxels were labeled with the number of the digit demonstrating the highest t-value for that particular voxel. Results: For all subjects, areas corresponding to the five digits were identified in contralateral SI. BA3b showed the most consistent somatotopic finger representation (see an example in Fig.1). The five digits were localized in a consecutive order in the cortex, with D1 most anterior, inferior and distal and D5 most posterior, superior and medial (mean distance between centres of mass of digit representations ±stderr: 4.2±0.7mm; see Fig. 2). The analysis of average beta values within each finger representation region revealed the specificity of the somatotopic region to the tactile input of each tested finger (except digits 4 and 5). Five of these subjects also presented an orderly and consecutive representation of the five digits in BA1 and BA2. Conclusions: Our data reveal that the increased BOLD sensitivity at 7T and the high spatial resolution used in this study allow consistent somatotopic mapping using human touch as a stimulus, and that human SI contains at least three separate regions, each containing separate representations of all five contralateral fingers. Moreover, adjacent fingers were represented at adjacent cortical regions across the three SI regions. The spatial organization of SI as reflected in individual-subject topography corresponds well with previous electrophysiological data in non-human primates. The small distance between digit representations highlights the need for the high spatial resolution available at 7T.
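The winner-take-all voxel-labeling step described above (each masked voxel gets the digit with the highest t-value there) can be sketched directly. The t-values and the 6-voxel "mask" below are invented for illustration.

```python
# Sketch of the voxel-labeling step: within the SI mask, each voxel is
# assigned the digit whose t-value is highest at that voxel.  The t-values
# and 6-voxel mask are invented.

t_maps = {                 # digit -> t-value at each masked voxel
    1: [5.1, 2.0, 0.5, 1.0, 0.2, 0.1],
    2: [1.2, 4.8, 1.0, 0.9, 0.3, 0.2],
    3: [0.8, 1.5, 4.2, 1.1, 0.6, 0.4],
    4: [0.5, 0.9, 1.3, 3.9, 1.0, 0.7],
    5: [0.3, 0.4, 0.8, 1.2, 3.5, 3.1],
}
n_voxels = 6

# winner-take-all label per voxel
labels = [max(t_maps, key=lambda d: t_maps[d][v]) for v in range(n_voxels)]
```

In the study this labeling is applied only inside the F-contrast mask, so voxels with no significant digit response never receive a label.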

Relevância: 30.00%

Resumo:

Neuroimaging studies typically compare experimental conditions using average brain responses, thereby overlooking the stimulus-related information conveyed by distributed spatio-temporal patterns of single-trial responses. Here, we take advantage of this rich information at the single-trial level to decode stimulus-related signals in two event-related potential (ERP) studies. Our method models the statistical distribution of the voltage topographies with a Gaussian Mixture Model (GMM), which reduces the dataset to a number of representative voltage topographies. The degree of presence of these topographies across trials at specific latencies is then used to classify experimental conditions. We tested the algorithm using a cross-validation procedure in two independent EEG datasets. In the first ERP study, we classified left- versus right-hemifield checkerboard stimuli for upper and lower visual hemifields. In the second ERP study, where functional differences cannot be assumed, we classified initial versus repeated presentations of visual objects. With minimal a priori information, the GMM provides neurophysiologically interpretable features - namely, voltage topographies - as well as dynamic information about brain function. This method can in principle be applied to any ERP dataset to test the functional relevance of specific time periods for stimulus processing, the predictability of subjects' behavior and cognitive states, and the discrimination between healthy and clinical populations.
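The reduction step described above, summarizing many single-trial voltage topographies by a few representative maps, can be sketched compactly. The paper fits a Gaussian Mixture Model; for brevity this sketch uses k-means, which is the hard-assignment limit of a GMM with shared isotropic covariance, and the 3-electrode topographies are invented.

```python
# Sketch of the topography-reduction step: many single-trial topographies
# are summarized by a few representative maps.  The paper fits a GMM; this
# sketch uses k-means (the hard-assignment limit of an isotropic GMM).
# The 3-electrode trial topographies are invented.

trials = [
    [1.0, 0.1, -1.1], [0.9, 0.0, -0.9], [1.1, -0.1, -1.0],   # pattern A
    [-1.0, 0.6, 0.4], [-0.9, 0.5, 0.6], [-1.1, 0.4, 0.5],    # pattern B
]

def kmeans(points, centers, iters=10):
    for _ in range(iters):
        # assignment step: nearest representative map for each trial
        groups = [[] for _ in centers]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers]
            groups[d.index(min(d))].append(p)
        # update step: each map becomes the mean of its assigned trials
        centers = [[sum(x) / len(g) for x in zip(*g)] for g in groups if g]
    return centers

maps = kmeans(trials, centers=[trials[0], trials[3]])
```

In the full GMM, each trial instead gets a soft posterior over the maps, and it is the latency-resolved pattern of those posteriors that feeds the classifier.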

Relevância: 30.00%

Resumo:

Chloride channels represent a group of targets for major clinical indications. However, molecular screening for chloride channel modulators has proven to be difficult and time-consuming, as existing approaches essentially rely on the use of fluorescent dyes or invasive patch-clamp techniques which do not lend themselves to the screening of large sets of compounds. To address this problem, we have developed a non-invasive optical method, based on digital holographic microscopy (DHM), allowing the monitoring of ion channel activity without using any electrode or fluorescent dye. To illustrate this approach, GABA(A)-mediated chloride currents have been monitored with DHM. Practically, we show that DHM can non-invasively provide the quantitative determination of transmembrane chloride fluxes mediated by the activation of chloride channels associated with GABA(A) receptors. Indeed, through an original algorithm, chloride currents elicited by application of appropriate agonists of the GABA(A) receptor can be derived from the quantitative phase signal recorded with DHM. Finally, chloride currents can be determined and pharmacologically characterized non-invasively and simultaneously on a large cellular sampling by DHM.
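The paper's original phase-to-current algorithm is not reproduced here, but its final conversion step can be sketched under a simplifying assumption: once the DHM phase signal has been converted into an intracellular chloride concentration time course, the transmembrane current follows from Faraday's law, I = z·F·V_cell·dC/dt. All numbers below (cell volume, sampling, concentration slope) are invented for illustration.

```python
# Sketch of the final conversion step only (our simplification, not the
# paper's algorithm): given an intracellular Cl- concentration time course
# (derived upstream from the DHM phase signal), the transmembrane current
# follows from Faraday's law, I = z * F * V_cell * dC/dt.

F = 96485.0          # C/mol, Faraday constant
z = -1               # valence of Cl-
v_cell = 2e-12       # cell volume in liters (assumed)

dt = 0.1                                        # s between samples (assumed)
conc = [10.0 + 0.5 * i for i in range(6)]       # mM, linear Cl- influx (toy)

# finite-difference derivative, converting mM/s to mol/(L*s)
dcdt = [(conc[i + 1] - conc[i]) / dt * 1e-3 for i in range(len(conc) - 1)]
current = [z * F * v_cell * d for d in dcdt]    # amperes
```

For these toy numbers the steady influx corresponds to a constant current of roughly -1 nA, a physiologically plausible order of magnitude for whole-cell chloride currents.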

Relevância: 30.00%

Resumo:

The Phase I research, Iowa Department of Transportation (IDOT) Project HR-214, "Feasibility Study of Strengthening Existing Single Span Steel Beam Concrete Deck Bridges," verified that post-tensioning can be used to provide strengthening of the composite bridges under investigation. Phase II research, reported here, involved the strengthening of two full-scale prototype bridges - one a prototype of the model bridge tested during Phase I and the other larger and skewed. In addition to the field work, Phase II also involved a considerable amount of laboratory work. A literature search revealed that only minimal data existed on the angle-plus-bar shear connectors. Thus, several specimens utilizing angle-plus-bar, as well as channels, studs and high strength bolts as shear connectors were fabricated and tested. To obtain additional shear connector information, the bridge model of Phase I was sawed into four composite concrete slab and steel beam specimens. Two of the resulting specimens were tested with the original shear connection, while the other two specimens had additional shear connectors added before testing. Although orthotropic plate theory was shown in Phase I to predict vertical load distribution in bridge decks and to predict approximate distribution of post-tensioning for right-angle bridges, it was questioned whether the theory could also be used on skewed bridges. Thus, a small plexiglas model was constructed and used in vertical load distribution tests and post-tensioning force distribution tests for verification of the theory. Conclusions of this research are as follows: (1) The capacity of existing shear connectors must be checked as part of a bridge strengthening program. Determination of the concrete deck strength in advance of bridge strengthening is also recommended. (2) The ultimate capacity of angle-plus-bar shear connectors can be computed on the basis of a modified AASHTO channel connector formula and an angle-to-beam weld capacity check. 
(3) Existing shear connector capacity can be augmented by means of double-nut high strength bolt connectors. (4) Post-tensioning did not significantly affect truck load distribution for right angle or skewed bridges. (5) Approximate post-tensioning and truck load distribution for actual bridges can be predicted by orthotropic plate theory for vertical load; however, the agreement between actual distribution and theoretical distribution is not as close as that measured for the laboratory model in Phase I. (6) The right angle bridge exhibited considerable end restraint at what would be assumed to be simple support. The construction details at bridge abutments seem to be the reason for the restraint. (7) The skewed bridge exhibited more end restraint than the right angle bridge. Both skew effects and construction details at the abutments accounted for the restraint. (8) End restraint in the right angle and skewed bridges reduced tension strains in the steel bridge beams due to truck loading, but also reduced the compression strains caused by post-tensioning.

Relevância: 30.00%

Resumo:

Single-trial analysis of human electroencephalography (EEG) has recently been proposed for better understanding the contribution of individual subjects to a group-analysis effect, as well as for investigating single-subject mechanisms. Independent Component Analysis (ICA) has been repeatedly applied to concatenated single-trial responses at the single-subject level in order to extract those components that resemble activities of interest. More recently, we have proposed a single-trial method based on topographic maps that determines which voltage configurations are reliably observed at the event-related potential (ERP) level, taking advantage of repetitions across trials. Here, we investigated the correspondence between the maps obtained by ICA and the topographies obtained by the single-trial clustering algorithm that best explained the variance of the ERP. To do this, we used exemplar data provided on the EEGLAB website, based on a dataset from a visual target detection task. We show that there is a robust correspondence both at the level of the activation time courses and at the level of the voltage configurations of a subset of relevant maps. We additionally show the estimated inverse solution (based on low-resolution electromagnetic tomography) of two corresponding maps occurring at approximately 300 ms post-stimulus onset, as estimated by the two aforementioned approaches. The spatial distributions of the estimated sources significantly correlated and had in common a right parietal activation within Brodmann's Area (BA) 40. Despite their differences in terms of theoretical bases, the consistency between the results of these two approaches shows that their underlying assumptions are indeed compatible.
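The map-correspondence test described above reduces, at its core, to computing the spatial (Pearson) correlation between a map obtained by ICA and a topography obtained by the clustering approach. The two 5-electrode maps below are invented for illustration.

```python
import math

# Sketch of the correspondence test: spatial Pearson correlation between an
# ICA map and a cluster topography.  The 5-electrode maps are invented.

ica_map     = [2.0, 1.0, 0.0, -1.0, -2.0]
cluster_map = [1.8, 1.1, -0.1, -0.9, -1.9]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(ica_map, cluster_map)
```

Because topographic maps are defined only up to polarity and scale, a high absolute correlation (here near 1) is what indicates that the two methods recovered the same spatial configuration.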