990 results for Neutrino oscillations, SAND, DUNE, LArTPC, Event reconstruction, LAr imaging


Relevance:

20.00%

Publisher:

Abstract:

LJM11, an abundant salivary protein from the sand fly Lutzomyia longipalpis, belongs to the insect "yellow" family of proteins. In this study, we immunized mice with 17 plasmids encoding L. longipalpis salivary proteins and demonstrated that LJM11 confers protective immunity against Leishmania major infection. This protection correlates with a strong induction of a delayed type hypersensitivity (DTH) response following exposure to L. longipalpis saliva. Additionally, splenocytes of exposed mice produce IFN-γ upon stimulation with LJM11, demonstrating the systemic induction of Th1 immunity by this protein. In contrast to LJM11, LJM111, another yellow protein from L. longipalpis saliva, does not produce a DTH response in these mice, suggesting that structural or functional features specific to LJM11 are important for the induction of a robust DTH response. To examine these features, we used calorimetric analysis to probe a possible ligand binding function for the salivary yellow proteins. LJM11, LJM111, and LJM17 all acted as high affinity binders of prohemostatic and proinflammatory biogenic amines, particularly serotonin, catecholamines, and histamine. We also determined the crystal structure of LJM11, revealing a six-bladed β-propeller fold with a single ligand binding pocket located in the central part of the propeller structure on one face of the molecule. A hypothetical model of LJM11 suggests a positive electrostatic potential on the face containing entry to the ligand binding pocket, whereas LJM111 is negative to neutral over its entire surface. This may be the reason for differences in antigenicity between the two proteins.

Relevance:

20.00%

Publisher:

Abstract:

One of the main problems when performing contour analysis is the large amount of data involved in describing a shape. To address this, parameterization is applied: representative data are extracted from a contour with as few coefficients as possible, from which the contour can later be reconstructed without obvious loss of information. For closed contours, the most widely studied parameterization is the discrete Fourier transform (DFT), applied to the sequences of x and y coordinate values along all the points of the trace. In contrast, the DFT cannot be applied directly to open contours, because it requires the x and y values to be equal at the first and last points of the contour. This is because the DFT represents periodic signals without error; if the signals do not end at the same point, there is a discontinuity and oscillations appear in the reconstruction. The aim of this work is to parameterize open contours with the same efficiency achieved for closed contours. To this end, a program was designed that applies the DFT to open contours by modifying the x and y sequences. In addition, other applications were developed in Matlab that made it possible to examine different aspects of the parameterization and the behaviour of the Elliptic Fourier Descriptors (EFD). The results show that the designed application parameterizes open contours with optimal compression, which will facilitate quantitative shape analysis in fields such as ecology, medicine and geography, among others.
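A minimal sketch of the idea, in Python rather than the Matlab used in the work, and assuming a mirror (even) extension of the coordinate sequences as one common way to remove the endpoint discontinuity; the program described above may modify the sequences differently:

```python
import numpy as np

def open_contour_dft(x, y, n_coeffs):
    """Parameterize an open contour with the DFT by first removing the
    endpoint discontinuity: each coordinate sequence is mirrored so the
    extended signal is periodic without a jump, which suppresses the
    oscillations that a plain periodic representation would produce."""
    def compress(sig):
        sig = np.asarray(sig, dtype=float)
        ext = np.concatenate([sig, sig[-2:0:-1]])   # even (mirror) extension
        spec = np.fft.rfft(ext)
        spec[n_coeffs:] = 0                         # keep only n_coeffs harmonics
        rec = np.fft.irfft(spec, n=len(ext))
        return rec[:len(sig)]                       # recover the open trace
    return compress(x), compress(y)

# usage: reconstruct a parabola-like open trace from 10 harmonics
t = np.linspace(0.0, 1.0, 200)
xr, yr = open_contour_dft(t, (t - 0.5) ** 2, n_coeffs=10)
```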

Relevance:

20.00%

Publisher:

Abstract:

We present an open-source ITK implementation of a direct Fourier method for tomographic reconstruction, applicable to parallel-beam x-ray images. Direct Fourier reconstruction makes use of the central-slice theorem to build a polar 2D Fourier space from the 1D transformed projections of the scanned object, which is then resampled onto a Cartesian grid. An inverse 2D Fourier transform finally yields the reconstructed image. Additionally, we provide a complex wrapper to the BSplineInterpolateImageFunction to overcome ITK's current lack of image interpolators dealing with complex data types. A sample application is presented and extensively illustrated on the Shepp-Logan head phantom. We show that appropriate input zero-padding and 2D-DFT oversampling rates, together with radial cubic b-spline interpolation, improve 2D-DFT interpolation quality and are efficient remedies to reduce reconstruction artifacts.
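A rough NumPy/SciPy sketch of the same central-slice idea, not the ITK implementation described above: it uses simple linear gridding instead of the complex radial B-spline interpolation, and the function and parameter names (e.g. `pad_factor`, which stands in for the zero-padding/oversampling mentioned in the abstract) are illustrative:

```python
import numpy as np
from scipy.interpolate import griddata

def direct_fourier_reconstruction(sinogram, angles_deg, pad_factor=4):
    """Direct Fourier reconstruction of parallel-beam projections.
    sinogram: (n_angles, n_detectors); angles_deg: projection angles."""
    n_ang, n_det = sinogram.shape
    n_pad = pad_factor * n_det                      # zero-pad the projections

    # Central-slice theorem: each 1D-transformed projection is a radial
    # line of the object's 2D Fourier transform.
    proj_fft = np.fft.fftshift(
        np.fft.fft(np.fft.ifftshift(sinogram, axes=1), n=n_pad, axis=1), axes=1)
    freqs = np.fft.fftshift(np.fft.fftfreq(n_pad))
    theta = np.deg2rad(angles_deg)[:, None]
    u = freqs[None, :] * np.cos(theta)
    v = freqs[None, :] * np.sin(theta)

    # Resample the polar samples onto a Cartesian frequency grid.
    grid = np.fft.fftshift(np.fft.fftfreq(n_det))
    U, V = np.meshgrid(grid, grid)
    pts = (u.ravel(), v.ravel())
    F = (griddata(pts, proj_fft.real.ravel(), (U, V), method="linear", fill_value=0.0)
         + 1j * griddata(pts, proj_fft.imag.ravel(), (U, V), method="linear", fill_value=0.0))

    # Inverse 2D FFT yields the reconstructed slice.
    img = np.fft.fftshift(np.fft.ifft2(np.fft.ifftshift(F)))
    return np.real(img)
```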

Relevance:

20.00%

Publisher:

Abstract:

We propose an algorithm that extracts image features that are consistent with the 3D structure of the scene. The features can be robustly tracked over multiple views and serve as vertices of planar patches that suitably represent scene surfaces, while reducing the redundancy in the description of 3D shapes. In other words, the extracted features offer good tracking properties while providing the basis for 3D reconstruction with minimum model complexity.

Relevance:

20.00%

Publisher:

Abstract:

The development and tests of an iterative reconstruction algorithm for emission tomography based on Bayesian statistical concepts are described. The algorithm uses the entropy of the generated image as a prior distribution, can be accelerated by the choice of an exponent, and converges uniformly to feasible images by the choice of one adjustable parameter. A feasible image has been defined as one that is consistent with the initial data (i.e. it is an image that, if truly a source of radiation in a patient, could have generated the initial data by the Poisson process that governs radioactive disintegration). The fundamental ideas of Bayesian reconstruction are discussed, along with the use of an entropy prior with an adjustable contrast parameter, the use of likelihood with data increment parameters as conditional probability, and the development of the new fast maximum a posteriori with entropy (FMAPE) algorithm by the successive substitution method. It is shown that in the maximum likelihood estimator (MLE) and FMAPE algorithms, the only correct choice of initial image for the iterative procedure in the absence of a priori knowledge about the image configuration is a uniform field.
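The abstract does not spell out the FMAPE update itself, so the following is only a sketch of the underlying maximum-likelihood (MLEM) iteration it builds on, started from a uniform image as recommended above; the system matrix `A` and its discretisation are assumptions for illustration:

```python
import numpy as np

def mlem(A, counts, n_iter=50):
    """Maximum-likelihood EM reconstruction for emission tomography.
    A: (n_bins, n_pixels) system matrix; counts: measured Poisson data.
    The iteration starts from a uniform field, the only unbiased choice
    in the absence of prior knowledge about the image configuration."""
    sens = A.T @ np.ones(A.shape[0])                        # sensitivity image A^T 1
    x = np.full(A.shape[1], counts.sum() / sens.sum())      # uniform starting image
    for _ in range(n_iter):
        proj = A @ x                                        # forward projection
        ratio = np.divide(counts, proj, out=np.zeros_like(proj), where=proj > 0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)        # multiplicative EM update
    return x
```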

Relevance:

20.00%

Publisher:

Abstract:

In this paper we present a Bayesian image reconstruction algorithm with entropy prior (FMAPE) that uses a space-variant hyperparameter. The spatial variation of the hyperparameter allows different degrees of resolution in areas of different statistical characteristics, thus avoiding the large residuals resulting from algorithms that use a constant hyperparameter. In the first implementation of the algorithm, we begin by segmenting a Maximum Likelihood Estimator (MLE) reconstruction. The segmentation method is based on a wavelet decomposition and a self-organizing neural network. The result is a predetermined number of extended regions plus a small region for each star or bright object. To assign a different value of the hyperparameter to each extended region and star, we use either feasibility tests or cross-validation methods. Once the set of hyperparameters is obtained, we carry out the final Bayesian reconstruction, leading to a reconstruction with decreased bias and excellent visual characteristics. The method has been applied to data from the non-refurbished Hubble Space Telescope. The method can also be applied to ground-based images.
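The segmentation (wavelet decomposition plus self-organizing network) and the Bayesian update are not reproduced here; the sketch below only illustrates, with assumed names and values, how per-region hyperparameters chosen by feasibility tests or cross-validation could be expanded into a space-variant hyperparameter image:

```python
import numpy as np

def hyperparameter_map(labels, beta_per_region):
    """Build a space-variant hyperparameter image from a segmentation.
    labels: integer label image (one label per extended region or star);
    beta_per_region: dict mapping label -> hyperparameter value."""
    beta = np.zeros(labels.shape, dtype=float)
    for lab, value in beta_per_region.items():
        beta[labels == lab] = value
    return beta

# usage: stronger regularisation in the background (label 0) than on a bright object (label 1)
labels = np.zeros((64, 64), dtype=int)
labels[20:30, 20:30] = 1
beta = hyperparameter_map(labels, {0: 5.0, 1: 0.5})
```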

Relevance:

20.00%

Publisher:

Abstract:

We present a novel approach for analyzing single-trial electroencephalography (EEG) data using topographic information. The method allows event-related potentials to be visualized using all the electrodes of the recording, overcoming the problem of previous approaches, which required electrode selection and waveform filtering. We apply this method to EEG data from an auditory object recognition experiment that we have previously analyzed at an ERP level. Temporally structured periods wherein a given topography predominated were statistically identified without any prior information about the temporal behavior. In addition to providing novel methods for EEG analysis, the data indicate that ERPs are reliably observable at the single-trial level when examined topographically.
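A simplified sketch of the kind of topographic fitting such an analysis relies on; the actual single-trial method is not fully specified in the abstract, and the template maps, array shapes and names here are illustrative:

```python
import numpy as np

def dominant_topography(eeg, templates):
    """Assign each time point of a single trial to the template map whose
    topography it matches best (spatial correlation), ignoring strength.
    eeg: (n_channels, n_times); templates: (n_maps, n_channels)."""
    X = eeg - eeg.mean(axis=0, keepdims=True)              # average reference per sample
    T = templates - templates.mean(axis=1, keepdims=True)
    X = X / np.linalg.norm(X, axis=0, keepdims=True)       # unit-norm instantaneous maps
    T = T / np.linalg.norm(T, axis=1, keepdims=True)       # unit-norm template maps
    corr = T @ X                                           # (n_maps, n_times) spatial correlations
    return np.argmax(np.abs(corr), axis=0)                 # best-fitting map per sample

def global_field_power(eeg):
    """Standard deviation across channels at each time point."""
    return eeg.std(axis=0)
```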

Relevance:

20.00%

Publisher:

Abstract:

Interdisciplinary frameworks for studying natural hazards and their temporal trends have an important potential for data generation for risk assessment, land use planning, and therefore the sustainable management of resources. This paper focuses on the adjustments required because of the wide variety of scientific fields involved in the reconstruction and characterisation of flood events over the past 1000 years. The aim of this paper is to describe various methodological aspects of the study of flood events in their historical dimension, including the critical evaluation of old documentary and instrumental sources, flood-event classification and hydraulic modelling, and homogeneity and quality control tests. Standardized criteria for flood classification have been defined and applied to the Isère and Drac floods in France, from 1600 to 1950, and to the Ter, Llobregat and Segre floods in Spain, from 1300 to 1980. The analysis of the Drac and Isère data series from 1600 to the present day showed that extraordinary and catastrophic floods were not distributed uniformly in time. However, the largest floods (general catastrophic floods) were homogeneously distributed in time within the period 1600-1900. No major flood occurred in these rivers during the 20th century. From 1300 to the present day, no homogeneous behaviour was observed for extraordinary floods in the Spanish rivers. The largest floods were uniformly distributed in time within the period 1300-1900 for the Segre and Ter rivers.

Relevance:

20.00%

Publisher:

Abstract:

Within the scope of the European project Hydroptimet, INTERREG IIIB-MEDOCC programme, a limited area model (LAM) intercomparison is performed for intense events that caused considerable damage to people and territory. As the comparison is limited to single case studies, the work is not meant to provide a measure of the different models' skill, but to identify the key model factors needed to give a good forecast of this kind of meteorological phenomenon. This work focuses on the Spanish flash-flood event known as the "Montserrat-2000" event. The study uses forecast data from seven operational LAMs, placed at the partners' disposal via the Hydroptimet ftp site, and observed data from the Catalonia rain gauge network. To improve the event analysis, satellite rainfall estimates have also been considered. For statistical evaluation of quantitative precipitation forecasts (QPFs), several non-parametric skill scores based on contingency tables have been used. Furthermore, for each model run it has been possible to identify the regions of Catalonia affected by misses and false alarms using the contingency table elements. Moreover, the standard "eyeball" analysis of forecast and observed precipitation fields has been supported by a state-of-the-art diagnostic method, contiguous rain area (CRA) analysis. This method makes it possible to quantify the spatial shift in the forecast error and to identify the error sources that affected each model's forecasts. High-resolution modelling and domain size seem to play a key role in providing a skillful forecast. Further work, including verification against a wider observational data set, is needed to support this statement.
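For illustration, a small sketch of non-parametric skill scores derived from a 2x2 contingency table; POD, FAR and the equitable threat score are common choices, though the exact set used in the study is not listed in the abstract, and the CRA analysis is not reproduced:

```python
import numpy as np

def contingency_scores(forecast, observed, threshold):
    """Skill scores from the 2x2 contingency table of a precipitation
    threshold exceedance. forecast/observed: rain amounts at the same
    verification points (e.g. rain gauges)."""
    f = np.asarray(forecast) >= threshold
    o = np.asarray(observed) >= threshold
    hits = np.sum(f & o)
    false_alarms = np.sum(f & ~o)
    misses = np.sum(~f & o)
    correct_neg = np.sum(~f & ~o)
    n = hits + false_alarms + misses + correct_neg

    pod = hits / (hits + misses) if (hits + misses) else np.nan              # probability of detection
    far = false_alarms / (hits + false_alarms) if (hits + false_alarms) else np.nan
    hits_rand = (hits + misses) * (hits + false_alarms) / n                  # hits expected by chance
    denom = hits + misses + false_alarms - hits_rand
    ets = (hits - hits_rand) / denom if denom else np.nan                    # equitable threat score
    return {"POD": pod, "FAR": far, "ETS": ets,
            "misses": int(misses), "false_alarms": int(false_alarms)}
```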

Relevance:

20.00%

Publisher:

Abstract:

The 10 June 2000 event was the largest flash flood event that occurred in the Northeast of Spain in the late 20th century, both as regards its meteorological features and its considerable social impact. This paper focuses on the analysis of the structures that produced the heavy rainfall, especially from the point of view of meteorological radar. Because this case is a good example of a Mediterranean flash flood event, a final objective of this paper is to describe the evolution of the rainfall structure clearly enough to be understood by an interdisciplinary forum; it could then be useful not only for improving conceptual meteorological models, but also for application in downscaling models. The main precipitation structure was a Mesoscale Convective System (MCS) that crossed the region and developed as a consequence of the merging of two previous squall lines. The paper analyses the main meteorological features that led to the development and triggering of the heavy rainfall, with special emphasis on the features of this MCS, its life cycle and its dynamics. To this end, 2-D and 3-D algorithms were applied to the imagery recorded over the complete life cycle of the structures, which lasted approximately 18 h. Mesoscale and synoptic information were also considered. Results show that it was an NS-MCS, quasi-stationary during its mature stage as a consequence of the formation of a convective train, the different displacement directions of the 2-D and 3-D structures, including the propagation of new cells, and the slow movement of the convergence line associated with the Mediterranean mesoscale low.

Relevance:

20.00%

Publisher:

Abstract:

Astute control of brain activity states is critical for adaptive behaviours and survival. In mammals and birds, electroencephalographic recordings reveal alternating states of wakefulness, slow wave sleep and paradoxical sleep (or rapid eye movement sleep). This control is profoundly impaired in narcolepsy with cataplexy, a disease resulting from the loss of orexin/hypocretin neurotransmitter signalling in the brain. Narcolepsy with cataplexy is characterized by irresistible bouts of sleep during the day, sleep fragmentation during the night and episodes of cataplexy, a sudden loss of muscle tone while awake and experiencing emotions. The neural mechanisms underlying cataplexy are unknown, but commonly thought to involve those of rapid eye movement-sleep atonia, and cataplexy typically is considered as a rapid eye movement sleep disorder. Here we reassess cataplexy in hypocretin (Hcrt, also known as orexin) gene knockout mice. Using a novel video/electroencephalogram double-blind scoring method, we show that cataplexy is not a state per se, as believed previously, but a dynamic, multi-phased process involving a reproducible progression of states. A knockout-specific state and a stereotypical paroxysmal event were introduced to account for signals and electroencephalogram spectral characteristics not seen in wild-type littermates. Cataplexy almost invariably started with a brief phase of wake-like electroencephalogram, followed by a phase featuring high-amplitude irregular theta oscillations, defining an activity profile distinct from paradoxical sleep, referred to as cataplexy-associated state and in the course of which 1.5-2 s high-amplitude, highly regular, hypersynchronous paroxysmal theta bursts (∼7 Hz) occurred. In contrast to cataplexy onset, exit from cataplexy did not show a predictable sequence of activities. Altogether, these data contradict the hypothesis that cataplexy is a state similar to paradoxical sleep, even if long cataplexies may evolve into paradoxical sleep. Although not exclusive to overt cataplexy, cataplexy-associated state and hypersynchronous paroxysmal theta activities are highly enriched during cataplexy in hypocretin/orexin knockout mice. Their occurrence in an independent narcolepsy mouse model, the orexin/ataxin 3 transgenic mouse, undergoing loss of orexin neurons, was confirmed. Importantly, we document for the first time similar paroxysmal theta hypersynchronies (∼4 Hz) during cataplexy in narcoleptic children. Lastly, we show by deep recordings in mice that the cataplexy-associated state and hypersynchronous paroxysmal theta activities are independent of hippocampal theta and involve the frontal cortex. Cataplexy hypersynchronous paroxysmal theta bursts may represent medial prefrontal activity, associated in humans and rodents with reward-driven motor impulse, planning and conflict monitoring.

Relevance:

20.00%

Publisher:

Abstract:

This paper describes the development and applications of a super-resolution method known as Super-Resolution Variable-Pixel Linear Reconstruction. The algorithm combines different lower resolution images in order to obtain, as a result, a higher resolution image. We show that it can make significant spatial resolution improvements to satellite images of the Earth's surface, allowing recognition of objects with sizes approaching the limiting spatial resolution of the lower resolution images. The algorithm is based on the Variable-Pixel Linear Reconstruction algorithm developed by Fruchter and Hook, a well-known method in astronomy but never used for Earth remote sensing purposes. The algorithm preserves photometry, can weight input images according to the statistical significance of each pixel, and removes the effect of geometric distortion on both image shape and photometry. In this paper, we describe its development for remote sensing purposes, show that the algorithm remains useful for images as different from astronomical images as remote sensing images are, and show applications to: 1) a set of simulated multispectral images obtained from a real Quickbird image; and 2) a set of real multispectral Landsat Enhanced Thematic Mapper Plus (ETM+) images. These examples show that the algorithm provides a substantial improvement in limiting spatial resolution for both simulated and real data sets without significantly altering the multispectral content of the input low-resolution images, without amplifying the noise, and with very few artifacts.
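A toy sketch of the combination step in Python, not the actual algorithm: real Variable-Pixel Linear Reconstruction ("drizzle") distributes each input pixel's flux by the fractional overlap of a shrunken drop, carries per-pixel statistical weights and corrects geometric distortion, whereas this sketch simply drops each pixel at a single point of a finer grid and normalises by the accumulated weights:

```python
import numpy as np

def drizzle_combine(images, shifts, scale=2):
    """Combine shifted low-resolution frames onto a finer grid.
    images: list of (H, W) arrays; shifts: list of (dy, dx) subpixel offsets
    of each frame; scale: output oversampling factor."""
    H, W = images[0].shape
    out = np.zeros((H * scale, W * scale))
    wgt = np.zeros_like(out)
    yy, xx = np.mgrid[0:H, 0:W]
    for img, (dy, dx) in zip(images, shifts):
        oy = np.clip(np.round((yy + dy) * scale).astype(int), 0, H * scale - 1)
        ox = np.clip(np.round((xx + dx) * scale).astype(int), 0, W * scale - 1)
        np.add.at(out, (oy, ox), img)          # drop flux onto the fine grid
        np.add.at(wgt, (oy, ox), 1.0)          # accumulate per-pixel weights
    return np.divide(out, wgt, out=np.zeros_like(out), where=wgt > 0)
```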

Relevance:

20.00%

Publisher:

Abstract:

PURPOSE: To evaluate the cause of recurrent pathologic instability after anterior cruciate ligament (ACL) surgery and the effectiveness of revision reconstruction using a quadriceps tendon autograft with a 2-incision technique. TYPE OF STUDY: Retrospective follow-up study. METHODS: Between 1999 and 2001, 31 patients underwent ACL revision reconstruction because of recurrent pathologic instability during sports or daily activities. Twenty-eight patients were reviewed after a mean follow-up of 4.2 years (range, 3.3 to 5.6 years). The mean age at revision surgery was 27 years (range, 18 to 41 years). The average time from the primary procedure to revision surgery was 26 months (range, 9 to 45 months). A clinical, functional, and radiographic evaluation was performed; magnetic resonance imaging (MRI) or computed tomography (CT) scanning was also performed. The International Knee Documentation Committee (IKDC), Lysholm, and Tegner scales were used. KT-1000 arthrometer measurements (MEDmetric, San Diego, CA) were made by an experienced physician. RESULTS: Of the failures, 79% had radiographic evidence of malposition of their tunnels. In only 6 cases (21%) was the radiologic anatomy of tunnel placement judged to be correct on both the femoral and tibial sides. MRI or CT showed a femoral tunnel placed too centrally in 6 cases. After revision surgery, the position of the tunnels was corrected. A significant improvement in the Lachman and pivot-shift findings was observed. In particular, 17 patients had a negative Lachman test, and 11 patients had a grade I Lachman with a firm end point. Preoperatively, the pivot-shift test was positive in all cases; at last follow-up a grade 1+ was found in 7 patients (25%). Postoperatively, KT-1000 testing showed a mean manual maximum translation of 8.6 mm (SD, 2.34) for the affected knee; 97% of patients had a maximum manual side-to-side translation <5 mm. At the final postoperative evaluation, 26 patients (93%) graded their knees as normal or nearly normal according to the IKDC score. The mean Lysholm score was 93.6 (SD, 8.77) and the mean Tegner activity score was 6.1 (SD, 1.37). No patient required further revision. Five patients (18%) complained of hypersensitive scars from the reconstructive surgery that made kneeling difficult. CONCLUSIONS: There were satisfactory results after ACL revision surgery using a quadriceps tendon graft and a 2-incision technique at a minimum 3 years' follow-up; 93% of patients returned to sports activities. LEVEL OF EVIDENCE: Level IV, case series, no control group.

Relevance:

20.00%

Publisher:

Abstract:

Diversity patterns of ammonoids are analyzed and compared with the timing of anoxic deposits around the Cenomanian/Turonian (C/T) boundary in the Vocontian, Anglo-Paris, and Münster basins of Western Europe. Differing from most previous studies, which concentrate on a narrow time span bracketing the C/T boundary, the present analysis covers the latest Albian to Early Turonian interval, for which a high-resolution, ammonoid-based biochronology, including 34 Unitary Association zones, is now available. During the latest Albian-Middle Cenomanian interval, species richness of ammonoids reveals a dynamic equilibrium oscillating around an average of 20 species, whereas the Late Cenomanian-Early Turonian interval displays an equilibrium centered on an average value of 6 species. The abrupt transition between these two successive equilibria lasted no longer than two Unitary Associations. The onset of the decline in species richness thus largely predates the spread of oxygen-poor water masses onto the shelves, while minimal values of species richness coincide only with the Cenomanian-Turonian boundary. The decline in species richness during the entire Late Cenomanian seems to result from lower origination percentages rather than from higher extinction percentages. This result is also supported by the absence of statistically significant changes in the extinction probabilities of the poly-cohorts. Separate analyses of species richness for acanthoceratids and heteromorphs, the two essential components of the Cenomanian ammonoid community, reveal that heteromorphs declined sooner than acanthoceratids. Moreover, acanthoceratids showed a later decline at the genus level than at the species level. This decoupling is accompanied by a significant increase in the morphological disparity of acanthoceratids, expressed by the appearance of new genera. Lastly, during the Late Cenomanian, paedomorphic processes, juvenile innovations and reductions of adult size dominated the evolutionary radiation of acanthoceratids. Hence, the decrease in ammonoid species richness and their major evolutionary changes significantly predate the spread of anoxic deposits. Other environmental constraints, such as global flooding of platforms, a warmer and more equable climate, and productivity changes, correlate better with the timing of diversity changes and evolutionary patterns of ammonoids and therefore provide more likely causative mechanisms than anoxia alone.