989 results for Massive Parallelization
Abstract:
The MAP-i doctoral program of the Universities of Minho, Aveiro and Porto
Abstract:
Results of a search for decays of massive particles to fully hadronic final states are presented. This search uses 20.3 fb⁻¹ of data collected by the ATLAS detector in √s = 8 TeV proton-proton collisions at the LHC. Signatures based on high jet multiplicities without requirements on the missing transverse momentum are used to search for R-parity-violating supersymmetric gluino pair production with subsequent decays to quarks. The analysis is performed using a requirement on the number of jets, in combination with separate requirements on the number of b-tagged jets, as well as a topological observable formed from the scalar sum of the mass values of large-radius jets in the event. Results are interpreted in the context of all possible branching ratios of direct gluino decays to various quark flavors. No significant deviation is observed from the expected Standard Model backgrounds estimated using jet counting as well as data-driven templates of the total-jet-mass spectra. Gluino pair decays to ten or more quarks via intermediate neutralinos are excluded for gluino masses below 1 TeV at a neutralino mass of 500 GeV. Direct gluino decays to six quarks are excluded for gluino masses below 917 GeV in light-flavor final states, and results for various flavor hypotheses are presented.
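The total-jet-mass observable referred to above is, at heart, the scalar sum of the invariant masses of the large-radius jets in an event. The following minimal Python sketch is only an informal illustration of that quantity (it is not the ATLAS analysis code); it assumes each jet is supplied as a four-momentum (E, px, py, pz) in consistent units, and the optional pT threshold is a hypothetical parameter:

import math

def jet_mass(E, px, py, pz):
    # Invariant mass of one jet from its four-momentum (natural units, c = 1)
    m2 = E**2 - (px**2 + py**2 + pz**2)
    return math.sqrt(max(m2, 0.0))  # clamp tiny negative values from rounding

def total_jet_mass(jets, min_pt=0.0):
    # Scalar sum of the masses of (large-radius) jets above an optional pT cut
    return sum(jet_mass(E, px, py, pz)
               for E, px, py, pz in jets
               if math.hypot(px, py) >= min_pt)

# Two made-up large-radius jets, (E, px, py, pz):
jets = [(500.0, 300.0, 250.0, 200.0), (420.0, -280.0, -200.0, 150.0)]
print(total_jet_mass(jets))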
Abstract:
Many extensions of the Standard Model posit the existence of heavy particles with long lifetimes. This article presents the results of a search for events containing at least one long-lived particle that decays at a significant distance from its production point into two leptons or into five or more charged particles. This analysis uses a data sample of proton-proton collisions at √s = 8 TeV corresponding to an integrated luminosity of 20.3 fb⁻¹ collected in 2012 by the ATLAS detector operating at the Large Hadron Collider. No events are observed in any of the signal regions, and limits are set on model parameters within supersymmetric scenarios involving R-parity violation, split supersymmetry, and gauge mediation. In some of the search channels, the trigger and search strategy are based only on the decay products of individual long-lived particles, irrespective of the rest of the event. In these cases, the provided limits can easily be reinterpreted in different scenarios.
Abstract:
This is the report of a rare case of endomyocardial fibrosis associated with massive calcification of the left ventricle in a male patient with dyspnea on heavy exertion that had begun 5 years earlier and evolved rapidly. Due to the lack of information and the absence of clinical signs characterizing impairment of other organs, the case was initially managed as a disease of pulmonary origin. With the evolution of the disease and in the presence of radiological images of heterogeneous opacification in the projection of the left ventricle, the diagnostic hypothesis of endomyocardial disease was established. This hypothesis was later confirmed by chest computed tomography. The patient died on the 16th day of the hospital stay, probably because of a lack of myocardial reserve, with clinical findings of refractory heart failure, possibly aggravated by pulmonary infection. This shows that a rare disease such as endomyocardial fibrosis associated with massive calcification of the left ventricle may be suspected on a simple chest X-ray and confirmed by computed tomography.
Abstract:
New biotechnologies, such as DNA molecular markers, make it possible to characterize the plant genome. Using the genomic information produced for hundreds or thousands of chromosomal positions allows superior genotypes to be identified in less time than traditional phenotypic selection requires. Most agronomically and economically important traits of cultivated plant species are controlled by polygenes that produce a phenotype with continuous variation and are highly affected by the environment. Their inheritance is complex, since it results from interactions among genes, on the same or different chromosomes, and from the interaction of genotype with environment, which complicates selection. These biotechnologies produce databases with large amounts of information and complex correlation structures that require specific biometric methods and models for their processing. Statistical models aimed at explaining the phenotype from massive genomic information require the estimation of a large number of parameters, and no methods within parametric statistics can address this problem efficiently. Moreover, the models must account for non-additivities (interactions) among gene effects and between these and the environment, which are also difficult to handle from a parametric standpoint. We hypothesize that the analysis of the association between phenotypic traits and molecular genotypes, characterized by abundant genomic information, could be carried out efficiently in the context of semiparametric mixed models and/or non-parametric methods based on machine-learning techniques. The objective of this project is to develop new data-analysis methods that allow the efficient use of massive genomic information in genetic evaluations of agro-biotechnological interest. The specific objectives include comparing, with respect to statistical and computational properties, parametric analytical strategies with semiparametric and non-parametric strategies. We will work with regression approaches to quantitative trait locus analysis under different strategies and scenarios (real and simulated) with different volumes of molecular-marker data. In the parametric area, special emphasis will be placed on mixed models, while in the non-parametric area we will evaluate neural-network algorithms, support vector machines, multivariate filters, LOESS-type smoothers, and recently introduced kernel-based methods. The semiparametric proposal will be based on a two-stage analysis strategy aimed at: 1) reducing the dimensionality of the genomic data and 2) modelling the phenotype by introducing only the most significant molecular signals. With this work we expect to make available to researchers in our community new tools and analysis procedures that maximize the efficiency of the resources allocated to massive genomic data capture and its application in agro-biotechnological developments.
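The two-stage strategy outlined above (first reduce the dimensionality of the marker data, then model the phenotype on the retained signals) can be illustrated with a short Python sketch. It is only a schematic example under assumed choices: the simulated marker matrix, the use of PCA for stage 1, and ordinary linear regression for stage 2 are illustrative stand-ins, not the methods proposed in the project:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# Hypothetical data: 200 genotypes scored at 5000 marker positions
markers = rng.integers(0, 3, size=(200, 5000)).astype(float)         # 0/1/2 allele counts
phenotype = markers[:, :10].sum(axis=1) + rng.normal(0.0, 1.0, 200)  # signal in 10 markers

# Stage 1: reduce the dimensionality of the genomic data
pca = PCA(n_components=20)
signals = pca.fit_transform(markers)

# Stage 2: model the phenotype using only the retained molecular signals
model = LinearRegression().fit(signals, phenotype)
print("R^2 on the training data:", model.score(signals, phenotype))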
Abstract:
In this paper we investigate various algorithms for performing the Fast Fourier Transform (FFT)/Inverse Fast Fourier Transform (IFFT), and proper techniques for maximizing the FFT/IFFT execution speed, such as pipelining or parallel processing, and the use of memory structures with pre-computed values (look-up tables, LUTs) or other dedicated hardware components (usually multipliers). Furthermore, we discuss the optimal hardware architectures that best apply to various FFT/IFFT algorithms, along with their abilities to exploit parallel processing with minimal data dependencies in the FFT/IFFT calculations. An interesting approach that is also considered in this paper is the application of the integrated processing-in-memory Intelligent RAM (IRAM) chip to high-speed FFT/IFFT computing. The results of the assessment study emphasize that the execution speed of the FFT/IFFT algorithms is tightly connected to the capabilities of the FFT/IFFT hardware to support the parallelism provided by the given algorithm. Therefore, we suggest that the basic Discrete Fourier Transform (DFT)/Inverse Discrete Fourier Transform (IDFT) can also provide high performance by utilizing a specialized FFT/IFFT hardware architecture that can exploit the parallelism provided by the DFT/IDFT operations. The proposed improvements include simplified multiplications over symbols given in the polar coordinate system, using sine and cosine look-up tables, and an approach for performing parallel addition of N input symbols.
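One of the ideas above, replacing repeated trigonometric evaluation with pre-computed sine/cosine look-up tables, can be mimicked in software. The sketch below is an informal, hypothetical analogue of that LUT approach for a plain O(N²) DFT, not the hardware architecture proposed in the paper:

import cmath, math

def dft_with_lut(x):
    # Naive O(N^2) DFT that reads its twiddle factors from a table built once,
    # instead of recomputing e^{-2*pi*i*k*n/N} for every (k, n) pair.
    N = len(x)
    lut = [cmath.exp(-2j * math.pi * k / N) for k in range(N)]  # N distinct twiddles
    return [sum(x[n] * lut[(k * n) % N] for n in range(N)) for k in range(N)]

# Quick check on a small input
print(dft_with_lut([1.0, 2.0, 3.0, 4.0]))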
Abstract:
Recurrent chromosomal translocations associated with peripheral T-cell lymphomas (PTCL) are rare. Here, we report a case of PTCL, not otherwise specified (NOS), with the karyotype 46,Y,add(X)(p22),t(6;14)(p25;q11) and FISH-proven breakpoints in the IRF4 and TCRAD loci, leading to juxtaposition of both genes. A 64-year-old male patient presented with mild cytopenias and massive splenomegaly. Splenectomy showed diffuse red pulp involvement by a pleomorphic medium- to large-cell T-cell lymphoma with a CD2+ CD3+ CD5- CD7- CD4+ CD8+/- CD30- TCRbeta-F1+ immunophenotype, an activated cytotoxic profile, and strong MUM1 expression. The clinical course was marked by disease progression in the bone marrow under treatment and death at 4 months. In contrast with two previously reported t(6;14)(p25;q11.2)-positive lymphomas, both cytotoxic PTCL, NOS with bone marrow and skin involvement, this case was manifested by massive splenomegaly, expanding the clinical spectrum of PTCLs harboring t(6;14)(p25;q11.2) and supporting consideration of this translocation as a marker of biological aggressiveness.
Abstract:
The first Variscan pseudo-adakites were identified in close association with the Saint-Jean-du-Doigt (SJDD) mafic intrusion (Brittany, France) in a geodynamic context unrelated to subduction. These rocks are trondhjemites emplaced 347 ± 4 Ma ago as 2-3 km² bodies and dykes. Trace-element concentrations and Sr-Nd-Pb isotope ratios indicate that the SJDD pseudo-adakites probably resulted from extreme differentiation of an SJDD-type hydrous basaltic magma in a lower continental crust of normal thickness (0.8 GPa). Modelling shows that garnet is not a required phase, which was commonly believed to be the case for continental arc-derived adakite-like rocks. A massive fractionation of amphibole fits the data much better and does not require high pressures, in agreement with the inferred extensional tectonic regime at the time of pluton emplacement. Alternatively, the SJDD pseudo-adakites could have resulted from the melting of newly underplated SJDD mafic precursors, but thermal considerations lead us to believe that this was not the case.
Abstract:
Vitamin K deficiency bleeding within the first 24 h of life is caused in most cases by maternal drug intake (e.g. coumarins, anticonvulsants, tuberculostatics) during pregnancy. Haemorrhage is often life-threatening and usually not prevented by vitamin K prophylaxis at birth. We report a case of severe intracranial bleeding at birth secondary to phenobarbital-induced vitamin K deficiency and traumatic delivery. Burr-hole trepanations of the skull were performed and the subdural haematoma was evacuated. Despite the severe prognosis, the infant showed an unexpectedly good recovery. At the age of 3 years, neurological examinations were normal, as was the EEG at the age of 9 months. CT showed close-to-normal intracranial structures. CONCLUSION: This case report stresses the importance of antenatal vitamin K prophylaxis and the consideration of a primary Caesarean section in maternal vitamin K deficiency states, and demonstrates the successful management of massive subdural haemorrhage by a limited surgical approach.
Abstract:
More than one hundred diseases can present with hemoptysis, which may reflect a potentially serious underlying pathology. It is massive in only 5% of cases, but then becomes an often dramatic clinical entity, fatal in 30% of cases, which requires a multidisciplinary approach in an intensive care setting. A few clinical cases introduce the discussion of the diagnostic and therapeutic aspects of their management. Once immediate survival has been ensured, the source of the bleeding is localized by endoscopy, which also allows endobronchial tamponade to be performed if needed. Arteriography should then be carried out in an attempt to achieve hemostasis by embolization of the bronchial arterial network, which is responsible for the episode of hemoptysis in the majority of cases.
Abstract:
With the advent of high-performance computing, it is now possible to achieve orders-of-magnitude gains in performance and computational efficiency over conventional computer architectures. This thesis explores the potential of using high-performance computing to accelerate whole genome alignment. A parallel technique is applied to an algorithm for whole genome alignment; the technique is explained and some experiments were carried out to test it. The technique is based on a fair use of the available resources to execute genome alignment, and on how this can be exploited in HPC clusters. This work is a first approximation to whole genome alignment, and it shows the advantages of parallelism and some of the drawbacks of our technique. The work describes the resource limitations of current WGA applications when dealing with large quantities of sequences, and proposes a parallel heuristic to distribute the load while ensuring that alignment quality is maintained.
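The load-distribution idea described in this abstract can be illustrated, very loosely, with a multiprocessing sketch in Python. It is not the thesis's heuristic: the toy k-mer "score" stands in for a real pairwise aligner, and the genome strings are invented for the example:

from multiprocessing import Pool
from itertools import combinations

def align(pair):
    # Placeholder pairwise "alignment": counts shared 3-mers between two sequences.
    # A real whole-genome-alignment pipeline would invoke an actual aligner here.
    a, b = pair
    kmers = lambda s: {s[i:i + 3] for i in range(len(s) - 2)}
    return (a, b, len(kmers(a) & kmers(b)))

if __name__ == "__main__":
    genomes = {"g1": "ACGTACGTGA", "g2": "ACGTTTGTGA", "g3": "TTGTGAACGT"}
    # Enumerate all pairwise jobs and spread them over a pool of worker processes
    jobs = [(genomes[x], genomes[y]) for x, y in combinations(genomes, 2)]
    with Pool(processes=4) as pool:
        for result in pool.map(align, jobs):
            print(result)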
Abstract:
Acute massive pulmonary embolism (PE) is a life-threatening event. Before the era of cardiopulmonary bypass, acute pulmonary embolectomy was historically attempted in patients with severe hemodynamic compromise. The Klippel-Trenaunay syndrome (KTS) represents a significant life-long risk for major thromboembolic events. We present two young patients with Klippel-Trenaunay syndrome who survived surgical embolectomy after massive PE and cardiopulmonary resuscitation, with good postoperative recovery. Even though the role of surgical embolectomy in massive PE is not clearly defined, with current technology it can be life-saving and can lead to a complete recovery, especially in young patients such as those described in this study.