913 results for sparse reconstruction
Abstract:
The main goal of the present work is to use mineralogical data from the sediment fine fractions (silt and clay) of Quaternary littoral deposits to define a more detailed vertical zonation and to discriminate the most significant morphoclimatic changes affecting sediment source and deposition areas. The analysis of the available mineralogical data reveals a vertical evolution of the mineral composition. The following aspects deserve particular reference: 1) the fine fractions (<38 µm) are composed of quartz and phyllosilicates associated with feldspars, prevailing over other minerals; however, in certain sections iron hydroxides and evaporitic minerals occur in significant amounts; 2) the clay fractions (<2 µm) show a general prevalence of illite associated with kaolinite, with oscillations, in relative terms, of the kaolinite and illite contents. Qualitative and quantitative lateral and vertical variations of clay and non-clay minerals allow the discrimination of sedimentary sequences and the establishment of the rhythmicity and periodicity of the Quaternary morphoclimatic episodes that occurred at the Cortegaça and Maceda beaches. Each of the sedimentary sequences corresponds, in a first stage, to a littoral environment that became increasingly continental. The climate would have been mild to cold, sometimes with humidity-aridity oscillations. Warmer and moister episodes alternated with cooler and drier ones.
Abstract:
Sparse matrix-vector multiplication (SMVM) is a fundamental operation in many scientific and engineering applications. Sparse matrices often have thousands of rows and columns in which most entries are zero, with the non-zero data scattered across the matrix. This lack of data locality reduces the effectiveness of the data cache in general-purpose processors, significantly lowering their performance efficiency compared with dense matrix multiplication. In this paper, we propose a parallel processing solution for SMVM on a many-core architecture. The architecture is tested with well-known benchmarks on a ZYNQ-7020 FPGA. It is scalable in the number of core elements and limited only by the available memory bandwidth, achieving performance efficiencies of up to almost 70% and better performance than previous FPGA designs.
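To illustrate the operation the paper accelerates, the following is a minimal sketch of SMVM over the standard Compressed Sparse Row (CSR) layout; this is a generic illustration, not the paper's FPGA implementation, and all names and values are made up:

```python
def csr_spmv(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix stored in Compressed Sparse Row form."""
    n_rows = len(row_ptr) - 1
    y = [0.0] * n_rows
    for i in range(n_rows):
        acc = 0.0
        for k in range(row_ptr[i], row_ptr[i + 1]):
            # the indirect access x[col_idx[k]] is what defeats the data cache
            acc += values[k] * x[col_idx[k]]
        y[i] = acc
    return y

# Example: A = [[1, 0, 2], [0, 3, 0]] stored in CSR
values, col_idx, row_ptr = [1.0, 2.0, 3.0], [0, 2, 1], [0, 2, 3]
print(csr_spmv(values, col_idx, row_ptr, [1.0, 1.0, 1.0]))  # [3.0, 3.0]
```

The outer loop over rows is independent, which is what makes a per-row distribution across many cores natural, memory bandwidth permitting.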
Abstract:
Master's dissertation in Electrical and Computer Engineering - Autonomous Systems branch
Abstract:
The parallel hyperspectral unmixing problem is considered in this paper. A semi-supervised approach is developed under the linear mixture model, in which the physical constraints on the abundances are taken into account. The proposed approach relies on the increasing availability of spectral libraries of materials measured on the ground, instead of resorting to endmember extraction methods. Since libraries are potentially very large and hyperspectral datasets are of high dimensionality, a parallel implementation in a pixel-by-pixel fashion is derived that properly exploits the graphics processing unit (GPU) architecture at a low level, thus taking full advantage of the computational power of GPUs. Experimental results obtained for real hyperspectral datasets reveal significant speedup factors, up to 164 times, with respect to an optimized serial implementation.
Abstract:
In this paper, a new parallel method for sparse spectral unmixing of remotely sensed hyperspectral data on commodity graphics processing units (GPUs) is presented. A semi-supervised approach is adopted, which relies on the increasing availability of spectral libraries of materials measured on the ground instead of resorting to endmember extraction methods. The method is based on sparse unmixing by variable splitting and augmented Lagrangian (SUNSAL), which estimates the materials' abundance fractions. The parallel method works in a pixel-by-pixel fashion, and its implementation properly exploits the GPU architecture at a low level, thus taking full advantage of the computational power of GPUs. Experimental results obtained for simulated and real hyperspectral datasets reveal significant speedup factors, up to 164 times, with respect to an optimized serial implementation.
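The per-pixel core of library-based unmixing can be sketched as follows. SUNSAL itself uses an ADMM splitting; for illustration only, this sketch solves the same linear-mixture abundance problem with a simpler projected-gradient solver under the nonnegativity constraint. The library matrix and pixel spectrum below are hypothetical toy values:

```python
def matvec(A, x):
    return [sum(a * b for a, b in zip(row, x)) for row in A]

def matTvec(A, y):
    return [sum(A[i][j] * y[i] for i in range(len(A))) for j in range(len(A[0]))]

def nnls_pg(A, y, step, iters=2000):
    """Minimise ||A a - y||^2 subject to a >= 0 (abundance nonnegativity)."""
    a = [0.0] * len(A[0])
    for _ in range(iters):
        r = [p - q for p, q in zip(matvec(A, a), y)]  # residual A a - y
        g = matTvec(A, r)                             # (half) gradient A^T r
        a = [max(0.0, ai - step * gi) for ai, gi in zip(a, g)]
    return a

# Toy spectral library: 2 signatures over 3 bands (hypothetical values)
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
y = [0.3, 0.7, 1.0]                  # pixel = 0.3 * m1 + 0.7 * m2
abund = nnls_pg(A, y, step=1.0 / 3.0)
print([round(v, 3) for v in abund])  # [0.3, 0.7]
```

Because every pixel is solved independently against the same library, one GPU thread (or block) per pixel is the natural mapping the papers exploit.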
Abstract:
13th International Conference on Autonomous Robot Systems (Robotica), 2013, Lisboa
Abstract:
Upper eyelid tumours, particularly basal cell carcinomas, are relatively frequent. Surgical ablation of these lesions creates defects of variable complexity. Although several options are available for lower eyelid reconstruction, fewer surgical alternatives exist for upper eyelid reconstruction. Large defects of this region are usually reconstructed with two-step procedures. In 1997, Okada et al. described a horizontal V-Y myotarsocutaneous advancement flap for reconstruction of a large upper eyelid defect in a single operative time. However, no further studies were published regarding the use of this particular flap in upper eyelid reconstruction. In addition, this flap is not described in most plastic surgery textbooks. The authors report here their experience with 16 cases of horizontal V-Y myotarsocutaneous advancement flaps used to reconstruct full-thickness defects of the upper eyelid after tumour excision. The tumour histological types were as follows: 12 basal cell carcinomas, 2 squamous cell carcinomas, 1 sebaceous cell carcinoma and 1 malignant melanoma. This technique allowed closure of defects of up to 60% of the eyelid width. None of the flaps suffered necrosis. The mean operative time was 30 min. No additional procedures were necessary, as good functional and cosmetic results were achieved in all cases. No recurrences were noted. In this series, the horizontal V-Y myotarsocutaneous advancement flap proved to be a technically simple, reliable and expeditious option for reconstruction of full-thickness upper eyelid defects (as wide as 60% of the eyelid width) in a single operative procedure. In the future this technique may become the preferred option for such defects.
Abstract:
In cases of extensive damage to the foot, with significant bone loss, it is generally accepted that reconstruction must include bone flaps or grafts either in the emergency setting or subsequently. In this report, we describe the case of an 18-year-old student with an avulsion injury of the dorsum of his right foot. Consequently, he lost most of the soft tissue over the dorsum of the foot and the cuboid, navicular, and cuneiform bones. A latissimus dorsi free flap was used to reconstruct the defect. A functional pseudoarthrosis developed between the remaining bones of the foot, and the patient experienced satisfactory foot function after rehabilitation. For this reason, no additional reconstructive procedure was undertaken. This case suggests that it might be adequate to use the latissimus dorsi muscle flap more liberally than previously reported in the reconstruction of extensive defects of the dorsum of the foot, including cases with significant bone loss. This option could avoid the morbidity and inconvenience of a second surgery and the need to harvest a bone flap or graft.
Abstract:
Dissertation submitted for the degree of Master in Informatics Engineering
Abstract:
Breast cancer is the most common cancer among women and a major public health problem. Worldwide, X-ray mammography is the current gold standard for medical imaging of breast cancer. However, it has some well-known limitations: false-negative rates of up to 66% in symptomatic women and false-positive rates of up to 60% are a continued source of concern and debate. These drawbacks have prompted the development of other imaging techniques for breast cancer detection, among them Digital Breast Tomosynthesis (DBT). DBT is a 3D radiographic technique that reduces the obscuring effect of tissue overlap and appears to address both the false-negative and the false-positive rates. The 3D images in DBT can only be obtained through image reconstruction methods. These methods play an important role in a clinical setting, since the reconstruction process must be both accurate and fast. This dissertation deals with the optimization of iterative algorithms through parallel computing on Graphics Processing Units (GPUs), using the Compute Unified Device Architecture (CUDA) to make the 3D reconstruction faster. Iterative algorithms have been shown to produce the highest-quality DBT images, but since they are computationally intensive, their clinical use is currently impractical. These algorithms have the potential to reduce patient dose in DBT scans. A method of integrating CUDA into Interactive Data Language (IDL) is proposed in order to accelerate the DBT image reconstructions; this method has never been attempted before for DBT. In this work, the system matrix calculation, the most computationally expensive part of the iterative algorithms, is accelerated. A speedup of 1.6 is achieved, demonstrating that GPUs can accelerate the IDL implementation.
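The dissertation does not name a specific iterative algorithm here, so as an assumed, illustrative example only, the sketch below shows an ART/Kaczmarz-style update, one classic iterative scheme in tomographic reconstruction. The system matrix A maps image voxels to detector measurements; computing A is the costly step offloaded to the GPU. The toy matrix and measurements are hypothetical:

```python
def art_reconstruct(A, y, sweeps=200, relax=1.0):
    """Kaczmarz sweeps: project the estimate onto each ray equation in turn."""
    x = [0.0] * len(A[0])
    for _ in range(sweeps):
        for ai, yi in zip(A, y):          # one update per measurement ray
            dot = sum(a * v for a, v in zip(ai, x))
            norm2 = sum(a * a for a in ai)
            c = relax * (yi - dot) / norm2
            x = [v + c * a for v, a in zip(x, ai)]
    return x

# Toy 2-ray, 2-voxel system (hypothetical numbers): exact solution is [1, 2]
A = [[1.0, 1.0], [1.0, 0.0]]
y = [3.0, 1.0]
print([round(v, 3) for v in art_reconstruct(A, y)])  # [1.0, 2.0]
```

Each sweep touches every entry of the system matrix, which is why accelerating the system matrix computation dominates the overall speedup.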
Abstract:
The release of chloroethene compounds into the environment often results in groundwater contamination, which puts people at risk of exposure through drinking contaminated water. The accumulation of cDCE (cis-1,2-dichloroethene) in subsurface environments is a common environmental problem, caused by stagnation and the partial degradation of precursor chloroethene species. Polaromonas sp. strain JS666 apparently requires no exotic growth factors to be used as a bioaugmentation agent for aerobic cDCE degradation. Although it is the only suitable microorganism found to be capable of this, further studies are needed to improve the intrinsic bioremediation rates and to fully understand the metabolic processes involved. To that end, a metabolic model, iJS666, was reconstructed from genome annotation and the available literature. FVA (Flux Variability Analysis) and FBA (Flux Balance Analysis) techniques were used to satisfactorily validate the predictive capabilities of the iJS666 model. The model was able to predict biomass growth for different previously tested conditions, made it possible to design key experiments for further model improvement and also produced viable predictions for the use of biostimulant metabolites in cDCE biodegradation.
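FBA is a linear programme: maximise the biomass flux subject to the steady-state mass balance S v = 0 and flux bounds. The sketch below is only an intuition-building toy, not the iJS666 model: for an unbranched pathway the mass balance forces all fluxes to be equal, so the optimum reduces to the tightest bound. The network, stoichiometry and bounds are hypothetical:

```python
def is_steady_state(S, v, tol=1e-9):
    """Check the FBA mass-balance constraint S v = 0."""
    return all(abs(sum(S[i][j] * v[j] for j in range(len(v)))) <= tol
               for i in range(len(S)))

# Toy chain: cDCE_ext -(v0)-> cDCE -(v1)-> intermediate -(v2)-> biomass
# Rows = internal metabolites (cDCE, intermediate); columns = reactions.
S = [[1, -1, 0],
     [0, 1, -1]]
ub = [10.0, 6.5, 8.0]   # hypothetical upper bounds (mmol/gDW/h)

# Steady state in an unbranched chain forces v0 = v1 = v2, so the
# maximal biomass flux is the bottleneck bound.
v_opt = [min(ub)] * 3
print(is_steady_state(S, v_opt), v_opt[2])  # True 6.5
```

Real genome-scale models such as iJS666 have thousands of reactions and branches, so a proper LP solver is required, but the constraint structure is the same.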
Abstract:
Geochemical and geochronological analyses of samples of surficial Acre Basin sediments and fossils indicate that an extensive fluvial-lacustrine system occupying this region desiccated slowly during the last glacial cycle (LGC). This research documents direct evidence for aridity in western Amazonia during the LGC and is important both in establishing boundary conditions for LGC climate models and in correlating marine and continental LGC climate conditions.
Abstract:
Ship tracking systems allow maritime organizations concerned with safety at sea to obtain information on the current location and route of merchant vessels. Thanks to space technology, in recent years the geographical coverage of ship tracking platforms has increased significantly, from radar-based near-shore traffic monitoring towards a worldwide picture of the maritime traffic situation. The long-range tracking systems currently in operation allow the storage of ship position data over many years: a valuable source of knowledge about the shipping routes between different ocean regions. The outcome of this Master's project is a software prototype for estimating the most operated shipping route between any two geographical locations. The analysis is based on historical ship positions acquired with long-range tracking systems. The proposed approach applies a Genetic Algorithm to a training set of relevant ship positions extracted from the long-term tracking database of the European Maritime Safety Agency (EMSA). The analysis of some representative shipping routes is presented, and the quality of the results and their operational applications are assessed by a maritime safety expert.
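The abstract does not detail the encoding or fitness function, so the following is a generic, assumed sketch of the genetic-algorithm idea: a candidate route is a fixed-length sequence of grid cells, scored by how many historical ship positions fall into those cells. All cell coordinates and counts are made up for illustration:

```python
import random

def fitness(route, position_counts):
    """Score a route by the historical position density it passes through."""
    return sum(position_counts.get(cell, 0) for cell in route)

def evolve(cells, position_counts, route_len=4, pop=30, gens=60, seed=0):
    rng = random.Random(seed)
    population = [rng.sample(cells, route_len) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda r: fitness(r, position_counts), reverse=True)
        parents = population[: pop // 2]          # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, route_len)     # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # mutation
                child[rng.randrange(route_len)] = rng.choice(cells)
            children.append(child)
        population = parents + children
    return max(population, key=lambda r: fitness(r, position_counts))

# Toy "historical AIS density" per grid cell (hypothetical)
counts = {(0, 0): 50, (0, 1): 40, (1, 1): 45, (2, 2): 5, (3, 3): 1}
cells = [(i, j) for i in range(4) for j in range(4)]
best = evolve(cells, counts)
print(best, fitness(best, counts))
```

A real implementation would additionally constrain routes to be connected and avoid land, but selection, crossover and mutation over route candidates follow this pattern.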
Abstract:
The number of houses damaged or destroyed in disasters is frequently large, and re-housing homeless people is one of the most important tasks of reconstruction programmes. Reconstruction works often take a long time, and during that period it is essential to provide victims with the minimum conditions to live with dignity, privacy, and protection. This research intends to demonstrate the crucial role of temporary accommodation buildings in providing spaces where people can live and gradually resume their lives until they have a permanent house. The study also aims to identify the main problems of temporary accommodation strategies and to discuss some principles and guidelines for reaching better design solutions. It is found that temporary accommodation goes beyond the simple provision of buildings, since the whole space of the temporary settlement matters. Likewise, temporary accommodation is a process that should start before a disaster occurs, as preventive pre-planning. Despite being temporary constructions, these housing buildings are among the most important elements to provide in emergency scenarios, contributing to better recovery and reconstruction actions.
Abstract:
PhD Thesis in Bioengineering