974 results for Free source software
Abstract:
Airborne particulate matter (PM) is of environmental concern not only in urban but also in rural areas, because the easily inhalable particles, together with gaseous pollutants, have been considered responsible for possible health effects. The objective of this research was to generate an extensive data set for ambient PM collected at Belle Glade and Delray Beach and ultimately to use it, together with published source profiles, to predict the contributions of major sources to the overall airborne particle burden at both sites. Size-segregated particle sampling was conducted for one entire year. The samples collected during January and May were further subjected to chemical analysis for organic compounds by Gas Chromatography-Mass Spectrometry. Additionally, PM10 sampling was conducted simultaneously with the size-segregated sampling during January and May to analyze trace elements using the Instrumental Neutron Activation Analysis technique. Elements and organic marker compounds were used in Chemical Mass Balance modeling to determine the major source contributions to the ambient fine particulate matter burden. The size-segregated particle distributions were bimodal at both sampling sites. Sugarcane pre-harvest burning at the rural site elevated PM10 concentrations by about 30% during the sugarcane harvest season compared with the growing season. Sea salt particles and Saharan dust particles accounted for the external sources. The trace element analysis shows that Al, Ca, Cs, Eu, Lu, Nd, Sc, Sm, Th, and Yb are more abundant at the rural sampling site. The trace elements Ba, Br, Ce, Cl, Cr, Fe, Gd, Hf, Na, Sb, Ta, V, and W show high abundance at the urban site due to anthropogenic activities, except for Na and Cl, which come from sea salt spray. Size-segregated measurements of trace organic compounds show that organic compounds mainly from combustion processes accumulated in the PM0.95 fraction. In conclusion, the major particle sources determined by the CMB8.2 software were road dust, sugarcane leaf burning, diesel- and gasoline-powered vehicle exhaust, leaf surface abrasion particles, and a very small fraction of meat cooking.
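To illustrate the Chemical Mass Balance step referred to in this abstract, the sketch below shows the underlying idea in Python: ambient species concentrations are modeled as a linear combination of source profiles, and the source contributions are recovered by least squares. The profile matrix, species, and ambient values are hypothetical placeholders, not data from this study, and the EPA CMB8.2 software additionally weights species by their measurement uncertainties.

```python
# Illustrative Chemical Mass Balance (CMB) sketch: ambient species concentrations
# are modeled as a linear combination of source profiles and the contributions are
# recovered by least squares. All numbers below are hypothetical placeholders.
import numpy as np

# rows = chemical species, columns = candidate sources (hypothetical fractions)
source_profiles = np.array([
    [0.20, 0.01, 0.05],   # e.g. a crustal element such as Al
    [0.02, 0.30, 0.10],   # e.g. a combustion marker
    [0.01, 0.05, 0.40],   # e.g. a biomass-burning marker
])
ambient = np.array([1.8, 2.9, 3.5])   # measured ambient concentrations (hypothetical)

# CMB8.2 uses an effective-variance weighted solution; plain least squares
# keeps this sketch dependency-free.
contributions, *_ = np.linalg.lstsq(source_profiles, ambient, rcond=None)
print(dict(zip(["road dust", "vehicle exhaust", "leaf burning"], contributions)))
```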
Abstract:
The mediator software architecture was designed to provide data integration and retrieval in distributed, heterogeneous environments. Since the initial conceptualization of this architecture, many new technologies have emerged that can facilitate the implementation of this design. The purpose of this thesis was to show that a mediator framework supporting users of mobile devices could be implemented using common software technologies available today. In addition, the prototype was developed with a view to providing a better understanding of what a mediator is and to exposing issues that will have to be addressed in fuller, more robust designs. The prototype developed for this thesis was implemented using various technologies, including Java, XML, and the Simple Object Access Protocol (SOAP), among others. SOAP was used to accomplish inter-process communication. In the end, it is expected that more data-intensive software applications will be possible in a world with ever-increasing demands for information.
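As a rough illustration of the SOAP-based inter-process communication mentioned above, the hedged Python sketch below posts a minimal SOAP 1.1 envelope to a mediator endpoint using only the standard library. The endpoint URL, the SOAPAction value, and the GetData operation are invented for illustration; the prototype's actual service contract is not given here.

```python
# Minimal SOAP 1.1 request sketch; endpoint, action and operation are hypothetical.
import urllib.request

envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetData xmlns="urn:example-mediator">
      <query>select * from sensors</query>
    </GetData>
  </soap:Body>
</soap:Envelope>"""

request = urllib.request.Request(
    "http://localhost:8080/mediator",              # hypothetical mediator endpoint
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8",
             "SOAPAction": "urn:example-mediator#GetData"},
)
with urllib.request.urlopen(request) as response:  # returns the SOAP response XML
    print(response.read().decode("utf-8"))
```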
Abstract:
We present a data set of 738 planktonic foraminiferal species counts from sediment surface samples of the eastern North Atlantic and the South Atlantic between 87°N and 40°S, 35°E and 60°W, including published Climate: Long-Range Investigation, Mapping, and Prediction (CLIMAP) data. These species counts are linked to Levitus's [1982] modern water temperature data for the four caloric seasons, four depth ranges (0, 30, 50, and 75 m), and the combined means of those depth ranges. The relation between planktonic foraminiferal assemblages and sea surface temperature (SST) data is estimated using the newly developed SIMMAX technique, a modern analog technique (MAT) with a similarity index based on (1) the scalar product of the normalized faunal percentages and (2) a weighting of the modern analogs' SSTs according to the inverse geographical distances of the most similar samples. Compared to the classical CLIMAP transfer technique and conventional MAT techniques, SIMMAX provides a more confident reconstruction of paleo-SSTs (the correlation coefficient is 0.994 for caloric winter and 0.993 for caloric summer). The standard deviation of the residuals is 0.90°C for caloric winter and 0.96°C for caloric summer at 0-m water depth. The SST estimates reach optimum stability (standard deviation of the residuals is 0.88°C) at the average 0- to 75-m water depth. Our extensive database provides SST estimates over a range of -1.4 to 27.2°C for caloric winter and 0.4 to 28.6°C for caloric summer, allowing SST estimates that are especially valuable for the high-latitude Atlantic during glacial times.
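The SIMMAX estimator described above can be sketched compactly: similarities are scalar products of normalized faunal percentage vectors, and the SST estimate is an average of the best modern analogs weighted by inverse geographic distance. The following Python sketch uses synthetic counts, SSTs, and distances purely to show the mechanics; it is not the authors' code, and the number of analogs (k) is an assumption.

```python
# SIMMAX-style SST estimate: scalar-product similarity + inverse-distance weighting.
import numpy as np

def simmax_sst(core_counts, modern_counts, modern_sst, modern_dist_km, k=5):
    """core_counts: faunal percentages of the sample to estimate (1D array).
    modern_counts: faunal percentages of the modern samples (one row per sample).
    modern_sst / modern_dist_km: SST of, and distance to, each modern sample."""
    u = core_counts / np.linalg.norm(core_counts)
    V = modern_counts / np.linalg.norm(modern_counts, axis=1, keepdims=True)
    similarity = V @ u                                      # scalar products
    best = np.argsort(similarity)[-k:]                      # k most similar samples
    weights = 1.0 / np.maximum(modern_dist_km[best], 1.0)   # inverse distance
    return np.average(modern_sst[best], weights=weights)

rng = np.random.default_rng(0)
modern = rng.random((738, 30)) * 100        # 738 samples, 30 species (synthetic)
sst = rng.uniform(-1.4, 28.6, 738)
dist = rng.uniform(10, 5000, 738)
print(simmax_sst(modern[0] + rng.random(30), modern, sst, dist))
```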
Abstract:
Software product line engineering promotes large-scale software reuse by developing a system family that shares a set of core features and enables the selection and customization of a set of variabilities that distinguish each product of the family from the others. To address time-to-market pressure, the software industry has been using the clone-and-own technique to create and manage new software products or product lines. Despite its advantages, the clone-and-own approach brings several difficulties for the evolution and reconciliation of the software product lines, especially because of the code conflicts generated by the simultaneous evolution of the original software product line, called Source, and its cloned products, called Target. This thesis proposes an approach to evolve and reconcile cloned products based on mining software repositories and code conflict analysis techniques. The approach supports the identification of different kinds of code conflicts – lexical, structural, and semantic – that can occur when integrating development tasks – bug fixes, enhancements, and new use cases – from the original, evolved software product line into the cloned product line. We also conducted an empirical study characterizing the code conflicts produced during the evolution and merging of two large-scale web information system product lines. The results of our study demonstrate the approach's potential to automatically or semi-automatically resolve several of the existing code conflicts, thus helping to reduce the complexity and costs of reconciling cloned software product lines.
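As a simplified illustration of the lexical kind of conflict discussed above, the Python sketch below compares the line ranges edited by the evolved Source and by the cloned Target against their common ancestor and reports overlaps. Structural and semantic conflicts require syntax-aware and behavioral analysis and are outside this sketch; all file contents are invented.

```python
# Lexical-conflict sketch: overlapping edits of Source and Target vs. a common base.
import difflib

def changed_lines(base, revised):
    """Return the set of base-file line indices touched by a revision."""
    touched = set()
    for op, i1, i2, _j1, _j2 in difflib.SequenceMatcher(None, base, revised).get_opcodes():
        if op != "equal":
            touched.update(range(i1, max(i2, i1 + 1)))
    return touched

def lexical_conflicts(base, source, target):
    return sorted(changed_lines(base, source) & changed_lines(base, target))

base   = ["def pay(total):", "    return total * 1.05", ""]
source = ["def pay(total):", "    return total * 1.08", ""]          # Source changed the rate
target = ["def pay(total):", "    return round(total * 1.05)", ""]   # Target edited the same line
print(lexical_conflicts(base, source, target))   # -> [1]: both sides edited line index 1
```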
Abstract:
The maintenance and evolution of software systems has become a highly critical task over recent years due to the diversity and high demand of features, devices, and users. Understanding and analyzing how new changes impact the quality attributes of the architecture of such systems is an essential prerequisite to prevent quality deterioration during their evolution. This thesis proposes an automated approach for analyzing the variation of the performance quality attribute in terms of execution time (response time). It is implemented by a framework that adopts dynamic analysis and software repository mining techniques to provide an automated way of revealing potential sources – commits and issues – of performance variation in scenarios during the evolution of software systems. The approach defines four phases: (i) preparation – choosing the scenarios and preparing the target releases; (ii) dynamic analysis – determining the performance of scenarios and methods by computing their execution times; (iii) variation analysis – processing and comparing the dynamic analysis results for different releases; and (iv) repository mining – identifying issues and commits associated with the detected performance variation. Empirical studies were performed to evaluate the approach from different perspectives. An exploratory study analyzed the feasibility of applying the approach to systems from different domains in order to automatically identify source code elements with performance variation and the changes that affected those elements during an evolution. This study analyzed three systems: (i) SIGAA – a web system for academic management; (ii) ArgoUML – a UML modeling tool; and (iii) Netty – a framework for network applications. Another study performed an evolutionary analysis by applying the approach to multiple releases of Netty and of the web frameworks Wicket and Jetty. In this study, 21 releases were analyzed (seven of each system), totaling 57 scenarios. In summary, 14 scenarios with significant performance variation were found for Netty, 13 for Wicket, and 9 for Jetty. Additionally, feedback was obtained from eight developers of these systems through an online questionnaire. Finally, in the last study, a performance regression model was developed to indicate which properties of commits are most likely to cause performance degradation. Overall, 997 commits were mined, of which 103 were retrieved from degraded source code elements and 19 from optimized ones, while 875 had no impact on execution time. The number of days before the release was made available and the day of the week proved to be the most relevant commit variables for performance degradation in our model. The area under the Receiver Operating Characteristic (ROC) curve of the regression model is 60%, which means that using the model to decide whether or not a commit will cause degradation is 10% better than a random decision.
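The dynamic-analysis and variation-analysis phases described above can be pictured with a small Python sketch: the same scenario is executed repeatedly against two releases and the relative change in mean execution time is flagged when it exceeds a threshold. The scenario stand-ins and the 10% threshold are illustrative assumptions, not part of the thesis framework.

```python
# Sketch of scenario timing across two releases and a simple significance flag.
import statistics
import time

def measure(scenario, runs=30):
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        scenario()
        samples.append(time.perf_counter() - start)
    return samples

def performance_variation(scenario_old, scenario_new, threshold=0.10):
    old = statistics.mean(measure(scenario_old))
    new = statistics.mean(measure(scenario_new))
    change = (new - old) / old
    return change, abs(change) > threshold   # relative change and "significant?" flag

# Hypothetical stand-ins for the same scenario built from two releases.
release_n  = lambda: sum(i * i for i in range(20_000))
release_n1 = lambda: sum(i * i for i in range(26_000))
print(performance_variation(release_n, release_n1))
```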
Abstract:
Software bug analysis is one of the most important activities in software quality. The rapid and correct implementation of the necessary fix affects both developers, who must keep the software fully functioning, and users, who need to perform their daily tasks. In this context, an incorrect classification of bugs may lead to unwanted situations. One of the main attributes assigned to a bug in its initial report is severity, which reflects the urgency of correcting the problem. In this scenario, we identified in datasets extracted from five open source systems (Apache, Eclipse, Kernel, Mozilla and Open Office) an irregular distribution of bugs with respect to the existing severities, which is an early sign of misclassification. In the datasets analyzed, about 85% of bugs are ranked with normal severity. This classification rate can have a negative influence on the software development context, where a misclassified bug may be allocated to a developer with little experience to solve it, so that the correction takes longer or even results in an incorrect implementation. Several studies in the literature have disregarded normal bugs, working only with the portion of bugs initially considered severe or not severe. This work investigated this portion of the data in order to identify whether the normal severity reflects the real impact and urgency, whether there are bugs (initially classified as normal) that could be classified with another severity, and whether there are impacts on developers in this context. For this, an automatic classifier based on three algorithms (Naïve Bayes, MaxEnt and Winnow) was developed to assess whether the normal severity is correct for the bugs initially categorized with it. The algorithms achieved an accuracy of about 80% and showed that, depending on the algorithm, between 21% and 36% of the bugs should have been classified differently, which represents somewhere between 70,000 and 130,000 bugs of the dataset.
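For the Naïve Bayes variant of the classifier described above, a minimal sketch using scikit-learn is shown below: bug report text is converted to token counts and used to predict a severity label. The tiny training set is invented for illustration; the study itself used the full report corpora of the five systems.

```python
# Minimal Naïve Bayes severity classifier over bug report text (illustrative data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reports = [
    "crash on startup with segmentation fault",   # severe
    "data loss when saving large documents",      # severe
    "typo in the preferences dialog label",       # non-severe
    "tooltip text slightly misaligned",           # non-severe
]
labels = ["severe", "severe", "non-severe", "non-severe"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reports, labels)

# Re-assessing a report originally filed as "normal":
print(model.predict(["application crashes and loses unsaved work"]))
```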
Abstract:
This paper presents an experimental study of the sensitivity to 15-MeV neutrons of Advanced Low Power SRAMs (A-LPSRAM) at a low bias voltage slightly above the threshold value that allows data retention. This family of memories is characterized by a 3D structure that minimizes the area penalty and copes with latchups, as well as by the presence of integrated capacitors to hinder the occurrence of single event upsets. In low-voltage static tests, classical single event upsets were a minor source of errors, but other unexpected phenomena, such as clusters of bitflips and hard errors, turned out to be the origin of hundreds of bitflips. In addition, no errors were observed in dynamic tests at nominal voltage. This behavior is clearly different from that of standard bulk CMOS SRAMs, where thousands of errors have been reported.
Abstract:
The advent of the Auger Engineering Radio Array (AERA) necessitates the development of a powerful framework for the analysis of radio measurements of cosmic ray air showers. As AERA performs "radio-hybrid" measurements of air shower radio emission in coincidence with the surface particle detectors and fluorescence telescopes of the Pierre Auger Observatory, the radio analysis functionality had to be incorporated into the existing hybrid analysis solutions for fluorescence and surface detector data. This goal has been achieved in a natural way by extending the existing Auger Offline software framework with radio functionality. In this article, we lay out the design, highlights, and features of the radio extension implemented in the Auger Offline framework. Its functionality has reached a high degree of sophistication and offers advanced features such as vectorial reconstruction of the electric field, advanced signal processing algorithms, transparent and efficient handling of FFTs, a very detailed simulation of detector effects, and the reading of multiple data formats, including data from various radio simulation codes. The source code of this radio functionality can be made available to interested parties on request.
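As a framework-independent illustration of the FFT-based signal processing such a radio analysis involves, the Python sketch below band-pass filters a synthetic trace in the frequency domain (30-80 MHz, the band typically used for air-shower radio detection). It does not use the Auger Offline API; the sampling rate and the trace are assumed.

```python
# Frequency-domain band-pass of a synthetic radio trace (not the Offline API).
import numpy as np

sampling_rate = 1e9                                  # 1 GS/s (assumed)
t = np.arange(2048) / sampling_rate
trace = np.sin(2 * np.pi * 55e6 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)

spectrum = np.fft.rfft(trace)
freqs = np.fft.rfftfreq(t.size, d=1 / sampling_rate)
spectrum[(freqs < 30e6) | (freqs > 80e6)] = 0.0      # zero everything outside the band
filtered = np.fft.irfft(spectrum, n=t.size)

print(filtered[:4])
```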
Abstract:
The Duke Free-electron laser (FEL) system, driven by the Duke electron storage ring, has been at the forefront of developing new light source capabilities over the past two decades. In 1999, the Duke FEL demonstrated the first lasing of a storage ring FEL in the vacuum ultraviolet (VUV) region at 194 nm using two planar OK-4 undulators. With two helical undulators added to the outboard sides of the planar undulators, in 2005 the highest gain of a storage ring FEL (47.8%) was achieved using the Duke FEL system with a four-undulator configuration. In addition, the Duke FEL has been used as the photon source to drive the High Intensity γ-ray Source (HIGS) via Compton scattering of the FEL beam off the electron beam inside the FEL cavity. Taking advantage of the FEL's wavelength tunability as well as the adjustability of the electron beam energy in the storage ring, nearly monochromatic γ-ray beams have been produced over a wide energy range, from 1 to 100 MeV, at the HIGS. To further push the FEL short-wavelength limit and enhance the FEL gain in the VUV regime for high-energy γ-ray production, two additional helical undulators were installed in 2012 using an undulator switchyard system to allow switching between the two planar and two helical undulators in the middle section of the FEL system. Using different undulator configurations made possible by the switchyard, a number of novel capabilities of the storage ring FEL have been developed and exploited over a wide FEL wavelength range from infrared (IR) to VUV. These new capabilities will eventually be made available to the γ-ray operation, which will greatly enhance the γ-ray user research program, creating new opportunities for certain types of nuclear physics research.
With the wide wavelength tuning range, the FEL is an intrinsically well-suited device to produce lasing with multiple colors. Taking advantage of the availability of an undulator system with multiple undulators, we have demonstrated the first two-color lasing of a storage ring FEL. Using either a three- or four-undulator configuration with a pair of dual-band high reflectivity mirrors, we have achieved simultaneous lasing in the IR and UV spectral regions. With the low-gain feature of the storage ring FEL, the power generated at the two wavelengths can be equally built up and precisely balanced to reach FEL saturation. A systematic experimental program to characterize this two-color FEL has been carried out, including precise power control, a study of the power stability of two-color lasing, wavelength tuning, and the impact of the FEL mirror degradation. Using this two-color laser, we have started to develop a new two-color γ-ray beam for scientific research at the HIGS.
Using the undulator switchyard, four helical undulators installed in the beamline can be configured to not only enhance the FEL gain in the VUV regime, but also allow for full polarization control of the FEL beams. For the accelerator operation, the use of helical undulators is essential to extend the FEL mirror lifetime by reducing radiation damage from harmonic undulator radiation. Using a pair of helical undulators with opposite helicities, we have realized (1) fast helicity switching between left- and right-circular polarizations, and (2) the generation of fully controllable linear polarization. In order to extend these new capabilities of polarization control to the γ-ray operation over a wide energy range at the HIGS, a set of FEL polarization diagnostic systems needs to be developed to cover the entire FEL wavelength range. The preliminary development of the polarization diagnostics for the wavelength range from IR to UV has been carried out.
Abstract:
This thesis describes the development of an open-source system for virtual bronchoscopy used in combination with electromagnetic instrument tracking. The end application is virtual navigation of the lung for biopsy of early-stage cancer nodules. The open-source platform 3D Slicer was used to create freely available algorithms for virtual bronchoscopy. Firstly, the development of an open-source semi-automatic algorithm for predicting the malignancy of solitary pulmonary nodules is presented. This approach may help the physician decide whether to proceed with biopsy of the nodule. The user-selected nodule is segmented in order to extract radiological characteristics (i.e., size, location, edge smoothness, calcification presence, cavity wall thickness), which are combined with patient information to calculate the likelihood of malignancy. The overall accuracy of the algorithm is shown to be high compared to independent experts' assessment of malignancy. The algorithm is also compared with two different predictors, and our approach is shown to provide the best overall prediction accuracy. The development of an airway segmentation algorithm, which extracts the airway tree from surrounding structures on chest Computed Tomography (CT) images, is then described. This represents the first fundamental step toward the creation of a virtual bronchoscopy system. Clinical and ex-vivo images are used to evaluate the performance of the algorithm. Different CT scan parameters are investigated, and the parameters for successful airway segmentation are optimized. Slice thickness is the most influential parameter, while variation of the reconstruction kernel and radiation dose is shown to be less critical. Airway segmentation is used to create a 3D rendered model of the airway tree for virtual navigation. Finally, the first open-source virtual bronchoscopy system was combined with electromagnetic tracking of the bronchoscope for the development of a GPS-like system for navigating within the lungs. Tools for pre-procedural planning and for assisting navigation are provided. Registration between the lungs of the patient and the virtually reconstructed airway tree is achieved using a landmark-based approach. In an attempt to reduce difficulties with registration errors, we also implemented a landmark-free registration method based on a balanced airway survey. In-vitro and in-vivo testing showed good accuracy for this registration approach. The centreline of the 3D airway model is extracted and used to compensate for possible registration errors. Tools are provided to select a target for biopsy on the patient CT image, and pathways from the trachea toward the selected targets are created automatically. The pathways guide the physician during navigation, while distance-to-target information is updated in real time and presented to the user. During navigation, video from the bronchoscope is streamed and presented to the physician next to the 3D rendered image. The electromagnetic tracking is implemented with 5-DOF sensing that does not provide roll rotation information. An intensity-based image registration approach is implemented to rotate the virtual image according to the bronchoscope's rotations. The virtual bronchoscopy system is shown to be easy to use and accurate in replicating the clinical setting, as demonstrated in a pre-clinical environment with a breathing lung model. Animal studies were performed to evaluate the overall system performance.
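The landmark-based registration step mentioned above can be sketched with the standard SVD (Kabsch) solution for a rigid transform between paired points. The landmark coordinates below are synthetic, and the code is independent of 3D Slicer and of the electromagnetic tracking hardware; it only illustrates the underlying computation.

```python
# Rigid landmark registration (Kabsch/SVD) between tracked and CT-model points.
import numpy as np

def landmark_registration(patient_pts, model_pts):
    """Return rotation R and translation t so that R @ patient_i + t ≈ model_i."""
    p_mean, m_mean = patient_pts.mean(axis=0), model_pts.mean(axis=0)
    H = (patient_pts - p_mean).T @ (model_pts - m_mean)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = m_mean - R @ p_mean
    return R, t

model = np.array([[0., 0, 0], [50, 0, 0], [0, 80, 0], [0, 0, 120]])  # mm, synthetic
true_R = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]])
patient = (model - 5.0) @ true_R.T           # synthetic tracked landmark positions
R, t = landmark_registration(patient, model)
print(np.allclose(R @ patient.T + t[:, None], model.T, atol=1e-6))   # -> True
```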
Abstract:
In the engineering design of structural shapes, flat plate analysis results can be generalized to predict the behavior of complete structural shapes. In this context, the purpose of this project is to analyze a thin flat plate under conductive heat transfer and to simulate the temperature distribution, thermal stresses, total displacements, and buckling deformations. The current approach in these cases has been the Finite Element Method (FEM), whose basis is the construction of a conforming mesh. In contrast, this project uses the mesh-free Scan Solve Method, which eliminates the meshing limitation by using a non-conforming mesh. I implemented this modeling process by developing numerical algorithms and software tools to model thermally induced buckling. In addition, a convergence analysis was performed, and the results were compared with FEM. In conclusion, the results demonstrate that the method gives solutions of similar quality to FEM while being less computationally time consuming.
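For orientation, the sketch below solves the plain steady-state conduction problem on a thin rectangular plate with a conventional finite-difference (Jacobi) iteration on a regular grid. This is explicitly not the mesh-free Scan Solve Method of the project, and the boundary temperatures are assumed; it only shows the kind of temperature field being computed and compared.

```python
# Steady-state conduction on a plate via Jacobi iteration (conventional grid method,
# used here only as an illustration; boundary temperatures are assumed).
import numpy as np

nx, ny = 40, 40
T = np.zeros((ny, nx))
T[0, :], T[-1, :] = 100.0, 25.0      # hot top edge, cool bottom edge (assumed, °C)
T[:, 0], T[:, -1] = 25.0, 25.0       # side edges held at ambient (assumed)

for _ in range(5000):                # Jacobi sweeps of the Laplace equation
    T[1:-1, 1:-1] = 0.25 * (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:])

print(T[ny // 2, nx // 2])           # interior temperature at the plate centre
```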
Abstract:
The MAREDAT atlas covers 11 types of plankton, ranging in size from bacteria to jellyfish. Together, these plankton groups determine the health and productivity of the global ocean and play a vital role in the global carbon cycle. Working within a uniform and consistent spatial and depth grid (map) of the global ocean, the researchers compiled thousands to tens of thousands of data points to identify regions of plankton abundance and scarcity as well as areas of data abundance and scarcity. At many of the grid points, the MAREDAT team accomplished the difficult conversion from abundance (numbers of organisms) to biomass (carbon mass of organisms). The MAREDAT atlas provides an unprecedented global data set for ecological and biogeochemical analysis and modeling, as well as a clear mandate for compiling additional existing data and for focusing future data-gathering efforts on key groups in key areas of the ocean. The present collection contains the original data sets used to compile the global distributions of diazotroph abundance, biomass, and nitrogen fixation rates.
Abstract:
The MAREDAT atlas covers 11 types of plankton, ranging in size from bacteria to jellyfish. Together, these plankton groups determine the health and productivity of the global ocean and play a vital role in the global carbon cycle. Working within a uniform and consistent spatial and depth grid (map) of the global ocean, the researchers compiled thousands to tens of thousands of data points to identify regions of plankton abundance and scarcity as well as areas of data abundance and scarcity. At many of the grid points, the MAREDAT team accomplished the difficult conversion from abundance (numbers of organisms) to biomass (carbon mass of organisms). The MAREDAT atlas provides an unprecedented global data set for ecological and biogeochemical analysis and modeling, as well as a clear mandate for compiling additional existing data and for focusing future data-gathering efforts on key groups in key areas of the ocean. The present data set provides depth-integrated values of diazotroph abundance and biomass, computed from a collection of source data sets.
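The abundance-to-biomass conversion mentioned in both data set descriptions can be illustrated with a small Python sketch: cell counts are multiplied by an assumed carbon content per cell and integrated over depth. The conversion factor and the profile values are placeholders, not MAREDAT values.

```python
# Abundance -> biomass conversion and depth integration (illustrative numbers only).
import numpy as np

depth_m = np.array([0.0, 10.0, 25.0, 50.0, 100.0])                   # sample depths
abundance_cells_per_l = np.array([800.0, 650.0, 400.0, 120.0, 10.0])
carbon_pg_per_cell = 20.0                                             # assumed factor

# pg C L^-1 -> µg C m^-3: 1 pg = 1e-6 µg, 1 m^3 = 1e3 L
biomass_ug_c_per_m3 = abundance_cells_per_l * carbon_pg_per_cell * 1e-6 * 1e3

# trapezoidal depth integration to µg C m^-2
layer_means = 0.5 * (biomass_ug_c_per_m3[1:] + biomass_ug_c_per_m3[:-1])
depth_integrated = float(np.sum(layer_means * np.diff(depth_m)))
print(round(depth_integrated, 1), "µg C m^-2 over the upper 100 m")
```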