974 results for Free source software
Abstract:
Master's dissertation, Universidade de Brasília, Instituto de Ciências Exatas, Departamento de Ciência da Computação, 2015.
Abstract:
New technologies emerge constantly, and their use can bring countless benefits both to those who use them directly and to society as a whole. In this context, the State can also use information and communication technologies to improve the services delivered to citizens, raise society's quality of life, and optimize public spending by focusing it on the most pressing needs. Accordingly, there is much research on Electronic Government (e-Gov) policies and their main effects on citizens and society as a whole. This research examines the concept of Electronic Government and seeks to understand the process of implementing Free Software in the agencies of the Direct Administration of Rio Grande do Norte. It further analyzes whether this implementation reduces costs for the state treasury, and aims to identify the role of Free Software in the Administration and the foundations of the state's Electronic Government policy. Through qualitative interviews with IT coordinators and managers in three State Secretariats, it was possible to map the paths the Government has been taking to equip the State with technological capacity. Rio Grande do Norte proved to be still immature with respect to e-Gov practices and Free Software, with few agencies having concrete, viable initiatives in this area. The State still lacks a strategic definition of the role of IT, as well as greater investment in staffing and equipment. Advances were also observed, such as the creation of a regulatory body, the CETIC (State Council of Information and Communication Technology); the IT Master Plan, which provides a much-needed diagnosis of the state of IT in the government and proposes several goals for the area; the delivery of a postgraduate course for IT managers; and BrOffice (OpenOffice) training for 1,120 public servants.
Abstract:
FEA simulation of thermal metal cutting is central to interactive design and manufacturing. It is therefore relevant to assess the applicability of open FEA software for simulating 2D heat transfer in metal sheet laser cuts. Open source codes (e.g. FreeFem++, FEniCS, MOOSE) make additional scenarios possible (e.g. parallel, CUDA, etc.) at lower cost. However, a precise assessment is required of the scenarios in which open software is a sound alternative to commercial software. This article contributes in this regard by presenting a comparison of the aforementioned open FEM software for the simulation of heat transfer in thin (i.e. 2D) sheets subject to a gliding laser point source. We use the commercial ABAQUS software as the reference against which the open software is compared. A convective linear thin-sheet heat transfer model, with and without material removal, is used. This article does not attempt a full design of computer experiments. Our partial assessment shows that the thin-sheet approximation is adequate in terms of relative error for linear alumina sheets. For mesh resolutions finer than 10⁻⁵ m, the open and reference software temperatures differ by at most 1% of the temperature prediction. Ongoing work includes adaptive re-meshing, nonlinearities, sheet stress analysis, and Mach (also called 'relativistic') effects.
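A minimal sketch of the kind of model compared above, written against the legacy FEniCS (dolfin) Python API, one of the open codes named in the abstract: transient 2D conduction in a thin sheet with a gliding Gaussian laser source and a lumped convective loss term. All material values, source parameters, and mesh settings are illustrative assumptions, not the article's actual configuration.

```python
# Minimal thin-sheet laser heating sketch (legacy FEniCS/dolfin API).
from fenics import *

# Illustrative alumina-like parameters (assumed values)
rho, cp, k = 3900.0, 880.0, 30.0   # density, heat capacity, conductivity
h, T_inf = 20.0, 300.0             # lumped convective coefficient, ambient T
P, r0, v = 1e9, 1e-3, 0.01         # source intensity, spot radius, glide speed
dt = 1e-3                          # time step

mesh = RectangleMesh(Point(0, 0), Point(0.05, 0.02), 100, 40)
V = FunctionSpace(mesh, 'P', 1)
T_n = interpolate(Constant(T_inf), V)   # temperature at previous step

# Gliding Gaussian laser source; its x-position advances with time t
q = Expression('P*exp(-(pow(x[0]-v*t,2) + pow(x[1]-0.01,2))/(r0*r0))',
               P=P, v=v, t=0.0, r0=r0, degree=2)

T, s = TrialFunction(V), TestFunction(V)
# Backward-Euler heat equation with convective loss (thin-sheet lumping)
F = (rho*cp*(T - T_n)/dt*s + k*dot(grad(T), grad(s))
     + h*(T - T_inf)*s - q*s)*dx
a, L = lhs(F), rhs(F)

T_sol = Function(V)
for n in range(100):               # march 100 time steps
    q.t = (n + 1)*dt
    solve(a == L, T_sol)
    T_n.assign(T_sol)
```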
Abstract:
Visualisation provides good support for software analysis. It copes with the intangible nature of software by providing concrete representations of it. By reducing the complexity of software, visualisations are especially useful when dealing with large amounts of code. One domain that usually deals with large amounts of source code data is empirical analysis. Although there are many tools for analysis and visualisation, they do not cope well with software corpora. In this paper we present Explora, an infrastructure specifically targeted at visualising corpora. We report early results from a sample analysis of Smalltalk and Java corpora.
Abstract:
This document introduces the planned new search for the neutron Electric Dipole Moment at the Spallation Neutron Source at Oak Ridge National Laboratory. A spin precession measurement is to be carried out using ultracold neutrons diluted in a superfluid helium bath at T = 0.5 K, where spin-polarized ³He atoms act as a detector of the neutron spin polarization. This manuscript describes some of the key aspects of the planned experiment, along with Caltech's contributions to the development of the project.
Techniques used in the design of magnet coils for nuclear magnetic resonance were adapted to the geometry of the experiment. An initial design approach is described that uses a pair of coils tuned to shield outer conductive elements from resistive heat loads while inducing an oscillating field in the measurement volume. A small prototype was constructed to test the model of the field at room temperature.
A large-scale test of the high-voltage system was carried out in a collaborative effort at Los Alamos National Laboratory. The application and amplification of high voltage on polished steel electrodes immersed in a superfluid helium bath was studied, as well as the electrical breakdown properties of the electrodes at low temperatures. A suite of Monte Carlo simulation software tools was developed and implemented to model the interaction of neutrons, ³He atoms, and their spins with the experimental magnetic and electric fields, in order to further the study of expected systematic effects of the measurement, with particular focus on the false Electric Dipole Moment induced by a geometric phase akin to Berry's phase.
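As a rough illustration of what such spin-tracking Monte Carlo tools compute at their core, the following sketch integrates classical spin precession, dS/dt = γ S × B, in a static holding field. The field value, step size, and duration are illustrative assumptions; the actual simulations track moving neutrons and ³He atoms through nonuniform electric and magnetic fields.

```python
# Minimal spin-precession integration sketch (illustrative values only).
import numpy as np

gamma_n = -1.832e8                  # neutron gyromagnetic ratio, rad s^-1 T^-1
B = np.array([0.0, 0.0, 3e-6])      # 3 uT holding field along z (assumed)
dt, n_steps = 1e-5, 20000           # 0.2 s of evolution

S = np.array([1.0, 0.0, 0.0])       # spin initially along x
for _ in range(n_steps):
    S += dt * gamma_n * np.cross(S, B)   # explicit Euler precession step
    S /= np.linalg.norm(S)               # renormalize to curb integration drift
print("precessed spin:", S)
```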
An analysis framework was developed and implemented that uses an unbinned likelihood to fit the time-modulated signal expected from the measurement data. A collaborative Monte Carlo data set was used to test the analysis methods.
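A minimal sketch of an unbinned maximum-likelihood fit to a time-modulated signal, the kind of analysis the framework performs. The simple 1 + A·cos(ωt + φ) event-time density, the synthetic data, and all parameter values are illustrative assumptions, not the experiment's actual signal model.

```python
# Minimal unbinned likelihood fit of a modulated event-time density.
import numpy as np
from scipy.optimize import minimize

T, omega = 100.0, 2*np.pi*0.1   # window (integer number of cycles), known freq
rng = np.random.default_rng(0)

# Draw synthetic event times from p(t) ∝ 1 + A*cos(omega*t + phi) by rejection
A_true, phi_true = 0.3, 0.5
t = rng.uniform(0, T, 50000)
keep = rng.uniform(0, 1 + A_true, t.size) < 1 + A_true*np.cos(omega*t + phi_true)
t = t[keep]

def nll(params):
    A, phi = params
    # Density normalized over [0, T]: the cosine integrates to zero over
    # the full cycles chosen above, so the normalization is exactly 1/T.
    p = (1 + A*np.cos(omega*t + phi)) / T
    return -np.sum(np.log(np.clip(p, 1e-12, None)))

fit = minimize(nll, x0=[0.1, 0.0], bounds=[(-0.99, 0.99), (-np.pi, np.pi)])
print("fitted A, phi:", fit.x)
```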
Abstract:
Carbon monoxide (CO) and ozone (O3) are among the most important atmospheric pollutants in the troposphere, and both have significant effects on human health; both are included in the U.S. EPA list of criteria pollutants. CO is primarily emitted in the source region, whereas O3 can be formed near the source, during transport of pollution plumes containing O3 precursors, or in a receptor region as the plumes subside. The long chemical lifetimes of CO and O3 enable them to be transported over long distances. This transport matters on continental scales as well, where it is commonly referred to as intercontinental transport; it affects the concentrations of both CO and O3 in downwind receptor regions and therefore has significant implications for air quality standards. Over the period 2001-2011, anthropogenic emissions of CO and NOx decreased in North America and Europe, whereas emissions over Asia increased. How these emission trends have affected concentrations at remote sites located downwind of these continents is an important question. The PICO-NARE observatory, located on Pico Mountain in the Azores, Portugal, is frequently impacted by North American pollution outflow (both anthropogenic and biomass burning) and is a unique site for investigating long-range transport from North America. This study uses in-situ observations of CO and O3 for the period 2001-2011 at PICO-NARE, coupled with output from full-chemistry simulations (with normal and with fixed anthropogenic emissions) and tagged-CO simulations in GEOS-Chem, a global 3-D chemical transport model of atmospheric composition driven by meteorological input from the Goddard Earth Observing System (GEOS) of the NASA Global Modeling and Assimilation Office, to determine and interpret the trends in CO and O3 concentrations over the past decade. These trends are useful for ascertaining the impact that emission reductions in the United States have had over Pico and, more generally, over the North Atlantic. A regression model with sinusoidal functions and a linear trend term was fit to the in-situ observations and to the GEOS-Chem output for CO and O3 at Pico. The regression model yielded decreasing trends for CO and O3 in both the observations (-0.314 and -0.208 ppbv/year, respectively) and the full-chemistry simulation with normal emissions (-0.343 and -0.526 ppbv/year, respectively). Analysis of the results from the full-chemistry simulation with fixed anthropogenic emissions and from the tagged-CO simulation led to the conclusion that the decreasing CO trends are a consequence of anthropogenic emission changes in regions such as the USA and Asia: the US emission reductions are countered by Asian increases, but the former have the greater impact, resulting in decreasing CO trends at PICO-NARE. For O3, however, it is the increase in water vapor content (which increases O3 destruction) along the transport pathways from North America to PICO-NARE, as well as around the site, that has produced the decreasing trends over this period. This decrease is partially offset by an increase in O3 concentrations due to anthropogenic influence, which could reflect increasing Asian emissions of O3 precursors as US emissions have decreased; the anthropogenic influence does not, however, change the direction of the trend. It can thus be concluded that CO and O3 concentrations at PICO-NARE decreased over 2001-2011.
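A minimal sketch of the regression described above: a linear trend plus sinusoidal seasonal terms fit by ordinary least squares. The synthetic data and the choice of annual plus semi-annual harmonics are illustrative assumptions, not the study's exact model.

```python
# Minimal trend-plus-seasonality regression sketch.
import numpy as np

def fit_trend(t_years, y):
    """Fit y(t) = a + b*t + sin/cos harmonics; return trend b (units/year)."""
    w = 2 * np.pi  # one cycle per year when t is in years
    X = np.column_stack([
        np.ones_like(t_years), t_years,            # intercept + linear trend
        np.sin(w*t_years), np.cos(w*t_years),      # annual cycle
        np.sin(2*w*t_years), np.cos(2*w*t_years),  # semi-annual cycle
    ])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef[1], coef

# Synthetic monthly CO-like series, 2001-2011, with a -0.3 ppbv/yr trend imposed
t = np.arange(0, 11, 1/12)
y = 100 - 0.3*t + 5*np.sin(2*np.pi*t) + np.random.normal(0, 1, t.size)
trend, _ = fit_trend(t, y)
print(f"fitted trend: {trend:.3f} ppbv/year")
```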
Abstract:
The aim of this thesis is to develop an environment for the structural optimization of components for aerospace applications using open-source codes. In particular, Salome is used for the automatic generation of the structures' geometry, Code Aster performs the finite element analysis of the component, and Octave is used to run the optimization, based on a heuristic algorithm, and to integrate the different codes with one another. Component optimization techniques are becoming increasingly important, since modern Additive Manufacturing techniques make it possible to build very complex structures that were once not convenient (or possible) to produce by material removal. The first part of the thesis describes the software tools used and their integration, aimed at parameterizing geometry generation and running structural analyses automatically. Three case studies in which the methodology was tested are then described: a first validation case applying the method to the definition of the minimum-weight geometry of a cantilever beam under a concentrated load, a second test optimizing an aircraft spar, and a third applied case concerning the optimization of a pressurized fluid tank for use on a satellite.
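A minimal sketch of the optimization loop described above, with a random-mutation hill-climbing heuristic standing in for the thesis's heuristic algorithm and a placeholder function standing in for the scripted Salome/Code Aster run (the thesis drives the loop from Octave; Python is used here purely for illustration). All names and values are assumptions.

```python
# Minimal heuristic structural-optimization loop sketch.
import random

def evaluate_fem(params):
    """Placeholder for the automated CAD + FE analysis; returns (mass, stress)."""
    thickness, width = params
    mass = thickness * width * 1000.0        # toy mass model
    stress = 500.0 / (thickness * width)     # toy stress model
    return mass, stress

def objective(params, stress_limit=250.0):
    mass, stress = evaluate_fem(params)
    penalty = 1e6 * max(0.0, stress - stress_limit)  # penalize infeasibility
    return mass + penalty

best = [0.05, 0.1]                           # initial thickness, width (m)
best_obj = objective(best)
for _ in range(500):                         # random-mutation hill climbing
    cand = [max(1e-3, p * random.uniform(0.9, 1.1)) for p in best]
    if (cand_obj := objective(cand)) < best_obj:
        best, best_obj = cand, cand_obj
print("best params:", best, "objective:", best_obj)
```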
Abstract:
Given the ever-growing demand for energy, a new philosophy of energy consumption management has developed: DSM (demand side management), which aims to encourage consumers to use energy more intelligently and conscientiously. This objective, combined with the storage of energy from renewable sources, will reduce the use of electricity generated from non-renewable, highly polluting sources such as fossil fuels, and will lower energy consumption, the cost of producing energy, and the cost of energy itself. Home automation and domotics in the domestic environment are an example of DSM. The goal of this thesis is to build a home automation system using open-source technologies. Devices such as an Arduino UNO board, a Raspberry Pi, and a PC running GNU/Linux were used to simulate a home automation system combined with the management of photovoltaic cells and energy storage. The system can switch off an energy load under particular circumstances, for example when electricity consumption exceeds a certain threshold. The software used is open source and aims to let users optimize energy consumption according to their own goals. All of this demonstrates that a home automation system suited to the present and future of renewable sources can be built with free technologies, preserving privacy and security as well as customization and adaptability to different circumstances. In designing the system, an algorithm was implemented to handle various situations within a domestic environment, and it produced excellent results in meeting the stated objectives. The project can be extended further, and the code is available in a public repository.
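A minimal sketch of the threshold rule described above: when measured consumption exceeds a limit, the controller first tries to cover the excess from storage and otherwise sheds controllable loads. Device names, powers, and the threshold are illustrative placeholders, not the thesis's actual implementation.

```python
# Minimal load-shedding rule sketch for a DSM-style home controller.
THRESHOLD_W = 3000.0        # consumption limit (illustrative)

def control_step(consumption_w, battery_wh, shedable_loads):
    """Return (loads to switch off, Wh to draw from storage) for one step."""
    if consumption_w <= THRESHOLD_W:
        return [], 0.0
    excess = consumption_w - THRESHOLD_W
    if battery_wh > excess:              # cover the excess from storage
        return [], excess
    # Otherwise shed loads, smallest first, until the excess is removed
    to_shed = []
    for name, power in sorted(shedable_loads.items(), key=lambda kv: kv[1]):
        if excess <= 0:
            break
        to_shed.append(name)
        excess -= power
    return to_shed, 0.0

loads = {"water_heater": 1200.0, "dishwasher": 800.0, "ac": 1500.0}
print(control_step(3800.0, 200.0, loads))   # -> sheds 'dishwasher', draws 0
```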
Abstract:
In the presented thesis work, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The work involved the development and implementation of numerical algorithms, data structures, and software. The numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated, and the convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, since both the lattice Boltzmann method and the meshfree method with distance fields are particularly well suited to such situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process, while the meshfree method with distance fields allows exact satisfaction of boundary conditions, making it possible to capture exactly the effects of the fluid field on the solid structure.
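A minimal sketch of the fluid-side building block referred to above: one D2Q9 lattice Boltzmann (BGK) stream-and-collide step on a periodic grid. The domain size and relaxation time are illustrative assumptions, and the thesis's coupling to the meshfree method with distance fields is not shown.

```python
# Minimal D2Q9 BGK lattice Boltzmann step sketch.
import numpy as np

# D2Q9 discrete velocities and weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)
tau = 0.6                                # BGK relaxation time (illustrative)

def equilibrium(rho, u):
    cu = np.einsum('qd,xyd->qxy', c, u)          # c_q . u at each node
    usq = np.einsum('xyd,xyd->xy', u, u)
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def step(f):
    rho = f.sum(axis=0)                          # macroscopic density
    u = np.einsum('qd,qxy->xyd', c, f) / rho[..., None]  # macroscopic velocity
    f += -(f - equilibrium(rho, u)) / tau        # collide (BGK relaxation)
    for q, (cx, cy) in enumerate(c):             # stream (periodic boundaries)
        f[q] = np.roll(np.roll(f[q], cx, axis=0), cy, axis=1)
    return f

f = equilibrium(np.ones((64, 64)), np.zeros((64, 64, 2)))  # start at rest
for _ in range(100):
    f = step(f)
```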
Abstract:
The objective of this research is to identify the factors that influence the migration from free software to proprietary software, or vice versa. The theoretical framework was developed in light of the Diffusion of Innovations Theory (DIT) proposed by Rogers (1976, 1995) and the Unified Theory of Acceptance and Use of Technology (UTAUT) proposed by Venkatesh, Morris, Davis and Davis (2003). The research was structured in two phases: an exploratory first phase, in which the revised theory was adjusted to fit the Brazilian reality and companies that could be the subject of investigation were identified; and a qualitative second phase, in which case studies were conducted at ArcelorMittal Tubarão (AMT), a private company that migrated from proprietary software (Unix) to free software (Linux), and at the city government of Serra, in Espírito Santo state, a public organization that migrated from free software (OpenOffice) to proprietary software (MS Office). The results show that the software migration decision takes into account factors beyond technical or cost considerations, such as cultural barriers, user rejection, and resistance to change. These results underscore the importance of social aspects, which can play a decisive role in the decision regarding software migration and its successful implementation.
Abstract:
Purpose: Custom cranio-orbital implants have been shown to achieve better performance than their hand-shaped counterparts by restoring skull anatomy more accurately and by reducing surgery time. Designing a custom implant involves reconstructing a model of the patient's skull from their computed tomography (CT) scan. The healthy side of the skull model, contralateral to the damaged region, can then be used to design an implant plan. Designing implants for areas of thin bone, such as the orbits, is challenging due to poor CT resolution of bone structures. This makes preoperative design time-intensive, since thin bone structures in CT data must be manually segmented. The objective of this thesis was to research methods to accurately and efficiently design cranio-orbital implant plans, with a focus on the orbits, and to develop software that integrates these methods. Methods: The software consists of modules that use image and surface restoration approaches to enhance both the quality of the CT data and the reconstructed model. It enables users to input CT data and apply tools that output a skull model with restored anatomy, which can then be used to design the implant plan. The software was built on 3D Slicer, an open-source medical visualization platform, and was tested on CT data from thirteen patients. Results: The average time to create a skull model with restored anatomy using our software was 0.33 ± 0.04 hours (SD), whereas the manual segmentation method took between 3 and 6 hours. To assess the structural accuracy of the reconstructed models, CT data from the thirteen patients were used to compare the models created with our software against those created manually. When the skull models were registered together, the difference between each pair of skulls was 0.4 ± 0.16 mm (SD). Conclusions: We have developed software to design custom cranio-orbital implant plans, with a focus on thin bone structures. The method decreases design time and achieves accuracy similar to the manual method.
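A minimal sketch of the comparison step described above: rigidly aligning two corresponding point sets sampled from the skull models (here via the Kabsch algorithm) and reporting their mean distance. The point data are synthetic placeholders, and the thesis's actual registration was performed within 3D Slicer, not with this code.

```python
# Minimal rigid alignment and surface-difference sketch (Kabsch algorithm).
import numpy as np

def kabsch_align(P, Q):
    """Return P rigidly aligned onto Q (both Nx3, corresponding points)."""
    Pc, Qc = P - P.mean(0), Q - Q.mean(0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    return Pc @ R.T + Q.mean(0)

P = np.random.rand(1000, 3) * 100               # model A samples (mm, synthetic)
Q = P + np.random.normal(0, 0.4, P.shape)       # model B: A plus ~0.4 mm noise
mean_diff = np.linalg.norm(kabsch_align(P, Q) - Q, axis=1).mean()
print(f"mean point-to-point difference: {mean_diff:.2f} mm")
```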
Abstract:
In Europe, concern over the status of marine ecosystems has increased, and the Marine Directive has as its main goal the achievement of Good Environmental Status (GES) of EU marine waters by 2020. Molecular tools are seen as promising, emerging approaches to improve ecosystem monitoring; they have led ecology into a new era and represent perhaps the greatest source of innovation in marine monitoring techniques. Benthic nematodes are considered ideal organisms for use as biological indicators of natural and anthropogenic disturbance in aquatic ecosystems, underpinning monitoring programmes on the ecological quality of marine ecosystems and very useful for assessing the GES of the marine environment. dT-RFLP (directed Terminal-Restriction Fragment Length Polymorphism) makes it possible to assess the diversity of nematode communities and also to study ecosystem functioning, and, combined with relative real-time PCR (qPCR), it provides a high-throughput, semi-quantitative characterization of nematode communities. These characteristics make the two molecular tools good descriptors for GES assessment. The main aim of this study is to develop and optimize dT-RFLP and qPCR for the Mira estuary (SW coast, Portugal). A molecular phylogenetic analysis of marine and estuarine nematodes is being performed, combining morphological and molecular analysis to evaluate the diversity of free-living marine nematodes in the Mira estuary. After morphological identification, barcodes of the 18S rDNA and COI genes are being determined for each morphologically identified nematode species. So far we have generated 40 new sequences belonging to 32 different genera and 17 families, and the study has shown a good degree of concordance between traditional morphology-based identification and DNA sequences. These results will improve the assessment of marine nematode diversity and contribute to a more robust nematode taxonomy. The DNA sequences are being used to develop a dT-RFLP protocol able to process large numbers of samples (hundreds to thousands) easily, rather than the low throughput typical of classical taxonomic or molecular analyses. A preliminary study showed that the digest enzymes used in dT-RFLP for terrestrial assemblages poorly separated the marine nematodes at the taxonomic level required for functional group analysis. A new digest combination was therefore designed using the software tool DRAT (Directed Terminal Restriction Analysis Tool) to distinguish marine nematode taxa. Several solutions provided by DRAT were tested empirically to select the one that cuts most efficiently, and a combination of three enzymes in a single digest proved to be the best solution for separating the different clusters. In parallel, another tool is being developed to estimate population size (qPCR): an improved qPCR estimation of gene copy number using an artificial reference is being developed for marine nematode communities to quantify abundance. Once developed, it is proposed to validate both methodologies by determining the spatial and temporal variability of benthic nematode assemblages across different environments. The application of these high-throughput molecular approaches to benthic nematodes will increase sample throughput and make their implementation as indicators of the ecological status of marine ecosystems more efficient and faster.
Abstract:
Aims. Optically thin plasmas may deviate from thermal equilibrium, so that electrons (and ions) are no longer described by the Maxwellian distribution; instead they can be described by κ-distributions. The free-free spectrum and radiative losses depend on the temperature-averaged (over the electron distribution) and total Gaunt factors, respectively. There is therefore a need to calculate these factors and make them available to any software that deals with plasma emission. Methods. We recalculated the free-free Gaunt factor for a wide range of energies and frequencies using hypergeometric functions of complex arguments and the Clenshaw recurrence formula technique, combined with approximations whenever the difference between the initial and final electron energies is smaller than 10⁻¹⁰ in units of z²Ry. We used double and quadruple precision. The temperature-averaged and total Gaunt factor calculations use Gauss-Laguerre integration with 128 nodes. Results. The temperature-averaged and total Gaunt factors depend on the κ parameter, showing increasing deviations (with respect to the results obtained with the Maxwellian distribution) as κ decreases. Tables of these Gaunt factors are provided.
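A minimal sketch of the temperature-averaging step described above: 128-node Gauss-Laguerre quadrature applied to a placeholder Gaunt factor under a Maxwellian distribution. The gaunt_ff stand-in is an assumption; the paper computes it from hypergeometric functions of complex arguments, and averages over κ-distributions as well as the Maxwellian.

```python
# Minimal Gauss-Laguerre temperature-averaging sketch.
import numpy as np
from scipy.special import roots_laguerre

x, w = roots_laguerre(128)   # nodes/weights for integral of f(x)*exp(-x) on [0, inf)

def gaunt_ff(E_over_kT, u):
    """Placeholder Gaunt factor g_ff; NOT the paper's hypergeometric result."""
    return 1.0 + 0.1 * np.log1p(E_over_kT / u)

def averaged_gaunt(u):
    """Maxwellian-averaged <g_ff>(u), with x = E/kT and u = h*nu/kT."""
    return np.sum(w * gaunt_ff(x, u))

print(averaged_gaunt(1.0))
```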
Abstract:
PoliEstudio 1.0 is a freely licensed computational tool for working with polynomial expressions in one variable, created by a team that includes the authors of this article. This article documents the qualitative validation performed on the software, whose main objective was to provide the Costa Rican educational system with a validated educational tool that can partially address the problems that exist today in secondary students' mathematics education, particularly in topics related to polynomial expressions in one variable, and specifically for students in eighth grade.
Abstract:
The Virtual Map Library (Mapoteca Virtual, www.mapoteca.geo.una.ac.cr) is a website built on the Joomla platform, sponsored by the School of Geographic Sciences at the Universidad Nacional, Costa Rica, in collaboration with UNA VIRTUAL. The site supports teaching by allowing instructors to upload and disseminate digital cartography for students, and it helps researchers locate the online cartography needed for their work in different areas of knowledge. In addition, it offers a space to present current documents on the practice of cartography and related sciences, while promoting collaboration and free access to digital cartography. Key words: cartography, Mapoteca Virtual, Virtual Map Library, Joomla, digital maps, online teaching tools, School of Geographic Sciences, Universidad Nacional, Costa Rica.