966 results for C (Programming Language)


Relevance:

80.00%

Publisher:

Abstract:

A mathematical model has been developed for predicting the spectral distribution of solar radiation incident on a horizontal surface. The solar spectrum in the wavelength range 0.29 to 4.0 micrometers has been divided into 144 intervals. Two variables in the model are the atmospheric water vapour content and atmospheric turbidity. After allowing for absorption and scattering in the atmosphere, the spectral intensities of the direct and diffuse components of radiation are computed. When the predicted radiation levels are compared with the measured values for the total radiation and the values with glass filters RG715, RG630 and OG530, a close agreement (±5%) has been achieved under clear sky conditions. A solar radiation measuring facility, close to the centre of Birmingham, has been set up utilising a microcomputer-based data logging system. A suite of computer programs in the BASIC programming language has been developed and extensively tested for solar radiation data logging, analysis and plotting. Two commonly used instruments, the Eppley PSP pyranometer and the Kipp and Zonen CM5 pyranometer, have been compared under different experimental conditions. Three models for computing the inclined plane irradiation, using total and diffuse radiation on a horizontal surface, have been tested for Birmingham. The anisotropic all-sky model, proposed by Klucher, provides a good agreement between the measured and the predicted radiation levels. Measurements of solar spectral distribution, using glass filters, are also reported for a number of inclines facing South.
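The Klucher anisotropic all-sky model referred to in this abstract is usually quoted in the form below (Klucher, 1979). The sketch follows that published form rather than the thesis itself, and the input values are illustrative only.

import numpy as np

def klucher_diffuse_tilted(i_dh, i_gh, beta, theta_i, theta_z):
    # Diffuse irradiance on a tilted plane after Klucher (1979).
    # i_dh: horizontal diffuse [W/m^2], i_gh: horizontal global [W/m^2],
    # beta: surface tilt, theta_i: incidence angle on the tilted surface,
    # theta_z: solar zenith angle (all angles in radians).
    f = 1.0 - (i_dh / i_gh) ** 2   # modulating factor: 0 for overcast, near 1 for clear sky
    return (i_dh * 0.5 * (1.0 + np.cos(beta))
            * (1.0 + f * np.sin(beta / 2.0) ** 3)
            * (1.0 + f * np.cos(theta_i) ** 2 * np.sin(theta_z) ** 3))

# Illustrative numbers for a 30-degree south-facing incline.
print(klucher_diffuse_tilted(150.0, 600.0, np.radians(30.0),
                             np.radians(35.0), np.radians(50.0)))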

Relevance:

80.00%

Publisher:

Abstract:

Image segmentation is one of the most computationally intensive operations in image processing and computer vision. This is because a large volume of data is involved and many different features have to be extracted from the image data. This thesis is concerned with the investigation of practical issues related to the implementation of several classes of image segmentation algorithms on parallel architectures. The Transputer is used as the basic building block of the hardware architectures and Occam is used as the programming language. The segmentation methods chosen for implementation are convolution, for edge-based segmentation; the Split and Merge algorithm, for segmenting non-textured regions; and the Granlund method, for segmentation of textured images. Three different convolution methods have been implemented. The direct method of convolution, carried out in the spatial domain, uses the array architecture. The other two methods, based on convolution in the frequency domain, require the use of the two-dimensional Fourier transform. Parallel implementations of two different Fast Fourier Transform algorithms have been developed, incorporating original solutions. For the Row-Column method the array architecture has been adopted, and for the Vector-Radix method, the pyramid architecture. The texture segmentation algorithm, for which a system-level design is given, demonstrates a further application of the Vector-Radix Fourier transform. A novel concurrent version of the quad-tree based Split and Merge algorithm has been implemented on the pyramid architecture. The performance of the developed parallel implementations is analysed. Many of the obtained speed-up and efficiency measures show values close to their respective theoretical maxima. Where appropriate, comparisons are drawn between different implementations. The thesis concludes with comments on general issues related to the use of the Transputer system as a development tool for image processing applications, and on issues related to the engineering of concurrent image processing applications.
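The Row-Column method mentioned above factors the two-dimensional Fourier transform into independent one-dimensional transforms, which is what maps naturally onto an array of Transputers. A minimal serial sketch in Python follows (NumPy stands in for the per-processor 1-D FFTs; the distribution of rows and columns over processors, and the Occam implementation, are not modelled here):

import numpy as np

def fft2_row_column(image):
    # Row-Column method: 1-D FFTs of every row, then 1-D FFTs of every column.
    # On an array architecture each processor transforms a block of rows, the
    # data are redistributed (transposed), and the columns are transformed next.
    rows_done = np.fft.fft(image, axis=1)
    return np.fft.fft(rows_done, axis=0)

img = np.random.rand(64, 64)
assert np.allclose(fft2_row_column(img), np.fft.fft2(img))  # agrees with the direct 2-D FFT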

Relevance:

80.00%

Publisher:

Abstract:

This paper explores the use of the optimization procedures in SAS/OR software with application to contemporary logistics distribution network design, using an integrated multiple criteria decision making approach. Unlike traditional optimization techniques, the proposed approach, combining the analytic hierarchy process (AHP) and goal programming (GP), considers both quantitative and qualitative factors. In the integrated approach, AHP is used to determine the relative importance weightings, or priorities, of alternative warehouses with respect to both deliverer-oriented and customer-oriented criteria. Then, a GP model incorporating the constraints of system, resource and AHP priority is formulated to select the best set of warehouses without exceeding the limited available resources. To facilitate the use of the integrated multiple criteria decision making approach by SAS users, an ORMCDM code was implemented in the SAS programming language. The SAS macro developed in this paper selects the chosen variables from a SAS data file and constructs sets of linear programming models based on the selected GP model. An example is given to illustrate how one could use the code to design the logistics distribution network.
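As an illustration of the AHP step described above, the sketch below derives priority weights from a pairwise comparison matrix by the principal-eigenvector method. The matrix values and the three candidate warehouses are invented for the example; the SAS/OR goal-programming formulation itself is not reproduced here.

import numpy as np

# Hypothetical pairwise comparisons of three candidate warehouses on one criterion
# (Saaty 1-9 scale; a[i, j] says how strongly warehouse i is preferred to warehouse j).
a = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 2.0],
              [1.0 / 5.0, 1.0 / 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(a)
principal = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
weights = principal / principal.sum()       # AHP priority weights, summing to 1

lambda_max = eigvals.real.max()
ci = (lambda_max - 3) / (3 - 1)             # consistency index
print(weights, ci / 0.58)                   # 0.58 = random index for a 3x3 matrix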

Relevance:

80.00%

Publisher:

Abstract:

Pavel Azalov - Recursion is a powerful technique for producing simple algorithms. It is a main topic in almost every introductory programming course. However, educators often refer to difficulties in learning recursion and suggest methods for teaching it. This paper offers a possible solution to the problem by (1) expressing the recursive definitions through base operations, which have been predefined as a set of base functions, and (2) practising recursion by solving sequences of problems. The base operations are specific to each sequence of problems, resulting in a smooth transition from recursive definitions to recursive functions. Base functions hide the particularities of the concrete programming language and allow the students to focus solely on the formulation of recursive definitions.
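As a small illustration of the approach (the base functions below are hypothetical examples, not Azalov's actual set): the instructor predefines a handful of base operations, and the student writes the recursive definition purely in terms of them, never touching the host language's own list machinery.

# Hypothetical base functions supplied by the instructor.
def empty(s):      return len(s) == 0
def head(s):       return s[0]
def tail(s):       return s[1:]
def prepend(x, s): return [x] + list(s)

# The student's task: formulate the recursive definition using only the base functions.
def double_all(s):
    # Recursive definition: double every element of a sequence.
    if empty(s):
        return []
    return prepend(2 * head(s), double_all(tail(s)))

print(double_all([1, 2, 3]))   # [2, 4, 6]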

Relevance:

80.00%

Publisher:

Abstract:

Raster graphic ampelometric software was developed not exclusively for the estimation of leaf area, but also for the characterization of grapevine (Vitis vinifera L.) leaves. The software was written in the C++ programming language, using C++ Builder 2007, for the Windows 95-XP and Linux operating systems. It handles desktop-scanned images. On the image analysed with the GRA.LE.D., the user has to determine 11 points. These points are then connected and the distances between them calculated. The GRA.LE.D. software supports standard ampelometric measurements such as leaf area, angles between the veins and lengths of the veins. These measurements are recorded by the software and exported into plain ASCII text files for single or multiple samples. Twenty-two biometric data points of each leaf are identified by the GRA.LE.D. It presents the opportunity to statistically analyse experimental data, allows comparison of cultivars and enables graphic reconstruction of leaves using the Microsoft Excel Chart Wizard. The GRA.LE.D. was thoroughly calibrated and compared to other widely used instruments and methods such as photo-gravimetry, LiCor L0100, WinDIAS2.0 and ImageTool. By comparison, the GRA.LE.D. presented the most accurate measurements of leaf area, but the LiCor L0100 and the WinDIAS2.0 were faster, while the photo-gravimetric method proved to be the most time-consuming. The WinDIAS2.0 instrument was the least reliable. The GRA.LE.D. is uncomplicated, user-friendly, accurate, consistent, reliable and has wide practical application.
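The abstract does not spell out the geometric routine behind the area measurement, but once the blade boundary has been digitised as a closed sequence of points, a planimetric computation of the kind sketched below (the shoelace formula, assumed here purely for illustration) is the natural building block; the GRA.LE.D.-specific 11-point layout is not reproduced.

def polygon_area(points):
    # Area enclosed by a closed sequence of (x, y) vertices (shoelace formula).
    # Coordinates are in pixels; multiply by (mm per pixel)**2, known from the
    # scanner resolution, to convert to a physical leaf area.
    area2 = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area2 += x1 * y2 - x2 * y1
    return abs(area2) / 2.0

# Illustrative (very angular) leaf outline in pixel coordinates.
print(polygon_area([(0, 0), (40, 10), (60, 50), (30, 90), (5, 60)]))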

Relevance:

80.00%

Publisher:

Abstract:

This thesis chronicles the design and implementation of an Internet/Intranet- and database-based application for the quality control of hurricane surface wind observations. A quality control session consists of selecting desired observation types to be viewed and determining a storm-track-based time window for viewing the data. All observations of the selected types are then plotted in a storm-relative view for the chosen time window, and geography is positioned for the storm-center time about which an objective analysis can be performed. Users then make decisions about data validity through visual nearest-neighbor comparison and inspection. The project employed an object-oriented iterative development method from beginning to end, and its implementation primarily features the Java programming language.
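The storm-relative view described above amounts to re-expressing every observation with respect to the storm-center position interpolated to the observation time. A minimal sketch of that transform is given below; the function and field names are illustrative only, and the actual application is written in Java rather than Python.

def storm_relative(obs_lon, obs_lat, obs_time, track):
    # Shift an observation into storm-relative coordinates (degrees east and north
    # of the storm center) by linear interpolation along the storm track.
    # `track` is a time-sorted list of (time, lon, lat) tuples; names are illustrative.
    for (t0, lon0, lat0), (t1, lon1, lat1) in zip(track, track[1:]):
        if t0 <= obs_time <= t1:
            w = (obs_time - t0) / (t1 - t0)
            center_lon = lon0 + w * (lon1 - lon0)
            center_lat = lat0 + w * (lat1 - lat0)
            return obs_lon - center_lon, obs_lat - center_lat
    raise ValueError("observation time outside the storm-track window")

track = [(0.0, -80.0, 25.0), (6.0, -81.5, 26.0)]   # hours, longitude, latitude
print(storm_relative(-80.2, 25.8, 3.0, track))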

Relevance:

80.00%

Publisher:

Abstract:

The aim of this study was to develop a practical, versatile and fast dosimetry and radiobiological model for calculation of the 3D dose distribution and radiobiological effectiveness of radioactive stents. The algorithm was written in Matlab 6.5 programming language and is based on the dose point kernel convolution. The dosimetry and radiobiological model was applied for evaluation of the 3D dose distribution of 32P, 90Y, 188Re and 177Lu stents. Of the four, 32P delivers the highest dose, while 90Y, 188Re and 177Lu require high levels of activity to deliver a significant therapeutic dose in the range of 15-30 Gy. Results of the radiobiological model demonstrated that the same physical dose delivered by different radioisotopes produces significantly different radiobiological effects. This type of theoretical dose calculation can be useful in the development of new stent designs, the planning of animal studies and clinical trials, and clinical decisions involving individualized treatment plans.
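The dose point kernel convolution on which the model rests can be sketched as a three-dimensional convolution of the cumulated-activity map with a radially symmetric kernel. The array sizes, voxel size and kernel below are placeholders for illustration and are not the data or kernels of the study.

import numpy as np
from scipy.signal import fftconvolve

voxel = 0.1   # placeholder voxel size [mm]

# Placeholder cumulated-activity map: a ring of sources approximating a stent cross-section.
activity = np.zeros((64, 64, 64))
z, y, x = np.ogrid[-32:32, -32:32, -32:32]
activity[(np.abs(np.sqrt(x**2 + y**2) - 15) < 1) & (np.abs(z) < 10)] = 1.0

# Placeholder isotropic dose point kernel: dose per decay falling off with distance r.
r = np.sqrt(x**2 + y**2 + z**2) * voxel + 1e-6
kernel = np.exp(-r / 0.5) / (4.0 * np.pi * r**2)

dose = fftconvolve(activity, kernel, mode="same")   # relative 3D dose distribution
print(dose.max())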

Relevance:

80.00%

Publisher:

Abstract:

The heavy part of the oil can be used for numerous purposes, e.g. to obtain lubricating oils. In this context, many researchers have been studying alternatives for the separation of crude oil components, among which molecular distillation may be mentioned. Molecular distillation is a forced evaporation technique different from other conventional processes in the literature. This process can be classified as a special case of distillation under high vacuum, with pressures that reach extremely low ranges of the order of 0.1 Pascal. The evaporation and condensation surfaces must be separated by a distance of the order of the mean free path of the evaporated molecules, so that evaporated molecules easily reach the condenser along a route without obstacles, which is desirable. Thus, the main contribution of this work is the simulation of falling-film molecular distillation for crude oil mixtures. The crude oil was characterized using UniSim Design R430 and Aspen HYSYS V8.5. The results of this characterization were used in Microsoft Excel spreadsheets to calculate the physicochemical properties of the residue of an oil sample, i.e., thermodynamic and transport properties. Based on these estimated properties and on boundary conditions suggested by the literature, the equations for the temperature and concentration profiles were solved by the implicit finite difference method using the Visual Basic for Applications (VBA) programming language for Excel. The resulting temperature profile was consistent with that reproduced from the literature, with a slight deviation in its initial values because the oil studied is lighter than that of the literature case. The concentration profiles showed that the concentration of the more volatile components decreases and that of the less volatile components increases along the length of the evaporator. In accordance with the transport phenomena present in the process, the velocity profile tends to increase to a peak and then decrease, and the film thickness decreases, both as functions of the evaporator length. It is concluded that the simulation code in Visual Basic (VBA) is a final product of this work that allows application to the molecular distillation of petroleum and other similar mixtures.
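The implicit finite-difference treatment mentioned above can be illustrated with a backward-Euler discretisation of a one-dimensional transient conduction equation. The grid, properties and boundary temperatures below are placeholders, and the actual VBA model of this work couples the temperature and concentration equations along the evaporator.

import numpy as np

# Backward-Euler (implicit) finite differences for dT/dt = alpha * d2T/dx2.
n, alpha, dx, dt = 50, 1.0e-5, 1.0e-3, 0.1     # placeholder grid and properties
T = np.full(n, 300.0)                          # initial film temperature [K]
T[0], T[-1] = 420.0, 300.0                     # placeholder wall and surface temperatures

r = alpha * dt / dx**2
A = np.zeros((n, n))
A[0, 0] = A[-1, -1] = 1.0                      # Dirichlet boundary rows
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = -r, 1.0 + 2.0 * r, -r

for _ in range(200):                           # each implicit time step is a linear solve
    T = np.linalg.solve(A, T)

print(T[:5])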

Relevance:

80.00%

Publisher:

Abstract:

This study proposes a comparative analysis of the use of multimedia infographics by the websites Clarín.com, from Argentina, and Folha.com, from Brazil. The research aims to verify and analyse how these two important Latin American online news outlets have used HTML5 technology to advance the interactive possibilities of this journalistic genre. To that end, the comparative analysis addresses multimedia infographics, which have undergone profound technological changes that alter both the format and the content of the news. In addition to the theoretical framework and literature review on infographics, newsgames, transmedia storytelling, online journalism, interactivity and the programming languages used to produce multimedia infographics, the study carried out a comparative analysis of the sections Infográficos, published by Folha.com, and Especiales Multimedia, by Clarín.com. The study, both quantitative and qualitative, examined the narrative and informational resources, tools and Internet programming-language technologies employed by the two outlets, based on the analysis model proposed by Alberto Cairo in Infografía 2.0: visualización interactiva de información en prensa. The research showed that, although Clarín.com used Flash technology in most of the multimedia infographics analysed, the infographics of the Argentine online newspaper allowed higher levels of interactivity than the multimedia infographics of Folha.com, which were developed mostly in HTML5.

Relevance:

80.00%

Publisher:

Abstract:

This thesis proposes a space-efficient Prolog implementation based on the work of David H. D. Warren and Hassan Aït-Kaci. Common Lisp is the framework used for the construction of the Prolog system; it was chosen both because it provides a space-efficient environment and because it is a rich programming language, in the sense that it supplies the user with abstractions and new ways of thinking. The resulting system is a new syntax for the initial language that runs on top of the SBCL Common Lisp implementation and can abstract away or exploit the underlying system.
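Unification is the core operation that any Prolog in the Warren/Aït-Kaci tradition has to provide. The sketch below shows first-order unification in Python purely for illustration; the system described above is written in Common Lisp, and its actual term representation is not reproduced here.

def walk(term, subst):
    # Follow variable bindings down to a representative term.
    while isinstance(term, str) and term.startswith('?') and term in subst:
        term = subst[term]
    return term

def unify(a, b, subst=None):
    # First-order unification. Variables are strings beginning with '?';
    # compound terms are tuples (functor, arg1, ...). Returns a substitution
    # dict or None on failure. The occurs check is omitted for brevity.
    if subst is None:
        subst = {}
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if isinstance(a, str) and a.startswith('?'):
        return {**subst, a: b}
    if isinstance(b, str) and b.startswith('?'):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(('likes', '?X', 'prolog'), ('likes', 'mary', '?Y')))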

Relevance:

80.00%

Publisher:

Abstract:

One of the leading motivations behind the multilingual semantic web is to make resources accessible digitally in an online global multilingual context. Consequently, it is fundamental for knowledge bases to find a way to manage multilingualism and thus be equipped with those procedures for its conceptual modelling. In this context, the goal of this paper is to discuss how common-sense knowledge and cultural knowledge are modelled in a multilingual framework. More particularly, multilingualism and conceptual modelling are dealt with from the perspective of FunGramKB, a lexico-conceptual knowledge base for natural language understanding. This project argues for a clear division between the lexical and the conceptual dimensions of knowledge. Moreover, the conceptual layer is organized into three modules, which result from a strong commitment towards capturing semantic knowledge (Ontology), procedural knowledge (Cognicon) and episodic knowledge (Onomasticon). Cultural mismatches are discussed and formally represented at the three conceptual levels of FunGramKB.
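A toy data-structure sketch of the lexical/conceptual division argued for above: several language-specific lexical units point to one shared concept, so multilingual coverage does not require duplicating conceptual knowledge. The identifiers and the frame string are invented for illustration and do not reproduce FunGramKB's actual notation.

# Invented example: one shared concept realised by lexical units in several languages.
concept = {
    "id": "+BREAKFAST_00",                           # hypothetical concept label
    "frame": "(x1: human) eats (x2: food) early in the day",
    "lexical_units": {
        "en": ["breakfast"],
        "es": ["desayuno"],
        "it": ["colazione", "prima colazione"],
    },
}

def lexicalise(concept, lang):
    # Return the lexical units that realise a concept in a given language.
    return concept["lexical_units"].get(lang, [])

print(lexicalise(concept, "it"))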

Relevance:

80.00%

Publisher:

Abstract:

Lappeenranta University of Technology is studying the use of low-voltage direct current (LVDC) electricity distribution. In cooperation with Järvi-Suomen Energia Oy and Suur-Savon Sähkö Oy, the university has built an experimental low-voltage DC distribution network that provides field conditions for low-voltage DC research with real customers and makes it possible to verify LVDC technology and other smart-grid functions in the field. The DC link of the network is built between a 20 kV distribution network and four customers. The 20 kV medium voltage is rectified to 750 V low-voltage DC at a converter substation and converted back to 400/230 V AC close to the customers. The purpose of this bachelor's thesis is to create for the university a database for the data and measurement results accumulating from the LVDC network. The database was considered necessary so that the measurement results of the low-voltage network can later be examined in a single, consistent form. One research question was how to organise and visualise all the measurement data accumulating on the servers. The work also considers three user groups that could potentially exploit the database - household customers, distribution network companies and the research laboratory - and discusses the benefit and relevance of the database to these users. A second research question was which part of all the data stored in the database is essential to retain from the point of view of these customers, and how they could retrieve information from the database. The research methods of the work are based on already existing measurement data. Both printed and electronic literature has been used for the work. As a result, a database was created with the MySQL Workbench tool, together with data collection and processing programs written in the Python programming language. In addition, a separate MATLAB interface was created for data visualisation, which illustrates the measurement data of the three customer groups. The database and the visualisation of its data give consumers the possibility to understand their own electricity use better, and provide distribution companies and research laboratories with, among other things, information on power quality and network loading.
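A minimal sketch of the kind of Python collection step described above: one measurement row written into a MySQL table. The connection details, table layout and column names are invented for illustration and are not the schema designed in the thesis.

import mysql.connector  # MySQL Connector/Python

# Invented connection details and schema; the real database was designed with MySQL Workbench.
conn = mysql.connector.connect(host="localhost", user="lvdc", password="secret",
                               database="lvdc_measurements")
cur = conn.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS measurements (
                   ts DATETIME,
                   customer_id INT,
                   dc_voltage_v DOUBLE,
                   dc_current_a DOUBLE)""")

# One hypothetical sample from the 750 V DC link of one customer connection.
cur.execute("INSERT INTO measurements VALUES (%s, %s, %s, %s)",
            ("2016-03-01 12:00:00", 1, 748.6, 12.4))
conn.commit()
conn.close()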

Relevance:

80.00%

Publisher:

Abstract:

Thesis (Ph.D.)--University of Washington, 2016-08