951 results for Testing Source Code Generation
Abstract:
Sodium and potassium are the common alkalis present in fly ash. Excessive amounts of fly ash alkalis can cause efflorescence problems in concrete products and raise concern about the effectiveness of the fly ash in mitigating alkali-silica reaction (ASR). The available alkali test, which is commonly used to measure fly ash alkali, takes approximately 35 days to execute and report. Hence, in many instances the fly ash has already been incorporated into concrete before the test results are available. This complicates the work of fly ash marketing agencies and leads to disputes with fly ash users, who are often reluctant to accept projects containing materials that fail to meet specification limits. The research project consisted of a lab study and a field study. The lab study focused on the available alkali test and how fly ash alkali content impacts common performance tests (mortar-bar expansion tests). Twenty-one fly ash samples were evaluated during the testing. The field study focused on the inspection and testing of selected, well-documented pavement sites that contained moderately reactive fine aggregate and high-alkali fly ash. A total of nine pavement sites were evaluated. Two of the sites were control sites that did not contain fly ash. The results of the lab study indicated that the available alkali test is prone to experimental errors that cause poor agreement between testing labs. A strong linear relationship was observed between the available alkali content and the total alkali content of Class C fly ash. This relationship can be used to provide a quicker, more precise method of estimating the available alkali content. The results of the field study failed to link the use of high-alkali fly ash with the occurrence of ASR at the various concrete sites. Petrographic examination of the pavement cores indicated that Wayland sand is an ASR-sensitive aggregate. This was in good agreement with Iowa DOT field service records.
It was recommended that preventive measures be taken when this source of sand is used in concrete mixtures.
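The linear relationship reported between total and available alkali content suggests a simple calibration approach. As a minimal illustration only (the calibration numbers below are invented, not the study's data), a least-squares line can be fitted and then used to estimate available alkali from a quicker total-alkali measurement:

```python
import numpy as np

def fit_alkali_relation(total_alkali, available_alkali):
    """Least-squares fit of: available = slope * total + intercept."""
    slope, intercept = np.polyfit(total_alkali, available_alkali, 1)
    return slope, intercept

def estimate_available(total, slope, intercept):
    """Estimate available alkali from a total alkali measurement."""
    return slope * total + intercept

# Hypothetical calibration pairs (percent Na2O equivalent); not from the study.
total = np.array([4.0, 5.5, 6.1, 7.2, 8.0])
avail = np.array([1.1, 1.6, 1.8, 2.2, 2.5])

slope, intercept = fit_alkali_relation(total, avail)
print(round(estimate_available(6.5, slope, intercept), 2))
```

The same fit could be re-derived per fly ash source, since the study observed the relationship specifically for Class C fly ash.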
Abstract:
This project included the following tasks: (1) Preparation of a questionnaire and survey of all 99 Iowa county engineers for input on current surfacing material practice; (2) County survey data analysis and selection of surfacing material gradations to be used for test road construction; (3) Solicitation of county engineers and stone producers for project participation; (4) Field inspection and selection of the test road; (5) Construction of the test road using varying material gradations from a single source; and (6) Field and laboratory testing and test road monitoring. The results of this research project indicate that crushed stone surfacing material graded on the fine side of Iowa Department of Transportation Class A surfacing specifications provides lower roughness and better rideability; better braking and handling characteristics; and less dust generation than the coarser gradations. It is believed that this material has sufficient fines available to act as a binder for the coarser material, which in turn promotes the formation of a tight surface crust. This crust acts to provide a smooth riding surface, reduces dust generation, and improves vehicle braking and handling characteristics.
Abstract:
This report describes a short-term study undertaken to investigate the potential for using dense three-dimensional (3D) point clouds generated from light detection and ranging (LIDAR) and photogrammetry to assess roadway roughness. Spatially continuous roughness maps have potential for the identification of localized roughness features, which would be a significant improvement over traditional profiling methods. This report specifically illustrates the use of terrestrial laser scanning (TLS) and photogrammetry using a process known as structure from motion (SFM) to acquire point clouds and illustrates the use of these point clouds in evaluating road roughness. Five roadway sections were chosen for scanning and testing: three gravel road sections, one portland cement concrete (PCC) section, and one asphalt concrete (AC) section. To compare clouds obtained from terrestrial laser scanning and photogrammetry, the coordinates of the clouds for the same section on the same date were matched using open source computer code. The research indicates that the technologies described are very promising for evaluating road roughness. The major advantage of both technologies is the large amount of data collected, which allows the evaluation of the full surface. Additional research is needed to further develop the use of dense 3D point clouds for roadway assessment.
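Matching two point clouds of the same section, as described above, comes down to computing point-to-point distances between the clouds. The sketch below illustrates a basic cloud-to-cloud comparison with a k-d tree on synthetic data; it is only an illustration of the idea, since the report does not name the specific open source code it used:

```python
import numpy as np
from scipy.spatial import cKDTree

def cloud_to_cloud_distance(reference, compared):
    """For each point in `compared`, the distance to its nearest
    neighbour in `reference` (a common cloud-comparison metric)."""
    tree = cKDTree(reference)
    dists, _ = tree.query(compared)
    return dists

# Synthetic example: a flat 1 m x 1 m patch vs. the same patch shifted 5 mm up,
# mimicking a TLS cloud compared against an SFM cloud of the same surface.
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 1.0, size=(1000, 2))
ref = np.column_stack([xy, np.zeros(len(xy))])
shifted = np.column_stack([xy, np.full(len(xy), 0.005)])

d = cloud_to_cloud_distance(ref, shifted)
print(round(d.mean(), 3))  # → 0.005 (metres)
```

On real data the clouds would first need registration into a common coordinate system, which is what the coordinate matching step in the study provides.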
Abstract:
With nearly 2,000 free and open source software (FLOSS) licenses, software license "proliferation" can be a major headache for software development organizations trying to speed development through software component reuse, as well as companies redistributing software packages as components of their products. Scope is one problem: from the Free Beer license to the GPL family of licenses to platform-specific licenses such as Apache and Eclipse, the number and variety of licenses make it difficult for companies to "do the right thing" with respect to the software components in their products and applications. In addition to the sheer number of licenses, each license carries within it the author's specific definition of how the software can be used and re-used. Permissive licenses like BSD and MIT make it easy; software can be redistributed and developers can modify code without the requirement of making changes publicly available. Reciprocal licenses, on the other hand, place varying restrictions on re-use and redistribution. Woe to the developer who snags a bit of code after a simple web search without understanding the ramifications of license restrictions.
Abstract:
This master's thesis examines automated testing and how to make user interface testing easier on the Symbian operating system. The thesis introduces Symbian and the challenges encountered in Symbian application development. In addition, testing strategies and methods as well as automated testing are discussed. Finally, a tool is presented that makes it easier to create test cases for functional and system testing. Graphical user interfaces pose unique challenges for software testing. They are often built from complex components and are continuously redesigned during software development. Capture-and-replay tools are commonly used for testing graphical user interfaces. Designing and implementing test cases for user interface testing requires considerable effort. Since graphical user interfaces make up a large share of the code, substantial resources could be saved by making test case creation easier. The project implemented in the practical part pursues this goal by making test script creation visual. As a result, the test scripting language itself does not need to be understood, and the tests are also easier to comprehend.
Abstract:
Many software companies have begun to pay increasing attention to the quality of their software products. As a result, most of them have chosen software testing as the means of improving this quality. Testing should not be limited to the software product itself but should cover the entire software development process. Validation testing focuses on ensuring that the final product meets the requirements set for it, whereas verification testing is used as preventive testing aimed at eliminating defects before they even reach the source code. The work on which this master's thesis is based was carried out during the early spring and summer of 2003, commissioned by Necsom Oy. Necsom is a small Finnish software company whose research and development unit is located in Lappeenranta. This thesis first introduces software testing and the different ways of organizing it. In addition, general guidelines are given for writing the test plans and test cases required for successful and effective testing. After this theory has been covered, an example is presented of how internal software testing was implemented at Necsom. Finally, the conclusions reached after observing the testing process in practice are presented, together with suggestions for further action.
Abstract:
The continuously growing number of mobile phone users and the development of the Internet into a general source of information and entertainment have created a need for a service that connects a mobile workstation to computer networks. GPRS is a new technology that offers a faster, more efficient, and more economical connection to packet data networks, such as the Internet and intranets, than existing mobile networks (e.g., NMT and GSM). The goal of this work was to implement, for a workstation environment, the communication drivers needed in testing the GPRS Packet Control Unit (PCU). Real mobile networks are too expensive, and they do not provide sufficient log output, to be used for GPRS testing in the early stages of software development. For this reason, PCU software testing is carried out in a more flexible and more easily controlled environment that does not impose hard real-time requirements. The new operating environment and connection media required a new implementation of the communication drivers, i.e., the parts of the software that handle the connections between the PCU and the other units of the GPRS network. The result of this work was the workstation versions of the required communication drivers. The thesis examines different data transfer methods and protocols from the viewpoints of the requirements of the software under test, the implemented driver, and testing. The interface implemented by each driver is presented, along with the degree of implementation, i.e., which functions were implemented and which were left out. The structure and operation of the drivers are described to the extent relevant to the operation of the software.
Abstract:
In this work, we propose a copula-based method to generate synthetic gene expression data that accounts for the marginal and joint probability distribution features captured from real data. Our method allows us to implant significant genes in the synthetic dataset in a controlled manner, making it possible to test new detection algorithms under more realistic conditions.
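A common way to realize such a copula-based generator is a Gaussian copula: draw correlated normal samples, map them to uniforms, and push them through the desired marginal distributions. The sketch below uses hypothetical marginals and a hypothetical correlation matrix, not parameters estimated from real expression data:

```python
import numpy as np
from scipy import stats

def gaussian_copula_sample(n, corr, marginals, seed=0):
    """Draw n samples whose dependence structure follows a Gaussian copula
    with correlation matrix `corr`, and whose i-th marginal follows the
    frozen scipy distribution marginals[i]."""
    rng = np.random.default_rng(seed)
    z = rng.multivariate_normal(np.zeros(len(corr)), corr, size=n)
    u = stats.norm.cdf(z)  # correlated samples with uniform marginals
    return np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

# Two hypothetical "genes": lognormal and gamma expression levels, correlated 0.8.
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])
marginals = [stats.lognorm(s=0.5), stats.gamma(a=2.0)]
x = gaussian_copula_sample(5000, corr, marginals)
print(x.shape)  # → (5000, 2)
```

Implanting a "significant" gene would then amount to shifting the marginal of one column for a chosen subgroup of samples, while the copula preserves the joint dependence.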
Abstract:
The game industry is nowadays a very large field of software development, so it is timely to explore the possibilities offered by free tools and libraries. Producing visual entertainment generally requires, in addition to C++ programming skills, modelling and image-editing skills. In addition, sound production is a very large part of achieving a working whole. This thesis covers all of these areas and examines the suitability of open source tools for game development on the win32 platform. The end result is a fully playable, albeit simple, game called CrazyBunny. The thesis begins by introducing all the tools that make up the required development environment, including a walkthrough of their installation and setup. The work is based on the OGRE framework, which is not a full game engine. The missing features have been added by using the CEGUI library for the user interfaces and the FMOD library for the audio system. Other tools used are the Code::Blocks IDE, the Blender modelling program, and the Audacity audio editor. The game application is built on a system based on the State design pattern for managing game states. In this model, the game's main menu, gameplay state, and game-over state are separated into their own state classes, which makes the application easier to manage. In the main menu, the most important part is the implementation of the menus themselves with the CEGUI library. The implementation of the gameplay state explores OGRE's visual features such as the environment, lights, shadows, overlays, and visual effects. In addition, the game's sounds are implemented with the popular FMOD library, which several large companies in the field use in their commercial products.
Abstract:
The importance of the regional level in research has risen in the last few decades, and a vast literature in fields such as evolutionary and institutional economics, network theories, innovation and learning systems, and sociology has focused on regional-level questions. Recently, policy makers and regional actors have also begun to pay increasing attention to the knowledge economy and its needs in general, and to the connectivity and support structures of regional clusters in particular. Nowadays knowledge is generally considered the most important source of competitive advantage, but even the most specialised forms of knowledge are becoming a short-lived resource, for example due to the accelerating pace of technological change. This emphasizes the need for foresight activities at the national, regional and organizational levels, and for the integration of foresight and innovation activities. In a regional setting, this development poses great challenges, especially in regions that have no university and thus usually very limited resources for research activities. The research problem of this dissertation is likewise related to the need to better incorporate the information produced by a foresight process into regional practice-based innovation processes. This dissertation is a constructive case study, the case being the Lahti region and the network-facilitating innovation policy adopted in that region. The dissertation consists of a summary and five articles; during the research process, a construct, or conceptual model, for solving this real-life problem was developed. It is also being implemented as part of the network-facilitating innovation policy in the Lahti region.
Abstract:
Indisputable evidence of climate change and its link to greenhouse gas emissions creates the necessity to change the energy production infrastructure during the coming decades. Through political conventions and restrictions, the energy industry is being pushed toward using a bigger share of renewable energy sources in its energy supply. In addition to climate change, a sustainable energy supply is another major issue for future development plans, but neither should come at an unbearable price. All power production types have environmental effects as well as strengths and weaknesses. Although each change comes with a price, the right track in minimising environmental impacts while securing the energy supply can be found by combining all possible low-carbon technologies and by improving energy efficiency in all sectors, in order to create a new power production infrastructure with a tolerable energy price and minor environmental effects. GEMIS (Global Emission Model for Integrated Systems) is a life-cycle analysis program that was used in this thesis to build indicative energy models for Finland's future energy supply. The results indicate that the energy supply must comprise both high-capacity nuclear power and a large variety of renewable energy sources in order to minimize all environmental effects while keeping the energy price reasonable.
Abstract:
Web application performance testing is an emerging and important field of software engineering. As web applications become more commonplace and complex, the need for performance testing will only increase. This paper discusses common concepts, practices and tools that lie at the heart of web application performance testing. A pragmatic, hands-on approach is assumed where applicable; real-life examples of test tooling, execution and analysis are presented right next to the underpinning theory. At the client-side, web application performance is primarily driven by the amount of data transmitted over the wire. At the server-side, selection of programming language and platform, implementation complexity and configuration are the primary contributors to web application performance. Web application performance testing is an activity that requires delicate coordination between project stakeholders, developers, system administrators and testers in order to produce reliable and useful results. Proper test definition, execution, reporting and repeatable test results are of utmost importance. Open-source performance analysis tools such as Apache JMeter, Firebug and YSlow can be used to realise effective web application performance tests. A sample case study using these tools is presented in this paper. The sample application was found to perform poorly even under the moderate load incurred by the sample tests.
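The repeatable reporting stressed above typically reduces to summary statistics over measured response times, which is also what tools like Apache JMeter report. A minimal sketch of such a summary (the latency values below are invented, not taken from the paper's case study):

```python
import statistics

def summarize_latencies(samples_ms):
    """Summary statistics commonly reported in performance test results."""
    s = sorted(samples_ms)
    def pct(p):  # simple index-based percentile on the sorted samples
        return s[min(len(s) - 1, int(p / 100 * len(s)))]
    return {
        "min": s[0],
        "median": statistics.median(s),
        "p90": pct(90),
        "p99": pct(99),
        "max": s[-1],
    }

# Hypothetical response times (ms) from one test run; note the outliers,
# which is why percentiles matter more than the mean in load testing.
samples = [120, 95, 110, 480, 130, 105, 98, 2500, 115, 125]
print(summarize_latencies(samples))
```

The gap between the median and the upper percentiles is often the first sign of the kind of poor performance under load that the paper's sample application exhibited.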
Abstract:
In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and evaluated immediately. The proposed system provides each student with a different set of experimental data generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of means and a paired t-test.
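The exercises themselves are written by the instructor in Matlab for the Goodle GMS; purely to illustrate the underlying idea, the Python sketch below shows how a per-student dataset with a known ground truth can be generated from a seed and then auto-graded. All names, ranges, and tolerances here are hypothetical:

```python
import random

def make_exercise(seed):
    """Generate a personalized calibration dataset with a known 'true'
    slope and intercept, constrained to a plausible range so every
    student gets a different but equally solvable problem."""
    rng = random.Random(seed)           # seed could be the student id
    slope = rng.uniform(0.8, 1.2)
    intercept = rng.uniform(0.0, 0.5)
    xs = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
    ys = [slope * x + intercept + rng.gauss(0, 0.05) for x in xs]
    return xs, ys, slope, intercept

def grade(submitted_slope, true_slope, tol=0.1):
    """Automatic check: the student's fitted slope must lie within
    tolerance of the value used to generate their data."""
    return abs(submitted_slope - true_slope) <= tol

xs, ys, slope, intercept = make_exercise(seed=42)
print(grade(slope, slope))  # → True
```

Because generation is deterministic per seed, the grader can regenerate the exact dataset a student received and evaluate the submission immediately, which is the feedback loop the paper describes.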
Abstract:
The maximum realizable power throughput of power electronic converters may be limited or constrained by technical or economical considerations. One solution to this problem is to connect several power converter units in parallel. The parallel connection can be used to increase the current carrying capacity of the overall system beyond the ratings of individual power converter units. Thus, it is possible to use several lower-power converter units, produced in large quantities, as building blocks to construct high-power converters in a modular manner. High-power converters realized by using parallel connection are needed for example in multimegawatt wind power generation systems. Parallel connection of power converter units is also required in emerging applications such as photovoltaic and fuel cell power conversion. The parallel operation of power converter units is not, however, problem free. This is because parallel-operating units are subject to overcurrent stresses, which are caused by unequal load current sharing or currents that flow between the units. Commonly, the term 'circulating current' is used to describe both the unequal load current sharing and the currents flowing between the units. Circulating currents, again, are caused by component tolerances and asynchronous operation of the parallel units. Parallel-operating units are also subject to stresses caused by unequal thermal stress distribution. Both of these problems can, nevertheless, be handled with a proper circulating current control. To design an effective circulating current control system, we need information about circulating current dynamics. The dynamics of the circulating currents can be investigated by developing appropriate mathematical models. In this dissertation, circulating current models are developed for two different types of parallel two-level three-phase inverter configurations. The models, which are developed for an arbitrary number of parallel units, provide a framework for analyzing circulating current generation mechanisms and developing circulating current control systems. In addition to developing circulating current models, modulation of parallel inverters is considered. It is illustrated that depending on the parallel inverter configuration and the modulation method applied, common-mode circulating currents may be excited as a consequence of the differential-mode circulating current control. To prevent the common-mode circulating currents that are caused by the modulation, a dual modulator method is introduced. The dual modulator basically consists of two independently operating modulators, the outputs of which eventually constitute the switching commands of the inverter. The two independently operating modulators are referred to as primary and secondary modulators. In its intended usage, the same voltage vector is fed to the primary modulators of each parallel unit, and the inputs of the secondary modulators are obtained from the circulating current controllers. To ensure that the voltage commands obtained from the circulating current controllers are realizable, it must be guaranteed that the inverter is not driven into saturation by the primary modulator. The inverter saturation can be prevented by limiting the inputs of the primary and secondary modulators. Because of this, a limitation algorithm is also proposed. The operation of both the proposed dual modulator and the limitation algorithm is verified experimentally.
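As a heavily simplified scalar illustration of the limitation idea (the dissertation itself works with voltage vectors and real modulators, and all limit values below are invented), headroom can be reserved for the secondary modulator so that the sum of the two commands can never drive the inverter into saturation:

```python
def dual_modulator_command(primary_cmd, secondary_cmd,
                           v_max=1.0, secondary_limit=0.1):
    """Combine the shared primary voltage command with the per-unit
    secondary command from the circulating-current controller.
    Both inputs are clamped so the summed command stays within v_max."""
    # Clamp the primary command, reserving headroom for the secondary one.
    p_lim = v_max - secondary_limit
    p = max(-p_lim, min(p_lim, primary_cmd))
    # Clamp the circulating-current correction to its reserved headroom.
    s = max(-secondary_limit, min(secondary_limit, secondary_cmd))
    return p + s

print(round(dual_modulator_command(0.95, 0.08), 2))  # → 0.98
```

The point of the clamping order is that the circulating-current controller's correction always remains realizable, regardless of how close the primary command is to the saturation limit.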
Abstract:
Currently, a high penetration level of Distributed Generation (DG) is observed in the Danish distribution systems, and even more DG units are foreseen in the upcoming years. How to utilize them to maintain the security of the power supply in emergency situations has been of great interest for study. This master's project develops a control architecture for studying distribution systems with large-scale integration of solar power. As part of the EcoGrid EU Smart Grid project, it focuses on the modelling and simulation of a representative Danish LV network located on the island of Bornholm. Regarding the control architecture, two types of reactive power control techniques are implemented and compared. In addition, a network voltage control based on a tap-changer transformer is tested. After applying a genetic algorithm to five typical Danish domestic loads, the optimized results show lower power losses and voltage deviations with Q(U) control, especially at high consumption levels. Finally, a communication and information exchange system is developed with the objective of regulating the reactive power, and thereby the network voltage, remotely and in real time. Validation tests of the simulated parameters are performed as well.
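The Q(U) control mentioned above is typically implemented as a piecewise-linear droop on the locally measured voltage: no reactive power inside a deadband around nominal voltage, then a linear ramp up to a limit. The sketch below assumes its own deadband, ramp, and reactive-power limit values, not those used in the project:

```python
def q_u_droop(u_pu, q_max=0.3, deadband=0.02, ramp_end=0.08):
    """Piecewise-linear Q(U) droop on the local voltage (per unit).
    Inside the deadband around 1.0 p.u. no reactive power is exchanged;
    beyond it, Q ramps linearly and saturates at +/- q_max.
    Sign convention (assumed): negative Q = absorption at overvoltage."""
    dv = u_pu - 1.0
    if abs(dv) <= deadband:
        return 0.0
    ramp = (abs(dv) - deadband) / (ramp_end - deadband)
    q = min(1.0, ramp) * q_max
    return -q if dv > 0 else q

print(q_u_droop(1.0))  # → 0.0 (inside the deadband)
```

Each PV inverter in the simulated LV network would evaluate such a curve locally, which is why Q(U) control can reduce voltage deviations without any communication, while the remote regulation described above adds coordinated control on top.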