918 results for Software testing. Problem-oriented programming. Teaching methodology


Relevance:

30.00%

Publisher:

Abstract:

Abstract This PhD thesis addresses the issue of alleviating the burden of developing ad hoc applications. Such applications have the particularity of running on mobile devices, communicating in a peer-to-peer manner, and implementing some proximity-based semantics. A typical example is a radar application in which users see their avatar, as well as the avatars of their friends, on a map on their mobile phone. Such applications have become increasingly popular with the advent of the latest generation of smart phones, with their impressive computational power, their peer-to-peer communication capabilities, and their location detection technology. Unfortunately, existing programming support for such applications is limited, hence the need to address this issue in order to alleviate their development burden. This thesis tackles the problem by providing several tools for application development support. First, it provides the location-based publish/subscribe service (LPSS), a communication abstraction that elegantly captures recurrent communication issues and thus dramatically reduces code complexity. LPSS is implemented in a modular manner in order to target two different network architectures. One pragmatic implementation is aimed at mainstream infrastructure-based mobile networks, where mobile devices communicate through fixed antennas. The other, fully decentralized, implementation targets emerging mobile ad hoc networks (MANETs), where no fixed infrastructure is available and communication can only occur in a peer-to-peer fashion. For each of these architectures, various implementation strategies tailored to different application scenarios are provided and can be parametrized at deployment time. Second, this thesis provides two location-based message diffusion protocols, namely 6Shot broadcast and 6Shot multicast, specifically aimed at MANETs and fine-tuned to serve as building blocks for LPSS. Finally, this thesis proposes Phomo, a phone motion testing tool that makes it possible to test the proximity semantics of ad hoc applications without having to move around with mobile devices. These development support tools have been packaged in a coherent middleware framework called Pervaho.
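For illustration, here is a minimal sketch of the kind of radius-based publish/subscribe matching that such a service abstracts away; the class and method names are illustrative, not the Pervaho/LPSS API:

```python
import math

class LocationBasedPubSub:
    """Minimal location-based publish/subscribe: a publication is
    delivered only to subscribers located within its radius."""

    def __init__(self):
        self.subscriptions = []  # (callback, topic, lat, lon)

    def subscribe(self, topic, lat, lon, callback):
        self.subscriptions.append((callback, topic, lat, lon))

    def publish(self, topic, lat, lon, radius_m, payload):
        for callback, t, slat, slon in self.subscriptions:
            if t == topic and self._distance_m(lat, lon, slat, slon) <= radius_m:
                callback(payload)

    @staticmethod
    def _distance_m(lat1, lon1, lat2, lon2):
        # Haversine great-circle distance in metres.
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

# A radar-style application: be notified when a friend publishes nearby.
bus = LocationBasedPubSub()
bus.subscribe("friends", 46.520, 6.630, lambda p: print("nearby:", p))
bus.publish("friends", 46.521, 6.631, radius_m=500, payload="avatar update")
```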

Relevance:

30.00%

Publisher:

Abstract:

The Department of Electronics and Telecommunications of the University of Vic has designed a set of trainer boards for educational purposes. For students to be able to use these boards as a study tool, an inexpensive and convenient programming system is needed, and most programmers, in this case, do not meet those requirements. The goal of this project is to design a programming system that uses serial communication and does not require specific hardware or software. In this way, we obtain an autonomous board and a free programmer that is quick to set up and simple to use. The programming system is divided into three blocks. First, a program we call the "programmer" is responsible for transferring program code from the computer to the microcontroller on the trainer board. Second, a program called the "bootloader", residing on the microcontroller, receives this program code and stores it at the corresponding program memory addresses. Third, a communication protocol and an error-control system are implemented to ensure correct communication between the "programmer" and the "bootloader". The objectives of this project have been met and, in the tests performed, the programming system worked correctly.
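The abstract does not spell out the frame layout of the protocol; the sketch below assumes a simple record format with an additive checksum, which is typical for serial bootloaders of this kind:

```python
def build_frame(address: int, data: bytes) -> bytes:
    """Frame one program-memory record as sent by the "programmer":
    2-byte address, 1-byte length, payload, and a single additive
    checksum byte (two's complement of the byte sum)."""
    body = address.to_bytes(2, "big") + bytes([len(data)]) + data
    checksum = (-sum(body)) & 0xFF
    return body + bytes([checksum])

def verify_frame(frame: bytes) -> bool:
    """Bootloader-side check before writing to program memory: the sum
    of all bytes, checksum included, must be zero modulo 256."""
    return sum(frame) & 0xFF == 0

frame = build_frame(0x0800, bytes([0x12, 0x34]))
assert verify_frame(frame)
```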

Relevance:

30.00%

Publisher:

Abstract:

In recent years, the world of computing has evolved in unimaginable ways, both in hardware and in software. This evolution has given rise to many companies dedicated to programming, one of whose main lines of work has been building business management software. Often, however, standard programs cannot satisfy all of a client's needs, only some of them, and developing a custom program is expensive. Pastisseria Mas in Navàs, a family business, carries out its administrative tasks using spreadsheets, specifically Microsoft Excel, which allows the accounts to be kept in a more or less simple way, since only basic computer skills are required. The same goes for supplier data, which is stored in a Microsoft Access database. Another shortcoming is that customer orders are handled manually. The objective of this project is therefore to build a program that makes the business's work easier. This program will let them manage the data they use, such as information about customers, staff, orders, and so on. A website has also been developed that provides information about the orders that have been placed. The application is designed to run on Windows XP and was developed with the CodeGear RAD Studio compiler, specifically C++ Builder 2009. For the database I used MySQL, and for the website, PHP with the same database. The analysis and design were done in UML.

Relevance:

30.00%

Publisher:

Abstract:

Organizing production is always a key factor in any company. There is no master formula that works for everyone, because it depends heavily on the sector and on company size. Softvic S.A., the company where I work, asked me to put in place an organization system suited to a software development company. Companies of this type have two characteristics that set them apart from a manufacturing company: each piece of work is done only once, and the projects planned for the future are frequently redefined. In other words, the requirements are unstable and demand speed and flexibility. Softvic S.A. already holds ISO 9001:2008 certification for its programming department. This ISO covers how programming orders (OP) and incident orders (OI) are created and how completed work is recorded and evaluated. The objective is to adopt a methodology that handles the stage before this, that is, defining the work to be done in a given period, and that integrates seamlessly with the part already covered by the ISO. For this reason the Scrum methodology was chosen: it met all the requirements above and has been proven at various companies in the software world. First, trials were run in which the information was kept in an Excel file and the work to be done was printed manually. Once we had decided which information was useful in Softvic's case and which was not, a database was created with the necessary tables and fields. To make the work more convenient, a program was later written to maintain the data, together with a form for printing labels. As we have used the Scrum methodology, we have gradually adjusted aspects of it as we saw fit for our particular case.
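As an illustration of the kind of data such a database holds, here is a minimal sketch of a sprint-planning structure for programming orders (OP) and incident orders (OI); the fields and codes are illustrative, not Softvic's actual schema:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class BacklogItem:
    """One unit of work: a programming order (OP) or incident order (OI)."""
    code: str            # e.g. "OP-1023" or "OI-0047" (illustrative codes)
    description: str
    estimate_hours: float
    done: bool = False

@dataclass
class Sprint:
    start: date
    end: date
    capacity_hours: float
    items: list = field(default_factory=list)

    def plan(self, backlog):
        """Greedily pull backlog items (already priority-ordered) until
        the sprint capacity is filled."""
        used = 0.0
        for item in backlog:
            if used + item.estimate_hours <= self.capacity_hours:
                self.items.append(item)
                used += item.estimate_hours
        return self.items
```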

Relevance:

30.00%

Publisher:

Abstract:

LEGISLATIVE STUDY – The 83rd General Assembly of the Iowa Legislature, in Senate File 2273, directed the Iowa Department of Transportation (DOT) to conduct a study of how to implement a uniform statewide system to allow for electronic transactions for the registration and titling of motor vehicles.

PARTICIPANTS IN STUDY – As directed by Senate File 2273, the DOT formed a working group to conduct the study that included representatives from the Consumer Protection Division of the Office of the Attorney General, the Department of Public Safety, the Department of Revenue, the Iowa State County Treasurer's Association, the Iowa Automobile Dealers Association, and the Iowa Independent Automobile Dealers Association.

CONDUCT OF THE STUDY – The working group met eight times between June 17, 2010, and October 1, 2010. The group discussed the costs and benefits of electronic titling from the perspectives of new and used motor vehicle dealers, county treasurers, the DOT, lending institutions, consumers and consumer protection, and law enforcement. Security concerns, legislative implications, and implementation timelines were also considered. In the course of the meetings the group:

1. Reviewed the specific goals of S.F. 2273, and viewed a demonstration of Iowa's current vehicle registration and titling system so that participants who were not users of the system could gain an understanding of its current functionality and capabilities.

2. Reviewed the results of a survey of county treasurers conducted by the DOT to determine the extent to which county treasurers had processing backlogs and the extent to which they limited the number of dealer registration and titling transactions that they would process in a single day and while the dealer waited. Only eight reported placing a limit on the number of dealer transactions that would be processed while the dealer waited (with the number ranging from one to four), and only 11 reported a backlog in processing registration and titling transactions as of June 11, 2010, with most backlogs being reported in the range of one to three days.

3. Conducted conference calls with representatives of the American Association of Motor Vehicle Administrators (AAMVA) and representatives of three states (Kansas, which has an electronic lien and titling (ELT) program, and Wisconsin and Florida, each of which has both an ELT program and an electronic registration and titling (ERT) program) to assess current and best practices for electronic transactions. In addition, the DOT (through AAMVA) submitted a survey to all U.S. jurisdictions to determine how, if at all, other states implemented electronic transactions for the registration and titling of motor vehicles. Twenty-eight states responded; of those, only 13 allowed liens to be added or released electronically, and only five indicated allowing applications for registration and titling to be submitted electronically. DOT staff also heard a presentation from South Dakota on its ERT system at an AAMVA regional meeting. The ELT information that emerged suggests that a multi-vendor approach, in which vendors that meet state specifications for participation are authorized to interface with the state's system and serve as a portal between lenders and the state system, will facilitate electronic lien releases and additions by offering lenders more choices and the opportunity to use the same vendor in multiple states. The ERT information that emerged indicates that a multi-interface approach, offering an interface with existing dealer management software (DMS) systems and a separate internet site, will facilitate ERT by offering access that meets a variety of business needs and models. In both instances, the information indicates that, in the long term, adoption rates are positively affected by making participation above a certain minimum threshold mandatory.

4. To assess and compare functions or services that might be offered by or through a vendor, heard presentations from vendors that offer products or services that facilitate some aspect of ELT or ERT.

5. To assess the concerns, needs and interest of Iowa motor vehicle dealers, surveyed dealers regarding the registration and titling difficulties they experienced, the types of DMS systems (if any) they used, and their interest in and preference for using an electronic interface to submit applications for registration and titling. Overall, 40% of the dealers that responded indicated interest and 57% indicated no interest, but interest was pronounced among new car dealers (75% were interested) and dealers with a high number of monthly transactions (85% of dealers averaging more than 50 sales per month were interested). The majority of dealers responding ranked delays in processing and problems with daily limits on transactions as a "minor difficulty" or "no difficulty."

RECOMMENDATIONS – At the conclusion of the meetings, the working group discussed possible approaches for implementation of electronic transactions in Iowa and reached a consensus recommending a phased implementation of electronic titling that addresses first electronic lien and title (ELT) transactions and electronic funds transfer (EFT), and then electronic applications for registration and titling (ERT). The recommendation of a phased implementation is based upon recognition that aspects of ELT and EFT are foundational to ERT, and that ELT and EFT solutions are more readily and easily attained than the ERT solution, which will take longer, be somewhat more difficult to develop, and require federal approval of an electronic odometer statement to fully implement.

ELT – A multi-vendor approach is proposed for ELT. No direct costs to the state, counties, consumers, or dealers are anticipated under this approach. The vendor charges participating lenders user or transaction fees for the service, and it appears the lenders typically absorb those costs due to the savings offered by ELT. Existing staff can complete the programming necessary to interface the state system with vendors' systems. The estimated time to implement ELT is six to nine months. Mandatory participation is not recommended initially, but should be considered after ELT has been implemented and a suitable number of vendors have enrolled to provide a fair assessment of participation rates and opportunities.

EFT – A previous attempt to implement ELT and EFT was terminated due to concern that it would negatively impact county revenues by reducing interest income earned on state funds collected by the county and held until the monthly transfer to the state. To avoid that problem in this implementation, the EFT solution should remain revenue neutral to the counties by allowing fees submitted by EFT to be immediately directed to the proper county account. Because ARTS was designed to accommodate EFT and has the capacity to do so, a vendor is not needed to implement EFT. The estimated time to implement EFT is six to nine months. It is expected that EFT development will overlap ELT development.

ERT – ERT itself must be developed in phases. It will not be possible to quickly implement a fully functioning, paperless ERT system, because federal law requires that transfer of title be accompanied by a written odometer statement unless approval for an alternate electronic statement is granted by the National Highway Traffic Safety Administration (NHTSA). It is expected that it will take as much as a year or more to obtain NHTSA approval, and that NHTSA approval will require design of a system that requires the seller to electronically confirm the seller's identity, make the required disclosure to the buyer, and then transfer the disclosure to the buyer, who must also electronically confirm the buyer's identity and electronically review and accept the disclosure to complete and submit the transaction. Given the time that it will take to develop and gain approval for this solution, initial ERT implementation will focus on completing and submitting applications and issuing registration-applied-for cards electronically, with the understanding that this process will still require submission of paper documents until an electronic odometer solution is developed. Because continued submission of paper documents undermines the efficiencies sought, "full" ERT – in which all documents necessary for registration and titling can be approved and/or accepted by all parties and submitted without transmittal or delivery of duplicate paper documents – should remain the ultimate goal. ERT is not recommended as a means to eliminate review and approval of registration and titling transactions by the county treasurers, or to place registration and titling approval in the hands of the dealers, as county treasurers perform an important role in deterring fraud and promoting accuracy by determining the genuineness and regularity of each application. Authorizing dealers to act as registration agents that approve registration and title applications, issue registration receipts, and maintain and deliver permanent metal license plates is not recommended. Although distribution of permanent plates by dealers is not recommended, it is recommended that dealers participating in ERT generate and print registration-applied-for cards electronically. Unlike the manually issued cards currently in use, cards issued in this fashion may be queried by law enforcement and are less susceptible to misuse by customers and dealers. The estimated time to implement the electronic application and registration-applied-for cards is 12 to 18 months, beginning after ELT and EFT have been implemented. It is recommended that focus during this time be on facilitating transfers through motor vehicle dealers, with initial deployment focused on higher-volume dealers that use DMS systems. In the long term an internet option for access to ERT must also be developed and maintained to allow participation by lower-volume dealers that do not use a DMS system. This option will also lay the groundwork for an ERT option for sales between private individuals. Mandatory participation in Iowa is not recommended initially. As with ELT, it is recommended that mandatory participation be considered after at least an initial phase of ERT has been implemented and a suitable number of dealers have enrolled to provide a fair assessment of participation rates and opportunities. The use of vendors to facilitate ERT is not initially proposed because 1) DOT IT support staff is capable of developing a system that will interact with DMS systems, and would still have to develop a dealer and public interface regardless of whether a vendor acts as intermediary to the DMS systems, and 2) there is concern that the cost of a vendor-based system, which is funded by transaction-based payments from the dealer to the vendor, would be passed to the consumer in the form of additional documentation or conveyance fees. However, the DOT recommends flexibility on this point, as development and piloting of the system may indicate that a multi-vendor approach similar to that recommended for ELT may increase the adoption rate by larger dealers and may ultimately decrease the user management to be exercised by DOT staff. If vendors are used in the process, additional legislation or administrative rules may be needed to control the fees that may be passed to the consumer. No direct cost to the DOT or county treasurers is expected, as the DOT expects that it may complete the necessary programming with existing staff. Use of vendors to facilitate ERT transactions by dealers using DMS systems would result in transaction fees that may ultimately be passed to consumers.

LEGISLATION – As a result of the changes implemented in 2004 under Senate File 2070, the only changes to Iowa statutes proposed are to section 321.69 of the Iowa Code, "Damage disclosure statement," and section 321.71, "Odometer requirements." In each instance, authority to execute these statements by electronic means would be clarified by authorizing language similar to that used in section 321.20, subsections "2" and "3," which allows for electronic applications and directs the department to "adopt rules on the method for providing signatures for applications made by electronic means." In these sections, the authorizing language might read as follows: Notwithstanding contrary provisions of this section, the department may develop and implement a program to allow for any statement required by this section to be made electronically. The department shall adopt rules on the method for providing signatures for statements made by electronic means. Some changes to DOT administrative rules will be useful, but only to enable changes to work processes that would be desirable in the long term. Examples of long-term work processes that would be enabled by rule changes include allowing for signatures created through electronic means and electronic odometer certifications. The DOT rules, as currently written, do not hinder the ability to proceed with ELT, EFT, and ERT.

Relevance:

30.00%

Publisher:

Abstract:

There is a need for more efficient methods giving insight into the complex mechanisms of neurotoxicity. Testing strategies including in vitro methods have been proposed to comply with this requirement. With the present study we aimed to develop a novel in vitro approach which mimics in vivo complexity, detects neurotoxicity comprehensively, and provides mechanistic insight. For this purpose we combined rat primary re-aggregating brain cell cultures with a mass spectrometry (MS)-based metabolomics approach. As a proof of principle we treated developing re-aggregating brain cell cultures for 48 h with the neurotoxicant methyl mercury chloride (0.1-100 µM) and the brain stimulant caffeine (1-100 µM) and acquired cellular metabolic profiles. To detect toxicant-induced metabolic alterations, the profiles were analysed using commercial software which revealed patterns in the multi-parametric dataset by principal component analysis (PCA) and recognised the most significantly altered metabolites. PCA revealed concentration-dependent cluster formation for methyl mercury chloride (0.1-1 µM) and treatment-dependent cluster formation for caffeine (1-100 µM) at sub-cytotoxic concentrations. Five relevant metabolites responsible for the concentration-dependent alterations following methyl mercury chloride treatment could be identified using MS-MS fragmentation analysis: gamma-aminobutyric acid, choline, glutamine, creatine and spermine. Their respective mass ion intensities demonstrated metabolic alterations in line with the literature and suggest that the metabolites could be biomarkers for mechanisms of neurotoxicity or neuroprotection. In addition, we evaluated whether the approach could identify neurotoxic potential by testing eight compounds which have target organ toxicity in the liver, kidney or brain at sub-cytotoxic concentrations. PCA revealed cluster formation largely dependent on target organ toxicity, indicating potential for the development of a neurotoxicity prediction model. Given such results, it would be useful to perform a validation study to determine the reliability, relevance and applicability of this approach to neurotoxicity screening. Thus, for the first time we show the benefits and utility of in vitro metabolomics to comprehensively detect neurotoxicity and to discover new biomarkers.
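A minimal sketch of the pattern-finding step, using an open-source PCA in place of the commercial software; the array shapes, labels and values are synthetic placeholders, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 12 cultures x 200 metabolite ion intensities (synthetic placeholder data).
profiles = rng.lognormal(mean=0.0, sigma=1.0, size=(12, 200))
labels = ["control"] * 4 + ["MeHgCl 0.1 uM"] * 4 + ["MeHgCl 1 uM"] * 4

# Log-transform and centre, as is common for MS intensity data.
X = np.log(profiles)
X -= X.mean(axis=0)

# Project onto two principal components and inspect cluster formation.
pca = PCA(n_components=2)
scores = pca.fit_transform(X)
for label, (pc1, pc2) in zip(labels, scores):
    print(f"{label:>14}: PC1={pc1:+.2f}  PC2={pc2:+.2f}")
print("explained variance ratios:", pca.explained_variance_ratio_)
```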

Relevance:

30.00%

Publisher:

Abstract:

This paper examines the statistical analysis of social reciprocity at the group, dyadic, and individual levels. Given that testing statistical hypotheses regarding social reciprocity can also be of interest, a statistical procedure based on Monte Carlo sampling has been developed and implemented in R in order to allow social researchers to describe groups and make statistical decisions.
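A minimal sketch of the Monte Carlo idea behind such a procedure (the published implementation is in R; the reciprocity statistic and null model below are simplified stand-ins):

```python
import numpy as np

def reciprocity(m):
    """Proportion of interactions that are reciprocated: the matched
    part of each dyad's two directed counts, relative to all counts."""
    off = ~np.eye(m.shape[0], dtype=bool)
    return np.minimum(m, m.T)[off].sum() / m[off].sum()

def monte_carlo_p(observed, n_sims=9999, seed=1):
    """One-sided p-value: how often randomly scattered interactions
    show at least the observed level of reciprocity."""
    rng = np.random.default_rng(seed)
    n = observed.shape[0]
    total = int(observed.sum())
    rows, cols = np.where(~np.eye(n, dtype=bool))  # directed dyads
    obs = reciprocity(observed)
    hits = 0
    for _ in range(n_sims):
        sim = np.zeros((n, n), dtype=int)
        pick = rng.integers(0, len(rows), size=total)
        np.add.at(sim, (rows[pick], cols[pick]), 1)
        hits += reciprocity(sim) >= obs
    return (hits + 1) / (n_sims + 1)

# Example: a small sociomatrix of directed interaction counts.
counts = np.array([[0, 5, 1], [4, 0, 0], [2, 1, 0]])
print(monte_carlo_p(counts, n_sims=999))
```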

Relevance:

30.00%

Publisher:

Abstract:

This research work deals with the problem of modeling and designing a low-level speed controller for the mobile robot PRIM. The main objective is to develop an effective educational and research tool. On one hand, the interest in using the open mobile platform PRIM lies in integrating, in an educational context, several subjects closely related to automatic control theory, embracing communications, signal processing, sensor fusion and hardware design, amongst others. On the other hand, the idea is to implement useful navigation strategies such that the robot can serve as a mobile multimedia information point. It is in this context, with navigation strategies oriented to goal achievement, that a local model predictive control is attained; such studies present a very interesting control strategy for developing the future capabilities of the system. The research developed here includes visual information as a meaningful source for detecting obstacle position coordinates as well as for planning the obstacle-free trajectory that the robot should follow.
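A minimal sketch of one local model predictive control step for goal reaching with obstacle avoidance; the kinematic model, cost terms and parameters are illustrative, not those of PRIM:

```python
import itertools
import math

def simulate(state, v, w, horizon=10, dt=0.1):
    """Roll a unicycle-model trajectory forward from (x, y, heading)."""
    x, y, th = state
    traj = []
    for _ in range(horizon):
        x += v * math.cos(th) * dt
        y += v * math.sin(th) * dt
        th += w * dt
        traj.append((x, y))
    return traj

def mpc_step(state, goal, obstacles):
    """Pick the (v, w) command whose predicted trajectory ends nearest
    the goal while staying clear of the detected obstacles."""
    best, best_cost = (0.0, 0.0), float("inf")
    for v, w in itertools.product([0.1, 0.3, 0.5], [-0.6, -0.3, 0.0, 0.3, 0.6]):
        traj = simulate(state, v, w)
        cost = math.hypot(traj[-1][0] - goal[0], traj[-1][1] - goal[1])
        for ox, oy in obstacles:  # e.g. coordinates from the vision system
            if min(math.hypot(px - ox, py - oy) for px, py in traj) < 0.3:
                cost += 100.0     # heavy penalty for near-collisions
        if cost < best_cost:
            best, best_cost = (v, w), cost
    return best

# One control step: current pose, goal, and one detected obstacle.
v, w = mpc_step((0.0, 0.0, 0.0), goal=(2.0, 1.0), obstacles=[(1.0, 0.5)])
print(f"command: v={v} m/s, w={w} rad/s")
```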

Relevance:

30.00%

Publisher:

Abstract:

Biometric system performance can be improved by means of data fusion. Several kinds of information can be fused in order to obtain a more accurate classification (identification or verification) of an input sample. In this paper we present a method for computing the weights in a weighted-sum fusion of score combinations by means of a likelihood model. The maximum likelihood estimation is set up as a linear programming problem. The scores are derived from GMM classifiers, each working on a different feature extractor. Our experimental results assessed the robustness of the system against changes over time (different sessions) and against a change of microphone. The improvements obtained were significantly better (error bars of two standard deviations) than a uniform weighted sum, a uniform weighted product, or the best single classifier. The proposed method scales computationally with the number of scores to be fused as the simplex method for linear programming does.
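A minimal sketch of casting the weight estimation as a linear programme; the objective below is a simplified stand-in for the paper's likelihood model (and a bare LP like this drives the weights to a vertex, whereas the likelihood constraints in the paper shape them differently):

```python
import numpy as np
from scipy.optimize import linprog

# Rows: training samples; columns: per-classifier score margins
# (genuine score minus impostor score). Synthetic placeholder data.
rng = np.random.default_rng(0)
margins = rng.normal(loc=[0.5, 0.2, 0.8], scale=0.3, size=(200, 3))

# Maximise the mean fused margin -> minimise its negative, subject to
# the weights summing to one and being non-negative.
c = -margins.mean(axis=0)
res = linprog(c, A_eq=np.ones((1, 3)), b_eq=[1.0], bounds=[(0, 1)] * 3)
weights = res.x
print("fusion weights:", weights)

def fuse(scores):
    """Weighted-sum fusion of one sample's classifier scores."""
    return float(np.dot(weights, scores))
```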

Relevance:

30.00%

Publisher:

Abstract:

Background: Research in epistasis or gene-gene interaction detection for human complex traits has grown over the last few years. It has been marked by promising methodological developments, improved translation efforts of statistical epistasis to biological epistasis, and attempts to integrate different omics information sources into the epistasis screening to enhance power. The quest for gene-gene interactions poses severe multiple-testing problems. In this context, the maxT algorithm is one technique to control the false-positive rate. However, the memory needed by this algorithm rises linearly with the number of hypothesis tests. Gene-gene interaction studies will require memory proportional to the squared number of SNPs. A genome-wide epistasis search would therefore require terabytes of memory. Hence, cache problems are likely to occur, increasing the computation time. In this work we present a new version of maxT, requiring an amount of memory independent of the number of genetic effects to be investigated. This algorithm was implemented in C++ in our epistasis screening software MBMDR-3.0.3. We evaluate the new implementation in terms of memory efficiency and speed using simulated data. The software is illustrated on real-life data for Crohn’s disease. Results: In the case of a binary (affected/unaffected) trait, the parallel workflow of MBMDR-3.0.3 analyzes all gene-gene interactions in a dataset of 100,000 SNPs typed on 1000 individuals within 4 days and 9 hours, using 999 permutations of the trait to assess statistical significance, on a cluster composed of 10 blades, each containing four Quad-Core AMD Opteron(tm) 2352 processors at 2.1 GHz. In the case of a continuous trait, a similar run takes 9 days. Our program found 14 SNP-SNP interactions with a multiple-testing corrected p-value of less than 0.05 on real-life Crohn’s disease (CD) data. Conclusions: Our software is the first implementation of the MB-MDR methodology able to solve large-scale SNP-SNP interaction problems within a few days, without using much memory, while adequately controlling the type I error rates. A new implementation to reach genome-wide epistasis screening is under construction. In the context of Crohn’s disease, MBMDR-3.0.3 could identify epistasis involving regions that are well known in the field and that can be explained from a biological point of view. This demonstrates the power of our software to find relevant phenotype-genotype higher-order associations.
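A minimal sketch of the memory-lean maxT idea: only the vector of per-permutation maxima is stored, so memory does not grow with the number of tests. The per-SNP statistic below is a placeholder, not the MB-MDR statistic:

```python
import numpy as np

def test_statistics(genotypes, trait):
    """Placeholder per-SNP statistic: squared correlation with the trait."""
    g = (genotypes - genotypes.mean(axis=0)) / genotypes.std(axis=0)
    t = (trait - trait.mean()) / trait.std()
    return (g.T @ t / len(t)) ** 2

def maxt_adjusted_p(genotypes, trait, n_perm=999, seed=7):
    rng = np.random.default_rng(seed)
    observed = test_statistics(genotypes, trait)   # one value per test
    perm_max = np.empty(n_perm)                    # the only stored vector
    for b in range(n_perm):
        shuffled = rng.permutation(trait)          # permute the trait
        perm_max[b] = test_statistics(genotypes, shuffled).max()
    # Adjusted p-value: how often the permutation maximum reaches the
    # observed statistic (+1 correction for the observed data itself).
    return (1 + (perm_max[None, :] >= observed[:, None]).sum(axis=1)) / (n_perm + 1)

# Example with synthetic data: 500 SNPs typed on 100 individuals.
rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(100, 500)).astype(float)
y = rng.normal(size=100)
p_adj = maxt_adjusted_p(G, y, n_perm=199)
print((p_adj < 0.05).sum(), "tests significant after maxT correction")
```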

Relevance:

30.00%

Publisher:

Abstract:

The objective of this project is the development of a mobile application for devices running the Android operating system that allows searching for hotel offers in a way that differs from current approaches. The waterfall life-cycle model was used, with analysis, design, construction and testing phases for the developed system. The final software follows a client/server architecture and was written with Java as the base programming language, making use of libraries such as Apache HTTP Request for connections to the remote server, as well as Android's own libraries, which ease the creation of graphical interfaces and the management of device resources when developing mobile applications.

Relevance:

30.00%

Publisher:

Abstract:

Currently, hydraulic cement grouts are approved for Iowa Department of Transportation projects on the basis of a pullout test. However, other properties of the grouts should be evaluated. Therefore, this research was initiated to develop criteria to better evaluate hydraulic cement grouts. Fourteen grouts were tested for compressive strength, time of set, durability, consistency and shrinkage. All tested grouts yielded compressive strengths higher than 3000 psi at 7 days, and durability factors were well above 70. Time of set and consistency were adequate. The testing showed that most of the grouts shrank, even though they were labeled non-shrink grouts. For many applications of grouts, such as setting anchor bolts and use as a filler, minor shrinkage is not a problem. However, for some critical applications, shrinkage cannot be tolerated. The proposed Instructional Memorandum will identify those grouts which do not excessively shrink or expand in the tests used. Based on the test results, criteria for the evaluation of hydraulic cement grouts have been recommended. Evaluation consists of tests for compressive strength, time of set, durability, consistency, shrinkage, and pullout.

Relevance:

30.00%

Publisher:

Abstract:

Recent reports indicate that of the more than 25,000 bridges in Iowa, slightly over 7,000 (29%) are either structurally deficient or functionally obsolete. While many of these bridges may be strengthened or rehabilitated, some simply need to be replaced. Before implementing one of these options, one should consider performing a diagnostic load test on the structure to more accurately assess its load carrying capacity. Frequently, diagnostic load tests reveal strength and serviceability characteristics that exceed the predicted codified parameters. Usually, codified parameters are very conservative in predicting lateral load distribution characteristics and the influence of other structural attributes, so the predicted rating factors are typically conservative. In cases where theoretical calculations show a structural deficiency, it may be very beneficial to apply a "tool" that utilizes a more accurate theoretical model incorporating field-test data. At a minimum, this approach results in more accurate load ratings, and it often results in increased rating factors. Bridge Diagnostics, Inc. (BDI) developed hardware and software that are specially designed for performing bridge ratings based on data obtained from physical testing. To evaluate the BDI system, the research team performed diagnostic load tests on seven "typical" bridge structures: three steel-girder bridges with concrete decks, two concrete slab bridges, and two steel-girder bridges with timber decks. In addition, a steel-girder bridge with a concrete deck previously tested and modeled by BDI was investigated for model verification purposes. The tests were performed by attaching strain transducers to the bridges at critical locations to measure strains resulting from truck loading positioned at various locations on the bridge. The field test results were used to develop and validate analytical rating models. Based on the experimental and analytical results, it was determined that bridge tests could be conducted relatively easily, that accurate models could be generated with the BDI software, and that the load ratings, in general, were greater than the ratings obtained using the codified LFD Method (according to the AASHTO Standard Specifications for Highway Bridges).
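For context, a worked sketch of the codified LFD rating factor that such field-calibrated models refine; the coefficients are the standard AASHTO LFD values, and the moment figures below are illustrative only:

```python
def rating_factor(capacity, dead_load, live_load, impact=0.30, a1=1.3, a2=2.17):
    """RF = (C - A1*D) / (A2 * L * (1 + I)). RF >= 1.0 means the member
    carries the rating vehicle; A2 = 2.17 for inventory level, 1.3 for
    operating level. I is the impact allowance, 50/(L+125) <= 0.30."""
    return (capacity - a1 * dead_load) / (a2 * live_load * (1 + impact))

# Example: member moment capacity 3200 kip-ft, dead-load moment
# 900 kip-ft, live-load moment 520 kip-ft for the rating truck.
print(f"inventory RF = {rating_factor(3200, 900, 520):.2f}")
```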

Relevance:

30.00%

Publisher:

Abstract:

Currently, individuals including designers, contractors, and owners learn about project requirements by studying a combination of paper and electronic copies of the construction documents, including the drawings, specifications (standard and supplemental), road and bridge standard drawings, design criteria, contracts, addenda, and change orders. This can be a tedious process, since one needs to go back and forth between the various documents (paper or electronic) to obtain information about the entire project. Object-oriented computer-aided design (OO-CAD) is an innovative technology that can change this process through the graphical portrayal of information. OO-CAD allows users to point and click on portions of an object-oriented drawing that are then linked to relevant databases of information (e.g., specifications, procurement status, and shop drawings). The vision of this study is to turn paper-based design standards and construction specifications into an object-oriented design and specification (OODAS) system, or visual electronic reference library (ERL). Individuals can use the system through a handheld wireless book-size laptop that includes all of the necessary software for operating in a 3D environment. All parties involved in transportation projects can access all of the standards and requirements simultaneously using a 3D graphical interface. By using this system, users will have all of the design elements and all of the specifications readily available without concerns of omissions. A prototype object-oriented model was created and demonstrated to potential users representing counties, cities, and the state. Findings suggest that a system like this could improve productivity in finding information by as much as 75% and provide a greater sense of confidence that all relevant information has been identified. It was also apparent that this system would be used by more people in construction than in design. There was also concern about the cost to develop and maintain the complete system. The future direction should focus on a project-based system that can help contractors and DOT inspectors find information (e.g., road standards, specifications, instructional memorandums) more rapidly as it pertains to a specific project.
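A minimal sketch of the linking idea behind OODAS, with each drawing element carrying references into the specification library; the record names and identifiers are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class SpecRecord:
    doc_id: str      # e.g. a standard road plan or spec section number
    title: str
    url: str

@dataclass
class DrawingElement:
    element_id: str
    kind: str                          # "guardrail", "pavement joint", ...
    links: list = field(default_factory=list)

    def on_click(self):
        """What a point-and-click lookup would return for this element."""
        return [(r.doc_id, r.title, r.url) for r in self.links]

joint = DrawingElement("E-102", "pavement joint", links=[
    SpecRecord("RS-3210", "Joint detail standard (illustrative)",
               "file://specs/rs-3210"),
])
print(joint.on_click())
```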

Relevance:

30.00%

Publisher:

Abstract:

Machine Learning for geospatial data: algorithms, software tools and case studies. Abstract: The thesis is devoted to the analysis, modeling and visualisation of spatial environmental data using machine learning algorithms. In a broad sense, machine learning can be considered a subfield of artificial intelligence; it mainly concerns the development of techniques and algorithms that allow computers to learn from data. In this thesis, machine learning algorithms are adapted to learn from spatial environmental data and to make spatial predictions. Why machine learning? In a few words, most machine learning algorithms are universal, adaptive, nonlinear, robust and efficient modeling tools. They can find solutions to classification, regression, and probability density modeling problems in high-dimensional geo-feature spaces, composed of geographical space and additional relevant spatially referenced features. They are well suited to being implemented as predictive engines in decision support systems, for the purposes of environmental data mining including pattern recognition, modeling and prediction as well as automatic data mapping. Their efficiency is competitive with geostatistical models in low-dimensional geographical spaces, but they are indispensable in high-dimensional geo-feature spaces. The most important and popular machine learning algorithms and models of interest for geo- and environmental sciences are presented in detail, from theoretical description of the concepts to software implementation. The main algorithms and models considered are the multilayer perceptron (MLP, a workhorse of machine learning), general regression neural networks (GRNN), probabilistic neural networks (PNN), self-organising (Kohonen) maps (SOM), Gaussian mixture models (GMM), radial basis function networks (RBF), and mixture density networks (MDN). This set of models covers machine learning tasks such as classification, regression, and density estimation. Exploratory data analysis (EDA) is an initial and very important part of data analysis. In this thesis the concepts of exploratory spatial data analysis (ESDA) are considered using both a traditional geostatistical approach, experimental variography, and machine learning. Experimental variography, which studies the relationships between pairs of points, is a basic tool for the geostatistical analysis of anisotropic spatial correlations; it helps to detect the presence of spatial patterns describable by two-point statistics. A machine learning approach to ESDA is presented through the k-nearest neighbours (k-NN) method, which is simple and has very good interpretation and visualization properties. An important part of the thesis deals with a topical subject: the automatic mapping of geospatial data. The general regression neural network is proposed as an efficient model to solve this task. The performance of the GRNN model is demonstrated on Spatial Interpolation Comparison (SIC) 2004 data, where GRNN significantly outperformed all other approaches, especially in the case of emergency conditions. The thesis consists of four chapters: theory, applications, software tools, and how-to-do-it examples. An important part of the work is a collection of software tools, Machine Learning Office. These tools were developed over the last 15 years and have been used both for many teaching courses, including international workshops in China, France, Italy, Ireland and Switzerland, and for fundamental and applied research projects. The case studies considered cover a wide spectrum of real-life low- and high-dimensional geo- and environmental problems, such as air, soil and water pollution by radionuclides and heavy metals, classification of soil types and hydro-geological units, decision-oriented mapping with uncertainties, and natural hazard (landslides, avalanches) assessment and susceptibility mapping. Complementary tools for exploratory data analysis and visualisation were developed as well, with care taken to create a user-friendly and easy-to-use interface.
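A minimal sketch of a general regression neural network (GRNN), equivalent to Nadaraya-Watson kernel regression, applied to spatial interpolation; the data and the kernel width are illustrative:

```python
import numpy as np

def grnn_predict(train_xy, train_z, query_xy, sigma=0.5):
    """Predict at query points as a Gaussian-kernel weighted average of
    the training values; sigma is the single smoothing parameter,
    typically tuned by cross-validation."""
    d2 = ((query_xy[:, None, :] - train_xy[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d2 / (2 * sigma ** 2))
    return (w @ train_z) / w.sum(axis=1)

# Example: interpolate a synthetic field from 100 scattered samples.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(100, 2))
z = np.sin(xy[:, 0]) + 0.1 * rng.normal(size=100)
grid = np.array([[2.0, 3.0], [7.5, 1.0]])
print(grnn_predict(xy, z, grid))
```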