711 results for "Localization real-world challenges"


Relevância:

100.00%

Resumo:

The final-year project came to us as an opportunity to get involved in a topic that had proved attractive throughout our economics degree: statistics and its application to the analysis of economic data, i.e. econometrics. Moreover, the combination of econometrics and computer science is a very hot topic nowadays, given the Information Technologies boom of the last decades and the consequent exponential increase in the amount of data collected and stored day by day. Data analysts able to deal with Big Data and to extract useful results from it are in high demand and, in our understanding, the work they do, although sometimes controversial in terms of ethics, is a clear source of added value both for private corporations and for the public sector. For these reasons, the essence of this project is the study of a statistical instrument, directly related to computer science, that is valid for the analysis of large datasets: Partial Correlation Networks. The structure of the project has been shaped by our objectives as the work developed. First, the characteristics of the studied instrument are explained, from the basic ideas up to the features of the underlying model, with the final goal of presenting the SPACE model as a tool for estimating interconnections between elements in large data sets. Afterwards, an illustrated simulation is performed in order to show the power and efficiency of the model presented. Finally, the model is put into practice by analysing a relatively large set of real-world data, with the objective of assessing whether the proposed statistical instrument is valid and useful when applied to a real multivariate time series. In short, our main goals are to present the model and to evaluate whether Partial Correlation Network Analysis is an effective, useful instrument that allows valuable results to be extracted from Big Data. The findings throughout this project suggest that the Partial Correlation Estimation by Joint Sparse Regression Models approach presented by Peng et al. (2009) works well under the assumption of sparsity of the data. Moreover, partial correlation networks are shown to be a valid tool for representing cross-sectional interconnections between elements in large data sets. The scope of this project is nevertheless limited, as there are sections in which deeper analysis would have been appropriate: considering intertemporal connections between elements, the choice of the tuning parameter lambda, or a deeper analysis of the results in the real-data application are examples of aspects in which this project could be extended. To sum up, the analysed statistical tool has proved to be a very useful instrument for finding relationships that connect the elements of a large data set, and partial correlation networks allow the owner of such a data set to observe and analyse linkages that might otherwise have gone unnoticed.
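
For readers who want to experiment with the idea, the following is a minimal, hedged sketch of how a partial correlation network can be estimated in Python. The thesis works with the SPACE joint sparse regression estimator of Peng et al. (2009); here a related sparse precision-matrix estimator (the graphical lasso) stands in for it, and all function names, thresholds and data are illustrative only.

```python
# Minimal sketch: estimating a partial correlation network from a data matrix.
# The graphical lasso is a sparse precision-matrix estimator related to (but
# not the same as) the SPACE approach of Peng et al. (2009).
import numpy as np
from sklearn.covariance import GraphicalLassoCV

def partial_correlation_network(X, threshold=0.05):
    """X: (n_samples, n_variables) array. Returns (partial_corr, adjacency)."""
    model = GraphicalLassoCV().fit(X)        # cross-validated sparsity level
    precision = model.precision_             # estimated inverse covariance
    d = np.sqrt(np.diag(precision))
    partial_corr = -precision / np.outer(d, d)
    np.fill_diagonal(partial_corr, 1.0)
    adjacency = np.abs(partial_corr) > threshold   # edges of the network
    np.fill_diagonal(adjacency, False)
    return partial_corr, adjacency

# Example on synthetic data with one induced linkage:
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 20))
X[:, 1] += 0.7 * X[:, 0]
pcorr, adj = partial_correlation_network(X)
print(adj.sum() // 2, "edges recovered")
```

In both estimators, an edge between two variables indicates a non-zero partial correlation, i.e. a dependence that remains after conditioning on all the other variables in the data set.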

Relevância:

100.00%

Resumo:

This report proposes that, for certain types of highway construction projects undertaken by the Iowa Department of Transportation, a scheduling technique commonly referred to as linear scheduling may be more effective than the Critical Path Method (CPM) scheduling technique that is currently being used. The types of projects that appear to be good candidates for the technique are those that have a strong linear orientation. Like a bar chart, the technique shows when an activity is scheduled to occur, and like a CPM schedule it shows the sequence in which activities are expected to occur. During the 1992 construction season, the authors worked with an inlay project on Interstate 29 to demonstrate the linear scheduling technique to the Construction Office. The as-planned schedule was developed from the CPM schedule that the contractor had prepared for the project; it therefore represents what a linear rendering of a CPM schedule looks like, not necessarily what a true linear schedule would look like had it been the only scheduling technique applied to the project. There is a need to expand the current repertoire of scheduling techniques to address those projects for which the bar chart and CPM may not be appropriate, either because they provide too little control information or because they impose an overly complex process relative to the actual project characteristics. The scheduling approaches used today on transportation projects have many shortcomings when it comes to modeling the real-world constraints and conditions that are encountered. Linear projects' tendency toward activities with variable production rates, a concept that is very difficult to handle with CPM, is easily handled and visualized with the linear technique. It is recommended that work proceed on refining the linear scheduling method described in this report and on developing a microcomputer-based system for its implementation by the Iowa Department of Transportation and contractors. The system will be designed to provide the information needed to adjust schedules in a rational, understandable manner, to monitor progress on projects, and to alert Iowa Department of Transportation personnel when the contractor is deviating from the plan.
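
As a rough illustration of what a linear schedule encodes, the sketch below represents each activity as a line in time-location space and flags stations where a following crew would come too close to the preceding one. The activities, production rates and buffer threshold are invented for illustration and are not taken from the Interstate 29 project.

```python
# Minimal sketch of a linear schedule: each activity is a line in
# (time, location) space defined by a start day, start station and a
# constant production rate (stations per day). All figures are illustrative.
from dataclasses import dataclass

@dataclass
class LinearActivity:
    name: str
    start_day: float
    start_station: float   # e.g. km or station number
    end_station: float
    rate: float            # stations completed per day

    def day_at_station(self, station: float) -> float:
        """Day on which the crew reaches a given station."""
        return self.start_day + (station - self.start_station) / self.rate

activities = [
    LinearActivity("Milling", start_day=0, start_station=0, end_station=10, rate=1.0),
    LinearActivity("Paving",  start_day=2, start_station=0, end_station=10, rate=1.5),
]

# Simple check of the time buffer between the two crews at each station:
for s in range(0, 11):
    buffer = activities[1].day_at_station(s) - activities[0].day_at_station(s)
    if buffer < 1.0:   # flag stations where the following crew gets too close
        print(f"Station {s}: buffer {buffer:.1f} days - potential interference")
```

Plotting these lines against a location axis is what gives the technique its visual appeal: a faster following activity shows up as a steeper line converging on its predecessor.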

Relevância:

100.00%

Resumo:

We present a seabed profile estimation and following method for close-proximity inspection of 3D underwater structures using autonomous underwater vehicles (AUVs). The presented method is used to determine a path allowing the AUV to pass its sensors over all points of the target structure, a task known as coverage path planning. Our profile-following method goes beyond traditional seabed following at a safe altitude and exploits the hovering capabilities of recent AUVs. A range sonar is used to incrementally construct a local probabilistic map representation of the environment, and estimates of the local profile are obtained via linear regression. Two behavior-based controllers use these estimates to perform horizontal and vertical profile following. We build upon these tools to address coverage path planning for 3D underwater structures using a (potentially inaccurate) prior map and following cross-section profiles of the target structure. The feasibility of the proposed method is demonstrated with the GIRONA 500 AUV, both in simulation using synthetic and real-world bathymetric data and in pool trials.
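
A minimal sketch of the profile-estimation step is given below: a local line is fitted to sonar returns by least-squares regression and the result drives a simple proportional vertical-following command. The data format, controller gain and standoff distance are assumptions for illustration, not the controllers used on GIRONA 500.

```python
# Sketch of local profile estimation via linear regression on sonar returns,
# followed by a proportional vertical-rate command that holds a fixed standoff.
import numpy as np

def estimate_local_profile(points):
    """points: (N, 2) array of (along-track distance, range below vehicle)
    extracted from the sonar-based local map."""
    x, r = points[:, 0], points[:, 1]
    slope, intercept = np.polyfit(x, r, deg=1)   # least-squares line fit
    return slope, intercept

def vertical_command(points, desired_standoff, k_p=0.5):
    """Proportional vertical-rate command to hold a fixed standoff distance."""
    slope, intercept = estimate_local_profile(points)
    altitude = intercept                  # estimated range to the profile below the vehicle
    error = altitude - desired_standoff
    heave_rate = k_p * error              # positive = move down, towards the profile
    return heave_rate, slope              # slope can feed a pitch / feed-forward term

# Illustrative use with synthetic sonar hits over a sloping surface:
rng = np.random.default_rng(1)
x = np.linspace(0.0, 5.0, 30)
ranges = 4.0 + 0.3 * x + 0.05 * rng.standard_normal(30)
cmd, local_slope = vertical_command(np.column_stack([x, ranges]), desired_standoff=3.0)
print(f"heave rate {cmd:.2f} m/s, local slope {local_slope:.2f}")
```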

Relevância:

100.00%

Resumo:

Bacterial bioreporters have substantial potential for contaminant assessment, but their real-world application is currently impaired by a lack of sensitivity. Here, we exploit the bioconcentration of chemicals in the urine of animals to facilitate pollutant detection. The shore crab Carcinus maenas was exposed to the organic contaminant 2-hydroxybiphenyl, and its urine was screened using an Escherichia coli-based luciferase gene (luxAB) reporter assay specific to this compound. Bioassay measurements differentiated between the original contaminant and its metabolites, quantifying bioconcentration factors of up to one hundred-fold in crab urine. Our results reveal the substantial potential of bacterial bioreporter assays for real-time monitoring of biological matrices to determine exposure histories, with wide-ranging potential for the in situ measurement of xenobiotics in risk assessment and epidemiology.
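
To make the bioconcentration arithmetic concrete, the sketch below shows how a bioconcentration factor could be derived from calibrated bioreporter readings: luminescence is converted to concentration through a standard curve, and the factor is the ratio of the urine concentration to the exposure-water concentration. All calibration points, readings and the dilution factor are invented for illustration; they are not data from this study.

```python
# Hypothetical bioconcentration-factor (BCF) calculation from reporter readings.
import numpy as np

# Hypothetical calibration: luminescence (RLU) for known 2-hydroxybiphenyl
# standards (uM), assumed to lie in the linear response range of the reporter.
standards_conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0])
standards_rlu = np.array([120.0, 950.0, 1800.0, 3500.0, 8600.0])
slope, intercept = np.polyfit(standards_conc, standards_rlu, deg=1)

def rlu_to_conc(rlu):
    """Invert the linear calibration curve."""
    return (rlu - intercept) / slope

dilution = 20.0                            # hypothetical urine dilution before the assay
water_rlu, urine_rlu = 1800.0, 8600.0      # hypothetical assay readings
water_conc = rlu_to_conc(water_rlu)
urine_conc = rlu_to_conc(urine_rlu) * dilution
bcf = urine_conc / water_conc
print(f"water ~{water_conc:.1f} uM, urine ~{urine_conc:.0f} uM, BCF ~{bcf:.0f}x")
```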

Relevância:

100.00%

Resumo:

Distributed systems development means geographically distributing an information system development project by dividing the project group into virtual teams located in different towns or countries. This Master's thesis introduces distributed systems development as a concept and examines the potential benefits and problems of distributing the development process and of using virtual teams, based both on the literature and on a project carried out at a company, together with the solutions presented in the literature for keeping the risks as small as possible. On this basis, the key factors affecting project success are identified. By combining means presented in the literature, the thesis seeks solutions for realizing the potential benefits of virtual teams and for avoiding the risks. The developed solutions are examined in the light of example cases from the literature and of first-hand practical experience gained from a distributed systems development project. The thesis also takes a look at tools that support distributed work. Based on the literature, and partly also on practical experience, the most important aspects of supporting distributed systems development proved to be supporting communication within the project group, especially when team members cannot use their native language, building trust and cohesion between team members, and coordinating the work.

Relevância:

100.00%

Resumo:

This Master's thesis investigates making the computation of a disparity map more efficient by means of interpolation. Using triangulation, a sparse disparity map is first formed from the stereo image, after which a disparity map covering the whole image is formed by interpolation. Triangulation requires knowing, in both cameras, the image points that correspond to the same real-world point. Even though the search area for corresponding points can be reduced from two dimensions to one by using, for example, epipolar geometry, it is computationally more efficient to determine part of the disparity map by interpolation than to search for corresponding image points in the stereo images. Also, owing to the distance between the cameras of a stereo vision system, not all points of one image can be found in the other. It is therefore impossible to determine a disparity map covering the whole image from corresponding points alone. In this work, dynamic programming and a correlation method are used to find the corresponding points. Real-world surfaces are generally continuous, so from a geometric point of view it is justified to approximate the surfaces depicted in the images by interpolation. There is also scientific evidence that human stereo vision interpolates object surfaces.
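
The sketch below illustrates, under simplifying assumptions, the two-step approach described above: a sparse disparity map obtained by correlation-based matching along the rows of a rectified stereo pair, followed by interpolation over the full image grid. The window size, search range and the choice of scipy's griddata are illustrative; the dynamic-programming matcher used in the thesis is not reproduced here.

```python
# Sparse disparities from normalised correlation, then a dense map by interpolation.
import numpy as np
from scipy.interpolate import griddata

def sparse_disparities(left, right, block=7, max_disp=32, step=8):
    """Match sample points of the left image against the right image using
    normalised correlation over a square window; returns (points, disparities)."""
    h, w = left.shape
    r = block // 2
    pts, disps = [], []
    for y in range(r, h - r, step):
        for x in range(max_disp + r, w - r, step):
            patch = left[y - r:y + r + 1, x - r:x + r + 1].astype(float)
            patch -= patch.mean()
            best_score, best_d = -np.inf, 0
            for d in range(max_disp):
                cand = right[y - r:y + r + 1, x - d - r:x - d + r + 1].astype(float)
                cand -= cand.mean()
                score = (patch * cand).sum() / (np.linalg.norm(patch) * np.linalg.norm(cand) + 1e-9)
                if score > best_score:
                    best_score, best_d = score, d
            pts.append((x, y))
            disps.append(best_d)
    return np.array(pts), np.array(disps, dtype=float)

def dense_disparity(left, right):
    """Sparse matching followed by interpolation over the full image grid
    (pixels outside the convex hull of the samples remain NaN)."""
    pts, disps = sparse_disparities(left, right)
    h, w = left.shape
    gx, gy = np.meshgrid(np.arange(w), np.arange(h))
    return griddata(pts, disps, (gx, gy), method='linear')
```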

Relevância:

100.00%

Resumo:

Better understanding of stromatolites and microbial mats is an important topic in biogeosciences, as it helps the study of the earliest forms of life on Earth, provides clues regarding the ecology of microbial communities and the contribution of microorganisms to biomineralization, and even lays some groundwork for research in exobiology. Modelling, in turn, is a powerful tool used in the natural sciences to approach phenomena theoretically. Models are usually built on a system of differential equations, and results are obtained by solving that system. Available software for implementing models includes mathematical solvers and general simulation packages. The main objective of this thesis is to develop models and software that help to understand, through simulation, the functioning of stromatolites and microbial mats. The software was developed in C++ from scratch for maximum performance and flexibility, which allows models to be built that are far more specific and better suited to the phenomena being modelled than general-purpose software would permit. First, we studied stromatolite growth and morphology. We built a three-dimensional model based on diffusion-limited aggregation, implemented in two C++ applications: a simulation engine that can run a batch of simulations and produce result files, and a visualization tool that allows the results to be analysed in three dimensions. After verifying that this model can indeed reproduce the growth and morphology of several types of stromatolites, we introduced a sedimentation process as an external factor. This led to interesting results and supported the hypothesis that stromatolite morphology may be the result of external factors as much as of internal ones. This is important because stromatolite classification is usually based on morphology, which presumes that a stromatolite's shape depends on internal factors only (i.e. the microbial mat); our findings contradict this commonly accepted assertion. Second, we investigated the functional aspects of microbial mats in more depth. We built a two-dimensional reaction-diffusion model based on discrete simulation, implemented in a C++ application that allows simulations to be set up and run. We could then compare simulation results with real-world data and verify that the model can indeed mimic the behaviour of certain microbial mats. In this way we were able to propose and test hypotheses about how certain microbial mats function, helping to better understand aspects such as the dynamics of elements, in particular sulfur and oxygen. In conclusion, this work produced software dedicated to simulating microbial mats from both a morphological and a functional point of view, following two different approaches, one holistic and the other more analytical. The software is free and released under the GPL (General Public License).
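
As a small illustration of the growth mechanism the stromatolite model builds on, the sketch below implements diffusion-limited aggregation on a 2D grid, with walkers released above a substrate and sticking to the aggregate on contact. The thesis model is three-dimensional, written in C++ and far more elaborate; the grid size, particle count and release heuristic here are arbitrary choices.

```python
# Small 2D illustration of diffusion-limited aggregation (DLA).
import numpy as np

def dla_growth(size=80, n_particles=600, seed=0):
    """Walkers released above the aggregate stick on contact; the aggregate
    grows upward from a substrate occupying the bottom row."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((size, size), dtype=bool)
    grid[0, :] = True                                  # substrate / seed layer
    steps = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(n_particles):
        max_h = int(np.flatnonzero(grid.any(axis=1)).max())
        y, x = min(max_h + 5, size - 1), int(rng.integers(size))   # release point
        while True:
            dy, dx = steps[rng.integers(4)]
            y = int(np.clip(y + dy, 0, size - 1))
            x = (x + dx) % size                        # wrap around horizontally
            # stick as soon as any 4-neighbour already belongs to the aggregate
            if any(grid[int(np.clip(y + a, 0, size - 1)), (x + b) % size] for a, b in steps):
                grid[y, x] = True
                break
    return grid

structure = dla_growth()
print("aggregate height:", int(np.flatnonzero(structure.any(axis=1)).max()), "cells")
```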

Relevância:

100.00%

Resumo:

The objective of this thesis was to develop, for a company manufacturing mining machines, a reporting model for monitoring the costs of redesigned products. The reports were expected to be needs-driven, reliable in content, easy to update, and readily available. The second objective of the study was to map the current state of product-specific cost monitoring in the target company and its areas for development. The research material was collected through unstructured interviews, by studying company documents, through observation and discussions, and by benchmarking. The study was constructive in nature: its aim was to create a new solution model for a situation perceived as problematic. The result of the study was a highly automated cost reporting model containing two types of product-specific cost reports. The reports are so-called online reports, meaning that users can read them over the company intranet whenever they wish. The prototype reports were drawn up using data from a machine built according to a current product structure. The reports make it possible to monitor machine-specific costs effectively and to identify the factors that must be influenced in order to reach the set target costs. The model also defines the users, distribution, responsibilities and updating of the reports. The real benefit of the reporting model will be obtained once it can be used to monitor the costs of a new product structure. Based on the study, recommendations were also given for developing the target company's cost monitoring in the future.

Relevância:

100.00%

Resumo:

This thesis examines the coordination of the systems development process in a contemporary software-producing organization. The thesis consists of a series of empirical studies in which the actions, conceptions and artifacts of practitioners are analyzed using a theory-building case study research approach. The three phases of the thesis provide empirical observations on different aspects of systems development. The first phase examines the role of architecture in coordination and cost estimation in a multi-site environment. The second phase comprises two studies on the evolving requirements-understanding process and how to measure it. The third phase summarizes the first two and concentrates on the role of methods and how practitioners work with them. All the phases provide evidence that current systems development method approaches are too naïve with respect to the complexity of the real world. In practice, development is influenced by opportunity and other contingent factors. The systems development process is not coordinated through the phases and tasks defined in methods as a universal mechanism for managing the process, as most method approaches assume. Instead, the studies suggest that the systems development process is managed by coordinating development activities, with methods used as tools. These studies contribute to systems development methods by emphasizing support for communication and collaboration between systems development participants. Methods should not describe development activities and phases at a detailed level, but should instead provide higher-level guidance for practitioners on how to act in different systems development environments.

Relevância:

100.00%

Resumo:

The take-off of Semantic Web applications has been slower than expected, at least with respect to “real-world” applications and users. One of the main reasons for this lack of adoption is that most Semantic Web user interfaces are still immature from the usability and accessibility points of view. This is due to the novelty of these technologies, but it also motivates the exploration of alternative interaction paradigms, different from those of “traditional” Web or desktop applications. Our proposal is realized in the Rhizomer platform, which explores the possibilities of the object–action interaction paradigm at Web scale. This paradigm is well suited to heterogeneous resource spaces such as those common on the Semantic Web: resources, described by metadata, correspond to the objects of the paradigm, and Semantic Web services, dynamically associated with these objects, correspond to the actions. The platform is being put into practice in the context of a research project in order to build an open application for media distribution based on Semantic Web technologies. Moreover, its usability and accessibility have been evaluated in this real setting and compared with similar systems.
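
The object–action idea can be illustrated with a short sketch: resources (objects) are described by RDF metadata, and the actions offered to the user are resolved dynamically from the resource's rdf:type. The action registry and sample data below are invented for illustration and do not reflect Rhizomer's actual API, in which the actions are Semantic Web services discovered at runtime.

```python
# Illustrative object-action lookup over RDF metadata using rdflib.
from rdflib import Graph, Namespace, RDF, Literal

EX = Namespace("http://example.org/media/")
g = Graph()
g.add((EX.video42, RDF.type, EX.Video))
g.add((EX.video42, EX.title, Literal("Demo clip")))

# Hypothetical registry mapping resource types to available actions.
ACTIONS = {
    EX.Video: ["play", "download", "annotate"],
    EX.Image: ["view", "download"],
}

def actions_for(resource):
    """Collect the actions applicable to a resource based on its rdf:type."""
    acts = []
    for rdf_type in g.objects(resource, RDF.type):
        acts.extend(ACTIONS.get(rdf_type, []))
    return acts

print(actions_for(EX.video42))   # ['play', 'download', 'annotate']
```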

Relevância:

100.00%

Resumo:

In recent years, elliptic curve cryptography has acquired growing importance, to the point that it now forms part of various industrial standards. Although elliptic curve variants of classical cryptosystems such as RSA have been designed, their main interest lies in their application to cryptosystems based on the Discrete Logarithm Problem, such as those of the ElGamal type. In this case, elliptic curve cryptosystems guarantee the same security as those built over the multiplicative group of a prime finite field, but with much shorter key lengths. We therefore show the good properties of these cryptosystems, as well as the basic requirements for a curve to be cryptographically useful, which are closely related to its cardinality. We review some methods for discarding curves that are not cryptographically useful, as well as others for obtaining good curves from a given one. Finally, we describe some applications, such as their use in smart cards and RFID systems, and conclude with some recent advances in this field.
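
A minimal sketch of the group law underlying ElGamal-type elliptic curve cryptosystems is given below: point addition and double-and-add scalar multiplication over a small prime field. The curve parameters are toy values chosen only so the example runs; they provide no security, and real deployments use standardized curves.

```python
# Point addition and double-and-add on y^2 = x^3 + a*x + b over a prime field.
p, a, b = 23, 1, 1          # toy curve y^2 = x^3 + x + 1 over F_23

def add(P, Q):
    """Add two curve points; None represents the point at infinity."""
    if P is None:
        return Q
    if Q is None:
        return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                                           # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p      # tangent slope
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p             # chord slope
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def mul(k, P):
    """Scalar multiplication k*P by double-and-add."""
    R = None
    while k:
        if k & 1:
            R = add(R, P)
        P = add(P, P)
        k >>= 1
    return R

G = (0, 1)                  # a point on the curve: 1^2 = 0^3 + 0 + 1
priv = 9                    # toy private key
pub = mul(priv, G)          # corresponding public key priv*G
print("public key:", pub)
```

In an ElGamal-type scheme the private key is the scalar and the public key is the corresponding multiple of the base point; the security requirement on the curve's cardinality mentioned above is what makes recovering the scalar infeasible.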

Relevância:

100.00%

Resumo:

In recent years, Semantic Web (SW) research has produced significant outcomes. Various industries have adopted SW technologies, while the ‘deep web’ is still approaching the critical transformation point at which the majority of data found on the deep web will be exploited through SW value layers. In this article we analyse SW applications from a ‘market’ perspective. We set out the key requirements for real-world, SW-enabled information systems and discuss the major difficulties behind the delayed uptake of the SW. The article contributes to the SW and knowledge management literature by providing a context for discourse towards best practices for SW-based information systems.

Relevância:

100.00%

Resumo:

Sudoku problems are among the best-known and most enjoyed pastimes, with a never-diminishing popularity, but over the last few years they have gone from an entertainment to an interesting research area, and a doubly interesting one at that. On the one hand, Sudoku problems, being a variant of Gerechte Designs and Latin Squares, are actively used for experimental design, as in [8, 44, 39, 9]. On the other hand, Sudoku problems, simple as they seem, are really hard structured combinatorial search problems, and thanks to their characteristics and behavior they can be used as benchmark problems for refining and testing solving algorithms and approaches. Also, thanks to their rich inner structure, their study can contribute more than the study of random problems to our goal of solving real-world problems and applications and of understanding the problem characteristics that make them hard to solve. In this work we use two techniques for modeling and solving Sudoku problems, namely Constraint Satisfaction Problem (CSP) and Satisfiability Problem (SAT) approaches. To this end we define the Generalized Sudoku Problem (GSP), where regions can be of rectangular shape, problems can be of any order, and the existence of a solution is not guaranteed. With respect to worst-case complexity, we prove that GSP with block regions of m rows and n columns with m = n is NP-complete. To study the empirical hardness of GSP, we define a series of instance generators that differ in the level of balancing they guarantee between the constraints of the problem, by finely controlling how the holes are distributed among the cells of the GSP. Experimentally, we show that the more balanced the constraints, the higher the complexity of solving the GSP instances, and that GSP is harder than the Quasigroup Completion Problem (QCP), a problem generalized by GSP. Finally, we provide a study of the correlation between backbone variables (variables that take the same value in all solutions of an instance) and the hardness of GSP.
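
A minimal CSP-style formulation of the Generalized Sudoku Problem is sketched below: an order m·n grid with m-row by n-column block regions and all-different constraints on rows, columns and blocks, solved by plain backtracking with a fewest-candidates variable ordering. This is an illustration only, not the CSP or SAT encodings used in the study.

```python
# Backtracking solver for a Generalized Sudoku grid with rectangular blocks.
def solve_gsp(grid, m, n):
    """grid: N x N list of lists with 0 for holes, where N = m * n."""
    N = m * n

    def candidates(r, c):
        used = set(grid[r]) | {grid[i][c] for i in range(N)}
        br, bc = (r // m) * m, (c // n) * n
        used |= {grid[i][j] for i in range(br, br + m) for j in range(bc, bc + n)}
        return [v for v in range(1, N + 1) if v not in used]

    empties = [(r, c) for r in range(N) for c in range(N) if grid[r][c] == 0]
    if not empties:
        return True                                          # every hole filled
    r, c = min(empties, key=lambda rc: len(candidates(*rc)))  # fail-first ordering
    for v in candidates(r, c):
        grid[r][c] = v
        if solve_gsp(grid, m, n):
            return True
        grid[r][c] = 0
    return False

# A tiny order-6 instance with 2x3 block regions (givens invented for illustration):
puzzle = [[0] * 6 for _ in range(6)]
puzzle[0][:3] = [1, 2, 3]
print(solve_gsp(puzzle, 2, 3))          # True, and `puzzle` now holds a solution
```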

Relevância:

100.00%

Resumo:

Random problem distributions have played a key role in the study and design of algorithms for constraint satisfaction and Boolean satisfiability, as well as in our understanding of problem hardness beyond standard worst-case complexity. We consider random problem distributions from a highly structured problem domain that generalizes the Quasigroup Completion Problem (QCP) and Quasigroup with Holes (QWH), a widely used domain that captures the structure underlying a range of real-world applications. Our problem domain is also a generalization of the well-known Sudoku puzzle: we consider Sudoku instances of arbitrary order, with the additional generalization that the block regions can have rectangular shape in addition to the standard square shape. We evaluate the computational hardness of Generalized Sudoku instances for different parameter settings. Our experimental hardness results show that we can generate instances that are considerably harder than QCP/QWH instances of the same size. More interestingly, we show the impact of different balancing strategies on problem hardness. We also provide insights into backbone variables in Generalized Sudoku instances and how they correlate with problem hardness.
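
To illustrate the kind of instance generation discussed above, the sketch below builds a complete Generalized Sudoku solution from a valid base pattern and then punches holes either uniformly at random or in a "balanced" way (the same number of holes per row). The balancing strategies evaluated in the paper are more refined; the construction here only conveys the idea.

```python
# Generate Generalized Sudoku instances (rectangular m x n blocks) by punching
# holes in a complete solution, optionally balancing holes across rows.
import random

def full_solution(m, n, seed=0):
    """Complete order m*n grid from a valid base pattern, with symbols relabelled."""
    N = m * n
    perm = list(range(1, N + 1))
    random.Random(seed).shuffle(perm)
    return [[perm[((r % m) * n + r // m + c) % N] for c in range(N)] for r in range(N)]

def punch_holes(solution, holes, balanced=True, seed=0):
    rng = random.Random(seed)
    N = len(solution)
    grid = [row[:] for row in solution]
    if balanced:
        per_row = holes // N                      # same number of holes in every row
        for r in range(N):
            for c in rng.sample(range(N), per_row):
                grid[r][c] = 0
    else:
        for r, c in rng.sample([(r, c) for r in range(N) for c in range(N)], holes):
            grid[r][c] = 0
    return grid

instance = punch_holes(full_solution(2, 3), holes=18, balanced=True)
for row in instance:
    print(row)
```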

Relevância:

100.00%

Resumo:

This paper reviews experimental methods for the study of people's responses to violence in digital media, and in particular considers the issues of internal validity and ecological validity, or generalisability of results to events in the real world. Experimental methods typically involve a significant level of abstraction from reality, with participants required to carry out tasks that are far removed from violence in real life, and hence their ecological validity is questionable. On the other hand, studies based on field data, while having ecological validity, cannot control the multiple confounding variables that may have an impact on observed results, so that their internal validity is questionable. It is argued that immersive virtual reality may provide a unification of these two approaches. Since people tend to respond realistically to situations and events that occur in virtual reality, and since virtual reality simulations can be completely controlled for experimental purposes, studies of responses to violence within virtual reality are likely to have both ecological and internal validity. This depends on a property that we call "plausibility", which includes the fidelity of the depicted situation with prior knowledge and expectations. We illustrate this with data from a previously published experiment, a virtual reprise of Stanley Milgram's 1960s obedience experiment, and also with pilot data from a new study under development that looks at bystander responses to violent incidents.