893 results for data driven approach


Relevance:

80.00%

Publisher:

Abstract:

This Phase II project follows a previous project titled Strategies to Address Nighttime Crashes at Rural, Unsignalized Intersections. Based on the results of the previous study, the Iowa Highway Research Board (IHRB) indicated interest in pursuing further research to address the quality of lighting, rather than just the presence of light, with respect to safety. The research team supplemented the literature review from the previous study, specifically addressing lighting level in terms of measurement, the relationship between light levels and safety, and lamp durability and efficiency. The Center for Transportation Research and Education (CTRE) teamed with a national research leader in roadway lighting, the Virginia Tech Transportation Institute (VTTI), to collect the data. An integral instrument in the data collection effort was the Roadway Monitoring System (RMS), created for this project. The RMS allowed the research team to collect lighting data and approach information for each rural intersection identified in the previous phase. After data cleanup, the final data set contained illuminance data for 101 lighted intersections (of the 137 lighted intersections in the first study). Data analysis included robust statistical modeling based on Bayesian techniques. Average illuminance, average glare, and average uniformity ratio values were used to classify the quality of lighting at the intersections.
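The abstract does not spell out the classification rule used in the study. As a rough sketch only, the code below assumes a grid of illuminance readings per intersection, takes the uniformity ratio as average-to-minimum illuminance, and applies hypothetical thresholds; the thresholds, data, and rule are illustrative assumptions, not the study's actual criteria.

```python
import numpy as np

def lighting_quality(illuminance_lux, avg_threshold=8.0, uniformity_threshold=4.0):
    """Classify lighting quality from a grid of illuminance readings (lux).

    The thresholds and the average-to-minimum definition of the uniformity
    ratio are illustrative assumptions, not the values used in the study.
    """
    readings = np.asarray(illuminance_lux, dtype=float)
    avg = readings.mean()
    uniformity = avg / readings.min()          # average-to-minimum ratio
    good = (avg >= avg_threshold) and (uniformity <= uniformity_threshold)
    return {"average_lux": round(avg, 2), "uniformity_ratio": round(uniformity, 2),
            "quality": "adequate" if good else "poor"}

# Hypothetical readings taken on a grid across one lighted intersection
print(lighting_quality([6.1, 7.4, 9.8, 12.3, 5.2, 8.9, 10.4, 7.7]))
```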

Relevance:

80.00%

Publisher:

Abstract:

Following recent technological advances, digital image archives have grown qualitatively and quantitatively to an unprecedented extent. Despite the enormous possibilities they offer, these advances raise new questions about how to process the masses of acquired data. This question is at the heart of this Thesis: the problems of processing digital information at very high spatial and/or spectral resolution are addressed using statistical learning approaches, namely kernel methods. The Thesis studies image classification problems, that is, the categorization of pixels into a reduced number of classes reflecting the spectral and contextual properties of the objects they represent. The emphasis is placed on the efficiency of the algorithms, as well as on their simplicity, so as to increase their potential for adoption by users. Moreover, the challenge of this Thesis is to remain close to the concrete problems of satellite-image users without losing sight of the interest of the proposed methods for the machine learning community from which they originate. In this sense, the work is deliberately transdisciplinary, maintaining a strong link between the two fields in all the developments proposed. Four models are proposed. The first addresses the problem of high dimensionality and data redundancy with a model that optimizes classification performance by adapting to the particularities of the image. This is made possible by a ranking of the variables (the bands) that is optimized jointly with the base model: in this way, only the variables relevant to solving the problem are used by the classifier. The scarcity of labeled information, and the uncertainty about its relevance to the problem, motivate the next two models, based respectively on active learning and semi-supervised methods: the former improves the quality of a training set through direct interaction between the user and the machine, while the latter uses unlabeled pixels to improve the description of the available data and the robustness of the model. Finally, the last model addresses the more theoretical question of structure among the outputs: integrating this source of information, never before considered in remote sensing, opens up new research challenges.

Advanced kernel methods for remote sensing image classification. Devis Tuia, Institut de Géomatique et d'Analyse du Risque, September 2009.

Abstract: The technical developments of recent years have brought the quantity and quality of digital information to an unprecedented level, as enormous archives of satellite images are available to users. However, even if these advances open more and more possibilities in the use of digital imagery, they also raise several problems of storage and processing. The latter is considered in this Thesis: the processing of very high spatial and spectral resolution images is treated with approaches based on data-driven algorithms relying on kernel methods. In particular, the problem of image classification, i.e. the categorization of the image's pixels into a reduced number of classes reflecting spectral and contextual properties, is studied through the different models presented. The emphasis is placed on algorithmic efficiency and on the simplicity of the approaches proposed, to avoid overly complex models that would not be adopted by users. The major challenge of the Thesis is to remain close to concrete remote sensing problems without losing the methodological interest from the machine learning viewpoint: in this sense, this work aims at building a bridge between the machine learning and remote sensing communities, and all the models proposed have been developed keeping in mind the need for such a synergy. Four models are proposed: first, an adaptive model learning the relevant image features is proposed to solve the problem of high dimensionality and collinearity of the image features. This model automatically provides an accurate classifier and a ranking of the relevance of the individual features. The scarcity and unreliability of labeled information are the common root of the second and third models: when confronted with such problems, the user can either construct the labeled set iteratively by direct interaction with the machine or use the unlabeled data to increase the robustness and quality of the data description. Both solutions have been explored, resulting in two methodological contributions based respectively on active learning and semi-supervised learning. Finally, the more theoretical issue of structured outputs is considered in the last model, which, by integrating output similarity into the model, opens new challenges and opportunities for remote sensing image processing.
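The thesis's own models are not specified here in enough detail to reproduce. Purely as a generic illustration of the active-learning idea it describes (improving the training set by querying the most uncertain pixels), the sketch below uses an RBF-kernel SVM and margin sampling on synthetic stand-in "pixel" spectra; the data, classifier choice, and uncertainty criterion are assumptions, not the author's method.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

# Synthetic stand-in for pixel spectra: 500 pixels, 20 spectral features, 3 classes
X, y = make_classification(n_samples=500, n_features=20, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

rng = np.random.default_rng(0)
labeled = list(rng.choice(len(X), size=20, replace=False))   # small initial training set
pool = [i for i in range(len(X)) if i not in labeled]

clf = SVC(kernel="rbf", gamma="scale", probability=True)
for _ in range(10):                                          # 10 active-learning iterations
    clf.fit(X[labeled], y[labeled])
    proba = clf.predict_proba(X[pool])
    sorted_p = np.sort(proba, axis=1)
    margin = sorted_p[:, -1] - sorted_p[:, -2]               # small margin = uncertain pixel
    query = pool.pop(int(np.argmin(margin)))                 # ask the "user" for its label
    labeled.append(query)

print("accuracy on remaining unlabeled pool:", round(clf.score(X[pool], y[pool]), 3))
```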

Relevance:

80.00%

Publisher:

Abstract:

Aerobic exercise training performed at the intensity eliciting maximal fat oxidation (Fatmax) has been shown to improve the metabolic profile of obese patients. However, limited information is available on the reproducibility of Fatmax and related physiological measures. The aim of this study was to assess the intra-individual variability of a) Fatmax measurements determined using three different data analysis approaches and b) fat and carbohydrate oxidation rates at rest and at each stage of an individualized graded test. Fifteen healthy males [body mass index 23.1±0.6 kg/m², maximal oxygen consumption (V̇O2max) 52.0±2.0 ml/kg/min] completed a maximal test and two identical submaximal incremental tests on a cycle ergometer (30 min of rest followed by 5-min stages with increments of 7.5% of the maximal power output). Fat and carbohydrate oxidation rates were determined using indirect calorimetry. Fatmax was determined with three approaches: the sine model (SIN), measured values (MV), and a 3rd-degree polynomial curve (P3). Intra-individual coefficients of variation (CVs) and limits of agreement were calculated. The CV for Fatmax determined with SIN was 16.4% and tended to be lower than with P3 and MV (18.6% and 20.8%, respectively). Limits of agreement for Fatmax were -2±27% of V̇O2max with SIN, -4±32 with P3, and -4±28 with MV. CVs of oxygen uptake, carbon dioxide production, and the respiratory exchange ratio were <10% at rest and <5% during exercise. Conversely, CVs of fat oxidation rates (20% at rest and 24-49% during exercise) and carbohydrate oxidation rates (33.5% at rest, 8.5-12.9% during exercise) were higher. The intra-individual variability of Fatmax and fat oxidation rates was high (CV>15%), regardless of the data analysis approach employed. Further research on the determinants of the variability of Fatmax and fat oxidation rates is required.
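The abstract does not give the exact variability formulas. One common way to obtain an intra-individual CV and Bland-Altman limits of agreement from two repeated tests is sketched below with made-up Fatmax values (expressed as a percentage of V̇O2max); the data are hypothetical and the formulas are standard choices, not necessarily those of the study.

```python
import numpy as np

# Hypothetical Fatmax (% of VO2max) from two identical submaximal tests, one value per subject
test1 = np.array([48.0, 52.5, 45.0, 60.0, 55.5, 41.0, 50.0, 58.0])
test2 = np.array([44.0, 57.0, 49.5, 52.0, 60.0, 46.5, 47.0, 63.0])

diff = test1 - test2
mean_pair = (test1 + test2) / 2

# Within-subject SD from paired differences: sd(diff) / sqrt(2)
within_sd = diff.std(ddof=1) / np.sqrt(2)
cv_percent = 100 * within_sd / mean_pair.mean()              # intra-individual CV

# Bland-Altman 95% limits of agreement: mean difference +/- 1.96 * SD of differences
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

print(f"intra-individual CV: {cv_percent:.1f}%  limits of agreement: {loa[0]:.1f} to {loa[1]:.1f}")
```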

Relevance:

80.00%

Publisher:

Abstract:

We present a novel and straightforward method for estimating recent migration rates between discrete populations using multilocus genotype data. The approach builds upon a two-step sampling design, where individual genotypes are sampled before and after dispersal. We develop a model that estimates all pairwise backwards migration rates (m_ij, the probability that an individual sampled in population i is a migrant from population j) between a set of populations. The method is validated with simulated data and compared with the methods of BayesAss and Structure. First, we use data for an island model, and then we consider more realistic data simulations for a metapopulation of the greater white-toothed shrew (Crocidura russula). We show that the precision and bias of estimates depend primarily on the proportion of individuals sampled in each population. Weak sampling designs may particularly affect the quality of the coverage provided by 95% highest posterior density intervals. We further show that the method is relatively insensitive to the number of loci sampled and to the overall strength of genetic structure. The method can easily be extended, and it makes fewer assumptions about the underlying demographic and genetic processes than currently available methods. It allows backwards migration rates to be estimated across a wide range of realistic conditions.
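The actual method infers each individual's population of origin from its multilocus genotype; the toy sketch below skips that step entirely and assumes origins have already been assigned, estimating the backwards migration rates m_ij for one sampled population with a flat Dirichlet prior on the origin proportions. The counts, population names, and prior are illustrative assumptions.

```python
import numpy as np

# Hypothetical counts of assigned origins for individuals sampled in population i
# (index 0 = resident of i, 1 = migrant from population j1, 2 = migrant from population j2)
origin_counts = np.array([62, 9, 4])

# Flat Dirichlet(1, 1, 1) prior -> posterior over (m_ii, m_ij1, m_ij2) is Dirichlet(1 + counts)
rng = np.random.default_rng(1)
posterior_draws = rng.dirichlet(1 + origin_counts, size=20000)

means = posterior_draws.mean(axis=0)
lower, upper = np.percentile(posterior_draws, [2.5, 97.5], axis=0)
for j, name in enumerate(["m_ii (resident)", "m_ij1", "m_ij2"]):
    print(f"{name}: {means[j]:.3f}  95% interval [{lower[j]:.3f}, {upper[j]:.3f}]")
```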

Relevance:

80.00%

Publisher:

Abstract:

This supplementary project was undertaken to continue work previously completed in the Pooled Fund Study of Premature Concrete Pavement Deterioration. As such, it shares the objective of "identifying the variables that are present in those pavements exhibiting premature deterioration" by collecting additional data and performing statistical analysis of those data. The approach and philosophy of this work are identical to those followed in the above project, and the Pooled Fund Study Final Report provides a detailed description of this process. This project involved the collection of data for additional sites in the state of Iowa. These sites were then added to the sites collected in the original study, and statistical analysis was performed on the entire set. It is hoped that this will have two major effects. First, using data from only one state allows for the analysis of a larger set of independent variables with a greater degree of commonality than was possible in the multi-state study, since the data are not limited by state-to-state differences in data collection and retention. Second, more data from additional sites will increase the degrees of freedom in the model and, it is hoped, add confidence to the results.
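The report's variable list is not reproduced here. Purely as an illustration of the pooling step it describes (adding the new Iowa sites to the original sites and refitting a regression with more residual degrees of freedom), the sketch below uses hypothetical predictors and deterioration scores.

```python
import numpy as np

# Hypothetical site records: [traffic_volume, pavement_age_years, deicing_applications, deterioration]
original_sites = np.array([[1200, 18, 30, 3.1], [800, 12, 22, 2.0], [1500, 25, 35, 4.2],
                           [600, 9, 15, 1.4], [1100, 20, 28, 3.5]])
additional_iowa_sites = np.array([[900, 14, 25, 2.4], [1300, 22, 31, 3.8],
                                  [700, 10, 18, 1.7], [1000, 16, 27, 2.9]])

pooled = np.vstack([original_sites, additional_iowa_sites])   # more sites -> more degrees of freedom
X = np.column_stack([np.ones(len(pooled)), pooled[:, :3]])    # add intercept column
y = pooled[:, 3]

coef, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
dof = len(y) - rank                                           # residual degrees of freedom
print("coefficients (intercept, volume, age, deicing):", np.round(coef, 4))
print("residual degrees of freedom:", dof)
```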

Relevance:

80.00%

Publisher:

Abstract:

Interactions between stimuli's acoustic features and experience-based internal models of the environment enable listeners to compensate for the disruptions in auditory streams that are regularly encountered in noisy environments. However, whether auditory gaps are filled in predictively or restored a posteriori remains unclear. The current lack of positive statistical evidence that internal models can actually shape brain activity as real sounds do precludes accepting predictive accounts of the filling-in phenomenon. We investigated the neurophysiological effects of internal models by testing whether single-trial electrophysiological responses to omitted sounds in a rule-based sequence of tones with varying pitch could be decoded from the responses to real sounds, and by analyzing the ERPs to the omissions with data-driven electrical neuroimaging methods. Decoding of the brain responses to the different expected, but omitted, tones was above chance in both passive and active listening conditions when based on the responses to the real sounds in active listening conditions. Topographic ERP analyses and electrical source estimations revealed that, in the absence of any stimulation, experience-based internal models elicit electrophysiological activity that differs from noise and that the temporal dynamics of this activity depend on attention. We further found that the expected change in pitch direction of omitted tones modulated the activity of left posterior temporal areas 140-200 msec after the onset of the omissions. Collectively, our results indicate that, even in the absence of any stimulation, internal models modulate brain activity as real sounds do, indicating that auditory filling-in can be accounted for by predictive activity.
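The study's decoding pipeline is not described in detail in the abstract. The sketch below illustrates only the general cross-decoding idea (train a classifier on single-trial responses to real sounds, test it on omission trials) using simulated single-trial feature vectors and a logistic-regression decoder; the data shapes, labels, and decoder choice are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_trials, n_features = 200, 64          # e.g., 64 channels averaged over a time window

# Simulated single-trial responses to real tones of two expected pitch-change directions
sound_X = rng.normal(size=(n_trials, n_features))
sound_y = rng.integers(0, 2, size=n_trials)
sound_X[sound_y == 1, :8] += 0.8        # inject a weak class-specific pattern

# Simulated responses to omitted tones carrying a weaker version of the same pattern
omission_X = rng.normal(size=(n_trials, n_features))
omission_y = rng.integers(0, 2, size=n_trials)
omission_X[omission_y == 1, :8] += 0.4

decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
decoder.fit(sound_X, sound_y)                       # train on real sounds
accuracy = decoder.score(omission_X, omission_y)    # test on omissions (cross-decoding)
print(f"omission decoding accuracy: {accuracy:.2f} (chance = 0.50)")
```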

Relevance:

80.00%

Publisher:

Abstract:

The aim of the study was to map the case company's established project management processes and to find out how project managers and coordinators perceive the success factors of project management. Based on the project management literature, an optimal project management process was compiled, and success factors were listed for each process phase. This description was compared with the case company's project management process and the perceived success factors, and on this basis improvements to the case company's project management process were considered. Factors contributing to project success were found in nearly all phases of the project management process. One of the key findings was the importance of a project being needs-driven for both project success and project team commitment. Partner commitment also emerged as an important factor. The study also produced a mechanism for collecting targeted project ideas in a distributed environment.

Relevance:

80.00%

Publisher:

Abstract:

Robotic grasping has been studied increasingly for a few decades. While progress has been made in this field, robotic hands are still nowhere near the capability of human hands. However, in the past few years, the increase in computational power and the availability of commercial tactile sensors have made it easier to develop techniques that exploit the feedback from the hand itself, the sense of touch. The focus of this thesis lies in the use of this sense. The work described in this thesis approaches robotic grasping from two different viewpoints: robotic systems and data-driven grasping. The robotic systems viewpoint describes a complete architecture for the act of grasping and, to a lesser extent, more general manipulation. Two central design goals of the architecture are hardware independence and the use of sensors during grasping. These properties enable the use of multiple different robotic platforms within the architecture. Secondly, new data-driven methods are proposed that can be incorporated into the grasping process. The first of these methods is a novel way of learning grasp stability from the tactile and haptic feedback of the hand instead of analytically solving the stability from a set of known contacts between the hand and the object. By learning from the data directly, there is no need to know the properties of the hand, such as its kinematics, which enables the method to be used with complex hands. The second novel method, probabilistic grasping, combines the fields of tactile exploration and grasp planning. By employing well-known statistical methods and pre-existing knowledge of an object, object properties such as pose can be inferred with an associated uncertainty. This uncertainty is utilized by a grasp planning process that plans for stable grasps under the inferred uncertainty.
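The thesis's exact features and learner are not given in the abstract. As a minimal sketch of the idea of learning grasp stability directly from tactile and haptic data rather than from contact analysis, the code below trains a random forest on simulated per-grasp feature vectors labeled stable/unstable; the data, feature layout, and classifier choice are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_grasps, n_taxels = 400, 24            # e.g., 24 tactile cells plus a few haptic readings

# Simulated per-grasp feature vectors: tactile pressures and haptic (joint torque) readings
tactile = rng.gamma(shape=2.0, scale=1.0, size=(n_grasps, n_taxels))
haptic = rng.normal(size=(n_grasps, 4))
X = np.hstack([tactile, haptic])

# Simulated stability labels: grasps with higher total contact pressure tend to be stable
total_pressure = tactile.sum(axis=1)
stable = (total_pressure + rng.normal(scale=3.0, size=n_grasps) > total_pressure.mean()).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, stable, cv=5)
print("cross-validated stability accuracy:", scores.mean().round(3))
```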

Relevance:

80.00%

Publisher:

Abstract:

Few people see both the opportunities and the threats that IT legacy presents in today's world. On one hand, effective legacy management can bring substantial hard savings and a smooth transition to the desired future state. On the other hand, its mismanagement contributes to serious operational business risks, as old systems are not as reliable as business users require. This thesis offers one perspective on dealing with IT legacy: through effective contract management, as a component of achieving Procurement Excellence in IT, thus bridging IT delivery departments, IT procurement, business units, and suppliers. It develops a model for assessing the impact of improvements on the contract management process, together with a set of tools and advice for analysis and improvement actions. The thesis uses a case study to present and justify the implementation of Lean Six Sigma in an IT legacy contract management environment. Lean Six Sigma proved to be successful, and this thesis presents and discusses the steps necessary, and the pitfalls to avoid, to achieve breakthrough improvement in IT contract management process performance. For the IT legacy contract management process, two improvements require special attention and can easily be copied to any organization. The first is the issue of diluted contract ownership, which stalls all improvements because people do not know who is responsible for performing the required actions. The second is the contract management performance evaluation tool, which can be used for monitoring, identifying outlying contracts, and spotting opportunities for improvement in the process. The study resulted in valuable insight into the benefits of applying Lean Six Sigma to improve IT legacy contract management, as well as into how Lean Six Sigma can be applied in an IT environment. Managerial implications are discussed. It is concluded that the use of the data-driven Lean Six Sigma methodology for improving existing IT contract management processes is a significant addition to the existing best practices in contract management.
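The thesis's evaluation tool itself is not described here. A minimal sketch of the Six Sigma-style idea behind identifying outlying contracts (flag contracts whose performance metric falls outside mean plus/minus three standard deviation control limits) follows, with hypothetical renewal cycle times; the metric, data, and limits are assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical renewal cycle times (days) for 40 routine legacy contracts plus two problem cases
cycle_times = np.concatenate([rng.normal(40, 5, size=40), [100, 130]])
contract_ids = [f"C-{1000 + i}" for i in range(len(cycle_times))]

mean, sd = cycle_times.mean(), cycle_times.std(ddof=1)
upper, lower = mean + 3 * sd, mean - 3 * sd          # Shewhart-style control limits

outliers = [cid for cid, v in zip(contract_ids, cycle_times) if v > upper or v < lower]
print(f"control limits: [{lower:.1f}, {upper:.1f}] days")
print("contracts flagged for review:", outliers)
```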

Relevance:

80.00%

Publisher:

Abstract:

Although business intelligence and managerial decision-making have been studied extensively, very little research exists on the interplay of these two concepts. The importance of the topic will only grow, as the amount of existing data increases constantly. In the future, companies will need ever more capabilities and resources to exploit both structured and unstructured data regardless of its source. Current Business Intelligence solutions enable effective management of business information as part of managerial decision-making. Building on prior literature, the empirical part of the study identifies factors related to the use of business information that either support or constrain the managerial decision-making process. The theoretical part introduces the reader to the research topic through a literature review. The key concepts of the study, such as Business Intelligence and managerial decision-making, are presented on the basis of the relevant literature; in addition, data-related concepts are analyzed in detail. The empirical part of the study builds on this theoretical foundation. It examines the research themes through practical examples: three case studies are used to investigate and describe independent cases. Each case is described and analyzed through theory-based statements, which are regarded as basic prerequisites for successful decision-making based on business information. The case studies allow the original research problem to be analyzed in detail in light of existing research, and the results of the analysis also make it possible to examine individual constraints and enablers. The findings indicate that the constraints have a strongly negative influence on the success of the decision-making process. On the other hand, managers are aware of the positive implications of business intelligence, even though not all of its possibilities have been exploited. The most significant result of the study is a framework within which managerial decision-making processes can be evaluated and analyzed.

Despite the fact that the literature on Business Intelligence and managerial decision-making is extensive, relatively little effort has been made to research the relationship between them. This particular field of study has become increasingly important, as the amount of data in the world grows every second. Companies require capabilities and resources in order to utilize structured and unstructured data from internal and external data sources. Present Business Intelligence technologies, however, enable managers to utilize data effectively in decision-making. Based on the prior literature, the empirical part of the thesis identifies the enablers and constraints in the computer-aided managerial decision-making process. The theoretical part of the thesis provides a preliminary understanding of the research area through a literature review. The key concepts, such as Business Intelligence and managerial decision-making, are explored by reviewing the relevant literature. Additionally, different data sources as well as data forms are analyzed in further detail. All key concepts are taken into account when the empirical part is carried out. The empirical part then examines how the themes covered in the theoretical part appear in real-world settings. Three selected case companies are analyzed through statements that are considered critical prerequisites for successful computer-aided managerial decision-making. The case study analysis, which is part of the empirical work, enables the researcher to examine the relationship between Business Intelligence and managerial decision-making. Based on the findings of the case study analysis, the researcher identifies the enablers and constraints through the case study interviews. The findings indicate that the constraints have a highly negative influence on the decision-making process. In addition, the managers are aware of the positive implications that Business Intelligence has for decision-making, but not all of its possibilities are yet utilized. As the main result of this study, a data-driven framework for managerial decision-making is introduced. This framework can be used to evaluate and analyze managerial decision-making processes.

Relevance:

80.00%

Publisher:

Abstract:

Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

80.00%

Publisher:

Abstract:

The purpose of this Master's thesis is to determine what kind of process is needed to define a competence mapping carried out from a resourcing perspective. The study is a qualitative case study in the target organization. The research material was collected from documents and from the meetings and workshops conducted during the study, and it was analyzed using a data-driven content analysis method. According to the results, the competence mapping process and its success are significantly affected by the company's strategy, management commitment to the competence mapping work, an analysis of the current state, and shared concepts, metrics, and goals. The competences required from a resourcing perspective are not necessarily the same as those required from a development perspective. Key factors for the success of the definition process are the participation of the right people and their willingness to share information.

Relevance:

80.00%

Publisher:

Abstract:

As technology has developed, the amount of data produced and collected from the business environment has increased. Over 80% of that data includes some sort of reference to geographical location. Individuals have used such information through Google Maps or various GPS devices; in business, however, it has remained largely unexploited. This thesis studies the use and utilization of geographically referenced data in capital-intensive business. It first provides theoretical insight into how data and data-driven management enable and enhance the business, and how geographically referenced data in particular adds value to the company. It then examines empirical case evidence of how geographical information can truly be exploited in capital-intensive business and what the value-adding elements of geographical information are. The study uses semi-structured interviews to survey the organization's attitudes and beliefs towards geographic information and to identify fields of application for a geographic information system within the case company. Additionally, geographical data are tested in order to illustrate how they could be used in practice. Finally, the thesis provides an understanding of which elements the added value of geographical information in business consists of, and of how such data can be utilized in the case company and in capital-intensive business more generally.
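The thesis's own analyses are not described in enough detail to reproduce. As a small illustration of how geographically referenced asset data can add value in a capital-intensive setting, the sketch below assigns each asset to its nearest maintenance depot by great-circle distance; the asset names, depot names, and coordinates are hypothetical.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points in kilometres (haversine formula)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical asset and depot coordinates (latitude, longitude)
assets = {"transformer-17": (61.05, 28.19), "pump-station-4": (60.98, 27.75), "crane-2": (61.20, 28.60)}
depots = {"Lappeenranta depot": (61.06, 28.19), "Imatra depot": (61.17, 28.77)}

for asset, (alat, alon) in assets.items():
    nearest = min(depots, key=lambda d: haversine_km(alat, alon, *depots[d]))
    dist = haversine_km(alat, alon, *depots[nearest])
    print(f"{asset}: nearest depot is {nearest} ({dist:.1f} km)")
```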

Relevance:

80.00%

Publisher:

Abstract:

Human activity recognition in everyday environments is a critical but challenging task in Ambient Intelligence applications to achieve proper Ambient Assisted Living, and key challenges still remain to be dealt with to realize robust methods. One of the major limitations of Ambient Intelligence systems today is the lack of semantic models of the activities in the environment, so that the system can recognize the specific activity being performed by the user(s) and act accordingly. In this context, this thesis addresses the general problem of knowledge representation in Smart Spaces. The main objective is to develop knowledge-based models, equipped with semantics, to learn, infer, and monitor human behaviours in Smart Spaces. Moreover, some aspects of this problem have a high degree of uncertainty, and therefore the developed models must be equipped with mechanisms to manage this type of information. A fuzzy ontology and a semantic hybrid system are presented to allow modelling and recognition of a set of complex real-life scenarios where vagueness and uncertainty are inherent to the human nature of the users that perform them. The handling of uncertain, incomplete, and vague data (i.e., missing sensor readings and activity execution variations, since human behaviour is non-deterministic) is approached for the first time through a fuzzy ontology validated in real-time settings within a hybrid data-driven and knowledge-based architecture. The semantics of activities, sub-activities, and real-time object interaction are taken into consideration. The proposed framework consists of two main modules: the low-level sub-activity recognizer and the high-level activity recognizer. The first module detects sub-activities (i.e., actions or basic activities) and takes its input data directly from a depth sensor (Kinect). The main contribution of this thesis tackles the second component of the hybrid system, which lies on top of the previous one at a higher level of abstraction, acquires its input from the first module's output, and executes ontological inference to provide users, activities, and their influence on the environment with semantics. This component is thus knowledge-based, and a fuzzy ontology was designed to model the high-level activities. Since activity recognition requires context-awareness and the ability to discriminate among activities in different environments, the semantic framework allows common-sense knowledge to be modelled in the form of a rule-based system that supports expressions close to natural language in the form of fuzzy linguistic labels. The advantages of the framework have been evaluated with a challenging new public dataset, CAD-120, achieving an accuracy of 90.1% and 91.1% for low- and high-level activities, respectively. This is an improvement over both entirely data-driven approaches and merely ontology-based approaches. As an added value, so that the system is sufficiently simple and flexible to be managed by non-expert users, and thus to facilitate the transfer of research to industry, a development framework composed of a programming toolbox, a hybrid crisp and fuzzy architecture, and graphical models to represent and configure human behaviour in Smart Spaces was developed to give the framework more usability in the final application.
As a result, human behaviour recognition can help assist people with special needs in areas such as healthcare, independent elderly living, remote rehabilitation monitoring, and industrial process guideline control, among many other cases. This thesis shows use cases in these areas.
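The thesis's fuzzy ontology and hybrid architecture are far richer than can be shown here. As a toy sketch of the underlying idea only (fuzzy linguistic labels over sensed features, min-based rule evaluation to rank high-level activities), consider the following; the membership functions, rules, and feature values are all assumptions.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Sensed features for one time window (hypothetical): duration and object interactions
duration_min = 12.0
cup_interactions = 3
stove_interactions = 0

# Fuzzy linguistic labels over the features
short_duration = tri(duration_min, 0, 5, 15)
long_duration = tri(duration_min, 10, 30, 60)
many_cup = tri(cup_interactions, 1, 4, 8)
any_stove = 1.0 if stove_interactions > 0 else 0.0

# Rule activations (min = fuzzy AND); the strongest rule gives the recognized activity
rules = {
    "drinking coffee": min(short_duration, many_cup),
    "cooking a meal":  min(long_duration, any_stove),
}
activity, degree = max(rules.items(), key=lambda kv: kv[1])
print(f"recognized activity: {activity} (degree {degree:.2f})")
```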