880 results for Building information modeling
Abstract:
Hydrologic analysis is a critical part of transportation design because it helps ensure that hydraulic structures can accommodate the flow regimes they are likely to encounter. This analysis is currently conducted using computer simulations of water flow patterns, and continuing developments in elevation survey techniques are producing surveys of ever higher resolution. Current survey techniques now resolve many natural and anthropogenic features that were previously impractical to map and therefore require new methods for dealing with depressions and flow discontinuities. A method for depressional analysis is proposed that exploits the fact that most anthropogenically constructed embankments tend to be more symmetrical, with greater slopes, than natural depressions. An enforcement method for draining depressions is then applied to those depressions that should be drained. The procedure was evaluated on a small watershed in central Iowa, Walnut Creek of the South Skunk River (HUC12 # 070801050901), where it accurately identified 88 of 92 drained depressions and placed enforcements within two pixels, although it often attempts to drain prairie pothole depressions that are bisected by anthropogenic features.
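To make the screening idea concrete, here is a minimal Python sketch, assuming a depression footprint has already been delineated from the DEM; the slope and symmetry features, the thresholds, and the function names are illustrative stand-ins, not the paper's actual algorithm.

import numpy as np
from scipy.ndimage import binary_dilation

def depression_features(dem, mask, cell_size=1.0):
    # Mean slope on the rim (cells just outside the depression) and a simple
    # footprint-roundness score: 1.0 = perfectly isotropic, near 0 = a long strip.
    gy, gx = np.gradient(dem, cell_size)
    slope = np.hypot(gx, gy)
    rim = binary_dilation(mask) & ~mask
    mean_rim_slope = slope[rim].mean()
    rows, cols = np.nonzero(mask)
    cov = np.cov(np.vstack([cols - cols.mean(), rows - rows.mean()]))
    eigvals = np.linalg.eigvalsh(cov)
    symmetry = eigvals[0] / eigvals[1]
    return mean_rim_slope, symmetry

def looks_anthropogenic(dem, mask, slope_thresh=0.15, sym_thresh=0.5):
    # Flag depressions that are both steep-sided and symmetrical, following the
    # heuristic in the abstract; both thresholds are invented for illustration.
    s, sym = depression_features(dem, mask)
    return s > slope_thresh and sym > sym_thresh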
Abstract:
The aim of this paper is to describe the process and challenges in building exposure scenarios for engineered nanomaterials (ENM), using an exposure scenario format similar to that used for the European Chemicals regulation (REACH). Over 60 exposure scenarios were developed based on information from publicly available sources (literature, books, and reports), publicly available exposure estimation models, occupational sampling campaign data from partnering institutions, and industrial partners regarding their own facilities. The primary focus was on carbon-based nanomaterials, nano-silver (nano-Ag) and nano-titanium dioxide (nano-TiO2), and included occupational and consumer uses of these materials with consideration of the associated environmental release. The process of building exposure scenarios illustrated the availability and limitations of existing information and exposure assessment tools for characterizing exposure to ENM, particularly as it relates to risk assessment. This article describes the gaps in the information reviewed, recommends future areas of ENM exposure research, and proposes types of information that should, at a minimum, be included when reporting the results of such research, so that the information is useful in a wider context.
Abstract:
In work-zone configurations with lane drops, the merging of traffic at the taper presents an operational concern. In addition, as flow through the work zone is reduced, the relative safety of the work zone is also reduced. Improving work-zone flow through merge points depends on the behavior of individual drivers. By better understanding driver behavior, traffic control plans, work-zone policies, and countermeasures can be better targeted to reinforce desirable lane-closure merging behavior, improving both safety and work-zone capacity. The researchers collected data for two work-zone scenarios that included lane drops, one on an Interstate and the other on an urban arterial roadway. They then modeled and calibrated these scenarios in VISSIM using real-world speeds, travel times, queue lengths, and merging behaviors (the percentage of vehicles merging upstream versus near the merge point). Once the models were built and calibrated, the researchers modeled various countermeasure strategies in the two work zones and used the models to test and evaluate how different merging strategies affect safety and operations at the merge areas.
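As an illustration of one calibration measure mentioned above (the share of vehicles merging upstream versus near the merge point), here is a small Python sketch; the trajectory record format and the 150 m "near taper" cutoff are assumptions for the example, not values from the study.

# Classify observed merge locations relative to the taper and report the split.
merge_positions_m = [820, 450, 130, 60, 990, 210, 45, 700]  # distance upstream of taper
NEAR_TAPER_M = 150  # assumed cutoff for "near the merge point"

near = sum(1 for d in merge_positions_m if d <= NEAR_TAPER_M)
pct_near = 100 * near / len(merge_positions_m)
print(f"{pct_near:.0f}% merged within {NEAR_TAPER_M} m of the taper, "
      f"{100 - pct_near:.0f}% merged upstream")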
Abstract:
This work analyses the political news of eight Spanish television channels to determine what image of politics is constructed, and in particular how news of corruption affects the image of politics in Spanish news broadcasts. Various corruption cases, such as Gürtel, Palma Arena, and those associated with judge Baltasar Garzón in his final period in office, form part of the study. A new methodology is proposed that enables the quality of the political information, both inside and outside the political content of the news programmes, to be assessed. Particular attention is paid to the news broadcasts of Televisión Española and Cuatro, which offer a more balanced view of politics, and to channels such as La Sexta, which give priority to a narrative construction of politics in the news programmes built around corruption cases.
Abstract:
Advancements in high-throughput technologies to measure increasingly complex biological phenomena at the genomic level are rapidly changing the face of biological research from the single-gene single-protein experimental approach to studying the behavior of a gene in the context of the entire genome (and proteome). This shift in research methodologies has resulted in a new field of network biology that deals with modeling cellular behavior in terms of network structures such as signaling pathways and gene regulatory networks. In these networks, different biological entities such as genes, proteins, and metabolites interact with each other, giving rise to a dynamical system. Even though there exists a mature field of dynamical systems theory to model such network structures, some technical challenges are unique to biology such as the inability to measure precise kinetic information on gene-gene or gene-protein interactions and the need to model increasingly large networks comprising thousands of nodes. These challenges have renewed interest in developing new computational techniques for modeling complex biological systems. This chapter presents a modeling framework based on Boolean algebra and finite-state machines that are reminiscent of the approach used for digital circuit synthesis and simulation in the field of very-large-scale integration (VLSI). The proposed formalism enables a common mathematical framework to develop computational techniques for modeling different aspects of the regulatory networks such as steady-state behavior, stochasticity, and gene perturbation experiments.
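As a concrete illustration of the Boolean/finite-state-machine formalism, the Python sketch below enumerates the full state space of a toy three-gene network and flags its steady states; the update rules are invented for the example, not taken from the chapter.

# Toy Boolean network: each gene is 0/1 and the next state is a Boolean
# function of the current state, exactly as in a synchronous digital circuit.
from itertools import product

def step(state):
    a, b, c = state
    return (
        int(b and not c),  # A' = B AND NOT C
        int(a or c),       # B' = A OR C
        int(a),            # C' = A
    )

# Exhaustive state-space enumeration: 2^3 = 8 states
for state in product((0, 1), repeat=3):
    nxt = step(state)
    marker = "  <- steady state" if nxt == state else ""
    print(f"{state} -> {nxt}{marker}")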
Abstract:
The paper presents some contemporary approaches to spatial environmental data analysis. The main topics concern decision-oriented problems of environmental spatial data mining and modeling: valorization and representativity of data with the help of exploratory data analysis, spatial predictions, probabilistic and risk mapping, and the development and application of conditional stochastic simulation models. The innovative part of the paper presents an integrated/hybrid model: machine learning (ML) residual sequential simulations (MLRSS). The model is based on multilayer perceptron and support vector regression ML algorithms used for modeling long-range spatial trends, combined with sequential simulations of the residuals. ML algorithms deliver non-linear solutions for spatially non-stationary problems, which are difficult for the geostatistical approach. Geostatistical tools (variography) are used to characterize the performance of the ML algorithms by analyzing the quality and quantity of the spatially structured information they extract from the data. Sequential simulations provide an efficient assessment of uncertainty and spatial variability. A case study on the Chernobyl fallout illustrates the performance of the proposed model. It is shown that probability mapping, provided by the combination of ML data-driven and geostatistical model-based approaches, can be used efficiently in the decision-making process.
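A schematic Python sketch of the hybrid MLRSS idea as described: an ML regressor captures the long-range trend, and a geostatistical model of the residuals supplies conditional simulations. The SVR settings, the synthetic data, and the Gaussian-process stand-in for variography plus sequential simulation are illustrative choices, not the authors' exact configuration.

import numpy as np
from sklearn.svm import SVR
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(0, 100, size=(200, 2))                  # sample coordinates
z = np.sin(X[:, 0] / 20) + 0.1 * rng.normal(size=200)   # observed values

# 1) Non-linear long-range trend with an ML regressor
trend = SVR(kernel="rbf", C=10.0).fit(X, z)
residuals = z - trend.predict(X)

# 2) Stationary model of the residuals (stand-in for variography + sequential sim)
gp = GaussianProcessRegressor(kernel=RBF(length_scale=10.0), alpha=1e-2).fit(X, residuals)

# 3) Conditional residual realizations on a prediction grid, added to the trend
grid = np.column_stack([g.ravel() for g in np.meshgrid(np.linspace(0, 100, 25),
                                                       np.linspace(0, 100, 25))])
realizations = gp.sample_y(grid, n_samples=50, random_state=1)
simulated = trend.predict(grid)[:, None] + realizations
print(simulated.shape)  # (625, 50): 50 equally probable maps for uncertainty mapping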
Abstract:
The subject of this research is forecasting the capacity needs of the Fenix information system developed by TietoEnator Oy. The goals of the work are to become familiar with the different subsystems of the Fenix system, to find a way to isolate and model the effect of each subsystem on the system load, and to determine, on a preliminary basis, which parameters affect the load created by those subsystems. Part of the work is to examine different simulation alternatives and to assess their suitability for modeling complex systems. Based on the collected information, a simulation model describing the load on the system's data warehouse is created. Using information obtained from the model together with measurements from the production system, the model is refined to correspond ever more closely to the behavior of the real system. From the model, for example, the simulated system load and queue behavior are examined. From the production system, changes in the behavior of different load sources are measured, for example in relation to the number of users and the time of day. The results of this work are intended to serve as a basis for later follow-up research, in which the parameterization of the subsystems is refined further, the model's ability to describe the real system is improved, and the scope of the model is extended.
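A minimal sketch of the kind of load model described, assuming a single service queue whose arrival rate varies with time of day; all rates and the service-time model are invented for the example and are not Fenix measurements.

# Single-server FIFO queue with a time-of-day-dependent arrival rate,
# a stand-in for user activity driving the data warehouse load.
import random

def arrival_rate(hour):
    """Requests per minute; peaks during office hours (illustrative)."""
    return 10.0 if 8 <= hour < 17 else 1.0

random.seed(0)
t, server_free_at, waits = 0.0, 0.0, []
while t < 24 * 60:                          # one simulated day, in minutes
    t += random.expovariate(arrival_rate((t / 60) % 24))  # next arrival
    start = max(t, server_free_at)          # wait if the server is busy
    service = random.expovariate(1 / 0.08)  # ~5 s mean service time, in minutes
    server_free_at = start + service
    waits.append(start - t)

print(f"mean wait: {60 * sum(waits) / len(waits):.2f} s over {len(waits)} requests")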
Abstract:
The objective of this work was to develop and validate a prognosis system for volume yield and basal area of intensively managed loblolly pine (Pinus taeda) stands, using stand and diameter-class models that are compatible in their basal area estimates. The data used in the study were obtained from plantations located in northern Uruguay. For model validation without data loss, a three-phase validation scheme was applied: first, the equations were fitted without the validation database; then, model validation was carried out; and, finally, the database was regrouped to recalibrate the parameter values. After the validation and final parameterization of the models, a simulation of the first commercial thinning was carried out. The developed prognosis system was precise and accurate in estimating basal area production per hectare and per diameter class. Basal area estimates from the diameter-class and whole-stand models were compatible, with a mean difference of -0.01 m² ha⁻¹. The validation scheme applied is logical and consistent, since information on the accuracy and precision of the models is obtained without any loss of information in the estimation of the model parameters.
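A generic Python sketch of the three-phase scheme (fit with the validation subset held out, validate, then regroup and refit on all data); the linear model and synthetic data are placeholders, not the paper's stand-level equations.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(42)
X = rng.uniform(5, 30, size=(300, 2))                   # e.g., age and dominant height
y = 0.8 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 2, 300)

# Phase 1: fit without the validation database
X_fit, X_val, y_fit, y_val = train_test_split(X, y, test_size=0.3, random_state=1)
model = LinearRegression().fit(X_fit, y_fit)

# Phase 2: validate on the held-out subset
rmse = mean_squared_error(y_val, model.predict(X_val)) ** 0.5
print(f"validation RMSE: {rmse:.2f}")

# Phase 3: regroup the database and recalibrate the parameters on all data
final_model = LinearRegression().fit(X, y)
print("final coefficients:", final_model.coef_)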
Abstract:
The importance of perceiving the future with the help of weak signals has grown considerably in recent years, because changes in a company's business environment have become increasingly difficult to predict on the basis of history. Signs of many changes in the business environment have been present, but they have been difficult to detect. By identifying and collecting weak signals, and by reacting to the situation early enough, it is possible to achieve a superior competitive advantage. The literature review concentrates on the challenges of identifying weak signals in the business environment, on the development of signals and information, and on information management in an organization. The interest in these topics stems from the need to define the process required for identifying weak signals, so that weak signals can be taken into account in decision-making at M-real Oyj. The literature review clearly shows that weak signals exist and can be identified in the business environment. Signals can be enriched with the knowledge available in the company and exploited further in decision-making. Comparing the literature review with the empirical study clearly revealed the multifaceted nature of information: its quantity, its quality, and the timeliness of its availability for decision-making. During the study, a process model was developed for filtering and classifying information and for identifying weak signals. As the work progressed, the process model evolved into part of the 'Weak Signal Capturing' tool developed in this work. By replicating the tool, weak signals can be collected from different areas of M-real's business. By compiling the information systematically, the future can be mapped for M-real as a whole.
Abstract:
The objective of the study was to examine the stages of building a company's web presence and the measurement of its success. The building process was studied with the help of a five-step model. The steps of the model are: assessment, strategy formulation, plan, blueprint, and implementation. To complement the assessment and implementation phases, and in particular to support the measurement of the success of internet activities, the benefits of internet activities (CRM, communication, sales, and distribution channel benefits from a marketing perspective) were discussed. To support the evaluation of success, a staircase model for internet activities was also presented. The staircase model defines the storefront, dynamic, transaction, and e-business steps. The study identified success factors for internet activities. These factors are high-quality content, attractiveness, entertainment value, informativeness, timeliness, personalization, trust, interactivity, usability, convenience, loyalty, performance, responsiveness, and the collection of user information. The metrics were divided in the study into activity, behavior, and conversion metrics. In addition, other metrics and success indicators were presented. These elements of success and metrics were brought together in a new model for evaluating the success of internet activities. In the empirical part of the thesis, the presented theories were reflected against the web activities of ABB (within ABB, ABB Stotz-Kontakt in particular), using document analysis and interviews. The empirical part illustrated the theories in practice and revealed an opportunity to extend them. The model for building internet activities can also be used for developing web activities, and the staircase model is likewise suitable for evaluating existing internet activities. Applying the metrics in practice, however, revealed a need to develop them further and to study the topic in more depth. They should also be more closely tied to the measurement of overall business success.
Abstract:
Static process simulation has traditionally been used to model complex processes for various purposes. However, using static process simulators to prepare holistic examinations aimed at improving profit-making capability requires a great deal of work, because producing results requires assessing the applicability of detailed data that may be irrelevant to the objective. The data relevant to the total assessment gets buried in irrelevant data. Furthermore, the models include no examination of maintenance or risk management, and economic examination is often an extra property added to them that can be performed with a spreadsheet program. In this work, a process model applicable to holistic economic examinations has been developed. The model is based on the life cycle profit philosophy developed by Hagberg and Henriksson in 1996. The construction of the model drew on life cycle assessment and life cycle costing methodologies, with a view to developing, above all, a model applicable to the economic examination of complete wholes and one that focuses information requirements on the aspects essential to the objectives. Life cycle assessment and life cycle costing differ in their modeling principles, but features of both methodologies can be used in developing economic process modeling. Methods applicable to modeling complex processes can be examined from the viewpoint of life cycle methodologies, because they involve collecting and managing large bodies of information and producing information for the needs of decision-makers. The results of the study show that, based on the principles of life cycle modeling, a process model can be created and used to produce holistic efficiency examinations of the profit-making capability of a production line with fewer resources than traditional methods require. The model's calculations are based, to the maximum extent, on the factory's information system, which means the accuracy of the results can be improved by developing the information systems to provide better information for this kind of examination.
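To illustrate the life cycle profit idea at the core of the model, here is a toy discounted-cash-flow calculation; all figures, the cost breakdown, and the discount rate are invented for the example.

# Toy life-cycle-profit calculation: discounted revenues minus investment,
# operating, and maintenance costs over the production line's life.
def life_cycle_profit(investment, annual_revenue, annual_opex,
                      annual_maintenance, years, rate):
    profit = -investment
    for t in range(1, years + 1):
        cash_flow = annual_revenue - annual_opex - annual_maintenance
        profit += cash_flow / (1 + rate) ** t
    return profit

print(f"LCP: {life_cycle_profit(2.0e6, 9.0e5, 4.0e5, 1.0e5, 15, 0.08):,.0f} EUR")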
Abstract:
This paper explores asynchronous communication in computer-supported collaborative learning (CSCL). Thirty virtual forums are analysed both quantitatively and qualitatively. Quantitatively, the number of messages written, the message threads, and the original and answer messages are counted. Qualitatively, the content of the notes is analysed and catalogued at two different levels: on the one hand, into a set of knowledge-building process categories, and on the other, following the scaffolds that Knowledge Forum offers. The results show that both an exchange of information and collaborative work take place. Nevertheless, the construction of knowledge is superficial.
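A minimal sketch of the quantitative pass described (counting messages, threads, and original versus answer messages); the record format, with a parent_id of None marking an original post, is an assumption for the example.

# Count messages, threads, and original vs. answer messages in a forum dump.
from collections import Counter

messages = [
    {"id": 1, "thread": "T1", "parent_id": None},
    {"id": 2, "thread": "T1", "parent_id": 1},
    {"id": 3, "thread": "T2", "parent_id": None},
    {"id": 4, "thread": "T2", "parent_id": 3},
    {"id": 5, "thread": "T2", "parent_id": 4},
]

n_messages = len(messages)
n_threads = len({m["thread"] for m in messages})
kinds = Counter("original" if m["parent_id"] is None else "answer" for m in messages)
print(n_messages, n_threads, kinds)  # 5 2 Counter({'answer': 3, 'original': 2})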
Abstract:
Approximate models (proxies) can be employed to reduce the computational costs of estimating uncertainty. The price to pay is that the approximations introduced by the proxy model can lead to a biased estimation. To avoid this problem and ensure a reliable uncertainty quantification, we propose to combine functional data analysis and machine learning to build error models that allow us to obtain an accurate prediction of the exact response without solving the exact model for all realizations. We build the relationship between proxy and exact model on a learning set of geostatistical realizations for which both exact and approximate solvers are run. Functional principal components analysis (FPCA) is used to investigate the variability in the two sets of curves and reduce the dimensionality of the problem while maximizing the retained information. Once obtained, the error model can be used to predict the exact response of any realization on the basis of the sole proxy response. This methodology is purpose-oriented as the error model is constructed directly for the quantity of interest, rather than for the state of the system. Also, the dimensionality reduction performed by FPCA allows a diagnostic of the quality of the error model to assess the informativeness of the learning set and the fidelity of the proxy to the exact model. The possibility of obtaining a prediction of the exact response for any newly generated realization suggests that the methodology can be effectively used beyond the context of uncertainty quantification, in particular for Bayesian inference and optimization.
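A simplified Python sketch of the error-model construction described: FPCA (approximated here by plain PCA) reduces the proxy and exact response curves of a learning set to a few scores, a regression maps proxy scores to exact scores, and the exact curve of a new realization is then predicted from its proxy response alone. The synthetic curves and component counts are illustrative.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 100)                       # time axis of the response curves
n_learn = 60
amp = rng.uniform(0.5, 2.0, n_learn)[:, None]
exact = amp * np.exp(-3 * t) + 0.01 * rng.normal(size=(n_learn, t.size))
proxy = amp * np.exp(-2.7 * t)                   # biased approximate solver

# FPCA (here: plain PCA) on each set of curves from the learning set
pca_p = PCA(n_components=3).fit(proxy)
pca_e = PCA(n_components=3).fit(exact)

# Error model: map proxy scores to exact scores
reg = LinearRegression().fit(pca_p.transform(proxy), pca_e.transform(exact))

# Predict the exact response of a new realization from its proxy response only
new_proxy = 1.3 * np.exp(-2.7 * t)[None, :]
pred_exact = pca_e.inverse_transform(reg.predict(pca_p.transform(new_proxy)))
true_exact = 1.3 * np.exp(-3 * t)
print(f"max abs error: {np.abs(pred_exact[0] - true_exact).max():.4f}")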
Abstract:
Children who sustain a prenatal or perinatal brain injury in the form of a stroke develop remarkably normal cognitive functions in certain areas, with a particular strength in language skills. A dominant explanation for this is that brain regions from the contralesional hemisphere "take over" their functions, whereas the damaged areas and other ipsilesional regions play much less of a role. However, it is difficult to tease apart whether changes in neural activity after early brain injury are due to damage caused by the lesion or by processes related to postinjury reorganization. We sought to differentiate between these two causes by investigating the functional connectivity (FC) of brain areas during the resting state in human children with early brain injury using a computational model. We simulated a large-scale network consisting of realistic models of local brain areas coupled through anatomical connectivity information of healthy and injured participants. We then compared the resulting simulated FC values of healthy and injured participants with the empirical ones. We found that the empirical connectivity values, especially of the damaged areas, correlated better with simulated values of a healthy brain than those of an injured brain. This result indicates that the structural damage caused by an early brain injury is unlikely to have an adverse and sustained impact on the functional connections, albeit during the resting state, of damaged areas. Therefore, these areas could continue to play a role in the development of near-normal function in certain domains such as language in these children.
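A simplified sketch of the comparison performed: simulate resting-state-like signals from a structural connectivity (SC) matrix, compute the simulated functional connectivity (FC), and correlate FC matrices over their upper triangles. The linear stochastic dynamics and the toy random connectivity stand in for the realistic local models and the tractography data of the study.

import numpy as np

def simulate_fc(sc, steps=20000, coupling=0.5, noise=1.0, dt=0.05, seed=0):
    # Euler-Maruyama integration of a stable linear stochastic system whose
    # coupling matrix is the row-normalized structural connectivity.
    rng = np.random.default_rng(seed)
    n = sc.shape[0]
    A = coupling * sc / sc.sum(axis=1, keepdims=True) - np.eye(n)
    x = np.zeros(n)
    xs = np.empty((steps, n))
    for i in range(steps):
        x = x + dt * (A @ x) + np.sqrt(dt) * noise * rng.normal(size=n)
        xs[i] = x
    return np.corrcoef(xs.T)  # FC of the simulated signals

def fc_similarity(fc_a, fc_b):
    # Correlate the upper triangles of two FC matrices (diagonal excluded).
    iu = np.triu_indices_from(fc_a, k=1)
    return np.corrcoef(fc_a[iu], fc_b[iu])[0, 1]

# Toy symmetric connectivity standing in for healthy / injured tractography
rng = np.random.default_rng(1)
sc = rng.uniform(0, 1, (10, 10))
sc = (sc + sc.T) / 2
np.fill_diagonal(sc, 0)
print(f"FC reproducibility across noise seeds: "
      f"{fc_similarity(simulate_fc(sc, seed=0), simulate_fc(sc, seed=2)):.2f}")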