888 results for data-driven simulation
Abstract:
Ontology design and population, core aspects of semantic technologies, have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a great deal of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, thus speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as named entity recognition, entity resolution, taxonomy extraction, and relation extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20] or, more recently, Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability to logically understand the structure of discourse [7].
In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
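The relation-extraction step described in the abstract above can be illustrated with a minimal sketch. This is not the thesis pipeline: it uses a single hand-written lexical pattern (all names here are invented) to pull a (subject, relation, object) triple out of copular sentences, standing in for what a real frame/relation extractor would do statistically.

```python
import re

# One hypothetical pattern for "X is the R of Y" sentences; a real
# ontology-learning system would use trained classifiers instead.
PATTERN = re.compile(r"^(?P<subj>\w+) is the (?P<rel>[\w ]+?) of (?P<obj>\w+)\.?$")

def extract_triple(sentence):
    """Return a (subject, relation, object) triple, or None if no match."""
    m = PATTERN.match(sentence.strip())
    if m is None:
        return None
    return (m.group("subj"), m.group("rel"), m.group("obj"))

print(extract_triple("Rome is the capital of Italy."))
# -> ('Rome', 'capital', 'Italy')
```

Triples of this shape are exactly what gets serialized as linked data when populating a knowledge base.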
Abstract:
The Eye-Trauma project is part of the development of a surgical simulator for traumas to the ocular region, developed in collaboration with the Simulation Group in Boston, Harvard Medical School, and Massachusetts General Hospital. The simulator features a silicone torso fitted with interchangeable modules of the ocular region to simulate different types of trauma. The user is asked to perform the medical suturing procedure with surgical instruments fitted with force and opening sensors. The collected data are used within the software for gesture recognition and real-time monitoring of performance. The gesture-recognition algorithm, which I developed, is based on the concept of state machines; transitions between states occur according to the events detected by the simulator.
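A state-machine gesture recognizer of the kind the abstract describes can be sketched in a few lines. The states and event names below are invented for illustration; the actual simulator's events come from its force and opening sensors.

```python
# Hypothetical suturing-gesture FSM: (current state, event) -> next state.
# Unknown events leave the state unchanged, as a simple noise-rejection rule.
TRANSITIONS = {
    ("idle", "needle_grasped"): "grasping",
    ("grasping", "tissue_pierced"): "piercing",
    ("piercing", "thread_pulled"): "pulling",
    ("pulling", "knot_tied"): "done",
}

def run_gestures(events, start="idle"):
    """Feed a stream of simulator events through the state machine."""
    state = start
    for ev in events:
        state = TRANSITIONS.get((state, ev), state)
    return state

print(run_gestures(["needle_grasped", "tissue_pierced",
                    "thread_pulled", "knot_tied"]))  # -> done
```

Encoding the transitions as a dictionary keeps the recognizer declarative: adding a new gesture phase means adding entries, not rewriting control flow.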
Abstract:
We propose a computationally efficient and biomechanically relevant soft-tissue simulation method for cranio-maxillofacial (CMF) surgery. A template-based facial muscle reconstruction was introduced to minimize the effort of preparing a patient-specific model. A transversely isotropic mass-tensor model (MTM) was adopted to capture the directional properties of facial muscles in reasonable computation time. Additionally, sliding contact around the teeth and mucosa was considered for a more realistic simulation. A retrospective validation study with the postoperative scan of a real patient showed considerable improvements in simulation accuracy from incorporating template-based facial muscle anatomy and sliding contact.
Abstract:
Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning.
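The shape metric used in the abstract above (ratio of smallest to largest principal moment of inertia) is easy to compute for any cloud of cell positions. The sketch below assumes equal-mass points; the paper's own discretization is not specified here.

```python
import numpy as np

def inertia_ratio(points):
    """Ratio of smallest to largest principal moment of inertia of a
    set of equal-mass points; 1.0 for a fully isotropic shape."""
    pts = np.asarray(points, dtype=float)
    pts = pts - pts.mean(axis=0)                 # about the center of mass
    r2 = (pts ** 2).sum(axis=1)
    # I = sum_i (|r_i|^2 * E - r_i r_i^T), the standard inertia tensor
    inertia = np.einsum("i,jk->jk", r2, np.eye(3)) - pts.T @ pts
    eig = np.linalg.eigvalsh(inertia)
    return eig.min() / eig.max()

# A shape elongated along x has a ratio well below 1:
print(inertia_ratio([(2, 0, 0), (-2, 0, 0), (0, 1, 0),
                     (0, -1, 0), (0, 0, 1), (0, 0, -1)]))  # -> 0.4
```

A ratio drifting toward 1 under mechanical coupling is exactly the kind of "shape correction" the paper quantifies.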
Abstract:
A prototype vortex-driven air lift pump was developed and experimentally evaluated. It was designed to be easily manufactured and scalable for arbitrary riser diameters. The model tested fit in a 2 inch diameter riser with six air injection nozzles through which air was injected helically around the perimeter of the riser at an angle of 70° from pure tangential injection. The pump was intended to transport both water and sediment over a large range of submergence ratios. A test apparatus was designed to be able to simulate deep water or oceanic environments. The resulting test setup had a finite reservoir; over the course of a test, the submergence ratio varied from 0.48 to 0.39. For air injection pressures ranging from 10 to 60 psig and for air flow rates of 6 to 15 scfm, the induced water discharge flow rates varied only slightly, due to the limited range of available submergence ratios. The anticipated simulation of a deep water environment, with a corresponding equivalent increase in the submergence ratio, proved unattainable. The pump prototype successfully transported both water and sediment (sand). The percent volume yield of the sediment was in an acceptable range. The pump design has been subsequently used successfully in a 4 inch configuration in a follow-on project. A computer program was written in Matlab to simulate the pump characteristics. The program output water pressures at the location of air injection which were physically compatible with the experimental data.
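The water pressure at the air-injection point that the Matlab program outputs can, to first order, be estimated hydrostatically. The sketch below is not the authors' code; it assumes the submergence ratio is the submerged fraction of the riser, and the riser length used is illustrative.

```python
RHO = 998.0   # water density, kg/m^3 (room temperature)
G = 9.81      # gravitational acceleration, m/s^2

def injection_gauge_pressure(sr, riser_length_m):
    """Hydrostatic gauge pressure (Pa) at the injection point, taking
    the submergence ratio sr as the submerged fraction of the riser."""
    depth = sr * riser_length_m
    return RHO * G * depth

# e.g. a hypothetical 3 m riser at the reported initial SR of 0.48:
print(round(injection_gauge_pressure(0.48, 3.0), 1))  # -> 14098.1 Pa
```

The measured injection pressures would exceed this static value by the nozzle and frictional losses, which is why the paper only claims physical compatibility rather than agreement.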
Abstract:
The performance of the reanalysis-driven Canadian Regional Climate Model, version 5 (CRCM5) in reproducing the present climate over the North American COordinated Regional climate Downscaling EXperiment domain for the 1989–2008 period has been assessed in comparison with several observation-based datasets. The model satisfactorily reproduces the near-surface temperature and precipitation characteristics over most parts of North America. Coastal and mountainous zones remain problematic: a cold bias (2–6 °C) prevails over the Rocky Mountains in summertime and all year round over Mexico; winter precipitation in mountainous coastal regions is overestimated. The precipitation patterns related to the North American Monsoon are well reproduced, except at its northern limit. The spatial and temporal structure of the Great Plains Low-Level Jet is well reproduced by the model; however, the night-time precipitation maximum in the jet area is underestimated. The performance of CRCM5 was also assessed against earlier CRCM versions and other RCMs. CRCM5 is shown to be substantially improved compared to CRCM3 and CRCM4 in terms of seasonal mean statistics, and to be comparable to other modern RCMs.
Abstract:
In this paper, we present the Cellular Dynamic Simulator (CDS) for simulating diffusion and chemical reactions within crowded molecular environments. CDS is based on a novel event-driven algorithm specifically designed for precise calculation of the timing of collisions, reactions, and other events for each individual molecule in the environment. Generic mesh-based compartments allow the creation or importation of very simple or highly detailed cellular structures in a 3D environment. Multiple levels of compartments and static obstacles can be used to create a dense environment that mimics cellular boundaries and the intracellular space. The CDS algorithm takes into account volume exclusion and molecular crowding, which may impact signaling cascades in small sub-cellular compartments such as dendritic spines. With the CDS, we can simulate simple enzyme reactions, aggregation, and channel transport, as well as highly complicated chemical reaction networks of both freely diffusing and membrane-bound multi-protein complexes. Components of the CDS are defined generically, so the simulator can be applied to a wide range of environments in terms of scale and level of detail. Through an initialization GUI, a simple simulation environment can be created and populated within minutes, yet the tool is powerful enough to design complex 3D cellular architectures. The initialization tool allows visual confirmation of the environment construction prior to execution by the simulator. This paper describes the CDS algorithm and its design and implementation, and provides an overview of the available features, whose utility is highlighted in demonstrations.
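The core of any event-driven simulator like the one the abstract describes is a priority queue keyed by event time, so that collisions and reactions are always processed in chronological order. The sketch below shows only that scheduling skeleton, with invented event payloads; CDS itself additionally recomputes future events after each one fires.

```python
import heapq

class EventQueue:
    """Minimal time-ordered event queue for an event-driven simulation."""
    def __init__(self):
        self._heap = []
        self._seq = 0            # insertion counter breaks ties at equal times

    def schedule(self, time, payload):
        heapq.heappush(self._heap, (time, self._seq, payload))
        self._seq += 1

    def run(self):
        """Pop events in time order; here we just record them."""
        order = []
        while self._heap:
            time, _, payload = heapq.heappop(self._heap)
            order.append((time, payload))
        return order

q = EventQueue()
q.schedule(2.5, "collision A-B")
q.schedule(0.7, "reaction X->Y")
q.schedule(1.9, "membrane crossing")
print(q.run())
# -> [(0.7, 'reaction X->Y'), (1.9, 'membrane crossing'), (2.5, 'collision A-B')]
```

The advantage over fixed-timestep integration is exactly what the abstract claims: each event's timing is resolved precisely rather than snapped to a grid.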
Abstract:
The aim of this study was to explore potential causes and mechanisms for the sequence and temporal pattern of tree taxa, specifically for the shift from shrub-tundra to birch–juniper woodland during and after the transition from the Oldest Dryas to the Bølling–Allerød in the region surrounding the lake Gerzensee in southern Central Europe. We tested the influence of climate, forest dynamics, and community dynamics against other possible causes of delays. To this end, temperature reconstructed from a δ18O record was used as input to drive the multi-species forest-landscape model TreeMig. In a stepwise scenario analysis, population dynamics along with pollen production and transport were simulated and compared with pollen-influx data, according to scenarios with different δ18O/temperature sensitivities, different precipitation levels, with and without inter-specific competition, and with and without prescribed arrival of species. In the best-fitting scenarios, the effects on competitive relationships, pollen production, spatial forest structure, albedo, and surface roughness were examined in more detail. The appearance of most taxa in the data could only be explained by the coldest temperature scenario, with a sensitivity of 0.3‰/°C corresponding to an anomaly of −15 °C. Once the taxa were present, their temporal pattern was shaped by competition. The later arrival of Pinus could not be explained even by the coldest temperatures, and its timing had to be prescribed from first observations in the pollen record. After its arrival in the simulation area, the expansion of Pinus was further influenced by competitors and minor climate oscillations. The rapid change in the simulated species composition went along with a drastic change in forest structure, leaf area, albedo, and surface roughness. Pollen increased only shortly after biomass.
Based on our simulations, two alternative potential scenarios for the pollen pattern can be given: either a very cold climate suppressed most species in the Oldest Dryas, or they were delayed by soil formation or migration. One taxon, Pinus, was delayed by migration and then additionally hindered by competition. Community dynamics affected the pattern in two ways: potentially by facilitation, i.e. by nitrogen-fixing pioneer species at the onset, whereas the later pattern was clearly shaped by competition. The simulated structural changes illustrate how vegetation on a larger scale could feed back to the climate system. For a better understanding, a more integrated simulation approach that also covers immigration from refugia would be necessary, since it would combine climate-driven population dynamics, migration, individual pollen production and transport, soil dynamics, and the physiology of pollen production.
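The scenario arithmetic quoted in the abstract is a simple linear mapping: with a sensitivity of 0.3‰ per °C, a δ18O anomaly converts to a temperature anomaly by division, so an anomaly of −4.5‰ corresponds to the −15 °C scenario. A one-line sketch (the sensitivity value is the one stated; the function name is ours):

```python
SENSITIVITY = 0.3   # permil of delta-18O per degree C, the coldest scenario

def temperature_anomaly(d18o_anomaly_permil, sensitivity=SENSITIVITY):
    """Convert a delta-18O anomaly (permil) to a temperature anomaly (C)."""
    return d18o_anomaly_permil / sensitivity

print(round(temperature_anomaly(-4.5), 6))  # -> -15.0
```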
Abstract:
We consider a large quantum system of spin-1/2 particles whose dynamics is driven entirely by measurements of the total spin of spin pairs. This gives rise to a dissipative coupling to the environment. When one averages over the measurement results, the corresponding real-time path integral does not suffer from a sign problem. Using an efficient cluster algorithm, we study the real-time evolution from an initial antiferromagnetic state of the two-dimensional Heisenberg model, which is driven to a disordered phase, not by a Hamiltonian, but by sporadic measurements or by continuous Lindblad evolution.
Abstract:
Using quantum Monte Carlo, we study the nonequilibrium transport of magnetization in large open strongly correlated quantum spin-1/2 systems driven by purely dissipative processes that conserve the uniform or staggered magnetization, disregarding unitary Hamiltonian dynamics. We prepare both a low-temperature Heisenberg ferromagnet and an antiferromagnet in two parts of the system that are initially isolated from each other. We then bring the two subsystems in contact and study their real-time dissipative dynamics for different geometries. The flow of the uniform or staggered magnetization from one part of the system to the other is described by a diffusion equation that can be derived analytically.
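The diffusive behavior the abstract arrives at can be illustrated classically. The sketch below is not the paper's quantum Monte Carlo method: it is a plain explicit finite-difference solution of the 1D diffusion equation dm/dt = D d²m/dx², with two initially isolated halves brought into contact at t = 0 and reflecting ends so that total magnetization is conserved, as in the conserving dissipative dynamics described.

```python
import numpy as np

def diffuse(m, d_coef, dx, dt, steps):
    """Explicit FTCS integration of dm/dt = D d2m/dx2 with no-flux ends.
    Stable for dt * d_coef / dx**2 <= 0.5."""
    m = m.astype(float).copy()
    for _ in range(steps):
        lap = np.zeros_like(m)
        lap[1:-1] = (m[2:] - 2 * m[1:-1] + m[:-2]) / dx ** 2
        lap[0] = (m[1] - m[0]) / dx ** 2          # reflecting boundaries
        lap[-1] = (m[-2] - m[-1]) / dx ** 2
        m += dt * d_coef * lap
    return m

# Two halves with opposite magnetization, joined at t = 0:
m0 = np.concatenate([np.full(50, 1.0), np.full(50, -1.0)])
m1 = diffuse(m0, d_coef=1.0, dx=1.0, dt=0.2, steps=500)
print(round(m1.sum(), 6))   # total magnetization stays (numerically) zero
```

The interface profile relaxes toward a smooth error-function-like shape while the conserved quantity flows from one subsystem to the other, which is the qualitative content of the paper's analytic diffusion description.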
Abstract:
Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When parameterizing the algorithm based on retrospective analyses of 6 years of historic data, the probability of detection was satisfactory for large (range 83–445 cases) outbreaks but poor for small (range 20–177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. The high variance observed in the whole-carcass condemnation time series, and a lack of flexibility in the temporal distribution of simulated outbreaks resulting from the low (monthly) reporting frequency, constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve the timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system that simultaneously evaluates multiple sources of data on livestock health.
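A greatly simplified stand-in for the detection rule can make the 0.975-quantile idea concrete. This is not the improved Farrington algorithm the paper evaluates: the baseline here is a plain Poisson mean of the historical counts and the alarm threshold is the 0.975 Poisson quantile, whereas the real algorithm additionally models trend, seasonality, and overdispersion (quasi-Poisson). All counts below are invented.

```python
import math

def poisson_ppf(q, mu):
    """Smallest k with P(X <= k) >= q for X ~ Poisson(mu)."""
    cdf, pmf, k = 0.0, math.exp(-mu), 0
    while True:
        cdf += pmf
        if cdf >= q:
            return k
        k += 1
        pmf *= mu / k

def outbreak_flags(history, current):
    """Flag each current count exceeding the 0.975 quantile of the
    Poisson baseline fitted as the mean of the historical counts."""
    mu = sum(history) / len(history)
    threshold = poisson_ppf(0.975, mu)
    return [c > threshold for c in current]

history = [4, 6, 5, 7, 5, 6, 4, 5, 6, 5]    # hypothetical monthly counts
print(outbreak_flags(history, [6, 14]))      # -> [False, True]
```

With monthly reporting, even a correct flag can come late, which is exactly the timeliness problem the paper highlights.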
Abstract:
Application of pressure-driven laminar flow has an impact on zone and boundary dispersion in open tubular CE. The GENTRANS dynamic simulator for electrophoresis was extended with Taylor-Aris diffusivity which accounts for dispersion due to the parabolic flow profile associated with pressure-driven flow. Effective diffusivity of analyte and system zones as functions of the capillary diameter and the amount of flow in comparison to molecular diffusion alone were studied for configurations with concomitant action of imposed hydrodynamic flow and electroosmosis. For selected examples under realistic experimental conditions, simulation data are compared with those monitored experimentally using modular CE setups featuring both capacitively coupled contactless conductivity and UV absorbance detection along a 50 μm id fused-silica capillary of 90 cm total length. The data presented indicate that inclusion of flow profile based Taylor-Aris diffusivity provides realistic simulation data for analyte and system peaks, particularly those monitored in CE with conductivity detection.
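The standard Taylor-Aris result presumably underlying the extended GENTRANS model is compact enough to state directly: for pressure-driven (Poiseuille) flow in an open tube of radius a with mean velocity v, the effective axial diffusivity is D_eff = D_m + a²v²/(48 D_m). A sketch with illustrative numbers (the specific values are ours, not the paper's):

```python
def taylor_aris(d_m, radius_m, velocity_m_s):
    """Taylor-Aris effective axial diffusivity (m^2/s) for laminar
    pressure-driven flow in an open circular tube."""
    return d_m + (radius_m ** 2) * (velocity_m_s ** 2) / (48.0 * d_m)

# 50 um i.d. capillary (radius 25 um), 1 mm/s mean flow, a small analyte:
print(taylor_aris(1.0e-9, 25e-6, 1.0e-3))  # ~1.4e-8, well above D_m alone
```

The quadratic dependence on both radius and velocity is why even modest imposed hydrodynamic flow noticeably broadens analyte and system zones.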
Abstract:
Sizes and power of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto), and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40%, and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto, and Wilcoxon tests perform at nominal 5% size expectations, but the F, Score, and Mantel tests exceeded 5% size confidence limits for 1/3 of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in the power characteristics of the tests within the classes of tests considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests depend on the scale parameter of the Weibull distribution. Wilcoxon-type tests appear more powerful than logrank tests in the case of late-crossing survival curves and less powerful for early-crossing ones. Guidelines for the appropriate selection of two-sample tests are given.
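The data-generating step of such a simulation study can be sketched as follows. This is an illustration, not the study's actual code: survival times are exponential with a given hazard, censoring times are independent exponentials, and the observed datum is the minimum of the two together with an event indicator. With hazard λ and censoring rate λc, the expected censoring proportion is λc/(λ + λc).

```python
import random

def censored_sample(n, hazard, censor_rate, rng):
    """Return n (observed time, event indicator) pairs from an
    exponential survival time randomly censored by an independent
    exponential censoring time."""
    out = []
    for _ in range(n):
        t = rng.expovariate(hazard)
        c = rng.expovariate(censor_rate) if censor_rate > 0 else float("inf")
        out.append((min(t, c), t <= c))     # event=True means uncensored
    return out

rng = random.Random(42)
data = censored_sample(1000, hazard=1.0, censor_rate=0.25, rng=rng)
frac_censored = sum(1 for _, event in data if not event) / len(data)
# theoretical censoring proportion: 0.25 / (1.0 + 0.25) = 0.2
print(round(frac_censored, 2))
```

Tuning `censor_rate` per arm is how unequal censoring patterns like the 0%/20%/40%/60% combinations in the study would be produced.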