891 results for data-driven simulation


Relevance: 40.00%
Publisher:
Abstract:

ISO19156 Observations and Measurements (O&M) provides a standardised framework for organising information about the collection of environmental observations. Here we describe the implementation of a specialisation of O&M for environmental data, the Metadata Objects for Linking Environmental Sciences (MOLES3). MOLES3 provides support for organising information about data and for user navigation around data holdings. The implementation described here, “CEDA-MOLES”, also supports data management functions for the Centre for Environmental Data Archival (CEDA). The previous iteration of MOLES (MOLES2) saw active use over five years before being replaced by CEDA-MOLES in late 2014. During that period important lessons were learnt both about the information needed and about how to design and maintain the necessary information systems. In this paper we review the problems encountered in MOLES2; how and why CEDA-MOLES was developed and engineered; the migration of information holdings from MOLES2 to CEDA-MOLES; and, finally, provide an early assessment of MOLES3 (as implemented in CEDA-MOLES) and its limitations. Key drivers for the MOLES3 development included the necessity for improved data provenance, for further structured information to support ISO19115 discovery metadata export (for EU INSPIRE compliance), and for appropriate fixed landing pages for Digital Object Identifiers (DOIs) in the presence of evolving datasets. Key lessons learned included the importance of minimising information structure in free-text fields, and the necessity to support as much agility in the information infrastructure as possible without compromising maintainability, both for those using the systems internally and externally (e.g. citing into the information infrastructure) and for those responsible for the systems themselves. The migration itself needed to ensure continuity of service and traceability of archived assets.
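
As a rough illustration of the O&M pattern underlying MOLES3 (not the CEDA-MOLES data model itself), an observation record groups a result with the feature of interest, the observed property and the procedure that produced it; a minimal, hypothetical Python sketch:

# Minimal, hypothetical sketch of an O&M-style observation record.
# Field names follow the ISO 19156 core properties; the identifiers and
# values are made up and do not reproduce the actual CEDA-MOLES schema.
observation = {
    "phenomenonTime": "2014-07-01T00:00:00Z/2014-07-31T23:59:59Z",
    "resultTime": "2014-08-02T09:30:00Z",
    "procedure": "urn:example:procedure/radiosonde-launch",   # hypothetical identifier
    "observedProperty": "air_temperature",
    "featureOfInterest": "urn:example:station/site-042",      # hypothetical identifier
    "result": {"units": "K", "values_uri": "https://example.org/data/file.nc"},
}

print(observation["observedProperty"])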

Relevance: 40.00%
Publisher:
Abstract:

As a result of urbanization, stormwater runoff flow rates and volumes are significantly increased due to increasing impervious land cover and the decreased availability of depression storage. Storage tanks are the basic devices for efficiently controlling the flow rate in drainage systems during wet weather. The concept of vacuum-driven detention tanks presented in this paper increases the storage capacity by using the space above the free-surface water elevation at the inlet channel. Partial vacuum storage makes it possible to reduce costs by decreasing both the horizontal area of the detention tank and the required foundation depth. A simulation model of a vacuum-driven storage tank was developed to estimate the potential benefits of its application in an urban drainage system. Although SWMM5 has no direct option for vacuum tanks, existing functions (i.e. control rules) were used to represent its operating phases. Rainfall data used in the simulations were recorded at a rain gauge in Czestochowa during the years 2010-2012 with a time interval of 10 minutes. The simulation results give an overview of the practical operation and maintenance cost (energy demand) of vacuum-driven storage tanks as a function of the ratio of vacuum-driven volume to total storage capacity. The following conclusions can be drawn from this investigation: vacuum-driven storage tanks have uncomplicated construction and control systems and can therefore be applied in newly developed as well as existing urban drainage systems; the application of vacuum in underground detention facilities makes it possible to increase the storage capacity of existing reservoirs by using the space above the maximum depth, with possible capacity gains of up to a few dozen percent at relatively low investment cost; and vacuum-driven storage tanks can be represented in existing simulation software (e.g. SWMM) using options intended for pumping stations (including control and action rules).
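
The rule-based behaviour used to mimic the vacuum phases can be illustrated with a simple threshold controller; a minimal Python sketch with hypothetical thresholds and names (not an actual SWMM5 input file):

# Toy controller for the vacuum pump that fills the space above the
# free-surface level; thresholds and names are hypothetical.
def vacuum_pump_status(inlet_depth_m, tank_fill_fraction,
                       depth_on=1.5, fill_off=0.95):
    """Engage the vacuum when the inlet channel backs up and the tank is
    below capacity; stop it once the vacuum-driven volume is nearly used up."""
    if inlet_depth_m > depth_on and tank_fill_fraction < fill_off:
        return "ON"
    return "OFF"

# Example: inlet channel at 1.8 m, tank 60 % full -> the vacuum pump engages.
print(vacuum_pump_status(1.8, 0.60))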

Relevance: 40.00%
Publisher:
Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 40.00%
Publisher:
Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance: 40.00%
Publisher:
Abstract:

Stochastic methods based on time-series modeling combined with geostatistics can be useful tools to describe the variability of water-table levels in time and space and to account for uncertainty. Water-level monitoring networks can give information about the dynamics of the aquifer domain in both dimensions. Time-series modeling is an elegant way to treat monitoring data without the complexity of physical mechanistic models. Time-series model predictions can be interpolated spatially, with the spatial differences in water-table dynamics determined by the spatial variation in the system properties and the temporal variation driven by the dynamics of the inputs into the system. An integration of stochastic methods is presented, based on time-series modeling and geostatistics, as a framework to predict water levels for decision making in groundwater management and land-use planning. The methodology is applied in a case study in a Guarani Aquifer System (GAS) outcrop area located in the southeastern part of Brazil. Communication of results in a clear and understandable form, via simulated scenarios, is discussed as an alternative when translating scientific knowledge into applications of stochastic hydrogeology in large aquifers with limited monitoring-network coverage such as the GAS.
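
The core idea (fit a time-series model per monitoring well, then interpolate the predictions in space) can be sketched as follows; the well data, coordinates, AR(1) model and inverse-distance weighting are illustrative stand-ins, not the models or data of the GAS case study:

import numpy as np

# Hypothetical monitoring data: water levels (m) at three wells, monthly steps.
wells = {
    "W1": {"xy": (0.0, 0.0), "levels": np.array([610.2, 610.0, 609.8, 609.9, 610.1])},
    "W2": {"xy": (5.0, 2.0), "levels": np.array([598.5, 598.3, 598.2, 598.4, 598.6])},
    "W3": {"xy": (1.0, 6.0), "levels": np.array([605.0, 604.7, 604.5, 604.6, 604.9])},
}

def ar1_forecast(levels, steps=1):
    """Fit h_t = c + phi * h_(t-1) by least squares and forecast ahead."""
    x, y = levels[:-1], levels[1:]
    A = np.column_stack([np.ones_like(x), x])
    c, phi = np.linalg.lstsq(A, y, rcond=None)[0]
    h = levels[-1]
    for _ in range(steps):
        h = c + phi * h
    return h

def idw(target_xy, points, values, power=2.0):
    """Inverse-distance-weighted interpolation (simple stand-in for kriging)."""
    d = np.array([np.hypot(target_xy[0] - x, target_xy[1] - y) for x, y in points])
    w = 1.0 / np.maximum(d, 1e-9) ** power
    return float(np.sum(w * values) / np.sum(w))

forecasts = {k: ar1_forecast(v["levels"]) for k, v in wells.items()}
coords = [v["xy"] for v in wells.values()]
print(idw((2.0, 3.0), coords, np.array(list(forecasts.values()))))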

Relevance: 40.00%
Publisher:
Abstract:

With the increasing production of information from e-government initiatives, there is also the need to transform a large volume of unstructured data into useful information for society. All this information should be easily accessible and made available in a meaningful and effective way in order to achieve semantic interoperability in electronic government services, which is a challenge to be pursued by governments around the world. Our aim is to discuss the context of e-Government Big Data and to present a framework to promote semantic interoperability through the automatic generation of ontologies from unstructured information found on the Internet. We propose the use of fuzzy mechanisms to deal with natural language terms and present some related work in this area. The results achieved in this study are based on the architectural definition and the major components and requirements that compose the proposed framework. With this, it is possible to take advantage of the large volume of information generated by e-Government initiatives and use it to benefit society.
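
One way such fuzzy mechanisms can work is by scoring the similarity between free-text terms and candidate ontology concepts; a minimal, hypothetical Python sketch (the concept names and threshold are illustrative, not components of the proposed framework):

from difflib import SequenceMatcher

# Hypothetical ontology concepts and a free-text term extracted from a document.
concepts = ["Citizen", "PublicService", "TaxPayment", "HealthcareFacility"]

def fuzzy_match(term, candidates, threshold=0.6):
    """Return (concept, similarity) pairs above the membership threshold."""
    scored = [(c, SequenceMatcher(None, term.lower(), c.lower()).ratio())
              for c in candidates]
    return sorted([s for s in scored if s[1] >= threshold],
                  key=lambda s: s[1], reverse=True)

print(fuzzy_match("tax payments", concepts))   # e.g. [('TaxPayment', 0.9...)]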

Relevance: 40.00%
Publisher:
Abstract:

Degree in Marine Sciences. Faculty of Marine Sciences, University of Las Palmas de Gran Canaria. Institut de Ciències del Mar, Consejo Superior de Investigaciones Científicas

Relevance: 40.00%
Publisher:
Abstract:

A trans-oceanic section at 24.5°N in the North Atlantic has been sampled at a decadal frequency. This work demonstrates that the wind-driven component of the Meridional Overturning Circulation (MOC) may be monitored using autonomous profiling floats deployed in the eastern North Atlantic Subtropical Gyre. More than 500 CTD vertical profiles from the surface to 2000 m depth, spanning one year (from April 2002 to March 2003), are used to compute the geostrophic transport stream function at 24.5°N. The baroclinic transport obtained from the autonomous profiling floats is not statistically different from that obtained during three hydrographic cruises carried out in 1957, 1981 and 1992. Good agreement is found between the geostrophic transport stream function and the transport derived from the wind field through the Sverdrup relation.
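
For reference, the Sverdrup relation mentioned above links the depth-integrated meridional transport to the curl of the surface wind stress:

% Sverdrup balance: V is the depth-integrated meridional transport per unit width,
% \rho_0 a reference density, \beta = df/dy the planetary vorticity gradient,
% and \boldsymbol{\tau} the surface wind stress.
\beta V = \frac{1}{\rho_0}\,\hat{\mathbf{k}}\cdot\left(\nabla \times \boldsymbol{\tau}\right)
\qquad\Longrightarrow\qquad
V = \frac{1}{\rho_0\,\beta}\,\operatorname{curl}_z \boldsymbol{\tau}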

Relevance: 40.00%
Publisher:
Abstract:

Ontology design and population (core aspects of semantic technologies) have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning ontologies from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, and speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts from natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as named entity recognition, entity resolution, taxonomy and relation extraction [11]. These approaches present some weaknesses, especially in capturing structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20], or more recently Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability to logically understand the structure of discourse [7]. In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
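
A toy illustration of the entity-and-relation extraction step (a pattern-based sketch with made-up verbs and text, far simpler than the NER, entity resolution and frame-based components described here):

import re

# Toy pattern-based extractor: finds "X <verb> Y" statements for a small,
# hypothetical verb list. Real systems use NER, parsing and frame detection.
RELATION_VERBS = r"(acquired|founded|produces|employs)"
PATTERN = re.compile(rf"([A-Z][\w ]+?)\s+{RELATION_VERBS}\s+([A-Z][\w ]+)")

text = "Acme Corp acquired Widget Ltd. Jane Doe founded Acme Corp."

triples = [(m.group(1).strip(), m.group(2), m.group(3).strip())
           for m in PATTERN.finditer(text)]
print(triples)
# e.g. [('Acme Corp', 'acquired', 'Widget Ltd'), ('Jane Doe', 'founded', 'Acme Corp')]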

Relevance: 40.00%
Publisher:
Abstract:

The Eye-Trauma project is part of the development of a surgical simulator for trauma to the ocular region, developed in collaboration with the Simulation Group in Boston, Harvard Medical School and Massachusetts General Hospital. The simulator features a silicone torso fitted with interchangeable modules of the ocular area to simulate different types of trauma. The user is asked to perform the medical suturing procedure using surgical instruments equipped with force and opening sensors. The collected data are used within the software for gesture recognition and real-time performance monitoring. The gesture-recognition algorithm, developed by the author, is based on the concept of state machines; transitions between states are driven by the events detected by the simulator.
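
Such an event-driven state machine can be sketched as follows; the states, events and transitions are hypothetical simplifications, not those of the Eye-Trauma implementation:

# Minimal sketch of an event-driven state machine for gesture recognition.
TRANSITIONS = {
    ("idle",        "tool_grasped"):   "approaching",
    ("approaching", "tissue_contact"): "suturing",
    ("suturing",    "force_exceeded"): "error",
    ("suturing",    "tool_released"):  "completed",
}

def step(state, event):
    """Advance the machine; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "idle"
for event in ["tool_grasped", "tissue_contact", "tool_released"]:
    state = step(state, event)
    print(event, "->", state)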

Relevance: 40.00%
Publisher:
Abstract:

We propose a computationally efficient and biomechanically relevant soft-tissue simulation method for cranio-maxillofacial (CMF) surgery. A template-based facial muscle reconstruction was introduced to minimize the effort needed to prepare a patient-specific model. A transversely isotropic mass-tensor model (MTM) was adopted to capture the directional properties of facial muscles in reasonable computation time. Additionally, sliding contact around the teeth and mucosa was considered for more realistic simulation. A retrospective validation study with the postoperative scan of a real patient showed considerable improvements in simulation accuracy when template-based facial muscle anatomy and sliding contact were incorporated.
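
For background, the generic linear mass-tensor formulation assembles the elastic force on each mesh vertex from stiffness tensors precomputed on the tetrahedral mesh; the transversely isotropic variant used here adds direction-dependent terms, so this sketch shows only the generic form:

% Generic linear mass-tensor elastic force on vertex i:
% K_ii and K_ij are 3x3 stiffness tensors precomputed from the tetrahedral mesh
% and material parameters, u_j is the displacement of vertex j, and N(i) is the
% set of vertices connected to i by an edge.
\mathbf{f}_i = \mathbf{K}_{ii}\,\mathbf{u}_i + \sum_{j \in \mathcal{N}(i)} \mathbf{K}_{ij}\,\mathbf{u}_j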

Relevance: 40.00%
Publisher:
Abstract:

Modeling of tumor growth has been performed according to various approaches addressing different biocomplexity levels and spatiotemporal scales. Mathematical treatments range from partial differential equation based diffusion models to rule-based cellular level simulators, aiming at both improving our quantitative understanding of the underlying biological processes and, in the mid- and long term, constructing reliable multi-scale predictive platforms to support patient-individualized treatment planning and optimization. The aim of this paper is to establish a multi-scale and multi-physics approach to tumor modeling taking into account both the cellular and the macroscopic mechanical level. Therefore, an already developed biomodel of clinical tumor growth and response to treatment is self-consistently coupled with a biomechanical model. Results are presented for the free growth case of the imageable component of an initially point-like glioblastoma multiforme tumor. The composite model leads to significant tumor shape corrections that are achieved through the utilization of environmental pressure information and the application of biomechanical principles. Using the ratio of smallest to largest moment of inertia of the tumor material to quantify the effect of our coupled approach, we have found a tumor shape correction of 20% by coupling biomechanics to the cellular simulator as compared to a cellular simulation without preferred growth directions. We conclude that the integration of the two models provides additional morphological insight into realistic tumor growth behavior. Therefore, it might be used for the development of an advanced oncosimulator focusing on tumor types for which morphology plays an important role in surgical and/or radio-therapeutic treatment planning.
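
The shape measure used above, the ratio of smallest to largest principal moment of inertia, can be computed directly from a segmented (voxelised) tumor volume; a minimal Python sketch with a hypothetical ellipsoidal mask:

import numpy as np

def inertia_ratio(mask, spacing=(1.0, 1.0, 1.0)):
    """Ratio of smallest to largest principal moment of inertia of a binary
    3-D mask with unit voxel mass; 1.0 for a sphere, smaller for elongated shapes."""
    coords = np.argwhere(mask) * np.asarray(spacing)   # voxel centres
    r = coords - coords.mean(axis=0)                   # relative to centre of mass
    x, y, z = r[:, 0], r[:, 1], r[:, 2]
    I = np.array([
        [np.sum(y**2 + z**2), -np.sum(x*y),         -np.sum(x*z)],
        [-np.sum(x*y),         np.sum(x**2 + z**2), -np.sum(y*z)],
        [-np.sum(x*z),        -np.sum(y*z),          np.sum(x**2 + y**2)],
    ])
    eig = np.linalg.eigvalsh(I)                        # ascending eigenvalues
    return eig[0] / eig[-1]

# Hypothetical ellipsoidal mask, twice as long along one axis as along the others.
zz, yy, xx = np.mgrid[-20:21, -10:11, -10:11]
mask = (xx / 10.0)**2 + (yy / 10.0)**2 + (zz / 20.0)**2 <= 1.0
print(round(inertia_ratio(mask), 3))    # roughly 0.4 for this 2:1:1 ellipsoid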

Relevance: 40.00%
Publisher:
Abstract:

A prototype vortex-driven air lift pump was developed and experimentally evaluated. It was designed to be easily manufactured and scalable for arbitrary riser diameters. The model tested fit in a 2 inch diameter riser with six air injection nozzles through which air was injected helically around the perimeter of the riser at an angle of 70° from pure tangential injection. The pump was intended to transport both water and sediment over a large range of submergence ratios. A test apparatus was designed to be able to simulate deep water or oceanic environments. The resulting test setup had a finite reservoir; over the course of a test, the submergence ratio varied from 0.48 to 0.39. For air injection pressures ranging from 10 to 60 psig and for air flow rates of 6 to 15 scfm, the induced water discharge flow rates varied only slightly, due to the limited range of available submergence ratios. The anticipated simulation of a deep water environment, with a corresponding equivalent increase in the submergence ratio, proved unattainable. The pump prototype successfully transported both water and sediment (sand). The percent volume yield of the sediment was in an acceptable range. The pump design has subsequently been used successfully in a 4 inch configuration in a follow-on project. A computer program was written in Matlab to simulate the pump characteristics. The program output water pressures at the location of air injection which were physically compatible with the experimental data.
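
The water pressure at the air-injection point is, to first order, the hydrostatic head of the water column above it; a minimal Python sketch with hypothetical geometry (not the test rig's actual dimensions):

# Hydrostatic pressure at the air-injection point and the submergence ratio
# of an air lift pump; all dimensions below are made-up examples.
RHO_WATER = 998.0      # kg/m^3
G = 9.81               # m/s^2

def injection_pressure_kpa(water_level_above_injection_m):
    """Gauge pressure (kPa) of the water column above the injection point."""
    return RHO_WATER * G * water_level_above_injection_m / 1000.0

def submergence_ratio(submerged_length_m, total_lift_length_m):
    """Submergence ratio = submerged riser length / total length to discharge."""
    return submerged_length_m / total_lift_length_m

# Example: injection 3.0 m below the water surface, discharge 6.7 m above the
# injection point -> ratio of about 0.45, comparable to the 0.39-0.48 range tested.
print(round(injection_pressure_kpa(3.0), 1), round(submergence_ratio(3.0, 6.7), 2))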