947 results for modeling and model calibration


Relevance:

100.00%

Publisher:

Abstract:

Madagascar’s terrestrial and aquatic ecosystems have long supported a unique set of ecological communities, many of which are endemic to the tropical island. Those same ecosystems have been a source of valuable natural resources for some of the poorest people in the world. With pride, ingenuity and resourcefulness, the Malagasy people of the southwest coast, of Vezo identity, subsist by low-technology fishing techniques in an increasingly threatened host of aquatic seascapes. The mangroves, seagrass beds, and coral reefs of the region are under increasing pressure from the general populace, both for food provision and for economic opportunity. Beyond purveyors and extractors, the coastal waters are also subject to a number of natural stressors, including cyclones and invasive predator species of both flora and fauna. In addition, the aquatic ecosystems of the region are undergoing increased nutrient and sediment runoff due, in part, to Madagascar’s heavy reliance on land for agriculture (Scales, 2011). Moreover, its coastal waters, like so many throughout the world, have been warming at an alarming rate over the past few decades. Recognizing the intimate interconnectedness of both the social and the ecological systems, conservation organizations have undertaken a host of complementary conservation and social development efforts with the dual aim of preserving or restoring the health of both the coastal ecosystems and the people of the region. This paper first provides a way of thinking more holistically about the social-ecological system within a resiliency frame of understanding. Second, it applies a platform known as state-and-transition modeling to give form to the process. State-and-transition modeling is an iterative investigation into the physical makeup of a system of study, as well as the boundaries of and influences on that state, and has been used in restoration ecology for more than a decade.
Lastly, that model is sited within an adaptive management scheme that provides a structured, cyclical, objective-oriented process for testing stakeholders' cognitive understanding of the ecosystem through pragmatic implementation and monitoring of a host of small-scale interventions developed as part of the adaptive management process. Throughout, evidence of the application of these theories and frameworks is offered, with every effort made to equip conservation-minded development practitioners with a comprehensive strategy for addressing the increasingly fragile social-ecological systems of southwest Madagascar. It is offered, in conclusion, that the seascapes of the region would make an excellent case study for future application of state-and-transition modeling and adaptive management as frameworks for conservation-minded development practitioners whose multiple projects, each with its own objective, are implemented with a single goal in mind: preserve and protect the state of the supporting environment while providing for the basic needs of the local Malagasy people.
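As an illustrative sketch of the state-and-transition idea, the following Python fragment encodes a hypothetical seascape as a set of states with driver-triggered transitions; all state and driver names are invented for illustration and are not drawn from the paper.

```python
# Hypothetical state-and-transition model: states are nodes and drivers
# label the transitions between them. All names are illustrative.
STM = {
    "healthy_reef": {"overfishing": "algae_dominated",
                     "cyclone": "damaged_reef"},
    "algae_dominated": {"fishing_closure": "recovering_reef"},
    "damaged_reef": {"restoration": "recovering_reef"},
    "recovering_reef": {"time": "healthy_reef",
                        "overfishing": "algae_dominated"},
}

def apply_drivers(state, drivers):
    """Follow each driver in turn; unknown drivers leave the state unchanged."""
    for d in drivers:
        state = STM.get(state, {}).get(d, state)
    return state
```

In this framing, a management intervention such as a fishing closure appears simply as another driver, one chosen to steer the system back toward the desired state.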


Telescopic systems of structural members with clearance are found in many applications, e.g., mobile cranes, rack feeders, forklifts, and stacker cranes (see Figure 1). When these machines are operated, undesirable vibrations may reduce performance and raise safety concerns. This contribution therefore aims to reduce these harmful vibrations. For a better understanding, the dynamic behaviour of these constructions is analysed. The main interest is the overlapping area between each pair of sections of the systems described above (see markings in Figure 1), which is investigated by measurements and by computations. A test rig is constructed to determine the dynamic behaviour by measuring fundamental vibrations and higher-frequency oscillations, damping coefficients, special phenomena, and more. For an appropriate physical model, the governing boundary value problem is derived by applying Hamilton’s principle, and a classical discretisation procedure is used to generate a coupled system of nonlinear ordinary differential equations as the corresponding truncated mathematical model. On the basis of this model, a controller concept for preventing harmful vibrations is developed.
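A truncated model of this kind reduces, mode by mode, to equations of the form m·q″ + d·q′ + k·q = f(t). As a minimal sketch, the following Python fragment integrates a single damped modal equation with assumed coefficients (not the paper's measured values) and shows the decay of the free vibration.

```python
# Minimal sketch: one modal equation of a truncated model,
# M*q'' + D*q' + K*q = 0, with assumed (not measured) coefficients,
# integrated by the semi-implicit Euler scheme.
M, D, K = 1.0, 0.4, 25.0  # modal mass, damping, stiffness (assumptions)

def free_vibration(q0=1.0, v0=0.0, dt=1e-3, t_end=20.0):
    """Return the displacement after t_end seconds of free vibration."""
    q, v = q0, v0
    for _ in range(int(t_end / dt)):
        a = (-D * v - K * q) / M   # acceleration from the ODE
        v += a * dt                # update velocity first (semi-implicit)
        q += v * dt
    return q
```

With positive damping the free vibration decays toward zero, which is the behaviour a vibration controller aims to enforce for the forced system as well.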


Mixed Reality (MR) aims to link virtual entities with the real world and has many applications, for example in the military and medical domains [JBL+00, NFB07]. In many MR systems, and more precisely in augmented scenes, the application needs to render the virtual part accurately at the right time. To achieve this, such systems acquire data related to the real world from a set of sensors before rendering virtual entities. A suitable system architecture should minimize the delays to keep the overall system delay (also called end-to-end latency) within the requirements for real-time performance. In this context, we propose a compositional modeling framework for MR software architectures in order to specify, simulate and formally validate the time constraints of such systems. Our approach is based first on a functional decomposition of such systems into generic components. The obtained elements, as well as their typical interactions, give rise to generic representations in terms of timed automata. A whole system is then obtained as a composition of such defined components. To write specifications, a textual language named MIRELA (MIxed REality LAnguage) is proposed, along with the corresponding compilation tools. The generated output contains timed automata in UPPAAL format for simulation and verification of time constraints. These automata may also be used to generate source code skeletons for an implementation on an MR platform. The approach is illustrated first on a small example. A realistic case study is also developed; it is modeled by several timed automata synchronizing through channels and including a large number of time constraints. Both systems have been simulated in UPPAAL and checked against the required behavioral properties.
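A simplified flavour of the end-to-end latency check that such verification performs can be sketched with plain interval arithmetic over a sequential sensor → processing → rendering pipeline; the component delay bounds below are illustrative assumptions, not values from the case study.

```python
# Each component contributes a [min, max] delay in milliseconds
# (illustrative numbers only).
pipeline = [("sensor", 5, 15), ("processing", 10, 30), ("rendering", 8, 16)]

def end_to_end(components):
    """Best- and worst-case latency of a sequential composition."""
    lo = sum(c[1] for c in components)
    hi = sum(c[2] for c in components)
    return lo, hi

def meets_deadline(components, deadline_ms):
    """True iff even the worst-case latency stays within the deadline."""
    return end_to_end(components)[1] <= deadline_ms
```

Timed automata generalise this: they also capture synchronisation, branching and periodic behaviour, which is why a model checker such as UPPAAL is needed for the full systems.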


A detailed characterization of air quality in the megacity of Paris (France) during two 1-month intensive campaigns and from additional 1-year observations revealed that about 70% of the urban background fine particulate matter (PM) is, on average, transported into the megacity from upwind regions. This dominant influence of regional sources was confirmed by in situ measurements during short intensive and longer-term campaigns, aerosol optical depth (AOD) measurements from ENVISAT, and modeling results from the PMCAMx and CHIMERE chemistry transport models. While advection of sulfate is well documented for other megacities, there was a surprisingly high contribution from long-range transport for both nitrate and organic aerosol. The origin of organic PM was investigated by comprehensive analysis of aerosol mass spectrometer (AMS), radiocarbon and tracer measurements during two intensive campaigns. Primary fossil fuel combustion emissions constituted less than 20% in winter and 40% in summer of carbonaceous fine PM, unexpectedly small shares for a megacity. Cooking activities and, during winter, residential wood burning are the major primary organic PM sources. This analysis suggests that the major part of secondary organic aerosol is of modern origin, i.e., from biogenic precursors and from wood burning. Black carbon concentrations are at the lower end of values encountered in megacities worldwide, but still represent an issue for air quality. These comparatively low air pollution levels are due to a combination of low emissions per inhabitant, flat terrain, and a meteorology that is in general not conducive to local pollution build-up. This revised picture of a megacity being only partially responsible for its own average and peak PM levels has important implications for air pollution regulation policies.


The goal of this work has been to calibrate the sensitivities and fragmentation patterns of various molecules, as well as to further characterize the lab model of the ROSINA Double Focusing Mass Spectrometer (DFMS) on board ESA’s Rosetta spacecraft, bound for comet 67P/Churyumov-Gerasimenko. The detailed calibration and characterization of the instrument is key to understanding and interpreting the results obtained in the coma of the comet. A static calibration was performed for the following species: Ne, Ar, Kr, Xe, H2O, N2, CO2, CH4, C2H6, C3H8, C4H10, and C2H4. The purpose of the calibration was to obtain sensitivities for all detectors and emission settings, to determine the fragmentation behavior of the ion source, and to demonstrate the capability to measure isotopic ratios at the comet. The calibration included the recording of different correction factors used to evaluate the data, including a detailed investigation of the detector gain. The quality of the calibration could be tested for different gas mixtures, including the calibration of the density inside the ion source when calibration gas from the gas calibration unit is introduced. In conclusion, the calibration shows that DFMS meets the design requirements and that DFMS will be able to measure the D/H ratio at the comet and help shed more light on the puzzle of the origin of water on Earth.
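A mass-spectrometer sensitivity is commonly expressed as ion current normalised by neutral number density and emission current, and an isotope ratio as a signal ratio corrected by a relative sensitivity factor. The sketch below shows only this generic arithmetic; all numbers are invented placeholders, not DFMS calibration results.

```python
# Illustrative calibration arithmetic with placeholder numbers.
def sensitivity(ion_current_a, density_cm3, emission_current_a):
    """Sensitivity in cm^3: ion current per neutral density per emission."""
    return ion_current_a / (density_cm3 * emission_current_a)

def isotope_ratio(signal_minor, signal_major, rel_sensitivity=1.0):
    """Raw signal ratio corrected by a relative sensitivity factor."""
    return (signal_minor / signal_major) / rel_sensitivity
```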


We present a novel framework for encoding latency analysis of arbitrary multiview video coding prediction structures. This framework avoids the need to consider a specific encoder architecture for encoding latency analysis by assuming unlimited processing capacity in the multiview encoder. Under this assumption, only the influence of the prediction structure and the processing times has to be considered, and the encoding latency is computed systematically by means of a graph model. The results obtained with this model are valid for a multiview encoder with sufficient processing capacity and serve as a lower bound otherwise. Furthermore, with the objective of low-latency encoder design with a low penalty on rate-distortion performance, the graph model allows us to identify the prediction relationships that add the most encoding latency to the encoder. Experimental results for JMVM prediction structures illustrate how low-latency prediction structures with a low rate-distortion penalty can be derived in a systematic manner using the new model.
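Under the unlimited-processing-capacity assumption, the encoding latency of a frame reduces to the longest dependency chain in the prediction graph, weighted by per-frame processing times. A minimal Python sketch with an invented prediction structure and invented processing times:

```python
# Invented prediction structure: frame -> frames it predicts from,
# plus assumed per-frame processing times (arbitrary units).
deps = {"I0": [], "P1": ["I0"], "B2": ["I0", "P1"], "P3": ["P1"]}
proc_time = {"I0": 4, "P1": 3, "B2": 2, "P3": 3}

def latency(frame):
    """Longest chain of processing times ending at `frame` (unlimited
    parallelism: each frame starts as soon as its references are done)."""
    best = max((latency(d) for d in deps[frame]), default=0)
    return best + proc_time[frame]
```

Edges lying on the longest path are the prediction relationships that dominate latency; removing or reordering them is what yields the low-latency structures the framework searches for.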


BioMet®Tools is a set of software applications developed for the biometrical characterization of voice in different fields, such as voice quality evaluation in laryngology, speech therapy and rehabilitation, education of the singing voice, forensic voice analysis in court, emotion detection in voice, secure access to facilities and services, etc. Initially it was conceived as plain research code to estimate the glottal source from voice and obtain the biomechanical parameters of the vocal folds from the spectral density of the estimate. This code grew into what is now the Glottex®Engine package (G®E). Further demands from users in the medical and forensic fields prompted the development of different Graphic User Interfaces (GUIs) to encapsulate user interaction with the G®E. This required the customized design of different GUIs handling the same G®E; in this way, development costs and time could be saved. The development model is described in detail, leading to commercial production and distribution. Case studies from its application to laryngology and speech therapy are given and discussed.


The deployment of nodes in Wireless Sensor Networks (WSNs) is one of the biggest challenges in this field, as it involves distributing a large number of embedded systems to fulfill a specific application. The connectivity of WSNs is difficult to estimate due to the irregularity of the physical environment, and it affects WSN designers' decisions on deploying sensor nodes. Therefore, in this paper, a new method is proposed to enhance the efficiency and accuracy of ZigBee propagation simulation in indoor environments. The method consists of two steps: automatic 3D indoor reconstruction and 3D ray-tracing-based radio simulation. The automatic 3D indoor reconstruction employs an unattended image classification algorithm and an image vectorization algorithm to build the environment database accurately, which also significantly reduces the time and effort spent on non-radio propagation issues. The 3D ray tracing is developed using a kd-tree space division algorithm and a modified polar sweep algorithm, which accelerates the searching of rays over the entire space. A signal propagation model is proposed for the ray tracing engine that considers both the materials of obstacles and the impact of their positions along the radio ray path. Three different WSN deployments are realized in the indoor environment of an office, and the results are verified to be accurate. Experimental results also indicate that the proposed method is efficient in its pre-simulation strategy and 3D ray-searching scheme and is suitable for different indoor environments.
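The per-ray attenuation a ray tracer evaluates can be sketched, under common assumptions, as a log-distance path loss plus a per-obstacle attenuation term; the reference loss, exponent and wall loss below are illustrative defaults, not the calibrated, material-dependent values of the paper.

```python
import math

# Assumed log-distance model with a per-wall attenuation term
# (illustrative parameters only).
def path_loss_db(distance_m, walls, pl0_db=40.0, n=2.0, wall_loss_db=5.0):
    """Loss in dB along one ray that crosses `walls` obstacles."""
    return pl0_db + 10.0 * n * math.log10(distance_m) + walls * wall_loss_db
```

In the full method, the materials and positions of the intersected obstacles would set the per-obstacle term individually rather than a single flat wall loss.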


In this paper, some mathematical programming models are presented for setting the number of services on a specified system of bus lines intended to serve the high demand levels that may arise because of the disruption of Rapid Transit services or during the celebration of massive events. By means of these models, two basic magnitudes can be determined: a) the number of bus units assigned to each line, and b) the number of services that should be assigned to those units. In these models, passenger flow assignment to lines can be considered of the system-optimum type, in the sense that the assignment of units and of services is carried out by minimizing a linear combination of operating costs and the total travel time of users. The models consider the delays experienced by buses as a consequence of the boarding and alighting of passengers and of queueing at stations, as well as the delays that passengers experience waiting at the stations. For the case of a congested, strategy-based, user-optimal passenger assignment model with strict capacities on the bus lines, the use of the method of successive averages is shown.
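The method of successive averages can be illustrated on a toy two-line assignment with linear, flow-dependent costs (coefficients invented for the example): each iteration loads all demand onto the currently cheaper line, then averages that all-or-nothing assignment into the running flows with a diminishing step.

```python
# Toy two-line MSA: invented cost coefficients, diminishing step 1/(k+1).
def msa(demand=100.0, iters=200):
    flow = [demand, 0.0]                      # initial all-or-nothing load
    cost = lambda f: (10 + 0.1 * f[0], 12 + 0.05 * f[1])
    for k in range(1, iters + 1):
        c = cost(flow)
        target = [demand, 0.0] if c[0] <= c[1] else [0.0, demand]
        step = 1.0 / (k + 1)
        flow = [(1 - step) * x + step * y for x, y in zip(flow, target)]
    return flow
```

For these coefficients the flows approach the equilibrium split (about 46.7 versus 53.3 units), at which both lines have equal cost; the capacitated, strategy-based model of the paper adds strict line capacities on top of this basic scheme.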


Knowledge acquisition and model maintenance are key problems in knowledge engineering for improving productivity in the development of intelligent systems. Although a number of technical solutions have historically been proposed in this area, recent experience shows that there is still an important gap between the way end-users describe their expertise and the way intelligent systems represent knowledge. In this paper we propose an original way to cope with this problem based on electronic documents. We propose the concept of an intelligent document processor: a tool that allows the end-user to read and write a document explaining how an intelligent system operates in such a way that, if the user changes the content of the document, the intelligent system reacts to those changes. The paper presents the structure of such a document, based on knowledge categories derived from modern knowledge modeling methodologies, together with a number of requirements for it to be understandable by end-users and problem solvers.


Background: In recent years, Spain has implemented a number of air quality control measures that are expected to lead to a future reduction in fine particle concentrations and an ensuing positive impact on public health. Objectives: We aimed to assess the impact on mortality attributable to a reduction in fine particle levels in Spain in 2014 relative to the estimated level for 2007. Methods: To estimate exposure, we constructed fine particle distribution models for Spain for 2007 (reference scenario) and 2014 (projected scenario) with a spatial resolution of 16×16 km². In a second step, we used the concentration-response functions proposed by cohort studies carried out in Europe (European Study of Cohorts for Air Pollution Effects and the Rome longitudinal cohort) and North America (American Cancer Society cohort, Harvard Six Cities study and Canadian national cohort) to calculate the number of attributable annual deaths corresponding to all causes, all non-accidental causes, ischemic heart disease and lung cancer among persons aged over 25 years (2005-2007 mortality rate data). We examined the effect of the Spanish demographic shift in our analysis using 2007 and 2012 population figures. Results: Our model suggested a mean overall reduction in fine particle levels of 1 µg/m³ by 2014. Taking into account 2007 population data, between 8 and 15 all-cause deaths per 100,000 population could be postponed annually by the expected reduction in fine particle levels. For specific subgroups, estimates varied from 10 to 30 deaths for all non-accidental causes, from 1 to 5 for lung cancer, and from 2 to 6 for ischemic heart disease. The expected burden of preventable mortality would be even higher in the future due to Spanish population growth. Taking into account the population older than 30 years in 2012, the absolute mortality impact estimate would increase by approximately 18%.
Conclusions: Effective implementation of air quality measures in Spain, in a scenario with a short-term projection, would amount to an appreciable decline in fine particle concentrations, and this, in turn, would lead to notable health-related benefits. Recent European cohort studies strengthen the evidence of an association between long-term exposure to fine particles and health effects, and could enhance health impact quantification in Europe. Air quality models can contribute to improved assessment of air pollution health impact estimates, particularly in study areas without air pollution monitoring data.
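The attributable-mortality arithmetic behind such estimates can be sketched with a standard log-linear concentration-response function; the relative risk and baseline figures below are illustrative placeholders, not the study's values.

```python
import math

# Standard log-linear concentration-response calculation
# (illustrative relative risk and baseline deaths).
def attributable_deaths(delta_pm, rr_per_10, baseline_deaths):
    """Deaths attributable to a delta_pm (ug/m3) change in fine PM."""
    beta = math.log(rr_per_10) / 10.0        # CRF slope per ug/m3
    af = 1.0 - math.exp(-beta * delta_pm)    # attributable fraction
    return baseline_deaths * af
```

Each cohort study supplies its own relative risk, which is how the study obtains a range of estimates rather than a single figure.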