885 results for Multiple-model filter
Abstract:
A new modeling approach, multiple mapping conditioning (MMC), is introduced to treat mixing and reaction in turbulent flows. The model combines the advantages of the probability density function and the conditional moment closure methods and is based on a generalization of the mapping closure concept. An equivalent stochastic formulation of the MMC model is given. The validity of the closure hypothesis of the model is demonstrated by comparison with direct numerical simulation results for the three-stream mixing problem. (C) 2003 American Institute of Physics.
Abstract:
The use of a fitted parameter watershed model to address water quantity and quality management issues requires that it be calibrated under a wide range of hydrologic conditions. However, rarely does model calibration result in a unique parameter set. Parameter nonuniqueness can lead to predictive nonuniqueness. The extent of model predictive uncertainty should be investigated if management decisions are to be based on model projections. Using models built for four neighboring watersheds in the Neuse River Basin of North Carolina, the application of the automated parameter optimization software PEST in conjunction with the Hydrologic Simulation Program Fortran (HSPF) is demonstrated. Parameter nonuniqueness is illustrated, and a method is presented for calculating many different sets of parameters, all of which acceptably calibrate a watershed model. A regularization methodology is discussed in which models for similar watersheds can be calibrated simultaneously. Using this method, parameter differences between watershed models can be minimized while maintaining fit between model outputs and field observations. In recognition of the fact that parameter nonuniqueness and predictive uncertainty are inherent to the modeling process, PEST's nonlinear predictive analysis functionality is then used to explore the extent of model predictive uncertainty.
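The regularization idea described in the abstract above can be sketched in a few lines. This is a hedged toy illustration (not PEST itself, and not HSPF): two parameters enter the model only through their sum, so infinitely many parameter sets calibrate equally well, and a Tikhonov-style penalty row is what selects a unique set.

```python
import numpy as np

# Hypothetical illustration of parameter nonuniqueness: parameters a and b
# enter the model only through their sum, y = (a + b) * x, so any pair with
# the same sum fits the observations equally well.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 3.0 * x  # "observations": any a + b = 3 calibrates perfectly

# Plain least squares: the design matrix [x, x] is rank-deficient.
A = np.column_stack([x, x])

# Tikhonov-style regularization (the idea behind regularized inversion as
# used in PEST): append a row penalizing the difference a - b, which selects
# one parameter set out of the infinitely many acceptable calibrations.
w = 1e-3  # regularization weight (assumed for this sketch)
A_reg = np.vstack([A, w * np.array([1.0, -1.0])])
y_reg = np.append(y, 0.0)
a, b = np.linalg.lstsq(A_reg, y_reg, rcond=None)[0]
print(round(a, 3), round(b, 3))  # a ~= b ~= 1.5, and a + b ~= 3
```

The penalty term plays the same role as the paper's simultaneous-calibration regularization: it minimizes parameter differences while maintaining the fit to observations.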
Abstract:
The radial undistortion model proposed by Fitzgibbon and the radial fundamental matrix were early steps toward extending classical epipolar geometry to distorted cameras. Later, minimal solvers were proposed to find relative pose and radial distortion, given point correspondences between images. However, a big drawback of all these approaches is that they require the distortion center to be exactly known. In this paper we show how the distortion center can be absorbed into a new radial fundamental matrix. This new formulation is much more practical in reality, as it also allows digital zoom, cropped images, and camera-lens systems where the distortion center does not exactly coincide with the image center. In particular, we start from the setting where only one of the two images contains radial distortion, analyze the structure of the particular radial fundamental matrix, and show that the technique also generalizes to other linear multi-view relationships like the trifocal tensor and homography. For the new radial fundamental matrix we propose different estimation algorithms from 9, 10, and 11 points. We show how to extract the epipoles and prove the practical applicability on several epipolar geometry image pairs with strong distortion that, to the best of our knowledge, no other existing algorithm can handle properly.
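For context, a minimal sketch of Fitzgibbon's one-parameter division model, which the radial fundamental matrix builds on: a distorted point, expressed relative to the distortion center, is "lifted" to the homogeneous point (xd, yd, 1 + lam*r^2), and normalizing the lifted point undistorts it. The distortion center and lambda value below are assumed purely for illustration.

```python
import numpy as np

def undistort_division(xd, yd, lam, center=(0.0, 0.0)):
    """Undistort an image point under the division model.

    The distortion center and lam are assumed known here; the paper's point
    is precisely how to avoid assuming the center.
    """
    x, y = xd - center[0], yd - center[1]
    r2 = x * x + y * y
    w = 1.0 + lam * r2          # third coordinate of the lifted point
    return center[0] + x / w, center[1] + y / w

xu, yu = undistort_division(100.0, 0.0, lam=-1e-5)
print(xu, yu)  # barrel distortion (lam < 0): undistortion pushes the point outward
```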
Abstract:
This study aimed to evaluate the efficiency of multiple centroids for studying the adaptability of alfalfa genotypes (Medicago sativa L.). In this method, the genotypes are compared with ideotypes defined by the bi-segmented regression model, according to the researcher's interest. Thus, genotype classification is carried out as determined by the objective of the researcher and the proposed recommendation strategy. Despite the great potential of the method, it needs to be evaluated in a biological context (with real data). In this context, we used data on the evaluation of dry matter production of 92 alfalfa cultivars, with 20 cuttings, from an experiment in randomized blocks with two replicates carried out from November 2004 to June 2006. The multiple centroid method proved efficient for classifying alfalfa genotypes. Moreover, it produced no ambiguous indications, provided that the ideotypes were defined according to the researcher's interest, facilitating data interpretation.
Abstract:
This paper presents a distributed model predictive control (DMPC) scheme for indoor thermal comfort that simultaneously optimizes the consumption of a limited shared energy resource. The control objective of each subsystem is to minimize the heating/cooling energy cost while keeping the indoor temperature and the power used inside bounds. In a distributed coordinated environment, the control uses multiple dynamically decoupled agents (one for each subsystem/house) aiming to satisfy the coupling constraints. According to its hourly power demand profile, each house assigns a priority level that indicates how much it is willing to bid in the auction to consume the limited clean resource. This procedure allows the bidding value to vary hourly; consequently, the order in which the agents access the clean energy also varies. In addition to the power constraints, all houses have thermal comfort constraints that must be fulfilled. The system is simulated with several houses in a distributed environment.
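The auction step described above can be sketched as a simple priority allocation. This is a hedged sketch with invented house names, bids, and demands, not the paper's actual coordination protocol: houses are served from the limited clean resource in descending bid order, and unmet demand falls back to the unconstrained supply.

```python
def allocate_clean_energy(bids, demands, capacity):
    """Allocate a limited clean resource by descending bid priority.

    bids/demands: dicts keyed by house id; capacity: clean power available.
    All names and values here are assumptions for illustration.
    """
    allocation = {h: 0.0 for h in demands}
    remaining = capacity
    for house in sorted(bids, key=bids.get, reverse=True):
        granted = min(demands[house], remaining)  # serve as much as is left
        allocation[house] = granted
        remaining -= granted
    return allocation

alloc = allocate_clean_energy(
    bids={"h1": 0.9, "h2": 0.4, "h3": 0.7},
    demands={"h1": 3.0, "h2": 2.0, "h3": 2.5},
    capacity=4.0,
)
print(alloc)  # h1 fully served, h3 partially, h2 not at all
```

Because the bids change hourly, re-running the allocation each hour changes the access order, which is the behavior the abstract describes.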
Abstract:
Amorphous SiC tandem heterostructures are used to filter a specific band in the visible range. Experimental and simulated results are compared to validate the use of SiC multilayered structures in applications where gain compensation is needed or where unwanted wavelengths must be attenuated. Spectral response data acquired under different frequencies, optical wavelength control, and side irradiations are analyzed. Transfer function characteristics are discussed. Color pulsed communication channels are transmitted together and the output signal is analyzed under different background conditions. Results show that under controlled wavelength backgrounds, the device sensitivity is enhanced in a precise wavelength range and quenched in the others, tuning or suppressing a specific band. Depending on the background wavelength and irradiation side, the device acts either as a long-pass, a short-pass, or a band-rejection filter. An optoelectronic model supports the experimental results and gives insight into the physics of the device.
Abstract:
The application of a-SiC:H/a-Si:H pinpin photodiodes as WDM demultiplexer devices has been demonstrated to be useful in optical communications that use the WDM technique to encode multiple signals in the visible light range. This is required in short-range optical communication applications, where for cost reasons the link is provided by plastic optical fibers. Characterization of these devices has shown the presence of large photocapacitive effects. By superimposing background illumination on the pulsed channel, the device behaves as a filter, producing signal attenuation, or as an amplifier, producing signal gain, depending on the channel/background wavelength combination. We present here results, obtained by numerical simulations, on the internal electric configuration of the a-SiC:H/a-Si:H pinpin photodiode. These results attribute the frequency-domain behavior of the device to a wavelength-tunable photocapacitance caused by the accumulation of space charge localized at the bottom diode, which, according to the Shockley-Read-Hall model, is mainly due to defect trapping. Experimental results on the measurement of the photodiode capacitance under different conditions of illumination and applied bias are also presented. The combination of these analyses permits the description of a wavelength-controlled photocapacitance that, combined with the series and parallel resistance of the diodes, may result in the explicit definition of cut-off frequencies for capacitive frequency filters activated by the light background, or in an oscillatory resonance of photogenerated carriers between the two diodes. (C) 2013 Elsevier B.V. All rights reserved.
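The cut-off frequencies mentioned above follow from the standard first-order RC relation f_c = 1/(2*pi*R*C). The resistance and capacitance values below are assumed purely to illustrate how a background-tuned photocapacitance shifts the cut-off; they are not measurements from the paper.

```python
import math

def cutoff_hz(r_ohm, c_farad):
    """First-order RC cut-off frequency, f_c = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

r = 10e3  # assumed series resistance, 10 kOhm
# A background that increases the photocapacitance lowers the cut-off,
# which is the filtering mechanism the abstract describes.
for label, c in [("dark", 1e-9), ("with background", 4e-9)]:
    print(label, round(cutoff_hz(r, c)), "Hz")
```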
Abstract:
A key aspect of decision-making in a disaster response scenario is the capability to evaluate multiple, simultaneously perceived goals. Current competing approaches to building decision-making agents are either mental-state based, such as BDI, or founded on decision-theoretic models, such as MDPs. A BDI agent chooses heuristically among several goals, whereas an MDP agent searches for a policy to achieve a specific goal. In this paper we develop a preferences model to decide among multiple simultaneous goals. We propose a pattern, which follows a decision-theoretic approach, to evaluate the expected causal effects of the observable and non-observable aspects that inform each decision. We focus on yes-or-no decisions (i.e., pursue or ignore a goal) and illustrate the proposal using the RoboCupRescue simulation environment.
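The yes-or-no pattern can be sketched as an expected-utility comparison. This is a hedged toy version, not the paper's actual preferences model: for each goal, the agent weighs the expected utility of pursuing versus ignoring it over the possible (partially unobservable) world states, with all probabilities and utilities below invented for illustration.

```python
def decide(p_states, u_pursue, u_ignore):
    """Pursue a goal iff its expected utility beats ignoring it.

    p_states: P(state); u_pursue/u_ignore: utility of each action per state.
    """
    eu_pursue = sum(p_states[s] * u_pursue[s] for s in p_states)
    eu_ignore = sum(p_states[s] * u_ignore[s] for s in p_states)
    return "pursue" if eu_pursue > eu_ignore else "ignore"

# Example: rescue a victim whose survival is only partially observable.
p = {"alive": 0.6, "dead": 0.4}
choice = decide(
    p,
    u_pursue={"alive": 10.0, "dead": -2.0},  # rescue succeeds / wasted effort
    u_ignore={"alive": -5.0, "dead": 0.0},   # victim lost / nothing lost
)
print(choice)
```

Running the same comparison per perceived goal yields the pursue/ignore decisions the abstract describes.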
Abstract:
Proceedings of International Conference SPIE 7477, Image and Signal Processing for Remote Sensing XV, 28 September 2009
Abstract:
Over the last three decades, computer architects achieved increases in single-processor performance by, e.g., increasing clock speed, introducing cache memories, and using instruction-level parallelism. However, because of power consumption and heat dissipation constraints, this trend is coming to an end. In recent times, hardware engineers have instead moved to new chip architectures with multiple processor cores on a single chip. With multi-core processors, applications can complete more total work than with one core alone. To take advantage of multi-core processors, parallel programming models have been proposed as promising solutions for using them more effectively. This paper discusses some of the existing models and frameworks for parallel programming and outlines a draft parallel programming model for Ada.
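A minimal sketch of the kind of pattern such parallel programming models expose is the data-parallel "map": split the work into independent chunks and let a pool of workers process them. (A thread pool keeps this sketch portable; for CPU-bound Python code a process pool would be needed to actually use multiple cores, and the paper's own target language is Ada.)

```python
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    """Work on one independent chunk: sum of squares."""
    return sum(x * x for x in chunk)

data = list(range(1000))
# Split into 4 independent chunks, one per worker.
chunks = [data[i:i + 250] for i in range(0, len(data), 250)]
with ThreadPoolExecutor(max_workers=4) as pool:
    total = sum(pool.map(chunk_sum, chunks))
print(total)  # same result as the sequential sum of squares
```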
Abstract:
In global scientific experiments with collaborative scenarios involving multinational teams, there are big challenges related to data access: data movement to other regions or Clouds is precluded by constraints on latency costs, data privacy, and data ownership. Furthermore, each site processes local data sets using specialized algorithms, producing intermediate results that are helpful as inputs to applications running on remote sites. This paper shows how to model such collaborative scenarios as a scientific workflow implemented with AWARD (Autonomic Workflow Activities Reconfigurable and Dynamic), a decentralized framework offering a feasible solution to run the workflow activities on distributed data centers in different regions without the need for large data movements. The AWARD workflow activities are independently monitored, dynamically reconfigured, and steered by different users, namely by hot-swapping the algorithms to enhance the computation results or by changing the workflow structure to support feedback dependencies, where an activity receives feedback output from a successor activity. A real implementation of one practical scenario and its execution on multiple data centers of the Amazon Cloud is presented, including experimental results with steering by multiple users.
Abstract:
We investigate the structural and thermodynamic properties of a model of particles with 2 patches of type A and 10 patches of type B. Particles are placed on the sites of a face-centered cubic lattice with the patches oriented along the nearest-neighbor directions. The competition between the self-assembly of chains, rings, and networks on the phase diagram is investigated systematically for this class of models, using an extension of Wertheim's theory for associating fluids and Monte Carlo numerical simulations. We varied the ratio r = ε_AB/ε_AA of the interaction between patches A and B, ε_AB, and between A patches, ε_AA (ε_BB is set to zero), as well as the relative position of the A patches, i.e., the angle θ between the (lattice) directions of the A patches. We found that both r and θ (60°, 90°, or 120°) have a profound effect on the phase diagram. In the empty fluid regime (r < 1/2) the phase diagram is reentrant with a closed miscibility loop. The region around the lower critical point exhibits unusual structural and thermodynamic behavior determined by the presence of relatively short rings. The agreement between the results of theory and simulation is excellent for θ = 120° but deteriorates as θ decreases, revealing the need for new theoretical approaches to describe the structure and thermodynamics of systems dominated by small rings. (C) 2014 AIP Publishing LLC.
Abstract:
In an attempt to be as close as possible to the infected and treated patients of the endemic areas of schistosomiasis (S. mansoni), and in order to achieve a long period of follow-up, mice were repeatedly infected with a low number of cercariae. Survival data and histological variables such as schistosomal granuloma, portal changes, hepatocellular necrosis, hepatocellular regeneration, schistosomotic pigment, periductal fibrosis, and chiefly bile duct changes were analysed in the infected treated and non-treated mice. Oxamniquine chemotherapy in repeatedly infected mice prolonged survival significantly when compared to non-treated animals (chi-square 9.24, p = 0.0024), thus confirming previous results with a similar experimental model but with a shorter-term follow-up. Furthermore, mortality decreased rapidly after treatment, suggesting an abrupt reduction in the severity of hepatic lesions. A morphological and immunohistochemical study of the liver was carried out. Portal fibrosis, with a pattern resembling human Symmers fibrosis, was present at a late phase in the infected animals. Bile duct lesions were quite close to those described in human Mansonian schistosomiasis. Schistosomal antigen was observed in one isolated altered bile duct cell. The pathogenesis of the bile duct changes and their relation to the parasite infection and/or its antigens are discussed.
Abstract:
The ecotoxicological response of the living organisms in an aquatic system depends on physical, chemical, and bacteriological variables, as well as the interactions between them. An important challenge to scientists is to understand the interaction and behaviour of the factors involved in a multidimensional process such as the ecotoxicological response. With this aim, multiple linear regression (MLR) and principal component regression were applied to the ecotoxicity bioassay response of Chlorella vulgaris and Vibrio fischeri in water collected at seven sites of the Leça River during five monitoring campaigns (February, May, June, August, and September of 2006). The river water characterization included the analysis of 22 physicochemical and 3 microbiological parameters. The model that best fitted the data was MLR, which shows: (i) for the C. vulgaris toxic response, a negative correlation with dissolved organic carbon, zinc, and manganese, and a positive one with turbidity and arsenic; (ii) for the V. fischeri toxic response, a negative correlation with conductivity and turbidity and a positive one with phosphorus, hardness, iron, mercury, arsenic, and faecal coliforms. This integrated assessment may allow the evaluation of the effect of future pollution abatement measures on the water quality of the Leça River.
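The MLR approach used in the study can be sketched in a few lines: the toxic response is modeled as a linear combination of water-quality parameters, fitted by ordinary least squares. The tiny data set below is synthetic and invented for illustration; it is not the Leça River data, and only the signs of the coefficients mimic the reported correlations.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))             # e.g. DOC, turbidity, arsenic (assumed)
true_coef = np.array([-1.5, 0.8, 0.6])   # signs mimic the reported correlations
y = X @ true_coef + 2.0                  # noiseless synthetic "toxic response"

# Fit intercept + coefficients by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta, 3))  # [intercept, b1, b2, b3] ~= [2.0, -1.5, 0.8, 0.6]
```

The signs of the fitted coefficients are what carry the interpretation in the abstract: a negative coefficient means that parameter reduces the measured toxic response, a positive one increases it.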
Abstract:
The prediction of the time and efficiency of the remediation of contaminated soils using soil vapor extraction remains a difficult challenge to the scientific community and to consultants. This work reports the development of multiple linear regression and artificial neural network models to predict the remediation time and efficiency of soil vapor extractions performed in soils contaminated separately with benzene, toluene, ethylbenzene, xylene, trichloroethylene, and perchloroethylene. The results demonstrated that the artificial neural network approach performs better than the multiple linear regression models. The artificial neural network model allowed an accurate prediction of remediation time and efficiency based only on soil and pollutant characteristics, consequently allowing a simple and quick preliminary evaluation of the viability of the process.
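One plausible reason a neural network outperforms MLR here is that remediation time and efficiency depend nonlinearly on soil and pollutant properties. This toy sketch (synthetic data, not the paper's) shows a linear model failing on a nonlinear response while a nonlinear regressor, here a simple quadratic feature expansion standing in for the ANN, fits it exactly.

```python
import numpy as np

x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 3.0 * (x - 0.5) ** 2   # nonlinear "remediation time" (invented)

def rmse(design):
    """Least-squares fit of y on the given design matrix; return RMSE."""
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return float(np.sqrt(np.mean((design @ coef - y) ** 2)))

linear = np.column_stack([np.ones_like(x), x])            # MLR: intercept + x
nonlinear = np.column_stack([np.ones_like(x), x, x ** 2]) # stand-in for the ANN
print(rmse(linear) > 1e-6, rmse(nonlinear) < 1e-9)  # True True
```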