831 results for Tessellation-based model
Abstract:
Object recognition has long been a core problem in computer vision. To improve object spatial support and speed up object localization for object recognition, generating high-quality, category-independent object proposals as the input to object recognition systems has drawn attention recently. Given an image, we generate a limited number of high-quality, category-independent object proposals in advance, which can then serve as inputs for many computer vision tasks. We also present an efficient dictionary-based model for the image classification task, and further extend this work to a discriminative dictionary learning method for tensor sparse coding. In the first part, a multi-scale greedy object proposal generation approach is presented. Based on the multi-scale nature of objects in images, our approach is built on top of a hierarchical segmentation. We first identify the representative and diverse exemplar clusters within each scale. Object proposals are then obtained by selecting a subset from the multi-scale segment pool via maximizing a submodular objective function, which consists of a weighted coverage term, a single-scale diversity term and a multi-scale reward term. The weighted coverage term forces the selected set of object proposals to be representative and compact; the single-scale diversity term encourages choosing segments from different exemplar clusters so that they cover as many object patterns as possible; the multi-scale reward term encourages the selected proposals to be discriminative and drawn from multiple layers of the hierarchical image segmentation. Experimental results on the Berkeley Segmentation Dataset and the PASCAL VOC2012 segmentation dataset demonstrate the accuracy and efficiency of our object proposal model. Additionally, we validate our object proposals on simultaneous segmentation and detection, where they outperform the state of the art. To classify the object in the image, we design a discriminative, structural low-rank framework for image classification. We use a supervised learning method to construct a discriminative and reconstructive dictionary. By introducing an ideal regularization term, we perform low-rank matrix recovery for contaminated training data from all categories simultaneously without losing structural information. A discriminative low-rank representation of images with respect to the constructed dictionary is obtained. With semantic structure information and strong identification capability, this representation performs well in classification tasks even with a simple linear multi-class classifier.
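As a rough illustration of the selection step described above, the sketch below greedily maximizes a submodular-style score combining coverage, single-scale diversity and multi-scale reward terms. The segment attributes, weights and scoring details are illustrative assumptions, not the authors' actual formulation.

```python
# Illustrative sketch (not the authors' code): greedy selection of object
# proposals by maximizing a submodular-style score built from a weighted
# coverage term, a single-scale diversity term and a multi-scale reward term.
# Segment attributes, weights and scoring details are assumptions.

def greedy_proposals(segments, budget, w_cov=1.0, w_div=1.0, w_rew=1.0):
    """Each segment is a dict with 'area', 'cluster', 'scale', 'score'."""
    selected, covered_clusters, covered_scales = [], set(), set()

    def gain(seg):
        cov = w_cov * seg['area']                                       # weighted coverage
        div = w_div * (seg['cluster'] not in covered_clusters)          # single-scale diversity
        rew = w_rew * seg['score'] * (seg['scale'] not in covered_scales)  # multi-scale reward
        return cov + div + rew

    candidates = list(segments)
    while candidates and len(selected) < budget:
        best = max(candidates, key=gain)    # pick the segment with the largest marginal gain
        selected.append(best)
        covered_clusters.add(best['cluster'])
        covered_scales.add(best['scale'])
        candidates.remove(best)
    return selected
```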
Abstract:
We analyze the behavior of spot prices in the Colombian wholesale power market using a series of models derived from industrial organization theory. We first create a Cournot-based model that simulates the strategic behavior of the market-leading power generators, which we use to estimate two industrial organization variables, the Index of Residual Demand and the Herfindahl-Hirschman Index (HHI). We use these variables to create VAR models that estimate spot prices and power market impulse-response relationships. The results from these models show that hydroelectric generators can use their water storage capability strategically to affect primarily off-peak prices, while the thermal generators can manage their capacity strategically to affect on-peak prices. In addition, shocks to the Index of Residual Capacity and to the HHI cause spot price fluctuations, which can be interpreted as the generators' strategic response to these shocks.
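For reference, the Herfindahl-Hirschman Index used above as an explanatory variable in the VAR models is the sum of squared market shares; a minimal sketch with illustrative shares, not the Colombian market data:

```python
# Minimal sketch: the Herfindahl-Hirschman Index (HHI). Shares are generator
# market shares in percent, so the index ranges from near 0 (atomistic market)
# to 10,000 (monopoly).

def hhi(market_shares_percent):
    return sum(s ** 2 for s in market_shares_percent)

# e.g. four generators with 40%, 30%, 20% and 10% of dispatched energy:
# hhi([40, 30, 20, 10]) -> 3000
```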
Abstract:
Master's in Business Sciences (Mestrado em Ciências Empresariais)
Abstract:
Facility location concerns the placement of facilities, for various objectives, by use of mathematical models and solution procedures. Almost all facility location models found in the literature are based on minimizing costs or maximizing cover, i.e. covering as much demand as possible. These models are quite efficient at finding an optimal location for a new facility for a particular data set, which is considered to be constant and known in advance. In a real-world situation, input data such as demand and travelling costs are neither fixed nor known in advance. This uncertainty and uncontrollability can lead to unacceptable losses or even bankruptcy. One way of dealing with these factors is robustness modelling. A robust facility location model aims to locate a facility that performs as well as possible, staying within predefined limits under all expectable circumstances. The deviation robustness concept is used as the basis to develop a new competitive deviation robustness model. The competition is modelled with a Huff-based model, which calculates the market share of the new facility. Robustness in this model is defined as the ability of a facility location to capture a minimum market share despite variations in demand. A test case is developed with which algorithms can be tested on their ability to solve robust facility location models. Four stochastic optimization algorithms are considered, of which Simulated Annealing turned out to be the most appropriate. The test case is slightly modified for a competitive market situation. With the Simulated Annealing algorithm, the developed competitive deviation model is solved for three considered norms of deviation. Finally, a grid search is performed to illustrate the landscape of the objective function of the competitive deviation model. The model appears to be multimodal and presents a challenge for further research.
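A minimal sketch of a Huff-type market-share calculation of the kind referred to above; the attractiveness values, distance-decay exponent and demand weights are illustrative assumptions rather than the thesis' calibrated model:

```python
# Hedged sketch of a Huff-type market-share calculation for a candidate
# facility competing with existing ones. Attractiveness values, the
# distance-decay exponent beta and the demand weights are illustrative.

def huff_market_share(candidate, competitors, demand_points, beta=2.0):
    """candidate/competitors: (attractiveness, (x, y)); demand_points: (weight, (x, y))."""
    def utility(attractiveness, fac_xy, dem_xy):
        d = ((fac_xy[0] - dem_xy[0]) ** 2 + (fac_xy[1] - dem_xy[1]) ** 2) ** 0.5
        return attractiveness / max(d, 1e-9) ** beta

    captured, total = 0.0, 0.0
    for w, p in demand_points:
        u_new = utility(*candidate, p)
        u_all = u_new + sum(utility(*c, p) for c in competitors)
        captured += w * u_new / u_all    # Huff choice probability weighted by demand
        total += w
    return captured / total              # expected market share of the new facility
```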
Abstract:
In this paper, the temporal and statistical properties of a Lyot-filter-based multiwavelength random DFB fiber laser with a wide flat spectrum, consisting of individual lines, were investigated. It was shown that the separate spectral lines forming the laser spectrum have mostly Gaussian statistics and thus represent stochastic radiation, but at the same time the entire radiation is not fully stochastic. A simple model taking into account phenomenological correlations of the lines' initial phases was established. The radiation structure in the experiment and in the simulation proved to be different, requiring the interactions between different lines to be described via an NLSE-based model.
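For context, the NLSE mentioned above is commonly written in the following standard form for the slowly varying envelope A(z, t); the specific dispersion, nonlinearity, gain and loss coefficients of this random DFB fiber laser are not given in the abstract, so the expression below is only the textbook form:

```latex
% Standard form of the nonlinear Schroedinger equation (NLSE) for the slowly
% varying field envelope A(z,t) in a fiber; the gain/loss term and the
% coefficient values for this particular laser are assumptions.
\[
  \frac{\partial A}{\partial z}
  + \frac{i\beta_2}{2}\,\frac{\partial^2 A}{\partial t^2}
  = i\gamma \lvert A\rvert^2 A + \frac{g - \alpha}{2}\,A
\]
```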
Abstract:
Changing the traditional pattern of public procurement to an electronic paradigm is a radical innovation involving major organizational changes, the breaking up of traditional processes and practices, and the obsolescence of knowledge and skills. Going beyond the European Commission's recommendations, in 2009 Portugal pioneered making e-procurement mandatory in the pre-award phase, in a European context of multiple technical standards and a lack of interoperability of electronic platforms across EU countries. Six years later, when the creation of a European e-procurement single market is an EU mission and a major legislative amendment is underway in Portugal, this study looks at the relationship between e-procurement and innovation in the Portuguese municipalities, aiming to understand the extent to which the adoption of e-procurement brought about real organizational change or, on the other hand, merely represented an adaptation of the usual procurement practices. The study draws on data from an electronic survey of all municipalities in mainland Portugal, and the analysis is mainly descriptive and exploratory. The paradigm shift in public procurement involves major organizational changes but, overall, the results suggest that most municipalities do not have a clear understanding of the innovative scope (depth and diversity) implied by e-procurement. E-procurement shows advantages over the paper-based model, but an unbalanced perception of the innovation dimensions has influenced the implementation of e-procurement and the degree of organizational change.
Abstract:
Canopy and aerodynamic conductances (gC and gA) are two of the key land surface biophysical variables that control the land surface response of land surface schemes in climate models. Their representation is crucial for predicting transpiration (λET) and evaporation (λEE) flux components of the terrestrial latent heat flux (λE), which has important implications for global climate change and water resource management. By physically integrating radiometric surface temperature (TR) into an integrated framework of the Penman-Monteith and Shuttleworth-Wallace models, we present a novel approach to directly quantify the canopy-scale biophysical controls on λET and λEE over multiple plant functional types (PFTs) in the Amazon Basin. Combining data from six LBA (Large-scale Biosphere-Atmosphere Experiment in Amazonia) eddy covariance tower sites and a TR-driven physically based modeling approach, we identified the canopy-scale feedback-response mechanism between gC, λET, and atmospheric vapor pressure deficit (DA), without using any leaf-scale empirical parameterizations for the modeling. The TR-based model shows minor biophysical control on λET during the wet (rainy) seasons, when λET becomes predominantly radiation driven and net radiation (RN) determines 75 to 80 % of the variance of λET. However, biophysical control on λET increases dramatically during the dry seasons, and particularly in the 2005 drought year, explaining 50 to 65 % of the variance of λET and indicating λET to be substantially soil moisture driven during the rainfall deficit phase. Despite substantial differences in gA between forests and pastures, very similar canopy-atmosphere "coupling" was found in these two biomes due to a soil moisture-induced decrease in gC in the pasture. This reveals a pragmatic aspect of the TR-driven model behavior: gC is highly sensitive to per-unit changes in wetness, whereas gA is only marginally sensitive to surface wetness variability. Our results reveal a significant hysteresis between λET and gC during the dry season at the pasture sites, which is attributed to relatively low soil water availability compared to the rainforests, likely due to differences in rooting depth between the two systems. Evaporation was significantly influenced by gA for all the PFTs and across all wetness conditions. Our analytical framework logically captures the responses of gC and gA to changes in atmospheric radiation, DA, and surface radiometric temperature, and thus appears promising for improving existing land-surface-atmosphere exchange parameterizations across a range of spatial scales.
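For reference, the standard big-leaf Penman-Monteith form below shows how gA and gC jointly control the latent heat flux; the paper's actual framework couples this with the two-source Shuttleworth-Wallace model, which is not reproduced here (G is the ground heat flux, Δ the slope of the saturation vapor pressure curve, γ the psychrometric constant, ρ_a c_p the volumetric heat capacity of air):

```latex
% Standard big-leaf Penman-Monteith form showing the roles of g_A and g_C;
% the paper's two-source (Shuttleworth-Wallace) extension is not shown.
\[
  \lambda E \;=\;
  \frac{\Delta\,(R_N - G) \;+\; \rho_a c_p\, D_A\, g_A}
       {\Delta \;+\; \gamma\left(1 + \frac{g_A}{g_C}\right)}
\]
```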
Abstract:
This comprehensive study explores the intricate world of 3D printing, with a focus on Fused Deposition Modelling (FDM). It sheds light on the critical factors that influence the quality and mechanical properties of 3D printed objects. Using an optical microscope at 40X magnification, the shapes of the printed beads are correlated with specific slicing parameters, resulting in a 2D parametric model. This mathematical model, derived from real samples, serves as a tool to predict general mechanical behaviour, bridging the gap between theory and practice in FDM printing. The study begins by emphasising the influence of geometric parameters such as layer height, line width and filament tolerance on the final printed bead geometry and the resulting theoretical effect on mechanical properties. The introduction of the VPratio parameter (the ratio between the area of the voids and the area occupied by printed material) allows the effect of varying the geometric slicing parameters on the improvement or reduction of mechanical properties to be quantified. The study also addresses the effect of overhangs and the role of filament diameter tolerances. The research continues with the introduction of 3D FEM (finite element method) models based on the RVE (Representative Volume Element) to verify the results obtained from the 2D model and to analyse other aspects that affect mechanical properties and are not directly observable with the 2D model. The study also proposes a model for the examination of 3D printed infill structures, introducing an innovative methodology called "double RVE" which speeds up the calculation of mechanical properties and is more computationally efficient. Finally, the limitations of the RVE model are shown and a so-called hybrid RVE-based model is created to overcome the limitations and inaccuracy of the conventional RVE model and homogenization procedure on some printed geometries.
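As a rough illustration of the VPratio parameter defined above (void area over material area in a cross-section), the sketch below assumes an idealized stadium-shaped bead (a rectangle capped by two semicircles) tiled on a layer-height by line-width grid; this geometry is an assumption for illustration, not the thesis' calibrated 2D model:

```python
# Hedged sketch of VPratio = void area / material area for an idealized
# stadium-shaped bead on a regular raster (assumes line_width >= layer_height).
import math

def vpratio(layer_height, line_width):
    cell_area = layer_height * line_width                      # unit cell of the raster
    bead_area = (line_width - layer_height) * layer_height \
                + math.pi * (layer_height / 2) ** 2            # stadium-shaped bead cross-section
    void_area = cell_area - bead_area
    return void_area / bead_area

# e.g. vpratio(0.2, 0.4) -> ~0.12, i.e. roughly 12% void area relative to material
```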
Abstract:
This paper discusses the integrated design of parallel manipulators, which exhibit varying dynamics. This characteristic affects machine stability and performance. The design methodology consists of four main steps: (i) system modeling using a flexible multibody technique, (ii) synthesis of reduced-order models suitable for control design, (iii) systematic flexible-model-based input signal design, and (iv) evaluation of some possible machine designs. The novelty in this methodology is that structural flexibilities are taken into consideration during the input signal design, thereby enhancing the standard design process, which mainly considers rigid-body dynamics. The potential of the proposed strategy is exploited for the design evaluation of a two-degree-of-freedom high-speed parallel manipulator. The results are experimentally validated. (C) 2010 Elsevier Ltd. All rights reserved.
Abstract:
Corresponding to the updated flow pattern map presented in Part I of this study, an updated general flow-pattern-based flow boiling heat transfer model was developed for CO2, using the Cheng-Ribatski-Wojtan-Thome flow boiling heat transfer model [L. Cheng, G. Ribatski, L. Wojtan, J.R. Thome, New flow boiling heat transfer model and flow pattern map for carbon dioxide evaporating inside horizontal tubes, Int. J. Heat Mass Transfer 49 (2006) 4082-4094; L. Cheng, G. Ribatski, L. Wojtan, J.R. Thome, Erratum to: "New flow boiling heat transfer model and flow pattern map for carbon dioxide evaporating inside tubes" [Heat Mass Transfer 49 (21-22) (2006) 4082-4094], Int. J. Heat Mass Transfer 50 (2007) 391] as the starting basis. The flow boiling heat transfer correlation in the dryout region was updated. In addition, a new mist flow heat transfer correlation for CO2 was developed based on the CO2 data, and a heat transfer method for bubbly flow was proposed for completeness' sake. The updated general flow boiling heat transfer model for CO2 covers all flow regimes and is applicable to a wider range of conditions for horizontal tubes: tube diameters from 0.6 to 10 mm, mass velocities from 50 to 1500 kg/m(2) s, heat fluxes from 1.8 to 46 kW/m(2) and saturation temperatures from -28 to 25 degrees C (reduced pressures from 0.21 to 0.87). The updated general flow boiling heat transfer model was compared to a new experimental database containing 1124 data points (790 more than in the previous model [Cheng et al., 2006, 2007]). Good agreement between the predicted and experimental data was found in general, with 71.4% of the entire database and 83.2% of the database without the dryout and mist flow data predicted within +/-30%. However, the predictions for the dryout and mist flow regions were less satisfactory due to the limited number of data points, the higher inaccuracy in such data, scatter in some data sets ranging up to 40%, significant discrepancies from one experimental study to another and the difficulties associated with predicting the inception and completion of dryout around the perimeter of the horizontal tubes. (C) 2007 Elsevier Ltd. All rights reserved.
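A small sketch encoding the stated applicability range of the updated model (values taken directly from the abstract; the heat transfer correlations themselves are in the cited papers and not reproduced here):

```python
# Sketch of a validity check for the stated applicability range of the updated
# flow boiling heat transfer model for CO2 in horizontal tubes.

def within_model_range(d_mm, g_kg_m2s, q_kw_m2, t_sat_c):
    return (0.6 <= d_mm <= 10.0          # tube diameter, mm
            and 50 <= g_kg_m2s <= 1500   # mass velocity, kg/(m^2 s)
            and 1.8 <= q_kw_m2 <= 46     # heat flux, kW/m^2
            and -28 <= t_sat_c <= 25)    # saturation temperature, deg C

# e.g. within_model_range(3.0, 400, 10, -10) -> True
```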
Abstract:
In this paper, the continuous Verhulst dynamic model is used to synthesize a new distributed power control algorithm (DPCA) for use in direct sequence code division multiple access (DS-CDMA) systems. The Verhulst model was originally designed to describe the population growth of biological species under food and physical space restrictions. The discretization of the corresponding differential equation is accomplished via the Euler numerical integration (ENI) method. Analytical convergence conditions for the proposed DPCA are also established. Several properties of the proposed recursive algorithm, such as the Euclidean distance from the optimum vector after convergence, convergence speed, normalized mean squared error (NSE), average power consumption per user, performance under dynamic channels, and implementation complexity aspects, are analyzed through simulations. The simulation results are compared with two other DPCAs: the classic algorithm derived by Foschini and Miljanic and the sigmoidal algorithm of Uykan and Koivo. Under estimation error conditions, the proposed DPCA exhibits a smaller discrepancy from the optimum power vector solution and better convergence (under both fixed and adaptive convergence factors) than the classic and sigmoidal DPCAs. (C) 2010 Elsevier GmbH. All rights reserved.
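For intuition, the sketch below shows the explicit Euler discretization of the continuous Verhulst (logistic) equation dp/dt = r p (1 - p/K) that underlies the proposed DPCA; how r and K map onto SINR targets and interference measurements is defined in the paper and not reproduced here:

```python
# Hedged sketch: explicit Euler discretization of the Verhulst (logistic)
# model dp/dt = r * p * (1 - p / K). The mapping of r and K to per-user
# power control quantities is the paper's contribution and is not shown.

def verhulst_euler(p0, r, K, h, steps):
    p = p0
    trajectory = [p]
    for _ in range(steps):
        p = p + h * r * p * (1.0 - p / K)   # explicit Euler step
        trajectory.append(p)
    return trajectory

# e.g. verhulst_euler(p0=0.1, r=1.0, K=1.0, h=0.1, steps=100) converges toward K
```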
Abstract:
In this study, twenty hydroxylated and acetoxylated 3-phenylcoumarin derivatives were evaluated as inhibitors of immune complex-stimulated neutrophil oxidative metabolism and as possible modulators of the inflammatory tissue damage found in type III hypersensitivity reactions. By using lucigenin- and luminol-enhanced chemiluminescence assays (CL-luc and CL-lum, respectively), we found that the 6,7-dihydroxylated and 6,7-diacetoxylated 3-phenylcoumarin derivatives were the most effective inhibitors. Different structural features of the other compounds determined CL-luc and/or CL-lum inhibition. The 2D-QSAR analysis suggested the importance of hydrophobic contributions in explaining these effects. In addition, a statistically significant 3D-QSAR model built by applying GRIND descriptors allowed us to propose a virtual receptor site considering pharmacophoric regions and mutual distances. Furthermore, the 3-phenylcoumarins studied were not toxic to neutrophils under the assessed conditions. (C) 2007 Elsevier Masson SAS. All rights reserved.
Heterogeneity in schizophrenia: A mixture model analysis based on age-of-onset, gender and diagnosis
Abstract:
A significant problem in the collection of responses to potentially sensitive questions, such as those relating to illegal, immoral or embarrassing activities, is non-sampling error due to refusal to respond or false responses. Eichhorn & Hayre (1983) suggested the use of scrambled responses to reduce this form of bias. This paper considers a linear regression model in which the dependent variable is unobserved, but its sum or product with a scrambling random variable of known distribution is known. The performance of two likelihood-based estimators is investigated, namely a Bayesian estimator obtained through a Markov chain Monte Carlo (MCMC) sampling scheme and a classical maximum-likelihood estimator. These two estimators and an estimator suggested by Singh, Joarder & King (1996) are compared. Monte Carlo results show that the Bayesian estimator outperforms the classical estimators in almost all cases, and that the relative performance of the Bayesian estimator improves as the responses become more scrambled.
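As a toy illustration of multiplicative scrambled responses in a regression setting (in the spirit of Eichhorn & Hayre), the simulation below observes only z = s·y and recovers the slope with a simple moment-based estimate; this baseline is an assumption for intuition and is not the paper's Bayesian MCMC or maximum-likelihood estimator:

```python
# Illustrative simulation: the analyst sees z = s * y, where s is a scrambling
# variable of known distribution, and never sees y itself. The moment-based
# estimate (regress z / E[s] on x) is only a baseline for intuition.
import random

random.seed(0)
n, beta0, beta1 = 2000, 1.0, 2.0
x = [random.uniform(0, 1) for _ in range(n)]
y = [beta0 + beta1 * xi + random.gauss(0, 0.5) for xi in x]   # true, unobserved responses
s = [random.uniform(0.5, 1.5) for _ in range(n)]              # scrambling variable, E[s] = 1.0
z = [si * yi for si, yi in zip(s, y)]                         # observed scrambled responses

# Ordinary least squares slope from (x, z / E[s]).
w = [zi / 1.0 for zi in z]
mx, mw = sum(x) / n, sum(w) / n
slope = sum((xi - mx) * (wi - mw) for xi, wi in zip(x, w)) / sum((xi - mx) ** 2 for xi in x)
print(round(slope, 2))   # close to beta1 = 2.0, though noisier than with unscrambled data
```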