4 results for Multi-Choice mixed integer goal programming

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Abstract:

This work presents a new model for the Heterogeneous p-median Problem (HPM), proposed to recover the hidden category structures present in data provided by a sorting task procedure, a popular approach for understanding heterogeneous individuals' perceptions of products and brands. The new model is named the Penalty-free Heterogeneous p-median Problem (PFHPM), a single-objective version of the original HPM. It eliminates the HPM's main parameter, the penalty factor, which weights the terms of the objective function: tuning this parameter controls how the model recovers the hidden category structures in the data and requires broad knowledge of the problem. Additionally, two complementary formulations of the PFHPM are presented, both mixed-integer linear programming problems, from which lower bounds for the PFHPM were obtained. These bounds were used to validate a specialized Variable Neighborhood Search (VNS) algorithm proposed to solve the PFHPM. The algorithm provided good-quality solutions for the PFHPM, solving artificially generated instances from a Monte Carlo simulation as well as real-data instances, even with limited computational resources. Statistical analyses presented in this work suggest that the new model and algorithm (PFHPM) recover the original category structures related to heterogeneous individuals' perceptions more accurately than the original model and algorithm (HPM). Finally, an illustrative application of the PFHPM is presented, along with some insights into new possibilities, such as extending the model to fuzzy environments.
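For readers unfamiliar with the metaheuristic, the sketch below shows the generic VNS scheme (shaking, local search, neighborhood change) applied to a plain p-median objective over a precomputed distance matrix. It is a minimal illustration under stated assumptions, not the thesis' specialized algorithm: the PFHPM objective, the instance data, and all sizes and names here are assumptions.

```c
/* Minimal VNS sketch for a plain p-median objective. Illustrative only:
 * the thesis' PFHPM has additional structure (category recovery,
 * penalty-free objective) that is not modeled here. */
#include <stdlib.h>

#define N    100  /* number of points (assumed) */
#define P    5    /* number of medians (assumed) */
#define KMAX 3    /* largest shaking neighborhood */

double dist[N][N];  /* pairwise distances, assumed filled elsewhere */

/* Cost: each point is assigned to its nearest median. */
double cost(const int med[P]) {
    double total = 0.0;
    for (int i = 0; i < N; i++) {
        double best = dist[i][med[0]];
        for (int j = 1; j < P; j++)
            if (dist[i][med[j]] < best) best = dist[i][med[j]];
        total += best;
    }
    return total;
}

/* Shaking: replace k randomly chosen medians with random points
 * (duplicates are tolerated for simplicity; cost stays well defined). */
void shake(const int med[P], int out[P], int k) {
    for (int j = 0; j < P; j++) out[j] = med[j];
    for (int s = 0; s < k; s++)
        out[rand() % P] = rand() % N;
}

/* Local search: first-improvement swap of one median for one point. */
void local_search(int med[P]) {
    int improved = 1;
    while (improved) {
        improved = 0;
        double base = cost(med);
        for (int j = 0; j < P && !improved; j++) {
            int old = med[j];
            for (int v = 0; v < N && !improved; v++) {
                med[j] = v;
                if (cost(med) < base) improved = 1;
                else med[j] = old;
            }
        }
    }
}

/* VNS main loop: shake, descend, and restart from the smallest
 * neighborhood whenever an improving solution is found. */
void vns(int med[P], int iters) {
    int trial[P];
    for (int it = 0; it < iters; it++) {
        for (int k = 1; k <= KMAX; k++) {
            shake(med, trial, k);
            local_search(trial);
            if (cost(trial) < cost(med)) {
                for (int j = 0; j < P; j++) med[j] = trial[j];
                k = 0;  /* neighborhood change: back to k = 1 */
            }
        }
    }
}
```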

Relevance:

40.00%

Abstract:

Multi-cloud applications are composed of services offered by multiple cloud platforms, where the user/developer has full knowledge of the use of such platforms. Using multiple cloud platforms avoids the following problems: (i) vendor lock-in, i.e., the application's dependency on a particular cloud platform, which is harmful in case of degradation or failure of platform services, or even price increases in service usage; (ii) degradation or failure of the application due to fluctuations in the quality of service (QoS) provided by some cloud platform, or even due to the failure of some service. In a multi-cloud scenario it is possible to replace a failed service, or one with QoS problems, with an equivalent service from another cloud platform. For an application to adopt the multi-cloud perspective, it is necessary to create mechanisms able to select which cloud services/platforms should be used, in accordance with the requirements determined by the programmer/user. In this context, the major challenges in developing such applications include: (i) choosing the underlying services and cloud computing platforms based on the user-defined requirements for functionality and quality; (ii) continually monitoring dynamic information related to cloud services (such as response time, availability, and price), in addition to coping with the wide variety of services; and (iii) adapting the application if QoS violations affect the user-defined requirements. This PhD thesis proposes an approach for the dynamic adaptation of multi-cloud applications, to be applied when a service becomes unavailable or when the requirements set by the user/developer indicate that another available multi-cloud configuration meets them more efficiently. The proposed strategy is composed of two phases. The first phase consists of modeling the application, exploiting the capacity to represent commonalities and variability proposed in the context of the Software Product Line (SPL) paradigm. In this phase an extended feature model is used to specify the cloud service configuration to be used by the application (commonalities) and the possible providers for each service (variability). Furthermore, the non-functional requirements associated with cloud services are specified by properties in this model, describing dynamic information about these services. The second phase consists of an autonomic process based on the MAPE-K control loop, which is responsible for optimally selecting a multi-cloud configuration that meets the established requirements and for performing the adaptation. The proposed adaptation strategy is independent of the programming technique used to perform the adaptation; in this work it was implemented with several techniques, such as aspect-oriented programming, context-oriented programming, and component- and service-oriented programming. Based on the proposed steps, the work assesses: (i) whether the modeling process and the specification of non-functional requirements can ensure effective monitoring of user satisfaction; (ii) whether the optimal selection process yields significant gains compared with a sequential approach; and (iii) which techniques offer the best trade-off between development effort/modularity and performance.
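As a concrete illustration of the second phase, the sketch below implements one iteration of a MAPE-K style loop that re-selects a provider per service when monitored QoS violates a user requirement. All structures, field names, and the greedy selection rule are hypothetical simplifications, not the thesis' actual model or API.

```c
/* Minimal MAPE-K style sketch: monitor -> analyze -> plan -> execute,
 * with shared knowledge in static arrays. Hypothetical names throughout. */
#include <stdio.h>

#define NSERVICES  2
#define NPROVIDERS 3

typedef struct {
    double response_ms;  /* monitored response time */
    double price;        /* monitored price per call */
    int    available;    /* monitored availability flag */
} Metrics;

typedef struct {
    double max_response_ms;  /* user-defined requirement */
} Requirement;

/* Knowledge: current configuration and last observed metrics.
 * A Monitor component is assumed to refresh `observed` periodically. */
static int current[NSERVICES];                   /* provider per service */
static Metrics observed[NSERVICES][NPROVIDERS];  /* filled by monitors   */

/* Analyze: does the current provider violate the requirement? */
static int violates(int s, const Requirement *req) {
    const Metrics *m = &observed[s][current[s]];
    return !m->available || m->response_ms > req->max_response_ms;
}

/* Plan: pick the cheapest available provider that meets the requirement
 * (a greedy stand-in for the thesis' optimal selection process). */
static int plan(int s, const Requirement *req) {
    int best = -1;
    for (int p = 0; p < NPROVIDERS; p++) {
        const Metrics *m = &observed[s][p];
        if (m->available && m->response_ms <= req->max_response_ms &&
            (best < 0 || m->price < observed[s][best].price))
            best = p;
    }
    return best;
}

/* Execute: rebind the service; the real adaptation could be realized
 * with aspects, context-oriented code, or component reconfiguration. */
static void execute(int s, int p) {
    printf("service %d: switching to provider %d\n", s, p);
    current[s] = p;
}

void mape_k_iteration(const Requirement reqs[NSERVICES]) {
    for (int s = 0; s < NSERVICES; s++)
        if (violates(s, &reqs[s])) {
            int p = plan(s, &reqs[s]);
            if (p >= 0) execute(s, p);
        }
}
```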

Relevance:

30.00%

Abstract:

The seismic method is of extreme importance in geophysics. Mainly associated with oil exploration, this line of research concentrates most of the investment in the area. Acquisition, processing, and interpretation of seismic data are the stages that make up a seismic study. Seismic processing, in particular, is focused on producing an image that represents the geological structures in the subsurface. It has evolved significantly in recent decades, driven by the demands of the oil industry and by advances in hardware that delivered greater storage and digital processing capacity, enabling more sophisticated processing algorithms, such as those that exploit parallel architectures. One of the most important steps in seismic processing is imaging. Migration of seismic data is one of the techniques used for imaging, with the goal of obtaining a seismic section that represents the geological structures as accurately and faithfully as possible. The result of migration is a 2D or 3D image in which it is possible to identify faults, salt domes, and other structures of interest, such as potential hydrocarbon reservoirs. However, a migration performed with quality and accuracy can be a very time-consuming process, owing to the heuristics of the mathematical algorithms and the large volume of input and output data involved; it may take days, weeks, or even months of uninterrupted execution on supercomputers, representing large computational and financial costs that could make these methods impractical. Aiming at performance improvement, this work parallelized the core of a Reverse Time Migration (RTM) algorithm using the Open Multi-Processing (OpenMP) parallel programming model, given the large computational effort required by this migration technique. Furthermore, speedup and efficiency analyses were performed, and, finally, the degree of algorithmic scalability was identified with respect to the technological advances expected in future processors.
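To make the parallelization strategy concrete, the sketch below applies a loop-level OpenMP directive to a simplified 2-D acoustic finite-difference time step, the kind of kernel at the core of RTM. The grid sizes, array names, and second-order stencil are illustrative assumptions, not the thesis' actual kernel.

```c
/* Sketch of loop-level OpenMP parallelization of a 2-D acoustic
 * finite-difference update (one RTM-style time step). Assumed sizes. */
#include <omp.h>

#define NX 1000
#define NZ 1000

/* Second-order wave-equation update:
 * p_next = 2*p - p_prev + vel^2 * dt^2 * laplacian(p). */
void rtm_step(float p_next[NX][NZ], const float p[NX][NZ],
              const float p_prev[NX][NZ], const float vel2dt2[NX][NZ])
{
    /* Grid points are independent within a time step, so the nested
     * loops can be collapsed and split statically across threads. */
    #pragma omp parallel for collapse(2) schedule(static)
    for (int ix = 1; ix < NX - 1; ix++) {
        for (int iz = 1; iz < NZ - 1; iz++) {
            float lap = p[ix+1][iz] + p[ix-1][iz]
                      + p[ix][iz+1] + p[ix][iz-1]
                      - 4.0f * p[ix][iz];
            p_next[ix][iz] = 2.0f * p[ix][iz] - p_prev[ix][iz]
                           + vel2dt2[ix][iz] * lap;
        }
    }
}
```

Because each output point depends only on the previous two wavefields, this loop scales with the number of cores until memory bandwidth dominates, which is the kind of behavior the speedup and efficiency analyses in the thesis measure.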

Relevance:

30.00%

Abstract:

Tectonic activity on the southern border of the Parnaíba Basin produced a wide range of brittle structures that affect siliciclastic sedimentary rocks. This tectonic activity and the related faults, joints, and folds are poorly known. The main aims of this study were (1) to identify lineaments using several remote-sensing systems, (2) to check how interpretation based on these systems at several scales influences the identification of lineaments, and (3) to contribute to the knowledge of brittle tectonics on the southern border of the Parnaíba Basin. The integration of orbital and aerial systems allowed a multi-scale identification, classification, and quantification of lineaments. Lineament maps were produced at the following scales: 1:200,000 (SRTM, Shuttle Radar Topography Mission), 1:50,000 (Landsat 7 ETM+ satellite), 1:10,000 (aerial photographs), and 1:5,000 (QuickBird satellite). The classification of features with structural significance allowed the determination of four structural sets: NW, NS, NE, and EW, usually identified in all remote-sensing systems. The NE-trending set was not easily identified in aerial photographs but was better visualized in images from medium-resolution systems (SRTM and Landsat 7 ETM+); the same behavior characterizes the NW-trending set. The NS- and EW-trending sets were better identified in images from high-resolution systems (aerial photographs and QuickBird). The structural meaning of the lineaments was established after fieldwork. The NE-trending set is associated with normal and strike-slip faults, including deformation bands. These are the oldest structures identified in the region and are related to the reactivation of Precambrian basement structures of the Transbrazilian Lineament. The NW-trending set represents strike-slip and subordinate normal faults. The high dispersion of this set suggests a more recent origin than the previous structures; the NW-trending set may be related to the Picos-Santa Inês Lineament. The NS- and EW-trending sets correspond to large joints (100 m to 5 km long). Truncation relationships between these joint sets indicate that the EW-trending set is older than the NS-trending set. The methodology developed in the present work is an excellent tool for understanding regional and local tectonic structures in the Parnaíba Basin, and it helps in choosing the best remote-sensing system to identify brittle features in a poorly known sedimentary basin.