999 results for Scenario Programming, Markup Languages, 3D Virtualworlds


Relevância:

30.00%

Publicador:

Resumo:

Strategic supply chain optimization (SCO) problems are often modelled as a two-stage optimization problem, in which the first-stage variables represent decisions on the development of the supply chain and the second-stage variables represent decisions on the operations of the supply chain. When uncertainty is explicitly considered, the problem becomes an intractable infinite-dimensional optimization problem, which is usually solved approximately via a scenario or a robust approach. This paper proposes a novel synergy of the scenario and robust approaches for strategic SCO under uncertainty. Two formulations are developed, namely, naïve robust scenario formulation and affinely adjustable robust scenario formulation. It is shown that both formulations can be reformulated into tractable deterministic optimization problems if the uncertainty is bounded with the infinity-norm, and the uncertain equality constraints can be reformulated into deterministic constraints without assumption of the uncertainty region. Case studies of a classical farm planning problem and an energy and bioproduct SCO problem demonstrate the advantages of the proposed formulations over the classical scenario formulation. The proposed formulations not only can generate solutions with guaranteed feasibility or indicate infeasibility of a problem, but also can achieve optimal expected economic performance with smaller numbers of scenarios.
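For orientation, the classical scenario formulation that the proposed robust variants build on is the standard two-stage stochastic program (generic notation assumed here, not taken from the paper):

```latex
\min_{x,\; y_s}\; c^\top x \;+\; \sum_{s \in S} p_s\, q_s^\top y_s
\qquad \text{s.t.} \qquad
A x \le b, \qquad
T_s x + W_s y_s \le h_s \quad \forall s \in S
```

Here $x$ collects the first-stage (supply-chain design) decisions, $y_s$ the second-stage (operational) recourse decisions under scenario $s$, and $p_s$ the scenario probability. The robust scenario formulations of the paper replace each fixed scenario with a bounded uncertainty region around it.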

Relevância:

30.00%

Publicador:

Resumo:

Thesis (Ph.D.)--University of Washington, 2016-08

Relevância:

30.00%

Publicador:

Resumo:

This article presents the process of implementing an API (Application Programming Interface) that enables the Essential Reality P5 glove to interact with a virtual environment developed in the Java programming language and its Java 3D library. It also describes an example implemented using this API. Based on this example, the results of tests measuring physical resource requirements, such as CPU and physical memory, are presented. Finally, the conclusions and results obtained are stated.

Relevância:

30.00%

Publicador:

Resumo:

Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates both in the applications' class structure and in the database structure. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as with related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless, the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and the instance adaptation problems were addressed as database management concerns. However, none of this research focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure. 
That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that know only the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the conception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect, in the aspect-oriented sense. Objects do not require the extension of any superclass, the implementation of an interface, or a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable, due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment which supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update an application's class structure, and the framework will produce a new version of it at the database metadata layer. Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism can be extended, keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at both the application and database levels. 
The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out to validate the robustness of the prototype and the meta-model. For these tests we used a small OO7 database, chosen for its data model complexity. Since the developed prototype offers features not observed in other known systems, performance benchmarks against them were not possible; however, the developed benchmark is now available for future performance comparisons with equivalent systems. To test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms; using our framework, and with minor changes to the application's source code, we added them. Furthermore, we tested the application in a schema evolution scenario. This real-world experience showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single-Java-Virtual-Machine concurrency model are the major limitations found in the framework.
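To illustrate the class-versioning and instance-adaptation idea in miniature (an invented sketch, not the thesis framework's API): stored objects carry the schema version that wrote them, and registered adapters convert records between adjacent versions on access, so applications built against different class-structure versions can share one database. Only the forward direction is shown here.

```python
# Hypothetical sketch of class versioning with instance adaptation.
# None of these names come from the thesis; the record format is invented.
adapters = {}          # (from_version, to_version) -> conversion function

def adapter(src, dst):
    """Register a conversion function for one schema-version step."""
    def register(fn):
        adapters[(src, dst)] = fn
        return fn
    return register

@adapter(1, 2)
def split_name(record):
    """v1 stored a single 'name'; v2 splits it into first/last."""
    first, _, last = record.pop("name").partition(" ")
    return {**record, "first": first, "last": last}

def load(record, stored_version, wanted_version):
    """Adapt a stored record to the requesting application's version,
    one version step at a time."""
    while stored_version != wanted_version:
        record = adapters[(stored_version, stored_version + 1)](record)
        stored_version += 1
    return record

old = {"name": "Ada Lovelace", "id": 7}      # written by a v1 application
print(load(dict(old), 1, 2))                 # adapted for a v2 application
```

A real framework would weave the `load` step in transparently (as an aspect) instead of requiring an explicit call, which is precisely the obliviousness the thesis aims for.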

Relevância:

30.00%

Publicador:

Resumo:

It is well known that nowadays technology develops very fast. New architectures are created in order to provide solutions for different technology limitations and problems. Sometimes this evolution is peaceful and requires no adaptation; at other times it may demand change. Programming languages have always been the main communication bridge between the programmer and the computer. New ones keep appearing and others keep improving in order to adapt to new concepts and paradigms. This requires an extra effort from the programmer, who always needs to be aware of these changes. Visual Programming may be a solution to this problem. Expressing functions as module boxes which receive a given input and return a given output may help programmers across the world, by giving them the possibility to abstract from low-level details of a specific architecture. This thesis not only shows how the capabilities of the Cell/B.E. (which has a heterogeneous multi-core architecture) can be combined with OpenDX (which has a visual programming environment), but also demonstrates that it can be done without losing much performance.

Relevância:

30.00%

Publicador:

Resumo:

A poster of this paper will be presented at the 25th International Conference on Parallel Architecture and Compilation Technology (PACT ’16), September 11-15, 2016, Haifa, Israel.

Relevância:

30.00%

Publicador:

Resumo:

This paper presents the implementation of a high-quality real-time 3D video system intended for 3D videoconferencing. Basically, the system is able to extract depth information from a pair of images coming from a short-baseline camera setup. The system is based on a variant of the adaptive support-weight algorithm applied on GPU-based architectures, in order to obtain real-time results without compromising accuracy and to reduce costs by using commodity hardware. The complete system runs over the GStreamer multimedia software platform to make it even more flexible. Moreover, an autostereoscopic display has been used as the end terminal for 3D content visualization.
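The adaptive support-weight idea weights each neighbour's matching cost by its intensity similarity and spatial proximity to the window centre before aggregation, so depth edges are preserved. A deliberately tiny 1-D grayscale sketch of that aggregation (illustrative parameter values and data, not the paper's GPU implementation):

```python
import math

# 1-D sketch of adaptive support-weight cost aggregation (Yoon & Kweon
# style), the algorithm family the paper's GPU variant builds on.
GAMMA_C, GAMMA_G = 10.0, 7.0   # similarity and proximity bandwidths (assumed)

def support_weight(center, neighbor, dist):
    """Weight from intensity similarity and spatial proximity to the center."""
    return math.exp(-(abs(center - neighbor) / GAMMA_C + dist / GAMMA_G))

def aggregated_cost(left, right, x, d, radius=1):
    """Support-weighted sum of absolute differences for disparity d at x."""
    num = den = 0.0
    for dx in range(-radius, radius + 1):
        xl, xr = x + dx, x + dx - d
        if 0 <= xl < len(left) and 0 <= xr < len(right):
            w = support_weight(left[x], left[xl], abs(dx))
            num += w * abs(left[xl] - right[xr])
            den += w
    return num / den if den else float("inf")

left  = [10, 10, 80, 80, 10, 10]
right = [10, 80, 80, 10, 10, 10]   # same scene shifted one pixel
best = min(range(3), key=lambda d: aggregated_cost(left, right, 2, d))
print(best)   # winner-takes-all disparity at x = 2 is 1
```

The weights suppress neighbours that belong to a different surface than the centre pixel, which is what lets the method keep accuracy at object boundaries; the per-pixel independence of the computation is also what makes it map well to GPUs.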

Relevância:

30.00%

Publicador:

Resumo:

Anthropometric measurement protocols are characterized by a profusion of discrete or localized measurements, in an attempt to fully characterize the subject's body shape. Such protocols are used intensively in fields such as sports, forensic and/or reconstructive medicine, prosthesis design, ergonomics, and the making of garments, accessories, etc. With the advance of algorithms for shape recovery from samplings (digitizations), anthropometric characterization has changed significantly. This article presents the process of digital characterization of body shape, from the measurement protocols applied to the subject, through the computational environment for surface recovery, DigitLAB (developed at the CII-CAD-CAM-CG of Universidad EAFIT), to the final geometric models. Comparisons between the results obtained with DigitLAB and with commercial 3D shape-recovery packages are presented. The DigitLAB results prove superior, mainly because it takes advantage of the digitization patterns (planar contact scans, pixel-grid range images, etc.) and provides geometric-statistical data processing modules that allow the shape-recovery algorithms to be applied effectively. A case study aimed at the garment industry is presented, along with others carried out on test sets commonly used in the scientific community for the validation of algorithms.

Relevância:

30.00%

Publicador:

Resumo:

Process systems design, operation and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. These problems, with a scenario-based formulation, lead to large-scale MILPs/MINLPs that are well structured. The first part of the thesis proposes a new finitely convergent cross decomposition method (CD), in which Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. This method alternates between DWD iterations and BD iterations, where DWD restricted master problems and BD primal problems yield a sequence of upper bounds, and BD relaxed master problems yield a sequence of lower bounds. A variant of CD, called multicolumn-multicut CD, which adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed. In this approach, CD at the first level and DWD at a second level are used to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy or over solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for explicit branch-and-bound search. In this approach, LD subproblems and GBD subproblems are systematically solved in a single framework. The relaxed master problem, obtained from a reformulation of the original problem, is solved only when necessary. 
A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Using case studies taken from renewable-resource and fossil-fuel-based applications in process systems engineering, it can be seen that these novel decomposition approaches have significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
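To make the bounding scheme concrete, here is a minimal single-level Benders loop on an invented newsvendor-style instance: the master proposes a first-stage capacity, the scenario subproblems price the recourse and return an optimality cut, and the upper and lower bounds close. All numbers are illustrative, and the master is solved by enumeration here to avoid an LP-solver dependency:

```python
# Toy Benders decomposition for a two-stage stochastic program:
#   min 3*x + E[10 * shortage],  shortage_s = max(0, demand_s - x).
# Illustrative sketch only -- not the CD/ECD algorithms of the thesis.
BUILD_COST, SHORTAGE_COST = 3.0, 10.0
SCENARIOS = [(0.5, 2), (0.3, 4), (0.2, 6)]   # (probability, demand)

def subproblem(x):
    """Expected recourse cost and an aggregated Benders optimality cut."""
    cost = sum(p * SHORTAGE_COST * max(0, d - x) for p, d in SCENARIOS)
    # The dual price is SHORTAGE_COST exactly when the shortage binds.
    slope = -sum(p * SHORTAGE_COST for p, d in SCENARIOS if d > x)
    intercept = cost - slope * x      # cut: theta >= intercept + slope * x
    return cost, (intercept, slope)

cuts, upper, lower = [], float("inf"), -float("inf")
while upper - lower > 1e-9:
    # Master: min BUILD_COST*x + theta s.t. the cuts, x enumerated 0..6;
    # theta is floored at 0 since the recourse cost is nonnegative.
    best = None
    for cand in range(7):
        theta = max([0.0] + [a + b * cand for a, b in cuts])
        value = BUILD_COST * cand + theta
        if best is None or value < best[1]:
            best = (cand, value)
    x, lower = best                       # master value is a lower bound
    recourse, cut = subproblem(x)
    upper = min(upper, BUILD_COST * x + recourse)   # feasible -> upper bound
    cuts.append(cut)

print(x, upper)   # optimal capacity x = 4, total cost 16.0
```

The thesis' cross decomposition interleaves exactly this kind of BD bounding with Dantzig-Wolfe iterations, so that both column and cut information tighten the bounds each pass.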

Relevância:

30.00%

Publicador:

Resumo:

The ability to create hybrid systems that blend different paradigms has become a requirement for complex AI systems, which are usually made of more than one component. In this way it is possible to exploit the advantages of each paradigm and the potential of different approaches, such as symbolic and non-symbolic ones. In particular, non-symbolic approaches are often exploited for their efficiency, effectiveness and ability to manage large amounts of data, while symbolic approaches are exploited to ensure explainability, fairness, and trustworthiness in general. The thesis lies in this context, in particular in the design and development of symbolic technologies that can be easily integrated and made interoperable with other AI technologies. 2P-Kt is a symbolic ecosystem developed for this purpose: it provides a logic-programming (LP) engine which can be easily extended and customized to deal with specific needs. The aim of this thesis is to extend 2P-Kt to support constraint logic programming (CLP) as one of the main paradigms for solving highly combinatorial problems, given a declarative problem description and a general constraint-propagation engine. A real case study concerning school timetabling is described to show a practical usage of the CLP(FD) library implemented. Since CLP represents only one particular scenario for extending LP to domain-specific needs, this thesis also presents a more general framework: Labelled Prolog, which extends LP with labelled terms and, in particular, labelled variables. The designed framework shows how it is possible to frame all variations and extensions of LP under a single language, reducing the huge number of existing languages and libraries and focusing instead on how to manage different domain needs using labels, which can be associated with every kind of term. The mapping of CLP into Labelled Prolog is also discussed, as well as the benefits of the proposed approach.
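The CLP(FD) solving style described above alternates constraint propagation (pruning variable domains) with labelling (search over the pruned domains). A minimal sketch with a hand-rolled constraint store, purely illustrative of the idea and not the 2P-Kt or Labelled Prolog API:

```python
# Tiny finite-domain propagation sketch: X + Y = 10, X < Y, X, Y in 1..9.
# Invented helper names; a real CLP(FD) engine runs propagators to fixpoint.
domains = {"X": set(range(1, 10)), "Y": set(range(1, 10))}

def propagate_sum_eq(a, b, total):
    """Prune values inconsistent with a + b == total."""
    domains[a] = {v for v in domains[a] if total - v in domains[b]}
    domains[b] = {w for w in domains[b] if total - w in domains[a]}

def propagate_less(a, b):
    """Prune values inconsistent with a < b (bounds reasoning)."""
    domains[a] = {v for v in domains[a] if v < max(domains[b])}
    domains[b] = {w for w in domains[b] if w > min(domains[a])}

propagate_sum_eq("X", "Y", 10)
propagate_less("X", "Y")           # prunes X to 1..8 and Y to 2..9

# Labelling: search enumerates the remaining consistent assignments.
solutions = [(x, y) for x in sorted(domains["X"])
             for y in sorted(domains["Y"]) if x + y == 10 and x < y]
print(solutions)   # [(1, 9), (2, 8), (3, 7), (4, 6)]
```

Propagation alone only narrows the domains; it is the interleaving with labelling that yields the solutions, which is the declarative solve pattern the CLP(FD) library exposes for problems like school timetabling.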

Relevância:

20.00%

Publicador:

Resumo:

An important approach to cancer therapy is the design of small-molecule modulators that interfere with microtubule dynamics through their specific binding to the β-subunit of tubulin. In the present work, comparative molecular field analysis (CoMFA) studies were conducted on a series of discodermolide analogs with antimitotic properties. Significant correlation coefficients were obtained (CoMFA(i), q² = 0.68, r² = 0.94; CoMFA(ii), q² = 0.63, r² = 0.91), indicating the good internal and external consistency of the models generated using two independent structural alignment strategies. The models were externally validated employing a test set, and the predicted values were in good agreement with the experimental results. The final QSAR models and the 3D contour maps provided important insights into the chemical and structural basis involved in the molecular recognition process of this family of discodermolide analogs, and should be useful for the design of new specific β-tubulin modulators with potent anticancer activity.
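As a reminder of what the quoted statistics mean: q² is the leave-one-out cross-validated counterpart of r², computed as q² = 1 − PRESS/SS, where PRESS sums the squared leave-one-out prediction errors and SS the squared deviations from the mean activity. The sketch below illustrates the computation with a plain least-squares line and invented toy data; the actual CoMFA models use PLS on 3D molecular-field descriptors.

```python
# Leave-one-out q^2 for a simple 1-D least-squares model (toy data, not
# the paper's discodermolide set).
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 7.8), (5.0, 10.1)]

def fit(points):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

mean_y = sum(y for _, y in data) / len(data)
ss = sum((y - mean_y) ** 2 for _, y in data)       # total sum of squares
press = 0.0
for i, (x, y) in enumerate(data):                  # leave-one-out loop
    a, b = fit(data[:i] + data[i + 1:])            # refit without sample i
    press += (y - (a * x + b)) ** 2                # predict the held-out y
q2 = 1.0 - press / ss
print(round(q2, 3))
```

Because each held-out point must be predicted by a model that never saw it, q² is always at most the fitted r², which is why QSAR practice reports both (internal predictivity vs. fit quality).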

Relevância:

20.00%

Publicador:

Resumo:

The aim of this study was to evaluate the stress distribution in the cervical region of a sound upper central incisor under two clinical situations, standard and maximum masticatory forces, by means of a 3D model with the highest possible level of fidelity to the anatomic dimensions. Two models with 331,887 linear tetrahedral elements, representing a sound upper central incisor with periodontal ligament and cortical and trabecular bone, were loaded at 45° in relation to the tooth's long axis. All structures were considered homogeneous and isotropic, with the exception of the enamel (anisotropic). A standard masticatory force (100 N) was simulated on one of the models, and a maximum masticatory force (235.9 N) on the other. The software packages used were PATRAN for pre- and post-processing and Nastran for processing. In the cementoenamel junction area, tensile stresses reached 14.7 MPa in the 100 N model and 40.2 MPa in the 235.9 N model, the latter exceeding the enamel's tensile strength (16.7 MPa). The fact that the stress concentration in the amelodentinal junction exceeded the enamel's tensile strength under simulated conditions of maximum masticatory force suggests the possibility of the occurrence of non-carious cervical lesions such as abfractions.

Relevância:

20.00%

Publicador:

Resumo:

We analyze the breaking of Lorentz invariance in a 3D model of fermion fields self-coupled through four-fermion interactions. The low-energy limit of the theory contains various submodels which are similar to those used in the study of graphene or in the description of irrational charge fractionalization.

Relevância:

20.00%

Publicador:

Resumo:

The knowledge of the atomic structure of clusters composed of a few atoms is a basic prerequisite to obtain insights into the mechanisms that determine their chemical and physical properties as a function of diameter, shape, and surface termination, as well as to understand the mechanism of bulk formation. Due to the wide use of metal systems in our modern life, the accurate determination of the properties of 3d, 4d, and 5d metal clusters poses a huge problem for nanoscience. In this work, we report a density functional theory study of the atomic structure, binding energies, effective coordination numbers, average bond lengths, and magnetic properties of the 13-atom clusters, M(13), of the 3d, 4d, and 5d metals (30 elements). First, a set of lowest-energy local minimum structures (as supported by vibrational analysis) was obtained by combining high-temperature first-principles molecular-dynamics simulation, structure crossover, and the selection of five well-known M(13) structures. Several new lower-energy configurations were identified, e.g., for Pd(13), W(13), Pt(13), etc., and previously known structures were confirmed by our calculations. Furthermore, the following trends were identified: (i) compact icosahedral-like forms occur at the beginning of each metal series, more open structures such as hexagonal-bilayer-like and double simple-cubic layers at the middle of each metal series, and structures with an increasing effective coordination number for large d-state occupations. (ii) For Au(13), we found that spin-orbit coupling favors the three-dimensional (3D) structures, i.e., a 3D structure is about 0.10 eV lower in energy than the lowest-energy known two-dimensional configuration. (iii) The magnetic exchange interactions play an important role for particular systems such as Fe, Cr, and Mn. 
(iv) The analysis of the binding energies and average bond lengths shows a parabolic-like shape as a function of the occupation of the d states; hence, most of the properties can be explained by the chemical picture of the occupation of bonding and antibonding states.