926 results for Complex engineering problems


Relevance: 30.00%

Publisher:

Abstract:

In this paper we review the novel meccano method. We summarize the main stages (subdivision, mapping, optimization) of this automatic tetrahedral mesh generation technique and focus the study on complex genus-zero solids. In this case, our procedure requires only a surface triangulation of the solid. A crucial consequence of our method is the volume parametrization of the solid to a cube. Using this result, we construct volume T-meshes for isogeometric analysis. The efficiency of the proposed technique is shown with several examples, and a comparison between the meccano method and standard mesh generation techniques is presented…


Isogeometric analysis (IGA) has arisen as an attempt to unify the fields of CAD and classical finite element methods. The main idea of IGA is to use for analysis the same functions (splines) that are used in the CAD representation of the geometry. The main advantages with respect to the traditional finite element method are the higher smoothness of the numerical solution and a more accurate representation of the geometry. IGA seems to be a promising tool with a wide range of applications in engineering. However, this relatively new technique has some open problems that require a solution. In this work we present our results and contributions to this issue…
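The spline machinery that IGA borrows from CAD can be illustrated with a small sketch: a Cox-de Boor recursion for B-spline basis functions. This is not code from the work above; the knot vector and degree are chosen only for this example.

```python
# Illustrative Cox-de Boor recursion for B-spline basis functions
# (knot vector and degree below are example choices, not from the thesis).
def bspline_basis(i, p, u, knots):
    """Value at u of the i-th B-spline basis function of degree p."""
    if p == 0:
        return 1.0 if knots[i] <= u < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = ((u - knots[i]) / (knots[i + p] - knots[i])
                * bspline_basis(i, p - 1, u, knots))
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = ((knots[i + p + 1] - u) / (knots[i + p + 1] - knots[i + 1])
                 * bspline_basis(i + 1, p - 1, u, knots))
    return left + right

# Quadratic (p = 2) basis on a clamped knot vector: inside the domain
# the basis functions sum to one (partition of unity).
knots = [0, 0, 0, 1, 2, 3, 3, 3]
n_basis = len(knots) - 2 - 1
total = sum(bspline_basis(i, 2, 1.5, knots) for i in range(n_basis))
print(round(total, 10))  # -> 1.0
```

The partition-of-unity and smoothness properties checked here are exactly what makes these functions attractive both for geometry representation and for analysis.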


This thesis describes modelling tools and methods suited for complex systems, i.e. systems that are typically represented by a plurality of models. The basic idea is that all models representing the system should be linked by well-defined model operations, in order to build a structured repository of information: a hierarchy of models. The port-Hamiltonian framework is a good candidate for this kind of problem, as it supports the most important model operations natively. The thesis in particular addresses the problem of integrating distributed-parameter systems into a model hierarchy, and shows two possible mechanisms to do so: a finite-element discretization in port-Hamiltonian form, and a structure-preserving model order reduction for discretized models obtainable from commercial finite-element packages.
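The port-Hamiltonian form mentioned above can be sketched minimally: a linear system dx/dt = (J - R) grad H(x), here a mass-spring-damper with invented parameter values (this is only an illustration of the structure, not the thesis's tooling). The skew-symmetric J encodes lossless interconnection, the positive semi-definite R encodes dissipation, so the stored energy H can only decrease.

```python
import numpy as np

# Illustrative port-Hamiltonian system: mass-spring-damper with state
# x = (q, p), Hamiltonian H = p^2/(2m) + k q^2 / 2, written as
# dx/dt = (J - R) grad H(x). Parameter values are invented.
m, k, d = 1.0, 4.0, 0.3                     # mass, stiffness, damping
J = np.array([[0.0, 1.0], [-1.0, 0.0]])     # skew-symmetric interconnection
R = np.array([[0.0, 0.0], [0.0, d]])        # positive semi-definite dissipation

def grad_H(x):
    q, p = x
    return np.array([k * q, p / m])

def hamiltonian(x):
    q, p = x
    return p**2 / (2 * m) + k * q**2 / 2

# Explicit Euler integration; dissipation drains the stored energy.
x = np.array([1.0, 0.0])
dt = 1e-3
energies = [hamiltonian(x)]
for _ in range(5000):
    x = x + dt * (J - R) @ grad_H(x)
    energies.append(hamiltonian(x))
print(energies[0] > energies[-1])  # energy decreases along the trajectory
```

The point of the structure is that energy bookkeeping survives interconnection: composing two such systems through their ports again yields a system of the same form.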


In this work, new methods for the synthesis of inorganic materials with novel architectures on the micrometer and nanometer scale are described. Shaping is based on the template-induced deposition of the inorganic materials on self-assembled monolayers. Suitable substrates are gold-coated glass slides and gold colloids, which occupy an intermediate position between the world of atoms and molecules and the macroscopic world of extended solids. On these substrates, thiols adsorb into a monomolecular layer, thereby changing the surface properties of the substrate. A particular focus of this work is the synthesis of thiols tailored to the needs of each application. In the first part of the work, gold-coated glass surfaces were used as templates. The deposition of calcium carbonate was studied as a function of the thickness of the adsorbed monolayer. Aragonite, one of the three main phases of the calcium carbonate system, was deposited under mild conditions on polyaromatic amide surfaces with layer thicknesses of 5-400 nm. The adjustable parameters were the chain length of the polymer, the ω-substituent, the binding to the gold surface via different aminothiols, and the crystallization temperature. The thickness of the polymer films was controlled through an automated synthesis cycle. Titanium oxide films could be patterned on surfaces; for this, a specially synthesized thiol was used that combined the functionality of a styrene unit at the surface boundary with a means of later removal from the surface.
PDMS stamping produced microstructures on the gold surface in the range of 5 to 10 µm, which in turn could be transferred into the titanium oxide film via polymerization and deposition of the polymer. Three-dimensional structures were obtained using gold colloid templates. Tetraethylene glycol was monofunctionalized by exchanging one hydroxyl group for a thiol group. The resulting molecule self-assembled on colloidal gold, yielding a water-soluble gold colloid; the preparation proceeded in a single-phase reaction. The gold colloids thus obtained were used as crystallization templates for the three-dimensional deposition of calcium carbonate. It was found that the glycol modifies the crystallization, i.e. the habit of the crystal, at low pH. At elevated pH (pH = 12), however, the glycol-coated gold colloids act as templates and lead to spherical aggregates. Exposing gold colloids to long-chain dithiols leads to aggregation and precipitation of the colloids, owing to the cross-linking of several gold colloids by the thiol groups of the alkyldithiols. To avoid this, a half-protected dithiol was synthesized in this work, by means of which the aggregation could be suppressed. Subsequent deprotection of the thiol function yielded gold colloids whose surface was thiol-functionalized. These thiol-active gold colloids served as templates for the deposition of lead sulfide from organic/aqueous solution. The mode of action of the protecting group and the deprotection could be demonstrated by plasmon resonance spectroscopy. Titanium oxide / gold / polystyrene composites in tubular form were synthesized, using a human hair as the biological template for shaping.
By coating the hair with gold, assembling a styrene monomer that additionally carried a thiol functionality, polymerizing on the surface, depositing the titanium oxide film, and finally dissolving the biological template, a tubular structure on the micrometer scale was obtained. Gold colloids served in this work not only as crystallization templates and shaping agents; they were themselves modified so as to form wire-like agglomerates on the nanometer scale. For this purpose, silicon dioxide templates were used: on the one hand, nanotubes of amorphous SiO2 prepared by a sol-gel method; on the other, biological silica hollow needles isolated from marine sponges. Gold colloids were embedded in the hollow structures, and the structure was consolidated by forming colloid-thiol networks through the addition of dithiols. The gold nanowires, in the range of 100 to 500 nm, were released by dissolving the SiO2 template.


Broad consensus has been reached within the Education and Cognitive Psychology research communities on the need to center the learning process on experimentation and the concrete application of knowledge, rather than on a bare transfer of notions. Several advantages arise from this educational approach, ranging from the reinforcement of students' learning, to the increased opportunity for students to gain greater insight into the studied topics, up to the possibility for learners to acquire practical skills and long-lasting proficiency. This is especially true in Engineering education, where integrating conceptual knowledge and practical skills assumes strategic importance. In this scenario, learners are called to play a primary role: they are actively involved in the construction of their own knowledge, instead of passively receiving it. As a result, traditional, teacher-centered learning environments should be replaced by novel learner-centered solutions. Information and Communication Technologies enable the development of innovative solutions that answer the need for experimentation supports in educational contexts. Virtual Laboratories, Adaptive Web-Based Educational Systems and Computer-Supported Collaborative Learning environments can significantly foster different learner-centered instructional strategies, offering the opportunity to enhance personalization, individualization and cooperation. More specifically, they allow students to explore different kinds of materials, to access and compare several information sources, to face real or realistic problems and to work on authentic and multi-faceted case studies. In addition, they encourage cooperation among peers and provide support through coached and scaffolded activities aimed at fostering reflection and meta-cognitive reasoning.
This dissertation guides readers within this research field, presenting both the theoretical and applied results of research aimed at designing an open, flexible, learner-centered virtual lab for supporting students in learning Information Security.


In a large number of problems, the high dimensionality of the search space, the vast number of variables and economic constraints limit the ability of classical techniques to reach the optimum of a function, known or unknown. In this thesis we investigate the possibility of combining approaches from advanced statistics with optimization algorithms, so as to better explore the combinatorial search space and to increase performance. To this purpose we propose two methods: (i) Model Based Ant Colony Design and (ii) Naïve Bayes Ant Colony Optimization. We test the performance of the two proposed solutions in a simulation study and apply the novel techniques to an application in the field of Enzyme Engineering and Design.
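The general ant colony idea behind these methods can be sketched on a toy problem. The following is not the thesis's MACD or Naïve Bayes ACO algorithm; the objective ("one-max") and all parameter values are invented for illustration. Ants sample binary strings from pheromone values, and pheromone is reinforced toward the best string found so far.

```python
import random

# Toy ant colony optimization sketch (invented problem and parameters,
# not the thesis's algorithms). Objective: maximise the number of ones.
random.seed(0)
n_bits, n_ants, rho = 20, 10, 0.2    # string length, colony size, evaporation
pher = [0.5] * n_bits                 # pheromone = probability of drawing a 1

def build_solution():
    return [1 if random.random() < p else 0 for p in pher]

best = build_solution()
first_score = sum(best)
for _ in range(100):
    ants = [build_solution() for _ in range(n_ants)]
    iteration_best = max(ants, key=sum)
    if sum(iteration_best) > sum(best):
        best = iteration_best
    # evaporate, then deposit pheromone along the best-so-far solution
    pher = [(1 - rho) * p + rho * b for p, b in zip(pher, best)]
print(sum(best))  # improves monotonically from first_score toward 20
```

The statistical model-building step of the thesis's methods replaces this naive per-bit pheromone with a learned probabilistic model of good solutions; the sample-evaluate-reinforce loop stays the same.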


Two of the main features of today's complex software systems, like pervasive computing systems and Internet-based applications, are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control: systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution: entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution: interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic since they allow components to be updated, added, or removed while the system is running. The engineering of open and distributed computational systems mandates the adoption of a software infrastructure whose underlying model and technology provide the required level of uncoupling among system components. This is the main motivation behind current research trends in the area of coordination middleware that exploit tuple-based coordination models in the engineering of complex software systems, since such models intrinsically provide coordinated components with communication uncoupling. An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely, scenarios where most of the activities are based on knowledge in some form, and where knowledge becomes the prominent means by which systems get coordinated.
Handling knowledge in tuple-based systems induces problems in terms of syntax (e.g., two tuples containing the same data may not match due to differences in the tuple structure) and, mostly, of semantics (e.g., two tuples representing the same information may not match because they adopt different syntaxes). Until now, the problem has been faced by exploiting tuple-based coordination within a middleware for knowledge-intensive environments: e.g., experiments with tuple-based coordination within a Semantic Web middleware, and analogous approaches surveyed in the literature. However, such approaches appear to be designed to tackle coordination for specific application contexts, like the Semantic Web and Semantic Web Services, and they result in rather involved extensions of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, the model and technology of semantic tuple centres were developed. The tuple centre model is adopted as the main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space, where the behaviour of the tuple space can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. The tuple centre model was then semantically enriched: a main design choice in this work was not to completely redesign the existing syntactic tuple space model, but rather to provide a smooth extension that, although supporting semantic reasoning, keeps tuple representation and tuple matching as simple as possible. By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components.
The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model based on an existent coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem to be suitable as coordination media.
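The programmable tuple space idea recalled above can be illustrated with a toy sketch: a Linda-like space offering out/rd/in primitives, extended with reactions that fire on interaction events. This is not the thesis's actual infrastructure; the temperature tuples and the reaction are invented.

```python
# Toy "tuple centre" sketch: a Linda-style tuple space whose out
# operation triggers reactions registered as coordination laws.
# (Illustrative only; not the thesis's semantic tuple centre technology.)
class TupleCentre:
    def __init__(self):
        self.tuples = []
        self.reactions = []              # (event name, template, body)

    def add_reaction(self, event, template, body):
        self.reactions.append((event, template, body))

    def _matches(self, template, tup):
        # None in a template acts as a wildcard field
        return len(template) == len(tup) and all(
            t is None or t == v for t, v in zip(template, tup))

    def _fire(self, event, tup):
        for ev, template, body in self.reactions:
            if ev == event and self._matches(template, tup):
                body(self, tup)

    def out(self, tup):                  # insert a tuple, then react
        self.tuples.append(tup)
        self._fire("out", tup)

    def rd(self, template):              # non-destructive read
        for tup in self.tuples:
            if self._matches(template, tup):
                return tup
        return None

    def in_(self, template):             # destructive read
        tup = self.rd(template)
        if tup is not None:
            self.tuples.remove(tup)
        return tup

tc = TupleCentre()
# Coordination law encapsulated in the medium: whenever a raw reading
# is emitted, derive a normalized tuple for the consumers.
tc.add_reaction("out", ("temp", None),
                lambda c, t: c.tuples.append(("temp_f", t[1] * 9 / 5 + 32)))
tc.out(("temp", 100))
print(tc.rd(("temp_f", None)))  # -> ('temp_f', 212.0)
```

The semantic enrichment described in the thesis would replace the purely structural `_matches` above with matching against a semantic representation of the domain of discourse, leaving the out/rd/in interface untouched.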


In this thesis we made the first steps towards the systematic application of a methodology for automatically building formal models of complex biological systems. Such a methodology could also be useful to design artificial systems possessing desirable properties such as robustness and evolvability. The approach we follow in this thesis is to manipulate formal models by means of adaptive search methods called metaheuristics. In the first part of the thesis we develop state-of-the-art hybrid metaheuristic algorithms to tackle two important problems in genomics, namely, Haplotype Inference by parsimony and the Founder Sequence Reconstruction Problem. We compare our algorithms with other effective techniques in the literature, show the strengths and limitations of our approaches to various problem formulations and, finally, propose further enhancements that could improve the performance of our algorithms and widen their applicability. In the second part, we concentrate on Boolean network (BN) models of gene regulatory networks (GRNs). We detail our automatic design methodology and apply it to four use cases which correspond to different design criteria and address some limitations of GRN modelling by BNs. Finally, we tackle the Density Classification Problem with the aim of showing the learning capabilities of BNs. Experimental evaluation of this methodology shows its efficacy in producing networks that meet our design criteria. Our results, consistently with what has been found in other works, also suggest that networks manipulated by a search process exhibit a mixture of characteristics typical of different dynamical regimes.
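A Boolean network of the kind mentioned above can be sketched in a few lines: nodes hold binary states and update synchronously through Boolean functions, and every trajectory eventually falls into an attractor. The three node functions below are invented for illustration; the thesis designs such networks automatically via metaheuristic search.

```python
import itertools

# Minimal synchronous Boolean network sketch (node functions invented).
functions = {
    0: lambda s: s[1] and not s[2],
    1: lambda s: s[0] or s[2],
    2: lambda s: s[0],
}

def step(state):
    """Synchronous update: every node reads the current state at once."""
    return tuple(int(bool(functions[i](state))) for i in range(len(state)))

def attractor(state):
    """Iterate until a state repeats; return the cycle in canonical rotation."""
    seen, trajectory = {}, []
    while state not in seen:
        seen[state] = len(trajectory)
        trajectory.append(state)
        state = step(state)
    cycle = trajectory[seen[state]:]
    k = cycle.index(min(cycle))          # same cycle from any entry point
    return tuple(cycle[k:] + cycle[:k])

# Exhaustive sweep of the 2^3 initial states: this particular network
# has one fixed point and one length-3 cycle.
cycles = {attractor(s) for s in itertools.product((0, 1), repeat=3)}
print(sorted(len(c) for c in cycles))  # -> [1, 3]
```

Attractors are the objects a design methodology typically constrains (e.g., requiring fixed points corresponding to cell types), so an automatic search manipulates the node functions until the attractor landscape meets the design criteria.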


The Web is constantly evolving: thanks to the 2.0 transition, HTML5's new features and the advent of cloud computing, the gap between Web applications and traditional desktop applications is narrowing. Web apps are more and more widespread and bring several benefits compared to traditional ones. On the other hand, the reference technologies, JavaScript primarily, are not keeping pace, so a paradigm shift is taking place in Web programming, and many new languages and technologies are emerging. The first objective of this thesis is to survey the reference and state-of-the-art technologies for client-side Web programming, focusing in particular on concurrency and asynchronous programming. Taking into account the problems that affect existing technologies, we finally design simpAL-web, an innovative approach to Web-app development based on the agent-oriented programming abstraction and the simpAL language.


Synthetic biology has recently seen great development: many papers have been published and many applications have been presented, spanning from the production of biopharmaceuticals to the synthesis of bioenergetic substrates or industrial catalysts. Despite these advances, most applications are quite simple and do not fully exploit the potential of this discipline. This limitation in complexity has many causes, like the incomplete characterization of some components, or the intrinsic variability of biological systems, but one of the most important reasons is the inability of the cell to sustain the additional metabolic burden introduced by a complex circuit. The objective of the project of which this work is part is to address this problem through the engineering of a multicellular behaviour in prokaryotic cells. This system will introduce a cooperative behaviour that allows complex functionalities, which cannot be obtained with a single cell, to be implemented. In particular, the goal is to implement Leader Election, a procedure first devised in the field of distributed computing to identify a single process as the organizer and coordinator of a series of tasks assigned to the whole population. The election of the leader greatly simplifies the computation by providing centralized control. Furthermore, this system may even be useful to evolutionary studies that aim to explain how complex organisms evolved from unicellular systems. The work presented here describes, in particular, the design and experimental characterization of a component of the circuit that solves the Leader Election problem. This module, composed of a hybrid promoter and a gene, is activated in the non-leader cells after they receive the signal that a leader is present in the colony.
The most important element, in this case, is the hybrid promoter. It has been realized in different versions, applying the heuristic rules stated in [22], and their activity has been experimentally tested. The objective of the experimental characterization was to test the response of the genetic circuit to the introduction, in the cellular environment, of particular molecules, inducers, that can be considered inputs of the system. The desired behaviour is similar to that of a logic AND gate, in which the output, represented by the luminous signal produced by a fluorescent protein, is one only in the presence of both inducers. The robustness and stability of this behaviour have been tested by changing the concentrations of the input signals and building dose-response curves. From these data it is possible to conclude that the analysed constructs have an AND-like behaviour over a wide range of inducer concentrations, even if many differences can be identified in the expression profiles of the different constructs. This variability reflects the fact that the input and output signals are continuous, so their binary representation is not able to capture the full complexity of the behaviour. The module of the circuit considered in this analysis has a fundamental role in the realization of the intercellular communication system that is necessary for the cooperative behaviour to take place. For this reason, the second phase of the characterization focused on the analysis of signal transmission. In particular, the interaction between this element and the one responsible for emitting the chemical signal was tested. The desired behaviour is still similar to a logic AND, since, even in this case, the output signal is determined by the hybrid promoter activity. The experimental results demonstrated that the systems behave correctly, even if there is still substantial variability between them.
The dose-response curves highlighted that stricter constraints on the inducer concentrations need to be imposed in order to obtain a clear separation between the two levels of expression. In the concluding chapter the DNA sequences of the hybrid promoters are analysed, trying to identify the regulatory elements that are most important for determining gene expression. Given the available data, it was not possible to draw definitive conclusions. In the end, a few considerations on promoter engineering and the realization of complex circuits are presented. This section briefly recalls some of the problems outlined in the introduction and provides a few possible solutions.
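The AND-like, continuous dose-response behaviour described above is commonly modelled phenomenologically as a product of Hill functions. The sketch below uses invented parameter values (it is not fitted to the data discussed in the work): promoter activity is high only when both inducers are near saturation, and graded in between.

```python
# Phenomenological AND-gate promoter model: the output rises only when
# BOTH inducers are present, modelled as a product of two Hill functions.
# (All parameter values are invented for illustration, not fitted data.)
def hill(x, K, n):
    """Fraction of activation at inducer concentration x."""
    return x**n / (K**n + x**n)

def promoter_activity(inducer_a, inducer_b, K_a=1.0, K_b=1.0, n=2):
    return hill(inducer_a, K_a, n) * hill(inducer_b, K_b, n)

# Coarse dose-response grid: output is high only in the corner where
# both inducers saturate, an AND-like truth table with graded levels.
low, high = 0.01, 100.0
for a in (low, high):
    for b in (low, high):
        print(a, b, round(promoter_activity(a, b), 4))
```

Sweeping the concentrations continuously instead of using just two levels reproduces the kind of dose-response curves used in the characterization, including the intermediate expression levels that a binary description misses.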


Flood disasters are a major cause of fatalities and economic losses, and several studies indicate that global flood risk is currently increasing. In order to reduce and mitigate the impact of river flood disasters, the current trend is to integrate existing structural defences with non-structural measures. This calls for a wider application of advanced hydraulic models for flood hazard and risk mapping, engineering design, and flood forecasting systems. Within this framework, two different hydraulic models for large-scale analysis of flood events have been developed. The two models, named CA2D and IFD-GGA, adopt an integrated approach based on the diffusive shallow water equations and a simplified finite volume scheme. The models are also designed for massive code parallelization, which is of key importance in reducing run times in large-scale and high-detail applications. The two models were first applied to several numerical cases, to test the reliability and accuracy of different model versions. Then, the most effective versions were applied to different real flood events and flood scenarios. The IFD-GGA model showed serious problems that prevented further applications. On the contrary, the CA2D model proved to be fast and robust, and able to reproduce 1D and 2D flow processes in terms of water depth and velocity. In most applications the accuracy of the model results was good and adequate for large-scale analysis. Where complex flow processes occurred, local errors were observed, due to the model approximations; however, they did not compromise the correct representation of the overall flow processes. In conclusion, the CA2D model can be a valuable tool for the simulation of a wide range of flood event types, including lowland and flash flood events.


This thesis deals with the study of optimal control problems for the incompressible Magnetohydrodynamics (MHD) equations. Particular attention to these problems arises from several applications in science and engineering, such as fission nuclear reactors with liquid metal coolant and aluminum casting in metallurgy. In such applications it is of great interest to achieve control of the fluid state variables through the action of the magnetic Lorentz force. In this thesis we investigate a class of boundary optimal control problems, in which the flow is controlled through the boundary conditions of the magnetic field. Due to their complexity, these problems present various challenges in the definition of an adequate solution approach, both from a theoretical and from a computational point of view. In this thesis we propose a new boundary control approach, based on lifting functions of the boundary conditions, which yields both theoretical and numerical advantages. With the introduction of lifting functions, boundary control problems can be formulated as extended distributed problems. We consider a systematic mathematical formulation of these problems in terms of the minimization of a cost functional constrained by the MHD equations. The existence of a solution to the flow equations and to the optimal control problem is shown. The Lagrange multiplier technique is used to derive an optimality system from which candidate solutions for the control problem can be obtained. In order to achieve the numerical solution of this system, a finite element approximation is considered for the discretization, together with an appropriate gradient-type algorithm. A finite element object-oriented library has been developed to obtain a parallel and multigrid computational implementation of the optimality system based on a multiphysics approach.
Numerical results of two- and three-dimensional computations show that a possible minimum for the control problem can be computed in a robust and accurate manner.


The study aims at providing a framework conceptualizing patenting activities under the condition of intellectual property rights fragmentation. Such a framework has to deal with the interrelated problems of technological complexity in the modern patent landscape. In that respect, ex-post licensing agreements have been incorporated into the analysis. More precisely, by consolidating the rights to use the patents required for commercialization of a product, private market solutions such as cross-licensing agreements and patent pools help firms to overcome problems triggered by intellectual property rights fragmentation. However, such private bargaining between parties cannot be isolated from the legal framework. A result of this analysis is that policies ignoring market solutions and focusing only on static gains can undermine the dynamic efficiency gains induced by the patent system. The evidence found in this thesis supports the view that legal reforms aiming to decrease the degree of patent protection, or to lift it altogether, can hamper the functioning of the current system.


Constructing ontology networks typically occurs at design time at the hands of knowledge engineers who assemble their components statically. There are, however, use cases where ontology networks need to be assembled upon request and processed at runtime, without altering the stored ontologies and without tampering with one another. These are what we call "virtual [ontology] networks", and keeping track of how an ontology changes in each virtual network is called "multiplexing". Issues may arise from the connectivity of ontology networks. In many cases, simple flat import schemes will not work, because many ontology managers can cause property assertions to be erroneously interpreted as annotations and ignored by reasoners. Also, multiple virtual networks should optimize their cumulative memory footprint, and where they cannot, this should occur only for very limited periods of time. We claim that these problems should be handled by the software that serves these ontology networks, rather than by ontology engineering methodologies. We propose a method that spreads multiple virtual networks across a 3-tier structure and can reduce the number of erroneously interpreted axioms, under certain raw statement distributions across the ontologies. We assumed OWL as the core language handled by semantic applications in the framework at hand, due to the greater availability of reasoners and rule engines. We also verified that, in common OWL ontology management software, OWL axiom interpretation occurs in the worst-case scenario of pre-order visit. To measure the effectiveness and space-efficiency of our solution, a Java and RESTful implementation was produced within an Apache project. We verified that a 3-tier structure can accommodate reasonably complex ontology networks better, in terms of the expressivity of OWL axiom interpretation, than flat-tree import schemes can.
We measured both the memory overhead of the additional components we put on top of traditional ontology networks, and the framework's caching capabilities.


Copper(I) halide clusters have recently been considered good candidates for optoelectronic devices such as OLEDs. Although copper halide clusters, in particular copper iodide clusters, have been well known since the beginning of the 20th century, interest in these compounds grew dramatically only in the late '70s, due to their particular photophysical behaviour. These complexes are characterized by dual triplet emission bands, named cluster-centred (3CC) and halogen-to-ligand charge transfer (3XLCT), whose intensities are strictly related to temperature. The CC transition, due to the presence of a metallophilic interaction, prevails at ambient temperature, while the XLCT transition, located preferentially on the ligand part, becomes more prominent at low temperature. Since these pioneering works, it has been clear that the photophysical properties of these compounds are more interesting in the solid state than in solution, with an improvement in emission efficiency. In this work we aim to characterize organocopper(I) iodide compounds in the solid state, to evaluate the correlation between the molecular crystal structure and the photophysical properties. We also consider new strategies to synthesize CuI complexes, from wet reactions to greener solvent-free methods. The advantages of these strategies are evident but, from these batches, obtaining a single crystal suitable for SCXRD analysis is nearly impossible. The structure solution remains the key point of this research, so we tackle this problem by solving the structure from X-ray powder diffraction data. Once the sample was fully characterized, we moved on to the design and development of the associated OLED device. Since copper iodide complexes are often insoluble in organic solvents, the high-vacuum deposition technique is preferred.
A new non-conventional deposition process has also been proposed to circumvent the low stability of the complexes in this practice, with in-situ complex formation in a layer-by-layer deposition route.