21 results for Arrowhead, interoperability, soa, internet of things, smart spaces, api, simulation

at Université de Lausanne, Switzerland


Relevância:

100.00%

Publicador:

Resumo:

The motivation for this research originated in the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and for business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operations, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into one information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption system-on-chip (SoC) architecture. Unlike in the CISC world, RISC processor architecture design is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choice through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies, both in software and in hardware. This has driven the microprocessor-based personal computer industry, with its few dominating closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base due to strong technology-enabled customer lock-in and customers' high risk leverage, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation technology (ICAT) industry. Lately the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and second through research on process re-engineering in the case of complex-system global software support. Third, we investigate the views of the industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and we conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining its own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and it will eventually also impact industrial automation through game-changing commoditization and the related control-point and business-model changes. This trend will inevitably continue, but the transition to a commoditized industrial automation will not happen in the near future.

Computational modeling has become a widely used tool for unraveling the mechanisms of higher-level cooperative cell behavior during vascular morphogenesis. However, experimenting with published simulation models or adding new assumptions to those models can be daunting for novice and even experienced computational scientists. Here, we present a step-by-step, practical tutorial for building cell-based simulations of vascular morphogenesis using the Tissue Simulation Toolkit (TST). The TST is a freely available, open-source C++ library for developing simulations with the two-dimensional cellular Potts model, a stochastic, agent-based framework to simulate collective cell behavior. We first show the basic use of the TST to simulate and experiment with published simulations of vascular network formation. Then, we present step-by-step instructions and explanations for building a recent simulation model of tumor angiogenesis. Demonstrated mechanisms include cell-cell adhesion, chemotaxis, cell elongation, haptotaxis, and haptokinesis.
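For readers new to the formalism, the cellular Potts model at the heart of the TST can be sketched in a few dozen lines. The toy below (Python, with invented parameter values; the TST's actual C++ API differs) implements the standard ingredients: an adhesion energy on mismatched neighbor bonds, an area constraint per cell, and Metropolis copy attempts.

```python
import random
import numpy as np

# Minimal 2D cellular Potts model sketch. Energy:
#   H = J * (# of unlike 4-neighbour bonds) + LAM * (area - TARGET)^2 per cell.
# J, LAM, TARGET and T are assumed toy values, not TST defaults.
J, LAM, TARGET, T = 16.0, 1.0, 25, 10.0

def energy(grid):
    """Total energy on a periodic grid of cell IDs (0 = medium)."""
    n = grid.shape[0]
    h = 0.0
    for i in range(n):
        for j in range(n):
            for di, dj in ((1, 0), (0, 1)):        # each bond counted once
                if grid[i, j] != grid[(i + di) % n, (j + dj) % n]:
                    h += J
    for sigma in np.unique(grid):
        if sigma != 0:                             # no area term for the medium
            h += LAM * (float(np.sum(grid == sigma)) - TARGET) ** 2
    return h

def metropolis_step(grid):
    """One copy attempt: a random site tries to copy its ID into a neighbour."""
    n = grid.shape[0]
    i, j = random.randrange(n), random.randrange(n)
    di, dj = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
    ti, tj = (i + di) % n, (j + dj) % n
    if grid[i, j] == grid[ti, tj]:
        return False
    old = grid[ti, tj]
    e0 = energy(grid)
    grid[ti, tj] = grid[i, j]
    de = energy(grid) - e0
    if de > 0 and random.random() >= np.exp(-de / T):
        grid[ti, tj] = old                         # reject: restore old ID
        return False
    return True

random.seed(1)
grid = np.zeros((10, 10), dtype=int)
grid[3:8, 3:8] = 1                                 # one cell at its target area
for _ in range(200):
    metropolis_step(grid)
```

The TST itself provides optimized C++ classes for exactly these ingredients, plus the chemotaxis and elongation terms mentioned in the abstract.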

Purpose: To investigate the effect of incremental increases in intraocular straylight on threshold measurements made by three modern forms of perimetry: standard automated perimetry (SAP) using Octopus (Dynamic, G-pattern), Pulsar perimetry (PP) (TOP, 66 points) and the Moorfields Motion Displacement Test (MDT) (WEBS, 32 points). Methods: Four healthy young observers were recruited (mean age 26 yrs [25 yrs, 28 yrs]; refractive correction [+2 D, -4.25 D]). Five white opacity filters (WOF), each scattering light by a different amount, were used to create incremental increases in intraocular straylight (IS). The resultant IS values were measured with each WOF and at baseline (no WOF) for each subject using a C-Quant straylight meter (Oculus, Wetzlar, Germany). A 25-year-old has an IS value of ~0.85 log(s); an increase of 40% in IS, to 1.2 log(s), corresponds to the physiological value of a 70-year-old. Each WOF created an increase in IS of between 10% and 150% from baseline, ranging from effects similar to normal aging to those found with considerable cataract. Each subject underwent 6 test sessions over a 2-week period; each session consisted of the 3 perimetric tests using one of the five WOFs or baseline (both instrument and filter were randomised). Results: The reduction in sensitivity from baseline was calculated. A two-way ANOVA on mean change in threshold (with subjects as blocks and each increment in straylight filters as a column) was used to examine the effect of incremental increases in straylight. Both SAP (p<0.001) and Pulsar (p<0.001) were significantly affected by increases in straylight. The MDT (p=0.35) remained comparatively robust to increases in straylight. Conclusions: The Moorfields MDT measurement of threshold is robust to the effects of additional straylight compared with SAP and PP.
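The analysis above is a randomized block design: subjects are blocks (rows), straylight increments are treatments (columns), with one observation per cell. A minimal sketch of that F test, on invented data (the study's actual threshold changes are not reproduced here):

```python
import numpy as np

def block_anova_F(y):
    """F statistic for the treatment (column) effect in a randomized block
    design y[block, treatment] with one observation per cell."""
    b, t = y.shape
    grand = y.mean()
    ss_treat = b * np.sum((y.mean(axis=0) - grand) ** 2)
    ss_block = t * np.sum((y.mean(axis=1) - grand) ** 2)
    ss_err = np.sum((y - grand) ** 2) - ss_treat - ss_block
    df_treat, df_err = t - 1, (b - 1) * (t - 1)
    return ss_treat / df_treat / (ss_err / df_err), df_treat, df_err

# Toy data: 4 subjects x 6 straylight conditions, threshold change rising
# with each straylight increment (effect size and noise are invented).
rng = np.random.default_rng(0)
change = rng.normal(0.0, 0.5, size=(4, 6)) + np.arange(6) * 1.5
F, df1, df2 = block_anova_F(change)
```

The F value would then be compared against the F(df1, df2) distribution to obtain the p-values quoted above.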

The fact that individuals learn can change the relationship between genotype and phenotype in a population, and thus affect the evolutionary response to selection. Here we ask how males' ability to learn from female responses affects the evolution of a novel male behavioral courtship trait under pre-existing female preference (sensory drive). We assume a courtship trait that has both a genetic and a learned component, and a two-level female response to males. With individual-based simulations we show that, under this scenario, learning generally increases the strength of selection on the genetic component of the courtship trait, at least while the population genetic mean is still low. As a consequence, learning not only accelerates the evolution of the courtship trait but also enables it when the trait is costly, a situation that in the absence of learning results in an adaptive valley. Furthermore, learning can enable the evolution of the novel trait in the face of gene flow mediated by immigration of males that are more attractive to females based on another, non-heritable trait. However, rather than increasing monotonically with the speed of learning, the effect of learning on evolution is maximized at intermediate learning rates. This model shows that, at least under some scenarios, the ability to learn can drive the evolution of mating behaviors through a process equivalent to Waddington's genetic assimilation.
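A toy individual-based sketch of the core idea: each male carries a genetic display value g and, if he can learn, raises his expressed display after every female rejection (the two-level female response). All parameters and the fitness definition below are invented for illustration; this is not the paper's model.

```python
import numpy as np

def selection_on_g(g, learn_rate, trials=10, threshold=2.0, noise=0.1, seed=0):
    """Covariance between fitness and g, where fitness counts the courtship
    opportunities remaining after a male's first acceptance by a female."""
    rng = np.random.default_rng(seed)
    w = np.zeros(len(g))
    for k, gi in enumerate(g):
        display = gi
        for t in range(trials):
            if display + rng.normal(0.0, noise) > threshold:
                w[k] = trials - t        # earlier acceptance -> more matings
                break
            display += learn_rate        # learn from the rejection
    return np.cov(w, g)[0, 1]            # selection acts via cov(fitness, trait)

# Population genetic mean well below the female response threshold.
g = np.random.default_rng(42).normal(0.5, 0.3, size=400)
sel_learning = selection_on_g(g, learn_rate=0.3)
sel_no_learning = selection_on_g(g, learn_rate=0.0)
```

With the mean far below the threshold, non-learners almost never elicit a response, so fitness carries no information about g; learners convert a genetic head start into earlier acceptance, restoring selection on the genetic component, in the spirit of the result described above.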

Digitalization empowers the Internet by allowing several virtual representations of reality, including that of identity. We leave an increasingly digital footprint in cyberspace, and this situation puts our identity at high risk. Privacy is a right and a fundamental social value that could play a key role as a medium for securing digital identities. Identity functionality is increasingly delivered as sets of services, rather than as monolithic applications. An identity layer in which identity and privacy management services are loosely coupled, publicly hosted and available for on-demand calls is therefore a more realistic and acceptable situation. Identity and privacy should be interoperable and distributed through the adoption of service orientation and implementation based on open standards (technical interoperability). The objective of this project is to provide a way to implement interoperable, user-centric, digital-identity-related privacy that responds to the distributed nature of federated identity systems. It is recognized that technical initiatives, emerging standards and protocols are not enough to resolve the concerns surrounding the multi-faceted and complex issue of identity and privacy. For this reason they should be approached from a global perspective, through an integrated and multidisciplinary approach. This approach dictates that privacy law, policies, regulations and technologies be crafted together from the start, rather than attached to digital identity after the fact. Thus, we draw digital-identity-related privacy (DigIdeRP) requirements from global, domestic and business-specific privacy policies. The requirements take the shape of business interoperability.
We suggest a layered implementation framework (the DigIdeRP framework), in accordance with the model-driven architecture (MDA) approach, that helps an organization's security team turn business interoperability into technical interoperability in the form of a set of services that can accommodate a service-oriented architecture (SOA): a privacy-as-a-set-of-services (PaaSS) system. The DigIdeRP framework will serve as a basis for a shared understanding between business management and technical managers of digital-identity-related privacy initiatives. The layered DigIdeRP framework presents five practical layers as an ordered sequence forming the basis of a DigIdeRP project roadmap; in practice, however, an iterative process ensures that each layer effectively supports and enforces the requirements of the adjacent ones. Each layer is composed of a set of blocks, which determine a roadmap that a security team can follow to successfully implement PaaSS. Several blocks' descriptions are based on the OMG SoaML modeling language and on BPMN process descriptions. We identified, designed and implemented the seven services that form PaaSS and described their consumption. The PaaSS Java (J2EE) project, WSDL, and XSD code are given and explained.
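The "privacy as a set of services" idea can be illustrated with a minimal service registry: privacy functions are exposed as loosely coupled services that consumers locate and invoke on demand rather than link against. The service name below is illustrative only and is not one of the seven PaaSS services.

```python
from abc import ABC, abstractmethod

class PrivacyService(ABC):
    """Contract shared by all privacy services (compare a WSDL port type)."""
    @abstractmethod
    def invoke(self, request: dict) -> dict: ...

class ConsentService(PrivacyService):
    """Hypothetical example service; a real one would evaluate stored
    consent policies for the data subject."""
    def invoke(self, request: dict) -> dict:
        return {"subject": request["subject"], "consent": "granted"}

class Registry:
    """On-demand lookup keeps consumers decoupled from implementations,
    the loose coupling the abstract argues for."""
    def __init__(self):
        self._services = {}
    def publish(self, name: str, service: PrivacyService) -> None:
        self._services[name] = service
    def call(self, name: str, request: dict) -> dict:
        return self._services[name].invoke(request)

registry = Registry()
registry.publish("consent", ConsentService())
result = registry.call("consent", {"subject": "alice"})
```

In the actual PaaSS system this contract-first decoupling is expressed with SoaML service interfaces and WSDL/XSD artifacts rather than Python classes.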

The configuration space available to randomly cyclized polymers is divided into subspaces accessible to individual knot types. A phantom chain utilized in numerical simulations of polymers can explore all subspaces, whereas a real closed chain forming a figure-of-eight knot, for example, is confined to a subspace corresponding to this knot type only. One can conceptually compare the assembly of configuration spaces of various knot types to a complex foam where individual cells delimit the configuration space available to a given knot type. Neighboring cells in the foam harbor knots that can be converted into each other by just one intersegmental passage. Such a segment-segment passage occurring at the level of knotted configurations corresponds to a passage through the interface between neighboring cells in the foamy knot space. Using a DNA topoisomerase-inspired simulation approach, we characterize here the effective interface area between neighboring knot spaces as well as the surface-to-volume ratio of individual knot spaces. These results provide a reference system required for a better understanding of the mechanisms of action of various DNA topoisomerases.
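In a topoisomerase-inspired simulation of this kind, a candidate strand-passage event can be flagged whenever two non-adjacent chain segments approach within a cutoff. A minimal 3D segment-segment distance routine (clamped closest points; assumes non-degenerate segments) is one illustrative building block, not the authors' actual code:

```python
import numpy as np

def segment_distance(p1, q1, p2, q2):
    """Minimum distance between segments [p1,q1] and [p2,q2] in 3D."""
    d1, d2, r = q1 - p1, q2 - p2, p1 - p2
    a, e = d1 @ d1, d2 @ d2          # squared segment lengths
    b, c, f = d1 @ d2, d1 @ r, d2 @ r
    denom = a * e - b * b            # zero when segments are parallel
    s = np.clip((b * f - c * e) / denom, 0.0, 1.0) if denom > 1e-12 else 0.0
    t = (b * s + f) / e
    if t < 0.0:                      # clamp t, then re-clamp s
        t, s = 0.0, np.clip(-c / a, 0.0, 1.0)
    elif t > 1.0:
        t, s = 1.0, np.clip((b - c) / a, 0.0, 1.0)
    return float(np.linalg.norm((p1 + s * d1) - (p2 + t * d2)))

# Two segments crossing at a gap of 0.3: close enough to attempt a passage
# at a cutoff of, say, half a segment length.
gap = segment_distance(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                       np.array([0.5, -0.5, 0.3]), np.array([0.5, 0.5, 0.3]))
```

A passage move through such a near-contact corresponds to crossing the interface between neighboring cells of the foamy knot space described above.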

Progressive pseudorheumatoid dysplasia (PPRD) is a genetic, non-inflammatory arthropathy caused by recessive loss-of-function mutations in WISP3 (Wnt1-inducible signaling pathway protein 3; MIM 603400), which encodes a signaling protein. The disease is clinically silent at birth and in infancy. It manifests between the ages of 3 and 6 years with joint pain and progressive joint stiffness. Affected children are referred to pediatric rheumatologists and orthopedic surgeons; however, signs of inflammation are absent and anti-inflammatory treatment is of little help. Bony enlargement at the interphalangeal joints progresses, leading to camptodactyly. Spine involvement develops in late childhood and adolescence, leading to a short trunk with thoracolumbar kyphosis. Adult height is usually below the 3rd percentile. Radiographic signs are relatively mild. Platyspondyly develops in late childhood and can be the first clue to the diagnosis. Enlargement of the phalangeal metaphyses develops subtly and is usually recognizable by 10 years of age. The femoral heads are large, and the acetabulum forms a distinct "lip" overriding the femoral head. There is progressive narrowing of all articular spaces as articular cartilage is lost. Medical management of PPRD remains symptomatic and relies on pain medication. Hip joint replacement surgery in early adulthood is effective in reducing pain and maintaining mobility, and can be recommended. Subsequent knee joint replacement is a further option. Mutation analysis of WISP3 allowed confirmation of the diagnosis in 63 of 64 typical cases in our series. Intronic mutations in WISP3 leading to splicing aberrations can be detected only in cDNA from fibroblasts; a skin biopsy is therefore indicated when genomic analysis fails to reveal mutations in individuals with otherwise typical signs and symptoms.
In spite of the first symptoms appearing in early childhood, the diagnosis of PPRD is most often made only in the second decade and affected children often receive unnecessary anti-inflammatory and immunosuppressive treatments. Increasing awareness of PPRD appears to be essential to allow for a timely diagnosis. © 2012 Wiley Periodicals, Inc.

By the work of H. Cartan (from his 1954-55 seminar), it is well known that one can find elements of arbitrarily high torsion in the integral (co)homology groups of an Eilenberg-MacLane space K(G,n), where G is a non-trivial abelian group and n>1. The main goal of this work is to extend this result to H-spaces having more than one non-trivial homotopy group. In order to have an accurate hold on H. Cartan's result, we start by studying the duality between homology and cohomology of 2-local Eilenberg-MacLane spaces of finite type. This leads us to some refinements of the results that follow from H. Cartan's computations. Our main result can be stated as follows. Let X be an H-space with exactly two non-vanishing homotopy groups, both finite and of 2-torsion. Then X does not admit an exponent for its reduced integral graded (co)homology group. We construct a wide class of spaces for which this result is a simple consequence of a topological feature, namely the existence of a weak retract X K(G,n) for some abelian group G and n>1. We also generalize our main result to more complicated stable two-stage Postnikov systems, using the Eilenberg-Moore spectral sequence as well as analytic methods involving Betti numbers and their asymptotic behaviour. Finally, we conjecture that spaces with only a finite number of non-trivial homotopy groups do not admit a homology exponent. This work also includes the presentation of the "Eilenberg-MacLane machine", a C++ program designed to compute explicitly the integral homology groups of Eilenberg-MacLane spaces.

It is always difficult for a mathematician to speak about his work. The difficulty lies in the fact that the objects he studies are abstract. One rarely runs into a vector space, an abelian category or a Laplace transform on a street corner! Yet even if mathematical objects are hard to grasp for a non-mathematician, the methods used to study them are essentially the same as in the other scientific disciplines. Complex objects are broken down into components that are simpler to study. One lists the properties of mathematical objects and classifies them into families of objects sharing a common feature. One looks for different but equivalent ways of formulating a problem. And so on. My work belongs to the mathematical field of algebraic topology. The ultimate goal of this discipline is to classify all topological spaces by means of algebra. This activity is comparable to that of an ornithologist (the topologist) who studies birds (topological spaces) with, say, binoculars (algebra). If he sees a small, arboreal, singing, nest-building bird, with four-toed feet, three toes pointing forward and one, bearing a strong claw, pointing backward, he will conclude with certainty that it is a passerine. It then remains to determine whether it is a sparrow, a blackbird or a nightingale. Consider a few examples of topological spaces: a) a hollow cube, b) a sphere and c) a hollow torus (i.e. an inner tube). While any ordinary observer sees three different figures here, the topologist sees only two! From his point of view, the cube and the sphere are not different, since they are homeomorphic: one can be transformed into the other continuously (it would suffice to inflate the cube to obtain the sphere). By contrast, the sphere and the torus are not homeomorphic: deform the sphere however you like (without tearing it), you will never obtain the torus. There are infinitely many topological spaces and, contrary to what one might naively believe, deciding whether two of them are homeomorphic is in general very difficult. To attack this problem, topologists had the idea of bringing algebra into their reasoning. This was the birth of homotopy theory.

Following a very particular recipe, one associates to every topological space an infinite collection of what algebraists call groups. The groups obtained in this way are called the homotopy groups of the topological space. Mathematicians first showed that two topological spaces that are homeomorphic (for example the cube and the sphere) have the same homotopy groups. One then speaks of invariants (the homotopy groups are indeed invariant for homeomorphic topological spaces). Consequently, two topological spaces that do not have the same homotopy groups cannot possibly be homeomorphic. This is an excellent way of classifying topological spaces (think of the ornithologist who examines birds' feet to determine whether he is dealing with a passerine). My work concerns topological spaces that have only a finite number of non-zero homotopy groups. Such spaces are called finite Postnikov towers. We study their integral cohomology groups, another family of invariants, like the homotopy groups. The size of a cohomology group is measured, in a certain sense, by the notion of exponent; a cohomology group admitting an exponent is thus relatively small. One of the main results of this work is a study of the size of the cohomology groups of finite Postnikov towers. It is the following theorem: a 1-connected, 2-local topological H-space of finite type that has only one or two non-zero homotopy groups admits no exponent for its reduced integral graded cohomology group. Interpreted qualitatively, this result says that the smaller a space is from the point of view of cohomology (i.e. if it admits a cohomology exponent), the more interesting it is from the point of view of homotopy (i.e. it will have more than two non-zero homotopy groups). It follows from my work that such spaces are very interesting, in the sense that they can have infinitely many non-zero homotopy groups. Jean-Pierre Serre, Fields medalist in 1954, showed that all spheres of dimension >1 have infinitely many non-zero homotopy groups. From spaces with a cohomology exponent to spheres, there is but a single step to take...
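The main theorem of the thesis can be restated compactly in standard notation (a restatement of the abstract, not an addition):

```latex
\begin{theorem}
Let $X$ be a $1$-connected, $2$-local $H$-space of finite type with at most
two non-vanishing homotopy groups, both finite of $2$-torsion. Then the
reduced integral graded cohomology of $X$ admits no exponent: for every
integer $k \ge 1$,
\[
  2^{k}\,\widetilde{H}^{*}(X;\mathbb{Z}) \neq 0 .
\]
\end{theorem}
```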

The smart cannula concept allows for collapsed cannula insertion and self-expansion within a vein of the body. (A) Computational fluid dynamics analyses and (B) bovine experiments (76+/-3.8 kg) were performed for comparative analyses prior to (C) the first clinical application. (A) For an 18F access, a given flow of 4 l/min resulted in a pressure drop of 49 mmHg for the smart cannula versus 140 mmHg for the control; the corresponding Reynolds numbers are 680 versus 1170, respectively. (B) For a 28F access, the maximal flow for the smart cannula was 5.8+/-0.5 l/min versus 4.0+/-0.1 l/min for the standard cannula (P<0.0001); for 24F, 5.5+/-0.6 l/min versus 3.2+/-0.4 l/min (P<0.0001); and for 20F, 4.1+/-0.3 l/min versus 1.6+/-0.3 l/min (P<0.0001). The flow obtained with the smart cannula was 270+/-45% (20F), 172+/-26% (24F), and 134+/-13% (28F) of standard (one-way ANOVA, P=0.014). (C) The first clinical application (1.42 m2) of a smart cannula showed 3.55 l/min (100% of predicted) without additional fluids. All three assessment steps confirm the superior performance of the smart cannula design.
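A back-of-the-envelope version of the Reynolds-number calculation quoted above is Re = rho*v*D/mu with v = Q/A. The blood properties and the French-size rule (diameter in mm = F/3) below are textbook assumptions, and the study's actual cannula geometry differs, so these numbers illustrate the calculation rather than reproduce the reported 680-versus-1170 values.

```python
import math

def reynolds(flow_l_min, french_size, rho=1060.0, mu=3.5e-3):
    """Reynolds number for steady flow through a circular lumen.
    rho (kg/m^3) and mu (Pa*s) are assumed whole-blood values."""
    d = (french_size / 3.0) / 1000.0            # French size -> diameter in m
    area = math.pi * (d / 2.0) ** 2             # lumen cross-section, m^2
    v = (flow_l_min / 1000.0 / 60.0) / area     # mean velocity, m/s
    return rho * v * d / mu

re_18f = reynolds(4.0, 18)    # the scenario quoted above: 4 l/min through 18F
```

Note how Re falls as the cannula widens: at fixed flow, v scales as 1/D^2, so Re scales as 1/D, which is one reason the self-expanding design can carry more flow at a lower pressure drop.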

The pharmacokinetic determinants of successful antibiotic prophylaxis of endocarditis are not precisely known. Differences in the half-lives of antibiotics between animals and humans preclude extrapolation of animal results to human situations. To overcome this limitation, we have mimicked in rats the amoxicillin kinetics in humans following a 3-g oral dose (as often used for prophylaxis of endocarditis) by delivering the drug through a computerized pump. Rats with catheter-induced vegetations were challenged with either of two strains of antibiotic-tolerant viridans group streptococci. Antibiotics were given either through the pump (to simulate the whole kinetic profile during prophylaxis in humans) or as an intravenous bolus which imitated only the peak level of amoxicillin (18 mg/liter) in human serum. Prophylaxis by intravenous bolus was inoculum dependent and afforded limited protection only in rats challenged with the minimum inoculum size infecting ≥90% of untreated controls. In contrast, simulation of the kinetics in humans significantly protected animals challenged with 10 to 100 times the inoculum of either of the test organisms infecting ≥90% of untreated controls. Thus, simulation of the profiles of amoxicillin prophylaxis in human serum was more efficacious than mere imitation of the transient peak level in rats. This confirms previous studies suggesting that the duration for which the serum amoxicillin level remained detectable (not only the magnitude of the peak) was an important parameter in successful prophylaxis of endocarditis. The results also suggest that single-dose prophylaxis with 3 g of amoxicillin in humans might be more effective than predicted by conventional animal models in which only peak levels of antibiotic in human serum were simulated.
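The study's core comparison can be sketched with a one-compartment model: the full oral-dose concentration profile (what the computerized pump reproduced in rats) versus an IV bolus matching only the peak. All parameter values below are illustrative assumptions, not the study's fitted amoxicillin kinetics.

```python
import math

def oral_profile(t, dose=3000.0, vd=20.0, ka=1.5, ke=0.6):
    """Serum level (mg/l) after an oral dose, with first-order absorption
    (ka, 1/h) and elimination (ke, 1/h); vd is volume of distribution (l)."""
    return dose / vd * ka / (ka - ke) * (math.exp(-ke * t) - math.exp(-ka * t))

def bolus_profile(t, peak, ke=0.6):
    """Serum level (mg/l) after an IV bolus chosen to match only the peak."""
    return peak * math.exp(-ke * t)

def hours_above(profile, threshold, t_end=12.0, dt=0.01):
    """Time (h) the concentration stays above a threshold (e.g. a strain's MIC)."""
    return sum(dt for i in range(int(t_end / dt)) if profile(i * dt) > threshold)

peak = max(oral_profile(i * 0.01) for i in range(1200))   # peak of the oral curve
t_oral = hours_above(oral_profile, 4.0)                    # assumed 4 mg/l threshold
t_bolus = hours_above(lambda t: bolus_profile(t, peak), 4.0)
```

Even with identical peaks, the full profile keeps the level above the threshold for longer, which is the point the rat model makes about why matching only the peak underestimates prophylactic efficacy.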

Attrition in longitudinal studies can lead to biased results. This study was motivated by the unexpected observation that alcohol consumption decreased despite increased availability, which may be due to sample attrition of heavy drinkers. Several imputation methods have been proposed, but they have rarely been compared in longitudinal studies of alcohol consumption. The imputation of consumption-level measurements is computationally particularly challenging because alcohol consumption is a semi-continuous variable (dichotomous drinking status and continuous volume among drinkers) and the data in the continuous part are non-normal. Data come from a longitudinal study in Denmark with four waves (2003-2006) and 1771 individuals at baseline. Five techniques for missing data are compared: last value carried forward (LVCF) as a single imputation method, and hotdeck, Heckman modelling, multivariate imputation by chained equations (MICE), and a Bayesian approach as multiple imputation methods. Predictive mean matching was used to account for non-normality: instead of imputing regression estimates, "real" observed values from similar cases are imputed. The methods were also compared by means of a simulated dataset. The simulation showed that the Bayesian approach yielded the least biased estimates for imputation. The finding of no increase in consumption levels despite higher availability remained unaltered. Copyright (C) 2011 John Wiley & Sons, Ltd.
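Two of the compared techniques can be sketched on toy longitudinal data (rows = individuals, columns = waves, np.nan = attrition). This is illustrative only; the study's actual implementation (e.g. MICE combined with predictive mean matching) is more involved.

```python
import numpy as np

def locf(data):
    """Last value carried forward along each individual's row."""
    out = data.copy()
    for row in out:
        for j in range(1, row.size):
            if np.isnan(row[j]):
                row[j] = row[j - 1]
    return out

def pmm(y_obs, x_obs, x_mis, k=3, seed=0):
    """Predictive mean matching: regress y on x over observed cases, then for
    each missing case draw the *observed* y of one of the k cases whose
    predicted mean is closest. Imputed values are thus always real observed
    values, which preserves the non-normal shape of the data."""
    rng = np.random.default_rng(seed)
    beta = np.polyfit(x_obs, y_obs, 1)
    pred_obs, pred_mis = np.polyval(beta, x_obs), np.polyval(beta, x_mis)
    draws = [y_obs[rng.choice(np.argsort(np.abs(pred_obs - p))[:k])]
             for p in pred_mis]
    return np.array(draws)

waves = np.array([[4.0, 3.5, np.nan, np.nan],    # heavy drinker drops out
                  [0.0, 0.0, 1.0, np.nan],
                  [7.0, 6.0, 5.5, 5.0]])
filled = locf(waves)    # each dropout keeps the last observed level
```

LOCF simply freezes each dropout at the last observed level, whereas PMM-style donors keep imputed values inside the observed distribution, the property the abstract highlights for semi-continuous, non-normal consumption data.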

Mobility issues no longer involve only the act of moving. The concept itself evolves continuously thanks to technological and social innovations. The main stakes no longer center on improving speed, but on enriching the experience of travelling, even for short trips. One of the main factors fostering this evolution is the progressive adoption of information and communication technologies, which help to reshape the issues of contemporary cities and territories. For example, the quality of travel time has improved thanks to ubiquitous access to information (work, social networks, etc.), and through a better coherence between the trip and its near or distant environment. This "recontextualisation" of movement and of everyday activities challenges the relationship individuals have with space and offers many clues as to the skills that urban planners and designers of the smart city should possess.

We perform direct numerical simulations of drainage by solving the Navier-Stokes equations in the pore space and employing the volume-of-fluid (VOF) method to track the evolution of the fluid-fluid interface. After demonstrating that the method is able to deal with large viscosity contrasts and to model the transition from stable flow to viscous fingering, we focus on the definition of macroscopic capillary pressure. When the fluids are at rest, the difference between inlet and outlet pressures, and the difference between the intrinsic phase-averaged pressures, coincide with the capillary pressure. However, when the fluids are in motion these quantities are dominated by viscous forces. In this case, only a definition based on the variation of the interfacial energy provides an accurate measure of the macroscopic capillary pressure and allows the viscous and capillary pressure components to be separated.
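A one-tube sanity check of the energy-based definition mentioned above: taking Pc = dF/dV, the interfacial free-energy change per unit volume of invading (non-wetting) fluid, must recover the Young-Laplace pressure 2*gamma*cos(theta)/r in a single circular capillary. Fluid properties below are assumed toy values.

```python
import math

gamma = 0.05                    # interfacial tension, N/m (assumed)
theta = math.radians(30.0)      # contact angle (assumed)
r = 1e-4                        # tube radius, m (assumed)

def interfacial_energy(x):
    """Free-energy change when the meniscus has advanced a distance x:
    the invading fluid replaces wetted wall at an energy cost of
    gamma*cos(theta) per unit wall area (Young's equation)."""
    return 2.0 * math.pi * r * x * gamma * math.cos(theta)

dx = 1e-6                                   # small advance of the interface
dF = interfacial_energy(dx) - interfacial_energy(0.0)
dV = math.pi * r ** 2 * dx                  # volume invaded during the advance
pc_energy = dF / dV                         # energy-based capillary pressure
pc_young_laplace = 2.0 * gamma * math.cos(theta) / r
```

In the quasi-static limit the two definitions agree exactly; the point of the paper is that during dynamic drainage only the energy-based quantity remains a clean measure, since pressure differences are then dominated by viscous losses.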