997 results for Decomposition framework
Abstract:
Finite element techniques for solving the problem of fluid-structure interaction of an elastic solid material in a laminar incompressible viscous flow are described. The mathematical problem consists of the Navier-Stokes equations in the Arbitrary Lagrangian-Eulerian formulation coupled with a non-linear structure model, treating the problem as one continuum. The coupling between the structure and the fluid is enforced inside a monolithic framework which solves simultaneously for the fluid and structure unknowns within a single solver. We used the well-known Crouzeix-Raviart finite element pair for discretization in space and the method of lines for discretization in time. A stability result using the Backward-Euler time-stepping scheme for both the fluid and the solid parts and the finite element method for the space discretization has been proved. The resulting linear system has been solved by multilevel domain decomposition techniques. Our strategy is to solve several local subproblems over subdomain patches using the Schur-complement or GMRES smoother within a multigrid iterative solver. For validation and evaluation of the accuracy of the proposed methodology, we present corresponding results for a set of two FSI benchmark configurations which describe the self-induced elastic deformation of a beam attached to a cylinder in a laminar channel flow, allowing stationary as well as periodically oscillating deformations, and for a benchmark proposed by COMSOL Multiphysics in which a narrow vertical structure attached to the bottom wall of a channel bends under the force due to both viscous drag and pressure. Then, as an example of fluid-structure interaction in biomedical problems, we considered the academic numerical test which consists of simulating the pressure wave propagation through a straight compliant vessel. All the tests show the applicability and the numerical efficiency of our approach to both two-dimensional and three-dimensional problems.
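As a purely illustrative aside (not part of the abstract above), the Backward-Euler scheme it refers to, applied after a method-of-lines space discretization, takes the generic form

\[
M \,\frac{U^{n+1} - U^{n}}{\Delta t} = F\!\left(U^{n+1}\right),
\]

where the symbols are assumptions for this sketch: \(U^{n}\) collects the coupled fluid and structure unknowns at time level \(n\), \(M\) is the (generalized) mass matrix and \(F\) the discrete residual of the monolithic ALE system, so a nonlinear coupled system is solved at every time step.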
Abstract:
Mixed Reality (MR) aims to link virtual entities with the real world and has many applications in areas such as the military and medical domains [JBL+00, NFB07]. In many MR systems, and more precisely in augmented scenes, the application needs to render the virtual part accurately at the right time. To achieve this, such systems acquire data related to the real world from a set of sensors before rendering virtual entities. A suitable system architecture should minimize the delays to keep the overall system delay (also called end-to-end latency) within the requirements for real-time performance. In this context, we propose a compositional modeling framework for MR software architectures in order to specify, simulate and formally validate the time constraints of such systems. Our approach is first based on a functional decomposition of such systems into generic components. The obtained elements, as well as their typical interactions, give rise to generic representations in terms of timed automata. A whole system is then obtained as a composition of such defined components. To write specifications, a textual language named MIRELA (MIxed REality LAnguage) is proposed along with the corresponding compilation tools. The generated output contains timed automata in UPPAAL format for simulation and verification of time constraints. These automata may also be used to generate source code skeletons for an implementation on an MR platform. The approach is illustrated first on a small example. A realistic case study is also developed; it is modeled by several timed automata synchronizing through channels and including a large number of time constraints. Both systems have been simulated in UPPAAL and checked against the required behavioral properties.
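As a hedged illustration of the kind of component descriptions such a functional decomposition produces, the sketch below represents a timed automaton generically in Python; all names (Edge, TimedAutomaton, the sensor example, its guards and channels) are invented for illustration and are not MIRELA or UPPAAL syntax.

```python
from dataclasses import dataclass, field

@dataclass
class Edge:
    source: str          # source location
    target: str          # target location
    guard: str           # clock constraint, e.g. "x >= 5"
    sync: str            # channel synchronisation, e.g. "acquire!"
    resets: tuple = ()   # clocks reset on the transition, e.g. ("x",)

@dataclass
class TimedAutomaton:
    locations: list      # location names
    invariants: dict     # location -> clock invariant, e.g. {"Sampling": "x <= 10"}
    clocks: tuple        # clock names
    edges: list = field(default_factory=list)

# Toy sensor component: sample periodically, then hand data to a rendering unit.
sensor = TimedAutomaton(
    locations=["Idle", "Sampling"],
    invariants={"Sampling": "x <= 10"},
    clocks=("x",),
    edges=[
        Edge("Idle", "Sampling", guard="x >= 5", sync="acquire!", resets=("x",)),
        Edge("Sampling", "Idle", guard="", sync="data!"),
    ],
)
```

A whole system would then be a parallel composition of such components synchronizing on shared channels, which is what the UPPAAL output of the toolchain encodes.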
Abstract:
In the field of detection and monitoring of dynamic objects in quasi-static scenes, background subtraction techniques in which the background is modeled at the pixel level are extensively used, although they show very significant limitations. In this work we propose a novel approach to background modeling that operates at the region level in a wavelet-based multi-resolution framework. Based on a segmentation of the background, each region is characterized independently as a mixture of K Gaussian modes, considering the model of the approximation and detail coefficients at the different wavelet decomposition levels. The background region characterization is updated over time, and the detection of elements of interest is carried out by computing the distance between background region models and those of each incoming image in the sequence. The inclusion of context in the modeling scheme through each region characterization makes the model robust, able to handle not only gradual illumination changes and long-term changes, but also sudden illumination changes and the presence of strong shadows in the scene.
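A minimal sketch of the region-level mixture idea follows, under the assumption that each region's wavelet approximation and detail coefficients have already been collapsed into a feature vector; it is a simplified online K-Gaussian update in the spirit of standard mixture-of-Gaussians background models, not the authors' exact scheme.

```python
import numpy as np

class RegionGaussianMixture:
    """K Gaussian modes over a region-level feature vector.

    The feature vector is assumed to stack statistics of the region's wavelet
    approximation and detail coefficients at each decomposition level.
    """

    def __init__(self, dim, k=3, alpha=0.01, init_var=15.0):
        self.k, self.alpha, self.init_var = k, alpha, init_var
        self.weights = np.full(k, 1.0 / k)
        self.means = np.zeros((k, dim))
        self.vars = np.full(k, init_var)        # one isotropic variance per mode

    def update(self, feature, match_thresh=2.5):
        """Fold one frame's region feature into the model; return True when the
        region matches an existing (background) mode."""
        feature = np.asarray(feature, dtype=float)
        dists = np.linalg.norm(self.means - feature, axis=1) / np.sqrt(self.vars)
        best = int(np.argmin(dists))
        matched = bool(dists[best] < match_thresh)
        if matched:                              # reinforce and adapt the matched mode
            self.weights *= (1.0 - self.alpha)
            self.weights[best] += self.alpha
            self.means[best] += self.alpha * (feature - self.means[best])
            diff = feature - self.means[best]
            self.vars[best] += self.alpha * (float(diff @ diff) - self.vars[best])
        else:                                    # replace the least probable mode
            worst = int(np.argmin(self.weights))
            self.means[worst], self.vars[worst] = feature, self.init_var
            self.weights[worst] = self.alpha
        self.weights /= self.weights.sum()
        return matched
```

Regions whose incoming features fail to match any mode would be flagged as containing elements of interest.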
Abstract:
A mathematical formulation for finite strain elastoplastic consolidation of fully saturated soil media is presented. Strong and weak forms of the boundary-value problem are derived using both the material and spatial descriptions. The algorithmic treatment of finite strain elastoplasticity for the solid phase is based on multiplicative decomposition and is coupled with the algorithm for fluid flow via the Kirchhoff pore water pressure. Balance laws are written for the soil-water mixture following the motion of the soil matrix alone. It is shown that the motion of the fluid phase only affects the Jacobian of the solid phase motion, and therefore can be characterized completely by the motion of the soil matrix. Furthermore, it is shown from energy balance considerations that the effective, or intergranular, stress is the appropriate measure of stress for describing the constitutive response of the soil skeleton, since it absorbs all the strain energy generated in the saturated soil-water mixture. Finally, it is shown that the mathematical model is amenable to consistent linearization, and that explicit expressions for the consistent tangent operators can be derived for use in numerical solutions such as those based on the finite element method.
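For orientation only (standard forms, not reproduced from the paper), the two ingredients named above are commonly written as

\[
\mathbf{F} = \mathbf{F}^{e}\,\mathbf{F}^{p},
\qquad
\boldsymbol{\tau}' = \boldsymbol{\tau} + \theta\,\mathbf{1},
\]

where \(\mathbf{F}^{e}\) and \(\mathbf{F}^{p}\) are the elastic and plastic parts of the deformation gradient, \(\boldsymbol{\tau}\) is the total Kirchhoff stress, \(\theta\) the Kirchhoff pore water pressure and \(\boldsymbol{\tau}'\) the effective stress driving the constitutive response of the soil skeleton; the sign in front of the pore-pressure term depends on the compression convention adopted.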
Abstract:
Work domain analysis (WDA) has been applied to a range of complex work domains, but few WDAs have been undertaken in medical contexts. One pioneering effort suggested that clinical abstraction is not based on means-ends relations, whereas another effort downplayed the role of bio-regulatory mechanisms. In this paper it is argued that bio-regulatory mechanisms that govern physiological behaviour must be part of WDA models of patients as the systems at the core of intensive care units. Furthermore, it is argued that because the inner functioning of patients is not completely known, clinical abstraction is based on hypothetico-deductive abstract reasoning. This paper presents an alternative modelling framework that conforms to the broader aspirations of WDA. A modified version of the viable systems model is used to represent the patient system as a nested dissipative structure, while aspects of the recognition-primed decision model are used to represent the information resources available to clinicians in ways that support 'if...then' conceptual relations. These two frameworks come together to form the recursive diagnostic framework, which may provide a more appropriate foundation for information display design in the intensive care unit.
Abstract:
As process management projects have increased in size due to globalised and company-wide initiatives, a corresponding growth in the size of process modeling projects can be observed. Despite advances in languages, tools and methodologies, several aspects of these projects have been largely ignored by the academic community. This paper makes a first contribution to a potential research agenda in this field by defining the characteristics of large-scale process modeling projects and proposing a framework of related issues. These issues are derived from a semi-structured interview and six focus groups conducted in Australia, Germany and the USA with enterprise and modeling software vendors and customers. The focus groups confirm the existence of unresolved problems in business process modeling projects. The outcomes provide a research agenda which directs researchers towards further studies in global process management, process model decomposition and the overall governance of process modeling projects. It is expected that this research agenda will provide guidance to researchers and practitioners by focusing on areas of high theoretical and practical relevance.
Abstract:
This paper presents a generic strategic framework of alternative international marketing strategies and market segmentation based on intra- and inter-cultural behavioural homogeneity. Consumer involvement (CI) is proposed as a pivotal construct to capture behavioural homogeneity, for the identification of market segments. Results from a five-country study demonstrate how the strategic framework can be valuable in managerial decision-making. First, there is evidence for the cultural invariance of the measurement of CI, allowing a true comparison of inter- and intra-cultural behavioural homogeneity. Second, CI influences purchase behaviour, and its evaluation provides a rich source of information for responsive market segmentation. Finally, a decomposition of behavioural variance suggests that national-cultural environment and nationally transcendent variables explain differences in behaviour. The Behavioural Homogeneity Evaluation Framework therefore suggests appropriate international marketing strategies, providing practical guidance for implementing involvement-contingent strategies. © 2007 Academy of International Business. All rights reserved.
Abstract:
Productivity at the macro level is a complex concept but also arguably the most appropriate measure of economic welfare. Currently, there is limited research available on the various approaches that can be used to measure it, and especially on the relative accuracy of those approaches. This thesis has two main objectives: firstly, to detail some of the most common productivity measurement approaches and assess their accuracy under a number of conditions; and secondly, to present an up-to-date application of productivity measurement and provide some guidance on selecting between sometimes conflicting productivity estimates. With regard to the first objective, the thesis provides a discussion of the issues specific to macro-level productivity measurement and of the strengths and weaknesses of the three main types of approaches available, namely index-number approaches (represented by Growth Accounting), non-parametric distance functions (DEA-based Malmquist indices) and parametric production functions (COLS- and SFA-based Malmquist indices). The accuracy of these approaches is assessed through simulation analysis, which provided some interesting findings. Probably the most important were that deterministic approaches are quite accurate even when the data are moderately noisy, that no approach was accurate when noise was more extensive, that functional form misspecification has a severe negative effect on the accuracy of the parametric approaches, and finally that increased volatility in inputs and prices from one period to the next adversely affects all approaches examined. The application was based on the EU KLEMS (2008) dataset and revealed that the different approaches do in fact result in different productivity change estimates, at least for some of the countries assessed. To assist researchers in selecting between conflicting estimates, a new three-step selection framework is proposed, based on the findings of the simulation analyses and on established diagnostics/indicators. An application of this framework is also provided, based on the EU KLEMS dataset.
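For reference, the output-oriented Malmquist index that underlies the DEA- and frontier-based approaches listed above is conventionally written (textbook form, not a result of the thesis) as

\[
M\!\left(x^{t},y^{t},x^{t+1},y^{t+1}\right)
=\left[
\frac{D^{t}\!\left(x^{t+1},y^{t+1}\right)}{D^{t}\!\left(x^{t},y^{t}\right)}
\cdot
\frac{D^{t+1}\!\left(x^{t+1},y^{t+1}\right)}{D^{t+1}\!\left(x^{t},y^{t}\right)}
\right]^{1/2},
\]

where \(D^{t}\) is the output distance function relative to the period-\(t\) frontier and values greater than one indicate productivity growth; the approaches differ mainly in how the distance functions are estimated.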
Abstract:
The appealing feature of the arbitrage-free Nelson-Siegel model of the yield curve is the ability to capture movements in the yield curve through readily interpretable shifts in its level, slope or curvature, all within a dynamic arbitrage-free framework. To ensure that the level, slope and curvature factors evolve so as not to admit arbitrage, the model introduces a yield-adjustment term. This paper shows how the yield-adjustment term can also be decomposed into the familiar level, slope and curvature elements plus some additional readily interpretable shape adjustments. This means that, even in an arbitrage-free setting, it continues to be possible to interpret movements in the yield curve in terms of level, slope and curvature influences. © 2014 Taylor & Francis.
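For context, the dynamic Nelson-Siegel loading structure behind the level-slope-curvature interpretation, together with a yield-adjustment term of the kind the paper decomposes, is conventionally written (standard form, not the paper's notation) as

\[
y_{t}(\tau)=L_{t}
+S_{t}\,\frac{1-e^{-\lambda\tau}}{\lambda\tau}
+C_{t}\!\left(\frac{1-e^{-\lambda\tau}}{\lambda\tau}-e^{-\lambda\tau}\right)
-\frac{A(\tau)}{\tau},
\]

where \(L_{t}\), \(S_{t}\) and \(C_{t}\) are the level, slope and curvature factors, \(\lambda\) controls the decay of the loadings, and \(-A(\tau)/\tau\) is the maturity-dependent yield-adjustment term required for absence of arbitrage.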
Abstract:
An original heuristic algorithm for sequential two-block decomposition of partial Boolean functions is investigated. The key combinatorial task is considered: finding a suitable partition of the set of arguments, i.e. one on which the function is separable. The search for a suitable partition is substantially accelerated by preliminary detection of its traces. Within the framework of an experimental system the efficiency of the algorithm is evaluated and the boundaries of its practical application are determined.
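To make the combinatorial task concrete, here is a minimal illustrative check, for a fully specified function and by exhaustive enumeration rather than the accelerated heuristic described above: a partition {A, B} supports a two-block decomposition f(x) = h(g(x_A), x_B) when the decomposition chart has at most two distinct columns. All names and the example function are invented for the sketch.

```python
from itertools import combinations, product

def suitable_partitions(f, n, bound_size):
    """Yield partitions (A, B) of the n arguments for which the fully specified
    Boolean function f (a dict mapping input tuples to 0/1) is separable as
    f(x) = h(g(x_A), x_B).  Exhaustive illustration only."""
    variables = range(n)
    for A in combinations(variables, bound_size):
        B = tuple(v for v in variables if v not in A)
        # Decomposition chart: one column per assignment of A,
        # listing the function values over all assignments of B.
        columns = set()
        for a in product((0, 1), repeat=len(A)):
            col = []
            for b in product((0, 1), repeat=len(B)):
                x = [0] * n
                for v, bit in zip(A, a):
                    x[v] = bit
                for v, bit in zip(B, b):
                    x[v] = bit
                col.append(f[tuple(x)])
            columns.add(tuple(col))
        if len(columns) <= 2:        # column multiplicity <= 2 => separable
            yield A, B

# Example: f = (x0 AND x1) XOR x2 is separable with A = (x0, x1).
f = {x: ((x[0] & x[1]) ^ x[2]) for x in product((0, 1), repeat=3)}
print(list(suitable_partitions(f, n=3, bound_size=2)))
```

For partial functions the check works with compatibility of columns on the defined entries, which is what makes preliminary trace detection valuable.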
Abstract:
In the U.S., construction accidents remain a significant economic and social problem. Despite recent improvement, the construction industry has generally lagged behind other industries in implementing safety as a total management process for achieving zero accidents and developing a high-performance safety culture. One aspect of this total approach to safety that has frustrated the construction industry the most has been “measurement”, which involves identifying and quantifying the factors that critically influence safe work behaviors. The basic problem is attributed to the difficulty of deciding what to measure and how to measure it, particularly the intangible aspects of safety. Without measurement, the notion of continuous improvement is hard to follow. This research was undertaken to develop a strategic framework for the measurement and continuous improvement of total safety in order to achieve and sustain the goal of zero accidents, while improving the quality, productivity and competitiveness of the construction industry as it moves forward. The research was based on an integral model of total safety that allowed decomposition of safety into interior and exterior characteristics using a multi-attribute analysis technique. Statistical relationships between total safety dimensions and safety performance (measured by safe work behavior) were revealed through a series of latent variables (factors) that describe the total safety environment of a construction organization. A structural equation model (SEM) was estimated for the latent variables to quantify the relationships among them and between these total safety determinants and the safety performance of a construction organization. The developed SEM constituted a strategic framework for identifying, measuring, and continuously improving safety as a total concern for achieving and sustaining the goal of zero accidents.
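For orientation, a structural equation model of the kind described combines a structural part and measurement parts, conventionally written in generic textbook notation (not the variables of this study) as

\[
\boldsymbol{\eta} = \mathbf{B}\,\boldsymbol{\eta} + \boldsymbol{\Gamma}\,\boldsymbol{\xi} + \boldsymbol{\zeta},
\qquad
\mathbf{y} = \boldsymbol{\Lambda}_{y}\,\boldsymbol{\eta} + \boldsymbol{\varepsilon},
\qquad
\mathbf{x} = \boldsymbol{\Lambda}_{x}\,\boldsymbol{\xi} + \boldsymbol{\delta},
\]

where \(\boldsymbol{\eta}\) and \(\boldsymbol{\xi}\) are endogenous and exogenous latent factors (here, dimensions of the total safety environment and safety performance), \(\mathbf{y}\) and \(\mathbf{x}\) are their observed indicators, and \(\mathbf{B}\), \(\boldsymbol{\Gamma}\), \(\boldsymbol{\Lambda}_{y}\), \(\boldsymbol{\Lambda}_{x}\) are the coefficient and loading matrices estimated from the data.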
Abstract:
The Three-Layer distributed mediation architecture, designed by the Secure System Architecture laboratory, employed a layered framework of presence, integration, and homogenization mediators. The architecture does not have any central component that may affect system reliability. A distributed search technique was adapted in the system to increase its reliability. An Enhanced Chord-like algorithm (E-Chord) was designed and deployed in the integration layer. The E-Chord is a skip-list algorithm based on a Distributed Hash Table (DHT), which is a distributed but structured architecture. The DHT is distributed in the sense that no central unit is required to maintain indexes, and it is structured in the sense that indexes are distributed over the nodes in a systematic manner. Each node maintains three kinds of routing information: a frequency list, a successor/predecessor list, and a finger table. None of the nodes in the system maintains all indexes, and each node knows about some other nodes in the system. These nodes, also called composer mediators, were connected in a P2P fashion. A special composer mediator called the global mediator initiates the keyword-based matching decomposition of the request using the E-Chord. It generates an Integrated Data Structure Graph (IDSG) on the fly, creates association and dependency relations between nodes in the IDSG, and then generates a Global IDSG (GIDSG). The GIDSG is a plan which guides the global mediator in integrating the data. It is also used to stream data from the mediators in the homogenization layer, which are connected to the data sources. The connectors start sending the data to the global mediator just after the global mediator creates the GIDSG and just before the global mediator sends the answer to the presence mediator. Using the E-Chord and the GIDSG made the mediation system more scalable than using a central global schema repository, since all the composers in the integration layer are capable of handling and routing requests. Also, when a composer fails, it only minimally affects the entire mediation system.
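The sketch below is a generic Chord-style lookup in Python meant only to illustrate the three kinds of routing state named above; the identifiers, field names and greedy routing rule are assumptions for the sketch and do not reproduce the E-Chord implementation.

```python
import hashlib

RING = 2 ** 16                       # identifier space (illustrative size)

def chord_id(keyword):
    """Hash a keyword onto the identifier ring."""
    return int(hashlib.sha1(keyword.encode()).hexdigest(), 16) % RING

class ComposerMediator:
    """A node holding the three kinds of routing state named in the abstract."""

    def __init__(self, node_id):
        self.node_id = node_id
        self.successor = self            # head of the successor/predecessor list
        self.predecessor = self
        self.finger = []                 # finger table: nodes at power-of-two distances
        self.frequency_list = {}         # frequently requested keywords -> owning node

    def clockwise_gap(self, key_id):
        """Distance travelled clockwise from this node to the key."""
        return (key_id - self.node_id) % RING

    def route(self, keyword, hops=0):
        """Greedy keyword routing: jump to the known node closest to the key,
        consulting the frequency list as a shortcut (simplified sketch)."""
        if keyword in self.frequency_list:
            return self.frequency_list[keyword], hops
        key_id = chord_id(keyword)
        known = self.finger + [self.successor]
        nxt = min(known, key=lambda n: n.clockwise_gap(key_id))
        if nxt.clockwise_gap(key_id) >= self.clockwise_gap(key_id):
            return self, hops            # no known node is closer: treat self as responsible
        return nxt.route(keyword, hops + 1)
```

A full system would additionally maintain the finger tables, successor/predecessor lists and frequency lists as nodes join, leave or fail; only the lookup side is shown here.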
A class of domain decomposition preconditioners for hp-discontinuous Galerkin finite element methods
Abstract:
In this article we address the question of efficiently solving the algebraic linear system of equations arising from the discretization of a symmetric, elliptic boundary value problem using hp-version discontinuous Galerkin finite element methods. In particular, we introduce a class of domain decomposition preconditioners based on the Schwarz framework, and prove bounds on the condition number of the resulting iteration operators. Numerical results confirming the theoretical estimates are also presented.
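As a purely algebraic illustration of the Schwarz framework mentioned above (the paper's preconditioners are built on the hp-DG discretization and include components not shown here), a one-level additive Schwarz preconditioner applies local solves on overlapping index sets and sums the corrections:

```python
import numpy as np

def additive_schwarz_apply(A, subdomain_dofs, r):
    """Apply a one-level additive Schwarz preconditioner,
        M^{-1} r = sum_i R_i^T A_i^{-1} R_i r,
    where R_i restricts to the degrees of freedom of subdomain i and
    A_i = R_i A R_i^T is the corresponding local block."""
    z = np.zeros_like(r, dtype=float)
    for dofs in subdomain_dofs:
        Ai = A[np.ix_(dofs, dofs)]            # local subdomain matrix
        z[dofs] += np.linalg.solve(Ai, r[dofs])
    return z

# Toy usage: 1D Laplacian split into two overlapping subdomains.
n = 8
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
subdomains = [list(range(0, 5)), list(range(3, 8))]
r = np.ones(n)
z = additive_schwarz_apply(A, subdomains, r)   # preconditioned residual
```

In practice such an operator is used inside a Krylov iteration, and condition-number bounds of the kind proved in the article govern the iteration counts observed numerically.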
Abstract:
Resource specialisation, although a fundamental component of ecological theory, is employed in disparate ways. Most definitions derive from simple counts of resource species. We build on recent advances in ecophylogenetics and null model analysis to propose a concept of specialisation that comprises affinities among resources as well as their co-occurrence with consumers. In the distance-based specialisation index (DSI), specialisation is measured as relatedness (phylogenetic or otherwise) of resources, scaled by the null expectation of random use of locally available resources. Thus, specialists use significantly clustered sets of resources, whereas generalists use over-dispersed resources. Intermediate species are classed as indiscriminate consumers. The effectiveness of this approach was assessed with differentially restricted null models, applied to a data set of 168 herbivorous insect species and their hosts. Incorporation of plant relatedness and relative abundance greatly improved specialisation measures compared to taxon counts or simpler null models, which overestimate the fraction of specialists, a problem compounded by insufficient sampling effort. This framework disambiguates the concept of specialisation with an explicit measure applicable to any mode of affinity among resource classes, and is also linked to ecological and evolutionary processes. This will enable a more rigorous deployment of ecological specialisation in empirical and theoretical studies.
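A minimal sketch of the distance-based logic described above (not the authors' exact estimator): the observed mean pairwise distance among the resources a consumer uses is compared with a null distribution obtained by resampling resources in proportion to their local availability. Function and argument names are assumptions for the sketch.

```python
import numpy as np

def dsi_score(dist, used, availability, n_null=999, rng=None):
    """Standardised distance-based specialisation score.

    dist         : (R, R) pairwise distance matrix among resources
                   (phylogenetic or otherwise)
    used         : indices of the resources the consumer was recorded on
                   (repeated in proportion to interaction frequency)
    availability : relative local abundance of each resource, used as
                   sampling weights in the null model
    Returns a z-like score: strongly negative = clustered resources
    (specialist), near zero = indiscriminate, positive = over-dispersed
    (generalist).
    """
    rng = rng or np.random.default_rng()

    def mean_pairwise(idx):
        sub = dist[np.ix_(idx, idx)]
        iu = np.triu_indices(len(idx), k=1)
        return sub[iu].mean()

    obs = mean_pairwise(used)
    p = np.asarray(availability, dtype=float)
    p /= p.sum()
    null = np.array([
        mean_pairwise(rng.choice(len(p), size=len(used), replace=True, p=p))
        for _ in range(n_null)
    ])
    return (obs - null.mean()) / null.std()
```

The direction and strength of the standardised score is what allows consumers to be classed as specialists, indiscriminate consumers or generalists relative to locally available resources.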
Abstract:
A temperature pause introduced in a simple single-step thermal decomposition of iron, in the presence of silver seeds formed in the same reaction mixture, gives rise to novel compact heterostructures: brick-like Ag@Fe3O4 core-shell nanoparticles. This novel method is relatively easy to implement and could help overcome the challenge of obtaining a multifunctional heteroparticle in which a noble metal is surrounded by magnetite. Structural analyses of the samples show 4 nm silver nanoparticles wrapped within compact cubic external structures of Fe oxide with a curious rectangular shape. The magnetic properties indicate near-superparamagnetic behavior with a weak hysteresis at room temperature. The value of the anisotropy involved makes these particles candidates for potential applications in nanomedicine.