947 results for Logical Framework Matrix


Relevance:

30.00%

Abstract:

The matrix of volcaniclastic kimberlite (VK) from the Muskox pipe (Northern Slave Province, Nunavut, Canada) is interpreted to represent an overprint of an original clastic matrix. Muskox VK is subdivided into three matrix mineral assemblages that reflect differences in the proportions of the original primary matrix constituents, the temperature of formation, and the nature of the altering fluids. Using whole-rock X-ray fluorescence (XRF), whole-rock X-ray diffraction (XRD), microprobe analyses, back-scattered electron (BSE) imaging, petrography and core logging, we find that most matrix minerals (serpentine, phlogopite, chlorite, saponite, monticellite, Fe-Ti oxides and calcite) lack either primary igneous or primary clastic textures. The mineralogy and textures are most consistent with an alteration overprint of an original clastic matrix, formed by retrograde reactions as the deposit cooled or, in the case of calcite, by precipitation from Ca-bearing fluids into a secondary porosity. The first mineral assemblage consists largely of serpentine, phlogopite, calcite, Fe-Ti oxides and monticellite and occurs in VK with relatively fresh framework clasts. Alteration reactions, driven by deuteric fluids derived from the juvenile constituents, promote the crystallisation of minerals indicative of relatively high temperatures of formation (> 400 °C). Lower-temperature minerals are absent because permeability was occluded before the deposit cooled to low temperatures, shielding the facies from further interaction with fluids. The other two matrix mineral assemblages consist largely of serpentine, phlogopite and calcite, +/- diopside and +/- chlorite. They form in VK that contains more country rock, which may have made the deposit cooler upon emplacement. Most framework components are completely altered, suggesting that larger volumes of fluids drove the alteration reactions. These fluids were likely of meteoric provenance and were heated by the volcaniclastic debris as they percolated into the VK infill. Most alteration reactions ceased at temperatures > 200 °C, as indicated by the absence or paucity of lower-temperature phases, such as saponite, in most samples. Recognition that Muskox VK contains an original clastic matrix is a necessary first step in evaluating its textural configuration, which in turn is important for reconstructing the physical processes responsible for the formation of the deposit.

Relevance:

30.00%

Abstract:

Purpose – Business models to date have remained the creation of management; however, the authors believe that designers should be critically approaching, challenging and creating new business models as part of their practice. This belief heralds a new era in which business model constructs become the design brief of the future and fuel design and innovation to work together at the strategic level of an organisation. Design/methodology/approach – This paper explores and investigates business model design. The research followed a deductive, structured qualitative content analysis approach utilizing a predetermined categorization matrix. The analysis of forty business cases uncovered commonalities in the key strategic drivers behind these innovative business models. Findings – Five business model typologies were derived from the content analysis, from which quick prototypes of new business models can be created. Research limitations/implications – This research suggests there is no one “right” model; rather, through experimentation, the generation of many unique and diverse concepts can open greater possibilities for future innovation and sustained competitive advantage. Originality/value – This paper builds upon emerging research into the importance and relevance of dynamic, design-driven approaches to the creation of innovative business models. It aims to synthesize knowledge gained from real-world examples into a tangible, accessible and provocative framework that provides new prototyping templates to aid the process of business model experimentation.

Relevance:

30.00%

Abstract:

The complexity, variability and vastness of the northern Australian rangelands make it difficult to assess the risks associated with climate change. In this paper we present a methodology to help industry and primary producers assess the risks associated with climate change and evaluate the effectiveness of adaptation options in managing those risks. Our assessment involved three steps. Initially, the impacts and adaptation responses were documented in matrices by ‘experts’ (rangeland and climate scientists). Then, a modified risk management framework was used to develop risk management matrices that identified important impacts, areas of greatest vulnerability (a combination of potential impact and adaptive capacity) and priority areas for action at the industry level. The process was easy to implement and useful for arranging and analysing large amounts of complex, interacting information. Lastly, regional extension officers (after minimal ‘climate literacy’ training) could build on the knowledge provided here and implement the risk management process in workshops with rangeland land managers. Their participation is likely to identify relevant and robust adaptive responses that stand the best chance of being incorporated into regional and property management decisions. The process developed here for the grazing industry could be modified for use in other industries and sectors. By 2030, some areas of northern Australia will experience more droughts and lower summer rainfall, posing a serious threat to the rangelands. Although the impacts and adaptive responses will vary between ecological and geographic systems, climate change is expected to have noticeable detrimental effects: reduced pasture growth and surface water availability; increased competition from woody vegetation; decreased production per head (beef and wool) and gross margin; and adverse impacts on biodiversity. Further research and development are needed to identify the most vulnerable regions, and to inform policy in time to facilitate transitional change and enable land managers to implement those changes.
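To make the risk-matrix logic above concrete, here is a minimal sketch, in Python, of how a potential-impact rating and an adaptive-capacity rating can be combined into a vulnerability rating. The three-point scales and the combination rule are illustrative assumptions, not the scoring scheme used by the authors.

# Illustrative risk-matrix logic: vulnerability derived from potential
# impact and adaptive capacity. The scales and the lookup rule below
# are assumptions for demonstration only.

IMPACT_LEVELS = ["low", "moderate", "high"]
CAPACITY_LEVELS = ["low", "moderate", "high"]

def vulnerability(potential_impact: str, adaptive_capacity: str) -> str:
    """High impact paired with low adaptive capacity yields the highest
    vulnerability; high adaptive capacity offsets high impact."""
    impact = IMPACT_LEVELS.index(potential_impact)
    capacity = CAPACITY_LEVELS.index(adaptive_capacity)
    score = impact - capacity            # ranges from -2 to +2
    if score >= 1:
        return "high"
    return "moderate" if score == 0 else "low"

# Example: reduced pasture growth rated high impact with low capacity.
print(vulnerability("high", "low"))      # -> high
print(vulnerability("moderate", "high")) # -> low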

Relevance:

30.00%

Abstract:

Partition of unity methods, such as the extended finite element method, allow discontinuities to be simulated independently of the mesh (Int. J. Numer. Meth. Engng. 1999; 45:601-620). This eliminates the need to align the mesh with the discontinuity, or to re-mesh as the discontinuity evolves. However, to compute the stiffness matrix of the elements intersected by the discontinuity, a subdivision of those elements into quadrature subcells aligned with the discontinuity is commonly adopted. In this paper, we use a simple integration technique, proposed for polygonal domains (Int. J. Numer. Meth. Engng 2009; 80(1):103-134. DOI: 10.1002/nme.2589), to suppress the need for element subdivision. Numerical results presented for a few benchmark problems in linear elastic fracture mechanics and for a multi-material problem show that the proposed method yields accurate results. Owing to its simplicity, the proposed integration technique can easily be incorporated into any existing code. Copyright (C) 2010 John Wiley & Sons, Ltd.
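For context, the commonly adopted subdivision approach that the paper's technique replaces amounts to splitting the cut element into quadrature subcells and applying a standard rule on each. The Python sketch below illustrates that baseline on a convex polygonal fragment, using a centroid fan triangulation with a one-point rule per triangle; the function name and the one-point rule are our own illustrative choices.

import numpy as np

def integrate_over_polygon(vertices: np.ndarray, f) -> float:
    """Integrate f(x, y) over a convex polygon (vertices in counter-
    clockwise order, shape (n, 2)) by fanning triangles from the
    centroid and applying a one-point Gauss rule on each triangle."""
    centroid = vertices.mean(axis=0)
    total = 0.0
    n = len(vertices)
    for i in range(n):
        a, b = vertices[i], vertices[(i + 1) % n]
        u, v = a - centroid, b - centroid
        area = 0.5 * abs(u[0] * v[1] - u[1] * v[0])
        midpoint = (centroid + a + b) / 3.0   # exact for linear integrands
        total += area * f(*midpoint)
    return total

# Sanity check: integrating 1 over the unit square gives its area.
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
print(integrate_over_polygon(square, lambda x, y: 1.0))  # -> 1.0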

Relevance:

30.00%

Abstract:

In this thesis I study various quantum coherence phenomena and lay some of the foundations for a systematic coherence theory. So far, the approach to quantum coherence in science has been purely phenomenological. In the thesis I try to answer the question of what quantum coherence is and how it should be approached within the framework of physics, the metatheory of physics and the terminology related to them. It is worth noticing that quantum coherence is a conserved quantity that can be exactly defined. I propose a way to define quantum coherence mathematically from the density matrix of the system. Degenerate quantum gases, i.e., Bose condensates and ultracold Fermi systems, form a good laboratory for studying coherence, since their entropy is small and their coherence is large, and thus they exhibit strong coherence phenomena. Concerning coherence phenomena in degenerate quantum gases, I concentrate mainly on collective association of atoms into molecules, Rabi oscillations and decoherence. It appears that collective association and oscillations do not depend on the spin statistics of the particles. Moreover, I study the logical features of decoherence in closed systems via a simple spin model. I argue that decoherence is a valid concept even in systems that can experience recoherence, i.e., Poincaré recurrences. Metatheoretically this is a remarkable result, since it justifies quantum cosmology: studying the whole universe (i.e., physical reality) purely quantum physically is meaningful and valid science, in which decoherence explains why the quantum physical universe appears very classical-like to cosmologists and other scientists. The study of the logical structure of closed systems also reveals that sufficiently complex closed (physical) systems obey a principle similar to Gödel's incompleteness theorem in logic. According to this principle it is impossible to describe a closed system completely from within the system, and the inside and outside descriptions of the system can differ remarkably. Understanding this feature may make it possible to comprehend coarse-graining better and to define uniquely the mutual entanglement of quantum systems.
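The thesis defines coherence from the density matrix; as an illustration of what such a definition can look like, one standard basis-dependent measure (not necessarily the one adopted in the thesis) is the l1-norm of coherence, written in LaTeX below.

% The l1-norm of coherence: the sum of the magnitudes of the
% off-diagonal elements of the density matrix \rho in a fixed basis.
\[
  C_{\ell_1}(\rho) \;=\; \sum_{i \neq j} \lvert \rho_{ij} \rvert ,
\]
% which vanishes precisely when \rho is diagonal in the chosen basis,
% i.e., when the state carries no coherence in that basis.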

Relevance:

30.00%

Abstract:

Regenerating codes are a class of distributed storage codes that allow for efficient repair of failed nodes, as compared to traditional erasure codes. An [n, k, d] regenerating code permits the data to be recovered by connecting to any k of the n nodes in the network, while requiring that a failed node be repaired by connecting to any d nodes. The amount of data downloaded for repair is typically much smaller than the size of the source data. Previous constructions of exact-regenerating codes have been confined to the case n = d + 1. In this paper, we present optimal, explicit constructions of (a) Minimum Bandwidth Regenerating (MBR) codes for all values of [n, k, d] and (b) Minimum Storage Regenerating (MSR) codes for all [n, k, d >= 2k - 2], using a new product-matrix framework. The product-matrix framework is also shown to significantly simplify system operation. To the best of our knowledge, these are the first constructions of exact-regenerating codes that allow the number n of nodes in the network to be chosen independently of the other parameters. The paper also contains a simpler description, in the product-matrix framework, of a previously constructed MSR code with [n = d + 1, k, d >= 2k - 1].
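In outline (our notation, sketching the product-matrix idea described above), each codeword is formed by multiplying a fixed encoding matrix with a message matrix:

% Product-matrix encoding sketch (notation ours). The codeword is
\[
  C \;=\; \Psi\, M ,
  \qquad \Psi \in \mathbb{F}^{\,n \times d}, \quad M \in \mathbb{F}^{\,d \times \alpha},
\]
% with node i storing the i-th row \psi_i^{\mathsf{T}} M of C. In the MBR
% construction, \alpha = d and M is symmetric, carrying
\[
  B \;=\; kd - \binom{k}{2}
\]
% distinct message symbols; to repair a failed node f, each helper i sends
% the single symbol \psi_i^{\mathsf{T}} M \psi_f, and the symmetry of M
% allows node f to recover its lost row \psi_f^{\mathsf{T}} M from any d
% such responses.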

Relevance:

30.00%

Abstract:

Regional frequency analysis is widely used for estimating quantiles of hydrological extreme events at sparsely gauged or ungauged target sites in river basins. It involves identifying a region (group of watersheds) resembling the watershed of the target site, and using information pooled from that region to estimate the quantile for the target site. In the analysis, the watershed of the target site is assumed to completely resemble the watersheds in the identified region in terms of the mechanism underlying the generation of extreme events. In reality, it is rare to find watersheds that completely resemble each other. A fuzzy clustering approach can account for partial resemblance of watersheds and yield region(s) for the target site. Forming regions and estimating quantiles requires discerning information from the fuzzy-membership matrix obtained with this approach. Practitioners often defuzzify the matrix to form disjoint clusters (regions) and use them as the basis for quantile estimation. This defuzzification approach (DFA) loses the information discerned on partial resemblance of watersheds; because the lost information cannot be utilized in quantile estimation, the estimates can carry significant error. To avert this loss of information, a threshold strategy (TS) was considered in some prior studies. In this study, it is analytically shown that the strategy results in under-prediction of quantiles. To address this, a mathematical approach is proposed, and its effectiveness in estimating flood quantiles relative to DFA and TS is demonstrated through Monte Carlo simulation experiments and a case study on the Mid-Atlantic water resources region, USA. (C) 2015 Elsevier B.V. All rights reserved.
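A small numerical sketch of the two strategies contrasted above: given a fuzzy-membership matrix U (watersheds by clusters), DFA keeps only each watershed's strongest cluster, while TS admits a watershed into every cluster whose membership clears a threshold. The matrix entries and the threshold below are illustrative assumptions.

import numpy as np

U = np.array([
    [0.70, 0.20, 0.10],   # watershed 0: clearly cluster 0
    [0.40, 0.35, 0.25],   # watershed 1: genuinely partial resemblance
    [0.15, 0.45, 0.40],   # watershed 2: split between clusters 1 and 2
])

# Defuzzification (DFA): each watershed joins only its highest-membership
# cluster; information on partial resemblance is discarded.
dfa_assignment = U.argmax(axis=1)                  # -> [0, 0, 1]

# Threshold strategy (TS): a watershed joins every cluster in which its
# membership meets the threshold, retaining partial resemblance.
threshold = 0.30
ts_assignment = [np.flatnonzero(row >= threshold) for row in U]
# -> [array([0]), array([0, 1]), array([1, 2])]

print(dfa_assignment)
print(ts_assignment)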

Relevance:

30.00%

Abstract:

Bone is a complex material with a hierarchical multi-scale organization from the molecule to the organ scale. The genetic bone disease osteogenesis imperfecta (OI) is primarily caused by mutations in the collagen type I genes, resulting in bone fragility. Because the basis of the disease is molecular, with ramifications at the whole-bone level, it provides a platform for investigating the relationship between structure, composition and mechanics throughout the hierarchy. Prior studies have individually shown that OI leads to (1) increased bone mineralization, (2) decreased elastic modulus, and (3) smaller apatite crystal size. However, these outcomes have not been studied together, and the mechanism by which mineral structure influences tissue mechanics has not been identified. This lack of understanding inhibits the development of more accurate models and therapies. To address this research gap, we used a mouse model of the disease (oim) to measure these outcomes together and propose an underlying mechanism for the changes in properties. Our main finding was that, despite increased mineralization, oim bones have lower stiffness, which may result from a poorly organized mineral matrix with significantly smaller, densely packed and disoriented apatite crystals. Using a composite framework, we interpret the observed lower elasticity of the oim bone matrix as the result of a change in the aspect ratio of apatite crystals and a disruption of crystal connectivity.
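One common composite relation in which stiffness depends explicitly on reinforcement aspect ratio is the Halpin-Tsai estimate, given below in LaTeX as an illustration of the kind of composite framework invoked above (not necessarily the authors' exact model).

% Halpin-Tsai estimate: composite modulus E_c from matrix modulus E_m,
% reinforcement modulus E_f, volume fraction V_f, and a shape factor
% zeta set by the reinforcement aspect ratio l/d.
\[
  \frac{E_c}{E_m} \;=\; \frac{1 + \zeta \eta V_f}{1 - \eta V_f},
  \qquad
  \eta \;=\; \frac{E_f/E_m - 1}{E_f/E_m + \zeta},
  \qquad
  \zeta \approx 2\,\frac{l}{d}.
\]
% A reduced apatite aspect ratio lowers \zeta and hence E_c, consistent
% with lower matrix stiffness despite increased mineral content.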

Relevance:

30.00%

Abstract:

Motivated by the problem of learning a linear regression model whose parameter is a large fixed-rank non-symmetric matrix, we consider the optimization of a smooth cost function defined on the set of fixed-rank matrices. We adopt the geometric framework of optimization on Riemannian quotient manifolds. We study the underlying geometries of several well-known fixed-rank matrix factorizations and then exploit the Riemannian quotient geometry of the search space in the design of a class of gradient descent and trust-region algorithms. The proposed algorithms generalize our previous results on fixed-rank symmetric positive semidefinite matrices, apply to a broad range of applications, scale to high-dimensional problems, and confer a geometric basis to recent contributions on the learning of fixed-rank non-symmetric matrices. We make connections with existing algorithms in the context of low-rank matrix completion and discuss the usefulness of the proposed framework. Numerical experiments suggest that the proposed algorithms compete with state-of-the-art algorithms and that manifold optimization offers an effective and versatile framework for the design of machine learning algorithms that learn a fixed-rank matrix. © 2013 Springer-Verlag Berlin Heidelberg.
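As a simple point of contrast with the Riemannian algorithms described above, the sketch below performs plain Euclidean gradient descent on the factors of X = G H^T for low-rank matrix completion. It is not the paper's method; the step size, rank and iteration count are illustrative.

import numpy as np

rng = np.random.default_rng(0)
m, n, r = 50, 40, 3
X_true = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5           # half the entries are observed

G = 0.1 * rng.standard_normal((m, r))     # small random factor initialisation
H = 0.1 * rng.standard_normal((n, r))
step = 0.01

for _ in range(2000):
    # Gradient of 0.5 * ||mask * (G H^T - X_true)||_F^2 w.r.t. G and H.
    residual = mask * (G @ H.T - X_true)
    G, H = G - step * residual @ H, H - step * residual.T @ G

rmse = np.sqrt(np.mean((G @ H.T - X_true) ** 2))
print(f"full-matrix RMSE after completion: {rmse:.3f}")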

Relevance:

30.00%

Abstract:

In this work we present the theoretical framework for the solution of the time-dependent Schrödinger equation (TDSE) of atomic and molecular systems under strong electromagnetic fields, with the configuration space of the electron’s coordinates separated into two regions, namely regions I and II. In region I the solution of the TDSE is obtained from an R-matrix basis set representation of the time-dependent wave function. In region II a grid representation of the wave function is considered, and propagation in space and time is obtained through the finite-difference method. In this way, a combination of basis set and grid methods is put forward for tackling multiregion time-dependent problems. In both regions, a high-order explicit scheme is employed for the time propagation. While in a purely hydrogenic system no approximation is introduced by this separation, in multielectron systems the validity and usefulness of the present method rely on the basic assumption of R-matrix theory, namely, that beyond a certain distance (encompassing region I) a single ejected electron is distinguishable from the other electrons of the multielectron system and evolves there (region II) effectively as a one-electron system. The method is developed in detail for single-active-electron systems and applied to the exemplar case of the hydrogen atom in an intense laser field.
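To illustrate explicit grid propagation of the kind used in region II (with a low-order stand-in for the paper's high-order scheme), the Python sketch below propagates a free 1D Gaussian wave packet with the second-order explicit differencing rule psi(t+dt) = psi(t-dt) - 2i dt H psi(t), taking hbar = m = 1. Grid spacing, time step and packet parameters are illustrative.

import numpy as np

nx, dx, dt, nsteps = 400, 0.1, 1e-3, 2000
x = (np.arange(nx) - nx // 2) * dx

def apply_H(psi):
    """Free-particle Hamiltonian -(1/2) d^2/dx^2 via central differences,
    with the wave function pinned to zero at the grid edges."""
    Hpsi = np.zeros_like(psi)
    Hpsi[1:-1] = -0.5 * (psi[2:] - 2.0 * psi[1:-1] + psi[:-2]) / dx**2
    return Hpsi

# Normalized Gaussian packet with unit mean momentum.
psi_prev = np.exp(-0.5 * x**2) * np.exp(1j * x)
psi_prev /= np.sqrt(np.sum(np.abs(psi_prev) ** 2) * dx)
psi = psi_prev - 1j * dt * apply_H(psi_prev)   # one Euler step to start

for _ in range(nsteps):
    psi_prev, psi = psi, psi_prev - 2j * dt * apply_H(psi)

print(f"norm after propagation: {np.sum(np.abs(psi) ** 2) * dx:.6f}")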

Relevance:

30.00%

Abstract:

As a class of defects in software requirements specifications, inconsistency has been widely studied in both requirements engineering and software engineering. It has been increasingly recognized that maintaining consistency alone often results in other types of non-canonical requirements, including incompleteness of a requirements specification, vague requirements statements, and redundant requirements statements. It is therefore desirable for inconsistency handling to take the related non-canonical requirements into account in requirements engineering. To address this issue, we propose an intuitive generalization of logical techniques for handling inconsistency into techniques suitable for managing non-canonical requirements, which deal with incompleteness and redundancy in addition to inconsistency. We first argue that measuring non-canonical requirements plays a crucial role in handling them effectively. We then present a measure-driven logic framework for managing non-canonical requirements. The framework consists of five main parts: identifying non-canonical requirements, measuring them, generating candidate proposals for handling them, choosing commonly acceptable proposals, and revising the requirements according to the chosen proposals. This generalization can be considered an attempt to handle non-canonical requirements alongside logic-based inconsistency handling in requirements engineering.

Relevance:

30.00%

Abstract:

This paper addresses the pose recovery problem for a particular articulated object: the human body. In this model-based approach, the 2D shape is associated with the corresponding stick figure, allowing the joint segmentation and pose recovery of the subject observed in the scene. The main disadvantage of 2D models is their restriction to a particular viewpoint. To cope with this limitation, local spatio-temporal 2D models corresponding to many views of the same sequences are trained, concatenated and sorted in a global framework. Temporal and spatial constraints are then considered to build the probabilistic transition matrix (PTM), which gives a frame-to-frame estimation of the most probable local models to use during the fitting procedure, thus limiting the feature space. This approach takes advantage of 3D information while avoiding the use of a complex 3D human model. The experiments carried out on both indoor and outdoor sequences have demonstrated the ability of this approach to adequately segment pedestrians and estimate their poses independently of the direction of motion during the sequence. (c) 2008 Elsevier Ltd. All rights reserved.
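A minimal numerical sketch of the PTM's role described above: a belief over local models is propagated one frame ahead through the transition matrix and then pruned, so only the most probable local models enter the fitting step. The matrix values and the pruning threshold are illustrative assumptions, not the paper's trained values.

import numpy as np

# ptm[i, j]: probability that local model i at the current frame is
# followed by local model j at the next frame (rows sum to one).
ptm = np.array([
    [0.80, 0.15, 0.05],
    [0.10, 0.70, 0.20],
    [0.05, 0.25, 0.70],
])

belief = np.array([1.0, 0.0, 0.0])        # frame 0: model 0 is active

for frame in range(1, 4):
    belief = belief @ ptm                  # propagate one frame ahead
    candidates = np.flatnonzero(belief > 0.05)   # prune unlikely models
    print(f"frame {frame}: candidates {candidates}, "
          f"most probable model {belief.argmax()}")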

Relevance:

30.00%

Abstract:

Multiscale micro-mechanics theory is extensively used for the prediction of the material response and damage analysis of a unidirectional lamina using a representative volume element (RVE). This paper presents an RVE-based approach to characterize the material response of a multi-fibre cross-ply laminate, considering the effect of matrix damage and fibre-matrix interfacial strength. The framework of the homogenization theory for periodic media has been used for the analysis of a 'multi-fibre multi-layer representative volume element' (M2RVE) representing the cross-ply laminate. The non-homogeneous stress-strain fields within the M2RVE are related to the average stresses and strains by using the Gauss theorem and the Hill-Mandel strain energy equivalence principle. The interfacial bonding strength affects the in-plane shear stress-strain response significantly. The material response predicted by the M2RVE is in good agreement with experimental results available in the literature. The maximum difference between the shear stress predicted using the M2RVE and the experimental results is ~15%, for a bonding strength of 30 MPa at a strain value of 1.1%.
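The averaging relations referred to above are standard in homogenization; in our notation they read (in LaTeX):

% Volume averages of stress and strain over the RVE domain \Omega:
\[
  \langle \boldsymbol{\sigma} \rangle
    = \frac{1}{\lvert \Omega \rvert} \int_{\Omega} \boldsymbol{\sigma}\, \mathrm{d}V,
  \qquad
  \langle \boldsymbol{\varepsilon} \rangle
    = \frac{1}{\lvert \Omega \rvert} \int_{\Omega} \boldsymbol{\varepsilon}\, \mathrm{d}V,
\]
% linked by the Hill-Mandel strain-energy equivalence condition
\[
  \langle \boldsymbol{\sigma} : \boldsymbol{\varepsilon} \rangle
    = \langle \boldsymbol{\sigma} \rangle : \langle \boldsymbol{\varepsilon} \rangle ,
\]
% which ensures the homogenized medium stores the same energy as the
% average of the microscale fields.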

Relevance:

30.00%

Abstract:

Measuring inconsistency is crucial to effective inconsistency management in software development. A complete measurement of inconsistency should focus not only on the degree but also on the significance of inconsistency. However, most available approaches take only the degree of inconsistency into account; the significance of inconsistency has not yet been given the consideration it needs. This paper presents an approach for measuring the significance of inconsistency arising from different viewpoints in the Viewpoints framework. We call an individual set of requirements belonging to different viewpoints a combined requirements collection. We argue that the significance of inconsistency arising in a combined requirements collection is closely associated with the global priority levels of the requirements involved in the inconsistency. Here we assume that the global priority level of an individual requirement captures the relative importance of every viewpoint including this requirement, as well as the local priority level of the requirement within each such viewpoint. We then use the synthesis of the global priority levels of all the requirements in a combined collection to measure the significance of the collection. Following this, we present a scoring matrix function to measure the significance of inconsistency in an inconsistent combined requirements collection, which describes the contribution made by each subset of the collection to the significance of the set of requirements involved in the inconsistency. An ordering relationship between the inconsistencies of two combined requirements collections, termed 'more significant than', is also presented by comparing their significance scoring matrix functions. Finally, these techniques were implemented in a prototype tool called IncMeasurer, which we developed as a proof of concept.

Relevance:

30.00%

Abstract:

The process involves encapsulation or immobilization of an active solid substance in a cellulose framework by regenerating cellulose dissolved in an ionic liquid solvent in a regenerating solution. The active substance can be initially present either in the ionic liquid or in the regenerating solvent, as a solution or a dispersion. The invention is applicable to molecular encapsulation and to the entrapment of larger particles, including enzymes, nanoparticles and macroscopic components, and to the formation of bulk materials with a wide range of morphological forms. Thus, carbamoylmethylphosphine oxide (I) encapsulated in a cellulose matrix was realized by adding I to a 10% solution of cellulose in 1-butyl-3-methylimidazolium chloride (an ionic liquid) under vigorous stirring and then removing the ionic liquid with water.