909 results for Matrix of complex negotiation
Abstract:
The development of increasingly powerful computers, which has enabled the use of windowing software, has also opened the way for the study, via simulation, of very complex physical systems. In this study, the main issues related to the implementation of interactive simulations of complex systems are identified and discussed. Most existing simulators are closed, in the sense that there is no access to the source code; even if it were available, adapting it to interact with other systems would require extensive rewriting. This work aims to increase the flexibility of such software by developing a set of object-oriented simulation classes that can be extended, by subclassing, at any level, i.e., at the problem-domain, presentation, or interaction level. A strategy is proposed that involves an object-oriented framework, concurrent execution of several simulation modules, a networked windowing system, and the reuse of existing software written in procedural languages. A prototype tool combining these techniques has been implemented and is presented. It allows the on-line definition of the configuration of the physical system and generates the appropriate graphical user interface. Simulation routines have been developed for the chemical recovery cycle of a paper pulp mill. The application of the prototype, through the creation of new classes, to the interactive simulation of this physical system is described. Besides providing visual feedback, the resulting graphical user interface greatly simplifies interaction with this set of simulation modules. This study shows that considerable benefits can be obtained by applying computer science concepts to the engineering domain, helping domain experts to tailor interactive tools to suit their needs.
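As a purely illustrative sketch (the class and method names below are hypothetical, not those of the actual prototype), the extension-by-subclassing idea described above might look like this in Python:

```python
# Hypothetical sketch of extension by subclassing at the problem-domain
# and presentation levels; names and dynamics are invented for illustration.

class SimulationModule:
    """Problem-domain level: numerical model of one physical unit."""
    def step(self, dt: float) -> None:
        raise NotImplementedError

class EvaporatorModule(SimulationModule):
    """Domain-level extension: a unit from a pulp-mill recovery cycle."""
    def __init__(self, liquor_mass: float):
        self.liquor_mass = liquor_mass
    def step(self, dt: float) -> None:
        # Simplified placeholder dynamics, not a real evaporator model.
        self.liquor_mass *= (1.0 - 0.01 * dt)

class ModuleView:
    """Presentation level: maps module state to a display representation."""
    def __init__(self, module: SimulationModule):
        self.module = module
    def render(self) -> str:
        return f"{type(self.module).__name__}: {vars(self.module)}"

if __name__ == "__main__":
    unit = EvaporatorModule(liquor_mass=1000.0)
    view = ModuleView(unit)
    for _ in range(3):
        unit.step(dt=1.0)
        print(view.render())
```

Subclassing at the problem-domain level adds new physical units, while presentation-level subclasses change how state is displayed without touching the numerics.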
Abstract:
Methods for the dynamic modelling and analysis of structures, for example the finite element method, are well developed. However, it is generally agreed that accurate modelling of complex structures is difficult, and for critical applications it is necessary to validate or update the theoretical models using data measured from actual structures. Techniques for identifying the parameters of linear dynamic models from vibration test data have attracted considerable interest recently. However, no method has gained general acceptance, owing to a number of difficulties, mainly: (i) the incomplete number of vibration modes that can be excited and measured, (ii) the incomplete number of coordinates that can be measured, (iii) inaccuracy in the experimental data, and (iv) inaccuracy in the model structure. This thesis reports on a new approach to updating the parameters of a finite element model, as well as of a lumped-parameter model with a diagonal mass matrix. The structure and its theoretical model are equally perturbed by adding mass or stiffness, and an incomplete set of eigen-data is measured. The parameters are then identified by iteratively updating the initial estimates, via sensitivity analysis, using either the eigenvalues alone or both the eigenvalues and eigenvectors of the structure before and after perturbation. It is shown that, with a suitable choice of the perturbing coordinates, exact parameters can be identified if the data and the model structure are exact. The theoretical basis of the technique is presented. To cope with measurement errors and possible inaccuracies in the model structure, a well-known Bayesian approach is used to minimize the least-squares difference between the updated and the initial parameters. The eigen-data of the structure with added mass or stiffness are also determined from the frequency response data of the unmodified structure by a structural modification technique; thus, mass or stiffness does not have to be added physically. The mass-stiffness addition technique is demonstrated by simulation examples and laboratory experiments on beams and an H-frame.
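As a hedged illustration of the iterative updating described above (the symbols and weighting matrices below are generic stand-ins, not the thesis's actual formulation), one sensitivity-based Gauss-Newton step with Bayesian-style regularization toward the initial estimates could be sketched as:

```python
# Minimal sketch of one sensitivity-based updating iteration with
# regularization toward the initial parameter estimates p0.
# S, W_e, W_p and the residual r are illustrative placeholders.
import numpy as np

def update_step(p, p0, S, r, W_e, W_p):
    """One Gauss-Newton step minimizing
       ||W_e^{1/2} (r - S dp)||^2 + ||W_p^{1/2} (p + dp - p0)||^2,
    where r = measured - predicted eigenvalues and S = d(predicted)/dp."""
    A = S.T @ W_e @ S + W_p
    b = S.T @ W_e @ r - W_p @ (p - p0)
    dp = np.linalg.solve(A, b)
    return p + dp

# Toy usage with random, well-conditioned data:
rng = np.random.default_rng(0)
S = rng.normal(size=(6, 3))           # 6 eigenvalues, 3 parameters
p0 = np.ones(3); p = p0.copy()
r = rng.normal(scale=0.1, size=6)     # residual at the current estimate
p = update_step(p, p0, S, r, np.eye(6), 0.1 * np.eye(3))
print(p)
```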
Abstract:
Atomistic Molecular Dynamics provides powerful and flexible tools for the prediction and analysis of molecular and macromolecular systems. Specifically, it provides a means by which we can measure theoretically what cannot be measured experimentally: the dynamic time-evolution of complex systems comprising atoms and molecules. It is particularly suitable for the simulation and analysis of the otherwise inaccessible details of MHC-peptide interaction and, on a larger scale, the simulation of the immune synapse. Progress has been relatively tentative, yet the emergence of truly high-performance computing and the development of coarse-grained simulation now offer the hope of accurately predicting thermodynamic parameters and of simulating not merely a handful of proteins but, in larger and longer simulations, thousands of protein molecules and the cellular-scale structures they form. We exemplify this within the context of immunoinformatics.
Abstract:
Background - Lung cancer is the commonest cause of cancer death in Scotland and is usually advanced at diagnosis. The median time between symptom onset and consultation is 14 weeks, so an intervention to prompt earlier presentation could support earlier diagnosis and enable curative treatment in more cases. Aim - To develop and optimise an intervention to reduce the time between the onset of symptoms that might indicate lung cancer and the first consultation. Design and setting - Iterative development of a complex healthcare intervention according to the MRC Framework, conducted in Northeast Scotland. Method - The study produced a complex intervention to promote early presentation of lung cancer symptoms. An expert multidisciplinary group developed the first draft of the intervention based on theory and existing evidence. This draft was refined following focus groups with health professionals and high-risk patients. Results - The first-draft intervention components included: information communicated persuasively, demonstrations of early consultation and its benefits, behaviour change techniques, and involvement of spouses/partners. Focus groups identified patient engagement, achieving behavioural change, and conflict at the patient–general practice interface as challenges, and measures were incorporated to tackle these. The final intervention comprised a detailed self-help manual and an extended consultation with a trained research nurse, at which specific action plans were devised. Conclusion - The study has developed an intervention that appeals to patients and health professionals and has theoretical potential for benefit. It now requires evaluation.
Abstract:
The microstructural stability of aluminide diffusion coatings, prepared by means of a two-stage pack-aluminization treatment on single-crystal nickel-base superalloy substrates, is considered in this article. Edge-on specimens of coated superalloy are studied using transmission electron microscopy (TEM). The effects of coating thickness and post-coating heat treatment (duration, temperature, and atmosphere) on coating microstructure are examined. The article discusses the partial transformation of the matrix of the coating, from a B2-type phase (nominally NiAl) to an L12 phase (nominally Ni3(Al, Ti)), during exposure at temperatures of 850 °C and 950 °C in air and in vacuum for up to 138 hours. Three possible processes that can account for decomposition of the coating matrix are investigated, namely, interdiffusion between the coating and the substrate, oxidation of the coating surface, and aging of the coating. Of these processes, aging of the coating is shown to be the predominant factor in the coating transformation under the conditions considered. © 1992 The Minerals, Metals and Materials Society, and ASM International.
Abstract:
The purpose of this study was to investigate the effects of elastic anisotropy on nanoindentation measurements in human tibial cortical bone. Nanoindentation was conducted in 12 different directions in three principal planes for both osteonic and interstitial lamellae. The experimental indentation modulus was found to vary with indentation direction and showed obvious anisotropy (one-way analysis of variance test, P < 0.0001). Because the experimental indentation modulus in a specific direction is determined by all of the elastic constants of cortical bone, a complex theoretical model is required to analyze the experimental results. A recently developed analysis of indentation for the properties of anisotropic materials was used to quantitatively predict the indentation modulus from the stiffness matrix of human tibial cortical bone, which was obtained from previous ultrasound studies. After allowing for the effects of specimen preparation (dehydrated specimens in nanoindentation tests vs. moist specimens in ultrasound tests) and the structural properties of bone (different microcomponents with different mechanical properties), there were no statistically significant differences between the corrected experimental indentation modulus (Mexp) values and the corresponding predicted indentation modulus (Mpre) values (two-tailed unpaired t-test, P > 0.05). The variation of Mpre values was found to exhibit the same trends as the corrected Mexp data. These results show that the effects of anisotropy on nanoindentation measurements can be quantitatively evaluated. © 2002 Orthopaedic Research Society. Published by Elsevier Science Ltd. All rights reserved.
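The full anisotropic indentation analysis referred to above requires numerical work beyond the scope of an abstract; as a simplified, isotropic stand-in (illustrative numbers only, not data from the study), the basic link between elastic constants and indentation modulus is:

```python
# Simplified isotropic stand-in for the anisotropic indentation analysis:
# for an isotropic solid the indentation (plane-strain) modulus is
# M = E / (1 - nu^2). The anisotropic case replaces this with a
# direction-dependent average over the full stiffness matrix (not shown).
def indentation_modulus_isotropic(E_gpa: float, nu: float) -> float:
    return E_gpa / (1.0 - nu ** 2)

# Illustrative numbers only (typical order of magnitude for cortical bone):
print(indentation_modulus_isotropic(E_gpa=20.0, nu=0.3))  # ~22.0 GPa
```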
Abstract:
Assessment criteria are increasingly incorporated into teaching, making it important to clarify the pedagogic status of the qualities to which they refer. We reviewed theory and evidence about the extent to which four core criteria for student writing (critical thinking, use of language, structuring, and argument) refer to the outcomes of three types of learning: generic skills learning, a deep approach to learning, and complex learning. The analysis showed that all four core criteria describe, to some extent, properties of text resulting from the use of skills, but none qualifies fully as a description of the outcomes of applying generic skills. Most also describe certain aspects of the outcomes of taking a deep approach to learning. Critical thinking and argument correspond most closely to the outcomes of complex learning. At lower levels of performance, use of language and structuring describe the outcomes of applying transferable skills; at higher levels of performance, they describe the outcomes of taking a deep approach to learning. We propose that the type of learning required to meet the core criteria is most usefully and accurately conceptualized as the learning of complex skills, and that this provides a conceptual framework for maximizing the benefits of using assessment criteria as part of teaching. © 2006 Taylor & Francis.
Abstract:
This work presents the research and development of a mathematical model for the optimal distribution of resources (primarily financial) to achieve a new (increased) level of quality (reliability) in a complex system for which a restructuring decision has been made. The final model provides answers, together with an algorithm for their calculation, to the following questions: how many elements of the system should be allocated for modernization, which elements, and to what depth each of the allocated elements should be modernized; the answers are optimal by the criterion of minimizing financial costs.
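A toy version of this allocation problem, with invented costs, reliability gains, and a series-system assumption (purely illustrative, not the paper's model), can be brute-forced as follows:

```python
# Toy version of the allocation problem above: choose which elements to
# modernize and to what depth so that system reliability reaches a target
# at minimum cost. All numbers are invented; a series system is assumed.
from itertools import product

base_rel = [0.90, 0.85, 0.95]        # element reliabilities (series system)
cost = {0: 0.0, 1: 1.0, 2: 2.5}      # cost per modernization depth (0 = none)
gain = {0: 0.00, 1: 0.05, 2: 0.09}   # reliability gain per depth
target = 0.93

best = None
for depths in product(cost, repeat=len(base_rel)):
    rel = 1.0
    for r, d in zip(base_rel, depths):
        rel *= min(r + gain[d], 1.0)
    total = sum(cost[d] for d in depths)
    if rel >= target and (best is None or total < best[0]):
        best = (total, depths, rel)

print(best)  # -> (minimum cost, depth per element, achieved reliability)
```

For realistic system sizes, exhaustive search would be replaced by dynamic programming or integer programming, but the structure of the question (how many, which, how deep) is the same.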
Abstract:
For the facilities of a large-scale gas-transport company (GTC), a complex, unified, evolutionary approach is suggested that covers basic building concepts, up-to-date technologies, models, methods, and means used in the phases of design, adoption, maintenance, and development of multilevel automated distributed control systems (ADCS). As a single methodological basis for the suggested approach, three basic Concepts were worked out, containing the main methodological principles and conceptual provisions for the creation of distributed control systems at three levels: the lower level (automated control of technological processes based on up-to-date SCADA systems), the middle level (operative-dispatch production control based on MES systems), and the upper level (business-process control based on complex automated ERP systems).
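Purely as an illustration of the three-level hierarchy described above (the class names and data below are hypothetical, not part of the suggested approach):

```python
# Illustrative-only sketch of the SCADA -> MES -> ERP layering:
# each level consumes aggregated data from the level below it.
class ScadaNode:          # lower level: technological process control
    def read_sensors(self):
        return {"pressure_MPa": 5.4, "flow_m3h": 1200}

class MesNode:            # middle level: operative-dispatch control
    def __init__(self, scada): self.scada = scada
    def shift_report(self):
        data = self.scada.read_sensors()
        return {"avg_flow_m3h": data["flow_m3h"], "alarms": 0}

class ErpNode:            # upper level: business-process control
    def __init__(self, mes): self.mes = mes
    def plan(self):
        report = self.mes.shift_report()
        return f"Dispatch plan based on avg flow {report['avg_flow_m3h']} m3/h"

print(ErpNode(MesNode(ScadaNode())).plan())
```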
Abstract:
Objectives: To develop a tool for the accurate reporting and aggregation of findings from each of the multiple methods used in a complex evaluation, in an unbiased way. Study Design and Setting: We developed a Method for Aggregating The Reporting of Interventions in Complex Studies (MATRICS) within a gastroenterology study [Evaluating New Innovations in (the delivery and organisation of) Gastrointestinal (GI) endoscopy services by the NHS Modernisation Agency (ENIGMA)]. We subsequently tested it on a different gastroenterology trial [Multi-Institutional Nurse Endoscopy Trial (MINuET)]. We created three layers to define the effects, methods, and findings from ENIGMA. We assigned numbers to each effect in layer 1 and letters to each method in layer 2. We then assigned an alphanumeric code based on layers 1 and 2 to every finding in layer 3, linking the aims, methods, and findings. We illustrated analogous findings by assigning more than one alphanumeric code to a finding. We also showed that more than one effect or method could report the same finding. We presented contradictory findings by listing them in adjacent rows of the MATRICS. Results: MATRICS was useful for the effective synthesis and presentation of findings from the multiple methods used in ENIGMA. We subsequently tested it successfully by applying it to the MINuET trial. Conclusion: MATRICS is effective for synthesizing the findings of complex, multiple-method studies.
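A minimal sketch of the MATRICS-style alphanumeric coding described above (the effects, methods, and findings below are invented for illustration, not taken from ENIGMA or MINuET):

```python
# Layer 1: effects get numbers; layer 2: methods get letters;
# layer 3: each finding carries one or more effect-method codes.
effects = {1: "Waiting times", 2: "Patient satisfaction"}          # layer 1
methods = {"A": "Routine data analysis", "B": "Patient survey"}    # layer 2

findings = [  # layer 3: finding text with its alphanumeric codes
    ("Waiting times fell after the service change", ["1A"]),
    ("Patients reported shorter perceived waits",   ["1B", "2B"]),  # analogous
]

for text, codes in findings:
    links = "; ".join(f"{effects[int(c[0])]} via {methods[c[1]]}" for c in codes)
    print(f"{text}  [{', '.join(codes)}] -> {links}")
```

A finding with more than one code marks analogous results across methods, and contradictory findings would sit in adjacent rows of the resulting matrix.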
Abstract:
2000 Mathematics Subject Classification: 35P25, 81U20, 35S30, 47A10, 35B38.
Abstract:
This presentation focuses on methods for the evaluation of complex policies. In particular, it focuses on evaluating interactions between policies and the extent to which two or more interacting policies mutually reinforce or hinder one another, in the area of environmental sustainability. Environmental sustainability is increasingly gaining recognition as a complex policy area, requiring a more systemic perspective and approach (e.g. European Commission, 2011). Current trends in human levels of resource consumption are unsustainable, and single solutions that target isolated issues independently of the broader context have so far fallen short. Instead, there is a growing call among both academics and policy practitioners for systemic change that acknowledges and engages with the complex interactions, barriers, and opportunities across the different actors, sectors, and drivers of production and consumption. Policy mixes, and the combination and ordering of policies within them, therefore become an important focus for those aspiring to design and manage transitions to sustainability. To this end, we need a better understanding of the interactions, synergies, and conflicts between policies (Cunningham et al., 2013; Geels, 2014). As a contribution to this emerging field of research, and to inform its next steps, I present a review of the methods available for quantifying the impacts of complex policy interactions, since no established method exists among practitioners, and I explore the merits of such attempts. The presentation builds on key works in the field of complexity science (e.g. Anderson, 1972), revisiting and combining these with more recent contributions in the emerging field of policy, complex systems, and evaluation (e.g. Johnstone et al., 2010). With a coalition of UK Government departments, agencies, and Research Councils soon to announce the launch of a new internationally leading centre to pioneer, test, and promote innovative and inclusive methods for policy evaluation across the energy-environment-food nexus, the contribution is particularly timely.
Abstract:
With the development of information technology, the theory and methodology of complex networks have been introduced into language research, representing the language system as a complex network composed of nodes and edges for the quantitative analysis of language structure. The development of dependency grammar provides theoretical support for the construction of a treebank corpus, making a statistical analysis of complex networks possible. This paper introduces the theory and methodology of complex networks and builds dependency syntactic networks based on the treebank of speeches from the EEE-4 oral test. By analysing the overall characteristics of the networks, including the number of edges, the number of nodes, the average degree, the average path length, the network centrality, and the degree distribution, it aims to find potential differences and similarities in the networks across various grades of speaking performance. Through clustering analysis, this research intends to demonstrate the discriminating power of the network parameters and to provide a potential reference for scoring speaking performance.
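The global metrics listed above can be computed with standard tools; a minimal sketch using networkx on a toy dependency network (the edges are invented for illustration, not drawn from the EEE-4 treebank):

```python
# Sketch of the global network metrics named above, computed on a toy
# dependency network (edges link syntactic heads to their dependents).
import networkx as nx

G = nx.Graph()  # undirected view for path-length and centrality measures
G.add_edges_from([("like", "I"), ("like", "music"), ("music", "classical"),
                  ("like", "very"), ("very", "much")])

print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
degrees = [d for _, d in G.degree()]
print("average degree:", sum(degrees) / len(degrees))
print("average path length:", nx.average_shortest_path_length(G))
print("degree centrality:", nx.degree_centrality(G))
print("degree distribution:", nx.degree_histogram(G))
```

Comparing such parameter vectors across speech samples is what the clustering analysis would then operate on.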
Abstract:
Mitochondrial Complex II is a key mitochondrial enzyme connecting the tricarboxylic acid (TCA) cycle and the electron transport chain. Studies of complex II are clinically important, since new roles for this enzyme have recently emerged in cell signalling, cancer biology, immune response, and neurodegeneration. Oxaloacetate (OAA) is an intermediate of the TCA cycle and at the same time a high-affinity inhibitor of complex II (Kd ~ 10⁻⁸ M). Whether OAA inhibition of complex II is a physiologically relevant process is a significant but still controversial topic. We found that complex II from mouse heart and brain tissue has similar affinity for OAA and that only a fraction of the enzyme in isolated mitochondrial membranes (30.2 ± 6.0% and 56.4 ± 5.6% in the heart and brain, respectively) is in the free, active form. Since OAA could bind to complex II during isolation, we established a novel approach to deplete OAA in the homogenates at the early stages of isolation. In heart, this treatment significantly increased the fraction of free enzyme, indicating that OAA binds to complex II during isolation. In brain, the OAA-depleting system did not significantly change the amount of free enzyme, indicating that a large fraction of complex II is already in the OAA-bound, inactive form. Furthermore, short-term ischemia resulted in a dramatic decline of OAA in tissues, but it did not change the amount of free complex II. Our data show that in brain OAA is an endogenous effector of complex II, potentially capable of modulating the activity of the enzyme.
Abstract:
Field-programmable gate arrays are ideal hosts to custom accelerators for signal, image, and data processing but demand manual register transfer level design if high performance and low cost are desired. High-level synthesis reduces this design burden but requires manual design of complex on-chip and off-chip memory architectures, a major limitation in applications such as video processing. This paper presents an approach to resolve this shortcoming. A constructive process is described that can derive such accelerators, including on- and off-chip memory storage, from a C description such that a user-defined throughput constraint is met. By employing a novel statement-oriented approach, dataflow intermediate models are derived and used to support simple approaches for on-/off-chip buffer partitioning, derivation of custom on-chip memory hierarchies, and architecture transformation to ensure user-defined throughput constraints are met with minimum cost. When applied to accelerators for full search motion estimation, matrix multiplication, Sobel edge detection, and fast Fourier transform, it is shown how real-time performance up to an order of magnitude in advance of existing commercial HLS tools is enabled whilst including all requisite memory infrastructure. Further, optimizations are presented that reduce the on-chip buffer capacity and physical resource cost by up to 96% and 75%, respectively, whilst maintaining real-time performance.
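Not the paper's toolflow, but a toy software model of the on-chip line-buffering pattern such an accelerator would use for Sobel edge detection, illustrating why only two image rows need to be buffered on chip while pixels stream in from off-chip memory:

```python
# Toy model of on-chip line buffering for a streaming Sobel accelerator:
# only the two most recent full rows are kept "on chip", so each input
# pixel is fetched from off-chip memory exactly once.
def sobel_stream(rows):
    """rows: iterator over image rows (lists of ints); yields |Gx| rows."""
    buf = []  # on-chip line buffers: at most two previous rows
    for row in rows:
        if len(buf) == 2:
            top, mid = buf
            out = []
            for x in range(1, len(row) - 1):
                gx = (top[x-1] + 2*mid[x-1] + row[x-1]
                      - top[x+1] - 2*mid[x+1] - row[x+1])
                out.append(abs(gx))
            yield out
        buf = (buf + [row])[-2:]

image = [[0, 0, 9, 9], [0, 0, 9, 9], [0, 0, 9, 9], [0, 0, 9, 9]]
for out_row in sobel_stream(iter(image)):
    print(out_row)  # horizontal-gradient magnitudes at the vertical edge
```

The buffer capacity here is two rows regardless of image height, which is the kind of saving the derived memory hierarchies in the paper automate.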