923 results for Complex Engineering Systems
Abstract:
The two areas of theory upon which this research was based were 'strategy development process' (SDP) and 'complex adaptive systems' (CAS), as part of complexity theory, focused on human social organisations. The literature reviewed showed that there is a paucity of empirical work and theory in the overlap of the two areas, providing an opportunity for contributions to knowledge in each area of theory, and for practitioners. An inductive approach was adopted for this research, in an effort to discover new insights into the focus area of study. It was undertaken from within an interpretivist paradigm, and based on a novel conceptual framework. The organisationally intimate nature of the research topic, and the researcher's circumstances, required a research design that was both in-depth and long term. The result was a single, exploratory case study, which included use of data from 44 in-depth, semi-structured interviews with 36 people, involving all the top management team members and significant other staff members; observations, rumour and grapevine (ORG) data; and archive data, over a 5½-year period (2005–2010). Findings confirm the validity of the conceptual framework, and that complex adaptive systems theory has potential to extend strategy development process theory. It has shown how and why the strategy process developed in the case study organisation by providing deeper insights into the behaviour of the people, their backgrounds, and interactions. Broad predictions of the 'latent strategy development' process and some elements of the strategy content are also possible. Based on this research, it is possible to extend the utility of the SDP model by including people's behavioural characteristics within the organisation, via complex adaptive systems theory. Further research is recommended to test the limits of application of the conceptual framework and improve its efficacy with more organisations across a variety of sectors.
Abstract:
In this article, we describe and model the language classroom as a complex adaptive system (see Logan & Schumann, 2005). We argue that linear, categorical descriptions of classroom processes and interactions do not sufficiently explain the complex nature of classrooms, and cannot account for how classroom change occurs (or does not occur), over time. A relational model of classrooms is proposed which focuses on the relations between different elements (physical, environmental, cognitive, social) in the classroom and on how their interaction is crucial in understanding and describing classroom action.
Abstract:
A systematic analysis of trends towards integration and hybridization in contemporary expert systems is conducted, and a particular class of applied expert systems, integrated expert systems, is considered. For this purpose, terminology, classifications, and models proposed by the author are employed. As examples of integrated expert systems, Russian systems designed in this field and available to the majority of specialists are analyzed.
Abstract:
Certain theoretical and methodological problems of designing real-time dynamical expert systems, which belong to the class of the most complex integrated expert systems, are discussed. Primary attention is given to the problems of designing subsystems for modeling the external environment in the case where the environment is represented by complex engineering systems. A specific approach to designing simulation models for complex engineering systems is proposed and examples of the application of this approach based on the G2 (Gensym Corp.) tool system are described.
Abstract:
“Availability” is the term used in asset-intensive industries such as petrochemicals and hydrocarbon processing to describe the readiness of equipment, systems or plants to perform their designed functions. It is a measure of a facility’s capability to meet targeted production in a safe working environment. Availability is also vital because it encompasses both reliability and maintainability, allowing engineers to manage and operate facilities by focusing on a single performance indicator. These benefits make availability a demanding and highly desirable area of interest and research for both industry and academia. In this dissertation, new models, approaches and algorithms are explored to estimate and manage the availability of complex hydrocarbon processing systems. The risk of equipment failure and its effect on availability is vital in the hydrocarbon industry, and is also explored in this research. The importance of availability has encouraged companies to invest in this domain, putting effort and resources into developing novel techniques for system availability enhancement. Most of the work in this area focuses on individual equipment rather than facility- or system-level availability assessment and management. This research is focused on developing new systematic methods to estimate system availability. Its main focus areas are availability estimation and management through physical asset management, risk-based availability estimation strategies, availability and safety using a failure assessment framework, and availability enhancement using early equipment fault detection and maintenance scheduling optimization.
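The link between availability, reliability and maintainability described above can be illustrated with the standard steady-state (inherent) availability relation from reliability engineering, A = MTBF / (MTBF + MTTR). This is a minimal sketch of that textbook formula, not the dissertation's own models; the function names and the series-system example are illustrative assumptions.

```python
def steady_state_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Inherent availability: the long-run fraction of time a unit is operable.

    MTBF (mean time between failures) captures reliability and MTTR
    (mean time to repair) captures maintainability, which is how one
    indicator can encompass both.
    """
    return mtbf_hours / (mtbf_hours + mttr_hours)


def series_system_availability(unit_availabilities: list[float]) -> float:
    """A series system is available only when every unit is available,
    so system availability is the product of unit availabilities."""
    a = 1.0
    for unit_a in unit_availabilities:
        a *= unit_a
    return a


# Hypothetical pump with MTBF = 950 h and MTTR = 50 h:
a_pump = steady_state_availability(950, 50)            # 0.95
# Three such pumps in a processing train (series configuration):
a_train = series_system_availability([a_pump] * 3)     # ~0.857
```

Note how availability at the system level falls below that of any single unit, which is one reason facility-level assessment differs from equipment-level assessment.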
Abstract:
We thank Dr. R. Yang (formerly at ASU), Dr. R.-Q. Su (formerly at ASU), and Mr. Zhesi Shen for their contributions to a number of original papers on which this Review is partly based. This work was supported by ARO under Grant No. W911NF-14-1-0504. W.-X. Wang was also supported by NSFC under Grants No. 61573064 and No. 61074116, as well as by the Fundamental Research Funds for the Central Universities, Beijing Nova Programme.
Abstract:
Lactic acid bacteria exopolysaccharides (LAB-EPS), in particular those formed from sucrose, have the potential to improve food and beverage rheology and enhance sensory properties, potentially replacing or reducing the expensive hydrocolloids currently used as improvers in the food and beverage industries. Addition of sucrose not only enables EPS formation but also affects organic acid formation, thus influencing the sensory properties of the resulting food/beverage products. In the first part of the study, the organoleptic modulation of barley malt derived wort fermented using in situ produced bacterial polysaccharides was investigated. Weissella cibaria MG1 was capable of producing exopolysaccharides during sucrose-supplemented barley malt derived wort fermentation. Even though the strain dominated the (sucrose-supplemented) wort fermentation, it produced EPS (14.4 g/L) with lower efficiency than in SucMRS (34.6 g/L). Higher maltose concentration in wort led to increased formation of oligosaccharides (OS) at the expense of EPS. Additionally, small amounts of organic acids were formed and ethanol remained below 0.5% (v/v). W. cibaria MG1 fermented worts supplemented with 5 or 10% sucrose displayed shear-thinning behaviour, indicating the formation of polymers. This report showed how a novel and nutritious LAB-fermented wort-based beverage, with prospects for further advancement, can be formulated using tailored microbial cultures. In the next step, the ability of exopolysaccharide-producing Weissella cibaria MG1 to improve the rheological properties of fermented plant-based milk substitutes, based on soy and quinoa, was evaluated. W. cibaria MG1 grew well in soy milk, exceeding a cell count of 8 log cfu/g within 6 h of fermentation. The presence of W. cibaria MG1 led to a decrease in gelation and fermentation time.
EPS isolated from soy yoghurts supplemented with sucrose were higher in molecular weight (1.1 × 10^8 g/mol vs 6.6 × 10^7 g/mol) and resulted in reduced gel stiffness (190 ± 2.89 Pa vs 244 ± 15.9 Pa). Soy yoghurts showed a typical biopolymer gel structure, and the network structure changed to larger pores and less cross-linking in the presence of sucrose and with increasing molecular weight of the EPS. An in situ investigation of EPS-producing Weissella cibaria MG1 in quinoa-based milk was performed. The production of quinoa milk, starting from wholemeal quinoa flour, was optimised to maximise EPS production. To this end, enzymatic destructuration of the protein and carbohydrate components of quinoa milk was successfully achieved by applying alpha-amylase and protease treatments. Fermented wholemeal quinoa milk using Weissella cibaria MG1 showed high viable cell counts (>10^9 cfu/mL), a pH of 5.16, and significantly higher water-holding capacity (WHC, 100%), viscosity (>0.5 Pa·s) and exopolysaccharide (EPS) content (40 mg/L) than the chemically acidified control. High EPS (dextran) concentration in quinoa milk caused earlier aggregation because the EPS occupied more space, forcing the chenopodin to interact with each other. Direct observation of the microstructure of fermented quinoa milk indicated that EPS–protein network structures could improve the texture of fermented quinoa milk. Overall, Weissella cibaria MG1 showed favourable technological properties and great potential for further application in the development of high-viscosity fermented quinoa milk. The last part of the study investigated the ex situ application of LAB-EPS (dextran), compared to other hydrocolloids, as a novel food ingredient to compensate for low protein in biscuit and wholemeal wheat flour. Three hydrocolloids, xanthan gum, dextran and hydroxypropyl methylcellulose, were incorporated into bread recipes based on high-protein flours, low-protein flours and coarse wholemeal flour.
Hydrocolloid levels of 0–5% (flour basis) were used in bread recipes to test water absorption. The quality parameters of dough (farinograph, extensograph, rheofermentometer) and bread (specific volume, crumb structure and staling profile) were determined. Results showed that xanthan had a negative impact on dough and bread quality characteristics. HPMC and dextran generally improved dough and bread quality and showed dosage dependence. The volume of low-protein flour breads was significantly improved by incorporation of 0.5% of the latter two hydrocolloids. However, dextran outperformed HPMC regarding initial bread hardness and staling shelf life, regardless of the flour applied in the formulation.
Abstract:
The work presented herein covers a broad range of research topics and so, in the interest of clarity, is presented in a portfolio format. Accordingly, each chapter consists of its own introductory material prior to presentation of the key results garnered; this is then followed by a short discussion of their significance. In the first chapter, a methodology to facilitate the resolution and qualitative assessment of very large inorganic polyoxometalates was designed and implemented employing ion-mobility mass spectrometry. Furthermore, the potential of this technique for ‘mapping’ the conformational space occupied by this class of materials was demonstrated. These claims are then substantiated by the development of a tuneable, polyoxometalate-based calibration protocol that provided the necessary platform for quantitative assessment of similarly large, but unknown, polyoxometalate species. In addition, whilst addressing a major limitation of travelling wave ion mobility, this result also highlighted the potential of the technique for solution-phase cluster discovery. The second chapter reports on the application of a biophotovoltaic electrochemical cell for characterising the electrogenic activity inherent to a number of mutant Synechocystis strains. The intention was to determine the key components in the photosynthetic electron transport chain responsible for extracellular electron transfer, helping to address the significant lack of mechanistic understanding in this field. Finally, in the third chapter, the design and fabrication of a low-cost, highly modular, continuous cell culture system is presented. To demonstrate the advantages and suitability of this platform for experimental evolution investigations, an exploration of the photophysiological response to gradual iron limitation, in both the ancestral wild type and a randomly generated mutant library population, was undertaken.
Furthermore, coupling random mutagenesis to continuous culture in this way is shown to constitute a novel source of genetic variation that is open to further investigation.
Abstract:
Self-replication and compartmentalization are two central properties thought to be essential for minimal life, and understanding how such processes interact in the emergence of complex reaction networks is crucial to exploring the development of complexity in chemistry and biology. Autocatalysis can emerge from multiple different mechanisms, such as formation of an initiator, template self-replication and physical autocatalysis (where micelles formed from the reaction product solubilize the reactants, leading to higher local concentrations and therefore higher rates). Amphiphiles are also used in artificial life studies to create protocell models such as micelles, vesicles and oil-in-water droplets, and can increase reaction rates by encapsulation of reactants. So far, no template self-replicator exists which is capable of compartmentalization, or of transferring this molecular-scale phenomenon to micro- or macro-scale assemblies. Here a system is demonstrated where an amphiphilic imine catalyses its own formation by joining a non-polar alkyl tail group with a polar carboxylic acid head group to form a template, which was shown to form reverse micelles by Dynamic Light Scattering (DLS). The kinetics of this system were investigated by 1H NMR spectroscopy, showing clearly that a template self-replication mechanism operates, though there was no evidence that the reverse micelles participated in physical autocatalysis. Active oil droplets, composed of a mixture of insoluble organic compounds in an aqueous sub-phase, can undergo processes such as division, self-propulsion and chemotaxis, and are studied as models for minimal cells, or protocells. Although in most cases the Marangoni effect is responsible for the forces on the droplet, the behaviour of the droplet depends heavily on its exact composition.
Though theoretical models are able to calculate the forces on a droplet, to model a mixture of oils on an aqueous surface where compounds from the oil phase are dissolving and diffusing through the aqueous phase is beyond current computational capability. The behaviour of a droplet in an aqueous phase can only be discovered through experiment, though it is determined by the droplet's composition. By using an evolutionary algorithm and a liquid handling robot to conduct droplet experiments and decide which compositions to test next, entirely autonomously, the composition of the droplet becomes a chemical genome capable of evolution. The selection is carried out according to a fitness function, which ranks the formulation based on how well it conforms to the chosen fitness criteria (e.g. movement or division). Over successive generations, significant increases in fitness are achieved, and this increase is higher with more components (i.e. greater complexity). Other chemical processes such as chemiluminescence and gelation were investigated in active oil droplets, demonstrating the possibility of controlling chemical reactions by selective droplet fusion. Potential future applications for this might include combinatorial chemistry, or additional fitness goals for the genetic algorithm. Combining the self-replication and the droplet protocells research, it was demonstrated that the presence of the amphiphilic replicator lowers the interfacial tension between droplets of a reaction mixture in organic solution and the alkaline aqueous phase, causing them to divide. Periodic sampling by a liquid handling robot revealed that the extent of droplet fission increased as the reaction progressed, producing more individual protocells with increased self-replication. This demonstrates coupling of the molecular scale phenomenon of template self-replication to a macroscale physicochemical effect.
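The closed-loop evolutionary search described above can be sketched as a simple genetic algorithm in which a droplet formulation (a composition vector) is the genome and an experimentally measured behaviour is the fitness. This is a hedged illustration, not the authors' actual code: in the real system a liquid-handling robot evaluates each formulation, so here a toy analytic fitness function stands in, and all names (`evolve`, `toy_fitness`, parameter values) are assumptions for the sketch.

```python
import random


def evolve(fitness, n_components=4, pop_size=20, generations=10, seed=1):
    """Evolve droplet formulations: vectors of component fractions summing to 1.

    `fitness` stands in for the robotic experiment that scores a
    formulation (e.g. by measured movement or division).
    """
    rng = random.Random(seed)

    def normalise(v):
        s = sum(v)
        return [x / s for x in v]

    def random_genome():
        return normalise([rng.random() for _ in range(n_components)])

    def mutate(genome, sigma=0.1):
        # Perturb each fraction, keep it positive, renormalise to sum to 1.
        return normalise([max(1e-6, x + rng.gauss(0, sigma)) for x in genome])

    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: pop_size // 4]          # truncation selection (elitist)
        population = parents + [
            mutate(rng.choice(parents)) for _ in range(pop_size - len(parents))
        ]
    return max(population, key=fitness)


# Toy stand-in fitness: droplets behave "best" near a fixed target mixture.
target = [0.4, 0.3, 0.2, 0.1]
toy_fitness = lambda g: -sum((a - b) ** 2 for a, b in zip(g, target))
best = evolve(toy_fitness)
```

Because the parents are carried over unmutated, the best fitness never decreases between generations, mirroring the reported increase in fitness over successive generations.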
Abstract:
Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons that the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing at runtime typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a basic point of scientific activity. Currently, most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still largely "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented, and are very often defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models becomes fundamental for comparing and evaluating the methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e.
the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is at least clear that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent systems community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions (entities of the environment encapsulating some functions) and topology abstractions (entities of the environment that represent its spatial structure, either logical or physical). In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for supporting the management of the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which designers can use to provide different levels of abstraction over multi-agent systems. Research in these fields has led to the formulation of a new version of the SODA methodology, in which environment abstractions and layering principles are exploited for engineering multi-agent systems.
Abstract:
The paper proposes a methodology especially focused on the generation of strategic plans of action, emphasizing the relevance of a structured timeframe classification for the actions. The methodology explicitly recognizes the relevance of long-term goals as strategic drivers, which must ensure that the complex system is capable of responding effectively to changes in the environment. In addition, the methodology employs engineering systems techniques in order to understand the inner workings of the system and to build up alternative plans of action. Owing to these different aspects, the proposed approach offers higher flexibility than traditional methods. The validity and effectiveness of the methodology are demonstrated by analyzing an airline company composed of five subsystems, with the aim of defining a plan of action for the next five years that can improve efficiency, redefine the mission, or increase revenues.
Abstract:
Dissertation presented at the Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, to obtain the degree of Master in Engenharia Informática (Informatics Engineering)
Abstract:
Economies are open complex adaptive systems far from thermodynamic equilibrium, and neo-classical environmental economics seems not to be the best way to describe the behaviour of such systems. Standard econometric analysis (i.e. time series) takes a deterministic and predictive approach, which encourages the search for predictive policy to ‘correct’ environmental problems. Rather, it seems that, because of the characteristics of economic systems, an ex-post analysis is more appropriate, which describes the emergence of such systems’ properties, and which sees policy as a social steering mechanism. With this background, some of the recent empirical work published in the field of ecological economics that follows the approach defended here is presented. Finally, the conclusion is reached that a predictive use of econometrics (i.e. time series analysis) in ecological economics should be limited to cases in which uncertainty decreases, which is not the normal situation when analysing the evolution of economic systems. However, that does not mean we should not use empirical analysis. On the contrary, this is to be encouraged, but from a structural and ex-post point of view.
Abstract:
There is an increasing reliance on computers to solve complex engineering problems, because computers, in addition to supporting the development and implementation of adequate and clear models, can greatly reduce the financial resources required. The ability of computers to perform complex calculations at high speed has enabled the creation of highly complex systems to model real-world phenomena. The complexity of fluid dynamics problems makes it difficult or impossible to solve the equations for an object in a flow exactly. Approximate solutions can be obtained by constructing and measuring prototypes placed in a flow, or by numerical simulation. Since the use of prototypes can be prohibitively time-consuming and expensive, many have turned to simulations to provide insight during the engineering process, as the simulation setup and parameters can be altered much more easily than in a real-world experiment. The objective of this research work is to develop numerical models for different suspensions (fiber suspensions, blood flow through microvessels and branching geometries, and magnetic fluids), and also for fluid flow through porous media. The models have merit as scientific tools and also have practical applications in industry. Most of the numerical simulations were performed with the commercial software Fluent, with user-defined functions added to apply a multiscale method and a magnetic field. The results from the simulation of fiber suspensions can elucidate the physics behind the break-up of a fiber floc, opening the possibility of developing a meaningful numerical model of fiber flow. The simulation of blood movement from an arteriole through a venule via a capillary showed that the VOF-based model can successfully predict the deformation and flow of RBCs in an arteriole. Furthermore, the result corresponds to experimental observations showing that the RBC is deformed during the movement.
The concluding remarks provide a methodology and a mathematical and numerical framework for the simulation of blood flow in branching vessels. Analysis of the ferrofluid simulations indicates that the magnetic Soret effect can be even greater than the conventional one, and that its strength depends on the strength of the magnetic field, as confirmed experimentally by Völker and Odenbach. It was also shown that when a magnetic field is perpendicular to the temperature gradient, there is an additional increase in heat transfer compared to cases where the magnetic field is parallel to the temperature gradient. In addition, statistical evaluation (the Taguchi technique) of the magnetic fluids showed that temperature and the initial concentration of the magnetic phase make the maximum and minimum contributions to thermodiffusion, respectively. In the simulation of flow through porous media, the dimensionless pressure drop was studied at different Reynolds numbers, based on pore permeability and interstitial fluid velocity. The results agreed well with the correlation of Macdonald et al. (1979) over the range of flow Reynolds numbers studied. Furthermore, the calculated dispersion coefficients in the cylinder geometry were found to be in agreement with those of Seymour and Callaghan.
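The comparison against the Macdonald et al. (1979) correlation can be illustrated with its commonly quoted Ergun-type form. This is a hedged sketch only: the thesis defines its Reynolds number via pore permeability and interstitial velocity, whereas the version below uses the more common particle-diameter, superficial-velocity form, with the constants 180 and 1.8 usually cited for smooth particles; all function names and property values are illustrative assumptions.

```python
def macdonald_friction_factor(re_particle: float, porosity: float) -> float:
    """Ergun-type dimensionless pressure drop in the form often attributed
    to Macdonald et al. (1979): f = 180*(1 - eps)/Re + 1.8.

    The first term dominates at low Re (viscous, Darcy-like regime);
    the second captures inertial losses at higher Re.
    """
    return 180.0 * (1.0 - porosity) / re_particle + 1.8


def pressure_gradient(u, d_p, porosity, rho=1000.0, mu=1.0e-3):
    """Pressure drop per unit length [Pa/m] for superficial velocity u [m/s]
    through a bed of particles of diameter d_p [m] (water properties assumed)."""
    re = rho * u * d_p / mu
    f = macdonald_friction_factor(re, porosity)
    return f * rho * u**2 * (1.0 - porosity) / (d_p * porosity**3)


# At low Re the pressure gradient scales almost linearly with velocity:
g_slow = pressure_gradient(1e-4, 1e-3, 0.4)   # Re = 0.1
g_fast = pressure_gradient(2e-4, 1e-3, 0.4)   # Re = 0.2, gradient ~doubles
```

The near-linear scaling at low Re is the Darcy limit; deviations from a ratio of exactly 2 come from the inertial (1.8) term.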
Abstract:
Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology combines various methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Systems biology, unlike “traditional” biology, focuses on high-level concepts such as: network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is “foreign” to “traditional” biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods originating in the fields of computer science and mathematics for the construction and analysis of computational models in systems biology. In particular, the research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton.
The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques, as well as model analysis methodologies, constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies, and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.