Abstract:
Transitional flow past a three-dimensional circular cylinder is a widely studied phenomenon, since this problem is of interest for many technical applications. In the present work, the numerical simulation of flow past a circular cylinder is performed using a commercial CFD code (ANSYS Fluent 12.1) with large eddy simulation (LES) and RANS (κ-ε and Shear-Stress Transport (SST) κ-ω model) approaches. The turbulent flow at ReD = 1000 and 3900 is simulated to investigate the force coefficients, Strouhal number, flow separation angle, pressure distribution on the cylinder and the complex three-dimensional vortex shedding in the cylinder wake region. The numerical results extracted from these simulations are in good agreement with the experimental data (Zdravkovich, 1997). Moreover, the influence of grid refinement and time-step size has been examined. Numerical calculation of turbulent cross-flow in a staggered tube bundle continues to attract interest due to its importance in engineering applications, as well as the fact that this complex flow represents a challenging problem for CFD. In the present work, time-dependent simulations using the κ-ε, κ-ω and SST models are performed in two dimensions for a subcritical flow through a staggered tube bundle. The predicted turbulence statistics (mean and r.m.s. velocities) are in good agreement with the experimental data (Balabani, 1996). Turbulent quantities such as the turbulent kinetic energy and its dissipation rate are predicted using the RANS models and compared with each other. The sensitivity to grid and time-step size has been analyzed. A sensitivity study of the model constants has been carried out for the κ-ε model. It has been observed that the predicted turbulence statistics and turbulent quantities are very sensitive to the model constants.
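The Strouhal number reported in such simulations is typically recovered from the periodicity of the lift force on the cylinder. The sketch below shows one common post-processing step, assuming a lift-coefficient time history is available from the solver; the function name and the synthetic signal are illustrative, not taken from the study.

```python
import numpy as np

def strouhal_number(cl_history, dt, diameter, u_inf):
    """Estimate St = f D / U_inf from a lift-coefficient time series by
    locating the dominant frequency of its spectrum."""
    cl = np.asarray(cl_history) - np.mean(cl_history)   # drop the mean offset
    freqs = np.fft.rfftfreq(len(cl), d=dt)
    spectrum = np.abs(np.fft.rfft(cl))
    f_shed = freqs[np.argmax(spectrum[1:]) + 1]         # skip the DC bin
    return f_shed * diameter / u_inf

# Synthetic shedding signal at 5 Hz with D = 0.01 m and U_inf = 0.25 m/s:
t = np.arange(0.0, 10.0, 1e-3)
cl = 0.3 * np.sin(2 * np.pi * 5.0 * t)
print(strouhal_number(cl, 1e-3, 0.01, 0.25))  # ~0.2, a typical subcritical value
```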
Abstract:
Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology combines various methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Systems biology, unlike "traditional" biology, focuses on high-level concepts such as: network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is "foreign" to "traditional" biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods originating in the fields of computer science and mathematics for the construction and analysis of computational models in systems biology. In particular, the performed research is set in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton. The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We notice, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques as well as model analysis methodologies constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
Abstract:
Software systems are expanding and becoming increasingly present in everyday activities. The constantly evolving society demands that they deliver more functionality, are easy to use and work as expected. All these challenges increase the size and complexity of a system. People may not be aware of the presence of a software system until it malfunctions or even fails to perform. The concept of being able to depend on the software is particularly significant when it comes to critical systems. At this point the quality of a system is regarded as an essential issue, since any deficiencies may lead to considerable monetary loss or the endangerment of life. Traditional development methods may not ensure a sufficiently high level of quality. Formal methods, on the other hand, allow us to achieve a high level of rigour and can be applied to develop a complete system or only a critical part of it. Such techniques, applied during system development starting at the early design stages, increase the likelihood of obtaining a system that works as required. However, formal methods are sometimes considered difficult to utilise in traditional developments. Therefore, it is important to make them more accessible and to reduce the gap between formal and traditional development methods. This thesis explores the usability of rigorous approaches by giving an insight into formal designs with the use of a graphical notation. The understandability of formal modelling is increased by a compact representation of the development and the related design decisions. The central objective of the thesis is to investigate the impact that rigorous approaches have on the quality of developments. This means that it is necessary to establish certain techniques for the evaluation of rigorous developments. Since we are studying various development settings and methods, specific measurement plans and a set of metrics need to be created for each setting. Our goal is to provide methods for collecting data and recording evidence of the applicability of rigorous approaches. This would support organisations in making decisions about the integration of formal methods into their development processes. It is important to control software development, especially in its initial stages. Therefore, we focus on the specification and modelling phases, as well as the related artefacts, e.g. models, since these have a significant influence on the quality of the final system. Since the application of formal methods may increase the complexity of a system, it may impact its maintainability, and thus its quality. Our goal is to improve the quality of a system via metrics and measurements, as well as generic refinement patterns, which are applied to a model and a specification. We argue that they can facilitate the process of creating software systems, e.g. by controlling complexity and providing modelling guidelines. Moreover, we regard them as additional mechanisms for quality control and improvement, also for rigorous approaches. The main contribution of this thesis is to provide the metrics and measurements that help in assessing the impact of rigorous approaches on developments. We establish techniques for the evaluation of certain aspects of quality, which are based on the structural, syntactical and process-related characteristics of early-stage development artefacts, i.e. specifications and models. The presented approaches are applied to various case studies, and the results of the investigation are juxtaposed with the perception of domain experts.
It is our aspiration to promote measurements as an indispensable part of quality control process and a strategy towards the quality improvement.
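As a rough illustration of the kind of structural characteristics such metrics could capture, the sketch below computes simple size and complexity indicators over a toy representation of a specification. The data structures and indicator choices are hypothetical and not the thesis's actual measurement framework.

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    name: str
    guards: list = field(default_factory=list)   # predicate strings
    actions: list = field(default_factory=list)  # assignment strings

@dataclass
class Machine:
    name: str
    variables: list
    events: list

def structural_metrics(m: Machine) -> dict:
    """Size/complexity indicators of the kind an early-stage metric
    suite might record for a specification."""
    n_events = len(m.events)
    n_guards = sum(len(e.guards) for e in m.events)
    return {
        "variables": len(m.variables),
        "events": n_events,
        "avg_guards_per_event": n_guards / n_events if n_events else 0.0,
    }

m = Machine("traffic_light", ["colour"], [
    Event("go",   guards=["colour = red"],   actions=["colour := green"]),
    Event("stop", guards=["colour = green"], actions=["colour := red"]),
])
print(structural_metrics(m))  # {'variables': 1, 'events': 2, 'avg_guards_per_event': 1.0}
```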
Abstract:
Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
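A minimal sketch of the execution model described above: guarded commands are repeatedly inspected, one whose guard holds is chosen nondeterministically, and execution terminates when no guard is enabled. The dictionary representation of actions is an assumption made purely for illustration.

```python
import random

def run_action_system(state, actions, max_steps=100):
    """Repeatedly pick, at random, one action whose guard holds and apply
    its effect; terminate when no guard is enabled (or after max_steps)."""
    for _ in range(max_steps):
        enabled = [a for a in actions if a["guard"](state)]
        if not enabled:                      # no guard holds: the system halts
            break
        state = random.choice(enabled)["effect"](state)
    return state

# Toy model: two counters incremented concurrently up to a bound
actions = [
    {"guard": lambda s: s["x"] < 3, "effect": lambda s: {**s, "x": s["x"] + 1}},
    {"guard": lambda s: s["y"] < 3, "effect": lambda s: {**s, "y": s["y"] + 1}},
]
print(run_action_system({"x": 0, "y": 0}, actions))  # {'x': 3, 'y': 3}
```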
Abstract:
Nowadays, computer-based systems tend to become more complex and to control increasingly critical functions affecting different areas of human activities. Failures of such systems might result in the loss of human lives as well as significant damage to the environment; therefore, their safety needs to be ensured. However, the development of safety-critical systems is not a trivial exercise. Hence, to preclude design faults and guarantee the desired behaviour, different industrial standards prescribe the use of rigorous techniques for the development and verification of such systems. The more critical the system is, the more rigorous the approach that should be undertaken. To ensure the safety of a critical computer-based system, satisfaction of the safety requirements imposed on this system should be demonstrated. This task involves a number of activities. In particular, a set of safety requirements is usually derived by conducting various safety analysis techniques. Strong assurance that the system satisfies the safety requirements can be provided by formal methods, i.e., mathematically based techniques. At the same time, the evidence that the system under consideration meets the imposed safety requirements might be demonstrated by constructing safety cases. However, the overall safety assurance process for critical computer-based systems remains insufficiently defined, for the following reasons. Firstly, there are semantic differences between safety requirements and formal models: informally represented safety requirements should be translated into the underlying formal language to enable further verification. Secondly, the development of formal models of complex systems can be labour-intensive and time-consuming. Thirdly, there are only a few well-defined methods for the integration of formal verification results into safety cases. This thesis proposes an integrated approach to the rigorous development and verification of safety-critical systems that (1) facilitates the elicitation of safety requirements and their incorporation into formal models, (2) simplifies formal modelling and verification by proposing specification and refinement patterns, and (3) assists in the construction of safety cases from the artefacts generated by formal reasoning. Our chosen formal framework is Event-B. It allows us to tackle the complexity of safety-critical systems as well as to structure safety requirements by applying abstraction and stepwise refinement. The Rodin platform, a tool supporting Event-B, assists in automatic model transformations and proof-based verification of the desired system properties. The proposed approach has been validated by several case studies from different application domains.
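To give a concrete flavour of one proof obligation that arises in stepwise refinement, the sketch below checks guard strengthening (a concrete event may only be enabled where the abstract event it refines is enabled) by brute-force enumeration over a small state space. This is only an illustration; the Rodin platform discharges such obligations by theorem proving, not enumeration.

```python
from typing import Callable, Iterable

def guard_strengthening(concrete: Callable[[int], bool],
                        abstract: Callable[[int], bool],
                        states: Iterable[int]) -> bool:
    """Guard-strengthening obligation: wherever the concrete (refined)
    event is enabled, the abstract event it refines must be enabled too."""
    return all(abstract(s) for s in states if concrete(s))

# Abstract event decrements a counter when it is positive; the refinement
# strengthens the guard so the counter never drops below 1.
abstract_guard = lambda x: x > 0
concrete_guard = lambda x: x > 1
print(guard_strengthening(concrete_guard, abstract_guard, range(-2, 5)))  # True
```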
Abstract:
Resilience is the property of a system to remain trustworthy despite changes. Changes of a different nature, whether due to failures of system components or varying operational conditions, significantly increase the complexity of system development. Therefore, advanced development technologies are required to build robust and flexible system architectures capable of adapting to such changes. Moreover, powerful quantitative techniques are needed to assess the impact of these changes on various system characteristics. Architectural flexibility is achieved by embedding into the system design the mechanisms for identifying changes and reacting to them. Hence a resilient system should have both advanced monitoring and error detection capabilities to recognise changes, as well as sophisticated reconfiguration mechanisms to adapt to them. The aim of such reconfiguration is to ensure that the system stays operational, i.e., remains capable of achieving its goals. Design, verification and assessment of the system reconfiguration mechanisms is a challenging and error-prone engineering task. In this thesis, we propose and validate a formal framework for the development and assessment of resilient systems. Such a framework provides us with the means to specify and verify complex component interactions, model their cooperative behaviour in achieving system goals, and analyse the chosen reconfiguration strategies. Due to the variety of properties to be analysed, such a framework should have an integrated nature. To ensure the system's functional correctness, it should rely on formal modelling and verification, while, to assess the impact of changes on such properties as performance and reliability, it should be combined with quantitative analysis. To ensure the scalability of the proposed framework, we choose Event-B as the basis for reasoning about functional correctness. Event-B is a state-based formal approach that promotes the correct-by-construction development paradigm and formal verification by theorem proving. Event-B has mature industrial-strength tool support: the Rodin platform. Proof-based verification as well as the reliance on abstraction and decomposition adopted in Event-B provide the designers with powerful support for the development of complex systems. Moreover, top-down system development by refinement allows the developers to explicitly express and verify critical system-level properties. Besides ensuring functional correctness, to achieve resilience we also need to analyse a number of non-functional characteristics, such as reliability and performance. Therefore, in this thesis we also demonstrate how formal development in Event-B can be combined with quantitative analysis. Namely, we experiment with the integration of such techniques as probabilistic model checking in PRISM and discrete-event simulation in SimPy with formal development in Event-B. Such an integration allows us to assess how changes and different reconfiguration strategies affect overall system resilience. The approach proposed in this thesis is validated by a number of case studies from the robotics, space, healthcare and cloud domains.
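A minimal SimPy sketch of the kind of reconfiguration scenario such an integrated analysis could quantify: a primary component fails at random intervals, and the system incurs downtime while it reconfigures to a spare. The model and all parameters are hypothetical, not taken from the thesis's case studies.

```python
import random
import simpy

def component(env, mttf, reconfig_time, downtime_log):
    """Primary component: runs until a random failure, then the system
    reconfigures to a spare; the switch-over gap is logged as downtime."""
    while True:
        yield env.timeout(random.expovariate(1.0 / mttf))   # time to failure
        failed_at = env.now
        yield env.timeout(reconfig_time)                    # reconfiguration delay
        downtime_log.append(env.now - failed_at)

random.seed(1)                                              # reproducible run
downtime_log = []
env = simpy.Environment()
env.process(component(env, mttf=100.0, reconfig_time=5.0,
                      downtime_log=downtime_log))
env.run(until=1000)
print(f"failures: {len(downtime_log)}, total downtime: {sum(downtime_log):.1f}")
```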
Abstract:
The "Java Intelligent Tutoring System" (JITS) research project focused on designing, constructing, and determining the effectiveness of an Intelligent Tutoring System for beginner Java programming students at the postsecondary level. The participants in this research were students in the School of Applied Computing and Engineering Sciences at Sheridan College. This research involved consistently gathering input from students and instructors using JITS as it developed. The cyclic process involving designing, developing, testing, and refinement was used for the construction of JITS to ensure that it adequately meets the needs of students and instructors. The second objective in this dissertation determined the effectiveness of learning within this environment. The main findings indicate that JITS is a richly interactive ITS that engages students on Java programming problems. JITS is equipped with a sophisticated personalized feedback mechanism that models and supports each student in his/her learning style. The assessment component involved 2 main quantitative experiments to determine the effectiveness of JITS in terms of student performance. In both experiments it was determined that a statistically significant difference was achieved between the control group and the experimental group (i.e., JITS group). The main effect for Test (i.e., pre- and postiest), F( l , 35) == 119.43,p < .001, was qualified by a Test by Group interaction, F( l , 35) == 4.98,p < .05, and a Test by Time interaction, F( l , 35) == 43.82, p < .001. Similar findings were found for the second experiment; Test by Group interaction revealed F( 1 , 92) == 5.36, p < .025. In both experiments the JITS groups outperformed the corresponding control groups at posttest.
Abstract:
The new Physiotherapy and Occupational Therapy programmes, based in the Faculty of Health Sciences, McMaster University (Hamilton, Ontario), are unique. The teaching and learning philosophies utilized are based on learner-centred and self-directed learning theories. The 1991 admissions process of these programmes attempted to select individuals who would make highly qualified professionals and who would have the necessary skills to complete such unique programmes. In order to: 1. learn more about the concept of self-directed learning and its related characteristics in health care professionals; 2. examine the relationship between various student characteristics - personal, learner and those assessed during the admissions process - and final course grades; and 3. determine which, if any, student characteristics could be considered predictors for success in learner-centred programmes requiring self-directed learning skills, a correlational research design was developed and carried out. Thirty Occupational Therapy and thirty Physiotherapy students were asked to complete two instruments - a questionnaire developed by the author and the Oddi Continuing Learning Inventory (Oddi, 1986). Course grades and ratings of students during the admissions process were also obtained. Both questionnaires were examined for reliability, and factor analyses were conducted to determine construct validity. Data obtained from the questionnaires, course grades and student ratings (from the admissions process) were analyzed and compared using the contingency coefficient, Pearson's product-moment correlation coefficient, and the multiple regression analysis model. The research findings demonstrated a positive relationship (as identified by contingency coefficient or Pearson r values) between various course grades and the following personal and learner characteristics: field of study of the highest level of education achieved, level of education achieved, sex, marital status, motivation for completing the programmes, reasons for enrolling in the programmes, decision to enrol in the programmes, employment history, preferred learning style, strong self-concept and the identification of various components of the concept of self-directed learning. In most cases, the relationships were significant at the 0.01 or 0.001 levels. Results of the multiple regression analyses demonstrated that several learner and admissions characteristic variables had R² values that accounted for the largest proportion of the variance in several dependent variables; thus, these variables could be considered predictors for success. The learner characteristics included level of education and strong self-concept. The admissions characteristics included the ability to evaluate strengths, the ability to give feedback, curiosity and creativity, and communication skills. It is recommended that research continue to be conducted to substantiate the relationships found between course grades and characteristic variables in more diverse populations. "Success in self-directed programmes" from the learner's perspective should also be investigated. The Oddi Continuing Learning Inventory should continue to be researched; further research may lead to refinement or further development of the instrument, and may provide further insight into self-directed learner attributes. The concept of self-directed learning continues to be incorporated into educational programmes, and thus should continue to be explored.
Abstract:
The thesis assesses the impact of international factors on relations between Greek and Turkish Cypriots during and after the Cold War. Through an analysis of the Cyprus problem it explores both why external actors intervene in communal conflicts and how they influence relations between ethnic groups in plural societies. The analytical framework employed throughout the study draws on the contributions of International Relations theorists and students of ethnic conflict. The thesis argues that, as in the global political system, relations between ethnic groups in unranked communal systems are anarchic; that is, actors within the system do not recognize a sovereign political authority. In bipolar communal systems dominated by two relatively equal groups, the struggle for security and power often leads to appeals for assistance from external actors. The framework notes that neighboring states and Great Powers may heed calls for assistance, or intervene without a prior request, if it is in their interest to do so. The convergence of regional and global interests in communal affairs exacerbates ethnic conflicts and precludes the development of effective political institutions. The impact of external intervention in ethnic conflicts has the potential to alter the basis of communal relations. The Cyprus problem is examined both during and after the Cold War in order to gauge how global and regional actors and the structure of their respective systems have affected relations between ethnic groups in Cyprus. The thesis argues that Cyprus's descent into civil war in 1963 was due in part to the entrenchment of external interests in the Republic's constitution. The study also notes that power politics involving the United States, Soviet Union, Greece and Turkey continued to affect the development of communal relations throughout the 1960s, 70s and 80s. External intervention culminated in July and August 1974, when a Greek-sponsored coup was answered by Turkey's invasion and partition of Cyprus. The forced expulsion of Greek Cypriots from the island's northern territories led to the establishment of ethnically homogeneous zones, thus altering the context of communal relations dramatically. The study also examines the role of the United Nations in Cyprus, noting that its failure to settle the dispute was due in large part to a lack of cooperation from Turkey, and the United States' and Soviet Union's acceptance of the status quo following the 1974 invasion and partition of the island. The thesis argues that the deterioration of Greek-Turkish relations in the post-Cold War era has made a solution to the dispute unlikely for the time being. Barring any dramatic changes in relations between communal and regional antagonists, relations between Greek and Turkish Cypriots will continue to develop along the lines established in July/August 1974. The thesis concludes by affirming the validity of its core hypotheses through a brief survey of recent works touching on international politics and ethnic conflict. Questions requiring further research are noted, as are elements of the study that require further refinement.
Abstract:
To investigate the thermal effects of latent heat in hydrothermal settings, an extension was made to the existing finite-element numerical modelling software, Aquarius. The latent heat algorithm was validated using a series of column models, which analysed the effects of permeability (flow rate), thermal gradient, and position along the two-phase curve (pressure). Increasing the flow rate and pressure increases the displacement of the liquid-steam boundary from an initial position determined without accounting for latent heat, while increasing the thermal gradient decreases that displacement. Application to a regional-scale model of a caldera-hosted hydrothermal system based on a representative suite of calderas (e.g., Yellowstone, Creede, Valles Grande) led to oscillations in the model solution. Oscillations can be reduced or eliminated by mesh refinement, which requires greater computational effort. Results indicate that latent heat should be accounted for to accurately model phase-change conditions in hydrothermal settings.
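One common way to fold latent heat into a finite-element heat transport model is the apparent (effective) heat capacity method, sketched below; whether Aquarius uses this particular formulation is not stated in the abstract, so this is offered only as background.

```latex
% Apparent (effective) heat capacity method: the latent heat L is folded
% into the heat capacity over the two-phase temperature interval, so the
% energy equation needs no explicit tracking of the liquid-steam boundary.
\[
  c_{\mathrm{eff}}(T) = c_p + L \,\frac{\partial f_v}{\partial T},
\]
% where f_v(T) is the local vapour (steam) mass fraction.
```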
Abstract:
There are many ways to generate geometrical models for numerical simulation, and most of them start with a segmentation step to extract the boundaries of the regions of interest. This paper presents an algorithm to generate a patient-specific three-dimensional geometric model, based on a tetrahedral mesh, without an initial extraction of contours from the volumetric data. Using the information directly available in the data, such as gray levels, we built a metric to drive a mesh adaptation process. The metric is used to specify the size and orientation of the tetrahedral elements everywhere in the mesh. Our method, which produces anisotropic meshes, gives good results with synthetic and real MRI data. The resulting model quality has been evaluated qualitatively and quantitatively by comparing it with an analytical solution and with a segmentation made by an expert. Results show that, in 90% of the cases, our method gives meshes as good as or better than those of a similar isotropic method, based on the accuracy of the volume reconstruction for a given mesh size. Moreover, a comparison of the Hausdorff distances between the adapted meshes of both methods and ground-truth volumes shows that our method decreases reconstruction errors faster.
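A gray-level-driven metric can be built, for instance, from the Hessian of the smoothed image, yielding small, stretched elements where the intensity varies sharply and coarse, near-isotropic elements in flat regions. The sketch below is one such construction in 2D and is not necessarily the exact metric used by the authors.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def graylevel_metric(image, sigma=2.0, eps=1e-3):
    """Build a 2x2 metric field from the Hessian of the smoothed gray
    levels: large eigenvalues (small elements) where intensity varies
    sharply, a near-isotropic coarse metric in flat regions."""
    smooth = gaussian_filter(image.astype(float), sigma)
    gy, gx = np.gradient(smooth)
    gxy, gxx = np.gradient(gx)          # second derivatives of the image
    gyy, gyx = np.gradient(gy)
    # Symmetrize H and take |H| + eps*I so the metric stays positive definite
    h = np.stack([np.stack([gxx, 0.5 * (gxy + gyx)], -1),
                  np.stack([0.5 * (gxy + gyx), gyy], -1)], -2)
    w, v = np.linalg.eigh(h)
    w = np.abs(w) + eps
    return np.einsum('...ij,...j,...kj->...ik', v, w, v)  # V |W| V^T
```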
Abstract:
Demand for magnesium and its alloys has increased significantly in the automotive industry because of their great potential for reducing the weight of components, resulting in improved fuel efficiency of the vehicle. To date, most Mg products have been fabricated by casting, especially die-casting, because of its high productivity, suitable strength, and acceptable quality and dimensional accuracy; only a small fraction of components are produced through sand, gravity and low-pressure die casting. In fact, a high solidification rate is possible only in high-pressure die casting, which results in a finer grain size. Achieving a high cooling rate in gravity casting using sand and permanent moulds is a difficult task, which leads to a coarser grain structure and poor mechanical properties, an important aspect of performance in industrial applications. Grain refinement is technologically attractive because it generally does not adversely affect ductility and toughness, contrary to most other strengthening methods. Therefore, the formation of a fine grain structure in these castings is crucial in order to improve the mechanical properties of the cast components. The present investigation is therefore "GRAIN REFINEMENT STUDIES ON Mg AND Mg-Al BASED ALLOYS". The primary objective of this investigation is to study the effect of various grain-refining inoculants (Al-4B and Al-5TiB2 master alloys, Al4C3, and charcoal particles) on pure Mg and Mg-Al alloys such as AZ31 and AZ91, and to study their grain-refining mechanisms. The second objective of this work is to study the effect of the superheating process on the grain size of the AZ31 and AZ91 Mg alloys with and without inoculant addition. In addition, the effect of grain refinement on the mechanical properties of Mg and Mg-Al alloys is studied. The thesis is organized into seven chapters, and the details of the studies are given below.
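The strengthening benefit of a finer grain structure alluded to above is commonly described by the Hall-Petch relation, quoted here as general background rather than as a result of the thesis:

```latex
% Hall-Petch relation: yield strength rises as the average grain size d falls.
\[
  \sigma_y = \sigma_0 + \frac{k_y}{\sqrt{d}},
\]
% sigma_0: friction stress resisting dislocation motion;
% k_y: strengthening coefficient, a material constant.
```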
Abstract:
This paper presents methods for moving-object detection in airborne video surveillance. Motion segmentation in this scenario is usually difficult because of the small size of the objects, the motion of the camera, and inconsistency in the detected object shape. Here we present a motion segmentation system for moving-camera video, based on background subtraction. Adaptive background building is used so that the background is created from the most recent frames. Our proposed system offers a CPU-efficient alternative to conventional batch-processing-based background subtraction systems. We further refine the segmented motion by mean-shift-based mode association.
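A rough sketch of such a pipeline, using OpenCV's stock MOG2 subtractor as a stand-in for the authors' adaptive background model: a high learning rate approximates building the background from the most recent frames, and simple morphology stands in for the mean-shift refinement step, which is not shown. The file name and parameters are hypothetical.

```python
import cv2

cap = cv2.VideoCapture("aerial.mp4")          # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(history=50, detectShadows=False)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # High learning rate biases the background model towards recent frames
    mask = subtractor.apply(frame, learningRate=0.05)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)  # suppress speckle noise
    cv2.imshow("motion", mask)
    if cv2.waitKey(1) == 27:                  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```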
Abstract:
Zinc aluminate nanoparticles with an average particle size of 40 nm were synthesized using a sol–gel combustion method. The X-ray diffraction data were analysed by the Rietveld refinement method to establish the phase purity of the material. The different stages of phase formation during the synthesis were investigated using differential scanning calorimetry and differential thermogravimetric analysis. The particle size was determined by transmission electron microscopy, and the optical bandgap of the nanoparticles was determined by absorption spectroscopy in the ultraviolet-visible range. The dielectric permittivity and a.c. conductivity of the material were measured at frequencies from 100 kHz to 8 MHz in the temperature range of 30–120 °C. Maxwell–Wagner type interfacial polarization was found to exist in the material, and electron hopping by means of quantum mechanical tunneling is attributed as the mechanism for the observed a.c. conductivity.
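Optical bandgaps from UV-visible absorption data are commonly extracted with a Tauc plot; assuming a direct allowed transition (the usual treatment for spinel ZnAl2O4, though the abstract does not specify the analysis), the relation is:

```latex
% Tauc analysis for a direct allowed transition: extrapolating the linear
% region of (alpha h nu)^2 versus photon energy h nu to zero yields E_g.
\[
  (\alpha h \nu)^2 = A\,(h \nu - E_g),
\]
% alpha: absorption coefficient; A: a transition-dependent constant.
```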