908 results for computer-based instrumentation


Relevance:

30.00%

Publisher:

Abstract:

Aging is associated with common conditions, including cancer, diabetes, cardiovascular disease, and Alzheimer's disease. The type of multi-targeted pharmacological approach necessary to address a complex multifaceted disease such as aging might take advantage of pleiotropic natural polyphenols affecting a wide variety of biological processes. We have recently postulated that the secoiridoids oleuropein aglycone (OA) and decarboxymethyl oleuropein aglycone (DOA), two complex polyphenols present in health-promoting extra virgin olive oil (EVOO), might constitute a new family of plant-produced gerosuppressant agents. This paper describes an analysis of the biological activity spectra (BAS) of OA and DOA using PASS (Prediction of Activity Spectra for Substances) software. PASS can predict thousands of biological activities, as the BAS of a compound is an intrinsic property that is largely dependent on the compound's structure and reflects pharmacological effects, physiological and biochemical mechanisms of action, and specific toxicities. Using Pharmaexpert, a tool that analyzes the PASS-predicted BAS of substances based on thousands of "mechanism–effect" and "effect–mechanism" relationships, we illuminate hypothesis-generating pharmacological effects, mechanisms of action, and targets that might underlie the anti-aging/anti-cancer activities of the gerosuppressant EVOO oleuropeins.
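
PASS expresses each predicted activity as a pair of probabilities, Pa (probability of being active) and Pi (probability of being inactive); a common screening heuristic is to keep activities where Pa exceeds both Pi and a chosen cut-off. The Python sketch below illustrates this kind of post-filtering; the activity names, scores, and the 0.7 threshold are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: ranking PASS-style predictions by Pa and Pi.
# The entries and threshold below are made up for illustration.

def rank_predicted_activities(predictions, pa_min=0.7):
    """Keep activities where Pa exceeds both Pi and a cut-off,
    then sort by the Pa - Pi margin (descending)."""
    hits = [p for p in predictions
            if p["Pa"] > p["Pi"] and p["Pa"] >= pa_min]
    return sorted(hits, key=lambda p: p["Pa"] - p["Pi"], reverse=True)

oleuropein_aglycone = [
    {"activity": "Apoptosis agonist",      "Pa": 0.85, "Pi": 0.01},
    {"activity": "Free radical scavenger", "Pa": 0.78, "Pi": 0.03},
    {"activity": "Chemopreventive",        "Pa": 0.65, "Pi": 0.05},  # below cut-off
]

for hit in rank_predicted_activities(oleuropein_aglycone):
    print(f'{hit["activity"]}: Pa={hit["Pa"]:.2f}, Pi={hit["Pi"]:.2f}')
```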

Relevance:

30.00%

Publisher:

Abstract:

This thesis concentrates on developing a practical local-approach methodology based on micromechanical models for the analysis of ductile fracture of welded joints. Two major problems involved in the local approach, namely the dilational constitutive relation reflecting the softening behaviour of the material, and the failure criterion associated with the constitutive equation, have been studied in detail. Firstly, considerable effort was devoted to the numerical integration and computer implementation of the nontrivial dilational Gurson–Tvergaard model. Considering the weaknesses of the widely used Euler forward integration algorithms, a family of generalized mid-point algorithms is proposed for the Gurson–Tvergaard model. Correspondingly, based on the decomposition of stresses into hydrostatic and deviatoric parts, an explicit seven-parameter expression for the consistent tangent moduli of the algorithms is presented. This explicit formula avoids any matrix inversion during numerical iteration and thus greatly facilitates the computer implementation of the algorithms and increases the efficiency of the code. The accuracy of the proposed algorithms and other conventional algorithms has been assessed in a systematic manner in order to identify the best algorithm for this study. The accurate and efficient performance of the present finite element implementation of the proposed algorithms has been demonstrated by various numerical examples. It has been found that the true mid-point algorithm (a = 0.5) is the most accurate one when the deviatoric strain increment is radial to the yield surface, and that it is very important to use the consistent tangent moduli in the Newton iteration procedure. Secondly, an assessment has been made of the consistency of current local failure criteria for ductile fracture: the critical void growth criterion, the constant critical void volume fraction criterion, and Thomason's plastic limit-load failure criterion. Significant differences in the predictions of ductility by the three criteria were found. By assuming that the void grows spherically and using the void volume fraction from the Gurson–Tvergaard model to calculate the current void/matrix geometry, Thomason's failure criterion has been modified and a new failure criterion for the Gurson–Tvergaard model is presented. Comparison with Koplik and Needleman's finite element results shows that the new failure criterion is indeed fairly accurate. A novel feature of the new failure criterion is that a mechanism for void coalescence is incorporated into the constitutive model. Hence material failure is a natural result of the development of macroscopic plastic flow and the microscopic internal necking mechanism. Under the new failure criterion, the critical void volume fraction is not a material constant; the initial void volume fraction and/or the void nucleation parameters essentially control the material failure. This feature is very desirable and makes the numerical calibration of the void nucleation parameter(s) possible and physically sound. Thirdly, a local-approach methodology based on the above two major contributions has been built in ABAQUS via the user material subroutine UMAT and applied to welded T-joints. Using void nucleation parameters calibrated from simple smooth and notched specimens, it was found that the fracture behaviour of the welded T-joints can be predicted well using the present methodology. This application has shown how the damage parameters of both the base material and the heat-affected zone (HAZ) material can be obtained in a step-by-step manner, and how useful and capable the local-approach methodology is in the analysis of fracture behaviour and crack development, as well as in structural integrity assessment of practical problems where inhomogeneous materials are involved. Finally, a procedure for the possible engineering application of the present methodology is suggested and discussed.
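
For reference, the Gurson–Tvergaard yield function underlying the constitutive model discussed above has the well-known form Φ = (q/σ_y)² + 2q₁f·cosh(3q₂σ_m/2σ_y) − (1 + q₃f²). The short Python sketch below evaluates it for a given stress tensor; it is a minimal illustration of the model, not the thesis's ABAQUS UMAT implementation, and the parameter values are the conventional Tvergaard defaults.

```python
import numpy as np

# Illustrative sketch (not the thesis code): evaluating the
# Gurson-Tvergaard yield function for a given stress state, where
# q is the von Mises equivalent stress, sm the hydrostatic (mean)
# stress, sy the matrix flow stress, f the void volume fraction,
# and q1, q2, q3 are Tvergaard's fitting parameters.

def gurson_tvergaard_yield(stress, sy, f, q1=1.5, q2=1.0, q3=2.25):
    """Return Phi; yielding occurs when Phi >= 0."""
    sm = np.trace(stress) / 3.0                  # hydrostatic stress
    dev = stress - sm * np.eye(3)                # deviatoric stress
    q = np.sqrt(1.5 * np.tensordot(dev, dev))    # von Mises stress
    return (q / sy) ** 2 + 2.0 * q1 * f * np.cosh(1.5 * q2 * sm / sy) \
           - (1.0 + q3 * f ** 2)

# Example: uniaxial tension at the matrix flow stress with 1% porosity
sigma = np.diag([400.0, 0.0, 0.0])  # MPa
print(gurson_tvergaard_yield(sigma, sy=400.0, f=0.01))  # > 0: yielding
```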

Relevance:

30.00%

Publisher:

Abstract:

We've developed a new ambient occlusion technique based on an information-theoretic framework. Essentially, our method computes a weighted visibility from each object polygon to all viewpoints; we then use these visibility values to obtain the information associated with each polygon. So, just as a viewpoint has information about the model's polygons, the polygons gather information on the viewpoints. We therefore have two measures associated with an information channel defined by the set of viewpoints as input and the object's polygons as output, or vice versa. From this polygonal information, we obtain an occlusion map that can be used like a classic ambient occlusion map. Our approach also offers additional applications, including an importance-based viewpoint-selection guide and a means of enhancing object features and producing nonphotorealistic object visualizations.
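
The viewpoint–polygon information channel described above can be sketched in a few lines of numpy. The formulation below is a generic mutual-information channel (uniform viewpoint prior, visibility-normalized conditional probabilities), assumed for illustration rather than taken from the authors' code, and the final mapping of information to occlusion is likewise only indicative.

```python
import numpy as np

# Minimal sketch of the reversed viewpoint-polygon channel. Rows of V
# are viewpoints, columns are polygons; V[i, j] is the projected
# visible area of polygon j from viewpoint i. Values are made up.

def polygonal_information(V, eps=1e-12):
    """Per-polygon information from the reversed channel (polygons -> viewpoints)."""
    p_v = np.full(V.shape[0], 1.0 / V.shape[0])      # uniform viewpoint prior
    p_zv = V / (V.sum(axis=1, keepdims=True) + eps)  # p(z|v), per viewpoint
    p_z = p_v @ p_zv                                 # marginal p(z)
    # Reverse channel via Bayes: p(v|z) = p(z|v) p(v) / p(z)
    p_vz = (p_zv * p_v[:, None]) / (p_z[None, :] + eps)
    # I(z) = sum_v p(v|z) log2( p(v|z) / p(v) )
    return np.sum(p_vz * np.log2(p_vz / p_v[:, None] + eps), axis=0)

# Toy example: 3 viewpoints, 4 polygons
V = np.array([[4.0, 1.0, 0.0, 1.0],
              [0.0, 2.0, 3.0, 1.0],
              [1.0, 1.0, 1.0, 1.0]])
info = polygonal_information(V)
occlusion = 1.0 - info / info.max()   # illustrative mapping to an occlusion map
print(info, occlusion)
```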

Relevance:

30.00%

Publisher:

Abstract:

Across Latin America, 420 indigenous languages are spoken. Spanish is considered a second language in indigenous communities and is progressively introduced in education. However, most of the tools that support the teaching of a second language have been developed for the most common languages, such as English, French, German, and Italian. As a result, only a small number of learning objects and authoring tools have been developed for indigenous people, taking into account the specific needs of their populations. This paper introduces Multilingual–Tiny, a web authoring tool to support the virtual experience of indigenous students and teachers when they create learning objects in indigenous languages or in Spanish, in particular when they have to deal with the grammatical structures of Spanish. Multilingual–Tiny has a module based on the case-based reasoning (CBR) technique to provide recommendations in real time when teachers and students write texts in Spanish. An experiment was performed to compare several local similarity functions for retrieving cases from the case library, taking the grammatical structures into account. As a result, we identified the similarity function with the best performance.
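
The retrieval step of such a CBR module can be sketched as nearest-neighbour search over grammatical structures. The case representation (part-of-speech tag sequences), the sequence-ratio similarity, and the advice strings below are assumptions for demonstration; the paper compares several local similarity functions that are not reproduced here.

```python
# Illustrative CBR retrieval sketch; not the paper's actual case base.

from difflib import SequenceMatcher

case_library = [
    {"pos": ["DET", "NOUN", "ADJ"],  "advice": "In Spanish the adjective usually follows the noun."},
    {"pos": ["NOUN", "VERB", "ADV"], "advice": "Adverbs commonly follow the conjugated verb."},
    {"pos": ["DET", "ADJ", "NOUN"],  "advice": "Pre-nominal adjectives are marked; check emphasis."},
]

def local_similarity(query_pos, case_pos):
    """One possible local similarity: normalized longest-match ratio
    over part-of-speech tag sequences."""
    return SequenceMatcher(None, query_pos, case_pos).ratio()

def retrieve(query_pos, k=1):
    ranked = sorted(case_library,
                    key=lambda c: local_similarity(query_pos, c["pos"]),
                    reverse=True)
    return ranked[:k]

# A student writes a phrase tagged DET-NOUN-ADJ ("la casa roja")
for case in retrieve(["DET", "NOUN", "ADJ"]):
    print(case["advice"])
```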

Relevance:

30.00%

Publisher:

Abstract:

The present paper reports a prototype of an autonomously controlled bacteria concentrator with a user-friendly interface for bench-top applications. It is based on a micro-fluidic lab-on-a-chip and its associated custom instrumentation, which consists of a dielectrophoretic actuator to pre-concentrate the sample and an impedance analyser to measure concentrated bacteria levels. The system is composed of a single micro-fluidic chamber with interdigitated electrodes and instrumentation with custom electronics. The prototype is supported by a real-time platform connected to a remote computer, which automatically controls the system and displays impedance data used to monitor the status of bacteria accumulation on-chip. The system automates the whole concentrating operation. Performance has been studied for controlled volumes of Escherichia coli (E. coli) samples injected into the micro-fluidic chip at a constant flow rate of 10 μL/min. A media-conductivity correcting protocol has been developed, as preliminary results showed distortion of the impedance analyser measurements produced by variations of the bacterial media conductivity over time. With the correcting protocol, the measured impedance values were related to the quantity of concentrated bacteria with a correlation of 0.988 and a coefficient of variation of 3.1%. The feasibility of automated on-chip concentration of E. coli using the miniaturized system has been demonstrated. Furthermore, the impedance monitoring protocol has been adjusted and optimized to handle changes in the electrical properties of the bacteria media over time.
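
The paper's correcting protocol is not detailed in the abstract; as a generic illustration of the idea, impedance readings can be normalized against a reference measurement of the plain medium taken at the same time, so that drift in media conductivity divides out. The sketch below is an assumed scheme, not the paper's actual protocol, and all values are made up.

```python
# Generic illustration of a media-conductivity correction. Idea:
# scale the raw reading by the medium's drift relative to a baseline
# taken at time t0, so that conductivity drift over time divides out.

def corrected_impedance(z_measured, z_medium_now, z_medium_t0):
    """Scale the raw reading by the medium's drift relative to t0."""
    drift = z_medium_now / z_medium_t0
    return z_measured / drift

# Raw channel reads 1520 ohm while the reference medium reading has
# drifted from 1000 ohm (t0) to 950 ohm: report the drift-free value.
print(corrected_impedance(1520.0, 950.0, 1000.0))  # ~1600 ohm
```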

Relevance:

30.00%

Publisher:

Abstract:

International energy and climate strategies also set Finland's commitments to increasing the use of renewable energy sources and reducing greenhouse gas emissions. The target can be achieved by, for example, increasing the use of energy wood. Finland's forest biomass potential is significant compared with current use; increased use will, however, change forest management and wood harvesting methods. The thesis examined the potential for integrated pulp and paper mills to increase bioenergy production. The effects of two bioenergy production technologies on the carbon footprint of an integrated LWC (lightweight coated) paper mill were studied at mill level and with a cradle-to-customer approach. The LignoBoost process and Fischer–Tropsch (FT) diesel production were chosen as the bioenergy cases. The data for the LignoBoost process were obtained from Metso, and for the FT diesel process from Neste Oil. The rest of the information is based on the literature and on the databases of the KCL-ECO life-cycle computer program and Ecoinvent. In both case studies, the carbon footprint was reduced. From the results, it can be concluded that a fossil-fuel-free pulp mill is achievable with the LignoBoost process. By using steam from the FT diesel process, the amount of auxiliary fuel can be reduced considerably and the bark boiler can be replaced. Through the choice of auxiliary fuels used for heat production in the paper mill and the production methods for purchased electricity, it is possible to influence the carbon footprints even more in both cases.

Relevance:

30.00%

Publisher:

Abstract:

In this thesis, computer software for defining the geometry of a centrifugal compressor impeller is designed and implemented. The project was done under the supervision of the Laboratory of Fluid Dynamics at Lappeenranta University of Technology. This thesis is similar to the thesis written by Tomi Putus (2009), in which a centrifugal compressor impeller flow channel was researched and commonly used design practices were reviewed. Putus wrote computer software which can be used to define the impeller's three-dimensional geometry based on the basic geometrical dimensions given by a preliminary design. The software designed in this thesis is broadly similar, but it uses a different programming language (C++) and a different way to define the shape of the impeller's meridional projection.
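
The abstract does not say which curve parametrization the software uses; one common way to define the hub and shroud contours of a meridional projection is a Bezier curve through a handful of control points in the axial–radial plane. The Python sketch below illustrates that choice under stated assumptions (the thesis's implementation is in C++, and the control-point values here are made up).

```python
import numpy as np
from math import comb

# Hypothetical sketch: a meridional contour as a Bezier curve through
# control points in the (z, r) plane (axial and radial coordinates).

def bezier(control_points, n=50):
    """Evaluate a Bezier curve (Bernstein form) at n parameter values."""
    P = np.asarray(control_points, dtype=float)   # shape (m+1, 2)
    m = len(P) - 1
    t = np.linspace(0.0, 1.0, n)
    B = np.stack([comb(m, i) * t**i * (1 - t)**(m - i) for i in range(m + 1)])
    return B.T @ P                                # shape (n, 2): (z, r) points

# Made-up control points, inducer inlet to impeller exit (metres)
hub = bezier([(0.000, 0.020), (0.030, 0.022), (0.050, 0.040), (0.055, 0.080)])
shroud = bezier([(0.000, 0.045), (0.025, 0.046), (0.040, 0.060), (0.050, 0.080)])
print(hub[0], hub[-1])  # endpoints coincide with first/last control points
```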

Relevance:

30.00%

Publisher:

Abstract:

The development of correct programs is a core problem in computer science. Although formal verification methods for establishing correctness with mathematical rigor are available, programmers often find these difficult to put into practice. One hurdle is deriving the loop invariants and proving that the code maintains them. So-called correct-by-construction methods aim to alleviate this issue by integrating verification into the programming workflow. Invariant-based programming is a practical correct-by-construction method in which the programmer first establishes the invariant structure and then incrementally extends the program in steps of adding code and proving, after each addition, that the code is consistent with the invariants. In this way, the program is kept internally consistent throughout its development, and the construction of the correctness arguments (proofs) becomes an integral part of the programming workflow. A characteristic of the approach is that programs are described as invariant diagrams, a graphical notation similar to the state charts familiar to programmers. Invariant-based programming is a new method that has not yet been evaluated in large-scale studies. The most important prerequisite for feasibility on a larger scale is a high degree of automation. The goal of the Socos project has been to build tools to assist the construction and verification of programs using the method. This thesis describes the implementation and evaluation of a prototype tool in the context of the Socos project. The tool supports the drawing of the diagrams, automatic derivation and discharging of verification conditions, and interactive proofs. It is used to develop programs that are correct by construction. The tool consists of a diagrammatic environment connected to a verification condition generator and an existing state-of-the-art theorem prover. Its core is a semantics for translating diagrams into verification conditions, which are sent to the underlying theorem prover. We describe a concrete method for 1) deriving sufficient conditions for total correctness of an invariant diagram; 2) sending the conditions to the theorem prover for simplification; and 3) reporting the results of the simplification to the programmer in a way that is consistent with the invariant-based programming workflow and that allows errors in the program specification to be detected efficiently. The tool uses an efficient automatic proof strategy to prove as many conditions as possible automatically and lets the remaining conditions be proved interactively. The tool is based on the verification system PVS and uses the SMT (Satisfiability Modulo Theories) solver Yices as a catch-all decision procedure. Conditions that are not discharged automatically may be proved interactively using the PVS proof assistant. The programming workflow is very similar to the process by which a mathematical theory is developed inside a computer-supported theorem prover environment such as PVS. The programmer reduces a large verification problem, with the aid of the tool, into a set of smaller problems (lemmas), and can substantially improve the degree of proof automation by developing specialized background theories and proof strategies to support the specification and verification of a specific class of programs. We demonstrate this workflow by describing in detail the construction of a verified sorting algorithm. Tool-supported verification often has little to no presence in computer science (CS) curricula. Furthermore, program verification is frequently introduced as an advanced and purely theoretical topic that is not connected to the workflow taught in the early, practically oriented programming courses. Our hypothesis is that verification could be introduced early in the CS education, and that verification tools could be used in the classroom to support the teaching of formal methods. A prototype of Socos has been used in a course at Åbo Akademi University targeted at first- and second-year undergraduate students. We evaluate the use of Socos in the course as part of a case study carried out in 2007.
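
Socos discharges its verification conditions through PVS and Yices; purely to illustrate the shape of such a condition, the sketch below uses the Z3 SMT solver (a stand-in, not part of Socos) to check that a loop invariant of a small summation program is preserved by the loop body.

```python
# Illustration only: a typical invariant-preservation verification
# condition, checked with Z3 instead of Socos's PVS/Yices back end.
# Program: summing 0..n-1.  Invariant: 2*s == i*(i-1) and 0 <= i <= n.
# VC (preservation): invariant and guard (i < n) imply the invariant
# after executing the body  s, i := s + i, i + 1.

from z3 import Ints, And, Implies, Not, Solver, unsat

s, i, n = Ints("s i n")
inv      = And(2 * s == i * (i - 1), 0 <= i, i <= n)
guard    = i < n
inv_next = And(2 * (s + i) == (i + 1) * i, 0 <= i + 1, i + 1 <= n)

solver = Solver()
solver.add(Not(Implies(And(inv, guard), inv_next)))  # look for a counterexample
print("VC proved" if solver.check() == unsat else "VC failed")
```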

Relevance:

30.00%

Publisher:

Abstract:

The central theme of this thesis is the emancipation and further development of learning activity in higher education in the context of the ongoing digital transformation of our societies. It was developed in response to the highly problematic mainstream approach to the digital re-instrumentation of teaching and studying practices in contemporary higher education. The mainstream approach is largely based on centralisation, standardisation, commoditisation, and commercialisation, while re-producing the general patterns of control, responsibility, and dependence that are characteristic of activity systems of schooling. Whereas much of educational research and development focuses on the optimisation and fine-tuning of schooling, the overall inquiry underlying this thesis has been carried out from an explicitly critical position and within a framework of action science. It thus conceptualises learning activity in higher education not only as an object of inquiry but also as an object to engage with and to intervene in from a perspective of intentional change. The knowledge-constituting interest of this type of inquiry can be tentatively described as a combination of heuristic-instrumental (guidelines for contextualised action and intervention), practical-phronetic (deliberation of value-rational aspects of means and ends), and developmental-emancipatory (deliberation of issues of power, self-determination, and growth) aspects. Its goal is the production of orientation knowledge for educational practice. The thesis provides an analysis, argumentation, and normative claim on why the development of learning activity should be turned into an object of individual|collective inquiry and intentional change in higher education, and why the current state of affairs in higher education actually impedes such a development. It argues for a decisive shift of attention to the intentional emancipation and further development of learning activity as an important cultural instrument for human (self-)production within the digital transformation. The thesis also attempts an in-depth exploration of what type of methodological rationale can actually be applied to an object of inquiry (developing learning activity) that is at the same time conceptualised as an object of intentional change within the ongoing digital transformation. The result of this retrospective reflection is the formulation of "optimally incomplete" guidelines for educational R&D practice that share the practical-phronetic (value-related) and developmental-emancipatory (power-related) orientations that had been driving the overall inquiry. In addition, the thesis formulates the instrumental-heuristic knowledge claim that the conceptual instruments adapted and validated in the context of a series of intervention studies provide means to intervene effectively in existing practice in higher education to support the necessary development of (increasingly emancipated) networked learning activity. It suggests that digital networked instruments (tools and services) should generally be considered and treated as transient elements within critical systemic intervention research in higher education. It further argues for the predominant use of loosely coupled, digital networked instruments that allow for individual|collective ownership, control, (co-)production, and re-use in other contexts and for other purposes. Since the range of digital instrumentation options is continuously expanding and currently shows no signs of an imminent slow-down or consolidation, individual and collective exploration of and experimentation in this realm need to be systematically incorporated into higher education practice.

Relevance:

30.00%

Publisher:

Abstract:

The maintenance of electric distribution networks is a topical question for distribution system operators because of the increasing significance of failure costs. In this dissertation, the maintenance practices of distribution system operators are analyzed and a theory for scheduling maintenance activities and reinvestments in distribution components is created. The scheduling is based on the deterioration of components and on failure rates that increase with aging. A dynamic programming algorithm is used as the solution method for the maintenance problem posed by the increasing failure rates of the network. Other impacts of network maintenance, such as environmental and regulatory considerations, are outside the scope of this thesis; tree trimming of the line corridors and major disturbances of the network are likewise excluded from the problem optimized here. For the optimization, four dynamic programming models are presented and tested; the models are programmed in VBA. For testing, two different test networks are used. Because electric distribution system operators want to operate on larger component groups, the optimal timing for component groups is also analyzed. A maintenance software package was created to apply the presented theories in practice, and an overview of the program is presented.
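
The flavour of dynamic program described above can be sketched compactly: the state is a component's age, the yearly decision is to keep it (paying an expected failure cost that grows with age) or replace it (paying the capital cost), and the recursion runs backwards over the planning horizon. This Python sketch is a simplified stand-in for the dissertation's VBA models, with made-up costs and an illustrative aging curve.

```python
# Hedged sketch of an age-replacement dynamic program; all figures
# and the failure-rate curve are illustrative assumptions.

HORIZON = 20          # planning years
MAX_AGE = 40
REPLACE_COST = 100.0
FAILURE_COST = 50.0

def failure_rate(age):
    """Illustrative aging curve: expected failures/year rising with age."""
    return 0.02 + 0.004 * age

def solve():
    # value[t][age] = minimal expected cost from year t onward
    value = [[0.0] * (MAX_AGE + 1) for _ in range(HORIZON + 1)]
    policy = [[None] * (MAX_AGE + 1) for _ in range(HORIZON)]
    for t in range(HORIZON - 1, -1, -1):          # backwards recursion
        for age in range(MAX_AGE):
            keep = failure_rate(age) * FAILURE_COST + value[t + 1][age + 1]
            replace = REPLACE_COST + failure_rate(0) * FAILURE_COST + value[t + 1][1]
            value[t][age], policy[t][age] = min((keep, "keep"), (replace, "replace"))
        value[t][MAX_AGE] = REPLACE_COST + value[t + 1][1]  # forced renewal
    return value, policy

value, policy = solve()
print(policy[0][25], round(value[0][25], 1))  # decision and cost for a 25-year-old unit
```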

Relevance:

30.00%

Publisher:

Abstract:

Formal methods provide a means of reasoning about computer programs in order to prove correctness criteria. One subtype of formal methods is based on the weakest-precondition predicate transformer semantics and uses guarded commands as the basic modelling construct. Examples of such formalisms are Action Systems and Event-B. Guarded commands can intuitively be understood as actions that may be triggered when an associated guard condition holds. Guarded commands whose guards hold are nondeterministically chosen for execution, but no further control flow is present by default. Such a modelling approach is convenient for proving correctness, and the Refinement Calculus allows for a stepwise development method. It also has a parallel interpretation facilitating the development of concurrent software, and it is suitable for describing event-driven scenarios. However, for many application areas, the execution paradigm traditionally used comprises more explicit control flow, which constitutes an obstacle to using the above-mentioned formal methods. In this thesis, we study how guarded-command-based modelling approaches can be conveniently and efficiently scheduled in different scenarios. We first focus on the modelling of trust for transactions in a social networking setting. Due to the event-based nature of the scenario, the use of guarded commands turns out to be relatively straightforward. We continue by studying the modelling of concurrent software, with particular focus on compute-intensive scenarios. We go from theoretical considerations to the feasibility of implementation by evaluating the performance and scalability of executing a case study model in parallel, using automatic scheduling performed by a dedicated scheduler. Finally, we propose a more explicit and non-centralised approach in which the flow of each task is controlled by a schedule of its own. The schedules are expressed in a dedicated scheduling language, and patterns assist the developer in proving correctness of the scheduled model with respect to the original one.
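
The default execution model of guarded commands, as described above, fits in a few lines: repeatedly collect the enabled actions, pick one nondeterministically, and stop when no guard holds. The Python sketch below illustrates exactly that semantics; the toy action set is invented for demonstration, not taken from the thesis.

```python
# Minimal sketch of guarded-command execution: each action pairs a
# guard with a body; enabled actions are chosen nondeterministically
# until none is enabled (termination).

import random

def run(state, actions):
    while True:
        enabled = [a for a in actions if a["guard"](state)]
        if not enabled:
            return state                       # no guard holds: terminate
        random.choice(enabled)["body"](state)  # nondeterministic choice

# Toy model: transfer tokens from x to y one at a time
state = {"x": 3, "y": 0}
actions = [
    {"guard": lambda s: s["x"] > 0,
     "body":  lambda s: (s.__setitem__("x", s["x"] - 1),
                         s.__setitem__("y", s["y"] + 1))},
]
print(run(state, actions))  # {'x': 0, 'y': 3}
```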

Relevance:

30.00%

Publisher:

Abstract:

Modern machine structures are often fabricated by welding. From a fatigue point of view, structural details, and especially welded details, are the most prone to fatigue damage and failure. Design against fatigue requires information on the fatigue resistance of a structure's critical details and on the stress loads that act on each detail. Even though dynamic simulation of flexible bodies is already an established method for analyzing structures, obtaining the stress history of a structural detail during dynamic simulation is a challenging task, especially when the detail has a complex geometry. In particular, analyzing the stress history of every structural detail within a single finite element model can be overwhelming, since the number of nodal degrees of freedom needed in the model may require an impractical amount of computational effort. The purpose of computer simulation is to reduce the number of prototypes and speed up the product development process. Also, to take operator influence into account, real-time models, i.e. simplified and computationally efficient models, are required. This, in turn, requires stress computation to be efficient if it is to be performed during dynamic simulation. The research revisits the theoretical background of multibody dynamic simulation and the finite element method to find suitable parts with which to form a new approach for efficient stress calculation. This study proposes that the problem of stress calculation during dynamic simulation can be greatly simplified by combining the floating frame of reference formulation with modal superposition and a sub-modeling approach. In practice, the proposed approach can be used to efficiently generate the relevant fatigue-assessment stress history for a structural detail during or after dynamic simulation. In this work, numerical examples are presented to demonstrate the proposed approach in practice. The results show that the approach is applicable and can be used as proposed.
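
The efficiency of the modal-superposition idea comes from reducing stress recovery to a weighted sum: within the submodel, the stress at a detail is a linear combination of precomputed modal stress fields weighted by the modal coordinates coming out of the dynamic simulation. A minimal numpy sketch, with invented numbers, of that recovery step:

```python
import numpy as np

# Sketch of modal stress recovery: sigma(t) = sum_i q_i(t) * sigma_i,
# where sigma_i are modal stresses precomputed once with a detailed FE
# submodel and q_i(t) are modal coordinates from the multibody
# simulation. All values below are illustrative.

n_modes, n_steps = 3, 4

# Modal stresses at one hot spot (e.g., a weld toe), MPa per unit coordinate
modal_stress = np.array([120.0, -35.0, 8.0])

# Modal coordinates q_i(t) from the dynamic simulation, shape (n_modes, n_steps)
q = np.array([[0.0, 0.5,  1.0,  0.7],
              [0.0, 0.2, -0.1,  0.3],
              [0.0, 0.0,  0.4, -0.2]])

stress_history = modal_stress @ q   # one stress value per time step
print(stress_history)               # feed this into a fatigue assessment
```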

Relevance:

30.00%

Publisher:

Abstract:

Measurement is a tool for research. Therefore, it is important that the measuring process is carried out correctly, without distorting the signal or the measured event. Research on thermoelectric phenomena has focused increasingly on transverse thermoelectric phenomena during recent decades. The transverse Seebeck effect makes it possible to produce thinner and faster heat flux sensors than before. Studies of the transverse Seebeck effect have so far focused on materials, so this Master's Thesis studies the instrumentation of a heat flux sensor based on the transverse Seebeck effect. It examines an equivalent circuit of transverse Seebeck effect heat flux sensors, their connection to electronics, and the choice and design of a suitable amplifier type. The research is carried out as a case study involving Gradient Heat Flux Sensors and an electric motor. In this work, a general equivalent circuit is presented for the transverse Seebeck effect-based heat flux sensor. An amplifier was designed for the sensor of the case study, and a solution was produced for the measurement of the local heat flux of the electric motor so as to improve electromagnetic compatibility.
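
One routine step in choosing an amplifier for such a sensor is sizing the gain: the sensor outputs a small voltage proportional to the heat flux, and the gain must map the expected full-scale signal onto the acquisition range. The sketch below shows that arithmetic under stated assumptions; the sensitivity, flux range, and ADC range are invented, not the thesis's case-study values.

```python
# Hypothetical amplifier-gain sizing for a heat flux sensor chain.
# All parameter values below are assumptions for illustration.

SENSITIVITY = 10e-6      # V per (W/m^2), assumed sensor sensitivity
Q_MAX = 5_000.0          # W/m^2, assumed maximum heat flux in the motor
ADC_FULL_SCALE = 3.3     # V, assumed ADC input range

v_sensor_max = SENSITIVITY * Q_MAX    # expected peak sensor voltage
gain = ADC_FULL_SCALE / v_sensor_max  # required amplifier gain
print(f"peak sensor output: {v_sensor_max*1e3:.1f} mV, gain ~ {gain:.0f}")
# -> peak sensor output: 50.0 mV, gain ~ 66
```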

Relevance:

30.00%

Publisher:

Abstract:

In this paper, a computer program to model and support product design is presented. The product is represented through a hierarchical structure that allows the user to navigate across the product's components, with the aim of facilitating each step of the detail design process. A graphical interface was also developed, which visually shows the user the contents of the product structure. Features are used as building blocks for the parts that compose the product, and an object-oriented methodology was used to implement the product structure. Finally, an expert system was also implemented, whose knowledge-base rules help the user design a product that meets design and manufacturing requirements.
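
The combination described above, a feature-based component tree implemented with object-oriented classes plus rule checks from a knowledge base, can be sketched briefly. Class names, the example product, and the single rule below are assumptions for illustration, not the paper's actual design.

```python
# Hedged sketch: a hierarchical product structure whose parts are
# built from features, plus one illustrative knowledge-base rule.

class Feature:
    def __init__(self, name, **params):
        self.name, self.params = name, params

class Component:
    def __init__(self, name):
        self.name, self.children, self.features = name, [], []
    def add(self, child):
        self.children.append(child)
        return child
    def walk(self, depth=0):
        """Depth-first traversal, mirroring navigation across components."""
        yield depth, self
        for c in self.children:
            yield from c.walk(depth + 1)

def check_drill_rule(component):
    """Toy manufacturing rule: hole diameter must be at least 1 mm."""
    for f in component.features:
        if f.name == "hole" and f.params.get("diameter_mm", 0) < 1.0:
            print(f"{component.name}: hole too small to drill")

product = Component("gearbox")
housing = product.add(Component("housing"))
housing.features.append(Feature("hole", diameter_mm=0.5))
for depth, node in product.walk():
    print("  " * depth + node.name)
    check_drill_rule(node)
```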

Relevance:

30.00%

Publisher:

Abstract:

Software plays an important role in our society and economy. Software development is an intricate process, and it comprises many different tasks: gathering requirements, designing new solutions that fulfill these requirements, and implementing these designs in a programming language as a working system. As a consequence, the development of high-quality software is a core problem in software engineering. This thesis focuses on the validation of software designs. The analysis of designs is of great importance, since errors originating from designs may appear in the final system. It is considered economical to rectify problems as early in the software development process as possible. Practitioners often create and visualize designs using modeling languages, one of the more popular being the Unified Modeling Language (UML). The analysis of designs can be done manually, but in the case of large systems, the need arises for mechanisms that analyze the designs automatically. In this thesis, we propose an automatic approach to analyzing UML-based designs using logic reasoners. The approach first proposes translations of UML-based designs into a language understandable by reasoners, in the form of logic facts, and then shows how to use the logic reasoners to infer the logical consequences of these facts. We have implemented the proposed translations in the form of a tool that can be used with any standard-compliant UML modeling tool. Moreover, we validate the proposed approach by automatically checking hundreds of UML-based designs, consisting of thousands of model elements, available in an online model repository. The proposed approach is limited in scope, but it is fully automatic and does not require any expertise in logic languages from the user. We exemplify the proposed approach with two applications: the validation of domain-specific languages and the validation of web service interfaces.
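
The translate-then-reason idea can be illustrated in miniature: encode a small UML class model as logic facts, close the generalization relation under transitivity, and check a consistency rule. The fact vocabulary and the rule below are assumptions for demonstration, not the thesis's actual encoding or reasoner.

```python
# Hedged sketch: UML generalizations as logic facts, with inferred
# consequences and a simple consistency check (no inheritance cycles).

generalizes = {("Vehicle", "Car"), ("Car", "SportsCar"), ("SportsCar", "Car")}

def transitive_closure(pairs):
    """Saturate the relation: from (a,b) and (b,c) infer (a,c)."""
    closure = set(pairs)
    while True:
        new = {(a, d) for (a, b) in closure for (c, d) in closure if b == c}
        if new <= closure:
            return closure
        closure |= new

facts = transitive_closure(generalizes)

# Inferred consequence: Vehicle is a (transitive) ancestor of SportsCar
print(("Vehicle", "SportsCar") in facts)   # True

# Consistency rule: a class must not (transitively) generalize itself
cycles = {a for (a, b) in facts if a == b}
print("inconsistent classes:", cycles)     # {'Car', 'SportsCar'}
```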