810 results for 380109 Psychological Methodology, Design and Analysis


Relevance:

100.00%

Publisher:

Abstract:

Bacterial reporters are live, genetically engineered cells with promising applications in bioanalytics. They contain genetic circuitry to produce a cellular sensing element, which detects the target compound and relays the detection to specific synthesis of so-called reporter proteins (the presence or activity of which is easy to quantify). Bioassays with bacterial reporters are a useful complement to chemical analytics because they measure biological responses rather than total chemical concentrations. Simple bacterial reporter assays may also replace more costly chemical methods as a first-line sample analysis technique. Recent promising developments integrate bacterial reporter cells with microsystems to produce bacterial biosensors. This lecture presents an in-depth treatment of the synthetic biological design principles of bacterial reporters, the engineering of which started as simple recombinant DNA puzzles, but has now become a more rational approach of choosing and combining sensing, controlling and reporting DNA 'parts'. Several examples of existing bacterial reporter designs and their genetic circuitry will be illustrated. Besides the design principles, the lecture also focuses on the application principles of bacterial reporter assays. A variety of assay formats will be illustrated, and principles of quantification will be dealt with. In addition to this discussion, substantial reference material is supplied in various Annexes.
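
As a hedged illustration of the quantification principles mentioned above, the sketch below fits a four-parameter Hill-type dose-response curve to reporter-assay readings in Python. The concentrations, signal values and parameter names are invented for illustration and are not taken from the lecture material.

# Hedged sketch: calibrating a bacterial reporter assay with a Hill-type
# dose-response model. Data and parameter values are illustrative assumptions.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, signal_min, signal_max, ec50, n):
    """Four-parameter Hill equation relating analyte concentration to reporter signal."""
    return signal_min + (signal_max - signal_min) * conc**n / (ec50**n + conc**n)

# Hypothetical calibration points: inducer concentration (uM) vs. reporter output (a.u.)
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
signal = np.array([120, 150, 310, 720, 1450, 1980, 2150])

popt, _ = curve_fit(hill, conc, signal, p0=[100, 2200, 0.5, 1.0])
print("Fitted EC50 = %.3f uM, Hill coefficient = %.2f" % (popt[2], popt[3]))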

Relevance:

100.00%

Publisher:

Abstract:

Gastric cancer affects about one million people per year worldwide, being the second leading cause of cancer mortality. The study of its etiology therefore remains a global issue, as it may allow the identification of major targets, besides eradication of Helicobacter pylori infection, for primary prevention. It has, however, received little attention, given its comparatively low incidence in most high-income countries. We introduce a consortium of epidemiological investigations named the 'Stomach cancer Pooling (StoP) Project'. Twenty-two studies agreed to participate, for a total of over 9,000 cases and 23,000 controls. Twenty studies have already shared the original data set. Of the patients, 40% are from Asia, 43% from Europe, and 17% from North America; 34% are women and 66% men; the median age is 61 years; 56% are from population-based case-control studies, 41% from hospital-based ones, and 3% from nested case-control studies derived from cohort investigations. Biological samples are available from 12 studies. The aim of the StoP Project is to analyze the role of lifestyle and genetic determinants in the etiology of gastric cancer through pooled analyses of individual-level data. The uniquely large data set will allow us to define and quantify the main effects of each risk factor of interest, including a number of infrequent habits, and to adequately address associations in subgroups of the population, as well as interactions within and between environmental and genetic factors. Further, we will carry out separate analyses according to different histotypes and subsites of gastric cancer, to identify potentially different risk patterns and etiological characteristics.
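
As a hedged illustration of pooling an association across studies, the sketch below computes a Mantel-Haenszel odds ratio from invented study-specific 2x2 tables. This is a simplified stand-in only; the StoP Project itself pools individual-level data rather than study-level tables, and the counts below are not its data.

# Hedged sketch: Mantel-Haenszel odds ratio pooled across studies (invented counts).
import numpy as np

# Each row: (exposed cases a, unexposed cases b, exposed controls c, unexposed controls d)
tables = np.array([
    [ 90, 210, 150, 550],
    [ 60, 140, 100, 400],
    [120, 280, 200, 700],
], dtype=float)

a, b, c, d = tables.T
n = a + b + c + d
or_mh = np.sum(a * d / n) / np.sum(b * c / n)
print(f"Mantel-Haenszel pooled odds ratio: {or_mh:.2f}")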

Relevance:

100.00%

Publisher:

Abstract:

The Center for Transportation Research and Education (CTRE) used the traffic simulation model CORSIM to assess proposed capacity and safety improvement strategies for the U.S. 61 corridor through Burlington, Iowa. The comparison between the base and alternative models allows for evaluation of the traffic flow performance under the existing conditions as well as other design scenarios. The models also provide visualization of performance for interpretation by technical staff, public policy makers, and the public. The objectives of this project are to evaluate the use of traffic simulation models for future use by the Iowa Department of Transportation (DOT) and to develop procedures for employing simulation modeling to conduct the analysis of alternative designs. This report presents both the findings of the U.S. 61 evaluation and an overview of model development procedures. The first part of the report includes the simulation modeling development procedures. The simulation analysis is illustrated through the Burlington U.S. 61 corridor case study application. Part I is not intended to be a user manual but simply introductory guidelines for traffic simulation modeling. Part II of the report evaluates the proposed improvement concepts in a side-by-side comparison of the base and alternative models.
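
As a hedged illustration of such a side-by-side comparison, the sketch below tabulates measures of effectiveness for a base and an alternative model run. The measure names and values are placeholders, not results from the U.S. 61 CORSIM study.

# Hedged sketch: comparing measures of effectiveness (MOEs) between simulation runs.
moes = {
    "average delay (s/veh)":     {"base": 42.0, "alternative": 31.5},
    "average speed (mph)":       {"base": 27.8, "alternative": 33.4},
    "total travel time (veh-h)": {"base": 512.0, "alternative": 455.0},
}

for name, run in moes.items():
    change = 100.0 * (run["alternative"] - run["base"]) / run["base"]
    print(f"{name:28s} base={run['base']:8.1f} alt={run['alternative']:8.1f} change={change:+6.1f}%")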

Relevance:

100.00%

Publisher:

Abstract:

The antihypertensive effects of the beta-blocking agent betaxolol and the calcium entry blocker verapamil were compared in a crossover single-blind trial. Seventeen patients with uncomplicated essential hypertension took either betaxolol or a slow-release formulation of verapamil for two consecutive 6-week periods. The sequence of treatment phases was randomly allocated and a 2-week washout period preceded each treatment. The antihypertensive effect of the test drugs was assessed both at the physician's office and during everyday activities using a portable blood pressure recorder. The crossover design of the trial made it possible to evaluate the antihypertensive efficacy of betaxolol and verapamil both in the group as a whole and in the individual patient. The individual patient response to one of these agents was not a reliable indicator of the same patient's response to the alternative agent. Betaxolol brought both office and ambulatory recorded blood pressures under control in a larger fraction of patients than verapamil, although the magnitude of the blood pressure fall in the responders was equal for each drug. These observations stress the need for an individualized approach to the evaluation of antihypertensive therapy. The present results also demonstrate that optimal antihypertensive therapy is still a matter of trial and error. The precise methodology that ought to characterize crossover trials may make it possible to improve the therapeutic approach to hypertensive patients.
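
As a hedged illustration of how within-patient responses can be compared in a two-period crossover design, the sketch below runs a paired comparison on invented blood-pressure values. It is not the trial's data, nor its complete analysis, which would also examine period and carryover effects.

# Hedged sketch: within-patient comparison in a two-period crossover design (invented data).
import numpy as np
from scipy import stats

# Mean ambulatory systolic BP (mmHg) for the same patients on each drug
bp_betaxolol = np.array([138, 142, 151, 129, 144, 136, 147, 140, 133, 149])
bp_verapamil = np.array([145, 141, 158, 137, 150, 139, 152, 148, 136, 154])

diff = bp_betaxolol - bp_verapamil                       # within-patient differences
t_stat, p_value = stats.ttest_rel(bp_betaxolol, bp_verapamil)
print(f"mean within-patient difference = {diff.mean():.1f} mmHg, p = {p_value:.3f}")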

Relevance:

100.00%

Publisher:

Abstract:

The ultimate goal of any research in the mechanism/kinematic/design area may be called predictive design, i.e. the optimisation of mechanism proportions in the design stage without requiring extensive life and wear testing. This is an ambitious goal and can be realised through the development and refinement of numerical (computational) technology in order to facilitate the design analysis and optimisation of complex mechanisms, mechanical components and systems. As a part of the systematic design methodology, this thesis concentrates on kinematic synthesis (kinematic design and analysis) methods in the mechanism synthesis process. The main task of kinematic design is to find all possible solutions, in the form of structural parameters, that accomplish the desired requirements of motion. The main formulations of kinematic design can be broadly divided into exact synthesis and approximate synthesis formulations. The exact synthesis formulation is based on solving n linear or nonlinear equations in n variables, and the solutions are obtained by adopting closed-form classical or modern algebraic solution methods, or numerical solution methods based on polynomial continuation or homotopy. The approximate synthesis formulation is based on minimising the approximation error by direct optimisation. The main drawbacks of the exact synthesis formulation are (ia) the limited number of design specifications and (iia) the failure to handle design constraints, especially inequality constraints. The main drawbacks of approximate synthesis formulations are (ib) the difficulty of choosing a proper initial linkage and (iib) the difficulty of finding more than one solution. Recent formulations for solving the approximate synthesis problem adopt polynomial continuation, which provides several solutions but cannot handle inequality constraints. Based on practical design needs, a mixed exact-approximate position synthesis with two exact and an unlimited number of approximate positions has also been developed. The solution space is presented as a ground pivot map, but the pole between the exact positions cannot be selected as a ground pivot. In this thesis the exact synthesis problem of planar mechanisms is solved by generating all possible solutions for the optimisation process, including solutions in positive-dimensional solution sets, within inequality constraints on the structural parameters. Through a literature survey it is first shown that the algebraic and numerical solution methods used in the research area of computational kinematics are capable of solving non-parametric algebraic systems of n equations in n variables, but cannot handle the singularities associated with positive-dimensional solution sets. In this thesis the problem of positive-dimensional solution sets is solved by adopting the main principles from the mathematical research area of algebraic geometry for solving parametric algebraic systems (in the mathematical sense that all parameter values for which the system is solvable, including the degenerate cases, are considered) of n equations in at least n+1 variables. By adopting the developed solution method to solve the dyadic equations in direct polynomial form with two to three precision points, it has been algebraically proved and numerically demonstrated that the map of the ground pivots is ambiguous and that the singularities associated with positive-dimensional solution sets can be resolved.
The positive-dimensional solution sets associated with the poles might contain physically meaningful solutions in the form of optimal, defect-free mechanisms. Traditionally, the mechanism optimisation of hydraulically driven boom mechanisms is done at an early stage of the design process. This results in optimal component design rather than optimal system-level design. Modern mechanism optimisation at the system level demands the integration of kinematic design methods with mechanical system simulation techniques. In this thesis a new kinematic design method for hydraulically driven boom mechanisms is developed and integrated into mechanical system simulation techniques. The developed kinematic design method is based on combinations of the two-precision-point formulation and on optimisation (with mathematical programming techniques or optimisation methods based on probability and statistics) of substructures, using criteria calculated from the system-level response of multi-degree-of-freedom mechanisms. For example, by adopting the mixed exact-approximate position synthesis in direct optimisation (using mathematical programming techniques) with two exact positions and an unlimited number of approximate positions, the drawbacks (ia)-(iib) are eliminated. The design principles of the developed method are based on the design-tree approach to mechanical systems, and the design method is, in principle, capable of capturing the interrelationship between kinematic and dynamic synthesis simultaneously when the developed kinematic design method is integrated with mechanical system simulation techniques.
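
As a hedged illustration of the dyadic (standard-form) synthesis equations mentioned above, the sketch below solves W(e^{iβ_j} − 1) + Z(e^{iα_j} − 1) = δ_j for three precision positions with numpy. The angles and displacements are arbitrary illustrative choices, and the code does not reproduce the thesis's algebraic-geometry treatment of positive-dimensional solution sets.

# Hedged sketch: planar dyad synthesis for three precision positions (j = 2, 3),
# solving a 2x2 complex linear system for the dyad vectors W and Z.
import numpy as np

def solve_dyad(beta, alpha, delta):
    """Solve W*(exp(i*beta_j)-1) + Z*(exp(i*alpha_j)-1) = delta_j for W and Z."""
    A = np.array([[np.exp(1j * beta[0]) - 1, np.exp(1j * alpha[0]) - 1],
                  [np.exp(1j * beta[1]) - 1, np.exp(1j * alpha[1]) - 1]])
    W, Z = np.linalg.solve(A, np.array(delta, dtype=complex))
    return W, Z

# Free choices of crank rotations (beta) and prescribed coupler rotations (alpha)
beta = np.deg2rad([20.0, 45.0])
alpha = np.deg2rad([10.0, 25.0])
# Prescribed displacements of the precision point from position 1 (complex plane)
delta = [1.0 + 0.5j, 2.2 + 1.4j]

W, Z = solve_dyad(beta, alpha, delta)
print("W =", W, " Z =", Z)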

Relevance:

100.00%

Publisher:

Abstract:

Superheater corrosion causes vast annual losses for power companies. With a reliable corrosion prediction method, plants can be designed accordingly, and knowledge of fuel selection and determination of process conditions may be utilized to minimize superheater corrosion. Growing interest in using recycled fuels creates additional demands for the prediction of corrosion potential. Models depending on corrosion theories will fail if the relations between the inputs and the output are poorly known. A prediction model based on fuzzy logic and an artificial neural network is able to improve its performance as the amount of data increases. The corrosion rate of a superheater material can most reliably be determined with a test done in a test combustor or in a commercial boiler. The steel samples can be located in a special, temperature-controlled probe and exposed to the corrosive environment for a desired time. These tests give information about the average corrosion potential in that environment. Samples may also be cut from superheaters during shutdowns. The analysis of samples taken from probes or superheaters after exposure to a corrosive environment is a demanding task: if the corrosive contaminants can be reliably analyzed, the corrosion chemistry can be determined, and an estimate of the material lifetime can be given. In cases where the reason for corrosion is not clear, the determination of the corrosion chemistry and the lifetime estimation are more demanding. In order to provide a laboratory tool for the analysis and prediction, a new approach was chosen. During this study, the following tools were generated: (1) a model for the prediction of superheater fireside corrosion, based on fuzzy logic and an artificial neural network, built on a corrosion database developed from fuel and bed material analyses and measured corrosion data; the developed model predicts superheater corrosion with high accuracy at the early stages of a project; and (2) an adaptive corrosion analysis tool based on image analysis, constructed as an expert system; this system utilizes user-defined algorithms, which allows the development of an artificially intelligent system for the task. According to the results of the analyses, several new rules were developed for the determination of the degree and type of corrosion. By combining these two tools, a user-friendly expert system for the prediction and analysis of superheater fireside corrosion was developed. This tool may also be used for the minimization of corrosion risks in the design of fluidized bed boilers.
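
As a hedged illustration of the data-driven part of such a prediction tool, the sketch below regresses a corrosion indicator on fuel and bed-material features with a small neural network. The feature names and data are invented, and the thesis's actual model additionally combines fuzzy logic with a network trained on its corrosion database.

# Hedged sketch: neural-network regression of a corrosion indicator (invented features/data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical features: [Cl in fuel %, alkali content %, steam temperature C, S/Cl molar ratio]
X = rng.uniform([0.0, 0.1, 450, 0.5], [0.6, 2.0, 560, 6.0], size=(200, 4))
# Hypothetical target: corrosion rate rising with Cl and temperature, falling with S/Cl
y = 5 * X[:, 0] + 0.02 * (X[:, 2] - 450) - 0.4 * X[:, 3] + rng.normal(0, 0.2, 200)

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(scaler.transform(X), y)
print("Predicted corrosion index:", model.predict(scaler.transform([[0.4, 1.0, 540, 1.5]]))[0])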

Relevance:

100.00%

Publisher:

Abstract:

The networking and digitalization of audio equipment has created a need for control protocols. These protocols offer new services to customers and ensure that the equipment operates correctly. The control protocols used in computer networks are not directly applicable, since embedded systems have resource and cost limitations. In this master's thesis, the design and implementation of new loudspeaker control network protocols are presented. The protocol stack was required to be reliable, have short response times, configure the network automatically and support the dynamic addition and removal of loudspeakers. The implemented protocol stack was also required to be as efficient and lightweight as possible because the network nodes are fairly simple and lack processing power. The protocol stack was thoroughly tested, validated and verified. The protocols were formally described using LOTOS (Language Of Temporal Ordering Specification) and verified using reachability analysis. A prototype of the loudspeaker network was built and used for testing the operation and the performance of the control protocols. The implemented control protocol stack met the design specifications and proved to be highly reliable and efficient.
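
As a hedged illustration of one common ingredient of such a lightweight control protocol, the sketch below implements a stop-and-wait command with timeout and retransmission over UDP in Python. The message format, address and retry constants are assumptions for illustration, not the thesis's actual loudspeaker protocol (which was specified in LOTOS and implemented on embedded nodes).

# Hedged sketch: stop-and-wait control request with timeout and retransmission over UDP.
import socket

def send_command(addr, seq, payload, retries=3, timeout=0.2):
    """Send a command and wait for a matching acknowledgement, retransmitting on timeout."""
    message = bytes([seq]) + payload
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        for attempt in range(retries):
            sock.sendto(message, addr)
            try:
                reply, _ = sock.recvfrom(64)
                if reply and reply[0] == seq:    # ACK carries the same sequence number
                    return reply[1:]
            except socket.timeout:
                continue                         # retransmit on timeout
    raise TimeoutError("no acknowledgement after %d attempts" % retries)

# Example (assumes a node listening at a hypothetical address and command format):
# response = send_command(("192.168.1.20", 5005), seq=1, payload=b"\x10\x42")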

Relevance:

100.00%

Publisher:

Abstract:

This paper is a literature review that describes the state of the art in the construction of permanent magnet generators and motors and discusses current and possible applications of these machines in industry. Permanent magnet machines are a well-known class of rotating and linear electric machines that have been used for many years in industrial applications. Particular interest in permanent magnet generators is connected with wind turbines, which are becoming increasingly popular. Geared and direct-driven permanent magnet generators are described. A classification of direct-driven permanent magnet generators is given. Design aspects of permanent magnet generators are presented. Permanent magnet generator designs for wind turbines are highlighted. Dynamics and vibration problems of permanent magnet generators covered in the literature are presented. The application of the finite element method to the solution of mechanical problems in the field of permanent magnet generators is discussed.

Relevance:

100.00%

Publisher:

Abstract:

This thesis concentrates on developing a practical local approach methodology based on micromechanical models for the analysis of ductile fracture of welded joints. Two major problems involved in the local approach, namely the dilational constitutive relation reflecting the softening behaviour of the material, and the failure criterion associated with the constitutive equation, have been studied in detail. Firstly, considerable effort was devoted to the numerical integration and computer implementation of the nontrivial dilational Gurson-Tvergaard model. Considering the weaknesses of the widely used Euler forward integration algorithms, a family of generalized midpoint algorithms is proposed for the Gurson-Tvergaard model. Correspondingly, based on the decomposition of stresses into hydrostatic and deviatoric parts, an explicit seven-parameter expression for the consistent tangent moduli of the algorithms is presented. This explicit formula avoids any matrix inversion during numerical iteration and thus greatly facilitates the computer implementation of the algorithms and increases the efficiency of the code. The accuracy of the proposed algorithms and other conventional algorithms has been assessed in a systematic manner in order to identify the best algorithm for this study. The accurate and efficient performance of the present finite element implementation of the proposed algorithms has been demonstrated by various numerical examples. It has been found that the true midpoint algorithm (a = 0.5) is the most accurate one when the deviatoric strain increment is radial to the yield surface, and that it is very important to use the consistent tangent moduli in the Newton iteration procedure. Secondly, an assessment has been made of the consistency of current local failure criteria for ductile fracture: the critical void growth criterion, the constant critical void volume fraction criterion, and Thomason's plastic limit load failure criterion. Significant differences in the predictions of ductility by the three criteria were found. By assuming that the void grows spherically and using the void volume fraction from the Gurson-Tvergaard model to calculate the current void-matrix geometry, Thomason's failure criterion has been modified and a new failure criterion for the Gurson-Tvergaard model is presented. Comparison with Koplik and Needleman's finite element results shows that the new failure criterion is indeed fairly accurate. A novel feature of the new failure criterion is that a mechanism for void coalescence is incorporated into the constitutive model. Hence material failure is a natural result of the development of macroscopic plastic flow and the microscopic internal necking mechanism. Under the new failure criterion, the critical void volume fraction is not a material constant; the initial void volume fraction and/or the void nucleation parameters essentially control the material failure. This feature is very desirable and makes the numerical calibration of the void nucleation parameter(s) possible and physically sound. Thirdly, a local approach methodology based on the above two major contributions has been built in ABAQUS via the user material subroutine UMAT and applied to welded T-joints. Using the void nucleation parameters calibrated from simple smooth and notched specimens, it was found that the fracture behaviour of the welded T-joints can be well predicted with the present methodology.
This application has shown how the damage parameters of both the base material and the heat-affected zone (HAZ) material can be obtained in a step-by-step manner, and how useful and capable the local approach methodology is in the analysis of fracture behaviour and crack development as well as in the structural integrity assessment of practical problems where non-homogeneous materials are involved. Finally, a procedure for the possible engineering application of the present methodology is suggested and discussed.
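
As a hedged illustration, the sketch below evaluates the Gurson-Tvergaard yield function that such integration (return-mapping) algorithms drive to zero. The parameter values are common textbook choices (q1 = 1.5, q2 = 1.0, q3 = 2.25), not values calibrated in the thesis.

# Hedged sketch: Gurson-Tvergaard yield function
#   Phi = (sigma_eq/sigma_y)**2 + 2*q1*f*cosh(3*q2*sigma_m/(2*sigma_y)) - (1 + q3*f**2)
import numpy as np

def gurson_tvergaard(sigma_eq, sigma_m, sigma_y, f, q1=1.5, q2=1.0, q3=2.25):
    """Yield function value: negative means elastic, zero means on the yield surface."""
    return (sigma_eq / sigma_y) ** 2 \
        + 2.0 * q1 * f * np.cosh(1.5 * q2 * sigma_m / sigma_y) \
        - (1.0 + q3 * f ** 2)

# Example: von Mises stress 300 MPa, hydrostatic stress 200 MPa,
# matrix yield stress 350 MPa, void volume fraction 2 %
print("Phi =", gurson_tvergaard(300.0, 200.0, 350.0, 0.02))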

Relevance:

100.00%

Publisher:

Abstract:

The aim of this thesis was to design and manufacture a microdistillation column. The literature review part of the thesis covers stainless steels, material processing and the basics of engineering design and distillation. The main focus, however, is on the experimental part. The experimental part is divided into five distinct sections. In the first, the device is introduced and separated into three parts. In the second, the device is designed part by part; this consists mostly of detail problem solving, since the first drawings had already been drawn and the critical dimensions decided. The third section covers the manufacture, which was not fully completed since the final assembly was left outside the scope of this thesis. The fourth covers the test welding for the device and its analysis. Finally, some ideas for further studies are presented. The main goal of this thesis was accomplished. The device lacks only the final assembly but is otherwise complete. One thing that became clear during the process was how difficult it is to produce small and precise steel parts with conventional manufacturing methods. Internal stresses within steel plates and thermal distortions can easily ruin small steel structures. Designing appropriate welding jigs is an important task even for simple devices. Laser material processing is a promising tool for this kind of steel processing because of its flexibility, good cutting quality, and precise, low heat input in welding. The next step in this project is the final assembly and the actual distillation tests. The tests will be carried out at Helsinki University of Technology.

Relevance:

100.00%

Publisher:

Abstract:

Systems biology is a new, emerging and rapidly developing multidisciplinary research field that aims to study biochemical and biological systems from a holistic perspective, with the goal of providing a comprehensive, system-level understanding of cellular behaviour. In this way, it addresses one of the greatest challenges faced by contemporary biology, which is to comprehend the function of complex biological systems. Systems biology combines various methods that originate from scientific disciplines such as molecular biology, chemistry, engineering sciences, mathematics, computer science and systems theory. Systems biology, unlike “traditional” biology, focuses on high-level concepts such as network, component, robustness, efficiency, control, regulation, hierarchical design, synchronization, concurrency, and many others. The very terminology of systems biology is “foreign” to “traditional” biology; it marks a drastic shift in the research paradigm and indicates the close linkage of systems biology to computer science. One of the basic tools utilized in systems biology is the mathematical modelling of life processes, tightly linked to experimental practice. The studies contained in this thesis revolve around a number of challenges commonly encountered in computational modelling in systems biology. The research comprises the development and application of a broad range of methods, originating in the fields of computer science and mathematics, for the construction and analysis of computational models in systems biology. In particular, the research is set up in the context of two biological phenomena chosen as modelling case studies: 1) the eukaryotic heat shock response and 2) the in vitro self-assembly of intermediate filaments, one of the main constituents of the cytoskeleton. The range of presented approaches spans from heuristic, through numerical and statistical, to analytical methods applied in the effort to formally describe and analyse the two biological processes. We note, however, that although applied to certain case studies, the presented methods are not limited to them and can be utilized in the analysis of other biological mechanisms as well as complex systems in general. The full range of developed and applied modelling techniques as well as model analysis methodologies constitutes a rich modelling framework. Moreover, the presentation of the developed methods, their application to the two case studies and the discussions concerning their potentials and limitations point to the difficulties and challenges one encounters in the computational modelling of biological systems. The problems of model identifiability, model comparison, model refinement, model integration and extension, the choice of the proper modelling framework and level of abstraction, and the choice of the proper scope of the model run through this thesis.
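
As a hedged illustration of the kind of ODE-based modelling used for such case studies, the sketch below integrates a toy mass-action model with scipy. The reactions and rate constants are invented and do not correspond to the thesis's heat shock response or filament self-assembly models.

# Hedged sketch: numerical simulation of a toy mass-action model (invented kinetics).
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, k_on, k_off):
    """Reversible dimerisation A + A <-> D under mass-action kinetics."""
    A, D = y
    rate = k_on * A * A - k_off * D
    return [-2.0 * rate, rate]

sol = solve_ivp(rhs, t_span=(0.0, 50.0), y0=[1.0, 0.0],
                args=(0.5, 0.05), dense_output=True)
print("Monomer and dimer concentrations at t = 50:", sol.y[:, -1])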

Relevance:

100.00%

Publisher:

Abstract:

The present research focuses on how to design messages and select media in advertising to generate customers' purchase intention towards senior mobile phones in China. The message design part concentrates mainly on message framing and fear appeals, while the media selection method is based only on direct matching. To explore the main research question, the study utilized a qualitative methodology. The data collection consisted of a pre-interview questionnaire, interviews, and three sets of experiments. The experiments were designed to test the 18 selected participants' responses to different emotional appeals and message framings. The findings illustrate participants' understanding of senior mobile phones and their media usage habits. Moreover, positive message framing and emotional appeals in advertising are more effective. Gender differences in responding to emotional appeals were explored as well.

Relevance:

100.00%

Publisher:

Abstract:

This master's thesis focuses mainly on the design requirements of an electric drive for a hybrid car application and its control strategy for achieving a wide speed range. It also emphasises how the control and performance requirements are transformed into design variables. A parallel hybrid topology is considered, where an IC engine and an electric drive share a common crankshaft. A permanent magnet synchronous machine (PMSM) is used as the electric drive machine. Performance requirements are converted into machine design variables using the vector model of the PMSM. The main dimensions of the machine are arrived at using an analytical approach, and finite element analysis (FEA) is used to verify the design and performance. A vector control algorithm was used to control the machine. The control algorithm was tested on a low-power PMSM using an embedded controller. A 10 kW PMSM prototype was built according to the design values. The prototype was tested in the laboratory using a high-power converter. Tests were carried out to verify different operating modes. The results were in agreement with the calculations.
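
As a hedged illustration of how performance requirements relate to design variables through the dq-frame (vector) model, the sketch below evaluates the standard PMSM torque expression. The numerical values are illustrative only, not the 10 kW prototype's parameters.

# Hedged sketch: PMSM electromagnetic torque in the rotor (dq) reference frame,
#   T = 1.5 * p * (psi_pm * i_q + (L_d - L_q) * i_d * i_q)
def pmsm_torque(p, psi_pm, L_d, L_q, i_d, i_q):
    """Torque [Nm] from pole pairs p, PM flux linkage psi_pm [Vs], inductances [H], currents [A]."""
    return 1.5 * p * (psi_pm * i_q + (L_d - L_q) * i_d * i_q)

# Example: 4 pole pairs, 0.12 Vs PM flux linkage, slight saliency, i_d = -5 A, i_q = 40 A
print("Torque:", pmsm_torque(p=4, psi_pm=0.12, L_d=2.0e-3, L_q=2.6e-3, i_d=-5.0, i_q=40.0), "Nm")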

Relevance:

100.00%

Publisher:

Abstract:

At present, permanent magnet synchronous generators (PMSGs) are of great interest. Since they do not have electrical excitation losses, the highly efficient, lightweight and compact PMSGs equipped with damper windings work perfectly when connected to a network. However, in island operation, the generator (or parallel generators) alone is responsible for building up the network and maintaining its voltage and reactive power level. Thus, in island operation, a PMSG faces very tight constraints, which are difficult to meet, because the flux produced by the permanent magnets (PMs) is constant and the voltage of the generator cannot be controlled. Traditional electrically excited synchronous generators (EESGs) can easily meet these constraints, because the field winding current is controllable. The main drawback of the conventional EESG is the relatively high excitation loss. This doctoral thesis presents a study of an alternative solution termed a hybrid excitation synchronous generator (HESG). HESGs are a special class of electrical machines, where the total rotor current linkage is produced by the simultaneous action of two different excitation sources: the electrical and permanent magnet (PM) excitation. An overview of the existing HESGs is given. Several HESGs are introduced and compared with the conventional EESG from technical and economic points of view. In the study, the armature-reaction-compensated permanent magnet synchronous generator with alternated current linkages (ARC-PMSG with ACL) showed a better performance than the other options. Therefore, this machine type is studied in more detail. An electromagnetic design and a thermal analysis are presented. To verify the operation principle and the electromagnetic design, a down-sized prototype of 69 kVA apparent power was built. The experimental results are demonstrated and compared with the predicted ones. A prerequisite for an ARC-PMSG with ACL is an even number of pole pairs (p = 2, 4, 6, …) in the machine. Naturally, the HESG technology is not limited to even-pole-pair machines. However, the analysis of machines with p = 3, 5, 7, … becomes more complicated, especially if analytical tools are used, and is outside the scope of this thesis. The contribution of this study is to propose a solution where an ARC-PMSG replaces an EESG in electrical power generation while meeting all the requirements set for generators, for instance by ship classification societies, particularly as regards island operation. The maximum power level when applying the technology studied here is mainly limited by the economy of the machine: the larger the machine, the smaller the efficiency benefit. Nevertheless, it seems that machines of up to ten megawatts could benefit from the technology, and in low-power applications, for instance in the 500 kW range, the efficiency increase can be significant.

Relevance:

100.00%

Publisher:

Abstract:

The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming a more competitive and efficient service provider while still being able to provide unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore an emerging phenomenon in the service context is information awareness. Terms like big data and the Internet of things are not only modern buzzwords; they also describe urgent requirements for a new type of competences and solutions. When the amount of information increases and the systems processing information become more efficient and intelligent, it is the human understanding and objectives that may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: What kind of information is created, possessed and utilized in the service context, and even more importantly, what information exists but is not acknowledged or used? In this dissertation the focus is on the relationship between service design and service operations. Reframing this relationship refers to viewing the service system from the architectural perspective. The selected perspective allows analysing the relationship between design activities and operational activities as an information system while maintaining a tight connection to existing service research contributions and approaches. This type of innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations and the testing of the design artifact components in real service contexts. The relationship between design and operations is analysed in the health care and social care service systems. The existing contributions in service research tend to abstract services and service systems as value creation, working or interactive systems. This dissertation adds an important information processing system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than existing scientific and managerial models tend to assume. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unknown managerial task. At the architectural level, service system-specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also provide very practical implications for service providers. Personal, visual and hidden activities of service, and more importantly all changes that take place in any service system, also have an information dimension.
Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to increase activities and provide a new type of service potential for customers.