937 results for choice task design
Abstract:
The general task of a clamping device is to connect parts to a machining centre so that the workpiece stays fixed in position throughout the machining process. In addition, the workpiece should be easy and quick for the machine operator to clamp. The purpose of this Master's thesis project was to develop the product design and dimensioning of a hydraulic vise system for Astex Engineering OY, following the general principles of product design and development throughout the design process. The needs of manufacturing and assembly were taken into consideration throughout the process, to ensure machinability and to minimize manufacturing cost. The most critical component of the clamping device was modelled with FEM to verify that the strength requirements were met. The 3D model was created with SolidWorks and the FEM analysis was performed with Cosmos software. As a result of this design work, a prototype of the hydraulic vise was manufactured for Astex Engineering OY and for practical testing.
Abstract:
Rodrigo, Chamizo, McLaren, & Mackintosh (1997) demonstrated the blocking effect in a navigational task using a swimming pool: rats initially trained to use three landmarks (ABC) to find an invisible platform learned less about a fourth landmark (X) added later than did rats trained from the outset with all four landmarks (ABCX). The aim of the experiment reported here was to demonstrate unblocking using a procedure similar to that of the previous work. Three groups of rats were initially trained to find an invisible platform in the presence of three landmarks: ABC for the Blocking and Unblocking groups and LMN for the Control group. Then, all animals were trained to find the platform in the presence of four landmarks, ABCX. In this second training, only a new landmark (X) was added for the Blocking group relative to the first training, whereas for the Unblocking group the platform position was also changed. For the Control group, both the four landmarks and the platform position were entirely new at the start of this second training. As in Rodrigo et al. (1997), a blocking effect was found: rats in the Blocking group learned less about the added landmark (X) than did animals in the Control group. However, rats in the Unblocking group learned about the added landmark (X) as well as animals in the Control group did. The results are interpreted as an unblocking effect due to the change in platform position between the two phases of training, analogous to classical conditioning experiments in which a change in the conditions of reinforcement between the two training phases of a blocking design produces an attenuation or elimination of the effect. These results are explained within an error-correcting connectionist account of spatial navigation (McLaren, 2002).
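As an illustrative aside: error-correcting accounts of this kind rest on a delta-rule update of the sort used in associative learning models. A minimal sketch, using the standard Rescorla-Wagner form rather than the specific McLaren (2002) model, is:

\[
\Delta V_X = \alpha_X \, \beta \left( \lambda - \sum_{i \in \{A,B,C,X\}} V_i \right)
\]

where \(V_i\) is the associative strength of landmark \(i\), \(\alpha_X\) and \(\beta\) are salience and learning-rate parameters, and \(\lambda\) is the asymptote set by the outcome. After ABC training the summed strength already approaches \(\lambda\), leaving little prediction error and hence little learning about X (blocking); moving the platform reintroduces a prediction error, so learning about X resumes (unblocking).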
Abstract:
The aim of this thesis was to design and manufacture a microdistillation column. The literature review covers stainless steels, material processing, and the basics of engineering design and distillation. The main focus, however, is on the experimental part, which is divided into five distinct sections. In the first, the device is introduced and divided into three parts. In the second, the device is designed part by part; this consists mostly of detailed problem solving, since the first drawings had already been drawn and the critical dimensions decided. The third section covers the manufacture, which was not fully completed, since the final assembly was left outside the scope of this thesis. The fourth section covers the test welding for the device and its analysis. Finally, some ideas for further studies are presented. The main goal of this thesis was accomplished: the device lacks only the final assembly but is otherwise complete. One thing that became clear during the process was how difficult it is to produce small and precise steel parts with conventional manufacturing methods. Internal stresses within steel plates and thermal distortions can easily ruin small steel structures, and designing appropriate welding jigs is an important task even for simple devices. Laser material processing is a promising tool for this kind of steel work because of its flexibility, good cutting quality, and precise, low heat input when welding. The next step in this project is the final assembly and the actual distillation tests, which will be carried out at Helsinki University of Technology.
Abstract:
Broadcasting systems are networks in which the transmission is received by several terminals. Broadcast receivers are generally passive devices in the network, meaning that they do not interact with the transmitter. Providing a certain Quality of Service (QoS) for receivers in a heterogeneous reception environment with no feedback is not an easy task. Forward error control coding can be used to protect against transmission errors and thus enhance the QoS of broadcast services. For good performance in terrestrial wireless networks, diversity should be exploited; this is done by applying interleaving together with forward error correction codes. This dissertation studies the design and analysis of forward error control and control signalling for providing QoS in wireless broadcasting systems. Control signalling is used in broadcasting networks to give the receiver the information it needs to connect to the network and to receive the transmitted services. Control signalling is usually transmitted through a dedicated path in the system, so the relationship between the signalling and service data paths should be considered early in the design phase. Modelling and simulation are used in the case studies of this dissertation to study this relationship. The dissertation begins with a survey of the broadcasting environment and the mechanisms for providing QoS therein. Case studies then present the analysis and design of such mechanisms in real systems. The first case study analyses the mechanisms for providing QoS at the DVB-H link layer, considering the signalling and service data paths and their relationship; in particular, the performance of different service data decoding mechanisms and the selection of optimal signalling transmission parameters are presented. The second case study investigates the design of the signalling and service data paths for the more modern DVB-T2 physical layer. By comparing the performance of the signalling and service data paths in simulations, configuration guidelines for DVB-T2 physical layer signalling are given; these guidelines can prove useful when configuring DVB-T2 transmission networks. Finally, recommendations for the design of data and signalling paths are given based on the findings of the case studies. The requirements for the signalling design should be derived from the requirements for the main services, and should generally be more demanding, since signalling is the enabler of service reception.
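As an illustration of the kind of forward error correction analysis involved (not a calculation taken from the dissertation), the sketch below estimates the residual block error probability of a Reed-Solomon style code that corrects up to t symbol errors, assuming independent symbol errors; the code parameters and channel error rate are placeholder values.

```python
from math import comb

def block_error_prob(n: int, t: int, p_sym: float) -> float:
    """Probability that more than t of the n symbols are in error,
    i.e. the block cannot be corrected, assuming i.i.d. symbol errors."""
    return sum(comb(n, k) * p_sym**k * (1 - p_sym)**(n - k)
               for k in range(t + 1, n + 1))

# Example: an RS(255, 191) code corrects up to t = (255 - 191) // 2 = 32
# symbol errors per block; p_sym is an assumed raw symbol error rate.
print(block_error_prob(n=255, t=32, p_sym=0.05))
```

Comparing such residual error probabilities for the signalling path and the service data path is one way to check that the signalling is at least as robust as the services it enables.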
Abstract:
This study compares different electric propulsion systems and presents the results of an analysis of the advantages and disadvantages of each. The thesis assesses the possibilities of applying different diesel-electric propulsion concepts to different vessel types, focusing on the power ranges of small and medium-sized vessels. The optimal delivery system is chosen on the basis of a detailed study of the concepts, the electrical equipment market, and a comparison of mass, volume, and efficiency parameters. Three marine generators are designed in this thesis: a salient pole synchronous generator and two permanent magnet synchronous generators. Their electrical, dimensional, cost, and efficiency parameters are compared, and diagrams of these parameters are prepared to illustrate the benefits. Possible benefits and cost savings are estimated. As a result, the advantages, disadvantages, and boundary conditions for applying permanent magnet synchronous generators in marine electric power systems are identified.
Abstract:
Mathematical models often contain parameters that need to be calibrated from measured data. The emergence of efficient Markov Chain Monte Carlo (MCMC) methods has made the Bayesian approach a standard tool in quantifying the uncertainty in the parameters. With MCMC, the parameter estimation problem can be solved in a fully statistical manner, and the whole distribution of the parameters can be explored, instead of obtaining point estimates and using, e.g., Gaussian approximations. In this thesis, MCMC methods are applied to parameter estimation problems in chemical reaction engineering, population ecology, and climate modeling. Motivated by the climate model experiments, the methods are developed further to make them more suitable for problems where the model is computationally intensive. After the parameters are estimated, one can start to use the model for various tasks. Two such tasks are studied in this thesis: optimal design of experiments, where the task is to design the next measurements so that the parameter uncertainty is minimized, and model-based optimization, where a model-based quantity, such as the product yield in a chemical reaction model, is optimized. In this thesis, novel ways to perform these tasks are developed, based on the output of MCMC parameter estimation. A separate topic is dynamical state estimation, where the task is to estimate the dynamically changing model state, instead of static parameters. For example, in numerical weather prediction, an estimate of the state of the atmosphere must constantly be updated based on the recently obtained measurements. In this thesis, a novel hybrid state estimation method is developed, which combines elements from deterministic and random sampling methods.
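For readers unfamiliar with MCMC-based parameter estimation, the following is a minimal random-walk Metropolis sketch, not the adaptive or hybrid algorithms developed in the thesis; the Gaussian toy model and data are invented for illustration.

```python
import numpy as np

def metropolis(log_post, theta0, n_iter=10000, step=0.1, seed=0):
    """Random-walk Metropolis: draw samples from a posterior given its log density."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)  # propose a move
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:  # accept with prob min(1, ratio)
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Toy example: posterior of the mean of Gaussian data with known unit variance
# and a flat prior.
data = np.array([1.2, 0.8, 1.1, 0.9, 1.3])
log_post = lambda th: -0.5 * np.sum((data - th[0]) ** 2)
samples = metropolis(log_post, theta0=[0.0])
print(samples.mean(), samples.std())
```

The chain explores the whole posterior distribution rather than returning a single point estimate, which is exactly what makes the subsequent tasks (optimal experimental design, model-based optimization) possible from the MCMC output.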
Abstract:
Filtration is a widely used unit operation in chemical engineering. The huge variation in the properties of the materials to be filtered makes the study of filtration a challenging task. One of the objectives of this thesis was to show that conventional filtration theories are difficult to use when the system to be modelled contains all of the stages and features present in a complete solid/liquid separation process. Furthermore, most filtration theories require experimental work in order to obtain the critical parameters needed by the theoretical models. Creating a good overall understanding of how the variables affect the final product in filtration is practically impossible on a purely theoretical basis. The complexity of solid/liquid separation processes requires experimental work, and when tests are needed it is advisable to use experimental design techniques so that the goals can be achieved. The statistical design of experiments provides the necessary tools for recognising the effects of variables and also helps to perform the experimental work more economically. Design of experiments is a prerequisite for creating empirical models that describe how the measured response is related to changes in the values of the variables. A software package was developed that provides a filtration practitioner with experimental designs and calculates the parameters for linear regression models, along with a graphical representation of the responses. The developed software consists of two modules, LTDoE and LTRead. The LTDoE module is used to create experimental designs for different filter types. The filter types considered in the software are the automatic vertical pressure filter, double-sided vertical pressure filter, horizontal membrane filter press, vacuum belt filter, and ceramic capillary action disc filter. It is also possible to create experimental designs for cases where the variables are entirely user defined, say for a customized filtration cycle or a different piece of equipment. The LTRead module is used to read the experimental data gathered from the experiments, to analyse the data, and to create models for each of the measured responses. The main part of this thesis introduces the structure of the software in more detail and shows some of its practical applications. This approach to the study of cake filtration processes has been shown to have good practical value when performing filtration tests.
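A minimal illustration of the kind of workflow the LTDoE and LTRead modules automate is sketched below: generating a two-level full factorial design and fitting a first-order linear regression model to the measured response. The variable names and response values are hypothetical and are not taken from the software.

```python
import itertools
import numpy as np

# Two-level full factorial design in coded units (-1, +1) for three
# hypothetical filtration variables: pressure, slurry concentration, filtration time.
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical measured response, e.g. cake moisture (%) for each of the 8 runs.
y = np.array([28.1, 25.3, 27.0, 24.2, 26.5, 23.9, 25.8, 22.7])

# Fit a first-order model y = b0 + b1*x1 + b2*x2 + b3*x3 by least squares.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b3"], np.round(coef, 3))))
```

The fitted coefficients indicate the direction and relative magnitude of each variable's effect, which is the information a practitioner needs when choosing operating conditions from a limited number of test runs.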
Abstract:
The aim of this study was to analyse and develop the inbound goods receiving process by removing waste from it in accordance with Lean principles. The development was carried out by increasing the data-transfer automation of receiving, implemented with RFID technology. The theoretical background of the study is Lean thinking: how to eliminate waste, redesign the receiving process, and develop it further. A central task was to identify the factors affecting the choice of technology so that a reliable RFID solution could be taken into use in the case company; the study also describes how the technology was implemented. Finally, the results and conclusions of the redesigned process are presented, together with the follow-up actions through which the use of the technology can be extended to other processes of the case company.
Abstract:
This work presents recent results concerning a design methodology used to estimate the positioning deviation of a gantry (Cartesian) manipulator, related mainly to the structural elastic deformation of its components under operational conditions. The case-study manipulator is of the gantry type and its basic dimensions are 1.53 m x 0.97 m x 1.38 m; the dimensions used for the calculation of the effective workspace due to end-effector path displacement are 1 m x 0.5 m x 0.5 m. The manipulator is composed of four basic modules, defined as module X, module Y, module Z, and the terminal arm, to which the end-effector is connected. Each controlled axis performs a linear-parabolic positioning movement. The path planning algorithm takes the maximum velocity and the total distance as input parameters for a given task, with equal acceleration and deceleration times. The Denavit-Hartenberg parameterization method is used in the manipulator kinematics model. The gantry manipulator can be modeled as four rigid bodies with three translational degrees of freedom, connected as an open kinematic chain. Dynamic analyses were performed considering inertial parameters such as the mass, inertia, and center of gravity position of each module. These parameters are essential for correct dynamic modeling of the manipulator, owing to the multiple possibilities of motion and the manipulation of objects with different masses. The dynamic analysis consists of a mathematical model of the static and dynamic interactions among the modules. The structural deformations are computed with the finite element method (FEM).
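For reference, the classic Denavit-Hartenberg homogeneous transformation between consecutive links has the standard form below; the specific link parameters of this manipulator are not given in the abstract, so the symbols are generic.

\[
{}^{i-1}T_i =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i\\
\sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i\\
0 & \sin\alpha_i & \cos\alpha_i & d_i\\
0 & 0 & 0 & 1
\end{bmatrix}
\]

where \(\theta_i\), \(d_i\), \(a_i\) and \(\alpha_i\) are the joint angle, link offset, link length and link twist; for the prismatic X, Y and Z axes of a gantry manipulator, the offset \(d_i\) is the actuated variable.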
Abstract:
The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming an ever more competitive and efficient service provider while still being able to provide unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in the daily activities of every service organization, but not all of it, and therefore information awareness is an emerging phenomenon in the service context. Terms like big data and the Internet of things are not only modern buzzwords; they also describe urgent requirements for new types of competences and solutions. As the amount of information increases and the systems processing it become more efficient and intelligent, it is human understanding and objectives that may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: what kind of information is created, possessed, and utilized in the service context, and, even more importantly, what information exists but is not acknowledged or used? The focus of this dissertation is the relationship between service design and service operations. Reframing this relationship means viewing the service system from an architectural perspective. The selected perspective allows the relationship between design activities and operational activities to be analysed as an information system while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations, and the testing of the design artifact components in real service contexts. The relationship between design and operations is analysed in health care and social care service systems. Existing contributions in service research tend to abstract services and service systems as value creation, working, or interactive systems; this dissertation adds an important information processing system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication, and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than existing scientific and managerial models tend to assume. Both activities create, utilize, mix, and share information, making service information management a necessary but relatively unknown managerial task. At the architectural level, service system-specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also provide very practical implications for service providers. Personal, visual, and hidden service activities, and more importantly all changes that take place in any service system, also have an information dimension. Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to improve service activities and to offer a new type of service potential for customers.
Abstract:
Presentation at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014
Abstract:
Swedish summary of the pro gradu (Master's) thesis
Abstract:
There is evidence that the left hemisphere is more competent for motor control than the right hemisphere. This study investigated whether this hemispheric asymmetry is expressed in the latency and duration of sequential responses performed by the left and/or right hands. Thirty-two right-handed young adults (16 males, 16 females; 18-25 years old) were tested in a simple or choice reaction time task. They responded to a left and/or right visual target by moving their left and/or right middle fingers between two keys on each side of the midline. Right-hand reaction time did not differ from left-hand reaction time. Submovement times were longer for the right hand than for the left hand when the response was bilateral. Pause times were shorter for the right hand than for the left hand, whether the responses were unilateral or bilateral. The reaction time results indicate that the putatively more efficient response preparation by left hemisphere motor mechanisms is not expressed behaviorally. The submovement time and pause time results indicate that the putatively more efficient response execution by left hemisphere motor mechanisms is expressed behaviorally. In the case of the submovements, the less efficient motor control of the left hand would be compensated for by more intense attention to this hand.
Abstract:
In this work, the separation of multicomponent mixtures in counter-current columns with supercritical carbon dioxide has been investigated using a process design methodology. First the separation task must be defined; then phase equilibria experiments are carried out and the data obtained are correlated with thermodynamic models or empirical functions. Mutual solubilities, Ki-values, and separation factors αij are determined. Based on these data, possible operating conditions for further extraction experiments can be determined. Separation analyses using graphical methods are performed to optimize the process parameters. Hydrodynamic experiments are carried out to determine the flow capacity diagram. Extraction experiments at laboratory scale are planned and carried out in order to determine HETP values, to validate the simulation results, and to provide new materials for additional phase equilibria experiments, needed to determine the dependence of the separation factors on concentration. Numerical simulation of the separation process and auxiliary systems is carried out to optimize the number of stages, the solvent-to-feed ratio, product purity, yield, and energy consumption. Scale-up and cost analysis close the process design. The separation of palmitic acid and (oleic + linoleic) acids from PFAD (Palm Fatty Acid Distillates) was used as a case study.
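The distribution coefficients and separation factors mentioned above follow the usual definitions for counter-current extraction; the notation here is the standard one and is not reproduced from the thesis:

\[
K_i = \frac{y_i}{x_i}, \qquad \alpha_{ij} = \frac{K_i}{K_j}
\]

where \(y_i\) and \(x_i\) are the fractions of component \(i\) in the CO2-rich extract phase and in the raffinate phase at equilibrium; \(\alpha_{ij} > 1\) indicates that component \(i\) is preferentially extracted over component \(j\).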