837 results for User-centred design


Relevance: 30.00%

Abstract:

This thesis is concerned with the role played by software tools in the analysis and dissemination of linguistic corpora and with their contribution to a more widespread adoption of corpora in different fields. Chapter 1 contains an overview of some of the most relevant corpus analysis tools available today, presenting their most interesting features and some of their drawbacks. Chapter 2 begins with an explanation of the reasons why none of the available tools appears to satisfy the requirements of the user community, and then continues with a technical overview of the current status of the new system developed as part of this work. This presentation is followed by highlights of the features that make the system appealing to users and corpus builders (i.e. scholars willing to make their corpora available to the public). The chapter concludes with an indication of future directions for the project and information on the current availability of the software. Chapter 3 describes the design of an experiment devised to evaluate the usability of the new system in comparison to another corpus tool. Usage of the tool was tested in the context of a documentation task performed on a real assignment during a translation class in a master's degree course. Chapter 4 presents the findings of the experiment on two levels of analysis: first, a discussion of how participants interacted with and evaluated the two corpus tools in terms of interface and interaction design, usability and perceived ease of use; then, an analysis of how users interacted with the corpora to complete the task and of the kinds of queries they submitted. Finally, some general conclusions are drawn and areas for future work are outlined.
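
The abstract does not name the system's query syntax; as an illustration of the kind of concordance lookup participants might submit during such a documentation task, here is a minimal sketch using Python's NLTK with a stand-in corpus (all names below are assumptions, not the thesis's tool):

```python
# Illustrative only: the thesis does not specify its query syntax. This uses
# NLTK to run the kind of concordance lookup a student might submit while
# checking a term's usage during a translation task.
import nltk
from nltk.corpus import gutenberg  # stand-in corpus; the thesis used purpose-built corpora

nltk.download("gutenberg", quiet=True)

text = nltk.Text(gutenberg.words("austen-emma.txt"))
# Show each occurrence of the search term with its surrounding context (KWIC view).
text.concordance("design", width=79, lines=10)
```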

Relevance: 30.00%

Abstract:

Data Distribution Management (DDM) is a core part of the High Level Architecture (HLA) standard: its goal is to optimize the resources used by simulation environments to exchange data. DDM has to filter and match the set of information generated during a simulation so that each federate (i.e., each simulation entity) receives only the information it needs. It is important that this is done quickly and well, both to obtain better performance and to avoid transmitting irrelevant data that would otherwise quickly saturate network resources. The main topic of this thesis is the implementation of a super partes DDM testbed. The testbed evaluates the quality of DDM approaches of all kinds: it supports both region-based and grid-based approaches, and it can accommodate other methods yet to be devised. It ranks them using three factors: execution time, memory, and distance from the optimal solution. A prearranged set of instances is already available, but we also allow the creation of instances with user-provided parameters. The thesis is structured as follows. We start by introducing what DDM and HLA are and what they do in detail. The first chapter then describes the state of the art, providing an overview of the best-known resolution approaches and the pseudocode of the most interesting ones. The third chapter describes how the testbed we implemented is structured. In the fourth chapter we present and compare the results obtained from the execution of the four approaches we implemented. The result of the work described in this thesis can be downloaded from SourceForge at the following link: https://sourceforge.net/projects/ddmtestbed/. It is licensed under the GNU General Public License version 3.0 (GPLv3).
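
As background to the matching problem the testbed benchmarks, here is a minimal sketch of region-based DDM matching under the common HLA formulation (regions as axis-aligned boxes, one interval per dimension); the names are illustrative, not the testbed's API:

```python
# A minimal sketch of region-based DDM matching: an update region matches a
# subscription region iff their extents overlap in every dimension.
from typing import List, Tuple

Region = List[Tuple[float, float]]  # one (lower, upper) extent per dimension

def overlaps(update: Region, subscription: Region) -> bool:
    return all(u_lo < s_hi and s_lo < u_hi
               for (u_lo, u_hi), (s_lo, s_hi) in zip(update, subscription))

def brute_force_match(updates: List[Region], subs: List[Region]) -> List[Tuple[int, int]]:
    """The O(n*m) baseline a testbed would compare smarter approaches against."""
    return [(i, j) for i, u in enumerate(updates)
                   for j, s in enumerate(subs) if overlaps(u, s)]

updates = [[(0.0, 2.0), (0.0, 2.0)]]
subs = [[(1.0, 3.0), (1.0, 3.0)], [(5.0, 6.0), (0.0, 1.0)]]
print(brute_force_match(updates, subs))  # [(0, 0)]
```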

Relevance: 30.00%

Abstract:

The new generation of multicore processors opens new perspectives for the design of embedded systems. Multiprocessing, however, poses new challenges to the scheduling of real-time applications, in which ever-increasing computational demands are constantly accompanied by the need to meet critical time constraints. Many research works have contributed to this field by introducing new advanced scheduling algorithms. However, although many of these works have solidly demonstrated their effectiveness, the actual support for multiprocessor real-time scheduling offered by current operating systems is still very limited. This dissertation deals with implementation aspects of real-time schedulers in modern embedded multiprocessor systems. The first contribution is an open-source scheduling framework, which is capable of realizing complex multiprocessor scheduling policies, such as G-EDF, on conventional operating systems, exploiting only their native scheduler from user space. A set of experimental evaluations compares the proposed solution to other research projects that pursue the same goals by means of kernel modifications, showing comparable scheduling performance. The principles that underpin the operation of the framework, originally designed for symmetric multiprocessors, have been extended first to asymmetric multiprocessors, which are subject to major restrictions such as the lack of support for task migrations, and later to re-programmable hardware architectures (FPGAs). In the latter case, this work introduces a scheduling accelerator, which offloads most of the scheduling operations to the hardware and exhibits extremely low scheduling jitter. The realization of a portable scheduling framework presented many interesting software challenges. One of these was timekeeping. In this regard, a further contribution is a novel data structure, called the addressable binary heap (ABH). The ABH, which is conceptually a pointer-based implementation of a binary heap, shows very good average- and worst-case performance when addressing the problem of tick-less timekeeping with high-resolution timers.
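
The dissertation does not reproduce the ABH's code, so the following is only a sketch of the "addressable" idea behind tick-less timekeeping: a min-heap of timer expirations whose insert returns a handle, so a timer can later be cancelled without a linear search. For brevity it uses an array-backed heap with handle indirection rather than the pointer-based layout the ABH actually employs:

```python
# Sketch of an addressable timer heap: insert() returns a handle and
# cancel(handle) removes that timer in O(log n), no search required.
class TimerHeap:
    def __init__(self):
        self._heap = []   # list of [expiry, handle]
        self._pos = {}    # handle -> index in _heap
        self._next = 0

    def _swap(self, i, j):
        h = self._heap
        h[i], h[j] = h[j], h[i]
        self._pos[h[i][1]], self._pos[h[j][1]] = i, j

    def _sift_up(self, i):
        while i > 0 and self._heap[i][0] < self._heap[(i - 1) // 2][0]:
            self._swap(i, (i - 1) // 2)
            i = (i - 1) // 2

    def _sift_down(self, i):
        n = len(self._heap)
        while True:
            c = min((k for k in (2 * i + 1, 2 * i + 2) if k < n),
                    key=lambda k: self._heap[k][0], default=None)
            if c is None or self._heap[i][0] <= self._heap[c][0]:
                return
            self._swap(i, c)
            i = c

    def insert(self, expiry):
        handle, self._next = self._next, self._next + 1
        self._heap.append([expiry, handle])
        self._pos[handle] = len(self._heap) - 1
        self._sift_up(len(self._heap) - 1)
        return handle

    def cancel(self, handle):
        i = self._pos.pop(handle)
        last = self._heap.pop()
        if i < len(self._heap):
            self._heap[i] = last
            self._pos[last[1]] = i
            self._sift_down(i)
            self._sift_up(i)

    def next_expiry(self):
        return self._heap[0][0] if self._heap else None

timers = TimerHeap()
h1 = timers.insert(105.0)
timers.insert(42.0)
timers.cancel(h1)            # direct removal via the handle
print(timers.next_expiry())  # 42.0 -> program the one-shot hardware timer
```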

Relevance: 30.00%

Abstract:

A two-dimensional model for analyzing the distribution of magnetic fields in the airgap of PM electrical machines is studied. A numerical algorithm for non-linear magnetic analysis of multiphase surface-mounted PM machines with semi-closed slots is developed, based on the equivalent magnetic circuit method. By using a modular geometry, whose basic element can be duplicated, the model can represent any winding distribution topology. Compared with a finite element analysis (FEA), it reduces computing time and lets the user change parameter values directly in a user interface, without re-designing the model. The output torque and the radial forces acting on the moving part of the machine can be calculated. In addition, an analytical model for radial force calculation in multiphase bearingless Surface-Mounted Permanent Magnet Synchronous Motors (SPMSM) is presented. It predicts the amplitude and direction of the force as functions of the torque current, the levitation current and the rotor position. It is based on the space vector method, which also permits analysis of the machine during transients. The calculations are carried out by developing the analytical functions in Fourier series, taking all possible interactions between stator and rotor mmf harmonic components into account; since the model is parametrized, the effects of the machine's electrical and geometrical quantities can be analyzed. The model is used in the design of a control system for bearingless machines, as an accurate electromagnetic model integrated in a three-dimensional mechanical model, in which one end of the motor shaft is constrained, to simulate the presence of a mechanical bearing, while the other end is free, supported only by the radial forces developed by the interacting magnetic fields, thus realizing a bearingless system with three degrees of freedom. The complete model represents the design of the experimental system to be realized in the laboratory.
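
The abstract does not state its force equations; as standard background (not necessarily the thesis's exact derivation), radial force calculations of this kind typically start from the Maxwell stress tensor integrated over a circle of radius r in the airgap:

```latex
% Standard Maxwell stress formulation for the net force on the rotor,
% evaluated on a circle of radius r in the airgap: B_r and B_t are the
% radial and tangential flux density components, L the active length,
% \mu_0 the vacuum permeability.
\begin{align}
  F_x &= \frac{L r}{2\mu_0} \int_0^{2\pi}
         \left[ \left(B_r^2 - B_t^2\right)\cos\theta
         - 2 B_r B_t \sin\theta \right] d\theta, \\
  F_y &= \frac{L r}{2\mu_0} \int_0^{2\pi}
         \left[ \left(B_r^2 - B_t^2\right)\sin\theta
         + 2 B_r B_t \cos\theta \right] d\theta.
\end{align}
```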

Relevance: 30.00%

Abstract:

The relatively young discipline of astronautics represents one of the most scientifically fascinating and technologically advanced achievements of our time. Human exploration of space not only offers extraordinary research possibilities but also places high demands on man and technology. The space environment provides many attractive experimental tools for understanding fundamental mechanisms in the natural sciences. It has been shown that reduced gravity and elevated radiation in particular, two distinctive factors in space, significantly influence the behavior of biological systems. For this reason, one of the key objectives on board an Earth-orbiting laboratory is research in the field of life sciences, covering the broad range from botany, human physiology and crew health up to biotechnology. The Columbus Module is the only European low-gravity platform that allows researchers to perform ambitious experiments over a continuous time frame of up to several months. Biolab is part of the initial outfitting of the Columbus Laboratory; it is a multi-user facility supporting research in the field of biology, e.g. the effect of microgravity and space radiation on cell cultures, micro-organisms, small plants and small invertebrates. The Biolab IEC projects are designed to work in the automatic part of Biolab. At present, the TO-53 department of Airbus Defence & Space (formerly Astrium) has two experiments in phase C/D of development, and they are the subject of this thesis: CELLRAD and CYTOSKELETON. They will be launched in soft configuration, i.e. packed inside a block of foam whose task is to reduce the launch loads on the payload. Until about ten years ago, payloads launched in soft configuration were assumed to be structurally safe in themselves, and a specific structural analysis could be waived; with the opening of the launcher market to private companies (which are not under the direct control of the international space agencies), the verification requirements on payloads have changed and become much more conservative. In 2012 a new random vibration environment was introduced by the new Space-X launch specification, which turns out to be particularly challenging for soft-launched payloads. The latest ESA specification requires structural analysis of the payload under combined loads (random vibration, quasi-steady acceleration and pressure). The aim of this thesis is to create FEM models able to reproduce the launch configuration, to verify that all margins of safety are positive, and to show how they change because of the new Space-X random environment. Where results are negative, improved design solutions are implemented. Based on the FEM results, a study of the joints has been carried out and, where needed, a crack growth analysis has been performed.
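
A minimal sketch of the margin-of-safety bookkeeping such a combined-loads analysis performs; the load-combination rule and all numbers below are assumptions for illustration, not values from the thesis:

```python
# Random vibration is represented here by its 3-sigma equivalent quasi-static
# response, combined (simple sum, an assumed rule) with the steady
# acceleration and pressure stresses at one FEM location.
def margin_of_safety(allowable, applied, factor_of_safety):
    """MS = allowable / (FS * applied) - 1; the analysis must show MS >= 0."""
    return allowable / (factor_of_safety * applied) - 1.0

# Assumed example stresses (MPa):
sigma_quasi_static = 40.0   # quasi-steady acceleration
sigma_random_3sig  = 55.0   # 3-sigma random vibration response
sigma_pressure     = 10.0   # pressure differential

sigma_combined = sigma_quasi_static + sigma_random_3sig + sigma_pressure
ms = margin_of_safety(allowable=270.0, applied=sigma_combined, factor_of_safety=2.0)
print(f"combined stress = {sigma_combined} MPa, MS = {ms:.2f}")  # MS >= 0 -> pass
```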

Relevance: 30.00%

Abstract:

Atmospheric aerosol particles affect people and the environment in many ways. Precise characterization of the particles helps us to understand their effects and to assess the consequences. Particles can be characterized by their size, their shape and their chemical composition. Laser ablation mass spectrometry makes it possible to determine the size and the chemical composition of individual aerosol particles. In this work, the SPLAT (Single Particle Laser Ablation Time-of-flight mass spectrometer) was further developed for improved analysis of atmospheric aerosol particles in particular. The aerosol inlet was optimized to transfer the widest possible particle size range (80 nm - 3 µm) into the SPLAT and to focus the particles into a narrow beam. A new description of the relationship between particle size and particle velocity in vacuum was found. Alignment of the inlet was automated using stepper motors. The optical detection of the particles was improved so that particles smaller than 100 nm can be detected. Building on the optical detection and the automatic tilting of the inlet, a new method for characterizing the particle beam was developed. The control electronics of the SPLAT were improved so that the maximum analysis frequency is limited only by the ablation laser, which can ablate at a maximum rate of about 10 Hz. Optimization of the vacuum system reduced the ion loss in the mass spectrometer by a factor of 4.

Besides these hardware developments of the SPLAT, a large part of this work consisted of the design and implementation of a software solution for analyzing the raw data acquired with the SPLAT. CRISP (Concise Retrieval of Information from Single Particles) is a software package based on IGOR PRO (Wavemetrics, USA) that allows efficient evaluation of the single-particle raw data. CRISP contains a newly developed algorithm for automatic mass calibration of each individual mass spectrum, including the suppression of noise and of problems with signals that exhibit intense tailing. CRISP provides methods for automatic classification of the particles; implemented are k-means, fuzzy-c-means and a form of hierarchical clustering based on a minimum spanning tree. CRISP offers the option of pre-processing the data so that the automatic classification of the particles runs faster and the results are of higher quality. In addition, CRISP can easily sort particles according to user-defined criteria. The data structures and infrastructure underlying CRISP were created with maintenance and extensibility in mind.

In the course of this work the SPLAT was successfully deployed in several campaigns, and the capabilities of CRISP were demonstrated on the datasets obtained.

The SPLAT can now be operated efficiently in the field to characterize the atmospheric aerosol, while CRISP enables fast and targeted evaluation of the data.
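
CRISP itself is built on IGOR PRO, so the following is only an illustration of the classification step it implements: k-means clustering of single-particle mass spectra, here with scikit-learn and synthetic data standing in for real SPLAT spectra:

```python
# Illustration of k-means particle classification; the data here are random
# stand-ins, not SPLAT spectra, and the binning is assumed.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_particles, n_mz = 500, 300          # spectra digitized to 300 m/z bins (assumed)
spectra = rng.random((n_particles, n_mz))

# Normalize each spectrum to its total ion signal so clustering compares
# composition patterns rather than absolute intensities (a common choice).
spectra /= spectra.sum(axis=1, keepdims=True)

labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(spectra)
for k in range(5):
    print(f"class {k}: {np.sum(labels == k)} particles")
```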

Relevance: 30.00%

Abstract:

The following research thesis concerns a retrofit project in Copenhagen, Denmark, carried out on one of the buildings belonging to the Royal Danish Academy. The key assumption underlying the entire research process is that, up to now, the standard procedure in retrofit cases like this has been to use energy simulation software as the comparative method between the as-built state and the design. These programs generally divide the space into different thermal zones, assigning to each of them different occupancy levels, activities, set-point temperatures for cooling and heating analyses and so on, but they always work with average, constant values, usually taken at the middle point of each thermal zone. The project and its research path therefore stem from an attempt to investigate the potential of a different kind of retrofit design process, not antithetical but complementary to the classic energy-based retrofit, moving from the building scale, with all its thermal zones, to the users' scale of humans and microclimates. The main software used in this process is Autodesk Simulation CFD. The idea behind the project is that in certain situations it will not be necessary, for example, to add insulation layers throughout (previously parameterized and optimized with DesignBuilder), and that even in winter conditions thermal comfort could be achieved in areas with considerably lower mean radiant temperature (MRT), thanks perhaps to the users' activities, increased clothing levels (clo) and the heat produced by equipment. After the analysis of the existing state and its simulations, the project continued to be supported by the CFD software itself, in an iterative process aimed at achieving visible improvements in MRT for spaces with different needs and characteristics, in both winter and summer regimes.
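
To see why activity and clothing levels can compensate for a low MRT, a rough first-pass comfort index is the operative temperature, the simple average of air and mean radiant temperature at low air speeds; this sketch and its comfort band are my illustration, not the thesis's CFD workflow:

```python
# Operative temperature as a first-pass comfort index (assumed numbers).
def operative_temperature(t_air_c: float, t_mrt_c: float) -> float:
    """Valid as an approximation at low air speeds (< ~0.2 m/s)."""
    return (t_air_c + t_mrt_c) / 2.0

# Assumed example: a zone with low MRT but warmer air, checked against a
# comfort band that widens when occupants are more active or more clothed.
t_op = operative_temperature(t_air_c=23.0, t_mrt_c=17.0)
comfort_band = (19.0, 24.0)  # assumed band for ~1.2 met, ~1.0 clo occupants
print(f"operative T = {t_op:.1f} C, "
      f"comfortable: {comfort_band[0] <= t_op <= comfort_band[1]}")
```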

Relevance: 30.00%

Abstract:

Percutaneous needle intervention based on PET/CT images is effective, but exposes the patient to unnecessary radiation due to the increased number of CT scans required. Computer-assisted intervention can reduce the number of scans, but requires handling, matching and visualization of two different datasets: while one dataset is used for target definition according to metabolism, the other is used for instrument guidance according to anatomical structures. No navigation systems capable of handling such data and performing PET/CT image-based procedures while following clinically approved protocols for oncologic percutaneous interventions are available. The need for such systems is emphasized in scenarios where the target can be located in different types of tissue, such as bone and soft tissue. These two tissues require different clinical protocols for puncturing and may therefore give rise to different problems during the navigated intervention. Studies comparing the performance of navigated needle interventions targeting lesions located in these two types of tissue are seldom found in the literature. Hence, this paper presents an optical navigation system for percutaneous needle interventions based on PET/CT images. The system provides viewers for guiding the physician to the target with real-time visualization of PET/CT datasets, and is able to handle targets located in both bone and soft tissue. The navigation system and the required clinical workflow were designed taking into consideration clinical protocols and requirements, and the system is thus operable by a single person, even during the transition to the sterile phase. Both the system and the workflow were evaluated in an initial set of experiments simulating 41 lesions (23 located in bone tissue and 18 in soft tissue) in swine cadavers. We also measured and decomposed the overall system error into distinct error sources, which allowed for the identification of particularities involved in the process as well as highlighting the differences between bone and soft tissue punctures. Overall average errors of 4.23 mm and 3.07 mm for bone and soft tissue punctures, respectively, demonstrated the feasibility of using this system for such interventions. The proposed system workflow was shown to be effective in separating the preparation from the sterile phase, as well as in keeping the system manageable by a single operator. Among the distinct sources of error, the user error based on the system accuracy (defined as the distance from the planned target to the actual needle tip) appeared to be the most significant. Bone punctures showed higher user error, whereas soft tissue punctures showed higher tissue deformation error.
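
A small sketch of the error bookkeeping described above: the user error is the Euclidean distance from the planned target to the actual needle tip, averaged per tissue type (the coordinates below are invented, not the paper's data):

```python
# Per-puncture tip error and per-tissue averages (illustrative data only).
import numpy as np

def tip_error_mm(planned_target: np.ndarray, needle_tip: np.ndarray) -> float:
    return float(np.linalg.norm(planned_target - needle_tip))

punctures = [
    ("bone", np.array([12.0, 40.0, -8.0]), np.array([14.5, 42.0, -6.0])),
    ("soft", np.array([-5.0, 22.0, 30.0]), np.array([-4.0, 23.5, 32.0])),
]
by_tissue = {}
for tissue, target, tip in punctures:
    by_tissue.setdefault(tissue, []).append(tip_error_mm(target, tip))
for tissue, errors in by_tissue.items():
    print(f"{tissue}: mean error {np.mean(errors):.2f} mm over {len(errors)} punctures")
```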

Relevance: 30.00%

Abstract:

The Simulation Automation Framework for Experiments (SAFE) is a project created to raise the level of abstraction in network simulation tools and thereby address issues that undermine credibility. SAFE incorporates best practices in network simulation to automate the experimental process and to guide users in the development of sound scientific studies using the popular ns-3 network simulator. My contributions to the SAFE project are the design of two XML-based languages called NEDL (ns-3 Experiment Description Language) and NSTL (ns-3 Script Templating Language), which facilitate the description of experiments and of network simulation models, respectively. The languages provide a foundation for the construction of better interfaces between the user and the ns-3 simulator. They also provide input to a mechanism which automates the execution of network simulation experiments. Additionally, this thesis demonstrates that one can develop tools to generate ns-3 scripts in Python or C++ automatically from NSTL model descriptions.
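
NSTL's actual syntax is not given in the abstract, so the following toy shows only the general idea of script templating: an XML model description with named parameters expanded into a runnable ns-3 Python script. The element names, attributes and generated script are invented for illustration:

```python
# Toy template expansion: XML model parameters -> generated ns-3 script text.
import xml.etree.ElementTree as ET
from string import Template

model_xml = """
<model name="p2p-example">
  <param name="dataRate" value="5Mbps"/>
  <param name="delay" value="2ms"/>
</model>
"""

script_template = Template("""\
# generated ns-3 script for model '$name'
import ns.core, ns.network, ns.point_to_point
nodes = ns.network.NodeContainer(); nodes.Create(2)
p2p = ns.point_to_point.PointToPointHelper()
p2p.SetDeviceAttribute("DataRate", ns.core.StringValue("$dataRate"))
p2p.SetChannelAttribute("Delay", ns.core.StringValue("$delay"))
devices = p2p.Install(nodes)
ns.core.Simulator.Run(); ns.core.Simulator.Destroy()
""")

root = ET.fromstring(model_xml)
params = {p.get("name"): p.get("value") for p in root.findall("param")}
print(script_template.substitute(name=root.get("name"), **params))
```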

Relevance: 30.00%

Abstract:

Currently, observations of space debris are primarily performed with ground-based sensors. These sensors have a detection limit at a few centimetres diameter for objects in Low Earth Orbit (LEO) and at about two decimetres diameter for objects in Geostationary Orbit (GEO). The few space-based debris observations stem mainly from in-situ measurements and from the analysis of returned spacecraft surfaces, both of which provide information about mostly sub-millimetre-sized debris particles. As a consequence, the population of centimetre- and millimetre-sized debris objects remains poorly understood. The development, validation and improvement of debris reference models drive the need for measurements covering the whole diameter range. In 2003 the European Space Agency (ESA) initiated a study entitled “Space-Based Optical Observation of Space Debris”. The first tasks of the study were to define user requirements and to develop an observation strategy for a space-based instrument capable of observing uncatalogued millimetre-sized debris objects. Only passive optical observations were considered, focussing on mission concepts for the LEO and GEO regions, respectively. Starting from the requirements and the observation strategy, an instrument system architecture and an associated operations concept have been elaborated. The instrument system architecture covers the telescope, camera and onboard processing electronics. The proposed telescope is a folded Schmidt design, characterised by a 20 cm aperture and a large field of view of 6°. The camera design is based on the use of either a frame-transfer charge coupled device (CCD) or a cooled hybrid sensor with fast read-out; a four-megapixel sensor is foreseen. For the onboard processing, a scalable architecture has been selected. Performance simulations have been executed for the system as designed, focussing on the orbit determination of observed debris particles and on the analysis of the object detection algorithms. In this paper we present some of the main results of the study. A short overview of the user requirements and observation strategy is given, the architectural design of the instrument is discussed, the main trade-offs are outlined, and an insight into the results of the performance simulations is provided.
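
A quick plate-scale check implied by the instrument numbers quoted above (6° field of view on a four-megapixel sensor, assumed here to be 2048 × 2048), plus an assumed LEO crossing rate to show why detections appear as streaks:

```python
# Plate scale and field-crossing time from the quoted instrument geometry.
fov_deg = 6.0
pixels_across = 2048                       # assumed layout of the 4 Mpix sensor
arcsec_per_pixel = fov_deg * 3600.0 / pixels_across
print(f"plate scale ~ {arcsec_per_pixel:.1f} arcsec/pixel")   # ~10.5 arcsec

# An object crossing the field at 1 deg/s (a plausible LEO relative rate,
# assumed for illustration) traverses the full FOV in:
crossing_s = fov_deg / 1.0
print(f"FOV crossing time ~ {crossing_s:.1f} s -> objects appear as streaks")
```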

Relevance: 30.00%

Abstract:

This report provides an analysis of the thermal performance and emissions characteristics of improved biomass stoves constructed using earthen materials. Commonly referred to as mud stoves, this type of improved stove incorporates high-clay-content soil with an organic binder in the construction of its combustion chamber and body. When large quantities of the mud material are used to construct the stove body, the stove does not offer significant improvements in fuel economy or air quality relative to traditional open-fire cooking. This is partly because a significant amount of heat is absorbed by the mass of the stove, reducing combustion efficiency and heat transfer to the cook pot. An analysis of the thermal and mechanical properties of stove materials was also performed. A material mixture containing a one-to-one ratio by volume of high-clay-content soil and straw was found to have thermal properties comparable to the fired ceramics used in more advanced improved stove designs. Feedback from mud stove users in Mauritania and Mali, West Africa, was also collected during implementation. Suggestions for stove design improvements were developed based on this information and on the data collected in the performance, emissions and material properties analyses. Design suggestions include reducing stove height to accommodate user cooking preferences and limiting overall stove mass to reduce heat loss to the stove body.
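
To see why stove mass matters, a back-of-the-envelope Q = m·c·ΔT estimate of the heat soaked up by the stove body during a burn; all numbers are assumed for illustration, not measured values from the report:

```python
# Heat absorbed by the stove body, expressed as an equivalent mass of fuel wood.
def body_heat_loss_kj(mass_kg, specific_heat_kj_per_kg_k, delta_t_k):
    return mass_kg * specific_heat_kj_per_kg_k * delta_t_k

heavy_mud_stove = body_heat_loss_kj(mass_kg=25.0, specific_heat_kj_per_kg_k=0.9, delta_t_k=150.0)
light_ceramic   = body_heat_loss_kj(mass_kg=6.0,  specific_heat_kj_per_kg_k=0.9, delta_t_k=150.0)
wood_lhv_kj_per_kg = 16000.0  # assumed lower heating value of fuel wood

print(f"heavy stove absorbs ~{heavy_mud_stove:.0f} kJ "
      f"(~{heavy_mud_stove / wood_lhv_kj_per_kg:.2f} kg of wood)")
print(f"lighter stove absorbs ~{light_ceramic:.0f} kJ "
      f"(~{light_ceramic / wood_lhv_kj_per_kg:.2f} kg of wood)")
```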

Relevance: 30.00%

Abstract:

New designs of user input systems have resulted from developing technologies and specialized user demands. Conventional keyboard and mouse input devices still dominate in input speed, but other input mechanisms are demanded in special application scenarios. Touch screen and stylus input methods have been widely adopted by PDAs and smartphones, and reduced keypads are necessary for mobile phones. A newer design trend explores applications requiring single-handed, even eyes-free, input on small mobile devices. This calls for as few keys on the input device as possible to make it feasible to operate, but representing many characters with fewer keys can make the input ambiguous. Accelerometers embedded in mobile devices provide an opportunity to combine device movements with keys to disambiguate the input signal, and recent research has explored this design space for text input. In this dissertation an accelerometer-assisted single-key positioning input system is developed. It uses the input device's tilt directions as input signals and maps their sequences to output characters and functions. A generic positioning model is developed as a guideline for designing positioning input systems. A calculator prototype and a text input prototype on the 4+1 (5-position) and 8+1 (9-position) positioning input systems are implemented using accelerometer readings on a smartphone. Users operate the device with one physical key, and feedback is audible. Controlled experiments are conducted to evaluate the feasibility, learnability and design space of the accelerometer-assisted single-key positioning input system. This research can provide inspiration and innovative reference points for researchers and practitioners in positioning user input design, in applications of accelerometer readings, and in the development of new standard machine-readable sign languages.
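
A sketch of the core mapping the dissertation describes: classify a tilt sample into one of eight directions plus a neutral position (the 8+1 layout), then look up sequences in a character table. The thresholds and the table itself are invented for illustration:

```python
# Tilt-direction classification and sequence lookup (illustrative values).
import math

def classify_tilt(ax: float, ay: float, dead_zone: float = 0.25) -> int:
    """Return 0 for neutral, else 1..8 for the 8 compass-style tilt sectors."""
    if math.hypot(ax, ay) < dead_zone:          # device held level
        return 0
    angle = math.degrees(math.atan2(ay, ax)) % 360.0
    return int(((angle + 22.5) % 360.0) // 45.0) + 1

# Hypothetical two-position sequences -> characters (not the thesis's table):
sequence_table = {(1, 0): "a", (2, 0): "b", (1, 2): "c"}

sequence = (classify_tilt(0.9, 0.1), classify_tilt(0.0, 0.0))
print(sequence_table.get(sequence, "?"))   # the key press commits the sequence
```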

Relevance: 30.00%

Abstract:

With energy demands and costs growing every day, the need for improving energy efficiency in electrical devices has become very important. Research into various methods of improving efficiency for all electrical components will be key to meeting future energy needs. This report documents the design, construction and testing of a research-quality electric machine dynamometer and test bed. This test cell system can be used for research in several areas, including electric drive systems, electric vehicle propulsion systems, power electronic converters, and load/source elements in an AC microgrid, among many others. The test cell design criteria and decisions will be discussed with reference to user functionality and flexibility. The individual power components will be discussed in detail as to how they relate to the project, highlighting any features used in the operation of the test cell. A project timeline will be discussed, clearly stating the work done by the different individuals involved in the project. In addition, the system will be parameterized, and benchmark data will be used to demonstrate the functional operation of the system.

Relevance: 30.00%

Abstract:

With the development and growing capabilities of Smart Home systems, people today are entering an era in which household appliances are no longer just controlled by people but can also be operated by a smart system. This results in a more efficient, convenient, comfortable and environmentally friendly living environment. A critical part of the Smart Home system is Home Automation, which means that a Micro-Controller Unit (MCU) controls all the household appliances and schedules their operating times. This reduces electricity bills by shifting power consumption from on-peak hours to off-peak hours, according to the different hourly prices. In this paper, we propose an algorithm for scheduling multi-user power consumption and implement it on an FPGA board, which serves as the MCU. This algorithm for scheduling tasks with discrete power levels is based on dynamic programming and can find a scheduling solution close to the optimal one. We chose an FPGA as our system's controller because FPGAs have low complexity, parallel processing capability and a large number of I/O interfaces for further development, and are programmable in both software and hardware. In conclusion, the algorithm runs quickly on the FPGA board, and the solutions obtained are good enough for consumers.
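
The paper's exact dynamic-programming formulation is not given, so the following is a much-simplified stand-in that keeps the core idea: choose each appliance's cheapest feasible window over the hourly price vector, subject to a shared per-hour power cap. All numbers are illustrative:

```python
# Simplified cost-minimizing scheduler over time-of-use prices (assumed data).
hour_price = [0.10, 0.08, 0.07, 0.07, 0.09, 0.15, 0.22, 0.25,
              0.20, 0.14, 0.12, 0.11]          # $/kWh over a 12-hour horizon
cap_kw = 3.0
load_kw = [0.0] * len(hour_price)              # power already committed per hour

def best_start(duration_h: int, power_kw: float):
    """Cheapest feasible contiguous window for one task (O(H*d) scan)."""
    best = None
    for start in range(len(hour_price) - duration_h + 1):
        hours = range(start, start + duration_h)
        if any(load_kw[h] + power_kw > cap_kw for h in hours):
            continue                            # would exceed the hourly cap
        cost = sum(hour_price[h] * power_kw for h in hours)
        if best is None or cost < best[1]:
            best = (start, cost)
    return best

# Schedule two appliances in turn (greedy across tasks, optimal per task):
for name, dur, kw in [("dishwasher", 2, 1.8), ("dryer", 3, 2.5)]:
    start, cost = best_start(dur, kw)
    for h in range(start, start + dur):
        load_kw[h] += kw
    print(f"{name}: start hour {start}, cost ${cost:.2f}")
```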

Relevance: 30.00%

Abstract:

The widespread availability of low-cost embedded electronics makes it easier to implement smart devices that can understand either the environment or user behavior. The main objective of this project is to design and implement home-use portable smart electronics: a portable monitoring device for home and office security, and a portable 3D mouse for convenient use. Both devices use the MPU6050, which contains a 3-axis accelerometer and a 3-axis gyroscope, to sense the inertial motion of a door or of the user's hand movements. In the portable monitoring device, the MPU6050 senses movement of the door (either a home front door or a cabinet door) through the gyroscope, and a Raspberry Pi processes the data it receives from the MPU6050; if a value exceeds the preset threshold, the Raspberry Pi directs a USB webcam to take a picture and then sends an alert email with the picture to the user. The advantages of this device are that it is a small, portable, stand-alone unit with its own power source, easy to implement, very cheap for residential use, and energy efficient, with instantaneous alerts. For the 3D mouse, the MPU6050 uses both the accelerometer and the gyroscope to sense the user's hand movements; the data are processed by an MSP430G2553 through a digital smoothing filter and a complementary filter, and the filtered data are then passed to the personal computer through a serial COM port. By applying a cursor movement equation in the PC driver, the device works well as a mouse with acceptable accuracy. Compared with the common optical mouse, this mouse does not need a working surface; with the smoothing and complementary filters it achieves sufficient accuracy for normal use, and it can easily be extended to a portable mouse as small as a finger ring.
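
A minimal sketch of the complementary filter mentioned above, in its generic textbook form (the coefficient and sample rate are assumptions, not the project's MSP430 values): the integrated gyro rate tracks fast motion while the accelerometer tilt angle corrects slow drift:

```python
# Generic complementary filter fusing gyro rate and accelerometer tilt.
import math

ALPHA = 0.98        # weight on the integrated gyro estimate (assumed)
DT = 0.01           # 100 Hz sampling period (assumed)

def complementary_filter(angle_deg, gyro_dps, ax, ay, az):
    accel_angle = math.degrees(math.atan2(ay, az))   # tilt about the x axis
    return ALPHA * (angle_deg + gyro_dps * DT) + (1.0 - ALPHA) * accel_angle

angle = 0.0
# Feed 100 identical samples (a steady tilt) to show the estimate settling:
for gyro_dps, (ax, ay, az) in [(12.0, (0.0, 0.05, 0.99))] * 100:
    angle = complementary_filter(angle, gyro_dps, ax, ay, az)
print(f"estimated tilt ~ {angle:.1f} deg")
```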