810 results for design based research


Relevance: 50.00%

Abstract:

In recent years, mobile technology has been one of the major growth areas in computing. Designing the user interface for mobile applications, however, is a very complex undertaking which is made even more challenging by the rapid technological developments in mobile hardware. Mobile human-computer interaction, unlike desktop-based interaction, must be cognizant of a variety of complex contextual factors affecting both users and technology. The Handbook of Research on User Interface Design and Evaluation provides students, researchers, educators, and practitioners with a compendium of research on the key issues surrounding the design and evaluation of mobile user interfaces, such as the physical environment and social context in which a mobile device is being used and the impact of multitasking behavior typically exhibited by mobile-device users. Compiling the expertise of over 150 leading experts from 26 countries, this exemplary reference tool will make an indispensable addition to every library collection.

Relevance: 50.00%

Abstract:

This article characterizes key weaknesses in the ability of current digital libraries to support scholarly inquiry, and as a way to address these, proposes computational services grounded in semiformal models of the naturalistic argumentation commonly found in research literatures. It is argued that a design priority is to balance formal expressiveness with usability, making it critical to coevolve the modeling scheme with appropriate user interfaces for argument construction and analysis. We specify the requirements for an argument modeling scheme for use by untrained researchers and describe the resulting ontology, contrasting it with other domain modeling and semantic web approaches, before discussing passive and intelligent user interfaces designed to support analysts in the construction, navigation, and analysis of scholarly argument structures in a Web-based environment. © 2007 Wiley Periodicals, Inc. Int J Int Syst 22: 17–47, 2007.
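
A minimal sketch of what such a semiformal argument model might look like in code is given below. The node and link types here (claim, evidence, supports, challenges) are illustrative assumptions rather than the specific ontology described in the article, which is not reproduced in this abstract.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class NodeType(Enum):
    CLAIM = "claim"          # an assertion made in a paper
    EVIDENCE = "evidence"    # data or results offered in support


class LinkType(Enum):
    SUPPORTS = "supports"        # positive rhetorical move
    CHALLENGES = "challenges"    # negative rhetorical move


@dataclass
class Node:
    node_id: str
    node_type: NodeType
    text: str                # free-text summary written by the analyst


@dataclass
class Link:
    source: str              # node_id of the originating node
    target: str              # node_id of the node being supported/challenged
    link_type: LinkType


@dataclass
class ArgumentGraph:
    nodes: List[Node] = field(default_factory=list)
    links: List[Link] = field(default_factory=list)

    def challenges_to(self, node_id: str) -> List[Link]:
        """Return all links that challenge a given node (a simple analysis query)."""
        return [link for link in self.links
                if link.target == node_id and link.link_type == LinkType.CHALLENGES]
```

A scheme of roughly this shape keeps the formal vocabulary small, which is the usability-versus-expressiveness balance the article argues for.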

Relevance: 50.00%

Abstract:

Software applications created on top of the service-oriented architecture (SOA) are increasingly popular, but testing them remains a challenge. In this paper a framework named TASSA for testing the functional and non-functional behaviour of service-based applications is presented. The paper focuses on the concept of design-time testing, the corresponding testing approach, and the architectural integration of the constituent TASSA tools. The individual TASSA tools have already been presented, together with sample validation scenarios and a general view of how they relate. This paper's contribution is the structured testing approach, based on the integrated use of the tools and their architectural integration. The framework is based on SOA principles and is composable depending on user requirements.
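
TASSA's actual tool interfaces are not described in this abstract, so the sketch below only illustrates the general idea of design-time testing of a service composition: the partner service is replaced by a stub with an injected delay, and the test checks that the composition's timeout handling (a non-functional property) behaves as intended. All names are hypothetical.

```python
import time
import unittest


def call_with_timeout(service, timeout_s=1.0):
    """Call a (possibly slow) service and fall back to a default answer if it was too slow.

    Stands in for the error-handling logic of a service composition under test;
    in a real SOA setting the call would go over the network.
    """
    start = time.monotonic()
    result = service()
    if time.monotonic() - start > timeout_s:
        return {"status": "fallback"}      # non-functional requirement: degrade gracefully
    return {"status": "ok", "payload": result}


def slow_service_stub():
    """Stub of a partner service with an injected delay (design-time fault injection)."""
    time.sleep(1.5)
    return 42


class DesignTimeTest(unittest.TestCase):
    def test_timeout_fallback(self):
        # The composition should fall back when the partner service is too slow.
        self.assertEqual(call_with_timeout(slow_service_stub)["status"], "fallback")

    def test_normal_path(self):
        # With a fast stub the functional result should be returned.
        self.assertEqual(call_with_timeout(lambda: 7)["payload"], 7)


if __name__ == "__main__":
    unittest.main()
```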

Relevance: 50.00%

Abstract:

Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
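
The abstract does not give the mathematical details, but the general idea of integrating model-based variability analysis with assembly measurement data can be illustrated as follows: a Monte Carlo tolerance stack-up is run once with nominal design assumptions and once with distributions re-fitted to (hypothetical) measured components, and the predicted probability of meeting a key assembly criterion is compared. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def stackup_gap(part_a, part_b, housing):
    """Key assembly criterion: clearance gap left after two parts are placed in a housing (mm)."""
    return housing - (part_a + part_b)

def monte_carlo(mean_a, sd_a, mean_b, sd_b, mean_h, sd_h, n=100_000):
    a = rng.normal(mean_a, sd_a, n)
    b = rng.normal(mean_b, sd_b, n)
    h = rng.normal(mean_h, sd_h, n)
    gap = stackup_gap(a, b, h)
    # Criterion: gap must stay within 0.1-0.6 mm
    return np.mean((gap > 0.1) & (gap < 0.6))

# Design-time assumptions (nominal tolerances from the CAD model)
p_nominal = monte_carlo(10.0, 0.05, 15.0, 0.05, 25.35, 0.05)

# Hypothetical measurement data narrows the part variation but shifts the means slightly
p_measured = monte_carlo(10.02, 0.02, 15.01, 0.03, 25.35, 0.05)

print(f"P(criterion met), nominal model:   {p_nominal:.3f}")
print(f"P(criterion met), measured update: {p_measured:.3f}")
```

Feeding measured distributions back into the variability model is what reduces simulation uncertainty and allows the assembly criterion to be verified earlier than a purely physical verification stage would permit.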

Relevance: 50.00%

Abstract:

Purpose: To develop and validate a classification system for focal vitreomacular traction (VMT) with and without macular hole based on spectral domain optical coherence tomography (SD-OCT), intended to aid in decision-making and prognostication. Methods: A panel of retinal specialists convened to develop this system. A literature review followed by discussion on a wide range of cases formed the basis for the proposed classification. Key features on OCT were identified and analysed for their utility in clinical practice. A final classification was devised based on two sequential, independent validation exercises to improve interobserver variability. Results: This classification tool pertains to idiopathic focal VMT assessed by a horizontal line scan using SD-OCT. The system uses width (W), interface features (I), foveal shape (S), retinal pigment epithelial changes (P), elevation of vitreous attachment (E), and inner and outer retinal changes (R) to give the acronym WISPERR. Each category is scored hierarchically. Results from the second independent validation exercise indicated a high level of agreement between graders: intraclass correlation ranged from 0.84 to 0.99 for continuous variables and Fleiss' kappa values ranged from 0.76 to 0.95 for categorical variables. Conclusions: We present an OCT-based classification system for focal VMT that allows anatomical detail to be scrutinised and scored qualitatively and quantitatively using a simple, pragmatic algorithm, which may be of value in clinical practice as well as in future research studies.
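
A minimal sketch of how a WISPERR grade could be recorded as a structured data type is shown below. The value types and the example values are assumptions for illustration, since the abstract does not list the permissible scores for each category.

```python
from dataclasses import dataclass


@dataclass
class WISPERRGrade:
    """One focal VMT grading from a horizontal SD-OCT line scan (illustrative fields only)."""
    width_um: float          # W: width of vitreomacular attachment (continuous)
    interface: int           # I: interface features, hierarchical score
    shape: int               # S: foveal shape, hierarchical score
    rpe_changes: int         # P: retinal pigment epithelial changes
    elevation: int           # E: elevation of vitreous attachment
    inner_retina: int        # R: inner retinal changes
    outer_retina: int        # R: outer retinal changes

    def acronym(self) -> str:
        """Compact summary string in WISPERR order."""
        return (f"W{self.width_um:.0f} I{self.interface} S{self.shape} "
                f"P{self.rpe_changes} E{self.elevation} "
                f"R{self.inner_retina}/{self.outer_retina}")


# Example with hypothetical values:
print(WISPERRGrade(480, 1, 2, 0, 1, 1, 0).acronym())
```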

Relevance: 50.00%

Abstract:

This paper is the methodological summary of the working papers of the research workshop entitled Opportunities and Constraints of R&D Activity in the SME Sector. The researchers carried out exploratory, qualitative research, more precisely a case-study investigation based on qualitative in-depth interviews. They conducted in-depth interviews with the CEOs of 14 small and medium-sized enterprises in order to explore the opportunities and constraints in the adaptation and growth potential of SMEs, as well as in their innovation activities. The paper also briefly introduces the 14 cases analysed.

Relevance: 50.00%

Abstract:

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation’s highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. In addition, DWT was found to be able to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume was found to contribute the least. The results from this research provide useful insights on the design of AID for arterial street applications.
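
The evaluation measures named above can be made concrete with a short sketch. The incident and alarm times below are hypothetical, and the matching rule (an alarm within a fixed window after incident onset counts as a detection) and the per-alarm definition of FAR are simple conventions, not necessarily the ones used in the dissertation.

```python
def evaluate_aid(incidents, alarms, window_s=300):
    """Compute detection rate (DR), false alarm rate (FAR) and mean time to detect (MTTD).

    incidents: list of incident start times (seconds)
    alarms:    list of alarm times raised by the AID algorithm (seconds)
    window_s:  an alarm within this window after onset counts as detecting that incident

    FAR is computed here as the fraction of alarms that match no incident, a
    simplification of the per-interval definition common in the AID literature.
    """
    detections, times_to_detect, used_alarms = 0, [], set()
    for onset in incidents:
        hits = [a for a in alarms if onset <= a <= onset + window_s and a not in used_alarms]
        if hits:
            first = min(hits)
            used_alarms.add(first)
            detections += 1
            times_to_detect.append(first - onset)
    dr = detections / len(incidents) if incidents else 0.0
    far = len([a for a in alarms if a not in used_alarms]) / len(alarms) if alarms else 0.0
    mttd = sum(times_to_detect) / len(times_to_detect) if times_to_detect else float("nan")
    return dr, far, mttd


# Hypothetical example: three incidents, four alarms (one false)
dr, far, mttd = evaluate_aid(incidents=[600, 2400, 5100],
                             alarms=[655, 2471, 3300, 5180])
print(f"DR={dr:.0%}  FAR={far:.0%}  MTTD={mttd:.0f}s")
```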

Relevance: 50.00%

Abstract:

Modern System-on-a-Chip (SoC) systems have grown rapidly in processing power while keeping the size of the hardware circuit roughly constant. The number of transistors on a chip continues to increase, but current SoC designs may not be able to exploit the potential performance, especially with energy consumption and chip area becoming two major concerns. Traditional SoC designs usually separate software and hardware, so improving system performance is a complicated task for both software and hardware designers. The aim of this research is to develop a hardware acceleration workflow for software applications, so that system performance can be improved under constraints on energy consumption and on-chip resource costs. The characteristics of software applications are identified using profiling tools; hardware acceleration can yield significant performance improvements for computation-intensive or frequently repeated functions. The performance of SoC systems can then be improved by accelerating in hardware the elements that incur performance overheads. The concepts presented in this study can be readily applied to a variety of sophisticated software applications. The contributions of SoC-based hardware acceleration in the hardware-software co-design platform include the following: (1) Software profiling methods are applied to an H.264 Coder-Decoder (CODEC) core. The hotspot function of the target application is identified using critical attributes such as cycles per loop and loop iteration counts. (2) A hardware acceleration method based on a Field-Programmable Gate Array (FPGA) is used to resolve system bottlenecks and improve system performance. The identified hotspot function is converted to a hardware accelerator and mapped onto the hardware platform. Two types of hardware acceleration methods, a central bus design and a co-processor design, are implemented for comparison in the proposed architecture. (3) System characteristics such as performance, energy consumption, and resource costs are measured and analyzed, and the trade-off among these three factors is compared and balanced. Different hardware accelerators are implemented and evaluated based on system requirements. (4) A system verification platform is designed based on an Integrated Circuit (IC) workflow, with hardware optimization techniques used for higher performance and lower resource costs. Experimental results show that the proposed hardware acceleration workflow for software applications is an efficient technique: the system reaches a 2.8X performance improvement and saves 31.84% of energy consumption with the Bus-IP design, while the co-processor design achieves 7.9X performance and saves 75.85% of energy consumption.
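
The pay-off from accelerating only the profiled hotspot follows Amdahl's law. The sketch below shows the calculation with hypothetical numbers; the hotspot fraction, accelerator speedup, and offload overhead are not taken from the dissertation.

```python
def overall_speedup(hotspot_fraction, accel_speedup, offload_overhead=0.0):
    """Amdahl's-law estimate of whole-application speedup when only the hotspot
    is moved to a hardware accelerator.

    hotspot_fraction: share of original run time spent in the hotspot (0..1)
    accel_speedup:    speedup of the hotspot itself on the FPGA accelerator
    offload_overhead: extra time (as a fraction of original run time) spent on
                      bus transfers between the CPU and the accelerator
    """
    remaining = (1.0 - hotspot_fraction) + hotspot_fraction / accel_speedup + offload_overhead
    return 1.0 / remaining


# Hypothetical profile: 70% of run time in one kernel, 20x faster in hardware,
# 3% of run time added for bus transfers.
print(f"Estimated speedup: {overall_speedup(0.70, 20.0, 0.03):.2f}x")
```

The same relation explains why the co-processor design can outperform the bus-attached design: lowering the offload overhead term raises the achievable overall speedup even when the accelerator itself is unchanged.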

Relevance: 50.00%

Abstract:

DNA sequences that are rich in the guanine nucleic base can fold into higher-order structures called G-quadruplexes. These structures form when four guanine bases hydrogen-bond together in a planar arrangement called a guanine quartet; guanine quartets subsequently stack upon each other to form quadruplexes. G-quadruplexes are mainly localized in telomeres as well as in oncogene promoters. One unique and promising therapeutic approach against cancer involves targeting and stabilizing G-quadruplexes with small molecules, generally in order to suppress oncogene expression and telomerase enzyme activity; the latter has been found to contribute to "out-of-control" cell growth in ca. 80-85% of all cancer cells and primary tumours while being absent in normal somatic cells. In this work, we present efforts towards designing and synthesizing acridine-based macrocycles (Mh) and (Mb) with the purpose of providing potential G4 ligands that are suited for selective binding to G4 over duplex DNA and that stabilize G-quadruplex structures. Both ligands described in this study include an acridine core, which provides an aromatic surface capable of π-π interactions with the surface of G-quadruplexes. The successful synthesis of 4,5-diaminoacridine, an essential fragment of the macrocycles (Mh) and (Mb), is described in chapter 2. In order to investigate the synthetic method for macrocyclization, model compounds comprising almost half of the designed macrocycles were explored. As discussed in chapter 3, the synthesis of the model compound for (Mb) proved challenging. However, as a step towards the synthesis of (Mh), the synthesis of the hydrogen-containing model compound, which comprises almost half of the desired macrocycle, was achieved in our group and proved to be promising.

Relevance: 50.00%

Abstract:

The inherent analogue nature of medical ultrasound signals, in conjunction with the abundant merits of digital image acquisition and the increasing use of relatively simple front-end circuitries, has created considerable demand for single-bit Σ−Δ beamformers in digital ultrasound imaging systems. Furthermore, the increasing need to design lightweight ultrasound systems with low power consumption and low noise provides ample justification for development and innovation in the use of single-bit Σ−Δ beamformers in ultrasound imaging systems. The overall aim of this research program is to investigate, establish, develop, and confirm, through a combination of theoretical analysis and detailed simulations that utilize raw phantom data sets, suitable techniques for the design of simple-to-implement, hardware-efficient Σ−Δ digital ultrasound beamformers. These address the requirements of 3D scanners with large channel counts, as well as portable and lightweight ultrasound scanners for point-of-care applications and intravascular imaging systems. In addition, the stability boundaries of higher-order High-Pass (HP) and Band-Pass (BP) Σ−Δ modulators for single- and dual-sinusoidal inputs are determined using quasi-linear modeling together with the describing-function method, to more accurately model the Σ−Δ modulator quantizer. The theoretical results are shown to be in good agreement with the simulation results for a variety of input amplitudes, bandwidths, and modulator orders. The proposed mathematical models of the quantizer will greatly help speed up the design of higher-order HP and BP Σ−Δ modulators applicable to digital ultrasound beamformers. Finally, a user-friendly design and performance evaluation tool for LP, BP, and HP Σ−Δ modulators is developed. This toolbox, which uses various design methodologies and covers an assortment of Σ−Δ modulator topologies, is intended to accelerate the design process and evaluation of Σ−Δ modulators. The design tool is further developed to enable the design, analysis, and evaluation of Σ−Δ beamformer structures, including noise analyses of the final B-scan images. Thus, this tool will allow researchers and practitioners to design and verify different reconstruction filters and analyze the results directly on the B-scan ultrasound images, thereby saving considerable time and effort.
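
As background to the single-bit approach, a first-order low-pass Σ−Δ modulator can be simulated in a few lines. This is the textbook structure (an integrator and a 1-bit quantizer in a feedback loop), offered only as an illustrative sketch; it is not the higher-order high-pass or band-pass designs studied in this research.

```python
import numpy as np

def first_order_sigma_delta(x):
    """First-order low-pass Σ−Δ modulator: integrate the error between the input
    and the fed-back 1-bit output, then quantize to ±1."""
    integrator = 0.0
    feedback = 0.0
    bits = np.empty_like(x)
    for n, sample in enumerate(x):
        integrator += sample - feedback               # accumulate the quantization error
        feedback = 1.0 if integrator >= 0 else -1.0   # 1-bit quantizer
        bits[n] = feedback
    return bits

# Oversampled sine input (amplitude < 1 to keep the modulator stable)
fs, f0, n = 1_000_000, 5_000, 4096
t = np.arange(n) / fs
x = 0.5 * np.sin(2 * np.pi * f0 * t)
bits = first_order_sigma_delta(x)

# The 1-bit stream tracks the low-frequency input after filtering; a simple
# moving average acts as a crude decimation/reconstruction filter.
recovered = np.convolve(bits, np.ones(64) / 64, mode="same")
print("max reconstruction error:", np.max(np.abs(recovered - x)))
```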

Relevance: 50.00%

Abstract:

This paper describes the development and evaluation of web-based museum trails for university-level design students to access on handheld devices in the Victoria and Albert Museum (V&A) in London. The trails offered students a range of ways of exploring the museum environment and collections, some encouraging students to interpret objects and museum spaces in lateral and imaginative ways, others more straightforwardly providing context and extra information. In a three-stage qualitative evaluation programme, student feedback showed that overall the trails enhanced students' knowledge of, interest in, and closeness to the objects. However, the trails were only partially successful from a technological standpoint due to device and network problems. Broader findings suggest that technology has a key role to play in helping to maintain the museum as a learning space which complements that of universities as well as schools. This research informed my other work in visitor-constructed learning trails in museums, specifically in the theoretical approach to data analysis, in the research design, and in shaping ways to structure visitor experiences in museums. It resulted in a conference presentation, and more broadly informed my subsequent teaching practice.

Relevance: 50.00%

Abstract:

Performance-based design (PBD), in a deterministic approach, characterizes performance objectives with respect to the desired performance levels. Performance objectives are then associated with the damage state and the established level of seismic hazard. Despite this rational approach, its application remains difficult. Reliable tools are therefore needed to capture the evolution, distribution, and quantification of damage. In addition, all phenomena related to nonlinearity (materials and deformations) must also be taken into account. This research shows how damage mechanics can help address this problem through an adaptation of the modified compression field theory and other complementary theories. The proposed formulation, adapted for monotonic, cyclic, and pushover loading, accounts for nonlinear shear effects coupled with flexural and axial load mechanisms. It is applied in particular to the nonlinear analysis of concrete structural elements subjected to non-negligible shear effects. This new approach, implemented in EfiCoS (a finite element program based on damage mechanics), is also presented here, together with the modelling criteria. The new approach was calibrated by comparing predictions with experimental data for reinforced concrete shear walls as well as for bridge girders and piers where shear effects must be taken into account. The improved version of EfiCoS proved capable of accurately evaluating parameters associated with global performance, such as displacements, system strength, cyclic response effects, and the quantification, evolution, and distribution of damage. Remarkable results were also obtained for the detection of engineering limit states such as cracking, strains, cover spalling, core crushing, local yielding of reinforcing bars, and system degradation, among others. As a practical tool for applying PBD, relationships between predicted damage indices and performance levels were obtained and expressed in the form of charts and tables. These charts were developed as a function of drift and displacement ductility. A specific table was developed to relate engineering limit states, damage, drift, and the traditional performance levels. The results showed excellent agreement with the experimental data, making the proposed formulation and the new version of EfiCoS powerful tools for applying the PBD methodology in a deterministic approach.
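
For readers unfamiliar with damage mechanics, the classical isotropic relation below conveys the basic idea of a damage variable degrading stiffness. It is the generic textbook form, not the specific coupled shear-flexure formulation implemented in EfiCoS.

```latex
% Generic scalar (isotropic) damage law: stress through degraded stiffness
\[
  \sigma = (1 - d)\,E\,\varepsilon, \qquad 0 \le d \le 1,
\]
% d = 0 corresponds to undamaged material with full stiffness E;
% d -> 1 corresponds to complete loss of load-carrying capacity.
```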

Relevance: 50.00%

Abstract:

Creative ways of utilising renewable energy sources for electricity generation, especially in remote areas and particularly in countries depending on imported energy, while increasing energy security and reducing the cost of such isolated off-grid systems, are becoming an urgent necessity for the effective strategic planning of energy systems. The aim of this research project was to design and implement a new decision support framework for the optimal design of hybrid micro grids considering different types of technologies, where the design objective is to minimize the total cost of the hybrid micro grid while satisfying the required electric demand. A comprehensive review of existing analytical decision support tools and of the literature on HPS identified the gaps and the necessary conceptual parts of an analytical decision support framework. As a result, this research proposes and reports an Iterative Analytical Design Framework (IADF) and its implementation for the optimal design of an Off-grid renewable energy based hybrid smart micro-grid (OGREH-SμG) with intra- and inter-grid (μG2μG & μG2G) synchronization capabilities and a novel storage technique. The modelling, design, and simulations were conducted using the HOMER Energy and MATLAB/Simulink energy planning and design software platforms. The design, experimental proof of concept, verification, and simulation of a new storage concept incorporating a Hydrogen Peroxide (H2O2) fuel cell are also reported. The implementation of the smart components, consisting of a Raspberry Pi devised and programmed for the semi-smart energy management framework (a novel control strategy including synchronization capabilities) of the OGREH-SμG, is also detailed and reported. The hybrid μG was designed and implemented as a case study for the Bayir/Jordan area. This research provides an alternative decision support tool for renewable energy integration that determines the optimal number, type, and size of components to configure the hybrid μG. In addition, this research formulates and reports a linear cost function to mathematically verify the computer-based simulations and fine-tune the solutions in the iterative framework, and concludes that such solutions converge to a correct optimal approximation when the properties of the problem are considered. The investigation demonstrates that the implemented OGREH-SμG design, which incorporates wind and solar powered generation complemented with batteries, two fuel cell units, and a diesel generator, and which is able to synchronize with other micro-grids, is an effective and optimal way of utilizing indigenous renewable energy to electrify developing countries with fewer resources in a sustainable way, with minimal impact on the environment while also achieving reductions in GHG emissions. The dissertation concludes with suggested extensions to this work.
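
The linear cost function mentioned above is not spelled out in the abstract. The sketch below only shows the general shape of such a formulation: sizing three hypothetical generation components at minimum capital cost subject to meeting a peak demand, using scipy's linear programming solver. The costs, availability factors, site limits, and the single demand constraint are illustrative assumptions, not values from the dissertation.

```python
from scipy.optimize import linprog

# Decision variables: installed capacity (kW) of PV, wind, and diesel
cost_per_kw = [900.0, 1200.0, 400.0]          # hypothetical capital cost coefficients

peak_demand_kw = 150.0

# Capacity adequacy: derated PV + wind + diesel capacity must cover peak demand.
availability = [0.25, 0.35, 0.95]             # assumed average availability of each source
A_ub = [[-availability[0], -availability[1], -availability[2]]]
b_ub = [-peak_demand_kw]

# Site limits (hypothetical): at most 300 kW PV, 200 kW wind, 100 kW diesel
bounds = [(0, 300), (0, 200), (0, 100)]

res = linprog(c=cost_per_kw, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
pv, wind, diesel = res.x
print(f"PV {pv:.0f} kW, wind {wind:.0f} kW, diesel {diesel:.0f} kW, "
      f"capital cost ${res.fun:,.0f}")
```

A formulation of this shape is easy to check by hand against simulation output, which is the verification role the linear cost function plays in the iterative framework.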

Relevance: 50.00%

Abstract:

This dissertation presents a case study of collaborative research through design with Floracaching, a gamified mobile application for citizen science biodiversity data collection. One contribution of this study is the articulation of collaborative research through design (CRtD), an approach that blends cooperative design approaches with the research through design (RtD) methodology. Collaborative research through design is thus defined as an iterative process of cooperative design in which the collaborative vision of an ideal state is embedded in a design. Applying collaborative research through design to Floracaching illustrates how a number of cooperative techniques, especially contextual inquiry, prototyping, and focus groups, may be applied in a research through design setting. Four suggestions for collaborative research through design (recruit from a range of relevant backgrounds; take flexibility as a goal; enable independence and agency; and choose techniques that support agreement or consensus) are offered to help others who wish to experiment with this new approach. Applying collaborative research through design to Floracaching yielded a new prototype of the application, accompanied by design annotations in the form of framing constructs for designing to support mobile, place-based citizen science activities. The prototype and framing constructs, which may inform other designers of similar citizen science technologies, are a second contribution of this research.