908 results for Computer aided design tool


Relevance: 100.00%

Publisher:

Abstract:

The inherent analogue nature of medical ultrasound signals, the abundant merits of digital image acquisition, and the increasing use of relatively simple front-end circuitry have created considerable demand for single-bit Σ−Δ beamformers in digital ultrasound imaging systems. Furthermore, the increasing need to design lightweight ultrasound systems with low power consumption and low noise provides ample justification for development and innovation in the use of single-bit Σ−Δ beamformers in ultrasound imaging systems. The overall aim of this research program is to investigate, establish, develop and confirm, through a combination of theoretical analysis and detailed simulations that utilize raw phantom data sets, suitable techniques for the design of simple-to-implement, hardware-efficient Σ−Δ digital ultrasound beamformers that address the requirements of 3D scanners with large channel counts, as well as portable and lightweight ultrasound scanners for point-of-care applications and intravascular imaging systems. In addition, the stability boundaries of higher-order High-Pass (HP) and Band-Pass (BP) Σ−Δ modulators for single- and dual-sinusoidal inputs are determined using quasi-linear modeling together with the describing-function method, to model the Σ−Δ modulator quantizer more accurately. The theoretical results are shown to be in good agreement with the simulation results for a variety of input amplitudes, bandwidths, and modulator orders. The proposed mathematical models of the quantizer will greatly help speed up the design of higher-order HP and BP Σ−Δ modulators for digital ultrasound beamformers. Finally, a user-friendly design and performance evaluation tool for LP, BP and HP Σ−Δ modulators is developed. This toolbox, which uses various design methodologies and covers an assortment of Σ−Δ modulator topologies, is intended to accelerate the design and evaluation of Σ−Δ modulators. The design tool is further developed to enable the design, analysis and evaluation of Σ−Δ beamformer structures, including noise analysis of the final B-scan images. Thus, the tool allows researchers and practitioners to design and verify different reconstruction filters and analyse the results directly on the B-scan ultrasound images, saving considerable time and effort.
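As a toy illustration of the kind of modulator the evaluation toolbox targets, the following sketch simulates a first-order, single-bit low-pass Σ−Δ modulator on an oversampled sinusoid; the sample rate, tone frequency and oversampling ratio are illustrative assumptions, not parameters from this work.

```python
import numpy as np

def first_order_sigma_delta(x):
    """Single-bit, first-order low-pass Sigma-Delta modulator (discrete-time sketch)."""
    y = np.zeros_like(x)          # single-bit output stream (+1 / -1)
    integrator = 0.0
    for n, sample in enumerate(x):
        integrator += sample - (y[n - 1] if n > 0 else 0.0)  # accumulate input minus fed-back output
        y[n] = 1.0 if integrator >= 0 else -1.0              # single-bit quantizer
    return y

# Oversampled test tone (illustrative values only)
fs, osr = 40e6, 64                         # sample rate and oversampling ratio (assumed)
f0 = fs / (2 * osr * 8)                    # tone well inside the signal band
t = np.arange(4096) / fs
x = 0.5 * np.sin(2 * np.pi * f0 * t)       # keep amplitude inside the stable input range
bitstream = first_order_sigma_delta(x)

# A low-pass reconstruction filter (here a simple moving average) recovers the tone
recovered = np.convolve(bitstream, np.ones(osr) / osr, mode="same")
```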

Relevance: 100.00%

Publisher:

Abstract:

A large class of computational problems is characterised by frequent synchronisation and computational requirements that change as a function of time. When such a problem is solved on a message-passing multiprocessor machine [5], the combination of these characteristics leads to system performance that deteriorates over time. As the communication performance of parallel hardware steadily improves, load balance becomes a dominant factor in obtaining high parallel efficiency. Performance can be improved by periodic redistribution of the computational load; however, redistribution can sometimes be very costly. We study the issue of deciding when to invoke a global load re-balancing mechanism. Such a decision policy must actively weigh the costs of remapping against the performance benefits, and should be general enough to apply automatically to a wide range of computations. This paper discusses a generic strategy for Dynamic Load Balancing (DLB) in unstructured mesh computational mechanics applications. The strategy is intended to handle varying levels of load change throughout the run. The major issues involved in a generic dynamic load balancing scheme are investigated, together with techniques to automate the implementation of a dynamic load balancing mechanism within the Computer Aided Parallelisation Tools (CAPTools) environment, a semi-automatic tool for the parallelisation of mesh-based FORTRAN codes.
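A minimal sketch of the remapping decision idea described above, under the assumption that the policy compares the synchronisation time lost to imbalance against an estimated one-off remapping cost amortised over a horizon of future steps; it is not the CAPTools policy itself.

```python
def should_rebalance(step_times, remap_cost, horizon):
    """Decide whether to invoke global load re-balancing.

    step_times: per-processor times for the last computational step
    remap_cost: estimated one-off cost of redistributing the mesh
    horizon:    number of future steps over which the gain is amortised
    """
    t_max = max(step_times)                  # synchronisation makes the slowest processor dominate
    t_avg = sum(step_times) / len(step_times)
    imbalance_loss_per_step = t_max - t_avg  # time wasted waiting at each synchronisation point
    # Remap only if the projected saving over the horizon outweighs the remapping cost
    return imbalance_loss_per_step * horizon > remap_cost

# Example: four processors, one badly overloaded
print(should_rebalance([1.0, 1.1, 0.9, 2.4], remap_cost=5.0, horizon=20))  # True
```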

Relevance: 100.00%

Publisher:

Abstract:

The usefulness of computers in schools and training has been undisputed for some years. There is currently disagreement, however, about which tasks computers can take on independently. When the transfer of teaching functions to computer-based tutoring systems is evaluated, shortcomings frequently have to be noted. Starting from current practical implementations of computer-based tutoring systems, the aim of this work is to identify different classes of central teaching competencies (student modelling, domain knowledge, and instructional activities in the narrower sense). Within each class, the global capabilities of the tutoring systems and the necessary, complementary activities of human tutors are determined. The resulting classification scheme allows typical tutoring systems to be categorised and also identifies specific competencies that should receive greater attention in future teacher and trainer education. (DIPF/Orig.)

Relevance: 100.00%

Publisher:

Abstract:

Melanoma is a type of skin cancer caused by the uncontrolled growth of atypical melanocytes. In recent decades, computer-aided diagnosis has been used to support medical professionals; however, there is still no globally accepted tool. In this context, and in line with the state of the art, we propose a system that receives a dermatoscopy image and indicates whether the lesion is benign or malignant. The tool is composed of the following modules: preprocessing, segmentation, feature extraction, and classification. Preprocessing removes hairs, segmentation isolates the lesion, feature extraction follows the ABCD dermoscopy rule, and classification is performed by a Support Vector Machine. Experimental evidence indicates that the proposal achieves 90.63 % accuracy, 95 % sensitivity, and 83.33 % specificity on a data set of 104 dermatoscopy images. These results are favourable when compared with traditional diagnostic performance in dermatology.
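A minimal scikit-learn sketch of the final classification stage, assuming the ABCD-style features have already been extracted into a numeric matrix; the feature values and labels below are random placeholders, not the 104-image data set used in the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder feature matrix: one row per lesion with ABCD-style descriptors
X = np.random.rand(104, 4)               # e.g. [asymmetry, border irregularity, colour variety, diameter]
y = np.random.randint(0, 2, size=104)    # 0 = benign, 1 = malignant (real labels would come from histopathology)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))  # scale features, then fit the SVM
clf.fit(X_train, y_train)
print("accuracy:", clf.score(X_test, y_test))
```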

Relevance: 100.00%

Publisher:

Abstract:

The performance, energy-efficiency and cost improvements due to traditional technology scaling have begun to slow down and present diminishing returns. Underlying reasons for this trend include fundamental physical limits of transistor scaling, the growing significance of quantum effects as transistors shrink, and a growing mismatch between transistors and interconnects regarding size, speed and power. Continued Moore's Law scaling will not come from technology scaling alone; it must involve improvements to design tools and the development of new disruptive technologies such as 3D integration. 3D integration offers potential improvements to interconnect power and delay by translating the routing problem into a third dimension, and it facilitates transistor density scaling independent of technology node. Furthermore, 3D IC technology opens up a new architectural design space of heterogeneously integrated, high-bandwidth CPUs. Vertical integration promises to provide the CPU architectures of the future by integrating high-performance processors with on-chip high-bandwidth memory systems and highly connected network-on-chip structures. Such techniques can overcome the well-known CPU performance bottlenecks referred to as the memory and communication walls.

However, the promising improvements to performance and energy efficiency offered by 3D CPUs do not come without cost, both in the financial investment needed to develop the technology and in the increased complexity of design. Two main limitations of 3D IC technology have been heat removal and TSV reliability. Transistor stacking increases power density, current density and thermal resistance in air-cooled packages. Furthermore, the technology introduces vertical through-silicon vias (TSVs) that create new points of failure in the chip and require the development of new BEOL technologies. Although these issues can be controlled to some extent using thermal- and reliability-aware physical and architectural 3D design techniques, high-performance embedded cooling schemes, such as micro-fluidic (MF) cooling, are fundamentally necessary to unlock the true potential of 3D ICs.

A new paradigm is being put forth which integrates the computational, electrical, physical, thermal and reliability views of a system. The unification of these diverse aspects of integrated circuits is called Co-Design. Independent design and optimization of each aspect leads to sub-optimal designs due to a lack of understanding of cross-domain interactions and their impacts on the feasibility region of the architectural design space. Co-Design enables optimization across layers with a multi-domain view and thus unlocks new high-performance and energy-efficient configurations. Although the co-design paradigm is becoming increasingly necessary in all fields of IC design, it is even more critical in 3D ICs where, as we show, the inter-layer coupling and higher degree of connectivity between components exacerbate the interdependence between architectural parameters, physical design parameters and the multitude of metrics of interest to the designer (i.e., power, performance, temperature and reliability). In this dissertation we present a framework for multi-domain co-simulation and co-optimization of 3D CPU architectures with both air and MF cooling solutions. Finally, we propose an approach to design space exploration and modeling within the new Co-Design paradigm, and discuss possible avenues for future improvement of this work.
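A deliberately simplified sketch of the kind of multi-domain sweep the Co-Design paradigm implies: each architectural configuration is scored jointly on performance, power and temperature, and a thermal feasibility constraint prunes the design space. The knobs, models and constants are illustrative assumptions, not the dissertation's framework.

```python
from itertools import product

# Illustrative architectural knobs for a 3D-stacked CPU
layers        = [1, 2, 4]            # number of stacked tiers
cores_per_die = [2, 4, 8]
mf_cooling    = [False, True]        # micro-fluidic cooling enabled?

def evaluate(n_layers, n_cores, mf):
    """Toy cross-domain models; real co-simulation would replace each expression."""
    perf  = n_cores * (1 + 0.4 * (n_layers - 1))   # stacking adds memory bandwidth (sub-linear gain)
    power = n_cores * 1.5 + n_layers * 2.0          # core power plus per-tier overhead
    r_th  = 0.5 if mf else 1.5 * n_layers           # stacking raises thermal resistance unless MF cooling is used
    temp  = 45 + power * r_th                       # crude junction-temperature estimate
    return perf, power, temp

feasible = []
for nl, nc, mf in product(layers, cores_per_die, mf_cooling):
    perf, power, temp = evaluate(nl, nc, mf)
    if temp <= 95:                                  # thermal feasibility couples all domains
        feasible.append((perf / power, (nl, nc, mf)))

print(max(feasible))                                # most energy-efficient feasible configuration
```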

Relevance: 100.00%

Publisher:

Abstract:

In the undergraduate engineering program at Griffith University in Australia, the unit 1006ENG Design and Professional Skills aims to provide an introduction to engineering design and professional practice through a project-based learning (PBL) approach to problem solving. It gives students an experience of PBL in the first year of their programme. The unit comprises an underpinning lecture series, design work including group project activities, an individual computer-aided drawing exercise(s) and an oral presentation. Griffith University employs a ‘Student Experience of Course’ (SEC) online survey as part of its student evaluation of teaching, quality improvement and staff performance management processes. As well as numerical response-scale items, it includes the following two questions inviting open-ended text responses from students: i) What did you find particularly good about this course? and ii) How could this course be improved? The collection of textual data in student surveys is commonplace because of the rich descriptions of respondent experiences it can provide at relatively low cost. Historically, however, these data have been underutilised because they are time-consuming to analyse manually and automated tools to exploit them efficiently have been lacking. Text analytics approaches offer analysis methods that produce visual representations of comment data, highlighting the key themes in these data and the relationships between those themes. We present a text analytics-based evaluation of the SEC open-ended comments received in the first two years of offer of the PBL unit 1006ENG, and discuss the results in detail. The method developed and documented here is a practical and useful approach to analysing and visualising open-ended comment data that could be applied by others with similar comment data sets.
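A small sketch of one possible text-analytics step, assuming a bag-of-words treatment of the comments with scikit-learn; the example comments are invented and this is not the specific tooling used in the study.

```python
from sklearn.feature_extraction.text import CountVectorizer

comments = [                      # invented examples of open-ended survey responses
    "the group project was a great way to learn design",
    "more feedback on the group project would improve the course",
    "lectures were useful but the CAD exercise needs clearer instructions",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(comments)               # document-term matrix
terms = vec.get_feature_names_out()

# Term frequencies highlight candidate themes
freqs = X.sum(axis=0).A1
print(sorted(zip(freqs, terms), reverse=True)[:5])

# Term co-occurrence counts support visualising relationships between themes
cooccurrence = (X.T @ X).toarray()
```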

Relevance: 100.00%

Publisher:

Abstract:

Interface design is one of the main research areas in human-computer interaction (HCI). In computer science, many HCI researchers and designers explore novel interface designs with cutting-edge technology, but few investigate alternative interfaces for existing built environments, especially in the area of architecture. In this paper, we investigate alternative interface designs for existing architectural elements, such as walls, floors, and ceilings, that can be created with off-the-shelf materials. Instead of merely serving as discrete sensing and display devices integrated into an existing building’s surface, these liquid and thin materials act as interventions that can be ‘painted’ on a surface, transforming it into an architectural interface. This interface, Painterface, is a responsive material intervention that serves as an analogue, wall-type media interface that senses and responds to people’s actions. Painterface is equipped with three sensing and responsive capacities: touch, sound, and light. While the interface’s touch capacity performs tactile sensing, its sound-production and illumination capacities emit notes and light respectively. The outcomes of this research suggest the possibility of a simple, inexpensive, replaceable, and even disposable interface that could serve as an architectural intervention applicable to existing building surfaces.
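A schematic sketch, assumed rather than taken from the paper, of the sense-and-respond loop such a painted interface implies: a touch reading above a threshold triggers a note and a light level.

```python
def read_touch_sensor(sample):
    """Placeholder for reading a touch value from the painted surface (e.g. a microcontroller ADC)."""
    return sample

def play_note(pitch):       # placeholder sound output
    print(f"note: {pitch}")

def set_light(level):       # placeholder illumination output
    print(f"light: {level:.2f}")

THRESHOLD = 0.5
for raw in [0.1, 0.7, 0.9, 0.3]:           # simulated touch readings
    reading = read_touch_sensor(raw)
    if reading > THRESHOLD:                # touch detected on the painted surface
        play_note(pitch=60 + int(reading * 12))  # map touch strength to a pitch...
        set_light(level=reading)                 # ...and to a light intensity
```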

Relevance: 100.00%

Publisher:

Abstract:

By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as “networks on wheels”, can greatly enhance traffic safety, traffic efficiency and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and the uneven distribution of vehicular nodes, pose critical efficiency and reliability challenges for their implementation. This dissertation is motivated by the great application potential of VANETs for efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this research targets enhanced traffic safety and traffic efficiency, as well as novel commercial applications, along four lines: 1) accurate and efficient message aggregation to detect on-road safety-relevant events; 2) reliable data dissemination to notify remote vehicles; 3) efficient and reliable spatial data collection from vehicular sensors; and 4) novel applications that exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the road, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The relative position based message dissemination (RPB-MD) scheme is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone of relevance under varying traffic density. Because VANETs make large volumes of vehicular sensor data available, the compressive sampling based data collection (CS-DC) scheme is proposed to efficiently collect spatially relevant data at large scale, especially in dense traffic. In addition, with novel and efficient solutions to the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of these applications, this dissertation helps push VANETs further towards large-scale deployment.
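A toy sketch of the idea behind in-network aggregation of safety reports, assuming nearby observations of the same event are fused into one summary message before further dissemination; it is not the SLMA protocol itself and all values are invented.

```python
from statistics import mean

# Reports of a suspected hazard received from neighbouring vehicles: (x, y, severity)
reports = [(120.0, 34.5, 0.8), (121.5, 35.0, 0.7), (119.0, 34.0, 0.9)]

def aggregate(reports):
    """Fuse reports that refer to the same on-road event into one summary message."""
    xs, ys, sev = zip(*reports)
    centre = (mean(xs), mean(ys))
    # A single aggregated message replaces many raw ones, reducing channel load
    return {"position": centre, "severity": mean(sev), "witnesses": len(reports)}

print(aggregate(reports))
```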

Relevance: 100.00%

Publisher:

Abstract:

In the presented thesis work, the meshfree method with distance fields was coupled with the lattice Boltzmann method to obtain solutions of fluid-structure interaction problems. The work involved the development and implementation of numerical algorithms, data structures and software. The numerical and computational properties of the coupling algorithm combining the meshfree method with distance fields and the lattice Boltzmann method were investigated, and the convergence and accuracy of the methodology were validated against analytical solutions. The research focused on fluid-structure interaction solutions in complex, mesh-resistant domains, as both the lattice Boltzmann method and the meshfree method with distance fields are particularly adept in these situations. Furthermore, the fluid solution provided by the lattice Boltzmann method is massively scalable, allowing extensive use of cutting-edge parallel computing resources to accelerate this phase of the solution process. The meshfree method with distance fields allows exact satisfaction of boundary conditions, making it possible to capture exactly the effects of the fluid field on the solid structure.
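A minimal sketch of the fluid side only: one collide-and-stream step of a single-relaxation-time (BGK) D2Q9 lattice Boltzmann scheme on a periodic domain, without the meshfree coupling or any boundary treatment; grid size and relaxation time are illustrative assumptions.

```python
import numpy as np

# D2Q9 lattice: velocity set and weights
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9] * 4 + [1/36] * 4)
tau = 0.8                                      # relaxation time (illustrative)

nx, ny = 64, 64
f = np.ones((9, nx, ny)) * w[:, None, None]    # uniform density, zero velocity to start

def lbm_step(f):
    """One collide-and-stream step of the BGK scheme on a periodic grid."""
    rho = f.sum(axis=0)                                    # macroscopic density
    u = np.einsum("iab,ix->xab", f, c) / rho               # macroscopic velocity (2, nx, ny)
    cu = np.einsum("ix,xab->iab", c, u)                    # c_i . u for each direction
    usq = (u ** 2).sum(axis=0)
    feq = w[:, None, None] * rho * (1 + 3 * cu + 4.5 * cu ** 2 - 1.5 * usq)  # equilibrium
    f = f - (f - feq) / tau                                # BGK collision
    for i in range(9):                                     # streaming along each lattice velocity
        f[i] = np.roll(f[i], shift=tuple(c[i]), axis=(0, 1))
    return f

for _ in range(100):
    f = lbm_step(f)
```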

Relevance: 100.00%

Publisher:

Abstract:

In decentralised rural electrification through solar home systems, private companies and promoting institutions face the problem of deploying maintenance structures to operate and guarantee the service of the solar systems over long periods (ten years or more). The problems linked to decentralisation, such as the dispersion of dwellings, difficult access and maintenance needs, make this an arduous task. This paper proposes an innovative design tool created ad hoc for photovoltaic rural electrification, based on a real photovoltaic rural electrification programme in Morocco as a case study. The tool is developed from a mathematical model comprising a set of decision variables (location, transport, etc.) that must satisfy certain constraints and whose optimisation criterion is the minimum cost of the operation and maintenance activity for an established quality of service. The main output of the model is the overall cost of the maintenance structure. The best locations for the local maintenance headquarters and warehouses in a given region are established, as are the number of maintenance technicians and vehicles required.
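A toy brute-force sketch of the underlying facility-location idea (choose maintenance sites so that fixed plus transport costs are minimised); the coordinates, costs and simplifications are assumptions, not the paper's model.

```python
from itertools import combinations
import math

villages = [(2, 3), (8, 1), (5, 7), (9, 9), (1, 8)]         # dwellings/villages to be serviced (x, y)
candidates = [(3, 3), (6, 5), (8, 8)]                        # candidate maintenance-warehouse sites
fixed_cost = {(3, 3): 10.0, (6, 5): 12.0, (8, 8): 9.0}       # annual cost of running each site
transport_cost_per_km = 0.5

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

best = None
for k in range(1, len(candidates) + 1):
    for sites in combinations(candidates, k):
        # Each village is served by its nearest open warehouse
        travel = sum(min(dist(v, s) for s in sites) for v in villages)
        cost = sum(fixed_cost[s] for s in sites) + transport_cost_per_km * travel
        if best is None or cost < best[0]:
            best = (cost, sites)

print(best)   # cheapest combination of maintenance sites and its total cost
```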

Relevance: 100.00%

Publisher:

Abstract:

This paper discusses challenges facing the developers of a national Life Cycle Inventory (LCI) database on which to base the assessment of building environmental impacts, a key input to a fully integrated eco-design tool created for automated eco-efficiency assessment of commercial building designs directly from 3D CAD. The scope of this database includes Australian and overseas processing burdens involved in acquiring, processing, transporting, fabricating, finishing and using metals, masonry, timber, glazing, ceramics, plastics, fittings, composites and coatings. Burdens are classified, calculated and reported for all flows of raw materials, fuels, energy and emissions to and from air, soil and water associated with typical products and services in building construction, fitout and operation. The aggregated life cycle inventory data provide the capacity to generate environmental impact assessment reports based on accepted performance indicators. Practitioners can identify hot spots showing high environmental burdens in a proposed design and drill down to report on specific building components. They can compare assessments with case studies and operational estimates to assist in the eco-efficient design of a building, its fitout and its operation.
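A small sketch of how aggregated LCI coefficients could be combined with quantities taken off a 3D CAD model to report embodied impacts and hot spots; the materials, factors and quantities are placeholder values, not data from the database described.

```python
# Placeholder cradle-to-gate inventory factors (kg CO2-e per unit of material)
lci_factors = {"concrete_m3": 320.0, "steel_kg": 2.1, "glass_m2": 25.0}

# Quantities taken off the 3D CAD model for a proposed design
bill_of_quantities = {"concrete_m3": 450, "steel_kg": 32_000, "glass_m2": 600}

# Aggregate embodied impact and rank materials by contribution to find hot spots
contributions = {m: q * lci_factors[m] for m, q in bill_of_quantities.items()}
total = sum(contributions.values())

for material, impact in sorted(contributions.items(), key=lambda kv: -kv[1]):
    print(f"{material:12s} {impact:>10.0f} kg CO2-e  ({impact / total:.0%})")
```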

Relevance: 100.00%

Publisher:

Abstract:

Understanding the differences between the temporal and physical aspects of the building life cycle is an essential ingredient in the development of Building Environmental Assessment (BEA) tools. This paper illustrates a theoretical Life Cycle Assessment (LCA) framework aligning temporal decision-making with material flows over the building development phases. It was derived during the development of a prototype commercial building design tool based on a 3D CAD information and communications technology (ICT) platform and LCA software. The framework aligns stakeholder BEA needs and the decision-making process with the characteristics of leading green building tools. The paper explores the related integration of BEA tool development applications on such ICT platforms. Key framework modules are depicted and practical examples for BEA are provided for:

• Definition of investment and service goals at project initiation;
• Design integrated to avoid overlaps/confusion over the project life cycle;
• Detailing the supply chain considering building life cycle impacts;
• Delivery of quality metrics for occupancy post-construction/handover;
• Deconstruction profiling at end of life to facilitate recovery.

Relevance: 100.00%

Publisher:

Abstract:

Short-termism among firms, the tendency to excessively discount long-term benefits in favour of less valuable short-term benefits, has been a prominent issue in business and public policy debates, but research to date has been inconclusive. We study how managers frame, interpret and resolve problems of intertemporal choice in actual decisions by using computer-aided text analysis to measure the frequency of top-team temporal references in 1653 listed Australian firms between 1992 and 2005. Contrary to short-termism arguments, we find evidence of a significant general increase in Future orientation and a significant decrease in Current/Past orientation. We also show that top teams' temporal orientation is related to their strategic orientation, specifically the extent to which they focus on Innovation-Expansion and Capacity Building.
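A minimal illustration of dictionary-based counting of temporal references of the kind computer-aided text analysis enables; the word lists and sample sentence are invented and do not reflect the study's coding scheme.

```python
import re

temporal_lexicon = {
    "future":       {"will", "plan", "expect", "forecast", "next year"},
    "current_past": {"was", "were", "last year", "previously", "currently"},
}

def temporal_frequencies(text):
    """Count references per temporal category, normalised by document length."""
    words = re.findall(r"[a-z']+", text.lower())
    joined = " ".join(words)
    counts = {cat: sum(joined.count(term) for term in terms)   # simple substring matching
              for cat, terms in temporal_lexicon.items()}
    return {cat: c / len(words) for cat, c in counts.items()}

report = "We expect revenue to grow next year and plan further expansion; last year was difficult."
print(temporal_frequencies(report))
```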

Relevance: 100.00%

Publisher:

Abstract:

The Studio GameOn event was supported by the Institute of the Creative Industries and Innovation and the Faculty of IT as part of the State Library of Queensland GAME ON exhibition (ex Barbican, UK). The studio produced a full game in six weeks. It was a curated event, a live web-based exhibition and a performance for the public, and the team produced a digital/creative work which is available for download. The studio enabled a team of students to experience the pressures of a real game studio within the space of the precincts while being very much in the public eye. It was a physical test of the University's "for the real world" mantra. Studio statement: Studio GameOn is an opportunity running alongside the GAME ON exhibition at the State Library of Queensland. The exhibition itself is open to the public from November 17th through to February 15th, and the studio runs from January 5th to February 13th 2009. The Studio GameOn challenge? To put together a team of game developers and make a playable game in six weeks! The Studio GameOn team consists of a group of game developers in training: the team members are all students who are either halfway through or completing a qualification in game design and all its elements; we have designers, artists, programmers and production team members. We are also fortunate to have an Industry Board consisting of local Queensland games professionals: John Passfield (Red Sprite Studios), Adrian Cook (Wildfire Studios), and Duncan Curtis and Marko Grgic (The 3 Blokes). We also invite the public to play with us: there is an ideas box on-site at the State Library and a number of ways to communicate with us on this studio website.