953 results for Set Design


Relevance:

30.00%

Publisher:

Abstract:

To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States (U.S.) is pursuing several options to create biofuels from renewable woody biomass (hereafter referred to as "biomass"). Because of the distributed nature of biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization, and to tap unused forest residues, it is proposed to develop biofuel supply chain models based on optimization and simulation approaches. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass based on a set of evaluation criteria, such as accessibility to biomass, the railway/road transportation network, water bodies, and workforce availability. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, location, and size of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to simultaneously minimize the weighted sum of the delivered feedstock cost, energy consumption, and GHG emissions. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for the biofuel facility, to changes in influential parameters, such as biomass availability and transportation fuel price.

Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research. Location analysis is critical to the financial success of producing biofuel. The modeling of woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help to ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level.

Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass, because the biomass feedstock supply chain is similar, if not the same, for biorefineries, biomass-fired or co-fired power plants, and torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals such as Biomass and Bioenergy and Renewable Energy, and presentations at conferences such as the 2011 Industrial Engineering Research Conference. For example, part of the research work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A). There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
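Read compactly, the multi-criteria objective described above is a weighted sum; the notation below (decision vector x, feasible set X, and the weights) is an editorial stand-in, not the proposal's own:

```latex
\min_{x \in \mathcal{X}} \; w_c\, C(x) \;+\; w_e\, E(x) \;+\; w_g\, G(x)
```

where C(x) is the delivered feedstock cost, E(x) the energy consumption, and G(x) the GHG emissions of a candidate supply chain design x, with the weights w_c, w_e, w_g chosen both to express priorities and to reconcile the differing units of the three criteria.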

Relevance:

30.00%

Publisher:

Abstract:

Space Based Solar Power satellites use solar arrays to generate clean, green, and renewable electricity in space and transmit it to Earth via microwave, radio wave, or laser beams to corresponding receivers (ground stations). These are traditionally large structures orbiting Earth at geosynchronous altitude. This thesis introduces a new architecture for a Space Based Solar Power satellite constellation. The proposed concept reduces the high cost involved in the construction of the space satellite and in the multiple launches to geosynchronous altitude. The proposed concept is a constellation of Low Earth Orbit satellites that are smaller in size than the conventional system. For this application a Repeated Sun-Synchronous Track Circular Orbit (RSSTO) is considered. In these orbits, the spacecraft revisits the same locations on Earth periodically, every desired number of days, with the line of nodes of the spacecraft's orbit fixed relative to the Sun. A wide range of solutions is studied, and in this thesis a two-orbit constellation design is chosen and simulated. The number of satellites is chosen based on the electric power demands in a given set of global cities. The orbits of the satellites are designed such that their ground tracks visit a maximum number of ground stations during the revisit period. In the simulation, the locations of the ground stations are chosen close to big cities, in the USA and worldwide, so that the space power constellation beams power down directly to locations of high electric power demand. The J2 perturbations are included in the mathematical model used in orbit design. The coverage time of each spacecraft over a ground site and the gap time between two consecutive spacecraft visiting a ground site are simulated in order to evaluate the coverage continuity of the proposed solar power constellation. It has been observed from simulations that there are always periods in which a spacecraft does not communicate with any ground station. For this reason, it is suggested that each satellite in the constellation be equipped with power storage components so that it can store power for later transmission. This thesis presents a method for designing the solar power constellation orbits such that the number of ground stations visited during the given revisit period is maximized. This leads to maximizing the power transmission to ground stations.
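The sun-synchronous condition mentioned above fixes the orbit inclination once the altitude is chosen: the J2-induced nodal regression must match the Sun's apparent mean motion (about 0.9856 deg/day). A minimal sketch under standard Earth constants; the 700 km altitude is an arbitrary example, not a value from the thesis:

```python
import math

# Standard Earth constants (WGS-84 / EGM96 values)
MU = 398600.4418   # gravitational parameter [km^3/s^2]
RE = 6378.137      # equatorial radius [km]
J2 = 1.08263e-3    # second zonal harmonic (oblateness)

def sun_sync_inclination(altitude_km: float) -> float:
    """Inclination [deg] of a circular sun-synchronous orbit at a given altitude.

    The J2-induced node rate, -(3/2) J2 (RE/a)^2 n cos(i), is set equal to the
    Sun's mean motion along the ecliptic so the orbit plane keeps a fixed
    orientation relative to the Sun.
    """
    a = RE + altitude_km                              # semi-major axis (circular)
    n = math.sqrt(MU / a**3)                          # mean motion [rad/s]
    node_rate_ss = 2 * math.pi / (365.2422 * 86400)   # required node rate [rad/s]
    cos_i = -node_rate_ss / (1.5 * J2 * (RE / a)**2 * n)
    return math.degrees(math.acos(cos_i))

print(f"{sun_sync_inclination(700):.2f} deg")
```

For a circular 700 km orbit this gives roughly 98.2 deg, the familiar slightly retrograde inclination of sun-synchronous missions.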

Relevance:

30.00%

Publisher:

Abstract:

A range of societal issues has been caused by fossil fuel consumption in the transportation sector in the United States (U.S.), including health-related air pollution, climate change, dependence on imported oil, and other oil-related national security concerns. Biofuel production from various lignocellulosic biomass types, such as wood, forest residues, and agricultural residues, has the potential to replace a substantial portion of the total fossil fuel consumption. This research focuses on locating biofuel facilities and designing the biofuel supply chain to minimize the overall cost. For this purpose an integrated methodology was proposed by combining GIS technology with simulation and optimization modeling methods. As a precursor to simulation and optimization modeling, the GIS-based methodology was used to preselect potential facility locations for biofuel production from forest biomass by employing a series of decision factors; the resulting candidate sites served as inputs for the simulation and optimization models. Candidate locations were selected based on a set of evaluation criteria, including: county boundaries, the railroad transportation network, the state/federal road transportation network, water body (rivers, lakes, etc.) dispersion, city and village dispersion, population census data, biomass production, and no co-location with co-fired power plants. The simulation and optimization models were built around key supply activities, including biomass harvesting/forwarding, transportation, and storage. On-site storage was built to serve the spring breakup period, when road restrictions were in place and truck transportation on certain roads was limited. Both models were evaluated using multiple performance indicators, including cost (consisting of the delivered feedstock cost and inventory holding cost), energy consumption, and GHG emissions. The impacts of energy consumption and GHG emissions were expressed in monetary terms to keep them consistent with cost. Compared with the optimization model, the simulation model represents a more dynamic look at a 20-year operation by considering the impacts associated with building inventory at the biorefinery to address the limited availability of biomass feedstock during the spring breakup period. The number of trucks required per day was estimated and the inventory level was tracked year-round. Through the exchange of information across the different procedures (harvesting, transportation, and biomass feedstock processing), a smooth flow of biomass from harvesting areas to a biofuel facility was implemented. The optimization model was developed to address issues related to locating multiple biofuel facilities simultaneously. The size of each potential biofuel facility was set up with an upper bound of 50 MGY and a lower bound of 30 MGY. The optimization model is a static, Mathematical Programming Language (MPL)-based application which allows for sensitivity analysis by changing inputs to evaluate different scenarios. It was found that annual biofuel demand and biomass availability impact the optimal biofuel facility locations and sizes.
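The siting problem described above has the shape of a classical capacitated facility-location program. Below is a minimal sketch using the open-source PuLP modeler rather than the study's MPL application; every number (supplies, costs, the 0.09 MG/kt yield, the 80 MGY demand) is an invented placeholder, and only the 30-50 MGY size window comes from the abstract:

```python
# Minimal facility-location sketch (not the study's MPL model).
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

sites = ["A", "B", "C"]                    # GIS-preselected candidate locations
areas = ["h1", "h2", "h3", "h4"]           # harvesting areas
supply = {"h1": 400, "h2": 250, "h3": 350, "h4": 300}   # biomass [kt/yr], dummy
haul = {(i, j): 10 + 4 * ((ai + aj) % 5)   # transport cost [$/t], dummy values
        for ai, i in enumerate(areas) for aj, j in enumerate(sites)}
fixed = {"A": 5.0e6, "B": 4.5e6, "C": 6.0e6}            # annualized build cost [$]
YIELD = 0.09    # assumed conversion [MG of fuel per kt of biomass]
DEMAND = 80     # assumed total annual biofuel demand [MGY]

m = LpProblem("biofuel_siting", LpMinimize)
y = {j: LpVariable(f"open_{j}", cat=LpBinary) for j in sites}
x = {(i, j): LpVariable(f"x_{i}_{j}", lowBound=0) for i in areas for j in sites}

# Objective: annualized facility cost + biomass hauling cost
m += lpSum(fixed[j] * y[j] for j in sites) + \
     lpSum(haul[i, j] * 1000 * x[i, j] for i in areas for j in sites)

for i in areas:                            # cannot haul more than an area grows
    m += lpSum(x[i, j] for j in sites) <= supply[i]
for j in sites:                            # 30-50 MGY size window if site is open
    prod = YIELD * lpSum(x[i, j] for i in areas)
    m += prod >= 30 * y[j]
    m += prod <= 50 * y[j]
m += YIELD * lpSum(x.values()) >= DEMAND   # meet annual fuel demand

m.solve()
print("open sites:", [j for j in sites if y[j].value() == 1])
```

Sensitivity analysis in the abstract's sense then amounts to re-solving after perturbing the supply, demand, or hauling-cost inputs.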

Relevance:

30.00%

Publisher:

Abstract:

The demand for consumer goods in the developing world continues to rise as populations and economies grow. As designers, manufacturers, and consumers look for ways to address this growing demand, many are considering the possibilities of 3D printing. Due to 3D printing's flexibility and relative mobility, it is speculated that 3D printing could help to meet the growing demands of the developing world. While the merits and challenges of distributed manufacturing with 3D printing have been presented, little work has been done to determine the types of products that would be appropriate for such manufacturing. Inspired by the author's two years of Peace Corps service in Tanzania and the need for specialty equipment for various projects during that time, an in-depth literature search is undertaken to better understand and summarize the process and capabilities of 3D printing. Human-centered design considerations are developed that focus on the desirability, technical feasibility, and financial viability of using 3D printing within Tanzania. Beginning with what Tanzanian consumers desire, many concerns then arise regarding the feasibility of creating products of sufficient strength and quality for the demands of developing-world consumers. Only after these concerns are addressed can the viability of products be evaluated from an economic perspective. The larger impacts of a product beyond its use are vital in determining how it will affect the social, economic, and environmental well-being of a developing nation such as Tanzania. Thus, technology-specific criteria are necessary for assessing and quantifying the broader impacts that a 3D-printed product can have within its ecosystem, and appropriate criteria are developed for this purpose. Both sets of criteria are then demonstrated and tested by evaluating the desirability, feasibility, viability, and sustainability of printing a piece of equipment required for the author's Peace Corps service: a set of Vernier calipers. Required by science educators throughout the country, specialty equipment such as calipers initially appears to be an ideal candidate for 3D printing, though ultimately the printing of calipers is not recommended due to current limitations of the technology. By examining the specific challenges and opportunities of the products 3D printing can produce, it can be better determined what place 3D printing will have in manufacturing for the developing world. Furthermore, the considerations outlined in this paper could be adapted to other manufacturing technologies and regions of the world, as human-centered design and sustainability will be critical in determining how to supply the developing world with the consumer goods it demands.
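The financial-viability screen described above ultimately comes down to a per-part cost comparison. A toy sketch in which every figure (filament price, machine and labor rates, the imported-caliper price) is a hypothetical placeholder rather than a number from the thesis:

```python
# Hedged illustration of a per-part viability check; all numbers are assumptions.
def print_cost(mass_g, print_hours, filament_usd_per_kg=25.0,
               machine_usd_per_hour=0.75, labor_usd_per_hour=1.5):
    """Rough cost of one FDM print: material + machine time + oversight labor."""
    material = (mass_g / 1000.0) * filament_usd_per_kg
    machine = print_hours * machine_usd_per_hour
    labor = 0.25 * print_hours * labor_usd_per_hour  # ~15 min oversight per hour
    return material + machine + labor

caliper_cost = print_cost(mass_g=80, print_hours=6)
imported_price = 12.0   # assumed street price of imported plastic calipers [USD]
print(f"printed: ${caliper_cost:.2f} vs imported: ${imported_price:.2f}")
```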

Relevance:

30.00%

Publisher:

Abstract:

Continuous advancements in technology have led to increasingly comprehensive and distributed product development processes in the pursuit of improved products at reduced costs. Information associated with these products is ever-changing, and structured frameworks have become integral to managing such fluid information. Ontologies and the Semantic Web have emerged as key alternatives for capturing product knowledge in both a human-readable and computable manner. The primary focus of this research is to characterize the relationships formed within methodically developed distributed design knowledge frameworks, ultimately to provide pervasive real-time awareness in distributed design processes. Utilizing formal logics in the form of the Semantic Web's OWL and SWRL, causal relationships are expressed to guide and facilitate knowledge acquisition as well as to identify contradictions between knowledge in a knowledge base. To improve efficiency during both the development and operational phases of these "intelligent" frameworks, a semantic relatedness algorithm is designed specifically to identify and rank underlying relationships within product development processes. After reviewing several semantic relatedness measures, three techniques, including a novel meronomic technique, are combined to create AIERO, the Algorithm for Identifying Engineering Relationships in Ontologies. To determine its applicability and accuracy, AIERO was applied to three separate, independently developed ontologies. The results indicate that AIERO is capable of consistently returning relatedness values one would intuitively expect. To assess the effectiveness of AIERO in exposing underlying causal relationships across product development platforms, a case study involving the development of an industry-inspired printed circuit board (PCB) is presented. After instantiating the PCB knowledge base and developing an initial set of rules, FIDOE, the Framework for Intelligent Distributed Ontologies in Engineering, was employed to identify additional causal relationships through extensional relatedness measurements. In a concluding PCB redesign, the resulting "intelligent" framework demonstrates its ability to pass values between instances, identify inconsistencies amongst instantiated knowledge, and identify conflicting values within product development frameworks. The results highlight how the introduced semantic methods can enhance the current knowledge acquisition, knowledge management, and knowledge validation capabilities of traditional knowledge bases.
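AIERO's actual measures are not spelled out in the abstract, so the sketch below only illustrates the general recipe of blending complementary relatedness signals: here a taxonomic path score and a part-whole (meronomy) score over an invented miniature PCB ontology. AIERO itself combines three techniques; this toy blends two stand-ins:

```python
# Toy is-a and part-of graphs (hypothetical, for illustration only)
IS_A = {"resistor": "component", "capacitor": "component",
        "component": "artifact", "pcb": "artifact"}
PART_OF = {"resistor": "pcb", "capacitor": "pcb"}

def ancestors(c):
    chain = [c]
    while c in IS_A:
        c = IS_A[c]
        chain.append(c)
    return chain

def path_score(c1, c2):
    a1, a2 = ancestors(c1), ancestors(c2)
    lca = next(x for x in a1 if x in a2)   # lowest common ancestor
    dist = a1.index(lca) + a2.index(lca)   # path length through the LCA
    return 1.0 / (1.0 + dist)

def parts_chain(c):
    wholes = set()
    while c in PART_OF:
        c = PART_OF[c]
        wholes.add(c)
    return wholes

def meronomy_score(c1, c2):
    # full credit when one concept is (transitively) part of the other
    return 1.0 if c2 in parts_chain(c1) or c1 in parts_chain(c2) else 0.0

def relatedness(c1, c2, weights=(0.5, 0.5)):
    """Weighted blend of normalized relatedness scores in [0, 1]."""
    return weights[0] * path_score(c1, c2) + weights[1] * meronomy_score(c1, c2)

print(relatedness("resistor", "capacitor"))  # siblings: taxonomy only (~0.17)
print(relatedness("resistor", "pcb"))        # part-of link boosts score (0.625)
```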

Relevance:

30.00%

Publisher:

Abstract:

In the field of thrombosis and haemostasis, many preanalytical variables influence the results of coagulation assays, and measures should be taken to limit potential result variations. To our knowledge, no paper describing the development and maintenance of a haemostasis biobank has previously been published. Our description of the biobank of the Swiss cohort of elderly patients with venous thromboembolism (SWITCO65+) is intended to facilitate the set-up of other biobanks in the field of thrombosis and haemostasis. SWITCO65+ is a multicentre cohort that prospectively enrolled consecutive patients aged ≥65 years with venous thromboembolism at nine Swiss hospitals from 09/2009 to 03/2012. Patients will be followed up until December 2013. The cohort includes a biobank with biological material from each participant taken at baseline and after 12 months of follow-up. Whole blood from all participants is assayed with a standard haematology panel, for which fresh samples are required. Two buffy coat vials, one PAXgene Blood RNA System tube and one EDTA whole-blood sample are also collected at baseline for RNA/DNA extraction. Blood samples are processed and vialed within 1 h of collection and transported in batches to a central laboratory, where they are stored in ultra-low-temperature archives. All analyses of the same type are performed in the same laboratory in batches. Using multiple core laboratories increased the speed of sample analyses and reduced storage time. After recruiting, processing and analyzing the blood of more than 1,000 patients, we determined that the adopted methods and technologies were fit for purpose and robust.

Relevance:

30.00%

Publisher:

Abstract:

Using stress and coping as a unifying theoretical concept, a series of five models was developed in order to synthesize the survey questions and to classify information. These models identified the question, listed the research study, described measurements, listed workplace data, and listed industry and national reference data.

A set of 38 instrument questions was developed within the five coping correlate categories. In addition, a set of 22 stress symptoms was also developed. The study was conducted within two groups, police and professors, on a large university campus. The groups were selected because their occupations were diverse, but they were part of the same macroenvironment. The premise was that police officers would be more highly stressed than professors.

Of a total study group of 80, there were 37 respondents. The difference in the mean stress responses between the two groups was observable. Not only were the responses similar within each group, but the stress level of the responses was also similar within each group. While the response to the survey instrument was good, only 3 respondents answered the stress symptom survey properly. It was determined that none of the 37 respondents believed that they were ill. This perception of being well was also evidenced by the grand mean of the stress scores of 2.76 (3.0 = moderate stress). This also caused fewer independent variables to be entered into the multiple regression model.

The survey instrument was carefully designed to be universal. Universality is the ability to transcend occupational or regional definitions as applied to stress: the ability to measure responses within broad categories such as physiological, emotional, behavioral, social, and cognitive functions without losing the ability to measure the detail within individual questions, or the relationships between questions and categories.

Replication is much easier to achieve with standardized categories, questions, and measurement procedures such as those developed for the universal survey instrument. Because the survey instrument is universal, it can be used as an analytical device, an assessment device, a basic tool for planning, and a follow-up instrument to measure individual response to planned reductions in occupational stress. (Abstract shortened with permission of author.)

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE The aim of the present study was to evaluate dose reduction in contrast-enhanced chest computed tomography (CT) by comparing the three latest generations of Siemens CT scanners used in clinical practice. We analyzed the amount of radiation needed with filtered back projection (FBP) and with an iterative reconstruction (IR) algorithm to yield the same image quality. Furthermore, the influence on radiation dose of the most recent integrated circuit detector (ICD; Stellar detector, Siemens Healthcare, Erlangen, Germany) was investigated.

MATERIALS AND METHODS 136 patients were included. Scan parameters were set to a routine thorax protocol: SOMATOM Sensation 64 (FBP), SOMATOM Definition Flash (IR), and SOMATOM Definition Edge (ICD and IR). Tube current was set constantly to the reference level of 100 mAs, using automated tube current modulation with reference milliamperes. CARE kV was used on the Flash and Edge scanners, while tube potential was individually selected between 100 and 140 kVp by the medical technologists at the SOMATOM Sensation. Quality assessment was performed on soft-tissue kernel reconstructions. Dose was represented by the dose-length product (DLP).

RESULTS The DLP with FBP for the average chest CT was 308 ± 99.6 mGy*cm. In contrast, the DLP for chest CT with the IR algorithm was 196.8 ± 68.8 mGy*cm (P = 0.0001). A further decline in dose was noted with IR and the ICD: DLP 166.4 ± 54.5 mGy*cm (P = 0.033). The dose reduction compared to FBP was 36.1% with IR and 45.6% with IR/ICD. The signal-to-noise ratio (SNR) in the aorta, bone, and soft tissue was favorable for IR/ICD compared to FBP (P values ranged from 0.003 to 0.048). Overall contrast-to-noise ratio (CNR) improved with declining DLP.

CONCLUSION The most recent technical developments, namely IR in combination with integrated circuit detectors, can significantly lower the radiation dose in chest CT examinations.
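As a quick arithmetic check, the reported percentage reductions follow directly from the mean DLPs (a sketch; the small deviation in the second figure is presumably due to rounding of the published means):

```python
# Dose reductions recomputed from the reported (rounded) mean DLPs [mGy*cm]
fbp, ir, ir_icd = 308.0, 196.8, 166.4
print(f"IR vs FBP:     {(fbp - ir) / fbp:.1%}")       # 36.1%, as reported
print(f"IR/ICD vs FBP: {(fbp - ir_icd) / fbp:.1%}")   # ~46% vs the reported 45.6%
```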

Relevance:

30.00%

Publisher:

Abstract:

Cleverly designed molecular building blocks provide chemists with the tools of a powerful molecular-scale construction set, enabling them to engineer materials having a predictable order and useful solid-state properties. Hence, it is in the realm of supramolecular chemistry to follow a strategy for synthesizing materials that combine a selected set of properties, for instance from the areas of magnetism, photophysics, and electronics. As a successful approach, host/guest solids based on extended anionic, homo- and bimetallic oxalato-bridged transition-metal compounds with two- and three-dimensional connectivities have been investigated. In this report, a brief review of the structural aspects of this class of compounds is given, followed by a thermal and magnetic study of two distinct heterometallic oxalato-bridged layer compounds.

Relevance:

30.00%

Publisher:

Abstract:

The influence of respiratory motion on patient anatomy poses a challenge to accurate radiation therapy, especially in lung cancer treatment. Modern radiation therapy planning uses models of tumor respiratory motion to account for target motion in targeting. The tumor motion model can be verified on a per-treatment-session basis with four-dimensional cone-beam computed tomography (4D-CBCT), which acquires an image set of the dynamic target throughout the respiratory cycle during the therapy session. 4D-CBCT is undersampled if the scan time is too short; however, a short scan time is desirable in clinical practice to reduce patient setup time. This dissertation presents the design and optimization of 4D-CBCT to reduce the impact of undersampling artifacts at short scan times. This work measures the impact of undersampling artifacts on the accuracy of target motion measurement under different sampling conditions and for various object sizes and motions. The results provide a minimum scan time such that the target tracking error is less than a specified tolerance. This work also presents new image reconstruction algorithms for reducing undersampling artifacts in undersampled datasets by taking advantage of the assumption that the relevant motion of interest is contained within a volume-of-interest (VOI). It is shown that the VOI-based reconstruction provides more accurate image intensity than standard reconstruction: in a study designed to simulate target motion, it produced 43% less least-squares error inside the VOI and 84% less error throughout the image. The VOI-based reconstruction approach can reduce acquisition time and improve image quality in 4D-CBCT.
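The VOI idea above can be illustrated with a toy linear system: pin the voxels outside the VOI to a prior (for instance, a static reconstruction) and solve the now well-posed least-squares problem for the VOI voxels only. The matrix sizes and random operator below are stand-ins, not a CBCT geometry:

```python
# Toy linear-algebra illustration of VOI-constrained least-squares reconstruction.
import numpy as np

rng = np.random.default_rng(0)
n_vox, n_meas = 60, 40                     # undersampled: fewer rays than voxels
A = rng.standard_normal((n_meas, n_vox))   # stand-in projection operator
x_true = rng.standard_normal(n_vox)
b = A @ x_true                             # measured projections

voi = np.zeros(n_vox, dtype=bool)
voi[:10] = True                            # 10 "moving" voxels of interest

x_prior = x_true.copy()
x_prior[voi] = 0.0                         # prior is wrong only inside the VOI

# Move the known (outside-VOI) contribution to the right-hand side and solve
# a small, well-posed problem for the VOI voxels only.
rhs = b - A[:, ~voi] @ x_prior[~voi]
x_voi, *_ = np.linalg.lstsq(A[:, voi], rhs, rcond=None)

print(np.allclose(x_voi, x_true[voi]))     # True: 40 equations, 10 unknowns
```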

Relevance:

30.00%

Publisher:

Abstract:

Nowadays, computing platforms consist of a very large number of components that must be supplied with different voltage levels and meet different power requirements. Even a very small platform, such as a handheld computer, may contain more than twenty different loads and voltage regulators. The power delivery designers of these systems are required to provide, in a very short time, the right power architecture: one that optimizes performance and meets electrical specifications as well as cost and size targets. The appropriate selection of the architecture and converters directly defines the performance of a given solution. Therefore, the designer needs to be able to evaluate a significant number of options in order to know with good certainty whether the selected solutions meet the size, energy efficiency, and cost targets. The difficulty of selecting the right solution arises from the wide range of power conversion products provided by different manufacturers. These products range from discrete components (to build converters) to complete power conversion modules that employ different manufacturing technologies. Consequently, in most cases it is not possible to analyze all the alternatives (combinations of power architectures and converters) that can be built; the designer has to select a limited number of converters in order to simplify the analysis. In this thesis, to overcome these difficulties, a new design methodology for power supply systems is proposed. This methodology integrates evolutionary computation techniques to make it possible to analyze a large number of possibilities. This exhaustive analysis helps the designer to quickly define a set of feasible solutions and select the best trade-off in performance for each application. The proposed approach consists of two key steps: one for the automatic generation of architectures and the other for the optimized selection of components. The implementation of these two steps is detailed in this thesis. The usefulness of the methodology is corroborated by contrasting the results on real problems and on experiments designed to test the limits of the algorithms.
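As a sketch of the "optimized selection of components" step, a compact genetic algorithm is shown below: one converter choice per load, with a fitness that penalizes architectures missing an efficiency floor. The converter catalog, the multiplicative efficiency model, and all targets are invented placeholders, not the thesis' method in detail:

```python
import random

random.seed(1)
# (cost [$], efficiency) per candidate converter, one option list per load
CATALOG = [[(1.2, 0.88), (2.0, 0.93), (3.1, 0.96)],
           [(0.8, 0.85), (1.5, 0.91)],
           [(2.4, 0.90), (3.0, 0.94), (4.2, 0.97)]]
EFF_FLOOR = 0.80   # required overall efficiency (toy multiplicative model)

def fitness(genome):
    """Total cost plus a heavy penalty if the efficiency floor is missed."""
    cost = sum(CATALOG[i][g][0] for i, g in enumerate(genome))
    eff = 1.0
    for i, g in enumerate(genome):
        eff *= CATALOG[i][g][1]
    penalty = 0.0 if eff >= EFF_FLOOR else 1000.0 * (EFF_FLOOR - eff)
    return cost + penalty           # lower is better

def evolve(pop_size=20, generations=40, mut_rate=0.2):
    pop = [[random.randrange(len(opts)) for opts in CATALOG]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]        # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(CATALOG))
            child = a[:cut] + b[cut:]           # one-point crossover
            if random.random() < mut_rate:      # point mutation
                i = random.randrange(len(CATALOG))
                child[i] = random.randrange(len(CATALOG[i]))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```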

Relevance:

30.00%

Publisher:

Abstract:

This Doctoral Thesis, entitled Contribution to the analysis, design and assessment of compact antenna test ranges at millimeter wavelengths, aims to deepen the knowledge of a particular antenna measurement system: the compact range operating in the millimeter-wavelength frequency bands. The thesis has been developed at the Radiation Group (GR), an antenna laboratory belonging to the Signals, Systems and Radiocommunications department (SSR) of the Technical University of Madrid (UPM). The Radiation Group has extensive experience in antenna measurements and currently runs four facilities in different configurations: a Gregorian compact antenna test range, a spherical near-field system, a planar near-field system, and a semi-anechoic arch system. The research work performed for this thesis extends the knowledge of the first of these measurement configurations to higher frequencies, beyond the microwave region where the Radiation Group already offers customer-level performance. To reach this goal, a set of scientific tasks was carried out sequentially; they are succinctly described in the following paragraphs.

The first step was a review of the state of the art: the scientific literature on measurement practices in compact antenna test ranges was studied together with the particularities of millimeter-wavelength technologies. The joint study of both fields of knowledge converged, where such measurement facilities are of interest, on a series of technological challenges that become serious bottlenecks at different stages: analysis, design, and assessment.

Second, after this overview, the focus was set on electromagnetic analysis algorithms. These formulations make it possible to evaluate electromagnetic features of interest, such as the field distribution phase or the stray signals of particular structures, when they interact with sources of electromagnetic waves. Properly operated, a CATR features collimation optics that are large in terms of wavelengths. Accordingly, the electromagnetic analysis introduces a large number of mathematical unknowns that grows with frequency, following polynomial laws whose order depends on the algorithm used. The optics configuration of interest here is the reflection-type serrated-edge collimator. The analysis of these devices requires flexible handling of almost arbitrary scattering geometries, and this flexibility is the core of an algorithm's ability to support the subsequent design tasks. The contribution of this thesis to this field is a formulation that is powerful both in dealing with various analysis geometries and computationally. Two algorithms were developed; while based on the same hybridization principle, they reach different orders of physical accuracy at different computational cost. Their CATR design capabilities were inter-compared, yielding both qualitative and quantitative conclusions about their scope.

Third, interest shifted from the analysis and design tasks toward range assessment. Millimeter wavelengths imply strict mechanical tolerances and fine setup adjustment. In addition, the large number of unknowns already faced at the analysis stage appears again at the in-chamber field-probing stage. The naturally lower dynamic range available from semiconductor millimeter-wave sources additionally requires longer integration times at each probing point. These peculiarities greatly increase the difficulty of performing assessment processes in CATR facilities beyond microwaves. The bottleneck becomes so tight that it compromises range characterization beyond a certain limit frequency, which typically lies in the lowest segment of the millimeter-wavelength band, whereas the value of range assessment grows, on the contrary, toward the highest segment. This thesis contributes to this technological scenario by developing quiet-zone probing techniques that achieve substantial data-reduction ratios. Collaterally, they increase the robustness of the results against noise, which amounts to a virtual increase of the setup's available dynamic range.

Fourth, the environmental sensitivity of millimeter wavelengths was addressed. The drift of electromagnetic experiments caused by the dependence of the results on the surrounding environment is well known. At millimeter wavelengths, this effect relegates many practices that are routine industrially at microwave frequencies to the experimental stage. In particular, the evolution of the atmosphere, even within acceptable conditioning bounds, produces drift phenomena that completely mask the experimental results. The contribution of this thesis on this aspect is an electrical model of the indoor atmosphere of a CATR as a function of the environmental variables that affect the range's performance. A simple model was developed that relates high-level phenomena, such as the feed-probe phase drift, to low-level magnitudes that are easy to sample: relative humidity and temperature. With this model, environmental compensation can be performed, and chamber conditioning is automatically extended toward higher frequencies.

In summary, the purpose of this thesis is to advance the knowledge of compact antenna test ranges at millimeter wavelengths. This knowledge is dispensed through the sequential stages of a CATR's conception, from early low-level electromagnetic analysis to the assessment of an operational facility, stages at each of which bottleneck phenomena currently exist and seriously compromise antenna measurement practice at millimeter wavelengths.
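The environmental model just described lends itself to a simple regression sketch: treat the feed-probe phase drift as a function of the two sampled quantities and subtract the fitted prediction from new probings. Everything below (the linear form, the coefficients, the synthetic chamber log) is an illustrative assumption, not the thesis' actual model:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic chamber log: temperature [degC], relative humidity [%], phase [deg]
T = 21.0 + rng.normal(0, 0.4, 200)
RH = 45.0 + rng.normal(0, 3.0, 200)
phase = 0.9 * (T - 21.0) + 0.15 * (RH - 45.0) + rng.normal(0, 0.05, 200)

# Fit phase ~ a*(T - T0) + b*(RH - RH0) + c by linear least squares
X = np.column_stack([T - 21.0, RH - 45.0, np.ones_like(T)])
coef, *_ = np.linalg.lstsq(X, phase, rcond=None)

def compensate(measured_phase_deg, t_degc, rh_pct):
    """Remove the predicted environmental drift from a probed phase sample."""
    drift = coef @ np.array([t_degc - 21.0, rh_pct - 45.0, 1.0])
    return measured_phase_deg - drift

print(coef)  # ~[0.9, 0.15, ~0]: the drift sensitivities are recovered
```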

Relevance:

30.00%

Publisher:

Abstract:

For years, the Human-Computer Interaction (HCI) community has crafted usability guidelines that clearly define what characteristics a software system should have in order to be easy to use. However, the Software Engineering (SE) community keeps falling short of successfully incorporating these recommendations into software projects. From an SE perspective, the process of incorporating usability features into software is not always straightforward, as a large number of these features have heavy implications for the underlying software architecture. For example, successfully including an "undo" feature in an application requires the design and implementation of many complex, interrelated data structures and functionalities. Our work focuses on providing developers with a set of software design patterns to assist them in the process of designing more usable software. This would contribute to the proper inclusion of specific usability features with high impact on the software design. Preliminary validation data show that usage of the guidelines also has positive effects on development time and overall software design quality.
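The "undo" example above is the classic case where a usability feature dictates architecture. Here is a hedged sketch of the Command pattern that a design pattern for undo would typically package (illustrative; not necessarily the paper's own pattern catalog):

```python
from typing import Protocol

class Command(Protocol):
    def execute(self) -> None: ...
    def undo(self) -> None: ...

class InsertText:
    """One reversible user action: inserting a text chunk into a document."""
    def __init__(self, doc: list, pos: int, text: str):
        self.doc, self.pos, self.text = doc, pos, text
    def execute(self) -> None:
        self.doc.insert(self.pos, self.text)
    def undo(self) -> None:
        del self.doc[self.pos]

class History:
    """Every action is routed through here so it can be reverted later."""
    def __init__(self):
        self._done: list[Command] = []
    def run(self, cmd: Command) -> None:
        cmd.execute()
        self._done.append(cmd)
    def undo_last(self) -> None:
        if self._done:
            self._done.pop().undo()

doc: list = []
h = History()
h.run(InsertText(doc, 0, "hello"))
h.run(InsertText(doc, 1, "world"))
h.undo_last()
print(doc)  # ['hello']
```

The architectural point is exactly the one the abstract makes: undo is not a local widget tweak but a cross-cutting structure that every state-changing operation must pass through.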

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we describe a complete development platform that features different innovative acceleration strategies, not included in any other current platform, that simplify and speed up the definition of the different elements required to design a spoken dialog service. The proposed accelerations are mainly based on using the information from the backend database schema and contents, as well as cumulative information produced throughout the different steps of the design. Thanks to these accelerations, the interaction between the designer and the platform is improved, and in most cases the design is reduced to simple confirmations of the "proposals" that the platform dynamically provides at each step. In addition, the platform provides several other accelerations: configurable templates that can be used to define the different tasks in the service or the dialogs to obtain information from, or show information to, the user; automatic proposals for the best way to request slot contents from the user (i.e., using mixed-initiative or directed forms); an assistant that offers the set of most probable actions required to complete the definition of the different tasks in the application; and another assistant for solving specific modality details, such as confirmations of user answers or how to present the lists of results retrieved from the backend database. Additionally, the platform allows the creation of speech grammars and prompts, database access functions, and the use of mixed-initiative and over-answering dialogs. In the paper we also describe each assistant in the platform in detail, emphasizing the different kinds of methodologies followed to facilitate the design process in each one. Finally, we describe the results obtained in both a subjective and an objective evaluation with different designers, which confirm the viability, usefulness, and functionality of the proposed accelerations. Thanks to the accelerations, the design time is reduced by more than 56% and the number of keystrokes by 84%.
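The mixed-initiative versus directed-form choice mentioned above can be pictured with a toy slot-filling loop; the slot names and the "slot=value" parsing are invented placeholders, not the platform's actual formalism:

```python
SLOTS = {"origin": None, "destination": None, "date": None}

def parse(utterance: str) -> dict:
    """Stand-in language understanding: comma-separated 'slot=value' pairs."""
    out = {}
    for chunk in utterance.split(","):
        if "=" in chunk:
            k, v = chunk.split("=", 1)
            if k.strip() in SLOTS:
                out[k.strip()] = v.strip()
    return out

def next_prompt(slots: dict, mixed_initiative: bool) -> str:
    missing = [k for k, v in slots.items() if v is None]
    if not missing:
        return "Confirm: " + ", ".join(f"{k}={v}" for k, v in slots.items())
    if mixed_initiative:                       # one open prompt, any slots accepted
        return f"How can I help? (still need: {', '.join(missing)})"
    return f"Please state your {missing[0]}."  # directed form: one slot at a time

slots = dict(SLOTS)
slots.update(parse("origin=Madrid, date=May 3"))   # user over-answers
print(next_prompt(slots, mixed_initiative=True))   # only 'destination' remains
```

A mixed-initiative form lets the user over-answer, as in the example, while a directed form walks through the missing slots one prompt at a time.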

Relevance:

30.00%

Publisher:

Abstract:

This paper shows the role that some foresight tools, such as scenario design, may play in exploring the future impacts of global challenges on contemporary society. Additionally, it provides some clues about how to reinforce scenario design so that it delivers more in-depth analysis without losing its qualitative nature and communication advantages. Since its inception in the early seventies, scenario design has become one of the most popular foresight tools, used in several fields of knowledge. Nevertheless, its wide acceptance has not been matched in the academic and professional realm of urban planning. In some instances, scenario design is perceived as just a storytelling technique that generates oversimplified future visions without the support of rigorous and sound analysis. As a matter of fact, the potential of scenario design for providing more in-depth analysis and for connecting with quantitative methods has generally been missed, giving arguments away to its critics. Based on these premises, this document tries to prove the capability of scenario design to anticipate the impacts of complex global challenges, and to do so in a more analytical way. These assumptions are tested through a scenario design exercise which explores the future evolution of the sustainable development (SD) paradigm and its implications for the Spanish urban development model. In order to reinforce the perception of scenario design as a useful, value-adding instrument for urban planners, three sets of implications (functional, parametric, and spatial) are presented to provide substantial and in-depth information for policy makers. This study shows some major findings. First, it is feasible to set up a systematic approach that provides anticipatory intelligence about future disruptive events that may affect the natural environment and socioeconomic fabric of a given territory. Second, there are opportunities for innovating in Spanish urban planning processes and city governance models. Third, as a foresight tool, scenario design can be substantially reinforced if proper efforts are made to present the functional, parametric, and spatial implications generated by the scenarios. Fourth, the study confirms that foresight offers interesting opportunities to urban planners, such as anticipating changes, formulating visions, fostering participation, and building networks.