935 results for design driven
Abstract:
For most people design is a mystery. The products of design are integrated into our daily lives to the point that design has become invisible to us. However, what is subsumed in design practice is a creative problem-solving process that is applicable as a teaching strategy as well as a method for teaching the subject of design. The purpose of this study was to inquire into the current classroom practice of Ontario Visual Arts and Technological Education teachers, understand the goals of Ontario government curriculum developers, and explore the position held by the professional design community on secondary school design education. Data for this study were collected from: (a) a textual analysis of 4 Ministry curriculum documents; (b) interviews with 10 stakeholders; (c) unobtrusive observations and informal conversations conducted at 7 secondary school open house events; and (d) observation of 2 sessions of an AQ course for Design and Technology. The research design modeled the design process and was divided into 2 parts: a discovery or problem-finding phase and a discussion or problem-solving phase. The results showed that design is misunderstood and misused; it has become lost between visual arts and technology, where neither program holds responsibility for its delivery; students mistake working on computers for design practice; and while there is a desire within the professional community to have a voice in secondary school design education, there is no forum for participation. The technology-driven paradigm shift taking place in society today calls for a new framework for teaching and practicing design. Further research is required; however, in the meantime, secondary school educators might benefit from professional development and classroom support from the professional design community.
Abstract:
Humans use three distinct sensory systems to regulate upright stance: somatosensation, the vestibular system, and the visual system. The role of vision in postural regulation remains poorly understood, in particular its variability as a function of age, developmental type, and neurological insult. In this work, visually induced postural regulation was assessed in typically developing and normally aging participants aged 5-85 years, in autistic individuals (atypical development) aged 12-33 years, and in children aged 9-18 years who had sustained a mild traumatic brain injury (mTBI). To this end, participants' postural reactivity in response to a fully immersive virtual tunnel moving at three velocities was measured; control conditions, in which the tunnel was static or absent, were included. The results show that visually induced postural reactivity (i.e., instability) is highest in young children; it then decreases, reaching adult values around 16-19 years of age, and increases linearly with age after 45 years, becoming elevated again around age 60. Moreover, at the highest tunnel velocity, the youngest autistic participants showed significantly less postural reactivity than their controls; this difference was not present in older participants (16-33 years). Finally, children with mTBI who were initially moderately symptomatic showed a higher level of visually induced postural instability than controls up to 12 weeks post-trauma, even though the majority of them (89%) were no longer symptomatic at that stage. Taken together, these findings suggest an important transition period around age 16 in the maturation of the systems underlying the sensorimotor integration involved in postural control, and further sensorimotor changes around age 60; this visual over-reliance for postural regulation in children and in older adults could guide the design of spaces and activities adjusted to individuals' age. Moreover, the fact that postural hypo-reactivity to visual information in autistic individuals depends on the characteristics of the visual environment and on chronological age refines our understanding of the sensory anomalies specific to autism. Furthermore, the fact that children with mTBI show postural anomalies up to 3 months post-trauma, despite a significant decrease in reported symptoms, could be related to altered processing of dynamic visual information and could have implications for the clinical management of mTBI patients, since symptom resolution is currently the main criterion used in return-to-activity decisions. Finally, the results obtained in a population with atypical development (autism) and a population with a so-called transient neurological insult (mTBI) not only contribute to a better understanding of the sensorimotor integration mechanisms underlying postural control but could also serve as sensitive and specific markers of dysfunction in these populations. Keywords: posture, balance, vision, sensorimotor development/aging, autism, symptomatic mild TBI, virtual reality.
Abstract:
A low-inductance, triggered spark gap switch suitable for a high-current fast discharge system has been developed. The details of the design and fabrication of this pressurized spark gap, which uses only commonly available materials, are described. A transverse-discharge Blumlein-driven N2 laser incorporating this device gives a peak output power of 700 kW with a FWHM of 3 ns and an efficiency of 0.51%, which is remarkably high for a pulsed nitrogen laser system.
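As a quick sanity check on these figures (an estimate, not a claim from the paper), treating the pulse as roughly rectangular gives the pulse energy, and, if the 0.51% efficiency is taken relative to the stored electrical energy, the energy the discharge system must deliver:

$$E_{\text{pulse}} \approx P_{\text{peak}}\,\tau_{\text{FWHM}} = (700\ \text{kW})(3\ \text{ns}) \approx 2.1\ \text{mJ}, \qquad E_{\text{stored}} \approx \frac{E_{\text{pulse}}}{\eta} = \frac{2.1\ \text{mJ}}{0.0051} \approx 0.41\ \text{J}.$$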
Abstract:
Genetic programming is known to provide good solutions for many problems, such as the evolution of network protocols and distributed algorithms. In such cases it is most likely a hardwired module of a design framework that assists the engineer in optimizing specific aspects of the system to be developed, providing its results in a fixed format through an internal interface. In this paper we show how the utility of genetic programming can be increased remarkably by isolating it as a component and integrating it into the model-driven software development process. Our genetic programming framework produces XMI-encoded UML models that can easily be loaded into widely available modeling tools, which in turn possess code generation as well as additional analysis and test capabilities. We use the evolution of a distributed election algorithm as an example to illustrate how genetic programming can be combined with model-driven development. This example clearly illustrates the advantages of our approach: the generation of source code in different programming languages.
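As a loose illustration of what such an isolated genetic-programming component does internally (a minimal sketch only: the node set, fitness function and selection scheme below are hypothetical stand-ins, and the paper's framework evolves protocol models rather than arithmetic expressions):

```python
import random
import operator

# Hypothetical node set; the real framework evolves algorithm/protocol
# models, not arithmetic expressions.
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
TERMINALS = ['x', 1.0, 2.0]

def random_tree(depth=3):
    """Grow a random expression tree as nested tuples."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if isinstance(tree, tuple):
        op, left, right = tree
        return OPS[op](evaluate(left, x), evaluate(right, x))
    return x if tree == 'x' else tree

def fitness(tree):
    """Toy objective: approximate f(x) = x*x + 1 on a few samples."""
    return -sum((evaluate(tree, x) - (x * x + 1)) ** 2 for x in range(-5, 6))

def mutate(tree):
    """Replace a node with a fresh subtree with small probability."""
    return random_tree(2) if random.random() < 0.2 else (
        (tree[0], mutate(tree[1]), mutate(tree[2])) if isinstance(tree, tuple) else tree)

def evolve(pop_size=200, generations=40):
    pop = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 4]          # truncation selection
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

In the paper's setting, the winning individual would then be exported through the component's interface as an XMI-encoded UML model, from which modeling tools generate source code in different target languages.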
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and the resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. Through the application of land-use models, it is possible to analyze the complex structure of linkages and feedbacks and to determine the relevance of driving forces. Modeling land use and land-use change has a long tradition; on the regional scale in particular, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are its notably extended capability to integrate models and its strict separation of application and implementation, which enable efficient development, testing and use of integrated land-use models. On the system side, SITE provides generic data structures (grids, grid cells, attributes etc.) and takes over responsibility for their administration. By means of a scripting language (Python), extended by language features specific to land-use modeling, these data structures can be used and manipulated by modeling applications; the scripting language interpreter is embedded in SITE. The integration of sub-models can be achieved via the scripting language or through a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model testing and analysis of simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was placed on expandability, maintainability and usability. Along with the modeling framework, a land-use model for analyzing the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics over the historical period 1981 to 2002; analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. This case study showed that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component.
The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function is typically a map-comparison algorithm capable of comparing a simulation result to a reference map; several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002, for which respective reference land-use maps were compiled. It could be shown that efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge of the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
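A minimal sketch of how such automated calibration can be wired together, assuming Boolean change maps and a mutation-only genetic algorithm (the `run_model` hook, population sizes and mutation scale are hypothetical; SITE's actual component is richer and supports several optimization and map-comparison methods):

```python
import random
import numpy as np

def figure_of_merit(observed_change, simulated_change):
    """Pontius-style figure of merit on Boolean change maps:
    agreement of change / (change observed or simulated anywhere)."""
    hits = np.logical_and(observed_change, simulated_change).sum()
    misses = np.logical_and(observed_change, ~simulated_change).sum()
    false_alarms = np.logical_and(~observed_change, simulated_change).sum()
    denom = hits + misses + false_alarms
    return hits / denom if denom else 0.0

def calibrate(run_model, observed_change, bounds, pop=30, gens=50):
    """Mutation-only GA over real-valued model parameters.
    `run_model(params)` is a hypothetical hook that runs the land-use
    model for 1981-2002 and returns a Boolean change map."""
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    population = [lo + np.random.rand(len(bounds)) * (hi - lo) for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population,
                        key=lambda p: figure_of_merit(observed_change, run_model(p)),
                        reverse=True)
        elite = scored[: pop // 4]                # keep the best quarter
        population = elite + [np.clip(random.choice(elite)
                                      + np.random.normal(0, 0.1, len(bounds)) * (hi - lo),
                                      lo, hi)
                              for _ in range(pop - len(elite))]
    return max(population, key=lambda p: figure_of_merit(observed_change, run_model(p)))
```

As the abstract notes, deciding which parameters to expose in `bounds` still requires detailed knowledge of the underlying land-use model.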
Abstract:
Conceptual Information Systems unfold the conceptual structure of data stored in relational databases. In the design phase of the system, conceptual hierarchies have to be created which describe different aspects of the data. In this paper, we describe two principal ways of designing such conceptual hierarchies, data-driven design and theory-driven design, and discuss their advantages and drawbacks. The central part of the paper shows how Attribute Exploration, a knowledge acquisition tool developed by B. Ganter, can be applied to narrow the gap between both approaches.
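For readers unfamiliar with the formal concept analysis machinery behind Attribute Exploration, the two derivation operators it builds on can be sketched in a few lines of Python (a generic toy context, not the authors' tool):

```python
# A formal context: objects mapped to the attributes they have.
context = {
    'duck':  {'flies', 'swims', 'lays_eggs'},
    'eagle': {'flies', 'lays_eggs'},
    'trout': {'swims', 'lays_eggs'},
}

def common_attributes(objects):
    """Derivation A -> A': attributes shared by all given objects."""
    sets = [context[o] for o in objects]
    return set.intersection(*sets) if sets else set.union(*context.values())

def common_objects(attributes):
    """Derivation B -> B': objects having all given attributes."""
    return {o for o, attrs in context.items() if attributes <= attrs}

# A formal concept is a fixed point of the two operators; e.g. the
# implication {flies, swims} -> {lays_eggs} holds in this context:
print(common_attributes(common_objects({'flies', 'swims'})))
```

Attribute Exploration then presents implications of the form B → B'' to a domain expert, who either confirms them (theory-driven knowledge) or supplies a counterexample object (data-driven knowledge), which is exactly how it narrows the gap between the two design approaches.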
Abstract:
Caches are known to consume up to half of all system power in embedded processors. Co-optimizing the performance and power of the cache subsystem is therefore an important step in the design of embedded systems, especially those employing application-specific instruction processors. In this project, we propose an analytical cache model that succinctly captures the miss performance of an application over the entire cache parameter space. Unlike exhaustive trace-driven simulation, our model requires that the program be simulated only once so that a few key characteristics can be obtained. Using these application-dependent characteristics, the model can span the entire cache parameter space, consisting of cache size, associativity and cache block size. Our unified model caters for direct-mapped, set-associative and fully associative instruction, data and unified caches. Validation against full trace-driven simulations shows that our model has a high degree of fidelity. Finally, we show how the model can be coupled with a cache power model so that one can very quickly decide on Pareto-optimal performance-power design points for rapid design space exploration.
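The abstract does not give the model's equations. As a generic illustration of the single-pass idea (not the authors' unified model, which also covers set-associative and direct-mapped caches), one can record an application's LRU stack-distance histogram once and then read off miss ratios for any fully associative cache size:

```python
from collections import Counter

def stack_distances(trace):
    """One pass over the block-address trace: LRU stack distance per access."""
    stack, hist = [], Counter()
    for block in trace:
        if block in stack:
            d = len(stack) - 1 - stack.index(block)   # O(n) scan, for clarity
            hist[d] += 1
            stack.remove(block)
        else:
            hist[float('inf')] += 1                    # cold miss
        stack.append(block)                            # most recent at the end
    return hist

def miss_ratio(hist, cache_blocks):
    """Fully associative LRU: an access misses iff its stack
    distance is >= the number of blocks the cache holds."""
    total = sum(hist.values())
    misses = sum(n for d, n in hist.items() if d >= cache_blocks)
    return misses / total

# One simulation pass yields `hist`; miss ratios for every cache size
# (cache_blocks = size // block_size) then come from the histogram alone.
trace = [a // 64 for a in [0, 64, 0, 128, 192, 0, 64]]   # 64-byte blocks
hist = stack_distances(trace)
print(miss_ratio(hist, cache_blocks=2))
```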
Abstract:
Three supramolecular complexes of Co(II) using SCN⁻/SeCN⁻ in combination with 4,4'-dipyridyl-N,N'-dioxide (dpyo), i.e., {[Co(SCN)₂(dpyo)₂]·(dpyo)}ₙ (1), {[Co(SCN)₂(dpyo)(H₂O)₂]·(H₂O)}ₙ (2), and {[Co(SeCN)₂(dpyo)(H₂O)₂]·(H₂O)}ₙ (3), have been synthesized and characterized by single-crystal X-ray analysis. Complex 1 is a rare example of a dpyo-bridged two-dimensional (2D) coordination polymer; π-stacked dpyo supramolecular rods, generated by the lattice dpyo passing through the rhombic grid of stacked layers, result in a three-dimensional (3D) superstructure. Complexes 2 and 3 are isomorphous one-dimensional (1D) coordination polymers [-Co-dpyo-Co-] that undergo self-assembly into a bilayer architecture derived through an R₂²(8) hydrogen-bonding synthon between coordinated water and dpyo oxygen. A reinvestigation of the coordination polymers [Mn(SCN)₂(dpyo)(H₂O)(MeOH)]ₙ (4) and {[Fe(SCN)₂(dpyo)(H₂O)₂]·(H₂O)}ₙ (5), reported recently by our group [Manna et al. Indian J. Chem. 2006, 45A, 1813], reveals a brick-wall topology rather than a bilayer architecture, owing to the decisive role of S···S/Se···Se interactions in determining the helical nature of 4 and 5 as compared to the zigzag polymeric chains in 2 and 3, although the same R₂²(8) synthon is responsible for supramolecular assembly in these complexes.
Abstract:
The EC Regulation No. 1924/2006 on nutrition and health claims made on foods has generated considerable debate and concern among scientists and industry. At the time of writing, the European Food Safety Authority (EFSA) has not approved any probiotic claims, despite numerous human trials and meta-analyses showing evidence of beneficial effects. On 29th and 30th September 2010, ten independent academic scientists with a documented record in probiotic research met to discuss designs for future probiotic studies to demonstrate health benefits for gut and immune function. The expert panel recommended the following: (i) always formulate a precise and concrete hypothesis, and appropriate goals and parameters, before starting a trial; (ii) ensure trials have sufficient sample size, such that they are adequately powered to reach statistically significant conclusions, either supporting or rejecting the a priori hypothesis, taking into account adjustment for multiple testing (this might necessitate more than one recruitment site); (iii) ensure trials are of appropriate duration; (iv) focus on a single, primary objective and only evaluate multiple parameters when they are hypothesis-driven. The panel agreed that there was an urgent need to better define which biomarkers are considered valuable for substantiation of a health claim. As a first step, the panel welcomed the publication, on the day of the meeting, of EFSA's draft guidance document on immune and gut health, although it came too late for study designs and dossiers to be adjusted accordingly. New validated biomarkers need to be identified in order to properly determine the range of physiological functions influenced by probiotics. In addition, validated biomarkers reflecting risk factors for disease are required for article 14 claims (EC Regulation No. 1924/2006). Finally, the panel concluded that consensus among scientists is needed to decide appropriate clinical endpoints for trials.
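Recommendation (ii) can be made concrete with a standard sample-size calculation; the sketch below uses Python's statsmodels with purely illustrative numbers (the effect size, power and the Bonferroni adjustment for five endpoints are assumptions, not values from the panel):

```python
from statsmodels.stats.power import TTestIndPower

# Assumed design: two-arm trial, modest effect (Cohen's d = 0.4),
# 80% power, and Bonferroni adjustment for 5 tested endpoints.
n_endpoints = 5
alpha_adjusted = 0.05 / n_endpoints    # multiple-testing correction

n_per_arm = TTestIndPower().solve_power(
    effect_size=0.4,
    alpha=alpha_adjusted,
    power=0.80,
    alternative='two-sided',
)
print(f"participants needed per arm: {n_per_arm:.0f}")
```

Trials sized below such an estimate risk being unable either to support or to reject the a priori hypothesis.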
Abstract:
Natural ventilation relies on less controllable natural forces and therefore demands more careful control, so its prediction, design and analysis become all the more important. This paper presents both theoretical and numerical simulations for predicting the natural ventilation flow in a two-zone building with multiple openings subjected to combined natural forces. To our knowledge, these are the first analytical solutions obtained for a building with more than one zone and with possibly more than two openings per zone. The analytical solution offers a possibility for validating multi-zone airflow programs. A computer program, MIX, is employed to conduct the numerical simulation, and good agreement between the analytical and numerical results is achieved. Different airflow modes are identified and some design recommendations are provided.
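The paper's closed-form solution is not reproduced in the abstract; for orientation, multi-zone envelope-flow models of this kind are typically built from the orifice equation for each opening together with the stack and wind pressure differences (standard relations, not the paper's specific result):

$$Q = C_d A \sqrt{\frac{2\,\Delta p}{\rho}}, \qquad \Delta p_{\text{stack}} = \rho\, g\, \Delta h\, \frac{T_i - T_o}{T_i}, \qquad \Delta p_{\text{wind}} = \tfrac{1}{2}\, \rho\, C_p\, U^2,$$

with mass conservation $\sum_j Q_j = 0$ imposed on every zone; the resulting nonlinear system is what an analytical solution must resolve for the two-zone, multi-opening case.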
Abstract:
This contribution introduces a new digital predistorter to compensate for the serious distortions caused by memory high power amplifiers (HPAs) that exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modeling approach. The nonlinear HPA with memory is first identified based on the B-spline neural network model using the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design the Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inversion can be achieved very efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
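The Newton-Raphson inversion step can be sketched generically; here the Rapp saturation model stands in for the B-spline static nonlinearity identified in the paper, and a numerical derivative replaces the analytic De Boor derivative recursion (both substitutions are assumptions for illustration):

```python
def rapp(r, gain=1.0, a_sat=1.0, p=2.0):
    """Rapp AM/AM model of an HPA (stand-in for the paper's
    B-spline estimate of the static nonlinearity)."""
    return gain * r / (1.0 + (gain * r / a_sat) ** (2 * p)) ** (1.0 / (2 * p))

def invert(y, f=rapp, x0=0.1, iters=20, h=1e-6):
    """Newton-Raphson solve of f(x) = y; the derivative is numerical
    here, whereas the paper uses analytic B-spline derivative recursions."""
    x = x0
    for _ in range(iters):
        fprime = (f(x + h) - f(x - h)) / (2 * h)   # central difference
        x -= (f(x) - y) / fprime
    return x

# Predistortion: drive the HPA with invert(y) so that the cascade
# predistorter -> HPA reproduces the desired amplitude y.
y_desired = 0.6
x_drive = invert(y_desired)
print(x_drive, rapp(x_drive))   # rapp(x_drive) ≈ 0.6
```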
Abstract:
To improve the quality of healthcare services, integrated large-scale medical information systems are needed that can adapt to the changing medical environment. In this paper, we propose a requirement-driven healthcare information system architecture organized in hierarchical layers. The system operates through a mapping mechanism between these layers and can thus organize its functions dynamically, adapting to users' requirements. Furthermore, we introduce organizational semiotics methods to capture and analyze user requirements through ontology charts and norms. Based on these results, the structure of the user's requirement pattern (URP) is established as the driving factor of our system. Our research contributes to the design of healthcare system architectures that can adapt to the changing medical environment.
Abstract:
We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from emission-driven rather than concentration-driven perturbed-parameter ensembles of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, the land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses when considering emission-driven rather than concentration-driven simulations (with 10-90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business-as-usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5), and even under aggressive mitigation (RCP2.6) temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) about the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. For both SRES A1B and the Representative Concentration Pathways (RCPs), the concentration scenario used to drive GCM ensembles lies towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end: combinations of low climate sensitivity and low carbon cycle feedbacks lead a number of CMIP5 responses to lie below our ensemble range. The ensemble also simulates a number of high-end responses which lie above the CMIP5 carbon cycle range; these can be linked to sampling stronger carbon cycle feedbacks and climate sensitivities above 4.5 K. This latter aspect highlights the priority of identifying real-world climate-sensitivity constraints which, if achieved, would reduce the upper bound of projected global mean temperature change. The ensembles of simulations presented here provide a framework to explore relationships between present-day observables and future changes, while the large spread of projected future changes highlights the ongoing need for such work.
Abstract:
In this paper, we investigate the possibility of controlling a mobile robot via a sensory-motor coupling utilizing a diffusion system. For this purpose, we implemented a simulation of the diffusion process of chemicals together with the kinematics of the mobile robot. In contrast to the original Braitenberg vehicle, in which the sensory-motor coupling is tightly realised by hardwiring, our system employs a soft coupling. The mobile robot has two independent sensory-motor units: two sensors are mounted at the front and two motors, one on each side of the robot. The sensory-motor coupling works as follows: (1) place two electrodes in the medium; (2) deposit amounts of chemicals U and V related to the distance to the walls and the intensity of the light; (3) place two further electrodes in the medium; (4) measure the concentrations of chemicals U and V there to actuate the motors on both sides of the robot. The environment consists of four surrounding walls and a light source located at the center. Depending on the design parameters and initial conditions, the robot was able to successfully avoid the walls and the light. More interestingly, the diffusion process in the sensory-motor coupling provided the robot with a simple form of memory, which would not have been possible with a control framework based on a hard-wired electric circuit.
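A rough sketch of the coupling scheme as described (all grid sizes, electrode positions, rate constants and the deposit/readout rule below are hypothetical; the abstract does not specify the paper's chemical model):

```python
import numpy as np

N = 64
U = np.zeros((N, N))          # concentration field of chemical U
V = np.zeros((N, N))          # concentration field of chemical V
Du, Dv, decay, dt = 0.2, 0.1, 0.01, 1.0

def laplacian(F):
    """Discrete 2D Laplacian with periodic boundaries."""
    return (np.roll(F, 1, 0) + np.roll(F, -1, 0)
            + np.roll(F, 1, 1) + np.roll(F, -1, 1) - 4 * F)

def step(left_stim, right_stim):
    """One step: sensors deposit chemical at the 'input' electrodes,
    the medium diffuses and decays, and motor commands are read out
    at the 'output' electrodes."""
    global U, V
    U[16, 16] += left_stim            # deposit from left sensor
    V[16, 48] += right_stim           # deposit from right sensor
    U += dt * (Du * laplacian(U) - decay * U)
    V += dt * (Dv * laplacian(V) - decay * V)
    return U[48, 16], V[48, 48]       # left and right motor drive

for t in range(200):
    m_left, m_right = step(left_stim=1.0 if t < 10 else 0.0, right_stim=0.5)
```

Because deposits spread and decay over time rather than acting instantaneously, the motor readout reflects a fading history of past stimuli; this is the simple form of memory the authors highlight.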
Abstract:
ISO 19156 Observations and Measurements (O&M) provides a standardised framework for organising information about the collection of observations of the environment. Here we describe the implementation of a specialisation of O&M for environmental data, the Metadata Objects for Linking Environmental Sciences (MOLES3). MOLES3 provides support for organising information about data and for user navigation around data holdings. The implementation described here, "CEDA-MOLES", also supports data management functions for the Centre for Environmental Data Archival (CEDA). The previous iteration of MOLES (MOLES2) saw active use over five years before being replaced by CEDA-MOLES in late 2014. During that period important lessons were learnt, both about the information needed and about how to design and maintain the necessary information systems. In this paper we review the problems encountered in MOLES2; how and why CEDA-MOLES was developed and engineered; the migration of information holdings from MOLES2 to CEDA-MOLES; and, finally, provide an early assessment of MOLES3 (as implemented in CEDA-MOLES) and its limitations. Key drivers for the MOLES3 development included the necessity for improved data provenance, for further structured information to support ISO 19115 discovery metadata export (for EU INSPIRE compliance), and for appropriate fixed landing pages for Digital Object Identifiers (DOIs) in the presence of evolving datasets. Key lessons learned included the importance of minimising information structure in free-text fields, and the necessity of supporting as much agility in the information infrastructure as possible without compromising maintainability, both for those using the systems internally and externally (e.g. citing into the information infrastructure) and for those responsible for the systems themselves. The migration itself needed to ensure continuity of service and traceability of archived assets.