958 results for engineering model eliciting activities
Abstract:
Open the sports or business section of your daily newspaper, and you are immediately bombarded with an array of graphs, tables, diagrams, and statistical reports that require interpretation. Across all walks of life, the need to understand statistics is fundamental. Given that our youngsters' future world will be increasingly data laden, scaffolding their statistical understanding and reasoning is imperative, from the early grades on. The National Council of Teachers of Mathematics (NCTM) continues to emphasize the importance of early statistical learning; data analysis and probability was the Council's professional development "Focus of the Year" for 2007–2008. We need such a focus, especially given the results of the statistics items from the 2003 NAEP. As Shaughnessy (2007) noted, students' performance was weak on more complex items involving the interpretation or application of information presented in graphs and tables. Furthermore, little or no gain was made between the 2000 NAEP and the 2003 NAEP studies. One approach I have taken to promote young children's statistical reasoning is through data modeling. Having implemented a number of model-eliciting activities involving working with data in grades 3–9 (e.g., English 2010), I observed how competently children could create their own mathematical ideas and representations—before being instructed how to do so. I thus wished to introduce data-modeling activities to younger children, confident that they would likewise generate their own mathematics. I recently implemented data-modeling activities in a cohort of three first-grade classrooms of six-year-olds. I report on some of the children's responses and discuss the components of data modeling the children engaged in.
Abstract:
This action research examines the enhancement of visual communication within the architectural design studio through physical model making. "It is through physical model making that designers explore their conceptual ideas and develop the creation and understanding of space" (Salama & Wilkinson 2007:126). This research supplements Crowther's findings, extending the understanding of visual dialogue to include physical models. "Architecture Design 8" is the final core design unit at QUT in the fourth year of the Bachelor of Design Architecture. At this stage it is essential that students have the ability to communicate their ideas in a comprehensive manner, relying on a combination of skill sets including drawing, physical model making, and computer modeling. Observations within this research indicate that students did not integrate these skill sets in the design process through the first half of the semester, focusing primarily on drawing and computer modeling. The challenge was to promote deeper learning through physical model making. This research addresses one of the primary reasons for the lack of physical model making: the limited assessment emphasis on physical models. The unit was modified midway through the semester to better correlate the lecture theory with studio activities by incorporating a series of model making exercises conducted during studio time. The outcome of each exercise was assessed. Tutors were surveyed regarding the model making activities and a focus group was conducted to obtain formal feedback from students. Students and tutors recognised the added value in communicating design ideas through physical forms and model making. The studio environment was invigorated by the enhanced learning outcomes of the students who participated in the model making exercises. The conclusions of this research will guide the structure of the upcoming iteration of the fourth year design unit.
Abstract:
The world’s increasing complexity, competitiveness, interconnectivity, and dependence on technology generate new challenges for nations and individuals that cannot be met by continuing education as usual (Katehi, Pearson, & Feder, 2009). With the proliferation of complex systems have come new technologies for communication, collaboration, and conceptualisation. These technologies have led to significant changes in the forms of mathematical and scientific thinking that are required beyond the classroom. Modelling, in its various forms, can develop and broaden children’s mathematical and scientific thinking beyond the standard curriculum. This paper first considers future competencies in the mathematical sciences within an increasingly complex world. Next, consideration is given to interdisciplinary problem solving and models and modelling. Examples of complex, interdisciplinary modelling activities across grades are presented, with data modelling in 1st grade, model-eliciting in 4th grade, and engineering-based modelling in 7th-9th grades.
Abstract:
Background: An arteriovenous loop (AVL) enclosed in a polycarbonate chamber in vivo produces a fibrin exudate which acts as a provisional matrix for the development of a tissue-engineered microcirculatory network. Objectives: By administering enoxaparin sodium, an inhibitor of fibrin polymerization, the significance of fibrin scaffold formation for AVL construct size (including the AVL, fibrin scaffold, and new tissue growth into the fibrin), growth, and vascularization was assessed and compared to controls. Methods: In Sprague Dawley rats, an AVL was created on the femoral vessels and inserted into a polycarbonate chamber in the groin in 3 control groups (Series I) and 3 experimental groups (Series II). Two hours before surgery and 6 hours post-surgery, saline (Series I) or enoxaparin sodium (0.6 mg/kg, Series II) was administered intraperitoneally. Thereafter, the rats were injected daily with saline (Series I) or enoxaparin sodium (1.5 mg/kg, Series II) until construct retrieval at 3, 10, or 21 days. The retrieved constructs underwent weight and volume measurements, and morphologic/morphometric analysis of new tissue components. Results: Enoxaparin sodium treatment resulted in the development of smaller AVL constructs at 3, 10, and 21 days. Construct weight and volume were significantly reduced at 10 days (control weight 0.337 ± 0.016 g [mean ± SEM] vs treated 0.228 ± 0.048 g [P < .001]; control volume 0.317 ± 0.015 mL vs treated 0.184 ± 0.039 mL [P < .01]) and 21 days (control weight 0.306 ± 0.053 g vs treated 0.198 ± 0.043 g [P < .01]; control volume 0.285 ± 0.047 mL vs treated 0.148 ± 0.041 mL [P < .01]). Angiogenesis was delayed in the enoxaparin sodium-treated constructs, with the absolute vascular volume significantly decreased at 10 days (control vascular volume 0.029 ± 0.03 mL vs treated 0.012 ± 0.002 mL [P < .05]). Conclusion: In this in vivo tissue engineering model, endogenous, extravascularly deposited fibrin volume determines construct size and vascular growth in the first 3 weeks and is, therefore, critical to full construct development.
Abstract:
The quality of species distribution models (SDMs) relies to a large degree on the quality of the input data, from bioclimatic indices to environmental and habitat descriptors (Austin, 2002). Recent reviews of SDM techniques have sought to optimize predictive performance (e.g., Elith et al., 2006). In general, SDMs employ one of three approaches to variable selection. The simplest approach relies on the expert to select the variables, as in environmental niche models (Nix, 1986) or a generalized linear model without variable selection (Miller and Franklin, 2002). A second approach explicitly incorporates variable selection into model fitting, which allows examination of particular combinations of variables. Examples include generalized linear or additive models with variable selection (Hastie et al., 2002), or classification trees with complexity- or model-based pruning (Breiman et al., 1984; Zeileis, 2008). A third approach uses model averaging to summarize the overall contribution of a variable, without considering particular combinations. Examples include neural networks, boosted or bagged regression trees, and Maximum Entropy, as compared in Elith et al. (2006). Typically, users of SDMs will either consider a small number of variable sets, via the first approach, or else supply all of the candidate variables (often numbering more than a hundred) to the second or third approaches. Bayesian SDMs exist, with several methods for eliciting and encoding priors on model parameters (see review in Low Choy et al., 2010). However, few methods have been published for informative variable selection; one example is Bayesian trees (O'Leary, 2008). Here we report an elicitation protocol that helps make explicit a priori expert judgements on the quality of candidate variables. This protocol can be flexibly applied to any of the three approaches to variable selection described above, Bayesian or otherwise. We demonstrate how this information can be obtained and then used to guide variable selection in classical or machine learning SDMs, or to define priors within Bayesian SDMs.
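To make the contrast between the three approaches concrete, the following minimal Python sketch (not part of the reported protocol; the data are synthetic and the variable names hypothetical, using scikit-learn) fits an expert-selected GLM, an embedded-selection model, and an averaged tree ensemble on the same candidate variables.

import pandas as pd
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.feature_selection import RFE
from sklearn.ensemble import GradientBoostingClassifier

# Synthetic presence/absence data with eight hypothetical candidate variables.
X, y = make_classification(n_samples=500, n_features=8, n_informative=3, random_state=0)
candidates = [f"var_{i}" for i in range(8)]
X = pd.DataFrame(X, columns=candidates)

# Approach 1: the expert nominates the variables a priori, then a GLM is fitted.
expert_vars = ["var_0", "var_1", "var_2"]
glm = LogisticRegression(max_iter=1000).fit(X[expert_vars], y)

# Approach 2: variable selection embedded in model fitting (here, recursive elimination).
rfe = RFE(LogisticRegression(max_iter=1000), n_features_to_select=3).fit(X, y)
selected = [v for v, keep in zip(candidates, rfe.support_) if keep]

# Approach 3: model averaging over many trees; each variable's overall contribution
# is summarised by an importance score rather than by any single selected subset.
gbm = GradientBoostingClassifier(random_state=0).fit(X, y)
importances = dict(zip(candidates, gbm.feature_importances_.round(2)))

print("Expert GLM coefficients:", dict(zip(expert_vars, glm.coef_[0].round(2))))
print("Embedded selection kept:", selected)
print("Averaged importances:", importances)

Elicited expert judgements on variable quality could then restrict the candidate list fed to the second or third approach, or inform prior inclusion probabilities in a Bayesian variant.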
Abstract:
Composite Applications on top of SAP's implementation of SOA (Enterprise SOA) enable the extension of already existing business logic. In this paper we show, based on a case study, how Model-Driven Engineering concepts are applied in the development of such Composite Applications. Our case study extends a back-end business process which is required for the specific needs of a demo company selling wine. We use this to describe how the business-centric models specifying the modified business behaviour of our case study can be utilized for business performance analysis where most of the actions are performed by humans. In particular, we apply a refined version of Model-Driven Performance Engineering that we proposed in our previous work and motivate which business domain specifics have to be taken into account for business performance analysis. We additionally motivate the need for performance-related decision support for domain experts, who generally lack performance-related skills. Such support should offer visual guidance about what should be changed in the design and resource mapping to obtain improved results with respect to modification constraints and performance or timing objectives.
Abstract:
Data mining is one of the most important analysis techniques for automatically extracting knowledge from large amounts of data. Nowadays, data mining is based on low-level specifications of the employed techniques, typically bound to a specific analysis platform. Therefore, data mining lacks a modelling architecture that allows analysts to consider it as a truly software-engineering process. Bearing in mind this situation, we propose a model-driven approach which is based on (i) a conceptual modelling framework for data mining, and (ii) a set of model transformations to automatically generate both the data under analysis (deployed via data-warehousing technology) and the analysis models for data mining (tailored to a specific platform). Thus, analysts can concentrate on understanding the analysis problem via conceptual data-mining models instead of wasting effort on low-level programming tasks related to the underlying platform's technical details. These time-consuming tasks are now entrusted to the model-transformation scaffolding. The feasibility of our approach is shown by means of a hypothetical data-mining scenario where a time series analysis is required.
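As a rough illustration of the idea (a hypothetical Python sketch, not the authors' framework or notation), a platform-independent conceptual mining model can be transformed mechanically into both a warehouse structure and a platform-specific analysis configuration:

from dataclasses import dataclass

@dataclass
class TimeSeriesMiningModel:
    """Conceptual (platform-independent) description of a time-series analysis."""
    name: str
    source_table: str
    time_column: str
    measure_column: str
    horizon: int = 12  # periods to forecast

def to_warehouse_ddl(model: TimeSeriesMiningModel) -> str:
    """Transformation 1: generate the data-warehouse structure feeding the analysis."""
    return (f"CREATE TABLE {model.source_table} (\n"
            f"  {model.time_column} DATE,\n"
            f"  {model.measure_column} DECIMAL(18,2)\n"
            f");")

def to_analysis_config(model: TimeSeriesMiningModel) -> dict:
    """Transformation 2: generate a platform-specific analysis configuration."""
    return {"algorithm": "time_series",
            "input": {"table": model.source_table,
                      "time": model.time_column,
                      "value": model.measure_column},
            "forecast_horizon": model.horizon}

model = TimeSeriesMiningModel("monthly_sales_forecast", "sales_fact", "month", "revenue")
print(to_warehouse_ddl(model))
print(to_analysis_config(model))

In the proposed approach, analogous transformations operate on richer conceptual models and target real warehousing and mining platforms rather than strings and dictionaries.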
Abstract:
Current model-driven Web Engineering approaches (such as OO-H, UWE or WebML) provide a set of methods and supporting tools for the systematic design and development of Web applications. Each method addresses different concerns using separate models (content, navigation, presentation, business logic, etc.), and provides model compilers that produce most of the logic and Web pages of the application from these models. However, these proposals also have some limitations, especially for exchanging models or representing further modeling concerns, such as architectural styles, technology independence, or distribution. A possible solution to these issues is to make model-driven Web Engineering proposals interoperate, so that they can complement each other and exchange models between the different tools. MDWEnet is a recent initiative started by a small group of researchers working on model-driven Web Engineering (MDWE). Its goal is to improve current practices and tools for the model-driven development of Web applications and their interoperability. The proposal is based on the strengths of current model-driven Web Engineering methods, and on the existing experience and knowledge in the field. This paper presents the background, motivation, scope, and objectives of MDWEnet. Furthermore, it reports on the MDWEnet results and achievements so far, and its future plan of action.
Abstract:
Mode of access: Internet.
Abstract:
This paper is a continuation of the paper titled "Concurrent multi-scale modeling of civil infrastructure for analyses on structural deteriorating—Part I: Modeling methodology and strategy", with the emphasis on model updating and verification for the developed concurrent multi-scale model. The sensitivity-based parameter updating method was applied, and important issues such as the selection of reference data and model parameters, and the model updating procedure for the multi-scale model, were investigated based on a sensitivity analysis of the selected model parameters. The experimental modal data, as well as static responses in terms of component nominal stresses and hot-spot stresses at the locations of concern, were used for dynamic response- and static response-oriented model updating, respectively. The updated multi-scale model was further verified to act as the baseline model, which is taken to be the finite-element model closest to the real state of the structure and available for subsequent arbitrary numerical simulation. The comparison of dynamic and static responses between the results calculated with the final model and the measured data indicated that the updating and verification methods applied in this paper are reliable and accurate for multi-scale models of frame-like structures. General procedures for multi-scale model updating and verification were finally proposed for nonlinear physics-based modeling of large civil infrastructure, and they were applied to the model verification of a long-span bridge as an actual engineering application of the proposed procedures.
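The core of sensitivity-based updating is a least-squares fit of selected model parameters to measured reference responses, with the sensitivities of the residuals driving each iteration. The following is a minimal, hypothetical Python sketch on a toy two-degree-of-freedom spring-mass model (not the bridge model of this paper), calibrating stiffness parameters against assumed "measured" natural frequencies:

import numpy as np
from scipy.linalg import eigh
from scipy.optimize import least_squares

M = np.diag([2.0, 1.0])  # known mass matrix (kg)

def natural_frequencies(k):
    """Natural frequencies (Hz) of the 2-DOF chain for stiffnesses k = [k1, k2] (N/m)."""
    k1, k2 = k
    K = np.array([[k1 + k2, -k2],
                  [-k2,      k2]])
    eigvals = eigh(K, M, eigvals_only=True)
    return np.sqrt(eigvals) / (2 * np.pi)

measured = np.array([1.10, 2.85])   # reference (e.g. experimentally identified) frequencies, Hz
k0 = np.array([400.0, 150.0])       # nominal stiffness values from the initial model

# Residuals between predicted and measured frequencies drive the update;
# least_squares forms the sensitivity (Jacobian) by finite differences.
result = least_squares(lambda k: natural_frequencies(k) - measured, k0)
print("Updated stiffnesses:", result.x.round(1))
print("Updated frequencies (Hz):", natural_frequencies(result.x).round(3))

For a concurrent multi-scale model, the same principle applies, with static hot-spot stresses added to the residual vector and the parameter set restricted by the sensitivity analysis.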
Abstract:
Model-based testing (MBT) relies on models of a system under test and/or its environment to derive test cases for the system. This paper discusses the process of MBT and defines a taxonomy that covers the key aspects of MBT approaches. It is intended to help with understanding the characteristics, similarities and differences of those approaches, and with classifying the approach used in a particular MBT tool. To illustrate the taxonomy, a description of how three different examples of MBT tools fit into the taxonomy is provided.
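As a small, hypothetical illustration of the common core of these approaches (not an example from the paper), test cases can be derived from a behavioural model of the system under test; here the model is a tiny finite-state machine of a turnstile and the generation criterion is transition coverage:

from collections import deque

# Model of the system under test: state -> {input -> next state}
turnstile = {
    "locked":   {"coin": "unlocked", "push": "locked"},
    "unlocked": {"coin": "unlocked", "push": "locked"},
}

def transition_cover(model, start):
    """Generate input sequences so that every (state, input) transition is exercised once."""
    tests = []
    for state, moves in model.items():
        for inp in moves:
            # Prefix: shortest input sequence reaching `state` from the start state.
            prefix, seen, queue = None, {start}, deque([(start, [])])
            while queue:
                s, path = queue.popleft()
                if s == state:
                    prefix = path
                    break
                for i, nxt in model[s].items():
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append((nxt, path + [i]))
            if prefix is not None:
                tests.append(prefix + [inp])
    return tests

for case in transition_cover(turnstile, "locked"):
    print(case)  # each printed list is one abstract test case

Where a given MBT tool sits in the taxonomy is then largely a matter of what kind of model it accepts, which generation criteria (such as the transition coverage above) it supports, and how abstract test cases are concretised and executed.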
Abstract:
Human lymphatic vascular malformations (LMs), also known as cystic hygromas or lymphangiomas, consist of multiple lymphatic endothelial cell-lined, lymph-containing cysts. No animal model of this disease exists. To develop a mouse xenograft model of human LM, CD34NegCD31Pos LM lymphatic endothelial cells (LM-LECs) were isolated from surgical specimens and compared to foreskin CD34NegCD31Pos lymphatic endothelial cells (LECs). Cells were implanted into a mouse tissue engineering model for 1, 2 and 4 weeks. In vitro, LM-LECs showed increased proliferation and survival under starvation conditions (P < 0.0005 at 48 h, two-way ANOVA), increased migration (P < 0.001, two-way ANOVA), and formed fewer (P = 0.029, independent samples t test) and shorter (P = 0.029, independent samples t test) tubes than foreskin LECs. In vivo, LM-LECs implanted into a Matrigel™-containing mouse chamber model assembled into vessels with dilated cystic lumens lined with flat endothelium, a morphology similar to that of clinical LMs. Human foreskin LECs failed to survive implantation. In LM-LEC-implanted chambers the percent volume of podoplaninPos vessels was 1.18 ± 2.24 % at 1 week and 6.34 ± 2.68 % at 2 weeks, increasing to 7.67 ± 3.60 % at 4 weeks. In conclusion, the significantly increased proliferation, migration and resistance to apoptosis, and the decreased tubulogenesis, of LM-LECs observed in vitro are likely to account for their survival and assembly into stable LM-like structures when implanted into a mouse vascularised chamber model. This in vivo xenograft model will provide the basis for future studies of LM biology and the testing of potential pharmacological interventions for patients with lymphatic malformations.
Abstract:
In this paper, a model-predictive control (MPC) method is detailed for the control of nonlinear systems with stability considerations. It is assumed that the plant is described by a local input/output ARX-type model, with the control potentially included in the premise variables, which enables the control of systems that are nonlinear in both the state and the control input. Additionally, for the case of set-point regulation, a suboptimal controller is derived which has the dual purpose of ensuring stability and enabling finite-iteration termination of the iterative procedure used to solve the nonlinear optimization problem that determines the control signal.
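For orientation, the basic receding-horizon recipe underlying any such MPC scheme can be sketched as follows (a hypothetical, purely linear Python toy, far simpler than the nonlinear setting of the paper): at each step a finite control horizon is optimised against an ARX prediction model and only the first move is applied.

import numpy as np
from scipy.optimize import minimize

# Toy ARX model: y[k+1] = a1*y[k] + a2*y[k-1] + b1*u[k]  (coefficients assumed identified)
a1, a2, b1 = 1.2, -0.3, 0.5
horizon, setpoint = 5, 1.0

def predict(y_hist, u_seq):
    """Roll the ARX model forward over the horizon for a candidate input sequence."""
    y = list(y_hist)
    for u in u_seq:
        y.append(a1 * y[-1] + a2 * y[-2] + b1 * u)
    return np.array(y[len(y_hist):])

def cost(u_seq, y_hist):
    """Tracking error over the horizon plus a small penalty on control moves."""
    err = predict(y_hist, u_seq) - setpoint
    return np.sum(err ** 2) + 0.1 * np.sum(np.diff(np.r_[0.0, u_seq]) ** 2)

y_hist = [0.0, 0.0]
for step in range(15):
    res = minimize(cost, x0=np.zeros(horizon), args=(y_hist,))
    u_now = res.x[0]                                  # receding horizon: apply first move only
    y_next = a1 * y_hist[-1] + a2 * y_hist[-2] + b1 * u_now
    y_hist = [y_hist[-1], y_next]

print("output after 15 steps:", round(y_hist[-1], 3))  # approaches the set point

The stability machinery and the suboptimal terminating controller discussed in the paper address exactly the issues this naive sketch ignores: guaranteeing that the optimisation can be cut off after finitely many iterations without losing closed-loop stability.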
Abstract:
Within Human-Computer Interaction (HCI) and Computer Supported Cooperative Work (CSCW) research, the notion of technologically-mediated awareness is often used for allowing relevant people to maintain a mental model of each other's activities, behaviors and status information so that they can organize and coordinate work or other joint activities. The initial conceptions of awareness focused largely on improving productivity and efficiency within work environments. With new social, cultural and commercial needs and the emergence of novel computing technologies, the focus of technologically-mediated awareness has extended from work environments to people's everyday interactions. Hence, the scope of awareness has extended from conveying work-related activities to people's emotions, love, social status and a broad range of other aspects. This trend of conceptualizing HCI design is termed experience-focused HCI. In my PhD dissertation, Designing for Awareness, I have reported on how we, as HCI researchers, can design awareness systems from an experience-focused HCI perspective, following the trend of conveying awareness beyond task-based, instrumental and productive needs. Within the overall aim of designing for awareness, my research advocates ethnomethodologically-informed approaches for conceptualizing and designing for awareness. In this sense, awareness is not a predefined phenomenon but something that is situated and particular to a given environment. I have used this approach in two design cases of developing interactive systems that support awareness beyond task-based aspects in work environments. In both cases, I followed a complete design cycle: collecting an in-situ understanding of an environment, developing implications for a new technology, implementing a prototype technology, and studying the use of the technology in its natural setting.