776 results for Engineering design -- Study and teaching (Higher)
Abstract:
This paper analyzes the possibilities of integrating cost information and engineering design. Special emphasis is put on finding the potential of using the activity-based costing (ABC) method. Today, the problem of cost estimation in engineering design is that there are two separate extremes of knowledge. At one extreme, engineers model the technical parameters behind costs in great detail but do not get appropriate cost information into their elegant models. At the other extreme, accounting professionals are stuck with traditional cost accounting methods driven by the procedures and cycles of financial accounting. Therefore, in many cases, the cost information needs of various decision-making groups, for example design engineers, are not served satisfactorily. This paper studies whether the activity-based costing (ABC) method could offer a compromise between the two extremes. Recognizing activities and activity chains, as well as activity and cost drivers, could be especially beneficial for design engineers. Also, knowing the accurate and reliable product costs of existing products helps when doing variant design. However, ABC is not at its best if the cost system becomes too complicated. This is why a comprehensive ABC cost information system with detailed cost information for the use of design engineers should be examined critically. ABC is at its best when considering such issues as which activities drive costs, the cost of product complexity, the allocation of indirect costs to products, the relationships between processes and costs, and the cost of excess capacity.
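The core ABC mechanics the abstract refers to, allocating indirect costs to products via activity drivers, can be sketched in a few lines. The activity names, rates and driver volumes below are hypothetical illustrations, not figures from the paper:

```python
# Minimal activity-based costing sketch (illustrative figures only).
# Each activity pool's cost is divided by its total driver volume to get a
# driver rate; a product's indirect cost is the sum over activities of
# (driver rate) * (driver units the product consumes).

activity_pools = {  # activity -> (total indirect cost, total driver volume)
    "machine_setups": (50_000.0, 200),  # driver: number of setups
    "design_changes": (30_000.0, 150),  # driver: number of change orders
}

def driver_rates(pools):
    """Cost per driver unit for each activity."""
    return {a: cost / volume for a, (cost, volume) in pools.items()}

def product_indirect_cost(usage, rates):
    """usage: activity -> driver units consumed by the product."""
    return sum(rates[a] * units for a, units in usage.items())

rates = driver_rates(activity_pools)
# A complex variant consumes many setups and change orders, so ABC makes
# the cost of product complexity visible.
complex_variant = {"machine_setups": 12, "design_changes": 9}
print(product_indirect_cost(complex_variant, rates))
```

A design engineer comparing variants would rerun `product_indirect_cost` with each variant's driver usage, which is exactly the kind of cost feedback the paper argues traditional accounting fails to deliver.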
Abstract:
Peer-reviewed
Abstract:
Teachers of the course Introduction to Mathematics for Engineers at the UOC, an online distance-learning university, have designed, developed and tested an online study material. It includes basic pre-university mathematics, indications for correct follow-up of this content and recommendations for finding appropriate support and complementary materials. Many different resources are used, depending on the characteristics of the contents: Flash sequences, interactive applets, WIRIS calculators and PDF files. During the last semester, the new study material has been tested with 119 students. The academic results and student satisfaction have allowed us to outline and prioritise future lines of action.
Abstract:
Teachers of the course Introduction to Mathematics for Engineers at the UOC, an online distance-learning university, have designed and produced online study material which includes basic pre-university mathematics, instructions for correct follow-up of this content and recommendations for finding appropriate support and complementary materials.
Abstract:
Peer-reviewed
Abstract:
This study evaluates the use of role-playing games (RPGs) as a methodological approach for teaching cellular biology, assessing student satisfaction, learning outcomes, and retention of acquired knowledge. First-year undergraduate medical students at two Brazilian public universities attended either an RPG-based class (RPG group) or a lecture (lecture-based group) on topics related to cellular biology. Pre- and post-RPG-based class questionnaires were compared to scores in regular exams and in an unannounced test one year later to assess students' attitudes and learning. From the 230 students that attended the RPG classes, 78.4% responded that the RPG-based classes were an effective tool for learning; 55.4% thought that such classes were better than lectures but did not replace them; and 81% responded that they would use this method. The lecture-based group achieved a higher grade in 1 of 14 regular exam questions. In the medium-term evaluation (one year later), the RPG group scored higher in 2 of 12 questions. RPG classes are thus quantitatively as effective as formal lectures, are well accepted by students, and may serve as educational tools, giving students the chance to learn actively and potentially retain the acquired knowledge more efficiently.
Abstract:
Introduction: The Chronic Kidney Disease Outcomes and Practice Patterns Study (CKDopps) is an international observational, prospective cohort study involving patients with chronic kidney disease (CKD) stages 3-5 [estimated glomerular filtration rate (eGFR) < 60 ml/min/1.73 m2], with a major focus on care during the advanced CKD period (eGFR < 30 ml/min/1.73 m2). During a 1-year enrollment period, each of the 22 selected clinics will enroll up to 60 advanced CKD patients (eGFR < 30 ml/min/1.73 m2 and not dialysis-dependent) and 20 earlier-stage CKD patients (eGFR between 30 and 59 ml/min/1.73 m2). Exclusion criteria: age < 18 years, chronic dialysis, or prior kidney transplant. The study timeline includes up to one year for enrollment of patients at each clinic, starting at the end of 2013, followed by up to 2-3 years of patient follow-up with collection of detailed longitudinal patient-level data, annual clinic practice-level surveys, and patient surveys. Analyses will apply regression models to evaluate the contribution of patient-level and clinic practice-level factors to study outcomes, and will utilize instrumental variable-type techniques when appropriate. Conclusion: Launching in 2013, CKDopps Brazil will study advanced CKD care in a random selection of nephrology clinics across Brazil to gain understanding of variation in care across the country and, as part of a multinational study, to identify optimal treatment practices to slow kidney disease progression and improve outcomes during the transition period to end-stage kidney disease.
Abstract:
The intent in this study was to investigate in what ways teachers' beliefs about education and teaching are expressed in the specific teaching behaviours they employ, and whether teaching behaviours, as perceived by their students, are correlated with students' critical thinking and self-directed learning. To this end the relationships studied were: among faculty members' philosophy of teaching, locus of control orientation, psychological type, and observed teaching behaviour; and among students' psychological type, perceptions of teaching behaviour, self-directed learning readiness, and critical thinking. The overall purpose of the study was to investigate whether the implicit goals of higher education, critical thinking and self-direction, were actually accounted for in the university classroom. The research was set within the context of path-goal theory, adapted from the leadership literature. Within this framework, Mezirow's work on transformative learning, including the influences of Habermas' writings, was integrated to develop a theoretical perspective upon which to base the research methodology. Both qualitative and quantitative methodologies were incorporated. Four faculty and a total of 142 students participated in the study. Philosophy of teaching was described through faculty interviews and completion of a repertory grid. Faculty completed a descriptive locus of control scale and a psychological type test. Observations of their teaching behaviour were conducted. Students completed a Teaching Behaviour Assessment Scale, the Self-Directed Learning Readiness Scale, a psychological type test, and the Watson-Glaser Critical Thinking Appraisal. A small sample of students was interviewed. Follow-up discussions with faculty were used to validate the interview, observation, teaching behaviour, and repertory grid data. Results indicated that some discrepancies existed between faculty's espoused philosophy of teaching and their observed teaching behaviour. 
Instructors' teaching behaviour, however, was a function of their personal theory of practice. Relationships were found between perceived teaching behaviour and students' self-directed learning and critical thinking, but these varied across situations, as would be predicted from path-goal theory. Psychological type of students and instructor also accounted for some of the variability in the relationships studied. Student psychological type could be shown as a partial predictor of self-directed learning readiness. The results were discussed in terms of theory development and implications for further research and practice.
Abstract:
This study explores how effectively current research assistantships impart research methods, skills, and attitudes; and how well those experiences prepare the next generation of researchers to meet the evolving needs of an ever-expanding, knowledge-based economy and society. Through personal interviews, 7 graduate student research assistants expressed their perceptions regarding their research assistantships. The open-ended interview questions emphasized (a) what research knowledge and skills the graduate students acquired; (b) what other lessons they took away from the experience; and (c) how the research assistantships influenced their graduate studies and future academic plans. After participants were interviewed, the data were transcribed, member-checked, and then analyzed using a grounded theory research design. The findings show that research assistantships are valuable educational venues that can not only promote research learning but also benefit research assistants' master's studies and stimulate reflection regarding their future educational and research plans. Although data are limited to the responses of 7 students, findings can contribute to the enhancement of research assistantship opportunities as a means of developing skilled future researchers that in turn will benefit Canada as an emerging leader in research and development. The study is meant to serve as an informative source for (a) experienced researchers who have worked with research assistants; (b) researchers who are planning to hire research assistants; and (c) experienced and novice research assistants. Further, the study has the potential to inform future research training initiatives as well as related policies and practices.
Abstract:
This thesis contributes to a general theory of project design. Responding to a demand shaped by the challenges of sustainable development, the main objective of this research is to contribute a theoretical model of design that better situates the use of tools and standards for assessing a project's sustainability. The fundamental principles of these normative instruments are analyzed along four dimensions: ontological, methodological, epistemological and teleological. Indicators of certain counter-productive effects related, in particular, to compliance with these standards confirm the need for a theory of qualitative judgment. Our main hypothesis builds on the conceptual framework offered by the notion of the "precautionary principle", whose first formulations date back to the early 1970s and which were precisely intended to remedy the shortcomings of traditional scientific assessment tools and methods. The thesis is divided into five parts. Beginning with a historical review of classical models of design thinking, it focuses on the evolution of the ways in which sustainability has been taken into account. From this perspective, we observe that theories of "green design" dating from the early 1960s, and theories of "ecological design" from the 1970s and 1980s, ultimately converged with the more recent theories of "sustainable design" that emerged in the early 1990s. The various approaches to the precautionary principle are then examined from the standpoint of project sustainability. Standard risk-assessment methods are compared with approaches based on the precautionary principle, revealing certain limits in the design of a project. 
A first theoretical model of design integrating the main dimensions of the precautionary principle is thus sketched. This model proposes a global vision for judging a project that integrates principles of sustainable development, and presents itself as an alternative to traditional risk-assessment approaches, which are both deterministic and instrumental. The precautionary-principle hypothesis is then proposed and examined in the specific context of the architectural project. This exploration begins with a presentation of the classical notion of "prudence" as it was historically used to guide architectural judgment. What, then, of the challenges posed by the judgment of architectural projects amid the rise of standardized assessment methods (e.g. Leadership in Energy and Environmental Design; LEED)? The thesis proposes a reinterpretation of design theory as put forward by Donald A. Schön as a way of taking into account assessment tools such as LEED. This exercise, however, reveals an epistemological obstacle that must be addressed in a reformulation of the model. In keeping with constructivist epistemology, a new theoretical model is then confronted with the study and illustration of three contemporary Canadian architecture competitions that adopted the LEED-standardized sustainability assessment method. A preliminary series of "tensions" is identified in the process of designing and judging projects. These tensions are then categorized into their conceptual counterparts, constructed at the intersection of the precautionary principle and design theories. They fall into four categories: (1) conceptualization: analogical/logical; (2) uncertainty: epistemological/methodological; (3) comparability: interpretive/analytical; and (4) proposition: universality/contextual relevance. 
These conceptual tensions are treated as vectors that correlate with the theoretical model and help enrich it, without constituting validations in the positivist sense of the term. These confrontations with real cases make it possible to better define the epistemological obstacle identified earlier. This thesis thus highlights the generally underestimated impacts of environmental standards on the process of designing and judging projects, taking as a non-restrictive example the examination of Canadian architecture competitions for public buildings. The conclusion underscores the need for a new form of "reflexive prudence" and for a more critical use of current sustainability assessment tools. It calls for an instrumentation founded on global integration rather than on the opposition of environmental approaches.
Abstract:
Land use is a crucial link between human activities and the natural environment and one of the main driving forces of global environmental change. Large parts of the terrestrial land surface are used for agriculture, forestry, settlements and infrastructure. Given the importance of land use, it is essential to understand the multitude of influential factors and resulting land-use patterns. An essential methodology to study and quantify such interactions is provided by land-use models. By applying land-use models, it is possible to analyze the complex structure of linkages and feedbacks and also to determine the relevance of driving forces. Modeling land use and land-use changes has a long tradition. On the regional scale in particular, a variety of models for different regions and research questions has been created. Modeling capabilities grow with steady advances in computer technology, driven on the one hand by increasing computing power and on the other hand by new methods in software development, e.g. object- and component-oriented architectures. In this thesis, SITE (Simulation of Terrestrial Environments), a novel framework for integrated regional land-use modeling, is introduced and discussed. Particular features of SITE are the notably extended capability to integrate models and the strict separation of application and implementation. These features enable efficient development, testing and use of integrated land-use models. On its system side, SITE provides generic data structures (grid, grid cells, attributes etc.) and takes over the responsibility for their administration. By means of a scripting language (Python) that has been extended with language features specific to land-use modeling, these data structures can be utilized and manipulated by modeling applications. The scripting language interpreter is embedded in SITE. 
The integration of sub-models can be achieved via the scripting language or by using a generic interface provided by SITE. Furthermore, functionalities important for land-use modeling, such as model calibration, model tests and analysis support for simulation results, have been integrated into the generic framework. During the implementation of SITE, specific emphasis was laid on expandability, maintainability and usability. Along with the modeling framework, a land-use model for the analysis of the stability of tropical rainforest margins was developed in the context of the collaborative research project STORMA (SFB 552). In a research area in Central Sulawesi, Indonesia, socio-environmental impacts of land-use changes were examined. SITE was used to simulate land-use dynamics in the historical period from 1981 to 2002. Analogously, a scenario that did not consider migration in the population dynamics was analyzed. For the calculation of crop yields and trace gas emissions, the DAYCENT agro-ecosystem model was integrated. In this case study, it could be shown that land-use changes in the Indonesian research area were mainly characterized by the expansion of agricultural areas at the expense of natural forest. For this reason, the situation had to be interpreted as unsustainable, even though increased agricultural use implied economic improvements and higher farmers' incomes. Due to the importance of model calibration, it was explicitly addressed in the SITE architecture through the introduction of a specific component. The calibration functionality can be used by all SITE applications and enables largely automated model calibration. Calibration in SITE is understood as a process that finds an optimal, or at least adequate, solution for a set of arbitrarily selectable model parameters with respect to an objective function. In SITE, an objective function typically is a map comparison algorithm capable of comparing a simulation result to a reference map. 
Several map optimization and map comparison methodologies are available and can be combined. The STORMA land-use model was calibrated using a genetic algorithm for optimization and the figure-of-merit map comparison measure as the objective function. The calibration period ranged from 1981 to 2002; for this period, respective reference land-use maps were compiled. It could be shown that an efficient automated model calibration with SITE is possible. Nevertheless, the selection of the calibration parameters required detailed knowledge about the underlying land-use model and cannot be automated. In another case study, decreases in crop yields and resulting losses in income from coffee cultivation were analyzed and quantified under the assumption of four different deforestation scenarios. For this task, an empirical model describing the dependence of bee pollination, and the resulting coffee fruit set, on the distance to the closest natural forest was integrated. Land-use simulations showed that, depending on the magnitude and location of ongoing forest conversion, pollination services are expected to decline continuously. This results in a reduction of coffee yields of up to 18% and a loss of net revenues per hectare of up to 14%. However, the study also showed that ecological and economic values can be preserved if patches of natural vegetation are conserved in the agricultural landscape.
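The figure-of-merit objective mentioned above can be sketched as follows. This is an illustrative reimplementation of the standard three-map figure of merit, not SITE code, and the toy maps (flat lists of land-use class codes) are hypothetical:

```python
# Illustrative figure-of-merit objective for land-use model calibration
# (a sketch, not actual SITE code). Maps are flat lists of land-use codes;
# the score compares observed change against simulated change.

def figure_of_merit(initial, observed, simulated):
    """FoM = hits / (hits + misses + false alarms).

    hits         : cells where observed change was simulated correctly
    misses       : cells with observed change the simulation got wrong
    false alarms : cells with observed persistence simulated as change
    """
    hits = misses = false_alarms = 0
    for ini, obs, sim in zip(initial, observed, simulated):
        obs_changed = obs != ini
        sim_changed = sim != ini
        if obs_changed and sim == obs:
            hits += 1
        elif obs_changed:
            misses += 1
        elif sim_changed:
            false_alarms += 1
    denom = hits + misses + false_alarms
    return hits / denom if denom else 1.0  # 1.0: no change anywhere, none simulated

# Forest (0) converted to agriculture (1) in one observed cell; the simulated
# map reproduces that conversion but also converts one extra cell.
initial   = [0, 0, 0, 1, 1]
observed  = [1, 0, 0, 1, 1]
simulated = [1, 1, 0, 1, 1]
print(figure_of_merit(initial, observed, simulated))  # 1 hit, 1 false alarm -> 0.5
```

A calibration component like the one described would then ask an optimizer (in the STORMA case, a genetic algorithm) to choose model parameters maximizing this score against the 1981-2002 reference maps.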
Abstract:
This paper presents the findings of a podcasting trial held in 2007-2008 within the Faculty of Economics and Business at the University of Sydney, Australia. The trial investigates the value of using short-format podcasts to support assessment for postgraduate and undergraduate students. A multi-method approach is taken in investigating perceptions of the benefits of podcasting, incorporating surveys, focus groups and interviews. The results show that a majority of students believe they gained learning benefits from the podcasts and appreciated the flexibility of the medium to support their learning, and that the lecturers felt the innovation helped diversify their pedagogical approach and support a diverse student population. Three primary conclusions are presented: (1) most students reject the mobile potential of podcasting in favour of their traditional study space at home; (2) what students and lecturers value about this podcasting design overlaps; (3) the assessment-focussed, short-format podcast design may be considered a successful podcasting model. The paper finishes by identifying areas for future research on the effective use of podcasting in learning and teaching.
Abstract:
The characteristics of service independence and flexibility of ATM networks make the control problems of such networks very critical. One of the main challenges in ATM networks is to design traffic control mechanisms that enable both economically efficient use of the network resources and the desired quality of service for higher-layer applications. Window flow control mechanisms of traditional packet-switched networks are not well suited to real-time services at the speeds envisaged for future networks. In this work, the utilisation of the Probability of Congestion (PC) as a bandwidth decision parameter is presented. The validity of using PC is compared with QoS parameters in buffer-less environments, where only the cell loss ratio (CLR) parameter is relevant. The convolution algorithm is a good solution for connection admission control (CAC) in ATM networks with small buffers. If the source characteristics are known, the actual CLR can be estimated very well. Furthermore, this estimation is always conservative, allowing the retention of the network performance guarantees. Several experiments have been carried out and investigated to explain the deviation between the proposed method and the simulation. Time parameters for burst length and different buffer sizes have been considered. Experiments to confine the limits of the burst length with respect to the buffer size conclude that a minimum buffer size is necessary to achieve adequate cell contention. Note that propagation delay is a limit that cannot be dismissed for long-distance and interactive communications, so small buffers must be used in order to minimise delay. Under the previous premises, the convolution approach is the most accurate method used in bandwidth allocation. This method gives sufficient accuracy in both homogeneous and heterogeneous networks. However, the convolution approach has a considerable computational cost and a high number of accumulated calculations. 
To overcome these drawbacks, a new method of evaluation is analysed: the Enhanced Convolution Approach (ECA). In ECA, traffic is grouped into classes of identical parameters. By using the multinomial distribution function instead of the formula-based convolution, a partial state corresponding to each class of traffic is obtained. Finally, the global state probabilities are evaluated by multi-convolution of the partial results. This method avoids accumulated calculations and saves storage requirements, especially in complex scenarios. Sorting is the dominant factor for the formula-based convolution, whereas cost evaluation is the dominant factor for the enhanced convolution. A set of cut-off mechanisms is introduced to reduce the complexity of the ECA evaluation. The ECA also computes the CLR for each j-class of traffic (CLRj); an expression for the CLRj evaluation is also presented. We can conclude that, by combining the ECA method with cut-off mechanisms, utilisation of ECA in real-time CAC environments as a single-level scheme is always possible.
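The convolution idea underlying both approaches can be illustrated with a small sketch. The on/off source model, the parameter values and the excess-rate CLR estimate below are simplifying assumptions for illustration, not the thesis's actual formulation:

```python
# Illustrative convolution approach for CAC on a buffer-less link (a sketch,
# assuming simple on/off sources). Each source emits at `peak` cells/s with
# probability p = mean/peak, else 0. Convolving the per-source distributions
# yields the aggregate rate distribution, from which the Probability of
# Congestion (PC) and a CLR estimate follow.

from collections import defaultdict

def convolve(dist, peak, p):
    """Fold one on/off source into the aggregate rate distribution."""
    out = defaultdict(float)
    for rate, prob in dist.items():
        out[rate] += prob * (1 - p)   # source silent
        out[rate + peak] += prob * p  # source emitting at peak rate
    return dict(out)

def aggregate(sources):
    """sources: list of (peak, mean) pairs."""
    dist = {0: 1.0}
    for peak, mean in sources:
        dist = convolve(dist, peak, mean / peak)
    return dist

def congestion_and_clr(dist, capacity):
    # PC: probability that the aggregate rate exceeds link capacity.
    pc = sum(prob for rate, prob in dist.items() if rate > capacity)
    # CLR estimate: expected excess rate (lost cells) over offered load.
    mean_load = sum(rate * prob for rate, prob in dist.items())
    excess = sum((rate - capacity) * prob
                 for rate, prob in dist.items() if rate > capacity)
    clr = excess / mean_load if mean_load else 0.0
    return pc, clr

dist = aggregate([(10, 2)] * 5)  # five identical sources, each on 20% of the time
pc, clr = congestion_and_clr(dist, capacity=30)
```

The ECA refinement the abstract describes would replace the source-by-source loop in `aggregate` with one multinomial evaluation per class of identical sources, then multi-convolve the per-class partial states, avoiding the accumulated per-source calculations.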