834 results for Design and Technology, Professional Development, Curriculum Implementation
Abstract:
The significance of services as business and human activities has increased dramatically throughout the world in the last three decades. Becoming an increasingly competitive and efficient service provider while still providing unique value opportunities for customers requires new knowledge and ideas. Part of this knowledge is created and utilized in daily activities in every service organization, but not all of it, and therefore an emerging phenomenon in the service context is information awareness. Terms like big data and the Internet of Things are not merely modern buzzwords; they also describe urgent requirements for new types of competences and solutions. When the amount of information increases and the systems processing information become more efficient and intelligent, it is human understanding and objectives that may become separated from the automated processes and technological innovations. This is an important challenge and the core driver for this dissertation: What kind of information is created, possessed and utilized in the service context, and, even more importantly, what information exists but is not acknowledged or used? In this dissertation the focus is on the relationship between service design and service operations. Reframing this relationship means viewing the service system from an architectural perspective. The selected perspective allows analysing the relationship between design activities and operational activities as an information system while maintaining a tight connection to existing service research contributions and approaches. This innovative approach is supported by a research methodology that relies on design science theory. The methodological process supports the construction of a new design artifact based on existing theoretical knowledge, the creation of new innovations, and the testing of the design artifact components in real service contexts.
The relationship between design and operations is analysed in health care and social care service systems. Existing contributions in service research tend to abstract services and service systems as value-creation, working or interactive systems. This dissertation adds an important information processing system perspective to the research. The main contribution focuses on the following argument: only part of the service information system is automated and computerized, whereas a significant part of information processing is embedded in human activities, communication and ad hoc reactions. The results indicate that the relationship between service design and service operations is more complex and dynamic than existing scientific and managerial models tend to assume. Both activities create, utilize, mix and share information, making service information management a necessary but relatively unexplored managerial task. On the architectural level, service system-specific elements seem to disappear, but access to more general information elements and processes can be found. While this dissertation focuses on conceptual-level design artifact construction, the results also provide very practical implications for service providers. Personal, visual and hidden activities of a service, and more importantly all changes that take place in any service system, also have an information dimension. Making this information dimension visible and prioritizing the processed information based on service dimensions is likely to provide new opportunities to improve activities and offer a new type of service potential for customers.
Abstract:
Technological innovations, the development of the internet, and globalization have increased the number and complexity of web applications. As a result, keeping web user interfaces understandable and usable (in terms of ease of use, effectiveness, and satisfaction) is a challenge. As part of this, designing user-intuitive interface signs (i.e., the small elements of a web user interface, e.g., navigational links, command buttons, icons, small images, thumbnails, etc.) is an issue for designers. Interface signs are key elements of web user interfaces because they act as communication artefacts to convey web content and system functionality, and because users interact with systems by means of interface signs. In light of the above, applying semiotic concepts (semiotics being the study of signs) to web interface signs will help to discover new and important perspectives on web user interface design and evaluation. The thesis mainly focuses on web interface signs and uses the theory of semiotics as a background theory. The underlying aim of this thesis is to provide valuable insights for designing and evaluating web user interfaces from a semiotic perspective in order to improve overall web usability. The fundamental research question is formulated as: What do practitioners and researchers need to be aware of from a semiotic perspective when designing or evaluating web user interfaces to improve web usability? From a methodological perspective, the thesis follows a design science research (DSR) approach. A systematic literature review and six empirical studies are carried out in this thesis. The empirical studies were carried out with a total of 74 participants in Finland. The steps of a design science research process were followed while the studies were designed and conducted: (a) problem identification and motivation, (b) definition of the objectives of a solution, (c) design and development, (d) demonstration, (e) evaluation, and (f) communication.
The data were collected through observations in a usability testing lab, analytical (expert) inspection, questionnaires, and structured and semi-structured interviews. User behaviour analysis, qualitative analysis and statistics were used to analyze the study data. The results are summarized as follows and have led to the following contributions. Firstly, the results present the current status of semiotic research in UI design and evaluation and highlight the importance of considering semiotic concepts in UI design and evaluation. Secondly, the thesis explores interface sign ontologies (i.e., the sets of concepts and skills that a user should know to interpret the meaning of interface signs) by providing a set of ontologies used to interpret the meaning of interface signs, and a set of features related to ontology mapping in interpreting the meaning of interface signs. Thirdly, the thesis explores the value of integrating semiotic concepts into usability testing. Fourthly, the thesis proposes a semiotic framework (Semiotic Interface sign Design and Evaluation – SIDE) for interface sign design and evaluation in order to make signs intuitive for end users and to improve web usability. The SIDE framework includes a set of determinants and attributes of user-intuitive interface signs, and a set of semiotic heuristics for designing and evaluating interface signs. Finally, the thesis assesses (a) the quality of the SIDE framework in terms of performance metrics (e.g., thoroughness, validity, effectiveness, reliability, etc.) and (b) the contributions of the SIDE framework from the evaluators' perspective.
Abstract:
In today’s world, because of rapid advances in technology and business, requirements are often unclear and change continuously during the development process. These changing requirements make software development very difficult. Using traditional software development methods such as the waterfall method is not a good option, as traditional methods are not flexible towards changing requirements, and the software can end up late and over budget. To develop high-quality software that satisfies the customer, organizations can use software development methods, such as agile methods, that are flexible to requirement changes at any stage of the development process. Agile methods are iterative and incremental methods that can accelerate the delivery of initial business value through continuous planning and feedback, with close communication between the customer and the developers. The main purpose of the current thesis is to identify the problems in traditional software development and to show how agile methods reduce those problems. The study also examines the different success factors of agile methods, the success rate of agile projects, and a comparison between traditional and agile software development.
Abstract:
An analysis of Brazilian federal expenditures in science and technology is presented in this study. The 1990-1999 data were compiled from records provided by the two federal agencies (MCT and CNPq) responsible for managing most of the national budget related to these activities. The results indicate that federal investments in Brazilian science and technology stagnated during the last decade (US$ 2.32 billion in 1990, US$ 2.39 billion in 1996, and US$ 2.36 billion in 1999). In contrast, a great increase in private investments in research was acknowledged both by industry and by the government during the same period, from US$ 2.12 to US$ 4.64 billion. However, this investment did not result in an increase in invention patents granted to residents (492 in 1990 and only 232 in 1997) or in a reduction of patent costs. Despite this unfavorable scenario, the number of graduate programs in the country has increased two-fold in the last decade, and the contribution of Brazilians to the database of the Institute for Scientific Information has increased 4.7-fold from 1990 (2,725 scientific publications) to 2000 (12,686 scientific publications). Unstable federal resources for science, together with the poor returns of private resources in terms of developing new technologies, may jeopardize the future of Brazilian technological development.
Abstract:
Oligonucleotides have a wide range of applications in fields such as biotechnology, molecular biology, diagnosis and therapy. However, the spectrum of uses can be broadened by introducing chemical modifications into their structures. The most prolific field in the search for new oligonucleotide analogs is the antisense strategy, where chemical modifications confer appropriate characteristics such as hybridization, resistance to nucleases, cellular uptake, selectivity and, fundamentally, good pharmacokinetic and pharmacodynamic properties. Combinatorial technology is another research area where oligonucleotides and their analogs are extensively employed. Aptamers, new catalytic ribozymes and deoxyribozymes are RNA or DNA molecules isolated from a randomly synthesized library on the basis of a particular property. They are identified by repeated cycles of selection and amplification, using PCR technologies. Modified nucleotides can be introduced either during the amplification procedure or after selection.
Abstract:
Coronary artery disease is an atherosclerotic disease which leads to narrowing of the coronary arteries, deteriorated myocardial blood flow and myocardial ischaemia. In acute myocardial infarction, a prolonged period of myocardial ischaemia leads to myocardial necrosis, and the necrotic myocardium is replaced with scar tissue. Myocardial infarction results in various changes in cardiac structure and function over time, a process known as “adverse remodelling”. This remodelling may result in a progressive worsening of cardiac function and the development of chronic heart failure. In this thesis, we developed and validated three different large animal models of coronary artery disease, myocardial ischaemia and infarction for translational studies. In the first study, the coronary artery disease model had both induced diabetes and hypercholesterolemia. In the second study, myocardial ischaemia and infarction were caused by a surgical method, and in the third study by catheterisation. For model characterisation, we used non-invasive positron emission tomography (PET) methods to measure myocardial perfusion, oxidative metabolism and glucose utilisation. Additionally, cardiac function was measured by echocardiography and computed tomography. To study the metabolic changes that occur during atherosclerosis, the hypercholesterolemic and diabetic model was used with [18F]fluorodeoxyglucose ([18F]FDG) PET imaging. Coronary occlusion models were used to evaluate metabolic and structural changes in the heart and the cardioprotective effects of levosimendan during post-infarction cardiac remodelling. The large animal models were also used in testing novel radiopharmaceuticals for myocardial perfusion imaging. In the coronary artery disease model, we observed atherosclerotic lesions that were associated with focally increased [18F]FDG uptake.
In the heart failure models, chronic myocardial infarction led to worsening of systolic function, cardiac remodelling and decreased efficiency of the cardiac pumping function. Levosimendan therapy reduced post-infarction myocardial infarct size and improved cardiac function. The novel 68Ga-labeled radiopharmaceuticals tested in this study were not successful for the determination of myocardial blood flow. In conclusion, diabetes and hypercholesterolemia lead to the development of early-phase atherosclerotic lesions. Coronary artery occlusion produced considerable myocardial ischaemia and later infarction, followed by myocardial remodelling. The experimental models evaluated in these studies will enable further studies of disease mechanisms, new radiopharmaceuticals and interventions in coronary artery disease and heart failure.
Abstract:
Global energy consumption has been increasing yearly, and a large portion of it is used in rotating electrical machinery, so it is clear that these machines should use energy efficiently. In this dissertation the aim is to improve the design process of high-speed electrical machines, especially from the mechanical engineering perspective, in order to achieve more reliable and efficient machines. The design process of high-speed machines is challenging due to high demands and the several interactions between different engineering disciplines such as mechanical, electrical and energy engineering. A multidisciplinary design flow chart utilizing computer simulation is proposed for a specific type of high-speed machine. In addition to utilizing simulation in parallel with the design process, two simulation studies are presented. The first is used to find the limits of two ball bearing models. The second is used to study the improvement of machine load capacity in a compressor application to exceed the limits of current machinery. The proposed flow chart and the simulation studies show clearly that improvements in the high-speed machinery design process can be achieved. Engineers designing high-speed machines can utilize the flow chart and simulation results as a guideline during the design phase to achieve more reliable and efficient machines that use energy efficiently in the different operating conditions required.
Abstract:
This study examined the strategies used by elementary school principals to facilitate and nurture the development of professional learning communities (PLC) within their school settings. Using a reputational sample of administrators whose schools were demonstrating observable characteristics of PLCs, this study documented and described the strategies and actions taken by the principals to move their schools forward. Data collection included the use of open-ended interviews as well as observations capturing the means by which the principals addressed the areas of culture, processes, and structures within their school setting. A grounded theory approach to data analysis uncovered 4 guiding principles used by the principals to facilitate the development of the PLCs within their school: (a) protecting the purpose; (b) attending to relationships; (c) sharing the responsibility; and (d) valuing the journey. The guiding principles were used by each administrator to anchor the decisions they made and develop responsive, context-specific strategies to support the PLC at their school. The results highlighted the complex role of the principal and the supports required to tackle the difficult work of facilitating PLCs.
Abstract:
This research explored the elements that contribute to staff nurses' commitment to lifelong professional development. This exploration was undertaken to provide insights into the factors that motivate individuals to continue their education for professional development and for clinical practice improvement. The study was conducted in an acute care hospital in Southern Ontario, and investigated the thoughts and experiences of health care staff working within that setting. A qualitative case study was undertaken which involved the collection of interview, document, and class observation data. Two exemplary clinical nurse educators and two motivated, professionally committed staff nurses were interviewed during the study. Teaching document review and observation of classes involving the clinical nurse educators were conducted to facilitate triangulation of findings with data sources and strategies. These participants provided rich data that were captured in field notes and coded for conceptual meaning. Emerging from the data was the identification of three major elements of influence that contribute to staff nurses' commitment to lifelong professional development. Identified within the three intersecting spheres of influence upon staff nurses' lifelong commitment to professional learning were the environment, the clinical nurse educator, and the staff nurse. This research explored the intersecting spheres of influence and the elements within the partnership model of professional education for staff nurses.
Science and technology of rubber reclamation with special attention to NR-based waste latex products
Abstract:
A comprehensive overview of the reclamation of cured rubber, with special emphasis on latex reclamation, is presented in this paper. The latex industry has expanded over the years to meet world demand for gloves, condoms, latex thread, etc. Due to the strict specifications for the products and the unstable nature of the latex, as much as 15% of final latex products are rejected. As waste latex rubber (WLR) represents a source of high-quality rubber hydrocarbon, it is a potential candidate for generating reclaimed rubber of superior quality. The role of the different components in the reclamation recipe is explained, and the reaction mechanism and chemistry during reclamation are discussed in detail. Different types of reclaiming processes are described, with special reference to processes which selectively cleave the crosslinks in the vulcanized rubber. The state-of-the-art techniques of reclamation, with special attention to latex treatment, are reviewed. An overview of the latest developments concerning fundamental studies in the field of rubber recycling by means of low-molecular-weight compounds is presented. A mathematical model description of main-chain and crosslink scission during devulcanization of a rubber vulcanizate is also given.
Abstract:
The propagation of electromagnetic waves through a microstrip line with 2D electromagnetic band gap (EBG) structures of different geometrical shapes in the ground plane is investigated in this paper. Using transmission-line theory, the design equations for the EBG structures are calculated. The measured, numerical, and simulated results are in good agreement.
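Transmission-line design of this kind typically starts from closed-form microstrip formulas. As a hedged illustration of that starting point — Hammerstad's widely used approximations, not the paper's specific EBG design equations — a minimal Python sketch:

```python
import math

def microstrip_z0(w_h: float, eps_r: float) -> float:
    """Characteristic impedance (ohms) of a microstrip line from the
    standard Hammerstad closed-form equations, given the width-to-height
    ratio w_h = W/h and the substrate relative permittivity eps_r."""
    if w_h <= 1.0:
        # Narrow-line branch: effective permittivity with fringing correction.
        eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 * (
            1 / math.sqrt(1 + 12 / w_h) + 0.04 * (1 - w_h) ** 2)
        return 60 / math.sqrt(eps_eff) * math.log(8 / w_h + w_h / 4)
    # Wide-line branch.
    eps_eff = (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 / w_h)
    return (120 * math.pi / math.sqrt(eps_eff)
            / (w_h + 1.393 + 0.667 * math.log(w_h + 1.444)))
```

Under these formulas, a nominally 50 Ω line on an FR-4-like substrate (eps_r ≈ 4.4) corresponds to W/h ≈ 1.9; the EBG patterns in the ground plane then perturb this baseline impedance to create the stopband.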
Abstract:
In this work we present the results of our attempt to build a compact photothermal spectrometer capable of both manual and automated modes of operation. The salient features of the system include the ability to analyse thin film, powder and polymer samples. The tool has been in use to investigate thermal, optical and transport properties. Binary and ternary semiconducting thin films were analysed for their thermal diffusivities. The system can perform thickness measurements nondestructively. Ion-implanted semiconductors are widely studied for the effect of radiation-induced defects, and we could perform nondestructive imaging of such defects using our spectrometer. The results reported in this thesis on the above, in addition to studies on In2S3 and the transparent conducting oxide ZnO, were achieved with this spectrometer. Various polymer samples have been easily analysed for their thermal diffusivities. The technique provided an ease of analysis not achieved with conventional techniques like TGA and DSC. Industrial application of the tool has also been demonstrated by analysing defects in welded joints and the adhesion of paints. Indigenization of the expensive lock-in amplifier and automation have been the significant achievements in the course of this dissertation. We are on our way to proving the noise rejection capabilities of our PC LIA.
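The phase-sensitive detection principle behind such a PC-based lock-in amplifier can be sketched in a few lines. This is a generic illustration of digital lock-in detection, not the authors' actual PC LIA implementation:

```python
import math

def lockin(signal, fs, f_ref):
    """Digital lock-in detection: mix the sampled signal with in-phase and
    quadrature references at f_ref, average over the record, and recover the
    amplitude and phase of the component at the reference frequency."""
    n = len(signal)
    # In-phase (x) and quadrature (y) mixing followed by averaging,
    # which acts as the low-pass filter of an analog lock-in.
    x = sum(s * math.cos(2 * math.pi * f_ref * i / fs)
            for i, s in enumerate(signal)) * 2 / n
    y = sum(s * math.sin(2 * math.pi * f_ref * i / fs)
            for i, s in enumerate(signal)) * 2 / n
    # Amplitude, and phase of the input relative to the cosine reference.
    return math.hypot(x, y), math.atan2(-y, x)
```

Because components at other frequencies average towards zero over the record, the amplitude at the reference frequency is recovered even in the presence of broadband noise — the noise-rejection property the abstract refers to.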
Abstract:
Department of Applied Chemistry, Cochin University of Science and Technology
Abstract:
Most commercial and financial data are stored in decimal form. Recently, support for decimal arithmetic has received increased attention due to its growing importance in financial analysis, banking, tax calculation, currency conversion, insurance, telephone billing and accounting. Performing decimal arithmetic on systems that do not support decimal computations may give a result with representation error, conversion error, and/or rounding error. In this world of precision, such errors are no longer tolerable. These errors can be eliminated, and better accuracy achieved, if decimal computations are done using Decimal Floating Point (DFP) units. But the floating-point arithmetic units in today's general-purpose microprocessors are based on the binary number system, and decimal computations are done using binary arithmetic. Only a few common decimal numbers can be exactly represented in Binary Floating Point (BFP). In many cases, the law requires that results generated from financial calculations performed on a computer exactly match manual calculations. Currently, many applications involving fractional decimal data perform decimal computations either in software or with a combination of software and hardware. Performance can be dramatically improved by complete hardware DFP units, and this leads to the design of processors that include DFP hardware. VLSI implementations using the same modular building blocks can decrease system design and manufacturing cost. A multiplexer realization is a natural choice from the viewpoint of cost and speed. This thesis focuses on the design and synthesis of an efficient decimal MAC (Multiply ACcumulate) architecture for high-speed decimal processors based on the IEEE Standard for Floating-Point Arithmetic (IEEE 754-2008).
The research goal is to design and synthesize decimal MAC architectures to achieve higher performance. Efficient design methods and architectures are developed for a high-performance DFP MAC unit as part of this research.
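The representation error the abstract describes is easy to demonstrate. The sketch below uses Python's software decimal arithmetic, an instance of the software-based decimal computation that hardware DFP units are intended to accelerate:

```python
from decimal import Decimal

# 0.1 has no exact binary floating-point representation, so repeated
# binary additions accumulate a small representation error.
binary_sum = 0.1 + 0.1 + 0.1            # slightly more than 0.3 in BFP
decimal_sum = Decimal("0.1") * 3        # exactly 0.3 in decimal arithmetic

print(binary_sum == 0.3)                # False
print(decimal_sum == Decimal("0.3"))    # True
```

A financial calculation performed in BFP can therefore fail to match the same calculation done by hand, which is precisely the legal-compliance problem the abstract cites as motivation for hardware DFP.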