919 results for Visualization Using Computer Algebra Tools
Abstract:
This thesis aims to describe and demonstrate a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on building performance, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance implications of some design decisions. These situations require additional knowledge to be properly addressed. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood or solved without quantitative information. The concept intends to combine the analytical power of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation and CFD. Design dilemmas are formulated and framed according to the architect's reflection process about performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) investigate previous research on the use of simulation tools and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands. The three investigated processes are houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved. The practical application, despite its success in the research context, revealed some limitations to the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.
Abstract:
Libraries, since their inception 4000 years ago, have been in a process of constant change. Although change was slow for centuries, in recent decades academic libraries have been continuously striving to adapt their services to the ever-changing needs of students and academic staff. In addition, the e-content revolution, technological advances, and ever-shrinking budgets have obliged libraries to allocate their limited resources efficiently between collection and services. Unfortunately, this resource allocation is a complex process due to the diversity of data sources and formats that must be analyzed prior to decision-making, as well as the lack of efficient integration methods. The main purpose of this study is to develop an integrated model that supports libraries in making optimal budgeting and resource allocation decisions among their services and collection by means of a holistic analysis. To this end, a combination of several methodologies and structured approaches is conducted. Firstly, a holistic structure and the required toolset to assess academic libraries holistically are proposed to collect and organize the data from an economic point of view. A four-pronged theoretical framework is used in which the library system and collection are analyzed from the perspectives of users and internal stakeholders. The first quadrant corresponds to the internal perspective of the library system, that is, analyzing library performance and the costs incurred and resources consumed by library services. The second quadrant evaluates the external perspective of the library system; users' perception of service quality is judged in this quadrant. The third quadrant analyses the external perspective of the library collection, that is, evaluating the impact of the current library collection on its users. Finally, the fourth quadrant evaluates the internal perspective of the library collection; the usage patterns of the library collection are analyzed. With a complete framework for data collection in place, these data, coming from multiple sources and therefore in different formats, need to be integrated and stored in an adequate scheme for decision support. Secondly, a data warehousing approach is designed and implemented to integrate, process, and store the holistically collected data. Ultimately, the strategic data stored in the data warehouse are analyzed and used for different purposes, including the following: 1) Data visualization and reporting are proposed to allow library managers to publish library indicators in a simple and quick manner using online reporting tools. 2) Sophisticated data analysis is recommended through the use of data mining tools; three data mining techniques are examined in this research study: regression, clustering and classification. These data mining techniques have been applied to the case study in the following manner: predicting the future investment in library development; finding clusters of users that share common interests and similar profiles, but belong to different faculties; and predicting library factors that affect student academic performance by analyzing possible correlations between library usage and academic performance. 3) As input for optimization models, early experiences of developing an optimal resource allocation model to distribute resources among the different processes of a library system are documented in this study.
Specifically, the problem of allocating funds for the digital collection among divisions of an academic library is addressed. An optimization model for the problem is defined with the objective of maximizing the usage of the digital collection over all library divisions, subject to a single collection budget. By proposing this holistic approach, the research study contributes to knowledge by providing an integrated solution to assist library managers in making economic decisions based on an "as realistic as possible" perspective of the library situation.
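As a rough illustration of the kind of allocation model described above, the following sketch maximizes the expected usage of a digital collection across library divisions subject to a single budget, formulated as a small linear program; the divisions, usage-per-euro coefficients, spending floors and caps are invented for demonstration and do not come from the study.

```python
# Illustrative linear model: maximize expected usage of the digital
# collection across library divisions, subject to one collection budget.
# All coefficients are made up for demonstration purposes only.
from scipy.optimize import linprog

divisions = ["Engineering", "Medicine", "Humanities", "Science"]
usage_per_euro = [0.9, 1.4, 0.6, 1.1]        # expected downloads per euro (assumed)
min_spend = [5_000, 5_000, 5_000, 5_000]      # policy floor per division (assumed)
max_spend = [60_000, 80_000, 40_000, 70_000]  # cap per division (assumed)
budget = 150_000

# linprog minimizes, so negate the usage coefficients to maximize usage.
c = [-u for u in usage_per_euro]
A_ub = [[1.0] * len(divisions)]               # total spending <= budget
b_ub = [budget]
bounds = list(zip(min_spend, max_spend))

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
for name, x in zip(divisions, res.x):
    print(f"{name}: {x:,.0f} EUR")
print("Expected usage:", -res.fun)
```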
Abstract:
Food bought at supermarkets in, for instance, North America or the European Union carries comprehensive information about ingredients and allergens. Meanwhile, the menus of restaurants are usually incomplete and cannot normally be completed by the waiter. This is especially important when traveling to countries with a different culture. A curious example is "calamares en su tinta" (squid in its own ink), a common dish in Spain. Its brief description would be "squid with boiled rice in its own (black) ink", but an ingredient of its sauce is flour, a fact very important for celiacs. Some constraints are based on religious beliefs, food allergies or illnesses, while others derive simply from personal preferences. Another complicated situation arises in hospitals, where the doctors' nutritional recommendations have to be added to the patient's usual constraints. We have therefore designed and developed a Rule Based Expert System (RBES) that can address these problems. The rules derive directly from the recipes of the different dishes and contain the information about the required ingredients and ways of cooking. In fact, we distinguish: ingredients and ways of cooking, intermediate products (like sauces, which aren't always made explicit) and final products (the dishes listed in the menu of the restaurant). For a given restaurant, customer and moment, the inputs to the RBES are the current stock of ingredients and the personal characteristics of that customer. The RBES then prepares a "personalized menu" using set operations and knowledge extraction (thanks to an algebraic inference engine [1]). The RBES has been implemented in the computer algebra system Maple 2015. A first version of this work was presented at the "Applications of Computer Algebra 2015" (ACA'2015) conference. The corresponding abstract is available at [2].
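The set-operation idea can be illustrated with a minimal sketch: each dish is expanded through its intermediate products (such as sauces) into a full ingredient set, and only dishes compatible with the current stock and the customer's constraints survive. The data below are invented, and the sketch does not reproduce the algebraic inference engine of [1].

```python
# Toy illustration of the set-based filtering behind a personalized menu:
# expand each dish into its full ingredient set (through intermediate
# products such as sauces) and keep only dishes compatible with the
# customer's constraints and the current stock. All data are invented.

intermediate = {
    "black_ink_sauce": {"squid_ink", "flour", "onion", "olive_oil"},
}
dishes = {
    "calamares_en_su_tinta": {"squid", "rice", "black_ink_sauce"},
    "grilled_fish": {"hake", "olive_oil", "lemon"},
}

def full_ingredients(components):
    """Recursively replace intermediate products by their ingredients."""
    result = set()
    for item in components:
        if item in intermediate:
            result |= full_ingredients(intermediate[item])
        else:
            result.add(item)
    return result

def personalized_menu(stock, forbidden):
    menu = []
    for dish, components in dishes.items():
        ingredients = full_ingredients(components)
        if ingredients <= stock and not (ingredients & forbidden):
            menu.append(dish)
    return menu

stock = {"squid", "rice", "squid_ink", "flour", "onion", "olive_oil",
         "hake", "lemon"}
print(personalized_menu(stock, forbidden={"flour"}))   # celiac customer
# -> ['grilled_fish']: the hidden flour in the sauce excludes the squid dish
```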
Abstract:
In this thesis we study weak isometries of Hamming spaces. These are permutations of a Hamming space that preserve some, but not necessarily all, distances. We wish to find conditions under which a weak isometry is in fact an isometry. This type of problem was first posed by Beckman and Quarles for R^n. In Chapter 2 we give definitions pertinent to our research. The third chapter focuses on some known results in this area, with special emphasis on papers by V. Krasin as well as S. De Winter and M. Korb, who solved this problem for the Boolean cube, that is, the binary Hamming space. We attempted to generalize some of their methods to the non-Boolean case. The fourth chapter contains our new results and is split into two major contributions. Our first contribution shows that if p = n or p < n/2, then every weak isometry of H_q^n that preserves distance p is an isometry. Our second contribution gives a possible method to check whether a weak isometry is an isometry using linear algebra and graph theory.
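A brute-force sketch of the underlying question, assuming a tiny Hamming space so exhaustive checking is feasible: given a permutation of H_q^n that preserves distance p, does it preserve every distance? The example map (a translation, which is a true isometry) is only for illustration and is not the linear-algebra and graph-theory method proposed in the thesis.

```python
# Brute-force check on a small Hamming space H_q^n: does a permutation
# of the words that preserves distance p preserve every distance?
# Purely illustrative; feasible only for tiny q and n.
from itertools import product

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def preserves_distance(f, words, p):
    return all(hamming(f[u], f[v]) == p
               for u in words for v in words if hamming(u, v) == p)

def is_isometry(f, words):
    return all(hamming(f[u], f[v]) == hamming(u, v)
               for u in words for v in words)

q, n, p = 2, 3, 2
words = list(product(range(q), repeat=n))

# Example map: add a fixed vector to every word (a translation),
# which is a genuine isometry of the Hamming space.
t = (1, 0, 1)
f = {w: tuple((a + b) % q for a, b in zip(w, t)) for w in words}

print(preserves_distance(f, words, p))  # True
print(is_isometry(f, words))            # True
```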
Abstract:
This study examines the role of visual literacy in learning biology. Biology teachers promote the use of digital images as a learning tool for two reasons: because biology is the most visual of the sciences, and the use of imagery is becoming increasingly important with the advent of bioinformatics; and because studies indicate that the current generation of teenagers has a cognitive structure that is formed through exposure to digital media. On the other hand, there is concern that students are not being exposed enough to the traditional methods of processing biological information - thought to encourage left-brain sequential thinking patterns. Theories of Embodied Cognition point to the importance of hand-drawing for proper assimilation of knowledge, and theories of Multiple Intelligences suggest that some students may learn more easily using traditional pedagogical tools. To test the claim that digital learning tools enhance the acquisition of visual literacy in this generation of biology students, a learning intervention was carried out with 33 students enrolled in an introductory college biology course. The study compared learning outcomes achieved with two types of learning tools. One learning tool was a traditional drawing activity, and the other was an interactive digital activity carried out on a computer. The sample was divided into two random groups, and a crossover design was implemented with two separate interventions. In the first intervention students learned how to draw and label a cell. Group 1 learned the material by computer and Group 2 learned the material by hand-drawing. In the second intervention, students learned how to draw the phases of mitosis, and the two groups were inverted. After each learning activity, students were given a quiz on the material they had learned. Students were also asked to self-evaluate their performance on each quiz, in an attempt to measure their level of metacognition. At the end of the study, they were asked to fill out a questionnaire that was used to measure the level of task engagement the students felt towards the two types of learning activities. In this study, following the first testing phase, the students who learned the material by drawing had a significantly higher average grade on the associated quiz than those who learned the material by computer. The difference was lost with the second crossover trial. There was no correlation for either group between the grade the students thought they had earned through self-evaluation and the grade that they received. In terms of different measures of task engagement, there were no significant differences between the two groups. One finding from the study showed a positive correlation between grade and self-reported time spent playing video games, and a negative correlation between grade and self-reported interest in drawing. This study provides little evidence to support claims that the use of digital tools enhances learning, but does provide evidence to support claims that drawing by hand is beneficial for learning biological images. However, the small sample size, limited number and type of learning tasks, and the indirect means of measuring levels of metacognition and task engagement restrict generalisation of these conclusions. Nevertheless, this study indicates that teachers should not use digital learning tools to the exclusion of traditional drawing activities; further studies on the effectiveness of these tools are warranted.
Students in this study commented that the computer tool seemed more accurate and detailed - even though the two learning tools carried identical information. Thus there was a mismatch between the perception of the usefulness of computers as a learning tool and the reality, which again points to the need for an objective assessment of their usefulness. Students should be given the opportunity to try out a variety of traditional and digital learning tools in order to address their different learning preferences.
Abstract:
This paper focuses on the development of an algorithm, implemented in Matlab, to generate Typical Meteorological Years from weather data of eight locations on Madeira Island and to predict the energy generation of photovoltaic systems based on solar cell modelling. The solar cell model includes the effects of ambient temperature and wind speed. The analysis of PV system performance is carried out through the Weather Corrected Performance Ratio, and the PV system yield for the entire island is estimated using spatial interpolation tools.
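A hedged sketch of the kind of cell-temperature and power model implied above, using a Sandia-style exponential module-temperature model driven by irradiance, ambient temperature and wind speed, and a linear power temperature coefficient; all parameter values and the hourly data are illustrative assumptions, not the models or data of the paper.

```python
# Hedged sketch: module temperature from irradiance, ambient temperature
# and wind speed (Sandia-style exponential model), then DC power with a
# linear temperature coefficient. All parameter values are illustrative.
import numpy as np

def module_temperature(G, T_amb, wind, a=-3.56, b=-0.075):
    """Sandia-style module temperature model (open-rack coefficients assumed)."""
    return G * np.exp(a + b * wind) + T_amb

def pv_power(G, T_amb, wind, p_stc=250.0, gamma=-0.004):
    """DC power of one module; gamma is the power temperature coefficient [1/K]."""
    T_cell = module_temperature(G, T_amb, wind)
    return p_stc * (G / 1000.0) * (1.0 + gamma * (T_cell - 25.0))

# Hourly example for one day (values invented)
G    = np.array([0, 200, 600, 900, 950, 700, 300, 0])   # W/m^2
Tamb = np.array([18, 19, 22, 25, 27, 26, 23, 20])        # degC
wind = np.array([2, 2, 3, 4, 4, 3, 2, 2])                # m/s

energy_wh = pv_power(G, Tamb, wind).sum()   # 1-hour steps -> Wh
print(f"Predicted daily DC energy: {energy_wh:.0f} Wh")
```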
Abstract:
The design optimization of industrial products has always been an essential activity to improve product quality while reducing time-to-market and production costs. Although cost management is very complex and comprises all phases of the product life cycle, the control of geometrical and dimensional variations, known as Dimensional Management (DM), allows compliance with product and process requirements. Hence, tolerance-cost optimization becomes the main practice for an effective application of Design for Tolerancing (DfT) and Design to Cost (DtC) approaches by enabling a connection between product tolerances and associated manufacturing costs. However, despite the growing interest in this topic, profitable industrial application of these techniques is hampered by their complexity: the definition of a systematic framework is the key element to improving design optimization, enhancing the concurrent use of Computer-Aided tools and Model-Based Definition (MBD) practices. The present doctoral research aims to define and develop an integrated methodology for product/process design optimization, to better exploit the new capabilities of advanced simulations and tools. By implementing predictive models and multi-disciplinary optimization, a Computer-Aided Integrated framework for tolerance-cost optimization has been proposed to allow the integration of DfT and DtC approaches and their direct application to the design of automotive components. Several case studies have been considered, with the final application of the integrated framework to a high-performance V12 engine assembly, to achieve both functional targets and cost reduction. From a scientific point of view, the proposed methodology provides an improvement to the tolerance-cost optimization of industrial components. The integration of theoretical approaches and Computer-Aided tools makes it possible to analyse the influence of tolerances on both product performance and manufacturing costs. The case studies proved the suitability of the methodology for application in the industrial field and identified further areas for improvement and refinement.
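A minimal sketch of tolerance-cost optimization under assumptions not taken from the thesis: a reciprocal cost-tolerance model is minimized subject to a root-sum-square stack-up limit on the assembly, using a generic nonlinear solver.

```python
# Minimal tolerance-cost sketch: minimize total manufacturing cost under
# a reciprocal cost-tolerance model, subject to an RSS stack-up limit on
# the assembly. Coefficients and limits are invented for illustration.
import numpy as np
from scipy.optimize import minimize

a = np.array([2.0, 3.0, 1.5])      # fixed cost per feature (assumed)
b = np.array([0.8, 1.2, 0.5])      # cost sensitivity to tight tolerances (assumed)
T_assembly = 0.30                  # allowed RSS stack-up [mm] (assumed)

def cost(t):
    return np.sum(a + b / t)

def rss_margin(t):
    return T_assembly - np.sqrt(np.sum(t**2))   # must stay >= 0

res = minimize(
    cost,
    x0=np.full(3, 0.1),
    bounds=[(0.01, 0.5)] * 3,
    constraints=[{"type": "ineq", "fun": rss_margin}],
)
print("Optimal tolerances [mm]:", np.round(res.x, 3))
print("Total cost:", round(res.fun, 2))
```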
Abstract:
Universidade Estadual de Campinas. Faculdade de Educação Física
Abstract:
Acoustic resonances are observed in high-pressure discharge lamps operated with AC input power modulated at frequencies in the kilohertz range. This paper describes an optical resonance detection method for high-intensity discharge lamps using computer-controlled cameras and image processing software. Experimental results showing acoustic resonances in high-pressure sodium lamps are presented.
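One way to sketch an optical detection criterion of this kind, under assumed data: track the intensity-weighted centroid of the arc image frame by frame and flag resonance when the centroid fluctuation exceeds a threshold. The frames and threshold below are synthetic, and the paper's actual image-processing pipeline may differ.

```python
# Hedged sketch of an optical detection idea: track the intensity-weighted
# centroid of the discharge arc in successive camera frames and flag
# acoustic resonance when the centroid fluctuates beyond a threshold.
# Frame acquisition and the threshold value are assumptions.
import numpy as np

def arc_centroid(frame):
    """Intensity-weighted centroid (row, col) of a grayscale frame."""
    total = frame.sum()
    rows, cols = np.indices(frame.shape)
    return (rows * frame).sum() / total, (cols * frame).sum() / total

def resonance_detected(frames, threshold_px=2.0):
    centroids = np.array([arc_centroid(f) for f in frames])
    drift = centroids.std(axis=0)          # std of row and column positions
    return bool(np.hypot(*drift) > threshold_px)

# Synthetic example: a bright spot that jitters horizontally frame to frame.
frames = []
for k in range(50):
    img = np.zeros((64, 64))
    img[32, 32 + int(5 * np.sin(k))] = 255.0   # wandering arc
    frames.append(img)
print(resonance_detected(frames))   # True for this jittering arc
```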
Abstract:
In this paper, processing methods of Fourier optics implemented in a digital holographic microscopy system are presented. The proposed methodology is based on the ability of digital holography to carry out the whole reconstruction of the recorded wave front and, consequently, the determination of the phase and intensity distribution in any arbitrary plane located between the object and the recording plane. In this way, in digital holographic microscopy the field produced by the objective lens can be reconstructed along its propagation, allowing the reconstruction of the back focal plane of the lens, so that the complex amplitudes of the Fraunhofer diffraction, or equivalently the Fourier transform, of the light distribution across the object can be known. Manipulation of the Fourier transform plane makes possible the design of digital methods of optical processing and image analysis. The proposed method has great practical utility and represents a powerful tool in image analysis and data processing. The theoretical aspects of the method are presented, and its validity has been demonstrated using computer-generated holograms and simulated images of microscopic objects. © 2007 Elsevier B.V. All rights reserved.
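The generic Fourier-optics operation the method builds on can be sketched numerically: transform a complex object field to its Fourier (back focal) plane with an FFT, apply a filter there, and transform back to obtain the processed field. The object, grid and filter below are invented for illustration and are not the paper's reconstruction pipeline.

```python
# Generic Fourier-plane filtering: take a complex object field to its
# Fourier (back focal) plane with an FFT, apply a filter there, and
# transform back. Object, grid and filter are invented for illustration.
import numpy as np

N, dx = 512, 2e-6                     # grid size and pixel pitch [m] (assumed)
x = (np.arange(N) - N // 2) * dx
X, Y = np.meshgrid(x, x)

# Simple test object: a circular phase object on uniform illumination
obj = np.exp(1j * 2.0 * (np.hypot(X, Y) < 50e-6))

# Fourier plane (centered with fftshift)
F = np.fft.fftshift(np.fft.fft2(obj))

# Example filter: a circular low-pass aperture in the Fourier plane
fx = np.fft.fftshift(np.fft.fftfreq(N, d=dx))
FX, FY = np.meshgrid(fx, fx)
lowpass = np.hypot(FX, FY) < 2e4      # cutoff spatial frequency [1/m] (assumed)

filtered = np.fft.ifft2(np.fft.ifftshift(F * lowpass))
intensity = np.abs(filtered) ** 2
print(intensity.shape, intensity.max())
```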
Abstract:
Photodynamic therapy involves administration of a photosensitizing drug and its subsequent activation by visible light of the appropriate wavelength. Several approaches to increasing the specificity of photosensitizers for cancerous tissues, in particular through their conjugation to ligands directed against tumor-associated antigens, have been investigated. Here, we have studied the delivery of the photocytotoxic porphyrin compound TPP(p-O-beta-D-GluOH)(3) into tumor cells that overexpress the glycosphingolipid Gb3, using the Gb3-binding nontoxic B-subunit of Shiga toxin (STxB) as a vector. To allow for site-directed chemical coupling, an STxB variant carrying a free sulfhydryl moiety at its C-terminal end has been used. Binding affinity, cellular uptake, singlet oxygen quantum yield, and phototoxicity of the conjugate have been examined. Despite some effect of coupling on both the photophysical properties of TPP(p-O-beta-D-GluOH)(3) and the affinity of STxB for its receptor, the conjugate exhibited a higher photocytotoxic activity than the photosensitizer alone and was exquisitely selective for Gb3-expressing tumor cells. Furthermore, our data strongly suggest that STxB-mediated retrograde delivery of the photosensitizer to the biosynthetic/secretory pathway is critical for optimal cytotoxic activity. In conclusion, a strong rationale for using retrograde delivery tools such as STxB in combination with photosensitizing agents for the photodynamic therapy of tumors is presented.
Abstract:
Objective. The aim of this study was to compare Profile .04 taper series 29 instruments and hand files for gutta-percha removal. Study design. Twenty maxillary central incisors with a single straight canal were instrumented and filled. The teeth were divided into 2 groups of 10 specimens each, according to gutta-percha removal techniques: Group 1- Profile series 29 and Group 2- hand files and solvent. The amount of time for gutta-percha removal and the number of fractured instruments were evaluated. Radiographs were taken and the teeth were grooved longitudinally and split. The area of residual debris was measured using computer software. Results. The time for filling material removal was significantly shorter when Profile series 29 was used (P = .00). Regarding cleanliness, there were no statistical differences in the teeth halves evaluations (P = .05). Hand instruments cleaned the canals significantly better than Profiles, in the radiographic analysis considering the whole canal. Overall, the radiographic analysis showed a smaller percentage of residual debris than the teeth halves analysis. Conclusion. The Profile series 29 instruments proved to be faster than hand instruments in removing root filling materials; however, hand instruments yielded better root canal cleanliness. Some residual debris was not visualized by radiographs. (Oral Surg Oral Med Oral Pathol Oral Radiol Endod 2009; 108: e46-e50)
Abstract:
The purpose of this study was to evaluate a new periapical index based on cone beam computed tomography (CBCT) for identification of apical periodontitis (AP). The periapical index proposed in this study (CBCTPAI) was developed on the basis of criteria established from measurements corresponding to periapical radiolucency interpreted on CBCT scans. Radiolucent images suggestive of periapical lesions were measured by using the working tools of Planimp software on CBCT scans in 3 dimensions: buccopalatal, mesiodistal, and diagonal. The CBCTPAI was determined by the largest lesion extension. A 6-point (0-5) scoring system was used with 2 additional variables, expansion of cortical bone and destruction of cortical bone. A total of 1014 images (periapical radiographs and CBCT scans) originally taken from 596 patients were evaluated by 3 observers by using the CBCTPAI criteria. AP was identified in 39.5% and 60.9% of cases by radiography and CBCT, respectively (P<.01). The CBCTPAI offers an accurate diagnostic method for use with high-resolution images, which can reduce the incidence of false-negative diagnosis, minimize observer interference, and increase the reliability of epidemiologic studies, especially those referring to AP prevalence and severity. (J Endod 2008;34:1325-1331)
Abstract:
Over the last 7 years, a method has been developed in Brazil to analyse building energy performance using computer simulation. The method combines analysis of building design plans and documentation, walk-through visits, electric and thermal measurements and the use of an energy simulation tool (DOE-2.1E code). The method was used to model more than 15 office buildings (more than 200 000 m²), located between 12.5° and 27.5° south latitude. The paper describes the basic methodology, with data for one building, and presents additional results for six other cases. © 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The effect of unitary noise on the discrete one-dimensional quantum walk is studied using computer simulations. For the noiseless quantum walk, starting at the origin (n = 0) at time t = 0, the position distribution P_t(n) at time t is very different from the Gaussian distribution obtained for the classical random walk. Furthermore, its standard deviation σ(t) scales as σ(t) ∼ t, unlike the classical random walk, for which σ(t) ∼ √t. It is shown that when the quantum walk is exposed to unitary noise, it exhibits a crossover from quantum behavior for short times to classical-like behavior for long times. The crossover time is found to be T ∼ α⁻², where α is the standard deviation of the noise.
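A short numerical sketch of a discrete one-dimensional Hadamard walk with a simple unitary noise model (a random phase of standard deviation α applied to one coin component at every site and step), from which σ(t) can be computed for the noiseless and noisy cases; the specific noise model is an assumption and may differ from the one studied in the paper.

```python
# Sketch of a discrete 1D Hadamard walk with a simple unitary noise model:
# at each step, random phases of standard deviation alpha are applied to one
# coin component. The exact noise model of the paper may differ.
import numpy as np

def walk(t_max, alpha, rng):
    size = 2 * t_max + 1                     # positions -t_max..t_max
    psi = np.zeros((size, 2), dtype=complex)
    psi[t_max, 0] = 1.0                      # start at the origin
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    sigmas = []
    for _ in range(t_max):
        psi = psi @ H.T                      # coin (Hadamard) step
        phases = np.exp(1j * rng.normal(0.0, alpha, size))
        psi[:, 1] *= phases                  # unitary (phase) noise on one coin state
        shifted = np.zeros_like(psi)
        shifted[:-1, 0] = psi[1:, 0]         # coin 0 moves left
        shifted[1:, 1] = psi[:-1, 1]         # coin 1 moves right
        psi = shifted
        prob = (np.abs(psi) ** 2).sum(axis=1)
        n = np.arange(size) - t_max
        mean = (prob * n).sum()
        sigmas.append(np.sqrt((prob * n**2).sum() - mean**2))
    return np.array(sigmas)

rng = np.random.default_rng(1)
for alpha in (0.0, 0.2):
    sigma = walk(200, alpha, rng)
    print(f"alpha={alpha}: sigma(50)={sigma[49]:.1f}, sigma(200)={sigma[199]:.1f}")
```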